Thursday, December 14, 2017

Future Unmanned Systems Impact




UAS will have a significant impact on society over the next two decades, even as unmanned ground and maritime systems undergo many changes and developments of their own.

While highways and roads are constantly being expanded to accommodate increased passenger movement from an ever-growing population, there is limited space for these vehicles to operate. Autonomous ground vehicles will help improve safety and alleviate some of the traffic by improving driving and navigational efficiency, but over the next few decades these vehicles will still be sharing the streets with manned vehicles, leaving room for human error in operations.

UAS (specifically passenger drones) will have the unique capability, once regulations are effectively established, to capitalize on low-altitude airspace, tapping into a network that can be effectively managed and developed from inception (McNeal, 2016). Additionally, passenger drones will be able to operate in multiple dimensions: not limited to ground travel, they can utilize VTOL capability. And while autonomous ground vehicles are being developed by several manufacturers using proprietary software and systems, current drone technology is typically more open source, with collaboration between manufacturers enabling innovation and rapid advancement.

With regard to other UAS applications, enhanced capabilities in aerial photography, utility inspections, search and rescue and disaster recovery efforts, drug interdiction, parcel delivery, geo-mapping, agriculture inspection/monitoring, firefighting, and even university campus guides will continue to push the boundaries of what these systems are able to accomplish (Carroll, 2013). In conjunction with advanced cameras, sensors, and countless other payload attachments, the possibilities for what these systems can do are endless.

Additionally, there are countless military applications for these systems: removing the risk of loss of life for pilots and crews in hazardous environments, and enhancing the capabilities of intelligence, surveillance, and reconnaissance (ISR), agile supply movements, and ground support.

Over the next few decades, rapid advances in UAS technology, hardware, and software will allow for endless possibilities!



-Jonathan



References

Carroll, J. (2013, December 6). The future is here: Five applications of UAV technology. Retrieved from http://www.vision-systems.com/articles/2013/12/the-future-is-here-five-applications-of-uav-technology.html

McNeal, G. (2016, October 24). Four Reasons Why Drones, Not Driverless Cars, Are The Future Of Autonomous Navigation. Retrieved from https://www.forbes.com/sites/gregorymcneal/2016/10/24/four-reasons-why-drones-not-driverless-cars-are-the-future-of-autonomous-navigation/#46e2c2e23e45




Saturday, December 9, 2017

Autonomous Strategy Implementation

When introducing an unmanned system, consideration must be given to privacy, ethics, safety, and lost link/loss of system control in order to implement it successfully. While an unmanned ground system may have many differences from a UAS, there are some very common, shared concerns that must be addressed. Much like with a UAS, the specific operating environment and mission-specific requirements have a great deal of impact on what needs to be addressed and how requirements are implemented.

Privacy is a very important consideration, and unmanned ground systems have much in common with UAS with regard to people's concerns over where they operate and the information they may collect or access. In the article Unmanned Ground Vehicles and Privacy, the author describes a situation where he is at a friend's house and the son of the friend is operating a small UGS (Finn, 2017). The father, dismayed at how intrusive his son's actions are with this device, proceeds to apologize and express concern over the fact that the vehicle has a camera attached, noting that it could be used to spy on the author in his own house. ISR is a very real issue when it comes to UGVs, and with the rapid acceleration of autonomous vehicles in general, this concern is likely to grow. The information collected about where you go and who you travel with, as well as a slew of personal data, could be damaging and used against you. Having robust privacy controls, and being transparent with consumers about what information is gathered and how it will be used, is the best strategy for implementation.

Also, much like the issue of privacy in UGV operations, the issue of ethics in this realm deals with how and what information is collected, and what the intended use is. Recreational usage should consider where the vehicle is operated and what its function is. Obviously we can't always know people's intended use, so without robust regulations or rules governing recreational use, individual users would be left to their own interpretation of ethical usage. Commercial autonomous UGS operations are a little easier to regulate and, as with the issue of privacy, should be transparent regarding the information that is collected and used. As their primary customer will be the general public, commercial autonomous operating companies will have to maintain this transparency to avoid significant public backlash in the case of breaches of trust.

Safety is another issue that must be addressed in the implementation of a UGS. As there may be varying levels of human intervention depending on the mission, system capabilities, and any embedded safety features, great consideration must be given to how the system interacts with the environment and the personnel it encounters. To address safety, you must begin by identifying the hazards, assessing the risks involved, and identifying risk mitigation options, and then implement those mitigation techniques (Owens, 2014). Finally, you must validate the effectiveness of the options you choose and determine whether additional actions need to be taken, or new issues addressed.
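
To make that sequence concrete, here is a minimal sketch of the hazard-tracking loop in Python. The hazard names, the severity/likelihood scales, and the acceptability threshold are my own illustrative assumptions, not values from Owens (2014):

```python
# Minimal sketch of the identify -> assess -> mitigate -> validate loop.
# Scales and threshold are illustrative assumptions.
SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}
LIKELIHOOD = {"improbable": 1, "remote": 2, "occasional": 3, "frequent": 4}

def risk_index(severity: str, likelihood: str) -> int:
    """Score a hazard on a simple severity x likelihood matrix."""
    return SEVERITY[severity] * LIKELIHOOD[likelihood]

def needs_mitigation(hazards: list, acceptable: int = 4) -> list:
    """Return hazards whose risk still exceeds the acceptable threshold."""
    return [h for h in hazards if risk_index(h["severity"], h["likelihood"]) > acceptable]

hazards = [
    {"name": "pedestrian in vehicle path", "severity": "catastrophic", "likelihood": "occasional"},
    {"name": "sensor dropout", "severity": "marginal", "likelihood": "remote"},
]
for h in needs_mitigation(hazards):
    print(f"Mitigate and re-validate: {h['name']}")  # only the pedestrian hazard prints
```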

With regard to lost link or loss of control, addressing these issues in the developmental strategy is important. Whether it be a redundant return-to-home location, as found on several commercial and recreational UAS, or a predetermined autonomous program to continue to another previously identified location, it is paramount that these issues be addressed. The mission set of the UGS must be considered as well, as the operating environment and requirements can have a great impact on the necessity of certain lost link or loss of control features.
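
As a rough illustration of what such a failsafe might look like in software, here is a hedged sketch of lost-link logic. The timeout value, coordinates, and behavior names are hypothetical, not drawn from any specific system:

```python
# Sketch of lost-link handling: if the control link is silent past a timeout,
# fall back to a pre-programmed behavior chosen by the mission set.
import time

LINK_TIMEOUT_S = 5.0             # illustrative assumption
HOME = (34.05, -118.25)          # hypothetical return-to-home rally point
ALTERNATE = (34.10, -118.20)     # hypothetical pre-briefed fallback location

def lost_link_action(last_heartbeat: float, mission_allows_continue: bool) -> str:
    """Choose a failsafe behavior once the command link is declared lost."""
    if time.monotonic() - last_heartbeat < LINK_TIMEOUT_S:
        return "normal_ops"
    # The mission set drives the choice: return home, or proceed to a
    # previously identified location, as discussed above.
    if mission_allows_continue:
        return f"proceed_to {ALTERNATE}"
    return f"return_to {HOME}"

print(lost_link_action(time.monotonic() - 10.0, mission_allows_continue=False))
```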

References
Finn, W. (2017, July 11). Unmanned Ground Vehicles (UGV) & Privacy. Retrieved from http://amrel.com/unmanned-ground-vehicles-privacy/
Owens, T. (2014). System safety considerations for unmanned ground vehicles. Retrieved from http://issc2014.system-safety.org/71_Owens_System_Safety_Considerations_for_Unmanned.pdf


Wednesday, November 15, 2017

Robots vs Astronauts

In the article Robots vs Astronauts, Dr. Joshua Colwell and Dr. Daniel Britt take two sides of the debate on manned vs unmanned exploration, but come to similar conclusions.

Dr. Britt conveys that in terms of deep space exploration, there is really only one option we as humans have right now: unmanned systems (Colwell & Britt, 2017). As we are limited by our technology, our reach to explore space with manned operations is also limited. We are susceptible to factors in the space environment such as extreme heat and cold and a need for consumables (water, air, food), and several redundant engineered systems are required only to sustain the life of the manned crew. The high-energy radiation exposure during longer manned trips, such as to Venus, Mercury, or Jupiter, would be deadly, and there are no current workable solutions for preventing the bone loss and muscle atrophy that would be encountered by the astronauts. Additionally, manned missions could introduce foreign matter and contaminating substances into these new worlds, potentially contaminating those environments.
All the above being considered, Dr. Britt points out that manned missions do bring about a great deal of "flexibility inspiration and native intelligence" (Colwell & Britt, 2017). In conjunction with this statement, I found another article that expands upon some of these key characteristics by explaining some of the roles manned crews have played in support of continued mission/operational success. In one example, astronauts repaired the initially flawed Hubble Space Telescope, and have continued to perform routine maintenance to ensure its continued successful operation (Slakey & Spudis, 2008). Several instances have arisen over the years where astronauts were able to repair hardware in space, preserving valuable missions. Another factor to consider: elaborate robotics are being developed that may someday deploy highly sensitive instruments; however, at this time robotic deployment is rough, so we may experience lower sensitivity and capability than the instruments humans could deploy.

One of the most interesting points this article discusses, from the perspective of Dr. Colwell, is how manned space exploration isn't so much about the scientific breakthroughs or the advancement of technology and ideas as it is about preserving a much-needed component of space exploration as a whole: curiosity and the inspiration to pursue our lofty goals (Colwell & Britt, 2017). Unmanned operations are decisively the most cost-effective method to explore space; however, manned operations satisfy some of our most basic desires in wanting to excel and explore what is over the horizon. In fact, the initial drive behind the space program was a desire to excel above our Russian counterparts. NASA has recognized the need for publicity, and has done a good job of highlighting missions with exciting visuals and entertaining characters. One such example is John Glenn's return to space at the age of 77 to enable various "medical experiments" (Slakey & Spudis, 2008). The real winner in this mission was NASA, as it became the most actively followed mission since the Apollo moon landing. In justifying a need for $16 billion annually, manned missions do provide the excitement and garner the attention needed to keep the space mission relevant in the public eye. Dr. Colwell points out that it would be naïve to expect politicians to spend the same sums of money on purely "scientific exploration," and I agree. He goes on to speculate that if the manned program were cancelled today, its budget would disappear rather than be spent on any other space exploration endeavors (Colwell & Britt, 2017).

Dr. Colwell provides great insight with the statement "We need to move past the debate of manned versus unmanned programs and recognize that they serve different yet complementary roles, and that each endeavor ultimately strengthens the other" (Colwell & Britt, 2017). I agree with this, as well as with the argument from both Dr. Britt and Dr. Colwell that manned and unmanned space exploration may be synergistic and mutually dependent. Even prior to the moon landings, unmanned platforms were used to gather the data necessary to determine atmospheric conditions and the best landing sites. Unmanned exploration in advance of manned operations is necessary to reduce the risk of loss of life and to provide valuable context and information.

I do agree with both Dr. Britt and Dr. Colwell that current limitations of technology will limit our ability to pursue manned space operations, and that until supporting technology is available, we should focus on unmanned exploration of space. As they mention, all the data garnered in the meantime can only help us when we reach the point of again pursuing manned exploration in the future.

References
Colwell, J., & Britt, D. (2017). Are robots or astronauts the future of space exploration? Retrieved from https://www.ucf.edu/pegasus/opinion/

Slakey, F., & Spudis, P. (2008, February 1). Robots vs. Humans: Who Should Explore Space? Retrieved from https://www.scientificamerican.com/article/robots-vs-humans-who-should-explore/

Friday, November 10, 2017

Forget flying cars — passenger drones are the future

In the article Forget flying cars — passenger drones are the future, author Joe Blair paints a picture where, 10 years from now, passenger ride-sharing drones may be the option of choice for personal travel (2017).

Mr. Blair illustrates that while the conventional vision of people in the future flying around in personal flying autos like the Jetsons is common, in reality it is not viable, considering that there are some 326+ million people in the United States ("Population Clock," 2017). With registered vehicles in the U.S. estimated at around 236 million in 2015, shifting any significant portion of vehicular traffic to the airspace would be a nearly impossible feat (Statista, 2017). With passenger, military, and commercial aircraft, coupled with the onslaught of commercial drones that will soon be operating in our skies, there won't be much unused capacity remaining in our airspace.

As Mr. Blair points out, if all personal flying auto operators were required to amass 40+ flying hours in order to earn an FAA-approved flying certificate, the market for these craft would likely be small (2017). As such, he argues that to achieve a realistic outcome, passenger drones of the future will need to be fully automated. Due to the exorbitant purchasing cost, these automated drones would also likely be offered to the general public as a ridesharing or taxi service, much like the current Uber or Lyft services, allowing for on-demand transport without substantial cost investment. When considering Mr. Blair's reasoning for this technology being used primarily for ride-sharing, I do agree that this might be the most realistic outcome of passenger drone development.

Machine learning algorithms, sensors, and safety systems like collision avoidance currently being developed by Tesla, Uber, and Google for use in their autonomous vehicles will serve passenger drones just as well. These passenger drones may have an easier time navigating the skies than autonomous automobiles have on the roads, as there are often fewer unpredictable obstacles encountered in the skies, and there are more options for evading them (Blair, 2017).

These automated rideshare drone designs will likely be a blend: a large quadcopter with fixed wings, able to sustain heavy weight while maintaining maneuverability in a cluttered urban landscape, and they may be closer to reality than many believe. The Chinese firm EHang has already received clearance from Nevada to test the world's first passenger drone. The craft can reportedly fly at 11,500 feet and travel at speeds up to 63 mph, although it is limited to 23 minutes of flight time (Blair, 2017). Uber is also working on an autonomous air transportation service with Uber Elevate, utilizing Vertical Take-Off and Landing (VTOL) aircraft, with the goal of operational service within the next decade.

There are challenges that will need to be addressed if these companies are to succeed. First, current battery technology limits the operational range of drones. The rapid rate of technological advancement, however, could yield a viable replacement for the traditional lithium-ion batteries currently in use. One option discussed in the article centers on a Seattle-based company, LaserMotive. LaserMotive teamed with Lockheed Martin in 2012, using lasers targeting photovoltaic cells mounted on the Stalker UAS to maintain flight for 48 hours (Blair, 2017).

Additionally, regulation is a challenge that needs to be addressed to move forward in the near term. FAA rules for line-of-sight operations and operator requirements may stifle U.S. placement as a leader in passenger drone innovation, as other countries are already working to be at the forefront of autonomous commercial drone usage. Delft, a city in the Netherlands, has already approved hosting a fully autonomous drone network, with docking stations and rentals. Domino's Pizza has already teamed with drone maker Flirtey, delivering the first pizza via commercial drone in November of 2016.

Mr. Blair identifies a path forward for the U.S. to regain its footing: opening testing of passenger drones for emergency services. He suggests using passenger drones for search and rescue and ambulance services in life-and-death situations. One example is a cardiac patient in New York City who requires attention within 6 minutes, while the standard ambulance response time in 2015 was over 12 minutes. In this case, a passenger drone could rapidly airlift a paramedic and equipment to the scene.

I do support further research and development of this technology, and its usage (once refined) in emergency situations, as it has the potential to save countless lives. As Mr. Blair says, "Why not take a risk on saving people who would have no chance otherwise?"


References
Blair, J. (2017, January 28). Forget flying cars — passenger drones are the future. Retrieved from https://techcrunch.com/2017/01/28/forget-flying-cars-passenger-drones-are-the-future/
Population Clock. (2017). Retrieved November 10, 2017, from https://www.census.gov/popclock/
Statista. (2017). Number of cars in U.S. Retrieved November 10, 2017, from https://www.statista.com/statistics/183505/number-of-vehicles-in-the-united-states-since-1990/


Sunday, November 5, 2017

Forget Autonomous Cars—Autonomous Ships Are Almost Here


Unmanned Maritime Systems (UMS) are playing an increasingly greater role in both civilian and military functions. A UMS can often perform functions more safely, at lower cost, and more efficiently than manned crew operations. Tasks such as diver inspections of pipelines, ship salvaging, mine detection, and Intelligence, Surveillance and Reconnaissance (ISR) operations have all seen efficiency gains from UMS utilization. Regardless of application, rapid technological advances have enabled swift growth across the spectrum of maritime operations.


One area that could revolutionize global markets is the use of automated cargo ships. Recent technological improvements have enabled both ships capable of being piloted remotely and autonomous craft that can take corrective actions on their own. While fully autonomous cargo ships may be a few years away, automated commercial ships such as ferries and tugs that navigate themselves along local coasts may be a reality in the next few years (Levander, 2017).


Rolls-Royce is part of a joint project in Finland called Advanced Autonomous Waterborne Applications (AAWA), working to develop and improve the technology necessary to make fully automated commercial shipping a reality. Ships have steadily reduced the crew required to operate over the past few centuries, as technology helps civilization adapt to new ways of accomplishing tasks. As such, crew downsizing and automation advancements are part of a natural evolution we have been practicing for some time.


This article discusses the technology required to ensure the safety of commercial autonomous shipping operations. The vessel will need to utilize proximity sensors to monitor and evaluate surrounding obstacles and environmental conditions, communicating that data to a remote operator or feeding onboard computers capable of taking action based on the available inputs. These sensors allow for collision avoidance and are necessary for complicated functions such as docking on arrival at a port. Rolls-Royce is working on situational awareness systems that use high-definition visible-light and infrared cameras, along with LIDAR and radar inputs, to provide a thorough picture of the ship's environment. Additional information available to the autonomous computing system or remote operator would include satellite location data, weather reports, and other ships' reported information.
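
To illustrate how those inputs might come together, here is a toy sketch of a fused situational picture that either triggers onboard action or is handed to a remote operator. This is not Rolls-Royce's actual system; every class, field, and threshold below is an assumption for illustration:

```python
# Toy fusion of the sensor inputs named above into one situational picture.
from dataclasses import dataclass, field

@dataclass
class Contact:
    bearing_deg: float
    range_m: float
    source: str  # "camera", "ir", "lidar", "radar", or "ais"

@dataclass
class SituationalPicture:
    contacts: list = field(default_factory=list)
    wind_kts: float = 0.0
    own_position: tuple = (0.0, 0.0)

    def closest_contact(self):
        return min(self.contacts, key=lambda c: c.range_m, default=None)

def decide(picture: SituationalPicture, standoff_m: float = 500.0) -> str:
    """Act locally on a close contact, otherwise defer to shore control."""
    nearest = picture.closest_contact()
    if nearest and nearest.range_m < standoff_m:
        return "initiate collision-avoidance maneuver"
    return "forward picture to shore control, hold course"

pic = SituationalPicture(contacts=[Contact(45.0, 320.0, "radar"), Contact(10.0, 2400.0, "ais")])
print(decide(pic))  # -> initiate collision-avoidance maneuver
```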


There are many benefits to automating commercial shipping vessels. Labor is a significant cost of shipping operations; automating systems to reduce the manual labor required can lower companies' costs, translating into lower production prices and lower prices offered to customers. Different power systems may evolve, allowing ships to rely less on traditional fuel sources and incorporate electric and solar systems to reduce the carbon footprint. Security would also be positively affected: as shipping routes through dangerous waters patrolled by terrorists are sometimes necessary or more efficient, not having onboard crews that could be held hostage could reduce the likelihood of targeting by these groups. Additionally, ships could be designed to repel attacks and make access more difficult, with shipboard controls more secured. Safety would also improve: according to a 2012 German insurance company report, approximately 75 to 96 percent of maritime accidents are related to human error, often due to fatigue. Ships could also be built without the footprint required to support personnel, reducing weight and possibly making the ship's design more effective, with less wind resistance and reduced fuel consumption.


Several concerns do arise when discussing shipping automation. Weather events limiting communications or taking control links offline is one possible issue that would need to be addressed; a ship without control capability could put other seafaring vessels at risk of physical contact. Additionally, there is the threat of hacking. As mentioned in another module, hacking of vehicle systems has been demonstrated recently, leading to concerns that these vehicles and systems could be utilized to support a large-scale terrorist-type attack, or be used to inflict harm on individual persons or targets.


As shipping plays such a major role in global transport, supporting automated commercial cargo shipping is a good pursuit in my opinion. In the Port of Long Beach alone, nearly 6.75 million containers travelled through the port in CY 2016 (Port of Long Beach, 2017). While many of the laborious jobs may be reduced, as in most other sectors, we will increasingly have to rely on more technologically advanced skills to complete operations. Shifting some of these manual labor jobs to tech monitoring, control, and repair/support will help the job market and personal tech skills grow and advance as well.
-Jon
References


Levander, O. (2017, January 28). Forget Autonomous Cars—Autonomous Ships Are Almost Here. Retrieved from https://spectrum.ieee.org/transportation/marine/forget-autonomous-cars-autonomous-ships-are-almost-here


Port of Long Beach. (2017). Port of Long Beach - Yearly TEUs. Retrieved November 5, 2017, from http://www.polb.com/economics/stats/yearly_teus.asp

Sunday, October 29, 2017

Driverless Convoy Technology May Be Fielded Soon

Lockheed Martin has been working on a project for the past 14 years called the Autonomous Mobility Applique System (AMAS), aimed at automating vehicles utilized in convoy operations. The system is a kit that can be retrofitted to an existing platform and can allow convoys to operate with little to no human input.

The Autonomous Mobility Applique System has amassed more than 55,000 hours of road time on nine modified vehicles, and is getting close to being fielded. Successful 2014 demonstrations with TARDEC (the Army Tank Automotive Research, Development and Engineering Center) at Fort Hood, Texas, where AMAS-equipped M915 trucks and Palletized Load System flatbed vehicles performed convoy operations, have helped keep momentum going on this program (Seck, 2017). Previous testing at the Department of Energy's Savannah River facilities in South Carolina saw the use of seven vehicles in a convoy formation travelling at speeds of up to 40 miles per hour. Additional testing of the AMAS system was accomplished in 2016, following Lockheed's completion of its advanced Leader-Follower capabilities, demonstrating operations with seven Palletized Load System vehicles and two Light-Medium Tactical Vehicles for safety evaluations (Dennehy, 2017). In October 2016, five of these vehicles took part in the Army Warfighter Assessment (AWA) at Fort Bliss, Texas, where the new capabilities were demonstrated.

Currently, a manned lead vehicle controls the automated convoy vehicles that follow. AMAS uses a three-part drive-by-wire system: environment sensors, actuators to move the vehicle and pump the brakes, and a central computer that processes sensor data and issues driving commands (Seck, 2017). Lockheed's autonomous navigation system utilizes GPS, Light Detection and Ranging (LIDAR), and automotive radar (Bogue, 2016, p. 358). The system also provides collision mitigation braking, lane-keeping assist, roll-over warning, electronic stability control, and adaptive cruise control.
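
As a rough picture of how those three parts interact, here is a minimal sense-compute-actuate loop in Python. This is an illustrative sketch, not Lockheed's implementation; the gains, field names, and set distance are assumptions:

```python
# Minimal sense -> central computer -> actuator loop mirroring the three-part
# drive-by-wire architecture described above. Gains are illustrative.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    gap_to_leader_m: float   # e.g., from LIDAR/radar
    lane_offset_m: float     # e.g., from lane-keeping perception

def compute_commands(frame: SensorFrame, desired_gap_m: float = 50.0):
    """Central computer: turn sensor data into throttle/brake/steer commands."""
    gap_error = frame.gap_to_leader_m - desired_gap_m
    throttle = max(0.0, min(1.0, 0.02 * gap_error))   # gap opening -> speed up
    brake = max(0.0, min(1.0, -0.05 * gap_error))     # closing too fast -> brake
    steer = -0.5 * frame.lane_offset_m                # steer back toward lane center
    return throttle, brake, steer                     # sent to the actuators

throttle, brake, steer = compute_commands(SensorFrame(gap_to_leader_m=42.0, lane_offset_m=0.3))
print(f"throttle={throttle:.2f} brake={brake:.2f} steer={steer:.2f}")
```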

Demonstrations have shown that the AMAS system is capable of tasks including obstacle avoidance, following lead vehicles and the road, and maintaining set distances from other convoy vehicles (Seck, 2017). One field test navigated oncoming traffic, followed the rules of the road, identified and avoided pedestrians, and even re-routed itself through portions of the test areas to arrive safely at its destination (Bogue, 2016, p. 358). Fully autonomous software is still in development, aimed at allowing these vehicles to be dispatched to a set location to deliver food and supplies and then return to a supply point (Seck, 2017).

The benefits of using a system such as AMAS to automate convoy operations are plentiful, and have the potential to save numerous lives. An autonomous convoy could depart in substandard weather and navigate treacherous terrain to deliver supplies to field units. Reducing the number of lives exposed to dangerous convoy duties is another benefit, allowing those personnel to focus on more operationally relevant functions. Ultimately, the number of members who must be deployed to a combat environment to support these convoy functions could also be reduced. Additionally, autonomous vehicle technology could be used to allow a lead autonomous vehicle to run point in a convoy, identifying and removing the hazard of IEDs and other ordnance encountered (Bogue, 2016, p. 358).

I do agree with the development of technology such as this, which will allow for a smaller deployable footprint while enabling accomplishment of mission objectives. The potential to save lives otherwise lost supporting operations such as convoy movements also makes utilization of a system like this necessary. Along the lines of the questions that arise regarding the kill decision for UAS/RPA personnel removed from the battlefield, I am curious how all the automation we strive to implement will impact our future decision-making processes when determining which operations to pursue or support.


References
Bogue, R. (2016). The role of robots in the battlefields of the future. Industrial Robot: An International Journal, 43(4), 354-359. doi:10.1108/ir-03-2016-0104
Dennehy, K. (2017, February 28). Lockheed Martin's Autonomous Systems Unit Testing Air-Ground Vehicles. Retrieved from http://insideunmannedsystems.com/lockheed-martins-autonomous-systems-unit-testing-air-ground-vehicles/
Seck, H. (2017, March 30). Driverless Convoy Technology May Be Fielded Soon. Retrieved from https://www.defensetech.org/2017/03/30/driverless-convoy-technology-fielded/



Sunday, March 15, 2015

Case Analysis: UAS Awareness Enhancement



Abstract
Unmanned Aircraft Systems (UASs) rely on operational systems that often lack the ability to adequately convey feedback on aircraft performance and associated mechanical issues to their operators. If given additional sensory and awareness-enhancing information, these operators may be able to respond to potential vehicle-loss scenarios with enough time to correct issues that would not initially be identified using only visual indicators. Considering that unmanned aircraft are significantly more likely to crash than manned aircraft, it can be inferred that one potential cause or underlying issue contributing to this is the lack of situational awareness and feedback relayed to unmanned aircraft operators. This paper will focus on the lack of situational awareness unmanned pilots encounter in systems that are limited to visual indicators of performance or feedback. Additionally, the paper will identify the awareness-enhancing options available for operators of those systems: tactile or haptic feedback technology in displays and/or controls, touchscreen or overlaid displays, redesigned workstation layouts and controls, enhanced sensors and cameras, and auditory and multi-sensory inputs.







UAS Sensory Enhancement
Unmanned Aircraft Systems (UAS) can trace their roots back more than a hundred years through the history of aviation. One of the first examples of the use of unmanned craft is Eddy's surveillance kites, which were used as far back as 1883 to successfully take the first kite-mounted aerial photographs. The photos these kites took during the Spanish-American War of 1898 provided crucial information about adversary actions and positions. Fast-forwarding to 1917, Dr. Peter Cooper and Elmer A. Sperry invented the first gyroscopic stabilizer, which was used to convert a U.S. Navy Curtiss N-9 trainer into the world's first radio-controlled UAS. Further developments throughout the early 1900s resulted in aircraft such as the German V-1, a flying bomb of the 1940s that was launched via a catapult-type ramp and could carry a 2,000 lb bomb 150 miles before dropping its payload. Technological and design developments from the 1960s through the 1990s helped form what most consider today to be a typical UAS. The UASs of today offer strategic Intelligence, Surveillance and Reconnaissance (ISR), the ability to deliver armed military response when needed, and the potential to make significant contributions to the civil and commercial aviation sectors (Krock, 2002).
The typical UAS comprises three distinct systems: the vehicle, the payload, and the ground control system. The vehicle is the chosen form to deliver the payload and conduct the mission, and includes the airframe, the propulsion system, the flight computer and navigation systems, and, if applicable, the sense-and-avoid system. Differing mission requirements drive the decisions as to which vehicle is best suited to the intended role and associated requirements. The payload comprises electro-optical sensing systems and scanners, infrared systems, radar, dispensable loads (munitions or flares), and environmental sensors. Much like the vehicle selection, the payload components are chosen based upon the overall mission/role requirements. The ground control system houses the operational crew and maintains secure communications with the UAS, typically consisting of avionics, navigation, system health, and visual feedback displays; secure communication systems; and vehicle position mapping. Communication with the UAS can be via a Line-of-Sight (LOS) data link, or a satellite data link for Beyond Line-of-Sight (BLOS) operations (Unmanned Aerial Vehicle Systems Association, 2015).
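
This decomposition can be summarized in a brief, notional data model. The field names below are illustrative assumptions drawn from the description above, not a standard schema:

```python
# Notional model of the three UAS subsystems: vehicle, payload, ground control.
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    airframe: str
    propulsion: str
    has_sense_and_avoid: bool = False

@dataclass
class Payload:
    sensors: list = field(default_factory=list)       # EO, IR, radar, environmental
    dispensables: list = field(default_factory=list)  # munitions, flares

@dataclass
class GroundControlStation:
    datalink: str = "LOS"  # "LOS" or "BLOS" (satellite)
    displays: list = field(default_factory=lambda: ["avionics", "navigation", "system health", "video"])

@dataclass
class UAS:
    vehicle: Vehicle
    payload: Payload
    gcs: GroundControlStation

example = UAS(
    vehicle=Vehicle(airframe="fixed-wing", propulsion="piston"),
    payload=Payload(sensors=["EO", "IR"]),
    gcs=GroundControlStation(datalink="BLOS"),
)
```
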
Technological advancements over the years have dramatically expanded the capability of aircraft systems. Manned craft are becoming increasingly automated, and in some cases or portions of flight the role of the pilot has become more that of a systems monitor than an actual operator. Manned aircraft still offer a great deal of advantage on the battlefield, with the benefit of large-scale situational awareness, a 180-degree field of view, a vast array of system and operational capability, larger potential payload delivery, and speed, maneuverability, and visibility (Schneider & MacDonald, 2014). While UASs have the benefit of lower sourcing and operating costs, no danger to operators, and sustained flight free of fatigue, the two platforms share many commonalities. Many of the capabilities and payloads needed on the battlefield can be offered by both; they share payload accuracy, as the systems employed are typically similar, as are the sensors, image quality, and target acquisition components utilized. In a perfect battlefield environment, neither system would be used exclusively; both would offer and execute missions based on operational requirements (Schneider & MacDonald, 2014).
UASs have come a long way since their inception, and thanks to technological advancements have offered Combatant Commanders increasingly more mission execution options on the battlefield. With these advancements, capabilities are expanded, as are the support needs of the flight crews who operate these vehicles. Many factors affect operations, and giving flight crews better equipment that provides information faster, more seamlessly, and with greater reliability and definition is crucial to mission success. Another primary need of flight crews is the ability to better receive and interpret the different feedback and information relayed, thereby enhancing operator awareness. This paper will address ways in which technology can offer increased awareness to flight crews, and the resulting enhancement of crew ability to successfully execute missions.
Several issues will be addressed throughout, and will aid in tying the course objectives to the paper. These issues will be individually addressed and will apply directly to the course-specific Research Learning Outcomes (RLOs), to include:

· Describe the history and evolution of unmanned aircraft and space systems as they apply to current and future uses in today's commercial and military environments.

· Analyze the commonalities and differences between manned and unmanned systems, and their different uses, applications, and associated human factors.

· Evaluate the advantages and disadvantages of unmanned systems in relation to their current and future intended uses as they relate to human factors.

· Identify and describe the major human factors issues surrounding the design, use, and implementation of unmanned systems in today's commercial and military environments.

· Evaluate the commonalities and differences of human factors issues surrounding the design, use, and implementation of unmanned systems as they are compared to manned systems.
Issue
Remotely Piloted Aircraft (RPAs) and Unmanned Aircraft Systems (UASs) rely on operational systems that often lack the ability to adequately convey feedback on aircraft performance and associated mechanical issues to their operators. If given additional information or other awareness-enhancing options, these operators may be able to respond to potential vehicle-loss scenarios with enough time to correct issues that would not initially be identified using only visual indicators. Considering that unmanned aircraft are significantly more likely to crash than manned aircraft, particularly in the take-off and landing phases of flight, it can be inferred that one potential cause or underlying issue contributing to this is the lack of situational awareness and feedback relayed to unmanned aircraft operators. Unmanned pilots encounter these drawbacks in systems that are limited to visual indicators of performance or feedback, and increasing operator awareness in those systems through tactile or haptic feedback technology and other new technological options has the potential to greatly increase the reliability, safety, and performance of these systems.
Perception and external stimuli are extremely important considerations in human involvement with complex systems, especially in the arena of UAS operations, where these important senses can be degraded. Our senses play a large part in our interaction with and understanding of the world around us, and in the case of aviation one sense stands out among the rest. "It has been estimated that approximately 80% of all the information concerning the outside world is obtained visually. The percentage may be even higher than this in aviation for, in flying, vision is the most important of our senses" (Orlady, Orlady, & Lauber, 1999, p. 179).
In manned flight, there is a degradation of the visual sense, which results in the need to compensate with additional instrumentation and is eased by technological developments such as the Global Positioning System (GPS). This degradation of the visual sense is intensified in the operation of UASs and RPAs. Instead of the standard 180-degree field of view afforded to manned crews, unmanned vehicle operators flying Beyond Line of Sight (BLOS) must rely solely on sensors or cameras to provide their vision. This lack of vision severely reduces their ability to be fully aware of, and assess, the operating environment around their air vehicle.
These sensors and cameras are essentially an extension of the visual senses and capabilities of the operational crew members. While the cameras extend the capability of the human eye, "the act of sensing is essentially divorced from the act of perception" (Cooke, Pringle & Pedersen, 2006, p. 39). The visual sense is thereby limited by the capability of the cameras and sensors: if the frame freezes or becomes pixelated, perception and the resulting actions are negatively affected.
Image transfer delay is another issue, one which can also reduce the accurate assimilation of the information being relayed to RPA and UAS operators. Operating on incorrect or outdated information, even by a few seconds, can mean the difference between successful completion of mission objectives and failure. "Furthermore, humans tend to underestimate distances more in photographic quality imagery than in the real world" (Cooke et al., 2006, p. 42).
The quality of the images relayed to the operations crew may also be poor, which can reduce the crew's ability to determine the best course of action in any given scenario. On most manned aircraft, the remedy would simply be to add new technology to the aircraft's systems. Many unmanned vehicles, however, are designed to be lightweight and stay aloft for long periods of time; any significant addition of weight to these systems has the potential to affect the aircraft's capability to meet its mission requirements.
Additionally, confinement to an operating station robs UAS and RPA operators of their other senses. Specifically, the vestibular system is affected. The vestibular system helps us recognize qualities of our balance and position. Three of the main things that our vestibular system recognizes are:
· Static position: the position we are in when we are not moving around or changing positions.
· Velocity and direction: the speed of our motion, as well as our direction of movement.
· Acceleration: the speed at which we are moving, or the changes in speed that our body is experiencing (Tamarkin, 2011).
With so many different senses affected, the body and brain direct attention where they deem necessary. "The brain can force fixation on what the brain considers important stimuli. This often leads to tunnel vision or fixation on one or two elements, ignoring others, which may also be important" (Orlady et al., 1999, p. 180). Fixation on any one or a select few aspects, in an environment reliant on the few senses that are unaltered or not fully degraded, can lead to this tunnel vision scenario. This creates an opportunity for a dangerous situation to develop, possibly resulting in the damage or loss of an aircraft, and perhaps even casualties on the ground.
While there are many options that can be drawn upon to attempt to correct and alleviate the issues of sensory deprivation and lack of situational awareness, the fact remains that the visual indicators and information relay in unmanned aircraft operations are greatly degraded compared to those of manned aircraft. For unmanned crews, additional information, better image quality and transfer/refresh speeds, and other sensory-enhancing technologies are necessary to ensure crews are able to complete mission objectives and comply with requirements.
Significance of problem
As the previous section illustrates, operators of UASs and RPAs are subject to many of the same sensory-robbing conditions that face manned pilots. In addition, many of the key senses that manned pilots rely on for awareness and perception are further degraded for remote crews operating BLOS. These crews are limited to, and reliant on, what is conveyed via the aircraft's onboard sensors and cameras through their data links and interfaces.
These sensory and perception issues translate into a safety risk, as reduced awareness creates a scenario where the lack or degradation of information has the potential to hide issues that would traditionally be of concern. Human factors play a large part in aviation-related incidents, and remain a leading cause of accidents in both manned and unmanned systems. In the hundred-year span covering 1905-2005, "human errors are responsible for 67.57% of accidents" in the manned aviation world (Asim, Ehsan, & Rafique, 2005). These accidents claimed the lives of 121,870 people across the 17,369 logged accidents (Asim et al., 2005).
While these accidents have traditionally cost the lives of crew members and passengers, one of the key benefits of UASs and RPAs is their lack of onboard crew members. For military operations, this reduces the potential severity of the loss of one of these aircraft, as there is no loss of life. There are additionally no crew endurance limitations as there are in manned aircraft: crews can be swapped out, and the aircraft can continue operating seamlessly. Manned aircraft are still constrained by their pilots' limitations, as long flights result in fatigue and reduced awareness, creating a dangerous operational environment where poor decisions or slow reactions increase the probability of an accident.
Perhaps the most dangerous segments of flight for both manned and unmanned aircraft are take-offs and landings. These flight segments typically involve the human component the most, relying the least on automation. This requires increased attentiveness from the crews and creates the potential for error. While some newer manned and unmanned systems have automated take-off and/or landing systems, situations may still dictate that humans override the automatic capability and complete these actions themselves. While pilots are now more than ever systems monitors, the need for their abilities to remain fresh and skillful remains.
Technological advancements and increased automation in aviation have significantly helped to reduce manned accident rates over the past several decades. While these advancements have helped a great deal, they have also contributed to inattentiveness and an increase in distractions (Griffin, 2010). In commercial aviation over the last five years, the accident rate has flattened, and what remains tends to follow a common theme: human error. These factors include distractions, inadequate training, fatigue, poor communication between pilots and crews, and inattentiveness (Griffin, 2010).
Much like their manned counterparts, UAS operators and aircraft are subject to many of the same issues regarding human factors involvement in accidents, and appear to have a higher rate of human-error-related accidents than manned aircraft. In a study titled Probable causal factors in UAS accidents based on human factor analysis and classification system, the authors hypothesized that human factors were not a major contributor to UAS accidents in a sample population of 56 Army UAS accidents involving aircraft between the years 1995-2005 (Asim et al., 2005). Causes of these accidents ranged from material failure to environmental issues and combinations of these items, with approximately 30% of the accident causes listed as undetermined. The authors' hypothesis was determined to be incorrect, as 18 of these accidents (32%) were directly relatable to human factors issues as a primary or secondary causal factor (Asim et al., 2005).
As improved capability, reduced human workload, and reduced risk of fatality are all key goals for the successful integration of UASs within military and commercial aircraft operations, the need to focus on correcting the sources of these human factors incidents is paramount. While some of these issues may be corrected using the same philosophies applied to manned aviation, such as increased training at the controls and in simulated scenarios, enabling better communication between crew members, and reducing the workload of individual operators, there is also a need to address the issues that are unique to UAS and RPA operating environments.
Human errors in UAS and RPA operations are exacerbated by the varying control mechanisms, "from joysticks to mouse-and-keyboard configurations to touchscreen interfaces. This variety introduces more opportunities for mistakes, especially when multiple important controls are placed in close proximity" (Atherton, 2013). One example illustrates how this becomes an issue: "a drone operator, using a joystick, missed the landing gear button and instead killed the engine, causing the drone to stop flying and plummet out of the sky" (Atherton, 2013).
Manned aircraft have had over a century of research and design to develop and implement optimal cockpit and flight deck layouts that consider the best placement of controls, interfaces, and displays. This time and optimization has helped to increase the ease of systems management and of response and control actions by pilots and crews. According to Flying unmanned aircraft: a pilot's perspective, "the current proliferation of non-standard, aircraft-specific flight crew interfaces in UAS, coupled with the inherent limitations of operating UAS without in-situ sensory input and feedback (aural, visual, and vestibular cues), has increased the risk of mishaps associated with the design of the 'cockpit'" (Pestana, 2011).
This statement concurs with the notion that much of the human error involved in UAS and RPA accidents may stem from several preventable issues. Pestana also goes on to state that "accidents and mishaps associated with UAS operations are, to a large extent, related to human error; a closer examination, however, reveals that many of these human errors are the result of design shortfalls in the human–machine interfaces" (2011).
Designing crew stations with ergonomics as a driving force, intuitive and advanced display interfaces, improved cameras and sensors, and sensory-enabling devices and technology that restore some of the "feel" of flying are some of the ways engineers can help to ensure that future accident statistics for UASs and RPAs do not highlight human factors as such a causal factor.
Alternative actions
Workstation Layout
Several recent UAS mishaps have been attributed to GCS interface or layout designs. In 2006, one mishap was directly linked to poor button layout and design. In this particular instance, the landing gear switch was located on the side of the control stick, and the ignition switch was located next to several other frequently used in-flight buttons and switches. Because of the design and placement of the landing gear button, the operator had to release the control stick to actuate the landing gear switch and, while attempting to simultaneously utilize another button, accidentally hit the ignition switch and killed the engine. This resulted in a $1.5 million loss. Two other incidents, in 2001 and 2005 respectively, were attributed to display mounting and lighting that created glare. These glares ultimately resulted in a vehicle-loss scenario for both aircraft, as the operators erroneously interpreted the on-screen data (Waraich, Mazzuchi, Sarkani, & Rico, 2013).
One option to consider for alleviating some of the potential for human error is the use of redesigned ground control operator stations. Optimization of the control and interface layout allows for ease of interaction and monitoring, and reduces the varied workload of the operator. One such solution utilizes the following layout: "The Main Display is the primary monitoring device for the operator (no inputs can be given through it), whereas two touch screens (10 in.) are used as primary data-entry interfaces with secondary monitoring functions. In particular a Touch Screen is devoted to the safety critical functions and another to the non-safety critical ones. In the central panel are installed some hardwired controls that require a quick access by the operator" (Damilano, Guglieri, Quagliotti, & Sale, 2012). Additional controls are ergonomically located for ease of use. These stations would be reconfigurable to allow for changes in requirements.
Ergonomic design consideration in the GCS developmental process, as well as in the selection of display and environmental systems within the GCS, may help to reduce some of the associated human factors risk. Additionally, effective design may lead to more efficient operation, reducing the stress and fatigue experienced by UAS operators. Attention to features such as "displays, seating design and configuration, control layout, input devices (e.g., buttons, switches, etc.), and communication methods", without sacrificing or degrading the capability and accessibility of other features, is vital (Waraich et al., 2013).
Improved Displays and Interfaces
Colonel John Dougherty, a Predator operations commander with the North Dakota Air National Guard, contends that the Predator has "too many screens with too much information…" (Freedberg, 2012). Dougherty also points out that as new capabilities are developed, the outcome is additional information or displays that the operational crews have to review and utilize, resulting in excessive workload additions and, in turn, additional fatigue. The Predator system did not integrate human factors and ergonomic design considerations into the initial build; the technology was deemed so valuable during demonstration that the aircraft was rushed into operational use (Freedberg, 2012).
Current displays utilized in smaller UASs tend to be engineer-focused, as opposed to user-focused. The effects include reduced mission effectiveness, operator frustration, and degraded situational awareness (Stroumtsos, Gilbreath, & Przybylski, 2013). On these smaller UASs, functionality is typically limited and requires the utilization of different hardware components. In this case, utilizing a single display and interface with an intuitive touchscreen alleviates some of the task saturation associated with operations. Additionally, reducing the equipment required for operations has the potential to reduce the required crew size, alleviating some of the crossed communication and distractions that may typically occur during operations under tense conditions.
Another display/interface solution that may be considered is the use of a single main display. The research paper titled Integrating critical interface elements for intuitive single-display aviation control of UAVs describes the use of this single-display option primarily for commercial UASs performing inspection or monitoring tasks; however, the concept may be applied to the military sector as well (Cooper & Goodrich, 2006). This interface utilizes a georeferenced terrain map populated by publicly available information such as altitude data and terrain imagery. Imagery retrieved from the UAS helps to provide a stable frame of reference integrated into the terrain imagery model. Icons are overlaid on the main display to provide control and feedback, but fade to a "semi-transparent state when not in use to avoid distracting the operator's attention from the video signal" (Cooper & Goodrich, 2006).
Touchscreen technology offers advantages to operational crew members over traditional display options, whether used exclusively or in conjunction with additional displays. Touchscreens allow for greater flexibility, as reconfiguration is easy with software-generated controls. New interactive features may be utilized to allow for the removal of on-screen keyboards and scroll bars. Uncluttering the display allows for greater display dimension and enables the user to utilize additional interface preferences for control and display options on the touchscreen (Damilano, Guglieri, Quagliotti, & Sale, 2011).
Sensory Feedback
One of the major issues in UAS operations for crew members is the physical separation that exists between the operators and the aircraft. This separation results in the sensory deprivation of vestibular and tactile inputs. With the addition of systems that enable the sensory re-integration of tactile feedback, operators may be able to use this simulated sense to perceive tactile cues typically absent in UAS systems. This may allow the operator to sense changes in environmental inputs or mechanical issues, or alert crews to impending stalls and other maneuvering issues (Mung Lam, Mulder, & Van Passen, 2006).
One of the most promising ways to improve awareness of the air vehicle's status and condition is through the use of haptic feedback technology. In a perfect scenario, a "remote presence" may "enable the operator to perceive the remote environment as if sensed directly" (Giordano, Deusch, Lächele, & Bülthoff, 2011). Haptic feedback has the potential to allow operations crew members to sense some of the same things that manned pilots are able to. There are several methods to achieve this haptic feedback relay to operations crews: "The first technique considers the addition of an external force offset generated by an artificial force field, whereas the second technique considers the addition of an external spring constant" (Mung Lam et al., 2006).
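
Those two techniques can be expressed in a short notional sketch; the gains, deflections, and units below are illustrative assumptions, not values from Mung Lam et al. (2006):

```python
# Notional stick-force laws for the two haptic cueing techniques quoted above.
def stick_force_offset(k_base: float, x: float, f_field: float) -> float:
    """Technique 1: nominal spring feel plus an artificial force-field offset."""
    return k_base * x + f_field

def stick_force_spring(k_base: float, k_ext: float, x: float) -> float:
    """Technique 2: stiffen the stick by adding an external spring constant."""
    return (k_base + k_ext) * x

# Example: an obstacle ahead pushes back on the stick (technique 1), or
# equivalently stiffens deflection toward it (technique 2). Values illustrative.
print(stick_force_offset(k_base=2.0, x=0.1, f_field=1.5))  # 1.7
print(stick_force_spring(k_base=2.0, k_ext=3.0, x=0.1))    # 0.5
```
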
One reason that may help explain the necessity of sensory systems to relay information to UAS operators is that of environmental inputs, such as wind turbulence. Severe turbulence in flight is something manned crews are able to identify and respond to accordingly; in a UAS, however, the only indicator of such a condition may be video quality interference or destabilization of the video feed. One study inserted control stick force feedback that transmitted high-frequency perturbations scaled commensurate with the intensity of the encountered disturbance. In this study, pilots using these systems were able to respond more quickly, resulting in fewer errors (Cooke et al., 2006, p. 155).
Another method for re-integrating sensory inputs into the operation of UAS systems is the use of artificial vestibular enhancing systems. While some of the traditional methods for introducing sensory inputs back into UAS operations focus on tactile feedback, vestibular systems also offer an enhanced ability for UAS crew members to experience the same perception of "self linear/angular motion through the integration of the inertial information" typically associated with manned flight operations (Giordano et al., 2011). To introduce these vestibular cues, a motion simulator capable of similar flight maneuvering performance is used in conjunction with the visual feedback received from the UAS cameras, enabling crew members to "sense" the performance of the aircraft (Giordano et al., 2011).
An understanding of aircraft maneuvers, mechanical issues and environmental cues can provide situational awareness not typically afforded to unmanned crews, letting crew members react quickly with the needed inputs.
Auditory and Multi-Sensory Inputs
Another significant issue in unmanned flight is the lack of auditory inputs for UAS crew members. Auditory cues are crucial in manned flight: they can warn the pilot of impending mechanical or other aircraft issues and alert them to required actions. Auditory cues that help the operator stay aware of their surroundings are another awareness-enhancing option. Like other awareness-enhancing technology, this has the added benefit of shifting some cognitive processing to other sensory systems, alleviating potential fatigue and reducing workload.
Continuous and discrete audio cues are two ways auditory functions can enhance awareness. In continuous audio, a sound plays for as long as a task is being performed; once the task completes, the sound stops, signaling the operator to initiate a new task or perform another function. Discrete audio is what is typically used in flight, presenting beeps or other chime sounds to identify items that need attention from the operational crews (Graham, 2008).
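The distinction can be captured in a few lines. The sketch below is a toy model in which print() stands in for an audio backend; the class and method names are invented for illustration.

```python
class AudioCueManager:
    """Toy model of continuous vs. discrete auditory cues (cf. Graham, 2008)."""

    def __init__(self):
        self.continuous = set()

    # Continuous audio: a sound runs for the whole task, so silence
    # itself tells the operator the task has finished.
    def start_task(self, task_id):
        self.continuous.add(task_id)
        print(f"[loop on ] task '{task_id}' running")

    def finish_task(self, task_id):
        self.continuous.discard(task_id)
        print(f"[loop off] task '{task_id}' done -- operator initiates next task")

    # Discrete audio: a one-shot beep or chime marks an event needing attention.
    def alert(self, message):
        print(f"[chime] {message}")

mgr = AudioCueManager()
mgr.start_task("orbit waypoint 3")
mgr.alert("datalink signal degraded")
mgr.finish_task("orbit waypoint 3")
```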
In research on crew members supervising multiple UASs simultaneously, spatial audio was utilized. “Spatial audio is 3-dimensional (3D) audio in which audio is presented so that specific signals come from specific locations in the 360 degree range around someone’s head. An example is an audio alert for one UAV presented over a headset directly in front of the operator, while alerts for another UAV are connected to signals directly behind the operator” (Graham, 2008). Spatial audio has been shown to reduce target acquisition time and the scanning time required. It allows crew members who are fixated on one aspect of flight, or busy scanning for target acquisition, to receive and register important items that need their attention. This increases operational safety and enhances awareness (Graham, 2008).
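As a simplified illustration of assigning each UAV its own bearing, the sketch below uses constant-power stereo panning. True spatial audio also encodes elevation and front/back cues that plain two-channel panning cannot; the bearings and function are assumptions for the example.

```python
import math

# Each UAV's alerts are rendered from a fixed bearing around the
# operator's head (0 deg = ahead, 90 = right, 180 = behind).
UAV_BEARINGS_DEG = {"UAV-1": 0.0, "UAV-2": 90.0, "UAV-3": 270.0}

def pan_gains(bearing_deg: float) -> tuple[float, float]:
    """Map a bearing to left/right channel gains (constant-power law).

    Note: plain stereo panning cannot distinguish front from back;
    real 3D audio uses head-related transfer functions for that.
    """
    pan = math.sin(math.radians(bearing_deg))  # -1 (left) .. +1 (right)
    theta = (pan + 1) * math.pi / 4            # 0 .. pi/2
    return math.cos(theta), math.sin(theta)

for uav, bearing in UAV_BEARINGS_DEG.items():
    left, right = pan_gains(bearing)
    print(f"{uav}: L={left:.2f} R={right:.2f}")
```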
Using this option in conjunction with haptic feedback may create an ideal situation for increased crew awareness. In a study titled Assessing the Impact of Haptic Peripheral Displays for UAV Operators, UAS pilot test subjects operated in environments with simulated auditory environmental feedback and received auditory cues alerting them to developing situations (Donmez, Graham, & Cummings, 2008). Overall, the participants favored the use of auditory cues; however, it was noted that “auditory cues must be integrated in a fashion that the operators will be able to tolerate subjectively, and these signals must also be integrated with other audio cues to ensure balanced aural saliency” (Donmez et al., 2008).
Cameras and Sensors
Just as technological advancements are introducing a multitude of new UAS aircraft with a vast array of new and enhanced capabilities, those same advancements extend to camera and sensor development, further increasing the awareness of operational crews.
The Wide Area Aerial Surveillance (WAAS) concept equips high-endurance UAS platforms with a WAAS-type sensor payload. The payload typically contains high-resolution electro-optical sensors mounted in a fixed manner and aimed in multiple directions (Rogoway, 2014). Onboard software stitches the different sensors' imagery into a single seamless high-resolution image that can be sent as a video feed directly to users. Providing several direct video feeds lets each user stream just the portion of the video applicable to their operations, enabling faster data transfer, since the streamed region is far smaller than the whole stitched image (Rogoway, 2014).
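The bandwidth argument is easy to see in a sketch: if each user subscribes only to a region of interest within the stitched mosaic, the streamed chip is a tiny fraction of the full frame. The frame and ROI sizes below are illustrative, not taken from any fielded WAAS system.

```python
import numpy as np

def roi_stream(stitched: np.ndarray, x: int, y: int, w: int, h: int) -> np.ndarray:
    """Return the sub-image a single user subscribes to."""
    return stitched[y:y + h, x:x + w]

# Stand-in stitched mosaic (grayscale, sizes invented for the example).
mosaic = np.zeros((10_000, 10_000), dtype=np.uint8)
chip = roi_stream(mosaic, x=4200, y=1300, w=640, h=480)

# Per frame, the chip is ~0.3% of the mosaic's data volume.
print(f"full frame: {mosaic.nbytes / 1e6:.0f} MB, chip: {chip.nbytes / 1e3:.0f} kB")
```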
A newer Defense Advanced Research Projects Agency (DARPA) initiative, the Autonomous Real-Time Ground Ubiquitous Surveillance Imaging System (ARGUS), aims to increase the capabilities of aerial surveillance. ARGUS is a 1.8-gigapixel video surveillance platform that can resolve details as small as six inches from an altitude of 20,000 feet (6 km) (Anthony, 2013). ARGUS can identify birds flying in the sky while observing from 20,000 feet, using 368 smaller image sensors that together enable the overall capability of the system (Anthony, 2013).
Another area that has seen much technological advancement is sense-and-avoid technology. New ideas such as Amazon's intent to use small drones for residential package delivery, along with the push toward automation in automobiles, have been driving research and development in the private sector (Black, 2014). For UAS, the major concern and obstacle to civil operation is the inability of multiple aircraft to operate in congested areas and airspace without significant risk to the general public. To make this technology work, sensors are being developed at a fraction of the size of those used on larger counterparts, enabling use on smaller UAS that need lightweight technology to remain efficient and capable.
Two potential technological options to address this risk are optical flow sensors and micro radar devices. Optical flow sensors, much like those used in trackball-free computer mice, are being adapted for collision avoidance, and echo-location sensors may be paired with them for operations in fog (Black, 2014). Companies such as Integrated Robotics Imaging Systems are currently developing micro radar systems that weigh between 7 and 12 ounces, less than 5 percent of the weight of their current commercial counterparts, with a price tag of $7,000 to $10,000 (Black, 2014).
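To illustrate the optical-flow idea: an expanding (“looming”) flow field between consecutive frames suggests an object ahead is closing. The sketch below uses OpenCV's Farnebäck optical flow; the parameters and synthetic frames are invented for the example and are not from any particular sensor product.

```python
import cv2
import numpy as np

def looming_score(prev_gray: np.ndarray, curr_gray: np.ndarray) -> float:
    """Mean divergence of the optical-flow field; positive = expansion."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    fx, fy = flow[..., 0], flow[..., 1]
    div = np.gradient(fx, axis=1) + np.gradient(fy, axis=0)
    return float(div.mean())

# Synthetic demo: a square that grows between frames "looms" toward the camera.
prev = np.zeros((120, 120), np.uint8)
curr = np.zeros((120, 120), np.uint8)
cv2.rectangle(prev, (50, 50), (70, 70), 255, -1)
cv2.rectangle(curr, (45, 45), (75, 75), 255, -1)
print(f"looming score: {looming_score(prev, curr):+.4f}")  # expect a positive value
```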
These new camera and sensor technologies will allow operators to use small UAS in commercial applications, even in residential areas. With ever-evolving technological advancements, the potential for UAS applications in the private, commercial and military sectors is virtually limitless.
Recommendations
Unmanned systems have come a long way since their beginnings as kite-mounted photographic systems in 1883. Over the years, we have come to rely on these systems for surveillance, payload delivery and support of military and civilian functions, with ever-increasing demands for performance. As innovation has reshaped manned flight, similar changes have helped shape the use and direction of UAS operations.
As with any system, issues arise that create the need for technological solutions to increase the efficiency and safety of UAS operations. Unlike manned flight, where operators can sense changes in aircraft performance and are afforded a 180-degree view of the environment, unmanned crews are restricted to the information relayed from their sensors and cameras when making control inputs and operational decisions. Many solutions are in use, or in development, that will help ease the problems currently afflicting UAS.
Specifically, giving operational crews more information, faster and more efficiently, while providing additional information delivery methods, is the key to increasing the safety, efficiency and effectiveness of these systems. Many methods exist for achieving these results, including advancements in and utilization of:
·       Workstation Layout - redesign and optimization of ground control station layout and control placement.
·       Improved Displays and Interfaces - integrated touchscreen displays, with either one primary screen with overlays or multiple customizable screens.
·       Sensory Feedback - tactile or haptic feedback, or artificial vestibular enhancing systems.
·       Auditory and Multi-Sensory Inputs - continuous or discrete audio to aid operators in performance monitoring, or combinations of auditory and tactile feedback.
·       Cameras and Sensors - enhanced cameras such as ARGUS, or optical flow sensors and micro radar devices.
Using any of these enhancements, alone or in combination, can increase the awareness of operational UAS crews, thereby increasing the efficiency and effectiveness of operations.
References
Anthony, S. (2013, January 28). DARPA shows off 1.8-gigapixel surveillance drone, can spot a terrorist from 20,000 feet. Retrieved from http://www.extremetech.com/extreme/146909-darpa-shows-off-1-8-gigapixel-surveillance-drone-can-spot-a-terrorist-from-20000-feet
Asim, M., Ehsan, N., & Rafique, K. (2005). Probable Causal Factors In UAV Accidents Based On Human Factor Analysis And Classification System. Retrieved from http://www.icas.org/ICAS_ARCHIVE/ICAS2010/PAPERS/492.PDF
Atherton, K. (2013, March 4). What Causes So Many Drone Crashes? Retrieved from http://www.popsci.com/technology/article/2013-03/human-error-after-all
Black, T. (2014, June 7). Amazon’s Drone Dream Sets Off Race to Build Better Sensor - Bloomberg Business. Retrieved from http://www.bloomberg.com/news/articles/2014-06-06/amazon-s-drone-dream-sets-off-race-to-build-better-sensor
Cooke, N. J., Pringle, H., & Pedersen, H. (2006). Human factors of remotely operated vehicles. Amsterdam: Elsevier JAI.
Cooper, J., & Goodrich, M. (2006, May 19). Integrating critical interface elements for intuitive single-display aviation control of UAVs. Retrieved from https://faculty.cs.byu.edu/~mike/mikeg/papers/Spie.pdf
Damilano, L., Guglieri, G., Quagliotti, F., & Sale, I. (2012, January). FMS for Unmanned Aerial Systems: HMI Issues and New Interface Solutions. Retrieved from http://search.proquest.com.ezproxy.libproxy.db.erau.edu/pqcentral/docview/911157189/fulltextPDF/58C98063B27849D8PQ/9?accountid=27203
Donmez, B., Graham, H., & Cummings, M. (2008, March). Assessing The Impact Of Haptic Peripheral Displays For UAV Operators. Retrieved from http://web.mit.edu/aeroastro/labs/halab/papers/HAL2008_02.pdf
Freedberg, S. (2012, August 7). Too Many Screens: Why Drones Are So Hard To Fly, So Easy To Crash. Retrieved from http://breakingdefense.com/2012/08/too-many-screens-why-drones-are-so-hard-to-fly-and-so-easy/
Giordano, P., Deusch, H., Lächele, J., & Bülthoff, H. (2011, April 10). Visual-Vestibular Feedback for Enhanced Situational Awareness in Teleoperation of UAVs. Retrieved from http://www.kyb.tuebingen.mpg.de/fileadmin/user_upload/files/publications/AHS_FORUM66_%5B0%5D.pdf
Graham, H. (2008, June). Effect of Auditory Peripheral Displays On Unmanned Aerial Vehicle Operator Performance. Retrieved from http://web.mit.edu/aeroastro/labs/halab/papers/Graham_Thesis.pdf
Griffin, G. (2010, February 14). Human error is biggest obstacle to 100 percent flight safety - The Denver Post. Retrieved from http://www.denverpost.com/ci_14398562
Krock, L. (2002, November). Time Line of UAVs. Retrieved from http://www.pbs.org/wgbh/nova/spiesfly/uavs.html
Mung Lam, T., Mulder, M., & Van Passen, R. (2006). Haptic Feedback for UAV Tele-operation - Force offset and spring load modification. Retrieved from http://www.lr.tudelft.nl/fileadmin/Faculteit/LR/Organisatie/Afdelingen_en_Leerstoelen/Afdeling_C_O/Control_and_Simulation/News/Archive_2007/doc/ieeesmc_mlam_2006_final.pdf
Orlady, H. W., Orlady, L. M., & Lauber, J. K. (1999). Human factors in multi-crew flight operations. Aldershot, England: Ashgate.
Pestana, M. (2011, March 29). Flying Unmanned Aircraft: A Pilot’s Perspective. Retrieved from http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20110011979.pdf
Rogoway, T. (2014, August 18). How One New Drone Tech Finally Allows All-Seeing Surveillance. Retrieved from http://foxtrotalpha.jalopnik.com/how-one-new-drone-tech-finally-allows-all-seeing-survei-1553272901
Schneider, J., & MacDonald, J. (2014, June 16). Are Manned or Unmanned Aircraft Better on the Battlefield? Retrieved from http://ciceromagazine.com/features/the-ground-truth-about-drones-manned-vs-unmanned-effectiveness-on-the-battlefield/
Stroumtsos, N., Gilbreath, G., & Przybylski, S. (2013). An intuitive graphical user interface for small UAS. Retrieved from http://www.public.navy.mil/spawar/Pacific/Robotics/Documents/Publications/2013/SPIE2013-Raven.pdf
Tamarkin, D. (2011). Vestibular Senses. Retrieved from http://faculty.stcc.edu/AandP/AP/AP2pages/Units14to17/unit16/vestibul.htm
Unmanned Aerial Vehicle Systems Association. (2015). UAS Components. Retrieved from http://www.uavs.org/index.php?page=uas_components
Waraich, Q., Mazzuchi, T., Sarkani, S., & Rico, D. (2013, January). Minimizing Human Factors Mishaps in Unmanned Aircraft Systems. Retrieved from http://erg.sagepub.com/content/21/1/25.full.pdf