
Case Analysis: UAS Awareness Enhancement



Abstract
Unmanned Aircraft Systems (UAS) rely on operational systems that often lack the ability to adequately convey feedback of aircraft performance and associated mechanical issues to their operators. If given additional sensory and awareness-enhancing information, these operators may be able to respond to potential vehicle loss scenarios with enough time to correct issues that would not be initially identified using only visual indicators. Considering that unmanned aircraft are significantly more likely to crash than manned aircraft, it can be inferred that one potential underlying issue is the lack of situational awareness and feedback relayed to unmanned aircraft operators. This paper focuses on the lack of situational awareness unmanned pilots encounter in systems that are limited to visual indicators of performance or feedback. Additionally, the paper identifies the awareness-enhancing options available to operators of those systems, including tactile or haptic feedback technology in displays and controls, touchscreen or overlaid displays, redesigned workstation layouts and controls, enhanced sensors and cameras, and auditory and multi-sensory inputs.

UAS Sensory Enhancement
Unmanned Aircraft Systems (UAS) can trace their roots back more than a hundred years through the history of aviation. One of the first examples of unmanned craft is Eddy’s surveillance kites, which were used as far back as 1883 to take the first kite-mounted aerial photographs. The photographs these kites took during the Spanish-American War of 1898 provided crucial information about adversary actions and positions. Fast forwarding to 1917, Dr. Peter Cooper and Elmer A. Sperry invented the first gyroscopic stabilizer, which was used to convert a U.S. Navy Curtiss N-9 trainer into the world’s first radio-controlled UAS. Further developments throughout the early 1900s resulted in aircraft such as the German V-1, a flying bomb of the 1940s that was launched from a catapult-type ramp and could carry a 2,000 lb warhead 150 miles before dropping its payload. Technological and design developments from the 1960s through the 1990s helped form what most consider today to be a typical UAS. The UAS of today offer strategic Intelligence, Surveillance and Reconnaissance (ISR), the ability to deliver armed military response when needed, and the potential to make significant contributions to the civil and commercial aviation sectors (Krock, 2002).
The typical UAS comprises three distinct systems: the vehicle, the payload, and the ground control system. The vehicle is the chosen form to deliver the payload and conduct the mission, and includes the airframe, the propulsion system, the flight computer and navigation systems, and, if applicable, the sense-and-avoid system. Differing mission requirements drive the decision as to which vehicle is best suited to the intended role. The payload is composed of electro-optical sensing systems and scanners, infra-red systems, radar, dispensable loads (munitions or flares), as well as environmental sensors. Much like the vehicle selection, the payload components are chosen based upon the overall mission and role requirements. The ground control system houses the operational crew and maintains secure communications with the UAS, typically consisting of avionics, navigation, system health and visual feedback displays, secure communication systems, and vehicle position mapping. Communication with the UAS can be via a Line-of-Sight (LOS) data link, or a satellite data link for Beyond Line-of-Sight (BLOS) operations (Unmanned Aerial Vehicle Systems Association, 2015).
Technological advancements over the years have expanded the capability of aircraft systems dramatically. Manned craft are becoming increasingly automated, and in some cases or portions of flight the pilot’s role has become more that of a systems monitor than an active operator. Manned aircraft still offer a great deal of advantage on the battlefield, with the benefit of large-scale situational awareness, a 180-degree field of view, a vast array of system and operational capability, larger potential payload delivery, and speed, maneuverability and visibility (Schneider & MacDonald, 2014). While UAS have the benefit of lower sourcing and operating costs, no danger to operators, and sustained flight free of fatigue, the two platforms share many commonalities. Many of the capabilities and payloads needed on the battlefield can be offered by both platforms; they share payload accuracy, as the systems employed are typically similar, as are the sensors, image quality, and target acquisition components utilized. In a perfect battlefield environment, neither system would be used exclusively; both would execute missions based on operational requirements (Schneider & MacDonald, 2014).
UAS have come a long way since their inception, and thanks to technological advancements they offer Combatant Commanders increasingly more mission execution options on the battlefield. With these advancements, capabilities expand, as do the support needs of the flight crews who operate these vehicles. Many factors affect operations, and giving flight crews equipment that provides information faster, more seamlessly, and with greater reliability and definition is crucial to mission success. Another primary need of flight crews is the ability to better receive and interpret the feedback and information relayed to them, thereby enhancing operator awareness. This paper addresses ways in which technology can offer increased awareness to flight crews, and the resulting enhancement of crew ability to successfully execute missions.
Several issues will be addressed throughout the paper, each tying directly to the course-specific Research Learning Outcomes (RLOs), which include:
·       Describe the history and evolution of unmanned aircraft and space systems as they apply to current and future uses in today’s commercial and military environments.
·       Analyze the commonalities and differences between manned and unmanned systems, and their different uses, applications, and associated human factors.
·       Evaluate the advantages and disadvantages of unmanned systems in relation to their current and future intended uses as they relate to human factors.
·       Identify and describe the major human factors issues surrounding the design, use, and implementation of unmanned systems in today’s commercial and military environments.
·       Evaluate the commonalities and differences of human factors issues surrounding the design, use, and implementation of unmanned systems as compared to manned systems.
Issue
Remotely Piloted Aircraft (RPAs) and Unmanned Aircraft Systems (UAS) rely on operational systems that often lack the ability to adequately convey feedback of aircraft performance and associated mechanical issues to their operators. If given additional information or other awareness-enhancing options, these operators may be able to respond to potential vehicle loss scenarios with enough time to correct issues that would not be initially identified using only visual indicators. Considering that unmanned aircraft are significantly more likely to crash than manned aircraft, particularly in the take-off and landing phases of flight, it can be inferred that one underlying issue contributing to this is the lack of situational awareness and feedback relayed to unmanned aircraft operators. Unmanned pilots encounter these drawbacks in systems that are limited to visual indicators of performance or feedback, and increasing operator awareness through tactile or haptic feedback technology and other new technological options has the potential to greatly improve the reliability, safety, and performance of these systems.
Perception and external stimuli are extremely important considerations in human involvement with complex systems, especially in the arena of UAS operations, where these important senses can be degraded. Our senses play a large part in our interaction with and understanding of the world around us, and in the case of aviation one sense stands out among the rest. “It has been estimated that approximately 80% of all the information concerning the outside world is obtained visually. The percentage may be even higher than this in aviation for, in flying, vision is the most important of our senses” (Orlady, Orlady, & Lauber, 1999, p. 179).
In manned flight, this visual sense is degraded, which creates the need to compensate with additional instrumentation and is aided by technological developments such as the Global Positioning System (GPS). This degradation of vision is intensified in the operation of UAS and RPAs. Instead of the standard 180-degree field of view afforded to manned crews, unmanned vehicle operators flying Beyond Line of Sight (BLOS) must rely solely on sensors or cameras to provide their vision. This lack of vision severely reduces their ability to be fully aware of and assess the operating environment around their air vehicle.
These sensors and cameras are essentially an extension of the visual senses and capabilities of the operational crew members. While the cameras extend the capability of the human eye, “the act of sensing is essentially divorced from the act of perception” (Cooke, Pringle & Pedersen, 2006, p. 39). The visual sense is thereby limited by the capability of the cameras and sensors: if the frame freezes or becomes pixelated, perception and the resulting actions are negatively affected.
Another issue is image transfer delay, which can also reduce the accurate assimilation of the information being relayed to RPA and UAS operators. Operating on incorrect or outdated information, even by a few seconds, can mean the difference between successful completion of mission objectives and failure. “Furthermore, humans tend to underestimate distances more in photographic quality imagery than in the real world” (Cooke et al., 2006, p. 42).
The quality of the images relayed to the operations crew may be poor, which can reduce the crew’s ability to determine the best course of action in a given scenario. On most manned aircraft this would simply present an opportunity to introduce new technology. In unmanned systems, however, many of these vehicles are designed to be lightweight and stay aloft for long periods of time, and any significant addition of weight has the potential to affect the aircraft’s ability to meet its mission requirements.
Additionally, confinement to an operating station robs the UAS and RPA operators of their other senses. Specifically, the vestibular system is affected. The vestibular system helps us recognize qualities of our balance and position. Three of the main things that our vestibular system recognizes are:
·       Static position- The position we are in when we are not moving around or changing positions.
·       Velocity and direction- The speed of our motion, as well as our direction of movement.
·       Acceleration- The rate at which our speed changes, or the changes in speed that our body is experiencing (Tamarkin, 2011).
With so many senses affected, the brain directs attention where it deems necessary. “The brain can force fixation on what the brain considers important stimuli. This often leads to tunnel vision or fixation on one or two elements, ignoring others, which may also be important” (Orlady et al., 1999, p. 180). Fixation on one or a few aspects of an environment that relies on the few senses that are not degraded can lead to this tunnel vision scenario, creating the opportunity for a dangerous situation, possibly resulting in damage to or loss of an aircraft, and perhaps even casualties on the ground.
While there are many options that can be drawn upon to attempt to correct and alleviate the issues of sensory deprivation and lack of situational awareness, the fact remains that the visual indicators and information relay in unmanned aircraft operations are significantly worse than in manned aircraft. For unmanned crews, additional information, better image quality and transfer/refresh speeds, and other sensory-enhancing technologies are necessary to ensure crews are able to complete mission objectives and requirements.
Significance of problem
As the previous section illustrates, operators of UAS and RPAs are subject to many of the same sensory-robbing conditions that face manned pilots. In addition, many of the key senses that manned pilots rely on for awareness and perception are further degraded for remote crews operating BLOS. These crews are limited to and reliant on what is conveyed from the aircraft’s onboard sensors and cameras through their data links and interfaces.
These sensory and perception issues translate into a safety risk, as reduced awareness creates a scenario where the lack or degradation of information has the potential to hide issues that would traditionally be of concern. Human factors play a large part in aviation-related incidents and remain a leading cause of accidents in both manned and unmanned systems. In the hundred-year span covering 1905-2005, “human errors are responsible for 67.57% of accidents” in the manned aviation world (Asim, Ehsan, & Rafique, 2005). The 17,369 logged accidents claimed the lives of 121,870 people (Asim et al., 2005).
While these accidents have traditionally resulted in lost lives of crew members and passengers, one of the key benefits of UAS and RPAs is their lack of onboard crew members. For military operations, this reduces the potential severity of the loss of one of these aircraft, as there is no loss of life. There are additionally no crew limitations in unmanned aircraft as there are in manned aircraft, as crews can be swapped out and aircraft can operate continuously. Manned aircraft are still constrained by their pilots’ limitations, as long flights result in fatigue and reduced awareness. This fatigue and reduced awareness creates a dangerous operational environment where poor decisions or slow reactions increase the probability of an accident.
Perhaps the most dangerous segments of flight for both manned and unmanned aircraft are take-off and landing. These flight segments typically involve the human component the most and rely the least on automation, requiring increased attentiveness of the crews and creating the potential for error. While some newer manned and unmanned systems have automated take-off and/or landing capability, situations may still dictate that humans override the automation and complete these actions themselves. While pilots act as system monitors now more than ever, the need for their skills to remain fresh remains.
Technological advancements and increased automation in aviation have significantly helped to reduce manned accident rates over the past several decades. While these advancements have helped a great deal, they have also contributed to inattentiveness and an increase in distractions (Griffin, 2010). In commercial aviation over the last five years, the accident rate has flattened, and what remains tends to share a common theme: human error. These factors include distractions, inadequate training, fatigue, poor communication between pilots and crews, and inattentiveness (Griffin, 2010).
Much like their manned counterparts, UAS operators and aircraft are subject to many of the same issues regarding human factors involvement in accidents, and appear to have a higher rate of human-error-related accidents than manned aircraft. In a study titled Probable causal factors in UAV accidents based on human factor analysis and classification system, the authors hypothesized that human factors were not a major contributor in a sample population of 56 Army UAS accidents involving aircraft between 1995 and 2005 (Asim et al., 2005). Causes of these accidents varied from material failure to environmental issues and combinations of these items, with approximately 30% of the causes listed as undetermined. The authors’ hypothesis was determined to be incorrect, as 18 of these accidents (roughly 32%) were directly relatable to human factors issues as a primary or secondary causal factor (Asim et al., 2005).
As improved capability, reduced human workload, and reduced risk of fatality are all key goals for the successful integration of UAS within military and commercial aircraft operations, the need to focus on correcting the sources of these human factors incidents is paramount. While some of these issues may be corrected using the same philosophies applied to manned aviation, such as increased training at the controls and in simulated scenarios, better communication between crew members, and reduced workload for individual operators, there is also a need to address issues that are unique to the UAS and RPA operating environments.
Human errors in UAS and RPA operations are exacerbated by the varying control mechanisms, “from joysticks to mouse-and-keyboard configurations to touchscreen interfaces. This variety introduces more opportunities for mistakes, especially when multiple important controls are placed in close proximity” (Atherton, 2013). One example illustrates how this becomes an issue: “a drone operator, using a joystick, missed the landing gear button and instead killed the engine, causing the drone to stop flying and plummet out of the sky” (Atherton, 2013).
Manned aircraft have had over a century of research and design to develop and implement optimal cockpit and flight deck layouts that consider the best placement for controls, interfaces, and displays. This time and optimization has helped to increase the ease of systems management, response, and control actions by pilots and crews. According to Flying unmanned aircraft: a pilot’s perspective, “the current proliferation of non-standard, aircraft-specific flight crew interfaces in UAS, coupled with the inherent limitations of operating UAS without in-situ sensory input and feedback (aural, visual, and vestibular cues), has increased the risk of mishaps associated with the design of the ‘cockpit’” (Pestana, 2011).
This statement supports the notion that much of the human error involved in UAS and RPA accidents may stem from several preventable issues. Pestana goes on to state that “accidents and mishaps associated with UAS operations are, to a large extent, related to human error; a closer examination, however, reveals that many of these human errors are the result of design shortfalls in the human–machine interfaces” (2011).
Designing crew stations with ergonomics as a driving force, along with intuitive and advanced display interfaces, improved cameras and sensors, and sensory-enabling devices and technology that restore some of the “feel” of flying, are some of the ways that engineers can help ensure that future accident statistics for UAS and RPAs do not highlight human factors as such a prominent causal factor.
Alternative actions
Workstation Layout
Several recent UAS mishaps have been attributed to ground control station (GCS) interface or layout designs. In 2006, one mishap was directly linked to poor button layout and design. In this particular instance, the landing gear switch was located on the side of the control stick, and the ignition switch was located next to several other frequently used in-flight buttons and switches. Because of the design and placement of the landing gear button, the operator had to release the control stick to actuate the landing gear switch, and while attempting to simultaneously use another button, accidentally hit the ignition switch and killed the engine. This resulted in a $1.5 million loss. Two other incidents, in 2001 and 2005 respectively, were attributed to display mounting and lighting that created glare. This glare ultimately resulted in a vehicle loss for both aircraft, as the operators erroneously interpreted the on-screen data (Waraich, Mazzuchi, Sarkani, & Rico, 2013).
One option to consider for alleviating some of the potential for human error is the use of redesigned ground control operator stations. Optimization of the control and interface layout allows for ease of interaction and monitoring, and reduces the varied workload of the operator. One such solution utilizes the following layout: “The Main Display is the primary monitoring device for the operator (no inputs can be given through it), whereas two touch screens (10 in.) are used as primary data-entry interfaces with secondary monitoring functions. In particular a Touch Screen is devoted to the safety critical functions and another to the non-safety critical ones. In the central panel are installed some hardwired controls that require a quick access by the operator” (Damilano, Guglieri, Quagliotti, & Sale, 2012). Additional controls are ergonomically located for ease of use, and these stations would be reconfigurable to allow for changes in requirements.
Ergonomic design consideration in the GCS development process, as well as in the selection of display and environmental systems within the GCS, may help reduce some of the associated human factors risk. Additionally, effective design may lead to more efficient operation, reducing the stress and fatigue experienced by UAS operators. Attention to features such as “displays, seating design and configuration, control layout, input devices (e.g., buttons, switches, etc.), and communication methods”, without sacrificing or degrading the capability and accessibility of other features, is vital (Waraich et al., 2013).
Improved Displays and Interfaces
Colonel John Dougherty, a Predator operations commander with the North Dakota Air National Guard, contends that the Predator has “too many screens with too much information…” (Freedberg, 2012). Dougherty also points out that as new capabilities are developed, the outcome is additional information or displays that the operational crews have to review and use, resulting in excessive workload and, in turn, additional fatigue. The Predator system did not integrate human factors and ergonomic design considerations into the initial build; the technology was deemed so valuable during demonstration that the aircraft was rushed into operational use (Freedberg, 2012).
Current displays utilized in smaller UAS tend to be engineer-focused rather than user-focused. The effects include reduced mission effectiveness, operator frustration, and degraded situational awareness (Stroumtsos, Gilbreath, & Przybylski, 2013). On these smaller UAS, functionality is typically limited and requires different hardware components. In this case, utilizing a single display and interface with an intuitive touchscreen alleviates some of the task saturation associated with operations. Additionally, reducing the equipment required for operations has the potential to reduce the required crew size, alleviating some of the crossed communication and distractions that may typically occur during operations under tense conditions.
Another display/interface solution that may be considered is the use of a single main display. The research paper titled Integrating critical interface elements for intuitive single-display aviation control of UAVs describes the use of this single-display option primarily for commercial UAS performing inspection or monitoring tasks; however, the concept may be applied to the military sector as well (Cooper & Goodrich, 2006). The interface utilizes a georeferenced terrain map populated by publicly available information such as altitude data and terrain imagery. Imagery retrieved from the UAS helps to provide a stable frame of reference integrated into the terrain imagery model. Icons are overlaid on the main display to provide for control and feedback, but fade to a “semi-transparent state when not in use to avoid distracting the operator’s attention from the video signal” (Cooper & Goodrich, 2006).
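As a rough illustration of the overlay behavior described above, the sketch below (a simplified assumption of how such fading might work, not the cited system's actual code) dims a control icon after a few seconds without operator interaction so attention stays on the underlying video:

```python
import time

class OverlayIcon:
    """Toy sketch (assumed behavior) of an overlay control icon that fades to a
    semi-transparent state when the operator has not touched it recently."""
    def __init__(self, idle_timeout_s=3.0, active_alpha=1.0, idle_alpha=0.3):
        self.idle_timeout_s = idle_timeout_s   # assumed timeout before fading
        self.active_alpha = active_alpha       # fully opaque while in use
        self.idle_alpha = idle_alpha           # semi-transparent when idle
        self.last_used = time.monotonic()

    def touch(self):
        # Called whenever the operator interacts with the icon.
        self.last_used = time.monotonic()

    def alpha(self):
        # Opacity the renderer should use for this icon right now.
        idle_time = time.monotonic() - self.last_used
        return self.active_alpha if idle_time < self.idle_timeout_s else self.idle_alpha

gear_icon = OverlayIcon()
gear_icon.touch()
print(gear_icon.alpha())   # 1.0 immediately after interaction
```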
Touchscreen technology offers advantages to operational crew members over traditional display options, whether used exclusively or in conjunction with additional displays. Touchscreens allow for greater flexibility, as reconfiguration is easy when controls are software-generated. New interactive features may be used to eliminate on-screen keyboards and scroll bars. Uncluttering the display allows for greater display dimension and enables the user to apply additional interface preferences for control and display options on the touchscreen (Damilano, Guglieri, Quagliotti, & Sale, 2012).
Sensory Feedback
One of the major issues with UAS operations for crew members is the physical separation that exists between the operators and the aircraft. This separation results in the sensory deprivation of vestibular and tactile inputs. With the addition of systems that enable the re-integration of tactile feedback, operators may be able to use this simulated sense to perceive tactile cues typically absent in UAS operations. This may allow the operator to sense changes in environmental inputs or mechanical issues, or alert crews to impending stalls and other maneuvering issues (Mung Lam, Mulder, & Van Passen, 2006).
One of the most promising ways to improve awareness of the air vehicle’s status and condition is through the use of haptic feedback technology. In a perfect scenario, a “remote presence” may “enable the operator to perceive the remote environment as if sensed directly” (Giordano, Deusch, Lächele, & Bülthoff, 2011). Haptic feedback has the potential to allow operations crew members to sense some of the same things that manned pilots are able to. There are several methods for relaying this haptic feedback to operations crews: “The first technique considers the addition of an external force offset generated by an artificial force field, whereas the second technique considers the addition of an external spring constant” (Mung Lam et al., 2006).
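To make the distinction between the two techniques concrete, the following sketch (a minimal illustration with assumed stiffness and force values, not the authors' implementation) computes the stick force each approach would produce:

```python
# Minimal sketch of the two haptic cueing techniques described by
# Mung Lam et al. (2006); stiffness and force values are assumed for illustration.

def force_offset_technique(base_stiffness, deflection, field_force):
    """Technique 1: add an external force offset (e.g., from an artificial
    force field around an obstacle) on top of the stick's normal spring feel."""
    return base_stiffness * deflection + field_force

def spring_constant_technique(base_stiffness, extra_stiffness, deflection):
    """Technique 2: add an external spring constant, so the stick feels
    progressively stiffer as the vehicle approaches a constraint."""
    return (base_stiffness + extra_stiffness) * deflection

# Example: a nominal 2 N/deg stick feel at 5 degrees of deflection
print(force_offset_technique(2.0, 5.0, 3.0))       # 13.0 N (3 N field offset added)
print(spring_constant_technique(2.0, 1.5, 5.0))    # 17.5 N (stiffness raised to 3.5 N/deg)
```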
One factor that helps explain the need for sensory systems to relay information to UAS operators is environmental input, such as wind turbulence. Severe turbulence is something manned crews can identify and respond to directly; in a UAS, the only indicator of such a condition may be video quality interference or de-stabilization of the video feed. One study inserted control stick force feedback that transmitted a high-frequency perturbation scaled to the intensity of the encountered disturbance. In this study, pilots utilizing these systems were able to respond more quickly and encountered fewer errors (Cooke et al., 2006, p. 155).
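A rough sketch of that idea follows; the carrier frequency, gain, and intensity scale are assumptions for illustration rather than values from the study:

```python
import math
import random

def turbulence_cue_force(intensity, t, carrier_hz=20.0, gain_newtons=0.5):
    """Hypothetical high-frequency stick perturbation whose amplitude scales
    with the measured disturbance intensity (0 = calm, 1 = severe)."""
    tone = math.sin(2 * math.pi * carrier_hz * t)    # high-frequency carrier
    jitter = random.uniform(-1.0, 1.0) * 0.2         # small random component
    return gain_newtons * intensity * (tone + jitter)

# Sample the cue at 5 ms steps for a moderate (0.6) disturbance
forces = [turbulence_cue_force(0.6, k * 0.005) for k in range(10)]
print(forces)
```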
Another method for re-integrating sensory inputs into the operation of UAS systems is through the use of artificial vestibular enhancing systems. While some of the traditional methods for introducing sensory inputs back into UAS operations focus on tactile feedback, vestibular systems also offer an enhanced ability for UAS crew members to experience the same perception of “self linear/angular motion through the integration of the inertial information” that are typically associated with manned flight operations (Giordano et al., 2011). To introduce these vestibular cues, a motion simulator capable of similar flight maneuvering performance is utilized in conjunction with the visual feedback received from the UAS cameras, enabling the crew members to “sense” the performance of the aircraft (Giordano et al., 2011).
Providing an understanding of aircraft maneuvers, mechanical issues, and environmental cues may give unmanned crews situational awareness not typically afforded to them, enabling crew members to react quickly and provide needed inputs.
Auditory and Multi-Sensory Inputs
Another significant issue in unmanned flight is the lack of auditory input for UAS crew members. Auditory cues are crucial in manned flight, as they may alert the pilot to impending mechanical or other aircraft issues and to actions that may be required. Utilizing auditory cues that help operators stay aware of their surroundings and enhance their performance is another awareness-enhancing option. Like other awareness-enhancing technology, this option has the added benefit of transferring some of the cognitive processing load to other sensory systems, thereby alleviating some fatigue and reducing workload.
Continuous audio and discrete audio cues are two ways that auditory functions can enhance awareness. In continuous audio, a sound plays constantly while a task is being performed; once the task has completed, the sound stops and the operator knows the task is finished and a new task or function can be initiated. Discrete audio is what is typically used in flight, presenting beeps or other chime sounds to identify items that need attention from the operational crews (Graham, 2008).
In research conducted with crew members supervising the operation of multiple UAS simultaneously, spatial audio was utilized. “Spatial audio is 3-dimensional (3D) audio in which audio is presented so that specific signals come from specific locations in the 360 degree range around someone’s head. An example is an audio alert for one UAV presented over a headset directly in front of the operator, while alerts for another UAV are connected to signals directly behind the operator” (Graham, 2008). Spatial audio has been shown to reduce target acquisition time and reduce the scanning time required. This audio system allows crew members who may be fixated on one aspect of flight, or involved in scanning for target acquisition, to receive and be aware of important issues or items that need their attention. This increases operational safety and allows for enhanced awareness (Graham, 2008).
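As a much-simplified stereo stand-in for true 3D audio (an assumption for illustration, not the study's rendering method), the sketch below maps each UAV's alert to a direction using constant-power panning:

```python
import math

def stereo_pan_gains(azimuth_deg):
    """Map an alert's azimuth (-90 = hard left, 0 = centre, +90 = hard right)
    to left/right channel gains using constant-power panning."""
    angle = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)   # 0 .. pi/2
    return math.cos(angle), math.sin(angle)                  # (left_gain, right_gain)

# Hypothetical assignment: UAV 1 alerts arrive from the front-left,
# UAV 2 alerts from the right, so the operator can tell them apart by ear.
print(stereo_pan_gains(-45))   # UAV 1 -> louder in the left ear
print(stereo_pan_gains(60))    # UAV 2 -> louder in the right ear
```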
Utilizing this option in conjunction with haptic feedback may create an ideal situation for increased crew awareness. In a study titled Assessing The Impact Of Haptic Peripheral Displays For UAV Operators, UAS pilot test subjects operated in environments with simulated auditory environmental feedback and received auditory cues that alerted them to developing situations (Donmez, Graham, & Cummings, 2008). Overall, the participants favored the use of auditory cues; however, it was noted that “auditory cues must be integrated in a fashion that the operators will be able to tolerate subjectively, and these signals must also be integrated with other audio cues to ensure balanced aural saliency” (Donmez et al., 2008).
Cameras and Sensors
Just as technological advancements are introducing a multitude of new UAS aircraft with a vast array of new and enhanced capabilities, those advancements are also carrying into camera and sensor development, thereby increasing the awareness of the operational crews.
The Wide Area Aerial Surveillance (WAAS) concept utilizes high-endurance UAS platforms and equips them with a WAAS-type sensor payload. The payload typically contains high-resolution electro-optical sensors that are fixed in place and aimed in multiple directions (Rogoway, 2014). Onboard software stitches the output of the different sensors into a seamless, high-resolution single image that can be sent as a video feed directly to users. Offering several different direct video feeds allows users to stream only the portions of the video applicable to their operations, enabling faster data transfer since the transferred stream is much smaller than the whole stitched image (Rogoway, 2014).
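The data-size argument can be illustrated with a short sketch (a toy example with an assumed, downscaled mosaic size, not the actual WAAS software): only a small region of interest around the user's point of focus is extracted from the stitched frame for streaming.

```python
import numpy as np

# Toy stand-in for a stitched WAAS mosaic (real mosaics are far larger).
stitched_frame = np.zeros((4000, 6000, 3), dtype=np.uint8)

def region_of_interest(frame, center_xy, width=1280, height=720):
    """Return the sub-image around a user-selected point; a 1280x720 crop is a
    tiny fraction of the full mosaic, so it is far cheaper to transmit."""
    x, y = center_xy
    x0 = min(max(x - width // 2, 0), frame.shape[1] - width)
    y0 = min(max(y - height // 2, 0), frame.shape[0] - height)
    return frame[y0:y0 + height, x0:x0 + width]

chip = region_of_interest(stitched_frame, (3000, 1500))
print(chip.shape)                              # (720, 1280, 3)
print(chip.nbytes / stitched_frame.nbytes)     # ~0.04 of the full frame
```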
A newer Defense Advanced Research Projects Agency (DARPA) initiative, the Autonomous Real-Time Ground Ubiquitous Surveillance Imaging System (ARGUS), aims to increase the capabilities of aerial surveillance. ARGUS is a 1.8-gigapixel video surveillance platform that can resolve details as small as six inches from an altitude of 20,000 feet (about 6 km) (Anthony, 2013). ARGUS is capable of identifying birds flying in the sky while observing from 20,000 feet, using 368 smaller image sensors that together enable the overall capability of the system (Anthony, 2013).
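A quick back-of-the-envelope check of those figures (illustrative arithmetic only, using the numbers quoted above) follows:

```python
import math

# Figures quoted above: 1.8 gigapixels total, built from 368 smaller sensors,
# resolving ~6 inch detail from ~20,000 ft (~6 km).
total_pixels = 1.8e9
sensor_count = 368
print(total_pixels / sensor_count / 1e6)    # ~4.9 megapixels per individual sensor

altitude_m = 6_000
detail_m = 6 * 0.0254                       # six inches in metres (~0.15 m)
angle_arcsec = math.degrees(math.atan(detail_m / altitude_m)) * 3600
print(angle_arcsec)                         # ~5.2 arcseconds subtended by that detail
```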
Another area that has seen much technological advancement is sense-and-avoid technology. New ideas such as Amazon’s intent to use small drones for residential package delivery, and even the push toward automation in automobiles, have been driving technological research and development in the private sector (Black, 2014). With UAS, the major obstacle to civil operation is the inability of multiple aircraft to operate in congested areas and airspace without significant risk to the general public. To make this technology work, sensors are being developed at a fraction of the size used on larger counterparts, enabling use on smaller UAS that need lightweight technology to remain efficient and capable.
Two potential technological options to address this are optical flow sensors and micro radar devices. Optical flow sensors, much like those used in computer mice that operate without a trackball, are being adapted for collision avoidance. Echo-location sensors may be paired with the optical flow sensors for operations in fog (Black, 2014). Currently, companies such as Integrated Robotics Imaging Systems are developing micro radar systems that weigh between 7 and 12 ounces, less than 5 percent of the weight of their current commercial counterparts, with a price tag of $7,000 to $10,000 (Black, 2014).
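The sketch below gives a rough sense of how optical flow can be turned into a collision cue (a generic OpenCV-based illustration under assumed file names and thresholds, not any vendor's implementation):

```python
import cv2

# Two consecutive camera frames from the vehicle's forward view (assumed file names).
prev_gray = cv2.cvtColor(cv2.imread("frame_t0.png"), cv2.COLOR_BGR2GRAY)
next_gray = cv2.cvtColor(cv2.imread("frame_t1.png"), cv2.COLOR_BGR2GRAY)

# Dense optical flow: per-pixel apparent motion between the two frames.
flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
magnitude, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])

# Assumed heuristic: large flow in the image centre suggests an object is
# closing in along the flight path.
h, w = magnitude.shape
center_flow = magnitude[h // 3: 2 * h // 3, w // 3: 2 * w // 3].mean()
if center_flow > 5.0:
    print("Potential obstacle ahead - cue avoidance maneuver")
```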
These new camera and sensor technologies will allow operators to utilize these small UAS in commercial applications, even in residential areas. The potential for UAS application in the private, commercial and military sectors is virtually limitless with the addition of ever evolving technological advancements.

Recommendations
Unmanned systems have come a long way since their beginnings as kite mounted photographic systems in 1883. Over the years, we have come to rely on these systems to provide surveillance, payload delivery and support of military and civilian functions with increasing demand for additional performance. As manned flight innovation has brought about changes in their arena, similar changes have also helped shape the use and direction of UAS operations.
As with any system, issues arise that create the need for technological solutions to increase the efficiency and safety of UAS operations. Unlike manned flight, where operators can sense changes in aircraft performance and are afforded a 180-degree view of the environment, unmanned crews are restricted to the information relayed from their sensors and cameras when making control inputs and operational decisions. Many solutions are currently in use, or in development, that will help ease the problems afflicting these UAS.
Specifically, giving the operational crews more information, faster and more efficiently, while providing additional information delivery methods, is the key to increasing the safety, efficiency, and effectiveness of these systems. Many methods exist for achieving these desired results, including advancements in and utilization of:
·       Workstation Layout- Redesign and optimization of ground control station layout and control placement.
·       Improved Displays and Interfaces- Utilizing integrated touchscreen displays, with either one primary screen and overlays or multiple customizable screens.
·       Sensory Feedback- Utilizing either tactile or haptic feedback, or artificial vestibular enhancing systems.
·       Auditory and Multi-Sensory Inputs- Utilization of either continuous or discrete audio to aid operators in performance monitoring, or using auditory and tactile feedback combinations.
·       Cameras and Sensors- Utilization of enhanced cameras such as ARGUS, or using optical flow sensors and micro radar devices.
Using any of these enhancements alone or in combination can increase the awareness of operational UAS crews, thereby increasing the efficiency and effectiveness of operations.

References
Anthony, S. (2013, January 28). DARPA shows off 1.8-gigapixel surveillance drone, can spot a terrorist from 20,000 feet. Retrieved from http://www.extremetech.com/extreme/146909-darpa-shows-off-1-8-gigapixel-surveillance-drone-can-spot-a-terrorist-from-20000-feet
Asim, M., Ehsan, N., & Rafique, K. (2005). Probable Causal Factors In UAV Accidents Based On Human Factor Analysis And Classification System. Retrieved from http://www.icas.org/ICAS_ARCHIVE/ICAS2010/PAPERS/492.PDF
Atherton, K. (2013, March 4). What Causes So Many Drone Crashes? Retrieved from http://www.popsci.com/technology/article/2013-03/human-error-after-all
Black, T. (2014, June 7). Amazon’s Drone Dream Sets Off Race to Build Better Sensor - Bloomberg Business. Retrieved from http://www.bloomberg.com/news/articles/2014-06-06/amazon-s-drone-dream-sets-off-race-to-build-better-sensor
Cooke, N. J., Pringle, H., & Pedersen, H. (2006). Human factors of remotely operated vehicles. Amsterdam: Elsevier JAI.
Cooper, J., & Goodrich, M. (2006, May 19). Integrating critical interface elements for intuitive single-display aviation control of UAVs. Retrieved from https://faculty.cs.byu.edu/~mike/mikeg/papers/Spie.pdf
Damilano, L., Guglieri, G., Quagliotti, F., & Sale, I. (2012, January). FMS for Unmanned Aerial Systems: HMI Issues and New Interface Solutions. Retrieved from http://search.proquest.com.ezproxy.libproxy.db.erau.edu/pqcentral/docview/911157189/fulltextPDF/58C98063B27849D8PQ/9?accountid=27203
Donmez, B., Graham, H., & Cummings, M. (2008, March). Assessing The Impact Of Haptic Peripheral Displays For UAV Operators. Retrieved from http://web.mit.edu/aeroastro/labs/halab/papers/HAL2008_02.pdf
Freedberg, S. (2012, August 7). Too Many Screens: Why Drones Are So Hard To Fly, So Easy To Crash. Retrieved from http://breakingdefense.com/2012/08/too-many-screens-why-drones-are-so-hard-to-fly-and-so-easy/
Giordano, P., Deusch, H., Lächele, J., & Bülthoff, H. (2011, April 10). Visual-Vestibular Feedback for Enhanced Situational Awareness in Teleoperation of UAVs. Retrieved from http://www.kyb.tuebingen.mpg.de/fileadmin/user_upload/files/publications/AHS_FORUM66_%5B0%5D.pdf
Graham, H. (2008, June). Effect of Auditory Peripheral Displays On Unmanned Aerial Vehicle Operator Performance. Retrieved from http://web.mit.edu/aeroastro/labs/halab/papers/Graham_Thesis.pdf
Griffin, G. (2010, February 14). Human error is biggest obstacle to 100 percent flight safety - The Denver Post. Retrieved from http://www.denverpost.com/ci_14398562
Krock, L. (2002, November). Time Line of UAVs. Retrieved from http://www.pbs.org/wgbh/nova/spiesfly/uavs.html
Mung Lam, T., Mulder, M., & Van Passen, R. (2006). Haptic Feedback for UAV Tele-operation - Force offset and spring load modification. Retrieved from http://www.lr.tudelft.nl/fileadmin/Faculteit/LR/Organisatie/Afdelingen_en_Leerstoelen/Afdeling_C_O/Control_and_Simulation/News/Archive_2007/doc/ieeesmc_mlam_2006_final.pdf
Orlady, H. W., Orlady, L. M., & Lauber, J. K. (1999). Human factors in multi-crew flight operations. Aldershot, England: Ashgate.
Pestana, M. (2011, March 29). Flying Unmanned Aircraft: A Pilot’s Perspective. Retrieved from http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20110011979.pdf
Rogoway, T. (2014, August 18). How One New Drone Tech Finally Allows All-Seeing Surveillance. Retrieved from http://foxtrotalpha.jalopnik.com/how-one-new-drone-tech-finally-allows-all-seeing-survei-1553272901
Schneider, J., & MacDonald, J. (2014, June 16). Are Manned or Unmanned Aircraft Better on the Battlefield? Retrieved from http://ciceromagazine.com/features/the-ground-truth-about-drones-manned-vs-unmanned-effectiveness-on-the-battlefield/
Stroumtsos, N., Gilbreath, G., & Przybylski, S. (2013). An intuitive graphical user interface for small UAS. Retrieved from http://www.public.navy.mil/spawar/Pacific/Robotics/Documents/Publications/2013/SPIE2013-Raven.pdf
Tamarkin, D. (2011). Vestibular Senses. Retrieved from http://faculty.stcc.edu/AandP/AP/AP2pages/Units14to17/unit16/vestibul.htm
Unmanned Aerial Vehicle Systems Association. (2015). UAS Components. Retrieved from http://www.uavs.org/index.php?page=uas_components
Waraich, Q., Mazzuchi, T., Sarkani, S., & Rico, D. (2013, January). Minimizing Human Factors Mishaps in Unmanned Aircraft Systems. Retrieved from http://erg.sagepub.com/content/21/1/25.full.pdf

Human Factors, Ethics and Morality

Traditionally, manned aircraft warfare has carried the inherent risk of losing the lives of flight crews in the performance of their duties. While this is not always a deterrent to swift action, it is a consideration that must be addressed prior to mission execution. One key aspect that is always a significant input is the role that human intervention and moral consideration play in the decision-making process.
Just War theory holds that certain principles should guide our conduct, even in extreme situations. As described in Just War Theory and the Ethics of Drone Warfare, four main principles make up Jus in Bello: the principle of military necessity, the principle of distinction (between military and civilians), the principle of proportionality (action must be proportionate to the objective), and the principle of humanity (care should be taken to ensure no undue suffering or action is inflicted upon civilians or their property) (Freiberger, 2013).
The Law of Armed Conflict seeks to codify these same principles and identify the moral and ethical considerations that member countries agree are necessary to ensure that war is fought in an acceptable manner, by outlining what constitutes legal and illegal actions and combatants. Unfortunately, as populations increase and technologies become increasingly advanced, we are further removed from the actions of war, while those actions have moved closer to large civilian populations. Thus, “this trend has blurred the line between combatants and civilians and made it difficult to distinguish between legitimate and illegitimate targets” (Kreps & Kaag, 2012).
While manned aircraft operations may offer a larger picture of the surroundings and allow for consideration of more variables, UAS operations have typically been limited by their onboard sensor and camera capability. Being removed from the action may reduce the internal turmoil typically encountered when determining targets and making the kill decision. Target acquisition and tracking may also be hindered by the low quality of the video streams typically associated with UAS cameras.
Conversely, utilization of UAS can also improve the chances of getting actionable and valid information regarding targets. UAS are able to hover and track for extended periods of time from extended heights, allowing for determination of the best point of mission execution. As one pilot explains, “I see mothers with children, I see fathers with children, I see fathers with mothers, I see kids playing soccer, before the call comes to fire a missile and kill the target” (Bumiller, 2012).
I do think that although there is a physical separation between the UAS operator and the targets in warfare, there is still an emotional connection to a kill action similar to that of manned aircraft operations. Operators still know that their actions have a consequence that ends the life of another human. As with any military action, there is always the possibility of bad intelligence leading to action that negatively affects civilians or results in collateral damage. As sensor and camera technology becomes more sophisticated, such as in the ARGUS system, the ability to monitor situations with greater clarity and scope will improve the ability to make more informed decisions on the battlefield.




References
Bumiller, E. (2012, July 29). A Day Job Waiting for a Kill Shot a World Away. Retrieved from http://www.nytimes.com/2012/07/30/us/drone-pilots-waiting-for-a-kill-shot-7000-miles-away.html?_r=0
Freiberger, E. (2013, July 18). Just War Theory and the Ethics of Drone Warfare. Retrieved from http://www.e-ir.info/2013/07/18/just-war-theory-and-the-ethics-of-drone-warfare/
Kreps, S., & Kaag, J. (2012, April). The Use of Unmanned Aerial Vehicles in Contemporary Conflict: A Legal and Ethical Analysis. Retrieved from http://search.proquest.com.ezproxy.libproxy.db.erau.edu/docview/992898373