Saturday, March 10, 2018

Ethics and Morality ASCI 638 - 9.6


(Serle, 2017)
I do believe that UAS are a valuable asset and should continue to be used for warfare.  Professor Strawser, of the Naval Postgraduate School, has stated “all the evidence we have so far suggests that drones do better at both identifying the terrorist and avoiding collateral damage than anything else we have” (Shane, 2012).  UAS operators can observe targets for hours or days prior to a strike, significantly longer than any manned aircraft or boots on the ground.  This observation also allows them to time a strike for when innocents are not nearby, while providing a view that would allow a launched strike to be diverted if innocents happen to walk in (Shane, 2012).  Based on results from targeted strikes in Pakistan, fewer civilians were killed than with other modes of warfare, even taking into account the wide range of reported civilian-to-combatant death ratios, from 4% to 20% depending upon the report (Shane, 2012).
President Obama stated that for targeted UAS strikes the US would uphold the highest standards in defending the nation’s security: “That means taking strikes only when we face a continuing, imminent threat, and only where there is near certainty of no civilian casualties” (Peterson, 2016).  In 2016 the White House released its self-evaluation of the targeted strikes: 116 civilians and 2,581 combatants were killed in Pakistan (Peterson, 2016). As mentioned in the previous paragraph, the actual numbers are questioned by other independent sources.
            There are several options to improve the use of UAS for warfare.  UAS manufacturers are continuing to improve the aircraft themselves: becoming more fuel efficient, improving imagery sensors, improving the GCS, and addressing any shortcomings of the UAS. This is the standard growth and improvement of an industry: make the product better. We should also strive to improve the policy behind targeted strikes.
The CIA states that any targets for UAS strikes go through a careful review process by the most senior officials, with the President ultimately responsible for any strike (Peterson, 2016). This review process is largely secret, and details are not released.  Congress should have some oversight here, starting with mandatory reporting on any strike. Such reports “contain both classified sections and unclassified sections in which the administration provides a legal and policy analysis of any use of force in self-defense or other uses of force outside traditional battlefields" (Brooks, 2013). This would allow oversight of any decisions made, along with recommendations for how to improve the process. Unclassified versions of any legal memorandum related to targeted strikes should also be released, so the public understands how these strikes are being conducted. This would allow for more educated public debate on targeted strikes.


References
Brooks, R. (2013, April 11). 10 Ways to Fix the Drone War. Retrieved March 10, 2018, from http://foreignpolicy.com/2013/04/11/10-ways-to-fix-the-drone-war/
Peterson, M. (2016, August 18). Is Obama's Drone War Moral? Retrieved March 10, 2018, from https://www.theatlantic.com/international/archive/2016/08/obama-drone-morality/496433/
Shane, S. (2012, July 14). The Moral Case for Drones. Retrieved March 10, 2018, from http://www.nytimes.com/2012/07/15/sunday-review/the-moral-case-for-drones.html
Serle, J. (2017, February 06). Suspected drone strikes kill 12 civilians in Yemen. Retrieved March 10, 2018, from https://www.thebureauinvestigates.com/stories/2012-05-15/suspected-drone-strikes-kill-12-civilians-in-yemen


Saturday, March 3, 2018

UAS Crew Member Selection ASCI 638 - 8.6

ScanEagle Launch (Gettinger, 2014)

The Insitu ScanEagle is capable of flying above 15,000 feet with a loiter time of around 20 hours. It is autonomously launched by a catapult launcher and recovered with a SkyHook recovery system that catches the edge of the wing as it flies by.  It carries an electro-optic or dual imager inside a gyro-stabilized turret.  It has a modular design, so the camera technology can change as new technologies become available (Boeing, 2017).  The trailer-mounted launcher requires two people and around 10 minutes to set up. The SkyHook recovery system is on a separate trailer and is also autonomous, requiring two people and around 20 minutes to set up (ScanEagle, 2016). The ScanEagle is capable of BVLOS operations and has flown numerous flights supporting natural disaster relief worldwide (Insitu Flies, 2017).
The General Atomics Ikhana, a variant of the Predator, is capable of flying above 40,000 feet with a loiter time of around 20 hours. It requires a runway for both launch and recovery, as well as a GCS for operations. The GCS is housed inside a trailer and contains the pilot’s instruments and controls as well as computer workstations for payload operators. The Ikhana is capable of BVLOS operations and has flown them around the world (Ikhana Unmanned Science, 2015).
Insitu provides training for UAS operators, including a 10-week course to become a UAS operator.  Attending this course should be required as part of the initial training program, unless the new hire is already a trained operator of the ScanEagle.  Maintenance personnel can attend the 5-week maintainer course, and the maintainers can be trained to launch and recover the UAS. Because of the ScanEagle’s long endurance, a single flight can exceed the normal crew duty day maximum of 12 hours.  If operations extend that long, additional personnel would be required, and a shift handover checklist would need to be developed.
Ikhana (Clements, 2012)
The Ikhana requires a pilot, a payload operator and maintainers.  General Atomics provides training for this as well: an 8-week course covering both pilots and sensor operators. The sensor operators graduate with fewer total hours, but the training is similar in duration. General Atomics requires a pilot to possess a bachelor’s degree, an FAA commercial certificate with an instrument rating, and 300 hours as pilot in command.  Sensor operators require a private pilot’s license. Both require a Class II FAA medical certificate and a security clearance (GA-ASI, 2016). Previous certification in the system could eliminate the need for this training. If missions will extend beyond 12 hours, additional crews would be required, along with shift handover checklists.
Any flights above 400 feet or conducted BVLOS require a Part 107 waiver from the FAA, which can take up to 90 days to process.  Any flights within controlled airspace (Class B, C, D, or surface Class E) also require this waiver (Request a Waiver, 2018).  All pilots would be required to pass an aeronautical knowledge test at an FAA-approved testing center (Fly Under, 2017).
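The waiver rules above amount to a simple decision check. The sketch below encodes them for illustration only; the function name, the airspace encoding, and the `"E-surface"` label are my own assumptions, not FAA terminology.

```python
# Sketch of the Part 107 waiver rules described above.
# Assumptions: altitude is feet AGL; airspace is a class letter,
# with "E-surface" standing in for surface-area Class E.

def waiver_required(altitude_ft: float, bvlos: bool, airspace: str) -> bool:
    """Return True if the described flight would need an FAA waiver."""
    if altitude_ft > 400:                            # above the altitude limit
        return True
    if bvlos:                                        # beyond visual line of sight
        return True
    if airspace in {"B", "C", "D", "E-surface"}:     # controlled airspace
        return True
    return False

print(waiver_required(300, False, "G"))   # low, VLOS, uncontrolled -> False
print(waiver_required(500, False, "G"))   # above 400 feet -> True
print(waiver_required(200, True, "G"))    # BVLOS -> True
print(waiver_required(200, False, "C"))   # Class C airspace -> True
```

Any one of the three conditions is enough to trigger the requirement, which is why the checks short-circuit independently.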




References
Boeing Historical Snapshot. (2017). Retrieved March 03, 2018, from http://www.boeing.com/history/products/scaneagle-unmanned-aerial-vehicle.page
Clements, R. (2012, March 29). NASA's Ikhana MQ-9 Drone Flies With ADS-B Equipment for the First Time. Retrieved March 03, 2018, from https://theaviationist.com/2012/03/29/ikhana-mq-9-adsb/
Fly under the Small UAS Rule. (2017, December 14). Retrieved March 03, 2018, from https://www.faa.gov/uas/getting_started/part_107/
GA-ASI UAS Flight Training Academy Graduates First Aircrews. (2016, August 25). Retrieved March 03, 2018, from http://www.ga.com/ga-asi-uas-flight-training-academy-graduates-first-aircrews
Gettinger, D. (2014, January 6). ScanEagle: A Small Drone Making a Big Impact. Retrieved March 03, 2018, from http://dronecenter.bard.edu/scaneagle-drone/
Ikhana Unmanned Science and Research Aircraft System. (2015, August 06). Retrieved March 03, 2018, from https://www.nasa.gov/centers/armstrong/news/FactSheets/FS-097-DFRC.html
Insitu Flies ScanEagle UAS for Disaster Relief and Fire Suppression Missions. (2017, November 28). Retrieved March 03, 2018, from https://insitu.com/press-releases/Insitu-Flies-ScanEagle-for-Disaster-Relief-and-Fire-Suppression
Request a Part 107 Waiver or Operation in Controlled Airspace. (2018, March 02). Retrieved March 03, 2018, from https://www.faa.gov/uas/request_waiver/
ScanEagle. (2016). Retrieved March 03, 2018, from https://insitu.com/information-delivery/unmanned-systems/scaneagle#3

Sunday, February 25, 2018

Operational Risk Management ASCI 638 - 7.6

Figure 1.  Safety vs. Risk (Long, 2017)


The DJI Phantom 4 Pro is a commercially available sUAS used by hobbyists and professionals alike for aerial video and photography.  For the purposes of this discussion we will focus on the flight characteristics and safety features of the Phantom 4. It is capable of autonomous flight and obstacle avoidance (Phantom 4, 2017).

            The Phantom 4 has built-in GPS, a barometer, forward vision, downward vision, rearward vision, two ultrasonic sensors, and infrared systems on both sides.  These systems are combined for obstacle avoidance and navigation and can be used independently if needed. For instance, if flying indoors without GPS signal, the vision system will hold a stable hover without inputs from the operator (Phantom 4, 2017).

            There are three Return-to-Home (RTH) modes of flight: Failsafe RTH, Smart RTH and Low Battery RTH. In all modes obstacle avoidance is active, and the Phantom will adjust altitude if needed. Before any flight a “home” point is established for the Phantom 4; this is the point of launch. Failsafe RTH engages whenever the link with the controller is lost for 3 seconds, and the Phantom retraces its route of flight back home, ensuring a safe route. Smart RTH is user activated with a button on the controller and is used to automatically fly back to the home point. Low Battery RTH engages automatically when the battery is too low for continued flight. The Phantom continuously calculates the flight time required to return home; when the battery drops low enough that continued flight could jeopardize a safe return, Low Battery RTH engages and flies the Phantom home (Phantom 4, 2017).
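The three triggers above can be summarized as a small piece of decision logic. This is an illustrative sketch of how the modes described could be prioritized, not DJI's actual firmware behavior; the function and parameter names are hypothetical, and the ordering of checks is my own assumption.

```python
# Hypothetical prioritization of the three RTH triggers described above.
# Times are in seconds; the battery check mirrors the manual's idea that
# the aircraft continuously compares remaining flight time to the time
# needed to reach the home point.

def select_rth_mode(link_lost_s: float, smart_rth_pressed: bool,
                    battery_remaining_s: float, time_to_home_s: float):
    """Return which RTH mode, if any, should engage."""
    if battery_remaining_s <= time_to_home_s:   # continued flight risks the return
        return "Low Battery RTH"
    if link_lost_s >= 3:                        # controller link lost for 3 seconds
        return "Failsafe RTH"
    if smart_rth_pressed:                       # operator pressed the RTH button
        return "Smart RTH"
    return None                                 # keep flying normally

print(select_rth_mode(0, False, 1200, 90))   # healthy flight -> None
print(select_rth_mode(5, False, 1200, 90))   # lost link -> Failsafe RTH
print(select_rth_mode(0, False, 80, 90))     # battery margin gone -> Low Battery RTH
```

Putting the battery check first reflects the idea that a depleted battery is the one condition the operator cannot override indefinitely.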

            The Phantom visually scans and remembers the terrain when the home point is established. During any RTH mode, if the landing terrain doesn’t match the terrain in memory, the Phantom will hold a hover and wait for user input to confirm landing. During any landing the Phantom visually scans the landing zone; if the Phantom determines it isn’t suitable, it will automatically hold a hover awaiting confirmation from the operator to land. It will hold that hover until confirmed to land or until the battery is too low, at which point it will land (Phantom 4, 2017).

            An Operational Risk Management (ORM) assessment was completed for the Phantom 4 in order to ensure safe flight, and mitigate any risks identified.  This includes a Preliminary Hazard List (PHL), Preliminary Hazard Assessment (PHA), and Operational Hazard Review and Analysis (OHR&A).

            A PHL is a “brainstorming tool used to identify initial safety issues early in the UAS operation” (Barnhart, Shappee & Marshall, 2011).  Once a hazard is identified the probability and severity need to be established. Probability would be frequent (A), probable (B), occasional (C), remote (D) or improbable (E).   The severity is listed as catastrophic (I), critical (II), marginal (III) or negligible (IV) (Barnhart, Shappee & Marshall, 2011). 
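A PHL entry pairs one probability category with one severity category to form a hazard's risk assessment code. The sketch below is a minimal illustration of that pairing; the `risk_code` function and the severity-then-probability code format (e.g. "IIC") are my own conventions, not something prescribed by Barnhart, Shappee & Marshall.

```python
# Illustrative pairing of the probability and severity categories above
# into a single risk assessment code for a PHL row.

PROBABILITY = {"A": "frequent", "B": "probable", "C": "occasional",
               "D": "remote", "E": "improbable"}
SEVERITY = {"I": "catastrophic", "II": "critical",
            "III": "marginal", "IV": "negligible"}

def risk_code(prob: str, sev: str) -> str:
    """Combine a probability letter and severity numeral into one code."""
    if prob not in PROBABILITY or sev not in SEVERITY:
        raise ValueError("unknown category")
    return f"{sev}{prob}"

# Example PHL row: lost-link hazard assessed as occasional (C) + critical (II)
print(risk_code("C", "II"))  # -> IIC
```

The point of the code is simply to make each hazard's assessment comparable at a glance across the PHL.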

            A PHA is completed after the PHL; it asks what can be done to reduce or eliminate the hazards identified in the PHL. In regard to probability, we want to reduce or eliminate the possibility of occurrence in order to decrease exposure to the risk. See Figure 2 for an example of a PHL/A for Phantom 4 operations. It should be noted this is not an all-inclusive list; it is for illustration purposes. An all-inclusive list would be developed to include every possible risk for a specific flight. For example, if using the Phantom to record a local charity run there may be additional factors such as overflight of a crowd or public space, or perhaps other sUASs in the area. If recording wildlife there could be risks of interaction with the wildlife, losing the Phantom in a body of water, or airspace restrictions if the land has any particular designation. Weather risks would depend on the season and on where the flight is occurring, mountains versus an open plain.

Figure 2. PHL/A
            The OHR&A “is used to identify and evaluate hazards throughout the entire operation and its stages (planning, staging, launching, flight, and recovery)” (Barnhart, Shappee & Marshall, 2011).  Risks identified in the PHL/A may also appear in the OHR&A, which can include additional risks such as human factors. The mitigating action from the PHL/A is used as the Action Review in the OHR&A tool. If the action was effective at mitigating the risk, no change is needed; if not, a new mitigating action needs to be developed. See Figure 3 for an example of an OHR&A for a flight.

Figure 3. OHR&A
            An ORM serves two purposes. The first is to provide a “quick look at the operation before committing to the flight activity (a go/no-go decision)” (Barnhart, Shappee & Marshall, 2011).  The second is that it “allows safety and management of real-time information needed to continually monitor the overall safety of the operation” (Barnhart, Shappee & Marshall, 2011).  It is meant to be an aid in deciding whether to conduct the flight, not the sole basis for the decision. See Figure 4 for an example. If anything changes during the flight, such as unexpected people arriving at your location or weather differing from the forecast, the risk would need to be updated appropriately in real time.

Figure 4. ORM



References

Long, R. (2017, December 13). The Great Safety is a Choice Delusion. Retrieved February 25, 2018, from https://safetyrisk.net/the-great-safety-is-a-choice-delusion/

Phantom 4 User Manual. (2017, October). Retrieved February 26, 2018, from https://dl.djicdn.com/downloads/phantom_4_pro/20171017/Phantom_4_Pro_Pro_Plus_User_Manual_EN.pdf

Barnhart, R. K., Shappee, E., & Marshall, D. M. (2011). Introduction to Unmanned Aircraft Systems. London, GBR: CRC Press. Retrieved from http://www.ebrary.com.ezproxy.libproxy.db.erau.edu


Saturday, February 17, 2018

Automated Takeoff and Landing ASCI 638 - 6.6

Figure 1. Otto Pilot, the world’s greatest autopilot (Airplane!, 2017)



Manned Aircraft Autoland Airbus A330
For the Autoland system to be operational there are several requirements:
·       Functioning autopilot
·       Two of three functioning hydraulic systems
·       Functioning autothrust
·       Radio altimeter
·       Nose wheel steering system
·       Autobrake system
·       Two functioning ILS receivers
·       Functioning instruments so pilots can monitor airspeed, attitude, and altitude
·       Annunciation capability so the aircraft can notify the pilots if something is wrong

The Autoland for the A330 can execute the full approach to the runway, including braking and steering to stay on the runway after landing, without pilot input. The A330 uses its fly-by-wire system to control the rudder, flaps and slats to adjust the aircraft’s airspeed, altitude and attitude while following an ILS approach to the ground (How Does a Plane, 2017).   The autothrust will reduce engine thrust after landing, but the pilots are still required per the checklist to manually retard the thrust levers so that lever position matches thrust demand. Annunciation systems are required so the pilot is aware of any issues; for example, on a failure of the autobrake system the pilot would be required to manually control the brakes, and the same is true for the nose wheel steering system (Autoland, 2017).

The Autoland feature should remain fully automated as it currently is. It should also remain possible for the pilots to disable it at any time in order to conduct the landing manually, as is the case today. It should also maintain its alert/annunciation capability to warn the pilots if there is an issue and the Autoland system may not be functioning properly.

Unmanned Aircraft DJI Phantom 4
The automatic landing feature on the DJI Phantom 4 will engage under certain situations. When the low battery level warning activates, an audible alarm is sent through the controller for the user to hear.  The Phantom will return to home automatically if no action is taken by the operator within 10 seconds. “Home” is a GPS location set by the user during the start-up procedure of the Phantom. The operator can cancel the return to home if they would like, but when the battery reaches a critically low level, the Phantom determines it only has the power to descend from its current altitude and lands automatically.  The Phantom uses two ultrasonic sensors and four monocular sensors to detect and avoid obstacles during any return-to-home flight, as well as normal flight. The sensors can only pick up obstacles 60° off the nose and 50° below.  If an obstacle is detected 65 feet ahead, the Phantom will stop and hover, ascend to at least 16 feet above the obstacle, and then continue on its designated path (Phantom 4, 2016).
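The stop-climb-continue behavior described above can be sketched as a small rule. This is an illustrative model of the behavior the manual describes, not DJI's actual control code; the function name and the flat-obstacle geometry are simplifying assumptions of mine.

```python
# Sketch of the described return-to-home obstacle behavior: if an obstacle
# appears within ~65 ft ahead, climb to at least 16 ft above its top,
# then continue. All units are feet; names are illustrative.

def avoid_obstacle(altitude_ft: float, obstacle_distance_ft: float,
                   obstacle_top_ft: float):
    """Return (target altitude, action) for the current obstacle picture."""
    if obstacle_distance_ft > 65:
        return altitude_ft, "continue"        # nothing close enough to react to
    required_ft = obstacle_top_ft + 16        # clear the obstacle by 16 ft
    if altitude_ft < required_ft:
        return required_ft, "climb"           # stop, hover, and ascend
    return altitude_ft, "continue"            # already above the obstacle

print(avoid_obstacle(100, 200, 120))   # obstacle far away -> (100, 'continue')
print(avoid_obstacle(100, 50, 120))    # obstacle ahead -> (136, 'climb')
```

The rule reacts only within the sensor's stated range, which is why the distance check comes before any altitude comparison.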

This system benefits from being automated since it allows more users to fly a drone with little training, increasing the enjoyment for many and enabling those who never thought they could fly a drone to do so. It is also a safety feature, since the low-battery return to home prevents the operator from accidentally crashing the Phantom because of a depleted battery. Collision avoidance and return to home are excellent selling points for automation and should be kept on all future models of the Phantom.

  



References
How Does a Plane Land on Autopilot? (2017, February 16). Retrieved February 18, 2018, from https://thepointsguy.com/2017/02/how-a-plane-lands-on-autopilot/

Autoland. (2017, September 22). Retrieved February 18, 2018, from https://www.skybrary.aero/index.php/Autoland


Phantom 4 User Manual. (2016, March). Retrieved February 18, 2018, from https://dl.djicdn.com/downloads/phantom_4/en/Phantom_4_User_Manual_en_v1.0.pdf

Airplane! (1980). (2017, August 20). Retrieved February 18, 2018, from https://www.moviehousememories.com/airplane-1980/

Saturday, February 10, 2018

Shift Work Schedule ASCI 638 - 5.6

Figure 1 Night Shift (Safari Signs, 2016)

I have been hired as a human factors consultant for an MQ-1B Medium Altitude, Long Endurance (MALE) UAS squadron of the United States Air Force (USAF). This squadron conducts missions 24/7, 365 days a year, providing armed Intelligence, Surveillance, and Reconnaissance (ISR) support to ground forces in a conflict zone. In order to accomplish this mission, the UAS crews have been separated into 4 teams and put onto a continuous shift work schedule of 6 days on, 2 days off. The Squadron Commander is concerned because the UAS crews have been reporting extreme fatigue while conducting operations and have complained of inadequate sleep due to their current shift schedule. It is my job to analyze the current shift rotation, Figure 2, and the number of days on and off per week, and, based upon my research, design the schedule that I believe will allow the squadron to optimize operations while improving the fatigue issues that the crews are reporting.

Figure 2. Current Schedule
            Circadian rhythm is the internal biological clock that regulates our body functions based on our wake/sleep cycle.  When deprived of time cues, most people gravitate towards a 25-hour circadian cycle. Any time this is disrupted, such as by shift work, it can have physiological and behavioral impacts, known as circadian rhythm disruption (Circadian Rhythm, n.d.).  The circadian rhythm is reset by exposure to bright light, especially sunlight. When a night shift worker is driving home in daylight, the bright morning sun resets the internal clock, making it difficult for the person to simply go home and go to sleep. The disparity between the biological day (25 hours) and the solar day (24 hours) drives the recommendation that rotating shifts move towards a longer day: shift rotations should move to later shifts, not earlier ones (FAA Shift Work, n.d.).
            I propose a 2-2-2 plan for improving the shift work.  This would be an 8-day cycle for the 4 crews working 8.5-hour shifts, Figure 3. The first 30 minutes of each shift overlaps with the previous shift to ensure adequate time for shift turnover procedures. The proposed change would be 2 day shifts, 2 swing shifts, 2 night shifts and then 2 days off (Miller, 2012). This schedule follows the concept of rotating shifts forward, matching our natural 25-hour circadian cycle.  This rotation keeps the circadian rhythm in a daytime orientation, so it is not in a state of constant disruption, whereas many consecutive night shifts may cause chronic sleep deprivation. It also provides more free evenings per week to the operators, allowing more regular contact with friends and family (Managing 24/7, n.d.).
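The proposed rotation can be generated mechanically: each crew cycles through the same 8-day pattern, offset by 2 days from the crew before it. The sketch below is my own illustration of that bookkeeping (the function name and 0-indexed day numbering are assumptions); a useful property it demonstrates is that on every day exactly one crew is on each of day, swing, and night shift while one crew is off.

```python
# The 2-2-2 pattern: 2 day shifts, 2 swing, 2 night, 2 off over 8 days,
# with each of the 4 crews offset by 2 days.

PATTERN = ["Day", "Day", "Swing", "Swing", "Night", "Night", "Off", "Off"]

def shift_for(crew: int, day: int) -> str:
    """Shift for crew 0-3 on day number `day` (0-indexed)."""
    return PATTERN[(day + 2 * crew) % 8]

# Print one full 8-day cycle for all four crews.
for day in range(8):
    print(f"Day {day + 1}:", [shift_for(c, day) for c in range(4)])
```

Because the four offsets land on distinct pairs of the pattern, coverage of all three shifts is guaranteed on every calendar day, which is the scheduling property the 2-2-2 plan relies on.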

Figure 3. Proposed Schedule
            The Squadron Commander also needs to implement training on fatigue, fatigue management, stress, stress management and sleep hygiene. Educated operators are more likely to cope effectively with a changing schedule if they are aware of coping mechanisms. Keeping the work lighting bright will help circadian rhythms stay in sync with working hours, so that night employees are not working in darker rooms and keeping their bodies on a solar circadian rhythm.  Refrigerators, and possibly a small kitchenette, should be available so operators can bring food from home; this would make healthier food an option instead of relying on vending machines.  The commander should also allow or encourage discussions with flight physicians if people are having trouble sleeping, to potentially prescribe and monitor sleep aids as needed (Miller, 2012).

  
References
Federal Aviation Administration. (n.d.). Circadian Rhythm Disruption and Flying [Brochure]. Retrieved February 10, 2018, from https://www.faa.gov/pilots/safety/pilotsafetybrochures/media/Circadian_Rhythm.pdf
FAA Shift Work and Scheduling. (n.d.). Retrieved February 10, 2018, from https://www.faa.gov/about/initiatives/maintenance_hf/library/documents/media/human_factors_maintenance/human_factors_guide_for_aviation_maintenance_-_chapter_4.shiftwork_and_scheduling.pdf
Managing 24/7. (n.d.). Retrieved February 10, 2018, from http://www.circadian.com/solutions-services/publications-a-reports/newsletters/managing-247-enewsletter/managing-247-speed-of-rotation-from-day-shift-to-night-shift.html
Miller, J. C. (2012, April). White Paper: Shift Plans with Seven Consecutive Shifts. Retrieved February 10, 2018, from https://primis.phmsa.dot.gov/crm/docs/shift_plans_with_seven.pdf
Safari Signs. (2016, May 26). Retrieved February 10, 2018, from https://www.pinterest.com/pin/504825439462650568/


Sunday, February 4, 2018

UAS Beyond Line of Sight Communication ASCI 638 - 4.6

Figure 1. Predator Communication (Reading Mission, n.d.)

The RQ-1A/MQ-1 Predator UAS is capable of beyond-line-of-sight (BLOS) communications.  The Predator has been present for the wars in Iraq and Afghanistan while being controlled from Nellis Air Force Base in Nevada. “Predator is equipped with reconnaissance equipment and weapons to provide persistent Intelligence gathering, Surveillance, and Reconnaissance (ISR) capability. It is designed to perform over-the-horizon, long-endurance, medium-altitude surveillance, reconnaissance, and weapons delivery on mission endurance of up to 40 hours” (RQ-1A, n.d.).  The Predator was used to circle over both Iraq and Afghanistan conducting ISR operations. The concept is to move data rather than people; this shortens the kill chain and reduces the opportunity for targets to flee (RQ-1A, n.d.). 
The Predator is capable of both BLOS and line-of-sight (LOS) communications. LOS communications are used within 100 NM and allow the UAS to be launched and recovered. When the Predator operates BLOS, communication between the UAS and the ground control station (GCS) is done via Ku-band satellite communications (BLOS, 2016).   Two different GCSs are needed for this operation. The launch and recovery element (LRE) crew uses LOS to launch and recover. The mission control element (MCE) then takes control and uses BLOS for mission operations. Both the LRE and MCE must have their GCSs set up with the same parameters for the Predator before a handover can take place (MQ-1, 2015).
            The infrastructure for the Predator is extensive. Each Combat Air Patrol (CAP) consists of 4 air vehicles, the local GCS for the LRE, potentially a second GCS for the MCE, and the satellite used for communication. Each CAP includes around 170 personnel: about 40 mission control personnel, including seven pilots and seven sensor operators; 60 personnel for launch and recovery, which also includes pilots and sensor operators; 60 personnel for exploitation of information; and around 15 maintenance personnel (Wheeler, 2012).  The notion of “unmanned” is misleading when it takes around 170 personnel to operate 4 Predators.
            The main advantage of BLOS operations is the ability of the Predator to fully utilize its roughly 40-hour endurance. It can be flown anywhere in the world from any location; as mentioned above, it was flying in Iraq and Afghanistan from Nevada. The LRE crews were located locally, in Iraq and Afghanistan, while the MCE crews were in Nevada. Fewer military personnel need to be placed in the vicinity of the battle, and some of the pilots can remain at home, minimizing their stress and the stress on their families. Another advantage is that the video feed can be fed back to any ground commander anywhere, so they can make real-time decisions to shape the battlefield with live video. 
             A civilian industry that could benefit from BLOS operations is natural resource management. This includes wildlife management personnel and land/forestry management personnel. Tracking and monitoring wildlife is a large undertaking and has already benefited from the use of small drones, but these are limited to LOS operations only.
From 2013 to 2015 the National Oceanic and Atmospheric Administration (NOAA) conducted surveys in Alaska where manned flight proved to be cheaper than the UAS.  The Steller sea lion population in the Aleutian Islands of Alaska was counted by a Twin Otter and an APH-22 Hexacopter (Christie, Gilbert, Brown, Hatfield & Hanson, 2016).  Because the Hexacopter was required to stay within line of sight, it only had a range of 0.8 km, with the longest flight being 16 minutes, and it had to be launched from a ship sailing from island to island. Over the course of two months it surveyed 30 different sites, provided high quality images due to its flight altitude of 150 feet, and averaged a cost of $1,700 per site visited. It could access areas the Twin Otter could not due to inclement weather (the Twin Otter requires a minimum ceiling of 750 feet), remoteness, or a lack of suitable landing sites (Christie et al., 2016).
In contrast, the Twin Otter visited 201 sites and provided lower quality images due to its higher flight altitude, but averaged a cost of only $400 per site since it surveyed 171 more sites (Christie et al., 2016).  Both aircraft could meet NOAA’s objective of counting the sea lion population, but both have limitations.  Currently NOAA uses both systems; the UAS augments the Twin Otter by providing higher quality images or surveying inaccessible sites (Fritz, Sweeney, Towell & Gelatt, 2016). If NOAA could use a UAS such as the Predator, the research could potentially be done more efficiently using only one platform instead of a Twin Otter, a Hexacopter and the ship the Hexacopter requires. Of course, the overall price tag of a complex Predator system may prevent any non-governmental agency, or an agency with a smaller budget, from using it. However, the concept and ability to operate BLOS would be an advantage.
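The per-site cost comparison above is easy to sanity-check arithmetically. The short sketch below multiplies the cited site counts by the cited average costs; the variable names are mine, and the totals are back-of-the-envelope products of the figures from Christie et al. (2016), not numbers reported in that paper.

```python
# Back-of-the-envelope check of the survey figures cited above.
# Total cost = sites surveyed x average cost per site.

hexacopter_sites, hexacopter_cost_per_site = 30, 1700
twin_otter_sites, twin_otter_cost_per_site = 201, 400

print(hexacopter_sites * hexacopter_cost_per_site)   # 51000 total for the UAS
print(twin_otter_sites * twin_otter_cost_per_site)   # 80400 total for the Twin Otter
print(twin_otter_sites - hexacopter_sites)           # 171 more sites surveyed
```

The totals show why the per-site averages diverge: the Twin Otter spread a larger total cost over nearly seven times as many sites.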



References
BLOS - Beyond Line of Sight. (2016, July 29). Retrieved February 4, 2018, from https://www.linkedin.com/pulse/beyond-line-sight-operations-brandon-fowler/
Christie, K., Gilbert, S., Brown, C., Hatfield, M., & Hanson, L. (2016). Unmanned Aircraft Systems in Wildlife Research: Current and Future Applications of a Transformative Technology. Frontiers of Ecological Environments, 14, 241-251. doi:10.1002/fee.1281
Fritz, L., Sweeney, K., Towell, R., & Gelatt, T. (2016). Aerial and Ship-based Surveys of Steller Sea Lions (Eumetopias Jubatus) Conducted in Alaska in June-July 2013 through 2015, and an Update on the Status and Trend of the Western Distinct Population Segment in Alaska. U.S. Dep. Commerce, NOAA Tech. Memo. NMFS-AFSC-321, 72 p. doi:10.7289/V5/TM-AFSC-321
MQ-1 Predator Beyond Line of Sight Operations. (2015, February 3). Retrieved February 04, 2018, from https://knghthwksuas.weebly.com/uas-blogs/mq-1-predator-beyond-line-of-sight-operations
Reading Mission Control Data of a Predator. (n.d.). Retrieved February 04, 2018, from http://gbppr.dyndns.org/~gbpprorg/nfl/predator-drone-readout-2009.html
RQ-1A/MQ-1 Predator. (n.d.). Retrieved February 04, 2018, from http://defense-update.com/products/p/predator.htm

Wheeler, W. (2012, February 28). 2. The MQ-9's Cost and Performance. Retrieved February 04, 2018, from http://nation.time.com/2012/02/28/2-the-mq-9s-cost-and-performance/

Sunday, January 28, 2018

UAS Integration in the NAS ASCI 638 - 3.6

 
Figure 1. UAS in the NAS (Cameron, 2012)
“NextGen is the FAA-led modernization of our nation’s air transportation system” (What is NextGen, 2017).  The goal is to increase safety and efficiency in the national airspace by replacing ground-based navaids, such as NDBs and VORs, with GPS-based airways. Radar would be replaced with satellite-based GPS tracking and Automatic Dependent Surveillance-Broadcast (ADS-B) of aircraft for ATC. Combined, these will enable aircraft to fly closer together, yet still safely, since position data is more accurate and up to date compared to radar. More direct routes can be flown with GPS than with ground-based navaids, which decreases fuel consumption, brings down the overall cost of the trip and reduces transportation time (What is NextGen, 2017).  
ATC needs to be able to communicate with the UAS controller whenever the UAS is in controlled airspace.  Since the UAS controller is on the ground, and most likely not in line-of-sight communication with ATC, the UAS could be used to relay the communication to the ground controller via satellite communications (Pongracz & Palik, 2012).  This would allow direct communication with the UAS controller, similar to contacting an onboard pilot. Of course, this only works for larger UASs, most notably military-controlled UASs that have access to satellite communications.  An example is the Global Hawk, which uses satcom to relay UHF/VHF communication between ATC and the ground controller (Global Hawk UAS, 2018).
According to Federal Aviation Regulation (FAR) 14 CFR 91.113(b), “the operator of an aircraft must maintain vigilance so as to see and avoid other aircraft. The operator must also give way to other aircraft if they have the right of way.”  Since a UAS has no onboard pilot to see and avoid, it needs to “sense and avoid”.  ADS-B is used to broadcast an aircraft’s position, altitude and velocity to ATC. Building ADS-B into a UAS would let ATC, and any aircraft or person with an ADS-B receiver, locate the UAS; a receiver on board would also make the UAS aware of other aircraft in its vicinity so it could navigate to stay clear of manned aircraft. The company uAvionix has released a line of small ADS-B transceivers that could easily be fitted into a small UAS, some of which are only an inch by an inch in size (ADS-B Transceivers, n.d.).  
PrecisionHawk has developed the Low Altitude Tracking and Avoidance System, or LATAS. LATAS is an “onboard system that provides flight planning, tracking and avoidance for every drone in the sky using real-time flight data transmission based on existing world-wide cellular networks” (Say Hello, 2015).  LATAS uses existing cell towers to transmit a UAS’s location to ATC, which would relay that position to pilots in the area. It is a small bit of electronics, about one inch by two inches, that can be added to any UAS during manufacturing.
NextGen will need to address human factors the same way the previous National Airspace System (OldGen? Maybe PreviousGen?) did. We will still have crew rest, human inattention, human mistakes, the need for well-designed human-machine interfaces and crew resource management, just to name a few. The advent of ADS-B receivers has brought an advance in safety. Previously a pilot needed to visually scan outside the cockpit when flying VFR, and even IFR at times when not in the clouds, in order to see and avoid other aircraft. While that is still a necessity, ADS-B has enabled pilots to see other aircraft relative to their own position on a moving map or digital display in today’s glass cockpits. If a UAS had an ADS-B transmitter broadcasting its location, even a small, hard-to-see UAS would be prominently displayed on a screen for other pilots in the area.
Whether we fully upgrade to NextGen, or everyone starts using ADS-B while still flying ground-based navaids, human factors will always be an issue. We need to continue to train, improve and stay vigilant as members of the aviation industry. As with all of aviation, it takes skilled people, on the ground, in the tower or in the sky, to make it all work safely. 

  

References
ADS-B Transceivers, Receivers and Navigation Systems for Drones. (n.d.). Retrieved January 28, 2018, from http://www.unmannedsystemstechnology.com/company/UASionix-corporation/
Cameron, A. (2012, May 22). The System: Fly the Pilotless Skies: UAS and UAV. Retrieved January 28, 2018, from http://gpsworld.com/the-system-fly-the-pilotless-skies-uas-and-uav/
Global Hawk UAS of NASA. (2018). Retrieved January 28, 2018, from https://directory.eoportal.org/web/eoportal/airborne-sensors/content/-/article/global-hawk
Say Hello to LATAS. (2015, January 9). Retrieved January 28, 2018, from http://www.precisionhawk.com/media/topic/say-hello-to-latas/
What is NextGen? (2017, November 21). Retrieved January 28, 2018, from https://www.faa.gov/nextgen/what_is_nextgen/


Saturday, January 20, 2018

UAS GCS Human Factors Issue ASCI 638 - 2.6

Predator RQ-1

The RQ-1 Predator is a long-endurance, medium-altitude UAS for surveillance, reconnaissance and interdiction missions. Imagery from synthetic aperture radar, video cameras and a forward-looking infrared (FLIR) sensor can be sent in real time to the front-line soldier or operational commander, or worldwide via satellite communications. It can be armed with AGM-114 Hellfire missiles.  The ground control station (GCS) is a single 30-foot trailer containing pilot and payload operator consoles, three Boeing data exploitation and mission planning consoles, and two synthetic aperture radar workstations. The aircraft is launched under direct line-of-sight control from a semi-improved surface. A line-of-sight data link or satellite links provide continuous video for the operators, and all controls are commanded through those communication channels (Predator RQ-1, n.d.).
One human factors issue that led to an accident was a control lever design that made a mishap easy. The GCS has two operators who sit at identical consoles. One operator is the pilot and the other is the payload operator. When a console is configured for the pilot, the condition lever starts or stops fuel flow and can feather the propeller to reduce drag. When configured for the payload operator, the same condition lever operates the iris on the camera. Per the checklist, if control is transferred from one console to the other, the condition lever on the payload console must be set to match the pilot console prior to the transfer of controls. During one control transfer the checklist was not followed and the condition levers were not matched. Once the transfer of controls was completed, the condition lever, which had been positioned for iris control, now commanded the fuel valve and cut off fuel flow, causing the engine to shut down. This was a major contributing factor to a Predator crash in Arizona in 2006 (Carrigan, Long, Cummings & Duffner, n.d.).
This failure mode could be designed out of the system, with something as simple as two distinct condition levers: one for iris control and a separate one for engine control.  Another fix would be to prevent the transfer, or at least give a warning, if a control transfer is attempted while the condition levers on the two consoles are not matched (Carrigan, Long, Cummings & Duffner, n.d.). “While some would advocate for more training to address this problem, humans are imperfect systems, and it is important to design for an operator that may make mistakes” (Carrigan, Long, Cummings & Duffner, n.d.).
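The second fix, an interlock that blocks a mismatched transfer, can be sketched in a few lines. This is purely illustrative; the function and state names are assumptions, not the actual Predator GCS software.

```python
# Illustrative interlock: refuse a console control transfer unless both
# condition levers are in the same position. Lever positions are modeled
# as simple strings such as "RUN" or "CUTOFF".

class TransferBlockedError(Exception):
    """Raised when a control transfer would be unsafe."""

def transfer_control(pilot_lever: str, payload_lever: str) -> str:
    """Allow a transfer only if the two condition levers match."""
    if pilot_lever != payload_lever:
        raise TransferBlockedError(
            "Condition lever mismatch: set payload console "
            f"({payload_lever!r}) to match pilot console ({pilot_lever!r}) "
            "before transferring control."
        )
    return "control transferred"
```

The design point is that the software, not the checklist alone, enforces the precondition; a forgotten checklist step then produces a blocked transfer and a clear message instead of a fuel cutoff.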
Human factors in cockpit design matter in both unmanned and manned aircraft systems. The cockpit needs to be designed in such a way as to allow the pilot to be as efficient and comfortable as possible.  “One of the first formal human factors studies was carried out by Fitts and Jones in 1947 to analyze pilot experiences with display readings” (Flying Towards the Future, 2013).  By the late 1970s cockpits had in excess of 100 individual components the pilots were required to monitor and manage, as the technology of the time only allowed a gauge to display one piece of information (Salas & Maurino, 2010).
Another human factors issue is the limited field of view for the Predator operator. The original design of the GCS gave approximately a 30-degree field of view from the nose camera; it was described as similar to “driving your car with paper towel tubes over your eyes” (Shiner, 2001). The view is shown on one screen for the pilot, overlaid with information such as transponder code, airspeed and
Original GCS Cockpit   (It's Better to Share, 2011)
altitude. A second screen provides data such as a map with a symbol of the aircraft and the corridor for its route of flight. This limited field of view, and the lack of vestibular and proprioceptive cues, means the pilots rely on visual cues to fly; for example, when the runway fills the bottom third of the screen, the nose is raised to flare prior to touchdown.
Raytheon designed a new cockpit for the GCS that now has three wide screens and a 270-degree field of view with some synthetic data overlaid. This allows a large increase in the pilot’s situational awareness. The improved display, along with more ergonomic controls, a more comfortable
Advanced GCS (Pocock, 2007)
and adjustable seat, and new interfaces, increases the pilot’s comfort as well as their situational awareness (Pocock, 2007).
A limited field of view can occur with manned aircraft as well.  Many military and civilian pilots fly at night using Night Vision Devices (NVDs) such as night vision goggles. These goggles limit the field of view to around 40 degrees. The pilot overcomes this limitation by increasing scan rate, looking left and right frequently, and not relying on peripheral vision to see the surroundings.

References
Carrigan, G., Long, D., Cummings, M., & Duffner, J. (n.d.). Human Factors Analysis of Predator B Crash [Scholarly project]. Retrieved January 21, 2018, from https://hal.pratt.duke.edu/sites/hal.pratt.duke.edu/files/u13/Human%20Factors%20Analysis%20of%20Predator%20B%20Crash%20.pdf
Flying Towards the Future: An Overview of Cockpit Technologies (October, 2013). Retrieved January 21, 2018 from http://www.ergonomics.org.uk/flying-towards-the-future/ 
It's Better to Share: Breaking Down UAV GCS Barriers. (2011, October 3). Retrieved January 21, 2018, from https://www.defenseindustrydaily.com/uav-ground-control-solutions-06175/
Pocock, C. (2007, June 16). New UAV Control System May Cut Predator Losses. Retrieved January 21, 2018, from https://www.ainonline.com/aviation-news/defense/2007-06-16/new-uav-control-system-may-cut-predator-losses
Predator RQ-1 / MQ-1 / MQ-9 Reaper UAV. (n.d.). Retrieved January 21, 2018, from http://www.airforce-technology.com/projects/predator-uav/
Salas, E., & Maurino, D. E. (2010). Human factors in aviation (2nd ed.). Amsterdam: Academic Press/Elsevier.

Shiner, L. (2001, April 30). Predator: First Watch. Retrieved January 21, 2018, from https://www.airspacemag.com/military-aviation/predator-first-watch-2096836/?all