Friday, June 27, 2014

Beyond Line of Sight


This week was a look into beyond line of sight (BLOS) technologies. It was quite difficult to obtain information in this area for a simple reason: BLOS is primarily a military-grade technology, so private sector and civilian information on it is hard to come by.
That being said, the following information and analysis presents what I was able to find:
Beyond Line of Sight
Abstract
As of now, beyond line of sight (BLOS) capabilities are typically found in the defense industry. However, as development of NextGen and the systems and products that will integrate unmanned aerial systems (UAS) into the national airspace (NAS) progresses, it is possible that more robust UAS technologies such as BLOS capabilities will be introduced to the private sector. Unmanned aerial systems like the Global Hawk and Predator conduct long-range surveillance and border protection operations using satellite-based data exchange.
Analysis
The Ku band is used for beyond line of sight operations in multiple UASs, including the Global Hawk, the Predator, and their derivatives, which use a BLOS command and control (C2) system relying on an 11.7-12.7 GHz satellite downlink and a 14-14.5 GHz uplink (Valavanis, Oh, & Piegl, 2008). The primary concern with BLOS satellite control links is latency, hence the need for autopilot operations in which the pilot “…remains out of the C2 Loop but monitors the flight operations for unusual situations” (Valavanis, Oh, & Piegl, 2008). In the case of the Global Hawk, the UAS ground control station (GCS) segment consists of a launch and recovery element (LRE) and a mission control element (MCE) (Northrop Grumman, 2014). The RD-2B LRE and the RD-2A MCE work together to provide both line of sight (LOS) and BLOS operations, and operators must switch from the LRE to the MCE for BLOS operation. This handoff introduces human factors risk: if the pilot/operator does not “catch” the aircraft as it leaves LOS, a lost link may occur, or operating status may not be conveyed from one pilot-operator to the next, as happened in the Border Patrol's Predator B accident. Additionally, if the aircraft is on mission for extended periods, multiple pilots will trade control in shifts, introducing potential information-exchange issues. Once operating BLOS, the aircraft's autopilot GUI executes the loaded flight plan, and operators/pilots can alter the flight path if necessary, for example when airspace is crowded or mission parameters change.
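To make the LRE-to-MCE handoff risk more concrete, below is a minimal sketch, in Python, of the kind of lost-link watchdog logic a GCS or autopilot might implement. Everything in it, the state names, the 30-second timeout, and the handoff check, is my own illustrative assumption, not actual Global Hawk or Predator flight logic.

```python
# Hypothetical sketch of a lost-link watchdog during the LOS-to-BLOS handoff.
# Names, timings, and states are illustrative only, not real flight software.

import time
from enum import Enum, auto

class ControlLink(Enum):
    LOS_LRE = auto()    # line-of-sight link to the launch and recovery element
    BLOS_MCE = auto()   # Ku-band satellite link to the mission control element
    LOST = auto()       # no command link; autopilot flies the stored plan

LOST_LINK_TIMEOUT_S = 30.0  # assumed threshold before declaring lost link

def update_link_state(current: ControlLink,
                      last_heartbeat_s: float,
                      mce_has_control: bool,
                      los_in_range: bool) -> ControlLink:
    """Decide which control link (if any) the aircraft should honor."""
    if time.monotonic() - last_heartbeat_s > LOST_LINK_TIMEOUT_S:
        # No command traffic on any link: execute the pre-loaded lost-link plan.
        return ControlLink.LOST
    if current == ControlLink.LOS_LRE and not los_in_range:
        # Aircraft is leaving LOS coverage; the MCE must already have control,
        # otherwise the handoff was missed and the link is treated as lost.
        return ControlLink.BLOS_MCE if mce_has_control else ControlLink.LOST
    return current
```

The point of the sketch is simply that the LOS-to-BLOS transition is a discrete handoff the crews must confirm; any gap in that confirmation becomes a lost-link event the autopilot has to resolve on its own with the stored flight plan.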
Conclusion
The downside to BLOS operations, as previously stated, is latency, which can cause situational awareness issues, a major concern when discussing UAS integration into the NAS. Although new advancements are being made to combat situational awareness issues, latency is why, at this time, I believe LOS operations are the only safe use of UAS in the NAS. However, as technology progresses and BLOS and situational awareness in UAS improve, commercial usage of BLOS UAS operations will increase, and many missions will become feasible, from large-farm crop dusting to delivery of commercial goods to difficult geographical locations (like Coke delivering soda to skyscraper workers). In many parts of the world there are hard-to-reach locations with humanitarian needs such as medicine and food, where dangerous terrain makes roads virtually impassable or militant forces make ground travel too dangerous; UAS BLOS operations are a viable answer there. The downside is cost, as many commercial and private uses do not warrant the investment. I can see where organizations could pool resources and gain positive publicity by funding UAS with BLOS capabilities while aiding those in need, but until such technologies come down into consumer or industrial price brackets, BLOS-capable UAS will be scarce in the NAS. That being said, interest from the private sector could spur development and in turn reduce production costs of BLOS-capable UAS.

References:
Valavanis, K. P., Oh, P. Y., & Piegl, L. A. (2008). Unmanned Aircraft Systems: International
Symposium on Unmanned Aerial Vehicles, UAV’08.
Northrop Grumman (2014). RQ-4 Global Hawk High-Altitude, Long-Endurance Unmanned
Aerial Reconnaissance System Facts. Retrieved from: http://www.northropgrumman.com/Capabilities/RQ4Block20GlobalHawk/Documents/HALE_Factsheet.pdf

Friday, June 20, 2014

UAS in the NAS

This week I did research into the world of NextGen and how it relates to UAS and human factors. This was an interesting and relevant topic, given how concerned people are about exactly how UAS will be integrated into domestic commercial use. The following is my viewpoint on these topics:

UAS Integration in the NAS
Abstract
The goal of the Next Generation Air Transportation System (NextGen) is to update the current systems and processes that govern safety both in the air and on the ground. Through multiple advancements, NextGen seeks to use satellite-based technologies to move air travel into the future, giving aircraft better situational awareness through precision GPS and digital instrumentation, and giving both aircraft and air traffic control better situational awareness on the ground. NextGen will also enable sharing of weather data and airspace conditions. All of this adds up to increased safety for more flights and cost savings over time.
Analysis
The NextGen system fits in well with unmanned aerial systems (UAS). The technologies will work hand in hand: current UAS already use GPS for navigation, and through developments such as MITRE's Intelligent Analyzer it is possible to mitigate lost-link issues and maintain safe airspace for both UAS and manned aircraft (MITRE, 2012). The commercial possibilities are numerous, further deepening the need for nationwide FAA approval. The general public is somewhat leery of UAS due to select media coverage, and this, coupled with the mismatch with manned aircraft system technology, has led to restrictions and regulations that hinder UAS usage and arguably further development. Currently UAS are operated within the national airspace only under strict guidelines and FAA approval. This stems from compatibility issues between UAS systems and ground control systems; because of these technological differences, it is often unsafe to fly UAS in the national airspace. However, as NextGen is implemented, integrating the two systems becomes possible. Research being conducted by a MITRE-FAA partnership seeks to share real-time flight information streaming from a UAS with air traffic control systems. This data link will integrate UAS with manned aircraft and create a complete airspace picture.
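As an illustration of what "sharing real-time flight information" could look like at the data level, here is a small hypothetical example of a UAS position report serialized for a ground data link. The field names and JSON format are my own assumptions for the sketch, not the MITRE-FAA interface or any NextGen standard.

```python
# Hypothetical example of a telemetry record a UAS might stream to an air
# traffic control system so it appears alongside manned traffic. The schema
# is my own illustration, not MITRE's or the FAA's actual interface.

import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class PositionReport:
    callsign: str
    latitude_deg: float
    longitude_deg: float
    altitude_ft: float
    ground_speed_kt: float
    track_deg: float
    link_ok: bool          # False would flag a lost-link condition to controllers
    timestamp_utc: str

def encode_report(report: PositionReport) -> str:
    """Serialize a report for transmission over a ground data link."""
    return json.dumps(asdict(report))

report = PositionReport(
    callsign="UAS123",
    latitude_deg=29.1892,
    longitude_deg=-81.0481,
    altitude_ft=18000.0,
    ground_speed_kt=120.0,
    track_deg=270.0,
    link_ok=True,
    timestamp_utc=datetime.now(timezone.utc).isoformat(),
)
print(encode_report(report))
```

A record like this, refreshed every few seconds, is the kind of feed that would let a UAS appear on a controller's display next to manned traffic; the link_ok flag hints at how a lost-link condition could be made visible to ATC as well.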
The advancements and integration of both UAS and NextGen pose challenges. Research is still being conducted to find the best solution and to determine what human factors issues will arise from such a system. A major concern is that the introduction of more incoming information will produce complications with pilot fatigue and awareness. The technology may be of great benefit, but there is a potential learning curve where mistakes could be made. The FAA’s Human Factors Research Status Report states: “Achieving NextGen will require advanced concepts and technologies, along with higher levels of automation – all of which will result in changes to roles and responsibilities for pilots and air traffic controllers. These transitions, in combination with increased interaction with automation, can lead to unwanted side effects, such as increased errors, loss of situational awareness, or mode confusion.” (FAA, 2012). In my opinion it can be compared to using a cell phone or navigation system while driving: the increased automation will require more incoming data to be monitored and analyzed, and this will create moments where errors may occur, just as in a driving scenario.
Conclusion
The FAA, NASA, MITRE, and countless individuals are all working together to make NextGen as safe as possible while meeting their goal of advancing aviation. Humans are part of every step in NextGen, from design to use, and as such every effort is being made to reduce risks during implementation. The added challenge of UAS integration may create issues, but they are being worked through, and once that work is complete I believe we will see an increase in UAS in the national airspace and greater public acceptance as they become more commonplace.
References:
Federal Aviation Administration (2014). NextGen Implementation (2013). Retrieved from:
http://www.faa.gov/nextgen/implementation/
MITRE Corporation (2014). Integrating UAS into NextGen Systems (2011). Retrieved from:
https://www.youtube.com/watch?v=7hBcugTsWRQ
MITRE (2014). Keeping Track of Unmanned Aircraft by Overcoming Lost Links (2011). Retrieved from:
http://www.mitre.org/publications/project-stories/keeping-track-of-unmanned-aircraft-by-overcoming-lost-links
Federal Aviation Administration (2014). Next Generation Air Transportation System Human
Factors Research Status Report (2012). Retrieved from:
http://www.jpdo.gov/library/2012_Human_Factors_Research_Status_v2.0.pdf

Friday, June 13, 2014

UAS Ground Control Station

This week was an interesting dive into the world of ground control stations (GCS). I did a bit of research and found a site listing some popular GCSs, and while reading through them I stumbled on Raytheon Company's Common Ground Control System, which really sparked my interest. I know the unit has been out for a while, but developments are apparently still being made and the product tweaked. Below is my quick review and analysis of the CGCS:

Ground control stations (GCS) are constantly evolving. As technology and methodology in the UAS field progress, capabilities are being tested and the boundaries of what a system can or cannot do are being pushed. To that end, the Raytheon Company has created a unique GCS known as the Common Ground Control System (CGCS) that leverages the hand-eye coordination and learning curves found in video game design to build a GCS that is more intuitive and lessens the time needed for training. The intent is to adhere to the NATO STANAG 4586 standard and create a "universal" GCS.
Through its use of a first-person perspective, the CGCS "...immerses the pilots or the operators in the system and helps them project their minds into the battle space. They actually feel like they are riding on the UAV,” according to Mark Bigham of Raytheon Intelligence and Information Systems (Defense Industry Daily, 2014). Raytheon's CGCS supports multiple UAV types and aims to reduce losses and errors through its integrated system, with customizable configurations and human factors considerations such as allowing pilots/operators to stand or sit and use multi-function controls (Raytheon Company, 2014).
Running common core UAS U2 software, the CGCS is the sole system that gives the U.S. government administration rights to the source code and interfaces: "The government has the source code to the UAS framework, owns the open, documented interfaces and makes them readily available for vendors to adapt and compete to provide the latest innovative ideas and applications." (Raytheon, 2014). According to the company, the CGCS provides three benefits:
- Flexibility to scale the ground station from large headquarters implementations all the way down to handheld-phone-size controllers.
- Allows unmanned systems management functions and information to be distributed across the total enterprise.
- An open, common, nonproprietary architecture that minimizes life-cycle costs, simplifies configuration management, and reduces training time and costs.
It is in all these claims that certain human factors questions can arise. A primary feature of the CGCS is the cockpit view for ground pilots. While this is certainly an improvement, limitations in camera lens angle still exist and can degrade airspace situational awareness. This limitation exists in manned flight as well, but there it is largely confined to peripheral vision. In UASs the field of view is currently limited, leaving operators to rely on GPS positioning to estimate proximity to other aircraft, an approach that is not always successful (a rough sketch of the idea follows below). To combat this, the CGCS is heading in the right direction by increasing the display area; as shown on Raytheon's website, operators use three monitors to broaden their viewpoint.
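As an illustration of what relying on GPS positioning for proximity actually involves, the sketch below computes the great-circle distance between two reported positions with the haversine formula and flags a conflict inside an assumed 5 NM threshold. The threshold, function names, and alert idea are illustrative assumptions, not CGCS features.

```python
# Illustrative proximity check from GPS positions using the haversine formula.
# The 5 NM threshold is an assumption for the example, not a CGCS parameter.

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_NM = 3440.065  # mean Earth radius in nautical miles

def haversine_nm(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/lon points, in nautical miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_NM * asin(sqrt(a))

def too_close(own: tuple, other: tuple, threshold_nm: float = 5.0) -> bool:
    """Flag another aircraft whose reported position is inside the threshold.
    Note this ignores altitude, so vertical separation is not considered."""
    return haversine_nm(own[0], own[1], other[0], other[1]) < threshold_nm

# Example: two aircraft roughly 3.6 NM apart at the same longitude.
print(too_close((29.19, -81.05), (29.25, -81.05)))  # -> True
```

Even this simple check shows the limitation described above: it only works if both position reports are current and accurate, and it says nothing about traffic the camera cannot see.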
Secondly, the CGCS claims to reduce manpower requirements by 20% (Defense Update, 2014). But as the necessary crew shrinks, the cognitive load on pilots/operators increases, with more information coming their way and more responsibility placed on them. Manned pilots also process many sensory inputs at a time, but those are split between the HUD/equipment and their own sensory perceptions. A UAS pilot has no such benefit of being "there" in the aircraft to use naturally occurring cues to aid decisions; they are left with only the digital inputs presented on screen. One technique to mitigate sensory overload is to keep operating times to a minimum, meaning faster pilot scheduling turnaround. Working in short "bursts" on alternating shifts can keep minds fresh and focused on all the information streaming their way. Additionally, I believe adding more than visual cues would help operations. Incorporating motion cues linked to the aircraft's gyro outputs, much like a simulator, could give ground pilots more of the sensation a traditional manned pilot would feel and increase focus as the interface becomes more "real." This approach is already used in equipment design, where the goal is to keep the user focused, fend off boredom, and keep thoughts from drifting from the task at hand.
            The CGCS is a revolutionary design that has caught the attention of the UAS community and government agencies. The developments the system introduces will help mitigate certain human factors issues presently found in traditional proprietary GCSs. Its unique training methods, intuitive controls, and interface make the CGCS a candidate for more user-friendly GCSs. There are still factors to be aware of, and the CGCS is not a solve-all solution; however, the technology and capabilities it introduces will benefit operators looking to improve the human-machine interface and better conduct UAS operations.

References:
Raytheon Company (2014). Common Ground Control System (CGCS) (2014). Retrieved from:
http://www.raytheon.com/capabilities/products/cgcs/

Defense Industry Daily (2014). It’s Better to Share: Breaking Down UAV GCS Barriers (2011).
Retrieved from: http://www.defenseindustrydaily.com/uav-ground-control-solutions-06175/

Defense Update Magazine (2014). Raytheon Offers More Efficient Ground Control for the
Predator. Retrieved from: http://defense-update.com/products/c/cgcs.html#more

Friday, June 6, 2014

Bio

As a student of aeronautics and a designer who has worked on multiple contracts, from aircraft carriers to reverse engineering the USS Monitor, I have had the privilege of creating this blog to document my progress as I work through my Master's degree in Aeronautical Science.

Aviation is a passion, and as such I love to study and analyze all of its facets. I will periodically update this blog to showcase my research and document my progress towards my ultimate goal of obtaining my Ph.D. in Aviation.