Advanced Air Mobility (AAM) is coming sooner rather than later. So soon that between Harbour Air aiming to go fully electric in 2022, the partnership between Drone Delivery Canada (DDC) and Edmonton International Airport (EIA), and Unither Bioélectronique Inc.'s 2021 Breath in the sky, one could argue that AAM is already here: not yet in the volume it will one day reach, but in parts. But what will happen when AAM achieves its full potential? In Canada, people can only fly drones in certain areas because of the potential danger; will the same restrictions apply to Urban Air Mobility (UAM)?
AAM infrastructure: AAM, traditional aviation, and the airspace
A curious aspect of integration is that AAM and traditional aviation don't necessarily share the same airspace, at least when thinking about UAM. The reason is altitude. Conventional Take-Off and Landing (CTOL) and Short Take-Off and Landing (STOL) aircraft fly many metres above cities. Except when taking off and landing, chances are people only see these more traditional airplanes as a minor point up in the sky. And in terms of sharing the airspace, those planes already have reliable, well-proven systems to ensure accidents aren't a common occurrence.
The moments when AAM and conventional aviation share the same airspace are take-off and landing. Still, with transponders, communications systems, and air traffic management (ATM), it's possible for AAM aircraft to fly close to traditional aviation. As mentioned earlier, DDC and EIA are doing precisely this daily.
Furthermore, with geofencing, it's possible to prevent uncrewed aircraft from entering the airspace above airports, military bases, power stations, nuclear plants, and other sensitive sites. So to answer the question, "How will AAM and traditional aviation share the airspace?": with investment in technology, especially transponders, communications systems, ATM, and geofencing, they'll share the airspace seamlessly.
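At its core, a geofence check is just a point-in-polygon test against a restricted zone. As a minimal sketch (the coordinates and zone below are hypothetical; a real system would use geodetic libraries and officially published no-fly zones):

```python
def inside_geofence(point, polygon):
    """Ray-casting point-in-polygon test.
    point: (lon, lat); polygon: list of (lon, lat) vertices."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from `point` cross this polygon edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical restricted zone around an airport (lon, lat corners).
no_fly = [(-113.60, 53.28), (-113.56, 53.28),
          (-113.56, 53.32), (-113.60, 53.32)]
print(inside_geofence((-113.58, 53.30), no_fly))  # inside the zone -> True
print(inside_geofence((-113.70, 53.30), no_fly))  # well outside -> False
```

An onboard flight controller would run a test like this continuously and refuse to arm, or force a return-to-home, when the answer is True.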
But how will AAM share the airspace with AAM?
AAM infrastructure: AAM sharing the airspace with AAM
If UAM tends not to interact much with CTOLs and STOLs, everything changes when talking about Vertical Take-Off and Landing (VTOL) aircraft. VTOLs will be a part of everyday life, flying around the way cars drive around, which means they'll exist in significant numbers and collisions become a real risk.
As Aijaz Hussain, Vincent Rutgers, and Tim Hanley write in Deloitte Insights,
[AAM has to have an] Advanced detection and collision avoidance system. The established infrastructure for aircraft to communicate is expanding, with systems such as ADS–B (automatic dependent surveillance-broadcast) that show other aircraft currently aloft. But to make on-the-fly decisions and ensure passenger and cargo safety, autonomous eVTOL aircraft would need to be able to see even farther ahead. Enhanced detect-and-avoid technology that uses micro or millimetre wave technology is needed to (a) accurately identify and measure objects over longer distances, especially in difficult terrain and unsafe operating environments and (b) assist in real-time decision-making to establish safe navigation during bad weather conditions.
And an article by FutureFlight adds,
[Marie-Pierre] Guilbert [product line manager, Thales Communication & Security] explained that the new ACAS [aircraft collision avoidance system] format will require aircraft to be fitted with both cooperative sensors (like those used in existing traffic collision avoidance systems or the radio function of air traffic control systems) and a non-cooperative sensor, such as radar, or an optical or infrared sensor. For the cooperative sensor, Thales could provide an adapted version of its existing hardware, such as a Mode S transponder. The input from the sensors is fused together by the algorithms developed by Thales.
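The fusion idea Guilbert describes can be illustrated with a toy sketch: match each cooperative (e.g. ADS-B) track to the nearest non-cooperative (e.g. radar) detection, and keep unmatched radar contacts as unknown intruders. All positions, identifiers, and thresholds below are hypothetical; real ACAS fusion algorithms are far more sophisticated.

```python
import math

def fuse_tracks(adsb_tracks, radar_detections, max_gap_m=150.0):
    """Match each ADS-B (cooperative) track to the nearest radar
    (non-cooperative) detection within max_gap_m; average matched
    positions, and keep unmatched radar contacts as unknown intruders."""
    fused = []
    unmatched = list(radar_detections)
    for ident, (ax, ay) in adsb_tracks.items():
        best, best_d = None, max_gap_m
        for r in unmatched:
            d = math.hypot(ax - r[0], ay - r[1])
            if d < best_d:
                best, best_d = r, d
        if best is not None:
            unmatched.remove(best)
            fused.append((ident, ((ax + best[0]) / 2, (ay + best[1]) / 2)))
        else:
            fused.append((ident, (ax, ay)))
    fused += [("unknown", r) for r in unmatched]
    return fused

# Hypothetical local-frame positions in metres.
adsb = {"C-GABC": (1000.0, 2000.0)}
radar = [(1010.0, 1990.0), (4000.0, 500.0)]
print(fuse_tracks(adsb, radar))
# -> [('C-GABC', (1005.0, 1995.0)), ('unknown', (4000.0, 500.0))]
```

The point of the sketch is the structure, not the math: a cooperative track confirmed by radar gets a refined position, while a radar-only contact still appears in the picture the avoidance logic sees.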
Some of these sensors are:
Stereo vision sensors: Stereo vision works similarly to 3D sensing in human vision. Stereoscopic vision calculates depth information by combining two-dimensional images from two cameras at slightly different viewpoints;
Ultrasonic sensors (Sonars): An ultrasonic sensor sends out a high-frequency sound pulse and then times how long it takes for the echo to reflect back. The ultrasonic sensor has two openings: one transmits the ultrasonic waves (like a tiny speaker), and the other receives them (like a small microphone). Submarines use sonar to locate themselves underwater, and the inspiration to create this technology came from observing bats;
Time-of-Flight sensors: A Time-of-Flight camera consists of a lens, an integrated light source, a sensor, and an interface. Those four components capture depth and intensity information simultaneously for every pixel in the image, making it extremely fast, with high frame rates and highly accurate depth information;
Infrared sensors: Similar to sonar, the Infrared (IR) sensor works by emitting a signal and reading what bounces back. But where sonar uses sound, IR uses an infrared beam to locate possible obstacles;
LiDAR sensors: A LiDAR sensor calculates distances and detects objects by measuring the time it takes for a short laser pulse to travel from the sensor to an object and back, calculating the distance from the known speed of light;
Monocular vision sensors: Monocular sensors capture images through a single-lens camera and reconstruct 3D depth from a single still image.
So again, AAM will be able to share the airspace with AAM safely, and consequently enable UAM, thanks to technology that already exists and that industry and academia refine daily with newer, better, and safer solutions.
Technological advancement is critical to creating an airspace safely shared by traditional aviation and AAM.
By Giovani Izidorio Cesconetto