
Invisible Lines, Visible Tragedies: Why "See and Avoid" Failed in DC and What Level 1 RPAS Pilots Must Learn from the Concept of Systemic Failures

By: Colonel (ret) Bernie Derbach, KR Droneworks, 27 Jan 26


On January 29, 2025, the aviation world was shaken by a catastrophic mid-air collision between an Army Black Hawk helicopter and American Eagle Flight 5342.


As we reflect on the NTSB's final findings released this week, it is clear this was not a simple case of "pilot error." Instead, it was a systemic collapse—an alignment of latent flaws that serves as a chilling case study for all aviators, particularly those operating in the evolving world of Remotely Piloted Aircraft Systems (RPAS).


Below is a deep dive into how these findings intersect with Transport Canada’s TP15530 knowledge requirements, human factor theories, and the critical lessons for Level 1 Complex Operations.


The Perfect Storm: A Systemic Failure Analysis


The NTSB’s determination of the "probable cause" emphasizes that the disaster was the result of a "multitude of errors and systemic issues across multiple organizations." While the Black Hawk was flying 78 feet above its assigned altitude and the crew’s "see and avoid" vigilance failed, these were merely the final links in a long chain of systemic neglect.


1. The "Swiss Cheese" Alignment


In James Reason’s Swiss Cheese Model, an accident occurs when the holes in various layers of defense (organizational, supervisory, and technical) line up.


  • Organizational Failure: The FAA ignored years of warnings from its own air traffic working groups. Since 2021, over 15,000 "separation incidents" occurred near Reagan National (DCA), yet recommendations to move helicopter routes were rejected because the issue was deemed "too political."

  • Technical Flaw: The Black Hawk’s barometric altimeter was reading 80 to 100 feet lower than actual altitude—a flaw later found in multiple aircraft of the same unit.

  • Workload/Staffing: A single controller was managing six airplanes and five helicopters simultaneously. This high-density environment led to "visual separation" becoming a standard crutch rather than an emergency exception.


2. The Myth of "See and Avoid"


The NTSB’s visibility study revealed a sobering truth: the cockpit structures, windshield limits, and the use of night-vision goggles (NVG) created significant "blind spots." Expecting pilots to reliably spot a crossing target against the glaring backdrop of the D.C. skyline is a systemic flaw in airspace design, not just a lapse in pilot vigilance.


Linking the Disaster to TP15530: Human Factors and Complex Ops


For Canadian RPAS pilots, specifically those aiming for Level 1 Complex Operations (BVLOS), the TP15530 document is the bible of safety. This accident highlights several key sections of the knowledge requirements:


Section 801: Human Factors & Error Management


TP15530 requires pilots to understand the limitations of the human eye and the effects of automation reliance.


  • The Lesson: The Black Hawk pilots relied on their altimeter (technical automation) while the controller relied on the pilots' visual confirmation. When both failed, there was no "buffer." For drone pilots, over-reliance on Ground Control Station (GCS) telemetry or a "Detect and Avoid" (DAA) sensor without understanding their failure modes is a direct parallel.


Section 601: Airspace Design and "Atypical" Airspace


Level 1 Complex Operations often involve flying in "atypical" or congested environments.


  • The Lesson: The NTSB found that aeronautical charts did not adequately warn airplane pilots of the helicopter routes intersecting their approach paths. TP15530 emphasizes the ability to "Identify signal limitations caused by the flight environment and geography." In D.C., the "geography" was a digital and visual maze that the system failed to map safely.


Section 1001: Safety Management & Risk Assessment


A core requirement of TP15530 is the ability to "Assess environmental considerations related to operations."


  • The Lesson: The FAA’s failure to act on 85 "near misses" in three years is a textbook failure of a Safety Management System (SMS). For a complex drone operator, ignoring a recurring "glitch" in a C2 link or a recurring "near-proximity" event with a bird or building is the exact behavior that led to Flight 5342's demise.
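The SMS principle above—a recurring "glitch" must trigger a review before it becomes normalized—can be sketched in a few lines of Python. This is a hypothetical illustration, not a TP15530-mandated tool; the category names and the three-occurrence threshold are assumptions for the example.

```python
# Hypothetical SMS-style occurrence log for an RPAS operation.
# Category names and the review threshold are illustrative assumptions.
from collections import Counter
from dataclasses import dataclass


@dataclass
class Occurrence:
    category: str  # e.g. "C2_link_dropout", "near_proximity"
    detail: str


class OccurrenceLog:
    """Tracks recurring events so a repeated 'glitch' cannot be quietly normalized."""

    def __init__(self, review_threshold: int = 3):
        self.records: list[Occurrence] = []
        self.review_threshold = review_threshold

    def report(self, category: str, detail: str) -> None:
        self.records.append(Occurrence(category, detail))

    def categories_needing_review(self) -> list[str]:
        # Any category reported at threshold or above demands a formal risk review,
        # rather than being accepted as "just how the system behaves."
        counts = Counter(r.category for r in self.records)
        return [c for c, n in counts.items() if n >= self.review_threshold]


log = OccurrenceLog(review_threshold=3)
for flight in ("A", "B", "C"):
    log.report("C2_link_dropout", f"brief dropout on flight {flight}")
print(log.categories_needing_review())  # -> ['C2_link_dropout']
```

The point is not the code but the discipline: every occurrence is written down, and the system—not the pilot's memory—decides when a pattern has emerged.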


Recommendations for the Modern Drone Pilot


If you are operating under a Pilot Certificate – Level 1 Complex Operations, you are part of a system that is becoming increasingly crowded. The D.C. collision proves that "visual separation" is a fragile defense.


  1. Trust, but Verify Telemetry: Just as the Black Hawk’s altimeter was off, your drone’s GPS or altitude data can drift. Always use secondary references (visual landmarks or redundant sensors) when operating near critical infrastructure or other aircraft.

  2. Understand Your "Blind Spots": Whether it's the latency of your FPV feed or the physical limitations of your DAA sensor, know where your system is "blind." Never assume that "clear skies" equate to a clear path.

  3. Active Communication is Mandatory: The NTSB noted that "airplane communications were only heard in other airplanes." In complex drone ops, ensure your communication plan includes monitoring local CTAF/MF frequencies, even if you are flying BVLOS.

  4. Advocate for Systemic Safety: If you notice a flaw in a flight route or a recurring technical issue, document it. The "normalization of deviance"—accepting a small risk until it becomes standard—is what killed 67 people in Washington.
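The first recommendation above—cross-checking telemetry against an independent reference—can be sketched as a simple disagreement check. This is a minimal illustration, not flight-control code; the sensor names and the 80-foot tolerance are assumptions chosen to echo the 80–100 ft altimeter error found on the Black Hawk.

```python
# Hypothetical cross-check of two independent altitude sources.
# Sensor names and the 80 ft tolerance are illustrative assumptions,
# not a published standard.

def altitude_disagreement_ft(baro_alt_ft: float, gps_alt_ft: float) -> float:
    """Absolute disagreement between barometric and GPS-derived altitude."""
    return abs(baro_alt_ft - gps_alt_ft)


def telemetry_trustworthy(baro_alt_ft: float, gps_alt_ft: float,
                          tolerance_ft: float = 80.0) -> bool:
    """False when the two sources diverge beyond tolerance.

    A disagreement this large should halt the operation for a cross-check,
    not be averaged away or attributed to one "known-bad" sensor.
    """
    return altitude_disagreement_ft(baro_alt_ft, gps_alt_ft) <= tolerance_ft


print(telemetry_trustworthy(300.0, 390.0))  # -> False (90 ft split)
print(telemetry_trustworthy(300.0, 320.0))  # -> True  (20 ft split)
```

The design choice matters: the check flags disagreement rather than picking a "winner," because—as the Black Hawk's altimeter showed—there is no way to know in flight which source has drifted.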


The 2025 Potomac collision was a failure of design, policy, and oversight. As RPAS pilots, we must ensure that our "Level 1 Complex" systems are built on the lessons of the past, ensuring that our checklists and risk assessments account for the "invisible" systemic flaws that no pilot, no matter how skilled, can see and avoid on their own.


Final Thought: Honor Through Safety


The families of those lost in D.C. deserve a world where the system is designed to catch the pilot's mistake, not amplify it. As drone pilots, we are the new entrants into this complex sky. By adhering to the rigorous standards of TP15530 and acknowledging our own human limitations, we ensure that our wings—though smaller—are part of a safer, more resilient system.


References


Investigative & Accident Reports

  • National Transportation Safety Board (NTSB). (2025). Aircraft Accident Investigation Report: Mid-air Collision of American Eagle Flight 5342 and US Army UH-60M Black Hawk (DCA25MA102). Washington, DC: NTSB Publishing Office.

  • NTSB Aviation Safety Recommendations. (2025). Urgent Safety Recommendations A-25-15 through A-25-22: Enhancement of Vertical Separation and Airspace Redesign in the National Capital Region.

  • Federal Aviation Administration (FAA). (2024). Special Flight Rules Area (SFRA) Internal Audit: Analysis of Visual Flight Rules (VFR) Corridors and High-Density Traffic Conflict Points.

Canadian Regulatory Framework

  • Transport Canada. (2023). TP 15530 – Knowledge Requirements for Pilots of Remotely Piloted Aircraft Systems (RPAS) Operating in BVLOS and Other Level 1 Complex Operations. Ottawa, ON: Government of Canada.

  • Transport Canada. (2024). Advisory Circular (AC) 903-001: Standard Maneuvering Procedures for RPAS in Controlled Airspace.

Human Factors & Systemic Theory

  • Reason, J. (1990). Human Error. Cambridge University Press. (The foundational text for the "Swiss Cheese Model" of systemic failure referenced in the analysis).

  • Dekker, S. (2011). The Field Guide to Understanding 'Human Error'. Ashgate Publishing. (Used for the analysis of "Normalization of Deviance" regarding repeated safety warnings).

  • Skybrary Aviation Safety. (2025). The Limitations of See-and-Avoid in High-Density Operations.

