What if the most dangerous moment in aviation isn't the emergency — it's the ten seconds before the crew realizes there is one? This week's Safety Layer takes the Teterboro EMAS overrun apart layer by layer: the infrastructure, the bias, and the three things operators must change before the next constrained-runway approach.

EMAS stopped the aircraft. Expectation bias caused the problem. Here is the full breakdown — what happened, why it happened, and what to change.

The aircraft touched down fast.

The runway was already running out.

In the next 12 seconds, the crew made a decision they believed was safe.

It wasn't.

01 — What actually happened

The approach was unstable. Speed, descent profile, and runway remaining were no longer aligned with a safe landing outcome. At that point, the correct decision was unambiguous: go around.

Instead, the landing continued.

By the time the crew fully recognized the severity of the situation, the margin for correction had disappeared. The aircraft exited the runway and was brought to a stop by the Engineered Materials Arresting System (EMAS) — a bed of crushable concrete blocks installed at the end of Runway 24, approximately 300 feet from Route 46, a six-lane highway.

Runway margin disappears faster than pilots perceive it.

No fatalities. No hull loss. But this was not a success story. It was a near-miss with a safety net — and the difference between the two is not safety. It is luck compounded by infrastructure.

02 — The real problem: expectation bias

This was not simply "pilot error." That label closes the conversation at the moment it should open. What happened here was expectation bias — one of the most persistent and well-documented cognitive traps in aviation, and one of the least visible in the moment it operates.

Expectation bias activates when three conditions align:

  • The pilot believes the situation is under control

  • They interpret incoming information to confirm that belief

  • Contradictory signals are delayed, minimized, or reclassified as manageable

In high-workload environments, the brain does not process reality objectively. It processes what it expects to see. The final approach and rollout phases are precisely the conditions under which this bias is strongest — high workload, time pressure, and a strong prior expectation that the landing will succeed because landings almost always do.

Expectation bias doesn’t feel like a bias. It feels like confidence.

In the Teterboro event, the crew likely held three beliefs simultaneously — each individually reasonable, collectively dangerous:

"The landing is recoverable."
Reasonable in isolation. Wrong in combination.

"The runway remaining is sufficient."
Felt true. Was not calculated.

"This is within normal limits."
Based on expectation, not data.
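"Felt true. Was not calculated." The difference between those two is arithmetic. A minimal sketch of what "calculated" looks like, using the idealized stopping-distance relation d = v² / 2a — all numbers below (speeds, deceleration, runway remaining) are invented for illustration and are not from the Teterboro event:

```python
# Hypothetical illustration: "runway remaining is sufficient" is a number,
# not a feeling. Every value here is assumed, not taken from the incident.

def stopping_distance_ft(groundspeed_kt: float, decel_g: float) -> float:
    """Idealized ground-roll stopping distance d = v^2 / (2a),
    ignoring touchdown transition delay, slope, and wind."""
    v_fps = groundspeed_kt * 1.68781      # knots -> feet per second
    a_fps2 = decel_g * 32.174             # g -> ft/s^2
    return v_fps ** 2 / (2 * a_fps2)

remaining_ft = 3000.0                     # assumed runway left at touchdown
on_speed = stopping_distance_ft(130, 0.35)   # assumed target speed
fast = stopping_distance_ft(140, 0.35)       # same landing, 10 kt fast

print(f"on-speed: {on_speed:,.0f} ft needed, margin {remaining_ft - on_speed:,.0f} ft")
print(f"10 kt fast: {fast:,.0f} ft needed, margin {remaining_ft - fast:,.0f} ft")
```

Because the relation is quadratic in speed, a 10-knot exceedance erodes the margin far more than it "feels" like it should — which is exactly the gap expectation bias fills with confidence.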

03 — Why this matters more than the incident

Events like this are quickly labelled "unstable approach" or "poor judgement."

That framing is not wrong — but it is incomplete, and incomplete framing produces incomplete corrective action.

Expectation bias is not a rare failure or a character flaw.

It is a predictable, documented human behavior under pressure.

It is the reason stable approach criteria exist.

It is the reason go-around culture must be explicitly built — not assumed.

And it is the reason EMAS exists: because the industry already knows that all preventive layers will occasionally fail together.

What Teterboro adds to the record is this: when a crew's cognitive model and the aircraft's physical situation diverge, the model almost always wins — right up until the runway runs out.

04 — What operators should take seriously

  1. Go-around culture must be non-negotiable

    Not encouraged. Not suggested. Expected, normalized, and never penalized — operationally, commercially, or culturally. If a go-around is treated as an inconvenience in your operation, expectation bias has a structural ally.

  2. Training must go beyond procedures
    Crews already know the rules. What they need is deliberate exposure to ambiguous scenarios, time pressure, and situations where "it still looks manageable" — the exact conditions under which expectation bias activates. Procedure recitation does not build resistance to cognitive bias. Scenario practice does.

  3. Safety systems are not solutions
    EMAS worked exactly as designed. But a system that relies on passive last-line defenses is not practicing safety — it is managing the consequences of failed safety. Every EMAS activation is evidence that multiple active layers failed first.
    Treat it accordingly.

05 — Teterboro takeaway

The aircraft was stopped safely. But safety was not created in those final seconds.

It was almost lost there.

If there is one takeaway from Teterboro, it is this:

the most dangerous decisions in aviation are the ones that feel reasonable at the time.

Expectation bias does not announce itself. It does not feel like bias.

It feels like confidence, right up until the moment the runway ends.

P.S. The EMAS stopped the jet. But it didn't stop the investigation. In your operation — what is the final passive layer? And when did you last stress-test the active ones that are supposed to make it unnecessary? Reply and tell me.

Ghanshyam Acharya
Founder - The Safety Layer
Human Factor Instructor
