You are driving fast, maybe too fast, on a highway at night. Maybe it’s snowing, or raining, or your eyes are glazing over as you feel the fatigue of a long day set in, or maybe your phone dings and you glance down for an instant. Suddenly the car in front of you stops and you hit the brakes. You feel your tires skid and, for a second, you are sure you are about to crash.
But then: Nothing.
You stopped just in time. Heart pounding, you exhale. You are shaken but also impressed by your speedy reflexes. You think to yourself: No harm done.
But harm nearly done. And that’s the problem.
Near misses like this often disappear from our minds as fast as they happen. But they are the most valuable safety information we have. People, organizations and societies often fail to prevent disasters, not for lack of warnings, but because they don’t take near misses seriously.
Safety scientist James Reason saw near misses as “immunizations” for a safety system, chances to detect and fix underlying vulnerabilities before real harm occurs. But too often, we waste these opportunities. We get lucky, and instead of investigating what went wrong, we move on.
My interest in near misses comes from practising medicine and from my research into the history of disasters and system failures, work that informed my book Written in Blood. Studying accidents across fields, from fires to transportation to health care, shows that warning signs are often visible long before catastrophe strikes.

Luck is not a strategy
Take something as mundane as your phone. In late 2025, Apple released iOS 26.1, a routine software update. Except it wasn’t routine. It patched multiple critical vulnerabilities that could have allowed attackers to seize control of iPhones. Had hackers succeeded, millions of users’ data and privacy could have been compromised. And while some phones probably had been hacked, for most people, the crisis was avoided.
In health care, near misses are common: A medication nearly given to the wrong patient but caught in time, or a surgical tool counted incorrectly but found before the patient’s incision is closed. These are serious signals, but too often they go unreported. The majority of health-care workers fail to report near misses due to fear of blame, lack of feedback or the false belief that no harm means no problem.
Often, staff in health care don’t even realize a near miss has occurred. If we’re not looking for near misses, we are nearly guaranteed not to learn from them.
Transportation shows the same pattern. Near-collisions on icy highways. Trains braking just before overshooting a signal. Aircraft diverting after onboard systems detect a mechanical fault mid-flight. In aviation and rail, these close calls are treated as data. In many other sectors, they are dismissed as background noise. But the data is there.
A recent Canadian Automobile Association (CAA) study found that at just 20 monitored intersections, more than 610,000 “near-miss” incidents — close calls between vehicles and pedestrians or cyclists — were recorded from September 2024 to February 2025.
Our systems are sending signals. Every time we get lucky is a chance to learn, to build better layers of defence, to prevent the next tragedy. Near misses aren’t false alarms. They’re the most honest feedback a system gives: The future, whispering in the present.
Our brains aren’t wired for prevention
So why don’t we learn from close calls?
Psychologists have long understood that the human brain is terrible at processing invisible risks. We overreact to dramatic events but underreact to near misses. We confuse luck with safety. And we discount what “almost” happened.
Three psychological traps are especially pernicious:
- Availability bias: We remember big disasters, but not the hundreds of times catastrophe was narrowly averted. This skews our risk radar.
- Confirmation bias: We assume a system is safe because it didn’t fail. But many systems survive not because they’re strong, but because nothing has lined up to break them — yet.
- Optimism bias: We know bad things happen to other people but assume our skill or luck will protect us.
Reason’s “Swiss cheese” model describes how disasters happen when weaknesses in multiple layers of defence align. A near miss is when they almost line up and something, often by chance, blocks the path. But unless we plug those holes, the next time, we might not be so lucky.
There are exceptions. Sectors such as aviation, nuclear energy and air traffic control, home to so-called “high-reliability organizations,” understand this. Ideally, they treat every close call as a data point. They institutionalize reporting. They never forget to be afraid.
These organizations cultivate a chronic unease, a kind of productive paranoia. It’s not pessimism; it’s realism. They know that systems often drift toward failure unless they’re constantly corrected. That mindset is why they’re among the safest sectors in the world.
Imagine if we brought that mindset to more sectors: if every phishing text that almost fooled someone became a reason to upgrade security, if every minor medical error were reviewed like a crash. The price of ignoring near misses is always paid eventually, in insurance claims, infrastructure failures, lawsuits and preventable grief.
What you can do now
If near misses are warning flares, the simplest step is to stop ignoring them. When something almost goes wrong, the instinct is often to shrug it off as luck. But luck is data. It is evidence that a system came close to failing.
The real lesson of near misses is that they allow us to learn without paying the full price of disaster. Aviation, nuclear power and other high-risk industries have built entire safety systems around studying these moments.
We should treat them the same way in everyday life: on the road, at home and at work. Notice them. Talk about them. Fix the conditions that made them possible.
Because the goal is not simply to avoid disaster. The goal is to learn from the moments when things almost go wrong.

