The Luck Trap: Why Breaking the Rules Becomes the Norm in High-Stakes Work

Systems Thinking · February 19, 2026

What you will learn in this nib: how to spot the slow slip into risky shortcuts, understand why it happens, and use practical habits to keep safety rules strong and protect your team.

Imagine for a moment that you are a high-wire walker. On your first day, you wear a thick safety harness, a giant net stretches out below you, and a team of spotters watches your every move. You feel secure because the rules are strict and the consequences of a slip are carefully managed.

However, after a few weeks of perfect performances, you notice the harness is a bit itchy and the net takes a long time to set up. You decide to loosen the straps just a little bit. Nothing happens. You still make it across the wire. Emboldened by your success, you eventually stop using the harness altogether. One day, you decide the net is unnecessary because, after all, you "never fall." You haven’t actually become a better walker; you have simply grown comfortable with an increasing amount of danger.

This subtle, creeping shift from "following the rules" to "cutting corners without consequence" is a phenomenon known as the Normalization of Deviance. It is one of the most dangerous psychological traps in modern engineering, medicine, and aviation. It suggests that disasters rarely strike like a sudden bolt of lightning. Instead, they are usually the final, predictable result of a long series of tiny, ignored warnings. We often mistake a lack of disaster for the presence of safety. In reality, we might just be getting lucky while operating far outside the lines of sensible design. By the time we realize we are in trouble, the risky behavior has become the local standard, and the original safety rules look like ancient, irrelevant history to the people on the ground.

The Quiet Erosion of Excellence

Sociologist Diane Vaughan famously coined the term while investigating the 1986 Space Shuttle Challenger disaster. To the outside world, it looked like a sudden mechanical failure of an O-ring (a rubber seal) during a cold morning launch. But when Vaughan looked deeper into the records, she found something much more chilling. NASA engineers had seen signs of O-ring erosion on previous flights. Each time it happened, they held a meeting, analyzed the data, and concluded that because the shuttle had returned safely despite the damage, the risk was "acceptable." They didn't fix the problem; they just adjusted their definition of what was okay. Stepping away from the original design became the new normal.

This process is insidious because it doesn't feel like "breaking the rules" to those involved. It feels like adapting to reality. In a high-pressure environment, whether you are launching rockets or managing a hospital ward, there is always a push to be faster, cheaper, and more efficient. When a technician skips a minor checklist item to save ten minutes and the machine still works perfectly, their brain receives a powerful reward. They feel they have discovered a "shortcut" that the "overly cautious" rule-makers didn't know about. Over months or years, these small shortcuts stack on top of one another until the original safety manual is effectively discarded, replaced by a culture of "how we actually do things around here."

This shift happens because humans are naturally poor at evaluating "silent" risks. We are programmed to respond to immediate feedback. If you touch a hot stove, you get burned instantly and never do it again. But if you skip a safety check and nothing explodes, your brain records that as a victory. You saved time, you met your deadline, and everyone is happy. The danger is still there, perhaps even higher than before, but because it didn't cause a catastrophe, we assume it doesn't exist. We confuse "no accidents" with "high safety," failing to realize that we are surviving on a shrinking margin of error.
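The arithmetic behind this luck trap is simple and worth seeing once. The sketch below (in Python, with purely illustrative failure rates) computes the chance that a shortcut eventually bites: even if one skipped check carries only a 1% risk, repeating it a few hundred times makes an eventual failure all but certain.

```python
# A minimal sketch with invented numbers: if one skipped check fails with
# probability p, then over n repetitions:
#     P(at least one failure) = 1 - (1 - p)**n

def prob_of_eventual_failure(p_single: float, repetitions: int) -> float:
    return 1.0 - (1.0 - p_single) ** repetitions

for n in (1, 10, 100, 500):
    print(f"{n:>3} shortcuts at 1% risk each -> "
          f"{prob_of_eventual_failure(0.01, n):.0%} chance of at least one failure")

# Prints roughly 1%, 10%, 63%, and 99% -- the risk of each individual
# shortcut never changed; only the number of dice rolls did.
```

Nothing about any single day feels more dangerous than the day before; the exposure simply accumulates.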

The Three Pillars of Routine Risk

To understand how a team of brilliant professionals can slowly walk themselves off a cliff, we have to look at the social and psychological pillars that support the normalization of deviance. The first pillar is the pressure to perform. In any competitive field, there is a constant demand for results. When a rule stands in the way of a deadline, the rule becomes an obstacle to be "managed." If leadership prioritizes the schedule above all else, the staff quickly learns that following every safety protocol is a good way to get a reputation for being slow or difficult.

The second pillar is the "incubation period" of silence. Every complex system has a built-in safety margin, a "buffer" designed by engineers to account for the unexpected. When someone violates a rule, they are usually cutting into that safety margin without knowing it. Because the system was designed with that extra cushion, it doesn't fail immediately. This silence mimics safety. It creates a false sense of security that reinforces the rule-breaking. The team begins to believe that the original safety margins were "over-engineered" or "unrealistic," and they start to pride themselves on their ability to operate in the "real world" where things are messy.
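To see why the silence can last for years, here is a toy simulation (all numbers invented for illustration): a component certified to survive 100 units of load, originally operated at 50 to preserve a 2x margin. Each year, one more "harmless" shortcut sticks and the load creeps up by 15%.

```python
# Toy illustration of margin erosion, with invented numbers.
design_limit = 100.0        # load the component is actually built to survive
operating_load = 50.0       # the original rule: run at half the limit (2x margin)

for year in range(1, 6):
    operating_load *= 1.15  # another "harmless" 15% shortcut becomes routine
    if operating_load < design_limit:
        print(f"year {year}: load {operating_load:5.1f} -> no failure, feels safe")
    else:
        print(f"year {year}: load {operating_load:5.1f} -> margin exhausted, failure")
```

From the inside, the first four years are indistinguishable from the fifth; the only thing that changed was the invisible cushion.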

The third pillar is the social reinforcement of the group. If a new employee joins a team and sees everyone skipping Step 4 of a procedure, they will likely do the same to fit in. If they try to point out the violation, they might be told, "Oh, we don't do that here; that's just for the auditors." Before long, the shortcut isn't just a behavior; it is a cultural badge of belonging. This makes it incredibly difficult to stop, because challenging the practice feels like challenging the competence and identity of your colleagues. You aren’t just arguing about a bolt or a checklist; you are suggesting that your friends are being reckless.

Comparing Intentional Safety and Normalization

It is helpful to look at how a healthy organization differs from one that has fallen into the trap of normalized deviance. The differences are often subtle, appearing in the way failures are discussed and how "success" is measured.

| Feature | Healthy Safety Culture | Culture of Normalized Deviance |
| --- | --- | --- |
| Response to Near Misses | Treated as a "free lesson" and investigated deeply. | Ignored or celebrated as a sign of system strength. |
| View of Rules | Rules are vital protections for the "worst-case scenario." | Rules are seen as red tape or bureaucratic hurdles. |
| Communication | Speaking up is encouraged and rewarded. | Speaking up is seen as "not being a team player." |
| Success Metric | Success is defined by the quality of the process. | Success is defined only by the final outcome. |
| View of the Future | Persistent worry that something might go wrong tomorrow. | Confidence that since it worked today, it will work tomorrow. |

This table highlights the core of the problem: a shift in mindset from "how can we be sure this is safe?" to "can you prove this is unsafe?" In a healthy system, the burden of proof is on the person who wants to take a risk. In a compromised system, the burden of proof shifts to the person who wants to be cautious. If you can't prove that the part will definitely fail today, then the project proceeds. This is a complete reversal of the engineering mindset that built the system in the first place.

Recognizing the Red Flags in Your Own Work

While we often use space shuttles or oil rigs as examples, this phenomenon happens in small ways in every office and home. Do you drive while looking at your phone because you've done it a hundred times and "never hit anyone"? That is the normalization of deviance. You are operating outside the safe design of the task, and your lack of a crash isn't proof of your skill; it is just proof that you haven't run out of luck yet. Identifying these patterns in ourselves requires a high degree of self-awareness and a willingness to be "the annoying person" who insists on following the standard.

One of the best ways to spot this drift is to look for "work-arounds." A work-around is a temporary fix for a problem that never gets a permanent solution. If you find yourself saying, "This button doesn't work, so you just have to jiggle this wire three times," you are looking at a deviation that has become normalized. Over time, that jiggled wire becomes part of the training for new employees, and nobody remembers that the button was supposed to work on its own. These work-arounds are the breadcrumbs that lead back to a system's true state of health.
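Software teams accumulate these in plain sight. The sketch below is hypothetical (the `db.save` interface is invented for illustration), but the pattern will look familiar: a "temporary" retry loop, added years ago for a bug nobody fixed, that new hires are now taught as the normal way to save a record.

```python
import time

def save_record(record, db):
    """Persist a record -- via a work-around that quietly became the standard.

    The retry loop below is the software version of "jiggle the wire three
    times." It was added as a temporary fix for a flaky driver, the permanent
    fix never happened, and the work-around is now part of onboarding.
    """
    for attempt in range(3):
        try:
            db.save(record)
            return
        except ConnectionError:
            time.sleep(1)  # TODO (years old): remove once the driver bug is fixed
    raise RuntimeError("save failed even after three jiggles; the real bug is still there")
```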

Another major red flag is the "standardization of the exception." This happens when a team decides that a one-time emergency shortcut worked so well that they should just do it every time there’s a minor rush. When you stop asking permission to bypass a rule and start treating the bypass as the default, you have crossed a dangerous threshold. To combat this, organizations must foster a culture where people feel safe flagging these slips without fear of being blamed for "slowing things down." In fact, the most resilient systems are those that reward people for stopping the line when something doesn't feel right.

Building a Culture of Chronic Unease

The antidote to normalized deviance is something safety experts call "chronic unease." It sounds unpleasant, but in a high-stakes environment, it is a virtue. Chronic unease is the healthy, persistent suspicion that no matter how well things went yesterday, there is a hidden flaw waiting to be found today. It involves constantly questioning the "standard" ways of doing things and refusing to become complacent just because the outcome was positive. It means checking the harness and the net every single time, even after you've walked the wire a thousand times without falling.

Leading with this mindset requires a shift from "Outcome-Based Thinking" to "Process-Based Thinking." If you follow a bad process and get a good result, you should be worried, not happy. In a high-integrity system, a "lucky" win is treated with the same scrutiny as a failure, because luck is not a sustainable strategy. By focusing on the process rather than just the result, we can catch the tiny shifts in behavior before they grow into a catastrophe. We must learn to respect the rules not because we are boring, but because those rules were written using the hard-earned lessons of those who came before us.

True excellence is not found in the glamorous moments of a successful launch, but in the quiet, repetitive, and often "boring" work of maintaining a standard when nobody is watching. It is the courage to say "stop" when everyone else is pushing for "go," and the wisdom to know that "getting away with it" is the most dangerous form of feedback you can receive. By staying vigilant against the slow creep of normalized deviance, we protect our projects, our systems, and the lives of everyone who relies on them. Be the person who respects the margin, and you will find that a reliable system is worth a thousand lucky breaks.
