By Alaska White, CAA Chief Advisor, Human Factors

This is about the risk you take to ‘get the job done’ – you know, like NASA did with the space shuttle Challenger in 1986.

In psychology circles, it’s called the ‘normalisation of deviance’. In everyday language, it means gradually accepting, as ops normal, practices that no longer match genuine safety standards.

The disintegration of Challenger, 73 seconds into its January 1986 flight, is possibly the most high-profile example of this.

The subsequent investigation revealed that erosion of rubber seals – the now-infamous ‘O-rings’ – in a solid rocket booster joint allowed hot gas to escape and ignite the shuttle’s external fuel tank.

But the root cause went deeper: NASA and its engineers had known about the O-ring erosion for years. They had assessed the O-rings as ‘criticality 1’ components – ones whose failure would result in the loss of the shuttle and/or lives.

But numerous successful launches, despite the potential for disaster, had led to the issue being tolerated, then accepted, then normalised.

We’ve all done it

Challenger clears the tower at the Kennedy Space Center in Florida, 28 January 1986 | Photo courtesy of NASA

We’ve all strayed from safe practices at some point. We’ve stood on a wobbly ladder or a chair to reach something, skipped the seatbelt on a quick car trip, or jaywalked instead of using a nearby crossing.

When this behaviour fails to lead to a bad result, we relax a little about doing it properly and, slowly, the unsafe practices become accepted into normal practice and culture.

Deviating from safe practices creeps in slowly and the result is that we become desensitised to it, and it no longer feels wrong. It establishes a new normal and a false sense of what is ‘safe’.

Following the Challenger disaster, one of the managers responsible for the operations of the solid rocket boosters said:

“Since the risk of O-ring erosion was accepted and indeed expected, it was no longer considered an anomaly to be resolved before the next flight… the conclusion was, there was no significant difference in risk from previous launches. We’d be taking essentially the same risk on January 28 that we have been ever since we first saw O-ring erosion.”

Without external intervention – such as audits, changes in procedures, or staff or contractors speaking up – the cycle of deviance continues, and is disrupted only when other factors line up and result in something bad: the holes in the Swiss cheese model aligning.

Intentional and unintentional

So-called ‘deviant’ actions can be intentional or unintentional, and those engaging in them often feel their deviance is justified (as the manager quoted earlier demonstrated), or they’re largely unaware of their deviations.

But either way, their ability to accurately identify and understand the associated risks and potential outcomes is compromised. In NASA’s case, no one meant to cause harm, and people genuinely believed nothing was wrong. All those successful flights…

When people in aviation – or in any high-risk industry – believe that what they think is a minor departure from defined procedures is acceptable, they’ve often started down a road that could easily lead to something terrible. The additional danger here is that it opens the door for further deviations.

We know that drifting from safe standards creeps in slowly, but what encourages and reinforces it?

Pressure and rewards

When faced with a situation that puts you under pressure – financial pressure, client expectations, or time constraints, for example – it’s all too tempting to relax standards or bend the rules to ‘get the job done’.

We’re naturally vulnerable to succumbing to, and justifying, those shortcuts under pressure – all the more so when it’s rewarded.

Rewarding behaviour – whether it’s good or bad – powerfully reinforces that behaviour. A financial reward, or even just praise for a ‘job well done’, triggers a release of dopamine, a ‘feel-good’ chemical in the brain. We like that feeling, so we seek it out, and the behaviour continues.

Organisational culture and conformity

A Boeing 737 MAX takes to the skies | Photo: iStock.com/Ryan J Lane

In any group, we’re always looking for clues to how we should behave. We believe that others are more knowledgeable or experienced than we are, so we follow their lead. We’re afraid of repercussions if we don’t go along with the group – we don’t want to look foolish or feel like an outsider, feel different, or be excluded.

Beliefs that it’s better to ‘go along to get along’, that ‘it’s not my job’, or that there’s no choice but to conform to ‘how things are done around here’ are all ways we justify our decisions, or convince ourselves that something we know is not okay, is okay.

If we don’t believe what we’re doing is wrong, will we report it or challenge it? That’s very unlikely – and it means nothing is likely to change until a catastrophic event unfolds.

Space shuttle Columbia

Did NASA learn its lesson from 1986 and improve the organisation’s culture? Apparently not: in 2003, the space shuttle Columbia disintegrated over Texas as it re-entered the earth’s atmosphere. As with Challenger, a known technical issue led to the shuttle’s destruction. Many shuttle flights had returned successfully despite bits of foam breaking off and damaging thermal insulating tiles.

The risk was therefore downgraded and normalised to the point where it was considered a ‘maintenance issue’. Then, during Columbia’s January 2003 launch, a piece of foam debris hit the shuttle’s wing, punching a hole that proved fatal on re-entry the following month.

Costa Concordia

In 2012, the giant cruise vessel Costa Concordia ran aground off the coast of Italy in relatively shallow water. The captain, and many captains before him, had intentionally and continually deviated from standard operating procedures (SOPs) for years.

Those deviations had gone without consequence, which reinforced the belief that they were safe. The ensuing shipwreck killed 32 passengers and crew, and seriously injured many more.

Boeing 737 MAX

The Boeing 737 MAX airliner was introduced into commercial air travel in 2017. Soon after, two of them crashed, killing a total of 346 people. In March 2019, all 737 MAX aircraft were grounded by the FAA.

A US Congress report on the crashes concluded that Boeing, under pressure to produce a rival to the Airbus A320neo, had dismissed safety and performance concerns in order to meet production schedules and goals.

After changes by Boeing, the CAA assessed the aircraft in 2021 as meeting required safety standards, allowing it to fly in New Zealand airspace.

Memo to CEOs

"The standard you walk past is the standard you accept."
(Lieutenant General David Morrison, Chief of the Australian Army, 2011–15)

Normalisation of deviance is easier to prevent than it is to correct. Set the standard you refuse to walk past, and model safe behaviour yourself. Make sure your senior people do that too.

Create and maintain a culture where ‘all the team’ wins, not just one individual.

Silence is powerful, and it enables deviance. So create an atmosphere where staff feel okay talking about their mistakes and concerns, and where they can trust each other to give good, constructive feedback. Encourage differing views and opinions. Encourage incident reporting. Maintain a routine of briefings and debriefings. Encourage robust discussion.

Reward safe behaviour – remember, we like dopamine’s feel-good effect – not just ‘job done’ results that may have pushed safety boundaries.

Know your company – pay attention to all aspects of your operation. Be active about this, because what you think is happening, from the comfort of your office chair, might be vastly different to what is actually happening on the ground.

Regularly look at, and be prepared to revise, your procedures. An exposition isn’t to be left on a shelf gathering dust, or buried in a computer file, never to be referred to again. You won’t know you’re deviating from safe practice if you don’t know what your procedures say in the first place.

Stop, and think ahead – ask yourself ‘What if?’

And a key lesson from the Challenger disaster – don’t use past successes that were achieved through deviant actions to redefine acceptable performance and safety standards.

Finally, remember the observation of American physicist Richard Feynman: “When playing Russian roulette, the fact that the first shot got off safely is little comfort for the next.”

