Outside of war, spaceflight is one of the most dangerous human endeavours. At take-off, astronauts sit atop rockets loaded with hundreds of tonnes of highly flammable fuel; once outside Earth’s atmosphere, pressurised suits and oxygen are essential for survival; and, upon re-entry, heat shields provide protection from temperatures exceeding 1650 °C.

At any time, anything can go wrong, often with fatal consequences.

For instance, 14 astronauts tragically died when NASA lost two of its five space shuttles. Challenger exploded soon after take-off in 1986, and Columbia disintegrated during re-entry in 2003. 

However, when the shuttle Atlantis landed safely at Florida’s Kennedy Space Center on 21 July 2011, it marked the end of arguably one of the most successful spaceflight programs ever.

Between 1981 and 2011, the space shuttle flew 135 times. Despite the tragedies of Challenger and Columbia, the shuttle’s 133 successful missions represent a 98.5% success rate. In one of the highest-risk pursuits in human history, that statistic is remarkable.

Behind the shuttle program’s success lies a culture that didn’t walk away from its problems. After each shuttle disaster, there was considerable discussion about the technical reasons for mission failure. But the real learning for NASA came in understanding that behind even the tiniest glitch lay human error. Acknowledging that a simple miscommunication can escalate from a short-term oversight into downstream mission failure, NASA’s Chief Knowledge Officer, Ed Hoffman, says, ‘If you don’t talk about mistakes, you get failures.’

Just like at NASA, mistakes happen in every organisation. No one wants to get anything wrong, but add the pressures of deadlines, limited resources, and competing priorities, and something will always go amiss. However, rather than humbly admitting our faults, pride often drives us to cover up even the slightest failing.

The critical question is, do we value our mistakes? 

Learning from what has gone wrong in the past is at the heart of achieving incremental improvement. That’s why, when it comes to our decision-making processes, we need to treat past errors as essential inputs, just as necessary as costs, resources, and time. That way, we can leverage our strengths to improve our identified weaknesses.

Not only does this approach result in better outcomes, but in team decision-making processes, it enables us to reflect on those aspects that work well and, more importantly, to enjoy our successes and how far we’ve come.

Today, NASA talks about how mistakes can support mission success. In fact, the organisation uses a sliding scale of terminology:

  • Mistakes – these always happen but must be communicated
  • Mishaps – these are prevented if mistakes are reported early; they don’t affect the outcome of the mission, but they’re an example of something not going to plan
  • Failures – something isn’t working, and the mission outcome is affected*

If an organisation behind one of the most dangerous pursuits on the planet can embrace mistakes and admit they always happen, then surely, we can as well.

*Defining a Learning Culture: How NASA Shares Knowledge to Learn, The Leaders in Performance Podcast, 2016