Matthew Syed – Black Box Thinking


Nobody wants to fail. But in highly complex organizations, success can happen only when we confront our mistakes, learn from our own version of a black box, and create a climate where it’s safe to fail.

We all have to endure failure from time to time, whether it’s underperforming at a job interview, flunking an exam, or losing a pickup basketball game. But for people working in safety-critical industries, getting it wrong can have deadly consequences. Consider the shocking fact that preventable medical error is the third-biggest killer in the United States, causing more than 400,000 deaths every year. More people die from mistakes made by doctors and hospitals than from traffic accidents. And most of those mistakes are never made public because of malpractice settlements with nondisclosure clauses.

For a dramatically different approach to failure, look at aviation. Every passenger aircraft in the world is equipped with an almost indestructible black box. Whenever there’s any sort of mishap, major or minor, the box is opened, the data is analyzed, and experts figure out exactly what went wrong. Then the facts are published and procedures are changed, so the same mistakes won’t happen again. By applying this method in recent decades, the industry has created an astonishingly good safety record.

Few of us put lives at risk in our daily work, as surgeons and pilots do, but we all have a strong interest in avoiding predictable and preventable errors. So why don’t we all embrace the aviation approach to failure rather than the health-care approach? As Matthew Syed shows in this eye-opening audiobook, the answer is rooted in human psychology and organizational culture.

Syed argues that the most important determinant of success in any field is an acknowledgment of failure and a willingness to engage with it….

Author: Matthew Syed
Narrator: Simon Slater
Duration: 12 hours 14 minutes
Released: 15 Mar 2011
Publisher: Penguin Audio
Language: English

User Review:


This book is all about failure. It's about the fact that we hide and stigmatise failure when we should be embracing it and using it to continuously improve all our enterprises by subjecting them to trial and error.

He gives many excellent, moving and gripping examples of contexts where this approach was lacking and resulted in dire consequences. In the medical profession, senior doctors have very high status and self-esteem, and they don't like to admit their errors. They use euphemisms such as "a complication" or "an adverse event". The author argues that this lack of openness about error deprives us of the opportunity to analyse what went wrong and use that information to continuously improve our systems. He gives a graphic example of a woman who needlessly dies because a group of doctors find it difficult to pass a breathing tube during a routine operation. They become fixated on this task and lose track of time, when they could have performed an emergency tracheotomy, a relatively straightforward lifesaving procedure. The nurse was there, ready with the tracheotomy kit, but she only hinted instead of speaking up forcefully, because of the steep authority gradient between her and the doctors.

A second example is criminal law. Since the advent of DNA testing, it has become apparent that our jails hold many innocent people who were wrongly convicted. But the legal system has been slow to admit its errors and to introduce processes to fix this. Again, high-status people, such as investigators and prosecutors, are reluctant to admit that they are error-prone.

One industry that seems to get this right is aviation. All errors are investigated thoroughly, and recommendations are made to change practice. For example, many crashes have resulted when junior members of a crew wouldn't speak up to alert the captain to a danger, because the captain was the commander and speaking up could have invited a severe rebuke. So the aviation industry changed the culture to a teamwork approach and encouraged all crew members to speak up. This has been a great success, and lessons from it have now been adopted in many medical settings.

In the field of sociology, there was an initiative called Scared Straight, designed to put potential delinquents off serious crime by sending them into a prison for three hours to spend time with hardened criminals. It appeared to work, and it was subsequently adopted worldwide. But nobody actually tested whether it really did work, beyond sending out some questionnaires. Once it was subjected to rigorous scientific testing using a randomised controlled trial, it was shown that the intervention actually increased criminality in the subjects by about 25%.

The point is, you don't know whether something is going to succeed or fail unless you test it. You can't predict whether something will work purely by intuition or because it seems logical; the world is just too complex and there are too many unknown variables. So you should test your idea, then change it and test it again, and so on. This process works the same way that natural selection works in evolution. The entrepreneur who invented the very successful Dyson vacuum cleaner made over 5,000 prototypes before arriving at an excellent product. He wasn't afraid of failure; he harnessed it as a tool to drive continuous improvement.

As you have probably guessed if you've read this far, I enjoyed this book. It's interesting, and as well as giving an insight into how major institutions and industries could be improved if they embraced failure, it shares some ideas that we can all apply in our own lives.