
Turn on the black box

21 January 19

The Word of Gold: for professionals, sorry seems to be the hardest word. And the consequences?

by Stephen Gold

An error doesn’t become a mistake until you refuse to correct it. (Orlando Battista)  

A couple of months ago in this column I discussed the parallels between practising medicine and practising law, suggesting that we lawyers could learn a lot from our doctor friends. I’ve just been re-reading Matthew Syed’s terrific book, Black Box Thinking, which draws the same conclusion, but not in a good way. He acknowledges that the medical profession is full of fine, committed people, dedicated to their patients, but he shines a harsh light on its attitude to mistakes. 

The title Black Box Thinking derives from the black boxes recovered after aircraft accidents. Aviation’s culture is to extract every last piece of data, come to rigorous conclusions, and make them public, irrespective of how damaging the evidence may be for organisations or individuals. Lessons are continually learned, practices improved and as a result, the incidence of death or injury from plane crashes is vanishingly low: currently, for members of IATA, the jet accident rate is one per 8.3 million flights. Contrast this with the findings of the National Audit Office that in 2016, the latest data available, 24% of the 597,206 deaths that year were from “causes considered avoidable in the presence of timely and effective healthcare or public health interventions”. Not all of these were due directly to medical error, but the best estimate is that around 35,000 were. 

Facing the truth

Why the difference? Medicine is more complex than aviation. A vital part of it is using pioneering techniques which are inherently high-risk. But Syed is clear that by far the most important reason is that in medicine there is little evidence of a “black box” culture. On the contrary, a typical impulse is to put mistakes or misfortune down to “complications”, and move on. Forensic examination and publicity may be too damaging for doctors’ self-esteem and reputations. At the root of this behaviour is cognitive dissonance, the phenomenon named by psychologist Leon Festinger in 1957, whereby, instead of facing up to facts which threaten our beliefs, or challenge our behaviour, we find ways of justifying or explaining them, at least to ourselves.

Lawyers don’t kill people too often (however much they may fantasise about certain clients), but we have a similar attitude to error. Syed cites numerous examples of how prosecutors faced with irrefutable evidence that they had put innocent people in jail, time and again came up with ludicrous theories to justify that they had been right all along. All of them were passionate about doing justice. The idea they had been unjust was completely unbearable. 

In less dramatic ways, having the bad luck to be human, lawyers make mistakes all the time. Looking back at cases, deals and transactions, it is clear if we face up to it where we might have done better. Leaders may invest major resources and their personal reputation in strategies which don’t work, and precisely because their investment is so big, persevere long after they should have called time. As Ogden Nash put it:

An optimist fell ten storeys,
And at each window bar,
He called to the waiting crowd below,
“All right so far!”

Safety alert

It is easy to trot out clichés about having a no-blame culture. The truth is that changing mindset is really tough. Oceans of blood and sweat have gone into acquiring skills, advancing our careers and shaping our self-image. We may be far from convinced that if we admit error, colleagues, bosses and clients will be understanding. To forgive may be divine, but they may not. 

Yet there is no alternative to leaders having the courage to tackle this head on, making it explicitly clear that no one is immune from error, including them, and that bringing it to light is a professional and moral obligation for all. People will be supported. Lessons will be learned for everyone’s benefit. But cover-ups and cognitive dissonance have no place. 

Clear leadership and making reporting easy can have dramatic effects. Syed cites the example of Virginia Mason hospital in Seattle, which introduced “patient safety alerts”, to be issued by staff whenever they see a need. As a result, it is now one of the safest hospitals in the world. But here’s the thing: the initial take-up of PSAs was low. Staff at every level were reluctant reporters. It took the catalyst of a truly dreadful patient death for this reform to be widely accepted.

Human fallibility can never be eradicated, but the reward for being in the right place on this issue is a culture of high achievement, steadily improving performance and resilience. It’s our best defence, as we take off into another turbulent year.

Stephen Gold was the founder and senior partner of Golds, a multi-award winning law firm which grew from a sole practice to become a UK leader in its sectors. He is now a trusted adviser to leading firms nationwide and internationally. t: 07968 484232; w: www.stephengold.co.uk; twitter: @thewordofgold
