Last week I met James Titcombe, National Advisor on Patient Safety and Culture at the Care Quality Commission (CQC). James is perhaps better known as the father of Joshua, who died at University Hospitals of Morecambe Bay.
James is determination and persistence personified. His background is in the nuclear industry, but he’s recently been appointed to the CQC, a brave move by the organisation – and an excellent one.
Like me, James is well aware of the role human factors plays in safety critical work; he’s also only too aware of the pain of loss. In his case the pain was made worse by an investigation that achieved nothing, an insult to all involved – it actually made things worse for the organisation, the people involved, the patients and their relatives.
After meeting James I drove home. The classic John Denver song, Country Roads, came on the radio. As I listened I was suddenly and sadly reminded that Denver himself died as a result of human factors. As well as a brilliant singer-songwriter, he was also a passionate aviator and owned several aeroplanes. He bought another one shortly before his death, and during an early flight in this unfamiliar and slightly unusually designed plane, something went wrong and he crashed into Monterey Bay.
As you would expect in a safety critical industry, a team of trained and skilled investigators (trained and skilled in investigative techniques that is) descended. Their report lists a root cause (what they call the ‘Probable Cause’): diverted attention, which resulted in loss of control (read a brief version of the report or the even briefer press release).
Imagine that was in healthcare. We have the root cause: the person made a mistake, their attention was diverted and they lost control of the patient’s condition (for those of you who know my late wife’s case, this sounds familiar).
So what do we learn, what do we write in the action plan? Pay attention? Don’t get distracted? Or should we just get rid of the person who did this so it doesn’t happen again?
In Denver’s case the investigators continued beyond the root cause into the other ‘causal factors’, as you’d expect experienced investigators to do. There were issues around the design and layout of controls, the regulations and laws governing such designs, the training that pilots in Denver’s situation received…so it’s perhaps no surprise that the recommendations (the equivalent of an NHS action plan) are much more specific and useful.
They set out which regulations to change, who will change them and how they’ll be monitored to ensure safer, easier design layouts. But there are also recommendations involving other bodies and ways to gain leverage. In this particular case they included working with insurers and industry lead bodies to promote training.
When you investigate and only identify a root cause, you’re missing the real potential learning. You must work with trained investigators who can look at all causal factors. And you must want to do it.
James Titcombe has also written a blog about our meeting but he starts by talking about the complaint process: ‘In fact, the term “complaints” itself I don’t think is particularly helpful, rather we should talk about how organisations encourage and learn from the feedback of people who use their services.’
In other words, an investigation should not happen because the patient wants it or has complained. It should happen because people believe it’s the right thing to do.
I’d like to finish with two pieces of good practice. One is a 27-minute video by the ever brilliant Dr Tim Cook. Please take the time to watch it. It’s an attempt at openness and transparency about an awful incident, but one where they looked at all the causal factors to allow learning to take place. It’s also a lesson for us all in professionalism, learning and compassion.
The second is from an NHS hospital ‘up north’. A patient received an injection of the wrong fluid, which some days later led to the loss of a limb. And in this case, as well as a more careful root cause analysis, the trust actually simulated the scenario.
When they recreated preparation of the equipment tray with, just like the real situation, an accidental spillage and a rush to get things done for a senior colleague, they discovered that under the normal lighting conditions the pink coloured liquid was hard to discern in its temporary container, a blue pot called a ‘gallipot’. With both fluids in identical unmarked containers on the tray, it was a perfect set of holes in the Swiss cheese. There are multiple causal issues here and each one has brought about a change in practice. And even better, the trust has committed to making a video of the story to help us all learn. I’ll let you know when it’s complete.
Since my late wife died I’ve seen investigations into serious incidents become an industry. The root cause is identified, the action plan ends the process and we can breathe easy.
No, we can’t.
Martin is a pilot and the founder and current Chair of the Clinical Human Factors Group, www.twitter.com/MartinBromiley