BBC Horizon & Kevin Fong Explore Human Factors in Healthcare

Although titled “How to Avoid Mistakes in Surgery”, which may lead some medical viewers to dismiss it as unrelated to their specialty, BBC Horizon last week aired a fascinating episode hosted by Dr Kevin Fong, a well-known Anaesthetics/ICU Consultant turned TV presenter from England whose credits include shows such as Extreme A&E. In this episode Kevin cuts to the heart of some of the issues we are discussing on RRM, and I’d urge you all to watch it.

Not sure how long this YouTube link will last, and please note, this show is only (legally) available via the BBC iPlayer on the BBC website, which restricts access to UK viewers only. To get around this, you may need to use a VPN service such as Unblock-us, or settings from one of the free VPN providers such as topukvpn, getukvpn or bestukvpn. NB: We don’t condone the use of these services or accept any liability for problems with your computer should you choose to use them!

In case the YouTube link craps itself, gets taken down, or you don’t have time to watch it – here’s my point-form synopsis of the show:

Kevin utilises the now famous Elaine Bromiley case as a striking example of loss of situational awareness. Staff were so focused on establishing an oral airway that they overlooked other options. If you’re unaware of this case, I suggest you watch the following video:

In a nutshell: We are wired up to fail, and we have a finite ability to cope with complex information.

It has nothing to do with intelligence; it’s about accepting our limitations and designing strategies that allow us to cope.

Kevin then used examples from other high-performance industries to demonstrate the fallibility of the human mind under pressure.

Fire Crews
Firefighters face the constant challenge of balancing search & rescue activities against monitoring the ever-present threat of a rapidly changing environment.

It is very hard to focus on lifesaving rescue tasks whilst maintaining awareness of the situation around you.

Sounds easy in theory; in real life it’s extremely difficult.

Aviation
The utility of challenge-response checklists was demonstrated in a commercial aircraft simulator. (If only we could get that sort of training in medicine… Sigh).

Rationale for use of checklists in aviation:
Even smart people need reminders!
The human brain is frail & we all have bad days!!

He then interviewed Atul Gawande – Author of The Checklist Manifesto: How to Get Things Right

Atul believes that knowledge is often not the problem; it’s the execution (or lack thereof) that causes problems in surgery, and checklists can help with the execution of essential tasks.

Use of the WHO Surgical Safety Checklist, which he helped develop, led to an 18–47% reduction in surgical complications across first-world, third-world and even military settings. Its utility in surgical settings is now beyond question.

Teamwork & Handover
Alan Goldman (cardiac ICU consultant at the world-famous Great Ormond Street Children’s Hospital) instituted a three-phase process for the transfer of paeds cardiac surgery patients from theatre to ICU. This meant tasks were allocated, checklists completed and practice became standardised, leading to fewer errors.

To develop this handover process, F1 pit crews were analysed. In Formula 1 racing, pit crews have the following characteristics:
Each individual has a very specific, small task (Task allocation)
Leadership
Checklists
Situational awareness
Contingencies with a definitive plan

This left me wondering whether this sort of standardisation can ever really work in the ED, with the constant variation in staffing, environment, patients and acuity that we experience. I think not.

Simulation
Some of the benefits of simulation were demonstrated, with footage of a real anaesthetic crisis being run in sim mode, and an unfortunate Registrar bearing the brunt of a very difficult case and surgeon!

But what about unpredictable, sudden emergencies?
Well, it’s CRM to the rescue, of course!
Captain Chesley “Sully” Sullenberger, the pilot of Flight 1549 (which famously landed in the Hudson River after a bird strike) and now an advocate of CRM in medicine, was interviewed:

“Over many decades, thousands of people in aviation have worked very hard to create a robust, resilient safety system in which we operate, which formed the firm foundation on which we could innovate and improvise.”

He highlighted some of the characteristics of CRM:
Shared sense of responsibility
Flattened hierarchy
Open channels of communication

“We have teams trained in the consistent application of best practice, with well learned, well defined roles & responsibilities to each other & to the passengers”.

“We took what we did know & applied it to a new situation to solve the problem”

Kevin’s response to this: “Standardise until you absolutely have to improvise”

Error Correction
He then travelled to the US to meet a Professor of Psychophysiology, Jason Moser, an expert in error processing in the brain. This is where the show got interesting for me:

He showed how the more positive your attitude to error, the shorter the time before you realise and correct it. Interestingly, 100% of the time when your brain is in this positive, error-correcting mode, you will not make a mistake with the next decision.

In a crisis, being positive about errors is essential. If you have a negative response or attitude to error (for example, you’re afraid of being criticised, reprimanded or even sued, which is a very common perception of error in medicine), not only do you take longer to correct your errors, you make more of them.

Learning from mistakes is something that runs deep in the DNA of the airline industry. Every pilot is brought up with a POSITIVE attitude to errors: error is not only accepted, it is expected. He describes this as a “search for progress”, which stands in stark contrast to the medical industry, with its ingrained culture of blame.

Human error is always going to be with us, but it’s how we deal with it that’s important.

Conclusion
Overall I think this was a great show that highlighted many of the problems in the medical world, and the underlying psychology behind them. The final commentary about attitudes to error and safety cultures was, to me, the most useful part of the show, and if anything it highlights the gaping chasm between medicine and aviation from the CRM/RRM perspective.

I find it frustrating that Captain Sullenberger would reiterate exactly what I’ve been on about, that “over many decades, thousands of people in aviation have worked very hard to create a robust, resilient safety system in which we operate”, when in medicine, over many decades, thousands of lawyers and hospital administrators have sought to hang individual doctors out to dry, to sue them personally for millions of dollars, and to create a paranoid culture of individual responsibility and blame in which any error is seen as a grave individual mistake that should be punished. This flies in the face of the aviation model. On top of this we have a work environment (including such simple things as poor rostering, hospital overcrowding and constant interruption), especially in Emergency Medicine, that not only exposes us to error but guarantees it.

“Safety”, whilst now being recognised as important, is not a concept I’ve encountered in any formal sense, in any training or any “culture”, in any public hospital I’ve worked at in nearly 15 years as a doctor. We do not have “teams trained in the consistent application of best practice, with well learned, well defined roles & responsibilities”, and I wonder if we ever really can. A noble ideal for sure, but all the checklists, CRM and simulation in the world isn’t going to work until we change the culture.

What do you think? Can we change the culture of blame in medicine and become more like aviation?