Safety in healthcare
My attention was caught by this programme on Radio 4 last week. Speaking from the control tower at London City airport, Claudia Hammond looked at how lessons learned from the commercial airline industry over the past 20 to 30 years are being used to improve safety in healthcare.
The programme addressed the relationship between human behaviour and safety – or the psychology of error. These apparently different industries share some strikingly similar human risk factors: steep gradients of authority, information overload and distraction leading to loss of situational awareness. Stories illustrate their influence in practice.
Tenerife airport, 1977. Overruling the concerns of his crew, the captain of a KLM flight took off in fog, crashing into another aircraft and leaving 583 people dead. The captain was described as having ‘godlike’ status: a co-pilot would sooner die than challenge him.
Kegworth, 1989. A fire broke out in one of the engines. The captain announced that he had shut down the right-hand engine, even as crew and passengers could see the fire burning on the left-hand side. No one questioned him. The cabin crew were specifically trained not to interrupt, reinforcing the presumption that the captain knows what he or she is doing – a presumption that many patients make of their doctor.
Years of training in the aviation industry have embedded a safety culture in which speaking up and reporting concerns are celebrated. There is an understanding that ‘to err is human’ and that the responsibility is to learn and mitigate future risk. Voice recordings from the cockpit reveal the nature of teamwork, leadership and decision making.
Rhona Flin is Professor of Applied Psychology at the University of Aberdeen. She leads a team of psychologists conducting research on human performance in high-risk industries including healthcare. She describes another effect of power gradients. There is compelling evidence that witnessing or experiencing bad behaviour can seriously impair our judgement and performance.
Matt Lindley, a former airline pilot who now applies his experience to the NHS, identifies similarities in roles across these two industries: the captain is the consultant; the co-pilot is the junior doctor; the cabin crew are frontline staff and ‘the eyes, ears and nose’ of an organisation. Any one of us can lose ‘situational awareness’ when exposed to sufficient levels of stress; medical and surgical emergencies are prime examples of when we need to listen and our teams need to speak up.
But we are still not good enough at this. A junior doctor tried to raise concerns about wrong-site surgery in the operating theatre. Sadly for the patient, she was ignored until it was too late. There are some trigger words that should always be heeded: think ‘concerned’, ‘unsafe’ and ‘stop’. What is needed is a psychology of safety that starts at both the top and the bottom, meeting in the middle, to create an open and just culture that puts patient safety at the heart of healthcare.
And for those of you who are sceptical about the evidence for this approach, remember the two Red Devils skydivers whose parachutes became entangled during a stunt a couple of days ago? They attributed their skilled manoeuvres and breathtaking survival to years of training and teamwork.