In A Real-World Lesson of How Information-Sharing Saves Lives, I talked about how programs and cultures of honesty and transparency in healthcare not only reduce medical-legal issues and lawsuits but can truly save lives. However, changing the culture of a healthcare organization is only half the battle.
The traditional response to a medical mistake or “adverse event” has been to identify and blame the providers who had the last contact with the patient before the error occurred. It’s human nature to point fingers, and it is far more satisfying to know that there was a person behind the problem.
The results of this “blame game” are usually calls for more vigilance, better training, and professional reprimands or firings. Administrators are often so focused on “getting rid of the bad apple” that the root cause of the mistake is never determined. Worse, the same mistake remains quite likely to occur again, and eliminating the bad apple contributes no long-term value or system improvement. Unfortunately, healthcare is not a good role model for “learning from our mistakes”; in how it operates, it often has a short-term memory.
The landmark 1999 Institute of Medicine report To Err Is Human estimates that up to 98,000 people die each year in the U.S. from preventable medical errors. It is telling that the report was essentially titled “we make mistakes because we’re human.” No matter how many in-services and training sessions we attend, we are still human, and we are guaranteed to make a mistake eventually. At the same time, no checklist, protocol, guideline, or EMR system is perfect; these, too, are guaranteed to fail at one point or another.
So with so many disparate systems, imperfect care providers, and so much complexity, is it possible to create a truly safe healthcare system? Most people would say no, and you would likely have gotten the same response if you had asked pilots 50 years ago whether an aviation industry with fewer than one death per billion passenger-miles was possible. With nearly 30,000 flights per day in the U.S. alone, aviation has achieved six-sigma quality when it comes to safety in air travel.
But for decades, pilots acted much like the attending surgeon in an operating room. They were commanders rather than leaders, doing what they wanted, demeaning subordinates, and putting their egos first, sometimes leading to the deaths of hundreds of passengers. After the infamous 1977 crash at Tenerife in the Canary Islands, aviation decided it had to change its culture as well, and it learned that culture is just as important as good systems and protocols.
Leilani Schweitzer’s 21-month-old son was admitted to a university children’s hospital and placed in intensive care. With many monitoring leads attached to him, multiple alarms sounded each time he moved, which was not only distracting but also kept his mother from getting any sleep. The night nurse was kind enough to work through the monitor’s menus and shut off the alerts. The following morning, his mother woke to find her son cold and motionless. Despite attempts at resuscitation, he had passed away. An investigation soon discovered that the nurse had gone through at least seven screens on the monitor to shut off the alerts that would otherwise have immediately notified the staff that his heart had stopped beating. The manufacturer said it had not thought anyone would go through such a hassle and had not put a safeguard in place.
When hospital administrators arrived at her room, the mother expected a brick wall and few answers as to how this tragedy had unfolded. Contrary to her expectations, doctors and administrators explained what had gone wrong, gave a full apology, and listed the ways they would ensure this mistake never happened again. They also offered to involve her personally in the improvement process. The manufacturer did eventually change the design of the monitor so that critical alarms would still sound even when alerts had been silenced. Of note, the hospital stayed true to its policy of learning and system improvement and did not fire the nurse.
We are all human. We all make mistakes. But as in aviation, the consequences of mistakes in healthcare unfortunately involve people’s lives.
Even with a culture that advocates transparency, honesty, a flat hierarchy, and teamwork, hospitals are still not safe without a correctly designed system. Conversely, no amount of protocols, checklists, training, or functional systems can counteract the fact that people inherently make mistakes.
To be truly safe, a healthcare organization must have several layers of defense against unsound practices and errors. But each of these layers inherently has holes and weaknesses as well, much like slices of Swiss cheese, and it is the combination of all these layers that can lead to a true and acceptable level of patient safety.
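To see why stacking imperfect defenses works, here is a minimal illustrative sketch. The catch rates and the layer names are hypothetical numbers chosen for illustration, not figures from any study, and it assumes the layers fail independently of one another (real "holes" often line up, which is exactly the Swiss cheese model's warning):

```python
# Illustrative sketch of the Swiss cheese idea (hypothetical numbers).
# Each safety layer catches only part of the errors that reach it, yet
# stacking several imperfect layers makes it far less likely that an
# error slips through all of them.

def residual_risk(catch_rates):
    """Probability an error passes every layer, assuming independent layers."""
    risk = 1.0
    for rate in catch_rates:
        risk *= (1.0 - rate)  # fraction of errors that slip through this layer
    return risk

# Four imperfect layers, e.g. a speak-up culture, a checklist,
# a second-person double-check, and a monitoring alarm.
layers = [0.90, 0.80, 0.75, 0.95]
print(f"Chance an error evades all layers: {residual_risk(layers):.4%}")
# -> 0.0250%, even though each layer alone misses 5-25% of errors
```

The point of the sketch is simply that no single layer needs to be perfect; it is the combination that produces an acceptable level of safety, provided the weaknesses of the layers do not align.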
It is the combination of culture and properly designed system processes that catches the causes of patient harm, and this applies not just to healthcare but to all types of organizations. This combination of culture change and sound system processes is exactly what aviation has implemented, and hopefully it is a lesson that many of us in healthcare will learn from.
Additional Information:
- Insurance Information Institute. World Aviation Accidents.
- WHO Patient Safety Curriculum Guide. Topic 1: What is Patient Safety?
- Transparency, Compassion, and Truth in Medical Errors: Leilani Schweitzer at TEDx University of Nevada
- Medical Mistakes: Human Error or System Failure? by Bill Bornstein
- The Canadian Medical Protective Association. Systems Thinking.
Isaac Hernandez