Aviation and healthcare take significantly different approaches to communicating the outcomes of critical incidents, writes Todd Fraser
In 2002, two aircraft crashed into each other in otherwise empty skies over southern Germany. Both were destroyed, resulting in the deaths of 71 people.
Both aircraft were functioning completely normally, both crews were highly experienced, the weather was good for flying, and both aircraft were fitted with functioning communication and alert systems.
This was a phenomenal catastrophe. How the sequence of events that led to this crash could line up to create such a perfect storm is hard to fathom, but it did.
The ensuing investigation attributed blame to factors commonly seen in these incidents, just as they are in healthcare:
- Excessive workloads
- Violation of workplace regulations
- Faulty or disabled instrumentation systems
- A disordered command structure
- Dysfunctional communication
Humans are designed to make mistakes. We can, do, and always will make mistakes, no matter how hard we try. So we design systems to prevent, minimise or mitigate them.
Both aircraft carried the “Traffic Collision Avoidance System”, or TCAS. When the TCAS senses an approaching aircraft that is a potential threat, it automatically instructs the crew to take action to avoid a collision.
Yet they still crashed.
If the pilots of both planes had followed the instructions given by the TCAS, the accident would not have occurred. Unfortunately, the pilot of one of the planes was given conflicting instructions: the air traffic controller told him to descend, while the TCAS told him to climb. The pilot didn’t know which to follow, because there was no protocol in place for this scenario. The planes descended into one another, and everyone on board was killed.
In the final devastating chapter of this story, the air traffic controller who was on duty at the Swiss company Skyguide at the time of the accident was stabbed to death by a relative of one of the victims.
If you thought this horrible chain of events could not get any worse, it did.
This scenario – confusion caused by conflicting instructions from the TCAS and the air traffic controller, leading to near-miss incidents – had occurred five times in the previous year.
FIVE TIMES.
On five occasions, the industry missed the opportunity to implement and communicate a solution on an industry-wide basis, a failure that ultimately cost 72 lives: the 71 people on board the two aircraft and, later, the controller himself.
Aviation is an industry that prides itself on safety and on distributing cautionary information globally, and this incident provoked a significant re-evaluation of its processes.
So why is a medical specialist writing about this type of incident?
Because if this story gives you pause about your mode of transport for your next holiday, consider how well the healthcare system deals with similar incidents.
When it comes to communicating effectively industry-wide, healthcare is so far behind aviation it’s not funny.
It’s time to do better.
To paraphrase George Santayana, if we fail to learn from the near misses and critical incidents that occur in our industry, we are condemned to repeat them.
“Mistakes were made by us also, and we regret them deeply. We acknowledge our responsibility…and we ask the families of the victims for forgiveness”.
– Alain Rossier, Skyguide Chief Executive
About the Author
Dr Todd Fraser is an intensive care and retrieval medicine specialist, podcast editor of the Society of Critical Care Medicine, and founder of Osler Technology, a clinical performance management platform for acute healthcare providers.