The operational unit in a team is the individual. It is through our leadership and followership skills that we integrate team and individual needs. Concepts from the field of emotional intelligence identify ways to improve those team-based skills. So, what does that look like operationally?

Individual self-awareness, an Emotional Intelligence domain,[1] is the key to that integration. Active self-awareness resolves many team issues before they become problematic. Team leaders must ask themselves many questions to be responsive in improving team performance. Perhaps the most important question for a medical team leader is, “Am I empowering my team to communicate, to correct error-producing behaviors, and to identify errors when they occur?” This is a fertile area for patient safety improvement.

Developing self-awareness skills requires watching and listening to yourself honestly. This can be described as having one eye looking out and one eye looking in as we interact. Often, this is more of an evaluative process that occurs after the fact. To evaluate afterward, however, we need to have “recorded” our behavior at the time it occurred, which requires a developed self-awareness skill. From a leadership improvement perspective, this skill is invaluable. When we identify errors, in operation or in leadership, it is essential to track backward to find the cause. Many times, the true cause of an error occurs early in a sequence of interactions and is not obvious if we view only the resultant error. Aviation has been struggling with error mitigation since the Wright brothers’ first flight in 1903 at Kitty Hawk, North Carolina.

So, what more can be learned from aviation to aid in medical error mitigation? Recently, there have been great strides in integrating aviation techniques into medical practice: checklists, briefings, debriefings, and the like. Most of those integrations are systemic in nature, and some studies have shown disappointing results from these efforts. Often, the medical industry assumes that if we do what the airlines do, we will see the same great improvements in safety and error reduction. While there has been improvement, it has not produced the dramatic results expected from those efforts. In fact, a report from Johns Hopkins University published May 3, 2016 in The BMJ indicates that medical-error-related deaths may exceed 250,000 per year. That is the equivalent of the airlines crashing twelve Boeing 747s full of passengers every week! Would you fly in the face of those statistics? How long can our medical institutions remain profitable in light of those numbers?

There are several reasons for this resistance to error reduction in medicine. First, healthcare is a very complex system with many variables that require integration at the macro level. Second, the focus on systems and procedures has missed where the real improvement in aviation occurred: personal, rather than systemic, improvement was the last area to receive focus and development in aviation, and it now needs to take prominence in medicine. Finally, a physician’s autonomy and independence are barriers to implementing continuous improvement through systemic processes.

In aviation, changing leadership concepts met with some resistance from established captains and crew members. It had to be made clear that these changes were empowering rather than directive. As humans, we cannot avoid making errors; as driven, high-performing professionals, we hate having to accept that fact. The biggest change in aviation safety came when crews accepted and understood that they will make errors. With that acceptance, the effort to avoid error-creating environments, improve error recognition, and develop mitigation strategies became pervasive. Knowledge and awareness are worthless without operational integration. Training and operational evaluation, with long-term reinforcement and accountability, are needed to make this culture shift permanent.

Self-management is a major benefit of improved self-awareness. Neuroscience teaches us how the mind processes and prioritizes information, as well as the resultant behavioral impact. Improved awareness of who we are at any moment helps us manage how we present ourselves and interact in our environment. Team effectiveness can then improve through better relationship management. This critical aspect of error mitigation and management is based on the ability to engage the team in real-time reassessment. As the captain, physician, or team leader becomes task-saturated or stressed, situational awareness (SA) is reduced. Effective relationship management expands the SA of the team, spreading error recognition and mitigation strategies across more eyes and minds. Better relationship management opens the communication that allows that recognition to positively impact patient safety.

Finally, from a risk management perspective, if you capture errors but fail to understand why they occur, you are still at risk. Errors of omission and errors of commission yield completely different evaluations, so the self-awareness skills discussed above take on new significance. The event is not finished until it has been debriefed. Historically, the debrief has been a recap of the event that identifies errors or areas to improve. Unfortunately, the likelihood of repeating the same error under stress remains very high, because that type of debrief does not identify the causal factors. The causal factors leading to procedural, functional, behavioral, or communication errors may be very different. If we do not ask why the error occurred, we will never know how to prevent it. A major mistake in debriefing is assuming that if no actual errors are identified, then no errors occurred.

Medicine, like aviation, is dynamic and idiosyncratic. We become experts at capturing errors as they develop, because every event is unique and presents unexpected challenges. When we capture and correct errors before they become unacceptable events, we often fail to identify them as errors. The key to effective debriefing is to identify not only the errors that occurred but also the errors that were captured. The error still occurred; if it is not traced back to its cause, it will threaten safety again and may not be captured the next time. A common technique in aviation is to start an evaluative debrief with the most serious deviations and work downward; any risk to safety is the highest priority. The debrief questions may sound like this: “Was safety compromised in any way?” and “Did we capture any errors that could have resulted in a safety risk?” and then, “What was I doing that allowed the captured error to occur?” In other words, why did we have to capture an error at all? The conclusion becomes a discussion of a mitigating strategy that will prevent exposure to that risk in the future. Once the most serious error, captured or not, is identified and traced to its cause, and mitigations are developed, the debrief is complete. Trying to fix every small area where improvement is possible reduces efficiency and erodes the commitment to implement any new strategies. A good debrief is specific, positive, and focused, and it defines a clear behavior or action for improvement.

Institutional culture change opened the way for aviation to identify and mitigate threats prior to flight. Empowerment to be human and improved error recognition and mitigation came next. Finally, curiosity about causal factors drives a functional debrief. Medicine is committed to improving patient safety. Make it personal: lives depend on it, both the patient’s and the provider’s!

[1] Goleman D, Boyatzis R, McKee A. Primal Leadership: Learning to Lead with Emotional Intelligence. 2002. p. 39.