ABSTRACT
The number of deaths due to medical error or preventable adverse events (PAEs) in the US is unacceptably high.[i] Contrast those numbers with the consistently high level of safety observed in the aviation world.[ii] Aviation accident rates remain low, with only an occasional spike. Why, as the saying goes, is it ‘safer to fly than to drive’? This paper examines the reasons the aviation industry has excelled in safety compared with the medical world, notwithstanding the fact that the practice of medicine is highly complex and variable. It then offers critical and wide-ranging recommendations intended to significantly and durably improve patient outcomes.
INTRODUCTION
Many reports, papers, and articles have been published comparing low aviation error rates to error rates in the medical environment. Consider the report ‘To Err Is Human: Building a Safer Health System’ or the ‘Patient Safety and Quality Improvement Act of 2005’ (PSQIA) as just two references that identify issues and make recommendations for improving patient safety that could be more widely employed. Much of the information included in this paper is not new, but is provided as background. The recommendations made in this paper to close the huge gap between aviation safety and patient safety are not unheard of, but the guidance to implement these concepts IS relatively new. These recommendations include:
1) Strategies to better incorporate higher levels of Crew Resource Management (CRM) and Threat and Error Management (TEM) in surgical and clinical environments
2) Ways to enhance and grow the use of trusted confidential safety error reporting systems
3) Ways to gain acceptance that allows for operations in a Just Culture[iii] environment
4) The implementation of long-term confidential observation programs
ERROR REPORTING AND CREW RESOURCE MANAGEMENT
Error reporting in the aviation world is a well-refined process. Reports are collected confidentially in a non-punitive environment, resulting in a wealth of data and trend analyses that enhance safety throughout the industry. A key component of this error reporting process is the fact that aviation, like other high-risk environments, operates under the umbrella of Just Culture. That is to say, humans will make errors, most of these errors are unintentional or not deemed risky, and as long as reports are submitted within defined parameters, lessons are learned, and improvements are made, an employee cannot be fired, fined, or have license action taken against them. This process yields a wealth of information and drives employee performance improvement. The aviation world is well-known for making enhancements and modifications to operations, including in the critical area of human factors, as a result of all types of events, major and minor.
The medical world could learn much from these processes. The challenge is finding ways to amend the culture so that these processes can be more widely implemented. The Patient Safety and Quality Improvement Act of 2005 mandated that confidential reporting systems be made available, but reporting is voluntary and does not require adherence to Just Culture protocols; offenders are still subject to punitive measures. Under-reporting under the Joint Commission’s Sentinel Event Policy is staggering. According to the Joint Commission website: ‘The reporting of most sentinel events to The Joint Commission is voluntary and represents only a small proportion of actual events. Therefore, these data are not an epidemiologic data set and no conclusions should be drawn about the actual relative frequency of events or trends in events over time.’ The reasons for reluctance in reporting will be discussed later.
It took decades for the aviation world to evolve into today’s environment, wherein a crew operates the aircraft as a cohesive team rather than under an authoritative captain who ‘tells the crew what to do’. This cultural evolution has dramatically reduced accident rates and improved safety. Crews brief together before a flight to build a team and to identify and mitigate threats. Crews also debrief together after a flight to identify errors captured, deviations from standard, and lessons learned, all in the name of continual improvement and further risk mitigation.
In aviation, procedures are highly standardized and each crewmember is required to abide by them. Procedures meant to close communication feedback loops are in place to ensure changes made to the aircraft state are verbally verified by other crewmembers. If a deviation from standard procedure is necessary, it must be discussed and affirmed. Crewmembers are required to speak up at the first indication that something may be amiss. Not abiding by these policies increases the risk of errors being made.
So, what can happen if there is a dominating or authoritative captain who creates an environment where those on his/her crew are reluctant to speak up, or who deviates from standard procedures without reason? Risk increases and error capture rate decreases. It is just human nature. Leadership and teamwork are critical to safe operations in the flight environment.
Let’s look at the medical environment. Working in a medical environment should be a ‘team sport’. We will use a surgical team as an example. Have the majority of surgical teams evolved to the point where prior to a procedure a team is built, and each team member feels valued and free to speak up? Does the surgeon set a tone of teamwork or is he/she more authoritative? Are standardized procedures in place that must be adhered to, not just technically, but non-technically (e.g. Human Factors skills)? If so, if there is a deviation to those standards are the reasons understood and accepted by the team? Are communication feedback loops incorporated so that any time a change is made or something is required it is verified by other team members?
Technical mistakes can and do lead to medical error, which is why technical proficiency is the bedrock of safe outcomes- it must start there. Numerous policies and procedures exist to ensure the highest levels of technical proficiency and skill. However, technical skill is only one part of the larger collection of competencies that yield the most successful outcomes.[iv] Creating a new procedure every time something goes wrong is not a cure-all for enhancing safe patient outcomes. You cannot ‘proceduralize your way to safety’- which means that non-technical skills cannot be ignored or minimized.
Studies show that the number one cause of medical error is poor communication.[v] A study in the Journal of Patient Safety cited the Joint Commission in revealing that communication errors are among the most common causes of sentinel events, and that these errors are present in 70% of adverse events.[vi] Part of proper communication is knowing when to, and feeling free to, speak up. Additionally, closing the feedback loop is critical to successful communication. For example, suppose an order is given for a certain dosage of medication to be dispensed, but instead of verbal verification that the correct dosage will be provided, nothing is said or verified, and the wrong dosage is given- the feedback loop was never closed, leading to an unnecessary error and possible patient harm. Another important aspect of the feedback loop is the set of filters that may exist between the sender and the receiver. Lingo, slang, non-standard phrases, dialect, etc. can all prevent what was said from being received as intended- this is why clarification is so critical in successful teams.
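To make the closed-loop idea concrete, here is a minimal sketch in Python. All of the names (Order, send_order, read_back) are hypothetical and for illustration only; the point is that an order is not considered delivered until the read-back matches what was sent.

```python
# Minimal sketch of closed-loop communication for a medication order.
# All names here are illustrative, not from any real clinical system.

from dataclasses import dataclass

@dataclass
class Order:
    drug: str
    dose_mg: float

def send_order(order: Order) -> Order:
    """Sender states the order aloud; the receiver must read it back."""
    print(f"Sender: give {order.drug}, {order.dose_mg} mg")
    return order

def read_back(order: Order, heard_drug: str, heard_dose_mg: float) -> bool:
    """The loop closes only when the read-back matches the order exactly."""
    confirmed = heard_drug == order.drug and heard_dose_mg == order.dose_mg
    if not confirmed:
        print(f"Sender: negative- I said {order.drug}, {order.dose_mg} mg; "
              f"you read back {heard_drug}, {heard_dose_mg} mg")
    return confirmed

order = send_order(Order("drug X", 50.0))
assert not read_back(order, "drug X", 5.0)   # wrong dose caught before delivery
assert read_back(order, "drug X", 50.0)      # loop closed; order may proceed
```

The design choice is that silence is never treated as agreement: a missing or mismatched read-back blocks the action, exactly the failure mode in the dosage example above.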
Studies also show that highly reliable non-technical skills are not as engrained in the medical world as they are in aviation. Time-out protocols[vii] and surgical checklists,[viii] even though very successful from a technical skill perspective, often make no mention of non-technical skills that should be discussed. For example, it would be appropriate to ensure, prior to the beginning of a procedure, that each team member is well-rested, not under any undue stress, and not in any other compromising situation that would inhibit them from doing their job to the best of their ability. Perhaps the aviation IMSAFE checklist (Illness, Medication, Stress, Alcohol, Fatigue, and Emotion) would be useful. This checklist is a proven assessment of pilot fitness for duty that was compiled by numerous aviation advocates, airlines, and union groups, and released by the FAA. It could easily be applied in the medical world and is in fact found in the TeamSTEPPS protocol.
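A minimal sketch of how an IMSAFE-style self-check could be folded into a time-out follows, assuming a simple yes/no answer per item; the question wording and function names are illustrative, not taken from the FAA checklist text or TeamSTEPPS.

```python
# Hedged sketch of an IMSAFE-style fitness-for-duty self-check, adapted
# from the aviation checklist named above. Wording is illustrative.

IMSAFE_QUESTIONS = {
    "Illness":    "Am I free of any illness that could impair me?",
    "Medication": "Am I free of impairing medications?",
    "Stress":     "Am I free of undue stress or pressure?",
    "Alcohol":    "Am I free of alcohol and its after-effects?",
    "Fatigue":    "Am I adequately rested?",
    "Emotion":    "Am I emotionally fit to perform?",
}

def fit_for_duty(answers: dict[str, bool]) -> bool:
    """A single 'no' on any item flags the team member for discussion
    during the time-out, before the procedure begins."""
    return all(answers.get(item, False) for item in IMSAFE_QUESTIONS)

# Example: a fatigued team member is flagged, not silently waved through.
answers = {item: True for item in IMSAFE_QUESTIONS}
answers["Fatigue"] = False
assert not fit_for_duty(answers)
```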
Why does statistical review of numerous publications suggest that the number of deaths from medical error in the US is holding fairly steady? The need for improved non-technical skills has been addressed repeatedly, so why haven’t the numbers improved measurably? Is it cultural? Some say we perform more surgeries now than we did 20 years ago, which means the percentage of errors is decreasing. That is no consolation, though, when an estimated 250,000 or more deaths per year result from preventable medical error.[ix] The conclusion to draw here may be that close teamwork, closed communication loops, and effective briefs/debriefs are not engrained cultural concepts that are accepted, seen as necessary, and made a priority. There may also be engrained cultural pressures from a leadership/corporate perspective that, if mitigated, could open the way to robust reporting and error analysis.
In 2024 the Joint Commission stated that human behavior is not considered in sentinel event studies so as to avoid the assigning of ‘blame’.[x] The commission focuses on system process and procedure, both of which are important, but misses the human component that led to the error in the first place- behavior that may have been developing well before the error took place. This reflects a disregard for the value of confidential reporting and Just Culture. It also misses the fact that if human behavior isn’t changed within a framework of continual improvement, those behaviors will persist and lead to more error, no matter how many procedures are implemented. Until the culture swings more toward human behavior, acknowledges that humans make unintentional errors, and accepts that we can learn to mitigate the risk of those errors occurring, deaths caused by medical error may not substantially change for the better.
Regarding non-technical skills, the aviation world has adopted the concept of Crew Resource Management (CRM).[xi] CRM is a major key to successful outcomes when applied to the fullest extent. CRM evolved from a NASA research study and follow-on workshop on aviation accidents in 1979 that introduced ‘Cockpit Resource Management’, later known as ‘Crew Resource Management’.[xii] These skills are based on the fact that humans commit errors that can lead to accidents. Ensuring technical skill proficiency, proper utilization of leadership skills, development of a cohesive team, and open and free communication with closed feedback loops leads to the highest levels of situational awareness, thereby mitigating as much risk as possible. The bottom-line goal is to capture, or trap, errors or potential errors at the earliest possible opportunity.
Breakdowns in CRM do occur in the aviation world; it is the severity of the breakdown that can lead to a poor outcome. CRM failures can lead to uncaptured errors that exacerbate an otherwise non-threatening situation. Most aviation accidents involve some type of human error (roughly 78%).[xiii]
What can be learned from these statistics in the medical environment? For one, a proper time-out protocol that is efficient, methodical, and addresses each team member’s fitness for duty is critical. Additionally, all team members, including the surgeon, must be present. A cohesive team must be built, with the freedom to communicate a necessity- a team member should never be afraid or intimidated to speak up. Passing critical information on to a team member who wasn’t present introduces a risk of error. In the Threat and Error Management (TEM) process, known possible threats to a safe patient outcome are openly discussed and identified. Risk mitigation strategies are then agreed to and established to minimize any risk from those threats so that errors can be trapped before they become consequential. The time-out briefing currently does a nice job of applying TEM protocols with regard to the procedure and the patient (technical skills). However, TEM must address possible risks associated with non-technical skills as well.
Secondly, the free and continuous flow of communication within the team throughout a procedure must be established and expected. If something doesn’t look right or feel right it must be stated and resolved. No team member, no matter their status or level of experience, should ever feel reluctant to speak up.
Thirdly, following a medical procedure, even when there is a successful outcome, the team should conduct a debrief together. It may be short, but it should be appropriate in length and content to ensure that both what went well and what could be improved are discussed. CRM is a critical part of this debrief. How did the team handle challenges? Did the team work well together? Were errors captured before becoming consequential?
Many medical errors occur during patient turnover from one professional to another: 47% of malpractice claims involve provider-provider miscommunication and 53% involve provider-patient miscommunication.[xiv] A standardized handover protocol should always be used and should include everyone involved in a patient’s care. These are valuable discussions that ensure risk mitigation, patient safety, and continual professional improvement. It is estimated that 77% of these errors could be reduced by using a standardized handover tool. Protocols like I-PASS or SBAR can help standardize handoffs (see the sketch below) and should be made standard procedure.
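As an illustration, here is a minimal sketch of a structured handoff record following the published I-PASS mnemonic (Illness severity, Patient summary, Action list, Situation awareness and contingencies, Synthesis by receiver). The class and field names are hypothetical, not any vendor’s schema; the key design point is that the record is not ‘complete’ until the receiver closes the loop with a synthesis.

```python
# Hedged sketch of an I-PASS-style handoff record; field names follow
# the mnemonic, but the layout itself is illustrative only.

from dataclasses import dataclass, field

@dataclass
class IPassHandoff:
    illness_severity: str                           # e.g. "stable", "watcher", "unstable"
    patient_summary: str
    action_list: list[str] = field(default_factory=list)
    contingencies: list[str] = field(default_factory=list)
    receiver_synthesis: str = ""                    # receiver restates the plan

    def is_complete(self) -> bool:
        """Complete only when the receiver has read back a synthesis-
        the same closed feedback loop discussed earlier."""
        return bool(self.receiver_synthesis.strip())

handoff = IPassHandoff(
    illness_severity="watcher",
    patient_summary="post-op day 1, example procedure",
    action_list=["check labs at 06:00"],
    contingencies=["if fever, culture and call surgeon"],
)
assert not handoff.is_complete()   # no synthesis yet- the loop is still open
```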
The need for strong CRM skills in the medical world is evident. Questions remain, such as why CRM isn’t more robust, and why it isn’t standardized across the industry as it is in aviation. Listed below are possible reasons for the lack of adherence to higher-level CRM skills:
- Other than federal and state governments there is no over-arching governing lawful rule-making and enforcement authority in the medical world similar to the FAA in aviation. Many rules are implemented at the state level and are not consistent from state to state. The Joint Commission and other certification entities can make recommendations, but there is no legal enforcement of recommended policies.
- Heart disease and cancer are listed in many publications as the first and second leading causes of death in the US; medical error, even though third, isn’t highly publicized. Are we failing to address this issue because of the lack of public exposure and outcry?[xv]
- Many State Medical Boards and current incident review committees don’t recognize Just Culture, and they rarely search for the human cause of an error. Most of the time only possible systemic causes are addressed.
- Even though certification bodies conduct observations and periodic audits, there do not appear to be comprehensive routine annual observations of medical professionals from a non-technical perspective. In the aviation world these observations can take the form of ‘check rides’, which take place up to three times a year, or of confidential observations whose purpose is to collect operational data used to improve safety and standards.
- There isn’t enough emphasis placed on non-technical skills in medical school and CME
- Hospital and medical staff leadership including nurses, surgeons, CMOs, and others don’t prioritize and place enough emphasis on non-technical skills.
- Although gaining traction in some places, there are very few truly confidential reporting systems that allow team members to report errors or mistakes that can compromise outcomes without the threat of punitive action against the one who made the error, or of reprisal against those who filed the report.
- The concept of Just Culture is not as widely adopted a standard throughout the medical world as it is in aviation. Operating in a Just Culture environment ensures that anyone who has made a mistake or error of any kind, whether technical or non-technical in nature, cannot incur punitive action unless the action was the result of reckless or unlawful behavior, or resulted in an accident or death.
CONFIDENTIALITY AND JUST CULTURE
Let’s examine two areas for improvement that could very well lead to higher levels of CRM, enhanced data collection, and improved patient outcomes: Confidential Reporting and Just Culture.
First of all, the following recommendations must be supported at every level of a health care system from leadership to the operator level. ‘Buy-in’ is a must!
The aviation and railroad environments have both realized significant safety improvements by utilizing confidential reporting systems created in partnership with NASA. In the railroad environment the Confidential Close Call Reporting System (C3RS) was created in 2007 in cooperation with the Federal Railroad Administration (FRA). In 1976 NASA introduced the Aviation Safety Reporting System (ASRS), created in cooperation with the Federal Aviation Administration (FAA). Both programs allow any industry professional to confidentially report a safety issue they have incurred or observed, report an error they made personally, or submit a suggestion they believe might improve the safety of operations. If the report is filed in a timely manner, and the event is not the result of reckless or unlawful behavior and did not result in an accident, then that professional cannot have punitive action taken or a fine imposed. It is critical to point out that the intent of these reports is not to assign blame, but to learn, all in the name of continuous improvement and risk mitigation.
Humans make mistakes, and these reporting systems allow for that fact. In a non-punitive environment, the amount of data collected leads to numerous safety enhancements. In the aviation world alone, operational issues have been refined not only in the aircrew environment, but in air traffic control (ATC) and in maintenance departments as well. Over 2,000,000 NASA reports have been filed, and not once has confidentiality been breached. On average, roughly 100,000 reports are filed annually. Imagine what could be done to improve patient safety if we had robust reporting like ASRS in the medical world!
In the early 2000s the NASA ASRS program was given an additional boost by the FAA with the introduction of the Aviation Safety Action Program (ASAP).[xvi] ASAP has taken NASA reports to the next level. ASAP is managed at the individual airline level. When a crewmember or other aviation professional files an ASAP report, they have the same protections as with a NASA report. However, all filed reports are reviewed by a committee that meets routinely (normally quarterly). Members of this Event Review Committee (ERC) are trained in Just Culture and consist of representatives from the airline safety department, the local FAA office, the company, and the union if applicable. The group discusses each event and must unanimously agree on its disposition. Data is collected and assessed, trends are identified, and appropriate recommendations are made when necessary. This discussion enhances standardization and may lead to procedural enhancements. Discussion focuses heavily on CRM and the role non-technical aspects played in each event.
The Just Culture concept was brought to the forefront by David Marx.[xvii] It is a foundation of a High Reliability Organization (HRO). The concept assumes that humans make errors, and that many of those errors are unintentional or the result of at-risk behavior (e.g., shortcuts, normalization of deviance). Just Culture is the opposite of Blame Culture; however, it is important to point out that reckless behavior is never tolerated. According to Marx, the term ‘outcome engineering’ is applicable to high-risk industries. His algorithm lists three basic duties:
1) Duty to produce an outcome
2) Duty to follow a procedural rule
3) Duty to avoid unjustifiable risk
Consequently, when a duty is breached, the mechanism of the breach is categorized as one of the following (a brief decision sketch follows the list):
1) Human error
2) At-risk behavior (a conscious drift from safe behavior)
3) Reckless behavior (conscious of conduct and risk)
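The three-way split lends itself to simple decision logic. The sketch below illustrates the concept only- it is not Marx’s published algorithm; the enum names and response strings are hypothetical, and the console/coach/sanction responses follow the discussion that comes next.

```python
# Illustrative sketch of the Just Culture three-way classification.
# Category names track the list above; responses mirror the discussion
# of consoling human error and coaching at-risk behavior that follows.

from enum import Enum, auto

class Breach(Enum):
    HUMAN_ERROR = auto()   # unintentional slip or lapse
    AT_RISK = auto()       # conscious drift from safe behavior
    RECKLESS = auto()      # conscious disregard of substantial risk

def just_culture_response(breach: Breach) -> str:
    if breach is Breach.HUMAN_ERROR:
        return "console: discuss the event, find the root cause"
    if breach is Breach.AT_RISK:
        return "coach: correct the drift, agree on remediation"
    return "sanction: reckless behavior is never tolerated"

assert just_culture_response(Breach.AT_RISK).startswith("coach")
```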
Nurse managers and charge nurses will tell you that at-risk behavior is the biggest cause of increased risk for patient harm they observe.[xviii] Getting in a hurry, becoming task-saturated, or simple laziness leads to shortcuts. A term frequently used to describe this situation is ‘normalization of deviance’ from standard procedure.
In a Just Culture world an employee is allowed to learn from their mistakes without fear of repercussion or punishment, including license suspension, fines, or time off, provided no reckless or unlawful behavior is involved. Human error and at-risk behavior are dealt with slightly differently. After an unfortunate event in a Just Culture world, if the error was unintentional and the result of human error (distraction, fatigue, stress, etc.), the offending individual is consoled- that is to say, the event is discussed and the root cause identified, which leads to further awareness and continual improvement. If the behavior was deemed ‘at risk’ as described above, the individual is coached to ensure they understand why the deviation from procedure that led to an error was unacceptable. In many cases some type of corrective action may be implemented (e.g., the surgeon performs under observation or undergoes re-training). However, as long as these contingencies are agreed to and carried out successfully, no further action can be taken, and that professional moves forward having learned valuable lessons. Precious data is collected as well. It is important to note that in a Just Culture world the intent is not to assign blame, but rather to perform root cause analyses and learn from unfortunate events so that the risk of them happening again is mitigated.
As mentioned before, the need for confidentiality and data collection is recognized throughout the industry. Years ago, in conjunction with the Veterans Administration (VA), NASA implemented the Patient Safety Reporting System[xix] (much like C3RS and ASRS). The PSRS saw tremendous participation, but fell victim to governmental budget cuts. The data collected was invaluable and showed that this type of system can work in the medical environment. However, the current lack of nationwide reporting has led to a huge gap in data collection, and to reluctance to report when there are no protections for medical professionals. Malpractice premiums can increase as well. If these factors exist, why would anyone report? The result is that we only hear about the egregious errors (sentinel events), while many smaller ones (close calls, near misses, good catches, etc.) are never brought to light and/or are kept under wraps. This lack of reporting can hide critical data and concerning trends.
One study from the University of Texas at Austin concluded that medical professionals do feel a professional obligation to report data on ‘close calls’ and sentinel events, but will not without immunity from punishment and a guarantee that the information gained will actually lead to changes to the system.[xx]
Many health systems have implemented patient and employee incident reporting systems in which reports are filed anonymously; however, one cannot compel participation. Data, not just on sentinel events but on close calls and other events, is essential to patient safety improvement and error reduction. While these reporting systems are a step in the right direction, there are numerous reasons they go largely unused:[xxi]
- Apathy- a belief that nothing actionable will result from filing a report
- Lack of trust that reports will be kept anonymous
- Fear of consequences, including reprimand of the offender or reprisal against those who submitted reports; a corporate culture of ‘blame and retribution’ is engrained in many current systems
- Ego (much like the authoritative airline captain)
- Time involved to fill out a report
- Lack of data shared with staff, which signals that the reporting system is not a priority
Often, the measure of success of these systems is the level of malpractice claims filed. While fewer claims are good, that metric does not address the confidentiality, protection, and data collection that are key components of operating in a Just Culture world. We are simply missing out on critical data that could reduce medical error and patient harm.
INCIDENT REVIEW COMMITTEES AND LONG-TERM OBSERVATIONS
Many of the facts addressed thus far are not unknown in the medical world. What has not happened, however, is the taking of significant, direct steps from a non-technical perspective to address the third largest cause of death in our country- medical error.[xxii] We know reporting is low and why, we know significant errors are made, we know patients are harmed, and we know CRM exists at non-standard levels industry-wide.
Many companies give presentations focusing on error reduction techniques, CRM, and reporting. It is vitally important that these seminars be followed up; without follow-up, caregivers simply go back to the way they do business on a daily basis. Along the same lines, many articles and studies recommend that surgeons and other leaders take the initiative to become better educated in error reduction and patient safety. However, simply attending educational seminars may not produce institutional and cultural change. Long-term processes are needed to ensure sustainable success in improving patient outcomes.
We have addressed the vital importance of CRM at every level. Those principles are critical to patient safety. In addition to more robust CRM, the implementation of two bold initiatives is recommended to ensure a measurable reduction in medical error, and dramatic improvement in patient outcomes:
- Laws passed or current ones amended requiring that confidential reporting systems similar to ASRS, C3RS, and ASAP be made available to medical professionals at the state and/or federal level in a Just Culture environment
- Long-term confidential/de-identified observations and studies of medical professionals in action in clinical and surgical environments, with reporting done at a systemic, not individual, level. Feedback, recurrent review, and follow-up must be provided.
It is incumbent upon state and/or federal legislatures to pass additional, more robust laws protecting medical professionals when they observe or make unintentional errors, as well as when a potential error that could have caused patient harm is captured (i.e., a close call). These laws should mandate that confidential reporting be made available, and should require state medical boards, certified Patient Safety Organizations (PSOs), or other designated entities such as a hospital Professional Practice Evaluation Committee (PPEC) to act as incident review committees in a Just Culture environment. However, why wait on the government? Health care systems can implement these enhancements on their own! Just consider the data and trend analyses that could be accomplished, the standardization protocols put in place, and the lives saved as a result. The process may prove arduous, but support from organizations like the Joint Commission, HHS, NAM, ASHRM, AHRQ, AMA, etc., as well as from health system administrators, physicians, nurses, and other medical professionals on the front lines, would lead to a wealth of discovery and improved patient outcomes.
The key to success of this initiative must include the following:
- The review committee MUST operate in a Just Culture world- no exceptions. This addresses the concerns of those who don’t trust the anonymity of reports and of those who fear punitive action for making errors.
- Published feedback MUST be provided on a regularly occurring basis. This feedback gives all professionals confidence that their voices are being heard and considered.
- CRM MUST be a central part of any discussion. Most errors occur not due to lack of technical knowledge and skill, but due to CRM breakdowns such as fatigue, stress, lack of teamwork, and communication failures.
- Reports on ALL errors and close calls must be submitted and reviewed. Professionals are reluctant to fill out reports, especially on themselves, yet that is often where the most can be learned. Remember that in a Just Culture environment they are protected under most circumstances.
- It is understood that errors resulting in patient harm can have other consequences such as malpractice claims. A key component of this program would ensure that reports filed could not be used in discovery as a part of any lawsuit. Also, in a Just Culture world the professional who made the error would still be protected from punitive action under most conditions. This process could possibly reduce malpractice claims and premiums.
Back to the aviation world- crewmembers are constantly checked and re-checked. They can be observed and/or given ‘check rides’ three or more times a year on actual flights and in the simulator. Additionally, the Line Operations Safety Audit (LOSA)[xxiii] program places unbiased, neutral observers on flight decks to observe and report on operations throughout a system from a safety and CRM perspective. The resulting data is de-identified and analyzed, reports are compiled and disseminated on a regular basis to the entire operation, and procedural and CRM changes at the system level often result from these observations.
Does a program like LOSA exist in the medical world? Not that this author is aware of. It is highly recommended that health care systems employ long-term observation programs in which expert Human Factors analysts (distinct from HF scientists/engineers) assess operations in an embedded environment from a CRM/non-technical skills perspective. A ‘medical LOSA’ would be invaluable for learning where non-technical mistakes and errors are made at a systemic level.
A three-phase program should be established with the ultimate goal of identifying shortcomings that lead to close calls or other errors from a non-technical skills perspective. Perhaps expanding use of the Non-Technical Skills for Surgeons (NOTSS)[xxiv] behavior rating system would be appropriate, applied in a confidential and Just Culture environment. A 2020 study entitled ‘Assessment of the Non-Technical Skills for Surgeons (NOTSS) framework in the USA’ concluded that, to be used broadly in practice, a scalable method is needed for training and assessing surgeons’ non-technical skills.[xxv]
In phase 1 an observation team will observe departmental surgical/clinical teams in action in a confidential and non-threatening environment. There will be no interaction with the teams at any time. The experts will simply observe, collect data, and submit a de-identified report to appropriate leadership on culture, CRM implementation, and other non-technical skills, with recommendations for improvement. The culture of an organization can be readily characterized using the CRM concept of ‘Observable Behaviors’- specific traits that are exhibited when CRM is being used effectively. CRM breakdowns that lead to errors will be observed and included in the report.
In phase 2 the observation team will debrief the department and present strategies for implementation of recommended changes.
Phase 3 will include a return of the observation team after a period of time to gather feedback and to ensure implemented changes have resulted in the desired effect. Performance metrics will be assessed and adjustments made as necessary.
A three-to-six-month observation program will allow the gathering of enough data to draw educated conclusions, detect trends, and implement engrained, repeatable changes. The key to success is for the observation team to return later to ensure the changes continue to be followed and to modify processes as necessary.
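To illustrate how such de-identified observations might roll up into systemic findings, here is a hedged sketch; the behavior labels and record layout are hypothetical examples of ‘Observable Behaviors’, not an actual LOSA or NOTSS schema.

```python
# Hedged sketch of aggregating de-identified observation records into
# systemic CRM trends. Labels and layout are hypothetical.

from collections import Counter

# Each observation is a de-identified record of CRM behaviors seen
# (or missed) during one procedure- no names, no case identifiers.
observations = [
    {"time_out_complete": True,  "closed_loop_used": False},
    {"time_out_complete": True,  "closed_loop_used": True},
    {"time_out_complete": False, "closed_loop_used": False},
]

def breakdown_counts(obs: list[dict]) -> Counter:
    """Tally how often each observable behavior was absent, so the
    report speaks to the system, not to any individual."""
    missed = Counter()
    for record in obs:
        for behavior, seen in record.items():
            if not seen:
                missed[behavior] += 1
    return missed

print(breakdown_counts(observations))
# Counter({'closed_loop_used': 2, 'time_out_complete': 1})
```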
CONCLUSION
While no system is perfect, the aviation world has achieved tremendous safety results through the substantive, entrenched use of Crew Resource Management. Confidential reporting systems such as ASRS and ASAP (and C3RS in the railroad world) have proven invaluable to safety as well. In a Just Culture environment, these reports are received and reviewed under non-punitive and confidential protocols. The reports are not an attempt to assign blame for errors, but rather to learn from them. Long-term audits allow observation of the operational culture while detecting negative safety trends that could lead to major error.
In the medical world, creating an environment that requires and uses effective CRM is a must, data collection is essential, and confidential reporting in a non-punitive environment is critical. The challenge is how to accomplish these tasks. Long-term audits, wherein an expert Human Factors team observes operations in an immersed environment and then reports and follows up on those observations, are a huge start. Additional legislation that protects medical professionals, requires confidential reporting systems, and mandates board review using Just Culture protocols is critical. But health care systems can choose to implement these enhancements on their own. These recommendations are not yet realized in the majority of the medical world. Change takes time, but that should not be a reason not to implement these processes and protocols in an effort to dramatically reduce the rate of poor patient outcomes due to medical error.
REFERENCES
[i] James, John T. PhD (September 2013). A New, Evidence-Based Estimate of Patient Harms Associated with Hospital Care. Journal of Patient Safety
[ii] US Civil Aviation Accident Dashboard: 2008-2023. NTSB
[iii] Marx D. (2001). Patient Safety and the Just Culture: A Primer for Health Care Executives. New York, NY: Trustees of Columbia University
[iv] Berner, Juan Enrique; Ewertz, Ernesto (2018). The Importance of Non-Technical Skills in Modern Surgical Practice. Spanish Association of Surgeons published in Science Direct.
[v] Carrie, Anne (2022). The 8 Most Common Root Causes of Medical Errors. Always Culture.
[vi] Guttman, Oren T. MD, MBA; Lazzara, Elizabeth H. PhD; Keebler, Joseph R. PhD; Webster, Kristen L. W. PhD; Gisick, Logan M. BS; Baker, Anthony L. PhD (December 2021). Dissecting Communication Barriers in Healthcare: A Path to Enhancing Communication Resiliency, Reliability, and Patient Safety. Journal of Patient Safety
[vii] The Joint Commission. The Universal Protocol for Preventing Wrong Site, Wrong Procedure, and Wrong Person Surgery Guidance for health care professionals. www.jointcommission.org
[viii] World Health Organization (2008). WHO Surgical Safety Checklist. www.who.int
[ix] Aryankhesal, A.; Aghighi, N.; Raeissi, P.; Najafpour, Z. (2023). Recurrence of medical errors despite years of preventive measures: A grounded theory study. J Educ Health Promot
[x] Rodziewicz, Thomas L.; Houseman, Benjamin; Vaqar, Sarosh; Hipskind, John E. (2024). Medical Error Reduction and Prevention. StatPearls Publishing, Treasure Island, FL
[xi] Federal Aviation Administration (2004). Crew Resource Management Training. Advisory Circular (AC) 120-51E
[xii] Helmreich, R.L.; Merritt, A.C., & Wilhelm, J.A. (1999). University of Texas at Austin Human Factors Research Project: 235. The evolution of Crew Resource Management training in commercial aviation. International Journal of Aviation Psychology, 9(1), 19-32
[xiii] Ancel, Ersin; Shih, Ann (2012). The Analysis of the Contribution of Human Factors to the In-flight Loss of Control Accidents. American Institute of Aeronautics and Astronautics
[xiv] Humphrey, Kate E.; Sundberg, Melissa; Milliren, Carley E.; Graham, Dionne A.; Landrigan, Christopher P. (2022). Frequency and Nature of Communication and Handoff Failures in Medical Malpractice Claims. Journal of Patient Safety
[xv] McMains, Vanessa (May 2016). Johns Hopkins Study Suggests Medical Errors are Third-Leading Cause of Death in U.S. Hub.jhu.edu
[xvi] Federal Aviation Administration, AFS-200 (2020). Aviation Safety Action Program. Advisory Circular (AC) 120-66C
[xvii] Marx D. (2001). Patient Safety and the Just Culture: A Primer for Health Care Executives. New York, NY: Trustees of Columbia University
[xviii] Wachter, Robert (2007). In Conversation with… David Marx, JD. PSNet and AHRQ.
[xix] Hooey, Becky, ASRS Director NASA (2025). Interview regarding the NASA PSRS program
[xx] Helmreich, R. L.; Harper, Michelle L. (2004). Identifying Barriers to Success of a Reporting System. Advances in Patient Safety, Vol. 3
[xxi] Helmreich, R. L.; Harper, Michelle L. (2004). Identifying Barriers to Success of a Reporting System. Advances in Patient Safety, Vol. 3
[xxii] Makary, Martin A.; Daniel, Michael (2016). Medical Error- the third leading cause of death in the US. The British Medical Journal
[xxiii] Federal Aviation Administration, AFS-230 (2006). Line Operations Safety Audits. Advisory Circular (AC) 120-90
[xxiv] Yule, Steven PhD; Henrickson Parker, Sarah PhD; Wilkinson, Jill MSc; McKinley, Aileen MBChB, BSc, FRCS; MacDonald, Jaime MB BCh BAO, FRCS; Neill, Adrian MB BCh BAO; McAdam, Tim MB BCh BAO, FRCS (2015). Coaching Non-technical Skills Improves Surgical Residents’ Performance in a Simulated Operating Room. Journal of Surgical Education
[xxv] Pradarelli, J. C.; Gupta, A.; Lipsitz, S.; Blair, P. Gabler; Sachdeva, A. K.; Smink, D. S.; Yule, S. (2020). Assessment of the Non-Technical Skills for Surgeons (NOTSS) framework in the USA. Wiley Online Library (www.bjs.co.uk)