Northwestern Medicine’s Fair and Just Culture Program Encourages Safety Event Reporting Systemwide

Added on Sep 20, 2018

By Audrey Doyle

To advance Northwestern Medicine’s Patients First philosophy of care delivery, a team of quality and patient safety leaders has developed a Fair and Just Culture program that guides department heads, managers and staff in differentiating among human error, behavioral choices and system weaknesses in response to safety events.

The goals of the new program, which launched last fall, are to increase the reporting of near-miss and precursor safety events, decrease the number of serious safety events and decrease the anonymous reporting of safety events systemwide.

According to Dea Hughes, program director for patient safety, Northwestern Medicine has had a well-established patient safety program for several years, and the implementation of a fair and just culture that encourages and rewards staff for speaking up for safety was already an integral component of the system’s mission, vision and values.

“But we wanted to take our existing efforts a step further and advance our Patients First philosophy,” she said. “So we developed this program to formalize a systemwide approach to a fair and just culture so that every employee in the organization would see what fair and just means in the context of patient safety events and would then feel safe to report these events.”

Innovative Algorithm Analyzes Events

The Fair and Just Culture program is integrated into the organization’s NM Toolkit of intranet-based safety, service and leadership tools. Developing the program’s foundational components was an interdisciplinary effort that was led by Cynthia Barnard, Vice President of Quality, and involved a core team of key stakeholders from Northwestern Medicine’s Human Resources, NM Academy, Talent Development, Patient Safety, Quality Management, Performance Improvement and Risk Management departments.

At the heart of the program is an innovative, internally developed, evidence-based decision support algorithm that analyzes the occurrence of near-miss events (the error was caught by a detection barrier or by chance before it reached the patient), precursor events (the error reached the patient but caused minimal or no harm) and the specific staff behaviors that may have led to them. It then suggests appropriate and fair responses based on whether the events occurred due to a human error, behavioral choice or system weakness. The team based the algorithm on research and resources, including guidance from Press Ganey’s HPI safety experts and the National Institutes of Health, on what constitutes a fair and just culture and how to guide organizations toward high reliability.
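Northwestern Medicine’s algorithm itself is internal and has not been published, but the taxonomy it rests on can be sketched in code. The Python below is purely illustrative: every class and field name is hypothetical, chosen only to show how the severity levels and causal categories described above might be modeled in an incident reporting system.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class EventSeverity(Enum):
    """How far an error traveled before it was stopped."""
    NEAR_MISS = auto()  # caught by a detection barrier or by chance before reaching the patient
    PRECURSOR = auto()  # reached the patient but caused minimal or no harm
    SERIOUS = auto()    # reached the patient and caused significant harm


class Cause(Enum):
    """The causal categories the review distinguishes."""
    HUMAN_ERROR = auto()        # an inadvertent slip or lapse
    AT_RISK_BEHAVIOR = auto()   # a risky choice made without intent to cause harm
    RECKLESS_BEHAVIOR = auto()  # conscious disregard of a substantial risk
    SYSTEM_WEAKNESS = auto()    # a flaw in the process or environment, not the individual


@dataclass
class SafetyEvent:
    """A logged incident, as it might appear in a reporting system."""
    description: str
    severity: EventSeverity
    cause: Optional[Cause] = None  # assigned after the event is worked through the algorithm
```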

The algorithm allows the Northwestern Medicine team to review safety events in a standardized way, offering an impartial method of addressing performance and conduct and fostering a balanced approach to accountability, according to Hughes. It uses data-driven decision making to lead the user to an evidence-based conclusion about what caused the event.

For example, say a nurse has programmed an infusion pump with the wrong patient weight, leading to an overinfusion of intravenous medication. Once that safety event has been logged in Northwestern Medicine’s incident reporting system, the nurse’s manager would call it up in the NM Toolkit and work it through the algorithm by answering a series of targeted “yes” and “no” questions to determine how to proceed.

If the algorithm subsequently categorizes the event as having been caused by human error or behavioral choice, it will present the manager with three possible conclusions: the nurse’s behavior may have been the result of normal human error, at-risk behavior that was not intended to cause harm, or reckless behavior. The algorithm would then offer suggestions for an appropriate response. For normal human error, the suggested response would be to console the nurse, reexamine system improvement opportunities and monitor for future errors. For at-risk behavior, it would suggest system improvement and monitoring.

For reckless behavior, it would suggest corrective action and, potentially, discipline. At this stage, the manager would consider everything that occurred during the course of the event to identify contributing factors. “For example, was the patient’s weight incorrectly documented in the medical record? Were there some human factors issues associated with the intravenous pump? Was the nurse working a double shift and overly tired?” Hughes offered. “The manager would take into account everything that may or may not have contributed to the event and base the corrective action plan on the findings.”

If the algorithm determines that the event occurred due to a system weakness, as opposed to human error or behavioral choice, it would instruct the manager to console the nurse and then to conduct a comprehensive systematic investigation of the event, such as a root cause analysis, to better understand what happened, why it happened and how to prevent it from happening again.
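To make the decision flow concrete, the Python sketch below walks the infusion-pump example through a toy version of such a decision tree. The yes/no questions and function names are invented for illustration; Northwestern Medicine’s actual question set is not public, and only the four conclusions and their suggested responses are drawn from the descriptions above.

```python
def classify_event(harm_was_intended: bool,
                   risk_was_recognized_and_ignored: bool,
                   knowingly_cut_a_corner: bool,
                   process_set_the_person_up_to_fail: bool) -> str:
    """Toy decision tree in the spirit of a just-culture algorithm.

    Each argument stands in for one of the targeted yes/no questions a
    manager would answer; the real question set is internal to NM.
    """
    if harm_was_intended or risk_was_recognized_and_ignored:
        return "reckless behavior"
    if knowingly_cut_a_corner:
        return "at-risk behavior"      # a risky choice, but no intent to harm
    if process_set_the_person_up_to_fail:
        return "system weakness"
    return "human error"               # an inadvertent slip or lapse


# Suggested responses for each conclusion, as described in the article.
RESPONSES = {
    "human error": "Console the staff member, reexamine system improvement "
                   "opportunities and monitor for future errors.",
    "at-risk behavior": "Pursue system improvement and monitoring.",
    "reckless behavior": "Corrective action and, potentially, discipline.",
    "system weakness": "Console the staff member, then conduct a comprehensive "
                       "systematic investigation, such as a root cause analysis.",
}

# The infusion-pump example: suppose the weight was misdocumented upstream,
# so the event points to a system weakness rather than to the nurse.
conclusion = classify_event(harm_was_intended=False,
                            risk_was_recognized_and_ignored=False,
                            knowingly_cut_a_corner=False,
                            process_set_the_person_up_to_fail=True)
print(conclusion, "->", RESPONSES[conclusion])
```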

Promoting the Program

According to Hughes, the team incorporated the Fair and Just Culture program into the collection of leadership tools in the NM Toolkit. “Promotion of a fair and just culture in any organization has to start at the top, and we felt that making the program a leadership tool would help engage our leaders in promoting this culture,” she said.

But the program won’t succeed if staff members don’t report safety events, and they won’t report them if they don’t feel safe doing so, she added. So the team developed training sessions that use hypothetical safety event scenarios to explain the difference between human error, behavioral choices and system weaknesses, and to show how managers have been instructed to respond to each.

So far, system leadership, key stakeholders and team leaders have completed the training. Sessions geared toward department managers and front-line staff were rolled out this past May. The goal, said Hughes, is to make the program highly visible in all the Northwestern Medicine hospitals to ensure that everyone is aware that a structured, fair and just process is in place for analyzing safety events and differentiating between human error, behavioral choices and system breakdowns. The hope, she added, is that this will reassure staff that it’s safe to report events, while also reiterating the importance of an atmosphere of transparency.

In addition to incorporating the algorithm into the Fair and Just Culture program, the team has begun to operationalize it in other areas. “For example, in one of our hospitals we’ve integrated the algorithm within the nursing peer review process. When safety events are discussed at peer review they’re taken through the algorithm so that everyone has a clearer understanding of how staff behavior may have led to the event,” Hughes said. “We also use the algorithm when we’re discussing safety cases during hospital-based and system safety committee meetings. It allows us to review these cases in a standardized way and come to a standardized conclusion.”

According to Hughes, there was initially some concern among staff members about whether they would have time for the training sessions given their competing priorities. To alleviate this concern, the team designed the sessions around staff schedules and made them optional.

There was also some concern about consistent and fair application of the algorithm, and whether staff would use it as intended. To address this concern, the team provided transparent and accessible information about the Fair and Just Culture methodology on the system’s intranet under the NM Toolkit, including an email address for submitting questions or concerns. “For the most part, people have been very receptive to the program, and they’ve indicated that it’s something they want to see more of,” Hughes said.

As the program has become increasingly visible across the system, the team has been tracking how it is affecting event reporting. Although concrete data are not yet available, Hughes noted that the team has seen an increase in the reporting of near-miss and precursor events. “And while we’re not doing this at the present time, we plan to track whether we see a decrease in the number of anonymous incident reports. These metrics will help us gauge the progress and success of the program,” Hughes said.

Even in hospitals with well-established safety cultures, errors in the delivery of health care are common. And when errors occur, people tend to keep information about them to themselves for fear of being looked down on, blamed or punished.

The Fair and Just Culture program encourages staff to report the occurrence of safety events so that others can learn from them, and so that any system weaknesses that may have contributed to them can be revealed and corrected.

“Punishing employees for safety errors doesn’t make patients safer. But knowing when events have occurred and understanding what caused them can make patients safer,” Hughes said. “If you have a well-developed fair and just culture, staff will feel that it’s safe to report events, and they’ll recognize the value in reporting them. Then you’ll be able to address the cause of the events, and your patients will be safer in the long run.”