Culture of Safety

Use adverse events to drive improvements to systems and processes designed to catch errors before they cause harm
This feature is the next installment in a series of dialogues among members of the AAOS Patient Safety Committee. This dialogue focuses on how to establish and maintain a culture of safety: an environment in which everyone is expected to err, but one that also champions the teams and systems that catch errors before they cause harm.

Dr. Ring: Let’s talk about ideas for creating a culture of safety.

Dr. Pinzur: One might start creating a healthier environment by addressing morbidity and mortality conferences, which historically seem to focus on assigning blame rather than learning from adverse outcomes and errors. The idea is to rename that conference and reframe the meeting as a quality improvement endeavor. The assumption should be that human errors are expected and that harm is the result of ineffective processes. We can use the adverse event, or error, to drive improvements to systems and processes designed to catch errors before they cause harm.

Dr. Ring: We need to adopt David Marx’s “Just Culture Algorithm™,” which states that human error is addressed with improved systems. Drift (getting comfortable with ways around effective systems) is addressed with coaching. Only reckless disregard for systems should be addressed with punishment.

Dr. Gaunder: This is important for residency programs. They are now asked to involve their residents in quality and safety endeavors.

Dr. Pinzur: Rather than assigning blame, we should try to figure out what led to the error or adverse event. That is the idea of root cause analysis and contributing cause analysis. When something goes wrong, it is easy to make a snap judgment and assign blame. But if you talk to most surgeons who have performed a wrong-site surgery, you learn one or more very valuable lessons. For instance, I know a very capable hand surgeon who operated on an MCP rather than an IP joint. He had typed in the wrong code when scheduling the case. His usual process was to get to the surgery center early and review the notes to check for any errors. That day, he arrived late and the only two terminals hooked up to the university network were occupied, so he skipped part of his usual routine. We made it easier to do the right thing by putting in more computers.

Dr. Ring: Human error; systems solution.

Dr. Pinzur: And I learned from this error, too. I can see how it could happen to me. Now, before I operate, I walk into the operating room and go through the notes of each patient I’m operating on that day, just to make sure my brain is in tune with exactly what my goals are.

Dr. Ring: When I put the ink on the site, I make sure that it is the correct site. And I make sure the pathology is still an issue. Once it is marked, we are going to do it. I realize how many ways we can get it wrong, so I use every check and confirmation I can think of.

Dr. Burney: I heard about one surgery center that didn’t know its complication rate. Management leaders initiated a conference and reviewed all the surgeries from the previous week. They figured out that they had a 6 percent complication rate, which all agreed was too high. So they got to work.

Dr. Ring: There is some work being done to develop tools that identify adverse outcomes based on claims data and then facilitate quality improvement. A revamped Quality Improvement Conference could review the available data—operation, admission, and infection rates, for instance.

Dr. Grose: High reliability theory posits that every success is built on countless failures, and every failure draws attention away from a large number of successes. We should look at what we do well as much as where our opportunities lie.

Dr. Ring: In management, the focus on opportunities for improvement in the context of what works is sometimes referred to as appreciative inquiry. What do we do well? How do we make this new area as good as the others?

Dr. Pinzur: This is more difficult than an automobile assembly line. There’s a much bigger human factor in medicine than there is in a lot of other industries. For instance, it’s easier to standardize processes in industry.

Dr. Ring: How do we foster a culture that values standardization for its ability to help limit errors and harm?

Dr. Pinzur: If you bring in a consultant to look at an issue within your organization, the first thing he or she does is compare the way you are doing things to best practices at other organizations.

Dr. Ring: You’re talking about benchmarking and getting feedback.

Dr. Marks: You can’t use a cookie-cutter approach, but you still have to hold things to a certain standard to make sure that you are following the best principles. You can’t be dismissive and just say, “Well, we’re just a different organization; none of these things apply.”

Dr. Grose: And to get to high reliability, you have to decrease variability so that it’s easier to spot anomalies and then either enhance them or fix them. You don’t decrease variability to eradicate anomalies.

Dr. Ring: We sometimes talk about having residents who we believe should not be orthopaedic surgeons because they lack dexterity and seem to have trouble acquiring it. What do you do? Teodor Grantcharov in Toronto has addressed aptitude screening using simulator training. He trained people on simulators and saw different learning curves. Some curves were flat (no improvement) and others were erratic (no clear improvement with repetition). He speculated that these patterns might identify people who should not become surgeons. If there are inherent limits to what one can learn and become adept at, it seems important to measure those limits and put people in situations where they can succeed. Is that part of a culture of safety? Should we accept screening for aptitude?

Dr. Reznik: There is a move toward having fellowships be accredited not just on the educational pieces but also on the final skill set of their fellows.

Dr. Ring: I believe that’s the direction residencies are going. It’s called milestones. The idea is that residents don’t pass to the next level after a specific amount of time. Instead, they pass to the next level after demonstrating specific skills and knowledge.

Dr. Pinzur: What concerns me is that we are getting so focused on the technical skills, we may not be adequately focusing on the cognitive and decision-making skills.

Dr. Ring: If we put an emphasis on nontechnical skills—including shared decision making, empathy, and effective communication strategies—we might develop surgeons who think in a way that’s safer. They would have a constant awareness: “I will err. If I pay attention to nontechnical skills, I can count on my team to help me catch my errors before I cause harm.”

Dr. Pinzur: I think there is another issue here. We need to train people who can run effective teams.

Dr. Grose: When I was the chief resident, I was quite bothered by people not following my procedures, and I managed my frustration poorly. I went to my mentor and said, “I’m not happy with the way I’m handling this situation, and I think I should be able to do it better. Maybe be more compassionate instead of screaming and yelling at them?” The answer from my chairman was, “You’re doing a fine job. Those people deserve what they’re getting.”

Dr. Archdeacon: A leader has to lead by example. That’s a hard job. It’s easy to be a leader when everything is going well, but it’s conflict and change that really test leadership. You have to set an example and hold other people accountable to that example.

Dr. Ring: The Patient Safety Committee hopes this exchange helps you and your teams see the value of nurturing your safety culture. We also hope you picked up a few practical tips along the way.