Lessons From Near Misses

by Paul Levy www.runningahospital.blogspot.com
Published on Sep 08, 2015

Paul Levy's commentary is based on an article in the NY Times written by surgeon Bud Shaw and published Sunday, September 6, 2015. Dr. Shaw writes that despite overwhelming evidence of a growing clinical crisis, the professionals charged with caring for his daughter made a series of conscious decisions to disregard the red flags that herald an impending crisis. If you draw any parallels to the Lewis Blackman case, then you're as concerned as we are about the failure to rescue. Read more about the Blackman case and the failure of human cognition at http://www.academia.edu/5455690/Human_Cognition_and_the_Dynamics_of_Failure_to_Rescue_The_Lewis_Blackman_Case


If our goal is to lead our places to be learning organizations, we must help our folks understand that near misses are gems that should stimulate us to focus on underlying process failure. Why? Well, for every adverse event that is reported in a service or manufacturing organization, there are literally hundreds of near misses.  Each one represents an opportunity to correct a systemic problem that could someday lead to a catastrophic event. Let's look at a recent example from health care.

Bud Shaw published a powerful and deeply disturbing story in the New York Times this past week.  Shaw, a surgeon, was at his daughter's bedside in the hospital when he recognized that she had a serious problem:

I’ve been watching the monitor for hours. Natalie’s asleep now and I’m worried about her pulse. It’s edging above 140 beats per minute again and her blood oxygen saturation is becoming dangerously low. I’m convinced that she’s slipping into shock. She needs more fluids. I ring for the nurse.

I know about stuff like septic shock because for more than 20 years I was a transplant surgeon, and some of our patients got incredibly sick after surgery. So when I’m sitting in an I.C.U. in Omaha terrified that Natalie, my 17-year-old daughter, might die, I know what I’m talking about. I tell the nurse that Natalie needs to get another slug of intravenous fluids, and fast.
 
The hospital's staff was unresponsive. Shaw broke into the crash cart and administered the saline solution himself. Luckily, things worked out.
 
After three days in the hospital, Natalie got better. A new chest X-ray showed that there was much less fluid in her chest. Her fever resolved. They changed one of the antibiotics and the nausea she had had all but disappeared. They told her she could go home. They prescribed antibiotics for her to take at home, and removed her IV catheter.
 
We could say a lot about this incident. The part I'd focus on is what happened after. Shaw doesn't say, but I'm willing to bet that there was minimal or no debriefing of this case by the hospital staff. I say that not because I know the facts: It's just that the pattern of behavior related by Shaw is indicative of a hospital that is well behind when it comes to clinical process improvement.
 
First, though, let's look at the science, things that are taught in every medical school and nursing school and every residency training program.  Failure to rescue is a major cause of mortality and morbidity in hospitals.  Its causes, though, are multifactorial and the condition often presents itself in subtle ways.  Patient safety expert Robert Wachter has noted:
 
Analysis of deaths and unexpected cardiopulmonary arrests in hospitals often find signs of patient deterioration that went unnoticed for hours preceding the tragic turn of events. (Understanding Patient Safety, page 283.)
 
How might the hospital's clinical leadership have helped people learn from this near miss?  The discussion must be set up to "be hard on the problems and soft on the people," making clear that the debriefing is not an investigation targeted at finding fault or assigning blame.  It is an examination of the elements of our workflow that could lead other well-intentioned doctors and nurses to similar results in the future.  Let's look at just a few such elements that might be relevant in this case.
 
What was it that led us to premature closure in Natalie's case?  The symptoms were there to see, yet the doctors and nurses had decided that it wasn't serious. How can we improve our ability to avoid the cognitive error of diagnostic anchoring?
 
What could our team learn from the fact that a concerned parent could not get the staff to respond? Do we have a protocol in place to activate a rapid response team when key patient indicators warrant? Do we have a patient- or family-activated rapid response program?
 
Do we use any predictive analytic tools to assess severity of illness that can be tracked over time?
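The kind of track-and-trigger tool that question refers to can be as simple as a scored checklist of vital signs, recomputed each time observations are taken. Here is a minimal sketch in Python, loosely modeled on NEWS/MEWS-style early warning scores; the scoring bands and the escalation threshold are simplified assumptions for illustration, not clinical values:

```python
# Illustrative early-warning-score calculator, loosely modeled on
# NEWS/MEWS-style track-and-trigger systems. Bands and the trigger
# threshold are simplified for illustration -- NOT clinical guidance.

def band(value, bands):
    """Return the points for the first (low, high, points) band containing value."""
    for low, high, points in bands:
        if low <= value <= high:
            return points
    return 0

def early_warning_score(heart_rate, spo2, resp_rate, temp_c):
    """Sum per-vital-sign scores; higher totals mean greater deterioration risk."""
    score = 0
    score += band(heart_rate, [(0, 40, 3), (41, 50, 1), (51, 90, 0),
                               (91, 110, 1), (111, 130, 2), (131, 999, 3)])
    score += band(spo2, [(0, 91, 3), (92, 93, 2), (94, 95, 1), (96, 100, 0)])
    score += band(resp_rate, [(0, 8, 3), (9, 11, 1), (12, 20, 0),
                              (21, 24, 2), (25, 99, 3)])
    score += band(temp_c, [(0.0, 35.0, 3), (35.1, 36.0, 1), (36.1, 38.0, 0),
                           (38.1, 39.0, 1), (39.1, 45.0, 2)])
    return score

# A patient like the one described -- pulse above 140, falling oxygen
# saturation -- accumulates points fast even when other vitals look normal:
score = early_warning_score(heart_rate=142, spo2=90, resp_rate=18, temp_c=37.2)
action = "call rapid response team" if score >= 5 else "continue routine monitoring"
print(score, "->", action)
```

The point of a tool like this is that it can be tracked over time: a score trending upward across successive observations is exactly the kind of subtle deterioration signal that, in cases like Natalie's, otherwise goes unnoticed until the crisis is obvious.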
 
My late colleague Donald Schön once described a learning organization as one "capable of bringing about its own transformation."  This is a powerful concept. It suggests that sustained improvement in a place requires--almost as a Zen master might say--that change must come from within.  Near misses provide excellent opportunities for that kind of learning if the leader engenders a sense of responsibility to notice them and act on the information they offer.