A little more than 13 years ago, the Institute of Medicine (IOM) released its seminal report on patient safety, To Err is Human.
You can say that again. We humans sure do err. It seems to be in our very nature. We err individually and in groups — with or without technology. We also do some incredible things together. Like flying jets across continents and building vast networks of communication and learning — and like devising and delivering nothing-short-of-miraculous health care that can embrace the ill and fragile among us, cure them, and send them back to their loved ones. Those same amazing, complex accomplishments, though, are at their core, human endeavors. As such, they are inherently vulnerable to our errors and mistakes. As we know, in high-stakes fields, like aviation and health care, those mistakes can compound into catastrophically horrible results.
The IOM report highlighted how the human error rate in health care adds up to mind-boggling numbers of injured and dead patients—obviously a monstrous result that nobody intends.
The IOM safety report didn’t just sound the alarm, though; it also recommended a number of sensible steps the nation should take to help manage human error: urging leaders to foster a national focus on patient safety, developing a public mandatory reporting system for medical errors, encouraging complementary voluntary reporting systems, raising performance expectations and standards, and, importantly, promoting a culture of safety in the health care workforce.
How are we doing with those sensible recommendations? Apparently to delay is human too.
Every year or so, of course, we trot out campaigns and outrage about the patient safety problem. We also have lots of programs and initiatives attempting to address safety, institution by institution and through federal agencies. But we arguably have not credibly and systematically addressed the major recommendations in that report. We’re not even close. And every single day we still have major safety problems in almost every aspect of U.S. health care.
For example, we do not have anything like mandatory reporting of misses and near misses. We did get those Patient Safety Organizations (PSOs) through the Patient Safety Act of 2005. PSOs are a loose network of designated entities scattered sporadically across the nation to help gather information on a voluntary basis about some adverse events. They submit that data to a website operated by the Agency for Healthcare Research and Quality (AHRQ). That site is called, somewhat ominously, the “PSO Privacy Protection Center.” It’s not clear what, if anything, happens with that information. It is clear, though, that the primary concern seems to be about protecting the privacy of the information, rather than using it urgently to address safety. More recently, AHRQ requested permission to run a pilot program that would facilitate consumer, as opposed to professional, reporting of medical errors. That experimental program is still under consideration.
In 2009, the Robert Wood Johnson Foundation, through its Pioneer Portfolio, extended a two-year planning grant to a group interested in creating a public-private response to the health care safety challenge, similar to the Commercial Aviation Safety Team (CAST). That group explored the possibility of creating a Public-Private Partnership to Promote Patient Safety (P5S). As CAST does in aviation, the P5S would work to identify and mitigate safety hazards. The group found numerous barriers for such a health care partnership and so far has yet to find its national footing. In health care, in spite of federal legislation and national attention, we nevertheless seem to be having a hard time even creating surveillance systems for reporting errors like the aviation industry has had for years—much less establishing collaborations to handle reported problems.
How about that “culture of safety”? Have we aggressively pursued every possible avenue to ensure that health professionals, patients, and families feel comfortable and empowered to look for, find, talk about, and resolve safety problems? Do most health professionals feel free to talk openly about mistakes and near misses with each other, as a team? These questions are obviously rhetorical. That’s unfortunate because this culture issue may be the linchpin to successful management of error in medicine. We are collectively having a difficult time meeting these decade-old IOM recommendations, especially those requiring vast new data sources, reporting capability, and tricky collaborations. Maybe we should instead look hard at the root of the problem—the human factor—our inherent propensity to err and the ways the professional culture handles that basic fact.
In several recent posts on his terrific Not Running a Hospital blog, former BIDMC CEO Paul Levy touches on these themes. He focuses on Crew Resource Management (CRM), an approach to error prevention used in aviation that should have applicability in health care. In those posts he cites an article that describes the use of CRM in the ICU.
Those authors note that,
“[i]n aviation, non-technical skills, a blame-free environment and Team Situational Awareness (SA) are considered CRM core competencies that require specific and focused training.”
Those same authors also observed:
“The archetypical medical specialist’s personality (highly motivated, A-type, control freak) helps in creating an environment in which a junior team member could feel inhibited to offer input in a senior team with ‘vertical’ leadership. This impacts Team SA, posing a threat to process safety, and thus patient safety.”
Their point, like the IOM’s, is that the human propensity to err is at the very core of our safety problems.
What if a large part of the answer to our safety challenge is not more and more layers of technical capability? What if, instead, this challenge first and foremost requires the basics—like attention to team skills, composition, function, and training? What if we worked hard to teach all health professionals and help all patients and families to be observant, assertive, and vocal about mistakes and potential mistakes? What if we deliberately created enlightened clinical environments in which we embraced our human frailties, rather than worked so hard to deny them?
Although errors will always be part of our nature, they do not necessarily control our destiny. Remember, Alexander Pope didn’t just say, “To err is human.” His full line is important: “To err is human; to forgive, divine.” It’s not so much the errors; we all make them. Maybe it’s what we do together with those errors that ultimately matters most.
Michael W. Painter, JD, MD is the senior program officer at the Robert Wood Johnson Foundation.
What’s wrong with medical care in the US?
It’s a business!
The purpose of a business is to maximize income for the business entity. The purpose of medical care should be to maximize the health and wellbeing of the patient.
Private and governmental tinkering with the system has produced an entity that does neither but has created a boon to the insurance industry.
Consider an earlier time in America when doctors were dedicated to the patient’s health and made accommodations for those who had difficulty paying.
Hospitals, for the most part, were owned and operated by the county and were subsidized by the taxpayers.
Today most doctors try to maximize their income. That’s the business mentality. They selected the field of study because of the potential income. Hospitals are corporate organizations dedicated to maximizing profits. They were organized because of the potential income. Government assistance comes from the Medicare and Medicaid programs and protection of hospital territories via the “Certificate of Need” processes.
As a consequence of the current system, medi-business prospers, medi-insurance prospers, and medical providers prosper, but the patients and other taxpayers suffer. The system fosters unnecessary and expensive tests, prescriptions and procedures.
So, one solution is to eliminate the profit motive, which would take the medical care system out of the capitalistic system. That would be socializing medicine in the US. But, consider the opposition to that concept:
During the Vietnam War years, a bill was introduced in the House of Representatives to create a Military Medical College. It would be similar to the military academies, with strict entrance requirements, and its purpose was to provide enough doctors to fill the needs of the Armed Forces in combat without drafting more practicing doctors. That was HR #1 in the House for ten years, and because of lobbying opposition by the AMA and its allies, the bill never passed the House. The rationale espoused to defeat this resolution was that the government could not provide sufficient education and training of doctors. Of course, the service academies have provided Army, Navy and Marine officers to defend the US for a couple of centuries. The real reason for the opposition seems to have been to prevent the creation of too many doctors who would engage in price competition for business when they returned to civilian life. Severe quotas for medical schools have always limited the number of doctors, thus preventing any price competition. The Certificate of Need process eliminates price competition among hospitals.
Now, in addition to the AMA, we have the powerful lobbies of insurance companies, pharmaceutical companies, corporate hospitals and other corporate entities engaging in medical care. No reasonable proposal to socialize medicine will ever pass in Congress as long as it is controlled by those whose election or re-election depends on their support for the status quo.
So, wake up America! Don’t vote for incumbents or new office seekers who won’t commit to a single payer health care system. Eliminate the middleman (insurance) and control costs in a manner similar to the current Medicare safeguards.
This would be the first step in taking business out of medical care.
We might also like to require a routine second opinion for the need for certain tests and procedures. But let’s take one step at a time.
I think one of the big problems is that our patients expect that we are NOT human. They expect perfection. Fear of lawsuits prevents us from disclosing mistakes, especially those that have no impact on the patient. Systems issues, especially with regard to computer programs and meaningful use, are not correcting mistakes as they were expected to. And we actually have implemented a whole boatload of patient safety initiatives such as National Patient Safety Goals. In Surgery we now have time-outs before the time-outs. The result has mostly been more paperwork, not more safety. Patients are sicker, we have more production pressures, and frankly medicine is not an exact science. Every patient is different and therefore it’s impossible to legislate healthcare at the direct patient-care level.