Policy Forum
September 2011

Patient Safety Organizations Are Step 1; Data Sharing Is Step 2

Allan S. Frankel, MD
Virtual Mentor. 2011;13(9):642-646. doi: 10.1001/virtualmentor.2011.13.9.pfor1-1109.

 

The health care industry will forever require careful oversight to ensure safety. It suffers from the ubiquitous and very human tendency to reach toward desired goals and concentrate on attaining them while putting fewer resources into the commensurately necessary safety nets and safety measurement systems. There is no reason to presume this trait will change. We see it manifest wherever humans push the envelope: in deep-sea oil exploration (Deepwater Horizon) and nuclear power (the Fukushima Daiichi plant) [1]. We obtained oil and nuclear power, presumed safety, and paid great human and environmental costs because of inadequate safety defenses. The difference in health care is that our disasters tend to be many episodes of single deaths and individual suffering rather than a single episode with many deaths and injuries. As a result, meaningful patterns of systemic failure are difficult to identify and easy to ignore. To safeguard patients, we must attend to failure.

The Patient Safety Act of 2005 [2] created patient safety organizations (PSOs) to confidentially collect and aggregate data on adverse events from health care organizations on a large enough scale to generate insights of value for clinical improvement. There is precedent for the act in the Aviation Safety Reporting System (ASRS), which serves as a reminder that confidential reporting systems over time can be effective. The ASRS had detractors for years after its inception but has proved to be of great value [3].

The PSOs protect the confidentiality of adverse event data by building upon peer review, the mechanism states use to shield an organization's quality and safety data from discovery in lawsuits, in part to aid learning and improvement. In most states, the protection built into peer review ends when quality and safety data leave the walls of the health care institution. The Patient Safety Act extends legal protection to a PSO to facilitate the collection of a wide range of data from many organizations, but with a caveat: the protection and confidentiality afforded to PSOs carry a mandate that the aggregated data be analyzed for learning and improvement. The logic is sound; the goal of the PSO is to generate action, not merely to collect data.

Patient safety organizations move us in the right direction. The authors of the Patient Safety Act recognized the many challenges of collecting accurate data [4]: how, for example, human beings resist admitting wrong [5, 6] yet have a propensity to blame [7, 8]; the detrimental influence of malpractice litigation on our learning [9]; the myths about patient expectations after an adverse event [10]; and the glacially slow incorporation of effective teamwork and improvement into our culture [11]. Given these challenges, it is little surprise that physicians and organizations underreport adverse events and won't voluntarily make data available for public scrutiny.

Why Congress would confer the privilege of legal protection on PSOs and limit public access to their data, and why this is ethically practical and a reasonable but incomplete first step, warrants some reflection. Characterizing health care's efforts at self-policing, and identifying the factors that have influenced them, will help put the current situation into context.

Health care was once offered through a guild of independent practitioners who occasionally plied their trade within common walls called hospitals. In that setting, physicians were responsible for self-policing as the mechanism to ensure the safest and most reliable care [12]. Some of their efforts were laudable, others offensive. The American Medical Association's 1847 Code of Medical Ethics required that members not criticize other members, an example of physicians closing ranks against other clinicians and patients. In her work on medical ethics, Virginia Sharpe relates how this compact resulted in the burning, before it could be made public, of a scathing early-1900s report on the quality of medical schools in the United States; that suppression set in motion the independent study ultimately known as the Flexner Report. The gentlemanly code [13] frequently, but not always, promoted ethical behavior; it also helped shape the complex, error-prone system of health care we have in place today.

Although health care systems in most advanced countries are now large, industrial, and complex, the old model of self-policing has remained fully intact, a relic that is useful but inadequate given that so much of care today is a team effort. There is a hodgepodge of publicly available information, obtained through required regulatory and governmental reporting, that ostensibly measures the safety of health care. However, these metrics are only partly the right kind of data, and they are not particularly accurate. Whole sets of cultural and risk data are ignored, and a considerable amount of information collected by the health care industry remains unavailable to the public. Health care is not unique in measuring the wrong things.

The book Moneyball explains how the RBI (runs batted in), a metric used for a century to characterize a baseball player's excellence, is influenced mostly by chance: a given player's RBI depends on the players who happen to bat and get on base before him. One professional team, the Oakland Athletics, capitalized on this fallacy for a number of years with great success, spending one-sixth of what other good teams spent on player salaries and still reaching the playoffs [14]. Similarly, in health care we rank the best 100 hospitals [15], the best 50 hospitals [16], the best international health care institutions [17], and the like, using measures that may have no bearing on the safety and reliability of care. The bald truth is that even those deeply knowledgeable about health care have no reliable set of measures with which to identify the safest and most reliable institutions.

The reports about best hospitals are based on reputation, imperfect quality data, and self-assessment and do not include the very important measures of culture, risk, and reliability. Furthermore, in what could be perceived as a conflict of interest, some organizations publish metrics of safety and then offer commercial services to help those same hospitals improve. We can achieve a materially better understanding of safe and reliable health care if we aggregate public health care data and other selected data that is now strictly private.

It is in this setting that Congress addressed the practical aspects of collecting data about adverse events, near misses, and errors. So far, however, the PSOs have not achieved anything close to their potential. Collecting adverse event data manually is difficult, and human beings do not like to report their own errors of omission and commission or their lapses in judgment or memory. In fact, they will not reliably do so, and so far they have not. PSOs may well become the repositories of increasingly important data, and they may play a major role in safeguarding the learning process that our health care industry needs. But part of this future success will rely not on person-dependent reporting but on a combination of automation and person-dependent oversight, along the lines sketched below.
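To make that combination concrete, here is a minimal sketch, assuming an invented record format: an automated screen scans records for "triggers" suggestive of harm and routes flagged cases to a human reviewer instead of waiting for voluntary reports. The fields, trigger rules, and thresholds are illustrative only, loosely echoing trigger-tool approaches rather than any PSO's actual method.

```python
# Hypothetical sketch: automated trigger screening with human review.
# Record fields and trigger rules are invented, not a real EHR schema.
from dataclasses import dataclass, field

@dataclass
class PatientRecord:
    patient_id: str
    medications: list = field(default_factory=list)   # drugs administered
    lab_results: dict = field(default_factory=dict)   # test name -> value
    readmitted_within_30_days: bool = False

def screen_for_triggers(record: PatientRecord) -> list:
    """Return the triggers present in one record."""
    triggers = []
    if "naloxone" in record.medications:              # opioid reversal agent
        triggers.append("possible opioid over-sedation")
    if record.lab_results.get("INR", 0.0) > 6.0:      # over-anticoagulation
        triggers.append("possible anticoagulant harm")
    if record.readmitted_within_30_days:
        triggers.append("early readmission")
    return triggers

def build_review_queue(records):
    """Automation proposes; a human reviewer disposes."""
    queue = []
    for record in records:
        triggers = screen_for_triggers(record)
        if triggers:
            queue.append((record.patient_id, triggers))
    return queue

records = [
    PatientRecord("A1", medications=["morphine", "naloxone"]),
    PatientRecord("A2", lab_results={"INR": 7.2}),
    PatientRecord("A3"),   # clean record; never reaches the reviewer
]
for patient_id, triggers in build_review_queue(records):
    print(patient_id, "->", triggers)
```

The design point is the division of labor: the census of records supplies sensitivity that voluntary reporting lacks, while the human reviewer supplies the judgment that no trigger rule can.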

The Internet, easy access to computers, and electronic health records are making real-time electronic collection of adverse events in large health systems a reality, theoretically bringing us closer to a real national assessment of the safety of care. A census approach that examines hospital databases might finally produce a view of the "real" number of potential and adverse events, the denominator in the risk equation. That number has been elusive, sought after in the 20 years since the 1991 Harvard Medical Practice Study identified ours as an error-prone industry [18]. It is now on the horizon and brings us closer to quantifying risk in hospitals in a standard and comparable fashion using meaningful measures. Combined with the increasing sophistication of how we measure culture and attitude [19], we might finally be able to identify the organizations capable of delivering stellar care and to pinpoint those most in need of improvement. This requires measures of culture, processes, outcomes, and adverse events. PSOs can collect all this data.
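As a rough illustration of the risk arithmetic alluded to above, the sketch below, using invented counts, computes an event rate against an exposure denominator (patient-days) and contrasts a voluntarily reported tally with the larger tally a census of the record base might detect.

```python
# Hypothetical sketch of the denominator problem. All numbers are
# invented for illustration; no real hospital data is implied.

def adverse_event_rate(events: int, patient_days: int,
                       per: int = 1000) -> float:
    """Risk = events / exposure, reported per 1,000 patient-days,
    a common convention for comparing hospitals of different sizes."""
    if patient_days <= 0:
        raise ValueError("exposure (patient-days) must be positive")
    return events / patient_days * per

reported_events = 12     # what voluntary reporting happens to capture
detected_events = 87     # what a census of the full record base finds
patient_days = 25_000    # illustrative exposure for one hospital-year

print(adverse_event_rate(reported_events, patient_days))  # 0.48 per 1,000
print(adverse_event_rate(detected_events, patient_days))  # 3.48 per 1,000
```

The gap between the two rates is the point: without a census-derived count and a standard exposure measure, comparisons among hospitals mostly reflect differences in reporting, not differences in safety.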

But the Patient Safety Act is flawed: PSOs are not required to share their data, which limits our ability to achieve a much-needed national perspective. Regardless, it is a step in the right direction. Organizations are, for the first time, getting their hands around the measurement of health care culture in earnest and beginning truly to measure risk. The culture and risk insights that ensue [1] will change the way leaders manage health care, alter how we view organizational excellence, and most likely lead to safer and more reliable care.

PSOs make sense for learning, and confidentiality is appropriate to increase the amount and quality of data collected. To reach full potential, however, PSOs must find ways, or be required, to aggregate their findings.

Maybe it is wishful thinking, but at some point an organization with a prescient leader who understands reliability and the factors that predispose to excellence may make some of this data publicly available and, rather than implode, improve. The Lexington, Kentucky, Veterans Affairs Medical Center did so when it began disclosing adverse events to patients in 1987 [20, 21], followed very successfully by others such as the University of Michigan health system [22]. In those cases it took singular individuals to start the process; more such leaders would herald a shiny new day.

References

  1. In place of safety nets: don't assume disasters won't happen at the frontiers of technology—presume they will. Economist. April 20, 2011. http://www.economist.com/node/18586658.

  2. Agency for Healthcare Research and Quality. The Patient Safety and Quality Improvement Act of 2005. http://www.ahrq.gov/qual/psoact.htm. Accessed August 17, 2011.

  3. NASA Aviation Safety Reporting System. Callback. 1996;204. http://asrs.arc.nasa.gov/docs/cb/cb_204.pdf. Accessed June 2011.

  4. Chassin M. Achieving and sustaining improved quality: lessons from New York State and cardiac surgery. Health Aff (Millwood). 2002;21(4):40-51.
  5. Cullen DJ, Bates DW, Small SD, Cooper JB, Nemeskal AR, Leape LL. The incident reporting system does not detect adverse drug events: a problem for quality improvement. Jt Comm J Qual Improv. 1995;21(10):541-548.
  6. Mariner WK. Medical error reporting: professional tensions between confidentiality & liability. Issue Brief (Mass Health Policy Forum). 2001;(13):1-35.

  7. Wahlberg D. Her pain’s still raw, nurse says suspension over, but she can’t work in a U.S. hospital for five years. Madison.com. July 20, 2008. http://host.madison.com/news/article_67e9a9ae-0357-5446-b10e-158125f23283.html. Accessed August 17, 2011.

  8. Hospital administrator takes plea deal in whistle-blower retaliation case. Outpatient Surgery Magazine. March 24, 2011. http://www.outpatientsurgery.net/news/2011/03/27-hospital-administrator-takes-plea-deal-in-whistle-blower-retaliation-case. Accessed August 17, 2011.

  9. Hoyt RE, Hall EB. Evidence shows changing roles of health care risk managers. J Healthc Risk Manag. 2003;23(2):7-11.
  10. Powell S. When Things Go Wrong: Responding to Adverse Events: A Consensus Statement of the Harvard Hospitals. Massachusetts Coalition for the Prevention of Medical Errors; March 2006. http://www.macoalition.org/documents/respondingToAdverseEvents.pdf. Accessed August 17, 2011.

  11. Frankel AS. The Essential Guide for Patient Safety Officers. Oakbrook Terrace, IL: Joint Commission Resources; 2009.

  12. Sharpe VA. Behind closed doors: accountability and responsibility in patient care. J Med Philos. 2000;25(1):28-47.
  13. Liang BA. Law, health care, and ethics: detoxifying the lethal mix. Virtual Mentor. 2004;6(3):146-149. http://virtualmentor.ama-assn.org/2004/03/oped1-0403.html. Accessed August 17, 2011.

  14. Lewis M. Moneyball: The Art of Winning an Unfair Game. New York: W.W. Norton; 2003.

  15. Best hospitals. US News and World Report. http://health.usnews.com/best-hospitals. Accessed August 17, 2011.

  16. HealthGrades America’s 50 best hospitals: new study finds more than a half million preventable deaths in last decade. HealthGrades. February 23, 2011. http://www.healthgrades.com/cms/ratings-and-awards/2011-Americas-50-Best-Hospitals-Award-Announcement.aspx. Accessed August 17, 2011.

  17. Ranking web of world hospitals: January 2011: hospitals of United States of America. Cybermetrics Lab of the Consejo Superior de Investigaciones Cientificas (CSIC). http://hospitals.webometrics.info/rank_by_country.asp?country=us. Accessed August 17, 2011.

  18. Brennan TA, Sox CM, Burstin HR. Relation between negligent adverse events and the outcomes of medical-malpractice litigation. N Engl J Med. 1996;335(26):1963-1967.
  19. Frankel AS. Revealing and resolving patient safety defects: the impact of leadership walk rounds on frontline caregiver assessments of patient safety. Health Serv Res. 2008;43(6):2050-2066.
  20. Kraman SS. A risk management program based on full disclosure and trust: does everyone win? Compr Ther. 2001;27(3):253-257.

  21. Zimmerman R. Doctors’ new tool to fight lawsuits: saying “I’m sorry.” Malpractice insurers find owning up to errors soothes patient anger. J Okla State Med Assoc. 2004;97(6):245-247.
  22. Boothman RC. Apologies and a strong defense at the University of Michigan Health System. Physician Exec. 2006;32(2):7-10.


The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.