AMA Journal of Ethics. November 2015, Volume 17, Number 11: 1073-1078.
History of Medicine
Promoting Cost Transparency to Reduce Financial Harm to Patients
The escalating costs of health care call for the promotion of cost transparency and physician training on how to discuss medical costs with patients.
Reshma Gupta, MD, MSHPM, Cynthia Tsay, MPhil, and Robert L. Fogerty, MD, MPH
Medical care is continuously evolving as new drugs are discovered and new technologies are mastered. Along with these strides, however, come added costs. With nearly one in every five dollars spent in the United States going to health care, the sheer volume of money that changes hands in the health care sector is enormous. How did we get here?
Early Twentieth Century
Up until the end of the nineteenth century, most doctors’ visits took place in patients’ homes. Charges for treatments and procedures were determined through a negotiation between the physician and patient. With the development (by Joseph Lister) and widespread adoption of aseptic techniques by the 1890s, the modern hospital emerged as a place of medical advancement and treatment. People previously treated at home began seeking treatment at hospitals, which had to recoup building and operating costs. Estimates suggest that the percentage of an average US family’s medical bill dedicated to hospital charges almost doubled in the first third of the twentieth century—from 7.6 percent in 1918 to 13 percent in 1929 [3, 4]. By 1929, hospital expenses had nearly quadrupled the average family’s annual health care charges, from $67 to $261. These alarming statistics, coupled with the aftermath of World War I and the Great Depression, led reformers to call for a national health insurance system or an appropriate community agency focused on the promotion of group practice, equitable distribution of the costs of medical care among social groups and over time, and an emphasis on preventive medicine.
By the early 1900s, established professional standards for physicians had emerged, and Abraham Flexner helped to incorporate them into medical education. The profession responded to these improvements in medical science, education, and training with a division of labor and an increase in medical specialization. The Committee on the Costs of Medical Care (CCMC) argued in 1933 that variation in health care use and more frequent contact with medical practitioners were driving up health care expenditures for individual families. Physicians’ decisions about what to charge patients for services were influenced by a wide variety of factors not directly related to providing what we now call “high-value care”—such as the rising cost and length of medical education, hospital and administrative fees, and increased competition—a scenario that some would argue continues today.
After the world wars, the field of medicine grew rapidly, drawing on experience gained from treating battlefield wounds and mental conditions, and health care began to approach what we know it to be today. Antibiotics came into wide use, childbirth increasingly became a hospital event, and chemotherapy was first used clinically in 1942. Medicine was fully entrenched as a science, and, as medical knowledge grew, so too did cost, that is, the monetary burden of providing a service. (Cost is distinct from charges, the amount billed by the entity providing the service, and from payments or reimbursements, the amount that entity receives.)
However, even as costs rose, cost information was sometimes made available to patients so they could make informed financial decisions about their care. In 1954, for example, Grace-New Haven Hospital presented all expectant mothers with the cost of room and board for the upcoming delivery. The prices of different types of rooms were handed to the patient, much as we today place identification bands on patients and obtain signed informed consent. As costs increased and care became more complex, however, this transparency disappeared.
Since the last third of the twentieth century, the doctor’s toolkit has grown to encompass more technology, treatments, and tests, and costs have grown with it. Organ transplantation, elaborate cardiac surgeries, and life-sustaining technology not only increase the cost of care enormously but also keep people alive to incur even more charges in the future. In 1960, US health care expenditures were only $27.4 billion, or $147 per person. By 2004, $1.9 trillion was spent on health care in the United States—a 36-fold increase from 1947 when adjusted for inflation—or $6,508 per person. The resources dedicated to health care are becoming so great that financial harms are visited upon patients, who often lack the information to make fully informed financial decisions about their care. Prominent authors have discussed these financial “side effects” or “toxicities” and exhorted medicine to “do no (financial) harm” [12-14].
Health care is the fourth largest share of household expense for the typical family in this country, behind housing, food, and transportation. More than three-quarters of polled Americans with health insurance in 2005 reported being concerned about their ability to pay medical bills for routine care, and, in 2006, 32 percent of polled Americans reported worrying about financial harm in the event of a serious illness or accident. Recently, it has been reported that more than half (52.1 percent) of all debts in the US are due to medical expenses. These debts may in part be incurred because of a lack of price transparency and communication between patients and physicians concerning medical prices. Patients have reported wanting to have these conversations with their clinicians.
Why has cost control only recently become a rallying cry among clinicians, given what is at stake for our patients and the nation? Major reasons include a lack of information about costs among both physicians and patients and gaps in physician training about financial harms.
Cost negotiations have changed over time. Prices are no longer distributed to patients in advance. Now, closed-door negotiations between hospitals, clinics, and other provider organizations and insurance companies set complex fee schedules, a practice that results in physicians’ ignorance of costs and patients’ making purchases without knowing the prices or completely understanding the services they are receiving. As a result, the cost of a medical service may be drastically lower than the charges sent to the insurance company for reimbursement and the charges that patients see in their medical bills [20, 21].
We suggest that medical centers take the following steps to promote cost transparency and to train physicians and patients how to have open discussions about costs and the risks of financial harm:
Patients and physicians have a joint ethical responsibility to discuss medical costs and to avoid financial harms for patients and society at large. Simply put, the United States cannot withstand the escalating cost of health care indefinitely. However, we believe that the recommendations outlined above, in combination with national policies and incentives, can improve cost transparency, help avoid financial harms, and promote ethical medical practice. Moving forward, we must reflect on these cost trends, identify key lessons, and promote efforts to rapidly evaluate and scale interventions that improve the delivery of high quality care at lower costs.
Reshma Gupta, MD, MSHPM, is an internal medicine physician who is supported by the VA Office of Academic Affiliations through the VA/Robert Wood Johnson Clinical Scholars Program at the University of California, Los Angeles. She is also the director of the joint Costs of Care/American Board of Internal Medicine Foundation Teaching Value in Healthcare Learning Network.
Cynthia Tsay, MPhil, is a second-year medical student at the Yale School of Medicine in New Haven, Connecticut. She is interested in bioethics and in how the complex social forces that historically have shaped the development of new trends and beliefs in medicine can inform future global health policy and reform.
Robert L. Fogerty, MD, MPH, is an assistant professor of medicine at Yale University in New Haven, Connecticut, and a practicing hospitalist at Yale-New Haven Hospital, where he is associate chief of the Generalist Firm and a member of the core faculty in the Internal Medicine Residency Program.
The viewpoints expressed in this article are those of the author(s) and do not necessarily reflect the views and policies of the AMA.
© 2015 American Medical Association. All Rights Reserved. ISSN 2376-6980