Name Your Price: The Costs of Commercializing Health

Sonia Sarkar, NLC Boston

Part two of a series on radical healing — an exploration of the fundamental tension between our status quo healthcare system and achieving actual health. Catch up on the entire series here.

Reading a medical bill can feel like attempting to decipher IKEA instructions written in a foreign language. $500 for a band-aid? Multiple service charges for what felt like a single physician encounter? An explanation of benefits that appears to indicate that you, the patient, aren’t actually receiving any benefits at all?

In today’s U.S. healthcare system, we’ve resigned ourselves to this reality — one in which it is nearly impossible, at least at the patient level, to understand the cost, price, and payment of the medical care we receive. But it wasn’t always this way. As the possibilities of medicine accelerated over the past century, yielding breakthroughs in treating serious diseases, our system grew up alongside it, morphing into today’s frustrating web of business interests and intermediaries.

As Elisabeth Rosenthal explains in her book, An American Sickness: How Healthcare Became Big Business and How You Can Take It Back, medical treatments at the beginning of the 20th century were largely ineffective and simplistic. Hospitals, typically supported by mission-based religious groups, essentially served as end-of-life hospices for the severely ill.

However, as science advanced in the 1920s, the concept of insurance emerged: workers could pay a small amount per month to guarantee a covered hospital stay of up to three weeks. This way, hospitals avoided unpaid bills and patients avoided the near-certain bankruptcy that a sudden health emergency could bring.

The concept of health insurance was born — and proliferated quickly. According to Rosenthal, “Three million people had signed up by 1939 and the concept had been given a name: Blue Cross Plans. The goal was not to make money, but to protect patient savings and keep hospitals — and the charitable religious groups that funded them — afloat. Blue Cross Plans were then not-for-profit.” (Today, some of the Blues are still nonprofit, which allows them to maintain tax-exempt status and receive multimillion-dollar tax breaks, while others are for-profit corporations making billion-dollar profits.)

The development of technologies and medications with major life-saving potential in the 1930s and 1940s, however, cemented the value of health insurance and gave rise to a brand-new industry: after World War II, employers began offering health insurance as a pre-tax benefit, and by 1955, 60 percent of Americans had health insurance.

The creation of an enormous new market paved the way for competitors that operated as for-profits and were not beholden to the mission statements of the Blues. These companies, Rosenthal writes, “accepted only younger, healthier patients on whom they could make a profit. They charged different rates, depending on factors like age, as they had long done with life insurance. And they produced different types of policies, for different amounts of money, which provided different levels of protection.” Fast-forward to today: costs climb wildly and are typically passed on to the patient, while the insurance company takes its cut.

The commercialization of healthcare doesn’t stop at insurance, however. As a public health undergraduate at Johns Hopkins, I was surrounded by (and took a great deal of pride in) constant reminders that Hopkins was one of the best healthcare institutions in the country, if not the world. After passing those posters and announcements, I would arrive in class and listen to lecture after lecture in which professors described health disparities in Baltimore that were among the worst in the country.

In the early 2000s, for example, infant mortality in the city was twice the national average. Life expectancy in Poppleton — just a mile from the Johns Hopkins Hospital — was 63 years, lower than the average life expectancy in Rwanda or Nepal. Even today, nearly 60,000 Baltimore children (10 percent of the city’s population) are at risk of lead poisoning.

It made no sense to me — my own alma mater’s hospital sat just blocks from where these health injustices were occurring. Theoretically, patients in Baltimore had access to the best medicine our system has to offer. So why was that still not enough to deliver actual health?

Following the precedent set by health insurance companies, healthcare delivery today is driven predominantly by financial forces — forces that shape not only the hospitals and clinics that serve patients, but also the decisions made by healthcare professionals and even the ways that we as patients seek out care.

Today, healthcare institutions — many of which still operate as not-for-profit organizations, reflecting their original legacy — must walk the line between patient-focused missions and a culture of revenue generation. This is not to say that hospitals aren’t filled with passionate, pragmatic clinicians and administrators who understand the roles that their institutions play not only in the lives of individual patients, but also as employers and anchors within communities. The moral mission of healthcare continues to be a core pillar of the industry.

Rarely, however, is it the strongest pillar. The financial incentives set in motion by the insurance industry create similarly strong forces for hospitals — and in a society where market culture is dominant, operating as a “business” focused on revenue and compensation is seen as best practice. In many ways, the not-for-profit nature of hospitals creates an even riper environment for this behavior, as there is little external accountability in the form of shareholders (let alone patient boards or community councils).

The competing values of hospitals are often directly visible. In An American Sickness, Rosenthal describes one health system that donated $250,000 to build a teaching hospital in Haiti after the 2010 earthquake even as it announced a $150 million venture capital fund led by a former Amazon executive.

Through my work in healthcare and public health, I’ve seen these contradictions up close. Business acumen is highly prized in healthcare settings; innovative ideas focused on addressing the actual root causes of health disparities, not so much. Seemingly hamstrung by the existing financing structure of healthcare, sector leaders and frontline staff alike demand that any attempt at improving health outcomes also come with a clear “return on investment” — a frame far removed from the service missions on which many of these institutions were founded.

Of course, the key question is where all of this commercialization leaves the patient. Too often, the patient ends up at the receiving end of that bill, wondering whether (for those who are insured) it is even possible to pay their share or (for those who are not) where to turn in a seemingly impossible situation. Within this paradigm, it’s no surprise that access to even the highest-quality, affordable care doesn’t necessarily result in improved health for people, their families, or their communities. Indeed, as many of the Baltimore residents I’ve worked with have shared, the existing healthcare system often serves to deepen the inequalities that already exist.

Next month, we’ll explore this topic of health equity in more detail. Why does it matter, for all of us? And what can we actually do about it, given that it seems like such an intractable and expansive problem?

Sonia Sarkar is the former Chief Policy and Engagement Officer for the Baltimore City Health Department, and previously served as Chief of Staff and Special Advisor at Health Leads, a social enterprise focused on essential needs for health, such as food and housing. Currently, she is a Robert Wood Johnson Foundation Culture of Health Leader, a Health Policy Fellow at New America, and a Center for a Livable Future-Lerner Fellow at the Johns Hopkins School of Public Health.