Shortening The Moscow March Of U.S. Medical Research

Chad Swiatecki
Vital Signs Signature Course
3 min read · Mar 31, 2017

Imagine if we had to wait 15 years between each new version of the iPhone or new car model.

That interminably long development and innovation cycle is pretty much the standard operating procedure in the medical world, and one more glaring example of why health care in the U.S. is so fractured and inefficient.

In his Vital Signs lecture, Dell Medical School Dean Clay Johnston shared pieces of his own journey through the prolonged and frustrating process of trying to find a better way to treat patients.

Johnston’s example focused on work that began early in his career, when he noticed a large disparity in incidence of transient ischemic attacks — known colloquially as “mini strokes” — between the hospital where he was a resident and another hospital in the same city.

That was in 1999.

Now, 18 years later, the government-funded study that Johnston launched to compare treatments for TIAs and prevent follow-up strokes and other complications is still two years from completion. And it will take another five years beyond that for the findings to result in any meaningful change in care delivery.

It’s safe to say that in the nearly two decades since Johnston first set out to find a better way to treat patients with a TIA, tens of thousands of patients outside the study have received care that could have been recorded, analyzed and added to the body of knowledge on how best to treat the condition.

To illustrate the point, Johnston compared Napoleon’s ill-fated march to Moscow with the laborious, prolonged process of preparing a grant, beginning research, achieving meaningful results and then integrating those results into everyday medical practice.

The problem is that the U.S. health care system treats care delivery and research as separate, and in the process passes up countless opportunities to create a gigantic corpus of data on best health care practices for all diseases and conditions.

It might be unsettling to think of ourselves as guinea pigs of sorts, but Johnston is frank in sharing that roughly 50 percent of clinical decisions are based on a best guess rather than on research the provider has reviewed.

Giving meaning to the case-by-case data that providers across the country generate every day, and then coalescing that data and making it available through innovative electronic health records platforms, would be transformative.

Rather than studying a small, controlled cohort of patients for a decade or more, big data systems could collect anonymized information about treatment results everywhere for a given condition and determine best practices on an ongoing, real-time basis.

This would not only essentially eliminate the need for separate studies on best care practices, freeing up government and other funding for additional health care uses, but would also dramatically shorten the time between improvements in care delivery for a given condition.

Of course, accessing the resulting body of knowledge would require some reliance on artificial intelligence or a similar technology to consult the latest data and practices on the fly. Johnston admits that the idea of asking Apple’s Siri or Amazon’s Alexa how to treat a stroke patient creates a whole new paradigm, one that some might find unsettling at first.

But if we loosen our thinking about how our own care can contribute, almost immediately, to improvements for others, the possibilities are exciting.

Making meaningful improvements in treatment could happen in a year or two instead of the quarter century that Johnston will have spent chipping away at his TIA hypothesis.

While faster doesn’t automatically mean better in the world of health care, it’s pretty hard to argue with the benefits of finding ways to disseminate innovations and information as quickly as possible.
