The AI Will See You Now

Why healthcare workers are worried about an AI revolution in an already inhuman system

James Fitz Gerald
Flipping the Script
4 min read · Sep 19, 2023


Photo by National Cancer Institute on Unsplash

A pandemic-era surge in AI promises to automate fundamental tasks in medicine. Hospitals have ramped up use of machine learning systems that can now do things like screen for skin cancer, monitor vital signs, and even detect diabetic retinopathy from digital images of the eye — no doctor required. Healthcare workers are right to worry about the skills AI can replace, but equally concerning is the way this technology devalues the skills it can't.

Many healthcare workers are worried about what an AI revolution means for the human side of an already inhuman healthcare system.

A recent study in npj Digital Medicine finds clinicians ambivalent at best about the use of artificial intelligence. In addition to a panoply of ethical and safety concerns, fears related to a loss of professional autonomy were “unanimously reported.” Assurances that AI can’t replace your doctor haven’t stopped some clinicians from wondering whether it threatens their essential role in patient care. Others even question why they learned the art of diagnosis in the first place.

High costs and stubborn clinical workflows currently keep healthcare jobs from being automated out of existence, but it’s important to remember these barriers aren’t permanent.

Though more rule than exception, skepticism toward AI in healthcare is obscured by the hype. Tech firms tout their own selective compliance studies while medical conglomerates celebrate the AI arms race in press releases, billboards, and public speeches.

Given that digitized healthcare markets are estimated to reach nearly $1 trillion by 2025, this optimism at the top makes sense. Firms with stakes in biodata collection like Google, Apple, Facebook, and Amazon are primary investors, making corporate managers and hospital administrators eager to push for adaptive technologies that leverage advanced statistical algorithms for processing data.

But this ad copy ignores countless caregivers who lament the replacement of key parts of their profession with machines. Generally, when people interact with technology in their professional lives, they prefer autonomy over the terms of its application — something the vast majority of American workers lack. It shouldn’t be surprising that healthcare workers are so distrustful, given that the arc of innovation in their field has historically bent toward a more acute world of work. AI could mean more time with patients, but it’s just as likely to be used as another efficiency widget for servicing more “clients” in fewer minutes.

Realists say that caregivers and autonomous technology will have to work together whether people like it or not. They correctly point out that clinicians are sought after for their warmth, empathy, and understanding as much as their brainpower and expertise. AI will never replicate “that gut feeling” honed by sitting at the bedside, as one doctor puts it.

It’s true that doctors and nurses are indispensable precisely because they’re not machines. But taking comfort in the idea that qualitative skills will save this or any other profession misses an important point. AI doesn’t threaten livelihoods because of the uniquely human qualities it replicates. It automates labor by deskilling it, accelerating the breakdown of human input into ever simpler, measurable components. This makes workers more productive but also more replaceable in the long run (not to mention more surveillable in the short run). As the art of medicine advances, it seems the artisan recedes.

For these reasons, AI’s role in the mechanization of care work poses serious challenges for what remains of the patient-provider relationship. Everyone yearns for more humanity in health care, but warmth, empathy, and “gut feelings” are not what first come to mind when thinking of AI-assisted medicine. Doctors already average 16 minutes on electronic health records per patient they see, leaving them with fewer than 5 minutes for direct interaction. Nurses in ICUs spend nearly a third of their shift on flowsheets.

There’s no reason to believe that the tech industry’s “fail fast and fix it later” approach will make any of this better. At least, not without direct influence from the people who deliver care.

Collaboration among clinicians, their unions, and patient advocates is needed to grow the fight for greater transparency around tech integration. U.S. lawmakers could take a page from the EU’s playbook and regulate AI technologies in areas that pose unacceptable levels of risk to people’s safety, including health. Industry reforms aimed at cutting waste might start with the $250 billion spent on administrative excess every year, or CEOs’ soaring pay since the start of the pandemic. Any of these options would serve the goals of healthcare better than automating the labor of doctors and nurses.

AI promises efficiency and cost reductions, but its implementation under the business model of American healthcare is poised to further impersonalize an already deeply impersonal system.


James Fitz Gerald

English professor at Bentley University (Waltham, MA) focusing on the rhetoric of health and medicine.