What a Medical RN thinks about ML in healthcare

A summary of my interview with a Medical Registered Nurse (RN). This is one of my 18 interviews with clinicians for my MPH capstone (link here) for UC Berkeley’s School of Public Health.

Visit my Building Trust and Adoption in Machine Learning in Healthcare site (link here) for the abridged and full versions of my MPH capstone, as well as upcoming interview summaries and additional research.

Note that this interview is a de-identified summary for ease of reading and privacy. It has been approved for distribution by the interviewee.

“I am concerned that ML will get rid of the human connection.”

Job Background

I am a Staff Registered Nurse at a pediatric health system, “floating” between hospitals and shifts while I take on a lower workload to pursue a Nurse Practitioner (NP) degree. I work in the medical patient units.

Familiarity with ML in healthcare

To start off, what have you heard about artificial intelligence and/or machine learning in healthcare?

They are computer programs that can help assess a patient’s situation and give instructions on what to do.

Past and future use

Have you used any ML tools? Would you?

I don’t know. Generally, when it comes to technology, I tell new nurses to turn away from the computer screen and look at their patients. A computer can’t replace that, since so much of patient care depends on physical touch and intuition. While I think these machines are helpful, for example in analyzing lab values, I also think they often just confirm what we already know.

Excitement and concerns

What about ML in healthcare is concerning or exciting for you? What else is exciting or concerning for you?

I am excited about how it can make certain things easier for me. For example, I remember chatting with my nurse colleagues during a break about the need for a medication-dispensing robot. It could run into all of the patients’ rooms to give them their meds, and we could focus on more important things. But this could also lead to the same patterns of abuse, and possibly errors, that we see with medications today.

I am concerned that ML will get rid of the human connection. Patients just want someone to talk to. Instead of an ML tool, I think every unit needs a psychiatrist on standby to ask if a patient is okay. Just a person to listen to their concerns and be present. This helps in a person’s healing more than most other things. How does the ML tool connect with the patient emotionally? This is important and oftentimes even more important than a great diagnosis or traditional treatment. Patients just want to be cared for by a human. My pediatric patients’ health is always related to how much human attention they get. Is there a caregiver in the room? If not, are nurses available to spend a lot of time with them? If not, these kids just don’t heal as well and a computer is not going to help.

--

Harry Goldberg
Building Trust and Adoption in Machine Learning in Healthcare

Beyond healthcare ML research, I spend time as a UC Berkeley MBA/MPH student, WEF Global Shaper, Instant Pot & sous vide lover, yoga & meditation follower, and fiancé.