The Greatest Challenge of Deep Learning-Powered Medical Assistant Chatbots: Asymmetric Information

Deep learning-powered medical assistant chatbots are quite popular now. They promise 24/7 health education and early detection of symptoms before a doctor's office visit. Think of them as Google search 2.0 for medical symptoms. They usually take a chatbot format: you ask questions, and the system narrows down to possible symptoms, a health education article, or general guidance regarding the symptoms.

Medical chatbots: Google medical search 2.0

[gallery ids="495,497" type="rectangular"]

Left: Babylon. Right: Your.MD (two of the most popular medical assistant chatbots)

They try to mimic the doctor's office visit experience: you describe a symptom to the doctor, and the doctor gives you an answer for it.

Deep learning techniques

The usual deep learning technique used by these chatbots is NLP (natural language processing). Rather than a Google-like keyword search, the chatbot tries to understand your questions conversationally and narrow down the possible symptoms like a decision tree. However, when I met Babylon's machine learning team at NIPS 2017, they told me that in reality they still use keyword-like techniques, not the ideal conversational understanding yet.
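To make the contrast concrete, here is a minimal sketch of the two approaches: keyword overlap scoring versus a decision tree that narrows the possibilities one answer at a time. The symptom keywords, questions, and labels are purely illustrative assumptions, not taken from Babylon or any real chatbot.

```python
# Hypothetical comparison: keyword matching vs. decision-tree narrowing.
# All keywords, questions, and condition labels are illustrative only.

SYMPTOM_KEYWORDS = {
    "chest pain": ["chest", "pain", "pressure"],
    "heartburn": ["burning", "stomach", "acid"],
}

def keyword_match(utterance):
    """Google-style matching: score each condition by keyword overlap."""
    words = set(utterance.lower().split())
    scores = {name: len(words & set(kws)) for name, kws in SYMPTOM_KEYWORDS.items()}
    return max(scores, key=scores.get)

# Decision-tree narrowing: each yes/no answer prunes the possibilities.
TRIAGE_TREE = {
    "question": "Is the discomfort behind the breastbone?",
    "yes": {
        "question": "Does it worsen with exertion?",
        "yes": "possible cardiac cause - seek care",
        "no": "possible heartburn",
    },
    "no": "possible muscular pain",
}

def narrow(tree, answers):
    """Walk the tree with a sequence of 'yes'/'no' answers until a leaf."""
    node = tree
    for answer in answers:
        if isinstance(node, str):  # already at a leaf
            break
        node = node[answer]
    return node

print(keyword_match("I feel pressure in my chest"))  # -> chest pain
print(narrow(TRIAGE_TREE, ["yes", "no"]))            # -> possible heartburn
```

The keyword matcher needs the patient to already use the "right" words, while the tree asks its own questions, which is exactly the difference discussed above.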

Challenge of medical chatbot

Most platforms invest heavily in machine learning algorithms to better understand your questions and "have a conversation" with you. However, the real challenge, as in any medical situation, is that patients lack the medical knowledge to effectively ask about or describe their symptoms. That is, patients don't know what to ask, or the "right way" to ask.

An example is chest pain. It could be just a stomachache, or something worse: a symptom of a heart attack. Chest pain is a vague concept for patients. For doctors, the chest has a clear physical boundary, but if you ask a patient to point out the exact area of their "chest pain", it could be the chest, the upper abdomen, or the stomach. Moreover, most of the time patients simply feel discomfort but find it hard to put that discomfort into exact words.

This is what health economics calls asymmetric information.

What is asymmetric information?

I drew this graph to illustrate asymmetric information in medical settings. The information flow between the payer, the payee, and the provider is not straightforward but rather a triangle. For the scope of this discussion of medical chatbots, we leave insurance companies out.

For patients:

They know their discomfort; however, a lack of medical information puts them in an awkward situation. Usually patients describe their discomfort in plain English and in general terms. It is up to medical professionals to decode that language and put it into a medical context.

For doctors:

They have medical knowledge, but they are not you. Without your exact and correct description, physical check-ups such as touching and feeling the affected area, or other related lab tests, they cannot know how you feel or pinpoint the possible medical symptoms.

Asymmetric Information + Chatbot = Perfect Storm

Asymmetric information and the chatbot environment create the perfect storm: patients don't know what question to ask.

Taking Babylon as an example: when you start the app, this is the screen you land on. The golden question is, where do I start? What is my discomfort? How should I best describe it? Which word should I use?

And no matter how good the machine learning algorithm is, it depends on the right questions being asked. The machine learning magic cannot help here.

Also, because of the difficulty of describing the symptom, it is hard to keep the conversation with the chatbot going, so the algorithm never gets enough information to figure out the possible medical symptom and give advice.

The Solution

Asymmetric information is a major problem even in real-life medical office visits, and it won't be solved soon. In the real world, doctors can guide the conversation and conduct physical checks and lab tests to help decode the symptom. In the chatbot setting, the solution is to give patients enough tools to lead the conversation and decode their problems step by step.
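One way to give patients that kind of tool is to replace free-text prompts with concrete choices at each step, so the bot, not the patient, supplies the vocabulary. The sketch below shows this idea under stated assumptions: the questions, options, and structure are hypothetical, not from any real product's triage flow.

```python
# Hypothetical guided-conversation sketch: the bot offers concrete options
# at each step instead of asking for a free-text symptom description.
# Questions and options are illustrative, not a real triage protocol.

STEPS = [
    ("Where is the discomfort?", ["chest", "upper abdomen", "stomach"]),
    ("What does it feel like?", ["pressure", "burning", "sharp pain"]),
    ("How long has it lasted?", ["minutes", "hours", "days"]),
]

def guided_dialog(choices):
    """Build a structured symptom description from the options a patient picked."""
    description = {}
    for (question, options), picked in zip(STEPS, choices):
        if picked not in options:
            raise ValueError(f"{picked!r} is not an option for {question!r}")
        description[question] = picked
    return description

profile = guided_dialog(["chest", "pressure", "minutes"])
print(profile)
```

Because every answer comes from a fixed menu, the downstream algorithm receives clean, structured input even when the patient could not have produced the medical wording on their own.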

This is a simple UX/UI mockup I designed to lead the conversation and keep it going between the chatbot and the patient. It is just a small step towards the ideal situation; a full UI/UX design is needed.

Takeaway

Asymmetric information will always exist in medical settings. A medical chatbot's challenge is sometimes not a machine learning problem but the problem of decoding users' language. It will take machine learning practitioners, UI/UX designers, and medical professionals working together to solve it.