How Artificial Intelligence Makes Healthcare More Human
The Question I Hate About My Health
You know the moment when you go in for a yearly physical and the doctor asks, “So, how have you been?” I hate that question. I can’t recall what I ate yesterday, let alone remember a pattern of headaches or the overall quality of my sleep. The problem is that I am human. I forget and can be lax when it comes to taking care of myself. And I’m not a doctor. Most health issues sneak up on us, and we’re not inherently wired to remember patterns. Yet our current healthcare system requires that we be hyper-observant — we must notice the frequency of symptoms and determine if they are important enough to recall months later in a rushed doctor’s appointment.
On top of my own health, I now have a daughter adding another level of responsibility and recall. How many times did she poop? How was her sleep? What did that cough sound like? The burden falls on me to determine if I should stop worrying or call the doctor…all while I am sleep deprived and holding a screaming child.
Over the past few years, health experts have been gravitating toward the idea of personalized healthcare: doctors provide more tailored diagnoses and treatments based on the wealth of personal health data their patients are gathering. This is becoming a reality through programs like Apple’s HealthKit, which standardizes how doctors see data from personal devices. I love this idea, but it has one glaring flaw: when will a doctor find time to synthesize all of this rich new data? The average doctor visit is only 15 minutes long, and it is shrinking. Doctors simply don’t have time to incorporate individual health data into their process, so the burden is falling on patients.
We’ve recently heard a lot of buzz about biosensing (Rock Health “Future of Biosensing Wearables”). Health enthusiasts (and venture capitalists) predict a future where better health will be enabled by wearable technology. Activity trackers, posture support, heart rate trackers — all worn while we head to the grocery store, go to work, do yoga, and pick up our kids.
The data sounds exciting, but this stuff is so frustrating. The day my Fitbit lost its first charge was the day I stopped tracking my steps. A wearable asks too much of us: remember to charge it, remember to wear it, remember to turn it on. And don’t get me started with wearables for kids (imagine cleaning a wearable after a diaper blowout).
What if a device existed that helped you understand your overall health, but you didn’t have to wear it, charge it, or turn it on? It could provide more context, more accurate personal data AND remember everything for you. Enter cameras.
A Camera with a Brain
In the abstract, cameras are ideal sensors — all-seeing objects that passively record information. A camera stays in one place, watching from a distance without interference. Night-vision and telephoto lenses allow cameras to see far and wide, day and night. Cameras are affordable and ubiquitous in our current culture, but what if a camera wasn’t just collecting data? What if a camera could understand what it sees?
I’ve often thought about Baymax, the robot from the Disney movie Big Hero 6. Baymax provides health assistance by scanning a human body for health issues and giving feedback on anything of importance. This scan is enabled by computer vision — a form of artificial intelligence (AI) that works to understand the world using a camera. Real intelligence comes when a camera knows what it’s observing: it can determine what’s happening, which objects are important, and which ones are superfluous. For example, Baymax is thinking: “Am I looking at a person? Which part of the body?” New advances in artificial intelligence allow us to train cameras to think and learn like brains. This matters because instead of dealing with terabytes of useless data (like your cat sitting in your baby’s crib), the camera can focus on the moments and data that matter.
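The “focus on moments that matter” idea boils down to a classify-then-filter loop over camera frames. Here is a minimal sketch of that pattern; the `classify_frame` stand-in and the label set are hypothetical placeholders, not an actual product’s model, which would run a trained neural network on raw pixels.

```python
# Hypothetical sketch: keep only the frames a vision classifier
# flags as relevant, discarding superfluous footage.

RELEVANT_LABELS = {"person", "face", "torso"}  # assumed labels of interest

def classify_frame(frame):
    """Stand-in for a trained image classifier.

    Here each frame carries pre-assigned labels; a real system would
    run a neural network and return predicted labels with confidences.
    """
    return frame["labels"]

def filter_moments(frames):
    """Return only frames containing at least one relevant label."""
    return [f for f in frames if RELEVANT_LABELS & set(classify_frame(f))]

frames = [
    {"t": 0, "labels": ["cat", "crib"]},     # superfluous: the cat in the crib
    {"t": 1, "labels": ["person", "crib"]},  # relevant: a person appears
]
kept = filter_moments(frames)  # only the frame at t=1 survives
```

The payoff is that storage and attention scale with meaningful events, not with hours of recording.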
Data is useless to people unless it is personalized and actionable. With the help of computer vision and artificial intelligence techniques, a sensor like Baymax can learn to interpret what’s important to you, the user. A camera is also a great tool for sharing data. A simple video clip that represents a pattern of coughing, abnormal breathing, or inconsistent bedtimes is a powerful story. Cameras make health data actionable because they give context, which leads to belief. We see the story, and so we believe it. We have objective data that we feel empowered to share with our doctor.
Start with a Human Habit
People are creatures of habit. Habits are hard to break, and habits are hard to form. We’ve already determined that wearables fail because they ask us to form new habits without an immediate payoff. The key to getting continuous insights about a person’s health is to build upon existing habits. What’s one habit we all do every day, at the same time and in the same place? We sleep.
Human sleep is an uncharted frontier of health information, and a camera is the perfect sensor for studying health during sleep. Sleep is a natural barometer of our bodies. When we’re stressed, we don’t sleep well. Days before the obvious symptoms arrive, our sleep patterns can reveal the onset of illness. With the right camera, breathing and heart rate can be measured, even through a down comforter. Those measurements can help detect short-term issues like a cough, or long-term chronic issues like asthma. Sleep itself has a major impact on our short-term and long-term health. When we sleep well, we have more energy to exercise and eat better. We are also less likely to be obese or to develop hypertension, diabetes, and mental health issues like depression. Sleep is the perfect opportunity to capture continuous data about our health every day.
With this collected sleep data, we can make behavior changes that matter. As a parent, I want to wake up in the morning and know how my daughter slept so I can decide if she will need an extra nap or if we should attempt a trip to the zoo. I want information that’s personalized and relevant to my family. With artificial intelligence, a device can learn to filter what information to share and when to share it. It can highlight a video clip of a child’s cough and wait until the right moment to suggest “share with doctor.” Instead of sending a suggestion to a parent dealing with a screaming child, the device could wait until the child is sound asleep. When information is glanceable, personalized, and timely, insights add up and we become more empowered in our health.
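The “wait for the right moment” behavior is, at its core, a small gating rule: a suggestion is held until conditions are calm. This sketch uses invented state names and is only an illustration of the timing logic described above, not any product’s actual notification policy.

```python
# Hypothetical notification-timing rule: hold a pending suggestion
# (e.g., "share this cough clip with your doctor") until a calm moment.

CALM_STATES = {"asleep"}  # assumed states in which a parent can act on advice

def should_notify(suggestion_pending, child_state):
    """Deliver the suggestion only when one is pending AND the child is calm."""
    return suggestion_pending and child_state in CALM_STATES

# While the child is screaming, the device stays quiet...
during_crying = should_notify(True, "screaming")   # False: hold the suggestion
# ...and surfaces the suggestion once the child is sound asleep.
during_sleep = should_notify(True, "asleep")       # True: good time to notify
```

The same pattern generalizes: any insight can carry a predicate describing when its delivery is actually useful.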
Stay Safe, Go Cloudless
When we design an experience for health, unique challenges must be considered. A camera in the bedroom introduces a new level of concern for privacy and security. To help the user feel confident and safe, the designer must put control of information into the hands of the user. People want simple, quick ways to decide what leaves their household. I decide if a video clip of my daughter’s cough is something I want to send to my doctor or share with a friend. Additionally, if I rely on a device for health insights, I don’t want to deal with the infamous “Lost Connection” I see on my Netflix account from time to time. A breathing alert for my sick daughter has to be completely reliable. Privacy, security, and reliability are at serious risk with cloud-based devices that rely on a user’s internet connection to function.
The next generation of connected devices and services is driving a new wave of edge computing — the idea that more processing should happen cheaply, securely, and efficiently at the source of the data, instead of on a remote cloud server. Huge technology vendors like Qualcomm are jumping on this wave by creating processors that directly support edge computing. Edge computing leads to a better, more trusted health data experience for the user in two ways. First, users get to decide what data, if any, leaves their homes, whether to share with a doctor or to access at work. Second, devices are much more reliable and secure when they don’t have to reach out to the cloud for information or processing. Everything can stay on the user’s home network, shielded from internet hackers and service outages. By keeping the internet out of the critical path, users get faster answers and devices are safer from attacks like the recent ones on “internet of things” (IoT) devices.
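The edge-first principle described above reduces to two rules in code: analysis never requires the network, and nothing leaves the home without explicit user approval. The function names and clip identifiers below are illustrative assumptions, sketching the policy rather than any real device’s implementation.

```python
# Minimal sketch of an edge-first data policy: all analysis runs
# locally, and only clips the user explicitly approves ever leave home.

def analyze_locally(clip):
    """Stand-in for on-device processing; note it makes no network calls."""
    return {"clip": clip, "insight": "possible cough pattern"}

def outbound_clips(all_clips, user_approved):
    """Gate uploads on explicit consent: only approved clips may be shared."""
    return [c for c in all_clips if c in user_approved]

# Every night is analyzed at home, regardless of connectivity...
results = [analyze_locally(c) for c in ["night1.mp4", "night2.mp4"]]

# ...but only the clip the user chose to share is a candidate to leave.
to_share = outbound_clips(["night1.mp4", "night2.mp4"],
                          user_approved={"night2.mp4"})
```

Because the internet sits outside the critical path, a dropped connection can delay sharing but never block an alert.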
A Baymax Future?
Well, sort of. I doubt we will have a lovable, extra-large robot hiding in the corner when we become sick. Instead, I anticipate a world where a camera is no longer just a camera, but a sensor that understands our healthy days and our sick days. We will track our sleep, our breathing, our coughs, our colds, our heart rates, and our behaviors. We will have individualized baselines and clear indications when our health deviates. This brain in our bedroom will help us better understand the subtle patterns that used to be impossible to recall or record.
I imagine a future where I’ll know when I am becoming too stressed and need to adjust my lifestyle. I’ll know if I am getting sick before I feel it, so I can get more rest or an antiviral for the flu. I’ll understand what makes my daughter the most comfortable when she has a cold (humidity? room temperature? an earlier bedtime?). And I’ll know when I need to reach out for a tailored, expert opinion. Ultimately, I imagine a future where I visit my doctor for that annual checkup confident and prepared to answer the question, “So, how have you been?”
Introducing Knit Health
The reality of what I’ve described is not a far-flung future, but something we can have today. Knit Health, the company I founded with two other IDEO alums, has built the world’s first human health sensor using a camera, deep learning, computer vision, and human-centered design. We designed our first product for parents, to help their children get better sleep and improved health. Without wearables, Knit tracks my daughter’s sleep and helps me determine how to adjust her routines and room conditions to help her sleep better. Knit measures her breathing, giving me peace of mind in the middle of the night and the feedback needed to make her comfortable when she is sick. Knit learns about my daughter’s sleep habits, behaviors, and respiration, and alerts me when things are out of the norm. I can see that a cough has been bothering her for the last three nights and decide to share a highlight clip with my pediatrician. I can wake up in the morning, know how her night went, and use this as a predictor of how her day will be.
While we haven’t figured it all out, we’ve made a great leap towards the goal of providing personalized healthcare support to families. We recently launched our product on Kickstarter. We’re looking for supporters who believe in our mission and are excited to help us create an empowering product for everyone.
“So, how have you been?” Let’s get started on your answer. Support Us