My AI Doctor — Or: How I Learned to Stop Worrying and Love the Machine

Published in Newlab · 5 min read · Nov 17, 2017

When we go out for a drive, we’re not alarmed to find ourselves traveling faster than we can run. For some reason, though, when we think about AI, the idea that machines can surpass human ability scares the shit out of us. But humans have been augmenting their capacities forever, since the invention of the wheel.

Photography by Rich Gilligan / Installation by Hovver / Reporting by New Lab

When will the Singularity happen, and what will it be? Will our grandkids have robot lovers? (Will we!?) There’s much to wring our sweaty hands over. But a host of companies at New Lab are using AI’s power to enhance humanity’s quest to understand itself — and for learning, communicating, and healing what ails us.

Human eyes can see less than one percent of the electromagnetic spectrum, which is to say, they’re kinda lousy. (No offense to New Lab’s Head of Business Development, Rebecca Birmingham, shown here in extreme closeup.)

Human eyes can see less than one percent of the electromagnetic spectrum, which is to say, they’re kinda lousy. That doesn’t cut it when, say, you’re searching for blemishes on the vast surface of a semiconductor.

“The human eyes are fascinating for a model,” says Matthew Putman, the founder and CEO of Nanotronics Imaging. “But using it for anything on a small scale isn’t practical.”

Investigating all that “room at the bottom,” as physicist Richard Feynman once called the vast distances that appear as we zoom into particles, requires instrumentation that’s smart enough to assemble images that humans can’t.

Enter AI.

Nanotronics makes the invisible visible. It builds microscopes capable of inspecting very small machinery and materials. Computer chips, nanotubes, LEDs, aerospace instrumentation, and hard drives develop defects like anything else, but at that scale, light’s wavelength is too large to be of much use on its own. Its microscopes exploit changes in the angle and position at which light hits a specimen; combined with patented algorithms and incredibly fast calculations, that lets them image objects a billionth of a meter in size.

“The human eyes are fascinating for a model,” says Matthew Putman, the founder & CEO of Nanotronics Imaging. Nanotronics makes the invisible visible — its microscopes are capable of inspecting objects a billionth of a meter in size.

“We took a mathematical approach to tricking a law of physics,” Putman says.
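How does that trick work? Nanotronics’ actual algorithms are patented and not spelled out here, but the broad family it belongs to, computational imaging, combines many exposures of the same specimen, each taken under a different illumination angle, into one estimate that beats any single exposure. A deliberately simplified sketch of that idea (the numbers and the shift-and-average fusion below are illustrative assumptions, not Nanotronics’ method):

```python
import numpy as np

# Illustrative only: fuse several noisy exposures of one specimen, each
# captured under a slightly different illumination angle, into a single
# cleaner estimate. Real nanoscale inspection is far more sophisticated;
# this just shows the "many views, one better image" idea.

rng = np.random.default_rng(seed=0)

# Ground-truth "specimen": a tiny bright defect on an otherwise flat surface.
truth = np.zeros((64, 64))
truth[30:34, 30:34] = 1.0

def capture(angle_shift):
    """Simulate one exposure: the illumination angle shifts the image
    slightly, and every exposure picks up sensor noise."""
    shifted = np.roll(truth, shift=angle_shift, axis=(0, 1))
    return shifted + rng.normal(scale=0.5, size=truth.shape)

# Capture the specimen under several illumination angles.
shifts = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1)]
exposures = [capture(s) for s in shifts]

# Fuse: undo each known shift, then average. The random noise cancels
# out across exposures while the real defect reinforces itself.
aligned = [np.roll(img, shift=(-dy, -dx), axis=(0, 1))
           for img, (dy, dx) in zip(exposures, shifts)]
fused = np.mean(aligned, axis=0)

print("error of a single exposure:", round(float(np.abs(exposures[0] - truth).mean()), 3))
print("error of the fused estimate:", round(float(np.abs(fused - truth).mean()), 3))
```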

AI-powered systems don’t have to operate at such extreme scales to be useful, however. Take health: a doctor’s visit is a conversation. A physician has you list your symptoms, but they also weave in real-time feedback: vitals, age, history, the length of the cough, the sound of the cough, that it hurts here and not there, and so forth. Yet when we’re feeling off, we often just punch a few keywords into a search bar.

Adam Lathram, the co-founder of the AI company Buoy Health, cautions against diagnosis via message board. “When you Google your symptoms you put yourself at risk of being diagnosed with cancer — by the Internet,” he says. (Adam is photographed inside Hovver’s Liminal Scope light installation.)

“When you Google your symptoms you put yourself at risk of being diagnosed with cancer — by the Internet,” says Adam Lathram, the co-founder of Buoy Health. “People aren’t getting their health information the right way.”

Buoy Health uses the learning power of AI to create a dialogue and connection with the patient. Its AI is trained on thousands of clinical research papers, medical statistics, and a patient’s own medical history.

Buoy Health uses the learning power of AI to create a dialogue with the patient. The company trains its AI on thousands of clinical research papers, medical statistics, and a patient’s history; that informs how the system asks new questions and makes predictions about the user’s health issue. “That is absolutely crazy that it was able to go through that and figure out exactly what was going on,” one user remarked after Buoy’s AI figured out she was having a Crohn’s disease flare based on what she told the system.

“You tell a story you can’t tell a search bar,” Lathram says.
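To make the idea concrete, here is a toy sketch of how a dialogue-style symptom checker can work in general; the conditions, probabilities, and question-picking rule below are invented for illustration and are not Buoy Health’s model. The system keeps a belief over possible conditions, asks about the symptom it is least sure about, and updates that belief with each answer:

```python
# Toy sketch of a dialogue-style symptom checker. Not Buoy Health's system:
# the conditions and probabilities below are made up for illustration.

# Hypothetical P(symptom | condition) table.
LIKELIHOODS = {
    "common cold":   {"fever": 0.3, "abdominal pain": 0.05, "fatigue": 0.6},
    "flu":           {"fever": 0.9, "abdominal pain": 0.10, "fatigue": 0.9},
    "crohn's flare": {"fever": 0.4, "abdominal pain": 0.90, "fatigue": 0.8},
}

belief = {c: 1 / len(LIKELIHOODS) for c in LIKELIHOODS}  # uniform prior

def update(belief, symptom, present):
    """Bayes update after the user answers yes/no about one symptom."""
    posterior = {}
    for cond, prior in belief.items():
        p = LIKELIHOODS[cond][symptom]
        posterior[cond] = prior * (p if present else 1 - p)
    total = sum(posterior.values())
    return {c: v / total for c, v in posterior.items()}

def next_question(belief, asked):
    """Ask about the symptom whose answer is currently least predictable
    (probability of 'yes' closest to 50/50)."""
    symptoms = [s for s in next(iter(LIKELIHOODS.values())) if s not in asked]
    p_yes = lambda s: sum(belief[c] * LIKELIHOODS[c][s] for c in belief)
    return min(symptoms, key=lambda s: abs(p_yes(s) - 0.5))

# Simulated conversation: the user has abdominal pain and fatigue, no fever.
answers = {"fever": False, "abdominal pain": True, "fatigue": True}
asked = []
for _ in range(len(answers)):
    q = next_question(belief, asked)
    belief = update(belief, q, answers[q])
    asked.append(q)
    print(f"asked about {q!r:18} -> belief: "
          + ", ".join(f"{c}: {p:.2f}" for c, p in belief.items()))
```

Run as written, the toy belief ends up concentrated on the Crohn’s-flare hypothesis after three answers, which is the flavor of reasoning the anecdote above describes, just at a vastly smaller scale.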

Most of us lack degrees in computer science, but that doesn’t mean we have no use for AI’s capabilities. “Everyone wants these tools,” says Birago Jones, the founder of Pienso, a company based at New Lab. “And for some use cases, they want a level of input and transparency.”

Birago Jones, the founder of Pienso, specializes in the intersection of human-computer interaction and AI. “We believe in the exponential power of human-machine collaboration,” he says.

Pienso allows laypeople to harness the power of AI. The company is democratizing machine learning, making it accessible to users who may not have a technical or data science background. Pienso couples a dynamic user interface with a patent-pending machine learning technique called Lensing, which lets subject-matter experts interact with the algorithm directly and impart their own knowledge and context to their data.

“We believe in the exponential power of human-machine collaboration,” Jones says.
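Lensing itself is patent-pending and not publicly documented, but the human-in-the-loop pattern Pienso describes can be sketched in miniature: a model sorts text on its own, an expert reviews the result and injects domain terms, and the model re-scores with that added context. Everything below (the documents, topics, and scoring rule) is a made-up illustration, not Pienso’s product:

```python
# Minimal human-in-the-loop sketch: a crude "model" groups documents by
# weighted keywords, a subject-matter expert adds terms it was missing,
# and the documents are re-scored with that expert context.

from collections import Counter

documents = [
    "battery drains fast and the phone gets hot",
    "delivery arrived two weeks late and no tracking updates",
    "screen flickers after the latest software update",
    "refund still not processed after a month",
]

# The "model": each topic is just a bag of weighted terms here.
topics = {
    "hardware issue": Counter({"battery": 1, "hot": 1}),
    "service issue":  Counter({"late": 1, "refund": 1, "delivery": 1}),
}

def score(doc, topic_terms):
    words = doc.lower().split()
    return sum(topic_terms[w] for w in words if w in topic_terms)

def assign(doc, topic_map):
    scores = {name: score(doc, terms) for name, terms in topic_map.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclear"

print("before expert feedback:")
for doc in documents:
    print(f"  {assign(doc, topics):>14} | {doc}")

# Human in the loop: an expert knows flickering screens are a hardware
# complaint and boosts those terms for that topic.
topics["hardware issue"].update({"screen": 2, "flickers": 2})

print("after expert feedback:")
for doc in documents:
    print(f"  {assign(doc, topics):>14} | {doc}")
```

The point of the pattern is the second printout: the expert never writes code or retrains anything by hand; they simply adjust what the model pays attention to.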

That type of collaboration also holds promise for anyone who’s ever struggled with a language barrier. Waverly Labs has created a wearable, real-time translating earpiece that lets two people hold a conversation in different languages. Founder Andrew Ochoa says the experience goes far beyond converting words from one language to another. A good experience, Ochoa says, means you don’t have a computer voice droning in Mandarin; the earpiece replicates the voice of the person you’re talking to. And conversations aren’t always just from person A to person B: you need intelligence smart enough to translate the room, not just the person in front of you.

Waverly Labs exists at the convergence of wearable technology and machine translation. “Our interest is enhancing the entire speech recognition experience,” the company’s CEO Andrew Ochoa says. (Seen here is Marion Guerriero, one of Waverly’s first employees.)

“Our interest is in enhancing the entire speech recognition experience,” Ochoa says.
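Under the hood, a translating earpiece is usually described as a pipeline: speech recognition turns audio into text, machine translation converts the text, and speech synthesis speaks it back, ideally in something closer to the original speaker’s voice than a generic computer drone. The sketch below shows how such a pipeline might be chained for a whole room of speakers; every function is a placeholder, not Waverly Labs’ actual software:

```python
# Illustrative pipeline sketch, not Waverly Labs' code. A real-time
# translating earpiece chains three stages: speech recognition
# (audio to text), machine translation (text to text), and speech
# synthesis (text back to audio in the listener's language).

from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str
    language: str   # e.g. "fr", "es"
    audio: bytes    # raw audio captured by the microphones

def recognize(utterance):
    """Placeholder ASR: would return the transcript in the source language."""
    return {"speaker": utterance.speaker,
            "lang": utterance.language,
            "text": "<transcript of " + utterance.speaker + ">"}

def translate(text, source_lang, target_lang):
    """Placeholder MT: would return `text` rendered in `target_lang`."""
    return f"[{source_lang} to {target_lang}] {text}"

def synthesize(text, voice):
    """Placeholder TTS: would speak `text` in a voice modeled on the
    speaker, rather than a generic computer voice."""
    return f"(audio in {voice}'s voice): {text}"

def translate_the_room(utterances, listener_lang="en"):
    """Handle a multi-speaker conversation, not just the person in front
    of you: transcribe each speaker, translate into the listener's
    language, and synthesize each line in that speaker's own voice."""
    out = []
    for u in utterances:
        heard = recognize(u)
        text = translate(heard["text"], heard["lang"], listener_lang)
        out.append(synthesize(text, voice=heard["speaker"]))
    return out

room = [Utterance("Marion", "fr", b""), Utterance("Andrew", "es", b"")]
for line in translate_the_room(room):
    print(line)
```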

Five thousand years after the invention of the wheel, we can do things like catch a midnight flight to California (or, you know, the moon). With the ever-increasing speed of innovation, today — and maybe only for today — consciousness needn’t be the goal when it comes to AI. Enhancing humanity in a way that benefits society just might be.

The universal translator, as anyone who’s read The Hitchhiker’s Guide to the Galaxy knows, has long been the stuff of fiction. But Waverly Labs’ Pilot earpiece is real (if not fully universal yet). The earpiece provides real-time translation for two people speaking different languages.
