
Mary Lou Jepsen was an executive at Oculus, led “moonshots” at Google, founded the low-power computer display company Pixel Qi, and co-founded One Laptop Per Child. But the company she launched last year, Openwater, is her most ambitious project yet. The plan is to use optoelectronics and LCDs to create an affordable MRI alternative. Among the technology’s manifold possibilities is a provocative goal: mind reading. If you are alarmed by that prospect, Jepsen welcomes the feedback. Rather than cloaking Openwater’s research in mystery, she is starting a conversation about the ethics before that moonshot lands.

“Wearables” seems insufficient to describe Openwater. I know there are buzzwords like neurotech and BCI. How do you talk about your company?

Mary Lou Jepsen: What we’re talking about is diagnostics for the body: shrinking MRI or CT into a wearable or a pad, and also telepathy, communicating with thought. That means lowering drug-development costs and lowering all kinds of diagnostic costs that involve looking inside your body.

The reason I started this company was that I figured out how to optically make your body completely transparent, so the light doesn’t scatter and you can get really crisp images of any kind of feature inside your body. The pixel size of camera chips has shrunk to approximately the wavelength of light, driven by the push for high-fidelity next-generation virtual reality and augmented reality. At that pixel size, we can capture the phase of the light.

You can get a super-crisp image, crisper than an MRI image, using liquid crystals, lights, and cameras that can be made in the same factories that make the LCDs and cameras for your smartphone. Once it’s done at scale, it can lower the cost of medical imaging a thousandfold. Last year in the U.S., the average cost of an MRI scan was $2,700, and MRI scans generated $50 billion in revenue for hospitals. So we’re talking about lowering the cost of an MRI scan to the cost of a phone call, but also shrinking the size: you can actually wear it.

You can scan yourself: Do I have an aneurysm? Is my artery clogged? Is that a tumor? Or, as has been shown with MRI, communicate with thought — images in your head, words in your head, music in your head. Are you in love or not? Thought and emotion, all of that is possible. We don’t know what we’re going to do for our first product. We’re trying to understand the limits of the physics right now. And we’re really surprised at the versatility of this so far.

How do you prevent coercive use of the technology?

Jepsen: It’s a big question as we develop this ski hat that allows you to communicate with thought, that also monitors forms of brain disease, and that allows us to take images inside our bodies.

So, the ethics: We create this ski hat, unbelievable as it may seem, in a single-digit number of years and get it into production. Can the police make you wear it? Can the military make you wear it? Can your parents make you wear it when you come home at 3 a.m. and they want to know where you were? Openwater has committed to making sure it will only work if you want it to work, if you willingly think into the hat. But where I become really concerned is that imitation is the sincerest form of flattery: people reverse engineering our work, copying it. We are patenting it and protecting it in many different ways, trying to make that very hard to do. But I have great concerns about the misuse of this technology.

We’re in dialogue with lots of different ethics committees. Steven Hyman, who founded one of these organizations, the International Neuroethics Society, said at its last meeting that the era of ethics committees sitting behind closed doors and deciding what’s ethical is over, because our notions of what counts as appropriate privacy are in massive flux right now. He pointed out that open dialogue is the only way to work this out, which is part of why I’m talking to you. It’s important to have this discussion. Do we teach people to lie into the machine, to tell untruths? Is it ethical to lie? Do we teach them to quiet their brains so nothing comes out, in case people reverse engineer [our product] and use it on them? I am spending time on this, trying to figure out how we act responsibly as a company. We’ll fully commit to that, but the sincerest-form-of-flattery problem, the copycats, still worries me.

Most people don’t expect to be in a position to defend the privacy of their own thoughts. That’s something human beings have always taken for granted.

Jepsen: Right. That’s why we talk things through. But I think what we want to enable is also the amplification of human excellence. Can people choose to let their minds swirl around together? Why not, if it’s consensual? It’s probably more intimate than sex. I mean, honestly, I don’t know; we haven’t done it yet. Can a movie director use it to dump the rough cut of an idea she has for a movie into a computer? Why would we want to restrict that? Can a musician use it to get a rough cut of all the layers of sound in his head into his computer? Why not? Can you imagine going straight from an idea to a product? It would be amazing to communicate without being limited by the low bandwidth out of our brains, which is basically how fast I can move my mouth or type with my fingers.

And we’re not the only creatures with brain cells in the world. A lot of animals have extraordinary noses: rats and dogs can smell cancer, for example. We might stop eating some of these animals and start collaborating with them. Octopuses have neurons all over their bodies and are thought to be quite smart; they can solve Rubik’s cubes, which is actually quite astonishing given that they’ve never gone to school. It’s a bit provocative, yet we collaborate with animals all the time. We may find more enhanced forms of collaboration, be it on the farm or in search-and-rescue operations with dogs, or what have you.

What concerns have people brought to your attention?

Jepsen: One guy said to me, “Is there a mother in the world who won’t put this ski cap on her newborn baby? Because when they take their baby home from the hospital, they don’t want to break the baby.” Is that wrong or right? They’re really worried about the baby, especially if the baby has been sick. Is that okay or not? I don’t know. I’m not answering that question. But the bigger concern I have is this: even if they do that in the first few years of the baby’s life, when does that hat come off? When the kid turns 18? I mean, did you ever do anything your parents didn’t know about?