Founder of Google’s Stealthy Surgical Robotics Project Speaks

Before he left for Amazon, Babak Parviz brought expertise and a vision for a medical moonshot that may now achieve liftoff through Verb Surgical, a spinoff of Google’s life sciences research division

It’s a decade or so from now, and an artificial voice familiar from your smartphone and self-driving car is asking you to count slowly backwards from one hundred. You’re about to be operated on by the latest Google gadget: an autonomous surgical robot.

That’s the vision Babak Parviz sold to Google in 2010 when he joined the company’s ultra-secretive Google X division, where he was initially recruited to develop the Google Glass wearable computer. Parviz has since moved on to Amazon. But his ideas appear to have taken a big step towards reality with the spinoff of Verb Surgical, a new company formed with medical behemoth Johnson & Johnson, from Alphabet’s life sciences research division Verily. While few details are public, Verb announced this week that it “aims to develop a comprehensive surgical solutions platform that will incorporate leading-edge robotic capabilities and best-in-class medical device technology for operating room professionals.”

Google famously keeps a tight lid on its researchers, but Parviz broke his silence for the first time with Backchannel earlier this year, in a wide-ranging conversation in which he explained, among other things, his ideas for bringing robots into the operating room.

“I founded the robot surgery program at Google,” he told Backchannel. “We rely on the dexterity of human surgeons but now we know machines are quite a bit more precise than humans. If you want to do things with extreme precision, a machine would be better.”

Surgical robots are already found in operating theaters. The most common system, called da Vinci, has been used in over 3 million laparoscopic (keyhole) surgeries, and WinterGreen market researchers estimate that the market for surgical robots will grow to $20 billion by 2021. But existing medical robots, just like Mars rovers or bomb disposal bots, are largely remote-controlled devices, with human surgeons using mechanical manipulators for procedures that are awkward to carry out manually.

Autonomous surgical robots would be able to operate without a human hand — and its attendant tremors and slip-ups — on the joystick. Surgical robot manipulators are much smaller than human hands, and can be made to twist and flex in ways our wrists and fingers cannot. This means smaller, safer incisions, and the possibility of carrying out awkward and delicate surgeries, for example in the throat, without harming surrounding tissue.

“Using a machine opens up opportunities for surgeries that are not even possible with a normal human hand,” says Parviz.

Another advantage to surrendering the scalpel to a robot is that operations could be quicker. “Conventional surgery is fundamentally limited by how fast humans can make a decision, and how fast humans can mechanically move an instrument,” says Parviz. “We know that machines can do things much faster, both mechanically and even decision making.” While a human doctor would still decide whether to operate, a robot could potentially spot and clamp a leaking blood vessel much faster than a human. Robots could also mean less tissue damage from lightning-fast incisions, less blood loss, and less time under general anesthetic. Each additional hour of surgery, for example, can increase the risk of a life-threatening blood clot by 25 percent.

Parviz also sees surgical robots as a social benefit, eventually bringing high-quality healthcare to the poor. “For thousands of years, we have had one human surgeon training another human surgeon,” he says. “But we know machines scale better than humans. If we can train a good machine surgeon that can be replicated and deployed very quickly, it could make surgery accessible to a lot of people right now that don’t have access to it.”

Google is well positioned to build autonomous robotic medics. Robotic surgery depends heavily on computer vision and machine learning, technologies the company has developed extensively for its self-driving cars, which have covered over 1.3 million miles on public roads without causing a single accident.

Verily grew out of Google X, the company’s secretive research division devoted to “moonshot” technologies that could change the world. Specializing in life sciences, it has to date experimented with healthcare devices such as a tremor-cancelling spoon for Parkinson’s patients and, in a Parviz-led project, contact lenses that measure glucose levels in people with diabetes, which Novartis plans to bring to market under a deal signed in 2014. Signaling its interest in surgical robotics, Verily partnered in March with Ethicon, a division of Johnson & Johnson, in a deal that appears to have deepened significantly with the Verb spinoff.

Google insists that Verily will not, at least for now, be developing the robotic systems that help surgeons control surgical instruments, limiting its contribution instead to advanced imaging and machine learning technologies. Google declined to comment on Parviz’s role while he was at the company.

However, Parviz has a long background in medical technologies. While a ‘bionanotechnology’ professor at the University of Washington in Seattle, he constructed a bionic contact lens with an LED that was powered wirelessly by radio waves. In 2010, he was recruited by Google to lead work on Google Glass, the innovative head-mounted computer that was released to a mixture of praise and ridicule in 2012 — and that Google stopped producing earlier this year.

“When we started, Google Glass was just a few lines on a napkin in a coffee shop,” he remembers. “To take that all the way to something that was on the heads of thousands of people walking down the street, and had a pretty interesting interaction with society, was quite a roller-coaster.”

Parviz also brought his bionic contact lenses with him to Google. Early last year, Parviz announced that Google X had rebuilt his contact lenses from the ground up with a built-in glucose sensor to help people with diabetes. Tiny lights would indicate when it was time for an insulin injection. “We went back to square one to design a system that can be readily deployed on a human being,” says Parviz. “This time we knew the outcome of this work would not be an academic paper, it would be something that has to go on a person and has to work.”

But it is robotic surgery that Parviz sees as having the greatest potential to deliver Google’s first successful moonshot, possibly even beating driverless cars to widespread use. “At the moment, we don’t already have commercially deployed self-driving cars, but we do have commercially deployed robotic surgeons,” he notes.

Other robotics experts are not so sure. Ryan Calo is a law professor at the University of Washington, and teaches a class on Robotic Law and Policy. “The US Food and Drug Administration (FDA) approved robotic surgery relatively quickly because it made an analogy between robotic surgery and laparoscopic surgery,” he says. Makers of robotic surgical systems claimed that their devices were essentially an extension of traditional laparoscopic instruments. In fact, viewing and controlling manipulators remotely via a video screen, using complex software and hardware features, means robotic surgery feels new for both medics and patients. “If Google were to try to make the same argument for an autonomous robot, the analogy would break down completely. It’s a very different thing,” says Calo.

There are also issues around certifying medical staff to work with robotic surgeons, as well as untangling product liability or malpractice lawsuits where a robot rather than a person is wielding the knife. “I don’t find the idea of a fully autonomous surgeon plausible,” says Calo. “Human surgeons will be in the picture for a long time.”

“By no means I’m saying this is immediate, by no means I’m saying this is easy, by no means I’m saying this is even going to be cheap initially,” admits Parviz. “At least for the foreseeable future, we’ll have human surgeons making decisions, but the machine will execute depending on what the surgeon has decided.”

Whether Babak Parviz will be building those robots himself, no one knows. When asked about his current activities at Amazon, Parviz only laughs and says, “We’re working on really cool stuff.”

Illustration by Backchannel