Oliver Mitchell, founding partner at Autonomy Ventures (left), and Stephen Gorevan, chairman and founder of Honeybee Robotics. Images: Caroline Sinno Photography LLC

Robotics AI is augmenting human intelligence

What does methane on Mars mean for robotic requirements? How are robots driving productivity and the P&L? What are the limits of learning? Stephen Gorevan of Honeybee Robotics, Keith Ross of NYU Shanghai, Oliver Mitchell of Autonomy Ventures, and Vikram Kapila of the NYU Tandon Mechatronics Lab weigh in.

6 min read · Dec 20, 2018


Robots have been used in industrial manufacturing and research for decades, but AI lets these machines do more. To be clear, that means helping humans do more, both here on Earth and in space. In part 4 of our series on the Future Labs stage at The AI Summit, New York on December 5, we offer insights from robotics leaders on challenges, opportunities, and a not-so-nebulous near future.

Giving robots intelligence isn’t easy

Despite remarkable progress in robotics AI in recent years, training machines to make decisions in physical spaces remains a challenge.

“When you’re working with a robot, it’s very, very slow,” said Keith Ross, dean of Engineering & Computer Science at NYU Shanghai and professor at NYU Tandon. “Each time you have an interaction in its environment, it takes time and it also wears out the robot.”

Keith Ross, dean of Engineering & Computer Science, NYU Shanghai; professor, NYU Tandon School of Engineering

Researchers like Keith use deep reinforcement learning to train robots’ control policies inside software simulations. When the simulations are accurate, the learned policies can be transferred to real robotic environments.

Building accurate simulations, however, requires a vast amount of sample data from those real robotic environments. (Keith’s team is working specifically on this ‘sample efficiency’ problem: learning as much as possible from as few real interactions as possible.)
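To make that workflow concrete, here is a minimal, hypothetical sketch in Python. The `SimulatedJoint` environment and tabular Q-learning setup are invented for illustration and are far simpler than the deep reinforcement learning Keith’s team uses; the point is the shape of the pipeline, cheap trial-and-error in simulation, with only the finished policy destined for hardware.

```python
import random

# Toy stand-in for a physics simulator: a robot joint that must reach a
# target index on a 1-D track. Real simulators (MuJoCo, PyBullet, etc.)
# play this role in practice.
class SimulatedJoint:
    def __init__(self, size=10, target=7):
        self.size, self.target = size, target

    def reset(self):
        self.pos = 0
        return self.pos

    def step(self, action):          # action: 0 = move left, 1 = move right
        self.pos = max(0, min(self.size - 1, self.pos + (1 if action else -1)))
        done = self.pos == self.target
        return self.pos, (1.0 if done else -0.01), done

# Tabular Q-learning: thousands of cheap simulated interactions are fine,
# whereas the same trial-and-error on hardware would be slow and wear out
# the robot -- the core problem Ross described.
def train(env, episodes=500, alpha=0.1, gamma=0.95, eps=0.1):
    q = [[0.0, 0.0] for _ in range(env.size)]
    for _ in range(episodes):
        s, done, steps = env.reset(), False, 0
        while not done and steps < 200:          # cap episode length
            a = random.randrange(2) if random.random() < eps \
                else max((0, 1), key=lambda i: q[s][i])
            s2, r, done = env.step(a)
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s, steps = s2, steps + 1
    return q

q = train(SimulatedJoint())
# Only the frozen policy (greedy over q) would be transferred to hardware;
# how well it works there depends on how faithful the simulator is.
```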

Even with near-perfect simulations, software-to-hardware transitions for robotic AI systems aren’t simple. Technology changes fast, and each innovation tends to complicate existing solutions.

“Simultaneously, as we’re trying to drive software progress forward, we’re also overlaying new sensors and new hardware at the same time, right?” said Steven Kuyan, Future Labs managing director. “Every time new hardware is introduced […] it makes it harder for the software to be reliable.”

Steven pointed out that the lack of any open or standard platform in robotics makes the problem even more challenging. There are many, many third-party hardware providers out there, but no cohesive solutions or standards for developing universally useful software for robotic systems. (Some companies, like Lyft, are collaborating toward platform-driven innovation.)

Leaders tie AI to the P&L

Even in the absence of industry standards, intelligent automation has already transformed warehousing and manufacturing. As the costs of sensors and cloud computing continue to drop, Oliver Mitchell — founding partner of Autonomy Ventures and a speaker on the Future Labs stage — predicts the $30B logistics market and $8B construction industry are next.

In all those sectors, robotics AI augments human capabilities and fills labor gaps.

Today’s ‘co-bots’ are one example. Unlike the pre-programmed industrial robots of yesteryear — which operated in an assembly-line fashion, away from human workers — co-bots use AI to help people do their jobs, like handing tools to a technician at the right time during an engine repair.

Oliver Mitchell, founding partner, Autonomy Ventures

Co-bot assistance helps people focus on higher-order aspects of their work, which can drive productivity gains for companies. Other forms of robotics AI have even closer ties to company performance.

Amazon is the obvious case study. Oliver said that since acquiring Kiva Systems in 2012, Amazon has developed robotics AI that does more than bring shelves to the packing table. “What’s really interesting with Kiva is that it organizes the warehouse based on bestsellers. So the bestsellers are on the perimeter [near the workers] and the ‘dogs’ are in the interior. And that’s all AI,” he pointed out. “It’s not just about moving towards a kind of conveyor belt. It’s tying it into Amazon’s business.”
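A toy sketch shows the flavor of that slotting logic. This is hypothetical code; the `slot_warehouse` function and sample data are invented, and Amazon’s real system is vastly more complex. The idea: rank SKUs by sales velocity and hand the best slots, nearest the pick stations, to the fastest movers.

```python
# Hypothetical slotting heuristic in the spirit of the Kiva example:
# fast-selling SKUs go to perimeter slots near the pick stations,
# slow movers ("dogs") end up in the interior.
def slot_warehouse(sales_per_day, slots_by_distance):
    """sales_per_day: {sku: units/day}; slots_by_distance: slot ids,
    ordered nearest-to-farthest from the pick stations."""
    ranked = sorted(sales_per_day, key=sales_per_day.get, reverse=True)
    return dict(zip(ranked, slots_by_distance))

sales = {"phone case": 900, "umbrella": 40, "fondue set": 3}
slots = ["perimeter-A1", "mid-B4", "interior-C9"]
print(slot_warehouse(sales, slots))
# {'phone case': 'perimeter-A1', 'umbrella': 'mid-B4', 'fondue set': 'interior-C9'}
```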

AI heads to the final frontier

Warehouses, however, could be called easily learned environments for robotic systems.

Space — as in outer space — is not. But the robots in space today do have autonomous AI capabilities… They just don’t get to use them very often.

Stephen Gorevan is the chairman of Honeybee Robotics and a longtime collaborator with NASA on the Mars Exploration rovers. He told the Future Labs audience that the devices are capable of making intelligent decisions, such as avoiding objects, but are “too expensive” to risk operating in autonomous mode for too long.

Stephen Gorevan (right), chairman, Honeybee Robotics

Honeybee Robotics’ team sends software commands to the Mars Curiosity rover daily, and Stephen has been sending robotic hardware to space since the 1990s. To date, the work on Mars has largely focused on using robots to explore the planet’s surface. This ‘horizontal exploration’ is leading to discoveries such as recent detections of bursts of methane on Mars. On Earth, methane comes only from geothermal activity or biology, and there’s no known geothermal activity on Mars.

“We have these tantalizing hints that there’s something below the ground,” Gorevan said. “So we have to start exploring vertically.”

AI is crucial to that effort. Drilling and sampling require a tremendous amount of intelligence, according to Gorevan: the nuanced pressure changes and movement corrections we take for granted when drilling with, say, a hand drill are incredibly difficult to automate.
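As a loose illustration of one such correction, here is a hypothetical proportional controller that trims the feed rate to hold drilling torque near a target. The `adjust_feed` function and all numbers are invented; real planetary drills manage many more variables under far tighter constraints.

```python
# Hypothetical sketch of one correction an autonomous drill must make
# continuously: back off the feed rate when torque spikes (hard rock,
# binding) and speed up when cutting is easy. Real flight systems also
# track weight-on-bit, temperature, auger choking, and more.
def adjust_feed(feed_mm_s, torque_nm, target_nm=5.0, gain=0.02,
                lo=0.01, hi=2.0):
    error = target_nm - torque_nm           # positive: cutting too easily
    new_feed = feed_mm_s + gain * error     # proportional correction
    return max(lo, min(hi, new_feed))       # stay within safe limits

feed = 0.5
for torque in [4.8, 5.1, 9.0, 12.0, 6.0]:   # simulated torque readings
    feed = adjust_feed(feed, torque)
    print(f"torque={torque:4.1f} N*m -> feed={feed:.3f} mm/s")
```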

But Gorevan says the AI for subsurface exploration is improving all the time, and Honeybee is involved in a number of drilling and sampling projects in space that will contribute further learnings. One of these is the proposed CAESAR mission, which aims to be the first ever to bring a piece of a comet down to Earth for study.

Next steps for robotics

More lessons come from inside the research lab and other areas of frontier technology. Vikram Kapila of the NYU Tandon Mechatronics Lab says the “evolving culture around making, inventing, and startups” is fueling innovation at the intersection of several different industries.

“I believe that robotics, artificial intelligence, augmented and virtual reality, and blockchain technologies — all four of these — are converging together,” Kapila told our audience. “The greatest applications will come from interactions amongst these technologies, not as unique technologies by themselves.”

Vikram Kapila, professor, Mechatronics Lab, NYU Tandon School of Engineering

The Mechatronics group is exploring this convergence and working to minimize the level of sensing required to operate robotic systems successfully.

Kapila’s team has developed highly efficient models for augmented reality (AR) systems that enable human-robot manipulation. That is, they let humans operate robots in physical environments through touchscreen actions in simulated virtual environments on iPads.
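One small ingredient of such an interface is translating a 2-D tap on the tablet into a position in the robot’s workspace. The sketch below is hypothetical (the `touch_to_workspace` function and its dimensions are invented) and assumes a known top-down view; Kapila’s actual models, and real AR pipelines with camera calibration, are considerably more involved.

```python
# Hypothetical mapping from a touchscreen tap (pixels) to a target point
# in the robot's workspace (metres). Assumes the tablet shows a top-down
# view of a known rectangular workspace; a real AR pipeline would use
# camera calibration and a full homography instead of simple scaling.
def touch_to_workspace(px, py, screen=(2048, 1536), workspace=(0.8, 0.6)):
    sx, sy = screen          # screen resolution in pixels
    wx, wy = workspace       # workspace extent in metres
    return (px / sx * wx, py / sy * wy)

x, y = touch_to_workspace(1024, 768)               # tap at screen centre
print(f"move gripper to ({x:.2f} m, {y:.2f} m)")   # (0.40 m, 0.30 m)
```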

Further work will enhance robots’ ability to ‘see’ and act autonomously. The better robots can recognize people and objects using machine learning and machine vision (rather than relying on dedicated sensor signals), the more effectively humans can deploy them for human-like activities, even in unknown or uncertain environments.

One example scenario is domestic assistance, such as commanding an autonomous robot to pick up a specific item at home.

The robotic capabilities Kapila is advancing are also key to autonomous driving and swarm robotics, the subfield focused on successful deployment of multi-robot systems.

In the final part of our series from The AI Summit, we’ll look at how the Mechatronics group is exploring new ways for swarming robots to communicate and collaborate, and finding inspiration in blockchain consensus models. In the meantime, stay tuned for part 5, which will focus on language AI.

Read more about what to expect in AI over the next 12–24 months in our 6-part series on the Future Labs stage at The AI Summit, New York:

Part 1: 2019 outlook: AI research to real-world application
Part 2: Vision AI and ‘senseable’ autonomy
Part 3: Voice AI for business opportunity


The Future Labs at NYU Tandon offer the businesses of tomorrow a network of innovation spaces and programs that support early stage startups in New York City.