Mobileye Co-founder, CTO and Chairman Talks the Future of Self-Driving Cars

Consumer Tech Association (CTATech)
Dec 15, 2016

by Gary Shapiro, President and CEO, Consumer Technology Association (CTA)

Unlocking the full potential of self-driving cars lies in technology that teaches driverless vehicles “the rules of breaking the rules.” Successfully developing driving policy technology, through reinforcement learning algorithms that power machine intelligence mimicking human driving intuition, is one of the final pieces of the autonomous driving puzzle.

Gary Shapiro, president and CEO of the Consumer Technology Association (CTA), speaks with Prof. Amnon Shashua, co-founder, CTO and chairman of Mobileye, about the future of mobility.

BMW, Intel and Mobileye are working together to bring self-driving vehicles to the road. What is the team working on?

Our areas of focus these days concern the performance evaluation of the eight-camera system; the fusion algorithms of cameras, radars and lidars; the trajectory planning and driving policy, where the intelligence of driving comes to bear; and mapping and localization. In addition, the production-intent hardware with high-performance computing by Mobileye and Intel is being built around the highest levels of functional safety, with middleware software layers that guarantee safety.

By the end of 2017, the team is expected to have a fleet of 60 operational vehicles for validation as part of the series development effort.

Recent news reports suggest that driverless car systems have to be perfect before being deployed, like airline autopilot systems. Will they be perfect, or should we accept some imperfection given the associated benefits?

Given that driverless cars will be driving among human-driven cars, at least for the foreseeable future, the challenge is to enable a driving policy that on one hand mimics human driving negotiation skills and on the other hand is safe. If the robotic car is too defensive in its driving policy, it will likely obstruct traffic. At the same time, one would like to avoid the reckless driving that humans occasionally adopt. Therefore, some imperfection must be expected.
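As a hedged illustration of that trade-off (not Mobileye’s actual formulation), a reinforcement learning driving policy is typically trained against a reward that weighs forward progress against penalties for unsafe and overly timid behavior; where those weights sit determines how defensive or assertive the learned behavior becomes. The field names, weights and thresholds below are hypothetical.

```python
# Illustrative sketch only: a toy reward for a reinforcement learning
# driving policy, balancing progress against safety. The state fields,
# weights and thresholds are hypothetical, not Mobileye's formulation.
from dataclasses import dataclass

@dataclass
class StepOutcome:
    distance_advanced_m: float   # forward progress this step
    min_gap_m: float             # closest distance to another road user
    collided: bool               # did the step end in a collision?
    blocked_traffic: bool        # did the ego car stall the lane (e.g. at a merge)?

def driving_policy_reward(o: StepOutcome,
                          w_progress: float = 1.0,
                          w_discomfort: float = 5.0,
                          w_blocking: float = 20.0,
                          w_collision: float = 1000.0) -> float:
    """Reward progress; penalize unsafe or overly defensive behavior.
    Raising w_collision relative to w_progress makes the learned policy
    more defensive; raising w_blocking pushes it to assert itself in
    negotiations such as merges."""
    reward = w_progress * o.distance_advanced_m
    if o.min_gap_m < 2.0:            # uncomfortably close to another vehicle
        reward -= w_discomfort
    if o.blocked_traffic:            # too defensive: obstructing the flow
        reward -= w_blocking
    if o.collided:                   # hard safety violation
        reward -= w_collision
    return reward
```

The point of the sketch is only that the imperfection described above shows up as a weighting problem: no setting of these weights yields a policy that is both perfectly safe and able to negotiate real traffic.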

From a pragmatic point of view, a realistic target would be to reduce the probability of an accident by two to three orders of magnitude. In other words, the U.S. sees around 35,000 fatalities a year from car accidents; if robotic cars could bring that number down to 35–100, society would be in a position to embrace the technology.

Similar examples can be found in other domains, such as airbags, where the huge benefits of saving lives outweigh the very small level of imperfection.

Mobileye is a leader in artificial vision technology for Advanced Driver Assistance Systems. How are vision and data analysis increasing driver safety, especially for seniors and people with disabilities? What’s the future of vision-based car solutions?

Many new vehicles today can optionally be equipped with Advanced Driver Assistance Systems (ADAS) technologies such as automatic emergency braking and lane keeping support. These systems are active safety features, as they avoid or mitigate collisions, as opposed to seat belts and airbags, which are intended to protect vehicle occupants only after a collision has occurred.

Computer vision technology is the primary enabler of these systems, and Mobileye is a leader in this field, with vehicle programs across 27 different automakers around the globe. Considering that more than 90 percent of vehicle collisions are caused by human error, the potential to reduce collisions, fatalities and injuries is very significant.

ADAS is rapidly evolving to handle an ever wider range of dangerous situations, and the vision systems that power ADAS are also the building blocks of perception for autonomous vehicles. The percentage of new vehicles equipped with ADAS is expected to grow meaningfully in the near term, as safety regulators around the world are beginning to require these systems as standard equipment for a vehicle to receive an acceptable safety rating.

Today’s technology advancements may look different from tomorrow’s innovations. What obstacles may inadvertently slow or stop these innovations in mobility? Are there any state policies that are making it easier or harder to deploy new technologies?

It’s difficult to be against innovation that has the potential to reduce one of the leading causes of death and injury around the globe. As mentioned above, increasingly stringent crash test rating systems are supporting higher adoption of these ADAS technologies. In addition, we are beginning to see proactive guidelines being published, as regulators recognize that the lifesaving potential of autonomous vehicles is too important to be held back by a slow-moving regulatory or legislative process.

At Mobileye, we are moving rapidly to develop the technology to enable this, and we have high confidence that we, and our partners, have the tools available to put production Level 4 autonomous vehicles on the road over the next several years.

How do you see advancements in automotive technology changing the way we work and live?

Beyond the lifesaving potential noted above, we see many other likely societal benefits: less traffic congestion; more flexible city planning with fewer dedicated parking areas; increased, safer mobility options for the elderly, people with disabilities and inexperienced drivers; and the potential for millions of people to make much better use of their commuting time.

Ford has said it will mass-produce self-driving vehicles without provision for a driver. Mercedes has said a backup driver will always be an option. What’s your view?

Our job at Mobileye, along with our partners, is to create the technology that makes autonomous vehicles work at the highest levels of functional safety. We believe that safely deploying vehicles that can be driven either by a human or by the artificial intelligence system is certainly possible. But it is up to our customers to determine their own strategy for how best to deploy the vehicles. And many different use cases are being contemplated: shared autonomous fleets, mobility-on-demand programs, high-density shared commuting vehicles, shared vehicle ownership models and many more. What is clear is that nearly all automakers see autonomous vehicles as potentially driving a once-in-a-century revolution in transportation.

At CES 2016, Mobileye introduced new mapping technology that enables crowd-sourced, real-time data for precise localization and high-definition lane data, forming an important layer of information to support fully self-driving technology. How is the technology evolving?

We see our mapping and localization product, Road Experience Management (REM™), as a way to deliver a high-definition map, a clear prerequisite for autonomous driving, at extremely low cost and with great efficiency. We see crowd-sourcing as the only logical method to build an HD map, but you need a “crowd,” and the source data needs to be light enough to communicate wirelessly.

Mobileye has created a solution where single cameras on ADAS-equipped vehicles can be used to harvest the mapping data. We then use real-time geometrical and semantic analysis to compress the massive amount of data coming in through the camera into a low-bandwidth form (10 kilobytes per kilometer) that can be communicated wirelessly.
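To put the 10 kilobytes-per-kilometer figure in perspective, here is a rough sketch of how a sparse, landmark-based segment record can stay that small; the schema, field sizes and numbers are illustrative assumptions, not the actual REM format.

```python
# Illustrative sketch: a sparse, landmark-based road-segment record and a
# rough size check against the ~10 KB/km budget quoted in the interview.
# The schema and field sizes are assumptions, not the actual REM format.
import struct

# Hypothetical per-landmark record: type id, longitudinal offset along the
# segment, lateral offset, and a coarse confidence value.
LANDMARK_FORMAT = "<BHhB"          # 1 + 2 + 2 + 1 = 6 bytes per landmark
LANDMARK_SIZE = struct.calcsize(LANDMARK_FORMAT)

def segment_size_bytes(num_landmarks: int, num_lane_points: int) -> int:
    """Rough size of one 1 km segment: landmarks plus a sampled lane
    centerline (2-byte lateral offsets at fixed longitudinal spacing)."""
    header = 32                            # segment id, anchor GPS, versioning
    lane_geometry = 2 * num_lane_points    # e.g. one point every 10 m
    return header + LANDMARK_SIZE * num_landmarks + lane_geometry

# Example: 100 landmarks (signs, poles, lane-mark changes) and a lane point
# every 10 m over 1 km comfortably fits the quoted budget.
size = segment_size_bytes(num_landmarks=100, num_lane_points=100)
print(f"{size} bytes per km (budget ~10,000)")   # -> 832 bytes per km
```

The point is only one of scale: describing a kilometer of road by a handful of semantic landmarks and a sampled lane geometry, rather than raw video, is what keeps the upload within a few kilobytes.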

Since announcing this at CES, the technology and business relationships have been evolving well. Technical proofs-of-concept have been successful, and the Delphi/Mobileye vehicle demo at CES 2017 will be powered by REM.

On the business side, we are engaged in definitive contract negotiations around the globe, and we see very strong interest in partnering with us on this unique product. We expect the first contracts to be concluded soon.

The first REM map covering the highways of an entire country is expected by late 2018, in cooperation with an OEM and a mapmaker.

Finally, any sneak peeks you can share about your CES 2017 participation?

We certainly have a full agenda at CES this year. In addition to displaying our technology at the Mobileye booth, I will deliver a speech entitled “Artificial Intelligence for Autonomous Driving: State of Technology, Breakthroughs, and Why Alliances are Valuable,” which will describe our approach to endowing vehicle systems with the human-like intuition and intelligence to negotiate a nearly infinite number of complex driving situations. This is our approach to Driving Policy, the third pillar of Autonomous Driving, after Sensing and Mapping.

We and our partner Delphi will demonstrate our autonomous vehicle’s performance over a 6.3-mile course around Las Vegas that we believe is one of the most challenging courses an autonomous vehicle has tackled to date. The BMW/Intel/Mobileye partnership will highlight our progress to date and next steps.

See the next generation of self-driving technology at the Self-Driving Technology Marketplace at CES 2017.
