Autonomous Driving: Moving the Big Rocks to Reach Level 5

In April, I got an opportunity to sit down with Mohammad Musa, founder and CEO of Deepen.AI, to learn more about the company, its future vision and his perspective on solving the bigger problems in autonomous driving.

Shoieb: So tell us about yourself and your company.

Moh: Sure. I come from an engineering background. I spent some time in the semiconductor industry at Xilinx. Then I worked for another startup before moving to the gaming industry, where I worked in technical sales for a physics simulation company and a game engine/rendering company. After that I was at Google for six years and worked on a lot of different products, including Gmail, Google Drive, Calendar, the admin console, mobile device management and Chrome OS. I managed the launch team that launched a lot of new features and new products in that area.

In 2016, I left Google to start this company. I spent four months working in the virtual reality and augmented reality space and didn’t see a huge opportunity in that market. There was a lot of hype at that point but no real business. So we wanted to find something that was revenue generating and had a technical advantage over others, a high technical barrier and a real need in the market that we could address. We also looked for a large market size so that we could build a company that can actually grow and become successful. So we switched to artificial intelligence (AI), tried different markets and different solutions within AI, and settled on autonomous driving … as the market that we’re focusing on.

Shoieb: Okay, so you went through quite a journey. What were some of the factors that contributed to, or made you look into, autonomous driving?

Moh: Before I met my cofounders I spent some time on VR and AR. Everything I wanted to do required some computer vision. Then I teamed up with my cofounders, who are very strong in computer vision and deep learning. We tried six different markets, six different potential products using AI and computer vision, and one of them was autonomous driving. We picked that after six months of experimentation.

Shoieb: How would you characterize that experimentation? Was it some kind of baseline technology, an actual product or a proof of concept? Or was it more like market research?

Moh: We had some IP (intellectual property) around our approach to AI, and we tried to apply that IP to different segments. The only segment that looked exciting enough and big enough was autonomous vehicles.

Shoieb: Alright. At this point you’ve been working on this for two years, and you settled on autonomous driving less than a year after founding the company. So at this point, do you have a product, or are you still in technology development?

Moh: We have three products. We have a 2D annotation tool called Rannotate, as in reality annotations, and a second tool called Eve, which is for fused-sensor, multisensor understanding, annotation, visualization and AI benchmarking.

Eve is our bread and butter. That’s where we have a ton of IP. It has computer vision, tracking and prediction; a lot of different algorithms, some of them made in-house and some that we take from academia, from state-of-the-art research at the universities, and productize. It blends unique IP that we built in-house with state-of-the-art sensor fusion across LiDAR, camera and radar, so we can scale depending on the customer’s needs. Eve is currently a Linux tool, and we’re working on a more scalable version of it. Rannotate is already on the web, but we’re still improving the feature set and functionality, so that will be coming soon. We also have a Linux version of Rannotate that is already in production; our in-house team is using it.
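
[Editor’s note: As a rough illustration of the kind of fused LiDAR/camera work a multisensor annotation tool has to handle, the sketch below projects LiDAR points into a camera image using calibration matrices. It is a generic geometric example with made-up calibration values, not Deepen.AI’s actual code or API.]

```python
# Illustrative sketch: projecting LiDAR points into a camera image, the basic
# geometric step behind fused LiDAR/camera annotation. All calibration values
# and the point cloud below are hypothetical.
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_from_lidar, K):
    """Map (N, 3) LiDAR points to pixel coordinates in a camera image."""
    # Homogenize and transform points from the LiDAR frame to the camera frame.
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera.
    pts_cam = pts_cam[pts_cam[:, 2] > 0]

    # Perspective projection with the camera intrinsics.
    pix = (K @ pts_cam.T).T
    return pix[:, :2] / pix[:, 2:3]

# Hypothetical usage with made-up calibration and a fake 100-point sweep.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)                           # identity extrinsics for illustration
cloud = np.random.rand(100, 3) * 20.0   # random points as a stand-in for real LiDAR
pixels = project_lidar_to_image(cloud, T, K)
```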

Shoieb: Okay. Any customers that are publicly announced?

Moh: Not publicly announced yet, but we do have several customers.

Shoieb: That means you have revenue coming in, and you’re not a pre-revenue company.

Moh: Yes, we do have revenue coming in.

Shoieb: That’s very impressive, to reach the revenue stage with an effort that is only, what, nine months old.

Moh: Yeah, the team has been great. They work fast. I try to get the company to focus on opportunities that are defensible and where there’s a true market need; just building technology for the sake of technology isn’t going to help anybody. By understanding what our customers want, adding our secret sauce to the solution, and providing it in a way that fits what the customer needs, we can make things happen.

Shoieb: Sure. Is your solution strictly software only or is there a hardware component to it?

Moh: We are software only.

Shoieb: When you talk about sensor fusion, does that mean you are working with other sensor makers?

Moh: Yeah, we have a couple of sensor companies as customers, and we’re working with tier-one suppliers and OEMs on the sensors that they’re using.

Shoieb: Is it fair to say that you are working on the perception of the vehicle, or more on the sensor fusion, data collection, processing and learning?

Moh: We picked the tooling segment. We’re a software tools and services provider. We don’t build perception AI to sell it; we build perception AI for our internal use. It’s a different use case because we don’t want to run our code on the car. We run it offline, either in the cloud or on the edge, but not for the purpose of running on the car; that’s the job of our customers.

Shoieb: As a software tools and services provider, what’s your value proposition to your customers?

Moh: We help our customers build the best, most accurate AI for perception, planning, prediction, tracking, classification, control, actuation, optimization … all of the different things that are required for any robotic system, using our tools. We help our customers deliver their solutions much faster and with much higher accuracy, significantly reducing the cost of some of the challenges they have to solve around data annotation. Our annotation tools are packed with AI that makes human annotators much faster, more accurate and more productive. We have IP in that space for both camera and LiDAR, and pretty soon we’re going to introduce some radar IP as well.
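
[Editor’s note: A minimal sketch of the model-assisted pre-labeling idea described above: a model proposes labels and the human annotator only reviews and corrects them. The detector and review functions here are hypothetical placeholders, not Deepen.AI’s product APIs.]

```python
# Illustrative sketch of AI-assisted annotation: the model does the bulk
# labeling, the human verifies. Everything below is a hypothetical placeholder.
from dataclasses import dataclass
from typing import List

@dataclass
class Box:
    x: float
    y: float
    w: float
    h: float
    label: str
    score: float

def pretrained_detector(frame) -> List[Box]:
    """Stand-in for any off-the-shelf 2D detector producing label proposals."""
    return [Box(100, 50, 40, 80, "pedestrian", 0.92)]

def human_review(frame, proposals: List[Box]) -> List[Box]:
    """Stand-in for the annotator UI step: keep, adjust or reject proposals."""
    return [b for b in proposals if b.score > 0.5]

def annotate(frames) -> List[List[Box]]:
    labels = []
    for frame in frames:
        proposals = pretrained_detector(frame)          # machine does the bulk work
        labels.append(human_review(frame, proposals))   # human handles the edge cases
    return labels
```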

Shoieb: So with that said, you will not only work with tier-one suppliers and OEMs, but also with companies that provide perception systems to automakers, because you are an enabler for these providers.

Moh: Yes, we’re building the picks and shovels of autonomy. You have to annotate data in order to develop AI models, you have to optimize your AI to run on the edge for any robotic application, and you have to have a pipeline for tooling and for managing your data sets: how you go about data collection, understanding and management. These are big, big problems in this space. That’s what we help our customers with.

Shoieb: When it comes to automotive perception systems and all the data collection, processing and learning involved, that is, building the data models and applying them so that the system becomes a learned system, there’s a ton of computation involved. Is your system real time, or does that not matter?

Moh: It doesn’t matter, because it’s running offline. We have some components that run in real time, but not on the car itself. People use laptops, so we have a mix of AI running in the cloud and AI running on the laptop itself. We have optimized our AI to run on cheaper devices in a reasonable amount of time, because running AI in the cloud all the time costs a lot of money.

We are using Google Cloud and Amazon Web Services (AWS). We are on the Google TPU (Tensor Processing Unit) beta, and we are an Nvidia partner as well; we’re part of the Nvidia Inception program, where we use a lot of Nvidia GPUs. We also have an AI server here locally, with many GPUs on one motherboard, an industrial-grade system that we use to run our training and more advanced tasks. It’s not as fast as what you’d get with a cluster of Google TPUs, for example, but it’s something we built ourselves for a few thousand dollars rather than paying 10 bucks an hour for cloud services. So we use both. There are some things it’s better to use the cloud for and some things you can run in-house.
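
[Editor’s note: A quick back-of-the-envelope check on the cloud-versus-in-house tradeoff mentioned above. The $10/hour cloud rate is from the interview; the $4,000 server cost is an assumed figure within the “few thousand dollars” range.]

```python
# Break-even point for an in-house GPU box versus renting cloud compute.
CLOUD_RATE_PER_HOUR = 10.0     # quoted cloud cost per hour
IN_HOUSE_SERVER_COST = 4000.0  # assumed one-time hardware cost

break_even_hours = IN_HOUSE_SERVER_COST / CLOUD_RATE_PER_HOUR
print(f"In-house box pays for itself after ~{break_even_hours:.0f} hours")
# ~400 hours; beyond that, sustained training is cheaper locally, while bursty
# or very large jobs can still make sense in the cloud.
```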

Shoieb: It seems that you provide your customers a turnkey solution. What options do they have?

Moh: Our customers have multiple options. They can connect to AI running on the cloud built by them, or they can connect to AI running on the cloud built by us, or they can use our AI on the edge, on their own laptops and servers or devices directly. We have to support multiple modalities.

Shoieb: Do you see yourself going through an evolutionary process where you become more than a tooling company, perhaps a household name, in the years to come?

Moh: We made a business decision not to compete with our customers. So we’re not going to try to build an autonomous mobility service that competes with Uber or Lyft; that’s not our business model. We are not trying to be a mapping company. We are not trying to be a drone, last-mile delivery robot or trucking company. We actually have customers in all of these segments, so it’s a very intentional decision not to compete. We will have a car, and we will drive it around to collect our own data and experiment with our own algorithms, but never for the purpose of releasing an autonomous product. We just want to be in the shoes of our customers who are building these autonomous products, so we know exactly what pain points they’re facing and what scale issues are blocking them. We build solutions that work for us, use them internally, and then launch a working, delightful product that customers can use and enjoy the benefits of as soon as it launches.

Shoieb: Do you have a vehicle that goes around and collects real-time data?

Moh: Right now we have partners who are collecting the data, so we leverage the partner data. After we fundraise, the next step is to get a couple of vehicles and collect our own data, strictly for testing purposes. We need our own data to help improve our AI; we can’t use customer data to benefit our AI.

Shoieb: From your vantage point you are an enabler, and you are looking at your customers and their problems. What are some of the challenges that still need to be conquered in this space?

Moh: Pretty much all of the autonomous stack has not been figured out yet. Everybody, including Waymo, still has not been able to remove the human from the equation. If you look at the cars that do not have a safety driver, they’re required to have a remote driver, a tele-operator who connects to the car and takes control. We are still very far away from a true autonomous vehicle. That’s what needs to be solved, and that’s across the stack, from perception to planning to prediction to actuation and optimization … there are opportunities across the board.

Shoieb: How far are we from having a Level 5 vehicle?

Moh: We’re not going to have a Level 5 vehicle. I don’t think it’s economical to have a Level 5 autonomous vehicle, Level 5 meaning a car that is completely autonomous, that can drive anywhere, anytime, without a pre-planned high-definition map inside the vehicle. I don’t think that’s ever possible. I don’t think that’s ever economical, actually.

Shoieb: But, don’t you think at some point it may become possible?

Moh: Technically you can make it happen, but it’s not going to be economical. However, for a limited use case, limited functionality, limited speed, limited geographic area and limited payload … that is possible in five years. For the robo-taxi kind of dream to materialize in a generic way, I think that’s very far away, maybe 10 to 15 years. But for a geo-fenced area with a very accurate, fresh, high-definition map stored on the vehicle, with a good set of sensors, backed by a good OEM that manufactures the vehicle and ensures its reliability and safety, then yeah, maybe that can happen in the next five years.

Shoieb: So what we’re talking about is getting to Level 3 or 4?

Moh: We are talking about Level 4, totally autonomous but where you can still take over from the computer in the car. For the most part, though, the car is making all the decisions.

Shoieb: So it’s not a completely steering-less or pedal-less car?

Moh: Yeah, I don’t think we will have a steering-less car, and there is a nuance: a steering-less car that is still operated by a remote person is still driven by a human. That distinction is important. A truly autonomous car that is not meant to be controlled by humans at any point, I think, is not feasible anytime soon.

Shoieb: When you talk about Level 5 not being economically feasible, you’re really talking about overhauling and retooling the physical infrastructure. And, that will be very expensive.

Moh: Yes!

Shoieb: Some of the autonomous vehicle companies are coming out with fleets of vehicles, Waymo for instance. Do you think they are still in testing mode, just collecting more data and real-world scenarios?

Moh: There are monetization opportunities, not just from mobility. The data has a lot of value, and the ecosystem around it will evolve: things like smart-city infrastructure, the insights you can generate from the data, and the personalization you can enable for people when they get into an Uber and do work, catch up on phone calls, email or whatever. The ecosystem that autonomy enables is a lot more than just what you can get out of Level 4 and Level 5. Even a good, solid Level 3 type of functionality is a huge booster. If you can really let go of the steering wheel in bumper-to-bumper traffic and just rely on the car making the right decisions, since it knows your destination, and as soon as the traffic conditions change the system hands control back to the driver, that can add a lot of productivity and value for the customers and the car owners.

I think the rest of the industry thinks of it as a zero-sum game: either you have autonomy or you don’t. It’s a lot more gray than black and white, and you’ll see it in what companies can enable with last-mile delivery robots, drones, home robotics or other types of robotics. If you survey other market segments or verticals, whether it’s agriculture, defense and military, search and rescue, industrial or manufacturing, autonomy can add a lot of value there. It’s the same technology stack, just a different application. That’s what got us excited about this market … it has a lot of potential and a lot of utility across different segments, which makes it a little bit future-proof, if you will.

Shoieb: Some of the Chinese companies are combining two initiatives, namely autonomy and the electrification of the vehicle. Electric vehicles still have limited energy, and all the sensors in the vehicle require power to function. What are your thoughts?

Moh: There’s a lot of innovation happening in compute, battery life, AI and sensor power consumption, so the technology will evolve. These cars are intelligent, so they will know how long a trip they can make. Ultimately the car will know when to go charge itself, when to join the power grid, when to dispatch from the grid, disconnect, go serve, come back, and so on. It’s too early to worry about that whole infrastructure ecosystem. I think people need to build a truly autonomous vehicle first. Once that works and is guaranteed to deliver some ROI, then they should start focusing a lot more on the energy aspect, the storage, and so on.

Shoieb: But you know, there are companies who are combining these two initiatives. I have an electric car in the household, and sometimes I run into challenges where it shows me there’s enough charge left, but I crank up the heat and turn on the seat warmer, and then I find myself driving on the reserve charge. Then I begin to wonder how the heck I’m going to get back home.

Moh: Like I said, there’s a lot of innovation happening, with multiple startups looking at these issues. If you think the sensors alone consume a lot of electricity, imagine having 10x the number of electric cars in the system. Can the electric grid even support charging all of those cars? Imagine millions of electric cars connecting at night, trying to charge at the same time.

These are bigger problems that people are working on. They’re not yet fully solved, and they don’t need to be fully solved right now. There are some feasible, viable solutions, but some of them are very expensive and not economical. So some things are doable but very, very expensive, and the low-hanging fruit has all been picked already. There is no low-hanging fruit in this market right now, no quick wins.
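
[Editor’s note: To put the overnight-charging concern above in perspective, here is a rough calculation with assumed round numbers, not figures from the interview: one million cars on typical 7 kW home chargers.]

```python
# Rough aggregate grid load if many EVs charge simultaneously. Both numbers
# below are assumptions for illustration only.
CARS_CHARGING = 1_000_000    # assumed: one million EVs plugged in at once
CHARGER_POWER_KW = 7.0       # assumed: typical Level 2 home charger

total_load_gw = CARS_CHARGING * CHARGER_POWER_KW / 1_000_000
print(f"Simultaneous charging load: ~{total_load_gw:.0f} GW")
# ~7 GW, on the order of several large power plants serving just this one load.
```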

Shoieb: What’s the long-term vision for your company?

Moh: We see ourselves as an enabler for the short-term future. We think the capabilities and technology we are building and investing in have their biggest market and biggest potential in the autonomous driving space, which is driving the market right now. But the same technology is useful for any intelligent robotic application, including manufacturing, industrial drones, home robots, and so on. So if we do really well in autonomy and become a market leader on the tooling, services and infrastructure side, then we can start looking at adjacent markets that need similar technology and grow from there.
 
Shoieb: Great! Thanks for your time.
 
Moh: Yes, no problem. Great talking to you!

EVE by Deepen.AI