Mythic’s AI inference chip performs hybrid digital/analog calculation inside flash arrays, running the inference step of deep neural networks inside the same memory array that stores the network weights long term, bringing huge advantages in performance, power efficiency, and accuracy.

Mythic: Enabling Intelligence at the Edge

By Andreas Stavropoulos and Mo Islam

Today, we are excited to share that Mythic closed its $40 million Series B round, led by SoftBank Ventures.

Congratulations to Mike Henry, Dave Fick, and the entire team at Mythic for the great progress they’ve made with their novel AI inference chip, which attracted a roster of terrific new investors who share our vision to enable an entirely new class of truly edge-intelligent devices. We were thrilled to have the opportunity to double down on our investment and continue to be Mythic’s largest institutional investor.

We were introduced to Mythic two years ago by Naveen Rao, corporate vice president and general manager of the AI Products Group at Intel and former CEO of Nervana, our first deep learning investment back in 2014. We led Mythic’s Series A soon after, and are privileged to have the opportunity to partner with this incredible team. They continue to keep us true believers in the disruptive potential of deep learning across every industry.

While the cloud was where we started, the edge is inevitably where we must move.

Despite all the hype around the internet of things, most current devices at the edge are bandwidth-constrained and functionality-limited because they must do the heavy lifting in the cloud. Try to get Siri to play your favorite song or Nest to distinguish your mailman from an intruder. The reason these fail to do the job is simple: the computation needed for sophisticated voice recognition and computer vision is too complex and power-hungry to run on the device itself, and round trips to the cloud are too slow to meet consumer expectations.

Until now, that is.

Mythic brings to market a radically novel analog processing-based chip that has the potential to bring an enormous amount of computing power to a small, smart edge device.

Leveraging the power of flash memory to perform enormous matrix multiplications for deep learning, Mythic will have the world’s most power-efficient AI inference chip, one that can serve everything from drones to autonomous vehicles. The sensory cortex of the intelligent edge is being built, and we are excited to see Mythic power it.
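To give a feel for the core operation involved, here is a toy simulation of analog in-memory matrix-vector multiplication, the workhorse computation of neural-network inference. This is an illustrative sketch only, not Mythic’s implementation: the 8-bit quantization and the Gaussian readout-noise model are assumptions for demonstration. Conceptually, weights are stored as cell conductances, inputs are applied as voltages, Ohm’s law performs each multiply, and Kirchhoff’s current law sums the products on a shared line.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(w, bits=8):
    """Map weights onto discrete levels, mimicking the limited number
    of distinguishable analog states a flash cell can hold.
    (The bit width is an assumption for illustration.)"""
    levels = 2 ** bits
    scale = np.abs(w).max() / (levels / 2 - 1)
    return np.round(w / scale) * scale

def analog_matvec(weights, x, noise_std=1e-3):
    """Simulate an analog matrix-vector product: quantized weights
    play the role of conductances, and a small Gaussian term models
    noise introduced during analog readout."""
    currents = quantize(weights) @ x
    return currents + rng.normal(0.0, noise_std, size=currents.shape)

# Compare the simulated analog result against an exact digital matvec.
W = rng.standard_normal((256, 784))   # one toy neural-network layer
x = rng.standard_normal(784)          # one toy input activation vector
exact = W @ x
approx = analog_matvec(W, x)
print("max abs error:", np.abs(exact - approx).max())
```

The point of the sketch is the trade-off it makes visible: the analog result is close to, but not identical to, the exact product, and the win is that the multiply-accumulate happens where the weights already live, so no energy is spent moving them to a separate processor.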

Their incredible benchmarks and strong early customer demand helped catalyze this investment round that will bring them to the mass market. We are excited to have SoftBank Ventures leading a group of new investors, including Lockheed Martin and Andy Bechtolsheim. We are also pleased to welcome Rene Haas, president of ARM, to the board of directors.

Congratulations again to the Mythic team from all of us at DFJ!

Mythic team outside their Redwood City offices.