Lightmatter + GV

Nicholas Harris
Lightmatter
Feb 25, 2019 · 2 min read

Today, we’re excited to announce the close of our $22M Series A-1 funding round led by GV (formerly Google Ventures) with participation from Spark Capital and Matrix Partners. This brings our total Series A raise to $33M.

This January marks one year since we started — a lot has happened. We carefully assembled a team of 23 world-class scientists and engineers to develop a scalable platform for high-throughput, high-efficiency artificial intelligence computing. We taped out our first (successful) test chip with transistors and photonic elements from start to finish in four months. Eight months after that, we taped out a chip with over a billion transistors — a testament to the talent and drive of the Lightmatter team. Looking forward, we will need even more talent and drive to deliver the first photonics-based artificial intelligence accelerator product.

While integrated photonic systems are typically considered in the context of communication in datacenters, we’re leveraging them to address the growing demand for compute, largely driven by artificial intelligence, in the post-Moore’s Law era. It’s worth noting that the end of Moore’s Law isn’t (yet) due to the inability of chip makers to shrink transistors. If you’re going to pack more transistors onto the same-sized chip, which has been happening for decades, those transistors need to become commensurately more energy efficient; herein lies the problem. At Lightmatter, we are addressing this scaling challenge in the context of artificial intelligence.

The leading artificial intelligence algorithms (deep neural networks) of today are mainly composed of a single operation: the matrix-vector (or tensor-tensor) product. From a mathematical standpoint, this is perhaps unsurprising since deep neural networks are inspired by networks of neurons in the brain. The connectivity of these neurons, wherein information is believed to be encoded, can be straightforwardly tabulated in the form of an adjacency matrix. Assuming artificial intelligence algorithms continue to take hints from biology, matrix math is likely to be relevant for a long time to come.
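To make this concrete, here is a minimal sketch of a dense neural network layer, whose core operation is exactly the matrix-vector product described above. The layer sizes, weight values, and the choice of a ReLU nonlinearity are illustrative assumptions, not a description of any particular Lightmatter workload.

```python
import numpy as np

# Illustrative toy layer: 3 inputs feeding 4 neurons.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))  # weight ("adjacency") matrix encoding neuron connectivity
x = rng.standard_normal(3)       # input activation vector

y = W @ x                 # the matrix-vector product at the heart of the layer
y = np.maximum(y, 0.0)    # elementwise nonlinearity (ReLU), assumed for illustration

print(y.shape)  # (4,)
```

Stacking many such layers — each dominated by a matrix-vector (or, with batching, tensor-tensor) product — is what makes this one operation the computational bottleneck of deep learning.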

Today, digital electronic computers perform matrix-vector products using multiply-accumulate units (sometimes arranged in 2-D arrays). At Lightmatter, we replace these units with an optical counterpart: the programmable Mach-Zehnder interferometer. This photonic device is not bound by the physics that limit transistor-based electronic circuits — opening an avenue towards continuing the currently broken trend of exponential growth in compute per unit area within a practical power envelope.
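For readers unfamiliar with the device, a lossless Mach-Zehnder interferometer acts on a pair of optical modes as a programmable 2×2 unitary matrix. The sketch below uses one common textbook convention (two 50:50 beam splitters with internal and external phase shifters); the parameter names `theta` and `phi` and this particular decomposition are assumptions for illustration, not Lightmatter's actual design.

```python
import numpy as np

def mzi(theta: float, phi: float) -> np.ndarray:
    """2x2 transfer matrix of an ideal Mach-Zehnder interferometer.

    Built from two 50:50 beam splitters with an internal phase shift
    (theta) between them and an external phase shift (phi) on one input.
    """
    B = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)  # 50:50 beam splitter
    P_int = np.diag([np.exp(1j * theta), 1.0])     # internal phase shifter
    P_ext = np.diag([np.exp(1j * phi), 1.0])       # external phase shifter
    # Light passes through the rightmost factor first.
    return B @ P_int @ B @ P_ext

U = mzi(0.7, 1.3)

# A lossless interferometer is unitary: it redistributes optical power
# between the two modes without dissipating it.
assert np.allclose(U @ U.conj().T, np.eye(2))
```

Because tuning `theta` and `phi` programs the matrix elements, meshes of such interferometers can realize larger matrix transformations on light, which is what makes them a candidate replacement for electronic multiply-accumulate arrays.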

There’s a lot of work ahead and we’re thrilled to bring the GV team on board as we continue to build out our team and realize a new kind of computer that’s fast and efficient enough to drive the next generation of artificial intelligence.

To learn more about Lightmatter, please visit lightmatter.co.
