
Creating world-changing energy storage tech is tough. Doing it during a pandemic is even tougher. Here’s how The Engine’s portfolio company Form Energy is pushing forward amidst global uncertainty.

By Ted Wiley, Co-Founder, President & COO of Form Energy


We founded Form Energy with the assumption that the R&D of our core technology must be done by a team in the same space — shoulder to shoulder in the lab. …


Part V of our series, “Real Perspectives on Artificial Intelligence” features Rick Calle, AI business development lead for M12, Microsoft’s venture fund.

How energy-intensive is AI infrastructure today? And what does that mean for the future of the discipline?

Rick leads AI business development for M12, Microsoft’s venture fund. He works at the intersection of AI algorithms, hardware computing efficiency, and novel AI use cases. During his time at Qualcomm AI Research, he worked with the team that launched the Qualcomm AI Engine in over 100 different models of AI-enabled mobile phones.

Today’s AI algorithms, software and hardware combined are 10X to 100X more energy-intensive than they should be. In light of Microsoft’s recent announcement of its carbon negative commitment, my challenge to the industry is clear: let’s improve AI hardware and software so that we don’t overheat our planet.

The computing industry is always optimizing for speed and innovation, but not necessarily considering the lifetime energy cost of that speed. I saw an inflection point around 2012 when the progression of AI hardware and algorithmic capabilities began to deviate from Moore’s law. Prior to that, most AI solutions were running on one, maybe two processors with workloads tracking to Moore’s law. …


Part IV of our series, “Real Perspectives on Artificial Intelligence” features Dan Huttenlocher, the inaugural dean of the MIT Schwarzman College of Computing.

No matter how responsibly developed AI may be, its generality seems at once its greatest asset and its greatest danger. How do you reconcile this duality?

Dan is the inaugural dean of the MIT Schwarzman College of Computing. Previously, he helped found Cornell Tech, the digital-technology-focused graduate school created by Cornell University in New York City, and served as its first Dean and Vice Provost.

I think it’s a great question. Much of this — and I don’t know if it’s fear of AI, exactly — much of the sense of the potential dangers of AI comes from a misunderstanding of what the technology really is.

There’s often a tendency to anthropomorphize technology and with AI this tendency is much more extreme. In the end, these are still just algorithms. When we’re using machine learning, for example, they are algorithms that we can teach instead of algorithms that we have to code. But they’re still algorithms. They’re not going to become evil. There’s no rational basis for that worry at the present time (I’m not saying it’s impossible for that to be true in some future with technologies one cannot foresee today, but for the path we are on it is science fiction). …

About

The Engine

Built by MIT, we help founders create the next generation of world-changing companies. Visit us at engine.xyz.
