Learnings from a Deep Tech Horizon Scanning Project

Lawrence Lundy-Bryan
Lunar Ventures
May 4, 2022

Atoms and bits, AI is a GPT, and centralising all the things

At the beginning of the year, we began a horizon scanning project to put some structure around how we think about the future. Horizon scanning is often done by companies and governments, but rarely in venture capital, which is all about making bets on the future. Maybe if you are a SaaS, crypto or metaverse investor, that’s cool; you’ve made your bed. But in deep tech, with longer timelines and greater science and engineering risk, you have to think deeply about where to invest, not just what to invest in. Should we look at DNA storage or holographic storage? How much science risk is left in quantum computing or nuclear fusion? How should we weigh the probability of a central bank digital currency (CBDC) + stablecoin crypto outcome versus full-fat, full-ponzi cryptoassets? (I joke, kinda.)

So we got together and have so far researched 50+ technologies. We have another 100 to get through by the end of the year. I’m most looking forward to space elevators, which are a real thing. We’ve done everything from drones to stablecoins to small modular reactors to clean meat. Using the same framework to assess different technologies across energy, materials, biology, and electronics means we have unearthed some unexpected connections.

1. Atoms AND bits, aka deep tech

Oversimplifying for narrative clarity, the last 20 years have been about bits, and the next 20 years will be about atoms. We’ve created wonderful(ish) digital spaces, brought the Internet to billions of people and stuck software in all sorts of things. Now we will connect all these spaces, make them persistent, give them economies and access them in higher fidelity through goggles stuck in our ears and on our eyes. (The future!) And that is an important story, but it won’t be the most important story. In fact, the whole “we wanted flying cars, instead we got 140 characters” meme acknowledges that we haven’t really applied computing and software to atoms yet. But things are changing. The climate emergency has put a renewed focus on how we manipulate atoms to generate, move, and store energy. Huge direct air capture plants. Solar and wind everywhere. And then fusion in the 2030s. On the biology side, it’s taken a long time. Still, we are about to see an inflection in the bio-economy. Cheap genomic sequencing, novel gene-editing tools, and advances in DNA synthesis, combined with AI and noisy intermediate-scale quantum (NISQ) computing, promise transformation IRL, not just better ads. The ability to read, write and engineer biology promises unprecedented advancements in energy, food, materials and drugs. Not to mention the elimination of disease and maybe even ageing. 2002 wasn’t really all that different from 2022. I’m still listening to Original Pirate Material because music used to be better. We have faster internet, smartphones and culture wars, yes, but we still get most of our energy from the same places, go to the same hospitals, and travel using the same cars. Sure, in 2042 I will still be listening to Original Pirate Material, but IRL will likely look very different.

2. AI is a general-purpose technology (GPT)

Counterpoint: the future will actually be bits; it will be AI. General-purpose technologies were rare in the 20th century; we only had seven: the automobile, the aeroplane, mass production, the computer, lean production, the Internet, and biotechnology. And so far in the 21st century, we’ve only had nanotechnology. There have been many calls to categorise AI as a GPT, and it’s hard to look at all the emerging technologies and not identify AI as a catalyst. Maybe it’s because AI has been around for such a long time, and in the attention economy, new is always better, but for many, AI is old news. DeepMind with AlphaGo and AlphaFold, and OpenAI with GPT and DALL-E, grab headlines, but it’s not fully appreciated how early we still are in the AI transformation. Most of the energy is still in the dirty work of labelling, organising, and moving data for use. As Robert Solow quipped, “You can see the computer age everywhere but in the productivity statistics.” The best parallel is the shift from steam-powered to electricity-powered factories in the early 20th century.

Before electrification yielded the expected productivity gains, factories needed to change their architecture and production processes. Steam-powered factories were built around a single massive steam engine, which turned a steel drive shaft that ran along the length of the factory. Replacing the steam engine with one big electric motor didn’t deliver productivity gains. Giant leaps came only when factories were redesigned around individual electric motors powering each machine, so the layout could follow the production line rather than the drive shaft. We are at a similar moment with AI. AI projects fail to live up to expectations because companies haven’t redesigned their processes to take advantage of the new technology. AI will allow software and machines to hear, see, understand, predict and act. The 20s and 30s will see AI become a fully-fledged GPT, a fact that, remarkably, is still underappreciated in 2022.

3. Centralise all the things

You can’t think about the next 20 years without carving out a chunk for crypto. Regardless of the outcome of the crypto project, it has injected interest and money into the development of distributed systems. A pattern we can see throughout the work to date: cheaper production and distribution of bits and atoms will make it easier to do stuff locally. Wind, solar, home healthcare diagnosis, 3D printing (finally), federated learning, blockchains, decentralised autonomous organisations (DAOs), synthetic media, and plenty of others fit this pattern. The 20s will see more stuff done locally. But the 20s and especially the 30s will see more stuff done centrally. Fusion looks increasingly likely to come on stream in the 20s, centralising energy production again. Even without general-purpose fault-tolerant quantum computers, noisy intermediate-scale quantum (NISQ) computers will centralise computing for high-performance applications like large language models (LLMs). LLMs are likely to underpin most AI applications, which probably means most of the economy by the 2030s. And in the bio-economy, yes, production and distribution costs will fall. Still, regulation, as much as economies of scale, will require centralisation. For energy, computing, and biology, centralisation forces will get stronger in the 30s. And look, this is history rhyming. In the 60s and 70s, we had strong centralisation forces around the mainframe. In the 80s and 90s, decentralisation forces were in the ascendancy with the advent of personal computers. Those, in turn, centralised around the Cloud in the 00s and 10s. Decentralisation will change everything. Until centralisation changes everything again.

4. The future is closer than you think

The final point I’ll make is about optimism. It’s easy to look at demographics, especially in China, and at the age of globalisation coming to an end re: Russia, and see the next 20 years as challenging for the global economy. If you wake up and read the news and look at the markets, it’s easy to be pessimistic. But it’s much harder to look at 150 technologies and not be optimistic about growth (maybe less confident about how we share the benefits of that growth). We are on the cusp (within 20 years) of cheap and abundant energy (fusion), stuff (3D printing), food (CRISPR), entertainment (VR, AR, the metaverse) and labour (AI). This isn’t even to mention genetic engineering or space travel!

I’m telling the story of the zero marginal cost society. Still, the surprising thing is just how soon that could arrive. Some claim fusion by 2029. Others say don’t be silly; there is no chance before 2039. But STILL, that’s 2039. Less than 20 years away. And that’s fusion, which has some of the biggest engineering challenges. There are only really commercial challenges left for CRISPR, virtual reality, artificial intelligence, and 3D printing. I think people are complacent because we’ve been here before. Dolly the sheep was cloned in 1996. VR was hyped in the early 1990s. And, well, 3D printing was supposed to take over everything in the 2010s. And look, here we still are. The zero marginal cost society seems further away than ever. And yes, some of these technologies will stall because of engineering challenges, geopolitics, ethics, regulation and a whole bunch of other constraints. But the big showstopping ones, other than fusion, are on the learning curve. We know they work and we know they can scale. The problem is cost. And the learning curve and the market solve that.
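To make that learning curve point concrete, here’s a rough, illustrative sketch (not from our horizon scan, and every number in it is made up): under Wright’s law, if unit cost falls by roughly 20% with every doubling of cumulative production, four doublings take the cost down to about 40% of where it started.

```python
# Back-of-the-envelope illustration of Wright's law (the experience curve).
# All numbers here are hypothetical, for illustration only:
# unit_cost = initial_cost * (1 - learning_rate) ** doublings,
# where doublings = log2(cumulative_units / initial_units).

import math

def unit_cost(cumulative_units: float,
              initial_units: float = 1_000,
              initial_cost: float = 100.0,
              learning_rate: float = 0.20) -> float:
    """Estimated cost per unit after scaling production, assuming a fixed
    percentage cost decline for every doubling of cumulative output."""
    doublings = math.log2(cumulative_units / initial_units)
    return initial_cost * (1 - learning_rate) ** doublings

# Four doublings (16x cumulative production) take the unit cost from 100 to ~41.
for scale in (1_000, 2_000, 4_000, 8_000, 16_000):
    print(f"{scale:>6} units -> cost per unit: {unit_cost(scale):.1f}")
```

The market does the rest: each cost drop opens new demand, new demand drives more cumulative production, and the curve keeps compounding.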

The purpose of the work is to put some structure around how we think about the future. And I can do the quantitative dance with a robust methodology and all that, but honestly, at the halfway point, the big thing is how it has made me feel. The work has made me more optimistic than ever. It feels like we’ve been putting down foundations for ages and everyone is like, “why is building the house taking so long?” And then BOOM, the builder is like, we were never building a house; we were editing genes, creating new materials and generating fusion energy.

I’ve been sent from the future to tell you: it’s closer than you think.
