Revenue durability in the LLM world

Gil Dibner
Angular Ventures
Apr 6, 2024


My favorite podcast these days is BG2Pod, a conversation between Bill Gurley and Brad Gerstner about the technology industry. The podcast feels like eavesdropping on an intimate and very thoughtful conversation between two very experienced friends who are genuinely trying to make sense of the complex world around them. It’s brilliant.

The last eight minutes of the latest episode of BG2Pod are a fascinating discussion of the commercial implications of the rise of LLMs. It’s probably the best eight minutes I’ve spent on that topic recently. Gerstner and Gurley make a few key points, but here are the three I found most interesting:

  • Tech is deflationary, and LLMs are massively deflationary. Intense competition among well-funded foundational model companies and tech giants is driving down the costs of deploying LLMs. Open-source models are nipping at the heels of even the most sophisticated proprietary models. The value of these models is going up quickly, but the ability of the companies behind those models to capture that value is increasingly in question.
  • Data gravity. The concept of “data gravity” poses a real hurdle to LLM-first challengers. It has typically been easier to move new capabilities to where masses of data reside than to move masses of data to new platforms. Even as most LLM providers pursue PaaS-like models, hoping that customers move data onto their platforms, we are seeing existing data platform companies (GCP, AWS, Azure, Snowflake, Oracle, etc) aggressively add vectorDB and LLM capabilities to their existing offerings. Business models that rely on convincing customers to move massive amounts of data to new platforms to enjoy the benefits of proprietary models face an uphill battle. It’s often easier and usually preferable to add these capabilities to existing platforms (especially if leveraging open-source tooling).
  • Revenue durability is the key challenge. LLMs are intrinsically at odds with switching costs. By their very nature, LLMs are great at ingesting and leveraging unstructured data. The wider the context window, the easier it is to feed data into an LLM and, hopefully, make sense of it. These realities make switching to an LLM-based solution relatively painless, but they also make switching out of one LLM-based solution and into another (perhaps open-source, perhaps internally built) just as painless. Low switching costs are great for customers, but bad for vendors.

Our search for revenue durability. The final point, on revenue durability, is the real clincher. Everything about LLMs seems to make revenue durability more challenging than ever. As VCs, our approach has been to focus tightly on this question as we evaluate the current generation of AI-first/LLM-powered companies, at both the infrastructure and application layers.

  • At the infrastructure layer, we are seeking companies that combine genuine technical innovation with deep enterprise-level workflow integration. The data gravity argument, coupled with the rapidly expanding capability of open-source models, suggests to us that most of the value in this ecosystem will accrue to infrastructure companies that empower enterprises to roll their own LLM-powered applications. We are less convinced by the straightforward PaaS models pursued by the foundational model vendors or vectorDB vendors.
  • At the application layer, we are seeking companies for whom AI/LLMs are perhaps a critical enabling technology, but not the prime value driver. The deeper the workflow integration, the greater the potential stickiness and switching costs for customers. We have begun to find a few enterprise application companies that leverage deep domain expertise to deliver high-value solutions. LLMs often play a supporting role here but rarely play the starring role. The path to value creation for these companies can be accelerated by LLMs in several ways: easier interfaces, natural-language queries, quick ingestion of unstructured data, and generative output. But these companies’ path to value capture usually comes from somewhere else entirely: domain-specific models, generation of novel proprietary datasets, deep integration into human enterprise workflows, etc.

LLMs and GenAI provide powerful new tools for technology vendors and customers alike. They change the playing field, but they do not change the fundamental rules of the game. Within the new (and still emerging) rules of the LLM/GenAI game, we continue to search for companies that have a path to creating significant customer value while capturing value for their shareholders.

Whether you are building on the infrastructure layer or application layer, we are eager to hear about your plans to create value for customers while capturing value for your shareholders over a long time horizon. If you have a thesis for revenue durability in this era of LLMs, we would love to have a chat — so please reach out.


A global venture investor. Fascinated by the finance of innovation. Trying to help the few to do the impossible. Investing across Europe + Israel.