2023 LLM Infrastructure Market Map

A framework to think through your GenAI tech stack

Hamza
4 min read · Nov 9, 2023

Introduction

The GenAI infrastructure space is seeing a lot of activity. This makes sense: new technologies require new tooling.

To help keep a pulse on the “who’s who” of the space, numerous VCs have published market maps. However, we noticed a big problem: the primary audience for these market maps is other VCs, not buyers.

Why are you making a market map about your own market?

We get it, it’s a bit unconventional.

At Autoblocks, we are obsessed with helping companies build better GenAI products. Part of this is ensuring our users are thoughtful about the way they construct their GenAI stack.

The space is very noisy, so we wanted to create a resource to help you think through the tools you’ll need to take your GenAI products to the next level.

LLM Infrastructure Market Map

It’s a pretty market map, but how do I read it?

In the following sections, we walk through the market map layer by layer, starting from the bottom and moving up.

Model Layer

The bottom layer is the model layer. Conceptually, think of these as the vendors supplying the “intelligence” in artificial intelligence.

  • Cloud API Models: Products like OpenAI and Anthropic provide models via cloud-based APIs, so you can easily plug their models’ intelligence into your product (see the short sketch after this list).
  • Open-Source Models: Products like Hugging Face and Replicate offer open-source model hubs, where you can find openly available models. Companies like Meta pre-train and open-source models like Llama 2.
  • Training & Data Labeling: Products like Labelbox, Scale.ai, and Snorkel help you create and label your own proprietary training data. Products like MosaicML help you pre-train in-house models.
  • MLOps: Products like Weights & Biases, Arthur AI, and Arize AI help you manage your traditional machine learning models.
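
To make “plug their intelligence into your product via API” concrete, here is a minimal sketch of calling a hosted model. It assumes the OpenAI Python SDK (v1.x) and an OPENAI_API_KEY in your environment; the same request/response pattern applies to other cloud API providers.

```python
# Minimal sketch: calling a cloud-hosted model through its API.
# Assumes the OpenAI Python SDK (v1.x) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # swap in whichever hosted model your provider offers
    messages=[
        {"role": "system", "content": "You are a helpful product assistant."},
        {"role": "user", "content": "Summarize this support ticket in one sentence."},
    ],
)

print(response.choices[0].message.content)
```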

Management Layer (Platforms)

The middle layer is the management layer (platforms). Conceptually, think of these as the vendors that help you with the operational management of your GenAI products.

  • LLMOps: Products like Autoblocks, Humanloop, and LangSmith help integrate the different parts of your stack, so your team can use them as a one-stop shop for managing your GenAI products.

Management Layer (Point Solutions)

The top layer is the management layer (point solutions). Conceptually, think of these as the vendors that help you with specific aspects of the operational management of GenAI products.

They aren’t end-to-end platforms, so while they may excel at one or two specific things, you’ll need to find a way to integrate them into the rest of your stack and workflow.

  • Prompt Engineering: Products like PromptLayer help you quickly iterate on your prompts.
  • Frameworks: Products like LangChain give you application frameworks to chain together LLMs to make them more effective.
  • Retrieval: Products like LlamaIndex and LangChain help you turn your data into embeddings and retrieve the relevant pieces at query time, so it’s usable by LLMs (a toy sketch of this pattern follows this list).
  • Vector Databases: Products like Pinecone and Weaviate store your embeddings in a vector database, which is built for the fast similarity search that retrieval-augmented GenAI use cases rely on.
  • Monitoring: Products like Helicone help you monitor performance metrics like token usage, latency, and cost.
  • Evaluation: Products like Patronus AI provide evaluation models to help you evaluate your application’s performance.
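
To show what the retrieval and vector database pieces do together, here is a toy sketch: embed a handful of documents, keep the vectors, and return the closest matches for a query. It uses the OpenAI embeddings API, and the in-memory cosine-similarity search is only a stand-in for a real vector database like Pinecone or Weaviate; the `embed` and `search` helpers are illustrative names, not any vendor’s API.

```python
# Toy sketch of the retrieval pattern behind vector databases:
# embed documents, store the vectors, and find the nearest ones at query time.
# The in-memory search below stands in for a real vector DB (Pinecone, Weaviate, ...).
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(text: str) -> np.ndarray:
    """Turn text into a dense vector using a hosted embedding model."""
    resp = client.embeddings.create(model="text-embedding-ada-002", input=text)
    return np.array(resp.data[0].embedding)

documents = [
    "Our refund policy allows returns within 30 days.",
    "Enterprise customers get a dedicated support channel.",
]
doc_vectors = [embed(d) for d in documents]  # a real system would upsert these into a vector DB

def search(query: str, top_k: int = 1) -> list[str]:
    """Return the stored documents whose embeddings are most similar to the query."""
    q = embed(query)
    scores = [float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v))) for v in doc_vectors]
    ranked = np.argsort(scores)[::-1][:top_k]
    return [documents[i] for i in ranked]

print(search("How long do I have to return a product?"))
```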

Platforms vs. Point Solutions

For managing your GenAI products, you’ll be faced with two distinct approaches:

Approach #1: Use point solutions and piece them together.

  • Benefit: They often excel at the specific “job to be done” they’re targeting.
  • Downside: They add a lot of implementation overhead, because you need to figure out how to integrate each one with the rest of your stack.

Approach #2: Use a platform for end-to-end GenAI product management.

  • Benefit: Platforms serve as a centralized solution that wrangles the various components of your GenAI product and development cycle. This matters because product velocity is of the essence for GenAI products: your ability to quickly and collaboratively refine your product largely determines how successful it will be.
  • Downside: Native features might not be “best-in-class.” In practice this is less concerning, because platforms often integrate with point solutions to give you the best of both worlds.

Autoblocks: The Most Adaptable & Collaborative LLMOps Platform

Autoblocks is a collaborative GenAI product workspace that allows you to quickly iterate on your GenAI products. Our platform eliminates the bottlenecks caused by fragmented tooling and miscommunication.

We equip product managers, developers, and other stakeholders with the tools they need to make the product refinement cycle as fast as possible.

Want to know what makes Autoblocks the most adaptable solution in the market? Learn more about our differentiated approach.

Hamza

Helping businesses build world-class GenAI products @Autoblocks