The New Record-Setting 100 Million Context Window Model

Three orders of magnitude ‘just because’

Ignacio de Gregorio
8 min read · Sep 2, 2024
Source: Magic Dev

750 novels. 10 million lines of code. That's how much information you can feed into Magic Dev's new Long-Term Memory (LTM) model in a single prompt, a model that breaks every context-length record (and it's not even close).
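Those headline figures hold up to back-of-envelope arithmetic. Here's a minimal sanity check, assuming roughly 133k tokens per novel and about 10 tokens per line of code (both ballpark conversion rates of mine, not figures from Magic's announcement):

```python
# Back-of-envelope check of the 100-million-token headline figures.
# The per-novel and per-line token counts below are rough assumptions,
# not numbers from Magic Dev's announcement.

CONTEXT_TOKENS = 100_000_000   # LTM model's claimed context window

TOKENS_PER_NOVEL = 133_000     # ~100k words at ~1.33 tokens/word (assumption)
TOKENS_PER_CODE_LINE = 10      # rough average for source code (assumption)

novels = CONTEXT_TOKENS / TOKENS_PER_NOVEL
code_lines = CONTEXT_TOKENS / TOKENS_PER_CODE_LINE

print(f"~{novels:,.0f} novels")             # ~752 novels
print(f"~{code_lines:,.0f} lines of code")  # ~10,000,000 lines
```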

While everyone agrees that AI progress is fast, nobody expected a model that increases the amount of content you can send in one prompt to 100 times the previous state-of-the-art (Gemini): potentially an entire lifetime of facts, experiences, and events in a single prompt.

In an industry that has gone for more than a year without a step-function breakthrough, this certainly feels like one. Here's why.

Get news like this before anyone else by subscribing to my newsletter, the place where analysts and strategists get answers to AI’s most pressing questions.

The Hard Truth With Current Models

To understand why models have a limited context they can see at any given time, we…
