A fascinating tree of GPTs & LLMs reveals what’s been going on under the covers

Paul Pallaghy, PhD
11 min read · Apr 28, 2023

Yann LeCun, who’s not entirely convinced GPT is as impressive as most of us think, has, all the same, just posted this amazingly insightful viz of LLM (large language model) history and relatedness by GitHub-er JingfengYang.

Yann LeCun just posted this fascinating tree of LLM variants on my LinkedIn feed. CREDIT | JingfengYang

Wow. Gotta love this.

Yann LeCun, the brilliant deep-learning pioneer, now at Meta, after dissing LLMs last week, is educating us about them this week.

It traces the technological heritage of each of the LLMs we know and love, plus another four or five dozen.

Going by the number of releases, as tracked in the side-plot, you’d think that Meta and Google are ahead of OpenAI in LLMs. But of course release count does not equal performance: OpenAI’s GPT-4 is generally agreed to be significantly ahead of Google’s Bard, and recent head-to-head tests report a large GPT-4 advantage.

But it’s a handy bar chart all the same, especially its stacked breakdown of open-source status. Be aware that ‘open source’ here includes ‘available for academic research only’, especially in the case of Meta LLMs. (Thanks to Voxist on Medium for that update.)

Insights

The root of the tree is rightly seen as Word2Vec and friends, as I recently pointed out.
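The core Word2Vec idea — learn a dense vector per word by predicting the words around it, so that words used in similar contexts get similar vectors — can be sketched in a few dozen lines. This is a toy skip-gram in NumPy, purely illustrative: the corpus, hyperparameters, and plain-softmax training are my simplifications, not Mikolov et al.’s actual setup (which used negative sampling at scale):

```python
import numpy as np

# Toy corpus; real Word2Vec trains on billions of tokens.
corpus = "the king rules the land the queen rules the land".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (word) embeddings
W_out = rng.normal(scale=0.1, size=(D, V))  # output (context) weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Skip-gram: predict each neighbouring word (window = 1) from the centre word.
lr = 0.05
for epoch in range(500):
    for t, w in enumerate(corpus):
        for c in (t - 1, t + 1):
            if 0 <= c < len(corpus):
                wi, ci = idx[w], idx[corpus[c]]
                h = W_in[wi]                   # hidden layer is just the embedding
                p = softmax(h @ W_out)         # predicted context distribution
                err = p.copy()
                err[ci] -= 1.0                 # gradient of cross-entropy loss
                grad_in = W_out @ err
                W_out -= lr * np.outer(h, err)
                W_in[wi] -= lr * grad_in

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def emb(w):
    return W_in[idx[w]]

# 'king' and 'queen' share contexts ('the _ rules'), so their vectors align.
print(cosine(emb("king"), emb("queen")))
print(cosine(emb("king"), emb("land")))
```

Everything downstream in the tree — ELMo, BERT, GPT — keeps this contextual-prediction objective but swaps the single static vector per word for context-dependent representations.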

