Let’s review productized GPT-3 together

Alex Schmitt
Cherry Ventures
Jan 14, 2021 · 4 min read

This post was written by Alex Schmitt and Max Brückner

Towards the latter half of last year, GPT-3 was seemingly everywhere. But even though the initial spike of interest has cooled off, we’re still interested (read: really, really interested) in what entrepreneurs can build with GPT-3.

That’s why, together with you, we created a comprehensive and, more importantly, collaborative map.

So, if you see any interesting GPT-3 developments or emerging use cases, we want to know. You can add them here. But more on that shortly.

Why natural language processing matters

Let’s quickly review just a few key concepts.


Natural language processing (NLP), the field under which GPT-3 falls, can help us cope with an ever-increasing flow of information. Companies with NLP at their core, such as Hyperscience, Mobvoi, or Element.AI, help their users build frictionless workflows that not only speed up processes but also reach better conclusions.

Professor Patrick Henry Winston, one of the most influential professors at the Massachusetts Institute of Technology, famously formulated the Inner Language Hypothesis, a conceptual basis for NLP. It states: “Using a symbolic inner language, we [humans] construct symbolic descriptions of situations and events that are far beyond the reach of other primates.”

Essentially, that’s just a long way of saying that if we want to build truly intelligent machines, we first have to understand the role of language.

And that’s where GPT-3 comes in.

So, what’s new about GPT-3?

All languages rely on internal structures and grammatical patterns. An NLP algorithm is no different: it exploits these same structures, applying concepts such as part-of-speech (like a noun or verb) or dependency relations (like subject-of) to understand language. Described in a widely read paper in May 2020 and released via API by OpenAI in June 2020, GPT-3 represents the newest generation of pre-trained language representations.
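To make those concepts concrete, here is a minimal sketch of part-of-speech tagging and dependency parsing. It uses the open-source spaCy library, which this post itself doesn’t mention; it simply illustrates the structures an NLP algorithm works with:

```python
# Minimal sketch of part-of-speech tags and dependency relations,
# using spaCy (chosen here for illustration; not mentioned in the post).
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("GPT-3 sets new standards for language models.")

for token in doc:
    # token.pos_ is the part-of-speech tag (NOUN, VERB, ...);
    # token.dep_ is the dependency relation (nsubj, dobj, ...) to token.head.
    print(f"{token.text:<10} {token.pos_:<6} {token.dep_:<10} -> {token.head.text}")
```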

In contrast to other pre-trained models, which require fine-tuning on a large dataset for each specific use case, GPT-3’s sheer size (trained on roughly 499bn tokens, and with 175bn parameters about 116 times larger than GPT-2) allows it to be largely task agnostic. Translated, that means the possibilities — and potential — could well be infinite.

“But here’s the really magical part. As a result of its humongous size, GPT-3 can do what no other model can do: perform specific tasks without any special tuning,” Google’s Dale Markowitz recently wrote. “You can ask GPT-3 to be a translator, a programmer, a poet, or a famous author, and it can do it with you providing fewer than 10 training examples. Damn.”

Damn is right.
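To make that few-shot pattern concrete, here is a rough sketch of how a prompt with a handful of examples was sent to the GPT-3 API as it looked in early 2021 (the legacy `openai.Completion` endpoint; the prompt wording and parameters are our own illustrative assumptions):

```python
# Hedged sketch of few-shot prompting against the early-2021 GPT-3 API
# (openai.Completion with the "davinci" engine). The examples below
# "teach" the task inside the prompt itself; no fine-tuning is involved.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = (
    "Translate English to German.\n"
    "English: Good morning.\nGerman: Guten Morgen.\n"
    "English: How are you?\nGerman: Wie geht es dir?\n"
    "English: Where is the train station?\nGerman:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=40,
    temperature=0.3,
    stop=["\n"],  # stop once the generated line is complete
)
print(response.choices[0].text.strip())
```

With just two worked examples in the prompt, the model is expected to continue the pattern and translate the third sentence.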

GPT-3 certainly sets new standards. But make no mistake: a wide set of limitations applies, controversial debates continue, and ethical boundaries are — and should be — considered. It goes without saying that GPT-3 isn’t magic, but it does represent a new frontier of sorts.

What we’re working on

But the main purpose of this post isn’t to tell you what GPT-3 is — there are better resources out there — but rather to figure out how it can best be used. To get to the bottom of GPT-3’s potential and development, we want to collaboratively and transparently share ongoing GPT-3 projects.

Just look at the GPT-3 projects we’ve already noticed here. We will regularly update that overview, but more importantly, you can contribute to this overview by submitting new projects here.

We’ve noticed that most of these projects are still experiments with what’s possible. But already, interesting use cases have emerged, with teams working to improve how we communicate, study, and ultimately work with data across industries and work settings.

How GPT-3 can be productized

From translating short natural-language snippets into code to imitating famous authors, GPT-3 use cases go well beyond what one would consider day-to-day language use.

For example, we’ve compiled the chart below to outline these use cases.

What we particularly like

We believe that the underlying infrastructure for NLP, such as GPT-3, will be further commoditized. Democratized access to machine-learning models accelerates innovation. A company can still build a defensible position by a) building a proprietary tech stack on top of commoditized NL models and b) pursuing a targeted product strategy.

A) A proprietary tech stack allows a company to query NL models cost-effectively by cherry-picking the most relevant user input data, so that each query returns a high-quality response. This can be enabled by building customized, user-centric models on top of the queries (a rough sketch follows after point B).

B) A targeted product strategy centers not only on clear product and feature design, but is, above all, supported by a strong go-to-market approach and distribution strategy.
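As a purely hypothetical sketch of point A (every name here is an illustrative assumption, not something from this post), such a proprietary layer might rank a user’s stored examples for relevance, keep only the best ones in the prompt, and cache responses to control metered query costs:

```python
# Hypothetical sketch of a thin proprietary layer on top of a
# commoditized NL model: cherry-pick the most relevant user examples
# for each prompt and cache responses to keep query costs down.
from functools import lru_cache

def score_relevance(example: str, user_input: str) -> int:
    # Toy relevance score: number of shared words. A real stack would
    # likely use embeddings or a learned ranker here.
    return len(set(example.lower().split()) & set(user_input.lower().split()))

def build_prompt(examples: list[str], user_input: str, k: int = 3) -> str:
    # Keep only the k most relevant stored examples for this input.
    best = sorted(examples, key=lambda e: score_relevance(e, user_input),
                  reverse=True)[:k]
    return "\n".join(best) + "\n" + user_input

@lru_cache(maxsize=1024)
def query_nl_model(prompt: str) -> str:
    # Placeholder for the actual (metered, paid) model call.
    return f"<model response for: {prompt[:30]}...>"

# Hypothetical usage:
examples = ["Invoice total: 120 EUR", "Contract start: Jan 2021",
            "Invoice number: 4711"]
print(query_nl_model(build_prompt(examples, "What is the invoice total?")))
```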

As a final thought: make sure to get some inspiration from our map, and if you’ve actually managed to get access to GPT-3 and are starting a project, make sure to add it here.
