Can GPT-3 Build a GPT-3 App?

Automating the human intuition of crafting prompts

Sahar Mor
The Startup


Three months after OpenAI released its GPT-3 API, it has become a shared notion: getting to SOTA results is mostly a function of effective prompt programming rather than of traditional NLP engineering.

It can be the difference between a chatbot whose replies make no sense and one capable of passing a Turing test.

What are Prompts?

Interaction with GPT-3 happens through prompts: textual hints that give the model context about the task at hand.

To a large extent, prompts act as an abstract fine-tuning layer, letting the user, i.e. the developer, guide GPT-3 toward the part of its 'brain' it should activate to perform best. The more examples of successful completions one provides, i.e. pairs of inputs and outputs, the higher the chances it will generate the right completion for an unseen case.

An example prompt for a Q&A bot
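The original screenshot isn't reproduced here, so the sketch below shows what such a few-shot Q&A prompt could look like when sent through the 2020-era openai Python SDK. The prompt wording, engine name, and sampling parameters are illustrative assumptions, not the author's exact setup.

```python
# A minimal sketch of a few-shot Q&A prompt using the Completion endpoint
# of the 2020-era `openai` Python SDK. Prompt text, engine, and parameters
# are illustrative assumptions.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Few-shot prompt: a short task description followed by input/output pairs,
# ending with the unseen question whose answer GPT-3 should complete.
prompt = """I am a question-answering bot. I answer questions truthfully and concisely.

Q: What is the capital of France?
A: Paris.

Q: Who wrote "Pride and Prejudice"?
A: Jane Austen.

Q: How many moons does Mars have?
A:"""

response = openai.Completion.create(
    engine="davinci",    # base GPT-3 model available at the time
    prompt=prompt,
    max_tokens=32,
    temperature=0.0,     # keep answers deterministic for factual Q&A
    stop=["\n"],         # stop at the end of the answer line
)

print(response["choices"][0]["text"].strip())  # e.g. "Two: Phobos and Deimos."
```

Note how the examples establish the Q:/A: pattern; GPT-3 simply continues that pattern for the final, unseen question.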

After running hundreds of iterations with different prompts, I started wondering —

How can one automate this guesswork of human intuition? What if
