Notes on GPT-3

Lynx Chang
3 min read · Oct 28, 2022


GPT-3 is a neural network machine learning model, trained on internet data, that can generate any type of text. It requires only a small amount of input text to generate large volumes of relevant and sophisticated machine-generated text.

How does GPT-3 work?

When a user provides text input, the system analyzes the language and uses a text predictor to produce the most likely continuation. It can do this because it was trained on a vast body of internet text, from which it learned statistical patterns.

In other words, GPT-3 is focused on text generation, and its ability comes from being pre-trained on a huge amount of text.
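
To make the "text predictor" idea concrete, here is a toy sketch, not GPT-3 itself: a bigram table built from a tiny corpus stands in for the 175-billion-parameter Transformer, and generation greedily emits the statistically most likely next word.

```
from collections import Counter, defaultdict

# A tiny "training corpus" standing in for internet-scale text.
corpus = (
    "the model predicts the next word . "
    "the model generates text one word at a time ."
).split()

# "Training": count which word tends to follow which.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(prompt_word, length=8):
    """Greedily emit the statistically most likely next word, repeatedly."""
    words = [prompt_word]
    for _ in range(length):
        candidates = follows[words[-1]]
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(generate("the"))  # "the model predicts the model predicts ..."
```

GPT-3 works on the same predict-the-next-token principle, but with a far richer learned model and sampling strategies that avoid the repetitive loops a greedy toy like this falls into.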

Capabilities of GPT-3

  • Generate varied text content depending on the context, and predict how to continue the sentences it is given.
  • Its text predictor, trained on a huge swath of internet text, calculates the most statistically likely output.
  • Write poetry, blogs, PR content, resumes, and technical documentation.
  • Analyze the context of the provided text and, based on it, generate business idea pitches, fan fiction, memes, and so on.
  • Generate workable code, though the output is not always guaranteed to run without error.
  • One developer combined the UI prototyping tool Figma with GPT-3 to create websites from a description of just a sentence or two.
  • Clone websites when given a URL as the prompt text.
  • In the gaming world, create realistic chat dialogue, quizzes, images, and other graphics from text prompts.

History of GPT-3

The versions of GPT:

GPT-1 (117 million parameters):

  • Showed that a language prediction model could capture broad world knowledge.
  • Proposed that a language model should first be trained on unlabeled data, then fine-tuned on samples of natural language processing tasks such as text classification, sentiment analysis, and word segmentation.

GPT-2 (1.5 billion parameters):

  • Able to generate text so realistic that OpenAI initially refused to open-source the full model, out of concern that it could be used to create and spread fake news.

GPT-3 (175 billion parameters)

Benefits of GPT-3

Whenever a large amount of text needs to be generated by a machine from a small amount of text input, GPT-3 provides a good solution. There are many situations where it is not practical or efficient to have a human on hand to generate text output.

Risks and limitations of GPT-3

  • The biggest issue is that GPT-3 is not continually learning. It has been pre-trained, which means it has no ongoing long-term memory that learns from each interaction.
  • Limited input size: a user cannot provide very much text as input. GPT-3's context window is only about 2,048 tokens, which the prompt and the generated completion must share (a rough way to check a prompt against this limit is sketched below).
  • A wide range of machine learning biases: since the model was trained on internet text, it exhibits many of the biases humans exhibit in their online writing.
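
For concreteness, here is a rough way to check whether a prompt fits in that context window. This is a sketch, not part of OpenAI's tooling: it assumes the Hugging Face `transformers` package and uses the GPT-2 tokenizer, which shares its BPE vocabulary with GPT-3, so the count is a close approximation.

```
from transformers import GPT2TokenizerFast

# Original GPT-3 shared roughly 2,048 tokens between prompt and completion.
CONTEXT_WINDOW = 2048

# GPT-3 uses the same BPE vocabulary as GPT-2, so the GPT-2 tokenizer
# gives a close approximation of GPT-3's token count.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")

def fits_in_context(prompt, max_completion_tokens=256):
    """Return True if the prompt leaves room for the requested completion."""
    n_prompt_tokens = len(tokenizer.encode(prompt))
    return n_prompt_tokens + max_completion_tokens <= CONTEXT_WINDOW

print(fits_in_context("Write a short poem about neural networks."))  # True
```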

Using GPT-3

  • OpenAI has released an API for accessing the AI models it has developed.
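
As a minimal sketch of what using that API looks like, assuming the `openai` Python package and an API key stored in the `OPENAI_API_KEY` environment variable (the model name, prompt, and parameters below are illustrative):

```
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    model="text-davinci-002",  # illustrative GPT-3 model name
    prompt="Write a two-sentence product description for a smart water bottle.",
    max_tokens=100,    # upper bound on the length of the generated completion
    temperature=0.7,   # higher values give more varied output
)

print(response.choices[0].text.strip())
```

The same call handles any of the capabilities listed earlier; only the prompt changes.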
