What GPT-3 Means for Non-Technical Professionals

Michael
Published in The Startup
Jul 23, 2020 · 5 min read
Photo by Kevin Ku on Unsplash

Earlier this week I was lucky to get early access to a beta version of OpenAI’s latest generative pre-trained transformer model (GPT-3) — a new platform technology that can be used as a sophisticated AI assistant without having to write code.

As someone who isn’t a software engineer, I was still able to experiment and demonstrate use cases that can improve productivity, boost creativity, and accelerate learning and education.

This ease of use, where non-technical people can so easily apply their creativity to benefit from AI, has me just as excited as I was when I interacted with a computer for the first time as a child. So much so that I now believe AI is going to substantially impact every professional job sooner than expected.

In this post I’ll briefly highlight three use cases I tinkered with and which can be applied to work right away. I’ll then touch on what the broader benefits of GPT-3 could be and some of the potential risks.

A Caveat on Hype

As is the case with any new advanced technology, GPT-3 isn’t perfect. It breaks down a fair amount, it often needs a human to guide it, and it sometimes spits out gibberish.

From what I can gather, GPT-3 isn’t sophisticated enough to replace jobs entirely (although you can see this happening if you take these advancements to their logical conclusions) but the speed with which I was able to get it to do useful things is truly astonishing.

3 Example Use Cases

(1) Turning legalese into plain English: I trained GPT-3 how to turn legal text into simple English without writing any code. You can see the example from my tweet below.

(2) Generating code: I gave it 3 coding examples on how to turn English into regex code. Without more context it was still able to generate longer pieces of useful code given simple English prompts. Here’s a video demo of it.

(3) Writing content: Finally, I trained GPT-3 on my writing with two paragraphs from this blog post. I then gave it a 7-word prompt for a point I wanted to make and it generated a handful of solid paragraphs. I took bits from the two best iterations and used them in this blog. And you know what, they sound just like me! (I’ll let you guess which bit of the blog is GPT-3 generated.)
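
To make the “give it a few examples” idea concrete, below is a rough sketch of what the legalese translator could look like through the beta’s Python client. The legal sentences, the engine name, and the settings are illustrative assumptions, not the exact prompt I used.

```python
import openai  # the beta's Python client

openai.api_key = "YOUR_API_KEY"  # assumes you have beta access

# A few-shot prompt: a couple of worked examples, then the text to translate.
# The legal sentences below are made up purely for illustration.
prompt = """Translate the legal text into plain English.

Legal: The party of the first part shall indemnify and hold harmless the party of the second part against any and all claims arising hereunder.
Plain English: If something goes wrong under this agreement, the first party will cover the second party's losses.

Legal: This agreement shall be governed by and construed in accordance with the laws of England and Wales.
Plain English: English and Welsh law applies to this contract.

Legal: Notwithstanding the foregoing, either party may terminate this agreement upon thirty (30) days' written notice.
Plain English:"""

response = openai.Completion.create(
    engine="davinci",    # assumed engine name from the beta
    prompt=prompt,
    max_tokens=60,
    temperature=0.3,     # keep the translation fairly literal
    stop=["\n"],         # stop at the end of the translated line
)

print(response["choices"][0]["text"].strip())
```

The regex and writing examples follow the same pattern: a handful of input/output pairs in the prompt, then the new input you want completed.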

Saying I “taught” the AI in these examples is both an over-complication and over-simplification. It’s an over-complication because all I did was give GPT-3 a few examples so it could have context. Anyone can do this. There’s no special talent or skill required unless you’d like to build these things into publicly available web apps.

Moreover, it’s an over-simplification because what’s going on under the hood of GPT-3 is insanely complex and impressive. The underlying neural network was trained on over 500GB of text data and has 175 billion parameters (i.e., the learned weights the model applies to its inputs), which is over 100 times what GPT-2 had.

Source: Jay Alammar’s intro to neural networks

Benefits for Non-Technical Professionals

Clearly the potential applications of GPT-3 are vast, but we can broadly group the benefits into these buckets:

  • Gains in Productivity — You can use GPT-3 as an AI productivity assistant to automate repetitive tasks. I’ve already trained it on a few simple emails I send regularly, and all I have to do is feed it prompts, which it uses to generate a friendly email reply (a rough sketch of how this could look follows this list). The archaic way of doing this was to hire an assistant who would draft documents for you to review, edit, sign, and send. Now an AI can do this for you.
  • Creativity Boost — If you get creative block, tell GPT-3 what general structure you want for a piece of writing and it will generate random (but potentially cohesive) sentences that might lead to a new thought. Don’t like them? Then tell it to try again and it will auto-generate more sentences that could help get you out of a jam.
  • Democratized Access to Technical Understanding — GPT-3 can be used as an education tool to accelerate learning. As I show with the legalese translator, it can help remove barriers to technical understanding by translating industry lingo into plain language, which is key when you are trying to learn something new.
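
For the productivity bullet above, here is a minimal sketch of the email-reply idea, again assuming the beta’s Python client; the example emails, engine name, and settings are made up for illustration. Asking for several completions at once (the n parameter) is also one way to get the “try again” behaviour from the creativity bullet.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # assumes beta access

# Few-shot prompt: two example replies in my style, then the new request.
# The emails below are invented for illustration.
prompt = """Write a short, friendly reply in my usual style.

Request: Can we move Thursday's catch-up to Friday?
Reply: Of course! Friday works for me. Same time? Speak then.

Request: Could you send the deck over before the board meeting?
Reply: Absolutely. I'll have it with you by end of day tomorrow. Shout if you need it sooner.

Request: Are you free to review the draft term sheet next week?
Reply:"""

response = openai.Completion.create(
    engine="davinci",    # assumed engine name from the beta
    prompt=prompt,
    max_tokens=60,
    temperature=0.6,     # a little variety across drafts
    n=3,                 # draft three replies to choose from
    stop=["Request:"],   # don't run on into a new example
)

for i, choice in enumerate(response["choices"], start=1):
    print(f"Draft {i}: {choice['text'].strip()}\n")
```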

Challenges to Consider

As is the case with any technology, there will be many challenges with its use. To name but a few:

  • Malicious Use Cases — GPT-3 can be used at scale for sophisticated phishing scams, propaganda, and fraud. Society will need to be vigilant by default when it comes to all digital content going forward. We will also need to rethink what it means to author something. Copyright laws, academic reviews, and attribution are going to get real tricky.
  • Biased AI Models — GPT-3 is trained on existing human-written records, and these are full of bias and stereotypes across attributes such as race, gender, and religion. Until such biases are sufficiently mitigated, AI outputs have to be challenged and considered in the context of the kind of society we would like to live in.
  • Cost and Access — GPT-3 reportedly cost $12m to train and so far it’s in a closed invite-only beta. To benefit society more widely, the cost of training similar models will need to come down substantially so that the technology is more accessible and evenly distributed to foster innovation.

To some, GPT-3 seems like just a cool toy. It does a few seemingly clever things but isn’t groundbreaking. To others, it may be the biggest thing in technology since Bitcoin. We’ll know who’s right in a few years but given what I’ve seen so far, and what I’ve been able to do with GPT-3, my money is on the enthusiasts.

A brief epilogue: I was initially surprised that GPT-3 hasn’t yet received much coverage from mainstream media. But then I remembered how the world’s first ecommerce transaction was relegated to a low-key section of the NY Times back in 1994 (see picture below). Turns out, these things take time to get wider reach, and it just might be that 2020 is for AI what 1994 was for ecommerce and the web.

Article by Peter H Lewis in 1994 (source)

Michael
Investor-in-Residence at Ada Ventures. Ex-fintech operator/CFO. Tinkering with code and curiosity at www.michaeltefula.com