What is GPT-3, and can it worsen the job market amidst the pandemic?

What if I told you there is a new technology that can write your essays, answer your physics teacher, and even finish your math assignments? Well, GPT-3 goes far beyond your high-school scope: it can also write React code for building apps, generate an email from just a few key points, play attorney, and even act as a doctor on the go. It can even treat your loneliness!

So, what is GPT-3 and how does it work?

Let me try to keep it simple. In 2017, Google introduced a sequence-to-sequence model called the Transformer. A sequence-to-sequence (Seq2Seq) model is a neural network that transforms a given sequence of elements, such as the words in a sentence, into another sequence. GPTs are essentially larger, refined Transformers.
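To make "sequence in, sequence out" concrete, here is a minimal sketch using the open-source Hugging Face transformers library as a stand-in (a small pretrained Seq2Seq model, not GPT-3 itself):

```python
# One sequence of words goes in, another comes out.
# Uses a small pretrained seq2seq translation model as a stand-in for the idea.
from transformers import pipeline

translator = pipeline("translation_en_to_fr")
result = translator("The weather is nice today.")
print(result[0]["translation_text"])  # e.g. "Le temps est agréable aujourd'hui."
```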

GPT-3 is a task-agnostic natural language processing model that requires minimal fine-tuning.

This means it is adaptable enough for text generation, question answering, and a lot more, with minimal adjustment from the user.
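As a concrete illustration, here is a sketch of few-shot prompting against the GPT-3 API, modelled on the translation example from OpenAI's paper. It assumes you have beta access and an API key, and uses the Completion endpoint of the openai Python package as it existed at launch:

```python
# Few-shot prompting: the "fine-tuning" is just a handful of examples
# placed directly in the prompt. Assumes beta access and an API key.
import openai

openai.api_key = "YOUR_API_KEY"

prompt = (
    "Translate English to French:\n"
    "sea otter => loutre de mer\n"
    "cheese => fromage\n"
    "peppermint =>"
)

response = openai.Completion.create(
    engine="davinci",   # the full 175B model
    prompt=prompt,
    max_tokens=5,
    temperature=0,
)
print(response.choices[0].text.strip())  # expected: "menthe poivrée"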

GPT-3, like other text-generating models, works as follows: you give it a chunk of text and it predicts the next chunk. The model then feeds its own output back in to extend the prediction, and this process continues until it reaches a length threshold or a stop token. Now imagine if the text given to the model were almost the entire internet.
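Here is a minimal, pseudocode-style sketch of that loop; model.predict_next is a hypothetical stand-in for a real language model's forward pass:

```python
# Autoregressive generation: predict one token, append it, repeat.
def generate(model, prompt_tokens, max_length, stop_token=None):
    tokens = list(prompt_tokens)
    while len(tokens) < max_length:
        next_token = model.predict_next(tokens)  # predict from all previous tokens
        if next_token == stop_token:             # stop token reached
            break
        tokens.append(next_token)                # output becomes part of the input
    return tokens
```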

Yes! The training data for GPT-3 was collected by crawling web pages from the past eight years (the Common Crawl dataset), and to improve quality and diversity the team added its in-house dataset, WebText (originally created for GPT-3's predecessor, GPT-2). Notably, WebText was built from Reddit, where users submitted links to interesting sites; only links whose submissions earned more than 3 karma (roughly analogous to likes on Instagram) were chosen for web scraping.
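As a toy illustration of that filtering rule (the data shape here is hypothetical, not the actual WebText pipeline):

```python
# Keep only links whose Reddit submissions earned at least 3 karma.
submissions = [
    {"url": "https://example.com/a", "karma": 12},
    {"url": "https://example.com/b", "karma": 1},
    {"url": "https://example.com/c", "karma": 3},
]

curated_links = [s["url"] for s in submissions if s["karma"] >= 3]
print(curated_links)  # ['https://example.com/a', 'https://example.com/c']
```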

GPT-3 stands for Generative Pre-trained Transformer, and the name itself explains the model. Let's break it down. The model is "Generative" because it generates text in the same style as its input corpus. It is "Pre-trained" because it has already been trained on a large amount of web-crawled data. And "Transformer" comes from its parent architecture, the Google Transformer.

To summarize, GPT-3 has been trained to capture the data available on the web in very fine, semantic detail. That is, it does not merely memorize the data blindly; it also learns the interconnections within it.

The public internet was GPT-3's training ground, though it still doesn't know about COVID-19! It has likely seen code, movie scripts, tweets, blog posts, and more. Let's see what this AI is capable of doing.

Demonstrations

GPT-3 generates React code to build apps as described by the user
Generate emails from just key points (open the tweet to watch it in action; a prompt sketch follows this list)
GPT-3 showing off its "write like an attorney" style
A doctor on the go
Just dictate your math questions and GPT-3 posts your answers
Feeling bored at home? Try the GPT Recipe Maker too.
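The exact prompt behind the email demo is not public, but here is a sketch of how such a call might look (the prompt wording and parameters are assumptions):

```python
# Hypothetical "email from key points" prompt against the GPT-3 API.
import openai

openai.api_key = "YOUR_API_KEY"

key_points = "- running late\n- reschedule meeting to Friday\n- apologise"
prompt = f"Key points:\n{key_points}\n\nWrite a polite, complete email:\n"

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=120,
    temperature=0.7,
)
print(response.choices[0].text)
```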

Can it worsen the job market?

All the demonstrations above seem convincing enough that the model, along with its API, could deal a huge blow to the job market. But what does the reality of using the API look like?

Undoubtedly, GPT-3 is a great leap in AI technology, but it has a few downsides too.

The transformer architecture of GPT upper bounds its ability at memorization. It cannot learn many algorithms due to the functional form of its forward pass, and spends a fixed compute per token — i.e. it can’t “think for a while”. Progress here critical, likely but non-trivial.

-Andrej Karpathy

As Andrej notes, the third-generation GPT leans on the premise that "amazing things happen if you just make a transformer bigger", and that very premise puts an upper bound on the architecture.

First and foremost, the API is slow, largely because of its 175 billion parameters (ten times more than its closest runner-up). Not to mention that the cost of running the model is also high: its size outpaces the growth of GPU memory, forcing parallel GPU processing or a cloud-computing alternative. GPT-3 is reported to have been trained on Tesla V100 cloud instances at a cost of around $5M for training alone, which is hugely expensive. Considering these factors, it is certainly not going to upend the job market any time soon.
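A quick back-of-the-envelope calculation shows why a single GPU is not enough (assuming 2 bytes per parameter, i.e. fp16 weights alone; optimizer state during training adds much more):

```python
# Why 175B parameters won't fit on one GPU.
params = 175e9
bytes_per_param = 2  # fp16
total_gb = params * bytes_per_param / 1e9
print(f"{total_gb:.0f} GB just for the weights")  # ~350 GB
print("vs. 32 GB on a single Tesla V100")         # hence parallel GPUs
```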

Every cloud has a silver lining, and GPT-3 likewise finds ways to improve customer service rather than kill jobs. For example, the generate-mail-from-key-points API could be incorporated into any mailbox client, and the React code builder could help bring out creativity in children. Replika, an AI companion app, uses GPT-3 and has reported that conversations with its users felt very natural, with customer happiness increasing by 20% after upgrading to GPT-3. And although the outcomes of such a huge NLP model are uncertain, OpenAI, the company behind GPT-3, only grants licenses to specific users, thereby restricting misuse of the technology.

So, in every way, GPT-3 has been steered in the direction of making life easier and happier.

Happy reading!
