GPT-3 and the Rise of Human-centric Adaptive Software — Part 2

Paolo Perazzo
6 min read · Nov 17, 2020


💡 This article is part of a series: Intro, Part 1, Part 2, Part 3

A new generation of adaptive software

The reactions to the GPT-3 demos I presented have been all over the place: “Mind-blowing!”, “Overhyped!”, “Designers and developers will lose their jobs to AI!”

Some of these demos might be handpicked results from many training experiments; some might not go beyond the initial simplistic cases; and GPT-3 might not be the great interpreter we think it is once it is put in front of the famous “average user”, just as happens to any UI perfectly crafted by the best designers in the world.

The fundamental implication of these GPT-3 demos is that they directionally showed product designers three important opportunities:

  1. The power of offering natural language as an interface that “adapts” to each user
  2. The possibility of shifting more “control” and “customization” of the software into the hands of users, without having to code and present all the permutations for each specific use case
  3. The ability to let GPT-3 almost entirely generate software applications based on user needs

I call this new generation of software “adaptive software”.

Product designers and developers won’t be replaced any time soon by AI, but they will have to embrace this new paradigm of designing and building software.

Let’s then take a look at a very generalized GPT-3 Application stack, where on top of the Core functionalities of the Application, one or more Modules (e.g. scripts, plugins, templates, platform apps) are presented to the user as a higher-level abstraction via a conversational User Interface.
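As a rough illustration of this stack (the names and structure below are my own, not from any specific demo), a minimal Python sketch might look like this:

```python
from dataclasses import dataclass
from typing import Callable, Dict

# Illustrative sketch of the generalized GPT-3 application stack:
# a conversational UI collects a natural-language request, a GPT-3 layer
# turns it into a structured command, and the command is routed either
# to a Module or to the application Core. All names here are hypothetical.

def gpt3_translate(utterance: str) -> dict:
    # Stand-in for a call to the GPT-3 completion API that returns
    # a machine-readable command (e.g. JSON) for the layers below.
    return {"target": "layout_plugin", "args": {"description": utterance}}

@dataclass
class AppStack:
    core: Callable[[dict], str]                # core functionality of the app
    modules: Dict[str, Callable[[dict], str]]  # scripts, plugins, templates...

    def handle(self, utterance: str) -> str:
        command = gpt3_translate(utterance)    # conversational UI -> GPT-3
        handler = self.modules.get(command["target"], self.core)
        return handler(command["args"])        # GPT-3 output -> Module or Core
```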

Depending on where GPT-3 is inserted in this stack, designers will have to provide one of the following:

  1. GPT3-programmable Modules
  2. GPT3-programmable Core
  3. GPT3-programmable Applications

Let’s explore each type with practical examples of GPT-3 applications.

GPT3-programmable Modules

As in the Figma example from Jordan, the most obvious first step is to interface GPT-3 with the Application Modules (Figma plugins in that case), allowing the user to “program” them via natural language.

As described before, natural language is mapped by GPT-3, via the data interchange format of your choice, to application modules developed specifically to be programmed by GPT-3’s output. Understanding this interface and properly designing the modules to be mapped to is going to be the “new” job of the product designer.
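For example, a GPT-3-programmable module might sit behind a few-shot prompt that maps a plain-English request to the JSON the module expects. This is only a sketch under my own assumptions (the prompt, the JSON schema and `layout_module` are hypothetical, not Jordan’s actual plugin code):

```python
import json

# A hypothetical few-shot prompt: GPT-3 completes the last "JSON:" line,
# producing the data interchange format the module was designed to accept.
PROMPT = """Translate design requests into JSON for the layout module.

Request: a red button that says Submit
JSON: {"type": "button", "label": "Submit", "color": "red"}

Request: a blue heading that says Welcome
JSON: {"type": "heading", "label": "Welcome", "color": "blue"}

Request: <user request>
JSON:"""

def program_module(request: str, complete) -> str:
    """`complete` stands in for the GPT-3 completion API call."""
    raw = complete(PROMPT.replace("<user request>", request))
    spec = json.loads(raw)          # GPT-3 output in the chosen format
    return layout_module(spec)      # the module "programmed" by that output

def layout_module(spec: dict) -> str:
    # Stand-in for the Figma-style plugin that renders the described element.
    return f'<{spec["type"]} color="{spec["color"]}">{spec["label"]}</{spec["type"]}>'
```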

Inspired by Jordan’s Designer plugin, Dhvanil Patel released a more advanced version that allows users to build a full website by describing each section in plain English.

As you can see in the demo, the GPT-3 output “designed” by Dhvanil enables two major functionalities:

Just by adding the second “GPT-3 step”, the resulting application became much more powerful. As noted by Dhvanil, GPT-3 can also be applied in other areas to further shift work from the plugin to GPT-3.

GPT3-programmable Core

If we offer a programming language for the core functionalities of our application and interface GPT-3 with it, we can further reduce the need for extra developer work, shifting the control and customization of the application to the end user.

After Plotly integrated GPT-3 into their product, a “conversation” in natural language with the application was sufficient to visualize information from a preloaded dataset as a chart; basic instructions in plain English even allowed users to manipulate the type of chart.

As they pointed out in their thread, the key aspect is that GPT-3 needed only one example of their proprietary Plotly Express code (i.e. the specific “language” used by the app to plot charts) to generate the desired application output.
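The exact prompt Plotly used isn’t public, but a one-shot prompt in that spirit might look like the sketch below (the example rows, the gapminder dataset, and the completion are my own assumptions):

```python
import plotly.express as px

df = px.data.gapminder()  # a preloaded dataset, standing in for the demo's data

# Hypothetical one-shot prompt: a single Plotly Express example is enough
# for GPT-3 to continue the pattern for new requests.
ONE_SHOT_PROMPT = """Convert the request into Plotly Express code.

Request: show life expectancy over time for Canada as a line chart
Code: px.line(df[df.country == "Canada"], x="year", y="lifeExp")

Request: show GDP per capita versus life expectancy as a scatter plot
Code:"""

# A plausible completion, executed against the preloaded dataframe:
fig = px.scatter(df, x="gdpPercap", y="lifeExp")
fig.show()
```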

What’s remarkable is that GPT-3 was very likely not pre-trained on many annotated Plotly Express code samples (if any at all), unlike common coding languages like HTML/JSX.

Yet, GPT-3 was able to create an interface between human language and the application language (i.e. its programming language):

Any human can now “program” Plotly core functionality to create and analyze graphs based on their dataset, without writing a single line of code.

A similar example of user-programmable core functionality was showcased by Chaitanya Chokkareddy. He used GPT-3 to set up the IVR system of a company through natural language.

GPT-3 here is interfacing with KOOKOO, the markup language that is used to program the IVR system. In reality, as Chaitanya admitted, an intermediate JSON representation was needed to make the demo work (GPT-3 ⇒ JSON ⇒ KOOKOO instead of GPT-3 ⇒ KOOKOO), but that opened up an interesting conversation about why the direct approach worked for Plotly.
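As a rough sketch of that intermediate step (the JSON shape and the XML element names are illustrative, not the exact KOOKOO schema):

```python
import xml.etree.ElementTree as ET

# Hypothetical JSON that GPT-3 might emit for a described IVR flow.
ivr_flow = {
    "greeting": "Welcome. Press 1 to book an appointment, 2 for opening hours.",
    "options": {"1": "booking", "2": "hours"},
}

def flow_to_markup(flow: dict) -> str:
    # Deterministic translation from the intermediate JSON to a KOOKOO-style
    # XML response; element names are illustrative, not the real KOOKOO markup.
    response = ET.Element("response")
    ET.SubElement(response, "playtext").text = flow["greeting"]
    menu = ET.SubElement(response, "collectdtmf")
    for digit, action in flow["options"].items():
        ET.SubElement(menu, "option", {"digit": digit, "action": action})
    return ET.tostring(response, encoding="unicode")

print(flow_to_markup(ivr_flow))
```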

Getting your documentation scanned by GPT-3 might be even more relevant in the future than being indexed by Google: the consumer of that documentation is GPT-3, and that can save humans from having to read it!

In fact, look at the user impact of Chaitanya’s solution: any spa owner (as in the example) can now make changes to their IVR system without any KOOKOO knowledge or coding expertise, simply by describing the IVR flow they have in mind.

GPT3-programmable Applications

What if we now make GPT-3 take care of the entire application stack? What if GPT-3 “learns” how to code in a programming language and lets users simply describe the application they need?

This seems unreal, impossible, right? But some people are already trying to make this happen.

Just a few weeks ago, Sharif Shameem started building a layout generator that turns a description into actual JSX code (or into HTML/CSS, if the two training examples had been written in HTML/CSS).

He then described the Google home page and generated the associated code:
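The original tweet shows the result in a demo; purely as an illustration, a two-example prompt in that spirit might look like the sketch below (neither the prompt nor the completion is Sharif’s actual material):

```python
# Hypothetical two-example prompt: because the examples are JSX, the model
# continues in JSX; swapping them for HTML/CSS examples would steer the
# output toward HTML/CSS instead.
TWO_SHOT_PROMPT = """description: a button that says Sign up
code: <button>Sign up</button>

description: a large blue heading that says Welcome
code: <h1 style={{color: "blue"}}>Welcome</h1>

description: the Google home page
code:"""

# A plausible (illustrative) completion:
#   <div style={{textAlign: "center"}}>
#     <img src="logo.png" alt="Google" />
#     <input type="text" />
#     <button>Google Search</button>
#     <button>I'm Feeling Lucky</button>
#   </div>
```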

He continued by building a functioning React app by simply describing its business logic.

After getting GPT-3 to build a to-do app, a dice-rolling app, and an app that translates feelings into emojis, he decided to start a company, Debuild.co, with an ambitious goal:

We’re building an autonomous system that can create software at the level of the world’s most skilled engineers.

In less than a couple of weeks from the first GPT-3 demo on Twitter, Debuild got funding from the founders of Quora, Segment, Expo, and Superhuman, as well as investors like Brianne Kimmel, Todd Goldberg, and Garry Tan.

💡 Part 3 explains the fundamental importance of coupling GPT-3 Adaptive Software architecture with a Human-centric design

Originally published at https://ppaolo.substack.com.


Paolo Perazzo

Cross-pollination ignites disruptive innovation. Part of Andiamo founding team, acquired by Cisco. Started SiVola. Building something new at Companyons.