We Need a Natural Language Programming Environment!

Forget code editors for Python, JavaScript, and formal computer code. What about code editing in natural language?

Violet Whitney
Published in Spatial Pixel
12 min read · Mar 6, 2024

--

TLDR: This is the beginning of a research and design exercise for creating a natural language programming environment. My partner, William, and I have only just started to experiment with these ideas: here’s a bare-bones one for P5, and one for physical space. Now we’re looking for topics related to code editors, visual programming, and programming languages that can spur ideas to help in this journey. If you have references you think might help us, please send them to us!

  1. Rise in Natural Language Programming.
  2. Prompting Isn’t Enough.
  3. What to Borrow from Code Editors.
  4. Emerging Experiments in Natural Language Programming.
  5. Provocations for Natural Language Programming.

1 — Rise in Natural Language Programming (NLP)

If you write code you’ve probably spent a decent amount of time in a Code Editor or an IDE (Integrated Development Environment). Code editors like Visual Studio Code help developers write, edit, and test code. But as we see the emergence of “natural language programming” where we can write executable programs with everyday language, will we still need a programming environment for natural language? I think so.

Natural Language Programming, related but not to be confused with Natural Language Processing, is a way of programming with natural-language sentences, like writing in English.

A very simple NLP: “Can you turn the bedroom light off after 10 o’clock?”

What makes this a program and not just plain English? It becomes a program when it moves beyond a description and becomes something a computer can execute. It’s a set of instructions meant to be computed, calculated, decided on, or carried out.
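To make that concrete, here is a minimal sketch of what that sentence could translate into once it becomes executable. The `BedroomLight` class and `run_program` function are invented for illustration; a real smart-home API would look different on every platform.

```python
from datetime import datetime, time

# Hypothetical smart-home objects; any real API would differ by platform.
class BedroomLight:
    def __init__(self):
        self.is_on = True

    def turn_off(self):
        self.is_on = False

def run_program(light, now):
    # "Turn the bedroom light off after 10 o'clock" as an executable rule.
    if now.time() >= time(22, 0):
        light.turn_off()

light = BedroomLight()
run_program(light, datetime(2024, 3, 6, 22, 30))
print(light.is_on)  # False: it is past 10 p.m., so the light is off
```

The natural language sentence and the Python rule describe the same program; the difference is only in how explicitly the instructions are stated.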

Last summer my partner, William Martin, and I taught a course in the new Computational Design program at Columbia University GSAPP. For the first time, we used natural language (ChatGPT) to write programs rather than starting with raw code. Typically in academic programs, we have to dedicate a decent amount of time to up-skilling students in programming. In the past, learning the ins and outs of Python or JavaScript syntax was a prerequisite to really designing anything that leveraged computation.

back: Visual Studio Code IDE, front: P5 IDE

However, in this go-round, students were able to develop complete websites, totally bespoke to their ideas, relying much more heavily on natural language programming. And as much as ChatGPT “did their homework” by generating their website code, the designs were still inherently theirs. Students still had to design the website and its functionality, spend days prompting to get useful responses, and struggle through actual code to develop meaningful programs. It was ultimately their ideas, their designs, and their visions.

Left: Matthew Heaton, Right: Hongqian Li

In all fairness, NLP is not that good yet. Students still needed a solid technical understanding of JavaScript syntax to string these websites together.

But that’s changing.

Recently Jensen Huang, CEO of NVIDIA, has predicted the death of coding. He argued, “It is our job to create computing technology such that nobody has to program and that the programming language is human. Everybody in the world is now a programmer. This is the miracle of AI.” [source][video]

coding says goodbye

Programming Isn’t Going Away

So while coding may be going away, programming is not. There are useful ways of thinking that someone develops when they learn to code. They learn to think systematically about how things are connected. They learn concepts like recursion, and they learn how the language itself can shape the capabilities of a program. Those ways of thinking will still be important to develop. But many of the nuances of coding keep us too deep in the technical weeds and prevent us from thinking more strategically and conceptually. Natural language programming lets us dedicate our brainpower to the higher-level problems that matter.

Today, most people aren’t writing sophisticated programs with ChatGPT, but I do believe that will come. A wave of “programming” literacy is coming via natural language.

Left: Data from World Bank (2023) showing % of literacy over the last 200 years; van Zanden, J. et al. (2014) — with major processing by Our World in Data, Right: My highly speculative possible curves of future “programming literacy” worldwide.

Today, only 2.54% of the global workforce are software engineers. [source] In 1820 only 12% of the world could read and write; today 87% can. [source] Like the steep incline of literacy following the Industrial Revolution, will most people be programming-literate 200 years from now? Or perhaps even sooner? Today, so much control of our world is achievable only by this tiny percentage of people who are software engineers. I welcome a future where programming is far more accessible and more people can write programs specific to their own interests and domains.

2 — Prompting Isn’t Enough

Great, so most people in the future will be programming, and doing so in natural language. Does that mean we’re all just going to ChatGPT our programs into existence? Not exactly. Language is very useful for thinking, but writing prompts is hardly programming.

As folks who program regularly will tell you, programming is not about coding. It’s a way of thinking. My partner recently pointed out that when software engineers discuss problems with one another, they rarely talk about code. They speak more abstractly about the structure of the system and its behavior. They consider how to structure the semantics of the program and how that will translate into the program’s capabilities.

Programming teaches you to think holistically about a system, grappling with scale, recursion, network affordances, and emergent behaviors. It teaches problem solving, forcing someone to think critically and logically about operational flow.

Prompting, in contrast to programming, isn’t always the easiest way to think programmatically today.

There’s a danger that as we move toward prompting, we’ll rely too heavily on computers to think on our behalf. Prompting is reactive, a wait-and-see approach to whatever the computer does, whereas programming is about designing, orchestrating, thinking, and creating. Prompting also doesn’t let you build the holistic system. You’re only ever working on segments, and you must bring those segments to a different environment to build anything more complex.

But there’s a great opportunity if we leverage natural language’s ease and flexibility with the more systemic capabilities of programming. My hope and aim is to help contribute to a best of both worlds scenario where we harness the capabilities of both to help people think at a higher level.

Why we need a programming environment!

But why do we need a programming environment if AI’s going to do all of the work anyway? Even if AI is going to do all the program writing on our behalf, we’ll still need a way to communicate with it, clarify our intentions, and hopefully design programs together with AI.

We need an environment better than a chatbot to write natural language programs.

My hope is that programming literacy is not about having AI think for you, but that it empowers many more people to think creatively in terms of holistic systems and interconnected programs. I’m bullish that we’ll need an environment better than a chatbot, beyond just language to help us become sophisticated programmers.

3 — What to Borrow from Code Editors.

Luckily we’re not starting from scratch. There’s a great lineage of programming environments. Integrated development environments include a place to make, see, and check your program. Each of these capabilities is useful for an NLP environment.

Structure of the P5.js Integrated Development Environment
  1. Make [Code Editor]
    This is where you author and design a program: where you actually write the code. It usually includes some form of syntax highlighting, and might also include tools like auto-complete to ease writing code, or higher-level maps that help you author the hierarchy and components of your program.
  2. See [Playground]
    A way to run the program to see how it behaves, so you can actively test and iterate. Some environments include a “playground” where you can “play” a version of your program.
  3. Check [Debugging]
    Tools for testing and understanding bugs and errors, including things like error logs.

4 — Emerging Experiments in Natural Language Programming

A Showcase of NLP-First Programming

In 2021 OpenAI published Codex, an environment for natural language programming. Today the programming capabilities you access in ChatGPT descend from this Codex model. The environment demonstrates the ability to write a longer, structured program by iteratively providing natural language instructions. I’m really encouraged by it. However, it also shows there’s definitely a ways to go to make something that can write sophisticated programs and that anyone can use.

OpenAI Codex — 2021

NLP Environments Aren’t There Yet

Lacks the Natural Language Program — In the Codex environment you never see more than a single prompt at a time, which really limits your ability to control and design the program structure. What I’d consider your program code (all of the iterative prompts provided) isn’t viewable, scannable, or editable. You’re always prompting one or two sentences at a time. How could this become a designed and structured part of the interface?

No Referencing Between Contexts — Additionally, this environment makes it difficult to understand how the contexts interact. How does the prompt impact portions of the formal code on the right? Where is “animate” in the formal code? And how does my prompt, or an element in the formal code, impact the behavior in the canvas (where the game is running)? It’s hard to understand how they all relate to one another beyond a general “they seem to relate.” In an environment that supports natural language, which is inherently more abstract, it will be even more important to understand the relationships across these contexts.

Other Emerging NLP Trends

Engineer-First NLP — Since Codex, there’s been an explosion in AI coding copilots: GitHub Copilot, Amazon’s AI code generator, and Sourcegraph. These copilots let you type prompts in natural language, but they still return responses in formal code. For the time being, you’ll still need to be a software engineer, or at least rather technical, to use them.

Novice Programming NLP — Meanwhile, gaming companies like Roblox and Unity have started to advertise their aims of creating natural language programming capabilities natively inside their development environments for novice programmers.

Draw-A-UI Demo — 2023

Designer-First NLP — Designer-first AI tools like Draw-A-UI could point to how future natural language environments might evolve in a way that leverages more native design skills like drawing. Here someone can draw shapes on a canvas, and by adding labels to the elements, the environment can intuit the program’s functionality.

5 — Provocations for Natural Language Programming.

If we want a programming environment for natural language, it’ll need to be honed to the affordances of natural language. So how is natural language different from a formal language?

My Comparison of Formal and Natural Languages

Ambiguity

Sequence Ambiguity — Today it’s hard to understand the order and sequence of natural language programs. To develop a complex program through prompt engineering, you’d have pages’ worth of instructions, making it difficult to follow operational flow. It gets really muddy. A development environment should help reveal the inherent structure and sequence of the program, perhaps like some visual programming environments attempt to do. This could help a programmer navigate and design the higher-level structure of the program.

Syntax Ambiguity — Natural language, in contrast to a structured programming language, is far more flexible and ambiguous. It relies more heavily on the contextual semantics of the language, and even on the context of where and when the program takes place.

The same program demonstrated on the left as a Python program, and on the right as a natural language program.

So what might syntax highlighting be in natural language? Perhaps it’s actually useful to reveal the grammar by highlighting each word class: verbs are similar to functions and methods, whereas nouns are more like variables and objects.

Example of word class syntax highlighting to reveal the part-of-speech. Above experiment by Spatial Pixel.
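A toy sketch of that idea: tag each word by its class, mapping verbs to functions and nouns to variables. The tiny hand-written lexicon here is purely illustrative; a real environment would use a proper part-of-speech tagger (e.g., spaCy or NLTK).

```python
# Toy word-class "syntax highlighting" for a natural language program.
# The lexicons below are invented for illustration only.
VERBS = {"draw", "turn", "move", "repeat", "make"}
NOUNS = {"line", "light", "circle", "canvas", "cup"}

def highlight(sentence):
    """Tag each word as VERB (~function), NOUN (~variable), or OTHER."""
    tags = []
    for word in sentence.lower().split():
        word = word.strip(".,!?")
        if word in VERBS:
            tags.append((word, "VERB"))
        elif word in NOUNS:
            tags.append((word, "NOUN"))
        else:
            tags.append((word, "OTHER"))
    return tags

print(highlight("Draw a line on the canvas."))
# [('draw', 'VERB'), ('a', 'OTHER'), ('line', 'NOUN'),
#  ('on', 'OTHER'), ('the', 'OTHER'), ('canvas', 'NOUN')]
```

An editor could then color each tag, just as a code editor colors keywords and identifiers.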

But syntax highlighting that follows the grammar of natural language doesn’t quite feel useful, because so much of the meaning is determined by the semantics of the sentence as a whole. Perhaps more appropriate than syntax highlighting would be semantic highlighting.

Example of semantic highlighting to reveal whether instructions are a command, conditional or calculation.
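A minimal sketch of semantic highlighting, assuming a simple keyword heuristic: label each instruction as a command, a conditional, or a calculation. In practice a real environment would more likely ask a language model to label spans; these regex rules are only a stand-in.

```python
import re

# Sketch: classify each natural language instruction semantically.
# The keyword lists are illustrative assumptions, not a real taxonomy.
def classify(instruction):
    text = instruction.lower()
    if re.search(r"\b(if|when|unless|after|until)\b", text):
        return "CONDITIONAL"
    if re.search(r"\b(calculate|sum|count|average|total)\b", text):
        return "CALCULATION"
    return "COMMAND"

for line in ["Draw a line.",
             "When the mouse is pressed, draw a circle.",
             "Calculate the average brightness."]:
    print(classify(line), "-", line)
# COMMAND - Draw a line.
# CONDITIONAL - When the mouse is pressed, draw a circle.
# CALCULATION - Calculate the average brightness.
```

Each label could then drive a highlight color, revealing the operational role of a sentence rather than its grammar.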

But within natural language’s ambiguity, perhaps even more important than highlighting is the fact that in natural language programming, AI interprets intent, whereas in a formal language the computer follows explicit instructions. Because of that, it’s more important to be able to continually control, check, and account for that hidden context. I can give an abstract command like “make me something pretty” and AI will have to interpret what “pretty” is. Even for something that seems more concrete, like “draw a line,” the AI is still interpreting what I mean by “a line” and what I mean by “draw.”

In a natural language environment, it becomes more important to reveal what the program is inferring. We should make it easier to specify our intent by making the design and control of its inference more explicit.

Formal language: follows explicit instructions

Natural language: interprets ambiguous intent

Examples of inferring intent source: Spatial Pixel’s natural language program

Inspecting Hidden Assumptions — This means future programs will need to navigate between the abstract and the specific more quickly. Language is especially abstract in contrast to a language like C#, which is highly specific. We’ll need better ways to help NL programmers navigate levels of specificity and abstraction. Consider a simple instruction for a robot, “bring the cup downstairs,” and how many assumptions (mini-decisions) the program would have to make. The designer of this program likely wants the ability to control the finer parameters buried in those assumptions. This might warrant an interface that allows you to zoom into levels of specificity within the text.
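One way to imagine that zoom: surface every inferred mini-decision as an explicit, overridable parameter. Every field name and default below is invented for illustration; the point is only that the assumptions become inspectable and editable.

```python
from dataclasses import dataclass, asdict

# Hypothetical sketch: the hidden assumptions behind "bring the cup
# downstairs," surfaced as explicit, overridable parameters.
@dataclass
class BringCupDownstairs:
    which_cup: str = "nearest visible cup"   # inferred: which cup?
    grip_force_newtons: float = 2.0          # inferred: how firmly to grip?
    route: str = "main staircase"            # inferred: which way down?
    destination: str = "kitchen counter"     # inferred: where to set it down?

# "Zooming in" on the instruction reveals each assumption as an editable field.
program = BringCupDownstairs()
for name, value in asdict(program).items():
    print(f"{name}: {value}")

# The programmer overrides only the assumption they care about.
program = BringCupDownstairs(destination="dining table")
```

A natural language environment could render these fields inline in the text itself, expanding a sentence into its assumed parameters on demand.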

This article has gotten so long that I’ve decided to create a separate one to focus solely on mock-ups of what a natural language programming environment could be. That way I can explore more of these little ideas and focus on the design itself.

Hopefully you have ideas too! I’m looking for more.

Hey, you! Thanks for reading.
If you’re a researcher, designer, or just interested in natural language programming,
please reach out! I’m actively looking for projects, case studies, research, and ideas for what a natural language code editor should be. If you have anything to share it’s much appreciated!


Researching Spatial & Embodied Computing @Columbia University, U Penn and U Mich