Stubbs is coming — form your own opinions.

Stubbs, the Generative App Builder powered by Gemini.

Bedros Pamboukian
6 min read · Oct 22, 2023

Preface

This is a technical article, meant to lay out all the facts for all sides. If you wish to cover a higher-level version of this, feel free to do so; just link this source somewhere in there. If you want me to write the higher-level version myself, say so in the Responses section of this article.

Feel free to just scroll to the good part, but I think this is worth reading. A few days ago, I published a piece on Gemini and Stubbs. Gemini is something everyone has heard of, with clear definitions of what it can do. Stubbs? Not so much, and I may be the one to blame.

I added a response to that post clarifying that it's unclear what Stubbs can do, and that it might not even generate fully fledged apps at all; later on, I added a response to that as well, saying I've got proof that it can write code. I've been digging for days, trying to pin down a clear definition of Stubbs, and I haven't found one. In my previous post I made a few assumptions about Stubbs that could be misleading, and it wasn't kept entirely factual. So this article will only contain the facts; I'll give my interpretation of it elsewhere. Maybe Twitter? Or Reddit? Or really just the Responses section…

The facts

This'll be split into sections: indicators that Stubbs is a full app generator, indicators that it's a prompt generator, and indicators that fit both. The reason it's not clear which one it is comes down to the terminology being so loose. "App" can refer to a model, a prompt, or an entire application.

Prompt Generator

Covers facts indicating this is a tool for making "models" built for a specific purpose without needing to fine-tune.

“executeStubbsPrompt” -> Executes a model prompt entered into a generated app.
LLM-style suggestions for Stubbs creation, which indicates this creates AI model responses

Executes a model prompt entered into a generated app. Google is using "app" vaguely here.
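To make the prompt-generator reading concrete: under this interpretation a "Stubb" would be little more than a stored prompt template, and executeStubbsPrompt would just fill that template with user input and forward it to a plain text-generation call. The sketch below is pure speculation built on the leaked names; the stubb_prompt structure and the execute_stubbs_prompt helper are mine, not Google's, and only the palm.generate_text call is a real, public API.

import google.generativeai as palm

palm.configure(api_key="YOUR_MAKERSUITE_API_KEY")  # placeholder key

# Hypothetical: what a "Stubb" might store if it is just a prompt wrapper.
stubb_prompt = {
    "template": "You are a recipe assistant. Suggest a dish using: {ingredients}",
    "model": "models/text-bison-001",  # Gemini is not under models/ right now
}

def execute_stubbs_prompt(stubb, user_input):
    # Speculative stand-in for the leaked "executeStubbsPrompt" method:
    # fill the stored template and hand it to an ordinary GenerateText call.
    prompt = stubb["template"].format(ingredients=user_input)
    completion = palm.generate_text(model=stubb["model"], prompt=prompt)
    return completion.result

print(execute_stubbs_prompt(stubb_prompt, "eggs, rice, spinach"))

If that's all a Stubb is, then "generating an app" really means generating and storing a good prompt, with a thin UI around it.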

Not much over here… let’s look at the App Generator.

App Generator

This covers the facts that can lead one to believe this is a fully fledged app generator, code included. If this is true, app developers may have to start integrating AI into their workflow just to keep their jobs.

The UI and design language point to this having multimodal output, not specific to code.
The app is going to have full code… of something?
Font and color palette suggestions
"Compiled from DSL" tells us this might not emit code for Angular, Flutter, or other proper frameworks

I should enable word wrap…

Compiled from diffs and/or DSL into executable code, but still needs find/replace with prompt call, etc. Will perform find/replace client-side with data from session.
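Here's roughly what that would mean in practice. Nothing below is from the leak beyond the idea itself: the placeholder names, the markup, and the session dict are all made up to illustrate "compiled output that still needs a client-side find/replace with data from the session."

# Speculative: output "compiled from DSL" could be ordinary markup with
# placeholders left in, which the client swaps out at runtime.
compiled_output = """
<div class="app">
  <h1>__APP_TITLE__</h1>
  <p>Hello, __USER_NAME__!</p>
  <button onclick="callPrompt('__PROMPT_ID__')">Ask</button>
</div>
"""

# Data the client would already have in the session (all names hypothetical).
session = {
    "__APP_TITLE__": "Recipe Helper",
    "__USER_NAME__": "Bedros",
    "__PROMPT_ID__": "stubbs/12345/prompts/0",
}

def client_side_find_replace(markup: str, values: dict) -> str:
    # The find/replace step the leak describes, performed on the client.
    for placeholder, value in values.items():
        markup = markup.replace(placeholder, value)
    return markup

print(client_side_find_replace(compiled_output, session))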

Both

This covers the facts that indicate this creates fully fledged apps that interface with the "model" created for the use case. This section having more facts does not make it the more likely possibility; it's just a side effect of these details appealing to both sides.

The remix functionality is based on a code snapshot from the original Stubb. "Code snapshot" could refer to the JSON containing info about the Stubb
“Generates an app” — “Currently wraps GenerateText”

Wrapping GenerateText to generate an app can mean different things. It could mean it's wrapping GenerateText to create a prompt that fits the AI response styling needs, or it could mean it's generating code. This leans heavily towards code, which I know sounds weird, but there's definite proof of it. The response points to GenerateStubbsResponse; have a look:

The versions of code the LLM produced (in cases where it produces multiple).
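For concreteness, here's one guess at what a response carrying "versions of code" could look like. Every field name below except GenerateStubbsResponse itself is invented; this is only to show why "multiple versions of code" reads as real source files rather than prompt text.

# Speculative shape of a GenerateStubbsResponse; every field name is a guess.
generate_stubbs_response = {
    "stubb": {
        "displayName": "Recipe Helper",
        "prompt": "Suggest a dish using the listed ingredients.",
    },
    # "The versions of code the LLM produced (in cases where it produces multiple)"
    "codeVersions": [
        {"files": {"index.html": "<!-- version 1 markup -->",
                   "app.js": "// version 1 logic"}},
        {"files": {"index.html": "<!-- version 2 markup -->",
                   "app.js": "// version 2 logic"}},
    ],
}

# Picking a version would then just be a lookup on the client.
latest = generate_stubbs_response["codeVersions"][-1]
print(latest["files"]["index.html"])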

[???]

These are details that you can warp to fit your argument, or just cool details to better fill in gaps; not all of these are Stubbs-specific.

Styling recommendations can refer to either the app style or the response style. You can pick a better model for it, or force it to use Gemini; as of right now, Gemini is not under "models/".

Overview

From the looks of it, yes, Stubbs will generate entire apps with working code. It will be powered by either Gemini or something more capable; it is not explicitly stated that it's powered by Gemini. It takes in images and provides different outputs with previews, which tells me it's a multimodal code-bot? Not truly multimodal, but supporting text, images, code / structured content, and even visualizing the output: that's a new level of AI.

My view

I rushed this part out before; here is what it said previously:

This'll generate app code and it'll excel at it. I'm not going to backpedal on what I said before; I still stand firm on that view, the view that app developers must adapt to integrating AI before they're doomed. Companies should not fire app developers or UI designers, and if any layoffs of app developers begin due to Gemini or Stubbs, then that's just entirely goofy. Stubbs, when used properly, can be a great tool for app developers to get the boring parts done. It'll cut down on the time spent building each component to the design spec, the layout, and pretty much every UI implementation detail, with functionality sprinkled throughout. This will not magically figure out backend implementations. Frontend work and app development is not just beating UI into code.

I think this is a step towards app devs being replaced. It seems to use a domain-specific language for this rather than actually writing code for it; check the evidence up there. This is great for people who wish to create personal apps without installing some random 2-star app. However, I still stand strong on the point that app developers must adapt to integrating AI before it's too late. Stubbs is an early warning; imagine where we'll be in just a few months. This field is growing at an alarming rate, and I expect this sort of thing to be just a demonstration.

So why do I think app devs are doomed? Not all app developers are doomed. They are doomed if they do not adapt, and a crucial part of development is the ability to adapt. Chances are, if you get replaced by another person who can use AI more effectively, you were bound to be replaced from the beginning.

Other news

LaMDA might be added to MakerSuite and we're getting more models soon?
That's a huge assumption. It comes from the API docs, which use lamda as an example for the models under MakerSuite's "models/" rather than just listing the available models, like text-bison-v001; we've got models/lamda-v001 and models/ulm-small-v000 listed as examples. Could be an indicator!

"Model": {
"properties": {
"name": {
"description": "Required. The resource name of the Model. This serves as an ID for the Model. Format: models/{model} with a {model} naming convention of: \"{base_model_id}-v{version}\" Examples: `models/lamda-v001` `models/ulm-small-v000`",
"type": "string"
},
...
"displayName": {
"type": "string",
"description": "The human-readable name of the model. E.g. \"ULM Medium\" The name can be up to 128 characters long and can consist of any UTF-8 characters."
},
...
"baseModelId": {
"type": "string",
"description": "Required. The name of the base (or foundational) model. Examples: `ulm-small` `ulm-xxl` `lamda`"
}
},
"id": "Model",
"description": "Information about a Generative Language Model. See http://google3/google/ai/generativelanguage/v1main/model.proto?q=symbol:Model",
"type": "object"
}

^ ULM refers to PaLM 2
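If you want to check what actually sits under "models/" right now, the public MakerSuite/PaLM API exposes a model-listing call. The snippet below uses the google-generativeai Python package; the API key is a placeholder, and the output is whatever Google currently serves, so don't take my word for what shows up.

import google.generativeai as palm

palm.configure(api_key="YOUR_MAKERSUITE_API_KEY")  # placeholder key

# Print everything currently exposed under "models/", plus a few of the
# fields from the Model schema quoted above.
for model in palm.list_models():
    print(model.name, "| base:", model.base_model_id, "|", model.display_name)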

Same as last time, this is the source of the leak. This time, you get to form your own opinions, as now I have provided images and the facts instead of mainly opinions. Images from this article can be used and linked — just provide credit. This took time to compile. If you want all the JSON, I can provide that too at a later time if it’s a popular request, though I think this JSON can be easily found.
