Why GPTs aren’t (yet) the new App Store

Duncan Anderson
Barnacle Labs
Nov 13, 2023

OpenAI just released GPTs, which are customised versions of ChatGPT.

You provide a custom prompt, documents to use as sources of answers, and actions that let information from external APIs be integrated with your GPT. It’s really simple to build a GPT: now anyone can do it!

Once you’ve built your GPT, it’s then available within ChatGPT. Currently, you can share it with others via a link, and OpenAI has committed to rolling out a ‘GPT Store’ within the month to aid discovery.

The experience of building and using GPTs is slick and well executed.

Is OpenAI onto something? Take a look at the initial GPTs, all created within just days of the launch announcement, and judge for yourself.

There’s definite interest, but will this be bigger than ChatGPT plugins, which have been a bit of a damp squib? The GPTs I’ve tried have varied in quality, but it’s only days since the announcement, so it’s too early to tell.

A big moment?

There’s an opinion floating around that says GPTs just sherlocked a part of the AI industry. Custom bots, retrieval augmented generation, vector databases, etc — all these things suddenly fall within the OpenAI orbit.

However, I don’t see this at all. GPTs are a welcome addition, but there are lots of reasons why you might need to build something outside of that orbit…

1. 🏪 ChatGPT is a niche app store

You can only use a custom GPT within ChatGPT, meaning you need a ChatGPT account just to use it. You can’t (yet, at least) host a custom GPT from some random website. ChatGPT might be popular, but it’s popular mostly in a certain bubble of people. GPT builders need to be conscious of the nature of that audience. I would hazard a guess that this trends heavily towards a younger, tech-savvy, middle-class audience.

Then there’s the question of whether GPTs are going to require a paid-for ChatGPT Plus subscription, which they may well do. If that turns out to be the case, the audience is further limited.

This challenge is especially stark when we then consider enterprise access. There might be an audience for GPTs within enterprises, but that will require those enterprises to have an enterprise ChatGPT license.

From what I’ve seen, few businesses have yet bitten the bullet and ponied up for one. It’s still early days for that product, and a lot of businesses are struggling to justify the costs. For the right workers it’s a no-brainer, but proving that and building the business case to justify investment is a work in progress for many. Most CFOs are still saying “prove to me why I should spend more money”.

If you want to build a custom GPT that targets enterprise customers, good luck. I suspect the audience will be very limited for a while yet.

2. 💰 Storing documents in OpenAI is costly

If you want a GPT to access your data, which is kind of the whole point, that data must be uploaded to OpenAI. When you do that, you’ll be charged storage costs.

Let’s compare those costs with the broader market:

  • OpenAI data storage cost: $0.20/GB, per assistant, per day.
  • Amazon S3 storage cost: $0.023/GB, per month.

In other words, at roughly $6.00/GB per month (30 × $0.20), OpenAI are charging around 260x the market rate for bucket storage.
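A quick back-of-the-envelope check of that ratio (the 30-day month is an assumption made purely for the comparison):

```python
# Compare OpenAI assistant storage pricing with S3, normalised to a month.
OPENAI_PER_GB_DAY = 0.20    # $ per GB, per assistant, per day
S3_PER_GB_MONTH = 0.023     # $ per GB, per month (S3 Standard)
DAYS_PER_MONTH = 30         # assumption for the comparison

openai_per_gb_month = OPENAI_PER_GB_DAY * DAYS_PER_MONTH   # $6.00/GB/month
ratio = openai_per_gb_month / S3_PER_GB_MONTH              # ~261x

# What a 100 GB knowledge base behind a single assistant would cost:
monthly_cost = 100 * openai_per_gb_month                   # vs ~$2.30 on S3

print(f"{ratio:.0f}x the S3 rate; 100 GB costs ${monthly_cost:.2f}/month")
```

The per-assistant multiplier makes this worse still: the same data attached to several assistants is charged for each of them.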

This is expensive. Very expensive!

If you’re just hosting a few small PDFs, you probably won’t care much. However, knowledge bases of any size will get expensive really quickly at that rate.

As context, at Barnacle Labs we operate an AI RAG infrastructure we built for the scientific community that hosts many millions of PDF documents. The cost of doing that as a GPT would probably be financially ruinous, so it’s not an option at our scale. There will be a market for GPTs, but not where the knowledge base is extensive.

3. 🔥 Enterprises won’t want to store confidential data outside their firewall

I’ve already mentioned the cost of hosting data on OpenAI’s platform. But for many businesses, there are also information security risks. If you want to build an internal GPT that works on proprietary data, it’s unlikely this is the solution. Enterprises building solutions that access proprietary data may well feel that data needs to stay within their firewall.

That means GPTs are probably more focussed on consumer audiences than on internal enterprise users, for now.

4. 👘 One size doesn’t fit all

The technology behind GPTs is opaque and undisclosed. It almost certainly includes an implementation of the retrieval augmented generation (RAG) pattern, with a vector store and similarity matching algorithm(s). Exactly what that technology is, which vector store is used, what the chunking strategy for documents is, how the algorithms work, etc, are all unknowns and impossible to influence.

My personal experience is that RAG solutions have a lot of opportunity for tuning to improve accuracy, but this isn’t available to GPT builders. This “consumerisation” of the build process is welcome and hugely positive because it increases the audience of those who can build solutions. However, those building more ambitious solutions will still need access to the underlying technology so they can tweak and twiddle to their heart’s content.
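To make that concrete, here’s a toy sketch of the kind of knobs a self-built RAG pipeline exposes (chunk size, chunk overlap, how many passages are retrieved) and that GPT builders can’t touch. The bag-of-words ‘embedding’ and the function names are illustrative stand-ins, not OpenAI’s actual implementation:

```python
# Toy RAG retrieval step: every constant below is tunable when you own the
# pipeline, and hidden when you build a GPT.
from collections import Counter
import math

CHUNK_SIZE = 200   # characters per chunk
OVERLAP = 50       # overlap between consecutive chunks
TOP_K = 3          # number of chunks injected into the prompt

def chunk(text: str) -> list[str]:
    """Split a document into overlapping chunks."""
    step = CHUNK_SIZE - OVERLAP
    return [text[i:i + CHUNK_SIZE] for i in range(0, max(len(text) - OVERLAP, 1), step)]

def embed(text: str) -> Counter:
    """Toy embedding: bag of lowercased words (a real system uses a model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: list[str]) -> list[str]:
    """Return the TOP_K chunks most similar to the query."""
    chunks = [c for doc in corpus for c in chunk(doc)]
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:TOP_K]
```

Swapping the embedding model, re-ranking results, or changing the chunking strategy are exactly the sort of accuracy-tuning levers my experience says matter, and none of them is available inside a GPT.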

Summary

GPTs are a great solution. They broaden the audience of people able to build custom GPT solutions and almost anyone can now do so. But as with most consumerisation initiatives, this starts with a relatively simple solution that won’t satisfy those with higher ambitions. The costs preclude larger knowledge bases and the limited audience of ChatGPT (Plus?) users will blunt its wider impact in the short term. Further, those with higher ambitions will need greater access to the underlying technology and the ability to address information security concerns.

OpenAI could choose to address all of the concerns I’ve raised and I would expect them to do so in the long term. GPTs is an important initiative, but I expect its impact may be relatively limited in the short term.

I’m particularly intrigued to see if OpenAI might widen access to GPTs outside of ChatGPT. Does OpenAI see ChatGPT as the new app store, and so will limit access in order to encourage take-up of ChatGPT? Or, will they see the bigger opportunity to be getting people to use GPTs, whether inside or outside of ChatGPT itself? That decision may have a significant impact on how important the custom GPT initiative becomes.


Duncan Anderson
Barnacle Labs

Eclectic tastes, amateur at most things. Learning how to build a new startup. Former CTO for IBM Watson Europe.