Photo by Riho Kroll on Unsplash

ChatGPT is stochastic

That’s why ChatGPT gives you different outputs.

Mastafa Foufa
9 min read · Mar 3, 2023


Context

I ran ChatGPT on the exact same query multiple times and got different outputs each time.

In the example below, I asked the OpenAI tool to summarize a passage from a Wikipedia article about Cristiano Ronaldo, the famous football player.

ChatGPT returns several outputs given the same input query.

As we can see, the outputs differ despite being generated from the exact same query. To generate them, several parameters can be adjusted through the openai library, including temperature, max_tokens, and top_p. Some of these parameters are key to understanding why the outputs differ from run to run.

Parameters used to summarize the text. Temperature is set to 0.9 and all other parameters are left at their default values. For example, top_p defaults to 1, so top-p sampling considers all candidate tokens.
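To see why a nonzero temperature makes the output stochastic, here is a small, self-contained sketch of temperature-scaled sampling over a handful of candidate tokens. The logits are illustrative numbers, not values from any real model, but the mechanism is the same one the API uses.

```python
import numpy as np

def sample_token(logits, temperature=0.9, rng=None):
    """Sample one token index from temperature-scaled logits.

    Illustrative only: real models sample over tens of thousands of
    tokens, but the mechanism is the same.
    """
    rng = rng or np.random.default_rng()
    scaled = np.array(logits, dtype=float) / temperature  # <1 sharpens, >1 flattens the distribution
    probs = np.exp(scaled - scaled.max())                  # numerically stable softmax
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Toy logits for three candidate tokens: repeated calls can pick different
# tokens, which is why the same prompt can yield different completions at
# temperature 0.9.
logits = [2.0, 1.5, 0.3]
print([sample_token(logits) for _ in range(5)])
```

At temperature 0 the most likely token is picked essentially every time, which is why lowering the temperature makes the outputs far more repeatable.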

Reproducing the context

To reproduce the above, you can quickly leverage the openai Python library. Below, you will find my source code. At first glance, the parameters may be new to you, and you would need to learn a bit more to find the optimal parameters based on the use…
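A minimal sketch of such a call, assuming the pre-1.0 openai Python package, a placeholder API key, and a short stand-in passage to summarize, might look like this:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # assumption: replace with your own key or load it from the environment

# Hypothetical passage; the article uses a Wikipedia excerpt about Cristiano Ronaldo.
passage = "Cristiano Ronaldo is a Portuguese professional footballer..."

response = openai.Completion.create(
    model="text-davinci-003",  # assumption: any completions-capable model of that era works
    prompt=f"Summarize the following passage:\n\n{passage}",
    temperature=0.9,           # the setting from the article; higher means more randomness
    max_tokens=128,            # assumption: enough room for a short summary
    top_p=1,                   # default: sample over all candidate tokens
)

print(response["choices"][0]["text"].strip())
```

Running this twice with temperature=0.9 will usually produce two different summaries; setting temperature to 0 makes the output close to deterministic.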



Mastafa Foufa

Data Scientist @Microsoft | ex-Teacher @EPITA Paris | 8 patents in AI