You’re Using ChatGPT Wrong!

Improving your ChatGPT prompts and how searching differs from prompting

Oliver Lövström
Write A Catalyst
3 min read · Jul 8, 2024


The Old Way of Searching

Have you ever felt that Googling is like searching a cluttered library, hunting for a needle in a haystack? With Generative AI models like ChatGPT, Gemini, and Perplexity, you no longer need to wander through this endless library. However, prompting a Generative AI model is different from a regular search.

Precision: The Stepping Stone of Effective Prompting

I recently watched an X-Files episode in which the protagonist, Agent Mulder, discovers a genie and wishes for world peace, only to have all humans wiped from Earth. Mulder gets what he wishes for, but not in the way he intended. If only he had made a more specific wish. This scenario illustrates how Generative AI works: precision is crucial.

Imagine searching on Google for “What is the color of snow?”. The top result simply says “white”. Ask ChatGPT the same question, however, and you will get an overly long response:

“The color of snow is typically white. This is because snow is composed of small ice crystals, which reflect and scatter all wavelengths of visible light. This scattering of light causes snow to appear white to our eyes.”

To get a concise answer, we need to be more specific: “What is the color of snow? In one word only”. This gives the shorter answer: “white”.
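As a minimal sketch of this idea in code (the helper names and the message structure are my own illustration, not from the article), the formatting constraint can be appended to the base question before it is packaged for a chat-style API:

```python
def add_format_constraint(question: str, constraint: str) -> str:
    """Append an output-format instruction to a base question."""
    return f"{question.rstrip()} {constraint}"


def to_chat_messages(prompt: str) -> list:
    """Wrap a prompt in the message structure chat APIs typically expect."""
    return [{"role": "user", "content": prompt}]


prompt = add_format_constraint("What is the color of snow?", "In one word only.")
print(prompt)  # What is the color of snow? In one word only.
print(to_chat_messages(prompt))
```

The point is only that the constraint travels with the question as part of a single prompt; the model never sees our intent, only the text we send.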

Creating the Perfect Prompt

Unlike Googling, creating the perfect prompt depends on specificity. In the previous example, we added a formatting instruction to the prompt, making the output more concise. We can improve our prompts further by giving the system more information: adding context, formatting, roles, or examples.
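One way to sketch this (the helper and its field names are illustrative assumptions, not an established API) is a small function that assembles those ingredients into a single prompt:

```python
def build_prompt(task, role=None, context=None, examples=None, output_format=None):
    """Assemble a prompt from a task plus optional role, context, examples, and format."""
    parts = []
    if role:
        parts.append(f"You are {role}.")
    if context:
        parts.append(f"Context: {context}")
    if examples:
        parts.append("Examples:\n" + "\n".join(f"- {ex}" for ex in examples))
    parts.append(f"Task: {task}")
    if output_format:
        parts.append(f"Output format: {output_format}")
    return "\n\n".join(parts)


# Hypothetical usage in the spirit of the cat-story example below:
prompt = build_prompt(
    task="Write a short story about a cat.",
    role="a seasoned fiction writer with a conversational, informal voice",
    context="The story is for a general audience and should feel personal.",
    output_format="Three paragraphs, no headings.",
)
print(prompt)
```

Each optional argument adds one more layer of specificity; leaving them all out reduces the prompt back to the bare task.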

Example: Bypassing AI Detectors

If you are familiar with Generative AI, you know it is challenging to get a model to generate text that passes as human-written. However, more specific prompts can address this. To illustrate, we will use a single prompt to generate a story about a cat that passes common AI detectors. See the story prompt in the image below: in it, we give the system more information by assigning it a role and adding context and formatting instructions. The resulting output passes most AI detectors.

Prompt Example — Image by Author.

Prompt engineering isn’t limited to text-to-text Generative AI; it also applies to multimodal systems such as text-to-image models. I won’t go into much detail, but adding more information can produce much nicer and more interesting images. The first image below was generated from a prompt containing only a task, while the prompt for the second also specifies the context and style of the image.

Generated Images by Midjourney.
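As a rough illustration of the task-only versus task-plus-context-plus-style contrast (the helper and the example fields are my own guesses at useful ingredients, not Midjourney's official prompt syntax), the two prompts might be assembled like this:

```python
def build_image_prompt(subject, context="", style=""):
    """Join a subject with optional context and style into one comma-separated prompt."""
    parts = [subject]
    if context:
        parts.append(context)
    if style:
        parts.append(style)
    return ", ".join(parts)


# Task only:
print(build_image_prompt("a cat"))
# Task + context + style:
print(build_image_prompt(
    "a cat",
    context="curled up on a windowsill at sunset",
    style="soft watercolor, warm palette",
))
```

The second prompt gives the model far more to work with, which is what tends to produce the richer image.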

Where to Learn More?

Adding specificity and information to our prompts can greatly improve the output of Generative AI models. However, the precision of a prompt is only one part of the prompt engineering equation. If you want to learn more, I recommend the “Prompt Engineering with Llama 2” course and the other LLM/Generative AI courses from the Deep Learning Institute. You can access them for free when signing up here. Additionally, for ChatGPT, I recommend reading OpenAI’s “Prompt engineering” documentation.
