Prompt Prefixing for LLMs: Efficient Zero-Shot Prompting

Anmol Talwar
6 min read · Nov 2, 2023


In this blog, we will deep-dive (using examples) into the simplest type of prompting, i.e., zero-shot prompting, and how to use it efficiently via prompt prefixing.

To understand zero-shot prompting with visual aids, along with its practical implementation in Python, watch my video:

To comprehend prompt prefixing, let's first briefly understand what a prompt is and what its components are.

Prompt

A “prompt” refers to a specific input or instruction provided to an AI model to guide its output or generation. A prompt has three main components, though not all three are needed for every application.

Components of a Prompt: Q&A based on a Climate Change article
  • Instruction: Provides a clear directive to the AI model about the task it should perform. The instruction specifies the action, such as “Translate,” “Summarize,” “Answer,” or “Generate,” and guides the model’s behavior, setting the high-level objective for the generated content. [For instance, in the image above, the LLM is instructed to answer questions on climate change within the context provided.]
  • Context: Offers additional information or background that helps the AI model understand the specific requirements or constraints of the task. Context can include topic information, input data, or any relevant details needed to complete the task effectively, ensuring that the model’s response aligns with the user’s expectations. [For instance, in the image above, the context is provided in the form of an article on climate change, which is referred to while answering the questions.]
  • Input Data / Query: Some applications require input data that the model should process or use as a basis for generating responses. This component may include the text to be translated, the article to be summarized, or the question to be answered. [For instance, in the image above, the user is asking about the cause of climate change, which is answered within the context provided.]
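The three components above can be assembled into a single prompt string. Here is a minimal sketch; the `build_prompt` helper is hypothetical, not part of any library:

```python
def build_prompt(instruction: str, context: str, query: str) -> str:
    """Combine the three prompt components into one string."""
    # Instruction first, then the context, then the user's query.
    return f"{instruction}\n\nContext:\n{context}\n\nQuestion: {query}"

prompt = build_prompt(
    instruction="Answer the question given the context",
    context="Climate change is a global challenge that affects every living being on our planet.",
    query="What causes climate change?",
)
print(prompt)
```

Keeping the components separate like this makes it easy to swap in a different instruction (prefix) later without touching the context or the query.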

Zero-Shot Prompting

Zero-shot prompting is a technique used with generative AI models to generate text without any fine-tuning or training on a specific task. In zero-shot prompting, the user provides a prompt that specifies the desired task or context, and the model generates a response based on its pre-trained knowledge and general language understanding.

Let's understand it using a Q&A use case on climate change with GPT-3.5 Turbo:

Query 
What causes climate change?

Article on Climate Change (Context)
Climate change is a global challenge that affects every living being
on our planet. It is caused by a rise in greenhouse gas emissions,
leading to global warming and severe weather events. One of the obvious
reasons for increased emissions is growing population and its demands.
World population was 7 Billion in 2010 and has grown 50% in 2020.
It increased another 10% in the next 2 years. As the population increases,
the demand for fuel, food and other essential resources increases.
This leads to more industrialization and pollution

Instruction / Prefix
Answer the question given the context

Based on the above prompt, we get the below response as an output from the LLM.

Response
The main cause of climate change is the rise in greenhouse gas emissions,
which leads to global warming and severe weather events. One of the factors
contributing to increased emissions is the growing population and its demands.
As the population increases, the demand for fuel, food, and other essential
resources also increases, resulting in more industrialization and pollution.
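The zero-shot call above can be reproduced with a chat-style API; the sketch below assumes the `openai` Python client (v1 style) and an `OPENAI_API_KEY` in the environment, so the actual request is left commented out and only the prompt assembly runs:

```python
# The article used as context (abridged here for brevity).
article = (
    "Climate change is a global challenge that affects every living being "
    "on our planet. It is caused by a rise in greenhouse gas emissions, "
    "leading to global warming and severe weather events."
)
prefix = "Answer the question given the context"  # the zero-shot instruction
query = "What causes climate change?"

# The prefix goes in the system message; context and query go in the user turn.
messages = [
    {"role": "system", "content": prefix},
    {"role": "user", "content": f"Context:\n{article}\n\nQuestion: {query}"},
]

# Uncomment to send the request (assumes `pip install openai` and OPENAI_API_KEY):
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
# print(resp.choices[0].message.content)
```

No fine-tuning or examples are supplied: the model answers purely from the instruction, the context, and its pre-trained knowledge, which is exactly what makes this zero-shot.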

Prompt-Prefixing

Prompt prefixing, in the context of Large Language Models (LLMs), refers to the practice of adding a predefined or customized text prefix to a user’s input when making a request to the model. The prefix serves as a way to instruct the model about the desired task or context of the input.

With respect to the Zero-Shot example above, the prefix or the instruction given to the LLM is to respond to the user query within the confines of the context or the article provided.

Prompt prefixing is commonly used to guide the LLM to produce responses that are coherent, relevant, and contextually appropriate. By providing a specific prompt or context in the form of a prefix, you can improve the quality of the model’s responses.

Let's understand it using another Q&A example:

Query 
What is world's population in 2022

Article on Climate Change (Context)
Climate change is a global challenge that affects every living being
on our planet. It is caused by a rise in greenhouse gas emissions,
leading to global warming and severe weather events. One of the obvious
reasons for increased emissions is growing population and its demands.
World population was 7 Billion in 2010 and has grown 50% in 2020.
It increased another 10% in the next 2 years. As the population increases,
the demand for fuel, food and other essential resources increases.
This leads to more industrialization and pollution

Vanilla Prefix (1)
Answer the question given the context

Enhanced Prefix (2)
Act as a mathematician to answer the question, given the context

For the same query (a mathematical problem) and context, we passed in two different prefixes. The first is a plain vanilla prefix instructing the LLM to answer within the given context, whereas the second explicitly asks the LLM to act as a mathematician and respond within the given context.

Response 1
The world's population in 2022 cannot be determined based on the given context.

Response 2
To estimate the world's population in 2022, we can use the given information.
The world population was 7 billion in 2010 and grew by 50% in 2020.
A 50% increase on 7 billion would be an additional 3.5 billion people,
bringing the population to 10.5 billion in 2020.

Furthermore, it is stated that the population increased by another 10%
in the next two years. A 10% increase on 10.5 billion would be an additional
1.05 billion people. Adding this to the previous population, we can estimate
that the world's population in 2022 would be approximately 11.55 billion people.

It is important to note that this is an estimate based on the given information
and may not be exact. The world's population is influenced by various factors,
such as birth rates, mortality rates, and migration, which can vary over time.

Just enhancing the prefix by instructing the LLM more specifically improved the response.
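The prefix swap above can be scripted so both variants run against the same context and query. A sketch, again leaving the `openai` request itself commented out:

```python
context = (
    "World population was 7 Billion in 2010 and has grown 50% in 2020. "
    "It increased another 10% in the next 2 years."
)
query = "What is world's population in 2022"

# Same query and context, two different prefixes.
prefixes = {
    "vanilla": "Answer the question given the context",
    "enhanced": "Act as a mathematician to answer the question, given the context",
}

def make_messages(prefix: str) -> list:
    """Build the chat messages for one prefix variant."""
    return [
        {"role": "system", "content": prefix},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {query}"},
    ]

for name, prefix in prefixes.items():
    messages = make_messages(prefix)
    # resp = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
    print(name, "->", messages[0]["content"])
```

Structuring the code this way makes prefix experiments cheap: only the system message changes between runs, so any difference in output is attributable to the prefix.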

Let's take this even further using a subjective use case: text summarization.

Query 
Summarize the article

Article on Climate Change (Context)
Climate change is a global challenge that affects every living being
on our planet. It is caused by a rise in greenhouse gas emissions,
leading to global warming and severe weather events. One of the obvious
reasons for increased emissions is growing population and its demands.
World population was 7 Billion in 2010 and has grown 50% in 2020.
It increased another 10% in the next 2 years. As the population increases,
the demand for fuel, food and other essential resources increases.
This leads to more industrialization and pollution

Vanilla Prefix (1)
Answer the question given the context

Templatized Prefix (2)
Summarize the article point by point based on the format below:
1. Major Cause of Climate change
2. Other reasons for Climate change
3. Population increase over the years
Answer only using the given context

We aim to summarize the same article using two different prefixes. The first is the vanilla prefix we saw in the last example; the second is a templatized prefix used to create a summary in a custom, structured format.

Response 1 
The article discusses climate change as a global challenge caused by a rise in
greenhouse gas emissions. It highlights the role of the growing population and
its demands in increasing emissions. The world population has grown by 50% from
7 billion in 2010 to present times, and it continues to increase, leading to
higher demand for resources and more industrialization, which contributes to
pollution.

Response 2
1. Major Cause of Climate change: The major cause of climate change is the rise
in greenhouse gas emissions, leading to global warming and severe weather
events.
2. Other reasons for Climate change: One of the obvious reasons for increased
emissions is the growing population and its demands. As the population
increases, the demand for fuel, food, and other essential resources
increases, leading to more industrialization and pollution.
3. Population increase over the years: The world population was 7 billion
in 2010 and has grown by 50% in 2020. It increased another 10% in the next
2 years.

Though both summaries are appropriate, explicitly instructing the LLM to generate the summary in a templatized format lets you customize the response.
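A templatized prefix like the one above can be generated from a list of section headings, so the same structure is reusable across articles. The `templatized_prefix` helper here is hypothetical:

```python
def templatized_prefix(sections: list) -> str:
    """Build a point-by-point summary instruction from section headings."""
    lines = ["Summarize the article point by point based on the format below:"]
    # Number each requested section, mirroring the format shown to the LLM.
    lines += [f"{i}. {heading}" for i, heading in enumerate(sections, start=1)]
    # Constrain the model to the supplied context.
    lines.append("Answer only using the given context")
    return "\n".join(lines)

prefix = templatized_prefix([
    "Major Cause of Climate change",
    "Other reasons for Climate change",
    "Population increase over the years",
])
print(prefix)
```

Because the model tends to echo the numbered headings back in its answer, the returned summary is also easier to parse programmatically than free-form prose.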

Overall, prompt prefixing enhances the user experience and ensures that generative AI models generate content that aligns with user expectations and requirements. It is a valuable technique for making these models practical and useful across a wide range of natural language processing tasks.
