Automate Prompt Engineering with Prompt Debugger

Bhavishya Pandit
Google Cloud - Community
4 min read · Jul 15, 2024

Imagine this: you’re on a quest for knowledge, and your guide is a vast library containing every book ever written. Sounds perfect, right? But what if the librarian speaks a strange, convoluted language, and retrieving the information you need becomes an exercise in frustration?

That’s the situation we face with Large Language Models (LLMs). These AI marvels are brimming with potential, but their raw power is often inaccessible without a special key: Prompt Engineering.

Prompt engineering is like giving clear instructions to a super-powered translator. It helps Large Language Models understand what you want them to do, turning gibberish into gold!

How Do You Engineer Prompts?

· Instruction and Context: We provide clear instructions and relevant context to guide the LLM’s understanding. Imagine showing the librarian the topic you’re interested in, not just saying “Give me a book.”

· Examples and Templates: Sometimes, showing is better than telling. We can use examples or templates to illustrate the desired format and style of the output. Think of it as giving the librarian a few pages from the kind of book you’re looking for.

· Fine-tuning and Iteration: The beauty of Prompt Engineering is its iterative nature. We test different prompts, analyse the results, and refine our approach until we unlock the LLM’s hidden gem. It’s like a constant conversation, where we learn the LLM’s strengths and weaknesses to get the best out of it (see the sketch after this list).
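To make these three ideas concrete, here is a minimal sketch of a prompt that combines an instruction, context and a couple of examples. The task, context and examples are invented purely for illustration; they are not related to the tool described below.

```python
# A minimal sketch: instruction + context + few-shot examples in one prompt.
# The task and examples here are made up for illustration.

INSTRUCTION = "Summarise the customer review in one neutral sentence."

CONTEXT = "The reviews come from an e-commerce site and may contain slang."

# Few-shot examples that show the desired format and style of the output.
EXAMPLES = [
    ("Battery died after two days, total waste of money!",
     "The reviewer reports the battery failed after two days and feels the purchase was not worth it."),
    ("Love the colour, fits perfectly, would buy again.",
     "The reviewer is happy with the colour and fit and would repurchase."),
]

def build_prompt(review: str) -> str:
    """Assemble instruction, context and examples into one prompt string."""
    shots = "\n\n".join(
        f"Review: {inp}\nSummary: {out}" for inp, out in EXAMPLES
    )
    return (
        f"{INSTRUCTION}\n\n"
        f"Context: {CONTEXT}\n\n"
        f"{shots}\n\n"
        f"Review: {review}\nSummary:"
    )

if __name__ == "__main__":
    print(build_prompt("Arrived late but the packaging was nice."))
```

Iteration then simply means editing the instruction, context or examples and re-running until the output matches what you expect.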

Prompt Engineering has no doubt made our lives fairly easy, but we still have to iterate over our prompts and tweak them when the desired output isn’t achieved. Sometimes that can be a cumbersome job. Well, what if I tell you that there’s a tool that can automate Prompt Engineering?

Intro to Prompt Debugger

I recently built a tool, Prompt Debugger, using Google Cloud services. It debugs your existing prompt to help you get the desired result.

I would like to thank Prashanth Subrahmanyam and Romin Irani for their guidance and mentorship — it wouldn’t have been possible without them!

This post is a tutorial on Prompt Debugger. The screenshots below will help you understand how it works. You can visit the tool from here.

The screenshot below shows the welcome page, which introduces Prompt Debugger and its functionalities. The tool currently supports two: Debug Prompt and Refine Prompt.

1. Debug Prompt — helps you debug your prompt based on the set of instructions, the test case and the expected output.

2. Refine Prompt — rewrites your existing prompt based on the best practices of Prompt Engineering.

Let’s see how each functionality works.

Debug Prompt — it comprises the following fields:

1. Context Window — based on this selection, the tool restricts your token count and notifies you when your prompt exceeds the window size.

2. Input Prompt — this is where you paste your instructions.

3. Test Case — the context you want to test your input prompt on.

4. Current Output — the only optional field. As the name suggests, you can share the output you are currently getting from your existing prompt.

5. Expected Output — a mandatory field where you share the output you expect from the prompt.

The screenshot below demonstrates these fields.

The screenshot below shows the output we get after clicking Debug Prompt and saying Hallelujah three times! 😂
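If you are curious how such a request could be put together conceptually, here is a rough, hypothetical sketch. It assumes the fields are combined into a single meta-prompt for an LLM; the class, the field names and the 4-characters-per-token estimate are my own assumptions for illustration, not the tool’s actual code.

```python
# Hypothetical sketch: assemble the Debug Prompt fields into one meta-prompt
# and budget them against the selected context window. Not the tool's code.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DebugRequest:
    context_window: int                    # token budget of the target LLM
    input_prompt: str                      # your instructions
    test_case: str                         # context to test the prompt on
    expected_output: str                   # what you want the LLM to produce
    current_output: Optional[str] = None   # optional: what you get today

def exceeds_window(req: DebugRequest, chars_per_token: int = 4) -> bool:
    """Rough token estimate (~4 characters per token) against the chosen window."""
    text = req.input_prompt + req.test_case + (req.current_output or "")
    return len(text) / chars_per_token > req.context_window

def build_debug_meta_prompt(req: DebugRequest) -> str:
    """Turn the form fields into one meta-prompt asking an LLM to fix the prompt."""
    current = req.current_output or "not provided"
    return (
        "You are a prompt-engineering assistant. Diagnose why the prompt below "
        "does not produce the expected output and suggest a corrected prompt.\n\n"
        f"Prompt:\n{req.input_prompt}\n\n"
        f"Test case:\n{req.test_case}\n\n"
        f"Current output:\n{current}\n\n"
        f"Expected output:\n{req.expected_output}"
    )

if __name__ == "__main__":
    req = DebugRequest(
        context_window=8192,
        input_prompt="List three facts about the moon.",
        test_case="General-knowledge question from a quiz app.",
        expected_output="A numbered list of exactly three short facts.",
    )
    print(exceeds_window(req))
    print(build_debug_meta_prompt(req))
```

The `exceeds_window` check mirrors the context-window notification described in the first field.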

Moving on to the second feature, which is again simple yet effective. Refine Prompt comprises the following fields:

1. Context Window — to help you keep your prompt within the window size of the LLM you are using.

2. Input Prompt — takes your instruction prompt as input.

The screenshot below will help you understand the relevance of each field.

The screenshot below shows the output upon clicking Refine Prompt (you don’t have to say Hallelujah this time 😉).
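Conceptually, refining can be thought of as wrapping your prompt in a meta-prompt that asks an LLM to rewrite it according to best practices. The sketch below is a hypothetical illustration of that idea, not the tool’s implementation; the wording of the best practices and the function name are assumed.

```python
# Hypothetical sketch of a Refine Prompt request: wrap the user's prompt in a
# meta-prompt that asks an LLM to rewrite it following common best practices.
# This mirrors the two fields above; it is not the tool's actual code.

BEST_PRACTICES = (
    "state the task explicitly, give relevant context, specify the output "
    "format, and add one or two examples if they help"
)

def build_refine_meta_prompt(input_prompt: str, context_window: int) -> str:
    """Build a meta-prompt asking an LLM to rewrite the given prompt."""
    return (
        "Rewrite the prompt below so that it follows prompt-engineering best "
        f"practices ({BEST_PRACTICES}), while keeping it within roughly "
        f"{context_window} tokens.\n\n"
        f"Prompt to refine:\n{input_prompt}"
    )

if __name__ == "__main__":
    print(build_refine_meta_prompt("Tell me about dogs", context_window=8192))
```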

The tool automates Prompt Engineering for you, saving time and boosting your productivity. Do try it, and share it with your friends by clicking the Share app button.

You can copy the app’s link along with a sample text to share with people on different social media platforms.

You can also submit feedback through the thumbs up/down widget (the only data we collect from users). Your feedback will help improve the tool and make it easier for people to use.

If you have any queries, feel free to reach out to me on LinkedIn!

Hallelujah! Hallelujah! Hallelujah! 😉
