TravelGPT 😎🌴 — Using Azure Prompt Flow & OpenAI

In this short article, I'll be using Azure Prompt Flow and OpenAI to add real-time insights into chat flows. We will walk through creating a simple travel advisor with access to weather forecast data!

Arron Dougan
KPMG UK Engineering
5 min read · Oct 25, 2023


Photo by Elizeu Dias on Unsplash

For those of you who know me, I'm a sucker for a good holiday! Thus, in today's article, I'll be producing a basic travel advisor using Azure OpenAI and Azure Machine Learning Prompt Flow.

In my last article I discussed some of the limitations of current LLMs, most prominently that their training data stops in 2021 and that they have no context of your organisation's private data.

Prompt Flow allows you to work around these limitations by providing a workflow for prompt augmentation from external sources using Python.

In this basic example I'll be querying the OpenWeatherMap API (api.openweathermap.org) to provide travel recommendations with the context of the weather over the next five days.

Although basic, this is a simple way of supercharging the power of an LLM with real-time data that could be fetched from any API or via web scraping. The possibilities are endless.

Introducing Azure Machine Learning Prompt Flow

Azure Machine Learning (ML) Prompt Flow is a cool recent feature, currently in public preview, and part of Microsoft's wider ML workspace offering.

In a nutshell it allows you to do the following:

  • Build visual development workflows that combine the output of Python scripts (e.g. web scraping or API calls) with LLM prompts.
  • Work in integrated development environments, including support for custom containers to run workloads.
  • Use comprehensive monitoring and deployment options.

Seeing it in Action — “TravelGPT”

Let's go for a simple tongue-in-cheek example: a travel advisor that can give activity recommendations based on the context of the weather that day.

ChatGPT models have trawled enough of the internet to recommend activities in any given city, but they have no context of the upcoming weather. Enter Prompt Flow!

Simple, Yet Effective…

Prerequisites

  • Spin up an Azure Machine Learning instance and associated compute runtime, following the guide here.
  • Ensure you have an Azure OpenAI service spun up; you'll need to register for this service if you haven't already (details are here).
  • Head to Prompt Flow under your Azure ML workspace and register an Azure OpenAI connection in the connections tab. You'll need the endpoint URL and API key; a quick way to sanity-check those values is sketched just after this list.
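
If you want to check the endpoint URL and API key before registering the connection, a quick call with the openai Python SDK will do it. This is a minimal sketch assuming the pre-1.0 openai package and a chat deployment named gpt-35-turbo; swap in your own resource name, key, deployment, and API version.

import openai

# Assumed values: replace with your own Azure OpenAI resource, key and API version
openai.api_type = "azure"
openai.api_base = "https://<your-resource>.openai.azure.com/"
openai.api_version = "2023-07-01-preview"
openai.api_key = "<your-api-key>"

# If this returns a reply, the URL and key are good to register in Prompt Flow
response = openai.ChatCompletion.create(
    engine="gpt-35-turbo",  # hypothetical deployment name; use whatever you deployed
    messages=[{"role": "user", "content": "Say hello from TravelGPT!"}],
)
print(response["choices"][0]["message"]["content"])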

Step One — Setting the Inputs / Outputs 💬

Head on over to your ML instance studio and select Prompt Flow in the left-hand navigator, then create a new flow.

The GUI will allow you to select inputs to use throughout your flow.

We are using three inputs for this exercise: location, start time, and end time. We will use Toronto, Canada, selected because it has varying weather conditions at this time of year.
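
Under the hood, these inputs are simply named values handed to the downstream nodes. Purely for illustration (the dates below are placeholders, not values from the flow), they behave like this:

# Illustrative only: the three flow inputs as downstream nodes receive them
flow_inputs = {
    "location": "Toronto, Canada",  # city to fetch the forecast for
    "start_time": "2023-10-26",     # hypothetical itinerary start date
    "end_time": "2023-10-30",       # hypothetical itinerary end date
}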

Step Two — Querying the weather using Python 🐍

Time to write some code to fetch the weather using the input chosen earlier. I've added comments to the code below so you can see where the input (location) is fed into the tool function.

from datetime import datetime
import json

import requests
from promptflow import tool


# The inputs section will change based on the arguments of the tool function, after you save the code
# Adding type to arguments and return value will help the system show the types properly
# Please update the function name/signature per need
@tool
def my_python_tool(location: str) -> str:
    # Replace with your OpenWeatherMap API key
    api_key = "XXXXXXX"

    # Construct the API URL
    base_url = "https://api.openweathermap.org/data/2.5/forecast"
    params = {
        "q": location,
        "appid": api_key,
        "units": "metric",
        "cnt": 40,  # The free-tier API returns 8 readings a day, 40 in total over 5 days
    }
    response = requests.get(base_url, params=params)

    # Create a dictionary to store the weather data
    weather_data = {
        "location": location,
        "forecast": []
    }

    data = response.json()
    print(json.dumps(data, indent=4))

    # Extract and add the weather forecast for the next 5 days to the dictionary
    for forecast in data['list']:
        timestamp = forecast['dt']
        temp = forecast['main']['temp']
        description = forecast['weather'][0]['description']
        date = datetime.utcfromtimestamp(timestamp).strftime('%Y-%m-%d %H:%M:%S')

        weather_data["forecast"].append({
            "datetime": date,
            "temperature": temp,
            "description": description
        })

    # Return the weather data as JSON
    json_response = json.dumps(weather_data, indent=4)
    return json_response
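
If you want to try the tool outside of Prompt Flow, the decorated function should still be callable like any other Python function, assuming you have pasted in a valid OpenWeatherMap API key:

# Quick local check, run outside of Prompt Flow
if __name__ == "__main__":
    forecast_json = my_python_tool("Toronto, Canada")
    print(forecast_json)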

You can then link the input into the Python task as follows…

The beauty of Azure Prompt Flow is that it allows you to debug each step: here you can see the Python code is doing the business and retrieving the weather data.

A range of weather in Toronto — perfect to test the LLM!

Step Three — Bringing it together with the LLM 🧠

Prompt Flow then allows you to interact with an LLM using the Jinja2 templating engine; in this case, the template brings in the start and end time of the itinerary along with our freshly fetched weather data from the previous step.

We also customise it with a system message (described in detail here), which essentially tells the LLM how to behave: in this case, generating an itinerary based on the weather and city provided. The model handles the JSON-formatted data with ease, whilst also converting timestamps into the city's local time zone!

Instructing the LLM how to act
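
For illustration, here is a minimal sketch of what the LLM node's Jinja2 template might look like. The variable names (weather_data, start_time, end_time) and the wording of the system message are assumptions for this example; they just need to match the inputs you wire into the node.

system:
You are TravelGPT, a helpful travel advisor. Using the JSON weather forecast provided,
build a day-by-day itinerary between {{start_time}} and {{end_time}}, converting the
timestamps to the city's local time zone and suggesting two weather-appropriate
activities per day.

user:
Here is the 5-day weather forecast:
{{weather_data}}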

Results Time!

Let’s review the results…

A 5 day itinerary — offering weather relevant activities

Not bad: it's interpreting the next five days, converting to the local time zone, summarising each day's weather, then giving two weather-appropriate activities per day.

Wrapping up — Chat GPT with real time context

Awesome! I can now get weather-relevant travel itineraries with ease. This is a simple yet powerful linear flow example, and you can combine anything fetchable via an API or scrapable from the web with an LLM to get your desired results. It is very much the tip of the iceberg.

I see Azure Prompt Flow as a real focus area for Microsoft in the future, adding an easy and efficient way of augmenting LLMs with external data to meet more complex chat flows and business requirements.

This was certainly fun to experiment with; I hope you have enjoyed it!


Arron Dougan
KPMG UK Engineering

Azure-focused DevOps and Cloud engineer based in Manchester 🐝☁️