Developing LLM-powered applications with Prompt Flow

An implementation with Python and Streamlit

Valentina Alto
Published in Microsoft Azure
8 min read · Jan 15, 2024


Over the last year, LLMs have paved the way for new application architectures. It soon became clear that new frameworks and orchestrators were needed to support the end-to-end development of LLM-powered applications.

During the opening keynote of Microsoft Ignite on November 15th, Microsoft announced its platform to build, test, evaluate, and deploy end-to-end pipelines for typical GenAI architectures, such as Retrieval-Augmented Generation (RAG).

One of the most remarkable capabilities of Azure AI Studio is Prompt Flow, a low-code GUI that helps developers and AI professionals build their own copilots.

In my last article, we saw how Azure AI Studio and Prompt Flow can assist developers in evaluating LLM-powered application outputs, using built-in, AI-assisted evaluation metrics. As an example, we built a Wikipedia Q&A flow from a template available in the Prompt Flow catalog.

In this article, we will use this same flow as an example; if you want to replicate the hands-on steps, you can follow the practical instructions here.

Once deployed as a managed endpoint, your flow can be consumed via a REST API and embedded within your applications. In this article, we are going…
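To give a concrete sense of what that consumption looks like, here is a minimal sketch of a Streamlit front end calling the deployed flow's scoring endpoint. The endpoint URL, key, and the `question`/`answer` field names are assumptions for illustration: take the real values from your endpoint's Consume tab in Azure AI Studio, and match the field names to your flow's actual inputs and outputs.

```python
import os

import requests
import streamlit as st

# Assumed environment variables: copy the scoring URL and key
# from your managed endpoint's "Consume" tab in Azure AI Studio.
ENDPOINT_URL = os.environ["PROMPT_FLOW_ENDPOINT_URL"]
API_KEY = os.environ["PROMPT_FLOW_API_KEY"]


def ask_flow(question: str) -> str:
    """POST a question to the deployed flow and return its answer."""
    headers = {
        "Content-Type": "application/json",
        # Managed endpoints accept the key as a bearer token.
        "Authorization": f"Bearer {API_KEY}",
    }
    # "question" and "answer" are hypothetical field names; they must
    # match the input and output names defined in your flow.
    response = requests.post(ENDPOINT_URL, headers=headers, json={"question": question})
    response.raise_for_status()
    return response.json()["answer"]


st.title("Wikipedia Q&A")
question = st.text_input("Ask a question about any Wikipedia topic")
if question:
    with st.spinner("Querying the flow..."):
        st.write(ask_flow(question))
```

Run it with `streamlit run app.py`; Streamlit re-executes the script on each interaction, so the endpoint is only called when a question has been entered.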

