Published in The ASOS Tech Blog

In partnership with UCL: building chatbots with the Bot Framework SDK

ASOS is an innovative business and we’re constantly improving and learning to make our platforms better. As part of that commitment, one of our Senior Machine Learning Scientists, Dr Fabon Dzogang (@faabom), recently undertook a project in collaboration with the Centre for Doctoral Training at UCL. The project explored the latest advances in Natural Language Processing (NLP) and chatbot technology, investigating how artificial intelligence could be used to improve Customer Care chatbots.

As the world of retail increasingly moves into an online space, maintaining a smooth and positive experience for customers is essential — and Customer Care has a big role to play in that. Chatbots have long been recognised as a tool to relieve pressure on Customer Care teams whilst still providing a positive experience for customers, in a world where 60% of people would prefer to consult a bot before a human.

Please note, this is a research and development project designed to generate new learnings that could be applicable in the future. At the point of publishing, this blog post does not directly relate to the ASOS Virtual Assistant that is used by our customers.

Proof of concept

Based on the project’s research and findings, we’re going to show you how to build a simple bot using the Bot Framework on Azure, and demonstrate how we created a bot that integrates GPT-3 and plugins for a more interesting chat experience.

Introducing Ayo

We present AYO, an engaging, conversational AI internal prototype that has learned to provide the latest tips and recommendations when shopping online at ASOS. We have integrated OpenAI’s autoregressive language model GPT-3 (175 billion parameters) with Microsoft’s QnA Maker to offer a safe and unique conversational experience.

When a new user comes in, AYO first parses the incoming message to detect the user’s intent, distinguishing chit-chat (e.g. greetings, goodbye, ‘what is the weather like today?’) from a customer query (e.g. ‘where is my order’, ‘I cannot track my parcel’). Should AYO identify a query, the message is mapped to one of the FAQ pages on our website. GPT-3 then finds an answer by ranking and serving the latest tips directly from that FAQ page; alternatively, a summary of the information written beforehand by a content specialist is served, so that only information and language curated by our specialist content teams reaches the user.
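The routing described above can be sketched as follows. The three callables stand in for components not shown here (the intent classifier, the QnA Maker chit-chat model, and the FAQ matcher with its GPT-3 ranker), so treat the names as illustrative:

```python
def handle_message(text, classify_intent, chitchat_reply, faq_answer):
    """Top-level routing for an incoming message, mirroring AYO's flow.

    classify_intent, chitchat_reply and faq_answer are placeholders for
    the real components described above.
    """
    intent = classify_intent(text)  # "chitchat" or "query"
    if intent == "chitchat":
        return chitchat_reply(text)
    # Customer query: answer from the matched FAQ page (GPT-3 ranking,
    # falling back to the curated summary when confidence is low).
    return faq_answer(text)
```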

Building the bot


We don’t need to reinvent the wheel to get started. cookiecutter will create a working bot for you, including ARM templates for deployment and a detailed README with tonnes of useful information for development and testing.

pip install botbuilder-core asyncio aiohttp cookiecutter

Then run cookiecutter against Microsoft’s echo-bot template (the URL is in the Bot Framework samples documentation) to generate the project.

Then we can implement our own methods for handling the kinds of activities that the bot will encounter when conversing.

Run and test

The bot should just work if you run the generated app.py (python app.py) from a terminal inside your project.


The easiest method to interact with your bot locally is to install and run the desktop application that Microsoft provides, the Bot Framework Emulator. Point the emulator to the service running locally at http://localhost:3978/api/messages (the template’s default endpoint).

The emulator will construct the necessary HTTP requests for you behind the scenes to query the service, emulating a ‘real’ chat experience.

Try chatting to your bot.

QnA Maker

QnA Maker is a managed service provided by Microsoft, used primarily for building ‘knowledge bases’ of question-and-answer pairs. In a production-ready implementation, we would rely on our own data scientists and build a model from our own data; for a proof of concept, however, it is a huge time-saver and an extremely useful tool.

Creating a knowledge base

By following Microsoft’s instructions, you construct a knowledge base by first creating a QnA Maker service and importing a sample chit-chat dataset into it. The dataset is offered as an option during the service creation process:

Microsoft provides a range of sample chit chat datasets depending on the personality you want.

Then it’s a case of clicking ‘Save and train’ in the portal. Once you’ve trained a model, you can test the knowledge base to check that the responses make sense, and inspect the confidence score of each answer:

Inspect QnA Maker’s responses in the QnA Maker portal.

Connect our bot to QnAMaker

We can now update our code to use the Python SDK for QnA Maker, enabling the bot to query the knowledge base in real time. Update your environment:

pip install "botbuilder-ai>=4.13.0"

We also need a few new environment variables in our DefaultConfig class in config.py:
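A sketch of what that configuration might look like; the variable names follow the conventions used in Microsoft’s QnA Maker bot sample, so adapt them to your own setup:

```python
import os


class DefaultConfig:
    """Bot configuration, read from App Service settings in production."""

    PORT = 3978
    APP_ID = os.environ.get("MicrosoftAppId", "")
    APP_PASSWORD = os.environ.get("MicrosoftAppPassword", "")
    # QnA Maker settings: these map onto the SDK's QnAMakerEndpoint fields.
    QNA_KNOWLEDGEBASE_ID = os.environ.get("QnAKnowledgebaseId", "")
    QNA_ENDPOINT_KEY = os.environ.get("QnAEndpointKey", "")
    QNA_ENDPOINT_HOST = os.environ.get("QnAEndpointHostName", "")
```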

Our bot already has a method for handling incoming messages. We can update the on_message_activity function to send a request to our QnAMaker knowledge base and return the user an answer:

Connect to Teams

Interacting with a deployed bot via Teams, in the same way you would start a conversation with a colleague, is essentially a configuration change — the properties section of your ARM template should contain these values:
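As a sketch, the Teams channel can be declared as a child resource of the bot service. The resource type and property names below follow the Microsoft.BotService ARM schema, but the apiVersion and parameter names are assumptions — verify them against the current ARM reference:

```json
{
  "type": "Microsoft.BotService/botServices/channels",
  "apiVersion": "2021-03-01",
  "name": "[concat(parameters('botId'), '/MsTeamsChannel')]",
  "properties": {
    "channelName": "MsTeamsChannel",
    "properties": {
      "isEnabled": true
    }
  }
}
```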

Giphy integration

Once you’ve connected your bot to QnA Maker, you can extend it however you like. The image below shows a POC bot that ASOS is working on, which calls Giphy for a more ‘modern’ chat experience.

For some chit chat answers, our bot called Ayo responds with a popular gif instead of a text response for a more personal touch.
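A minimal sketch of such an integration, assuming Giphy’s public search endpoint (you need your own API key; the response fields follow Giphy’s documented schema):

```python
from typing import Optional

import requests

GIPHY_SEARCH_URL = "https://api.giphy.com/v1/gifs/search"


def build_giphy_params(api_key: str, query: str, limit: int = 1) -> dict:
    # Query parameters for Giphy's public search endpoint.
    return {"api_key": api_key, "q": query, "limit": limit, "rating": "g"}


def first_gif_url(api_key: str, query: str) -> Optional[str]:
    """Return the URL of the top-ranked gif for the query, or None."""
    resp = requests.get(
        GIPHY_SEARCH_URL,
        params=build_giphy_params(api_key, query),
        timeout=5,
    )
    resp.raise_for_status()
    data = resp.json()["data"]
    return data[0]["images"]["original"]["url"] if data else None
```

The returned URL can then be attached to the outgoing activity in place of a plain-text response.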


GPT-3 integration

GPT-3 probably deserves a blog all of its own. We decided to harness the power of GPT-3 for a few reasons:

  • To generate artificial data for us to train basic chit chat and FAQ models on, given a set of parameters.
  • To reduce the likelihood of malicious or toxic language from being shown to the user.
  • To decide which text snippet from each of our Customer Care pages best answered the user’s question and return it to them as appropriate.
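As an illustration of the first point, a paraphrasing prompt for one intent might be built like this. The wording is hypothetical — the project’s actual prompts are not shown in this post:

```python
def augmentation_prompt(intent: str, examples: list) -> str:
    """Build a prompt asking GPT-3 to paraphrase user queries for one intent.

    Hypothetical wording: the project's real prompts are not public.
    """
    lines = [
        f'Paraphrase each of these customer messages about "{intent}", '
        "keeping the meaning identical:"
    ]
    lines += [f"- {example}" for example in examples]
    lines.append("Paraphrases:")
    return "\n".join(lines)


# The prompt is then sent to the GPT-3 Completions endpoint, and the
# generated paraphrases are appended to the intent's training set.
prompt = augmentation_prompt(
    "track my order",
    ["Where is my parcel?", "I cannot track my order"],
)
```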
Figure 1. Synthetic data augmentation with GPT-3. Our intent classifier is trained by asking GPT-3 to produce additional training points of user queries; the x-axis indicates the number of new samples per message in the training set. Our detection rate in a five-fold cross-validation setting benefits from the data augmentation strategy, with an observed +15% in overall weighted F1 (yellow). The gain is consistent across accuracy (blue), precision (green), and recall (red).

We have leveraged GPT-3 to augment our original training set with additional examples of user queries, providing AYO with extra variations of language and vocabulary. Our results were produced in the context of our collaboration with the Centre for Doctoral Training (CDT) at University College London (UCL) — MSc student Haoyang Zhai, industry mentor Connor McCabe, and industry supervisor Dr Fabon Dzogang.

Figure 2. “How long before my Premier is active?” Information snippets extracted from an example FAQ page on the website. Each block is matched against the user query using GPT-3’s semantic search; the best match is used to produce the final response.

Each block is matched against the user query and the resulting order is used to select a top answer; in this case GPT-3 reorders the paragraphs as 5, 1, 3, 4, 2, 6, giving as the top answer: ‘Premier Delivery is valid on the order you purchase it on and can take up to one hour to be activated against your account. You’ll receive an email once your Premier Delivery subscription is active.’ When GPT-3 is not confident in the match between the user query and the extracted blocks, AYO automatically reverts to serving the information summary prepared for that page.
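That fallback behaviour can be sketched as follows. Token overlap is used here as a stand-in scorer for GPT-3’s semantic search (which we cannot reproduce offline); the control flow being illustrated is the same: rank the page’s blocks, serve the best one, and fall back to the curated summary when confidence is low.

```python
def rank_snippets(query: str, snippets: list) -> list:
    """Order FAQ text blocks by relevance to the user's query.

    Token overlap is a stand-in for GPT-3's semantic search scores.
    """
    query_tokens = set(query.lower().split())
    scored = [
        (len(query_tokens & set(s.lower().split())), i, s)
        for i, s in enumerate(snippets)
    ]
    scored.sort(key=lambda t: (-t[0], t[1]))  # best score first, ties by position
    return [s for _, _, s in scored]


def answer(query: str, snippets: list, fallback_summary: str, min_overlap: int = 2) -> str:
    # Serve the best-matching block, or the curated page summary when
    # the match is weak -- mirroring AYO's fallback described above.
    best = rank_snippets(query, snippets)[0]
    overlap = len(set(query.lower().split()) & set(best.lower().split()))
    return best if overlap >= min_overlap else fallback_summary
```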


Conclusion

  • Tools like cookiecutter are extremely useful, providing a template from which to start building a bot quickly.
  • We can quickly train conversational models for the bot to use when engaging with a user, via Microsoft’s managed QnA Maker service.
  • We can fairly easily extend the bot to call other APIs and technologies and enrich the conversational flow.
  • GPT-3 can generate artificial data for us to train a model in the absence of real data, so that we can quickly begin development and start validating the bot’s responses.

We’re hiring! If you’re interested in joining the team that is improving the ASOS experience for our customers, check out our open roles here.

Ollie is a Senior Big Data Engineer at ASOS. He loves black Labradors, Rubik’s Cubes, marathon running and puzzle-solving in general.



A collective effort from ASOS's Tech Team, driven and directed by our writers. Learn about our engineering, our culture, and anything else that's on our mind.
