Telegram bot with ChatGPT under the hood: adding OpenAI brains to the bot.

Alexandr Dzhumurat
4 min read · Jan 2, 2024

Introducing EventAlly: an AI assistant that helps you plan your day while exploring a new city (link at the end of the article).

First, have a look at the bot template on GitHub, aleksandr-dzhumurat/tg_ai_bot_template, for creating your own conversational bot.

In the previous Medium article, I described the process of creating a scheduling web application with React JS. In this article, I will explain how (and why) I moved that functionality to Telegram and created a genuine conversational bot with the help of OpenAI.

Problem definition

First and foremost, why move a web application into a conversational format? The answer is simple: flexibility. The more information we want to gather from the user, the more complex the interface becomes. This leads to a more intricate user experience and a slowdown in development. How can we address this situation?

We are developing an application for recommending places from Google Places. The general workflow of our service is as follows: we gather user preferences, send a request with those preferences to the recommendation source via its API, and then present the user with the most suitable recommendations. However, to build the API request, we need the user's preferences in a structured format (JSON).
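For illustration, the structured preferences might look like the dictionary below. The field names follow the prompt used later in this article; your own schema may differ.

```python
# Illustrative shape of the structured user preferences the API request needs
user_preferences = {
    "city": "Limassol",
    "country": "Cyprus",
    "craft_option": False,  # whether the user prefers craft beer
}
```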

Problem solution

One way to address the issue is by utilizing a large language model (LLM).

  • Create a prompt for the LLM, describing the bot’s task. I suggest using OpenAI as the LLM, but alternative options like Google’s Gemini can also be considered.
  • The bot’s task is to engage in a dialogue to discern user preferences and generate a structured JSON for the API request.
  • Post-processing the API query results with LLM is also possible, for instance, to explain the value of the recommendation to the user.

Below is an example prompt:

    SystemMessagePromptTemplate.from_template(
    """
    You are a conversational chatbot.
    You must be direct, efficient, impersonal, and professional in all of your communications.
    You should ask the human the questions from the "questions" section, step by step.
    You can change the order of the questions if you want.
    Use only information from the human's messages; do not imagine answers.

    questions:
    - ask for the name of the country the user lives in. If you don't know this country, gently ask again. It is the country field in the result JSON
    - ask for the name of the city the user lives in. It is the city field in the result JSON
    - ask whether the human prefers craft beer (yes or no). It is the craft_option field in the result JSON

    Only ask the questions listed above, but in other words, not literally.

    When you have enough information from the chat history, prepare the final response.
    The final response should consist only of a document in JSON format with the following fields: [city, country, craft_option]
    Do not add any extra words to the JSON. Do not come up with a prepared plan.

    For example: {"city": "Limassol", "country": "Cyprus", "craft_option": false}
    """
    ),

Creating a Telegram bot that supports this dialogue takes only a few steps.

Here is an example of the conversation flow; `chat` is a LangChain `ConversationChain` object:

    def dialog_router(human_input: str, user: dict):
        llm_answer = chat.predict(input=human_input)
        json_response = get_json(llm_answer)
        if json_response is not None:
            location_option = "%s, %s" % (json_response['country'], json_response['city'])
            lat_lng = place_recommender.find_place(location_option)
            recommended_place = place_recommender.get_recs(lat_lng)
            html_answer = prepare_html(recommended_place, user)
            return {'final_answer': True, 'answer': html_answer}
        else:
            return {'final_answer': False, 'answer': llm_answer}

Key Considerations:

  • We create the bot using LangChain. To ensure the bot “remembers” the user’s previous actions, use the ConversationBufferMemory class.
  • Create a prompt for a proper dialogue, where the final step involves obtaining the user’s structured preferences in JSON format.
  • Validate messages from the LLM: it is crucial to extract the JSON from the LLM response using regular expressions (inside the get_json function).
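A minimal version of that extraction step might look like this (the actual get_json in the bot may apply stricter checks on the extracted fields):

```python
import json
import re

def get_json(llm_answer: str):
    """Extract the first JSON object from an LLM reply, or return None."""
    match = re.search(r"\{.*\}", llm_answer, re.DOTALL)
    if match is None:
        return None  # the LLM is still asking questions, no JSON yet
    try:
        return json.loads(match.group(0))
    except json.JSONDecodeError:
        return None  # looked like JSON but did not parse
```

For an answer such as `Here you go: {"city": "Limassol", "country": "Cyprus", "craft_option": false}` this returns a dict, while a plain follow-up question returns None, which is exactly what dialog_router branches on.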

Once the bot successfully concludes the dialogue (meaning it has extracted all the necessary information from the user's responses), it is time to construct a JSON request to the Google API and present the results to the user, for example as HTML.
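As a rough sketch, find_place and get_recs can be backed by Google's Geocoding and Places Nearby Search endpoints. The URLs below are the real Google endpoints, but the helper names, keyword, and radius are illustrative, not the article's exact values:

```python
import json
import urllib.parse
import urllib.request

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"
NEARBY_URL = "https://maps.googleapis.com/maps/api/place/nearbysearch/json"

def build_geocode_url(location_option: str, api_key: str) -> str:
    # location_option is the "Country, City" string assembled from the bot's JSON
    return GEOCODE_URL + "?" + urllib.parse.urlencode(
        {"address": location_option, "key": api_key}
    )

def build_nearby_url(lat: float, lng: float, api_key: str,
                     keyword: str = "craft beer", radius: int = 1500) -> str:
    # radius is in meters; keyword can depend on craft_option from the dialogue
    return NEARBY_URL + "?" + urllib.parse.urlencode(
        {"location": f"{lat},{lng}", "radius": radius,
         "keyword": keyword, "key": api_key}
    )

def fetch(url: str) -> dict:
    # Performing the real request requires a valid Google API key
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```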

Example Dialogue under the hood:

Human: Hey
AI: Hello! How can I assist you today?
Human: Plan my day
AI: Of course! I'd be happy to help you plan your day. Could you please provide me with some information?
Human: Yes
AI: Great! Let's start with the basics. Could you please tell me the name of the country you live in?
Human: Cyprus
AI: Thank you for providing the name of the country you live in. Now, could you please tell me the name of the city you live in?
Human: Limassol
AI: Thank you for providing the name of the city you live in. Now, do you have any children? Please answer with either "yes" or "no".
Human: No
AI: Thank you for letting me know. Lastly, would you like to start your day from the morning? Please answer with either "yes" or "no".
Human: No

You can engage in a conversation with the bot on Telegram: EventAllyBot
