Gymnaskillazy Bot: A Fun Journey of AI, Notion and Telegram

Noy Pearl
8 min read · Jul 7, 2024


Demo — TL;DR

The actual bot

https://github.com/noypearl/Gymnaskillazy

Overview

There’s a tradition at my gym of logging each exercise during the lesson in a paper logbook, but I hate paper and I’m too messy to keep it tidy. So I spent a few nights together with ChatGPT (❤) and here’s the result: Gymnaskillazy, an intelligent bot that logs your exercises and lessons to Notion based on your inputs from Telegram. The bot uses Google Sheets to manage exercises and additional questions, and it leverages OpenAI’s GPT-4 model to generate snazzy lesson titles. The cherry on top? It’s hosted on AWS Lambda because, let’s face it, I’m lazy and it’s cost-efficient.

Also — here’s an illustration of me at the gym:

Gymnaskillazy icon (Generated using ChatGPT-4)

Usage

The whole purpose of the bot is to log text quickly and effectively during a workout at my gym, and to be able to see and monitor it in Notion.

Commands

  • /start — initiate a new logging session for a lesson
  • /end — finish the log and save it to Notion
  • /stop — abort logging the lesson
  • /add — add a custom exercise
  • /skip — skip logging the current exercise
  • /prev — go back to the previous exercise
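Under the hood, these commands boil down to a small dispatcher over per-user session state. Here’s a stdlib-only sketch of that idea (the function and session fields are illustrative, not the bot’s actual code, which registers handlers via python-telegram-bot):

```python
def handle_command(command, session):
    """Dispatch a slash command against a per-user session dict.

    Illustrative sketch only: the real bot registers CommandHandler
    objects from python-telegram-bot instead of matching strings.
    """
    if command == "/start":
        session["active"] = True
        session["index"] = 0  # start from the first exercise
        return "started"
    if command == "/skip":
        session["index"] += 1  # move past the current exercise
        return "skipped"
    if command == "/prev":
        session["index"] = max(0, session["index"] - 1)  # never go below 0
        return "went back"
    if command == "/stop":
        session["active"] = False  # abort without saving
        return "aborted"
    return "unknown command"
```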

How

I used ChatGPT to generate most of the code, the README.md, and even most of this article (hihi). I also used its API: I send the data that I logged for the lesson and generate a very specific title. Here’s the prompt that I use:

CHATGPT_PROMPT = "I have json of gym exercises that I logged. It holds " \
    "key for exercise type and value for the description. " \
    "Also it has additional questions. I want to analyze " \
    "it and create a short title of the exercise - put a colon " \
    "between exercises instead of words. use minimum words. " \
    "remove words such as 'seconds' and instead keep the number only. " \
    "It should be one line only. no newlines. Don't tell me the training " \
    "type. Don't tell me the coach. Don't tell me the goal. I want to " \
    "know about the big achievements. minimum words. no more than 10 words " \
    "Don't mention the word 'title'. here it is: "

Why am I doing it?

You might be wondering, why would I build this from scratch when there are existing tools out there? Well, using Zapier-like tools costs money, they’re limited, and frankly, developing a system from scratch is a lot more fun. Plus, I got pretty bored one weekend and decided to embark on this little adventure.

Illustration of my messy lifestyle whenever I try to write in my gym logbook

Features

  • Exercise Logging: Logs exercises and lessons with descriptions to Notion.
  • Google Sheets Integration: Fetches exercises and additional questions from Google Sheets.
  • AI-Generated Titles: Uses OpenAI GPT-4 to generate concise and meaningful lesson titles.
  • AWS Lambda Hosting: Runs on AWS Lambda, triggered by webhooks, and uses CloudWatch for scheduling and AWS API Gateway for access.

Components

  1. Telegram Bot: Handles user interactions, collects exercise details, and communicates with other components.
  2. Google Sheets Client: Fetches exercise data and additional questions from Google Sheets.
  3. Notion Client: Saves the collected data to a Notion database.
  4. OpenAI Client: Generates lesson titles based on the logged exercises.
  5. AWS Lambda: Hosts the bot and handles incoming webhook requests from Telegram.

Low-effort graph of the system’s components
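Tied together, a lesson flows through those components roughly like this. A simplified, stdlib-only sketch (the function names and the injected callables are illustrative, not the actual client classes):

```python
def log_lesson(exercises, answers, generate_title, save_to_notion):
    """End-to-end flow sketch: collect the logged entries, ask the
    title generator (GPT-4 in the real bot) for a short title, then
    persist everything to Notion via the injected saver."""
    lesson = {"exercises": exercises, "questions": answers}
    title = generate_title(lesson)
    save_to_notion(title, lesson)
    return title
```

Injecting `generate_title` and `save_to_notion` as plain callables also makes this flow trivial to test without hitting OpenAI or Notion.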

Now — let’s dive into the details 😎

Crazy dog on her way to muscle up

The Development Process

Setting Up the Telegram Bot

First things first, setting up the Telegram bot. Using the python-telegram-bot library made handling interactions a breeze.

from telegram.ext import Application, CommandHandler, MessageHandler, filters

async def start(update, context):
    user_id = update.message.from_user.id
    if user_id != TELEGRAM_USER_ID:
        await update.message.reply_text("Access denied. You are not authorized to use this bot.")
        return
    await update.message.reply_text('Welcome! Use /start to start logging a new lesson. 💪')

application = Application.builder().token(TELEGRAM_TOKEN).build()
application.add_handler(CommandHandler("start", start))

Integrating Google Sheets

Integrating Google Sheets was crucial for managing exercise data and additional questions.

Every month there are new exercises and I needed a quick, comfortable solution that wouldn’t require me to change the code every month — so I decided to use Google Sheets and created a separate worksheet for each month with the name of the month:

Exercises and additional questions in sheet
Separate worksheets

The gspread library was perfect for this task.

import gspread
from datetime import datetime
from oauth2client.service_account import ServiceAccountCredentials

class GoogleSheetsClient:
    def __init__(self, credentials_file, sheet_id):
        self.credentials_file = credentials_file
        self.sheet_id = sheet_id
        self.client = self.get_gcloud_connection()

    def get_gcloud_connection(self):
        scope = ["https://spreadsheets.google.com/feeds", "https://www.googleapis.com/auth/drive"]
        credentials = ServiceAccountCredentials.from_json_keyfile_name(self.credentials_file, scope)
        return gspread.authorize(credentials)

And I pull the lesson’s data according to the current month:

    def get_current_sheet(self):
        # Worksheet tabs are named after the full month name, e.g. "July"
        current_month = datetime.now().strftime('%B')
        return self.client.open_by_key(self.sheet_id).worksheet(current_month)
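The month-to-worksheet mapping is just `strftime('%B')`, which produces the full English month name. A tiny standalone example (`worksheet_name_for` is a hypothetical helper, not part of the bot):

```python
from datetime import date

def worksheet_name_for(day):
    # Worksheet tabs carry the full English month name, which is
    # exactly what strftime('%B') produces for a date
    return day.strftime('%B')
```

So a lesson logged on July 7th, 2024 lands in the "July" worksheet, and no code change is needed when a new month's tab is added.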

Saving Data to Notion

Up until now, I managed a Notion database with templates for each type of exercise:

Notion database & lesson page

And for every lesson, I manually added a new page to the Notion database and typed in all the necessary information.

I wanted to dynamically add to that database and keep the same structure, style and tags that I previously had.

To save data to Notion, I used Notion’s API. This involved creating pages and appending blocks of data.

import requests

class NotionClient:
    def __init__(self, notion_token, notion_database_id):
        self.notion_token = notion_token
        self.notion_database_id = notion_database_id
        self.headers = {
            'Authorization': f'Bearer {self.notion_token}',
            'Content-Type': 'application/json',
            'Notion-Version': '2022-06-28'
        }

    async def save_to_notion(self, user_id, lesson_title, sessions):
        # Payload and request to save data to Notion
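The elided payload follows the shape of Notion's create-page endpoint (`POST /v1/pages`): a parent pointing at the database plus a title property. Here's a minimal sketch of what such a payload could look like; note that the property name "Name" is an assumption about the database schema, and `build_notion_payload` is a hypothetical helper:

```python
def build_notion_payload(database_id, lesson_title):
    # Shape follows Notion's POST /v1/pages endpoint: a parent
    # database reference plus a title property ("Name" is assumed
    # to be the database's title column)
    return {
        "parent": {"database_id": database_id},
        "properties": {
            "Name": {
                "title": [{"text": {"content": lesson_title}}]
            }
        },
    }
```

The real `save_to_notion` would POST this dict to `https://api.notion.com/v1/pages` with the headers built in `__init__`, then append child blocks for the exercise descriptions.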

AI-Powered Title Generation

One of the coolest parts of the project was using OpenAI’s GPT-4 to generate concise titles for each logged session. I just passed the JSON of the whole lesson along with a long prompt that I made — and voila! Magic happened and I got the title that I wanted!

Here’s how the integration looked:

from openai import OpenAI

def generate_title(api_key, sessions):
    client = OpenAI(api_key=api_key)
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": CHATGPT_PROMPT},
            {"role": "user", "content": str(sessions)}
        ],
        max_tokens=30,
        temperature=0.7
    )
    return response.choices[0].message.content

And again — the long prompt I made:

CHATGPT_PROMPT = "I have json of gym exercises that I logged. It holds " \
    "key for exercise type and value for the description. " \
    "Also it has additional questions. I want to analyze " \
    "it and create a short title of the exercise - put a colon " \
    "between exercises instead of words. use minimum words. " \
    "remove words such as 'seconds' and instead keep the number only. " \
    "It should be one line only. no newlines. Don't tell me the training " \
    "type. Don't tell me the coach. Don't tell me the goal. I want to " \
    "know about the big achievements. minimum words. no more than 10 words " \
    "Don't mention the word 'title'. here it is: "

I had to test and tweak it repeatedly until I got the format I wanted.

Hosting on AWS Lambda

Hosting the bot on AWS Lambda made it accessible 24/7 without needing a dedicated server. Lambda’s event-driven architecture was perfect for handling webhook events from Telegram.

import asyncio
import json
import os
from telegram import Update
from telegram.ext import Application
from bot import TelegramBot
from dotenv import load_dotenv

load_dotenv()

def lambda_handler(event, context):
    TELEGRAM_TOKEN = os.getenv('TELEGRAM_TOKEN')
    application = Application.builder().token(TELEGRAM_TOKEN).build()
    update = Update.de_json(json.loads(event['body']), application.bot)

    # process_update is a coroutine in python-telegram-bot v20+, and the
    # application must be initialized before it can process updates
    async def _process():
        await application.initialize()
        await application.process_update(update)

    asyncio.run(_process())
    return {'statusCode': 200}

Cool buff dog for motivation

Challenges and Solutions

  1. Migrating from local development to AWS: I wasted too much time creating an endpoint and debugging the Telegram webhook, trying to understand why it didn’t work, but I learned a lot overall.
  2. Webhook vs. Polling: Switching from polling to webhooks was essential for efficiency. Setting up the webhook required configuring an API Gateway in AWS to trigger the Lambda function.
  3. Error Handling: Handling exceptions gracefully in asynchronous functions required careful structuring and use of logging. For the Lambda I eventually used print() (sue me, I’m lazy, and it was good enough to show up in the CloudWatch logs :) )
  4. Using OpenAI for Title Generation: Integrating OpenAI’s API was really fun. I don’t want to use it where regular code will do, so generating a title from a dynamic lesson log was a perfect fit. I had to make sure the prompt was crafted carefully to generate meaningful titles for the new Notion pages.
  5. Thinking out of the box: I ran a demo of the lesson in my head for scenarios that might happen: the coach adds a surprise exercise, skips an exercise, or I finish the lesson early. I had to design and code a solution for each of them, or else I’d fall back to manual exercise logging (no way, José).
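The webhook registration itself is ultimately one HTTPS call to Telegram's setWebhook method. A stdlib-only sketch that just builds the request URL (`build_set_webhook_url` is a hypothetical helper, and `gateway_url` stands in for the API Gateway endpoint):

```python
from urllib.parse import urlencode

def build_set_webhook_url(bot_token, gateway_url):
    # Telegram's Bot API registers a webhook via a plain HTTPS call:
    #   https://api.telegram.org/bot<token>/setWebhook?url=<endpoint>
    # After this, Telegram POSTs every update to the endpoint instead
    # of the bot having to poll getUpdates
    query = urlencode({"url": gateway_url})
    return f"https://api.telegram.org/bot{bot_token}/setWebhook?{query}"
```

Hitting that URL once (with curl or requests) is all it takes to flip the bot from polling to webhooks.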

Setup Instructions

Prerequisites

  • Python 3.12
  • Poetry for dependency management
  • AWS CLI configured with your AWS credentials
  • AWS Lambda and API Gateway setup

Environment Variables

Create a .env file in the root directory with the following keys:

TELEGRAM_TOKEN=<Your_Telegram_Bot_Token>
NOTION_TOKEN=<Your_Notion_Integration_Token>
NOTION_DATABASE_ID=<Your_Notion_Database_ID>
GOOGLE_SHEETS_CREDENTIALS_FILE=credentials.json
GOOGLE_SHEETS_ID=<Your_Google_Sheets_ID>
OPENAI_API_KEY=<Your_OpenAI_API_Key>
TELEGRAM_USER_ID=<Your_Telegram_User_ID>
NOTION_USER_ID=<Your_Notion_User_ID>

Google Sheets Setup

  1. Create a Google Sheet with tabs named after the months (e.g., July, August) and one tab named General.
  2. Add your exercises to the respective monthly tabs and questions to the General tab.
  3. Share the Google Sheet with the service account email from your credentials.json.

Notion Setup

  1. Create a database in Notion and note its ID.
  2. Configure an integration in Notion and add it to the database.

Installation

Clone the repository:

git clone https://github.com/noypearl/Gymnaskillazy.git
cd Gymnaskillazy

Install dependencies using Poetry:

poetry install

Now you can either run main.py locally if you want to trigger the bot yourself,

OR deploy the code to AWS and use lambda_function.py.

Deploy to AWS Lambda, add an API Gateway for the Telegram bot, and configure a webhook for the bot so that the Lambda will get & process every new message.

Conclusion

Developing Gymnaskillazy was an exciting journey filled with learning and problem-solving. From choosing Telegram for its ease of integration to leveraging OpenAI for smart title generation, the project combined multiple technologies into a cohesive, functional system. Using AWS Lambda for hosting meant the bot could run efficiently without dedicated servers, AND I was REALLY happy with the CloudWatch integration, which offers real-time logging in a very convenient way.

I hope you enjoyed reading this article and got some ideas for what you can build yourself / use ChatGPT for.

Let me know if you have any questions or feedback!
