How to build a personalized fitness chatbot powered by Gemini on Google Cloud Platform

Yamini Akula
Google Cloud - Community
6 min read · Aug 14, 2024

Overview

Problem Statement

Many individuals struggle to maintain proper form and maximize the effectiveness of their workout routines, often due to a lack of personalized guidance. This can lead to inefficient exercises and an increased risk of injuries, hindering overall fitness progress.

Objective

This blog guides you through building a fitness chatbot (FitGen AI) using Gemini 1.5 Pro on the Google Cloud Platform (GCP). The chatbot analyzes uploaded workout images and provides real-time, personalized feedback on posture and exercise technique, helping users improve their workout efficiency and safety.

Impact

The fitness chatbot gives users tailored, real-time feedback, improving workout effectiveness and helping prevent injuries. By leveraging Gemini 1.5 Pro, the solution offers immediate, actionable improvements to fitness routines. Because every new analysis is stored and fed back to the model as context, the feedback becomes increasingly personalized as the user's workout history grows.

Design

Cloud Function Trigger

When an image is uploaded, a Cloud Function triggered by Cloud Storage calls Gemini to analyze it and generate recommendations. This design enables real-time processing of images, so users receive immediate feedback, and automating the step minimizes latency. Cloud Functions were chosen because they scale automatically and handle many triggers concurrently, keeping the system efficient and responsive.

Data Accumulation

Recommendations accumulate in a Firestore collection named feedback_collection, with fields such as file_name, recommendations, and timestamp. Firestore was chosen for its scalability and flexibility with unstructured data, allowing efficient querying and retrieval based on user input. It also supports real-time updates and synchronization, which keeps the recommendations served to users up to date.
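For reference, a single document in feedback_collection ends up with roughly the shape sketched below; the values are illustrative, not real output.

# Illustrative shape of a single document in feedback_collection
example_doc = {
    'file_name': 'squat_front_view.jpg',   # object name from the Cloud Storage event
    'recommendations': [                   # text chunks streamed back from Gemini
        'Keep your chest up and your spine neutral...',
        'Drive your knees out in line with your toes...',
    ],
    'timestamp': '2024-08-14T10:15:00Z',   # set server-side via firestore.SERVER_TIMESTAMP
}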

Model Training

The recommendations field is read from Firestore into a list and passed to Gemini 1.5 Pro as context, so the model grounds its personalized feedback in the user's accumulated history. Because each new image analysis adds to this context, the feedback adapts to changing user needs and preferences and becomes more accurate and relevant over time.

Deployment

The application is packaged in a Docker container, edited through the App Engine editor, and served as a Streamlit app on Cloud Run. This setup was chosen for its scalability, ease of maintenance, and ability to handle diverse workloads: the container image provides a reproducible build, while Cloud Run offers a secure, autoscaling environment that keeps the application highly available and easy to manage.

Step-by-step instructions

Cloud Function Trigger: In the GCP console, navigate to Cloud Functions and click Create function. Use the options below to create a function with a Cloud Storage trigger, and under Bucket choose your Cloud Storage bucket. Once you have added the code below, deploy the function and check the logs for errors.

import logging
import vertexai
from google.cloud import storage, firestore
from vertexai.generative_models import GenerativeModel, Part
import vertexai.preview.generative_models as generative_models

# Configuration
PROJECT_ID = "project-id"
LOCATION = "us-central1"
MODEL_ID = "gemini-1.5-pro-001"  # Update with your Gemini model ID
MODEL = None

# Initialize Vertex AI for Gemini model
def init_model():
    global MODEL
    logging.info("Initializing Vertex AI model…")
    vertexai.init(project=PROJECT_ID, location=LOCATION)
    MODEL = GenerativeModel(MODEL_ID)
    logging.info("Model initialized successfully.")

# Function to generate recommendations using Gemini model
def generate_recommendations(image_uri):
    logging.info("Generating recommendations using Gemini model…")

    if MODEL is None:
        init_model()

    # Prepare the image and text input for Gemini model
    image = Part.from_uri(mime_type="image/jpeg", uri=image_uri)
    text = """for this image analyse the posture and give recommendations to correct it and provide feedback and improvements"""

    generation_config = {
        "max_output_tokens": 8192,
        "temperature": 1,
        "top_p": 0.95,
    }

    safety_settings = {
        generative_models.HarmCategory.HARM_CATEGORY_HATE_SPEECH: generative_models.HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
        generative_models.HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT: generative_models.HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
        generative_models.HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT: generative_models.HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
        generative_models.HarmCategory.HARM_CATEGORY_HARASSMENT: generative_models.HarmBlockThreshold.BLOCK_MEDIUM_AND_ABOVE,
    }

    try:
        # Generate content using Gemini model (streamed, then collected into a list)
        responses = MODEL.generate_content(
            [image, text],
            generation_config=generation_config,
            safety_settings=safety_settings,
            stream=True,
        )
        recommendations = [response.text for response in responses]
        logging.info("Recommendations generated successfully.")
    except Exception as e:
        logging.error(f"Error generating recommendations: {e}")
        recommendations = []

    return recommendations

# Function to store feedback in Firestore
def store_feedback(file_name, recommendations):
    logging.info(f"Storing recommendations for {file_name} in Firestore")

    try:
        db = firestore.Client()
        # Save the recommendations in Firestore
        doc_ref = db.collection('feedback_collection').document(file_name)
        doc_ref.set({
            'file_name': file_name,
            'recommendations': recommendations,
            'timestamp': firestore.SERVER_TIMESTAMP
        })
        logging.info(f"Stored recommendations for {file_name} in Firestore successfully.")
    except Exception as e:
        logging.error(f"Error storing feedback: {e}")

# Cloud Function triggered by new file upload to Cloud Storage
def trigger_analysis(event, context):
    """Cloud Function triggered by Cloud Storage events."""
    logging.info(f"Function triggered by event: {event}")

    try:
        file = event
        bucket_name = file['bucket']
        file_name = file['name']
        logging.info(f"Processing image file: {file_name}")

        # Generate recommendations using Gemini
        image_uri = f"gs://{bucket_name}/{file_name}"
        recommendations = generate_recommendations(image_uri)

        # Store the recommendations in Firestore
        store_feedback(file_name, recommendations)

        logging.info("Processing completed.")
    except Exception as e:
        logging.error(f"Error in trigger_analysis function: {e}")

Data Accumulation: Once you have finished setting up the trigger function, upload an image to Cloud Storage and check the Firestore database for recommendations. Make sure you upload the image to the same bucket you chose in the previous step (Cloud Function Trigger). Upon a successful trigger, feedback_collection should contain a document with the file_name, recommendations, and timestamp fields.
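If you would rather verify from a terminal than in the console, a minimal sketch like the one below (assuming application-default credentials with Firestore access) prints what the trigger stored:

from google.cloud import firestore

db = firestore.Client()
for doc in db.collection('feedback_collection').stream():
    data = doc.to_dict()
    print(doc.id, data.get('timestamp'))
    for chunk in data.get('recommendations', []):
        print('  -', chunk[:80])  # first 80 characters of each streamed chunk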

Model Training and Deployment: In the console, open Google App Engine and click Open Editor. Use the code below (main.py), together with requirements.txt and a Dockerfile, to pull the recommendations field from the Firestore data and feed it into Gemini 1.5 Pro for personalized recommendations.

import streamlit as st
from google.cloud import firestore
import vertexai
from vertexai.generative_models import GenerativeModel, GenerationConfig, HarmCategory, HarmBlockThreshold

# Your GCP project and model configurations
PROJECT_ID = "project-id"
LOCATION = "us-central1"
MODEL_ID = "gemini-1.5-pro-001"

# Initialize Firestore client
client = firestore.Client()

# Initialize Vertex AI model
vertexai.init(project=PROJECT_ID, location=LOCATION)
model = GenerativeModel(MODEL_ID)

# Streamlit app
def main():
    st.title("FitGen AI")

    # Fetch the stored recommendations from Firestore
    collection_ref = client.collection('feedback_collection')  # same collection the Cloud Function writes to
    documents = collection_ref.stream()
    recommendations_list = []
    for doc in documents:
        data = doc.to_dict()
        recommendations = data.get('recommendations', [])
        recommendations_list.extend(recommendations)

    # User input
    user_input = st.text_input("Meet your personalised fitness trainer powered by GEMINI")
    if st.button("Ask about how have you been working out!"):
        prompt = f"""
        User input: {user_input}

        Here are some posture tips from various analyses:
        {recommendations_list}

        Please analyze these tips and provide a comprehensive and personalized set of recommendations to improve posture based on the above suggestions.
        Answer:
        """

        # Generate content using Gemini
        response = model.generate_content(
            [prompt],
            generation_config=GenerationConfig(
                temperature=0.9,
                top_p=1.0,
                max_output_tokens=8192,
            ),
            safety_settings={
                HarmCategory.HARM_CATEGORY_HARASSMENT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
                HarmCategory.HARM_CATEGORY_HATE_SPEECH: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
                HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
                HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
            },
        )

        # Display response
        st.write(f"Response:\n{response.text}")

if __name__ == "__main__":
    main()
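The requirements.txt and Dockerfile are referenced above but not shown. A minimal sketch that works with this main.py could look like the following; the package list, the Python base image, and the port are assumptions, so pin whatever versions you have tested.

# requirements.txt
streamlit
google-cloud-firestore
google-cloud-aiplatform

# Dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 3000
CMD ["streamlit", "run", "main.py", "--server.port=3000", "--server.address=0.0.0.0"]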

Once completed, run the command below in the terminal to deploy the Streamlit app to Cloud Run. Make sure you update the Dockerfile with the correct Python version, confirm the service account has the necessary API access (Vertex AI and Firestore), and check the Cloud Run logs for errors.

gcloud run deploy --port=3000 --allow-unauthenticated --platform=managed --region=us-central1 --source=. --service-account=YOUR_SERVICE_ACCOUNT
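Before deploying, you can optionally sanity-check the app locally; this assumes you have the requirements installed and application-default credentials configured.

gcloud auth application-default login
pip install -r requirements.txt
streamlit run main.py --server.port=3000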

Result

Once you have deployed the app successfully, you can enter a prompt in the Streamlit interface and receive personalized recommendations drawn from your historical data, powered by Gemini 1.5 Pro.

Links

I would like to thank my mentor Mahesh(boggavarapumohanmahesh@gmail.com) for guidance on this project.

To learn more about Google Cloud services and to create impact with the work you do, take these steps right away:

· Register for Code Vipassana sessions

· Join the meetup group Datapreneur Social

· Sign up to become Google Cloud Innovator
