Published in Geek Culture

Discord bot using DialoGPT and the Hugging Face API.

Design your own Rick bot using the Hugging Face API and deploy it on a Replit server.

Photo by Michael Marais on Unsplash

Introduction

If you have ever wondered how a Discord bot works and how you can create your own bot that speaks like a certain celebrity or a character from your favorite cartoon show, you are at the best place to start. We will learn how to use the Hugging Face API and turn a model into a Discord bot. We will also learn about Replit, the Kaggle CLI, and UptimeRobot to keep the bot running. We won't go deep into fine-tuning our DialoGPT model; for that, you can check out this blog.

Dataset

We will use the Kaggle CLI to download the Rick&Morty Scripts | Kaggle dataset, which is publicly available. If you want to learn how to get your Kaggle API key and how to use the Kaggle CLI, please check out this link.

!kaggle datasets download andradaolteanu/rickmorty-scripts -f RickAndMortyScripts.csv

As you can see, we have the columns: index, season no., episode no., episode name, name of the character, and line. We will focus on the name and line columns.
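To make the layout concrete, here is a toy frame with the same name and line columns (the rows below are made-up stand-ins, not actual script lines):

```python
import pandas as pd

# Toy stand-in for RickAndMortyScripts.csv with the two columns we care about.
data = pd.DataFrame({
    'name': ['Rick', 'Morty', 'Rick', 'Beth'],
    'line': ['Wubba lubba dub dub!', 'Aw geez.', 'Morty, come here.', 'Dad!'],
})

# The transform in the next step only keeps rows spoken by our chosen character.
rick_rows = data[data.name == 'Rick']
print(len(rick_rows))  # → 2
```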

We will transform this dataset so that every response row contains the n previous responses as context.

CHARACTER_NAME = 'Rick'
contexted = []

# context window of size 7
n = 7

for i in data[data.name == CHARACTER_NAME].index:
    if i < n:
        continue
    row = []
    prev = i - 1 - n  # we subtract an extra 1, so the row contains the current response and the 7 previous responses
    for j in range(i, prev, -1):
        row.append(data.line[j])
    contexted.append(row)

columns = ['response', 'context']
columns = columns + ['context/' + str(i) for i in range(n - 1)]

df = pd.DataFrame.from_records(contexted, columns=columns)

As you can see below, we have a response column and 6 context columns. This will help us fine-tune our generative model. After this, we will download the model from Hugging Face and start training it. You can check the full code here.
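For intuition, fine-tuning flattens each row into one training string: the turns are joined oldest-first, each terminated by the model's end-of-text token. A minimal sketch (the row values are hypothetical; '<|endoftext|>' is the GPT-2/DialoGPT EOS token):

```python
# A row is [response, context, context/0, ...]: newest turn first.
EOS = '<|endoftext|>'  # GPT-2 / DialoGPT end-of-text token

def flatten_row(row):
    # Reverse so the oldest turn comes first, then terminate every turn with EOS.
    return EOS.join(reversed(row)) + EOS

row = ['I turned myself into a pickle!', 'What did you do?', 'Morty, come here.']
print(flatten_row(row))
# → Morty, come here.<|endoftext|>What did you do?<|endoftext|>I turned myself into a pickle!<|endoftext|>
```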

Deploy to Hugging Face

After fine-tuning your model, you can simply deploy it to the Hugging Face cloud, but before that, you need to create a HF (Hugging Face) account and create the model as shown below.

You will need the HF API key to upload your model; you can get it from settings.

  • Install git-lfs: a git extension for uploading larger files.
  • Configure the git email and user name.
  • Use your HF API key and model name to upload both the tokenizer and the model.
!sudo apt-get install git-lfs

Reading package lists... Done
Building dependency tree
Reading state information... Done
git-lfs is already the newest version (2.3.4-1).
The following package was automatically installed and is no longer required:
  libnvidia-common-460
Use 'sudo apt autoremove' to remove it.
0 upgraded, 0 newly installed, 0 to remove and 40 not upgraded.
!git config --global user.email "abidaliawan@rocketmail.com"
# Tip: using the same email as your huggingface.co account will link your commits to your profile
!git config --global user.name "kingabzpro"

MY_MODEL_NAME = 'DialoGPT-small-Rick-Bot'
with open('/content/HuggingFace_API.txt', 'rt') as f:
    HUGGINGFACE_API_KEY = f.read().strip()

model.push_to_hub(MY_MODEL_NAME, use_auth_token=HUGGINGFACE_API_KEY)
tokenizer.push_to_hub(MY_MODEL_NAME, use_auth_token=HUGGINGFACE_API_KEY)

You can add tags such as conversational, gpt2, etc. by editing your model card. You can check out my model here.

Image by Author

Testing your model

After deploying, let's test it in a Python script using transformers.

  • Download both the tokenizer and the model using transformers.
  • Run a loop for 4 inputs.
  • The user input is used to generate text.
tokenizer = AutoTokenizer.from_pretrained('kingabzpro/DialoGPT-small-Rick-Bot')
model = AutoModelWithLMHead.from_pretrained('kingabzpro/DialoGPT-small-Rick-Bot')

# Let's chat for 4 lines
for step in range(4):
    # encode the new user input, add the eos_token and return a tensor in PyTorch
    new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')

    # append the new user input tokens to the chat history
    bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if step > 0 else new_user_input_ids

    # generate a response while limiting the total chat history to 200 tokens
    chat_history_ids = model.generate(
        bot_input_ids, max_length=200,
        pad_token_id=tokenizer.eos_token_id,
        no_repeat_ngram_size=3,
        do_sample=True,
        top_k=100,
        top_p=0.7,
        temperature=0.8
    )

    # pretty print the last output tokens from the bot
    print("RickBot: {}".format(tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)))

As you can see, our model is working perfectly. For every user input, there is a response from Rick.

Discord bot

To create the Discord bot, first you need to go to the Discord Developer Portal. Click on New Application to get started.

Write the application name.

Now click on the Bot tab and then Add Bot. It will prompt you with a new window asking you to write the bot name and add an image for the bot.

Click on the OAuth2 tab and check the bot scope and the bot permission Send Messages. Then copy the link and paste it into a new tab.

After pasting the link in the new tab, it will ask you to select the server, and that’s it.
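For reference, the invite link the OAuth2 tab generates has a predictable shape; a sketch (the client_id below is a hypothetical placeholder, and 2048 is the integer value of the Send Messages permission):

```python
# Build a Discord bot invite URL by hand (normally the portal does this for you).
CLIENT_ID = '123456789012345678'  # hypothetical application ID
PERMISSIONS = 2048                # "Send Messages" permission bit

invite = (
    'https://discord.com/api/oauth2/authorize'
    f'?client_id={CLIENT_ID}&permissions={PERMISSIONS}&scope=bot'
)
print(invite)
```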

Replit

You can create your account at https://replit.com/. We will use this platform as the bot server for our Discord bot. After creating an account, just create a new Python repl or import my GitHub repo. This platform is beginner-friendly, and it won't take more than 5 minutes to understand how to use the editor.

Testing API

Before that, we need to use the Replit Secrets feature to add our HF API key and Discord API key, as shown below. Your Discord API key is in the Developer Portal, under the Bot tab.
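Replit exposes Secrets to your repl as environment variables, so both keys can be read with os.environ; a minimal sketch (the secret names match the ones bot.py reads later, and the values here are demo placeholders):

```python
import os

# In a real repl these are set in the Secrets tab; placeholders for this demo.
os.environ.setdefault('HUGGINGFACE_TOKEN', 'api_XXXX')
os.environ.setdefault('DISCORD_TOKEN', 'XXXX')

hf_token = os.environ['HUGGINGFACE_TOKEN']
discord_token = os.environ['DISCORD_TOKEN']
print(len(hf_token) > 0 and len(discord_token) > 0)  # → True
```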

  • Create a test Python file to test the HF API.
  • Create a function that uses requests to send and receive information using the model id and the HF API token.
  • Print the generated response.

The function is working perfectly; now we just need to add it to bot.py.

import requests

def query(payload, model_id, api_token):
    headers = {"Authorization": f"Bearer {api_token}"}
    API_URL = f"https://api-inference.huggingface.co/models/{model_id}"
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

model_id = "kingabzpro/DialoGPT-small-Rick-Bot"
api_token = "api_XXXXXXXXXXXXXXXXXXXXXX"  # get yours at hf.co/settings/token
data = query('What you think about mom', model_id, api_token)
print(data['generated_text'])

Uptime for Replit

We will use https://uptimerobot.com to keep our bot running forever. As you can see below, we can click Add Monitor and add the URL that we will create using Flask. It will ping our bot app every 5 minutes, which tricks the Replit server into running forever.

Flask WebApp

The code below is a simple Flask web app which displays "I am Alive" while running. The web app will generate a link in Replit, something like this: https://DailoGPT-RickBot.kingabzpro.repl.co. You will paste this link into UptimeRobot, which will trigger this app every 5 minutes.

from flask import Flask
from threading import Thread

app = Flask('')

@app.route('/')
def home():
    return "I am Alive"

def run():
    app.run(host='0.0.0.0', port=8080)

def keep_alive():
    t = Thread(target=run)
    t.start()

Bot Code

This code contains the backbone of our Discord bot and the integration of the HF API and the Flask web app.

  • We use the keep_alive function from the web_app file that we just created.
  • We use the discord Python client.
  • The query function sends and receives a response from the HF API, protected by the API key.
  • on_ready warms up the HF API by sending a "Hello" message. It takes almost 20 seconds to warm up the server.
  • on_message checks whether the message comes from the bot or a user, then sends the user's message to the HF API and displays the response in the Discord chat. Extra error checks for a non-responding server are added so we can debug easily.

To learn more, you can look at the code below and read the comments.

import os
# these modules are for querying the Hugging Face model
import json
import requests
from web_app import keep_alive
# the Discord Python API
import discord

# this is my Hugging Face profile link
API_URL = 'https://api-inference.huggingface.co/models/'

class MyClient(discord.Client):
    def __init__(self, model_name):
        super().__init__()
        self.api_endpoint = API_URL + model_name
        # retrieve the secret API token from the system environment
        huggingface_token = os.environ['HUGGINGFACE_TOKEN']
        # format the header in our request to Hugging Face
        self.request_headers = {
            'Authorization': 'Bearer {}'.format(huggingface_token)
        }

    def query(self, payload):
        """
        make a request to the Hugging Face model API
        """
        data = json.dumps(payload)
        response = requests.post(
            self.api_endpoint,
            headers=self.request_headers,
            data=data)
        ret = json.loads(response.content.decode('utf-8'))
        return ret

    async def on_ready(self):
        # print out information when the bot wakes up
        print('Logged in as')
        print(self.user.name)
        print(self.user.id)
        print('------')
        # send a request to the model without caring about the response
        # just so that the model wakes up and starts loading
        self.query({'inputs': {'text': 'Hello!'}})

    async def on_message(self, message):
        """
        this function is called whenever the bot sees a message in a channel
        """
        # ignore the message if it comes from the bot itself
        if message.author.id == self.user.id:
            return
        # form query payload with the content of the message
        payload = message.content
        # while the bot is waiting on a response from the model
        # set its status as typing for user-friendliness
        async with message.channel.typing():
            response = self.query(payload)
        # use .get so a missing key doesn't raise when the model errors out
        bot_response = response.get('generated_text')

        # we may get an ill-formed response if the model hasn't fully loaded
        # or has timed out
        if not bot_response:
            if 'error' in response:
                bot_response = '`Error: {}`'.format(response['error'])
            else:
                bot_response = 'Hmm... something is not right.'
        # send the model's response to the Discord channel
        await message.channel.send(bot_response)

def main():
    # DialoGPT
    client = MyClient('kingabzpro/DialoGPT-small-Rick-Bot')
    keep_alive()
    client.run(os.environ['DISCORD_TOKEN'])

if __name__ == '__main__':
    main()

As you can see, our bot server is running perfectly.

Image by Author

Now let's check our Discord server. As you can see, RickBot is online.

Image by Author

Have fun playing around with the bot you trained and deployed.

Image by Author

You can also just import my repository on Replit and add your HF API and Discord API keys to get started within a few minutes.

Image by Author

Code

You can find the code on GitHub and DAGsHub. You can also check my Hugging Face model here.

Image by Author

Conclusion

My experience of learning from video tutorials and training my own bot was amazing. It got even better when I started playing around with the discord library and server. In this guide, we have learned about the Hugging Face API, Transformers, Discord servers, UptimeRobot, the Kaggle API, and the Replit platform. I can't wait to see which character you choose next; share your experience with me.

Machine learning is simple; don't make it hard. Always keep looking out for hacks and tips to improve your workflow.


About Author

Abid Ali Awan (@1abidaliawan) is a certified data scientist professional who loves building machine learning models. Currently, he is focusing on content creation and writing technical blogs on machine learning and data science technologies. Abid holds a Master’s degree in Technology Management and a bachelor’s degree in Telecommunication Engineering. His vision is to build an AI product using a graph neural network for students struggling with mental illness.

Originally published at https://www.analyticsvidhya.com on November 1, 2021.
