How My Deep Frying Messenger Bot Works

Ethan Roberts · Published in The Startup · 4 min read · Nov 13, 2020

Diving deeper into the FBChat library and Deeppyer

Photo by Brett Jordan on Unsplash

I’m in a meme group-chat on Facebook Messenger, and sometimes a situation arises (naturally) where an image just needs to be deep fried. In this circumstance we have to use an online service, taking valuable minutes out of the day. It’s this exact type of minor inconvenience that bots are perfect for solving.

Wouldn’t it be great, I thought to myself, if saying the phrase “Can I request a deep fry” triggers a bot to automatically deep fry the last image sent to the chat? Yes, yes it would.

(If you don’t know what deep frying refers to, please see this source)

In this article we’ll be diving deeper into the code in this GitHub repo, with the goal of helping you understand how it all works.

Python is great for this kind of task, because usually 99% of the work is done for you. This is certainly the case here, as we’ll be mainly utilising two libraries.

  • FBChat: Used for interfacing between Python and Messenger
  • Deeppyer: Used for processing the image

It’s worth noting just before we get into everything that this code is pretty quick and dirty.

FBChat

Although not a particularly well-maintained library, FBChat is perfectly adequate for this purpose. It provides a “Client” class to which you simply pass your Facebook login information, and on successful initialisation of your client it logs a confirmation to the output.

I had some trouble with flaky logins, but managed to fix it by adding a try/except block around line 190 of the library's source.

try:
    revision = int(r.text.split('"client_revision":', 1)[1].split(",", 1)[0])
except IndexError:
    revision = 1

This was a quick and dirty fix, which meant the package had to be bundled into the Docker container manually. This is why it’s included in src.

Let’s go through how FBChat was used in practice. The gist below shows code relating to FBChat.

FBChat relevant code from main.py

In line 1, we create a new class that inherits from the aforementioned Client class. This allows us to write an onMessage function, which is automatically triggered every time the bot receives a message. All of this is handled for us simply because we inherit from Client.

Line 4 is where we specify that messages received should only be processed if their thread ID (i.e. which chat the message is from) matches the correct ID, t_id. This ID is found using the other script in the GitHub repo.

Line 7 uses the fetchThreadImages() function to get the images most recently sent to the chat with the given thread ID. It returns a generator, so the first (most recent) image is taken using itertools.islice() inside a for-loop.

That image is just an object with some properties. It doesn't contain the image data itself, but a URL from which to download it. In lines 9 and 10 we issue an HTTP GET request to that URL.

On line 12, we use Client's inbuilt function sendLocalImage() to send the now-deep-fried image back to the chat. The first argument is simply the file path to the image.

After all of that functionality is defined, on lines 14 and 15 we simply create an instance of DeepFryBot and get it to start listening for messages. This will keep the process running forever — theoretically.
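Since the gist itself isn't reproduced here, the flow described above can be sketched end-to-end. This is a hypothetical, stand-alone reconstruction: StubClient stands in for fbchat's Client (so nothing here touches Facebook), and the thread ID and image object are placeholders. Only the names onMessage, fetchThreadImages and sendLocalImage mirror the real API.

```python
import itertools

class StubClient:
    """Stand-in for fbchat's Client so the flow can run without Facebook.
    Only the method names mirror the real library; the bodies are fakes."""

    def __init__(self, images_by_thread):
        self.images_by_thread = images_by_thread
        self.sent = []  # records what would be posted back to Messenger

    def fetchThreadImages(self, thread_id):
        # the real method returns a generator of image objects
        return iter(self.images_by_thread.get(thread_id, []))

    def sendLocalImage(self, path, thread_id=None):
        self.sent.append((path, thread_id))

class DeepFryBot(StubClient):
    TRIGGER = "can i request a deep fry"
    T_ID = "000000000000000"  # placeholder group-chat thread ID

    def onMessage(self, message, thread_id):
        # only react to the trigger phrase, and only in the right chat
        if thread_id != self.T_ID or self.TRIGGER not in message.lower():
            return
        # islice takes just the first image yielded by the generator
        for image in itertools.islice(self.fetchThreadImages(thread_id), 1):
            # the real bot downloads the image's URL, deep fries it, and
            # saves it locally as fried.jpg before sending it back
            self.sendLocalImage("fried.jpg", thread_id=thread_id)

bot = DeepFryBot({"000000000000000": [{"url": "https://example.com/pic.jpg"}]})
bot.onMessage("Can I request a deep fry", thread_id="000000000000000")
print(bot.sent)  # [('fried.jpg', '000000000000000')]
```

Swapping StubClient for fbchat's Client (and adding the download and frying steps) gives the shape of the real bot, with listen() called on the instance to keep it running.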

Deeppyer

Image by Ovyerus: https://github.com/Ovyerus/deeppyer

While not exactly a well known library, this one definitely helped me get this project out the door quickly.

This library uses Pillow and OpenCV to deep fry an image. You simply pass the image (as a Pillow Image) to Deeppyer and it returns the fried result.

The gist below shows the Deeppyer-relevant code from main.py.

Line 1 uses Pillow’s Image class to convert the body of the HTTP request (from earlier) into a Pillow Image.

Line 2 passes that to the Deeppyer (dp) deepfry function, which is then saved to a local file ‘fried.jpg’ on line 4.
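With the gist missing here, those three steps can be sketched stand-alone. In this hypothetical sketch, fake_image_open and fake_deepfry are stand-ins so the flow runs without Pillow or Deeppyer installed; the real code calls PIL.Image.open(BytesIO(r.content)), then dp.deepfry(), then saves to 'fried.jpg'.

```python
from io import BytesIO

def fake_image_open(fp):
    # stand-in for PIL.Image.open: the real call parses bytes into an Image
    return fp.read()

def fake_deepfry(img):
    # stand-in for Deeppyer's deepfry (made synchronous in src/);
    # the real function distorts the image before returning it
    return img

body = b"jpeg bytes from the earlier HTTP GET"  # placeholder response body
img = fake_image_open(BytesIO(body))   # gist line 1: bytes -> Image
fried = fake_deepfry(img)              # gist line 2: deep fry
with open("fried.jpg", "wb") as f:     # gist line 4: save locally
    f.write(fried)
```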

The original function in the Deeppyer library is async, which blocked me from using it directly, as I could not make the onMessage() function async. I simply removed the async keyword from the function definition in the source.

This is why the library is bundled into the src folder: I couldn't use the default pip-installed version.

The rest of the code is fairly self explanatory, but if you do have questions please feel free to leave a comment or message me and I’ll get back to you as soon as I can.

Check out my other project, where I used Google Cloud and Machine Learning to run a Twitter streaming Sentiment Analyser: https://towardsdatascience.com/tracking-dysons-public-image-using-46-166-tweets-68af509923a6
