Using LLMs to Help Families Caring for Alzheimer’s Patients
A few years ago, my grandmother Alice was diagnosed with Alzheimer's. One of the things we always do at home to keep her memory fresh is show her old pictures, play songs she used to listen to, and tell her stories that remind her of her wedding day or the park where she used to take her walks when she was younger. As the years passed, she reached the point of forgetting who I was. One day, I decided to sit down with her and show her some old pictures of myself, building a sort of timeline: "This is me now, that's me in 2017, this is us together at your sister's house." It worked wonders for her memory: she started to remember even my mother and the friends she had back then. Ever since that day, I have been thinking about what I could build to help her, something that could do what I did that day to refresh her memory about whatever she needs to remember.
I have been a software engineer for 8 years now, and recently I got the chance to work with Llama3, the large language model developed by Meta. Honestly, I was never really into these technologies before, but since then I have found some interesting solutions that can be built with them. Drawing on that experience, I decided to create an LLM-based solution to help families caring for Alzheimer's patients, like my grandmother. Here is a simple sketch of what I had in mind:
The main idea of this project is to let any family member talk to an LLM assistant, telling the bot what Alice (or their relative) has forgotten. For example, one could say: "My grandmother Alice forgot about her husband today", and the bot would answer something like: "I am sorry to hear that. I will look for some memories related to her husband and update some devices throughout the day to help her remember him." The focus is on changing things around the house, such as digital photo frames, songs on speakers, and any other devices that can play videos, photos, or sounds; the interaction between the patient and the technology is entirely passive. The reason for doing it this way, instead of having the user interact directly with the system, is that most patients are elderly people; my grandmother herself is not a big fan of smartphones or tablets, so she would never use them directly. With this approach, the house changes itself to help the patient remember something. The only person interacting directly with the system is the family member (or the person taking care of the patient); the bot controls everything else.
From there, I evolved the idea into a better diagram:
- My uncle: the user (a family member).
- The cellphone: whatever interface the user prefers, such as Telegram, WhatsApp, or a custom desktop UI.
- LLM service: I intend to use Llama3 with Groq (because of its speed) and Langchain 🐦. It is a Python service.
- Alice's Memory Box: a database.
- Devices: any device at home we want to send information to, such as a speaker for a song or a frame for a picture.
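As a sketch of what the Memory Box could look like, here is an in-memory stand-in with hypothetical fields (`topic_tags`, `media_type`, `uri` are my own naming assumptions; the real version would query the database instead):

```python
from dataclasses import dataclass

@dataclass
class Memory:
    topic_tags: list   # e.g. ["husband", "wedding"]
    media_type: str    # "photo", "song", or "video"
    uri: str           # where the file lives

# A tiny in-memory stand-in for Alice's Memory Box.
MEMORY_BOX = [
    Memory(["husband", "wedding"], "photo", "photos/wedding_1965.jpg"),
    Memory(["husband"], "song", "songs/their_song.mp3"),
    Memory(["park", "walks"], "photo", "photos/favorite_park.jpg"),
]

def find_memories(topic: str) -> list:
    """Return every memory whose tags mention the forgotten topic."""
    words = topic.lower().split()
    return [m for m in MEMORY_BOX if any(w in m.topic_tags for w in words)]

matches = find_memories("her husband")
# matches the wedding photo and the song, both tagged "husband"
```

A real lookup would be fuzzier than exact tag matching (embeddings or an LLM re-rank come to mind), but the shape of the query stays the same: topic in, list of media out.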
To make things easier, I decided to go with Telegram instead of building a whole UI; that saves me time for testing and also makes it easy for anyone at home to use. For the LLM service, I am using Python with Llama3, Groq, and Langchain. For syncing the devices at home, I chose Socket.io.
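On the device side, the idea is that the LLM service broadcasts an event over Socket.io and each device (photo frame, speaker) listens for it. Here is a sketch of the payload the service could send; the event name `update_device` and the payload fields are my own assumptions, and the actual emit call is shown only as a comment since it needs a running server:

```python
import json

def build_device_update(device_id: str, media_type: str, uri: str) -> str:
    """Build the JSON payload that would be sent over Socket.io to a
    device at home (e.g. a digital photo frame or a speaker)."""
    return json.dumps({
        "device_id": device_id,
        "media_type": media_type,   # "photo", "song", or "video"
        "uri": uri,                 # file the device should display/play
    })

payload = build_device_update("living_room_frame", "photo", "photos/wedding_1965.jpg")
# With python-socketio, the service would then broadcast it, e.g.:
# sio.emit("update_device", payload)   # event name is my assumption
```

Keeping the payload as plain JSON means any device capable of speaking Socket.io (or even a thin HTTP bridge) can participate without knowing anything about the LLM side.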
Here is a quick demonstration:
The next steps are:
- Improve the memory box
- Make it work for more than a single family
- After solving these two things, release it as an open-source app for everyone who needs it
Technologies used:
- Python3
- Llama3
- Groq
- Langchain
- Redis
- Postgres
- Socket.io
Thank you for reading! If you are interested in this project, do not hesitate to message me.