Raspberry Pi, make me a coffee!
How we can turn a Raspberry Pi into an Alexa-like, LLM-based assistant
The Raspberry Pi is a very affordable small computer that fits in the palm of your hand.
I have been playing with and using Raspberry Pi devices for more than 10 years, for many different purposes. Now, with the rise of large language models (LLMs), I have been experimenting with many new possible uses.
First of all, I stick with the Raspberry Pi because I have many of them, but there are other options (Jetson Nano, Le Potato, etc.), so feel free to draw inspiration from other boards.
The Raspberry Pi is quite limited in memory, so we cannot really run full-size LLMs locally on it and get seamless performance. We need to use SMALL language models (sometimes called baby-LLMs), such as Phi, TinyLlama, or TinyDolphin. Nevertheless, the versatility and low cost of a Raspberry Pi make it well suited for implementing federated learning systems, as well as distributed and personalized edge-computing solutions, which will probably grow in importance in the Internet of Things (IoT) universe. Those solutions can also be empowered by fine-tuned or augmented personalized language models. Generally, you should manage to deploy a language model on a Raspberry Pi 5 with 8GB of RAM, and also on a Raspberry Pi 4 with 4GB; using older models with less RAM might be very challenging.
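To give a concrete idea of what "running a baby-LLM on a Raspberry Pi" looks like, here is a minimal Python sketch that queries a small model served locally through Ollama's HTTP API. It assumes Ollama is installed on the Pi and that a small model such as TinyLlama has already been pulled (`ollama pull tinyllama`); the model name, prompt, and timeout are illustrative choices, not part of any fixed setup.

```python
# Minimal sketch: ask a small language model running locally on the Pi via Ollama.
# Assumes the Ollama service is running and a small model (e.g. tinyllama) is pulled.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def ask(prompt: str, model: str = "tinyllama") -> str:
    """Send a prompt to the locally served model and return its full answer."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    response = requests.post(OLLAMA_URL, json=payload, timeout=300)
    response.raise_for_status()
    return response.json()["response"]


if __name__ == "__main__":
    print(ask("In one sentence, how do I brew a good espresso?"))
```

Keeping the assistant code behind a local HTTP API like this also makes the model easy to swap: moving from TinyLlama to Phi or TinyDolphin is just a matter of changing the `model` parameter, which is convenient when testing what your particular Pi can handle.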