How I Transformed a Raspberry Pi 4 into a Smart Home Hub with Home Assistant, Secure Remote Access, and Real-Time Monitoring
As someone who loves tinkering with tech and home automation, I found the Raspberry Pi 4 (4GB version) to be the perfect platform for a multitude of tasks. From managing my smart devices to securing remote access, this tiny powerhouse handles it all. In this blog, I’ll walk you through how I’m using my Raspberry Pi 4 for various purposes, including Home Assistant, Twingate, WireGuard, and Prometheus.
Setting the Stage: Why Raspberry Pi 4?
The Raspberry Pi 4 with 4GB of RAM strikes the ideal balance between affordability and performance for home automation projects. Its quad-core ARM Cortex-A72 CPU is powerful enough to run multiple services simultaneously without breaking a sweat. I decided to use mine as the control center of my smart home, handling everything from automation tasks for IoT devices to VPNs and monitoring my home internet speed.
To ensure that my Raspberry Pi 4 runs smoothly, I housed it in the GeeekPi Acrylic Case, which I found on Amazon. This case includes a 40x40x10mm fan and heatsinks, providing adequate cooling even when the Raspberry Pi is running multiple services at once.
In addition, I’m using a SONOFF Universal Zigbee 3.0 USB Dongle Plus Gateway for controlling various Zigbee devices via Home Assistant and Zigbee2MQTT. This dongle allows me to connect smart lights, sensors, and other devices to my automation system. Its strong antenna boosts the signal, providing excellent range for my Zigbee devices; I picked it up from Amazon as well.
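To give a sense of how the dongle is wired into Zigbee2MQTT, here is a minimal sketch of a Zigbee2MQTT configuration.yaml — the serial device path and MQTT address are placeholders, not copied from my actual setup:

```yaml
# Minimal Zigbee2MQTT configuration sketch (values are illustrative)
mqtt:
  server: mqtt://localhost:1883   # placeholder MQTT broker address
serial:
  port: /dev/ttyUSB0              # placeholder path where the SONOFF dongle appears
permit_join: false                # keep the network closed once devices are paired
```

With this in place, Zigbee2MQTT bridges the dongle’s Zigbee network onto MQTT topics that Home Assistant can consume.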
Home Assistant: A Central Hub for Smart Devices
At the heart of my setup is Home Assistant, an open-source platform for managing smart devices. From lights to security cameras, I use Home Assistant to automate almost everything. The Raspberry Pi runs this platform efficiently, providing me with a web interface where I can monitor and control my devices.
I’ve configured Home Assistant to interact with several smart devices in my home, including:
- Smart lights: Controlled based on schedules or triggers.
- Sensors: Monitoring temperature and motion.
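As an example of the schedule- and trigger-based control described above, a Home Assistant automation that turns a light on when motion is detected can be expressed in YAML like this — the entity IDs here are placeholders for illustration, not my real device names:

```yaml
# Sketch of a Home Assistant automation (entity IDs are placeholders)
automation:
  - alias: "Hallway light on motion"
    trigger:
      - platform: state
        entity_id: binary_sensor.hallway_motion   # placeholder motion sensor
        to: "on"
    action:
      - service: light.turn_on
        target:
          entity_id: light.hallway                # placeholder smart light
```

The same pattern works for time-based triggers by swapping the state trigger for a `platform: time` trigger with an `at:` schedule.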
Managing Services with Docker and Portainer
To ensure that everything runs smoothly, I’ve set up Docker on the Raspberry Pi, and I manage my containers through Portainer. This setup makes it easier to organize and handle multiple services, such as Home Assistant, Twingate, WireGuard, and Prometheus, and to add more containers without worrying about system conflicts.
Here’s a snapshot of my Portainer setup, showing how I manage my containers:
With Portainer, I can easily start, stop, or restart containers for various services, ensuring that everything stays up-to-date and running smoothly.
Containers I’m Running:
- Home Assistant: For managing smart home devices.
- Twingate: For secure remote access to my home network.
- Prometheus: For scraping system stats and displaying them in Grafana, which is hosted on another server.
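As a rough sketch of how part of this stack can be defined for Docker, a docker-compose file might look like the following — the image tags, host paths, and port mappings are illustrative assumptions, not copied from my actual setup:

```yaml
# Illustrative docker-compose sketch for part of the stack
services:
  homeassistant:
    image: ghcr.io/home-assistant/home-assistant:stable
    network_mode: host            # lets Home Assistant discover devices on the LAN
    volumes:
      - ./homeassistant:/config   # placeholder host path for configuration
    restart: unless-stopped

  node-exporter:
    image: prom/node-exporter:latest
    ports:
      - "9100:9100"               # the metrics endpoint Prometheus scrapes
    restart: unless-stopped
```

Whether you define containers in compose files or create them through Portainer’s UI, the end result shows up in the same Portainer container list.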
Prometheus for Data Collection
Prometheus is a powerful tool for collecting and storing time-series data. To monitor my Raspberry Pi’s health, I use Node Exporter, a lightweight agent that gathers essential system metrics such as CPU usage, memory consumption, disk I/O, network traffic, and more.
Node Exporter is specifically designed to expose these metrics in a format that Prometheus can scrape (collect) at regular intervals, making it easy to monitor the system’s performance over time.
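That exposition format is plain text, one sample per line, with optional `# HELP`/`# TYPE` comment lines. To illustrate what Prometheus actually reads from the endpoint, here is a tiny parser over a hand-written sample — the metric names are real Node Exporter metrics, but the values are made up:

```python
# Minimal parser for the Prometheus text exposition format that
# Node Exporter serves. The sample below is hand-written for
# illustration, not captured from a real device.
sample = """\
# HELP node_memory_MemAvailable_bytes Memory available.
node_memory_MemAvailable_bytes 2.147483648e+09
node_cpu_seconds_total{cpu="0",mode="idle"} 12345.67
"""

def parse_metrics(text):
    """Map each metric line (name plus optional labels) to its float value."""
    metrics = {}
    for line in text.splitlines():
        if not line or line.startswith('#'):
            continue  # skip blank lines and HELP/TYPE comments
        name, value = line.rsplit(' ', 1)  # value is the last whitespace-separated token
        metrics[name] = float(value)
    return metrics

parsed = parse_metrics(sample)
print(parsed['node_memory_MemAvailable_bytes'])  # → 2147483648.0
```

Prometheus does exactly this kind of parsing at scale on every scrape, attaching the labels (like `cpu="0"`) as queryable dimensions.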
Node Exporter Setup
The Node Exporter runs as a background service on the Raspberry Pi, exposing a metrics endpoint on a specified port (typically :9100). Prometheus is configured to scrape this endpoint periodically (every 15 seconds by default) to collect the following metrics:
- CPU usage: How much of the CPU resources are being utilized.
- Memory usage: The amount of RAM used vs. available.
- Disk I/O: How much data is being read from or written to the disk.
- Network traffic: Incoming and outgoing network data.
Once these metrics are scraped by Prometheus, they are stored in a time-series database, which allows me to analyze the data historically or in real-time.
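The scrape setup described above boils down to a short entry in prometheus.yml. This is a sketch rather than my exact file — the job name and the Pi’s address are placeholders:

```yaml
# Sketch of the Prometheus scrape job for the Pi (values are placeholders)
scrape_configs:
  - job_name: 'raspberrypi'             # illustrative job name
    scrape_interval: 15s                # Prometheus' default interval, stated explicitly
    static_configs:
      - targets: ['192.168.1.50:9100']  # placeholder Pi address, Node Exporter's default port
```

Each target listed here gets scraped on the configured interval, and every sample is stored against that job and instance.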
Grafana for Visualization
While Prometheus gathers data, Grafana is what I use to visualize it. Instead of relying on basic numbers and tables, Grafana allows me to create sleek, customizable dashboards that provide at-a-glance information. I run Grafana on a separate server, but it reads the Prometheus metrics from the Raspberry Pi.
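On the Grafana side, pointing the separate server at the Pi’s Prometheus comes down to adding a Prometheus data source. One way is a provisioning file like this sketch — the data source name and the Pi’s address are placeholders:

```yaml
# Sketch of a Grafana data source provisioning file (values are placeholders)
apiVersion: 1
datasources:
  - name: RaspberryPi Prometheus
    type: prometheus
    url: http://192.168.1.50:9090   # placeholder address of Prometheus on the Pi
    access: proxy                   # Grafana's backend makes the requests
```

The same data source can also be added by hand through Grafana’s web UI; provisioning just makes it reproducible.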
Here’s a glimpse of the monitoring dashboard tracking system performance:
Monitoring Internet Speed with SpeedTest and InfluxDB
To keep track of my home internet speed, I’ve set up a cron job that runs every 30 minutes on my Raspberry Pi, executing a Python script that logs the speed test data into InfluxDB, which Grafana then visualizes.
I followed this guide to get everything configured. The script runs a speed test and captures key metrics like ping, download speed, and upload speed. These metrics are then written to InfluxDB, which I have connected to Grafana for visualizing the data.
Here’s another glimpse of my Grafana dashboard where the internet speed is being tracked:
Python Script for Speed Monitoring
Below is the Python script I use to gather internet speed data:
import re
import subprocess

from influxdb import InfluxDBClient

# Run the Ookla speedtest CLI and capture its plain-text output
response = subprocess.run(
    ['/usr/bin/speedtest', '--accept-license', '--accept-gdpr'],
    capture_output=True, text=True, check=True,
).stdout

# Raw strings keep the regex escape sequences intact
ping = re.search(r'Latency:\s+(.*?)\s', response)
download = re.search(r'Download:\s+(.*?)\s', response)
upload = re.search(r'Upload:\s+(.*?)\s', response)
jitter = re.search(r'Latency:.*?jitter:\s+(.*?)ms', response)

# Fail with a clear message if the CLI output format ever changes,
# instead of crashing on a .group() call against None
if not all((ping, download, upload, jitter)):
    raise RuntimeError('Could not parse speedtest output:\n' + response)

speed_data = [
    {
        "measurement": "internet_speed",
        "tags": {
            "host": "RaspberryPiHost"
        },
        "fields": {
            "download": float(download.group(1)),
            "upload": float(upload.group(1)),
            "ping": float(ping.group(1)),
            "jitter": float(jitter.group(1))
        }
    }
]

client = InfluxDBClient('localhost', 8086, 'dummy_user', 'dummy_password', 'dummy_database')
client.write_points(speed_data)
Cron Job Configuration
To automate the execution of this script, I set up a crontab entry that runs the script every 30 minutes:
*/30 * * * * python3 /home/user/speedtest.py
This ensures that my internet speed is regularly monitored without manual intervention. The results are then pushed into InfluxDB and visualized in Grafana, allowing me to keep an eye on my network’s performance over time.