Using Docker On A Raspberry Pi As An IoT Hub

Duncan Sample
Published in Compare the Market
8 min read · Nov 11, 2016


This should roughly guide you (albeit on a meandering path) through the process of getting a Raspberry Pi (3) up and running with Docker, and then using a simple Docker-contained toolset to create your own data & control hub for the so-called Internet of Things (IoT).

With the complex job of raising a newborn, and the fear put into us about keeping her temperature in some sort of Goldilocks zone, I thought I’d see if I could put an iBeacon a colleague lent me to good use, as it has a temperature sensor (among others).

So, we ordered a Raspberry Pi 3, since it has both Wi-Fi and Bluetooth 4.1 Low Energy built in, and then I began setting it up. Then the first roadblock hit me… getting the latest version of Docker on an ARM-based machine.

While I’m looking at using the Bean, I’ve also looked at the Owl Intuition-C heating control and energy monitoring device, which I’ve previously integrated with some Ruby scripts, but I’ve been meaning to update.

The aim

This was the first diagram I drew while thinking about what the main building blocks of the architecture were going to be.

What I ended up with (for now) is a bit more simplistic than that, but it’s also a bit more coupled together. The reason is that Node-RED was far more capable than I was expecting, and integrated with everything directly.

Installing (compiling) Docker

After setting up Wi-Fi, the next task was to install Docker. I quickly found Hypriot, but soon realised that the latest version they had for download was 1.10.3 (the latest upstream version at the time being 1.12.0).

After a day of spare-time effort (between changing nappies and keeping mum well hydrated and fed) attempting to convert the Dockerfile for Docker itself into a shell script I could run on the Pi, with limited success, Hypriot happened to publish the perfect blog post: Building Docker 1.12 on a Raspberry Pi.

Basic steps

  1. Use Docker, by using the ready-made HypriotOS Raspberry Pi image, to build the Docker package
  2. sudo apt-get install -y dphys-swapfile (add a swap file so the Raspberry Pi has enough memory to compile Docker)
  3. git clone https://github.com/docker/docker.git
  4. cd docker && git checkout v1.12.0
  5. git fetch origin pull/25192/head:fix-manpages-on-arm && git cherry-pick fix-manpages-on-arm (apply a necessary fix which hadn’t been applied to the master repo)
  6. time make deb (compile Docker, and time it)
  7. Play with your newborn baby and change several nappies while waiting
  8. Transfer the resulting package from the Raspberry Pi, ready for use later (they’re in a directory similar to ~/docker/bundles/1.12.0/build-deb/*/*.deb)
  9. Reflash Raspbian (or your preferred flavour) onto an SD card. You can keep using HypriotOS, but I’m more familiar with a closer-to-pure Debian distro
  10. apt update && apt dist-upgrade (update as normal)
  11. apt install libapparmor1 (install a dependency of Docker)
  12. Copy the package back to the Pi and run dpkg -i docker-engine_*_armhf.deb
  13. sudo usermod -aG docker pi (add the existing pi user to the docker group; useradd would try to create a new user)
  14. Check docker --version and rejoice
  15. sudo systemctl enable docker (configure Docker to start on boot as a service)
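Condensed into a single script, the build steps above look roughly like this. This is a sketch, not a tested script: the paths and PR number come from the steps above, and the exact bundles path may differ between builds.

```shell
#!/bin/sh
# Sketch of the Docker 1.12 build steps above, run on HypriotOS.
set -e

# Add a swap file so the Pi has enough memory to compile Docker
sudo apt-get install -y dphys-swapfile

# Fetch the Docker source at the v1.12.0 tag
git clone https://github.com/docker/docker.git
cd docker && git checkout v1.12.0

# Apply the ARM man-pages fix from PR 25192
git fetch origin pull/25192/head:fix-manpages-on-arm
git cherry-pick fix-manpages-on-arm

# Compile the .deb packages, and time it (takes a few hours on a Pi)
time make deb

# The resulting packages end up under bundles/
ls bundles/1.12.0/build-deb/*/*.deb
```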

Useful things to note

  • Use tmux or screen to keep the terminal session open (if you’re doing this over SSH like I did). The compiling step can take a few hours.

  • You can now just use apt (rather than apt-get) for most Debian package management needs, and you’ll get better feedback, progress bars, etc.
  • Now that you’ve got Docker installed, don’t get complacent. You can’t just docker run any image you find on Docker Hub. Each one has to be built to run on ARM. Look for images tagged rpi or armhf.
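One quick way to check whether an image was actually built for ARM is to inspect its recorded architecture after pulling it (the image name here is just an example):

```shell
# Check the architecture an image was built for before trying to run it.
# "arm" is what you want on a Raspberry Pi; an "amd64" image won't run.
docker inspect --format '{{.Architecture}}' nodered/node-red-docker:rpi
```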

The integration layer (Node-RED)

If you’ve ever used an SOA/BPM orchestration tool, Node-RED will feel familiar. The premise is that you use Node-RED as the integration point for your IoT devices and services, and use it to react to events (such as pushing a light switch) with some useful action(s), such as turning on a light followed by some music.

Deploying Node-RED

Node-RED has a handy Docker image, and even an rpi-tagged one that we could, in theory, use out of the box. It already has some pre-installed plugins, like GPIO, but it doesn’t have Bluetooth support, and doesn’t have the plugin for devices like the PunchThrough Bean (the iBeacon variant I’m using).

The way I improved this was to make my own derivative Dockerfile with Bluetooth configured. I’ve published my Dockerfile on GitHub and also pushed my built image to Docker Hub.

The basic docker run -d --name=node-red dsample/node-red-iot:rpi would get Node-RED running, but there would be several issues with this basic setup:

  • The container wouldn’t start when the Raspberry Pi is rebooted. Solve this by adding --restart=always
  • The Node-RED flows you create will be lost if you recreate the image. Solve this by adding -v /srv/node-red/data:/data to map the local directory /srv/node-red/data to the container’s /data directory
  • Bluetooth won’t work. The network interfaces within the container are virtual, and don’t include the hci0 device for the Raspberry Pi’s Bluetooth adapter. We can solve this by adding --net=host, but this removes the ability to do internal ‘links’ between this container and other Docker containers. This is the main downside I’ve found during this project.

With these additions, we end up with the command docker run -d --net=host --restart=always --name=node-red -v /srv/node-red/data:/data dsample/node-red-iot:rpi.

Now open a browser to your Raspberry Pi’s IP on port 1880 and you should see the Node-RED interface.
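If you’d rather check from the command line, you can confirm the editor is up with a quick request (assuming the Pi resolves as raspberrypi.local; substitute its IP otherwise):

```shell
# Expect "200" once the Node-RED container has finished starting
curl -s -o /dev/null -w '%{http_code}\n' http://raspberrypi.local:1880/
```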

The messaging bus (MQTT)

Thinking that Node-RED would only be the interface towards devices and services for ‘IoT’ purposes, I figured I’d need a messaging bus to carry the signals to and from other services, which would be listening and piping the data into other systems (e.g. a database). I was wrong: Node-RED can integrate with MongoDB, InfluxDB and other backend systems easily, but I thought I’d give myself the option of using MQTT anyway.

I chose MQTT rather than RabbitMQ (which I’m more familiar with), as MQTT is lighter and designed for the purpose of IoT (previously called M2M), so it felt like the right tool for the job.

The MQTT Broker I chose is Mosquitto. It doesn’t appear to be the most featureful broker, but I didn’t need much to begin with.

As with the Node-RED container, I ended up creating my own image: partly because I wanted to configure slightly different volume paths than the other attempts, and partly because I didn’t see the need to install either cURL or Wget into the image, since Docker can download files itself (saving 6MB in the image).

Again, the code is published on GitHub.

We end up with:

docker run -d \ 
-p 1883:1883 \
-p 9001:9001 \
-v /srv/mqtt/config:/mqtt/config:ro \
-v /srv/mqtt/log:/mqtt/log \
-v /srv/mqtt/data:/mqtt/data \
--restart=always \
--name mqtt \
dsample/rpi-mosquitto:rpi
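With the broker running, you can sanity-check it from any machine that has the Mosquitto command-line clients installed. The hostname and topic name here are arbitrary examples:

```shell
# In one terminal: subscribe to a test topic on the Pi's broker
mosquitto_sub -h raspberrypi.local -p 1883 -t test/topic -v

# In another terminal: publish a message to the same topic;
# it should appear in the subscriber's output
mosquitto_pub -h raspberrypi.local -p 1883 -t test/topic -m "hello from mqtt"
```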

The metrics (InfluxDB)

In order to visualise the data we’ll need a database. I’m planning to use MongoDB for long-term storage, but having used MongoDB for the past couple of years to store my heating and electricity readings, I felt it was time to use something else, as the graph generation from MongoDB was getting a bit slow (although that’s probably just my poor indexing).

At work we use Prometheus, but I’m not sold on the benefits of it over Graphite, so I thought I’d give another alternative a try, namely InfluxDB.

Again, I wrote my own Dockerfile. I used the config from the official build, but they use wget again, and also don’t provide an ‘rpi’ version.

docker run -d \ 
-p 8083:8083 \
-p 8086:8086 \
-v /srv/influxdb/meta:/influxdb/meta \
-v /srv/influxdb/data:/influxdb/data \
-v /srv/influxdb/wal:/influxdb/wal \
--restart=always \
--name influxdb \
dsample/rpi-influxdb:rpi

For this one, if you go to port 8083 on your Pi you should see the InfluxDB interface.
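You can also talk to InfluxDB’s HTTP API directly on port 8086, for instance to create a database and write a point in the line protocol. The database and measurement names here are made-up examples:

```shell
# Create a database for the sensor readings
curl -XPOST 'http://raspberrypi.local:8086/query' \
  --data-urlencode 'q=CREATE DATABASE iot'

# Write a single temperature measurement using the line protocol
curl -XPOST 'http://raspberrypi.local:8086/write?db=iot' \
  --data-binary 'temperature,sensor=bean value=21.5'
```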

TODO

  • I haven’t gotten around to installing a dashboard tool (Chronograf or Grafana) yet, so no funky energy/temperature graphs so far.
  • I haven’t installed MongoDB yet, as I haven’t found an image I’m happy with (most images are cross-compiled rather than native). The retention policy I’ve set on InfluxDB is 1 year though, so although I may lose the raw events, I’ll be able to visualise the data for now.

Now we get onto the more interesting part of making Node-RED do something vaguely useful. For this article, I’ll concentrate on just getting data out of the devices.

Integration

PunchThrough Bean

After spending a long time working out how to get Bluetooth working within the Node-RED container, getting data out of the Bean was pretty straightforward.

What this does is use an inject node to trigger a collection of the Bean temperature every 10 seconds and set a global variable with the result. There is then another flow triggered by an HTTP request which takes the global value and responds with it.
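That means the latest reading can be fetched over plain HTTP. Assuming the flow’s HTTP-in node is configured on a path like /temperature (the path is whatever you set on the node), something like:

```shell
# Fetch the most recent Bean temperature reading from the Node-RED flow
# (/temperature is a hypothetical path; use whatever the http-in node defines)
curl http://raspberrypi.local:1880/temperature
```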

OWL Intuition

I use an OWL Intuition-C heating control & electricity monitoring system. I bought it because it had an API, the main one being a UDP multicast data stream (XML).

I’d previously spent days (if not weeks) coding some Ruby services to collect, transmit and store the data. I managed to do the same thing in Node-RED in around 10 minutes.

This flow is a little more complex. It triggers from a UDP multicast message which contains XML. It deserialises the XML into JSON and maps the result, then it creates an InfluxDB ‘measurement’ and also publishes it to MQTT. The couple of nodes in between just deal with the ‘topic’ of the message. This topic is used to separate the heating and electricity reading messages so that we can have smaller functions for each.
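If you publish the readings to MQTT like this, you can watch them arrive with a wildcard subscription. The topic hierarchy here is illustrative; use whatever topics your flow actually publishes to:

```shell
# Watch all OWL readings as the flow publishes them
# ('owl/#' is a hypothetical topic prefix; '#' matches all subtopics)
mosquitto_sub -h raspberrypi.local -t 'owl/#' -v
```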

You can create another flow that starts with an MQTT subscription, and that was my plan for getting the data into MongoDB (although I wasn’t planning to use Node-RED for that part). Node-RED, however, has a simple MongoDB output node which will insert data.

Originally published at sample.me.uk.
