DIY Your AI Home Security Camera With Raspberry Pi And Open-source Software

Bofu Chen
Published in BerryNet
9 min read · Aug 3, 2019

This article is a step-by-step tutorial for setting up an AI home security camera WITHOUT leaking videos of your loved ones to the cloud.

Figure: This is a step-by-step intro to setup a home security system without leaking video to the cloud.

So you want to enhance home security to protect the family you love, and you decide to install a home surveillance system. After some Google searches, however, you find many solutions that provide intelligent notifications and video backup through the cloud. Sounds great, right? But what about privacy? Is security more important than privacy? After all, there’s no such thing as a free lunch.

In this tutorial, we will help you DIY a home surveillance system step by step. The system runs on a $35 computer, the wonderful and awesomely powerful Raspberry Pi, with all computation done locally, so you can protect your family without losing your privacy. Our system is based on BerryNet, an open-source AI system, so everything is transparent.

Before starting the tutorial

You will need to prepare the following devices to complete the tutorial:

  • An old mobile phone as the camera input
  • A Raspberry Pi for local AI computation (*)
  • A laptop or desktop with Ubuntu installed to check the results
  • A Gmail account to send out notifications

(*) The BerryNet open-source software also supports other devices as the AI computer. If you do not have a Raspberry Pi at hand, please check the BerryNet Wiki for alternative hardware choices.

More details of the required software and hardware are listed below in the Precondition section. If you encounter any problems along the way, please join the BerryNet Telegram group, where the BerryNet team is ready to support you.

Scenario

As the first step, we will help you set up a mobile phone as an IP camera, send the camera images to the Raspberry Pi for AI analysis, and display the analysis results on the Raspberry Pi’s screen. This will give you an overview of how the entire system works.

Figure: Development stage, to receive notifications by laptop.

Then we will help you set up your Gmail account and send event notifications through it. Your Gmail account will act like an assistant, sending event notifications to a designated email address.

Figure: Real scenario, after the system configuration is done, one can receive notifications in Email.

Our sample scenario for this tutorial is as follows:

  • Raspberry Pi 4 with BerryNet, the analysis center
  • Mobile as IP camera, the image source provider
  • Gmail account, the event notification sender
  • Another email account, the event notification receiver

Figure: Sample scenario setup.

Precondition

Here is the list of hardware and software you will need:

Table: Required software and hardware. [1] https://www.balena.io/etcher/

Step-By-Step Tutorial

Step 1: Turn Your Mobile Phone Into an IP Camera

CamOn Live Streaming is an easy-to-use app to turn your Android phone into an IP camera.

Figure: Screenshot of my mobile running CamOn Live Streaming

In the figure above, video streaming information is displayed on the screen. To get the RTSP URL for H.264 streaming, pull down the menu to see the streaming information and the stop button. The RTSP URL is the source of input video frames for the BerryNet camera client in later sections.

Figure: Get RTSP URL of camera video stream from the drop down menu.

You can choose a focus mode in the camera settings. Sometimes continuous auto-focus might affect the input quality and subsequently the accuracy of the AI system. Personally, I prefer tap-to-focus or disabling auto-focus completely.

Step 2: Download And Install BerryNet System Image to SD Card

Download the latest image (if your browser decompresses Zip files automatically, disable that before downloading). To check the integrity of the downloaded image, compute its md5sum and compare it with the checksum file.

Note: If your host OS is Windows, you can write BerryNet image to SD card by balenaEtcher.

Assuming that the image is saved to the /home/DT42/Downloads/ directory:

notebook $ cd /home/DT42/Downloads
notebook $ md5sum 2019-07-15-raspbian-buster-berrynet.zip
8bda8053a69995814a0ca6fa60b1d49e  2019-07-15-raspbian-buster-berrynet.zip
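
If you would rather script the integrity check (for example on a machine without the md5sum tool), a small Python helper can compute the digest in chunks. The filename and expected checksum below are the ones from the example above:

```python
import hashlib
import os

def md5sum(path, chunk_size=1 << 20):
    """Compute the MD5 checksum of a file, reading it in chunks
    so a large image does not need to fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the digest published in the checksum file.
image = "2019-07-15-raspbian-buster-berrynet.zip"
expected = "8bda8053a69995814a0ca6fa60b1d49e"
if os.path.exists(image):
    ok = md5sum(image) == expected
    print("Checksum OK" if ok else "Checksum mismatch: re-download the image.")
```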

Next, write the image to the SD card, assuming the inserted card appears as the device node /dev/sdb on your system. You can check the SD card's device node name with the GParted Partition Editor. If you see similar device nodes such as /dev/sdb1 or /dev/sdb2 (the card's partitions), you can safely ignore them.

Figure: Get a list of storage device nodes in GParted.

Run the command below to write BerryNet image to SD card:

notebook $ unzip -p 2019-07-15-raspbian-buster-berrynet.zip | sudo dd of=/dev/sdb bs=4M conv=fsync status=progress

For more information about the installation, refer to the installation Wiki page.

Step 3: Boot And Disable Auto-Launched AI System

After booting your Raspberry Pi, BerryNet is ready and a default object detection application will already be running on it. In later sections you will learn how to create your own AI application and launch it automatically at boot. For now, please stop the default application manually:

rpi $ sudo supervisorctl stop all

Step 4: Update BerryNet To The Latest Version

Update BerryNet to the latest version:

rpi $ sudo apt update
rpi $ sudo apt install -y berrynet

Step 5: Run Your AI Application And Verify Functionality Manually

You are now ready to launch an object detection application by running a detection service, a dashboard client, and a camera client manually.

Figure: Re-cap the development scenario.

Before running the steps below, make sure that your mobile phone and Raspberry Pi are connected to the same WiFi access point.

Firstly, start the object detection service:

rpi $ bn_tflite -s detector -p mobilenet-ssd-coco-tflite-2.0.0 --num_threads 4 --draw --debug
[D 190724 06:28:19 tflite_service:183] model filepath: /usr/share/dlmodels/mobilenet-ssd-coco-tflite-2.0.0/model.tflite
[D 190724 06:28:19 tflite_service:184] label filepath: /usr/share/dlmodels/mobilenet-ssd-coco-tflite-2.0.0/labels.txt
INFO: Initialized TensorFlow Lite runtime.
[D 190724 06:28:19 __init__:11] Connected with result code 0
[D 190724 06:28:19 __init__:13] Subscribe topic berrynet/data/rgbimage

You can see that the object detection service is ready to receive input data from the berrynet/data/rgbimage MQTT topic.
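
All BerryNet components exchange data this way: a producer publishes a JSON message to an MQTT topic, and consumers subscribe to it. The sketch below illustrates the general pattern of serializing a camera frame into such a message; the field names are illustrative assumptions, not BerryNet's actual internal schema (bn_camera handles all of this for you):

```python
import base64
import json
from datetime import datetime

def make_frame_payload(jpeg_bytes):
    """Serialize a JPEG frame into a JSON message suitable for an
    MQTT publish. The field names are illustrative only."""
    return json.dumps({
        "timestamp": datetime.now().isoformat(),
        "bytes": base64.b64encode(jpeg_bytes).decode("ascii"),
    })

def decode_frame_payload(payload):
    """Recover the raw JPEG bytes on the consumer side."""
    return base64.b64decode(json.loads(payload)["bytes"])

# Round-trip a fake frame, as a publisher and subscriber pair would.
frame = b"\xff\xd8\xff\xe0 fake jpeg data"
assert decode_frame_payload(make_frame_payload(frame)) == frame
```

In the real system, the camera client publishes such messages to berrynet/data/rgbimage, and the detection service that subscribed to that topic above picks them up.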

Secondly, start the dashboard client. The detection result will be shown on it after the camera client is started in the next step (until then, the dashboard will be empty):

rpi $ bn_dashboard --topic berrynet/engine/tflitedetector/result --no-decoration --no-full-screen --debug
[D 190724 06:37:54 __init__:11] Connected with result code 0
[D 190724 06:37:54 __init__:13] Subscribe topic berrynet/engine/tflitedetector/result

If you are logged in remotely via SSH, you can launch the dashboard window by adding DISPLAY=:0 before bn_dashboard. To stop the dashboard client, click the dashboard window and press “q”.

Thirdly, start the camera client, and assign an image filepath as the input source:

rpi $ wget -O /tmp/person.jpg https://raw.githubusercontent.com/pjreddie/darknet/master/data/person.jpg
rpi $ bn_camera --mode file --filepath /tmp/person.jpg --debug
[D 190724 06:46:18 camera:165] payload: 3.301 ms
[D 190724 06:46:18 camera:166] payload size: 161232
[D 190724 06:46:18 __init__:50] Send message to topic berrynet/data/rgbimage
[D 190724 06:46:18 camera:171] mqtt.publish: 8.06 ms
[D 190724 06:46:18 camera:172] publish at 2019-07-24T06:46:18.65415

You should see the detection result shown on the dashboard, meaning that the object detection service and data communication mechanism are ready.

Finally, stop the camera client and restart it with the RTSP URL of the camera video stream as its input source:

rpi $ bn_camera --stream-src rtsp://10.0.0.108:8080/video/h264 --fps 1 --debug
[W 190724 06:34:52 camera:113] Camera FPS is 180000.0 (>30 or <1). Set it to 30.
[D 190724 06:34:52 camera:127] ===== VideoCapture Information =====
[D 190724 06:34:52 camera:128] Stream Source: rtsp://10.0.0.108:8080/video/h264
[D 190724 06:34:52 camera:129] Camera FPS: 30
[D 190724 06:34:52 camera:130] Output FPS: 5.0
[D 190724 06:34:52 camera:131] Interval: 6
[D 190724 06:34:52 camera:133] ====================================
[D 190724 06:34:52 camera:142] Drop frames: 5
[D 190724 06:34:52 __init__:50] Send message to topic berrynet/data/rgbimage

The detection results will be updated on the dashboard continuously. You can adjust the fps parameter of the camera client and put a big digital clock in front of the camera to see whether the system can process the input frames at the given fps. We suggest setting fps to 1 to prevent email flooding.
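
The Interval and Drop frames values in the log above follow directly from the camera and output frame rates. Here is a small sketch of that bookkeeping (a reconstruction from the log output, not the actual bn_camera code):

```python
def frame_interval(camera_fps, output_fps):
    """Number of captured frames corresponding to one published frame."""
    return round(camera_fps / output_fps)

# Values from the VideoCapture Information log above.
camera_fps, output_fps = 30, 5.0
interval = frame_interval(camera_fps, output_fps)  # every 6th frame is kept
dropped = interval - 1                             # 5 frames dropped in between
print(f"Interval: {interval}, Drop frames: {dropped}")
```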

Note: If you are interested in benchmarks of running MobileNet SSD on Raspberry Pi 3 and 4 with TFLite, I recommend this article by Allan.

Step 6: Send Detection Result by Gmail

If you want to receive a notification when an event happens, the Gmail client helps you send notification emails to a designated email address.

Figure: Recap the real scenario.

Firstly, stop the dashboard client by clicking the dashboard window and pressing “q”.

Secondly, configure your Gmail account to allow 3rd-party app login. Go to Google Account, disable 2-step verification, and enable less secure app access:

Figure: Google account setup for Gmail client access.

Finally, start the Gmail client:

rpi $ bn_gmail --sender-address foo@gmail.com --sender-password <gmail-password> --receiver-address bar@dt42.io --target-label person --topic berrynet/engine/tflitedetector/result --debug
[D 190722 15:36:27 __init__:11] Connected with result code 0
[D 190722 15:36:27 __init__:13] Subscribe topic berrynet/engine/ovdetector/result
[D 190722 15:36:50 __init__:20] Receive message from topic berrynet/engine/ovdetector/result
[D 190722 15:36:50 gmail:120] inference text result: {'timestamp': '2019-07-22T15:36:49.674736', 'annotations': [{'left': 63, 'confidence': 0.9310929179191589, 'bottom': 347, 'right': 203, 'label': 'dog', 'top': 261}, {'left': 405, 'confidence': 0.7530375123023987, 'bottom': 341, 'right': 606, 'label': 'horse', 'top': 135}, {'left': 182, 'confidence': 0.9989088773727417, 'bottom': 369, 'right': 271, 'label': 'person', 'top': 103}, {'left': 404, 'confidence': 0.4551317095756531, 'bottom': 337, 'right': 600, 'label': 'sheep', 'top': 135}]}
[D 190722 15:36:50 gmail:109] Result labels: ['dog', 'horse', 'person', 'sheep']
[D 190722 15:36:50 gmail:126] Find target label person: True
[D 190722 15:36:50 gmail:72] Sender: foo@gmail.com
[D 190722 15:36:50 gmail:73] Receiver: bar@dt42.io
[D 190722 15:36:50 gmail:74] Subject: BerryNet mail client notification
[D 190722 15:36:50 gmail:80] Attachment: /tmp/2019-07-22T15:36:50.632503.jpg
[D 190722 15:36:50 gmail:80] Attachment: /tmp/2019-07-22T15:36:50.632503.json

The receiver should get a notification email with two attachments: an image and the text of the detection results.

Figure: Receive notifications consisting of a snapshot and text details in Gmail.

If you change the target label to another object, such as dog, the notification will be sent when a dog is detected.

Figure: Email attachment when the label is changed from person to dog.
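
The label-matching step visible in the debug log above can be sketched as follows. This is a simplified reconstruction (not the actual bn_gmail code) using the detection result from the log:

```python
import json

# Detection result taken from the debug log above (coordinates abridged).
result = json.loads("""{
  "timestamp": "2019-07-22T15:36:49.674736",
  "annotations": [
    {"label": "dog",    "confidence": 0.931, "left": 63,  "top": 261, "right": 203, "bottom": 347},
    {"label": "horse",  "confidence": 0.753, "left": 405, "top": 135, "right": 606, "bottom": 341},
    {"label": "person", "confidence": 0.999, "left": 182, "top": 103, "right": 271, "bottom": 369},
    {"label": "sheep",  "confidence": 0.455, "left": 404, "top": 135, "right": 600, "bottom": 337}
  ]
}""")

def has_target(annotations, target_label):
    """True if any detected object matches the target label,
    i.e. a notification email should be sent."""
    return any(a["label"] == target_label for a in annotations)

labels = [a["label"] for a in result["annotations"]]
print("Result labels:", labels)
print("Find target label person:", has_target(result["annotations"], "person"))
```

Swapping "person" for "dog" in the call above is exactly what changing --target-label does.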

If you see the messages below instead of receiving the notification email, please check the password entered in the command line:

[W 190803 15:31:54 gmail:148] (535, b'5.7.8 Username and Password not accepted. Learn more at\n5.7.8 https://support.google.com/mail/?p=BadCredentialsn140sm80107108fpd.132 - gmtp')

Step 7: Make the System Run Automatically

You might remember that you disabled the default object detection application at the beginning of this tutorial. The default application is described in the configuration file

/etc/supervisor/conf.d/berrynet-tflite.conf

To make your own application launch automatically instead, you can modify this configuration by

  • Updating the fps parameter of the camera client (bn_camera)
  • Adding a section for the Gmail client (bn_gmail)

The configuration content will look like this (you can modify it to suit your needs):

rpi $ cat /etc/supervisor/conf.d/berrynet-tflite.conf
[program:tflite-service]
command=bn_tflite --service detector --service_name tflitedetector --model_package mobilenet-ssd-coco-tflite-2.0.0 --num_threads 4 --draw --debug
stdout_logfile=/var/log/berrynet/tflite-service-stdout.log
stdout_logfile_maxbytes=1048576
stderr_logfile=/var/log/berrynet/tflite-service-stderr.log
stderr_logfile_maxbytes=1048576
priority=10
[program:camera]
command=bn_camera --fps 1
stdout_logfile=/var/log/berrynet/camera-stdout.log
stdout_logfile_maxbytes=1048576
stderr_logfile=/var/log/berrynet/camera-stderr.log
stderr_logfile_maxbytes=1048576
priority=30
[program:gmail]
command=bn_gmail --sender-address foo@gmail.com --sender-password <gmail-password> --receiver-address bar@dt42.io --target-label person --topic berrynet/engine/tflitedetector/result
stdout_logfile=/var/log/berrynet/gmail-stdout.log
stdout_logfile_maxbytes=1048576
stderr_logfile=/var/log/berrynet/gmail-stderr.log
stderr_logfile_maxbytes=1048576
priority=30

Reboot the Raspberry Pi, and start protecting your family without sacrificing your privacy!

Summary

We believe that AIoT (edge AI computation plus IoT) is a powerful tool for protecting our security and privacy at the same time. We hope this flexible architecture helps you build more interesting AIoT applications in education, research, robotics, health care, and more!

If you have any questions or suggestions, please visit the BerryNet Telegram group. We look forward to seeing you!

Special Thanks

At the end of this tutorial, I would like to thank the BerryNet QA team: Sherry, the lady in the picture, who performed tests and provided suggestions for this tutorial, and Mei-mei, the dog, who played the detected object.

Figure: BerryNet QA team
