IoT Motion Detect Android Things Project

Mehmet Burak Akgün
Sep 2, 2018 · 5 min read

Things used in this project

Hardware components

Raspberry Pi 3 Model B

Raspberry Pi Camera Module

Ultrasonic Sensor — HC-SR04

Jumper wires

Android device (To Run companion app)

Software and external services

Google Android Things

Android Studio

Spring Boot (As middleware)

Firebase

Redis

Concept

Alarm systems can be vital, but how useful are they if they can't see the restricted area? Did the sensor trip on a cat or a passing bird? Or, worst of all, is someone breaking into your house?

It would be a great idea if you could receive rich, real-time notifications over the internet while away from home.

Core features of the Android Things Device

Remotely switch the system between Active and Passive

Detect Motion and take photos

Save captured photos and receive them the moment they are taken

View photos at any time, along with the time each photo was taken

This IoT application works as follows

If the motion sensor detects any movement, a photo is taken, uploaded to the server, and saved to the database with a timestamp.

When motion is detected, you will be notified on your Android device.
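
The flow described above can be sketched in plain Kotlin. All names here (MotionEvent, uploadPhoto, notifyPhones, the backend URL) are illustrative placeholders, not the project's actual classes:

```kotlin
import java.time.Instant

// Hypothetical model of one motion event: a photo is captured,
// stored with a timestamp, and a notification is sent out.
data class MotionEvent(val timestamp: Instant, val photoPath: String)

// Stand-in for the real upload call: returns where the photo was stored.
fun uploadPhoto(event: MotionEvent): String =
    "https://example-backend/photos/${event.timestamp.toEpochMilli()}"

// Stand-in for the push notification sent to registered phones.
fun notifyPhones(event: MotionEvent, photoUrl: String): String =
    "Motion at ${event.timestamp}: $photoUrl"

fun handleMotion(event: MotionEvent): String {
    val url = uploadPhoto(event)      // save to server + database
    return notifyPhones(event, url)   // push to registered phones
}

fun main() {
    val event = MotionEvent(Instant.parse("2018-09-02T10:15:30Z"), "/data/photo1.jpg")
    println(handleMotion(event))
}
```

The real project does the upload and push through Firebase and the Spring backend; this just shows the order of operations.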

To build this project

Follow the steps below ↓

Once you have a device set up and running Android Things, attach the motion sensor (the HC-SR04 ultrasonic sensor) to the Android Things board, following the schematics below:
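
For reference, the HC-SR04 reports distance as the width of its echo pulse: sound travels at roughly 0.0343 cm/µs, and the pulse covers the round trip to the obstacle and back. A minimal sketch of that conversion and a simple motion check (the 10 cm threshold is an arbitrary example, not the project's tuned value):

```kotlin
// Convert the HC-SR04 echo pulse width (microseconds) to distance in cm.
// Speed of sound ~0.0343 cm/us; divide by 2 for the round trip.
fun echoMicrosToCm(echoMicros: Double): Double = echoMicros * 0.0343 / 2.0

// Flag motion when the reading drops well below the last stable distance.
fun isMotion(baselineCm: Double, currentCm: Double, thresholdCm: Double = 10.0): Boolean =
    baselineCm - currentCm > thresholdCm

fun main() {
    val d = echoMicrosToCm(5830.0)   // a ~5.8 ms echo is roughly 100 cm
    println("distance ~ $d cm, motion = ${isMotion(150.0, d)}")
}
```

On the device itself, the pulse is generated and timed through the Android Things GPIO API; the arithmetic above is the sensor-datasheet part that stays the same everywhere.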

Here is an image of the whole wiring:

It’s quite simple!

Clone the Project:

Check out the source code from the GitHub repository linked here, and open it in Android Studio.

The repository consists of 3 projects:

- mobile: the companion app that runs on any Android phone and receives push notifications when the things project takes photos.

- things: the project that runs on the Android Things device; it handles sensor detection and photo capture.

- spring project: the middleware between the app and Firebase, deployed to your application server. It monitors for new motion logs on the server, stores the captured image, and constructs a push notification to send to registered mobile devices.
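
The spring project keeps its motion logs in Redis. A sketch of one plausible key scheme, using an in-memory sorted map as a stand-in for Redis (the key format and class name here are examples, not the project's real ones):

```kotlin
// Illustrative stand-in for the Redis-backed motion log.
// Keys sort by timestamp, so the newest entry is always last.
class MotionLogStore {
    private val store = sortedMapOf<String, String>()

    // e.g. key "motion:1535883330000" -> value = stored image URL
    fun put(timestampMillis: Long, imageUrl: String) {
        store["motion:$timestampMillis"] = imageUrl
    }

    // The most recent motion entry, or null if nothing was logged yet.
    fun latest(): Pair<String, String>? =
        store.entries.lastOrNull()?.let { it.key to it.value }
}

fun main() {
    val log = MotionLogStore()
    log.put(1535883330000L, "https://example/img/1.jpg")
    log.put(1535883400000L, "https://example/img/2.jpg")
    println(log.latest())
}
```

In the real deployment the same idea maps onto Redis keys, and the middleware watches for new entries to trigger notifications.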

First, you will need to create a Firebase project. Firebase Cloud Messaging will be used to push rich notifications when the sensor is triggered.

Head to firebase.google.com and click “Get Started”.

Select “Add project”, give it a useful name and a default region, and select “Create Project”.

Select “Add Firebase to your Android App”.

Enter the package name (in this case we will use “com.burak.iot”), give it an App nickname, and then select “Register App”.

It will then prompt you to save the google-services.json file into your application's directory:

com.burak.iot/mobile/google-services.json

You have now completed the Firebase portion of the project.

Running the Android Things Project [IoT]

Switch back to Android Studio and sync your Gradle project.

Plug your Android Things device into your computer, or make sure you are connected to the device via ADB. Make sure the device has an internet connection so that it can upload images to your backend. To connect the device to the internet, run the following ADB command with your WiFi SSID and password (or alternatively connect an ethernet cable):

adb shell am startservice -n com.google.wifisetup/.WifiSetupService -a WifiSetupService.Connect -e ssid YourWifiSSID -e passphrase YourWifiPassword

Select the Android Things IoT device to deploy the “things” module, then run the project. The things project is written in Kotlin.

Spring Project Deployment [middleware]:

In order to receive push notifications on our phones and save the moment as a picture when motion is detected, we need to deploy the Spring project to an application server. To do this, navigate into the iot/spring project from GitHub.

The Spring Boot project requires Maven and Redis. You can install Redis by following the instructions at https://redis.io/topics/quickstart, and install Maven separately if you don't already have it.

Once you have Maven and Redis installed, configure your backend by changing the authorization key (taken from Firebase), then simply deploy to any server or container.

You should then be able to send notifications from your backend whenever you're ready. The default environment and configuration are set up for GlassFish Server.
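
As a rough illustration of what the backend sends, here is a sketch of a payload in the shape of the legacy FCM HTTP API (a JSON body with "to", "notification", and "data" fields). The token, image URL, and field contents are placeholders, not the project's actual values:

```kotlin
// Build an FCM-style push payload for a motion event.
// deviceToken and imageUrl are hypothetical example values.
fun buildFcmPayload(deviceToken: String, imageUrl: String, timestampMillis: Long): String = """
    {
      "to": "$deviceToken",
      "notification": {
        "title": "Motion detected",
        "body": "Captured at $timestampMillis"
      },
      "data": { "imageUrl": "$imageUrl" }
    }
""".trimIndent()

fun main() {
    println(buildFcmPayload("TEST_TOKEN", "https://example/img/1.jpg", 1535883330000L))
}
```

The real Spring project would post a body like this to the FCM endpoint with the server's authorization key in the request header.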

Android App [mobile]:

Switch back to Android Studio and make sure you have another Android device (an Android phone) plugged in. Select the “mobile” module to deploy and press the green run button. The project is written in Kotlin, so you need the Kotlin plugin and the latest version of Android Studio.

Select your phone and select “OK”.

You should now see the example app running on your mobile device, with a list of the latest captured pictures and the option to remotely switch your alarm between Active and Passive.
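
The remote Active / Passive switch boils down to a small piece of state on the Things device: motion only triggers a capture while the system is armed. A minimal sketch, with names invented for illustration:

```kotlin
// Hypothetical model of the alarm's armed state.
enum class AlarmState { ACTIVE, PASSIVE }

class AlarmController(var state: AlarmState = AlarmState.PASSIVE) {
    // Flip between Active and Passive (what the mobile app's switch does).
    fun toggle(): AlarmState {
        state = if (state == AlarmState.ACTIVE) AlarmState.PASSIVE else AlarmState.ACTIVE
        return state
    }

    // Motion only leads to a photo capture while the system is Active.
    fun shouldCapture(motionDetected: Boolean): Boolean =
        state == AlarmState.ACTIVE && motionDetected
}

fun main() {
    val alarm = AlarmController()
    println(alarm.shouldCapture(true))   // still Passive
    alarm.toggle()
    println(alarm.shouldCapture(true))   // now Active
}
```

In the real project the phone sends this state change over the network; the decision logic on the device is this simple check.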

And there we have an alarm: you have successfully built your Android Things motion-sensing camera! Have anything to ask? Let me know on LinkedIn.

A Note about Security

Currently this application has no complex security rules implemented. Anyone who gains access to your deviceId or generatedToken, or to the test backend server, can read or change any information in the system. There are plenty of ways to secure the system, but for demonstration purposes none of them has been implemented in this project. Please address this before using this project in production.
