Automating the capture of airplane pictures with Raspberry Pis, ADS-B and IoT software.

Arun Venkataswamy
7 min read · Aug 15, 2019


The weatherproof enclosure containing a Raspberry Pi, SDR (Software Defined Radio) and ADS-B antenna. It also holds a camera and a dome for another, unrelated project.

A Raspberry Pi on my roof listens to Automatic Dependent Surveillance-Broadcast (ADS-B) transmissions from aircraft, and based on that location and altitude data, another Raspberry Pi connected to a camera in my balcony snaps a photo of them as they fly by. The two devices communicate through an MQTT broker; MQTT is one of the most widely used protocols in IoT projects.

I wanted to do a project to get some experience in reliably controlling, monitoring and operating a fairly complex device remotely. At the moment, the remoteness is limited to my roof, four floors above my apartment. In the future, what I learn from this project will hopefully help me set up a remote observatory for astrophotography.

A sample photograph taken by the system. The capture is triggered when an aircraft is approaching Chennai (MAA) airport for landing and its altitude is around 1,400 feet.
A montage of 200 aircraft captured during the course of the day on 4th August 2019. During the night, the camera’s long exposure creates the flight trails.

Software used for the implementation

  • dump1090-fa for decoding ADS-B signals
  • piaware for feeding the decoded information to FlightAware. You would be contributing to the world’s largest flight tracking network, and in return for feeding data, FlightAware upgrades you to an Enterprise account. This is really useful if you are an aviation enthusiast.
  • Node-RED for flow programming. A brilliant platform for building logic without writing a single line of code. It was used both in the roof/ADS-B unit for publishing aircraft info to the MQTT topic and in the balcony/camera unit for subscribing to the MQTT topic and triggering the camera.
  • Mosquitto MQTT broker for providing MQTT service.
  • gPhoto2 for capturing pictures from a connected camera.
  • ImageMagick for overlaying text on top of the captured images.
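Before wiring the flows together, it is worth checking that the Mosquitto broker is reachable from both Pis. The snippet below is a minimal smoke test in Python using the paho-mqtt client; the broker address and topic name are placeholders rather than the ones from my setup, and the same check can be done with the mosquitto_pub and mosquitto_sub command-line tools that ship with Mosquitto.

import paho.mqtt.publish as publish
import paho.mqtt.subscribe as subscribe

BROKER = "192.168.1.10"  # placeholder: IP address of the Pi running Mosquitto

# Publish a retained test message, then read it back from the same topic
publish.single("test/ping", "hello from the roof", hostname=BROKER, retain=True)
msg = subscribe.simple("test/ping", hostname=BROKER)
print(msg.topic, msg.payload)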

Hardware used for the implementation

Automatic Dependent Surveillance Broadcast (ADS-B)

ADS-B is a surveillance technology by which an aircraft determines its position through satellite navigation and periodically broadcasts this data via radio signals, allowing it to be tracked by ground stations. The broadcast can also be received by other aircraft in the vicinity, giving them awareness of nearby traffic and helping maintain safe separation.

The transmitted messages include information such as the flight identification, the ICAO 24-bit aircraft address, position, altitude, rate of climb, ground speed and track angle. ADS-B signals are broadcast at 1090 MHz, and an aircraft typically transmits these messages twice every second.
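To get a feel for what a decoder like dump1090 does with these messages, here is a small Python sketch using the pyModeS library (written by the author of The 1090 MHz Riddle, linked below). It is not part of my setup, which relies entirely on dump1090; the hex strings are sample ADS-B frames, and a matching even/odd pair of position messages is needed to work out an unambiguous position.

import pyModeS as pms

# Sample identification message: carries the ICAO address and the callsign
ident_msg = "8D4840D6202CC371C32CE0576098"
print(pms.adsb.icao(ident_msg))      # 24-bit ICAO address of the airframe
print(pms.adsb.callsign(ident_msg))  # flight identification / callsign

# Sample even/odd pair of airborne position messages from the same aircraft
msg_even = "8D40621D58C382D690C8AC2863A7"
msg_odd = "8D40621D58C386435CC412692AD6"
print(pms.adsb.altitude(msg_even))                          # barometric altitude in feet
print(pms.adsb.airborne_position(msg_even, msg_odd, 0, 1))  # (latitude, longitude)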

More information about ADS-B here:
Wikipedia article about ADS-B
and The 1090 MHz Riddle (quite detailed info about ADS-B)

Implementation

Schematic of the implementation

The dump1090 application running on the Raspberry Pi on the roof keeps decoding the ADS-B signals it receives and publishes data about all the aircraft in the vicinity. This data is available at http://localhost:8080/dump1090-fa/data/aircraft.json when accessed from the terrace Raspberry Pi itself; replace localhost with the Pi’s IP address to access it from another machine. A sample of the JSON is shown below:

{
  "now": 1565621401.1,
  "messages": 5386880,
  "aircraft": [{
    "hex": "800d36",
    "flight": "IGO314 ",
    "alt_baro": 6250,
    "alt_geom": 6125,
    "gs": 258.8,
    "ias": 220,
    "tas": 246,
    "mach": 0.368,
    "track": 150.9,
    "track_rate": 1.84,
    "roll": 26.4,
    "mag_heading": 144.8,
    "baro_rate": 3648,
    "geom_rate": 3648,
    "squawk": "0201",
    "emergency": "none",
    "category": "A3",
    "nav_qnh": 1012.8,
    "nav_altitude_mcp": 14016,
    "lat": 13.004059,
    "lon": 80.299309,
    "nic": 8,
    "rc": 186,
    "seen_pos": 0.0,
    "version": 2,
    "nic_baro": 1,
    "nac_p": 9,
    "nac_v": 1,
    "sil": 3,
    "sil_type": "perhour",
    "gva": 2,
    "sda": 3,
    "mlat": [],
    "tisb": [],
    "messages": 1025,
    "seen": 0.0,
    "rssi": -4.6
  },
  ... <array continues with the other aircraft being tracked>
  ]
}
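Reading this JSON outside the Node-RED flows is straightforward as well. As a quick illustration (and only that; the actual processing is done in Node-RED, as described later), a few lines of Python are enough to list the aircraft currently being tracked:

import json
import urllib.request

# Fetch the live aircraft list from dump1090-fa. Replace localhost with the
# roof Pi's IP address when running this from another machine.
url = "http://localhost:8080/dump1090-fa/data/aircraft.json"
with urllib.request.urlopen(url) as response:
    data = json.load(response)

for aircraft in data["aircraft"]:
    # Not every aircraft reports every field, hence the .get() defaults
    print(aircraft["hex"],
          aircraft.get("flight", "").strip(),
          aircraft.get("alt_baro"),
          aircraft.get("track"),
          aircraft.get("baro_rate"))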

Before we continue with how the JSON is used: FlightAware provides a cool user interface with their version of dump1090 (dump1090-fa). If you are an aviation enthusiast, this is an awesome visualization. Even if you are not, it provides an easy way to check whether your setup is performing well. Take a look at this screenshot:

Determining which aircraft and when to capture the photo:
My house is close to the descent path of a major international airport, and at peak hours dump1090 tracks about 20 aircraft. Out of these, we need to trigger the camera in the balcony when an aircraft passes parallel to the balcony and within the camera’s frame. The conditions used to determine this are as follows:

  • The heading of the aircraft is around 250˚. This is the heading of an aircraft when it is aligned with the runway.
  • The altitude of the aircraft is about 1,400 feet.
  • The aircraft is descending.

Once these conditions are met, the aircraft’s JSON information is published to an MQTT topic which the balcony Raspberry Pi is subscribed to. The balcony unit uses the gphoto2 app to trigger the camera. The image from the camera is downloaded, and ImageMagick is used to annotate the photo with the current date and time, the ICAO code of the airframe and the flight number.

Note that the altitude trigger was determined experimentally.
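For readers who prefer to see the trigger logic written out, here is a rough Python equivalent of what the roof unit does. This is only an illustrative sketch: the actual implementation is the Node-RED flow shown in the next section, and the broker address, topic name, polling interval and the tolerance bands around the heading and altitude are assumptions of mine rather than the exact values used.

import json
import time
import urllib.request
import paho.mqtt.publish as publish

AIRCRAFT_URL = "http://localhost:8080/dump1090-fa/data/aircraft.json"
BROKER = "192.168.1.10"     # placeholder: address of the MQTT broker
TOPIC = "aircraft/passing"  # placeholder: topic the balcony unit subscribes to

def passing_my_balcony(a):
    """Heading roughly 250 degrees, around 1,400 feet, and descending."""
    return (abs(a.get("track", 0) - 250) < 10
            and abs(a.get("alt_baro", 0) - 1400) < 200
            and a.get("baro_rate", 0) < 0)

while True:
    with urllib.request.urlopen(AIRCRAFT_URL) as response:
        data = json.load(response)
    for aircraft in data["aircraft"]:
        if passing_my_balcony(aircraft):
            # Hand the aircraft's JSON over to the balcony unit via MQTT
            publish.single(TOPIC, json.dumps(aircraft), hostname=BROKER)
    time.sleep(1)

A real implementation also needs to avoid publishing the same aircraft repeatedly while it remains inside these bounds; that kind of housekeeping is omitted here to keep the sketch short.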

Getting things done without writing a single line of code

All the logic required to detect an aircraft passing by my balcony and photograph it when it does was implemented without writing a single line of code. I used Node-RED, a visual, flow-based development tool originally developed by IBM. It is now an open-source project.

From Wikipedia: Node-RED is a flow-based development tool for visual programming developed originally by IBM for wiring together hardware devices, APIs and online services as part of the Internet of Things. Node-RED provides a web browser-based flow editor, which can be used to create JavaScript functions.

The Node-RED system was not as intuitive as I assumed it would be, considering it is a visual programming system. But once I got the hang of the basics, I was able to appreciate its power and what it could accomplish without writing a single line of code.

The Node-RED flow running in the roof unit

The flow program running on the roof unit.
The code exported by Node-RED. This is for the roof unit, which is connected to the ADS-B decoder. It analyses the data to pick an aircraft that is passing by my balcony.

The Node-RED flow running in the balcony unit

The flow program running in the balcony unit
The code exported by Node-RED. This is for the balcony unit, which captures the photo of the passing aircraft using the connected camera.

External applications called from the balcony unit

To capture photos, gphoto2 was used, and this is how it is invoked:

gphoto2 --capture-image-and-download --filename /tmp/aircraft.jpg --force-overwrite

To annotate the photo with information, ImageMagick’s convert tool is used, and this is how it is invoked:

# To annotate the photo with information
convert /tmp/aircraft.jpg -pointsize 158 -background Khaki label:'15/08/2019 10:30 AM / ICAO CODE : XXXXXX / FLIGHT : XXXXX' -gravity Center -append /tmp/aircraft-annotated.jpg
# To resize the annotated image and store it
convert /tmp/aircraft-annotated.jpg -resize 50% /home/pi/devel/camera/20190815-1030.jpg
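For completeness, here is a rough Python equivalent of what the balcony unit does with these two commands: subscribe to the MQTT topic, trigger the camera, then annotate and store the image. As before, this is only a sketch; the broker address and topic name are the same placeholders used earlier, and the real logic lives in the Node-RED flow shown above.

import json
import subprocess
from datetime import datetime
import paho.mqtt.subscribe as subscribe

BROKER = "192.168.1.10"     # placeholder: address of the MQTT broker
TOPIC = "aircraft/passing"  # placeholder: topic published by the roof unit

def on_message(client, userdata, msg):
    aircraft = json.loads(msg.payload)
    now = datetime.now()
    label = "%s / ICAO CODE : %s / FLIGHT : %s" % (
        now.strftime("%d/%m/%Y %I:%M %p"),
        aircraft["hex"].upper(),
        aircraft.get("flight", "").strip())

    # Trigger the camera and download the frame
    subprocess.run(["gphoto2", "--capture-image-and-download",
                    "--filename", "/tmp/aircraft.jpg", "--force-overwrite"])
    # Annotate with the date/time, ICAO code and flight number
    subprocess.run(["convert", "/tmp/aircraft.jpg", "-pointsize", "158",
                    "-background", "Khaki", "label:" + label,
                    "-gravity", "Center", "-append", "/tmp/aircraft-annotated.jpg"])
    # Resize the annotated image and store it with a timestamped name
    subprocess.run(["convert", "/tmp/aircraft-annotated.jpg", "-resize", "50%",
                    "/home/pi/devel/camera/%s.jpg" % now.strftime("%Y%m%d-%H%M")])

# Block forever, calling on_message for every aircraft published by the roof unit
subscribe.callback(on_message, TOPIC, hostname=BROKER)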

Bringing the Internet to my roof

One of the prerequisites for the project was to get the Internet to my roof from my home broadband connection. This was a bigger problem than I anticipated. The solution I finally settled on was a WiFi extender in the balcony and another WiFi router on the terrace, with the extender and the terrace router connected over wired LAN. This was the stable setup that worked for me.

The setup mentioned above is good enough to get the remote Pi devices onto the internet: they can communicate with MQTT and FTP servers and upload images to public servers. But I also needed to access the Raspberry Pi on the roof from the internet, to view the visualization of the aircraft being tracked (the FlightAware application runs locally on the Raspberry Pi). For this, I had to get a static IP from my internet service provider and set up DMZ/port forwarding on my WiFi routers.

Some takeaways from my experience with the project

  • Always have a remote power cycling ability for your remote projects. All electronics and all software at some point will require a hard reboot or full power cycle (off and on). If you can’t do it from your home, you might have to travel all the way to the remote site.
  • Stick to hardware and tech you are familiar with. I did not have experience with the small Raspberry Pi Zero. I tried using it remotely and was surprised to find out how different it was from the regular Pi. It just did not have the horsepower to decode the ADS-B signals.
  • Don’t cheap out on components you are going to place remotely. This will come back to bite you! Get the best you can afford.
  • Take weather seriously. Make sure that the remote unit can withstand the weather. Consider rain, heat and humidity when you choose the enclosure.

The enclosure also houses a camera and a dome. This was for a different, unrelated project.
