seattle.watch, an exploration in IBM Bluemix, a Raspberry Pi and some Time-Lapse magic

Jesse Proudman
16 min read · Feb 21, 2016


Following the acquisition of Blue Box by IBM, one of the first projects I wanted to tackle was my own exploration of the IBM Cloud Bluemix service catalog. As our combined Blue Box strategy includes the integration of these services into our Dedicated and Local product offerings, getting truly hands-on felt necessary to see what we were working with.

Now, I could have built a small “Hello World” application, pushed it to Bluemix and called it quits, but that seemed like the fastest path to total boredom. I wanted something that could live on, something that could evolve and something I could really sink my teeth into. Fortunately, we had recently moved, and our new house is quite exposed to the weather. Last November, I bought a weather station so I could geek out on the stats as the wind buffeted our house. That weather station integrates with Weather Underground (which happened to have been recently acquired by IBM), and it provided an ideal opportunity to build a full application tying together many IBM cloud services.

And so, my weather-cam project was born! You can check it out at http://queen-anne.seattle.watch

Isn’t this fun?!?

The Details
The weather-cam utilizes a variety of services, both IBM Cloud specific and third party:

  • IBM Bluemix Cloud Foundry Service
  • IBM Bluemix Auto-Scaling Service
  • IBM Bluemix Object Storage Service
  • IBM Bluemix Virtual Machine Service Beta
  • IBM Weather Underground’s Personal Weather Station
  • Memcached Cloud
  • Twilio
  • Youtube

Because this project has drawn substantial outside interest, I decided to write a blog post describing the details of the build. I’ve also open sourced all the code I used here. The source code repository itself contains a README with details I don’t go into in depth in this post, so I encourage you to check that out.

And apologies now — this is a long and very technical post. This project seemed simple at first, but as I began to write it up, I realized that there are many moving pieces. So with that warning, let’s dive in:

Reference architecture of your future Weather Cam

Bluemix

This whole project began as a way for me to explore a variety of IBM Bluemix services, so Bluemix is where I will begin. If you’re looking to replicate this project, you can get started by signing up for a free Bluemix account here.

I’ve used Cloud Foundry before, but the Bluemix platform felt magical.

I’ve been a strong proponent of Cloud Foundry for years, so I had some idea of what to expect in getting started. However, in my experimentation, the simplicity, integration and capabilities offered by Bluemix truly felt magical. From the ease of the online code-editing capabilities to the simplicity of using Cloud Foundry to go from idea to implementation in minutes, Bluemix really empowered my development of this idea. So: mission accomplished on the exploration front.

Let me walk you through each Bluemix component and demonstrate how each was utilized for this project:

Bluemix Virtual Machine Service Beta
To start, I needed a virtual machine to do some of the heavy lifting of video creation. At the beginning of this project, I participated in the Bluemix Virtual Machine Service Beta (powered by OpenStack). That Virtual Machine Beta is now closed to the general public (it’s coming back soon), so if you’re following along at home, for this portion you just need a small virtual server running Ubuntu 14.04 from any provider (give SoftLayer a try). This VM is referred to as cam-control throughout the rest of this document.

On cam-control, I started by installing a small software stack (full instructions can be found here). This stack includes the OpenStack Swift client, which provides the interface to IBM Bluemix’s Object Storage, as well as the video encoding tools used to build the time-lapses that then get uploaded to Youtube.

Bluemix Object Storage

The next step is to create the Object Storage account. Once you’re logged into Bluemix, select “Catalog” and then “Object Storage”. Select the Free plan and click “Create”. The system will then present you with a JSON blob containing your service credentials. Save that data; you’ll need it to populate the configuration files for the Raspberry Pi camera and cam-control.

With these credentials, go back to cam-control and create a configuration file in your home directory:

$ vi ~/.weather-cam

export OS_USER_ID=<The userId from the JSON>
export OS_PASSWORD=<The password from the JSON>
export OS_PROJECT_ID=<The projectId from the JSON>
export OS_AUTH_URL=https://identity.open.softlayer.com/v3
export OS_REGION_NAME=dallas
export OS_IDENTITY_API_VERSION=3
export OS_AUTH_VERSION=3

Once that’s complete, confirm you can access the Bluemix object store:

$ source ~/.weather-cam
$ swift stat
Account: AUTH_09efdd634bf5134ebdf34ff6a196db27
Containers: 46
Objects: 76794
Bytes: 138194993603
Containers in policy "standard": 46
Objects in policy "standard": 76794
Bytes in policy "standard": 138194993603
X-Trans-Id: tx9ced6c8280c64b699b591-0056c7a8c6
X-Account-Project-Domain-Id: 4c7ab851488c49f2bb5d241cc5b60aef
X-Timestamp: 1450056663.79268
Content-Type: text/plain; charset=utf-8
Accept-Ranges: bytes

OpenStack Swift organizes objects into “containers” rather than directories. For this project, we maintain one public container called “weather-cam”, and each day’s images get uploaded to their own container, created on the fly. Once you have access to the Object Store, you can create the default weather-cam container and make it world readable:

$ swift post weather-cam -r '.r:*'
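
Each day’s images then go into their own container, created on the fly. As a sketch of how a date-stamped container name might be derived (the naming scheme here is my assumption, not necessarily what the repo’s scripts use):

```ruby
require "date"

# Hypothetical per-day container naming, e.g. "weather-cam-2016-02-21".
def daily_container(date = Date.today)
  "weather-cam-#{date.strftime('%Y-%m-%d')}"
end

# Before the day's first upload, the container would be created on the fly:
#   system("swift post #{daily_container}")
```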

The last step is to determine the URL prefix for this container:

$ swift auth
export OS_STORAGE_URL=https://dal.objectstorage.open.softlayer.com/v1/AUTH_09efdd634bf5134ebdf34ff6a196db27

Given the above, the URL for your weather-cam container (and weather.jpg) will be:

https://dal.objectstorage.open.softlayer.com/v1/AUTH_09efdd634bf5134ebdf34ff6a196db27/weather-cam/weather.jpg

Once the Object Storage account is set up, we can proceed to deploying the weather-cam app itself to Bluemix’s Cloud Foundry PaaS…

Bluemix’s Cloud Foundry Service
Now that you have your object storage account set up, it’s time to publish the weather-cam web application. The weather-cam app is a very simple Ruby Sinatra app, and getting it up and running is dead simple using the Cloud Foundry command line.

Begin by downloading the Cloud Foundry command line interface. Next, clone the weather-cam repository from GitHub. Then push your app to Bluemix.

In the example below, make sure to replace the variables in brackets with your specific information. <Your App Name> can be any name you choose (assuming it’s not already taken).

$ git clone git@github.com:blueboxjesse/weather-cam.git
$ cd weather-cam
$ cf api https://api.ng.bluemix.net
$ cf login
API endpoint: https://api.ng.bluemix.net
Email> <Your Bluemix username>
Password> <Your Bluemix password>
Authenticating...
OK
Select an org (or press enter to skip):
1. Example Org
Org> 1
Targeted org Example Org
Targeted space Example Space

API endpoint: https://api.ng.bluemix.net (API version: 2.40.0)
User: j.proudman@us.ibm.com
Org: Example Org
Space: Example Space
$ cf push <Your App Name>

Now that the app is online, you need to customize it using a series of Cloud Foundry environment variables. The Object Storage URL is what you determined in the steps above, and the Weather Underground Station ID is the ID (i.e. KWASEATT457) of the station closest to you (or your own station, if you’re setting that up as described in the Hardware section below).

$ cf set-env <Your App Name> WEATHER_JPG_URL <Object Storage URL>
$ cf set-env <Your App Name> WEATHER_UNDERGROUND_ID <Weather Underground Station ID>

My particular implementation also connects to our Tigo Energy solar array to display the kilowatt-hours of electricity generated each day. It uses a memcache service (procured through the Bluemix service catalog) to cache that data throughout the day. If you’re not using those (which would make sense), skip to the next section. Excluding these environment variables will simply ensure the solar array data is not displayed.

$ cf set-env <Your App Name> MEMCACHE_PASS <pass>
$ cf set-env <Your App Name> MEMCACHE_SERVER <server>
$ cf set-env <Your App Name> MEMCACHE_USER <username>
$ cf set-env <Your App Name> TIGO_URL "https://user:pass@api.tigoenergy.com/api/data?cmd=list&sysid=<sysid>"
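
Since the solar number only changes slowly over the day, a read-through cache keeps the Tigo API traffic down. Here’s a sketch of that caching idea — the key scheme, TTL and stubbed Tigo call are all my assumptions (in production the cache client would be a Dalli::Client pointed at the Bluemix memcache service; it’s duck-typed here so any get/set client works):

```ruby
require "json"

# Read-through cache: return today's kWh from memcache if present,
# otherwise hit the Tigo API and store the result with a TTL.
def cached_solar_kwh(cache, ttl = 600)
  key = "tigo:kwh:#{Time.now.strftime('%Y-%m-%d')}"  # hypothetical key scheme
  if (hit = cache.get(key))
    JSON.parse(hit)["kwh"]
  else
    kwh = fetch_kwh_from_tigo  # the real code hits TIGO_URL over HTTPS
    cache.set(key, JSON.generate("kwh" => kwh), ttl)
    kwh
  end
end

# Placeholder for the real Tigo Energy API call.
def fetch_kwh_from_tigo
  12.4
end
```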

With the environment variables set, you can re-publish your application with a simple command…

$ cf push <Your App Name>

And you’re done!

You should now be able to see the results of your effort by visiting http://<Your App Name>.mybluemix.net.

Magic, isn’t it?

For fun, I ended up buying a vanity domain “seattle.watch”. Bluemix makes it super easy to route your custom domain to your app, and so with a few simple clicks, you can find my cam at http://queen-anne.seattle.watch.

Bluemix’s Auto-Scaling Service
For fun (who knows, maybe this weather cam becomes ultra popular), I also set up Bluemix’s Auto-Scaling service. This free service observes your application and adds app server capacity should certain conditions be met. The service itself can be added through the Bluemix service catalog. It’s a very simple service, so I won’t go into a ton of detail here, but I suggest you check it out.

And that’s it. You’ve now got a fully functional Object Storage account to store each day’s images, and the display application to show off your work. With all of that set up, we’ll now move on to the hardware and software.

The Hardware Platform

There are two main hardware components that power this offering: The Weather Station itself and the Weather Camera, powered by a Raspberry Pi.

The Weather Station

I selected the “Ambient Weather WS-1200 Observer Solar Powered Wireless Weather Station”. I did a lot of comparison shopping, and this unit felt like the best bang for the buck for the features I was looking for.

The WS-1200, perched on top of our roof.

The WS-1200 provides a host of measurements including:

  • Wind Speed and Direction
  • Temperature and Humidity
  • Barometric Pressure
  • Rain Fall
  • Solar Radiation Levels
  • Dew Point

Not only that, but it was super simple to install… I clamped it to a bar we already had on our roof, installed the 3 included AA rechargeable batteries and plugged in the console/screen inside the house. The batteries recharge from the solar panel on the station itself, the Weather Station talks to the console over the 915MHz band and then the console talks to Weather Underground over Wi-Fi. No power wires to run or data wires to string up… The console’s user interface itself is a bit clunky and dated, but it certainly gets the job done and the Weather Underground user interface provides a more attractive view.

Once the Weather Station was installed, I simply signed up for a Weather Underground Personal Weather Station account and entered my credentials into the WS-1200 base station.

Weather Underground has done a great job with their data presentation. Also, it’s windy here today!

The Camera: A Lesson in Pi

The camera setup was more involved and thus more fun to build. I looked at a bunch of the web-based / IP cameras on the market, and none of them seemed up to the task. I really wanted to control how the process worked, where the images were stored, etc… and at the same time, I wanted to ensure I could create daily time-lapses of the weather.

I also secretly wanted a reason to play with my first Raspberry Pi.

I ultimately decided I could build my own solution using the following parts:

I assembled the Pi inside the housing in a pretty rudimentary way with an enormous quantity of hot glue:

Not the world’s most beautiful installation, but I’m more of a Software guy anyway…

With the installation complete, I setup the Raspberry Pi to boot to a command prompt, setup the wireless to automatically connect to my home’s Wi-Fi network and to come online with a static IP. This then became the development platform on which I wrote much of the software discussed below.

The software installation instructions for the Raspberry Pi itself are covered in the weather-cam’s README.

Through this process, I did learn a few key lessons:

  1. I did not use the heater / fan that came with the enclosure… The Pi stayed warm enough through the winter that I haven’t found it necessary to hook up a DC power supply. But the option exists with the items in the enclosure in case that changes over the summer.
  2. The photo above was the first revision, and I subsequently learned that I needed to get the camera lens closer to the glass to avoid reflections. Put your lens as close to the glass as possible.
  3. If it rains where you live, I suggest you invest in Rain-X. I found for the first week or two, the lens on the housing would get covered in beaded raindrops that wouldn’t clear. With the Rain-X, the raindrops still appear, but they dissipate faster (you can see it in one of the time lapse videos).
  4. The camera on the Pi has a red LED on the front that turns on when taking a photo. This obviously caused a red flash in the picture from the reflection off the enclosure’s glass. You can turn that LED off by adding “disable_camera_led=1” to /boot/config.txt and rebooting the Raspberry Pi.
  5. The Wi-Fi adapter has a blue LED that you cannot turn off. I used black tissue paper to cover it up (otherwise I was getting a blue cast in the photos at night).
  6. The Pi 2B is fast enough for general processing, but compositing multiple images fast enough to get 2 frames a minute was a bit of a challenge. I had to do some minor overclocking on the Pi to make the image processing fast enough. This took some finesse, since I was having stability issues with the Pi (I’m still unsure if those were due to the overclocking or to the system dropping off the network because of where it sits on the roof). I ended up using “sdram_freq=450” and “core_freq=450” in /boot/config.txt, and this seems to work well.

With that all done, I mounted the camera on the roof and went to work on the Software portion of the equation.

The Weather-Cam Software

I’ve always been a bit of a software hack… I can get almost anything to work, but it may not be the prettiest thing you’ve ever seen. Writing the software for this project ended up being quite fun — but cleaning it up so I could share it with all of you without being totally mortified? A bit more of a challenge.

That said, you can find the entire code set on GitHub here.

The final version of this code resulted in 3 main components:

  1. The cam-pi code that runs on the Raspberry Pi and takes and uploads the individual photos.
  2. The cam-control code that runs on the cam-control server and does the compositing, creation and uploads of the time-lapse videos to Youtube.
  3. The weather-cam Sinatra app that runs on IBM Bluemix and provides the user interface that people can watch.

Both the cam-pi and cam-control code are powered by cron jobs that run on their respective systems. The details of their installation and operation can be found in the README here.

I’ll dive into each component below and provide a high level overview of the functionality:

cam-pi — “aka where the magic happens…”

The cam-pi codebase runs on the Raspberry Pi and is tasked with taking the individual photos and ensuring they make it from the Pi into the Bluemix Object Storage platform. The code itself can be found here on GitHub.

cam-pi is made up of 4 main components:

cam.rb
The heart of this code is cam.rb, a simple Ruby program.

Because the camera is pointed at the skyline, it receives direct sunlight and a whole host of unique lighting conditions. Additionally, the system takes photos 24×7, so the exposures need to differ depending on the time of day. I experimented with a bunch of different solutions, and it took a while to get this part right (and I’m still not 100% satisfied), but what I have now seems to be working fairly well.

cam.rb calls out to Sunrise Sunset’s API and, based on my latitude and longitude, determines when sunrise and sunset actually occur. Using that data, the software then breaks the day into 3 sections: daylight, twilight and night.

During daylight, the Pi takes 3 consecutive photos with different exposure settings (using raspistill’s “EV” option), blends them together in an HDR-style composition and then adjusts the brightness / contrast of the resulting image using ImageMagick.

During twilight, the Pi takes 2 consecutive photos and blends them together using the same HDR style composition.

During nighttime, the Pi takes a single long exposure.
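
The three-way split above can be sketched as a small classifier. In the real cam.rb the sunrise and sunset times come from the Sunrise Sunset API; the 30-minute twilight window on either side is my assumption:

```ruby
require "time"

TWILIGHT = 30 * 60  # seconds on either side of sunrise/sunset (assumed)

# Classify a time into the three capture modes described above.
def period(now, sunrise, sunset)
  if now >= sunrise + TWILIGHT && now <= sunset - TWILIGHT
    :daylight   # 3-shot HDR-style composite
  elsif now >= sunrise - TWILIGHT && now <= sunset + TWILIGHT
    :twilight   # 2-shot composite
  else
    :night      # single long exposure
  end
end
```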

I then ran into the challenge that each of those capture modes takes a different amount of time, so I was getting fewer frames per minute during the day and too many at night. I timed the entire operation and realized that the max FPM I could shoot during the day (due to the cost of compositing on the Pi) was 2, so I set the twilight and nighttime captures to follow suit.

cam.rb is called twice a minute by cron (through cam_wrapper.sh) to get 2 photos a minute (approximately every 30 seconds).

It then calls upload.sh to push the current shot into the public container, overwriting weather.jpg.

upload.sh
This script has two jobs: the first is to ensure the weather.jpg that displays on the weather-cam app is up to date, and the second is to ensure that all photos taken that day make it into IBM Bluemix’s Object Storage. With the Raspberry Pi stuck inside a metal enclosure up on my roof, its Wi-Fi connection was a bit spotty. I’ve been working to tune that since installation, but I found I would get a fair number of timeouts on image uploads. As such, upload.sh has a bunch of retry logic built in to ensure the entire day’s imagery is captured and saved.

As noted above, upload.sh is called within cam.rb for updating the “current status” weather.jpg and then every 10 minutes via Cron for that bulk upload. Additionally, one last run of upload.sh occurs at 23:59 every day to ensure all the photos for that day have been uploaded.
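
The retry idea can be sketched in Ruby (upload.sh itself is a shell script; the try count and wait values here are illustrative, and the operation is passed in as a block so anything flaky can be retried):

```ruby
MAX_TRIES = 5  # illustrative; the real script's limit may differ

# Run the given block, retrying on any error with a short pause between
# attempts. Returns the number of attempts it took; re-raises after the
# final failure.
def with_retries(tries: MAX_TRIES, wait: 2)
  attempts = 0
  begin
    attempts += 1
    yield
    attempts
  rescue StandardError
    if attempts < tries
      sleep wait
      retry
    end
    raise
  end
end

# In upload.sh terms, the block would be something like:
#   with_retries { system("swift upload #{container} #{file}") || raise "upload failed" }
```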

ssh_tunnel.sh
This script creates and maintains a reverse SSH tunnel back to cam-control to assist with monitoring that the Pi is online, and to provide remote access to the Pi if I’m not on the home network.

It’s not totally necessary and is an artifact of some of the debugging work I’ve had to do.

network_reconnect.sh
Crap Wi-Fi is crap. This script is called on boot and forces the Pi to attempt to reconnect to the Wi-Fi if for any reason it dropped.

cam-control — “aka Lights, Camera, Action!”

The cam-control code runs on the “cam-control” server. Its primary job is to create the daily time-lapse video and push it to Youtube. It’s also in charge of some basic monitoring to ensure the “current status” weather.jpg image is being updated. Its source code can be found on GitHub here.

cam-control is made up of 2 main components:

video-creator.sh
video-creator is called via cron once a day at 5am. Its job is to download the previous day’s images from the IBM Bluemix Object Store, composite them into a video and upload that video to Youtube.

It took some work to find a media compression program that would work reliably and create HD video that wasn’t 4GB in raw size. I originally used mencoder, but ran into issues with it crashing while encoding the video, and ultimately settled on a multi-step process that uses mencoder to create the original file and avconv to shrink the file size down.
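
The two-step encode can be sketched as command builders. The exact flags below are my guesses at a typical mencoder-then-avconv pipeline, not necessarily the ones video-creator.sh actually uses:

```ruby
# Step 1 (assumed flags): stitch the day's JPEGs into a raw AVI at 30 fps.
def mencoder_cmd(jpg_glob, raw_out)
  %(mencoder "mf://#{jpg_glob}" -mf fps=30 -ovc lavc -o #{raw_out})
end

# Step 2 (assumed flags): re-encode with x264 to shrink the file size.
def avconv_cmd(raw_in, final_out)
  %(avconv -i #{raw_in} -c:v libx264 -crf 23 #{final_out})
end

# video-creator.sh would then run, roughly:
#   system(mencoder_cmd("*.jpg", "raw.avi")) && system(avconv_cmd("raw.avi", "timelapse.mp4"))
```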

This program uses the youtube-upload program found on GitHub here. YouTube’s new API and OAuth authentication can be a bit tricky, so make sure to follow the instructions provided.

video-update-check.sh
This script’s job is to make sure the weather.jpg image that sits in the Bluemix Object Storage public container is up to date. I found throughout the process that the Pi would sometimes drop offline, crash or stall out uploading images, so I wanted a watchdog in place until I felt more confident in its reliability. Called via cron, it uses Twilio to send my phone a text message if that image is stale.
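
The freshness test at the heart of this script can be sketched as follows. The 15-minute threshold is my assumption, and in the real script a stale image triggers the Twilio SMS:

```ruby
require "time"

STALE_AFTER = 15 * 60  # seconds; assumed threshold

# weather.jpg is considered stale if its Last-Modified timestamp is
# older than the threshold. The timestamp would come from a HEAD request
# against the public object URL.
def stale?(last_modified, now = Time.now)
  (now - last_modified) > STALE_AFTER
end
```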

weather-cam — “aka, show me the money”

The last chunk of code is the Sinatra app that sits on the Bluemix Cloud Foundry installation and serves up the HTML interface to view the image. When I first started, I was content to just have a static JPG I viewed, but as time went on, I wanted that JPG to update dynamically, and to bring together all the pieces from my system (Tigo Solar Data, Weather Underground PWS Data and my cam image) into a unified interface.

The motion blur on planes at night is one of my favorite artifacts… Can you spot them in the time-lapse videos?

The code itself is pretty straightforward. I’ve tried to make it a bit more generic so others can try it out, by abstracting a number of variables into Cloud Foundry’s environment variable configuration. These variables are read during deployment (using Sinatra-Asset-Pipeline) to hard-code items into the CSS and JavaScript, and are read at runtime within the Sinatra app itself.
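
As a sketch of how those runtime variables might be read inside the Sinatra app — only WEATHER_JPG_URL, WEATHER_UNDERGROUND_ID and the optional memcache/Tigo variables appear in this post, and the hash keys here are my own naming:

```ruby
# Gather the app's configuration from the Cloud Foundry environment.
# Passing `env` in (defaulting to ENV) keeps the helper testable.
def weather_config(env = ENV)
  {
    jpg_url:    env["WEATHER_JPG_URL"],
    station_id: env["WEATHER_UNDERGROUND_ID"],
    tigo_url:   env["TIGO_URL"],  # optional; solar display is skipped if unset
  }
end
```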

I’m pretty happy with the final result. If you load the URL, you’ll get a live updating feed of the cam (updated every 30 seconds or so), live weather data displayed in the bottom right corner, and live solar accumulation data in the bottom left.

In Closing and Next Steps

It’s funny how the simplest of projects can become insanely complicated in a rather short period of time. I accomplished my goal of exploring the Bluemix platform, and I ended up building something I’ve really come to enjoy.

So what’s next for the project?

The most obvious revision ahead is to move the video processing from cam-control into a container running in the IBM Cloud container service. The production run only takes about 10 minutes, so there’s little reason to keep a full virtual machine running 24×7. I’ve also fixed many of the stability issues with our Wi-Fi, so needing a remote shell into the Pi isn’t as big of a deal. This will leave a gap around monitoring the current weather.jpg status, so I’ll have to continue to explore the Bluemix platform to find a solution for that.

Additionally, because of all the challenges with my Wi-Fi, I’m thinking about bypassing my Wi-Fi and using Ethernet with a PoE hat to inject data and power directly to the Raspberry Pi.

With the recent Ustream acquisition by IBM, there’s a new opportunity to explore what a live stream from the camera could look like.

The possibilities are endless…

I hope you’ve enjoyed watching the time-lapse videos as much as I’ve enjoyed creating them. This was way more fun than I ever anticipated it would be.

If you end up building your own implementation using bits and pieces of this, I’d love to hear about it. Feel free to contribute to the GitHub project, and if you have any questions, please don’t hesitate to reach out.

— Jesse
