How to reduce your electricity bill → Real-Time Usage Dashboard → Python, Raspberry Pi, Azure, Docker, Streamlit — available on GitHub and Docker Hub

Ankit Choudhary
Published in The Startup
7 min read · May 11, 2020


Snapshot of the Reporting Dashboard

Project: A utility that continuously monitors a consumer's electricity usage and plots the various usage patterns on a dashboard. The entire application is built in Python and is hosted on a Raspberry Pi. I have made the code base available on GitHub, and the builds can be found on Docker Hub.

The following information is currently displayed on the dashboard:
→ Meter info (property address, meter ID, etc.)
→ Usage in the current cycle
→ Time series of usage across the past 24 hours
→ Last billed reading
→ Time series of usage across the past 45 days
→ Billing patterns for the past 12 months
→ Time series (at 15-minute intervals) for an entire day (T-2)

Cost to build: It took me close to 10 days to research the usage data, source the data-provisioning API, and build the dashboard. I also procured a pre-owned Raspberry Pi 4B from Micro Center for $40; you can get a brand new one from here for $55. And finally, a monitor: I had an old Dell monitor lying around, and that came in handy. Optionally, you can create a free Azure cloud subscription.

Inspiration: Getting the best deal on electricity by controlling usage.
I am on an electricity plan (Maxx Saver 12) from 4changeenergy where I get a bill credit of $90 if my usage crosses the 1000 kWh benchmark.
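To make the incentive concrete, the plan's structure can be sketched as a tiny function. The per-kWh rate and base charge below are hypothetical placeholders, not 4changeenergy's actual Maxx Saver 12 tariff; only the $90 credit at the 1000 kWh benchmark comes from the plan described above.

```python
def estimate_bill(usage_kwh, rate_per_kwh=0.14, base_charge=9.95,
                  credit=90.0, credit_threshold=1000):
    """Rough bill estimate for a usage-credit plan.

    rate_per_kwh and base_charge are placeholder numbers; the $90
    credit at 1000 kWh mirrors the Maxx Saver 12 benchmark. Assumes
    the credit applies once usage reaches the threshold.
    """
    bill = base_charge + usage_kwh * rate_per_kwh
    if usage_kwh >= credit_threshold:
        bill -= credit
    return round(bill, 2)

# With these placeholder rates, undershooting the benchmark costs
# more than slightly exceeding it:
print(estimate_bill(853))   # ~129.37, no credit
print(estimate_bill(1050))  # ~66.95 after the $90 credit
```

The odd consequence (which drives the rest of this project) is that the bill is non-monotonic: using a bit more electricity can make the total cheaper.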

Electricity Charges — Initial Billing Cycle

When I initially started on this plan, I ended up using 853 kWh in the first billing cycle and was charged $139.

The usage was lower than I expected because my flatmate was traveling and I was staying alone. I ended up losing the $90 credit :(

Electricity Charges — Subsequent Billing Cycle

In the subsequent month, my flatmate was back, and this time we made sure we weren't sacrificing any comfort; we partied harder than ever before. Unfortunately, we ended up consuming 1343 kWh, way beyond our benchmark. We did get the $90 credit, but it came with a guilty feeling of overusing resources.

That's when I decided to build a utility that keeps me posted on my usage status, letting me consume wisely and giving me actionable insights.

And that’s when I learnt about the Smart Meter Texas portal. It’s a site that stores daily, monthly and 15-minute interval energy data recorded by digital electric meters (commonly known as “smart meters”), and provides secure access to that data to customers and authorized market participants.

I explored various ways of extracting data from the portal and learned about its internals by digging through the Developer mode of my Safari browser. I was able to figure out the details of the calls being made to the back-end server, the various APIs, the request limits, and much more.

The details of the API that I am polling for this project can be found here.

A Quick Look at the Smart Meter Texas Portal

Challenges: Following the developer's mantra, I started by googling around for any previous work in this field. I did come across a few repositories working with Smart Meter Texas; however, all of them were web-scraping projects.

I am not a big fan of web scraping as a way of acquiring data points: it is neither reliable nor scalable, and it is prone to break with any UI change made by the host. I spent many hours in the Developer mode of my Safari browser and was able to trace the authentication calls, and subsequently all calls made by the website to its back-end servers.

Challenge 1: Requests to the API were simply ignored because they weren't made from a browser on the Smart Meter Texas portal.

Solution 1: I had to mock the request headers to replicate a browser, as below:

headers = {
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.90 Safari/537.36',
    'Origin': 'https://www.smartmetertexas.com',
}

Challenge 2: I tried making the same POST calls to the APIs with the same set of parameters (headers + body + endpoint) using Python's requests module, but I was unable to authenticate. This is because the website uses cookies to handle its flow; when I disabled the browser's cookie permissions, smartmetertexas.com wouldn't even load.

Solution 2: I made use of the Session object from the requests module to handle the cookies. I make an initial call to the portal to fetch the cookies, making it believe the incoming request is from a web browser, and then reuse the same cookies when calling the authentication API. These steps can be clearly seen here on the GitHub repo.
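The cookie-handling flow can be sketched as below. The `'/api/auth'` path in the commented usage is a placeholder, not the portal's actual endpoint; see the repo for the real calls.

```python
import requests

BASE_URL = 'https://www.smartmetertexas.com'

# Browser-like headers, as in Solution 1 above.
BROWSER_HEADERS = {
    'User-Agent': ('Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) '
                   'AppleWebKit/537.36 (KHTML, like Gecko) '
                   'Chrome/77.0.3865.90 Safari/537.36'),
    'Origin': BASE_URL,
}

def make_session():
    """Build a requests.Session that looks like a browser.

    A Session persists cookies across calls, so the cookies set by the
    portal's landing page are replayed automatically on the later
    authentication and data requests.
    """
    session = requests.Session()
    session.headers.update(BROWSER_HEADERS)
    return session

# Usage (network calls commented out; '/api/auth' is a placeholder path):
# s = make_session()
# s.get(BASE_URL)  # the initial call primes the cookie jar
# s.post(BASE_URL + '/api/auth',
#        json={'username': '...', 'password': '...'})
```

The key design point is that no cookie is ever handled by hand: the Session's cookie jar does the bookkeeping between the priming call and the authenticated ones.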

After solving the authentication challenges, I was able to make calls to various endpoints of the API to fetch the different data points. That covers the data ingestion part. Let’s move on to the visualizations.

Overview of the Setup

Dashboard: Building the dashboard was another major challenge, primarily because of my lack of front-end skills. I tried out Tableau Desktop/Public, Power BI, etc., but couldn't opt for any of them, as they all come with a price tag that would make this project no longer budget-free.

The choice of dashboard framework had to be judicious, as I wanted this utility to be a lightweight tool: the aim is to run the application on a single-board computer, and loading heavy software on Raspbian would slow down the system and heat up the board.

To the rescue came Streamlit, a fast, easy-to-use, pure-Python framework.

The data-ingestion framework stores all the data for the trends as CSV files, which makes it very easy to load them into pandas DataFrames.
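For example (the column names here are illustrative, not the exact schema the ingestion code writes):

```python
import io
import pandas as pd

# A stand-in for one of the trend CSVs written by the ingestion side.
csv_data = io.StringIO(
    "timestamp,usage_kwh\n"
    "2020-05-09 00:00,0.42\n"
    "2020-05-09 00:15,0.38\n"
    "2020-05-09 00:30,0.51\n"
)

# One call turns the file into a DataFrame ready for plotting.
df = pd.read_csv(csv_data, parse_dates=['timestamp'])
total_usage = df['usage_kwh'].sum()

# In the Streamlit app, charting the series is then a one-liner:
# st.line_chart(df.set_index('timestamp')['usage_kwh'])
```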

Streamlit provides a very fast and efficient way of plotting charts and visualizations: the initial dashboard was up and running within the first few hours of development.

Further, Streamlit opens a port on the network the computer is connected to and shares a link accessible from all other devices on that network. This dashboard is thus accessible from the phones and laptops of all my flatmates. Mobility is definitely a cherry on the cake.
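For instance, the app can be launched so that it binds to all interfaces and serves the whole home network (`app.py` is an assumed entry-point name; the flags are standard Streamlit server options):

```shell
# Serve the dashboard to every device on the home network.
streamlit run app.py --server.address 0.0.0.0 --server.port 8501
# Other devices then browse to http://<raspberry-pi-ip>:8501
```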

Screenshot of the Dashboard from my iPhone’s browser

Here's how the setup at my home looks:

Used a spare monitor, removed the stands, turned the screen to Portrait mode
Assembling the Raspberry Pi 4. Fascinating!
The Final Product. A fully loaded computing platform.

Here’s a video of the live dashboard.

Risks: One, and a major one! I am relying on the Smart Meter Texas APIs. Now that I am making my code base publicly available, if Smart Meter Texas decides to change its policies and access methods, this project will break. I have made sure the application is built ethically: it only looks at the concerned user's own data and respects the imposed API limits. The rest is up to the provider to decide. 🤞

Enhancements: A lot can be added to this project. A major one I am planning is prediction analysis: leveraging the available data to compute the expected billing amount.
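As a naive starting point for that prediction, a straight linear extrapolation is enough to flag whether a cycle is trending past the benchmark. This is a placeholder, not a real model, and the 30-day cycle length is an assumption:

```python
from datetime import date

def project_cycle_usage(usage_so_far_kwh, cycle_start, today, cycle_days=30):
    """Linearly extrapolate usage to the end of the billing cycle.

    A stand-in for real prediction analysis: it scales the average
    daily usage observed so far to the full (assumed 30-day) cycle.
    """
    days_elapsed = (today - cycle_start).days
    if days_elapsed <= 0:
        return 0.0
    daily_avg = usage_so_far_kwh / days_elapsed
    return round(daily_avg * cycle_days, 1)

# 420 kWh in the first 10 days projects to 1260 kWh for the cycle,
# comfortably past the 1000 kWh benchmark:
print(project_cycle_usage(420, date(2020, 5, 1), date(2020, 5, 11)))
```

A smarter version would weight recent days more heavily, or factor in weather and weekday/weekend patterns from the 15-minute interval data.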

We could also incorporate offers from the various energy providers participating in Smart Meter Texas and help a consumer choose the plan best suited for their household. A lot can be done!

Stay tuned: I will be publishing separate follow-up posts on setting up Docker, the Raspberry Pi, Azure Apps, etc. If there's anything specific you'd like me to write about, please drop a note. ✌️

“The best way to cheer yourself is to try to cheer someone else up.”— Mark Twain

Hope you liked my first blog on Medium; please let me know your thoughts. Got a question? Leave a comment.



Programmer & Architect @ Deloitte in Python, Big Data, Azure/AWS by Profession. Biker, Chef & Philanthrope by Passion. Student Pilot.