Monitor your Heroku app in Datadog
At Unsplash, we love Heroku, but it’s not always easy to monitor what’s happening inside your Heroku app.
Most of our DevOps monitoring happens in Datadog. Unfortunately, there’s no easy, clear way to get Heroku data points into Datadog for monitoring. By data points, we mean measurements like dyno memory usage, database load, or even in-app measurements like query times.
A while ago, we found a lonely git repo that creates that missing bridge between Heroku and Datadog pretty neatly by parsing Heroku logs and sending data to Datadog accordingly.
Bristlr/heroku-datadog: Send your Heroku logs to DataDog (github.com)
So we forked it and quickly hacked around it to do some of the things we needed it to do… and that’s it. We started collecting Heroku data points in Datadog, monitoring our web and worker dynos, our Heroku Postgres databases and a couple of custom metrics like the execution time of our ETL scripts.
Today, as we’re adding a new metric to measure, we thought it’d be worth refactoring the code so that anyone can easily build their own Heroku-Datadog log parser in NodeJS.
There you go:
unsplash/heroku-datadog: Send your Heroku logs to DataDog (github.com)
How does it work?
Set up a log drain in Heroku that points to the log-parsing app. Then, whenever Heroku logs something (a web request, a database or dyno status update, an error, or a console log from your application), the log line gets pushed to your parsing app.
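For example, you can attach the drain with the Heroku CLI. This is a sketch; the drain URL, credentials, and app name below are placeholders for your own setup:

```shell
# Attach a log drain to your production app, pointing at your parsing app.
# The URL and "my-production-app" are placeholders, not real endpoints.
heroku drains:add https://user:password@my-log-parser.herokuapp.com/ --app my-production-app

# Confirm the drain was added.
heroku drains --app my-production-app
```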
Your parsing app reads every single log line and reacts accordingly. Out of the box, the app parses the dyno metrics logs and the Heroku Postgres database metrics logs.
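As a minimal sketch, a per-line parser for one of those dyno metrics lines might look like this. The function name and metric name are our own, not from the repo; the `sample#memory_total` field is the format Heroku uses in its runtime metrics logs:

```javascript
// Minimal sketch of a per-line parser for Heroku runtime metrics logs.
// parseLine and the metric name are hypothetical; the log format matches
// Heroku's "sample#memory_total=123.45MB" runtime metrics fields.
function parseLine(line) {
  const match = line.match(/sample#memory_total=([\d.]+)MB/);
  if (match) {
    // In the real app, this data point would be forwarded to Datadog.
    return { metric: 'heroku.dyno.memory_total', value: parseFloat(match[1]) };
  }
  return null;
}

// Example of a drained Heroku log line:
const line =
  'heroku[web.1]: source=web.1 sample#memory_total=123.45MB sample#memory_rss=100.02MB';
// parseLine(line) → { metric: 'heroku.dyno.memory_total', value: 123.45 }
```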
The interesting bit is that you’re absolutely free to add your own custom parsing function to parse Heroku logs and trigger custom Datadog measurements.
Custom parsing function
Here’s an example of a custom log parser we built for ourselves:
console.log('ETL job job:name done in 2 mins and 20 secs')
This makes our worker app write a log that gets drained and read by our log-parsing app. Our custom parsing function reads the line, extracts the task name and its duration, and sends the measurement to Datadog.
If you want to build your own custom log parsing, create a parser in app/parsers, using the runtime.js example in the project as a template.
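For instance, a parser for the ETL log line above could look like the sketch below. The function name, metric name, and tag format are our own illustrations, not part of the repo’s API:

```javascript
// Sketch of a custom parser for lines like:
//   "ETL job job:name done in 2 mins and 20 secs"
// extractEtlTiming and the metric/tag names are hypothetical.
function extractEtlTiming(line) {
  const match = line.match(/ETL job (\S+) done in (\d+) mins and (\d+) secs/);
  if (!match) return null;
  const [, task, mins, secs] = match;
  return {
    metric: 'etl.job.duration',
    value: Number(mins) * 60 + Number(secs), // total duration in seconds
    tags: [`task:${task}`],
  };
}
```

The returned data point can then be forwarded with any StatsD-compatible Datadog client, the same way the built-in parsers report their metrics.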
With this technique, you can get pretty much anything that’s important to you and happening in Heroku into Datadog:
- Log anything in your Heroku application
- Parse the log in your parsing app and trigger Datadog measurements
You could log execution times, CPU usage measurements, database request results, database query times… pretty much anything that you’re able to code, plus some out-of-the-box logs from Heroku.
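As an example of the logging side, you could instrument your application code to write a line your parser knows how to read. This is a hypothetical sketch; timedQuery and the log format are our own convention:

```javascript
// Hypothetical in-app instrumentation: time a database query and log the
// duration in a shape the log-parsing app can pick up and send to Datadog.
async function timedQuery(db, sql) {
  const start = Date.now();
  const result = await db.query(sql);
  console.log(`database query done in ${Date.now() - start} ms`);
  return result;
}
```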
Where to host my log parsing app?
You can host your app in Heroku itself. Follow the directions in the README.
Want an alternative?
You can check out this repo, which achieves the same goal by formatting your logs rather than writing parsers. It’s a different approach that you might prefer.