Redis Time Series

Time series data implemented directly in Redis with Lua.

Terminology

Unlike most other databases, Redis has no date/time types. But that shouldn’t stop us from storing time series data in Redis. We simply have to convert a date string to unixtime.

Writing

My example data set has 43,200 timestamps with 27 measurement points each, 1,166,400 measurement values in total. Luckily, it comes in JSON format like this:

{
    "timestamp": "2013-01-31 15:43:12",
    "data": {
        "sensor1": 4.31,
        "sensor2": 2.1
    }
}

So, all we need to do is:

  1. convert the timestamp to unixtime
  2. save the data in a sorted set

ZADD KEYNAME SCORE VALUE

Where `SCORE` is the unixtime and `VALUE` is the JSON data string.

With Python and hiredis, it takes ~3 seconds to transform ~1.2 million date strings to unixtime and fill the Redis pipeline. Executing the pipeline takes only ~1 second.
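That write path can be sketched roughly like this (a sketch, not the original code: the key name `test04` is borrowed from the reading example, and a local Redis instance plus the redis-py package are assumed; dates are treated as UTC):

```python
import json
from datetime import datetime, timezone

def to_unixtime(datestring):
    """Convert 'yyyy-mm-dd HH:MM:SS' (treated as UTC) to a unix timestamp."""
    dt = datetime.strptime(datestring, "%Y-%m-%d %H:%M:%S")
    return int(dt.replace(tzinfo=timezone.utc).timestamp())

def write_series(records, key="test04"):
    """ZADD every record into one sorted set through a single pipeline.

    Requires a running Redis instance and the redis-py package.
    """
    import redis  # imported here so the conversion helper stays stdlib-only
    r = redis.Redis()
    pipe = r.pipeline()
    for rec in records:
        score = to_unixtime(rec["timestamp"])
        pipe.zadd(key, {json.dumps(rec["data"]): score})
    pipe.execute()
```

Note that redis-py 3.x expects the `{member: score}` mapping form of `zadd` used here.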

Reading

Reading all ~1.2 million values back takes only 0.25 seconds with

ZRANGE KEYNAME 0 -1 WITHSCORES

`WITHSCORES` is very important here; otherwise it’s impossible to match the values to their timestamps.
The output in Python looks like this (a list of (value, score) tuples):

>>> r.zrange('test04', 0, -1, withscores=True)
[(b'{ "sensor1": 15.1}', 1362152596.0), (b'{ "sensor1": 15.4}', 1362152597.0)]

And if you need just a range of your time series data, simply use

ZRANGEBYSCORE KEYNAME MIN MAX WITHSCORES
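Sketched in Python (the key name and bounds are illustrative; dates are treated as UTC, and a local Redis instance plus redis-py are assumed):

```python
from datetime import datetime, timezone

def to_unixtime(datestring):
    """Convert 'yyyy-mm-dd HH:MM:SS' (treated as UTC) to a unix timestamp."""
    dt = datetime.strptime(datestring, "%Y-%m-%d %H:%M:%S")
    return int(dt.replace(tzinfo=timezone.utc).timestamp())

def read_range(key, start, end):
    """Fetch all (value, score) pairs between two date strings.

    Requires a running Redis instance and the redis-py package.
    """
    import redis  # local import keeps the conversion helper stdlib-only
    r = redis.Redis()
    return r.zrangebyscore(key, to_unixtime(start), to_unixtime(end),
                           withscores=True)

# e.g. read_range("test04", "2013-03-01 00:00:00", "2013-03-01 23:59:59")
```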

Simplify

When your timestamp is already in unixtime format, there is nothing left for you to do. But when your timestamp is a date string, you have to convert it in whatever language you are using.
So how fast and how easy this is depends on your language.

TL;DR

But there’s still Lua inside Redis!

E.g. for the date format `yyyy-mm-dd HH:MM:SS`, all you need is this Lua script.
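A minimal sketch of such a script, saved as `unixtime.lua` (dates are treated as UTC, and the conversion is done arithmetically via the Julian day number, since Redis’s Lua sandbox does not provide the full `os.*` date functions):

```lua
-- unixtime.lua (sketch): convert ARGV[1] ("yyyy-mm-dd HH:MM:SS") to
-- unixtime and ZADD ARGV[2] into the sorted set KEYS[1] with that score.
local y, mo, d, h, mi, s = string.match(ARGV[1],
    "(%d+)%-(%d+)%-(%d+) (%d+):(%d+):(%d+)")
y, mo, d = tonumber(y), tonumber(mo), tonumber(d)
-- Julian day number (Fliegel & Van Flandern), then shift to the unix epoch
local a = math.floor((14 - mo) / 12)
local yy = y + 4800 - a
local mm = mo + 12 * a - 3
local jdn = d + math.floor((153 * mm + 2) / 5) + 365 * yy
    + math.floor(yy / 4) - math.floor(yy / 100)
    + math.floor(yy / 400) - 32045
local unixtime = (jdn - 2440588) * 86400
    + tonumber(h) * 3600 + tonumber(mi) * 60 + tonumber(s)
return redis.call('ZADD', KEYS[1], unixtime, ARGV[2])
```

The stored score is the computed unixtime; the member is the raw JSON payload.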

load it with

redis-cli SCRIPT LOAD "$(cat unixtime.lua)"
"b0bcc4b2268c713bb4e4c226db7c27930da8c998"

call it with

EVALSHA b0bcc4b2268c713bb4e4c226db7c27930da8c998 1 test01 "2013-03-01 15:43:13" '{ "mp1": 15.1}'

That’s all. You don’t have to worry about converting anymore.

PS: all in all, converting with Lua in Redis is ~12% faster than converting in Python for ~1.2 million values.