The Sigfox IoT Connector in the Record Evolution Data Science Studio

Image by Daniel Falcao

Sigfox provides worldwide low-power, low-throughput IoT data streaming infrastructure via a radio network. The technology complements other IoT connectivity options (Wi-Fi, LTE, and other M2M technologies). Sigfox is built for wide coverage and long-term operation without human intervention. Take, for example, a gigantic agricultural field or several remotely located farms where you have to monitor temperature and humidity several times a day. The embedded devices used for such projects are usually a combination of a computational unit (a microcontroller with a programming interface), sensors, and a radio transmitter.

In this article, we show how to connect data from a Sigfox Sens’it device to the Record Evolution Data Science Studio. The Record Evolution Data Science Studio is a user-centric cloud data platform developed by Record Evolution GmbH. The platform lets users simply plug in their data sources and start working.

To follow the steps in this article, you can have a look at the Sigfox Data Pod we have created. Don’t worry if you are not a skilled SQL coder: the data science studio handles most of the SQL for you, supporting you with easy-to-use connectors, data versioning, data historization, and basic data visualization. However, as our case is rather specific, we had to write a Sigfox payload decoder in SQL to present human-readable data such as battery levels, temperature, and humidity values.

We can also go one step further. You can stream, say, electric power consumption measurements via the Sigfox radio network and request the data from the Sigfox Cloud. For this task, you can design a new device with the desired set of sensors, deploy a modest computational interface, and implement a corresponding serialization algorithm that defines the payload encoding and the streaming frequency of the data packages.

The data science studio provides an easy-to-use plugin option for web-based data. You are welcome to register for a free data pod.


1. Sigfox and IoT

Sigfox is a French telecommunications company providing a worldwide sub-GHz radio network (in the unlicensed 868–928 MHz bands, depending on the region) to stream low-intensity data to the Sigfox Cloud. Radio antennas covering a 10 to 100 km range collect the data and forward it to the cloud over an internet backhaul connection. Sigfox provides an API and custom callback functions to pull the data from the Sigfox Cloud. Apart from this infrastructure, Sigfox is wholly flexible with regard to edge device design (low-power sensors and transmitters) and the data storage and analytics tool stack.

The Sigfox business model is mainly subscription-based: Sigfox sells connectivity packages per device plus additional functionalities such as GPS coordinate estimation. Sigfox works with partners at both ends of the pipe, including device producers and data analytics service developers, and additionally partners with numerous sensor vendors.

2. Sigfox Sens’it Data Stream to Cloud

In this article, we work with a Sigfox prototype device: Sens’it. The device has five sensors (temperature, humidity, light, magnetism, and movement), a button, and three LEDs. You can change the device mode by pressing the button. The device has a USB interface, so it can be reprogrammed in Embedded C compiled with the corresponding Arm compilers (check the Sigfox GitHub documentation for details).

The device capacity is limited by design. Each data package has a maximum payload of 12 bytes, and the daily package volume is capped as well: the device sends around 160 serialized messages per day, including timestamps and GPS coordinates.

To start streaming data to the cloud, you first need to register your device at https://www.sensit.io using the unique device ID on the back of the Sens’it device. You can use the Sens’it app to visualize the data or stream the data directly to the Sigfox Cloud. The Sigfox developer documentation offers additional information on this topic.

Once you set up your Sigfox account, the device starts streaming data to the Sigfox Cloud in your selected mode. The device can operate in only one of the modes displayed below:

Sens’it has a default serialization algorithm that packs the measured values into different bytes of a hexadecimal string. The data is imported from the Sigfox Cloud via the Sigfox backend in raw form, so the serialization algorithm has to be reversed on the storage side.
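Reversing the serialization starts with splitting the hexadecimal string back into its bytes. A minimal sketch in Python, using the example payload that appears later in this article:

```python
def payload_to_bytes(payload_hex: str) -> list:
    """Split a hex payload string into a list of byte values (two hex chars each)."""
    return [int(payload_hex[i:i + 2], 16) for i in range(0, len(payload_hex), 2)]

# Example payload from this article:
print(payload_to_bytes("ce097b54"))  # [206, 9, 123, 84]
```

The individual bytes can then be mapped to sensor values according to the mode-dependent layout described below.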

3. Sigfox API and Data Payload

The Sigfox GET API allows you to request data from the Sigfox Cloud. The response to an API request contains the device metadata and the payload published by the device.

3a. Data encoding by Sens’it: The Sens’it payload is mode-dependent. In each mode, the payload is encoded differently before being transformed back into physical units. The payload is sent as a hexadecimal string of 8 characters, or 24 characters if config data is included. The message is serialized and encrypted, received by a Sigfox radio antenna, and finally forwarded from the antenna to the Sigfox Cloud.

3b. Data package JSON and data decoding: The backend constructs several JSON packages to handle data and metadata accordingly. The JSON message includes the device ID, GPS coordinates, payload data, etc.
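Such a backend message can be pictured roughly as follows. The field names here are illustrative, not the exact Sigfox API schema; the device ID is made up:

```python
import json

# Illustrative JSON message; the actual Sigfox backend schema may differ.
raw = '{"device": "1FA3B2", "time": 1612345678000, "data": "ce097b54"}'
msg = json.loads(raw)
print(msg["device"], msg["data"])  # 1FA3B2 ce097b54
```

The `data` field carries the hexadecimal payload that the decoder has to unpack.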

The mode values are encoded by custom firmware that you can reprogram. We have kept the default firmware encoding for the Sigfox Connector. The following two tables show how the encoding/decoding is accomplished:

The detailed payload data structure encoding
Byte2 and Byte3 encapsulate only sensor values

The raw temperature value combines two payload bytes: Temperature_raw = T_msb*256 + T_lsb. The unit conversions are then given as algebraic relations; the documentation states the Celsius conversion for temperature and a comparable formula for humidity. Check the documentation for details.

3c. Sigfox API

The APIs to import device metadata and sensor data are given as:

  1. https://api.sigfox.com/v2/devices/{device-id}/messages?since={since}
  2. https://api.sigfox.com/v2/devices/

One can use the device list to iterate over a series of API calls with the respective device IDs. We have skipped this step as we have only one device.
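Issuing such a request amounts to building the per-device messages URL (endpoint 1 above) with a `since` parameter. A minimal sketch; the device ID is made up, and in the connector this call is performed by the platform rather than by hand:

```python
from urllib.parse import urlencode

def messages_url(device_id: str, since: int) -> str:
    """Build the Sigfox v2 per-device messages URL with a 'since' timestamp."""
    return (f"https://api.sigfox.com/v2/devices/{device_id}/messages"
            f"?{urlencode({'since': since})}")

# "1FA3B2" is a made-up device ID for illustration.
print(messages_url("1FA3B2", 1612345678000))
```

The actual request additionally needs the API credentials from your Sigfox subscription.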

4. The Record Evolution Platform Sigfox Connector

We use the Record Evolution platform to connect to the Sigfox Cloud GET API data endpoint. The resulting message carries the data encoded as a hexadecimal string. The payload contains 4 bytes of sensor info, while the device config data is 8 bytes long. There is no config data in the temperature mode to which we have set the device, so the payload looks like an eight-character hexadecimal string, e.g. "data": "ce097b54".

The platform has a built-in web import functionality

4a. The Record Evolution Platform web connector:

The web connector guides you step by step through the process of web data import. You need to fill in the GET API endpoint in the URL section; username and password come with your Sigfox Cloud subscription. (To try it out yourself, you might have to buy a Sens’it device and/or a data package from Sigfox.) The target is the raw data table S_SIGFOXMSG2, which is updated at your selected interval (5 minutes in our case).

Implementation of automated Web Import functionality using Sigfox GET API for a standing data import from Sigfox Cloud
Custom device selection by hand or from a device table

In the URL, the parameter since ties each recurring API call to the latest validation DateTime in the resulting table. This simply lets the API ignore already imported data.
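The incremental logic can be sketched as follows: take the newest timestamp already imported, pass it as since, and drop anything older. A toy version, assuming millisecond epoch timestamps and an illustrative message shape:

```python
def next_since(imported_times: list) -> int:
    """Return the 'since' value for the next import run:
    the newest timestamp already imported, or 0 on the first run."""
    return max(imported_times, default=0)

def filter_new(messages: list, since: int) -> list:
    """Keep only messages strictly newer than 'since' (skips re-imports)."""
    return [m for m in messages if m["time"] > since]

msgs = [{"time": 100, "data": "ce097b54"}, {"time": 200, "data": "ce097c01"}]
print(filter_new(msgs, next_since([100])))  # [{'time': 200, 'data': 'ce097c01'}]
```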

4b. Decoding of the data by custom SQL query:

The Record Evolution Data Science Studio allows you to represent groups of devices in different modes by using different pipes. As we have only one sensor mode at the moment, we haven’t benefited from this functionality. In the case of tens of devices and different mode groups, the functionality makes data pipes more transparent.

SIGFOX_PAYLOAD_DECODER: the decoder implemented as a custom SQL query pipe

The specifics of this decoding function will be covered in another short article. We encourage the reader to check the data pod and the corresponding SQL decoder for temperature and humidity.

4c. What does the Record Evolution Studio do for you under the hood?

Apart from the PostgreSQL database on which the app is built, the data science studio provides a container-based (personalized) storage architecture, as well as several services to track and analyze data. These services encompass packaging, data versioning and historization, and the option to differentiate between event and status tables. The data pipe representation helps you implement a data model optimization. The process is as follows:

  1. The incoming data is packaged, meaning that several rows arrive with one request.
  2. Valid data is tracked; duplicates are discarded.
  3. An event table stores all incoming events, whereas a status table tracks the status of a given variable as changes occur (with timestamps valid_from and valid_to). In the case of our Sigfox data, a status table makes sense due to the low intensity (max 160 payloads a day).
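The event-versus-status distinction can be illustrated with a toy historization routine: every incoming event is kept in the event stream, but a new status row (with valid_from/valid_to timestamps) only opens when the tracked value actually changes. Column names here are illustrative:

```python
def build_status_rows(events: list) -> list:
    """Collapse a time-ordered (timestamp, value) event stream into status rows.

    Each status row carries value, valid_from, and valid_to; valid_to is None
    for the currently open row, mirroring a historized status table.
    """
    rows = []
    for ts, value in events:
        if rows and rows[-1]["value"] == value:
            continue  # no change: the open status row still applies
        if rows:
            rows[-1]["valid_to"] = ts  # close the previous status row
        rows.append({"value": value, "valid_from": ts, "valid_to": None})
    return rows

events = [(1, "STANDBY"), (2, "STANDBY"), (3, "TEMPERATURE"), (4, "TEMPERATURE")]
print(build_status_rows(events))
```

Four events thus collapse into two status rows, which is why a status table pays off even at max 160 payloads a day.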

The resulting table T_DEVICE_EVENT
A closer look at the sensor mode, temperature, and humidity values

The Analysis panel provides an overview of the data. By clicking on the SIGFOX_DEVICE node, you can create an analysis such as binning (e.g. interval average). The analysis chart below presents the average temperature and humidity per hour. You can click on the interval options to change the granularity.
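The hourly averages in the chart correspond to a simple group-by-hour aggregation, which can be sketched with the standard library alone (the reading format is illustrative):

```python
from collections import defaultdict

def hourly_average(readings: list) -> dict:
    """Average (epoch_seconds, value) readings per hour bin,
    keyed by the start of each hour."""
    bins = defaultdict(list)
    for ts, value in readings:
        bins[ts // 3600].append(value)
    return {hour * 3600: sum(vals) / len(vals) for hour, vals in sorted(bins.items())}

print(hourly_average([(0, 20.0), (1800, 22.0), (3600, 25.0)]))  # {0: 21.0, 3600: 25.0}
```

Changing the divisor changes the granularity, just like the interval options in the Analysis panel.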

Beyond this simple analysis functionality, you can also construct custom dynamic infographics with D3.js, SQL, and Python. We won’t cover this topic at this point; you can take a look at our article Data Visualization with the Record Evolution Data Science Studio for a start.

5. Conclusion

If you own a Sigfox Sens’it device, you can easily connect your data to the Record Evolution Data Science Studio to do analytics with it. We have built a connector for you that you can copy in the data science studio and adapt to your specific case. This is a user-friendly alternative to the do-it-from-scratch approach.

You can also go one step further by designing a new device with your desired set of sensors and a dedicated serialization algorithm defining the payload encoding and the frequency of the packages.
