EE5111 IoT Assignment Report A0179741U

Ashfaq Memon

Selected Topics in Industrial Control & Instrumentation (EE5111)

Implement a Simple IoT Pipeline with AWS Cloud Platform and Visualise the Data

Rogatiya Mohmad Aspak Arif (A0179741U)

Department of ECE, National University of Singapore

e0269760@u.nus.edu

+65 84073081

Introduction

In a real-time embedded system, the architecture is subdivided into layers, and each layer can be replaced without affecting the functionality of the overall IoT system. The sensor layer senses and gathers information about the environment. A low-power embedded processor processes the data and interfaces with the sensor and a wireless transceiver, which transfers the data to an Internet gateway. The Internet gateway connects to the Internet and allows data transfer to a server. An application running on the server delivers an application-specific service to the user.

Figure 1: Five Layers of a Simple IoT Architecture

This project implements a simple Internet of Things (IoT) pipeline with the AWS Cloud platform and visualises the data. AWS IoT provides secure, bi-directional communication between Internet-connected devices such as sensors, actuators and embedded microcontrollers and the AWS Cloud. This enables users to collect telemetry data from multiple devices, and to store and analyse the data. We will simulate two small IoT setups through Python that record and push data from two jet engines, as shown in Figure 2, to AWS DynamoDB. A step-by-step guide to setting up the entire pipeline follows.

Figure 2: Jet Engine

In this task, we are given data for four jet engines, namely FD001, FD002, FD003 and FD004, as text files; each file contains a jet engine's operating settings and sensor readings from the Commercial Modular Aero-Propulsion System. The requirement is to publish the two pre-defined jet engine data sets, FD001 and FD002, shown in Figure 3 as IoT Things to AWS. Each line of data contains a total of 26 fields, in the following format: id, cycle, os1, os2, os3, sensor1, …, sensor21, which correspond to the data collected from the jet engine at each time instance.

Figure 3: FD001 and FD002 Jet Engine Data
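As a concrete illustration, a single line of the text file can be mapped onto the 26 named fields described above. This is a minimal sketch; the sample values below are illustrative placeholders, not taken from the actual dataset:

```python
# Column names for one row of jet engine data: id, cycle,
# three operating settings (os1..os3), and 21 sensor readings.
COLUMNS = (["id", "cycle", "os1", "os2", "os3"]
           + ["sensor" + str(i) for i in range(1, 22)])

def parse_row(line):
    """Split one whitespace-separated line into a dict of the 26 fields."""
    values = line.split()
    if len(values) != len(COLUMNS):
        raise ValueError("expected 26 fields, got %d" % len(values))
    return dict(zip(COLUMNS, values))

# Illustrative row: engine 1, cycle 1, followed by 24 dummy readings.
sample = "1 1 " + " ".join("0.0" for _ in range(24))
row = parse_row(sample)
```

Keeping the fields as strings at this stage mirrors how they are later concatenated into a JSON payload.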

Introduction to Message Queuing Telemetry Transport (MQTT)

In this assignment, we will use MQTT as the client/server protocol to publish and subscribe to the IoT thing data. MQTT is a messaging transport protocol that is lightweight, open, simple, and designed to be easy to implement. These characteristics make it ideal for use in constrained environments, such as communication in Machine to Machine (M2M) and IoT contexts where a small code footprint is required and/or network bandwidth is at a premium. Figure 4 shows the full AWS IoT communication diagram for this assignment: data from the two jet engines is published to the client server, and AWS DynamoDB subscribes to it.

Figure 4: AWS IoT Communication Diagram
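The publish/subscribe pattern that MQTT implements can be illustrated with a tiny in-memory broker. This is a conceptual sketch only; a real MQTT client (such as the AWSIoTPythonSDK used later) additionally handles the network transport, QoS levels and TLS security:

```python
class MiniBroker:
    """Toy broker: routes published messages to topic subscribers."""
    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        # Deliver the payload to every callback subscribed to this topic.
        for callback in self.subscribers.get(topic, []):
            callback(topic, payload)

broker = MiniBroker()
received = []
broker.subscribe("engines/FD001", lambda t, p: received.append(p))
broker.publish("engines/FD001", '{"cycle": 1}')
```

The decoupling shown here is exactly why AWS IoT can later attach a DynamoDB rule as just another subscriber, without any change on the publishing side.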

Setting Up AWS IoT and Sending Data with a Development Computer: The setup process requires registering an AWS account, creating an IAM administrator user in the AWS account, and a desktop or laptop development computer to work with the AWS IoT console from a web browser and to push the jet engine data into AWS IoT. This computer can run a Windows, macOS, Linux, or Unix operating system.

Create Thing1 for FD001

Step 1: Create the AWS IoT Policy for FD001 Thing 1

In this step, create an AWS IoT policy that allows the desktop/laptop, acting as a device simulator, to perform AWS IoT operations. X.509 certificates are used to authenticate devices with AWS IoT, while AWS IoT policies are used to authorize devices to perform AWS IoT operations, such as subscribing or publishing to MQTT topics. Under the IoT Core console, select ‘Secure’ then ‘Policies’ to ‘Create’ a policy for FD001 Thing 1 as shown in Figures 5 and 6. Follow the red-box guides and enter a ‘Name’, ‘iot:*’, ‘*’ and check the ‘Allow’ checkbox respectively.

Figure 5: Setting up Policy part 1

Figure 6: Setting up Policy part 2

Step 2: Create Thing for FD001

In this step, create a thing in AWS IoT to represent the desktop/laptop as a device simulator. Devices connected to AWS IoT are represented by things in the AWS IoT registry, which keeps a record of all the devices connected to the AWS account in AWS IoT. Under the IoT Core console, select ‘Manage’ then ‘Things’ to ‘Create’ a Thing for FD001 as shown in Figures 7 and 8. Follow the red-box guide, enter a ‘Name’ and select ‘Next’ as shown in Figure 9.

Figure 7: Setting up Thing part 1

Figure 8: Setting up Thing part 2

Figure 9: Setting up Thing part 3

Step 3: Create Certificate for FD001 Thing

Select ‘Create certificate’ for the FD001 Thing to gain authentication to publish jet engine data to the MQTT server as shown in Figure 10. Download ‘A certificate for this thing’, ‘A public key’ and ‘A private key’ and select ‘Activate’ as shown in Figure 11. Put the 3 downloaded files in the same folder as the Python file. Click the checkbox of the thing ‘Name’ to attach the policy to it as shown in Figure 12.

Figure 10: Setting up Thing part 4

Figure 11: Setting up Thing part 5

Figure 12: Setting up Thing part 6

Step 4: Send and Receive Test Data for the Thing

In this step, we can send jet engine data to the thing shadow for the desktop/laptop acting as a device simulator. A thing’s shadow is a JSON document, stored in AWS IoT, that AWS IoT uses to save and retrieve current state information for a device. The Device Shadow service for AWS IoT maintains a shadow for each device connected to AWS IoT. Go to ‘Things’ and select ‘Interact’ as shown in Figure 13.

Figure 13: Thing’s shadow send and receive data part 1

For MQTT, make a note of the thing ARN and the value of each of the following MQTT topics, which enable you to set and get updates to the shadow, as shown in Figure 14:

arn:aws:iot:us-east-2:170968857749:thing/EE5111_A0179741U_THING_1

$aws/things/EE5111_A0179741U_THING_1/shadow/update

$aws/things/EE5111_A0179741U_THING_1/shadow/update/accepted

$aws/things/EE5111_A0179741U_THING_1/shadow/get

$aws/things/EE5111_A0179741U_THING_1/shadow/get/accepted

Figure 14: Thing’s shadow send and receive data part 2

For the subscription topic, enter the MQTT topic value for Update to thing shadow (for example, $aws/things/EE5111_A0179741U_THING_1/shadow/update), and then choose Subscribe to topic. Repeat the same procedure for the remaining MQTT topic values (for example, $aws/things/EE5111_A0179741U_THING_1/shadow/get and $aws/things/EE5111_A0179741U_THING_1/shadow/get/accepted) as shown in Figure 15.

Figure 15: Thing’s shadow send and receive data part 3

To get data from the shadow, choose the MQTT topic value for Get this thing shadow (for example, $aws/things/EE5111_A0179741U_THING_1/shadow/get). In the message payload area, replace the current payload with the following payload and ‘Publish to topic’ as shown in Figure 16.

Figure 16: JSON document
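A shadow update message carries the device state under state.reported, which is what the DynamoDB rule query in Step 5 later selects. A minimal sketch of building such a payload in Python (the reported values shown are placeholders):

```python
import json

def shadow_update_payload(reported):
    """Wrap reported device state in the shadow update document format."""
    return json.dumps({"state": {"reported": reported}})

# Placeholder state for illustration.
payload = shadow_update_payload({"id": "FD001_1", "cycle": "1"})
doc = json.loads(payload)
```

Publishing this string to the .../shadow/update topic would update the stored shadow; the service then echoes the accepted document on .../shadow/update/accepted.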

Step 5: Creating an Amazon DynamoDB Rule

DynamoDB rules allow users to take information from an incoming MQTT message (when the Python script runs) and write it to a DynamoDB table. In the AWS IoT console, in the navigation panel, choose ‘Act’ and select ‘Create’ as shown in Figure 17.

Figure 17: Amazon DynamoDB part 1

On the Create a rule page, enter a name and description for your rule, select ‘Add action’, choose ‘Split message into multiple columns of a DynamoDB table (DynamoDBv2)’, and then choose ‘Configure action’ as shown in Figures 18, 19, 20 and 21.

Figure 18: Amazon DynamoDB part 2

Figure 19: Amazon DynamoDB part 3
Figure 20: Amazon DynamoDB part 4
Figure 21: Amazon DynamoDB part 5

On the Configure action page, choose ‘Create a new resource’. This action brings you to the DynamoDB table as shown in Figure 22.

Figure 22: Amazon DynamoDB part 7

On the Amazon DynamoDB page, choose ‘Create table’ as shown in Figure 23.

Figure 23: Amazon DynamoDB part 8

On the Create DynamoDB table page, enter a ‘Name’ in Table name. In Partition key, enter ‘id’. Select Add sort key, and then enter ‘timestamp’ in the Sort key field. The ‘id’ identifies the jet engine record, and ‘timestamp’ records the time in UTC (e.g. UTC 2019-01-28 14:41:15.237). Choose String for both the partition and sort keys, and then choose Create. It takes a few seconds to create the DynamoDB table. Close the browser tab where the Amazon DynamoDB console is open; if you do not close the tab, the DynamoDB table is not displayed in the Table name list on the Configure action page of the AWS IoT console.

Figure 24: Amazon DynamoDB part 9
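Since both keys are strings, every item written by the rule needs an ‘id’ string and a ‘timestamp’ string in the format shown above. A sketch of generating that timestamp (the ‘UTC ’ prefix and millisecond precision are assumptions matching the example format used in this report):

```python
from datetime import datetime, timezone

def utc_timestamp(now=None):
    """Format a UTC time as e.g. 'UTC 2019-01-28 14:41:15.237'."""
    now = now or datetime.now(timezone.utc)
    millis = now.microsecond // 1000  # truncate microseconds to milliseconds
    return "UTC " + now.strftime("%Y-%m-%d %H:%M:%S.") + "%03d" % millis

# Reproduce the example value from the table-creation step.
ts = utc_timestamp(datetime(2019, 1, 28, 14, 41, 15, 237000, tzinfo=timezone.utc))
```

Because the string sorts lexicographically in the same order as time, it works naturally as a DynamoDB sort key.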

On the Amazon DynamoDB page, choose ‘Manage Stream’ and select ‘Enable’ for the DynamoDB table to receive data as shown in Figures 25 and 26.

Figure 25: Amazon DynamoDB part 10
Figure 26: Amazon DynamoDB part 11

Back on the Configure action page, choose the table created earlier from the Table name list. Select ‘Create Role’ and attach a policy for the table as shown in Figures 27 and 28.

Figure 27: Amazon DynamoDB part 12

Figure 28: Amazon DynamoDB part 13

On the Configure action page, enter the rule query statement SELECT state.reported.* FROM '$aws/things/EE5111_A0179741U_THING_1/shadow/update/accepted' in the window to retrieve the data, then select ‘Create Rule’ as shown in Figures 29 and 30.

Figure 29: Amazon DynamoDB part 14

Figure 30: Amazon DynamoDB part 15

Step 6: Creating Python codes

From the given AWS skeleton code, modify the following for Thing1.

SHADOW_CLIENT = "EE5111_A0179741U_THING_1"

HOST_NAME = "a12d228r28ffvb-ats.iot.us-east-2.amazonaws.com"

ROOT_CA = "AmazonRootCA1.pem.txt"

PRIVATE_KEY = "bfaedc01c3-private.pem.key"

CERT_FILE = "bfaedc01c3-certificate.pem.crt"

SHADOW_HANDLER = "EE5111_A0179741U_THING_1"

The code reads and publishes data from trainFD001.txt to the thing under the AWS IoT platform at a rate of one row per 10 seconds. Overwrite the ‘id’ column of the engine as ‘FD001_’ + id (e.g. FD001_12), add a ‘timestamp’ column with timestamps in UTC (e.g. UTC 2019-01-28 14:41:15.237), add one more column containing the matric number, concatenate the data into a JSON string and send it to the Thing1 shadow handler as shown with comments in Figures 31 to 35. The full code for FD001 Thing1 and FD002 Thing2 is in the appendix of this report.

Figure 31: Python code part 1

Figure 32: Python code part 2

Figure 33: Python code part 3

Figure 34: Python code part 4

Figure 35: Python code part 5
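The row transformation described above can be sketched as follows. This is a simplified, standard-library-only version of the publish step; the actual script additionally calls the AWSIoTPythonSDK shadow client every 10 seconds, which is omitted here, and the column name ‘matric’ is an illustrative choice:

```python
import json
from datetime import datetime, timezone

MATRIC = "A0179741U"

def build_payload(line, engine="FD001"):
    """Turn one whitespace-separated data row into the JSON shadow message."""
    columns = (["id", "cycle", "os1", "os2", "os3"]
               + ["sensor%d" % i for i in range(1, 22)])
    record = dict(zip(columns, line.split()))
    record["id"] = engine + "_" + record["id"]   # overwrite id, e.g. FD001_12
    now = datetime.now(timezone.utc)
    # Millisecond-precision UTC timestamp column.
    record["timestamp"] = "UTC " + now.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]
    record["matric"] = MATRIC                    # extra column with the matric number
    return json.dumps({"state": {"reported": record}})

# Illustrative row: engine record 12, cycle 3, with 24 dummy readings.
payload = build_payload("12 3 " + " ".join(["0.0"] * 24))
msg = json.loads(payload)
```

In the real loop, `time.sleep(10)` between rows gives the one-record-per-10-seconds sampling rate required.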

Step 7: Executing Python codes and streaming data to AWS DynamoDB table

Run the Python code as shown in Figure 36; an update appears on the Python shell at every 10-second interval upon successful streaming of the FD001 engine data to the AWS DynamoDB table. To view the published data, go to the AWS DynamoDB table and select ‘Items’ as shown in Figure 37.

Figure 36: Executing Python for Thing1

Figure 37: Updated DynamoDB Table

Create Thing2 for FD002

To simulate two IoT things, add one more thing and one more certificate under the AWS IoT platform, create a copy of the Jupyter notebook above, rename the client name, certificate and data source to train_FD002.txt, and modify the rule under the AWS IoT platform to be triggered by ‘$aws/things/+/shadow/update/accepted’, following steps 1–6 of Thing1 for FD001. This rule will now push data of both engines from both things. Run the two Jupyter notebooks at the same time as shown in Figure 38, simulating the two “things” running in parallel, each publishing data at a sampling rate of one record per 10 seconds.

Figure 38: Executing Python for Thing1 and Thing2
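The ‘+’ in the modified rule topic is MQTT's single-level wildcard, so one rule now matches the update/accepted topic of every thing. A small sketch of how such matching works (illustrative only; the broker performs this matching server-side, and ‘#’ multi-level wildcards are not handled here):

```python
def topic_matches(pattern, topic):
    """Return True if an MQTT topic matches a pattern with '+' wildcards."""
    p_parts, t_parts = pattern.split("/"), topic.split("/")
    if len(p_parts) != len(t_parts):
        return False
    # '+' matches exactly one topic level; other levels must match literally.
    return all(p == "+" or p == t for p, t in zip(p_parts, t_parts))

pattern = "$aws/things/+/shadow/update/accepted"
```

This is why a single rule suffices for both Thing1 and Thing2, while get/accepted traffic is still excluded.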

To view the jet engine data for both Thing1 and Thing2, go to the Amazon DynamoDB page, select ‘Items’, click ‘Refresh’ and click ‘timestamp’ to view the FD001 and FD002 data as shown in Figure 39.

Figure 39: AWS DynamoDB Table for Thing1 and Thing2

Data visualization on Redash for both Thing1 and Thing2

IAM Policies

Using identity-based policies (IAM policies) for Amazon DynamoDB, an account administrator can attach permissions policies to IAM identities (that is, users, groups, and roles) and thereby grant permissions to perform operations on Amazon DynamoDB resources. From the IoT console interface, search for ‘IAM’, select ‘Users’ and click ‘Add user’ as shown in Figure 40.

Figure 40: AWS IAM Table for Things part 1

Enter a ‘User name’, check the checkboxes for ‘Programmatic access’ and ‘AWS Management Console access’ and select ‘Next: Permissions’ as shown in Figure 41.

Figure 41: AWS IAM Table for Things part 2

Select ‘Attach existing policies directly’ and search for ‘AmazonDynamoDB’ under Filter policies. Check the 3 checkboxes, then click ‘Next: Tags’, ‘Next: Review’ and ‘Create user’ as shown in Figures 42, 43 and 44.

Figure 42: AWS IAM Table for Things part 3

Figure 43: AWS IAM Table for Things part 4

Figure 44: AWS IAM Table for Things part 5

Download the credentials file containing the User name, Password, Access key ID, Secret access key and Console login link by clicking ‘Download .csv’ as shown in Figure 45. This authentication information will be used later for data visualization on the Redash platform.

Figure 45: AWS IAM Table for Things part 6

Redash is an open-source tool used for Thing1 and Thing2 in this assignment; it connects to and queries data sources such as AWS DynamoDB, and can also build dashboards to visualize data and share them with the public through URLs. Redash is quick to set up and works with many data sources. To begin, sign up with Redash and ‘Create Your Account’ as shown in Figure 46.

Redash data visualization

Figure 46: Redash data visualization part 1

After creating an account, a new query can be created on the Redash main page as shown in Figure 47.

Figure 47: Redash data visualization part 2

To link a Redash query to the AWS DynamoDB table, go to ‘Data Sources’ and select ‘New Data Source’ under settings as shown in Figure 48.

Figure 48: Redash data visualization part 3

Under ‘Create a New Data Source’, search for ‘DynamoDB (with DQL)’ and select ‘Create’ as shown in Figure 49.

Figure 49: Redash data visualization part 4

Enter all 4 fields, namely ‘Name’, ‘Access Key’, ‘Region’ and ‘Secret Key’, which can be obtained from the credentials CSV downloaded earlier, and select ‘Create’ as shown in Figure 50.

Figure 50: Redash data visualization part 5

In the Redash query interface, input the query ‘Name’, enter the SQL query code for all the sensors from the jet engine, ‘Save’ and select ‘Execute’ as shown in Figure 51. Loading the data from AWS DynamoDB to Redash may take a while depending on the amount of data to be transferred. The SQL query code can be found in the appendix.

Figure 51: Redash data visualization part 7

Once all the data has been exported, it is displayed on the interface. Select ‘New Visualization’ to create the data visualization for the Things as shown in Figure 52.

Figure 52: Redash data visualization part 8

Under the visualization editor, enter a ‘Visualization Name’ for Thing1, choose the chart type ‘Line’, set the X column to ‘Timestamp_UTC’, the Y column to ‘sensorX’ and the id to the ‘cycle’ of interest, and select ‘Save’ as shown in Figure 53.

Figure 53: Redash data visualization part 9

Repeat the same procedure for Thing2 so that both Things can be visualized in the dashboard later, as shown in Figure 54.

Figure 54: Redash data visualization part 10

A Redash dashboard allows you to combine visualizations and text boxes that provide context for the data. On the Redash interface, select ‘Create’ new dashboard and enter a ‘Name’ for the dashboard as shown in Figure 55.

Figure 55: Redash data visualization part 11

After creating the dashboard, go back to the query created and select ‘Add to Dashboard’ as shown in Figure 56.

Figure 56: Redash data visualization part 12

Under ‘Add to Dashboard’, select the dashboard name created earlier. Repeat the same procedure for both Thing1 and Thing2 as shown in Figure 57.

Figure 57: Redash data visualization part 13

On the Redash interface, choose Dashboard to view the queries chosen to be displayed. More sensor data can be added in the query interface, or a different cycle of the things can be selected by changing the ‘id’. A public URL for the dashboard can also be obtained by clicking the ‘Share’ icon as shown in Figure 58.

https://app.redash.io/nus_aspak/public/dashboards/qJA2vf1lNzZ2kJVFUG4XwskXBfH1aUmRN3SNWkDW

Figure 58: Redash Dashboard
