Build a Real-Time Data Dashboard with 9 Lines of Code

akius
Nov 20, 2020 · 7 min read

Photo by Anas Alshanti on Unsplash

As the availability of real-time data sources has increased at an incredible pace over the last few decades, so have the possibilities of analysing that data for financial insight.

In recent years, using social media as a signal has become more popular, as indicated by the many sentiment-analysis guides and tutorials floating around on Medium and similar platforms. Some interesting academic work has been done in this field too, for example the work by some researchers on the correlation between Trump's tweets and the stock market.

Today we want to show how simple it can be to build a real-time dashboard using a bog-standard TCP connection, Python, and the socket package, streaming the data into 3forge, a high-performance data-integration and front-end development platform. 3forge was designed for high-frequency, low-latency, high-volume data applications, and its built-in real-time database lets you quickly build web-browser dashboards and applications on live data.

To do this, we simply connect on a designated port and start streaming in data. It’s that simple.

No setting up ports, creating tables, shaping, or transforming data.

3forge's built-in database will see the tweets, create a table for us, and start storing the data just as if the rows were inserts. Because a plain network connection is such a generic transport, it lends itself to being highly customizable and can carry whatever your business needs to stream.

In this example, we will be streaming in tweets from Twitter's API using the tweepy package, and analysing each tweet's sentiment with the Python package vaderSentiment. To attach some financial-world relevance to the exercise, we will only stream in tweets that mention the top 100 tickers on Nasdaq by market cap, using this list.

Keep in mind that for a real application you would want a dynamic list, since the ranking will shift as companies' market caps change; for this demo, a static list will do.
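As a quick sketch of how filtering against such a static list might look, consider the snippet below. It is illustrative only: the `TOP_TICKERS` subset and the `mentioned_tickers` helper are ours, not part of the original program.

```python
import re

# Illustrative subset of the top-100-by-market-cap list (assumption: the
# real program loads the full static list exported from Nasdaq).
TOP_TICKERS = {"AAPL", "MSFT", "AMZN", "GOOGL", "TSLA"}

def mentioned_tickers(tweet_text):
    """Return the tracked tickers a tweet mentions, as $cashtags or bare symbols."""
    # Collect $AAPL-style cashtags and bare upper-case symbols as whole words.
    candidates = set(re.findall(r"\$?\b([A-Z]{1,5})\b", tweet_text))
    return sorted(candidates & TOP_TICKERS)

matches = mentioned_tickers("Loading up on $TSLA and AAPL before earnings")
```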

We have broadly followed some existing tutorials to build the logic that streams the tweets into Python and analyses them there. There are some great guides out there, for example the one we mainly used, by sentdex on pythonprogramming.net. For now, we will treat this part of the program as a black box that outputs timestamps, tweets, the corresponding tickers, and the sentiment of those tweets. We take the output of this black box and stream it into the real-time database.
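To make that black box concrete, here is a hedged sketch of its interface: a generator yielding (unix timestamp, ticker, tweet, sentiment) rows. The canned tweets and the trivial keyword scorer are stand-ins for the live tweepy stream and vaderSentiment's compound score, which the real program uses.

```python
import time

def toy_sentiment(text):
    """Toy word-list scorer, standing in for vaderSentiment's compound score."""
    positive, negative = {"great", "love", "up"}, {"bad", "down", "crash"}
    words = {w.strip(".,!?").lower() for w in text.split()}
    return len(words & positive) - len(words & negative)

def tweet_stream(sample_tweets):
    """Yield (unix, ticker, tweet, sentiment) rows, like the black box does."""
    for ticker, tweet in sample_tweets:
        yield (time.time(), ticker, tweet, toy_sentiment(tweet))

rows = list(tweet_stream([("AAPL", "AAPL is up, love it!"),
                          ("TSLA", "Bad day for TSLA")]))
```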

This is where 3forge starts to shine.

To start with, we need to set up a connection to our database, which listens on its default port, 3289. We will be running the 3forge application on a local machine, so we simply connect to localhost and log in to the database. We will also be publishing a hands-on tutorial on some synthetic real-time data streaming using Python, bash, and PowerShell — keep an eye out for that in the coming weeks. The first step is to import the socket package and log in, which looks something like this:

import socket

# Open a TCP connection to the 3forge database on its default port, 3289.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect(('127.0.0.1', 3289))
s.settimeout(2)
# login_message is constructed per the 3forge real-time API documentation.
s.sendall(login_message.encode('utf-8'))

The login message comes from the documentation here. For more information on this process, please check out our real-time API documentation here.

Next, we need to know what our data looks like. In this case we will be streaming in rows with four columns: timestamp, ticker, tweet, and sentiment.

We define a function for sending a row:

def send_data(unix, ticker, tweet, sentiment):
    # Cast the numeric fields to strings for concatenation.
    unixString = str(unix)
    sentimentString = str(sentiment)
    # Build the pipe-delimited row; T="Sentiment" names the target table.
    data_message = (
        'O|T="Sentiment"|unix="' + unixString
        + '"|ticker="' + ticker
        + '"|tweet="' + tweet
        + '"|sentiment=' + sentimentString + '\n'
    )
    s.sendall(data_message.encode('utf-8'))

The first two lines in the function are simple casts. The multiline parenthesised expression that follows concatenates our message into the format our backend database expects, and the last line sends the message over the socket. Note that T="Sentiment" names the table the row will stream into. Running the sentiment analysis in a while loop and invoking our previously defined send_data function inside the loop starts streaming tweets into the database. Next, we simply go to our Layout editor and create a new window. In our new window, we can now create a new real-time table:
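That driver loop might look like the sketch below. Splitting the message formatting out of send_data into a separate helper (format_row is our name, not from the original) makes it easy to check the wire format without an open socket.

```python
def format_row(unix, ticker, tweet, sentiment):
    """Build the pipe-delimited 3forge row message that send_data transmits."""
    return ('O|T="Sentiment"|unix="' + str(unix)
            + '"|ticker="' + ticker
            + '"|tweet="' + tweet
            + '"|sentiment=' + str(sentiment) + '\n')

def stream_rows(sock, rows):
    """Send each (unix, ticker, tweet, sentiment) row over an open socket."""
    for unix, ticker, tweet, sentiment in rows:
        sock.sendall(format_row(unix, ticker, tweet, sentiment).encode('utf-8'))

msg = format_row(1605900000, "AAPL", "to the moon", 0.5)
```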


We can see that if we expand the Objects tree, our new Sentiment table has been created and we can access the data.

To quickly confirm that data is coming in, we select Real Time Table, pick the Sentiment table under Objects, and hit Next.


Leave the configuration as is and hit Finish.


Now we should have a simple, real-time table with live tweets coming in, including the sentiment and the ticker. It’s that simple!


Naturally, you would probably want to make something a little bit more exciting than a simple table, so we have built a dashboard with some more interesting figures and aggregated data that you can see in the image below.


We can see that we have some traditional dashboard features: a title, some large KPIs at the top, and four graphs below. The top-left graph is a line chart of the cumulative sentiment of the top 5 tickers by tweet volume. On the treemap in the top right, the size of each box represents the volume of tweets and the color represents their sentiment; throughout the dashboard, black corresponds to negative sentiment and bright Twitter-blue to positive. In the bottom left we can see the relationship between the volume of tweets and their sentiment. Finally, on the bottom right we have a live feed of incoming tweets, in table form.
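For a sense of the aggregation behind that top-left chart, the "top 5 tickers by tweet volume" and their running cumulative sentiment can be computed with plain dictionaries, along these lines (an illustrative Python sketch; in the actual dashboard this shaping happens inside 3forge):

```python
from collections import Counter, defaultdict

def top_tickers_cumulative(rows, n=5):
    """From (unix, ticker, tweet, sentiment) rows, return the n highest-volume
    tickers and each one's cumulative sentiment series."""
    volume = Counter(ticker for _, ticker, _, _ in rows)
    top = {t for t, _ in volume.most_common(n)}
    series = defaultdict(list)
    for _, ticker, _, sentiment in rows:
        if ticker in top:
            prev = series[ticker][-1] if series[ticker] else 0
            series[ticker].append(prev + sentiment)
    return top, dict(series)

rows = [(1, "AAPL", "t1", 1), (2, "AAPL", "t2", -1), (3, "TSLA", "t3", 2)]
top, series = top_tickers_cumulative(rows, n=2)
```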

As a final note on the agility of the 3forge development platform, perhaps the most impressive figure in all of this is the number of lines of code needed. The Python portion of this program was just under 100 lines of code, which is itself impressive and comes down to the abundance of pre-built packages available for Python. But armed with a TCP stream of data, the data modelling and shaping needed in the 3forge platform to create this dashboard came to a total of 9(!) lines of code.

If you'd like to hear more about 3forge's high-performance data platform, please visit http://www.3forge.com. Also feel free to drop us an email at info@3forge.com; we would be delighted to get in touch and discuss how we can make your business more agile with your data front-ends.

Full disclosure: the author of this post works for 3forge.

To get an update when new articles are published, be sure to sign up for our newsletter and to follow us on LinkedIn.

Integration, Visualization & Beyond

When you need more from your data than pretty charts and tables.
