Architecture behind Scrum.ai

Here’s a sneak peek at the architecture backing our bot, Scrum.ai.

Rakha Kanz Kautsar
Scrum.ai
3 min readApr 4, 2018



Our Stack

A bit of background on our stack before we dive into the architecture.

Node.js

Our back end is powered by the latest Node.js engine. We chose it because JavaScript’s growth nowadays is remarkable: there are abundant libraries for almost anything you can think of (oh, and the product owner also wanted us to use Node to integrate with their existing product).

TypeScript

To encourage good development practices (and because we are eager to learn), we also use TypeScript as our language of choice, which compiles down to ES6 JavaScript that Node can run. TypeScript gives us a reliable type system and reduces the chance of our code failing at runtime.

PostgreSQL (with TypeORM)

We use PostgreSQL as our DBMS of choice. You can read the complete story in the following past post:

The Architecture

A simple diagram of our (current) architecture:

Simple, right?

We have a main server hosted on Heroku that receives events from the Slack API, resulting in the so-called event-driven architecture pattern. Our database is also hosted on Heroku, using the free Heroku Postgres add-on.

And here’s our vision for the architecture of the final product:

We intend to ship the final product as SaaS, communicating with chat services and scrum-tool services, and to adopt a microservices architecture that can scale easily with our users. Our services will include:

  • Main service
  • Metrics Rendering service (with Puppeteer)
  • Chat services (e.g. interfacing with Slack, Telegram, Line, etc.)
  • Scrum Tools services (e.g. Pivotal Tracker, JIRA, GitLab boards, Trello, etc)

These microservices will communicate over HTTP through their APIs, authenticated with secure tokens. We might use JWTs, or shared secrets managed by a tool like Vault by HashiCorp.

We will also use Redis to implement asynchronous queues and scheduling as our task volume increases. Some other (wild) thoughts: we could use serverless functions (à la Lambda) to further abstract things and reduce our operating costs, since our microservices are small and not dependent on each other.

All in all, we hope we can achieve our vision of this product and learn as much as we can through the process.


React Native developer excited about performance and system designs. https://rakha.dev/