It’s the start of the year, and I’m getting my new year resolutions in order. While thinking about how to make sure I actually follow them, I realized one thing that works perfectly well for me: using email as a reminder. It acts as negative reinforcement, and more importantly, an email sitting in my inbox is a constant reminder staring at me to get that ‘thing’ done.
Figuring out the Architecture
Any modern application is usually a 3-tier system: a user interface, business logic, and a database. In my case, I was building a web interface, so the UI was simple HTML/CSS and JS (no React or Angular needed). Now comes the point of discussion: what should I use for the business logic? What database should I use? Should I stick to plain SQL, or is a NoSQL database more appropriate? And for the business logic, should I use Flask, or would a simple node.js application work? Do I need nginx or redis?
I ended up going with node.js for the server and MongoDB for storing data. The reason I felt MongoDB was the right choice was that the goals people entered were not of a fixed size: a goal could be 10 characters, or it could be 1,000. I also had possible improvements to the product in mind, which sealed the deal for MongoDB. As far as choosing node.js is concerned, that was just a personal choice (even though I do heavy python dev these days :D).
I decided I didn’t need anything fancy like nginx or redis, as they would just eat into the compute of my EC2 instance (more about that later).
Once I had figured out the architecture, I needed to decide what service to use for sending the email. Should I set up my own mail server or simply use an existing service? I could think of a couple of services but narrowed it down to SendGrid and Amazon SES. Both are highly developer-friendly, and since I wasn’t expecting much load, I wasn’t worried about pricing.
For deploying the backend code, I used an Amazon EC2 free-tier instance (t2.micro, with 1 GB RAM and a 25 GB SSD). The only cost that arises is for the IP address I need.
Building the Frontend
Building the frontend ain’t the most difficult task, but it does take time: changing CSS, ensuring Bootstrap compatibility. More importantly, settling on the right UI is what really takes time. Since this was more of a hobby project, I relied heavily on Codepen for ideas and used modified versions of the templates there (to all the backend folks out there: Codepen, Colorlib, etc. are your friends, so you can focus on getting your prototype out ASAP).
The goal-adding page is made with Angular (just because the template was available), as it allowed me to add ’n’ number of goals, while each submission to the server was a simple POST request.
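To make the request flow concrete, here is a sketch of how the page might package goals and POST them to the backend. The /addGoals endpoint name comes from the backend routes; the payload field names and the helper functions are my own illustrations, not the project’s actual code.

```javascript
// Illustrative sketch of the goal-adding page's request logic.
// Field names (email, goals, tzOffsetMinutes) are assumptions.
function buildGoalsPayload(email, goals) {
  return {
    email,
    goals, // e.g. [{ text: "Read 12 books", time: "09:00" }]
    tzOffsetMinutes: new Date().getTimezoneOffset(), // browser timezone
  };
}

// Fired when the user clicks Submit; one POST carries all the goals.
async function submitGoals(serverUrl, email, goals) {
  const res = await fetch(`${serverUrl}/addGoals`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGoalsPayload(email, goals)),
  });
  return res.status; // the backend replies with just a status code
}
```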
While choosing the database system, it was either SQL (MySQL) or NoSQL (MongoDB). If the data size had been obvious, maybe I would have gone with SQL, but it wasn’t obvious how long a goal’s text would be. I could have fixed a character limit and used SQL, but I wanted a level of freedom in the database design, and also wanted to save some data in nested form (in the future), for which MongoDB was an obvious candidate.
As far as the schema is concerned, it was quite simple; no complex layout was needed.
All I needed to save was the email address, the time at which the user wanted the email, and the goals. Since I wanted users to be able to set a custom time for each goal, each goal was stored as a separate entry.
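A single goal entry might look like the sketch below. The field names are my guess at the shape, not the exact schema in goals.js; the GMT convention for the send time is described later in the article.

```javascript
// One document per goal (field names are illustrative).
function makeGoalDoc(email, goalText, sendAtGmt) {
  return {
    email,             // where to send the reminder
    goal: goalText,    // free-form text, no length limit (hence MongoDB)
    sendAt: sendAtGmt, // "HH:MM" stored in GMT
  };
}
```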
Building the Backend
The backend was a simple MEN (Mongo, Express, Node) application. Since the frontend was plain HTML coupled with JS (with some Angular), I didn’t need to worry about serving requests during user interaction; all requests were triggered when a user clicked the submit button!
The node application with Express listens on a port for API calls. Since the frontend just makes a POST request, I only needed to respond to each request with a status code, just for the sake of it.
One tricky problem I had to solve was timezones. Since it’s just not a good idea to ask users for their timezone, the frontend directly identified the system timezone when a user sent the request. This user timezone was then converted to GMT when the data was inserted into the database.
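The conversion itself is simple arithmetic. A sketch, assuming the frontend sends the value of Date.prototype.getTimezoneOffset(), which is (UTC minus local time) in minutes, so GMT = local time + offset:

```javascript
// Convert a local "HH:MM" time to GMT using the browser-reported
// offset, wrapping across midnight.
function localToGmt(hhmm, tzOffsetMinutes) {
  const [h, m] = hhmm.split(":").map(Number);
  // normalize into 0..1439 minutes past midnight
  const total = (((h * 60 + m + tzOffsetMinutes) % 1440) + 1440) % 1440;
  const pad = (n) => String(n).padStart(2, "0");
  return `${pad(Math.floor(total / 60))}:${pad(total % 60)}`;
}
```

For example, 09:00 in India (UTC+5:30, offset −330) becomes 03:30 GMT.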
|--index.js    // default process; starts the server
|--schedule.js // CRON job and email triggering
|--route.js    // API endpoints defined here
|--goals.js    // database schema
In index.js, we initialize the server with the help of Express and make sure that the database is connected.
In routes.js, we define each API endpoint, like /addGoals, where we have the business logic as well as create the database entry.
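The shape of that handler might look like the sketch below. The validation rules and the insertGoals helper are stand-ins of my own; in the real routes.js this would be wired up with app.post("/addGoals", ...).

```javascript
// Sketch of an /addGoals handler. insertGoals is a stand-in for the
// actual database write; the handler only ever replies with a status.
function makeAddGoalsHandler(insertGoals) {
  return function addGoals(req, res) {
    const { email, goals } = req.body || {};
    if (!email || !Array.isArray(goals) || goals.length === 0) {
      return res.status(400).end(); // reject malformed submissions
    }
    insertGoals(email, goals); // one database entry per goal
    return res.status(200).end(); // the frontend only needs the code
  };
}
```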
While one process handles the incoming requests via Express, we need another process that periodically checks the database for which emails are due and then actually sends them.
A perfect fit for this is a CRON job, which lets you trigger a command periodically with a high degree of customization.
In my case, I needed the command to be invoked every 30 minutes. It didn’t make sense to trigger it every minute: since all the functions are synchronous, if there are a large number of emails at a time, or the database becomes huge, the total time (query time + time spent actually sending emails) may exceed a minute, which could result in missed triggers for the next minute.
Thus, what I ended up doing, at the time of inserting into the database, was rounding times between 9:00 and 9:30 down to 9:00, and times between 9:31 and 10:00 up to 10:00. As far as triggering is concerned, AWS sets the clock of each instance to GMT+0000, so each entry in the database needs its time stored in GMT.
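One reading of that rounding rule, as a sketch (the real insertion code may differ):

```javascript
// Round a "HH:MM" send time per the rule above: minutes 0-30 round
// down to the hour, 31-59 round up to the next hour (wrapping at 24).
function roundSendTime(hhmm) {
  const [h, m] = hhmm.split(":").map(Number);
  const hour = m <= 30 ? h : (h + 1) % 24;
  return `${String(hour).padStart(2, "0")}:00`;
}
```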
0,30 * * * * command_here   # invoke the command every 30 minutes
You can use either SendGrid or Amazon SES to build the email service. Since SendGrid let me start sending email immediately, I ended up using it. With Amazon SES, you have to send a request to be moved out of the sandbox, which takes roughly 24 hours.
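For reference, here is a sketch of the message object SendGrid’s Node library (@sendgrid/mail) expects. The sender address, subject, and body text are invented; in a real schedule.js this object would be passed to sgMail.send(msg) after calling sgMail.setApiKey(...).

```javascript
// Build the reminder message for one goal. All strings here are
// placeholders, not the service's actual copy.
function buildReminderEmail(to, goalText) {
  return {
    to,
    from: "reminders@example.com", // must be a verified sender
    subject: "Reminder: your goal is waiting",
    text: `You promised yourself: ${goalText}`,
  };
}
```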
The application is deployed on an Amazon EC2 instance, with MongoDB installed locally on the instance.
To keep the application running, I initially tried nohup, a Linux command that lets a process keep running in the background, but eventually moved to pm2, a node package for production process management.
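pm2 can also be driven from an ecosystem file, started with `pm2 start ecosystem.config.js`. A minimal one for this project’s two processes might look like the following; the app names are assumptions, and the script names match the file layout shown earlier.

```javascript
// ecosystem.config.js -- a minimal pm2 config (names are my own).
module.exports = {
  apps: [
    { name: "goals-server", script: "index.js" }, // Express server
    { name: "goals-cron", script: "schedule.js" }, // email scheduler
  ],
};
```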
While the project was pretty straightforward, there were two interesting problems I got to solve.
- Setting the timezone for the user. The system sends an email at the user’s desired time, but since users are across the globe, I can’t just use one timezone on the server and send accordingly. And asking the user for their timezone didn’t feel right when the browser can already provide it. So, using JS, along with the goals and other user info, I also pass along the user’s timezone, which the backend then adjusts to the timezone of the system (server). One thing I wish it could handle is daylight saving, but at the time of building, no major country other than Australia was observing daylight saving.
- The second challenge came up during deployment. I had considered GitHub Pages or Netlify to host the frontend, as both have HTTPS support. But a problem arose when the frontend made calls to the backend over HTTP: browsers blocked the mixed-content requests, surfacing as CORS policy errors. One solution was to enable HTTPS on my EC2 instance, but that meant spending $$. So I ended up using Surge, which let me host the static website for free over HTTP.
Enough talk, here’s the link to try the service: http://newyeargoals.surge.sh/