Bashaway [ Battle Campaign ]

Akalanka Perera
SLIIT FOSS Community
8 min read · Nov 1, 2023

SLIIT Bashaway is the first inter-university competition of its kind in Sri Lanka, targeted at the country's undergraduates, the roots of the upcoming generation of computer wizards. The competition is organized by the SLIIT FOSS Community with the support of its sub-communities, the Mozilla Campus Club of SLIIT and the SLIIT Women In FOSS Community. It was further backed by GitHub, which has been the backbone of the entire competition in terms of infrastructure.

Fine-grained details about the competition can be found on its official website; it is better witnessed than explained.

The purpose of this text, while scrubbing through a bit of its history, is to give a high-level overview of the technology behind it and to show how far one can go when embracing the power of open source.

Rekindling the flame

Something not known to many is that the oldest traces of Bashaway actually date back to October 2020, when it took place in the shadows. It was much more of a giveaway than a fully fledged competition, and while it was small in scale, it was a success in its own right, as no one could have predicted that it would one day kick off an entirely new class of inter-university competitions.

Bashaway 2022 was a much more refined version of its predecessor. It redefined the meaning of the event, set up a virtual battle arena at the center of the island, and saw the construction of a one-of-its-kind platform which was 100% open source.

The same platform has today been completely revamped and comprises an entire ecosystem of technological marvels. It is a fully automated system and was the central point for Bashaway 2023, which broke all barriers while witnessing the fierce clash between 112 teams from 18 universities across the island. The competition spanned 2 rounds and was a battle of pure wit and skill, with over 2500 submissions made in total for 42 unique challenges.

Here is the aftermath of the battle.

Round 1 Statistics
Round 1 Leaderboard
Round 2 Statistics
Round 2 Leaderboard

The platform

Portal https://portal.bashaway.sliitfoss.org

The portal is the main entry point for the competition and is the place where the participants can register for the competition, manage their teams and submit their solutions.

Admin https://admin.bashaway.sliitfoss.org

A portal governing the entire system which also includes analytics and the ability to grade submissions manually if needed.

Leaderboard https://leaderboard.bashaway.sliitfoss.org

A real time leaderboard which is capable of being masked when it’s time to turn up the heat.

Core Server https://api.bashaway.sliitfoss.org

This is the heart of the system and is the place where all the magic happens. It is responsible for everything which happens behind the scenes.

Scorekeeper https://github.com/sliit-foss/scorekeeper

A repository which leverages the full power of GitHub Actions to automate the scoring of submissions. It is capable of handling thousands of them while staying within the free tier of GitHub Actions.

Bashaway UI https://design.bashaway.sliitfoss.org

A React component library built on top of Radix UI which provides the entire UI of the system. It can also be used as a standalone component library for other projects.

Bashaway Testing https://www.npmjs.com/package/@sliit-foss/bashaway

A JavaScript utility library which was built to aid in the testing of solutions.

A video demonstrating the features of the system can be found over here, and the staging environment is accessible through the following URLs:

Portal — https://portal.staging.bashaway.sliitfoss.org

Admin — https://admin.staging.bashaway.sliitfoss.org

Leaderboard — https://leaderboard.staging.bashaway.sliitfoss.org

API — https://api.staging.bashaway.sliitfoss.org

NOTE: The initial response of the API might be slow as it is hosted on Render's free tier. The first request might take up to 45 seconds, but subsequent requests will be much faster.

The technology behind

The entire system is built on top of the MERN stack, with the addition of Storybook for the component library. The interesting part, however, is the deployment of components and the automated marking of submissions, for both of which GitHub was a lifesaver. The total cost of running the entire system for the span of two months was under $1, something thought to be impossible at the start of the project.

The two primary deployment branches of almost all repositories were main and development. A merge to main would instantly roll out an update to production, and a merge to development to staging. The entirety of the CI/CD was handled by GitHub Actions and incurred no costs, as all repositories were public.

Building all web components with pure React + Tailwind CSS was a conscious decision to keep the system as lightweight as possible. Their build outputs were static files, which made it possible to host them on GitHub Pages, which again is free for public repositories. The only limit is 100GB of bandwidth per month, which in practice was no limitation at all. The lack of server-side rendering was a fair trade-off, and even then, the Lighthouse scores of all components were well above 90. Custom domains were then added under bashaway.sliitfoss.org, pointing to GitHub's servers.

The API was dockerized, and a staging instance was deployed on Render, which is free for 500 hours per month. It does have a build limit of 100 minutes, due to which we disabled automatic deployments. The production API was deployed on Cloud Run, for which we utilized the free $300 sign-up credits.

The biggest gambit, however, was the automated scoring. Some challenges required installing various tools and technologies, including multiple languages, and participants had to have the freedom to install whatever they wished to achieve their targets. An isolated environment was needed, one which could reset itself for each submission.

The solution was to employ GitHub Actions. A new workflow run would be triggered through a repository_dispatch event: upon the invocation of a GitHub API endpoint, a payload consisting of the submission URL, challenge URL, submission info, and team info would be sent over. The workflow would then download the submission, clean it, download the challenge, and use it to replace the original tests. The submission script would then be run, followed by its associated tests. If all tests passed, the workflow would invoke an endpoint within the core server to update the score of the submission. The entire process was quite fast, with each submission marked within a minute or two at most. While jobs could very well run for 6 whole hours, we limited the execution time of each job to 5 minutes. This, along with the limit of 25 concurrent jobs, was more than enough to handle the load of 100+ teams.
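To make the trigger step concrete, here is a minimal sketch of firing such a repository_dispatch event from a server. Only the endpoint itself (POST /repos/{owner}/{repo}/dispatches) is GitHub's real API; the event type and payload field names below are illustrative assumptions, not the exact ones Bashaway uses.

```javascript
// Builds the body sent with the repository_dispatch call.
// The "score-submission" event name and client_payload shape are hypothetical.
function buildDispatchPayload({ submissionUrl, challengeUrl, submission, team }) {
  return {
    event_type: "score-submission",
    client_payload: {
      submission_url: submissionUrl,
      challenge_url: challengeUrl,
      submission,
      team,
    },
  };
}

// Fires the dispatch; requires a token with repo scope.
// GitHub responds with 204 No Content on success.
async function triggerScoring(token, payload) {
  const res = await fetch(
    "https://api.github.com/repos/sliit-foss/scorekeeper/dispatches",
    {
      method: "POST",
      headers: {
        Accept: "application/vnd.github+json",
        Authorization: `Bearer ${token}`,
      },
      body: JSON.stringify(payload),
    }
  );
  if (res.status !== 204) throw new Error(`Dispatch failed: ${res.status}`);
}
```

A workflow listening for this event type would then pick up the client_payload and carry out the download, test replacement, and scoring steps described above.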

This also allowed us to release challenges with all of their tests provided alongside. This might have been a first for a competition of this nature, but it paid off: participants were able to test their solutions locally before submitting them, and could learn from the tests themselves. The inputs for the tests were regenerated on each run using Faker.js, which made it impossible to hardcode solutions or obtain them from the likes of AI tools.
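The idea behind regenerated inputs can be sketched as follows. Bashaway used Faker.js for this; the dependency-free stand-in below uses plain Math.random, and the "sum the numbers" challenge is a made-up example, not one of the real 42.

```javascript
// Hypothetical input generator for a "sum the numbers" style challenge.
// Because inputs are freshly generated on every test run, a hardcoded
// answer from a previous run (or an AI tool) can never match.
function generateInput(size = 10) {
  return Array.from({ length: size }, () => Math.floor(Math.random() * 1000));
}

// The expected answer is computed from the generated input at test time,
// so the test only passes if the solution actually does the work.
function expectedSum(input) {
  return input.reduce((acc, n) => acc + n, 0);
}
```

A test would then call the participant's solution with generateInput() and compare its output against expectedSum() of the same array.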

Workflow Runs
Workflow Steps
Passing Tests
Failing Tests

Each submission was linked to its associated workflow run, which was accessible through our portal. A team would additionally receive an email with a link to it on each submission. The emails were sent out through Nodemailer connected to a Gmail transport within the workflow itself. Gmail does have a limit of 500 emails per day, due to which we unfortunately had to swap out accounts once the quota was reached.
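A minimal sketch of composing such a message is shown below. The field names (from, to, subject, html) match Nodemailer's sendMail() options, but the sender address, subject line, and wording are assumptions rather than Bashaway's actual template.

```javascript
// Builds the Nodemailer message options for a per-submission email
// that links the team to its GitHub Actions workflow run.
function buildRunEmail({ teamEmail, teamName, runUrl }) {
  return {
    from: '"Bashaway" <noreply@example.com>', // hypothetical sender
    to: teamEmail,
    subject: `Your submission is being graded, ${teamName}`,
    html: `<p>Track your workflow run here: <a href="${runUrl}">${runUrl}</a></p>`,
  };
}
```

With a Gmail transport, this object would be passed to transporter.sendMail(); Gmail's roughly 500-message daily cap is what forced the account swap mentioned above.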

Workflow Email

The following diagram depicts the full scoring process (for a two-way communication, the direction of the action is towards the arrowhead).

Scoring Mechanism

All packages were versioned in accordance with Semantic Versioning which was automated with the use of Conventional Commits and SLIIT FOSS Automatic Versioning before being published to NPM.
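The mapping from commit messages to version bumps can be sketched as below. This is a simplification of what automated versioning tools do under the hood; real tools also parse scopes, footers, and whole commit ranges, and this is not the actual implementation of SLIIT FOSS Automatic Versioning.

```javascript
// Maps a Conventional Commit message to a Semantic Versioning bump.
// "!" after the type (or a BREAKING CHANGE footer) => major,
// feat => minor, fix => patch, anything else => no release.
function bumpFromCommit(message) {
  if (/^[a-z]+(\([^)]*\))?!:/.test(message) || message.includes("BREAKING CHANGE")) {
    return "major";
  }
  if (message.startsWith("feat")) return "minor";
  if (message.startsWith("fix")) return "patch";
  return null; // chore, docs, refactor, etc.
}

// Applies a bump to a "major.minor.patch" version string.
function applyBump(version, bump) {
  const [major, minor, patch] = version.split(".").map(Number);
  if (bump === "major") return `${major + 1}.0.0`;
  if (bump === "minor") return `${major}.${minor + 1}.0`;
  if (bump === "patch") return `${major}.${minor}.${patch + 1}`;
  return version;
}
```

For example, merging a "feat: mask the leaderboard" commit onto a package at 1.2.3 would publish 1.3.0, while "fix(api): handle timeouts" would publish 1.2.4.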

Automated Versioning
GitHub Releases

Error tracking and analytics

The entire system was monitored using Sentry. The GitHub Student Developer Pack was utilized for this purpose, granting us the capacity to capture 5000 errors and 500 replays. We limited the replays to capture only errors in order to prevent overuse. Overall, we had a solid mechanism for detecting errors and fixing them before they could cause any major issues, along with the ability to replay them exactly as they occurred.
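Restricting replays to error cases comes down to two sampling options in Sentry's browser SDK. The option names below are Sentry's real ones; the DSN is a placeholder, and the exact integration wiring may differ by SDK version.

```javascript
// Sentry options that stretch a 500-replay quota: never record ordinary
// sessions, always record the session when an error occurs.
const sentryOptions = {
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
  replaysSessionSampleRate: 0, // no replays for healthy sessions
  replaysOnErrorSampleRate: 1.0, // full replay whenever an error is captured
};
```

These options would be passed to Sentry.init() along with the Replay integration in the frontend entry point.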

Sentry Error List
Sentry Error Details

All backend activity was accessible through the Log Explorer of Google Cloud. The server was configured to log all requests along with their associated payloads while masking out sensitive information. This, along with detailed traces of their execution flow, was useful in determining the cause of any issue which occurred within the system. The process relied primarily on the tracing functions within the NPM library @sliit-foss/functions.
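The payload-masking step can be sketched as a small recursive helper. The field list and recursion strategy here are assumptions for illustration; Bashaway's actual masking lives in its server middleware.

```javascript
// Keys whose values should never reach the logs (hypothetical list).
const SENSITIVE_KEYS = new Set(["password", "token", "authorization", "secret"]);

// Recursively walks a request payload, replacing sensitive values with a
// fixed mask while leaving everything else intact for debugging.
function maskPayload(value) {
  if (Array.isArray(value)) return value.map(maskPayload);
  if (value && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value).map(([key, v]) =>
        SENSITIVE_KEYS.has(key.toLowerCase()) ? [key, "*****"] : [key, maskPayload(v)]
      )
    );
  }
  return value;
}
```

A logging middleware would run each request body through maskPayload() before handing it to the logger, so credentials never appear in the Log Explorer.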

GCP Log Explorer

Google Analytics was integrated to provide us with useful metrics such as the number of users, their locations, the devices they used and the pages they visited.

User Acquisition
User Engagement
Demographic Details
Tech Overview

The future

The future of Bashaway is bright, and the competition is expected to be held annually. Everything related to this year's edition has now been officially released, along with the challenges and their solutions. The system itself is expected to be improved upon, and the decision to make it public was taken to open up the possibility of adapting it for other competitions of the same nature. Let this be the start of a new era of inter-university competitions, and let the power of open source be the driving force behind it.
