HackMIT 2020: through the lens of a judge and mentor

Sidra Ahmed
7 min read · Sep 24, 2020


Recently, I was given the opportunity to be a mentor and judge for one of the largest undergraduate hackathons in the world, HackMIT 2020. It was a weekend-long event in which thousands of students from across the globe came together to learn new technologies and collaborate on some really cool and innovative software/hardware projects. 💡🌍

This year has been nothing if not unexpected, and many of us have found ourselves increasingly disconnected and isolated. That became one of the key motivating factors behind the first project track at HackMIT: communication connectivity. Beyond this track, students were also encouraged to build solutions addressing health tech, education and urban innovation.

Adding to this, IBM has had a longstanding involvement with HackMIT over the past few years and was one of the Gold sponsors of this year's event too! Along with the sponsorship, IBMers like me were given the chance to host workshops showcasing different IBM technologies, as well as a sponsor challenge aligned with IBM's Call for Code initiative, which focuses on inspiring developers to build solutions that mitigate the impact of climate change and COVID-19.

HackMIT roadmap: Mentor and Judge edition 💁🏽

With the event logistics covered, let's do a brief walkthrough of my experience as a technical mentor and expo judge at the event. As a technical mentor, I got the chance to help students with:

  • Brainstorming project ideas aligned with the different tracks while keeping feasibility and usability in mind.
  • Setting up development environments.
  • Code debugging and troubleshooting.
  • Leveraging open source frameworks and libraries.
  • API usage and authentication.

Apart from these key tasks, I also helped students build the skills needed to present an engaging and interesting demo. As an expo judge, I was given a list of hackathon projects (involving IBM tech and open source) and asked to watch each project demo and rate it on factors such as innovation, solution completeness, project niche and open source tool usage 💻✔️

Overall, my experience as a mentor and judge at HackMIT was extremely gratifying, and more of a "personal milestone achieved" moment for me. Contributing to an inclusive developer community and passing on my knowledge to nurture that community has always been one of my goals.

Here’s a curated list of the projects I mentored and judged 📌

  • Guide & Vacuate: Guide & Vacuate is a real-time, IoT-based emergency guidance device that helps civilians evacuate safely during emergencies through voice assistance, indoor navigation and smart sensor data. It measures and responds to sensor data while sending alerts to affected civilians, and it calculates the shortest and safest evacuation route using BLE (Bluetooth Low Energy) checkpoints (see the route-planning sketch after this list). The app was developed using Firebase and Flutter, with OpenCV for computer vision tasks and IBM Watson APIs for TTS (Text to Speech) voice assistance. Here’s a link to this project’s GitHub repository: GuideandVacuate
  • WebShelf: WebShelf aims to link communities together one annotation at a time. It’s a personal file system for web content with the functionality of modern social media: convenient webpage highlights and annotations, communal features such as public-facing libraries, a personalized feed (likes, comments, shares), and an ML ConnectBot (created using Watson Assistant and IBM Discovery News) that uses your historical preferences to automatically spark relevant conversations between users. Together, these usher in a new medium for connectivity. Here’s a link to this project’s GitHub repository: WebShelf
  • Buy or $hort: Buy or $hort is a financial literacy game that teaches people of all ages how to invest in the stock market by analyzing the components of the economy and society that affect stock prices, drawing on signals like Twitter posts, company growth data and even news articles. A data visualization segment better depicts the variables that affect stocks. The project was implemented using the Twitter API, Yahoo Finance API, News API, IBM Tone Analyzer (a Tone Analyzer sketch follows this list), React.js (React-vis, Bootstrap, CSS), and Flask (Python, Requests, JSON). Here’s a link to the GitHub repository for this project: Buyor$hort
  • Debbie: Debbie is an AI-inspired React web application that recommends personalized exercise sessions based on a user’s physical capabilities and possible disabilities. It also incorporates a physician integration module to help with medical administrative tasks like monitoring a patient’s health remotely. On top of this, there’s an emergency notification system to inform family members about possible injuries during workout sessions. The application uses Material UI components for the visual design of product features, Okta sign-in integration to manage account creation, Flask and SQLAlchemy to query for and store user training sessions, and scikit-learn for the AI recommendation model. Last but not least, it uses Twilio to send SMS messages when a patient is in danger (see the Twilio sketch after this list). Here’s a link to the web app demo: Debbie
  • Latent Space: Real-time collaboration during the COVID-19 pandemic is hard. Without being able to meet in person, people are forced to connect via video call platforms such as Zoom, Google Meet, and Jitsi. Today’s leading platforms have extremely high latencies: humans can perceive latencies of 10ms and above, yet Zoom’s latency is around 135ms and Google Meet’s around 100ms. Latent Space is a decentralized video calling web app that utilizes recurrent neural autoencoders, with embeddings designed specifically for music, to transfer encoded (compressed) audio via RTC packets with minimal latency (a minimal autoencoder sketch follows this list). Musicians and anyone wishing to meet virtually can join a call at latent-space.tech, where they can record audio accurately and with ease. Here’s a link to the project demo: LatentSpace
  • Lyrum: Lyrum is a social media platform to help facilitate sharing and finding new music. Users can post about songs they like, which is then visible to all of their followers. These followers can like, comment, or even play the attached song directly from the app. The login, followers, songs, and user profiles come from the Spotify Web API. To extract the data efficiently, the developers wrote their own JSON parsing approach that converts the API’s arrays into dictionaries; they built it because the documented approach was slow, and the app performed much better with it (a small indexing sketch follows this list). Here’s the link to the GitHub repository for this project: Lyrum
  • WatchBot: An opt-in Reddit bot service that uses an ML text classification model to analyze a user’s recent post history for suicidal sentiment. If recent posts indicate a high risk of suicide, the user-defined emergency contacts are notified. The bot utilizes several models, one of them being a deep-learning neural network that classifies texts for suicidal tendencies with IBM Watson Natural Language Understanding (NLU). Specifically, Watson NLU analyzes every element in the dataset for overall sentiment (whether it is generally positive or negative), as well as how strongly it displays the specific emotions of joy, sadness, anger, disgust, and fear (see the NLU sketch after this list). Here’s the link to the GitHub repository for this project: WatchBot
  • Green Machine: Green Machine is a web app that estimates and visualizes the carbon cost of high-performance computations, starting with the carbon cost of ML models. It calculates the carbon emissions of high-performance computing services, shares insights on how to reduce your carbon impact as a developer, and helps you understand your computational data through interactive visualizations (a back-of-the-envelope version of such an estimate follows this list). Here is a link to the web app demo: GreenMachine
  • Nest: Nest is an immersive augmented reality “second brain” meant to free the original human brain to focus on the things that matter. Products like Roam Research exist, but they aren’t immersive: you can’t interact with and be inside your second brain, and you’re restricted by the screen size. The second brain in this application can cluster important pieces of information together and even create links between them. It could also be a massive productivity booster, for example by integrating a machine learning recommendation engine to show a recommendation dashboard alongside the brain. Here’s the link to the GitHub repository for this project: Nest
  • iSight: Traditionally, people with severe visual impairments are assisted by caretakers when they are in public spaces (e.g. crossing the street, navigating a tight space, etc.), leading to a lost sense of independence. iSight is an end-to-end app that generates real-time captions of a user’s camera feed and converts them into human-like text-to-speech narratives. From Facebook Research’s SlowFast video recognition model to WaveNet models to the Google Cloud Vision API, a whole range of frameworks and models were used to build this solution. Here’s the link to the GitHub repository for this project: iSight
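
A few of these projects lean on building blocks that show up at almost every hackathon, so here are some minimal, illustrative sketches of them. To be clear, none of this is the teams’ actual code; every name, credential, dimension and number below is a placeholder.

Guide & Vacuate’s route planning boils down to a shortest-path search over a graph of checkpoints, where each edge cost can fold in a hazard penalty so the chosen route is both short and safe. A hand-rolled Dijkstra over a hypothetical BLE checkpoint graph might look like this:

```python
import heapq

def safest_route(graph, start, exit_node):
    """Dijkstra over a checkpoint graph. Edge weights combine distance
    with a hazard penalty, so the cheapest route is short *and* safe.
    `graph` maps node -> list of (neighbor, cost) pairs."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == exit_node:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Walk predecessors back from the exit to rebuild the path
    path, node = [], exit_node
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path))

# Hypothetical checkpoints: cost = distance scaled up by detected hazards
graph = {
    "lobby":  [("hall_a", 3.0), ("hall_b", 5.0)],
    "hall_a": [("exit_1", 12.0)],  # smoke detected -> heavily penalized
    "hall_b": [("exit_1", 4.0)],
    "exit_1": [],
}
print(safest_route(graph, "lobby", "exit_1"))  # ['lobby', 'hall_b', 'exit_1']
```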
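
Buy or $hort scores sentiment-bearing text like tweets with IBM Tone Analyzer. Using the official ibm-watson Python SDK, a minimal call (with placeholder credentials and service URL, not the team’s pipeline) looks roughly like this:

```python
from ibm_watson import ToneAnalyzerV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials and region URL -- substitute your own IBM Cloud values
authenticator = IAMAuthenticator("YOUR_IAM_API_KEY")
tone_analyzer = ToneAnalyzerV3(version="2017-09-21", authenticator=authenticator)
tone_analyzer.set_service_url("https://api.us-south.tone-analyzer.watson.cloud.ibm.com")

tweet = "Earnings beat expectations, but the guidance looks shaky."
result = tone_analyzer.tone(
    tone_input={"text": tweet},
    content_type="application/json",
).get_result()

# Each detected tone (analytical, tentative, joy, ...) comes with a 0-1 score
for tone in result["document_tone"]["tones"]:
    print(tone["tone_name"], tone["score"])
```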
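
Debbie’s emergency notifications are a textbook use of Twilio’s SMS API. A minimal sketch, assuming placeholder credentials and phone numbers:

```python
from twilio.rest import Client

# Placeholder credentials -- substitute your real Twilio account values
client = Client("ACCOUNT_SID", "AUTH_TOKEN")

def alert_family(contact_number: str, patient_name: str) -> None:
    """Send an SMS when the app flags a possible injury mid-workout."""
    client.messages.create(
        to=contact_number,
        from_="+15550100000",  # your Twilio number (placeholder)
        body=f"Alert: {patient_name} may need assistance during their exercise session.",
    )

alert_family("+15550199999", "Alex")
```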
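
The array-to-dictionary trick the Lyrum team describes is easy to illustrate: re-indexing a JSON array by a stable key turns repeated O(n) list scans into O(1) lookups. A toy version with a Spotify-style payload (the track data here is made up):

```python
import json

# Example Spotify-style payload: an array of track objects
payload = json.loads("""
{"items": [
  {"id": "3n3Ppam7vgaVa1iaRUc9Lp", "name": "Mr. Brightside"},
  {"id": "7ouMYWpwJ422jRcDASZB7P", "name": "Knights of Cydonia"}
]}
""")

# Re-index the array as a dictionary keyed by track id:
# repeated lookups drop from O(n) list scans to O(1) dict access
tracks_by_id = {track["id"]: track for track in payload["items"]}

print(tracks_by_id["3n3Ppam7vgaVa1iaRUc9Lp"]["name"])  # Mr. Brightside
```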
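
WatchBot’s per-post analysis maps directly onto Watson NLU’s sentiment and emotion features. A minimal sketch with the ibm-watson Python SDK (placeholder credentials again, and not the team’s actual pipeline):

```python
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import (
    Features, SentimentOptions, EmotionOptions,
)
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials and region URL -- substitute your own IBM Cloud values
authenticator = IAMAuthenticator("YOUR_IAM_API_KEY")
nlu = NaturalLanguageUnderstandingV1(version="2020-08-01", authenticator=authenticator)
nlu.set_service_url("https://api.us-south.natural-language-understanding.watson.cloud.ibm.com")

post = "I don't see the point in anything anymore."
result = nlu.analyze(
    text=post,
    features=Features(sentiment=SentimentOptions(), emotion=EmotionOptions()),
).get_result()

print(result["sentiment"]["document"])           # overall positive/negative score
print(result["emotion"]["document"]["emotion"])  # joy, sadness, anger, disgust, fear
```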
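
Green Machine’s core estimate, at its simplest, is arithmetic: hardware power draw times runtime gives energy, scaled by datacenter overhead (PUE) and the grid’s carbon intensity. The figures below are illustrative global averages, not the values Green Machine uses:

```python
def carbon_estimate_kg(gpu_watts: float, hours: float,
                       pue: float = 1.58, grid_kg_per_kwh: float = 0.475) -> float:
    """Rough CO2e estimate for a compute job: energy drawn by the hardware,
    scaled by datacenter overhead (PUE) and grid carbon intensity.
    Defaults are illustrative averages, not Green Machine's values."""
    energy_kwh = (gpu_watts / 1000.0) * hours * pue
    return energy_kwh * grid_kg_per_kwh

# e.g. one 250 W GPU training for 48 hours
print(f"{carbon_estimate_kg(250, 48):.1f} kg CO2e")  # ~9.0 kg
```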
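
Finally, the heart of Latent Space, a recurrent autoencoder, can be sketched in a few lines of PyTorch. The dimensions here are made up; the point is only that the encoder’s latent frames are much smaller than the input features, which is what makes them cheap to ship in RTC packets:

```python
import torch
import torch.nn as nn

class RecurrentAudioAutoencoder(nn.Module):
    """Minimal sketch: a GRU encoder squeezes each frame of audio features
    into a small latent vector (what would travel in the RTC packet);
    a GRU decoder reconstructs the frames on the receiving end."""
    def __init__(self, n_features: int = 128, latent_dim: int = 16):
        super().__init__()
        self.encoder = nn.GRU(n_features, latent_dim, batch_first=True)
        self.decoder = nn.GRU(latent_dim, n_features, batch_first=True)

    def forward(self, frames: torch.Tensor):
        latent, _ = self.encoder(frames)          # (batch, time, latent_dim)
        reconstruction, _ = self.decoder(latent)  # (batch, time, n_features)
        return reconstruction, latent

model = RecurrentAudioAutoencoder()
frames = torch.randn(1, 50, 128)  # one clip: 50 frames of 128-dim features
recon, latent = model(frames)
print(latent.shape)  # torch.Size([1, 50, 16]) -- an 8x compression per frame
```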

Get in touch

If you have any further questions or tips on mentoring/judging, feel free to reach out 📫
