Google Summer of Code 2021 with CloudCV

Gautam Jajoo
4 min read · Aug 22, 2021


Introduction

I am Gautam Jajoo, an undergraduate student at BITS Pilani. I was accepted to GSoC 2021 with CloudCV for the project “Improvements in EvalAI Frontend”. The past three months were exciting and a lot of fun: every day I learned something new, and I got to work on a project that has a real impact on its users.

About CloudCV

CloudCV is an open-source cloud platform that aims to make AI research more reproducible. CloudCV actively works on building tools that enable researchers to build, compare, and share state-of-the-art algorithms.

EvalAI, a project of CloudCV, is an open-source platform for evaluating and comparing Machine Learning (ML) and Artificial Intelligence (AI) algorithms at scale. Some of the features it offers:

  • Custom evaluation protocols and phases
  • Remote evaluation
  • CLI support
  • Portability
  • Evaluation inside custom environments

Brief description of the project

After last year’s GSoC, EvalAI-ngx had reached feature parity with the existing UI, so this project involved fixing the last remaining kinks as we replaced the old UI with the new one. We also incorporated feedback from the hosts and participants of the AI challenges organized on the platform to further improve the new UI.

The Work

The work done during this period can be broadly divided into a few verticals, detailed in the Contributions section below.

Here is a link to the GSoC presentation.

Contributions

  1. Settings tab: The settings tab is where hosts can edit most of the challenge details; we provided a tab-like structure for the hosts’ convenience.
  • Add edit evaluation script to the settings tab (#3451)
  • Add edit overview to the settings tab (#3462)
  • Add phase edit option with an edit modal (#3460)
  • Add edit terms and conditions to the settings tab (#3483)
  • Add more parameters to the edit phase modal and an option to change phase and submission visibility (#3491)
  • Add leaderboard edit section, including phase split visibility change (#3524)
  • Add a tab-like view structure to the settings tab, and select the first option as default for the select component (#3551)
  • Add leaderboard precision value to the leaderboard section, and increase the precision to 20 (#3568, #3543)

2. Addition of two new sections on the homepage

  • Add testimonial section (#3527)
  • Add Twitter feed section (#3282)

3. Optimising the components: This decreased the data load time across various components, especially the challenge-list page.

  • Add lazy loading to tabs on the various pages (#3548)
  • Remove stars fetch function from the challenge-list component (#3545)
  • Remove unnecessary conditions, imports, and services, which reduces the bundle size and hence the load time (#3547)
  • Add lazy loading to images across all components (#3571), with a minimal sketch of the idea below
  • Remove submissions count API call from the submissions page (#3562)
Page Speed Insights score for the homepage
Page Speed Insights score for the challenge-list page
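
To give a flavour of how image lazy loading can work in an Angular app, here is a minimal sketch of a directive that defers setting an image’s src until the element scrolls into view, using IntersectionObserver. The directive name and details are illustrative assumptions, not the exact EvalAI implementation.

```typescript
import { Directive, ElementRef, Input, OnInit } from '@angular/core';

// Hypothetical directive for illustration; the actual EvalAI code may differ.
@Directive({ selector: 'img[appLazyImg]' })
export class LazyImgDirective implements OnInit {
  // The real image URL is held here until the element becomes visible.
  @Input('appLazyImg') src = '';

  constructor(private el: ElementRef<HTMLImageElement>) {}

  ngOnInit(): void {
    // Fallback for browsers without IntersectionObserver: load immediately.
    if (!('IntersectionObserver' in window)) {
      this.el.nativeElement.src = this.src;
      return;
    }
    const observer = new IntersectionObserver((entries) => {
      entries.forEach((entry) => {
        if (entry.isIntersecting) {
          // Assign the src only when the image is visible, deferring the download.
          this.el.nativeElement.src = this.src;
          observer.unobserve(entry.target);
        }
      });
    });
    observer.observe(this.el.nativeElement);
  }
}
```

A template would then use something like `<img appLazyImg="assets/images/banner.png" alt="banner">`, so off-screen images no longer block the initial page load.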

4. Tab highlight feature: The active tab is now highlighted according to the path the user followed to reach the challenge component (#3531)
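
As a rough sketch of how path-based highlighting can be done in Angular, the component below listens to router navigation events and marks the tab matching the current URL as active. The component, tab names, and routes are illustrative assumptions, not the exact EvalAI code.

```typescript
import { Component } from '@angular/core';
import { NavigationEnd, Router } from '@angular/router';
import { filter } from 'rxjs/operators';

// Hypothetical component for illustration; the actual EvalAI code may differ.
@Component({
  selector: 'app-challenge-tabs',
  template: `
    <a [class.active]="activeTab === 'overview'" routerLink="overview">Overview</a>
    <a [class.active]="activeTab === 'leaderboard'" routerLink="leaderboard">Leaderboard</a>
  `,
})
export class ChallengeTabsComponent {
  activeTab = 'overview';

  constructor(router: Router) {
    // Highlight the tab that matches the path the user navigated to.
    router.events
      .pipe(filter((event): event is NavigationEnd => event instanceof NavigationEnd))
      .subscribe((event) => {
        this.activeTab = event.urlAfterRedirects.includes('leaderboard')
          ? 'leaderboard'
          : 'overview';
      });
  }
}
```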

5. UI Enhancements

  • Restructure submissions page with changes in the Bootstrap grid layout (#3534)
  • Center content on the challenge page (#3455)
  • Fix challenge-template page (#3532)
  • Addition of notes at various places, including when a challenge is unapproved, submissions exceed 5k, or the test annotation file size exceeds 100 MB (#3556, #3442, #3570)

6. Bug Fixes

  • Fix leaderboard polling logic (#3423)
  • Fix team filter option on the all-submissions page (#3424)
  • Fix S.No on the submissions page (#3425)
  • Fix console errors when loading challenges (#3449, #3514)
  • Fix organisation logo on the challenge card (#3521)
  • Fix worker logs error notification (#3486)
  • Fix error for change in expression value (#3550)
  • Show/hide private/public button on the submit page (#3541)
  • Fetch all submissions when the checkbox is ticked after polling (#3557)

And many more minor fixes.

7. Unit testing using Karma and Jasmine (#3581)
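
As a flavour of what these specs look like, here is a minimal Jasmine test that Karma can run in the browser. The pipe under test is a hypothetical example written inline to keep the spec self-contained, not one of the actual EvalAI components covered by the PR.

```typescript
import { Pipe, PipeTransform } from '@angular/core';

// Hypothetical pipe, defined inline so the spec is self-contained.
@Pipe({ name: 'truncate' })
class TruncatePipe implements PipeTransform {
  transform(value: string, limit = 10): string {
    return value.length > limit ? value.slice(0, limit) + '…' : value;
  }
}

// Jasmine spec; Karma executes specs like this in a real browser.
describe('TruncatePipe', () => {
  const pipe = new TruncatePipe();

  it('leaves short strings untouched', () => {
    expect(pipe.transform('EvalAI')).toBe('EvalAI');
  });

  it('truncates long strings to the given limit', () => {
    expect(pipe.transform('Google Summer of Code', 6)).toBe('Google…');
  });
});
```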

Future Plans

Currently, v2 is on the staging server and the hosts of various challenges are already using it; post-GSoC, we will be ready to move it to production for everyone. We are now working on fixing the last few remaining issues and completing the frontend unit testing.

Conclusion

I want to thank all the Org Admins and Mentors who helped me during this period. The mentors were really supportive and encouraging: every doubt, however silly, was resolved as quickly as possible, and PRs were reviewed on time. The scrum calls also helped us interact with the mentors and gave us daily feedback and updates on the timeline.

Rishabh Jain - Org Admin

Ram Ramrakhya - Org Admin

Kajol Kumari - Mentor

Finally, I would like to thank Google’s open-source community for running such a program and providing this opportunity to contribute; it has been a wonderful journey for me.

Thank You!
