GSoC’17 @ CloudCV : EvalAI Goals
The final week of August marked the end of a three-month-long journey called Google Summer of Code, and what a wonderful summer it has been! It turned out to be everything I could hope for & more. I had the honour of working with some of the coolest & nicest people I have come across. My mentors Deshraj Yadav, Shiv Baran Singh, Akash Jain, Taranjeet Singh and Harsh Agrawal were a constant source of support and guidance throughout the summer of code.
About The Project
I was working on CloudCV’s project EvalAI, a web app that helps researchers, students and data scientists create, collaborate on & participate in various AI challenges organised around the globe.
Goals for the Summer
- Create an easy-to-follow process for challenge creation on EvalAI
- Display all the submissions for a challenge host under one tab
- Export all the submissions for participants as well as hosts
- Star/Unstar a challenge
- Design a new dashboard
- Introduce Challenge Analytics for participants as well as hosts
- Update metadata for a submission
- Automate the submission worker
- Write tests for the APIs
- Write documentation
I had to create the UI as well as the back-end APIs, along with tests, during the summer.
Description
Create an easy-to-follow process for challenge creation on EvalAI
A challenge on EvalAI can be created in two ways:
1. Using a zip configuration file
The challenge bundle consists of a YAML config file, an evaluation script and a test annotation file, all packed into a single zip, which is then uploaded to the app to create the challenge. The major challenge with this feature was maintaining the atomic nature of the transaction and handling all the corner cases (a minimal sketch of such an atomic flow is shown below, after this list).
2. Using Interactive UI
A challenge host can also fill in a multi-step form for the challenge, leaderboard, challenge phases, dataset splits and challenge phase splits, then review and submit the form to create the challenge. The major challenges I faced were minimising the API calls during form submission and handling the data during form review and update.
The admin then steps in to approve the challenge, after which it is hosted on EvalAI.
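To give a flavour of what keeping the zip-based creation atomic means in practice, here is a minimal sketch using Django’s transaction.atomic. The model names (Challenge, ChallengePhase), the YAML keys and the import paths are illustrative assumptions, not EvalAI’s actual code.

```python
# Minimal sketch of an atomic challenge-creation flow from a zip bundle.
# Challenge, ChallengePhase and the config keys are hypothetical stand-ins.
import yaml
import zipfile

from django.core.files.base import ContentFile
from django.db import transaction

from challenges.models import Challenge, ChallengePhase  # hypothetical models


def create_challenge_from_zip(zip_path, host_team):
    with zipfile.ZipFile(zip_path) as archive:
        config = yaml.safe_load(archive.read("challenge_config.yaml"))

        # One transaction for the whole bundle: if any phase is malformed or
        # a file is missing, everything rolls back instead of leaving a
        # half-created challenge behind.
        with transaction.atomic():
            challenge = Challenge.objects.create(
                title=config["title"],
                description=config["description"],
                creator=host_team,
            )
            for phase in config.get("challenge_phases", []):
                annotation_name = phase["test_annotation_file"]
                ChallengePhase.objects.create(
                    challenge=challenge,
                    name=phase["name"],
                    codename=phase["codename"],
                    test_annotation=ContentFile(
                        archive.read(annotation_name), name=annotation_name
                    ),
                )
    return challenge
```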
Display all the submissions for a challenge host under one tab
This feature helps a challenge host view all the submissions to their challenge under a single tab.
Export all the submissions for participants as well as hosts
Exporting all the submissions helps challenge hosts as well as participants analyse their submissions. I have added a feature to download all the submissions as a CSV file.
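For illustration, here is a minimal sketch of what such an export view can look like with Python’s csv module and a plain Django HttpResponse; the Submission model, its import path and its field names are assumptions for the example, not EvalAI’s actual schema.

```python
# Minimal sketch of a "download all submissions as CSV" view.
import csv

from django.http import HttpResponse

from challenges.models import Submission  # hypothetical model import


def export_submissions_csv(request, challenge_id):
    response = HttpResponse(content_type="text/csv")
    response["Content-Disposition"] = "attachment; filename=all_submissions.csv"

    writer = csv.writer(response)
    writer.writerow(["Participant Team", "Phase", "Status", "Submitted At"])

    submissions = Submission.objects.filter(
        challenge_phase__challenge__id=challenge_id
    )
    for submission in submissions:
        writer.writerow([
            submission.participant_team.team_name,
            submission.challenge_phase.name,
            submission.status,
            submission.submitted_at,
        ])
    return response
```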
Star/Unstar a challenge
Starring and unstarring a challenge helps challenge hosts as well as participants track the popularity of a challenge.
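A rough sketch of how a star/unstar toggle can be exposed as a single endpoint is shown below; StarChallenge is a hypothetical model linking a user to a challenge, and the view is only illustrative, not EvalAI’s actual API.

```python
# Minimal sketch of a star/unstar toggle endpoint.
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response

from challenges.models import StarChallenge  # hypothetical model import


@api_view(["POST"])
def toggle_star(request, challenge_id):
    # First call stars the challenge; every later call flips the flag.
    star, created = StarChallenge.objects.get_or_create(
        user=request.user,
        challenge_id=challenge_id,
        defaults={"is_starred": True},
    )
    if not created:
        star.is_starred = not star.is_starred
        star.save()

    star_count = StarChallenge.objects.filter(
        challenge_id=challenge_id, is_starred=True
    ).count()
    return Response(
        {"is_starred": star.is_starred, "count": star_count},
        status=status.HTTP_200_OK,
    )
```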
Designing a new dashboard
This was a major task, as we wanted to replace the old dashboard for participants as well as hosts in order to integrate the challenge analytics.
Introducing Challenge Analytics for challenge hosts as well as participants
1. Challenge Host
Challenge Analytics is an important feature that helps challenge hosts gain insight into their hosted challenges. A new analytics dashboard was also designed for the challenge host to view these numbers. The challenge analytics include the total hosted challenges, the total submissions in a challenge phase, the total participant teams, and the latest submission date and time in a challenge phase. The challenge was to minimise the data requests and write an efficient database query; we achieved this by caching the analytics data (a minimal caching sketch is shown after this list).
2. Participants
Analytics for participants includes informing the participant about their total submissions in a day and overall, and the remaining time until the next set of submissions once the daily limit has been exhausted. Moreover, participants can now also view the total submissions to a challenge phase.
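As an example of the caching approach mentioned above, here is a minimal sketch using Django’s cache framework; the Submission model, its import path and the cache key are illustrative assumptions rather than EvalAI’s actual implementation.

```python
# Minimal sketch of caching an expensive analytics query.
from django.core.cache import cache

from challenges.models import Submission  # hypothetical model import

CACHE_TIMEOUT = 60 * 60  # recompute at most once per hour


def get_phase_submission_count(challenge_phase_id):
    cache_key = "phase_submission_count_{}".format(challenge_phase_id)
    count = cache.get(cache_key)
    if count is None:
        # Cache miss: hit the database once, then serve from the cache.
        count = Submission.objects.filter(
            challenge_phase__id=challenge_phase_id
        ).count()
        cache.set(cache_key, count, CACHE_TIMEOUT)
    return count
```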
The transition from the old dashboard to the new one, integrated with the challenge analytics dashboard, is shown here.
Updating metadata for a submission
Submission metadata is the data a participant submits when creating a submission on EvalAI, such as the method name, method URL, project URL, etc. A participant can now also update this data using a simple modal.
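A minimal sketch of the kind of endpoint that could back such a modal, using a PATCH request with a Django REST Framework serializer, is given below; SubmissionSerializer, the import paths and the lookup fields are assumptions for illustration, not EvalAI’s actual code.

```python
# Minimal sketch of a partial-update endpoint for submission metadata.
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response

from challenges.models import Submission              # hypothetical model
from challenges.serializers import SubmissionSerializer  # hypothetical serializer


@api_view(["PATCH"])
def update_submission_meta(request, submission_id):
    submission = Submission.objects.get(
        id=submission_id, created_by=request.user
    )
    # partial=True lets the participant update only the fields sent from
    # the modal (method name, method URL, project URL, ...).
    serializer = SubmissionSerializer(
        submission, data=request.data, partial=True
    )
    if serializer.is_valid():
        serializer.save()
        return Response(serializer.data, status=status.HTTP_200_OK)
    return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
```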
Automating the submission worker
Let me give a brief description of the worker. The submission worker is a RabbitMQ worker responsible for evaluating submission messages. It listens on a queue and, on receiving a message for a submission, processes & evaluates it. Earlier we used to reload the submission worker manually after a new challenge was created, but now the EvalAI admin can reload it from the django-admin at the time of approving the challenge.
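To make the worker’s role concrete, here is a minimal sketch of a RabbitMQ consumer built with pika; the queue name, the message format and the evaluate_submission helper are illustrative assumptions, not EvalAI’s actual worker code.

```python
# Minimal sketch of a RabbitMQ consumer in the spirit of the submission worker.
import json

import pika


def evaluate_submission(submission_id):
    # Hypothetical stand-in for the real evaluation logic.
    print("Evaluating submission", submission_id)


def process_submission(channel, method, properties, body):
    message = json.loads(body)
    # Evaluate the submission referenced in the message, then ack it so
    # RabbitMQ does not redeliver it to another worker.
    evaluate_submission(message["submission_id"])
    channel.basic_ack(delivery_tag=method.delivery_tag)


connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="submission_task_queue", durable=True)
channel.basic_consume(
    queue="submission_task_queue", on_message_callback=process_submission
)
channel.start_consuming()
```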
Writing tests for the APIs
All the back-end code that I added is covered with tests. I have written 150+ unit test cases, with a code coverage of 92% as reported by Coveralls.
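For a flavour of what these tests look like, here is a minimal sketch using Django REST Framework’s APITestCase; the URL name is a hypothetical placeholder, not the real route.

```python
# Minimal sketch of an API test with DRF's APITestCase.
from django.contrib.auth.models import User
from django.urls import reverse

from rest_framework import status
from rest_framework.test import APITestCase


class GetChallengeListTest(APITestCase):
    def setUp(self):
        self.user = User.objects.create_user(
            username="host", password="secret"
        )
        # Authenticate the test client so protected endpoints can be hit.
        self.client.force_authenticate(user=self.user)

    def test_challenge_list_returns_200(self):
        url = reverse("challenges:get_challenge_list")  # hypothetical URL name
        response = self.client.get(url)
        self.assertEqual(response.status_code, status.HTTP_200_OK)
```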
Writing Documentation
Initially the documentation of EvalAI lived on the GitHub wiki, but it has now been moved to Sphinx and is hosted on Read the Docs. The documentation is quite comprehensive and extensive for users who want to get familiar with EvalAI. I have written documentation for the features that I added during the summer.
Miscellaneous Tasks
There were many small features, like contact us and import-export for the admin, along with minor bug fixes. Moreover, I also helped other contributors get started with their contributions to EvalAI.
Conclusion
I can only recommend that every student out there consider applying for the Summer of Code! It has been an absolutely great time; meeting new people and discussing things with them has always been fun for me, but the feeling of being able to really move a project forward, and having the time to do that, is awesome!