Annotation Tool Part 1: Why We Built Our Own Tool

Mabel Tan · Published in Sequence Blog · Sep 25, 2018

This is the first article in a series about our Sequence Annotation Tool. Read on to learn more about our journey along the learning curve.

Polygon Annotation of a Cat. Created in Sequence Annotation Tool.

At Sequence, we provide text and image annotation services for machine learning projects. Our most common use cases involve image annotation at high volumes. There are many quality open source annotation tools available online, and these are often a good solution for most businesses. However, we decided to build our annotation tool in-house. In this article, we’ll talk about some of the reasons why.

Annotation is a core function of our platform, and our decision to build our own tool is about laying the right foundation for future builds. For us, it’s a win-win situation. By investing in our own annotation tool, we can better serve the needs of our clients and our contributors while working with a tool that’s just the right fit.

Bounding Box Annotation of a Dog. Created in Sequence Annotation Tool.

We wanted our Annotation Tool to be:

1. Web-based

We needed the annotation tool to run in the browser. Our platform is web-based, which allows anyone to work on it as long as they have internet access. With no software to install and no manual transfer of work files from desktop to web, we can simplify the user experience of working on a task.

2. Flexible

We wanted to be able to configure shortcuts, add new shapes in the future (beyond bounding boxes and polygons), and tailor the behaviour of the tool to our business use cases. Since contributors spend most of their time on the platform in this tool, being able to control its functionality at a granular level gives us room to refine details and experiment with new ideas (see the sketch after this list).

3. Robust

To guarantee the same platform behaviour across browsers and under slower network conditions, we think an in-house build is the right choice for us. Small bugs can snowball into significant negative experiences on the platform, so it was important for us to be able to run extensive tests against the tool and maintain the quality we want to provide.

4. Fast

Our goal is to limit delays and lags as much as possible. We pride ourselves on keeping the platform feeling responsive, especially on older machines. Building our annotation tool in-house allows us to make sure that the tool runs smoothly for contributors who have low-CPU computers.
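To make the flexibility point a little more concrete, here is a minimal sketch, assuming a TypeScript code base, of the kind of shape model and shortcut configuration an in-house tool can expose. The names below (Point, Shape, Annotation, shortcuts) are illustrative assumptions, not Sequence’s actual implementation.

```typescript
// A minimal sketch (not Sequence's actual code) of a data model and
// shortcut configuration for an extensible annotation tool.

type Point = { x: number; y: number };

// The tool currently supports bounding boxes and polygons; a discriminated
// union leaves room for additional shape kinds later without touching callers.
type Shape =
  | { kind: "bounding-box"; topLeft: Point; bottomRight: Point }
  | { kind: "polygon"; vertices: Point[] };

interface Annotation {
  id: string;
  label: string; // e.g. "cat", "dog"
  shape: Shape;
}

// Keyboard shortcuts kept as plain configuration so they can be tuned
// per use case rather than hard-coded into the tool.
const shortcuts: Record<string, string> = {
  b: "start-bounding-box",
  p: "start-polygon",
  Escape: "cancel-current-shape",
  Enter: "commit-current-shape",
};

// Example: a polygon annotation of a cat, as in the image above.
const example: Annotation = {
  id: "anno-1",
  label: "cat",
  shape: {
    kind: "polygon",
    vertices: [
      { x: 120, y: 80 },
      { x: 240, y: 95 },
      { x: 210, y: 230 },
      { x: 110, y: 210 },
    ],
  },
};
```

Modelling shapes as a tagged union like this is one way to add new shape types over time without rewriting the rest of the tool, which is the kind of granular control we were after.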

These four constraints helped us focus on what we wanted for our platform and gave us clarity on the requirements needed to best serve our users.

We launched the tool on our platform in September 2018, and the results have been positive. We learned a lot along the way, and if you’d like to know more about the build, read the follow-up article here: Annotation Tool Part 2: Retrospective on the Build.

Want to see the Annotation Tool for yourself? Take it for a test drive here: Annotation Tool Demo.

This article was written in collaboration with Luc Leray.

Learn more about us at sequence.work. We provide annotation and classification outsourcing for data science teams.

Looking to work on tasks instead? Become a contributor here.

*Images used in the sample annotations are from Unsplash.
