Success story: How Microwork annotations help with the monitoring and policing of video content

Tom Norman
Published in Microwork
2 min read · Sep 20, 2018
How do you monitor video content for unlawful actions? AI has the answer.

Cogisen are building AI technology to extract context and temporal information from video. They've developed a patented technology that isolates the frequency-domain signals containing this elusive information, which can then be applied to numerous real-world problems, including gaze-tracking and violence detection.
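Cogisen's patented method is not public, but the general idea of frequency-domain temporal features can be sketched. The snippet below is a generic illustration only, assuming NumPy: it turns the difference between two consecutive grayscale frames into a compact frequency-domain descriptor. Nothing here reflects Cogisen's actual technique.

```python
import numpy as np

def frequency_features(prev_frame, frame):
    """Illustrative only: build a small frequency-domain descriptor
    from the change between two consecutive grayscale frames."""
    # Temporal change between frames carries the motion information.
    diff = frame.astype(np.float32) - prev_frame.astype(np.float32)
    # Move to the frequency domain and take the (shifted) magnitude spectrum.
    magnitude = np.abs(np.fft.fftshift(np.fft.fft2(diff)))
    # Pool the spectrum into an 8x8 grid for a compact 64-value descriptor.
    h, w = magnitude.shape
    cropped = magnitude[: h // 8 * 8, : w // 8 * 8]
    pooled = cropped.reshape(8, h // 8, 8, w // 8).mean(axis=(1, 3))
    return pooled.flatten()
```

A descriptor like this, computed per frame pair, could then be fed to a classifier that distinguishes action types.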

The problem

The internet now connects over 4 billion of us, which, in theory, gives every one of us the ability to instantaneously stream live footage anywhere in the world. The barrier to entry for streaming video online is now so low that the problem we face is no longer one of connectivity, but the colossal task of monitoring this ever-growing mass of new video content. In the case of online streaming, or even video surveillance footage, how do we police everything that gets uploaded to ensure the detection of anything unlawful or inappropriate?

The solution

Cogisen want to be able to automatically detect actions in video. Rather than humans manually policing hours and hours of video content to look out for inappropriate actions, it would be far more efficient if we were alerted the instant something sinister occurs.

To make this a reality, Cogisen's technology first needs to know what types of actions should be monitored. It's important to be able to recognise and distinguish unacceptable behaviour, like kicks and punches, from innocuous actions, such as handshakes and hugs. Microwork was given the job of helping them collect and annotate this action data. First, we collected videos of people performing various actions on camera, then fed these videos into our annotation app. Next, our expert team of full-time annotators got to work, carefully annotating each frame with the required attributes. For action detection, this meant identifying the precise moment an action sequence began and marking exactly what the AI should track.
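A single annotation from this process might be captured in a record like the sketch below. All field names are hypothetical, for illustration only, and do not reflect Microwork's actual schema:

```python
from dataclasses import dataclass

@dataclass
class ActionAnnotation:
    # Hypothetical schema for one annotated action sequence.
    video_id: str
    action_label: str   # e.g. "punch", "handshake", "hug"
    start_frame: int    # the precise frame where the action begins
    end_frame: int      # the frame where the action ends
    is_violent: bool    # distinguishes unacceptable from innocuous actions

# Example: an annotator marks a handshake spanning frames 120-168.
ann = ActionAnnotation("clip_001", "handshake", 120, 168, False)
```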

After repeating this process for all the action videos we collected, we had our quality assurance team check the accuracy of each annotation before finally sending them to Cogisen to feed into their machine learning models.
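A quality-assurance pass like the one described could be sketched as a simple validation filter. The label set and field names below are illustrative assumptions, not Microwork's actual pipeline:

```python
# Hypothetical QA check: keep only annotations that pass basic validation.
ALLOWED_LABELS = {"punch", "kick", "handshake", "hug"}  # example label set

def passes_qa(ann):
    # Label must be known, and the frame range must be well-formed.
    return (
        ann["label"] in ALLOWED_LABELS
        and 0 <= ann["start_frame"] < ann["end_frame"]
    )

annotations = [
    {"label": "punch", "start_frame": 10, "end_frame": 55},
    {"label": "wave", "start_frame": 5, "end_frame": 3},  # unknown label, bad range
]
checked = [a for a in annotations if passes_qa(a)]
```

In practice a QA team would also review the annotations visually; an automated filter like this only catches structural errors before export.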

If you think our team could help fulfil your video annotation needs, get in touch for an introduction. Looking to buy datasets? Perhaps we can help you with that too. Click here to browse our dataset store.
