Now You See Me, Now You Don’t: Fooling a Person Detector

Synced
Published in SyncedReview
Apr 24, 2019

Advanced machine learning techniques and the widespread deployment of surveillance cameras have dramatically improved the efficiency and accuracy of human detection systems in airports, train stations, and other sensitive public places. Is this the end of anonymity? A team of machine learning researchers from Belgian university KU Leuven think not, and have introduced a simple hack that can enable people to circulate undetected by AI-powered surveillance cameras, like a cyber reboot of the “Invisible Man.”

In a paper published last week, the researchers show how an adversarial attack using a colorful printed patch roughly 40 cm square can significantly lower the accuracy of an object detector. In a two-minute video demo, they demonstrate how a person displaying the patch becomes undetectable to a computer vision system. When the patch is flipped to its blank side, the camera again detects their presence.

An adversarial attack is a technique designed to intentionally deceive an image recognition system: modifications imperceptible to the human eye are introduced into images that fool computers into misclassifying them. Previous research efforts mostly targeted objects rather than people: an MIT research team duped Google's Cloud Vision API into identifying rifles as helicopters, and a UC Berkeley research team showed that adding specific patterns to a stop sign with adhesive tape could trick an AI into reading it as a speed limit sign.
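To make the general idea concrete, the fast gradient sign method (FGSM) is a standard, minimal way to craft such imperceptible perturbations against an image classifier; it illustrates adversarial attacks in general, not the KU Leuven patch method. A sketch in PyTorch, where model is any differentiable classifier:

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, eps=8 / 255):
    """Nudge every pixel by eps in the direction that increases the
    classifier's loss; the change is imperceptible but can flip the label."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    adversarial = image + eps * image.grad.sign()   # one signed gradient step
    return adversarial.clamp(0, 1).detach()         # keep pixels in valid range
```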

The KU Leuven researchers explain that crafting adversarial attacks against person detection is more challenging because human appearance varies far more than that of, say, stop signs.

Their goal was to use optimization methods to find a portable patch that could significantly lower the accuracy of person detection across a large dataset. They targeted YOLOv2, a popular fully-convolutional object detection model, and used the INRIA Person Dataset for training because it contains full-body pedestrians.

The paper proposes three approaches to creating the adversarial patch: minimizing the classification probability of the class “person”; minimizing the objectness score; and a combination of both. Starting from a pre-trained YOLOv2-based person detector, the researchers applied patches created with each approach to the image data, fed the patched images into the frozen detector, and repeatedly tweaked the patch’s pixels to minimize person detection accuracy, as in the sketch below.
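Here is a minimal sketch of that optimization loop, assuming PyTorch; load_frozen_yolov2, apply_patch, data_loader, and PERSON_IDX are hypothetical stand-ins rather than real library APIs:

```python
import torch

detector = load_frozen_yolov2()   # hypothetical: pre-trained YOLOv2, frozen in eval mode
PERSON_IDX = 0                    # hypothetical index of the "person" class

# The patch is the only tensor being optimized; the detector never changes.
patch = torch.rand(3, 300, 300, requires_grad=True)
optimizer = torch.optim.Adam([patch], lr=0.03)

for images, person_boxes in data_loader:                # e.g. INRIA pedestrian images
    patched = apply_patch(images, patch, person_boxes)  # paste the patch onto each person
    objectness, class_probs = detector(patched)         # per-box scores for each image
    # Combined objective: suppress both the objectness score and the
    # "person" probability of the strongest detection in each image.
    loss = (objectness * class_probs[..., PERSON_IDX]).max(dim=1).values.mean()
    optimizer.zero_grad()
    loss.backward()               # gradients flow into the patch pixels, not the detector
    optimizer.step()
    patch.data.clamp_(0, 1)       # keep pixel values printable
```

The paper’s full objective also adds smoothness (total variation) and printability terms so the optimized patch survives printing and camera capture.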

The potential impact of this research is far-reaching: surveillance systems built on similar detectors might be vulnerable to this sort of adversarial attack, and self-driving vehicles could fail to recognize pedestrians on public roads.

Less than a week after the paper’s publication, the website cloakwear.co appeared, selling T-shirts printed with the adversarial patch.

Google researcher David Ha’s tweet regarding the paper, “we are living in a cyberpunk future,” received over 4,200 likes in two days. Ha also tweeted: “While printing this pattern on a T-shirt might be good for a fashion item, it’s going to be useless against any system that doesn’t deploy variants of YOLOv2. In the future, your adversarial outfits will also need to be adaptable in real-time.”

The paper Fooling Automated Surveillance Cameras: Adversarial Patches to Attack Person Detection is on arXiv. Supplementary research material will be presented at a CVPR 2019 workshop.

Journalist: Tony Peng | Editor: Michael Sarazen

