Published in Zumo Labs

🍊 The Juice: Flickr of Recognition

Zumo Labs presents The Juice, a weekly newsletter focused on computer vision problems (and sometimes just regular problems). Get it while it’s fresh.

Week of February 1–5, 2021


If you were given the option of providing personal photographs — images of yourself, your friends, and family — to help private companies develop and monetize their facial recognition technology, would you do it? You would not be compensated, nor would you have any control over how those technologies are deployed. This is not a hypothetical; it's a decision you've already made unwittingly.

In 2019, the New York Times published a piece called How Photos of Your Kids Are Powering Surveillance Technology, about the MegaFace dataset. A dataset of over 4.7 million photos, MegaFace was made by scraping images of people from the internet, including over 3.5 million from photo-sharing platform Flickr. Until now, it has been difficult to ascertain whether your images were included in MegaFace or other image datasets commonly used for facial recognition. But a project called Exposing.AI, which launched this week, lets you run a search and find out for yourself. There's a great conversation with the project's creators, Liz O'Sullivan and Adam Harvey, over at the NYT.

Facial recognition technologies are a minefield, and it's difficult to say whether there's a path for them to wind up as a net good for society. What's certain, however, is that we must strive for ethical development of those or any computer vision-powered technologies. That means respecting people's basic privacy rights, and finding a better way to generate robust, performant datasets than just hoovering up people's Facebook photos. We have some ideas on that.



Speaking of facial recognition, the NY-based facial recognition startup Clearview AI operated unlawfully when scraping images of Canadian citizens, according to our neighbor to the north. "What Clearview does is mass surveillance and it is illegal," says Canada's privacy commissioner, Daniel Therrien. "It is an affront to individuals' privacy rights and inflicts broad-based harm on all members of society, who find themselves continually in a police lineup. This is completely unacceptable."

Clearview AI ruled ‘illegal’ by Canadian privacy authorities, via TechCrunch.


Beginning next month, users of Google Pixel devices will be able to measure both their heart rate and respiratory rate in the Google Fit app, all thanks to the devices' cameras. Heart rate is calculated from "subtle color changes" in the user's fingertip when it is pressed directly against the camera lens. Respiratory rate, meanwhile, is captured using optical flow, which tracks the slightest movements of the user's chest. The team is awaiting peer review, but says its clinical studies have shown accuracy to within 2%.

Google to offer heart and respiratory rate measurements using just your smartphone’s camera, via TechCrunch.
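For a feel of how fingertip-based heart rate estimation works in general, here's a toy photoplethysmography (PPG) sketch. To be clear, this is not Google's actual pipeline: the per-frame green-channel means are synthesized rather than read from a camera, and a real app would need filtering, motion rejection, and far more care.

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate (BPM) from per-frame mean green-channel
    intensity of a fingertip pressed against the camera.

    Toy PPG sketch: blood volume changes modulate how much light the
    fingertip absorbs, producing a periodic brightness signal whose
    dominant frequency is the pulse.
    """
    signal = green_means - np.mean(green_means)        # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to plausible human heart rates (40-200 BPM).
    band = (freqs >= 40 / 60) & (freqs <= 200 / 60)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0

# Simulate 10 s of frames at 30 fps with a 72 BPM pulse plus noise.
fps, bpm = 30, 72
t = np.arange(10 * fps) / fps
frames = (128 + 2 * np.sin(2 * np.pi * (bpm / 60) * t)
          + np.random.default_rng(0).normal(0, 0.3, t.size))
print(round(estimate_heart_rate(frames, fps)))  # → 72
```

The FFT peak-picking here is the simplest possible approach; the 2% accuracy Google reports would require much more robust signal processing.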


One clever thing about integrated marketing, otherwise known as product placement, is that consumers can't really skip it the same way they can with a commercial. Now an LA-based startup called Ryff has built a tool that uses computer vision technology to ingest existing content, identify placement opportunities — surfaces such as tables, counters, billboards, or awnings — and digitally insert branded products. Here's hoping every use of the technology is as compelling as the sizzle reel in the article, several scenes from a Lifetime Original Christmas movie featuring CGI Baileys Irish Cream.

How Computer Vision And AI Make New Revenue From Old Media, via Forbes.

A side-by-side image showing people at a kitchen table. Honey Nut Cheerios have been digitally inserted into the right image.


If you didn't use Zoom a year ago, you probably do now. This week the company announced it will be adding to its suite of computer vision-powered features, which currently include background replacement and blurring. One new feature is attendee counting — "We do computer vision segmentation of the image to identify how many people are in the room," says product head Jeff Smith — which might be helpful come the day multiple people are in one room again, I guess.

Zoom announces new software-hardware integrations for its hybrid conference rooms, via VentureBeat.
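Zoom hasn't published the details of how its counting works, but the general idea of turning a segmentation mask into a head count can be sketched as counting connected components. Note the mask below is hand-made for illustration; a real system would produce it with a learned person-segmentation model.

```python
import numpy as np

def count_people(mask):
    """Count "people" in a binary segmentation mask by counting
    4-connected components via flood fill. A toy stand-in for
    segmentation-based attendee counting; real systems would rely
    on a trained person-segmentation or detection model.
    """
    mask = np.asarray(mask, dtype=bool)
    seen = np.zeros_like(mask)
    count = 0
    for r in range(mask.shape[0]):
        for c in range(mask.shape[1]):
            if mask[r, c] and not seen[r, c]:
                count += 1
                stack = [(r, c)]           # flood-fill this component
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]
                            and mask[y, x] and not seen[y, x]):
                        seen[y, x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

# Two separate blobs of "person" pixels → two attendees.
mask = [[1, 1, 0, 0],
        [1, 0, 0, 1],
        [0, 0, 0, 1]]
print(count_people(mask))  # → 2
```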


While by no means the biggest news out of Amazon this week — founder Jeff Bezos is stepping into an executive chairman role, with Andy Jassy taking over as CEO — the company has revealed plans to install driver-monitoring surveillance cameras in its delivery vehicles. Ostensibly a safety measure, the cameras will record 100% of the time and upload footage for review if any of a number of predefined events (such as "distracted driving") are triggered. It seems warehouse workers are no longer the only employees Amazon is interested in monitoring and optimizing.

Amazon plans to install always-on surveillance cameras in its delivery vehicles, via The Verge.


📄 Paper of the Week

Scaling Laws for Transfer

Going theoretical this week with some great fundamental research out of OpenAI. They find that “pre-training effectively multiplies the fine-tuning dataset size. Transfer, like overall performance, scales predictably in terms of parameters, data, and compute.” Transfer learning is used extensively in the computer vision world and understanding the fundamental relationships between dataset size, model size, compute, and model performance is critical for the future.
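The paper's headline relationship is easy to play with directly. Here's a rough sketch of its power-law form for effective data transferred, where D_F is the fine-tuning dataset size and N is the parameter count; the constants k, alpha, and beta below are illustrative placeholders, not the paper's fitted values.

```python
def effective_data_transferred(d_f, n, k=1.9e4, alpha=0.18, beta=0.38):
    """Effective data transferred from pre-training, following the
    power-law form D_T = k * D_F**alpha * N**beta described in the
    paper. k, alpha, beta are illustrative placeholders here, not
    the paper's fitted values; d_f is fine-tuning tokens, n is the
    model's parameter count.
    """
    return k * d_f ** alpha * n ** beta

# Pre-training acts like a dataset multiplier: the total effective
# fine-tuning data is D_F + D_T, so the multiplier is (D_F + D_T)/D_F.
d_f, n = 1e6, 1e8                  # 1M fine-tune tokens, 100M params
d_t = effective_data_transferred(d_f, n)
print(f"effective dataset multiplier: {(d_f + d_t) / d_f:.1f}x")
```

The qualitative takeaway survives whatever the exact constants are: the smaller your fine-tuning set and the larger your model, the more pre-training is worth in equivalent data.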


Think The Juice was worth the squeeze? Sign up here to receive The Juice weekly.



Zumo Labs accelerates computer vision model development using custom-made synthetic training data. Improve model performance, reduce bias, and eliminate privacy concerns with synthetic data.
