🍊 The Juice: Lip Service

Zumo Labs presents The Juice, a weekly newsletter focused on computer vision problems (and sometimes just regular problems). Get it while it’s fresh.

Week of March 1–5, 2021

____

For over a century, CHANEL has been on the cutting edge of fashion. But, on the cutting edge of technology? As it turns out, perhaps the classiest thing about CHANEL is their progressive approach to data governance.

A few weeks ago CHANEL launched Lipscanner for iOS. The app allows users to scan any color they encounter — whether in a magazine, on a friend, or at breakfast — and find the corresponding CHANEL lipstick color. It then offers an in-app virtual try-on experience using AR. The kicker is that they run that model locally on the user’s device, so no camera data is ever transmitted back to CHANEL’s servers.
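
For the curious, shade matching like this can be as simple as a nearest-neighbor lookup in a perceptual color space. Here’s a toy sketch in Python; the shade names and RGB values are invented, and CHANEL hasn’t published what the app actually does under the hood:

# A toy shade matcher. The shade names and RGB values below are
# invented for illustration; CHANEL's actual pipeline is not public.
import math

SHADES = {
    "Hypothetical Rouge 99": (155, 28, 49),
    "Hypothetical Coral 43": (230, 103, 97),
    "Hypothetical Plum 104": (110, 32, 60),
}

def srgb_to_lab(rgb):
    """Convert an sRGB triple (0-255) to CIELAB (D65 white point)."""
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (lin(c) for c in rgb)
    # Linear RGB -> XYZ, normalized by the D65 reference white.
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) / 0.95047
    y = (0.2126 * r + 0.7152 * g + 0.0722 * b) / 1.00000
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) / 1.08883
    f = lambda t: t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x), f(y), f(z)
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def nearest_shade(scanned_rgb):
    """Return the catalog shade perceptually closest to the scanned color."""
    target = srgb_to_lab(scanned_rgb)
    return min(SHADES, key=lambda n: math.dist(target, srgb_to_lab(SHADES[n])))

print(nearest_shade((170, 45, 60)))  # -> "Hypothetical Rouge 99"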

CHANEL doesn’t need to collect user data to train their algorithm, in part because they already own a vast database of marketing images (which their team painstakingly annotated in-house). But according to Cedric Begon, the director of the team that built Lipscanner, that dataset was overwhelmingly composed of white faces and lips. To make sure that the app would work well on skin of any color, CHANEL turned to synthetic data. That also gave them the ability to train and continue to refine their model without compromising anyone’s privacy.

Per Begon, “We don’t have any access to a single piece of personal data. To be useful, simple and fun, you need to have trust, you need to make sure that customers don’t ask themselves am I tracked, am I going to give my images to someone. That’s the reason why we wanted the privacy approach to be extreme.”

Their approach is brilliant in its simplicity. CHANEL makes it seem like the easiest way to be an ethical tech company is to just not be a tech company in the first place.

____

#Electrolytes

Athletes sweat, but how do they know when it’s time to drink Gatorade™? Well, thanks to PepsiCo, they can purchase and wear the Gx Sweat Patch on their inner arm. It’s a flexible patch that changes color and intensity according to the wearer’s sweat rate and volume. After getting sweaty, the user loads up the companion app on their phone and snaps a picture. The app then uses computer vision tech to analyze the colors present and build a unique “sweat profile.” If you have the patience to do all of this rather than just taking an occasional swig of Gatorade, you deserve the perfect electrolyte balance you will achieve.
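
In case “computer vision tech” sounds grandiose for reading a sticker, the core of it can be as humble as averaging pixel values over the patch region. Here’s a toy sketch in Python; the crop box, channel choice, and calibration constant are all invented, not PepsiCo’s actual pipeline:

# A toy read-out. The crop box, the choice of the saturation channel,
# and the mL calibration are all invented for illustration.
from PIL import Image
import numpy as np

def patch_saturation(photo_path, box):
    """Mean saturation (0-1) over the patch region of a photo."""
    hsv = Image.open(photo_path).convert("HSV")
    region = np.asarray(hsv.crop(box), dtype=np.float32) / 255.0
    return float(region[..., 1].mean())  # channel 1 = saturation

# Pretend dye saturation maps linearly to sweat volume (a real app
# would use a proper calibration curve, not a made-up scale factor).
s = patch_saturation("patch_photo.jpg", box=(120, 200, 320, 400))
print(f"patch saturation {s:.2f} -> roughly {50.0 * s:.0f} mL of sweat")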

How Your Smart Phone Can See You Sweat, via IEEE Spectrum.

#BIPA

Back in 2015, a Chicago attorney took Facebook to court over what their client perceived to be a violation of the Illinois Biometric Information Privacy Act. Specifically, Facebook’s Tag Suggestions tool worked by storing and referencing the exact biometric data the law was intended to protect. The suit became a class action in 2018, and now a judge has approved a settlement of $650 million. It’s a landmark judgment that highlights the significance of the Illinois law, but since Facebook posted $85 billion in revenue in 2020, they’re probably not sweating the pocket change.

Facebook privacy settlement approved: Nearly 1.6 million Illinois users will ‘expeditiously’ get at least $345, via The Chicago Tribune.

#EIEIO

A team of researchers at Michigan State University has received a five-year, $1 million grant to improve computer vision systems in precision livestock farming — specifically pigs and cattle. One of the students outlined the scope of their work, saying, “Computer vision can really improve livestock farming systems by enhancing animal welfare, monitoring animal behavior, detecting illnesses earlier, and by allowing the assessment and measurement of many animal parameters that are either very difficult or even impossible to measure without the utilization of cameras and computer algorithms.”

MSU-led international research team receives $1 million grant to build multi-disciplinary precision livestock farming network, via Michigan State University.

#Deepfakes

If you’ve been on the internet at all this week, you’ve likely seen the Tom Cruise deepfakes going around. They’re undoubtedly impressive, but as effortless as they seem, it turns out they took quite a lot of work to pull off. So while you probably won’t be deepfaking Tom Cruise anytime soon, you could always reanimate your Meemaw. A site called MyHeritage has launched a tool they’re calling Deep Nostalgia, which gives folks the ability to apply lifelike motion to an uploaded image. You can try their digital necromancy tool today, just please do not share the results with us, thanks.

Here’s How Worried You Should Be About Those Tom Cruise Deepfakes, via VICE.

New AI ‘Deep Nostalgia’ brings old photos, including very old ones, to life, via The Verge.

#GPT3

The language model GPT-3 remains one of the most impressive showcases of AI to date, but it’s far from perfect. This piece from Nature covers both its capabilities and shortcomings, all while sliding in some really sick burns. For example, one computer scientist interviewed says that GPT-3 is “mostly a memorization engine. And nobody is surprised that if you memorize more, you can do more.”

Robo-writers: the rise and risks of language-generating AI, via Nature.

____

📄 Paper of the Week

Generative Adversarial Transformers

Someone finally did it: they made a GAN with Transformers. Oh, and they also achieved state of the art for image generation on Cityscapes, CLEVR, and LSUN. This is likely the first of many Transformer-based GAN papers to come, and it might be the end of the road for CNNs in GANs (every researcher and their mother is on the hype train after this). Be sure to scroll down to the bottom to look at the new images, and check out the code here.
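
If you want a feel for what “a GAN with Transformers” even means, here’s a toy PyTorch generator that runs self-attention over latent tokens before decoding them to pixels. To be clear, this is a minimal sketch with arbitrary dimensions, not the paper’s actual bipartite GANsformer architecture:

# A toy generator: self-attention over latent tokens, then a linear
# decode to pixels. Every dimension here is arbitrary.
import torch
import torch.nn as nn

class TinyTransformerGenerator(nn.Module):
    def __init__(self, latent_dim=64, tokens=16, dim=128, img_px=32):
        super().__init__()
        self.tokens, self.dim, self.img_px = tokens, dim, img_px
        self.to_tokens = nn.Linear(latent_dim, tokens * dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.attend = nn.TransformerEncoder(layer, num_layers=2)
        self.to_pixels = nn.Linear(dim, img_px * img_px * 3 // tokens)

    def forward(self, z):
        b = z.shape[0]
        x = self.to_tokens(z).view(b, self.tokens, self.dim)
        x = self.attend(x)     # tokens attend to one another
        x = self.to_pixels(x)  # each token emits a strip of pixels
        return torch.tanh(x.view(b, 3, self.img_px, self.img_px))

fake = TinyTransformerGenerator()(torch.randn(8, 64))
print(fake.shape)  # torch.Size([8, 3, 32, 32])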

____

Think The Juice was worth the squeeze? Sign up here to receive The Juice weekly.

Zumo Labs accelerates computer vision model development using custom-made synthetic training data. Improve model performance, reduce bias, and eliminate privacy concerns with synthetic data.
