Please stop using the age filter 👵🏽
You can’t look anywhere on the internet anymore without seeing celebrities, friends, and family using the popular FaceApp age filter to imagine what they’ll look like in 50 years. While it may seem like an innocuous and amusing way to spend 10 seconds of your time, FaceApp’s technology isn’t exactly harmless.
For one, the app has a shaky policy when it comes to accessing your photos. Some have speculated that the app uploads users’ entire photo libraries in the background, but those claims remain unsubstantiated. Still, there are concerns that even users who have set photo access to “never” can still share individual photos with the app. It is likely that people uploading their photos to the app don’t have a full understanding of what they are actually handing over to this company. FaceApp’s own privacy policy acknowledges: “We may share User Content and your information (including but not limited to, information from cookies, log files, device identifiers, location data, and usage data) with businesses that are legally part of the same group of companies that FaceApp is part of, or that become part of that group”. We aren’t sure who exactly these affiliates are or what they might do with these photos.
There is a chance that any photos you upload to the app will be used not just for advertising, but also to help develop and advance facial recognition software that can end up in the hands of not-so-neutral external parties. For example, Ever AI used photos uploaded to its database to build surveillance technology. IBM has also used photos from Flickr to improve its own facial recognition software. Why is this an issue? Facial recognition systems misidentify racial minorities, women, and transgender people at much higher rates than other groups. When this deeply imperfect technology is then used to police communities, the most vulnerable and already marginalized risk bearing the brunt of its errors. Furthermore, activists warn that this type of technology could help law enforcement disproportionately target minorities they are already biased against. Especially as we begin to learn about the harmful consequences of facial recognition (consequences that have led the city of San Francisco to ban its use by law enforcement), it is imperative that we are aware of how our own photos could be used to advance these technologies without our knowledge.
So please, stop using the age filter on your own photos, and especially on other people’s photos without their permission. The momentary satisfaction of joining a viral trend is not worth the consequence of having your data collected and used in ethically dubious ways.