You can’t look anywhere on the internet anymore without seeing celebrities, friends, and family using FaceApp’s popular age filter to imagine what they’ll look like in 50 years. While it may seem like an innocuous and amusing way to spend 10 seconds of your time, FaceApp’s technology isn’t exactly harmless.
There is a chance that any photos you upload to the app will be used not just for advertising, but also to aid the development of facial recognition software deployed by not-so-neutral external parties. Ever AI, for example, used photos uploaded to its database to build surveillance technology, and IBM has used photos from Flickr to improve its own facial recognition software.

Why is this an issue? Facial recognition systems frequently misidentify racial minorities, women, and transgender people. When this deeply imperfect technology is used to police communities, the most vulnerable and already marginalized risk bearing the brunt of its inaccuracies. Activists warn, moreover, that this type of technology could help law enforcement disproportionately target minorities against whom they are already biased. As we learn more about the harmful consequences of facial recognition (consequences that have led the city of San Francisco to ban its use by law enforcement), it is imperative that we understand how our own photos could be used to advance these technologies without our knowledge.
So please, stop using the aging filter on your own photos, and especially on those of others without their permission. The momentary satisfaction of joining a viral trend is not worth the consequence of having your data exploited in ethically dubious ways.