
Like artificial neural networks, our own neural networks (a.k.a. brains) pick up on statistical correlations — and use them, often unconsciously, to make predictions. A useful hack, but it can lead to all sorts of problematic assumptions and behaviors, as well as stories we tell ourselves about certain “types” of people (“just-so stories”).

It’s worth thinking about the term “accuracy” here. In machine learning, “accuracy” and “precision” have specific and rather simple statistical meanings that have nothing to do with causes, mechanisms, or good judgment, so you should use caution in saying “if it’s accurate, who cares”. Example: the huge majority of people in prison are men, so a classifier that optimizes for accuracy in criminal judgment based on a large corpus of cases would certainly latch onto gender as a powerful “predictor”… do you see how this could be problematic? Would you want the “AI jury” stacked against you because you’re male?
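To make that concrete, here’s a minimal, purely illustrative sketch — the numbers are invented for the example, not taken from any real corpus or from the study — showing how a “model” that simply latches onto gender can score well on accuracy while knowing nothing about evidence or causes:

```python
# Hypothetical toy corpus: each case is (is_male, was_convicted).
# Men are assumed (for illustration only) to be heavily over-represented
# among convictions, mirroring real prison demographics.
import random

random.seed(0)

cases = []
for _ in range(10_000):
    convicted = random.random() < 0.5        # half the cases end in conviction
    if convicted:
        is_male = random.random() < 0.93     # assumed male share among the convicted
    else:
        is_male = random.random() < 0.50     # roughly even among the acquitted
    cases.append((is_male, convicted))

# A "classifier" that uses gender as its only predictor.
predict = lambda is_male: is_male            # predict "guilty" if and only if male

accuracy = sum(predict(m) == c for m, c in cases) / len(cases)
print(f"accuracy of 'guilty if male': {accuracy:.2%}")   # ~70% on this toy data
```

An accuracy number like that says nothing about mechanism or fairness — it just reflects the base rates baked into the data.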

Also, fwiw, our use of the word “stereotype” in the title referred at least as much to the study authors’ preconception that gay men’s faces would be feminized while lesbian faces would be masculinized — an assumption that led them down what we believe to be a rabbit-hole of incorrect reasoning about their own data.

As for value judgments, we are not making any. Though, personally, I have to admit I’m more partial to our “gay” selfies than to our “straight” ones… my wife tells me she agrees :)