
…wever. While MS Celeb is under intense scrutiny now, other large datasets are also ripe for misuse. For example, a hiring algorithm trained on data that associates men with leadership positions, simply because it has been given more data about men in senior roles, will reinforce that unconscious societal bias. Likewise, a facial recognition algorithm trained mostly on lighter-skinned faces will be significantly worse at identifying darker-skinned faces, making it an unreliable and even dangerous tool for law enforcement and security.
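The mechanism behind both examples is the same: a model fit to skewed data ends up optimized for the majority group. The toy Python simulation below is purely illustrative (the group sizes, score distributions, and simple threshold rule are all assumptions, not drawn from any real hiring or face recognition system), but it sketches how under-representation alone can produce a large accuracy gap.

```python
import random

rng = random.Random(0)  # fixed seed so the demo is reproducible

def make_group(n_pos, n_neg, shift):
    """Synthetic 1-D 'match scores': (score, label) pairs, offset by `shift`."""
    pos = [(rng.gauss(0.7 + shift, 0.1), 1) for _ in range(n_pos)]
    neg = [(rng.gauss(0.3 + shift, 0.1), 0) for _ in range(n_neg)]
    return pos + neg

# The majority group is 20x larger than the minority group, whose
# score distribution is shifted (a stand-in for under-representation).
majority = make_group(500, 500, shift=0.0)
minority = make_group(25, 25, shift=-0.25)

# "Train": choose the threshold halfway between the pooled class means.
# Because the pool is dominated by the majority group, so is the threshold.
pooled = majority + minority
pos_mean = sum(x for x, y in pooled if y == 1) / sum(1 for _, y in pooled if y == 1)
neg_mean = sum(x for x, y in pooled if y == 0) / sum(1 for _, y in pooled if y == 0)
threshold = (pos_mean + neg_mean) / 2

def accuracy(group):
    """Fraction of examples the learned threshold classifies correctly."""
    return sum((x > threshold) == bool(y) for x, y in group) / len(group)

print(f"majority accuracy: {accuracy(majority):.2f}")
print(f"minority accuracy: {accuracy(minority):.2f}")
```

Because the threshold is fit almost entirely to the majority group's distribution, accuracy on the minority group drops sharply even though the classifier was never told which group an example came from. This is the same failure mode as a face recognizer trained mostly on lighter-skinned faces.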