Look into my eyes, look into my eyes… or turn on your FaceTime Attention Correction

Enrique Dans

One of the features of Apple’s new iOS 13, due for release this fall, that has attracted a lot of comment is FaceTime Attention Correction. Easily activated, it corrects an unsettling aspect of video calls: your eyes look at the image of the other person on the screen, not at the camera, which makes it seem as though you are avoiding eye contact or looking somewhere else. Experienced video callers have learned that looking directly at the camera can help, particularly when they want to drive home a point during a conversation, but it still feels unnatural to focus on a small, motionless black hole rather than the space on the screen occupied by the face of the person you’re talking to.
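
To make the mismatch concrete, here is a minimal, purely illustrative sketch of the geometry involved; it is not Apple’s implementation, and the function name and example distances are assumptions chosen only to show the idea. It estimates how far off the camera axis your gaze drifts when you look at the other person’s face on the screen instead of at the lens:

```swift
import Foundation

// Rough illustration (not Apple's method): the perceived "looking away" effect
// comes from the angle between the camera lens and the spot on the screen
// where the other person's face is rendered, as seen from your eyes.
func gazeOffsetDegrees(cameraToFaceOnScreenCM: Double, viewingDistanceCM: Double) -> Double {
    // Simple right-triangle geometry: offset on screen vs. distance to the device.
    return atan(cameraToFaceOnScreenCM / viewingDistanceCM) * 180 / .pi
}

// Hypothetical values: a face rendered about 6 cm below the front camera,
// viewed from about 35 cm away, puts your gaze roughly 10 degrees off-axis --
// enough for the other person to read it as "looking down" rather than eye contact.
print(gazeOffsetDegrees(cameraToFaceOnScreenCM: 6, viewingDistanceCM: 35))
```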

The solution, until manufacturers manage to put the camera in the center of the screen without the need for a notch, seems futuristic and perhaps slightly controversial: digitally manipulating the image of our eyes so that they appear to be looking at the other person. People who have tried it say the result is practically indistinguishable from reality and that it genuinely feels like making eye contact.

What does it mean to manipulate a person’s eyes so they appear to be looking at you? The technology has been available since Microsoft Research experiments in


Enrique Dans

Professor of Innovation at IE Business School and blogger (in English here and in Spanish at enriquedans.com)