Facebook Wants Me To Friend Girls Half My Age
But that’s not the creepiest part of the recommendation algorithm

There I sat, scrolling through my Facebook feed, scanning updates from friends, groups, and sponsored ads for monthly murder mystery boxes, beer mugs crafted from baseball bats, and useless crowdsourced products. I mean, who needs an AI-powered flight control system for paper airplanes — technology that’s so sophisticated it can fly a piece of lettuce? Wait, an artificial intelligence autopilot that’s so advanced it can fly romaine? That’s worthy of a bookmark.
As I scrolled past — and seriously considered the aeronautic marvel — Facebook’s recommendation section caught my eye. In the “People you may know” section, the social graph raised my eyebrow by presenting a girl with the same name as my 12-year-old’s bestie.
“On what planet do I have something in common with a pre-teen?” I thought.
I was determined to reverse engineer the illogical suggestion. In my brain and without evidence, of course, as I had no access to the code. Still, the journey I took down the rabbit hole using educated guesses to understand Facebook’s graph recommendation went from creepy and perverse to morbid.
Investigating the recommendation engine
The suggestion was curious. The recommended name on my feed was unique, but the profile image didn’t appear to show a minor. That’s not unusual; many Facebook users hide behind pictures of friends, family, or pets, or seasonally swap images to match causes or memes.
I’ve been known to swap out my profile pic for Voltron on occasion, and nobody has ever accused me of being a robot. Emotionally unavailable, sure. But nobody has mistaken me for five mechanical lions capable of transforming into the “Defender of the Universe.”
The avatar was of a young woman, whom I guessed to be about twenty years old. Between her name and the large blue “Add Friend” button was the summary validation — we had one mutual friend.
In those first few moments, I had assumed our mutual friend was my daughter. In a simplistic recommendation engine, if Bob knows James and James knows Adam, then the algorithm might guess that Bob knows Adam, too. And if Bob doesn’t know Adam but is open to networking, he might discover a new friend with similar interests.
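To make that mental model concrete, here is a minimal sketch of the friends-of-friends idea in Python. The tiny graph, the names, and the ranking by mutual-friend count are my own assumptions for illustration; Facebook’s real system is proprietary and far more elaborate.

```python
# A minimal friends-of-friends sketch (hypothetical data, not Facebook's code).
from collections import Counter

friends = {
    "Bob":   {"James"},
    "James": {"Bob", "Adam"},
    "Adam":  {"James"},
}

def suggest_friends(user):
    """Suggest people two hops away, ranked by how many mutual friends they share."""
    candidates = Counter()
    for friend in friends[user]:
        for friend_of_friend in friends[friend]:
            # Skip the user and people they already know.
            if friend_of_friend != user and friend_of_friend not in friends[user]:
                candidates[friend_of_friend] += 1  # one more mutual friend
    return candidates.most_common()

print(suggest_friends("Bob"))  # [('Adam', 1)] -- one mutual friend: James
```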
Recommender platforms like this are used in everything from music playlists to product suggestions, predicting the likelihood of user engagement or customer conversion. And they learn from feedback: over time, your clicks, opt-outs, and ignored recommendations help the AI build a better profile, so it can suggest more friends — or more murder mystery boxes to sell you.
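Here is a toy sketch of that feedback loop, again using made-up names and a made-up scoring rule rather than anything Facebook has published: accepted suggestions nudge a candidate’s score up, dismissed ones nudge it down, and higher-scoring items surface first next time.

```python
# A toy feedback loop (my own guess at the idea, not Facebook's implementation).
scores = {"Adam": 1.0, "Murder Mystery Box": 1.0}

def record_feedback(item, accepted, step=0.5):
    """Raise an item's score when a suggestion is accepted, lower it when dismissed."""
    scores[item] += step if accepted else -step

record_feedback("Murder Mystery Box", accepted=False)  # scrolled past the ad
record_feedback("Adam", accepted=True)                 # clicked "Add Friend"

# Rank candidates by score, best first.
for item, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(item, score)
```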
But my daughter doesn’t have social media accounts, or at least I didn’t think she did. My mind raced. What was she hiding from me, and was she just outed by Facebook? It certainly seemed plausible. As early as 2012, artificial intelligence (AI) helped retailer Target determine that teenagers were pregnant before their parents even knew. That must have caused some amusing dinner table conversations.
Take a breath; she wasn’t a minor
I clicked the page to investigate. It did not appear to be a pre-teen’s profile; instead, it had all the hallmarks of the twenty-something I had assumed, judging by the variety of images of friends around the same age. Her public page proudly displayed recent selfies from college life, with older posts from high school. It looked legit, but still, coming off a recent episode where someone tried to create a fake Instagram profile in my name, the tin-foil-hat part of me thought this might be some elaborate attempt to create a sock-puppet account.
Still, the name was so unique that it continued to tickle my brain. So I called the mother of my daughter’s best friend and asked, “Do you have a relative who shares the same name as your daughter?”
“Yes, my niece,” she said. “Why do you ask?”
I’d be curious, too, if a guy in his mid-forties were asking about her twenty-two-year-old relative. I don’t deny that’s totally sus. I explained the puzzling Facebook friend recommendation, and we laughed it off. Still, I wondered why Facebook imagined we might make for good online pals. She was legal but still half my age. What did we have in common?
The mutual connection wasn’t her aunt, because she and I aren’t connected on Facebook; we typically communicate over email or text instead. Now my mind raced further. Was Facebook exploiting contacts in my personal address book to infer new connections? I initially dismissed this as too nefarious, even for Mark Zuckerberg in his vision to connect the world.
The common thread gets bizarre
I scanned the girl’s page again, trying to understand what that algorithm assumed to be a logical connection. From her public profile, I could tell she was on the basketball team in high school. And I once attended a high school that also had a basketball team, though I never played. Close enough, I guess?
As I couldn’t imagine any data scientist developing a recommendation engine that ludicrous, I hit the one link I should have clicked from the start — our mutual friend.
He was also a twenty-something, my brother-in-law’s nephew by marriage. Someone I’d occasionally see at extended family gatherings. We’re talking twice a year tops, perhaps for a holiday event. At one point, several years ago, between sips of eggnog, we friended on Facebook. Because, why not?
The inner workings of the algorithm were starting to make sense. We never engaged on Facebook, but given his age and location, he ran in the same circles as this young woman. So why not ask if we’d like to buddy up digitally?
The problem is, he died nearly a year ago.
Facebook’s policy on deceased accounts
Facebook has a process that allows people to manage an account when its user has passed away. According to its policies and reporting, Facebook will memorialize an account if it is told that a person has died. Those memorialized accounts allow friends and family to continue to share memories while securing them from unauthorized access, says Facebook. But that requires reporting to the support team, which now has “fewer people available to review reports because of the coronavirus pandemic.”
On her own Facebook page, his mother keeps his digital footprint alive by tagging him with remembrance posts. Those, in turn, receive likes and comments in support. It is unclear if his account has been reported to support or if his mother is waiting for a response. Either way, it is unfortunate that any social engagement continues to trigger recommendation scoring on the extended graph.
And the friend recommendation isn’t the only place where the algorithm is used. As I scroll through the Facebook Groups to which I belong, Zuckerberg constantly asks me to consider which friends to invite to participate. Just today, the algorithm suggested I extend a group invitation to this deceased young man.
For all the intelligence the Facebook graph possesses — enough to predict that I’ll consider a smart plane that can swap lettuce for wings (yes, I bought it) — the algorithm has yet to infer that this young man will never post on his timeline again. It’s rather sad that no amount of Facebook’s AI could determine that this original friend suggestion was horribly wrong on so many levels. Imagine how painful that might be if suggested to a closer family member or acquaintance.
I decided to train the AI model the only way I could. I clicked the grey button to “remove” the recommendation.