Algorithms are Taking Over Our Lives

Understanding the dangers of data collection

Mahdyel
ILLUMINATION
4 min read · Mar 8, 2023


Photo by Markus Spiske on Unsplash

In 2013, Eric Loomis was sentenced to 11 years, six of them behind bars, not because of the decision of a judge or a jury of his peers but because an algorithm said so. The police had pulled Loomis over for driving a car that had been used in a shooting, one he wasn’t involved in at all. He pleaded guilty to attempting to flee an officer and no contest to operating a vehicle without the owner’s permission, crimes that usually don’t mandate prison time.

However, the judge in charge of Loomis’s case determined that he had a high risk of recidivism by using a risk-assessment algorithm called the Correctional Offender Management Profiling for Alternative Sanctions, or COMPAS. The court did not question the algorithm’s output; Loomis was denied probation and incarcerated.

This story raises important questions: How much do algorithms control our lives, and can we trust them?

It’s been roughly ten years since Eric Loomis’s sentencing, and algorithms have penetrated far deeper into our daily lives. From the moment you wake up to the moment you go to bed, you’re constantly interacting with tens, maybe even hundreds, of them.

Let’s say you wake up and quickly search for a place near you to eat breakfast. In this one act, you’re triggering Google’s complex algorithm that matches your keywords to websites and blog posts to show you the most relevant answers. When you click on a website, an algorithm serves you ads on the side of the page. Those ads might be products you’ve searched for, stores near your location, or even something you’ve only spoken to someone about.
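To get a feel for what “matching your keywords to websites” means, here is a deliberately simplified sketch in Python. It ranks pages by how many of your search words they contain; real search engines weigh hundreds of other signals (links, location, personalization), so treat this as an illustration of the idea, not how Google actually works.

```python
# Toy keyword-overlap ranking, purely illustrative.
def rank_pages(query, pages):
    """Sort pages by how many query words each one contains."""
    query_words = set(query.lower().split())

    def score(page_text):
        page_words = set(page_text.lower().split())
        return len(query_words & page_words)

    return sorted(pages, key=score, reverse=True)

pages = [
    "best breakfast diner near downtown open early",
    "how to fix a leaky faucet",
    "breakfast cafe with pancakes near me",
]
print(rank_pages("breakfast near me", pages))
# The breakfast pages float to the top; the faucet page sinks.
```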

You then try to message a friend to join you for your meal, and when you open any social media app today, your feed no longer simply displays the most recent posts by the people you follow. Instead, what you see is best exemplified by TikTok’s “For You” page.

Complex mathematical equations behind the scenes decide what posts are most relevant to you based on your view history on the platform. YouTube, Twitter, Facebook, and, most notoriously, TikTok use these recommendation systems to get you to interact with the content that their machine thinks is right for you.
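The equations themselves are proprietary, but the basic principle can be sketched in a few lines. The toy Python below scores candidate posts by how much their topics overlap with what you’ve already watched; real recommendation systems learn from billions of interactions, so this is only a caricature of the idea.

```python
# Toy recommender: favor posts whose topics match your viewing history.
from collections import Counter

def recommend(view_history, candidates, top_n=2):
    """Rank candidate posts by overlap with topics the user already watched."""
    topic_counts = Counter(
        topic for post in view_history for topic in post["topics"]
    )

    def score(post):
        return sum(topic_counts[t] for t in post["topics"])

    return sorted(candidates, key=score, reverse=True)[:top_n]

history = [
    {"id": 1, "topics": ["cooking", "travel"]},
    {"id": 2, "topics": ["cooking"]},
]
candidates = [
    {"id": 10, "topics": ["cooking", "baking"]},
    {"id": 11, "topics": ["finance"]},
    {"id": 12, "topics": ["travel", "cooking"]},
]
print(recommend(history, candidates))
# Cooking- and travel-related posts outrank the finance post.
```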

It’s not just social media. Netflix emails you recommendations for movies based on what you’ve already seen, Amazon suggests products based on what you previously bought, and, probably the most sinister of all, Tinder recommends the person you’re supposed to spend the rest of your life with, or at least that night.

These might seem like trivial matters, but the stakes run higher than that. Algorithms are also used to determine who needs more healthcare, and when you have your day in court, a computer program can decide whether you’ll spend the next decade of your life behind bars for a crime that usually doesn’t carry any time.

One of the most dangerous things about algorithms is the data that is used to power them.

The more data you feed into an algorithm, the better its results. So where do companies get this data? From their users, people like you and me. Most of the time, handing over this information is harmless, but these companies often sell it to data brokers, who then sell it on to other companies that want to sell you stuff. That’s why you keep getting targeted ads from random companies you’ve never heard of. Worse still, data brokers are frequent targets of nefarious actors, who steal everything they hold in data breaches.

The good news is that you can get these data brokers to delete the information they hold about you. Sadly, doing it manually could take years. And despite the risks, algorithms are not going away anytime soon; they are becoming ever more ingrained in our lives.

So what can we do to protect ourselves?

One thing we can do is be aware of the data we are giving away. Before you click “I agree” on a website or app’s terms and conditions, read through them. Often you’re agreeing to hand over far more personal information than you’d like.

Another is to limit the amount of information you give out in the first place. For example, you can use a fake name or a throwaway email address when signing up for something that doesn’t need your real details. You can also use a virtual private network (VPN) to mask your IP address and protect your browsing data.

If you are concerned about your data being sold to data brokers, you can opt out of data sharing. Many companies allow you to opt out of having your data sold, but you have to know where to look. PrivacyRights.org lists companies that allow you to opt out.

Finally, if you want to take things a step further, you can use tools like the DuckDuckGo search engine or the Tor browser to browse the web more privately. These tools help keep your browsing data private and make it harder for companies to track you.

In conclusion, algorithms are powerful tools that can help us in many ways but also come with risks. By being aware of the data we are giving away and taking steps to protect our privacy, we can enjoy the benefits of algorithms while minimizing the risks.
