Targeted Ads

Rebecca Case
#im310-sp20— social media
4 min read · Apr 2, 2020

The issue of privacy and data in targeted ads, and whether social media sites should be held responsible for regulating that content, is a tricky one.

On one hand, I believe that targeted ads can be helpful and convenient in certain situations. I would rather get ads for American Eagle and Converse than for car companies. It is a more efficient way for me to see deals that might interest me as opposed to ads that are just filling up space and getting in my way.

A majority of social media sites are free to use. The only way many of these sites can get money is through advertisements. In turn, we “pay” for access to these sites with our information regarding what kind of person we are and what kind of ads we might be interested in. In many ways, this doesn’t bother me because it is unavoidable. Most of the information is relatively harmless, and mainly just tracks my preferences on certain products.

I am aware of it happening, but I can’t do much about it, so it doesn’t particularly bother me that social media sites are sending me targeted ads. Until something harmful comes from this, I can’t really do anything about my information being spread.

However, I do think there is a problem with social media sites, such as Facebook, selling users' data to Cambridge Analytica in order to send users targeted ads, as happened around the 2016 election in America. I think it is irresponsible to sell users' data, and Facebook should be held responsible, especially when it comes to politics.

Politics is often a touchy subject, especially in this day and age. When Facebook sold user data to Cambridge Analytica to target specific political ads to users who had specific preferences, it led to the manipulation of many of those users.

While it can be argued that all advertisements are manipulative in their own ways, it is particularly unethical to target political ads to specific users. For instance, a targeted eyeshadow ad may not have much of an impact on large populations, but when a targeted ad significantly shapes how certain populations view a national election, many more people are affected.

Cambridge Analytica used psychographic ads, which are more fine-tuned and specific than demographic ads. Psychographic ads target belief systems, lifestyles, values, interests, and more. This data can be used to display political ads crafted specifically to sway a particular user in a candidate's favor.

For example, a campaign backing a candidate for political office can run six different ads that all reach the same conclusion. Cambridge Analytica had the power to analyze the data from Facebook in order to tailor ads to specific groups of people based on their tendency to worry, how agreeable they are, how open they are, whether they are more extroverted, and whether they are more traditional, among other categories.

Ads like these can be seen as an invasion of privacy in many ways, because the ad you see on your screen might tell you that a candidate is an adequate leader who will take charge in necessary situations, while another person gets an ad for the same candidate talking about how that candidate is going to ensure a safer world for children. Ultimately, these ads have the same purpose, to get you to vote for a certain candidate, but the way they reach that conclusion differs depending on which psychographic you fit into.
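To make the mechanism concrete, here is a minimal, purely hypothetical sketch in Python of how an ad picker like this could work. The trait names, scores, and ad copy are all invented for illustration; this is not Cambridge Analytica's actual method, just the general idea of serving whichever ad variant matches a user's dominant personality trait.

```python
# Hypothetical sketch: match an ad variant to a user's dominant trait.
# Trait names loosely follow the "Big Five" personality model; the scores
# and ad copy are invented for illustration only.

AD_VARIANTS = {
    "worried":     "Candidate X will take charge and keep order in uncertain times.",
    "agreeable":   "Candidate X will ensure a safer world for your children.",
    "open":        "Candidate X has bold new ideas for the future.",
    "extroverted": "Join thousands of supporters at Candidate X's next rally!",
    "traditional": "Candidate X will protect the values we grew up with.",
    "default":     "Vote for Candidate X.",
}

def pick_ad(profile):
    """Return the ad whose trait scores highest for this user.

    `profile` maps trait names (e.g. "worried", "open") to scores between
    0 and 1, as they might be inferred from likes, shares, and quiz answers.
    """
    if not profile:
        return AD_VARIANTS["default"]
    top_trait = max(profile, key=profile.get)
    return AD_VARIANTS.get(top_trait, AD_VARIANTS["default"])

# Two users, one candidate, two very different pitches:
print(pick_ad({"worried": 0.9, "open": 0.3}))       # "take charge" message
print(pick_ad({"agreeable": 0.8, "worried": 0.2}))  # "safer world" message
```

The point of the sketch is that the same candidate gets a different pitch for each user, and no single user ever sees the full set of messages.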

This leads to people seeing a very limited picture of what that candidate is running on, which may lead to someone voting for a candidate whose policies they might strongly disagree with. They won't know about those policies if they never see an ad that discusses them.

This is not the same as broadcasting an ad to a larger, more general audience, which ensures that everyone receives the same information on the same subject. While that is still manipulative, it is not tailored to specific people, and it allows people a little more freedom when trying to pick a candidate. If they are interested in the candidate, they can look further into them, find out their exact policies, and get a better picture of what they stand for, not just a limited view.

I think one solution to this problem is for social media sites to make their terms and conditions more explicit to their users, making them aware that they will be subjected to targeted political ads, as well as other commercial ads. This way, users have more freedom in knowing what they are getting into when they make an account.

I don’t think social media sites should completely filter and get rid of targeted ads on their sites, because doing so would be a form of censorship. However, I do think a pop-up warning should be given in advance in order to inform users that they are about to see a targeted ad. This way, users will be able to judge the ad from a more objective standpoint, and better evaluate the message that they are given.

By regulating this content, users can have more trust in social media sites, while advertisers can still get their message across. While targeted ads can seem invasive, they are efficient in getting people to listen to what the ads have to say. It is a new way of advertising, as companies figure out how to get our increasingly online society to work in their favor. However, as long as users are aware of what is being done, I don't see any problem with this new advertising strategy.
