Today marks Juneteenth — a celebration commemorating the end of American slavery. But Juneteenth dates back not to 1863, when the Emancipation Proclamation was signed, but to 1865 — more than two years later. That’s how long it took for the message of emancipation to reach enslaved people in Texas.
Despite all the recent advances that modern society enjoys, limitations in communication still exist. Technology does enable news to spread more quickly, but not everyone receives online information equally — a problem that will only intensify if recent trends in malicious internet tactics persist.
Imagine, for example, if the United States federal government posted information online today announcing the forgiveness of all federal education loans. Many people might never receive a message like this because of persistent inequities in internet access. Not everyone has affordable, reliable home broadband. Some areas of the United States (primarily rural counties) have no broadband access at all; in one rural Ohio county, about 80% of residents lacked home internet access. Some urban areas have only one internet provider, which may offer poor-quality or exorbitantly priced service. And this speaks only to internet access; it says nothing about owning a device with which to get online and receive the message. Reports show that low-income populations access the internet primarily through mobile devices, so the announcement will be difficult to read if it is not optimized for smaller screens.
Even if an individual has access to affordable, good-quality broadband, she might never see a digital announcement like this if optimization algorithms filter it out of her timeline or dashboard. Optimization algorithms are why social media timelines are no longer chronological and why your ads look different from your friend’s. These algorithms predict which posts will get the best return, using engagement (e.g., likes, shares, comments) as the metric of success. If an algorithm predicts that a user, or others like her, does not normally engage with posts like the announcement, the post will not appear in her feed. And if her social media bubble is made up of people who similarly never see the announcement, they will all miss this important news.
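To make the mechanism concrete, here is a deliberately simplified sketch of engagement-based filtering. Real platforms use learned models over thousands of signals; in this toy version, a post’s predicted engagement is simply the user’s historical engagement rate with that post’s topic, and the threshold is an invented number. All names and values here are illustrative, not any platform’s actual system.

```python
# Toy sketch of engagement-based feed filtering (hypothetical, simplified).

def predict_engagement(user_history, post_topic):
    """Return the fraction of past posts on this topic the user engaged with."""
    seen = user_history.get(post_topic, {"shown": 0, "engaged": 0})
    if seen["shown"] == 0:
        return 0.0  # no history: the model assumes no interest
    return seen["engaged"] / seen["shown"]

def build_feed(posts, user_history, threshold=0.2):
    """Keep only posts whose predicted engagement clears the threshold."""
    return [p for p in posts
            if predict_engagement(user_history, p["topic"]) >= threshold]

# A user who engages heavily with entertainment but rarely with
# government posts never sees the loan-forgiveness announcement.
history = {
    "entertainment": {"shown": 100, "engaged": 60},  # 60% engagement
    "government":    {"shown": 50,  "engaged": 2},   # 4% engagement
}
posts = [
    {"id": 1, "topic": "entertainment"},
    {"id": 2, "topic": "government"},   # the announcement
]
feed = build_feed(posts, history)
# Only post 1 survives; the announcement is filtered out before she ever sees it.
```

Note that the user made no choice here: the filtering happens upstream of her feed, which is exactly why she cannot know what she missed.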
Another threat to the delivery of this message is disinformation — false information designed to trick users into believing it is real. For example, a prominent figure or organization could post that the federal government has forgiven all education loans and that the move violates the Constitution. As a result, a user may draw false conclusions about the status of her private student loans or about the legality of the government’s loan forgiveness. Other disinformation posts may try to dissuade people from claiming their benefits, or deny that any loan forgiveness exists at all. Disinformation distorts the announcement and sows doubt about it. Not all users have the time or means to investigate statements they see online, and fact-checking presumes users suspect in the first place that the post they read might be false. And as we have seen recently, social media companies are reluctant to take down misleading posts.
One particularly pernicious form of disinformation, the dark ad, could further threaten delivery of the important announcement. These ads are not only false but operate in secret: typically visible only to their target audiences, dark ads are extremely difficult to detect and combat. If an entity sends false statements targeted at marginalized and vulnerable communities, the onus falls on those very members to find correct information.
In the abstract, it may be difficult to see why these practices matter. But to understand how such tactics can disrupt the delivery of an important announcement — disproportionately in marginalized communities — we need look no further than the Russian propaganda campaign waged during the 2016 United States presidential election. Russia leveraged social media sites’ optimization algorithms and ad delivery services to spread disinformation and misinformation, and successfully suppressed Black voter turnout. It took the very features social media platforms tout as innovation and turned them against already disadvantaged communities. Bad actors have steadily refined these tactics, and we now see their effects in the mainstream.
What can be done about these problems? One answer is to ensure that, as Congress develops and debates federal data practices legislation, civil rights protections remain at the center. Focusing on marginalized communities will help ensure that harmful tactics are identified and stopped before they reach more people. By listening to and addressing the concerns of vulnerable communities, we can work toward technology that ushers in a more equitable society.