Russia influence underscores need for sunshine on digital political advertising

Social media companies Facebook, Twitter and Google are under scrutiny for practices that allowed Russian affiliates to influence the 2016 U.S. presidential election through undisclosed advertisements and posts.

Originally published in California Publisher, Fall 2017.

by Jason M. Shepard

Nearly one year after the 2016 presidential election, citizens still are learning how foreign entities covertly influenced American voters.

Congressional committees and a special counsel continue to investigate Russian influence, but already the scandal underscores a dark side of social media: We often have no idea who is behind the messages we see.

“We are in a new world,” Facebook CEO Mark Zuckerberg wrote in September, responding to reports that a Russian troll farm paid for at least $100,000 in Facebook ads to influence U.S. voters. Zuckerberg pledged to do more to “make political advertising more transparent.”

Ellen Weintraub, the senior commissioner on the Federal Election Commission, said the U.S. government also must do more. “Our campaigns are moving headlong onto the Internet; our laws must catch up,” Weintraub recently wrote in the Washington Post. “Americans have the right to know who’s paying for the evermore-influential political material that’s popping up in our social media feeds. The FEC has the ability — and the duty — to make sure it’s not Vladimir Putin.”

When it comes to political advertising, the internet remains the Wild West. And that is a problem, because democratic self-governance is undermined when citizens are left in the dark about the identities and motives of election influencers.

“Our campaign finance rules are outdated for the internet age, and rules on the books aren’t enforced,” Ann Ravel, a former member of the FEC and chair of the California Fair Political Practices Commission, recently wrote in Politico.

Rise of digital political advertising

For politicians and advocacy groups, social media open new doors for influencing and persuading the public. More and more, Americans get their news from social networks such as Facebook and Twitter. Facebook boasts 210 million active U.S. users; Twitter has 68 million. Two-thirds of Americans now get some of their news from social media, according to the Pew Research Center.

These communications tools have remarkable benefits for citizens, consumers and corporations, connecting us in ways never imagined before the internet age. Social media networks use sophisticated tools that allow people to reach niche audiences for a sliver of the cost of other media.

For political advertising, broadcast TV used to be the dominant platform. But digital political advertising is skyrocketing. An estimated $1.4 billion was spent on internet advertising in all local, state and national elections in 2016, up 789 percent from 2012, according to advertising-tracking firm Borrell Associates.

About 40 percent of digital political advertising was spent on social media sites, with Facebook being the top beneficiary. The Trump campaign spent half of its overall advertising budget on online advertising, according to The Post, including $70 million on Facebook advertising alone.

On a recent “60 Minutes” broadcast, Trump campaign digital director Brad Parscale revealed that a team of Facebook employees helped him tailor tens of thousands of ads each day to micro-target voters whose interests and motivations are filtered through Facebook’s algorithms.

“Twitter is how [Trump] talked to the people, [but] Facebook was going to be how he won,” Parscale said.

Social media advertising will only grow in coming campaigns, in part because of Trump’s success. And campaigns won’t be the only entities trying to persuade online.

Russian ads on social media

In the weeks after the 2016 election, social media companies denied they had become unwitting tools of electioneering influence by Russia.

Facebook, however, now acknowledges that a group called the Internet Research Agency, described as a “Kremlin-affiliated troll farm” by the Washington Post, spent $100,000 on 3,000 Facebook ads seen by about 10 million U.S. Facebook users before and after the election. CNN reported the goal of the ad buys was to “amplify political discord” and “fuel an atmosphere of divisiveness and chaos.”

Also revealed in recent weeks: Twitter shut down 201 accounts associated with the same firm and continues to investigate other Russian-related election activities on its system, and Google acknowledged that Russian affiliates spent “tens of thousands” of dollars in election-related advertisements on its systems.

Jonathan Albright, research director of the Tow Center for Digital Journalism at Columbia University, told The Post the disclosures thus far greatly underreport the influence of Russian-directed disinformation.

“It’s social media marketing at an expert level,” Albright said. “This is very well executed.”

After the names of six Russian-supported Facebook pages and sites were revealed in September, Albright began to trace all posts, comments, shares and “likes” of Russian-supported elections content, not just the paid ads. Albright’s research, involving the scraping of data from web servers, found that their content had been “shared” 340 million times.

“That’s from a tiny sliver of the 470 accounts that have been made public,” The Post reported. “Albright’s findings still suggest a total reach well into the billions of ‘shares’ on Facebook.”

Many posts weren’t necessarily advocating the election of Trump or defeat of Hillary Clinton but aimed at polarizing topics such as Black Lives Matter and LGBT rights, Albright found.

“A lot of these posts had the intent to get people not to vote,” Albright told The Post. “This is a concerted effort at manipulation.”

But days after Albright reported his initial findings, Facebook shut down access to information about other Russian-connected sites, saying the data access violated its privacy rules.

Sen. Mark Warner (D-Va.), vice chairman of the Senate’s Select Committee on Intelligence, has said the disclosures to date are just “the tip of the iceberg.”

Legal fixes to combat “dark money”

Campaign finance laws are complex and changing, and the First Amendment erects limits on the extent to which the government can regulate political communications.

But disclosure and disclaimer laws don’t present the same constitutional problems, and regardless of partisanship, all citizens should be concerned that a hostile nation can take advantage of communications infrastructure to covertly influence American elections.

The legal issues are complicated, to be sure.

In 2010, the U.S. Supreme Court’s decision in Citizens United v. FEC changed more than 30 years of campaign finance law, removing limits on campaign spending by corporations and unions under the theory that the First Amendment protects such entities’ right to spend unlimited money on political speech. The vote was 5–4, split on ideological lines.

However, the Court also ruled, by an 8–1 vote, that disclosure and disclaimer laws were necessary and proper so that voters could be fully informed on election influences. “This transparency enables the electorate to make informed decisions and give proper weight to different speakers and messages,” the Court wrote. Justice Clarence Thomas was the only dissenter. His view is that the First Amendment gives people a right to anonymously influence elections using money to amplify their message.

In a separate 2010 case upholding disclosure laws, Justice Antonin Scalia said disclosure served important government interests, writing, “Requiring people to stand up in public for their political acts fosters civic courage, without which democracy is doomed.”

Elected Republican leaders have said they support disclosure laws, including President Trump, who in a 2015 interview with TIME, said, “I want transparency. I don’t mind the money coming in. Let it be transparent. Let them talk, but let there be total transparency.”

But former FEC commissioner Ravel says Republicans have increasingly stymied not only broader campaign finance regulations, but disclosure and disclaimer laws as well.

“The First Amendment defense isn’t just a shameful excuse to do nothing — it’s bewildering,” Ravel wrote.

“Requiring transparency in political advertising, like we already do for television and radio, doesn’t limit anyone’s free speech. Enforcing the law and updating policy, on the other hand, defends democracy and the electoral process. For years, we’ve done neither, and we now know that our democracy has been under attack.”

The FEC, created by Congress after the Watergate scandal, is charged with protecting the integrity of the election process, but it generally gets poor marks for efficacy. Even when violations are clear cut, fines can be tiny and levied years after the fact, if at all.

When she was on the commission, Ravel said her proposals to engage internet and tech experts in discussing solutions were met with “harassment and death threats stoked by claims by the three Republican commissioners that increased transparency in internet political advertising was censorship.”

Even on matters of combating foreign influence, Republican commissioners blocked enforcement of existing laws and attempts to strengthen others, Ravel said. After the Facebook disclosures, Democratic senators urged the FEC to take action. In a letter, legislators asked the FEC to develop proposals to eliminate loopholes in campaign disclosure laws that allowed Russia to evade disclosure of their spending, to increase best practices among social media platforms to prevent illicit foreign campaign spending, to better monitor coordination between campaigns, third parties and foreign actors, and to better align digital political advertising rules with broadcast rules.

New legislation is also being drafted. One bill would require a reporting system for digital platforms with more than 1 million users, creating a public database of information on anyone spending more than $10,000 on electioneering communications, including the ad’s content, view count, price and contract information.

Going forward

FEC Commissioner Weintraub wants new rules for disclaimers and disclosures on internet political advertising.

“People can argue in good faith about the merits of unbridled corporate spending in American elections,” Weintraub wrote. “But no reasonable person would grant full First Amendment rights to a Russian troll farm.”

She says the FEC should look at model legislation recently passed in Maryland and California.

In October, California Gov. Jerry Brown signed the DISCLOSE Act, a law revising multiple campaign-finance provisions that has been years in the making in the state Legislature. Advocates described the law as having the strongest disclosure requirements in the country. The law includes disclosure provisions for ads on social media.

“Every voter has a right to know who is trying to influence our votes and our Legislature,” Nicolas Heidorn, legislative affairs director at California Common Cause, said in a statement. “While Congress and federal agencies fail to act to require more transparency in the post-Citizens United era, the DISCLOSE Act will continue California’s leadership in building a strong and transparent democracy.”

But it is unlikely that the FEC will follow California’s lead anytime soon.

President Trump’s latest appointee to the FEC is Texas lawyer James E. “Trey” Trainor III. He has been a dogged opponent of campaign-finance laws in general, including disclosure laws. The largest “dark money” group in Texas, Empower Texans, successfully sought to keep its donors secret and paid Trainor’s law firm more than $1.3 million in 2014 and 2015, according to Salon.

After Trump announced Trainor’s appointment, a former counsel to the FEC, Alex Tausanovitch, wrote in The Hill that Trainor “will almost certainly continue to advocate for weaker enforcement and less disclosure, a step backwards at a time when we’re gravely in need of progress.” And Texas political analyst Scott Braddock told the Center for Public Integrity, “No one has fought against transparency in Texas elections with as much energy as President Trump’s nominee for the Federal Election Commission.”

While citizens wait for Congress and the FEC to act, Facebook pledges to do more.

Zuckerberg said Facebook continues to study how Russian agents used their tools, and it will create systems that increase transparency by showing what ads are purchased by each Facebook page. The network is adding 250 new employees focused on security and election integrity.

“I don’t want anyone to use our tools to undermine democracy,” Zuckerberg wrote. “That’s not what we stand for.”

Jason M. Shepard, Ph.D., is chair of the Department of Communications at California State University, Fullerton. His primary research expertise is in media law, and he teaches courses in journalism and in media law, history and ethics. Contact him at jshepard@fullerton.edu or on Twitter at @jasonmshepard.
