A Place to Connect — Facebook’s Public Engagement Campaign

Following up on Silicon Valley’s Civic Responsibility: this week we had the opportunity to check out Facebook’s efforts to improve data privacy and confront the enormous challenge of misinformation on its social platform at a sponsored expo. TL;DR: they’re making good progress, but there is still so much work left to do.


The company held “A Place to Connect” in Washington, DC on June 19, inviting a healthy cross section of press, technologists, digital professionals, security experts, policy wonks, and users to a half-day expo to discuss challenges and new features on the platform. The event was a bit surreal, as if a San Francisco loft-style event had been shoved into the stuffy, buttoned-up box of Washington, DC. Most of the presenters appeared out of their element, taking the dress code up a notch but not quite masking their unease; you could tell some of them would rather have been wearing hoodies.

As the day unfolded, the presentations covered everything from opioid addiction to election integrity and were complemented by a small expo floor where staff helped guests explore new options and features.

Data Privacy

Rob Sherman, Facebook’s Deputy Chief Privacy Officer. Photo by author.

Rob Sherman, the company’s Deputy Chief Privacy Officer, spoke about the personas Facebook uses to decide which privacy controls to build for users: the “Proud Parent,” the “Advocate” (definitely us!), and the “Mystery” (aka lurkers).

He said a lot that made sense: that Facebook feels “everyone has a basic right to privacy,” that they needed to “put people in control of their privacy” and even that “companies that use data need to do more to uphold our responsibility to protect it.” But then he lost us:

“What we have to do is find when people are misusing our data and take aggressive action to prevent it.”

We completely disagree with this sentiment, and it lays bare the underlying tension of where we are today with social media. As of now, tech companies profit from their users’ data and have little incentive to protect it. That’s the crux of the whole issue: privacy comes second to profit. Facebook provides its services for free, which means its users (specifically their activity and data) are the product. And Facebook intends to keep selling and licensing that product to other partners, intervening only reactively when a problem arises. But it’s not enough to keep an eye on partners and punish them when they step out of line. We all saw with Cambridge Analytica that this only needs to happen once to do tremendous damage.

Our preference would be to flip that script entirely: Facebook partners should have to work hard and continually prove themselves to earn access to data. That access should be granted in typical least-privileged fashion, aggregated to protect customer privacy, and revoked any time it is misused, so that customers’ granular data can never be copied or stolen in any usable form. This may mean rewriting the company’s entire underlying business model, but it must be done if we’re to realign the incentive structure and put user data protection first.
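To make that concrete, here is a minimal sketch of the model we have in mind. Everything in it is our own hypothetical illustration, not a real Facebook API: the gateway, grant, and query names are invented. The point is that partners earn narrow, field-level grants, receive only aggregated answers, and lose access the instant a grant is revoked.

```python
# Hypothetical sketch of a least-privilege partner data gateway.
# None of these names are real Facebook APIs; this is our illustration
# of the access model described above.
from dataclasses import dataclass


@dataclass
class Grant:
    partner: str
    fields: frozenset   # least privilege: only the fields this partner earned
    revoked: bool = False


class DataGateway:
    def __init__(self, records):
        self._records = records   # granular user data never leaves here
        self._grants = {}

    def issue(self, partner, fields):
        """Grant access only after a partner has proven itself."""
        self._grants[partner] = Grant(partner, frozenset(fields))

    def revoke(self, partner):
        """Any detected misuse kills the grant immediately."""
        if partner in self._grants:
            self._grants[partner].revoked = True

    def aggregate(self, partner, field, min_bucket=5):
        """Return aggregate counts only, never granular rows, and
        suppress small buckets so individuals can't be re-identified."""
        grant = self._grants.get(partner)
        if grant is None or grant.revoked or field not in grant.fields:
            raise PermissionError(f"{partner} has no active grant for {field!r}")
        counts = {}
        for row in self._records:
            counts[row[field]] = counts.get(row[field], 0) + 1
        return {value: n for value, n in counts.items() if n >= min_bucket}
```

Under this inversion, a Cambridge Analytica-style harvest becomes impossible by construction: the partner never holds rows it could copy, and a single revoke() closes the tap rather than triggering a post-hoc investigation.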

Swag: leather lanyards emblazoned with “Take Control”. Photo by author.

Finally, Rob mentioned the upcoming “Clear History” feature, which will allow users to purge their own data, including activity that happens when they’re off Facebook. This is a strong move that will help the platform’s more tech-savvy users control their data.

During the limited Q&A portion of his session, a gentleman from The Guardian asked Rob a simple question: “What does Facebook know about us?” He was, unsurprisingly, referred back to the online privacy policy and given a brief recap of what had just been covered during the session. On its face, Rob’s response was acceptable and probably did what he intended, which was to avoid newsworthy admissions, but it steamrolled a genuine opportunity for conversation with the company line.

Election Integrity

Katie Harbath hosts a panel on election integrity. Photo by author.

Katie Harbath, Vice President for Communications and Public Policy, covered many of the ways the company is responding to what happened during the 2016 federal election. Most notably, Facebook has greatly enhanced users’ ability to report posts involving all types of nefarious activity, such as misinformation, false amplification, misrepresentation, harassment, violence, and account security issues. Flagging a post sends it to the humans who enforce the Community Standards, who determine whether it violates those policies and remove it if necessary. Facebook has also committed to doubling its number of Community Standards reviewers from 10,000 to 20,000, which should speed up the review of flagged posts.
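For readers who like to see the moving parts, here is a toy sketch of that flag-and-review flow. Facebook has not published its internal pipeline, so the names and structure below are entirely our invention; only the report categories mirror what was presented. Users flag a post under a category, flagged posts queue up, and a human reviewer removes anything that violates the Community Standards. Doubling the reviewer pool simply drains the queue twice as fast.

```python
# Toy model of the user-flagging pipeline described in the session.
# The category names mirror the presentation; everything else is ours.
from collections import deque

REPORT_CATEGORIES = {
    "misinformation", "false amplification", "misrepresentation",
    "harassment", "violence", "account security",
}

review_queue = deque()


def flag_post(post_id: int, category: str) -> None:
    """A user reports a post; it joins the human-review queue."""
    if category not in REPORT_CATEGORIES:
        raise ValueError(f"unknown report category: {category}")
    review_queue.append((post_id, category))


def review_next(violates_standards) -> tuple:
    """A Community Standards reviewer handles the oldest flag.
    `violates_standards` stands in for the human judgment call."""
    post_id, category = review_queue.popleft()
    action = "removed" if violates_standards(post_id, category) else "kept"
    return action, post_id
```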

The company is also tackling fake account activity, like the for-profit trolls in Macedonia who amplified election sentiment in countries outside their own. This behavior is easily identified, and Facebook is cracking down on those types of accounts to keep them off the platform.

A sample of political ad tagging on the Facebook platform.

They are also doing a lot of work on political advertising. Most notably, they now require account verification with a government ID and a confirmed mailing address before anyone can run political ads, making it harder to spoof a fake political action committee. They also clearly display who paid for each ad and provide more complete information about those sponsors, such as what kind of impact the ad is having and the other ads in that account’s archive.

During the panel that followed, however, I was struck by the cynical nature of political advertising. A Republican on the panel jumped immediately into a discussion of how (our paraphrase) “archived ads will allow us to do so much more research on candidates!” I suspect this is a wholly overlooked side effect of the ad archive tool and certainly not its intended use. Instead of letting typical citizens see what kind of sentiment an advertiser has promoted in the past, it hands campaign opposition-research firms a treasure trove of ad data to mine. It goes to show what a double-edged sword transparency can be on the frontier of digital engagement.

Despite the cynical approach to mining political ads, I was left heartened by what Facebook is doing to expand voter awareness and turnout. They’re moving to remind people to register to vote, encourage them to turn out on election days, and help them find their local polling places. Given Facebook’s tremendous reach, this should lead to a measurable improvement in the civic engagement our country sorely needs. Just a little effort on their part can easily outpace a mediocre get-out-the-vote operation.

Beyond that, they are connecting people more directly with their representatives by surfacing the public officials who represent each user and encouraging users to “like” and “follow” their representatives’ pages. Hopefully Facebook can continue to build on this and enable a two-way dialog between representatives and their constituents on policy issues.

Product

Chris Cox, Facebook’s Chief Product Officer. Photo by author.

Finally, Chris Cox, the company’s Chief Product Officer, gave a short free-form talk that largely recapped the highlights of the day. He also mentioned that he was in DC for a bit of a “listening tour” and briefings on Capitol Hill. There was also a very empathetic and heartfelt exchange with an audience member about use of the platform by people who might not want their data shared because of threats to their lives, including from their own governments. Chris did a good job of seeming both on top of the problem and willing to listen to other points of view about what to do.


Much More Left to Do

While we are encouraged that Facebook is making an effort to be more transparent and to respond to these challenging issues, it is clear that there is much work left to do. In the short term, Facebook needs to realize that Washington is an indispensable part of this conversation and that, while Menlo Park may feel like another world, the company must learn to speak the language of Washington if we are all going to be successful together.

We maintain that the solution to these issues lies in a genuinely motivated partnership between lawmakers on the Hill and tech teams like Facebook’s. While this week’s briefing was only a mile or so from the Capitol, it might as well have been on another planet. Silicon Valley needs to recognize the value of government liaisons and continue recruiting people for the key positions established to educate Congress and do the hard work of bridging the gap between industry and regulation.

We’ve got a lot of work left to do together.


Update: Fox News Among New Content for Facebook Watch

Source: Social Samosa

We later learned that Facebook Watch, the new livestreaming service Facebook is spinning up to replace its Trending Topics feature, will offer Fox News alongside CNN, ABC, and Univision.

This is very disappointing: no amount of technology work to confront fake news and propaganda will solve the problem if companies like Facebook promote content from organizations like Fox News, with its horrid track record of spinning and distorting facts. If Facebook wants to be taken seriously as a trusted partner in these matters, it should immediately pull Fox News content and consider blocking it from the platform entirely. The damage this active misinformation can do to our democracy is not worth the profit to Facebook’s shareholders.