Silicon Valley’s Civic Responsibility

The time has come for the tech giants to balance their profits against their responsibilities as part of a well-ordered society.


With technology companies in the hot seat over the ethics of operating some of the world's largest platforms, and over their civic responsibility for how the data they collect is used, we'd like to offer some thoughts.

At an Axios360 event last month on 5G and transportation, several attendees and elected officials spoke about the state of technology and regulation.

Greg Walden, Chairman of the House Energy and Commerce Committee, offered a particularly prescient remark: "if responsibility isn't there, regulation will be."

This is a common issue at the intersection of the public sector and technology. Government is always slower to regulate the free market, especially where technology is concerned. Just look to the tumultuous frontiers of cryptocurrency and drones for examples of technical capability outpacing regulation. But the speed of innovation doesn't exempt Silicon Valley from its responsibility to build technology that has a positive impact on the world and serves the greater good, rather than just driving the bottom line.


First, we had floods of Twitter bots amplifying and spreading truly fake news as part of misinformation campaigns throughout the 2016 elections. And though Twitter seems to have stepped in the right direction by purging hundreds of fake accounts late last month, it has done little since then to verify new accounts and ensure that there are actual humans on the other side of these tweets.

Then we had Equifax, a company surveilling American citizens’ credit behavior, losing the sensitive, personally identifiable information (PII) of roughly half the U.S. population. Yet they’re still open for business.

But now, we have the bombshell revelations that the data firm Cambridge Analytica exploited Facebook's developer terms of service and mined the social media giant's platform for the behavioral data of its users. It has become clear that the tech giants bear the burden of responsibility (and of the corresponding punishment when necessary) that comes with being embedded so deeply in their users' personal lives while being so reckless with that information.

Read the Guardian’s full article on Cambridge Analytica here.

Setting aside the fact that a foreign company was deeply involved in the campaign operation of a domestic presidential election, this news is particularly shocking because it offers a glimpse into Facebook's corporate ethos. Merely having a terms-of-service document buffered by legalese, with conditions about how data may or may not be used, is woefully insufficient. The company appears to have adopted an unspoken, look-the-other-way policy about what happened to that data once it left their API. That's not good enough.

It also appears that Facebook knew the data was being accessed, yet it neither policed the bad actors nor brought its considerable legal might to bear on them, nor (perhaps most egregiously) acknowledged the breach and warned users that their data had been collected and disclosed.

There's long been a bedrock principle of the digital economy: "if you aren't being charged for the service, then you're the product." Ad revenue, demographic data, lists of email addresses, and now hyper-specific behavioral data from social media are all being weaponized in the name of profit. And some of this data is scary stuff. As the New York Times reported, the data Cambridge Analytica has been peddling includes detailed psychological profiles of Facebook users.

If you’re a Facebook user, do you recall consenting to have your levels of extraversion or neuroticism shared with third parties? How about your IQ or political views? Neither do we…

There used to be a healthy tension between the inherent profit motives of these companies and the government's regulation and oversight where the impacts on citizens and society are concerned. The rapid growth and acceleration of technology has upset this balance, and it's time for the government to reassert itself. These problems are both technological and human in nature, and they can be tackled through a robust, even adversarial, partnership between Silicon Valley and government.


Google's motto is "Don't be evil," and that's a nice sentiment. But these companies must go beyond a motto and invest in actually policing data, to ensure that the access they grant partners isn't being turned to malicious purposes. Because, as a founder of Cambridge Analytica said of its leadership team: "Rules don't matter for them. For them, this is a war, and it's all fair."


The tech companies and their partners have recognized the critical nature of this struggle. It’s time for government to catch up to that fact.


How can you help?