That is the question.
Today’s large scale Internet services — companies like Facebook, Google, and Twitter — insist they are “platforms.” Perhaps it’s ironic that their insistence comes just when they face heightened scrutiny for their role in enabling third parties to inflict damage on others via their “platforms.” While old media like the New York Times remain publishers, responsible for what appears in their pages, the platforms are sticking to the story that they’re just “common carriers,” like the phone companies.
For the platforms, unfiltered user-generated content is a requirement for scale. If they had to approve every post — well — that would slow things down. And yes, there would have to be editorial approval, which might look arbitrary, but that goes with the territory. And some “letters to the editor” might not make it into print, even after editing for ad hominem attacks, factual accuracy, brevity, proper respect for disclosure laws, and other flaws. The fluff and vitriol would dry up in a hurry if platforms weren’t just sitting there, ready and begging for more user-generated content day and night.
A publisher should be defined by what people see rather than by the mechanics of the underlying content distribution machine. Inevitably, even on the platforms, content is somewhat curated by people, despite algorithms’ doing a lot of screening. HumInt is still involved at some point. Finally, there’s Mark Zuckerberg sitting on Mount Olympus pondering whether to take down a fresh live beheading video. Actually, I made that up. In the real version, some person, paid by the hour, ascertains what the algorithm thought it saw 10 seconds ago and kills the video.
So, humans ultimately decide what gets seen by whom. Why not own that?
There is no such thing as unbiased. Therefore, publishers need to declare their bias and move on. If they disapprove of videos that incite people to riot, they don’t let them run, whatever the law says. They’re under no obligation to provide anything in particular to anyone. And they decide what not to present at any given moment as well. It’s called editorial discretion, and it is nothing less than the high art of presenting the truth in the right perspective with appropriate context. Inevitably, judgment is involved. That’s why you want good editors who understand the seriousness of the job.
Say! You don’t find a lot of those around these days, do you?
Yes, the New York Times has some, as does the Washington Post, and the Financial Times. But this sort of skilled labor is hard to come by in a general way. So, Facebook, Google, and Twitter would have a hard time finding enough real editors to maintain their current scale. And that’s the point, isn’t it? They shouldn’t have that scale, which comes from a distortion of the click-based advertising model. Left to a more “circulation”-based model — where people pay for what they get — scale wouldn’t be such a critical issue.
Jimmy Wales, who runs a decent ship over at Wikipedia, launched an alternative to Facebook and Twitter recently, hoping to bring some of Wikipedia’s rationality to social media. It’s easy to take for granted, but Wikipedia is a gem, a north star in the digital pantheon. It’s curated and runs on a semi-circulation model; that is, it doesn’t charge for its service, but does ask for contributions, which apparently come in sufficiently to keep Wales up and running. Wikipedia is a stable encyclopedia of knowledge available to all, a magnificent achievement. It is not a platform for exchanging personal gossip. Serious people argue about the right way to portray something. Entries are ultimately a judgment call, but the general lines of reasoning are traceable. Most things come down to common sense, exercised with courage.
And I’m sorry, children, but not everybody has to talk. I know that comes as a surprise, but many people could benefit from more listening and less yammering. For example, many comments on published pieces are duplicative. An editor could choose a representative one and let it stand for the rest. If there were a lot of them, an accompanying note could say that it was one of many like it. So, the perspective would be kept, but not all the repetition.
Make the platforms responsible for 100% of the content on their pages, and they would rapidly find a way to ensure they weren’t violating the law, which would lead to a whole lot less — but better — material.
And what of all those angry citizen journalists, some of whom have been trying to set the place on fire with their incendiary language? Where do they get to go to “express” themselves, to exercise their rights of “free speech,” and to blather their opinion at any length any time of day or night? Well, apart from hosting their own forums, with all the attendant costs and headache, they have to appeal to someone else’s and satisfy that outlet’s editorial standards. Your cartoon might not make it into the New Yorker, no matter how good you or your uncle Bob thought it was.
The platforms — Facebook, Google, and Twitter — style themselves as common carriers. But what does that mean? In the United States, the modern common carrier statutes were legislated during the Depression, when the Communications Act of 1934 created the Federal Communications Commission (FCC) to regulate companies building out huge businesses in broadcasting and telephony, particularly the Bell System, which had cornered the entire telephony market. Under Title II of the act, the FCC could classify telephone companies like Bell as common carriers (broadcasters were regulated under other titles of the same act), recognizing their status as monopolies or oligopolies but enjoining them to provide service on a uniform basis, with published prices, and to serve the entire market, not just the most profitable segments. Thus, not only uptown New Yorkers got phones and broadcasts, but so did farmers way out in the countryside.
Common carriage laws in general date back to the 1880s, when railroads were the monopolies in question. Simply put, a common carrier transports things for people: coal in railcars, voices over telephone lines, bits over TCP/IP networks. In order to transport things for people in a fair and even manner, the carrier cannot discriminate against either the cargo or the people. If people are the cargo, the carrier can’t individually price tickets based on any human characteristic. Nor can it discriminate between coal cars and milk cars. It can charge customers only by objective measures like the weight of the goods or the distance traveled.
An important corollary to the Title II designation of the 1934 act was that — in both physical and metaphorical ways — carriers were not supposed to “look inside the packages,” but simply pass them on unexamined as proof that no discrimination was being practiced. Hidden in this part of the statute was a gift that would keep on giving in the Internet age: a lack of responsibility for the content being carried. Carriers have been given a pass, even if they are found to be carrying illegal content (drugs in freight, child porn on the Internet). The shipper or recipient, rather than the carrier, is responsible.
Now, we come to the update of common carriage laws, Section 230 of the Communications Decency Act of 1996, a statute that originally served as a liability shield for Internet Service Providers (ISPs). At the time, subscriptions to ISPs — companies like AOL or CompuServe — were how most people got on the Internet. The World Wide Web had just been invented. Section 230 says that “with some exceptions, online platforms can’t be sued for something posted by a user — and that remains true even if they act a little like publishers, by moderating posts or setting specific standards,” according to reporting by Alina Selyukh for National Public Radio.
In the telling of it, this has meant that big Internet platforms are allowed to clean up bad content after the fact rather than before. They argue essentially that they will respond to take-down requests rather than curate content. In this way, bits can move around the Internet with a minimum of friction.
The current dominant platforms — Facebook, Google, and Twitter — love minimal friction. Speed of click is the metric they watch most diligently. This whole curation business is a cost, and it slows things down. Well, yeah. It does.
Ain’t it a shame.
An echo of this question of responsibility for content resounded a few weeks ago, when the Supreme Court decided not to hear Remington Outdoor Co.’s appeal of the Connecticut Supreme Court’s judgment in favor of the Sandy Hook families.
Remington claimed that it simply made, marketed, and sold the Bushmaster XM15-E2S rifle that Adam Lanza used to kill 26 people in the Newtown, Connecticut, elementary school. The company asserted that it had no other culpability. The lower court ruled that Remington may indeed have a share of responsibility for how its products are used. That share will be determined as the families pursue their case in the state court system.
So, you see, this issue of platform vs. publisher has relevance for U.S. society well beyond the tech sector.