Part 3 of a series on America’s Political Divide

The Liberal-Conservative-Libertarian Collaboration that Shaped Today’s Political Environment

Policy makers thought the “Information Superhighway” would open up and democratize speech. You see where this is going.

Dan Heller
Politically Speaking
15 min read · Dec 22, 2021


Map of the Internet: 1973 (ARPA/DOD)

[This is part 3 of a multipart series exploring America’s political divide. Part 1 addresses whether America’s divisions are caused by our two-party system (they’re not), while part 2 explores the difference between how Americans themselves feel about actual issues and how their parties behave (very differently). In both articles, I argue that America’s political divisions are due neither to the number of parties nor to “liberals versus conservatives,” but to systemic departures from democratic norms employed by the conservative right, which is now a worldwide problem. The internet is seen as the breeding ground for this phenomenon, but its roots run deeper than that. In this essay, I explore how internet policy came into effect in the first place, as a byproduct of a strange and unintended collaboration between liberals, conservatives and libertarians. Things went awry from there.]

In 1994, TV and print news organizations picked up on an Associated Press report that Microsoft had acquired the Roman Catholic Church. The press release said that Microsoft expects “a lot of growth in the religious market in the next five to ten years… the combined resources of Microsoft and the Catholic Church will allow us to make religion easier and more fun for a broader range of people.”

To America’s shock and amusement, the mainstream media had published the report without vetting the story. After two weeks of late-night talk-show giggles, so many people still believed it that Microsoft had to issue its own press release declaring the original one fake.

What only a few people knew at the time was that the article started as a joke on the internet. Yes, the “Internet” (which, at the time, was required to be capitalized). The term was unfamiliar to most Americans, and so, obviously, was the prank. But neither was new.

The Internet was launched in 1969 as ARPANET, a project of the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense. Its purpose was to connect multiple autonomous computers that could share information independently of a centralized system, making the network less susceptible to being taken down by, say, a nuclear holocaust.

Through the 1970s and 80s, the network’s commercial promise evolved to the point that by 1993, Vice President Al Gore declared, “The Information Superhighway will flood the economy with innovative goods and services, lifting the general level of prosperity and strengthening American industry.”

True, but every coin has two sides, and the Microsoft hoax hinted at the other side of this one: obvious misinformation — even when presented as a joke — could easily be taken as truth. The hoax made Mr. Gore’s premonition about the effects of the internet (oops, sorry… the Internet) seem a tad naive.

Worse, some of us who’d worked in the industry worried that America was wholly unprepared, because we’d seen the same thing throughout the 70s and 80s, not just on the internet but on a variety of other online social networks. And that’s just the social side; the technical vulnerabilities were still only a mild dystopian fantasy. What too few people knew — and were powerless to do anything about — was how radically the political landscape had changed during the Reagan administration, making the Internet’s potential at once invigorating and deeply frightening.

You see where this is going.

The Cultural Roots of Online Societies

The fast-growing technological benefits and risks of the internet aside, it’s really the social aspect of online communities that set the political compass. But the two go hand in hand, because the bowels of the internet rely on technology in ways that shape social behavior. My favorite excerpt on this comes from Quinn Norton’s fabulous summary of networking, which even non-technologists can appreciate, in her piece titled “Everything Is Broken”:

Now imagine billions of little unknowable boxes within boxes constantly trying to talk and coordinate tasks at around the same time, sharing bits of data and passing commands around from the smallest little program to something huge, like a browser — that’s the internet. All of that has to happen nearly simultaneously and smoothly, or you throw a hissy fit because the shopping cart forgot about your movie tickets.

To put this into context, let’s go into the wayback machine to the 1970s, when computers were benefitting from the cheap manufacturing of microchips and transistors pioneered in the 60s. Aside from some technical computing, many consumers used telephone lines and modems to connect to large mainframe computer systems that hosted electronic bulletin board systems (BBSs).

Each BBS varied in size, style, and community, but they all provided roughly the same services: chat, games, love, investment advice, short stories, gambling, and of course, the ability to pontificate about crazy political views (again, many of which were literally just jokes). Sprinkled among the jokes were insane conspiracy theories: the Illuminati, Elvis not being dead, QAnon-style cabals, the Flat Earth Society, the moon landing being staged, and many, many more. It was often hard to tell which were jokes and which were falsehoods people genuinely believed.

When Apple Computer introduced its first model in 1976, BBS growth exploded. The largest such system was CompuServe, which also started in 1969 and, shockingly, is still around today. By 1985 — almost ten years before the Microsoft hoax — BBSs had gotten so popular that the Washington Post published “Computers Becoming Nation’s Bulletin Board,” which described these networks as “a whole new form of mass communication that is cheaper, more accessible and less regulated than almost any other national medium.”

This didn’t really resonate with most Americans. Even in 1985, most consumers were still having trouble setting the clock on their VCRs (video cassette recorders). Consider the public’s attitude about digital technology, as gathered by the Miami Herald in 1985 and published in an op-ed by David Grimes.

The fact that 82% of respondents “wished that everything ‘digital’ would just go away” illustrates the public’s indifference to — and even dismissal of — digital technology.

Consider the public’s response when the news came out that seven teens from New Jersey were arrested and charged with “conspiring to use their home computers to exchange stolen credit card numbers […] how to make explosives and how to call coded phone numbers in the Pentagon.” The American Civil Liberties Union defended the teens by saying, “the case might have involved a violation of the First Amendment rights of some of the young computer experts if all they did was list information on their electronic bulletin boards.”

And yet, policy makers were indifferent. Liberals have long favored “free speech” as the lesser evil compared to restricted speech, even when some of that speech is distasteful or produces bad outcomes. So the ACLU met little resistance in defending online shenanigans.

On the right, conservative groups felt the same about online activities — they wanted to spread their messages about white power and religion, so they also wanted less government oversight on speech.

One such group was the “Aryan Nations Liberty Net” (ANLN), which formed in 1984 as a consolidation of many smaller BBSs from the 1970s. The ANLN also enjoyed the ACLU’s protection, without even a hint of concern from the liberal Left. Even the Anti-Defamation League was largely indifferent to online speech; in 1985, it issued a report saying, “there is little to suggest that [BBSs] represents a great leap forward in the spread of anti-Semitic and racist propaganda.”

The fact that liberals and conservatives both embraced the “free speech” banner is comparable to the way today’s liberals and conservatives each use the phrase “my body, my choice” to support polar-opposite positions on the government’s role in cultural issues: liberals use the phrase to oppose the government limiting their access to abortion, while conservatives use the same expression to oppose the government forcing them to wear masks and get vaccinated against COVID. When each side adopts the same mantra, you suspect one of them is making a grave error in judgment that could have devastating consequences for themselves, the country, and the world. (Hint: abortion isn’t contagious.)

So it was with speech in the early era of online social networks: everyone was waving the “free speech” flag, but one side was making a grave error whose devastating consequences for culture, society and democracy we are only now coming to know.

This apathy happened to coincide with another cultural shift: antigovernment rhetoric, resurrected from the political graveyard of the 1960s by the Libertarians.

You see where this is going.

Public Attitudes Towards Government Shifts

Most think of Ronald Reagan as having ushered in the right-wing conservative Christian movement of the 1980s, but as Rick Perlstein describes in his book “Reaganland,” Reagan was a libertarian conservative: he believed in conservative values and philosophies, but politically he was more of a libertarian, feeling that people should voluntarily make moral choices without government interference or coercion. In a 1975 interview, he said, “I don’t believe in a government that protects us from ourselves.”

This put him both at odds with — and attractive to — the evangelical movement of the 1970s. As illustrated in this NY Times documentary, evangelicals rarely registered to vote and believed politics was fundamentally amoral. Though Jimmy Carter was himself a born-again evangelical who tried to bring them into politics, evangelicals opposed his policies: prayer was no longer permitted in schools, Blacks were given equal rights and access through desegregation, private Christian universities were no longer eligible for public financing if they discriminated against minorities, and Carter’s counsel on families included gays and lesbians.

But their attitudes changed in 1979, when Jerry Falwell capitalized on his fast-growing television ministry, spawned by the rapid growth of cable television. His galvanizing message was abortion, which was a gamble at the time: the issue ranked lower on evangelicals’ collective radar than others, but their anti-gay and segregationist views were simply not palatable to a national audience. By focusing on abortion, Falwell could preach a message that didn’t appear “hateful.” Furthermore, he could couple it with politics: the government should have a very strong hand in moral issues but be very hands-off on everything else, a combination evangelicals were very comfortable with.

There were no perfect candidates for such an odd political stance, but Ronald Reagan was good enough. As with all the Republican presidents who would follow, Reagan was risky: he had signed the Therapeutic Abortion Act in May 1967 (liberalizing abortion law in California), he had been divorced and remarried, and he was heavily into astrology. But he was also willing to change his position on abortion (as were Donald Trump and George H.W. Bush) in order to gain evangelicals’ favor.

By 1980, Reagan’s presidential platform was established: his campaign slogan was “Make America Great Again,” he espoused religious righteousness and virtue, and the racist and homophobic language was eliminated from the GOP platform. But his inaugural address revealed his libertarian policy agenda when he proclaimed, “In this present crisis, government is not the solution to our problem; government is the problem.”

This solidified a new era of weakening the federal government and its institutions, relaxing regulations, lowering taxes, disempowering unions, and — yes, you saw it coming — easing restrictions on speech.

Insofar as speech was concerned, the Reagan administration set its crosshairs on the Fairness Doctrine, a policy rooted in the Communications Act of 1934. This seemingly obscure provision “required holders of broadcast licenses to present controversial issues of public importance and to do so in a manner that was honest, equitable, and balanced.” It made it very difficult for extremist groups and religious organizations to broadcast their views about race and gender issues, not to mention antigovernment rhetoric, over television and radio. At least, not without being honest, fair and equitable.

While the Fairness Doctrine applied only to broadcast license holders (radio and network television), these groups worried that the FCC (and the Democratic Congress) would extend the doctrine to cable television as well, because it was considered “a mass communications medium,” and because cable was where they saw their future. (They didn’t yet perceive the internet as a viable messaging platform, but again: you can see where this is going.)

So in the early 1980s, we have three players in cahoots: Liberals who loved unfettered speech, conservatives who wanted to pontificate extremist views, and libertarians who wanted to dismantle government as a general principle. It was this last principle — “weakening the federal government” — that worried Democrats, who still believed in strong and reliable institutions and the role of government as a baseline for establishing boundaries on the public welfare (speech, economics, technology, health, security, etc.).

As if on cue, the final player emerges on the scene that would change it all: tech entrepreneurs.

You see where this is going.

Technology Libertarians Flex Their Muscles

We in the 2020s think of “AI” as a matter of daily life. Artificial intelligence crunches “big data” (information gathered by the computers and phones we use every day) to bring us things like book and music recommendations, advertising that matches our personal preferences, and of course, news and political information customized to our tastes.

And while there’s been a great deal of benefit, there are also serious pitfalls, especially threats to personal privacy, security and even democracy as we know it. But we know this today only because we’ve experienced what AI and social media networks can do. That is very different from the vision tech entrepreneurs promoted in the early 1980s, when artificial intelligence was a huge movement. I was part of it.

I got my degree in computer science from the University of California, Santa Cruz in 1985, with a focus on artificial intelligence. When I was looking for jobs after graduation, the primary employers in the AI sector were the four horsemen: Teknowledge, Intellicorp, Inference Corporation and Carnegie Group. Each company promoted the idea that “expert systems” would one day be able to predict what music and books you will like, the news you want to hear, the food you’ll want to eat, the people you’ll want to date, and even the candidates you should vote for (based on your profile characteristics).

All this information and prediction, the pitch went, would not only be accurate, it would be free: the AI engines would track your online activities, which costs you nothing, and the companies would make money through advertising (just as network television programs were free in the 80s).

The public loved it. After all, what could possibly go wrong?

“What could go wrong,” the technologists warned, “is if the government got in the way. That’s what could go wrong.”

As it happened, I was also very involved in evolving internet standards and networking protocols, so I opted instead to work for SRI International, a non-profit research and technology company; my group focused on AI and evolving internet technologies, where my role was in electronic mail.

Working in both the internet and AI spaces, I could see clearly that this futuristic vision (and rhetoric) was also moving Democrats, partywide, to support the “smaller government” shift the Reagan administration had been promoting.

To be sure, the misgivings many of us felt about an unregulated internet were not universal. Most technologists also lean libertarian, but on the left: they tend to be highly liberal on social and cultural values, polar opposites of right-leaning libertarians, yet they also felt strongly that government shouldn’t be too involved. It was then — and still is today — a fractured community on this point.

Nevertheless, by the mid 1980s, everyone was now on board: liberals, conservatives, Republicans, Democrats and Libertarians. And just like that, the Reagan administration started slashing regulations and government institutions to unprecedented levels, many of which could have provided some protections for privacy, data, security and above all, speech.

Then in 1987, the FCC revoked the Fairness Doctrine, stating:

“The intrusion by government into the content of programming occasioned by the enforcement of [the Fairness Doctrine] restricts the journalistic freedom of broadcasters … [and] actually inhibits the presentation of controversial issues of public importance to the detriment of the public and the degradation of the editorial prerogative of broadcast journalists.”

Fast-forward ten years, and by 1996, it appeared everyone got what they wanted: The Left got an internet where speech was almost entirely unregulated; technology companies got to build a world wide web that was as open (and as vulnerable to crime and misinformation) as one could get; and the neoconservative right wing of the GOP got to promote their political rhetoric on conservative talk radio.

Oh, and Roger Ailes became the founding CEO of Fox News. The network’s “fair and balanced” tagline seemed to mock language from the Fairness Doctrine — the proverbial thumb-poke in the eye.

Alas, there was still one more aspect of online speech to be addressed: whether online platforms should be responsible for moderating harmful speech by their users. Again, the combined political allies ultimately felt that open speech was better than preventing bad speech, which led them to pass Section 230 of the Communications Decency Act of 1996, stating that online platforms cannot be held liable for content their users post.

No one predicted anything bad would happen. Quite the contrary: it was seen as a bold new era of freedom for speech and information in general. The sentiment was best distilled by John Perry Barlow, a libertarian who, in 1996, wrote A Declaration of the Independence of Cyberspace, which reads in part:

“Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.”

Even President Clinton was a fan, declaring in his 1996 State of the Union address that “the era of big government is over.”

You see where this is going.

The “Free Speech” Conundrum Is Not Really About Speech

It’s correct to say that the events described here are complicated, and too simple to suggest that any individual element could have been avoided. But it’s also impossible not to see that things have gone horribly awry. Today, broad swaths of the country believe in an astonishing number of dangerous conspiracy theories: that the coronavirus doesn’t exist, that the 2020 election was stolen, that the government is run by a cabal of cannibalistic pedophiles. Yes, seriously. QAnon believers are alive and well across half the country.

We think we learned something new with the internet — that false information travels faster than truth, that fear and hatred motivate people to vote and donate money, and that simple explanations are more easily believed than nuanced ones. I’ve written extensively about how easy it is for anyone — yes, including you — to fall for conspiracy theories.

But these are not new lessons at all. Just as many of us worried about the internet’s implications at the time of the 1994 Microsoft hoax, legislators understood exactly these principles back in the 1930s, when the Communications Act laid the groundwork for the Fairness Doctrine.

It has led to renewed debate on how to rein it all in. Break up the tech companies? Into what? Smaller ones that will do the same thing? No.

Repeal section 230? Sure, but replace it with what?

A common refrain about regulating online activity is, “…but what about free speech?”

Most serious observers now say it’s not about “free speech,” nor should it ever have been. Speech and internet policy evolved as a byproduct of a much more insidious deconstruction of institutions. Fix that, and you fix the speech problem. And that leads to another trend in political circles: revisiting the perceived value of “small government.” Perhaps the government should be more involved? A tad, maybe?

According to Pew Research, Americans’ trust in government always rose during Democratic presidencies, except for Carter (peaking at 77% during the Johnson administration), and dropped precipitously during every Republican administration, starting with Nixon (when Libertarians aligned with the GOP).

Pew Research also reports that 47% of Americans say major tech companies should be regulated by the government, including 48% of Republicans and Republican-leaning independents.

But Democrats have a very hard time fighting to reinstitute democratic guardrails, even when they control Congress and the White House, as they do in 2021. With Republican states taking over voting administration and certification, it’s unlikely the Dems will get another chance until there’s some other systemic change.

Some say we need to move to a multiparty system, an idea I refute in the first essay of this series, where I argue that America’s problems have nothing to do with how many parties there are, but with the erosion of institutions. And that is not unique to America: as this NYTimes analysis shows, global democracies have been backsliding since 2010 as “core elements like election fairness or judicial independence” have been weakened.

Others have argued that America’s division is due to an existential war between liberals and conservatives, which I refute in my second article in this series, where I point out that our political divide does not mirror how people actually feel about issues.

This creates a circular dependency: How do you fix institutions, when only the government can do that? I don’t have the answers or firm predictions, but I do believe it starts with speech, which I’ll cover in my next article.

You may not see where this is going.