Face-swapped pornography, like many technological and social developments of our age, seems like a bad thing but is, in fact, much, much worse. It’s not hard to conclude that the synthetic-porn hobbyists, busily helping each other teach themselves to teach computers how to paste new people’s likenesses into existing porn video, are doing something violative; Pornhub has declared that the resulting product is nonconsensual and violates its terms of service.

Disturbing as the prospect of unwittingly appearing in a hardcore sex video is on its own terms, though, it’s much more disturbing as a test case, a cheap homemade version of the larger, well-funded project of making plausible video of things that never happened — what BuzzFeed’s Charlie Warzel described this month as “a potential fake news Manhattan Project.”

The difference between video manipulation and the actual Manhattan Project is that the scientists on the Manhattan Project knew they were part of a war effort and that their goal was to build the world’s most powerful killing device. Once they had built it, they understood that they had changed the world in a grave and awful way.

The video manipulation industry seems to have no awareness at all of what it makes possible. In an Adobe product presentation from late last year, a crowd applauds as company representatives demonstrate how the experimental Project Cloak technology can remove an unsightly lamppost from a video of a Russian cathedral. But wait! There’s more!

To further applause, the reps go on to show footage of two hikers going through a canyon, and the software removes a strap from one of their backpacks. Can it remove the people? “Yeah, we can — we can remove the people,” the presenter says. “Yep.” And the people are wrapped in a purple outline, as an unwanted visual element, isolated, and removed.

“Whatever we know about our society, or indeed about the world in which we live, we know through the mass media,” Niklas Luhmann wrote in his 1996 book The Reality of the Mass Media. “This is true not only of our knowledge of science and history but also of our knowledge of nature. What we know about the stratosphere is the same as what Plato knows about Atlantis: we’ve heard tell of it.”

To live this way, we depend on mechanically mass-distributed words, sounds, and especially images. Things that appear on video are more real than things we are merely told about. There were plenty of good reasons for the fast-food executive Andrew Puzder not to become secretary of labor, but the one that guaranteed his nomination’s defeat was the appearance of his ex-wife, disguised in a wig and huge tinted glasses, on a videotape of an old episode of Oprah about spousal abuse.

The abuse allegations had long since been reported, along with the fact that she recanted them in the course of a custody settlement. But the visuals made the difference. One oddity in the scandal, testifying to the supernatural power of the format, was that the company that originally aired the video refused to cooperate with the effort to find it and make it public. Oprah had put forth the woman’s image under a pseudonym, as a representative of the generic problem or scandal of “high-class battered women.” Now, though, it was evidence about a specific, powerful person, a more potent thing than the producers had ever meant to create.

It got out anyway, just as Donald Trump’s grab-them-by-the-pussy Access Hollywood tape got out, despite NBC’s desire to slow-walk the news. In the 22 years since Luhmann described the inescapability of mass media, the mass media have lost a great deal of the control they enjoyed as institutions. Production and dissemination have fractured and become participatory, or have created new high-speed simulations of participation.

Yet the space in which mass media operate remains the only space there is. There is still no other way of knowing anything on the scales at which things demand to be known: the nation, the government, the economy, the world, the culture. But the knowledge is now broken and destabilized. Images and stories can be distributed to huge expanses of the planet by anyone, or they can be distributed to a small and restricted set of people. On the receiving end, they are experienced exactly the same way.

The face-pasters understand this as a truth, whether or not it rises to the level of insight. For the purposes of making synthetic porn, a mainstream movie actor and a person in a porn clip exist on a single plane of reality or unreality. Both are moving images, nothing more, to be received and consumed. It’s the amoral logic of the learning machines themselves: Here are two versions of the same type of input to be reworked into a new output of that same type. Any distinctions of meaning and value — of underlying reality, as people might wish to understand it — amount to easily overwritten metadata.

It’s easy, and grim, to think up the possibilities: outrageous remarks a political candidate never uttered, false confessions, counterfeited acts of violence. Troops in different uniforms, under different flags, doing the same things. Inaugural crowds overflowing the Mall and packing the grandstands of the parade route.

No one really set out to demolish reality. No one set out to enroll the world’s population into a full-time surveillance system, either. They just built smartphones, and those phones needed to stay in touch with the communication networks around them. And more and more services exchanged information about movement and usage and behavior and attention, and the phones became advertising platforms, and the advertisers wanted to know as much as they could about potential customers — which turned out to be approximately everything.

And likewise, even while some people are building the tools to make video of events that never happened, other people are building the tools to distribute that material through separate, opaque, and unaccountable channels. Facebook operates as a microtargeting service in the guise of a vast public forum, so that no one can tell what anyone else’s Facebook might look like. It took months of crisis, a clouded presidential election, and federal investigations before the company decided it would begin working on ways to make ad campaigns visible and accountable. Twitter offers ads linked to nonexistent Twitter accounts, so it is impossible to know where they came from or who else has seen them. The context is as unverifiable as the content.

So far, the scare over fake news has mostly focused on the senders — foreign agents, Russians, a hidden enemy but one that can be pictured. But the sender is almost beside the point. The horror lies in being the recipients, each of us alone in a hermetic reality, unable to trust what we see or to know what other realities might be right beside us.