China’s Censorship Machine, Made by Millennials

Caiwei Chen
22 min read · May 13, 2022


Meet the young people who grew up during the most open time of the Chinese internet, but made a living by smothering the last corners of free speech

Image by Caiwei Chen

At first glance, Jiajun Zeng, 29, is not unlike other athleisure-wearing, avidly tweeting Bay Area “tech bros.” Now a graduate student at Northeastern University majoring in information systems, Zeng moved to San Jose, California, from China, in 2021, after three years working in his home country’s booming tech industry. But what he did in that tech industry is very different from what some of his colleagues in Silicon Valley are used to. He worked on the sophisticated algorithm that helped ByteDance, the Chinese tech upstart behind TikTok, fulfill its obligations to “moderate content.” That is to say, he was a censor.

This wasn’t what he had hoped to do when he first became obsessed with computers as a little kid. Born in 1993, Zeng is almost the same age as the Chinese internet. In 1994, a pilot research network started by a group of experts at the Chinese Academy of Sciences opened a 64K international dedicated circuit to the internet through the U.S.-based company Sprint, marking China’s official linkage to the global internet. Zeng’s father, like other technology enthusiasts, started bringing home hardware parts and assembled the family’s first computer. Zeng began browsing the internet in middle school, spending long stretches on the Reddit-like Baidu Tieba looking up everything that interested him, from his favorite football club, Manchester United, to places you could travel to with a Chinese visa. “The internet granted me access to a bigger world that a small-town kid like me would have otherwise never had,” said Zeng.

A smart kid with a thirst for knowledge, Zeng started using FreeGate, an anti-censorship tool, at the age of 14 to access websites that were blocked in China. In high school, Zeng grew skeptical of the Chinese education system, which to him was defined by cramming, conformity, and over-the-top competition. At 19, Zeng made a bold decision, dropping out of Hunan University of Technology, where he was a freshman, and enrolling at the University of Tartu in Estonia, where full-time attendance is free for all undergraduate studies. “I wouldn’t have found the opportunity, managed to apply, or survived four years in Estonia if it weren’t for the internet,” said Zeng. Where he was from, studying abroad was not common, and the few who went abroad for undergrad usually went to the U.S., U.K., Canada, or Australia.

“The internet drastically changed the trajectory of my life for the better,” said Zeng, “and I hope that more people can have access to more information through the internet to fix their own plight.” After his time in Estonia, it felt natural for Zeng to return to China and pursue a career in its booming internet industry. He could work almost anywhere: he did stints at search giant Baidu, social media titan Tencent, delivery mogul Meituan, and finally ByteDance. The emergence of TikTok stirred up excitement in Zeng’s heart — for the first time, Chinese social media products were taking over the world in a frenzy, reversing the common narrative that they were merely “copycats” of Western products. Zeng hesitated when he learned the job was censorship-related, but decided it was worth it if that meant being part of the frontier of Chinese internet technology. ByteDance, with its Silicon Valley-inspired culture that stressed egalitarianism and creativity, had been a dream employer among young job seekers since 2018, promising a chance to create substantial change through innovation.

That was not quite how things worked out. Zeng became a product manager for an internal platform that almost all the company’s products use for this “moderation.” The system he helped develop streamlined the process of reviewing posts by integrating censorship algorithms and human reviewers. Zeng and his team at the Department of Trust and Safety worked to keep obscenity, nudity, risky behavior, and, most importantly, “politically sensitive” content off the platforms. They did so following guidelines set out by the government, but not under the government’s direct order or supervision. Nonetheless, they were told during onboarding that if they let too much forbidden content slip through, with potentially serious consequences for the company, their own jobs would be at risk. So they took their work seriously.

TikTok and its Chinese version, Douyin, have skyrocketed in popularity around the world in the past five years. By 2021, TikTok and Douyin boasted 1.29 billion active users worldwide, comparable with much older competitors like Facebook, Twitter, and Instagram. Apart from the short-video app, ByteDance also owns a number of other content apps, including the news aggregator Toutiao, the YouTube-adjacent Flipagram, and a music streaming service, Resso. The wide portfolio of apps shares one winning recipe — ByteDance’s proprietary system for distributing user-generated content. With the massive amount of content comes the difficult undertaking of keeping it in check — a task more than 10,000 people in China alone work grueling hours to perform. The content reviewers sit in censorship centers around the world: Jakarta, Kuala Lumpur, Gurgaon, and emerging Chinese cities. Fewer than 200 white-collar managers, meanwhile, stay at the headquarters in Beijing to oversee the process. Content reviewers are assigned videos based on their primary language and region. Zeng was among the 200.

The system had two purposes: to locate illicit content and handle it as efficiently and comprehensively as possible by means of technology, and to facilitate the work of human content reviewers in the “base.” Zeng’s team served as a link between the technology team and the policy team, where needs and demands from superiors were transformed into practice. Above Zeng there was “the office,” a group of more experienced managers who oversaw the process. The content moderation unit was primarily made up of millennials and older Gen Zs like Zeng, most of them graduates of top-tier universities with gold-plated résumés.

Zeng described the censorship process as effective and sophisticated: content would first be filtered through an AI censor based on categories of forbidden content. The AI censor used a combination of voice recognition, transcription, text analysis, keyword cross-checking, and image recognition to rule out “high-risk” content first, and then sent the “low-risk” content to human reviewers. Typical imagery that would instantly trigger a ban included the faces of Chinese president Xi Jinping and other prominent political figures, dissidents such as the Dalai Lama, guns, and religious objects like the cross.
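Zeng did not walk through the implementation, but the triage he describes can be pictured roughly like the sketch below, in which hypothetical detector outputs are combined into a risk score and only what the machine cannot confidently block is routed to a human queue. Every name, keyword, and threshold here is an illustrative assumption, not ByteDance’s actual system.

```python
# Illustrative sketch only: a two-stage moderation triage loosely based on
# Zeng's description. All detectors, keywords, and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    transcript: str      # assumed output of speech-to-text
    ocr_text: str        # assumed output of on-screen text recognition
    image_labels: list   # assumed output of an image classifier

BANNED_KEYWORDS = {"example_banned_term"}          # placeholder list
BANNED_IMAGE_LABELS = {"political_figure", "gun"}  # placeholder labels

def risk_score(post: Post) -> float:
    """Combine simple keyword and image signals into a 0-1 risk score."""
    score = 0.0
    text = f"{post.transcript} {post.ocr_text}".lower()
    if any(k in text for k in BANNED_KEYWORDS):
        score += 0.6
    if any(label in BANNED_IMAGE_LABELS for label in post.image_labels):
        score += 0.6
    return min(score, 1.0)

def triage(post: Post, block_threshold: float = 0.8):
    """Auto-block high-risk posts; queue everything else for human review."""
    score = risk_score(post)
    if score >= block_threshold:
        return ("blocked", score)
    return ("human_review", score)

if __name__ == "__main__":
    demo = Post("p1", "an ordinary clip", "", ["cat"])
    print(triage(demo))  # ('human_review', 0.0)
```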

Although the list of touchy topics remained relatively stable, savvy Chinese users came up with new and creative ways to subtly hint at these topics every day. To combat these attempts, platforms needed to constantly develop countertactics and implement changes. For example, images that included both candles and tanks would trigger the system’s “politically sensitive content” alert, as the combination would be read as a memorial to the Tiananmen Square massacre of 1989, in which thousands of students were killed by the Chinese military during pro-democracy protests. Content referencing the song “Do You Hear the People Sing?” from the musical Les Misérables is also highly restricted, as the defiant anthem of popular revolt has found an audience among a Chinese public that feels its voice is being silenced.

Aside from routine checks of newly uploaded content, the system also pulled the most viral content on the platform every hour for an additional check, to make sure it complied with the ever-shifting rules handed down by the Cyberspace Administration of China. The bar for what constituted illicit content fluctuated with the calendar: days of historical significance to the Party, or days when a key Party conference or international diplomatic event was underway, brought extra scrutiny. “A ‘low-risk’ video on regular days might become a ‘high-risk’ one, and get the user banned from the platform on special days,” said Zeng.
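The hourly sweep can likewise be imagined as a scheduled job that re-scores the day’s most viral posts against a threshold that tightens on sensitive dates. The dates and numbers below are placeholders, a sketch of the logic rather than the real policy calendar.

```python
# Illustrative sketch of the hourly re-check Zeng describes: viral posts are
# re-scored with a stricter auto-block cutoff on dates of extra scrutiny.
# The dates, scores, and thresholds are placeholders, not the real rules.
import datetime

SENSITIVE_DATES = {(6, 4), (10, 1)}   # hypothetical examples only

def block_threshold(day: datetime.date) -> float:
    """Stricter auto-block threshold on politically sensitive dates."""
    return 0.5 if (day.month, day.day) in SENSITIVE_DATES else 0.8

def recheck(trending, day=None):
    """trending: list of (post_id, risk_score) pairs from an earlier pass."""
    day = day or datetime.date.today()
    cutoff = block_threshold(day)
    return {pid: ("blocked" if score >= cutoff else "kept")
            for pid, score in trending}

print(recheck([("p1", 0.6), ("p2", 0.9)], datetime.date(2022, 6, 4)))
# {'p1': 'blocked', 'p2': 'blocked'}; on an ordinary date p1 would be kept
```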

Jobs in content moderation were attractive — ByteDance offered employees high salaries and promising internal growth. They were also intellectually stimulating, with cutting-edge software used to parse massive amounts of data. But the costs soon became clear. Growing up in authoritarian China, Zeng and his coworkers knew they had to play along to survive, and that there was little they could do to change the system. Still, the dull ache of deteriorating freedom of speech weighed on most of them. Zeng remembered coworkers discussing the sense of deprivation they felt at work at the team’s weekly social gathering. “I started to realize that the moral guilt was building up, and it started to offset any gratification I experienced from achievement at work,” said Zeng. Most of his coworkers agreed that all they could do was to “raise the muzzle by one inch” when possible, because going against the regime was simply not an option.

In spring 2019, the censoring guidelines were updated, and Zeng found himself going through the sea of existing content to recheck it. A lot of content that had been deemed OK was now considered “subversive.” Some of the changes had to do with new political realities. For example, feminist discourse would come under increased scrutiny every time pop culture celebrated the sentiment. Others had to do with the changing status of various individuals. For example, Denise Ho, a popular Hong Kong singer, had been blacklisted in 2019 by the Chinese government for her support of Hong Kong’s democratic movement; now all content featuring her had to be removed.

There was also now extreme sensitivity toward COVID-19, which was first detected in Wuhan in December 2019 and quickly swept across China. Zeng and his coworkers spent a good amount of time that winter trying to keep “rumors” of a new virus off TikTok and other platforms. Not only were posts about the outbreak censored; key sources of news were punished. One of the first whistleblowers, Dr. Li Wenliang of Wuhan Central Hospital, recognized the dire situation and took it upon himself to warn colleagues and friends. Dr. Li was summoned by the Wuhan police and admonished for “spreading rumors and disrupting social order.” He was forced to sign a pledge that he would not make such statements again. Weeks later, Dr. Li died of the virus he had contracted on the front line of its treatment.

Dr. Li’s tragic death triggered widespread public outrage, and it was the last straw for Zeng. “I was scrolling through Weibo and seeing posts about Li Wenliang popping up and immediately going missing,” said Zeng. “I was too familiar with what was happening. There are enormous, stone-cold machines operating behind the information we can see, and I felt guilty for being part of that system.” Zeng thought of himself then as “politically depressed” — grieving both his part in a relentless machine and his long-gone belief in the internet as a utopia.

Three weeks later, he quit. Zeng returned to his hometown of Meizhou and spent the rest of the year teaching himself English and coding from materials he found online. In early summer, he learned he had been admitted to a Master of Science program at Northeastern University.

+++

When the idea of the internet was first proposed, in the 1960s, American computer scientist J.C.R. Licklider envisioned a globally interconnected set of computers through which everyone could quickly access data and programs from any site. By the end of the 20th century, the once radical notion was largely accepted as a prophecy bound to be fulfilled. High-profile technology, post-Cold War prosperity, and a tinge of looming anxiety converged. In 2000, then-U.S. president Bill Clinton hailed the arrival of a new era that would be characterized by the global spread of freedom. He dismissed the Chinese government’s nascent efforts to crack down on the internet: “That’s sort of like trying to nail Jell-O to the wall.”

The Chinese government thought it could do it, however, and it set about this seemingly impossible mission by building a wall — one that sealed off a semi-segregated cyberspace from the rest of the world; one that would eventually trap 1.4 billion people inside. The internet came to China in 1994, and the “Golden Shield Project” began very soon thereafter, in 1998. The outcome of the project would later be known as “the Great Firewall.” The Great Firewall, which loosely refers to a set of technologies and regulations put in place by Chinese officials, is a comprehensive filtering and surveillance system that not only blocks foreign websites from IP addresses within China but also tracks requests by computers inside China to visit foreign websites.

In 2010, then-U.S. secretary of state Hillary Clinton famously harked back to Winston Churchill to address the importance of internet freedom: “A new information curtain is descending across much of the world,” she said at the twentieth anniversary celebration of the fall of the Berlin Wall. “And beyond this partition, viral videos and blog posts are becoming the samizdat of our day.” Although she did not directly mention China, it was hard to escape the conclusion that this was who she meant: China had just recently pushed out first Facebook, then Google. But as time went on, the wall not only did not collapse like the Berlin Wall; it evolved into the most sophisticated information-control machine in the world, capable of countering a million separate brewing dissents.

The void left by Facebook, Google, and Twitter (also blocked, in 2009) was quickly filled by their Chinese counterparts: the social media giant Tencent, search powerhouse Baidu, e-commerce mogul Alibaba, and microblogging platform Weibo. Although these companies and their users are, unlike the banned Western companies, subject to Chinese law and willing to attempt to abide by it, their sheer scope makes them hard to monitor. As national cadre training material produced by the Chinese Communist Party states, the popularization of digital technology has transformed the structure of mass communication in China and requires a shift in propaganda strategy from “one to many” to “many to many.” That is to say, the influence of state-affiliated media is waning. The new, dynamic cyberspace requires many content reviewers to keep up with the many people posting potentially subversive things online.

In 2011, the State Internet Information Office was established to regulate internet content. Since Xi Jinping, China’s current leader, came to power in 2012, the office has been renamed the Cyberspace Administration of China, overseen by the Central Leading Group for Cybersecurity and Informatization, which Xi personally chairs. “Digital governance” has since become a recurring buzzword in Xi’s speeches, demonstrating his ambition to extend state control into cyberspace. According to Rogier Creemers, a China scholar at Leiden University, Xi’s reconfiguration of the online sphere has vastly reduced its autonomy and spontaneity.

In the platform-dominated internet age, an unshakable hold on speech is unattainable without a tight grip on the internet companies, the young but robust private players behind the nation’s most popular sites and apps. Starting in late 2020, Chinese regulators cracked down on the high-flying tech sector, halting Ant Group’s blockbuster IPO in Shanghai and Hong Kong; months later, antitrust regulators hit Alibaba with a record $2.8 billion fine. Ant, the fintech spin-off of Alibaba, was not alone under the iron fist. With the passage of the Data Security Law in 2021, the message Chinese regulators are sending is clear — no one but the Party is in charge.

As the platforms and the government join forces, a “networked authoritarian” state has arisen, a term coined by the scholar Rebecca MacKinnon. According to MacKinnon, “The single ruling party remains in control while a wide range of conversations about the country’s problems nonetheless occurs on websites and social-networking services.” In a networked authoritarian society, an individual citizen with access to the internet might feel a much greater sense of freedom — in a way that is not possible under classic authoritarianism — but her online activity is simultaneously monitored and manipulated. The reactive nature of most censorship efforts makes it possible for people to vent their grievances in the moment, but those voices rarely surface widely and almost never last.

However, even the most effective government would find it hard to enforce real-life consequences for every little violation of the law, given the sheer number of individuals online. Lu Wei, the former director of the State Internet Information Office, often denied allegations of mass censorship by citing numbers: “China has four million websites, with nearly 700 million Internet users, 1.2 billion mobile phone users, 600 million WeChat and Weibo users, and generates 30 billion pieces of information every day,” Lu said at a press conference in 2015. “It is not possible to apply censorship to this enormous amount of data. Thus censorship is not the correct word choice.” But then he added: “No censorship does not mean no management.”

Lu’s argument highlights the selective and labor-intensive nature of on-the-ground censorship work. In her book Censored: Distraction and Diversion Inside China’s Great Firewall, Margaret E. Roberts dispels the common perception that censorship in China is accomplished by an airtight ban on sensitive content. Instead, the intentionally ambiguous and somewhat relaxed enforcement, part of what Roberts calls “porous censorship,” not only allows public sentiment to bubble up so that the government can react accordingly, but also drives a wedge between savvy internet users with higher digital literacy and those who won’t jump through extra hoops to access political information.

Roberts, whose research draws on a vast quantity of empirical evidence including interviews and user data, describes three mechanisms of censorship in her book: fear, friction, and flooding. The fear mechanism is closest to what most people think of when they think of censorship — a mechanism powered by intimidation and potential consequences. Friction, on the other hand, is a kind of “tax” on information, making it more time-consuming or complicated for users to access the censored material. The last one, flooding, is “a coordinated production of information by the authorities with the intent of competing with or distracting from information the authorities don’t want consumers to have.” Social media platforms, now the de facto source of information, venue of discussion, and battlefield of opinion, see a combination of all three.

Roberts’s findings correspond with Zeng’s experience and point to a salient fact: it is the content moderation systems created by internet companies, rather than the police and other authorities, that are directly behind most of the censorship an average Chinese internet user encounters. By issuing bans and warnings on individual accounts, reviewers create friction for targeted users; by removing sensitive content and prioritizing state-approved material, the algorithms constantly flood users’ feeds. Such tactics have proven effective. According to Roberts, tiny increases in the cost of political information strongly decrease the probability that a citizen will consume it.

+++

Behind high-earning white-collar office workers like Zeng, sitting in landmark skyscrapers in Beijing, is an entire “base” of less educated, and less well-compensated, content reviewers. These are the people who actually consume much of the content, countering Chinese internet users’ increasingly subtle and creative online expression. If high-tech algorithms and swiftly shifting strategies are the glamorous side of censorship work, content reviewers are the ones getting their hands dirty.

In 2011, 22-year-old Eric Lipeng Liu graduated from Tianjin Agricultural University and landed a job as a “content reviewer” at Weibo, the Chinese version of Twitter. Liu considered himself a “sort of country bumpkin” at the time, not having any access to information about the larger world outside the Chinese internet. When he applied for the job, Liu imagined he’d be a forum facilitator, assisting people in online spaces.

An old industrial city right next to Beijing, Tianjin has been slowly transforming under the radiating influence of the rapidly expanding tech giants, and Liu could sense the genuine excitement among his peers about the changes in his hometown. Liu didn’t know that his hometown would later be dubbed “the censorship capital” by Western media and China’s internal critics, or that content reviewers like him would form the fastest-growing “blue-collar class” keeping Chinese social media running. According to the Chinese career website Boss Zhipin, the average monthly salary of a content reviewer is 5,000 RMB (approximately $630). In lower-tier cities like Nanjing and Wuhan, the base salary can be as low as 2,000 RMB.

Every morning, Liu and his coworkers took the company bus to Tianjin’s outskirts, where they sat in a two-story building and implemented the systems created by people like Zeng to censor the internet. “The area was so underdeveloped that even buying breakfast was hard,” said Liu. On the way to the nearest bus station, Liu would see the five-story pre-fabricated Khrushchyovkas from the Mao era standing side by side with a polo field built for the new moneyed classes. “The scenery felt surreal, and every new construction looked obtrusive,” Liu remembered. Sitting in his office with thousands of other reviewers, Liu issued warnings, deleted posts, and shut down the accounts of users who ran afoul of an ever-increasing list of banned topics and keywords.

To help the authorities win the battle of public opinion, Liu and his coworkers needed to “know the self, know the enemy,” as Sun Tzu put it in The Art of War. That entailed learning about the deepest insecurities of the current regime during their onboarding training. Liu and his coworkers closely studied significant episodes of civil unrest in recent history, like the 1989 Tiananmen Massacre and the July 2009 Ürümqi riots, so that they could spot even the subtlest nod to them in a post and shut it down. Another subject was recognizing the “you-know-whos” (usually key government officials and sometimes their family members) so that the leadership’s image wouldn’t be stained by caricature or parody. The thing that stuck out to Liu the most was the niche culture references. The keyword “Ministry of Truth,” the name of the propaganda department in George Orwell’s dystopian novel 1984, was Liu’s favorite on the list. “Imagine a kid posting an innocuous book review and finding out their post cannot go through,” Liu laughed; he still finds the preemptive word ban ironically comical.

Every day, newly updated censorship guidelines were issued to the reviewers. As in the military, following orders was job one. Liu was trained to evaluate within 10 to 15 seconds whether a post could circulate publicly. Failure to operate at the intended speed, or a misjudgment that led to unwanted public reactions, could get both him and the poster into trouble. The reviewing process opened Liu’s eyes to the amount of information the government was keeping from wide circulation: cruelty faced by human rights lawyers, casualties of major public accidents, prosecutions of journalists… It was a stressful job. Liu regularly came home exhausted. He wasn’t just being asked to evaluate political posts. Each day he also saw a large amount of pornography and violence. A lot of it was material he would rather not have looked at, especially in such quantities, but he had to look carefully, because mistakes in reviewing would affect his accuracy rate, and thus his performance evaluation and salary. While Liu and his coworkers evaluated netizens’ posts, the system they used was also closely evaluating them.

Most high-risk content was sent to multiple reviewers to be assessed separately, and if the work submitted by a single reviewer deviated too much from the final assessment, the incident was counted as a “misjudgment,” resulting in a drop in that reviewer’s accuracy rate. In the event of a major error, such as an approved post later deemed to have an “adverse effect,” the reviewers who approved it were held solely responsible and could face dismissal. As a result, the bar for censorship rose even higher in practice, with each reviewer fearful of coming up short. Each reviewer was thus pitted against thousands of others to perform exactly as instructed.
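The incentive structure Liu describes resembles a simple consensus-scoring loop, sketched below with invented reviewer IDs and a majority vote standing in for the “final assessment.” The real systems almost certainly weight senior reviewers and appeals differently; this is only a guess at the mechanism.

```python
# Illustrative sketch of the accuracy metric described above: each high-risk
# item is assessed by several reviewers, and anyone who deviates from the
# final verdict is charged with a "misjudgment." Hypothetical, not any
# platform's actual formula.
from collections import Counter

def final_verdict(votes):
    """votes: reviewer_id -> 'approve' or 'remove'. Majority vote stands in
    for the final assessment."""
    return Counter(votes.values()).most_common(1)[0][0]

def accuracy_rates(items):
    """items: a batch of vote dicts. Returns each reviewer's share of agreements."""
    agree, total = Counter(), Counter()
    for votes in items:
        verdict = final_verdict(votes)
        for reviewer, vote in votes.items():
            total[reviewer] += 1
            agree[reviewer] += (vote == verdict)
    return {r: agree[r] / total[r] for r in total}

batch = [
    {"r1": "remove", "r2": "remove", "r3": "approve"},  # r3 deviates: a "misjudgment"
    {"r1": "remove", "r2": "remove", "r3": "remove"},
]
print(accuracy_rates(batch))  # {'r1': 1.0, 'r2': 1.0, 'r3': 0.5}
```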

High pressure and a low sense of fulfillment define the career crisis that almost all employees engaged in content review experience. Content reviewers find themselves with little work-life balance. “Most of the content reviewers were young people who had little work experience and would sacrifice a lot for a decent internet company job title,” said Fan Yuan, a former content reviewer at a popular video platform. Fan, who recently quit her job in Wuhan to seek a local government post in her hometown, described her time as a content reviewer as physically draining and mentally taxing.

In the past two years, Chinese internet companies have established new content review centers in lower-tier cities including Jinan, Xi’an, Wuhan, Wuhu, Chengdu, and Chongqing. These cities have two things in common: relatively low labor costs and an ample supply of university graduates. Like call centers, censorship centers have sprung up all over China to provide outsourced content moderation for small and medium-size platforms that cannot afford in-house review teams. As demand surges, the level of education required of reviewers has been lowered. “This line of work usually requires at least a college degree,” said Fan, “but now some contractors would hire high school graduates, because they’re easier to manage and ask fewer questions.”

Fan left her post as a content reviewer after almost two years, when she was experiencing severe mood disorders and physical fatigue. “My shift would often stretch out to as long as 14 hours, and the experience of taking user-generated content down just because of a small violation of platform rules is just not pleasant at all,” said Fan. Sometimes, users would try to play “boundary ball,” an internet buzzword first popularized by journalists who hoped to create social critique content that skirted the line and tested the limits of the permissible. Fan felt especially conflicted when coming across persistent players, who would make a few edits to a video and upload it again and again to test where the line was. “Some users would spend a long time trying to argue their case, and I would feel so powerless, knowing my job was no less precarious than the content they were creating,” Fan said.

Fan is not alone in her feelings. Liu also felt morally conflicted when he shut down users who were consistently outspoken. “In our monthly meeting, superiors would require a ‘head count’ report of violations and a name list of consistently violating users. I developed a genuine admiration for people who would risk their safety speaking up about political issues, and would sometimes use my personal account to warn them about the ‘sensitive words,’” Liu said. Among the people he warned was the famous author Murong Xuecun, whose posts were flagged in Liu’s work queue. Liu’s secret acts of resistance also included keeping a personal copy of the “sensitive words” list and the daily public-discussion monitoring log passed between reviewers. In the days before he left his job in 2013, Liu stopped deleting anything at all as a silent protest.

+++

Zeng, Liu, and Fan, all Chinese youth who joined the workforce in the 2010s, belong to the first generation in China that was both digital native and market savvy. Their formative years coincided with a time when the popular sentiment was that China would become a much more democratic, open, and pluralistic place. For them, this was connected, at least in part, to increased access to the world through the internet.

Like Zeng, Hana Chan, a 28-year-old tech product manager, traces her career choice back to her early childhood experience with the internet. Chan first encountered the internet in 1999, when her upstairs neighbor’s new computer came with a DOS system. Chan would visit the neighbor often after he installed the Windows operating system, to play with its drawing program and built-in games. “Even the most mundane task, like sending emails, would keep me mesmerized,” said Chan. “I was so small, the computer desk so high.” The chassis would blast out warm breaths in the northern winter, which lingered around her computer screen. “I felt the screen linked me to a world of infinite possibilities,” Chan remembered.

The prevailing cyber-utopianism planted seeds in many Chinese millennials’ hearts. Zeng’s and Chan’s families were among the early owners of personal computers, but the expansion of internet use soon became unstoppable all over China. In the first half of 2008, China surpassed the United States in number of internet users. Commentators in the Chinese press reported the feat as a triumph worthy of celebration. Simultaneously, the development of new forms and outlets for expression on the internet — most prominently blogs and microblogs — triggered a revolution in Chinese public life. In the 2000s, blogs, microblogs, and bulletin boards became common venues not just for online discussions of public affairs but for offline mobilization. As Yang Guobin, a China scholar at the University of Pennsylvania, has noted, China’s online activism in the 2000s constituted a vibrant current of “unofficial democracy.”

Growing up under the prospect of China’s democratic transition, Chan started blogging early, first on the MySpace-like Qzone, then on Zhihu and Weibo. She documented her life, shared her musings, and weighed in on public affairs. Chan also remembered browsing Wikipedia and being shocked by the fact that anyone at all could contribute to the editing process. “The experience I had on the internet challenged the dogmatic thought that knowledge can only come from an authority,” said Chan.

But as Xi’s China heightened censorship, Zeng and Chan found it difficult to keep up the internet optimism they grew up with. The sentiment is echoed by millions of Chinese youth, as “political depression” (zheng zhi yi yu) became a popular buzzword among liberal-leaning internet users who felt deprived of the future they were promised. Chan felt less motivated to post on Weibo as censorship tightened, not to mention the growing number of trolls who would unconditionally defend conservative nationalistic values. For Zeng, the depression took on another layer of guilt — “I felt that I could no longer believe in what I did.”

“The popular term ‘political depression’ is not merely a pathological notion, but rather designates a collective phenomenon,” said Gao Zhipeng, a cultural psychologist at the American University of Paris. “Popular discourse and literature on depression tend to see it as inherently negative, while a collective feeling of despair and discontent might give rise to active resistance.” Starting in 2020, the mass reposting of viral citizen journalism became a phenomenon on Weibo and WeChat. “The power of those viral works derives from their simplicity,” said Yang, the University of Pennsylvania scholar. “It is an online relay in which individuals make their own remixes of the media to battle censorship.” Creative internet users would rewrite an article backwards, in emojis, or in obscure codes, just to extend the lifespan of the piece of information by a tiny bit before it got taken down.

+++

In 2016, Liu handed over the classified censorship materials he had collected to the New York-based Committee to Protect Journalists (CPJ). After realizing he might be vulnerable to state retaliation, Liu moved to the United States. He finally felt a long-lost sense of relief as the information went out. He changed his Twitter handle to “Eric Liu unchained,” as if he were Django, the Quentin Tarantino protagonist who rode away from the explosion of his master’s mansion.

But would they really be able to tear down the master’s house with the master’s tools in hand? What Liu knew was that he could no longer accept “my hands are tied” as an answer. He currently works as a news editor at a digital publication dedicated to documenting information the Communist Party tries to censor. Zeng has set out to attack the problem from a technological angle: finding a decentralized internet technology that makes censorship nearly impossible. He believes blockchain, his new passion, has a lot of unrealized potential.

Zeng is still a pessimist when it comes to whether individuals’ personal workarounds can produce any systemic change. After all, “it was hard to unplug yourself when you were born in the Matrix,” said Zeng. But a pessimist might also just be an optimist in retreat. Coming forward on the record about his experience is, for Zeng, a recent leap inspired by individuals like Nobel Peace Prize laureate Liu Xiaobo, who risked their own safety to speak their minds freely. He has also been inspired by Liu Xiaobo’s insistence that those who fight the regime do so with love in their hearts, rather than hostility.

Zeng’s most recent project is built around the concept of love: he is creating a marriage registration service on Ethereum, a decentralized, open-source blockchain. “I don’t think an authoritarian government is entitled to pronounce a marriage,” Zeng said. “Only the people can.” Reimagining a world free of censorship and oppression is the most challenging task he has taken on, and he believes that world can only be built on the internet, if anywhere.
