Transparency, Media Literacy, and Addressing Social Divisions:

Responding Appropriately to Online Falsehoods and Their Implications

Written Representation to the Parliamentary Select Committee on

Deliberate Online Falsehoods

Parliament of the Republic of Singapore

The Clerk of Parliament, Parliament House, 1 Parliament Place, Singapore 178880

Transparency, Media Literacy, and Addressing Social Divisions:

Responding Appropriately to Online Falsehoods and Their Implications

February 28, 2018

Chong Ja Ian1

Chong Ja Ian is a teacher at an institution of higher learning in Singapore. His work covers security, foreign policy, external intervention, state building, and nationalism. The author has no financial interest in the matter. The author’s only interest in providing a submission is that of a Singapore citizen who wishes to participate in the political process as legally permitted under Singapore law, for the general betterment of society. The submission draws on the author’s professional expertise and experience. The author is happy to provide evidence before the Parliamentary Select Committee on Deliberate Online Falsehoods if called upon and time permitting.

Disclaimer: All comments and observations made here are solely the author’s own. They do not represent or reflect the views of the author’s employer or any other organisation or person with which the author may have an affiliation. The author is solely responsible for this document. The author welcomes the open publication and public scrutiny of this submission, but requests inclusion of the disclaimer that the document reflects only the author’s personal opinions in any complete or partial reproduction of this submission.

Summary

Falsehoods in the forms of misinformation, disinformation, and foreign attempts to affect the internal affairs of another state parallel the history of human society. Recent advances in information technologies may have extended the reach of false and inaccurate information, but do not change their fundamental nature. Attempts to address online falsehoods need to be mindful of the fundamentals that have allowed successful responses to misinformation and disinformation in the past — namely resolving social tensions, transparency, and public awareness. Singapore should focus on developing these three qualities if it wants to have a serious capability to deal with online falsehoods. Singapore already has existing laws and capabilities to deal swiftly and effectively with the spread and symptoms of online falsehoods. However, technical, administrative, and legislative remedies can only augment but not replace the roles of bridging social cleavages, transparency, and public awareness. Rather than simply worry about not doing enough, Singapore should also fear the negative consequences of overreacting to the potential dangers of online falsehoods.

I. Introduction: A Familiar Problem

1. Disinformation and misinformation are longstanding issues facing any society, including Singapore. Online disinformation and misinformation are an old problem played out through a new medium. The Singapore state already possesses well-developed and robust legal as well as law enforcement tools to address the most pernicious forms of disinformation and misinformation, including those that may occur online. Efforts to address deliberate online falsehoods should focus on technical matters relating to managing and reducing the volume of disinformation rather than policing content; extant laws and policies already oversee content with sufficient rigour.

2. Disinformation and misinformation are especially potent given the presence of serious social cleavages. Studies strongly suggest that disinformation and misinformation tend to work on confirmation biases rather than to introduce new ideas. Attempts to address disinformation and misinformation that seek to overtly control access to information or are overly punitive can fuel conspiracy theories and deepen distrust, resulting in a more brittle society that is more vulnerable to hostile manipulation. Efforts to remedy the effects of disinformation and misinformation should consciously avoid exacerbating existing social cleavages and suspicions already present in society. Handling the phenomena of disinformation and misinformation in a heavy-handed manner can be counter-productive and play into the hands of those who wish to divide a targeted society.

3. The most effective long-term solutions to disinformation and misinformation are public education, enhanced transparency, and access to independent fact-checking, which address causes rather than simply symptoms. People empowered with the ability to think critically and access to reliable information they trust are the first line of defence against disinformation and misinformation. They are able to provide immediate and organic, ground-up responses to disinformation and misinformation and are capable of adapting to changing conditions. State reactions, while possibly more comprehensive, are likely to be slower, more cumbersome, and disruptive. A citizenry that is actively able to assess information critically is more amenable to Singapore’s need for an open society where the availability of information is central to everyday activities ranging from business and finance to education, transportation, and leisure.

II. Online Falsehoods: (Somewhat) Old Wine in New Bottles

4. Disinformation and misinformation are two distinct, but related phenomena that are arguably as old as organised society.2 The former is false information that is intentionally created and/or spread to “influence public opinion or obscure the truth,” while the latter refers to false or inaccurate information that is spread mistakenly or inadvertently.3 Disinformation seems much closer to the definition of falsehoods as laid out in the Green Paper released by the Ministry of Communications and Information and Ministry of Law.4 Neither can be fully eradicated so long as humans possess the need to communicate information with each other and such communication remains susceptible to distortion. The Internet and social media increased the speed, scope, and scale of disinformation and misinformation, but neither created such phenomena nor, according to current studies, made them more serious.

a. Misinformation and Disinformation

5. False and inaccurate information is a naturally occurring part of the social environment, even without the Internet and social media. In late 1967, Singapore was struck by an apparent epidemic of men experiencing “koro”, a condition supposedly resulting in the sudden retraction of the penis followed by death.5 Physicians now recognise that the condition is psychological rather than physical, leading to bouts of panic rather than any actual or fatal physiological changes.6 Nonetheless, an increasing number of people visited the hospital for physical symptoms as news about the affliction spread, prompted in part by initial state efforts to inform the public about the harmlessness of consuming vaccinated pork, peaking at 97 cases a day.7 Even though the incident was brought under control through an expanded public education and media campaign, it demonstrates that mass misinformation is a phenomenon that can develop in the absence of the Internet and social media.

6. Misinformation was also central to cases of mass unrest in the pre-Internet age. In a well-documented case from mid-eighteenth century Qing China, widespread claims about sorcerers stealing souls led to mass hysteria and subsequent waves of violent state crackdowns that aimed ostensibly to quell the unrest.8 Rumours of Christian missionaries consuming orphaned babies and foetuses similarly sparked and fuelled the mass panic that led to the 1899–1901 Boxer Episode in late Qing China.9 These events took place as inaccurate and false information spread by word of mouth and the public lacked credible sources against which they could satisfactorily verify claims they came across. More than the Internet or social media, a public unable to critically assess information and independently fact-check played a much larger role in perpetuating and exaggerating falsehoods to the degree that they ignited existing social tensions.

7. Disinformation too is an age-old, pre-Internet phenomenon that can be perpetuated by various actors for a myriad of reasons. For example, the military regime that came to power in Indonesia after September 30, 1965, and their supporters spread exaggerated information through the media and other state mechanisms about the brutality of the Partai Komunis Indonesia (PKI).10 In the words of one scholar, this was “a deliberate campaign to promote a climate of fear and retribution” with the “urgency of ‘kill or be killed’” intended to encourage the population to target suspected PKI members and sympathisers whom the military regime opposed.11 Private scores and other differences unrelated to the PKI were also reportedly settled under the cover of eliminating the PKI, given the social climate created at the time. Disinformation from the Indonesian military regime at the time resulted in the mass killing of an estimated 200,000 to 500,000 people associated with the PKI, however loosely.12

8. Other cases of disinformation that have little to do with the Internet and social media also exist. During the 2004 U.S. presidential election campaign, conservative groups organised advertising and news reports from a group claiming to be Vietnam War veterans who had served with John Kerry, the Democratic Party’s presidential candidate, in order to discredit him.13 Corporations too can engage in disinformation. U.S. federal courts found that the tobacco industry had falsely denied, distorted, and minimised the harmful effects of smoking over decades to attract consumers, including young persons.14 Governments at times engage in disinformation toward other governments to discredit them and foster negative sentiment, an example being a Soviet campaign in the 1980s to create the erroneous impression that AIDS was a biological weapon developed by the U.S. military.15

b. External Intervention

9. Efforts by external actors to involve themselves in the domestic affairs of another state or society over issues that touch on their interests are a common and unsurprising feature of politics-as-usual, and have little to do with the Internet or social media. Much of such activity is generally harmless, particularly if transparent and well-regulated. Despite some restrictions, the Singapore state does permit foreign entities to participate in shaping policies.16 Chambers of commerce representing foreign commercial interests, the heads of overseas corporations, and other entities lobby the Singapore government from time to time on issues ranging from the regulation of foreign labour to emissions levies for electric vehicles.17 At the behest of the British government and to facilitate extradition, Singapore recently agreed to refrain from imposing judicial caning on a robbery suspect if he is found guilty.18 To facilitate ratification of the Singapore-U.S. Free Trade Agreement, Singapore loosened its ban on the sale of chewing gum in 2003 as a concession to American chewing gum manufacturers and their allies.19

10. The Singapore state too engages in efforts to change domestic policies in other jurisdictions. One prominent example is the Singapore government’s efforts to get local-level governments in Suzhou and Jiangsu to prioritise the Suzhou Industrial Park over the Suzhou New District during the 1990s.20 Newer government-to-government projects around Tianjin and Chongqing in China involve efforts to shape local government policies as well.21 State agencies are not the only Singaporean entities that use legal means to inform decisions by foreign governments. The Singapore Government as well as Singaporean and Singapore-based businesses work with the U.S.-ASEAN Business Council to develop commercial and other opportunities in the United States, for example.22

11. More pernicious and disruptive means to influence the domestic affairs of other states exist, but such actions are rarer given the common understanding among states — especially well-functioning states — that refraining from interference is mutually beneficial. States sometimes intervene inside other states to support various actors as a means of strategic competition.23 Various colonial and Cold War projects demonstrated such characteristics, and had the potential to be highly destructive. Foreign states may also seek to exert influence over public opinion or policy by finding ways to exert domestic pressure on political, business, and opinion leaders, as seen with recent allegations of influence operations by the People’s Republic of China (PRC) in Australia and New Zealand.24 That said, outside attempts to shape domestic developments in states tend to be most effective in weak states where fragile institutions of governance limit the ability of the targeted polity to respond adequately.

12. Operations to shape a policy-making space and consequently policy decisions have a long history, and are traditionally known as “active measures.”25 Active measures and other types of influence campaigns are independent of the Internet and social media; the latter only provide another platform for such activities to take place. Indeed, the examples of external influence cited in the Government Green Paper on Deliberate Online Falsehoods include two instances of alleged foreign influence relating to print publications from the 1970s, well before the advent of the Internet — and observers dispute the official account in one of these cases.26 Efforts to address active measures and influence campaigns that focus on the Internet, social media, and other online sources merely target a narrow set of transmission mechanisms rather than the conditions that enable such operations to work.

13. Common to the most damaging attempts at disinformation and foreign interference is that the actors pursuing such action must have significant resources at hand. Effective disinformation and external interference require substantial purpose, direction, consistency, and persistence to rise above the various other sources of influence societies, governments, and individuals face on a regular basis. Such operations are expensive options largely available to actors with deep pockets and significant organisational capability.27 Otherwise, efforts at messaging or influence may get lost in the high level of noise that characterises much of daily life, particularly in situations where there are multiple information sources available. This implies that the actors of greatest concern when it comes to disinformation and foreign interference are states, state agencies, and large corporate entities — a fact borne out by the examples listed above. Start-up attempts at creating misinformation for profit, such as Macedonian teenagers operating troll farms during the 2016 U.S. presidential elections, tend to be haphazard and rely heavily on existing social cleavages, biases, and conspiracies.28

14. Importantly, recent discussions about foreign online misinformation, influence operations, propaganda campaigns, and active measures centre on alleged Russian actions toward the United States and various European states. Most examples of alleged online disinformation cited in the Green Paper on Deliberate Online Falsehoods have to do with supposed Russian operations toward the United States, United Kingdom, France, Germany, Italy, and Sweden.29 The treatment of those cases was also lengthier than that of the sole non-European, non-North American case — that of Indonesia, which featured domestically-driven rumours, falsehoods, and hoaxes instead.30 Online disinformation by a foreign entity intending to shape domestic electoral processes seems to centre on the unique capabilities and strategic circumstances surrounding the Russian state. Given that Singapore does not feature in strategic competition with Russia in Europe, it may well be the case that deliberate online falsehoods are not an imminent threat for Singapore. Scarce national security resources can be better devoted to addressing clear and present dangers like terrorism.

c. Causes for Disruption

15. Fundamental to the disruptiveness and scale of misinformation and disinformation appears to be the presence of deep, existing social cleavages that exacerbate the effects of confirmation biases. Distrust exacerbates the tendency of groups and individuals to uncritically play up information that supports positions they already hold while discounting disconfirming information, regardless of its objective value.31 Groups and individuals that already hold particular positions then tend to echo and amplify false and inaccurate information as well as conspiracies with which they already happen to agree. This is a phenomenon that the Government Green Paper on Deliberate Online Falsehoods recognises.32 Studies suggest that those with conservative views are more susceptible to false information and tend to hold on to such views more strongly, possibly due to higher sensitivity towards danger as well as greater skepticism toward change, ambiguity, and difference.33

16. Insufficient transparency and the inability to independently verify information compound the challenges surrounding misinformation and disinformation. People may be more easily misled by untrue or inaccurate claims if they are unable to ascertain the veracity of information they come across or are distrustful of the sources that provide differing perspectives.34 An absence of multiple, independent authoritative sources for fact-checking can make people more open to inaccurate or false claims when facing large amounts of confirming or mutually reinforcing claims. Over-reliance on official channels of information too can be problematic. The public — and Singapore — may be left more vulnerable if there is spoofing, mismanagement, or loss of control, even temporary, of official information outlets and the public has no alternative sources of information it can trust.35 Negative past experience with false, inaccurate, or imprecise information from official channels, whatever the reason, can also increase skepticism about the quality of information from official channels.

d. Online Effects

17. Where the Internet and social media seem to make a difference to misinformation, disinformation, and foreign intervention efforts is in the speed, volume, scope, and directness of communication.36 An online medium accelerates the time it takes to convey information across distance and expands the number of people that come into contact with any given information in a shorter amount of time. Information technology enables the dissemination of large amounts of information that cover a wider array of topics within a given time. The Internet also allows users to access information without the inhibitions of dictated broadcast schedules, while social media enhances the ability of users to filter out information they do not need or do not wish to encounter. The spread of mobile technology that provides potentially uninterrupted access to the Internet multiplies the effects above.

18. Disinformation, including when it involves foreign interference, exploits these qualities of the Internet and social media, which happen to create another permissive environment for misinformation to spread. The penetration of the Internet and social media into the daily lives of people, especially in the developed world, makes online misinformation and disinformation appear to be a new type of complication.37 The dependence of people on online sources of information and the acceleration of the news cycle can make dealing with misinformation and disinformation, whatever the source or motivation, seem challenging. The nature of misinformation, disinformation, and even attempts at foreign intervention discussed above remains the same, however. Efforts to address misinformation and disinformation should, therefore, focus on causes rather than simply symptoms or vectors, or such phenomena will simply find other means of manifesting themselves.

19. Nonetheless, the political effects of online misinformation and disinformation appear ambiguous and, at most, marginal. Discussion about the effects of false and inaccurate information online became widespread with the 2016 U.S. presidential election and U.S. President Donald Trump’s tendency to call unfavourable news “fake.” Much of the attention centres on the volume of false and inaccurate information, including content paid for by actors hoping to influence policy. The Green Paper on Deliberate Online Falsehoods reflects this tendency to highlight the number of posts, tweets, and fake accounts, as seen in the case studies it includes — which take up about half the document.38 However, the volume of posts, tweets, and fake accounts does not automatically or neatly translate into discernible political outcomes, including those intended by actors that seek to purposefully engage in disinformation, propaganda, or active measures.

20. Recent studies indicate that online misinformation and disinformation tend to affect voting behaviour and outcomes only at the margins, if at all.39 Even the Green Paper on Deliberate Online Falsehoods acknowledges that “there are differing views on whether the outcomes [of elections and referenda] were indeed affected” by such action.40 Moreover, efforts to increase transparency and public education campaigns successfully limited the effects of alleged disinformation campaigns targeting the 2017 French presidential elections and German federal elections.41 These results suggest that online misinformation and disinformation are likely to be most important when issues are highly divisive and electoral competitions very close as a result. Such conditions tend to be present when there are already deep social cleavages, and addressing online misinformation and disinformation directly may misallocate resources toward the symptoms rather than the causes of such challenges.

III. A Deft Touch: Subtlety, Precision, and the Identification of Appropriate Responses

21. Misinformation, disinformation, and external influence operations using online platforms and social media can be managed well even if they cannot realistically be eradicated. Responses have to be careful, precise, and target the conditions which enable such behaviour to take root and become disruptive, even potentially damaging. Any cure cannot be worse than the disease and must be mindful of second- and third-order effects. Otherwise, responses risk inadvertently deepening distrust within and division of society, rendering Singapore even more susceptible to misinformation, disinformation, and the machinations of outside actors. Such outcomes play into the hands of those who wish Singapore ill and can be to the detriment of the country and its residents.

a. Avoiding the Risks of Overreaction

22. Heavy-handed measures focusing on censorship and punitive actions with the intention to deter can have the unintended consequence of making society more vulnerable to misinformation and disinformation. State prevention of discussion on disputed topics can create and reinforce impressions of state partiality or the use of state mechanisms for partisan or parochial interests. Such perspectives encourage greater cynicism toward and distrust of state institutions, and can become bases for conspiracy theories. Unless the state is able to police every private exchange, communication, or thought, such ideas will be difficult to dislodge. Once in place, they can create social spaces where misinformation and disinformation can fester — which entities seeking to sow confusion or influence Singapore’s domestic politics can exploit.

23. Overzealously using censorship and threats of heavy punishment to address misinformation and disinformation can lead to the inadvertent suppression of important information that needs to be shared publicly. Uncertainty and fear of punishment for propagating misinformation and disinformation can lead individuals to avoid sharing information that can otherwise benefit society, creating circumstances that allow false and inaccurate information to thrive unchecked. For instance, people coming across a report about a traffic accident, crime, or missing person that asks for witnesses may refuse to share the information out of fear of being punished for spreading falsehoods, impeding resolution of the situation. If schools and parents had not shared information about what appeared to be kidnap attempts near international schools in Singapore, there may not have been an investigation that cleared up the incidents as a driver trying to be helpful.42 Had there been real kidnap attempts, an unwillingness to share information about suspicious attempts to give rides to students might have endangered more students.

24. Mechanisms for reporting and punishing apparent instances of misinformation and disinformation can become a means for groups and individuals to attack those whose views differ, creating an environment of mutual recrimination and divisiveness in society. If groups and individuals believe that they can get the state to remove views they do not like, they are likely to do so. This is likely to invite reprisals from the targeted group or persons followed by counter-reprisals, which can easily spiral into an atmosphere of suspicion and distrust in society that can cleave along the lines of values, ideology, partisanship, ethnicity, and religion. Such a situation can prompt people to retreat into filter bubbles and echo chambers inhabited by others who share similar views rather than to engage more widely, leaving Singapore more divided and open to social tensions with potentially dangerous results.43 State resources will be stretched at the same time, as official agencies engage in investigations and try to be even-handed or risk having their credibility damaged by perceptions of arbitrary or biased enforcement.

25. Recent studies about online misinformation and disinformation further suggest that direct, head-on attempts to debunk inaccuracies can often be ineffective.44 Such action can even reinforce existing perceptions that an actor or even the state is trying to cover up or distract from some more important issue. Direct refutation can have the opposite effect of entrenching false or inaccurate information in ways that perpetuate misinformation and disinformation. A tragic and unfortunate example is the belief by hardline gun ownership advocates in the United States that mass school shootings are really staged incidents involving actors, paid for by anti-gun lobbyists and politicians who want stricter gun control.45 Direct appeals and rebuttals by survivors of the shootings, their families, and officials against such views only appear to strengthen the convictions of those who already believe in such conspiracies. Singapore can ill-afford to adopt policies that risk creating such disaffection in its diverse and highly pluralistic resident population.

26. Domestic legislation will be ineffective in addressing disinformation and active measures directed at Singapore from abroad. In the event states or other entities overseas engage in online disinformation or influence campaigns in Singapore, there is little that domestic legislation can do to stop such action or punish perpetrators. Singapore by and large does not enjoy extra-territorial rights in other jurisdictions, and seeking the extradition, seizure, and prosecution of foreign leaders for disinformation or active measures under Singapore law is unrealistic and impractical. Penalising internet service providers or platforms for the spread of disinformation will not effectively address the problem, since the actors responsible can simply shift platforms and use servers located outside Singapore, beyond the reach of Singapore’s laws. A demonstrated inability of Singapore law and law enforcement to respond to disinformation and active measures from abroad can even erode confidence in the effectiveness of law enforcement agencies and the legal system in Singapore.

b. Singapore Already Possesses Effective Legal Instruments

27. Singapore is fortunate in already having a comprehensive and robust set of laws and law enforcement tools that can easily be turned to deal with online misinformation and disinformation. This body of laws and instruments makes Singapore different from other jurisdictions where there is concern about online misinformation and disinformation affecting electoral outcomes, social stability, and confidence in state institutions. Moreover, the Singapore state has a track record of acting quickly on issues relating to online misinformation and disinformation. In this respect, Singapore does not need any additional legislation to manage online misinformation and disinformation, unlike other jurisdictions. For Singapore, it is merely a matter of applying existing legislation and capabilities to online misinformation and disinformation, something well within the remit of the executive and judicial branches of the Singapore state.

28. Current Singapore laws are well-positioned to address concerns about interference in elections and influence over electoral outcomes through online misinformation and disinformation, which the Green Paper on Deliberate Online Falsehoods identifies as highly important. The Parliamentary Elections Act and the Presidential Elections Act already prohibit undue influence, defined in both laws as:

Every person who —
(a) directly or indirectly, by himself or by any other person on his behalf, makes use of or threatens to make use of any force, violence or restraint, or inflicts or threatens to inflict, by himself or by any other person, any temporal or spiritual injury, damage, harm or loss upon or against any person in order to induce or compel that person to vote or refrain from voting, or on account of that person having voted or refrained from voting at any election; or
(b) by abduction, duress or any fraudulent device or contrivance, impedes or prevents the free exercise of the franchise of any elector or voter, or thereby compels, induces or prevails upon any elector or voter either to vote or refrain from voting at any election.46

29. Undue influence during parliamentary and presidential elections in Singapore already carries punishment of a fine of up to $5,000, a prison term of up to three years, or both.47 Restrictions on last-minute election advertising in presidential and parliamentary elections are clearly spelt out, with violators liable for a fine of up to $1,000, a prison term of up to a year, or both.48 Making false statements about the character of a candidate or about the withdrawal of a candidate is also subject to a fine and imprisonment of up to a year under the Presidential Elections Act and Parliamentary Elections Act.49 Singapore’s defamation law is another tool that can be used against those who make false claims about the character of a candidate, and it has seen some use during elections in the past.50 The Protection from Harassment Act further protects persons from online actions amounting to harassment, including during elections, just as the Penal Code guards against intimidation, insults, and annoyance regardless of the medium.51

30. Singapore laws likewise already criminalise online speech that threatens inter-ethnic and sectarian relations as well as confidence in state institutions. The Penal Code makes it an offence to “wound the racial or religious feeling of any person.”52 The Sedition Act makes it an offence to do any act with a seditious tendency, namely a tendency:

(a) to bring into hatred or contempt or to excite disaffection against the Government;
(b) to excite the citizens of Singapore or the residents in Singapore to attempt to procure in Singapore, the alteration, otherwise than by lawful means, of any matter as by law established;
(c) to bring into hatred or contempt or to excite disaffection against the administration of justice in Singapore;
(d) to raise discontent or disaffection amongst the citizens of Singapore or the residents in Singapore;
(e) to promote feelings of ill-will and hostility between different races or classes of the population of Singapore.53

The Protection from Harassment Act also covers racially or religiously driven harassment of a person.54 Moreover, the Administration of Justice (Protection) Act criminalises speech, including online speech, that undermines public confidence in the administration of justice.55

31. Singapore has an added legal safeguard against speech that affects inter-religious and sectarian relations in the form of the Maintenance of Religious Harmony Act. Under the act, the state can restrain, for up to two years, a person from making public addresses orally and in writing, publishing or distributing material, and holding a position on an editorial board or committee of a publication, on suspicion of:

(a) causing feelings of enmity, hatred, ill-will or hostility between different religious groups;
(b) carrying out activities to promote a political cause, or a cause of any political party while, or under the guise of, propagating or practising any religious belief;
(c) carrying out subversive activities under the guise of propagating or practising any religious belief; or
(d) exciting disaffection against the President or the Government, while, or under the guise of, propagating or practising any religious belief.56

A minister has the authority to extend this restraining order for up to two years on behalf of the state.57 Such restraining orders are further subject to formal oversight by the Presidential Council on Religious Harmony and the President of the Republic of Singapore.58

32. Existing legislation also includes provisions against the deliberate spreading of falsehoods that are applicable to online speech and disinformation. Under the Telecommunications Act, knowingly sending false or fabricated messages and the fraudulent retention of a message carry a fine of up to $10,000 and/or a prison term of up to three years.59 If false or fabricated messages relate to an explosive device, the maximum prison term increases to seven years and the maximum fine rises to $50,000.60 Should the sending of false or fabricated messages or the fraudulent retention of messages continue after conviction, the offender is liable for a further fine of up to $1,000 for each day the offence continues.61 The Computer Misuse and Cybersecurity Act further criminalises the access, use, or restriction of computers that intends to cause or causes:

(c) a disruption of, or a serious diminution of public confidence in, the provision of any essential service within the meaning of section 15A(12) in Singapore;
(d) a disruption of, or a serious diminution of public confidence in, the performance of any duty or function of, or the exercise of any power by, the Government, an Organ of State, a statutory board, or a part of the Government, an Organ of State or a statutory board; or
(e) damage to the national security, defence or foreign relations of Singapore.62

Unauthorised access, modification, use and interception, and obstructing the use of computer material carry a maximum $50,000 fine and/or a seven-year jail term.63 Access of computer material with the intent to commit or facilitate an offence sees the maximum prison term increase to ten years.64

33. Present laws and regulations in Singapore give state agencies the legal mandate to take down sources of online misinformation and disinformation. The Administration of Justice (Protection) Act covers audio, visual, and online material and has provisions for the Attorney-General, with leave of the High Court, to order the cessation of publications that pose a prima facie risk of contempt of court.65 Failure to comply can result in a penalty of a maximum $20,000 fine and/or a maximum one-year jail term.66 The Films Act gives a minister the power to prohibit any film and law enforcement officers the legal mandate to seize unlawful films, which can include online videos, and arrest individuals who possess such material.67 Additionally, the Broadcasting Act and the Internet Code of Practice issued by the Infocomm Media Development Authority (IMDA) permit the IMDA to order the removal of any material that is “objectionable on the grounds of public interest, public morality, public order, public security, national harmony, or is otherwise prohibited by applicable Singapore laws.”68

34. A range of other sections in Singapore’s Penal Code further criminalise and attach significant penalties to acts that can pertain to online misinformation and disinformation. They include abetment under Chapter V of the Penal Code and criminal conspiracy under Chapter VA of the Penal Code.69 Sections 121A to 121D and 123 to 126 of the Penal Code, which come under offences against the state, extend particular protections to key state institutions in Singapore from action that can result from disinformation conducted online or through other media.70 These laws extend particular protection to the President, Government, and Members of Parliament, while also protecting Singapore’s relations with other countries against attempts to wage war and depredation. Additionally, Sections 267C, 350, and 505 of the Penal Code address incitement and mischief, including from online sources.71

35. State agencies in Singapore have a strong, demonstrated track record of responding rapidly and effectively to online content they deem unacceptable, minimising any worries about timely reactions to online misinformation and disinformation. The legal system dealt swiftly with online allegations by Mr. Roy Ngerng in 2014 about Prime Minister Lee Hsien Loong’s management of Central Provident Fund monies.72 Likewise, law enforcement acted quickly once police reports were made about Mr. Roy Ngerng, Ms. Teo Soh Lung, Mr. Jason Chua, The Independent Singapore and others posting online material relating to the 2016 Bukit Batok by-election on “Cooling Off Day.”73 The Attorney-General’s Chambers moved similarly swiftly in response to Dr. Li Shengwu’s 2017 personal Facebook post that allegedly displayed contempt of court.74 The single example of an online falsehood in Singapore, the case of the now defunct website The Real Singapore posting a fabricated report on the 2015 Thaipusam procession, also saw the legal system act quickly against the individuals responsible.75

36. Other cases of online speech touching on legal restrictions saw quick responses by law enforcement agencies and the Attorney-General’s Chambers as well. Mr. Amos Yee was arrested by the police and charged by the Attorney-General’s Chambers with “intending to wound the religious feelings of Christians” four days after posting a YouTube video online in which he criticised Christianity.76 The 2015 video also included stick figure depictions of former Prime Minister Mr. Lee Kuan Yew and former British Prime Minister Margaret Thatcher engaging in a sexual act, which many people found offensive.77 Mr. Yee was later convicted of “uttering of words with the deliberate intent to wound the religious or racial feelings,” an offence under Section 298 of the Penal Code.78

37. Moreover, the Attorney-General’s Chambers has a good track record of achieving quick compliance with its take-down orders. Independent news site The Online Citizen received take-down notices from the Attorney-General’s Chambers on two occasions in 2015. One was for a letter from Mr. Amos Yee’s lawyer alleging ill-treatment of Mr. Yee while in detention.79 The other take-down order was for a piece alleging the use of Singapore Savings Bonds for infrastructure construction, which the Attorney-General’s Chambers found “patently false.”80 The Online Citizen complied shortly after receiving notice from the Attorney-General’s Chambers in both instances.

c. Recommendation I: Prioritising Inoculation

38. To address the challenges posed by misinformation, disinformation, and foreign influence whether from online or other sources, Singapore should concentrate on bridging social divisions while expanding media literacy. Such efforts address the fundamental conditions that allow misinformation and disinformation to take root in a society. Experts identify addressing social tensions and public education about information as foundational to any serious response to misinformation and disinformation even if they are long-term objectives.81 Such approaches remove or at least mitigate sources of grievance that render a society more prone to misinformation and disinformation. A habit of verifying information against multiple independent sources and engaging broadly with different perspectives helps people avoid being locked into filter bubbles and echo chambers, while enabling them to better manage conflicting or ambiguous information.82

39. Squarely facing the problems of social divisions and their causes as well as media literacy helps to inoculate the population against misinformation and disinformation regardless of origin. Such approaches to managing misinformation and disinformation share the same fundamental principles as Singapore’s longstanding efforts to deal with inter-ethnic and inter-religious differences.83 That Singapore is facing increasingly serious class divisions, due in part to the effects of globalisation, makes addressing social cleavages and improving public education and awareness about evaluating information all the more crucial.84 The population can then be in a stronger position to regulate misinformation and disinformation automatically over time.85 A large network of people working actively offers a more efficient and cost-effective manner of dispelling misinformation and disinformation than depending on state monitoring and waiting for a centralised state response.

40. Addressing social cleavages and media literacy emphasises preemptive upstream remedies over reactive downstream responses. Steps to bridge social divisions have to include measures that address discrimination, equality of opportunity, and redistribution frankly so as to reduce both the reality and the perception of unfair treatment. Sidestepping tricky social issues can fuel disaffection and suspicion, ultimately to the detriment of the country and society at large. Enhanced media literacy should familiarise adults and children with critically evaluating data, cross-referencing information against multiple sources, as well as becoming comfortable in dealing with disputed and ambiguous accounts. Doing so likely means long-term investment in a public education campaign as well as the active incorporation of these skills in schools.86

41. Another area of focus is to enable more effective independent fact-checking. Greater transparency and better tools for independent fact-checking can help dispel false and inaccurate information as people are able to more easily and quickly ascertain the veracity of information they come across.87 To the extent possible, enhancing transparency should apply to official as well as corporate information. Providing platforms to share information about policy-making and legislative work could prove especially helpful. Examples of such services include, but are not limited to, an online repository of unclassified and de-classified official documents as well as live video streaming and public video archives of parliamentary sittings and committee meetings.

42. Effective and timely fact-checking is a critical corollary to ensuring higher levels of transparency. Fact-checking should be comprehensive, fair, non-partisan, transparent with sources, open about funding and organisation, clear in terms of providing replicable methodology, and forthright about corrections.88 There should be several independent platforms for fact-checking that can include state agencies like an official ombudsman alongside several non-profit entities, which then allows for cross-verification of information. The scope for fact-checking should include images, videos, and audio clips on top of text. Verified facts should be jointly presented by individuals across political, partisan, ethnic, and religious spectra in the effort to consciously reach out to different segments of the population.89 Such steps help break down filter bubbles and echo chambers, ultimately allowing for greater public confidence in fact-checking efforts.
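To make the cross-verification idea concrete, the fragment below is a minimal, purely illustrative Python sketch that aggregates verdicts on a single claim from several independent fact-checkers and treats the claim as settled only when a clear majority of them agree. The function name, the source names, the verdict labels, and the thresholds are hypothetical assumptions chosen for illustration, not a description of any existing fact-checking service or a recommended standard.

```python
from collections import Counter
from typing import Dict, Optional

def cross_verify(verdicts: Dict[str, str], min_sources: int = 3, agreement: float = 0.75) -> Optional[str]:
    """Return a consensus verdict if enough independent sources agree, else None.

    `verdicts` maps a fact-checker's name to its verdict, e.g. "true", "false",
    or "unproven". The thresholds here are illustrative assumptions only.
    """
    if len(verdicts) < min_sources:
        return None  # too few independent sources to treat the claim as settled
    counts = Counter(verdicts.values())
    top_verdict, top_count = counts.most_common(1)[0]
    if top_count / len(verdicts) >= agreement:
        return top_verdict
    return None  # sources disagree, so the claim should be flagged as contested

# Hypothetical example: three independent fact-checkers assess the same claim.
print(cross_verify({"checker_a": "false", "checker_b": "false", "checker_c": "false"}))  # -> "false"
print(cross_verify({"checker_a": "false", "checker_b": "true", "checker_c": "unproven"}))  # -> None
```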

d. Recommendation II: Legal, Administrative and Technical Tweaks

43. A simple, standard process for the public to request official documents, together with procedures for reviewing, approving, and explaining decisions, can significantly advance government transparency. Such a process can help maintain and build trust in public institutions and Singapore’s system of government in a complex, confusing information environment. It provides a means for the public to gain access to trustworthy information should they need to look more closely at past decision-making to gain a deeper understanding of various policy positions. Legislation on freedom of information, which is common in many jurisdictions, can be a means to provide a mechanism for public access to official documents and archives.90 This should be a straightforward step for Singapore, since its strong, longstanding record of clean, efficient government means it has nothing to hide.

44. Improving transparency to win and hold public trust against misinformation and disinformation should also include clear, evenhanded reporting of funding, support, and lobbying efforts. All advertising, commissioned pieces, and data collection efforts should clearly indicate sponsorship in a manner traceable to an identifiable entity with a usable point of contact. These regulations should apply to state agencies, corporations, non-profit entities, and all for-compensation lobbying activity. Foreign entities that engage in lobbying activity in Singapore and their representatives should be legally subject to registration and have a clear set of regulations establishing the scope for their work.91 In addition to regular reporting requirements to Parliament, such information should also be made available to the public through a free and easily accessible public repository. These policies can facilitate the management of legal advertising and lobbying work, while making false and inaccurate information as well as dubious sources easier to identify and dispel.

45. Similarly, elected officials should make public all directorships, meetings with lobbying organisations, and other activity that may constitute a real or perceived conflict of interest so long as they are in office. The actions of all public officials, especially at the senior levels, can colour perceptions about public institutions and the work they conduct. Reporting requirements and their application should be fair, non-partisan, comprehensive, inclusive, and non-discriminatory to reduce the potential for public cynicism.92 Reports should be regular and provided at least annually if not with greater frequency. Mandating transparency on these matters can reduce the potential for misinformation and disinformation to take hold.

46. Several technical steps can mitigate the volume of bots, false accounts, and mass automated postings that create conditions conducive to misinformation and disinformation. Current efforts by social media platforms to fact-check and close down fake accounts may be inadequate given the exponential growth in the numbers of bots, false accounts, and re-posts.93 One approach state agencies in Singapore can adopt is to work with social media providers to prohibit false accounts as well as bots that re-post information, or at least to set a limit on the number and rate of re-posts permissible by bots. Reducing the number and pace of re-posts gives Internet and social media users more time to fact-check and verify facts. A smaller volume of posts and re-posts can make efforts by fact-checkers and users to ascertain the veracity of claims in various posts more manageable.
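As a purely illustrative sketch of the kind of rate limit described above, the short Python fragment below implements a sliding-window limiter that a platform could, in principle, apply to automated re-posts from a single account. The class name, the one-hour window, and the cap of twenty re-posts are hypothetical assumptions for illustration, not any platform's actual policy or interface.

```python
import time
from collections import defaultdict, deque
from typing import Optional

class RepostRateLimiter:
    """Sliding-window cap on automated re-posts per account (illustrative sketch only)."""

    def __init__(self, max_reposts: int = 20, window_seconds: int = 3600):
        self.max_reposts = max_reposts          # hypothetical cap per window
        self.window_seconds = window_seconds    # hypothetical one-hour window
        self._history = defaultdict(deque)      # account id -> timestamps of recent re-posts

    def allow_repost(self, account_id: str, now: Optional[float] = None) -> bool:
        """Return True if the account may re-post now, False once it has hit the cap."""
        now = time.time() if now is None else now
        history = self._history[account_id]
        # Discard timestamps that have fallen outside the sliding window.
        while history and now - history[0] > self.window_seconds:
            history.popleft()
        if len(history) >= self.max_reposts:
            return False
        history.append(now)
        return True

if __name__ == "__main__":
    limiter = RepostRateLimiter(max_reposts=3, window_seconds=60)
    for attempt in range(5):
        print(attempt, limiter.allow_repost("bot-account-001"))
    # The first three attempts are allowed; the remaining two are refused until the window moves on.
```

Slowing the pace of automated re-posts in this way does not remove any content; it only buys human users and fact-checkers time to examine claims before they circulate widely.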

47. Singapore’s ultimate protection against misinformation and disinformation regardless of form and source lies in having a robust, reflective, and empowered society. Technical fixes, legal mandates, and administrative adjustments cannot replace active efforts to understand, bridge, accommodate, and address the many differences that naturally occur in any society, much less one as diverse as Singapore. Technology and laws complement but do not stand in for a public that is willing and able to assess information critically and verify claims independently. Foreign actors committed to engaging in active measures in Singapore will not be put off by domestic legislation and will try to work around technical defences. Diplomacy, active countermeasures, and retaliation are standard and effective means of addressing unwanted foreign influence, but they can only go so far without adequate support from society.

IV. Conclusion: Moderation and Precision are Key

48. Online falsehoods, whether in the form of misinformation or deliberate disinformation, are a new manifestation of an old problem. The upside to this situation is that appropriate and effective responses are well-known: reducing cleavages in society wherever they may lie, increasing transparency — especially where it relates to power and authority — and enhancing public consciousness. Moderation and calm should be the guiding principles behind attempts to counter misinformation and disinformation. Any form of state intervention, be it an administrative move, legislation, or public outreach, should be precise, limited, and take support for increasing transparency, mitigating social divides, and improving media literacy as its ultimate aim. Measured and careful responses are the best ways to contain and reduce threats from misinformation and disinformation over the long term, rather than unleashing the heavy hand of the state at every rumour or rustling of leaves. Vigilance is not paranoia.

49. Fortunately, there are other societies that have to deal more constantly with misinformation, disinformation, and hostile foreign intervention. Their successes and failures can be instructive for Singapore going forward. Taiwan, for instance, faces the constant threat of cyberattacks, hacking, disinformation, influence operations, and united front work from the PRC that operate both online and off.94 As a result, the island has developed a robust set of societal and online responses that have so far staved off and limited the effects of online disinformation campaigns by emphasising transparency and media literacy on top of technical fixes.95 Such experiences provide instances of positive learning for Singapore.

50. Singapore should beware overreactions and the chasing of shadows in the effort to control and eradicate misinformation and disinformation. Unless the state is ready to police almost every online or mobile exchange and citizens are ready to accept such a reality, policies that can be read as overly repressive or excessively intrusive create permissive conditions for distrust, suspicion, and conspiracy theories to take root. Once entrenched, such perspectives are difficult to eradicate and can gnaw away at the fabric of Singapore society from within. Overreacting to abstract threats of misinformation and disinformation may, in fact, play into the hands of those who wish Singapore ill. Hostile entities only need to work the ground that Singaporeans have inadvertently prepared for misinformation and disinformation campaigns to succeed in pushing the country towards greater divisiveness, disharmony, and confusion. Remedies must not be worse than or aggravate the ailment in question.

51. This submission examined the basic contours of misinformation, disinformation, and foreign interference in domestic affairs, and considered the ways in which such phenomena can play out in Singapore. A robust and active society that is ready to respond to various contingencies, rather than simply legislation or technical tools, is what will see Singapore through any threat from deliberate online falsehoods. Singapore is lucky in that it already has a set of sophisticated legal instruments that allows various state and law enforcement agencies to respond quickly and effectively to online falsehoods as and when necessary. The challenge for Singapore is not to become complacent by relying too much on the centralised state to deal with challenges that are elusive, adaptable, and fluid in nature, especially if this comes at the expense of inhibiting all-important societal responses. Much like dealing with conventional, offline falsehoods, nothing beats transparency, careful fact-checking, and a critical eye.

52. I urge the Parliamentary Select Committee on Deliberate Online Falsehoods to make all submissions publicly available along with its hearings and findings in the spirit of transparency. The Committee can further livestream and video archive its public hearings. This would allow the Committee to lead by example in setting the standard for legislative transparency, which has the added benefit of removing public speculation and doubt about the Committee’s work. Just as I am ready to discuss my views with the Committee, this submission is available to the public online. This is to make the content of this submission easily and independently verifiable to avoid confusion or misunderstanding about my opinions. I ask that those who wish to cite or reference my comments in this submission note my disclaimer at the start of the document and in footnote 1, and include it in any partial or complete representations or reproductions of my views in this document.

- END -

1 The opinions and comments in this submission are entirely the author’s own. They do not represent the views of the author’s employer or any other person or entity with which the author has an affiliation. The author is solely responsible for any errors in this document.

2 Claire Wardle. 2017. Fake News. It’s Complicated. First Draft News (February 16), https://firstdraftnews.org/fake-news-complicated/, accessed February 21 2018.

3 University of Michigan Library. “Fake News.” Lies and Propaganda: How to Sort Fact from Fiction. Research Guide for the University of Michigan Library, guides.lib.umich.edu/fakenews, accessed February 20, 2018.

4 The Ministry of Communications and Information and the Ministry of Law, Singapore. 2018. Deliberate Online Falsehoods: Challenges and Implications. Green Paper, Misc. 10 of 2018, Presented to Parliament of Singapore by the Minister of Law, p. 1.

5 Koro Study Team. 1969. “The Koro ‘Epidemic’ in Singapore.” Singapore Medical Journal, 10:4 (December), pp. 234–242, B.-Y. Ng and K.-T. Chee. 2006. A Brief History of Psychiatry in Singapore. International Review of Psychiatry, 18:4, p. 357.

6 Johann J. Mattelaer and Wolfgang Jilek. 2007. Koro — The Psychological Disappearance of the Penis. Journal of Sexual Medicine, 4, p. 1510.

7 Koro Study Team. The Koro ‘Epidemic’ in Singapore. p. 234.

8 Philip A. Kuhn. 2006. Soulstealers: The Chinese Sorcery Scare of 1768. Cambridge: Harvard University Press.

9 Paul A. Cohen. 1998. History in Three Keys: The Boxers as Event, Experience, and Myth. New York: Columbia University Press. pp. 146–172.

10 John Roosa. 2006. Pretext for Mass Murder: The September 30th Movement and Suharto’s Coup d’État in Indonesia. Madison, Wisconsin: University of Wisconsin Press.

11 Michael van Langenberg. (1990) Gestapu and State Power in Indonesia. in The Indonesian Killings, 1965–1966: Studies from Java and Bali. Edited by Robert Cribb. Clayton, Victoria, Australia: Centre of Southeast Asian Studies, Monash University. pp. 47–8.

12 Robert Cribb. 2004. The Indonesian Massacres. In Century of Genocide. Edited by Samuel Totten and William S. Parsons. London: Routledge. p. 239, Hilmar Farid. 2006. Indonesia’s Original Sin: Mass Killings and Capitalist Expansion. Inter-Cultural Asian Studies, 6:1, pp. 3–16.

13 G. Mitchell Reyes. 2006. The Swift Boat Veterans for Truth, the Politics of Realism, and the Manipulation of Vietnam Remembrance in the 2004 Presidential Elections. Rhetoric and Public Affairs, 9:4 (Winter), pp. 571–600.

14 Tobacco Control Legal Consortium. 2006. The Verdict Is In: Findings from United States v. Philip Morris, The Hazards of Smoking.

15 Thomas Boghardt. 2009. “Operation INFEKTION: Soviet Bloc Intelligence and Its AIDS Disinformation Campaign.” Studies in Intelligence, 53:4 (December), pp. 1–24, United States Department of State. 1987. The U.S.S.R.’s AIDS Disinformation Campaign. Foreign Affairs Note (July).

16 Justin Ong. 2017. Singapore to Block Foreigners from Promoting Political Causes Locally. Channel News Asia (April 3). https://www.channelnewsasia.com/news/singapore/singapore-to-block-foreigners-from-promoting-political-causes-lo-8712130, accessed February 23, 2018, Section 5, Public Order Act, Chapter 257A. Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/POA2009#pr5-, accessed February 24, 2018.

17 American Chamber of Commerce in Singapore, The Australian Chamber of Commerce, Singapore, British Chamber of Commerce, Singapore, Canadian Chamber of Commerce in Singapore, EuroCham Singapore, French Chamber of Commerce in Singapore, Japanese Chamber of Commerce and Industry, Singapore, New Zealand Chamber of Commerce, Singapore, and Singapore-German Chamber of Industry and Commerce. 2013. Letter to Acting Minister of Manpower from Chambers of Commerce Endorsing SBF Position Paper on Population, February 4, http://www.sbf.org.sg/letter-to-acting-minister-of-manpower, accessed February 20, 2018, Christopher Tan. 2017. Tesla Boss Calls PM Lee over CO2 Levy. Straits Times (March 8), http://www.straitstimes.com/singapore/transport/tesla-boss-calls-pm-lee-over-co2-levy, accessed February 20, 2018.

18 2018. StanChart Robbery: Singapore Agrees to UK Request to Not Cane Suspect If Found Guilty. Channel News Asia (February 20). https://www.channelnewsasia.com/news/singapore/stanchart-robbery-singapore-david-roach-uk-request-extradition-9974270, accessed February 24, 2018.

19 Lawrence A. Green and James K. Sebenius. 2004. Tommy Koh and the U.S.-Singapore Free Trade Agreement: A Multi-Front “Negotiation Campaign.” Harvard Business School Working Paper 15-053 (December 16), pp. 10–11.

20 John Thomas. 2001. Institutional Innovation and Prospects for Transference, Part I: Transferring Singaporean Institutions to Suzhou, China. John F. Kennedy School of Government, Harvard University, Faculty Working Paper Series, RWP02–001 (September), pp. 14–16, H.L. Tey. 1999. Suzhou Park: Singapore to Cut Stake to 35%. Business Times (June 29), p. 1.

21 Ravi Menon. Stepping Up the Chongqing-Singapore Connection. Keynote Address at Singapore-China (Chongqing) Financial Conference, September 4, 2017. www.mas.gov.sg/News-and-Publications/Speeches-and-Monetary-Policy-Statements/Speeches/2017/Stepping-Up-the-Chongqing-Singapore-Connection.aspx, accessed February 24, 2018, Tianjin Eco-City. 2017. 13th Joint Council for Bilateral Cooperation (JCBC) Meetings in Beijing, The People’s Republic of China, February 27, 2017. https://www.tianjinecocity.gov.sg/news-press/2017/20170227.htm, accessed February 24, 2018.

22 About Singapore. U.S.-ASEAN Business Council. https://www.usasean.org/countries/singapore/about, accessed February 24, 2018, What We Do. U.S.-ASEAN Business Council. https://www.usasean.org/about/what-we-do, accessed February 24, 2018.

23 Ja Ian Chong. 2012. External Intervention and the Politics of State Formation: China, Indonesia, Thailand — 1893–1952. Cambridge: Cambridge University Press.

24 Anne-Marie Brady. 2017. Magic Weapons: China’s Political Influence Activities under Xi Jinping. Kissinger Institute on China and the United States, Woodrow Wilson International Center for Scholars, Washington, DC (September 18), https://www.wilsoncenter.org/article/magic-weapons-chinas-political-influence-activities-under-xi-jinping, accessed February 21, 2018, Clive Hamilton and Alex Joske. 2018. Submission to the Parliamentary Joint Committee on Intelligence and Security: Inquiry into the National Security Legislation Amendment (Espionage and Foreign Interference) Bill 2017. Parliament of Australia. https://www.aph.gov.au/DocumentStore.ashx?id=96afcef1-c6ea-4052-b5e3-bcac4951bb0e&subId=562658, accessed February 21, 2018, Matt Nippert. 2017. Prime Minister Jacinda Ardern Orders Security Agencies to Look into Case of Burgled Professor. New Zealand Herald (February 19). http://www.nzherald.co.nz/nz/news/article.cfm?c_id=1&objectid=11997764, accessed February 26, 2018.

25 Thomas Rid. 2017. Disinformation: A Primer in Russian Active Measures and Influence Campaigns. Hearings before the Select Committee on Intelligence, United States Senate, One Hundred and Fifteenth Congress, (March 30). https://www.intelligence.senate.gov/sites/default/files/documents/os-trid-033017.pdf, accessed February 22, 2018.

26 Deliberate Online Falsehoods, p. 18, Joseph B. Tamney. 1976. The Singapore Herald Affair. Asian Studies, 10:2, pp. 256–61, Cherian George. 2013. Why Singaporean Journalists Don’t Press for Legal Reform. In Democracy, Media, and Law in Malaysia and Singapore: A Space for Speech. Edited by Andrew T. Kenyon, Tim Marjoribanks, and Amanda Whiting. London: Routledge. p. 47.

27 Elizabeth Dwoskin, Adam Entous, and Craig Timberg. 2017. Google Uncovers Russian Bought Ads on YouTube, Gmail, and Other Platforms. Washington Post (October 9). https://www.washingtonpost.com/news/the-switch/wp/2017/10/09/google-uncovers-russian-bought-ads-on-youtube-gmail-and-other-platforms/?utm_term=.af80a2373dfe, accessed February 21, 2018.

28 Claire Wardle and Hossein Derakhshan. 2017. Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking. Council of Europe (September 2017), p. 35.

29 Deliberate Online Falsehoods, pp. 4–11.

30 Deliberate Online Falsehoods, pp. 11–12.

31 Yuen Foong Khong. 1992. Analogies at War: Korea, Munich, Dien Bien Phu, and the Vietnam Decisions of 1965. Princeton: Princeton University Press. Chaps. 1–3, 8–9.

32 Deliberate Online Falsehoods, p. 2.

33 John Ehrenreich. 2017. Why Are Conservatives More Susceptible to Believing Lies? Slate (November 9), www.slate.com/articles/health_and_science/science/2017/11/why_conservatives_are_more_susceptible_to_believing_in_lies.html, accessed February 21, 2018, J.R. Hibbing et al. 2014. Differences in Negativity Bias Underlie Variations in Political Ideology. Behavioral and Brain Sciences, 37:3 (June), pp. 297–307, David Lazer et al. 2017. Combating Fake News: An Agenda for Research and Action. Shorenstein Center on Media, Politics and Public Policy, Kennedy School of Government, Harvard University, https://shorensteincenter.org/combating-fake-news-agenda-for-research/, accessed February 21, 2018, Vidya Narayanan et al. 2018. Polarisation, Partisanship, and Junk News Consumption over Social Media in the U.S. COMPROP Data Memo 2018.1, Computational Propaganda Project, Oxford University (February 6), comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2018/02/Polarization-Partisanship-JunkNews.pdf, accessed February 21, 2018.

34 Lazer et al., Combating Fake News, Wardle and Derakhshan, Information Disorder, pp. 49–74, 86–90.

35 Wardle and Derakhshan, Information Disorder, pp. 75–76.

36 Lazer et al., Combating Fake News.

37 Wardle and Derakhshan, Information Disorder, pp. 20–48.

38 Deliberate Online Falsehoods, pp. 4–12.

39 Andrew Guess, Brendan Nyhan, and Jason Reifler. 2018. Selective Exposure to Misinformation: Evidence from the Consumption of Fake News during the 2016 U.S. Presidential Campaign. European Research Council (January 9), http://www.dartmouth.edu/~nyhan/fake-news-2016.pdf, accessed February 22, 2018, Vidya Narayanan et al. 2017. Russian Involvement and Junk News during Brexit. COMPROP Data Memo 2017.10, Computational Propaganda Project, University of Oxford (December 19), http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/12/Russia-and-Brexit-v27.pdf, accessed February 22, 2018.

40 Deliberate Online Falsehoods, p. 4.

41 Digital Forensic Research Lab, Atlantic Council. 2017. #ElectionWatch: Disinformation in Deutschland. Medium (September 27). https://medium.com/dfrlab/electionwatch-disinformation-in-deutschland-a97b61d7b025, accessed February 22, 2018, Mika Aaltola. 2017. Democracy’s Eleventh Hour: Safeguarding Democratic Elections against Cyber-Enabled Autocratic Meddling. Finnish Institute of International Affairs Briefing Paper 226 (November). pp. 6–8, Kremlin Watch. 2017. Overview of Countermeasures by the EU28 to the Kremlin’s Subversion Operations: How Do the EU28 Perceive and React to the Threat of Hostile Influence and Disinformation Operations by the Russian Federation and Its Proxies. Kremlin Watch Report, European Values Think Tank (May 16). pp. 55–61, Constanze Stelzenmüller. 2017. The Impact of Russian Interference on Germany’s 2017 Elections. Brookings Institution, Washington, DC (June 28). https://www.brookings.edu/testimonies/the-impact-of-russian-interference-on-germanys-2017-elections/, accessed February 21, 2018.

42 Jalelah Abu Baker. 2018. International School Van Scare: Mixed Views from Parents on Driver Having ‘No Ill Intent.’ Channel News Asia (January 18). https://www.channelnewsasia.com/news/singapore/uwc-tanglin-trust-uwcsea-international-school-van-students-9871802, accessed February 23, 2018.

43 Wardle and Derakhshan, Information Disorder, pp. 49–56.

44 Man-Pui Sally Chan et al. 2017. Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation. Psychological Science, 28:11 (September), pp. 1531–46, Lazer et al., Combating Fake News, C.R. Sunstein et al. 2016. How People Update Beliefs about Climate Change: Good News and Bad News (SSRN Scholarly Paper No. ID 2821919). Rochester, NY: Social Science Research Network.

45 Kim LaCapria. 2018. Conspiracy Theories about Mass and School Shootings, Explained. Snopes (February 21). https://www.snopes.com/2018/02/21/conspiracy-theories-mass-school-shootings-explained/, accessed February 23, 2018, David Mikkelson. 2018. Sandy Hook Exposed? Snopes (February 21). https://www.snopes.com/politics/guns/newtown.asp, accessed February 23, 2018, Jason Wilson. 2018. Crisis Actors, Deep State, False Flag: The Rise of Conspiracy Theory Code Words. The Guardian (February 21), https://www.theguardian.com/us-news/2018/feb/21/crisis-actors-deep-state-false-flag-the-rise-of-conspiracy-theory-code-words, accessed February 23, 2018.

46 Section 59, Parliamentary Elections Act (Chapter 218). Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/PEA1954#pr61-, accessed February 23, 2018, Section 40, Presidential Elections Act (Chapter 240A). Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://statutes.agc.gov.sg/Act/PrEA1991#pr37-, accessed February 23, 2018.

47 Sections 61(1)(b) and 61(1)(ii), Parliamentary Elections Act, Sections 42(1)(b) and 42(1)(ii), Presidential Elections Act.

48 Sections 61(1)(c) and 61(1)(iii), Parliamentary Elections Act, Sections 42(1)(c) and 42(1)(iii), Presidential Elections Act.

49 Sections 61(1)(d), 61(1)(e), and 61(1)(iv), Parliamentary Elections Act, Sections 42(1)(d), 42(1)(e), and 42(1)(iv), Presidential Elections Act.

50 Defamation Act (Chapter 75), Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/DA1957#pr12-, accessed February 23, 2018.

51 Sections 503–508, Penal Code (Chapter 224). Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/PC1871?&ProvIds=P4XXII_503-&ViewType=Within&Phrase=intimidation&WiAl=1, accessed February 23, 2018, Protection from Harassment Act, Chapter 256A. Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act-Rev/PHA2014/Published/20150525?DocDate=20150525, accessed February 23, 2018.

52 Sections 298 and 298A, Penal Code.

53 Section 3(1), Sedition Act, Chapter 290. Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/SA1948#pr3-, accessed February 23, 2018.

54 Protection from Harassment Act.

55 Sections 3–13, Administration of Justice (Protection) Act 2016. Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/AJPA2016, accessed February 23, 2018.

56 Sections 8 and 9, Maintenance of Religious Harmony Act, Chapter 167A. Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/MRHA1990, accessed February 25, 2018.

57 Section 13, Maintenance of Religious Harmony Act.

58 Sections 10–12 and 14, Maintenance of Religious Harmony Act.

59 Sections 45, 46, and 65, Telecommunications Act. Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/TA1999#pr65-, accessed February 23, 2018.

60 Section 45, Telecommunications Act.

61 Section 65, Telecommunications Act.

62 Section 11(4)(b)-(d), Computer Misuse and Cybersecurity Act. Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/CMCA1993#pr11-, accessed February 23, 2018.

63 Sections 3, 5, 6, and 7, Computer Misuse and Cybersecurity Act.

64 Section 4, Computer Misuse and Cybersecurity Act.

65 Section 13, Administration of Justice (Protection) Act.

66 Section 13, Administration of Justice (Protection) Act.

67 Sections 34 and 35, Films Act. Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://sso.agc.gov.sg/Act/FA1981#pr34-, accessed February 23, 2018.

68 Internet Code of Practice. Infocomm Media Development Authority, Singapore. https://www.imda.gov.sg/-/media/imda/files/regulation-licensing-and-consultations/codes-of-practice-and-guidelines/acts-codes/19-policiesandcontentguidelinesinternetinternecodeofpractice.pdf, accessed February 23, 2018, Sections 13 and 16, Broadcasting Act, Chapter 28. Singapore Statutes Online, Attorney-General’s Chambers, Singapore. https://statutes.agc.gov.sg/Act/BA1994#pr13-, accessed February 23, 2018.

69 Sections 107–120B, Penal Code.

70 Sections 121A–121D and 123–126, Penal Code.

71 Sections 267A, 350, and 505, Penal Code.

72 Lee Hsien Loong v. Roy Ngerng Yi Ling. Singapore Academy of Law. http://www.singaporelaw.sg/sglaw/laws-of-singapore/case-law/free-law/high-court-judgments/18327-lee-hsien-loong-v-roy-ngerng-yi-ling, accessed February 23, 2018.

73 Charissa Yong. 2016. Elections Department, Police Explain Cooling-Off Day Probes. Straits Times (June 2). http://www.straitstimes.com/politics/elections-department-police-explain-cooling-off-day-probes, accessed February 23, 2018, 2017. ‘Stern Warnings’ Issued to 4 People for Cooling-Off Day Breaches. Channel News Asia (February 16). https://www.channelnewsasia.com/news/singapore/stern-warnings-issued-to-4-people-for-cooling-off-day-breaches-7588486, accessed February 23, 2018.

74 Elgin Toh. 2017. AGC Proceeds on Contempt of Court Case against Li Shengwu. Straits Times (November 13). http://www.straitstimes.com/singapore/courts-crime/agc-proceeds-on-contempt-of-court-case-against-li-shengwu.

75 Rachel Au-Yong. 2016. TRS Sedition Trial: Ai Takagi Convicted of Four Charges of Sedition. Straits Times (March 8). http://www.straitstimes.com/singapore/courts-crime/trs-sedition-trial-ai-takagi-convicted-of-four-charges-of-sedition, accessed February 23, 2018, Pearl Lee. 2016. TRS Co-Founder Yang Kaiheng Jailed 8 Months for Sedition. Straits Times (June 8). http://www.straitstimes.com/singapore/courts-crime/trs-co-founder-yang-kaiheng-jailed-8-months-for-sedition, accessed February 23, 2018.

76 Elena Chong. 2015. Amos Yee Charged Over Remarks against Christianity and Offending Viewers of his Video on Lee Kuan Yew. Straits Times (March 31). www.straitstimes.com/singapore/courts-crime/amos-yee-charged-over-remarks-against-christianity-and-offending-viewers-of, accessed February 23, 2018.

77 2015. Teen Behind Video Insulting Lee Kuan Yew and Christians Arrested. Today Online (March 31). www.todayonline.com/singapore/teen-behind-online-video-insulting-christians-arrested, accessed February 23, 2018.

78 George Baylon Radics and Yee Suan Poon. 2016. Amos Yee, Free Speech, and Maintaining Religious Harmony in Singapore. University of Pennsylvania Asian Law Review 12:2, pp. 186–242.

79 2015. AGC Sends TOC Take-down Notification for Letter by Amos Yee’s Lawyer. The Online Citizen (June 15). https://www.theonlinecitizen.com/2015/06/15/agc-sends-toc-take-down-notification-for-letter-by-amos-yees-lawyer/, accessed February 23, 2018.

80 2015. AGC Issues Take-Down Notice to TOC for ‘Patently False’ Article. Channel News Asia (August 5). https://www.channelnewsasia.com/news/singapore/agc-issues-take-down-notice-to-toc-for-patently-false-article-8222972, accessed February 23, 2018.

81 Kremlin Watch, Overview of Countermeasures by the EU28 to the Kremlin’s Subversion Operations, pp. 3–27, Lazer et al., Combating Fake News, Wardle and Derakhshan, Information Disorder, pp. 57–74, 80–85.

82 Youkyung Lee. 2017. Taiwan’s Hacker Minister Reshaping Digital Democracy. Associated Press (April 24). https://www.apnews.com/c8883f13efe644b9b605799f8eed2a8e/Taiwan%27s-%22hacker-minister%22-reshaping-digital-democracy, accessed February 25, 2018.

83 Yolanda Chin and Norman Vasu. 2006. A Blueprint for Social Resilience: The Next 1,826 Days and Beyond. IDSS Commentaries 121/2006 (November 30), https://www.rsis.edu.sg/wp-content/uploads/2014/11/CO06121.pdf, accessed February 23, 2018, Eugene K.B. Tan. 2009. From Clampdown to Limited Empowerment: Soft Law in the Calibration and Regulation of Religious Conduct in Singapore. Law and Policy, 31:3 (July), pp. 351–79.

84 Vincent Chua et al. 2017. A Study on Social Capital in Singapore. Institute of Policy Studies, National University of Singapore. lkyspp2.nus.edu.sg/ips/wp-content/uploads/sites/2/2017/11/Study-of-Social-Capital-in-Singapore_281217.pdf, accessed February 23, 2018, You Yenn Teo. 2018. This is What Inequality Looks Like. Singapore: Ethos Books.

85 See, for example, Charlie Warzel. 2018. The Pro-Trump Media Has Met Its Match In The Parkland Students. Buzzfeed (February 22). https://www.buzzfeed.com/charliewarzel/parkland-school-shooting-survivors-crisis-actors-pro-trump?utm_term=.vbWWAJBdn#.qkMZ1v59j, accessed February 23, 2018.

86 For an example of media literacy efforts in schools, see Nicola Smith. 2017. Schoolkids in Taiwan Will Now Be Taught How to Identify Fake News. Time (April 7). http://time.com/4730440/taiwan-fake-news-education/, accessed February 25, 2018.

87 Kremlin Watch, Overview of Countermeasures by the EU28 to the Kremlin’s Subversion Operations, pp. 3–27, Lazer et al., Combating Fake News, Wardle and Derakhshan, Information Disorder, pp. 57–74, 80–85.

88 Sherry Ricchiardi. 2016. Q&A with Alexios Mantzarlis: A New Age of Global Fact-Checking. International Journalists’ Network (June 30). https://ijnet.org/en/blog/qa-alexios-mantzarlis-new-age-global-fact-checking, accessed February 24, 2018, International Fact-Checking Network Fact-Checkers’ Code of Principles.

89 Kremlin Watch, Overview of Countermeasures by the EU28 to the Kremlin’s Subversion Operations, pp. 3–27, Lazer et al., Combating Fake News, Wardle and Derakhshan, Information Disorder, pp. 57–74, 80–85.

90 See, for example, Freedom of Information Act 2000. legislation.gov.uk, United Kingdom National Archives. https://www.legislation.gov.uk/ukpga/2000/36/contents, accessed February 24, 2018, Freedom of Information Act, 5 U.S.C. § 552. United States Department of Justice. https://www.justice.gov/oip/freedom-information-act-5-usc-552, accessed February 24, 2018.

91 See, for example, Foreign Agents Registration Act, 22 U.S.C. § 611 et seq. United States Government Publishing Office. https://www.gpo.gov/fdsys/pkg/USCODE-2009-title22/pdf/USCODE-2009-title22-chap11-subchapII.pdf, accessed February 24, 2018.

92 International Fact-Checking Network Fact-Checkers’ Code of Principles. Poynter Institute. https://www.poynter.org/international-fact-checking-network-fact-checkers-code-principles, accessed February 24, 2018.

93 Sam Levin. 2017. ‘Way Too Little, Way Too Late’: Facebook’s Factcheckers Say Effort Is Failing. The Guardian (November 13). https://www.theguardian.com/technology/2017/nov/13/way-too-little-way-too-late-facebooks-fact-checkers-say-effort-is-failing, accessed February 23, 2018.

94 Hillary Lamb. 2017. Taiwan to Strengthen Cyber-Security Measures in Face of Persistent Attacks. Engineering and Technology (December 12). https://eandt.theiet.org/content/articles/2017/12/taiwan-to-strengthen-cyber-security-measures-in-face-of-persistent-attacks/, accessed February 25, 2018, Nicholas J. Monaco. 2017. Computational Propaganda in Taiwan: Where Digital Democracy Meets Automated Autocracy. Working Paper No. 2017.2. Computational Propaganda Research Project, Oxford University. http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/06/Comprop-Taiwan-2.pdf, accessed February 25, 2018.

95 Charlie Taylor. 2017. The Hacker Who is Adding a Different Flavour to Politics in Taiwan. Irish Times (October 5). https://www.irishtimes.com/business/technology/the-hacker-who-is-adding-a-different-flavour-to-politics-in-taiwan-1.3244102, accessed February 25, 2018, 2017. The Fake News Invasion: Understanding the Dangers of Misinformation and What To Do About It. American Institute in Taiwan. https://www.ait.org.tw/fake-news-invasion-understanding-dangers-misinformation/, accessed February 25, 2018.
