Starting with Phil Howard’s keynote.
Key argument: within a few years, the complex geopolitical concerns around the IOT will be made more complex by the rich data of social media (and algorithms and bots in politics).
The use of social media to manipulate public opinion is now a hot topic. Original OII research into Russian activities. Surprised to see mechanisms of social control from Eastern Europe / authoritarian regimes transferring to Western democracies.
OII seek to increase civic engagement and solve public problems through social data science. Very diverse community, many different activities.
Pax Technica book says:
- democracy defined by the relationship between people and their devices.
- Not peace, but stability through pacts.
- Govt and tech industry bound in mutual defense, design collaboration, standards setting and data mining.
IOT technologists don’t like the book because it’s not optimistic :) IOT is getting everywhere. Embedded devices in infrastructure, bodies. Drones and dust/mote sensors.
We (political science) are used to definitions of regime type based on how the executive of government relates to citizens. But this will shift from abstract to concrete, encoded in software.
There’s so many data points already — our phones, each app, so many data points just in this room. Citizens agree to T&Cs; that’s a pact. Expect to see more of these — agreements which, if breached, have consequences.
There won’t be one IOT, but at least 3. China is building its own. USA has a freeform, mostly unregulated one; increasingly connected to social media data. Europe may end up with a separate one reflecting values here. Not sure what Brexit will mean.
IOT will generate immense amounts of politically valuable data in coming decades. Important for our future.
Look at highly automated twitter accounts; ones which move, agile, between interests — UK election, US election, other things. Bots with weird keywords and activity. If tweeting more than 49.5 times a day using political hashtags — probably a bot, not a human. Switching English to Cyrillic is also a giveaway. Trump’s follower list is also a good source of bot accounts — no photo, no content, but some mistakes — thousands of accounts generated on the same day, say.
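The heuristics above can be sketched in code. A minimal illustration: the 49.5-tweets-a-day threshold and the same-day-creation batch check come from the talk; the `Account` fields, the observation window, and the batch size of 1,000 are hypothetical stand-ins, not the project's actual method.

```python
from dataclasses import dataclass
from collections import Counter

# Threshold mentioned in the talk; everything else here is illustrative.
HIGH_AUTOMATION_TWEETS_PER_DAY = 49.5

@dataclass
class Account:
    handle: str
    political_tweets: int   # tweets using political hashtags in the window
    days_observed: int
    created_on: str         # e.g. "2016-07-01"

def likely_bot(acct: Account) -> bool:
    """Flag accounts averaging more than ~49.5 political-hashtag tweets a day."""
    rate = acct.political_tweets / max(acct.days_observed, 1)
    return rate > HIGH_AUTOMATION_TWEETS_PER_DAY

def suspicious_creation_batches(accounts, min_batch=1000):
    """Flag creation dates shared by unusually many accounts —
    the 'thousands generated on the same day' giveaway."""
    by_day = Counter(a.created_on for a in accounts)
    return {day for day, n in by_day.items() if n >= min_batch}
```

Real detection pipelines combine many more signals, but the point of the talk is that even simple rate thresholds like this catch a lot.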
danger: when these automated accounts carry fake news. Or, “junk news” — better term.
Junk — all caps headlines, made up photos, weird stories.
Almost all RussiaToday content is junk news.
You can map the networks of highly automated social media, and the kinds of content that is shared.
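One simple way to map such networks is a co-sharing projection: link any two accounts that shared the same item, and look for dense clusters. A minimal pure-Python sketch — the account names, URLs, and record format are invented for illustration, not real data from the research.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical (account, shared_url) records standing in for collected tweets.
shares = [
    ("bot_a", "junk.example/story1"),
    ("bot_b", "junk.example/story1"),
    ("bot_b", "junk.example/story2"),
    ("human_c", "news.example/report"),
]

def cosharing_edges(shares):
    """Return edges linking accounts that shared the same item.
    Dense clusters of such edges are one signal of coordinated amplification."""
    sharers = defaultdict(set)
    for account, url in shares:
        sharers[url].add(account)
    edges = set()
    for url, accounts in sharers.items():
        for a, b in combinations(sorted(accounts), 2):
            edges.add((a, b))
    return edges
```

On the toy data this links only `bot_a` and `bot_b` (both pushed `story1`); at scale the same projection surfaces which content travels through highly automated clusters versus organic ones.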
Who would buy 10,000 fake twitter users? Marketing, entertainment, porn? Pharma, actually, is biggest buyer.
Google “computational propaganda” for more
OII project releasing data memos, short form, lots of methodology notes, and only a few punchlines — trying to do this whilst elections are still running. Expect a change from elections to issue based propaganda.
In authoritarian regimes, military units are repurposed to do this work. In democracies, it’s political parties, with staffed offices. Found in each country studied.
Gender politics — more research coming on this. Seems easier to drive women off social media, soft target.
Michigan example. It’s not the average amount of junk news on social media, it’s the targeting — where it appears, such as swing states. Found the twitter conversation was biased — twice as many Trump hashtags as Clinton ones. More extremist, masked commentary (stuff in NYT font!), sensationalist, than professionally researched news. Because of the rhythm of content production, the amount of shared professional news reached its lowest point just before the vote. Was the amount of junk news on twitter concentrated in swing states? Yes. (Data memo published, scholarly publication in the works.) Why was this? Various possible explanations.
But this is a global problem.
1:1 ratio of professional news to junk news in USA before the election. 7:1 in France, 4:1 in UK and Germany. Theory: if there’s been some public investment in national news infrastructure, some inoculation?
30 countries studied. Different platforms, different regimes. Mostly looked at twitter so far, because the data is available. That’s true for most research. Would be valuable to stop looking at twitter in future research — the action is on facebook now and we must find ways to study it, with or without the platform cooperating. Also snapchat, instagram, platforms used in other countries. Young people are there. Need to look at these too.
Bot writers are adapting. eg in Germany, they’ve read the research and are adjusting their methods so the researchers can’t track them — eg reducing below 45 junk messages a day from one twitter account.
Governments are overregulating. The EU in particular — may affect political speech by overdoing it, too much control over political content on social media. Algorithmic audits for instance, and fines (the current mode of regulation).
Tech companies are digging in. Facebook takes ages to release tiny nuggets even after many formal US inquiries.
More elections coming — including sensitive ones: Brazil, Russia, Colombia, Mexico, Rwanda, Georgia, Pakistan, Thailand and more. Countries where rumours may turn to street violence.
Let’s look at IOT data. Every budget, national security bill, taxes, etc, will come with some sort of organised battle (from in country or outside).
Russia’s activities are visible. China — harder to know what’s possible / being done. We know the EU is crafting a regulatory response, possibly overbearing. The US presidential election tricks will start to be used in climate, oil, pharma issues. Any issue where an industry might seek something from government, expect bots, junk news etc to be used.
Don’t think there’s been any genuine AI in this space yet. But it will appear. AI will be generating personal messages for politicians — pulled from IOT data and more. IOT data is not simply attitudinal data, it’s not aspirational data (what you would like), it’s almost purely behavioural. Social sciences need to adapt for this.
Expect AI to craft content for you, personal to you, for the 48h before you vote. This will be the primary way we experience political content on social media. Very hard to know what is actually happening when it’s so individual.
This is a deep attack on the enlightenment.
eg “smoking causes cancer” — clear 50y consensus. But there’s online campaigns which sow doubt, teach the controversy — it depends on gender, lifestyle, brand, which cancer.
This is worse when the issues have genuine controversy today.
Politicians who go with their gut, who think critical thinking is listening to both sides.
Strategy — what shall we do?
We’ve lost the privacy war in this internet.
But we can craft some guidelines now — this moment, although it’s closing. 2–3y to sort this, afterwards it is entrenched. Ideas:
- report ultimate beneficiary. You should be able to take any data and make a list of the organisations that use that data. (based on blood diamonds work)
- additional opt-ins. You should be able to volunteer your data to others — so my coffee habits are already there for the coffee industry, but I can send them to the coffee co-op I love too, so they can benefit from it.
- an IOT that tithes.
- extended non-profit rule. In most countries you can’t profit by selling voter registration files. Data gets all mixed up by data brokers, bought and sold. Should be able to come up with a list of variables which you couldn’t profit on, which should be shared — maybe public health info, or census data. This could prevent the secondary market in personal data from benefiting exclusively from the marriage of IOT and social data.
Project is working on a few things:
- make it easier for journalists to understand, teach them to report well. Encourage reporting on AI, bots and digital public sphere.
- build tools to help people identify junk news and AI generated content. Can AI help here?
- use blockchains to track news provenance
- junk science countermeasures — eg if Russians pushing anti climate change campaign, get our experts using social media to counter it. Best way to respond to bad political speech is with better.
- develop research dissemination plans with public impact
- advise civil society groups on attack response — when they get attacked after releasing reports etc. In E Europe, this can mean CSOs using bot armies; the pushback in authoritarian regimes can be brutal, so this is an understandable response. Is there another response possible or better?
Window is closing on deciding how we manage this tech.
Q: This was mostly about political discourse. What about the religious sphere?
A: The application of this tech to religious discourse is mostly led by theocracies, eg Iran. Some interesting research on small-p-politics communities, eg connecting young women to religious leaders outside their local communities/culture via social media. Are Catholics investing in scriptural-analysis AI??
Q: Surprised by how negative about privacy the talk was. EU law is the last bastion of privacy and data protection; individual cases are making a difference — Schrems, Snowden, Google Spain right to be forgotten. As well as academics, where’s the role for the empowerment of individuals?
A: Don’t see all those as clear victories for privacy advocates. Any data wanted about a member of this audience can be bought now on the darknet. That’s the reality. Educating the public is important. But the industrial behemoth around selling sensing tech / things with cameras / etc is so big that even if you don’t have that smart stuff in your house, if your neighbour has it, if drones are in your street — you can’t opt out of this IOT future. There is no opt out. More proactive to engage with industry — talk to engineers who don’t realise they are affecting public life — help them see what they are doing, and together build protocols to protect future society.
Q: Obama campaign wasn’t about tech, it was his experience as a community organiser. Trump — already dominated media prior to internet. How much is really new, vs old using new tools of internet? Arab Spring — centralised govt facing decentralised internet response. But it wasn’t the internet, it was existing movement
A: People can fight for democracy who have never experienced it. Stories cascaded, in the Arab Spring, from country to country, in a way that would never have happened before. Charisma… politics has always struggled to create a story that translates across contexts, which isn’t about an individual’s charisma. Facebook and twitter offered embedded staff to both campaigns; the Clinton campaign felt they had existing analytics capacity, whereas the understaffed Trump campaign welcomed help. Bots may have mattered at two moments in the US campaign — when Trump was still a joke in the primaries (to make him seem more popular), and the negative campaigning later at the end (pushing anti-Clinton junk news).
Q: premise that new era of stability will be ushered in by government and industry alignment. But seems that they are less aligned right now. Given governments have different approaches, how does it all align?
A: consumers are a *little* concerned about govt surveillance — not a lot. Which govt would act against facebook? China, maybe Russia. Otherwise we are negotiating with facebook. A systematic change — where FB may share data with civil society or libraries or national science foundation, say — doesn’t seem to be on the table. The actors with access and interests are firms, and the govt. Democracy relies on sharing of power, that means sharing of data today. Civil society doesn’t have the capability, or resources of lobbyists. Hard to see how these arrangements, built into free trade and terms of service agreements, would be undone. Hope we can add to them.
Q: Rowntree in 1904 — selfish, unscrupulous wealth, power through press. How much are we seeing a continuation of same issue? How much a categorical change?
A: Politicians have created fake news for 1000s of years. Military rumours, etc. Changed the course of world history. The proximate cause of false info is facebook serving junk news to voters just before they vote — political science tells us people don’t make up their minds until the 3 days before they vote. We could speak of other things — Higher Ed’s failure to teach critical thinking, say. But we can address the proximate cause. We can measure the number of citizens in the UK who think Brexit will save £350m — a year on, can still measure the impact. Love the idea of big historic context, but we should think about the proximate cause.
Q: Presenting fear of microtargeting. But many of the examples are macro, about volume and speed, not targeted. The £350m was on the side of a bus (and leaflets in hospitals!) — this was a ‘most effective message’ regardless of how it got to people.
A: Problems arise when these ridiculous stories jump media, appear in mainstream. Broadcast media still important. Unless we as outsiders can see there’s 50m voters getting 50m messages, we are ill equipped to address transparency or accountability or anything. This election wasn’t microcustomised, but AI is going to get us there.
Q: Surprised you played down the importance of privacy law. We haven’t reached peak data yet, or thought about the level to which ads can be customised and targeted (eg blood glucose level live from your wearable, so ads arrive when you are susceptible to political messaging). Cosmetics ads are pushed on Mondays because women feel least attractive then. So there’s a long way further this could go, where we as natural persons will produce huge quantities of data which could lead to highly customised and persuasive advertising. GDPR — requiring privacy by design — is that important in this context? Also, raising awareness in ordinary citizens about their new rights in GDPR, such as objection to profiling, and other tools they will have access to.
A: Like privacy by design as a principle. We could also do democracy by design — get designers to think creatively about this! Voting is maybe not our most important democratic exercise! Skeptical about fixing privacy law — even if you object to a particular firm sharing your data, and you have the energy to track it all down, there will be enough other people LIKE you who can be used to target you. Example: pro-life and pro-choice campaigns bucket women by whether they’ve ever bought contraception on a credit card. They created a complex database — 25y ago! — aggregated down to household. Most advanced democracies have this level of study. Even if you remove yourself from this dataset, or decline to participate — you still get the marketing. The beauty ads on a Monday. So we do need the law, but most citizens most of the time won’t think about it. Most internet traffic isn’t politics. It’s consumption and Kardashians. Most people won’t do stuff, so we must design oversight which prevents abuse by firms. (Examples in recent years of abuse based on these existing datasets.)
Next session blogging here.