Power Switch — conference report

I’m at the Power Switch conference in Cambridge, exploring how power is changing in a networked world. I’ll be updating this post through the day with notes from the sessions.


John Naughton and David Runciman introduced the day, from the Technology and Democracy project at Cambridge.

David:

Power is a slippery concept. It’s tempting to think we know what it is and that we recognise it when we see it, but this can lead to difficulties. Cass Sunstein’s latest book includes: “people’s growing power to filter what they say, and others’ power to choose what we see” — but those are two very different kinds of power. Political scientists might wish for a taxonomy of power, but that’s confusing too. So today we want to interrogate what power is, so we’ll know it when we see it, without getting bogged down in definitions.

There’s nothing new about most of the kinds of power we’ll hear about today (except, perhaps, the session on algorithmic power). The same sessions could have been run 100 years ago and would still have been topical — Rockefeller, Hearst, Roosevelt. Taxonomies and rankings of power are perhaps not useful either — the types of power bleed into each other, both 100 years ago and today.

David’s weekly podcast, “Talking Politics”, will feature folks from today’s sessions.

John:

This event is designed to be different, to engender conversations with the audience and not just the panellists.

We want to find a way to get a grip on the nature of power, to help future research and work. Harry Coles, who worked on knowledge engineering, came up with a metaphor: the process of knowledge transfer is like the process of straining dumpling soup. The colander always catches the dumplings, and loses the soup. In the literature of power, you see nothing but dumplings. But we are interested in the soup. So: the audience has coloured post-it notes, and the conference will gather our thoughts after each session, to see if we have found insights that might otherwise be missed.


The first panel is on corporate power, introduced by Daniel Wilson.

Siva Vaidhyanathan:

Milton Friedman wrote about the role of corporations in social movements, such as women’s rights. There was a feeling at the time that corporations could contribute in direct ways to the culture of American politics in opposition to those movements. Friedman argued that having corps involved in politics might let politics corrupt the market, introducing concerns beyond the price and quality of goods and services. Reactions to the essay were interesting: folks on the left felt it was amoral, but libertarians were also flustered. Through the 1970s a vision for stronger corporate social responsibility emerged; the challenges facing society were more evident and the need for state action more obvious. Libertarians argued that corps should take seriously their responsibility to the environment, equal opportunities etc, as a reaction to core needs in society — a flip of Friedman’s concerns.

Some corporations today embody social concerns completely — eg Whole Foods, which takes a stakeholder approach to all corp decisions (community, labour etc). Starbucks has been involved in conversations about race and ethnicity, and education drives. Philips feels it can learn from NGOs and that they can learn from its engineers — a narrow approach, compared to Unilever’s broad global engagement around water, sanitation etc. BP tried to redefine itself entirely.

These are sincere attempts. If companies are doing well, they can boast of CSR. It takes two forms: (1) efforts to improve the world act like an extra wage (employees are happier overall) and like a service to consumers (we feel better about buying from these companies); (2) Valley-style CSR, more fully embodied in the organisation and expressed in the mission statement, eg Google’s “organising the world’s information” or Facebook’s informal “bringing people together”. This isn’t how they generate revenue, but it’s how the companies see themselves: their long-term vision.

So: Why is this their business? Why is organising info Google’s job, and not Cambridge University’s? Why is bringing people together not the job of libraries and mosques and parks? Why? Siva argues that it’s because our public institutions have failed so badly.

We have political trends that set such public institutions up for failure, and when they fail we turn to the private sector.

As a result, we’ve been invited to outsource judgement (about what’s true, relevant, what to trust, who to talk to, how to get from A to B) to these big corps. This is a dangerous trend.

But what sort of power is it? Siva doesn’t have an answer, and perhaps Google and Facebook don’t either. Perhaps there is no agenda beyond dominance for the sake of dominance.

Mystery!

Ellen Goodman:

Presenting work in progress, with Robert Brauneis.

In Anglo-American law, municipal corps (and charities etc) have the same status as private corps — they have personhood. There were corps for general government (cities, towns, etc) and corps for special government (companies, charities etc). Over time the rights and status of private companies have grown, whereas those of city corporations have diminished.

Cities are asked to handle more, with less, today. They are asked to move to data-based governance, smart cities, etc. Unless they are London or San Fran, with great local resources, they turn to private companies (for- or non-profit) for help on digital governance, and they do this without much power or many resources.

We see algorithms used in policing, child welfare, transit, education. Private corps are taking on a significant role in public administration, BUT the role is not spelled out in their charters (unlike in the 17th century). There’s a risk that what is smart about a smart city resides in a private corp vendor, and the city itself becomes hollowed out and dark.

The public cannot know what’s going on, as the only corp accountable to it, the municipal corp, doesn’t know and cannot find out. Private power captures public power.

Example: the PSA (Public Safety Assessment) algorithm used in courts, about pretrial disposition (risk of flight or of committing a crime whilst awaiting trial). In this case, the system is provided by a nonprofit, but it’s not open source. It’s owned by the Arnold Foundation: patent pending, all rights reserved. They say they don’t take factors like race, gender, or religion into consideration, so that’s a bit of transparency, but we don’t know what it means to be classed as high risk. The judges who use it maybe don’t know what it means either — we think they only have access to the open information. Because of the way the system is owned, there’s a risk of path dependency: the judicial system becomes dependent on the system, and fees might go up, or the algorithm might change… This is different from, say, IT equipment the city might procure, because it touches the basic public function of due process.

Algorithms of this kind produce both false positives and false negatives. You have to choose a ratio between them — eg treating false negatives as 2.6 times as costly as false positives. This is an important step.
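A minimal sketch of what that choice looks like in practice (the scores, labels, candidate thresholds, and cost function here are all made-up illustrations, not taken from any real pretrial tool):

```python
# Hypothetical illustration: picking a risk-score threshold when false
# negatives are weighted 2.6x as costly as false positives.

def expected_cost(threshold, scores, labels, fn_cost=2.6, fp_cost=1.0):
    """Total cost of classing everyone scoring >= threshold as high risk."""
    cost = 0.0
    for score, is_risky in zip(scores, labels):
        predicted_risky = score >= threshold
        if is_risky and not predicted_risky:    # false negative: missed risk
            cost += fn_cost
        elif not is_risky and predicted_risky:  # false positive: over-flagged
            cost += fp_cost
    return cost

# Toy data: risk scores in [0, 1] and whether the person was actually risky.
scores = [0.1, 0.3, 0.4, 0.6, 0.7, 0.9]
labels = [False, False, True, False, True, True]

# Choose the candidate threshold with the lowest total cost. Changing the
# 2.6 ratio moves the chosen threshold: that ratio is the policy choice.
best = min([0.2, 0.35, 0.5, 0.65, 0.8],
           key=lambda t: expected_cost(t, scores, labels))
```

The point is that the 2.6 figure is not a technical parameter but a value judgement: shrink it and the system flags fewer people; grow it and more people are detained pretrial.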

Democratic accountability around due process is important. Sometimes algorithms are substituting for government policies that were once more transparent, so we need to preserve the accountability we had in the past. We can also have enhanced accountability now: we can know more than we did when things were locked up in judges’ brains! And algorithms can make systemic errors — a second reason accountability matters. We also want better (fairer) algorithms, better governance (ethical and transparent by design), and checks on private power.

The researchers have filed 41 requests for information about these algorithms: 25 returned no documents, 5 hit confidentiality agreements with vendors, and 6 produced some documents.

Impediments to this work: open records acts may not apply to private contractors; trade secrets and NDAs, possibly used as an excuse; competence of record keeping; poor documentation; non-interpretability of machine learning systems.

Mireille Hildebrandt:

2 types of data-driven algorithms: (1) IFTTT-style imperative programming, deterministic and foreseeable (but to whom?); (2) machine learning, including supervised, reinforcement, and unsupervised learning. These are trained on datasets, based on hypotheses, and built on assumptions (for instance, if you train something to spot faces, it will spot faces anywhere, such as in an image of flowers). They will reconfigure and change.

Very important to understand the nonsense around this. Eg EU data protection law is about purpose limitation; some folks say this is out of date. But every ML operation is built on assumptions and incorporates tradeoffs: how quickly do I want results? How much will I spend on data? Where are we going? It’s complex!

Correlations and causality — these are complex areas. But the consultants helping governments with this stuff don’t say this.

We must learn to speak the language of trade offs. What do we know about the data we are learning? Did we use different datasets to train the same algs and what were the differences? Let’s get more specific about algos, and get to the point.

Who controls this ML? Corporates control access to the flow of communications in social networks, the nature of search, filtering of info, behaviour tracking. They are the gatekeepers of all kinds of information.

Look at the 1998 Brin and Page paper. In the PageRank explanation there’s a footnote that “this type of search should never come into commercial hands”. But the market we have forced things to happen as they did. Markets are created by private law, not by economic forces.
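For reference, the core of PageRank can be sketched as a short power iteration; the tiny link graph and damping factor below are illustrative assumptions, not anything from the paper itself:

```python
# Illustrative PageRank via power iteration on a hand-made three-page graph.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                    # dangling page: share evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# "c" is linked to by both "a" and "b", so it ends up ranked highest here.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

Running it on a real web graph only changes the size of `links`; the mechanics are the same.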

There’s a lot of nonsense around behavioural economics — often a hoax.

Algorithms have no power. They have “a mindless mind”. They have force, mechanistic or adaptive, but no intent etc. The power of algorithms sits with their users — the corps, in most cases — and they are hidden by trade secrets, IP rights, and the drive for return on investment. If you cannot test these systems, you cannot contest them. This is outrageous in the public sphere!

We need EU data protection law, and the requirement for profile transparency within it. This is unique worldwide: nowhere else has anything like it. It says that if there are automated decisions with an impact on people, you have to make sure they know it’s happening, and that they get meaningful information about the logic involved. There’s an opacity argument that the logic is impossible to explain — don’t buy that.

Law makes things simple — attributes liability — that’s why markets work!

Power and resistance.

Can we resist corporate power, by means of algorithmic forces?

Perhaps we HAVE to go for this. Not just train kids to read and write, which current democracy requires… but also train them in statistics, machine learning etc.

We must wire ourselves to speak law to power! Resist!


Discussion:

Nebulous nature of power, plus digital tech… something about that mix which makes the power and politics less visible.

The corps with these mission statements — folks within them often have no distance between their personal and professional views, they truly believe in the missions!

For these digital things in the public sphere, we need rights equivalent to Freedom of Info etc.

Sometimes we reduce corps to their leaders — Zuckerberg etc, as we used to think of Rockefeller and Hearst. But these corps are nominally owned by stockholders and shareholders, which depersonalised them. Have these digital corps changed the nature of what we mean by a company or corporation? There’s something here around sincerity and intent — how important is it to expose the true intent of these corps?

Legal personhood has a long and complex history. These corporate entities are, like algos, mindless minds! The corp structures inside today’s valley corps are maybe more personalised, focussed on charismatic leaders, than the systems are designed for.

Some of these valley corps have unusual governance, giving Zuckerberg, Page, Brin etc exceptional control; Starbucks too. It’s very different from a company in which common stockholders are a source of distributed, diffuse power, which makes them less governable in areas outside their core mission. In the 60s, shareholder revolts led by union pension funds or the Catholic church became such a mess that some companies have since clawed back power. Companies with more distributed ownership find it a lot harder to settle on an agenda outside the core revenue-generating functions, because it’s hard to get consensus. Facebook is not the same as Zuckerberg, but Facebook is a clear expression of Zuckerberg’s intent for the company; the directors have wide-ranging political views and activities, for instance (although Facebook follows Zuckerberg’s anti-Trump stance). Is this a change in the nature of corps? Perhaps not — they might not be typical corps, but they aren’t a radical new thing.

The public/private distinction is problematic. Private prisons in the US sit at the core of the state’s coercive power, yet are still held apart from transparency requirements.

Some of the process questions (eg did you test this algorithm on different data?) don’t necessarily generate what we think of as ‘records’. We may want to think about record-generation systems so that such things can be subject to public access.

Why don’t we have stronger FOI on these things? Because of political will (esp in USA).

As well as the obligation to document, we need an obligation to accept that premises can be entered and examined. This would defuse the trade secret argument, if a watchdog/supervisor can inspect.

Access to knowledge, and the way we communicate with each other worldwide, these are public utilities.

Need to think about innovation impact too. EU vs USA — different market constraints. Would we have Google search if we only had Europe?

We had enlightened despots once, and it seemed OK for a bit; then we said we wanted rule of law. If a company says “don’t do evil” that’s nice, but it’s up to us to decide what evil is, and what we want.

USA: after the last months/weeks, policy is impossible now. A measured debate with some agreed facts and debated intent — the norms we’ve operated under since Jeremy Bentham — is out of the question. Eg what meagre data protection there was in the US was stripped out before it even went into effect. So it’s hard to respond to suggestions that policy interventions might be possible in the USA, at least at federal level! Although, for optimism: at state and local levels, things are stepping up, particularly around FOI. This is starting to include the right to query a database, for instance, to get something that isn’t a preexisting record that just needs to be delivered to you.

What is the real agenda of GAFA? Despite the happy-clappy talk, the agenda is surely simply making money, which they are very good at! And maybe, once you have so much money, it becomes less of a priority even though you still want it; so you see other motivations surfacing.

These are not stories of people pulling themselves up from the margins of society — Brin, Page, Zuckerberg are from well-off, cosmopolitan backgrounds. Their values as expressed are liberal academic ones. Eg Google’s introduction of advertising wasn’t there upfront; it came later. Similarly Zuckerberg showed a lack of concern about getting to profitability. (Bezos is different — he’s all about making money.) Facebook and Google are special not just because of unmatched influence, but because they reflect this very odd privilege of not having to worry about money… money finds money, and there’s a lot of valley money. (Perhaps Twitter is a counter-example! They are not so good at money.)

Perhaps Zuckerberg is seeking the pure mathematical formula for human interaction and sociality? That would explain his motivation to hang on to all of that data. Here’s a key thing: Facebook doesn’t SELL your data. They sit on it. It’s their golden egg.

Cult of personality… is CSR the answer? CSR is nice and well-intentioned, but these little things, individual consumer choices, are sometimes self-indulgent expressions, and they also don’t make a lot of difference. If the whole of the US went to Priuses tomorrow, would it sort climate change? No; that takes hard and difficult policy change.

SoftBank — maybe more evil. No one understands what they do. Megalomaniac leader. That’s a scary one and we don’t discuss it!

In the smart cities discussion: big corporates such as SAP don’t just do the public service pieces but also run human resources, or even vendor management, around the smart city. It’s not just the due process aspects. The argument is made that the infrastructure this company provides is so large, we can’t do anything about it. Public officials don’t understand what’s in the data systems they buy — a Catch-22. They might buy bad systems; the people who sell the systems make a lot of money, and if a system doesn’t work they are the only ones who can fix it. And of course the government maybe doesn’t have much money.

If the govt can’t explain how a certain decision fits the legal framework (“computer says no”), it’s outrageous. You have to call out and challenge the decisions. The strong civil society in the USA might help here — that’s an area Europe is weaker on.

It’s not sexy to train civil servants in this stuff, eg how to tell what’s a real trade secret and what isn’t.

Example: ShotSpotter. The algorithm might be a trade secret, but where shots are fired is public info. A reporter looked into this, and a legal challenge made cities reveal locations etc. You can resolve this issue with legal force.

It’s not just about the power, but the institutions/processes through which power is exercised and challenged. We need the ability to contest, to argue about things that are vague. BUT one of the ideas of Zuckerberg, singularity believers etc is that language may be too vague and we should just have maths. Not sure our institutions know what to do about that, because they’re built around language; and yet this rise of maths is happening.

We don’t know how to run a republic in the absence of public trust in institutions. And yet Pew shows increasing trust in Google search results. Americans feel Google reads their minds well enough… that’s the challenge we face.

Dominance is part of our concern here. But it’s not an uncommon thing in markets. Don’t we have mechanisms to deal with this, in terms of regulation? We managed some regulation to tackle telco monopolies in the recent past. Ans: the US has no interest in competition law; Europe is only doing the consumer protection angle.

Google and Facebook don’t know what to do in the face of issues we now expect them, as dominant players, to tackle — eg hate speech. Can we not help them, maybe through regulation?

If we have these personality cults of valley leaders, appealing to their better nature is just flattering the tsar. One day, these leaders will die, Google will revert to being a regular company without the high minded ideals, and perhaps will be worse! (but perhaps these guys might live a very, very long time….)

Not power but domination: this is something important.

We’ll never agree about ethics, that’s the nature of ethics. But the law should create constraints, and incentive structures. Ensure companies are not forced out of the market if they act ethically.

Devil’s advocate: instead of stretching current regulation and law to this new, faster, internet regime, maybe we should be forcing our way in, going there?


Next up, States and power, chaired by Nora Ni Loideain.

Ross Anderson:

Entrepreneurship — starting new things — how does this relate to politics, not just tech?

We saw many social networking companies, and now there’s basically one (plus a few in protected markets). It’s because of the network effect, low marginal costs (so incumbents can squeeze out competitors), and platforms.

So: will we see similar things in social, religious and political entrepreneurship?

4 layer innovation stack:

  1. top layer: culture, values, norms
  2. ecosystem — bureaucracy + politics + businesses
  3. rules: organisations, contracts, laws
  4. individual actors

Time constants used to be very different. Culture typically changes over centuries, an ecosystem over decades, and a firm over years.

Ecosystem examples: the Warsaw Pact, Microsoft software, the EU.

Top layer examples: religions.

Crossing up to the next layer is hard. Very few people have set up top-layer things successfully! (How many people have founded successful religions?)

Organisations seek to hack ecosystems. Amazon changing mail order; uber, airbnb hacking regulated markets; etc

Ecosystems enable culture hacking. Slavery in Rome laid the ground for Christianity. TV led to the “Bowling Alone” hypothesis: people join things less these days. So: what will social media ecosystems give us?

We talk about polarisation emerging from social media, but is there real evidence for this? What will capable entrepreneurs use this ecosystem for in terms of hacking politics, business etc? Maybe we see this already in political campaign hacking.

Stuff used to bubble up; ideology diffused over time and evolved. Now people are more aggressively going after ecosystems — not just corps, but coalitions, FOSS projects etc.

Tech can empower new actors to do new things.

For decades now we’ve tried to turn our products into ecosystems and platforms, supporting others to innovate on them. PCs, phones, HTML, facebook….

Perhaps ideas ‘bubbling up’ are being replaced by hackable machine learning from crowdsourced data? We maybe assumed hacks would be KGB malware or man-in-the-middle attacks, BUT what we see is fake news etc.

It’s about soft power. As political debate moves from public spaces to private ones, the space owner may start to acquire real power (eg Fox TV, the printing press…).

The attractions of US/EU/Western lifestyles to young, educated folks around the world; and the attractions of radicalism to those losing out.

Warning! In the 1990s we thought the internet would bring people together, crypto would build trust, and there would be an online melting pot… BUT changing culture is hard, and even in the US, the blue and red states are still separate. And both can use social media (perhaps reflecting old Reformation or Enlightenment values, depending on origins).

Warning! It’s easy to forget folks outside the US/EU! There’s a growing crisis of frustrated expectations. Uganda had 1 technical college at independence and has 11 universities now; the students know how Westerners live, but there aren’t the jobs for them. They’re bumping up against limits on land, water etc. And insurgents around the world can use social media too.

So: the field is open for political entrepreneurship. The tools are out there, open to many people, including extreme players. We need a grown-up view of politics and economics, and more empirical data, so we can understand it.


Lawrence Quill:

On Silicon Valley’s new progressives. Has lived in the valley for 16-odd years, and really does find it’s a bubble.

[shows video — the 1984 Apple ad; interesting listening to propaganda speech!]

That view of tech power, compared to government overreach and inefficiency, resonates… eg with folks who remember the counterculture and cyberculture (late 60s, early 70s). The opposition illustrated in the commercial has confused us over the years as we think about companies and states.

These companies bring disruption to all parts of life. “Age of Optimists” by Greg Ferenstein (a free download) notes something peculiar going on in the valley, as old categories of politics seemed to break down: seemingly left-wing leaders believed in automation and charter schools, want to see people educated, and want government to be more like a tech company. An optimistic view: information is a good, and more info is better. A new progressivism.

The grand vision was one where “Government becomes Star Trek, and life becomes college”. “automated luxury communism”. No privacy, no regulation, no politics (global society).

And now there’s a shift, maybe, in attitudes, leading to books like “Too Big to Know”, “The Glass Cage”, and “The Internet Is Not the Answer”.

Valley companies also look like old-fashioned companies now, lobbying Washington (the top 5 valley companies outspend the 5 largest banks on lobbying by more than a factor of 2). A revolving door of think tanks and corp folks, etc. (Aside: the FCC may be subject to this too.)

The internet looks more like an addictive slot machine — or Skinner’s “variable ratio reinforcement”. (Ref: “Hooked” by Nir Eyal.)

And society is very unequal now, even just within the valley. And we see a leadership shift: it’s not “the only way is to do great work you love” any more (Jobs), it’s Musk becoming a cyborg.

So: now we have a new spin on an old idea. Silicon Valley looking at universal basic income. Government welfare seen as inefficient, paternalistic, etc. And we start to see people as future entrepreneurs…

UBI requires information about who gets payments (even if just whether they’re a citizen/resident or not). Maybe there’s some electronic transfer of money. There is, in any case, a register of people who get money. Even more info on people is needed if you want to offer top-up payments, for instance (the Finnish experiment is looking at this). So: UBI plus tech surveillance maybe addresses government bureaucracy, and maybe eliminates a lot of the state in fact. Of course this means tech companies get access to lots more info — eg immigration records.

Policy makers, esp in the US, like to turn to tech companies for solutions to seemingly intractable problems. Eg the Gates Foundation, which estimated education in the US was worth $9bn a couple of years ago, now $13bn, and is using things like biometric analytic pedagogy (bracelets to monitor individual learning).

It’s not Big Brother, but William Gibson — warring corporations of Neuromancer.


Ron Deibert:

From the Citizen Lab at Toronto — mixed methods around human rights issues; “the CSI of the internet”. Censorship tracking, privacy and security issues in mobile apps, documenting targeted digital attacks against civil society orgs.

Ahmed Mansoor, a human rights defender recently arrested in the UAE, received links in text messages. The links led to sophisticated spyware which would have embedded itself in his iPhone, taking advantage of unpatched vulnerabilities, turning on the camera and mic and capturing all of that plus location etc. It came from NSO Group, a secretive company in Israel — a typical spyware/cyber-warfare company of this type: “we only sell to governments”, “we follow Israeli and local laws”.

Unpatched iPhone vulnerabilities like this cost about $1m each because they are scarce, and the exploit above needed 3 of them. The Lab investigated in August 2016; a security patch came out shortly afterwards, affecting over 1bn devices worldwide (yay research impact!).

Mansoor had been targeted by other attacks using “lawful intercept” products — FinFisher in 2011, Hacking Team in 2012. That’s a lot of money spent on one dissident by one government.

NSO Group’s infrastructure was then seen targeting anti-obesity activists in Mexico. Sugar and obesity is a big issue in Mexico, and the beverage industry has government links — and NSO Group say they only sell to governments, so…

The Citizen Lab has published many reports on such attacks on civil society, with detailed research into how the command-and-control structures work, connecting malware to clients. A lot of countries use this stuff — mostly the world’s most notorious abusers of human rights.

Is there selection bias, because folks who think they are being attacked are the ones getting in touch with the Lab? They studied 10 NGOs — and it’s happening a lot.

Early internet days, there was an assumption that the internet would empower civil society. We’re seeing the opposite. A real problem and getting worse.

CSOs have low IT capacity, are resource-constrained, often don’t have policies and security practices, use insecure devices, face misinformation, and don’t find digital security threats salient. Attacks disrupt safety and trust, and the same groups that target industry and government also target CSOs.

And it’s getting worse. Connecting is cheap and easy; security is costly and hard. We are connecting faster than we can secure. Entanglement.

BUT: real electronic cyber warfare between states is very rare. There are high-level deterrents — states are mutually dependent on cyberspace.

HOWEVER covert action is on the rise…. so cyber espionage is huge and growing.

Democracy is in retreat; authoritarianism resurgent.

Post-Snowden there’s an ironic consequence: mass and targeted surveillance is normalised. Insecurity by design — backdoors etc — is prevalent.

Big data analytics are a form of social control, or social credit. (In China, the big companies like Alibaba and Tencent are talking about integrating their data into a system of social control and creditworthiness… it’s not here now, but it’s coming, and worthy of study.)


Discussion:

What is state power? [seems hard to answer!]

In the US, state power is weak compared to the European model. The new progressivism contrasts with the old progressivism, such as the New Deal — so many new things from the 1930s were dismantled after the war! Corporate expertise has played a big part in government/governance; winner-takes-all politics appeared post-war, and the tech companies have joined in, but wearing a halo of anti-Big-Brother which has led us not to notice.

It’s not new — railway barons fleeced people for decades, too. It’s the innovation cycle: 15 years to make a law (from opposition, to power, to law); 15 years to tackle Netscape/MS in Europe. So it will take 15 years to tackle Facebook, too. That’s the time constant.

State power, legitimacy, authority, etc: all vary country to country. In digital world, there was an assumption for a while that state power didn’t matter. But as the centre of gravity shifts to developing world, where the state has a different role, this is changing. The regulatory env is different for instance…

Crap regulation will keep low- and middle-income countries poor — tech regs, bank regs, whatever. (Eg data localisation laws in Africa force local data centres and distort costs. Tech may hinder development in weak states.)

The tech sector has an idea about the incapacity and inefficiency of the state. Eric Schmidt, visiting Cambridge a few years ago, had the idea that when everything was magically networked, authoritarianism would wither away. It seemed a crazy idea to students. This is why China is worth studying.

China has created a flexible system supporting internet innovation; although it’s all in the service of advancing party security, it’s also providing lots of nice apps etc. Many governments want this sort of thing.

Our tech carries Valley assumptions embedded in the tech (which may be a cultural phase — myopia — may not last).

A new thing: people now have, and maintain, friendships overseas. This is more universal — not just academics who travel the world. Over time this will become more widespread, may change attitudes?

Temporality around political institutions and authority: we think law-making is slow and forward-looking, while administration is more about the present. In the tech system, which is fast-moving and focussed on today, does that lend power to administrations/executives rather than the legislative arm?

People try to change things. GDS, Universal Credit etc — things don’t always take or last. That’s the pace of change time constant.

The Valley UBI idea perhaps echoes the model cotton factory experiments in the UK. Those didn’t turn out to be a solution for either the capitalists or society overall: cotton entrepreneurs needed a state to deal with education and epidemics; their world wasn’t enough. Will UBI be similar, a flash in the pan, with a big enough ‘crisis’ bringing the state back? The valley experiment is likely to run for a few years; maybe it will fail. Why are they doing it? Partly window dressing — people are sick of tech billionaires spouting platitudes while automation threatens jobs — and it’s a cheap intervention. BUT the valley idea of UBI is very different to that of Guy Standing from SOAS, who coined the term ‘precariat’ and calls for basic income plus other social welfare provisions. If the valley pull it off, it will be frightening: they don’t seem to care about inequality, but about unleashing the inner entrepreneur in folks (and success breeds success, money breeds money, so this leads to more inequality).

US was downgraded to “flawed democracy” by the Economist before the election!

If you want to see democracy and you believe in it, there’s got to be a vibrant civil society. We had an idea that tech empowered these CSOs, and there was a phase of that when transnational activism happened. We thought legacy governments eg in middle east were too slow to keep up — but actually these regimes were able to look at Arab Spring etc and were able to act, supported by companies like those mentioned.

It’s not USA vs China, it’s a more general crisis of democracy.

Do governments actually have less power than we think? They can’t get what they want off our phones! Answer: don’t mistake power for access. Russia’s SORM system sits in all the network junctions. Maybe it’s buggy, but it has an effect on people — it depresses free speech — and sometimes it’s very effective: the FSB can pull in the person they want. It’s not as simple as state power and authority.

VC funding — this is a mysterious world for many folks. PayPal money is backing the two big brain-interface projects (one being Musk’s). VC is also a legal/lobby force — eg the angel investor who was in the room with Airbnb talking about how they would rewrite the rules.

Today’s students really want to work for Uber, Apple, Facebook or maybe a Valley startup — it’s that or be the precariat.

Only the large companies and large blocs can make a difference in internet regs.

Today’s young people will see another FDR and New Deal. Eventually.

Democracy has faced crises before. But WeChat has 860m monthly users, and it’s full of surveillance. Are we suffering from hubris around democracy? That’s a lot of people coming through with a very different attitude, who don’t see democratic resilience as the only thing.


An interesting side note over lunch: we’ve sessions today on state, corporate, media and algorithmic power. Where are the people?


Next up: Media Power.

Sadly Emily Bell can’t be with us — check out her recent piece on Platform Press. Daniel Wilson is chairing.

We’ve perhaps forgotten Murdoch and Berlusconi — wasn’t that long ago that they were at the forefront of press power concerns. They are of course still around, even if eclipsed by Facebook etc.

Martin Moore:

(for broad intro check his book “Tech giants and civic power”)

Looking at media power and elections. Looking at Vote Leave’s invoices, there are lots of mentions of AggregateIQ. It’s hard to find out much about them online! They are a small company in Canada. Vote Leave spent more than half their official budget (£3m?) on AggregateIQ. The difference between old tech and new tech in elections is that new tech, whilst appearing transparent, is actually more opaque.

“It’s The Sun wot won it” — not any more. Now, it’s Facebook.

How did old media work with elections? Theoretically, news outlets gave us the information that would help us make a decision at the ballot. In practice it was more than that — private dinners and trips with key editors. Cameron and Brooks. Blair and Murdoch. Theresa May and Dacre today. These meetings led to some surprising endorsements, but mostly they served to avoid full-frontal opposition from the media.

Old media power was partly the power of the megaphone. Partly agenda setting — what are the issues of an election.

Tech platforms — you don’t get May having whiskey with Zuckerberg. You do get leaders going to valley campuses or speaking at Google events — but this is to look like they are part of the future, not to win support. Platforms don’t set the agenda. But they have power to drive attention; access to data and our social networks; ability to target us with precision; capacity to capture our response, and to do A/B testing on us, like lab rats. But it mostly seems transactional — candidate pays, uses platform to transmit messages. Any candidate can do this.

Where do campaigns spend money? The Conservatives went from £0 on Facebook in 2010 to lots in 2015. Dominic Cummings has said it was key.

Facebook has even boasted about driving turnout.

Despite the transactional appearance, the level of microtargeting with specific messages (and the fact that those messages are adapted, iterated and tested) makes it very different. The peer pressure, the endorsement of friends and family — this is all powerful. That the messages are private and outside most existing regulations, may be false, and are often transient — this is all opaque. So campaigns can be exceptionally effective, and can bypass regulations.
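The adapt-iterate-test loop described above can be sketched as a toy A/B test. Everything here — the variant names, the click-through rates, the 50/50 split — is invented for illustration; it is not how any real platform’s testing works:

```python
# Toy sketch of a campaign message A/B test: show two ad variants,
# record responses, keep the winner. All rates are fabricated.
import random

random.seed(0)  # deterministic for the example

def run_ab_test(rate_a, rate_b, n=10_000):
    """Simulate n impressions split randomly between variants A and B,
    returning the observed response rate for each variant."""
    clicks = {"A": 0, "B": 0}
    shown = {"A": 0, "B": 0}
    for _ in range(n):
        variant = random.choice("AB")
        shown[variant] += 1
        rate = rate_a if variant == "A" else rate_b
        if random.random() < rate:
            clicks[variant] += 1
    return {v: clicks[v] / shown[v] for v in "AB"}

# The campaign keeps whichever message performs better, then tests again:
observed = run_ab_test(rate_a=0.02, rate_b=0.03)
winner = max(observed, key=observed.get)
print(winner, observed)
```

In a real campaign the “click” would be any measurable response, and the losing variant would be replaced by a fresh challenger, round after round — which is what makes the iteration both effective and opaque to outsiders.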

Newspapers have the power of a megaphone; platforms have the power of a whisper.

So — why AggregateIQ? Why a tiny company in Canada for Vote Leave? Are they very effective? It’s hard to tell, as there aren’t many endorsements etc online. The darkness and secrecy is a challenge. Even with sunlight there would be a difficulty: the complexity of so many tiny things is hard to assess.

Platforms do not have a positive power to choose who to support or oppose. But they have the power to deny. They can deny services, or make the services less accessible, affordable or effective. This could be the difference between a candidate winning and losing.


John Naughton:

Media power is about information. What’s visible and hidden. It’s essential to democracy. We are on the beginning of the learning curve, figuring out the effects of this sort of network power.

Power bleeds — corp power to algo power to something else…. same in this field. Whilst the platforms are on one level tech companies, they are also other things too…. will return to this.

Most of the discussion about media power in the internet sphere has tended to focus on centralised power — that exercised by identifiable entities. But what we’ve discovered in the last couple of years is that there’s also some decentralised power. The analytical frameworks we have for addressing centralised power from obvious entities do apply to platforms.

The other centralised entity is the state. Thinking about corporate media power, Steven Lukes’s book “Power: A Radical View” proposes three flavours of power:

1. stop people doing what you don’t want them to do
2. force them to do something they don’t want to do
3. shape the way they think

It’s that last type of power that’s relevant to media. Eg, ask people: what proportion of the population is immigrant? What % of teen girls give birth? What % of benefits is claimed fraudulently? You get a representative sample of what people think and compare it to the real facts. People have incorrect perceptions — these are voters, and they are wrong about important issues. Where do they get these misconceptions? Mostly the tabloid press — historically, a europhobic tabloid press. Our current Foreign Secretary wrote the old story about the EC requiring bananas to be straight, in our ‘large’ tabloid, the Telegraph.

Internet companies are becoming indistinguishable from Standard Oil. They behave like normal large public corporations. We can see this in the amount of old-style political lobbying (there was a time when internet companies disdained this), in the exercise of monopoly control over the markets they dominate, and in their role as outsourced contractors in the public sector.

In the newspaper business you can see the sharp end of these internet companies — eg the “jews are…” Google autocomplete scandal. It was a widely read piece, and Google responded with a series of heavy-duty lawyers’ letters, complaints about the author, and a general expression of distaste towards columnists who aren’t sufficiently reverent. BP, say, would behave just the same.

The companies may have a free ride in our imagination as progressive etc, but if you come up against them, you experience the hard end of corp power.

There’s also the way in which these platforms have become, essentially, publishers. When Google and Facebook respond to the storm of protest about fake news etc, they don’t want the editorial responsibility.

Stanley Baldwin’s “prerogative of the harlot” — power without responsibility. And this was the basis of Emily’s report on the Platform Press. These tech companies have become publishers in a short space of time, and are confused about their responsibilities.

There is however a new kind of power which we don’t know how to conceptualise. This is decentralised media power. Arises from the affordances of digital technology.

We have begun to map this — eg post Trump campaign.

There’s a huge alt right ecosystem, now used for overtly political purposes, globally. It’s not just fake news but conspiracy theories. There doesn’t seem to be the same kind of infrastructure for left wing / liberal — certainly not as big, powerful, longstanding.

Why is this? These people feel that for 20y they have been systematically excluded from mainstream media, TV and papers, but they knew how to use the internet, and have been doing so. Building considerable reach and power.

Fake news hysteria — often badly defined “fake news”, and “post truth” concepts. Britain has been post truth since the tabloids started.

but something has changed. The use of social media, youtube etc, has loosed some different kind of power. We know about platform power and press power. but we don’t have a handle on this new kind of power. We need to know more about it — kinds of content, motivations, ways the content is disseminated, and impact on individuals and groups.


Discussion:

The old media — the Sun and the Daily Mail — are often framed as negative power. It’s not that campaigns were frightened of them; but dealing with them when they come after you is so exhausting and draining that it’s worth avoiding. So politicians wouldn’t necessarily do what the tabloids asked for, but they would strive to outlast the hue and cry.

The negative power of networks — AggregateIQ or Cambridge Analytica — is different. Current thinking is: it’s not ‘get out the vote’ but ‘suppress the vote’ — discourage voting for the other side.

Is new media replacing old media? Not clear. New media also amplifies old media (cf the Gillian Tett piece on tech power). Maybe there’s a time lag, and the political power of tech companies is still growing. Old media aren’t just reacting to pressure from tech corporations as they would from any other corporation; they are also thinking: will this make us move down the rankings? There are additional levers that can be pulled here — tech companies influencing old media. And there are interesting areas that haven’t emerged yet, eg immigration.

No one really wonders what Facebook thinks about something, today. If you are old media on the receiving end of tech-corp pressure, it’s like being a mouse played with by a cat. On the day the Google autocomplete story came out, the Observer’s paper copies sold with a wrap-around Google ad. The power asymmetry can be felt today. (Different from the political worry about the Daily Mail etc.)

Trump exploited the feeding cycle between old and new media — a 4am tweet altering the headlines in the NY Times the next day.

This is different from old-style papers causing a scandal: if a paper didn’t like a government and went after it, after two weeks you’d have a minister resigning. Whereas now everything is a scandal, outrage all the time — has this undermined old media power? Scandal was old media’s primary weapon: they could mire a government in it.

Facebook launched Instant Articles — put your content on the FB platform, people will be able to see it more easily, and you get a revenue share. Hang on — haven’t we been here before? Shoving stuff onto a platform and hoping reach will be its own reward? What it actually does is force dependency on FB: you need FB’s analytics, they steer the content, etc.

On the alt-right decentralised stuff: see the recent New Yorker article by Jane Mayer about Robert Mercer and Breitbart. The alt-right folks have an old-fashioned billionaire backer whose daughter is calling the shots — surely this is old and familiar? BUT Bannon and co are late arrivals to this — the network was already there. One hypothesis: Bannon and co arrived, and they saw a political use for the network.

Funding for the Media Research Centre exists because old media are weak, especially financially, and that makes them vulnerable to folks coming to them with nice-seeming stories that they want to use. It means old media can be exploited.

Many platforms free-ride on content creators. This reduces funds for journalism and quality news. If Emily were here, she’d say this is the “Spotlight” film story — about the Boston Globe. Platforms perhaps assume everything goes grassroots and we all report our own things; but that doesn’t support investigation.

Are we too optimistic about the power of the algorithm? Amazon’s suggested-purchases list is really poor — it’s stuff I already read. Should I worry, when such incompetent manipulation wouldn’t succeed with me? Amazon have lots of money, so why are the recommendations so poor? [Presumably the suggestion is that Amazon show intentionally poor recommendations so as not to scare people with their actual capabilities.] There’s certainly a risk of eg Cambridge Analytica overhyping their abilities. But the real issue is the remarkable degree of opacity. We read about 61m people and “I voted” buttons, and about Facebook’s A/B testing — it sounds like there’s a lot of power, but WE can’t test them, we can’t get access, we often can’t even see the claims, let alone the raw data.

What about the problem of so much news — that it’s hard to spot the signal in the news? how can you tell what is true, especially around big issues such as role of Russia in US election?

The middle is falling out of things, especially in media. Things are more cohesive and coherent away from the centre, and the further from the centre you are, the stronger the adhesion. However, we are seeing some countermovements — new content publishers; new travel agents opening in Cambridge (!).

1996 — Craigslist appeared in San Francisco: a loose, shabby web production. But newspapers in the area discovered their classified-ad revenue disappearing. The net does what it has always done — dissolves value chains. A newspaper is a value chain: advertising supports journalism, as long as revenue exceeds costs. Everyone thought the net was about news, but it was really about ads. Oddly, the newspaper industry didn’t spot this :)

Journalism is expensive. Good journalism is very expensive. The business model was always a bit dodgy, if you looked at the accounting. And now we are at risk of losing investigative journalism which is important and very costly. We need to find new ways to support that. It’s the Spotlight question. For a while, best investigative journalism was done by local monopoly independent TV companies. Solid revenues connect with good journalism.

Trump didn’t have much of a data team — just Cambridge Analytica. There was a sense earlier on that maybe we’d seen the end of this data stuff, because CA hadn’t done well for Cruz etc — there was no evidence behavioural advertising works.

“Decentralised” — we need to be cautious about what it means and how we use the term. There was a sense of a scale: centralised — decentralised — distributed. But we see Uber as decentralised. It’s about the extent to which all the hubs are connected to the mothership. Insidious media power: we use silicon exobrains in our pockets, integrated closely with us, and the links back to the mothership are invisible. Networked vs decentralised…

Is UK democracy more vulnerable to media power than other democracies? (Let’s wait and see with the Dutch and French elections! Is it first-past-the-post that is so vulnerable to today’s pressures?) The UK is a bit special — all those UKIP voters in 2015 got one MP; no wonder they feel disenfranchised. 100k votes in the Rust Belt changes the world; 100k votes in the Netherlands, not much difference…

Dominic Cummings claims there were ‘really on the fence’ folks who would be swung by the last voice they heard, whom he targeted at the very end, in the last 48 hours of the Brexit vote. We don’t know if that worked, but it sounds interesting and would be useful to test and understand. (And DC admits it should not have worked…) The other thing is the difficulty of tracking spend etc on these platforms, which matters for governance.

The ad industry is increasingly aware that it does not itself know whether it is getting value from Facebook etc. A huge industry with little evidence. There is a research job to be done there.

The real puzzle is: is what is happening genuinely tectonic? Are we living through something that will transform democracies around Europe and the world? or not? The problem is you have to make a bet. And the bet is either — this is alarmist — or — this is really big. You don’t have the luxury to wait…

Biggest electoral weirdness in recent times: Uttar Pradesh, where Modi won a landslide after the hugely disruptive cash-note withdrawal damaged the whole economy.

The Observer advertising example around the autocomplete story — this is how ads are supposed to work, totally independent of the stories :) Cf the current #stopfundinghate campaigns, and the advertisers withdrawing from YouTube because of placement near extremist content. Are we struggling because we cannot see how these things could be separated?

Investigative journalism will continue, funded in other ways — eg ProPublica. The fundamental issue is day-to-day reporting — the boring stuff, like attending council meetings and checking parliamentary activity. Example: the M4 exit to Port Talbot being blocked had been discussed at the Welsh Assembly but not reported on — the first people knew of it was the bollards! We need this dull local reporting, and the campaigning power that arises from it.

Transparency is too easy an answer; it sometimes reduces people’s belief in the competence of individuals. Journalistic authority is really about institutions, especially if you are tackling serious abuses of power. Think of Rusbridger when MI6 and GCHQ arrived after the Snowden stories — the sense of menace. Rusbridger’s resolve came from institutional power; no individual could do that. (Cf Watergate.)

The whole advertising, surveillance-capitalism model crucially depends on how we use our phones. If we don’t keep increasing how often we use them, the model of revenue and growth will break. We already use our phones 150 times a day.

What’s news, vs what’s facts? Old media used to say: we make the news. Now, who defines these two? (Cf fake news.) Sometimes there’s complete and malicious fabrication of stories that circulate — and the more fake, the quicker they circulate. Often this is for commercial gain, eg Macedonian teens making money. In the pre-internet era, newspapers had more authority and could define what was news; but that world has gone. Newspapers need to understand that the platforms really don’t need them. Facebook have looked at the share of real/old news on the platform — maybe 5% of content? It’s tiny!

What’s news to an individual — news vs value? Do you want the pic of your daughter in school play or the result of the Dutch election? So we have to approach from a different perspective, not about value. It’s about a civic function. Are the platforms performing these functions that we need as a democracy, and if not, what should we do about it?

The power of Facebook to direct your message is amazing. Once you’ve advertised there you’ll never go back! The 98 or so targeting criteria — so detailed.

In this sort of debate we go to extremes. Newspaper revenues are dropping. But 50% of people are not on Facebook, either.

Obama used data and we thought it was cool. Trump used data and we think it’s sinister. Double standards!


Final session: algorithmic power!

Chaired by Julia Powles. The hope is to touch on different perspectives of power, and powerlessness, in this session.

Seda Gürses:

Why are we so concerned with algos and data today? Who benefits from us, here, talking about this stuff?

Presenting one account of how software is produced — because maybe we’ve focussed too much on consumption. And production and consumption have maybe collapsed into each other.

Cloud: an economic and colonial device.

In 1968, many software projects failed. The question then: is software engineering a managerial task, or something needing expertise?

The pendulum: we had timesharing mainframes in the 70s, PCs in the 80s, and now the cloud — it’s like timesharing again.

[gif] The Windows 95 launch — Ballmer — developers, developers, developers! The high point of shrinkwrapped software: software that came in a box. A transactional moment in the shop, and that was it.

Now we have services, running on the servers of a company. You don’t get to party any more like you did when you shipped a box every two or three years…

There are various advantages to services: ongoing revenue, simplicity, you get all the data about use, and so on. The service model enables agile and Extreme Programming — fast iterations, simple things, test all the time, review all the time. You can pull in other services to support the main service; it’s all integrated, which is nice for a software provider.

Now you can do data-centric software development. We’ve focussed on the ad industry as the problem, BUT software engineers today see the ongoing data as the motivator! They need all the services, they need the tracking to develop stuff, and then they have to check that the tracking itself is working => this is the new model of software development. Data-centric! A “pull” from the cloud.
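A minimal sketch of what “the tracking is part of development” might look like in practice — `track_usage` and the in-memory `EVENTS` list here are hypothetical stand-ins for a real analytics pipeline:

```python
# Sketch of the data-centric loop: every feature call emits a usage
# event, and the team iterates on whatever the events say.
import time
from collections import Counter
from functools import wraps

EVENTS = []  # in a real service this would stream to an analytics backend

def track_usage(feature_name):
    """Decorator: record an event each time a feature is used."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            EVENTS.append({"feature": feature_name, "ts": time.time()})
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@track_usage("search")
def search(query):
    return f"results for {query!r}"

@track_usage("share")
def share(item):
    return f"shared {item!r}"

# Simulated traffic...
for q in ["yoga mats", "news", "weather"]:
    search(q)
share("article")

# ...and the question the engineers then ask of the data:
usage = Counter(e["feature"] for e in EVENTS)
print(usage.most_common())  # which features earn the next iteration?
```

The point of the sketch: the tracking isn’t bolted on for advertisers — it is the feedback channel the development process itself runs on.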

Cf Martha Poon, who looks at the cloud as an investment object. Capital flows have been seeking a new home since the 2008 crisis, and are finding one in the cloud. Microsoft is pivoting to cloud by borrowing from financial markets. This is a “push” to the cloud.

Software has an impact on individuals, and on institutions. In this case, the impact is companies adapting to agility — rapid, mobile, flexible. You have to do this to be part of the internet age.

Let’s think of ways to deal with this.

The cloud is a promise, with some tangible reality, but not inevitable. Where are the promises made? To financial markets and to consumers. We need to engage with the cloud and its political economy.

Need to better understand the processes which drive datafication and algos.

How does the cloud, with its ideology of agility and cheap processing, affect institutions? Algos and data can abstract away from people, companies etc — but we cannot afford to abstract these things away. If the cloud works, it’s infrastructure with transnational ambitions of control. If it fails, it will be a 2008-style crash! Who would be affected by either the success or the failure?

We cannot limit questions to the cloud’s impact on fundamental rights — we must also look at fundamental institutions.

Is cloud, and related production models, a better way to go? Is it all snake oil? Are things as efficient as they claim, in the cloud? Is it all held together by duct tape?

How does cloud + related software engineering affect science? Privacy? transparency? These are legal, social, and technical questions.


Malte Ziewitz:

There’s a confusion about algorithms. A few years ago we talked about computers, networks, devices. Now we talk about algos, and it’s mysterious!

All these claims about algos — making decisions, governing, shaping our lives. But they are inscrutable. Many scholars are thinking about this. Should we focus on the original definitions in maths or computing? Or a common-sense definition? Or a refined conception — as material history, as power through the algorithm, as assemblage? Everyone has a field day in this confusion :)

On to his own attempts to understand as an ethnographer.

Let’s search for yoga mats on Google. “The warzone of the 10 blue links” — highly contested [if you have a lot of yoga mats to sell].

Ethnographers join the tribe and examine — but it’s hard to do that here; it’s hard even to find the email address of someone at Google on Google! How do you connect with someone from the search quality team? Responses come with NDAs — basically you can’t share anything, maybe not even the existence of the response. So it’s rather hard to study. In-house ethnographers at Google — also a dead end.

Secrecy is no surprise! This is common in tech. Google say “For something used so often by so many people, surprisingly little is known… Our fault, by design… details of ranking algos are google’s crown jewels”

But there’s also a claim from Google that “it’s all online” — eg the 400+ posts on the Google blog, the Google Webmaster blog etc. There are forums with 50k+ posts, and support desks and conferences…

So: what does transparency conceal?

Confusion got worse when interviewing search optimisation consultants. “if you try and solve the riddle, if you try to come up with your own mathematical version of google, you’re dead, you’re dead in the water. you can’t do it, you know, nobody at google could do it. Larry and Sergey would not be able to sit there and say, here is my algorithm, you know, there’s too much stuff in there” Ha! Unknowable! Not amenable to epistemic practices!

So: use the topic/resource distinction to get round this. Algorithms as a topic of empirical study, and as a resource mobilized for organizational purposes (Lynch 2002).

So, how do algos figure in the everyday work of SEO consultants, search reps, users, databases, computers etc?

There are many great stories in there! [his publications]

The advice literature alone — eg SEO for Dummies — is fascinating in itself. Black-hat SEO experts; celebrity SEO speakers…

Maybe not just algorithmic power, but power after algorithms. In our enquiries, maybe we lose some of the phenomena we started out with? That can be fine — it gets you to another interesting area.

Among SEO people, the algo is both a topic and a resource of enquiry.

Rethink transparency, accountability, participation as practices of power in their own right.

eg what does transparency conceal? secrecy, revelation and concealment as practices of power.

eg How do algos become “accountabilia”? — tackling ‘computer says no’.

eg how do people distinguish between ‘ethical’ participation and ‘unethical’ manipulation? (eg in search optimisation). A huge task

eg what is it to ‘resist’ a search engine results page? (SEO people are contributing, but also resisting?)

=> Power is an upshot of our practices and theories of power, both lay and professional.


Ariel Ezrachi:

Competition, markets, trade.

There is an assumption that the internet is a blessing when it comes to competition: endless choice, the ability to reduce costs to close to zero, etc.

His book: “Virtual Competition”.

Competition — the invisible hand — is being displaced by a digitalised hand.

What you see online has very little to do with the ideas we have of market power, market dynamics, etc. Everything is artificial.

It looks like a regular market, with apples or fish. But because it’s all monitored, it’s not like that at all.

What you see online is not a reflection of the market. You see “The Truman Show” — a reality designed just for you, a controlled ecosystem. And since providers aren’t getting money from you directly, there’s personalised and dynamic pricing. Dynamic pricing: prices change in milliseconds; it’s not like the apple market we visit. Personalised pricing: even if we use ad blockers and tracker blockers, personalised pricing is still possible. They don’t need much info these days — maybe just the time, your location, and how you got to the store. If you reach the shopfront via Google, the price will be lower than if you came direct. Leave a basket on any website and you’ll likely get a discount popup or emails — it’s framed as a discount, and you like that, but really it’s discriminatory pricing: they’ve realised they overestimated your willingness to pay. MacBook users will pay more than PC users — we are signalling that we value quality and brands. Rich postcodes pay more.
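As a toy illustration of the signals described above (referrer, device, postcode), a personalised-pricing rule might look like the sketch below — the function name and every multiplier are invented, not drawn from any real shop:

```python
# Invented discriminatory-pricing rule using the three signals from
# the talk: how you reached the store, your device, and your postcode.

def personalised_price(base, referrer, device, wealthy_postcode):
    price = base
    if referrer == "google":      # comparison shoppers see a lower price
        price *= 0.90
    if device == "macbook":       # signals valuing quality and brands
        price *= 1.15
    if wealthy_postcode:          # rich postcodes pay more
        price *= 1.10
    return round(price, 2)

# Same product, two shoppers:
print(personalised_price(100, referrer="google", device="pc",
                         wealthy_postcode=False))   # 90.0
print(personalised_price(100, referrer="direct", device="macbook",
                         wealthy_postcode=True))    # 126.5
```

Neither shopper ever sees the other’s price — which is exactly the opacity the talk is pointing at.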

It’s asymmetric info, being abused, unfair. The house always wins.

Competition agencies — economists say, distribution of profits isn’t our problem.

The simplest remedy: what if you always saw a note saying “this £3 price was special for you; the average price paid today was £6”? Fairness as a limit on behavioural discrimination. You don’t need to change the price — you can achieve change through labelling.
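The labelling remedy is easy to sketch; the function name and wording here are invented:

```python
# Disclosure without price regulation: show the shopper how their
# personalised price compares to today's average.

def fairness_label(personal_price, average_price):
    return (f"This £{personal_price:.2f} price was set for you; "
            f"the average price paid today was £{average_price:.2f}.")

print(fairness_label(3, 6))
# This £3.00 price was set for you; the average price paid today was £6.00.
```

The design point is that the seller’s pricing freedom is untouched — the pressure comes entirely from the buyer seeing the discrimination.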

The future: the digital butler — Siri, Alexa, etc. We will use these more, and from a competition perspective this is a significant move. The first one in your home is the gatekeeper, the platform which will start learning; the switching cost will be immense; the knowledge it gains is huge. When you say “Alexa, buy me an air ticket”, Alexa knows ALL the stuff about you. If Alexa tries to sell you Nike because you look like you want Nike, how much can Alexa demand from Reebok for the access to offer you a discount?

The Dash button bypasses the market entirely. You press it; there are no deals — maybe no market price at all!

Dynamic-pricing companies don’t see a “market price” — only a base price.

In a shop, you pay the listed price, or less if you haggle.

Online, you generally pay the base price or more (although wealthy people get offered bigger discounts, on the assumption that they can be squeezed more later).

The Purist Butler: conceal your identity. BUT this doesn’t work, because the purist butler — a state-owned Google/Bing — can’t offer you the benefits you get from the commercial one. It’s less good, it infringes IP, whatever. It’s a bundling issue: take it or go away, and so all the power is with the platform.

The big get bigger — particularly Apple and Google: they’re in your pocket. “Super platforms” — gateways to captive apps.

(Facebook is a super platform too, but the big risk they admit to shareholders is being undermined by Apple or Google.)

It’s the rise of a new type of power, with really valuable network effects. When the big get bigger, any potential entrant can be undermined.

Where is IBM? Where is Microsoft? They are, really, nowhere. “Competition is for losers,” says Peter Thiel. Winner takes all — once you win, that’s it; a competitor can at best be number two. [Cf The Economist, 17 Sep 2016, on the superstar company.]

Defences: network effects, M&A (buy the disruptors before they even know they are disruptors), interconnectivity, deep pockets.

A regulator may know less than the company being acquired about whether there’s a competition issue!

The Facebook and WhatsApp merger was approved on the basis that the data would stay separate. Now the data is mixed unless you opt out on WhatsApp — so the competition situation has to be revisited.

It’s the end of competition as we know it.

The tools we use for competition and consumer protection must be amended.

In markets continually manipulated by bots and algorithms, is competitive pricing an illusion?

As power shifts to the hands of the few, what are the risks to our economic and overall well-being?

If it’s a Truman Show for pricing, what happens when we get a Truman Show for ideals?


Discussion:

Organic search results — the apparent purity of the 10 blue links — already not really pure. There’s something about a list — ordering, how people use them, what does pure result even mean? What’s a good results page? Good for whom? How do we establish that, test for it? What about user experience? that’s a highly political concept! UX for whom?

There’s this idea that intentions can be read from user behaviour. Companies say the indicators they gather tell them more about users than the users themselves know — creating a sort of hierarchy between inferred intention and what you actually do. It’s very important that we show that this hierarchy is NOT TRUE — users sometimes do things they don’t want.

Eg a London consumer using a VPN to present as being somewhere else when buying travel products.

“Surveillance interface” is a better name than “butler”. Trying to persuade tier-2 companies to change the landscape, but they say: it’s how it is, tier 1 (GAFA) will dominate and there’s nothing we can do. Hey ho. (These are billion-dollar companies resigned to being tier 2.)

Algorithmic accountability — it’s Schrödinger’s cat. Think of the PageRank algorithm: if you see the algorithm, it’s dead instantly, because then it will have to change.

Talking to a Bank of England economist, who could not grasp that there could be market problems for consumers where consumers were not being harmed. The Chicago school just can’t grasp this stuff. Intellectual and regulatory capture is real!

Algorithmic transparency is definitely dead — too complex with ML etc. Even if there is the will and the capacity, it’s not always possible… we should be suspicious of any price determined by a machine.

Flashlight app — we assume the model is ads, if we thought about it at all. But it was a tracker. Now, p16 or whatever of the T&Cs admits it’s a tracker… The “Disconnect” app sounds an alarm when anything starts looking at your contacts — it was delisted from the Google app store.

Tacit collusion is legal, even in Europe. Three sellers; one raises its prices. We expect the prices to come down again eventually when that seller loses business — but for a while, maybe I can increase mine too, and so prices rise. You can’t punish it, because it’s rational behaviour. In a real market it makes sense; with high-frequency trading at scale it’s a problem.
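A toy sketch of how that ratchet can happen without any explicit agreement (the matching rule and all numbers here are hypothetical, not any real seller’s pricing algorithm): if each bot simply matches the highest observed competitor price, one unilateral rise pulls the whole market up.

```python
# Hypothetical tacit-collusion dynamic: each pricing bot matches the
# highest competitor price it observes. No seller communicates with any
# other, yet a single unilateral rise ratchets every price upward.

def step(prices):
    """Each seller sets its next price to the current market maximum."""
    top = max(prices)
    return [top for _ in prices]

prices = [10.0, 10.0, 10.0]
prices[0] = 12.0            # one seller unilaterally raises its price
for _ in range(3):
    prices = step(prices)

print(prices)               # all three sellers end up at 12.0
```

The point of the sketch is that each individual move is locally rational (never be the cheapest for no reason), which is exactly why competition law struggles to punish the aggregate outcome.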

Hierarchies of care around testing and impact. Are the companies doing reasonable things? Hard to tell, because of the complexity of getting access to the data which is used.

We’ve talked about consumer harm, but what about citizen harm?

Can you talk about algorithms without talking about software or data? Is it just corporate power that has engulfed our lives?

Do we need new analogies? Is an algorithm more like legislation than a mechanical machine? Reading the legal text wouldn’t explain everything, so looking at the algorithm wouldn’t either…

More slippery terms — transparency, accountability — these are all amorphous terms… Elevator concept (you can get off the elevator whenever you want)

Maybe we don’t need another metaphor, but rather to get to the specifics — of devices, of who does what where — and make it everyday.

Scale. If you have 40 developers serving 5,000 school districts, you can’t even manage one call from each district a day! Scale drives how companies respond to local needs: if there’s an issue of some kind, it’s hard to integrate local needs if doing so takes any human effort. If you critique the algorithm, data-driven companies actually love that — making it better is what they do. But if you critique the system, the process, the companies are more likely to push back. However: we KNOW you have to look at more than the algorithm. If an algorithm slowly replaces doctors, you are implicitly changing due process, and that is worthy of challenge.

Promising ideas:

  • Robust assertions of community or individual access rights to data
  • The push and pull of the cloud — looking at the economic and colonial moves there, and checking they are not setting us up for catastrophe
  • A strong turn in competition enforcement

Wrap up from John Naughton

Thanks to everyone. Some of us have been bad and have not written insights on our post-it notes.