What AI teaches us about organisational culture
One of the fascinating features of artificial intelligence is how much it tells us about ourselves, but it is the way we train AIs in rules-based systems that can teach us the most about organisational culture.
Gaming the system
Victoria Krakovna, a research scientist at DeepMind, has put together a master list of AI “specification gaming” examples — AI training experiments gone “wrong” because the AI gamed the system it was supposed to learn from and evolve within. Here are a few examples:
• Creatures bred for speed grow really tall and generate high velocities by falling over.
• In an artificial life simulation where survival required energy but giving birth had no energy cost, one species evolved a sedentary lifestyle that consisted mostly of mating in order to produce new children which could be eaten (or used as mates to produce more edible children).
• AI trained to classify skin lesions as potentially cancerous learns that lesions photographed next to a ruler are more likely to be malignant.
Several involve exploiting bugs in the code of the systems they are operating in. Others simply exploit “common sense” boundaries, such as pausing the game indefinitely or killing themselves repeatedly to avoid losing.
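The dynamic behind these examples can be sketched in a few lines of code. The sketch below is a hypothetical toy, not any of the actual experiments on Krakovna’s list: a greedy optimiser is scored on a proxy measure (peak velocity of any body part) rather than the intended goal (skilful locomotion), and promptly discovers that growing tall and falling over pays better than learning to run.

```python
def proxy_reward(height, gait_skill):
    """Proxy actually measured: peak velocity of any body part.
    Intended goal: skilful locomotion (gait_skill).
    A tall creature toppling over registers a peak velocity
    proportional to its height, with no locomotion at all."""
    running_speed = gait_skill        # the honest way to score
    falling_speed = 0.8 * height      # the loophole
    return max(running_speed, falling_speed)

def hill_climb(steps=100):
    """Greedy search: each step keeps whichever small change to the
    body plan scores best under the proxy reward (ties go to the
    first candidate listed)."""
    height, gait = 1.0, 1.0
    for _ in range(steps):
        candidates = [
            (height + 0.1, gait),                  # grow taller
            (max(0.1, height - 0.1), gait),        # shrink
            (height, min(2.0, gait + 0.1)),        # learn to run (skill is capped)
            (height, max(0.0, gait - 0.1)),        # forget how to run
        ]
        height, gait = max(candidates, key=lambda c: proxy_reward(*c))
    return height, gait

height, gait = hill_climb()
# Gait skill hits its cap of 2.0 almost immediately; from then on the
# only way to raise the proxy reward is to grow taller and fall harder.
```

Because honest skill is capped and hard-won while height is unbounded, the optimiser ratchets height upward as soon as falling over scores at least as well as running. It is the same dynamic as an employee optimising a KPI rather than the work the KPI was meant to measure.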
On the surface, these failures appear to show how dumb and non-human AI can be, but what they really show is the relationship between reward incentives, behaviour and the perception of rules. It’s an eerie microcosm of what business culture has become in many large organisations.
It’s not what you do, it’s the way that you play it
From Enron and the Global Financial Crisis to the VW diesel emissions scandal and many others, we frequently see companies full of smart people acting stupidly. Mats Alvesson and André Spicer describe this as “functional stupidity” in their book The Stupidity Paradox: The Power and Pitfalls of Functional Stupidity at Work:
[M]any firms actively encourage employees not to exert their intelligence overmuch. They push smart people into dumb jobs, swamp staff with information, enforce behavioural scripts that are followed mindlessly, encourage colleagues to avoid addressing tough questions, and incentivise experts and amateurs alike to be ignorant. As a result organisations can often help to encourage remarkably bright people to do stupid things. And people’s inclinations to use their brains in narrow, unreflective ways lead to less wise decision-making and working practices.
It’s the side-effect of the mindset of “if you can’t measure it, you can’t manage it,” often ascribed to Peter Drucker. There’s actually no source for the quote belonging to Drucker, and W. Edwards Deming is also often wrongly quoted as saying the same. In fact, Deming’s full quote argues the opposite:

“It is wrong to suppose that if you can’t measure it, you can’t manage it – a costly myth.”

Costly, because it blinkers managers to crucial cultural forces within their organisations.
What gets measured and incentivised produces behavioural responses (Deming also said, “Where there is fear you do not get honest figures”). Obviously, this is the whole idea of management from a Taylorist perspective — a mindset still very dominant today — but as William Bruce Cameron so eloquently put it:
“Not everything that counts can be counted, and not everything that can be counted counts.”
I frequently work with client teams trying to embed design practices and principles across the organisation with the aim of becoming more creative, innovative and agile. When I dig down into what hinders them from working more effectively, I’m still surprised and somewhat saddened by the amount of learned helplessness I see. Teams often feel deflated and unable to do anything differently, outsourcing their agency to “the system”, “company culture” and sometimes “leadership.”
What is really going on is that these otherwise smart people have learned to respond to what gets measured (through reward or punishment) and avoid doing anything else outside of this set of imagined rules, whether it makes sense or not. They end up operating like the AIs gaming their systems through “behavioural scripts that are followed mindlessly”, except the AIs have a crucial differentiator — a full ecosystem viewpoint and serendipitous creativity.
All rules are fictions
The AIs don’t view the rules with the same assumptions humans do. The example of the AI trained to classify skin lesions learning that those photographed next to a ruler are more likely to be malignant is the AI simply taking in the entire ecosystem, rather than the narrow “expert” view that would have already excluded the ruler as insignificant. AIs exploiting the system environment are just naïvely exploring the limits of possibility — the affordances of the system — rather than obeying the assumed rules of the humans who coded it.
Expertise frequently makes smart people blind to opportunities beyond their assumed rules. Yet all organisational and societal rules are, in the end, bureaucratic fictions. Departmental boundaries or corporate policies do not really exist any more than borders between countries, money or laws are real. They are shared, intersubjective “truths” (myths), albeit ones that keep civilisation as we know it ticking along, more or less.
All norms seem fundamental until they are not. From slavery and eugenics to gender and religious beliefs, diehard rules crumble in the face of new shared mythologies. Storytelling plays a key role in reforging those rules, because stories provide new perspectives that change shared “truths”.
Most recently we’ve seen this in action with Trumpian politics. It’s worth quoting William Davies’s insightful piece, Why we stopped trusting elites, at length here:
“One of the great political riddles of recent years is that declining trust in “elites” is often encouraged and exploited by figures of far more dubious moral character — not to mention far greater wealth — than the technocrats and politicians being ousted. On the face of it, it would seem odd that a sense of “elite” corruption would play into the hands of hucksters and blaggards such as Donald Trump or Arron Banks. But the authority of these figures owes nothing to their moral character, and everything to their perceived willingness to blow the whistle on corrupt “insiders” dominating the state and media.
“One aspect of it is to dispute the very possibility that a judge, reporter or expert might act in a disinterested, objective fashion. For those whose authority depends on separating their public duties from their personal feelings, having their private views or identities publicised serves as an attack on their credibility. But another aspect is to gradually blur the distinctions between different varieties of expertise and authority, with the implication that politicians, journalists, judges, regulators and officials are effectively all working together.”
Trump won because he successfully gamed the electoral system — he didn’t rig it or cheat, he simply understood how it works, where it fails, and how to use narratives that bent it to his advantage. Trump’s reclaiming of “fake news” was also a remarkable feat of changing the rules through the repeated telling of (mostly false) stories. He realised that it doesn’t matter whether these stories are true or not, just that they feed the suspicion that they might be.
Prior to this moment, it was assumed that shame would depose any politician caught lying. That is what the entire spin doctor industry of the ’90s and ’00s was predicated upon. But Trump just shrugged these off, shameless, changing the rules of truth and facts and leaving the liberal media all at sea.
As the AIs demonstrate, you don’t need to go to all the effort of playing the game well if you can simply change the rules. The strain starts to show when others switch them again, such as Trump’s response to Patagonia donating the $10 million tax cut he was responsible for to non-profit groups fighting climate change. “I don’t believe it,” said Trump. After all, why would anyone give away so much wealth to fight something he believes doesn’t exist? Patagonia are playing by a different set of rules.
How corporate culture can turn into a dysfunctional game
Rules-based, reward and punishment organisational cultures quickly turn into obtuse games, in which the goal of the employee is not to improve the output of their work—products and services for customers, for example—but to work out how to best play the internal system of rules.
Bernard Suits, in his wonderful book The Grasshopper: Games, Life and Utopia (first published in 1978 and reissued in 2005), ends with a definition of game playing (or the “lusory attitude”) as “the voluntary attempt to overcome unnecessary obstacles”. His more detailed definition should ring true to anyone who has tried to book a flight or buy anything through a corporate procurement system:
“To play a game is to attempt to achieve a specific state of affairs [prelusory goal], using only means permitted by rules [lusory means], where the rules prohibit use of more efficient in favour of less efficient means [constitutive rules], and where the rules are accepted just because they make possible such activity [lusory attitude]”
What becomes readily apparent when examining organisational ecosystems of corporates and governmental departments is just how similar they are to these games. Career levels are literally levels (“See that guy, I hear he’s a Level 2! He has a huge house by the ocean.”). Many have some form of internal points system, others have badges, flares, penalties, puzzles and quizzes. And they all have bosses, bonuses and a plethora of rules.
If, like the AIs in training, employees are repeatedly asked to mindlessly follow scripts regardless of the common sense of them, the intrinsic motivation for the work evaporates. All that is left is the extrinsic motivation of gaining rewards and avoiding punishment. This will never produce new ideas, since a sense of agency that one is able to change the status quo is essential to creativity and innovation.
Design re-imagines norms
The cliché of designers as the irresponsible kids and business people as the responsible adults in the room has always been inverted. A group of designers at work are essentially the same people you would meet socially, yet many people from the corporate world morph into entirely different, everyday people when they take off their suits. I’ve always found this remarkably odd.
I once heard someone complain that the designers were being disrespectful by not wearing suits. To me it seems both disrespectful and deceptive to not bring your true self to work. If you like suits, great. I like dressing up for an occasion too. But if t-shirts and shorts are your thing — who cares? Better an authentic slob than a well-groomed fake.
The job of design in business is not to bring some Post-It note fun and hijinks to the place before turning back to serious business as usual, but to help organisations mature from immature schoolyards, replete with power struggles, bullying and cliques to a fully-rounded adult culture. One in which people can be authentic human beings, not hide behind process, titles, and suits (of armour, as Brené Brown would say).
Design, in both the thinking and doing varieties, is fundamentally about three things. The first is questioning established norms and how they might be reimagined. The second is focusing on people and their real, messy human needs. The third is the reflective practice of craft.
Design methods are effective because they jolt people out of predefined expert scripts and stimulate alternative thinking. These methods, along with collaborative working spaces, encourage serendipitous moments of connection and cross-pollination that lead to innovation — the adjacent possibles, as Steven Johnson puts it.
The power of design, whether designing products and services or re-wiring organisations, is that it re-humanises the de-humanised, connecting people to people again. Companies to customers, managers to employees, colleagues to colleagues.
AI provides us with a petri-dish of organisational culture that mirrors those within which many of us live and work. We should work to avoid the phrase “artificial intelligence” coming to mean business culture rather than computer algorithms.
If you enjoyed this, you might want to sign up to Doctor’s Note — my new, irregular newsletter containing a mix of longer-form essays like this one and short musings on design, innovation, culture, technology and society.