Ch-ch-ch-ch-changes
Zooming out on changes with AI and considering changing expectations
There’s a well-known challenge in developing digital products and services to create social value: you need to meet people’s needs, behaviours and expectations. It stems from Tom Loosemore’s description (when at GDS) of what we really mean by ‘digital’ and has since been used as a way of ensuring products and services are designed in a way that creates enough value to actually be used, and useful.
But this isn’t a post about developing digital products and services, nor how we understand changing needs. It’s about how Generative AI (GenAI) products and services being created outside of our influence impact how we operate — shifting behaviours and, crucially, our expectations.
It’s clear from the vast amount of money being poured into GenAI, the rapid adoption these tools are achieving and the transformative interactions they offer, that they’re part of a set of technologies that can be described as ‘once per decade disruptions’ — Joel Birnbaum’s observation of the ten-yearly shift in the dominant model of commercial provision of computing. But this disruption isn’t just about how we interact with computers, it’s about how that then informs how we interact with each other, with problems, with institutions — how we conceive of new ways of organising and creating solutions. The internet and mobile phones show how profoundly the dominant model of commercial provision of computing influences how we communicate, organise and share — GenAI is expected to have just as big an impact, if not bigger.
Expectations
If someone is used to receiving an instant answer through their digital interactions, it’s unlikely that asking for 24–48 hours to respond to a question meets their expectations — potentially meaning they’ll head off for an answer elsewhere, or be left feeling unsupported. If someone is used to accessing people and information 24 hours a day, wherever they are, through their mobile, then having to wait for certain office hours, or travel to certain locations, for support may again mean they look elsewhere or feel unsupported. This is about the ‘raised expectations’ of the internet era. I’m not arguing whether the expectation of immediacy is a good or bad thing, but that the expectation from our digital interactions is often then applied to other contexts. Our expectations from past experiences influence our perceptions, motivations, behaviours and emotions.
What does this have to do with GenAI? Over the past few weeks, companies like OpenAI, Google and Apple have made a whole series of announcements showcasing their latest GenAI releases. Within them were demos of a video-based assistant that answers questions based on what’s going on in your environment, but also tells you where you left your keys; an AI tutor that coaches someone to work through a maths problem; all sorts of images and videos generated from text or speech prompts; and, on the horizon, agents that complete whole projects, moving between systems and tasks until completion.
How might these sorts of tools shift our expectations about how we navigate and interact with the world around us? And what does this mean for how we respond to these ‘raised expectations’?
Possible and probable futures
Canadian futurist Norman Henchey proposed four ways of thinking about the future: possible, plausible, probable and preferable futures. The choices made today have an effect over time on which futures we bring about. Given the investment and market forces behind GenAI, it is probable that some or many of the approaches unveiled in the past few weeks will scale dramatically, and with each new iteration of GenAI models and tools, so more opportunities and challenges will emerge.
So how might our expectations change in the following scenarios:
- If we become very familiar with working alongside generative tools that act as a never-frustrated teacher and coach;
- If we become reliant upon multiple ‘digital interns’ delivering first drafts (or nearly complete work) to speed up our own tasks;
- If we rely on generative tools to provide immediate, real-time language translation;
- If we don’t see the need to contribute to the collective wisdom on the web.
Critically, if those expectations change, what does that mean for people coming to your organisation, service or activity? What does it mean for your future workforce? What does it mean for how you best achieve your mission?
We’ve put together a living library of AI resources that we hope will support you as you explore the challenges and opportunities associated with AI. Please take a look — and do let us know if there is anything you need particular support with.
Main image by Gerd Altmann from Pixabay