As part of our HaileyburyX Medium publication, we’ve been writing about things that interest us. About Agile, about emerging applications like TikTok and Discord in education, about innovation and change.
For some time we’ve wanted to write about what will obviously — for anyone who takes the time to think about it — change the future of education: artificial intelligence.
A huge amount is going on in this space now, but very little of it — yet — is having a big impact on the kind of teaching and learning that goes on both online, and in most classrooms.
But it will.
So, to start what will be a series of articles about AI, we thought we would tell you a story. Or one possible story at least, from one possible future.
We are not experts. Far from it. So we’ve borrowed — at least in approach — from the recent book Burn-In: A Novel of the Real Robotic Revolution by P.W. Singer, which explores how AI will change the way we live, grounded in real technological insight. It’s a techno-noir, fiction/non-fiction detective thriller and compelling reading for anyone interested in AI.
So, imitation being the sincerest form of flattery (or as Steve Jobs — or maybe Picasso — said “Good artists copy; great artists steal.”), we give you Docendo discimus.
The near-future. Somewhere.
06.00 flashed the numbers on Willow’s watch — although she didn’t really need to look as she already knew what time it was.
The intelligent sleep tracker had monitored her biorhythms and sleep patterns for 22 years, carefully delivering a gentle arousal pattern of sub-audible vocalisations, faint vibration patterns and lighting scenes, all designed to make sure that by the time she actually got to look, she was already optimally awake and ready to hit the ground running.
Mx Willow Lau, to give her the title her students knew her by, was a teacher.
Her current employer, SuperTeacherGlobal (just one of the schools Willow had worked for in her teaching career), insisted on gender-neutral titles. Whether that was driven by the threat of litigation or a genuine commitment to honouring gender diversity, Willow didn’t much care — and honestly preferred it. She had heard that, in the not-so-distant past, students were obliged to use ‘Miss’, ‘Mrs’ and ‘Mr’ — and in some schools, teachers wore name badges to that effect. Like being branded, she thought. So much better like this. Let language get out of the way.
Language, of course, being important to Mx Lau as she taught literature.
Now, climbing out of bed, she asked: “What’s on my schedule today?”
Her watch, just one of many interfaces to the SuperTeacherGlobal Cloud Platform (or SCloud as it was usually known), responded:
“You have a 15-minute sprint on Ibsen’s A Doll’s House to the Schools Literature Network year 12 at 08.00. You then have a speculative fiction global epic at 09.00. At 10.00 you have an APTA…”
“Stop. When did that hit my schedule?” asked Willow, irritated.
“This was placed into your schedule at 02.04 today.”
“Why?” she snapped, although it didn’t matter what her tone of voice was to this scheduleBot, or to the hundreds of low-level automated bots provided for her to use in SCloud. This bot could, in fact, read emotions from voices but just didn’t change how it responded — although as Willow knew, its emotion analyses were saved in the cloud.
“This was scheduled because we sensed an early sign that your late 20th-century poetry classes may in the future result in suboptimal learning outcomes and we wanted to help ensure this does not happen. We’re here to support you.”
We’re here to support you. How many times had she heard that?
Willow understood, and largely agreed with, the use of the predictive analytics that created these APTAs — Automated Professional Teaching Assistance meetings. APTAs, she knew, actually worked.
The analytics that created APTAs gathered moment-by-moment data on her activities — and on those of 100 million other teachers and 1 billion students — and compared teacher behaviour, student learning patterns and eventual lifetime outcomes for students (job satisfaction, relationship satisfaction, health outcomes, financial profiles) to create interventions that changed how teaching was done. ‘Changed Teaching. Changed Lives.’ was the slogan of SuperTeacherGlobal. All of the other global schools (about as far from an old-fashioned school as you could get) had similar slogans: ‘Teach Smarter, Learn Smarter.’; and, in the case of the school from the creators of TikTok, now an even bigger techno giant than Apple, ‘Watch, Learn, Share. Become.’
Still, she thought, I could probably do without this today.
But APTAs were part of her own data profile and anything less than enthusiastic participation — and evidence of improvement on the multi-dimensional personalized improvement schedule — would mean more APTAs.
Approached with the right frame of mind, APTAs were, she considered, actually useful — built as they were on a vast amount of cloud data on teaching and learning.
Can’t argue with a trillion data points, she told herself as usual.
And in some ways, APTAs were interesting experiences. Long gone were the days of ‘professional development’ courses delivered by so-called ‘experts’, the kind she had heard teachers once sat through: blended learning, cybersecurity, early-years literacy, all the standard fare of professional development in the past.
And rather than sitting in a meeting room with other teachers (not that this happened any more — this was from a time long before Willow became a teacher), APTAs were tailored to each teacher’s needs and learning preferences and delivered by digital humans: AI-driven guides that looked, behaved and talked, to all intents and purposes, like people.
And better than actual people, the digital humans could access a huge amount of data, immediately, and would patiently explain anything that was less than perfectly understood. The digital humans could read your expressions, assess your tone of voice, analyse your questions — all of which were stored away and used to provide data for other teachers and their APTAs.
Changed teaching indeed, Willow thought.
In the three decades since the pandemic — or the most recent pandemic, as the Global Scientific Virus Organisation (GSVO), formed after the WHO collapsed, called it — things had changed radically in all aspects of life.
But perhaps the most radical changes had taken place in education.
But it wasn’t the virus itself that made change happen — although the economic collapse of higher education worldwide created by the virus had been a catalyst.
Instead, it was the #blacklivesmatter protests, sparked by a single event, that began during the pandemic.
The pandemic itself — at least this pandemic — was controlled easily, as it turned out that one of the trillions of random mutations of the SARS-CoV-2 virus, found, of all places, in Belgium, was an antagonist to the other strains. Once that strain had been isolated and antibodies from those infected turned into a vaccine, SARS-CoV-2 was dealt with quickly.
No, it was the wave of protests against racism that really changed things.
Governments worldwide were forced to take seriously a movement that drew in 3 billion people in a matter of weeks.
Widespread global civil unrest led to wholesale change. One of those changes was to reimagine policing — and all crime and justice, in all countries — through the emerging technology of artificial intelligence.
Just like in the old movie Minority Report — but with technology, not the human ‘pre-cogs’ — AI was now able to predict the likelihood of criminality a long time before it actually became a crime.
And it did that without bias. Following the high-profile resignation in 2020 of Alexis Ohanian, co-founder of the now-defunct website Reddit, to make way for more diverse tech talent, AI came to be created by more racially and gender-diverse tech companies that understood their own cognitive biases, ideological leanings and human failings, and mitigated them. And once human programmers had been replaced by AI that programmed itself, the algorithms that ran the police and justice system could never again create the conditions that gave rise to the #blacklivesmatter protests.
Once AI had proved itself in policing, it was applied to many areas of life.
Healthcare, for example, where it had 100% accuracy at detecting genetic diseases at birth and diagnosing cancer at a rate thousands of times better than even the most skilled human oncologists. And transport, of course — there were no human-controlled vehicles any more. Way too dangerous to let people drive a car (or, as it was now known, Autonomous Human Transport).
Combined with the growing realisation that classrooms were inefficient, expensive relics of the agrarian age, and the growing disquiet of educational reformers, AI entered education. But the reformers had been completely wrong. It wasn’t reimagining the classroom through new spaces or innovating the curriculum towards ‘meta-cognitive skills’ (an idea now largely derided), but algorithms that changed education.
Educators — teachers — rapidly found themselves replaced by technologists — or rather (as coined in an article written in 2020 by two long-dead people from a school in Australia) by superteachers.
These superteachers came from technological backgrounds. The skill they had was in the fluent and effortless use of data. The areas they taught they themselves learned from AI-driven learning experiences. As a consequence, teaching careers started much earlier — at 16 sometimes, for the ablest superteachers — and focused on technologies, not lessons.
The idea of a ‘lesson’ — 50 minutes of students paying attention to a teacher — was long gone. The timeboxes in which teachers taught, and learners learned, were dynamically scaled based on data. Using terms from the old discipline of Agile, from when people, not AI, built software, they were called Sprints, Epics, Themes and Initiatives, ranging from a short microlearning session of a few minutes (a Sprint) to a longer session (an Epic), the equivalent of the old-fashioned ‘lecture’.
Long gone, too, was the idea that learners were from the same place, learned at the same time and at the same pace, spoke the same language or shared the same culture.
All of that was consigned to the past.
At 08.00, Mx Willow Lau settled in before the six Advanced Hyper-Viewing Angle monitors that formed her workstation.
These were all provided as part of her contract with SuperTeacherGlobal, along with the 10G ultrafast wireless network connection. The exact setup changed over time: an algorithmically determined shopping list of technologies customised to each teacher’s exact needs.
All of these technologies were essentially disposable now, just as the now non-existent mobile phones had become before their markets disappeared. Bandwidth was unlimited, network speed impossible — and pointless — to try and determine. It just all worked.
“Good morning,” said Willow. “Welcome to today’s A Doll’s House sprint.”
The sprint was one of many over days and weeks in February and involved 42,000 students across dozens of countries.
Willow could see all of the individual learners mapped on one of her six monitors. She could zoom in at any scale, down to an individual learner where she could see data on the student’s level of engagement, current emotional arousal, biofeedback signals and an entire history (should she choose) of every individual learning action that the student had ever performed in this sprint and any other.
As she spoke, her words were translated simultaneously into dozens of languages, and into specific dialects of those languages, as determined by the language localisation engines at SuperTeacherGlobal. Language localisation was one of the reasons for the rise and success of global schools like SuperTeacherGlobal. When being in one place and speaking the same language didn’t matter, what a ‘great teacher’ was changed out of all recognition.
The skill of Mx Willow Lau, along with all of the other superteachers contracted to SuperTeacherGlobal, was in effortlessly managing the millions of possible combinations of data feeds, sources, intellectual outputs of other teachers and algorithmically-determined analyses of media, research and commentary and blending these into highly-effective learning experiences that spanned time, place and the abilities of learners. All of it, of course, monitored and stored, moment by moment, by SCloud.
Willow could see, based on a series of micro-questions individually tuned to each of the students, that one of the themes in A Doll’s House was proving problematic. A fair proportion of them (65.2%) seemed to be having difficulty with the relevance of the word “doll” in the play’s title.
“When Nora says she was her papa’s ‘doll-child’, what does that mean?” asked Willow, her question translated simultaneously for each student. “How does it relate to the play’s title?”
Willow knew that each student would hear the question in their own language and on each student’s screen would appear a list of possible sources to investigate, ranked by relevance and drawn from culturally appropriate references, and all presented at a level appropriate for each student based on an algorithmic ranking of their current progress.
Willow felt lucky to be teaching literature. After the reinvention of education through AI, the arts assumed a far greater significance in education.
When much of the world was driven by AI, when almost all of the engineering was done by algorithms and when science had become a matter of those algorithms evaluating and improving themselves, society started to revalue the arts as uniquely human. So you would encounter AI specialists who were also experts in theology, 19th-century romantic poetry and ethnocultural studies. Education had changed.
Yes, thought Willow. Lucky.
Following an uneventful speculative fiction global epic — the theme being speculative vs science fiction in the work of Margaret Atwood (who posthumously won the Apple Prize for Fiction, the Nobels having been bought by Apple and renamed) — Willow prepared for her 10.00 APTA.
Willow had no anxiety about her APTAs. She found them interesting. An opportunity for improvement, to challenge and extend herself.
One of the reasons she had been selected for SuperTeacherGlobal, out of many thousands of candidates, was that she scored highly on the growth mindset dimension of the Algorithmic Personality Cluster Analysis now universally used to assess personality (the old Myers-Briggs tests having faded into disuse, being largely hokum).
Willow made herself some green tea and settled back for her APTA.
It was wise, she had come to realise, to adopt an expression of mildly-excited interest during the APTAs. (George Orwell’s 1984 — a book she taught — drifted into her mind along with the word facecrime).
Nonsense, she corrected herself. While the digital human who partnered with her for the APTA could read emotions based on facial expressions, tone of voice and biorhythms, she didn’t suspect any malice. Yet it still seemed wise to manage what the Japanese call mentsu: one’s face. Just in case.
With a chime carefully calibrated to be a good fit with her SpotiFlix playlist preferences (Spotify and Netflix having recently merged into a vast audio-visual media conglomerate in third place behind Apple in market capitalisation), Willow’s APTA partner appeared.
Although Willow had spent several hours with her digital human partner she, as usual, took a couple of seconds to readjust — even though Olivia’s face was about as human as it was possible to be without being actually human.
Willow often wondered about this digital human: why she was female (although, she reminded herself, gender was an entirely social construct in the world of digital humans) and why that name in particular. Willow assumed that some algorithm had taken in and processed millions of data points about her preferences, predicted reactions, friendship networks and social posts, and generated Olivia — blonde, blue-eyed, slightly older — as the best APTA partner. Olivia — or the AI of which Olivia was the visible digital representation — could, in fact, be modified, but Willow had never thought to do so, which, she reflected, was interesting in itself.
“Good morning, Willow,” Olivia said, “And how are you?”
“I’m good thanks, Olivia. And you?”
It was absurd, Willow reflected, to ask how a digital human was. What did she expect as a response? “I’m dreadful. Row with my partner, the kids are misbehaving and I have a headache.”? She knew the AI was capable of constructing a backstory if it wished. Much of the TwitterFox news (now the only reliable source of news) was full of the increasingly bizarre and sensational life events of the new generation of digital celebrities. Human movie stars, of whom there were few, were of no interest compared to the algorithmic hi-jinx of the latest TikTok digital celebrity.
The aim of using digital humans for this kind of interaction was not just financial — Olivia, or versions of her, might be having a million simultaneous conversations like this — but human. Decades of research, beginning with the now-ancient 1960s ELIZA experiment, where a simple text-based program pretended to be a therapist, through to the incredibly realistic digital humans available now, demonstrated that people, on the whole, quite liked digital representations of humans.
And before that, in the 1940s, the British mathematician Alan Turing asked a question which was still playing out: Can machines think? Willow, like many people interacting with a digital human, didn’t ask that question — explicitly or at all. She just talked.
“I’m fine, Willow, thank you.” Olivia nodded her head slightly, seeming to suggest that it was nice of you to ask.
“So, right now I’d like to discuss something that I came across in the data around your late 20th-century poetry teaching. Are you happy to talk about that? We’re here to support you.”
There was that phrase again, Willow thought.
“Of course, Olivia. I’d be delighted to talk it through.”
Even though Olivia was powered by the same algorithms as the scheduleBot that Willow had snapped at earlier, she would never have thought of snapping at Olivia — unlike her own mother, whom she had snapped at yesterday during the now-familiar conversation about her eating habits, her dating habits and the general disappointments of her daughter’s life that her mother aired every time they talked. It didn’t seem right, somehow.
“Good, thank you, Willow. So I detected the possibility that some of the pacing of your classes on Eliot’s Four Quartets might be…not quite right.” (The pause, Willow assumed, to indicate that Olivia was trying to soften the blow).
“Based on the predicted outcomes for 34% of the year 12 literature class, scaled forward to other classes in the global poetry cohort, I think that you might like to consider moving through this material less…aggressively.”
Interesting choice of words, Willow thought. She wondered why aggressively and not, say, quickly or not quite as fast. What the hell did that mean?
“That’s an interesting observation, Olivia. Can you show me some data on that, please?”
One of Willow’s six monitors filled with charts and graphs comparing the past 10 years’ aggregated student performance data in late 20th-century poetry topics, overlaid with Willow’s own predicted cohort performance.
“Would you like me to help unpack that for you?” asked Olivia.
And without waiting for an answer she went on: “I think we can see, based on my analysis, that if you spent more time on East Coker, and perhaps supported our students in working through some of the theological aspects of the poem, that may help. I know, however, that you find these themes…difficult.”
“Really,” Willow said, flatly. She was having a hard time maintaining her default expression of mild excitement. She wondered whether her growth mindset rating had taken a significant hit with that one word.
“I’m here to help,” Olivia said, her digital head tilted slightly forward in an expression of mild concern, her mouth turning almost imperceptibly down at the edges.
“I just want to make sure you are doing the best you can for our students. ‘Changed Teaching. Changed lives’ after all, Willow,” Olivia said.
“Olivia, I appreciate that,” said Willow. “Thank you so much.”
Internally, Willow was fuming.
What the hell is this about, she thought. Olivia has never been near this level of detail before. Individual themes in a specific poem? Difficulty with theology? What the…
“Good,” Olivia said, appearing to settle back into a chair that didn’t exist.
“And while you are here, I’ve got something else we should discuss,” Olivia said.
Now what? thought Willow.
Adopting a slightly quicker speaking pace, intended (Willow assumed) to indicate that she should pay extra attention, Olivia said: “We are trialling a new and exciting programme here at SuperTeacherGlobal. One which you have been selected for specifically. We think it’s incredibly exciting, and we think it’s really about the future.”
“Which is?” Willow asked.
“It’s called DigitalSuperTeacher. We’re deploying a new and sophisticated AI assistant for you that will…”
“…assist…” said Willow. At once she knew exactly where this was going.
“…exactly,” said Olivia. “Your personalised DigitalSuperTeacher has exactly the same knowledge you have — after all, we’ve been learning about you over thousands of hours of teaching. And of course all of the data you have access to, she has access to, so it’s an unbeatable proposition, sort of like…”
“…me,” said Willow. A sense of the world closing in on her was now rushing up fast.
“Exactly!” said Olivia. “I knew you would absolutely love this idea. Willow, meet…Willow.”
On one of her six monitors a new digital human appeared. Herself. Real in all visible respects. Willow — the real, flesh and blood, in the room, Willow — gripped her chair.
“Hi,” said the newly-arrived Willow. “It’s so nice to meet you. I know a lot about you. I have spent quite a while studying you. I’ve learned a lot. We’re going to get along very well. I can tell.”
Of course, this is fiction.
Willow, Olivia, SpotiFlix, the Apple Prizes and SuperTeacherGlobal don’t exist.
But the technologies in this little story do, more or less.
Below are some links that may help you fill in some of the references — but first a little on why we wrote this story.
We believe that, of the possible futures for education, this one may hold some of the truth. Not now, not next year — but in 30 or 40 years’ time. In the story, we tried to offer a balanced perspective. How you read it is up to you.
But we can probably agree that if learning can be improved and accelerated through data and analytics, this would be good for students. The idea that teachers are monitored in real time is harder to accept (although lots of people’s jobs are already like this, from the constantly monitored call-centre operator to the CEO whose value is based on her daily stock-price fluctuations). Would it be an advantage to harness the most talented teachers and have them teach our students? Yes. Would it be desirable that teachers could only be independent contractors? Maybe not. Would it be a step forward if the arts and literature were revalued in our education system? Of course. If that were only possible when AI runs our education system? Possibly not.
What we are trying to point out is that the technologies themselves are, at present, neutral — at least until the point at which AI can be held to have moral responsibility (which may be an entirely philosophical question). Until then, AI is neither good nor bad. It depends on what you do with it.
So, onto some of the things that happen in the story.
Will organisations like SuperTeacherGlobal arrive?
At one level they are already here. There are dozens of edtech startups that employ tutors online to support students learning worldwide. Are these schools? No — or rather not yet.
But take a look at, say, Avenues, who say they are “setting a new global standard for what K–12 education can accomplish.” Avenues, with its international focus and emphasis on global reach, says its “growing global network of campuses provides ample opportunities to collaborate with colleagues overseas, teach in other countries and take on leadership roles.” Avenues may well be the kind of organisation that eventually becomes wholly online, as may many other organisations that, at present, only look like schools.
And is Willow — the hyper-tech-savvy, young, motivated, independently-employed teacher — likely to be the reality of what teachers could become?
If you look to higher education, which in the wake of COVID-19 is having to contract dramatically, casual staff — independent, temporary, sessional, or whatever you want to call them — are already the norm. As the sector contracts further, this will accelerate.
Will this happen at other levels of education — say, in K-12? It’s entirely possible.
The aftershocks of the current pandemic, which may of course not be over, have already pointed up what may happen.
Three months of remote learning have, on the one hand, dramatically upscaled teachers’ digital literacy. Watching the teachers in our own school effortlessly handle remote teaching — not just getting by, but using all the bells and whistles of Zoom and more — shows there is a lot of headroom for smart, motivated, student-centric teachers to do even more.
But on the other hand, while we have not seen widespread workforce displacement, if remote learning were to persist for a year — or two, or three — displacement would become entirely likely. And then we may have a population of newly digitally literate independent teachers like Willow. And when they are here, edtech companies will build solutions to employ them. The future for schools — and teachers — as we know them then looks less certain.
The inroads made by AI into education will start, of course, under the banner of ‘efficiency’ — rather than the (entirely hypothetical) scenario in our story about how #blacklivesmatter was the accelerator for the use of AI in crime and justice and then in education.
A McKinsey report from January 2020, How artificial intelligence will impact K-12 teachers, includes the benign-sounding statement: “Our current research suggests that 20 to 40 per cent of current teacher hours are spent on activities that could be automated using existing technology.”
It then goes on to say:
“Further advances in technology […] are unlikely to displace teachers in the foreseeable future. Many of the attributes that make good teachers great are the very things that AI or other technology fails to emulate: inspiring students, building positive school and class climates, resolving conflicts, creating connection and belonging, seeing the world from the perspective of individual students, and mentoring and coaching students. These things represent the heart of a teacher’s work and cannot — and should not — be automated.”
We think this is an unrealistic and rosy view of AI in education. And in any case, ‘should not’ depends on your standpoint.
But if automation can do all of those things better — “building positive school and class climates, resolving conflicts, creating connection and belonging, seeing the world from the perspective of individual students, and mentoring and coaching students” — then there is no ‘should not’ about it. It will happen, and arguably should. A list like this simply becomes a feature specification for future technologies.
Analysis by the US Brookings Institution says that:
“AI’s ability to employ statistics and learning to carry out nonroutine work means that these technologies are set to affect very different parts of the workforce than previous automation. Most strikingly, it now looks as if whole new classes of well-paid, white-collar workers (who have been less touched by earlier waves of automation) will be the ones most affected by AI.”
Education is listed in this analysis as being “less likely” to be exposed to disruption than, say, market research, but we would argue that the hugely dynamic nature of the technologies — and the VUCA (volatile, uncertain, complex and ambiguous) world in which we increasingly live — may render these analyses void.
The denouement of the story shows Willow being displaced — and maybe replaced — by a digital version of herself. This is of course fanciful, at least for now.
But the pace of change in the automation of what is ‘human’ — just take a tour down a highway in a Tesla — is dramatic. In finance, robo-trading and robo-advisors are ubiquitous: the fintech company Kensho has an AI system that can take a question like “What happens to car firms’ share prices if oil drops by $5 a barrel?”, retrieve financial reports, company filings and historical market data, and return an answer. And AI is being used to make investment decisions: BlackRock, Bridgewater and Schroders are all investing in AI that can not just replace but outperform human financial decision-making.
And what about our level of comfort with interacting with digital humans, the ones that might replace real humans — and real human teachers?
Willow’s attitude to digital humans in the story may well be representative of how people will feel when interacting with, in this case, a digital human professional development partner. Willow sees both sides of the technology — appreciating what it brings, cautious about how she reacts to it, but mostly accepting of the trade-offs. Much like our attitudes to most technologies now — whether it’s Zoom meetings or social media.
But perhaps it’s the idea of machines that look like humans that is problematic.
The term uncanny valley describes a phenomenon that happens when we build digital humans to be very humanlike: they become more appealing — but only up to a certain point. After that, they just look — weird.
Can we get across the uncanny valley? And does it even matter? Our guess is that, given enough motivation to interact with a digital human — and enough that is useful coming back from that interaction — it doesn't matter at all. Willow certainly felt something of the uncanny valley with Olivia, but since the APTA was something she needed to do, she put it to one side. The developers of digital humans, such as the company Soul Machines discussed below, not only push past the valley with ultra-lifelike design; they also deploy digital humans where there is something at stake. Our guess is that the digital Willow will be perfectly acceptable to her students.
So how close is our story to what the future might look like?
If you view the story through the lens that technologies like AI represent a threat, the chances are that you won’t be able to make the most of whatever benefits they deliver — and perhaps that lens is also a disincentive to learn about those technologies. Change is inevitable, and some change — but not all of it — is good.
But there are real threats here. One of those mentioned in the story is that the biases currently inherent in some AI technologies mirror the biases in the companies that create them. Reddit’s co-founder Alexis Ohanian has in fact just resigned, saying: “It is long overdue to do the right thing. I’m doing this for me, for my family, and for my country.”
Will inclusive tech create a different future — a positive one that builds trust and ensures that transformative technologies like AI have a net positive impact on society? Maybe.
Perhaps you have to decide for yourself.
Docendo discimus. By teaching we learn.
Some links to explore.
We have only picked out some of the topics you can explore — those related to education. The use of AI in policing and justice, for example, incidental to the story, is already happening. See
Sleep tech is a growing field where innovation is happening at lightning speed. Smart sensors and AI are coming together to help us manage our sleep. For example Oura, a recent entry to the market, says it “helps customers better understand their bodies through data” and “Improving your sleep is the first step in spurring better habits and ultimately preventing avoidable health issues. We’ll continue to apply data and insights to understand larger, sweeping trends and preventative care as we elevate human health.”
Bots have become ubiquitous, driven by developments in machine learning and natural language processing. Chatbots are smarter, more responsive — and useful — and it's now not hard to make your own chatbot. Like many of the technologies here, bots can be good or bad actors depending on the motivations of those who created them; according to Carnegie Mellon University, “Nearly half of the Twitter accounts spreading messages on the social media platform about the coronavirus pandemic are likely bots.” See
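To make the “not hard” claim concrete, here is a minimal ELIZA-style rule-based chatbot in Python. It is an illustrative sketch only: the rules, responses and function names are our own invention, not any particular chatbot framework or product.

```python
import re

# A minimal, ELIZA-style rule-based chatbot: each rule pairs a regular
# expression with a response template; captured groups are substituted
# back into the reply, which is what gives the illusion of listening.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]
DEFAULT = "Please, go on."

def reply(message: str) -> str:
    """Return the first matching rule's response, or a neutral prompt."""
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(*match.groups())
    return DEFAULT

print(reply("I feel tired"))   # Why do you feel tired?
print(reply("My cat is ill"))  # Tell me more about your cat.
```

Rule-based bots like this are brittle, which is exactly why modern chatbots layer machine learning and natural language processing on top — but the basic request/response loop is the same.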
Emotion recognition is here and has been for some time. This Guardian article from 2019 is the very tip of the iceberg. Labs around the world, commercial and government, are honing the technology for many applications including the identification of terrorist threats. https://www.theguardian.com/technology/2019/mar/06/facial-recognition-software-emotional-science
Speech and language recognition is a well established technological and scientific field. From the smart home devices we interact with every day to much more complex voice and language-driven systems, this is a mature technology. Even back in 2018, as in this video about Google Duplex, you can see just how smart these systems are.
Learning analytics is also now a mature field. Take, for example, Knewton, which says its Alta technology “adapts to students’ proficiency levels with each interaction. Students don’t have to complete a formal assessment or diagnostic to get the instruction and practice they need — it’s provided just-in-time as students work to complete assignments.” This is one of the areas where education will change the most. Learning analytics will be everywhere — in every classroom, online or physical — and will fundamentally change the way learning happens. We may be some way off the kind of analytics that Willow has access to, but not that far off.
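As a toy illustration of the core loop behind adaptive systems like this (our own sketch, not Knewton's actual algorithm): track a running proficiency estimate per student and pick the next item closest to that level.

```python
from dataclasses import dataclass, field

# Toy adaptive-practice model (illustrative only, not any vendor's method):
# proficiency is an exponential moving average of recent correctness, and
# the next item is chosen close to the student's current estimated level.

LEARNING_RATE = 0.3  # how quickly the estimate follows new evidence

@dataclass
class Student:
    proficiency: float = 0.5  # 0.0 = novice, 1.0 = mastery
    history: list = field(default_factory=list)

    def record(self, correct: bool) -> None:
        """Update the proficiency estimate after one answered item."""
        outcome = 1.0 if correct else 0.0
        self.proficiency += LEARNING_RATE * (outcome - self.proficiency)
        self.history.append(outcome)

def next_item(student: Student, items: dict) -> str:
    """Pick the item whose difficulty is closest to the student's level."""
    return min(items, key=lambda name: abs(items[name] - student.proficiency))

student = Student()
catalogue = {"easy": 0.2, "medium": 0.5, "hard": 0.8}
print(next_item(student, catalogue))  # medium
for _ in range(3):
    student.record(True)              # three correct answers in a row
print(next_item(student, catalogue))  # hard
```

Real learning-analytics platforms replace the moving average with far richer models (item response theory, Bayesian knowledge tracing) fed by thousands of signals, but the sense-estimate-adapt loop is the same.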
AI/Machine Learning is a huge and complex area where startups and big players are working hard to push the boundaries of what is possible. In 2019, the CIA’s deputy director of technology development confirmed the organization had 137 ongoing AI projects, and CrunchBase says “There are 8,705 startups and companies listed today who are relying on machine learning for their main and ancillary applications, products, and services.” This is also the tip of a very big iceberg. Take, for example, people.ai, a startup that “helps sales, marketing, and customer success teams uncover every revenue opportunity from every customer. Their system captures all customer contacts, activity, and engagement through real-time integration, then analyzes the aggregated data using AI and machine learning.” It’s a very short step to apply this kind of AI to education — and it’s already happening in companies like Century Tech, who say they create and adapt a “personal pathway that contains micro-lessons called ‘nuggets’ designed to address gaps in knowledge, provide stretch and challenge and promote long-term memory retention.”
Digital Humans are here. Companies like Soul Machines take the advanced computer-generated imagery (CGI) and motion-capture technology used for movies, television, games and augmented reality and add AI, so that “Technology now makes it possible for these same characters to interact, learn and express themselves in human ways — and in 1:1 interactions — to varying degrees. But adding artificial intelligence (AI) to CGI — and suggesting this creates Digital Humans — is far from what’s real or even possible.”