A look at how digital technologies designed for young people are often addictive, unhealthy or unsafe — and what we should be doing about it.
by Stephanie Hankey and Daisy Kidd
The destruction of the natural environment has sparked anger and frustration among young people¹, who have taken to the streets to demand change, and yet threats to their digital environment are less widely discussed. Young people around the world are disillusioned that their futures are being drastically impacted by the actions and decisions of older generations, whom they feel have not done enough to act responsibly and sustainably. The threats to our natural environment run parallel to the systemic challenges and threats to democracy that come from digital technologies; they demand the same attention from younger generations. Young people care about the digital environment they inhabit and will inherit, and we need to listen to them and involve them in the conversation about their digital future.
In late 2017, the former vice-president of user growth at Facebook, Chamath Palihapitiya, publicly stated that platforms such as Facebook were “destroying the social fabric of society” and that he did not allow his children to use digital technologies at home. Shortly afterwards, shareholders holding over 2 billion dollars of stock in Apple wrote an open letter urging the company to address the addictive effects of iPhones and iPads on children. This was quickly followed by initial responses from a handful of tech companies, which reflected on the potentially addictive nature of their tools and services and introduced the topic of ‘digital well-being’. However, over two years after industry leaders and influencers publicly acknowledged their own concerns about the impact of technology on children and young people, very little has been done. As everything from emergency home schooling to social interaction pushes us towards further technological dependency, we find ourselves, as families and as societies, more reliant on these technologies than ever, yet still unprepared to curb or solve the problems they create.
Two years after industry leaders and influencers publicly acknowledged their own concerns about the impact of technology on young people, very little has been done.
Today, by the age of 11, 90% of children in Europe and the US have a smartphone, and in many contexts 50% of children as young as seven have a personal device. They live in a connected world, from smart classrooms to smart cities to smart homes. This makes it an important time to start working with young people on questions of digital literacy and agency, but also on what artist and technologist James Bridle calls “systemic literacy”². He defines this term as “the thinking that deals with a world that is not computable, while acknowledging that it is irrevocably shaped and informed by computation.” It is no longer enough to think about the internet, or data, or smart devices as independent things — we must start thinking of them as part of a wider system that includes the climate, economics, politics and power.
Parents, educators and regulators have struggled with protecting children on the internet for some time. However, the widespread introduction of the smartphone, combined with the use of machine learning and data-driven business models, has changed the nature of the problem. Unlimited and unfettered access to ‘free’ services has lowered the barrier to entry and enhanced potential for learning, exploration and self-expression, which has brought many benefits, especially for those who traditionally have had less access. Yet it has also expanded exposure to online harms and introduced a lack of agency in decision making. Young people use private devices to access public spaces where they have to navigate an incredibly complex, challenging and rapidly shifting environment at a time in their lives when they are developing as individuals and learning to independently explore the world outside of the home.
Young people use private devices to access public spaces at a time in their lives when they are developing as individuals and learning to independently explore the world outside of the home.
As many children use their phones and tablets not only for entertainment and communication, but also as safety, coordination and support devices, new dependencies form. Studies show that devices have moved to an ‘always on’ mode, with a majority of young people reporting that they sleep with their smartphones next to their bed and feel nervous when their phone is running out of power. Adults are constantly trying to adapt to parenting in a digital age. An Australian study found that 62% of parents have arguments with their children about ‘screen time’, whilst parents themselves reportedly spend an average of 3.5 hours per day on their phones. Amidst the COVID-19 crisis, everyday activities such as schooling have moved online, further complicating this relationship. Increased dependency has not only laid bare the inadequacies and failings of digital technologies but has also opened the door to greater control by the big tech companies, as Naomi Klein has described in detail.
Parents not only struggle to set digital boundaries; they are also often the ones bringing the latest data-driven devices into the home. Smart parenting can include everything from tracking technologies to data-driven home, play and entertainment devices, such as smart speakers or ‘Internet of Things’ toys — yet parents rarely know what data is being collected, transferred and utilised. If they wanted to find out by reading the privacy policies or terms of service, they would not only need to read the lengthy first-party document, but also navigate an endless spiderweb of third-party policies.
Schools equally rely on data-driven technologies, with G Suite for Education (Google for Education) products among the most widely used tools for administering, documenting and organising school life. This reduces costs for schools, yet further expands the data and brand reach of centralised technology monoliths, and can lead to tracking beyond the classroom. Approaches to data-driven technologies in schools vary by country. In France, for example, phones in schools are banned, whereas in the US apps for behavioural scoring of children and young people in schools and facial recognition for classroom entry are being introduced. In this context, the idea of ‘online’ or ‘cyber’ safety for children has become a thing of the past. At home, at school, alone or with friends, the digital (data) environment has become an ever-present part of a young person’s daily life. This dependency on technology has only grown in the context of a worldwide pandemic that restricts movement and requires large-scale, long-term social distancing.
It is important to acknowledge that digital technology is an inevitable, and often positive, element of young people’s lives. However, we are not ready and have not done enough to fix the environment in which they operate. Data-driven technologies bring great benefits to society and to the planet, yet they are also currently failing us and especially failing young people who will inherit a digital environment that is out of their hands.
“Young people may be ‘thumb fast’, but that does not make them data-savvy.” — Beeban Kidron, 5 Rights
The data-driven and machine learning nature of these technologies creates a dynamic that is hard to escape and even harder to understand from the outside. The ‘true cost’ of these free and low-cost services is rarely understood, and the ways in which the majority of services, platforms and apps operate are opaque to young users. Young people may be ‘thumb fast’, as Baroness Beeban Kidron states, but that does not make them data-savvy. Recommendation algorithms drag young people down negative algorithmic ‘rabbit warrens’ of content on YouTube and TikTok. Free computer games use online gambling technologies to keep children and young people coming back for more. Technologies proven to be harmful, such as anonymous rating apps implicated in bullying and attempted suicide cases, are shut down following pressure from parents, only to re-emerge as popular features on Snapchat. Teenagers create and spread disinformation because social media is the perfect breeding ground, and there is a market for it. Furthermore, new tools come on the market every day, hoping to be the ‘next big thing’, yet trusted app stores do not vet them, incorrectly classifying hundreds of inappropriate and data-leaky apps as suitable for children. The tech industry tests tools designed for an ‘ideal’ world, not a ‘real’ world, on young people — in no other industry would this be possible. The UK Children’s Commissioner succinctly framed the problem as such: “When it comes to staying safe online, children and their parents have been left with all of the responsibility, but none of the control.”
“When it comes to staying safe online, children and their parents have been left with all of the responsibility, but none of the control.” — UK Children’s Commissioner
Disrupted Childhood, a report by the UK children’s rights and technology advocacy group 5 Rights, looks at how this data business model is enabled through persuasive design. Techniques promoted and popularised through big tech platforms include dopamine rushes, popularity contests, ‘pings’ and emotional highs, all used as strategies to “capture and hold users’ attention and imprint habitual behaviours.” This leads to a fundamental question: Can we blame young people for not being able to put down phones that are intentionally designed to habituate them by some of the wealthiest companies in the world? 5 Rights reports that these mechanisms, combined with machine learning–driven services and content, impact children significantly, contributing to “personal anxiety, social aggression, denuded relationships, sleep deprivation” with an impact on “education, health and wellbeing.”
Can we blame young people for not being able to put down phones that are intentionally designed to habituate them by some of the wealthiest companies in the world?
Within this context, very little is said about the business model behind these data-driven technologies, and how young people are part of a vast industry designed not only to keep their attention but to modify their behaviour and collect their data inequitably. In her book ‘The Age of Surveillance Capitalism’³, Shoshana Zuboff states: “They accumulate vast domains of new knowledge from us, but not for us. They predict our futures for the sake of others’ gain, not ours.” As we enter further into what Zuboff calls “behavioral future markets”, where surveillance capitalists gather “ever more predictive sources of behavioral surplus”, young people will end up at the forefront of this intrusion. If we are going to start addressing surveillance capitalism for meaningful and positive change, then we need to start with young people. This is true not only of the big tech companies but also of less-explored parts of industries that utilise data from young people en masse, such as actors in the gaming industry.
We are learning more and more about how digital environments designed for and used by children can exacerbate and intensify challenges related to economic differences, as well as to their wellbeing and mental health. The American non-profit Common Sense Media has shown that mobile phone exposure differs for young people from minority communities and across income and class lines. This difference has been accentuated by emergency schooling and dependency on infrastructure in the home, with many children left out of the loop and with little to no access to education. Young people from low-income and minority communities in the US are more heavily impacted at an earlier age. Furthermore, Sonia Livingstone’s project at the London School of Economics has explored the complexity of such a digital environment for children at risk and residing in care, finding that it can offer them a lifeline and an important space for escape, yet also exacerbate vulnerabilities that expose them and their carers to risk. The same is true for children with learning challenges and disabilities, for whom the benefits are significant and the dependency growing, yet the current environment presents serious risks, such as avenues for abuse and behaviour modification. This is a largely underexplored area of risk that is not yet balanced against the significant benefits.
Some progress has been made in reining in tech companies. Yet, despite essential progress on young people’s data under the GDPR and COPPA, the effects of these regulatory changes have to date mostly served to clear up the very worst parts of the industry. Many areas of practice have yet to be revised. The changes to even a key area such as consent are still at best partial and at worst meaningless. In most cases, young people simply circumvent the need for parental consent, with 61% of under-13s having a social media account, despite those services being for over-13s. The same goes for dating apps, such as Tinder, despite over-18 policies. These examples are ever changing and serve to illustrate the depth and breadth of the problem and the inadequacy of working on a purely reactive basis. The one exception and essential development in the policy landscape is the UK Information Commissioner’s Office’s Age Appropriate Design Code, which sets an extremely high standard and reflects an up-to-date, comprehensive understanding of the challenges, but would require an enormous shift in the industry.
Overall, the digital environment is an inequitable environment for young people. Young people want to have a say in the digital environment they inhabit and call for support in digital wellbeing and resilience. Trust among young people that technology companies act responsibly is eroding: three out of four teens believe that tech companies intentionally manipulate them, and there is reasonable concern that regulators have not taken, and will not take, the necessary measures. Parents and educators equally struggle to answer questions about how these technologies work and what the main concerns should be. And even for regulators and rights groups who do try to take action, this is a difficult terrain to navigate, needing to balance protection against the infringement of rights.
The answer here is not to remove technology from children, nor to create walled gardens, but to enhance the agency and control of young users and to empower them and those who support them to affect the tools they use.
Children and young people play, learn and socialise in a ‘datafied’ commercial environment, one where tracking, personalising, profiling, scoring, targeting, and nudging are monetised and surveillance is normalised. Dr. Victoria Nash of The Oxford Internet Institute describes this as a significant shift: from concerns of “children using the internet”, to “the internet using children”. The answer here is not to remove technology from children, nor to create walled gardens, but to enhance the agency and control of young users and to empower them and those who support them to affect the tools they use.
Beyond the question of how digital technologies are currently experienced by young people, there are wide-ranging debates about how they will shape the way we, as societies, will live in the future. These include ethical questions about artificial intelligence, automation and the future of work; the trade-offs associated with ‘techno-solutionism’ as a political response; the threats to democracy and the media; the future of digital citizenship; and the environmental impact of technology dependency. What is surprising is that young people are not educated about the challenges to their values and norms that they will inevitably face. They currently struggle not only to imagine what these issues are but also to connect them to things they learn about at school.
This moment of increased dependency provides a unique opportunity to engage with young people and to critically reflect on how they want the digital environment they inhabit and will inherit to look. This informs our new youth and data project at Tactical Tech. Over the coming years we need to be asking:
- What needs to be done to the current digital environment to make it appropriate, equitable and fair for young people?
- What does it mean for young people to grow up in a ‘post-truth’ age? How do algorithms impact what they see? And how do augmented reality and the spilling over of digital surveillance into real life impact their actions and their autonomy?
- What does the new vocabulary for digital rights look like from a young person’s perspective?
- What do we need to do to build digital (data) literacy and citizenship into the educational curriculum?
- What are the values and norms that they should expect from the digital environment?
- What would a positive digital future environment look like? And how can we ensure that this happens in a sustainable and proactive way, with young people at the centre of it?
Stephanie Hankey is the Executive Director of Tactical Tech, an international NGO working in public education on the impact of technology on society.
Daisy Kidd leads Tactical Tech’s youth initiative. To hear more about the project, contact daisy — at — tacticaltech . org.
1. ‘Young people’ in this article refers to 10–18 year olds.
2. James Bridle, New Dark Age (Verso: 2018)
3. Shoshana Zuboff, The Age of Surveillance Capitalism (Profile Books: 2019)