Why Teach Students About AI?

School pupils should be learning about AI, even if they don’t pursue it as a career

Nicole Wheeler
Nerd For Tech
6 min read · Jul 21, 2021


We have a duty to prepare each generation of students for the future that lies ahead of them. This generation of school pupils will fall into at least one of the following groups:

  1. Professionals who train machine learning algorithms as part of their job.
  2. Professionals who interact with AI products as part of their job.
  3. Citizens who have decisions or predictions made about them by AI.

All three of these groups have good reasons to be learning about AI in school.

Photo by Clarisse Croset on Unsplash

For professionals: addressing the UK digital skills shortage

The UK is set to face a skills shortage in programming and data science. A report from WorldSkills UK found that over one in three employers say their workforce lacks the advanced digital skills their businesses need, and fewer than half of British employers believe students are leaving school with a high enough level of digital skills. 70% of young people surveyed said they expect their employer to invest in their digital skills, but only half of employers said they actually provide this training.

There is also a stark gender gap in digital skills, with young females under-represented at every level. Females account for just 22% of GCSE entrants in IT subjects, 17% of A-Level entrants, 23% of apprenticeship starts in ICT, and 16% of undergraduate starts in computer science.

But what is driving these differences?

Low interest from girls in digital skills. Since 2015, the ICT GCSE has been phased out in favour of computer science, and 2020 figures showed a 40% drop in GCSE entries in ICT. Most of these entries were replaced by boys taking up computer science, but not by girls. Boys are four times more likely to say IT is their favourite subject, and over five times more likely to say that they plan to take an IT-related subject at A-Level.

Limited access to IT resources. With schools, colleges and universities having to move to remote education during the pandemic, there has been growing recognition that many young people lack the devices or the internet connections to access online learning. One in ten young people in the UK lack access to an appropriate digital device on which to do their homework and build their skills, rising to one in five among young people from lower socio-economic groups.

Unclear career benefits. The two most common answers given for why students would not consider a career requiring advanced digital skills were not having the required knowledge (41%) and thinking careers requiring these skills sounded boring (30%). Dr Neil Bentley-Gockmann, Chief Executive of WorldSkills UK, says there are four main reasons why the digital skills shortage is growing across the country:

  • a lack of clearly defined job roles in certain fields
  • a lack of understanding and guidance about potential career paths
  • a lack of relatable role models
  • a difficulty in making many technical professions seem appealing to young people, especially young women

Bringing professionals into schools to share their career journeys and describe the work they do is a great way to tackle these challenges. But this approach relies on schools being able to connect with these professionals and host them onsite on a workday, so teaching resources that serve a similar purpose would also be helpful. These could include lessons designed by professionals working in data science and programming, or interviews with professionals about their jobs.

Photo by Ryoji Iwata on Unsplash

For citizens: improving AI literacy

As AI becomes increasingly integrated into our lives, a population that’s well-educated about AI will be better prepared to embrace AI where it is beneficial and to identify situations in which it may be doing harm. The UK AI Council’s AI roadmap recommends that students:

“[know] enough to be a conscious and confident user of AI-related products; to know what questions to ask, what risks to look out for, what ethical and societal implications might arise, and what kinds of opportunities AI might provide”
- UK AI Council, AI Roadmap

There are a number of ways AI can be designed, trained, or deployed that can do harm. The team behind AI Blindspot has compiled an excellent set of resources outlining how unconscious biases and structural inequalities can creep into products. It’s important that the people who develop AI understand these issues, but it’s also important for the people who regulate AI, the people considering using AI in their business, and the people who have decisions about them made or influenced by AI.

For example, a recent Mercer survey found that the proportion of companies using predictive analytics for hiring jumped from 10% in 2016 to 39% in 2020. These tools do things like analysing a candidate’s face and voice during an interview, or scanning CVs and selecting those that look like the best match for a job. MIT Technology Review recently investigated AI interview tools, which companies have already adopted to help them screen large numbers of job candidates, and found that they perform very poorly. One screening tool, for instance, gave a candidate a high proficiency score for English and ranked them in the top half of candidates, even though they spent the entire interview reading from a Wikipedia page written in German.

Another system tested by MIT Technology Review, Pymetrics, tests interviewees on a number of tasks, then checks which candidates behave in a similar way to employees who are already successful in that role. While this may sound like a sensible approach, hiring a team of people who all think and solve problems in the same way has been shown to be less effective than building diverse teams.
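The basic mechanic, and why it tends to reproduce the existing team, can be shown with a toy sketch. This is not Pymetrics’ actual method; the task scores and the distance-based ranking below are invented purely for illustration:

```python
# Toy sketch of similarity-based candidate screening (not Pymetrics' actual
# method): rank candidates by how close their task scores are to the average
# profile of existing "successful" employees.
import numpy as np

# Hypothetical task-score profiles (rows = people, columns = tasks).
current_employees = np.array([
    [0.90, 0.20, 0.80],
    [0.80, 0.30, 0.90],
    [0.85, 0.25, 0.85],
])
candidates = {
    "A (similar to current team)":  np.array([0.88, 0.22, 0.87]),
    "B (different problem-solver)": np.array([0.30, 0.90, 0.60]),
}

# The "ideal" profile the tool effectively learns from the current team.
target = current_employees.mean(axis=0)

# Smaller distance = higher ranking, so candidate A is always preferred and
# the team keeps hiring people who behave like the people it already has.
for name, profile in candidates.items():
    print(name, "distance from current team:",
          round(np.linalg.norm(profile - target), 3))
```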

Having students build a simple machine learning algorithm themselves can help them gain intuition about how these algorithms work and the problems that can arise if algorithms aren’t trained and evaluated carefully; a minimal sketch of such an exercise is included below. Looking at examples of AI behaving badly can also help students explore their opinions on AI and its adoption in different areas. Below are some useful resources:
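The sketch mentioned above, written in Python and assuming scikit-learn and its bundled iris dataset (illustrative choices rather than tools named in this article), contrasts careless evaluation with evaluation on held-out data:

```python
# Classroom exercise: train a small classifier and compare what happens when
# it is evaluated carelessly (on its own training data) versus carefully
# (on data it has never seen).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)

# Careless evaluation: the model is scored on the same data it was trained on,
# so the accuracy looks misleadingly perfect.
overfit_model = DecisionTreeClassifier(random_state=0).fit(X, y)
print("Accuracy on training data:", accuracy_score(y, overfit_model.predict(X)))

# Careful evaluation: hold out a test set the model never sees during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("Accuracy on held-out data:", accuracy_score(y_test, model.predict(X_test)))
```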

Photo by NeONBRAND on Unsplash

Improving current teaching tools

Schools are increasingly offering classes on Python programming and using simplified AI teaching tools like Teachable Machine to introduce students to AI, but teachers want more advanced and realistic programming tasks to give to students. Some of the features that make for really helpful teaching resources include:

  • Programming in a job-relevant language, like Python
  • Real data, including messy data with outliers or biases (see the sketch after this list)
  • Addressing problems students care about that may lead to real jobs, like climate change or improving clinical diagnoses
  • Projects that extend across the term and involve planning, problem-solving, teamwork, and communicating findings to others
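To make the “messy data” point concrete, here is a rough sketch of what such a task might look like. The air-quality dataset, column names, and cleaning choices are made up for illustration; the sketch assumes pandas and scikit-learn:

```python
# Sketch of a "messy data" classroom task: load a small dataset with missing
# values and an obvious outlier, clean it up, and only then fit a model.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical dataset: weekly air-quality readings collected by students.
readings = pd.DataFrame({
    "week":        [1, 2, 3, 4, 5, 6, 7, 8],
    # None = sensor dropout, 250.0 = an obviously faulty reading
    "temperature": [14.0, 15.5, None, 17.0, 18.2, 19.1, 250.0, 21.0],
    "pm25":        [12.0, 11.5, 13.0, None, 10.2, 9.8, 9.5, 9.0],
})

# Step 1: discuss and handle missing values (here we simply drop those rows).
clean = readings.dropna()

# Step 2: discuss and handle outliers (here, an implausible temperature).
clean = clean[clean["temperature"] < 60]

# Step 3: fit a simple model on the cleaned data and inspect the trend.
model = LinearRegression().fit(clean[["week"]], clean["pm25"])
print("Estimated change in PM2.5 per week:", model.coef_[0])
```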

Nicole Wheeler

Bioinformatician + data scientist, building machine learning algorithms for the detection of emerging infectious threats to human health