Artificial Intelligence in Education

Absolutely NOT an article on robot teachers

Aaron Wang
Prodigy Engineering
7 min read · Sep 15, 2021

--

If I had written this article 20 years ago, I would probably have needed to explain that AI in education doesn’t necessarily look like a robot teaching students. But just in case anyone still thinks so, I put it in the subtitle!

That said, AI and machine learning have gained so much traction in recent years that we might even see this become a reality. In fact, some countries are currently giving it a try.

Industry Trends

The Google Trends data below for AI, machine learning, and deep learning show a dramatic increase starting in 2014 and a slight drop after 2020. The trend roughly indicates public interest in the subject, which can be a lagging indicator of research and potentially a leading indicator of industrial applications.

A global survey by McKinsey in 2020, with 2,395 participants, found that almost 50% of respondents said their organizations had adopted AI in at least one function. Among the AI adopters, 22% attributed more than 5% of their fiscal year 2019 EBIT to AI, while 48% attributed less than 5%.

The same survey also breaks down revenue increases by company function. While marketing and sales remains the number one function, finance, supply chain, manufacturing, and risk all saw large increases.

Product and service development slowed down but still made one of the largest revenue contributions of any function, with 16% of respondents reporting a revenue increase of more than 10% in 2019.

Research by Deloitte in 2017 summarized the costs and benefits of machine learning projects into four categories:

  1. Revenue and growth
  2. Time and efficiency
  3. Capital savings
  4. Investment costs

According to the survey, the benefits of a machine learning project can range from $250 thousand to $20 million, while the costs range from hundreds of thousands of dollars to millions. First-year ROI is typically two to five times the cost, which means most projects make money, and the average project duration is 12 months.

Research Trends

There are too many AI research topics to cover in one article, so I picked four areas from the State of AI Report. The report emphasizes deep learning, but bear in mind that AI and machine learning go beyond deep learning in general. Please have a look at the original report, which offers much more than what is discussed here.

1. AI is less open

Those of us with a software background may assume everything is open source nowadays, but that’s not always the case in AI research.

Only 15% of papers are published with code, and as you might expect, academic groups are more likely to publish their code than industry groups. Also, the industry’s top talent is highly concentrated in big companies such as DeepMind, OpenAI, Google, and Amazon.

2. Deep Learning Frameworks

Researchers love PyTorch, but TensorFlow and Caffe are still the first choices in production.

3. Size Matters in Deep Learning

GPT-3, a natural language processing model, leverages 175 billion parameters. The reason is simple: the larger the model, the better the performance (if it is set up properly, of course).

4. Cost for Training

Based on information from Google, training a deep learning model costs roughly $1 per 1,000 parameters, and some experts estimate that the budget for training GPT-3 was around 10 million dollars. So state-of-the-art deep learning research is a closed game among top companies.

AI in Education

So how is AI related to the educational technology industry? A report by Deloitte points out that AI may transform education for everyone involved (see the table below). In these ways, machine learning contributes to education equity, efficiency, and quality.

AI Algorithms in Education

There are many AI algorithms we can leverage to bring about the benefits above. The algorithms here form a broader category than machine learning: some of them don’t require machine learning at all, although using machine learning or deep learning alternatives can increase their accuracy.

Education Equity

Optical character recognition (OCR) is the electronic or mechanical conversion of images of typed, handwritten, or printed text into machine-encoded text.

OCR can improve education equity by establishing an asynchronous connection between remote teachers and students, offering an alternative way to collect input from students who may lack consistent access to internet-connected devices.

OCR also provides a foundation for natural language processing by digitizing handwritten data into the cloud, where educators can analyze it afterward.
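
As a rough illustration of how little code basic OCR requires, here is a minimal sketch using the open-source Tesseract.js library. The library choice and the file name are assumptions for the example, not a statement about what any particular product uses.

```ts
// A minimal OCR sketch with Tesseract.js (chosen only for illustration).
// 'scanned_worksheet.png' is a hypothetical input image of handwritten
// or printed student work.
import Tesseract from 'tesseract.js';

async function extractText(imagePath: string): Promise<string> {
  // Recognize English text in the image and return the decoded string.
  const { data } = await Tesseract.recognize(imagePath, 'eng');
  return data.text;
}

extractText('scanned_worksheet.png')
  .then((text) => console.log(text)) // machine-encoded text, ready for NLP
  .catch((err) => console.error(err));
```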

Education Efficiency

Natural language processing (NLP) is concerned with the interactions between computers and human language, and in particular, how to program computers to process and analyze large amounts of natural language data.


According to a study titled Natural Language Processing for Enhancing Teaching and Learning, NLP improves education efficiency by liberating teachers from repetitive tasks and offering schools analytical reports on student performance.

NLP can support automatic language assessment (summative), intelligent tutoring (formative), dialogue generation, chat room moderation, material recommendation, and question generation.
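
To make one of these concrete, here is a toy sketch of question generation: a cloze (fill-in-the-blank) question built from a sentence and a keyword. Real NLP systems use far richer models; the function and the example sentence below are purely illustrative.

```ts
// A toy "question generation" sketch: blank out a keyword to create a
// fill-in-the-blank prompt. Everything here is invented for illustration.
function makeClozeQuestion(sentence: string, keyword: string) {
  if (!sentence.includes(keyword)) {
    throw new Error('keyword not found in sentence');
  }
  return {
    prompt: sentence.replace(keyword, '_____'),
    answer: keyword,
  };
}

console.log(makeClozeQuestion(
  'A triangle has three sides and three angles.',
  'three sides',
));
// -> { prompt: 'A triangle has _____ and three angles.', answer: 'three sides' }
```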

Education Quality

A recommender system seeks to predict the “rating” or “preference” a user would give to an item. Recommender systems improve education quality by leveraging teacher-generated data to offer personalized content to students. They can recommend or auto-allocate assignments, plans, and test preps to teachers, and they can also recommend additional learning resources to parents.
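
As a rough, library-free sketch of the idea, the snippet below scores content by cosine similarity between rating vectors, one rating per teacher. The item names and numbers are made up for illustration; production recommenders are considerably more sophisticated.

```ts
// Item-based collaborative filtering sketch: items with similar rating
// patterns are likely to appeal to the same teachers. All data is invented.
type Ratings = Record<string, number[]>; // item -> one rating per teacher

function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b) || 1);
}

const ratings: Ratings = {
  'fractions-quiz': [5, 4, 0, 5],
  'geometry-test-prep': [4, 5, 0, 4],
  'word-problems-set': [1, 0, 5, 1],
};

// How similar is each item to 'fractions-quiz'? Higher means "recommend next".
for (const [item, vec] of Object.entries(ratings)) {
  console.log(item, cosine(ratings['fractions-quiz'], vec).toFixed(2));
}
```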

Enablers

Procedural content generation (PCG) is the process of creating data algorithmically rather than manually, typically used to create textures and 3D models in computer graphics and large amounts of content in games.


As a digital game-based learning company, Prodigy can leverage PCG to increase the amount of game content and replayability, reduce development time and cost, and establish a competitive edge.
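
As a toy illustration of the concept (not Prodigy’s actual generation system), the sketch below produces arithmetic questions deterministically from a seed, so the same seed always reproduces the same content, which is exactly what makes procedurally generated levels repeatable and testable.

```ts
// Procedural content generation sketch: seed-reproducible math questions.
// The generator and difficulty parameters are invented for illustration.
function seededRandom(seed: number): () => number {
  // Tiny linear congruential generator; enough to show deterministic output.
  let state = seed >>> 0;
  return () => {
    state = (state * 1664525 + 1013904223) >>> 0;
    return state / 4294967296;
  };
}

function generateQuestions(seed: number, count: number, maxOperand: number) {
  const rand = seededRandom(seed);
  return Array.from({ length: count }, () => {
    const a = Math.floor(rand() * maxOperand) + 1;
    const b = Math.floor(rand() * maxOperand) + 1;
    return { prompt: `${a} + ${b} = ?`, answer: a + b };
  });
}

console.log(generateQuestions(42, 3, 10)); // same seed -> same three questions
```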

AI Infrastructure

To develop any AI application, we need infrastructure. The video below, which I highly recommend, illustrates the basics of machine learning infrastructure quite well.

Alternatively, the infrastructure can be seen as a platform that automates the traditional analytical process, from data gathering to result presentation, with the help of machines and code.

Each tool fills a specific niche, but together they form a complete pipeline that replaces a cognitive task (e.g., identifying cats) that previously could only be done accurately by human brains.
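
To make the pipeline idea concrete, here is a schematic sketch in which each stage is just a function. The stage names and toy data are assumptions for illustration, not a description of any specific tool.

```ts
// A schematic ML pipeline: gather -> clean -> train -> present.
// Each stage is an ordinary function; real infrastructure adds scheduling,
// storage, monitoring, and so on around stages like these.
const gatherData = (): string[] => ['  Raw Record 1 ', ' raw record 2'];
const cleanData = (rows: string[]): string[] =>
  rows.map((r) => r.trim().toLowerCase());
const trainModel = (rows: string[]) => ({ examplesSeen: rows.length });
const presentResult = (model: { examplesSeen: number }): string =>
  `model trained on ${model.examplesSeen} records`;

// Chain the stages end to end.
console.log(presentResult(trainModel(cleanData(gatherData()))));
```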

AI at Prodigy: A Case Study on Math Tutoring

Prodigy has been leveraging machine learning for a while to automate certain tasks. Prodigy Math Tutoring, for example, is an important part of our business model: it complements the math game by providing customized tutoring suited to each student’s personal learning pace and needs.

We get a large number of requests for free sessions every day. And guess what: a lot of the requests come from students who think they will get free in-game items through bookings, which is not actually the case. Here are some real booking messages we received from students.

  • La-la, la-la, la-la
  • let him be a member pls
  • i do math and have fun

While some of the students’ responses are humorous, they did pose a challenge for us, as it took time to filter out the requests from real parents and guardians.

And that’s where machine learning comes into play. We manually cleaned up thousands of booking requests and tagged each one as either legitimate or not. Then we fed the data to a Naive Bayes classifier, which tokenizes the individual words and predicts the label based on Bayes’ theorem.

Using the joint probability and the naive assumption that all features are mutually independent, the probability function can be simplified further. In the end, the classifier predicts a label using only the probability of each outcome (the prior) and the likelihood of the features given that outcome.
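
Written out, this is the standard Naive Bayes formulation (nothing specific to our setup):

```latex
P(y \mid x_1, \dots, x_n) \;\propto\; P(y) \prod_{i=1}^{n} P(x_i \mid y),
\qquad
\hat{y} = \arg\max_{y} \; P(y) \prod_{i=1}^{n} P(x_i \mid y)
```

Here y is the label (legitimate booking or not), x_1, …, x_n are the tokenized words, P(y) is the prior, and P(x_i | y) is the likelihood of each word given the label.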

With the help of the natural library, we trained and implemented the model. The classifier gives us a decent accuracy rate, and together with some rule-based weightings, we are able to automate a tedious task with high accuracy.
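
For readers curious what this looks like in practice, below is a minimal sketch of training a Naive Bayes classifier with the natural library. The example messages and labels are invented for illustration; they are not our production data or code.

```ts
// A minimal Naive Bayes sketch with the natural library (Node.js).
// Training messages and labels below are made up for illustration only.
import natural from 'natural';

const classifier = new natural.BayesClassifier();

// Tag a handful of example booking messages.
classifier.addDocument('let him be a member pls', 'not_legitimate');
classifier.addDocument('i do math and have fun', 'not_legitimate');
classifier.addDocument('Looking for a tutor to help my daughter with fractions', 'legitimate');
classifier.addDocument('My son needs extra support with multiplication', 'legitimate');

classifier.train();

// Predict labels for new requests.
console.log(classifier.classify('please give my son free membership'));
console.log(classifier.classify('We would like weekly tutoring for algebra'));
```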

Recap

This short article only scratches the surface of AI and a few of its algorithms. It aims to provide an overview and the very basics for anyone who finds the topic interesting. If you want more detail, I recommend reviewing the resources section below.

Resources

Market Research

Machine Learning Fundamentals

Algorithms

Implementation
