Big data, big problems?

It can help improve and personalize the learning experience, but deploying algorithms to make decisions can open the door to racism, sexism and elitism, analysts warn

IlanaIB
IB World
8 min read · Feb 15, 2018


How will big data affect education? (Getty Images/Hero Images)

By Sophie-Marie Odum

Every detail of your life — what you buy, where you go (and who with) — is being extracted from the internet, packaged and traded by data-mining companies. Big data analysis, or the mining of extremely large data sets to identify trends and patterns, is fast becoming standard practice in many sectors, including education.

Data is collected from the websites you browse, the things you buy, social media posts, customer-loyalty reward cards, and the music you listen to online. Your favourite brands use this information to better understand you and your spending habits so they can target-market their products and services. Some people are concerned about the invasion of privacy, while others welcome this growing practice.

In fact, an increasing number of business models are built on big data. For example, music streaming service Spotify and video streaming service Netflix are successful in part because they use an invisible array of algorithms to recommend content that you may like.
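The core idea behind such recommenders can be sketched in a few lines. This is a deliberately minimal illustration of item recommendation via user similarity, not how Spotify or Netflix actually work; the users, items and ratings below are invented for the example.

```python
# Toy sketch of similarity-based recommendation: score items a target
# user hasn't rated by the ratings of other, similar users.
# All names and numbers are invented for illustration.
import math

# rows = users, values = item ratings (missing = not yet rated)
ratings = {
    "ana":   {"jazz": 5, "rock": 1},
    "ben":   {"jazz": 4, "rock": 2, "folk": 5},
    "chloe": {"jazz": 1, "rock": 5, "folk": 2},
}

def cosine(u, v):
    """Cosine similarity between two users' rating vectors."""
    common = set(u) & set(v)
    dot = sum(u[i] * v[i] for i in common)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(target, k=1):
    """Rank unrated items by similarity-weighted ratings of other users."""
    scores = {}
    for other, theirs in ratings.items():
        if other == target:
            continue
        sim = cosine(ratings[target], theirs)
        for item, r in theirs.items():
            if item not in ratings[target]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("ana"))  # ana's tastes resemble ben's, so "folk" surfaces
```

Real systems add far more signal (listening history, context, content features), but the principle is the same: behaviour data in, a ranked guess at your preferences out.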

As schools look to improve and personalize learning and teaching, many are taking a data-driven approach. Using algorithms — a sequence of instructions or a set of rules — schools can deepen their understanding of how their students are learning and provide the necessary support. Test and attainment scores can be fed into algorithms to help shape curriculums, improve teaching, differentiate class instruction and encourage educators to consider other methods of assessment. Big data can also help identify the reasons behind dropout rates and absenteeism in certain communities.

Around the world, higher learning institutions are relying on big data to enhance the racial and economic diversity of their student populations, as well as retention and graduation rates.

Such institutions have always gathered information about their students — from how many complete certain courses to how accurately a grade in one course predicts success in other classes — but until now, much of that information was collected merely for accountability purposes.

Preparing students for employment

Allison Littlejohn, Professor of Learning Technology and Academic Director of Digital Innovation at The Open University, UK, says the possibilities are endless if data is used in the right way.

“We can look at trends, and take some of the data that’s coming from education and school systems, and connect that with employment within countries. Depending on what the future job opportunities might be, schools can then adapt the curriculum.

“At one level, that might seem scary and ‘Big Brother-like’, but we’re now in a society where jobs and employment opportunities are continually changing. We need to be sure that students are properly prepared so that when they do leave school, they’re able to aim for jobs that still exist, and later change careers, which they’re very likely to do throughout their lives,” she says.

The Open University is using quantitative data to examine how schools and education systems operate. In the future, Littlejohn predicts that most students will be learning more online, offering more opportunities for analytics and personalization.

“If all students had their own digital devices, and learned through those devices, there would be many opportunities for them to connect with teachers and other students around the world, and gather resources or contribute their knowledge online,” says Littlejohn.

“All of that can be traced, and we’ll be able to see the progress a student is making. Teachers and parents can see much more clearly how well a student is progressing. Schools will be able to target the support that the student might need for their learning much better, because everything becomes more transparent.”

The IB sees the potential for big data in education. Director General Dr Siva Kumari wants the IB to be underpinned by data, a central part of the IB’s strategy 2.0.

“Big data offers a way to help students have learning that is tailored to their needs,” says Dr Kumari.

“The world of education is waking up to the technology, and we want to be that organization that distributes best practice. We have a very vibrant community that spends a lot of time thinking about good teaching and we have thousands of teachers around the world who are invested in good teaching. Therefore we can serve as a platform for distribution of these practices,” she adds.

“As part of strategy 2.0, we could quickly share what’s working well in a certain type of school, and we can also build predictive systems. However, we should know how to collect, use and create intelligence out of data sets for schools. It’s mostly a case of how we can become a great data-rich organization that distributes back to the schools, all with the aim of creating a really good education for students.”

The dark side of data

The potential for big data is immense — but so are the risks. Because algorithms are produced by people, who may carry inherent biases, critics argue that algorithms can reflect those biases and harm learning.

Google’s Autocomplete feature, which aims to help users complete an internet search, is a simple example of how algorithms can produce a biased outcome and perpetuate stereotypes. However, it’s equally important to recognize that Google’s Autocomplete is based on what it learns from its users and popular searches. Multiple studies have found that gender and racial biases operate in the classroom, and carelessly trusting an algorithm can reinforce, rather than eliminate, discrimination.

Cathy O’Neil, author of Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, a book that tracks the effects of computerized discrimination in today’s society, says the discriminatory, even predatory, way algorithms are being used in everything from our school system to the criminal justice system is detrimental.

She argues that the algorithms, conjured from the whole universe of data we constantly generate, are used against us. Her work highlights how being on the wrong side of an algorithmic decision can snowball in incredibly destructive ways.

Bettina Berendt, Professor of Artificial Intelligence at the University of Leuven, Belgium, agrees. “On the one hand, big data is something that can be in favour of equity and can go against biases, stereotypes and discrimination. But at the same time it can make things worse.

“There is a normalization of surveillance going on that will ultimately weaken democratic learning and consciousness. That is extremely problematic, and it is quite apart from any scientific questions about observing people in their learning behaviour; it connects to the way big data is concretely being handled at the moment, and to the economics of big data.”

Berendt adds that big data and algorithms cause labelling, which can negatively affect development. She says: “They create an atmosphere where students and teachers feel under surveillance, where they feel under pressure to perform all the time. Traditionally, learning environments have a protected and safe nature. This absence of fear and competitive pressure, at least in phases, is really crucial for learning.”

She argues that systems that are built by big corporations, which could have an alternative agenda, can influence data outcomes, too.

Even though algorithms can be biased, they may be less biased than teachers, says Littlejohn. “Unconscious bias affects teaching. By looking through the code and understanding the underlying assumptions behind it, you could even argue that the biases in algorithms are less problematic than unconscious bias, and can be fixed.”

Involving students can counteract this, says Dr Kumari. “Ideally, our algorithms will be created for an IB student. It’s crucial that when algorithms are created, particularly in an IB context, we focus on the right things. My interest is in the deconstruction of the concept: how do we build learning and scaffolding for a student? For instance, how do we assess the student’s prior understanding needed to master new learning and help them move forward to new mastery?”

Littlejohn adds that the right expertise is essential when creating algorithms: “The success of the algorithm depends on how well coders, teachers and people who really understand learning work together. It’s also very difficult to actually gather the data that you need to come to the conclusions that you want to reach.

“For example, we already know that assessment — the way that we traditionally assess through a test or an exam — is only really an approximation of the learning. So if you’re testing a subject like physics, for example, and you give a student a calculation to do, you’re only testing part of their knowledge, you’re not testing the totality of their ability to be able to work as a physicist. It’s the same with the data. A lot of what we measure and analyse is an approximation of what people’s actual ability is.”

Big data also creates some significant ethical dilemmas. Among the key concerns are: where children’s data is stored, who can access it and how much freedom of choice students will have about their learning. For example, the Open University uses predictive modelling to estimate whether or not a student is going to pass their next assessment. All students are asked for permission, but to be effective the system needs as many people as possible involved. “The fewer people you have contributing, the less accurate your model is going to be,” explains Littlejohn.

“There’s an ethical dilemma about people who don’t supply their data: should they still benefit from the system or not? Secondly, once we know that someone is highly likely not to pass their next assessment, what do we do about it? Do we tell them? And if we do, how do we tell them, because we can’t simply say ‘our system is telling us you are going to fail’. Once students know, there is a risk that they’ll become demotivated, which can have other negative knock-on effects to learning as well as to health and wellbeing. So we have to find ways to help them subvert the system. This kind of data can be used to try to point the students towards success.”
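The kind of predictive modelling described above is often a probability estimate built from engagement signals. The sketch below shows the shape of such a model, a toy logistic scorer; the features, weights and figures are invented for illustration, and a real system would learn its weights from historical student records rather than fix them by hand.

```python
# Illustrative sketch of pass/fail prediction from engagement signals.
# Features, weights and example figures are hypothetical; a production
# model would be fitted to historical data, not hand-tuned like this.
import math

def pass_probability(logins_per_week, pages_viewed, prior_score, w=None):
    """Toy logistic model: weighted feature sum squashed to a probability."""
    if w is None:
        w = {"bias": -4.0, "logins": 0.5, "pages": 0.02, "prior": 0.04}
    z = (w["bias"]
         + w["logins"] * logins_per_week
         + w["pages"] * pages_viewed
         + w["prior"] * prior_score)
    return 1 / (1 + math.exp(-z))

# An engaged student versus a disengaged one (hypothetical figures).
print(round(pass_probability(5, 120, 70), 2))  # high probability
print(round(pass_probability(1, 10, 40), 2))   # low probability
```

The sketch also makes Littlejohn's point tangible: the model's accuracy depends entirely on how many students contribute data to fit those weights, and its output is a probability, not a verdict, which is why how (and whether) to tell the student is an ethical question rather than a technical one.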

It’s time to trust big data

Analysts agree that big data in education is here to stay. “It’s a little bit like the argument raised 20 years ago of ‘should the internet be let into schools?’,” says Berendt.

“Big data characterizes the world into which these children will grow. It will shape curriculums.” She adds that it’s now time to help students understand big data.

“Why not encourage children to build a weather station with which they collect big data about their school garden, for example? Educators can analyse the data with them, helping them change their growing methods, etc, in response to the insights they get from this big data. Such projects do not require students to spy on other people, but they will help students learn to understand and question the uses and abuses of big data.”

Littlejohn believes that organizations are likely to combine different types of data in future to provide radical new ways of viewing education data. She says it’s time for the education sector to trust big data as we offer our information to companies in many other aspects of our lives without hesitation.

“Do you think it’s scary that Google — or any website — has your data and knows where you are, what you’re doing and so on? If we’re quite happy with Google using our information, why aren’t we happy with school systems using our data, too?”

Dr Kumari encourages IB educators to embrace the changes. “It’s important for us as an international education organization to engage in the conversation and embrace this inevitability as it’s happening in our everyday life already,” she says. “It’s definitely a technological advantage that we should all shape for newer ways of teaching and better learning.”

Originally published in IB World magazine in October 2017.
