Could the use of technology be inhibiting performance in mathematics?

Inside the OECD ‘PISA’ reports is a wealth of data that goes far beyond the simple headlines of one country compared with another.
Indeed, some of the data is even more worrying, especially in the light of the many millions of pounds that have been, and are still being, spent on computers and technology.

But the headline is:
On average, the more students use technology in school or at home, the worse they perform at mathematics!

Here is a graph showing the performance of UK students in maths (against the OECD average; I am sure you could access the database to see similar results for other countries), plotted against how much homework is expected to be completed online:

Here is another graph, showing UK maths performance against how often students use the internet in lessons for schoolwork.

And finally, performance against how often they use computers for practice and drilling (this for the OECD as a whole, which again needs some more research, but you can imagine where the UK would be).

This report concerned the 2012 PISA tests, but similar data is available for earlier PISA reports.
And it should be concerning. Are UK schools using technology for maths in a way that could actually inhibit performance? When it comes to practice and drilling, one of the most common uses of online technology for maths, we are looking at a potential 14% drop in performance.

So why is this?

If we look at the online software programs available for mathematics, we can see that almost all use multiple choice or single answer methods of checking the work.

We can all remember teachers telling us to “show your working” when we do paper or examination work. But for online work, this requirement has been conveniently ignored in favour of the ease of getting quickly marked work.
Even Ofsted have noticed this, and they recently said:

“concern emerged around the frequent use of online software which requires pupils to input answers only. Although teachers were able to keep track of classwork and homework completed and had information about stronger and weaker areas of pupils’ work, no attention was given to how well the work was set out, or whether correct methods and notation were used.”
Para 90, ‘Mathematics: made to measure’

Additionally, if you look at any GCSE examiners' reports for mathematics, they consistently comment that students are not communicating well in mathematics, not showing their working out, and not explaining their reasoning.

Could the performance in the PISA tests, the GCSE examiners reports and the use of technology be linked?

More research is needed here. The OECD reports draw attention to the fact that the more computers are used, the lower the performance, and suggest that further research is needed to find out why, but to date I have not found such research for mathematics.

It is my contention that the lack of working out and reasoning required by online maths software could have contributed to such drops in performance.

Fortunately, the technology has moved on, and Mathspace has spent four years developing algorithms that ask for step-by-step working out, provide immediate feedback, and support students and teachers in understanding why something is right or wrong, not just that it is.

As the technology has advanced enormously over the past year or two, Mathspace also enables tablet users to write their maths just as if it were on paper. The lack of easy data entry is potentially another reason why students used to typing in single answers find it more difficult to answer paper-based questions.

In the meantime, those schools using mathematics software that limits how the mathematics is entered should take a long look at what is now available.

Keeping to the status quo is not an option.

Like what you read? Give Tim Stirrup a round of applause.

From a quick cheer to a standing ovation, clap to show how much you enjoyed this story.