Computers and the Web Are Changing the Way We Learn and Remember
Computers have transformed our daily lives so much that it’s hard for most people who have grown up in this digital age to imagine a time without technology.
Learning, in particular, has seen many changes, most of them arguably for the better: from the Internet and Web 2.0 tools to interactive textbooks, mobile devices that enable on-the-go learning, and, most recently, ground-breaking wearable technology like Google Glass.
But has our switch to digital merely altered the way we learn on the surface, or has it actually had an impact on the way the brain functions?
A new study from Northwestern University in Illinois and China’s Peking University shows that our repetitive, manual interaction with computers has changed the way the brain represents movement.
The researchers came to this conclusion after comparing two groups of Chinese migrant workers who were given the task of moving a cursor across a screen in one direction using a hand that was hidden from view.
Participants in the first group were already familiar with using computers, while those in the second group had never used a computer before.
Although both groups were able to perform the task of moving the cursor in one direction without much difficulty, a difference in their abilities became apparent when they were asked to move the cursor to specific targets.
Those with prior computer experience found it easier to apply what they had previously learned to the more complicated task, a process commonly known as "generalizing."
The researchers explain that when certain visually guided movements are learned, a representation is created in the brain.
This representation continues to be strengthened as a person gets better at translating visual cues to the corresponding movements, which in turn, allows them to perform similar tasks more quickly.
In order to better understand this difference, the researchers carried out a second experiment with a group of computer novices. They followed the group before and after they spent two weeks playing computer games that required intensive mouse usage for two hours each day.
They found that after these two weeks of practice, the formerly inexperienced computer users displayed generalization patterns similar to those of regular computer users.
“Our data revealed that generalization has to be learned, and we should not expect it to happen automatically,” commented the study’s lead author, Kunlin Wei.
Another study that shows how technology is impacting cognitive function was carried out by Harvard scientists. It looked at how the Internet has changed the way our memories function by making information available at the click of a mouse, or tap of a screen.
In one experiment the researchers asked participants to type out a collection of memorable statements, such as “An ostrich’s eye is bigger than its brain.”
They found that participants were better able to recall the phrase if they believed it had been erased. Those who were told that the information had been saved on the computer were much more likely to forget it, even though they had been asked to commit it to memory.
The researchers also found that participants were more likely to be able to recall the folder locations where the work had been stored than the statements themselves.
Daniel Wegner, the study’s lead author, notes that their findings show that the Internet has become part of a transactive memory source.
He explains that transactive memory refers to a method by which our brain compartmentalizes information. For example, when a husband relies on his wife to remember an important date such as a child’s birthday, he no longer feels the need to keep track of it, and thus won’t remember.
“[It’s] this whole network of memory where you don’t have to remember everything in the world yourself,” said Wegner.
“You just have to remember who knows it. Now computers and technology as well are becoming virtual extensions of our memory.”
Although these studies are just the tip of the iceberg, they show how little we still know about the brain and demonstrate that even the small things we do every day can influence cognitive function.
They also raise the question of what other processes in the brain have been or could be altered by frequent use of technology.
Wegner believes that his study will lead to further research into understanding computer dependence and personally plans to continue tracing the extent of human interdependence with the computer world.