How Can Artificial Intelligence Support Learning in a 360 Degree Web Environment?

The reason to think about Artificial Intelligence (AI) in a 360 degree web environment is first of all practical. Think about this: when you are fully immersed in admiring, let’s say, the stunning sight of the Aurora Borealis in the middle of a glacier in Greenland, the last thing you want to do is step out of it just to type a new search on the phone that is still attached to your VR headset. And even if you could keep the headset on, going back to any kind of main menu for more content feels disruptive.

It’s not only about improving search in an immersive 360 degree environment. It is also about getting better help in any situation that engages us in several simultaneous cognitive and physical activities — driving and navigation, cooking and figuring out ingredients and timing, managing a household with pets and kids, you name it! For a solution, one does not need to look further than Amazon, Apple, Microsoft, IBM, Facebook and Google. All of these companies are developing machine intelligence that can soon step up to the role of a digital personal assistant — for all of us.

AI in Education: from assistance to assessment

Research reports focusing on AI and education, such as AI Grand Challenges for Education (Woolf et al., 2013), offer a mind-blowing range of options for how AI can contribute to changing education. They include instructional software and Intelligent Tutor Systems providing access to digital materials, as well as more complex systems that support real-time understanding of students’ cognition, actions and emotions. The latter include sophisticated voice recognition and natural language processing systems, as well as large-capacity data processing and machine learning to assess students’ knowledge and performance, and to personalize instruction according to each learner’s preferences and motivations.

Although there is still relatively little discussion about what forms of machine intervention in our kids’ education are reasonable and desirable, research such as Ma et al. (2014) has already shown that machine-aided mentoring and the use of different kinds of Intelligent Tutor Systems lead learners to achieve higher outcome scores than learners using other instructional methods. This is not exactly surprising. In Woolf’s words: “The current environment of fixed classrooms, lectures, and static printed textbooks is clearly not capable of serving a digital society or flexibly adapting for the future.” One can therefore expect that systems giving students easier access and more personalized recommendations to a variety of digital learning materials — not to mention more flexible ways to construct and demonstrate knowledge — will most likely improve their learning outcomes.

One way or another, robots will be an increasingly important part of our kids’ lives. But as a mother of three myself, I find myself asking: What exactly is the job we are proposing for artificial intelligence in the classroom? How can machine learning and AI help our kids learn better, AND how can it give teachers more time to provide face-to-face interactions that encourage higher level thinking skills, communication and creativity?

The promise of 360 degree web to learning

In the past six months at ThingLink we have seen teachers and students create over two thousand interactive 360 degree photos ranging from virtual tours and lesson plans to joint classroom projects. These lessons and tours have been viewed by one million students so far.

During this time, educators have suggested several ways of bringing interactive 360 degree viewing to the classroom. Here is a brief summary of the main use cases and expected benefits for learning:

USING VIRTUAL FIELD TRIPS TO CHANGE THE CONTEXT OF LEARNING

Taking students on traditional field trips is part of every elementary and middle school curriculum. Research has found physical field trips to support learning in multiple ways: stimulating interest in a new subject matter, developing observation and perception skills, gaining a better understanding of topics, building cultural understanding and tolerance, developing creativity and critical thinking, and exposing students to worlds outside their own.

Although virtual field trips differ from physical ones, research comparing the two has concluded that teachers could choose either type of field trip and obtain similar achievement scores. For example, Garner and Gallo (2005) write: “A virtual field trip does not provide the same experiences as a physical trip into the field. It represents a compromise, a set of distilled experiences designed to mimic the real thing. Students do not actually get their feet wet or dig into the mud in search of bivalves. They do, however, move through a series of interactive experiences that can be designed and controlled for maximizing learning. One cannot count on encountering an injured manatee, a nesting sea turtle, or a burrowing gopher tortoise on a physical field trip; however, on a virtual field trip these encounters can be guaranteed.”

Professional educators such as Monica Burns, the founder of ClassTechTips.com, write: “Taking students on a field trip to more than one biome is a challenge for most teachers. This app (VR Lessons) can take students around the world to spark discussions, build background knowledge and make text-to-world connections.”

These are strong arguments in favor of making virtual field trips part of the curriculum.

CREATING UNINTERRUPTED TIME FOR SELF-PACED EXPLORATION

A 360 degree web environment — especially when viewed with a virtual reality headset — simulates a first-person experience of being in a place without interruptions or distractions in the peripheral vision. This gives the student a new kind of agency for self-paced individual exploration — as long as there is something to explore! Monica Burns recommends: “Before kicking off a lesson on ecosystems let students spin and tap on the screen to develop questions for a KWL chart. As they move through each environment have your class periodically pause for a stop and jot or turn and talk to share their questions.”

In this example, teacher Mona Voelkel has created an information-rich lesson plan on Abaiang Atoll for her students to explore with their learning partner.

A self-paced immersive exploration combined with targeted assessments gives students with different levels of knowledge an opportunity to take their time where they need it. Not everyone needs to be looking at the same mountain top at the same time, and not everyone will. Letting students actively observe, come up with their own questions, and discover information not only helps the student, but it can also free up the teacher to answer higher level questions and provide valuable time for face-to-face interactions.

USING MULTIMEDIA ANNOTATIONS TO SUPPORT DIFFERENT LEARNING STYLES

There are countless studies confirming the power of visual imagery in learning. Visual memory has proven to be the genesis of our perceptions of reality, the facilitator of our decision-making and the motivator of our being (Williams, 2004).

Letting students access information not only by reading, but also by watching and listening, expands learning opportunities for learners with different forms of intelligence and styles of learning. For example, this image of a nature path was created by teachers at the Savonlinna teacher education department. It features multimedia annotations with close-up images, text, and narration — well aligned with the Universal Design for Learning principle.

Students at the Savonlinna teacher education department created an audiovisual forest path following the Universal Design for Learning principle

Offering students the option to read, listen or watch information inside 360 images seems to work especially well in teaching new vocabulary and key academic concepts. It can prepare students to think about higher level questions, and to discuss the subject matter with the teacher and in groups.

The importance of learning vocabulary gets further support from research showing that proficiency in academic language (the ability to use general and content-specific vocabulary being one of its components) is one of the most important factors in the academic success of English Language Learners (ELLs). It has also been found to be a major contributor to achievement gaps between ELLs and English-proficient students.

What’s AI got to do with it: collecting data for a personal learning assistant

It’s hard to think about the possibilities of Artificial Intelligence and Machine Learning in education without picturing a real-world context and a problem it is developed to solve.

In this case, we are trying to figure out how to use AI and machine learning to assist students with different cultural backgrounds, economic situations, interests, cognitive abilities, and learning styles to meet their personal learning goals. A broad question, sure, but let’s take a look at what we already know about student engagement in a 360 degree environment, and how that could contribute to building a learning system.

Leaving personal identification aside, there are at least three types of data that, combined with voice recognition and natural language processing, can become the embryo of an intelligent learning system in a 360 web environment:

  1. Self-assessment
  2. Past and present individual interaction data
  3. Past and present system level interaction data
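To make the three data types concrete, here is a minimal sketch of how they might be modeled as records. All class and field names (`SelfAssessment`, `InteractionEvent`, `StudentProfile`) are invented for illustration; they are not part of any existing system.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SelfAssessment:
    # 1. Self-assessment: the student's 1-5 self-rating per keyword,
    # e.g. collected through an entry-test form.
    ratings: Dict[str, int]

@dataclass
class InteractionEvent:
    # 2. Individual interaction data: one viewing event inside a 360 lesson.
    lesson_id: str
    tag_id: str          # the annotation the student focused on
    dwell_seconds: float

@dataclass
class StudentProfile:
    student_id: str
    assessment: SelfAssessment
    interactions: List[InteractionEvent] = field(default_factory=list)

# 3. System-level interaction data: all students' profiles, which an
# assistant could aggregate to see what similar students explored.
system_log: Dict[str, StudentProfile] = {}

profile = StudentProfile("student-1", SelfAssessment({"coral": 3, "reef": 2}))
profile.interactions.append(InteractionEvent("reef-lesson", "tag-7", 12.5))
system_log[profile.student_id] = profile
```

The split mirrors the list above: the first record holds what the student says about themselves, the second what they actually do, and the third pools that data across all students.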

In the example below, we have a virtual lesson about a coral reef ecosystem. The lesson has an embedded Google Form that works as an entry test, letting students rate their knowledge of related keywords and concepts.

The data from a simple self-assessment can serve many purposes. First, it gives initial feedback on how students themselves evaluate their own knowledge of a particular subject matter. This information can be used to automatically define what kind of information and interactions will appear for each student inside the 360 experience.
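One simple way such a mapping could work: average the student's self-ratings and use the result to decide which tiers of annotations to show. This is a hedged sketch only; the tier names and thresholds below are invented for illustration, not a description of any shipping feature.

```python
from typing import Dict, List

def select_tiers(ratings: Dict[str, int]) -> List[str]:
    """Pick annotation tiers from a student's 1-5 self-ratings.

    Low self-rated familiarity surfaces vocabulary support first;
    high familiarity skips straight to discussion and extension material.
    Tier names and cutoffs are hypothetical.
    """
    avg = sum(ratings.values()) / len(ratings)
    if avg < 2.5:
        return ["vocabulary", "narrated-intro", "core-content"]
    elif avg < 4:
        return ["core-content", "discussion-questions"]
    return ["discussion-questions", "extension-material"]

# A student who rates their coral reef knowledge low gets vocabulary support:
print(select_tiers({"coral": 1, "ecosystem": 2, "symbiosis": 1}))
```

In practice the thresholds would themselves be learned from data rather than hand-picked, but the shape of the decision is the same.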

Combined with other kinds of data sets, student assessment data can be used to guide navigation inside and between 360 lessons. For example, interaction data from each 360 viewing session gives real-time information on which piece of content inside the 360 photo the student is looking at. Semantic data (and image recognition data) tells us what it is that the student is looking at. When this data is combined with natural language processing, a smart navigator can converse with and guide the student as they explore the virtual lesson. This can also include keyword-based information retrieval to suggest additional 360 degree content that other students with the same level of knowledge have reviewed or recommended.
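The last step, suggesting content that peers at the same level have viewed, can be sketched as a simple count over the system-level interaction data. Everything here is illustrative: the `recommend` function, the level scale, and the sample lesson IDs are assumptions, not an existing API.

```python
from collections import Counter
from typing import List, Set, Tuple

def recommend(student_level: int,
              student_seen: Set[str],
              peer_logs: List[Tuple[int, List[str]]],
              k: int = 2) -> List[str]:
    """Suggest up to k lessons viewed by peers at a similar level.

    peer_logs is a list of (self-assessed level, lessons viewed) pairs.
    Lessons the student has already seen are excluded.
    """
    counts: Counter = Counter()
    for level, seen in peer_logs:
        if abs(level - student_level) <= 1:  # "same level of knowledge"
            counts.update(l for l in seen if l not in student_seen)
    return [lesson for lesson, _ in counts.most_common(k)]

peers = [(2, ["reef-2", "kelp-1"]), (3, ["reef-2"]), (5, ["arctic-1"])]
# A level-2 student who has seen "reef-1" gets the lessons their
# similarly-rated peers explored, most popular first:
print(recommend(2, {"reef-1"}, peers))
```

A production system would replace the raw popularity count with collaborative filtering over much richer interaction data, but the idea of filtering by knowledge level first is the same.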

Next step: adding more ingredients in the mix

In the same way that search engines use our online browsing history to recommend related content, user- and session-specific interaction data from 360 media viewing can be used to learn about students’ interests.

Let’s say that a 5th grade student, Mona, is interested in modern history. Our data shows that outside the curriculum, she has taken additional virtual drone tours over several major European cities, and on those trips watched videos on famous landmarks and cultural traditions. At the same time we know that this week in mathematics Mona is supposed to learn about three-dimensional geometric shapes, and in social sciences they study ancient Asian cultures. But how do those three things combine?

If you rely on traditional textbooks, or even a traditional web search, they may not. But a smart assistant can instantly process data about Mona’s learning goals, current level of knowledge, previous browsing history and preferences, and compare that data to other students’ browsing history and preferences. Based on all this, the assistant can fly Mona over major Asian cities, pointing out famous landmarks and pieces of architecture, and help her observe and identify three-dimensional shapes in the buildings she is looking at. Taking one step further, using voice recognition and natural language processing, the smart assistant can identify strengths and weaknesses in Mona’s general knowledge of the two subject matters, and finally, give parents or teachers a summary of what she has seen and learned that day.
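The core of the Mona scenario is a ranking step: score each available tour by how well its content overlaps with the student's interests and this week's curriculum. The sketch below is one way to express that; the tour names, tags, and weights are invented for the example, and a real assistant would learn the weights rather than fix them.

```python
from typing import Dict, Set

def score_tour(tour_tags: Set[str],
               interests: Set[str],
               curriculum: Set[str],
               w_interest: float = 1.0,
               w_curriculum: float = 2.0) -> float:
    """Weighted overlap between a tour's content tags and the student's
    interests and current curriculum. Curriculum matches are weighted
    higher so required topics outrank pure interest matches."""
    return (w_interest * len(tour_tags & interests)
            + w_curriculum * len(tour_tags & curriculum))

tours: Dict[str, Set[str]] = {
    "beijing-drone": {"asian-cities", "landmarks", "geometry"},
    "paris-drone": {"european-cities", "landmarks"},
}
interests = {"landmarks", "modern-history"}
curriculum = {"geometry", "asian-cities"}

# The tour that covers both curriculum topics ranks first for Mona:
best = max(tours, key=lambda t: score_tour(tours[t], interests, curriculum))
print(best)
```

The same scoring shape extends naturally: peer similarity, dwell time, and self-assessment data from the earlier sections would simply add more weighted terms to the sum.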

To conclude

The shift from fixed classroom settings and printed text books to digital learning platforms and 360 degree navigation environments pushes us to think about how machine learning and artificial intelligence can best contribute to K-12 education. Search and recommendation robots are already a part of our daily lives, but immersive navigation in a 360 degree environment will make them present in a new, more personal way.

360 viewing has several unique characteristics that support student learning. Teachers use 360 degree lessons to

  • Change the context of learning by taking students on virtual field trips
  • Create uninterrupted time for self-paced learning, and
  • Provide students with multiple ways (visual, text, audio) to access information

Automatically processed data on students’ knowledge and goals, browsing history and interactions in a 360 degree environment can be used to personalize navigation and learning. A digital personal assistant can provide thorough, up-to-date introductions to different topics, and help students master required concepts and vocabulary. This in turn can help teachers struggling with setting and achieving individual learning goals, and free up their time to provide valuable face-to-face interactions that encourage higher level thinking skills, communication, and creativity.