AI and the Future of Teaching and Learning: Engaging Educators


This is the latest post in our “AI and the Future of Teaching and Learning” series. To review the previous articles, click here: AI and the Future of Teaching and Learning

Key Points:

  • Developers of artificial intelligence systems in education are now aiming to help educators.
  • Policies should encourage teacher engagement, including building teachers’ trust in artificial intelligence systems and their confidence to recommend against using untrustworthy ones.
  • Policies should incorporate experiences for educators to shape and support their own professional learning about how to utilize artificial intelligence systems in teaching and learning.
  • Including and informing educators in design and development decisions will result in more useful and usable teacher supports.

Artificial intelligence (AI) systems for learning environments have traditionally been designed to help students; however, new AI systems are being designed to assist or support educators. Teachers want to be in the loop as they lead classrooms, develop lessons, grade papers, and work with students and their families. This is why teachers may value AI systems that augment an educator’s intelligence more than agents that appear to be alternatives to it. Teachers want to ensure that they retain the ability to personalize instruction, determine how to make technology work for their students, and handle the many other responsibilities that demand their expertise.

Educators should be in the loop when designing and brainstorming the future of Educational Technologies — Digital Promise

When creating systems to help educators in their work, involving educators early in the process will lead to improved designs. When educators are included in the development of systems, they see new possibilities for implementation and develop trust in the resulting products. Educators will also have insights into the policies needed to guide safe, ethical, and equitable uses of AI in education. Educators can contribute:

  1. Their understandings of student, classroom, school, parent, and district needs;
  2. Their insights about cultural or individual barriers and issues, especially as these relate to equity;
  3. Their creativity in framing new teaching practices to leverage AI possibilities; and
  4. Their visions in shaping new policies for the integration of these tools in their schools.

Some technology developers already include educators in their design process. Further, discussions with both educators and developers suggest that more educators are ready to step up and participate. Informed educators see many ways AI can affect both schools and society, positively[1] and negatively[2]. They want to see it integrated into learning environments in transformative and responsible ways[3].

Including and informing educators

As schools and districts make decisions about AI systems, they need to share information and professional learning with educators, families, and communities. This includes informing and engaging educators and community members with diverse racial and cultural identities, technological experiences, and gender backgrounds. Educators need time to learn about AI, because AI systems offer possibilities unlike those of existing educational technology (see prior blog posts in this series). Educators also need time to explore the specific AI systems a district or school is considering, as it is often not easy to understand how AI systems gather and use data, how they make inferences and decisions, or how they take actions. For AI to be effective in education, it must be trusted; it isn’t widely trusted now, and building trust takes time.

Ensuring educators have influence

Educators can be influential both in the development of systems and in the decisions made in their schools. Educators would need to be compensated by developers for this work and should have the option to remain in their classrooms (perhaps at reduced time) so that students and educators in their schools benefit from their experience. Designers of new technologies should always include subject matter experts, and educators are the experts in this work. We need to include more educators and encourage them to influence and participate in system design.

Additionally, educators’ voices must have influence not only in designing an AI system but also in deciding how it is used. For example, AI systems that make “consequential decisions” should not be allowed to make them without a human oversight board. AI systems reflect the biases of the humans who create them and of the data that trains them. Although it is easy to say that educators should have influence, true power sharing is rare in technology development, procurement, and implementation; it is easy and typical for the voices of technologists to overpower the voices of educators. Intentional efforts to ensure that educators’ voices are invited, and then heard, are necessary to overcome this tendency. There are national research projects focusing on emerging technologies for teaching and learning, and educators are invited to participate in these projects.[4]

Educators’ policy roles

To realize the opportunities while managing the risks of AI systems in education, prior blog posts have suggested that policies will be needed. Educators can play an important role in creating sound policies. They can identify specific challenges and risks in classroom, school, and community contexts. They can also review proposed policy guidance and contribute their experiences, analyses, and draft language to policy proposals.

In one pre-policy effort, the Center for Integrative Research on Computing and Learning Sciences (CIRCLS) brought together teams of researchers and educators in a four-month working group to review documents[5] around AI. The participants then wrote about the policy needs[6] they saw from an educator and researcher perspective. They suggested:

  • Policies need to ensure educators have time and resources to learn about AI systems and to participate in designing and evaluating AI systems.
  • Policies need to anticipate that educators will have a key role in informing families and communities about AI systems, and should provide resources and training for educators to do so.
  • Policies already include laws that protect privacy and data security; educators can have a key role in shaping revisions to these laws to address new AI capabilities.

Educators are not only consumers or users of future AI systems in education. They are experts and thought leaders on how AI can affect their classrooms and students’ learning experiences. They have concerns about the use of AI systems, including the transparency of the algorithms those systems use, the data that trains them, and whether humans can proactively override their decisions. For AI systems in education to have benefits, educators need to trust them. To develop educators’ trust in AI-enabled teacher supports, educators need to be involved at every step of the process, including design, development, evaluation, and policy.

[1] Fusco, J. & Ruiz, P. (n.d.) I’m a teacher, will artificial intelligence help me? CIRCLS. https://circls.org/educatorcircls/im-a-teacher-will-artificial-intelligence-help-me

[2] Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim code. Wiley. https://www.wiley.com/en-us/Race+After+Technology%3A+Abolitionist+Tools+for+the+New+Jim+Code-p-9781509526406

[3] Roschelle, J., Lester, J., & Fusco, J. (Eds.). (2020). AI and the future of learning: Expert panel report [Report]. Digital Promise. https://circls.org/reports/ai-report

[4] The Engage AI Institute, CIRCLS Expertise Exchanges, National Institute for Student AI-Teaming

[5] Documents such as the 2016 report Preparing for the Future of Artificial Intelligence, the OECD principles, and guidance from other policy organizations.

[6] CIRCLS AI and Education Policy Expertise Exchange Group
