By guest authors Aydali Campa and Sarah Trent from the Wall Street Journal
The Future of Everything covers the innovation and technology transforming the way we live, work and play, with monthly issues on health, money, cities and more. This month is Education & Learning, online starting Aug. 6 and in print in The Wall Street Journal on Aug. 13, 2021.
Not all robots are good at math. Take ProJo, a program that researchers are testing to help students of all ages spot their math and science mistakes, embodied in a small, humanoid robot. Instead of standing in for an instructor, ProJo acts as a peer, inviting the students themselves to help it solve problems. “Let’s take turns,” it might say. “I’m not so good at this.”
ProJo can also help students work together and assess their growth and weaknesses, in both robot form and on a computer screen. It is one of a variety of teaching aids in development, boosted by artificial intelligence, that scientists and educators say could support tomorrow’s classrooms.
Typically, AI education products serve one function, such as assessing a student’s literacy, tailoring tools to individual learners or performing administrative functions such as grading. Next-generation tools may do all of this in a single platform, serving at times as a peer learning partner, a group facilitator and a monitor for educators—a sort of superpowered teacher’s assistant personalized for each student.
Educators have long feared that such a tool could replace them, says Lalitha Vasudevan, managing director at the Digital Futures Institute at Teachers College, Columbia University. Instead, she says, thoughtfully designed AI tools could help pick up on patterns in behavior and performance that a busy human might otherwise miss, making good teachers better and requiring more—and more data-savvy—educators.
“So much of classroom learning is still dependent upon, the teacher says something, the student responds, and then the teacher is able to form an observation and an assessment of that child,” Dr. Vasudevan says. AI capable of analyzing multiple communication cues, social dynamics and academic performance could afford many students more opportunities to be seen, she says.
A future where students might receive a personal robot or on-screen AI buddy along with their textbooks is rife with ethical and design challenges, including ease of use, inclusivity and data use, AI and education researchers say.
“Everybody goes right to the privacy considerations, but there’s a lot more than just privacy that really has to be thought through,” says Cynthia Breazeal, a roboticist who heads the Massachusetts Institute of Technology’s Initiative on Responsible AI for Social Empowerment and Education. Teachers and students will need the ability to correct algorithm errors, programs will have to adapt to differences in hyperlocal demographics, and designers will need to consider perceived power dynamics between students, AI agents and teachers, she says.
Here are three teaching aids, now in development, that put AI to work:
Keeping Students Engaged
Diana is a teaching assistant designed to respond to students’ nonverbal and visual cues in an effort to help middle-school teachers run their classrooms more smoothly. Computer science professors James Pustejovsky of Brandeis University and Nikhil Krishnaswamy of Colorado State University have spent the past few years developing the Diana prototype, which exists today as an interactive avatar on a computer, iPad or iPhone. They are now developing an online version, says Dr. Krishnaswamy.
As students interact in the classroom, the goal is for Diana to notice students’ facial expressions, conversations, gazes and gestures to infer whether they could use help or are getting distracted. Diana then responds by engaging them in conversation or prompting the teacher. The tool could be especially useful when a teacher divides students into small groups and can only focus on one at a time, Dr. Pustejovsky says.
Next, the researchers want to teach Diana to reliably recognize faces and voices, particularly diverse skin colors, accents and local dialects—a common oversight in data collection that can lead to gaps in AI’s effectiveness. They plan to strengthen Diana’s capabilities by collecting more visual and audio recognition data from student interactions in five middle schools in Colorado this fall.
Mai Vu, a computer science and AI teacher at one of the schools, Altona Middle School in Longmont, Colo., says that AI could be a good partner to help meet students where they are in their learning. “It won’t replace what a teacher can do,” she says.
The Professor’s Time Saver
Unlike most professors, Jill Watson is available 24/7 to respond to routine questions from students at the Georgia Institute of Technology. The teacher’s assistant chatbot is meant to free up professors’ time to assist students with more complex questions, says Ashok Goel, a professor of computer science who developed the bot in 2016 using International Business Machines Corp.’s Watson AI platform. The chatbot can also analyze language and learn geographic, academic and other information about students—provided they opt in to share their data—to create online micro-communities to “enhance learner-to-learner interaction,” Dr. Goel says.
Jill Watson has been used in at least 17 courses at Georgia Tech so far, according to Dr. Goel. It was designed to be used with any course with uploadable documents, such as a syllabus and schedule. Recently, Dr. Goel and his team developed a new AI tool, which they dubbed Agent Smith after the replicating character in “The Matrix,” to generate a Jill Watson in five hours. The first Jill Watson took roughly 1,500 hours to generate, he says.
Dr. Goel aims to expand the tool to all Georgia Tech courses within three years and to colleges internationally within five years. The team is also starting to develop possible business models.
A Robot Friend and Teacher
Sandra Okita, the program director of the Communication, Media and Learning Technologies Design program at Columbia’s Teachers College, developed ProJo as a research tool to explore how AI tools that act as a student’s partner can be more effective than AI that mimics instructors. The team began testing the program in a 2-foot-tall robot in 2013.
ProJo can make math errors similar to those made by its grade school partners, acting as a mirror and helping children learn to spot and fix their own mistakes—an area where kids often struggle, Dr. Okita says. College science undergraduates also tested ProJo by teaching it about biology, she says.
“If you design the relationship so it’s a specialist or an expert, the learner doesn’t try to exchange much information,” Dr. Okita says. “But if you design it as a peer, then they try to challenge it or spend more time trying to figure out if the AI agent is right.”
Dr. Okita’s team has been researching the best form for an AI partner to take: Should it be a physical robot or an on-screen avatar? Should it look human? The answer depends on the situation, including whether eye contact is useful to the activity or a physical presence is distracting.
Next, the team is working on adapting the ProJo program for a smaller, cheaper, more companion-like robot to continue testing how its form affects its function. In the next two years, Dr. Okita plans to develop a free ProJo application that individuals or educators could download to their own robot to use as a math or language learning partner and further the team’s research. One day, Dr. Okita says, students might have a ProJo-like partner that does it all, following students from class to class and robot to screen as the activity or subject matter requires.