Are chatbot tutors more effective if they are cheerful and female?
Ever since ChatGPT burst onto the scene just a few months ago, there's been a resurgence of interest in using AI chatbots as tutors. The technology itself poses many difficult questions, and some researchers are investigating problems that may sound trivial but are actually quite thorny: What should these computer-generated teaching assistants look and sound like?
Richard Mayer, one of the world's most cited education researchers, is working on a series of studies examining what kinds of computer-generated voices and images are most engaging to learners and lead to the best outcomes.
Mayer, a professor of psychology at the University of California, Santa Barbara, notes that ChatGPT has renewed the long-held dream in education that everyone could have an individual tutor to help them learn. When we get to that point, he says, we have to understand: What are the characteristics of those tutors? How do you create an online tutor who comes across as friendly and engaging?
One of these studies by Mayer, published last month in a research journal, is titled "Emotional Tones and Gender Roles of Computer-Generated Speech in Multimedia Teaching." The paper describes an experiment in which college students (some male and some female) each watched a short online slide presentation narrated by computer-generated audio. All participants saw the same slides, but heard them narrated in one of four different voices: a happy-sounding woman, a sad-sounding woman, a happy-sounding man, or a sad-sounding man.
Mayer believes that as computer-generated voices become more realistic, the influence of voice tone, gender, and other features will become more important. One of his hypotheses is that students respond better to cheerful characters than to brooding ones, which fits what Mayer calls the positivity principle. That is what happened for the male participants in the study: they performed better on a post-video quiz about the material when it was conveyed by happy voices than when it was conveyed by sad voices.
Mayer believes not only that upbeat virtual tutors work better than tutors with other emotional tones, but also that some students may learn better from well-designed agents than from human tutors.
He points to research showing that some students learn better from male instructors, while others learn better from female instructors. He suggests that in the future, students may be able to choose the gender and race of the interactive agents that deliver lectures or act as AI tutors, just as they can choose their accent.
As a next step in his research, Mayer hired drama students to help him design on-screen interactive agents to further test his theories. The idea is that if you can identify the characteristics of the most socially engaging instructors, you can apply them to the agents used in any lecture, presentation, or educational interaction, including applications like virtual teachers. He says initial results indicate that students responded more positively to computer-generated instructors presenting as women.
Concerns about simulating race and gender
Some computing and education professionals raise eyebrows at this line of research.
My concern about this kind of over-customization of tutoring is that it might lead to false approximations and reinforce stereotypes that aren't true, says Parth Sarin, a computer science graduate student at Stanford University.
For example, Sarin grew up with parents who spoke a mixture of Hindi and English, and an AI model trained primarily on standard American English would likely struggle to emulate that way of speaking.
People using AI models shouldn't try to approximate identities that are very different from their own, Sarin says. He compared a white professor having a computer agent deliver lecture videos in a Black-sounding voice to a form of blackface performance.
As for gender, there is a long history of robots and virtual assistants being programmed with female voices. Some observers criticize this as fostering gender bias, especially given the relative lack of women involved in creating these kinds of technical tools. The prevalence of female voices in tutoring tools also reflects the reality that roughly three-quarters of public school teachers in the United States are women.
One possible solution? Inventing a genderless virtual voice. That's the idea behind the voice assistant Q, built using modulated recordings of people who identify as non-binary.
Is Authenticity Essential?
Derek Bruff, visiting deputy director of the Center for Excellence in Teaching and Learning at the University of Mississippi, says the push to create the ideal digital tutor personality reminds him of an earlier moment in online learning. About a decade ago, when prestigious universities rushed to launch free online courses known as MOOCs, some proponents considered having Hollywood celebrities present them. Bruff says people envisioned having professors script the videos and letting Matt Damon or Morgan Freeman narrate the lectures.
That trend never materialized, he adds, mainly because for many students the relationship with the professor who delivers the material is what matters, regardless of the instructor's speaking tone, gender, or race.
For some students, not having a personal relationship with a professor is not a problem, he adds, and those students tend to be older or already working adults. Most undergraduates, however, especially younger ones, benefit greatly from having a relationship with a professor.
Still, the advent of ChatGPT and the idea of virtual tutors make it more likely that the technology can effectively complement human teaching, says Bruff. He hopes such tools will be used more like textbooks and teaching materials than as replacements for human instructors.
If I could choose between figuring out what face, voice, and tone to give my instructional agent, or giving my 30 students real teachers, I would give my students real teachers, he says.
A bigger question, according to Sarin, is whether AI agents can form effective educational relationships with students.
Sarin says it is impossible for a chatbot to represent an authentic voice, and students can sense whether their teacher is credible.