It’s Pretty Clear That These Math Students Aren’t Interested in Learning From an AI Chatbot Tutor
On the New York Times profile of Khanmigo, Khan Academy's AI chatbot
Earlier this week in the New York Times, Natasha Singer profiled Newark Public Schools and the district’s pilot of Khanmigo, Khan Academy’s artificial intelligence chatbot.
When Khan Academy began looking for districts to pilot test its experimental tutorbot this spring, Newark volunteered.
This is a rare and useful look at a new technology in a school that is eager to pilot it and eager to be profiled while piloting it. We see the technology operating on friendly terrain, and still the profile does little to shake my confidence in the bets I staked earlier this month, bets that AI will result in, at most, some limited quality-of-life improvements for K-12 math students and teachers. Several reasons follow, with the most important one at the end.
Ineffective tutoring.
A key competency of even moderately skilled tutors is their sensitivity to learner needs. Is this student flailing? Is this student on the edge of a realization? Is this student ready for another challenge?
Remind me to show you a clip another time where San Diego teacher Gen Esmende persuades a student that he knows something very useful when he is sure he does not. Gen resists solving the problem for the student several times because she knows he is one question away from realizing he knows the answer himself. And then he does.
In Newark, the AI chatbot displays none of that sensitivity, immediately answering the main question the class was meant to think about—“What fraction of the letters in the word MATHEMATICIAN are consonants?”—right when a student types it in:
When students asked Khanmigo the fraction question posted on the classroom’s white board, the bot answered that the word “mathematician” contained 13 letters and that seven of those letters were consonants. That meant the fraction of consonants was seven out of 13, the bot wrote, or 7/13.
Khan Academy then told the New York Times they fixed the chatbot’s tendency towards excessive helpfulness on this question:
“Our engineering team corrected the A.I. a few weeks ago,” Khan Academy said in an email on Tuesday, “so that it no longer gives the answer to this question.”
That’s great and responsive, but there are many questions we might ask students throughout their math education (dozens, probably!) and it doesn’t seem sustainable to sand down every rough edge on an AI model every time it clips a student’s ankle with its helpfulness.1
Unclear classroom application.
Singer describes the most hopeful case for AI chatbots like Khanmigo:
Proponents contend that classroom chatbots could democratize the idea of tutoring by automatically customizing responses to students, allowing them to work on lessons at their own pace.
Nowhere in the article do we see this promise illustrated. Instead, we see teachers throwing Khanmigo what feels like busywork, something for the reporter to report:
Ms. Drakeford knew that “consonant” might be an unfamiliar word to some students. So she suggested they ask Khanmigo, a new tutoring bot that uses artificial intelligence, for help.
[…]
Their teacher, Tito Rodriguez, suggested the students start by asking Khanmigo two background questions: What is a survey? What makes a question statistical?
If you asked me to write down five ways students might learn this background knowledge, asking Khanmigo wouldn’t make the list. It’s something like using a pneumatic jackhammer to fill a cavity.
Use Google. Ask students to share what they know. Just tell them yourself. Perhaps most effectively: give students a short survey, then give them something that isn’t a survey and ask them to describe how the two differ. Run a contrasting cases routine.
The case for personalized chatbot tutors seems to rest on the idea that there are vast and irreconcilable differences in student knowledge about mathematics in every class (a premise we should not take on faith), but that isn’t really true for the word “consonant.” Once you know it, well, that’s kind of that. Why are we involving a chatbot tutor here?
Cost.
Look, I just think this is pretty expensive, that’s all.
Districts like Newark that use Khan Academy’s online lessons, analytics and other school services — which do not include Khanmigo — pay an annual fee of $10 per student. Participating districts that want to pilot test Khanmigo for the upcoming school year will pay an additional fee of $60 per student, the nonprofit said, noting that computing costs for the A.I. models were “significant.”
Poor product-market fit for K-12 math learning.
Either chatbot boosters grossly misunderstand the needs of math teachers and students, or I do.
I can just say, as simply and emphatically as possible, that when K-12 students are in a classroom with other humans trying to learn mathematics, the majority of them do not wish to have a conversation with their computer. They wish to have conversations with the other humans. They do not wish to be complimented and seen as smart and successful by their computer. They wish to be complimented and seen as smart and successful by the humans in whom they have invested their social capital.2
We could hope, maybe, that students would use these chatbots at home when they are far from those humans. But the students most likely to have the resources to make use of these chatbots outside of school—time, attention, broadband internet, a computer, the background knowledge to form queries and interpret the results—are the students least likely to need them.
I’m not making a normative claim about the quality of these chatbots here. I’m making a descriptive claim about their limited appeal to students. This claim isn’t abstract either. It’s concrete. Singer doesn’t quote a single kid extolling the value of these chatbots, either actual or potential, which is rather surprising for a profile of this sort. These kids just don’t want this.
Update, 2023 Jun 29: Singer responded on Twitter to my note that she didn’t quote any students: “That's because I just wrote a student-centric piece on this to which Newark is the sequel.”
1. This is to say nothing of the fact that the chatbot later gave Singer the wrong answer to the same question, claiming the fraction was 8/13 rather than 7/13. I am holding onto faith that these models will eventually stop hallucinating, but as the web fills with exhaust from AI models, exhaust which is then inhaled and exhaled by other AI models and so on, this feels more and more like a statement of faith on my part.
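For what it's worth, the arithmetic itself is easy to check. Here is a minimal sketch in Python (mine, not anything from the article or from Khan Academy) that confirms the 7/13 and rules out the 8/13:

```python
# Count the letters and consonants in "mathematician".
word = "mathematician"
vowels = set("aeiou")
consonants = [letter for letter in word if letter not in vowels]

print(len(word))        # 13 letters in total
print(len(consonants))  # 7 consonants: m, t, h, m, t, c, n -> 7/13, not 8/13
```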
2. This is quite distinct from the “writing college essays” use case, where these chatbots seem to have a near-perfect product-market fit. You’re on your device anyway, writing by yourself.
This morning’s NY Times Daily episode has an interview with a professor who likens reading an AI-generated discussion post to being a duck who swims up to a wooden duck decoy and tries to mate with it. I think that’s an apt description of the issue you are surfacing here about humans needing to learn by connecting with other humans. As an instructional coach, I have noticed that participation is lower in classes when teachers ask questions in a way that suggests it doesn’t matter who answers, whereas students participate more eagerly when questions are asked in a way that indicates the teacher wants to understand the ideas particular to individuals in the class. I wonder how the presence of an all-knowing robot expert might impact students’ willingness and eagerness to share their own mathematical thinking and ideas...
It's comforting to read this view coming from such a highly respected, leading math educator. You make valid points, Dan; thank you for sharing them. I agree that we must proceed with great caution regarding AI chatbots, not only in education, but in all of life. Human connections are what give us reason to live and learn; they are what allow us to feel and understand and fulfill our purposes on this earth. Let us remember that today's children are tomorrow's leaders, and we must be very careful not to replace their human relationships with robots.