10 Comments
Jun 28, 2023 · Liked by Dan Meyer

This morning’s NY Times Daily episode has an interview with a professor who likens reading an AI-generated discussion post to being a duck who swims up to a wooden duck decoy and tries to mate with it. I think that’s an apt description of an issue you are surfacing here about humans needing to learn by connecting with other humans. As an instructional coach, I have noticed that participation is lower in classes when teachers ask questions in a way that makes it not matter who answers, whereas students participate more eagerly when questions are asked in a way that indicates the teacher wants to understand the ideas particular to individuals in the class. I wonder how the presence of an all-knowing robot expert might impact students’ willingness and eagerness to share their own mathematical thinking and ideas...

author

+1 - I listened to that same episode with a lot of interest this morning. My feeling is that social relationships mediate learning to a much lower degree in higher ed than in K-12. So it was wild to hear how bereft the professor was, and to imagine how much worse that feeling will be in environments where trust and relationship are so much more important.

Jun 28, 2023 · Liked by Dan Meyer

It's comforting to read this view coming from a highly respected and frontrunner math educator. You make valid points Dan; thank you for sharing them. I agree that we must proceed with great caution regarding AI chatbots, not only in education, but in all of life. Human connections are what give us reason to live and learn; they are what allow us to feel and understand and fulfill our purposes on this earth. Let us remember that today's children are tomorrow's leaders, and we must be very careful not to replace their human relationships with robots.


First, thanks for alerting me to this piece. I hadn't seen it.

What struck me about this article and the product is that, as described, it's not using much AI. When there's a problem with AI, the developers don't say "yeah, our programmers are working on that bug" or "yeah, we've fixed that bug." The data itself is providing the answer, so there's nothing to fix.

This seems more like "math tutoring software" that they've labeled as AI, much like they would relabel textbooks as "Common Core compliant" back in the day.

I also noticed this disconnect:

"Designed specifically for schools, the tutoring bot often takes students through the sequential steps needed to solve a problem."

Later on, a district rep said:

"The district did not want the bot to lead students through a problem step by step, he said, adding, 'We want them to know how to tackle the problem themselves, to use their critical thinking skills.'"

So the district doesn't want an AI tutor, but rather an AI teacher. Meanwhile, Khan is offering a tutor that is probably not entirely AI but partly hand-coded.

Might want to mull that over before pulling the purchase trigger.


Dan, I strongly agree. In fact, when I first saw the demo video I made the point that 1) you don't teach maths by telling; you teach by showing, and 2) the responses are in no way pedagogically scaffolded for English or maths. The writing coach is the worst example. It's demoralizing, and you don't teach writing this way.

I put the video in front of 6 very bright and motivated 14-year-olds, and they each said (independently), "I can get a faster answer off of Google, and I have no interest in chatting with a fake Gatsby."

Khanmigo shows limited understanding of the learner or the teacher, and you don't have to be a teacher to see this. It also is not grounded in evidence-based practice.


Absolutely beautiful article, DM. I think you succinctly and nimbly put a lot of my hesitations with the boom of AI tutors. I'm currently a sophomore at Stanford working in the AI & Education lab, and we've been experimenting with ways to truly safeguard learners' best interests while integrating AI into the classroom. I'm building a tool that scales oral conversation and evaluation as a way to uncover true student understanding: it simulates a virtual conversation between a student and AI on any assignment, shifting the focus of evaluation from the product to the process involved in creating something. Would love to chat about it with you if you're open to it!


Interesting article, and I appreciate the heads-up regarding the Khanmigo profile as I had not yet seen it.

I do have a couple of observations, though. First, you present cost as a hurdle for implementing AI in math education en masse, but that is really a hurdle for implementing Khan Academy's take on AI specifically, not the technology in general. There are other, cheaper or free implementations already available, and more will arrive well before the start of school in the fall, that are at least as potentially useful.

Second, you quote "Tito Rodriguez, suggested the students start by asking Khanmigo two background questions: What is a survey? What makes a question statistical?" and compare it to "using a pneumatic jackhammer to fill a cavity." Perfectly understandable; why ask a supercomputer how to solve 2+2? However, you then suggest the teacher themself should perhaps "give students a short survey. Then give them something that isn’t a survey and ask them to describe the differences with the survey," which is a great lesson, but certainly not an example of a less complex answer to a simple question!

The idea behind having a (viable and functional) AI tutoring assistant is to free up teacher time to address students as individuals. By encouraging students to direct fairly simple questions to the AI (similar to "looking it up on Google," as you also suggested, but with a more relatable interface), the teacher is free to work directly with the students who have trouble with the concept (perhaps by having just those students construct a survey), while allowing the ones who are already comfortable with the idea to continue to progress instead of spending their time on a class-wide activity that isn't particularly applicable to them.

Again, just a couple of observations. I certainly appreciated the article overall.

I, for one, think the jury's very much still out on how/if AI will actually be transformative to math education, but I certainly see some real potential. When a teacher can use AI to not just react to a student who failed an evaluation by prescribing remedial work, but can instead predict beforehand which students might need it and head it off ahead of time (perhaps by AI-recommended revisions of those specific students' versions of the same upcoming lesson), then AI is not only improving teaching efficiency, but also increasing student confidence in a meaningful way by setting them up to succeed, not just giving them a smiley when they get an answer right.


Your article certainly points out numerous issues with relying heavily on chatbots to learn math. However, I don't think we should totally write off chatbots for this purpose. Wouldn't it be great if students learned how to develop and fix their own chatbots, working collaboratively?

author

Perhaps in a computer science course? But if a human tutor were hired to support math learning and supported it this ineffectually, I think they'd get fired. We wouldn't ask the students to tutor the tutor.


I agree. But, to further refine my point: what if students could deepen their math knowledge, in line with the objectives under study, by improving the chatbot? Wouldn't this allow them to use their math for something meaningful? BTW, I do understand your point and agree with it.
