My district is buying new AI tools on what seems like a weekly basis. I am getting surveys on what training I want on each tool. What I want is training on autism, on ADHD, on homelessness in teens, on supporting teen parents, and on how to build and maintain trust quickly with people you have just met. And, no, I don’t want to ask an AI tutor about any of it.
Featured comment next week.
"on how to build and maintain trust quickly with people you have just met"
If someone is offering training on this, I strongly suspect they are also selling bridges. I hope every teacher knows that they must build trust and invest in their students... and rightfully so.
As an alternative high school math teacher, building relationships is my core job. I have presented at conferences like NCTM on this topic, but you are right that most trainings are selling products or services. The problem with math conferences is that topics about building trust are easily rejected and pushed to alternative ed or special education conferences. Because we math people are all about “content,” as if that could be taught without the relationships…
This is certainly a thing I'd like to get better at. If anyone had ideas and put them into a workshop or training, I'd sign up for it. Knowing you *must* do a thing and knowing *how* to do the thing aren't the same thing.
Humans have been doing math forever (the Babylonians wrote about the relationships among the sides of a right triangle ~4,000 years ago!). If teaching math were simply about finding the "best explanation," we'd have found most of them by now, and no kid would fail geometry in 2024.
Humans have been *learning* forever (ancient Greece had students studying reading, writing, and math ~2,500 years ago!). If education were simply about "giving teachers the right tools," we'd have discovered them by now, and no teacher would be burnt out in 2024.
But that's (obviously) not the case.
So when the loudest, most lauded voices are claiming that they've transformed education because they've discovered the best explanations and the right tools, it's hard to take those claims seriously.
The problem is that the issues of school are more fine-grained than explanations and tools, which makes them hard (but not impossible!) to address. Hard (but not impossible!) problems take time to solve - they require a degree of patience **for developers**. Instead, the burden of patience is often placed on teachers and students: "just wait for the next model to come out! But for now, here's something lesson plan-shaped."
I can't pretend that gen AI is all hooey (my gen AI use has significantly changed my professional life for the better over the past 12 months), but we also can't pretend that tools made by and for R&D folks like me can simply be plopped into a classroom setting.
👏👏👏 your newsletter continues to make me feel less alone in wondering why we continue to fawn over tools that (and founders who) do virtually nothing for the vast majority of students (or teachers)
thank you once again, Dan, for speaking truth to the nonsense.
Great article!
It reminds me of this other great article from the neuroscientist and educator Jared Cooney Horvath (https://hbr.org/2024/07/the-limits-of-genai-educators), in which he writes this:
"Unfortunately, more than 40 years of academic research exploring human cognition suggests that generative AI could also harm learning at all levels, from online tutoring to employee training, for three reasons.
Problem 1: Empathy
[...]
it is biologically impossible for humans and AI to develop an empathetic relationship: the transpersonal nature of empathy precludes its emergence in the digital realm.
This is one major reason why students operating in purely digital environments perform worse and are significantly less likely to graduate than comparable students engaged in face-to-face instruction. Without empathy, students become passive receivers of information with little impetus to push through the requisite struggles inherent in the learning process.
Even amongst highly skilled human educators, failure to cultivate an empathetic relationship inevitably hinders learning. And this only serves as a further warning against AI, as it reveals that neither knowledge nor pedagogy (presumably the forte of digital tutors) are sufficient for effective teaching.
The focus on math in AI tutoring demos isn't coincidental - it's because math teachers often have the luxury of instant, clear-cut assessment. In the humanities, where teachers have to consider work done outside class hours, the assessment validity crisis becomes clear. While students could always get help from others, AI has made outsourcing work trivially easy and nearly untraceable. This is precisely why the student-AI chat approach is promising: it has the potential to reveal students' thinking processes AND develop their AI literacy. At least the 'grade the chats' guys are attempting to address what AI skeptics seem content to sidestep: preparing students for a world where AI literacy isn't optional.
The whole notion of “giving teachers the right tools” seems to come from folks who rarely use tools. Every teacher, just like a woodworker or mechanic, needs a whole big box of quality tools. Along with those tools comes acquiring the skill to use each one - the expertise to understand which tool is optimal for which student on which day, as well as the skill to properly use that tool. While machines are very standardized, the mechanic must still adapt to unusual situations. Wood is even more variable and forces the craftsman to adopt non-standard thinking to solve problems. With students, every one is a unique individual - the antithesis of standardization. Only a very robust toolkit combined with deep expertise in human problem solving will make learning successful.
I just cannot believe that technology can replace teachers. And I don’t think anyone who teaches believes it either. We offer too much as humans and no AI, no tech, no scientific advancement can match it.
Do any other teachers think otherwise?
There's a moment in the interview where Anderson Cooper says something like, "Should teachers be worried about this technology replacing them?" and it's like, tell me you don't understand this technology or you don't understand teaching or you don't understand both without telling me.
Just came here to say thank you for this:
“Other countries level the playing field for their kids through redistributive policies like child welfare, public housing, nationalized health care, etc, all of which would require increasing the tax burden on economic elites. It should not surprise us to see which of those dreams receive fawning coverage during primetime corporate media.”
I feel your frustration and share your skepticism here. Case in point: On Monday I was using Google Gemini with a student learning place value. We answered a question correctly, then Gemini replied "Close, but not quite! The 2 in 12,345 is actually in the ten thousands place." ?!?!
On the plus side, the chatbot seems to be fluent in the idiomatic use of "yeah, totally". So there's that going for it.
"Nah, can't really help you with the math, but if you want I can sit in the back of the classroom with you and make wisecracks about the teacher."
I appreciate your questioning of the value of generative AI in education. It's hard, in the face of so much cheerleading about its current and future benefits, to speak out and be heard. I recently had a conversation with a former colleague (not a former educator) who is a partner at BCG, and she was so bullish on its value in education, including in classrooms, that I could not get her to even consider a different perspective. BCG is all in on AI in all forms and all sectors. I have no doubt AI/ML, including generative AI, could help with back-end office functions and administrative tasks in education, much as it's doing in other domains, though I think the financial and environmental costs may be too high given the returns we'll see. But as you and others rightly point out, generative AI, even imagined futuristically, does not align with how children learn, whether in our own academic systems or in settings, past and present, without formal education systems. It all boils down to relationships with people, though the material may be conveyed differently, from observation and imitation to more formal training. How can AI replace that aspect?
"I want to believe as much as anybody that we’ve really truly found the One Weird Trick for human cognition, that we’re all on a rocketship ride to easier teaching and better learning."
Oh you do not. You rightly bring a skeptical mindset to all this nonsensical hype, which is what we need. There's too much hope in education, not enough wry cynicism!
I am kidding, but I am kidding on the square.
I WANT to want to maybe! :D
Dan! Thank you for your powerful and consistent advocacy for students and educators doing authentic teaching and learning.
Have you considered this small change in sequence:
"It should not surprise us to see economic elites dream of leveling the playing field through technology contracts that would benefit other economic elites.
'Yeah, that’s the dream.'"
😭
Dan - seriously. I agree completely. Can we intervene? Can we intro them to some teachers? Can we connect them to some great orgs like TNTP or Leading Educators? MagicSchool is not ok.
The question we need to start with, as always, is “What problem is this supposed to solve?”.
https://open.substack.com/pub/curmudgucation/p/ai-in-ed-the-unanswered-question?r=bk8ww&utm_medium=ios