Why Generative AI Will Underperform Expectations in Education
It feels nearly impossible for technologists and educators to speak across domain expertise here, and I'm going to give it another try anyway.
I’m sure this commenter is correct.
Honestly I still don’t get why you have such a problem with AI in education, I am sure you have a good point but you aren’t getting it across to someone who isn’t an educator.
Over the last year, I have participated in some very unproductive discussions about generative AI in education. Tons of heat, zero light. All of those conversations deteriorated in no small part because of, well, my various personality defects of course, but also for lack of shared context between me and my counterparts.
My counterparts know much about AI technology and little about schools. I know much about schools and little about AI technology.
Here is my current best shot at creating a shared context between us.
First, I am not responding here to someone who says, “These AI chatbots might do some good for students and teachers.” I believe that. I am responding instead to people who suggest that AI chatbots will be “as good a tutor as any human” or that they’ll “enable every student in the United States, and eventually on the planet, to effectively have a world-class personal tutor” or that they’ll lead to “fewer teachers being employed – possibly even none” or that “Every child will have an AI tutor that is infinitely patient, infinitely compassionate, infinitely knowledgeable, infinitely helpful” and to anyone persuaded by those claims.
I’m not trying to explain why AI will be bad for education, but rather why AI will not transform education within the next five years, and I’d guess much longer. AI tutors won’t produce job loss for teachers. Students and parents won’t prefer AI tutors to human tutors. AI tutors will not transform the core of education.
Here is where I imagine you, someone who understands AI technology but has never taught, and I, someone who has taught but has a twelve-year-old’s beep boop understanding of AI technology, might agree:
Students attend school to answer two questions.
Who am I?
How do I do this?
“Who am I?” encompasses all kinds of questions about identity, purpose, value, and belonging. It asks “Who am I in relation to you? In relation to my community? In relation to grownups and systems of power?” Maybe you haven’t thought about that part of schooling for a long while, maybe because your own answers to those questions are quite well developed. I hope you can believe me that this question matters enormously to children.
Meanwhile, “How do I do this?” is a question about the ideas and tools of a discipline. In Mathworld, where I live, those questions might be “How do I add two fractions with different denominators?” or “How do I solve the equation 2x - 3 = 5?”
I hope we both agree that AI chatbots are more useful for one of those questions than the other. I hope we both agree that students should not type “Who am I?” into ChatGPT or Khanmigo. No good will come.
If you want to understand why AI will fail to transform education, all you need to understand is that, for most K-12 students, those two questions are the same question.
They are not different questions. When I raise my hand with a question about “How do I do this?” I am also asking the question, “Who am I?” When my classmates or teacher or tutor listen to my ideas about math, when they tell me my ideas are valuable or smart or can be improved, I am learning not just about mathematics but about myself in relation to other people.
Every minute students spend with a chatbot is a minute not getting an answer to one of the two questions they came to school to answer. Whenever students learn "how do I do this?" with another person, they are getting answers to both. Students have only a certain limited appetite for learning that disassociates one question from the other.
This is not uniformly true across all students, of course. As kids get older and turn into adults, their identities become less and less pliable. They can separate the study of a new idea from the study of self. Students will appreciate AI chatbots more in post-secondary spaces, for example, where their identities are better defined and where their next best alternative to the chatbot—a 500-person lecture hall where you are anonymous to your instructor and classmates—doesn’t do much to answer “Who am I?” anyway.
That’s it. The main liability of teachers in conversations like these is that they don’t understand AI technologies all that well. The main liability of AI technologists is that they don’t understand all that well how important the question “Who am I?” is to school learning and how hard it is for most students to separate that question from the kinds of questions that AI chatbots seem pretty good at answering. Hope this helps.
2023 July 20. The commenter who inspired this post has responded:
Nice to see my comment spawned a full article.
That's exactly what I think was missing in your communication about AI, I never thought about the "who am I" part of education, I guess I always thought about it in terms of a college student, not in terms of a school student.
This stuff is extremely valuable and eye opening for a non-educator, we aren't exposed to such things almost at all, so I hope you keep us in mind the next time you trash something we are passionate about.
2023 Aug 1.
Joseph South, the chief innovation officer at ISTE, offers another overly triumphal quote that I wanted to save for posterity:
A carefully engineered, focused chatbot, however, could “engage students from where they are very specifically, and that can be incredibly powerful,” South said. “A teacher doesn’t have time to do that with every student. But AI does.”
What Else?
The 74 has an illuminating national survey of teachers, students, and parents on their experiences with ChatGPT.
Michael Feldstein’s recent pessimistic analysis of the edtech market examines post-secondary edtech specifically but feels quite applicable to K-12 as well. On AI technologies: “Generally speaking, we tend to overestimate how quickly these technologies will penetrate particular markets because we forget about all the human stuff that gets in the way. And there is a lot of human stuff in education.”
Audrey Watters, one of the sharpest critics of education technologies, is back (not gonna remind myself why she ever left) with a Substack that examines fitness technologies and the future of food. It’s great.
Beautifully written!! What a great message for all teachers to read as we begin preparing for the new school year. You are exactly right -- when a child asks a question in class they are essentially asking, "How does this information help me better understand my world and myself?" As we continue to move forward in this "Post AI World" we have to remember that all people seek understanding, not just knowledge.