Nice to see my comment spawned a full article.
That's exactly what I think was missing in your communication about AI. I never thought about the "who am I" part of education; I guess I always thought about it in terms of a college student, not a school student.
This stuff is extremely valuable and eye-opening for a non-educator. We're hardly exposed to such things at all, so I hope you keep us in mind the next time you trash something we are passionate about.
Fair! Thanks for the encouragement to write with more light & less heat here.
Beautifully written!! What a great message for all teachers to read as we begin preparing for the new school year. You are exactly right -- when a child asks a question in class they are essentially asking, "How does this information help me better understand my world and myself?" As we continue to move forward in this "Post AI World" we have to remember that all people seek understanding, not just knowledge.
Thanks for these insights on Dan's point: beyond AI as a technical and commercial fact, there remains the question of its role in education.
I do wonder if it blurs the distinction between "my world and myself," or at least my perception of a separation.
As I understand it, there is a case to be made for cognition as computation; my understanding of the world is computed. This insight comes from our understanding of computation by machines.
Is there useful insight here for math education? Should or could arithmetic be taught and learned as computation? Could anyone point me to current work? Thanks.
I'm curious what you think about the utility of this tech as a quasi-teaching assistant for teachers. For example, I can go to GPT-4 and ask it to come up with several approaches to an upcoming lesson "utilizing Dan Meyer's 3 act math task framework," and the results are good! I just finished my first year in the classroom, and AI enabled a ton of creativity in my instructional design and improved my workflow dramatically. In my eyes, the most powerful short-term benefit of this technology is streamlining and automating repetitive teaching tasks (like ruminating on the perfect context to ignite curiosity in an upcoming lesson, or giving feedback on student work).
I think you're absolutely right to side-eye the most ambitious claims about the impact of Generative AI/chatbots. But I would be curious to see how you'd approach steelmanning the case that these technologies can benefit teachers/students.
AI researchers have managed to essentially distill human knowledge into an impressively functional probabilistic matrix. I think this has massive implications for people who work in the production of knowledge. We have an obligation to optimistically grapple with the opportunities/challenges AI poses, and I think there's room to do that without engaging too much with the most out-there claims about its impact.
I think there's really intense energy at the level of policy, philanthropy, and capital to make AI the next big thing for /students/. Teachers like yourself (congrats on wrapping year one, btw!) can see straight through those aspirations with laser eyes, but I feel a bit of an obligation to make the limitations that are instantly tangible to you more legible to these other folks.
That said, I'd love to let this post stand for a while and, as you suggest, consider the possibility that AI is the next big thing for /teachers/ rather than /students/. The idea that ChatGPT could be a very effective teacher's aide is fairly appealing to me.
I believe you’ve written about this before, but these are the same people who didn’t understand why all students weren’t willing to sit down and learn from Khan Academy of their own volition or passion. While I won’t put these folks in a singular box, there are certainly shared traits amongst technologists that create a blind spot around student engagement for large swathes of the US student population. Yes, there are students who are willing (maybe even eager) to sit down and plug away on their own to learn something new; maybe they’ve accelerated their “Who am I?” development or aren’t too fussed about that question. However, the rest of us need/needed much more than that. So really this is as much a misunderstanding of our own experiences as students as it is about current students. If one is sitting there thinking and strategizing about AI in education and all they can conjure up is their own education experience, then I think there is a huge missing piece in the shape of the multitudes of students we teach in this country.
Agreed, and so as not to knock technologists too much, I think teachers themselves are not immune to these limits on their perspectives. For example, as a new teacher I recall feeling surprised that so many students found algebra so much harder and so much less interesting than I did. If we're going to build anything useful out of this technology for the classroom, we'll need to help technologists understand a bunch of truths about that social environment they might find surprising. Certainly, that's a goal of this post.
In a recent essay, venture capitalist Marc Andreessen led his case for AI with this claim:
“Every child will have an AI tutor that is infinitely patient, infinitely compassionate, infinitely knowledgeable, infinitely helpful. The AI tutor will be by each child’s side every step of their development, helping them maximize their potential with the machine version of infinite love.”
I laughed out loud when I read this, because, well, it’s a ludicrous thing to suggest. But what I’ve been wondering lately is how sincerely technologists believe some version of this vision is true. Put another way, do they *really believe* AI tutors will be able to grapple with the identity-forming aspects of schooling that you eloquently articulate, and manifest “infinite love”? Or are they just convinced AI will kick ass at “how do I do this?” and are adding some rhetorical flourishes about infinite compassion to help sell the thing?
🤦♂️ How did I forget Andreessen's quote in my list? I read that, its conviction and everything, and really have to wonder, "Do these guys know something I don't know?" Like, either they know some kind of AGI / sentience packaged in a humanoid form factor is right around the corner, or they are nuts.
For context: I've been working with AI applications to cognitive tutors, such as https://www.carnegielearning.com/. I still find Papert's "The Children's Machine" a useful manifesto. Legitimate peripheral participation in communities of practice is a framework that views learning as taking on new roles. I want my students to answer "Who am I?" with, "A mathematician: a person who MAKES mathematics with my math friends."
Your essay made me think of game genres as a metaphor. "Who am I?" means a cognitive tutor is a roleplaying game, not a puzzle game. What mathematical roles can students take on via their in-game characters? What corresponding NPC roles can AI support?
I like to invite children to pretend-play a math person, as they would a doctor, a superhero, a firefighter, etc. They have no idea how, not at first. They get to, "Math people solve math problems," if I am lucky. Children struggle to imagine that, too: do I wear a hero cape, or what? They don't know where math problems come from. Mathematical question-posing, conjecturing, problem-posing, etc. is not a thing in their world, and has to be taught from scratch. Maybe AI can play a character who elicits questions? It can be a court jester or a fantasy oracle who does react, does answer (so learners keep posing their precious questions!), but doesn't necessarily make sense. An improvement on https://en.wikipedia.org/wiki/Telling_the_bees
On the other hand, AI can play a quest-giver. Whatever pretty heroic things quest-givers say in RPGs, they are anthropomorphic representations of time and task management software. In that role, AI can serve the hero like Jeeves would. Most math people don't have executive secretaries, but it Would Be Nice.
There is a cognitive tutor where AI plays the role of a student who the actual child has to teach math. That's promising, as roles go.
One more example of a possible role: peer review. A student works on problems or exercises; the AI makes suggestions as a peer math person would, and vice versa. The idea is old: Desmos activities and many curricula already have exercises in that format, eliciting student feedback on real or fictional students' work. AI could make this format more interactive in single-player mode and play a peer reviewer to the learner; most tutor software already does that in some form. AI can also give shape and form to students peer reviewing one another, like software that manages multiplayer game matches. An improvement on https://en.wikipedia.org/wiki/Rubber_duck_debugging
I found this metaphor of roles ("Who?") useful for my work - thank you for the inspiration! I don't believe ANY tech BY ITSELF can ever empower learners in the spirit of "The Children's Machine." Human communities will have to do that. "Thinking with big data" (AI) does give us another tool for our community toolkit, to add to "Thinking with code" and "Thinking with the internet."
Awesome. A nice and eye-opening perspective.
This is a beautiful blog post. “Who am I?” is the definition of engagement.
The first thing that jumped to mind upon finishing this is that generative-AI-based chatbots that assist with cognitive behavioral therapy already exist, and people are paying for them. If anything is more fundamentally "Who am I?" than teaching, it's psychotherapy.
The second thing that came to mind is the emergence of generative AI dating simulators. A lot of money is already being made selling romantic emotional "fulfillment".
Not saying I think either of those is a good idea, just pointing out that monetizing meeting the need of "Who am I?" through AI is already underway, even if people trying to do that in education haven't realized they need to yet.
Let's add to the list an interesting study in which medical experts preferred ChatGPT's responses to medical questions over those of trained doctors. All of these innovations are really interesting to me, but my view is that none of them will change the demand for a relationship with a romantic partner, counselor, or doctor even a little. People without access to those relationships may now have access to a simulacrum that's a little less lousy than before, but the transformation predicted by the people I quote goes quite a bit farther than that.
https://jamanetwork.com/journals/jamainternalmedicine/article-abstract/2804309