10 Comments
Jun 26 · Liked by Dan Meyer

Awesome stuff Dan. I’m particularly struck by the 2nd question “Will generative AI help bring teachers back to the profession?” Seriously? This question feels so out of touch with reality. I get that it’s a survey question but come on - talk to just about any teacher / former teacher and they’ll tell you boatloads of things before “generative AI.” My word.

Thank you for your strong words and continued clarity on thinking through all issues re: AI in education 🙌🏼

Jun 26 · edited Jun 26 · Liked by Dan Meyer

Nice point on The Diamond Age. It has always bugged me that the human dimension of learning is an explicit theme of the novel that gets missed by so many readers who fall in love with the idea of “The Book.” But it works the way it does only because a human spends hours and hours each day enlivening the educational experiences “The Book” offers.


I'm generally a fan of this blog, but have to quibble with the premise of this post.

On the one hand, you (correctly) point out that too many AI apps are just very shallow UX innovations on top of one of the standard-issue chatbots.

On the other hand, the core problem you are highlighting is just another UX problem, admittedly a higher-quality one than most folks are thinking about. Even without getting uber-intrusive and detecting facial emotion from a camera, activity monitoring and occasional screenshots are enough to detect when someone is stuck or going down the wrong path.

Caveat: I am myself building / shilling an AI tool (hitwit.ai), a virtual TA for higher ed. I like to think some of us are both mindful of how shallow some of the hype is and hopeful for what a well-crafted solution can bring about, in an age when resource inequality results in massive educational and social inequality.


I'm not a fan of AI tutors in general. I agree they currently do not effectively replace humans, nor do I think they should. However, I wouldn't put it past those who believe in the path these solutions are on to try to address the problem of when to intercede with a student by using a camera feed, a la testing solutions that monitor students. I do not advocate this approach, but it certainly seems within the realm of technological possibility. And, given how many of these companies approach ed tech, where the solution to the next challenge they encounter is more tech, that seems more likely than a change in direction toward solutions that address more pressing use cases.


We'll have each student wear an Apple Watch, so we can monitor heart rate as a clue to their anxiety level. This ends with students being wired up like NASA astronauts.


I wonder if part of the issue is bound up in linguistics. Tutor and teacher are reserved for humans, not machines. I found Dan’s words about what (human) tutors actually do compelling. I thought about the sensitivity and responsiveness great tutoring requires, and I drew on my own clinical experience as a reading specialist tutoring profoundly struggling readers. Right now I can’t see how AI could simulate what I did. I would think about experiences during tutorials unbidden, intuit something to try, carry these kids in my mind and think about their issues on long runs in the morning. I worked with some kids for long periods, up to two years.

I’m not sure the root question is ethical, as in “should AI tutors be permitted,” but pragmatic: can an occasional picture of a learner’s face, or a tongue-in-cheek Apple Watch taking a pulse, let AI perform such a thoroughly human activity? Remember that high-class tutors have been responsible for the edification and education of royalty throughout the ages. Can a machine replicate that pattern for high-poverty children?


I don't think it can, at least not in the near future, but that doesn't mean it won't be tried.


Hope you don't mind a follow-up with a link. On the very day you featured my comment about how the AI rollouts get press attention but the AI disasters get ignored, the non-profit journalism outfit The74 broke the news that AllHere, the ed-tech company building Ed, the LA Unified School District's chatbot, fired its CEO and laid off most of its staff. So far, no other news outlets have covered the story. Here is my write-up: https://ailogblog.substack.com/p/the-la-unified-school-districts-chatbot


I don't know the Stephenson story, but an Asimov story, "The Fun They Had," concerns a future where human teaching is dimly remembered, as humans are taught by robots. Short, sweet, to the point, and, like many aspects of science fiction, it is being turned into "science fact" by technologists. (Honestly, science fiction's relationship to "science fact" is an incredibly interesting topic: writers dream the future and it comes to be.)


Definitely, non-verbal cues matter a ton. Plus, I'm building a model of how a particular student thinks as we work 1:1. I start to recognize "this person has a lot of trouble with abstractions, I need to concretize," or "the problem this student is having is not actually what they say the problem is: they say they don't understand a particular point, but what they are struggling with is considerably upstream of that point," etc.
