26 Comments

Teacher AI use isn’t the story, in my opinion. It’s kid AI use. I get that these tools aren’t as helpful for teachers as we were told; in fact, I completely agree. But kids are using them all the time, sometimes without knowing it, and not just for homework.

AI Literacy is the story, in my opinion. Everyone needs it, even if they don’t plan on using it, if only to be aware of when they are using it and when they are not — since so many established tech platforms are embedding it and acting as “wrappers” without even asking consumers if that’s what they wanted in the first place.

My two cents.

We heard this same appeal approximately 100 years ago about "social media literacy." Kids are using it all the time! Even in class! You don't really hear those appeals too often anymore.

1) Yes, we do. Quite literally all the time. In 2023 and 2024, the first laws ever were passed in the United States restricting social media access for children under specific ages, depending on specific sets of circumstances: Utah, Florida, Arkansas, New York, Texas, Tennessee. That's a response to the dangers of social media and a lack of SM literacy across several layers of society.

2) Cell phone bans in schools have absolutely been a top education story over the last several years. And schools have had to resort to bans because the education market chose not to prioritize social media literacy in the first place. Missed the boat, dropped the ball, let the cat out of the bag, whichever colloquialism you want. All because the prevailing wisdom was "they'll teach themselves" or "it's not a big deal." It quite literally was a huge deal - and still is. And it's this exact thinking that you are expressing that led to it. Oy vey.

3) The lack of social media literacy that resulted from a refusal to prioritize it in education is precisely the reason we have a borderline illiterate society that has no clue how to discern fact from fiction online. It's the reason why the majority of the country believes in conspiracy theories and doesn't trust even the most unbiased news sources like NPR. The choice not to prioritize social media literacy "100 years ago" is not proof that "tech doesn't change education." It's almost exactly the opposite. It changed education a great deal.

4) Your analogy is inaccurate. Social media is not to Education as AI is to Education. A better comparison is: Social Media is to Traditional Print Media as AI is to Education.

For folks who haven't been inside of an industry that has been flipped on its head, it can be hard to recognize when it's happening. But for those of us who have, the warning signs are obvious.

There is absolutely a need to adapt. As someone who has been a classroom teacher in the AI era, and within the last decade, I can say that with a high degree of confidence. And that doesn't mean I'm pro-AI or I think it's the greatest thing since sliced bread. It's precisely because it's flawed that the adaptation needs to occur.

But yes, let's spend more time clapping back at EdTech companies. As if we don't have anything better to do.

I agree, Mike. As a language and literacy researcher, I’ve spent the past two years experimenting with LLMs and exploring how interacting intentionally with a bot on academic tasks can improve metacognition, task analysis, background research, critical thinking, and motivation. You’re right on the money. The crucial need is to understand how kids use the bot now and give them structured and guided experiences to build their capacity to engage the machine flexibly and fluently to enhance their learning in and out of the classroom.

It almost seems as if teachers who resist do so because the bot doesn’t do what the teacher wants it to do. Using AI seriously in teaching and learning means taking learners seriously and expecting more purposeful execution of tasks from them. I suspect that a school year replete with AI-aided projects will improve reading comprehension and written composition in both efficiency and sophistication.

Btw, I also agree with your linking social media and widespread under-education resulting from disengagement with print that is a feature of the cultural and epistemic chaos we are experiencing. The coherence and unity of message in traditional text got lost in the hyperlinks and the social interactions. Making the mistake of ignoring AI until it does exactly what teachers want (tutors? grading papers?) could impact the nature and number of under-educated/non-reading individuals the country produces going forward. It’s not a good idea to resist the inevitable. AI isn’t going anywhere, and I don’t mean that as a suggestion that people should succumb to it. Educators must understand how it works and what kids do with it so we can help them learn with it rather than play around with it or cheat with it.

I couldn't agree more Terry, as usual you've explained it really well here. These lines sum it up for me:

"It almost seems as if teachers who resist do so because the bot doesn’t do what the teacher wants it to do....Making the mistake of ignoring AI until it does exactly what teachers want (tutors? grading papers?) could impact the nature and number of under-educated/non-reading individuals the country produces going forward."

This is a problem that is very hard to navigate. I try to lean on the phrase, "The flaws are features, not bugs." The very fact that they screw up is (partly) what provides opportunity for critical review and analysis -- which to your point can be used to increase metacognition, critical thinking, and task analysis.

It's going to be a long slog, but one of the first tasks is getting educational thought leaders to realize that the negative content is just pushing teachers-in-denial to dig their heels in more. Subsequently, we miss that opportunity you describe to prepare everyone for a world where AI is all around them -- whether these tools are useful for creating curriculum or not (they are not).

Hi Mike, as Benjamin Riley of Cognitive Resonance said on the MyEdtechLife podcast, "AI literacy" often means "AI hype." I wrote a post full of resources for AI literacy that never ask children to use these tools. https://www.criticalinkling.com/p/high-school-ai

I don't hype it at all when I teach AI Literacy. Others might, but I don't.

I call it a "bullshit machine," referencing the academic study from this past June.

The sources you provide are insightful and useful. Absolutely they are a good first step and an important part of the conversation. But that's not going to develop a deep understanding of the nature and essence of a tool that will surround them for the rest of their lives, whether we like it or not. Learning about something by reading articles and watching videos will not give you the experience to navigate it in your life -- which kids will have to do. That's like saying, "here, let's learn how to play basketball by watching videos and reading articles." It's a good first step, but eventually you're going to have to pick up the dang ball.

The frustration that you guys have with EdTech (and Tech) companies is getting in the way of teaching AI Literacy. I understand that you are annoyed by them, you have every right to be. Their foisting of these tools on us will undoubtedly lead to negative unintended consequences, and they are absolutely being irresponsible with their marketing and advertising efforts around these tools. But yelling at clouds isn't going to solve anything. It's way past due to roll up our sleeves and develop real AI literacy in faculty and students, and that involves exploring them critically so that you can understand them.

I wrote about this here:

https://mikekentz.substack.com/p/ai-tutoring-is-not-ai-literacy-a

Sorry for my tone, but I am just so tired of this narrative.

Let’s figure how to use the internet by reading books about the internet. Absurd logic from the overly cautious who have categorically closed their minds to all investigations prior to experimentation.

Amen brother! We can write a paper on how we read books about the internet while we are at it.

You do not need to use LLMs such as ChatGPT to understand how they work. Do we really need to expose children to the Eliza Effect? Do we need to spend environmental resources to educate children about bullshit machines? I have plenty of information about how generative AI works in these two posts. My sources are experts whose work is all publicly accessible - the New York Magazine longform article about Dr. Emily M. Bender I cite in my post about high school students is one such example. https://www.criticalinkling.com/p/ai-vocabulary-for-teachers and https://www.criticalinkling.com/p/be-precise-when-talking-about-ai

Yes, we need to educate them because they are using them. We are not exposing them, they are already exposed. Like I said, giving them information is not the same as giving them (guided) experience.

How can someone understand how a tool works without having used it?

That’s what education gets wrong: listening to a lecture about X or writing a paper about X is not knowing X.

Students are using the tools. How do I know? I give every student an anonymous survey on day 1 of school.

I also see it every day in my classroom & have regular dialogue with them about using AI for other classes & things outside of school.

Are you in front of kids 5 days per week & engaging in these conversations with a diverse population of learners?

If so, I’d love to hear what your takeaways are from those discussions with students.

You said yourself that your kids use the tools. So why do you have to use it with them? Isn't that an endorsement of them? Shouldn't the teacher's role be facilitating critical thinking about them? The environmental costs alone of multiple classes of students entering prompts into LLMs and generating images should give teachers pause.

Students using SnapAI & ChatGPT to cheat is not the same as "learning AI literacy."

Your argument is akin to saying Driver's Ed isn't necessary because kids can drive without it. Using a tool =/= being literate in the use of it.

I want you to think about this excellent sentence from your post:

That's like saying, "here, let's learn how to play basketball by watching videos and reading articles." It's a good first step, but eventually you're going to have to pick up the dang ball.

and follow up with some questions about what students use, why they use it and how they use it.

Is using a calculator to learn integer operations the same as using a paper and pencil?

Is using autocorrect the same as learning to write and spell correctly?

Is using AI to summarize some papers and organize your writing the same as learning to do it yourself, on paper or PC?

If we want to teach LLM literacy, just have all of the students calculate the frequency of words in a book and have them complete a sentence you made up using that data. That is, in essence, what LLMs do, and it is the reason they can't add well. That is all this "AI literacy" is right now, and one is hardly teaching "AI literacy" by showing students how to use ChatGPT. That is like claiming you have taught integer operations by teaching students how to enter them in a calculator.
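The classroom exercise described above can be sketched in a few lines of code. This is a deliberately naive toy (real LLMs model long-range context with neural networks rather than raw bigram counts, and the sample "book" text and function names here are invented for the demo), but it captures the spirit of next-word prediction from observed frequencies:

```python
from collections import Counter, defaultdict

# A tiny stand-in "book" to count word frequencies from.
text = (
    "the cat sat on the mat the cat ate the fish "
    "the dog sat on the rug the dog ate the bone"
)
words = text.split()

# Tally, for each word, how often each other word follows it.
followers = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def complete(start, length=5):
    """Greedily extend a sentence by picking each word's most common follower."""
    out = [start]
    for _ in range(length):
        options = followers.get(out[-1])
        if not options:  # no known follower; stop early
            break
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(complete("the"))
```

Students who build something like this quickly see both why such a system produces plausible-sounding text and why it has no notion of arithmetic or truth, which is arguably the literacy lesson in miniature.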

My favorite line from technology-bound students from back in the day when preparing for the PCAT: "I can't do calculus without a calculator." Actual answer: they can't do calculus.

Ask yourself if all of the AI that people interact with is LLMs, what is "AI Literacy"?

I cannot tell if you are agreeing or disagreeing with me, so I will just respond as if it is all neutral.

1 - What-Why-How is the literal framework for my "grade the chats" methodology. It is the crux of the AI Purpose Statement that comes before AI use in the classroom and the AI Reflection that occurs afterward. Think we are in agreement there.

2 - As to your last question I also agree, which is why I've tried to break out AI Literacy into three different subdivisions - Machine Literacy, AI Historical Studies, and LLM Literacy. There probably could be more.

https://mikekentz.substack.com/p/ai-literacy-is-a-soft-skill

Last, I wouldn't claim to be teaching them AI Literacy if I said I was teaching them "how to use ChatGPT." That's not what I teach. I teach how to think about ChatGPT, how to critically analyze it (The Adversary is a good example), how to approach it, and how to maintain a healthy relationship with it. That is far more nuanced than "how to use ChatGPT."

I appreciate your comment. You make good points about the integer operations and calculus. The BCG study has some good nuggets that support your point.

A big part of why AI Literacy is still in the hype stage is that teachers have not been exposed to the concepts, let alone grappled with them.

Saying something is hype because bureaucratic institutions haven’t rapidly adopted it is a gross misrepresentation at best, & laughable to anyone currently working in K-12 education.

The internet is still mostly “hype”, judging by the number of teachers assigning a paper test based on a textbook.

And to what end does AI literacy matter?

Is it for teacher to spam more worksheets faster?

Or for students to learn the skills that will dramatically affect their ability to learn, read, & write?

AI is a tool & a process, not the end goal. Student learning is the end goal, and the highest purpose of teachers.

Thank you for writing this. I have seen the CEO of MagicSchool brag about using teachers for free publicity and use a child's suicide as an opportunity to promote his business. To his credit, he deleted that post when I called him on it.

I just like creating my own content, thinking about that one specific question that will bring out a common misconception in just the right way, seeing a story in the news and thinking, "Hey, I could make a math question out of that!" To me - and I suspect to most teachers - this is the fun stuff, why would I want to hand it over to a machine? So I can focus more on record-keeping and other non-fun stuff that would probably be done better by a machine?

In my QR class this morning we were looking at a lesson that's about the graphical representation of data, emphasizing that these are choices that people make, perhaps to emphasize one thing and to de-emphasize another. I started the discussion with the Presidential election results: a map of the US with the electoral college results, and a pie chart of the popular vote numbers I made using a popular AI program called Microsoft Excel. Questions: What story is each graph telling? Are they telling the same story? And then, in discussion, I came up with, "If you were just elected President, which graph would you want to look at to guide you in the decisions you need to make?" The idea for comparing the two graphs just popped into my head yesterday morning; the "If you were President" question popped into my head during the class discussion. And this is a part of my job that I LOVE. I don't know what prompt I'd need to send ChatGPT to produce this lesson, and I don't really care to know.

How do you explain Chegg, then? Look at their 2-year chart, especially after ChatGPT became mainstream. Students are voting with their wallets.

It sounds like you are talking about student AI use and he is talking about teacher AI use. I agree with you (hence my comments above). The story about teacher AI use is a way to clap back at EdTech companies, but it misses the bigger issue. Students are using it in a wide variety of ways. Stories like these continue to distract us from that more pertinent issue and slow the process of developing meaningful AI literacy for adults and young people.

It's like people writing in 2010 that teachers shouldn't let kids use Wikipedia for research, all the while ignoring the fact that kids are using it irresponsibly the whole time. We eventually got to a more nuanced place where teachers learned to say, "You can use Wikipedia as a starting point, but you can't cite it." That's what these articles should be about, but unfortunately there is so much anger/denial/grief that we can't reach a place of stoic acceptance regarding the reality we are in.

https://mikekentz.substack.com/p/acceptance-the-final-boss-and-the

https://mikekentz.substack.com/p/ai-tutoring-is-not-ai-literacy-a

There is a lot to agree with here, but, I would push back on the notion that professional development has been tried and not successful. We vastly underinvested in professional development in the early Internet era, and seemed to spend even less as we went from computer labs to net-connected devices in every pocket. Then we banned phones, then we relented due to vague notions of "learning tools," but still didn't help teachers figure it out. Then Covid happened and the lack of professional development really hurt us. Now we're banning phones. AI is going to change a lot here... and we must invest in people and infrastructure or our worst concerns about AI in education will become the default option.

Thanks for bringing up some important points. Sure, the numbers haven’t moved much, but honestly, it’s just a matter of time before they do. The tech is getting better, training is improving, and lots of people are starting to see the potential. I always encourage my teachers to use AI tools to enhance the teaching and learning process; it’s a great way to save time and create more engaging lessons. At the same time, I also encourage students to use AI tools to solve problems and explore new ideas, but never to replace their own thinking or to do the actual tasks for them. AI should be a tool to support. Adoption will grow in the future, and there’ll be no turning back. It’s kind of like smartphones: once people started using them, you can’t imagine living without them.
