Do Kids Want a Personalized Netflix for Education?
What happens to education in a world where you can personalize explanation videos on demand?
Make me a video about subtracting integers in the style of Grant Sanderson.
This is roughly what Agasthya Pradhan Shenoy showed off recently on LinkedIn—a completely AI-generated video explaining the addition of integers in the style and voice of Grant Sanderson, the popular math YouTuber who posts under the name 3Blue1Brown.
I think it is an extremely impressive demo. Imagine you could ask for an explainer video about any topic in the style of any artist you want. I’d like to set aside concerns about appropriating someone else’s identity like this for a minute and just wonder: is this useful?
Me, I think it’s possible that generative AI is extremely powerful consumer technology and very average education technology.
For example, I can more easily imagine that AI will make education unnecessary (by automating every kind of work, let’s say, or enslaving all of us in data centers) than I can imagine AI transforming education.
I do not know Shenoy’s larger body of work at all, but I do know that this current crop of generative AI edtech developers and I understand education very differently. They imagine that students most need dynamic content generation when instead students most need dynamic connection generation—connection between people and ideas. When software developers aim at the cognitive aspects of education without understanding the social aspects, they miss the dartboard entirely.
In a class I helped out in on Monday, students seemed most engaged when the teacher asked them for examples of things that numbered one million, one billion, and one trillion. Students who had previously been withdrawn opened up, each one offering ideas from their personal interests and from their cultural and historical knowledge. Kids debated one another. The teacher demonstrated curiosity about their ideas and searched the internet for facts to help referee their debates.
Make me a video about things that measure in millions, billions, and trillions in the style of Mr. Beast.
I’m sorry, but this wouldn’t have improved outcomes among those kids. The vast majority of them want their content and connection combined and would no sooner deconstruct them than eat their turkey, bread, and mayonnaise separately.
We have tried this. In 2000, we did not have a vast library of explanatory videos to access at will. In 2025 we do. From “no videos” to “vast library of videos” is an enormous jump, far larger than the jump from “vast library of videos” to “videos I can customize at will.” If the vast library of videos did not change outcomes, we should not have high hopes for the customized videos. Netflix for education hasn’t worked.
We have tried this. In 2018, Netflix produced an episode of the show Black Mirror called “Bandersnatch,” in which viewers were invited to choose their own path through the episode. Netflix has since abandoned this style of storytelling. It was an unsuccessful episode of television on multiple levels, but a major one was that it fragmented its audience and dissolved the social bonds people form over the collective experience of media. People couldn’t share their reactions to the same story because their stories were different! No one’s memes made sense to anyone else! Personalized Netflix hasn’t worked.
People do not want their TV shows or their New York Times crossword puzzles or their Wordles customized to their interests. It matters to people that they’re watching and working on the same thing on the same day as millions of other people. Similarly, kids do so much of what they do, including learning in schools, out of a desire to connect with others in the place we call a classroom, to know and be known by their classmates and teacher.
Kids do things for reasons, maybe reasons they can’t yet articulate or don’t yet fully understand, and when you take away those reasons, they will stop doing those things.
Featured Comment
I was talking about teaching with an ed-tech exec once and he said, "well, if you can't measure it, it doesn't exist." I struggled to square that with what was happening in my classroom. I was seeing so much growth and chatter and "aha" moments - i.e., existence. I was really proud of those moments, but I recognized how hard it would be to measure them. To the dominant paradigm, that teaching, in all its mess and beauty, doesn't exist and will not exist until they can measure it. Unfortunately the way it was framed was as if the immeasurability of those important moments was my problem not theirs.
Odds & Ends
¶ Bethanie Drake-Maples of Atypical AI said something smart on the Edtech Insiders podcast:
The big issue with personalization is that it must coincide with orchestration. Teachers are going nuts with all of these over-personalized learning paths and it’s really hard for them to put it back together and find a way to facilitate learning with all of these different needs. So we call that orchestration, right? And there is absolute power in AI to be able to take these disparate learning needs and bring them back into an instructional whole and a daily plan that allows people to pair correctly and the right misconceptions to be retaught. But if you don’t have it, you’re just going to be driving people crazy.
¶ Matt Barnum and Deepa Seetharaman of The Wall Street Journal go long on AI cheating. This part stuck out to me, and it resonates with all of the companies moving to in-person job interviews.
Joshua Allard-Howells, a high-school English teacher in Sonoma County, Calif., said AI cheating spread quickly among his students last year. He now requires them to write their first drafts in class by hand, with computers and phones prohibited and out of reach. The change has produced an unexpected benefit, he said: Students take more time on their work, and their writing is more authentic.
¶ I don’t think it’s possible to build useful tools for teachers when you don’t understand their work. I don’t know how you understand the work of teaching without teaching. But if it is possible, it’s because you devoured the writing of the dwindling number of teachers who have the time and energy to share their insights on the internet. For example:
Dylan Kane takes you inside the many quotidian and aggravating tasks you may not have considered the work of teaching. “A bag of Takis is crinkling but I don't know where it’s coming from. Is it Timmy again?”
Instructional coach Erin takes you inside her week as a substitute teacher and from that perspective reveals the invisible social structures and routines that make classroom learning possible.
Education Week has a well-reported piece on what it’s like to be a teacher in 2025. “Convincing students to see value in their education and to apply themselves to their work has gotten harder, the teachers said. From the core subjects all the way to electives like art, teachers have faced student disengagement in learning—and had to brainstorm new ways to motivate them.”
¶ Larry Ferlazzo is retiring. What a career.
¶ Some of my favorite researchers are conducting a survey seeking to understand how grades 4-9 mathematics teachers in the US make decisions about their curriculum when planning and teaching mathematics lessons in the middle grades. If that’s you, take the survey, get $10 Amazon cash, and help us all learn.
Comments

Great points. I’ve had this term knocking around in my head—“shiny uselessness”—and I think you’ve defined it here: AI outputs that are impressive and attractive if you think about them for <30 seconds.
But after my initial rush of “omg did I just kill Khan Academy??”, it became pretty clear that *even if it did replace Khan Academy, it would not change the status quo*: having a kid passively sit in front of a thing that’s talking at them. Add some personalization and maybe engagement goes up, but if engagement was the same as learning, every Duolingo user would be a polyglot.
I fully agree that we’ve been fed a false narrative of “personalized, on-demand = better”. In fact, you can read about how curation and shared experience are paramount to my design philosophy here: https://teachinglabstudio.beehiiv.com/p/ai-for-humans.
From that essay:
“I should say here that this is all experimental. My way is not the only way, and is likely not the best way. But it’s clear to me that there is a better way than [chatbots and lesson plan generators].…
…The promise of AI has been conflated with the promise of productivity. But to me, the real promise of AI [should be more opportunities to] deeply engage with each other, with ideas, and with the process of creation. We can do that—I can see that future clearly—but only if we commit. Commit to shifting paradigms, commit to deep engagement, commit to human experience.”
Personalized content isn’t that. But personalized content that provides an opening to deeper human relationships—“let’s turn Sam’s answer to #3 into a video. How is it different from Sarah’s written explanation? Do both strategies hold for #4?”—might be.
Thanks as always for the thoughtful post. Excited to hear others’ thoughts!
(P.S. The narration in the AI-generated video is a clone of my own voice, not Grant Sanderson’s. But it does use his open-source manim library!)
I loved this line: “When software developers aim at the cognitive aspects of education without understanding the social aspects, they miss the dartboard entirely.” Awesome share.