25 Comments

I strongly suspect Andrej Karpathy can be significantly more successful than most. I have watched two of his videos; they are code-alongs, and he displays a remarkable and uncanny gift for sifting through the details and getting to the crux of a concept or idea, a gift typically reserved for those who understand the concept well and have worked in a setting where they need to convey it. Additionally, working in computer vision he developed the mindset of "chunking" the world and representing it mathematically for a computer to manipulate. I am hoping and assuming he will do the same for the data he will collect on his platform from learners. Finally, I imagine he has put aside a few dollars to live on and that teaching is a passion more than a means to an end, which should allow him to use techniques students don't necessarily enjoy but, instead, struggle with and learn a great deal from. That idea is described so well in PNAS Vol. 116, No. 39: "these results suggest that when students experience the increased cognitive effort associated with active learning, they initially take that effort to signify poorer learning. That disconnect may have a detrimental effect on students’ motivation, engagement, and ability to self-regulate their own learning." Having experienced Karpathy's videos, I am now going to see how I might be able to access his platform. So exciting!


Hi. I'm Andrew and I'm looking for constructive conversations, not arguments, on the internet.

I'd like to ask about this idea:

"allow him to use techniques students don't necessarily enjoy but, instead, struggle with and learn a great deal from. "

Can you think of an example of a technique like this? When I try, I run up against scenarios that rely either on social/physical proximity or authority, or on some sort of third entity performing adaptive support (study group, lab, TA).


Hello Andrew, the paper I referenced was about active learning, which students tend to feel contributes less to their learning but which shows considerable achievement gains. I taught at a community college for many years, and for my class it basically involved getting students to attempt to solve problems related to what we were going over. The problem sets were often kooky and they rarely had a single, satisfying correct answer. It is awkward for students, as they don't feel like experts yet and don't wish to put wrong things on the paper. They would prefer to sit and listen. Magic happens, though, when we are wrong early and often with low penalties. I think the best "problem sets" allow exploring and identifying what you know and what you did not quite get, so that the instructor receives quick feedback on where to go next (or AI-directed lessons). I imagined challenging problem sets with helpful feedback that were relatively low/no stakes and would shape subsequent work. The AI portion would involve mapping the problem sets to relevant skills, crafting problems for assessing each skill to find areas to work on, and shaping subsequent teaching to those areas. The tricky part would be training an AI to map the concepts, give the prompts and useful feedback, and suggest the next steps. I suspect Andrej might be the right guy for the job. If it is too large a task, it would still be helpful to see what he tried and why.

No TA or physical proximity should be needed if the problem sets were mapped properly and assessed what was needed in order to choose the next best steps. Anyone in IT spends quite a bit of the day failing to solve complex, difficult problems without a single, satisfying correct answer and learning from it all on their own, hopefully with useful error codes.
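
A minimal sketch of what that skill mapping might look like, with hypothetical problem IDs, skill tags, and a simple averaging threshold (none of these names or numbers come from Karpathy's platform or any real product): each low-stakes problem is tagged with the skills it assesses, and the results point to the skills to revisit next.

```python
from collections import defaultdict

# Hypothetical mapping from problem IDs to the skills they assess.
PROBLEM_SKILLS = {
    "p1": ["fraction_equivalence"],
    "p2": ["fraction_equivalence", "decimal_conversion"],
    "p3": ["decimal_conversion"],
}

def weakest_skills(results, threshold=0.6):
    """Average per-skill scores from low-stakes attempts and return the
    skills that most need another pass (the threshold is an assumption)."""
    totals, counts = defaultdict(float), defaultdict(int)
    for problem_id, score in results.items():  # score in [0, 1]
        for skill in PROBLEM_SKILLS.get(problem_id, []):
            totals[skill] += score
            counts[skill] += 1
    averages = {skill: totals[skill] / counts[skill] for skill in totals}
    return sorted(skill for skill, avg in averages.items() if avg < threshold)

# Quick feedback from one problem set shapes the next lesson.
print(weakest_skills({"p1": 0.9, "p2": 0.4, "p3": 0.3}))
# -> ['decimal_conversion']
```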


Thanks for that detailed response. I've seen a lot of those same student behaviors when facing a well-designed problem/challenge sequence. In the CS context, I think about the well-scaffolded labs from CS50. I strongly agree that the way students learn is by DOING the work. A colleague of mine with a mystical bent used to refer to this as Grimoire Learning: it may seem weird, but if you perform this set of ritual steps, understanding will come.

My experience also suggests that working independently through even well-designed Grimoire sequences is a very autodidact move. The other 95% of students need a partner/guide to discuss and externalize the challenges provided. Those students need to exist in a community that values this knowledge/process and celebrates the struggles along the way.

I have had some positive AI experiences working through mysterious exercises, and as a CS teacher I'm reasonably optimistic that an AI tutor might provide useful just-in-time support through challenging moments. But I haven't seen any LLM behavior yet that suggests they're capable of the care and support that learners need in those powerful, open-ended learning moments.


I could not agree more with your assessment of the LLM behavior. I was imagining something much more hard coded by professionals. If anyone was being tutored by an LLM, I fear for their future...and ours.


You often point out that AI chat bots can’t form a meaningful relationship with people. One example of a successful pseudo-relationship with a tech agent is when your phone pops up a memory from 10 years ago. It does give a warm fuzzy. Along those lines, could Desmos create a chat bot that the students in the class train with their own writing? After selected writing prompts in Desmos activities, the class selects 4-5 responses as training material for the chat bot. Moving forward, the chat bot is enabled to say stuff like, “Remember when Angel said two fractions are equal if they have the same decimal? You could try that strategy here.” The bot would refer students back to the memories of their class learning experiences, and might even be able to show sketches the students drew, etc. And students’ writing attempts become more meaningful because some of them would get selected to literally train the class’s personal chat bot.
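
A rough sketch of how that class-trained memory might be wired up, assuming a simple store of the selected quotes tagged by concept (the second student name, the concept tags, and the memory_hint helper are invented for illustration and are not a Desmos feature):

```python
# Hypothetical store of class-selected responses, tagged by concept.
CLASS_MEMORIES = [
    {"student": "Angel", "concept": "equivalent_fractions",
     "quote": "two fractions are equal if they have the same decimal"},
    {"student": "Priya", "concept": "area_model",
     "quote": "you can split the rectangle into equal pieces first"},
]

def memory_hint(concept):
    """Point students back to something a classmate actually wrote,
    rather than generating brand-new text."""
    for memory in CLASS_MEMORIES:
        if memory["concept"] == concept:
            return (f'Remember when {memory["student"]} said '
                    f'"{memory["quote"]}"? You could try that strategy here.')
    return None

print(memory_hint("equivalent_fractions"))
# -> Remember when Angel said "two fractions are equal if they have the
#    same decimal"? You could try that strategy here.
```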

author

This kind of thinking about human-computer interactions is really uncommon right now—seeing humans as a source of value, seeing GROUPS of humans as a source of even GREATER value. I'll give your specifics more thought but I want to endorse their general nature.


This kind of thing could be genuinely useful to teachers, as it could help with the memory of who said what as well as with formatting the presentation back to the students.


I agree but I would definitely have privacy concerns.


I've been thinking about this, especially as the AI wave has fractured here in Spain in various ways.

There's a clear analogy to an LLM as a form of public classroom space, in the same lineage as the bulletin board in either technological sense. With more active development of the sense of a plausible Classroom Commons, we would be able to have a more nuanced conversation around local control and granular use of student data.


Right. Can there be onboard AI, like the way hospitals have their own server behind a firewall to protect patient privacy? Not sure if I’m using the right words.


The simple truth is that most children crave a meaningful relationship with their teachers. This is a normal part of growing up. The digital tools are just that. Tools. We are looking for ways to learn more about learning and how to make learning a bit more accessible and memorable.

AI has been fun as a tool to spark my creativity in new ways.

I get excited thinking about how Star Trek this world is becoming because of advancements in technology! I wish I had more time to play and create.


Well said! And right there with you.


I remember putting those punchcards with the code in and just getting back "X" ;) I changed my major....

author

Nice history lesson!


He is focused at first on an AI course, which is much more of a natural fit for at least 3 reasons:

1. The demographic is self-motivated CS types

2. They *prefer* to learn online

3. Using a computer is necessary for the learning

BTW, these are the courses and people who make it on Coursera, etc.

The real test would be extending to any other discipline. Good luck with that! :)

author

Yeah, starting with AI makes a lot of sense. Building a platform for any engineer to upskill in any engineering discipline would also make sense. Building a platform for "anyone to learn anything" will require learning a bunch of lessons his predecessors have not.


I am hoping that Karpathy's classes will inspire people in other areas to apply what they have learned to teaching and learning in their domain.


Agreed. And just knowing the *limitations* of a given technology is a good thing, to guide the eventual design.


Perhaps not the best fit for this post and replies, but:

if you are throwing out or ignoring 95% of your audience (while pretending that you are not), that seems like you are dealing in bad faith, so all bets are off.

but

if you are focused on the 5% of your audience and are making that clear (either to yourself or others), then you can claim to be dealing in good faith, and keep on trucking.

The above thoughts about when one is operating in good faith vs. bad faith explain to me how Dan Meyer can remain reasonable in his evaluations of EdTech, an activity that would prompt my wife to ban me from the breakfast nook (no banging on the table is allowed). I could see the EdTech bros thinking they are talking about / working for the 5% and hence, that they are operating in good faith.


Brilliant overview. And responding to SM McCarthy: it's not about understanding the concepts; it's about understanding children. Every couple of weeks on r/EdTech (Reddit) there is somebody starting a new edtech "learning" company asking for advice. It takes three years to hit your stride as a teacher, five years to master even one subject and age range. Very few who design education software have any experience running successful classrooms or schools (successful = 80 to 90% of your students reaching their potential, at least). Desmos was a tool that good math teachers loved to use. It was not a replacement for the math teachers. I take it that it worked because it was designed by working teachers. Whether the rest of Amplify's offerings can say the same, I do not know!

But if Karpathy wants to create an AI tutor that can project a 3-D hologram into the room and sit down with young princelings and give them the education of, say, Queen Elizabeth, that might work. But that's tutoring, not teaching.


Tedward, I am quite certain Andrej's audience is adults and that he has planned for tutoring more than credentialing, an unfortunate side effect of our formal education systems. While I have never taught in K-12, I feel STRONGLY that AI has little place in the classroom with that age group. Tutoring at home, with a parent's desire and consent, would sit differently with me for that age group, but it has no place in a school system. I actually just wasted my time reaching out to edtech at the Department of Education about this. Everyone wants AI to do the difficult work, but no one wants to pay the ACTUAL professionals to train it. That leads to a product that is often anything but desirable, especially for K-12. I am certain that, with adequate funding, today's teachers could train an excellent system to evaluate their desired learning objectives in the assignments they wrote and corrected, and to produce the wordy documents the state requires.

I see Andrej's vision as a great experiment on consenting adults but not as a substitute for a caring, professional teacher.


Great observations! Here's another lens for thinking about this: I think we often muddle the question of "how" we learn with "why" we learn. Dan's description of "responsive feedback" is a brilliant answer to "how" (although I'd guess there are many answers to the how). "Why" is incredibly intricate, and changes with our own changes in culture and circumstances. Community is one powerful answer to "why" -- again, necessary but possibly not sufficient. So the question of how successful Karpathy will be will hinge on what he (or others) define as *success*: teaching one student? Teaching 1 billion? And from the students' point of view: 1 inspired student? 1 billion?

author

Agreed. And I wouldn't denigrate anyone who said, "I'm going to build the best platform for software engineers to upskill in artificial intelligence." Those are fine ambitions. But trying to build a platform for "anyone to learn anything" requires sensitivity to "culture and circumstance" that many of Karpathy's predecessors have not demonstrated.


I think many people with a high desire to learn AI will be happy to take the course, because of who's running it.
