It's certainly a mistake to classify AI tutors as equivalent to human tutors. It's self-evident (or should be) that the key element of a person-to-person tutoring experience is the relationship between the tutor and the student -- hence why not a single tutor I know is concerned about losing their job to an AI chatbot, nor should they be.
But I don't fully agree with your post. For one thing, the trope that the fancy new technology inhibits critical child-focused learning practices could easily be applied to video tutoring. You're tutoring a 5th grader over video chat, and you're showing this student how to convert mixed numbers into improper fractions. Couldn't you very easily make a case that it would be much better to tutor this kid in person? That holding up a piece of loose leaf to a camera is not the same as having your tutor watch over your shoulder as you work step-by-step? That getting full human context and building a relationship is harder over Zoom? That your student could have a phone with them, could get distracted, could work in a reclined position on their couch instead of in an active learning position at their desk? But OF COURSE we've come to accept that there are many benefits to video tutoring: the ability to easily embed media, quick dissemination of materials, and -- here's the big one -- flexibility to meet kids all over the world.
Now: can an AI tutor build a relationship? No. (Not all human tutors do either.) But can they be pushy? Can they ask for context? Can they use multimedia? Depends on the tutoring program you're using -- much like it depends on the human tutor you're working with. FWIW, I know a shocking number of virtual tutors who don't even own an Apple pencil, or understand Desmos, or who constantly multitask while working. Will a human tutor be available to you at 11pm after school + soccer game + finishing homework + procrastinating because you're 15 years old? Probably not. And -- of course -- will a human tutor work with you for free? Or $20 a month? Do you even want a human tutor who is charging $20 a month?
Excellent human tutoring is irreplaceable. It's also expensive, hard to scale, and hard to come by. AI tutoring is a fundamentally different beast, and one that is in its infancy. The real question should not be, can AI tutoring replace human tutoring? Rather, can AI tutoring help kids? Even right now, with GPT-4 less than a year old, the answer to that question is yes. Skepticism is vital, and each platform should be considered on an individual basis. But I would not compare infant-stage AI tutoring to the Platonic ideal of human tutoring.
You're taking the "positive framing" POV here, which I think is a defensible perspective, though time will tell. I think you're also naming certain costs of human tutors and benefits of AI tutors that I haven't mentioned here. Also fair. I am mainly critiquing here the view of the tech elite that at some point in the near future there will be no costs to the AI tutor relative to humans, only benefits.
Just adding to your points on the benefits of Zoom tutoring: With the college students I'm helping I think it actually can work better than in-person, and not just for the added convenience (although it's great to be able to say "How about right now?" to a student who needs help rather than "Let's schedule an appointment.")
For online tutoring, I create a Google Doc with edit privileges for both of us. I might paste in a screenshot of a question the student is struggling with; they type some stuff, I watch, and maybe do some typing myself, although I try to limit that. We don't get hung up on textbook-appearance formatting: x^2 is good enough, and using / for fractions makes students think about proper use of parentheses. An added bonus is that you can cut and paste an arithmetic expression into Google search for evaluation.
I agree, and generally feel like Zoom meetings can be more efficient and productive. But I'll tell you this: March 2020? I was not so good at Zoom tutoring! It took time and a sense of urgency. People had to give it a shot and had to stick with it, since there were no alternatives -- and then many found it to be better. I feel like we're entering "PICK A SIDE" territory with AI tutoring, where you have to label it as good or bad. It's new and it's different. Those of us building these platforms need to be thoughtful about how we build them, and avoid "CHANGING THE WORLD!" proclamations. But let's not kid ourselves: there's real power and potential with AI tutoring. It can help kids.
I like Dan's critiques because if AI tutoring is in its infancy, we need to be honest -- even brutally so -- or it won't get better in the right ways.
I like them too (that's why I subscribe!), and he's called out some overzealous and underbaked AI tutors.
I am currently tutoring Alg 2. One thing that I do that AI can't do is know when to stop a student who is doing work algorithmically and move them to something else. Then, when they have been distracted enough, return to what they were doing originally. My theory is that you have to forget something and then revisit it before you really get it into long-term memory. This seems to work.
This is an interesting comment. I think I know what you mean by "doing work algorithmically," but could you be a little more explicit here? Thanks.
I have found that AI is most useful as a kind of "proofreading" tool. When students have shown understanding of a concept and are now applying what they know, an algorithm can be used to check their work for errors that the student will recognize as errors and be able to self-correct. Algorithms work best when the student already has a decent understanding of what they want to do. For example: when I am practicing my Spanish, I will write my own sentences, then run them through a translation app to see if my use of the language is functional.
I need a "bag of entrails with a smiley-face" emoticon. Classic!
Dan, you raise some great points here. Edu chatbots should hopefully NOT build emotional connections with students - e.g., we don't want this: https://www.theverge.com/24092638/ai-dating-chatbots-romance-replika-tinder-decoder. However, bots can and will ask questions and seek context in the future - you're equating Khanmigo with all educational AI, and it's not the smartest system out there. At Quill, we have a team of teachers who map out custom feedback and dialogue for every single prompt, injecting the specific guidance a teacher would provide about the key concepts being tested in that prompt. Our approach is not scalable by design - we need to map out every single prompt - but it allows us to support a more in-depth conversation. Khan Academy's approach is a quantity-over-quality approach, but it's not the only way to approach this work.
For what it's worth, JSON.parse and JSON.stringify are inverse functions. The first turns a string into a JavaScript object, and the second turns a JavaScript object into a string. One is not an alternative to the other; which you use depends entirely on the direction that the data is flowing. Are you trying to bring data from the outside into your code? If so, parse. Or are you trying to send data inside your code to the outside? If so, stringify.
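To make the direction of data flow concrete, here's a minimal sketch of the round trip (the variable names are my own, purely for illustration):

```javascript
// JSON.stringify: JavaScript object -> JSON string (data flowing OUT of your code)
const payload = { name: "Ada", score: 95 };
const text = JSON.stringify(payload);
// text is now the string '{"name":"Ada","score":95}'

// JSON.parse: JSON string -> JavaScript object (data flowing INTO your code)
const restored = JSON.parse(text);
// restored is a fresh object carrying the same data as payload
console.log(restored.name, restored.score); // "Ada" 95
```

Because the two are inverses, parsing what you stringified (or stringifying what you parsed) recovers the original data, though `JSON.parse` always builds a new object rather than returning the one you started with.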
Love that you hired a tutor to learn about their tutoring and experience it yourself. Do they know? If I were the tutor, I might feel self-conscious or flattered or both haha.
Dan, this is very similar to all the conversations about "AI takes all the jobs", conflating the (discrete) task(s) with the jobs (which require adaptation and coordination - the hard part).
Chatbots are effectively a medium of communication (much like web pages or mobile phones), and they feel powerful and genuine only to the extent that the source does. I was talking to a bunch of students from Grades 9 and 10. For them, what matters most is that the answers, expression, and knowledge feel like they're coming from their friends or teachers. As mentioned in "home tutors know their limits," that exact experience is what's needed; otherwise it feels no different from the general GPT experience.
Maybe if we reversed this thing? AI students who help train human tutors to be better tutors? Could even be gamified so your human tutor trainees get points for using good approaches with their AI students.