As I’ve written before (A REALLY BAD THING ABOUT AI IS THAT IT FORCES US TEACHERS TO SPEND A TON OF TIME RETHINKING OUR LESSONS. I GUESS THAT MIGHT BE A GOOD THING ABOUT IT, TOO), though I think Artificial Intelligence can be a huge asset to language learning (see The Best Posts About Using Artificial Intelligence With ELLs), I certainly don’t think it’s going to revolutionize education (no matter what Sal Khan says; also see Just Received Temporary Free Access To Khan Academy’s “Khanmigo” – Here’s What I Think).

As I’ve said before, there are some ways it can help, particularly in saving teachers’ time (see CHATGPT IS A STUDENT LETTER OF RECOMMENDATION MIRACLE!).

I also think it has the potential to make our work in the classroom a whole lot worse if we don’t adapt to it.  That’s why I’m spending so much time experimenting with, and writing about, it.   In that scenario, we teach the same ways and the same lessons we’ve always taught, and AI either does most of the critical thinking for our students or “policing” students’ work to keep it AI-free becomes a major component of what we do. Apart from using it with ELLs, everything else I’m doing comes from this “defensive” stance.

Other educators have written better analyses, sharper critiques, and probably more accurate predictions about the future of AI in the classroom than I have, most notably these three pieces:

AI Chatbots Will Help Students Learn Nothing Faster Than Ever by Dan Meyer.

AI Does Not Fit the Shape of Schooling by Dan Meyer.

Chatbots (mostly) won’t change education by Michael Pershan.

What I’d like to do here, though, is offer a narrower critique of AI in education, focusing on student motivation, which I know a little about. My criticism here is similar to one I’ve made about what I consider to be rather Pollyannaish perspectives that generally come from people who are not working inside a K-12 classroom (see It Doesn’t Matter If It’s “Effective” If Students Won’t Do It and The “Best Learning Techniques” Are Useless If Students Won’t Do Them — A Critical Take On A Well Done Study).

The reality, to borrow from some of my posts’ headlines, is that all the bells and whistles of AI are going to be useless if students don’t feel like engaging with it.

If students did what we wanted them to do, AI would be great!  Though, if students did what we wanted them to do, there wouldn’t be any need to even think about using AI in the classroom.

Some students do what we want them to without much encouragement.  These motivated young people learn a lot from us teachers now, and they’ll learn a lot from us teachers and from Artificial Intelligence.  As a colleague of mine once said, these students tend to be “teacher-proof” – they’re going to learn a lot no matter what happens in class, often, though not always, because of their fairly affluent, safe and well-schooled home environment.

But many other students need us to work hard at creating the classroom conditions where intrinsic motivation to learn can flourish.  A twenty-year-old study from the National Research Council pegged that number at forty percent of high school students. I doubt that many teachers would argue with my assertion that the percentage is even higher as a result of the pandemic.

Researchers have identified four elements that are necessary for that to happen.  I’d like to quickly review my thoughts about how good a job Artificial Intelligence can do in supporting those four features.

Relatedness means that doing the activity helps students feel more connected to others, and feel cared about by people whom they respect.  Obviously, AI can’t make this happen at all. Of course, teachers could have students work with partners when doing some AI-related work, but that can be good practice in any classroom activity. AI brings no added value in furthering relatedness and, if used in some of the ways AI advocates are talking about, could, in effect, reduce opportunities for human connection in the classroom.

Competence is feeling that you have the ability to be successful at what is being asked of you.  For reinforcement of basic skills that have been taught in the classroom (grammar, vocabulary, simple math), I think AI’s adaptive learning abilities can help (see THE BEST FREE ONLINE TOOLS USING ADAPTIVE LEARNING).  I do think, though, that using mini-whiteboards in the classroom can, in many ways, do the same thing (see SOME “BEST” IDEAS FOR USING MINI-WHITEBOARDS IN THE CLASSROOM), but adaptive learning tools can provide a “change-of-pace” for students and be easy warm-up activities, as they have been for the past few years.

Relevance means the work must be seen by students as interesting and valuable to them, and useful to their present lives and/or hopes and dreams for the future.  I guess it could be possible, in an algorithmic kind of way (like Amazon or Facebook claim, often inaccurately, to know what I want), for AI to haphazardly identify points of relevance if a student used it enough.  But, just as those algorithms, even when they are correct, only identify “surface” relevance, I suspect AI would do the same.  It’s going to be a steep hill for AI to climb to do what we teachers do: through countless individual conversations, reading how students feel, and noticing body language, we learn what our students’ hopes and dreams for the future are, and then we try to tailor what we do in the classroom, as much as possible, to help them feel that what we’re doing will help them get there.

However, “relevance” can also be used to describe activities that are simply interesting to students.  So, I do have to give it to AI that in some cases it could provide a somewhat more “interesting learning experience,” like talking with a Frederick Douglass chatbot instead of reading his biography.

Autonomy is having a degree of control over what needs to happen and how it can be done. The Frederick Douglass chatbot could also be seen as an example of supporting student autonomy. The trade-off, though, could be that the student misses some key points about Douglass’ life because they didn’t ask the right questions.  Personally, I think having students work with partners to read a biographical essay, giving them the option of various ways to demonstrate their learning, and then having them teach it via jigsaw (THE BEST RESOURCES FOR LEARNING ABOUT THE JIGSAW INSTRUCTIONAL STRATEGY) is a better way to support student autonomy.

AI could let students read about whatever topics they wanted, though they can do that through Google Search now.  AI would also allow ELLs to make whatever content they wanted to read more accessible by specifying the English proficiency level it’s written at and making it bilingual.  In general, though, it seems to me that sites like Epic! have more than enough choices for any student to read about whatever they want, and do it in a much more pleasing format.

 

I may be very wrong here and, if so, it won’t be the first or last time that’s the case.

I just see AI as another tool that, apart from language learning, has the potential, if we adapt, to be at best marginally helpful, in a few cases, at creating classroom conditions that support student intrinsic motivation.

And, without student intrinsic motivation, AI would often just be one more task we’re pushing students to do.

I don’t know about you, but I, and my students, don’t need any more of those on our plates.

ADDENDUM:
