ChatGPT has a “for teachers” edition, and Carl Hendrick has taken it out for a spin. He has concerns.
There’s a predictable pattern when some edtech companies enter education: they build sophisticated solutions to problems that don’t exist while ignoring the problems real teachers actually face. I am cautiously optimistic about what AI could offer education, particularly around teacher workload, but this is not it. I’m sure the people who developed ChatGPT for teachers are well-intentioned, but what we have here is a step backwards that actively promotes discredited theories and pedagogically bankrupt approaches.
1. “Activity First” Lesson Design
On the explore page, the very first ‘tip’ promises the ability to “find activity in seconds.”

The problem with format-first pedagogy and instructional design is that it fundamentally inverts the relationship between knowledge and pedagogy. Instead of asking “what knowledge structure demands this particular pedagogical approach?”, it asks “what activities can I fit this content into?” It’s a completely superficial approach to learning and instruction.
“Activities-first” design treats curriculum as a collection of discrete events rather than a coherent narrative. It sees curriculum design as forward-engineering rather than backward design. When teachers begin with activity templates — think-pair-share for this part of the lesson, retrieval practice for that one, guided practice here, and so on — they’re assembling a sequence of disconnected episodes rather than constructing a journey through a discipline. As Christine Counsell memorably puts it, “You cannot Rosenshine your way into a curriculum, and nor can you Rosenshine your way into mediating content in subject‑sensitive ways.”1
The deeper problem is that activity-first design reduces knowledge to mere propositions and procedures. It asks, “What do I want students to know or be able to do?” and then selects activities to achieve those outcomes. But much of the time in the humanities, arts, and literature, one is not teaching procedures or isolated propositions. McCarthy and Minsky call these “ill-defined domains” (a problem with a long history in AI, going back to the 1950s), and there one is teaching knowledge that exists in relationships, structures, and flows that cannot be packaged into discrete “learning objectives” without destroying what makes them educationally valuable.
The efficiency promised by ChatGPT to “find activity in seconds” is efficiency in exactly the wrong dimension. Yes, you can quickly generate a lesson structure. But the time you’ve saved comes out of the professional work that distinguishes effective from ineffective instruction: thinking deeply about the nature of the knowledge you’re teaching, how it connects to what came before and what comes next, what disciplinary features demand particular pedagogical approaches, and how to structure encounters with content so that students experience both the intrinsic value of the moment and the gathering momentum of a larger narrative.
Pedagogy should follow epistemology, not the other way round.
He cites two more concerns. Read the full post here.