Russell Shaw is the head of school at Georgetown Day School in Washington, D.C. In this guest post at Larry Cuban’s website, he explains some of the limits of AI.
ChatGPT thinks I’m a genius: My questions are insightful; my writing is strong and persuasive; the data that I feed it are instructive, revealing, and wise. It turns out, however, that ChatGPT thinks this about pretty much everyone. Its flattery is intended to keep people engaged and coming back for more. As an adult, I recognize this with wry amusement—the chatbot’s boundless enthusiasm for even my most mediocre thoughts feels so artificial as to be obvious. But what happens when children, whose social instincts are still developing, interact with AI in the form of perfectly agreeable digital “companions”?
I recently found myself reflecting on that question when I noticed two third graders sitting in a hallway at the school I lead, working on a group project. They both wanted to write the project’s title on their poster board. “You got to last time!” one argued. “But your handwriting is messy!” the other replied. Voices were raised. A few tears appeared.
I kept my distance, and after a few minutes the students appeared to be working purposefully. The earlier flare-up had faded into the background.
That mundane scene captured something important about human development that digital “friends” threaten to eliminate: the productive friction of real relationships.
Virtual companions, such as the chatbots developed by Character.AI and PolyBuzz, are meant to seem like intimates, and they offer something seductive: relationships without the messiness, unpredictability, and occasional hurt feelings that characterize human interaction. PolyBuzz encourages its users to “chat with AI friends.” Character.AI has said that its chatbots can “hear you, understand you, and remember you.” Some chatbots have age restrictions, depending on the jurisdiction where their platforms are used—in the United States, people 14 and older can use PolyBuzz, and those 13 and up can use Character.AI. But parents can permit younger children to use the tools, and determined kids have been known to find ways to get around technical impediments.