Former teacher and school administrator Jan Greenhawk suggests that there’s a presence in some classrooms scarier than any teacher. Her op-ed ran in the Easton Gazette in Maryland.
While parents have been worrying about what their child’s teacher is teaching, there may be another presence in the classroom they should be more concerned about.
That presence is a new generation of artificial intelligence products being used in classrooms to complete tasks such as diagnostic testing, content drills, and data collection on students. They are also being used to teach.
One such product, i-Ready by Curriculum Associates, is being used throughout the United States and, most important, in Maryland.
The program is promoted as an easy way to assess students, identify student weaknesses and progress, and instruct students, all without a teacher involved. The company adds, “And it’s fun!”
There are some glaring problems with AI programs in schools.
Alex Molnar, director of the National Education Policy Center (NEPC) at the University of Colorado Boulder, recently wrote an article calling for an “indefinite pause” in implementing these programs in our nation’s classrooms. Co-authors included Ben Williamson of the University of Edinburgh in the United Kingdom and Faith Boninger, assistant research professor of education at CU Boulder.
First, Molnar notes that these programs use opaque and usually proprietary algorithms, making their inner workings mysterious to educators, parents, and students alike. The companies claim to be protecting their investment, but they may also be concealing harmful changes in how these programs work on the minds of our children.
Second, there is concern over the data collection that happens when a child is connected to one of these programs and responding to carefully designed questions. With little to no control over, or knowledge of, the algorithms, it’s hard to limit what information the programs will elicit from children or how that information will be used.
Protection of that student data is another major concern because of the strong possibility of data leaks. Such leaks would come from third-party vendors who ultimately will not be held accountable, since no current laws address the issue in these cases. Many of the programs are still in the beta-testing phase, amplifying the danger even more.
Then there’s the problem of who decides and creates the curriculum content written into the AI platform. Since programmers are not teachers and teachers are not programmers, the content in these programs may not be pedagogically or developmentally appropriate, or even effective, for kids. Who will review the content?
With limited access and testing, even the school systems that use this technology may not be able to gauge how effectively the programs help students learn. This is exacerbated if the AI company is both the teacher and the assessor, creating a huge conflict of interest and making the data unreliable. When millions of dollars are on the line, algorithms can be altered to make results look more favorable to the AI program and to project success that is not real.