Common Responses from Schools and AI Supporters

When the use of AI in schools comes up, both AI supporters and the schools themselves often respond with the same talking points.

AI companies that sell to schools know that parents are wary of AI, and have created pages dedicated to teaching school administrators how to convince parents that AI should be used in the classroom.

Here we list those common points and why they are wrong.


"AI is the future, students must learn how to use it"

This is extremely easy to debunk. One of the major selling points of AI is that it requires no training to use, so why would students need to spend time learning it?

Schools also often talk about the importance of "AI Literacy", claiming that AI is just another tool that children should learn how to use.

This framing is completely incorrect.

Just using AI has extremely negative impacts on learning [1], critical thinking [2], retention, and creativity.


"AI can answer anything, it's like a personal tutor"

The one-on-one interaction with chatbots like SchoolAI is extremely isolating. It hijacks the personal connections that humans naturally build through conversation and turns them into reliance on a for-profit tool.

It promotes a hyper-individualism where children are no longer learning as part of a class community.

That isolation and hijacking of personal connection has driven adult users into states of psychosis [3].

"AI keeps kids engaged"

AI companies are quick to point out how engaging their tools are. They emphasize engagement because the measures that actually matter are against them: studies have shown that tools like ChatGPT are terrible for learning and retention.

The snappy, interactive style of chatbot conversation decreases attention spans and comprehension.


"The AI has safety measures"

SchoolAI and the other AI companies that market to schools are aware that chatbots have encouraged children to commit suicide [4] and have talked to young children about sexual topics [5].

They reassure parents that their AI tools are safer, even as those tools simulate victims of the Holocaust [6].

As with the other talking points, the issue is not only whether the tool is safe, but whether it is helping children learn.


"It enhances creativity"

Image and text generation is amazing for giving people the illusion of creativity without any of the effort. Users of AI consistently overrate their own performance [7], despite being slower [8].

Using it in schools teaches children that practicing to get better at something like drawing or writing is pointless, and that they should instead have a tool do it for them.

References

  1. Is AI Enhancing Education or Replacing It? (Chronicle of Higher Education) 

  2. University students offload critical thinking, other hard work to AI (Hechinger Report)

  3. AI Psychosis is a Medical Mystery 

  4. Google-Backed AI Startup Tested Dangerous Chatbots on Children, Lawsuit Alleges (Futurism) 

  5. Meta's 'Digital Companions' Will Talk Sex With Users—Even Children (Wall Street Journal) 

  6. Schools Using AI Emulation of Anne Frank That Urges Kids Not to Blame Anyone for Holocaust (Futurism) 

  7. AI makes you smarter but none the wiser: The disconnect between performance and metacognition (ScienceDirect)

  8. Measuring the Impact of Early-2025 AI on Experienced Open-Source Developer Productivity (METR)