
For much of its history, the internet has been a library, a static archive of documents waiting to be searched, sifted, and cited. Education adapted around this model: we taught students how to look things up, how to cross-check sources, and how to evaluate reliability. This seemed reasonable, and it mostly worked well.
But the web is changing. With the rise of generative AI, it is becoming dynamic: information no longer needs to exist in advance; it can be conjured on demand, often personalised to the individual. As I have argued before, my version of the internet may not look anything like yours, because the content itself is shaped by the questions I ask and the data it knows about me.
For schools, this is not an abstract shift. It affects how pupils complete homework, how teachers plan lessons, how administrators manage logistics, and how parents support their children at home. The implications are immense, swinging between utopian promise and dystopian concern.
Perhaps the best way to explore them is through four familiar characters, each encountering potential scenarios in which this new dynamic web supports, distorts, or reshapes their everyday experience of school life.
Sam, Year 8 Student – Homework in History
- The Panglossian Dream: Sam has been set a project on the Tudors. He types “Henry VIII wives” into the school’s AI portal. Because it has access to the MIS, the system already knows that he struggles with extended writing and is flagged for mild dyslexia. Instead of presenting walls of text, it creates a neat, narrated timeline with vocabulary scaffolds and a self-marking quiz that adapts to his recent mistakes. For the first time, he produces a piece of history homework that shows real confidence.
- The Black Mirror: The AI hallucinates, presenting a scrambled order of Henry’s marriages with complete authority. Sam copies and pastes it into his homework, which, incidentally, was set by the school’s AI and will later be marked by it. At every step, the AI is in conversation with itself. To Sam, it looks like progress; in reality, it is a hall of mirrors, where learning is simulated and cognitive offloading has displaced the struggle that helps students think hard, understand, and remember.
- The Probable Future: Sam does benefit from AI-generated scaffolds and quizzes: they give him practice at the right level and help him organise his thoughts. But his teacher insists that he cross-check his answers against approved sources and his own handwritten class notes. AI speeds the work, but critical habits and the friction of desirable difficulty ensure that new information is transformed into genuine knowledge and understanding.
Ms. Patel, Secondary Teacher – Preparing a Physics Lesson
- The Panglossian Dream: When Ms. Patel asks the AI for a lesson on energy transfer, it doesn’t just generate a set of slides. It draws on the MIS (assessment results, SEND profiles, attendance records) to build a picture of the class. It notices who missed the last practical, who needs additional scaffolding, and who is ready for stretch and challenge. Out of this, the AI produces a carefully sequenced lesson with activities, explanations, and resources designed to move every student forward. Ms. Patel still reviews and adapts the plan, but most of her energy is now spent on pedagogy and curriculum design rather than the mechanics of formatting PowerPoints or worksheets.
- The Black Mirror: Over time, reliance creeps in. Ms. Patel begins to let the AI do everything, and her professional instincts start to atrophy. The lessons the AI generates are competent but predictable. Subtleties of sequencing and emphasis slip by unnoticed, and when the system misreads the data, she doesn't catch it. Students sense the shift: their teacher appears less the architect of learning and more the technician of delivery, following scripts rather than devising lively lessons.
- The Probable Future: The AI lightens Ms. Patel’s workload and offers valuable prompts, but it cannot see the whole picture. Data only goes so far; it misses the nuance of classroom relationships, the sparks of curiosity, and the lived reality of teaching. That’s why the system remains a capable assistant rather than an all-knowing planner. Ultimately, it is Ms. Patel’s professional judgement that gives coherence and quality to the lesson. The teacher is still the teacher, only now she has modest superpowers.
John, Office Admin – Planning a Trip to Paris
- The Panglossian Dream: John types, ‘Plan a Year 10 trip to Paris for 30 students in June.’ Within moments, an AI agent produces a complete itinerary: Eurostar tickets, hotel bookings with accessible rooms, meal plans adjusted for dietary needs drawn from the MIS, and risk assessments generated automatically. Consent forms are drafted, parental letters personalised, and email reminders scheduled, all without John opening a single spreadsheet or database. Instead of drowning in logistics, he can focus on what matters in his role: clear communication with parents, efficient coordination with staff, and ensuring compliance with school policies and procedures.
- The Black Mirror: Sensitive student data is fed into the AI, including health records, SEN notes, and demographic details. Some of it is inadvertently shared with third-party providers. Weeks later, a family discovers their child has been excluded from a trip activity on the grounds of unspecified ‘behavioural risk factors’ highlighted by automated profiling, with racial bias suspected. What once looked like a triumph of efficiency now feels like AI-assisted discrimination.
- The Probable Future: In practice, AI saves John significant time, producing itineraries and paperwork in minutes rather than hours. Yet every output sits within the guardrails of strict data protection rules. John still double-checks safeguarding details, reviews medical and SEN information carefully, and makes sure parents are clear about how their data is being used. The gains in efficiency are evident, but they only hold value because human oversight remains in control.
Maria, Parent – Supporting Ana with Maths
- The Panglossian Dream: Ana is struggling with simultaneous equations. Her mother, Maria, opens the school’s AI-assisted parent portal and types, ‘Help with homework.’ Within moments, the system generates a step-by-step tutorial pitched exactly to Ana’s recent assessment results, complete with scaffolded practice questions. Alongside it comes a parent-friendly guide, written in plain language so Maria can feel confident in offering support. For once, homework becomes a point of connection rather than conflict.
- The Black Mirror: Maria comes to rely on the portal for every problem, gradually sidelining Ana’s teacher, and begins to interpret gaps in Ana’s progress as evidence of poor teaching. Constructive communication with school deteriorates, replaced instead by a stream of AI-drafted, quasi-legal demands that grow increasingly belligerent. The teaching body, forced to spend hours investigating and responding to spurious complaints, has less time for actual teaching.
- The Probable Future: The AI offers Maria some genuine support. It produces explanations pitched at Ana’s level, with practice tasks that align with recent classroom work. But Maria knows the portal is not infallible. Some explanations are oversimplified, and Ana still needs her teacher to address deeper misconceptions. The real benefit lies in balance: Maria feels better equipped to help at home, while her regular check-ins with the teacher keep home and school pulling in the same direction. For Maria, the technology helps to ease the pressure, but it cannot replace the human dialogue and trust on which Ana’s progress ultimately depends.
Compass Points
These vignettes show the spectrum of futures schools face. The Panglossian Dream is seductive, but the Black Mirror warnings are very real.
The most likely scenario is neither extreme. AI will help, but it will also stumble. It will lighten the load, but never without the risk of misuse. That is why guardrails matter: professional ethics, regulation, data protection, and teacher judgement are the compass points that should guide us.
One pressing question is assessment. If AI can set work, complete it, and even grade it, then the cycle of assessment risks becoming self-referential. Schools must define clearly what counts as evidence of learning, assessing students in ways that compel them to think, speak, and write for themselves, in forms that cannot be outsourced.
Equity, too, is at stake. Access to reliable broadband, up-to-date devices, and even premium versions of AI tools could create a two-tier experience, where advantage accrues to those with greater financial resources and digital fluency. If AI is to serve education well, schools must ensure it narrows rather than widens existing gaps.
And for school leaders, the responsibility is clear. This is not only about mitigating risks but about shaping culture. That means investing in staff development so teachers feel confident as curators of AI outputs, embedding AI literacy in the curriculum, and setting policies that define acceptable use.
In the end, the goal is not to hand learning over to AI, but to sharpen what only humans can offer: discernment, creativity, empathy, and the wisdom to guide one another through a world where knowledge is no longer stored but summoned.
“Before you become too entranced with gorgeous gadgets and mesmerizing video displays, let me remind you that information is not knowledge, knowledge is not wisdom, and wisdom is not foresight.”
— Arthur C. Clarke