Don’t let AI in schools become the next e-tolls debacle

AI surveillance in education promises efficiency, but like e-tolls, it risks importing foreign systems that fail our people, values and lived realities.
By Rennie Naidoo, Professor in Information Systems and Research Director at the Wits School of Business Sciences
Johannesburg, 13 Aug 2025

In a dusty classroom in the Eastern Cape, a teacher with 50 learners and no assistant juggles maths, discipline and hope. It’s no wonder artificial intelligence (AI) sounds like salvation.

In South Africa’s ongoing battle to improve education quality, especially in STEM, it’s tempting to look toward technological solutions. Our classrooms are overcrowded, under-resourced and uneven in quality. AI, with its promise of smart and personalised learning, seems like a modern-day lifeline.

But before we welcome AI surveillance into our schools, we must ask: What kind of education are we designing − and at what cost?

Let’s learn from China’s AI-equipped classrooms, where facial recognition cameras, emotion-reading algorithms and brainwave-monitoring headbands have turned students into data points.

Despite being wrapped in the rhetoric of progress and efficiency, these technologies operate less like tutors and more like wardens. And the results are as dystopian as they sound. If we’re not careful, South Africa could sleepwalk into a similar paradigm, especially as our growing tech partnerships often come with a quiet surrender of policy autonomy to powerful global allies.

Just look at how the e-tolls saga unfolded: imported solutions, limited consultation and a public left footing the bill for decisions they never truly owned. And now, as our politicians maintain an increasingly cosy digital relationship with China, the worry is that we won’t just adopt their technology − we’ll absorb their values too: ones that prioritise control over consent.

This is not a debate about whether AI can assist learning. Yes, it can. It’s about the use of AI to control learning, to surveil, discipline and normalise children through silent, ever-watching algorithms. It’s a promise of better marks that undermines the very soul of education: curiosity, autonomy, dignity and the slow, human work of deliberate practice.

The classroom as a panopticon

French philosopher Michel Foucault warned that modern institutions such as schools and prisons use surveillance to shape individuals into “docile bodies”: compliant, conforming and self-policing.

Some classrooms in China have taken this literally. Students sit beneath AI cameras that scan their faces every 30 seconds. Algorithms judge their expressions: bored, happy, confused? An AI scoreboard tallies their attention. Deviate from the norm, and a warning is issued, sometimes to the teacher, sometimes to parents.

The result? Students learn to fake smiles, suppress emotion and feign engagement. They become actors in a play written by code. The spontaneous giggle, the curious side question and the thoughtful daydream − hallmarks of real learning − are lost in the noise of digital conformity.

Advocates of classroom AI point to its benefits: faster attendance tracking, data to support struggling learners and real-time feedback. All helpful if the intent is to make classrooms more efficient.

But education is not a production line. It’s not about producing obedient, test-ready graduates. It’s about nurturing independent thinkers who can ask hard questions − about science, society, technology and themselves.

Here, Foucault’s idea of “normalising judgement” comes into play. Technology begins to define what is acceptable, not based on pedagogy, but on what is easiest to quantify. AI surveillance risks turning our classrooms into compliance factories.

Imagine an AI misinterpreting a shy learner’s gaze as disengagement or labelling a neurodiverse student as inattentive. These systems encode narrow models of “normal” behaviour, models that may not reflect South Africa’s rich diversity of learners.

If China’s model of AI in education evokes Orwell’s 1984 − a society watched into submission − then the American variant brings to mind Aldous Huxley’s Brave New World, where learners aren’t surveilled into fear, but nudged into passive distraction.

In the US, classroom tech often arrives in friendlier packaging: gamified apps, digital learning environments and AI tutors. But beneath the glossy UX lies a quieter risk, not of being watched too much, but of thinking too little. Of outsourcing curiosity to content recommendation engines. Of training children not to comply, but to consume.

One is a model of discipline through data, while the other is a model of distraction through dopamine. Neither prioritises the kind of critical human development South Africa so urgently needs. We must be cautious not to import either extreme.

Our children deserve more than a choice between control and commodification. They deserve education that nurtures self-belief and curiosity, not through coercion or convenience, but through care.

Education needs trust, not tracking

Education thrives on trust between the teacher and student, the parent and the school, and between the learner and their own sense of purpose and possibility. AI surveillance corrodes that trust.

When every glance is recorded, every yawn flagged, learners begin to perform for the system, not for their own growth. Teachers, too, may self-censor, teaching to what the algorithm rewards rather than what sparks inquiry.

In a country still healing from generations of authoritarian control, importing systems that automate obedience feels dangerously tone-deaf. Moreover, constant monitoring can erode mental health. Some Chinese students have reported stress, paranoia and a sense of being “boxed in”. Must we add this burden to learners, who already contend with trauma, poverty and structural inequality?

Even in China, resistance has emerged. Students have unplugged cameras during exams. Parents pushed back against brainwave-monitoring headbands. Teachers quietly ignored AI alerts. These acts echo a vital truth: education is largely a human process. It cannot be fully automated. Nor should it be.

As Brazilian educator and philosopher Paulo Freire, a pioneer of critical pedagogy, warned, education either functions as an instrument of conformity or as the practice of freedom.

China’s AI-heavy classrooms represent the former, where data governs behaviour and dissent disappears under algorithmic discipline. In contrast, the US model seduces learners into passivity through gamified apps and dopamine-driven edtech. In both cases, the learner is no longer a thinker, just a user or a subject. Freire would remind us that real education is not about managing attention but awakening it.

Freire also argued that, in oppressive systems, even those who were once oppressed can internalise the logic of domination. When teachers begin to see AI surveillance not as a threat, but as a tool for efficiency and discipline, they risk reproducing the very values that once silenced them.

The teacher, once a guide for critical thought, becomes a technician of behavioural compliance. The school, once imagined as a space of liberation, becomes a testing ground for algorithmic authority.

In adopting the tools of control, education ceases to be an act of freedom and becomes an instrument of quiet submission.

The path to human-centred AI

We can choose how we build systems. We can choose to reduce harm rather than normalise it. South Africa can use AI ethically, responsibly and humanely.

As we navigate the promises and perils of AI in education, we must stay clear-eyed about power, ethics and intent. In an age of smart technology, the wisest move may be to remain sceptical of systems that promise control but resist accountability.

If we are to build a just education system, we must resist the temptation to outsource decisions to those driven by profit, prestige or political survival instead of listening to the real needs of learners and educators on the ground.

The goal of education in South Africa should be to unlock potential, not surveil it into silence.

Let’s invest in reliable infrastructure, teacher training and context-sensitive curricula. Let’s adopt tools that genuinely assist teachers: adaptive platforms, early-warning analytics and AI that expands access, not restricts behaviour.

Social media is rapidly eroding our ability to focus and reflect. Its instant feedback and dopamine loops discourage the discomfort essential to real learning. Platforms built to maximise engagement teach young minds to chase likes, not understanding, to skim, not grapple. If schools now adopt a similar logic of tracking, rewarding and nudging, deliberate practice won’t just be neglected. It’ll be unrecognisable.

Progress isn’t just about what technology can do − it’s about what we choose to do with it. The question is not whether AI belongs in education. It’s whether our children belong in systems designed to watch rather than teach. That’s a debate worth having.

Let’s remember: children don’t learn through monitoring. They learn through struggle, reflection and deliberate practice − the kind of learning no algorithm can automate and no digital obedience can replace.

Be sceptical of both the Chinese model that disciplines through data and the US model that distracts through dopamine. Both disempower learners by denying them the agency and critical thought that are the essence of education.

South Africa must heed this warning. We should empower teachers, not replace their judgement with code. We should uplift learners, not condition them to mask their true selves. And we should never confuse surveillance with care. Let us not forget that just because we can watch a child’s every move doesn’t mean we should.

A child is not a data point. And education should never feel like a prison.

Use human-centred AI where it helps: to personalise learning, support teachers and catch learners before they fall through the cracks. But when it comes to AI surveillance − face scanning, emotion detection, behaviour scoring − and addictive edtech that manipulates young minds, let’s draw the line.

Let’s say: “Not here.”
