Aslam Fataar
Artificial Intelligence (AI) has entered our lecture halls, grading systems, and research labs with astonishing speed. It now drafts essays, summarises readings, and generates lesson plans in seconds. Yet beneath this fluency lies a troubling question: whose intelligence, whose language, whose world does AI serve?
AI arrives in societies already marked by inequality. In Africa and across the Global South, access to reliable internet, computing power, and digital literacy is profoundly uneven. The danger is that AI will not only mirror these inequalities but will magnify them. If we are not vigilant, the algorithm will become the new frontier of exclusion.
AI and the Deepening of Educational Inequality
AI tools promise efficiency, but they often amplify structural divides. Students in well-resourced institutions gain instant access to personalised tutoring and data-driven feedback. At the same time, those in underfunded schools and universities are left with broken connectivity and outdated devices. What appears as a “technological revolution” for some becomes a new layer of deprivation for others.
The same pattern operates globally. The data that feed generative AI are drawn mainly from the Global North. English dominates its language models. Western logics shape its definitions of intelligence and success. Thus, AI extends the epistemic hierarchies of colonialism, making Western ways of knowing appear universal while marginalising local and indigenous traditions.
Education cannot accept this as inevitable. The task is to use AI both critically, by exposing how inequality is embedded in digital systems, and productively, by applying those same technologies towards justice, creativity, and a deeper understanding of humanity.
Algorithmic Coloniality and the Politics of Knowledge
This dynamic is called algorithmic coloniality: the digital inheritance of power embedded in the architecture of AI. Algorithms are not neutral. They are historical artefacts built from data that privilege certain languages, histories, and markets. When African names are misspelt, indigenous ideas mistranslated, or local histories omitted, AI reproduces the epistemic erasures of empire.
Decolonising the algorithm requires moving from critique to creation. Africa can shape its own data futures by digitising indigenous languages, developing local corpora, and curating open epistemic commons that represent diverse knowledge systems. Building such infrastructures is not a luxury; it is a moral and intellectual necessity if education is to remain a vehicle for justice rather than replication of inequality.
Teaching in Unequal AI Ecologies
Students today inhabit AI-mediated ecologies consisting of networks of peers, devices, and data that extend beyond the classroom. But not all students stand on equal ground within these ecologies. Those with access to premium platforms, high-speed internet, and digital mentors can translate machine output into a learning advantage. Those without are left navigating opaque systems that speak a language not their own.
On this challenging terrain, teaching should be viewed as a process of design and inclusion. Educators need to facilitate learning as a conversation between human insight and machine inputs, helping students build critical thinking and creativity. The aim isn’t to compete with AI, but to teach students to think alongside it, understand how algorithms influence their perceptions, and regain control within those systems.
Such pedagogy must also restore the slow work of reflection. In a world of instant outputs, teaching must invite students to pause, deliberate, and discern, to see that meaning arises not from automation but from encounter, struggle, and ethical choice.
Rethinking Assessment and Authorship
When AI can generate perfect text in seconds, assessment based on polished products rewards superficial access rather than effort. The risk is a widening gulf between those who can afford sophisticated tools and those who cannot.
A more equitable approach emphasises the pedagogical process: drafts, reflections, and feedback loops that place the emphasis on a student’s intellectual development. Assessment shifts from simply having answers to cultivating genuine understanding. It values curiosity, persistence, and ethical integrity, qualities that algorithms cannot replicate. In this context, assessment becomes an integral part of learning design and the cultivation of critical epistemic skills.
We must also teach students the ethics of what is called post-authorship, the reality of humans writing with generative AI. In this new ecology of co-creation, authorship becomes a shared and negotiated act. Students must learn to acknowledge assistance, question algorithmic outputs, and exercise moral judgment in deciding when and how to use AI. This literacy of accountability transforms writing into an ethical practice of discernment, ensuring that AI serves as a partner in thought rather than a substitute for it, and that its use strengthens rather than weakens academic integrity.
Institutional Responsibility and the Courage to Reimagine
Pedagogical reform cannot occur in isolation. Universities must act with institutional courage to confront the inequalities that AI exposes. Too many have responded to generative tools with bans or surveillance, missing the opportunity to redesign learning ethically.
Institutions need infrastructures that enable experimentation across disciplines, encourage open dialogue about bias and access, and invest in digital equity, ensuring that all students can participate meaningfully in AI-mediated learning. Courage also means resisting the metric-driven logic of neoliberal educational governance, which treats learning as data rather than transformation.
To teach for justice in the age of AI is to reaffirm the university’s public purpose: to cultivate citizens capable of discernment, empathy, and ethical imagination.
Education as an Act of Care
At its core, education is fundamentally an act of caring about truth, others, and the preservation of knowledge through generations. While machines can perform calculations and make predictions, they lack the capacity to care. They may imitate empathy, but they cannot genuinely experience responsibility.
The educator’s vocation is to inspire this care in students: to cultivate their ability to listen, to act with humility, and to envision fairer futures. Such care is the antidote to algorithmic inequality, grounded in the conviction that human worth cannot be measured.
If universities can design for justice, curiosity, and care, AI will not diminish their mission; it will deepen it. But this requires teaching not only how to use AI, but why and for whom. It means holding technology accountable to the values it too easily conceals.
Our goal is not to surpass the machine in competition but to surpass it in care, reflexivity, and ethical thinking, viewing these as expressions of intellectual freedom. In this way, we reaffirm that education, even in an age dominated by algorithms, continues to be a fundamentally human pursuit grounded in justice, solidarity, and the collective effort of shared knowledge.
(Professor Aslam Fataar is a Research and Development Professor in Higher Education Transformation at Stellenbosch University. This article is based on his keynote address, delivered at the University of Zululand’s 9th Teaching and Learning Conference, 23 October 2025.)
First published by UWN online: https://www.universityworldnews.com/post.php?story=20251029072506658