It's no surprise to learn that students at all levels are using artificial intelligence tools, specifically large language models such as Copilot, ChatGPT, Gemini and Claude. But the adoption rate is nonetheless staggering: 92%, according to a survey by the Oxford-based Higher Education Policy Institute.
Such a figure will be cold comfort for universities and colleges worried about AI's negative impact. As Anitia Lubbe, Associate Professor at North-West University, writes on The Conversation: "For many university teachers, this raises alarm bells about plagiarism and integrity. While some institutions have rushed to restrict or support AI use, others are still unsure how to respond."
Leaning into AI
How should they respond? What are their options and the best path forward? AI may be a threat, but it also brings opportunities. Future-facing institutions have a responsibility to lead from the front by offering safe and governed AI tools.
Academic institutions have several priorities on the table, says Edward Müller, Senior Solutions Architect at Mint Group: "There are natural concerns, such as plagiarism and students unquestioningly accepting AI output. There are also concerns about hallucinations and other inaccuracies. Then there are questions about AI skills and use cases. Underneath all that, we have operational areas such as digital readiness, data infrastructure, policies, security and how much control an institution really has over an AI."
The case for AI in tertiary learning institutions is straightforward. Unofficial or "shadow" adoption is already widespread, and demand for AI-related skills is growing among both students and faculty. Beyond large language models (LLMs), there are other use cases, such as process and operational automation.
"AI isn't just about helping with studies. AI can support better enrolment, help identify students at risk, provide self-help services, improve grant applications, create secure repositories of institutional knowledge and enable faculty staff to develop and experiment with other AI concepts," Müller explains.
Getting to grips with AI
AI's rapid and rampant adoption is a serious challenge for tertiary institutions. Some would like to stop the use of AI altogether, especially to curb plagiarism. But that expectation is unrealistic: students can access dozens of third-party AI services. It's smarter to embrace and own AI instead. But how would that work?
A primary goal is to give faculty and students access to in-house AI tools they can trust and that are under the institution's control. This sovereignty is important for several reasons: it ensures that sensitive learning materials and intellectual property don't leak into public AI models; the institution can validate and change outputs for greater accuracy; and users have access to curated and governed tools they can trust.
Enterprise AI environments such as Microsoft Copilot are built to deliver this trifecta of capabilities. When designed and implemented to an institution's requirements, they provide safe and controlled access to models and their services, using cloud infrastructure while maintaining a secure fence around the university's data, policies and users.
The first step is to grasp the opportunities and requirements by attending an envisioning workshop. These free online events, hosted by Mint Group, take participants through a high-level exploration of AI's possibilities and requirements. Once decision-makers choose to move forward, the next step is to prepare the college or university's AI foundation, says Müller.
"Tertiary institutions must build out secure solutions they can use, such as Copilot for day-to-day work requirements. This ensures that all your data remains with you and isn't used to train external models. You also need to ensure that your data architecture and the surrounding security is set up correctly. If you use an off-the-shelf solution, it might get access to data that you haven't secured properly. Ensure that your environment is set up correctly before you enable an AI."
Exploring AI opportunities
The AI adoption journey can be simplified into three steps:
- Reviewing and improving data architecture and digital security.
- Creating AI policies and frameworks.
- Providing digital enablement and training for self-sufficiency.
As these stages develop, the institution can start looking at relevant use cases, such as back-office automation agents, student self-help, and identifying struggling students so staff can intervene before their situation becomes dire.
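The at-risk detection use case can be as simple as a rules-based pass over student records before any machine learning is involved. The sketch below is purely illustrative: the field names, thresholds and data are assumptions, not a description of any institution's actual system.

```python
# Hypothetical sketch: flag students whose attendance or grade average
# falls below configurable thresholds, so staff can intervene early.
# Field names ("attendance", "grades") and thresholds are illustrative.

def flag_at_risk(students, min_attendance=0.75, min_average=50.0):
    """Return a mapping of student ID -> list of reasons for concern."""
    flagged = {}
    for s in students:
        reasons = []
        if s["attendance"] < min_attendance:
            reasons.append("low attendance")
        if sum(s["grades"]) / len(s["grades"]) < min_average:
            reasons.append("low grade average")
        if reasons:
            flagged[s["id"]] = reasons
    return flagged

# Illustrative records only
roster = [
    {"id": "S001", "attendance": 0.92, "grades": [70, 65, 80]},
    {"id": "S002", "attendance": 0.60, "grades": [55, 48, 40]},
]
print(flag_at_risk(roster))
```

A production version would draw on governed institutional data and feed flagged cases to human advisers rather than acting automatically.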
Other opportunities include AI teaching assistants and student tutors trained on curated information, implementing or enhancing no-code/low-code services, and building agents and models with tools like Microsoft Foundry. Governed in-house AI also lets tertiary institutions teach students and staff valuable future-ready skills, and this level of control allows universities and colleges to add features such as integrating third-party services to support more local languages.
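The "tutors trained on curated information" idea boils down to grounding answers in an institution-approved repository and refusing anything outside it. The toy sketch below shows that grounding rule with simple keyword matching; the topics, notes and matching logic are all assumptions for illustration, where a real deployment would use an LLM retrieving from the governed repository.

```python
# Hypothetical sketch: a tutor that answers only from a curated,
# institution-approved knowledge base. Topics and notes are invented.

CURATED_NOTES = {
    "photosynthesis": "Photosynthesis converts light energy into chemical energy.",
    "mitosis": "Mitosis is cell division producing two identical daughter cells.",
}

def tutor_answer(question):
    """Answer from curated notes, or decline if the topic isn't covered."""
    q = question.lower()
    for topic, note in CURATED_NOTES.items():
        if topic in q:
            return note
    return "That topic is outside the curated material; please ask a tutor."

print(tutor_answer("Can you explain photosynthesis?"))
```

The point of the design is the refusal branch: a governed tutor never improvises beyond the material the institution has validated.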
Ultimately, the goal is to create an AI environment that the institution can control and steer, reducing the risks of ungoverned shadow AI and enhancing in-house capabilities and educational offerings. Rather than becoming passive AI service consumers, learning institutions can remain in control.
"We want to enable organisations to be self-sufficient and not always have to use contractors," says Müller. "Effectively, train the trainer and ensure that they are up to speed. We also assist large enterprises with setting up AI centres of excellence, an internal mix of non-technical and technical people who can take their AI plans forward."