Across the world, universities have been in a moral panic about generative AI and its implications for student assessment. It’s true – plagiarism detectors cannot reliably spot AI-generated text. OpenAI tried, but withdrew its own detector after it correctly identified only 26% of AI-written text as AI.
Spotting AI-assisted plagiarism is a game of whack-a-mole that assessors cannot win. Banning or blocking it is pointless if it’s undetectable.
There is the argument that traditional written exams are AI-proof. However, written exams disadvantage students of differing abilities and are not mirrored as a communication method in any other context, least of all in their careers.
Traditional forms of assessment won’t sustain, scale or leverage GenAI. We need to use GenAI as part of learning. We need to create the course bot, debate it, grade it, pair it and improve it.
This is an opportunity to develop students’ critical faculties, to compare and contrast multiple sources and not to simply rely on one textbook. These are the skills that will be called for in their lives as employees, employers, consumers and citizens.
GenAI can help students to ideate, to grow and to think – useful career skills – yet instructivist learning (where the teacher or the book says ‘X’) has dominated education since Socrates in 400 BCE.
GenAI can support constructivist approaches like project-based and hands-on learning where students build their own knowledge, coached by the teacher.
Instead of wrestling with how to stop cheating, we need to consider and explore the educational opportunities offered by AI. Here are just a few.
AI-powered role-playing allows students to interact with fine-tuned or prompt-engineered Large Language Models on specialist content – it is a dynamic learning tool.
AI systems can improve accessibility to classroom content in real time, answering individual questions and offering clarifications mid-lecture.
This technology can assist with content creation rather like a team member – responsive, personalised and accessible support. The key is that these classroom LLMs should be fine-tuned to the task, diluting the generic (free) ChatGPT output that flattens and debases course content.
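To give a flavour of what ‘fine-tuned to the task’ can mean in practice, here is a minimal sketch of a prompt-constrained course bot. It assumes access to an OpenAI-compatible API; the model name, course notes and behaviour rules are illustrative placeholders, not a recommendation.

```python
# Illustrative sketch of a prompt-constrained "course bot".
# Assumes the openai Python package and an OPENAI_API_KEY in the environment;
# the model name and course notes below are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

COURSE_NOTES = """
Week 3: Descriptive statistics - mean, median, variance, and visualising
distributions with histograms and box plots.
"""  # in practice, drawn from the module's own materials

SYSTEM_PROMPT = f"""You are the course bot for a data analytics module.
Answer only from the course notes below. If the answer is not in the notes,
say so and direct the student to the lecturer.
Course notes:
{COURSE_NOTES}"""

def ask_course_bot(question: str) -> str:
    """Send a student question to the course-constrained model."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_course_bot("Can you recap what a box plot shows?"))
```

Even this crude constraint keeps the bot anchored to the module rather than to whatever the open web says, which is the point.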
We let 142 DCU undergrads loose on AI as part of one of their data analytics modules to see how they would use it. They looked for course summaries, information about assignment timetables, recaps of missed lectures and explanations of technical terms. They looked for data visualisations and explanations in their mother tongues. All very useful stuff.
Here’s the classroom AI that I visualise: systems that recast content into media and formats that accommodate visual or hearing impairments, ADHD and autism; content adaptation that simplifies complex text, creates summaries and structures text into bullet points or diagrams.
AI can convert text to speech for visually impaired students – not just word for word, but attuned to the student – or into images, animations or sign language. It can personalise content for different learning styles, offering examples and analogies, or customise the pace of delivery.
It can support interactive learning with real-time explanations and answers to questions.
It can be dyslexia-friendly and adaptable for neurodivergent learners or people with motor impairments.
This is not some utopia – it’s happening already.
The University of Michigan provides full access to several popular LLMs, including GPT-4o, Llama and DALL-E, for all 100,000 students and staff through ‘Maizey’ – hosted in a private cloud, so there is no data sharing with big tech.
There are 2,500 such use cases of generative AI being used in courses at the University of Michigan right now.
How to progress this in Ireland? Individual institutions and researchers could be given access to shared cloud hosting at national level. The question is – who could host and maintain such an infrastructure? It would be a bold and far-sighted national investment.
Large Language Models were not planned or designed; they happened. Their abilities surprise us. Criticising them is disingenuous because they do have some form of inferred knowledge and their own version of memory (parametric memory), which we do not fully understand.
AI literacy is mandatory and some form of job transformation is likely.
GenAI is happening in your workplace and in mine. You can design it out, or design it in. Our schools and colleges are no different – time to design it in.