‘If we want to reduce inequality in society, it’s so important for educational sociologists and data scientists to work together, right now.’
Professor Delma Byrne on the new power structure in Irish education: AI
Education systems can create opportunities for people and improve their life chances. They can also narrow options and offer only a limited view of the pathways available.
An equitable, ethical education system contributes to the flattening of class structures. It recognises and removes barriers that shrink opportunities for certain groups based on income or class, race, physical ability, sex or gender, and their intersections.
It is the job of the sociologist to examine the forces that shape and drive education in particular directions, forces which are ultimately unleashed on pupils, students and those working in education. Understanding these forces is essential to building more ethical and equitable systems. The sociologist is trained to discover, document and explain the ways that an education system might reproduce inequality, rather than challenge it.
Delma Byrne, associate professor at the Maynooth University Departments of Sociology and Education, has studied the educational structures at play in Ireland for many years and is now turning her lens on a system that will have as much shaping power on Irish education as church and state have had in the past: AI.
‘AI technology is developing quickly and is already being used in a variety of ways, at multiple levels, in the education system in Ireland, from teachers using it for lesson planning to policymakers using it for resource allocation,’ says Professor Byrne, who recently became a member of the Insight Research Ireland Centre for Data Analytics.
‘We have to train a critical lens on AI in education. We know that it can reinforce processes of inequality and social stratification. We also know that it can, in some instances, enhance and support learning. It’s very important that we examine all the ways that AI and education intersect and ensure that the technology is working for, not against, the student and the education system.’
Research of this kind demands a transdisciplinary approach, which is why Professor Byrne has joined Insight, with its 450 data researchers across a very broad range of disciplines from computer science and engineering to social science and humanities.
‘We have a lot of work to do to examine all the ways that AI is shaping the experience of students and learners,’ she says. ‘The educational technology companies are moving fast and will increasingly govern what and how people learn across all types of formal and informal educational settings, not just schools and colleges. Meanwhile we are seeing more and more government policy publications that broadly encourage the adoption of AI and digitisation in education.’
Professor Byrne describes AI’s eventual role in education as a sort of ‘algorithmic governance’ – one that will have as much influence on how and what we learn in the future as religious bodies had on the education system in the past. In real time, education is being reframed in the context of AI. We should not let it happen passively, she argues.
‘There is a role for the ed tech companies but there is also a role for researchers to assess the gains and losses of the tech they are bringing to classrooms. For example, if a teacher uses image generation software for curriculum content, is the software producing imagery that reinforces stereotypes? Teachers are the gatekeepers here and we have to support them in that role with critical appraisal of the tech and how it performs.’
At the structural level, AI is increasingly being used worldwide to support government decision-making on education resourcing. We know that when AI is trained on incomplete data sets, it makes biased decisions.
‘I worked for some years at the Economic and Social Research Institute, looking at educational policy,’ says Professor Byrne. ‘I’ve always been interested in asking – who is in the data? With the rapidly evolving AI landscape, are we currently relying on general data sets to make decisions that impact on specific groups with specific needs? This is why I am drawn to working with a data research centre like Insight where I will have access to people with knowledge of the datasets that are used to train Large Language Models.’
The Insight Research Ireland Centre framework is perfect for this sort of enquiry, Byrne believes. It would be difficult for her to assemble the right team of researchers by any other means.
‘What we need is a team of researchers that unites computer scientists, STEM experts, psychologists, sociologists, political scientists and those working in the humanities. The Insight funding structure allows for that model. The Centre funds and supports hundreds of projects in AI and data science innovation, while fostering a commitment to examine the ethical impact of AI and to interrogate assumptions about its value in society. The Centre also supports research on equity and public engagement principles, widening the lens, and bringing all types of people into developing the knowledge base.’
‘We already know that there is a digital divide in our society. Some young people have much greater access to tech and the skills to benefit from it. Ed tech companies will keep putting the AI out there and the government has a general policy to support that. It’s up to educators to pull all that together and use it in a way that doesn’t disadvantage groups who are already challenged by multiple factors. There’s a huge onus on teachers and educators to be aware of the constraints and affordances of AI in the classroom. This type of critical research is required so that educators, professional bodies and policymakers can make the right decisions for students, learners and for society. It’s also important so that users such as children, young people and learners more broadly can evaluate the pedagogical aspects of AI.
‘That’s why it’s so important for educational sociologists and data scientists to work together, right now.’
In interview with Louise Holden