
Russell Group of leading UK universities set out principles for AI in education

The group of universities has developed principles that focus on how to utilise AI safely rather than ignoring or banning it completely.

By Ryan Morrison

A set of principles for using artificial intelligence has been established by educators working for the Russell Group of leading UK universities. These include rules aimed at creating an “AI literate” body of staff and students to capitalise on the benefits of the technology while mitigating risk. Industry experts have described it as a “wise move” to prepare the future workforce.

AI has proved to be a controversial topic in education, raising questions over authenticity. The Russell Group, which includes Cambridge University, has set out principles for its use (Photo: Aniczkania/Shutterstock)

The general purpose principles include the promotion of the ethical use of generative AI, ensuring equal access to the technology and upholding academic rigour regardless of how the content is produced or used. The principles come as the government launches a consultation on the use of generative AI in education in England.

The Russell Group is a collection of 24 universities including Birmingham, Cambridge, Oxford and Durham. Between them the institutions educate more than half a million undergraduates and 150,000 postgraduates in an average year and produce research that has led to entirely new sectors of the economy including in AI.

Since the launch of ChatGPT by OpenAI in November last year, there has been a race to understand the implications of AI for education, particularly generative AI, which can produce entire essays from a simple prompt. Some institutions initially banned its use or deployed AI content detectors that have been shown to be unreliable, producing high false positive rates and misclassifying text written by non-native English speakers.

With companies like Microsoft and Google deploying generative AI tools throughout products used regularly by students, including Word, Excel and Google Docs, banning has become more difficult. The approach from educators is now shifting towards exploring how to utilise the technology without undermining teaching or academic principles.

Dr Tim Bradshaw, chief executive of the Russell Group, said AI is already changing the way people work, so it is crucial that students are able to use the technology before they enter the workforce. He said educators and other university staff also need to understand how AI can be used to enhance teaching and bring subjects to life.

“This is a rapidly developing field, and the risks and opportunities of these technologies are changing constantly,” Bradshaw explained. “It’s in everyone’s interests that AI choices in education are taken on the basis of clearly understood values.” He said the principles underline a commitment to capitalise on the transformative opportunities of AI in a way that “protects the integrity of high quality education.”


Preparing the future workforce

The five principles are:

  • support students and staff to become AI-literate;
  • equip staff to support students in using generative AI tools;
  • adapt teaching and assessment to incorporate the ethical use of generative AI and ensure equal access;
  • ensure academic rigour and integrity is upheld;
  • work collaboratively to share best practice as the technology evolves.

Professor Michael Grove, deputy pro-vice chancellor at the University of Birmingham, said the rapid pace of development in generative AI will require universities to continually review and re-evaluate assessment practices, but that this should be seen as an opportunity. “We have an opportunity to rethink the role of assessment and how it can be used to enhance student learning and in helping students appraise their own educational gain.”

Jaeger Glucina, chief of staff at AI services company Luminance, said rolling out AI across multiple sectors without an adequately prepared workforce could “stifle innovation and limit our ability to leverage the technology to its potential.” She said that if the UK wants to realise the dream of being a leader in the AI revolution it needs to plug the AI skills gap.

“As home to some of the best universities and tech start-ups in the world, the UK has an opportunity to take a leading role when it comes to encouraging digital inclusion,” Glucina explains. “Universities, schools, and the government must work together to close the AI skills gap with training, reskilling, and early AI education to prepare future generations to take on the challenges and opportunities of tomorrow.”

Sheila Flavell CBE, COO of technology recruitment company FDM Group, welcomed the move as a way of solving the future recruitment crisis. “With businesses crying out for new hires equipped with the latest tech skills and analytics capabilities, providing students with a fully rounded education and qualifications in this area is critical for building a dynamic workforce, fit for the future ahead,” she says.

Read more: Japan targets light touch AI regulation
