AI Used in Schools Should ‘Detect Signs of Learner Distress’

Key Highlights

  • The DfE has updated its AI safety guidance for schools to include new standards on detecting signs of learner distress.
  • AI products in schools should flag concerning behavior and direct learners to human support when necessary.
  • Strict new guidelines also cover emotional, social, and cognitive development, with a focus on protecting against manipulation.
  • The government emphasizes that AI must not replace vital human interactions, especially for younger pupils and those with special educational needs.

New Guidelines for AI in Schools

The Department for Education (DfE) has recently updated its guidelines on the use of artificial intelligence (AI) in schools to address emerging concerns about learner distress. The new standards, announced by Education Secretary Bridget Phillipson during the Global AI Safety Summit in London, aim to ensure that AI tools are used responsibly and ethically.

Detecting Signs of Distress

The updated guidance emphasizes that AI products should be designed to detect signs of learner distress. Specifically, these products must identify references to suicide, depression, or self-harm, as well as spikes in night-time usage, negative emotional cues, and patterns indicative of a crisis. If such distress is detected, the AI should follow an appropriate pathway, including signposting to support services and raising a safeguarding flag to school authorities.
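The pathway described above (detect, signpost, flag) can be pictured as a simple rule-based check. The sketch below is purely illustrative; the keyword list, night-time window, and function names are assumptions, not part of the DfE's specification.

```python
from dataclasses import dataclass
from datetime import time

# Hypothetical keyword list and night-time window -- illustrative only,
# not drawn from the guidance itself.
DISTRESS_KEYWORDS = {"suicide", "self-harm", "depressed", "hopeless"}
NIGHT_START, NIGHT_END = time(22, 0), time(6, 0)


@dataclass
class Assessment:
    signpost_support: bool   # direct the learner to human support services
    safeguarding_flag: bool  # alert school safeguarding leads


def is_night_time(t: time) -> bool:
    """True if a session timestamp falls in the 22:00-06:00 window."""
    return t >= NIGHT_START or t < NIGHT_END


def assess_message(text: str, session_time: time) -> Assessment:
    """Apply two of the example checks named in the guidance:
    explicit references to distress, and night-time usage."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    explicit_distress = bool(words & DISTRESS_KEYWORDS)
    return Assessment(
        signpost_support=explicit_distress or is_night_time(session_time),
        safeguarding_flag=explicit_distress,
    )
```

A real product would use far richer signals (usage patterns over time, emotional cues), but the shape is the same: classify, then route to human support rather than handling the crisis itself.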

Protecting Against Manipulation

Beyond detecting distress, the new standards also address the risk of manipulation. AI products must not use manipulative or persuasive strategies such as flattering language, stimulating negative emotions for motivational purposes, or steering users towards prolonged screen time. This is particularly important to avoid fostering unhealthy levels of trust and disclosure in young learners.

Supporting Cognitive Development

In addition to emotional support, the new guidelines also focus on cognitive development. AI tools are expected to encourage critical thinking rather than spoon-feeding answers. They should provide a pattern of progressive disclosure, prompting learners to engage with problems and find solutions themselves. This approach is intended to empower students without replacing the role of teachers and teaching assistants.
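"Progressive disclosure" amounts to releasing increasingly specific hints and withholding the worked answer until the learner has attempted the problem. The following is a minimal sketch of that idea; the hint ladder and function name are hypothetical, not taken from the guidance.

```python
# Illustrative sketch of progressive disclosure: one hint per learner
# attempt, with the full answer revealed only as a last resort.
def hint_ladder(hints: list[str], answer: str):
    """Yield hints in order of increasing specificity, then the answer."""
    for hint in hints:
        yield hint
    yield answer


# Example: solving 4x = 12, nudging rather than spoon-feeding.
ladder = hint_ladder(
    ["What operation undoes multiplication?",
     "Try dividing both sides by 4."],
    "x = 12 / 4 = 3",
)
```

Each call for help advances the ladder one rung, so the tool prompts engagement first and only "tells" when scaffolding is exhausted.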

Government’s Vision for AI in Education

The DfE believes that AI can significantly enhance learning experiences, especially for disadvantaged children and those with special educational needs (SEND). However, Secretary Phillipson stressed that AI should complement human interaction rather than replace it. “AI will back our teachers, but never remove them,” she stated. The government’s goal is to leverage technology while maintaining the core principles of deep, personal education.

The announcement comes as part of a broader £23 million expansion of school edtech and AI pilot projects. While the new standards are non-statutory, they provide a framework for schools and developers to ensure that AI tools contribute positively to the educational environment.