
Explainable AI in education: fostering human oversight and shared responsibility

As Artificial Intelligence (AI) becomes more integrated into education systems, the need for transparency, trust, and human involvement has never been more urgent. But what exactly makes AI explainable, and why does it matter in the classroom?

About the online session

This interactive session, hosted by the European Digital Education Hub, will unpack the latest insights from the Explainable AI (XAI) working group (squad).

Together, we’ll explore how explainability supports better learning outcomes, safeguards rights, and strengthens the role of educators and learners in AI-enhanced environments.

Session highlights

The session will highlight key findings from the squad’s newly released report, including:

  • What XAI is and why it matters for education
  • Legal and ethical considerations, including the AI Act and GDPR
  • Perspectives from diverse stakeholders and practical, real-world use cases
  • Core competences educators need to navigate and evaluate AI systems effectively

Speakers will include squad leader Francisco Bellas and fellow members, who will share actionable insights and practical examples that illustrate the impact of explainable AI across different educational settings.

This event is open to all members of the Hub.

Not a member yet?

Join the European Digital Education Hub
