EUROPEAN EDTECH POLICY MAP
4.1. Sustainable funding for spaces and structures for testing, trialling, co-creation
4.1.3 Create education-specific regulatory sandboxes
Summary of suggested actions
Establish and expand education-specific regulatory sandboxes that provide controlled environments to test digital and AI-driven tools in real or simulated educational settings, ensuring interoperability, data protection, and ethical compliance before large-scale deployment.
Description
As AI and data-driven systems become increasingly embedded in education, the need for environments that enable safe experimentation under regulatory oversight is critical. Regulatory sandboxes, which are controlled spaces in which innovators, policymakers, and data protection authorities collaborate, allow for the testing of emerging technologies in realistic conditions without exposing learners or institutions to undue risk.
Education-specific sandboxes would allow EdTech developers and education institutions to explore interoperability, usability, and compliance challenges under the supervision of relevant authorities, such as data protection regulators and ministries of education. They would also facilitate dialogue between developers, educators, and policymakers, enabling joint understanding of how technologies perform pedagogically and legally before they enter classrooms at scale.
By combining technical testing with ethical and legal guidance, such sandboxes can accelerate the responsible development and adoption of EdTech, reduce compliance costs for smaller companies, and build trust among educators and parents. Structured evaluation within these environments would also help inform broader policy development, including standards under the EU AI Act and General Data Protection Regulation (GDPR).
Major enabling factors
- The AI Act mandates and funds the creation of national regulatory sandboxes to support safe experimentation under legal supervision, providing a structural entry point for the education sector.
- The Digital Europe Programme and Horizon Europe include funding lines for AI testing, cybersecurity, and regulatory sandboxes, which could be extended to education-specific contexts.
- Initiatives like the CNIL Education Sandbox (France) and the GovTech Polska Innovation Sandbox demonstrate feasibility and public value.
- Data protection authorities (e.g. CNIL, BfDI, Datatilsynet) and innovation agencies already possess the expertise to operate or co-run sandbox environments.
Major roadblocks
- Education is rarely prioritised within existing national sandbox programmes, which typically focus on health, finance, or manufacturing.
- Ministries of education may lack the technical and legal capacity to co-manage sandboxes with data authorities.
- Participation in sandbox programmes requires time and technical preparation that smaller EdTech firms may struggle to provide without financial support.
- The use of learner data in testing contexts remains a legal and ethical challenge, particularly for minors.
- Current sandboxes focus on technical or legal dimensions; few integrate pedagogical evaluation criteria.
- The absence of coordination between data protection authorities, ministries, and innovation agencies can hinder coherent implementation.
Suggested action: Network of European testing and evaluation environments
WHO (Potential actors)
- European Commission (DG Connect, DG EAC) to fund and coordinate an EU-level framework for education-specific regulatory sandboxes under the Digital Europe Programme and AI Act.
- National Data Protection Authorities and Ministries of Education to co-develop and operate national sandboxes.
- EdTech alliances, research organisations, and digital innovation hubs to act as operational partners and provide testing expertise.
WHAT (Goal of suggested activities)
Develop and institutionalise education-specific regulatory sandboxes to enable EdTech developers, schools, and researchers to test new digital and AI systems under safe, compliant, and transparent conditions that foster innovation while upholding data protection and ethical standards.
HOW (Suggested activities)
- Establish national education sandbox pilots under the framework of the AI Act, jointly managed by data protection authorities and education ministries.
- Allocate targeted EU co-funding for education-specific sandbox activities via the Digital Europe Programme, including SME participation grants.
- Develop EU-level guidance for education sandboxes to ensure consistency with the GDPR, AI Act provisions, and Council of Europe recommendations on AI in education.
- Involve teachers, school leaders, and researchers in co-designing sandbox procedures to ensure pedagogical relevance and contextual understanding.
- Create a European education sandbox network to share anonymised findings, compliance strategies, and technical standards among Member States.
- Introduce evidence reporting requirements for sandbox participants to support EU-level policy learning and contribute to an emerging European evidence base for EdTech.
Existing steps in the right direction
CNIL “Sandbox Numérique Éducatif” (Education Sandbox, France)
In 2023, the Commission nationale de l’informatique et des libertés (CNIL), France’s data protection authority, launched a dedicated sandbox for digital education projects (bac à sable numérique éducatif). The initiative supports public and private actors in developing EdTech tools that comply with the GDPR and uphold learners’ fundamental rights. Selected projects receive regulatory guidance and technical advice from CNIL experts to ensure that personal data is processed lawfully, transparently, and proportionately.
The sandbox aims to address key challenges in balancing innovation with privacy, such as processing children’s data, managing parental consent, using learning analytics, and ensuring algorithmic transparency in AI-enabled educational systems. It also facilitates cooperation between ministries, EdTech developers, and educational institutions in testing technologies under real conditions while maintaining compliance.
The CNIL sandbox provides a structured, real-world example of how regulatory experimentation environments can accelerate EdTech innovation while safeguarding data protection and ethical standards. It demonstrates that compliance support and pedagogical oversight can coexist, reducing barriers for EdTech SMEs to innovate responsibly.
Specific support required to achieve the goal:
- Replicating and expanding the CNIL model at both national and European levels, supported by Digital Europe or Erasmus+ funding.
- Integrating educational sandboxes into the broader network of Testing and Experimentation Facilities (TEFs).
- Publishing anonymised findings to inform European EdTech policy and strengthen evidence-based governance of AI in education.
