
New Government AI Guidance for Irish Schools: Is Your Tool Compliant?

Ireland published its first AI guidance for schools in October 2025. With the EU AI Act classifying education AI as high-risk, here is a 7-point compliance checklist.


Your school probably uses AI already. A teacher used ChatGPT to draft a report last week. A principal asked it to summarise a policy document. Someone on staff has been experimenting with image generators for classroom displays. The question isn't whether AI is in your school. It's whether it's safe.

In October 2025, the Department of Education published its first-ever Guidance on Artificial Intelligence in Schools. By August 2026, the EU AI Act will classify education AI systems as high-risk, triggering strict legal obligations. And Ireland's own Regulation of Artificial Intelligence Bill 2026 will establish a new AI Office to enforce it all.

If you're a principal or SET using AI tools right now, this is what you need to know.

What Changed in 2025 and 2026

Three major policy developments landed in quick succession. Each one changes what schools need to think about before adopting any AI tool.

1. Department of Education AI Guidance (October 2025)

The Department's guidance is the first national framework for AI use in Irish schools. Developed with Oide Technology in Education, it establishes core principles that every school should now be following:

  • AI should augment, not replace, human-led teaching and learning
  • Teachers must maintain human oversight of all AI-generated content
  • Schools need a clear AI policy embedded in their Acceptable Use Policy
  • The "4P Framework" (Purpose, Planning, Policies, Practice) should guide all AI adoption decisions

Minister McEntee described it as a "living document" that will be updated regularly. That signals more requirements are coming, not fewer.

2. EU AI Act: Education Classified as High-Risk

The EU AI Act entered into force in August 2024, with provisions rolling out in phases. The most significant for schools: AI systems used in education are classified as high-risk when they determine access, evaluate outcomes, or assess learning levels.

From February 2025, schools have been required to ensure staff have sufficient AI literacy for their roles. From August 2026, the full high-risk compliance framework applies, including mandatory risk assessments, documentation, and human oversight mechanisms.

3. Ireland's Regulation of AI Bill 2026

Ireland's domestic legislation, the Regulation of Artificial Intelligence Bill 2026, gives legal teeth to the EU AI Act. It creates the AI Office of Ireland (Oifig Intleachta Shaorga na hÉireann), a new statutory body responsible for enforcement. The Office must be operational by August 2026.

Ireland has designated 15 National Competent Authorities as Market Surveillance Authorities, including the Data Protection Commission. This means enforcement will be real, sector-specific, and already staffed.

Key Point: What This Means for Your School

These aren't theoretical frameworks. By August 2026, Irish schools using AI tools for SEN documentation, learning assessment, or student data processing will need to demonstrate that those tools meet specific compliance standards. The enforcement infrastructure is already being built.

The 7-Point AI Compliance Checklist for Irish Schools


The government guidance, EU AI Act, and GDPR together create a clear set of requirements. Here's a practical checklist every school should apply to any AI tool they're using or considering:

1. Human Oversight

Requirement: Teachers must review, verify, and approve every AI-generated output before it's used.

No AI tool should make decisions about children autonomously. The Department's guidance is explicit: AI augments professional judgment, it does not replace it. Every draft, summary, or recommendation needs a teacher's eyes and approval before it becomes official.

2. Data Protection Impact Assessment (DPIA)

Requirement: Schools must conduct a DPIA on any AI tool processing personal data.

The DPC's Data Protection Toolkit for Schools includes a sample DPIA template specifically designed for schools. If you haven't completed one for each AI tool in use, that should be your first action.

3. EU Data Residency

Requirement: Student data should be processed within the EU/EEA wherever possible.

When personal data leaves the EU, additional safeguards are required under GDPR Chapter V. For schools, the simplest path to compliance is choosing tools that process and store data within EU borders entirely. Ask every AI vendor: where does our data go?

4. PII Handling and Minimisation

Requirement: Only the minimum necessary personal data should be processed. Ideally, personal identifiers should be removed before data reaches any AI system.

The GDPR principle of data minimisation is especially important with AI tools. The gold standard is client-side redaction: removing names, dates of birth, addresses, and other identifiers before any data is sent to an AI model. If a tool can't explain exactly how it handles PII, that's a red flag.
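The idea behind client-side redaction can be sketched in a few lines. This is a minimal, illustrative example of our own (the `redact` function and its patterns are hypothetical, not any vendor's implementation); production tools typically rely on trained named-entity-recognition models rather than regexes, which can only catch names they are told about in advance:

```python
import re

# Illustrative only: real redaction tools use NER models, not regexes.
# These patterns and placeholder labels are our own hypothetical example.
PATTERNS = {
    "[DOB]": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),         # dates like 04/09/2017
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),       # email addresses
    "[PHONE]": re.compile(r"\b0\d{1,2}[ -]?\d{3}[ -]?\d{4}\b"),  # Irish-style phone numbers
}

def redact(text: str, known_names: list[str]) -> str:
    """Replace known student names and common identifiers with placeholders
    BEFORE the text is sent to any AI model."""
    for name in known_names:
        text = re.sub(re.escape(name), "[NAME]", text, flags=re.IGNORECASE)
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text
```

For example, `redact("Report for Aoife Byrne, DOB 04/09/2017.", ["Aoife Byrne"])` returns `"Report for [NAME], DOB [DOB]."` The key design point is that this runs before anything leaves the user's device, so the AI model only ever sees placeholders.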

5. Transparency and Consent

Requirement: Schools must be transparent about how AI tools process data, and consent mechanisms must meet GDPR standards.

In Ireland, the digital age of consent is 16. For primary school students, this means parental consent is required if consent is the lawful basis for data processing. Your school's privacy policy must explicitly address AI tool usage, and parents should be informed.

6. Data Retention

Requirement: AI tools must not retain student data beyond what's necessary for the immediate task.

This is where many generic AI tools fall short. Standard ChatGPT, for example, retains conversation data and has used it for model training. Only OpenAI's Enterprise, Edu, and API products with specific agreements offer zero data retention. Ask your vendor: how long do you keep our data, and is it used to improve your AI?

7. No Model Training on School Data

Requirement: School data must not be used to train or fine-tune AI models without explicit, informed consent.

This is separate from data retention. Even if data is "deleted" after processing, the question is whether it was used for training during processing. GDPR requires a lawful basis for any processing, and model training on children's data without consent is legally precarious.

Why Generic AI Tools May Not Meet These Requirements

This isn't about generic AI tools being "bad." ChatGPT, Claude, and Gemini are extraordinary technologies. But they weren't designed for Irish school data, and that matters.

Here are specific areas where general-purpose AI tools may not align with the requirements above:

  • Data residency: Standard consumer AI products typically process data on US servers. Even with standard contractual clauses, this adds complexity and risk under GDPR
  • Data retention: Consumer AI products may retain conversation data for 30 days or longer after deletion. Some providers are currently subject to legal orders requiring indefinite data retention
  • Model training: Consumer-tier products may use conversation data for model improvement unless users specifically opt out. This creates a GDPR concern when the data involves children
  • PII handling: No automatic identification or redaction of personally identifiable information before processing. Whatever you type goes straight to the model
  • No DPIA support: General-purpose tools don't provide Data Protection Impact Assessment documentation or school-specific compliance information

None of this means teachers should stop exploring AI. It means schools should choose tools designed for their specific compliance context.

AI should augment and not replace human-led teaching and learning, with an emphasis on human oversight, verification, and ethical reflection.

Department of Education, Guidance on AI in Schools, October 2025

What Principals Should Do Now

You don't need to become an AI compliance expert overnight. But you do need to take three concrete steps before August 2026:

1. Update Your Acceptable Use Policy

Your school's AUP should explicitly address AI tool usage by staff. Cover which tools are approved, what data can be entered, and what oversight is required. The Oide TiE AI Hub has resources to help, and AIpolicy.ie offers school-specific AI policy templates.

2. Conduct a DPIA on Every AI Tool in Use

Use the DPC's template to assess each tool. If a tool can't clearly answer the seven questions above, it shouldn't be processing student data.
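One way to make that assessment concrete is to encode the seven checklist points as a simple pass/fail screen. A hedged sketch in Python (the class and field names are our own illustration, not an official DPC or Department of Education template):

```python
from dataclasses import dataclass, fields

# Hypothetical screening aid based on the 7-point checklist above;
# not an official DPC or Department of Education template.
@dataclass
class AIToolAssessment:
    human_oversight: bool    # teacher reviews and approves every output
    dpia_completed: bool     # DPIA done, e.g. using the DPC template
    eu_data_residency: bool  # data processed within the EU/EEA
    pii_minimised: bool      # identifiers redacted before processing
    consent_compliant: bool  # transparency and GDPR-standard consent
    zero_retention: bool     # no student data kept after processing
    no_model_training: bool  # school data never used to train models

    def failures(self) -> list[str]:
        """Return the checklist points this tool fails."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

    def approved(self) -> bool:
        # A tool that fails any single point should not process student data.
        return not self.failures()
```

A tool that fails even one point, say `zero_retention`, would come back as not approved, which matches the principle in the text: if a vendor can't clearly answer all seven questions, it shouldn't be processing student data.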

3. Choose Purpose-Built Tools with Demonstrable Compliance

For SEN documentation specifically, look for tools that can demonstrate: EU data residency, client-side PII redaction, zero data retention, no model training on school data, and built-in human oversight. These aren't nice-to-haves anymore. They're the baseline.

This is exactly why we built SENScribe. Every draft goes through client-side PII redaction before any data leaves your browser. Processing happens on Azure servers in the EU (West Europe region). We operate on a zero-retention basis: no student data is stored, no conversations are saved, and your data is never used to train AI models. The teacher reviews, edits, and approves every output.

We didn't design these features to tick compliance boxes. We designed them because teachers are already burned out and the last thing they need is compliance worry on top of everything else.

See How SENScribe Handles Your Data

Client-side PII redaction. EU data residency. Zero retention. No model training. Full details on our privacy page.


Frequently Asked Questions

Is ChatGPT safe to use in Irish schools?

It depends on which version and how it's used. Standard ChatGPT (Free, Plus, Pro) processes data on US servers and may retain conversations for model training. ChatGPT Edu and Enterprise versions offer stronger protections, including Data Processing Addendums and the option for EU data storage. For any AI tool, schools should conduct a Data Protection Impact Assessment and verify data residency, retention policies, and training practices before use with student data.

What is a DPIA and does my school need one for AI?

A Data Protection Impact Assessment (DPIA) is a structured process for identifying and minimising data protection risks. Under GDPR, a DPIA is required whenever processing is likely to result in a high risk to individuals' rights and freedoms. Since the EU AI Act classifies education AI as high-risk, any school using AI tools that process student data should conduct a DPIA. The DPC's Data Protection Toolkit for Schools includes a sample template.

What does the EU AI Act mean for Irish schools?

The EU AI Act classifies AI systems used in education as high-risk when they determine access, evaluate learning outcomes, or assess educational levels. From August 2026, schools using such systems must ensure they meet requirements including human oversight, risk assessment, documentation, and transparency. Ireland's Regulation of AI Bill 2026 establishes the domestic enforcement framework, including a new AI Office and 15 designated regulatory authorities.

How can schools ensure AI tools are GDPR compliant?

Schools should verify five key areas: (1) Data residency: is data processed within the EU? (2) Data retention: how long is data kept after processing? (3) Model training: is school data used to train the AI? (4) PII handling: is personal data minimised or redacted before processing? (5) DPIA: has the vendor provided compliance documentation? The DPC toolkit and the Department's AI guidance are the best starting points.


Sources: Department of Education AI Guidance (Oct 2025) · EU AI Act · Regulation of AI Bill 2026 (ICS.ie) · DPC Data Protection Toolkit for Schools · Oide TiE AI Resources · Digital Strategy for Schools to 2027 · ESRI: Future Proofing Schools (2025)
