Intelligence Brain · education

Administrative AI for Irish education — the safe wins


Schools in Ireland are drowning in administrative work that has nothing to do with teaching. Enrolment forms, GDPR-bound parent communications, attendance returns, SUSI references, the ESINET reporting cycle, special-education resource allocation, board minutes, policy reviews. The principal ends up doing two jobs and the school secretary ends up doing four. AI can absorb a meaningful chunk of that load — but only if it's deployed somewhere that won't get a Department of Education circular waved at it. Below is what I'd actually build first if I were a principal, deputy principal or ETB administrator looking at this seriously.

The line between safe wins and risky experiments

Before any school office plugs an AI tool into anything, draw a clear line. On one side you have administrative tasks: drafting, summarising, classifying, formatting, looking up internal policy. On the other side you have decisions about children: who gets a place, who gets resource hours, who's flagged for child protection concern, what goes on a school report. The first set is where AI earns its keep. The second set is where you keep humans firmly in charge and use AI only to surface evidence, never to recommend an outcome.

This isn't squeamishness. It's the practical reading of GDPR Article 22 (automated decision-making), the Department's data protection guidance for schools, and the reality that a board of management will have to defend any decision a parent challenges. If the answer to "how did you decide?" is "the AI suggested it", you're finished. If the answer is "the deputy principal decided, having reviewed an AI-generated summary of the file alongside the original documents", you're fine.

Every safe win below sits firmly on the administrative side of that line.

Drafting and triaging the parent inbox

The single highest-volume task in most school offices is parent correspondence. Absence notes, uniform queries, lost-property questions, complaints about homework, requests for meetings, GDPR subject access requests, queries about the book grant scheme, queries about voluntary contributions. A school of 600 pupils can generate hundreds of inbound emails a week, almost all of which need a polite, accurate, policy-consistent reply.

The technical pattern that works:

  • Classify, don't auto-reply. Run inbound mail through a model that tags by category — absence, finance, complaint, SEN-related, child-protection-flag, urgent — and routes it to the right tray. No reply leaves the building without a human pressing send.
  • Draft against the school's own documents. The model should be retrieving from your Code of Behaviour, Admissions Policy, Anti-Bullying Policy, and current circulars — not generating from general knowledge. This is where retrieval-augmented generation matters: the answer must be grounded in your policy text, not the model's guess at what an Irish school policy might say.
  • Flag, don't filter, safeguarding. Anything that hints at a child protection concern goes straight to the DLP's tray with the original message untouched. The AI's job is to make sure nothing gets buried, not to assess severity.
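The classify-and-route pattern above can be sketched in a few lines. This is illustrative only: the category names and tray routing are assumptions, and the keyword matching stands in for what would be a model call in practice.

```python
from dataclasses import dataclass

# Hypothetical routing table. The safeguarding tray goes straight to the
# DLP (Designated Liaison Person) with the original message untouched.
TRAYS = {
    "safeguarding": "dlp",
    "complaint": "principal",
    "absence": "office",
    "finance": "office",
    "general": "office",
}

@dataclass
class InboundMail:
    sender: str
    subject: str
    body: str

def classify(mail: InboundMail) -> str:
    """Placeholder classifier -- swap this keyword matching for a
    model call in a real deployment."""
    text = (mail.subject + " " + mail.body).lower()
    if any(w in text for w in ("bruise", "disclosure", "at home alone")):
        return "safeguarding"
    if any(w in text for w in ("absent", "sick", "appointment")):
        return "absence"
    if any(w in text for w in ("invoice", "contribution", "book grant")):
        return "finance"
    return "general"

def route(mail: InboundMail) -> str:
    """Tag and route only. Drafting happens next; sending stays human."""
    return TRAYS[classify(mail)]
```

Note what the code does *not* do: it never replies. It tags, it routes, and the safeguarding branch is deliberately the first check so nothing else can shadow it.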

The result is the secretary clearing the inbox in an hour instead of half a day, and the principal seeing only the items that actually need a principal.

Policy and circular comprehension

Every school in the country is subject to a constant drip of Department circulars, NCCA updates, NCSE guidance, Tusla notifications, and JMB or ETBI bulletins. Nobody reads them all. They sit in a folder until something goes wrong, and then someone has to find the relevant paragraph at speed.

This is where a properly indexed internal knowledge base earns its money. The technical bits that matter:

  • Document-level provenance. Every answer the system gives must cite the circular number, page, and paragraph. "Circular 0042/2024, page 3, section 4(b)" — not a paraphrase floating in space.
  • Versioning. Circulars supersede each other. The system has to know that the 2024 version of a financial procedure replaces the 2019 one, and surface only the live guidance unless asked for history.
  • Cross-document reasoning. A question like "what's our obligation when a parent withdraws consent for a school tour?" pulls from the Education Welfare Act, the school's own consent forms, and the relevant Department guidance. The model has to stitch those together and show its working.

This is the workload I had in mind when building the Intelligence Brain for education — a retrieval layer that runs against the school's own document set and never sends a circular, a child's name, or a board minute outside the school's environment.

Enrolment, returns and the paperwork cycle

Schools run on a calendar of returns: the October Returns, POD and P-POD updates, attendance reporting to Tusla, NCSE applications for resource hours, SET allocation paperwork, free-book-scheme returns, and the annual budget submission. Most of this is structured data trapped inside unstructured documents — application forms, medical notes, psychological assessments, previous school reports.

The safe pattern here is structured extraction with human verification:

  • The AI reads the document and proposes field values — pupil PPSN, date of birth, previous school roll number, EAL status, medical flags, custody arrangements.
  • The school office sees the extracted fields side-by-side with the source document. They confirm or correct.
  • The confirmed record goes into the school MIS (VSware, Aladdin, Compass, whatever you use) through a controlled export — not through screen-scraping or browser automation that breaks every time the vendor pushes an update.

Done properly, this collapses an enrolment-processing day into an enrolment-processing hour and dramatically reduces the typo rate on PPSNs and dates of birth, which is where most downstream pain comes from when DEIS or NCSE figures don't reconcile.

Board of management minutes, AGM packs and policy review

Board work is the hidden tax on school leadership. Six meetings a year, each producing minutes that must be accurate, neutral, and aligned with the agreed agenda. Plus an AGM. Plus the annual cycle of policy reviews — admissions, anti-bullying, child safeguarding, RSE, data protection, acceptable use.

What works:

  • Transcribe locally, summarise locally. If the recording or the transcript leaves the school's environment, you've created a data protection problem. On-premise or sovereign-cloud transcription with a model that produces draft minutes from a transcript is the only configuration I'd sign off on.
  • Draft, don't decide. The AI produces a first draft of minutes structured to the standing agenda. The secretary edits. The chair signs. Same governance as before, ninety minutes shorter.
  • Policy review as diff. When the Department updates a model policy, the system shows your current policy alongside the new model with the differences highlighted and the implications flagged. The board reviews changes; it doesn't re-read the whole document.
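"Policy review as diff" needs nothing exotic on the comparison side — a plain unified diff of the school's current policy against the updated model policy gives the board exactly the changed paragraphs. A minimal sketch using Python's standard library (the filenames are hypothetical labels):

```python
import difflib

def policy_diff(current: str, model_policy: str) -> str:
    """Unified diff of the school's current policy text against the
    Department's updated model policy, for board review."""
    return "\n".join(difflib.unified_diff(
        current.splitlines(),
        model_policy.splitlines(),
        fromfile="school_policy.txt",
        tofile="model_policy_2024.txt",
        lineterm="",
    ))
```

The AI's role sits on top of this: summarising what each highlighted change implies for the school. The diff itself stays deterministic, which is what makes it trustworthy in a board pack.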

What "on-premise" actually means for an Irish school

"On-premise" doesn't mean a server in the principal's office. For a small primary school that's neither realistic nor desirable. It means the AI workload runs in an environment that the school controls, with a clear data-residency commitment, no training on your data, and an audit log that a DPO can actually read.

In practical terms, for Irish schools, that's one of three configurations:

  • A managed service hosted in EU/Ireland with a DPA naming the school as controller and the provider as processor, no model training on school data, and pen-test reports available on request.
  • An ETB-level shared deployment, where one set of infrastructure serves all schools in the scheme — this is where the economics work for community and comprehensive sectors.
  • A diocesan- or trust-level shared deployment for voluntary secondary, doing the same job for the JMB-aligned schools.

The architecture I describe in the broader Intelligence Brain writeup is built around exactly this: a single tenant per organisation, EU residency, full audit trail, no cross-customer training, and enough configurability that a DPIA is a one-week exercise rather than a six-month one.

Where to start this week

Pick one task. The parent inbox is the obvious first candidate because the volume is high, the policy ground-truth already exists in your school documents, and the risk of a bad outcome is contained — every reply is reviewed by a human before it goes out. Spend a fortnight measuring how much time it actually saves and whether the drafts genuinely match your school's voice. If it works, add policy retrieval next, then enrolment extraction. Don't try to do everything at once, don't let anyone sell you a magic platform that solves all of it on day one, and don't put any tool near safeguarding decisions or pupil-level judgements until you have a year of comfortable, boring, audited use behind you on the safe wins.

Book a 30-minute assessment

Direct with Michael. No charge. No pitch deck.

Pick a slot →