Intelligence Brain · Education

For education firms in Ireland.

For schools, third-level, and training providers. An on-premise intelligence brain designed for the regulatory posture, the document archive, and the rollout cadence of an Irish education practice.

The pattern of the education firm in 2026 — what's actually broken in the day

I've spent the last eighteen months sitting in staff rooms, registrars' offices, and IT rooms across Irish schools, ETBs, and a handful of universities. The pattern is consistent. The teaching itself isn't broken. What's broken is everything sitting around the teaching.

A subject teacher in a post-primary school is now expected to maintain a digital learning platform, log behaviour, write SEN reports, contribute to subject planning documents, complete CPD records, respond to parent emails inside the day, and keep a paper trail that would survive a Department inspection. A university lecturer has the same problem with grant reporting, ethics paperwork, module descriptors, and the constant reformatting of the same material for three different systems.

This is where AI helps. It helps where the same information has to be re-expressed in a different format for a different reader — a parent, an inspector, a board, a funder. It helps where someone is reading fifty submissions and trying to remember what the third one said. It helps where a policy document needs to be cross-checked against last year's version.

It does not help with the teaching, the relationship, the discipline call, or the judgement on a vulnerable child. Anyone selling you school AI that replaces a teacher's professional judgement is selling you something I wouldn't run in my own kids' school. The Intelligence Brain is built on that line.

The seven workflows that pay for the project in month one

For an education firm — and I'm using that word loosely to mean schools, ETBs, third-level institutions, and education-adjacent bodies — these are the seven I've seen pay back inside the first month.

  • Inspection and self-evaluation evidence packs. The Brain reads your existing planning documents, minutes, and policy folder and assembles the evidence trail a WSE or QQI review actually asks for. The work was always there; you just stop hand-stitching it.
  • SEN and student support documentation. Drafts of Student Support Files, IEPs, and review notes built from the teacher's own classroom notes — never from external data, never from the child's identifiable record leaving the building.
  • Parent communication triage. Inbound parent email gets summarised, categorised, and a draft response prepared in the school's own voice. The teacher or year head still sends it. They just don't write it from a blank page at half nine at night.
  • Policy review and cross-check. Your Child Safeguarding Statement, AUP, RSE policy, and code of behaviour read against each other and against current Department circulars. It flags contradictions. It doesn't rewrite them.
  • Grant and funding narrative drafting. For third-level, ETBs, and DEIS-funded work — the Brain holds your prior successful applications and drafts the narrative sections of the next one, grounded in your actual data.
  • Module and scheme-of-work assembly. Pulls last year's scheme, the spec, and the teacher's notes into a current draft. The teacher edits. The blank document is gone.
  • Board and governance pack preparation. Minutes, principal's reports, financial summaries, and risk register updates assembled into the board pack format your patron or governing authority expects.

The data-residency posture — what an Irish education firm actually needs

This is where most generic AI for education falls over. An Irish school or third-level institution is handling special category data on minors, SEN information, and in many cases Tusla-relevant material. That data does not belong in a US-hosted chatbot.

The Intelligence Brain runs on-premise or in a sovereign Irish/EU tenancy. Nothing leaves the building unless you explicitly send it. That posture has to satisfy several layers at once: GDPR and the Data Protection Act 2018, the Department of Education's own guidance on cloud and AI use in schools, the DPC's position on processing children's data, and — for third-level — the institution's own DPO and research ethics committee.

For ETBs there's the additional layer of being a public body under the Public Service Data Strategy. For universities, there's HEA reporting and, increasingly, funder requirements around how AI is used in research workflows. The Brain produces an audit log of every prompt, every retrieval, and every output. That log is yours, on your hardware, and it's what your DPO will ask for the first time something goes wrong.
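The audit trail described above can be sketched as an append-only, hash-chained log. This is an illustrative shape only, under my own assumptions — the `AuditEntry` fields and the `AuditLog` class are hypothetical stand-ins, not the Brain's published schema:

```python
import hashlib
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    """One logged event: a prompt, a retrieval, or an output."""
    kind: str     # "prompt" | "retrieval" | "output"
    actor: str    # staff member or agent that triggered the event
    payload: str  # the text of the prompt, retrieved chunk, or output
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class AuditLog:
    """Append-only log kept on the school's own hardware. Each record
    carries the hash of the previous one, so tampering is detectable."""
    def __init__(self):
        self.entries: list[dict] = []
        self._prev_hash = "0" * 64

    def append(self, entry: AuditEntry) -> str:
        record = asdict(entry)
        record["prev_hash"] = self._prev_hash
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        record["hash"] = digest
        self.entries.append(record)
        self._prev_hash = digest
        return digest
```

The hash chain is the design point: a DPO can verify that nothing was deleted or rewritten after the fact, which is exactly the question that gets asked the first time something goes wrong.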

One more point. The Brain does not train on your data. Your students' work, your staff's notes, and your parents' correspondence are not feedstock for someone else's model. That's a contractual and architectural commitment, not a marketing line.

The deployment cadence — thirty-two weeks, four gates

Weeks 1–8, ingest. I sit with your IT lead and your DPO. We map every system that holds text — VSware or Synergy, Google Workspace or Microsoft 365, your VLE, your shared drives, your policy folder. Nothing is moved. We build connectors. At the gate, you see a single search box that reads across everything you already have.
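The ingest gate — one search box over read-only connectors, with nothing moved — can be sketched like this. It's a minimal sketch under my own assumptions: the `Connector` protocol, `Document` record, and `FolderConnector` are hypothetical names for illustration, not the real connector API:

```python
from dataclasses import dataclass
from typing import Iterable, Protocol

@dataclass
class Document:
    source: str   # e.g. "policy-folder", "board-minutes", "vle"
    title: str
    text: str

class Connector(Protocol):
    """Read-only view onto one existing system. Nothing is moved out."""
    def documents(self) -> Iterable[Document]: ...

class FolderConnector:
    """Stand-in for a shared-drive or policy-folder connector."""
    def __init__(self, docs: list[Document]):
        self._docs = docs
    def documents(self) -> Iterable[Document]:
        return self._docs

def search(connectors: list[Connector], query: str) -> list[Document]:
    """The 'single search box': one query, read across every source."""
    q = query.lower()
    return [
        d for c in connectors for d in c.documents()
        if q in d.text.lower() or q in d.title.lower()
    ]

policies = FolderConnector([Document("policy-folder", "AUP", "Acceptable use ...")])
minutes = FolderConnector([Document("board-minutes", "March minutes", "The AUP review ...")])
hits = search([policies, minutes], "aup")
```

The design choice the sketch makes visible: each source keeps its data where it is, and the Brain only ever reads through the connector boundary.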

Weeks 9–16, structure. The Brain learns the shape of your organisation — your subjects, your year groups, your departments, your committees, your reporting lines. At this gate, you see the seven workflows above running on real documents, with a human in the loop on every output.

Weeks 17–24, swarm. Multiple specialised agents start handing work to each other — the policy-checker talks to the drafting agent, the SEN agent talks to the evidence agent. At the gate, you see the lift in throughput, measured against a baseline we took in week one.

Weeks 25–32, audit. Every output is logged, every prompt is reviewable, every model decision is traceable. At the final gate, your DPO and your senior leadership team get a full audit pack and the Brain is handed over as a standing system, not a project.

What to bring to the assessment call

If you're considering this for your school, ETB, or institution, three things make the first call useful:

  • An honest list of where staff time is being lost. Not a wish list — the actual top three repetitive document tasks that everyone resents.
  • Your current data protection position. Who your DPO is, what your last DPIA looked like, and any open items from your most recent inspection or review.
  • One sceptic. Bring the person on staff who is most resistant to AI. If the Brain doesn't answer their questions on the call, it isn't ready for your building.

That's the page. The Intelligence Brain works in other verticals too, but this one — Irish education — is where the regulatory care and the document load line up most clearly. If that sounds like your day, the call is the next step.

Frequently asked questions — Education

Is the Intelligence Brain on-premise or cloud?

Default is on-premise — the firm's own server, the firm's own data, the firm's own model weights. We support private-cloud (your AWS, your GCP, your Azure tenant) when on-prem hardware isn't a fit. We do not run a multi-tenant SaaS.

How long is the rollout?

About thirty-two weeks from kick-off to live use. Four eight-week stages — ingest, structure, swarm, audit. The swarm runs in shadow mode for the first ninety days alongside your team; only at day ninety, with the audit logs to back it up, does the swarm earn the right to run a tool live.
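The ninety-day shadow gate can be sketched as a simple promotion rule. This is an illustrative sketch under my own assumptions — `ShadowGate` is a hypothetical name, and the real promotion criteria are broader than two counters:

```python
from dataclasses import dataclass

@dataclass
class ShadowGate:
    """The swarm runs alongside staff in shadow mode. It may go live only
    after the shadow period has elapsed AND every shadow run left an
    audit record to back it up."""
    shadow_days_required: int = 90
    days_elapsed: int = 0
    runs: int = 0
    audited_runs: int = 0

    def record_day(self, runs: int, audited: int) -> None:
        self.days_elapsed += 1
        self.runs += runs
        self.audited_runs += audited

    def may_run_live(self) -> bool:
        return (
            self.days_elapsed >= self.shadow_days_required
            and self.runs > 0
            and self.audited_runs == self.runs
        )
```

The point of the rule: going live is earned by evidence, not scheduled by the calendar alone.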

What does it cost?

Per-firm engagement, scoped from a free thirty-minute assessment. Organisations vary too widely for a public list price — a two-teacher rural school and a multi-campus ETB need different scoping. Book a slot via Calendly and we will scope it together.

Can it write contracts / draft accounts / produce clinical letters automatically?

It can produce a first pass that a qualified human reviews before anything is signed, filed, or sent. Tool-layer authorisation is a hard architectural boundary in the Brain — the swarm reads everything and signs nothing.
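"Reads everything and signs nothing" can be expressed as a one-way gate at the tool layer. A minimal sketch, assuming a hypothetical tool registry and `invoke` dispatcher — the tool names and the split into two sets are mine, for illustration:

```python
class ToolAuthorisationError(Exception):
    """Raised when an agent attempts an action beyond the read layer."""

# Read-layer tools produce text only; write-layer tools change the world.
READ_TOOLS = {"search", "retrieve", "summarise", "draft"}
WRITE_TOOLS = {"send_email", "file_return", "sign_document"}

def invoke(tool: str, *, actor: str) -> str:
    """Agents may call read tools. Write tools execute only for a human —
    the swarm reads everything and signs nothing."""
    if tool not in READ_TOOLS | WRITE_TOOLS:
        raise ValueError(f"unknown tool: {tool}")
    if tool in WRITE_TOOLS and actor != "human":
        raise ToolAuthorisationError(
            f"{actor} attempted '{tool}': write tools are human-only"
        )
    return f"{tool} executed by {actor}"
```

The boundary is architectural, not behavioural: the check lives in the dispatcher, so no amount of prompting can route an agent around it.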

What about hallucination?

The auditor agent's job is to catch hallucination before output reaches a human. Every claim in every output must carry a citation, and every citation has to resolve to a source that actually loads. If the auditor cannot verify a claim, the output is rejected as a build-failure signal — not corrected.
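The auditor's accept-or-reject rule can be sketched as follows. This is a minimal sketch under my own assumptions: the sentence-level claim split, the `[cite-id]` citation syntax, and the stand-in resolver are all hypothetical — in a real deployment the resolver would fetch the cited document and confirm it loads:

```python
import re
from dataclasses import dataclass

@dataclass
class AuditResult:
    accepted: bool
    reason: str

def audit(output: str, resolve) -> AuditResult:
    """Reject any output whose claims lack a citation, or whose citations
    fail to resolve. Failures are rejected, never silently corrected."""
    claims = [c.strip() for c in output.split(".") if c.strip()]
    for claim in claims:
        cites = re.findall(r"\[([^\]]+)\]", claim)
        if not cites:
            return AuditResult(False, f"uncited claim: {claim!r}")
        for cite in cites:
            if not resolve(cite):
                return AuditResult(False, f"unresolvable cite: {cite!r}")
    return AuditResult(True, "all claims cited and resolved")

# Stand-in resolver: pretend only these two documents exist and load.
KNOWN = {"circular-0042-2024", "board-minutes-march"}
resolver = lambda cite: cite in KNOWN

ok = audit("Attendance policy updated [circular-0042-2024]", resolver)
bad = audit("Attendance policy updated", resolver)
```

Note the asymmetry: an unverifiable output is never patched into a verifiable one — rejection is the signal that something upstream needs fixing.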

What's specific about education firms in the rollout?

The education vertical brings its own data-residency, professional-body, and audit-trail constraints. The methodology is the same; the structure-stage and swarm-stage prompts are vertical-specific.

Do you understand the education regulatory environment in Ireland?

I have worked with firms in this vertical and I bring the regulatory posture into the architecture from day one. The compliance pack at delivery includes DPIA, LIA, and EU AI Act tier-mapping, all reviewed against the vertical's specific framework.