Ireland Quantum 100 · Technical brief

How the Ireland Quantum machine feeds the IMPT carbon stack

There is a specific question I get asked every time someone hears "sovereign quantum machine in Tipperary, dedicated to climate workloads." It is some variation of: how does a 100-qubit transmon device in a dilution refrigerator end up affecting the price or quality of a tonne of carbon offset on the IMPT platform? It is a fair question, and the answer is more concrete than people expect. The chemistry that determines whether a carbon-capture sorbent works, or whether a reforestation soil-amendment is durable, or whether a direct-air-capture solvent regenerates without falling apart — that chemistry is exactly the class of problem superconducting quantum hardware is being built to attack. This article walks through how the Ireland Quantum 100 machine, once it comes online, will feed the IMPT carbon stack: the wiring, the workloads, the limits, and the parts I am not willing to pretend are solved.

What the machine actually is

Ireland Quantum 100 is a 100-physical-qubit superconducting transmon system being delivered over the next twelve months in Co. Tipperary. The qubits are aluminium-on-silicon transmons — the same family used by most of the serious gate-model players — operated at sub-15 millikelvin inside a dilution refrigerator. Connectivity follows a heavy-hex topology, which trades raw qubit-to-qubit coupling for lower frequency-collision risk and a cleaner path to surface-code error correction further down the road.

The control stack is the usual: room-temperature arbitrary waveform generators driving microwave pulses through attenuated coax, with readout via dispersive measurement on coupled resonators. From the user side, workloads come in through OpenQASM 3 and the standard SDKs — Qiskit, PennyLane, Cirq — so a chemistry team that has been prototyping on simulators or on cloud-accessed devices does not have to rewrite anything fundamental to run on the Tipperary machine.

The 100 figure is physical qubits, not logical qubits. I want to be precise about that because it matters for what we can and cannot do at first light. Surface-code logical qubits are on the roadmap, not the day-one menu. The first cohort of workloads will be NISQ-era — variational, hybrid quantum-classical, error-mitigated rather than error-corrected.
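To put numbers on the physical-versus-logical distinction, here is a back-of-envelope sketch using the standard rotated-surface-code counts. The threshold `p_th` and prefactor `a` below are illustrative textbook-style assumptions, not device specifications or roadmap claims:

```python
# Rough rotated-surface-code bookkeeping -- illustrative scaling only,
# not a statement about this machine's actual error-correction roadmap.

def physical_qubits_per_logical(d: int) -> int:
    """A distance-d rotated surface-code patch: d^2 data qubits plus
    d^2 - 1 measurement (ancilla) qubits."""
    return 2 * d * d - 1

def logical_error_rate(p: float, d: int, p_th: float = 1e-2, a: float = 0.1) -> float:
    """Common scaling heuristic: p_L ~ a * (p / p_th) ** ((d + 1) / 2).
    p_th and a are assumed constants for illustration."""
    return a * (p / p_th) ** ((d + 1) / 2)

# A single distance-7 patch already needs 97 physical qubits -- almost
# the entire 100-qubit device for one modestly protected logical qubit.
```

The point of the sketch: even one logical qubit at modest code distance consumes nearly the whole device, which is exactly why the first cohort is error-mitigated rather than error-corrected.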

Why climate chemistry is the right first cohort

The reason climate workloads sit at the top of the queue is not sentiment. It is fit-to-hardware. Variational quantum eigensolvers (VQE) and the quantum approximate optimisation algorithm (QAOA) are the two algorithm families with the most mature tooling on NISQ devices, and both map cleanly onto problems that matter for carbon removal and clean-energy discovery:

  • Ground-state energy of catalytic intermediates. Carbon-capture sorbents — amines, MOFs, calcium-loop variants — succeed or fail based on binding energies that classical DFT estimates approximately and CCSD(T) estimates expensively. VQE on a well-chosen active space can give a different angle on the same question.
  • Photovoltaic and photocatalytic excited states. Excited-state chemistry is where classical methods struggle hardest. Quantum subspace expansion methods built on top of VQE are genuinely interesting here.
  • Battery cathode and electrolyte screening. Multi-reference character in transition-metal oxides is the textbook case for "classical methods are uncomfortable here."
  • Grid and logistics optimisation. QAOA on grid-balancing and route-optimisation problems is more speculative on near-term hardware, but the formulation is honest and the classical baselines are well understood.
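The VQE pattern behind the first bullet is a classical optimiser steering a parameterised state toward the ground-state energy. Here is a toy two-level sketch in pure Python; the Hamiltonian matrix elements are made up, and a real workload replaces the exact `energy()` evaluation with shot-based estimates from the device:

```python
import math

# Toy VQE on a hypothetical 2x2 Hamiltonian -- illustrates the variational
# loop only, far below chemistry scale.
E0, E1, G = -1.0, 0.5, 0.25  # made-up diagonal and coupling terms

def energy(theta: float) -> float:
    """<psi(theta)|H|psi(theta)> for the one-parameter ansatz (cos t, sin t)."""
    c, s = math.cos(theta), math.sin(theta)
    return c * c * E0 + s * s * E1 + 2.0 * c * s * G

def vqe(steps: int = 2000, lr: float = 0.1) -> float:
    """Classical gradient descent closing the hybrid loop."""
    theta, eps = 0.3, 1e-6
    for _ in range(steps):
        grad = (energy(theta + eps) - energy(theta - eps)) / (2.0 * eps)
        theta -= lr * grad
    return energy(theta)

# Exact ground-state energy of the 2x2 matrix, for comparison.
exact_ground_state = (E0 + E1) / 2.0 - math.sqrt(((E0 - E1) / 2.0) ** 2 + G ** 2)
```

On hardware, each `energy()` call becomes thousands of shots with an error bar, which is where the optimiser's real difficulties begin.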

None of these are "quantum will solve climate change" claims. They are "this is a workload where a 100-qubit transmon machine, used carefully, can produce a result a classical workflow cannot easily produce, or can produce it with a different error profile." That is enough to be useful.

How the workloads actually flow into IMPT

IMPT runs an offset stack — a portfolio of supplier projects across removal, avoidance, and reduction categories. The platform's job is to source, verify, price, and bundle those supplier candidates into something a buyer can actually purchase against a defensible methodology. Most of that pipeline is classical: MRV data, registry integration, satellite imagery, supplier diligence.

The quantum-fed part is narrower and specific. When a candidate supplier is doing something materials-dependent — a novel sorbent, an enhanced-weathering mineral blend, a biochar feedstock with claimed durability — the underlying chemistry question can be queued onto Ireland Quantum 100 as a workload. The output is not "this offset is good" or "this offset is bad." The output is a chemistry result: a binding energy, a reaction-pathway barrier, a stability estimate. That result then goes back into the classical diligence model that IMPT already runs.

So the integration is unglamorous and correct: quantum is one input among many, sitting upstream of the supplier-quality model, contributing where the chemistry is genuinely hard and the classical answer is genuinely uncertain. It is not a separate product. It is a sharper instrument bolted onto a stack that already works.
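As a sketch of what "one input among many" can look like in code, here is a hypothetical structured hand-off and a weighted supplier model. All field names and weights are illustrative assumptions, not IMPT's actual schema or diligence model:

```python
from dataclasses import dataclass

@dataclass
class ChemistryResult:
    """Hypothetical structured hand-off from the quantum workload queue."""
    quantity: str          # e.g. "CO2 binding energy"
    value_ev: float        # point estimate
    uncertainty_ev: float  # error bar after mitigation
    method: str            # e.g. "VQE + zero-noise extrapolation"

def supplier_score(chemistry: float, additionality: float,
                   permanence: float, mrv_integrity: float) -> float:
    """Weighted blend of diligence factors, each pre-scaled to [0, 1].
    Chemistry is deliberately a minority weight: most of offset quality
    is not a chemistry problem. Weights are made up for illustration."""
    return (0.15 * chemistry + 0.35 * additionality +
            0.30 * permanence + 0.20 * mrv_integrity)
```

The structural point is the dataclass, not the weights: a binding energy with an uncertainty and a method string can be validated, versioned, and audited; a free-text claim cannot.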

The hardware-software wiring between Tipperary and the platform

Practically, the workflow looks like this:

  • Problem framing. A chemistry team — internal or external — defines an active space: which orbitals matter, which can be frozen, which symmetries can be exploited. This is the most important step and it is entirely classical. A badly framed VQE problem wastes shots no matter how good the hardware is.
  • Circuit compilation. The ansatz — typically hardware-efficient or UCCSD-derived — gets compiled against the heavy-hex topology. Transpilation chooses the qubit mapping, routes two-qubit gates, and inserts dynamical decoupling where idle qubits would otherwise dephase.
  • Execution and error mitigation. Runs go through with zero-noise extrapolation, probabilistic error cancellation, or readout-error mitigation depending on the workload. Shot counts are nontrivial; a serious VQE convergence can require hundreds of thousands to millions of circuit executions.
  • Classical post-processing. Energies and gradients feed back into the classical optimiser. The loop closes. Eventually the optimiser converges or the team decides it has enough information to make a chemistry call.
  • Hand-off to the IMPT diligence model. The chemistry result becomes a structured input — never a free-text claim — into the supplier-evaluation pipeline.
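The simplest error-mitigation technique in the execution step, zero-noise extrapolation, is easy to show end to end: run the same circuit at deliberately amplified noise levels (via gate folding or pulse stretching), fit the measured energies against the scale factor, and take the zero-noise intercept. A linear-fit sketch in pure Python, with made-up energies rather than device data:

```python
def zero_noise_extrapolate(scales, values):
    """Least-squares linear fit E(s) ~ a + b*s; return the intercept a,
    i.e. the estimate extrapolated to zero noise."""
    n = len(scales)
    mx = sum(scales) / n
    my = sum(values) / n
    b = sum((x - mx) * (y - my) for x, y in zip(scales, values)) / \
        sum((x - mx) ** 2 for x in scales)
    return my - b * mx

# Noise amplified at scale factors 1x, 2x, 3x; energies are invented.
scales = [1.0, 2.0, 3.0]
energies = [-1.02, -0.95, -0.88]  # hypothetical noisy VQE energies (hartree)
estimate = zero_noise_extrapolate(scales, energies)  # intercept near -1.09
```

Real mitigation uses richer fits (exponential, Richardson) and propagates the fit uncertainty into the reported error bar; the linear case is the minimum viable version.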

Sovereignty matters here. Climate-relevant chemistry results, supplier IP, and the reasoning chains around offset quality are not data I want sitting on someone else's cloud, in someone else's jurisdiction, under someone else's export rules. A machine on Irish soil, accessed through Irish infrastructure, gives the integration between IMPT and Ireland Quantum a property that cloud-quantum cannot give: the workload never leaves the country. For the longer argument on why that matters, the Ireland Quantum 100 overview goes into the sovereignty case in more depth.

What I am not willing to pretend

A few things I want to keep honest about, because the quantum space is not short of people willing to be less honest:

NISQ-era results are noisy. A VQE energy on a 100-qubit transmon with no logical encoding is going to have an error bar. Error mitigation narrows that bar but does not eliminate it. Any chemistry result coming out of the machine in 2027 will be reported with its uncertainty, not as a single number.
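Part of that error bar is purely statistical and easy to quantify before any hardware noise enters: estimating an expectation value from N shots carries a standard error of roughly sqrt(Var/N). A quick sketch, where the unit variance and the chemical-accuracy target are illustrative numbers, not device specifications:

```python
import math

def shot_standard_error(variance: float, shots: int) -> float:
    """Statistical error bar on a sampled expectation value."""
    return math.sqrt(variance / shots)

# To push a hypothetical unit-variance observable down to chemical
# accuracy (~1.6e-3 hartree), required shots scale as variance / accuracy^2:
shots_needed = 1.0 / (1.6e-3) ** 2  # roughly 3.9e5 shots per expectation value
```

That scaling is where the "hundreds of thousands to millions of circuit executions" figure for a serious VQE run comes from, before hardware error is even counted.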

Quantum advantage is workload-specific. Most chemistry problems are still better solved classically today. The right question is never "is quantum better?" — it is "is quantum better for this specific molecule, this specific active space, this specific question?" Often the answer is no, and the workload should not be queued at all.

Surface-code error correction is a multi-year arc. Logical qubits at useful code distances need physical-qubit counts well beyond 100. Ireland Quantum 100 is a starting point on that arc, not the destination. The roadmap is real and the timelines are public, but I am not going to pretend we will have day-one logical operations.

Offset quality is mostly not a chemistry problem. Most of what makes a carbon offset good or bad is additionality, permanence, leakage, MRV integrity, and supplier governance. Quantum chemistry helps on a narrow slice — the materials and reaction-pathway questions. The rest of the stack carries the weight, and always will.

Where to start this week

If you are a chemistry researcher with a workload you think might fit — carbon-capture sorbents, photovoltaic materials, battery chemistry, climate-relevant catalysis — the useful thing to do this week is frame the active space. Write down which orbitals matter, what classical method you have already tried, and what you would do with a more accurate ground-state energy if you had one. That document is the entry ticket to a serious conversation about queuing on the machine. The hardware arrives over the next twelve months; the workloads that will run at first light are being scoped now. If you have something climate-relevant that fits the profile, the climate workloads page is the right place to read what we are prioritising and how to put a workload forward.
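One concrete item in that framing document is the qubit budget. Under the common Jordan-Wigner mapping, each spatial orbital in the active space costs two qubits, one per spin orbital, so a (10e, 10o) active space needs 20 qubits before any symmetry tapering. A trivial helper to make the arithmetic explicit (illustrative only; real framing also weighs symmetry reductions and qubit tapering):

```python
def jordan_wigner_qubits(active_spatial_orbitals: int) -> int:
    """Qubits needed to represent an active space under the Jordan-Wigner
    mapping: two spin orbitals (alpha, beta) per spatial orbital."""
    return 2 * active_spatial_orbitals

# A (10e, 10o) active space fits in 20 qubits; roughly 50 spatial orbitals
# would saturate a 100-physical-qubit device before any overhead.
```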
