The chemistry of pulling CO₂ out of ambient air is, at its heart, an electron-correlation problem. Amines bind CO₂ through a delicate dance of lone-pair donation, proton transfer, and zwitterion formation, and the energy differences that decide whether a sorbent regenerates at 80°C or 120°C are the same energy differences that classical density functional theory routinely gets wrong by 5–10 kcal/mol. That gap matters: it is the difference between a viable direct-air-capture plant and one that burns more energy than it sequesters. Quantum hardware is one of the few tools that can, in principle, close it.
Why DAC chemistry is hard for classical methods
Direct air capture works against the second law. Atmospheric CO₂ sits at roughly 420 ppm, which means the sorbent has to be selective enough to grab one molecule in 2,400 while ignoring water vapour, oxygen, and nitrogen. The dominant chemistries are aqueous hydroxides (KOH, NaOH), solid amines on silica or MOF supports, and a handful of newer guanidines and phase-change solvents. Each of these involves bond-making and bond-breaking events where the transition state is multi-reference in character — that is, a single Slater determinant is a poor description of the electronic structure.
The standard classical workhorse, DFT with B3LYP or ωB97X-D, handles ground-state geometries reasonably well but performs poorly on:
- Activation barriers for carbamate and bicarbonate formation
- Spin-state energetics in metal-organic framework binding sites
- Dispersion-dominated weak binding in physisorption regimes
- Proton-transfer transition states in zwitterionic intermediates
Coupled-cluster methods like CCSD(T) — the so-called "gold standard" — scale as O(N⁷) with the number of orbitals. For a primary amine like monoethanolamine reacting with CO₂ and a few explicit waters, you can do it. For a realistic active-site cluster of a tetraamine grafted onto a silica surface with a hydration shell, you cannot, not on any classical machine that exists or is planned.
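To make that scaling wall concrete, here is a back-of-envelope sketch in Python. The orbital counts are assumed round numbers, not results for any specific basis set; only the ratio follows from the O(N⁷) scaling law.

```python
# CCSD(T) cost grows as O(N^7) in the number of orbitals N.
# Absolute orbital counts below are assumed; the ratio is what matters.

def relative_cost(n_orbitals: int, n_ref: int = 100) -> float:
    """Cost of an N-orbital CCSD(T) run relative to a reference size."""
    return (n_orbitals / n_ref) ** 7

# Monoethanolamine + CO2 + a few explicit waters: ~100 orbitals (assumed).
small = relative_cost(100)   # 1.0 by construction
# Grafted tetraamine cluster + hydration shell: ~500 orbitals (assumed).
large = relative_cost(500)

print(f"{large / small:.0f}x the compute")  # 5**7 = 78125x
```

A 5x larger cluster costs roughly 78,000x the compute, which is why "just use a bigger machine" stops being an answer.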
What a transmon machine actually computes
A 100-physical-qubit superconducting transmon processor doesn't run chemistry directly. It runs parameterised circuits — typically variational quantum eigensolver (VQE) ansätze, or more recently quantum subspace expansion and quantum Krylov methods — that encode a molecular Hamiltonian onto qubits via a fermion-to-qubit mapping (Jordan-Wigner, Bravyi-Kitaev, or parity).
For a chemistry workload you typically follow this pipeline:
- Choose an active space — the subset of orbitals that actually matter for the reaction. For an amine + CO₂ system, this is usually 8–16 orbitals around the nitrogen lone pair, the CO₂ π system, and any participating O-H bonds.
- Build the second-quantised Hamiltonian using a classical SCF reference, often from PySCF or Psi4.
- Map to qubits. With Jordan-Wigner, an active space of N spin-orbitals needs N qubits, so a (16e, 16o) CASCI-equivalent calculation (16 spatial orbitals, hence 32 spin-orbitals) needs 32 qubits before any error correction.
- Choose an ansatz — UCCSD, hardware-efficient, or ADAPT-VQE — that respects the heavy-hex connectivity of the chip.
- Optimise the parameters classically while the quantum device evaluates expectation values.
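The mapping step in the pipeline above can be sketched from scratch for a toy case. This is a hand-rolled numpy construction of the Jordan-Wigner operators, not a call into any particular toolkit, and it uses 4 spin-orbitals rather than 32 purely so the matrices stay small:

```python
import numpy as np

# Jordan-Wigner: spin-orbital j maps to qubit j, and the annihilation
# operator becomes a string of Z on qubits < j, sigma-minus on qubit j,
# and identity elsewhere.
I = np.eye(2)
Z = np.diag([1.0, -1.0])
sigma_minus = np.array([[0.0, 1.0], [0.0, 0.0]])  # lowers |1> to |0>

def jw_annihilation(j: int, n_qubits: int) -> np.ndarray:
    op = np.eye(1)
    for k in range(n_qubits):
        factor = Z if k < j else (sigma_minus if k == j else I)
        op = np.kron(op, factor)
    return op

n = 4  # 4 spin-orbitals -> 4 qubits; the 32-qubit case is the same construction
a0, a1 = jw_annihilation(0, n), jw_annihilation(1, n)

# The qubit-side operators inherit the fermionic algebra:
anti = a0 @ a1 + a1 @ a0
print(np.allclose(anti, 0))                 # True: {a_0, a_1} = 0
num_op = a0.conj().T @ a0                   # number operator for orbital 0
print(np.allclose(num_op @ num_op, num_op)) # True: occupation is 0 or 1
```

The point of the check is that once the anticommutation relations survive the mapping, any second-quantised Hamiltonian built from these operators is a legitimate qubit Hamiltonian.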
On the Ireland Quantum 100 hardware, the cryostat sits below 15 mK at the mixing-chamber stage, which is what keeps the transmons in their computational subspace and suppresses thermal photon population in the readout resonators. The heavy-hex topology — three-coordinated qubits arranged in a hexagonal lattice — is a deliberate trade-off: lower connectivity than all-to-all, but also dramatically lower frequency-collision rates and crosstalk, which in turn makes the gate fidelities good enough to run the deeper circuits chemistry needs.
The amine-sorbent problem in detail
Take a concrete case: a secondary amine reacting with CO₂ in the presence of one water molecule. The accepted mechanism goes through a zwitterion intermediate (R₂NH⁺-COO⁻) that then loses a proton, either intramolecularly to a second amine or to water, forming a carbamate or bicarbonate. The rate-limiting step depends on the local environment, and the regeneration energy — how much heat you need to release the CO₂ for storage — depends on the relative stability of those products.
For a sorbent designer, the questions are:
- What's the binding enthalpy at the active site?
- What's the activation energy for the rate-limiting proton transfer?
- How does substituting the amine backbone (primary → secondary → hindered tertiary) shift those numbers?
- What does adding humidity actually do to the kinetics?
These are exactly the questions where multi-reference quantum chemistry pays off, and exactly the questions where current DFT-screened candidate libraries leave money on the table. A reasonable path on a 100-qubit machine is to use VQE or quantum Krylov on a (10e, 10o) or (12e, 12o) active space around the bond-forming region, embedded in a classical DFT environment via projector-based embedding or DMET. You don't need to put the whole molecule on the quantum chip — you need to put the part DFT gets wrong on the quantum chip.
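The bookkeeping behind "put only the hard part on the chip" is easiest to see in the simplest subtractive flavour of embedding (an ONIOM-style correction rather than the projector-based or DMET machinery itself). Every energy below is a hypothetical placeholder, chosen only to show the arithmetic:

```python
# Subtractive embedding: treat the whole system at the cheap level, then
# correct only the active region at the expensive level.
# All values are hypothetical placeholders in hartree.

E_low_full = -512.302    # DFT on the whole sorbent cluster (hypothetical)
E_low_active = -263.118  # DFT on the amine+CO2 active region (hypothetical)
E_high_active = -263.131 # quantum solver on the same region (hypothetical)

# Total = cheap everywhere + (expensive - cheap) on the active region only.
E_embedded = E_low_full + (E_high_active - E_low_active)
print(f"{E_embedded:.3f} Eh")  # the quantum device only paid for the active part
```

The quantum resource cost is set by the active region alone, which is what makes a (10e, 10o) or (12e, 12o) calculation on a 100-qubit machine relevant to a system far too large to load onto it whole.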
What 100 physical qubits is and isn't enough for
I want to be honest about the regime. One hundred physical transmons, pre-error-correction, is not going to factor RSA-2048 and it is not going to simulate a full nitrogenase active site overnight. What it can do — with good calibration, dynamical decoupling, and modern error-mitigation techniques like zero-noise extrapolation and probabilistic error cancellation — is run chemistry problems in the 30–50 qubit active-space range that are genuinely beyond the reach of CCSD(T) on a workstation, and competitive with the largest CCSD(T) calculations done on national HPC clusters.
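Of the mitigation techniques mentioned, zero-noise extrapolation is the easiest to see in miniature: run the same circuit at deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. The data points below are synthetic and assume a linear noise model, which real devices only approximate:

```python
import numpy as np

# Zero-noise extrapolation with a linear fit. The measured energies are
# synthetic placeholders; on hardware they would come from repeated runs
# at each noise-amplification factor.
scale_factors = np.array([1.0, 2.0, 3.0])      # noise amplification factors
measured = np.array([-1.012, -0.948, -0.884])  # hypothetical <H> in hartree

# Fit E(s) = slope*s + intercept and read off the s = 0 value:
slope, intercept = np.polyfit(scale_factors, measured, 1)
print(f"mitigated energy ~ {intercept:.3f} Eh")  # ~ -1.076 Eh
```

In practice one also compares linear against exponential or Richardson fits, because the extrapolated value can be sensitive to the assumed noise model.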
The honest framing is that we are in the late-NISQ-to-early-fault-tolerant transition. The surface-code roadmap — encoding one logical qubit in roughly a thousand physical qubits at the relevant code distance — is real, and it's where the dramatic chemistry speedups (full quantum phase estimation on industrial-scale Hamiltonians) eventually live. But the bridge to that future is built one calibrated chip at a time, and the chemistry you can extract from a well-run pre-FT machine is already useful for ranking candidate sorbents that classical screening can't separate.
This is part of why our sovereign quantum programme is dedicated first to climate workloads rather than spreading thinly across cryptography, finance, and ML benchmarks. Concentration of effort matters more than qubit count alone.
Integrating with materials discovery pipelines
A quantum machine is not a magic oracle; it is one stage in a screening funnel. The realistic workflow for finding better DAC sorbents looks like this:
- Stage 1 — generative or combinatorial enumeration. Classical ML, often a graph neural network or transformer trained on QM9 / OE62-type datasets, proposes thousands of candidate amine and MOF structures.
- Stage 2 — DFT screening. Cheap functionals filter for geometric feasibility, basic binding, thermal stability.
- Stage 3 — quantum embedding. The top tens-to-hundreds of candidates get the active-site treatment on the quantum machine, refining binding energies and barriers.
- Stage 4 — molecular dynamics. Classical MD with quantum-corrected force-field parameters, looking at humidity, degradation, and cycling stability.
- Stage 5 — synthesis and bench testing. The honest answer to whether the chemistry was right.
Stages 3 and 4 are where the quantum and HPC sides talk to each other through OpenQASM 3 circuit submission and Qiskit Runtime-style primitives, with results piped into PennyLane or a custom embedding driver for the next iteration. Cirq, Qiskit, and PennyLane all expose the abstractions you need; the choice is largely a matter of which ecosystem your computational chemists are already fluent in.
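The interchange boundary between the HPC and quantum sides is, at bottom, text. As a minimal sketch (not tied to any particular runtime client), here is a hardware-efficient ansatz layer emitted as a plain OpenQASM 3 string; the rotation angles would come from the classical optimiser, and the values here are dummies:

```python
# Build an OpenQASM 3 program for one ry-rotation layer followed by a
# linear CNOT entangling layer. Angles are dummy values; in the real loop
# the optimiser supplies them on each iteration.

def ansatz_qasm(thetas: list[float]) -> str:
    lines = [
        "OPENQASM 3.0;",
        'include "stdgates.inc";',
        f"qubit[{len(thetas)}] q;",
        f"bit[{len(thetas)}] c;",
    ]
    for i, theta in enumerate(thetas):
        lines.append(f"ry({theta}) q[{i}];")     # single-qubit rotation layer
    for i in range(len(thetas) - 1):
        lines.append(f"cx q[{i}], q[{i + 1}];")  # linear entangling layer
    lines.append("c = measure q;")               # broadcast measurement
    return "\n".join(lines)

program = ansatz_qasm([0.1, 0.2, 0.3])
print(program.splitlines()[0])  # OPENQASM 3.0;
```

A linear CNOT chain is also a deliberate choice here: it maps onto a heavy-hex lattice without any SWAP overhead, unlike an all-to-all entangling pattern.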
Where this connects to actual carbon removal
The point of doing better DAC chemistry on a quantum machine is not to publish papers — it's to lower the regeneration energy of real sorbents in real plants, because the cost per tonne of CO₂ removed is dominated by that energy. A 10% reduction in regeneration enthalpy translates fairly directly into a 10% reduction in operating cost for the capture stage, and that's before you account for the second-order effects on plant footprint and equipment lifetime.
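The arithmetic linking molar chemistry to plant economics is short enough to write out. The 70 kJ/mol figure below is an assumed ballpark for an amine sorbent, not a measured value for any specific material, and it counts only the reaction enthalpy, ignoring sensible heat and water evaporation:

```python
# Convert an assumed molar regeneration enthalpy into energy per tonne of CO2.
M_CO2 = 44.01e-3  # kg/mol, molar mass of CO2
dH = 70e3         # J/mol, assumed regeneration enthalpy (reaction term only)

energy_per_tonne = dH / M_CO2 * 1000 / 1e9  # GJ per tonne of CO2

print(f"{energy_per_tonne:.2f} GJ/t")        # 1.59 GJ/t
# A 10% enthalpy reduction scales the capture-stage energy linearly:
print(f"{0.9 * energy_per_tonne:.2f} GJ/t")  # 1.43 GJ/t
```

Multiply that saving across a plant removing hundreds of thousands of tonnes a year and the value of a few kcal/mol of chemical accuracy becomes very tangible.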
That's also why we have wired the sovereign quantum programme into IMPT's offset stack. Sorbents that emerge from the pipeline as credible candidates feed back into the supplier-screening process for IMPT's climate-science workloads, so the chemistry doesn't just sit in a journal — it ends up evaluated against operating projects with measurable removal numbers.
Where to start this week
If you're a computational chemist or a sorbent group lead and you want to be ready when the machine comes online: pick one well-characterised amine