Ireland Quantum 100 · Technical brief

Synthetic photosynthesis and quantum chemistry

A leaf converts sunlight to chemical energy at conversion efficiencies that still embarrass our best photovoltaics on a per-photon basis, and it does so wet, warm, and cheap. The reason it works is partly chemistry and partly something stranger — the initial energy transfer through the antenna complex appears to exploit quantum coherence, with electronic excitations sampling multiple paths to the reaction centre simultaneously. If we want to copy that, classical molecular dynamics will only take us so far. The honest tools for the job are quantum chemistry methods, and increasingly, quantum hardware itself.

Why classical chemistry hits a wall on light-harvesting systems

The chlorophyll-protein complexes that drive natural photosynthesis — the Fenna-Matthews-Olson (FMO) complex in green sulphur bacteria is the canonical example — are not neat little molecules. They are pigments embedded in protein scaffolds, with electronic states that couple to vibrational modes of the surrounding environment. Modelling this means you need to handle electron correlation, vibronic coupling, and a non-Markovian bath, all at once, on systems with dozens of strongly interacting electrons.

The standard classical workhorses — density functional theory (DFT), coupled cluster, multi-reference methods like CASSCF and CASPT2 — each fail in characteristic ways. DFT functionals struggle with charge-transfer excited states, which are exactly the states that matter in synthetic chlorophyll analogues. Coupled cluster scales like O(N^7) for CCSD(T), which puts realistic chromophore arrays out of reach. CASPT2 handles the multi-reference character but the active space blows up as you add pigments. You end up choosing between a system small enough to compute and a system big enough to be interesting.
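To make the O(N^7) wall concrete, here is a back-of-envelope scaling exercise. The baseline timing is invented (assume, hypothetically, that a 20-orbital chromophore takes an hour); only the exponent is doing the work.

```python
# Back-of-envelope cost scaling for CCSD(T)'s O(N^7) wall.
# The one-hour baseline for 20 orbitals is an assumption, purely
# illustrative; everything else follows from the exponent alone.

def relative_cost_hours(n_orbitals, base_n=20, base_hours=1.0, exponent=7):
    """Scale a reference timing by (N / base_N)^exponent."""
    return base_hours * (n_orbitals / base_n) ** exponent

for n in (20, 40, 80, 160):
    print(f"{n:4d} orbitals -> {relative_cost_hours(n):12.0f} hours")
```

Doubling the orbital count multiplies the cost by 2^7 = 128; two doublings put you past a CPU-millennium. That is the sense in which realistic chromophore arrays are out of reach.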

This is the regime where quantum hardware has a credible angle. The variational quantum eigensolver (VQE) and its descendants encode the molecular Hamiltonian into qubits via Jordan-Wigner or Bravyi-Kitaev mappings, and the cost of representing strongly correlated states grows polynomially rather than exponentially. That's the theoretical pitch. The engineering reality is harder.
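As a sketch of what the Jordan-Wigner mapping actually produces, the following maps a single fermionic creation operator a_p† on an n-mode register onto its two Pauli strings, (Z_0 ⋯ Z_{p-1})(X_p − iY_p)/2. A real mapper (OpenFermion, Qiskit Nature) handles full Hamiltonians; this toy version only shows where the parity string, and hence the circuit-depth cost, comes from.

```python
# Minimal Jordan-Wigner sketch: a_p^dagger on n modes becomes a sum of
# two Pauli strings, (Z_0 ... Z_{p-1})(X_p - i Y_p)/2. The Z tail
# enforces fermionic anticommutation and is what makes JW strings long.

def jordan_wigner_creation(p, n_modes):
    """Return a_p^dagger as a list of (coefficient, Pauli string) pairs."""
    z_tail = "Z" * p                          # parity string on modes 0..p-1
    identity_head = "I" * (n_modes - p - 1)   # untouched modes above p
    x_term = (0.5, z_tail + "X" + identity_head)
    y_term = (-0.5j, z_tail + "Y" + identity_head)
    return [x_term, y_term]

for coeff, pauli in jordan_wigner_creation(2, 4):
    print(coeff, pauli)
```

Note the weight of each string grows with the orbital index p: that linearly growing Z tail is the JW overhead that the Bravyi-Kitaev mapping trades for logarithmic-weight strings.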

What artificial photosynthesis actually requires from a quantum computer

If you want to design a synthetic light-harvesting complex — say, a porphyrin-based artificial antenna feeding a catalytic centre that splits water or reduces CO₂ — there are a few specific calculations you genuinely need:

  • Excited-state energies and transition dipole moments for the candidate chromophore, accurate to roughly chemical accuracy (~1 kcal/mol) for relative energies.
  • Inter-pigment electronic couplings as a function of geometry, because that's what determines whether energy migrates coherently or hops incoherently.
  • Reorganisation energies for charge-transfer states at the donor-acceptor interface — a Marcus-theory ingredient that DFT routinely gets wrong by a factor of two.
  • Catalytic transition states on the water-oxidation or CO₂-reduction side, which are the rate-limiting steps and which involve transition metals with stubbornly multi-reference ground states.
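The Marcus-theory ingredient in the third bullet can be made concrete. The sketch below evaluates the standard non-adiabatic Marcus rate with invented donor-acceptor numbers, and shows that a factor-of-two error in the reorganisation energy λ shifts the predicted transfer rate several-fold even in this mild case:

```python
import math

HBAR_EV_S = 6.582119569e-16   # reduced Planck constant, eV*s
KT_300K_EV = 0.025852         # k_B * T at 300 K, eV

def marcus_rate(coupling_ev, delta_g_ev, lam_ev, kt_ev=KT_300K_EV):
    """Non-adiabatic Marcus electron-transfer rate (s^-1):
    k = (2*pi/hbar) |V|^2 (4*pi*lam*kT)^(-1/2) exp(-(dG+lam)^2 / (4*lam*kT))."""
    prefactor = (2 * math.pi / HBAR_EV_S) * coupling_ev ** 2
    gaussian = math.exp(-(delta_g_ev + lam_ev) ** 2 / (4 * lam_ev * kt_ev))
    return prefactor * gaussian / math.sqrt(4 * math.pi * lam_ev * kt_ev)

# Illustrative (made-up) interface numbers: V = 10 meV, dG = -0.3 eV.
good = marcus_rate(0.010, -0.30, lam_ev=0.30)  # lam = -dG: activationless
bad  = marcus_rate(0.010, -0.30, lam_ev=0.60)  # lam wrong by a factor of 2
print(f"rate ratio from a 2x error in lambda: {good / bad:.1f}")
```

Away from the activationless point the exponential makes the sensitivity far worse, which is why a DFT reorganisation energy that is off by two can send the synthesis lab after the wrong candidate.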

None of these are trivially classical. The catalytic transition states in particular — manganese clusters mimicking the oxygen-evolving complex, or cobalt and nickel centres for CO₂ reduction — are the kind of problems where a 100-physical-qubit machine, used carefully, can run VQE on active spaces that simply cannot be diagonalised classically.
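"Cannot be diagonalised classically" is, at bottom, combinatorics. Counting the determinants in a CAS(n_e, n_o) full-CI space (at S_z = 0) shows where exact diagonalisation dies:

```python
from math import comb

def fci_dimension(n_electrons, n_orbitals):
    """Number of determinants in a CAS(n_e, n_o) full-CI space at Sz = 0:
    choose alpha occupations times choose beta occupations."""
    n_alpha = n_electrons // 2
    n_beta = n_electrons - n_alpha
    return comb(n_orbitals, n_alpha) * comb(n_orbitals, n_beta)

for ne, no in [(4, 4), (12, 12), (24, 24), (40, 40)]:
    print(f"CAS({ne},{no}): {fci_dimension(ne, no):.3e} determinants")
```

CAS(24,24), roughly the scale of a manganese-cluster active space, is already ~10^12 determinants; CAS(40,40) is ~10^22 and unreachable by any classical diagonaliser, while 40 spatial orbitals map onto 80 qubits before reduction.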

Mapping the problem onto superconducting transmon hardware

Here is where I have to be honest about what the hardware does and doesn't do. A 100-physical-qubit superconducting transmon system, operating in a dilution refrigerator at sub-15 mK and laid out on a heavy-hex topology, is not running Shor's algorithm and it is not factoring RSA. What it can do is run noisy intermediate-scale quantum (NISQ) chemistry workloads — VQE, quantum subspace expansion, ADAPT-VQE — on Hamiltonians of perhaps 30 to 50 spin-orbitals after qubit reduction techniques like the frozen-core approximation and active-space selection.

The heavy-hex topology, inherited from the broader transmon design lineage, has a specific consequence for chemistry: the qubit connectivity graph does not match the all-to-all interactions you find in a fermionic Hamiltonian. So the compiled circuit pays a SWAP overhead. For a Jordan-Wigner mapped electronic Hamiltonian on a heavy-hex lattice, the SWAP networks dominate two-qubit gate count, and two-qubit gate fidelity is the binding constraint on circuit depth. This is why the realistic near-term workflow is:

  • Aggressive active-space reduction using classical CASSCF as a pre-processor.
  • Hardware-efficient ansätze that respect the native connectivity, rather than chemically-motivated ansätze like UCCSD which require deep circuits.
  • Error mitigation — zero-noise extrapolation, probabilistic error cancellation, Clifford data regression — applied to the measured expectation values.
  • Classical post-processing via quantum subspace expansion to recover excited states from a ground-state ansatz.
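The SWAP overhead behind the second bullet can be put in rough numbers. The identities below are standard (a CNOT ladder of 2(w−1) gates for a weight-w Pauli-string exponential, 3 CNOTs per SWAP); the weight and qubit span are invented, and real heavy-hex routing is more complicated than this line-topology toy:

```python
# Toy model of SWAP overhead when compiling exp(-i*theta*P) for a
# weight-w Jordan-Wigner Pauli string. All-to-all hardware pays only
# the CNOT ladder; on a line, idle qubits sitting inside the string's
# span must each be swapped out and back at 3 CNOTs per SWAP.

def cnot_count(weight, span, all_to_all=False):
    ladder = 2 * (weight - 1)          # standard CNOT ladder for exp(-i*theta*P)
    if all_to_all:
        return ladder
    idle = span - weight               # qubits in the span untouched by P
    return ladder + 2 * 3 * idle       # swap each idle qubit out, then back

w, s = 6, 14                           # invented string weight and span
print("all-to-all:", cnot_count(w, s, all_to_all=True))
print("line topology:", cnot_count(w, s))
```

Even this crude model shows the routing cost swamping the ladder itself, which is why two-qubit gate fidelity, not qubit count, sets the depth budget.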

This is hybrid quantum-classical chemistry. The SDK ecosystem — Qiskit Nature, PennyLane's qchem module, OpenFermion talking to Cirq — already supports this workflow. What's missing for most research groups isn't software; it's reliable access to hardware they can iterate on.
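Of the error-mitigation options listed above, zero-noise extrapolation is the easiest to see end-to-end: measure the observable at artificially stretched noise levels, then extrapolate back to zero noise. A minimal sketch, with synthetic "measurements" generated from an assumed linear noise model purely to exercise the fit:

```python
# Zero-noise extrapolation sketch: evaluate <H> at noise stretch factors
# lambda = 1, 2, 3 and fit E(lambda) = a + b*lambda; 'a' is the
# zero-noise estimate. The data below are synthetic, not from hardware.

def zne_linear(lambdas, energies):
    """Least-squares linear fit; return the lambda -> 0 intercept."""
    n = len(lambdas)
    mean_x = sum(lambdas) / n
    mean_y = sum(energies) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(lambdas, energies)) \
        / sum((x - mean_x) ** 2 for x in lambdas)
    return mean_y - slope * mean_x

# Synthetic data: "true" energy -1.137, bias +0.04 per unit of stretch.
lams = [1.0, 2.0, 3.0]
noisy = [-1.137 + 0.04 * lam for lam in lams]
print(f"extrapolated energy: {zne_linear(lams, noisy):.3f}")
```

On hardware the noise is rarely this linear, which is why production workflows also try Richardson and exponential extrapolants and cross-check against Clifford data regression.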

Quantum coherence in light harvesting: what we're actually copying

The "quantum photosynthesis" story has been overcooked in the popular press, so let me put it plainly. The original 2007 evidence for long-lived electronic coherence in FMO has since been re-interpreted: a lot of what looked like electronic coherence was vibrational coherence in the protein scaffold. The modern view is that vibronic coupling — the mixing of electronic and vibrational degrees of freedom — is what gives natural light-harvesting its robustness. The bath is not just noise; it's a participant.

That matters for synthetic design. A pure electronic-coherence picture says "isolate the chromophores, eliminate vibrations". A vibronic picture says "engineer the vibrational modes, tune them into resonance with the electronic gaps". These are opposite design strategies, and getting it wrong wastes years of synthetic-chemistry effort.

To simulate vibronic systems on quantum hardware, you extend the electronic Hamiltonian with bosonic modes for the vibrations. There are encodings — binary, unary, Gray-code — that map a truncated bosonic Hilbert space onto qubits. The qubit cost grows quickly, which is why this remains an active research frontier rather than a production workflow. But it's the right frontier to be working on if your target is a synthetic chlorophyll that actually exploits the same tricks the real one does.
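To see why the qubit cost grows quickly, count qubits per truncated vibrational mode under the two simplest encodings (Gray-code uses the same qubit count as binary; the two differ in gate cost, not storage):

```python
from math import ceil, log2

def boson_qubits(levels, encoding):
    """Qubits to store one vibrational mode truncated to `levels` Fock states."""
    if encoding == "binary":
        return ceil(log2(levels))   # compact: log-many qubits
    if encoding == "unary":
        return levels               # one qubit per Fock level
    raise ValueError(f"unknown encoding: {encoding}")

for d in (4, 8, 16):
    print(f"{d:2d} levels: binary {boson_qubits(d, 'binary')}, "
          f"unary {boson_qubits(d, 'unary')}")
```

A vibronic model with, say, a dozen modes at eight levels each adds 36 qubits in binary encoding or 96 in unary, on top of the electronic register. That arithmetic is why vibronic simulation is a frontier rather than a product.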

Where Ireland Quantum 100 fits

The machine we are bringing online in Tipperary is built for exactly this cohort of workloads. The first-light single-qubit milestone is the start of a calibration ladder — randomised benchmarking, gate-set tomography, then progressively deeper VQE circuits — that ends with multi-qubit chemistry workloads accessible to academic and industrial users. Climate-relevant chemistry, including artificial photosynthesis catalyst screening, is in the priority cohort precisely because it's a domain where 100 physical qubits with good calibration can do useful work that classical clusters cannot do in reasonable time.

The realistic near-term contribution is screening: take a library of candidate porphyrin and phthalocyanine derivatives, run VQE on the active space of each candidate's lowest charge-transfer excited state, rank them by a figure of merit that combines transition dipole, charge-transfer energy, and reorganisation energy. The ranking goes back to the synthesis lab. The synthesis lab makes the top three. The cycle repeats. This is unglamorous and it is exactly what the hardware is good for. More on the broader programme is on the Ireland Quantum 100 page, and the specific climate workloads are catalogued on the climate-science applications sub-page.
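The screening loop is simple enough to sketch. Everything below — the figure-of-merit weighting, the candidate names, the numbers — is hypothetical; the point is the shape of the pipeline, not the values:

```python
# Hypothetical candidate-screening loop. Inputs per candidate are the
# three VQE-derived quantities from the text: transition dipole (a.u.),
# charge-transfer excitation energy (eV), reorganisation energy (eV).
# Weights and target energy are invented; a real campaign calibrates both.

def figure_of_merit(dipole, ct_energy_ev, reorg_ev,
                    target_ev=1.8, weights=(1.0, 1.0, 1.0)):
    """Bigger is better: strong absorption, CT energy near the target
    gap, small reorganisation penalty."""
    w_d, w_e, w_l = weights
    return (w_d * dipole
            - w_e * abs(ct_energy_ev - target_ev)
            - w_l * reorg_ev)

candidates = {                       # made-up (dipole, CT energy, reorg)
    "porphyrin-A":      (6.2, 1.75, 0.25),
    "porphyrin-B":      (5.1, 1.95, 0.18),
    "phthalocyanine-C": (7.0, 2.30, 0.40),
}
ranked = sorted(candidates,
                key=lambda name: figure_of_merit(*candidates[name]),
                reverse=True)
print("synthesise first:", ranked[0])
```

The quantum hardware's only job in this loop is producing the three numbers per candidate; the ranking, and the decision of what to make next, stays classical.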

The error-correction horizon

None of the above gets you to fault-tolerant chemistry. For that you need the surface code, logical qubit encodings with code distance 7 or higher, and physical-to-logical ratios in the thousands. A 100-physical-qubit machine gives you maybe one or two distance-3 logical qubits if you push it — enough to demonstrate the encoding, not enough to run a Trotterised quantum phase estimation on a real catalyst. The honest roadmap is: NISQ chemistry now, surface-code demonstrations on the same hardware, and full fault-tolerant chemistry on the generation that comes after this one. Anyone selling you a different timeline is selling you something.
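The physical-qubit arithmetic behind that paragraph uses the textbook rotated-surface-code count: d² data qubits plus d² − 1 measurement qubits per logical qubit. (Real heavy-hex layouts, routing, and lattice-surgery ancillas cost substantially more, which is why a 100-qubit machine yields only one or two usable distance-3 logical qubits rather than the naive five.)

```python
def surface_code_physical(distance):
    """Physical qubits for one rotated-surface-code logical qubit:
    d^2 data qubits + (d^2 - 1) measurement qubits = 2*d^2 - 1."""
    return 2 * distance ** 2 - 1

for d in (3, 7, 25):
    print(f"d={d:2d}: {surface_code_physical(d):5d} physical qubits per logical")
```

At distance 7 a single logical qubit consumes essentially the whole 100-qubit machine; at the distances fault-tolerant chemistry plausibly needs, the per-logical cost is in the thousands, matching the ratios quoted above.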

Where to start this week

If you're a chemist who wants to be ready when this hardware comes online: install Qiskit Nature or PennyLane, pick a chromophore you already understand classically, and run VQE on a small active space (4 electrons in 4 orbitals is enough to learn the workflow) against a noisy simulator. Compare the answer to your CASSCF reference. Get a feel for how active-space choice, ansatz depth, and noise model interact. The skill you're building is not "quantum programming"; it's intuition for what these machines can and can't do on real molecules. That intuition is what will let you ask the right question of the hardware on day one, instead of the third year.
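If you want the workflow in miniature before touching any SDK, here is a toy "VQE" in plain Python: a one-parameter ansatz against a made-up 2×2 Hamiltonian, optimised by brute-force scan as a stand-in for SPSA or COBYLA, and checked against the exact eigenvalue. None of this is a real molecule; the point is the ansatz → energy → optimise → compare loop.

```python
import math

# Toy VQE loop. Ansatz |psi(t)> = (cos t, sin t); Hamiltonian
# H = [[E, V], [V, -E]] with invented parameters. Exact ground-state
# energy is -sqrt(E^2 + V^2), so the "experiment" can be verified.

E, V = 0.5, 0.3   # made-up Hamiltonian parameters

def energy(theta):
    """<psi(theta)|H|psi(theta)>, expanded by hand for the 2x2 case."""
    c, s = math.cos(theta), math.sin(theta)
    return E * (c * c - s * s) + 2 * V * c * s

# Crude optimiser: scan theta over [0, 2*pi) in steps of 1e-4.
best_theta = min((k * 1e-4 for k in range(62832)), key=energy)
exact = -math.sqrt(E * E + V * V)   # exact ground state of the 2x2 H
print(f"VQE estimate: {energy(best_theta):.6f}   exact: {exact:.6f}")
```

Swapping this for Qiskit Nature or PennyLane replaces the 2×2 matrix with a qubit-mapped molecular Hamiltonian and the scan with a real optimiser, but the loop, and the habit of checking against a classical reference, is identical.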

Research collaboration or early access

Book a research call →