Zoltán Kolarovszki, Ágoston Kaposi, Zoltán Zimborás, Michał Oszmaniec (Apr 17 2026).
Abstract: Photonic architectures are one of the leading platforms for demonstrating quantum computational advantage, with Boson Sampling and Gaussian Boson Sampling as the primary schemes. Yet for these photonic primitives we lack a systematic theoretical understanding of linear cross-entropy benchmarking (LXEB), a central tool for testing quantum advantage proposals. In this work, we develop a representation-theoretic framework for the classical computation of average LXEB scores and second moments of output probability distributions, covering a range of quantum advantage experiments based on scattering $n$-photon states through $m$-mode Haar-random interferometers. Our methods apply in any regime, including the saturated regime, where the (expected) number of photons is comparable to the number of optical modes. The same second-moment techniques also allow us to prove anticoncentration for traditional Fock-state Boson Sampling in the saturated regime. Interestingly, for Gaussian Boson Sampling, second moments are not sufficient to establish a meaningful anticoncentration statement. The technical core of our approach rests on decomposing two copies of the $n$-particle bosonic space $\mathrm{Sym}^n(\mathbb{C}^m)$ into irreducible representations of $\mathrm{U}(m)$. This reduces two-copy Haar averages to computing purities of initial states after partial traces over particles, highlighting the role that particle entanglement plays in LXEB and anticoncentration.
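The quantities the abstract refers to can be illustrated numerically. The sketch below is a plain Monte Carlo estimate (not the paper's representation-theoretic method) of the first and second moments of a single collision-free Boson Sampling output probability, $p(U) = |\mathrm{Perm}(U_{[n],[n]})|^2$ for input and output pattern $(1,\dots,1,0,\dots,0)$, averaged over $m$-mode Haar-random interferometers. The function names (`haar_unitary`, `moment_estimates`) and the parameter choices are illustrative assumptions.

```python
# Monte Carlo sketch of first and second moments of a Boson Sampling
# output probability over Haar-random interferometers. Illustrative only;
# the paper computes these averages analytically via representation theory.
import itertools
import numpy as np

def haar_unitary(m, rng):
    """Sample an m x m Haar-random unitary via QR of a Ginibre matrix."""
    z = (rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))  # column-phase fix makes the distribution Haar

def permanent(a):
    """Brute-force permanent; fine for the tiny n used here."""
    n = a.shape[0]
    return sum(np.prod([a[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

def moment_estimates(n=3, m=8, samples=2000, seed=0):
    """Estimate E[p] and E[p^2] for p = |Perm(U_{[n],[n]})|^2, U Haar on U(m)."""
    rng = np.random.default_rng(seed)
    probs = np.empty(samples)
    for k in range(samples):
        u = haar_unitary(m, rng)
        probs[k] = abs(permanent(u[:n, :n])) ** 2
    return probs.mean(), (probs ** 2).mean()

m1, m2 = moment_estimates()
print("E[p] ~", m1, "  E[p^2] ~", m2)
```

The ratio $\mathbb{E}[p^2]/\mathbb{E}[p]^2$ estimated this way is the kind of second-moment quantity that, when bounded by a constant, underlies anticoncentration arguments of the sort the abstract describes for Fock-state Boson Sampling.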

