Bence Bakó, Zoltán Kolarovszki, Zoltán Zimborás (Nov 19 2025).
Abstract: Quantum generative learning is a promising application of quantum computers, but it faces several trainability challenges, including the difficulty of estimating gradients experimentally. For certain structured quantum generative models, however, expectation values of local observables can be computed efficiently on a classical computer, enabling fully classical training without quantum gradient evaluations. Although training is classically efficient, sampling from these circuits is still believed to be classically hard, so inference must be carried out on a quantum device, potentially yielding a computational advantage. In this work, we introduce Fermionic Born Machines as an example of such classically trainable quantum generative models. The model employs parameterized magic states and fermionic linear optical (FLO) transformations with learnable parameters. Training exploits a decomposition of the magic states into Gaussian operators, which permits efficient estimation of expectation values. Furthermore, the specific structure of the ansatz induces a loss landscape with favorable characteristics for optimization. During inference, the FLO circuits can be implemented on qubit architectures via fermion-to-qubit mappings to sample from the learned distribution. Numerical experiments on systems of up to 160 qubits demonstrate the effectiveness of our model and training framework.
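The abstract mentions that the FLO circuits are implemented on qubit hardware via fermion-to-qubit mappings. As an illustrative aside (not the authors' implementation), the standard Jordan-Wigner mapping can be sketched as follows: each fermionic annihilation operator becomes a string of Pauli-Z operators followed by a single-qubit lowering operator, and the resulting matrices satisfy the canonical anticommutation relations.

```python
import numpy as np

# Single-qubit building blocks
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])
sigma_minus = np.array([[0.0, 1.0], [0.0, 0.0]])  # |0><1|, the qubit lowering operator

def jordan_wigner_annihilation(j, n):
    """Qubit-space matrix of the fermionic annihilation operator a_j on n modes:
    a Z-string on qubits 0..j-1, sigma_minus on qubit j, identity elsewhere."""
    op = np.array([[1.0]])
    for k in range(n):
        if k < j:
            op = np.kron(op, Z)        # parity string
        elif k == j:
            op = np.kron(op, sigma_minus)
        else:
            op = np.kron(op, I2)
    return op

n = 3
a = [jordan_wigner_annihilation(j, n) for j in range(n)]

# Check the canonical anticommutation relations: {a_j, a_k^dag} = delta_jk * I
for j in range(n):
    for k in range(n):
        anti = a[j] @ a[k].conj().T + a[k].conj().T @ a[j]
        expected = np.eye(2**n) if j == k else np.zeros((2**n, 2**n))
        assert np.allclose(anti, expected)
```

This verifies that the mapped operators reproduce fermionic statistics on qubits; the paper's specific choice of mapping and circuit compilation may differ.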