Abstract: We present a quantum algorithm for simulating the time evolution generated by any bounded, time-dependent operator $-A(t)$ with non-positive logarithmic norm, thereby serving as a natural generalization of the Hamiltonian simulation problem. Our method generalizes the recent Linear-Combination-of-Hamiltonian-Simulation (LCHS) framework. When $A$ is time-independent, we provide a block-encoding of the evolution operator $e^{-At}$ using $\mathcal{O}(t\log(1/\epsilon))$ queries to the block-encoding oracle for $A$. We also show how the normalized evolved state can be prepared with $\mathcal{O}(1/\|e^{-At}|u_0\rangle\|)$ queries to the oracle that prepares the normalized initial state $|u_0\rangle$. These complexities are optimal in all parameters and improve the error scaling over prior results. Furthermore, we show that no improvement of our approach beyond a constant factor of approximately 3 is possible. For general time-dependent operators $A(t)$, we also prove that a uniform trapezoidal rule applied to our LCHS construction converges exponentially, leading to simplified quantum circuits with improved gate complexity compared to prior nonuniform-quadrature methods.
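As a classical numerical sketch (not the paper's quantum algorithm), the time-independent LCHS identity $e^{-At} = \int_{\mathbb{R}} \frac{1}{\pi(1+k^2)} e^{-it(kL+H)}\,dk$, where $A = L + iH$ with Hermitian $L \succeq 0$ and Hermitian $H$, can be checked with exactly the kind of uniform trapezoidal rule the abstract refers to. The Cauchy kernel below is the original LCHS weight; the specific matrix, truncation $K$, and node count are illustrative choices, not values from the paper.

```python
# Sketch: verify the LCHS identity e^{-At} = ∫ dk/(π(1+k²)) · e^{-it(kL+H)}
# for A = L + iH (L Hermitian PSD, H Hermitian), via a uniform trapezoidal
# rule on the truncated interval [-K, K]. Illustrative parameters only.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n, t = 2, 1.0

# Random A whose Hermitian part L is positive semidefinite, so that -A has
# non-positive logarithmic norm and the identity applies.
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
L = B @ B.conj().T / 4            # Hermitian, PSD
H = (B - B.conj().T) / 2j         # Hermitian
A = L + 1j * H

exact = expm(-A * t)

K, m = 300.0, 6001                # truncation radius and number of uniform nodes
ks, h = np.linspace(-K, K, m, retstep=True)
approx = np.zeros((n, n), dtype=complex)
for j, k in enumerate(ks):
    w = h if 0 < j < m - 1 else h / 2                     # trapezoidal weights
    approx += w / (np.pi * (1 + k**2)) * expm(-1j * t * (k * L + H))

err = np.linalg.norm(approx - exact, 2)
print(f"spectral-norm error: {err:.2e}")
```

With the heavy-tailed Cauchy kernel the residual here is dominated by the $\mathcal{O}(1/K)$ truncation of the integration interval; the paper's faster-decaying kernels are what make the truncated uniform trapezoidal rule converge exponentially.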