Thank you for the question. Quantum technologies are evolving so fast that they are changing the way we think about algorithms. In particular, there have been key results both in hardware and in quantum error correction. See, for example, Google's Willow processor (2024), which achieved below-threshold error correction, improving qubit stability. IBM is actively working to demonstrate the first quantum-centric supercomputer by integrating modular processors, middleware, and quantum communication, while also improving the quality, execution speed, and parallelization of quantum circuits.
In hybrid quantum-classical methods there is a continuous effort to reduce resource demands, improve optimization routines, and include tensor-network alternatives.
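To make the hybrid split concrete, here is a minimal toy sketch (not tied to any specific framework or experiment): a single-qubit "circuit" R_y(theta)|0> whose energy is minimised by a classical optimizer using the parameter-shift rule. The Hamiltonian (Z), the learning rate, and the iteration count are all illustrative assumptions.

```python
import numpy as np

# Toy hybrid quantum-classical loop: the "quantum" part evaluates the
# energy of a parameterized state, the classical part updates theta.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])  # illustrative Hamiltonian

def state(theta):
    # R_y(theta) applied to |0> gives (cos(theta/2), sin(theta/2))
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    # <psi| Z |psi> = cos(theta)
    psi = state(theta)
    return float(psi @ Z @ psi)

def parameter_shift_grad(theta, shift=np.pi / 2):
    # Parameter-shift rule: the exact gradient from two circuit runs
    return 0.5 * (energy(theta + shift) - energy(theta - shift))

theta = 0.1   # initial parameter (assumption)
lr = 0.4      # classical optimizer step size (assumption)
for _ in range(100):
    theta -= lr * parameter_shift_grad(theta)
# theta converges to pi, where the energy reaches the ground state -1
```

In a real workflow the `energy` call would be replaced by sampling a circuit on hardware or a simulator; the outer classical loop stays the same, which is why reducing the number of circuit evaluations per gradient step matters so much for resource demands.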
In HEP, we have collected the most interesting results here: REF.
Improvements in hardware quality allow us to run and test more complex circuits in terms of circuit depth, which means getting to more realistic examples.
This is true for quantum field theory calculations (mainly lattice-based) but also for data analysis and data generation. We can achieve better quantum field theory simulations via enhanced error mitigation.