Amira Abbas, Yanlin Chen, Tuyen Nguyen, Ronald de Wolf (Oct 07 2025).
Abstract: The technique of combining multiple votes to enhance the quality of a decision is the core of boosting algorithms in machine learning. In particular, boosting provably increases decision quality by combining multiple weak learners (hypotheses that are only slightly better than random guessing) into a single strong learner that classifies data well. Various boosting algorithms exist; we improve upon them by introducing QuantumBoost. Inspired by classical work by Barak, Hardt and Kale, our QuantumBoost algorithm achieves the best known runtime among boosting methods through two innovations. First, it uses a quantum algorithm to compute approximate Bregman projections faster. Second, it combines this with a lazy projection strategy, a technique from convex optimization where projections are performed infrequently rather than at every iteration. To our knowledge, QuantumBoost is the first algorithm, classical or quantum, to successfully adopt a lazy projection strategy in the context of boosting.
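
To make the two ingredients concrete, here is a small, purely classical sketch in the spirit of Barak–Hardt–Kale smooth boosting: exponential-weight updates over the training examples, a KL (Bregman) projection onto the set of nu-smooth distributions (the capped simplex), and a lazy schedule that runs the costly projection only every few rounds. The function names, the `weak_learner` interface, and the projection schedule are illustrative assumptions, not the paper's actual algorithm; QuantumBoost replaces the projection step with a faster quantum subroutine and comes with its own analysis.

```python
import numpy as np

def kl_project_capped_simplex(w, cap):
    """KL (Bregman) projection of positive weights w onto the capped simplex
    {p : sum(p) = 1, 0 <= p_i <= cap}. The projection has the form
    p_i = min(cap, c * w_i); we find it by repeatedly capping the largest
    entries and rescaling the rest. Plain classical routine, standing in
    for the paper's quantum subroutine."""
    w = np.asarray(w, dtype=float)
    capped = np.zeros(len(w), dtype=bool)
    while True:
        free_mass = 1.0 - cap * capped.sum()   # probability mass left for uncapped entries
        p = np.where(capped, cap, w * free_mass / w[~capped].sum())
        newly = (~capped) & (p > cap)
        if not newly.any():
            return p
        capped |= newly

def lazy_smooth_boosting(X, y, weak_learner, rounds=50, nu=0.1,
                         project_every=5, eta=0.5):
    """Toy classical analogue of smooth boosting with lazily applied projections.
    Hypothetical interface: labels y are in {-1, +1} and weak_learner(X, y, p)
    returns a callable h with predictions h(X) in {-1, +1}. The real
    QuantumBoost update rule and projection schedule may differ."""
    n = len(y)
    cap = 1.0 / (nu * n)        # smoothness cap: no example may carry weight above cap
    log_w = np.zeros(n)         # unnormalized log-weights (multiplicative-weights state)
    hypotheses = []
    for t in range(rounds):
        w = np.exp(log_w - log_w.max())            # numerically stable exponential weights
        if t % project_every == 0:
            p = kl_project_capped_simplex(w, cap)  # costly projection, done only occasionally
        else:
            p = w / w.sum()                        # cheap renormalization in between
        h = weak_learner(X, y, p)   # train a weak hypothesis on the current distribution
        margins = y * h(X)          # +1 on correctly classified examples, -1 on mistakes
        log_w -= eta * margins      # upweight the examples the weak learner got wrong
        hypotheses.append(h)
    return lambda Xq: np.sign(sum(h(Xq) for h in hypotheses))  # majority vote of weak hypotheses
```

In use, `weak_learner` could be anything that fits a slightly-better-than-random classifier on a weighted sample (e.g. a decision stump); the point of the sketch is only the order of operations the abstract describes: multiplicative-weight updates every round, the expensive Bregman projection only rarely.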