Hi CERN team, thanks so much for organising this challenge. I have a generic question for you at this stage. What can you do today with quantum algorithms that was simply out of reach three years ago? Are there specific breakthroughs in error mitigation, variational methods, or quantum subroutines that have made a real difference in your research?
Thank you for the question. Quantum technologies are evolving so fast that this affects the way we think about algorithms. In particular, there have been key results in hardware improvements as well as in quantum error correction. See for example Google's Willow processor (2024), where they achieved below-threshold error correction, improving qubit stability. IBM is actively working on demonstrating the first quantum-centric supercomputer by integrating modular processors, middleware, and quantum communication, and on enhancing the quality, execution speed, and parallelization of quantum circuits.
In hybrid quantum-classical methods there is a continuous effort to reduce resource demands, improve optimization routines, and include tensor-network alternatives.
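To make the hybrid quantum-classical idea concrete, here is a minimal sketch (not CERN QTI code) of the variational loop behind methods such as VQE: a classical optimizer tunes the parameters of a small ansatz whose energy expectation is, for simplicity, simulated exactly with NumPy. The one-qubit Hamiltonian, the RY ansatz, and the COBYLA optimizer are arbitrary illustrative choices.

```python
# Minimal hedged sketch of a hybrid quantum-classical (VQE-style) loop.
# The "quantum" part is simulated exactly with NumPy; on real hardware this
# expectation value would come from repeated circuit measurements.
import numpy as np
from scipy.optimize import minimize

# Toy one-qubit Hamiltonian H = 0.5*Z + 0.3*X (illustrative numbers only).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = 0.5 * Z + 0.3 * X

def ansatz_state(theta):
    """|psi(theta)> = RY(theta)|0>, a one-parameter ansatz."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def energy(params):
    """Expectation value <psi|H|psi>, the quantity a quantum device would estimate."""
    psi = ansatz_state(params[0])
    return np.real(psi.conj() @ H @ psi)

# Classical outer loop: a standard optimizer drives the (simulated) quantum evaluations.
result = minimize(energy, x0=[0.1], method="COBYLA")
exact = np.linalg.eigvalsh(H)[0]
print(f"variational energy: {result.fun:.4f}, exact ground state: {exact:.4f}")
```

The point of the sketch is only the division of labour: the quantum resource evaluates the cost function, the classical optimizer decides the next parameters.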
In HEP we have collected the most interesting results here: REF.
Improvements in hardware quality allow us to run and test more complex circuits in terms of circuit depth, which means getting to more realistic examples.
We were able to perform a real calculation of a non-trivial cross section on quantum hardware: 10.1088/2058-9565/ada9c5. Regarding generative models, an interesting direction for QML is quantum Boltzmann machines (QBMs): https://arxiv.org/abs/2410.16363
This is true for quantum field theory calculations (mainly lattice-based), but also for data analysis and data generation. We can get to better quantum field theory simulations via enhanced error mitigation.
Quantum computers are, in principle, a natural framework for simulating particle interactions and QFTs in HEP due to their ability to efficiently handle quantum states and entanglement. Traditional simulations of QFTs rely on lattice-based methods (like lattice QCD) and perturbative methods, which require immense classical computational resources. Quantum computing can address these challenges by representing quantum states directly on qubits, evolving them using quantum circuits, and leveraging quantum entanglement to capture correlations more efficiently. This is an active research area and we are not there yet! Many researchers are working on it, and interest in the field has grown significantly over the years. See Michele's answer for more information.
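As a toy illustration of "representing quantum states on qubits and evolving them with circuits", the sketch below Trotterizes the time evolution of a two-spin transverse-field Ising model and compares it with the exact propagator. This is only a stand-in for the lattice theories mentioned above, not an actual QFT simulation; the couplings, evolution time, and number of Trotter steps are arbitrary assumptions for the demo.

```python
# Hedged sketch: first-order Trotterized time evolution of a 2-spin
# transverse-field Ising model, compared against exact matrix exponentiation.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def kron(*ops):
    """Tensor product of a list of single-site operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

J, g, t, n_steps = 1.0, 0.7, 1.0, 50      # illustrative couplings, time, step count
H_zz = -J * kron(Z, Z)                     # interaction term
H_x = -g * (kron(X, I2) + kron(I2, X))     # transverse-field term

# Exact evolution vs. first-order Trotter product of the two non-commuting terms.
U_exact = expm(-1j * t * (H_zz + H_x))
dt = t / n_steps
U_step = expm(-1j * dt * H_zz) @ expm(-1j * dt * H_x)
U_trotter = np.linalg.matrix_power(U_step, n_steps)

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                              # start in |00>
overlap = abs(np.vdot(U_exact @ psi0, U_trotter @ psi0)) ** 2
print(f"fidelity between exact and Trotterized evolution: {overlap:.6f}")
```

On a quantum device, each Trotter step would become a short block of gates, which is why circuit depth (and hence hardware quality) directly limits how far in time and system size such simulations can go.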
Thank you for the question. At CERN we are running the CERN Quantum Technology Initiative (QTI), where quantum computing represents one competence centre. See here for additional details and the other competence centres: https://quantum.cern/research
In general, our mission is to integrate with the EU and US HPC+QC infrastructures and take part in the design and deployment of hybrid computing solutions in support of several scientific use cases. We have PhD students and postdocs who work on different aspects of computing, from the theoretical foundations of QML to more application-oriented algorithms. We also host conferences; this year we had QT4HEP2025, where you can find many of the topics we work on: https://indico.cern.ch/event/1433194/overview
Thank you for the response sir. CERN has truly been a source of motivation for me. I have one more question. Quantum computers require extremely low temperatures, which consume significant energy. Given this, how can quantum computing contribute to sustainability and combating climate change?
This is a fair question. However, we need to take into account that there are multiple technologies for building a quantum computer. Superconducting is of course the one that requires extremely low temperatures. There are many studies on the energy footprint of QC, but the computational power of qubits scales better than that of classical bits, especially if you consider big data centres. Although cooling requires very low temperatures (in the millikelvin range), quantum computers may still be more energy-efficient than classical supercomputers for specific tasks, reducing the overall computational carbon footprint in the long run. Overall, their ability to solve complex sustainability challenges (e.g. better catalysts for carbon capture, developing low-energy fertilizers, discovery of new materials) could ultimately offset these costs, making them a valuable tool in combating climate change.
Yes, partons and jets are related, as partons (quarks or gluons) from the hard process undergo hadronization, producing sprays of particles that form jets. The main challenge is establishing a mapping between initial partons and final-state jets. Commonly used features for analysis include transverse momentum (pT), pseudorapidity (η), azimuthal angle (ϕ), and the angular separation ΔR = √(Δη² + Δϕ²), which helps associate partons with jets.
Additional useful features include jet energy and momentum fractions to compare jets with their originating partons. Matching techniques such as ΔR-based nearest neighbor assignment or momentum fraction consistency are commonly used. Depending on the dataset, a simple kinematic-based matching may be sufficient, but more advanced techniques could be explored if needed.
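As a concrete example of the ΔR-based nearest-neighbour matching described above, here is a small sketch assuming partons and jets are available as (η, ϕ) pairs; the max_dr = 0.4 cone, the function names, and the toy values are illustrative assumptions, not a prescription from the challenge dataset.

```python
# Hedged sketch of dR-based nearest-neighbour matching between partons and jets.
import numpy as np

def delta_r(eta1, phi1, eta2, phi2):
    """Angular separation dR = sqrt(dEta^2 + dPhi^2), with dPhi wrapped to [-pi, pi]."""
    d_eta = eta1 - eta2
    d_phi = (phi1 - phi2 + np.pi) % (2 * np.pi) - np.pi
    return np.sqrt(d_eta**2 + d_phi**2)

def match_partons_to_jets(partons, jets, max_dr=0.4):
    """For each parton (eta, phi), return the index of the closest jet within max_dr, else -1."""
    matches = []
    for p_eta, p_phi in partons:
        dr = np.array([delta_r(p_eta, p_phi, j_eta, j_phi) for j_eta, j_phi in jets])
        if len(dr) == 0:
            matches.append(-1)
            continue
        best = int(np.argmin(dr))
        matches.append(best if dr[best] < max_dr else -1)
    return matches

# Tiny illustrative example: two partons, three jets.
partons = [(0.5, 1.0), (-1.2, -2.9)]
jets = [(0.55, 1.05), (2.0, 0.3), (-1.25, 3.3)]  # last jet is close in phi after wrapping
print(match_partons_to_jets(partons, jets))      # -> [0, 2]
```

Momentum-fraction consistency checks can then be layered on top of this kinematic matching to reject ambiguous assignments.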
Thank you @yhaddad, this gave me a new perspective; I am going for advanced techniques now.
Though I agree on kinematic-based matching, which generally focuses on event reconstruction and particle association.
The session is open! Don't hesitate to ask CERN all your questions!