Hybrid quantum-classical systems cut AI energy demands
As AI models grow more powerful, they’re also growing more power-hungry. The energy demands of training and running large neural networks are pushing data centers to their limits and raising concerns about sustainability and scalability. But a new benchmarking study by ORCA Computing, Toyota Motor, and Toyota Tsusho has shown that hybrid quantum-classical systems can already help reduce the load, delivering real-world performance gains today.
Quantum-classical collaboration, real-world results
The joint project focused on integrating photonic quantum processors into standard AI workflows, without reinventing the wheel. By pairing quantum systems with familiar architectures like Convolutional Neural Networks (CNNs) and Vision Transformers (ViTs), the teams explored how quantum processing could complement, rather than replace, classical AI infrastructure. The results? Hybrid models reduced computational operations by more than 20%, directly cutting down on energy-intensive GPU usage. In some cases, such as quantum reservoir computing, classical compute time was slashed by over 80%, signaling enormous potential for ultra-efficient AI inference.
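To see where the compute savings in reservoir computing come from, consider a minimal classical sketch. In a reservoir computer, a large fixed dynamical system (the "reservoir") transforms inputs into rich features, and only a cheap linear readout is trained; quantum reservoir computing replaces that fixed reservoir with the dynamics of a quantum system, such as a photonic processor. The toy example below is illustrative only and is not drawn from the ORCA/Toyota study; all sizes and parameters are arbitrary choices for the demo.

```python
import numpy as np

# Minimal classical reservoir computer (echo state network) on a toy
# next-step prediction task. The recurrent reservoir below is random and
# NEVER trained -- only the linear readout is fitted, which is where the
# reduction in expensive gradient-based compute comes from.

rng = np.random.default_rng(0)

N = 100                                        # reservoir size (arbitrary)
W_in = rng.uniform(-0.5, 0.5, size=N)          # fixed input weights
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 for stability

def run_reservoir(u):
    """Drive the fixed reservoir with input sequence u; collect states."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict sin(t + dt) from sin(t).
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)
target = np.sin(t + (t[1] - t[0]))

X = run_reservoir(u)
washout = 100  # discard the initial transient
X_tr, y_tr = X[washout:1500], target[washout:1500]

# Training is a single ridge-regression solve -- no backpropagation
# through the recurrent dynamics is ever needed.
ridge = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(N), X_tr.T @ y_tr)

pred = X[1500:] @ W_out
mse = np.mean((pred - target[1500:]) ** 2)
print(f"test MSE: {mse:.2e}")
```

In the quantum variant, the `run_reservoir` step is handled by quantum hardware whose natural dynamics generate the feature map, so the classical processor is left with only the lightweight readout fit.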
What it means for Belgium and the broader ecosystem
For Belgium’s quantum ecosystem and the wider global quantum community, this is a clear signal: quantum computing is beginning to deliver tangible performance gains now, not later. It validates hybrid approaches and calls on companies and researchers to experiment with quantum integration in current tech stacks. This also speaks to one of Quantum Circle’s core beliefs: that co-creation across disciplines is the fastest path to impact. By aligning industry needs with quantum capabilities, collaborations like this one are shaping the future of sustainable, scalable AI. As we look to reduce AI’s environmental footprint and boost efficiency, Belgium’s quantum community is well-positioned to lead in this space.