Quantum optimization shows potential as a powerful tool for enhancing resource allocation efficiency in Open Radio Access Networks (Open RAN)

A D-Wave quantum computer equipped with a quantum annealing processor. Photo credit: D-Wave Quantum Inc.

 

The Open Radio Access Network (Open RAN) is emerging as a key architectural approach in next-generation mobile networks. One of its key capabilities, RAN slicing, enables multiple virtual networks to operate on a shared radio infrastructure. This allows for improved flexibility, scalability, and interoperability across services with different Quality of Service (QoS) requirements.
In an Open RAN environment, it is crucial to allocate network resources efficiently among applications with different demands, such as ultra-reliable low-latency communications (URLLC), enhanced mobile broadband (eMBB), and massive machine-type communications (mMTC), so that each meets its QoS targets.
Several techniques have been introduced to manage resource sharing in 5G and beyond networks, e.g., using multi-access edge computing (MEC) and a platform-as-a-service (PaaS) model for URLLC applications. Other techniques are priority-based, aiming to maximize QoS within limited data-rate and bandwidth budgets using genetic algorithms or deep reinforcement learning.

While these techniques have succeeded in dynamically allocating RAN resources, they suffer from high computational complexity, which limits their real-time performance and their adaptability to larger network structures.

In their recent paper titled "Open RAN Slicing with Quantum Optimization" [Available: IEEE, arXiv], Lincs lab PhD student Patatchona Keyela, in collaboration with Prof. Soumaya Cherkaoui, proposed a quantum annealing (QA) framework for multi-slice Open RAN resource allocation. Their goal is to maximize overall system throughput while meeting the QoS constraints for both eMBB and URLLC services in 5G networks.
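To give a flavor of how such a problem maps onto a quantum annealer, allocation can be cast as a quadratic unconstrained binary optimization (QUBO): binary variables encode "resource block b goes to slice s," throughput enters with a negative sign, and QoS constraints become quadratic penalty terms. The toy sketch below is illustrative only (the rates, penalty weight, and two-slice setup are assumptions, not the paper's model) and solves a tiny instance by brute force rather than on annealing hardware:

```python
# Toy sketch (not the paper's exact model): resource-block (RB) allocation
# across two slices (eMBB, URLLC) cast as a QUBO-style cost and minimized
# by exhaustive search. Rates and penalty weight are illustrative.
from itertools import product

N_RB = 4                      # tiny instance so brute force is feasible
SLICES = ("eMBB", "URLLC")
# rate[b][s]: achievable rate if RB b is given to slice s (made-up numbers)
rate = [[8, 3], [5, 6], [7, 2], [4, 9]]
URLLC_MIN_RBS = 2             # QoS constraint: URLLC needs at least 2 RBs
P = 20                        # penalty weight enforcing the constraints

def energy(x):
    """Cost = negative throughput + penalties. x is a flat tuple of
    N_RB*2 binaries with x[2*b + s] = 1 iff RB b is assigned to slice s."""
    cost = -sum(rate[b][s] * x[2*b + s] for b in range(N_RB) for s in range(2))
    # each RB must go to exactly one slice: penalty (sum_s x - 1)^2
    for b in range(N_RB):
        cost += P * (x[2*b] + x[2*b + 1] - 1) ** 2
    # URLLC minimum-RB demand, written here as a one-sided penalty
    # (a slack-variable encoding would make this strictly quadratic)
    deficit = URLLC_MIN_RBS - sum(x[2*b + 1] for b in range(N_RB))
    cost += P * max(deficit, 0) ** 2
    return cost

best = min(product((0, 1), repeat=2 * N_RB), key=energy)
alloc = {b: SLICES[0] if best[2*b] else SLICES[1] for b in range(N_RB)}
print(alloc, "throughput:", -energy(best))
```

On an annealer, the same cost function would be handed to the hardware as a QUBO matrix and sampled, which is what makes near-real-time solution times plausible as instances grow beyond brute-force reach.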

The study demonstrated that quantum optimization can deliver near-optimal solutions quickly enough for real-time applications. To validate their approach, the authors conducted network simulations over realistic scenarios of gNodeBs, resource blocks (RBs), and users, with parameters chosen to closely mimic real-world conditions.

The authors received the Best Paper Award at the 15th Global Information Infrastructure and Networking Symposium (GIIS 2025), held February 25–27, 2025, in Dubai, UAE.

 
