Challenge
A global logistics company that manages complex supply chains and delivery networks was exploring whether emerging quantum computing techniques could give them an edge in optimizing operations. Their business involves massive routing challenges – like dynamically assigning thousands of packages to trucks and planes, or planning the most efficient delivery routes under changing conditions – as well as strategic capacity planning problems such as warehouse placement and fleet utilization. Classical algorithms and software already handled these tasks, but improvements of even a few percentage points could translate to millions in savings or faster delivery times. The challenge for the company was to cut through the hype and answer a practical question: Can today’s quantum or hybrid quantum-classical algorithms actually solve our optimization problems better than existing methods? And if not today, what about in the near future?
The company’s leadership did not want to fall behind if quantum computing reached a point of practical advantage. However, they were rightly skeptical, given that current quantum hardware is still limited in scale and stability. They needed to identify the right use cases to test – ones small enough to run on current or near-term quantum hardware, yet representative enough to indicate potential value. They also needed to build internal know-how, so that their team could continue to experiment as the technology evolved. In short, the challenge was to design a proof-of-concept (PoC) that would rigorously evaluate quantum optimization on real-world data and integrate with their existing analytics pipeline, without disrupting ongoing operations.
Solution
We partnered with the logistics company to execute a structured quantum optimization proof-of-concept. Our first step was to help them select the most suitable use cases for testing. We conducted workshops with their operations research and IT teams to review the various optimization problems they tackle daily. From a dozen possibilities, we zeroed in on two candidate problems that met our criteria:
- A route optimization problem – specifically, optimizing last-mile delivery routes for a subset of a city’s deliveries. This is akin to a Traveling Salesman Problem with added constraints (time windows for deliveries, vehicle capacity, etc.). It’s a classic NP-hard problem that strains classical solvers as the number of stops grows, but we could start with a manageable size (for example, 10–15 delivery points) to fit on current quantum hardware; a small illustrative instance is sketched just after this list.
- A warehouse capacity allocation problem – determining the optimal distribution of certain high-value products across a network of warehouses to minimize shipping times and costs, subject to constraints like storage capacity and forecasted demand. This problem involves a mix of discrete decisions and was representative of their strategic planning challenges.
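To make the scale of the routing use case concrete, the sketch below builds a tiny instance of the kind described above and solves it exactly by brute force. The stops, coordinates, and the brute-force baseline are illustrative assumptions for this write-up, not the company’s data or tooling; time windows and vehicle capacities are omitted for brevity.

```python
# Illustrative only: a scaled-down last-mile routing instance and an exact baseline.
import itertools
import math

# Hypothetical delivery stops: (x, y) positions in kilometres relative to the depot.
stops = {
    "depot": (0.0, 0.0),
    "A": (2.0, 1.0),
    "B": (4.0, 3.0),
    "C": (1.0, 5.0),
    "D": (3.5, 0.5),
}

def distance(p, q):
    """Euclidean distance between two points (a real pipeline would use road distances)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def route_length(order):
    """Total length of a depot -> stops -> depot tour for a given visiting order."""
    path = ["depot"] + list(order) + ["depot"]
    return sum(distance(stops[a], stops[b]) for a, b in zip(path, path[1:]))

# At 10-15 stops, exact classical answers are still cheap to compute; brute force is
# shown here only to make the notion of a "manageable size" concrete.
best_order = min(itertools.permutations([s for s in stops if s != "depot"]),
                 key=route_length)
print(best_order, round(route_length(best_order), 2))
```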
With these use cases defined, we proceeded to the benchmarking phase. For each problem, we set up parallel solution pipelines: one using state-of-the-art classical methods, and one using quantum or quantum-inspired algorithms:
- For the classical baseline, we used the company’s existing optimization tools and also introduced an open-source solver to ensure we had a strong classical reference (for instance, using integer linear programming for the warehouse problem, and a combination of heuristic and exact algorithms for the routing problem); a simplified version of the warehouse model appears after this list.
- For the quantum side, we collaborated with quantum software experts to formulate the problems in a way suitable for quantum computing. The route problem was formulated as a QUBO (Quadratic Unconstrained Binary Optimization) so it could be tried on a quantum annealer (as well as on quantum-inspired annealing software); a sketch of that formulation also follows this list. The warehouse allocation was formulated for a gate-model quantum computer using a hybrid variational algorithm (QAOA, the Quantum Approximate Optimization Algorithm). We also explored purely quantum-inspired algorithms (running on classical hardware) provided by some quantum vendors, as a comparison point.
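To make the classical baseline concrete, here is a minimal sketch of a warehouse allocation model as an integer linear program, using the open-source PuLP library. The warehouses, products, capacities, demands, and shipping costs below are invented for this example; the actual model carried many more constraints.

```python
# Hedged sketch of a classical ILP baseline for warehouse capacity allocation.
import pulp

warehouses = {"W1": 500, "W2": 300}          # storage capacity (units), illustrative
products = {"P1": 400, "P2": 250}            # forecast demand (units), illustrative
ship_cost = {                                # cost per unit shipped from each warehouse
    ("W1", "P1"): 2.0, ("W1", "P2"): 3.5,
    ("W2", "P1"): 4.0, ("W2", "P2"): 1.5,
}

prob = pulp.LpProblem("warehouse_allocation", pulp.LpMinimize)
alloc = pulp.LpVariable.dicts("alloc", list(ship_cost), lowBound=0, cat="Integer")

# Objective: total shipping cost across all warehouse/product pairs.
prob += pulp.lpSum(ship_cost[k] * alloc[k] for k in ship_cost)

# Forecast demand for each product must be covered.
for p, demand in products.items():
    prob += pulp.lpSum(alloc[(w, p)] for w in warehouses) >= demand

# No warehouse may exceed its storage capacity.
for w, cap in warehouses.items():
    prob += pulp.lpSum(alloc[(w, p)] for p in products) <= cap

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[prob.status], {k: int(alloc[k].value()) for k in ship_cost})
```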
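And as a sketch of the quantum-side formulation, the following shows how a small routing instance can be written as a QUBO in plain NumPy. The distance matrix and penalty weight are illustrative assumptions; the production formulation also encoded time windows and vehicle capacity, and handled the depot explicitly, all of which are left out here.

```python
# Minimal TSP-style QUBO: binary variable x[i, t] = 1 if stop i is visited at position t.
import numpy as np

dist = np.array([[0, 5, 7, 4],
                 [5, 0, 3, 6],
                 [7, 3, 0, 2],
                 [4, 6, 2, 0]], dtype=float)   # hypothetical 4-stop instance (closed loop)
n = len(dist)
A = 2 * dist.max()          # penalty weight; a rough choice, larger than any single leg

def var(i, t):
    """Index of the flattened binary variable x[i, t]."""
    return i * n + t

Q = np.zeros((n * n, n * n))

# Tour length: stops at consecutive positions t and t+1 (cyclically) contribute d[i, j].
for i in range(n):
    for j in range(n):
        if i != j:
            for t in range(n):
                Q[var(i, t), var(j, (t + 1) % n)] += dist[i, j]

# Penalties from (1 - sum x)^2 for both constraint families:
# each stop is used exactly once, and each position holds exactly one stop.
for i in range(n):
    for t in range(n):
        Q[var(i, t), var(i, t)] -= 2 * A       # -A from each of the two constraints on x[i, t]
for i in range(n):                             # pairs within "stop i used once"
    for t in range(n):
        for u in range(t + 1, n):
            Q[var(i, t), var(i, u)] += 2 * A
for t in range(n):                             # pairs within "position t filled once"
    for i in range(n):
        for j in range(i + 1, n):
            Q[var(i, t), var(j, t)] += 2 * A

# Q (plus a constant offset of 2 * A * n) defines the energy to minimize; it can be
# handed to an annealer, a QAOA routine, or a classical QUBO heuristic.
```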
We ran these approaches on real data provided by the company: actual delivery locations and time windows for the routing case, and real inventory and demand data for the warehouse case. Because current quantum hardware is limited, we used problem instances scaled down to a size the quantum methods could handle, but kept them realistic enough to reflect operational patterns. Each run produced results in terms of solution quality (e.g., total distance traveled for routes, or total cost for distribution in the warehouse case) and computation time.
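The scoring of each run amounted to something like the sketch below, where `solver` and `instance` are placeholders for whichever method and problem were being tested and `best_known_cost` is the best classical result for that instance.

```python
# Illustrative scoring harness: wall-clock time plus quality gap versus the classical best.
import time

def benchmark(solver, instance, best_known_cost):
    """Run one solver on one instance; report cost, runtime, and relative gap."""
    start = time.perf_counter()
    cost = solver(instance)                # e.g. total route distance or total shipping cost
    runtime = time.perf_counter() - start
    gap = (cost - best_known_cost) / best_known_cost
    return {"cost": cost, "seconds": runtime, "gap_vs_classical": gap}
```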
The core of our analysis was comparing performance. We analyzed how close the quantum or hybrid solutions got to the best classical solutions, and how the required computation time scaled with problem size. We found, for example, that for small instances the quantum annealer could find good solutions to the routing problem within a few percent of the optimal distances found by the classical solver. The gate-model approach for the warehouse problem was hampered by noise on current hardware, but the algorithm design showed promise when we simulated it on an idealized quantum simulator. We documented these findings in detail, highlighting where quantum approaches performed well and where they lagged the classical state of the art. Equally important, we outlined the scaling behavior – projecting how many qubits, and at what error rates, a quantum computer would need before we might expect it to outright beat classical methods on these specific problems.
The final component of our work was designing a quantum optimization PoC pipeline that the company could continue to use going forward. We delivered a modular pipeline architecture integrated with their existing data workflow. Concretely, we provided:
- Data preprocessing tools that take the company’s operations data (like lists of deliveries or warehouse metrics) and automatically format it either for input to classical solvers or to the quantum algorithms (for instance, generating the QUBO matrices or parameter sets required for the quantum solvers).
- A set of scripts and API calls to interface with quantum computing cloud services, so that with minimal effort the company’s analysts could submit a problem to a quantum annealer or a gate-model quantum service and retrieve the results. This also included a simulator fallback, so they could test algorithms even when they didn’t have immediate access to actual quantum hardware; the submit-with-fallback pattern is sketched after this list.
- Reporting and visualization tools to compare solutions. We built a simple dashboard that, for any given problem instance, could display the classical solution and the quantum solution side by side – showing key metrics like cost or distance, and highlighting differences in the solution approach (e.g., which delivery sequence each method chose). This made it easy for stakeholders to grasp how the solutions differed and how close quantum came to classical; a minimal version of that comparison logic is also shown after this list.
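As an illustration of the submission layer, the submit-with-fallback pattern looks roughly like the sketch below. It assumes the D-Wave Ocean SDK (`dwave-system` and `dwave-neal`) as one possible annealing backend; the actual pipeline wrapped whichever cloud services and simulators the company had access to.

```python
# Hedged sketch: try quantum hardware first, fall back to a local simulated annealer.
import neal

def solve_qubo(Q, num_reads=100, use_hardware=True):
    """Q is a dict {(i, j): bias}; returns the lowest-energy sample found and its energy."""
    if use_hardware:
        try:
            from dwave.system import DWaveSampler, EmbeddingComposite
            sampler = EmbeddingComposite(DWaveSampler())
        except Exception:
            # No hardware access (missing token, offline, etc.): use the local fallback.
            sampler = neal.SimulatedAnnealingSampler()
    else:
        sampler = neal.SimulatedAnnealingSampler()
    result = sampler.sample_qubo(Q, num_reads=num_reads)
    return result.first.sample, result.first.energy

# Example with a two-variable toy QUBO; real runs used the matrices produced by the
# preprocessing step.
sample, energy = solve_qubo({(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0},
                            use_hardware=False)
print(sample, energy)
```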
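And the comparison logic behind the dashboard reduces to something like this; the dictionary keys are illustrative, and the real reports carried more metrics and richer visuals.

```python
# Illustrative side-by-side comparison of a classical and a quantum result.
def compare(classical, quantum):
    """Each argument: {'cost': float, 'seconds': float, 'route': [stop names]}."""
    gap = (quantum["cost"] - classical["cost"]) / classical["cost"] * 100
    print(f"classical cost: {classical['cost']:.1f}  ({classical['seconds']:.2f}s)")
    print(f"quantum   cost: {quantum['cost']:.1f}  ({quantum['seconds']:.2f}s)")
    print(f"quantum vs classical gap: {gap:+.1f}%")
    print("classical route:", " -> ".join(classical["route"]))
    print("quantum route:  ", " -> ".join(quantum["route"]))
```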
Outcome
Through this proof-of-concept, the logistics company gained a grounded, realistic understanding of what current quantum optimization can and cannot do for them. The direct comparison on their own data demystified the technology. One immediate outcome was that the company avoided misguided investment – the results indicated that, at the current state of hardware, quantum approaches did not yet outperform the best classical methods for these particular problems. This validated the leadership’s caution; there was no “quantum magic” – at least not yet – that would instantly revolutionize their routing or planning.
However, the PoC also yielded positive insights. The exercise identified certain problem structures within their operations where quantum or hybrid algorithms showed promise. For instance, the quantum annealer consistently found very good solutions for the routing problem when the instances were restricted to a smaller region with a specific set of constraints – suggesting that as quantum machines grow, they might handle larger versions of that sub-problem effectively. The company’s analytics team, having learned how to formulate and run quantum optimization, is now equipped to keep tabs on progress. As hardware improves (more qubits, less noise) or new algorithms emerge, they can plug those into the PoC pipeline we provided and re-run the benchmarks on updated problem sets. In essence, they have a future-proof experimentation framework.
Another outcome was internal education and innovation. The project catalyzed the creation of a small internal “Quantum Task Force” – analysts and engineers who worked alongside us and are now quantum champions within the company. They have hands-on experience with QUBOs, quantum annealers, and variational algorithms, and can guide future explorations. This group has already brainstormed follow-up investigations, such as testing a quantum-inspired algorithm for real-time re-routing during traffic disruptions, which they can do using the pipeline without external help.
Finally, the company’s leadership gained a clear strategic path: they decided to adopt a “watch and prepare” stance. They identified specific future milestones that would signal a true quantum advantage (such as hardware of a certain scale or demonstrated algorithm improvement) and built those into their R&D roadmap. In the meantime, they continue to refine classical optimization (indeed, the PoC itself yielded some improvements to their classical solvers) while remaining confident that they won’t miss the quantum opportunity. When the technology does mature to a tipping point, they will have data, experience, and infrastructure ready to integrate quantum optimization, ensuring they remain at the forefront of logistics innovation.
© 2025 Applied Quantum. All rights reserved