Quantum Computing Use Cases: What’s Real, What’s Hype

What Quantum Computing Is (And Isn’t)

Quantum computing gets discussed as though it’s the next desktop computer that will revolutionize everything… but it won’t. These machines are not simply faster versions of the computers we already have; they accelerate very specific algorithms that currently run slowly and expensively on silicon CPUs. The physics they operate on is fundamentally different from the transistor-and-gate model we rely on today, and as a result they will excel only at very specific problems. Understanding where quantum computing actually applies requires understanding what makes it different.

There is a common feature among the types of problems that quantum computers will excel at: they involve searching through or evaluating a massive number of possibilities. Classical computers handle these problems by checking options one at a time, or by using clever shortcuts that still grow computationally expensive as the problem scales. Quantum computers take a different approach, one that allows them to consider many configurations simultaneously. This makes them well suited to optimization problems, simulations of physical systems that are inherently quantum mechanical, and certain mathematical operations that underpin modern cryptography. It does not make them universally faster when running, say, Microsoft Excel. In fact, quantum hardware handles most everyday computing tasks worse than classical hardware, because quantum algorithms are slow and error-prone for traditional calculations. The key is knowing which domains the technology actually suits.

The technology is real, but it is early. The industry is currently in what researchers call the NISQ era: Noisy Intermediate-Scale Quantum. We won’t explain that here, but be on the lookout for a detailed explanation of NISQ and FTQC – and why it matters for the future of quantum. 

Ultimately the difference between these two eras comes down to the number of logical qubits available to solve problems, which is one of the most important factors holding back quantum-enabled applications today. This matters for how you should interpret the announcements and partnerships that appear regularly in the press, and is something that Coherence Report covers in detail.


Most quantum computing applications today are proofs of concept, research collaborations, or benchmarking exercises. Genuine production deployments are rare, and claims of “quantum advantage” are, well, claims. The gap between what vendors say in press releases and what the technology can actually do today is significant. But that doesn’t mean the technology won’t improve, and that is why it’s important for communities like Coherence to keep publishing.

With that context, the goal of this article is to lay out the primary use cases where quantum computing has a realistic path to impact. Some are closer than others. Some are surrounded by more hype than substance. For each, we will look at what the application actually involves, who it affects, and where things currently stand.

Quantum Simulation: Physics, Chemistry, and Drug Discovery

By quantum simulation, we mean modeling how matter interacts and behaves at the atomic level. Software packages such as ANSYS, COMSOL, and SolidWorks have been predicting how the physical world works for decades. Finite element analysis and computer-aided simulation emerged in the 1960s and became commercial staples by the 1980s. These tools are excellent for modeling stress on a bridge or airflow around a wing, but they break down when making predictions at the molecular or atomic level. That is because physics gets strange and counterintuitive at these scales: matter behaves like a wave, particles can exist in multiple states simultaneously, and measurement itself changes outcomes. These are the exact principles that quantum computing leverages. It makes little sense to simulate quantum behavior on classical hardware when you could run those problems on a quantum processing unit that operates on the same physics.

“Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical.” – Richard Feynman

This matters for materials science, chemistry, and drug discovery because it could make previously unsolvable problems solvable. Researchers could model how atoms arrange themselves in a new battery material, predict how a drug molecule binds to a protein, or understand the precise steps of a chemical reaction. These are not incremental improvements to existing methods. They represent the ability to ask questions that classical computers simply cannot answer at useful scale.

The reason quantum computing matters for these problems comes down to what we are actually trying to model. Batteries, superconductors, and catalysts all depend on how electrons and atoms behave at the quantum level. When a lithium ion moves through a battery electrode, when electrons pair up in a superconductor, when a molecule binds to a catalyst surface, these are quantum mechanical events. Classical computers can approximate this behavior for very small systems, but the computational cost grows exponentially as the system gets larger. Adding just a few more atoms can double or triple the resources required. 

When modeling a quantum system on a classical computer, you have to track every possible state that system could be in. For a system of n quantum particles, the number of states you need to represent grows as 2^n. A 50-particle system requires tracking over a quadrillion values! 

This “overhead” has to be processed individually when using traditional transistor-based methods: each possible state is stored as a number in memory, and every operation requires reading those numbers, performing calculations in the CPU, and writing results back. For a 50-particle system, this means shuffling quadrillions of values through the same physical circuit, one operation at a time. A quantum computer does not track these states in memory – it embodies them. Each qubit is itself a quantum object that can be placed in superposition and entangled with other qubits. When you set up 50 qubits to simulate a 50-particle system, the exponential complexity is not stored in memory; it exists in the physical state of the processor itself.
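To make the scale concrete, here is a rough sketch (a hypothetical helper, not a real simulator) of how much classical memory a brute-force state vector would require, assuming one 16-byte double-precision complex amplitude per state:

```python
# Rough sketch: classical memory needed to store the full state vector of an
# n-particle quantum system, assuming one 16-byte complex amplitude per state.
def state_vector_size(n_particles: int) -> tuple[int, float]:
    n_states = 2 ** n_particles          # state count doubles with each particle
    gigabytes = n_states * 16 / 1e9      # 16 bytes per double-precision complex
    return n_states, gigabytes

for n in (10, 30, 50):
    states, gb = state_vector_size(n)
    print(f"{n} particles: {states:,} states (~{gb:,.0f} GB)")
```

At 50 particles the state vector alone would occupy tens of millions of gigabytes, which is part of why full state-vector simulations on even the largest supercomputers stall in the mid-forties of qubits.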

Material researchers encode the energy rules of the system they want to study, prepare an initial quantum state, then let the qubits evolve according to those rules. Measurements at the end reveal properties like energy levels, electron configurations, or reaction pathways. You are not calculating what a quantum system would do. You are building a controllable quantum system and watching what it actually does. Because quantum computers natively embody the physics you want to simulate, a QPU is the natural tool for these applications.

Where each industry stands: 

Materials Science

The battery industry is one of the most active areas for quantum simulation research, largely because it is considered one of the fields where progress with classical methods has stagnated most. Companies like IBM, Google, IonQ, and Quantinuum have partnerships with energy companies and automotive manufacturers exploring how quantum computing could accelerate battery development. The focus is on modeling lithium-ion behavior at the atomic level and evaluating next-generation materials like solid-state electrolytes.

Superconductor research is another major target for quantum simulation. Superconductors are materials that conduct electricity with zero resistance, as opposed to normal wires, where energy is lost (typically through heat or infrared radiation). In a superconductor, current flows perfectly, with no loss at all. That sounds academic, but it opens profound opportunities for 21st century engineering, such as: lossless power transmission across electrical grids, stronger and more efficient MRI machines, maglev transportation systems, the powerful magnets required for fusion reactors, and, notably, the circuits that power many of today’s quantum computers themselves. But the primary blocker right now is how little we understand about the conditions that make superconductivity possible. Current superconductors only work at temperatures near absolute zero, requiring expensive cooling systems that limit practical use. Finding materials that superconduct at higher temperatures requires understanding electron behavior at the quantum level, which is precisely what quantum simulation is built for.

In a future piece, we will explore the state of superconducting technology in depth. It is possible that advances in superconductors could prove even more transformative than quantum computing itself, and Coherence will cover this. 

Another potential application is industrial catalysis simulation: finding more efficient catalysts for chemical manufacturing. Catalysts are materials that speed up chemical reactions without being consumed in the process, and they are essential to industries ranging from oil refining to plastics to pharmaceuticals. Designing a better catalyst today often involves trial and error because modeling how molecules interact at the atomic surface is computationally demanding. Quantum simulation could allow researchers to predict catalyst behavior from first principles, potentially identifying materials that make industrial processes faster, cheaper, or less energy-intensive.

Chemistry

Quantum chemistry has a longer history with computational methods, and researchers have been using classical computers to model molecular behavior since the 1950s. The challenge is that accuracy degrades as molecules get larger and more complex. Quantum simulation promises to extend that accuracy to systems that are currently out of reach: complex reaction pathways, transition states, and large molecules with many interacting electrons. Nitrogen fixation is a frequently cited target. The Haber-Bosch process, which produces ammonia for fertilizers, consumes roughly 1-2% of global energy. A better catalyst discovered through precise quantum simulation could reduce that significantly. Companies like Dow, BASF, and ExxonMobil have explored quantum computing partnerships to model chemical processes. Most projects today involve small molecule simulations with tens of atoms. Scaling to industrially relevant molecules with hundreds or thousands of atoms will require significant advances in both hardware and error correction.

Drug Discovery

Pharmaceutical companies have been among the most visible investors in quantum computing research. Pfizer, Roche, Merck, Biogen, and others have established programs or partnerships with quantum hardware providers. The goal is to simulate protein folding and protein-ligand interactions with enough accuracy to identify promising drug candidates earlier in the development process. Traditional drug discovery is expensive and slow: bringing a single drug to market can take over a decade and cost billions of dollars. If quantum simulation could reduce the time and failure rate in early-stage screening, the financial impact would be substantial. That said, the current state is modest. Most demonstrations involve small molecules that classical computers can also handle. Simulating a full protein, which might involve thousands of atoms, is well beyond what today’s quantum hardware can manage. The field is in an exploratory phase: building expertise, running benchmarks, and waiting for hardware to mature. Production use cases are likely years away.

Optimization Problems: Logistics, Finance, and Scheduling

Optimization problems are everywhere in business. Which route should a delivery truck take to visit 50 locations in the least amount of time? How should a hospital schedule 200 nurses across three shifts while respecting skill requirements, labor laws, and employee preferences? What mix of assets should a portfolio hold to maximize returns while minimizing risk? These are all optimization problems: situations where you need to find the best answer from a massive number of possibilities.

Classical computers solve these problems using a combination of brute force and clever shortcuts. Software packages like Gurobi, CPLEX, and Google OR-Tools have been handling logistics, scheduling, and financial optimization for decades. These tools are powerful, and for many problems, they work well. But they share a fundamental limitation: as the number of variables grows, the number of possible solutions explodes. A routing problem with 10 stops has about 3.6 million possible paths. A problem with 30 stops has more possible paths than there are stars in the observable universe. Classical solvers use heuristics and pruning techniques to avoid checking every option, but at some point, the math simply overwhelms the hardware.

Quantum computing offers a different approach. Rather than checking possibilities one at a time, quantum systems can represent and evaluate many configurations simultaneously. There are two main methods being explored. Quantum annealing, the approach used by D-Wave, treats the problem as an energy landscape where the best solution corresponds to the lowest point. The system starts in a superposition of all possible states and gradually settles into a low-energy configuration. Gate-based quantum computing uses algorithms like QAOA (Quantum Approximate Optimization Algorithm) to iteratively search for good solutions through a series of quantum operations. Both approaches aim to find high-quality answers faster than classical methods, though neither has yet demonstrated a clear advantage on real-world problems at scale.
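The energy-landscape framing can be made concrete with a tiny QUBO (Quadratic Unconstrained Binary Optimization) instance, the problem format annealers such as D-Wave’s accept. The coefficients below are invented for illustration; a real annealer settles into the lowest-energy bitstring physically rather than enumerating, but the objective is the same:

```python
from itertools import product

# Toy QUBO instance with made-up coefficients. The "energy" of a bitstring x is
#   E(x) = sum_i Q[i,i] * x_i  +  sum_{i<j} Q[i,j] * x_i * x_j
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,  # linear terms: reward turning bits on
    (0, 1): 2.0, (1, 2): 2.0,                  # couplings: penalize adjacent bits both on
}

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Classical brute force over all 2^3 bitstrings. "Best solution" means
# "lowest point in the energy landscape", which is exactly what an annealer
# or a QAOA circuit tries to find without exhaustive enumeration.
best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))
```

Brute force is trivial for three variables, but the search space doubles with every additional variable, which is the gap quantum approaches hope to close.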

Logistics, Routing, and Supply Chain Optimization

Transportation and delivery companies have been among the earliest adopters of quantum optimization experiments, and the business justification is straightforward: margins are thin, and even tiny efficiency gains translate to enormous savings at scale. Consider a logistics company operating 10,000 delivery trucks (DHL, UPS, etc.). Each truck makes dozens of stops per day, and the order of those stops determines fuel consumption, driver hours, and how many deliveries can be completed before a shift ends. A routing algorithm that is 2% more efficient across the entire fleet could shave hundreds of millions of dollars annually in fuel costs alone, not to mention reduced vehicle wear and additional delivery capacity.

The canonical form of these routing problems is the “traveling salesman” problem. The premise is simple: given a list of cities and the distances between them, what is the shortest possible route that visits each city exactly once and returns to the starting point? The problem is easy to state but deceptively difficult to solve. With 10 cities, there are about 3.6 million possible routes. With 30 cities, the number of possible routes exceeds 10^32, more than there are stars in the observable universe. No known algorithm can find the optimal solution in a reasonable time as the problem scales; classical computers rely on approximations and heuristics that get progressively worse as complexity increases.
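The route counts above are just factorials, which a few lines make concrete (counting directed orderings of all stops, the convention that gives roughly 3.6 million routes for 10 stops):

```python
import math

# Traveling-salesman-style route counts grow factorially: n stops can be
# visited in n! directed orderings.
def route_count(stops: int) -> int:
    return math.factorial(stops)

print(route_count(10))  # 3628800, about 3.6 million
print(route_count(30))  # a 33-digit number, far beyond exhaustive search
```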

Real-world routing is even more complex than the typical “traveling salesman” problem. Drivers have delivery windows to meet. Trucks have weight and volume limits. Traffic patterns shift throughout the day. Some customers require special handling, or specific business rules apply. A company like UPS or FedEx is not solving one optimization problem; it is solving thousands of interdependent problems simultaneously, every day. UPS famously saved $400 million over two years by implementing a routing system that reduced left turns, which cut idle time at intersections. That result came from classical optimization. The hope is that quantum methods could find efficiencies that classical algorithms miss entirely.

Airlines face similar math at even higher stakes, because their optimization problems require far more dimensions of complexity. Crew scheduling alone involves matching thousands of pilots and flight attendants to flights while respecting training certifications, union rules, rest requirements, and seniority preferences. A suboptimal schedule does not just waste money; it can cascade into delays and cancellations when crews time out or aircraft are out of position. Aircraft routing adds another layer: planes need to be in the right place at the right time for maintenance, and fuel costs vary by route and load. Delta, for example, spends over $10 billion annually on fuel. Shaving even 1% off that figure through better routing would cover the cost of a significant technology investment many times over.

The same logic extends upstream into supply chain optimization, where the variables multiply further. Retailers and manufacturers are not just moving goods from point A to point B — they are simultaneously deciding where to hold inventory, how much to order, when to reorder, and how to route product through a network of suppliers, distribution centers, and stores, all under conditions of uncertain demand and fluctuating costs. A planning algorithm that is 1% more accurate in its inventory placement could save a company like Walmart or Amazon hundreds of millions of dollars annually in carrying costs, stockouts, and expedited shipping, before accounting for downstream effects on supplier relationships and customer satisfaction. Add real-world constraints — lead times, minimum order quantities, shelf life, seasonal demand, promotional spikes, and capacity limits at each node — and the solution space grows faster than any classical system can enumerate.

A number of Fortune 500 companies have pursued proof-of-concept projects using quantum computing across both routing and supply chain problems, with mixed results. Volkswagen ran an early pilot project using D-Wave to optimize bus routes in Lisbon, attempting to reduce travel time and congestion by recalculating routes in near real-time based on traffic conditions. ExxonMobil partnered with IBM to explore quantum algorithms for maritime routing of liquefied natural gas shipments, a problem involving dozens of ships, fluctuating demand, weather patterns, and inventory levels at delivery sites. The company found that even a simplified version of the problem produces more possible decision combinations than atoms in the universe. IBM also worked with an unnamed commercial vehicle manufacturer to test hybrid classical-quantum optimization for deliveries to 1,200 locations in New York City, factoring in 30-minute delivery windows and truck capacity constraints. Airbus has partnered with IonQ and launched multiple quantum computing challenges focused on flight path optimization, aircraft loading, and supply chain logistics. On the supply chain side, BASF ran a pilot with D-Wave focused on production assignment and scheduling, though notably the company could not send a representative to present its results at D-Wave’s own user conference — a telling signal about the depth of the engagement. BMW has explored quantum optimization for production planning, attempting to reduce scheduling conflicts and material shortages across a network of assembly plants.

Results across all of these pilots are preliminary. Most demonstrate that quantum approaches can match or approximate classical results on small problems, but none have yet shown clear quantum advantage at production scale. Classical solvers like Gurobi handle these problems effectively for most real-world sizes, and quantum approaches have not yet shown consistent improvement. D-Wave has essentially zero good use cases (see our writings on D-Wave for deeper insights), and other providers have done little themselves.

Portfolio Optimization and Financial Risk Modeling

Financial services firms were early investors in quantum computing research. JPMorgan, Goldman Sachs, BBVA, and others have established quantum programs or partnerships with hardware providers. Portfolio optimization is a natural fit: balancing risk and return across thousands of assets with complex interdependencies is a combinatorial problem that grows expensive quickly. 

Portfolio optimization sounds straightforward: choose the right mix of assets to maximize returns while minimizing risk. In practice, it is one of the most computationally demanding problems in finance. A typical institutional portfolio might contain thousands of assets, each with its own expected return, volatility, and correlation with every other asset. The number of possible portfolio configurations grows exponentially. Add real-world constraints like position limits, sector caps, and liquidity requirements, and the problem becomes a mixed-integer programming challenge that strains even modern computing infrastructure.
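A toy sketch shows why the problem explodes: even a naive equal-weight version over made-up data must score every candidate basket, and the basket count grows combinatorially. The figures and scoring rule below are illustrative assumptions, not a real model (correlations between assets are omitted entirely):

```python
from itertools import combinations
import math

# Hypothetical toy data: (expected annual return, volatility) for six assets.
assets = {
    "A": (0.08, 0.12), "B": (0.05, 0.05), "C": (0.12, 0.25),
    "D": (0.07, 0.10), "E": (0.10, 0.18), "F": (0.04, 0.03),
}

def score(basket, risk_aversion=2.0):
    # Crude mean-variance style objective for an equal-weight basket.
    # Real portfolios track a full correlation matrix between every pair of
    # assets, which is where the computational cost explodes.
    ret = sum(assets[a][0] for a in basket) / len(basket)
    vol = sum(assets[a][1] for a in basket) / len(basket)
    return ret - risk_aversion * vol ** 2

# Brute force every 3-asset basket: only C(6, 3) = 20 candidates here, but
# choosing 30 names from 2,000 candidates yields more baskets than any
# system can enumerate.
best = max(combinations(assets, 3), key=score)
print(best, round(score(best), 4))
print(math.comb(2000, 30))  # the scale of a realistic selection problem
```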

Monte Carlo simulations, used for pricing derivatives and modeling risk exposure, are another target. These methods rely on generating large numbers of random samples to estimate probabilities, and quantum speedups for sampling could reduce computation time significantly. 

Instead of needing a million samples to achieve a certain precision, a quantum approach might need only a thousand. Goldman Sachs has been at the forefront of this research. In collaboration with IBM, the firm published the first end-to-end resource estimate for derivative pricing on a quantum computer, examining what hardware specifications would be needed to achieve practical advantage. Their research on autocallable contracts and other complex derivatives suggests that quantum methods could eventually make pricing up to a thousand times faster for certain products. More recently, Goldman researchers developed new techniques using Quantum Signal Processing that reduce the quantum resources required for derivative pricing by a factor of 16 compared to earlier approaches.
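The arithmetic behind that million-versus-thousand claim is simple: classical Monte Carlo error shrinks like 1/√N, while quantum amplitude estimation promises error shrinking like 1/N. A quick sketch of the sample counts each scaling implies for a target precision:

```python
import math

# Classical Monte Carlo error shrinks like 1/sqrt(N); quantum amplitude
# estimation promises error ~ 1/N. For a target precision eps, that implies
# roughly 1/eps^2 classical samples versus 1/eps quantum queries.
def samples_needed(eps: float) -> tuple[int, int]:
    classical = math.ceil(1 / eps ** 2)
    quantum = math.ceil(1 / eps)
    return classical, quantum

for eps in (1e-2, 1e-3):
    c, q = samples_needed(eps)
    print(f"precision {eps}: ~{c:,} classical samples vs ~{q:,} quantum queries")
```

The quadratic speedup is real in theory, but it assumes fault-tolerant hardware; on today’s noisy devices the overheads swamp it.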

That said, practical applications in production remain limited. Most financial quantum projects today are research exercises or benchmarking studies. Production trading systems still run on classical hardware.

Scheduling

Workforce scheduling is another optimization problem that businesses confront constantly. Hospitals need to staff nurses across shifts while respecting certifications, union rules, rest requirements, and patient acuity levels. Airlines must match thousands of pilots and flight attendants to flights while accounting for training qualifications, seniority preferences, and federal rest mandates. Retail chains balance labor costs against customer demand that fluctuates by hour and season. Manufacturers sequence production runs to minimize changeover time between different product configurations. In each case, the core challenge is the same: assign people or resources to time slots while satisfying dozens of constraints simultaneously.

The mathematics of scheduling problems is well understood: most are formulated as mixed-integer programs, and commercial solvers like Gurobi and CPLEX handle them efficiently at moderate scale. The difficulty arises at the edges. When constraint sets become unusually complex, when real-time adjustments are needed, or when the number of variables grows very large, classical solvers can struggle. A hospital scheduling system might work well for a single unit but slow dramatically when coordinating staff across an entire medical center. An airline’s crew pairing algorithm might find good solutions under normal conditions but fail to respond quickly when weather disrupts the network and hundreds of crews need reassignment within minutes.
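A miniature, hypothetical version of such a constraint problem shows the shape: a couple of hard rules carve a small feasible set out of a search space that grows exponentially with headcount. The names and rules below are invented for illustration; real systems express constraints like these in a mixed-integer program rather than by enumeration:

```python
from itertools import product

# Hypothetical toy instance: assign 4 nurses to one of 3 shifts under two
# hard constraints.
nurses = ["Ana", "Ben", "Caz", "Dee"]
shifts = ["day", "evening", "night"]

def feasible(assignment):
    # Rule 1: every shift must be covered by at least one nurse.
    if set(assignment) != set(shifts):
        return False
    # Rule 2: Ben lacks the certification for night shifts (invented rule).
    if assignment[nurses.index("Ben")] == "night":
        return False
    return True

# Brute force all shifts^nurses assignments: 3^4 = 81 here, but a 200-nurse
# hospital has 3^200 raw assignments, which is why real solvers prune
# instead of enumerating.
valid = [a for a in product(shifts, repeat=len(nurses)) if feasible(a)]
print(f"{len(valid)} feasible schedules out of {len(shifts) ** len(nurses)}")
```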

Quantum optimization, particularly quantum annealing, is theoretically suited to these problems. D-Wave has promoted workforce scheduling as a commercial use case, and has conducted several pilot projects. The most frequently cited is Pattison Food Group, a Canadian grocery and pharmacy chain with over 100 retail locations. During the pandemic surge in online orders, the company faced a scheduling bottleneck: three to four employees spent 80 hours per week manually creating delivery driver schedules across multiple provinces. Pattison worked with D-Wave to automate the process, building a system that accounts for driver seniority, shift preferences, store assignments, and company policies. The result, according to D-Wave, was an 80% reduction in manual scheduling effort, from 80 hours to 15 hours per week.

The Pattison case is instructive for what it reveals about current quantum applications. The efficiency gain is real, but context matters. The system being replaced was a manual, spreadsheet-based process. Whether quantum provides an advantage over modern classical scheduling software is a separate question that the case study never answers, although the lack of expansion is in fact instructive. Pattison has not expanded the quantum solution to broader workforce scheduling across its retail operations, and the company no longer appears at D-Wave’s annual user conference. The project demonstrates that quantum optimization can work for scheduling problems; it does not demonstrate that quantum solves them well enough to justify sustained investment by a major organization.

Airlines represent another active area of research, though results here remain more preliminary. Crew scheduling is a notoriously difficult optimization problem. American Airlines operates approximately 5,900 flights per day with a fleet of 900 aircraft; coordinating crew assignments across that network while respecting FAA rest requirements, union contracts, and training certifications generates enormous combinatorial complexity. Lufthansa Industry Solutions is working with the German Aerospace Center on quantum algorithms for both strategic flight planning and tactical crew reassignment during disruptions. The University of Hamburg is collaborating on gate assignment optimization for airport operations. IBM has tested workforce scheduling on its 127-qubit quantum devices, handling problems with up to 874 binary decision variables and over 1,000 constraints. These efforts remain in the research and benchmarking phase.

The broader question is whether quantum scheduling delivers value beyond what classical optimization already provides. Modern workforce management systems from vendors like Kronos, SAP, and Oracle handle complex scheduling for large organizations routinely. These systems have decades of development behind them, run on inexpensive commodity hardware, and integrate with existing enterprise infrastructure. For quantum scheduling to justify its cost and complexity, it would need to demonstrably outperform these mature alternatives on equivalent problems. The cases announced so far show quantum working, but not necessarily quantum winning at scale.

Quantum Machine Learning

With all the hype around artificial intelligence and machine learning, it was inevitable that quantum computing researchers would look for ways to make inroads into this growing field. The core premise is that because of how they process information, quantum computers might accelerate tasks that sit at the foundation of modern AI. Unfortunately, most of it is hype with very little chance of becoming reality.

Machine learning is a mathematics-intensive discipline (not to mention computationally expensive). Training a model involves performing enormous numbers of linear algebra operations across datasets that can involve millions or billions of parameters. Quantum computers handle certain classes of linear algebra in ways that are theoretically faster than classical methods. If that advantage translates into practice, the implications for AI development could be meaningful, since training large models today requires substantial investments in hardware, energy, and time.

Coherence is publishing an article on How GPUs Took Over AI shortly. Sign up for membership to be the first to read it!

The most straightforward version of the argument is that quantum hardware could speed up the linear algebra underlying model training. Researchers have proposed quantum versions of standard machine learning tools, including quantum principal component analysis and quantum support vector machines, both of which show theoretical advantages over their classical counterparts (under specific conditions). The difficulty is that those conditions are narrow, and the overhead required to load classical data into a quantum system often eliminates the speedup before it materializes. Practical training speedups have not been demonstrated at any scale that would matter to a working AI system.

A second area of interest involves generative AI models, which work by learning to sample from complex probability distributions. A model generating a sentence is, in mathematical terms, drawing from a distribution over possible next words based on everything that came before. Quantum computers produce probabilistic outputs naturally, which has led some researchers to argue there is a genuine connection between quantum hardware and generative modeling. D-Wave has been an active proponent of this idea, pointing to its collaboration with Japan Tobacco on drug discovery as evidence that quantum sampling can improve the performance of generative models in pharmaceutical applications. The project used a quantum processor to train a specific component of a molecule-generation pipeline, and Japan Tobacco reported positive results. It’s important to note that those results have not been independently validated, no quantitative benchmarks comparing quantum and classical performance on the same task have been published, and the quantum component addressed a narrow sub-task within a pipeline where the majority of the work ran on conventional hardware.

Feature selection is the third area where quantum approaches have been proposed. In fields like genomics and quantitative finance, researchers routinely work with datasets that have far more variables than observations. A genomics study might involve millions of genetic markers but only a few thousand patients. Identifying which variables actually matter is a combinatorial search problem that grows harder as the dataset grows larger, and it maps naturally onto the kinds of discrete optimization problems quantum hardware is designed for. In practice, classical techniques like LASSO regression and random forest importance scoring handle feature selection effectively for the vast majority of commercial applications, run on hardware every data science team already has, and produce results that are well-understood and reproducible.
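The combinatorics driving that search problem are easy to state: with p candidate features there are 2^p subsets to score, and even restricting to fixed-size subsets, the counts explode long before genomics scale. A short illustration:

```python
import math

# Exhaustive feature selection means scoring every subset of p features:
# 2^p candidates overall, and even fixed-size subsets explode combinatorially.
print(2 ** 20)            # already over a million subsets for just 20 features
p = 1_000_000             # e.g. genetic markers in a genome-wide study
print(math.comb(p, 10))   # ways to pick just 10 of a million markers
```

This discrete-subset structure is what maps naturally onto quantum optimization hardware; classical methods sidestep the enumeration entirely with continuous relaxations like LASSO.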

The organizations most actively tracking quantum machine learning span a wide range of industries. Large AI research labs at Google and IBM maintain quantum machine learning programs, in part because they are building the hardware and want to understand its limits. Pharmaceutical and biotech companies have a practical interest given the combinatorial complexity of genomic analysis and drug discovery pipelines. Any industry running large-scale machine learning at significant cost has reason to follow the space, since even a modest training speedup would translate to meaningful savings at scale.

For now, quantum machine learning remains a research direction rather than a commercial one. Classical AI infrastructure continues to improve rapidly, hardware limitations constrain what quantum systems can process, and the bottleneck of loading classical data into quantum systems remains unsolved. Organizations building AI products today have no practical reason to incorporate quantum hardware into their workflows. The question worth watching is whether these theoretical advantages hold up when tested on real hardware, as the technology continues to mature. 

Cryptography and Cybersecurity

Governments and financial institutions were the earliest experimenters and are some of the most attentive observers of the technology’s progress. The reason has to do with encryption and cybersecurity: most classified and secretive communications use encryption standards that were designed with classical computers in mind. The standard causing the most concern is RSA.

RSA stands for Rivest-Shamir-Adleman, named after the three cryptographers who developed it in 1977. It works by generating a pair of mathematically linked keys, one public and one private, derived from the product of two very large prime numbers that are easy to multiply together but practically impossible to reverse-engineer. The premise behind the widespread adoption of RSA, which has underpinned secure transactions since the 1990s, is that breaking it requires factoring a very large number, a task that is computationally infeasible for classical machines. A classical computer attempting to factor a 2,048-bit RSA key by brute force would require more time than the age of the universe. A sufficiently powerful quantum computer could, in theory, accomplish the same task in hours.
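The asymmetry at the heart of RSA can be shown with toy-sized numbers. The sketch below (the `factor_semiprime` helper is illustrative; real RSA uses primes over a thousand bits long, far beyond any trial division) contrasts one multiplication with the many divisions needed to reverse it:

```python
# Illustrative sketch of the RSA asymmetry: multiplying two primes is a
# single operation, while recovering them by trial division takes tens of
# thousands of steps even at this toy scale. At real key sizes (2,048-bit
# moduli), the search becomes astronomically large.

def factor_semiprime(n: int) -> tuple[int, int]:
    """Recover p and q from n = p * q by trial division (toy scale only)."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    raise ValueError("no odd factor found")

p, q = 104729, 104723        # two primes
n = p * q                    # fast: one multiplication gives the public modulus
print(factor_semiprime(n))   # slow: ~50,000 divisions to undo it
```

Shor’s algorithm attacks exactly this step, replacing the brute-force search with a quantum procedure whose cost grows only polynomially with the key size.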

Why do we care so much about some security algorithm? RSA encryption is the backbone of HTTPS, the protocol that secures every website that handles a login, a payment, or personal data. It protects email servers, virtual private networks, digital signatures, and the authentication systems that verify software updates on computers and mobile devices. Financial institutions rely on it for interbank communications and transaction verification. Government agencies use it to protect classified communications and critical infrastructure. Hospitals and healthcare networks use RSA to secure patient records in transit, making medical data as vulnerable to a quantum-capable adversary as any financial or government system. 

In practical terms, RSA is not one system among many. It is the underlying lock on most of the doors that matter. Being able to crack it, at will, would be like having a skeleton key to every single house or office across the world – with the added benefit that no one has to know that you’ve broken in. 

Having a quantum computing system that can crack RSA at will would undermine the foundational security architecture of the modern internet, and thus, everything in our world that depends on secure communications.

Despite the headlines, however, this threat is far from imminent. Running Shor’s algorithm (the quantum algorithm that would be used to factor RSA keys) against a 2,048-bit key would require a fault-tolerant quantum computer with millions of physical qubits supporting thousands of error-corrected logical qubits. No such machine exists today, and the most credible timelines place it at least a decade away, possibly longer.

Interestingly, these codes do not have to be cracked today for the threat to matter. Intelligence agencies and well-resourced adversaries are already engaged in what security researchers call “harvest now, decrypt later.” The strategy involves intercepting and storing encrypted communications now, with the intention of decrypting them once quantum capability eventually arrives. Classified government communications, sensitive financial negotiations, and proprietary corporate data transmitted today could be exposed a decade from now. For information with a long shelf life, the encryption protecting it needs to survive a future that current standards were not designed for.

This is the justification for government action. The National Institute of Standards and Technology finalized its first set of post-quantum cryptographic standards in 2024, after an eight-year evaluation process involving submissions from cryptographers worldwide. These updated standards are designed to resist attacks from both classical and quantum computers, relying on mathematical problems (e.g. lattice-based cryptography) that are believed to be hard for quantum algorithms to solve. The migration effort has been considerable, comparable in scope to the Y2K remediation of the late 1990s. Every system, protocol, and device that relies on current encryption standards will eventually need to be updated, a transition that will take years and cost billions across both the public and private sectors.

Quantum on the Defensive

But quantum computing is not only poised to break encryption – it can also defend. Quantum Key Distribution, or QKD, uses properties inherent to quantum mechanics to establish encryption keys that are theoretically impossible to intercept without detection. Any attempt to eavesdrop on a QKD channel disturbs the quantum states being transmitted, alerting both parties to the intrusion. China has demonstrated QKD over satellite links spanning thousands of kilometers, and several European governments have invested in quantum communication networks as part of broader national security infrastructure.

Several companies have pursued quantum cryptography projects with results that vary in credibility. ID Quantique, a Swiss firm, has been deploying QKD hardware to financial institutions and government agencies for over a decade and represents the most mature commercial implementation available. Toshiba has run QKD trials over existing fiber optic infrastructure in the United Kingdom. IBM and Google are both engaged in post-quantum cryptography research, focused on hardening their own cloud infrastructure against future threats. On the more speculative end, some quantum computing vendors have attempted to position their hardware as relevant to cryptographic applications today, a claim that requires scrutiny given that current qubit counts and error rates fall far short of what any meaningful cryptographic attack would require. 

The technology is real, but practical deployment remains limited. QKD requires dedicated hardware and specialized infrastructure, and its security guarantees apply only to the key distribution process, not to the classical systems that use those keys.

Coherence is writing more about QKD in the future – including a breakdown of how it works. Make sure to sign up for membership so you can be alerted when it is published.

Most of the near-term action in this space is defensive rather than offensive. The immediate priority for enterprises and governments is auditing existing systems, identifying cryptographic dependencies, and beginning the migration to post-quantum standards.

Search and Grover’s Algorithm

Most discussions of quantum computing focus on simulation, optimization, or cryptography. Search is less glamorous, but it sits at the foundation of how computers solve problems, especially in databases and lookup-heavy workloads.

Grover’s algorithm was developed by Lov Grover in 1996. The problem it addresses involves search: given an unsorted list of items, find the one you are looking for. A classical computer has no choice but to check items one by one in the worst case, meaning a search through one million items could require one million steps. Grover’s algorithm does the same search in roughly the square root of that number of steps – about one thousand checks instead of one million. Interestingly, that square-root relationship holds regardless of the size of the database.
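The square-root relationship is easy to tabulate. A minimal sketch (these are query counts, not wall-clock times, and they ignore real-hardware overhead) shows how the gap scales:

```python
# Illustrative arithmetic for Grover's quadratic speedup: classical
# unstructured search needs ~N checks in the worst case, while Grover's
# algorithm needs on the order of sqrt(N) quantum queries.

import math

for n in (10**6, 10**9, 10**12):
    grover = math.isqrt(n)  # integer square root: the ~sqrt(N) query count
    print(f"N = {n:>15,}: classical ~{n:,} checks, Grover ~{grover:,} queries")
```

Note that the advantage grows with the database: at a million items the speedup is a factor of a thousand; at a trillion items it is a factor of a million.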

This is one of the first established (and published) quantum advantages over classical computation. Unlike many quantum computing claims, Grover’s algorithm is mathematically established and has been demonstrated on real quantum hardware, even if only at small scales. The result sits alongside Shor’s algorithm as one of the foundational theoretical achievements of the field.

The practical implications are harder to pin down. A quadratic speedup is meaningful in absolute terms, but it is less transformative than the exponential speedups quantum computing promises in areas like simulation and cryptography. Cutting a search from one million steps to one thousand is useful, but not necessarily the kind of improvement that makes headlines. For comparison, Shor’s algorithm does not just speed up factoring; it reduces a problem that would take longer than the age of the universe and brings it down to hours. Grover’s algorithm indeed compresses timelines but does not cross that kind of threshold.

There is also a structural issue with applying Grover’s algorithm to real-world search problems. The algorithm’s advantage applies specifically to unstructured search, meaning a database…
