Oct 16, 2025
Why quantum computing is here to stay
Why fault-tolerant machines are inevitable, how error correction scales, and where real-world advantage will emerge.

Part 1 of our series on how quantum computing is moving from theory to reality. Subscribe to stay up to date.

Written by Gavin Brennen, Chief Quantum Officer at BTQ.

Every useful computer built so far, from the abacus to a GPU-based supercomputer, obeys the same laws of classical physics. But we have known for a century now, in fact 2025 is the International Year of Quantum Science and Technology marking that centennial, that nature at its core isn't classical. When atoms bind together to make complex molecules, or superconducting currents flow without resistance to create the giant magnetic fields in MRI machines, those processes are governed by quantum physics. A classical computer can try to emulate those behaviours, but it inevitably faces a slowdown that grows exponentially with the number of quantum particles involved, whether atoms, electrons, or photons. We are at a point in time where quantum computers are our only way forward to make progress on some of the most challenging problems of our day.

We've been in this situation before. The laws of thermodynamics governed the steam engines that drove the first industrial revolution in the late 18th to mid 19th centuries, but it was only by harnessing the then-new laws of electromagnetism that ubiquitous machines connected by a power grid became possible. The modern digital age was enabled by cheap and reliable semiconductor logic circuits, which deep down owe their performance to quantum tunneling effects. However, the processors in your laptops and mobile phones don't involve controlling and measuring individual quantum bits (qubits). To do that, entirely new quantum machines are needed.
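To make that exponential cost concrete, here is a minimal Python sketch of the memory a classical machine needs just to write down an n-qubit quantum state, assuming the standard representation of one double-precision complex amplitude per basis state (the function name is ours, purely for illustration):

```python
# Minimal sketch of the classical cost of storing an n-qubit state exactly.
# Assumes one double-precision complex amplitude (16 bytes) per basis state;
# a real simulator would also need working memory for applying gates.
def state_vector_bytes(n_qubits: int) -> int:
    return 16 * (2 ** n_qubits)

for n in (10, 30, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n:>3} qubits -> {gib:.3e} GiB")

# 10 qubits fits in kilobytes, 30 qubits needs ~16 GiB,
# and 50 qubits already needs ~16 million GiB (16 PiB).
```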

A full-size fault-tolerant quantum computer will be the most complex machine ever built by humans. It will likely require tens of billions of dollars in investment and the committed effort of large teams of physicists, engineers, computer scientists, and software engineers. But we are well on our way to that goal. Progress is measured by the size of an accurate quantum computation that can be run, which is determined by the number of qubits and the number of quantum gates acting on them. This is a bit different from how we benchmark classical computers, by floating point operations per second (FLOPS). The reason is that raw gate speed isn't what makes quantum computers excel. In fact, a single XOR gate on a classical CPU takes about a nanosecond, while on a quantum computer it can take tens to hundreds of nanoseconds depending on the architecture (and even longer once quantum error correction is taken into account). The power comes from the ability to run gates that classical computers can't, like the famous Hadamard gate that takes the qubit basis states, the analogues of the 0 and 1 bit states, to superposition states: |0>→(|0>+|1>)/√2 and |1>→(|0>-|1>)/√2. This one gate, together with classical gates that act on superpositions, plus measurement, is enough to unlock the whole power of quantum computing.

The real measure of quantum advantage is the number of quantum gates needed to solve a problem, and this can be huge. For example, the fast Fourier transform, one of the most common functions used in signal processing, scientific computing, and telecoms, takes O(n·2ⁿ) classical gates on an input vector with 2ⁿ components, but needs only O(n²) quantum gates on a quantum computer with n qubits, an exponential speedup. There are other problems where quantum computers are thought to give exponential speedups: factoring large numbers, ground state energy estimation of molecules and materials, and computing hidden Abelian subgroups. These exponential speedups are dramatic. For example, factoring a 2048-bit number, and hence breaking RSA-2048 public key cryptography, would take roughly 1000 years on a large supercomputer, whereas on a fault-tolerant quantum computer it would take less than a day. High-accuracy simulation of the ground state energy of the FeMoCo cofactor, which would aid the design of energy-efficient ammonia production for agriculture, can be done on a quantum computer in less than a week, whereas it is thought to be intractable on classical supercomputers. There are even more problems where less dramatic, polynomial speedups are possible. Particularly interesting are super-quadratic quantum speedups, as in linear systems solving, semidefinite programming, and certain optimization problems. These have applications in a host of areas including finance, machine learning, and robotics.
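As a concrete illustration of the points above, the short sketch below (plain NumPy, not tied to any particular quantum SDK) applies the Hadamard matrix to the two basis states and compares the asymptotic gate counts quoted for the classical FFT versus the quantum Fourier transform:

```python
import numpy as np

# Hadamard gate: |0> -> (|0>+|1>)/sqrt(2), |1> -> (|0>-|1>)/sqrt(2)
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # |0>
ket1 = np.array([0.0, 1.0])   # |1>
print("H|0> =", H @ ket0)     # [0.707  0.707]
print("H|1> =", H @ ket1)     # [0.707 -0.707]

# Gate counts for a Fourier transform on a vector with 2**n components:
# classical FFT ~ O(n * 2**n) gates, quantum Fourier transform ~ O(n**2).
for n in (10, 20, 40):
    print(f"n={n:>2}: classical ~{n * 2**n:.1e} gates, quantum ~{n**2} gates")
```

Already at n = 40 the classical gate count is in the tens of trillions while the quantum circuit needs on the order of a thousand gates, which is the sense in which the speedup is exponential.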

How far away are we from commercially useful quantum computers? Remember, we need to translate our problem of interest into a quantum algorithm run on an accurate quantum computer. Just as there are more quantum gates than classical gates, there are also more sources of noise. This noise is exactly why we don't see large coherent quantum effects in everyday life, and why a tennis ball can't tunnel through a wall. Decades of research have proven, however, that noise can be tamed using sophisticated hardware control, lasers, electric and magnetic fields, advanced materials like quantum dots and superconductors, and so forth, together with quantum error correction software and classical computers. In practice it means that a single logical qubit, where the data is actually processed, needs to be encoded into many physical qubits. The more gates in your algorithm, the more protection you need, but mercifully the number of physical qubits per logical qubit doesn't grow linearly with the algorithm size; it scales logarithmically. If this weren't the case, quantum computing wouldn't be feasible, but that scaling has now been corroborated by quantum error correction experiments run on a variety of platforms including trapped ions, neutral atoms, photonic qubits, and superconducting qubits. The actual number of physical qubits per logical qubit depends sensitively on the gate error rate, but overheads of a factor of tens to a few hundred now look achievable. With advances in quantum error correction software and improved hardware, estimates of the number of physical qubits needed to crack RSA-2048, which requires about 4000 logical qubits and a few billion logical gates, have dropped from 1 billion [2012 estimate] to less than 1 million [2025 estimate]. Many problems will be even easier. For example, the quantum software company PhaseCraft has used streamlined quantum algorithm design to compile a database of over 45 advanced materials that can be simulated by quantum computers using a few hundred to a few thousand logical qubits.
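As a rough back-of-envelope sketch of where such overheads come from, the snippet below uses the widely quoted surface-code heuristic p_logical ≈ A·(p_physical/p_threshold)^((d+1)/2), with roughly 2d² physical qubits per logical qubit at code distance d. The constant A ≈ 0.1 and the 1% threshold are typical illustrative values, not numbers from this article:

```python
def surface_code_overhead(p_phys, p_target, p_th=1e-2, A=0.1):
    """Smallest odd code distance d with A*(p_phys/p_th)**((d+1)/2) <= p_target,
    and a rough count of ~2*d**2 physical qubits (data + syndrome) per logical qubit."""
    d = 3
    while A * (p_phys / p_th) ** ((d + 1) / 2) > p_target:
        d += 2
    return d, 2 * d * d

# Example: physical error rate 1e-3, target logical error rate 1e-12
# (roughly what a billion-gate algorithm needs per logical operation).
d, n_phys = surface_code_overhead(1e-3, 1e-12)
print(f"code distance {d}, ~{n_phys} physical qubits per logical qubit")
# -> code distance 21, ~882 physical qubits per logical qubit
```

Because the required distance grows only with the logarithm of the target error rate, lowering physical error rates or switching to more efficient codes shrinks the overhead quickly, which is what drives the falling resource estimates mentioned above.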

There are now a few dozen quantum computer hardware companies; most are private, though several have gone public and some are well-established players. Last year, the neutral atom quantum company QuEra reported running hundreds of logical gates on 40 logical qubits, Quantinuum with Microsoft created 12 entangled logical qubits in trapped ions, and Atom Computing with Microsoft performed operations on 28 logical qubits with neutral atoms. The remaining challenges are pushing gate error rates even lower, to reduce time and space overheads, and connecting all the qubits together. The architecture of quantum chips is different from that of classical computers, in that the qubits tend to be organized into modules of limited size, where all the trapping and control devices can fit together. Module sizes can range from a few thousand qubits for superconducting platforms to hundreds of thousands for trapped neutral atoms. These modules are then connected by quantum links (e.g. fiber optics or microwave transmission lines) for scalable computing. For many platforms these links will be the weakest part, limiting the speed of the full quantum computation, though they are native for photonic QCs.

In summary, quantum computing is hard but, like all grand challenges, worth it. Thirty years ago there were many sceptics on scientific grounds, but progress in error correction and hardware development has reduced that community to a handful. Quantum computers aren't a panacea for all problems. For example, problems with big input/output of unstructured data could well be slower on quantum computers. Some have doubted whether the use cases make the investment worth it. You can find a list of over 70 "animals" at the Quantum Algorithm Zoo, and it has to be stressed that all of these algorithms were designed without access to QCs. Before classical computers were built, most of modern public key cryptography couldn't have been invented, machine learning and AI weren't conceivable, and chaos theory couldn't have become an empirical science. Once modest-sized QCs become available, the use cases can only be expected to grow, and we will devise new problems that have no classical analogue.