TLDR: quantum so far has scaled like shit compared to classical computers and there is no clear path to that changing anytime soon.
Google 3 Pro analysis of quantum scaling:
The comparison between the scaling of classical CPUs and quantum computers involves two fundamentally different physics paradigms, economic models, and measures of "power." While classical computing has followed a predictable exponential curve known as Moore's Law for decades, quantum computing is still in a volatile, nascent stage where scaling is defined not just by component count, but by qubit quality and error rates.
## Classical CPUs: Moore's Law and The Silicon Era
### Scaling Mechanism and Investment
Moore's Law is the empirical observation that the number of transistors in a dense integrated circuit doubles about every two years. This scaling was driven by photolithography advancements that allowed components to shrink physically.
* **Investment Model:** The investment in classical silicon was cyclical and self-funding. As chips became smaller and faster, they opened new markets (PCs, then mobile, then cloud), generating massive revenue that funded the next generation of fabs (fabrication plants).
* **Cost Efficiency:** Over 50 years, the price per transistor dropped from dollars to billionths of a dollar.
* **Result:** The scaling was "smooth" exponential. If \(N\) is the number of transistors, computational power roughly scaled as \(N\) (until the breakdown of Dennard scaling around 2005, where power density issues limited clock speeds).
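As a rough illustration of that compounding, here is a minimal sketch (Python, assuming a constant two-year doubling period and a 1971-style baseline of roughly 2,300 transistors, both chosen for illustration) showing how a fixed doubling cadence reaches the tens of billions of transistors found in modern chips within about five decades.

```python
# Illustrative only: compound a fixed 2-year doubling period from a
# 1971-style baseline (~2,300 transistors) to show the shape of Moore's Law.
BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def transistors(year: int) -> float:
    """Projected transistor count assuming a constant doubling period."""
    elapsed = year - BASELINE_YEAR
    return BASELINE_TRANSISTORS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```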
### Actual Results Achieved
The result of this investment is tangible. A modern GPU or CPU contains tens of billions of transistors. We achieved:
* Global connectivity (Internet).
* Real-time processing of massive datasets.
* Reliable, error-free computation (bit error rates are negligible).
## Quantum Computers: Neven's Law and The Qubit Challenge
### Scaling Mechanism and Investment
Quantum scaling is often discussed in terms of "Neven's Law," a term coined by Hartmut Neven of Google, suggesting that quantum processing power grows at a double-exponential rate relative to classical computing for specific tasks (a toy sketch of this appears after the list below). However, the physical metric is the **Qubit** (quantum bit).
* **Difficulty of Scaling:** Unlike transistors, which are stabilized by material science, qubits are extremely fragile. They are susceptible to "decoherence" caused by temperature or electromagnetic noise.
* **The Investment:** Investment in quantum computing is currently speculative. It is driven by governments and Venture Capital, necessitating billions of dollars in cooling systems (dilution refrigerators), laser optics, and ion traps without immediate commercial revenue to offset the costs.
* **Cost Efficiency:** Currently, the cost per qubit is astronomical compared to the cost per transistor.
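The toy sketch mentioned above: it assumes, purely for illustration, that the qubit count doubles each period while the reachable state space is \(2^n\) in the qubit count \(n\), which is what makes the growth double exponential rather than the single exponential of Moore's Law.

```python
# Toy model of Neven's Law: if qubit count grows exponentially in time
# (assumed here: doubling each period) and state space grows as 2**n in the
# qubit count, quantum "potential" compounds twice. Numbers are illustrative.
def classical_power(period: int, base_transistors: int = 1_000) -> int:
    # Moore's Law shape: single exponential in time.
    return base_transistors * 2 ** period

def quantum_state_space(period: int, base_qubits: int = 2) -> int:
    # Assumed qubit growth: doubling each period -> 2**(qubit count) states,
    # i.e. double exponential in time.
    qubits = base_qubits * 2 ** period
    return 2 ** qubits

for p in range(5):
    print(f"period {p}: classical {classical_power(p):,}, "
          f"quantum state space {quantum_state_space(p):,}")
```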
### Computational Density vs. Physical Count
The primary contrast lies in how power is derived from the hardware components.
* **Classical:** Adding one transistor adds a small, linear increment of logic capacity.
* **Quantum:** Adding one qubit doubles the "state space" the computer can represent. A system of \(n\) qubits represents a superposition of \(2^n\) states. This suggests that quantum computers scale their *potential* power significantly faster than classical computers relative to hardware size.
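A minimal sketch of what that doubling means in practice: the classical memory needed just to store the full state vector of \(n\) qubits, assuming (for illustration) 16 bytes per complex amplitude. Each added qubit doubles the requirement.

```python
# Classical memory to hold the full state vector of n qubits,
# assuming one complex amplitude (16 bytes, two float64s) per basis state.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 20, 30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> ~{gib:,.1f} GiB")
```

Around 50 qubits the state vector already runs into the petabyte range, which is the intuition behind the "potential power" claim.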
## Comparative Analysis: Faster or Slower?
To determine if quantum computers are scaling faster or slower than CPUs, one must distinguish between *physical scaling* and *computational utility*.
### 1. Physical Hardware Scaling: Quantum is Slower
In terms of raw component count and manufacturing reliability, quantum computers are scaling **much slower** than classical CPUs did in their respective early eras.
* **Integration:** By the late 1960s/early 70s, Intel was integrating thousands of transistors. Currently, the leading superconducting quantum processors struggle to maintain coherence with just over 100 to 1,000 physical qubits.
* **Fidelity:** Classical transistors work 99.99999...% of the time. Good quantum gates currently struggle to hit 99.9% fidelity. This requires "Quantum Error Correction," meaning you might need 1,000 physical qubits to create just 1 stable "logical" qubit.
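A back-of-the-envelope sketch of that overhead, using the 1,000:1 physical-to-logical ratio quoted above (the real ratio depends on the error-correcting code and the physical error rate, so this is illustrative only):

```python
# Illustrative overhead: the text's ~1,000 physical qubits per logical qubit.
PHYSICAL_PER_LOGICAL = 1_000

def physical_qubits_needed(logical_qubits: int) -> int:
    return logical_qubits * PHYSICAL_PER_LOGICAL

# 4,000-10,000 logical qubits is the range quoted later for breaking RSA-2048.
for logical in (1, 100, 4_000, 10_000):
    print(f"{logical:>6} logical -> {physical_qubits_needed(logical):,} physical")
```

Even the low end of that range implies millions of physical qubits, against the roughly 100 to 1,000 available today.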
### 2. Computational Potential: Quantum is Faster
If we measure scaling by "complexity of problems solvable," quantum computers theoretically scale faster. This is often described using complexity classes.
* Classical computers generally require super-polynomial time for problems such as factoring large integers (the simplest models treat this as roughly \(O(2^n)\); the best known classical algorithms are sub-exponential, but still far from polynomial).
* Quantum computers, using algorithms like Shor's algorithm, can solve these in polynomial time \(O(n^3)\) [1].
Therefore, to gain a 2x increase in problem-solving capability for specific hard problems:
* **Classical:** You might need \(2^x\) more time or hardware.
* **Quantum:** You might only need to add a few qubits.
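A toy comparison using the simplified cost models above (classical treated as \(2^n\) operations and quantum as \(n^3\) operations for an \(n\)-bit input; real classical factoring algorithms are sub-exponential, so this exaggerates the classical cost, but the qualitative gap is the same):

```python
# Simplified cost models from the text: exponential classical vs. cubic quantum.
def classical_ops(n_bits: int) -> int:
    # Simplified exponential model; real algorithms (e.g. GNFS) are
    # sub-exponential, but still far from polynomial.
    return 2 ** n_bits

def quantum_ops(n_bits: int) -> int:
    # Shor's algorithm scales roughly as n^3 in the input size [1].
    return n_bits ** 3

for n in (16, 32, 48, 64):
    print(f"n={n:>2}: classical ~{classical_ops(n):,} ops, "
          f"quantum ~{quantum_ops(n):,} ops")
```

Doubling \(n\) multiplies the quantum cost by 8, while the classical cost is squared, which is why a handful of extra qubits can stand in for exponentially more classical hardware.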
## Actual Results vs. Future Projections
The disconnect between hype and reality in quantum computing is significant compared to the tangible history of Moore's Law.
### Actual Results (The NISQ Era)
We are currently in the **NISQ** (Noisy Intermediate-Scale Quantum) era.
* **Achievements:** Google and USTC (China) have demonstrated "Quantum Supremacy" or "Quantum Advantage." They performed specific calculations (like random circuit sampling or boson sampling) in minutes that would take classical supercomputers thousands of years.
* **Limitation:** These calculations were largely useless for practical business applications. They proved the physics, not the business case.
* **Value:** The dollar-for-dollar return on computation today is effectively zero for commercial enterprises compared to classical cloud computing.
### Future Projections (Fault-Tolerant Quantum Computing)
Projections rely on solving the error correction hurdle.
* **The Logic:** If engineers can push physical error rates below the thresholds required for quantum error correction, the "double exponential" power growth can kick in.
* **The Prediction:** A roughly 4,000 to 10,000 logical qubit machine could break RSA-2048 encryption, a standard for internet security.
* **The Timeline:** While Moore's Law had immediate deliverables, useful Fault-Tolerant Quantum Computing (FTQC) is projected to be 10 to 20 years away and requires investment scaling that mimics the Manhattan Project or the Apollo program, rather than the market-driven scaling of Silicon Valley.
