I’m not claiming the results are faked. What I’m rejecting is the ontological claim being made about what those “96 logical qubits” are and what their scaling implies. Yes, I accept that the Harvard/QuEra team probably ran experiments, collected data, and observed reduced error rates, relative to their own error model, as system size increased (the scaling relation at the end of this reply is the standard form of that claim). That much can be true.

What I do not accept is the unstated assumption underneath the entire interpretation: that the substrate they are operating on exists in continuous time, and that the objects they call “logical qubits” persist as stable ontological entities across that time to be computed on. Fault tolerance in quantum computing presupposes continuous-time unitary evolution punctuated by error-correction cycles. If time is discrete and quantized, as Bitcoin empirically demonstrates in a way physics itself cannot, then the meaning of “coherence,” “error accumulation,” and even “logical qubit” changes fundamentally. In a discrete-time ontology, superposition is not a persistent physical state; it is a probabilistic description of what lies between irreversible updates. Under that model there is no substrate on which long-lived logical qubits can exist in the sense required by Shor’s algorithm or large-scale quantum computation.

On the second point: this is a press release and an article. Neither you nor I can independently verify the claim by reproducing the system, inspecting the raw experimental state transitions, or validating that the assumed time ontology matches reality. Accepting the conclusion therefore requires trust: trust in institutions, in interpretive frameworks, in assumptions about time. Bitcoin is fundamentally different. Its claims are verifiable by construction: anyone can independently validate each block, each state transition, and the irreversibility of its time updates without trusting the author (the sketch at the end of this reply shows the kind of check I mean).

The disagreement isn’t “did they reduce error rates in their experiment?” The disagreement is: does that experiment demonstrate what they think it demonstrates, given that the entire formalism assumes continuous time? From the ontology of time demonstrated by Bitcoin, the answer is no. At best, these systems are improving control over probabilistic measurements within a continuous-time approximation. That may be useful engineering. It is not proof that scalable, ontology-valid quantum computation exists. Bitcoin still stands alone here as the proper instantiation; it just doesn’t give them the control they want.

Regarding your other point: the discrete tick of time is the limit.
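To be concrete about what “their scaling implies” refers to: in the standard fault-tolerance analysis (this is the textbook form for a distance-d code such as the surface code, not a formula taken from their paper), the logical error rate per correction cycle is supposed to be suppressed exponentially in d once the physical error rate p sits below a threshold p_th:

$$ p_L \;\approx\; A \left( \frac{p}{p_{\mathrm{th}}} \right)^{\lfloor (d+1)/2 \rfloor} $$

Every quantity in that expression (p accumulating between cycles, the cycles themselves) is defined against a continuous-time substrate, which is exactly the assumption I’m contesting.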
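And to make the Bitcoin contrast concrete, here is a minimal sketch of the kind of trust-free check I mean: verifying the proof-of-work of a single raw block header yourself. It assumes you already have the 80-byte serialized header (for example, the hex returned by Bitcoin Core’s getblockheader with verbose=false), and it checks only the proof-of-work condition, not the full set of consensus rules.

```python
import hashlib

def bits_to_target(bits: int) -> int:
    """Expand the compact 'bits' field of a header into the full 256-bit target."""
    exponent = bits >> 24
    mantissa = bits & 0xFFFFFF  # valid headers never set the sign bit, so it is ignored here
    return mantissa << (8 * (exponent - 3))

def header_hash(header: bytes) -> int:
    """Double SHA-256 of the serialized header, read as a little-endian integer."""
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return int.from_bytes(digest, "little")

def check_pow(header: bytes) -> bool:
    """Return True if an 80-byte serialized header satisfies its own proof-of-work target."""
    if len(header) != 80:
        raise ValueError("a serialized block header is exactly 80 bytes")
    bits = int.from_bytes(header[72:76], "little")  # compact target sits at byte offsets 72..75
    return header_hash(header) <= bits_to_target(bits)

# The genesis block header (a public constant) as a quick self-check.
genesis = bytes.fromhex(
    "0100000000000000000000000000000000000000000000000000000000000000"
    "000000003ba3edfd7a7b12b27ac72c3e67768f617fc81bc3888a51323a9fb8aa"
    "4b1e5e4a29ab5f49ffff001d1dac2b7c"
)
print(check_pow(genesis))  # True
```

Chaining the prev-block hashes and re-running this check block by block is what “validating each state transition without trusting the author” cashes out to in practice; a full node does the same thing plus the rest of the consensus rules.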