freezing or seizing old addresses as part of a 'quantum hard fork' is a non-starter
never going to happen, nor should it
quantum risk is theoretical; any protocol changes should do no harm
it's theoretical but it's backed up. we have machines running at 50 logical qubits, sustained, so error correction is working. we need maybe 1,700 logical qubits sustained for some hours to crack a Bitcoin key. Yes, going from 50 to 1,700 is a big jump when you consider what's involved, but you could say that about many jumps we've made in the past.
Pls cite references
QuEra + Harvard and MIT have achieved:
3,000-qubit array
sustained for over two hours
96 logical qubits (executing algorithms)
Basically we're close to 100 logical qubits sustained long enough to do real work. Also China is investing 5x what the US private sector is; we have no idea of the results, very secretive. They could be at 200, 300, no idea.
The fact that you need ~2000 logical qubits and a few hours to crack a bitcoin key is well known, it's just math.
Because Shor's algorithm is being optimised it could go down to 1,800, 1,700, 1,600 logical qubits, we don't know. There are also AI-assisted classical algorithms to shrink the quantum input.
Basically it's happening. This is why Signal has *already* migrated, why governments are saying 2030, no more non-quantum keys.
Anyone calling quantum computing FUD has no idea what is going on. Odell up there is right to be talking about it.
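The "~2,000 logical qubits" figure can be sanity-checked against a published circuit sizing for 256-bit elliptic-curve discrete logs (Roetteler et al., 2017: roughly 9n + 2⌈log₂ n⌉ + 10 logical qubits for an n-bit curve). A back-of-envelope sketch, not a definitive estimate:

```python
import math

# Logical-qubit count for Shor's algorithm on an n-bit elliptic curve,
# using the rough sizing from Roetteler et al. (2017):
#   9n + 2*ceil(log2(n)) + 10
n = 256  # secp256k1 key size in bits
logical_qubits = 9 * n + 2 * math.ceil(math.log2(n)) + 10
print(logical_qubits)  # → 2330
```

This counts qubits only; gate counts and runtime matter just as much, which is why the "sustained for some hours" qualifier above is doing real work.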
“Big Jump” indeed.
Like saying, now that a human has broken the 4-minute mile, it's just a matter of time before a human breaks the 3-minute mile. Or the 2-minute mile!
A growing subset of theoretical (and quantum) physicists doubt that such a "jump" is even possible.
Should we be aware and paying attention? Yes.
Running around like chickens with our heads cut off screaming “QC is weeks/months/a few years from breaking SHA256!!!”?
No. That's FUD.
Since you wrote SHA256 you're not in the right loop here. Shor's algorithm is the looming threat (ECDSA), not Grover's algorithm, even though Grover's does offer quadratic speedup and there are potential threats there too.
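A quick sketch of why that distinction matters — a rough, illustrative comparison only (the n³ figure is an order-of-magnitude placeholder, not a real resource estimate):

```python
# Grover vs SHA-256: only a quadratic speedup — a 2^256 preimage
# search becomes ~2^128 quantum queries, still far out of reach.
# Shor vs ECDSA: polynomial time, so key size barely helps.
classical_sha256 = 2 ** 256   # brute-force preimage search
grover_sha256 = 2 ** 128      # square-root speedup
n = 256                       # secp256k1 key size in bits
shor_group_ops = n ** 3       # ~O(n^3) group operations (rough placeholder)

print(f"Grover still needs ~{grover_sha256:.3e} queries")
print(f"Shor needs on the order of n^3 = {shor_group_ops:,} quantum ops")
```

The asymmetry is the whole story: hash functions lose half their security bits, while elliptic-curve keys lose essentially all of them once the hardware exists.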
As for this YouTube video, you can find countless such proclamations about the impossibility of the AI we have today from 5 years ago (i.e. the time we called it machine learning because we were too ashamed to use the term AI). And from very reputable people. You can even find Yann LeCun 2 years ago saying that things LLMs can now do, LLMs would never be able to do, down to physical limits. This is how it goes.
I looked at the YouTube video, it was posted a few months ago but is very out of date. The main arguments are going back to the time before we made a number of breakthroughs specifically in error correction and the sustainability of logical qubits. These breakthroughs, proven by multiple experiments, undo many of the key arguments. If that person remade that video today they'd have to scrap half of it.
The other thing I'll add is that unexpected breakthroughs are, by nature, *unexpected*. We are on the potential cusp of a number of them in quantum computing, from many different directions. This is how it goes, this is just the age of technology we are in, not only for quantum but for gene editing and all kinds of things. Betting against the process is silly.
Food for thought….thank you.
I’m most concerned about the protocol changes that people will attempt to force upon “HODLers” in the NAME of QC-resistant changes.
Yeah but if time is not continuous the entire formalism collapses. All of physics would need to be reformulated accordingly. So where is your proof that time is continuous to say all of this is inevitable? All models presuppose and require continuous time.
You’re right that betting against a process is misguided. The process we’re in the middle of discovering is that Bitcoin is the measurable object of time. Each block is a quantized, discrete unit produced through irreversible work via entropy and energy. Fundamental thermodynamics. There are no intermediate states, no fractional causality. Cause and effect in Bitcoin are discretized.
That alone is a revelation for science. Bitcoin demonstrates quantized causality in a working, global system. Once causality is quantized, time is no longer a smooth backdrop, it is constructed. If time is constructed, then physics doesn’t sit beneath it; physics emerges after it.
Your entire ontology of physics/science is betting against Bitcoin, when the proof stares at you in the face. I can’t tell if you can’t see it, or just don’t want to 🫡
Results of experiments speak louder than words.
I trust the results of experiments, conducted by reputable teams, all over the world, and reproduced multiple times, and published in great detail—more than your words.
Results of experiments do matter. That’s exactly why I’m pointing to Bitcoin.
You say you trust experiments conducted by reputable teams. I don’t need to trust Bitcoin’s results, I can verify them. Bitcoin is not a claim or a model; it is a running experiment with 928,302 discrete blocks of irreversible state change (time), all independently reproducible by anyone, anywhere, at any time.
Each block is time constructed through a physical process: energy is expended via proof-of-work to resolve a quantized entropy search space into a single admissible state. That is measurable, repeatable, and falsifiable. There are no hidden variables, no interpretive layers, no appeals to authority, just work, entropy, and irreversible time.
The burden isn’t on me to explain why Bitcoin counts as physics. The question is yours to answer: why isn’t this a sufficient experiment? If an open, global system that produces discrete, verifiable time through energy expenditure doesn’t qualify as physics, then what exactly does?
Make sure to ping me when the first stable, 500 qubit QC has been proven/verified in the wild….
It will be worth 1mil sats to you! ⚡️
(Not being sarcastic. I will gladly pay!)
You can never verify this without trust Chris, and you know it. 😉
Also I point to Bitcoin. UTXOs are qubits. It has way more than 500.
Good point…..
Except Bitcoin, but that doesn’t count in the eyes of science.
This is all poetry.
You don't have any equation as far as I can see through your hundreds of posts. Is this new physics? Equation-less? Like serverless.
Give me 1 equation that you've come up with to support your theory that somewhere between 100 logical qubits (which is proven to exist) and ~2,000 logical qubits (which is mathematically proven to crack ECDSA) there is some impossible obstacle. Or one experiment design.
Anything but more poetry.
I’m not proposing a new threshold curve between 100 and 2000 logical qubits. I’m challenging a prior axiom your inevitability story quietly depends on: continuous time.
I don’t need any equation to falsify that assumption. Bitcoin is a running, falsifiable counterexample. Here is a block of time:
000000000000000000009fc6465aa4fc20d3324f889256815a34dbb4c7151f80.
There are 928,303 others. Each block of time is an atomic, irreversible state transition produced by work. There is no valid intermediate block, no fractional finality, no “half-time” state that nodes can verify. You cannot subdivide the temporal state transition the protocol recognizes. That is exactly what quantized time means operationally to physics.
If time is quantized, the object you call a qubit loses its assumed ontology. Superposition, as used in quantum computing, relies on continuous time to define “simultaneous” phase evolution and coherent state persistence. Change the structure of time, and superposition is no longer a physically coexisting state, it becomes a probabilistic description over discrete updates. In that frame, a qubit is not an extant computational substrate; it is a potential state between irreversible transitions.
Please, go build a Bitcoin with continuous state evolution: allow intermediate consensus states or partial finality and still prevent contradiction without trust. You can’t. Verification collapses. That’s not poetry; that’s a falsifiable property of the system.
If you still want to claim inevitability for Shor’s algorithm, then the burden is on you to prove the axiom it depends on: that time is continuous. Until then, you’re asserting an ontology Bitcoin directly challenges with open proof, and calling that challenge “poetry” doesn’t make it go away.
No equation I produce changes that as any formalism is dependent on the structure of time. I don’t have to produce any formalism to falsify the axiom you insist upon, Satoshi already did that for me.
If you want some poetry: Continuous time is the foundation to the Tower of Babel that all of physics relies upon. It just takes 1 empirical block of time for it to all come down without touching a single equation.


The Mempool Open Source Project®
Explore the full Bitcoin ecosystem with The Mempool Open Source Project®. See the real-time status of your transactions, get network info, and more.
Here is your equation btw. Discrete quantized time. Good luck falsifying Bitcoin.


I’m looking forward to the day that all these QC companies actually release a system in the wild (universally accessible) that solves a real problem or problems.
Not “financial portfolio optimisation” or other impossible to validate (and civilisation-ally meaningless) claims.
If stable 100-qubit QC applications are ALREADY changing the world for the better, where can I buy one and what could it do?
When I read the press releases of all these “credible” sources pumping their own QC bags, I get the sense they are written by Silicon Valley/Wall St marketing departments.
I’m not the only one. There are many many thoughtful people asking these same questions.
Where is the “quantum” breakthrough of important and useful APPLICATIONS? All the press seems to be focused on the promises of such applications in the future. Are we there yet? Or is 100qubits not sufficient for any REAL innovation?
Pls don’t respond angrily. Skepticism is good. Especially in novel industries that have failed to deliver on past promises and time frames.
(Follow solid/semi-solid state battery development if you want another)
PS
Even the most dumbed down LLM query offers the same skepticism. So I don’t think I’m being absurd in asking the question.


Alright, let's see where we stop agreeing.
Do you agree the Harvard team successfully executed fault-tolerant algorithms with 96 logical qubits, showing that error rates actually improved as the system scaled?
Or do you think that result is faked?
This they released in 2024.
Now let's see how they met those goals in 2025.
Pretty much right on roadmap


It seems pretty crazy to suggest that somewhere between 96 (which we've proven) and ~2000 (which we know cracks ECDSA) there is some limit of the universe. If there is such a limit, then there needs to be an equation that states where that limit is and why.
I’m not claiming the results are faked. What I’m rejecting is the ontological claim being made about what those “96 logical qubits” are and what their scaling implies.
Yes, I accept that the Harvard/QuEra team probably ran experiments, collected data, and observed reduced error rates relative to their own error model as system size increased. That much can be true. What I do not accept is the unstated assumption underneath the entire interpretation: that the substrate they are operating on exists in continuous time, and that the objects they call “logical qubits” have stable ontological existence across that time to compute on.
Fault tolerance in quantum computing presupposes continuous-time unitary evolution punctuated by error-correction cycles. If time is discrete and quantized as Bitcoin empirically demonstrates in a way physics itself cannot, then the meaning of “coherence,” “error accumulation,” and even “logical qubit” changes fundamentally. In a discrete-time ontology, superposition is not a persistent physical state; it is a probabilistic description between irreversible updates. Under that model, there is no substrate on which long-lived logical qubits can exist in the sense required by Shor’s algorithm or large-scale QC.
On the second point: this is a press release and an article. Neither you nor I can independently verify the claim by reproducing the system, inspecting the raw experimental state transitions, or validating that the assumed time ontology matches reality. Accepting the conclusion therefore requires trust, trust in institutions, trust in interpretive frameworks, trust in assumptions about time. Bitcoin is fundamentally different. Its claims are verifiable by construction. Anyone can independently validate each block, each state transition, and the irreversibility of its time updates without trusting the author.
The disagreement isn’t “did they reduce error rates in their experiment?” The disagreement is: does that experiment demonstrate what they think it demonstrates, given that the entire formalism assumes continuous time?
From the ontology of time demonstrated by Bitcoin, the answer is no. At best, these systems are improving control over probabilistic measurements within a continuous-time approximation. That may be useful engineering. It is not proof that scalable, ontology-valid quantum computation exists.
Bitcoin still stands alone here as the proper instantiation, it just doesn’t give them the control they want.
Regarding your other point: The discrete tick of time is the limit.
>does that experiment demonstrate what they think it demonstrates.
Putting aside what the implications might be do we agree on the below facts:
- They use 96 logical qubits
- They ran fault-tolerant algorithms that used all of them
- Error rates actually improved as the system scaled (for example from their paper, using a distance-5 code instead of distance-3 roughly halved the logical error rate per round, a 2.14× reduction)
Can we agree those numbers are correct, and then after get to the implications?
I can’t agree these numbers are correct, nor can I agree they ran an algorithm, because I have to trust an article; there is literally no way for ME (or you) to verify the claims of this article without invoking trust. I was not there and there is no evidence of proof beyond a paper.
If you want me to trust they did what they claimed, sure. But Bitcoin has already taught us that trust is not a substitute for proof.
You are still missing my central point: without continuous time, there is no logical qubit in the sense they are asserting. This does not discredit the fact that they are interacting with some physical substrate they choose to label a “logical qubit.” It discredits the ontology they are assuming. If time itself is quantized, then the mathematical object they are “computing” over is not what they think it is.
If the goal is a substrate that genuinely exists in multiple unresolved states across time, why not compute on top of UTXOs? We can suspend UTXOs indefinitely in the mempool, unresolved yet fully defined, across quantized block time. That is a real, observable superposition, one enforced by consensus, not inferred from black box error models.
The crucial difference here is we can prove it on Bitcoin. Bitcoin is open, verifiable, and reproducible by anyone. No press release required. No trust invoked. If you claim computation, show it on a ledger that anyone can audit; you literally have superposed states with unmined transactions in the mempool at your disposal, and they won’t decohere until they are measured (“mined”).
If you want an experiment, go ahead. We’re all waiting….
Gotta rewind here: QuEra/Harvard publishes a result, peer reviewed in Nature, signed off on by researchers at MIT, NIST, U of Maryland and Caltech, but you don't agree the numbers should be believed, nor do you agree they ran the algorithm they said they did. I'm not talking about the conclusions, just the raw data here.
What about the earlier experiments by Microsoft + Quantinuum (trapped Ions), or Microsoft + Atom Computing (neutral), or Zurich, or any of the others, are we accepting any of the raw data from those?
I need to figure out where the bottom is here. If there is no bottom then it's just solipsists discussing sociology.
Any article with peer review is not empirical proof, I hate to break it to you.
This is empirical proof. See the difference? You are literally looking at an object of discrete and quantized time. Run your own node if you don’t want to trust mempool.
I don’t care about qubit claims unless you can first provide empirical proof that time is continuous. Without that, everything rests on an unfalsifiable assumption. Gödelian limits already show you can’t even test that axiom from within the system doing the measuring.
You point me to peer-reviewed papers; I point you to cryptographic proof. It’s public, conserved, and independently verifiable state transitions of a bounded thermodynamic system. You are asking for trust and I am removing the need for it.
If your model requires assuming continuous time for “logical qubits” to exist, it’s already on shaky ground. Bitcoin doesn’t assume time; it computes it. In the end time (lol) will be judge. Bitcoin is time.
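The "verify it yourself" point is at least mechanically true for block hashes: anyone can recompute a block's proof-of-work hash from its 80-byte header with two rounds of SHA-256, no node or third party required. A minimal sketch using the well-known genesis block header:

```python
import hashlib

# Bitcoin's genesis block header (80 bytes, hex). Fields in serialized
# order: version, prev-block hash (all zeros), merkle root, timestamp,
# difficulty bits, nonce — all little-endian.
header_hex = (
    "01000000" + "00" * 32 +
    "3ba3edfd7a7b12b27ac72c3e67768f617fc81bc3888a51323a9fb8aa4b1e5e4a" +
    "29ab5f49" + "ffff001d" + "1dac2b7c"
)
header = bytes.fromhex(header_hex)

# Block hash = double SHA-256 of the header, displayed byte-reversed.
digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
block_hash = digest[::-1].hex()
print(block_hash)
# → 000000000019d6689c085ae165831e934ff763ae46a2a6c172b3f1b60a8ce26f
```

The leading zeros are the proof-of-work itself; checking them requires no trust in mempool.space or anyone else.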


Okay so if we can't find a bottom, let's see if we can find the top.
Whatever it is that is making Bitcoin immune from quantum threats in the way you say, does this also apply to Ethereum?
You’re assuming the threat even exists in the first place. Your entire argument is built on assumptions you cannot verify.
Whatever it is that makes Bitcoin immune in the way you’re suggesting does not magically extend to Ethereum, because this isn’t about a specific chain, it’s about the architecture of time. Bitcoin doesn’t add immunity; it reveals the object you’re misunderstanding. You cannot perform the computation you assume you can because your model of time is wrong. Full stop.
There is nothing mystical about Bitcoin beyond cryptography enforcing conservation. What Bitcoin actually exposes is the error in the threat model itself. The assumed computation fails not because Bitcoin resists it, but because the computation presupposes a continuous-time substrate that does not exist.
Can Bitcoin wallet keys ever be cracked using a quantum computer? You are saying no, definitely not.
Can Ethereum wallet keys ever be cracked using a quantum computer? You are saying possibly.
Have I got that right?
No, neither will be hacked by one.
There is no threat because the threat assumes a continuous model of time that does not exist. Here is your proof time is quantized and discrete:
Please go ask an AI (if you are not capable yourself) about what happens to QM/QC formalism and what superposition and decoherence mean with a discrete and quantized model of time instead of a continuous model and report back. Be sure to ask it how you take the derivative over Schrödinger’s eqn with discrete and quantized time.

Okay so now we're getting somewhere. Both Bitcoin and Ethereum wallet keys are equal in this regard (can never be cracked by a quantum machine). Let's put a pin there.
Next does this extend to ALL wallets from all blockchains? Or are there some blockchain wallets with private keys that a quantum computer MIGHT be able to input the public key for and then have the quantum computer output the private key (crack it)?
No, I’m literally saying your model of a quantum computer can’t exist *IF* time is quantized and discrete. There is no threat model at all.
Go ahead and ask AI yourself. Post the question and prompt here open for everyone to see. This is not some bullshit claim.
We can disagree on *IF* Bitcoin is empirical proof of quantized time, but you cannot disagree with my first claim. Any physicist would admit that if it were true, because the math simply breaks down and the meanings of observations change.
Hang on, rewind, this is important. You said, categorically, that the current wallet keys for both Bitcoin and Ethereum wallets will never be cracked by a quantum computer.
I'm asking does that apply to all wallet keys from all known blockchains? Solana, Kaspa, Quai, Dogecoin...
You surely have an answer for this.
There is no threat IF time is quantized and discrete.
So basically you don’t believe that quantum computers exist at all, in any capacity; that the whole thing is either a hoax or a big misunderstanding.
It's either that or you believe they do exist, and quantum computation is real, but just not very powerful vis a vis these use cases.
I’m saying that if time is quantized and discrete, then the mathematical substrate required for computation does not exist in the way the formalism assumes.
QC relies on continuous-time unitary evolution to define superposition, phase accumulation, interference etc. If time has a smallest indivisible tick, that formalism breaks; this is a well-known problem in mathematical physics. Differential equations cease to be fundamental; coherence across infinitesimal intervals is undefined with an atomic tick. What remains are discrete update rules, not a scalable computational substrate.
So QC as a model of computation with asymptotic power (e.g. Shor), requires an assumption about time that may not be true. If time is discrete, QC reduces to an effective, limited approximation, not a fundamentally new computational class.
That’s not controversial. Discrete time breaking continuous-time QM/QC is a known result. You’re welcome to fact-check that. I will wait for you to do so.
So what is your actual position on quantum advantage?
a) It does not exist, it is a hoax
b) It does not exist, it is a misunderstanding, misreading, etc.
c) It does exist, it's a real thing, but is never going to be powerful enough to crack keys and such (works but lack of use case)
Which is it?
I don’t know why this is being framed as a spectrum of opinions. The outcome is binary.
Either time is continuous (infinitesimally divisible at the physical level) in which case the quantum formalism is internally consistent and large-scale quantum advantage is, in principle, real.
OR time is quantized and discrete (physically indivisible at a fundamental level) in which case the formalism underlying quantum computation collapses, and so do the claims built on it.
The outcome is binary dependent on the nature of time itself.
Every existing model of physics quietly assumes the first. It has to be said that that assumption has never been proven; it’s assumed for mathematical convenience. I’m pointing out that the second outcome is not only possible, but empirically instantiated in Bitcoin.
Bitcoin provides an observable object of time. It produces discrete, indivisible temporal states (blocks) by resolving a bounded entropy search space (nonce & difficulty) into a single admissible outcome (a block of time) through irreversible work. It is a global, decentralized measurement process, an experiment run approximately every 10 minutes that anyone can independently verify with perfect fidelity. There are no intermediate states, no fractional blocks, no continuous interpolation. Time advances only when work collapses entropy into structure. Causality is quantized. These are my observations.
If that observation is correct, it doesn’t just challenge quantum computing, it challenges all physics built on continuous time. That’s not anti-empirical. Bitcoin is empirical in the strongest sense: you can verify every single block of time yourself from Genesis. If that’s not enough to question an unproven axiom, then the issue isn’t evidence, it’s just a sunk-cost commitment to a prior model that nobody in physics can afford to be wrong about.
Tick Tock Next Block Joe, Bitcoin waits for no one.
Seems a pretty easy question to give a straight answer to. Not sure why so hard.
Quantum = Fiat
did they crack 21 into 3 and 7 yet?
21 is widely expected for 2026. Harvard has already demonstrated 96 logical qubits, and we only need about 15-20 to factor 21. So the hardware is big enough and error correction is proven to scale; we're waiting for gate accuracy to reach the point where it can handle a circuit around 2,400 gates deep. Code word for circuits of that depth is "deep logical circuits".
I'd honestly not be surprised if a lab in China has already done it; they have orders of magnitude more funding than any lab in the US and typically do not announce results. But it's almost certain to happen in the US in 2026, be it Harvard, Quantinuum, Atom Computing, or whomever else.
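For context, the number-theoretic core of Shor's algorithm applied to 21 can be sketched classically; the only step a quantum computer accelerates is finding the order r (brute-forced here):

```python
from math import gcd

def shor_classical(N, a):
    """Classical sketch of Shor's reduction: factor N given a base a.
    A quantum computer would find the order r with phase estimation;
    here we brute-force it to show the reduction itself."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # lucky guess, no order-finding needed
    # Find the order r: smallest r > 0 with a^r ≡ 1 (mod N)
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2:
        return None               # odd order: retry with another a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None               # trivial square root: retry
    return gcd(x - 1, N), gcd(x + 1, N)

print(shor_classical(21, 2))      # → (7, 3): order of 2 mod 21 is 6
```

The hard part on real hardware is running the order-finding circuit coherently, which is exactly what the gate-depth requirement above refers to.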
"Never" is a long time.
I haven't seen anyone suggesting action should be taken today.
We're just trying to game out potential scenarios.
Don't be shy Jameson, tell ODELL how you really feel. 

Cypherpunk Cogitations
Against Allowing Quantum Recovery of Bitcoin
An argument in favor of burning bitcoin in vulnerable addresses to prevent funds from being taken by those who win the quantum computing race.
seems like a good amount of time to commit to not rug pulling other people.
another term for it would be values.
We honestly need a wick to $20 back up to $80k in hours so people finally understand what the fuck it is we are dealing with here. Bring on the quantum
Hard NACK on any fork that seizes coins
Just giving people choices would be enough
You mean censorship shouldn't exist on Bitcoin? 😁
more like property rights shall not be infringed
"Any protocol changes should do no harm"
Should direct this to Core... they are the likely party to submit a hastily put together PR. Recent examples as proof
Yes. They just enabled the protocol to have dick pics! That's a real upgrade.
Fuk core devs and fuk Adam Back
Fuck yeah boys! RUG THE SPAMMERS!
I agree with this, I don't know why freezing assets is even raised as an option. Leave the quantum bounty there and let the market work it out once they move. Simple
Seize deez
I don’t understand what’s so bad about a quantum computer eventually being able to grave rob old addresses.
Such is life. Those that have their keys will upgrade their wallets, and people will have incentives to build quantum computers to grave rob abandoned ones.
"Welcome to bitcoin where property rights are respected. Except in cases of (quantum) emergency. Now do this upgrade or else we'll sic Google's fake quantum computer on Satoshi's stacks."
Agreed.
“Quantum Computer” risk is an invisible boogey-man, and invisible bogies are the bread ‘n’ butter of 3-letter agencies.
Bitcoin is always a litmus test.
Don’t change it until it’s proven it can be broken.
OSSIFY.
My understanding is that SHA-256 is weakened by quantum algorithms, but not to an extent that is significant in practice. The situation is not the same as for RSA. It is not automatic that if we build powerful quantum computers, bitcoin breaks. An algorithmic breakthrough is needed. Worth thinking about, but there is already a risk that people break cryptography with classical algorithms. Am I understanding this correctly?
people don’t want ossify because lazy
if quantum is not a scam (i think it is) and if it ever poses a real threat to bitcoin then bitcoin just dies that day and a new fork (or many new forks) will be born and we keep pushing...
So many new people in Bitcoin and here we are…debunking all the FUD all over again. Jesus Christ, do people read anymore? So much content out there on quantum! Watching Ant-Man and the Wasp: Quantumania for starters.
Do no harm plz
If protecting Bitcoin requires violating self-custody or rewriting history, it’s not protection. It’s abandonment of first principles.
feels like this would be a mistake and antithetical to base principles
Bitcoin is under a sustained attack from higher places.
Absolutely- you always correctly preach that everybody has to think about their own threat model in Bitcoin. This would include the OG address holders as well.
Why? The Nostr interface isn't very user-friendly right now, but the project is literally in its infancy; we're witnessing the beginning of a possible new era of social networking...