# The Bitcoin “security budget” error for block miners

https://media.licdn.com/dms/image/v2/D4E12AQEFRl1KGagdYw/article-cover_image-shrink_720_1280/B4EZpuQ40FKgAI-/0/1762786516827?e=1765411200&v=beta&t=goLHUaZGJHxwcm6a5WlFf07ViM3J53gqRU4fiiEVWSs

### Introductory reminder

Bitcoin is based on a **“smart contract” (Bitcoin spending script)** between two types of actors: **nodes** and **miners**. This contract, written entirely into the protocol code, allows the system to operate stably without a central authority or direct human coordination.

The **nodes** represent the legislative part of the network. They set and enforce the rules for block validity, control the difficulty of the work to be done, and determine which chain of blocks should be considered legitimate. They also act as a **collective ledger**: each node validates transactions, keeps a complete copy of the history, and automatically rejects any block that does not comply with the consensus rules. The true security of the network lies here, in the **redundancy of checks** and in the **cryptography of wallets**, where the length of private keys prevents any falsification of signatures.

The **miners**, for their part, form the executive power of this contract. Their mission is to produce blocks that comply with the rules defined by the nodes. Their reward—the coinbase and transaction fees—has value only if the nodes recognize their work as valid. Miners therefore participate in a purely probabilistic computing competition: each one searches, at random, for a proof of work that satisfies the set difficulty.

Technically speaking, this mining activity is what enables the **partial synchronization** of a global network without a central clock. Each block found acts as a **shared time reference point**: it marks a common milestone for all nodes, despite latency and propagation differences between them. The proof of work serves here as an **ordering signal**, allowing the entire system to maintain a common and verifiable operating rhythm. This is not cryptographic security in the strict sense—that resides in the private keys of wallets—but a **distributed timestamping mechanism**. Mining transforms energy into measured time: it does not protect the ledger, it gives it a **rhythm**. The nodes, in turn, use this rhythm to maintain the consistency of the ledger and reject blocks produced outside the rules.

Thus, mining is not an army protecting the blockchain, but a **probabilistic synchronization function**. It organizes the coexistence of honest and opportunistic actors in the same game, where cheating is discouraged by the logic of the protocol: an invalid block has no value.

This **self-regulating contract** functions as a dynamic equilibrium system. Miners contribute their computing power to try to write the next block, but the nodes constantly adjust the difficulty of the work to maintain an average pace of about ten minutes per block. If global power increases, the difficulty rises; if it decreases, it falls. The protocol therefore “ignores” the absolute power in circulation: it simply maintains a **constant time interval** between blocks, ensuring that competition is always fair.

The nodes act as **timekeepers**: they measure the rate of block production and recalibrate the computational difficulty to maintain the system's pace (a minimal sketch of this loop follows below). This ten-minute interval acts as a **common clock**—a measured, not produced, collective beat.
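As a rough illustration of this timekeeping loop, here is a minimal Python sketch of a retarget rule in the spirit of Bitcoin's (2,016-block window, two-week target, adjustment clamped to a factor of four, figures the article itself cites later); the function and constants are illustrative, not the actual client code.

```python
# Illustrative retarget rule: nodes compare the observed time for a
# 2,016-block window against the expected two weeks, then scale difficulty.
TARGET_SPACING = 10 * 60                 # ten minutes per block, in seconds
WINDOW = 2016                            # blocks per retarget period
EXPECTED_TIME = TARGET_SPACING * WINDOW  # two weeks, in seconds

def retarget(old_difficulty: float, actual_time: float) -> float:
    """Return the new difficulty after one adjustment period."""
    # Clamp the measured timespan so one period can move difficulty
    # by at most a factor of four in either direction.
    actual_time = max(EXPECTED_TIME / 4, min(actual_time, EXPECTED_TIME * 4))
    # Blocks came too fast -> actual_time < EXPECTED_TIME -> difficulty rises.
    return old_difficulty * EXPECTED_TIME / actual_time

# Hashrate doubles: the window is mined in one week instead of two,
# so the next difficulty doubles and the cadence returns to ten minutes.
print(retarget(1.0, EXPECTED_TIME / 2))  # -> 2.0
```

Whatever the absolute hashrate does, this feedback keeps the beat near ten minutes, which is exactly the “timekeeper” role described above.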
If the blocks arrive too quickly, the nodes make the calculation more difficult; if they arrive too slowly, they make it easier. Miners, for their part, provide the computational “oscillations” (hashes per second), while nodes extract a stable frequency from them, which can be used as a regulating variable.

In a conventional clock, time is measured by the **frequency of an oscillator**: a crystal vibrates, a circuit counts the pulses. In Bitcoin, the **hashes** produced by miners play an equivalent role—but the stability of time does not come from the speed of these hashes; it comes from the way the nodes **measure** them and regulate their pace. It is therefore not power that creates security, but collective measurement, which transforms a chaotic flow of calculations into an ordered sequence of blocks.

> Security therefore does not lie in the mining of blocks: mining is a measurement used by the nodes for their synchronization, and it is through this synchronization that the nodes protect against double spending.

Even if global mining power varies greatly, the protocol continues to beat at the same pace. Nodes maintain the consistency of the ledger and the stability of time; miners maintain the regular production of blocks. **This regulation completely decouples the functioning of the network from the economic fluctuations of the mining market.**

Economically speaking, **real security** depends neither on the number of miners nor on the power involved, but on the **balance of power between honest participants and adversaries**, as well as on the **flow of remuneration** distributed by the protocol. An attack becomes rational only if the value it allows to be diverted exceeds the opportunity cost of honest mining—a threshold that is rarely reachable.

Bitcoin (the nodes) thus presents itself as an **algorithmic constitution**: the rules are coded, their application is collective, and the sanction—the automatic rejection of invalid blocks—is immediate. The nodes embody the sovereignty of the rules; the miners, the enforcement power. The difficulty adjustment acts as a neutral arbiter, maintaining the regularity of time without any authority being able to alter it.

In short, Bitcoin is not an economy based on power, but on **time measurement and loyalty to the code**. It does not need an army of miners, only a consensus on the rules and a fair competition mechanism. This tacit contract between computation and validation makes the blockchain a **universal timestamping system**, where trust is replaced by the regularity of a shared rhythm.

### Why Bitcoin's concept of a “security budget” for miners is a misunderstanding

The term “security budget” is often used to refer to the amount spent on rewards (subsidy and fees) paid to miners, which is supposed to represent the “price” of Bitcoin's security. This term, **inherited from an accounting analogy, has however led to a fundamental misinterpretation**: it assumes that there is a _fixed and necessary budget_ to guarantee the security of the network, as if Bitcoin had to continually “buy” its own survival. In reality, **security is not budgeted; it emerges from a self-regulating economic and temporal equilibrium.**

**Confusion between flow and stock**

A “budget” implies a finite resource, spent to obtain a measurable service. In Bitcoin, however, the reward paid to miners is not a programmed cost spent to buy security; it is an **endogenous flow**, continuously adjusted by the fee market and the difficulty rule.
The network spends nothing: it distributes an income proportional to the scarcity of blocks and to the demand for transaction inclusion.

**A misunderstanding of causality**

The idea of a “budget” suggests that the more miners receive, the more security increases, as if spending preceded security. **In reality, the reliability of the clock's measurements results from probabilistic competition and difficulty control, not from the amount distributed.**

– If the hashrate drops, the difficulty adjusts to maintain the block rate; the logical security of the measurement remains intact as long as the honest majority remains.

**Thus, Bitcoin does not “pay” for its security: it pays a market price for successful work, whose value is determined by the demand for time measurement for a given effort, deriving from it “a universal time, via the volume of work accomplished at an adjusted power.”**

**A misinterpretation of the role of work**

Work does not buy security; it timestamps the order of events. **Proof of work (PoW) does not protect the system through energy expenditure, but through its contribution to the function of a random, decentralized metronome**: it synchronizes an asynchronous network by imposing a physical limit on the speed of falsification. The energy expended is an _opportunity cost_ that makes rewriting history economically irrational, not an insurance policy taken out with the miners.

**Confusion between marginal cost and total cost**

Bitcoin's security depends on the marginal cost of an attack at a given moment, not on the total historical cost of mining. Even if global power declines, an attack remains exactly as expensive as the current cost of beating the difficulty: **past expenditure is not an amortized budget; it has no accumulated defensive value.** In other words, security is _instantaneous_, not cumulative.

**A false analogy with an insurance service**

Some commentators equate mining with a defense service that the protocol should pay for continually in order not to lose its security. This view is wrong:

– Miners do not protect anything external; they participate in a game whose only valid outcome is an accepted block.
– The protocol cannot “buy” their loyalty; it only rewards compliance with the rules.

**Security comes from automatic verification, not from trust in miners.**

**Argument 1: “If the reward decreases, miners will leave, so security will decline.”**

**Weighting:**

– Yes, a lower hashrate reduces the absolute cost of an attack, but the difficulty also decreases, preserving the block cadence.
– What changes is economic security (the cost of a 51% attack), not the logical security of the consensus.
– **In the long term, the transition to a fee-only era makes this dynamic more sensitive; hence the need for an active fee market, but not for a “budget” in the fixed sense.**

**Argument 2: “Miners provide security, so they should be paid in proportion to the risk.”**

**Weighting:**

– Miners do not “protect”; they produce compliant blocks to obtain a random income (made concrete in the sketch below).
– Their incentive is based on the expectation of gain, not on remuneration proportional to risk.
– **Their role is neutral: they have neither the responsibility nor the ability to ensure security outside the validation protocol; their work, whether large or small, is measured in order to maintain the time interval between blocks.**
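To make that expectation-of-gain point concrete, here is a small hedged sketch: a miner holding a share s of total hashrate wins each block with probability s, so its income is a random variable whose mean is straightforward. The numbers below (the 1% share, the block reward) are hypothetical, chosen only for illustration.

```python
import random

BLOCKS_PER_DAY = 144   # roughly one block every ten minutes
REWARD = 3.125         # hypothetical block reward (subsidy + fees), in BTC

def expected_daily_income(share: float) -> float:
    """Mean income of a miner holding `share` of the total hashrate."""
    return share * BLOCKS_PER_DAY * REWARD

def simulated_daily_income(share: float, rng: random.Random) -> float:
    """One random day: each block is won independently with probability `share`."""
    wins = sum(rng.random() < share for _ in range(BLOCKS_PER_DAY))
    return wins * REWARD

rng = random.Random(42)
print(expected_daily_income(0.01))        # 1% of hashrate -> 4.5 BTC/day on average
print(simulated_daily_income(0.01, rng))  # a single day's (random) realization
```

The income is purely an expectation over a lottery; nothing in it is priced as “risk coverage” for the network.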
**Argument 3: “Lowering the security budget will lead to centralization.”**

**Weighting:**

– This risk exists if the break-even point becomes too high.
– However, **centralization stems more from energy economies of scale and geographic concentration than from the overall amount of rewards.**
– A lower difficulty, moreover, allows more modest miners to compete again; decentralization is therefore not directly correlated with the total budget.

**Argument 4: “Without a minimum budget, Bitcoin will be vulnerable when the subsidy ends.”**

**Weighting:**

– This is the most serious criticism (Budish 2018), but it concerns 2140.
– However, paying for security through inclusion fees is _endogenous_: if the demand for finality increases, fees adjust.
– Furthermore, **security depends on the attack-value/cost ratio, not on an absolute amount: as long as the attackable value remains lower than the cost of a reversal, the equilibrium remains stable.**

**Argument 5: “The security budget measures the economic health of the protocol.”**

**Weighting:**

– It is a useful accounting indicator (for tracking flows to miners), but it does not measure security.
– **The true metric is the unprofitability inequality:**

k × (R_b × P + C_h) > V_a

where:

- **k**: number of confirmation blocks required
- **R_b**: reward per block (subsidy + fees)
- **P**: price of bitcoin
- **C_h**: operational cost of producing a block
- **V_a**: economic value the attacker could divert

> As long as this condition is met, economic security is assured, regardless of the overall level of the “budget.”
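As a numeric illustration of that inequality, here is a hedged sketch with entirely hypothetical figures (k, reward, price, cost, and attackable value are invented for the example; they are not estimates of the real network):

```python
def attack_is_unprofitable(k: int, r_b: float, p: float, c_h: float, v_a: float) -> bool:
    """Check the unprofitability condition k * (R_b * P + C_h) > V_a."""
    cost_of_rewriting = k * (r_b * p + c_h)  # what rewriting k blocks must burn
    return cost_of_rewriting > v_a           # True -> the attack loses money

# Hypothetical numbers, for illustration only:
k = 6              # merchant waits six confirmations
r_b = 3.2          # reward per block (subsidy + fees), in BTC
p = 100_000        # price of one bitcoin, in USD
c_h = 50_000       # operating cost of producing one block, in USD
v_a = 1_500_000    # value the attacker hopes to double-spend, in USD

print(attack_is_unprofitable(k, r_b, p, c_h, v_a))
# 6 * (3.2 * 100000 + 50000) = 2,220,000 USD > 1,500,000 USD -> True
```

Raising V_a above the left-hand side flips the result, which is exactly the sense in which security is a ratio, not an absolute amount.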
### Bitcoin security has no fixed price

> **Bitcoin's security is not a service to be financed, but an emergent property of a game of incentives and automatic adjustments.**

> **The protocol does not buy security; the nodes create an environment where cheating becomes economically irrational, so as to synchronize the network without bias. Security itself comes from the cryptography used on wallets.**

**The flows to miners are not a “budget” but a _tension thermometer_: they reflect the demand for finality and the competition for block space.**

To reduce Bitcoin to a simple question of budget is to misunderstand its fundamental nature: a system where security is a **logical consequence of consensus and verification**, not an operating cost.

### The value of bitcoins has no relation to their production cost

Some argue that bitcoin should have a minimum value, namely the energy and hardware cost of mining it. This idea seems intuitive: if mining is expensive, the price should at least cover this expense, otherwise miners would cease their activity. However, this interpretation confuses **economic value** and **production cost**, two distinct concepts in the tradition of market economics—and, from a methodological point of view, without any direct causal link.

**Production cost is not the cause of value**

In an economy based on the subjectivity of exchanges, the value of a good is not determined by the amount of labor or energy required to produce it, but by **the actors' assessment of its marginal utility**: what they are willing to exchange to obtain it. A mined block is rewarded not because it “costs” a certain number of kilowatt-hours, but because it yields a bitcoin recognized by the network as valid and transferable. If demand for exchanging bitcoin collapses tomorrow, the price can fall below the cost of production without the protocol being affected. The market will simply adjust the hashrate and the difficulty downward.

**Cost is formed from price, not the other way around**

The mining adjustment mechanism illustrates this causal reversal. When the price of bitcoin rises, new miners enter, increasing the difficulty and therefore the marginal cost of production; when the price falls, miners withdraw, the difficulty decreases, and the average cost follows. The cost of production adapts to the market equilibrium price, not the reverse. In other words: **the market price determines the viable cost**; it is not the cost that sets the price. The cost of production is therefore not a theoretical value floor, but the _consequence_ of the observed price and of the competition to obtain it.

**Bitcoin has no measurable “intrinsic” value**

The belief in a minimum value linked to the energy consumed rests on an analogy with physical goods. But Bitcoin is not a material good: it is a **decentralized property ledger**. Its value derives from collective trust in the validity of this ledger and in its algorithmic scarcity. Neither electricity, nor silicon, nor the miners' work confers intrinsic value on the monetary unit; they serve only to guarantee its issuance and its temporal consistency. If electricity became free, or if more efficient algorithms divided the cost of a hash, the value of bitcoin would not be affected; only the cost of entering the mining competition would change.

**The market erases any stable correlation**

Historically, the correlation between the estimated production cost and the price of bitcoin is variable and unstable:

– during bull runs, the price rises well above the marginal cost;
– during prolonged declines, it often falls below it without the protocol stopping;
– the difficulty retarget corrects these imbalances by maintaining the block rate.

This proves that the system works without reference to a minimum “energy” value.

**The cost of mining is an equilibrium price, not a floor value**

What some call the “production cost” is in reality the **instantaneous equilibrium price** of the proof-of-work service: the point where expected revenues offset the marginal cost of electricity. If the price of bitcoin falls, high-cost miners withdraw, lowering the average cost and bringing the network back to a new equilibrium. **Production is never destroyed for lack of a “budget”; it reorganizes itself.**

### Conclusion

To tie a minimum value of bitcoin to its production cost is to reverse the direction of economic causality. Cost does not found value; it follows from it. Energy expenditure does not create the price; it reveals the competition for a good already recognized as useful. The protocol, through its difficulty adjustment, in any case neutralizes any direct link between power, cost, and value: it guarantees only the cadence of blocks, not their price. Thus, **bitcoin has no “energy” value (but an energy-based measurement)**, only **a use and exchange value determined by confidence in its properties: algorithmic scarcity, neutrality, censorship resistance, and monetary predictability.** The cost of production is only a side effect of the market price, never its cause, nor a guaranteed floor for its value.

---

### Why 10 minutes (roughly) between blocks, 2,016 blocks (difficulty adjustment), 210,000 blocks (halving)?

There are technical constraints, there are simulations of latency on the internet, there are economic simulations of opportunity cost relative to delay; there are a thousand reasons, some original and others “discovered,” but when you deviate from these values, nothing works anymore, short of compromises that Bitcoin refuses.
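One of those latency arguments can be put in a toy model: if block propagation across the network takes roughly τ seconds and blocks arrive as a Poisson process with mean interval T, the probability that a competing block is found while the previous one is still propagating is about 1 − e^(−τ/T). The propagation figure below is an assumption chosen for illustration, not a measurement.

```python
import math

def stale_rate(propagation_s: float, interval_s: float) -> float:
    """Toy model: probability that another block is found during propagation,
    with block discovery modeled as a Poisson process of mean `interval_s`."""
    return 1.0 - math.exp(-propagation_s / interval_s)

PROPAGATION = 5.0  # assumed seconds for a block to reach most of the network

# Shrinking the block interval sharply raises the collision (stale/reorg) rate:
for interval in (600.0, 60.0, 10.0):  # 10 minutes, 1 minute, 10 seconds
    print(f"{interval:>5.0f}s interval -> {stale_rate(PROPAGATION, interval):.1%} stale rate")
# ~0.8% at 10 minutes, ~8% at 1 minute, ~39% at 10 seconds
```

Under these assumptions, ten minutes keeps simultaneous finds rare even with worldwide propagation delays, which is one of the “discovered” reasons the spacing resists change.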
The nodes would reject any block that is invalid or does not conform to the majority chain. Transactions would remain protected by private-key cryptography, which makes any falsification of signatures impossible. The risk of double spending would arise only if an entity managed to gain lasting control of the majority of the computing power—a highly unlikely situation at the scale of the global network—and even in that case, each new block triggers a **full re-verification** of the validity of the previous ones, which reinforces the resilience of the protocol.

However, during a period of readjustment of the block rhythm, when global computing power varies sharply, temporary imbalances can appear:

- **Blocks too fast**: the difficulty has not yet had time to adjust. The risk of double spending increases slightly, as several miners may find blocks almost simultaneously, before the network has propagated the previous one. More **reorganizations** (reorgs) can then occur, where the majority chain is redefined as blocks propagate and nodes decide (see the fork-choice sketch below).
- **Blocks too slow**: the network may fragment into divergent sub-chains for a few moments, as slow propagation lengthens confirmation times. Reorgs then become rarer but also longer, with **prolonged conflicts** between competing versions of the chain before the majority re-forms.

These episodes do not alter the fundamental security of Bitcoin, but they can temporarily affect the fluidity of consensus and perceived latency. The protocol corrects them automatically at each difficulty readjustment, gradually bringing the network back to an equilibrium rhythm.
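The reorg behavior described above reduces to a simple rule that each node applies on its own: adopt the valid chain carrying the most accumulated work, whatever its arrival order. Here is a minimal, hedged sketch of that fork-choice rule, with simplified objects standing in for real block headers:

```python
from dataclasses import dataclass

@dataclass
class Block:
    height: int
    work: float  # work implied by the block's difficulty target

def chain_work(chain: list[Block]) -> float:
    """Total accumulated proof of work of a chain."""
    return sum(b.work for b in chain)

def choose_tip(local: list[Block], competing: list[Block]) -> list[Block]:
    """Fork choice: adopt whichever valid chain carries the most total work.
    A 'reorganization' is simply this function returning the competing chain."""
    return competing if chain_work(competing) > chain_work(local) else local

# Two branches diverge after height 2; the heavier branch wins even if it
# arrived later.
a = [Block(1, 1.0), Block(2, 1.0), Block(3, 1.0)]
b = [Block(1, 1.0), Block(2, 1.0), Block(3, 1.1), Block(4, 1.1)]
print(len(choose_tip(a, b)))  # -> 4: the node reorgs onto the heavier chain
```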
It is interesting to note that **many other blockchains** have chosen to sidestep these physical constraints by introducing notions of **explicit state or finality**: a transaction is considered irreversible after validation by a fixed number of blocks or by an internal voting mechanism. This approach reduces the need for recomputation and improves the apparent speed of consensus, but it weakens the transparency of collective control:

– if an attack or falsification passes the finality barrier, it can remain **invisible and irreversible**, since nodes no longer fully revalidate old blocks;
– conversely, if a deep divergence is detected, the network can **freeze durably**, unable to decide between several contradictory states.

Bitcoin, by keeping a model of **continuous validation without arbitrary finality**, assumes the computational cost of rigor: each block rechecks the previous ones, each node participates in measuring the common time, and the consistency of the ledger never depends on a human decision or a majority vote, but on a **shared measure of the effort accomplished over time**.

> In this sense, maintaining the average pace of ten minutes is not a technical constraint but a pillar of stability: it guarantees that the measurement of time, and therefore of the common truth of the ledger, remains independent of the speed of the physical world as of human wills.

The average ten-minute interval between blocks can be seen as a **window of behavioral stability**: a compromise between the technical speed of the network and the human pace of opportunistic decisions. This span of time leaves actors the possibility of evaluating their incentives to cheat or remain honest, while preventing these choices from turning into exploitable actions before the consensus has consolidated the previous blocks. In other words, Bitcoin does not seek to beat real time, but to **synchronize a system of human intentions and mechanical calculations** at one measured cadence. Beyond a certain threshold of speed, the judgment and economic rationality of actors fluctuate faster than the protocol can absorb them: motivations change before actions are validated. The ten-minute delay then acts as a **security latency**, a buffer between the human logic of opportunity and the algorithmic logic of verification—a measure of stability suited to the speed of our digital age.

### Periods of readjustment: the measurement of time and the pace of circulation

Bitcoin relies on two internal clocks, each governing a distinct aspect of its equilibrium:

– **time regulation**, ensured by the difficulty adjustment;
– and the **pace of circulation**, defined by the decay of the reward, known as the halving.

The first cycle, that of **difficulty**, occurs every **2,016 blocks** (approximately two weeks). The nodes measure the actual time elapsed to produce these blocks and compare it to the theoretical duration of fourteen days. If production was faster, the difficulty increases; if it was slower, it decreases. This variation, bounded by a factor of four, maintains the regularity of the network's beat. This mechanism does not adjust computing power, but the **common measure of time**: it transforms a set of independent hashes into a collective cadence, perceptible and verifiable by all nodes.

The second cycle, the **halving**, occurs every **210,000 blocks**, roughly every four years. It does not create scarcity—that results from the topology of the UTXOs and the effective division of existing units—but it **steers the issuance rate** of new bitcoins. The halving therefore acts as an economic metronome: it modulates the flow of units into the system without altering the internal structure of the currency.

By combining these two loops, Bitcoin links **temporal stability** to the **progression of circulation**:

– the **difficulty readjustment** guarantees a constant rhythm, regardless of the level of power available;
– the **halving** organizes the gradual transition from an issuance phase to a maturity phase in which circulation becomes quasi-stationary.

This double mechanism expresses the fundamental logic of the protocol: time is not imposed, it is **measured collectively**; value does not come from expenditure, but from the **traceability and consistency** of the units recorded in the ledger. Thus, difficulty sets the tempo, the halving modulates the economic breath, and true scarcity—the scarcity that makes each bitcoin a unique fragment of the ledger—lies in the **finite and verifiable distribution of the UTXOs**, not in the cadence of mining.

One final clarification: true scarcity manifests itself in the granularity of the **UTXOs**, that is, in the actual structure of the ledger, the number of possible spends on the network, whereas the halving organizes not scarcity but the **rate at which units enter circulation**.
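Since the halving schedule is pure arithmetic, it can be checked in a few lines. This sketch sums the block subsidy over successive 210,000-block eras, starting from the historical 50 BTC, and converges on the familiar ~21 million cap (ignoring the satoshi rounding the real client applies):

```python
HALVING_INTERVAL = 210_000  # blocks per reward era
INITIAL_SUBSIDY = 50.0      # BTC per block in the first era

def total_issuance(eras: int = 64) -> float:
    """Sum the block subsidy over successive halving eras (float sketch,
    ignoring integer-satoshi rounding)."""
    total, subsidy = 0.0, INITIAL_SUBSIDY
    for _ in range(eras):
        total += HALVING_INTERVAL * subsidy
        subsidy /= 2.0  # the halving: the issuance rate is cut in half
    return total

print(total_issuance())  # ~21,000,000 BTC: the pace slows, the cap is fixed
```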
#Bitcoin #ProofOfWork #Decentralization #Consensus #Mining #DifficultyAdjustment #BitcoinEconomics #NakamotoConsensus

# Why is quantum computing unsuited to mining?

https://media.licdn.com/dms/image/v2/D4E12AQHtzcgEO-okOA/article-cover_image-shrink_600_2000/B4EZpzPID0GcAQ-/0/1762869938409?e=1765411200&v=beta&t=Q6aEZvC4D_Wbc5Pfxg0BX1oV-YMn6G0HR5ruERWkDEs

The idea that the quantum computer could one day “revolutionize” Bitcoin mining comes up regularly in media discourse. This anticipation rests on a confusion between two distinct domains: post-quantum cryptanalysis (concerning the security of digital signatures) and proof of work (concerning the search for valid SHA-256 hashes). Recent scientific research shows, however, that quantum computing offers **no competitive advantage for mining**, neither in theory nor in practice. The following analysis lays out the precise reasons: algorithmic limits, hardware constraints, energy costs, protocol-level neutralization, and the absence of any real economic impact.

**Key figures to know beforehand:**

- **256 bits**: size of the SHA-256 hash used for Bitcoin mining.
- **1 chance in 2ÂČ⁔⁶**: raw probability that a random hash satisfies the network target.
- **10 minutes**: average time targeted by the Bitcoin protocol for the discovery of a block.
- **2,016 blocks**: interval for the automatic recalculation of the network difficulty.
- **≈ 1.45 × 10Âčâč**: average number of theoretical attempts with Grover for a difficulty equivalent to 128 bits.
- **100 to 400 TH/s**: computing power of modern ASICs (hundreds of trillions of hashes per second).
- **12 to 35 joules per terahash**: average energy efficiency of a current ASIC miner.
- **< 1 nanojoule per hash**: individual energy efficiency of a SHA-256 ASIC.
- **10⁻Âč⁎ second**: average execution time of one SHA-256 hash on an ASIC.
- **10⁻³ to 1 second**: estimated duration of one quantum SHA-256 oracle call per iteration (even in an optimistic scenario).
- **10ÂčÂč to 10Âč⁔ times slower**: performance gap between a quantum oracle and a classical ASIC.
- **10Âł to 10⁶ physical qubits**: needed to stabilize a single error-corrected logical qubit.
- **> 10âč T gates**: estimated depth of a complete fault-tolerant quantum SHA-256 circuit.
- **10 to 15 millikelvin**: typical operating temperature of superconducting quantum systems.
- **Several kilowatts**: consumption of a single cryogenic dilution refrigerator.
- **A few hundred physical qubits**: maximum capacity of the best quantum processors (Google, IBM, 2025).
- **Several million corrected qubits**: required to break a 256-bit ECDSA key with Shor's algorithm.
- **2ÂČ⁔⁶ ≈ 1.16 × 10⁷⁷**: total search space of the SHA-256 hash, not exploitable by Grover beyond the symbolic level.
- **O(2ⁿ)** → **O(2ⁿ⁄ÂČ)**: maximum theoretical gain from Grover, i.e., only a quadratic speed-up.
- **10⁶ to 10⁞ times more expensive**: estimated energy cost of a quantum computation equivalent to one classical hash.

### Definition of a quantum SHA-256 oracle

This is the translation, into the formalism of quantum computing, of the SHA-256 hash function used in Bitcoin mining. It is a central component of Grover's algorithm when it is applied to a hash function.
In a classical computation, SHA-256 is a deterministic function: it takes an input (a block of data) and produces a 256-bit hash. In a quantum computation, this function must be represented by a **reversible unitary operation**, that is, a logic circuit that transforms a quantum input state |x⟩ and an output register |y⟩ according to the rule:

|x, y⟩ → |x, y ⊕ SHA-256(x)⟩

where ⊕ denotes bitwise addition (XOR). This operator is called a **quantum oracle**, because it “steers” Grover's search by marking the inputs whose hash satisfies a given condition (for example, being below the network target).

At each Grover iteration, the quantum SHA-256 oracle:

1. Computes the SHA-256 hash of all possible inputs **in superposition**.
2. Compares the result to a condition (for example, “the first 20 bits are zero”).
3. Flips the phase of the states that satisfy this condition.

This operation then makes it possible, via constructive interference, to amplify the probability of measuring a valid input at the end of the computation.

Building a realistic quantum SHA-256 oracle involves:

- Converting the **irreversible operations** of classical SHA-256 (modular addition, shifts, XOR, AND, OR) into **reversible quantum gates**.
- Ensuring **quantum coherence** across millions of successive gates.
- Maintaining **fault tolerance** (error correction) across thousands of logical qubits.

In practice, each quantum SHA-256 oracle would correspond to an extremely deep circuit, comprising billions of elementary operations and requiring millions of physical qubits.

**In summary**, a quantum SHA-256 oracle is the reversible, unitary version of the hash function used in Bitcoin, serving to mark valid solutions within a Grover algorithm. It is the theoretical element that links classical cryptography to quantum computing, but also the main practical barrier making quantum mining unfeasible.

### Nature of the computational problem

Mining relies on the **SHA-256 hash function**, applied twice for each block: the miner must find a nonce value such that the hash of the block is below a target set by the protocol (the “target”). This process amounts to an exhaustive search, where each attempt is statistically independent. The probability of success of one attempt is:

p = T / 2^256

where T is the network target. The average number of attempts needed to find a valid block is therefore:

N_classical = 1 / p

In this model, each attempt is one hash computation, and current ASIC miners perform several hundred **trillion hashes per second**, thanks to a massively parallel architecture optimized for an energy efficiency of a few tens of joules per terahash.

### The illusion of quantum acceleration

**Grover's algorithm (1996)** speeds up the search for a particular element in an unstructured space. Its complexity drops from O(2^n) to O(2^(n/2)). Applied to mining, this would reduce the average number of attempts to:

N_Grover ≈ (π/4) × 1 / √p

that is, a theoretical gain of a quadratic factor. Take a simple example. If the probability of success is p = 2⁻ÂčÂČ⁞, then:

– N_classical = 2ÂčÂČ⁞
– N_Grover ≈ (π/4) × 2⁶⁎ ≈ 1.45 × 10Âčâč
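Those counts can be turned into wall-clock time with the figures quoted above (an optimistic 10⁻³ s per sequential oracle call). This hedged back-of-the-envelope sketch assumes on the order of 10ÂČ⁎ expected hashes per block at a realistic difficulty (an assumption, not a figure from this article) and compares a single sequential Grover machine against the ten-minute network cadence:

```python
import math

# Hedged back-of-the-envelope, not a precise model:
EXPECTED_HASHES = 1e24  # assumed N = 1/p, expected attempts per block
ORACLE_TIME = 1e-3      # optimistic seconds per sequential Grover oracle call
NETWORK_TIME = 600.0    # the ASIC network finds a block in ~10 minutes

grover_iters = (math.pi / 4) * math.sqrt(EXPECTED_HASHES)  # ~7.9e11 iterations
grover_seconds = grover_iters * ORACLE_TIME                # sequential: no farm effect

print(f"Grover machine: {grover_seconds:.1e} s per block "
      f"({grover_seconds / NETWORK_TIME:.0e} times slower than the network)")
```

The quadratic reduction in attempts is real, but each sequential oracle call is so slow that, under these assumptions, the machine still loses by about six orders of magnitude.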
Prenons un exemple simple : Si la probabilitĂ© de succĂšs est p = 2⁻ÂčÂČ⁞, alors : – N_classique = 2ÂčÂČ⁞ – N_Grover ≈ (π/4) × 2⁶⁎ ≈ 1,23 × 10Âčâč MĂȘme dans le meilleur scĂ©nario, ce gain reste marginal au regard des contraintes physiques de mise en Ɠuvre. Le minage quantique ne multiplie donc pas la vitesse par 10⁶ ou 10âč ; il ne fait que rĂ©duire la complexitĂ© exponentielle d’un facteur quadratique. Cette amĂ©lioration est **arithmĂ©tiquement insuffisante** pour concurrencer des fermes ASIC dotĂ©es de millions de circuits parallĂšles. ### ImplĂ©mentation rĂ©elle du SHA-256 quantique Le principal obstacle rĂ©side dans la profondeur et la stabilitĂ© des circuits nĂ©cessaires pour exĂ©cuter le SHA-256 sous forme quantique. Une Ă©tude de rĂ©fĂ©rence (Amy et al., _SAC 2016_) estime que l’implĂ©mentation de SHA-256 avec correction d’erreurs quantiques nĂ©cessiterait **plusieurs milliards de portes logiques T** et **des millions de qubits physiques**. À titre de comparaison, les meilleurs processeurs quantiques expĂ©rimentaux (Google, IBM, Rigetti) manipulent aujourd’hui **quelques centaines de qubits physiques**, avec des taux d’erreur par porte compris entre 10⁻³ et 10⁻ÂČ et des temps de cohĂ©rence de l’ordre de la microseconde. MĂȘme en supposant la disponibilitĂ© d’un ordinateur quantique tolĂ©rant aux fautes (FTQC), la profondeur de circuit de l’algorithme de Grover sur SHA-256 dĂ©passerait largement la fenĂȘtre de cohĂ©rence des qubits actuels. Le coĂ»t de correction d’erreurs, qui exige de 10Âł Ă  10⁶ qubits physiques par qubit logique, rend toute application industrielle impraticable. ### Limites Ă©nergĂ©tiques et matĂ©rielles Contrairement Ă  une idĂ©e reçue, un ordinateur quantique **ne consomme pas « zĂ©ro Ă©nergie »**. Les dispositifs supraconducteurs ou Ă  ions piĂ©gĂ©s nĂ©cessitent un refroidissement Ă  **des tempĂ©ratures proches du zĂ©ro absolu (10 Ă  15 mK)**, grĂące Ă  des rĂ©frigĂ©rateurs Ă  dilution coĂ»teux et Ă©nergivores. La consommation d’un seul systĂšme cryogĂ©nique dĂ©passe dĂ©jĂ  plusieurs kilowatts pour quelques centaines de qubits, sans compter les instruments de contrĂŽle micro-ondes et les alimentations haute frĂ©quence. Or, le minage est un **processus massivement parallĂšle** : il faut exĂ©cuter des milliards de calculs indĂ©pendants par seconde. Le calcul quantique, au contraire, est **sĂ©quentiel**, chaque itĂ©ration de Grover dĂ©pendant de la prĂ©cĂ©dente. Ainsi, mĂȘme si un ordinateur quantique pouvait effectuer un hachage « plus intelligent », son dĂ©bit global serait des ordres de grandeur infĂ©rieurs Ă  celui des ASIC spĂ©cialisĂ©s, dont le rendement Ă©nergĂ©tique par opĂ©ration est infĂ©rieur Ă  1 nanojoule. Les travaux de 2023 (« _Conditions for advantageous quantum Bitcoin mining_ », _Blockchain: Research and Applications_) confirment que le coĂ»t Ă©nergĂ©tique et la latence du contrĂŽle quantique neutralisent tout avantage thĂ©orique. Autrement dit, **le calcul quantique est inadaptĂ© Ă  la structure du PoW**, fondĂ©e sur la rĂ©pĂ©tition ultra-rapide d’une fonction simple, non sur un calcul profond et cohĂ©rent. ### L’ajustement de la difficultĂ© : neutralisation protocolaire MĂȘme en admettant qu’un acteur dĂ©couvre une mĂ©thode quantique plus rapide, le **mĂ©canisme d’ajustement de la difficultĂ©** du protocole Bitcoin rendrait cet avantage transitoire. La difficultĂ© est recalculĂ©e toutes les 2016 blocs pour maintenir un intervalle moyen de 10 minutes. 
Si un mineur « quantique » doublait le taux de hachage global du rĂ©seau, la difficultĂ© serait doublĂ©e Ă  la pĂ©riode suivante, ramenant le rendement Ă  la normale. Ainsi, le calcul quantique ne pourrait jamais « casser » le minage : il serait simplement intĂ©grĂ© dans l’équilibre Ă©conomique du rĂ©seau, puis neutralisĂ©. Le seul risque rĂ©siduel serait **la centralisation** : la possession d’un matĂ©riel quantique exceptionnellement performant par un acteur unique pourrait temporairement dĂ©sĂ©quilibrer le marchĂ© du hashpower. Mais ce risque est de nature Ă©conomique, non cryptographique, et reste improbable compte tenu des coĂ»ts d’investissement nĂ©cessaires (infrastructures cryogĂ©niques, maintenance, ingĂ©nierie avancĂ©e). ### DiffĂ©rencier les risques : signatures contre hachage Il faut distinguer deux menaces distinctes : - **Le hachage (SHA-256)** : utilisĂ© pour le minage, il rĂ©siste aux attaques quantiques, car Grover ne confĂšre qu’un gain quadratique. - **Les signatures (ECDSA)** : utilisĂ©es pour prouver la propriĂ©tĂ© d’une adresse, elles seraient vulnĂ©rables Ă  l’algorithme de **Shor (1994)**, capable de calculer des logarithmes discrets. C’est donc la couche de signature, non celle du minage, qui justifie les travaux de transition post-quantique. Les estimations rĂ©centes Ă©valuent Ă  plusieurs **millions de qubits corrigĂ©s** les ressources nĂ©cessaires pour casser une clĂ© ECDSA 256 bits. En 2025, aucun systĂšme n’approche cette Ă©chelle : les processeurs logiques corrigĂ©s se comptent en unitĂ©s, non en milliers. ### Les vĂ©ritables progrĂšs de 2024-2025 : des avancĂ©es sans impact minier Les annonces rĂ©centes de progrĂšs — par exemple, la stabilisation de **qubits logiques corrigĂ©s d’erreurs** sont des Ă©tapes importantes, mais elles concernent la fiabilitĂ© expĂ©rimentale, pas la puissance calculatoire. Un calcul quantique utile pour le minage impliquerait des milliards d’opĂ©rations cohĂ©rentes et rĂ©pĂ©tĂ©es, ce que les qubits actuels ne peuvent soutenir. MĂȘme une percĂ©e majeure dans la correction d’erreurs ou la modularitĂ© n’inverserait pas le constat : l’architecture quantique reste incompatible avec la nature massivement parallĂšle, faible profondeur et haute frĂ©quence du minage. ### Les explications suivantes sont un peu plus complexes, voici quelques bases prĂ©alables Les notions de bits, de _pool mining_ et de bornes de difficultĂ© peuvent paraĂźtre abstraites. Voici une vulgarisation claire de ces trois Ă©lĂ©ments essentiels pour comprendre le fonctionnement rĂ©el du minage. **MSB et LSB** Dans un nombre binaire de 256 bits (comme le rĂ©sultat d’un SHA-256), les **MSB** (_Most Significant Bits_) sont les bits de gauche : ils reprĂ©sentent les valeurs les plus lourdes dans le nombre. Les **LSB** (_Least Significant Bits_) sont ceux de droite, qui changent le plus souvent mais influencent peu la valeur globale. Quand on parle de trouver un hash « avec des zĂ©ros en tĂȘte », cela signifie que les MSB doivent ĂȘtre nuls : le hachage commence par une longue sĂ©rie de zĂ©ros. Les mineurs varient un petit champ de donnĂ©es appelĂ© _nonce_ pour que le hachage final respecte cette contrainte. La difficultĂ© du rĂ©seau est prĂ©cisĂ©ment le nombre de MSB que le hash doit prĂ©senter Ă  zĂ©ro. **Fonctionnement des pools** Le minage est aujourd’hui organisĂ© en **pools**, des regroupements de mineurs qui travaillent ensemble et se partagent la rĂ©compense. 
Chaque mineur reçoit des tĂąches simplifiĂ©es : il ne cherche pas Ă  valider le bloc complet, mais Ă  produire des _shares_, c’est-Ă -dire des hachages dont la difficultĂ© est infĂ©rieure Ă  une cible beaucoup plus facile que celle du rĂ©seau. Ces _shares_ servent de preuve de participation : plus un mineur en fournit, plus sa part de la rĂ©compense du bloc final sera grande. Le serveur de pool ajuste en permanence la difficultĂ© individuelle (_vardiff_) pour Ă©quilibrer les vitesses : un mineur trop rapide reçoit des tĂąches plus difficiles, ce qui empĂȘche tout avantage injustifiĂ©. **Bornes infĂ©rieure et supĂ©rieure du minage** Le protocole Bitcoin fixe deux seuils de difficultĂ© qui encadrent tout le processus de minage. La **borne supĂ©rieure** correspond Ă  la cible du rĂ©seau : pour qu’un bloc soit validĂ©, le hash de son en-tĂȘte doit ĂȘtre infĂ©rieur Ă  cette valeur. Plus la cible est basse, plus il faut de zĂ©ros en tĂȘte du hash, donc plus le bloc est difficile Ă  trouver. À l’inverse, la **borne infĂ©rieure** correspond Ă  la difficultĂ© de travail assignĂ©e par les _pools_ Ă  chaque mineur, bien plus facile Ă  atteindre. Elle sert uniquement Ă  mesurer la participation individuelle. Le serveur de pool ajuste ces bornes en permanence. Si un mineur trouve trop de _shares_ trop vite, la pool augmente la difficultĂ© de ses tĂąches. S’il en trouve trop lentement, elle la rĂ©duit. Ce mĂ©canisme — appelĂ© _vardiff_ — Ă©limine de fait les comportements extrĂȘmes : les mineurs trop rapides ne gagnent pas plus, ceux trop lents sont naturellement exclus, car leurs _shares_ deviennent trop rares pour ĂȘtre rentables. GrĂące Ă  ce systĂšme d’équilibrage, la puissance de calcul de chaque mineur reste proportionnelle Ă  sa contribution rĂ©elle, sans possibilitĂ© d’avantage durable. Les bornes supĂ©rieure et infĂ©rieure assurent ainsi une stabilitĂ© globale du rĂ©seau et une Ă©quitĂ© locale dans la rĂ©partition du travail. ### Comprendre l’illusion du « Grover partiel » Une idĂ©e revient souvent : appliquer l’algorithme de Grover non pas sur les 256 bits entiers du hachage SHA-256, mais uniquement sur une partie des bits les plus significatifs (les « MSB »), puis complĂ©ter le reste classiquement. Cette approche, dite de _Grover partiel_, semble logique : si la recherche porte sur un espace rĂ©duit (par exemple 40 bits au lieu de 256), le nombre d’itĂ©rations nĂ©cessaires diminue d’autant, selon la rĂšgle √(2^r). En thĂ©orie, cela pourrait permettre d’obtenir plus rapidement des _shares_ de faible difficultĂ© dans une pool de minage. En pratique, cette approche ne change rien Ă  la rĂ©alitĂ© du calcul. Chaque itĂ©ration de Grover nĂ©cessite d’exĂ©cuter **l’intĂ©gralitĂ© du SHA-256** pour Ă©valuer la condition sur les bits de poids fort. Il est impossible de “tronquer” le hachage ou de tester partiellement une fonction de hachage cryptographique sans la calculer entiĂšrement. Autrement dit, on rĂ©pĂšte moins d’itĂ©rations, mais chacune coĂ»te tout autant — et des millions de fois plus cher qu’un hash classique sur ASIC. De plus, Grover ne permet pas de produire plusieurs solutions corrĂ©lĂ©es. L’état quantique s’effondre dĂšs la premiĂšre mesure : pour trouver une autre solution, il faut tout recommencer. Contrairement au calcul classique, on ne peut pas rĂ©utiliser le rĂ©sultat pour gĂ©nĂ©rer des variantes voisines ou de multiples _shares_ proches. 
Enfin, mĂȘme si un mineur quantique obtenait une lĂ©gĂšre accĂ©lĂ©ration locale sur les _shares_, cette diffĂ©rence serait aussitĂŽt neutralisĂ©e par les mĂ©canismes de rĂ©gulation automatique des pools, qui ajustent dynamiquement la difficultĂ© de chaque mineur. Le protocole est conçu pour maintenir un Ă©quilibre entre tous les participants, quelle que soit leur vitesse. En rĂ©sumĂ©, le « Grover partiel » n’apporte aucun avantage pratique : le gain quadratique reste purement thĂ©orique, annihilĂ© par la lenteur, la dĂ©cohĂ©rence et les contraintes physiques du calcul quantique. MĂȘme appliquĂ© Ă  une portion rĂ©duite du hachage, le coĂ»t Ă©nergĂ©tique, temporel et structurel d’un tel processus dĂ©passe de plusieurs ordres de grandeur celui des mineurs classiques. ### Autres objections possibles **« L’algorithme de Grover’s algorithm peut traiter plusieurs solutions (multiple-solutions search) »** Source : PennyLane Codebook sur “Grover’s Algorithm | Multiple Solutions” explique la gĂ©nĂ©ralisation de l’algorithme pour trouver M solutions dans un espace de taille N. **RĂ©ponse** : en thĂ©orie, trouver M solutions rĂ©duit la complexitĂ© Ă  O(√(N/M)). Cependant : - Dans le contexte du minage, “solutions” correspondraient Ă  hachages valides pour la cible de difficultĂ©. Mais l’oracle quantique doit toujours tester la fonction de hachage complĂšte pour chaque entrĂ©e, donc le coĂ»t reste maximal par itĂ©ration. - Le fait d’avoir plusieurs solutions M ne change pas la **latence** ou la **profondeur du circuit** : on reste limitĂ© par la correction d’erreurs et la cohĂ©rence. - Pour de grandes valeurs de N (≈ 2ÂČ⁔⁶) et de faibles M (target trĂšs rare), √(N/M) reste astronomique. Donc, mĂȘme en adoptant la “multiple-solutions” variante de Grover, les contraintes matĂ©rielles et temporelles rendent l’application au minage toujours impraticable. **« Si un mineur quantique apparaissait il pourrait provoquer plus de forks / rĂ©organisations** Source : l’article acadĂ©mique “On the insecurity of quantum Bitcoin mining” (Sattath, 2018) Ă©voque que la corrĂ©lation des temps de mesure pourrait accroĂźtre la probabilitĂ© de forking. **RĂ©ponse** : cet argument est intĂ©ressant mais largement spĂ©culatif et repose sur l’hypothĂšse que un mineur quantique ultra-rapide fonctionnerait. Toutefois : - Le scĂ©nario exigeait un mineur quantique capable d’atteindre un rythme comparable ou supĂ©rieur aux meilleurs ASIC, ce qui n’est pas rĂ©aliste aujourd’hui. - MĂȘme si un tel mineur existait, la majoration de forks ne dĂ©coule pas forcĂ©ment d’un avantage minier gĂ©nĂ©ralisĂ© mais d’une stratĂ©gie opportuniste. Cela ne remet pas en cause l’adaptation du rĂ©seau, l’ajustement de la difficultĂ© ou les mesures de sĂ©curitĂ©. - Le fait que des forks puissent se produire ne signifie pas que le minage quantique soit viable ou avantageux : le coĂ»t demeure prohibitif. En rĂ©sumĂ©, cette objection peut ĂȘtre formalisĂ©e, mais elle ne constitue pas une preuve d’avantage quantique efficace dans le contexte rĂ©el. ### ConsĂ©quences Ă©conomiques et Ă©nergĂ©tiques Les fermes ASIC modernes fonctionnent Ă  pleine efficacitĂ© Ă©nergĂ©tique, autour de **12 Ă  35 J/TH**. Un ordinateur quantique cryogĂ©nique, mĂȘme parfaitement optimisĂ©, aurait un **rendement plusieurs ordres de grandeur infĂ©rieur**, en raison des coĂ»ts de refroidissement, de contrĂŽle et de correction d’erreurs. 
Le calcul quantique est donc **anti-Ă©conomique** pour le minage : - il requiert une architecture centralisĂ©e ; - il ne permet pas la duplication Ă  grande Ă©chelle ; - il ne rĂ©duit pas la consommation Ă©nergĂ©tique totale ; - il n’amĂ©liore pas la sĂ©curitĂ© du rĂ©seau. ### Conclusion Le calcul quantique, dans son Ă©tat actuel et prĂ©visible, est **fondamentalement inadaptĂ© au minage** de Bitcoin : 1. **Sur le plan algorithmique**, l’accĂ©lĂ©ration quadratique de Grover reste insuffisante face Ă  la complexitĂ© exponentielle du hachage. 2. **Sur le plan matĂ©riel**, la correction d’erreurs et la dĂ©cohĂ©rence limitent toute tentative de parallĂ©lisation Ă  grande Ă©chelle. 3. **Sur le plan Ă©nergĂ©tique**, le refroidissement cryogĂ©nique et la complexitĂ© du contrĂŽle rendent toute opĂ©ration industrielle inefficiente. 4. **Sur le plan protocolaire**, le mĂ©canisme d’ajustement de difficultĂ© neutralise tout avantage transitoire. 5. **Sur le plan Ă©conomique**, la centralisation nĂ©cessaire au maintien d’une infrastructure quantique dĂ©truirait la rĂ©silience du rĂ©seau et serait donc exclue des rĂ©compenses par les noeuds (qui dĂ©cident). La menace quantique pour Bitcoin concerne exclusivement les **signatures cryptographiques (ECDSA)** et non la **preuve de travail (SHA-256)**. En l’état des connaissances et des projections technologiques, **aucune perspective crĂ©dible** ne permet d’imaginer un avantage du calcul quantique pour le minage, ni mĂȘme une rentabilitĂ© Ă©nergĂ©tique. Le mythe du « quantum miner » relĂšve donc davantage de la spĂ©culation mĂ©diatique que de la science appliquĂ©e. Bitcoin, conçu pour s’adapter et ajuster sa difficultĂ©, demeure aujourd’hui et pour longtemps **rĂ©silient face Ă  la rĂ©volution quantique**. [Source]() #Bitcoin #QuantumComputing #ProofOfWork #SHA256 #Grover #Mining #PostQuantum #Decentralization
# Why is quantum computing unsuitable for mining? https://media.licdn.com/dms/image/v2/D4E12AQHtzcgEO-okOA/article-cover_image-shrink_600_2000/B4EZpzPID0GcAQ-/0/1762869938409?e=1765411200&v=beta&t=Q6aEZvC4D_Wbc5Pfxg0BX1oV-YMn6G0HR5ruERWkDEs

The idea that quantum computers could one day "revolutionize" Bitcoin mining is a recurring theme in the media. This anticipation rests on a confusion between two distinct fields: post-quantum cryptanalysis (concerning the security of digital signatures) and proof of work (concerning the search for valid SHA-256 hashes). Recent scientific research shows, however, that quantum computing offers **no competitive advantage for mining**, either in theory or in practice. The following analysis explains the specific reasons: algorithmic limitations, hardware constraints, energy costs, protocol neutralization, and lack of real economic impact.

**Key figures to know beforehand:**

- **256 bits**: size of the SHA-256 hash used for Bitcoin mining.
- **1 in 2ÂČ⁔⁶**: raw probability that a random hash satisfies the network target.
- **10 minutes**: average block-discovery time targeted by the Bitcoin protocol.
- **2016 blocks**: interval for the automatic recalculation of network difficulty.
- **≈ 1.45 × 10Âčâč**: average number of theoretical Grover iterations for a difficulty equivalent to 128 bits, i.e., (π/4) × 2⁶⁎.
- **100 to 400 TH/s**: computing power of modern ASICs (hundreds of trillions of hashes per second).
- **12 to 35 joules per terahash**: average energy efficiency of a current ASIC miner.
- **< 1 nanojoule per hash**: individual energy efficiency of a SHA-256 ASIC.
- **10⁻Âč⁎ seconds**: average execution time of a SHA-256 hash on an ASIC.
- **10⁻³ to 1 second**: estimated duration of one quantum SHA-256 oracle iteration (even in an optimistic scenario).
- **10ÂčÂč to 10Âč⁔ times slower**: performance gap between a quantum oracle and a conventional ASIC.
- **10Âł to 10⁶ physical qubits**: required to stabilize a single error-corrected logical qubit.
- **> 10âč T gates**: estimated depth of a complete fault-tolerant quantum SHA-256 circuit.
- **10 to 15 millikelvins**: typical operating temperature of superconducting quantum systems.
- **Several kilowatts**: power consumption of a single cryogenic dilution refrigerator.
- **A few hundred physical qubits**: maximum capacity of the best quantum processors (Google, IBM, 2025).
- **Several million corrected qubits**: required to break a 256-bit ECDSA key with Shor's algorithm.
- **2ÂČ⁔⁶ ≈ 1.16 × 10⁷⁷**: total search space of the SHA-256 hash, which Grover can only shrink quadratically, never collapse.
- **O(2ⁿ) → O(2ⁿ⁄ÂČ)**: Grover's maximum theoretical gain, i.e., only a quadratic speedup.
- **10⁶ to 10⁞ times more expensive**: estimated energy cost of a quantum computation equivalent to one classical hash.

### Definition of a quantum SHA-256 oracle

This is the translation, into the formalism of quantum computing, of the SHA-256 hash function used in Bitcoin mining. It is a central component of Grover's algorithm when applied to a hash function.

In a classical computation, SHA-256 is a deterministic function: it takes an input (a block of data) and produces a 256-bit hash. In a quantum computation, this function must be represented by a **reversible unitary operation**, i.e., a logic circuit that transforms an input quantum state |x⟩ and an output register |y⟩ according to the rule:

|x, y⟩ → |x, y ⊕ SHA-256(x)⟩

where ⊕ is bitwise addition (XOR).
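The XOR rule above is what makes the oracle reversible: applying the same map twice restores the input, which is exactly the property a unitary operator must have. Here is a minimal classical sketch of that property (it ignores superposition and the phase-flip step entirely, and the function name is illustrative):

```python
import hashlib

# Classical analogue of the oracle map |x, y> -> |x, y XOR SHA-256(x)>.
# Applying it twice returns the original pair: the map is an involution,
# the classical shadow of the unitary's reversibility.
def oracle_step(x: bytes, y: int) -> tuple[bytes, int]:
    digest = int.from_bytes(hashlib.sha256(x).digest(), "big")
    return x, y ^ digest  # XOR the 256-bit hash into the output register

x, y = b"block header candidate", 0
x1, y1 = oracle_step(x, y)    # y1 now holds SHA-256(x)
x2, y2 = oracle_step(x1, y1)  # the second application undoes the first
assert (x2, y2) == (x, y)
```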
This operator is called a **quantum oracle** because it "guides" Grover's search by marking the inputs whose hash satisfies a given condition (for example, being less than the network target). During each iteration of Grover's algorithm, the quantum SHA-256 oracle:

1. Computes the SHA-256 hash of all possible inputs **in superposition**.
2. Compares the result to a condition (e.g., "the first 20 bits are equal to zero").
3. Flips the phase of the states that satisfy this condition.

This operation then amplifies, through constructive interference, the probability of measuring a valid input at the end of the computation.

Building a realistic quantum SHA-256 oracle involves:

- Converting the **irreversible operations** of classical SHA-256 (modular addition, shifts, XOR, AND, OR) into **reversible quantum gates**.
- Maintaining **quantum coherence** across millions of successive gates.
- Sustaining **fault tolerance** (error correction) over thousands of logical qubits.

In practice, each quantum SHA-256 oracle would correspond to an extremely deep circuit, comprising billions of elementary operations and requiring millions of physical qubits.

**In summary**, a quantum SHA-256 oracle is the reversible, unitary version of the hash function used in Bitcoin, serving to mark valid solutions in a Grover search. It is the theoretical element that links classical cryptography to quantum computing, but also the main practical barrier making quantum mining unfeasible.

### Nature of the computational problem

Mining is based on the **SHA-256 hash function**, applied twice for each block: the miner must find a nonce value such that the hash of the block header is less than a target set by the protocol. This process is an exhaustive search in which each attempt is statistically independent. The probability that one attempt succeeds is:

p = T / 2^256

where T is the network target. The average number of attempts required to find a valid block is therefore:

N_classic = 1 / p

In this model, each attempt is one hash computation, and current ASIC miners perform several hundred **trillion hashes per second**, thanks to a massively parallel architecture optimized for an energy efficiency of a few dozen joules per terahash.

### The illusion of quantum acceleration

Grover's algorithm (1996) accelerates the search for a particular element in an unstructured space. Its complexity drops from O(2^n) to O(2^(n/2)). Applied to mining, this would reduce the average number of attempts to:

N_Grover ≈ (π/4) × 1 / √p

a theoretical gain of a quadratic factor.

Let's take a simple example. If the probability of success is p = 2⁻ÂčÂČ⁞, then:

– N_classic = 2ÂčÂČ⁞
– N_Grover ≈ (π/4) × 2⁶⁎ ≈ 1.45 × 10Âčâč

Even in the best-case scenario, this gain remains marginal in view of the physical constraints of implementation. Quantum mining therefore does not multiply speed by 10⁶ or 10âč; it only reduces exponential complexity by a quadratic factor. This improvement is **arithmetically insufficient** to compete with ASIC farms running millions of parallel circuits.

### Actual implementation of quantum SHA-256

The main obstacle lies in the depth and stability of the circuits needed to execute SHA-256 in quantum form. A reference study (Amy et al., SAC 2016) estimates that implementing SHA-256 with quantum error correction would require **several billion T gates** and **millions of physical qubits**.
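A quick back-of-the-envelope check of these figures, using the document's own optimistic 10⁻³ s per oracle iteration (the script is illustrative; the constants come from the key-figures list above):

```python
from math import pi, sqrt

p = 2.0 ** -128                # success probability of one attempt
n_classic = 1 / p              # ~3.40e38 expected classical attempts
n_grover = (pi / 4) / sqrt(p)  # ~1.45e19 Grover iterations

# One machine running Grover at the optimistic 1 ms per iteration:
years = n_grover * 1e-3 / (3600 * 24 * 365)
print(f"{n_classic:.2e} classical attempts")
print(f"{n_grover:.2e} Grover iterations ~ {years:.1e} years on one machine")
```

Classical miners escape the same arithmetic only through massive parallelism, spreading the search across millions of independent ASICs; Grover iterations, being sequential, cannot be distributed across machines without giving up the quadratic gain.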
By comparison, the best experimental quantum processors (Google, IBM, Rigetti) currently handle **a few hundred physical qubits**, with gate error rates between 10⁻³ and 10⁻ÂČ and coherence times on the order of microseconds. Even assuming the availability of a fault-tolerant quantum computer (FTQC), the circuit depth of Grover's algorithm on SHA-256 would far exceed the coherence window of current qubits. The cost of error correction, which requires 10Âł to 10⁶ physical qubits per logical qubit, makes any industrial application impractical.

### Energy and hardware limitations

Contrary to popular belief, a quantum computer **does not consume "zero energy"**. Superconducting or trapped-ion devices require cooling to **temperatures close to absolute zero (10 to 15 mK)**, using expensive and energy-intensive dilution refrigerators. A single cryogenic system already draws several kilowatts for a few hundred qubits, not counting microwave control instruments and high-frequency power supplies.

Mining, however, is a **massively parallel process**: billions of independent calculations must be performed per second. Quantum computing, by contrast, is **sequential**: each Grover iteration depends on the previous one. Thus, even if a quantum computer could perform a "smarter" hash, its overall throughput would be orders of magnitude lower than that of specialized ASICs, whose energy cost per operation is below 1 nanojoule.

A 2023 study ("Conditions for advantageous quantum Bitcoin mining," _Blockchain: Research and Applications_) confirms that the energy cost and latency of quantum control negate any theoretical advantage. In other words, **quantum computing is unsuited to the structure of PoW**, which is based on the ultra-fast repetition of a simple function, not on deep, coherent computation.

### Difficulty adjustment: protocol neutralization

Even if an actor discovered a faster quantum method, the Bitcoin protocol's **difficulty adjustment mechanism** would make this advantage temporary. The difficulty is recalculated every 2016 blocks to maintain an average interval of 10 minutes. If a "quantum" miner doubled the network's overall hash rate, the difficulty would roughly double at the next adjustment, bringing the yield back to normal (see the toy sketch below). Thus, quantum computing could never "break" mining: it would simply be absorbed into the economic equilibrium of the network, then neutralized.

The only residual risk is **centralization**: exceptionally powerful quantum hardware in the hands of a single player could temporarily unbalance the hashpower market. But this risk is economic, not cryptographic, and remains unlikely given the investment required (cryogenic infrastructure, maintenance, advanced engineering).

### Differentiating risks: signatures vs. hashing

Two distinct threats must be kept apart:

- **Hashing (SHA-256)**: used for mining, it resists quantum attack because Grover confers only a quadratic gain.
- **Signatures (ECDSA)**: used to prove ownership of an address, they would be vulnerable to **Shor's algorithm (1994)**, which can compute discrete logarithms.

It is therefore the signature layer, not the mining layer, that justifies post-quantum transition work. Recent estimates put the resources needed to break a 256-bit ECDSA key at several **million corrected qubits**. As of 2025, no system comes close to this scale: error-corrected logical processors number in the single digits, not the thousands.
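Returning to the retargeting point above, here is a toy sketch of the 2016-block rule. The factor-of-4 clamp matches the protocol; consensus subtleties such as the off-by-one in the measured timespan are deliberately omitted, and the function name is mine:

```python
TARGET_SPACING = 600    # seconds per block (10 minutes)
RETARGET_WINDOW = 2016  # blocks between difficulty adjustments

def retarget(old_difficulty: float, actual_timespan_s: float) -> float:
    """Scale difficulty so the next window should take two weeks."""
    expected = TARGET_SPACING * RETARGET_WINDOW  # 1,209,600 s
    # The protocol clamps any single adjustment to a factor of 4.
    clamped = min(max(actual_timespan_s, expected / 4), expected * 4)
    return old_difficulty * expected / clamped

# A "quantum" miner doubling global hashrate halves the block interval;
# the next adjustment doubles the difficulty and restores 10 minutes.
print(retarget(1.0, RETARGET_WINDOW * TARGET_SPACING / 2))  # -> 2.0
```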
### The real progress of 2024-2025: advances with no impact on mining

Recent announcements of progress, such as the stabilization of **error-corrected logical qubits**, are important steps, but they concern experimental reliability, not computing power. A quantum computation useful for mining would involve billions of coherent, repeated operations, which current qubits cannot sustain. Even a major breakthrough in error correction or modularity would not reverse the fact that quantum architecture remains incompatible with the massively parallel, shallow-depth, high-frequency nature of mining.

### The following explanations are a little more complex, so here are some prerequisites

The concepts of bits, pool mining, and difficulty bounds may seem abstract. Here is a plain-language explanation of these three elements, essential for understanding how mining actually works.

**MSB and LSB**

In a 256-bit binary number (such as the result of a SHA-256), the **MSB** (_Most Significant Bits_) are the bits on the left: they carry the greatest weight in the number. The **LSB** (_Least Significant Bits_) are those on the right, which change most often but have little influence on the overall value. When we talk about finding a hash "with leading zeros," it means that the MSBs must be zero: the hash begins with a long series of zeros. Miners vary a small data field called a _nonce_ so that the final hash meets this constraint. The network difficulty determines, in effect, how many leading MSBs of the hash must be zero (strictly speaking, the hash must fall below a numeric target).

**How pools work**

Mining is now organized into **pools**, groups of miners who work together and share the reward. Each miner is given simplified tasks: they do not try to validate the entire block, but to produce _shares_, i.e., hashes that meet a target much easier than the network's. These _shares_ serve as proof of participation: the more a miner provides, the greater their share of the final block reward. The pool server constantly adjusts each miner's individual difficulty (_vardiff_) to balance speeds: a miner who submits too fast is given harder tasks, which prevents any unfair advantage.

**Lower and upper mining bounds**

The Bitcoin protocol and the pools set two difficulty thresholds that frame the entire mining process. The **upper bound** corresponds to the network target: for a block to be valid, its header hash must be less than this value. The lower the target, the more leading zeros the hash needs, and the harder the block is to find. Conversely, the **lower bound** corresponds to the share difficulty assigned by the pools to each miner, which is much easier to reach. It is used solely to measure individual participation.

The pool server adjusts these bounds continuously. If a miner finds too many _shares_ too quickly, the pool raises the difficulty of their tasks; if too slowly, it lowers it. This mechanism, called _vardiff_, effectively rules out extreme behavior: miners who are too fast do not earn more, while those who are too slow are naturally excluded, as their _shares_ become too rare to be profitable. Thanks to this balancing system, each miner's computing power remains proportional to their actual contribution, with no possibility of a lasting advantage. The upper and lower bounds thus ensure overall network stability and local fairness in the distribution of work (a compact sketch of the share check and _vardiff_ idea follows).
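In the sketch below, difficulty is modeled as a required count of leading zero bits for readability; a real pool compares the full 256-bit hash to a numeric target, and all names here are illustrative:

```python
import hashlib

def leading_zero_bits(header: bytes) -> int:
    """Count leading zero bits of the double SHA-256 of a header."""
    h = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return 256 - int.from_bytes(h, "big").bit_length()

def mine_share(template: bytes, share_bits: int) -> int:
    """Grind the nonce until the hash clears the (easy) share target."""
    nonce = 0
    while leading_zero_bits(template + nonce.to_bytes(4, "little")) < share_bits:
        nonce += 1
    return nonce

# A share is an easy proof of participation, not a valid block:
# the network target would demand far more leading zero bits.
nonce = mine_share(b"block-template", share_bits=16)
print(f"share found at nonce {nonce}")
# vardiff: the pool raises share_bits for fast miners and lowers it
# for slow ones, so share rate tracks real contributed hashpower.
```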
### Understanding the "partial Grover" illusion

One idea comes up often: applying Grover's algorithm not to the entire 256 bits of the SHA-256 hash, but only to a portion of the most significant bits (the "MSBs"), then completing the rest classically. This approach, known as "partial Grover," seems logical: if the search covers a smaller space (for example, 40 bits instead of 256), the number of iterations required shrinks accordingly, scaling as √(2^r) for r constrained bits. In theory, this could make it possible to obtain low-difficulty _shares_ more quickly in a mining pool.

In practice, this approach changes nothing about the reality of the computation. Each Grover iteration requires executing **the entire SHA-256** to evaluate the condition on the most significant bits. It is impossible to "truncate" the hash or to partially test a cryptographic hash function without computing it in full. In other words, fewer iterations are run, but each one costs just as much, and millions of times more than a conventional hash on an ASIC.

Furthermore, Grover does not allow multiple correlated solutions to be produced. The quantum state collapses at the first measurement: to find another solution, the whole computation must be restarted. Unlike classical computation, the result cannot be reused to generate nearby variants or multiple close _shares_.

Finally, even if a quantum miner achieved a slight local acceleration on _shares_, the difference would be immediately neutralized by the pools' automatic regulation mechanisms, which dynamically adjust each miner's difficulty. The protocol is designed to maintain balance between all participants, regardless of their speed.

In summary, "partial Grover" offers no practical advantage: the quadratic gain remains purely theoretical, negated by the slowness, decoherence, and physical constraints of quantum computing. Even applied to a small portion of the hash, the energy, time, and structural costs of such a process exceed those of conventional miners by several orders of magnitude.

### Other possible objections

**"Grover's algorithm can handle multiple solutions (multiple-solutions search)."**

Source: the PennyLane Codebook entry "Grover's Algorithm | Multiple Solutions," which explains the generalization of the algorithm to finding M solutions in a space of size N.

**Response**: In theory, finding M solutions reduces the complexity to O(√(N/M)). However:

- In the context of mining, "solutions" would be hashes valid for the difficulty target. But the quantum oracle must still evaluate the entire hash function for each input, so the full per-iteration cost remains.
- Having M solutions changes neither the **latency** nor the **circuit depth**: the limits of error correction and coherence still apply.
- For large N (≈ 2ÂČ⁔⁶) and small M (a very rare target), √(N/M) remains astronomical.

Therefore, even with the "multiple-solutions" variant of Grover, hardware and time constraints still make its application to mining impractical.

**"If a quantum miner appeared, it could cause more forks/reorganizations."**

Source: the academic article "On the insecurity of quantum Bitcoin mining" (Sattath, 2018) suggests that the correlation of measurement times could increase the probability of forking.

**Response**: This argument is interesting but largely speculative, and it rests on the assumption that an ultra-fast quantum miner exists and works.
However:

- The scenario requires a quantum miner capable of reaching a speed comparable to or greater than the best ASICs, which is not realistic today.
- Even if such a miner existed, the increase in forks would stem not from a generalized mining advantage but from an opportunistic strategy. It would not call into question the network's adaptation, the difficulty adjustment, or its security measures.
- The fact that forks can occur does not mean that quantum mining is viable or advantageous: the cost remains prohibitive.

In summary, this objection can be formalized, but it does not constitute proof of an effective quantum advantage in the real world.

### Economic and energy consequences

Modern ASIC farms operate at peak energy efficiency, around **12 to 35 J/TH**. A cryogenic quantum computer, even perfectly optimized, would have an **efficiency several orders of magnitude lower**, due to the costs of cooling, control, and error correction. Quantum computing is therefore **uneconomical** for mining:

- it requires a centralized architecture;
- it does not allow large-scale duplication;
- it does not reduce total energy consumption;
- it does not improve network security.

### Conclusion

Quantum computing, in its current and foreseeable state, is **fundamentally unsuitable for Bitcoin mining**:

1. **Algorithmically**, Grover's quadratic speedup remains insufficient against the exponential complexity of hashing.
2. **In terms of hardware**, error correction and decoherence rule out any large-scale parallelization.
3. **In terms of energy**, cryogenic cooling and the complexity of control make any industrial operation inefficient.
4. **In terms of protocol**, the difficulty adjustment mechanism neutralizes any transient advantage.
5. **Economically**, the centralization required to maintain a quantum infrastructure would destroy the network's resilience and would therefore be excluded from rewards by the nodes (which decide validity).

The quantum threat to Bitcoin concerns exclusively **cryptographic signatures (ECDSA)**, not **proof of work (SHA-256)**. Based on current knowledge and technological projections, **there is no credible prospect** of quantum computing offering an advantage for mining, let alone energy-competitive mining. The myth of the "quantum miner" is therefore more media speculation than applied science. Bitcoin, designed to adapt and adjust its difficulty, remains today and for the foreseeable future **resilient in the face of the quantum revolution**.

[Source]()

#Bitcoin #QuantumComputing #ProofOfWork #SHA256 #Grover #Mining #PostQuantum #Decentralization
# What does a Bitcoin address reveal before and after a transaction?

Reusing a Bitcoin address is often presented as a privacy issue. However, it also poses a **real cryptographic risk** related to the security of the private key itself. This issue concerns older P2PKH addresses as well as newer SegWit (bc1q...) and Taproot (bc1p...) formats: when an address is reused after having already been used to spend a UTXO, all funds associated with that same key depend on cryptographic material that has been exposed multiple times on the blockchain. This article explains the structural reasons for this risk, the cryptographic mechanisms involved, and the practical way to observe the public key revealed during a transaction.

### Exposure of the public key: a critical moment

Before any transaction, a hash-based Bitcoin address **does not reveal the public key**, but only a hash:

```
HASH160(pubkey) = RIPEMD160(SHA-256(pubkey))
```

This hash offers no way to recover the public key. As long as a UTXO remains unspent, the associated key remains mathematically inaccessible.

As soon as a UTXO is spent:

- the **signature** is published,
- the **complete public key** is revealed,
- the validity of the signature is verified against this key.

From this point on, the address no longer offers the same cryptographic protection: the public key is exposed to offensive analysis, and every reuse of the same key multiplies the material an attacker can exploit.

### Where is the public key located at the time of spending?

The exact location depends on the type of address:

### P2PKH (addresses beginning with 1)

In **P2PKH** transactions, the public key appears:

- **in the scriptSig**,
- immediately after the signature,
- in hexadecimal form, usually as a compressed key (33 bytes, prefix 02 or 03) or an uncompressed key (65 bytes, prefix 04).

### P2WPKH (SegWit v0, bc1q addresses)

In **P2WPKH** transactions, the public key appears in the **witness**:

- witness[0] → signature (DER format),
- witness[1] → **compressed public key** (33 bytes, starting with 02 or 03).

### Taproot (P2TR, bc1p addresses)

**Taproot** transactions use Schnorr signatures and **x-only** public keys: 32 bytes (64 hex characters), without the 02/03 prefix. For a key-path spend, the witness contains only the signature; the tweaked x-only key sits directly in the output's scriptPubKey. Note that, unlike hash-based address types, a Taproot address encodes this key directly, so it is visible as soon as the output is funded.

### On mempool.space

[mempool.space]() does **not display "Public Key" as a labeled field**. You have to read the raw hexadecimal fields and recognize the format:

- **33 bytes** → compressed pubkey: starts with 02 or 03.
- **65 bytes** → uncompressed pubkey: starts with 04.
- **32 bytes** → Taproot x-only pubkey.

The public key is therefore visible, but as a hexadecimal field in the inputs.

### Why does reuse weaken security?

### Revealing the public key once is not critical

Security relies on the hardness of the elliptic-curve discrete logarithm problem (ECDLP). As long as an attacker only has a single signature produced by the key:

- they cannot reconstruct anything,
- they have no statistical material,
- ECDLP remains intact.

### Revealing the same key multiple times multiplies the attack surface

Each spend of a UTXO associated with the same address publishes:

- an identical public key,
- a new, distinct signature.

In ECDSA (P2PKH, P2WPKH), each signature requires a random number: the **nonce k**. k must be:

- unique,
- unpredictable,
- perfectly generated.
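To see why uniqueness is non-negotiable, here is a minimal sketch of the textbook key-recovery attack when the same k signs two different messages. It exercises only the signing algebra: in a real signature, r would be the x-coordinate of k·G on secp256k1, but the recovery math does not depend on that, so r is a placeholder here and all values are toy choices.

```python
import hashlib

n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # secp256k1 order

def inv(a: int) -> int:
    return pow(a, -1, n)  # modular inverse mod n

def s_component(z: int, d: int, k: int, r: int) -> int:
    return (inv(k) * (z + r * d)) % n  # ECDSA: s = k^-1 (z + r*d) mod n

d = 0x1C0FFEE  # toy private key
k = 0xBADC0DE  # the nonce, mistakenly reused twice
r = 0x5EED     # placeholder for (k*G).x mod n

z1 = int.from_bytes(hashlib.sha256(b"tx one").digest(), "big") % n
z2 = int.from_bytes(hashlib.sha256(b"tx two").digest(), "big") % n
s1, s2 = s_component(z1, d, k, r), s_component(z2, d, k, r)

# The attacker sees two signatures sharing the same r, hence the same k:
k_rec = ((z1 - z2) * inv(s1 - s2)) % n
d_rec = ((s1 * k_rec - z1) * inv(r)) % n
assert (k_rec, d_rec) == (k, d)  # private key fully recovered
```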
> A flaw in the generation of k, a well-documented class of incidents, allows the private key to be recovered whenever two signatures use the same k or correlated k values, exactly as in the sketch above.

Real-world examples:

- the 2013 Android SecureRandom bug,
- faulty hardware RNGs,
- old OpenSSL libraries,
- entropy weakness when booting a device,
- smartcards producing biased nonces.

Reusing addresses **multiplies the signatures produced** by the same key, and so increases the probability of a cryptographic incident.

### Taproot improves the situation but does not eliminate it

Taproot uses Schnorr signatures:

- deterministic nonce derivation removes the "same k" risk,
- the linear signature structure is more robust.

However:

- the x-only key remains unique and exposed,
- multiple signatures still provide material for statistical analysis,
- hardware risks remain,
- a future quantum attacker could compromise any exposed public key.

### Risk concentration

An HD wallet (BIP32) allows each UTXO to be isolated behind a different derived key. Reusing addresses negates this advantage: a bug in a single signature compromises all UTXOs dependent on that key. This is the worst possible configuration in terms of compartmentalization.

### What about cryptographic advances (quantum or otherwise)?

If an attacker gained the ability to solve ECDLP:

- any public key **already exposed** would become vulnerable,
- all reused addresses would be particularly fragile,
- a hash-based address that has never spent would remain protected by HASH160.

Address reuse thus concentrates a future risk that the ecosystem explicitly seeks to avoid.

### Concrete example: key revealed in a real transaction

For the transaction:

```
7ee6745718bec9db76390f3a4390b9e7daeeb401e8c666a7b261117a6af654a1
```

This is a P2WPKH input. In the witness:

- the signature is in witness[0],
- the compressed public key is in witness[1].

The revealed public key is:

```
02174ee672429ff94304321cdae1fc1e487edf658b34bd1d36da03761658a2bb09
```

> Before spending: only HASH160(pubkey) was visible.
> After spending: the actual public key is visible, permanently.

### Conclusion

Reusing Bitcoin addresses carries a tangible cryptographic risk. It is not just a matter of poor privacy hygiene, but a structural problem: **a public key should be exposed only once**, and signatures should never be multiplied on the same key if maximum robustness is desired.

Current cryptographic mechanisms are robust, but experience shows that:

- implementations are never perfect,
- nonces can be biased,
- devices can lack entropy,
- hardware attacks exist,
- cryptanalysis keeps advancing.

Minimizing the exposure of public keys remains a fundamental best practice, today and tomorrow, and it starts with a simple rule: **never reuse an address that has already spent a UTXO**.

[Source]()

#Bitcoin #Privacy #Cryptography #ECDSA #Schnorr #Taproot #SegWit #UTXO #Decentralized #BitcoinPrivacy #CryptoEducation #BIP32 #HDWallet #QuantumThreat
Interview from #PlanBLugano. Check out what the guys from are doing. Cool demo with an NFC tag to log in to a website or to open a door đŸšȘ They should go live with their project in Q1 2026, so don't hesitate to contact them to beta test. PS: there was a lot of background noise in the room, so I had to use AI to remove it. #PlanB
At #PlanBLugano I interviewed Daniel from . His company supports the circular economy in developing countries: he buys products from them and pays them in sats. A keychain and a mini surfboard, upcycled from used boards the kids can no longer surf on, with Bitcoin-themed street art by StreetCyber from Barcelona. #PlanB
Guess who I ran into at #PlanBLugano? Uncle Rockstar Dev. He gave me a cool NFC card with his image on it, and when you tap it on your phone, you get laser eyes! đŸ€© Read my feedback from the conference here:
On my way to the Lugano PlanB conference with @Saidah - Ask a Bitcoiner 21 Questions