Tim

timp@iris.to
npub1yn8h...6pzj
Poastor, internet explorer, yolo coder #PardonSamourai
My dad worked on computers before I was born, back when machines did not pretend to be friendly. He wrote the code, built the machines, wired them together, and listened for sounds that meant trouble. Fans had a pitch and drives had a rhythm. When something went wrong, you often heard it before you saw it.

In the eighties he became a system administrator, before that phrase turned into a job title. Back then it meant you owned the whole system. There was no one else to hand it off to. If something failed, it failed right there, in the same room. It left you with little choice other than rolling a cigarette and fixing it.

Back then, systems did not reach very far. That was not a limitation. It was the reason they worked. A disk could fail without pulling anything else down with it. A program could crash and stay crashed. You always knew where the edge was because it was right there in the room with you.

Trust lived inside those edges. If you gave something access, you knew what it could touch. If you handed someone a key, you knew which door it opened. When a mistake happened, the damage stopped where the system stopped. Abuse existed, but it demanded effort. You had to show up. You had to spend time and hardware and attention.

That way of working did not vanish because it was flawed. It vanished because it could not survive success. More people started using computers. Machines began to speak across rooms, then across cities, then across continents. Software left the building and never came back. At some point nobody could point at the whole system anymore. Responsibility stretched thin and then dissolved. Nothing about that was reckless. It was scale doing what scale does.

The failures changed shape. Now when an application disappears, your data goes with it. When an account closes, years of context vanish in the same motion. We pretend this is normal. We call it convenience. We accept it because pushing back feels futile.
The old alternatives have faded far enough that we barely remember them. Today's systems assume they will keep working. They assume the platform will still exist tomorrow. They assume nothing important will fail at the wrong time. When those assumptions break, everything breaks together.

That is also why abuse spreads so easily. Identity is cheap because it has no weight. Authority slips in unnoticed because it is never clearly granted. Reach appears by default. You do not earn it. You inherit it by showing up. Moderation shows up after the damage is done. There is nowhere else for it to go. By that point the system has already chosen what spreads and what survives. At scale it turns into endless cleanup, not control.

We tried to fix it with cryptography. We locked the wires. We stamped messages. We built systems that promised safety if you followed the rules closely enough. What people got were setups that offered no forgiveness. One wrong move and the door slammed shut. Most people stepped away because the price of a small error was total loss. Years could vanish without warning. That is not security. That is fragility. Even Phil Zimmermann, who invented PGP, has said he does not use it himself, because he could not be bothered to spend the time learning how to set it up correctly. The failure was not the math. It was where we placed it.

There is another way to approach this. It starts by letting go of the idea that software should be permanent. It accepts that things will fail. It treats that as normal instead of catastrophic. This is where 2WAY comes in.

It starts from a simple idea: on 2WAY, your data does not belong to the application that happens to display it. It lives locally, shaped in a way other software can understand. An app can leave without dragging your history behind it. Another can step in and continue. This changes how access works. Nothing acts just because it exists. Permission has to be given. Not once, not vaguely, and not forever.
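The idea of data that outlives any one app can be sketched in a few lines. This is a hypothetical illustration in Python, not 2WAY's actual storage format: records live in one shared, app-neutral file, so any application that understands the shape can read the history and continue it. The file name and record fields are invented for the example.

```python
import json
from pathlib import Path

# Hypothetical local store: one app-neutral JSON file that no single
# application owns. Any app that understands the shape can continue it.
STORE = Path("local_store.json")

def save_record(record: dict) -> None:
    """Append a record to the shared local store."""
    records = json.loads(STORE.read_text()) if STORE.exists() else []
    records.append(record)
    STORE.write_text(json.dumps(records, indent=2))

def load_records() -> list:
    """Read the full history back, regardless of which app wrote it."""
    return json.loads(STORE.read_text()) if STORE.exists() else []

# One app writes a note...
save_record({"type": "note", "author": "alice", "text": "hello"})
# ...and if that app disappears, the data stays put for the next one.
history = load_records()
```

If the writing app vanishes tomorrow, `local_store.json` is still there, and a different app can call `load_records()` and pick up where it left off.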
When something is allowed to act, that allowance has a shape. When it ends, it ends cleanly. You do not see the cryptography most of the time. It does its work quietly. What you see are moments where a decision is required. A choice to allow something. A choice to stop it later. Recovery is expected and mistakes do not poison the whole system.

As a result, failure stops spreading. If a service disappears, your data stays put. If software misbehaves, it cannot wander into places it was never invited to. If trust breaks, it breaks along a line you can point at.

Abuse becomes harder because it has to push against structure. New identities arrive without weight. They do not reach far unless other identities lay the groundwork for them. That takes time. It takes effort. It cannot be faked cheaply.

Misinformation looks different too. The problem was never that people lie. The problem is that content floats free of its past. In this kind of system, where something came from is not a mystery. What happened to it along the way leaves marks. You are not forced to believe any of it. You can see enough to decide.

This matters more as machines begin to act on our behalf. An assistant that can do everything is dangerous. One that can only act within clear limits is dull in the right way. When it fails, it stops instead of spreading the damage.

Something else happens along the way. Patterns that rely on quiet overreach stop working. An application cannot slowly take more than it was given. It cannot blur the boundary and hope no one notices. That kind of behavior becomes visible because it has to ask.

Power shifts. Platforms lose their grip because they no longer own the ground beneath you. Software competes again on usefulness. Lock-in fades because there is nothing left to hold.

None of this fixes people. Conflict remains. Deception remains. But the systems stop amplifying the worst behavior just because it is cheap.
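A minimal sketch of an allowance with a shape, assuming nothing about 2WAY's real API: every name here is invented for illustration. Each grant says who may act, on what scope, and until when; nothing is allowed by default, and revoking a grant ends it cleanly.

```python
import time
from dataclasses import dataclass

# Hypothetical grant model: explicit, scoped, time-limited, revocable.
# This illustrates the idea in the text, not 2WAY's actual protocol.

@dataclass
class Grant:
    app: str           # who was allowed to act
    scope: str         # what they were allowed to touch
    expires_at: float  # when the allowance ends on its own
    revoked: bool = False

def is_allowed(grant: Grant, app: str, scope: str) -> bool:
    """Nothing acts just because it exists: every check needs a live grant."""
    return (
        not grant.revoked
        and grant.app == app
        and grant.scope == scope
        and time.time() < grant.expires_at
    )

g = Grant(app="calendar", scope="contacts:read", expires_at=time.time() + 3600)
assert is_allowed(g, "calendar", "contacts:read")       # explicitly granted
assert not is_allowed(g, "calendar", "contacts:write")  # never granted, so denied
g.revoked = True
assert not is_allowed(g, "calendar", "contacts:read")   # revocation ends it cleanly
```

The point of the shape is that overreach has nowhere to hide: an app asking for `contacts:write` when it only holds `contacts:read` fails visibly, at the moment it asks.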
We are moving into a world filled with generated voices, synthetic images, and automated decisions. Implicit trust will not survive. Assumptions turn into attack surfaces. Silence becomes risk. The only thing that scales safely is clarity. Clear authority. Explicit boundaries. Systems that know what they are allowed to do and nothing beyond that.

Software should be allowed to die. Your data should stay where it is. Your identity should not thin out or vanish because a service closed its doors. That is what 2WAY is trying to make boring again.

It is not a product yet. It is a shape. The protocol and backend are written down. The boundaries exist. The hard parts are named. What exists today is the structure, not the machine that runs it.

I am still finalizing that structure. I am looking for weak joints. I am checking where assumptions leak. I want the limits to hold before anything is turned on. When that work settles, I will start another build iteration.

The protocol and backend specifications are practically ready, MIT licensed, and live here:

If this way of thinking aligns with you, the work is open. There is room to work on it together.