Thread

When it comes to AI, philosophical people often ask, "What will happen to people if they lack work? Will they find it hard to find meaning in such a world of abundance?" But there is a darker side to the question, which people intuit more than they say aloud. In all prior technological history, new technologies changed the nature of human work but did not displace the need for human work. The fearful rightly ask: what happens if we make robots, utterly servile, that can outperform the majority of humans at most tasks at lower cost? Suppose they displace 70% or 80% of human labor, to the point that 70% or 80% of humans cannot find any other type of economic work in which they outcompete those bots.

Now, the way I see it, it's a lot harder to replace humans than most expect. Datacenter AI is not the same as mobile AI; it takes a couple more decades of Moore's law to put a datacenter supercomputer into a low-energy local robot, and otherwise the robot has to rely on a sketchy, limited-bandwidth connection to a datacenter. It also takes extensive physical design and programming, which is harder than VC bros tend to suppose. And humans are self-repairing for the most part, which would be a rather fantastic trait for a robot. A human cell outcompetes all current human technology in terms of complexity. People massively overestimate what robots will be capable of within a given timeframe, in my view. We're nowhere near human-level robots for all tasks, even as we're close for some tasks. But the concept is close enough to be on our radar. We can envision it within a lifetime rather than in fantasy or far-off science fiction.

So, back to my prior point: the darker side of the question is to ask how humans will treat other humans if they don't need them for anything. All of our empathetic instincts were developed in a world where we needed each other; needed our tribe. And the difference between the 20% most capable and the 20% least capable in a tribe wasn't that huge. But imagine our technology makes the economic contributions of the bottom 20% irrelevant. And then the next 20%. And then the next 20%, slowly moving up the spectrum.

What people fear, often subconsciously rather than being able to articulate the full idea, is that humanity will reach a point where robots can replace many people in any economic sense; those people can do nothing that economically outcompetes a bot, and can earn an income only through charity. And specifically, they wonder what happens during that phase, within their lifetimes, to those who own capital versus those who rely on their labor. Scarce capital remains valuable for a period of time, so long as it can be held legally or otherwise, while labor becomes demonetized within that period. And as time progresses, weak holders of capital, those who spend more than their capital earns, also diminish, and many imperfect forms of capital decay without labor to maintain them. It might even be the case that those who own the robots are themselves economically superfluous, but at least they might own the codes that control them.

Thus, people ultimately fear extinction, or being collected into non-economic open-air prisons and given diminishing scraps, resulting in a slow extinction. And they fear it not from the robots themselves, but from the minority of humans who wield the robots.
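
To put rough numbers on the Moore's-law point above, here is a back-of-the-envelope sketch in Python. The ~10 MW datacenter power budget, the ~500 W robot power budget, and the two-year doubling cadence are all illustrative assumptions rather than figures from the note.

```python
# Back-of-the-envelope: how many years of Moore's-law-style scaling
# until a robot's onboard power budget buys datacenter-class compute?
# All figures below are illustrative assumptions, not measured values.

import math

datacenter_watts = 10_000_000  # assumed ~10 MW AI datacenter pod
robot_watts = 500              # assumed onboard budget for a mobile robot
doubling_period_years = 2.0    # classic Moore's-law doubling cadence

gap = datacenter_watts / robot_watts  # ~20,000x efficiency gap to close
doublings = math.log2(gap)            # doublings of compute-per-watt needed
years = doublings * doubling_period_years

print(f"Efficiency gap: {gap:,.0f}x")
print(f"Doublings needed: {doublings:.1f}")
print(f"Years at one doubling per {doubling_period_years:g} years: {years:.0f}")
# -> roughly 14 doublings, on the order of three decades, which is
#    consistent with the "couple more decades" intuition above.
```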

Replies (28)

And really that's the question!!!
Lyn Alden
View quoted note →
With robots taking over all the work, we'll all be bathing in endless happiness, watching video after video of cute kittens. Who needs purpose when you have an endless catalog of cats falling off shelves and playing with balls of yarn, right? The meaning of life will be replaced by 'likes' and 'shares' of cat videos. We'll be a society of eternal viewers, completely satisfied with our superficial and ephemeral pleasures. Utopia, here we come.
I'm not worried, as this argument could have been made when humans were replaced on farms, etc. Humans are not replaceable by robots at a fundamental level, in ways that are not commonly considered, even from a biomechanical/economic point of view, not counting the spiritual aspect of man. Nice thought experiment, but overthinking in my opinion. If one were very worried, he could of course just try to own robots, and everyone should be working toward being more than just a laborer for many reasons, with robots possibly replacing your work being just one of many issues for laborers. The whole UBI thing is just disguised socialism.
I think about this a LOT. What are governments going to do when even 25% of the people aren't "necessary"? Does the government just support them? Where does that money come from? Endless questions. From what I see, it's either Wall-E (the movie), where we are all slugs, or some dystopian hell-hole of a future. Neither is very appealing.
"It takes a couple more decades of Moore's law to put a datacenter supercomputer into a low-energy local robot. " I think so too, but in the end, it doesn't matter whether it takes 20 or 30 years, if in the end it happens that way, which is very likely. The cost to use a given level of AI falls about 10× every 12 months, and lower prices lead to much more use. The world will not change all at once; it never does. The price of luxury goods and a few inherently limited resources like land may rise even more dramatically, as Altman predicts.
Well, how would humans treat other humans if they believe they aren't needed? Look to Israel and the persecution of the Palestinian people. That's your answer. Those not needed are dehumanised: where the violent believe they should be, they wipe them out and take their resources; where there is a need for them, they corral them and keep them weak, culling the population if it grows above a certain resource threshold. In an age where empathy is lacking and the masses are becoming more and more desensitised, this behaviour will become more and more apparent. In this society we judge a person by what he does; think about it: when you meet a person, it's one of the first questions you'll likely consider asking. So in a world where robots are doing more of the banal tasks and more people are out of work, I'm sure some violent rich billionaire cartel who believe the planet is overpopulated will see those not being productive as surplus, and will alter policy in a way in which those deemed less fit will have trouble surviving in the West. I've typed this in 2 minutes, but you can get the main points.
Is this the part where the majority truly become the "useless eaters"? With resources and power being ultra-concentrated at the top, the grave concern is how "they" will handle this shift. Are the technocrats/oligarchs going to suddenly decide that they have had their fill and are now willing to share? Are they willing to drop profits in favor of humanity and the wellbeing of the whole? Or will the class divide grow even wider? Insert dystopian story of choice. If you look at our short history as a species, I don't get the warm and fuzzies.
How about a positive take? The tribe won't dissolve due to a mere lack of economic interest. Humans need deep, meaningful relationships; it's one of the basic psychological needs. We need each other for social and spiritual bonds, and without them we suffer mentally. But we don't want just not to suffer, we want to thrive. In order to thrive, we need to help others and grow spiritually. The more time we have at our disposal, the more we will engage in helping each other selflessly. It might take centuries, but slowly the selfish will be replaced by, or converted to, the selfless.
So, what do we do? Will we as a creature lose all will to live? People, for as capable as we are, are equally complex, and often a contradiction. In times of intense need, our attention is focused on basic survival. Yet as we gain time (freedom and resources), our attention tends to turn toward our quality of existence, which is highly subjective. Self-worth has a tendency to impact our actions and achievements. So, what happens when our drive and function are primarily focused on creating and building what each individual wants, instead of 'working for the man'? Can we look to tribal groups for insight into how happy someone is when all their needs are met and they have enough resources to choose how to spend their time?
The greatest of human fears is not to be noticed at all, or to not matter. Even slavery is tolerable if one lives with a purpose (to become free, or to save the life of a loved one, etc.). While it's possible for a small group of people to hold all the money, it's not possible for them to hold all the capital. Capital is just something that is useful (tractors and factories, yes, but also ideas and creativity), and usefulness is subjective and dependent on the receiver. When the Industrial Revolution destroyed many money-paying jobs, humans simply created new jobs or new ways of producing things that people wanted. So, just as Say's Law states, production of goods creates demand for those goods, and the economy grows. In my experience, more harm comes from people needing each other too much rather than not needing them enough. That's the core of the sovereign individual: to be independent, and from this place of strength be able to add to the world.
Just find it funny? Well, at least I did something to make you think it's funny rather than fooling people around with their savings. No purpose? Are you sure about that? Do you have full access to all my connections, at a minor or major scale? My fragile ego? Well, I have no damn ego, otherwise I would not still be alive right now. How's your ego, btw? Snowflake ass? Not quite sure what this sentence means; are you talking about the company named Snowflake or some metaphor? Restructure your content before dumping shit on my comment, you arrogant prick.
If history is any guide, the first phase will be similar to the Industrial Revolution: technology will complement certain types of workers, making them more productive, while replacing others entirely. But if AI advances to the point where even high-skill labor is no longer necessary, we reach a stage unprecedented in human history: a world where ownership of automation is the sole determinant of economic power.

Imagine a scenario where AI-driven corporations, controlled by a small group of capital holders, optimize every aspect of production, logistics, and service industries. Governments, pressured by economic efficiency, privatize social services, making access to resources contingent on corporate governance rather than state policies. In this world, the traditional idea of employment vanishes for most. Instead of wages, former workers survive on universal basic income or corporate stipends, tied not to productivity but to compliance with the systems owned by the elite.

History suggests that once a class of people is economically unnecessary, they become politically vulnerable. The landed aristocracies of the past had use for peasants as laborers, but what happens when even the illusion of economic necessity disappears? In previous centuries, displaced workers could riot, revolt, or demand redistribution, but in a world governed by automated systems and AI-controlled security, resistance itself could become obsolete.

The darkest outcome isn't violent suppression but a slow, passive neglect: the emergence of a "post-labor caste" that, lacking any economic leverage, is maintained at a subsistence level only as long as the ruling class finds it convenient. Perhaps they are given digital entertainment, AI companions, and just enough resources to avoid rebellion, but they remain permanently outside the sphere of influence, their fate determined entirely by those who own and control automation. Think of animals in a world dominated by humans. It's evolution and survival of the fittest again.
Think of what happens to animals in a world ruled by humans. Some are domesticated, bred to serve human needs. Others are left to the wild, dwindling in number as their habitats shrink. And some, deemed inconvenient or dangerous, are simply eradicated. That’s the fate I am thinking of from an evolutionary survival-of-the-fittest perspective. In a world where AI replaces human labor and economic interdependence dissolves, the question is no longer about fairness or purpose, but about survival itself.