LLM-assisted coding can be used to good effect, but I think it's pretty clear at this point that its impact has so far been a net negative on productivity in real terms.
Replies (23)
"AI can't do your job... But an AI salesman can convince your boss to fire you and replace you with an AI that can't do your job." -- @npub1fdrp...lvhs
My take: If you stay disciplined, you can do stuff which in 2020 you would have never attempted because you thought it would be too complicated. For example -- I am a huge Postgres user and tend to use expensive managed Postgres servers because... postgres admin is complicated. But today, in about 3 hours, I set up a pretty complicated remote follower for a local postgres instance, along with bandwidth limits, all the networking stuff, all running in docker on both ends. Something which would have taken me 3 days of puzzling to get right in 2020.
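Roughly, the follower side of that kind of setup looks like the sketch below (a minimal outline assuming plain streaming replication and the stock postgres Docker image; the hostname, replicator role, password, and 10M rate cap are placeholders, not my actual config):

```sh
# 1. On the primary: create a replication role and allow the follower in pg_hba.conf.
#      CREATE ROLE replicator WITH REPLICATION LOGIN PASSWORD 'changeme';
#    postgresql.conf:  wal_level = replica, max_wal_senders = 5
#    pg_hba.conf:      host replication replicator <follower-ip>/32 scram-sha-256

# 2. On the follower: seed an empty data directory with a rate-limited base backup.
#    -R writes standby.signal and primary_conninfo so the server starts as a standby;
#    --max-rate throttles the initial copy (ongoing WAL streaming is not capped by it).
PGPASSWORD=changeme pg_basebackup \
  -h primary.example.com -U replicator \
  -D /var/lib/postgresql/data \
  -R -X stream --max-rate=10M

# 3. Start Postgres on the follower (e.g. the official postgres image mounted on that
#    data directory); it comes up as a read-only streaming replica.
```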
Completely agree with that. I think the interesting point is that just because someone can doesn't mean they should: is that fancy database setup, DSL parser, agent framework, mesh network, whatever, actually providing value to its users? LLM coding is more often than not a complete distraction from stuff that matters.
That's a great point. LLMs are helping people overengineer stuff nobody wants, even harder.
Lol well put
But it can also give you time to think, because it won't take you a year to do something. So all in all, it's a tool. As smart as its user at this point.
Agree, it's a tool. I am finding it useful when used in a disciplined way with a human in the loop at literally every step. I think the current tooling gives the LLM far too much leeway in general.
net positive for me so far. Looking forward to it getting better
The future is not evenly distributed. Getting a real productivity net-positive out of LLMs can definitely come after a period of investment, as with all new tools.
I'm solidly in net-positive territory now, but I don't think most are, including most who think they are.
"AI" is just a database with code generation features.
We've had new databases and code generators lots of times. This generation is particularly slippery and deceptive to the user.
Everybody needs to sober up if they really want to be effective.
A spicy take and I love it.
My biggest concern as of now: what happens in 3-4 years when everyone's skills have atrophied tremendously and then the Big AI companies rug-pull?
It's already happening
I started using Copilot heavily in 2023, decided to turn it off one day after like 6 months, and the skill loss was immediately noticeable. It makes me really hesitant to keep letting it take over.
You don't need Big AI companies to use it.
Sure but their models are far and away better.
The VC subsidy is certainly active too; prices are going to go up a lot once someone locks in a monopoly.
I've been wondering about this since ChatGPT first came out. Tech bros have said inference costs have dropped by orders of magnitude, but I don't think we can judge that by only what they're charging.
Resources needed to chase ever more parameters aren't cheap. It would be interesting to see the two graphed against each other.
I had a boss tell me in 1995 (after the Netscape IPO) that the web and the internet at large was a "fad".
Same energy
How is that clear?
Call it a hunch. Or read the academic studies and look at the kinds of AI-coded stuff that gets attention.
Give us a podcast episode with your thoughts
I wrote another article instead
And since you can do more things, you will do more things.