My CTO firmly believes that within 4-6 years we will no longer need to know how to read or write code, just how to ask an AI to do it. Coincidentally, he also doesn’t code anymore and hasn’t for over 15 years.
Comment on: Are LLMs capable of writing *good* code?
muntedcrocodile@lemm.ee 3 months ago
I worry for the future generations of people who can use ChatGPT to write code but have absolutely no idea what that code is doing.
finestnothing@lemmy.world 3 months ago
bilb@lem.monster 3 months ago
I think he’s correct, and there’s a ton of cope going on on Lemmy right now. I also think tons of art/graphic design jobs will disappear, never to return.
recapitated@lemmy.world 3 months ago
From a business perspective, no shareholder cares how good an employee is at personally achieving a high degree of skill. They only care about selling and earning, and, to a lesser degree, about an enduring reputation for longer-term earnings.
Economics could very well drive this forward. But I don’t think the craft will be lost. People will need to supervise this progress, as well as collaborate with the machines to extend their capabilities and dictate their purposes.
I couldn’t tell you if we’re talking on a time scale of months or decades, but I do think “we” will get there.
whyrat@lemmy.world 3 months ago
Hackers and hobbyists will persist despite any economics. Much of what they do I don’t see AI replacing, as AI creates based on what it “knows”, which is mostly things it has previously ingested.
We are not (yet?) at the point where an LLM does anything other than put together code snippets it’s seen or derived. If you ask it to find a new attack vector, or to write code dissimilar to anything it’s seen before, the results are poor.
But the counterpoint every developer needs to keep in mind: AI will only get better. It’s not going to lose any of its current ability to generate code, and it will very likely continue to expand what it can accomplish. It’d be naive to assume it can never achieve these new capabilities… The question is just when, and how much it costs (in terms of processing and storage).
recapitated@lemmy.world 3 months ago
Agreed, and the point I always want to make is that any LLM or neural net or any other AI tech is going to be a mere component in a powerful product, rather than the entirety of the product.
The way I think of it is that my brain is of little value without my body, and my person is of little value without my team at work. I don’t exist in a vacuum, but I can be highly productive within my environment.
Angry_Autist@lemmy.world 3 months ago
Don’t be; there will come a time when nearly all code is AI-created, and not human-readable.
You need to worry for the future, when big data sites are running code they literally don’t understand and have no way to verify, because of how cheap and relatively effective it is.
Then, after that, LLMs will get better at coding than any human can achieve, but their output will still be black-box, human-unreadable code, and there will be no chain of discipline left to teach new programmers.
Hardly anyone is taking this seriously, because corporations stand to make a fucktonne of money, and normal people are in general clueless about complex subjects that require a nuanced understanding, yet strangely arrogant about their ignorant opinions based on movies and their drinking buddies’ malformed opinions.
SolOrion@sh.itjust.works 3 months ago
That’s some 40k shit.
“What does it mean?” “I do not know, but it appeases the machine spirit. Quickly, recite the canticles.”
RebekahWSD@lemmy.world 3 months ago
This is exactly how we’re getting to a 40k future, and I hate it. The bad future!
If we must I might join the Mechanicus though. I’m good at chanting and doing things by rote.