I use it for software, but you really need to know what you are doing to understand what is wrong and ask it to redo it in a different way. I still think it saves time, but the ability to generate fully realized applications is a ways away.
henfredemars@infosec.pub 5 months ago
Automation-prone fields like writing, software, and app development saw a 21% decrease in job listings
Maybe, but hard disagree that software is being automated away.
Fixbeat@lemmy.ml 5 months ago
i_am_not_a_robot@discuss.tchncs.de 5 months ago
The headline says “digital freelancers,” so maybe it’s talking primarily about small jobs that were being outsourced. A 21% decrease in regular job listings would be more concerning, less for the job losses themselves than for the amount of incorrect information and buggy software about to be created.
RandomException@sopuli.xyz 5 months ago
Well, at least the buggy software will eventually generate more jobs, because companies will need more hands to fix everything the AI can’t.
lightnsfw@reddthat.com 5 months ago
They just pass those problems on to their customers these days.
i_stole_ur_taco@lemmy.ca 5 months ago
It’s a little worrisome, actually. Professionally written software still needs a human to verify things are correct, consistent, and safe, but the tasks we used to foist off on more junior developers are being increasingly done by AI.
Part of that is fine - offloading minor documentation updates and “trivial” tasks to AI is easy to do and review while remaining productive. But it comes at the expense of the next generation of junior developers being deprived of tasks that are valuable for them to gain experience to work towards a more senior level.
If companies lean too hard into that, we’re going to have serious problems when this generation of developers starts retiring and the next generation is understaffed, underpopulated, and probably underpaid.
frog@beehaw.org 5 months ago
AI is also going to run into a wall because it needs continual updates with more human-made data, but the supply of all that is going to dry up once the humans who create new content have been driven out of business.
It’s almost like AIs have been developed and promoted by people who have no ability to think about anything but their profits for the next 12 months.
greenskye@lemm.ee 5 months ago
I just tend to think of it as the further enshittification of life. I’m not even that old and it’s super obvious how poorly most companies are actually run these days, including my own. It’s not that we’re doing more with less, it’s a global reduction in standards and expectations. Issues that used to be solved in a day now bounce between a dozen different departments staffed with either a handful of extremely overworked people, complete newbies, or clueless contractors. AI is just going to further cement the shitty new standard both inside and outside the company.
frog@beehaw.org 5 months ago
Yep. Life does just seem… permanently enshittified now. I honestly don’t see it ever getting better, either. AI will just ensure it carries on.
HobbitFoot@thelemmy.club 5 months ago
It looks like we are already at the point with some AI where we can correct the output instead of adding new input. Microsoft is using LinkedIn to help get professional input for free.
frog@beehaw.org 5 months ago
But this is the point: the AIs will always need input from some source or another. Consider using AI to generate search results. Those will need to be updated with new information and knowledge, because an AI that can only answer questions about things known before 2023 will very quickly become obsolete. So it must be updated. But AIs do not know what is going on in the world. They have no sensory capacity of their own, so their inputs require data that is ultimately, at some point in the process, created by a human who does have the sensory capacity to observe what is happening in the world and write it down. And if the AI simply takes that writing without compensating the human, then the human will stop writing, because they will have had to get a different job to pay for food, rent, etc.
No amount of “we can train AIs on AI-generated content” is going to fix the fundamental problem that the world is not static and AIs don’t have the capacity to observe what is changing. They will always be reliant on humans. Taking human input without paying for it disincentivises humans from producing content, and this will eventually create problems for the AI.
burningmatches@feddit.uk 5 months ago
It’s the same in many fields. Trainees learn by doing the easy, repetitive work that can now be automated.
frog@beehaw.org 5 months ago
Yep. I used to be an accountant, and that’s how trainees learn in that field too. The company I worked at had a fairly even split between clients with manual and computerised records, and trainees always spent the first year or so almost exclusively working on manual records, because that was how you learned to recognise when something had gone wrong in the computerised records, which would always look “right” at first glance.
supersquirrel@sopuli.xyz 5 months ago
I get sooooo much schadenfreude from programmers smugly acting like their jobs aren’t going to be obliterated by AI… because the AI won’t be able to do the job correctly, as if that matters in this late stage of collapse and end state capitalism.