It’s the same in many fields. Trainees learn by doing the easy, repetitive work that can now be automated.
It’s a little worrisome, actually. Professionally written software still needs a human to verify that things are correct, consistent, and safe, but the tasks we used to foist off on more junior developers are increasingly being done by AI.
Part of that is fine - offloading minor documentation updates and “trivial” tasks to AI is easy to do and review while remaining productive. But it comes at a cost: the next generation of junior developers is deprived of exactly the tasks that build the experience needed to work towards a more senior level.
If companies lean too hard into that, we’re going to have serious problems when this generation of developers starts retiring and the next generation is understaffed, underpopulated, and probably underpaid.
burningmatches@feddit.uk 4 months ago
frog@beehaw.org 4 months ago
Yep. I used to be an accountant, and that’s how trainees learn in that field too. The company I worked at had a fairly even split between clients with manual and computerised records, and trainees always spent the first year or so almost exclusively working on manual records, because that was how you learned to recognise when something had gone wrong in the computerised records, which would always look “right” at first glance.
supersquirrel@sopuli.xyz 4 months ago
I get sooooo much schadenfreude from programmers smugly acting like their jobs aren’t going to be obliterated by AI… because the AI won’t be able to do the job correctly, as if that matters in this late stage of collapse and end state capitalism.
frog@beehaw.org 4 months ago
AI is also going to run into a wall because it needs continual updates with more human-made data, but the supply of all that is going to dry up once the humans who create new content have been driven out of business.
It’s almost like AIs have been developed and promoted by people who have no ability to think about anything but their profits for the next 12 months.
greenskye@lemm.ee 4 months ago
I just tend to think of it as the further enshittification of life. I’m not even that old and it’s super obvious how poorly most companies are actually run these days, including my own. It’s not that we’re doing more with less, it’s a global reduction in standards and expectations. Issues that used to be solved in a day now bounce between a dozen different departments staffed by a handful of extremely overworked people, complete newbies, or clueless contractors. AI is just going to further cement the shitty new standard both inside and outside the company.
frog@beehaw.org 4 months ago
Yep. Life does just seem… permanently enshittified now. I honestly don’t see it ever getting better, either. AI will just ensure it carries on.
HobbitFoot@thelemmy.club 4 months ago
It looks like we are already at the point with some AI where we can correct the output instead of adding new input. Microsoft is using LinkedIn to help get professional input for free.
frog@beehaw.org 4 months ago
But this is the point: the AIs will always need input from some source or another. Consider using AI to generate search results. Those will need to be updated with new information and knowledge, because an AI that can only answer questions related to things known before 2023 will very quickly become obsolete. So it must be updated. But AIs do not know what is going on in the world. They have no sensory capacity of their own, and so their inputs require data that is ultimately, at some point in the process, created by a human who does have the sensory capacity to observe what is happening in the world and write it down. And if the AI simply takes that writing without compensating the human, then the human will stop writing, because they will have had to get a different job to pay for food, rent, etc.
No amount of “we can train AIs on AI-generated content” is going to fix the fundamental problem that the world is not static and AIs don’t have the capacity to observe what is changing. They will always be reliant on humans. Taking human input without paying for it disincentivises humans from producing content, and this will eventually create problems for the AI.
pbjamm@beehaw.org 4 months ago
and 20yrs from now polydactylism will be the new human beauty standard
HobbitFoot@thelemmy.club 4 months ago
But humans need input as well.