Yeah, that’s not what they mean. They mean training an LLM on recorded texts and speeches from a person, then instructing it to pretend to be that person. Like that AI murder victim “testimony” that was permitted to be shown in court as “evidence” a while ago.
CarbonIceDragon@pawb.social 1 day ago
I mean, the “allow non-verbal people to speak” thing has some merit though. Not LLMs per se, but the types of machine learning used by people trying to decode brainwaves, so that someone physically unable to talk can still communicate, usually get lumped into the general category of “AI”, from what I’ve seen.
zqps@sh.itjust.works 1 day ago
Peppycito@sh.itjust.works 1 day ago
I’ve had experiences with AI help chats where, as soon as I started typing, the AI would try to finish my sentence and jump the cursor around, making the chat absolutely unusable. I had to type in Notepad and copy the text into the chat. Staggeringly useless. So if this ‘mind reading’ AI is anything like that, I don’t predict good results.
Also, fuck you, QuickBooks.
CarbonIceDragon@pawb.social 1 day ago
I mean, any technology can be stupid if it is used stupidly, which I would think taking over someone’s keyboard while they’re typing qualifies as. But why would one company deploying a technology in a stupid manner mean that someone else’s research into a different but related technology is guaranteed to produce equally poor results?
Swedneck@discuss.tchncs.de 20 hours ago
I mean, I’m pretty sure we can already enable people to communicate if they’re at all conscious and mentally able to. Stephen Hawking was able to write despite, in his later years, only being able to reliably move a single cheek muscle. As long as a person can intentionally move one muscle, we can rig something up to interpret it as Morse code (a rough sketch of what that decoding looks like is below).
Is it great? No, these methods fucking suck, but they do work, and we don’t need AI to do it.
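A minimal sketch of that single-switch idea, for the curious: the code below is illustrative only (the timing thresholds and the `decode` helper are assumptions, not anything from a real assistive device), but it shows how press duration can separate dots from dashes, and pause duration can separate letters and words.

```python
# Illustrative single-switch Morse decoder: short presses are dots,
# long presses are dashes, and long pauses end letters or words.
# All threshold values below are assumptions, not from any real device.

MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

DOT_MAX = 0.3     # presses shorter than this are dots (seconds)
LETTER_GAP = 1.0  # a pause at least this long ends the current letter
WORD_GAP = 2.5    # a pause at least this long also ends the word

def decode(events):
    """events: list of (press_duration, following_gap) pairs in seconds."""
    text, letter = [], ""
    for duration, gap in events:
        letter += "." if duration < DOT_MAX else "-"
        if gap >= LETTER_GAP:
            text.append(MORSE.get(letter, "?"))
            letter = ""
        if gap >= WORD_GAP:
            text.append(" ")
    if letter:  # flush a trailing letter with no closing gap
        text.append(MORSE.get(letter, "?"))
    return "".join(text)

# "HI": four short presses, a letter gap, two short presses, a word gap
print(decode([(0.1, 0.1), (0.1, 0.1), (0.1, 0.1), (0.1, 1.2),
              (0.1, 0.1), (0.1, 3.0)]))  # -> "HI "
```

This is why one reliable muscle is enough: the only signal the decoder needs is “pressed or not, and for how long.”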