samus12345@sh.itjust.works 1 month ago
I could see myself having conversations with an LLM, but I wouldn’t want it to pretend it’s anything other than a program assembling words together.
StarryPhoenix97@lemmy.world 1 month ago
samus12345@sh.itjust.works 1 month ago
“Stop the presses! Send my wife some flowers and bring me an Advil! What do you mean you don’t work for me? You’re hired! Now that you’re hired, you’re fired! Now that you don’t work here, we can be friends! Now that we’re friends, how come you never call? Some friend you are!” hangs up
“God, I love this business!”
Droggelbecher@lemmy.world 1 month ago
It’s not pretending to be anything; that’s just the function you described: assembling words together.
VitoRobles@lemmy.today 1 month ago
The way it clicks for me is that it’s a juiced-up auto-complete tool.
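Something like this toy bigram model (just counting which word most often comes next) captures the auto-complete shape. It’s nothing like a real LLM under the hood, just a sketch with a made-up corpus:

```python
# Toy bigram "auto-complete": for each word, remember which word most
# often follows it in some training text, then suggest that word.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follows[a][b] += 1

def suggest(word):
    # Pick the most frequent follower seen in training; None if unseen.
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("the"))  # -> "cat"
```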
very_well_lost@lemmy.world 1 month ago
It’s literally that.
Calabast@lemmy.ml 1 month ago
Well that explains why that user thinks it completes them.
Not_mikey@lemmy.dbzer0.com 1 month ago
If LLMs are juiced-up auto-complete, then humans are juiced-up bacteria. Yeah, each pair shares the same end goal (guess the next word; survive and reproduce), but the methods they use to accomplish it are vastly more complex.
Swedneck@discuss.tchncs.de 4 weeks ago
But it’s still literally just looking at the text and calculating the most likely thing to follow; that’s fundamentally how LLMs work.
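For the curious, that “calculate the most likely next token” step looks roughly like this with the Hugging Face transformers library; gpt2 is used purely as an example model, and any causal LM works the same way:

```python
# Minimal next-token prediction sketch: score every possible next token
# and pick the single most likely one (greedy decoding).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

text = "I could see myself having conversations with an"
input_ids = tokenizer(text, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(input_ids).logits  # (batch, seq_len, vocab_size)

# The logits at the last position rank every token in the vocabulary
# by how likely it is to come next; argmax takes the top one.
next_id = int(logits[0, -1].argmax())
print(tokenizer.decode(next_id))
```

Real chatbots sample from that distribution instead of always taking the argmax, and loop this step to generate whole replies, but the core operation is exactly that: score what follows, pick one, repeat.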