Comment on stuttering search engine

SalmiakDragon@feddit.nu 2 days ago
This is real?

okwhateverdude@lemmy.world 2 days ago
Believable given how robots spit out tokens.
Zorcron@lemmy.zip 1 day ago
I suppose it’s possible, but I’m always suspicious of screenshots like this just because they’d be so easy to fake, either by giving a different prompt than in the screenshot or by using inspect element to completely edit the response text.
EpeeGnome@feddit.online 1 day ago
I couldn’t replicate it, but I’ve seen LLMs fail this way before, so it’s very plausible.
TherapyGary@lemmy.dbzer0.com 1 day ago
[image: 1000014128]
Willdrick@lemmy.world 1 day ago
Yup, a temperature set too low does that: the model gets stuck repeating the last token/word ad infinitum.
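A rough sketch of why a very low temperature can cause this: dividing the logits by the temperature before softmax sharpens the distribution, and as temperature approaches zero, sampling collapses into always picking the single most likely token (greedy decoding), which is prone to repetition loops. The function and toy logits below are hypothetical illustrations, not any particular model's implementation.

```python
import math
import random

def sample(logits, temperature):
    """Sample a token index from logits; temperature -> 0 approaches argmax."""
    if temperature <= 1e-6:
        # Effectively greedy decoding: the same top token wins every time,
        # which is how a model can end up repeating one word forever.
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    weights = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=weights)[0]

# Toy logits where one token is only slightly preferred.
logits = [2.0, 1.9, 0.5]

# Near-zero temperature: the argmax token is chosen every single time.
greedy = [sample(logits, 0.0) for _ in range(20)]
print(greedy)

# Moderate temperature: sampling actually explores the close alternatives.
warm = [sample(logits, 1.0) for _ in range(20)]
print(warm)
```

With temperature near zero the output is the same index twenty times in a row; at temperature 1.0 the near-tied second token gets picked regularly, so the text stops stuttering.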