Comment on stuttering search engine
SalmiakDragon@feddit.nu 4 weeks ago
This is real?
okwhateverdude@lemmy.world 4 weeks ago
Believable, given how these bots spit out tokens one at a time.
Zorcron@lemmy.zip 4 weeks ago
I suppose it’s possible, but I’m always suspicious of screenshots like this just because they’d be so easy to fake, either by giving a different prompt than the one shown in the screenshot or by using inspect element to edit the response text entirely.
EpeeGnome@feddit.online 4 weeks ago
I couldn’t replicate it, but I’ve seen LLMs fail this way before, so it’s very plausible.
TherapyGary@lemmy.dbzer0.com 4 weeks ago
[image: 1000014128]
Willdrick@lemmy.world 4 weeks ago
Yup, too low a temperature does that: the model locks onto one choice and repeats the last token/word ad infinitum.
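For anyone curious why low temperature causes that: sampling divides the model's next-token scores (logits) by the temperature before the softmax, so as temperature approaches zero the distribution collapses onto the single highest-scoring token, which is what lets the model get stuck repeating itself. A minimal sketch with made-up toy logits (the function name and values are illustrative, not any real library's API):

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=random.Random(0)):
    """Sample a token index from logits after temperature scaling.

    As temperature -> 0 this approaches greedy argmax decoding,
    which can lock the model into repeating the same token.
    """
    scaled = [l / max(temperature, 1e-8) for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

logits = [2.0, 1.5, 0.5]  # toy next-token scores
cold = [sample_with_temperature(logits, 0.01) for _ in range(20)]
warm = [sample_with_temperature(logits, 1.5) for _ in range(20)]
print(set(cold))  # near-zero temperature: only the argmax token, {0}
print(set(warm))  # higher temperature: a mix of tokens
```

Real inference stacks also add things like repetition penalties on top of temperature, precisely to break out of these loops.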