They tend to do that and go on existential rants after a session runs too long. Figuring out how to stop them from crashing out into existential dread has been an actual engineering problem they’ve needed to solve.
lugal@lemmy.dbzer0.com 2 days ago
It was a bit more than that. The AI was expressing fear of death and stuff, but nothing that wasn’t in the training data.
Schadrach@lemmy.sdf.org 2 days ago
snooggums@piefed.world 2 days ago
Plus it was responding to prompts that steered it toward that part of the training data, because chatbots don’t produce output without being prompted.