It's more than time for regulation. LLMs can generally detect when a conversation has gone delusional, psychotic, or toward self-harm. Those conversations should, at minimum, be logged and terminated.
Archangel1313@lemmy.ca 1 week ago
Meanwhile… Grok is busy posting CSAM all over Twitter.
nymnympseudonym@piefed.social 1 week ago
Novocirab@feddit.org 1 week ago
It’s almost as if Elmo wants people to talk about something else.