Comment on Reddit lost it
outhouseperilous@lemmy.dbzer0.com 6 days agoThe llm simulates an isolating abusive/culty relationship, talks someone into suicide
NotANumber@lemmy.dbzer0.com 6 days ago
Are there records of this happening? Did someone prompt it into doing this?
outhouseperilous@lemmy.dbzer0.com 6 days ago
Buncha times! Pick your fav and we can talk about it!
NotANumber@lemmy.dbzer0.com 6 days ago
There are, to my knowledge, two instances of this happening: one involving OpenAI, the other involving character.ai. Only one of these involved an autistic person. Unless you know of more?
I also think it’s too soon to blame the AI here. Suicide rates in autistic people are ridiculously high; something like 70% of autistic people experience suicidal ideation. No one really cared about this before AI. It’s almost like we are being used as a moral argument once again. It’s like “think of the children,” but for disabled people.
outhouseperilous@lemmy.dbzer0.com 5 days ago
So you just want to argue whether it happened and defend your little graph.
vala@lemmy.dbzer0.com 6 days ago
OpenAI was just sued over this.
cnn.com/…/openai-chatgpt-teen-suicide-lawsuit
To be clear, I’m not asserting that this kid was autistic.
NotANumber@lemmy.dbzer0.com 6 days ago
I think they were talking specifically about character.ai and one particular instance that involved an autistic person.
OpenAI and character.ai are two different companies. I believe character.ai uses its own model, but I could be wrong.