Comment on Mom horrified by Character.AI chatbots posing as son who died by suicide - Ars Technica
GammaGames@beehaw.org 6 days ago
“hey, I know you feel like killing yourself, but if it happens then we’ll just replace you with a shitty bot” probably isn’t as helpful as they thought it would be. It’s violating and ghoulish.
OsrsNeedsF2P@lemmy.ml 6 days ago
I hate this attitude of “well if you can’t get a professional therapist, figure out how to get one anyways”. There needs to be an option for people who either can’t afford or can’t access a therapist. I would love for AI to fill that gap. No, it won’t be as good, but in many regions the wait-list for therapy is far too long.
TehPers@beehaw.org 6 days ago
Someone close to me gave up on the hotlines in the US and now just uses ChatGPT. It’s no therapist, but at least it’ll hold a conversation. If only the hotlines here weren’t so absurdly understaffed.
Powderhorn@beehaw.org 6 days ago
I’ve given up on crisis lines. Their whole premise seems to be “get back to being comfortable with the oppressive system, you little bitch.”
Megaman_EXE@beehaw.org 6 days ago
I’ve used one called Pi, which I’m assuming is some kind of branch off of ChatGPT or something.
You don’t have to sign up or anything (for now), which is cool. But I assume they harvest all our data and information.
I tested to see if I could break it once, and from my brief tests, it seemed to never break out of character or tell me something bad or negative, which I thought was interesting (and good!)
Powderhorn@beehaw.org 6 days ago
I actually used Pi as my intro to generative LLMs. It was … I guess not encouraging self-harm, but so fucking irritating that it led me to want to. Always with the irrelevant supportive words that I guess work if you’re a teen?
Alice@beehaw.org 4 days ago
I tried AI once but it just kept telling me to call the hotlines. Useless.
wizardbeard@lemmy.dbzer0.com 6 days ago
I would have loved AI to fill that need as well, but it’s not an adequate tool for the job.