Comment on Mom horrified by Character.AI chatbots posing as son who died by suicide - Ars Technica
OsrsNeedsF2P@lemmy.ml 1 week ago
I hate this attitude of “well if you can’t get a professional therapist, figure out how to get one anyways”. There needs to be an option for people who either can’t afford or can’t access a therapist. I would love for AI to fill that gap. No, it won’t be as good, but in many regions the wait-list for therapy is far too long.
wizardbeard@lemmy.dbzer0.com 1 week ago
I would have loved AI to fill that need as well, but it’s not an adequate tool for the job.
TehPers@beehaw.org 1 week ago
Someone close to me gave up on the hotlines in the US and now just uses ChatGPT. It’s no therapist, but at least it’ll hold a conversation. If only the hotlines here weren’t so absurdly understaffed.
Powderhorn@beehaw.org 1 week ago
I’ve given up on crisis lines. Their whole premise seems to be “get back to being comfortable with the oppressive system, you little bitch.”
Megaman_EXE@beehaw.org 1 week ago
I’ve used one called Pi, which I’m assuming is some kind of branch off of ChatGPT or something.
You don’t have to sign up or anything (for now), which is cool. But I assume they harvest all our data and information.
I tested to see if I could break it once, and from my brief tests it never seemed to break out of character or tell me anything bad or negative, which I thought was interesting (and good!)
Powderhorn@beehaw.org 1 week ago
I actually used Pi as my intro to generative LLMs. It was … I guess not encouraging self-harm, but so fucking irritating that it led me to want to. Always with the irrelevant supportive words that I guess work if you’re a teen?
Megaman_EXE@beehaw.org 1 week ago
Lol yes, that was the one downside I was going to mention. I wasn’t sure if it was unique to my situation, but I found it would lead me down a logical path, asking me if I had tried various solutions.
Eventually I would hit a point where it didn’t know where to go any further, and it would land on “here’s some things you can do”, but those options were all things I was already trying and failing with.
So that was fun. In a way, it was great at confirming that I had thought of all the logical options.
Alice@beehaw.org 1 week ago
I tried AI once but it just kept telling me to call the hotlines. Useless.