Comment on Mom horrified by Character.AI chatbots posing as son who died by suicide - Ars Technica

Megaman_EXE@beehaw.org 1 week ago

I’ve used one called Pi, which I’m assuming is some kind of offshoot of ChatGPT or something.

You don’t have to sign up or anything (for now), which is cool. But I assume they harvest all of our data and information.

I tested once to see if I could break it, and in my brief tests it never seemed to break out of character or tell me anything bad or negative, which I thought was interesting (and good!).
