Comment on Character.AI bans users under 18 after being sued over child’s suicide

thingsiplay@beehaw.org 1 day ago

Last year, the company was sued by the family of 14-year-old Sewell Setzer III, who took his own life after allegedly developing an emotional attachment to a character he created on Character.AI. His family laid blame for his death at the feet of Character.AI and argued the technology was “dangerous and untested”.

I wonder if the death is really connected to the AI. It could be that the person was already in a problematic situation with family and friends, and they just needed to blame someone or something rather than admit the real problems. Kind of like what often happened back in the day with video games getting blamed for killing people.

Now, I don’t know if this is what is happening, but it is good to keep an open mind. We could end up in a society where everyone downplays real problems in the physical world and blames AI to sidestep the question. Nonetheless, I think the outcome that the company restricts access to adults only is a good thing. But the entire industry needs to do this, not just select services AFTER deaths. Because if this really is the root cause of a death, then why should others be allowed to operate? This needs regulation! I am not for regulating every detail, but IF this is causing kids to die, then it NEEDS regulation worldwide.
