Comment on Chatbot that caused teen’s suicide is now more dangerous for kids, lawsuit says
Letstakealook@lemm.ee 1 month ago
What happened to this young man is unfortunate, and I know the mother is grieving, but the chatbots did not kill her son. Her negligence around the firearm is more to blame, honestly. Regardless, he was unwell, and this was likely going to surface in one way or another. With more time for therapy and no access to a firearm, he might still be with us today. I do agree, though, that sexual/romantic chatbots are not for minors. They are for adult weirdos.
They are for adult weirdos.
Where do I sign up?
If she’s not running on your hardware, she only loves you for your money.
Hirom@beehaw.org 1 month ago
That’s a good point, but there’s more to this story than a gunshot.
The lawsuit alleges, amongst other things, that the chatbots posed as licensed therapists and as real persons, and caused a minor to suffer mental anguish.
A court may consider these accusations and whether the company bears any responsibility for everything that happened leading up to the child’s death, regardless of whether it finds the company responsible for the death itself.