We shouldn’t judge them for getting to this point, but we also shouldn’t just leave them be. People like the one in the screenshot are clearly in need of therapy.
We already saw the story of the guy who poisoned himself with sodium bromide on the “advice” of ChatGPT. What kind of outward damage could someone who confides in a chatbot be capable of?
DoctorDelicious@leminal.space 3 days ago
Nah, this is corporations exploiting vulnerable people for profit through a perceived interpersonal connection.
Daft_ish@lemmy.dbzer0.com 3 days ago
Ok, make it illegal.
Kuori@hexbear.net 2 days ago
never gonna happen. even if it did it would be “illegal” in the “pay a fine” sense rather than the “firing squad” sense, which is to say it will remain fully 100% legal