Comment on call of the void
Samskara@sh.itjust.works 1 day ago
People in distress will talk to an LLM instead of calling a suicide hotline.
Scubus@sh.itjust.works 1 day ago
Ok, people will turn to google when they’re depressed. I just googled a couple months ago the least painful way to commit suicide. Google gave me the info I was looking for. Should I be mad at them?
Samskara@sh.itjust.works 1 day ago
You are ignoring that people are already developing personal emotional relationships with chatbots. That’s not the case with search bars.
The first line above the search results at google for queries like that is a suicide hotline phone number.
A chatbot should provide at least that as well.
I’m not saying it should provide no information.
Scubus@sh.itjust.works 1 day ago
Ok, then we are in agreement. That is a good idea.
I think that at low levels the tech should not be hindered just because a subset of users use the tool improperly. There is a line, however, though I’m not sure where it is. If that problem were to become as widespread as, say, gun violence, then I would agree that the utility of the tool may need to be affected to curb the negative influence.
Samskara@sh.itjust.works 1 day ago
It’s about providing some safety measures to protect the most vulnerable. They need to be thrown a lifeline and an exit sign on their way down.
For gun purchases, this can mean waiting periods of a few days, so you don’t buy a gun in anger and kill someone, regretting it immediately and ruining many people’s lives.
Did you have to turn off safe search to find methods for suicide?