“These are people at their worst moments. Using that data to help other people is one thing, but commercializing it just seems like a real ethical line for a nonprofit to cross.”
Jennifer King, privacy and data policy fellow at Stanford University
Submitted 2 years ago by Slatlun@lemmy.ml to privacy@lemmy.ml
https://www.politico.com/news/2022/01/28/suicide-hotline-silicon-valley-privacy-debates-00002617#
The irony is that the company uses data from Crisis Text Line to make their customer support more "empathetic". And when you drill down to their so-called value proposition, it is apparently to help human empathy "scale". That word, "scale", lies at the heart of almost all immoral business practices. It is a euphemism for "how do we spend as little input as possible to get as much output as possible".
I hate this world.
I've lost hope.
Happens to me every time I read the news.
Saw this fictional short story via diaspora:
https://www.youtube.com/watch?v=NZ8G3e3Cgl4
let no death go to waste
YouWillNeverBeAWoman@lemmy.ml 2 years ago
I have this awful déjà vu of that time Facebook used their sentiment analysis to target depressed teens with antidepressant ads… [1] https://www.technologyreview.com/2017/05/01/105987/is-facebook-targeting-ads-at-sad-teens/
We seriously need better privacy laws, especially for any kind of medical records!