Comment on People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies
0x0@lemmy.dbzer0.com 2 days ago
I’m no AI proponent, but I’ll only believe that LLMs are causing this psychosis when I see a randomized controlled trial. Until then, it seems far more plausible that people experiencing delusions gravitate towards LLMs.
zygo_histo_morpheus@programming.dev 2 days ago
Well, it’s not like everyone who uses ChatGPT is going to become delusional, but if you start going down that path, ChatGPT is going to make it a lot worse.
Swedneck@discuss.tchncs.de 2 days ago
Causing? No.
Hugely enabling any slight tendencies? No doubt.
Mniot@programming.dev 1 day ago
Based on the article, it seems like cult-follower behavior. Not everyone is susceptible to cults (I think it’s a combination of individual brain chemistry and life circumstances), but I wouldn’t say, “eh, it’s not the cult’s fault that these delusional people killed themselves!”
SteposVenzny@beehaw.org 1 day ago
Cults intend to exploit and manipulate. LLMs don’t.
You could argue negligence here, but not malice. It’s more in line with people falling into wells.
verdare@beehaw.org 2 days ago
In my few experiments with ChatGPT, I found it to be disgustingly sycophantic. I have no trouble believing that it could easily amplify existing delusions of grandeur.