
My AI therapist got me through dark times

9 likes

Submitted 4 weeks ago by sabreW4K3@lazysoci.al to technology@beehaw.org

https://www.bbc.com/news/articles/ced2ywg7246o


Comments

  • kami@lemmy.dbzer0.com 4 weeks ago

    No.

  • Powderhorn@beehaw.org 4 weeks ago

    A few thoughts here as someone with multiple suicide attempts under his belt:

    • I’d never use an “AI therapist” not running locally. Crisis is not the time to start uploading your most personal thoughts to an unknown server with possible indefinite retention.

    • When ideation hits, we’re not of sound enough mind to consider that, so it is, in effect, taking advantage of people in a dark place for data gathering.

    • Having seen the gamut of mental-health services from what’s available to the indigent to what the rich have access to (my dad was the director of a private mental hospital), it’s pretty much all shit. This is a U.S. perspective, but I find it hard to believe we’re unique.

    • As such, there may be room for “AI” to provide similar outcomes to crisis lines, telehealth or in-person therapy. But again, this would need to be local and likely isn’t ready for primetime, as I can really only see this becoming more helpful once it can take over more of an agent role where it has context for what you’re going through.

  • spit_evil_olive_tips@beehaw.org 4 weeks ago

    With NHS mental health waitlists at record highs, are chatbots a possible solution?

    taking Betteridge’s Law one step further - not only is the answer “no”, the fucking article itself explains why the answer is no:

    People around the world have shared their private thoughts and experiences with AI chatbots, even though they are widely acknowledged as inferior to seeking professional advice.

    as with so many other things, “maybe AI can fix it?” is being used as a catch-all for every systemic problem in society:

    In April 2024 alone, nearly 426,000 mental health referrals were made in England - a rise of 40% in five years. An estimated one million people are also waiting to access mental health services, and private therapy can be prohibitively expensive.

    fucking fund the National Health Service properly, in order to take care of the people who need it.

    but instead, they want to continue cutting its budget, and use “oh there’s an AI chatbot that you can use that is totally just as good as talking to a human, trust us” as a way of sweeping the real-world harm caused by those budget cuts under the rug.

    Nicholas has autism, anxiety, OCD, and says he has always experienced depression. He found face-to-face support dried up once he reached adulthood: “When you turn 18, it’s as if support pretty much stops, so I haven’t seen an actual human therapist in years.”

    He tried to take his own life last autumn, and since then he says he has been on a NHS waitlist.

  • rob299@lemmy.blahaj.zone 4 weeks ago

    I think AI can be useful in cases like this, especially when a person is literally about to attempt suicide. AI might not be 100% accurate, but if it can prevent someone from taking their life by offering some support, that is a positive thing.

    • wizardbeard@lemmy.dbzer0.com 4 weeks ago

      Considering there have already been news stories of AI chatbots telling users to kill themselves and feeding into suicidal ideation, it is absolutely not a reliable fact that the AI will not cause further harm.

      • rob299@lemmy.blahaj.zone 4 weeks ago

        There are going to be AIs that tell users to kill themselves and AIs that don't, depending on how well they are trained or set up by the creator and how the user acts towards the AI. It's not just one factor; all of those factors are linked to how the AI will respond, so I'm not particularly victim blaming.

        I do have to wonder why you bring up a news anchor's fetish when we're talking about the potential of an AI chatbot helping someone with suicidal issues. It really just sounds like you have a bias against AI and aren't taking into account the potential to prevent suicide for a person who has no one they feel comfortable talking to.

        However, it is important to know the context of these cases. AI character creators are able to make AI characters (on platforms like Character AI) and to direct the behavior of the AI however they want. Gemini and Grok, on the other hand, AIs where no one knows how Google or Musk are guiding them, I'll acknowledge might not be as trustworthy, because on Character AI the user knows up front how the AI is intended to behave by its creator. If the user wants to modify the AI's behavior or vibe, they can literally just tell the AI character how they want it to behave and it will adapt. So AI certainly does have the potential to help people with their traumas through conversation.
