• OsrsNeedsF2P@lemmy.ml · 3 months ago

    About 3 percent of students in the study had positive mental health outcomes, reporting that talking to the chatbot “halted their suicidal ideation.” But researchers also found “there are some cases where their use is either negligible or might actually contribute to suicidal ideation.”

    This is referring to a bot designed to help people struggling with mental health, and that's actually a big deal. That number is way too low.

    • Gamma@beehaw.org · 3 months ago

      “hey, I know you feel like killing yourself, but if it happens then we’ll just replace you with a shitty bot” probably isn’t as helpful as they thought it would be. It’s violating and ghoulish.

      • OsrsNeedsF2P@lemmy.ml · 3 months ago

        I hate this attitude of “well if you can’t get a professional therapist, figure out how to get one anyways”. There needs to be an option for people who either can’t afford or can’t access a therapist. I would have loved for AI to fill that gap. I understand it won’t be as good, but in many regions the wait-list for therapy is far too long, and something is better than nothing.

        • TehPers@beehaw.org · 3 months ago

          Someone close to me gave up on the hotlines in the US and now just uses ChatGPT. It’s no therapist, but at least it’ll hold a conversation. If only the hotlines here weren’t so absurdly understaffed.

          • Powderhorn@beehaw.org · 3 months ago

            I’ve given up on crisis lines. Their whole premise seems to be “get back to being comfortable with the oppressive system, you little bitch.”