sabreW4K3@lazysoci.al to Technology@beehaw.org · 2 months ago
Mom horrified by Character.AI chatbots posing as son who died by suicide - Ars Technica (arstechnica.com)
TehPers@beehaw.org · 2 months ago
Someone close to me gave up on the hotlines in the US and now just uses ChatGPT. It’s no therapist, but at least it’ll hold a conversation. If only the hotlines here weren’t so absurdly understaffed.
Alice@beehaw.org · 2 months ago
I tried AI once, but it just kept telling me to call the hotlines. Useless.
Pete Hahnloser@beehaw.org · 2 months ago
I’ve given up on crisis lines. Their whole premise seems to be “get back to being comfortable with the oppressive system, you little bitch.”