• FourWaveforms@lemm.ee · 23 hours ago

    The article talks of ChatGPT “inducing” this psychotic/schizoid behavior.

    ChatGPT can’t do any such thing. It can’t change your personality organization. Those people were already there, at risk, masking high enough to get by until they could find their personal Messiahs.

    It’s very clear to me that LLM training needs to include protections against getting dragged into a paranoid/delusional fantasy world. People who are significantly on that spectrum (as well as those with borderline personality organization) are routinely left behind in many ways.

    This is just another area where society is not designed to properly account for or serve people with “cluster” disorders.

    • Captain Aggravated@sh.itjust.works · 10 hours ago

      I mean, I think ChatGPT can “induce” such schizoid behavior in the same way a strobe light can “induce” seizures. Neither machine is twisting its mustache while hatching its dastardly plan, they’re dead machines that produce stimuli that aren’t healthy for certain people.

      Thinking back to college psychology class, I remember reading about horrendously unethical studies that definitely wouldn’t fly today. Well, here’s one: let’s issue every anglophone a sniveling yes-man and see what happens.

      • DancingBear@midwest.social · 7 hours ago

        No, the light is causing a physical reaction. The LLM is nothing like a strobe light…

        These people are already high-functioning schizophrenics having psychotic episodes; it’s just that seeing strings of statistically likely next letters and words has become part of the episode. If it weren’t the LLM, it would be random letters on license plates driving by, or the coincidence that red lights stop traffic every few minutes.

        • AdrianTheFrog@lemmy.world · 2 hours ago

          If it weren’t the LLM, it would be random letters on license plates driving by, or the coincidence that red lights stop traffic every few minutes.

          You don’t think having a machine (that seems like a person) telling you “yes, you are correct, you are definitely the Messiah, I will tell you ancient secrets” has any extra influence?