• yesman@lemmy.world (+4/−19) · 2 days ago

    The American Psychological Association met with the FTC in February to urge regulators to address the use of AI chatbots as unlicensed therapists.

    Protect our revenue, er, patients!

    • TheAlbatross@lemmy.blahaj.zone (+15) · 2 days ago

      I think that’s a little cynical. I know a few people who work in psych, some in ERs, and it’s becoming more common for them to hear of patients who followed advice they got from ChatGPT and harmed themselves. One particularly egregious case involved a patient who was using the program for therapy, then suddenly pivoted to asking what the highest buildings in the area were, which, of course, the program answered.

      • dindonmasker@sh.itjust.works (+8) · 2 days ago

        The highest building will just make you regret your decision for longer on the way down. May I suggest this building close to your location that’s exactly as tall as it needs to be to do the job? ChatGPT, probably.

        • TheAlbatross@lemmy.blahaj.zone (+11) · 2 days ago

          Funny, but the reality is even darker. The program has zero safeguards built in for these scenarios, so it draws no connection at all between the two topics, something even a self-styled, unlicensed “life coach” would immediately pick up on.