• webghost0101@sopuli.xyz

      Pretty sure it’s in the ToS that it can’t be used for therapy.

      It used to be even worse. Older versions of ChatGPT would simply refuse to continue the conversation at the mention of suicide.

      • jagged_circle@feddit.nl

        What? It’s a virtual therapist. That’s the whole point.

        I don’t think you can sell a sandwich and then write on the back “this sandwich is not for eating” to get out of a food poisoning case.

    • Case@lemmynsfw.com

      I mean, in theory… isn’t that a company practicing medicine without the proper credentials?

      I’ve worked in IT for medical companies throughout my career, and my wife is a clinical tech.

      There is shit we just CAN NOT say due to legal liabilities.

      Like, my wife can generally tell what’s going on with a patient; however, she does not have the credentials or authority to diagnose.

      That includes telling the patient or their family what is going on. That is the doctor’s job. That is the doctor’s responsibility. That is the doctor’s liability.