• Electricd@lemmybefree.net · +3 / −26 · 2 days ago

    Safety guidelines are regularly given

    If people purchase a knife and behave badly with it, it’s on them

    Something that writes text isn’t comparable to a machine that could kill you. In the end, it’s always up to the person doing the act

    I still wonder how ClosedOpenAI forcibly installed ChatGPT in this person’s home. Or how it got installed at all when they don’t ship software… Quit your bullshit

    • Feathercrown@lemmy.world · +8 / −1 · 1 day ago

      This is more like selling someone a knife that can randomly decide of its own accord to stab them

        • Epzillon@lemmy.world · +4 · 22 hours ago

          Are you deadass saying we should let ChatGPT itself and the companies that ship it form its own safety guidelines? Because that went really well with the Church Rock incident…

          • Electricd@lemmybefree.net · +1 / −2 · 22 hours ago

            If they don’t, lawsuits will come their way, so they’ll put some in place

            But having some laws isn’t necessarily bad; I just don’t trust countries to do a good job of it, knowing how tech-illiterate they are

            • Epzillon@lemmy.world · +1 · 20 hours ago

              What do you even mean? You are contradicting yourself. “We shouldn’t blame AI or the companies because they can’t be controlled,” but the companies and the AI itself are supposed to handle the safety regulations? What kind of regulations do you seriously expect them to restrict themselves with when they know there is no way they can guarantee safety? The legislation must come from outside the business and restrict the industry from releasing half-baked ass-garbage that is potentially harmful to the public.

              • Electricd@lemmybefree.net · +1 · 20 hours ago

                What I meant is:

                You can’t expect LLMs not to do that, because it isn’t technically possible at the moment

                Companies should display warnings and add some safeguards to reduce how often this happens