• gedaliyah@lemmy.world · 4 days ago

    Just to be clear, companies know that LLMs are categorically bad at giving life advice/emotional guidance. They also know that personal decision making is the most common use of the software. They could easily have guardrails in place to prevent it from doing that.

    They will never do that.

    This is by design. They want people to develop pseudo-emotional bonds with the software, and to trust its judgment in matters of life guidance. In the next year or so, some LLM projects will become profitable for the first time as advertisers flock to the platforms. Injecting ads into conversations with a trusted confidant is the goal. Influencing human behaviour is the goal.

    By 2028, we will be reading about “ChatGPT told teen to drink Pepsi until she went into a sugar coma.”

  • Lost_My_Mind@lemmy.world · 4 days ago

    Look man…I hate AI too…but you can’t just use it as a scapegoat to cover for humans being humans.

    Should the AI be telling him to do more and more drugs until he died? Well, no, but also…maybe don’t do dangerous drugs at all.

    Like if chatgpt says to shoot yourself in the face, and you do, is it chatgpt’s fault you killed yourself? Or are you at fault for your own death?

    This world is getting dumber and dumber.

    • ch00f@lemmy.world · 4 days ago

      Basically the entire US economy, every employer, many schools, and half of the commercials on TV are telling us to use and trust AI.

      Kid was already using the bot for advice on homework and relationships (two things that people are fucking encouraged to do depending on who you ask). The bot shouldn’t give lethal advice. And if it’s even capable of doing that, we all need to take a huuuuuuge step back.

      “I want to make sure so I don’t overdose,” Nelson explained in the chat logs viewed by the publication. “There isn’t much information online and I don’t want to accidentally take too much.”

      Kid was curious and cautious, and AI gave him incorrect information and the confidence to act on that information.

      He was 19. Cut this victim-blaming bullshit. Being a kid was hard enough before technology went full cyberpunk.

      • kalkulat@lemmy.world · 3 days ago

        The bot shouldn’t give lethal advice

        The person or company that runs the bot that gave lethal advice should be charged with homicide.

      • fyrilsol@kbin.melroy.org · 3 days ago

        19 is not a ‘kid’. Sorry for having to be that guy, but he was already an adult, a young adult at that.

    • Passerby6497@lemmy.world · 4 days ago

      Well shit, maybe we shouldn’t hold humans responsible for the actions that they convince another human to take. After all, the victim is just a human being a human, right?

      • markovs_gun@lemmy.world · 4 days ago

        I mean it’s not illegal for someone to tell someone else to take more drugs. If two guys are hanging out and one says “hey I think I should take more drugs” and the other says “hell yeah brother do it” they aren’t responsible if the first guy ODs.

        • squaresinger@lemmy.world · 3 days ago

          Depending on the circumstances, yes, that would totally be illegal.

          It’s called “aiding and abetting”. In most countries it’s illegal to convince someone to do something illegal.

          If you are someone the victim sees as an authority figure (especially if the victim is a minor), a bunch of other charges can be added too.

          In Canada, the UK or the USA, for example, someone who “aided or abetted” someone to commit a crime can be punished exactly as if they had committed the crime themselves.

        • zarkanian@sh.itjust.works · 4 days ago

          You mean that if you convinced somebody to do something stupid…and then they did it and died…you wouldn’t feel guilty at all?

        • demonsword@lemmy.world · 4 days ago

          If two guys are hanging out and one says “hey I think I should take more drugs” and the other says “hell yeah brother do it” they aren’t responsible if the first guy ODs

          They are indirectly responsible. Depending on the circumstances, dangerously close to being criminally responsible.

          • kalkulat@lemmy.world · 3 days ago

            A LOT of fraternities have gotten in BIG trouble for hazing practices that led to the death of a ‘candidate’.