• Prove_your_argument@piefed.social
    ↑ 2 · ↓ 16 · 11 days ago

    and if the argument is “Bbbuut the LLM was wrong once and someone DIED!”

    The comparison is humans being wrong over and over and over, to the tune of countless deaths. Malpractice lawsuits must be rare compared to the number of mistakes actually made, simply because it’s difficult to win one and extremely costly if you lose.

    We already have people posting on social media for medical advice. LLMs just can’t be worse than that.

    • greasewizard@piefed.social
      ↑ 12 · 11 days ago

      You can at least sue a doctor for malpractice if they make a mistake. If you follow medical advice from a chatbot and you die, who is liable?

      Large Language Models were built to rewrite emails, not to provide valid medical advice.

      • Prove_your_argument@piefed.social
        ↑ 3 · ↓ 4 · 11 days ago

        If you post on Reddit asking for advice, nobody claims to be a doctor, and you die after following that advice, who do you sue?

        IMO we shouldn’t need disclaimers stating that absolutely everyone and everything is not a lawyer, not an HCP, etc. It’s just a given.

        If you google something and just blindly do what the first result says, do you have a case against them too?