• KT-TOT@lemmy.blahaj.zone · 10 days ago

      You’re claiming that LLMs are superior to doctors in… accurate diagnosis rates for unspecified conditions?

      Helluva claim without offering even the suggestion of evidence. Not really sure a “no ur stupid” is needed; you’re not exactly making a claim based in established fact or reality.

      Also uh, self-driving cars lmao.

          • Prove_your_argument@piefed.social · 10 days ago

            Nah, I just consistently put more effort in than you clowns.

            https://pmc.ncbi.nlm.nih.gov/articles/PMC11263899/

            ChatGPT-4 rated higher than the physicians at taking text input and reaching a diagnosis in that study.

            Here’s a completely different one: 90% accuracy for ChatGPT, 74% for doctors not using an LLM tool. https://www.advisory.com/daily-briefing/2024/12/03/ai-diagnosis-ec

            So ChatGPT was wrong 10% of the time and the doctors were wrong 26% of the time. That’s a 2.6x higher failure rate for the real docs… for that one, anyway.

            It’s just a matter of time before medical diagnosis is done by LLMs first and then simply reviewed by a doc as a sanity check, because humans “don’t trust” technology.

            So here, you literally just proved you’re an asshat, and I brought data.

            • wucking_feardo@lemmy.world · 10 days ago

              The first study had 100 people, and the authors specifically indicate that the high-quality results depend on adequate input from the resident physicians. When the diagnosis was based on self-reported input, accuracy dropped to 50%.

              So I do not believe LLMs are able to replace doctors.

            • AbeilleVegane@beehaw.org · 10 days ago

              "I feel like this is the self driving car thing again.

              How often are human doctors wrong in their diagnoses?

              How often are LLM doctors wrong in their diagnoses?

              I’m pretty sure the former is close to 75%, and the latter substantially less. I’ve heard of so many people going to doctor after doctor and not getting the right diagnosis or treatment for whatever they have going on, and it takes 5+ to find the one who figures it out and gets them treated."

              This you?