• Mirshe@lemmy.world
    21 hours ago

Even then, an LLM isn’t going to be able to actually diagnose a problem. It’s just very sophisticated word prediction; it doesn’t actually THINK, even if it looks like it does.

    • nul9o9@lemmy.dbzer0.com
      12 hours ago

      Absolutely. LLMs are the antithesis of innovation. No thinking outside the box, just rehashing old ideas.