• ContrarianTrail@lemm.ee
    2 months ago

    When you get a long, nuanced answer to a seemingly simple question, you can be fairly certain they know what they’re talking about. If you prefer a short, simple answer, it’s better to ask someone who doesn’t.

    • Halcyon@discuss.tchncs.de
      2 months ago

      It’s an LLM. It doesn’t “know” what it’s talking about. Gemini is designed to write long, nuanced answers to every question, unless prompted otherwise.

      • ContrarianTrail@lemm.ee
        2 months ago

        Not knowing what it’s talking about is irrelevant if the answer is correct. Humans who know what they’re talking about are just as prone to mistakes as an LLM is. One could argue they err in far more ways, too. I don’t see the way the two work as being as different from each other as most people here seem to.