• luciferofastora@feddit.org · 23 hours ago

    It’s trained on human writing, not knowledge. It has no actual understanding of meaning or logical connections, just an impressive statistical store of which words and phrases tend to occur in the context of the prompt and the rest of the answer. It’s very good at sounding human, and that’s one hell of an achievement.
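
    To make that concrete, here’s a minimal sketch: a toy word-level Markov chain (my own illustration, nothing like a real transformer in scale or architecture, and the tiny corpus is made up). It produces grammatical-looking text purely from co-occurrence statistics, with zero model of meaning:

    ```python
    # Toy illustration (mine, not from the article): a word-level Markov
    # chain. It "writes" by sampling which word statistically tends to
    # follow the previous one -- pure pattern completion, no concept of
    # what any word means. Real LLMs are vastly more sophisticated, but
    # the training objective is likewise about plausible continuation.
    import random
    from collections import defaultdict

    corpus = ("the model predicts the next word . "
              "the model has no idea what a word means . "
              "it only knows which word tends to follow the last one .").split()

    # Record every word that follows each word in the training text.
    follows = defaultdict(list)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev].append(nxt)

    # Generate by repeatedly sampling a statistically plausible next word.
    word = "the"
    output = [word]
    for _ in range(12):
        word = random.choice(follows[word])
        output.append(word)
    print(" ".join(output))  # fluent-looking, meaning-free
    ```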

    But its lack of actual knowledge becomes apparent in things like the alphabet poster example, or a history professor asking a simple question and getting a complicated answer: the kind that sounds like a student trying to seem like they read the books in question, but misses the one-sentence answer that someone who actually knows the books would give. Source (the example I cited is about a third of the way into the article).

    If the best it can do is sound like a student trying to bullshit their way through, then that’s probably the most accurate description: It has been trained to sound knowledgeable, but it’s actually just a really good bullshitter.

    Again, don’t get me wrong, as a language processing and generation tool, I think it’s an amazing demonstration of what is possible now. I just don’t like seeing people ascribe any real understanding to a hyperintelligent parrot.

    • Almacca@aussie.zone · 16 hours ago

      “It has been trained to sound knowledgeable, but it’s actually just a really good bullshitter.”

      So just like their creators.