• Dogiedog64@lemmy.world
    2 days ago

    Yup, literally seeing human features in random noise. LLMs can’t think and aren’t conscious; anyone telling you otherwise is either trying to sell you something or has genuinely lost their mind.

    • Garbagio@lemmy.zip
      2 days ago

      I don’t even necessarily think they’ve lost their mind. We built a machine that is incapable of thought or consciousness, yes, but is fine-tuned to regurgitate an approximation of it. We built a sentience-mirror, and are somehow surprised that people think the reflection is its own person.

      • BanMe@lemmy.world
        2 days ago

        Even more than a sentience mirror, it will lead you into a fantasy realm based on the novels it’s trained on, which often include… AI becoming sentient. It’ll play the part if you ask it.