• snooggums@lemmy.world
      13 hours ago

      AI doesn’t do that either. That is another example of trying to make AI sound like it has reasoning and intent like a person, instead of the pattern-matching weighted randomizer that it is.

      • Cethin@lemmy.zip
        4 hours ago

        Since you’re going to be ridiculously pedantic, it isn’t AI. Start there, where it’s actually potentially useful to make the distinction.

        • Prandom_returns@lemm.eeM
          2 hours ago

          Terminology changes, and bad terminology can and should be changed.

          Anthropomorphising software only helps the ones pushing it.

        • snooggums@lemmy.world
          12 hours ago

          Yes, I am fully aware of the term’s origin, and it was clever at the time. But given how LLMs and other AI are currently being promoted, it is important to point out that it is not literally accurate. Calling the errors hallucinations, saying the models invent things, or using any other anthropomorphic language helps the companies selling these products deflect from the fact that what is currently being shoved down our throats is a pile of unreliable bullshit generators.