• snooggums@lemmy.world
    10 hours ago

    Yes, I am fully aware of the term’s origin, and it was clever at the time, but it is important to point out that it is not literally accurate given how LLMs and other AI are currently being promoted. Calling their errors hallucinations, saying they invent things, or using any other anthropomorphic framing lets the companies selling these products deflect from the fact that what is being shoved down our throats is a set of unreliable bullshit generators.