“Hallucinates”
AI doesn’t do that either. That is another example of trying to make AI sound like it has reasoning and intent like a person, instead of the pattern-matching weighted randomizer that it is.
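To make the "weighted randomizer" point concrete, here is a minimal sketch of how next-token sampling works. The vocabulary, logit values, and temperature below are invented for illustration; a real LLM scores tens of thousands of tokens using learned weights, not a hard-coded table.

```python
import math
import random

# Hypothetical raw scores (logits) a model might assign to candidate next
# tokens after a prompt like "The capital of France is".
logits = {"Paris": 9.1, "Lyon": 5.3, "London": 4.8, "Berlin": 4.2}

def sample_next_token(logits: dict[str, float], temperature: float = 1.0) -> str:
    """Turn scores into probabilities (softmax) and draw one token at random."""
    scaled = {tok: score / temperature for tok, score in logits.items()}
    max_score = max(scaled.values())  # subtract the max for numerical stability
    exps = {tok: math.exp(s - max_score) for tok, s in scaled.items()}
    total = sum(exps.values())
    probs = {tok: e / total for tok, e in exps.items()}
    # random.choices performs the weighted draw: at every step the model is
    # literally rolling weighted dice over its vocabulary.
    return random.choices(list(probs), weights=list(probs.values()), k=1)[0]

# Most draws yield "Paris", but nothing prevents an occasional "London":
# a plausible-sounding wrong answer, produced with no intent at all.
print([sample_next_token(logits, temperature=1.0) for _ in range(10)])
```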
Since you’re going to be ridiculously pedantic, it isn’t AI. Start there, where it’s actually potentially useful to make the distinction.
That is why I am pushing back on using these terms!
Sure, it’s not hallucinating in the sense that a human does. But that is the industry name, and it has been in use since 1995 in the study of neural networks:
https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)
Terminology changes, and bad terminology can and should be changed.
Anthropomorphising software only helps the ones pushing it.
Yes, I am fully aware of the term’s origin, and it was clever at the time, but it is important to point out that it is not literally accurate, especially given how LLMs and other AI are being promoted right now. Calling the errors hallucinations, saying the models invent things, or using any other anthropomorphic language helps the companies selling these products deflect from the fact that what is being shoved down our throats are unreliable bullshit generators.