The AI mashed together pieces of information that didn’t belong together in that context and returned something incorrect. It was wrong, but it did not invent anything.
Use of the intentional stance is perfectly justified in this kind of situation.
No.
Yes.
“Hallucinates”
AI doesn’t do that either. That is another example of trying to make AI sound like it has reasoning and intent like a person, instead of the pattern-matching weighted randomizer that it is.
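To make “weighted randomizer” concrete: at each step the model assigns a score to every candidate token, turns those scores into a probability distribution, and draws one token at random in proportion to the weights. A minimal sketch of that sampling step, with made-up tokens and scores (real systems add temperature, top-k, and other refinements):

    import math
    import random

    # Toy next-token step: the model scores candidate tokens (logits),
    # the scores become a probability distribution, and the output is
    # drawn at random in proportion to those weights. Nothing in this
    # step checks whether the chosen token is true.
    logits = {"Paris": 4.2, "Lyon": 1.1, "Berlin": 0.3}  # made-up scores

    # Softmax: turn raw scores into probabilities that sum to 1.
    total = sum(math.exp(v) for v in logits.values())
    probs = {tok: math.exp(v) / total for tok, v in logits.items()}

    # Weighted random draw over the distribution.
    token = random.choices(list(probs), weights=list(probs.values()))[0]
    print(token, probs)

Nothing in that loop models truth or intent; it only models which token is likely to come next.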
Sure, it’s not hallucinating in the sense that a human does. But that is the industry term, and it has been in use in the study of neural networks since 1995:
https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)
Yes, I am fully aware of the term’s origin, and it was clever at the time. But given how LLMs and other AI are currently being promoted, it is important to point out that the term is not literally accurate. Calling these errors hallucinations, saying the models invent things, or using any other anthropomorphic framing lets the companies selling the products deflect from the fact that what is currently being shoved down our throats are unreliable bullshit generators.