There's an amazing quirk of LLMs: whenever I don't know the topic and refuse to google it, they give me some useful answers, but if I ask about something I actually know, the answers are always stupid and wrong. I asked a computer about it, but it said everything is normal and I should buy a better subscription, so there's that.
It's not a quirk of LLMs; it's a quirk of human cognitive biases.
See: Gell-Mann amnesia effect.
That’s the joke, yes.