- cross-posted to:
- [email protected]
For one month beginning on October 5, I ran an experiment: Every day, I asked ChatGPT 5 (more precisely, its “Extended Thinking” version) to find an error in “Today’s featured article”. In 28 of these 31 featured articles (90%), ChatGPT identified what I considered a valid error, often several. I have so far corrected 35 such errors.
No surprise.
Wikipedia ain’t the bastion of facts that lemmites make it out to be.
It’s a mess of personal fiefdoms run by people with way too much time on their hands and an ego to match.
Yeah, better to use grokpedia /s