For one month beginning on October 5, I ran an experiment: Every day, I asked ChatGPT 5 (more precisely, its “Extended Thinking” version) to find an error in “Today’s featured article”. In 28 of these 31 featured articles (90%), ChatGPT identified what I considered a valid error, often several. I have so far corrected 35 such errors.
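For anyone curious what that daily loop might look like in practice, here is a minimal Python sketch. It assumes Wikipedia's REST "featured feed" endpoint and the OpenAI chat completions client; the model name ("gpt-5"), the prompt wording, the helper names (`todays_featured_article`, `ask_for_an_error`), and the choice to send only the article's summary extract rather than the full wikitext are my own placeholder assumptions, not the experimenter's actual setup.

```python
# Sketch of a daily "find an error in Today's featured article" check.
# Assumes: Wikipedia REST feed endpoint, OpenAI Python SDK v1.x,
# OPENAI_API_KEY set in the environment. Model name is a placeholder.
from datetime import date

import requests
from openai import OpenAI

FEED_URL = "https://en.wikipedia.org/api/rest_v1/feed/featured/{:%Y/%m/%d}"


def todays_featured_article() -> tuple[str, str]:
    """Return (title, plain-text extract) of Today's featured article."""
    resp = requests.get(FEED_URL.format(date.today()), timeout=30)
    resp.raise_for_status()
    tfa = resp.json()["tfa"]  # page-summary object for the featured article
    return tfa["titles"]["normalized"], tfa["extract"]


def ask_for_an_error(title: str, text: str) -> str:
    """Ask the model to point out a factual error in the article text."""
    client = OpenAI()
    prompt = (
        f"The following is from Wikipedia's featured article '{title}'.\n"
        "Identify any factual error it contains and explain why it is wrong.\n\n"
        + text
    )
    reply = client.chat.completions.create(
        model="gpt-5",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return reply.choices[0].message.content


if __name__ == "__main__":
    title, extract = todays_featured_article()
    print(ask_for_an_error(title, extract))
```

Any flagged claim would of course still need to be checked against the cited sources before editing the article.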


It’s mostly outsourcing attention, which is pretty acceptable for a large project like Wikipedia.
That’s my main use for LLMs.
Personally, I have separate linters, formatters, and structure checkers that don’t raise the temperature of my apartment when in use, but you do you.
Right - I won’t call it a good thing to let people’s reading comprehension skills atrophy, but they’re donating their labour to a public benefit! I’m hardly going to scold them as if I were their professor.
My thought is mainly that there aren’t enough hours in the day to read and check everything on Wikipedia. There’s a reason the Scots Wikipedia vandalism went unnoticed for so long: people just don’t have the time.