For one month beginning on October 5, I ran an experiment: Every day, I asked ChatGPT 5 (more precisely, its “Extended Thinking” version) to find an error in “Today’s featured article”. In 28 of these 31 featured articles (90%), ChatGPT identified what I considered a valid error, often several. I have so far corrected 35 such errors.
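
A minimal sketch of how a daily check like this could be automated, assuming the Wikimedia REST "featured feed" endpoint, the Action API's plain-text extracts, and the openai Python client. The author ran the experiment by prompting ChatGPT manually; the model name and prompt wording below are assumptions, not the article's actual setup.

```python
# Sketch: fetch "Today's featured article" and ask a model to flag one likely error.
from datetime import date

import requests
from openai import OpenAI  # pip install openai requests


def todays_featured_article() -> tuple[str, str]:
    """Return (title, plain-text extract) for today's featured article."""
    d = date.today()
    feed = requests.get(
        f"https://en.wikipedia.org/api/rest_v1/feed/featured/{d:%Y/%m/%d}",
        timeout=30,
    ).json()
    title = feed["tfa"]["title"]
    # Plain-text extract of the article via the Action API (TextExtracts).
    query = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "prop": "extracts",
            "explaintext": 1,
            "titles": title,
            "format": "json",
        },
        timeout=30,
    ).json()
    page = next(iter(query["query"]["pages"].values()))
    return title, page.get("extract", "")


def ask_for_one_error(title: str, text: str) -> str:
    """Ask the model to quote one suspect sentence and explain the problem."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model="gpt-5",  # assumption; substitute whatever model you have access to
        messages=[
            {"role": "system", "content": "You are a careful Wikipedia fact-checker."},
            {
                "role": "user",
                "content": f"Find one likely factual error in the Wikipedia article "
                           f"'{title}'. Quote the sentence and explain the problem:\n\n"
                           f"{text[:50000]}",
            },
        ],
    )
    return resp.choices[0].message.content


if __name__ == "__main__":
    title, text = todays_featured_article()
    print(title)
    print(ask_for_one_error(title, text))
```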

  • lime!@feddit.nu · 11 points · 9 hours ago

    it’s mostly outsourcing attention, which is pretty acceptable for a large project like wikipedia.

    • Bldck@beehaw.org · 1 point · 3 hours ago

      That’s my main use for LLMs:

      • I write the code logic, the main argument points, etc.
      • let the LLM lint, format and structure the discussion (a rough sketch of this step follows below)
      • I provide another round of copy editing, styling and other updates
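
A hypothetical illustration of the "let the LLM lint, format and structure" step: send a human-written draft to a chat model with instructions to clean up structure only. It assumes the openai Python client; the model name and prompt wording are assumptions, not the commenter's actual setup.

```python
from openai import OpenAI  # pip install openai


def tidy_draft(draft: str) -> str:
    """Ask the model to restructure a draft without changing its claims."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model="gpt-5",  # assumption; substitute any chat model you have access to
        messages=[
            {
                "role": "system",
                "content": (
                    "Reorganise and format the user's draft for clarity. "
                    "Do not add, remove, or alter any claims or code logic. "
                    "Return only the revised text."
                ),
            },
            {"role": "user", "content": draft},
        ],
    )
    return resp.choices[0].message.content


if __name__ == "__main__":
    print(tidy_draft("rough notes: LLMs are fine for formatting. I still write the logic myself."))
```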
      • lime!@feddit.nu · 1 point · 3 hours ago

        personally i have separate linters, formatters and structure markers that don’t raise the temperature of my apartment when in use, but you do you.

    • Brave Little Hitachi Wand@feddit.uk · 9 points · edited · 8 hours ago

      Right - I won’t call it a good thing to let people de-skill on reading comprehension, but they’re donating their labour to a public benefit! I’m hardly going to scold them as if I was their professor.

        • lime!@feddit.nu · 7 points · 8 hours ago

        my thought is mainly that there aren’t enough hours in the day to read and check everything on wikipedia. there’s a reason the scots vandalism went unnoticed for so long: people just don’t have the time.