Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

    • gerikson@awful.systems · 16 hours ago

      I mean it’s still just funny money, seeing as the creator works for some company that resells Claude tokens, but very few people are stepping back to note the drastically reduced expectations for LLMs. A year ago, it would have been plausible to claim that a future LLM could design a language from scratch. Now we have a rancid mess of slop, it’s an “art project”, and the fact that it’s ersatz internally coherent is treated as a great success.

      Willison should have just let this go, because it’s a ludicrous example of GenAI, but he can’t help himself defending this crap.

    • istewart@awful.systems · 16 hours ago

      Top-tier from Willison himself:

      The learning isn’t in studying the finished product, it’s in watching how it gets there.

      Mate, if that’s true, my years of Gentoo experience watching compiler commands fly past in the terminal means I’m a senior operating system architect.

      • froztbyte@awful.systems · 15 hours ago

        which naturally leads us to: having to fix a portage overlay ~= “compiler engineer”

        wonder what simonw’s total spend (direct and indirect) in this shit has been to date. maybe sunk cost fallacy is an unstated/un(der?)accounted part in his True Believer thing?

        • BlueMonday1984@awful.systems (OP) · 14 hours ago

          maybe sunk cost fallacy is an unstated/un(der?)accounted part in his True Believer thing?

          Probably. Beyond throwing a shitload of cash into the LLM money pit, Willison’s completely wrapped his public image up in being an AI booster, having spent years advocating for AI and “learning” how to use it.

          If he admits he’s wrong about LLMs, he has to admit the money and time he spent on AI was all for nothing.

          • flere-imsaho@awful.systems · 4 hours ago

            he claims he’s taking no llm money with the exception of specific cases, but he does accept api credits and access to early releases, which aren’t payments only if you think of payments in the extremely narrow sense of actual money changing hands.

            this would in no way stand if he were, say, a journalist.

          • David Gerard@awful.systems (mod) · 11 hours ago

            if you call him an AI promoter he cites his carefully organised blog posts of concerns

            meanwhile he was on the early access list for GPT-5

    • blakestacey@awful.systems · 18 hours ago

      Good sneer from user andrewrk:

      People are always saying things like “surprisingly good” to describe LLM output, but that’s like when a 5-year-old stops scribbling on the walls and draws a “surprisingly good” picture of the house, family, and dog standing outside on a sunny day on some construction paper. That’s great, kiddo, let’s put your programming language right here on the fridge.

    • nightsky@awful.systems · 18 hours ago

      Sigh. Love how he claims it’s worth it for “learning”…

      We already have a thing for learning, it’s called “books”, and if you want to learn compiler basics, $14,000 could buy you hundreds of copies of the dragon book.

      • istewart@awful.systems · 13 hours ago (edited)

        $14,000 could probably still buy you a lesser Porsche in decent shape, but we should praise this brave pioneer for valuing experiences over things, especially at the all-important boundary of human/machine integration!

        (no, I’m not bitter at missing the depreciation nadir for 996-era 911s, what are you talking about)

      • froztbyte@awful.systems · 15 hours ago

        I’ve learned so much langdesign and stuff over the years simply by hanging around plt nerds, didn’t even need to spring for a single dragon book!

        (although I probably have a samizdat copy of it somewhere)

    • BlueMonday1984@awful.systems (OP) · 19 hours ago

      That the useless programming language is literally called “cursed” is oddly fitting, because the continued existence of LLMs is a curse upon all of humanity.