Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

  • Architeuthis@awful.systems · 8 points · edited · 1 day ago

    Siskind did a review too, basically giving it the ‘their heart’s in the right place, but… [read AI2027 instead]’ treatment. Then he and Yud go at it a bit in the comments, where Yud comes off as a bitter dick, but their actual disagreements are just filioque shit. Also they both seem to agree that a worldwide moratorium on AI research that will give us time to breed/genetically engineer superior brained humans to fix our shit is the way to go.

    https://www.astralcodexten.com/p/book-review-if-anyone-builds-it-everyone/comment/154920454

    https://www.astralcodexten.com/p/book-review-if-anyone-builds-it-everyone/comment/154927504

    Also notable that apparently Siskind thinks nuclear non-proliferation sorta worked because people talked it out and decided to be mature about it rather than being scared shitless of MAD, so AI non-proliferation by presumably appointing a rationalist Grand Inquisitor in charge of all human scientific progress is an obvious solution.

    • istewart@awful.systems · 5 points · 22 hours ago

      Also they both seem to agree that a worldwide moratorium on AI research that will give us time to breed/genetically engineer superior brained humans to fix our shit is the way to go.

      This century deserves a better class of thought-criminal

    • fullsquare@awful.systems · 5 points · 1 day ago

      assuming that nuclear nonproliferation is gonna hold up indefinitely for any reason is some real fukuyama’s end of history shit

      let alone “because it’s the Rational™ thing to do” — it’s only in the rational interest of already-nuclear states to keep things this way. a couple of states that could make a good case for having a nuclear arsenal and the capability to manufacture one are effectively dissuaded from it by american diplomacy (mostly the nuclear umbrella for allies, and sanctions or fucking with their facilities for enemies). with the demented pedo in chief and his idiot underlings trying their hardest to undo all this, i really wouldn’t be surprised if, say, south korea decides to go nuclear

    • TinyTimmyTokyo@awful.systems · 6 points · 1 day ago

      Yud: “That’s not going to asymptote to a great final answer if you just run them for longer.”

      Asymptote is a noun, you git. I know in the grand scheme of things this is a trivial thing to be annoyed by, but what is it with Yud’s weird tendency to verbify nouns? Most rationalists seem to emulate him on this. It’s like a cult signifier.

      • saucerwizard@awful.systems · 2 points · 22 hours ago

        They think Yud is a world-historical intellect (I’ve seen claims on twitter that he has an IQ of 190 - yeah, really) and that by emulating him a little of the old smartness can rub off on them.

          • Architeuthis@awful.systems · 1 point · 2 hours ago

            It’s possible someone specifically picked the highest IQ that wouldn’t need a second planet earth to make the statistics work.

        • Soyweiser@awful.systems · 2 points · 5 hours ago

          The normal max of an IQ test is ~160, and from what I can tell nobody tests above that, basically because it isn’t relevant. (And I assume testing problems and variance become too big a statistical problem at that level.) Not even sure how rare a 190 IQ would be statistically, prob laughably rare.
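          For what it’s worth, the rarity is easy to back out if you take the usual IQ model at face value (normally distributed scores, mean 100, SD 15) — a quick sketch, with those distribution parameters as the only assumption:

          ```python
          import math

          def iq_rarity(iq, mean=100.0, sd=15.0):
              """Upper-tail probability of scoring >= iq, assuming scores are
              normally distributed with the conventional mean 100 and SD 15."""
              z = (iq - mean) / sd
              # Standard normal upper tail via the complementary error
              # function, which keeps precision for extreme z values
              p = 0.5 * math.erfc(z / math.sqrt(2))
              return p, 1 / p  # probability, and "one in N people"

          for score in (160, 190):
              p, one_in = iq_rarity(score)
              print(f"IQ {score}: p = {p:.3g}, about 1 in {one_in:,.0f}")
          ```

          Under that model 190 is six standard deviations out — roughly one in a billion, so a single-digit headcount for the whole planet, if a test could even resolve it (which, per the above, it can’t).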

      • zogwarg@awful.systems · 4 points · 1 day ago

        It’s also inherently-begging-the-question-silly, like it assumes that the Ideal of Alignment™ can never be reached, only approached. (I verb nouns quite often, so I have to be more picky about what I get annoyed at.)

    • Soyweiser@awful.systems · 9 points · 1 day ago

      Also notable that apparently Siskind thinks nuclear non-proliferation sorta worked because people talked it out and decided to be mature about it

      This is his claim about everything, including how we got gay rights. Real “if all you have is a hammer” stuff.

    • swlabr@awful.systems · 4 points · 1 day ago

      Trying to figure out if that Siskind take comes from a) a lack of criticality and/or an inability to read subtext, b) some ideological agenda to erase the role of violence (threats of violence are also violence!) in how change happens, or both