People connected to LessWrong and the Bay Area surveillance industry often cite David Chapman’s “Geeks, Mops, and Sociopaths in Subculture Evolution” to understand why their subcultures keep getting taken over by jerks. Chapman is a Buddhist mystic who seems rationalist-curious; some people use the term postrationalist.

Have you noticed that Chapman presents the founders of nerdy subcultures as innocent nerds being pushed around by the mean suits? But today we know that the founders of Longtermism and LessWrong all had ulterior motives: Scott Alexander and Nick Bostrom were into race pseudoscience, and Yudkowsky had his kinks (and was also into eugenics and libertarianism). HPMOR teaches that intelligence is the measure of human worth, and that the use of intelligence is to manipulate people. Mollie Gleiberman makes a strong argument that “bednet” effective altruism with short-term measurable goals was always meant as an outer doctrine, preparing people to hear the inner doctrine that building God and expanding across the Universe would be the most effective altruism of all. And there were all the issues within LessWrong and Effective Altruism around substance use, abuse of underpaid employees, and bosses who felt entitled to hit on subordinates. A ’60s rocker might have been cheated by his record label, but that does not get him off the hook for crashing a car while high on nose candy and deep inside a groupie.

I don’t know whether Chapman was naive or creating a smokescreen. Had he ever met the thinkers he admired in person?

  • CinnasVerses@awful.systemsOP · 2 days ago

    Was TPOT a Twitter thing? It seems like LessWrong was all over Tumblr and Twitter.

    Most of us are harmless and just want to explore our special interests, but I don’t think any of our friends fits that description. I don’t think it was just about power games either: Scott Alexander really cares about peddling racist lies, and Yudkowsky seems to build his whole worldview around the idea that he is a world-historical figure (and maybe he is, but a Grigori Rasputin, not an Albert Einstein). So neither the “clueless, losers, sociopaths” model nor the “geeks, mops, sociopaths” model explains what happened to LW or Effective Altruism.

    • JFranek@awful.systems · 2 days ago

      Was TPOT a Twitter thing?

      If I recall correctly, TPOT literally means “That Part Of Twitter”

      • CinnasVerses@awful.systemsOP · 2 days ago

        Chapman’s advice seems pretty good for keeping an indie art scene small and for autistic introverts, rather than big and for normies, but it is no help for realizing that LessWrong and EA are cults founded by bad people with bad goals, with an exoteric doctrine out front and an esoteric doctrine once you are committed.

        • istewart@awful.systems · 1 day ago

          an exoteric doctrine out front and an esoteric doctrine once you are committed.

          What you are describing here is the definition of occultism. There are different lessons for the “inner door” students, and getting there requires buy-in to the group’s differentiating ideas. The Xenu story in Scientology’s OT3 is a galvanizing popular example; the Catholic practice of adolescent confirmation is a more mainstream one that we are more likely to have encountered in daily life. To summarize my spiel above with this context, I would say that Chapman’s problem is that he thought he could replace the harmful occultisms coming to predominate in Silicon Valley and associated spaces with a kinder, gentler, more scientifically informed occultism. It ain’t worked yet; you gotta give up the whole idea of progressing to a “higher level” or “deeper truth.”

          • CinnasVerses@awful.systemsOP · 1 day ago

            occultism

            Another common example for Americans is “milk before meat” among the Latter-day Saints. The paper by Gleiberman above lays out how, once you are committed to the ideas that altruism should be as effective as possible and that your intuitions about what is effective are not trustworthy, the Longtermists pull you into a dark alley where their friend Pascal is waiting to mug you (although longtermist EA never received a majority of EA funding). It’s all as sad as when I am trying to have a factual conversation with Americans online and they try to convert me to pseudoscientific racism.

          • froztbyte@awful.systems · 2 days ago

            I ran across it when it was still pitching “cozy twitter”, but rapidly also saw that a lot of that was driven by some of the homesteader and natalist types. Then, after a few rounds of looking into some histories and bigger posters, I started backing away… I can imagine it’s more mask-off now.