I should probably mention that this person went on to write other comments in the same thread, revealing that they’re still heavily influenced by Bay Area rationalism (or what one other commenter brilliantly called “ritual multiplication”).
The story has now hopped to the orange site. I was expecting a shit-show, but there have been a few insightful comments from critics of the rationalists. This one from “rachofsunshine” for instance:
[Former member of that world, roommates with one of Ziz’s friends for a while, so I feel reasonably qualified to speak on this.]
The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.
As relevant here:
While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, drawn to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to…
Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is “justified” to prevent a speck of dust in the eye of eternity. When the thing you’re trying to create is infinitely good or the thing you’re trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.
Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of “anyone who would criticize us for any reason is a bad person who is lying to cause us harm”. That kind of framing can’t help but get culty.
The nature of being a “freethinker” is that you’re at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you’ll get stuck in it, because there’s no external “drag” or forcing function to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you’ve got a culty environment that is particularly susceptible to internally-consistent madness, and finally:
It’s a bunch of very weird people who have nowhere else they feel at home. I totally get this. I’d never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There’s some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, then make two-thirds of the world hate your guts more than anything else, and you might be pretty vulnerable to whoever will give you the time of day, too.)
TLDR: isolation, very strong in-group defenses, logical “doctrine” that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz’s group is only one of several.
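The commenter’s point about lossy implications is easy to make concrete with a toy calculation (the 0.99-per-step figure and the independence assumption are mine, purely for illustration):

```python
# Toy illustration (not the commenter's numbers): even "almost certain"
# steps erode confidence quickly when chained. If each implication holds
# with probability 0.99 and the steps are independent, the probability
# that the whole chain holds is 0.99 raised to the number of steps.
p_step = 0.99
for n in (1, 10, 50, 100):
    p_chain = p_step ** n
    print(f"{n:3d} steps: P(chain holds) = {p_chain:.3f}")
```

A 50-step argument built from 99%-confident links is barely better than a coin flip in the aggregate, which is exactly the failure mode of treating lossy implications as lossless.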
As someone who also went to university in the late 80s and early 90s, I didn’t share his experiences. This reads like one of those silly shaggy-dog stories where everyone says sarcastically afterwards: “yeah that happened”.
Damn. I thought I was cynical, but nowhere near as cynical as OpenAI is, apparently.
As anyone who’s been paying attention already knows, LLMs are merely mimics that provide the “illusion of understanding”.
I’m noticing that people who criticize him on that subreddit are being downvoted, while he’s being upvoted.
I wouldn’t be surprised if, as part of his prodigious self-promotion of this overlong and tendentious screed, he’s steered some of his more sympathetic followers to some of these forums.
Actually it’s the wikipedia subreddit thread I meant to refer to.
So now Steve Sailer has shown up in this essay’s comments, complaining about how Wikipedia has been unfairly stifling scientific racism.
Birds of a feather and all that, I guess.
You know the doom cult is having an effect when it starts popping up in previously unlikely places. Last month the socialist magazine Jacobin had an extremely long cover feature on AI doom, which it bought into completely. The author is an effective altruist who interviewed and took seriously people like Katja Grace, Dan Hendrycks and Eliezer Yudkowsky.
I used to be more sanguine about people’s ability to see through this bullshit, but eschatological nonsense seems to tickle something fundamentally flawed in the human psyche. This LessWrong post is a perfect example.
Eats the same bland meal every day of his life. Takes an ungodly number of pills every morning. Uses his son as his own personal blood boy. Has given himself a physical appearance that can only be described as “uncanny valley”.
I’ll never understand the extremes some of these tech bros will go to deny the inevitability of death.
Happy Valentine’s Day everybody!
It’s kind of fascinating how rotten the “New Atheist” movement turned out to be. Whether it’s Richard Dawkins revealing his inner racist-misogynist, Michael Shermer being rapey AF, or James Lindsay turning into a Christofascist, the movement seems to have spawned and/or revealed a lot of really problematic people. I guess it’s no surprise that the rationalist scene had such a membership overlap.
I haven’t read Scott’s comment sections in a long time, so I don’t know if they’re all this bad, but that one is a total dumpster fire. It’s a hive of Trump stans, anti-woke circle-jerkers, scientific racists, and self-proclaimed Motte posters. It certainly reveals the present demographic and political profile of his audience.
Scott has always tried to hide his reactionary beliefs, but I’ve noticed he’s letting the mask slip a bit more lately.
It’s absolutely bizarre that Scott labels Rufo a journalist. Rufo is a right-wing activist who has only ever worked for right-wing think tanks. He first came to my attention as a part of the Discovery Institute, a Seattle-based “think tank” best known for promoting creationism.
Then again, Scott has previously said that he’s impressed by the arguments of creationist Michael Behe, another Discovery Institute lackey.
Glowfic feels like a writing format designed in a lab to be the perfect channel for Eliezer’s literary diarrhea.
My P(harassment scandal) for EA is 0.98.
Exactly. It would be easier to take Scott’s argument seriously if it weren’t coming from the very same person who previously labeled as unstable and thereby non-credible a woman who accused his rationalist buddies of sexual harassment – a woman who, by the way, went on to die by suicide.
So fuck him and his contrived rationalizations.
Imagine thinking there is actually some identifiable thing called “white culture”. As if a skin color defines a culture.
Yeah, sounds like a Nazi.
I think in their minds, there is this magical threshold below which all the brown and disabled people live, and once you get rid of all the people residing below that threshold all you have left is smart people who want to make the world better.
Only an EA could take seriously someone who approvingly cites journals like “Mankind Quarterly” and crackpots like Richard Lynn, Steven Hsu, Jonathan Anomaly, and Emil Kirkegaard.
The author considers himself a “rationalist of the right” and a libertarian who enjoys Richard Hanania and Scott Alexander. He describes ten tenets of right-wing rationalism, eight of which are simply rephrasings of various ideas promoted by scientific racists. It would be an understatement to say this guy is monomaniacally focused on a single topic.
(Oh, and he publishes his brain farts on Substack. Because of course he does.)
Lots of discussion on the orange site post about this today.
(I mentioned this in the other sneerclub thread on the topic but reposted it here since this seems to be the more active discussion zone for the topic.)