(I will appreciate if you NEVER TELL ANYONE I SAID THIS, not even in confidence. And by “appreciate”, I mean that if you ever do, I’ll probably either leave the Internet forever or seek some sort of horrible revenge.)
Taken literally, this seems like kind of a fucked up thing to say to a friend. Or a stranger. Anyone really. Why would you say this? Why would you write this in an email and then send it, on purpose, under any circumstance?
scott clearly thought that it was important to get that message out. idk what precisely happened there, but i’ll risk a guess that perhaps scott thought that he found a partner in crime, so to speak, and secrecy would help them both. an adversary would just use the info as is. maybe the biggest thing scott could get in terms of blackmail was a flimsy “okay, but you are into this thing too”, which won’t be effective in all cases, or maybe he didn’t even have that
if one person came out and spilled the beans, it’d suggest that there might be more people who didn’t
[omitted a paragraph psychoanalyzing Scott]
I don’t think that he was trying to make a threat. I think that he was trying to explain the difficulties of being a cryptofascist! Scott’s entire grey-tribe persona collapses if he ever draws a solid conclusion; he would lose his audience if he shifted from cryptofascism to outright ethnonationalism because there are about twice as many moderates as fascists. Scott’s grift only continues if he is skeptical and nuanced about HBD; being an open believer would turn off folks who are willing to read words but not to be hateful. His “appreciat[ion]” is wholly for his brand and revenue streams.
This also contextualizes the “revenge”. If another content creator publishes these emails as part of their content then Scott has to decide how to fight the allegations. If the content is well-sourced mass-media journalism then Scott “leave[s] the Internet” by deleting and renaming his blog. If the content is another alt-right crab in the bucket then Scott “seek[s] some sort of horrible revenge” by attacking the rest of the alt-right as illiterate, lacking nuance, and unable to cite studies. No wonder he doesn’t talk about us or to us; we’re not part of his media strategy, so he doesn’t know what to do about us.
In this sense, we’re moderates too; none of us are hunting down Scott IRL. But that moderation is necessary in order to have the discussion in the first place.
if one person came out and spilled the beans, it’d suggest that there might be more people who didn’t
I mean, after his full throated defense of Lynn’s IQ map (featuring disgraced nazi college dropout Cremieux/TP0 as a subject matter expert) what other beans might be interesting enough to spill? Did he lie about becoming a kidney donor?
I think the emails are important because a) they make a case that for all his performative high-mindedness and deference to science and whinging about polygenic selection he came to his current views through the same white supremacist/great replacement milieu as every other pretentious gutter racist out there and b) he is so consistently disingenuous that the previous statement might not even matter much… he might honestly believe that priming impressionable well-off techies towards blood and soil fascism precursors was worth it if we end up allowing unchecked human genetic experimentation to come up with 260IQ babies that might have a fighting chance against shAItan.
I guess it could come out that despite his habit of including conflict of interest disclosures, his public views may be way more for sale than is generally perceived.
and it was 2014
if one person came out and spilled the beans, it’d suggest that there might be more people who didn’t
keep in mind these mails have been disclosed for years by now - at least a few of them will have learned a lesson and gotten more careful
(I’m constantly glad that as many of them do not)
I’m reasonably certain worse emails exist but they’re with sympathetic individuals.
Even just transcribing these and hosting them is extremely helpful, thanks.
This was an excellent read if you’re aware of the emails but never bothered to read his citations or to dig into what the blather about object-level and meta-level problems was specifically about, which is presumably most people.
So, a deeper examination of the email paints 2014 Siskind as a pretty run-of-the-mill race realist who’s really into “black genes are dumber, you guys” studies and who thinks that higher education institutions not taking those studies seriously means they are deeply broken and untrustworthy, especially with anything to do with pushing back against racism and sexism. Oh, and he is also very worried that immigration may destroy the West, or at least he gently urges you to get up to speed with articles coincidentally pushing that angle, and to draw your own conclusions based on pure reason.
Also it seems that in private he takes seriously stuff he has already debunked in public, which makes it basically impossible to ever take anything he writes in good faith.
@Architeuthis @dgerard “…impossible to ever take anything he writes in good faith.”
See also this unguarded moment from Tumblr. All the alpha is in bad faith social engineering!
https://www.reddit.com/r/SneerClub/comments/9lj3g7
I wonder if this is just a really clumsy attempt to invent stretching the overton window from first principles or if he really is so terminally rationalist that he thinks a political ideology is a sliding scale of fungible points and being 23.17% ancap can be a meaningful statement.
That the exchange of ideas between friends is supposed to work a bit like the principle of communicating vessels is a pretty weird assumption, too. Also, if he thinks it’s ok to admit that he straight up tries to manipulate friends in this way, imagine how he approaches non-friends.
Between this and him casually admitting that he keeps “culture war” topics alive on the substack because they get a ton of clicks, it’s a safe bet that he can’t be thinking too highly of his readership, although I suspect there is an esoteric/exoteric teachings divide that is mostly non-obvious from the online perspective.
So, by that token, if hypothetically you think that the Nazis got a few things right (not the war, racism or genocide, of course, or even the degenerate art, but maybe, say, the smoking bans and well-paved roads and perhaps the odd Wagnerian opera), the way to convince people is to start ranting about blood and soil and the need to exterminate the Untermenschen and wait for the nice normie liberals to politely meet you part of the way?
In his early blog posts, Scott Alexander talked about how he was not leaping through higher education in a single bound (he went overseas for medical school, and failed to get a medical residency on his first try, ending up in a small Midwestern city). So I wonder why he is sure that in a world with fewer university degrees, he would have gotten as far as he did (medical schools in the USA used to limit admissions from people of his ethnicity).
Likewise with immigration restrictions: he knows that they often blocked Jews, many Europeans, and East Asians, not just brown people, right?
In his early blog posts, Scott Alexander talked about how he was not leaping through higher education in a single bound
He starts his recent article on AI psychosis by mixing up psychosis with schizophrenia (he calls psychosis a biological disease), so that tracks.
Other than that, I think it’s ok in principle to be ideologically opposed to something even if you and yours happened to benefit from it. Of course, it immediately becomes iffy if it’s a mechanism for social mobility that you don’t plan on replacing, since in that case you are basically advocating for pulling up the ladder behind you.
He starts his recent article on AI psychosis by mixing up psychosis with schizophrenia (he calls psychosis a biological disease), so that tracks.
wait, this man is a psychiatrist? or is that another scott
Yes, Scott Alexander is an unusual rationalist blogger who had a credentialed professional career as a psychiatrist. After Substack became his patron, he opened his own medical practice, but the website has said “not accepting new patients at this time” since 2022. So he seems to live off gifts from fellow travelers with a side hustle in psychiatry.
i’ll risk a guess that running a ritalin-dispenser-as-a-service type business catering to overly confident rationalists might get him a pretty penny
Reading his adderall article I couldn’t help but think that this guy is handing scripts to everyone in the Bay Area
That is very possible, although I would guess that was earlier in his career, given that he does not advertise as treating ADHD or similar. He has two small children, a writing job, and side projects like writing end-of-the-world stories for AI 2027. His practice has a name drawn from Lord of the Rings like other things in the Thielsphere.
His practice has a name drawn from Lord of the Rings like other things in the Thielsphere.
fuckin lol, I had not spotted this, what a tell
he had a blogpost about how amphetamine risks are overstated and it’s actually fine for more people than usually get prescribed it https://slatestarcodex.com/2017/12/28/adderall-risks-much-more-than-you-wanted-to-know/
but look, i liked this article,
I’m midway through this and this part stood out to me, this is part of the email that was written by S.Al, edited for length:
Compare RationalWiki and the neoreactionaries. […] Almost nothing they say is outrageously wrong, but almost nothing they say is especially educational to someone who is smart enough to have already figured out that homeopathy doesn’t work […] they fit exactly into my existing worldview without teaching me anything new
The Neoreactionaries provide a vast stream of garbage with occasional nuggets of absolute gold in them. Despite considering myself pretty smart and clueful, I constantly learn new and important things […] from the Reactionaries. Anything that gives you a constant stream of very important new insights is something you grab as tight as you can and never let go of.
The garbage doesn’t matter because I can tune it out.
“Rational Wiki presents an understanding of reality that I mostly agree with and is thus boring. I want some spicy takes so I’m going to go suck on the firehose of reactionary diarrhoea, but it’s ok because my throat game is too good to let any of it through” type shit
“Rational Wiki presents an understanding of reality that I mostly agree with and is thus boring. I want some spicy takes so I’m going to go suck on the firehose of reactionary diarrhoea, but it’s ok because my throat game is too good to let any of it through” type shit
All time banger
<3