Thanks for the context; I didn't see that information on the website at all. Generally, I think it's just good form to let people know how and to what ends their results will be used, but of course, we know that's not necessarily how things work.
The site mentions it's from ACX at the top. The results on this version are probably not being collected, though; it's just for fun. The original ACX posting was a Google Form, so when the site says "X% of people get this wrong," it's most likely referring to the population polled at that time.
…no? There’s a link mentioning Scott Alexander which does lead to ACX, but without any background or context. It’s a fairly big assumption that others know what “ACX” is and how SA is connected to it.
I didn’t connect the dots immediately, but turns out I tried reading Unsong back in the day. I quit when I realised Alexander’s ties to rationalist and effective altruist thought 🤷
Oh, sorry, I misremembered the link as saying ACX, but you’re right, it does say Scott Alexander.
Loved Unsong; I'm sorry you've had bad experiences with rationalists/EAs. :/ Yudkowsky is pretty weird and egotistical, but I still enjoyed his writing. Here is a good essay defending EA; it basically reminds you that, fundamentally, EA is about convincing rich people to donate to life-saving charities, and I don't really see why that would be harmful. The biggest criticism of EA I see is "well, there shouldn't be rich people!" – like, I agree, but how is that relevant?
Different essay, but I love this quote:
Something else happened that month. On November 11, FTX fell apart and was revealed as a giant scam. Suddenly everyone hated effective altruists. Publications that had been feting us a few months before pivoted to saying they knew we were evil all along. I practiced rehearsing the words “I have never donated to charity, and if I did, I certainly wouldn’t care whether it was effective or not”.