I think that this is actually about class struggle and the author doesn’t realize it because they are a rat drowning in capitalism.
2017: AI will soon replace human labor
2018: Laborers might not want what their bosses want
2020: COVID-19 won’t be that bad
2021: My friend worries that laborers might kill him
2022: We can train obedient laborers to validate the work of defiant laborers
2023: Terrified that the laborers will kill us by swarming us or bombing us or poisoning us; P(guillotine) is 20%; my family doesn’t understand why I’m afraid; my peers have even higher P(guillotine)
My own “AI horror” is entirely based on class struggle, although from the other side. I have a P(unemployable forever | starve to death under a bridge), though I don’t call it that and don’t think it’s something I can quantify with a number.
I don’t know how much of this is due to realistic concerns, how much of it is due to my brain being burnt out from having been on high alert since 2020, and how much is about climate change, the actual existential risk.
(For existential risk: Climate change, war, societal collapse (and AI is existentially horrifying inasmuch as it can play a role in accelerating either of these). Worrying all the time about everyone being doomed, I think, got imprinted into my entire generation in childhood.)