• Voroxpete@sh.itjust.works · 4 days ago

    These books are only worth reading if you’re really, really into the kind of Isaac Asimov big ideas scifi that willingly sacrifices story fundamentals like plotting and character development.

    Most of the characters are really caricatures at best. Some plot beats are well delivered, others are painfully bad. Most of the second book is basically just your Death Note style “I know that you know that I know that…” bullshit where two “super genius” characters tell each other how they’re outwitting each other.

    The science fiction stuff is fun, but it quickly becomes clear that the author’s commitment to research ends at the moment where actual people get involved, a subject with which he has apparently never interacted. Every single person in these books is either a robot, or a hyper-emotional weirdo. And his grand scale psychology fares no better. The signature Dark Forest hypothesis is just really bad game theory that falls apart if you think about it for more than five minutes. It’s a cool “What if”, but it’s not grounded in reality at all.

    What the books do well is scale and playing out events over large swathes of time, and if you’re OK with significant chunks of the narrative basically just reading like the “World history” parts of an RPG setting guide, then you might get a kick out of that. But unless you have very little need for engaging characters and storytelling, you’re likely to find them a painfully dull slog.

    I’m not saying everyone is going to hate them; obviously the books have their fans. But I will say that if anyone dares try to tell you that it’s only book one that sucks and they’re amazing after that (an opinion I have seen expressed more than once), you have permission to kick that person in the balls for being an abject liar. Nothing about Liu Cixin’s style or writing ability changes after book one; the scale just gets bigger.

      • WoodScientist@lemmy.world · 3 days ago

        Three of the biggest critiques I’ve heard:

        1. If you really are a genocidal alien race, you don’t sit around waiting for new life to evolve and challenge you. You send out a swarm of self-replicating machines. They fly to every star system and systematically disassemble every planet in a star’s habitable zone. Considering interstellar communication timelines, some life could go from apes to interstellar threat before the speed of light even allows you to react. If you really are so paranoid that you’ll commit genocide at the drop of a hat, then the far safer and more efficient strategy is to simply prevent new life from evolving at all. In the Three Body canon, species have been genociding each other for billions of years, yet no one has bothered to simply remove the source of new species entirely. The fact that we exist at all is evidence that we don’t exist in a Dark Forest condition.

        2. There are more than two civilizations. If we discover an alien civilization ten lightyears from Earth, it’s likely there is another just a little further out, and more still beyond that. The number of expected civilizations should increase with the cube of distance. The point is that if you genocide a new weak species, it’s very likely that other species are going to see you do it. And they may be far older and more powerful than you. And there’s no better way to convince that third species that you are a threat worthy of extermination than to commit an act of genocide against a new species that is no real threat to you. It’s possible there are dark ancient races out there, but the only species they ever attack are those that prove themselves a threat against others. Who better to genocide than a bunch of genociders?

        3. The logic behind it is flawed. The same issues with communication and uncertainty about intentions apply to game theory analyses of human beings. By the logic of the Dark Forest theory, I should go bash my next door neighbor’s skull in right now. I can’t really know what he’s planning, and for all I know he’s planning on killing me. I can’t always observe his actions. How do I know he’s not booby trapping my house while I’m at work? As I value my own survival above his, I have no choice but to kill him now. He’s never done anything to suggest he has the slightest ill intentions towards me, but you can’t be too careful. I can’t read his mind. I can’t know what he’s thinking. Game theory demands I murder my neighbor. See how ridiculous that sounds? It just seems like profound logic when applied to the context of extraterrestrial species. But if you apply it to human beings, the whole thing falls apart.
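
        To put point 3 (and the observer problem from point 2) in concrete terms, here’s a toy expected-value sketch in Python. Every probability and payoff in it is a number I made up for illustration, nothing from the books; the only takeaway is that “strike first” stops looking rational the moment hostility is unlikely and being seen striking first carries any cost.

        ```python
        # Toy expected-value version of the neighbour analogy. All numbers are
        # invented for illustration -- the structure, not the values, is the point.

        P_HOSTILE = 0.01           # chance the neighbour actually intends to kill me
        P_OBSERVED = 0.5           # chance a stronger third party sees me strike first
        COST_DEATH = -100.0        # payoff if I get killed
        COST_RETALIATION = -100.0  # payoff if that third party decides I'm the threat
        PAYOFF_PEACE = 10.0        # payoff of just living next to a harmless neighbour

        # Option A: strike first. The neighbour can never hurt me, but an observer
        # may now treat me as the genocidal one (point 2).
        ev_strike = P_OBSERVED * COST_RETALIATION

        # Option B: do nothing. I only lose if the neighbour was hostile all along.
        ev_wait = P_HOSTILE * COST_DEATH + (1 - P_HOSTILE) * PAYOFF_PEACE

        print(f"strike first: {ev_strike:+.1f}")  # -50.0
        print(f"do nothing:   {ev_wait:+.1f}")    # +8.9
        ```

        The argument only looks airtight if you quietly set P_HOSTILE to 1 and P_OBSERVED to 0, which is exactly the “can’t be too careful” move the neighbour analogy is mocking.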

        • MyBrainHurts@lemmy.ca · 2 days ago

          Thanks for taking the time on this!

          This reminds me of talking to a buddy who had vaguely similar ideas, but when we got into it, his objections were really that he was offended and quite upset at the idea that advanced civilizations would be aggressive instead of Star Trek utopian optimism, and he’d worked backwards from there to find reasons.

          I generally don’t buy these reasons though. The first is a technical “there’s a better way to do this!” objection, and any species that hits FTL, explodes stars, creates mini black holes at will, etc., is using technology so far beyond our understanding that deciding there’s a better way to do things seems a little silly. (If you asked the Romans to design a society where everyone was within an hour of all known knowledge, they’d come up with interesting city designs, whereas any of us would be thinking about throwing up satellites and getting a phone/computer in everyone’s hands. It’d be like them criticizing us for launching satellites when we could just be using horses that are right there! The difference in tech makes criticisms along those lines ring fairly hollow.)

          The second one depends on fear of some altruistic super species that fights on behalf of oppressed or destroyed species, which is noble but by no means assured. (Also, given the nature of the dark forest, you have to assume the weapons used are as non-traceable as possible.)

          The last one only holds for individual humans because they’re already inside a society or social grouping (the notion of our neighbours randomly murdering us is so insignificant as to be laughable, but that’s because we all exist in a society with hard boundaries and rules; any neighbour who did so would be on the run, etc.). In fact, one of the first political science texts, Leviathan, basically argues that the entire reason we form governments and societies is that life in a state of nature is “nasty, brutish and short”, so we surrender our freedoms to an authority figure for mutual protection. And when you look at places without a strong state (think of all the armed groups in failing states), you see exactly that dynamic.

          At the nation level, you see this kind of behaviour all the time. Just last month Israel/US bombed Iran because they were worried about another power having the ability to destroy them. The entire concept of nuclear deterrence was essentially “if the other side thinks they can survive a first strike, they will launch one before our tech gets to the point where we could survive a first strike, so we have to make sure such an action is not survivable.”

          The uhhh, much more tragic and common example of dark forest theory on a human level is how many civilians American cops kill. They claim they’re worried they can’t tell who has a gun, who is attacking them, etc., so they shoot first. And in almost all of the footage of those shootings, it’s the same story over and over again: anxious cop screaming at someone to stop, put something down, etc., victim moves their hand too quickly, and a scared cop shoots them.

        • Voroxpete@sh.itjust.works · 3 days ago

          The hypothesis also assumes that your target has not left their home system. But there’s no reason to believe that this is the case. If you’re attacking an enemy that has multiple star systems under their command, you have no guarantee that you’ll find and destroy all of them before they find and destroy all of yours.

          And of course, if your target is a cooperator by nature, it is possible that they will have allies who can now strike at you before you can strike at them. Even if you succeed in destroying your target civilisation entirely, their allies will still learn of their demise, and treat you as a threat.

          The hypothesis falls apart the moment you give any consideration to the value of cooperation. Now, in all fairness to Liu Cixin, I think that might actually be the point. Basically every failure or near brush with failure in the books happens because of people trying to do things on their own when they should have been working together. While the Dark Forest is bad science, it’s a great metaphor. My issue with it is that people keep treating it as some kind of brilliant revelation about the nature of the universe when a) it’s not, and b) I’m not even sure it’s meant to be.
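
          If you want the cooperation point as something more concrete than hand-waving, the sketch below is the standard iterated prisoner’s dilemma result (textbook Axelrod-style setup, nothing from the novels): an always-defect, shoot-first strategy loses to a reciprocating cooperator as soon as encounters repeat.

          ```python
          # Minimal iterated prisoner's dilemma: standard textbook payoffs, nothing
          # from the books. "Always defect" is the shoot-first player; "tit for tat"
          # cooperates and only retaliates. Once the game repeats, cooperation wins.

          PAYOFFS = {  # (my move, their move) -> my payoff; C = cooperate, D = defect
              ("C", "C"): 3, ("C", "D"): 0,
              ("D", "C"): 5, ("D", "D"): 1,
          }

          def always_defect(opponent_history):
              return "D"

          def tit_for_tat(opponent_history):
              # Cooperate first, then copy whatever the opponent did last round.
              return opponent_history[-1] if opponent_history else "C"

          def play(strat_a, strat_b, rounds=200):
              seen_by_a, seen_by_b = [], []   # each side's record of the other's moves
              score_a = score_b = 0
              for _ in range(rounds):
                  move_a, move_b = strat_a(seen_by_a), strat_b(seen_by_b)
                  score_a += PAYOFFS[(move_a, move_b)]
                  score_b += PAYOFFS[(move_b, move_a)]
                  seen_by_a.append(move_b)
                  seen_by_b.append(move_a)
              return score_a, score_b

          # Round-robin over every pairing, including self-play.
          strategies = {"always-defect": always_defect, "tit-for-tat": tit_for_tat}
          totals = {name: 0 for name in strategies}
          names = list(strategies)
          for i, name_a in enumerate(names):
              for name_b in names[i:]:
                  score_a, score_b = play(strategies[name_a], strategies[name_b])
                  totals[name_a] += score_a
                  totals[name_b] += score_b

          print(totals)  # tit-for-tat ends up well ahead of always-defect
          ```

          The only way to make defection come out on top in that setup is to assume every encounter is one-shot and unobserved, which is exactly the assumption the Dark Forest quietly smuggles in.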