
The Dark Forest by Liu Cixin



The way I read books is pretty weird. Usually, it's not super long reviews that catch my interest. Or medium-long reviews or, really, reviews at all. I could scroll through tons and tons of reviews on Goodreads without once feeling the urge to buy a book. Perhaps this is more about Goodreads reviews than reviews in general, but the positive reviews on Goodreads never get me excited since they're usually just retelling the story. And the bad ones have become less and less useful as I've seen more and more clearly unjust three-star reviews litter the top reviews on the pages of my favorite books. It's very possible these complaints only apply to the Goodreads community, but I still think my method for picking books to read is the best one. What is it, you ask?

Usually… weird, vague, and mysterious one-line recommendations on Hacker News written by random people on the internet.

If you don't already know, users on Hacker News very often post answers to well-defined questions like "What are the most fundamental books on computer science?" all the way up to the impossibly vague: "What books changed the way you think about almost everything?" The threads themselves end up containing thousands of comments with just the name of a book, from thousands of different people, with hundreds of replies to each top-level comment often agreeing that, yes, this particular book does in fact change the way you think about everything. Sometimes the comments can be even more enticing, like this one:

"The Beach - Alex Garland… I'm 42 now, but read this in my late twenties. If you are 18-32 this book will save your life."

I bought The Beach as soon as I read that. How many traditional book reviews have had that kind of effect on you?

All this is to say, when I initially picked up The Three-Body Problem (the first in the trilogy The Dark Forest is a part of) I did it because of a comment on one of these Hacker News threads - so I really had no idea what I was in for. I knew it was Sci-Fi and that's about it. And in the spirit of weird Hacker News style book recommendations, I want to give you one at the start of this review:

The Dark Forest by Liu Cixin.

In all seriousness, if you have even the most remote interest in any of the following keywords, you should stop reading this review and probably just buy the book (that includes you, Scott): Aliens, the Fermi paradox, the Great Filter. This review is going to contain tons of spoilers and I would sincerely hate to deprive anyone who would enjoy this book of the experience I had with it, especially because (if by some miracle this is published) I suspect a lot of people in this audience would be those people.

I.

If you know anything about The Three-Body Problem, it's probably that Obama has apparently read it. The Dark Forest is the second book in The Three-Body Problem trilogy, and The Three-Body Problem is also the name of the first book in the series. Allegedly the series is sometimes referred to as Remembrance of Earth's Past, but I don't think I've met anyone who calls it that. The series is written by Liu Cixin and was first published in Chinese in 2008, then translated into English in 2014.

The Dark Forest is a pretty strange novel, and I mean that in the best possible way. Not because of the story, but actually - yeah, that too. Really though, it's strange because it does something no other Sci-Fi book I've ever read has done. It feels less like a fictional book than a genuine hypothesis about the Fermi paradox. That hypothesis is the dark forest theory of the universe. It's a theory, I get the impression, Cixin is confident in. And most importantly, it is a theory meant to describe the situation in the real world, which also happens to explain the state of affairs in the world of The Three-Body Problem. Let me put it this way: it would not be unfair to describe The Dark Forest as a seven-page journal article you might find on arXiv detailing a new answer to the Fermi paradox, wrapped in a 493-page fictional story.

The dark forest theory itself is both terrifying and surprising, depending on how many answers to the Fermi paradox you've read. Yet I feel as though even if you have read about the general category the dark forest theory falls into, the amount of detail it is described in, and the detail of the world it is presented in, would still be enough to make it terrifying, though maybe less surprising. What's more, the theory carries a weight of on-its-face plausibility that is hard to describe but feels something like reading a theory of equal importance to the theory of evolution, except for some reason you're hearing it through the words of a character in a future China and not in a paper giving evidence for each fact. In this sense, the fictional story serves as something of a case study for the theory itself.

By far the most interesting thing in The Three-Body Problem is what Cixin calls a Sophon. These are proton-sized supercomputers sent to Earth by aliens to stop us from undergoing an intelligence explosion while the aliens themselves travel to Earth to murder us. The Sophons can travel at nearly the speed of light and so arrive long before the alien fleet. The aliens initially send only two. They use these Sophons to interfere with particle accelerators on Earth so we can no longer trust the results coming from them, which is supposed to stop us from making any fundamental progress in particle physics, which in turn means we won't experience any leaps and bounds in our science.

I love the idea of Sophons. You can think of them as invisible spheres with close to human-level intelligence, able to subtly influence the environment at the atomic level to fulfill the will of their masters. There are just so many things you can do with them. In fact, I think they are kind of underutilized in the book and open some plot holes because of how OP they make the aliens. They could just do so much that they don't ever do: Carefully manipulate the memory of computers such that you could never trust the results of any computation again - check. One by one make every president of a country slowly go crazy until the last one quits - check. Crash the stock market a couple of times - check. All of these seem totally within the realm of things Sophons could do. But instead, they mostly just spy on us.

I think it would be a little unfortunate if Obama only read The Three-Body Problem - it's easily my least favorite of the three novels and feels the least important, at least if I'm properly understanding what Cixin and this series care about. Also, if there was anyone The Dark Forest was meant for, I'm guessing it would be, in ascending order of importance: (1) people who could realistically stop Earth from sending messages into space and (2) people who could make it illegal for us to send messages into space (sitting US presidents). But where The Three-Body Problem is typical and fun Sci-Fi, The Dark Forest is just off the wall bananas in every way.

II.

The Dark Forest feels special right from the premise, I love how simple it is.

There are aliens headed toward Earth to kill you. They are much much smarter than you. They are 450 years away.
This is your final exam.
You have ~~60 hours~~ ~~450 years~~ 200 years.
Your solution must at least allow ~~Harry~~ Earth to evade immediate death~~, despite being naked~~…

I'm not saying the stakes are as high as getting the good ending to your favorite fanfic - but they're definitely up there, and this is basically the position the characters are in at the start of The Dark Forest. You have the combined thinking power of the entire Earth to solve a very well-defined problem under similarly well-defined constraints. How do you do it?

Well, it involves a bunch of the craziest plot points I think I’ve ever heard. Seriously, just skim the plot section of the Wikipedia page. You'll find phrases like: "nuclear-equipped Kamikaze space-fighters", "Wallfacers", "Wallbreakers" and of course "escapes by pretending that he holds a dead man's switch which can destroy NYC."

Writing these out brings back memories of what it felt like to read the book, which is simultaneously sheer bewilderment and fascination at how many different story elements are going on. At the same time as there are complicated interstellar prisoner's dilemmas, there is also a geopolitical drama on Earth that involves the UN and various countries trying to decide who is and isn't a war criminal. Assassination plots. Long narratives describing extremely detailed ship-to-ship combat sequences. It's as though there are three or four different genres of books in here, but only one of them is Sci-Fi, and they're all happening at more or less the same time. Oh, and did I mention these are occurring over two hundred years?

Yup, the book is split into three distinct parts with two distinct time periods and in total covers two hundred years. It achieves this by having characters hibernate for long periods and at other times simply skipping forward a dozen years or so. What this means is that not only is this book Sci-Fi and all the other genres I mentioned - it also offers a take on what the future would be like if the Earth were collectively suffering the anxiety of an invading alien armada for two hundred years. These time jumps work surprisingly well, and I was consistently impressed with just how many elements Cixin was able to deal with, only for these then to be amplified by the smaller and larger time jumps.

With all these different genres, plot points and time periods you might think it would be hard to follow the story and you would be… right. Sometimes. This might just be because of my experience reading the book as a Westerner, but it can get a little confusing simply because of how many characters there are across multiple time periods, all interacting with several, often isolated, plots. It can feel like the characters slightly blur together. It's probably not a great sign that the book opens with a Dramatis Personae of all the characters and acronyms in the book, which I definitely referred back to multiple times. Unfortunately, the character descriptions were broad enough to tell me that the character I was reading about wasn't the one I had in mind, but not descriptive enough to place them if I had completely forgotten who they were. For example, a typical entry looks like: "Ding Yi: Theoretical physicist." Maybe that's enough for some people to realize who the book is talking about, but it wasn't enough for me.

All that aside, the plot in this book is, overall, great. I really enjoyed how quirky and insane the story gets while staying well within the rules of the world. As an example, I mentioned before that the aliens send these proton-sized supercomputer balls to Earth, which effectively means anything on the planet that is on a computer, in print, or said out loud can be transmitted back to the aliens. This (obviously) makes carrying out any offense or defense against the aliens tough/impossible, because they could be in the room while we are drawing up plans against them, or in the computer where we have the plans saved. Part of me thinks this is the effect of a corner Cixin wrote himself into after The Three-Body Problem. It's almost as though there wasn't supposed to be anything after it - why else would you make the aliens so OP that defeating them becomes virtually impossible, unless you never intended to write about us defeating them?

True or not, the Sophons lead to some bizarre but hilarious situations that are so insane they are fun to read about and I often felt myself doing several mental spit takes during the first part of this book. How would you deal with countering Sophon powered aliens headed to Earth? If you haven't read the book, I can almost guarantee you won’t come to the conclusion the Earth collectively comes to. That is, create Wallfacers:

"These Wallfacers will be granted extensive powers that will enable them to mobilize and exploit a portion of Earth’s existing military resources. As they carry out their strategic plans, the Wallfacers need not make any explanation for their actions and commands, regardless of how incomprehensible their behavior may be. Monitoring and control of the Wallfacer activity will be undertaken by the UN Planetary Defense Council, the sole institution granted the authority to veto Wallfacer commands under the UN Wallfacer Act."

Essentially - designate a handful of humans to come up with their own plan for how to defeat the aliens, then ask them not to tell anyone about their plan so the aliens can't find out what it is by listening to a conversation. And since the Sophons can't read minds, the only people who truly know what their plans are, are the Wallfacers themselves. They are then given license by the UN to pretty much do whatever they want if it serves a plan of defeating the aliens. Almost no one can stop them and say, "hey this is a little strange - are you sure it's a good idea?" Because the Wallfacers can just reply, "all part of the plan." And no one will ever know if they're serious, or just pursuing something ridiculous to fool the aliens into thinking we're doing something silly, only to bring out the true strategy at the last minute. I think it's a reasonable idea except for - oh yes - it leads to a couple of possible crimes against humanity involving nuclear weapons and the planet Mercury.

What do the aliens do in response? What else but ask people from the organization that supports an alien invasion to follow the Wallfacers around until they can figure out their true plan - they are the Wallbreakers. This might seem silly but was actually quite fun to read and I don't mean it as a negative point about the book, just proof that this book is off the wall bananas.

Clearly, despite what the cover of the book will tell you about being "rooted in cutting-edge science," there is not much of that cutting-edge science to be found. The way Sophons are created might as well be magic; even though Cixin tries to come up with a plausible-sounding method for their construction, it just is not rooted in any science I've ever heard of or understood. I don't mean to cherry-pick the one unscientific thing in the book, but generally, realism isn't exactly one of the book's strengths in terms of its storytelling. I suppose you could argue that a lot of the Sci-Fi elements come from real places, but they don't end in real places, and in the case of Sophons, they employ what feels like magic to work. That said, right at the end of the book there are some very clever uses of gravitational waves which strike me as quite realistic, and no one ever goes faster than the speed of light (at least not until the third book). I also don't mean for any of this to count as a negative. But I do know that some people don't like Sci-Fi where unrealistic things happen, so I thought I should mention it: to those of you who value pure scientific realism, well, you might have a lot of issues with this book.

To me, this stuff really isn't a problem at all, and I very much enjoyed even the magic Sophons.

III.

"This is the picture of cosmic civilization. It’s the explanation for the Fermi paradox."

I said earlier that it wouldn't be unfair to describe this book as many pages of plot wrapping a seven-page answer to the Fermi paradox. In fact, I think it would not be unfair to describe the entire trilogy as thousands of pages of plot wrapping a seven-page answer to the Fermi paradox. What is this answer and is it any good?

I'm going to start out by saying that ever since I read The Dark Forest it has become my favorite answer to the Fermi paradox. If I were going to YOLO a bet on the correct answer to the Fermi paradox, well I would put my money on The Dark Forest.

So, what is the theory then?

It's actually not all that complicated. Or at least I thought it wasn't until I started writing this review. Then suddenly I lost all understanding of it. I think, though, at its core this theory really is simple - it can just get confusing in the book because Cixin considers objections and evidence to the theory at the same time he is explaining it. As you're reading the theory it all makes sense, but the second you try explaining it to someone else you sound like an idiot because you're confusing objections with explanations with evidence. I think the theory's apparent simplicity comes from the fact that all it uses to answer the Fermi paradox is some game theory and a few observable facts about the universe. According to Cixin, the reason we don't see aliens or alien transmissions in the galaxy, despite the huge number of life-bearing planets, is not because there are no aliens but rather because the aliens are hiding. While I'm sure this type of answer has already been posed by other people, Cixin's key insight is into why the aliens might want to hide. For Cixin, they are hiding because sending any message into space is really a move in a non-cooperative game, like the prisoner's dilemma, and the move that nets civilizations the most utility is to stay silent.

This is different from what (some of us) might intuitively expect - that no utility is lost by sending a signal and getting nothing back. Perhaps we gain zero utility if we send a signal, hoping to contact another civilization, but don't hear anything back. However, if we did send a signal and got a response back, the payoff would be very high because we would be able to share technology, ideas, and other fun things. Under this assumption, the rational move for all aliens would be to broadcast signals all the time, there's very little to lose if it fails and a lot to gain if it succeeds. So, we look at the sky and wonder why no one else is speaking. In fact, according to dark forest theory, this assumption is incorrect. The loss in utility of sending a signal is much greater than zero and the payoff of making contact with another civilization is actually a disutility!

Or to put it plainly: we have assumed that making contact with aliens will likely lead to positive outcomes, the sharing of technology among other things. But not only does making contact not inevitably lead to positive outcomes - it doesn't even have a good chance of leading to a positive outcome. If anything, it inevitably leads to (very) negative outcomes.

This is because we have made the implicit assumption that alien civilizations are more likely to be benevolent than malicious. Yet what Cixin shows is that civilizations will only very rarely act benevolently, even if they have morals similar to ours and would want the same things as us - for instance, to prosper together.

If you’re familiar with the prisoner's dilemma then you know that all Cixin is talking about is just a specific instance of a general game where the best move for each player, given the information available to them, is to act self-interestedly and hurt the other player. Meanwhile the other player makes the same calculation and comes to the same conclusion. So, both players pursue strategies that harm each other, and both end up losing out. To put it plainly, even though both players are aware of the lose/lose outcome, together they are unable to coordinate with each other such that they shift from the lose/lose outcome to the win/win outcome. Or, in dark forest theory terms, both civilizations pursue a strategy of acting maliciously even though both would prefer acting benevolently.
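Since the payoff structure is doing all the work here, it helps to see it concretely. Below is a minimal sketch of the one-shot game; the specific payoff numbers are my own illustrative assumption, not anything from the book - all that matters is the ordering: exploiting > mutual cooperation > mutual defection > being exploited.

```python
# One-shot prisoner's dilemma with illustrative payoffs.
# payoffs[(my_move, their_move)] = (my_payoff, their_payoff)
payoffs = {
    ("benevolent", "benevolent"): (3, 3),   # win/win: mutual cooperation
    ("benevolent", "malicious"):  (0, 5),   # I'm exploited
    ("malicious",  "benevolent"): (5, 0),   # I exploit them
    ("malicious",  "malicious"):  (1, 1),   # lose/lose: mutual defection
}

def best_response(their_move):
    """Pick the move that maximizes my payoff against a fixed opponent move."""
    return max(["benevolent", "malicious"],
               key=lambda my_move: payoffs[(my_move, their_move)][0])

# Whatever the other civilization does, "malicious" pays more:
assert best_response("benevolent") == "malicious"  # 5 > 3
assert best_response("malicious") == "malicious"   # 1 > 0
```

Because "malicious" is the best response to either move, it is a dominant strategy, and two rational players land on the lose/lose (1, 1) outcome even though (3, 3) is sitting right there.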

It's important to understand that in this kind of game the player's own ethical framework doesn't really affect their moves. Both players or civilizations might be very fine people, or very altruistic civilizations - yet as soon as they enter the game their interests completely change. My favorite example of this occurs on the planet Earth. In this now famous clip of the game show Golden Balls - which I'm sure you've already seen many times - two players manage to escape the lose/lose outcome of the prisoner's dilemma in the most exciting way possible. Somehow this clip makes tangible the fact that both players in this game really are not (as far as I know) evil people and both seem like rather decent people actually. However, as soon as they enter the game their incentives completely change and even though "stealing" might be something I suspect both find morally unpalatable, it becomes the best move for each player to make and takes what feels like a herculean effort of meta-game moves from one of the players to avert the game ending with both players stealing.

Likewise, according to Cixin, the best strategy for civilizations in the universe is to act maliciously even though mutual benevolence would be preferred by all civilizations. This translates to all civilizations keeping quiet in the game of "should we broadcast signals or not" but it also has an even more eerie effect on the related game of "what should we do if we hear signals from a civilization?"

If a civilization detects, through radio waves or otherwise, the presence of another civilization, then just as in the game of "should we broadcast signals or not," it can stay quiet or respond; however, for some civilizations there will also be a third option: 'kill'. Instead of merely staying quiet, they could launch the equivalent of a pre-emptive nuclear strike. According to dark forest theory, acting maliciously here - not only remaining silent but going the further step of wiping out any civilization you detect - yields the highest payoff.

So, if dark forest theory is correct, then civilizations would never broadcast signals in the open, and civilizations which hadn't figured out that this is the best strategy, would be summarily executed by a neighboring civilization that has.


Or in the beautiful words of Cixin:

“[In] this dark forest, there’s a stupid child called humanity, who has built a bonfire and is standing beside it shouting, ‘Here I am! Here I am!’” Luo Ji said.
“Has anyone heard it?”
“That’s guaranteed.”

To summarize: the conclusion, then, is that the universe is like a dark forest. We don't see any signs of life because every civilization is hiding, and they are hiding because it is dangerous to reveal yourself.

"The universe is a dark forest. Every civilization is an armed hunter stalking through the trees like a ghost, gently pushing aside branches that block the path and trying to tread without sound. Even breathing is done with care. The hunter has to be careful, because everywhere in the forest are stealthy hunters like him."

This conclusion depends on several premises, but by far the most controversial - and the one the entire theory hangs on - is that the payoffs of the prisoner's dilemma between civilizations are structured in such a way that civilizations will pursue the strategy of acting maliciously. According to Cixin, the truth of this premise comes down to two facts about the universe: (1) intelligence explosions and (2) the distances of interstellar space.

Cixin observes that the rate at which we have progressed is, in the eyes of the cosmos, an intelligence explosion. Like how an artificial general intelligence might experience a fast or slow takeoff to an artificial superintelligence, we are - on the timescales of the universe - a fast takeoff. The span of time it took for us to go from hunter-gatherers to farmers to modern technology looks almost exactly like an artificial general intelligence quickly turning into an artificial superintelligence, if you are comparing our progress every thousand years or so. Further, Cixin generalizes this observation about us to other civilizations and across time. He seems to claim that most or all civilizations that reach a certain point of technological maturity will continue to grow at this exponential rate unless something stops them. This means that over the course of hundreds, thousands, or millions of years of exponential development, humans could go from where we are now - a civilization it's hard to imagine being very threatening to any aliens worth their salt - to a civilization which might be a legitimate threat.
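The scale of compounding here is easy to underrate, so some back-of-the-envelope arithmetic helps. The 3% annual growth rate below is purely an illustrative assumption of mine (Cixin never gives a number); the point is just what compounding does over interstellar timescales:

```python
# Compound growth over timescales relevant to interstellar contact.
# The 3% annual rate is an illustrative assumption, not a figure from the book.
RATE = 0.03

def growth_factor(years, rate=RATE):
    """How much a capability multiplies after `years` of compound growth."""
    return (1 + rate) ** years

for years in (100, 200, 1000):
    print(f"{years:>5} years -> ~{growth_factor(years):,.0f}x")
```

At 3% a year, a civilization is roughly 19x more capable after a century and over 350x after two - which is why a signal that takes centuries to cross the gap says very little about the sender by the time anyone could act on it.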

How does this influence the incentive for civilizations to pursue malicious moves instead of benevolent ones? Well, that's where the distances of interstellar space come in. If a civilization hundreds or thousands of light years away from Earth detects signals from Earth, it would have to factor in the possibility that we are developing exponentially when deciding whether to respond, stay silent, or kill.

Since it takes so many years for a message to reach Earth, it becomes impossible to establish beforehand whether we are going to act maliciously or benevolently upon receiving that signal. And since we are developing exponentially, while the alien civilization might not have anything to fear from us now (lol, nuclear weapons, is that all?), in a hundred or two hundred or a thousand years we might have developed to the point where we can not only detect that civilization back but also kill them back. So really, if we can continue developing exponentially, we are a future threat.

Together, (1) and (2) make it very hard - if not impossible - to trust that other aliens will act benevolently in any game before you make your move. This is because of what Cixin calls the chain of suspicion. The civilization that detects Earth wants to play benevolent but doesn't know what move we are going to play, and it can't attempt to coordinate, because even though we might not be a threat now, we are far enough away that by the time we receive their message we may well be one - so the aliens can't take a chance on us not being a threat. Likewise, we want to play benevolent but don't know what move the other civilization is going to play. The other civilization wants to play benevolent but also knows that we don't know what move they are going to play. This goes on forever. It's kind of convoluted, but the ultimate effect is that civilizations can't establish trust with each other and thereby can't coordinate a benevolent/benevolent outcome. In fact, I think all the chain of suspicion is really trying to establish is that the prisoner's dilemma we are talking about here is not iterated.

In an iterated version of the game there are multiple rounds, so if a player hurts you, you can hurt them back the next round, and both players will take that into account when deciding which strategies to use. Ideally, if the game is iterated, it could allow players to build trust and help each other if the other player helps them (tit-for-tat), among other strategies. Whereas in the game of Golden Balls, if a player hurts you by stealing you can't hurt them back, so there is no mechanism for building trust. And what Cixin seems to be saying is that the same is true in space - sending or not sending a signal to an alien civilization is a one-round prisoner's dilemma, because you don't have any experience with how that civilization plays the game. If civilizations really can't establish trust, either because the chain of suspicion is correct or just because we accept that sending a message is a non-iterated prisoner's dilemma, then you don't have any reason to believe other civilizations will act benevolently in any of the games you play with them. And your best strategy becomes to hurt the other players because it offers the highest payoff - just like in Golden Balls, where the best strategy is to steal.
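To see why iteration matters, here's a small sketch (payoff numbers again illustrative, with C for cooperate/benevolent and D for defect/malicious) pitting tit-for-tat against itself and against an always-defect player:

```python
# Iterated prisoner's dilemma: illustrative payoffs, C = cooperate, D = defect.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=100):
    """Run the game for `rounds` rounds; each strategy sees the opponent's history."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)
        move_b = strategy_b(history_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa; score_b += pb
        history_a.append(move_a); history_b.append(move_b)
    return score_a, score_b

tit_for_tat = lambda opp: "C" if not opp else opp[-1]  # cooperate, then mirror
always_defect = lambda opp: "D"

# Two tit-for-tat players sustain cooperation for the whole game...
assert play(tit_for_tat, tit_for_tat) == (300, 300)
# ...while in a single round (no future to retaliate in), defection wins:
assert play(always_defect, tit_for_tat, rounds=1) == (5, 0)
```

With repetition, mirroring the opponent's last move sustains cooperation indefinitely; collapse the game to one round and defection pays, which is roughly the position Cixin says civilizations are in.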

With everything taken together, this means that if you are a very technologically advanced civilization, you will firstly make the "malicious" move of not broadcasting any signals in the open, despite the potential payoff you might get if both you and another civilization broadcasted signals. Secondly, if you ever detected signals coming from another planet and had the means to kill them, you would not respond, because even though you might now be ahead of them technologically, this won't always be the case. Nor could you simply stay silent, because the civilization you have detected could one day develop enough to detect your presence and would be incentivized to play a malicious move, since they would not be able to trust that you are benevolent. So your only move becomes the malicious move of killing the civilization.

And that is dark forest theory.

It shouldn't be surprising that the universe in The Dark Forest is a dark forest. But is our own universe one, and what reasons do we have for thinking that it is or is not? I think Cixin's a priori reasoning does seem to follow - that is, his reasoning about how alien civilizations would act, assuming some facts about the physical world are true. If everything Cixin assumes is right, then I find it hard to avoid the conclusion that the universe is a dark forest. However, if there is anywhere to attack dark forest theory, it is in its assumptions about the world. There are at least two assumptions that jump out at me.

The first is that the universe is full of alien civilizations to be afraid of. A lot of the reasoning Cixin makes about civilizations in The Dark Forest is contingent on there being a lot of not just intelligent life in the universe, but life intelligent enough to form civilizations. It certainly could be true that the universe really is as full as Cixin describes it, but this does depend on what the variables in the Drake equation eventually resolve to. If it turns out that some of them are low enough for there to be only one or two civilizations in the Milky Way total, then it's hard to see how dark forest theory could be true even if its game theory were correct.
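For reference, the Drake equation is just a product of seven factors, so the "full universe" premise amounts to a bet on where those factors land. A tiny sketch, with parameter values that are purely illustrative guesses of mine, not settled numbers:

```python
# Drake equation: N = R* * fp * ne * fl * fi * fc * L
# Every parameter value below is an illustrative guess, not a settled figure.
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Expected number of communicating civilizations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# An optimistic fill-in yields a "crowded" galaxy...
crowded = drake(R_star=2, f_p=0.9, n_e=1, f_l=0.5, f_i=0.5, f_c=0.5, L=10_000)
# ...while pessimistic values for the same equation leave it nearly empty.
sparse = drake(R_star=1, f_p=0.5, n_e=0.2, f_l=0.05, f_i=0.01, f_c=0.1, L=1_000)
print(crowded, sparse)
```

Same equation, wildly different galaxies: plausible inputs put the expected number of communicating civilizations anywhere from thousands down to effectively zero, and dark forest theory needs the crowded end of that range.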

Another key assumption is that life, and particularly technological species, keeps growing forever at an exponential rate. This is one of the harder ones to believe. Given how shaky life sometimes feels on Earth, it's hard to imagine how any aliens could be one hundred percent justified in worrying about us evolving into a threatening civilization. Perhaps there is a high probability we'll continue to grow exponentially, but it doesn't feel like a certainty and, in fact, it might not even be that high.

Further, for dark forest theory to be true, it would need to be the case that civilizations that get to where we are usually continue developing exponentially. If not, then an alien civilization could sometimes take a risk and bet that maybe this civilization won't continue to develop, so it can just keep quiet and let them blow themselves up. Or maybe it's even worse than that, and it is much more likely for civilizations to wipe themselves out than to continue developing exponentially, in which case, if there are any stable civilizations in the universe, the game theory just wouldn't work out in favor of them having such a strong preference for us or anyone else being killed.

Given what we know about the many times we've almost been wiped out, whether by an asteroid or a nuclear weapon, I am somewhat skeptical that we will necessarily be more likely to develop exponentially than not, especially over long periods of time. What's more, we might simply stop progressing exponentially and start progressing linearly at some point. If that were the case, the dark forest logic would again break down, and alien civilizations would no longer be as motivated to act maliciously.

One more thing. It's worth noting that if this really is a new solution to the Fermi paradox, one never published on arXiv, we should ask why that is. If it is true that this is the answer, then why does Cixin need to pose it through fictional characters in an off-the-wall plot? This is not to say that reciting the theory through fictional characters makes it false. It's just a little suspicious.

IV.

Before I can end this review I feel like I need to address the elephant in the room: Grabby Aliens. 

If you don't already know - Robin Hanson has recently come out with a new answer to the Fermi paradox, published as a paper on arXiv, which uses math and real facts we know about the world to reach its conclusions. This stands in contrast to other answers, which Hanson sees as based mostly on speculation rather than hard evidence.

I'm not embarrassed to say I don't totally understand Hanson's model of Grabby Aliens (there is a lot of math in the paper), and I chuckled when I saw the following comment on the topic:

I was reading the paper and my wife asks me what I'm reading. I replied "I don't have a clue"

So, all my understanding of Hanson's paper basically comes from Scott Aaronson's blog on the topic and this podcast with Hanson. From what I do understand, Hanson's model of Grabby Aliens is surprisingly similar to dark forest theory - at least in its conclusions. Except, where dark forest theory uses game theory and fiction to answer the Fermi paradox, Hanson uses math and data.

The conclusion Hanson comes to is that there are certain kinds of aliens we can call grabby: any aliens (biological or non-biological) that "[take] over things and [change] the appearance of what they take over," meaning they consistently get bigger and bigger and stop others within the space they occupy from becoming grabby. What this boils down to is that, just as in dark forest theory, if we were ever to encounter a grabby alien, the result would be our end, since we would be transformed into part of its expansion. This might mean our death or our assimilation. In that sense the two models reach a very similar conclusion: both treat (some) aliens as exponentially expanding things that, were we ever to bump into them, would be bad for us. In fact, they would be an existential threat.

However, I think where these models crucially differ, besides their justifications, is the importance of radio signals. The way I see it, deciding whether to broadcast signals just wouldn't matter much if Hanson is right, since grabby aliens will continue to grab whether we alert them to our presence or not. Perhaps on shorter timescales it might matter, but on a long view of the universe, we would be consumed by a grabby alien that isn't us whether we hide or not.

It does feel a little harder to defend dark forest theory after grabby aliens. I certainly feel much more aware of its weaknesses after hearing Hanson talk about how speculative other theories are. Yet, even considering grabby aliens, I can't say I'm ready to give up on the dark forest outright. In fact, as far as answers to the Fermi paradox go, I think dark forest theory falls somewhere in the middle between totally speculative and well supported.

I think that, at the very least, dark forest theory has enough going for it to be taken seriously as an answer - despite being published in a work of fiction. The key serious takeaways for me are that the Sophons are already here - so we should probably start figuring out how to deal with that. But more importantly, we should almost certainly stop broadcasting signals into space specifically to find intelligent life. The risks just feel too high, and the payoff is nowhere near enough. Overall, I'm partially to nearly convinced that dark forest theory is the reason we don't see other intelligent life broadcasting signals in the universe.

Finally, if The Dark Forest and The Three-Body Problem are just wrappers around a novel answer to the Fermi paradox, are the other nine hundred and ninety-three pages of story worth reading? Even though I've seen several comments from people who found the series boring, I just did not feel that way; especially from The Dark Forest onward, the series gets more and more worthwhile as a story. I do see how the first book might put a damper on things, but overall, the series is worth reading. Almost certainly The Dark Forest and the third book, Death's End, are worth reading.