With the release of Mass Effect: Legendary Edition, there is renewed discussion over the choices you’re left with at the end of Mass Effect 3. If you haven’t played, I suggest you do. The series has been one of the most memorable and influential for me. I just finished my first play-through of the Legendary Edition, and I cried just as I did with the original, because my parasocial brain thinks I lost all my good alien friends. Anyhow, here be spoilers and whatnot. The series has been around for a while; don’t read if you want to be surprised. Also, I’m rambly; sorry this isn’t a professional essay.
If you played when the games were originally released, you’ll already know there was a lot of fan outcry over the endings, with some fans even making mods to create their own versions. I’m fine with that, and I understand their motives, but I think the endings BioWare gave us (with one being patched in later) raise interesting questions. At the end of Mass Effect 3, the Reapers are here, attacking planets with advanced civilizations and harvesting their populations so the galactic cycle of evolution and extinction at the hands of the Reapers can begin again. Throughout the cycles, a device (called the Crucible in this cycle) has been developed but never completed as a way to supposedly defeat the Reapers. Well, this cycle did it: the Crucible linked with the Citadel, and now Shepard is here, talking with a representation of the collective minds of the Reapers and the original AI that created the cycles, appearing in the form of a child, one Shepard saw die on Earth and who has been haunting his dreams. (I played a male Shepard.)
The Reapers exist, it says, to prevent the inevitable conflict between organic life and the synthetic life it eventually creates. The intelligence views this outcome as inevitable and concluded that the only way to stop it was to ensure that when civilizations reached that point of development, they would be allowed to go no further. Each civilization is “harvested” by the Reapers, and their genetic history, their knowledge, everything about them is turned into a new Reaper that supposedly keeps and maintains the essence of that species in perpetuity. In this way, the intelligence says, no civilization truly dies, and all are saved from their own inevitable conflict.
You can imagine that doesn’t sound appealing to the various civilizations in the galaxy who are quite happy living as they do, but given the overwhelming technological superiority and power of the Reapers, the various civilizations being harvested rarely have a choice, no matter how long or how hard they struggle. This time though, with the Crucible completed, the intelligence says the variables have changed and their cycle will no longer work. (Seems to me that a superintelligence should have seen this coming but hey.)
Shepard is now given three choices on what to do about the future of the galaxy, choices only he can make because this superintelligence cannot predict what will happen. They boil down to:
- Control
- Destroy
- Synthesis
The first option is control: Shepard can merge his mind with the Reapers so that his will becomes their will. In this way he saves the galaxy from destruction and can shape the Reapers however he wants, since he will become their new reason to be. In the process, though, Shepard surrenders himself and truly merges his mind, meaning the Shepard everyone knows and loves is essentially dead, but his sacrifice spares galactic destruction.
Control asks us to imagine what it would be like for someone to take that power and essentially become the overseer of the galaxy. The power of the Reapers is not gone. Their ability to wipe out the galaxy is not gone. In the short term, Shepard will undoubtedly save trillions of lives, but long term, who knows? How does a human mind cope with power and knowledge like that of the Reapers? Will his will be strong enough to ensure this won’t happen in the future? How do we know he won’t use this power to enforce everything he wants onto the galaxy, as opposed to allowing it to continue to determine its own future?
These are unknowns, of course. Nothing inherently gives Shepard the right to be in that position, to make those decisions for the galaxy. What helps some may harm others, and while a superintelligence might be able to reason through all of that, would a human mind care? If his will truly is the deciding factor in all decisions, then he is free to ignore these questions in favor of his own desires. The ethics of this choice in the short term are easy to see: save trillions. But the long-term consequences are completely unknown and amount to trading in one master (the superintelligence) for another (Shepard). The status quo with regard to power dynamics is no different than it was before the choice; the difference is only that a different mind is making different choices, possibly worse ones. If you play Shepard as a villain, you may still get death and destruction, though more directed and less all-encompassing.
The control option gives us what I think is the clearest form of immediate harm reduction paired with the most potential for creating future harm. It’s easy to see how someone weighing the choices in the short term might pick this option, especially if you feel your Shepard was morally upstanding and worthy of this power. After all, this game is a power fantasy; what’s more power to an already powerful character? It’s certainly tempting, and I have on one occasion picked this option.
The next option is destroy. In this option, you guessed it, the Reapers are destroyed. Big explosions and raining Reaper debris everywhere, but there are side effects. The blast also damages much of the technology the galaxy relies on. It can be rebuilt in time, of course, but the damage is extensive. It is also indiscriminate in what synthetic life it targets. This means all synthetic life, including the Geth (assuming they are alive) and EDI, will be destroyed in an irreparable way. This gets a bit philosophical. If you believe that moral agency rests not on biology but on certain other capacities, then without a doubt the Geth and EDI are moral agents. They have self-awareness, intelligence, and rationality, and can make decisions with others in mind, not only themselves. This is demonstrated time and time again, leading me to believe the series wants you to treat them as moral agents, if not as living beings: not life in a biological sense, but a different form of it.
If this is the case, then a decision to allow them to be destroyed is in fact a genocide of synthetic life. Biological life continues while synthetic life is wiped out. Many argue this is the best option, largely because if your readiness is at a certain level there is a scene implying Shepard lives, which I think plays a strong emotional role in people justifying this choice. If, however, you chose to destroy the Geth earlier in the game, then EDI is the only AI we know of, and her death may seem like a small price to pay, even though she too is a moral agent. This is the trolley problem on a galactic scale. For me, genocide is not justifiable, period, and I treated the Geth and EDI as living moral agents who deserve to be treated as such. I could not justify sacrificing them in order to save organic life simply because they were a different kind of life from myself. This choice, to me, is cut and dried, based almost entirely on how the player views the moral agency and “aliveness” of EDI and the Geth (though maybe not in those terms). If you think they are not alive and not capable of agency, then the choice destroys nothing except machines that are very good at mimicking life. If you think they are, then you are okay with committing genocide to spare organic life, for some reason I can’t fathom.
The long-term consequences of this are interesting to think about. If the superintelligence is correct that organic life will inevitably create synthetic life that eventually breaks away from and comes into conflict with it, then we’ve only delayed the future destruction of organic life in the galaxy until some later point. For me, while a superintelligence certainly has far more ability to predict these things, it is still ultimately a statistical model and can only give a certain likelihood of that being the case. This leaves me doubting the certainty presented by the superintelligence and believing the galaxy could forge a different path; only time would tell. Even if I were to grant that the superintelligence’s certainty is correct, taking this route means committing genocide for a reprieve from an inevitable future. A reprieve is not enough for me to think genocidal action is a viable choice, though it might be for other demented souls.
The third choice is synthesis. This choice has Shepard, as essentially a hybrid being himself, give his essence into the energy of the Crucible, which will then change all life in the galaxy, synthetic and organic, into a new form that is a merging (synthesis) of both. The superintelligence says this is a new possibility, but now that it has pondered it, this possibility is the inevitable final form of all life in the galaxy. This choice is certainly tempting. Synthetic and organic life in full understanding of each other, no genocide, all of your friends alive, and all for the small price of your life…oh, and the violation of the bodily autonomy of every being in the galaxy.
We trade one problem for another. Instead of impacting one set of beings through genocide, we now impact everyone, forcing upon them a change none have the option of rejecting. Sure, everyone is alive, but they are fundamentally changed. What does this synthesis mean for them? Are they different people? Are their minds irreparably changed such that they are no longer the same people they were before? What are the long-term implications of this change? How will life work after this synthesis? There are an incredible number of unknowns here, with no clear answers to guide us, so we’re left with only the immediate implications. While everyone is alive, we have forced a change they had no say in upon every single living thing in the galaxy. That’s quite the decision to make. Like destroy, we also have to take the superintelligence’s word for it that this synthesis is indeed the true destiny of all life in the galaxy, which cannot be proven outside of whatever realm of statistical certainty exists in the collective minds of the Reapers.
There is also a fourth option: do nothing. Shepard can refuse to make a choice, the Reapers will continue their destruction, and the cycle will go on. This dooms all life in the galaxy to the fate of all previous cycles, and to me it might actually be the only decision to make if you want to follow any kind of moral standard. The choices are given to Shepard, thrust upon him, and made to be his and his alone. This is quite unfair, and I liken it to someone telling you they will kill one of a group of people, but you have to choose which one or they will kill everyone. It’s an unfair premise, but one that is morally interesting to think about. How much responsibility does the individual given that ultimatum bear? The killer is not you; it’s the person who does the killing. They could choose not to kill anyone, but instead they have tried to push moral culpability onto you. Refusing to answer and potentially having everyone killed would also, at least in my philosophy, not be your fault, because again, you took no action; you did not kill the others, someone else did. It’s a horrible situation to imagine, and the person given that ultimatum will be traumatized beyond belief whether they say nothing or choose someone to be killed. It is a no-win scenario, and potentially morally grey. Sometimes there isn’t a clear answer, and we make a choice that may be neither right nor wrong.

Shepard refusing the choice given to him does not cause the Reapers to kill every civilization in the galaxy; they already made that choice and merely gave Shepard a way to alter it. The Reapers are the killers, but Shepard can choose to stop the killing at the expense of the galaxy’s future, millions of lives, or a fundamental change forced upon every single being in the galaxy against their will. Some choice.
In conclusion, all three choices give us something to consider, and that is, I believe, the very thing BioWare wanted us to consider. There are plenty of other choices of profound consequence in the series, so this is not the first time we are asked to affect the lives of millions, if not billions, with no insight into what the decision means for the future beyond our own internal hopes. This one is the same, only on a galactic scale, of far greater magnitude than the others. It is the distilled essence of the moral questions the game asks throughout the series. I am not angry at BioWare for making me think, or for creating the endings they felt best fit the game’s themes. It’s their story. I will likely continue to pick synthesis, my ending of choice, out of the purely selfish motive of seeing all the friends I’ve formed parasocial bonds with living on and seeming happy, sans genocide, sans Shepard losing himself in some weird mind-meld with the Reapers. There isn’t a right or wrong choice. It’s a video game. Haters need to calm down, and we should enjoy the thought-provoking nature of the ending instead of screaming about not getting what we wanted.