The Ethics of Westworld
Today I’d like to talk about my favourite new show, Westworld, if I may.
If you haven’t heard about it or seen it yet, Westworld is a new HBO series about an amusement park populated with androids. Customers pay a hefty price to enter the Western-themed park, which offers several narrative packages to play through. Most customers, however, simply use the park to fulfill their baser desires: killing the androids, having sex with them, and so on. And the people running the park don’t seem to think much of their customers either.
The show starts at a point where malfunctions have occurred with the androids and some must be taken out of the park for diagnostics. There is so much going on with this show and so many interesting science fiction themes that I could probably write about it for hours. But let’s just keep things simple for now.
The Rape of Dolores
Let’s start with the most obvious ethical issue presented by the show: the killing and/or rape of androids. In the very first episode we see the main android, Dolores, dragged kicking and screaming into a barn. We don’t see it, but we all know what happens next.
Why should we discuss the ethics of harming machines? A machine is not a living thing. It is simply the product of lines of code, programming, and wiring: an artificial being, not in any way organic. So why should we consider the ethics of its treatment at all?
Well, I believe there is a vast difference between a machine that, for example, puts a car together, and an android, specifically the ones shown in Westworld. A factory machine has no mind of its own; it doesn’t know anything beyond what it’s programmed to do, if it can be said to “know” anything at all. But you could argue the same about androids: they may look human, but they are merely machines running their programs. How can we know that they feel anything, even if they are kicking and screaming?
I would argue that the difference lies in the intent of creation. The Westworld androids are made to look and act human and are constantly being updated to be ever more human. So if we are deliberately creating human-like beings, should we not treat them with the same respect that organic humans deserve? I believe we should. In fact, one of the show’s characters, the narrative director, makes a good point when he argues (I can’t remember if it’s in the first or second episode) that the company should not be making the androids more realistic because, and I’m paraphrasing here, “Do you really want to think that your husband actually had sex with that woman, or that you actually killed that person?”
It seems highly suspect to me to create androids so human-like and realistic that we can’t tell the difference, and then simply do with them what we please. That is tantamount to slavery. We’ve seen this before in the play R.U.R., which I’ve reviewed on this blog. And you can guess what happens at the end of that one.
And whether or not Dolores can actually feel pain, the fact that she has been programmed to react to that situation as any woman would is all the indication we need that raping her is a crime. This brings us to another interesting point about the ethics of how we treat androids.
After each narrative cycle (I’m still not sure how long those last in the show) the memories of the androids are wiped and reset. Their storylines start over, ready for new customers to interact with. So Dolores has no memory of the injustices she has experienced. Should we still commit crimes against her just because she is an android and just because she will have no memory of it?
You can guess that my answer is no.
Playing God
I’d also like to argue that treating androids the way they are treated in Westworld is unethical because that mistreatment is a gross abuse of our responsibility as creators. Just as we take care of our children, so should we take care of our humanoid machines. I’m not sure I have much more argument to add other than that it’s simply the right thing to do. If we are choosing to play God in our own way, then we damn well better take it seriously.
And if nothing else, what does it say about us, that we would create such masterpieces only to destroy them?
Final Thoughts
All in all, the first few episodes of Westworld have been extremely interesting and thought-provoking. As I discussed in my post about why I’m fascinated with robots, I find these mechanical creations of ours an excellent ground for exploring what it means to be human. But what the show illustrates, or rather what I’m taking away from it, is that this exploration should remain theoretical. We should be able to figure ourselves out without harming others.
I’m sure Westworld will inspire many a future blog post. For now, watch that show! And I’ll talk to you later!
Ciao!