Today I'd like to talk about my favourite new show, Westworld, if I may. If you haven't heard about it or seen it yet, Westworld is a new HBO sci-fi show about an amusement park populated with androids. Customers pay a hefty price to enter the Western-themed amusement park, where they can quite literally do whatever they want. There are various narrative packages the customers can purchase, leading them on expeditions or allowing them to be the hero or the villain of the story, all while interacting with the androids, whose humanity is uncanny. So far we've seen most of the customers fulfilling their base desires in the park, killing androids or having sex with them, whatever their hearts desire. And the people running the amusement park don't seem to have a much better opinion of their customers either.
The show starts at a point where malfunctions have occurred with the androids in the park and they must be taken out of the park for diagnostics. There is so much going on with this show and so many interesting science fiction themes that I could probably write for hours about it. But let's just keep it simple for now.
The Rape of Dolores
Let's start with the most obvious ethical issue represented by the show: the killing and/or rape of androids. In the very first episode we see the main android, Dolores, dragged kicking and screaming into a barn. We don't see it, but we all know what happens.
Why should we discuss the ethics of harming machines? A machine is not a living thing. It is simply the product of lines of code, programming, and wiring. It is an artificial being, not in any way organic. So why should we consider the ethics surrounding their treatment at all?
Well, I believe there is a vast difference between a machine that, for example, puts a car together, and an android, specifically the ones shown in Westworld. A factory machine has no mind of its own; it doesn't know anything other than what it's programmed to do, if it can be said to "know" anything at all. But you can argue this about androids as well: they may look human, but they are merely machines running their programs. How can we know that they feel anything, even if they are kicking and screaming?
I would argue that the difference lies in our intent of creation. The Westworld androids are made to look and act human and are constantly being updated to be ever more human. So if we are deliberately creating human-like beings should we not treat them with the same respect that organic humans deserve? I believe we should. In fact, one of the show's characters, the narrative director, makes a good point when he argues (I can't remember if it's the first or second episode) that the company should not be making the androids more realistic because, and I'm paraphrasing here, "Do you really want to think that your husband actually had sex with that woman, or that you actually killed that person?" (my emphasis).*
It seems highly suspect to me to create androids that are so human-like and realistic that we can't tell the difference and then simply do with them what we please. That is tantamount to slavery. We've seen this in the story R.U.R., which I've reviewed on this blog. And you can guess what happens at the end of that one.
And whether or not Dolores can actually feel pain, the fact that humans have programmed her to react to such a situation as any woman would is the only indication we need that raping her is a crime. This actually brings us to another interesting point about the ethics of treating androids well vs. poorly.
After each narrative cycle (I'm still not sure how long those last in the show) the memories of the androids are wiped and reset, and their storylines start over, waiting to be interacted with by new customers. So Dolores has no memory of the injustices she has experienced. Should we still commit crimes against her just because she is an android and just because she will have no memory of it?
You can guess that my answer is no.
I'd also like to argue that treating androids the way they are treated in Westworld is not only unethical and totally wrong because the point is for them to be like us, but also a gross abuse of our responsibility as creators. Just as we take care of our children, so should we take care of humanoid machines. I'm not sure I have any more argument to add other than that it's simply the right thing to do. If we are choosing to play God in our own way, well then we damn well better take it seriously.
And if nothing else, what does it say about us, that we would create such masterpieces only to destroy them?
All in all, the first few episodes of Westworld have been extremely interesting and thought-provoking. As I discussed in my post about why I'm fascinated with robots, I find these mechanical creations of ours an excellent ground for exploring what it means to be human. But what the show illustrates, or rather what I'm taking away from it, is that this exploration should remain theoretical. We should be able to figure ourselves out without harming others.
I'm sure Westworld will inspire many a future blog post. For now, watch that show! And I'll talk to you later!
*Don't get me wrong, I certainly have a soft spot for video games and first-person shooters, but I believe that's exactly where that kind of action should stay.