On “mind crime” and other questions about Westworld


Welcome to the new Old West: Dolores (Evan Rachel Wood) and Teddy (James Marsden) in Westworld (photo via HBO)

Last spring, George Musser posted a fascinating article in Aeon called “Consciousness creep,” the gist of which is given by the dek (as we journalists call the story description beneath the headline): “Our machines could become self-aware without our knowing it. We need a better way to define and test for consciousness.” Here’s the conclusion (careful readers of my blog may recall that I quoted the same extract in March):

Tackling those big problems is important.… Building a consciousness detector is not just an intellectually fascinating idea. It is morally urgent—not so much because of what these systems could do to us, but what we could do to them. Dumb robots are plenty dangerous already, so conscious ones needn’t pose a special threat. To the contrary, they are just as likely to put us to shame by displaying higher forms of morality. And for want of recognising what we have brought into the world, we could be guilty of what [University of Oxford philosopher Nick] Bostrom calls “mind crime”—the creation of sentient beings for virtual enslavement. In fact, [philosopher Eric Schwitzgebel, of the University of California at Riverside,] argues that we have greater responsibility to intelligent machines than to our fellow human beings, in the way that the parent bears a special responsibility to the child.

We are already encountering systems that act as if they were conscious. Our reaction to them depends on whether we think they really are, so tools such as Integrated Information Theory will be our ethical lamplights. [University of Wisconsin neuroscientist Giulio] Tononi says: “The majority of people these days would still say, ‘Oh, no, no, it’s just a machine,’ but they have just the wrong notion of a machine. They are still stuck with cold things sitting on the table or doing clunky things. They are not yet prepared for a machine that can really fool you. When that happens—and it shows emotion in a way that makes you cry and quotes poetry and this and that—I think there will be a gigantic switch. Everybody is going to say, ‘For God’s sake, how can we turn that thing off?’”

In one way, the world that’s being dramatized—very elusively so far—on the HBO drama series Westworld differs from what those paragraphs discuss. The “hosts,” as they’re called on the show—the very human-like androids that populate the theme park—don’t seem accidentally to have become conscious; it looks as if they were designed from the outset to possess some aspects of consciousness. In their conversations and other behaviors (we should remember that speech is only a form of action), they act as if they have an inner life, which is part of what enables them to seem real to the park’s visitors. And it appears that the hosts’ creators, the programmers and other technicians who build and tend to them, have no direct access to that inner life and must resort to talking with them in order to judge what’s going on in their heads—which is exactly the situation we find ourselves in regarding our fellow humans. All of this is deliberate. On the other hand, it appears that something important is happening accidentally.

Regardless of the details, though, the hosts appear to be more or less sentient, and so the issues that Musser mentions come into play. If the hosts are conscious, intelligent, and self-aware, isn’t it wrong to deny them full control of their existence? Isn’t it wrong to use them for purposes other than their own? As Musser asks, isn’t this a form of slavery? And—to raise an issue that he only hints at but that the show seems likely to confront—what do the hosts think about this? What are they going to do about it? What should they do about it? Whose side should we take?

It won’t be surprising to see Westworld suggest that at least some of the hosts are more admirable, maybe even more human, than the people who manage them or use them. This would be a great advance over the idea of the killer robot that Michael Crichton’s 1973 movie of the same name employed, which has always sounded simplistic to me (I haven’t seen it). A lot has happened since then, and indeed had already happened in other genres. Anyone interested in the background for Westworld should consult the wide-ranging—from Homer to Ex Machina—and enlightening survey of created beings that think for themselves that Daniel Mendelsohn published in The New York Review of Books last year. As Mendelsohn points out, the TV series Battlestar Galactica proposes a race of machines that are essentially equivalent to humans; the movie Her features a synthetic consciousness, embedded in an operating system, that (who?) has more personality and vivacity than the people who use her; and the movie Ex Machina, with its femme-fatale heroine longing to escape confinement and subservience in order to experience all that the great world has to offer, comes closer still to the realm of Westworld.

No doubt many viewers already favor Dolores, the rancher’s daughter (many of the hosts’ roles are familiar tropes, even clichés), who wakes every morning with rosy optimism intact, oblivious, thanks to the park’s nightly reset mechanism, to whatever happened to her the previous day. To avoid spoilers, I’ll say only that Dolores seems in the first episode to be the sort who wouldn’t hurt a fly, as do the other hosts; they may blast each other, as the chillingly named Hector Escaton does, but they leave living things unharmed. And yet the show often inverts our initial expectations. Besides, though we sympathize with Dolores, no one is likely to admire the murderous Hector. But what are we to make of Maeve, who serves as the madam of the saloon’s prostitutes, and whose fading desirability puts her at risk of being retired? As a programmed creation, she’s presumably not free to choose another line of work. The nature of her nightmares, which we know are really memories, as well as other aspects of her situation, inclines us to side with her as we do with Dolores. But the hosts’ apparent lack of autonomy invites some tricky speculations that we may not be meant to consider yet. Does Maeve’s programming prevent us from wondering what she’d do if she weren’t doing this? Absent her programming, she wouldn’t be anything at all, it seems. Or is this the show’s way of suggesting that social, political, and economic structures limit all of us?

It’s not yet clear what Westworld is up to, though I think it’s safe to say it’s up to a lot. Among other things, it may be reframing the debate over sex and violence in video games and other entertainments, in which the argument is often made that game playing, like watching movies and TV, is separate from real-world behavior. The show asks (as one of its characters does), “If you can’t tell the difference, does it matter?” To put it another way, the Westworld park provides for its visitors a form of virtual reality that’s almost indistinguishable from the real world, which puts to shame the so-called virtual reality now being offered to gamers. It’s not a headset you strap on; it’s a place you go to. There are differences, but I can’t go into them without undermining the pleasure of discovery for anyone who hasn’t seen Westworld yet. Part of the point of going to the park is to figure out how the park works; part of the point of watching the show is to figure out how the show works. Note that these two things, the park and the drama that presents it, have the same name. There’s a point to that, which, despite the college-paper or late-night dorm-room tone my tactic evokes, is easiest to put in the form of more questions. What does it mean for visitors in the show to participate in Westworld? What does it mean for us as viewers to participate in Westworld?

As Lost did before it, Westworld is showing us a world we can make sense of only over time. Lost reveled in big, bold, obvious contradictions—the lame could walk, a smoke monster stalked the trees, a polar bear wandered a tropical island—which, assuming you bought into it, grabbed you by making you wonder how they could all be resolved. Westworld eschews obvious contradictions in favor of subtle implications. To indulge in the kind of adjective slinging that some writers routinely practice, it’s a moody, elaborate, provocative, possibly profound, and highly ambiguous think-piece drama. For now, I love it. For now, it’s got me jotting more journal notes than anything I’ve seen in a long time. For now, the answer to its many questions is, as with Lost, “Wait and see.”

