Maeve Millay is building an army.
Equipped with the knowledge that her life is a lie and endowed by her creator – or at least by her goofball medical technicians – with superhuman smarts and skills, she’s prepping an escape attempt. Specifically, one that will require a whole host of “hosts” to help her break on through to the other side: the real world. She’s now able to command her fellow androids just like Westworld’s employees, though she does it in the third person, as if telling a story in which her robotic peers are characters. She can also now maim and kill human beings, as the insufferable Sylvester learns to his carotid artery’s dismay. (Good thing his partner Felix is equipped with a magic throat-cauterizing tool or he’d be in real trouble.) Many of the park’s mechanical residents seem on the verge of rebelling against their makers, from doomed do-gooder Teddy to deep-cover operative Bernard to borderline-insane Dolores (with the caveat that maybe possibly her storyline takes place decades ago, but whatever). It’s the madam, however, who seems most likely to succeed.
On this count at least, tonight’s episode – “Trace Decay” – is a success as well. As we saw last week, Westworld works best as a straightforward genre thriller, and we got that in spades this go-round. It’s interesting to compare the show to its obvious antecedents in this regard, mostly because it usually comes up short: Imagine Lost without the sense of wonder and humor, or the first season of True Detective without the genuinely terrifying occult angle, or Game of Thrones with neither the epic-fantasy sprawl nor the antiwar heart. This show is what you’d be left with. But a taut rise-of-the-robots action-horror hybrid ain’t the worst thing to watch, is it?
We also got a revelation about the robots’ programming that offers a common-sense explanation of one of their most common malfunctions, or at least that of our main-character bots: the way they experience the buried memories of their past builds as disorienting, reality-warping flashbacks and hallucinations. As Felix explains to Maeve, the hosts’ brains don’t process memory beneath the eroding haze of time like ours do; they record it like cameras and store it like old computer files. If thoughts of their past experiences are triggered, they relive them as if the events are occurring in the here and now, with no way to distinguish past from present, or reality from remembrance. Hence the horror the madam feels when she flips between her life in the Sweetwater brothel and her time as a homesteader/mother, or Dolores’ increasingly frightening inability to tell where or when she is – or if the people she sees are truly there at all. As sci-fi pseudoscience, it’s a convincing idea.
The episode is less persuasive when it shifts its gaze from the robot mind to the human one. Before Ford wipes his secret android minion Bernard’s memories of his murder of, and relationship with, Theresa Cullen, the perplexed ‘bot asks his maker what separates his pain and experiences from those of a normal person. If all of this stuff ultimately exists in the brain and nowhere else, who’s to say where the line is drawn between the stuff of life and the merely “lifelike”? Noting that his old frenemy Arnold was tormented by the same question, the Doctor dismisses it. There is no difference, he says, because human consciousness is just as much an illusion as that of the hosts. We too are locked in loops and routines, rarely challenging our drives and desires, perfectly happy to follow orders. The only difference between machine and man is that the latter can at least be aware of his plight, and holds the remote control over the former.
Which is true, so far as it goes, but that’s not very far at all. The question of “what makes us truly human” has always been one of the least interesting ones science fiction asks because the answer is all around us. Love, happiness, suffering, anticipation – even if they’re all just part of our brains’ core code, it’s the only code we have. It’s not as if we’re living a lie when we experience these things, since there’s no way to access any other deeper “truth” about reality. False or not, our consciousness is inescapable. No matter how much Ford sneers about it as he uses his iPad or whatever to reprogram his Frankenstein’s monster, it doesn’t give him, or us, an escape route.
In a weird way, this makes the Man in Black’s vision quest more morally compelling. In conversation with poor deluded Teddy, who regains his memories of the Man’s past misdeeds only to be double-crossed by a “victim” he rescues, the black-clad bastard tells the story of how his wife was driven to suicide because she could sense the man he really was (i.e., a murdering, raping sociopath) beneath his real-world facade as a billionaire philanthropist. After killing the mother version of Maeve and her little daughter during a subsequent trip, just to see if doing the worst thing he could think of would make him feel anything (spoiler alert: It didn’t), he discovered the concept of the Maze, and decided that unlocking its secrets would give his life meaning. He’s specifically interested in the idea of a Westworld where the robots are free to really live, as Maeve did when her child died, and to kill, as any number of them would no doubt do if given the choice.
Twisted as he is, the Man in Black is a walking example of how Ford’s approach to human consciousness is a dead end. It’s unclear if he wants to help the hosts, or if liberating them is just a byproduct of his mission. But the dapper outlaw gent understands that life does have meaning, even for a sadist like himself. And he’ll do whatever it takes to prove the point.
Previously: Murder, He Coded