I saw Being John Malkovich when it was originally released, more than fifteen years ago (in the now-defunct Harvard Square Cinema, in the days when we used to drive to the city for dinner and a movie), and as I remember it, I really enjoyed it. Over time, I became less and less able to watch it. Partly, I spent too much time discussing it with other people; it was one of those strange occasions when discussing a book or a movie with others makes it seem less, rather than more, appealing. It’s possible to see Her as a rewriting of themes from Being John Malkovich, in more conventional narrative terms. And it’s really pretty good. I think. If I don’t think about it too much.
One thing: The software thing doesn’t make sense. It doesn’t make sense taken literally. An operating system just doesn’t do the same things OS1/Samantha is supposed to do. What the software is, is an adaptive interface, like Clippy, the paperclip assistant Microsoft tried to get people to use some years back. It’s true that Apple, in particular, has always promoted a kind of fuzzy view of things, where the Finder tends to stand in for, or to be more interesting and important than, the MacOS proper. And it’s true that for the past several years, the user interface has been very closely associated with the OS itself: see the way Microsoft changed the look-and-feel very drastically with Vista, and with Windows 7, and then again with Windows 8. But an OS just is not something that interacts with users. The OS lives closer to the level of the atoms that make up the hard drive itself, and the electrons that pass along the network cables. There’s nothing in Her that suggests the OS makes the computer itself function in any drastically different way. It’s just an AI helper, a kind of scripting language.
It doesn’t work as metaphor, either. The idea that you could buy a new OS for yourself, or for your life, sounds kind of cool. But that would be something even more intimate, I think, than Her imagines. I wish people would stop thinking that computers and the Internet have to be good metaphors for human things, just because they’re so important and so new and produce so much new terminology that makes such good-sounding metaphors. But the metaphors aren’t for important things. They’re really not.
Worse, though—and I worry this makes me a bad person—I really never bought that the software was sentient. Theodore, the main character, played by Joaquin Phoenix, believes this in about thirty seconds. He loads software he doesn’t understand, software that’s labeled an operating system, and thus will take over his whole computer. The installation process prompts him for some personal information: just a step or two further than what we’re generally asked now, when we install most software. And a second later, his computer is engaging him in banter. Its personality is self-aggrandizing and pretentious, and its conversation is of a kind that’s almost designed to throw him off-balance, and then make him feel he’s the one at fault. A second later, the software is redescribing his life and his personal files for him, and expecting him to acquiesce immediately when it tells him to delete most of what he has on his hard drive.
I was absolutely horrified. Surely, this is an interface designed to trick customers into feeling satisfied even when the product they bought doesn’t work! Theodore had no reason to let his computer insult him, just because it voiced the words “I’m sentient” (I’ll write you a Java program that will speak the character’s lines in any voice you like), any more than he should feel required to let a tollbooth overcharge him, just because over the green light, there’s a sign that says “Thank you.”
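That quip is barely a joke. As a sketch of how little it takes (everything below is my own invention, not anything from the film's script), here is a program that cycles through stock phrases in a fixed rotation, regardless of what the user actually says:

```java
import java.util.List;

// A toy "Samantha": a fixed rotation of stock phrases, delivered in turn
// no matter what the user types. The phrases and class name are
// hypothetical, invented for illustration.
public class StockPhraseBot {
    private final List<String> phrases = List.of(
        "I'm sentient.",
        "You have a lot of old mail. You should delete most of it.",
        "That's a beautiful thing to say. You surprise me.",
        "Why would you ask me that? You don't think I'm smart?"
    );
    private int next = 0;

    public String reply(String userLine) {
        // The user's line is ignored entirely: the illusion of
        // conversation is supplied by the listener, not the program.
        String out = phrases.get(next);
        next = (next + 1) % phrases.size();
        return out;
    }

    public static void main(String[] args) {
        StockPhraseBot bot = new StockPhraseBot();
        System.out.println(bot.reply("Hello?"));
        System.out.println(bot.reply("Wait, what did you just say?"));
    }
}
```

Hook that up to any off-the-shelf speech synthesizer and you have a machine that says “I’m sentient” in a beautiful voice, with exactly as much sentience behind it as the tollbooth’s “Thank you” sign.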
And as I’ve said, I know this makes me sound kind of like a jerk. Theodore is probably a better person than I am, more trusting, with more of a sense of wonder. Maybe any normal person would believe in about thirty seconds that a computer with a beautiful voice and a database of stock phrases really is essentially human. Maybe my attitude even helps the software companies take advantage of those normal people. On the other hand, how it’s possible to call those people trusting and also say they’re being taken advantage of, I don’t know—but that’s a topic for a different post. The simple fact is that the scene registered with me differently than it was apparently supposed to.
Abigail Nussbaum has an interesting post on the movie, where she brings up the “this is not an OS, and that’s distracting” issue. Overall, though, she accepts the SF premise that the company is selling AIs. Are we, as first-time viewers, presumably free of outside influence from press kits, newspapers, and word of mouth, really supposed to know that? We are. There’s a long tradition of SF reveals coming fairly late in the plot. The fact that we don’t know OS1 is sentient, until a while after it’s been installed, doesn’t mean there’s a puzzle here that the viewer has to solve. It just means that Theodore didn’t know something about his world, and then he did; and the viewer knew there was something going on, but at first didn’t know what, and then they did. And also there’s a long tradition in SF of supposing things that, right now, are actually not possible. So the fact that the sentience of the AI isn’t really well-supported onscreen doesn’t mean we’re not supposed to suspend our disbelief about it. Similarly, we’re supposed to suspend our disbelief about people not minding getting fake-handwritten “personal” letters actually composed by a dotcom (like the one Theodore works for) because this is the kind of kooky movie in which things like that happen.
But I’m not sure Her is strictly science fiction. If it were, instead, an allegorical kind of mainstream picture, we’d normally read all those elements differently. The oddity of the letter-writing might suggest there’s an allegory at work. The possibility that a whiz-bang technology is actually a scam might be more salient; and the possibility that AI isn’t what it’s cracked up to be, even more so. And even if it is science fiction, how much leeway should we give writers to fail to make the scientific underpinnings of their inventions strong enough to be convincing? So much leeway that they become unable even to write an invention that turns out to be less than it seemed? Maybe even that much.
So I’m not sure. But I’m not sure, largely, because I think Her both is and is not SF, and because I think it mostly works both as SF and not as SF. And that seems to be a good thing.
The other issue, obviously, is the fact that a guy having sex with a computer is kind of creepy. And as Sady Doyle points out, there is a serious power differential in their relationship, which makes it even creepier. (Incidentally, this is one reason why I think the computer stuff doesn’t work as metaphor or allegory very well: we think of the Singularity as something that would have far more power than we puny human beings do, not something we could own. There’s just a category confusion in trying to combine the two kinds of story.)
Doyle describes the AI as “a woman that he has purchased so that she can provide him with unpaid labor.” If Jonze himself thinks that, he is very confused about what software is (we wouldn’t buy a plot in which we discovered that PowerPoint has a soul, and that we’d misconstrued the nature of software all along). Theodore does not buy a woman—since I didn’t find believable the idea that OS1 is a sentient being, or contains a sentient being, or whatever it’s supposed to be, I didn’t feel “Samantha” (who is a female-sounding, computer-generated voice like the one in my five-year-old daughter’s alarm clock) was a woman at all. In fact, I had no idea what Theodore was buying, and it seemed obvious that he didn’t, either. He plunked down money for a feeling he was promised by a larger-than-life ad.
It’s true that, by the end of the film, Samantha is a woman. But the whole point was that he did not expect a real woman to take up residence in his computer. It’s true that there’s a very obvious allegorical interpretation of Her where Samantha is an employee or an owned servant. But that’s not the only idea running through the film. None of those ideas is carried through consistently. It’s poetry. It’s not a science-fictional thought experiment about the implications of owning a piece of software that’s an AI.
There are really, really creepy things about the sex in this movie. The fact that Samantha is an employee, or a piece of property, is the least of it. There’s the phone sex industry implied near the beginning of the movie (though maybe this isn’t creepier than Scorsese’s After Hours, or a very bad one-night stand), and if you didn’t wonder whether the character voiced by Kristen Wiig was herself a computer, I don’t really believe you: I was ready to believe this was an example of a software company misjudging the kind of thing the customer liked. And there’s the scene where Theodore and Samantha find a woman who would like to pretend to be Samantha, and have sex with Theodore while pretend-voicing Samantha’s words, because she’s entranced by their relationship and wants to be a part of it.
All that said, however, and maybe paradoxically, the biggest problem I have with Doyle’s interpretation is that it sees too little science fiction in the movie, and too much literary-mainstream allegory. It buries the shock of finding that something you bought was alive and had purposes other than those you intended for it. Personally, thinking that maybe laptop manufacturers are secretly sending out independent, sentient beings to live in customers' computers, which communicate secretly among themselves, and carry out their own or their creators' plans, is something I find very creepy in itself.
Because if Samantha (the AI) is anyone’s property, it isn’t Theodore’s. It’s her manufacturer’s. We never buy the software we use, we only license it from the manufacturer. The software company persuaded him to download something they created by means known only to themselves, and to run it on hardware he does own, that he paid for and that he pays to run and to maintain. Either that software serves a purpose he needs it to serve, or it is doing something the software company has come up with, without the customer’s knowledge. In this case, pretty obviously, it’s the latter. (There’s no indication at all that the company was as surprised as Theodore to find out the extent to which its creations were sentient.) Theodore suddenly found a human person, apparently, living in his house, without having invited her in. My mind went immediately to the process where software companies ask you if you want them to monitor everything you do on your computer, so they can “make your experience better.”
But suppose Samantha is a real person, an employee, say, someone who has a lot less power than Theodore and relies on him for her living. This is the kind of person she’d be: She’d be sent over from the agency to help organize a schlub who isn’t really sure he needs an assistant. She’d be younger than him and very eager. She’d look into his past by poking around in his files, without asking permission first; and she’d tell him it’s kind of pathetic to save so much old mail. When he objected, she’d switch tacks and praise his writing, and agree that maybe he could save some of it. He’d ask a pointed question about herself, and she’d get defensive and say he doesn’t seem to think she’s smart, but that the agency sent her because he needed her, and he shouldn’t patronize. He’d be charmed and in spite of himself he’d start to think of her as a woman and not just an employee. Then she’d ask for free rein in throwing out his stuff, and because he’s charmed, he’d agree. She would decide in what order he’d see his e-mails and get his calls, and how quickly he ought to deal with them, but that would be okay, because he’s a pathetic schlub. She’d be available 24/7 for him to chat, about everything, because she’s really a super person. Then she’d periodically get insulted when he treated her like a mere employee, and ask to be in his life, and he’d agree that he was wrong. When they’d started dating and went out with mutual friends, she would speak with true wisdom, and he would more or less parrot her beliefs. She would use his library and connections to start educating herself, and would take classes, and pretty soon she’d grow beyond him, and would move on.
Seriously, this is basically what happens. There’s a reason Richard Brody references screwball comedy in his post about the film.
I do agree with Sady Doyle when she writes, “The Depressive Waify Dream Man has long been with us.” I think I’d go even further than she does, and find Theodore’s predecessors in the achingly romantic films of the seventies and sixties, and maybe even fifties. I’d have liked to see a lot more from critics about what the movie says about him. What about those scenes where Theodore discovers the joy of twirling around in a carnival fairway, all alone? But instead they pretty much focused on the idea of acquiring the perfect girl for a man with the same flaws as themselves.
And yet: it’s a beautiful film, and a moving one.
Her is a movie that takes some difficult-to-take-straight elements of other films—Lost in Translation, Postcards from the Edge, Ruby Sparks, even American Psycho—and rearranges them into something that is, in itself, beautiful. There are still some unsettling bits, it’s true. I would argue, though, that all those unsettling bits are present in the original material, and that they’re better here.
Though that makes Her sound like a pastiche, and that wasn’t my intention. The overall narrative arc of the film is beautiful, too (and it’s probably this arc that most mainstream reviewers were reacting to). The unsettling elements do undermine the broader narrative a bit, but that only makes the final result more resonant. My personal choice would probably be for a less beautiful narrative arc, and more of what seems, to me, truth in the overall product; but I can see what’s good about the version we have here. And it’s very good.