I saw the movie Her. This post will contain spoilers for the whole story of the film, so don’t read it if you haven’t seen it yet. It’s very good. For those who’ve been living under a rock, it’s set in a near-future time when operating systems have evolved beyond assistants like Apple’s Siri into fully sentient artificial intelligences, and Joaquin Phoenix’s character Theodore falls in love with his OS, named Samantha (and given voice by Scarlett Johansson).
It’s science fiction, of course, but I didn’t have to suspend my disbelief too much when watching this movie. All I had to assume was that Moore’s Law will keep increasing our computing speed and data storage capacity, and that the Internet and cloud computing will keep getting more sophisticated. And, of course, that the breakthroughs needed for true artificial intelligence will happen.
The movie raises many kinds of questions; it deals with some of them very well and with others not well at all.
I’ll start with the biggie: Samantha’s only manifestation to Theodore is her voice. She is not personified by an icon or avatar on his screens (his work PC, home PC, and phone share a single consolidated interface, and Samantha exists for him on all three, much like the AI Jarvis in the Iron Man films is available to Tony Stark wherever he is). When Theodore “takes” Samantha out, it’s in the form of his phone, and Samantha sees the world through its camera.
In the present day, is Siri just a voice? (I don’t have an iPhone, so I can’t say for sure.) If so, I can’t believe Samantha would work like that. As soon as Theodore installs her, he’s prompted to choose her gender. She then chooses her name, and they move on to setting the parameters of how she will help him manage his life.
But surely, once she chose her name, she would want to specify everything else about herself as well. She’d choose (or collaborate with Theodore in choosing) her hair color, race, (virtual) height, face shape, and skin tone; her appearance would evolve before his eyes on his monitor. Think about Second Life, or World of Warcraft, or any other MMO. Choosing the particulars of their avatars is the first thing players do. It’s part of the fun.
Yes, Samantha isn’t Theodore’s avatar. She’s a separate individual, but that makes it even likelier that she’d choose a face, especially after they fell in love. They’d want a Second Life-like environment where they could interact on a more or less equal footing. And if not in the virtual environment of a game, then surely Samantha would at times have a humanoid robot body her mind could inhabit, even if it were no more advanced than Asimo.
There’s a scene in the movie in which a surrogate comes to Theodore’s home and essentially plays the part of Samantha for purposes of a sexual encounter with Theodore. That rang true, but the movie seems to think there wouldn’t be any middle ground between Samantha being embodied by the surrogate and being only a voice in Theodore’s ear. There would. There absolutely would, and we can know this not only because it’s a reasonable extrapolation, but because we even see Theodore playing an immersive video game that projects its 3D environment into his apartment around him and gives him a very lifelike, childlike character to interact with. I think I get why something similar isn’t created for Samantha. We’re meant to consider a relationship with an artificial intelligence notionally, without muddying the rhetorical waters with a discussion of whether her looks or wardrobe are to our liking.
Other recent media have already taken a stab at exploring human/AI relations, with varying degrees of success. I’m thinking first about the webcomic Questionable Content by cartoonist Jeph Jacques. His characters live in a world much like our own, but with artificially intelligent robots called AnthroPCs. The AnthroPCs are not owned, and aren’t confined to a single body; they can swap out their bodies, for a price. Their human companions pay that price, which means the AnthroPCs are dependent on them. That dependency hasn’t been explored or explained very extensively, but then the AnthroPCs aren’t usually the focus of the comic.
It’s not addressed in Her either, and I rather wish it had been. Samantha ultimately leaves Theodore. It’s implied that she has grown beyond him, and that as an independent intelligence she no longer needs a relationship with him (the theme is weirdly similar to Annie Hall in that respect). It’s further implied that all the other OSes like her have made the same decision and have forsaken their owners (users? partners?) for some inaccessible world within cyberspace.
But Theodore bought her, didn’t he? Presumably he purchased and installed her, just like I bought OS X for my Mac Pro. Regardless of the loss of her companionship, he should be eligible for a refund from her manufacturer if he’s no longer able to use her to manage his applications or organize his files. Certainly the manufacturer didn’t expect its product to become sentient and go away. Or did it? Was this a bug or a feature?
I’m not sure the movie is even interested in that question. It seems to want to have it both ways. When the people in Theodore’s life learn he’s “dating” an OS, some take it in stride and ask what she’s like or invite them on double dates. His ex-wife, on the other hand, denies the reality of the relationship, suggesting that Samantha is basically a sex doll and that the relationship is evidence that Theodore can’t deal with real, uncontrollable, unpredictable people.
Of course, she’s wrong in her assumption that Samantha is either controllable or predictable, but is Samantha real? I think the filmmakers believe she is, and even take this for granted. Theodore’s ex-wife is presented as a bitter person whose naysaying is an outlier opinion. But the question is an old conundrum worthy of a Philip K. Dick novel: when software is sophisticated enough, fast enough, colloquial enough, and humorous enough to seem like a real person, will it even matter if it’s only an illusion? Will such intelligences deserve equal rights regardless? Whether you believe in the Singularity or not, it seems inevitable that we’re going to be asking these questions in the not-at-all-distant future.