Is Artificial Love True Love?

There are two kinds of science fiction films: one kind makes me go “WOW,” while the other makes me pause every five minutes and ponder why. Why does he or she do that? Why does that happen? Is there another way? If yes, what is it? If no, why not? … Steven Spielberg’s A.I. Artificial Intelligence is one of the latter. To ask questions is to dig up the assumptions buried in the film. To question those assumptions further is to track down the allegorical connections between the fictional world and the real world. In this paper, I’ll discuss three questions that arose during my viewing.

Question #1: Why is there a protocol for imprinting love, but no protocol to erase love?

Most appliances (even Buzz Lightyear in Toy Story) have a way to restore factory settings. If such a button existed, Monica would only need to press it: she would not feel so sad about leaving David in the forest, and David would not be so heartbroken. But Dr. Hobby gave him no such erasing function, which is absurd from a product-development point of view (a toy sketch of this missing “reset button” follows the list below). The only explanation is that Hobby (and Spielberg) just didn’t want David to have one, because:

  1. Hobby is less a scientist than an experimentalist. He wants to play God and test how far his creatures can go. As he puts it, “didn’t God create Adam to love him?” He wants his creatures to love him, just as Adam loves God, and he does not want to give them any way to undo that unconditional love. Even though Hobby made David in the likeness of his own son, deep down he considers David forced labor with no right to escape his enslavement, even though, ironically, this enslavement means living in a middle-class house and being “loved” by a “mother.”
  2. Another possible reason is that Hobby wants David to be more human, since humans cannot simply erase their love for someone. Even the powerful Lacuna procedure in Eternal Sunshine of the Spotless Mind can’t completely erase the traces of love. The problem is that the production of David’s love is entirely different from that of human love: it is commoditized, industrialized, and mass-produced. That leads me to the next question.
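
To make the design gap concrete, here is a playful sketch of the imprinting interface as the film presents it. Everything here is my own invention (the film never specifies how the protocol is implemented, and I am recalling the seven code words from memory); the point is only that imprint has no inverse:

```python
class Mecha:
    # The seven activation words, as I recall them from the film
    # (the essay only calls them "seven random words in a certain order").
    IMPRINT_CODE = ("Cirrus", "Socrates", "Particle", "Decibel",
                    "Hurricane", "Dolphin", "Tulip")

    def __init__(self):
        self.imprinted_on = None  # loves nobody until the protocol runs

    def imprint(self, speaker, words):
        """One-way operation: bind unconditional love to whoever
        speaks the code words in the right order."""
        if self.imprinted_on is None and tuple(words) == self.IMPRINT_CODE:
            self.imprinted_on = speaker  # permanent: no inverse is defined

    # Conspicuously absent: factory_reset(), un_imprint().
    # The only "off switch" Hobby ships is destroying the unit.
```

Any sane product team would ship the missing methods; their absence is a narrative choice, not an engineering one.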

Question #2: Does David really love Monica?

Deborah’s discussion of whether Joel might only love an idea of Clementine inspires me to question the same assumption in A.I. After everything David has done, I feel cold-blooded questioning his love for Monica. But I just cannot forget where this love came from: Monica reading seven random words in a certain order. Did I come to love my mom this way? Definitely not. It took years of caring, and even fighting, to build the love between my mom and me. How can seven words achieve the same effect? It seems David could “love” anyone on the street, as long as that person read the seven words to him. In other words, it is sadly possible that David never loved Monica at all; he loves whoever reads the seven words in the right order. Admittedly, he did many things for Monica, but I am still not fully convinced. More likely, he is simply programmed to mimic the behaviors humans perform out of love, cleverly enough to pass the Turing Test. There might be a section of code on David’s mainboard telling him that one way to show love to someone is to find a way to make her love him back. As John Searle’s Chinese Room argument goes, a program built to exhibit human-like intelligent behavior cannot be considered truly intelligent, because it does not understand what it is doing, just as a person who answers questions in Chinese merely by following a set of explicit rules written in English cannot be said to understand Chinese.
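
To illustrate this suspicion in code, here is a minimal sketch of the kind of rule-following Searle describes. It is entirely my own illustration (neither Searle’s formulation nor anything from the film): a lookup table that produces “loving” behavior with no inner state that could count as feeling:

```python
# A Chinese-Room-style rule follower: observations map to "loving"
# behaviors by pure lookup, with nothing inside that understands.
LOVE_RULEBOOK = {
    "Monica looks sad": "hug her",
    "Monica leaves the room": "follow her",
    "Monica is in danger": "protect her",
    "Monica's love seems to fade": "find a way to make her love you back",
}

def davids_mainboard(observation: str) -> str:
    """Return the behavior a loving child would perform.
    The lookup is the whole story; there is no feeling to consult."""
    return LOVE_RULEBOOK.get(observation, "watch and wait")

print(davids_mainboard("Monica looks sad"))  # -> "hug her"
```

From the outside, a rich enough rulebook is indistinguishable from love, which is exactly why passing the Turing Test feels inadequate here.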

But Searle never explained what exactly is missing from the program’s “understanding.” So another me fights back: how do I know I am not merely mimicking other people’s “loving” behavior toward my mother, just as David did toward Monica? After all, the behaviors of caring and loving, and even the concept of love itself, are mostly learned from social conventions, even if some are inherited through my DNA. What is the essential difference between David’s seven-word protocol and my years of building love for my mother? Probably not much. People can fall in love at first sight, or with a few words or gestures: Jack and Rose in Titanic, for example. But if it is not just a matter of the size of the training set, what is the difference between human love and mecha love?

This raises a further question: what on earth is the nature of love? At the beginning of the film, a man says that the question is not how to build robots that love humans, but whether humans can love them back. At first, I thought this question was easy. Humans certainly can love them back! We always love entities outside our species, even those that don’t exist. We love cats and dogs; we love gods; we love fluffy toys (I love Teddy!); we love donuts… But after finishing the film, I began to question myself. Is that really love, or just another way of saying “enjoy”? What does it mean to love back a robot that loves you? Does it mean we need to take care of it? What if it does not need care, like David, who needs neither food nor sleep? Does it mean we need to satisfy its emotional needs and be gentle with it? The problem is whether it really has emotional and psychological needs. If it does, are those emotional needs coded by some programmers? And if so, is a line of code worth caring about? Do we hold moral responsibilities toward a pencil?

Question #3: If David is designed to love, is it reasonable to assume he knows how to hate?

This question was raised by Henry. I think it is a really good question, and I certainly believe the answer is yes. Hatred can arise from jealousy, competition, the desire for possession, and other things entangled with love. Martin hates David for competing with him for Monica’s love, so he tricks David into cutting off a lock of Monica’s hair in the middle of the night. I was surprised: how can a kid act so insidiously? But when I saw David frantically destroy another David in the office at Cybertronics headquarters while screaming “I’m David; I’m special; I’m unique,” I suddenly realized that this is exactly the hatred incubated in love. David was angry not because his sense of identity was lost. He was angry because he believed his non-uniqueness would undermine Monica’s love for him, a love he thought was grounded in his being unique. Upon seeing the other Davids in boxes, he even hated himself for being one of them, so he attempted suicide. He knows how to hate, even if he has not realized it yet.

One more thing I found interesting: the moment when David discovers the other “Davids” echoes many other films in which protagonists are shocked to find other versions of themselves. Carolyn Jess-Cooke has discussed this as well. In Alien: Resurrection, Ripley finds the failed “Ripleys” in a lab. In Triangle, Jess finds a pile of her own bodies on the deck. The characters in Coherence see other versions of themselves in the house across the street, not to mention The 6th Day and Predestination. These moments reveal a truth to the protagonists: either they are trapped in some loop, or they are products assembled in a factory. Sameness is the nature of capitalist manufacturing; even the feeling of being unique is replicable. We live in a society full of homogeneous products, yet somehow we are afraid of becoming one of them. In this fear rests some of the meaning of being a human rather than a pencil. The problem is where we should place our creations on the continuum between humans and pencils. Or is it even a continuum, rather than a dichotomy?
