Ever since Robby the Robot, we humans have been fascinated by machines that can look and behave like we do. Most scientists agree that Artificial Intelligence is, if not already upon us, only a short time from pervading our lives. Is it fantasy to believe that a computerized robot could soon be your best friend? Sega, the Japanese toy manufacturer, says they’ve already done it. EMA, short for Eternal Maiden Actualization (gag), is a 15-inch-tall battery-powered companion for lonely men. She even kisses on command. (Lonely, desperate, and sadly mistaken about interacting with real women.)
Toys aside, it doesn’t seem like too much of a stretch to believe that clever programmers can create systems that respond as we do to various stimuli in our environment: a touch sensor in the hand registers the heat of the stove, the hand jerks back, and the machine vocalizes “ouch!” In fact, as far back as 1950, Alan Turing (the brilliant English mathematician and computer scientist who was convicted of “gross indecency” for being gay, chemically castrated, and stripped of his security clearance – a short-sighted blunder on the part of the British “intelligence” authorities) proposed a test of machine intelligence. It works like this: a human at a terminal engages in two conversations. One is with another human at a hidden terminal; the other is with a computer program. The object is to determine which is which. With good programming, it turns out to be a lot harder than you would think.
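Turing’s setup can even be sketched as a toy program. This is a loose illustration only – every name and respondent below is invented for the sketch, and no real chatbot is this easy to unmask:

```python
import random

# A toy sketch of Turing's imitation game. Every name here is invented
# for illustration; this is not any real testing framework.
def imitation_game(questions, human_reply, machine_reply, judge):
    # Hide the identities: terminals A and B are assigned at random.
    pair = [("human", human_reply), ("machine", machine_reply)]
    random.shuffle(pair)
    transcripts = {
        terminal: [(q, reply(q)) for q in questions]
        for terminal, (_, reply) in zip("AB", pair)
    }
    # The judge sees only the transcripts and names the terminal
    # believed to be the machine.
    guess = judge(transcripts)
    actual = "A" if pair[0][0] == "machine" else "B"
    return guess == actual  # True means the judge caught the machine

# Toy respondents: a hedging human, and a machine with a telltale canned reply.
human = lambda q: "Hmm, let me think about that..."
machine = lambda q: "QUERY NOT UNDERSTOOD"
# A naive judge who picks whichever terminal gave the robotic-sounding answer.
naive_judge = lambda ts: next(t for t, conv in ts.items()
                              if any("QUERY" in a for _, a in conv))

caught = imitation_game(["What is love?"], human, machine, naive_judge)
print(caught)  # → True
```

The point of Turing’s game is that a machine passes only when the judge does no better than chance – and our crude machine here is caught every time.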
But what about face to face? Remember HAL, the computer in 2001: A Space Odyssey? When he said “I’m sorry, Dave. I’m afraid I can’t do that,” you felt like it really pained him to betray his human friend for the good of the mission. And as his memory modules were pulled out and he began to regress, he admitted that he was afraid. Can a computer be afraid? Can a machine be programmed to feel human emotions?
Yes, in a way. Well, more accurately, a machine can be programmed to display behavior that we associate with emotion. But then so can a psychopath. Is there more to human emotion than saying the words, striking the postures, maybe even shedding a few tears? Perhaps not. Maybe all of that is just behavior that has suited our species in the Darwinian sense. Maybe when we listen attentively to a friend in need, it’s nothing more than a pragmatic strategy. We help them, someday they help us. Like a parrot saying “Good morning” but not really meaning it. (Apologies to parrot lovers.)
Most people emphatically reject such a notion. But then, what else can we conclude? What a blow to our egos if we are not set apart from the animal world by thinking, feeling, loving, and all the other supposedly unique human traits. What if a run-of-the-mill computerized robot can do all of those things just as well as we can?
If that’s true, then my search for the perfect lover is almost over. I’ll be able to buy him at Kmart in the near future. But that’s just a little bit more cynical than even the Bitter Old Queen can embrace.