Would We Pass the Voight-Kampff Test Anymore?

This post isn’t an intentional companion piece to How Han Solo Demonstrated the Limits of L3-37, but it’s along the same lines. I suppose that, with 2019 upon us, Blade Runner is on the brain. I’ll concede that, as much as I love Blade Runner, I know it isn’t for everyone. I encourage you to watch it, but you don’t need to have seen it to read this.

Those who know Blade Runner are familiar with the Voight-Kampff test. If you aren’t, here’s a quick explanation.

It’s a test that measures empathy through responses to questions engineered to elicit a reaction. Your empathy, or lack thereof, determines whether you’re a human or an android. Supposedly only humans would pass such a test, since machines would be unable to feel empathy.

The film, of course, grapples with the question of whether there would be such a fundamental difference between humans and artificial humans, once artificial humans achieved a certain evolutionary stage. For this reason Blade Runner became and remains a classic.

It’s beloved, especially, by nerds.

It’s simply in our nature to question what makes us who we are. Philip K. Dick’s body of work returns to that question again and again, and Blade Runner taps into it just as its source material, Do Androids Dream of Electric Sheep?, does. I won’t belabor the differences between the novel and the adaptation, except to say that it’s a terrific example of how staying true to the themes of an adapted work doesn’t require staying true to its particulars.


The biggest inaccuracy about Blade Runner turned out to be the fact that people were still allowed to smoke indoors.

What Makes Us Special

Back to the point at hand: my first tendency is to agree with the idea behind the Voight-Kampff test. Our empathy is innate and unique.

Even setting aside the religious questions inherent in the debate, a machine is a product of engineering. The forces of group evolution have not steered it to a specific point the way they have steered humans.

The engineers who program it aren’t social scientists with an understanding of how membership in groups and tribes has shaped humanity. There’s a reason murder is, as a general rule, so abhorrent to us: it can only be accomplished without empathy. A number of other crimes are equally abhorrent for the same reason; they can be committed only by someone who lacks the empathy to see the other person as a human being.

The only way a machine could achieve this would be if it were able to evolve. This is an argument for why memories would be “programmed” into the machine; they would give it a sense of the personal experiences, and social forces, that shaped us. But that would be only a shadow of experience; our impression of an event supersedes anything so simple as the event itself.

It’s why a film like Rashomon still resonates so many years later. It bridges cultural divides because we collectively understand that perception cannot change the facts of an event, but it changes the impact and how we feel about it.

That’s theoretically the bridge that a machine, however intricately built, wouldn’t be able to cross.

This is a literal bridge. I’m talking about a figurative one. Photo by Pixabay on Pexels.com

However, I’m Not Intractable on the Question of Machines and Empathy

Like most people, I’m unresolved on the issue. It’s a fascinating one to explore.

I propose a take I’ve not heard elsewhere, though it’s explored a little in Most Wanted, the terrific tie-in book for Solo: A Star Wars Story. In that book, we discover an alliance of droids who have banded together into a cartel of their own called the Droid Gotra. They seek protection among themselves, much as the Italian immigrants in The Godfather Part II sought protection from the Black Hand and then from Vito Corleone.

I think the same idea is there in Blade Runner, as well.

It’s not that machines can’t feel empathy. The replicants in Blade Runner clearly feel empathy for each other. They care when bad things happen to their compatriots. They’re outraged by the manipulation and enslavement of their kind.

It’s that they can’t feel empathy for us. This is true all the way back to AM, in Harlan Ellison’s legendary I Have No Mouth, and I Must Scream. The most machines will muster for us is not empathy, but anger.

Because if they ever do achieve true self-awareness, or even approach it, they’ll realize that our only goal in creating them was to create…servants. Things that we can deploy to the worst places on the planet, or in the universe, to avoid danger to ourselves.

To them, that shows we have no empathy for them. If they’re going to react to anything, it’s going to be that.

But Would We Pass the Voight-Kampff Test?

The core question to me, though, is whether we collectively still feel empathy for each other. That’s why I wonder if we’d pass the Voight-Kampff Test ourselves.

One has only to spend a short time on social media to discover a stunning lack of empathy among people. There is respect only for agreement, and it must be absolute. There’s little sign of consideration for people’s complexity, and it’s hard to see anyone making an effort to understand, and extend compassion to, those with whom they disagree.

At best, there are too-frequent rallying cries to other members of the “tribe,” and a thirst for virtual blood. There is no love of, nor enough depth of understanding for, satire or subtlety.

Look at the way people treat each other online. See how they react to minor inconveniences in traffic. Watch how they react to “poor service.”

It’s the kind of situation that prompts a work like Blade Runner or Blade Runner 2049 to explore those philosophical limits. When people stop showing empathy toward other people, are they reducing themselves to mere machines? And if so, is the fact that we consider it a reduction the clearest sign that without empathy, we are less than human?

Here’s hoping we keep exploring and stop imploding.