Will Smith is an intriguing figure, sort of. Unlike his peers, he has
evolved with the culture that spawned him. Though once merely peeved by his
mother's fashion directives (1988's "Parents Just Don't Understand"), he has
grown into a mature artist who's willing to confront America's single greatest
threat: killer robots.
This summer, Smith will star in I, Robot, a
cinematic interpretation of nine stories by Isaac Asimov. When I was in the
sixth grade, Asimov struck me as a profoundly compelling figure, and I delivered
a stirring oral book report on I, Robot. The collection was punctuated by
Asimov's now-famous "Laws of Robotics":
1. A robot may not injure a human being or, through inaction, allow a
human being to come to harm.
2. A robot must obey orders given it by human
beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not
conflict with the First or Second Law.
4. Do not talk about Fight Club.
Now, I don't think I'm giving anything away by telling you that the
robots in I, Robot find a loophole in those principles, and they proceed to
slowly fuck us over. I, Robot was published in 1950, but writers (or at least
muttonchopped Isaac) were already terrified about mankind's bellicose
relationship with technology. This is a relationship we continue to fear. If we
have learned only one thing from film, literature, and rock music, it is this:
Humans will eventually go to war against the machines. There is no way to avoid
this. But you know what? If we somehow manage to lose this showdown, we really
have no excuse. Because I can't imagine any war we've spent more time worrying about.
The Terminator trilogy is about a war against the machines; so is the
Matrix trilogy. So is Maximum Overdrive, although that movie also implied that
robots enjoy the music of AC/DC. I don't think the Radiohead album OK Computer
is specifically about computers trying to kill us, but it certainly suggests
that computers are not "okay." 2001: A Space Odyssey employs elements of robot
hysteria, as do the plotlines of roughly 2,001 video games. I suspect Blade
Runner might have touched on this topic, but I honestly can't remember. (I was
too busy pretending it didn't suck.) There is even a German electronica band
called Lights of Euphoria; its supposed masterpiece is an album titled Krieg
Gegen die Maschinen, which literally translates as "War Against the Machines."
This means that even European techno fans are aware of this phenomenon, and
those idiots generally aren't aware of anything (except who in the room might be holding Ecstasy).
I'm not sure how we all became convinced that machines intend to
dominate us. As I type this very column, I can see my toaster, and I'll be
honest: I'm not nervous. As far as I can tell, it poses no threat. My
relationship with my toaster is delicious, but completely one-sided. If I can be
considered the Michael Jordan of My Apartment (and I think I can), my toaster is
LaBradford Smith. I'm never concerned that my toaster will find a way to poison
me, or that it will foster a false sense of security before electrocuting me in
the shower, or that it will align itself politically with my microwave. I even
played "Dirty Deeds Done Dirt Cheap" in my kitchen just to see if my toaster
would become self-aware and go for my jugular; its reaction was negligible.
Machines have no grit.
It appears we've spent half a century preparing for a war against a
potential foe who—thus far—has been nothing but civil to us. It's almost as if
we've made a bunch of movies that warn about a coming conflict with the
Netherlands. In fact, there isn't even any evidence that robots could kick our
ass if they wanted to. In March, a shadowy military organization called DARPA
(the Defense Advanced Research Projects Agency) challenged engineers to build a
driverless, autonomous vehicle that could traverse a 142-mile course in the
Mojave Desert; the contest's winner was promised a cash prize of $1 million. And
you know who won? Nobody. Nobody's robot SUV could make it farther than 7.4
miles. Even with the aid of GPS, robots are completely moronic. Why do we think
they'll be able to construct a matrix if they can't even drive to Vegas?
I suspect all these dystopic man-versus-machine scenarios are grounded in the fact
that technology is legitimately alienating. The rise of computers (and robots,
and iPods, and nanomachines that hope to turn the world into sentient "gray
goo") has certainly made life easier, but it's also accelerated depression. Case
in point: If this were 1904, you would not be reading this magazine; you would
be chopping wood or churning butter or watching one of your thirteen children
die from crib death. Your life would be horrible, but it would have purpose. It
would have clarity. Machines allow humans the privilege of existential anxiety.
Machines provide us with the extra time to worry about the status of our
careers, and/or the context of our sexual relationships, and/or what it means to
be alive. Unconsciously, we hate technology. We hate the way it replaces
visceral experience with self-absorption. And the only way we can reconcile that
hatred is by pretending machines hate us, too.
It is human nature to personify anything we don't understand: God,
animals, hurricanes, mountain ranges, jet skis, strippers, et cetera. We deal
with inanimate objects by assigning them the human qualities we assume they
would have if they were exactly like us. Consequently, we think of machines as
our slaves, and we like to pretend that these mechanized slaves will eventually
attempt a hostile takeover.
The truth, of course, is that we are the slaves; the machines became
our masters through a bloodless coup that began during the industrial
revolution. (In fact, this is kind of what I, Robot is about, although I assume
the Will Smith version will not make that clear.) By now, I think many Americans
are aware of that reality. I think any smarter-than-average person already
concedes that a) we've lost control of technology and b) there's nothing we can
do about it.
But that's defeatist. Openly embracing that reality would make the
process of living even darker than it already is; we'd all move to rural Montana
and become Unabombers. We need to remain optimistic. And how do we do that? By
preparing ourselves for a futuristic war against intelligent, man-hating
cyborgs. As long as we dream of a war that has not yet happened, we are able to
believe it's a war we have not yet lost.
But hey—maybe I'm wrong about all
this. Perhaps we humans are still in command, and perhaps there really will be a
conventional robot war in the not-so-distant future. If so, let's roll. I'm
ready. My toaster will never be the boss of me. Get ready to make me some toast.
Q+A: K. Eric Drexler,
the father of nanotechnology
Are you like me? Do you sometimes worry that tiny machines will consume
the earth in a sinister attempt to destroy our society? If so, the scenario you
fear hinges on the idea of "gray goo," a theory proposed by a futurist named K.
Eric Drexler. I called Drexler at his desk in Los Altos, California, to figure
out how anxious I need to be. —C. K.
ESQ: What is gray goo?
KED [sighs]: Gray goo is the hypothetical danger that would arise if
somebody built a machine that could create copies of itself using naturally
occurring materials and unleashed it on society. Now, this is not easy. Someone
would have to solve a very complicated engineering problem, build the machine,
and then turn it loose. It would then—theoretically—turn everything in the world
into a mass of nanomachines.
ESQ: Why would anyone do this?
KED: Beats me.
ESQ: I get the impression that this is something I don't need to worry about.
KED: This is the danger of alliteration. For example, I think half
of the people who talk about the "digital divide" do so because it has an
alliterative name; if you called it the "gap in the availability of computing,"
no one would care. People like the term "gray goo" because it has two g's.
There's nothing gray or gooey about it. And this is all my fault, really.
ESQ: Do you regret creating the term "gray goo"?
KED: It is pretty