Culture At Large

Driver-less cars and the rise of moral machines

Nick Olson

Near the end of a New Yorker piece titled “Moral Machines” appears something like a thesis: “As machines become faster, more intelligent and more powerful, the need to endow them with a sense of morality becomes more and more urgent.” But that statement raises a question: can conscience be encoded in machines?

The article, by Gary Marcus, begins by acknowledging that driver-less cars are now legal in three states. Marcus then suggests that “eventually (though not yet), automated vehicles will be able to drive better, and more safely than you can; no drinking, no distraction, better reflexes and better awareness (via networking) of other vehicles.” The piece then transitions into a discussion of robotic soldiers, assuming that drone warfare is only the first wave.

With the example of driver-less cars, it’s not difficult to imagine a scenario in which automobile transportation is something like a highly sophisticated, computerized public transit. And one can imagine our robo-warfare developing beyond drones into machines more closely resembling human soldiers in awareness and perhaps appearance - programmed with directives to shoot or not shoot in certain situations. But in each of these examples we’re imagining machines with programmed systems that have ethical implications, which is something quite different from coding machines with the ability to choose to act morally. The machines would be wholly dependent on the ethical content that we give them. Does this really bring our robots close to being human or behaving morally? Can ethics be whittled down to “conscience,” “self-awareness” and other, similar modern-ethical terminologies?

The troubling part of an otherwise fascinating piece comes near the end, when Marcus offers this telling lament: “The thought that haunts me the most is that human ethics themselves are only a work-in-progress. We still confront situations for which we don’t have well-developed codes (e.g., in the case of assisted suicide) and need not look far into the past to find cases where our own codes were dubious, or worse (e.g., laws that permitted slavery and segregation). What we really want are machines that can go a step further, endowed not only with the soundest codes of ethics that our best contemporary philosophers can devise, but also with the possibility of machines making their own moral progress, bringing them past our own limited early-twenty-first century idea of morality.”

What’s disturbing here is the baseline assumption that human moral failure might be characterized as something like a lack of formal exactness - the absence of a more attuned awareness that our future machines might be able to achieve for us, if they can move beyond what our philosophers “devise.” You might say the expectation is that a kind of revelatory Christ-machine will one day show itself, leading us into heavenly efficiency, precision and that treasured height of consciousness: total self-awareness.

But I’m willing to wager that we’d come to find that we’re actually in need of a second coming, one that might restore us to those lost terminologies: “character,” “virtue” and “narrative,” for instance. Maybe humanity’s ethical deficiencies aren’t a lack that machines can fix, because the problems are essentially human.

Machines can be a help to human beings in countless ways. But there’s an assumption implicit in the suggestion of robotic ethics that human moral deficiencies are not a matter of good and evil, but of the absence of education, therapy or technical ability. We’re deceived if we think efficiency, precision and heightened awareness can restore righteousness to the human situation, for these are not the primary qualities of goodness or what it means to be human.

Goodness revolves around a telos - a baseline purpose and imperative - to love one another. Who can give us this purpose but a creator God who is love? Who can make this purpose into an imperative but a creator God who is love? To resolve our ethical deficiencies, we need a human savior who is God - not RoboCop.

Topics: Culture At Large, Science & Technology, Technology, Gadgets