Artificial intelligence (AI) is back in the news with a game-show win against humans to add to its scorecard.
Among the loudest voices expounding on the win are those who see in it a harbinger of the Singularity, the moment when AI surpasses human intelligence. Media outlets from The Atlantic to PBS' Charlie Rose are trying to unravel the claims of that movement's most prominent proponent, Ray Kurzweil, for a confused public. Christians in particular may wonder whether their theology allows for the Singularity and how to respond to Kurzweil's claim that it will lead to eternal life.
In a recent Time cover story Lev Grossman argues, “[y]ou may reject...the Singularitarian charter, but you should admire Kurzweil for taking the future seriously. Singularitarianism is grounded in the idea that change is real and that humanity is in charge of its own fate.” This reduction of the movement to its core aspirations frames the primary question Christians should ask: is this quintessentially humanist manifesto as compelling and admirable as Grossman believes it is?
Coined by Vernor Vinge and popularized by Ray Kurzweil, the Singularity has ebbed and flowed in public interest, buoyed by high-profile displays of AI such as the recent win on "Jeopardy!" by IBM's Watson computer. Borrowing language from astrophysics, Vinge and Kurzweil describe the Singularity as that moment in time at which technological progress effects a paradigm shift so significant that it creates an event horizon. This shift, they claim, will result from AI that exceeds human intelligence, which will so radically alter what is possible that later events cannot be predicted. Moreover, Kurzweil argues that the Singularity is certain, an inevitable consequence of exponential increases in technological capacity.
Leaving aside the critique that Kurzweil's predictions are often dubious, is the possibility of superhuman AI in accord with Christian anthropology? This goal is tied to the Strong AI position, which posits that cognition is computation and therefore a computer could have a mind similar to that of a human. While Selmer Bringsjord argues Strong AI is “simply silly” and doesn't deserve formal refutation, arguments over Strong AI quickly become highly technical. Christians often reject Strong AI on the theological ground of the special anthropological status of human beings as the bearers of the Imago Dei. However, Russell C. Bjork has argued that Christian theology in no way precludes the aims of Strong AI.
Intertwined with Kurzweil's views is a hope that human beings can achieve immortality by “uploading” their minds to AI systems. It is this view that raises many concerns. “[T]he idea of significant changes to human longevity — that seems to be particularly controversial,” Kurzweil observed in Time. “People invested a lot of personal effort into certain philosophies dealing with the issue of life and death. I mean, that's the major reason we have religion.” Indeed, Singularitarianism itself amounts to a religion, as Robert M. Geraci recently suggested, with “sacred texts” forming the basis for a belief system that establishes a worldview and promises salvation. This alone is problematic, as such a religion exists in opposition to Christian faith.
What does this suggest with respect to the original question: how should we respond to Singularitarianism's call to use technology to shape our future? Attempting to overcome death through technological means is hardly new; it is a central theme in the oldest written text, the Epic of Gilgamesh, and one echoed through much of Genesis. But Genesis also establishes the cultural mandate, which affirms the inherent good of human technological endeavors. A biblical view holds these two in tension, recognizing human beings as cocreators with God and yet creatures subject to our Lord.
Limited simulations of cognitive function (Weak AI) are possible, as Watson demonstrates, and rapidly increasing in capacity. With time these technologies will become increasingly accessible via near-seamless interfaces — cognitive equivalents of cochlear implants. In past conversations with me, Nigel Cameron has argued that technological augmentation of human bodies is appropriate so long as it is used to restore a lost capability or to enhance an existing capacity in degree, rather than alter it in kind. Yet the history of human technology — whether planes or cellphones — is one of increasing capacity in kind. The test of technology, I propose, is whether such innovations seek to further our mastery of the world or to establish us as lords over ourselves.
I, for one, welcome our new computer helpers, but ask us to remember that we are not our own Lord.
Jason E. Summers is the Chief Scientist at Applied Research in Acoustics LLC, a research and development firm based in Washington, D.C. His work includes development of physics-based simulation for training and cognitive-based signal-processing algorithms for machine learning. You can find more of his personal views on Twitter (@jasonesummers).