Extrapolating from Moore’s Law is Ray Kurzweil, the renowned inventor and futurist — he does most of his mind-bending cogitation at Kurzweil Technologies in North Andover — who sees us fast approaching a technological critical mass.
Describing his own “Law of Accelerating Returns,” Kurzweil writes on his Web site that “we won’t experience 100 years of progress in the 21st century — it will be more like 20,000 years of progress (at today’s rate).” Within a few decades, he maintains, “machine intelligence will surpass human intelligence, leading to The Singularity — technological change so rapid and profound it represents a rupture in the fabric of human history. The implications include the merger of biological and nonbiological intelligence, immortal software-based humans, and ultra-high levels of intelligence that expand outward in the universe at the speed of light.”
Is your mind sufficiently blown?
Kurzweil’s vision for the future, if a little hard to wrap one’s head around, at least sounds reassuringly sanguine. (Publishers Weekly calls him “technology’s most credibly hyperbolic optimist.”)
But as Bill Joy, a co-founder and chief scientist of Sun Microsystems, asked in his famous 2000 Wired article “Why the Future Doesn’t Need Us,” should we be banking not so much on Moore’s Law as on Murphy’s? With technology and innovation unfolding so blindingly fast, it would seem an awful lot could go wrong along the way. In that article, Joy argued that such technologies as “genetics, nanotechnology, and robotics” (GNR) could imperil mankind, leading to “whole new classes of accidents and abuses.”
One of his worries was that the rise of “superior robots” might edge out their creators. Noting that “biological species almost never survive encounters with superior competitors,” Joy (joylessly) painted a grim future where “robotic industries would compete vigorously among themselves for matter, energy, and space.” Humanity would be pushed to the margins, and eventually “squeezed out of existence.”
Kurzweil doesn’t see it like that. Rather than humanity facing extinction, he argues in his best-selling 2005 book, The Singularity Is Near: When Humans Transcend Biology (a combo sci-fi/documentary film version is due out later this year), that the very essence of humankind will soon be augmented and improved by GNR technologies, “transcending biology” as we gain extraordinary intelligence and durability.
(Kurzweil, who’s 60, believes we’re very near to this profoundly different post-biological era: he pegs its onset at 2045. As such, he’s doing everything he can to stay alive to see it happen, including downing dozens of vitamin supplements each day.)
Kurzweil’s predictions — and he’s got a good track record so far — had better come to pass, and fast. Indeed, some argue, becoming “immortal software-based humans” may be the only way to keep us from being conquered by our own mechanical creations. None other than world-renowned theoretical physicist Stephen Hawking has said that, with computing power doubling regularly — much, much faster than our own evolution — it’s imperative that humans alter themselves via genetics and cyber technology, lest we be outpaced permanently. Otherwise, he told a German magazine, “the danger is real that [computer] intelligence will develop and take over the world.”
The gigadeath scenario
Consensus on enormous issues such as these — fundamental questions about what it is to be human and what humanity’s place on the globe should be — promises to be difficult to come by. You thought the battle over stem-cell research was bad? Just wait until the Artilect War.