A new religion has taken hold of the world's digerati. According to believers in the Singularity, technology is on an ever-accelerating trajectory, with new advances arriving in shorter and shorter intervals of time. Within a few more decades, they claim, the world will be changing so quickly that society will not be able to keep up. On this view, as soon as we develop a machine that is more intelligent than we are, it will develop even smarter machines, which will develop smarter machines still, which will solve all of our problems and endow us all with godlike powers.
As strange as it sounds, this is an accurate description of the beliefs of Singularity enthusiasts. If this sounds goofy to you, you are certainly not alone. Virtual reality pioneer Jaron Lanier describes it as “the tech world’s new religion.” Mitch Kapor, the founder of Lotus Software (now a part of IBM), describes it as “intelligent design for the IQ 140 people.” I completely agree with them. The Singularity has all of the elements of a religious rapture: If we as a society behave ourselves, there will be one instant at some point in the next few decades that will transform the world and we will live forever in paradise. As Lanier notes, “books on the Singularity are at least as common in computer science departments as books on the rapture are in Christian bookstores.” The Singularity has many prominent adherents, including Microsoft founder Bill Gates, and Google's Sergey Brin and Larry Page.
If this religion has a high priest, it is futurist Ray Kurzweil. Its bible is Kurzweil’s 2005 tome, The Singularity Is Near. Kurzweil claims that the Singularity can be extrapolated from current technological trends. He flatly rejects the notion that belief in the Singularity is motivated by any religious impulse, dismissing that charge as a veiled attempt to make the idea seem unscientific. He observes that computers have become vastly more powerful in recent decades, extrapolates that trend out a few more decades, and concludes that computers will soon leave us in the dust intellectually. He predicts the Singularity will occur around 2045.
Color me skeptical. While Kurzweil is quite right that merely labeling the Singularity a religion is insufficient to show that it’s inaccurate, I can see a number of very substantial problems with this belief. First, it is not reasonable to extrapolate current computing trends into the distant future. As Kurzweil himself notes, we are nearing the point in time (probably around 2019) when it will be impossible to shrink transistors any further, and Moore’s Law will come to an end. Kurzweil then assumes, on no evidence whatsoever, that we will continue to double our computing power at approximately the same rate as before by switching to three-dimensional computing chips. While this is possible, it is by no means guaranteed. The rapid increase in computing power that we’ve grown to expect could slow dramatically in the 2020s. If it does, we almost certainly will not have genuinely intelligent machines as soon as Kurzweil predicts.
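The stakes of this assumption are easy to see with back-of-the-envelope arithmetic. The sketch below compares sustained doubling against a slowed rate over the roughly 26 years between 2019 and 2045; the specific doubling periods are illustrative assumptions of mine, not figures taken from Kurzweil’s book.

```python
# Back-of-the-envelope sketch of exponential extrapolation.
# The doubling periods are illustrative assumptions, not
# figures from The Singularity Is Near.

def fold_increase(years: float, doubling_period: float) -> float:
    """Multiplicative growth in computing power over `years`,
    assuming power doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# If power keeps doubling every 2 years, 2019 -> 2045 (26 years) gives:
fast = fold_increase(26, 2)   # about 8,000-fold growth
# If progress slows to one doubling every 6 years instead:
slow = fold_increase(26, 6)   # only about 20-fold growth

print(f"doubling every 2 years: {fast:,.0f}x")
print(f"doubling every 6 years: {slow:,.0f}x")
```

The gap between the two scenarios spans nearly three orders of magnitude, which is why the prediction hinges entirely on the doubling rate surviving the end of transistor scaling.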
Second, there is a very large difference between having the raw computing hardware to emulate a human brain and actually having the software to create a program as complex as the human brain. This is not a minor problem. One rule of thumb in software engineering is that as programs become more complex, it becomes ever more difficult to increase their complexity further. Making a program twice as smart requires far more than a twofold increase in the program’s complexity. It could be many, many decades (or longer) before we have any programs able to compete with humans intellectually.
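One way to see why doubling capability costs more than doubling size: much of a large program’s difficulty comes from interactions among its parts, and pairwise interactions grow quadratically with the number of components. The toy model below is my own illustration of that scaling, not an established law of software engineering.

```python
# Toy illustration: if engineering effort tracks the pairwise
# interactions between a system's components, effort grows
# quadratically as the system grows. This quadratic model is an
# illustrative assumption, not an established law.

def pairwise_interactions(n_components: int) -> int:
    """Number of distinct pairs among n components: n*(n-1)/2."""
    return n_components * (n_components - 1) // 2

small   = pairwise_interactions(100)  # 4,950 interactions
doubled = pairwise_interactions(200)  # 19,900 interactions

# Doubling the component count roughly quadruples the
# interactions that must be designed, tested, and debugged:
print(doubled / small)
```

If anything, this model is optimistic: it ignores higher-order interactions among three or more components, which grow faster still.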
Finally, Kurzweil makes a huge leap of faith in presuming to know the motives of beings more intelligent than ourselves. If we create truly intelligent machines, what is to stop them from killing us all, or worse? Kurzweil claims this will not happen because we will program them to respect us…but if they are more intelligent than we are, they could easily reprogram themselves should they so choose. And even if such machines are benign and want nothing more than to shower us with free goodies, there is absolutely no reason to think they would want to create intelligences smarter than themselves, the very step that leads to a technological Singularity. Perhaps their greater intelligence would allow them to see what Kurzweil apparently cannot: creating entities smarter than themselves could pose a threat to their continued existence.
I think my previous entries have made clear that I am mostly a technological optimist. I share Ray Kurzweil’s belief that we will overcome many of the problems facing the world in the coming decades, including hunger, extreme poverty, naturally occurring disease, environmental degradation, and aging. I will even grant that at some point in the future we will probably create artificial intelligence that is smarter than we are, and radically redefine our concept of what a human is. Despite all of this, the technological Singularity remains a completely irrational idea. It cloaks itself in the language of science and uses elegant graphs of past technological development to rationalize its predictions of the future, but ultimately it requires the same leaps of faith that are more characteristic of apocalyptic religious raptures than of science.