Monday, September 13, 2010

Moore's Law and Ubiquitous Computing

A single iPhone contains more computing power than the guidance computer that flew the Apollo 11 mission to the moon. It also exceeds the total computing power available to all the militaries of all the nations in World War II. It is no exaggeration to say that a single iPhone dropped into 1940 could have dramatically altered the outcome of the war.

The co-founder of Intel, Gordon Moore, made a stunningly accurate prediction in 1965. He noted that the number of transistors per integrated circuit had been doubling every year, and he expected the trend to continue for at least ten more years. Moore’s Law, as it is now known, is still going strong 45 years later. This exponential growth in computer hardware has proven so consistent that we have come to take it for granted; it has continued unhindered through booms and busts, war and peace. Approximately every 12 to 18 months, the amount of computing power that a given sum of money can buy doubles. This has been accomplished by making transistors ever smaller, so that more of them fit on a single integrated circuit. Just as it has since Moore’s original prediction, we can reasonably expect the trend to hold for another ten years.

But after about 2019, we will hit a wall. By then, transistors will be only a few molecules across, and quantum effects such as electron tunneling will make it impossible to shrink them any further. Fortunately, computer engineers have already found a way to keep computing power growing beyond that point. Most computer chips today are flat, but there is no reason they have to be. Once we can no longer cram more transistors onto a flat integrated circuit, we will still be able to stack chips upward into the third dimension. This introduces another problem, however: three-dimensional chips produce far more heat than flat chips do, and if engineers stack too many tiny transistors on top of one another, the chip could fry itself. Although clever solutions to the heat problem are in the works, there are many skeptics, including Gordon Moore himself. While we can expect the raw power of computer chips to keep increasing beyond 2019 as they grow upward, it remains to be seen whether we will still be able to double that power every 12 to 18 months in accordance with Moore’s Law.

With our computing power doubling every 12 to 18 months for at least the next decade, we can expect the computers of 2020 to be 100 to 1,000 times more powerful than equally priced computers today (ten doublings at the fast end yields a factor of about 1,000; six or seven doublings at the slow end yields a factor of about 100), just as today’s computers are about 1,000 times more powerful than those of ten years ago. This will profoundly transform the world. A thousandfold increase in computing power means far more than search engines that run a thousand times faster. It opens up a wide array of new applications that no one would have even attempted before.
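The range in that estimate follows directly from the doubling rate. As a rough sanity check (a sketch of the compounding arithmetic, not a forecast), the math can be worked out in a few lines:

```python
def growth_factor(years, months_per_doubling):
    """Overall multiplier in computing power after `years`,
    assuming one doubling every `months_per_doubling` months."""
    return 2 ** (years * 12 / months_per_doubling)

# One decade at the fast end of the range (a doubling every 12 months):
fast = growth_factor(10, 12)   # 2**10 = 1024, roughly a thousandfold

# One decade at the slow end of the range (a doubling every 18 months):
slow = growth_factor(10, 18)   # 2**(120/18) ≈ 102, roughly a hundredfold
```

The same function applied to the previous decade gives the same hundredfold-to-thousandfold spread, which is why "about 1,000 times more powerful than ten years ago" is at the optimistic edge of the 12-to-18-month doubling assumption.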

The Information Age can be roughly divided into three epochs: mainframe computing, personal computing, and ubiquitous computing. The era of mainframe computing lasted from roughly 1946 to 1977 and was dominated by enormous computers tended by large staffs. The era of personal computing, the second epoch, lasted from roughly 1977 until the present; in this era, individuals could finally afford their own computers. We are now entering the third epoch: the era of ubiquitous computing, in which there will be many computers for each person. In addition to our PCs, many of us already carry smartphones, portable music players, and e-readers. Computers will soon be woven so thoroughly into the fabric of our world that we will rarely even notice them. Virtually every machine, every wall, and every article of clothing will contain computers.

Although this constant connectedness will certainly take a toll on our privacy, it also has many benefits. Computers that constantly monitor our health will be able to call 911 automatically whenever we are having an emergency, possibly before we are even aware of it ourselves. Ubiquitous computing will finally enable driverless cars, which promise to save thousands of lives per year, reduce traffic and pollution, and reduce the need to personally own a car. For a change of scenery, interactive displays on our walls will be able to cycle through a preselected assortment of images. Ubiquitous computing will also enable truly smart homes, in which every appliance is connected to the others and to the internet, and can alert you when it is time for a repair or replacement. Just as we have come to expect any building we enter to have electricity and plumbing, we will soon expect any building to have internet access and to be connected to the outside world through computers woven unnoticeably into its walls, ceilings, and floorboards.

The exponential growth of computer hardware described by Moore’s Law has been the single most important driving force in technology for the last half-century, and it has at least another decade to go. As computers grow ever more powerful, things that seemed virtually impossible just a decade ago are beginning to look mundane. It raises the question: what seems virtually impossible today that will look mundane in 2020?

PREDICTIONS:

By 2015 – Effective smartphone applications exist that can turn lights on and off and start or stop home appliances.

By 2016 – Personal health monitors, ingested or worn, can automatically call 911 whenever a person's vital signs indicate an emergency.

By 2017 – The average American carries at least ten computing devices on (or inside) his or her person.

By 2018 – Smart walls, which can display any image the user wants at a given moment or cycle through a series of posters, are becoming popular.

By 2022 – Silicon computer chips are no longer flat; because transistors can shrink no further, chips have gone three-dimensional.
