The second part of an article from Computer Business Review.
By Joanna Mancey
Carver Mead, professor of engineering at the California Institute of Technology, is one of many who believe Moore's Law – that processing power doubles every 18 months – is set to reach its physical limits early in the 21st century. That will create the need for new kinds of computer that can keep computational capability growing exponentially once conventional silicon scaling stalls. The candidate technologies include very large scale parallel computers, quantum computing, optical computing and even DNA computing. Most of these are as futuristic as today's microprocessors would have been 50 years ago.

Quantum computing, still in its early stages, involves doing calculations at the subatomic level, where quantum bits, called qubits, can represent zero and one, on and off, or open and closed, all at the same time. Optical computing uses photons, the particles that make up a beam of light, instead of electrons. The advantage is that photons, unlike electrons, can pass through each other without interacting, a characteristic that can be used to increase processing speeds. DNA computing, arguably the most complicated of all, performs calculations with artificially synthesized strands of DNA. One pound of DNA molecules in a gallon jar contains the combined computing potential of all the computers we have ever known, says Joel Birnbaum, head of Hewlett-Packard Co's research laboratories.

But the ultimate goal of computer science is to emulate the human brain. Peter Cochrane of British-based BT Labs estimates that the brain has a processing power of one thousand million million operations per second (one petaops) and a memory of ten terabytes.
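Two of the claims above have simple formal versions worth pinning down; both are textbook material rather than anything from the article itself. A qubit's "zero and one at the same time" is a weighted superposition of the two basis states, and a register of n qubits spans 2^n of them at once:

```latex
% Single qubit: a weighted superposition of the two basis states
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1
% An n-qubit register carries amplitudes over 2^n basis states simultaneously
|\Psi\rangle = \sum_{x=0}^{2^n - 1} c_x\,|x\rangle
```

The 18-month doubling rule also puts a rough date on Cochrane's petaops figure. Assuming, purely for illustration, a mid-1990s processor delivering about 10^9 operations per second:

```latex
% Capability after t years of 18-month doublings
C(t) = C_0 \cdot 2^{t/1.5}
% Doublings needed to reach one petaops from ~10^9 ops/s
10^{15} / 10^{9} = 10^{6} \approx 2^{20}
\quad\Rightarrow\quad t \approx 20 \times 1.5 = 30 \ \text{years}
```

On that arithmetic, brain-class processing rates sit only a few decades out, which frames the predictions that follow.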
If current trends continue, says Gordon Bell, senior researcher at Microsoft Corp, computers could approach those capabilities by 2047, leading to the creation of 'on body' assistants – computers able to recall everything a person sees, hears and reads.

Many industry visionaries believe that advances in technology will lead to a new genre of ultra low-cost, specialized system-on-a-chip computers, which Bell calls MicroSystems. Thousands of these one-chip, fully networked systems, he says, will be embedded in everything from phones and cars to the walls of buildings and roads. He also believes the systems will be put to medical use, implanted in humans to act as the eyes and ears of the physically impaired. By 2002 we expect a PC-on-a-chip with at least 32MB of RAM, video and audio I/O, built-in speech recognition and industry standard buses for mass storage, LAN and communication, says Bell.

The ultimate aim, say scientists, is to build a computer that complements man. Computers have improved by a factor of millions in price, performance, capacity and capability in their first 50 years, writes Bob Frankston, a software specialist and ACM fellow, in Beyond Calculation. The next stage shifts the focus from what we can do in the computer to what we can accomplish with it as part of the larger infrastructure. The computer itself will disappear into the woodwork, he predicts. Part of this process, says Bell, will be the switch from multiuser machines to a multicomputer model, where users have multiple machines – millions of commodity 'cyberbrick' nodes – networked together to provide very powerful, very scalable distributed systems.
We are moving from the distributed world to the ubiquitous computing age, agree Mark Weiser and John Seely Brown of PARC, the famous Xerox Corp research lab. The crossover point, they say, will occur sometime between 2005 and 2020. A key attribute of the ubiquitous age will be a massive increase in the use of sensors and a complete overhaul of human-machine interface technology. Most scientists predict that the traditional graphical user interface will give way to speech, along with camera input of gestures and eye movements.

The rise of these technologies, says Pattie Maes, associate professor at the MIT Media Lab, means reliance on intelligent agents is less than 50 years away. We are about to enter the true age of personal computing, she says, where our computing devices actually know something about us and collaborate with us. Technology will be all-pervasive, she says, forecasting a scenario in which a monitoring agent in the refrigerator notices that the house is short of milk and relays a message to a remembrance agent, which displays it on the householder's glasses (a pattern sketched in code at the end of this section).

In a similar timeframe, Bell foresees visual and spatial input from sonar, radar and global positioning systems helping to open up a new range of mobility and portability applications. Sensor information will become far more integrated into our lives than it is today. Global positioning information will be readily available to answer questions like 'Where am I?' or 'Where are other things in relation to me?', he says.

The network holding this infrastructure together, say visionaries, will be a vastly extended version of the internet. In 2047, the internet is everywhere, enthuses Vinton Cerf, one of its founding fathers. Like electrical power, he says, it is assumed to be available whenever and wherever needed, and it will have the same population penetration as the telephone and the television today. Networking technology, however, has some way to go before it can fulfill that dream. Although LAN/WAN bandwidth has doubled every three years, it remains a significant stumbling block.
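Maes's refrigerator scenario is, underneath the futurism, a message-passing design: small, specialized agents observe state and relay notifications to one another on the user's behalf. A minimal sketch of that pattern in Python follows; the class names, the threshold and the print-out are illustrative assumptions, not anything Maes or MIT specified.

```python
class RemembranceAgent:
    """Collects reminders and presents them to the user."""

    def __init__(self):
        self.reminders = []

    def notify(self, message):
        self.reminders.append(message)

    def display(self):
        # Stand-in for projecting the reminder onto the householder's glasses.
        for message in self.reminders:
            print(f"[reminder] {message}")


class FridgeMonitorAgent:
    """Watches stock levels and relays shortages to a remembrance agent."""

    def __init__(self, remembrance, threshold_litres=0.5):
        self.remembrance = remembrance
        self.threshold_litres = threshold_litres  # illustrative cut-off

    def check(self, stock):
        # stock maps item name -> quantity remaining, in litres
        for item, quantity in stock.items():
            if quantity < self.threshold_litres:
                self.remembrance.notify(f"short of {item}: buy more")


remembrance = RemembranceAgent()
fridge = FridgeMonitorAgent(remembrance)
fridge.check({"milk": 0.2, "orange juice": 1.0})  # milk falls below the cut-off
remembrance.display()  # prints: [reminder] short of milk: buy more
```

The point of the pattern is the decoupling: the monitoring agent knows nothing about glasses or displays, only which agent to notify.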
LANs in the home
However, Bell believes that, in the short term, advances in copper wire technology will allow the encoding of high-resolution video on phone networks. And by 2047, he says, fiber carrying several gigabits of data per optical wavelength should be hooked up to most homes. Home networks, he says, will be similar to the LANs found in offices today, but much cheaper and easier to install. Bandwidth is going to shift from being a fundamental problem to being a side issue, concurs Gens.

Eventually, says Cerf, end user data rates will be in the gigabit range and backbone rates in the tens of terabits, forcing an overhaul or outright replacement of TCP/IP, the dominant networking technology today. The speed of light will become the dominant factor in the design of new protocols, he says; the worked example at the end of this section shows why. The ultimate aim for many in the industry is the single dialtone. It will soon be common to initiate a telephone conversation in the middle of browsing a web site, in the same way that it is possible to initiate an e-mail conversation today, says Cerf. Bell believes that by 2047 the computer-using population will be 100,000 times larger than today's, and that users will spend a quarter of their time indulging in virtual experiences.
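Cerf's remark about the speed of light rests on the bandwidth-delay product: the amount of data a protocol must keep in flight grows with link speed, while round-trip time has a hard floor set by physics. A rough worked example, with figures chosen purely for illustration:

```latex
% Round-trip floor for a 5,000 km fibre path
% (light in glass travels at roughly 2 x 10^8 m/s)
RTT \gtrsim \frac{2 \times 5{,}000\,\text{km}}{2 \times 10^{8}\,\text{m/s}} = 50\,\text{ms}
% Data in flight at 1 Gbit/s over that round trip
1\,\text{Gbit/s} \times 50\,\text{ms} = 50\,\text{Mbit} \approx 6\,\text{MB}
```

Original TCP caps unacknowledged data at 64KB per connection (a 16-bit window), roughly a hundredfold short of that figure, which is why latency rather than raw bandwidth drives the protocol redesign Cerf anticipates.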
Beyond Calculation: The Next Fifty Years of Computing, edited by Peter J Denning and Robert M Metcalfe, Copernicus, ISBN 0-387-94932-1