This is the first part of an article from Computer Business Review, a sister publication.
By Joanna Mancey

By 2003, says Gordon Bell, creator of the groundbreaking Digital VAX minicomputer, industry luminary, and now senior researcher at Microsoft Corp, artificial intelligence will prove to have had as significant an effect on society as the transistor. By the middle of the next century, say other leading technology ‘visionaries’, computers will be as common and as cheap as paper clips are today, and almost all the information available in the world will be in ‘cyberspace’.
Such forecasts, viewed against the backdrop of today’s computing infrastructure, where PCs cost upwards of $1,000, online services are frequently offline, and even supposedly simple devices such as VCRs are awkward to program, seem barely credible. But then the advances experienced in the first half-century of computing have, in hindsight, also been incredible. From the early days of monster machines requiring thousands of vacuum tubes, a time when computer designers dreamed of the luxury of a one-kilobit memory, computing has progressed to such an extent that a trivial programming shortcut undertaken in the 1970s and 1980s (the use of two digits rather than four to represent the year in computer date fields) threatens to bring the world to its knees in the year 2000.

Who predicted such a future? What is yet to come? Is it possible to forecast 5, 10 or 50 years ahead with any accuracy? ‘It requires a brave person or a fool to attempt a detailed and accurate prediction for the next half century of computing,’ says Richard W Hamming, a professor and senior lecturer at the Naval Postgraduate School in Monterey, California.
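The two-digit shortcut mentioned above is easy to sketch. The snippet below is a hypothetical illustration invented for this article, not code from any real system: it compares years stored as two digits, the way many 1970s and 1980s programs did, and shows how the comparison breaks at the century boundary.

```python
# Hypothetical sketch of the two-digit year shortcut behind the Y2K problem.
# Years are stored as 0-99, as many 1970s-80s programs did to save memory.

def card_still_valid(expiry_yy: int, current_yy: int) -> bool:
    """Naive two-digit comparison: valid if the expiry year is not past."""
    return expiry_yy >= current_yy

# Within one century the shortcut works:
print(card_still_valid(99, 98))  # True: a 1999 card is valid in 1998

# Across the century boundary it fails: in 1999 (current_yy=99),
# a card expiring in 2001 is stored as 1 and wrongly appears expired.
print(card_still_valid(1, 99))   # False
```

The saving of two characters per date field, multiplied across millions of records, is exactly why the shortcut was so widespread.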
‘But without a vision of the future, we will tend to wander like the proverbial drunken sailor,’ he says. In a bid to bring some order and form to the forecasting process, earlier this year the Association of Computing Machinery, the industry’s oldest body, united some of the leaders in the field at a conference boldly titled The Next 50 Years of Computing. What resulted was a fair amount of insight, interspersed with considerable amounts of discrepancy and vagueness. ‘I don’t think anyone came close to looking out 50 years – they rarely got past 15,’ says David Moschella, former head of research at market analyst IDC. ‘Some of the most interesting people in the world were there – and it is good to see that they don’t have a clue either,’ he says.

The simplest way to try to foretell the future, say analysts, is to focus on the technological limitations of today and work forward. But, according to Gordon Bell, the conference keynote speaker, such processes are worthless. ‘Extrapolating to the future based on our current knowledge of technology is a waste of time. There will be innovations that will seem to come out of nowhere that will change everything,’ he says. One of the key failings in any method of prediction, says Vinton Cerf, former president of the Internet Society and one of the so-called ‘founding fathers’ of the Internet, is confusing what will be feasible with what will actually be commonplace. Social evolution may limit technological evolution, concurs Hamming.

So why bother? It is a case of necessity, says Moschella. Each ‘computing wave’ opens up opportunities for a new wave of suppliers and threatens the incumbent rulers. Vendors are searching for as much forewarning as possible in a bid to adapt their strategies and survive the tide of change, he says. But even though their lives may depend on it, the recent track record of prediction, at least, is poor. ‘If we couldn’t predict the Internet, why are we here?’
It is a question that has echoed throughout the R&D community. But according to Moschella, there are distinguishable trends, in the short term at least. ‘No-one predicted what the PC industry would grow into, yet in retrospect, nothing surprising happened,’ he says. ‘In that sense, you feel it is possible to predict the future – that is what keeps forecasters going.’ But if picking long-term technology leaders is impossible, even creating market predictions is set to become more complicated as the consumer influence on the development of computer technology grows. ‘The VCR was the baby version of the movies. The TV was the bigger version of the radio. But there is no real pattern to follow for the computer in the home,’ says Moschella.

Environmental and social developments will have an increasing impact on the future of technology, yet they are factors over which technology companies have no control, says Hamming. In these conditions, few company executives are bold enough even to pretend that they know what is coming, and are instead trying to convince their customers that they will be quick to react to change, rather than propel it. ‘It is getting harder and harder to live up to Alan Kay’s law – the best way to predict the future is to invent it,’ admitted one industry executive.

This is causing concern among the research community. In Beyond Calculation – The Next 50 Years of Computing, the book accompanying the ACM conference, Professor Dennis Tsichritzis, chairman of the board of the German National Research Center for Information Technology, asks: ‘In a society where everything is measured, quantified and rationalized, how can researchers continue to live in an idealistic world?’ Research budgets, in both the public and private sectors, are under pressure, he says, which means the corporate community will have to take increasing responsibility for the advancement of technology.
But increasingly, say visionaries, the computer industry as a whole, much less any single computer company, will be unable to force through technological change.
Digital TV is clearly important for the computer industry, but Microsoft and Intel have realized that, despite their combined might, they cannot force broadcasters to adapt their television signals so that PCs can receive them. Likewise, say analysts, computer companies are going to be increasingly at the mercy of the telecommunications companies when it comes to advances in networking technology.

‘But surely one could reasonably expect the chairman of AT&T to know what his corporation will look like 10 years from now. He doesn’t. But at least he should know who his major competitors will be in 2005. Stumped again. But here is what he does know: something startling, intriguing and profound is afoot.’ These comments, made by Robert Allen last year, strike a chord across the computer industry.

Nevertheless, there are those who are willing to risk almost inevitable ridicule by proffering a view of the future. The easiest aspect to predict, says Bell, is the rate of core technology advance. Currently, processor speed, storage capacity and transmission rates are improving at around 60% per annum, doubling in speed or capacity every 18 months, or roughly 100 times per decade. As many semiconductor specialists have pointed out, if cars had evolved at the same rate as semiconductors, we would all now be driving Rolls-Royces capable of a million miles an hour and costing 25 cents. It is safe to predict, says Bell, that computers in the year 2047, 50 years from now, will be at least 100,000 times more powerful than those of today. Indeed, if Moore’s Law (processing power doubles every 18 months) continues to hold, computers could well be ten billion times more powerful than today, says Bell.
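Bell’s arithmetic can be checked directly. The short sketch below assumes only the doubling period of 18 months quoted above; the function name is ours, invented for the illustration.

```python
# Growth factors implied by a doubling period of 18 months (1.5 years),
# i.e. roughly 60% compound growth per annum.

def growth_factor(years: float, doubling_period: float = 1.5) -> float:
    """Capacity multiplier after the given number of years."""
    return 2.0 ** (years / doubling_period)

print(round(growth_factor(10)))    # ~100x per decade
print(f"{growth_factor(50):.2e}")  # ~1.1e10: about ten billion x in 50 years
```

A decade of 18-month doublings gives about 100x, not 1,000x; fifty years of them gives the ten-billion-fold figure Bell cites, which makes his ‘at least 100,000 times’ a very conservative floor.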