By Janice McGinn

Applications will soon become machine-independent through the establishment of applications architectures, according to Anura Guruge of Data General. Guruge was delivering the opening plenary of the fifth UK Computer Management Group Conference, and he painted a gloomy picture of an applications backlog still measured in years if consistency and a single systems image don't become the order of the day in the near future. He believes that today's information technology systems are confronted with a wider choice of applications than ever before, yet white-collar productivity in the US is falling by 6% a year, and training costs outweigh the combined costs of hardware and software.
Inconsistent
He claims that most US installations are reporting an average backlog of two years on the development of new applications. However, Guruge is convinced that the establishment of application architectures can provide a consistent framework for users and developers, and consequently reduce both the backlog and the costs of training and maintenance. He professes indifference to the origin of application architectures, and acknowledges that both IBM's SAA and DEC's NAS partly address some of the problems, but they are specific to those manufacturers: in the case of IBM, a range of inconsistent systems; in the case of DEC, workstation problems of interconnection and interoperation with MS-DOS, OS/2, Sun, and Macintosh. Guruge is optimistic that by late 1992 and into 1993 we will see the measurable benefits of applications architectures, and he thinks that while developers will reap economic benefits, users will have greater freedom of choice to mix and match products from different vendors.

The demand for application development and the creation of onerous backlogs was a theme addressed by several speakers, including Dave Haslam of Hitachi Data Systems, who delivered a paper entitled The Rise and Fall of Relational Databases? Haslam thinks that relational databases have been a major step forward, but they are no panacea for either the skills shortage or the applications backlog.

He claims that the economics of computing underwent a radical change in the late 1970s, when the power of the CPU increased and hardware costs were overtaken by those of personnel. There was a switch of emphasis from the conservation of power to its utilisation, but that didn't ease productivity problems or the growing backlog caused by an imbalance between demand and supply. Hence the rise of fourth generation languages. But they created as many problems as they solved. Early code generators may have generated more code, but not always to an optimal design, and therefore they required more maintenance time. Consequently, demand was still outstripping supply.

Personal computers enabled users to develop their own applications, but they were isolationist, with limited provision for the exchange and sharing of data. Haslam describes them as islands of information processing with large question marks over the security of data distribution.

Then along came the relational white knight. It had a user-friendly language, and could easily handle large quantities of data. However, Haslam believes that the rise of the relational database was halted by IBM's ambivalence towards the new technology. It was not until DB2 failed to perform and IBM bought Model 204 from CCA that relational technology regained the ascendancy. He says that IBM's rethink, and consequent decision to make DB2 strategic, did hit many of the independents, although they have continued to provide cheaper and faster relational models, primarily by adopting SQL interface standards.
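Haslam's point about SQL standards is easy to see in miniature. The sketch below is purely illustrative, and no such code appeared at the conference: it uses modern Python with its bundled SQLite engine standing in for any relational product, and the table and figures are invented. What made the independents' strategy work is that the query language itself is the standard, so the same statements run unchanged against any engine that honours it.

```python
import sqlite3

# SQLite stands in here for any engine that honours standard SQL;
# it is used only because it ships with Python.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE applications (name TEXT, backlog_years REAL)")
conn.executemany(
    "INSERT INTO applications VALUES (?, ?)",
    [("payroll", 2.0), ("order entry", 1.5)],
)

# The declarative query is the portable part: the same SELECT would run
# unchanged against DB2 or a rival relational engine.
for name, backlog in conn.execute(
    "SELECT name, backlog_years FROM applications ORDER BY backlog_years DESC"
):
    print(f"{name}: {backlog} years")
```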
However, despite the obvious success of relational databases, they are problematic, and demand both time and effort. Haslam says that before taking the relational route, users ought to ask themselves several questions. Should memory be real or expanded, and which is the best disk capacity? Should memory be cached, and which channel speeds are most appropriate with relational technology? How does the relational application interface with other applications? Someone, says Haslam, has to know the answers. Another reason to be well informed about relational technology is the enthusiasm shown by mainframe suppliers. That, says Haslam, is because it involves MIPS, aka money. He reckons that relational models will provide a platform for the future, although there will always be a niche for the fast transaction processing provided by the likes of Tandem's NonStop SQL, and he strongly advocates that users remember that relational models are neither cheap to buy nor to run.

Application development and more flexible environments are subjects that Peter Scawen, managing director of Information Builders Europe, addressed in his speech, entitled 4GLs in the 1990s. He forecasts that this will be a decade of user power, and believes that applications generators will be integral to creating enterprise-wide systems that include Systems Application Architecture, Network Application System, Open Software Foundation, and Open Systems Interconnection environments. Scawen believes that the fourth generation language is now a tool for building complete applications, but there are certain requirements to be met before the promises of an enterprise environment can be fulfilled. First of these is shareable information assets, which he defines as data, applications, and the people that work with the information. Secondly, there must be interoperable information systems that communicate in a seamless manner and automatically trigger reactions in different environments. Thirdly, in response to the rate of technological obsolescence, architectures must allow users to plug components in and out as they deem appropriate. Lastly, the key issue of the 1990s will be the management and control of connected, distributed, interoperable systems in a far more complex environment than existed in the 1980s.
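Scawen's second requirement, systems that automatically trigger reactions in one another, is essentially what a publish-and-subscribe mechanism provides. The following is a minimal sketch of the idea, not anything Scawen showed: the Python rendering, the EventBus name and the order.placed event are all invented for illustration.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Toy publish-and-subscribe hub: one system raises an event and
    every subscriber reacts, without the two being wired together."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[[dict], None]) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, payload: dict) -> None:
        for handler in self._handlers[event]:
            handler(payload)

bus = EventBus()
# Hypothetical warehouse and ledger systems reacting to the same event.
bus.subscribe("order.placed", lambda p: print("warehouse picks", p["item"]))
bus.subscribe("order.placed", lambda p: print("ledger books", p["value"]))
bus.publish("order.placed", {"item": "disk drive", "value": 1200})
```

The indirection also speaks to his third requirement: because publishers and subscribers know only the bus, either side can be plugged in or out as technology becomes obsolete.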
Faint praise
He believes that the 1990s will see an era of open architecture that enables users to be hardware- and database-independent. As regards SAA and NAS, Scawen doesn't exactly damn with faint praise, but he describes SAA as the most comprehensive open closed architecture ever conceived, and although DEC's NAS/AIA strategy will enable users to network applications and homogeneous systems, it is still a partial solution. He believes users want an architecture that will provide a set of standard business applications at the enterprise-wide level, and this enabling architecture will be a combination of tools and services. Tools will fall into one of three categories: information access tools, processing tools, and transaction processing tools.

He sees the role of the fourth generation language as a development one. It will provide EIS/DSS tools, languages, and computer aided software engineering tools. It is to span the entire spectrum and facilitate any kind of user environment. As regards data access, the fourth generation language will provide the ability to work from any environment and access any kind of data. It won't be fourth generation languages versus computer aided software engineering, or fourth generation languages versus expert systems and object orientation.

Essentially, Scawen is holding out the promise of being able to develop and execute cross-system applications. Whether he'll be able to deliver this strategic component remains to be seen.
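A rough sense of what working from any environment and accessing any kind of data might look like can be given with one last sketch. Again, this is an invented illustration rather than Information Builders' product: the DataSource, TableSource and CsvSource names are hypothetical, and the idea is simply that application code is written once against a common interface, with one adapter per environment behind it.

```python
import csv
import io
from typing import Iterable, Protocol

class DataSource(Protocol):
    # The one interface the application ever sees.
    def fetch(self, entity: str) -> Iterable[dict]: ...

class TableSource:
    """Adapter standing in for a relational engine."""
    def __init__(self, tables: dict[str, list[dict]]) -> None:
        self._tables = tables

    def fetch(self, entity: str) -> Iterable[dict]:
        return iter(self._tables.get(entity, []))

class CsvSource:
    """Adapter standing in for a flat-file environment."""
    def __init__(self, text: str) -> None:
        self._rows = list(csv.DictReader(io.StringIO(text)))

    def fetch(self, entity: str) -> Iterable[dict]:
        return iter(self._rows)

def customer_report(source: DataSource) -> None:
    # Written once, against the common interface; where the data lives is irrelevant.
    for row in source.fetch("customers"):
        print(row["name"])

customer_report(TableSource({"customers": [{"name": "Acme Ltd"}]}))
customer_report(CsvSource("name\nGlobex plc\n"))
```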