May 18, 1994

GARTNER ABANDONS THE MAINFRAME AS MOVE TO LOCAL NETWORKING, CLIENT-SERVER ACCELERATES

By CBR Staff Writer

At the Data General Corp and Gartner Group Inc Executive Seminar on Building Open Enterprise Information Solutions, Wesley Melling, Programme Director for Midrange Computing Strategies at Gartner, outlined a mainframe-free data processing future but a tense, transitional present. At the moment, he sees a collision taking place between top-down data processing strategies – coming from the 40% of data processing professionals running proprietary or Unix systems – and the 60% bottom-up community of users operating MS-DOS and NetWare. The battle will be fought over the mid-range, which explains Novell Inc's move of NetWare to a Unix base with Univel, and Microsoft Corp's development of a server-capable operating system with Windows NT. Gartner Group predicts the battle will leave no dominant system: while Windows NT server revenues could reach 13% of the server market in 1997, Unix will retain 40%.

In the short term, however, the market will steer clear of client-server, as there is still no truly effective way of managing it, and because service, administration and personnel costs remain high. By 1996, though, client-server will be the preferred architecture for new applications, says Gartner, as corporates move towards open systems and distributed processing models continue to evolve.

As enterprises join the rush towards business process re-engineering, the reorganisation of tasks demands changes in data processing; the most notable shift is from transaction processing to mixed-input and flexible-path processes with a single supporting infrastructure. The 1980s data processing structure and the enormous investment in desktop computing created islands of automation incapable of communication, but Melling believes that, in the 1990s, these older, isolated systems will be reintegrated. Gartner is seeing an explosion of acceptance of these new computing paradigms, and expects the trend to accelerate until 1998.

By David Johnson

Melling reckons that future data processing architectures, cheaper and more vendor-independent than in the past, will be based on a software framework that supports the development of portable applications, provides for the interoperability of new systems, and enables cross-vendor systems and network manageability. The operating system chosen for this framework will have to be both Posix-compliant and XPG3-branded. It will also have to be aggressively priced, as RISC and complex-instruction-set, Unix and non-Unix systems vie for the open systems market and customers shy away from proprietary offerings.

For Melling, the globalisation of enterprises is accelerating pressure on systems availability. There is no longer a night-time window in which to get batch processing done, no weekend to back up files and no Christmas week to move data centres, because enterprises and their customers are demanding continuous access. A mainframe that can offer only 99.9% availability will not be a viable proposition in the future.

The pressure on costs and cost-performance is also drawing customers away from mainframes, even in traditional strongholds. Recent research for the five largest New York banks revealed that money transfer by mainframe was twice as expensive as by alternative open systems. First, mainframes are too expensive in processor, disk, system software, application package and technical costs; second, MVS is seen as more closed and proprietary than many other environments; and finally, the pace of MVS software evolution is slower than for non-MVS systems. Gartner believes these reasons have motivated the shift away from the mainframe to networked personal computers and mid-range systems built on commodity microprocessors and portable software, and in this Melling sees the death spiral of the data centre, as companies drive heavy earth-moving equipment through their database centres.
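To see why Melling treats 99.9% availability as inadequate once the overnight batch window disappears, a back-of-envelope calculation helps. This sketch is illustrative and not from the article; the function name and figures are our own.

```python
# Rough downtime implied by an availability figure (illustrative only).

HOURS_PER_YEAR = 365 * 24  # 8,760 hours, ignoring leap years

def annual_downtime_hours(availability: float) -> float:
    """Hours per year a system is unavailable at the given availability fraction."""
    return (1.0 - availability) * HOURS_PER_YEAR

# 99.9% availability still leaves nearly nine hours of downtime a year;
# with customers demanding continuous access, there is no batch window to hide it in.
print(round(annual_downtime_hours(0.999), 2))   # roughly 8.76 hours/year
print(round(annual_downtime_hours(0.9999), 2))  # roughly 0.88 hours/year
```

Each extra "nine" of availability cuts the downtime budget by a factor of ten, which is why continuous-access enterprises push well past three nines.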
The traditional concept of downsizing must be revised, as smaller machines now offer increased security, performance, functionality and integrity at a lower cost. And users must wise up, says Melling, and demand more from their suppliers.

However, the ultimate movement away from mainframes will depend as much on corporate politics as on the nature of the applications, and Melling can already see mainframe die-hards circling the wagons, Custer-style.
