The Gartner Group’s bandwagon came to the UK recently, where it used a conference in London – Open Systems: A Mix of Myth and Reality – for the launch of its latest consultancy package, the Open Systems Service. The market research outfit reckons that the four fundamental factors driving the adoption of open systems – functionality, availability, complexity and cost – will only begin to intersect favourably around 1995. Why 1995? As well as being party to the future plans of its vendor clients – many of which will be directly responsible for developing the technology that will fuel these forces – Gartner has arrived at 1995 by using diffusion theory, a way of interpreting the spread of something, in this case open systems, by examining the natural movement of its components. It is a technique that was first used, according to Steve Wendler, vice-president and service director of the open systems programme, in the American mid-West to explain why farmers adopted new corn seed, and was then applied in other fields of study. Even so, Wendler is careful to qualify his best estimates, saying open environments will not be possible in the absolute, even by 1996, but that systems will emerge that are progressively less closed.

Mid-range ripe

Initially, any new technology functions rather crudely, Wendler argues, but its usability rises over time as users and vendors gain experience through subsequent versions. Availability, similarly, is limited at first to narrow distribution channels such as direct sales and systems integrators, typically in proportion to the complexity of the stuff. The complexity of a new technology, from the user’s point of view, is high at first but again decreases with subsequent releases of the product. Likewise, initial costs are high as vendors try to recover their non-recurring investments in the technology, but fall as volumes increase and manufacturing is streamlined or competition depresses prices. Broadly, Wendler’s argument is that these four factors have already intersected favourably for a sub-set of technologies – Unix, C and TCP/IP – in large but niche markets: scientific, engineering, government and education.

Looking specifically at the mid-range commercial systems market, Wendler believes that by 1995 users are likely to reach the point where the cost of implementing open systems will be less than the cost of carrying on as they are. The mid-range market is the one most ripe for open systems, he says, because of the availability of low-cost, high-performance systems that span replicated site configurations linked to departmental server configurations, themselves connected to loosely-coupled multi-processing central processing units. At the low end of this market, for example, no supplier can any longer hope to offer a proprietary technical workstation; it has to be an open system.

Wendler has also assessed which application areas – what Gartner refers to as middleware, everything between the hardware and what the user actually sees – are most appropriate for near-term deployment of open systems. These include application development environments (though vendor-specific), global networking (OSI, TCP/IP), network and systems management (EMA, OpenView, ONE, DME), integrated office products (AT&T Rhapsody, NCR Co-operation, Applix, Quadratron) and workstation integration (LM/X, PC-NFS). Some technologies, Wendler believes, are irrelevant because of their relative immaturity, repositories and transaction processing monitors among them. Others – application development environments and distributed databases – have to be dealt with at present using ad hoc, third-party software. Gartner has focused on the mid-range – apart, one assumes, from the consultancy money at stake – because it believes the heritage of open systems has been to solve mid-range problems.
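Gartner supplies no figures or formulae for these trajectories, but the shape of the argument can be sketched numerically. The toy model below – every parameter invented for illustration and tuned only so that the crossover lands in 1995 – treats functionality as a logistic curve rising with successive releases, lumps complexity and cost into a single decaying barrier, and reports the first year the former overtakes the latter.

    import math

    # Illustrative only: parameters are invented, chosen so the curves
    # intersect around 1995 as Gartner predicts, not taken from Gartner.

    def functionality(year, start=1988, midpoint=8.5, rate=0.5):
        # Usability rises slowly at first, then steeply, release by release.
        return 1.0 / (1.0 + math.exp(-rate * ((year - start) - midpoint)))

    def barrier(year, start=1988, rate=0.18):
        # Complexity and cost start high, falling as volumes grow and
        # competition depresses prices.
        return math.exp(-rate * (year - start))

    for year in range(1990, 1998):
        f, b = functionality(year), barrier(year)
        flag = "  <- favourable intersection" if f > b else ""
        print(f"{year}: functionality {f:.2f}  barrier {b:.2f}{flag}")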

By William Fellows

Widespread implementation of Unix on the mainframe is still some way off, Wendler reckons – Amdahl Corp’s UTS excepted – because Unix input-output performance is still in the realm of kilobytes per second, not megabytes per second. Moreover, Unix can only recognise a maximum of 1,024 devices, and it doesn’t support multiple-volume files. Unix is also very weak on systems administration compared with proprietary systems, Wendler says, and when it comes to high availability, Unix just isn’t durable enough. Since open systems cannot fulfil most of these roles at the moment, many mid-range users will see open systems as a niche strategy until 1995, when the technology will have changed to such a degree that open systems will be able to meet the needs of the most demanding commercial applications.

As is clear to anyone involved in the industry, open systems are already having an enormous impact on vendor strategies. Wendler says one result will be that software prices will rise astronomically to compensate for declining hardware revenues. Much of what business needs to carry out new information strategies is simply not being provided, he says; most organisations are going through an architectural crisis at the moment owing to the absence of strategic planning and management. In response to user demands for openness, Wendler argues, vendors will – as we know only too well – increasingly promote everything they can as standard, and will increasingly blur the distinctions between formal, draft, proposed, consortium, de facto and proprietary standards compliance. In some cases, he says, this approaches Orwellian newspeak: IBM’s Open Communications Architectures, for example, under which newer versions of previously published proprietary network-layer protocols are effectively withdrawn from publication. Wendler suggests that users treat formal, draft, consortium and de facto standards as real standards, and proposed and proprietary ones as self-serving vendor hype.

Users themselves, however, are not completely blameless when it comes to the slow uptake of open systems, Wendler observes: our clients don’t take products seriously until IBM is involved. Another major player, Microsoft Corp, so far with only a tippy-toe in the open systems waters, needs to control the definition and evolution of computing to maintain its momentum, Wendler says. NT he considers a portable proprietary environment, but by dint of its use of the Open Software Foundation’s Distributed Computing Environment it will, he believes, accelerate the take-up of open systems; in his view, DCE will become the de facto standard for client-server computing. Sun, he says, sat on its remote procedure call laurels too long and didn’t advance its Open Network Computing technology until the Foundation started talking about DCE.
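For readers who have not met the term, a remote procedure call lets a client invoke what looks like a local function while a stub ships the call across the network and a server dispatches it – the plumbing that DCE and Sun’s ONC compete to standardise. The miniature sketch below illustrates only that idea; the wire format, port handling and the add procedure are invented for illustration and are not drawn from DCE or ONC.

    import json
    import socket
    import threading

    # Toy RPC: the client-side stub marshals the procedure name and
    # arguments, ships them over a socket, and the server dispatches the
    # call and returns the result. Real DCE/ONC RPC adds interface
    # definitions, binding, security and much else.

    def add(a, b):
        return a + b

    REGISTRY = {"add": add}  # procedures the server agrees to execute

    def serve_one(sock):
        conn, _ = sock.accept()
        with conn:
            request = json.loads(conn.recv(4096))      # unmarshal the call
            result = REGISTRY[request["proc"]](*request["args"])
            conn.sendall(json.dumps({"result": result}).encode())

    def rpc_call(address, proc, *args):
        # Client-side stub: marshal, send, await the reply.
        with socket.create_connection(address) as conn:
            conn.sendall(json.dumps({"proc": proc, "args": list(args)}).encode())
            return json.loads(conn.recv(4096))["result"]

    server = socket.socket()
    server.bind(("127.0.0.1", 0))                      # any free local port
    server.listen(1)
    threading.Thread(target=serve_one, args=(server,), daemon=True).start()

    print(rpc_call(server.getsockname(), "add", 2, 3))  # prints 5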

Object technology

Wendler recognises that object-oriented technology will play a central role in the distributed management systems of the future, though it is too immature to be of any real use at present, he says; the class definition work that the Object Management Group is doing now is much more important than the Object Request Broker. The Object Management Group’s work so far has been like establishing that there have to be lights on the wing tips of aeroplanes.

Wendler concludes that there is no such thing as an open system in 1992, and that for users the economic benefits of going to open systems have so far been only anecdotal. Although, he confesses, he cannot provide money-saving ideas, he firmly believes that as far as mid-range systems go, Unix functionality is now on a par with that of proprietary systems – something reflected in reduced proprietary pricing – and that substitution, rather than wholesale replacement, is the key to open systems. He strongly advises users to get involved with their regional open systems organisations and representative industry groups. Open systems give users a great deal more leverage when negotiating with suppliers and, as Wendler observes, the keys to a successful migration are as much political as they are technical. We can incrementally manage our way towards open systems component by component, but don’t do it all at once – and remember it doesn’t necessarily mean Unix. Gartner’s Open Systems Service costs UKP10,800 for one year’s subscription.