What would happen if there were only one kind of computer in the world? It’s an interesting question, and one that is not entirely academic. With customers and vendors alike moving toward more open standards in hardware and software and away from proprietary products, it would appear that computing is moving in that direction. We are by no means there yet, but we are getting closer. Your answer to the question probably depends to a large extent on your experience. Information processing managers who, by virtue of their diverse workloads, need to manage many different platforms may wish that their systems and networks were a lot more sophisticated in their abilities but a lot less complex. This would be especially true among the core AS/400 customers – manufacturers – who have central machines running accounting and manufacturing requirements planning software, plus a diversity of systems on their shop floors and scattered around their corporate offices. Though these customers might wish that they only had to support one platform, the point is moot. AS/400 customers already have many different kinds of servers running at their sites. They have to support them and that is that.

Hope

The best that they can hope for is that the emerging (usually Unix-based) internetworking software that will enable them to share information will work as promised. The multitude of servers at AS/400 sites is only half of the complexity that managers have to deal with. The client workstations that use these servers, be they Windows personal computers, Macs or Unix workstations, also have their own software and quirks. And so do the end users (well, they only have quirks). Some end users prefer one kind of workstation, while others simply don’t care. Many end users, who have enough discretionary funds at their fingertips to buy a personal computer without obtaining approval from higher-ups, don’t ask questions of computing managers until they’ve already got a personal computer or Unix system on their desk – and by then it’s too late to get advice.

As Sun Microsystems Inc so correctly points out in its advertising, the network is the computer. What Sun doesn’t say is that the network is a complete mess! The overwhelming acceptance of this new complexity by corporations has been driven by two primary forces: hope and technology. Hope is the more important of the two. People buy computers and lash them all together because they hope the machines will help them work better, faster and smarter. They hope that the money they blew on the computers will be worth it.

The complexity in corporate computing owes a lot to the advancement of technology, too. Technology does not move ahead uniformly among types and makes of computers. Back in the 1960s, the pace of technological change wasn’t so much an issue in corporate computing. Most companies didn’t have computers, and those that did bought IBM mainframes, more often than not so they could replace legions of human clerks. The disk drive replaced the filing cabinet. The disk controller replaced human fingers. The CPU replaced a clerk’s brain.
With the advent of the minicomputer in the 1970s, another wave of change swept over corporate computing. The widespread adoption of the minicomputer added another layer of complexity to data processing. Established batch jobs like accounting still ran on the mainframe, while new applications like CAD/CAM or transaction processing went on the new minis. This added complexity was worth it not only because the minis were cheaper than the mainframes, but also because they had much more sophisticated software.

In the 1980s, the personal computer and the Unix workstation forever changed the pace and face of the computer industry. Improvements in technology, whether they are for IBM mainframes, AS/400s or any other kind of server, are being implemented because of demands by end users to make their workstations do more than paint pretty pictures. Technological progress on the desktop is far from finished and will continue well into the next century, as will the evolution of network computing. Both the desktop and the network now sport very complex environments, but they are not yet nearly as robust and easy to use (important aspects of what we are calling sophistication) as an operating system like OS/400.

By Timothy Prickett

No matter what era, no matter what platform, customers have adopted the technologies that have advanced fast, affordably and, most importantly, in the right direction. People have been predicting the paperless office for decades, but for most customers, traditional image processing – big central computers equipped with specialised software and expensive scanners to take snapshots of paper documents and store them on disk – has been far too expensive and cumbersome to justify. And far too inflexible – you can’t modify a document with image processing software. But buying fax server software (it’s included free in Windows 3.11) and a $150 modem for every personal computer is fast becoming the preferred and affordable way to keep computerised correspondence on-line. Using a personal computer word processor/fax server set-up instead of image processing software enables end users to store outbound documents in a form that is modifiable. It puts documents right where they are needed – on the end user’s desk. And it is much less expensive per user (assuming the end user has a personal computer to begin with) than dedicating a big central machine to the task. The drawback of this approach – and this is true for all types of distributed computing – is that locating documents is a little trickier than it would be at a company that made all end users keep electronic documents on a central system with commonly accepted file naming conventions.

The economic and technical forces that have shaped this one small facet of the computer business – image processing – are now shaping the evolution of distributed data processing. If and when distributed database processing is perfected, and if it helps customers make their business better or reduces the cost of transaction processing, the role of traditional central systems like the AS/400 will change.
The level of complexity that information processing managers will have to cope with may, in fact, go down even as the level of sophistication in the network goes up. All major mid-range operating systems have been or are being rewritten to include all the big open system acronyms: TCP/IP and SNA (for data communications), SQL and ODBC (for database access), SPEC 1170 and Posix (for Unix compatibility)… the list goes on and on.
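The promise behind standards such as SQL and ODBC is that application logic stays the same no matter which engine sits underneath – only the connection step is vendor-specific. As a latter-day sketch of that idea (in Python, using the built-in sqlite3 module purely as a stand-in for any SQL engine; the orders table and its rows are invented for illustration):

```python
import sqlite3

# With a standard query language, this logic would read the same whether
# the engine underneath were DB2/400, Oracle or anything else; only the
# connect() call would change. An in-memory SQLite database stands in here.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("Acme Mfg", 1200.50), ("Globex", 89.99)],
)
# Plain SQL: nothing below depends on the vendor of the database engine.
rows = conn.execute(
    "SELECT customer, total FROM orders ORDER BY total DESC"
).fetchall()
for customer, total in rows:
    print(customer, total)
conn.close()
```

The point of the sketch is the portability of the middle section: swapping engines means swapping one connection line, not rewriting the queries – which is exactly the kind of platform independence the standards listed above were meant to deliver.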

Bickering

When all clients and all servers are speaking something that is closer to the same language, it should in theory be easier to support a network. What applies to the most esoteric of systems software standards applies equally well to a disk drive or a suite of accounting software. For instance, vendors will use one or two types of disks in their systems – SSA (an adaptation of SCSI) or ATA are the evolving standards. Their controllers will all provide RAID 5 data protection. They will attach to many types of servers, from NetWare local nets to AS/400s to Unix servers. They will give customers a kind of platform independence that they have always sought.

Likewise, a suite of applications from software houses such as J D Edwards, System Software Associates, Marcam and others will soon be pretty much the same on an AS/400, Unix or Windows NT machine. Customers that use these applications will be less tied to their platform than they have ever been. They still won’t be able to hop from one platform to another easily, but they will be able to do it, which wasn’t an option in the past. SAP and PeopleSoft are set to jump into the AS/400 applications fray, which will give their customers as well as AS/400 users even more options.

It appears that the kind of openness and vendor independence that Unix has always promised will, in the long run, be delivered to customers, but not exactly as advertised. Openness is coming in a piecemeal fashion, dictated by market forces, not the bickering among hardware and software vendors. Though the resulting open systems servers and clients that will come into the market over the next few years will be a far cry from the one computer that we speculated about in the beginning of this article, they will be about as close to it as anyone could have guessed.

From The Four Hundred, May 1995, published by Technology News Ltd, London NW1 8JA, Copyright (C) 1995 Technology News Ltd.