The 1980s have been dogged by the stubborn refusal of new computing technologies and applications to live up to the enthusiastic forecasts of their proponents. The combination of networking techniques and robotics was going to revolutionise the way the factory worked, yet the robot industry has consistently failed to meet even the increasingly modest forecasts made for it. Equally disappointing for its supporters has been the failure of artificial intelligence techniques to come out of the hot-house labs and make a serious impact in the real world: probably the best-known expert system in action is the one that DEC uses for its own benefit in drawing up the list of peripherals, cables and so forth that will be needed to fulfil a specific customer order once the salesman has it in the bag. Since that was one of the very first expert systems to receive widespread coverage in the technical media, the achievements of the artificial intelligentsia in winning acceptance for their methods look modest indeed. In the specialist hardware arena, Lisp Machine Inc has gone bankrupt, and Symbolics Inc, which went public on a wave of enthusiasm only three years or so ago, has struggled almost from its first days as a public company.
Brute Force

One reason has been the increasing tendency of artificial intelligence software developers to abandon specialised languages and hardware and to rewrite their tools for the much cheaper Unix workstations. That sounds very much like an application of the brute force and ignorance method, and there just could be a much neater and more elegant way of putting the knowledge and skills of the expert into the hands of the lay user. For years, artificial intelligence researchers have claimed that the future lay in non-procedural languages such as Lisp, controlled by strange Svengalis such as inference engines. Their greatest success has been in developing rule-based tools for building expert systems. But these have remained difficult to use, and integrating the expert systems into the real world has been even harder. Artificial intelligence researchers at AT&T's Bell Laboratories realised that the problems faced by the systems they were developing were the same as those experienced by more mainstream projects: communication with other systems, user interfaces, systems maintenance, and administration, including backup and crash recovery. Not surprisingly, they found they relied more and more on Unix tools.
However, Lisp and the expert system tools and programming languages interacted awkwardly with the Unix system, so they rewrote most of them in the C language, with the interpreter (or inference engine) eventually producing C code. The interpreter was loaded as a C library, enabling it to be called by a C program or a C function. This led to finished expert systems being compiled as part of larger C programs, and to developers including bits of expert systems in their C programs.
With the entire expert system being developed and run from within the Unix environment, the researchers went on to develop a collection of routines to enable the expert system to interact with the Informix relational database system and with AT&T databases.

Rule-based expert systems are made up of three parts: a database or working memory; a set of rules; and the interpreter or inference mechanism. The knowledge engineer tries to encapsulate the knowledge of an expert in the first two parts, with the IF…THEN… rules working on and modifying the information in the working memory. The interpreter executes, or fires, a rule by executing the THEN clause when the IF clause matches an element in the working memory. The rules are in no particular order, as they are fired only when they match the working memory. If more than one rule matches, a process called conflict resolution gives priority to one of the rules. The form of rule-based languages makes it easy to modify and add to the programs if the conditions change or if the expert remembers another bit of information. However, it is difficult to perform operations such as recursion with them, and debugging can be very difficult because the working memory changes as each rule is fired.

The AT&T workers started out using OPS4, a popular rule-based language developed at Carnegie-Mellon University and written in Lisp.

Added C

Then they added the C language and the Unix system's Awk language, shell programming and other development tools such as an internal database language. They then rewrote a later version of OPS, OPS5, in the C language, calling it C5 and giving it an interface to C to allow a single program both procedural and rule-based capabilities. The C5 interpreter also allows rule-based programs to be debugged, with rule firings backed up, as well as allowing C functions to be called by the THEN clauses in the rules. The final program can be compiled into a C program, enabling the programmer to use all the powerful Unix tools. The routines that enable the expert system to interact with a database, called OD, permit developers to load an entire database, or part of it, into the working memory with a single command. The developer can also update the database from a C5 rule.

AT&T has also developed a version of Common Lisp (the most widely accepted dialect of Lisp) called Portable Lisp, or Plisp. The kernel of the Plisp interpreter and some functions were written in C, while the Plisp compiler was written in Common Lisp to produce C code that is compiled by the standard C compiler. The result is less than half the size of a Common Lisp system and runs much faster. It will run Common Lisp programs, which means that developers can produce their programs on powerful artificial intelligence workstations, then transfer them to a Unix machine to run in the real world. It also enables Unix-based developers to use Lisp for quick prototyping of applications requiring artificial intelligence techniques, and to use Lisp with C, while still retaining the power and familiarity of Unix.