Created in November 1988 by Apollo Computer Inc, Hewlett-Packard Co, MIPS Technologies Inc and Sun Microsystems Inc to provide standard benchmarks for measuring and characterising the performance of RISC computers, the Standard Performance Evaluation Corp, SPEC, has been drawing flak in some quarters for its seeming inability to address the creeping trend towards creative performance measurement techniques that threatens to undermine its credibility. Indeed, after laying down the SPEC Baseline Run Rules, a set of restrictions on the use of optimisation flags for compilers and preprocessors, last summer, the body subsequently decided that disclosure of SPECbase results alongside standard SPECint and SPECfp numbers should not be mandatory as originally planned, only that they be made 'available'. "Guess how many vendors have reported SPECbase results along with the inflated SPEC results in their press releases and new product materials since July 1," asked Andrew Allison's Benchpress at the end of the year. "None that we have come across."

SPEC says it will address these and other concerns through SPEC95, a new, single metric based on the baseline rules, but it expects optimisation technologies to become far more prominent: both industry and academia have indicated they believe that compiler and other optimisations will become increasingly important in vendors' offerings and in the claims made for them, and source code on almost any system is now routinely optimised by the resident compilers. SPEC admits that its current SPEC92 suites concentrate on processor and memory hierarchy rather than on input-output, graphics or distributed operation. SPEC95 will include updated versions of the set of source applications that are run to test and compare system performance, and will continue to emphasise raw processing capability. In addition to the SPECint95 and SPECfp95 speed metrics, the group expects to retain a measure of raw processor horsepower, in practice a throughput figure, currently known as SPECrate. There has been some demand for a benchmark that can measure multiprocessing system performance, which SPECrate does not, although that would require more emphasis on things such as input-output measurement and clustering than there is in SPEC's existing System Development Multi-tasking suite. In all likelihood, SPEC will keep SPECrate as a horsepower measure and create a new suite aimed at measuring overall multiprocessing system performance.

SPEC95 goes to an initial vote at the end of this month, with final approval planned for the end of March; the group expects to announce SPEC95 in the second quarter and to see the first SPEC95 results reported in June. Beyond SPEC95, the group is looking at how to measure performance in client-server environments and will probably put together a package of database and workload measurement suites, although it will still provide source code rather than the specifications provided by the Transaction Processing Performance Council. It is also looking at how it will conduct future processor benchmarking in the light of technologies described at events such as the Microprocessor Forum.
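The difference between the SPECint95 and SPECfp95 speed metrics and a SPECrate-style throughput figure mentioned above is easiest to see with a small worked sketch. The Python fragment below is purely illustrative: the timings are invented and the formulas are simplified assumptions rather than SPEC's official run and reporting rules, but it shows why a speed metric, broadly a geometric mean of how much faster than a reference machine a single copy of each benchmark runs, can stay flat on a multiprocessor while a rate metric, which credits the machine for how many copies of the work it completes per unit time, goes up.

# Illustrative sketch only: invented timings and simplified formulas, not SPEC's
# official calculations, which are defined by SPEC's own run and reporting rules.
from math import prod

# Hypothetical per-benchmark timings in seconds: reference machine vs. system under test
reference_times = [1200.0, 950.0, 2100.0]   # one copy of each benchmark on the reference system
measured_times  = [300.0, 250.0, 600.0]     # one copy of each benchmark on the system under test

# Speed metric: geometric mean of per-benchmark ratios (reference / measured),
# i.e. how many times faster than the reference machine a single copy runs.
ratios = [ref / meas for ref, meas in zip(reference_times, measured_times)]
speed_metric = prod(ratios) ** (1.0 / len(ratios))

# Rate (throughput) metric: run several copies at once and credit the machine for the
# total work completed per unit time; a multiprocessor can score higher here even if
# a single copy runs no faster.
copies = 4
elapsed_with_copies = [320.0, 270.0, 640.0]  # hypothetical wall-clock times with 4 concurrent copies
rates = [copies * ref / meas for ref, meas in zip(reference_times, elapsed_with_copies)]
rate_metric = prod(rates) ** (1.0 / len(rates))

print(f"speed metric (geometric mean of ratios): {speed_metric:.2f}")
print(f"rate metric (throughput with {copies} copies): {rate_metric:.2f}")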