As day follows night, Apple Computer Inc’s announcement of a clutch of Power Macintoshes is followed by an independent benchmark study of their performance by Ingram Laboratories. The word independent should really have quote marks around it, since the reports are commissioned by Apple, and Ingram is not at liberty to talk about the findings without Apple’s say-so. Previous attempts to talk to the research firm about the significance of its figures failed to get anywhere at all – which is a pity, since the detailed figures tend to indicate some interesting aspects of the Power Mac’s, or the application software’s, performance. Last month, Apple announced the results of the most recent study – or rather a potted version of the results, since the detailed report will not be available until this month. The new study measures the speed of the Power Mac 6100/66, 7100/80, 8100/100 and 8100/110 against four Pentium machines – Gateway 2000 Inc’s 100MHz P5 100XL; Dell Computer Corp’s 90MHz XPS P90; the 66MHz Compaq Computer Corp DeskPro 5/66M; and the 60MHz AST Research Inc Bravo MT P60. There are a few 80486 machines in there too for good measure, but we will ignore them for the moment.

Pretty weak

Ingram’s methodology is to find some applications with versions for both Windows and Macintosh, and then run through a number of sample operations – such as opening files, scrolling, spell-checking, spreadsheet recalculating and graphing – to see how they compare on the Windows and the Macintosh machines. The results are then normalised, to give each measurement equal weighting, and a series of ‘Power Mac x is n times faster than Pentium box y’ readings is produced.

Using real-world applications rather than synthetic benchmarks has both benefits and problems. On the downside, there is a continual debate on Usenet about the choice of applications: iAPX-86 fans point out that the Windows versions have not been recompiled to be optimised for the Pentium. We reckon this criticism is pretty weak: even Intel Corp avoids saying that applications developers should recompile their desktop applications for the Pentium. Rather, it recommends Pentium optimisation only for specialised applications such as databases. The point may become more contentious later this year with the launch of the PowerPC 604-based Macintoshes. If applications developers start coming out with versions optimised for the new chip, then it will be fair to demand Pentium-optimised counterparts to run them against – assuming they exist.

A more serious question hangs over the similarity of the same application on different machine types. Are they really comparable? In making the claim that ‘this machine is x% faster than that machine’, the Ingram report implies that both the Windows and Power Macintosh versions are coded equally well, with equal care, and are equally optimised. This is patently untrue, as anyone who has tried running Microsoft Word 6.0 on a Macintosh will attest. As such, the tests are certainly flawed as objective trials of machine performance. But they still hold up as a measure of overall system performance, assuming the mix of applications is fair and the sample size large.
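Ingram has not published its normalisation formula, but the description above – equal weighting across measurements, rolled up into a single ‘n times faster’ reading – is consistent with taking per-test speed ratios against a baseline machine and averaging them so that no single long-running test dominates. As a hedged sketch only (the function name and all the timings below are invented for illustration), a geometric mean of per-operation speedups would behave that way:

```python
from math import prod

def relative_performance(times, baseline_times):
    """Geometric mean of per-test speedups versus a baseline machine.

    times, baseline_times: seconds taken for the same sample operations
    (open file, scroll, spell-check, recalculate, graph, ...).
    A result of 2.0 means 'twice as fast overall' under equal weighting.
    """
    ratios = [baseline / t for t, baseline in zip(times, baseline_times)]
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical timings for five sample operations, in seconds.
baseline_486_33 = [10.0, 8.0, 30.0, 12.0, 6.0]   # imagined 33MHz 80486
power_mac       = [2.4,  2.1,  6.5,  2.8,  1.5]  # imagined Power Mac

print(round(relative_performance(power_mac, baseline_486_33), 2))  # prints 4.17
```

The geometric mean is the usual way to average ratios: halving the time of any one test moves the final figure by the same factor regardless of which test it was, which is what ‘equal weighting’ implies.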
During the benchtest’s last outing, back in July 1994, only four applications were tested – Aldus Corp’s Freehand, Fractal Design Inc’s Painter, Adobe Systems Inc’s Photoshop and Frame Technology Inc’s FrameMaker. Foul, cried the critics. These are all graphics-intensive, floating point-hungry applications – they have been specifically picked because they will make use of the PowerPC’s floating point unit and will make Power Macs look good.

By Chris Rose

Ingram would no doubt answer, if allowed, that the material it had to work with was strictly limited – back then there were very few native Power Mac applications and, not surprisingly, those that went native first were the processor-hungry ones. Even as it was, the company had to make do with beta versions of the PowerPC applications or, in the case of Photoshop, an emulated application with some functions accelerated thanks to its plug-in. This time Ingram has addressed this concern. The original four packages have been joined by seven others: Microsoft’s Excel, Word and FoxPro, Claris Corp’s ClarisWorks, Wolfram Research Inc’s Mathematica, DeltaPoint Inc’s DeltaGraph Professional, and Ashlar Inc’s Vellum 3D. The Microsoft and Claris applications give the Ingram tests a much more balanced profile, and it is worth noting that these office applications drag the Power Mac ratings down considerably.

Given all these caveats, here are Ingram’s overall findings: graphics and publishing applications performed an average of 84% to 94% faster on Power Macs than on similar clock speed Pentium-based Windows systems [note the use of the word ‘similar’]; scientific and engineering applications on the Power Macs outperformed Pentium-based personal computers by as much as 49% [note ‘by as much as’]; and Power Macs demonstrated a performance edge for office productivity applications of up to 9% over Pentium-based systems [not a lot]. Total the whole lot up, express it as relative performance against a 33MHz 80486 machine, and Ingram’s results pan out thus:

Computer                      Relative performance

Power Macintosh 8100/110      5.81
Power Macintosh 8100/100      5.57
Power Macintosh 7100/80       4.79
Power Macintosh 6100/66       4.23
Pentium 100                   4.01
Pentium 90                    3.67
Pentium 66                    3.06
Pentium 60                    2.80
80486DX4/100                  2.24
Macintosh Quadra 630          1.85
80486/66                      1.46
80486/33                      1.00
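Since every figure in the table is indexed to the 33MHz 80486 at 1.00, any pairwise ‘x% faster’ claim can be recovered by simple division – presumably how Ingram’s own comparisons are derived, though that is an assumption on our part. A quick sketch using a subset of the published numbers (the scores are from the table above; percent_faster is our own illustrative helper):

```python
# A subset of Ingram's overall indices, each normalised to a
# 33MHz 80486 = 1.00 (figures as published in the table above).
scores = {
    "Power Macintosh 8100/110": 5.81,
    "Power Macintosh 6100/66": 4.23,
    "Pentium 100": 4.01,
    "Pentium 60": 2.80,
    "80486/33": 1.00,
}

def percent_faster(a: str, b: str) -> float:
    """How much faster machine a is than machine b, in percent."""
    return (scores[a] / scores[b] - 1) * 100

# Top Power Mac versus top Pentium: 5.81 / 4.01 - 1, about 45%.
print(round(percent_faster("Power Macintosh 8100/110", "Pentium 100")))  # prints 45
```

The same arithmetic on any other pair reproduces the rest of the comparisons, which is why the missing detail – how individual operations fared – matters more than the headline ratios.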

What is missing, of course, is any indication of price-performance – and the detail. With the previous Ingram studies, it has been the detailed breakdown of how fast individual packages, and parts of individual packages, work that has proved of most interest. These details will not be available until sometime this month in the full report.

In the meantime, here is a look at what previous reports discovered. Last time Ingram published its benchmarks – last July – the overall story was the same: the Power Mac 7100/66 was 15% faster than a 90MHz Pentium box, the 8100/80 was 21% faster than a 100MHz Pentium box and so on. Anomalies showed up in the finer detail: with Aldus Freehand, the Power Macs performed poorly on the ‘select all objects’ test; for Painter they were worse than Pentia when running the ‘apply marbling’ filter; with FrameMaker it was the ‘scroll text’ and ‘delete all text in flow’ operations that let them down; with Adobe Photoshop they were substantially slower at opening files and applying ‘zig-zag’ and ‘emboss’ filters. If anything, these results say more about the quality of application coding than the speed of the systems, and the Photoshop results are probably an indication of which parts of the application were accelerated by the native plug-in.

Coding practices

This time around, the larger application sample size may serve to tell us more about the strengths and weaknesses of the Apple hardware and system software itself, but in fact it is more likely to tell us about Microsoft’s coding practices. When we are told that graphics and publishing applications can be nearly twice as fast on Power Macs as on Pentia, whereas office applications are barely faster at all, what is actually being measured here?

Chris Rose is editor of the fortnightly Internet publication PowerPC News, from which this item comes. PowerPC News is free by mailing: add@power.globalnews.com