<html>
<head>
<title>
Banyan 6.2 Desktop Performance Subsystems
</title>
</head>
<body>

<table>
<tr>
<td>
<img hspace=10 src="banyan.gif">
</td>
<td>
<h1>
Banyan 6.2 Desktop Performance Methodology
</h1>
</td>
</tr>
</table>

<p>
<br clear=all>
<br>

The Desktop Performance test suite was designed to test the user's
perception of desktop performance. Its methodology was thoroughly
reviewed when we created it for 5.3, and for the sake of comparing
apples to apples we maintain strict conformance to those procedures.
<p>

The tests are now always run on an Indy with a 150 MHz IP22 R4400 SC
processor, an IBM half-gigabyte system disk, and only 16 MB of RAM
(we have also run these tests at 24 MB, 32 MB, etc., but 16 MB is
still the baseline). The tests are run automatically by a program
called "xperform". Xperform starts an application with simulated mouse
clicks on the desktop. It then records the time between the launch and
when it sees the X event messages that draw the application window.
<p>

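The shape of that measurement can be sketched as follows. This is a
minimal illustration only, not xperform's actual code; <code>launch</code>
and <code>window_drawn</code> are hypothetical stand-ins for the simulated
mouse click and for detecting the X event messages that draw the window.

```python
import time

def time_launch(launch, window_drawn):
    """Time from a simulated click to the window-draw event.

    Hypothetical stand-ins for xperform's real mechanisms:
    `launch` issues the simulated mouse click on the desktop,
    and `window_drawn` blocks until the X event messages that
    draw the application window have been seen.
    """
    start = time.monotonic()
    launch()
    window_drawn()
    return time.monotonic() - start

# Toy usage: a "launch" that takes roughly 50 ms to appear.
elapsed = time_launch(lambda: time.sleep(0.05), lambda: None)
```

A monotonic clock is used so the elapsed time is unaffected by any
wall-clock adjustments during the run.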
Before each test we mkfs the system disk and reinstall the system
software according to a minimum set defined by Charles Marker some
time ago; the installed list of software is detailed in
<a href="http://www-qa.engr.sgi.com/SQA/xperform/versions.html">versions.html</a>.
<p>

The system is usually chkconfig'd to run with the network ON. However,
we have run the tests with the network chkconfig'd off just to see
what would happen. The results are in
<a href="5.3-netless">5.3-netless</a>. They show that, without the
network, timings are faster (perhaps because the network daemons are
not running). With the network on but the ethernet unplugged, timings
are slower (perhaps because the daemons are running and something
looking for the network has to time out). Regardless, testing without
the network isn't a valid test of our 'standard' end-user experience,
and although no two network configurations perform alike, it would be
bad form for us to disregard the existence of a network and how it
affects the end user's perception of system performance.
<p>

Once we've produced the appropriate system disk, xperform tests each
application launch from a fresh login. In other words, it logs out,
logs in, and launches the desktop application (desktop on, soundscheme
on, but no other apps running in the background), taking the timing
from the millisecond xperform executes a button click to start an
application to the X events signalling the application's appearance.
<p>

After each timing, xperform logs out and back in again before taking
the next timing for the next application.
<p>

We currently take 20 timings for each test (see
<a href="6.2-runsize">6.2-runsize</a>). We report the slowest and
fastest timings (to show variance), the standard deviation, and the
average.
<p>

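The reported summary can be reproduced as below. The 20 sample timings
are made up purely for illustration; they are not real xperform results.

```python
import statistics

def summarize(timings_ms):
    """Summarize one run of launch timings the way the report does:
    slowest, fastest, standard deviation, and average."""
    return {
        "slowest": max(timings_ms),
        "fastest": min(timings_ms),
        "stddev": statistics.stdev(timings_ms),  # sample std deviation
        "average": statistics.mean(timings_ms),
    }

# Made-up example: 20 launch timings in milliseconds.
timings = [1480, 1502, 1465, 1490, 1510, 1475, 1488, 1499, 1470, 1505,
           1492, 1483, 1478, 1496, 1501, 1469, 1487, 1493, 1481, 1500]
summary = summarize(timings)
```

Reporting the extremes alongside the average shows how much a single
launch can deviate from the typical case.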
For questions or comments, please contact
<a href="mailto:jhunter@sgi.com">jhunter@engr</a> or
<a href="mailto:jgrisham@engr.sgi.com">jgrisham@engr</a>.

</body>
</html>