Benchmarks: Are They Good For You?


February 2, 2004


Acquiring a server isn't a piece of cake. In fact, it's a good bet that, depending on your installation, the server will be the most expensive piece of IT equipment that you install. So it's a good idea to make sure that the server you get will do the job.

But how do you do that? There are so many servers out there (iSeries from IBM, Solaris servers from Sun, Linux servers from a host of people, Windows servers from the multitudes) that just deciding which OS you want is a chore, not to mention the question of architecture. Then there's the question of which server in the chosen architecture/OS you decide you want. Help!

You can cut through a lot of this fog, though, by using the results of some standardized performance benchmarks that are available free for the asking, over the Web.

"I advise customers, clients and analysts that benchmarks provide a common language to get relative information about one server versus another," says David Gelardi, director of IBM Systems Group Benchmarking and Design Centers." The TPC and SPEC benchmarks are all structured, and we all know the rules for the tests."

TPC (www.tpc.org) is the Transaction Processing Performance Council, a non-profit group whose members include most, if not all, server manufacturers and some software vendors as well. The group standardizes tests that produce benchmark numbers, and it keeps members from making exaggerated claims about their results. In the case of the TPC-C test, the group's flagship and probably most important benchmark for servers, the manufacturer runs the test in its own facility, but under the watchful eye of an independent auditor certified by the TPC, who makes sure the test was conducted according to the rules.

But while the TPC benchmarks (and similar ones from the Standard Performance Evaluation Corporation [SPEC] and from independent software vendors, such as SAP) are useful in getting everyone onto the same page, so to speak, Gelardi offers some cautions. "It's a coarse yardstick," he notes. If two systems post very close benchmark scores, he explains, you can be fairly sure they are roughly equivalent in performance. "You have to get to pretty broad differences," he says, to see a real difference. For example, if one server scores twice as high as another on the same benchmark, then there is a real difference between them.
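To make that coarse-yardstick rule concrete, here is a minimal sketch of how you might compare two published TPC-C throughput figures (tpmC). The server names, the scores, and the 1.5x "broad difference" threshold are all made up for illustration; they are not real TPC results or anything Gelardi or the TPC prescribes.

```python
# Coarse-yardstick comparison of two TPC-C results.
# All numbers below are hypothetical; tpmC is the TPC-C throughput metric.

def coarse_compare(name_a: str, tpmc_a: float, name_b: str, tpmc_b: float,
                   threshold: float = 1.5) -> str:
    """Treat two scores as roughly equivalent unless one exceeds the other by a broad margin."""
    if tpmc_a <= 0 or tpmc_b <= 0:
        raise ValueError("tpmC scores must be positive")
    ratio = max(tpmc_a, tpmc_b) / min(tpmc_a, tpmc_b)
    if ratio < threshold:
        return f"{name_a} and {name_b} look roughly equivalent (ratio {ratio:.2f})"
    leader = name_a if tpmc_a > tpmc_b else name_b
    return f"{leader} shows a real performance difference (ratio {ratio:.2f})"

print(coarse_compare("Server A", 120_000, "Server B", 250_000))
```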

Also, be careful when using benchmarks to compare hardware. Manufacturers have plenty of ways to "nudge" a configuration so it performs well on the test. For example, Gelardi says a TPC-C benchmark, which measures online transaction processing (OLTP) speed, is often run on a server with more memory than a typical configuration, "because the benchmark runs well in large memory." Your results, in other words, may not be the same.

So get more than one data point. Don't assume that a single benchmark tells the whole story just because it favors one particular server. "Don't use a single proof-point benchmark," Gelardi advises.
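One way to follow that advice, sketched below under invented numbers, is to tabulate several benchmark results per candidate server and normalize each metric against the best result before comparing, so no single test dominates. The servers, scores, and the choice of benchmarks are hypothetical; only the metric names (tpmC, SPECint_rate, SAP SD users) correspond to real published benchmarks.

```python
# Illustrative only: combine several benchmark data points per server
# instead of relying on a single proof-point. Scores are made up.

servers = {
    "Server A": {"TPC-C (tpmC)": 120_000, "SPECint_rate": 140, "SAP SD users": 900},
    "Server B": {"TPC-C (tpmC)": 250_000, "SPECint_rate": 150, "SAP SD users": 1_100},
}

# Best result per benchmark, used to normalize each score to the 0..1 range.
benchmarks = sorted({b for scores in servers.values() for b in scores})
best = {b: max(scores[b] for scores in servers.values()) for b in benchmarks}

for name, scores in servers.items():
    normalized = [scores[b] / best[b] for b in benchmarks]
    avg = sum(normalized) / len(normalized)
    print(f"{name}: average normalized score {avg:.2f} across {len(normalized)} benchmarks")
```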

Benchmarks can help you a lot in comparing various pieces of server hardware for your organization. But rather than just jump into the benchmark fray, do your homework, look at the server configurations that manufacturers used for their testing, and get more than one data point. This will give you a better chance of making the right choice.

David Gabel has been writing about, teaching and testing computers and electronic devices for 25 years.
