DIY Lab Testing
As information technology gets more complex, it's essential to test new hardware and software thoroughly and systematically to ensure compatibility, performance, scalability and more.
September 22, 2003
There are no exceptions to this rule. Even shops that run the most sophisticated capacity-planning and performance-monitoring software must do extensive testing to determine their scalability needs and challenges. In fact, testing goes hand in hand with capacity planning (for more on capacity planning see the Playbook in this issue). Let's say your capacity-planning model indicates that, due to projected load increases, you'll need to move from a two-way server to a four-way server in about six months. You'd want to run the projected load through your existing hardware to see how it behaves, then run it through the new hardware to be sure the four-way will be sufficient. Testing lets you verify the accuracy of your models sooner rather than later.
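To make the idea concrete, here's a minimal load-generation sketch in Python. Everything in it is illustrative, not a prescribed methodology: the target URL, concurrency level and request count are hypothetical placeholders you'd replace with figures from your own capacity-planning model.

```python
# Minimal load-generation sketch. TARGET_URL, CONCURRENCY and REQUESTS
# are hypothetical placeholders standing in for your projected load.
import concurrent.futures
import time
import urllib.request

TARGET_URL = "http://test-server.example.com/"  # hypothetical system under test
CONCURRENCY = 50    # simulated simultaneous clients
REQUESTS = 1000     # total requests representing the projected load

def one_request(_):
    """Issue a single request and return its latency in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(TARGET_URL) as resp:
        resp.read()
    return time.perf_counter() - start

with concurrent.futures.ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    latencies = list(pool.map(one_request, range(REQUESTS)))

latencies.sort()
print(f"requests:  {len(latencies)}")
print(f"median:    {latencies[len(latencies) // 2]:.3f} s")
print(f"95th pct:  {latencies[int(len(latencies) * 0.95)]:.3f} s")
```

Run the same script against the two-way box and the four-way box under the projected load, and the latency percentiles give you a direct, like-for-like check on what the capacity model predicted.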
We Never Said It Would Be Easy
Testing takes time and money. It also takes expertise. You need the right people with the right training to float this boat. But finding qualified staff and developing appropriate test methodologies are among the biggest challenges faced by IT managers aiming to build efficient in-house test labs, according to our recent poll of 889 readers (see poll results, right). You can't test the performance of an enterprise messaging server just by measuring SMTP throughput, for instance--that wouldn't give you the complete picture.
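To see why a single metric misleads, consider a bare-bones SMTP throughput probe in Python -- exactly the kind of one-dimensional test we're warning about. The hostname and addresses here are hypothetical placeholders.

```python
# Bare-bones SMTP throughput probe. MAIL_HOST and the addresses are
# hypothetical; this measures one metric and nothing else.
import smtplib
import time

MAIL_HOST = "mail.example.com"   # hypothetical messaging server under test
MESSAGES = 200

body = "Subject: throughput probe\r\n\r\n" + "x" * 1024  # ~1 KB message

start = time.perf_counter()
with smtplib.SMTP(MAIL_HOST) as smtp:
    for _ in range(MESSAGES):
        smtp.sendmail("[email protected]", ["[email protected]"], body)
elapsed = time.perf_counter() - start

print(f"{MESSAGES / elapsed:.1f} messages/sec accepted via SMTP")
# Says nothing about delivery latency, mailbox access (IMAP/POP),
# directory lookups or behavior under concurrent users -- hence the
# need for a broader test methodology.
```

A number like "42 messages per second" tells you the server accepts mail quickly; it says nothing about whether users can read it, which is why a sound methodology measures the whole service, not one protocol.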
Other major obstacles, according to our survey: securing ongoing funding, maintaining management support, demonstrating ROI and justifying TCO (total cost of ownership).
The size and scope of in-house test programs vary significantly from company to company. Only 27 percent of the reader organizations we polled insist on formal testing with established protocols before any product can be put into a live production system, though another 58 percent require formal testing of some products. And while 35 percent of respondents with in-house test labs say they have official testing facilities, 65 percent say they do ad hoc testing, typically by systems administrators or developers familiar with the systems under test.

Of course, you can always have someone else do the dirty work. Companies that can't commit the necessary resources or just don't want the burden of creating, upgrading and maintaining in-house facilities may opt to outsource testing of critical products to independent third-party labs, which can develop the test methodology, do the testing and provide a comprehensive report detailing their findings.
The downside, of course, is that most testing services can't mimic your production systems precisely, and they can't provide the flexibility to make changes when unforeseen problems arise during testing. Another catch: You need someone in-house to manage the relationship with the outside lab, to negotiate the scope of each project and ensure that the lab's test methodology and setup will yield data that's applicable to your production environment. Only 4 percent of our survey respondents say they outsource product testing.
Another alternative is to make use of your vendor's reference lab. Sun Microsystems' iForce Centers (www.sun.com/executives/iforce/centers/), for example, are designed to help Sun's customers test prototypes prior to implementation. Sun also develops reference architectures (www.sun.com/products/architectures-platforms/refarch/features.html) to help customers develop size-appropriate solutions. Other vendors, including Microsoft (www.microsoft.com/usa/mtc/overview.asp), offer similar services.
Network Computing has been doing hands-on testing of IT hardware, software and services and reporting on our methodology and results since October 1990. Our goal is to help you make informed purchase decisions by cutting through the vendors' hype and identifying the products' strengths and weaknesses.
But even our most exhaustive tests can't guarantee that a winning product will work in your environment. You can read our analyses and rely on our findings to come up with your own shortlist, but ultimately you need to see for yourself. That's why we've devoted the bulk of this issue to a discussion of the inner workings of a comprehensive product-testing program. Here's a glimpse at the features you'll find inside:
• "Justify My Lab," Contributing editor Jonathan Feldman provides step-by-step instructions for building a business plan and getting executive buy-in for your in-house lab.• "Tool Time," Senior technology editor Mike Fratto tells you how to choose tools to design top-notch test beds and develop solid test methodologies.
• "Our World, Welcome To It," We take you on a guided tour of our own Real-World Labs®, with detailed diagrams of our test facilities in Chicago, Syracuse, N.Y., and Green Bay, Wis.
• "Testing To Go," Executive editor Bruce Boardman tells you where to turn if you need to outsource product testing.
• "Vendorspeak Exposed!" Technology editor Lori MacVittie and Syracuse University researcher Jeff Stanton provide a primer on deciphering vendor reports.
First, see more results from our reader poll about current approaches to product testing.

Ron Anderson is Network Computing's lab director. Before joining the staff, he managed IT in various capacities at Syracuse University and the Veterans Administration. Write to him at [email protected].