Time - The First Casualty Of Lack Of Storage Analytics

In IT, time is no longer on your side. Staffs are stretched too thin, new products and capabilities are coming at you too fast and requests for additional storage performance or capacity never stop. The problem is that there is a limited amount of time available to you to examine what is causing a performance bottleneck or how much capacity an application really needs. The current trends of virtualization, tiered storage and infrastructure consolidation make diagnoses even more challenging.

George Crump

January 15, 2010


Often the first solution your storage vendors will offer when you present them with a performance or capacity problem is to throw more hardware at it. That additional hardware or software then costs you more time: there is the obvious time spent implementing the product, and then the less obvious ongoing cost of managing another storage system or software task.

Often these so-called solutions just slap a bandage on the problem, and all these layers of fixes make your management burden heavier. In fairness, there are times when a quick fix is all you can legitimately afford or have time to apply. That's okay as long as you know that's what it is, and there are also times when new hardware or software genuinely does eliminate a performance problem.

Knowing the difference, and knowing what to apply, is the hard part. Having the time to understand the nature of the problem, or even to predict a problem before it occurs, especially in the dynamic environment the data center has become, is a luxury most organizations do not have. We have moved well beyond the days when a spreadsheet was a useful aid for managing the storage environment. What is needed is real-time or near real-time analytics.
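To make the contrast with spreadsheets concrete, here is a minimal sketch of the kind of automated, near real-time check such analytics tools run continuously. It is purely illustrative, not any vendor's method; the 85% threshold and the `check_capacity` helper are assumptions for the example, and real tools derive alert thresholds from historical growth trends rather than a fixed percentage.

```python
import shutil

# Illustrative threshold; real analytics tools derive alert levels
# from historical growth trends rather than a fixed percentage.
CAPACITY_ALERT_PCT = 85

def check_capacity(path="/"):
    """Return (percent used, alert flag) for the filesystem holding `path`."""
    usage = shutil.disk_usage(path)
    used_pct = usage.used / usage.total * 100
    return used_pct, used_pct >= CAPACITY_ALERT_PCT

if __name__ == "__main__":
    used, alert = check_capacity("/")
    print(f"{used:.1f}% used, alert={alert}")
```

Run on a schedule and fed into a trend database, even a check this simple answers capacity questions before anyone has to scramble for them; commercial tools layer performance metrics, tiering analysis and forecasting on top of the same idea.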

This is where analytical tools like those offered by Virtual Instruments, Vizioncore, Dynamic Ops and Tek-Tools can help. These tools can monitor and diagnose problems in your environment and help you track down the root cause. Many performance problems are caused by improper configurations that relatively simple tuning could have solved. The tools can also examine the performance characteristics of virtual servers, physical servers and tiers of storage to make sure you have the right applications and data on the right platform.

What you may find by implementing these tools is that you buy yourself more time. These types of tools give you a command console that allows you to see problems before they occur, optimize resources instead of rolling out new ones, and have information at the ready to answer questions instead of scrambling at the last minute. Who knows, you might even get to go home before the sun goes down on occasion.
