
Big Data: A Backup Nightmare?

As the technology landscape changes, some things remain the same. In particular, the need to protect data is a constant, and the modern data center is expected to meet it.

The investments companies have made in the data center are the only practical way to satisfy new demands from end users: 24/7 access to data and applications, exponentially growing data volumes, and, most importantly, no downtime or data loss. Data growth is exploding -- a point corroborated by analyst firms and companies across the globe.

When it comes to protecting this ever-expanding data profile, companies must thoroughly evaluate the modern data center and its attributes. The modern data center has some key characteristics: It is highly virtualized, runs on modern storage systems, and has a cloud strategy. These characteristics shape the availability strategy the data center must pursue. In particular, protecting large amounts of data becomes a function of the same components that provision and serve that growing data.

Ultimately, protecting big data isn't that difficult; ensuring that recovery meets the expectations of the 24/7 enterprise is another matter.

To this end, companies should seek solutions to protect big data that deliver at least three capabilities. The first of these is data loss avoidance. Near-continuous data protection and streamlined recovery are absolute essentials for the modern data center. The second capability that applies well to big data is high-speed recovery. Given the expectations described above, there is a clear need to rapidly restore what you want, the way you want it.

The last capability, and the one I feel is most often overlooked, is complete visibility. Not knowing what is going on in a data profile is an enormous risk. Proactive monitoring and alerting that surface issues before operations are impacted are must-haves.
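As a rough illustration of what that visibility could look like, here is a minimal sketch in Python. The job names, thresholds, and data source are hypothetical; in practice the records would come from whatever reporting interface the backup platform provides. The idea is simply to flag jobs whose last successful run has aged past an alerting window, so problems surface before a restore is ever needed.

```python
from datetime import datetime, timedelta

now = datetime.now()

# Hypothetical job records: job name -> time of last successful run.
# In practice these would come from the backup platform's reporting API or logs.
last_success = {
    "sql-cluster-nightly": now - timedelta(hours=7),
    "hadoop-archive": now - timedelta(days=2),
    "file-server-hourly": now - timedelta(minutes=50),
}

ALERT_WINDOW = timedelta(hours=24)  # assumed threshold for raising an alert

def stale_jobs(history, as_of, window):
    """Return jobs whose last successful run is older than the alert window."""
    return {name: t for name, t in history.items() if as_of - t > window}

for name, t in stale_jobs(last_success, now, ALERT_WINDOW).items():
    print(f"ALERT: {name} has not completed successfully since {t:%Y-%m-%d %H:%M}")
```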

These capabilities sound great, but how do we get there?

I believe that the availability strategy has to take a close look at the characteristics of the data center: virtualization, modern storage systems, and cloud. These attributes dictate the achievable levels of availability, such as recovery time objectives (RTOs) and recovery point objectives (RPOs) of 15 minutes or less. Will that satisfy the expectations placed on IT services today? In most situations, I believe so.
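To make the RPO side of that concrete, here is a minimal sketch, assuming restore-point timestamps are available from the backup platform (the timestamps and the 15-minute target below are illustrative), of how one might check whether the newest restore point actually keeps data-loss exposure within the objective.

```python
from datetime import datetime, timedelta

RPO_TARGET = timedelta(minutes=15)  # assumed objective from the availability strategy

def rpo_met(restore_points, as_of, target=RPO_TARGET):
    """True if the newest restore point is within the RPO target of 'as_of'."""
    if not restore_points:
        return False
    return as_of - max(restore_points) <= target

# Hypothetical restore-point timestamps for a single workload.
now = datetime.now()
points = [now - timedelta(minutes=m) for m in (9, 24, 39)]

print(f"Current data-loss exposure: {now - max(points)}")
print("RPO met" if rpo_met(points, now) else "RPO missed")
```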

Asking companies how quickly they can recover what they need -- in the form they need it -- is one good measure of high-speed recovery. Whether an entire workload needs to be recovered, or just an application item or a single file, companies need to ask if their availability strategy meets the 24/7 expectations of their users.
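One way to put a number on that question is to time a test restore and compare it against the recovery time objective. The sketch below is a hypothetical harness: the restore function is a stand-in for whatever mechanism the backup tooling actually provides, whether it recovers a whole workload, an application item, or a single file.

```python
import time
from datetime import timedelta

RTO_TARGET = timedelta(minutes=15)  # assumed objective

def timed_restore(restore_fn, *args):
    """Run a restore operation and report how long it took against the RTO target."""
    start = time.monotonic()
    restore_fn(*args)
    elapsed = timedelta(seconds=time.monotonic() - start)
    status = "within" if elapsed <= RTO_TARGET else "exceeds"
    print(f"Restore took {elapsed} ({status} the {RTO_TARGET} RTO target)")
    return elapsed

def restore_single_file(path):
    # Placeholder for the real restore mechanism.
    time.sleep(1)

timed_restore(restore_single_file, "/data/reports/q4.csv")
```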

Big data can be a big problem in terms of availability. A company first needs to assess the data center and the characteristics underneath the data; only then can it achieve availability levels that close the gap between legacy backup solutions and today's expectations. For CIOs, the gap between the availability they can provide and what end users demand is unsettling, and it's a surprisingly common scenario.

Should big data be a big worry? It doesn't have to be. Companies that can protect big data to match the expectations of the business are doing so by keeping availability a priority and ensuring the data center is modern. Data centers that are not investing wisely in virtualization, modern storage systems, and a cloud strategy will have a more difficult journey to the availability levels expected of big data applications.

And that's when the big problems occur.