Replication Key To Protecting Data
September 15, 2011
IT's role in keeping an organization's doors open for business rests on a tripod--business continuity (BC), disaster recovery (DR) and high availability (HA)--and, increasingly, these initiatives are being supported by replication technologies, says David Chapa, senior analyst at Enterprise Strategy Group (ESG). The author of the new market landscape report, Replication Technologies for Business Continuity and Disaster Recovery, says this trio centers on keeping IT systems and applications, and thus the business processes that rely on them, available regardless of what happens (a site disaster, hardware failure, and so on).
According to ESG's 2011 IT Spending Intentions Survey, the top 10 spending priorities for 2011 include BC and DR programs. Chapa says this illustrates not only the continued challenge IT faces in selecting the right tools to meet BC/DR needs, but also the growing influence of virtualized environments, which upend the more traditional methods and approaches used in physical environments.
Active in the data protection field for more than 20 years, Chapa says most organizations talk about DR, but few really do anything about it. He adds that, as data volumes continue to soar and their relative importance grows, doing nothing is a recipe for disaster. At the very least, companies need to put a plan in place to recover if and when disaster strikes.
Disasters, and the havoc they may wreak on the business, are mitigated by solid plans, says Chapa. Traditional file-based backup is most often the option employed for DR and BC, but with downtime tolerances now measured in hours and minutes, backup-only approaches may in fact leave a company exposed to risk.
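To make that exposure concrete, here is a rough, hypothetical comparison of worst-case data loss (the recovery point) under a few protection schemes; the intervals are illustrative assumptions, not figures from ESG's research.

```python
from datetime import timedelta

# Hypothetical protection schemes and how stale the last good copy can be.
# The intervals are illustrative assumptions, not vendor or ESG figures.
protection_schemes = {
    "nightly file-based backup": timedelta(hours=24),
    "hourly snapshot plus backup": timedelta(hours=1),
    "asynchronous replication": timedelta(minutes=5),  # assumed replication lag
}

# Worst-case data loss if a disaster strikes just before the next cycle completes.
for scheme, interval in protection_schemes.items():
    print(f"{scheme:30s} worst-case data loss ~ {interval}")
```

Measured against a tolerance of an hour or less, the backup-only row is the one that leaves the gap Chapa describes.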
He says he's been seeing a lot of customers looking at mashups of backup and disaster recovery, and asking why their DR can't also be their backup. With DR driven primarily by replication, why couldn't that replica serve as the backup as well?
While replication has been part of the IT toolkit for many years, Chapa believes it will play a key role in improving DR programs today and in the future. In the past, it often required a lot of expensive bandwidth, as well as buying two identical systems to keep the source and target devices the same, because optimizations such as compression, deduplication and other data reduction technologies weren't available (a rough sketch of how deduplication trims replication traffic appears below).

Protecting and recovering data is (or should be) all about the business, says Chapa. However, IT is not ordinarily the data owner, and without understanding the true value of data and its time-to-recover requirements, IT tends to treat all data equally. Unfortunately, all data, services and business units are not created equal and, as such, require different approaches.
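Here is the sketch promised above: a minimal, hypothetical illustration of hash-based deduplication in which each block of data is fingerprinted and only blocks the target has not already stored are compressed and shipped across the wire. The chunk size, hash and compression choices are illustrative assumptions, not a description of any particular product.

```python
import hashlib
import zlib

CHUNK_SIZE = 4096  # illustrative fixed-size chunks; real products vary


def replicate(source_bytes: bytes, target_store: dict) -> int:
    """Send only chunks the target has not seen; return bytes sent over the 'wire'."""
    bytes_sent = 0
    for offset in range(0, len(source_bytes), CHUNK_SIZE):
        chunk = source_bytes[offset:offset + CHUNK_SIZE]
        fingerprint = hashlib.sha256(chunk).hexdigest()
        if fingerprint not in target_store:      # deduplication: skip known chunks
            payload = zlib.compress(chunk)       # compression: shrink what is sent
            target_store[fingerprint] = payload
            bytes_sent += len(payload)
    return bytes_sent


# Repetitive data replicates cheaply: most chunks deduplicate away.
target = {}
data = b"same old block of data " * 50_000
print("bytes sent:", replicate(data, target), "of", len(data))
```

The point of the toy example is simply that repeated chunks cost almost nothing to replicate, which is what makes wide-area replication workable over modest bandwidth.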
ESG divides data into three tiers: tier 1 is mission-critical data and applications with the highest requirements for both availability and performance; tier 2 is data and applications requiring good performance and reliability, but not at the level of mission-critical data; and tier 3 is typically archived data (performance is less essential, but the data must be retrievable).
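One way to act on that tiering is to attach explicit recovery objectives and a protection method to each tier. The sketch below is a hypothetical policy table; the RPO/RTO figures and method names are assumptions for illustration, not ESG recommendations.

```python
from dataclasses import dataclass
from datetime import timedelta


@dataclass
class TierPolicy:
    tier: int
    description: str
    rpo: timedelta   # maximum tolerable data loss
    rto: timedelta   # maximum tolerable downtime
    protection: str  # assumed protection method for this sketch


# Illustrative mapping, loosely following the three ESG tiers described above.
POLICIES = [
    TierPolicy(1, "mission-critical data and applications",
               rpo=timedelta(minutes=5), rto=timedelta(hours=1),
               protection="synchronous or near-synchronous replication"),
    TierPolicy(2, "important, but not mission-critical",
               rpo=timedelta(hours=4), rto=timedelta(hours=8),
               protection="asynchronous replication plus periodic backup"),
    TierPolicy(3, "archived data",
               rpo=timedelta(hours=24), rto=timedelta(days=2),
               protection="scheduled backup to lower-cost storage"),
]

for p in POLICIES:
    print(f"tier {p.tier}: RPO {p.rpo}, RTO {p.rto} -> {p.protection}")
```

Writing the objectives down this way forces the conversation Chapa calls for: the business owners, not IT, decide which tier a given data set belongs in.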
More than half (53%) of the respondents indicated that downtime for their tier 1 data cannot exceed an hour without causing adverse business impact. An aggregate 74% of respondents cited a downtime tolerance of three hours or less for tier 1 data, compared with 47% of respondents with similar requirements for tier 2 data and just 26% for tier 3 data. Conversely, 42% of respondents report that their organization can tolerate a day or more of downtime for tier 3 data before suffering any adverse business impact.
If the changing role of replication was one surprise from the survey, Chapa says the impact of virtualization was the second. Virtualization complicates the situation because organizations must look to solutions that will not only protect the physical environment but also integrate into the virtual environment. ESG says more than half of the surveyed organizations (55%) are using server virtualization, with another 34% planning to do so.
If virtualization adds complexity to the data protection puzzle, it also offers the possibility of a simpler, more holistic approach to BC, DR and HA, especially as it gains more momentum and maturity in the market, says Chapa. Throw in the fact that replication technologies are now more affordable to all customers, and he believes a lot more attention will be focused on data protection. "Selecting the right technologies to meet goals, initiatives and objectives for the data or systems to be protected will save IT time, resources and money," he says.
See more on this topic by subscribing to Network Computing Pro Reports Research: 2011 State of Storage (subscription required).