Big Data: A Big Pain In The Network
December 13, 2011
Data center-to-data center connectivity is a "silent killer" for big data deployments, according to a study by Infineta Systems, a provider of WAN optimization systems for big traffic, with a vested interest in big data networking. The study, conducted by Infineta in collaboration with Internet Research Group (IRG), examines the intersection of big data and the enterprise WAN, and states that the WAN infrastructure is stretched to the breaking point.
Infineta's customers started talking about big data traffic between data centers a year ago, says Haseeb Budhani, VP of products. The study was conducted in the last two months and involved in-depth interviews with more than 20 customers and a dozen vendors, he says. "We were surprised by how many people are using big data. This is not just an enterprise issue."
According to another new study, from IDC, big data analytics technologies will be one of the driving forces for IT spending through 2020 (IDC Predictions 2012: Competing for 2020). As businesses seek to squeeze high-value insights from this data, IDC expects to see offerings that more closely integrate data and analytics technologies, such as in-memory databases and BI tools, move into the mainstream. And, as in the cloud services market, 2012 is likely to be a busy year for big data-driven mergers and acquisitions as large IT vendors seek to acquire additional functionality.
There are a number of drivers behind the adoption of big data, states the Infineta study. Big data storage promises to be 25 to 100 times less expensive than traditional storage, making it an extremely attractive alternative for vertical markets experiencing uncontrollable data growth. Big data technologies also massively increase the scalability of data storage--Hadoop, for example, lets companies add petabytes of storage capacity for a fraction of traditional storage costs. Additionally, only 1% to 5% of unstructured data collected outside of big data deployments is actually analyzed. According to a recent McKinsey big data report, if the health care industry, for example, could capture and act upon the remaining 95% of that data, it could create up to $300 billion in value annually.
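To put that cost ratio in rough context, the back-of-envelope sketch below compares the cost of storing a 1 PB data set on traditional enterprise storage versus commodity, Hadoop-style storage. The per-terabyte prices are purely illustrative assumptions, not figures from the Infineta or McKinsey studies.

```python
# Back-of-envelope storage cost comparison (hypothetical prices, not from the study).
TRADITIONAL_COST_PER_TB = 5_000   # assumed $/TB for traditional enterprise storage
COMMODITY_COST_PER_TB = 100       # assumed $/TB for commodity disks in a Hadoop cluster

dataset_tb = 1_000                # a 1 PB data set

traditional_total = dataset_tb * TRADITIONAL_COST_PER_TB
commodity_total = dataset_tb * COMMODITY_COST_PER_TB

print(f"Traditional storage, 1 PB: ${traditional_total:,}")
print(f"Commodity (Hadoop-style) storage, 1 PB: ${commodity_total:,}")
print(f"Cost ratio: {traditional_total / commodity_total:.0f}x")
```

Under these assumed prices the gap works out to 50x, squarely inside the 25x to 100x range the study cites.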
The WAN enables the aggregation of data for analysis, and the subsequent distribution of results, for better business decisions, but most WANs aren't up to the task, according to the Infineta study. An estimated 80% to 90% of big data is semi-structured or unstructured, unlike the structured data found in conventional data sets, so traditional storage and analytics systems aren't well suited to this newest driver of big traffic. Big data is characterized by aggregate volumes in the hundreds of terabytes to petabytes of distributed data, and mission-critical data sets are doubling or tripling in size every two years on average; Gartner forecasts that enterprise data will grow approximately 800% by 2015, states the Infineta report.
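Those figures are roughly self-consistent: tripling every two years compounds to about a ninefold increase over four years, in line with an 800% growth forecast. The short sketch below, which assumes a 500 TB starting data set and a dedicated 1 Gbps inter-data-center link (illustrative numbers, not from the report), shows how quickly that kind of growth outruns a fixed WAN pipe.

```python
# Compounding the growth rates cited in the report (illustrative starting size and link speed).

def project(size_tb, factor_per_2yr, years):
    """Project data set size, assuming it multiplies by factor_per_2yr every two years."""
    return size_tb * factor_per_2yr ** (years / 2)

start_tb = 500  # assumed starting data set size, in TB
for factor in (2, 3):  # doubling vs. tripling every two years
    end_tb = project(start_tb, factor, years=4)
    growth_pct = (end_tb / start_tb - 1) * 100
    print(f"x{factor} every 2 years: {start_tb} TB -> {end_tb:.0f} TB after 4 years ({growth_pct:.0f}% growth)")

# Time to replicate the grown data set over a dedicated 1 Gbps inter-data-center link.
link_gbps = 1.0
grown_tb = project(start_tb, 3, years=4)        # ~4,500 TB
seconds = grown_tb * 8e12 / (link_gbps * 1e9)   # TB -> bits, divided by bits per second
print(f"Replicating {grown_tb:.0f} TB at {link_gbps} Gbps takes about {seconds / 86400:.0f} days")
```

At that scale, even a full year of exclusive use of a 1 Gbps link isn't enough to move one copy of the data set between data centers.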
Because it drives data aggregation, processing and results distribution, big data generates massive amounts of new data--measured in petabytes--that in turn require more and more bandwidth. In many cases, companies are being forced to cap big data traffic before it hits the WAN in order to avoid massive application performance issues and debilitating WAN congestion, according to the findings.
The growing adoption of big data will spur widespread enterprise pain as organizations run into WAN bandwidth and latency problems, states the report. Designed for low-capacity branch WAN traffic, traditional WAN optimization solutions cannot scale to meet big data-specific WAN throughput and performance needs or address the inter-data-center WAN bottleneck. The study adds that simply purchasing additional WAN bandwidth is extremely expensive in terms of both capital expenditures (WAN equipment upgrades) and operational expenditures (incremental bandwidth costs), and does nothing to address performance degradation due to WAN latency.
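One reason extra bandwidth doesn't cure the latency problem: a single TCP flow's throughput is bounded by roughly its window size divided by the round-trip time, regardless of how fat the link is. The sketch below applies that standard bound to a few window sizes and inter-data-center RTTs; the specific values are assumptions for illustration, not figures from the study.

```python
# Per-flow TCP throughput ceiling: roughly window size / round-trip time.
# Window sizes and RTTs below are assumed for illustration, not figures from the study.

def max_throughput_mbps(window_bytes, rtt_ms):
    """Upper bound on a single TCP flow's throughput, in Mbps."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1e6

for window_kb in (64, 1024):        # default vs. tuned TCP window
    for rtt_ms in (5, 50, 100):     # metro, cross-country, intercontinental RTTs
        mbps = max_throughput_mbps(window_kb * 1024, rtt_ms)
        print(f"{window_kb:>4} KB window, {rtt_ms:>3} ms RTT: <= {mbps:>8,.1f} Mbps per flow")
```

With a default 64 KB window and a 50 ms round trip, a single flow tops out around 10 Mbps even on a 10 Gbps link, which is why latency, not raw capacity, is often the binding constraint between data centers.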
"This is the killer app," says Budhani. "We see big data changing the game for WAN optimization. If you're not ready for it, you won't participate in it."