Dedupe Wars: Quantum, NEC Extend Deduping Capabilities

In a busy week for data deduplication, Quantum and NEC tout new features while FalconStor claims a speed record.

June 2, 2009


While the two largest independent storage vendors battle for control of Data Domain and its data deduplication technology, other storage vendors on Monday tried to stake out their own turf in the deduping wars. The move Monday by EMC to outbid NetApp for ownership of Data Domain shows how important deduplication technology is becoming to storage systems. Data deduplication announcements by Quantum, NEC and FalconStor also show that they are not willing to give ground on this issue.

Most storage vendors offer some form of data deduplication, which eliminates redundant copies of files and data and stores only a single copy, using pointers or markers to connect back to the stored or backed-up file. Deduping can take place in a variety of locations in a storage infrastructure (primary, secondary or tape storage), and vendors argue over the best way to employ the technology (in-line or post-processing). It has become popular for backup and archiving systems because it can substantially reduce the amount of data that needs to be stored and the storage capacity a business needs to buy. Now vendors are arguing that they have better data deduplication technology than their rivals.
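The basic mechanism is straightforward: incoming data is split into chunks, each chunk is fingerprinted with a hash, only previously unseen chunks are written to disk, and each file is recorded as a list of pointers back to the stored chunks. The toy Python sketch below illustrates that general idea only; the fixed 4 KB chunks, SHA-256 fingerprints and in-memory store are illustrative assumptions, not any particular vendor's implementation.

```python
import hashlib

class DedupeStore:
    """Toy chunk-level deduplication store: unique chunks are kept once,
    and each file is recorded as a list of pointers (chunk hashes)."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}   # hash -> chunk bytes (each unique chunk stored once)
        self.files = {}    # filename -> list of chunk hashes (pointers)

    def write(self, name, data):
        pointers = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            # Store the chunk only if it has not been seen before
            self.chunks.setdefault(digest, chunk)
            pointers.append(digest)
        self.files[name] = pointers

    def read(self, name):
        # Reassemble the file by following the pointers back to stored chunks
        return b"".join(self.chunks[h] for h in self.files[name])


store = DedupeStore()
payload = b"A" * 10000
store.write("backup_monday.dat", payload)
store.write("backup_tuesday.dat", payload)   # duplicate data adds no new chunks
assert store.read("backup_tuesday.dat") == payload
print(f"logical bytes: {2 * len(payload)}, unique chunks stored: {len(store.chunks)}")
```

Backing up the same payload twice stores its chunks only once, which is where the capacity savings the vendors advertise come from.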

"The data deduplication market has grown up," said Arun Taneja, founder of consultancy Taneja Group. "Just having deduping is not enough today. It has become mainstream technology and differentiation is going to start to make a difference."

Quantum expanded its line of storage systems with data deduping on Monday by introducing an appliance for remote and branch offices and a new version of its management software. NEC said it improved its deduplication capabilities by adding content awareness to its software. And FalconStor said its systems can store, dedupe, replicate and restore data faster than competitive systems.

Quantum added the DXi2500-D for remote offices to a product line that includes the DXi7500 Enterprise for data centers, the DXi7500 Express for mid-sized sites and the DXi3500 for small and mid-sized businesses. The new device can ingest data at 300 GB per hour and offers RAID 6 for data protection, the company said. It also provides replication and OST support and can substantially reduce bandwidth needs.

"In most businesses backup is incredibly siloed. It is done in one spot and often not connected," said Steve Whitner, a product marketing manager at Quantum "We are letting people connect islands of backup data and making backup into a networked application." The product is designed for smaller remote offices with about 1 TB of primary data.

Quantum wants businesses to install the 2500-D in many remote offices, so it built the system on an inexpensive Dell server and set a list price of $12,500. "It has a simple NAS interface. You simply attach it and use it as a point on the network. It works with any local backup software," he said.

Version 3.0 of its Vision management software aims to tie together Quantum's disk and tape systems and provide centralized backup management and reporting. It monitors changes in servers, switches and software in multiple locations and lets IT managers do trend analysis. "The real value is connecting these islands of data," Whitner said. A single license costs $3,750.

NEC, meanwhile, upgraded its software to add content awareness to its data deduplication technology to improve data reduction ratios while continuing to argue that its grid architecture scales better than competing products. Gideon Senderov, director of product management for the advanced storage products group at NEC, said application awareness lets the company's HydraStor systems increase data reduction rates by 130 percent.

It does that by analyzing the metadata that backup systems wrap around data that is being backed up and eliminating redundancy in the metadata. "Backup software inserts blocking tags and you end up with metadata interleaved with the data, so it looks different to the deduplication software when it has tags as part of it. It can't be resolved without something to take care of the metadata," Senderov said.

There are different kinds of metadata, including file-level, block-level and agent-side metadata, and different backup applications use various forms of those approaches. "We can separate out the tags from the user data, regardless of when it comes in or whether you use two different backup or archiving apps, so it will all look the same to the deduplication software," he said. "We chunk up the data and store it in multiple chunks, which produces a higher dedupe ratio."
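A rough sketch of why that matters: when two backup applications wrap the same user data in different interleaved tags, the raw streams chunk and hash differently, so a dedupe engine sees no matches; strip the known tags first and the chunks line up again. The tag format, block size and chunk size below are invented for illustration and are not NEC's actual stream format.

```python
import hashlib
import random

CHUNK = 1024
BLOCK = 4000  # hypothetical: the backup app inserts a tag before every BLOCK bytes of data

def chunk_hashes(stream, chunk=CHUNK):
    return {hashlib.sha256(stream[i:i + chunk]).hexdigest()
            for i in range(0, len(stream), chunk)}

def wrap_with_tags(payload, tag, block=BLOCK):
    # Invented backup format: a short tag interleaved ahead of each block of user data
    out = bytearray()
    for i in range(0, len(payload), block):
        out += tag + payload[i:i + block]
    return bytes(out)

def strip_tags(stream, tag, block=BLOCK):
    # The "content aware" step: remove the known tags so only user data is chunked
    out, step = bytearray(), len(tag) + block
    for i in range(0, len(stream), step):
        out += stream[i + len(tag):i + step]
    return bytes(out)

payload = random.Random(0).randbytes(16000)        # identical user data in both backups
backup_a = wrap_with_tags(payload, b"<APP-A>")     # 7-byte tag
backup_b = wrap_with_tags(payload, b"<APP-B!>")    # 8-byte tag shifts everything that follows

raw_shared = chunk_hashes(backup_a) & chunk_hashes(backup_b)
aware_shared = chunk_hashes(strip_tags(backup_a, b"<APP-A>")) & \
               chunk_hashes(strip_tags(backup_b, b"<APP-B!>"))
print(f"shared chunks without tag removal: {len(raw_shared)}")    # 0: nothing lines up
print(f"shared chunks after tag removal:   {len(aware_shared)}")  # 16: everything matches
```

In this toy case the two raw streams share no chunks at all, while the tag-stripped streams dedupe completely, which is the kind of gain content awareness is meant to recover.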

Initially, the free software upgrade supports NetBackup and Simpana and NEC plans to add support for NetWorker, Data Protector, Tivoli Storage Manager and other applications.

FalconStor on Monday tried to change the focus of the deduping debate by claiming the "fastest total time to disaster recovery in the industry." The company said data reduction and dedupe ratios are fine, but what really counts is how long it takes to back up, dedupe, replicate and restore data.

The company said its virtual tape library was able to ingest 100 TB of backup data in 10 hours, dedupe it in 14 hours, replicate it to a remote site while the deduplication was taking place, and restore the data in 11.6 hours. The key is that FalconStor is able to start deduplicating the backup data as it arrives and to begin replication once deduplication starts. The simultaneous, overlapping processes reduce the total time it takes to run through the entire cycle, said Fadi Albatal, director of marketing.
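Back-of-the-envelope arithmetic shows why the overlap matters. In the sketch below, only the 10-hour ingest and 14-hour dedupe figures come from FalconStor's stated test; the replication time and the start lags are hypothetical. Run serially, the elapsed time is the sum of the stages; pipelined, it approaches the finish time of the slowest stage.

```python
# Stage durations in hours. Ingest (10) and dedupe (14) are from FalconStor's claim;
# the replication time and the start lags below are assumed for illustration.
ingest, dedupe, replicate = 10.0, 14.0, 12.0

# Strictly serial: each stage waits for the previous one to finish completely.
serial_total = ingest + dedupe + replicate

# Overlapped: dedupe begins shortly after the first backup data arrives, and
# replication begins shortly after the first deduplicated blocks exist.
dedupe_lag = 1.0       # assumed delay before dedupe can start
replicate_lag = 2.0    # assumed delay before replication can start
overlapped_total = max(ingest,
                       dedupe_lag + dedupe,
                       replicate_lag + replicate)

print(f"serial: {serial_total:.1f} h, overlapped: {overlapped_total:.1f} h")
# e.g. serial: 36.0 h, overlapped: 15.0 h under these assumptions
```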

"We can beat any existing solution on the market when it comes to backup performance and deduplication performance. But what is really important to enterprise IT departments is the amount of time it takes for disaster recovery and to restore the data," he said.

These announcements and debates show that the deduplication battles have just started, said Taneja. "Data deduplication is at the center of the storage universe and we now have several players with fundamentally good technology. Businesses are going to need some type of technology to cull data to avoid being overwhelmed by a growing data tsunami. It is going to become part of every aspect of storage," he said.

While vendors tout performance numbers or reduction ratios, "a lot of claims have been made but there is no comparative data. In the end, these vendors will have to prove it to the market. Most IT guys don't care how you do it," Taneja said.

Most IT departments are still exploring data deduplication and have not yet implemented the technology, said Lauren Whitehouse, an analyst at Enterprise Strategy Group. "Our research shows that the adoption rate is pretty low and the market is pretty fragmented," she said.

More companies are looking to eliminate tape and move to disk backup, and they are looking at data reduction technologies to reduce the amount of storage capacity they need and make it easier to replicate data to another site for disaster recovery, she said. However, vendor hype over the various forms of deduplication is "just creating a lot of confusion for end users," Whitehouse said. "You can't just look at reduction ratios. That is the least of the end user concerns. They are more concerned about cost and manageability."

