New Data Management Report Shows Proactive Approach Saves Money, Boosts Security

A new report by Dan Power, founder of Hub Designs, looks at the differences between proactive and reactive approaches to data management, and explores the benefits of proactive data governance.

August 13, 2009


According to Ravi Shankar, Senior Director of Product Marketing for Siperian, the data management software firm that sponsored the report, the key to effective data storage and retrieval is rationalizing data across the entire enterprise. He explains the concept of Master Data Management (MDM): "The data that's the lifeblood of the business is held in several systems and applications. Collectively, that's the master data." Shankar continues, "Any large company will have multiple systems containing this data in which the master data is replicated. At some point you reach a choke point where you don't know which version of the data to trust, or where all the versions are stored. The MDM system brings together all the duplicates and creates a single master view of the critical data."
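
Neither Shankar nor the report includes code, but the consolidation idea he describes can be sketched. The following Python fragment is a minimal illustration only; the record fields and the "keep the most complete value" survivorship rule are assumptions for this example, not Siperian's actual algorithm. Duplicate records from several systems are grouped by a shared key and collapsed into one master record:

```python
# Minimal sketch of "golden record" consolidation: duplicates are grouped
# by a natural key, then merged by preferring the longest (most complete)
# non-empty value for each field. A real MDM system uses far richer rules.
from collections import defaultdict

def build_master_view(records, key_field="customer_id"):
    groups = defaultdict(list)
    for rec in records:                       # group duplicates across systems
        groups[rec[key_field]].append(rec)

    masters = {}
    for key, dupes in groups.items():
        master = {}
        fields = {f for rec in dupes for f in rec}
        for field in sorted(fields):
            values = [rec.get(field, "") for rec in dupes]
            master[field] = max(values, key=len)   # crude survivorship rule
        masters[key] = master
    return masters

# Two systems hold slightly different copies of the same customer.
crm = {"customer_id": "C42", "name": "Acme Corp", "phone": ""}
erp = {"customer_id": "C42", "name": "Acme Corporation", "phone": "555-0100"}
print(build_master_view([crm, erp])["C42"])
# {'customer_id': 'C42', 'name': 'Acme Corporation', 'phone': '555-0100'}
```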

From a data management perspective, the central question is whether the single master version of the data is created after the data is entered (the reactive model) or as a result of rules established by IT management and enforced by software (the proactive model). In his report, When Data Governance Turns Bureaucratic: How Data Governance Police Can Constrain the Value of Your Master Data Management Initiative, Dan Power argues that the proactive model saves any enterprise organization significant time and money over the reactive model.
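
The distinction is easier to see in code. Below is a hedged sketch, not anything prescribed by the report: the rules and field names are invented for illustration. In the proactive model, software enforces the rules at the moment a record is created, so bad data never enters the system; in the reactive model, the same rules run later as a cleanup pass over data that was accepted unchecked.

```python
import re

# Illustrative entry rules; a real MDM deployment would define its own.
RULES = {
    "email":   lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "country": lambda v: v in {"US", "DE", "JP"},   # toy reference list
}

def create_record_proactive(store, record):
    """Proactive model: reject the record at entry if any rule fails."""
    bad = [field for field, ok in RULES.items() if not ok(record.get(field, ""))]
    if bad:
        raise ValueError(f"rejected at entry, invalid fields: {bad}")
    store.append(record)

def cleanup_reactive(store):
    """Reactive model: sweep already-stored data and flag rule violations."""
    return [r for r in store if any(not ok(r.get(f, "")) for f, ok in RULES.items())]

store = []
create_record_proactive(store, {"email": "a@b.com", "country": "US"})   # accepted
# create_record_proactive(store, {"email": "bad", "country": "US"})     # raises
```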

Shankar explains that the reactive model depends on "data stewards," individuals who make decisions on reconciling the often trivial differences in field contents that make data rationalization difficult in a large organization. Power points out that these data stewards can become significant workflow and data access bottlenecks, especially when they lack the content knowledge required to make those decisions without consulting employees in sales, marketing, or other lines of business.
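
Many of the differences stewards adjudicate are mechanical: casing, punctuation, and common abbreviations. Those can be codified, as in the sketch below, where the normalization table is an assumption for illustration. Records whose normalized forms match need no steward; only genuinely ambiguous cases escalate to a human.

```python
# Hypothetical normalization rules for company names; a real deployment
# would tune and extend these.
ABBREVIATIONS = {"corp.": "corporation", "inc.": "incorporated", "st.": "street"}

def normalize(name):
    tokens = name.lower().replace(",", "").split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def same_entity(a, b):
    return normalize(a) == normalize(b)

print(same_entity("Acme Corp.", "ACME Corporation"))  # True: no steward needed
print(same_entity("Acme Corp.", "Apex Corporation"))  # False: escalate to a human
```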

Power's report argues that training and empowering these business-line employees to create proper records, while preventing unqualified personnel from creating or modifying data, eliminates the need for the laborious data verification process. Reducing the sources of potentially incorrect data will also, according to Power, reduce data storage requirements and overall data management costs for the organization.
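
A minimal sketch of that constraint follows; the role names and record shape are hypothetical, since the report prescribes no particular mechanism. The point is simply that write access to master data is gated on qualification:

```python
# Hypothetical authorization gate: only trained, qualified roles may
# create or modify master data, so bad records never need later verification.
QUALIFIED_ROLES = {"sales_ops", "marketing_ops"}   # invented role names

def write_master_record(store, user, record):
    if user.get("role") not in QUALIFIED_ROLES:
        raise PermissionError(f"{user.get('name', 'unknown')} may not edit master data")
    store[record["customer_id"]] = record
```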

Normalizing and rationalizing data from the moment of creation is, according to Shankar, a practice that saves more money and resources the larger an organization grows. "A large global company might have 70 to 105 million records across the organization. When you look at the number of records and the number of people dealing with the data, it's huge," he says. Looking at the paper's recommendations, Shankar says that companies often begin the master data management process in reactive mode, but become far more satisfied with the overall results after climbing the learning curve and working through the pain points of a switch to the proactive process.
