Master Data Management (MDM) is supported and enabled through analytics, which is fair enough when you consider that enterprise analytics are, in turn, enabled through MDM. Companies want to qualify their MDM process by examining master data's journey (its lineage) as it passes end-to-end through the MDM lifecycle. But they also want to quantify the value of MDM's data quality process through established metrics.
For example, in a typical data quality run performing cleansing and matching, metrics include the number of duplicates, missing data by attribute (e.g., zip codes), how often standardization is required (e.g., St. vs. Street), and basic enrichment (e.g., adding the +4 extension to a US zip code).
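The kinds of metrics listed above can be sketched in a few lines of code. The following is a minimal, hypothetical example; the record fields and rules are illustrative and not tied to any particular MDM product.

```python
# Sketch of cleansing/matching metrics over a small batch of customer records.
from collections import Counter

records = [
    {"name": "Acme Corp", "street": "100 Main St.",    "zip": "30005"},
    {"name": "ACME Corp", "street": "100 Main Street", "zip": "30005-1234"},
    {"name": "Beta LLC",  "street": "9 Oak Ave",       "zip": ""},
]

def dq_metrics(records):
    # Duplicates: extra records beyond the first in each normalized-name cluster.
    clusters = Counter(r["name"].strip().lower() for r in records)
    duplicates = sum(count - 1 for count in clusters.values())
    # Missing data by attribute: here, empty zip codes.
    missing_zip = sum(1 for r in records if not r["zip"])
    # Standardization: abbreviated street types a cleansing step would expand.
    needs_standardization = sum(1 for r in records if " st." in r["street"].lower())
    # Enrichment: 5-digit US zips still lacking the +4 extension.
    needs_plus4 = sum(1 for r in records if r["zip"] and "-" not in r["zip"])
    return {
        "duplicates": duplicates,
        "missing_zip": missing_zip,
        "needs_standardization": needs_standardization,
        "needs_plus4": needs_plus4,
    }
```

Trending these counts over successive DQ runs is one straightforward way to demonstrate the value of the data quality process over time.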
Some customers want to understand and analyze the need for, and scope of, their MDM project prior to implementation through the use of data profiling. A good data profiling tool can provide empirical evidence about the current state of enterprise data. By confirming suspected data issues and uncovering unexpected ones, it enables companies to define the scope, goals and requirements of a successful enterprise Master Data Management initiative.
In other words, data profiling evaluates and quantifies parsing and accuracy issues, duplication levels and entity completeness. Consequently, your data management challenges and data quality targets are clearly defined, providing clarity not only around the MDM tool selection process, but also substantially informing the implementation methodology.
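To make those profiling measures concrete, here is a small sketch computing per-attribute completeness, a duplication level on an assumed natural key, and a parsing check. The column names, key choice, and phone format are assumptions for illustration only.

```python
# Hypothetical profiling pass over a small data sample.
import re

rows = [
    {"customer_id": "C001", "email": "a@example.com", "phone": "404-555-0100"},
    {"customer_id": "C002", "email": "",              "phone": "4045550101"},
    {"customer_id": "C002", "email": "b@example.com", "phone": ""},
]

def profile(rows):
    n = len(rows)
    # Completeness: share of non-empty values per attribute.
    completeness = {col: sum(1 for r in rows if r[col]) / n for col in rows[0]}
    # Duplication level, assuming customer_id is the natural key.
    duplication = 1 - len({r["customer_id"] for r in rows}) / n
    # Parsing/accuracy: share of non-empty phones matching NNN-NNN-NNNN.
    phones = [r["phone"] for r in rows if r["phone"]]
    conformance = sum(bool(re.fullmatch(r"\d{3}-\d{3}-\d{4}", p)) for p in phones) / len(phones)
    return completeness, duplication, conformance
```

Numbers like these, gathered before implementation, are exactly the empirical evidence that scopes the MDM initiative and informs tool selection.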
However, a completely automated DQ process is not necessary for every MDM use case. While an automated DQ process is certainly recommended for matching voluminous customer names based on algorithms and survivorship rules, or for verifying addresses against standard postal directories, human intervention is required for the successful data governance of both product and reference data. In this case, a methodical workflow/approval process should be triggered and routed to product data specialists, allowing them to collaborate and consult on what constitutes a complete product master record.
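The automated side of that split, matching with survivorship rules, can be illustrated with a minimal sketch. This is a generic assumption about how survivorship is often implemented (a source-trust ordering), not a description of any specific product's rules.

```python
# Survivorship sketch: within a matched cluster, each attribute survives
# from the most trusted source that supplies a non-empty value.
SOURCE_RANK = {"CRM": 0, "ERP": 1, "WEB": 2}  # illustrative trust order

def build_golden_record(cluster):
    attrs = {key for record in cluster for key in record if key != "source"}
    ordered = sorted(cluster, key=lambda r: SOURCE_RANK[r["source"]])
    golden = {}
    for attr in attrs:
        for record in ordered:
            if record.get(attr):  # first trusted non-empty value wins
                golden[attr] = record[attr]
                break
    return golden

cluster = [
    {"source": "WEB", "name": "J. Smith",   "phone": "404-555-0100"},
    {"source": "CRM", "name": "John Smith", "phone": ""},
]
```

Here the CRM name outranks the web-captured one, but the phone number falls through to the web record because the CRM value is empty.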
But even with a more manually deployed workflow, companies still want to understand how well it's going. WebMethods OneData MDM provides out-of-the-box integration with MashZone for dashboarding any data governed, internally or externally, through the OneData workflow process. Data stewards can chart and analyze completed workflow activity by data object, data column, type of data change (insert/edit/delete), and data steward productivity. Additionally, MashZone functions as a visual monitoring tool, allowing data stewards to quickly resolve any problems with work in progress.
As alluded to earlier, there is also much to say about MDM's role in maximizing the accuracy of the data consumed by even the best graphical, feature-rich analytical tools. Gartner has labeled this Analytical MDM.
Gartner's definition of analytical MDM is very straightforward. Simply put, a single view of customer, product and location reference data (and their hierarchies) is just as important and necessary for corporate analytical tools as it is for operational business processes. In other words, MDM's mission includes providing a single version of the truth for business intelligence tools, data warehousing applications and Intelligent Business Operations, thereby substantially improving the analytical processes supporting critical, real-time decision making and compliance reporting.
All in all, this is just another example of MDM’s journey.