In the world of automated data governance, tools tend to be classified as either reactive or proactive. Using a data quality engine, for example, to cleanse a database replete with inaccuracies and duplicate records is reactive. Using MDM dashboard metrics to identify the nature of data inadequacies in order to modify end-user behavior is proactive. MDM’s workflow provides something of a hybrid approach. Workflow is a rules-driven tool triggered by data changes made within the MDM hub; at the same time, it provides proactive orchestration of governance routings, assigning changes to data stewards who validate them and help propagate standards.
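To make that hybrid idea concrete, here is a minimal sketch in Python. It assumes a hypothetical hub that fires a callback on each change; none of these names come from webMethods or any real MDM API.

```python
# A toy rules-driven workflow trigger (hypothetical names, not a product API):
# a change in the hub fires a rule, and matching changes are routed to a
# data steward's queue for validation before they propagate.

from dataclasses import dataclass, field

@dataclass
class Change:
    entity: str        # e.g. "customer", "product"
    attribute: str     # which field changed
    old_value: str
    new_value: str

@dataclass
class StewardQueue:
    tasks: list = field(default_factory=list)

    def route(self, change: Change, reason: str) -> None:
        # Proactive side: a human steward validates before propagation.
        self.tasks.append((change, reason))

def on_hub_change(change: Change, queue: StewardQueue) -> None:
    """Reactive side: fires on every data change made in the MDM hub."""
    if change.entity == "customer" and change.attribute == "legal_name":
        queue.route(change, "Name changes require steward approval")
    # ...other governance rules would follow the same pattern.

queue = StewardQueue()
on_hub_change(Change("customer", "legal_name", "Acme Inc", "Acme LLC"), queue)
print(queue.tasks)  # one task awaiting steward validation
```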

There is, however, an emerging viewpoint advocating that the best solution for effective, proactive data governance resides chiefly with the end users and business stakeholders: the people creating, maintaining, and entering data through their assigned user interfaces and APIs.

I hosted a Software AG luncheon this past September at FIMA in Toronto. During a roundtable discussion about data management tools (and in a response directed at master data management), a Chief Data Officer at a major bank stated that the best path to good, consistent data is a strategy supported by a federated systems environment. Federation, he maintained, provides systematic commonality and standardized access to sharable data models and attributes. By providing an accurate, consolidated, and accessible view of relevant business data, it would place the burden of maintaining data quality squarely on the shoulders of the application users. Under this plan, we all become data stewards, and MDM tools eventually lose their relevance. Admittedly, a sobering thought for a long-time MDM marketing person.

Taking a breath: good data creation should be everyone’s concern, and we should all adhere to enforceable standards. It’s not really a decision an employee has to make; it’s inherently part of the job.

All this said, perhaps transactional retail banking is a better fit for this proactive, federated data governance approach than some other use cases. There are, however, specialized and complex data management decisions that require an authoritative consensus and additional sets of eyes. Managing the life cycle of product master data, for example, which usually depends on a highly controlled and centralized data management environment, comes to mind. Put another way, some data management decisions are tied to processes whose complexity is far greater than routine data quality fixes, such as removing duplicates or verifying addresses.
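For contrast, the "simple" end of that spectrum looks roughly like the sketch below: a toy deduplication pass over records with hypothetical fields. Real data quality engines use survivorship rules, fuzzy matching, and reference data, but the shape of the problem is this small.

```python
# A toy reactive fix: collapse duplicates by a normalized matching key.
# Field names ("name", "phone") are hypothetical, for illustration only.

def normalize(record: dict) -> tuple:
    # Crude matching key: lowercased name plus digits-only phone number.
    name = record["name"].strip().lower()
    phone = "".join(ch for ch in record["phone"] if ch.isdigit())
    return (name, phone)

def dedupe(records: list[dict]) -> list[dict]:
    seen = {}
    for rec in records:
        # Keep the first record seen for each key; a real engine would
        # merge attributes according to survivorship rules.
        seen.setdefault(normalize(rec), rec)
    return list(seen.values())

rows = [
    {"name": "Jane Doe", "phone": "(416) 555-0100"},
    {"name": "jane doe ", "phone": "416-555-0100"},  # duplicate
]
print(dedupe(rows))  # one surviving record
```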

And as it happens, supporting process integrity while tying it to good, consistent data is a major strength of master data management.

I was reminded of a webMethods OneData demo that provides a relatively straightforward example of how an MDM workflow supports both data quality and a critical enterprise process. In the scenario, HR is asked to validate both the process and the data quality when employees (and their records) are relocated from one sales organization and merged into another. In this case, the process is responding to a business decision, but MDM’s workflow proactively validates the necessary data change.
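A rough sketch of that scenario as I understand it (the code and names below are mine, not the product’s): the org change is staged as a workflow task, and nothing is committed until HR approves.

```python
# Hypothetical relocation workflow: moving an employee record between sales
# organizations raises a task that HR must validate before the change commits.

PENDING, APPROVED = "pending", "approved"

def relocate(employee: dict, target_org: str, hr_inbox: list) -> dict:
    """Stage the org change; commit only after HR validates it."""
    task = {
        "employee": employee,
        "from_org": employee["org"],
        "to_org": target_org,
        "status": PENDING,
    }
    hr_inbox.append(task)  # proactive: routed for human validation
    return task

def hr_approve(task: dict) -> None:
    # HR validates both the process and the data before it propagates.
    task["employee"]["org"] = task["to_org"]
    task["status"] = APPROVED

inbox = []
task = relocate({"id": 42, "name": "J. Doe", "org": "Sales-East"},
                "Sales-West", inbox)
hr_approve(inbox[0])
print(task["employee"]["org"], task["status"])  # Sales-West approved
```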

I feel better already.
