While integration is designed to move data throughout the enterprise quickly and efficiently, it is not designed to fix bad data. Inevitably, if the data is less than superior, integration becomes complicit in the distribution of bad data.

Enterprise ERP systems, applications, analytics and integration solutions have one major thing in common: superior data maximizes the efficiency and effectiveness of all of them.

The good news is that integration will simplify and streamline the IT enterprise. Applications integrated through a common enterprise service bus (ESB) instead of point-to-point connections will provide faster and more unified delivery and increase productivity. Integration will reduce time-to-market for new development projects, enable enterprise mobile apps and significantly improve partner relations.

But to realize these tremendous benefits corporations must address the critical need for good and consistent data quality. Poor or mediocre data quality will undermine and compromise many of the advantages integration provides.

There are many types of bad data with different possible outcomes:

  • Duplicate Data, or two or more identical records. This can result in the misrepresentation of inventory counts, duplication of marketing collateral or unnecessary billing.
  • Conflicting Data, or the same records with differing attributes. A company name with different versions of addresses can cause deliveries to the wrong location or customers to receive the wrong products.
  • Incomplete Data, or missing attributes. This can cause payroll not to be processed because of a missing Social Security number, or mean that you are unable to identify a loyal customer.
  • Invalid Data, or attributes not conforming to standardization. This can be as simple as a customer’s ZIP code containing four digits, instead of the requisite five.

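The four categories above can be illustrated with a small sketch. The record layout, field names and the `audit` helper are hypothetical examples for this post, not part of any particular MDM or integration product:

```python
import re

# Hypothetical customer records; the field names are illustrative only.
records = [
    {"id": 1, "name": "Acme Corp", "zip": "30305"},
    {"id": 2, "name": "Acme Corp", "zip": "30305"},  # duplicate of record 1
    {"id": 3, "name": "Acme Corp", "zip": "30309"},  # conflicting address
    {"id": 4, "name": "Globex",    "zip": None},     # incomplete: ZIP missing
    {"id": 5, "name": "Initech",   "zip": "1234"},   # invalid: four-digit ZIP
]

def audit(records):
    """Flag the four classes of bad data: duplicate, conflicting,
    incomplete, and invalid records."""
    issues = []
    seen = set()          # (name, zip) pairs already encountered
    zips_by_name = {}     # name -> set of ZIPs seen for that name
    for r in records:
        key = (r["name"], r["zip"])
        if key in seen:                      # identical record seen before
            issues.append(("duplicate", r["id"]))
        seen.add(key)
        if r["zip"] is None:                 # required attribute is missing
            issues.append(("incomplete", r["id"]))
        elif not re.fullmatch(r"\d{5}", r["zip"]):  # not a five-digit ZIP
            issues.append(("invalid", r["id"]))
        if r["name"] in zips_by_name and r["zip"] not in zips_by_name[r["name"]]:
            issues.append(("conflicting", r["id"]))  # same name, new address
        zips_by_name.setdefault(r["name"], set()).add(r["zip"])
    return issues

for issue_type, record_id in audit(records):
    print(f"record {record_id}: {issue_type}")
```

Each record above trips exactly one check, so the audit reports one issue per bad record. A real MDM platform applies far richer matching and standardization rules, but the failure modes it guards against are these same four.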
The business downside of inconsistent and inaccurate data is well documented. Sending an invoice with incorrect data, for example, can cause delays, frustration and even loss of revenue.

Integration solutions and strategies are designed to improve enterprise processes. Many companies think of integration first as the driver for fulfilling the goals of business optimization efforts: in other words, process-driven integration.

But, in fact, an ESB, and for that matter Business Process Management (BPM), are themselves processes. While considered foundational to the success of these same major enterprise initiatives, they are subject to the same degrading effects of inferior data quality.

Here master data management (MDM) plays a critical role in maximizing these processes, just as it does by consolidating disparate data from multiple enterprise applications.

MDM reduces errors in designs produced by a BPM system when the highest-quality data is available. Reusable services become more efficient by consuming accurate and complete data. Even the efficiency of an API management platform, whose purpose is to unlock the unique value of a company’s data and services, is susceptible to inadequate data quality.

ESBs, BPM systems and API platforms can maximize their enterprise roles by accelerating the delivery of mission-critical applications empowered by good and consistent master data, reference data and their associated hierarchies and relationships. This is where MDM comes in; companies are increasingly viewing MDM’s trusted data as a baseline requirement for any successful process improvement initiative. MDM provides superior data so your business no longer has a complex about it.

