ERP Silo

Back in the 1960s, '70s and '80s, the mainframe was king. For those users not lucky enough to have a console enabling data entry directly to magnetic tape, good integration meant keeping your box of punch cards in exactly the right processing sequence to avoid a print-out ending with the word ABORTED. The big metrics were CPU utilization and development time for new, homegrown programs. Data quality was strictly an end-user issue, at best a painful afterthought. The concept of master data management or SVOT (Single Version of Truth) was a gleam in the eye of some futuristic data manager.

In the '80s, the introduction of powerful but more agile minicomputers created a groundswell of innovation in unprecedented, best-in-breed applications. But the more mainframes and minicomputers (not to mention manual processes) coexisted in the same system landscapes, the more enterprise computing became disjointed and siloed. The resulting “islands of information” lacked integration, synchronization and, most certainly, a holistic approach to managing data quality.

The concept of ERP (Enterprise Resource Planning, a term first used by Gartner in 1990) promised a completely integrated software package providing all necessary business applications. Additionally, the homogeneity of this approach would ensure data quality standards, minimizing duplication, redundancy and inaccurate master data.

Unfortunately, a very large number of companies that embrace ERP continue to experience poor data quality and its debilitating effects on business processes.

Why is this?

  • Even though they share data model attributes, ERP's numerous business applications (order entry, billing, marketing, etc.) present different points of view and requirements, impacting data standardization and creating their own brand of siloed computing.
  • Mergers & Acquisitions are game changers. Even when the acquiring and target companies share the same ERP vendor, differences exist in approaches to data management.
  • Large companies continue to invest in non-ERP solutions, such as Salesforce.com.
  • Big data, the Internet-of-Things and the growing number of divergent data sources will conspire against an organic, homogeneous computing environment.

If nothing else, consider the emphasis ERP vendors (such as SAP and Oracle) have placed on providing their own data management modules and products.

MDM, CRM and ERP Integration

One of the biggest Master Data Management use cases is CRM and ERP integration, which enables the MDM solution to govern and synchronize disparate customer data between enterprise systems. For example, Software AG’s OneData MDM platform provides data governance through Salesforce’s user interface and mobile devices, allowing data quality governance across multiple systems in real time or in batch mode.

For over two years, webMethods OneData has provided MDM-driven data integration between Salesforce.com and SAP, as in this demo scenario (a simplified sketch follows the steps):

  1. A new record created in Salesforce.com triggers OneData's workflow/governance process.
  2. An automated approval process validates the new customer and either sends it directly to SAP or assigns the record to OneData's internal workflow for manual data quality review.
  3. Once approved, the record is sent to SAP in real time.
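To make the routing logic behind these steps concrete, here is a minimal Python sketch of the validate-then-route pattern. Everything in it (the record fields, the checks, and the send_to_sap and queue_for_manual_review helpers) is a hypothetical placeholder, not the OneData, Salesforce or SAP API; a real implementation would use each platform's own connectors.

```python
# Hypothetical sketch of the governance flow described above.
# No function or field here comes from OneData's actual API; they are
# placeholders illustrating the validate-then-route pattern.

from dataclasses import dataclass, field
from typing import List


@dataclass
class CustomerRecord:
    """Simplified customer master record as it might arrive from Salesforce."""
    account_id: str
    name: str
    country: str
    duns_number: str = ""          # external identifier used for duplicate checks
    issues: List[str] = field(default_factory=list)


def validate(record: CustomerRecord) -> bool:
    """Automated approval step: basic completeness and standardization checks."""
    if not record.name.strip():
        record.issues.append("missing customer name")
    if len(record.country) != 2:
        record.issues.append("country must be an ISO 3166-1 alpha-2 code")
    if not record.duns_number:
        record.issues.append("no DUNS number for duplicate matching")
    return not record.issues


def send_to_sap(record: CustomerRecord) -> None:
    """Placeholder for the real-time push to SAP."""
    print(f"[SAP] creating business partner for {record.name} ({record.account_id})")


def queue_for_manual_review(record: CustomerRecord) -> None:
    """Placeholder for assigning the record to a data steward's work queue."""
    print(f"[REVIEW] {record.account_id} held: {', '.join(record.issues)}")


def on_salesforce_create(record: CustomerRecord) -> None:
    """Entry point triggered when a new record is created in Salesforce (step 1)."""
    if validate(record):
        send_to_sap(record)              # step 3: approved records flow straight to SAP
    else:
        queue_for_manual_review(record)  # step 2: otherwise route to manual review


if __name__ == "__main__":
    on_salesforce_create(CustomerRecord("001A", "Acme GmbH", "DE", "123456789"))
    on_salesforce_create(CustomerRecord("001B", "", "Germany"))
```

The point of the sketch is simply that the governance decision (automatic approval versus manual stewardship) sits in the MDM layer, between the system where the record is created and the system of record it is ultimately synchronized to.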

This hybrid-cloud data governance scenario highlights a critical ERP domain. But given OneData’s multi-domain capabilities, the customer is only one of several party domains (suppliers, vendors, employees) that, along with product and location data, can be aligned and managed in support of real-world business requirements.

The creation of good, consistent master data, reference data and hierarchies maximizes the value of business processes - and of course, ERP.
