Across every industry, the volume of operational information about products, customers and suppliers is growing exponentially in both size and complexity. New products and services are launched every day, driving constant change in the IT landscape.
Business processes are distributed throughout the organization, spanning different applications and technologies. Customers are serviced by different business units and product lines, with redundant and often inconsistent customer data entered in order to process each transaction.
Data quality issues are one of the most common reasons why investments in large transformational projects, such as CRM, fail. Moreover, such an IT architecture does not enable a company to discover associations and relationships between customers, leading to inaccurate reporting and internal disagreements over which data are appropriate and correct.
1. Consolidation
- Ability to create and maintain a unique, complete and accurate customer master across the enterprise
- Detection and capture of all updates into the central master data, mapping to operational data sources, and merging of source records based on business rules
2. Management – Data Cleansing & Enrichment
- Automated and manual (quarantine) data matching and merging
- Data enrichment with external information sources
- Manual and bulk corrections (data steward, data governance)
3. Synchronization
- Real-time data quality control in front-end and back-office applications
- Distribution of customer information to all operational applications in real time
- Integration through ESB, SOA, XML over HTTP, db links, ODBC, JDBC, etc.
- Source data history: includes the capability to roll back the system to a prior point in time
4. Sharing
- Leverage consolidated and consistent customer data to improve operational systems and gain better customer insight through BI and analytical applications
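The consolidation step listed above can be illustrated with a minimal sketch: source records are normalized, matched on a key, and duplicates are merged by a survivorship rule. All class, field and rule names here are hypothetical, not part of any actual MDM product.

```java
import java.util.*;

// Sketch of consolidation: normalize source records, match them on a
// simple key, and merge duplicates with a "most recently updated record
// wins" survivorship rule. Names and rules are illustrative only.
public class CustomerConsolidator {

    public static class SourceRecord {
        public final String name;
        public final String email;
        public final long updatedAt; // epoch millis of last update in the source system

        public SourceRecord(String name, String email, long updatedAt) {
            this.name = name;
            this.email = email;
            this.updatedAt = updatedAt;
        }
    }

    // Normalization makes matching tolerant of case and whitespace differences.
    public static String matchKey(SourceRecord r) {
        return r.email.trim().toLowerCase(Locale.ROOT);
    }

    // Merge all source records sharing a match key; the newest record survives
    // (a simple business rule standing in for real survivorship logic).
    public static Map<String, SourceRecord> consolidate(List<SourceRecord> records) {
        Map<String, SourceRecord> master = new HashMap<>();
        for (SourceRecord r : records) {
            master.merge(matchKey(r), r,
                    (a, b) -> a.updatedAt >= b.updatedAt ? a : b);
        }
        return master;
    }
}
```

A real implementation would match on several attributes and apply per-field survivorship, but the shape of the step, normalize, match, merge by rule, is the same.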
Master Data Management (MDM)
Nearly every enterprise has attempted to manage master data within its data warehouse architecture, but it has typically focused on mastering data after transactions occur. This approach does little to improve data quality because data are corrected after the fact. The best way to improve data quality is to move the process upstream of the data warehouse, before transactions are executed. MDM manages the uniqueness and quality of customer data, ensuring that a customer is created only once and made available to any system of record whenever a transaction occurs. It guarantees the integrity of the most critical data within every transactional and data capture system in the organization.
The same applies to any master data that are critical to your organization’s success and are usually created, maintained, and shared among many applications. In commercial enterprises, these are data about people, companies, products, locations, accounts, portfolios, etc.
By defining an MDM strategy and plugging MDM technology into existing applications, you can build the one “true view” that then feeds consistent, accurate and reliable data back into these systems. It serves as the real-time single version of the truth for a specific kind of master data.
A properly implemented MDM service should be based on the principles of service-oriented architecture (SOA): a service that enforces the business rules and policies governing master data. By rethinking the data warehouse data chain, taking transactional details from the Enterprise Service Bus (ESB) and letting transactions flow between systems as “messages”, the storage and movement of data are streamlined. Master data validation should be built into all front-office applications, meaning that every customer creation or modification transaction passes through the MDM validation process.
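The validation step just described, every customer-create message checked against the master before it reaches downstream systems, can be sketched as a simple filter. The class name, message shape and policy here are assumptions for illustration, not a real ESB or MDM API.

```java
import java.util.*;

// Sketch of "built-in" MDM validation in the message flow: each
// customer-create message is checked against the master before being
// forwarded. The in-memory set stands in for the MDM master store.
public class MdmValidationFilter {

    private final Set<String> masterEmails = new HashSet<>();

    // Returns true and registers the customer if the message passes validation;
    // returns false (reject or quarantine) if the customer already exists
    // or the mandatory attribute is missing.
    public boolean onCustomerCreate(Map<String, String> message) {
        String email = message.getOrDefault("email", "").trim().toLowerCase(Locale.ROOT);
        if (email.isEmpty()) {
            return false;               // illustrative policy: e-mail is mandatory
        }
        return masterEmails.add(email); // add() returns false on a duplicate
    }
}
```

In a production deployment the check would run against the MDM database and rejected messages would be routed to a quarantine queue for a data steward, but the control point in the message flow is the same.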
MDM service dramatically improves data quality
If each transaction is complete and accurate before it enters the data warehouse, the work required to integrate it is greatly reduced.
This type of service delivers significant business quality improvements because it provides data to all IT systems that consume or refer to master data.
The ultimate result is that scores of problem areas and risks are eliminated, costs and complexities are reduced, and there are fewer errors. For example, when creating a sales order, the transactional system should guarantee that customer data on the order is not duplicated and that it is correct and current. When one employee updates a piece of information in the database, for example a new address for a customer, operational MDM ensures that this address is updated throughout the system so that all departments have the correct information. The same applies to the product data, prices, discounts, and payments on the order. The system should enforce business rules and policies, and guarantee that the transaction is correct.
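The address-update example above follows a publish-subscribe shape: the master record changes once, and every subscribed operational application is notified. The sketch below uses hypothetical interface and method names to show that shape; it is not a description of any specific MDM product.

```java
import java.util.*;

// Sketch of real-time propagation: when a customer's address changes in
// the master, every registered operational application is notified so all
// departments see the same value. Interface and names are illustrative.
public class AddressPropagator {

    public interface OperationalApp {
        void onAddressChanged(String customerId, String newAddress);
    }

    private final Map<String, String> masterAddresses = new HashMap<>();
    private final List<OperationalApp> subscribers = new ArrayList<>();

    public void subscribe(OperationalApp app) {
        subscribers.add(app);
    }

    // Update the master record, then fan the change out to all subscribers.
    public void updateAddress(String customerId, String newAddress) {
        masterAddresses.put(customerId, newAddress);
        for (OperationalApp app : subscribers) {
            app.onAddressChanged(customerId, newAddress);
        }
    }

    public String currentAddress(String customerId) {
        return masterAddresses.get(customerId);
    }
}
```

In the architecture described earlier, the fan-out would travel over the ESB as messages rather than in-process callbacks, but the guarantee is the same: one update, consistently visible everywhere.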
Customer MDM
Its most important objective is to ensure a single view of customer data, as a prerequisite for improving revenue, extending the product portfolio, improving sales effectiveness, reducing operating costs and increasing customer satisfaction. A Customer MDM solution aims to provide processes for collecting, aggregating, matching, consolidating, quality-assuring, preserving and distributing customer-related data throughout an organization, to ensure consistency and control in the ongoing maintenance and use of this information in the relevant customer-facing applications.
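The matching step in that pipeline, with its quarantine path for uncertain pairs noted earlier, can be sketched with a simple token-overlap similarity measure. The measure (Jaccard similarity over name tokens) and the thresholds are assumptions chosen for illustration; real matchers use richer comparisons across several attributes.

```java
import java.util.*;

// Sketch of customer matching: token-based Jaccard similarity between two
// names, with thresholds routing a pair to auto-merge, manual review
// (quarantine), or no-match. Thresholds here are illustrative.
public class NameMatcher {

    public static double similarity(String a, String b) {
        Set<String> ta = tokens(a);
        Set<String> tb = tokens(b);
        if (ta.isEmpty() && tb.isEmpty()) return 1.0;
        Set<String> union = new HashSet<>(ta);
        union.addAll(tb);
        Set<String> inter = new HashSet<>(ta);
        inter.retainAll(tb);
        return (double) inter.size() / union.size();
    }

    private static Set<String> tokens(String s) {
        Set<String> out = new HashSet<>();
        for (String t : s.toLowerCase(Locale.ROOT).split("\\W+")) {
            if (!t.isEmpty()) out.add(t);
        }
        return out;
    }

    // Route a candidate pair: high similarity merges automatically,
    // middling similarity goes to a data steward, the rest stay distinct.
    public static String route(String a, String b) {
        double s = similarity(a, b);
        if (s >= 0.8) return "MERGE";
        if (s >= 0.5) return "QUARANTINE";
        return "DISTINCT";
    }
}
```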
The business benefits of implementing such a solution range from better customer service, efficient and personalized loyalty and marketing campaigns, and targeted customer insights that drive enhanced sales opportunities, to more accurate product and financial forecasts.
Multicom’s 1CustomerView MDM solution is based on J2EE and Oracle Database technology. The design of the framework enables the unification of different business contents, coming from different data sources, with a unified view, configurable (generic) processes and web (GUI) elements.