Data Quality Management & Governance

mDQ is an application solution for the continuous evaluation, monitoring and control of data consistency according to business rules and eligibility criteria. A metadata repository with a high-performance PL/SQL-based engine provides the flexibility to define very complex business rules and to establish and maintain consistent data definitions. Additionally, a data catalog can be built that tracks the DQ relevance and coverage of every database dictionary object, down to the attribute level.

Complex Business Rules

  • Can Handle VERY Complex Rules
  • SQL Based

High Performance

  • Oracle in-Database Solution
  • Using Advanced Database Features

DQ Data Governance

  • Who is Responsible?
  • Take Actions

Fine Grained Responsibility Definition

  • Business Rule Level or
  • Attribute Content Level

Business User Friendly

  • E-mails Contain Statistics and a Rule Details URL
  • Get Knowledge from Predefined Rule Reports with a Mouse Click


Features
  • Bind Variables
  • SQL Code Preview
  • Query Parallel Degree
  • Max Error Count
  • Table Scope
  • Lookup Table Scope
  • Table Groups
  • Metadata Export / Import
  • Application Security
  • Rule (de)Activation
  • Rule Threshold
  • Rule Code Templates
  • Rule Scope
  • Rule Clusters
  • Responsibilities
  • Fine Grained Access Control
  • Custom Smart Reports
  • Export Report Data to File
  • Send E-mail on Demand
  • Create Ticket on Demand

Functionality to handle complex problems

All types of rules are supported, from simple checks to the most complex horizontal multi-table controls that identify dependencies between elements in one or more fields or tables and let users drill down from a report to individual records:

  • Single-table horizontal rules
  • Single-table vertical rules
  • Multi-table horizontal rules
  • Multi-table vertical rules
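As a rough illustration, a multi-table horizontal rule can be expressed as a SQL join across tables; the schema below (CUSTOMERS, ORDERS, and their columns) is purely hypothetical and not part of mDQ:

```sql
-- Sketch of a multi-table horizontal rule on an illustrative schema:
-- flag orders whose customer is missing or inactive.
SELECT o.order_id,
       o.customer_id
FROM   orders o
       LEFT JOIN customers c
              ON c.customer_id = o.customer_id
WHERE  o.status = 'ACTIVE'
AND    ( c.customer_id IS NULL OR c.status <> 'ACTIVE' );
```

Each returned row is one violation, which is what allows drill-down from an aggregate report to the individual offending records.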

Data Governance

Detailed row-level security and role management enable the identification, quarantining, assignment, escalation and resolution of data quality issues through processes and interfaces that support collaboration with key stakeholders and establish the “owners” and “custodians” of each major subject area and data store. Fine-grained ownership and responsibility definition identifies the persons accountable for the quality of data in a given subject area, at the business rule or attribute content level (e.g. organization unit or customer type).
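Attribute-content-level responsibility can be pictured as a mapping from a rule and a scoping value to an owner; the table and column names below are illustrative assumptions, not the actual mDQ repository model:

```sql
-- Hypothetical responsibility mapping: the owner of rule 1021 for rows
-- whose ORG_UNIT attribute equals 'RETAIL'.
INSERT INTO dq_responsibility (rule_id, scope_column, scope_value, owner_email)
VALUES (1021, 'ORG_UNIT', 'RETAIL', 'retail.dq.owner@example.com');
```

With such a mapping, violations found by one rule can be routed to different accountable persons depending on the content of the offending rows.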

Full control through GUI


A business-oriented interface provides all of the above functionality and makes it available to business and technical users in a manner relevant to their role. Users can define the scope and the mandatory data elements to be evaluated, review metrics for measuring the quality of these elements, define thresholds for reporting on data quality levels, and define custom “smart” reports that deliver concise information from error data.

Operational reports

Full history logging of every variable at the moment of each DQ evaluation enables monitoring, managing, auditing and control of data quality processes. Operational audit reports give insight into DQ rule and data table statuses and trends, measure progress in achieving data quality goals, and track compliance with service level agreements that specify tolerances (thresholds) for critical data elements.
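A threshold report of this kind boils down to comparing each rule's latest error ratio against its configured tolerance; the repository tables below (dq_rule, dq_run_history) are hypothetical stand-ins for the real metadata model:

```sql
-- Illustrative SLA/threshold report over a hypothetical run-history log:
-- rules whose most recent error ratio exceeds the configured threshold.
SELECT r.rule_name,
       h.run_date,
       h.error_count / NULLIF(h.rows_checked, 0) AS error_ratio,
       r.threshold
FROM   dq_rule r
       JOIN dq_run_history h
         ON h.rule_id = r.rule_id
WHERE  h.error_count / NULLIF(h.rows_checked, 0) > r.threshold
ORDER  BY h.run_date DESC;
```

Because every run is logged, the same history table also supports trend reporting (error ratio over time per rule or per table).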


High-performance execution (running parallel workflows and processing high volumes of data) is achieved by an innovative rule orchestration mechanism and by using advanced database features (bind variables, direct-path insert, query-level parallel degree, etc.).
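The following sketch shows the kind of SQL such an engine might generate; the target and source table names are assumptions, but the hints and bind variables are standard Oracle features:

```sql
-- Direct-path insert of rule violations into a (hypothetical) error log,
-- with a statement-level parallel degree on the scanned table and bind
-- variables (:rule_id, :run_id) so the cursor is shared across rule runs.
INSERT /*+ APPEND */ INTO dq_error_log (rule_id, run_id, row_key)
SELECT /*+ PARALLEL(t, 4) */ :rule_id, :run_id, t.rowid
FROM   customer t
WHERE  t.email IS NULL;
```

Direct-path (`APPEND`) writes bypass the buffer cache for bulk loads, and bind variables avoid re-parsing the statement for every rule execution.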


mDQ can be implemented to perform data quality checks on multiple source systems on request, at data reception points, within ETL processes at regular intervals, or just before or after data is loaded into another system such as a data warehouse.

Connectivity adapters (APIs) that return an execution status make it easy for ETL tools, workflows and applications to embed data cleansing routines and to exchange rich sets of metadata between different sources.
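An ETL pre-load gate built on such a status-returning API might look like the sketch below; the `dq_api.run_rule_cluster` package and its parameters are hypothetical, not the actual mDQ interface:

```sql
-- Hypothetical PL/SQL wrapper in an ETL step: run a cluster of DQ rules
-- and abort the load if the returned status signals failures.
DECLARE
  l_status NUMBER;
BEGIN
  l_status := dq_api.run_rule_cluster(p_cluster => 'DWH_LOAD_PRECHECK');
  IF l_status <> 0 THEN
    RAISE_APPLICATION_ERROR(-20001,
      'DQ pre-load checks failed, aborting warehouse load');
  END IF;
END;
/
```

Because the check is just a callable returning a status code, the same pattern fits workflow engines and ETL tools that can invoke a stored procedure between steps.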


The analysis of data to capture statistics (metadata), together with drill-down/up functionality, provides business and IT users with analysis of and insight into the quality of data, and helps them identify and understand data quality issues (which “anomalies” represent defects and which are valid data elements, trends, etc.). Such analytics enable subject matter experts to define business rules for cleaning defects or to implement prevention procedures.