Data Quality Management Framework

From Open Risk Manual

Definition

A Data Quality Management Framework is a formal framework (a set of policies and procedures) that supports organizational efforts to meet internal or external Data Quality objectives.

Data quality management is particularly important for regulated institutions, where significant and/or sensitive decisions are made on the basis of data.

ECB TRIM Requirements

Institutions must have in place a process for vetting data inputs into any regulated Risk Model[1]. This must include an assessment of the accuracy, completeness and appropriateness of the data. To comply with this requirement, and to ensure the quality of the data used for credit risk measurement and management processes, institutions should establish and implement an effective data quality management framework that is formalised in a set of policies and procedures.

Scope

This framework should be applicable to all data used in IRB-related processes, i.e. internal data, external data and pooled data, if any. In addition, it should ensure that reliable risk information is available to enable an institution’s risk profile to be assessed accurately and drive sound decision-making within the institution and by external stakeholders, including competent authorities.

Components

A data quality management framework is effective when it encompasses the following components:

  1. Sound underlying governance principles, including the allocation of roles and responsibilities for the management of data quality, to ensure in particular the independence of Data Quality management activities from Data Processing activities, and the active steering of data quality;
  2. A description of the scope in terms of risk data coverage;
  3. Data quality standards covering all relevant data quality dimensions, i.e. completeness, accuracy, consistency, timeliness, uniqueness, validity, availability and traceability;
  4. Consistent criteria and a systematic metrics approach to assess compliance with data quality standards, supported by sufficient data quality controls along the entire IRB data chain;
  5. Procedures for constantly assessing and improving the quality of data;
  6. Reporting procedures on data quality that allow for a sufficient understanding of the quality of the data supporting the IRB models.
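The dimensions and metrics mentioned in points 3 and 4 above can be made concrete as simple per-field measurements over a data set. A minimal sketch in Python, assuming loan records held as dictionaries; the field names, business rule and metric definitions below are illustrative assumptions, not definitions prescribed by the ECB guide:

```python
# Illustrative data quality metrics for three of the dimensions listed above.
# The metric definitions and the sample data are assumptions for demonstration.

def completeness(records, field):
    """Share of records in which the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def uniqueness(records, field):
    """Share of distinct values among the populated entries of the field."""
    values = [r[field] for r in records if r.get(field) not in (None, "")]
    return len(set(values)) / len(values) if values else 1.0

def validity(records, field, rule):
    """Share of populated entries satisfying a business rule (a predicate)."""
    values = [r[field] for r in records if r.get(field) not in (None, "")]
    return sum(1 for v in values if rule(v)) / len(values) if values else 1.0

# Hypothetical loan records with deliberate quality defects.
loans = [
    {"id": "L1", "exposure": 1000.0},
    {"id": "L2", "exposure": -50.0},   # negative exposure: fails the validity rule
    {"id": "L3", "exposure": None},    # missing value: fails completeness
    {"id": "L1", "exposure": 700.0},   # duplicate id: fails uniqueness
]

report = {
    "completeness(exposure)": completeness(loans, "exposure"),
    "uniqueness(id)": uniqueness(loans, "id"),
    "validity(exposure >= 0)": validity(loans, "exposure", lambda x: x >= 0),
}
```

In practice such metrics would be computed at each stage of the IRB data chain and compared against the institution's own thresholds, feeding the reporting procedures described in point 6.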

Governance Principles

The data quality management framework:

  1. Should be approved by the institution's management body or a designated committee thereof and senior management, as part of their responsibilities;
  2. Should be distributed throughout the organisation to the relevant staff;
  3. Should be periodically assessed in order to verify its adequacy, and be updated and improved whenever necessary;
  4. Should be subject to regular review by the internal audit function or another comparable independent auditing unit.

The roles of the different units, internal bodies and staff involved in the data quality management process should be defined in such a way as to ensure that the data handling process is sufficiently independent from the data quality management process.

It is good practice for institutions to have a dedicated independent unit with an overall view of and responsibility for the management of data quality. Where an independent unit is established, the size of this unit should be proportionate to the nature, size and degree of complexity of the institution’s business and organisational structure.

References

  1. ECB guide to internal models - Credit Risk, Sep 2018
