Data Quality Standards


Definition

Data Quality Standards refers to the objectives and overall scope of the Data Quality Management Framework, which are typically defined with reference to certain data quality dimensions.

Data Quality Dimensions

The following dimensions are generally highlighted as important across a range of domains, including the risk management context[1] (a minimal measurement sketch follows the list):

  • Data Traceability or Data Provenance, the history, processing and location of the data under consideration can be easily traced
  • Data Appropriateness or Suitability, the data are fit for their intended purpose
  • Data Consistency, a given set of data can be matched across different data sources
  • Data Timeliness or Punctuality, data values are up to date (a related concept is Stability)
  • Data Completeness, values are present where required
  • Data Accuracy, data are substantively error-free (related concepts are Precision and Plausibility)
  • Data Uniqueness, aggregate data are free from any duplication arising from filters or other transformations of source data
  • Data Validity, data are founded on an adequate and rigorous classification system that ensures their acceptance
  • Data Availability or Accessibility, data are made available to the relevant stakeholders


NB: The boundaries between concepts are not sharp.
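
To make these dimensions concrete, the following is a minimal sketch of how some of them can be measured on tabular data. It assumes records held in a pandas DataFrame; the column names, the sample data and the 90-day timeliness window are illustrative assumptions, not prescriptions from the referenced guide.

  import pandas as pd

  def completeness(df, column):
      # Data Completeness: share of rows where a required value is present
      return df[column].notna().mean()

  def uniqueness(df, key_columns):
      # Data Uniqueness: share of rows not duplicated on the chosen key
      return 1.0 - df.duplicated(subset=key_columns).mean()

  def validity(df, column, allowed_values):
      # Data Validity: share of values drawn from an agreed classification
      return df[column].isin(allowed_values).mean()

  def timeliness(df, timestamp_column, max_age_days, as_of):
      # Data Timeliness: share of records updated within the agreed window
      age_days = (as_of - pd.to_datetime(df[timestamp_column])).dt.days
      return (age_days <= max_age_days).mean()

  loans = pd.DataFrame({
      "loan_id": [1, 2, 2, 4],
      "rating":  ["AAA", "BB", None, "XX"],
      "updated": ["2020-11-01", "2020-11-20", "2020-11-20", "2019-01-01"],
  })
  as_of = pd.Timestamp("2020-11-27")

  print(completeness(loans, "rating"))                          # 0.75
  print(uniqueness(loans, ["loan_id"]))                         # 0.75
  print(validity(loans, "rating", {"AAA", "AA", "BBB", "BB"}))  # 0.5
  print(timeliness(loans, "updated", 90, as_of))                # 0.75

Each function returns a ratio between 0 and 1 that can be tracked over time and compared against tolerance levels, which connects directly to the indicator-based controls described below.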

Data Quality Controls

  • Data quality should be measured in an integrated and systematic way. The measurement system and the frequency of its application should be formalised.
  • Indicators, together with their corresponding tolerance levels and thresholds, should be set in order to monitor compliance with the established standards, and should be combined with visual systems (e.g. a red/amber/green traffic-light system) and dashboards for monitoring and reporting purposes (a sketch of such an indicator follows this list).
  • Indicators should be supported by effective and sufficient data quality checks and controls throughout the data life cycle, from data entry to reporting, and for both historical data and current application data. Data quality checks and controls should include reconciliation across and within systems, including between accounting and IRB data (a minimal reconciliation sketch also follows this list). An effective control framework should therefore be in place to ensure that sound controls and related procedures are implemented, especially for manual processes.
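
As an illustration of the indicator and traffic-light points above, here is a minimal sketch of a threshold-based indicator with a red/amber/green status; the Indicator class, the metric name and the 0.99/0.95 thresholds are illustrative assumptions.

  from dataclasses import dataclass

  @dataclass
  class Indicator:
      name: str
      green_at: float  # at or above this value the status is green
      amber_at: float  # at or above this value (but below green_at) it is amber

      def status(self, value: float) -> str:
          # Map a measured quality ratio to a traffic-light status
          if value >= self.green_at:
              return "green"
          if value >= self.amber_at:
              return "amber"
          return "red"

  rating_completeness = Indicator("rating completeness", green_at=0.99, amber_at=0.95)
  print(rating_completeness.status(0.97))  # amber: below target but above the alert floor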
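
In the same spirit, the following is a minimal sketch of a cross-system reconciliation check, e.g. between accounting and IRB exposure figures; the reconcile function, the keys and the 1% relative tolerance are illustrative assumptions.

  def reconcile(system_a, system_b, rel_tolerance=0.01):
      # Flag keys that are missing on one side or whose values differ
      # by more than the relative tolerance between the two systems.
      breaks = {}
      for key in system_a.keys() | system_b.keys():
          a, b = system_a.get(key), system_b.get(key)
          if a is None or b is None:
              breaks[key] = (a, b)  # present in only one system
          elif abs(a - b) > rel_tolerance * max(abs(a), abs(b)):
              breaks[key] = (a, b)  # materially different values
      return breaks

  accounting = {"loan_1": 100.0, "loan_2": 50.0}
  irb        = {"loan_1": 100.0, "loan_3": 75.0}
  print(reconcile(accounting, irb))
  # flags loan_2 (missing from IRB) and loan_3 (missing from accounting)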

References

  1. ECB guide to internal models - Credit Risk, European Central Bank, September 2018