
With the growing reliance on big data, the demands on IT systems and on the data itself are becoming ever more complex. In the era of cloud computing and dashboard culture, the integrity and quality of data repositories are increasingly important and form the basis for up-to-date reporting. Not only does the volume of data to be processed grow constantly (data volumes double every two years), but reporting cycles are also becoming more frequent.

Only with well-established, correctly functioning data quality management can unstructured data be transformed into meaningful insights and competitive advantage. Well-maintained data repositories are not only the basis for institutional customer reporting (e.g. Solvency, Basel III), but increasingly also for ever more comprehensive regulatory reporting (AIFMD, BCBS 239).

Data quality does not get the attention it deserves

In recent years, many public health insurance organisations and asset custodians have committed extensive resources to the development of central data warehouses (DWHs) and have increased investment in the surrounding systems and applications landscape. However, data quality (DQ) still does not receive the attention it deserves. An IT system can be up to date and well set up, but that alone does nothing to eliminate data deficiencies. The individual components and data flows generally come from a variety of source systems that all too rarely share a single source or are compatible with one another.

In investment compliance, for example, erroneous holdings, market and master data regularly lead to large numbers of false alerts, such as alleged threshold violations that subsequently turn out to be unfounded. This not only costs companies time and resources, but also damages the division’s acceptance and reputation.
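The following minimal sketch in Python illustrates the mechanism with entirely hypothetical positions, prices and a hypothetical single-issuer limit; it is not a real compliance engine. The same compliant portfolio appears to breach a limit as soon as a single erroneous price from the market-data feed enters the calculation.

```python
# Illustrative sketch (hypothetical portfolio and limit): a single bad price
# from the market-data feed turns a compliant position into an apparent breach.

from dataclasses import dataclass

@dataclass
class Position:
    isin: str
    quantity: float
    price: float  # price as delivered by the upstream market-data feed

ISSUER_LIMIT = 0.45  # assumed single-issuer limit, used only for this example

def check_limits(positions):
    """Flag every position whose share of total portfolio value exceeds the limit."""
    total = sum(p.quantity * p.price for p in positions)
    return [p.isin for p in positions if p.quantity * p.price / total > ISSUER_LIMIT]

correct = [Position("DE000A", 1000, 30.0),
           Position("FR000B", 1000, 40.0),
           Position("US000C", 1000, 30.0)]

# Same holdings, but the feed delivers a stale/incorrect price for US000C.
erroneous = [Position("DE000A", 1000, 30.0),
             Position("FR000B", 1000, 40.0),
             Position("US000C", 1000, 60.0)]

print(check_limits(correct))    # []          -> portfolio is compliant
print(check_limits(erroneous))  # ['US000C']  -> false alert caused by bad data
```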

Design and establishment of a DQ control process

A uniform definition of DQ criteria and standards, especially for risk reporting, together with the creation of an integrated data basis, is therefore essential for functional reporting. This includes building and enforcing a DQ control process for measuring, analysing and correcting DQ problems, including the assignment of responsibilities. A control plan that measures data quality at defined points along the data processing chains helps to ensure consistently high quality. Consolidating this information into actionable insights and integrating them into existing business processes creates lasting added value.
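As a rough illustration of one such control point, the sketch below (assuming pandas and purely hypothetical rule names, data and responsible owners) measures a handful of DQ rules against a holdings extract at a single stage of the processing chain and summarises violations and pass rates per rule and owner.

```python
# Minimal sketch of a DQ control point; rule names, thresholds and owners are
# hypothetical. A real DQ control process would persist these results and
# route violations to the responsible data owners.

import pandas as pd

# Each rule: (name, responsible owner, check returning a boolean mask of bad rows)
DQ_RULES = [
    ("price_not_null",    "market data team", lambda df: df["price"].isna()),
    ("price_positive",    "market data team", lambda df: df["price"] <= 0),
    ("isin_format",       "master data team", lambda df: ~df["isin"].str.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}\d")),
    ("quantity_not_null", "custody team",     lambda df: df["quantity"].isna()),
]

def run_dq_control_point(df: pd.DataFrame, stage: str) -> pd.DataFrame:
    """Measure data quality at one point of the processing chain, one row per rule."""
    results = []
    for name, owner, check in DQ_RULES:
        bad = check(df)
        results.append({
            "stage": stage,
            "rule": name,
            "owner": owner,
            "violations": int(bad.sum()),
            "pass_rate": 1 - bad.mean(),
        })
    return pd.DataFrame(results)

# Illustrative holdings extract as it might arrive from a source system.
holdings = pd.DataFrame({
    "isin":     ["DE000BASF111", "XX123", "FR0000120271"],
    "quantity": [1000, None, 500],
    "price":    [48.2, 0.0, 57.9],
})

print(run_dq_control_point(holdings, stage="post-load"))
```

In practice, such measurements would be taken at several points along the chain, trended over time and fed back to the assigned data owners as part of the correction loop.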

At KPMG, the analysis and evaluation of financial statement and business data has been part of our core business for over 100 years. We are not only able to process data efficiently, but we can also consolidate information into actionable insights and integrate these into existing business operations.

Take advantage of our wealth of experience and feel free to contact us.