Once the data model has been created and filled with attributes and digital assets, the question arises: what is the value of this data? Did we miss anything? How do we ensure that the quality of the data is secured and stays that way? The amount of data and the number of data sources are growing rapidly, and this growth will continue in the near future. The integrated management of data quality is a huge challenge for most organisations, and failure to deliver data quality can have detrimental effects on costs and profit.
In order to guarantee the quality of the data, we must first determine how it is to be measured. Quality data is:
- complete
- correct
- consistent (one version of the truth)
- up to date
To improve data quality, the PIM processes should include appropriate quality controls: when a new product is set up, the data must be checked for completeness and correctness, and the product should only be released once the minimum requirements are met. The overall quality of the product data can then be measured continuously, ideally resulting in a data quality dashboard. Besides that, it is essential to have good governance of the responsibilities and compliance involved.
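As a minimal sketch of such a quality control, the check below validates one product record for completeness and correctness before release. The field names and rules are illustrative assumptions, not a fixed PIM schema; only the EAN-13 check-digit rule is a real standard.

```python
# Hypothetical minimal data-quality gate for a new product record.
REQUIRED_FIELDS = ["sku", "name", "ean", "email"]  # assumed required attributes

def is_valid_ean13(ean: str) -> bool:
    """Validate an EAN-13 code via its check digit (weights 1,3 alternating)."""
    if not (ean.isdigit() and len(ean) == 13):
        return False
    digits = [int(d) for d in ean]
    checksum = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits[:12]))
    return (10 - checksum % 10) % 10 == digits[12]

def check_product(record: dict) -> list:
    """Return a list of data-quality issues; an empty list means 'release'."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append("missing: " + field)
    ean = record.get("ean", "")
    if ean and not is_valid_ean13(ean):
        issues.append("invalid EAN-13 check digit")
    return issues
```

For example, a record with a valid EAN but no email address would fail the gate with a single issue, `missing: email`, so the product is held back until the minimum requirements are met.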
Build a Dashboard to manage data quality
A picture is worth a thousand words, so visualizing data quality by means of DQ dashboards is a great way to support business users. Where data profiling provides dashboards for operational support, DQ dashboards enable senior management to monitor and control data quality. A dashboard should not only reflect the status quo but also show trends and exceptional activity, and allow drilling down into proper detail. Where normal BI tools provide management information on the data itself (e.g. sales), a DQ dashboard provides similar functionality but focuses on the quality of the data (e.g. a trend line for missing email addresses or product EAN codes).

When building a dashboard, “think big but start small”, as the main strength of dashboards is visualization and simplicity. More KPIs can be added at a later stage without losing the ability to interpret the data and its quality, and without confronting users with masses of figures.