In today’s data-driven world, every decision a company makes is dependent on data. And because organizations need fast access to data they can trust, data quality has become paramount to the success of any business. To transform information into valuable business insights, all lines of business within organizations require the timely delivery of high-quality, accurate information.
An organization’s data supply chain is responsible for the movement and processing of data throughout its entire lifecycle. The data supply chain collects, transforms, distributes, stores and audits information as part of an enterprise data management strategy. However, challenges in managing data quality often result in unanticipated but substantial problems.
For example, if an extra digit is added to the amount of an account transfer, a company may lose or gain tens of thousands of dollars. If a customer’s account is wrongly debited, it can ruin the customer experience. And if there are errors in the reconciliation of a corporate financial statement, a company may face significant regulatory fines. For these examples and the many others out there, businesses need automated, appropriate oversight into their data supply chain.
Shining a Light on the Data Environment
Without a clear view of an organization’s data landscape, errors in the data supply chain occur, go unnoticed and persist for an extended period of time. That is why all companies must take steps to protect the integrity of their data supply chains at the source.
Establishing appropriate data oversight begins with a foundation of data governance that integrates enterprise data quality. Data governance ensures all data users within an organization understand where their data came from, where it’s located, how it’s used and what it means. For example, an organization might receive third-party data including customer first and last name details. If the third party formats this information as [LastName, FirstName], but your organization defines this attribute as [FirstName LastName], how can you ensure that the data is transferred properly? If you can’t get the name of your customers correct, how will they trust that the information in their accounts is accurate? Governance establishes policies and processes for data oversight and regulatory compliance.
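As a sketch of the kind of mapping such governance metadata makes possible, the snippet below (a hypothetical helper, not any particular vendor's API) converts a third-party "LastName, FirstName" value into an internal "FirstName LastName" convention:

```python
def normalize_name(third_party_name: str) -> str:
    """Convert a third-party "LastName, FirstName" value to the
    internal "FirstName LastName" convention."""
    last, _, first = third_party_name.partition(",")
    if not first:
        # No comma found: assume the value is already in internal form.
        return third_party_name.strip()
    return f"{first.strip()} {last.strip()}"

print(normalize_name("Smith, Jane"))  # prints: Jane Smith
```

The point is not the string manipulation itself but that the rule is only writable at all once governance has documented which convention each source uses.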
Data quality capabilities ensure the data is fit for use among business users by measuring data’s accuracy, consistency and completeness, and verifying, balancing, reconciling and tracking data across systems and processes. Combined, the integration of data governance and data quality increases data value to help organizations achieve their critical business objectives and mitigates information risk to ensure positive ROI.
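One common verification step of this kind is balancing record counts and monetary totals between a source system and its downstream target. A minimal sketch, with illustrative record shapes and an assumed per-record "amount" field, might look like:

```python
def balance_check(source_records, target_records, tolerance=0.0):
    """Compare record counts and amount totals between two systems,
    returning a list of discrepancies (empty if the systems balance)."""
    src_total = sum(r["amount"] for r in source_records)
    tgt_total = sum(r["amount"] for r in target_records)
    issues = []
    if len(source_records) != len(target_records):
        issues.append(f"record count mismatch: "
                      f"{len(source_records)} vs {len(target_records)}")
    if abs(src_total - tgt_total) > tolerance:
        issues.append(f"total mismatch: {src_total} vs {tgt_total}")
    return issues

source = [{"amount": 100.0}, {"amount": 250.0}]
target = [{"amount": 100.0}, {"amount": 2500.0}]  # extra digit added in transit
print(balance_check(source, target))  # prints: ['total mismatch: 350.0 vs 2600.0']
```

A check like this would catch the extra-digit transfer error described earlier before the erroneous total ever reached downstream reporting.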
Still, companies can only implement risk mitigation measures if they have fully mapped their critical information supply chains and defined business rules that allow them to determine when information is bad.
Minimizing Data Quality Risk
Data governance with integrated data quality is a critical tool to help define and implement risk mitigation measures. Information is continuously in motion, traveling from source systems through downstream processes and transforming along the way, which puts its integrity at risk. Some data quality issues are readily apparent, like missing files, but some integrity challenges are harder to detect. Data governance with integrated data quality tracks data’s source, lineage, usage and transformations, while identifying potential quality issues for remediation and improvement.
Data supply chains are also represented by discrete steps both within systems and in the exchanges between systems and third-party data sources. Organizations must, therefore, implement appropriate rules at defined critical information analysis points to ensure the end-to-end integrity of the entire data supply chain. Incomplete coverage of control points can leave an organization exposed to inaccurate or incomplete information.
Businesses must also validate information across multiple quality dimensions. Multi-dimensional validation means applying rules at each validation point to ensure that information is accurate, complete, reliable and timely. All of these attributes are crucial for achieving true enterprise data quality.
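The idea of applying several dimension-specific rules at a single validation point can be sketched as follows. The field names, thresholds and 24-hour timeliness window are illustrative assumptions, not prescribed values:

```python
from datetime import datetime, timedelta, timezone

def validate_record(record, now=None):
    """Apply completeness, accuracy and timeliness rules to one record.
    Returns the list of quality dimensions that failed."""
    now = now or datetime.now(timezone.utc)
    failures = []
    # Completeness: required fields must be present and non-empty.
    if not all(record.get(f) for f in ("account_id", "amount", "timestamp")):
        failures.append("completeness")
    # Accuracy: amount must be a positive number within a plausible range.
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or not (0 < amount < 1_000_000):
        failures.append("accuracy")
    # Timeliness: record must have arrived within the last 24 hours.
    ts = record.get("timestamp")
    if ts is None or now - ts > timedelta(hours=24):
        failures.append("timeliness")
    return failures
```

In practice each dimension would carry many rules drawn from the governance catalog; the structure of "one record in, a list of failed dimensions out" is what makes the results reportable at every validation point.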
Real-Time Data Error Notification
Additionally, companies need real-time data error notification when a problem is detected at any validation point. When a data owner notices a problem, they need to be empowered to immediately stop processes and track remediation of any errors that occur. Alternatively, an automated workflow can identify data glitches and route such errors to the right team for prompt review.
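A minimal sketch of such a halt-and-notify workflow is shown below. The notify function is a stand-in for whatever alerting channel an organization actually uses (email, pager, ticketing system), and the owner address is hypothetical:

```python
class DataQualityError(Exception):
    """Raised to halt processing when a record fails validation."""

def notify(owner, message):
    # Stand-in for a real alerting channel (email, pager, ticket system).
    print(f"ALERT to {owner}: {message}")

def run_validation_point(records, rule, owner="data-owner@example.com"):
    """Check every record against a rule; on the first failure,
    alert the data owner and stop the process."""
    for i, record in enumerate(records):
        if not rule(record):
            notify(owner, f"record {i} failed validation: {record}")
            raise DataQualityError(f"processing halted at record {i}")
    return records
```

For example, `run_validation_point([{"amount": 10}, {"amount": -3}], lambda r: r["amount"] > 0)` alerts on the second record and raises, so the bad value never propagates downstream.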
Finally, organizations must implement information validation independent of the applications and systems that execute any data supply chain operations. Information validation is a fundamental requirement of any auditing structure. At the same time, rules across all applications and systems should be stored on a common platform accessible to all business users. As a result, all data users have end-to-end visibility into the diverse information supply chains across the enterprise and can streamline data quality risk management exercises.
If an organization’s data isn’t trustworthy, its business decisions won’t be either. Businesses must assess their own information integrity risk profile and take immediate steps to eliminate any vulnerabilities. A data quality risk assessment requires a careful look at your data landscape, including key sources, operational processes and systems, in order to identify potential threats to data quality, perform root cause analysis and shape an effective data quality strategy. When risk mitigation, data quality monitoring and improvement are conducted within an enterprise data governance strategy, they build user trust in data quality and ensure that organizations get meaningful business intelligence from their data, limiting exposure and driving positive ROI.