The Risk of Poor Data Quality to an Organization

What if I told you that people in an organization spend 50-80% of their time on mundane data quality issues? Would you disagree? What if I said the Harvard Business Review conducted a study that highlighted a disturbing truth: more often than not, senior executives receive subpar reporting, which they then use to make strategic decisions? Would you believe me then? Well, the study was conducted in 2017 and it’s called “Only 3% of Companies’ Data Meets Basic Quality Standards” (hbr.org); feel free to investigate.

As an investment advisor, portfolio manager, performance analyst, or any other investment management professional, how do you make any decision with 100% certainty when the data is questionable? I know I could not. The financial industry is packed with buzzwords; one of them is “Our organization is data-driven.” Well, if you are a data-driven financial institution and your data is inaccurate, the result is poor investment decisions, which cast doubt on your investment strategy. Your strategy may or may not be the issue, but if your data quality is questionable, you will never know how good your strategy truly is.

Now, imagine a solution that provides accurate, dependable, accessible data, allowing you to trust the information presented to you and removing the worry of subpar data. With that trust in the data, you have the power to make better, more meaningful, and more accurate decisions with greater confidence and certainty. And, of equal importance, your organization is not wasting 50-80% of its time and energy chasing data issues. Is this something that would interest you?

Data quality is a hot topic, but that hasn’t always been the case. Prior to recent regulations, organizations weren’t obligated, and often didn’t feel the need, to place great emphasis on data quality. Why? Simple: data quality control requires a significant commitment of resources to sift through data, and the cost-benefit was not worth it. By the time the data was verified and had passed through the necessary channels, it was stale. There is also the fact that, in all likelihood, not every data issue was captured and resolved during the often-manual verification processes. Data is constantly changing, which means your teams are working non-stop to sanitize data that will only be relevant for a short period. This is why most organizations have prioritized speed over quality, at least in the past.

Now, remember the solution alluded to previously, the one that can provide you with accurate, dependable, accessible data? Imagine it also encompassed the five key parameters of data quality (illustrated in a brief sketch after the list):

  • Accuracy: The data reflects real-world scenarios.
  • Completeness: All necessary data elements are available.
  • Reliability: Measures are in place to prevent duplicate data, and the consistency of data sources is verifiable.
  • Accessibility: Data is available to the resources that need it.
  • Timeliness: Data can be accessed on demand.

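To make these parameters a little more concrete, here is a minimal sketch of how they might be expressed as automated checks against a hypothetical holdings snapshot. The table, column names, thresholds, and pandas-based approach are assumptions made purely for illustration; they do not describe any particular product or the solution discussed here.

```python
# Minimal, illustrative sketch only: the table, column names, and thresholds
# below are hypothetical and exist purely to show how the five data quality
# parameters could be expressed as automated checks.
import pandas as pd

# A hypothetical holdings snapshot (in practice this would come from your data platform).
holdings = pd.DataFrame({
    "security_id": ["AAPL", "MSFT", "MSFT", "GOOG"],   # note the duplicate MSFT row
    "quantity":    [100, 250, 250, None],              # note the missing quantity
    "price":       [189.50, 410.20, 410.20, 170.10],
    "as_of":       [pd.Timestamp.now(tz="UTC") - pd.Timedelta(minutes=5)] * 4,
})

checks = {
    # Accuracy: values fall within plausible, real-world ranges.
    "accuracy": bool(((holdings["price"] > 0) & (holdings["quantity"].fillna(0) >= 0)).all()),
    # Completeness: every required field is populated.
    "completeness": bool(holdings[["security_id", "quantity", "price"]].notna().all().all()),
    # Reliability: no duplicate records for the same security in a single snapshot.
    "reliability": not holdings.duplicated(subset=["security_id"]).any(),
    # Accessibility: the dataset could actually be retrieved (non-empty stands in for that here).
    "accessibility": not holdings.empty,
    # Timeliness: the snapshot is fresh enough to act on (here, under 15 minutes old).
    "timeliness": bool((pd.Timestamp.now(tz="UTC") - holdings["as_of"].max()) < pd.Timedelta(minutes=15)),
}

for parameter, passed in checks.items():
    print(f"{parameter:>13}: {'PASS' if passed else 'FAIL'}")
```

In practice, checks like these would run continuously against live data feeds rather than a hand-built table, but the idea is the same: each quality parameter becomes something you can measure and monitor rather than take on faith.
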
What would that mean to you now? Your teams would not have to be bloated and centrally focused on the nonstop sanitization of fast-changing data. Resources could be restructured and refocused, which reduces cost, increases productivity, and improves both decision-making times and strategies. Now, you might expect a magical solution that encompassed all of this to have run-times measured in days or months, or batch cycles that require constant monitoring, but what if it were available to you with run-times measured in minutes and seconds? As an investment professional, would this be of value to you? And what if I told you that this was merely one small facet of a larger ecosystem? What would you think?