Strategic Steps to Avoid Paying the Price of Poor Data Quality
05/13/2019 by Ken Matz Modernization - Analytics
If you deal with reporting, preparing, cleansing, or analyzing data in any form, then you have encountered poor data quality in your work. How it affects you depends on your role. If you work in PR, poor data quality hurts when it is exposed to the public. If you work in IT, you (or someone on your team) are directly responsible for the data quality strategy and its execution. If you are the CEO, you may feel the impact on the company's reputation and stock price when a data quality event occurs.
Regardless of your role, poor data quality impacts everyone in some way. Let’s review some of the critical areas in which poor data quality affects the organization.
One of the critical costs of poor data quality is lower productivity. “Nearly one-third of analysts spend more than 40 percent of their time vetting and validating their analytics data before it can be used for strategic decision-making,” noted Forrester.
The costs of poor data quality vary depending on the inefficiencies at your organization. Poor data can hamper marketing, production, and development efforts alike. Reduced productivity takes many forms: time lost to manual vetting, rework after errors surface, and delays while data is corrected.
Financial cost includes lost revenue and additional overhead cost. According to Gartner research, “organizations believe poor data quality to be responsible for an average of $15 million per year in losses.”
The negative financial impacts of data errors, inconsistent data, and duplicate data can include increased operating costs, decreased revenues, missed opportunities, reduced or delayed cash flow, and increased penalties, fines, or other charges.
What can you do about poor data quality at your company? Several solutions can help improve productivity and reduce the financial impact of poor data quality in your organization:
To show yourself, and anyone you discuss data quality with, that you are serious about it, create a team that owns the data quality process. The team's size matters less than its membership: it should draw from the right parts of the organization, with people who have real influence over and knowledge of the process. Once the team is set, make sure it creates a set of goals and objectives for data quality, along with metrics to measure performance against them.
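Those performance metrics can start simple. The sketch below, a minimal example assuming records arrive as Python dictionaries (the field names are illustrative), computes two common baseline metrics: completeness and duplicate rate.

```python
# A minimal sketch of baseline data quality metrics, assuming a list of
# dictionaries as the record format; field names are illustrative.
def quality_metrics(records, required_fields):
    """Return completeness and duplicate-rate metrics for a dataset."""
    total = len(records)
    # Completeness: fraction of records with every required field populated.
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    # Duplicate rate: fraction of records that repeat an earlier record exactly.
    seen, duplicates = set(), 0
    for r in records:
        key = tuple(sorted(r.items()))
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {
        "completeness": complete / total if total else 1.0,
        "duplicate_rate": duplicates / total if total else 0.0,
    }
```

Tracked over time, even two numbers like these give the team an objective way to demonstrate whether quality is improving.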
After you create the proper team to govern your data quality, ensure that the team focuses on the data you need first. Everyone knows the rules of "good data in, good data out" and "bad data in, bad data out." To put this to work, make sure your team knows which business questions are in progress across your data projects, so they focus on the data that answers those questions.
Once you do that, you can look at the potential data quality issues associated with each of the relevant downstream business questions and put the proper processes and data quality routines in place to ensure that poor data quality has a low probability of infecting that data. As you decide which data to focus on, remember that the size of the data isn’t the most critical factor — having the right data is.
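One lightweight way to keep those routines tied to business questions is to express each rule as a named predicate, grouped under the question it protects. This is a hedged sketch, not a prescribed implementation; the question name, rule names, and fields are invented for illustration.

```python
# Data quality rules expressed as named predicates, grouped by the
# business question they protect. Names and fields are illustrative.
RULES = {
    "revenue-by-region report": [
        ("region is populated", lambda row: bool(row.get("region"))),
        ("amount is non-negative", lambda row: row.get("amount", 0) >= 0),
    ],
}

def failing_rows(rows, question):
    """Yield (rule_name, row) pairs for rows that violate a rule."""
    for name, check in RULES.get(question, []):
        for row in rows:
            if not check(row):
                yield name, row
```

Because each rule carries the question it supports, the team can prioritize fixes by the reports they actually block.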
As you look for opportunities to tune your organization’s data quality, one of the top priorities should be looking at data that lives in different databases. When you merge or join data from separate databases, you have human factors to consider when planning your data quality rules.
This situation can present itself due to a merger with another company. However, it can also be present in everyday data management. The data silos and people silos need to be managed during this process.
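When two databases describe the same entities, small human inconsistencies in the join keys (extra spaces, different capitalization) are a common source of silent duplicates. A minimal sketch, assuming customer records keyed by a `name` field (an assumption for illustration), is to canonicalize the key before merging:

```python
# Sketch of normalizing join keys before merging records from two
# databases. Field names and normalization choices are assumptions.
def normalize_key(value):
    """Canonicalize a join key: trim, collapse whitespace, casefold."""
    return " ".join(str(value).split()).casefold()

def merge_customers(db_a, db_b):
    """Merge two lists of customer dicts on a normalized 'name' key,
    preferring db_a's values when both databases have the record."""
    merged = {normalize_key(r["name"]): dict(r) for r in db_b}
    for r in db_a:
        merged.setdefault(normalize_key(r["name"]), {}).update(r)
    return list(merged.values())
```

With this, "Acme Corp" in one database and "ACME  corp" in the other land in a single merged record instead of two.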
When data volumes become too unwieldy to manage quality by hand, automate the process. Many data quality tools on the market do an excellent job of removing the manual effort. Commercial products include offerings from DataFlux, Informatica, Alteryx, and Software AG.
As you search for the right tool for you and your team, be aware that tools help with organization and automation; the right processes and knowledge of your company's data, however, can't be replaced by a machine.
Remember that the process is not a one-time activity. It requires regular care and feeding. While good data quality can save you much time, energy, and money downstream, it does take time, investment, and practice to do well. As you improve the quality of your data and the processes around that quality, you will want to look for other opportunities to avoid data quality mishaps.
The good news is that if you have followed some of the preceding solutions, you should have more time to invest in looking for edge cases. As always, go after the opportunities with the biggest bang for the buck first. You don't want to be answering questions from the steering committee about why you are looking for differences between "HR" and "Hr" if you haven't solved more significant issues, like knowing the difference between "Human Resources" and "Resources."
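When you do get to those edge cases, even the standard library can flag near-duplicate labels like "HR" and "Hr". The sketch below uses Python's `difflib.SequenceMatcher`; the similarity threshold is an illustrative choice, not a recommendation.

```python
# Sketch of flagging near-duplicate category labels (e.g. "HR" vs "Hr")
# with the standard library; the threshold value is illustrative.
from difflib import SequenceMatcher

def suspect_pairs(labels, threshold=0.8):
    """Return pairs of distinct labels that look like the same value."""
    pairs = []
    for i, a in enumerate(labels):
        for b in labels[i + 1:]:
            similarity = SequenceMatcher(None, a.casefold(), b.casefold()).ratio()
            if a != b and similarity >= threshold:
                pairs.append((a, b))
    return pairs
```

A report of suspect pairs like this gives the team a short review list rather than a manual hunt through every distinct value.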