Data-Management and Data-Quality Best Practices
By Peter A. Marotta
Any shared data mechanism depends on sound data-management and data-quality best practices. Planning and prevention at the outset of any process (that is, engineering quality into the process) can substantially reduce the cost of correction and change afterward and help ensure quality results. Quality and process-improvement literature over the past half century makes this clear.
Nevertheless, every industry offers examples where reliable information resulted only after quality was engineered into the process after the fact. Engineering quality into a process produces two types of costs: prevention costs, expended to keep errors from being made in the first place, and appraisal costs, incurred to evaluate output and audit the process so that results conform to established criteria and procedures. Other costs associated with data quality include the consequential costs of poor customer service and bad decision making, as well as the less obvious costs of dealing with disparate data and data sources.
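To make the trade-off concrete, the short sketch below tallies a hypothetical cost-of-quality comparison. The cost categories follow the paragraph above; the dollar figures and the two scenarios are purely illustrative assumptions, not data from any actual insurer.

```python
from dataclasses import dataclass

@dataclass
class QualityCosts:
    """Hypothetical cost-of-quality tally; all figures are illustrative."""
    prevention: float  # e.g., edit rules, staff training, data standards
    appraisal: float   # e.g., output audits, reconciliation, sampling
    failure: float     # e.g., rework, poor service, bad decisions

    def total(self) -> float:
        return self.prevention + self.appraisal + self.failure

# Illustrative comparison: investing up front vs. doing nothing.
engineered = QualityCosts(prevention=50_000, appraisal=30_000, failure=20_000)
do_nothing = QualityCosts(prevention=0, appraisal=10_000, failure=250_000)

print(f"Engineered-in quality: ${engineered.total():,.0f}")
print(f"Doing nothing:         ${do_nothing.total():,.0f}")
```

Under these assumed figures, the up-front investment is the cheaper path, which is the pattern the quality literature describes.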
Data is the raw ingredient of most insurance products. Data drives the major financial and operational decisions within a company, and it's critical to strategic planning, investment decisions, and merger-and-acquisition considerations. Data drives decisions regarding a company's customer base and how a company serves those customers.
To ensure quality results from any data process, a good starting point is a thorough, coherent plan that engineers quality in from the start rather than paying for correction and change afterward. For years, many companies have expended scarce labor and resources correcting processes that failed to produce the required information or analysis.
Engineering data quality into every phase of data management does not come without a price. But the property/casualty industry recognizes that the price is always less than the cost of doing nothing to protect this valuable asset.
To engineer quality into any data process, companies must start with a clear definition of quality, the essential foundation for all data-management best practices. So what is the definition of quality data? Some define quality as meeting customer expectations; others relate quality to the number of defects. But a simple definition applicable to the insurance industry is this: data fit for its intended use.
The quality of data can be measured by determining whether the data exhibits a number of indispensable characteristics, such as accuracy, precision, validity, timeliness and other timing criteria, depth, latency and volatility, completeness, reasonability, consistency, uniqueness, accessibility, availability, and cohesiveness.
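Several of these characteristics lend themselves to simple automated measurement. The sketch below scores a handful of them (completeness, validity, uniqueness, and timeliness) over a few sample records; the field names, reference values, as-of date, and 90-day window are hypothetical assumptions for illustration, not industry-standard definitions.

```python
from datetime import date

# Hypothetical policy records; field names are illustrative assumptions.
records = [
    {"policy_id": "P001", "premium": 1200.0, "state": "NY", "reported": date(2024, 1, 5)},
    {"policy_id": "P002", "premium": None,   "state": "nj", "reported": date(2024, 1, 9)},
    {"policy_id": "P001", "premium": 900.0,  "state": "CA", "reported": date(2023, 11, 2)},
]

VALID_STATES = {"NY", "NJ", "CA"}   # assumed reference list
AS_OF = date(2024, 2, 1)            # assumed evaluation date
n = len(records)

# Completeness: share of records with no missing required fields.
completeness = sum(all(r[f] is not None for f in ("policy_id", "premium", "state"))
                   for r in records) / n

# Validity: share of records whose state code matches the reference list exactly.
validity = sum(r["state"] in VALID_STATES for r in records) / n

# Uniqueness: share of distinct policy IDs among all records.
uniqueness = len({r["policy_id"] for r in records}) / n

# Timeliness: share of records reported within an assumed 90-day window.
timeliness = sum((AS_OF - r["reported"]).days <= 90 for r in records) / n

for name, score in [("completeness", completeness), ("validity", validity),
                    ("uniqueness", uniqueness), ("timeliness", timeliness)]:
    print(f"{name}: {score:.0%}")
```

Each metric here is a simple ratio over the record set; in practice, each characteristic would be defined precisely against the company's own data standards before being measured.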
Adherence to key data-management guiding principles, and to the best practices that support those principles, will help promote data quality.
A growing number of tools and techniques can be used to support data-management and data-quality best practices. These include metadata repositories, data dictionaries, data models, data and process flows, master data management (MDM), detailed specifications, audits and controls, data and text mining, and encryption.
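As one small illustration of how two of these tools can work together, the sketch below pairs a miniature data dictionary with an automated control that audits incoming records against it. The field names, types, and rules are hypothetical, and real metadata repositories and audit frameworks are far richer than this.

```python
# Miniature data dictionary: each entry documents a field and its constraints.
# Field names and rules are hypothetical, for illustration only.
DATA_DICTIONARY = {
    "policy_id": {"type": str,   "required": True,  "description": "Unique policy identifier"},
    "premium":   {"type": float, "required": True,  "description": "Written premium in USD"},
    "state":     {"type": str,   "required": False, "description": "Two-letter risk state code"},
}

def audit_record(record: dict) -> list[str]:
    """A simple control: report every way a record violates the dictionary."""
    findings = []
    for field, rule in DATA_DICTIONARY.items():
        value = record.get(field)
        if value is None:
            if rule["required"]:
                findings.append(f"{field}: required field is missing")
        elif not isinstance(value, rule["type"]):
            findings.append(f"{field}: expected {rule['type'].__name__}, got {type(value).__name__}")
    return findings

print(audit_record({"policy_id": "P001", "premium": "1200"}))
# -> ['premium: expected float, got str']
```

The design point is that the dictionary serves double duty: it is human-readable documentation and, at the same time, machine-readable input to the audit control, so the documentation and the enforcement cannot drift apart.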
If the reliability, availability, or timeliness of the data a company uses or produces is in jeopardy or doubt, the value of the data erodes. Less-than-optimal operational decisions can result, an organization may delay or misdirect corporate initiatives, and personnel and customer dissatisfaction and frustration can arise. In this light, reestablishing the credibility of a company's data assets can be costly.
On the upside, with the potential for new and enhanced uses of data, the value of data assets can appreciate markedly. To continue to grow, prosper, and maintain a competitive advantage, a company will need more data of increased variety and sophistication. Maintaining the quality of that new data will require even more tools, as well as different data-management and control expertise. An important step in positioning for future success is to immediately evaluate the quality of data and the data-management practices in use throughout the organization.
Peter Marotta, AIDM, FIDM, is enterprise data administrator and principal at ISO, leading ISO's enterprise data-management activities. He is a past president of the Insurance Data Management Association (IDMA), IDMA's vice president of emerging data issues, editor-in-chief of IDMA's EDMIS, and a member of IDMA's Board of Directors.