ISO Review Issues Forum
Data Quality: Why It Matters
For many insurers, the most visible feature of ISO’s data-quality efforts is the record-by-record validity checks on the statistical data they submit. Edit packages ISO makes available to companies provide a preview of the results of those edits before insurers submit their data, and submission analysis reports provide a complete picture of the actual results as ISO’s front-end systems process the data.
Yet the validity checks constitute only one aspect of the data-quality regimen that ISO pursues. Much of the other work takes place after statistical submissions have been accepted. In this Issues Forum of ISO Review, ISO data experts Beth Fitzgerald and Sanders Cathcart discuss the additional steps ISO takes, why they are important, and how they might assist insurers.
Please explain the distinction between data validity and data correctness.
But what if a record is coded with a “1” — a valid code? Does that mean it is correct? That is, does the record reflect an actual underlying Basic Form 1 homeowners policy? The validity tests allow this record to pass through with no associated error. After all, Form 1 is a part of ISO’s homeowners program and a part of many companies’ programs. Although it’s available, we also know that the Basic Form is not used very often in today’s market. And it is always possible that a company system glitch led to a Form 3 Special policy being coded as a Form 1 policy. From ISO’s perspective, it is a difficult task to make certain that an individual record, though valid, is really correct.
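The validity-versus-correctness distinction can be made concrete with a minimal sketch. The form codes, field names, and record layout below are illustrative assumptions, not ISO's actual statistical-plan values:

```python
# Hypothetical field-validity edit. The permitted codes are assumed
# for illustration only; they are not ISO's actual plan values.
VALID_HO_FORM_CODES = {"1", "2", "3", "4", "6"}

def form_code_is_valid(record: dict) -> bool:
    """A validity edit asks only: is the code in the permitted set?"""
    return record.get("policy_form") in VALID_HO_FORM_CODES

# A Form 3 policy mistakenly coded as Form 1 still passes the edit:
miscoded = {"policy_id": "A-1001", "policy_form": "1"}  # truly a Form 3
print(form_code_is_valid(miscoded))  # True — valid, but not correct
```

The edit cannot see the underlying policy, so a miscoded record sails through; that is exactly why record-level validity checks must be supplemented by the downstream reviews discussed below.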
So what does ISO do to verify such records?
Likewise, a sudden change in a company's distribution may suggest an anomaly, such as one introduced during system maintenance. For a company consistently reporting 95 percent of its personal auto physical damage business with deductibles under $500, a new quarter's submission showing most business with a $1,000 deductible would be questionable.
As a final example, we also monitor aggregate premium and loss distributions to ensure an acceptable level of consistency. It might be quite reasonable that 90 percent of a company’s commercial automobile bodily injury claims come from policies with liability limits over $5 million. However, if the same company’s premium records suggest that the majority of its business carries liability limits of $1 million or less, we may flag the results.
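The deductible example above amounts to comparing a new quarter's distribution against a historical baseline and flagging large shifts. A minimal sketch, in which the field names, data, and tolerance are assumptions rather than ISO's actual checks:

```python
# Sketch of a distributional-consistency check; the 20-point tolerance
# and record layout are assumptions made for illustration.
def share_under(records, field, cutoff):
    """Fraction of records whose value in `field` is below `cutoff`."""
    n = len(records)
    return sum(1 for r in records if r[field] < cutoff) / n if n else 0.0

def flag_shift(baseline_share, new_share, tolerance=0.20):
    """Flag when a quarter's share moves more than `tolerance` from baseline."""
    return abs(new_share - baseline_share) > tolerance

history = [{"deductible": 250}] * 95 + [{"deductible": 1000}] * 5
quarter = [{"deductible": 1000}] * 80 + [{"deductible": 250}] * 20

base = share_under(history, "deductible", 500)  # 0.95 under $500
new = share_under(quarter, "deductible", 500)   # 0.20 under $500
print(flag_shift(base, new))  # True — the submission warrants follow-up
```

A flag here is a prompt for discussion with the company, not proof of an error; as noted below, the shift may reflect a legitimate change in the book of business.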
Cathcart: All these examples illustrate ways to search for potential problem variances often not apparent in reviewing individual records for validity. The important general lesson for data managers is that “valid” does not necessarily mean “correct,” and validity tests can lead to a false sense of security. Validity tests alone, constructed in the context of a set of rules (e.g., a statistical plan) reflecting what is permissible to a “system,” cannot ensure that the resulting data accurately represents underlying business realities — in the insurance industry or anywhere else.
At the same time, it is important to recognize that, in cases such as those above, the underlying data could actually be correct. An open discussion with the company, together with a good understanding of the market and company operations, is critical to getting to the bottom of the issue. Legitimate changes in a reporting company’s underwriting strategy or claim practices, or significant differences from industry norms, can lead to patterns that will catch our attention, making discussions with the company essential. And engaging the appropriate professionals responsible for the business can often lead to enlightenment for all concerned.
How does ISO verify data quality at the analysis stage?
Each insurance line program has associated standard policy forms and rating procedures. The existence of standardized policy forms, coverage definitions, and rating rules allows for the uniform collection of data from many insurers. The requirement to “code as rated” in reporting data allows for the creation of a homogeneous database.
Unit transaction reporting of the data allows for more flexibility in the compilation of data for analysis. For example, reports can be derived by calendar year, accident year, policy year, or report year. Unit transaction reporting also enhances the data-quality checks of individual field and field-relationship edits and the distributional analysis of data submissions. Even when data is reviewed on a summarized basis, unit transaction reporting allows an actuary to drill down into the detail to investigate any anomalies. For instance, when analyzing an increase in losses for a particular accident year, we can evaluate whether the increase stems from the reporting of a single large claim or from increases in the amounts of several individual claims.
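The drill-down described above can be sketched briefly. The claim records, amounts, and thresholds below are hypothetical, invented purely to illustrate moving from a summarized view to the underlying transactions:

```python
from collections import defaultdict

# Hypothetical unit-transaction claim records; all values are made up
# to illustrate the drill-down, not drawn from real ISO data.
claims = [
    {"accident_year": 2003, "claim_id": "C1", "incurred": 40_000},
    {"accident_year": 2003, "claim_id": "C2", "incurred": 35_000},
    {"accident_year": 2004, "claim_id": "C3", "incurred": 45_000},
    {"accident_year": 2004, "claim_id": "C4", "incurred": 2_000_000},
]

# Summarized view: total incurred losses by accident year
by_year = defaultdict(int)
for c in claims:
    by_year[c["accident_year"]] += c["incurred"]
print(dict(by_year))  # {2003: 75000, 2004: 2045000}

# Drill-down: is the 2004 jump one large claim or many moderate ones?
large = [c["claim_id"] for c in claims
         if c["accident_year"] == 2004 and c["incurred"] > 1_000_000]
print(large)  # ['C4'] — the increase traces to a single large claim
```

Because every transaction is retained, the same records support both the summarized report and the claim-level investigation, which is the flexibility the paragraph above describes.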
Cathcart: Actuaries examine data for validity, accuracy, reasonableness, completeness, and timeliness. Many such checks are done prior to updating the database with the most recent quarterly data. As each layer of analysis is performed, the data is evaluated for quality, reasonableness, and appropriateness for that specific analysis. And quality in this sense is relative, not absolute. Data that may be used in an annual analysis of prospective advisory loss costs may not be suitable for an analysis in a relativity study. Each separate analysis must evaluate the quality of the data for that particular analysis. It is common to use a broader set of data for overall statewide analysis and a subset of data for much more refined analysis by individual rating variables.
Why are these efforts important to insurers?
Of course, ISO’s data is valuable to our customers for many other purposes. So the quality of the data is just as important to company underwriters looking to develop strategies in a new market or to claim professionals who want to benchmark their salvage or subrogation practices against those of the industry.
Cathcart: And ISO must deliver timely, quality aggregate data to regulators in its role as a statistical agent. Ensuring that company data is as accurate as possible allows ISO to include it in statistical compilations that faithfully portray company practices. At the same time, by aggregating the results of all carriers reporting quality data, ISO can ensure the confidentiality of individual companies’ experience.
Sanders Cathcart, FCAS, MAAA, is assistant vice president and actuary of ISO’s Insurance Information Products and Analytics Department.
Beth Fitzgerald, FCAS, MAAA, CPCU, is vice president of ISO’s Commercial Lines and Modeling Department.