The insurance industry’s growing use of big data, predictive modeling and artificial intelligence (AI) has been well documented and has drawn attention and concern from American regulators. The Connecticut Insurance Department (CID) is no exception: it recently issued a notice imposing on domestic insurers a new data-related requirement that all industry members should take note of, as it may well be a sign of what is to come in other states.
As insurers and other industry players increasingly rely on big data sets to provide better predictions and risk analyses, regulators have come to view this trend with apprehension. When data is fed through AI applications—especially those that utilize machine learning to improve outcomes—the data processing can create a black box that is difficult for even the insurers themselves to fully comprehend. Beyond issues of data privacy, security, ownership and accuracy, the National Association of Insurance Commissioners and its member regulators are concerned about the potential for algorithmic bias when big data is used in accelerated underwriting programs or antifraud applications. Such bias can arise when a model or an AI application utilizes metrics and data sources that have been inadequately vetted or does not screen out irrelevant and/or illegal influences. As a result, the underwriting, pricing or claims-handling practices guided by such AI may unintentionally violate federal or state antidiscrimination laws.
Regulators expect insurers to be aware of, root out and eliminate various sources of bias, and the CID has put that expectation into writing. Connecticut is home not only to many of the nation’s largest insurers, but also to a growing insurtech community that is continuously developing new uses for big data in AI applications and models. As a result, the CID has been at the forefront of regulatory efforts to identify and eliminate sources of algorithmic bias. Commissioner Andrew Mais first brought issues of big data and the potential for discrimination to the industry’s attention last spring with a notice alerting Connecticut’s domestic companies to the CID’s concerns in this area. On April 20, the commissioner’s original notice was amended and restated. Of primary concern to compliance professionals is the newly announced requirement that Connecticut domestic insurers file a “Data Certification” by September 22. The certification form requires a responsible company officer to acknowledge the following: (a) that the insurer has an “established process” concerning third-party-derived data; (b) that “all data used to build models or algorithms” will be provided to the CID upon request; and (c) that the entity or certificate signer will maintain the records, schedules and data necessary to support the certificate for a commercially reasonable time.
Connecticut companies will want to prepare for the new certification requirement by considering the following two steps:
- Ensure that the company has an established process in place to evaluate the data and AI technology provided by third parties and internally generated data used in underwriting, pricing and/or claims processes. Appendix A to the commissioner’s notice provides the starting point for areas to be scrutinized. (NOTE: Though not specified in the notice, the company’s vetting process should be documented in writing for purposes of future regulatory review.)
- Review existing and proposed contracts with third-party providers to protect the insurer’s ability to comply with any CID informational requirements without breaching those contracts.
Lastly, although the commissioner’s notice applies only to Connecticut domestics, it is reasonable to assume that other states’ departments will eventually follow suit with requirements that companies periodically document their methodology for addressing algorithmic bias. Insurers in other states should start preparing now, putting processes in place both internally and with third-party providers to ensure that their use of big data is accessible when regulators inevitably come calling.