How can insurers implement an ethical approach to AI whilst ensuring they maximise the opportunities it presents?
Data seems to be discussed everywhere. It has perhaps become a truism that firms stand to benefit from better leveraging the data they hold: big data and artificial intelligence (AI), for example, promise a huge leap forward, enabling insurers to make better decisions faster and with greater accuracy. These technologies allow organisations to better understand their customers, enhance their products, price risk more accurately and improve efficiency, driving down costs.
At the same time, firms, the authorities and the public are becoming increasingly aware that AI and advanced algorithms in general also bring some new risks while accentuating others.
The financial services industry is already heavily regulated, but firms will need to make sure that their approach to the GDPR, the ePrivacy Regulation, the Financial Conduct Authority (FCA) rulebook and other requirements is calibrated to fit these new technologies and evolving customer expectations. For example, how do requirements to be transparent with customers and to treat them fairly map onto decisions made using machine learning? It may not be clear how existing regulation applies to new technologies.
Regulators are certainly getting interested too. The Information Commissioner's Office (ICO), the FCA and the Bank of England (BoE) are all planning work and guidance on data ethics and AI, and the new Centre for Data Ethics and Innovation has been set up to advise government and business on a wide range of potential challenges.
The need to embed a data ethics approach
Our joint KPMG-UK Finance report on data ethics (March 2019) suggested a set of principles and some next steps to help firms ensure they look at risks in the round and embed a data ethics approach throughout the business.
Within insurance, the FCA published the interim report (October 2019) of its Market Study examining the extent to which pricing practices that focus on higher margins cause consumer harm (higher premiums) for long-standing customers. Its fundamental concern is that the practices which lead to higher margins may be harmful, and that they could be taking advantage of low customer awareness, characteristics of vulnerability or low levels of engagement. In addition to controversial practices such as price optimisation, which relies on profiling customers, other risk-related 'correlations' are appearing in all sorts of unexpected places, with unintended consequences that increase the risk of consumers experiencing discriminatory outcomes.
While an insurer can check that the inputs to its AI or Machine Learning (ML) based models are in order, the very power and advantage of AI that insurers are looking to exploit could still produce unfair outcomes. Consumers are becoming less concerned about whether a firm has done the right things and are putting more focus on whether it has achieved the right outcomes. This is why legal compliance is no longer enough, and effective continuous monitoring frameworks will become necessary. So what else can insurers and other financial services organisations do?
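As a minimal illustration of what continuous outcome monitoring could look like, the sketch below compares a model's outcomes (here, quoted premiums) across customer segments and flags the result for human review when the gap exceeds a tolerance. The segment labels, the tolerance value and the sample data are all hypothetical assumptions for illustration, not a prescribed framework or any specific insurer's method.

```python
# Illustrative sketch only: compare mean model outcomes across customer
# groups and flag for review when they diverge beyond a chosen tolerance.
# Group names, tolerance and data are hypothetical assumptions.

from statistics import mean

def outcome_disparity(outcomes_by_group):
    """Return (ratio of highest to lowest mean outcome, per-group means)."""
    means = {group: mean(values) for group, values in outcomes_by_group.items()}
    ratio = max(means.values()) / min(means.values())
    return ratio, means

def flag_if_disparate(outcomes_by_group, tolerance=1.2):
    """Flag for human review when mean outcomes diverge beyond the tolerance."""
    ratio, means = outcome_disparity(outcomes_by_group)
    return {"ratio": round(ratio, 2), "means": means, "review": ratio > tolerance}

# Hypothetical quoted premiums for two customer segments
quotes = {
    "new_customers": [310.0, 295.0, 305.0],
    "long_standing": [390.0, 410.0, 400.0],
}
print(flag_if_disparate(quotes))  # long-standing customers pay markedly more
```

A real framework would of course segment on many more dimensions, run on live decision data and feed into governance processes, but even a simple check like this shifts the question from "were the inputs compliant?" to "were the outcomes fair?".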
Action: Incorporate a "tweet test"
One additional control could be to incorporate a "tweet test" as part of your ethical challenge of AI/algorithmic use cases. Ask yourself: what would the reaction from the "court of public opinion" be to the outcomes of the proposed algorithm? As was evidenced by the scrutiny of the well-publicised, seemingly biased credit-scoring algorithm behind Goldman Sachs' Apple Card, reputation-damaging media coverage can occur if you cannot appropriately explain and justify what the algorithm is doing.
Get in touch for further support and to find out more.
© 2021 KPMG LLP, a UK limited liability partnership and a member firm of the KPMG global organisation of independent member firms affiliated with KPMG International Limited, a private English company limited by guarantee. All rights reserved.
For more detail about the structure of the KPMG global organisation please visit https://home.kpmg/governance.