As technology races ahead in complexity and influence, assurance is becoming ever more important – and ever harder to achieve. Audit functions need to master these diverse issues.
Information technology is transforming the work landscape for auditors, and there is no sign of any let-up in the pace of change.
There is an ever-increasing dependence on technology within organisations; it now underpins nearly everything a business does.
In addition, technology is increasing in complexity, while the speed of its impact on business is also accelerating.
Alongside these rapid technological developments, the issue of trust has become paramount within business and society as a whole. And that includes trust in business, auditors, regulators, government and in the technology itself.
These trends make assurance within IT a daunting prospect, but also an exciting one, as it is now incumbent on auditors to keep pace with change in order to be able to address emerging risks.
In order to do so, it’s critical that the internal audit function fully understands the emerging risks. These are some of the key areas to be aware of.
As organisations adopt new technology, the internal audit function is expected to take a greater role in assessing how it is being used, particularly within financial services firms.
Regulators and enforcement bodies, particularly in the US, are already increasing their scrutiny on the role of assurance in this area.
Financial crime and compliance with economic sanctions are key issues for financial services. Numerous banks have already paid large fines after breaching sanctions, largely because of failures in their screening systems.
KPMG recently undertook a banking project on financial crime and sanctions, looking at how AI could improve efficiency and accuracy.
At the beginning of the project, the bank was receiving over 100,000 alerts every day. As the number of alerts increased, it became clear that this was not sustainable. In addition, regulators had criticised the effectiveness and efficiency of the bank’s sanctions filter.
To help solve that problem, KPMG built a machine learning model, trained on the previous two years’ data, to tackle the number of false positives that the bank’s screening system was producing.
The outcome was KPMG’s Sanctions Alert Classifier, an optimised system that reduced the rate of non-suspicious alerts, allowing analysts to focus on high-risk activity. Its outcomes were consistent and the review much faster – up to 1 million decisions a second – with 99.9% accuracy.
The important thing for the auditor is that the Sanctions Alert Classifier uses supervised machine learning: it is transparent and it can be audited, meaning every false positive can be explained.
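To illustrate what that kind of auditability can look like, here is a minimal, purely hypothetical sketch of a transparent alert classifier. The feature names, weights and threshold are invented for illustration; a real supervised system would learn its parameters from labelled historical alerts rather than hard-code them.

```python
# Hypothetical sketch of a transparent, auditable alert classifier.
# In practice, weights would be learned from labelled historical alerts;
# the values below are illustrative only.

WEIGHTS = {
    "exact_name_match": 2.0,      # name exactly matches a sanctions list entry
    "country_risk": 1.5,          # counterparty in a high-risk jurisdiction
    "fuzzy_match_only": -1.8,     # weak fuzzy match (a common false positive)
    "known_good_customer": -2.5,  # previously reviewed and cleared
}
THRESHOLD = 0.0  # scores above this are escalated to an analyst

def classify(alert: dict) -> tuple[str, list[tuple[str, float]]]:
    """Return a decision plus the per-feature contributions explaining it."""
    contributions = [(f, w * alert.get(f, 0.0)) for f, w in WEIGHTS.items()]
    score = sum(c for _, c in contributions)
    decision = "escalate" if score > THRESHOLD else "close_as_false_positive"
    return decision, contributions

alert = {"country_risk": 1.0, "fuzzy_match_only": 1.0,
         "known_good_customer": 1.0}
decision, why = classify(alert)
print(decision)  # close_as_false_positive
for feature, contribution in why:
    print(f"{feature}: {contribution:+.1f}")
```

Because the decision is a simple sum of named contributions, an auditor can trace exactly which features pushed any given alert above or below the threshold, which is the property the article highlights for supervised, explainable models.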
Indeed, as more and more organisations adopt artificial intelligence and machine learning, algorithm assurance becomes increasingly important.
The biggest concern about AI is how it reaches its decisions. So how can internal audit provide assurance around algorithms and AI?
Businesses need to make sure that they can stand by the logic and governance put in place around algorithms. For instance, are the algorithms reliable? Are outputs being tested and validated? Are they secure?
Is the IT architecture (the processes and controls) working correctly? Are all laws and regulations being followed?
Within data management and governance, is the use of data appropriate, and is it being safeguarded correctly?
Does the company fully understand the AI logic and the decisions that the algorithm makes? Can it make sure that they fit the business context, or is it relying blindly on the technology?
In addition, companies should stay on top of the newest developments, such as relevant legal frameworks and regulations. Ethics and privacy are also increasingly important topics.
Internal auditors also need to think about the human side of the technology: how employees interact with the system. Do they understand the context and business process in place well enough to decide whether it’s right to do what the algorithm tells them?
Over the past year, we have helped organisations think about their algorithms. These are the three main things that we look at:
1. The design – is its goal and purpose properly defined?
2. The implementation – are the inputs right, and do the outputs meet expectations?
3. The operation – is there continuous monitoring of controls?
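The operation check, continuous monitoring of controls, can be sketched in a few lines. The idea is simply to track an observable output of the algorithm over time and flag drift outside an expected band. The band and the daily rates below are hypothetical values chosen for illustration, not figures from any real engagement.

```python
# Hypothetical continuous-monitoring control for a deployed algorithm:
# track the share of alerts the model escalates each day and flag any
# day whose rate drifts outside an agreed band. Values are illustrative.

EXPECTED_BAND = (0.05, 0.15)  # acceptable fraction of alerts escalated

def drifting_days(daily_escalation_rates: dict[str, float]) -> list[str]:
    """Return the days whose escalation rate falls outside the band."""
    low, high = EXPECTED_BAND
    return [day for day, rate in daily_escalation_rates.items()
            if not (low <= rate <= high)]

rates = {"2020-03-02": 0.09, "2020-03-03": 0.11, "2020-03-04": 0.31}
print(drifting_days(rates))  # the sudden jump on 2020-03-04 is flagged
```

A flagged day would prompt a review: has the input data changed, has the model degraded, or is the spike a genuine shift in underlying activity?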
Bringing a multi-skilled team to this endeavour is essential. That team needs to include pure data scientists, highly skilled technologists, and those with plenty of experience on internal controls.
One of the most critical elements for the assurance function is to understand that this new technological landscape extends well beyond the IT department. Quite often the technology is brought in and developed by one particular business unit. It’s critical that there are ongoing conversations between the assurance function and everyone in the business who is using an algorithm of one kind or another.
The internal audit function can really add value here, working with those teams and enabling them to think about how they communicate what their AI does, building that sense of transparency. It can also bring a discussion about internal controls to the table in order to start assuring those algorithms.
With the exponential increase in cyber-attacks comes a proliferation of new laws and regulations regarding cyber security. Internal assurance needs to factor this into what it is already doing.
Among the many challenges is that those laws are rarely consistent. However, there are some common themes. For instance, nearly all have fines attached to regulatory breaches. Those fines vary vastly, with the UK delivering some of the stiffest penalties.
Many regulations contain audit rights, under which governments can come into organisations and check on their operations. Some, such as China’s, have data localisation requirements, where information on Chinese employees has to be held within the country.
Breach notifications are also standard. And some governments have the right to step into an organisation to manage an incident if it is severe enough.
But while there is some commonality, the challenge is knowing which laws and regulations apply.
One recent example of a new law is the Network and Information Security Directive (NISD). It is specifically targeted at critical national infrastructure sectors such as power and utilities, oil and gas, water, health and digital service providers. Only the largest organisations are in scope, but smaller ones are set to follow.
This directive opens up a new piece of work for internal auditors. They need to be aware that the operational technology within these types of organisations is not typically under the remit of IT, and might not have the same level of control maturity as other areas of the business.
Overall, legal and regulatory change within cyberspace is on the increase, and the internal audit function must be constantly alert to the changes.
Operational resilience is another hot topic on the internal audit agenda.
Last year, the Bank of England, the Prudential Regulation Authority and the Financial Conduct Authority published a joint discussion paper on an approach to improve the operational resilience of firms and financial market infrastructures. The paper raises the bar for operational resilience, and not just for financial services.
According to the paper, internal auditors need to make sure they are getting seven key elements right.
1. Top down – is there board ownership and accountability for operational resilience, and do they understand it?
2. End-to-end services – is there a real understanding of the resilience of an organisation’s end-to-end services? Business contingency plans can often operate in silos, but taken together, do they add up?
3. Does the reporting give an informative view of those end-to-end services?
4. Incidents keep happening. Does the organisation have a resilience culture of thinking about when a failure might occur, not if?
5. Is resilience embedded within everything an organisation does?
6. The regulators are clearly focusing on testing the resilience of organisations, so having severe but plausible stress scenarios firm-wide is important.
7. As organisations expand and use third parties, have they made sure there is end-to-end communication in the event of an incident?
The board and executive teams need to be making differentiated and informed investment decisions with resilience in mind.
But in order to do so, they need a structure and rigorous framework that gives them an informed view of organisational resilience. Furthermore, organisations need to make sure they have sufficient skills within the organisation to really embed such a culture of resilience.
By the same token, internal auditors need to make sure they are embedding operational resilience in all the audits they are performing. And there needs to be continual oversight of those risks and controls, as well as reporting up to the audit committee.
Getting the right skills and embedding the appropriate culture is not just a regulatory imperative, it’s a strategic one. Organisations need to build resilience for the future, especially in an increasingly uncertain world.
© 2020 KPMG LLP, a UK limited liability partnership, and a member firm of the KPMG network of independent member firms affiliated with KPMG International Cooperative, a Swiss entity. All rights reserved.
KPMG International Cooperative (“KPMG International”) is a Swiss entity. Member firms of the KPMG network of independent firms are affiliated with KPMG International. KPMG International provides no client services. No member firm has any authority to obligate or bind KPMG International or any other member firm vis-à-vis third parties, nor does KPMG International have any such authority to obligate or bind any member firm.