More than half (61 percent) of Australians know little about Artificial Intelligence (AI), and many are unaware that it is being used in everyday applications such as social media. While 42 percent generally accept AI, only 16 percent approve of it.
These are some of the key findings of the University of Queensland/KPMG Australia Trust in Artificial Intelligence report launched today. It is the first national survey to take a deep dive into a vital area of technology that is reshaping how we work and live. The survey set out to understand and quantify the extent of Australians’ trust in and support of AI, and to benchmark these attitudes over time.
“The benefits and promise of AI for society and business are undeniable,” said Professor Nicole Gillespie, KPMG Chair in Organisational Trust and Professor of Management at the University of Queensland Business School. “AI helps people make better predictions and informed decisions, it enables innovation, and can deliver productivity gains, improve efficiency, and drive lower costs. Through such measures as AI-driven fraud detection, it is helping protect physical and financial security – and facilitating the current global fight against COVID-19.”
But Professor Gillespie said the risks and challenges AI poses for society are equally undeniable. These include the risk of codifying and reinforcing unfair biases, infringing on human rights such as privacy, spreading fake online content, technological unemployment, and the dangers stemming from mass surveillance technologies, critical AI failures and autonomous weapons.
“It’s clear that these issues are causing public concern and raising questions about the trustworthiness and regulation of AI. Trust in AI systems is low in Australia, with only one in three Australians reporting that they are willing to trust AI systems. A little under half of the public (45 percent) are unwilling to share their information or data with an AI system, and two in five (40 percent) are unwilling to trust the output of an AI system (eg a recommendation or decision).”
She said the report findings also highlighted that most Australians do not view AI systems as trustworthy; however, they are more likely to perceive AI systems as competent than as designed to operate with integrity and humanity. While many in the community are hesitant to trust AI systems, Australians generally accept (42 percent) or tolerate (28 percent) AI, but few approve (16 percent) or embrace (7 percent) it.
Professor Gillespie said a key insight from the survey shows the Australian public is generally ambivalent in their trust towards AI systems: “If left unaddressed this is likely to impair societal uptake and the ability of Australia to realise the societal and economic benefits of AI at a time when investment in these new technologies is likely to be critical to our future prosperity. The report provides a roadmap for what to do about this,” she said.
The report emphasises four key drivers that influence Australians’ trust in AI systems.
“Of these drivers, the perceived adequacy of current regulations and laws is clearly the strongest,” said Professor Gillespie. “This demonstrates the importance of developing adequate regulatory and legal mechanisms that people believe will protect them from the risks associated with AI use. Our findings suggest this is central to shoring up trust in AI.”
She noted that one reason for the lack of confidence in commercial organisations to develop and regulate AI may be that people think such organisations are motivated to innovate to cut labour costs and increase revenue, rather than to help solve societal problems and enhance societal wellbeing.
“About three quarters (76 percent) of the public believe commercial organisations innovate with AI for financial gain, whereas only a third (35 percent) believe they innovate with AI for societal benefit,” said Professor Gillespie. “That opens up an opportunity for business to invest in and better communicate to Australians about how they are using AI and emerging technologies to create mutual benefit and societal good.”
James Mabbott, National Leader, KPMG Innovate, pointed to the survey finding that Australians generally disagree (43-47 percent) or are ambivalent (19-21 percent) about the adequacy of current safeguards around AI, such as rules, regulations and laws.
“Survey respondents question whether current regulations are sufficient to make the use of AI safe or to protect them from problems. Similarly, the majority either disagree or are ambivalent that the government adequately regulates AI,” he said. “This is where innovation is needed – in understanding that trust acts as the central vehicle through which other drivers impact AI acceptance – and in delivering certainty. We need to be more creative about providing these solutions and assurances, and about communicating them in an effective way.”
Mr Mabbott said the report’s findings and defined action plan provided a key opportunity to enhance vitally needed public trust in AI, in order to accelerate its benefits whilst mitigating potential harms.
According to the findings from the University of Queensland/KPMG Australia Trust in Artificial Intelligence report, the key ways to build trust in AI are:
1. Live up to Australians’ expectations of trustworthy AI
2. Strengthen the regulatory framework for governing AI
3. Strengthen Australia’s AI literacy
Marjorie Johnston
+ 61 407 329 430
mjohnston4@kpmg.com.au
Ash Pritchard
+ 61 411 020 680
apritchard2@kpmg.com.au
For the purposes of the survey, Artificial Intelligence (AI) refers to computer systems that can perform tasks or make predictions, recommendations or decisions that usually require human intelligence. AI systems can perform these tasks and make these decisions based on objectives set by humans, but without explicit human instructions.
The University of Queensland/KPMG Australia Trust in Artificial Intelligence national survey is the first of its kind to take a deep look at community trust in, and expectations of, AI. The survey involved a nationally representative sample of over 2,500 Australian citizens and was conducted in June and July 2020.
©2021 KPMG, an Australian partnership and a member firm of the KPMG global organisation of independent member firms affiliated with KPMG International Limited, a private English company limited by guarantee. All rights reserved. The KPMG name and logo are trademarks used under license by the independent member firms of the KPMG global organisation.
Liability limited by a scheme approved under Professional Standards Legislation.
For more detail about the structure of the KPMG global organisation please visit https://home.kpmg/governance.