A CIO’s guide to AI: Australian artificial intelligence suffering from arrested development

In part 1 of this guide, we provide a view of the AI landscape in Australia. Where is AI being deployed? Expectations versus realities? Are adoption rates falling behind the rest of the world? What is holding AI back in this country?


Days of miracle and wonder

Notwithstanding the growing pockets of success, AI in Australia is also being hindered by the intense hype being generated by the industry, media and general public.

“AI is overhyped in the industry,” said UTS’s Blumenstein, who argues that businesses’ understanding of AI, and therefore their expectations, are completely distorted. He is particularly concerned about the growing number of CIOs with little direct experience of the technology, especially those under pressure to accelerate AI deployment.

“A certain number of CIOs today might be susceptible to misinformation,” he said.

Gartner VP and distinguished analyst, Whit Andrews, invokes music great Paul Simon: “These are the days of miracle and wonder.”

“When we see a machine can teach itself how to play chess elegantly with nothing but the rules of chess, learning opening moves acquired over centuries in less than 48 hours every single time, it creates a sense of wonder that can in some cases change into a sense of paralysis or fear,” Andrews said.

One example of a growing anxiety is organisations worrying AI may be too complex to integrate into existing IT systems.

But the biggest anxieties appear to be around the ethical and security risks posed by AI, with Australian organisations among the most worried.

Deloitte reports 43 per cent of all its survey respondents have ‘major’ or ‘extreme’ concerns about potential AI risks, led by cybersecurity, making the wrong decision based on AI, and ethics concerns around bias.

Just under 50 per cent of Australian companies reported major or extreme concerns about potential AI risks, yet only 38 per cent said they were fully prepared to deal with them. Looking at China, however, 42 per cent of organisations reported being fully prepared, while just 20 per cent expressed real concerns.

UTS’s Blumenstein thinks Australian concerns are partly a by-product of the wider hype and confusion.

“There have been unrealistic expectations in business, and fear in the community fuelled by people who want to create a panic,” he said.

“The public fears it even though it’s nowhere near what human rights and ethical groups are trying to portray it as.”

There have, however, been many examples of AI systems caught out making decisions based on biased – albeit innocently acquired – assumptions.

For example, Amazon came under fire years ago when it was revealed the company had been relying on an AI tool to sort candidate resumes that was actively discriminating against women. The system had been trained on 20 years of past resumes, most of which came from men, so it learned to associate male candidates with success.

Left to its own devices, the system began filtering out CVs containing terms like ‘women’s basketball’ and the names of women-only colleges. It ran for some five years before the problem was detected.

Problems with racial profiling have also been a major source of concern, especially for African Americans and Hispanics in the US. The most notable example is the notorious COMPAS platform, used to assess US prison inmates’ parole applications, which was found to be disproportionately rating African Americans as higher risk.

Australians are expected to face their own challenges around system bias and ethics. The infamous ‘robodebt’ program has parallels with the examples above: while not strictly an AI system, it illustrates the sort of community and political backlash that can be unleashed when automated systems are left unchecked.

Interestingly, more than 20 pieces of Commonwealth legislation currently allow for decisions to be made by computers. Meanwhile, seemingly benign new apps and systems are launched all the time, any of which may come into conflict with existing and evolving Australian privacy and human rights laws.

So watch this space.

Meanwhile, Deloitte’s Marshall points to the lack of a strong local tech industry, Australia’s relatively unfavourable investment environment, and levels of government investment he describes as “chicken feed” compared with countries like the US, UK, Germany and, in particular, China.

“Australia doesn’t enjoy a natural competitive advantage like with hubs in China, Silicon Valley, pockets in Israel, and stuff in Germany,” Marshall lamented. “We don’t have proximity to centres of gravity for tech and tech development.”

And this is making Australia’s AI and general data science skills shortage worse.

Gary Adler, chief digital officer with the local subsidiary of global law firm Minter Ellison, observes many of our best data scientists are being lured overseas.

“Certainly in Australia what we’re seeing is some of the really good resources are being attracted straight to Silicon Valley, sometimes to Israel, sometimes into the EU,” he said.

The federal government has announced various initiatives to address this problem, including support for PhD scholarships and school learning related to AI and data science.

“However, AI experts are issuing warnings that greater levels of spending will be needed for Australia to keep up with other countries that are lavishing public funds on AI initiatives,” Deloitte’s Marshall said.


Copyright © 2020 IDG Communications, Inc.
