by David Binning

A CIO’s guide to AI: Australian artificial intelligence suffering from arrested development

Mar 30, 2020
IT Leadership

In part 1 of this guide, we provide a view of the AI landscape in Australia. Where is AI being deployed? Expectations versus realities? Are adoption rates falling behind the rest of the world? What is holding AI back in this country?


Australia is lagging other developed economies in the deployment of artificial intelligence technologies thanks to a perfect storm of lower confidence levels, skill shortages and higher levels of general anxiety around ethical, legal and security challenges. 

“The first hurdle we’re dealing with is the topic of AI fluency,” Deloitte Australia’s national analytics lead, Alan Marshall, told CIO Australia. 

While there are plenty of examples of successful AI deployments in Australia, few organisations seem to have a firm grasp of what AI can do and what it can’t, and therefore how to intelligently plan for and deploy the technology, and measure the outcomes. 

“We need businesses to develop an intuitive understanding of what AI is good at,” Marshall said. “But in my experience that’s not happening on the ground with organisations.”

It’s a sentiment reflected in Deloitte’s report ‘How countries are pursuing an AI advantage’, published last year. It asked early adopters of AI in the US, UK, China, Germany, France, Canada, Australia and New Zealand about their experiences deploying the technology.

The upshot was that Australian organisations have less confidence about AI’s potential to develop real competitive advantage, while they are also more worried about the technology’s downsides, especially when compared with China.

For instance, Deloitte found that early adopters of AI in Australia were less ambitious about the technology’s potential impact on their businesses, viewing it more as a means to ‘catch up/keep on par’ rather than ‘widen lead/leapfrog ahead’. Deloitte puts only 22 per cent of Australian companies in the second group, compared with 55 per cent for China, 47 per cent for Germany, and 44 and 37 per cent respectively for the UK and US. Overall, Australia ranked seventh on this scale, with Canada and France also ahead. 

Further, 41 per cent of Australian respondents said their organisation either has no real AI strategy or only disparate departmental strategies, compared with the global average of 30 per cent.

This is despite the fact 79 per cent of Australian organisations surveyed by Deloitte reported AI will be “very” or “critically” important to their business within two years.

Although the Deloitte report appears to score Australia quite low on key indicators compared with other countries, an article by Gartner last year suggested low levels of AI maturity internationally.

One of the analyst’s ‘strategic planning assumptions’ is that throughout 2021, “75 per cent of AI projects will remain at the prototype level as AI experts and organisational functions cannot engage in a productive dialogue.” 

Professor Michael Blumenstein, associate dean with the University of Technology Sydney’s faculty of engineering and IT, attributes this apparent arrested development to vendors over-promising and under-delivering, especially when it comes to off-the-shelf AI products. 

Natural language processing is one area he feels warrants a little more scepticism, despite Gartner’s recent AI hype cycle report highlighting it as the most important and advanced area of AI currently.

Blumenstein said this is a big area needing further research and fine-tuning.

“There are apps out there that work and ones that don’t. I don’t want to disparage international companies that make claims that aren’t there, [but] some are making very big claims around what their technology can do. The reality is there’s a lot of fine-tuning in order to translate into results. There are no off-the-shelf solutions,” he said.

But while CIOs in some organisations are finding it difficult to convince the board and bean-counters to stump up for AI projects, others are finding themselves in the opposite position, coming under increasing pressure to deploy AI.

“Boards are now putting pressure on CIOs and other members of the executive,” Blumenstein said.

The latter scenario is playing out especially in industries where huge data sets are being generated, and where there is a growing acceptance that making better use of them will breed efficiencies and competitive advantage.

The majority of projects have been in the financial services, healthcare, agricultural and energy/resources industries, roughly in line with the three pillars identified in Data61’s AI Roadmap report, published last year:

–   Natural resources and environment

–   Health, ageing and disability

–   Cities, towns and infrastructure. 

The roadmap highlights a number of promising AI initiatives, including the Queensland University of Technology’s AgBot II, which uses computer vision and machine learning to classify weeds and determine the best way to eliminate them, potentially saving Australia’s farm sector $1.3 billion a year.

Or there’s Data61’s ‘Spark’ system, which uses machine learning to map fuel loads, terrain, climate and fire fronts to help minimise bushfire risks and improve emergency response.

The University of New South Wales recently revealed details of a project applying machine learning to ‘second guess’ the intent of hospital patients unable to communicate, for instance basic things like ‘sit up’ or ‘move left arm’, that would be sent to devices such as computerised wheelchairs.

The Australian public sector has also begun to look at AI. For example, Queensland’s Treasury became the first public sector agency in the world to deploy SAP’s Leonardo machine learning technology, which is being used to assess around 200 million taxpayer records in the state to better understand who would and wouldn’t pay, and why. It has reduced land tax debts by five per cent.

And in something of a rebuke to Blumenstein’s scepticism about natural language AI, several Australian organisations are reporting encouraging results with the technology, including Suncorp Bank, which has been using voice capabilities in IBM’s Watson to analyse customer sentiment during call centre interactions and improve CX. 
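The article doesn’t describe how Suncorp’s sentiment analysis actually works, and Watson’s internals are far more sophisticated, but the basic idea of scoring a call transcript for customer sentiment can be illustrated with a minimal, purely hypothetical lexicon-based sketch (the word lists and function name are invented for illustration):

```python
import re

# Hypothetical sentiment lexicons -- real systems learn these from data
# rather than hand-coding them.
POSITIVE = {"great", "thanks", "resolved", "helpful"}
NEGATIVE = {"frustrated", "waiting", "complaint", "cancel"}

def sentiment(transcript: str) -> float:
    """Score a transcript from -1.0 (all negative words) to +1.0 (all positive)."""
    words = re.findall(r"[a-z']+", transcript.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    # Neutral transcripts (no lexicon hits) score 0.0.
    return (pos - neg) / max(pos + neg, 1)

print(sentiment("thanks that was helpful, issue resolved"))    # 1.0
print(sentiment("I'm frustrated, I've been waiting to cancel"))  # -1.0
```

A production system would layer speech-to-text, context and negation handling on top, but even this toy version shows why call centres find the output useful: it turns free-form conversation into a trackable number.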


Days of miracle and wonder

Notwithstanding the growing pockets of success, AI in Australia is also being hindered by the intense hype being generated by the industry, media and general public.

“AI is overhyped in the industry,” said UTS’s Blumenstein, who says businesses’ understanding, and therefore expectations, of AI are completely distorted. He’s particularly concerned about the growing number of CIOs with less direct experience of tech, especially those in situations where they may be feeling pressure to accelerate deployment of AI.

“A certain number of CIOs today might be susceptible to misinformation,” he said.

Gartner VP and distinguished analyst, Whit Andrews, invokes music great Paul Simon: “These are the days of miracle and wonder.”

“When we see a machine can teach itself how to play chess elegantly with nothing but the rules of chess, learning opening moves acquired over centuries in less than 48 hours every single time, it creates a sense of wonder that can in some cases change into a sense of paralysis or fear,” Andrews said.

One example of a growing anxiety is organisations worrying AI may be too complex to integrate into existing IT systems.

But the biggest anxieties appear to be around the ethical and security risks posed by AI, with Australian organisations among the most worried.

Deloitte reports 43 per cent of all its survey respondents have ‘major’ or ‘extreme’ concerns about potential AI risks, led by cybersecurity, making the wrong decision based on AI, and ethics concerns around bias.

Just under 50 per cent of Australian companies reported major or extreme concerns about potential AI risks, yet only 38 per cent said they were fully prepared to deal with them. Looking at China, however, 42 per cent of organisations reported being fully prepared, while just 20 per cent expressed real concerns.

UTS’s Blumenstein thinks Australian concerns are partly a by-product of the wider hype and confusion.

“There have been unrealistic expectations in business, and fear in the community fuelled by people who want to create a panic,” he said.

“The public fears it even though it’s nothing like what human rights and ethical groups are trying to portray it as.”

There have, however, been many examples of AI systems caught out making decisions based on biased – albeit innocently acquired – assumptions.

For example, Amazon came under fire years ago after it was revealed to be relying on an AI tool to sort through candidate resumes that was actively discriminating against women. The model had been trained on 20 years of historical resumes, most of which came from men, so it learned to associate male candidates with hiring success.

Left to its own devices, the system began filtering out CVs containing terms like ‘women’s basketball’ and the names of women-only colleges. It ran for some five years before the problem was detected. 
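The mechanism behind this kind of failure is simple enough to sketch. Below is a deliberately naive, hypothetical screener (the data and scoring rule are invented, not Amazon’s actual system) that scores each word by how strongly it is associated with past hires; because the history is skewed, a gendered term drags an otherwise identical resume down:

```python
from collections import Counter

# Hypothetical training history, skewed towards male hires.
hired = [
    "rugby club captain software engineer",
    "chess club software engineer",
    "software engineer rugby",
]
rejected = [
    "women's basketball captain software engineer",
    "women's chess society software engineer",
]

hired_counts = Counter(w for cv in hired for w in cv.split())
rejected_counts = Counter(w for cv in rejected for w in cv.split())

def score(cv: str) -> float:
    """Sum per-word 'hire association' learned from the biased history:
    +1 for a word seen only among hires, -1 for one seen only among rejects."""
    total = 0.0
    for w in cv.split():
        h, r = hired_counts[w], rejected_counts[w]
        if h + r:
            total += (h - r) / (h + r)
    return total

# Two resumes identical except for one gendered term score differently:
# the bias is inherited from the data, not programmed in.
print(score("chess club captain software engineer"))          # 1.4
print(score("women's chess club captain software engineer"))  # 0.4
```

Nobody told this model to penalise women; it simply reproduced the pattern in its training data, which is exactly why such systems need auditing before and after deployment.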

Problems with racial profiling have also been a major source of concern, especially for African Americans and Hispanics in the US. Most notorious is the COMPAS platform, used by US courts to assess the risk of defendants reoffending, which was found to disproportionately flag African Americans as high risk.

It’s anticipated that Australians will face their own challenges around system bias and ethics. The infamous ‘robodebt’ program has parallels with the above examples; while not strictly an AI system, it at least illustrates the sort of community and political backlash that can be unleashed when automated decision-making is left unchecked.

Interestingly, more than 20 pieces of Commonwealth legislation currently allow decisions to be made by computers, while seemingly benign new apps and systems are launched all the time that may come into conflict with existing and evolving Australian privacy and human rights laws.

So watch this space.

Meanwhile, Deloitte’s Marshall points to the lack of a strong local tech industry, Australia’s relatively unfavourable investment environment, and low levels of government investment, which he describes as “chicken feed” compared with countries like the US, UK, Germany and especially China.

“Australia doesn’t enjoy a natural competitive advantage like with hubs in China, Silicon Valley, pockets in Israel, and stuff in Germany,” Marshall lamented. “We don’t have proximity to centres of gravity for tech and tech development.”

And this is making Australia’s AI and general data science skills shortage worse.

Gary Adler, chief digital officer with the local subsidiary of global law firm Minter Ellison, observes many of our best data scientists are being lured overseas.

“Certainly in Australia what we’re seeing is some of the really good resources are being attracted straight to Silicon Valley, sometimes to Israel, sometimes into the EU,” he said.

The federal government has announced various initiatives to address this problem, including support for PhD scholarships and school learning related to AI and data science.

“However, AI experts are issuing warnings that greater levels of spending will be needed for Australia to keep up with other countries that are lavishing public funds on AI initiatives,” Deloitte’s Marshall said.