by George Nott

Adopting AI: The Big 5 factors holding back Aussie businesses

May 15, 2018

While many Australian companies view artificial intelligence as an integral part of their future success, very few have fully embraced the technology.

According to a global Accenture survey, some 82 per cent of Australian executives believe that within the next two years artificial intelligence will work alongside humans in their organisations as a co-worker, collaborator and trusted advisor. Accenture predicts that by 2022, firms that adopt AI could boost revenues by up to 38 per cent.

But rolling out AI at scale across a company is far easier said than done. And despite the lofty ambitions, very few businesses locally have taken AI beyond the experimental stage.

It’s little wonder. There are still significant hurdles to be overcome when adopting AI in a significant way.

At the AI NSW Summit in Sydney last month, CIO Australia sat down with Accenture’s Applied Intelligence Lead for ANZ, Amit Bansal, and its Artificial Intelligence Lead for ANZ, Brad Ryan, to discuss the five big hurdles facing local organisations.

A lack of available talent and skills

It will come as little surprise that companies in Australia are facing a shortage of AI skills. More than half the respondents in a Gartner survey last year indicated that the lack of necessary staff skills was the top challenge to adopting AI in their organisation.

“What we’re finding is the universities are producing a lot of graduates who have done courses relating to AI and we’re starting to see talent come through at that junior level,” Ryan says. “But it’s the deep experts that have been doing it for a long time; that’s where we’re limited in the number of people we have.”

However, many AI academics working in research institutions are keen to start applying their work in the real world, Bansal says. Many of them are joining Accenture to do just that.

“They are desperate to move into applied AI. They want to go and solve real-world problems and they see Accenture as this fantastic opportunity to get into big clients and big problems. So it’s actually surprisingly easy to attract talent; there just isn’t a lot of it,” he says.

Getting stuck at the proof of concept stage

“It takes really strong sponsorship within an organisation. Realistically it needs to come from top down,” Ryan says.

“Often the projects are driven by a CTO organisation or some small department that’s experimenting. They’ve proved it out but they didn’t get any of the business along on the journey. So when it comes to deployment the business goes, ‘I don’t see why I would do it’. They focused on the wrong problem. They focused on ‘look at this tech, how cool is this’, and the business goes, ‘well, I’m not going to fund you because it makes no sense’.”

It is also very hard to make the leap from proof of concept to production.

“Turning it into something is hard, right? To build one of these things in the cloud with a limited dataset, maybe not even real data, is relatively cheap and quick, and you can do it. But the moment you go, ‘I want to use real data and IP’, and need all the security requirements and infrastructure capability and all of that, it’s hard. Suddenly you’re going out and you’re impacting customers, and so there are all of the brand implications,” says Bansal.

“We’re just finding people are not ready to invest like that wholesale yet. It’s a big step.”

Regulation is still playing catch up

“Autonomous vehicles is a perfect example: yes, the cars are ready, but the roads are not ready, the regulation’s not ready, the insurance companies are not ready. Similarly, when you bring in AI, say with financial advice, if the AI gives bad advice, who is at fault? At the end of the day certain things will need to be tested in the courts. The regulation is a long way away,” says Bansal.

The lack of guardrails and clarity is making enterprises hesitant to move ahead with AI in certain areas, Bansal adds.

“If I’m a trailblazer, is the government going to be all over me? Is the regulator going to be all over me so I can’t deploy?” he says.

While regulators in the European Union are beginning to consider AI, and China’s relative lack of regulation is allowing innovation to blossom, the needed discussion by government “is not happening” in Australia.

“We need to start lobbying government to start thinking about these things,” Bansal says.

Data veracity

As the old CIO adage goes: garbage in, garbage out. This is especially true for artificial intelligence.

“We must give them high quality information. The fuel for an AI is data and lots of it. Just like cars, to run properly, AIs require fuel that is clean and fit for purpose. AI solutions trained on data that is incomplete, misrepresentative or biased will make decisions that would be obviously incorrect to a human,” says Ryan.

“Governments and organisations need to invest in cleansing and curating their data to the quality needed to train their AI solutions. They must also ensure data is continually updated and models retrained as products, markets and services change. Doing this ensures trained models will continue to reflect the current circumstances, in the same way people continually learn through new experiences,” he added.

A fear of the ecosystem

“Here in Australia we’re still looking at AI as a source of competitive advantage, people are playing very close to their chest,” Ryan says. “The vendors are very much trying to orchestrate that collaboration, and Accenture and I’m sure our competitors are advocating for places like universities to try and engage this broader ecosystem. But for now most people are saying, ‘I’ve got my secret solution and I don’t really want to share that with my competitors.'”

But being too closed about your company’s progress with AI runs the risk of stifling its success.

“That’s going to be inhibitive to growing. If you think this is your secret sauce you don’t want to share, you’re not going to help your industry,” Bansal adds.

Sharing experiences with AI needn’t go as far as open-sourcing everything; a little discussion and collaboration with peers goes a long way.

“I think that’s what’s holding us back here,” Bansal says.