Building modern data platforms to exploit the power of AI

BrandPost | In association with CANCOM
Sep 18, 2019

Giving employees the same ease of access to enterprise data that they are accustomed to in their consumer lives is the key to delivering business value.

This is the challenge laid down to CIOs by Mark Skelton, CTO at CANCOM. Only by looking at enterprise data architectures fundamentally differently and moving away from data silos, he stresses, can advanced data science techniques such as AI and machine learning shine a light into your organisation's dark data, opening up new revenue streams and business outcomes.

“Culturally we are trying to shift to a more open platform. I think we have seen this in the consumer world, where you have the internet in your pocket these days and you can go and search for anything and find any bit of data that you want,” he told the CIO podcast. “Yet our experience in enterprise is very different to that, you go and look at something and you have to know where to go to find that data. Quite often IT don’t know where the data pockets are and what the value is in those data silos.”

CIOs still face “old world” issues of having enough storage for their data in day-to-day operations, but they also have new challenges, adds Alex Haddock, HPE’s Chief Technologist for Hybrid IT.

“If you’re going to do digital transformation you really need to monetise that data, and that’s where CIOs are starting to struggle the most because they’re having to go back to the business and show how they can bring in new revenue streams and new business outcomes.”

By 2021, three-quarters of business transformation projects are expected to have an AI component, with market research firm IDC predicting that worldwide spending on AI technologies will exceed $52 billion by the same year. It should come as no surprise, then, that businesses across a wide range of industry sectors agree that AI and machine learning offer them numerous opportunities, such as the ability to identify complex patterns and make predictions from huge volumes of data.

However, to get to a place where AI is really making a difference, organisations first need to overcome some common data challenges, centring on storage, access and classification.

Creating a data culture

To get around such challenges, organisations need to develop a firm data foundation that will allow them to gain business advantage from AI, particularly the latest breed of powerful and scalable cloud-based AI and analytics.

The first step to doing so is to create a data culture, where business and technology professionals alike are aware of the need for high-quality, consistent and usable data that no longer resides in hidden silos.

Secondly, for AI to be truly effective, organisations need to get their data into shape with an effective data classification lifecycle that incorporates unstructured data. It’s essential to understand where your datasets reside and what sensitive data you hold, and to identify and eliminate redundant, obsolete or trivial (ROT) data.
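
As a rough illustration of what such a lifecycle could look like in practice, the sketch below (in Python) walks a file share, tags files that match a couple of stand-in sensitive-data patterns, and flags ROT candidates by age. The share path, the patterns and the three-year threshold are illustrative assumptions, not any vendor's tooling.

```python
import re
import time
from pathlib import Path

# Illustrative patterns only; a real classification pass would use a far richer
# rule set (or a trained classifier) covering the data your organisation holds.
SENSITIVE_PATTERNS = {
    "email_address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_ni_number": re.compile(r"\b[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]\b"),
}

ROT_AGE_DAYS = 3 * 365  # assumption: untouched for three years = ROT candidate


def classify_file(path: Path) -> dict:
    """Return a simple classification record for one file."""
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        text = ""

    tags = [name for name, pattern in SENSITIVE_PATTERNS.items() if pattern.search(text)]
    age_days = (time.time() - path.stat().st_mtime) / 86400

    return {
        "path": str(path),
        "sensitive": tags,                         # which sensitive patterns were spotted
        "rot_candidate": age_days > ROT_AGE_DAYS,  # flag for redundant/obsolete/trivial review
    }


def scan_share(root: str) -> list[dict]:
    """Walk a file share and build a classification inventory."""
    return [classify_file(p) for p in Path(root).rglob("*") if p.is_file()]


if __name__ == "__main__":
    # "/mnt/shared-drive" is a placeholder for wherever your unstructured data lives.
    inventory = scan_share("/mnt/shared-drive")
    sensitive = [r for r in inventory if r["sensitive"]]
    rot = [r for r in inventory if r["rot_candidate"]]
    print(f"{len(inventory)} files scanned, {len(sensitive)} with sensitive data, {len(rot)} ROT candidates")
```

An inventory like this is only a starting point, but it gives IT a concrete list of where sensitive data sits and which datasets are candidates for retirement before any AI work begins.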

Data discovery and classification are the foundations of effective data analysis and of technologies like AI and ML, argues Skelton.

“It’s really important to understand the characteristics of your data, and if you get that defined within your organisation, you can then look at how you begin to feed those into learning algorithms with machine learning and similar technologies.”

As part of this lifecycle, business leaders need to develop data classifications that enable their organisations to meet their security and privacy responsibilities, such as those under the EU’s General Data Protection Regulation. Then, once you understand your datasets, you can successfully deploy AI to gain business advantage from your data and deliver a more competitive customer experience.
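
To make Skelton's point about feeding classified data into learning algorithms concrete, the toy sketch below uses scikit-learn to train a very small model that triages documents likely to contain personal data. The sample texts, labels and the TF-IDF-plus-logistic-regression choice are invented for illustration and are not part of any CANCOM or HPE offering.

```python
# Documents that a manual classification pass has already labelled as containing
# personal data (1) or not (0) train a model that can triage the rest of the estate.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

labelled_docs = [
    ("Invoice for customer Jane Doe, jane.doe@example.com, card ending 4242", 1),
    ("Q3 datacentre capacity report: storage utilisation at 71%", 0),
    ("HR onboarding form: date of birth, home address, next of kin", 1),
    ("Weekly traffic and weather feed summary for route planning", 0),
]
texts, labels = zip(*labelled_docs)

# TF-IDF features plus logistic regression: about the simplest usable baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Triage an unseen document: a high score suggests it needs GDPR-style handling.
new_doc = ["Support ticket containing caller name, phone number and postcode"]
print(model.predict_proba(new_doc)[0][1])
```

The value here comes less from the model than from the classified data behind it: without consistent labels, no learning algorithm has anything reliable to learn from.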

Building a modern IT infrastructure

Your data strategy also requires a modern IT infrastructure that can handle and exploit your data, and scale up or down as required.

Consider deploying agile infrastructure, such as hybrid and public cloud platforms, which offer developers mechanisms such as PaaS and containers. Also keep in mind hyperconverged and software-defined datacentre hardware, which offers agile, scalable and cost-effective data access, making it ideal for AI applications.

Emerging IT infrastructures have the flexibility and scalability to cope with distributed and diverse data, from fluid IoT-generated information such as weather or traffic to web-related data such as browsing and shopping patterns. With AI at the back end making sense of this complex information, the business potential is huge.

But it requires the right IT infrastructure and access to your data, says Haddock. “From the infrastructure and data perspective we need to make sure we’ve got visibility across the sources. And as we start to go into the merging of big data and AI analytics, you need to have access to that data.”

Building a data foundation

Managing your data so that you can extract business insights and value begins with understanding it, standardising it, and classifying it, advises CANCOM.

The next step is to create a data culture in which you look beyond your own four walls to incorporate publicly available data. Visualise your data using infographics and dashboards, and classify it to ensure that it is protected and secure.
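
As a simple sketch of the visualisation step, the snippet below uses matplotlib to chart how much of a data estate falls into each classification bucket. The categories and counts are placeholders standing in for the output of a real inventory scan.

```python
# A minimal at-a-glance view of the kind a dashboard might provide:
# how much of the estate is sensitive, ROT, unclassified or clean.
import matplotlib.pyplot as plt

categories = ["Sensitive", "ROT candidates", "Unclassified", "Clean"]
file_counts = [1240, 8630, 22400, 15900]  # illustrative numbers only

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(categories, file_counts)
ax.set_xlabel("Files")
ax.set_title("Data estate by classification")
fig.tight_layout()
fig.savefig("classification_overview.png")
```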

But it’s important to get the data foundation right first of all, Skelton says: “What IT needs to do is understand what the business needs from the dataset, and understand the value that needs to be created from those. Finding out what data you’ve got is one thing, but actually understanding what to do with it is the next piece.”

So, with the right data foundation in place, built on culture, a modern data and infrastructure set-up, and an understanding of what data is truly at your disposal, you will be in the best position to exploit the power of artificial intelligence.