Artificial intelligence is starting to eat the world, one step at a time, and IT operations is no exception.
Although still early in deployment, companies are taking advantage of AI and machine learning to improve tech support and manage infrastructure. Here, natural language processing is proving to be a valuable IT tool. The technology, which fuels most customer service chatbots, is being put to use in internal IT operations to improve tech support and user interfaces.
Credit Suisse Group AG, for example, rolled out a chatbot last December to help process routine requests such as password resets and computer reboots.
“We were primarily a voice-only support center, which didn’t enable us to have efficiencies in terms of how we handled our users’ queries,” says Jennifer Hewit, the company’s head of cognitive and digital services.
Employees who called in about problems would have to wait in a telephone queue for the next available agent, she says, an approach that doesn’t scale well. “So we wanted to provide a new channel into the service desk, and introduce chat for a quicker response and action to our users,” she says.
Credit Suisse first started thinking about this in late 2016, chose the Amelia chatbot system from IPSoft in early 2017, and began the installation in June. It was up and running by the end of the year.
“When we introduced her, she was a baby,” Hewit says, referring to the chatbot. “I’d say she’s an infant now. We’re still spending time training her brain.”
For example, when the chatbot is not able to handle a request, it’s escalated to a human agent. The chatbot follows along with the conversation, learns from it, and this learning is reviewed by the bank before it goes back into the chatbot, to keep mistakes and biases from creeping in.
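That escalate-learn-review loop can be sketched roughly as follows. All class, method, request, and answer names below are invented for illustration; this is a minimal sketch of the workflow the bank describes, not IPSoft’s Amelia API.

```python
class ReviewedChatbot:
    """Toy escalation loop: requests the bot can't handle go to a human
    agent, and the resulting answers only enter the bot's repertoire
    after a human reviewer approves them."""

    def __init__(self):
        self.known_answers = {"password reset": "Sending a reset link now."}
        self.pending_review = []  # transcripts awaiting reviewer approval

    def handle(self, request):
        if request in self.known_answers:
            return self.known_answers[request]
        # Escalate: a human agent resolves it while the bot follows along.
        answer = self.escalate_to_agent(request)
        self.pending_review.append((request, answer))
        return answer

    def escalate_to_agent(self, request):
        # Stand-in for a live hand-off to a human support agent.
        return f"[agent] resolved: {request}"

    def approve_reviewed(self, approved_requests):
        # Only transcripts a reviewer approves go back into the bot,
        # keeping mistakes and biases from creeping in.
        for request, answer in self.pending_review:
            if request in approved_requests:
                self.known_answers[request] = answer
        self.pending_review.clear()
```

The key design point is the holding area: nothing the bot observes during an escalation changes its behavior until a person signs off on it.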
The new system serves 76,000 users in 40 countries around the world, and has allowed Credit Suisse to move some of its 80 tech support agents to higher-level support. “My ambition is by the end of the year to have automated 25 percent of the queries coming into the service desk,” Hewit says.
With its ultimate goal of freeing up one-third of its tech support staff, Credit Suisse’s use of AI in IT underscores the impetus fueling the trend: to empower IT personnel to drive deeper business value by handing over lower-level work to machines better suited to those tasks.
Using AI to secure and inspire
Texas A&M University System is another organization putting AI to work in IT, deploying Artemis, an intelligent assistant from Endgame, to help new staffers keep the university secure from cyberattacks.
“We monitor the networks for 11 universities and 7 state agencies,” says Barbara Gallaway, security analyst at Texas A&M University System. Gallaway’s team includes nine full-time staff and eight part-time student workers who don’t have the experience required to deal with security incidents.
The AI system enables her staff to ask questions in plain English, helping to train them in their jobs as a side benefit. “It’s on-the-job training and doing the job at the same time,” Gallaway says.
“We just did a new round of hires in January, and it literally took them two hours to figure out what they were doing and jump in and do their job,” she says. “They learned at a quicker pace, and we had fewer people coming to our full-time people asking questions. There was less searching they had to do on Google, or watching learning videos.”
It also has had a positive effect on recruitment, she says. Two years ago, when the team was hiring three security analysts, it couldn’t find enough applicants for the jobs.
“This January, we had 88 applicants for seven openings,” she says. “The word of mouth was that what we were doing was fun. They actually got to do investigations, they weren’t just sitting there staring at the screen, and they’re getting real-world experience. I hope it will make more people want to do cybersecurity.”
Murphy Oil, headquartered in Arkansas, is an oil company with operations in the U.S., Canada and Malaysia, and 1,200 employees around the world. The company has been moving its infrastructure from traditional on-premises and colocation to cloud and SaaS models for the past year, but the biggest savings have been from adding intelligence to the management of its cloud infrastructure, says Mike Orr, IT director of digital transformation at Murphy Oil.
“If you just lift and shift your workload to the cloud, you’re not going to save any money,” he says. “It might even cost you more.”
The cloud does allow for significant flexibility, but it can take a lot of people to adjust the workloads, and that adds up. So the company turned to an AI-powered system from Turbonomic to make recommendations about how to optimize the infrastructure. But the real payoff came once Murphy Oil grew comfortable with the system and trusted it to perform placement and sizing automatically.
“There’s another setting that said: based on these learnings, we’ll take these actions; are you okay with that? Once we turned that on, we found that the software made a better decision than my people did,” says Orr. “It let the data drive the decisions rather than gut and emotion.”
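A minimal sketch of that recommend-first, automate-later progression might look like the following. The utilization thresholds and field names are invented for illustration; Turbonomic’s actual decision engine is far more sophisticated.

```python
def rightsize(vm_cpu_utilization, act_automatically=False):
    """Toy rightsizing policy: flag underused and overloaded VMs,
    and either recommend the change or apply it, depending on how
    much the operator trusts the system. Thresholds are illustrative."""
    actions = []
    for vm, cpu_util in vm_cpu_utilization.items():
        if cpu_util < 0.20:
            actions.append((vm, "scale down"))
        elif cpu_util > 0.80:
            actions.append((vm, "scale up"))
    status = "applied" if act_automatically else "recommend"
    return [(status, vm, action) for vm, action in actions]
```

Flipping `act_automatically` to `True` is the moment Orr describes: the same logic runs, but the data, rather than a person’s gut, drives the change.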
Prior to the move, Orr had four and a half full-time equivalents working on nothing but tickets. “Now it’s one-tenth of an FTE,” he says.
That enabled Murphy Oil to move staff from basic operations and maintenance to business enablement. One employee, for example, is now learning about automating processes so that the company can move further up the maturity curve.
“There’s always a backlog of projects,” Orr says. “We don’t have any intention of laying people off.”
Ohio’s North Canton school system had a different infrastructure management challenge: keeping its wireless network up across the entire campus. That includes making sure user laptops and mobile devices can connect correctly.
There are about 4,400 students, 650 staff members, seven buildings, and between 6,000 and 8,000 devices total on the network, with just three people to manage the network. Last August, the district switched to Mist Systems for wireless network management, and, as an added benefit, got a new AI-powered interface.
“It does feel faster, and we can drill into things quicker,” says John Fano, systems administrator at North Canton City Schools. “You can say, ‘What’s going on with access point one,’ and it will show you all the information about it, and you can drill further into it.”
In addition to the natural language interface, there’s also AI on the back end, analyzing network activity. “We’ve been using it all year long to find little things on the network that we didn’t even know were going on,” he says.
Last year, for example, his team spent nine months doing packet captures and traces to prove to their supplier that staff laptops had faulty wireless cards. “Under Mist, we were able to see the problem, all the packet information, everything, almost in real time, and duplicate the problem in about an hour,” he says.
Mist finds network problems by analyzing an organization’s own data, and combining it with anonymized reference data from other customers who opt in to the data sharing. Depending on the specific algorithms, the learning is either continuous or batched, says Bob Friday, CTO at Mist Systems. Other algorithms use supervised training models that change on the order of hours, he says.
Since the AI is baked into the product, even enterprises with no in-house expertise in artificial intelligence can still benefit from the technology, Friday adds.
Once known for its cameras, Tokyo-based Konica Minolta began using the AI-powered IT infrastructure management tool ScienceLogic internally in early 2017 in support of its office and IT services business, to help predict which equipment was about to break down.
At first, the predictions were about 56 percent accurate, says Dennis Curry, the company’s deputy CTO, but the system learned over time. “Now we can predict that something will fail in the next two weeks 95 percent of the time,” reducing downtime and lowering overall costs, he says.
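As a rough illustration of this kind of predictive maintenance, a model might flag a device whose recent error rate climbs well above its own history. The windowing and threshold below are invented; the production system Curry describes learns from far richer telemetry than a single error count.

```python
def predict_failure(daily_error_counts, window=7, threshold=2.0):
    """Toy predictive-maintenance signal: flag a device when its
    average error count over the last `window` days exceeds
    `threshold` times its historical average."""
    history = daily_error_counts[:-window]
    recent = daily_error_counts[-window:]
    baseline = sum(history) / len(history)
    return sum(recent) / window > threshold * max(baseline, 1e-9)
```

Improving from 56 percent to 95 percent accuracy, as Konica Minolta reports, is essentially a matter of refining signals like this one against observed failures over time.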
The company is adding the technology to Workplace Hub, its ScienceLogic-powered IT management platform, which should be available later this year.
Nlyte Software is also planning to offer an AI-powered predictive maintenance tool. Powered by IBM’s Watson technology, Nlyte uses general information from customers to gather insights about commonly used equipment, and combines it with learning from individual customer environments.
“We have patterns that we have built, and we provide those to customers,” says Enzo Greco, the company’s chief strategy officer. “But we have found that every customer’s environment is slightly or significantly different, so we also provide a tool kit for customers to create their own use cases, their own AI patterns.”
The top two uses are energy optimization, such as where to place new servers to optimize thermal conditions, and workflow optimization, which is about where to place workloads.
These kinds of tools are typically custom-built by companies, tailored to their own operations. Now, with off-the-shelf software and prebuilt models, enterprises can get up and running quickly, without having to have deep AI expertise in-house.
Netherlands-based Interxion is one company already seeing savings from using machine learning to improve operations. A couple of years ago, the company, which operates 50 data centers in 13 cities around the world, began deploying data center infrastructure management (DCIM) technology EcoStruxure from Schneider Electric.
“We are building, typically, four new data centers a year,” says Lex Coors, the company’s chief data center technology and engineering officer. “That gives us the opportunity to look back and see how the old ones are doing without any EcoStruxure, what are the ones doing with the early version of EcoStruxure, and the latest version.”
The early versions were difficult to use, he says. They provided plenty of information, but more staff were needed to make sense of the data and to make decisions and implement them.
“Even with the newer system, it provides you with so many recommendations,” he says. “I can implement recommendations all day.”
The latest iteration of the product includes more intelligence, and now the savings are coming through, he says.
Savings in the replacement capital expense budget were between 1 and 2 percent, he says. “In the maintenance opex budget, I’m looking at a 10 percent decrease with the full benefit of all the analytics.”
That’s because the company can do the right maintenance, at the right time, to avoid equipment breakdowns, he says, adding that there are also recommendations for optimizing energy efficiency.
But even the latest versions still need work. “It can tell me today to change to this temperature, and tomorrow to another one, and the next day back to the first one because that’s the best decision at the time,” he says. The system should make recommendations based on long-term projections, and prioritize them, he says, adding that his company is working with Schneider to improve the system.
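One simple way to get the stability Coors is asking for is to damp recommendations over time, for instance by changing a setpoint only when the same recommendation has held for several consecutive days. The rule and window below are a hypothetical sketch, not Schneider’s approach.

```python
def damped_setpoint(current, daily_recommendations, hold=3):
    """Toy damping rule: adopt a recommended temperature setpoint
    only when the last `hold` daily recommendations all agree;
    otherwise keep the current setpoint and avoid flip-flopping."""
    last = daily_recommendations[-hold:]
    if len(last) == hold and len(set(last)) == 1:
        return last[-1]
    return current
```

A production version would weigh long-term projections and prioritize among recommendations, as Coors suggests, but even this crude hysteresis prevents the day-to-day reversals he describes.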
“The machine learning capabilities of our DCIM systems are still limited,” he says. “If I look at our data centers today, and think about what we can do with machine learning, there is not yet much.”
But the technical capabilities of AI in DCIM are likely to expand.
“This is a whole new area, a new development for the industry, and it’s powerful,” says Rhonda Ascierto, research director for data centers and critical infrastructure at 451 Research, who mentions Eaton as another vendor in the DCIM space. “I think it’s the beginning of a long-term evolutionary change towards integrating physical data center management with many other services. As the technology evolves, other data and services are likely to be added, including integrated workload management, energy management, staff services, and security and network management.”
All of that is going to take time to play out, she says, and the more data vendors can gather, the smarter and more valuable their platforms will be. As a result, customers can get these tools at very low cost.
It’s all about point solutions
But a general-purpose AI-powered platform for IT operations remains elusive, says Michele Goetz, an analyst at Forrester Research.
“There are still no AI systems that really could replace a database administrator or systems administrator,” she says. “We still have a few years for these AI solutions to mature, and we also need time for enterprise organizations to have a better vision for what implementing AI in their IT environment should look like.”
One challenge is that AI currently needs large volumes of training data, and that is available only for particular types of problems. In addition, systems need to be able to talk to each other better than they do today, says Shannon Kalvar, analyst at International Data Corp.
“Technically, if we were to see the convergence of IT service management and IT operations management, we can do it in two to three years,” he says. “The technical hooks exist. But to be honest, I’m not seeing a lot of that kind of design thinking.”
It’s not just about automating processes, he says. “Right now, we rely on people’s experience, support people, operations folks, to understand and link it all together. And I don’t want to be unduly harsh, several vendors are making steps in that direction, but we’re not there yet.”
He calls it a process abstraction layer, an integrated intelligence.
According to a Turbonomic survey of 750 IT operations managers that the company conducted together with Red Hat and AppDynamics, 68 percent said they are not yet leveraging AI for IT management, while 24 percent said they’re experimenting with it.
However, 84 percent said they believe AI could reduce complexity by creating self-organizing systems.
“I would argue that this is the space a lot of people want to play in but I’m not sure any of us are there yet,” says IDC’s Kalvar.