Everything about your job has changed. It’s time your metrics did too.
Craig Williams, CIO of telecommunications networking company Ciena, knows exactly how his IT team will be evaluated by the company’s leaders in 2019. They will look at a wide range of metrics that include things like talent management (days to fill open positions, number of employees completing management development), profit participation (revenue per IT employee), and change management (rate of adoption for new social media, data, and collaboration tools).
It’s a big change from how things used to be. “IT used to be primarily measured on things like uptime and meeting service-level agreements,” he says. IT would shoot for targets like 99.999 percent availability, and have goals for number of tickets resolved. These days, IT still keeps track of those metrics “but we do this for ourselves, not necessarily for our [internal] customers,” Williams says. “They’re trusting in our expertise, so their needs have shifted toward value, business enablement, and profit contribution interests.”
In a survey of 900 IT leaders conducted by unified communications company Fuze in 2017, more than three quarters said they believed IT could drive business success and that IT’s ability to innovate was critical to the business. Yet most thought business leaders were too focused on cutting costs, and 49 percent said their IT departments’ success wasn’t being measured the right way. “What our survey showed is that IT teams are really interested in being judged on their ability to innovate and handle change management,” says Chris Conry, CIO of Fuze.
Clearly, IT leaders are eager to take their place as partners to business leaders and are ready to guide their organizations into a digital future. So why are so many still measuring their own success with outdated metrics that may no longer be relevant to the C-suite’s goals? Unless a key performance indicator (KPI) can be tied to business objectives or results, it’s a “vanity metric” that may make IT look good to itself but doesn’t deliver real value, according to Dawn Parzych, director at digital experience monitoring company Catchpoint.
What are some of the specific KPIs IT leaders should either stop measuring or at least stop talking about to their business contacts? Here are some examples, and some suggestions for what to look at instead.
Uptime
“Having near 100 percent uptime is no longer a marker of success; it’s more just what’s expected out there,” says Adam Tallinger, vice president at healthcare IT consulting company Impact Advisors in Naperville, Illinois. “You don’t necessarily cheer your plumber every time water comes out of the faucet. You for sure will call if you turn on the tap and it doesn’t come out.”
Like many KPIs, uptime can be misleading if looked at by itself. For example, some systems include a virtualized software layer. The network may be functioning properly, but if that virtualization layer isn’t working, users still have no access to the apps they need. “It’s cheating if you call that uptime,” he says.
Uptime can be useful, but only if tied directly to a business result, Parzych says. “Availability is really important, but what they care about is how it’s impacting service-level agreements [SLAs]. If we have an agreement with customers that our site will be available 99.99 percent of the time, business leaders will care about availability in the context of that SLA.” Even without an SLA, website downtime can have a big impact on customer satisfaction, especially if, say, a site goes dark on Cyber Monday, she adds.
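The gap between availability targets is easy to put in business terms: each extra “nine” shrinks the annual downtime budget by a factor of ten. A minimal sketch (the function name and targets are illustrative, not from any particular SLA):

```python
# Hypothetical sketch: translate an availability target ("nines")
# into the downtime budget it allows per year -- the number that
# actually matters when an SLA is on the line.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_budget_minutes(availability_pct: float) -> float:
    """Minutes of downtime per year permitted by an availability target."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for target in (99.0, 99.9, 99.99, 99.999):
    print(f"{target}% uptime -> {downtime_budget_minutes(target):.1f} min/year")
```

Five nines allows only about five minutes of downtime a year, versus roughly 53 minutes at 99.99 percent, which is why the cost of each additional nine deserves a business-case conversation rather than a reflexive target.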
Even if you’re not sharing the information with business contacts, uptime is one of the basic metrics IT should still gather to make sure its primary obligations are being met, according to Ben Lorica, chief data scientist at O’Reilly Media. “You have to keep the lights on and keep all these services up,” he says. “You can go through the checklist and drop some KPIs that aren’t useful, but there’s a baseline of metrics you should track to make sure things aren’t going down.”
Mean time to repair (MTTR)
MTTR can be a misleading measure. This is especially true if IT employees know it’s being used to evaluate their performance, Parzych says.
“When it’s tied to compensation, everyone wants that bonus or raise,” she explains. “So if you’re tracking MTTR and the number goes down, it could be because people are closing a ticket before it’s fully resolved, for example after they send an email about it. But then a new ticket gets opened later. There are all these different ways you can manipulate things. MTTR is definitely a very important metric to track, but you have to make sure you’re tracking it accurately. You have to ask how other metrics are related to it.”
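The close-then-reopen pattern Parzych describes can be caught in the measurement itself. A hedged sketch, assuming a hypothetical ticket schema with a `reopened_from` link (not any real ticketing system’s fields), that treats a reopen chain as one incident before averaging:

```python
# Hypothetical sketch: naive MTTR counts a ticket that was closed
# prematurely and then reopened as two short repairs, flattering the
# average. Merging reopen chains first yields the repair time the
# user actually experienced. The Ticket fields are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Ticket:
    id: int
    opened: float                       # hours since some epoch
    closed: float
    reopened_from: Optional[int] = None # id of the ticket this reopens

def naive_mttr(tickets):
    return sum(t.closed - t.opened for t in tickets) / len(tickets)

def merged_mttr(tickets):
    """Treat a reopen chain as one incident: first open to last close."""
    by_id = {t.id: t for t in tickets}
    incidents = {}  # root ticket id -> (first open, last close)
    for t in tickets:
        root = t
        while root.reopened_from is not None:
            root = by_id[root.reopened_from]
        opened, closed = incidents.get(root.id, (root.opened, root.closed))
        incidents[root.id] = (min(opened, t.opened), max(closed, t.closed))
    return sum(c - o for o, c in incidents.values()) / len(incidents)

tickets = [
    Ticket(1, opened=0.0, closed=1.0),                   # closed after an email
    Ticket(2, opened=5.0, closed=6.0, reopened_from=1),  # same issue, reopened
    Ticket(3, opened=0.0, closed=4.0),                   # genuinely fixed
]
print(naive_mttr(tickets))   # 2.0 hours -- looks great
print(merged_mttr(tickets))  # 5.0 hours -- what the user experienced
```

The point isn’t this particular implementation; it’s that the metric should be defined against incidents as users experience them, not against ticket records that staff can game.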
Even without deliberate manipulation, MTTR may not be that useful because it fails to consider several factors, Tallinger says. “You could be using MTTR for a break-fix issue that may not even involve downtime. Or it might be affecting one user as opposed to an entire department or the entire enterprise. It doesn’t take into account how important that issue is.”
Besides, he says, there may be other issues that matter more to users than just how long it takes to get a problem resolved. “Something may be simple, or it may take a work group to put it together, or custom programming from a vendor. There are different scenarios where things can take seconds or days, or weeks, or months to fix. How was my case handled? Was I kept in the loop? Was I involved in the solution, and will it be a satisfactory solution or just a Band-Aid?”
First call resolution
This is another dubious KPI, Tallinger says, because of what it fails to take into account. “It’s very dependent on what the issue is. If someone forgot a password, first call resolution percentages will be way up there, but if there are issues that are appropriately passed off to second- or third-tier support, they won’t be.”
Online storage company Box recently made a change in how it handles forgotten passwords, for example. After entering three wrong passwords, an employee is locked out of his or her account for an hour. But Box’s IT team recently put a tool in place where users can unlock their accounts themselves by using two-factor authentication to send a message to their mobile phones. That simple change eliminated about 2,000 calls to the help desk every month.
Besides freeing up valuable IT resources, having an option like this is usually popular with users. So Box’s initiative improved user experience, made IT jobs less tedious by getting rid of a repetitive task, and freed up resources for IT projects. But if Box were to measure its first call resolution, it would likely find that the percentages have gone down dramatically as a result of the change.
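The arithmetic behind that drop is a simple mix effect, illustrated here with hypothetical numbers (the 2,000 monthly password calls are from the Box example; the remaining volumes are assumptions for the sketch):

```python
# Hypothetical sketch of the mix effect: removing the easiest calls
# (password resets) from the queue drags the aggregate first-call-
# resolution percentage down even though support quality is unchanged
# -- and the user experience actually improved.

def fcr(categories):
    """Aggregate first-call resolution across (calls, resolved_first_call) pairs."""
    total = sum(calls for calls, _ in categories.values())
    resolved = sum(first for _, first in categories.values())
    return 100 * resolved / total

before = {
    "password resets": (2000, 2000),  # trivially resolved on first call
    "everything else": (1000, 400),   # assumed volumes, for illustration
}
after = {
    # self-service unlock means password calls never reach the help desk
    "everything else": (1000, 400),
}

print(f"before: {fcr(before):.0f}%")  # 80%
print(f"after:  {fcr(after):.0f}%")   # 40%
```

An aggregate FCR number halved, yet nothing got worse; that is exactly why the metric misleads unless it is broken out by issue category.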
Budget and schedule adherence
“A lot of IT organizations look at a project’s budget and schedule adherence to measure its success,” says Brent Rasmussen, CIO of Carrington Mortgage Holdings in Aliso Viejo, California. The problem is that this measures IT’s performance against the parameters originally set, but doesn’t tell you anything about whether the completed project resulted in any business benefit.
“The business doesn’t benefit from a project until it’s fully adopted, business processes have changed, and it’s measurable and transparent,” Rasmussen says.
In general, he adds, staying within budget, like uptime, is “table stakes” for IT in today’s world. “All your infrastructure layer needs to run all the time and be scalable and dependable so it does not impede the business. If that’s not happening, you’ve probably got bigger issues.”
What IT should do instead
If the KPIs above no longer prove IT success, what should IT leaders be measuring? The answer depends on what’s important to a particular organization, which varies, not only from one company to another, but also at the same company over time. Rapid expansion may be the primary goal one year; maximizing profits may be the biggest goal the following year; and building an enduring brand, the year after that. The only way to truly determine what KPIs IT should measure is to find out from business leaders and users what matters to them.
Here are some expert suggestions for learning what the business really needs.
Start with a lot of consultation
If you want to measure user satisfaction — a metric that very likely matters a lot to business leaders — you might be tempted to send out a survey to everyone, or a follow-up survey to users who ask for support. But without proper preparation, that would likely be a mistake, Tallinger says.
“It shouldn’t be all of a sudden we send out a survey,” he warns. “It’s too much data, too much variability among the people you’re contacting at a specific point in time. I would focus more on building relationships with departmental or organization leaders to say, ‘Here’s how we’re going to measure ourselves in the future,’ and let them give you feedback as well. A survey can be used after you form those relationships.”
Keep asking why
“The biggest issue I’ve seen arise is when business leaders don’t communicate what their goals and objectives are to the IT organization,” Parzych says. For example, a business leader might simply tell IT that user experience for external customers needs improvement. The reason might be that the company is trying to grow its customer base, and frustrated customers tend to leave, resulting in churn. That’s the reason behind the emphasis on customer experience, but that reason isn’t always shared with IT.
So you have to probe a little deeper. “Ask, ‘Why does this matter?’” Parzych advises. “You might have to ask multiple times because the first answer may not be the real business reason. ‘You care about availability — why?’
‘Because users are happier.’
‘What does that have to do with the company?’
‘When they’re happier, they spend more time on the site and make more purchases.’”
Digging down to an answer like that one can be really helpful, she explains, because there might be a better way to achieve that business goal. For example, rather than strive for five nines, you might be better off adding features to the site that engage customers’ attention. “If you’re focusing on availability and not on building features that customers like, you might be missing the mark,” she says.
Tie every metric to a specific business result — and follow up to ensure the result was achieved
Rasmussen does this every year, sitting down with leaders from every department one on one to determine what projects IT will do for them, and how many hours and other resources IT will spend on each. They also agree on a projected business benefit to result from the project.
Rasmussen and his team then track how much time and resources are actually spent on the project to see whether it stays within the planned budget. But that’s not all — they also determine whether the desired business results were achieved.
“We benchmark prior to the project, so first we go in and measure whatever can be measured,” he says. For example, if it’s a marketing project, they might measure lead generation, touches, opened emails, or the current capacity for sending out promotions. “Then we look at our new solution against that baseline to see what its effectiveness is,” he explains.
He and his staff call this process TOFU, for “take ownership and follow up,” he says, and the process applies to every project IT takes on. If it’s a compliance project, they measure numbers of incidents. If it’s a robotic process automation project intended to reduce human workloads, Rasmussen wants that reflected in the numbers as well. “I would expect to see their budget call for ten people for the year, but the actuals show eight,” he says. “Either the business is taking on more capacity, or people are laid off or reassigned. I should be able to measure the savings in that operation.” Otherwise, he says, “you get into too many intangibles.”
So it’s important to find the right metrics to track. But keep in mind that when you do, they still won’t really give you the whole picture of IT performance. As Williams says, “KPIs tell me which way the wind is blowing, but not necessarily the weather for today or forecast for tomorrow. It’s the conversations around KPIs that matter.”