8 Analytics Trends to Watch in 2015

by Thor Olavsrud

Feature | Feb 09, 2015 | 11 mins | Analytics, Big Data

Business analytics are becoming 'the air companies breathe and the oceans in which they swim,' according to Deloitte Analytics. These eight trends will dominate the analytics field in 2015.


Analytics was one of the fastest-growing technology trends of 2014, and momentum continues to build, according to Deloitte Analytics’ recent report, Analytics Trends 2015: A Below-the-Surface Look.

“Put simply, analytics is becoming both the air that we breathe — and the ocean in which we swim,” the report says.

Deloitte believes eight trends will dominate the world of analytics in 2015.

1. Quadruple Down on Data Security

2014 was a rough year for data security, and Deloitte says business and tech leaders are deeply anxious about it in 2015, for good reason. Data is exploding all around us: mobile data generation, real-time connectivity and digital business have changed the game when it comes to protecting data assets, and made the job harder.

As a result, analytics have an increasingly important role to play in data security. Analytics are already transforming intrusion detection, differential privacy, digital watermarking and malware countermeasures.
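To make one of those techniques concrete, here is a minimal sketch of the differential-privacy idea, not drawn from the Deloitte report: an aggregate count is published with calibrated Laplace noise so that no single customer's record can be inferred from it. The security-alert scenario and the epsilon value are illustrative assumptions.

```python
# A minimal sketch of the Laplace mechanism behind differential privacy.
# The epsilon value and the "security alerts" scenario are illustrative.
import numpy as np

def dp_count(true_count, epsilon=0.5, sensitivity=1.0):
    """Release a count with noise calibrated so no individual record stands out."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

true_alerts = 42                      # exact number of customers who triggered an alert
print(round(dp_count(true_alerts)))   # noisy count that is safer to share widely
```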

“Securing data is really the difference between remaining operational and dealing with a serious crisis,” says John Lucker, principal, Deloitte Consulting, and Deloitte’s global advanced analytics and modeling market leader. “And companies can’t wait for legislation to save the day. Without a keen focus on data security now, some organizations might not have a bottom line to worry about. But it’s not all horror stories. Security is also about building brand reputation and trust. Strong security practices, including the use of advanced analytics capabilities to manage privacy and security challenges, can set businesses apart from the competition and create comfort and confidence with customers and consumers.”

Lucker notes that one thing companies can do right away is elevate the role of the executive responsible for data security in the organization.

“For example, in consumer products and retail companies — where consumers are looking for reassurance around privacy and security issues — it is important to have a visible (and hopefully visionary) senior executive in charge of security evaluating and meeting business, technology and consumer security needs. Consumers need to know that someone is looking out for their interests.”

2. The Analytics of Things Comes into Its Own

The Internet of Things (IoT) will continue to grow rapidly in 2015. Analytics tools and techniques for dealing with the massive amounts of structured and unstructured data generated by IoT are already coming to light, but Deloitte says the integration of systems is where things are lagging. Both consumer and industrial IoT applications could benefit from industry standards, especially since traditional analytics architectures and techniques don’t play well with the noisy, analog, high-velocity data generated by sensors.
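As a small illustration of the data problem, and not anything the report prescribes, the sketch below shows one common way analytics pipelines tame noisy, high-velocity sensor streams: an exponential moving average applied to each reading as it arrives. The temperature values and the smoothing factor are made up.

```python
# A simple exponential moving average to smooth a noisy sensor stream
# before it reaches downstream analytics. Values and alpha are made up.
def smooth_stream(readings, alpha=0.2):
    """Yield a running, exponentially weighted average of each new reading."""
    average = None
    for value in readings:
        average = value if average is None else alpha * value + (1 - alpha) * average
        yield average

raw_temps = [21.0, 27.5, 20.1, 34.9, 22.3, 21.8, 28.0]   # jittery readings
print([round(v, 1) for v in smooth_stream(raw_temps)])
```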

“Everyone from AllJoyn (open source) to Google to the Industrial Internet Consortium is now involved in the standards effort,” Lucker says. “We are making headway, but there are some things that could speed the pace.”

First, Lucker says, it is essential to facilitate collaboration, allowing participants from research, academia and business to get involved.

“There are too many IoT standards bodies already — and most of them are technology company-driven,” he says. “It would be better in the long run for users, and established companies, to drive standards setting for the IoT.”

In addition, Lucker says we need to think big. It’s not just about setting standards for sensor devices. We need to think about standards for data integration, analytics and processes too.

We also need to think faster. Lucker says new technologies like the cloud and APIs can translate between standards and formats quickly, and we need to employ them.

That said, Lucker warns that we should probably hunker down for the long haul. It took 15 years to develop a standard for RFID. While IoT standards-setting is moving much faster than that, Lucker says we shouldn’t expect success overnight.

3. Data Monetization Is Here with Risks and Rewards

Here and there, people have begun talking about a strange new idea: That data should not only be managed as an asset but valued as one. In the future, analysts and researchers say, companies will routinely monetize their own data for financial gain.

In some areas it makes perfect sense, and some companies — especially online businesses and, more recently, industrial firms — are already rebuilding their strategy around data as an asset, Deloitte says. But Deloitte also notes that many companies are likely underestimating the responsibilities that come with this data power.

“Data privacy and liability concerns are probably the most important monetization questions businesses need to consider,” Lucker says. “Does the company have the implicit and/or explicit statutory or legal right, or ethical right, to divulge private consumer data through aggregation/monetization? If not, do not pass go. The risks are too great. If yes, then just because something can be done doesn’t mean it should be done.”

If you’re thinking about monetizing data, Lucker says your first stop should be legal counsel to understand what you can and can’t legally do with the data in question.

“Rights need to be negotiated up front as do agreements with customers on their plan to use the data,” he says. “Much of this data is intellectual property, and as such, companies have to consider the cost vs. benefits of making it openly available.”

In addition, Lucker notes other potential pitfalls, including failing to establish a data monetization business model, underestimating the technology and other costs involved, and ignoring data accuracy concerns.

“That’s not to say it’s never a good idea to monetize data — clearly there is value in doing so,” he says. “But if it diverts the focus of the company from its primary strategic goals, then I’d stick with doing what the business does well and doing that better.”

4. We Can Build the Bionic Brain Stronger, Faster, Better

The advent of cognitive analytics means we can now automate analytical thinking through machine learning. Cognitive analytics aren’t a replacement for traditional information and analytics programs, but they appear to be capable of improving just about any knowledge-intensive undertaking.

“Modeled after the way the human brain processes information, draws conclusions and learns from actions taken, cognitive analytics uses technology, computing power and human interaction to generate hypotheses, make conclusions and express recommendations,” Lucker says. “With cognitive computing, these recommendations can also be ranked by how likely it is that the response is accurate. What’s more, iterative learning takes place at the machine level. The more data fed into a machine learning system, the better quality insights you get out of it.”

It’s the last point, Lucker says, that differentiates cognitive analytics from traditional analytics.

“With traditional analytics, data representing complex challenges or questions is analyzed, patterns are identified and historical or predictive insights are generated to inform decision making around those issues,” he says. “Cognitive analytics goes one step further, feeding learnings back into the analytics ecosystem to be applied to the next iteration and new or related challenges. With each iteration, the mechanism gets smarter.”
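Here is a minimal sketch of that feedback loop, assuming a scikit-learn model that is updated incrementally as new batches of labeled outcomes arrive. The features, labels and the flagged-case framing are invented for illustration and are not from Deloitte's report.

```python
# A hypothetical feedback loop: each new batch of labeled outcomes is fed
# back into the model with partial_fit, so the model improves iteratively
# instead of being retrained from scratch. Data and labels are synthetic.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
model = SGDClassifier()
classes = np.array([0, 1])            # e.g., 0 = routine case, 1 = flagged case

for _ in range(5):                    # five "iterations" of new evidence arriving
    X_batch = rng.normal(size=(100, 4))           # stand-in features
    y_batch = (X_batch[:, 0] > 0.5).astype(int)   # stand-in ground truth
    model.partial_fit(X_batch, y_batch, classes=classes)

# Score new cases by the model's confidence, echoing the "ranked recommendations" idea.
X_new = rng.normal(size=(3, 4))
print(model.decision_function(X_new))
```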

5. The Rise of Open Source…Again

Open source solutions have been common in Silicon Valley for more than a decade, and many websites today run on the LAMP stack. Now open source solutions like Hadoop are finding a place in mainstream enterprises as data storage and processing engines. It’s no wonder why: open source solutions are often free or inexpensive, and the communities around them can enable rapid development and iteration. Tech leaders are ready to put open source technologies to work, but Deloitte warns that risk management must be part of the equation.

Deloitte points out that you may currently have an army of volunteer open source developers working on the project that you’ve just made core to your technology strategy, but they could move on to the next big thing. Or the quality of the solution and developers working on it might decline.

You need to have a clear picture of the portion of your infrastructure built on open source solutions so you can calculate your risk exposure.

6. Striking Gold with Tax Analytics

Deloitte notes that companies have been slow to capture their tax situations and outcomes in structured formats, but a surge in common data sets is making it easier than ever to bring fact-based insights to company taxes. For instance, tax analytics can explain or predict tax levels under particular circumstances.

“More and more, tax departments are taking a proactive approach to tax risk management,” Lucker says. “This means using data and initiatives from across the business to generate insights and make fact-based decisions that drive strategy.”

With tax analytics, he says, businesses can think differently about tax issues. They can analyze data sets from different parts of the business in a granular way and allow different departments to access a single data source for all their tax information, increasing transparency and accuracy.

“One of our clients was looking for a way to reduce the effort involved in reviewing vendors on a monthly basis over a certain threshold,” he explains. “The effort required two people working two weeks out of each month to get the job done. A tax analytics solution reduced that effort significantly, allowing the business to review every transaction for the company in just a few hours’ time.”

Tax analytics are also simplifying the process of recovering overpaid transaction taxes and helping to prevent future overpayments at the same time.

“Technology and computing power allows for the analysis of terabytes of transactional data to uncover patterns,” Lucker says. “Customer behavior and spending patterns can be examined, and forensic methods can uncover fraud, waste and abuse. Double payments and improper payments become easier to identify.”
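As a rough illustration of the kind of check Lucker describes, and not his client's actual solution, the pandas sketch below flags payments that share a vendor, invoice number and amount. The column names and sample records are hypothetical.

```python
# Flag possible double payments: entries that share a vendor, invoice
# number and amount. Column names and sample rows are hypothetical.
import pandas as pd

payments = pd.DataFrame({
    "vendor":  ["Acme", "Acme", "Globex", "Acme", "Globex"],
    "invoice": ["1001", "1001", "2001",   "1002", "2002"],
    "amount":  [5000.0, 5000.0, 1250.0,   730.0,  1250.0],
    "paid_on": pd.to_datetime(["2015-01-05", "2015-01-19", "2015-01-07",
                               "2015-01-12", "2015-02-07"]),
})

duplicates = payments[payments.duplicated(subset=["vendor", "invoice", "amount"],
                                          keep=False)]
print(duplicates)          # both Acme payments against invoice 1001 are flagged
```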

Companies are also developing tax analytic approaches to simulate numerous business scenarios, probabilistic outcomes, decision criteria and other business actions/outcomes.
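A toy Monte Carlo sketch of that idea, with an invented revenue distribution, deduction and flat tax rate, shows how scenario simulation can put a range, rather than a single point estimate, on a tax outcome.

```python
# A toy Monte Carlo over tax scenarios: revenue is uncertain, and we look at
# the distribution of tax owed under an assumed flat rate. Figures invented.
import numpy as np

rng = np.random.default_rng(7)
revenue = rng.normal(10_000_000, 1_500_000, size=50_000)  # simulated annual revenue
taxable = np.clip(revenue - 6_000_000, 0, None)           # assumed deductions
tax_owed = taxable * 0.30                                  # assumed flat rate

print(f"Expected tax:        ${tax_owed.mean():,.0f}")
print(f"95th percentile tax: ${np.percentile(tax_owed, 95):,.0f}")
```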

7. STEM Becomes STEAM

It should be no surprise to anyone at this point that companies are concerned about a lack of data scientists out there. Universities are stepping up their efforts to turn out data scientists and quantitative analysts, but Deloitte believes that as the demand grows, there will be an inevitable shakeout in university programs.

Deloitte notes that “STEM”, short for the academic disciplines of science, technology, engineering and math, has been a hot buzzword on college campuses for years. But now some are beginning to talk about “STEAM” — the ‘A’ stands for ‘Art’ — instead.

“This is good news for the business world, which is looking to such programs to deliver the analytics talent they need — and are increasingly on the hunt for people who can balance quantitative analysis skills with an ability to tell the story of their data in compelling, visual ways,” Deloitte says. “Design thinking, visualization and storytelling are increasingly important.”

8. The Quest for Accuracy

The rapid increase in analytics capabilities over the past several years has made the data brokerage business white hot. That’s expected to continue, Deloitte says, but those buying the data will become much more discriminating about what they’re sold.

Having performed a study on consumer data collected by data brokers, Lucker says Deloitte found numerous issues within the data that could affect both companies using the data and the consumers the data describes.

“The data is not as accurate or complete as we hoped or expected,” Lucker says. “Thirteen of the 80 participants (or 16 percent) reported no information was available at all. Despite trying multiple addresses, almost one-sixth of our sample found no information.”

And even when data existed, it wasn’t necessarily accurate. For instance, he points to an unmarried person with a Ph.D.; the data described the person as a married high-school graduate. There were many more examples.

When it comes to big data, those in the know understand that it can be directionally accurate in aggregate but individually inaccurate. It’s inaccurate and valuable at the same time. But the more accurate it is, the more valuable it potentially becomes, especially for companies trying their hands at micromarketing and micro-segmentation at the consumer level.
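A toy simulation makes the point: give every record a large random error and each individual figure is far off, yet the average across the whole data set barely moves. The income figures and error sizes below are invented.

```python
# Large errors per record, tiny error in the aggregate. All numbers invented.
import numpy as np

rng = np.random.default_rng(42)
true_income = rng.normal(60_000, 15_000, size=10_000)             # "real" values
broker_income = true_income + rng.normal(0, 20_000, size=10_000)  # noisy broker data

per_person_error = np.mean(np.abs(broker_income - true_income))
aggregate_error = abs(broker_income.mean() - true_income.mean())

print(f"Typical per-person error:  ${per_person_error:,.0f}")   # large: on the order of $16,000
print(f"Error in the overall mean: ${aggregate_error:,.0f}")    # small by comparison
```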

Today’s data from data brokers may be better than nothing, but buyers want and expect greater accuracy.

Bubbles to Watch

Looking farther out, there are a number of trends that Deloitte believes will become the topics du jour next year, including facial recognition and geospatial monitoring, citizen backlash and analytics driving the physical world.

Technology is already largely capable of tagging friends in photographs and catching criminals by tracking their movements via security cameras. Deloitte believes the explosion of data from inexpensive cameras and cell phones will be used to train machine learning systems, leading to plenty of innovation in the field.

At the same time, Deloitte believes the combination of government monitoring, data breaches and “creepy” commercial efforts will sooner or later lead the public to demand enforceable accountability for those who collect or disseminate personal data.

Finally, technology that controls physical activities, from Google’s self-driving car to the Nest thermostat, will continue to get a lot of consumer attention. As these capabilities catch on, Deloitte warns, businesses must plan thoroughly for their potential consequences, good and bad.
