Gartner research VP Frank Buytendijk has called on organisations to be prepared to put the brakes on complex, high-level data analytics and get the basics right before the practice gets out of control and they cross the ‘creepy line.’
Netherlands-based Buytendijk – who was speaking on Monday morning at Gartner’s Business Intelligence Summit in Sydney – warned that taking steps to correctly implement predictive and descriptive analytics is not easy and organisations should take care when using tools.
“It’s easy to use tools [but that] will not lead to better decisions. If you don’t know what you are doing, easy-to-use tools lead to more spectacular failure,” Buytendijk told attendees.
He cited Gartner research which suggests that by 2016, around 25 per cent of organisations using consumer data will face reputation damage due to an inadequate understanding of information trust issues.
“Big data is at the top of the hype cycle, but after that peak of inflated expectations comes the trough of disillusionment … what if the tipping point becomes the slipping point?
“And we start to hear about the first spectacular failures on how investments didn’t pay back, on how technology didn’t do what it was supposed to do … and we’ve already seen the first examples of how organisations have taken big data too far and crossed the creepy line.”
He referred to the high profile example of the US National Security Agency collecting almost 200 million text messages daily across the globe.
“Do companies and governments not care? Are companies evil? Are companies invading our privacy unethically?” he asked.
“Maybe some but most are not. Most want to respect their customers and their privacy – they just want to get to know you and find the best way to sell their products and services to you.”
But in the process, they can go too far with unintended consequences down the line.
As an example of data gathering and analysis out of control, he highlighted an incident in late 2013 in which TV manufacturer LG investigated claims that its smart TVs sent data on users’ viewing habits back to the company without consent.
“The CEO of LG had to publicly declare that this indeed was going too far and they would release a patch. The CEO talking about a patch: as an IT person, [if that happens] you know you’re in trouble,” he said.
He said organisations needed to accept that in an age driven by digital business, “we are simply not in control.”
“We need to adapt and anticipate use cases in our information infrastructures that simply don’t exist today,” he said. He was referring predominantly to challenges in areas such as social analytics, cloud security, privacy and data ownership.
“Don’t get me wrong, I’m all for moving fast but [we need] to make sure that once in a while, we know where to find the brakes.”
As another potential misuse of consumer data, he highlighted a UK health insurer offering customers a “quantified self” tracking device through which users earn points for walking a certain number of steps each day.
“Do I really want my health insurer to track how much I move? There’s no privacy any more.”
He also warned about the use of predictive analytics, saying it cannot predict the future; rather, it identifies patterns of what can happen in certain situations.
He asked the audience to consider a doctor called on to treat an epidemic with an experimental medication that a specialist says will save 30 per cent of patients. In this scenario, he suggested, the doctor would administer the medication.
“Now let’s reverse the scenario – imagine a specialist tells you that the medication will not work on 70 per cent of people. Would you do it anyway? Probably not, but it is the same situation.
“We humans are notoriously bad at understanding probability. When asked, more than 80 per cent of people feel that they belong to the best 50 per cent of car drivers,” he said.
“We tend to mistake patterns for reality – that’s how we’re built. Sometimes what we need to hit the brake for hardest is ourselves.”