Trust is ubiquitous, but understanding, building, and retaining trust has become a key challenge of our time, with the trust narrative evolving across a dynamic duality. On one hand, concerns around data privacy, security, and the ethical development of artificial intelligence (AI) abound; on the other, the “art of the possible” has been demonstrated through the positive purposes to which data and technology have been applied.

Another dynamic has also evolved recently: data literacy. Over the last year, our everyday lives have been dominated by data, heightening levels of awareness and helping move beyond data ubiquity to make analytics more ubiquitous too. But as people understand more about how organizations are using their data, they are increasingly concerned, bringing trust center stage.

Learn more about Data Literacy >>

What is trust? Definitions vary but coalesce around trust being interpersonal and involving the “willingness to be vulnerable to the actions of others”. But with the advancing role of AI, is this trust relationship poised to change in relation to machines?

The answer is “yes” with respect to the human-machine interface, which is evolving (to varying degrees) from information system to automation to autonomous agent. In other words, it is a move from master-servant to teammates or partners bringing together complementary strengths. But it is “no” with respect to the question of intent: I would argue that, in its current state, AI is not close to having its own intentions or mental states.

There are three main domains of AI “trustworthiness”: the technology, the system it sits in, and the people behind and interacting with it. Within these domains, five key pillars have emerged: the capacity for AI development and decision-making to be human-led, trainable, transparent, explainable, and reversible.

Rob O’Neill, Head of Information at the University Hospitals of Morecambe Bay NHS Foundation Trust, says end-to-end transparency is central.
Giving the example of a predictive analytics project to identify patients at high risk of hypertension, he explains that, in practice, AI trustworthiness is established through openness, visibility, and built-in bias checks. “I am an advocate of open machine learning techniques, not black box approaches,” he says. “When it comes to data quality, we need to show there’s a clear line of sight from the board down to the ward.”

Within the data landscape, the four Vs of data (volume, velocity, variability, and volatility) are accelerating, and a recent study by Forrester Consulting for Dell Technologies indicated that this acceleration has exacerbated data paradox barriers for organizations. The study found that data overload and the inability to extract insights from data is the third-highest barrier to digital transformation. This makes investment in, and the optimal application of, business intelligence (BI) analytics and automation, supported by culture, talent, and skills, an imperative if data is to enable democratization, help develop new services, and foster proactive organizational agility to change. Improving the data pipeline, enhancing integration, and ensuring insights are informed by trusted data is critical.

For Richard Speigal, Senior Business Intelligence Manager at Nationwide Building Society, this meant his organization had to move away from a traditional project-based structure, where the data and analytics community was split into different functions, creating data and people silos. His evolution was to bring in a product-based framework that is value stream-focused and highly multidisciplinary, with “domain-driven pots of data” governed at source and available for business self-service. Senior leadership take-up helps to cascade adoption, supported by investment in data literacy skills. “We want to federate the business to use BI tools to build their own solutions,” Speigal explains.
“It’s no good just giving people tools; you’ve got to get them data literate.”

Data governance and trust provides another example of duality. On one hand, we have increased regulatory pressure to govern, with complex geographical differences, including compliance with the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), the Payment Card Industry Data Security Standard (PCI DSS), and the Health Insurance Portability and Accountability Act (HIPAA). Compliance data is typically owned by data stewards, especially the closer you get to consumption-in-action, where governance must be more stringent. This is Governance with a capital G. On the other, we have the business and trust impact of good governance. This is governance shared across all key stakeholders, from consumer to engineering team to data stewards. It is governance with a small g, but it is equally vital.

As discussed with Dan Potter, Vice President, Product Marketing at Qlik, there should be governance at every step, from when a piece of data is created to when it is used to take action, and also around analytics. “Trust comes from transparency and consistency,” he says. “We also have the ‘protect me from myself’ component: don’t allow me to get myself into problems.”

The capacity to move faster with data is poised to accelerate, with the increasing convergence of information technology (IT) and operational technology (OT) systems necessitating the bringing together of different levels of data structure, time sensitivity, and volume, with varying latency and stream-processing needs. We are also seeing specific sectors addressing data change, especially healthcare, where pandemic dynamics have demanded more open, predictive models with the capacity to retrain expediently.

Read more on balancing risk and speed in data delivery >>

In combination, this brings the need for Active Intelligence center stage.
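As a toy illustration of Potter’s “governance at every step”, a record can be checked when it is created and again before it is consumed, with the “protect me from myself” component applied automatically. This is a minimal sketch, not any vendor’s implementation; the field names, masking rule, and stage boundaries are all hypothetical.

```python
# Minimal sketch of governance applied at each pipeline stage
# (hypothetical fields and rules): validate a record at creation,
# then mask direct identifiers before self-service consumption.

def check_schema(record):
    """Creation-time check: every governed field must be present."""
    required = {"customer_id", "email", "balance"}
    missing = required - record.keys()
    return [] if not missing else [f"missing fields: {sorted(missing)}"]

def mask_pii(record):
    """Consumption-time 'protect me from myself' step: strip identifiers."""
    masked = dict(record)
    if "email" in masked:
        masked["email"] = "***@***"
    return masked

def governed_pipeline(record):
    """Run the record through both governance steps, rejecting bad input."""
    issues = check_schema(record)
    if issues:
        raise ValueError(f"rejected at creation: {issues}")
    return mask_pii(record)

print(governed_pipeline({"customer_id": 7, "email": "a@b.com", "balance": 120.0}))
# → {'customer_id': 7, 'email': '***@***', 'balance': 120.0}
```

The point of the sketch is that neither check lives in a single central gate: the creation-time check is capital-G Governance, while the masking step is the small-g governance shared with every consumer.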
Passive business intelligence solutions, which rely on preconfigured, curated, and historical data sets, are not designed to holistically embed governance or to support and compel real-time decisions and actions. Active Intelligence affords exactly this, establishing an intelligent analytics data pipeline with dynamic business content and logic, triggering immediate actions, and ensuring business moments are not missed. Similarly, when it comes to governance, having an automated system is not enough; we need to move to a position that could be described as self-correcting, or self-healing.

Finally, as highlighted by Elif Tutuk, Vice President, Innovation and Design at Qlik, while continuous intelligence has been talked about for some time, it typically focuses on continual data flow and triggering an action. Active Intelligence goes further, bringing together additional components with people very much in the loop. “We need to get human trust into analytics and data and provide good collaboration between data producer and consumer,” she says.

As we look to the future, this people component becomes core, with enhanced collaboration the next critical step in building trust in data, and trust in data analytics.

For more on this and the latest trends, visit Qlik’s Executive Insights Center: qlik.com/executiveinsights.