Trust is ubiquitous, but the understanding, building, and retaining of trust has become a key challenge of our time, with the trust narrative evolving across a dynamic duality. On one hand, concerns around data privacy, security, and the ethical development of artificial intelligence (AI) abound; on the other, the "art of the possible" has been demonstrated through the positive purposes to which data and technology have been applied.

Another dynamic has also evolved recently: data literacy. Over the last year, our everyday lives have been dominated by data, heightening awareness and helping move beyond data ubiquity to make analytics more ubiquitous too. But as people understand more about how organizations are using their data, they are increasingly concerned, bringing trust center stage.

What is trust? Definitions vary, but they coalesce around trust being interpersonal and affording the "willingness to be vulnerable to the actions of others". But with the advancing role of AI, is this trust relationship poised to change in relation to machines? The answer is "Yes" with respect to the human-machine interface evolving from information system to automation to autonomous agent (to varying degrees). In other words, a move from master-servant to teammates or partners bringing together complementary strengths. But it is "No" with respect to the question of intent. I would argue that, in its current state, AI is not close to having its own intentions or mental states.

There are three main domains of AI "trustworthiness": the technology, the system it sits in, and the people behind and interacting with it. Within these domains, five key pillars have emerged: the capacity for AI development and decision-making to be human-led, trainable, transparent, explainable, and reversible.

Rob O'Neill, Head of Information at the University Hospitals of Morecambe Bay NHS Foundation Trust, says end-to-end transparency is central.
Giving the example of a predictive analytics project to identify patients at high risk of hypertension, he explains that, in practice, AI trustworthiness is established through openness, visibility, and built-in bias checks. "I am an advocate of open machine learning techniques – not black box approaches," he says. "When it comes to data quality, we need to show there's a clear line of sight from the board down to the ward."

Within the data landscape, the four Vs of data – volume, velocity, variability, and volatility – are accelerating, and a recent study by Forrester Consulting for Dell Technologies indicated that this acceleration has exacerbated data paradox barriers for organizations. The study found that data overload and the inability to extract insights from data is the third-highest barrier to digital transformation. This makes investment in, and the optimal application of, business intelligence (BI) analytics and automation – supported by culture, talent, and skills – imperative if data is to enable democratization, help develop new services, and foster proactive organizational agility.

Improving the data pipeline, enhancing integration, and ensuring insights are informed by trusted data is critical. For Richard Speigal, Senior Business Intelligence Manager at Nationwide Building Society, this meant moving his organization away from a traditional project-based structure, where the data and analytics community was split into different functions, creating data and people silos. His evolution was to bring in a product-based framework that is value stream-focused and highly multidisciplinary, with "domain-driven pots of data" governed at source and available for business self-service. Senior leadership take-up helps to cascade adoption, supported by investment in data literacy skills. "We want to federate the business to use BI tools to build their own solutions," Speigal explains.
"It's no good just giving people tools; you've got to get them data literate."

Data governance and trust provide another example of duality. On one hand, we have increased regulatory pressure to govern, with complex geographical differences including General Data Protection Regulation (GDPR), California Consumer Privacy Act (CCPA), Payment Card Industry Data Security Standard (PCI DSS), and Health Insurance Portability and Accountability Act (HIPAA) compliance. Compliance data is typically owned by data stewards, especially the closer you get to consumption-in-action, where governance must be more stringent. This is Governance with a capital G.

On the other, we have the business and trust impact of good governance. This is governance shared across all key stakeholders, from consumer to engineering team to data stewards. It is governance with a small g, but it is equally vital. As discussed with Dan Potter, Vice President, Product Marketing at Qlik, there should be governance at every step, from when a piece of data is created to when it is used to take action, and also around analytics. "Trust comes from transparency and consistency," he says. "We also have the 'protect me from myself' component – don't allow me to get myself into problems."

The capacity to move faster with data is poised to accelerate, with the increasing convergence of information technology (IT) and operational technology (OT) systems necessitating the bringing together of different levels of data structure, time sensitivity, and volume, with varying latency and stream-processing needs. We are also seeing specific sectors addressing data change – especially healthcare, where pandemic dynamics have demanded more open, predictive models with the capacity to retrain expediently.

In combination, this brings the need for Active Intelligence center stage.
Passive business intelligence solutions, which rely on preconfigured, curated, and historical data sets, are not designed to holistically embed governance or to support and compel real-time decision-making and action. Active Intelligence affords exactly this, establishing an intelligent analytics data pipeline with dynamic business content and logic, triggering immediate actions, and ensuring business moments are not missed.

Similarly, when it comes to governance, having an automated system is not enough; we need to move to a position that could be described as self-correcting, or self-healing.

Finally, as highlighted by Elif Tutuk, Vice President, Innovation and Design at Qlik, while continuous intelligence has been talked about for some time, it typically focuses on continual data flow and triggering an action. Active Intelligence goes further, bringing together additional components with people very much in the loop. "We need to get human trust into analytics and data and provide good collaboration between data producer and consumer," she says.

As we look to the future, this people component becomes core, with enhancing collaboration the next critical step in building trust in data, and trust in data analytics.

For more on this and for the latest trends, visit Qlik's Executive Insights Center at qlik.com/executiveinsights.