It's heady times for data. Big data, data lakes, data-as-a-service, data breaches: a week can't go by without a headline mentioning data. Certainly, businesses are aggressively investing in harnessing their data as an asset to drive strategic insights, automate complex processes, and personalize customer experiences. Companies are doubling down on data security while also starting to package insights as information products.

All this energy is turning up the pressure on data management in these organizations, and cracks are rapidly appearing. The status quo of highly engineered data warehouse supply chains surrounded by pockets of specialized analytical sandboxes can't keep up with business demand for data. Tensions are rising between the groups tasked with locking data down and those trying to set it free.

What is emerging is nothing short of a seismic shift in data management. The demand for business agility with data is moving the center of gravity from IT producers to business consumers.
Like never before, organizations are striving to provide self-service data access for business analysts through a secure, well-managed environment where users can quickly find, understand, and prepare data for their specific needs.

David Wells of Eckerson Group calls this new environment a "data marketplace." (source) Drawing on analogies to e-commerce, a data marketplace is a place where analysts and other data consumers go to find and provision the data they need. While this sounds simple and intuitive, making it a reality requires a major overhaul of common data management platforms and processes.

Wells cites three ways that data marketplaces differ from data warehouses:

Cataloging: Instead of mapping data to a comprehensive enterprise data model (which takes an enormous amount of engineering before users can access it), data sets are cataloged along the path from "raw" data to "ready" data. The catalog describes data quality, completeness, business definitions, and how the data has been used, helping users find and understand the data they need.

Curating: The marketplace replaces the painstaking effort of creating a "single source of the truth" with an agile, on-demand approach to improving data incrementally.
Users who need harmonized, consistent data use tools in the marketplace to rapidly prepare "fit for purpose" data sets. Over time, commonly used, clean views of data evolve into trusted data shared across the enterprise.

Crowdsourcing: All stakeholders in the marketplace (data producers, stewards, and consumers) actively improve it every time it is used. Business users note data quality and consistency gaps, stewards establish common definitions and "go to" data sets, and source system experts identify sensitive data that needs protection. These stakeholders continuously enrich the marketplace catalog and use it to coordinate with each other as data sources evolve and new business requirements emerge.

I have had the good fortune of working with early adopters of the marketplace, and the results have astounded me: analysis time reduced from months to days; data preparation costs cut in half; millions of dollars in hard cost savings through the migration and retirement of legacy systems. Just as important, consolidating the "first mile" of the data supply chain has improved data security and governance; these companies use the marketplace to enforce and monitor important data protection and access policies.

These early successes (along with 25 years of scars from traditional approaches) have convinced me that the data marketplace is the platform for the future. I look forward to sharing what I learn on this journey.