“Data is the fuel of this new economy. In fact, I would venture to say that for almost any organization today, after its associates, data is the biggest asset.” — Dev Ganguly, CIO, Jackson National Life Insurance Company (from the Inspired Execution podcast)
You won’t get many arguments from CIOs about the importance of data to the modern enterprise. And amid the broad spectrum of data types that power business today, one that keeps rising in importance is fast data. More than three-quarters of modern enterprises use real-time, actionable data for at least some of their applications, according to Forrester Research. In noting fast data’s strategic importance, the analyst firm also highlights the challenges involved with getting the most out of it.
In this article, I’ll walk you through what makes fast data unique among other operational data types, and what’s required of enterprises to take full advantage of it.
Beyond operational data
Data managed by enterprises has historically been categorized into two major buckets: “operational” and “analytical.” Analytical databases—think data warehouses or data lakes—serve to analyze static, historical data to help determine patterns retroactively, understand the past, and attempt to predict the future. Operational data, on the other hand, includes more immediate, transactional data—the data needed to run a business day-to-day, like inventory and purchase data.
As large enterprises increasingly need to rapidly ingest, interact with, and react in real time to data that’s generated by applications at scale and speed, another type of data is emerging: “fast data.” Analysts and other industry observers often lump fast data into the operational bucket—but it has distinct use cases that set it apart from other kinds of operational data.
Fast data enables full-circle delivery of data that is “in motion.” In other words, it’s generated and consumed instantly by interactive applications running on large numbers of devices. Fast data enables organizations to act on insights gained from user interactions as those insights are generated, at the point of the interaction. And because decisions or actions take place right at the front end, fast data architectures are, by definition, distributed and real-time.
Big versus fast
Big data is focused on capturing data, storing it, and processing it periodically in batches. A fast data architecture, on the other hand, processes events in real time. Big data focuses on volume, while with fast data, the emphasis is on velocity.
Here’s an example. A credit card company might want to build credit risk models based on demographic data. That’s a big data challenge. A fast data architecture would be required if that credit card company wants to send fraud alerts to customers in real time, as suspicious activity occurs in their accounts.
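To make the fraud-alert scenario concrete, here’s a minimal Python sketch of the kind of per-event logic a fast data pipeline might run as transactions stream in. The `FraudDetector` class, its sliding-window rule, and the dollar threshold are all illustrative assumptions for this article—not any vendor’s actual fraud model:

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class Transaction:
    card_id: str
    amount: float
    timestamp: float  # seconds since epoch


class FraudDetector:
    """Flags a card when spending inside a short sliding window exceeds a threshold."""

    def __init__(self, window_seconds: float = 60.0, threshold: float = 1000.0):
        self.window_seconds = window_seconds
        self.threshold = threshold
        self.history: dict[str, deque] = {}  # card_id -> deque of (timestamp, amount)

    def process(self, txn: Transaction) -> bool:
        """Handle one event as it arrives; return True if an alert should fire now."""
        window = self.history.setdefault(txn.card_id, deque())
        window.append((txn.timestamp, txn.amount))
        # Evict transactions that have aged out of the sliding window.
        while window and txn.timestamp - window[0][0] > self.window_seconds:
            window.popleft()
        total = sum(amount for _, amount in window)
        return total > self.threshold


detector = FraudDetector()
t0 = 1_700_000_000.0
first = detector.process(Transaction("card-1", 400.0, t0))       # under threshold
second = detector.process(Transaction("card-1", 700.0, t0 + 5))  # window total now 1100
```

The point of the sketch is the shape of the computation: each event is evaluated the moment it arrives, against only a small recent window of state, so the alert can go out while the suspicious activity is still happening—rather than surfacing in a nightly batch job.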
Think of FedEx. To track millions of packages and ensure on-time and accurate delivery across the planet, FedEx needs access to the right real-time data to perform real-time analysis and deliver the right interaction—right away, right there, not a day later.
The fast data challenge
Handling fast data, which pours in from mobile devices, sensor networks, retail systems, and telecommunications call-routing systems, is becoming a major challenge for data-driven organizations.
To illustrate the complexity, let’s unpack the fast data definition we’ve arrived at: enabling reactive engagement at the point of interaction.
- The point of interaction could be a system making an API call, or a mobile app.
- Engagement is defined as adding value to the interaction. It could be giving a tracking number to a customer after they place an order, a product recommendation based on a user’s browsing history, or a billing authorization or service upgrade.
- Reactive is the fast part of fast data; it means the engagement action happens in hundreds of milliseconds for human interactions (machine-to-machine interactions that occur in an energy utility’s sensor network might not require such a near-real-time response).
Fast data requires modern architectures that incorporate a database capable of handling massive, distributed data at speed, high-scale streaming technologies that can deliver events as rapidly as they occur, and logic at the point of interaction to deliver that engagement and value to the end user or end point.
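As a rough sketch of how those three pieces fit together, the toy Python below stands an in-memory queue in for the streaming layer and a plain dict in for the distributed database; `engage` plays the role of the point-of-interaction logic and measures its own latency against a budget. Every name here is hypothetical—a real stack would use something like a Kafka or Pulsar consumer and a distributed database client in the same positions:

```python
import queue
import time

events: queue.Queue = queue.Queue()  # stand-in for the streaming layer
store: dict[str, dict] = {}          # stand-in for the distributed database
LATENCY_BUDGET_MS = 300.0            # "hundreds of milliseconds" for human interactions


def engage(event: dict) -> tuple[dict, float]:
    """Point-of-interaction logic: persist the event, return a next-best-action."""
    start = time.perf_counter()
    store[event["user"]] = event  # record the interaction for later use
    response = {
        "user": event["user"],
        "action": "recommend",       # the "next best action" for this user
        "based_on": event["page"],   # driven by what they are doing right now
    }
    latency_ms = (time.perf_counter() - start) * 1000
    return response, latency_ms


# Simulate one event flowing through the pipeline.
events.put({"user": "u1", "page": "garden-tools"})
while not events.empty():
    response, latency_ms = engage(events.get())
    within_budget = latency_ms < LATENCY_BUDGET_MS
```

The structure, not the toy logic, is the takeaway: events arrive continuously, state is written as a side effect, and the value-adding response is computed inline—inside the latency budget—rather than handed off to a batch system.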
Businesses that have built a fast data software stack can create applications that process real-time data and deliver recommendations, analytics, and decisions quickly enough to matter. Whether the window is seconds or fractions of a second, enterprises need an architecture that can respond in the timeframe the market demands.
With a fast data architecture in place, organizations also gain the ability to shift the way they interact with customers very quickly. This became particularly important once COVID-19 struck. The Home Depot already relied on fast data to keep customers, store employees, and inventory synced. And because the company’s architecture was optimized for app and data velocity, it was able to shift to curbside delivery rapidly and smoothly.
The bottom line: Fast data makes it possible to offer a user a “next best action” at the point when a user would find it most helpful—in any experience or business process.
Learn how DataStax helps enterprises create modern data applications, built on the world’s most scalable data stack.