Ever heard the name Kafka and thought your architects were discussing Franz Kafka, the 20th-century writer? They were actually talking about Apache Kafka, a project spun out of LinkedIn in 2011 that kicked off a quiet revolution in how large-scale applications are built. Every Facebook post, Amazon order or tweet results in dozens, if not hundreds, of events being produced behind the scenes. Large-scale commerce applications, especially those built using microservices, are almost entirely powered by events. Events are the low-hype plumbing that’s changing how applications are built.
What’s an event?
Before we get too far, let’s cover what an event actually is. An event is a fact at a specific point in time. Many applications publish events any time an action is performed or data is changed. Actions include viewing a product detail page, placing an order or searching for a product. Data changes include updating a customer’s surname, decrementing inventory or changing a product’s description. Source applications publish events into the ether for any internal or external application with appropriate permission that cares to consume them, either immediately or at some point in the future. The sender of the event is completely independent of the consumer of the event.
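In code, an event is usually just a small, immutable record: what happened, when it happened and the facts themselves. A minimal sketch in Python — the field names and event types here are illustrative, not any particular platform’s schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# An event is an immutable fact stamped with the moment it occurred.
# frozen=True makes it read-only: facts about the past don't change.
@dataclass(frozen=True)
class Event:
    event_type: str   # e.g. "order.placed" or "inventory.decremented"
    occurred_at: str  # ISO-8601 timestamp of when the fact became true
    payload: dict     # the details of what happened

order_placed = Event(
    event_type="order.placed",
    occurred_at=datetime.now(timezone.utc).isoformat(),
    payload={"order_id": "A-1001", "total": 59.99},
)
```

Notice there’s nothing in the event about who will consume it — that’s the whole point.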
Let’s take a real-world example of an event. Say you have an inventory team whose charter is to track customer-facing inventory levels, and a warehouse team whose charter is to re-order products so that the warehouse is fully stocked. Historically, the two applications would need a tight technical integration involving many meetings, emails, approvals from management, documentation and so on. After the integration, the two teams are tightly coupled. Every time one team changes their application, the other team needs to know. If one application is down, errors are thrown on the other. With an event-based architecture, the inventory team would simply publish every inventory change as an event for anyone who cares to consume it. Now the warehouse team (and any other team) can consume those events. In this model, the warehouse team and the inventory team don’t even have to know about each other. They’re completely decoupled, both technically and organizationally. The warehouse application could even be offline for a few hours and, when it comes back up, it’ll still get the events and pick up where it left off.
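The mechanics of that decoupling can be sketched with a toy append-only event log. This is a hypothetical in-memory stand-in for a platform like Kafka, not a real API: the inventory side appends events, and the warehouse side reads from wherever it left off — including after being offline:

```python
# Toy append-only event log. Each consumer remembers its own offset,
# so a consumer that was offline can replay everything it missed.
class EventLog:
    def __init__(self):
        self._events = []

    def publish(self, event):
        # Producers only append; they know nothing about consumers.
        self._events.append(event)

    def read_from(self, offset):
        # Return every event at or after `offset`, plus the next offset
        # the consumer should resume from.
        return self._events[offset:], len(self._events)

log = EventLog()
# Inventory team publishes every change, addressed to no one in particular.
log.publish({"sku": "SHOE-42", "on_hand": 9})
log.publish({"sku": "SHOE-42", "on_hand": 8})

# The warehouse application was offline for both changes. On restart it
# replays from its last saved offset (0) and catches up completely.
missed, next_offset = log.read_from(0)
```

Real platforms add partitioning, durability and retention policies, but the contract is the same: producers append, consumers track their own position.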
Besides exchanging data between applications, the burgeoning serverless movement is entirely based on events. Serverless is sometimes called Function as a Service, or even referred to by the name AWS Lambda, the market-leading serverless product from Amazon Web Services. Serverless allows you to execute a little snippet of code in response to an event. You could send customers a “Thank you for your order”-type email every time an order submission event is published, for example. Rather than extending your commerce platform or building tightly coupled integrations between different applications, your developers can just write little snippets of code that are executed when various events are received. Imagine having hundreds of events capturing every action or data change to consume if you’re a developer. The opportunity to inject business logic, without tight coupling, is extremely powerful.
As a final benefit, events can serve as a great foundation for your big data initiatives. With every data change or action represented as an event, and with cloud services being so inexpensive, you can quickly and easily apply real-time streaming analytics to all of your events and store the important data for later analysis. All of the major cloud providers offer a robust suite of services for analyzing data, or you can hire your own team of machine learning experts to analyze the data for you. From that data, you can deliver business insights that just weren’t possible before.
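To make “real-time streaming analytics” concrete, here is a toy running aggregation — a live count of events by type, the kind of metric a managed streaming service computes continuously at far greater scale. The event types are illustrative:

```python
from collections import Counter

# Streaming aggregation: consume events one at a time and emit the
# running totals after each one, rather than batch-processing at the end.
def count_by_type(event_stream):
    counts = Counter()
    for event in event_stream:
        counts[event["type"]] += 1
        yield dict(counts)  # snapshot of totals so far

stream = [
    {"type": "page.viewed"},
    {"type": "order.placed"},
    {"type": "page.viewed"},
]

latest = None
for snapshot in count_by_type(stream):
    latest = snapshot  # in a real pipeline, push this to a dashboard
# latest == {"page.viewed": 2, "order.placed": 1}
```

The same pattern — fold each event into a running state as it arrives — underpins dashboards, anomaly detection and the feature pipelines that feed machine learning.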
How are events different from messaging?
What sets events apart from traditional messaging is the scale. A large-scale commerce application could easily generate millions of events per second. If you wanted to, you could capture every page view, you could log which users have seen which products, and of course you could capture the basics like every time a product or order has been created. The scale at which these cloud event services operate is staggering. The cost is also next to nothing. For $0.02 per petabyte, you can capture and analyze events in real time using AWS Kinesis. For $0.004 per gigabyte, you can store whatever you want in AWS S3. Other clouds have similar offerings at similar prices. You can now economically store anything you want for as long as you want. Capturing, analyzing and storing a petabyte of data used to cost millions of dollars. Now it takes a credit card and a few thousand dollars.
The architecture, technology and economics of events are all so appealing that now is absolutely the time to start adopting this style of architecture. Events will make your integrations easier by helping disparate applications exchange data, open up the new world of serverless and provide the data necessary for analytics, including machine learning.