Not so long ago, the request/response architecture pattern was the most common way for computer systems to communicate over a network. One computer would send a request (such as a request to access data) and then wait for the second computer to respond by either granting or denying access. In the meantime, there was nothing the first computer could do but wait. This approach was fine when organizations relied on small data sets and simple events, but thanks to the rise of big data, enterprises need faster solutions that are capable of doing more. This is why event-driven architecture (EDA) is so important.
The EDA design pattern allows computer systems to detect events and react to them in near real-time. Not only can a modern system work on multiple new events at once, but it also won't have to wait for a response before performing each action. This greatly speeds up the processing of complex events and allows the organization to keep its data accurate and current.
How do event streams work?
EDA is frequently used in microservices applications, where many loosely coupled services work independently of the overall app. This is made possible by shared APIs, and it ensures that the actions of one microservice can't disrupt the entire application. Microservices break an application down into its core functions so that each one can be used independently of the others.
An event can be anything that results in a change of state within an organization. For example, if a customer purchases an item on your company website, that item's state will change from "for sale" to "sold." Sales aren't the only thing that produces events, though. Any time a user logs in is an event. If unusual network activity is detected, that produces an event in the form of a security alert. Even an IoT sensor detecting unusual temperatures will produce an event message. Event stream processing (ESP) is the act of collecting these events, recording them in databases, visualizing them during data analysis, and much more. Beyond the capability to process multiple types of events quickly, the goal of event processing is to discover patterns in your most meaningful events and use them to drive better decision-making.
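To make the idea concrete, an event like the purchase above is often represented as a small, immutable record capturing what changed and when. Here is a minimal sketch in Python; the `Event` class and its field names are illustrative, not part of any specific framework:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# A minimal, immutable event record (frozen so it can't be altered
# after the fact, which suits an append-only event log).
@dataclass(frozen=True)
class Event:
    event_type: str
    payload: dict
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# The purchase described above, recorded as a state-change event.
purchase = Event(
    event_type="item.sold",
    payload={
        "item_id": "SKU-1234",
        "previous_state": "for sale",
        "new_state": "sold",
    },
)
print(purchase.event_type)  # item.sold
```

Because the record carries both the old and new state, downstream consumers can react to the change without querying the producer.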
Each event starts with the event producer, which will generally be your own point-of-sale system, online store, or application. Each event is funneled to an event broker and, depending on the event source, it will be sent to the appropriate event consumer. For example, if a customer buys the last of one of your items, the event broker will funnel this information to your inventory management system so it can order additional stock.
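The producer-broker-consumer flow described above can be sketched with a simple in-memory broker. This is a toy illustration of the routing pattern, not a real message broker; the names (`EventBroker`, the `"item.sold"` event type, the inventory consumer) are all assumptions for the example:

```python
from collections import defaultdict

# A minimal in-memory broker that routes events by type.
class EventBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        # A consumer registers interest in one event type.
        self.subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        # The broker funnels the event to every matching consumer.
        for handler in self.subscribers[event_type]:
            handler(payload)

restock_orders = []

def inventory_consumer(payload):
    # The inventory system reacts when the last unit of an item sells.
    if payload["remaining"] == 0:
        restock_orders.append(payload["item_id"])

broker = EventBroker()
broker.subscribe("item.sold", inventory_consumer)

# The point-of-sale system acts as the event producer.
broker.publish("item.sold", {"item_id": "SKU-1234", "remaining": 0})
print(restock_orders)  # ['SKU-1234']
```

Note that the producer never calls the inventory system directly; it only publishes the event, and the broker decides who hears about it. That indirection is what keeps the components loosely coupled.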
What are the challenges?
While EDA certainly offers some great benefits to enterprises, it isn't without drawbacks. Loose coupling of functions can make it difficult for engineers to determine how the system works as a whole, especially when it reacts to complex events in real time. Event sources can also be challenging to pinpoint, which can make it harder to analyze data. The event broker must also be maintained carefully, which may require specialist knowledge: if the broker fails, the entire system fails.
The best way to combat these challenges is to be deliberate when designing or selecting a messaging framework. Message processing tends to be the most stable approach; it involves event messages passing through multiple brokers until they reach a single destination. Stream processing, as already discussed, is the most commonly used approach now. It is more complicated than message processing, but if you use Apache Kafka (generally considered the industry standard), you'll find an active community of users to draw support from. You also don't want to design too many event types, just as you wouldn't want to design an event so generic that its intent is difficult to interpret. Both of these issues can make your EDA system less effective.
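The event-design trade-off above can be illustrated with two hypothetical event shapes. The event types (`order.shipped`, `thing.changed`) and the routing function are made-up examples of the principle, not a recommended schema:

```python
# A well-scoped event names its intent in the type itself...
specific_event = {
    "type": "order.shipped",
    "payload": {"order_id": "A-17"},
}

# ...while an overly generic event buries the intent in the payload,
# forcing every consumer to inspect it to learn what happened.
generic_event = {
    "type": "thing.changed",
    "payload": {"entity": "order", "id": "A-17", "change": "shipped"},
}

def route(event):
    # The specific event can be routed on its type alone.
    if event["type"] == "order.shipped":
        return "shipping-notifications"
    # The generic event needs extra payload inspection to route the
    # same way, and that logic must be duplicated in every consumer.
    if (event["type"] == "thing.changed"
            and event["payload"].get("change") == "shipped"):
        return "shipping-notifications"
    return "unrouted"

print(route(specific_event))  # shipping-notifications
print(route(generic_event))   # shipping-notifications
```

Both events end up at the same destination, but the generic one makes every consumer do the interpretive work, which is exactly the effectiveness cost the paragraph above warns about.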