How Event Streaming Is Changing Businesses

In this post we explore how event stream processing works and look at some key modern business examples.

Event streaming enables businesses to interact with users and react to events in realtime. After the adoption of Web 2.0 in the early 2000s turned the Internet from a read-only experience into a read-write one, event streaming now lets us interact with web applications almost instantaneously, creating a truly immersive environment.

Information technology has naturally evolved in such a way that any digital interaction leaves a footprint. This has resulted in the generation of massive amounts of timestamped data, such as financial transactions, application logs, website clickstreams, and IoT telemetry. However, many businesses still process these types of data in large batches of historical information. By processing streams of events, we shift from conducting analysis on stored information to analysing data on the fly, responding to events in realtime.

Understanding Event Streaming and Event Stream Processing

We can develop an intuitive understanding of event stream processing by breaking the term down into its parts – what an event is, what an event stream is, and lastly, what event stream processing is.

What is an Event?

An event is any action that takes place and leaves a digital footprint with a clear timestamp. Events are typically notifications of a change of state, such as an item being bought. These notifications are published to topics, and relevant applications can subscribe to these topics in order to take action on the events. Typically, the event-generating system does not know what action is taken and receives no feedback that the event has been processed.
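
As a rough sketch, an event can be modelled as an immutable, timestamped record tagged with the topic it is published to. The Python below is purely illustrative; the field names are assumptions rather than any particular platform's schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# A minimal sketch of an event: a timestamped notification of a state change.
# Field names ("topic", "payload") are illustrative, not a real platform's schema.
@dataclass(frozen=True)
class Event:
    topic: str                      # category the event is published to
    payload: dict                   # details of the state change
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Example: an "item bought" event. The producer publishes it and moves on;
# it does not know (or wait to learn) how subscribers will react.
purchase = Event(topic="orders", payload={"order_id": 1234, "amount": 59.90})
print(purchase)
```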

What is an Event Stream?

An event stream is a constant flow of events ordered by time. Streams are continuously generated by users interacting with applications, IoT sensors which periodically send data, security-related events, and the like.

What is Event Stream Processing?

Event stream processing refers to applying any kind of analysis or transformation over a constant stream of events. Stream processing typically happens in realtime, which means that the analysis or transformation completes in less than 20 milliseconds.

How Does Event Stream Processing Work?

Traditionally, data analysis and processing have been carried out in large batches, on information collected and stored long after the events took place. Event stream processing instead looks at data as it is generated in order to respond to requests, forecast demand, and identify trends, all in real time. This underlying shift enables businesses to react as events take place, rather than looking back over long periods and implementing changes retrospectively.

Event stream processing requires a change in architecture to enable efficient data transmission between multiple event generators and end applications. Rather than having every event generator integrate with each individual end application, we can place an event streaming platform in between to manage dataflows as required. This platform can receive information pushed by event generators, such as continuous intelligence systems, or it can pull in multiple data streams from various connected devices.

To collect and distribute data, event stream platforms often encompass two types of technologies. The first is a system that stores events in chronological order (both in the short term as buffers, and in the long term as storage), and the second is a piece of software to process events.

Processing can take multiple forms, such as:

  • Aggregations – calculations such as sums and means
  • Analytics – such as forecasting or pattern identification
  • Transformations – such as changing a number into a date format
  • Enrichment – combining multiple relevant data points to add context
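
To make these forms concrete, here is a minimal sketch that applies an aggregation, a transformation, and an enrichment to a couple of hypothetical sensor events; the field names and the location lookup are assumptions made for illustration.

```python
from statistics import mean
from datetime import datetime, timezone

# Hypothetical sensor readings; field names are illustrative only.
events = [
    {"sensor": "s1", "value": 41.2, "ts": 1_700_000_000},
    {"sensor": "s1", "value": 44.8, "ts": 1_700_000_030},
]

# Aggregation: a simple calculation such as a mean over the stream.
avg_value = mean(e["value"] for e in events)

# Transformation: change a raw number (epoch seconds) into a date format.
as_dates = [
    {**e, "ts": datetime.fromtimestamp(e["ts"], tz=timezone.utc).isoformat()}
    for e in events
]

# Enrichment: combine the event with relevant context (e.g. sensor location).
locations = {"s1": "boiler-room"}
enriched = [{**e, "location": locations[e["sensor"]]} for e in events]

print(avg_value, as_dates[0]["ts"], enriched[0]["location"])
```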

As events are generated, they are published to topics, which are categories that differentiate between various types of events. Applications that need to access a specific type of event then subscribe to the relevant topic and start receiving data in chronological order.
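
The sketch below is a deliberately simplified, in-memory stand-in for such a platform: it appends each event to a per-topic log in arrival order and pushes it to every subscriber. Real platforms such as Kafka add partitioning, persistence, and replay; this is only meant to illustrate the publish/subscribe flow.

```python
from collections import defaultdict

class Broker:
    """A toy, in-memory stand-in for an event streaming platform."""

    def __init__(self):
        self.logs = defaultdict(list)         # topic -> chronological list of events
        self.subscribers = defaultdict(list)  # topic -> subscriber callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        self.logs[topic].append(event)        # store the stream in arrival order
        for callback in self.subscribers[topic]:
            callback(event)                   # the producer gets no feedback

broker = Broker()
broker.subscribe("orders", lambda e: print("fulfilment saw:", e))
broker.subscribe("orders", lambda e: print("analytics saw:", e))
broker.publish("orders", {"order_id": 1234, "amount": 59.90})
```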

Adding an event processing platform in the mix brings the following benefits:

Mediate Dataflows between Event Generators and Applications

Adding an event streaming platform between the systems that generate events and the applications that leverage them simplifies the overall architecture. Rather than having each event generator integrate with every application, each side integrates once with the event streaming platform. This means that changes to the system, such as new event sources or applications, do not require all existing integrations to be rewritten.

Manage the Data Transmission Rate between Producers and Consumers

‘Backpressure’ refers to the situation where more data is generated than can be processed. Rather than dropping events or, even worse, having the consuming application crash, an event streaming platform can act as a buffer and smooth the data transmission rate to enable consistent and continuous processing.
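
One way to picture this buffering role is a bounded queue sitting between a fast producer and a slower consumer, as in the minimal sketch below; the queue size and processing rates are arbitrary and serve only to illustrate the idea.

```python
import queue
import threading
import time

# Backpressure sketch: a bounded queue sits between a fast producer and a slow
# consumer. When the buffer is full, put() blocks the producer instead of
# dropping events or overwhelming the consumer.
buffer = queue.Queue(maxsize=100)

def producer():
    for i in range(500):
        buffer.put({"event_id": i})   # blocks whenever the buffer is full
    buffer.put(None)                  # sentinel: no more events

def consumer():
    while (event := buffer.get()) is not None:
        time.sleep(0.001)             # simulate slower downstream processing
        buffer.task_done()

threading.Thread(target=producer).start()
consumer()
```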

A Single Pane of Glass

Besides configuring the flows between event generators and event consumers, an event streaming platform can also store data, enabling applications to query the available data on demand.

Diagrams: without an event streaming platform, lots of point-to-point configuration is required; with one, the architecture is much simpler.


Event Streaming Business Opportunities and Benefits

Typically, the businesses best positioned to profit from implementing event stream processing are those that generate a large number of noteworthy events, whose events occur frequently and close together in time, and that need to respond to these events quickly.

For businesses that have noteworthy events, such as manufacturing companies that have data on machine failures, time to completion, and capacity peaks and flows, event stream processing can help put all this data to good use rather than just showing it as blips on monthly reports.

Frequent events that occur close together in time are the perfect scenario for event streaming applications. As event streaming solutions can scale to support hundreds of thousands of events in short periods of time, the more events we gather, the merrier. Selling tickets online, for example, can generate thousands of concurrent requests. Rather than placing users in queues, event stream processing can help address each of these requests in real time.

Lastly, event stream processing can help you bring the power of hindsight into the now, such as generating content suggestions on-the-fly for video streaming platforms.

Regardless of industry, event stream processing enables applications such as:

  • Analysing data in real time: Real-time anomaly detection, infrastructure monitoring, forecasting, trend analysis
  • Streaming analytics: Observe metrics from thousands of events in real time, such as real-time counters and moving averages. Financial trading is a prime candidate for this type of application
  • Feeding data pipelines: You can move ingested data from the event stream platform to multiple targets such as data lakes, data warehouses, and OLAP databases in real-time. Those systems are best suited for offline analytics and performing interactive queries.

Use Case #1 - Industrial Internet of Things (IIoT)

In the context of automating manufacturing processes, enterprises can integrate an IIoT solution by adding various sensors that transmit streams of data in real time. These sensors can be deployed by the hundreds, and their data streams are typically aggregated by IoT gateways, which push a steady stream of data further into the technology stack. In order to make use of this data, analyse it to identify patterns, and act on those patterns quickly, enterprises need to employ event stream processing. The event streaming platform ingests the stream of events and runs real-time analytics.

For example, we may be interested in monitoring the average temperature over a 30-second window and raising an alert only when that average exceeds 45 °C. Whenever this condition is met, the alert can be used by other applications to react in real time and adjust operations to avoid the risk of overheating.
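
A minimal sketch of that windowed check, assuming readings arrive as (timestamp, temperature) pairs and that alerts are published as simple dictionaries:

```python
from collections import deque

WINDOW_SECONDS = 30
THRESHOLD_C = 45.0

# Sliding-window average: keep only readings from the last 30 seconds and raise
# an alert event when their mean exceeds 45 °C.
window = deque()  # (timestamp_seconds, temperature_celsius)

def on_reading(ts, temp, publish_alert):
    window.append((ts, temp))
    while window and window[0][0] < ts - WINDOW_SECONDS:
        window.popleft()                       # drop readings older than the window
    avg = sum(t for _, t in window) / len(window)
    if avg > THRESHOLD_C:
        publish_alert({"ts": ts, "avg_temp_c": round(avg, 1)})

# Downstream applications could subscribe to the alert events published here.
on_reading(0, 44.0, print)
on_reading(10, 46.5, print)   # average is now 45.25 °C, so an alert is published
```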

Use Case #2 - Payment Processing  

Fast payment processing is a great use case for event stream processing to help alleviate user experience issues and unwanted behaviours. For example, when a person wants to make a payment and faces severe delays, they may refresh the page which can cause the transaction to fail and leave the person confused as to whether the money has been debited or not. Similarly, when dealing with machine-driven payments, latency can have a significant knock-on effect, especially when queueing thousands of payments. This can force systematic retries or timeouts.

To enable seamless processing of thousands of concurrent requests, we can use event streaming processing to ensure consistent experience throughout.

A payment request event can be fed from a topic into an initial payments processor, which updates the running total of payments currently being processed. A subsequent event is then generated and sent to a second processor, which validates that the payment can be processed. Finally, a last event is emitted and a third processor updates the user’s balance.
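
Sketched in code, and assuming each stage reads from its own topic (the topic names, balances, and validation rule below are illustrative assumptions, not a real payment API), the chain might look like this:

```python
topics = {"payment.requested": [], "payment.reserved": [], "payment.validated": []}
in_flight_total = 0.0                 # running total of payments being processed
user_balances = {"alice": 100.0}      # hypothetical account balances

def reserve(event):
    """Initial processor: track the payment as in flight and pass it on."""
    global in_flight_total
    in_flight_total += event["amount"]
    topics["payment.reserved"].append(event)

def validate(event):
    """Second processor: check the payment can be covered, then pass it on."""
    if user_balances[event["user"]] >= event["amount"]:
        topics["payment.validated"].append(event)

def settle(event):
    """Final processor: update the user's balance."""
    user_balances[event["user"]] -= event["amount"]

# Drive one payment request through the three processors in order.
topics["payment.requested"].append({"user": "alice", "amount": 25.0})
for event in topics["payment.requested"]:
    reserve(event)
for event in topics["payment.reserved"]:
    validate(event)
for event in topics["payment.validated"]:
    settle(event)
print(in_flight_total, user_balances)   # 25.0 {'alice': 75.0}
```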

Use Case #3 – Cybersecurity

Cybersecurity tools gather millions of events to detect new threats and understand correlations between individual incidents. To reduce false positives, these tools leverage event stream processing to enrich threat data and deliver contextually rich information. They do so by following a series of steps such as the ones below (a simplified sketch follows the list):

  • Collect events from various data sources such as live customer environments
  • Filter event streams so that only relevant data gets into the topics, namely to remove known false positives or benign attacks
  • Leverage real-time streaming applications to correlate events across various source interfaces
  • Forward priority events to other systems such as security information and event management (SIEM) or security orchestration, automation and response (SOAR) platforms
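
A simplified sketch of the filter-and-correlate stages, assuming each event carries a "signature" and a "source" field; the false-positive list and the correlation rule (the same signature seen from three or more sources) are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical list of known-benign signatures to filter out up front.
KNOWN_FALSE_POSITIVES = {"heartbeat-probe", "internal-scanner"}

def filter_events(raw_events):
    """Keep only events whose signature is not a known false positive."""
    return [e for e in raw_events if e["signature"] not in KNOWN_FALSE_POSITIVES]

def correlate(events, min_sources=3):
    """Escalate signatures observed across several source interfaces."""
    sources_by_signature = defaultdict(set)
    for e in events:
        sources_by_signature[e["signature"]].add(e["source"])
    return [sig for sig, srcs in sources_by_signature.items() if len(srcs) >= min_sources]

raw = [
    {"signature": "heartbeat-probe", "source": "fw-1"},
    {"signature": "credential-stuffing", "source": "fw-1"},
    {"signature": "credential-stuffing", "source": "vpn-gw"},
    {"signature": "credential-stuffing", "source": "web-app"},
]
priority = correlate(filter_events(raw))
print(priority)   # signatures like these would be forwarded to a SIEM or SOAR tool
```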

Use Case #4 – Airline Optimization

We can implement real time applications to improve passenger experience before, during and after flights, as well as improve overall process efficiency. If we take key events, such as passengers scanning their boarding pass at the gate, and make them available across all the back-end platforms used by airlines and airports, we can coordinate and respond efficiently.

For instance, off the back of this one type of event, we can support three different use cases, such as:

  • Predicting take-off times more accurately and forecasting delays
  • Reducing the support required by connecting passengers by providing real-time information
  • Mitigating the compounding effect of one delayed flight on the on-time performance of the rest of the flights
Event streaming example - airline optimization

Use Case #5 - E-commerce

In an e-commerce application, we can use event stream processing to support “viewing through to buying”. To do so, we can define an initial event stream to record the events generated by shoppers with three different types of events feeding into the stream.

  • Shopper views product
  • Shopper adds item to cart
  • Shopper places order

As these three types of events flow into their relevant topics, we can support our use cases with discrete processors or algorithms such as the following (a minimal sketch follows the list):

  1. An hourly sales calculator – which reads ‘Shopper places order’ events from the stream and maintains a sum of total revenues for each hour
  2. A product look-to-book tracker – which reads ‘Shopper views product’ from the stream and maintains a count of total views for each product. It also reads ‘Shopper places order’ events from the stream and maintains a count of total units sold for each product
  3. An abandoned cart detector – which reads all three types of events and identifies shoppers who have added items to their cart but not placed an order. When an abandoned cart is detected, a new ‘Shopper abandons cart’ event is generated and written to a new topic
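
Minimal sketches of the three processors, assuming each event is a simple dictionary with a type, shopper, product, and, for orders, an amount and an hour; the abandoned-cart rule below (an item added to a cart with no subsequent order) is a simplification for illustration.

```python
from collections import defaultdict

hourly_sales = defaultdict(float)   # 1. hourly sales calculator
views = defaultdict(int)            # 2. look-to-book tracker: views per product
units_sold = defaultdict(int)       #    ...and units sold per product
open_carts = defaultdict(set)       # 3. abandoned cart detector state
abandoned_cart_topic = []           # new topic for 'Shopper abandons cart' events

def process(event):
    """Route each incoming event to the relevant processors."""
    if event["type"] == "order":
        hourly_sales[event["hour"]] += event["amount"]
        units_sold[event["product"]] += 1
        open_carts[event["shopper"]].discard(event["product"])
    elif event["type"] == "view":
        views[event["product"]] += 1
    elif event["type"] == "add_to_cart":
        open_carts[event["shopper"]].add(event["product"])

def flush_abandoned_carts():
    """Emit a 'Shopper abandons cart' event for every cart still open.
    A real detector would be time-based; this is a simplification."""
    for shopper, products in open_carts.items():
        for product in products:
            abandoned_cart_topic.append(
                {"type": "cart_abandoned", "shopper": shopper, "product": product}
            )

stream = [
    {"type": "view", "shopper": "s1", "product": "p1"},
    {"type": "add_to_cart", "shopper": "s1", "product": "p1"},
    {"type": "view", "shopper": "s2", "product": "p1"},
    {"type": "add_to_cart", "shopper": "s2", "product": "p1"},
    {"type": "order", "shopper": "s2", "product": "p1",
     "amount": 19.99, "hour": "2024-05-01T10"},
]
for event in stream:
    process(event)
flush_abandoned_carts()
print(dict(hourly_sales), dict(views), dict(units_sold), abandoned_cart_topic)
```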

Learn How Our Platform Makes Event Streaming Easier

Our platform simplifies and consolidates event stream processing and automation within a single, workflow-driven platform. It allows you to ingest events and distribute them across automation workflows, offering highly scalable event streaming that’s about as plug-and-play as it gets.

Our platform enables you to achieve the benefits of event stream processing without infrastructure concerns such as Kafka clusters, Lambda event sources, DevOps overhead, VPC configurations, subnets, security groups, and so on. You simply instrument your applications to collect events.