Overview

Streaming apps process continuous data flows in real time, enabling immediate analysis and response to incoming information. These applications handle unbounded sequences of events, transforming and aggregating data as it arrives from various sources such as sensors, user interactions, or system logs. By operating on data in motion, stream apps facilitate rapid decision-making and allow systems to adapt dynamically to changing conditions across distributed environments.


What is a streaming app?

Streaming applications employ different processing logic at various data pipeline stages, adapting to each phase's specific requirements and constraints. They aggregate, filter, and enrich data streams, passing results to subsequent stages for further processing or storage. Streaming apps often operate in resource-constrained environments, requiring efficient utilization of computing power, memory, and network bandwidth.
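
As a rough illustration of these stages, here is a minimal Akka Streams sketch (Scala DSL; the SensorReading and EnrichedReading types and the in-memory reference map are hypothetical stand-ins) that filters, enriches, and aggregates a stream of readings:

  import akka.actor.ActorSystem
  import akka.stream.scaladsl.{Sink, Source}

  // Hypothetical domain types, for illustration only.
  final case class SensorReading(sensorId: String, value: Double)
  final case class EnrichedReading(sensorId: String, location: String, value: Double)

  object StageSketch extends App {
    implicit val system: ActorSystem = ActorSystem("stage-sketch")

    // Stand-in reference data; a real app would query a view or cache.
    val locations = Map("s1" -> "plant-a", "s2" -> "plant-b")

    Source(List(SensorReading("s1", 7.5), SensorReading("s2", -1.0), SensorReading("s1", 3.2)))
      .filter(_.value >= 0)                     // drop invalid readings
      .map(r => EnrichedReading(r.sensorId,     // enrich from reference data
        locations.getOrElse(r.sensorId, "unknown"), r.value))
      .scan(0.0)((total, r) => total + r.value) // running aggregate of all values seen so far
      .runWith(Sink.foreach(println))           // pass results downstream (here: print them)
  }

In a real deployment the fixed list would be replaced by an unbounded source such as a message broker topic, and the final stage would forward results to storage or another service rather than printing them.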

The architectural landscape of stream processing can be elastic and distributed, potentially spanning multiple regions and geographies. This distributed nature allows scalability, fault tolerance, and the ability to process data closer to its source or where it's most relevant. Stream apps must maintain resilience when network inconsistencies occur and adapt to varying data volumes and velocities. 

Key properties of streaming apps

Stream apps possess distinct characteristics that enable real-time data processing and analysis.

Continuous processing
Stream apps operate on data as it arrives. They process unbounded datasets without a predetermined end.
Low latency
These applications provide near-real-time results. They minimize the delay between data ingestion and output generation.
Scalable architecture
Stream apps can handle varying data volumes. They scale horizontally to accommodate increased load or data throughput.
Fault tolerance
These systems are designed for reliability. They can recover from failures without data loss or significant processing delays.
Stateful operations
Stream apps maintain context across data items. They can perform complex aggregations and pattern matching over time windows.
Event-time processing
These applications effectively handle out-of-order events. They can process data based on when events occurred rather than when they were received.
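
For instance, a small Akka Streams sketch (Scala; the Reading type and the rates are hypothetical) that continuously computes per-sensor averages over one-second windows illustrates continuous, low-latency, stateful processing:

  import scala.concurrent.duration._
  import akka.actor.ActorSystem
  import akka.stream.scaladsl.{Sink, Source}

  // Hypothetical event type; eventTime records when the reading was produced.
  final case class Reading(sensorId: String, value: Double, eventTime: Long)

  object WindowedAverages extends App {
    implicit val system: ActorSystem = ActorSystem("windowed-averages")

    // Stand-in for an unbounded input; a real app would consume a topic or socket.
    val readings = Source.tick(0.seconds, 50.millis, ())
      .zipWithIndex
      .map { case (_, i) =>
        Reading(s"sensor-${i % 3}", (i % 10).toDouble, System.currentTimeMillis())
      }

    readings
      .groupedWithin(1000, 1.second)   // tumbling one-second window (closed by arrival time)
      .map { batch =>                  // stateful aggregation over each window
        batch.groupBy(_.sensorId).map { case (id, rs) => id -> rs.map(_.value).sum / rs.size }
      }
      .runWith(Sink.foreach(println))  // one result per window, shortly after it closes
  }

Note that groupedWithin closes windows by arrival time; grouping on the carried eventTime instead (for example, on eventTime / 1000) is one way to approximate event-time windows when events arrive out of order.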

Akka components

In a typical Akka streaming service, the following components work together:
  1. The reference data view is maintained separately and provides reference data for the streaming flow.
  2. The input topic consumer component processes incoming topic messages. Messages are filtered, enriched with data queried from the reference data view, and then transformed into commands forwarded to the stream data entity (a sketch of this stage follows the list).
  3. The stream data entity processes and persists the commands, which trigger entity state changes.
  4. The final publish stream data consumer processes the stream data entity state changes and transforms them into messages posted to an output topic.
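
A rough sketch of the consumer stage (step 2) expressed with Akka Streams follows; the TopicMessage, ReferenceRow, and UpdateStreamData types and the lookupReference function are hypothetical stand-ins, and the Akka SDK provides dedicated consumer, view, and entity components for this, so treat it purely as an illustration of the data flow:

  import scala.concurrent.Future
  import akka.actor.ActorSystem
  import akka.stream.scaladsl.{Sink, Source}

  // Hypothetical types standing in for topic messages, view rows, and entity commands.
  final case class TopicMessage(deviceId: String, payload: String)
  final case class ReferenceRow(deviceId: String, region: String)
  final case class UpdateStreamData(deviceId: String, region: String, payload: String)

  object ConsumerStageSketch extends App {
    implicit val system: ActorSystem = ActorSystem("consumer-stage")
    import system.dispatcher

    // Stand-in for querying the reference data view.
    def lookupReference(deviceId: String): Future[Option[ReferenceRow]] =
      Future.successful(Some(ReferenceRow(deviceId, "eu-west")))

    // Stand-in for the input topic.
    val incoming = Source(List(
      TopicMessage("d1", "temp=21"), TopicMessage("d2", ""), TopicMessage("d3", "temp=19")))

    incoming
      .filter(_.payload.nonEmpty)                // filter
      .mapAsync(parallelism = 4) { msg =>        // enrich from the reference data view
        lookupReference(msg.deviceId).map(_.map(row =>
          UpdateStreamData(msg.deviceId, row.region, msg.payload)))
      }
      .collect { case Some(command) => command } // drop messages with no reference data
      .runWith(Sink.foreach(cmd => println(s"to stream data entity: $cmd"))) // stand-in for the entity call
  }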

How Akka enables streaming apps

Akka enables distributed streaming applications across local and cloud environments with a unified model for creating interacting streaming services. It handles continuous data flows with backpressure management and resilience in distributed systems. Additionally, Akka offers unique capabilities for streaming HTTP endpoints within the SDK, such as a stream-oriented API for efficiently managing HTTP requests and responses, dynamic endpoint handling for optimized connection management, and built-in connection pooling that scales to meet fluctuating demand.
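
For example, a minimal streaming HTTP endpoint can be sketched with Akka HTTP's routing DSL (the SDK provides its own endpoint components, so this is illustrative only; the path, port, and rates are arbitrary). The response body is itself a stream, and client demand propagates back through it as backpressure:

  import scala.concurrent.duration._
  import akka.actor.ActorSystem
  import akka.http.scaladsl.Http
  import akka.http.scaladsl.model.{ContentTypes, HttpEntity}
  import akka.http.scaladsl.server.Directives._
  import akka.stream.scaladsl.Source
  import akka.util.ByteString

  object StreamingEndpointSketch extends App {
    implicit val system: ActorSystem = ActorSystem("streaming-endpoint")

    val route =
      path("readings") {
        get {
          // The response entity is a stream: elements are produced only as fast
          // as the client consumes them, so a slow client slows the source down.
          val data = Source(1 to 1000)
            .throttle(100, 1.second)
            .map(n => ByteString(s"reading-$n\n"))
          complete(HttpEntity(ContentTypes.`text/plain(UTF-8)`, data))
        }
      }

    Http().newServerAt("localhost", 8080).bind(route)
  }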

Stream processing
Handles unbounded data flows with source, flow, and sink abstractions. Processes data through defined transformation stages with built-in flow control mechanisms.
Component architecture
Combines event-sourced entities, value entities, views, and consumers to process streaming data. Components maintain state consistency while handling continuous data flows through stream processing stages.
Async non-blocking backpressure
Implements non-blocking backpressure, allowing components to communicate processing capacity. This prevents system overload by regulating data flow, ensuring efficient resource utilization without compromising performance (see the first sketch after this list).
Message processing guarantees
Offers flexible message processing guarantees, from at-least-once to exactly-once semantics, allowing you to balance performance and consistency based on specific use case requirements (see the at-least-once sketch after this list).
Low latency, high throughput
Processes data with minimal delay between ingestion and output while sustaining high message rates through asynchronous, non-blocking stream stages.
Integration pipelines
Provides a DSL for building flexible data pipelines. It easily integrates various data sources and sinks, seamlessly connecting different systems and services in a streaming architecture.
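
As a small demonstration of non-blocking backpressure (an Akka Streams sketch; the element count, buffer size, and rate are arbitrary), a producer that could run far ahead is automatically slowed to the rate its downstream consumer can sustain:

  import scala.concurrent.duration._
  import akka.actor.ActorSystem
  import akka.stream.OverflowStrategy
  import akka.stream.scaladsl.{Sink, Source}

  object BackpressureSketch extends App {
    implicit val system: ActorSystem = ActorSystem("backpressure-sketch")

    Source(1 to 1000000)                          // a producer that could emit far too fast
      .buffer(64, OverflowStrategy.backpressure)  // bounded buffer: once full, upstream demand
                                                  // stops instead of messages piling up
      .throttle(10, 1.second)                     // downstream accepts only 10 elements/second
      .runWith(Sink.foreach(n => println(s"processed $n")))
  }

No thread blocks while waiting: the source is simply not pulled again until the downstream stages signal demand.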
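
And as a sketch of at-least-once processing using the Alpakka Kafka connector (assuming a broker at localhost:9092 and a topic named device-events, both placeholders; the process function is a stand-in for real work), the offset is committed only after a message has been processed, so a crash results in redelivery rather than data loss:

  import scala.concurrent.Future
  import akka.actor.ActorSystem
  import akka.kafka.{CommitterSettings, ConsumerSettings, Subscriptions}
  import akka.kafka.scaladsl.{Committer, Consumer}
  import akka.kafka.scaladsl.Consumer.DrainingControl
  import org.apache.kafka.common.serialization.StringDeserializer

  object AtLeastOnceSketch extends App {
    implicit val system: ActorSystem = ActorSystem("at-least-once")
    import system.dispatcher

    // Stand-in for the real processing step (enrich, update an entity, ...).
    def process(value: String): Future[Unit] = Future(println(s"processing $value"))

    val consumerSettings =
      ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
        .withBootstrapServers("localhost:9092")
        .withGroupId("stream-app")

    Consumer
      .committableSource(consumerSettings, Subscriptions.topics("device-events"))
      .mapAsync(1) { msg =>
        // Commit the offset only after processing succeeds: at-least-once semantics.
        process(msg.record.value()).map(_ => msg.committableOffset)
      }
      .toMat(Committer.sink(CommitterSettings(system)))(DrainingControl.apply)
      .run()
  }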

Dream11 quickly reduced cloud infrastructure costs by 30%


John Deere optimizes equipment use for higher yields & customer loyalty


Tubi boosts ad revenue with unique hyper-personalized experiences

Related content

Lightbend is now Akka

Multi-region replicated apps

Webinar: introducing Akka 3

Announcing Akka 3

Akka 3 - FAQ

Lightbend launches Akka 3 to make it easy to build and run apps that react to change; rebrands company as Akka

InfoQ webinar: the architect's guide to elasticity

Lightbend and Scalac partner to enable enterprises to leverage the power of Akka
