Joule Release 1.3.0
Joule cluster deployment with new transports and platform enhancements
Composable stream processing
The core building blocks for real-time data pipelines. Compose, enrich, transform, and act on events as they flow.
Joule processors are compact, modular, and observable by design, allowing you to build sophisticated streaming use cases using a declarative DSL, without boilerplate or operational overhead.
Processor features
Processors are the core of the Joule platform, each performing a specific task. Linked together, they form a complete use case.
Enrich analytical logic with static or dynamic contextual datasets for deeper insight and relevance.
Join independent stream events to trigger advanced analytics and dynamic business rules.
Apply RSA and AES encryption to selected event attributes using a rolling encryption key.
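As an illustration of attribute-level encryption, the sketch below encrypts a single event field with AES-GCM using the standard javax.crypto API. It shows the concept only; the key handling, field name, and the simplified "rolling key" are assumptions, not Joule's internal implementation, and the RSA case is omitted for brevity.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Base64;

// Minimal sketch: AES-GCM encryption of one event attribute.
// Key generation and rotation ("rolling key") are simplified assumptions.
public class AttributeEncryptionSketch {

    public static void main(String[] args) throws Exception {
        // In a real deployment the key would come from a managed, rotating key store.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey rollingKey = keyGen.generateKey();

        String creditCard = "4111-1111-1111-1111";   // hypothetical sensitive attribute
        System.out.println("encrypted attribute: " + encryptAttribute(creditCard, rollingKey));
    }

    static String encryptAttribute(String value, SecretKey key) throws Exception {
        byte[] iv = new byte[12];                    // 96-bit IV recommended for GCM
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] encrypted = cipher.doFinal(value.getBytes(StandardCharsets.UTF_8));

        // Prepend the IV so the attribute can be decrypted later.
        byte[] ivAndCipher = new byte[iv.length + encrypted.length];
        System.arraycopy(iv, 0, ivAndCipher, 0, iv.length);
        System.arraycopy(encrypted, 0, ivAndCipher, iv.length, encrypted.length);
        return Base64.getEncoder().encodeToString(ivAndCipher);
    }
}
```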
Processors can be combined flexibly into pipelines, allowing custom configurations for specific use cases.
Events are processed sequentially and in real time, supporting high-speed, low-latency applications.
Joule offers a wide range of ready-to-use processors for common tasks, accelerating deployment.
All processors provide real-time metrics accessible through JMX, enhancing monitoring and troubleshooting.
The Processor SDK allows developers to create custom processors, extending functionality to meet unique business needs.
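To make the SDK and JMX points above concrete, here is a minimal sketch of a custom processor that tags events and publishes a processed-event counter as a JMX attribute. The `Processor` interface and map-based event type are hypothetical stand-ins, not the actual Joule SDK; only the javax.management registration is standard Java.

```java
import javax.management.MBeanServer;
import javax.management.ObjectName;
import java.lang.management.ManagementFactory;
import java.util.Map;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical stand-in for the Joule Processor SDK contract.
interface Processor {
    Map<String, Object> process(Map<String, Object> event) throws Exception;
}

// Standard MBean interface exposing a single metric (name follows the
// <ClassName>MBean convention required for standard MBeans).
interface TaggingProcessorMBean {
    long getProcessedEvents();
}

// A custom processor that tags events and exposes a processed-event
// counter through JMX for monitoring and troubleshooting.
public class TaggingProcessor implements Processor, TaggingProcessorMBean {

    private final AtomicLong processedEvents = new AtomicLong();

    public TaggingProcessor() throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        server.registerMBean(this, new ObjectName("example.joule:type=TaggingProcessor"));
    }

    @Override
    public Map<String, Object> process(Map<String, Object> event) {
        processedEvents.incrementAndGet();
        event.put("processedBy", "TaggingProcessor");   // illustrative transformation
        return event;
    }

    @Override
    public long getProcessedEvents() {
        return processedEvents.get();
    }
}
```

Once registered, the counter appears under `example.joule:type=TaggingProcessor` in any JMX console such as JConsole.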
Define use cases, reuse modules, and start creating from day one
The Joule + DuckDB integration unlocks advanced streaming analytics by embedding DuckDB, a high-performance in-process analytical database, directly into the Joule runtime.
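To give a sense of what this embedding enables, DuckDB can run entirely in-process over JDBC, so contextual data can be queried alongside the stream without a separate database server. The sketch below uses the standard DuckDB JDBC driver; the table and lookup are illustrative, not Joule's actual integration code.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch: an in-process DuckDB database used as a contextual lookup.
// Requires the org.duckdb:duckdb_jdbc dependency on the classpath.
public class DuckDbEnrichmentSketch {

    public static void main(String[] args) throws Exception {
        // "jdbc:duckdb:" with no path opens a transient in-memory database.
        try (Connection conn = DriverManager.getConnection("jdbc:duckdb:")) {

            try (Statement stmt = conn.createStatement()) {
                stmt.execute("CREATE TABLE devices(device_id VARCHAR, region VARCHAR)");
                stmt.execute("INSERT INTO devices VALUES ('sensor-42', 'EMEA')");
            }

            // Enrich a streaming event attribute with reference data held in DuckDB.
            String sql = "SELECT region FROM devices WHERE device_id = ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, "sensor-42");
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println("region = " + rs.getString("region"));
                    }
                }
            }
        }
    }
}
```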
Enriching events with contextual data is key to advanced streaming use cases
Joule now ships with real-time inferencing, enabling advanced use cases
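As a hedged illustration of per-event inferencing, the sketch below scores each incoming event with a pre-trained logistic regression whose feature names and weights are hard-coded assumptions; a real deployment would load a managed model rather than inline coefficients.

```java
import java.util.List;
import java.util.Map;

// Minimal sketch: score each event with a pre-trained logistic regression.
// The weights and feature names are illustrative assumptions only.
public class InferencingSketch {

    // Hypothetical coefficients for the features [amount, velocity].
    private static final double[] WEIGHTS = {0.0042, 1.3};
    private static final double BIAS = -3.5;

    public static void main(String[] args) {
        List<Map<String, Double>> events = List.of(
                Map.of("amount", 120.0, "velocity", 0.4),
                Map.of("amount", 980.0, "velocity", 2.1));

        for (Map<String, Double> event : events) {
            double score = score(event);
            // Act on the prediction in real time, e.g. flag a likely anomaly.
            System.out.printf("risk=%.3f flagged=%b%n", score, score > 0.8);
        }
    }

    static double score(Map<String, Double> event) {
        double z = BIAS
                + WEIGHTS[0] * event.get("amount")
                + WEIGHTS[1] * event.get("velocity");
        return 1.0 / (1.0 + Math.exp(-z));   // sigmoid -> probability-like score
    }
}
```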