├── .tool-versions ├── ene_kafka ├── src │ ├── messages │ │ ├── cloud_events │ │ │ ├── mod.rs │ │ │ └── cloud_event.rs │ │ ├── mod.rs │ │ ├── kafka_message.rs │ │ └── rdkafka_impl.rs │ ├── consumers │ │ ├── mod.rs │ │ ├── rdkafka_impl.rs │ │ └── consumer.rs │ ├── producers │ │ ├── mod.rs │ │ ├── rdkafka_impl.rs │ │ └── producer.rs │ ├── lib.rs │ ├── handlers │ │ └── mod.rs │ ├── dispatchers │ │ └── mod.rs │ └── admins │ │ ├── mod.rs │ │ └── rdkafka_impl.rs └── Cargo.toml ├── md_assets ├── ene-kafka.png └── architecture.svg ├── docs └── consumer_architecture.md ├── .gitignore ├── .github ├── workflows │ └── rust.yml ├── dependabot.yml ├── ISSUE_TEMPLATE │ ├── feature_request.md │ └── bug_report.md └── actions │ └── setup-builder │ └── action.yaml ├── ene_kafka_derive ├── Cargo.toml └── src │ ├── deserialize_from.rs │ ├── handler.rs │ ├── cloud_event.rs │ ├── kafka_message.rs │ └── lib.rs ├── ene_kafka_examples ├── kafka_admin.rs ├── Cargo.toml ├── kafka_producer.rs ├── events_custom_serde.rs └── kafka_consumer.rs ├── Cargo.toml ├── CODE_OF_CONDUCT.md ├── readme.md └── LICENSE /.tool-versions: -------------------------------------------------------------------------------- 1 | rust 1.70 -------------------------------------------------------------------------------- /ene_kafka/src/messages/cloud_events/mod.rs: -------------------------------------------------------------------------------- 1 | pub mod cloud_event; 2 | -------------------------------------------------------------------------------- /ene_kafka/src/consumers/mod.rs: -------------------------------------------------------------------------------- 1 | pub mod consumer; 2 | pub mod rdkafka_impl; 3 | -------------------------------------------------------------------------------- /ene_kafka/src/producers/mod.rs: -------------------------------------------------------------------------------- 1 | pub mod producer; 2 | pub mod rdkafka_impl; 3 | -------------------------------------------------------------------------------- /md_assets/ene-kafka.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ene-rs/ene-kafka/HEAD/md_assets/ene-kafka.png -------------------------------------------------------------------------------- /ene_kafka/src/messages/mod.rs: -------------------------------------------------------------------------------- 1 | pub mod cloud_events; 2 | pub mod kafka_message; 3 | pub mod rdkafka_impl; 4 | -------------------------------------------------------------------------------- /docs/consumer_architecture.md: -------------------------------------------------------------------------------- 1 | The current consumer architecture of Ene Kafka is as follows: 2 | ![Architecture](../md_assets/architecture.svg) -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Generated by Cargo 2 | # will have compiled files and executables 3 | debug/ 4 | target/ 5 | 6 | # Remove Cargo.lock from gitignore if creating an executable, leave it for libraries 7 | # More information here https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html 8 | Cargo.lock 9 | 10 | # These are backup files generated by rustfmt 11 | **/*.rs.bk 12 | 13 | # MSVC Windows builds of rustc generate these, which store debugging information 14 | *.pdb 15 | .vscode/ 16 | .idea/ --------------------------------------------------------------------------------
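Before the individual source files, here is a compact illustration of the consumer architecture referenced in docs/consumer_architecture.md above: events flow from the Kafka consumer into a generated dispatcher, which routes each cloud event to a matching handler, and undeliverable events go to a dead-letter topic. This is a minimal sketch distilled from ene_kafka_examples/kafka_consumer.rs later in this dump; the topic names, group id, and broker address are placeholder assumptions.

```rust
use ene_kafka::handlers::EventHandler;
use ene_kafka::kafka_consumer;
use ene_kafka::messages::kafka_message::{ContentType, KafkaTopic};
use ene_kafka_derive::{CloudEvent, DeserializeFrom, EventHandler, KafkaMessage};
use serde::{Deserialize, Serialize};

// A cloud event: the derives wire up serialization, CloudEvent headers,
// and deserialization from incoming Kafka messages.
#[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize, DeserializeFrom)]
#[kafka(topic = "entities", serde = Json, key = entity_id, headers = CloudEvent)]
#[cloud_event(
    content_type = "application/json",
    version = "1.0",
    event_type = "com.ene.entity.created.v1",
    event_source = "https://example.com/docs/entity/created",
    id = entity_id
)]
struct EntityCreated {
    pub entity_id: i64,
}

// A handler reacts to exactly one event type; the dispatcher generated by
// kafka_consumer! asks each handler whether the incoming ce_type matches.
#[derive(EventHandler)]
#[event_handler(event = EntityCreated, handler = on_created)]
struct CreatedHandler {}

impl CreatedHandler {
    async fn on_created(&self, event: &EntityCreated) -> ene_kafka::KafkaResult<()> {
        println!("consumed: {:?}", event);
        Ok(())
    }
}

#[tokio::main]
async fn main() -> ene_kafka::KafkaResult<()> {
    let consumer = kafka_consumer!(
        topic = KafkaTopic { name: "entities".to_string(), content_type: ContentType::Json },
        dlq_topic = KafkaTopic { name: "entities-dlq".to_string(), content_type: ContentType::Json },
        consumer_group_id = "example-group",
        bootstrap_servers = "localhost:9092".to_string(),
        handlers = {
            created_handler: CreatedHandler = CreatedHandler {}
        }
    );
    // Blocks the current task: receive -> dispatch -> commit, DLQ on failure.
    consumer.start().await;
    Ok(())
}
```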
/ene_kafka/src/lib.rs: -------------------------------------------------------------------------------- 1 | use rdkafka::client::DefaultClientContext; 2 | 3 | pub mod admins; 4 | pub mod consumers; 5 | pub mod dispatchers; 6 | pub mod handlers; 7 | pub mod messages; 8 | pub mod producers; 9 | 10 | pub type KafkaResult<T> = anyhow::Result<T>; 11 | 12 | #[cfg(feature = "rdkafka")] 13 | pub type ConsumerImpl = rdkafka::consumer::StreamConsumer; 14 | #[cfg(feature = "rdkafka")] 15 | pub type ProducerImpl = rdkafka::producer::FutureProducer; 16 | #[cfg(feature = "rdkafka")] 17 | pub type AdminImpl = rdkafka::admin::AdminClient<DefaultClientContext>; 18 | -------------------------------------------------------------------------------- /.github/workflows/rust.yml: -------------------------------------------------------------------------------- 1 | name: Rust 2 | 3 | on: 4 | push: 5 | branches: [ "main" ] 6 | pull_request: 7 | branches: [ "main" ] 8 | 9 | env: 10 | CARGO_TERM_COLOR: always 11 | 12 | jobs: 13 | build: 14 | 15 | runs-on: ubuntu-latest 16 | 17 | steps: 18 | - uses: actions/checkout@v4 19 | - name: Setup Rust toolchain 20 | uses: ./.github/actions/setup-builder 21 | with: 22 | rust-version: stable 23 | - name: Build 24 | run: cargo build --verbose 25 | - name: Run tests 26 | run: cargo test --verbose 27 | -------------------------------------------------------------------------------- /.github/dependabot.yml: -------------------------------------------------------------------------------- 1 | # To get started with Dependabot version updates, you'll need to specify which 2 | # package ecosystems to update and where the package manifests are located. 3 | # Please see the documentation for all configuration options: 4 | # https://docs.github.com/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file 5 | 6 | version: 2 7 | updates: 8 | - package-ecosystem: cargo # See documentation for possible values 9 | directory: "/" # Location of package manifests 10 | schedule: 11 | interval: "weekly" 12 | target-branch: main 13 | labels: [auto-dependencies] 14 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/feature_request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Feature request 3 | about: Suggest an idea for this project 4 | title: '' 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Is your feature request related to a problem? Please describe.** 11 | A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] 12 | 13 | **Describe the solution you'd like** 14 | A clear and concise description of what you want to happen. 15 | 16 | **Describe alternatives you've considered** 17 | A clear and concise description of any alternative solutions or features you've considered. 18 | 19 | **Additional context** 20 | Add any other context or screenshots about the feature request here.
21 | -------------------------------------------------------------------------------- /ene_kafka_derive/Cargo.toml: -------------------------------------------------------------------------------- 1 | [package] 2 | name = "ene_kafka_derive" 3 | version = { workspace = true } 4 | edition = { workspace = true } 5 | repository = { workspace = true } 6 | license = { workspace = true } 7 | authors = { workspace = true } 8 | rust-version = { workspace = true } 9 | readme = { workspace = true } 10 | keywords = { workspace = true } 11 | description = "Derive macros used for Ene Kafka" 12 | 13 | # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html 14 | 15 | [dependencies] 16 | deluxe = {workspace = true} 17 | proc-macro2 = {workspace = true} 18 | quote = {workspace = true} 19 | syn = {workspace = true} 20 | 21 | 22 | [lib] 23 | proc-macro = true 24 | -------------------------------------------------------------------------------- /ene_kafka_examples/kafka_admin.rs: -------------------------------------------------------------------------------- 1 | use std::env; 2 | 3 | use ene_kafka::admins::KafkaAdminInterface; 4 | use ene_kafka::kafka_admin; 5 | 6 | /// This example demonstrates how the Kafka Admin can be used 7 | #[tokio::main] 8 | async fn main() -> ene_kafka::KafkaResult<()> { 9 | env::set_var("RUST_LOG", "debug"); 10 | tracing_subscriber::fmt() 11 | .with_env_filter(tracing_subscriber::EnvFilter::from_default_env()) 12 | .init(); 13 | let bootstrap_servers = "localhost:9092".to_string(); 14 | 15 | let admin = kafka_admin!( 16 | bootstrap_servers = bootstrap_servers, 17 | request_time_out_ms = "50000".to_string(), 18 | connection_max_idle_ms = "0".to_string() 19 | ); 20 | 21 | let is_topic_live = admin.check_topic_liveness("test").await?; 22 | println!("Is topic live: {}", is_topic_live); 23 | 24 | Ok(()) 25 | } 26 | -------------------------------------------------------------------------------- /ene_kafka/src/handlers/mod.rs: -------------------------------------------------------------------------------- 1 | use async_trait::async_trait; 2 | 3 | use crate::messages::cloud_events::cloud_event::{CloudEvent, DeserializeFrom, EventType}; 4 | 5 | #[async_trait] 6 | pub trait EventHandler< 7 | InputEvent: CloudEvent<String, String>, 8 | HandlableEvent: CloudEvent<String, String> + DeserializeFrom<InputEvent>, 9 | > 10 | { 11 | fn can_handle(&self, event: &InputEvent) -> anyhow::Result<bool> { 12 | Ok(event.event_type()? == self.event_type()?)
13 | } 14 | 15 | fn event_type(&self) -> anyhow::Result<EventType>; 16 | 17 | async fn deserialize_and_handle(&self, event: &InputEvent) -> anyhow::Result<()> { 18 | let deserialized_event = HandlableEvent::deserialize_from(&event)?; 19 | self.handle(&deserialized_event).await 20 | } 21 | 22 | async fn handle(&self, event: &HandlableEvent) -> anyhow::Result<()>; 23 | } 24 | -------------------------------------------------------------------------------- /ene_kafka/Cargo.toml: -------------------------------------------------------------------------------- 1 | [package] 2 | name = "ene_kafka" 3 | version = { workspace = true } 4 | edition = { workspace = true } 5 | repository = { workspace = true } 6 | license = { workspace = true } 7 | authors = { workspace = true } 8 | rust-version = { workspace = true } 9 | readme = { workspace = true } 10 | keywords = { workspace = true } 11 | description = { workspace = true } 12 | 13 | # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html 14 | 15 | [dependencies] 16 | anyhow = {workspace = true} 17 | async-trait = {workspace = true} 18 | chrono = {workspace = true} 19 | rdkafka = {workspace = true} 20 | serde = {workspace = true} 21 | serde_json = {workspace = true} 22 | tokio = {workspace = true} 23 | uuid = {workspace = true} 24 | tracing = {workspace = true} 25 | tracing-subscriber = {workspace = true} 26 | 27 | [features] 28 | default = ["rdkafka"] 29 | rdkafka = [] -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/bug_report.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Bug report 3 | about: Create a report to help us improve 4 | title: '' 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Describe the bug** 11 | A clear and concise description of what the bug is. 12 | 13 | **To Reproduce** 14 | Steps to reproduce the behavior: 15 | 1. Go to '...' 16 | 2. Click on '....' 17 | 3. Scroll down to '....' 18 | 4. See error 19 | 20 | **Expected behavior** 21 | A clear and concise description of what you expected to happen. 22 | 23 | **Screenshots** 24 | If applicable, add screenshots to help explain your problem. 25 | 26 | **Desktop (please complete the following information):** 27 | - OS: [e.g. iOS] 28 | - Browser [e.g. chrome, safari] 29 | - Version [e.g. 22] 30 | 31 | **Smartphone (please complete the following information):** 32 | - Device: [e.g. iPhone6] 33 | - OS: [e.g. iOS8.1] 34 | - Browser [e.g. stock browser, safari] 35 | - Version [e.g. 22] 36 | 37 | **Additional context** 38 | Add any other context about the problem here.
39 | -------------------------------------------------------------------------------- /ene_kafka_examples/Cargo.toml: -------------------------------------------------------------------------------- 1 | [package] 2 | name = "ene_kafka_examples" 3 | description = "Ene Kafka usage examples" 4 | keywords = ["kafka"] 5 | publish = false 6 | readme = "README.md" 7 | version = { workspace = true } 8 | edition = { workspace = true } 9 | repository = { workspace = true } 10 | license = { workspace = true } 11 | authors = { workspace = true } 12 | rust-version = { workspace = true } 13 | 14 | [lints] 15 | workspace = true 16 | 17 | [[example]] 18 | name = "kafka_admin" 19 | path = "kafka_admin.rs" 20 | 21 | [[example]] 22 | name = "kafka_consumer" 23 | path = "kafka_consumer.rs" 24 | 25 | [[example]] 26 | name = "kafka_producer" 27 | path = "kafka_producer.rs" 28 | 29 | [[example]] 30 | name = "events_custom_serde" 31 | path = "events_custom_serde.rs" 32 | 33 | [dev-dependencies] 34 | ene_kafka = { workspace = true } 35 | ene_kafka_derive = { workspace = true } 36 | tokio = {workspace = true} 37 | tracing-subscriber = {workspace = true} 38 | serde = {workspace = true} 39 | anyhow = {workspace = true} 40 | serde_json = {workspace = true} 41 | uuid = {workspace = true} 42 | chrono = {workspace = true} 43 | async-trait = {workspace = true} -------------------------------------------------------------------------------- /ene_kafka_derive/src/deserialize_from.rs: -------------------------------------------------------------------------------- 1 | use syn::DeriveInput; 2 | 3 | use crate::kafka_message::KafkaMessageAttributes; 4 | 5 | pub fn deserialize_from_derive_macro2( 6 | input: proc_macro2::TokenStream, 7 | ) -> deluxe::Result<proc_macro2::TokenStream> { 8 | let mut ast: DeriveInput = syn::parse2(input.into())?; 9 | let KafkaMessageAttributes { serde, .. }: KafkaMessageAttributes = 10 | deluxe::extract_attributes(&mut ast)?; 11 | let struct_name = &ast.ident; 12 | 13 | Ok(quote::quote! { 14 | impl<Event: ene_kafka::messages::cloud_events::cloud_event::CloudEvent<String, String>> ene_kafka::messages::cloud_events::cloud_event::DeserializeFrom<Event> for #struct_name { 15 | fn deserialize_from(value: &Event) -> ene_kafka::KafkaResult<Self> { 16 | match ene_kafka::messages::kafka_message::ContentType::#serde { 17 | ene_kafka::messages::kafka_message::ContentType::Json => { 18 | Ok(serde_json::from_str::<#struct_name>(value.payload()?.as_str())?)
19 | } 20 | } 21 | } 22 | } 23 | }) 24 | } 25 | -------------------------------------------------------------------------------- /ene_kafka/src/messages/kafka_message.rs: -------------------------------------------------------------------------------- 1 | use std::collections::HashMap; 2 | 3 | use anyhow::Result; 4 | 5 | pub type HeaderKey = String; 6 | pub type HeaderValue = String; 7 | pub type Headers = HashMap<HeaderKey, HeaderValue>; 8 | 9 | pub trait ToBytes { 10 | fn to_bytes(&self) -> Result<Vec<u8>>; 11 | } 12 | 13 | impl ToBytes for String { 14 | fn to_bytes(&self) -> Result<Vec<u8>> { 15 | Ok(self.as_bytes().to_vec()) 16 | } 17 | } 18 | 19 | #[derive(Debug, Clone)] 20 | pub enum ContentType { 21 | Json, 22 | } 23 | 24 | impl ContentType { 25 | pub fn from_str(content_type: &str) -> Result<Self> { 26 | match content_type { 27 | "json" => Ok(Self::Json), 28 | _ => Err(anyhow::anyhow!("Invalid content type")), 29 | } 30 | } 31 | } 32 | 33 | #[derive(Debug, Clone)] 34 | pub struct KafkaTopic { 35 | pub name: String, 36 | pub content_type: ContentType, 37 | } 38 | 39 | pub trait KafkaMessage<Key: ToBytes, Payload: ToBytes>: Sync + Send { 40 | fn topic(&self) -> anyhow::Result<KafkaTopic>; 41 | fn payload(&self) -> anyhow::Result<Payload>; 42 | fn key(&self) -> anyhow::Result<Key>; 43 | fn headers(&self) -> anyhow::Result<Headers>; 44 | } 45 | -------------------------------------------------------------------------------- /Cargo.toml: -------------------------------------------------------------------------------- 1 | [workspace] 2 | members = [ 3 | "ene_kafka", 4 | "ene_kafka_derive", 5 | "ene_kafka_examples" 6 | ] 7 | resolver = "2" 8 | 9 | [workspace.package] 10 | authors = ["Abdullah Sabaa Allil"] 11 | edition = "2021" 12 | license = "Apache-2.0" 13 | readme = "README.md" 14 | rust-version = "1.70" 15 | version = "0.3.0" 16 | keywords = ["kafka", "microservices", "ene-rs", "ene-kafka", "pubsub"] 17 | description = "Ene Kafka is an easy-to-use Rust client for Apache Kafka" 18 | repository = "https://github.com/ene-rs/ene-kafka" 19 | 20 | [workspace.dependencies] 21 | anyhow = "1.0.86" 22 | async-trait = "0.1.82" 23 | chrono = "0.4.38" 24 | rdkafka = "0.37.0" 25 | serde = "1.0.209" 26 | serde_json = "1.0.128" 27 | tokio = { version = "1.40.0", features = ["rt", "rt-multi-thread", "macros"] } 28 | uuid = {version = "1.10.0", features = ["v4"]} 29 | tracing = "0.1.40" 30 | tracing-subscriber = { version = "0.3.18", features = ["env-filter", "fmt", "json"] } 31 | deluxe = "0.5.0" 32 | proc-macro2 = "1.0.86" 33 | quote = "1.0.37" 34 | syn = "2.0.77" 35 | ene_kafka = { path = "ene_kafka" } 36 | ene_kafka_derive = { path = "ene_kafka_derive" } 37 | 38 | 39 | [workspace.lints.rust] 40 | unused_imports = "deny" 41 | 42 | [profile.release] 43 | codegen-units = 1 44 | lto = true -------------------------------------------------------------------------------- /ene_kafka_derive/src/handler.rs: -------------------------------------------------------------------------------- 1 | use syn::DeriveInput; 2 | 3 | #[derive(deluxe::ExtractAttributes)] 4 | #[deluxe(attributes(event_handler))] 5 | struct HandlerAttributes { 6 | event: syn::ExprPath, 7 | handler: syn::Ident, 8 | } 9 | 10 | pub fn handler_derive_macro2( 11 | input: proc_macro2::TokenStream, 12 | ) -> deluxe::Result<proc_macro2::TokenStream> { 13 | let mut ast: DeriveInput = syn::parse2(input.into())?; 14 | let HandlerAttributes { event, handler }: HandlerAttributes = 15 | deluxe::extract_attributes(&mut ast)?; 16 | let struct_name = &ast.ident; 17 | 18 | let event_path = event.path; 19 | 20 | Ok(quote::quote!
{ 21 | #[async_trait::async_trait] 22 | impl<InputEvent: ene_kafka::messages::cloud_events::cloud_event::CloudEvent<String, String>> EventHandler<InputEvent, #event_path> for #struct_name { 23 | fn event_type(&self) -> ene_kafka::KafkaResult<ene_kafka::messages::cloud_events::cloud_event::EventType> { 24 | use ene_kafka::messages::cloud_events::cloud_event::CloudEvent; 25 | #event_path::entity_event_type() 26 | } 27 | 28 | async fn handle(&self, event: &#event_path) -> ene_kafka::KafkaResult<()> { 29 | #struct_name::#handler(self, event).await 30 | } 31 | } 32 | }) 33 | } 34 | -------------------------------------------------------------------------------- /ene_kafka_examples/kafka_producer.rs: -------------------------------------------------------------------------------- 1 | use serde::{Deserialize, Serialize}; 2 | 3 | use ene_kafka::producers::producer::KafkaProducerInterface; 4 | use ene_kafka::{kafka_producer, producers::producer::KafkaProducer}; 5 | use ene_kafka_derive::{CloudEvent, DeserializeFrom, KafkaMessage}; 6 | 7 | #[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize, DeserializeFrom)] 8 | #[kafka(topic = "test", serde = Json, key = entity_id, headers = CloudEvent)] 9 | #[cloud_event( 10 | content_type = "application/json", 11 | version = "1.0", 12 | event_type = "com.ene.entity.updated.v1", 13 | event_source = "https://ene-kafka.com/docs/cloudevents/entity/updated", 14 | id = entity_id 15 | )] 16 | struct EntityUpdated { 17 | pub entity_id: i64, 18 | pub organisation_id: i64, 19 | } 20 | 21 | #[tokio::main] 22 | async fn main() -> ene_kafka::KafkaResult<()> { 23 | tracing_subscriber::fmt() 24 | .with_env_filter(tracing_subscriber::EnvFilter::from_default_env()) 25 | .init(); 26 | let bootstrap_servers = "localhost:9092".to_string(); 27 | 28 | let producer: KafkaProducer = kafka_producer!(bootstrap_servers = bootstrap_servers.clone()); 29 | let event = EntityUpdated { 30 | entity_id: 1755, 31 | organisation_id: 42, 32 | }; 33 | 34 | producer.send(event).await?; 35 | Ok(()) 36 | } 37 | -------------------------------------------------------------------------------- /ene_kafka/src/messages/cloud_events/cloud_event.rs: -------------------------------------------------------------------------------- 1 | use std::collections::HashMap; 2 | 3 | use crate::messages::kafka_message::{Headers, KafkaMessage, ToBytes}; 4 | 5 | pub trait CloudEvent<Key: ToBytes, Payload: ToBytes>: 6 | KafkaMessage<Key, Payload> + Sync + Send 7 | { 8 | fn spec_version(&self) -> anyhow::Result<String>; 9 | fn event_type(&self) -> anyhow::Result<EventType>; 10 | fn event_source(&self) -> anyhow::Result<String>; 11 | fn event_id(&self) -> anyhow::Result<String>; 12 | fn event_time(&self) -> anyhow::Result<String>; 13 | fn event_content_type(&self) -> anyhow::Result<String>; 14 | 15 | fn entity_event_type() -> anyhow::Result<EventType>; 16 | 17 | fn cloud_event_headers(&self) -> anyhow::Result<Headers> { 18 | Ok(HashMap::from([ 19 | (String::from("ce_specversion"), self.spec_version()?), 20 | (String::from("ce_type"), self.event_type()?), 21 | (String::from("ce_source"), self.event_source()?), 22 | (String::from("ce_id"), self.event_id()?), 23 | (String::from("ce_time"), self.event_time()?), 24 | (String::from("content_type"), self.event_content_type()?), 25 | ])) 26 | } 27 | } 28 | 29 | pub trait DeserializeFrom<InputEvent: CloudEvent<String, String>> { 30 | fn deserialize_from(event: &InputEvent) -> anyhow::Result<Self> 31 | where 32 | Self: Sized; 33 | } 34 | 35 | pub type EventType = String; 36 | -------------------------------------------------------------------------------- /ene_kafka_examples/events_custom_serde.rs: -------------------------------------------------------------------------------- 1 | use serde::{Deserialize, Serialize}; 2 | 3 | use ene_kafka::messages::cloud_events::cloud_event::{CloudEvent,
DeserializeFrom}; 4 | use ene_kafka::producers::producer::KafkaProducerInterface; 5 | use ene_kafka::{kafka_producer, producers::producer::KafkaProducer}; 6 | use ene_kafka_derive::{CloudEvent, KafkaMessage}; 7 | 8 | #[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize)] 9 | #[kafka(topic = "test", serde = Json, key = entity_id, headers = CloudEvent)] 10 | #[cloud_event( 11 | content_type = "application/json", 12 | version = "1.0", 13 | event_type = "com.ene.entity.created.v1", 14 | event_source = "https://ene-kafka.com/docs/cloudevents/entity/created", 15 | id = entity_id 16 | )] 17 | struct EntityCreated { 18 | pub entity_id: i64, 19 | pub organisation_id: i64, 20 | } 21 | 22 | impl<Event: CloudEvent<String, String>> DeserializeFrom<Event> for EntityCreated { 23 | fn deserialize_from(value: &Event) -> ene_kafka::KafkaResult<Self> { 24 | Ok(serde_json::from_str::<EntityCreated>( 25 | value.payload()?.as_str(), 26 | )?) 27 | } 28 | } 29 | 30 | #[tokio::main] 31 | async fn main() -> ene_kafka::KafkaResult<()> { 32 | tracing_subscriber::fmt() 33 | .with_env_filter(tracing_subscriber::EnvFilter::from_default_env()) 34 | .init(); 35 | let bootstrap_servers = "localhost:9092".to_string(); 36 | 37 | let producer: KafkaProducer = kafka_producer!(bootstrap_servers = bootstrap_servers.clone()); 38 | let event = EntityCreated { 39 | entity_id: 1755, 40 | organisation_id: 42, 41 | }; 42 | 43 | producer.send(event).await?; 44 | Ok(()) 45 | } 46 | -------------------------------------------------------------------------------- /.github/actions/setup-builder/action.yaml: -------------------------------------------------------------------------------- 1 | # Licensed to the Apache Software Foundation (ASF) under one 2 | # or more contributor license agreements. See the NOTICE file 3 | # distributed with this work for additional information 4 | # regarding copyright ownership. The ASF licenses this file 5 | # to you under the Apache License, Version 2.0 (the 6 | # "License"); you may not use this file except in compliance 7 | # with the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, 12 | # software distributed under the License is distributed on an 13 | # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY 14 | # KIND, either express or implied. See the License for the 15 | # specific language governing permissions and limitations 16 | # under the License. 17 | 18 | name: Prepare Rust Builder 19 | description: 'Prepare Rust Build Environment' 20 | inputs: 21 | rust-version: 22 | description: 'version of rust to install (e.g.
stable)' 23 | required: true 24 | default: 'stable' 25 | targets: 26 | description: 'The toolchain targets to add, comma-separated' 27 | default: '' 28 | 29 | runs: 30 | using: "composite" 31 | steps: 32 | - name: Setup Rust Toolchain 33 | shell: bash 34 | run: | 35 | echo "Installing ${{ inputs.rust-version }}" 36 | if [ -n "${{ inputs.targets}}" ]; then 37 | rustup toolchain install ${{ inputs.rust-version }} -t ${{ inputs.targets }} 38 | else 39 | rustup toolchain install ${{ inputs.rust-version }} 40 | fi 41 | rustup default ${{ inputs.rust-version }} 42 | rustup component add rustfmt clippy -------------------------------------------------------------------------------- /ene_kafka/src/dispatchers/mod.rs: -------------------------------------------------------------------------------- 1 | use async_trait::async_trait; 2 | 3 | use crate::messages::cloud_events::cloud_event::CloudEvent; 4 | 5 | #[async_trait] 6 | pub trait EventDispatcher: Send + Sync { 7 | async fn dispatch_event<Event: CloudEvent<String, String>>( 8 | &self, 9 | event: &Event, 10 | ) -> anyhow::Result<()>; 11 | } 12 | 13 | /// A macro to generate an event dispatcher struct that will dispatch events to the appropriate handlers 14 | /// based on the event type. 15 | /// The macro expects a list of handlers that will be used to dispatch the events. 16 | #[macro_export] 17 | macro_rules! generate_event_dispatcher { 18 | ($($handler_name: ident: $handler_type: ident $(< $( $generic_identifier:tt $( : $identifier_constraint:tt $(+ $identifier_additions:tt )* )? ),+ >)?),*) => { 19 | struct CloudEventDispatcher { 20 | $( 21 | $handler_name: $handler_type $(< $( $generic_identifier $( : $identifier_constraint $(+ $identifier_additions )* )? ),+ >)?, 22 | )* 23 | } 24 | 25 | 26 | #[async_trait::async_trait] 27 | impl ene_kafka::dispatchers::EventDispatcher for CloudEventDispatcher { 28 | 29 | async fn dispatch_event<Event: ene_kafka::messages::cloud_events::cloud_event::CloudEvent<String, String>>(&self, event: &Event) -> anyhow::Result<()> { 30 | use ene_kafka::handlers::EventHandler; 31 | $( 32 | if self.$handler_name.can_handle(event)?
{ 33 | return self.$handler_name.deserialize_and_handle(event).await; 34 | } 35 | )* 36 | anyhow::bail!("No handler found for event type {:?}", event.event_type()?); 37 | } 38 | } 39 | } 40 | } 41 | -------------------------------------------------------------------------------- /ene_kafka/src/producers/rdkafka_impl.rs: -------------------------------------------------------------------------------- 1 | use async_trait::async_trait; 2 | use rdkafka::{ 3 | producer::{FutureProducer, FutureRecord}, 4 | util::Timeout, 5 | ClientConfig, 6 | }; 7 | 8 | use crate::messages::{ 9 | kafka_message::{KafkaMessage, ToBytes}, 10 | rdkafka_impl::ToRdkafkaHeaders, 11 | }; 12 | 13 | use super::producer::KafkaProducerInterface; 14 | 15 | #[async_trait] 16 | impl KafkaProducerInterface for FutureProducer { 17 | async fn send<Message: KafkaMessage<String, String>>( 18 | &self, 19 | message: Message, 20 | ) -> anyhow::Result<()> { 21 | let payload = message.payload()?.to_bytes()?; 22 | let key = message.key()?.to_bytes()?; 23 | let topic = message.topic()?; 24 | let record: FutureRecord<'_, Vec<u8>, Vec<u8>> = FutureRecord::<Vec<u8>, Vec<u8>> { 25 | topic: topic.name.as_str(), 26 | partition: None, 27 | payload: Some(&payload), 28 | key: Some(&key), 29 | timestamp: None, 30 | headers: Some(message.headers()?.to_rdkafka_headers()?), 31 | }; 32 | let delivery_status = FutureProducer::send(self, record, Timeout::Never).await; 33 | match delivery_status { 34 | Ok(_) => Ok(()), 35 | Err(e) => Err(anyhow::anyhow!(format!("Failed to produce event: {:?}", e))), 36 | } 37 | } 38 | 39 | fn new(bootstrap_servers: String) -> Self { 40 | // TODO: configure 41 | ClientConfig::new() 42 | .set("request.required.acks", "all") 43 | .set("bootstrap.servers", bootstrap_servers) 44 | .set("message.timeout.ms", "5000") 45 | .create() 46 | .expect("producers::rdkafka_impl - failed to create producer") 47 | } 48 | } 49 | -------------------------------------------------------------------------------- /ene_kafka_derive/src/cloud_event.rs: -------------------------------------------------------------------------------- 1 | use syn::DeriveInput; 2 | 3 | #[derive(deluxe::ExtractAttributes)] 4 | #[deluxe(attributes(cloud_event))] 5 | struct CloudEventAttributes { 6 | content_type: String, 7 | #[deluxe(default = "1.0".to_string())] 8 | version: String, 9 | event_type: String, 10 | event_source: String, 11 | id: syn::Ident, 12 | } 13 | 14 | pub fn cloudevent_derive_macro2( 15 | input: proc_macro2::TokenStream, 16 | ) -> deluxe::Result<proc_macro2::TokenStream> { 17 | let mut ast: DeriveInput = syn::parse2(input.into())?; 18 | let CloudEventAttributes { 19 | content_type, 20 | version, 21 | event_type, 22 | event_source, 23 | id, 24 | }: CloudEventAttributes = deluxe::extract_attributes(&mut ast)?; 25 | let struct_name = &ast.ident; 26 | let (impl_generics, type_generics, where_clause) = ast.generics.split_for_impl(); 27 | let id_ident = syn::Ident::new(&id.to_string(), struct_name.span()); 28 | 29 | Ok(quote::quote!
{ 30 | impl #impl_generics ene_kafka::messages::cloud_events::cloud_event::CloudEvent<String, String> for #struct_name #type_generics #where_clause { 31 | fn spec_version(&self) -> ene_kafka::KafkaResult<String> { 32 | Ok(#version.to_string()) 33 | } 34 | 35 | fn event_type(&self) -> ene_kafka::KafkaResult<String> { 36 | Ok(#event_type.to_string()) 37 | } 38 | 39 | fn event_source(&self) -> ene_kafka::KafkaResult<String> { 40 | Ok(#event_source.to_string()) 41 | } 42 | 43 | fn event_id(&self) -> ene_kafka::KafkaResult<String> { 44 | Ok(self.#id_ident.to_string()) 45 | } 46 | 47 | fn event_time(&self) -> ene_kafka::KafkaResult<String> { 48 | Ok(chrono::Utc::now().to_rfc3339()) 49 | } 50 | 51 | fn event_content_type(&self) -> ene_kafka::KafkaResult<String> { 52 | Ok(#content_type.to_string()) 53 | } 54 | 55 | fn entity_event_type() -> ene_kafka::KafkaResult<String> { 56 | Ok(#event_type.to_string()) 57 | } 58 | } 59 | }) 60 | } 61 | -------------------------------------------------------------------------------- /ene_kafka/src/producers/producer.rs: -------------------------------------------------------------------------------- 1 | extern crate proc_macro; 2 | 3 | use async_trait::async_trait; 4 | 5 | use crate::{ 6 | messages::kafka_message::{KafkaMessage, ToBytes}, 7 | ProducerImpl, 8 | }; 9 | 10 | #[async_trait] 11 | pub trait KafkaProducerInterface: Sync + Send { 12 | async fn send<Message: KafkaMessage<String, String>>( 13 | &self, 14 | message: Message, 15 | ) -> anyhow::Result<()>; 16 | 17 | fn new(bootstrap_servers: String) -> Self; 18 | } 19 | 20 | #[derive(Debug, Clone)] 21 | pub struct KafkaProducer<Producer: KafkaProducerInterface = ProducerImpl> { 22 | producer: Producer, 23 | } 24 | 25 | #[async_trait] 26 | impl<A: KafkaProducerInterface> KafkaProducerInterface for KafkaProducer<A> { 27 | /// 28 | /// Sends a message to a kafka topic 29 | /// Arguments: 30 | /// - `message` - a KafkaMessage 31 | /// 32 | /// Example: 33 | /// ```rust, ignore 34 | /// #[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize, DeserializeFrom)] 35 | /// #[kafka(topic = "test", serde = Json, key = entity_id, headers = CloudEvent)] 36 | /// #[cloud_event( 37 | /// content_type = "application/json", 38 | /// version = "1.0", 39 | /// event_type = "com.ene.entity.created.v1", 40 | /// event_source = "https://ene-kafka.com/docs/cloudevents/entity/created", id = entity_id 41 | /// )] 42 | /// struct EntityCreated { 43 | /// pub entity_id: i64, 44 | /// pub organisation_id: i64, 45 | /// } 46 | /// 47 | /// let producer = kafka_producer!(bootstrap_servers = "localhost:9092".to_string()); 48 | /// let event = EntityCreated { 49 | /// entity_id: 1, 50 | /// organisation_id: 1, 51 | /// }; 52 | /// producer.send(event).await?; 53 | async fn send<Message: KafkaMessage<String, String>>( 54 | &self, 55 | message: Message, 56 | ) -> anyhow::Result<()> { 57 | tracing::debug!("sending message"); 58 | self.producer.send(message).await 59 | } 60 | 61 | fn new(bootstrap_servers: String) -> Self { 62 | Self { 63 | producer: A::new(bootstrap_servers), 64 | } 65 | } 66 | } 67 | 68 | /// 69 | /// Create a new Kafka producer 70 | /// Arguments: 71 | /// - `bootstrap_servers` - a string representing the Kafka bootstrap servers 72 | /// 73 | /// Example: 74 | /// ```rust, ignore 75 | /// let producer = kafka_producer!(bootstrap_servers = "localhost:9092".to_string()); 76 | /// producer.send(event).await?; 77 | /// ``` 78 | /// 79 | #[macro_export] 80 | macro_rules!
kafka_producer { 81 | (bootstrap_servers = $bootstrap_servers: expr) => { 82 | <ene_kafka::producers::producer::KafkaProducer>::new($bootstrap_servers) 83 | }; 84 | } 85 | -------------------------------------------------------------------------------- /ene_kafka/src/admins/mod.rs: -------------------------------------------------------------------------------- 1 | pub mod rdkafka_impl; 2 | 3 | use async_trait::async_trait; 4 | 5 | use crate::AdminImpl; 6 | 7 | #[async_trait] 8 | pub trait KafkaAdminInterface { 9 | async fn verify_topic_existence(&self, topic: &str) -> anyhow::Result<bool>; 10 | async fn create_topic_if_not_exists( 11 | &self, 12 | topic: &str, 13 | partitions: i32, 14 | replication_factor: i32, 15 | ) -> anyhow::Result<()>; 16 | async fn check_topic_liveness(&self, topic: &str) -> anyhow::Result<bool>; 17 | fn new( 18 | bootstrap_servers: String, 19 | request_time_out_ms: String, 20 | connection_max_idle_ms: String, 21 | ) -> Self; 22 | } 23 | 24 | #[derive(Debug, Clone)] 25 | pub struct KafkaAdmin<Admin: KafkaAdminInterface = AdminImpl> { 26 | admin: Admin, 27 | } 28 | 29 | #[async_trait] 30 | impl<A: KafkaAdminInterface + Sync + Send> KafkaAdminInterface 31 | for KafkaAdmin<A> 32 | { 33 | async fn verify_topic_existence(&self, topic: &str) -> anyhow::Result<bool> { 34 | self.admin.verify_topic_existence(topic).await 35 | } 36 | async fn create_topic_if_not_exists( 37 | &self, 38 | topic: &str, 39 | partitions: i32, 40 | replication_factor: i32, 41 | ) -> anyhow::Result<()> { 42 | self.admin 43 | .create_topic_if_not_exists(topic, partitions, replication_factor) 44 | .await 45 | } 46 | async fn check_topic_liveness(&self, topic: &str) -> anyhow::Result<bool> { 47 | self.admin.check_topic_liveness(topic).await 48 | } 49 | fn new( 50 | bootstrap_servers: String, 51 | request_time_out_ms: String, 52 | connection_max_idle_ms: String, 53 | ) -> Self { 54 | Self { 55 | admin: A::new( 56 | bootstrap_servers, 57 | request_time_out_ms, 58 | connection_max_idle_ms, 59 | ), 60 | } 61 | } 62 | } 63 | 64 | /// 65 | /// Create a new Kafka admin 66 | /// Arguments: 67 | /// - `bootstrap_servers` - a string representing the Kafka bootstrap servers 68 | /// - `request_time_out_ms` - a string representing the Kafka request time out in milliseconds 69 | /// - `connection_max_idle_ms` - a string representing the Kafka connection max idle time in milliseconds 70 | /// 71 | /// Example: 72 | /// ```rust,ignore 73 | /// use ene_kafka::admins::KafkaAdminInterface; 74 | /// use ene_kafka::kafka_admin; 75 | /// let admin = kafka_admin!(bootstrap_servers = "localhost:9092".to_string(), request_time_out_ms = "50000".to_string(), connection_max_idle_ms = "0".to_string()); 76 | /// admin.check_topic_liveness("topic").await?; 77 | /// ``` 78 | /// 79 | #[macro_export] 80 | macro_rules!
kafka_admin { 81 | (bootstrap_servers = $bootstrap_servers: expr, request_time_out_ms = $request_time_out_ms: expr, connection_max_idle_ms = $connection_max_idle_ms: expr) => { 82 | <ene_kafka::admins::KafkaAdmin>::new( 83 | $bootstrap_servers, 84 | $request_time_out_ms, 85 | $connection_max_idle_ms, 86 | ) 87 | }; 88 | } 89 | -------------------------------------------------------------------------------- /ene_kafka_examples/kafka_consumer.rs: -------------------------------------------------------------------------------- 1 | use ene_kafka::messages::kafka_message::ContentType; 2 | use serde::{Deserialize, Serialize}; 3 | 4 | use ene_kafka::kafka_consumer; 5 | use ene_kafka::{handlers::EventHandler, messages::kafka_message::KafkaTopic}; 6 | use ene_kafka_derive::{CloudEvent, DeserializeFrom, EventHandler, KafkaMessage}; 7 | 8 | #[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize, DeserializeFrom)] 9 | #[kafka(topic = "test", serde = Json, key = entity_id, headers = CloudEvent)] 10 | #[cloud_event( 11 | content_type = "application/json", 12 | version = "1.0", 13 | event_type = "com.ene.entity.created.v1", 14 | event_source = "https://ene-kafka.com/docs/cloudevents/entity/created", 15 | id = entity_id 16 | )] 17 | struct EntityCreated { 18 | pub entity_id: i64, 19 | pub organisation_id: i64, 20 | } 21 | 22 | #[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize, DeserializeFrom)] 23 | #[kafka(topic = "test", serde = Json, key = entity_id, headers = CloudEvent)] 24 | #[cloud_event( 25 | content_type = "application/json", 26 | version = "1.0", 27 | event_type = "com.ene.entity.updated.v1", 28 | event_source = "https://ene-kafka.com/docs/cloudevents/entity/updated", 29 | id = entity_id 30 | )] 31 | struct EntityUpdated { 32 | pub entity_id: i64, 33 | pub organisation_id: i64, 34 | } 35 | 36 | #[tokio::main] 37 | async fn main() -> ene_kafka::KafkaResult<()> { 38 | tracing_subscriber::fmt() 39 | .with_env_filter(tracing_subscriber::EnvFilter::from_default_env()) 40 | .init(); 41 | let bootstrap_servers = "localhost:9092".to_string(); 42 | 43 | let consumer = kafka_consumer!( 44 | topic = KafkaTopic { 45 | name: "test".to_string(), 46 | content_type: ContentType::Json 47 | }, 48 | dlq_topic = KafkaTopic { 49 | name: "test-dlq".to_string(), 50 | content_type: ContentType::Json 51 | }, 52 | consumer_group_id = "test-group", 53 | bootstrap_servers = bootstrap_servers, 54 | handlers = { 55 | entity_created_event_handler: EntityCreatedEventHandler = EntityCreatedEventHandler {}, 56 | entity_updated_event_handler: EntityUpdatedHandler = EntityUpdatedHandler {} 57 | } 58 | ); 59 | consumer.start().await; 60 | 61 | Ok(()) 62 | } 63 | 64 | #[derive(EventHandler)] 65 | #[event_handler(event = EntityCreated, handler = handle_entity_created_event)] 66 | struct EntityCreatedEventHandler {} 67 | 68 | impl EntityCreatedEventHandler { 69 | async fn handle_entity_created_event( 70 | &self, 71 | event: &EntityCreated, 72 | ) -> ene_kafka::KafkaResult<()> { 73 | println!("EntityCreatedEventHandler: {:?}", event); 74 | Ok(()) 75 | } 76 | } 77 | 78 | #[derive(EventHandler)] 79 | #[event_handler(event = EntityUpdated, handler = handle_entity_updated_event)] 80 | struct EntityUpdatedHandler {} 81 | 82 | impl EntityUpdatedHandler { 83 | async fn handle_entity_updated_event( 84 | &self, 85 | event: &EntityUpdated, 86 | ) -> ene_kafka::KafkaResult<()> { 87 | println!("EntityUpdatedHandler: {:?}", event); 88 | Ok(()) 89 | } 90 | } 91 | --------------------------------------------------------------------------------
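For readers tracing the macros above through the consumer example just shown: this is roughly what `kafka_consumer!` produces for its two handlers via `generate_event_dispatcher!`. It is a hand-expanded sketch based on the macro bodies in ene_kafka/src/dispatchers/mod.rs and ene_kafka_derive/src/handler.rs, not literal macro output.

```rust
// Hand-expanded sketch of the dispatcher generated for the example above.
struct CloudEventDispatcher {
    entity_created_event_handler: EntityCreatedEventHandler,
    entity_updated_event_handler: EntityUpdatedHandler,
}

#[async_trait::async_trait]
impl ene_kafka::dispatchers::EventDispatcher for CloudEventDispatcher {
    async fn dispatch_event<
        Event: ene_kafka::messages::cloud_events::cloud_event::CloudEvent<String, String>,
    >(
        &self,
        event: &Event,
    ) -> anyhow::Result<()> {
        use ene_kafka::handlers::EventHandler;
        // Handlers are tried in declaration order: the first one whose
        // event_type matches the incoming ce_type deserializes and handles it.
        if self.entity_created_event_handler.can_handle(event)? {
            return self.entity_created_event_handler.deserialize_and_handle(event).await;
        }
        if self.entity_updated_event_handler.can_handle(event)? {
            return self.entity_updated_event_handler.deserialize_and_handle(event).await;
        }
        // No handler matched: the consumer loop treats this as an error and
        // forwards the raw message to the DLQ topic.
        anyhow::bail!("No handler found for event type {:?}", event.event_type()?);
    }
}
```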
/ene_kafka_derive/src/kafka_message.rs: -------------------------------------------------------------------------------- 1 | use std::fmt::{Display, Formatter}; 2 | 3 | use syn::DeriveInput; 4 | 5 | enum HeaderType { 6 | CloudEvent, 7 | Empty, 8 | } 9 | 10 | impl From<syn::Ident> for HeaderType { 11 | fn from(ident: syn::Ident) -> Self { 12 | match ident.to_string().as_str() { 13 | "CloudEvent" => HeaderType::CloudEvent, 14 | _ => HeaderType::Empty, 15 | } 16 | } 17 | } 18 | 19 | impl Display for HeaderType { 20 | fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result { 21 | match self { 22 | HeaderType::CloudEvent => write!(f, "CloudEvent"), 23 | HeaderType::Empty => write!(f, "Custom"), 24 | } 25 | } 26 | } 27 | 28 | #[derive(deluxe::ExtractAttributes)] 29 | #[deluxe(attributes(kafka))] 30 | pub struct KafkaMessageAttributes { 31 | pub topic: String, 32 | pub serde: syn::Ident, 33 | pub key: syn::Ident, 34 | #[deluxe(default = syn::Ident::new("Empty", proc_macro2::Span::call_site()))] 35 | pub headers: syn::Ident, 36 | } 37 | 38 | pub fn kafkamessage_derive_macro2( 39 | input: proc_macro2::TokenStream, 40 | ) -> deluxe::Result<proc_macro2::TokenStream> { 41 | // convert into an ast 42 | let mut ast: DeriveInput = syn::parse2((input).into())?; 43 | 44 | // extract struct attributes 45 | let KafkaMessageAttributes { 46 | topic, 47 | key, 48 | headers, 49 | .. 50 | }: KafkaMessageAttributes = deluxe::extract_attributes(&mut ast)?; 51 | 52 | let header_impl = match HeaderType::from(headers) { 53 | HeaderType::CloudEvent => quote::quote! { 54 | fn headers(&self) -> ene_kafka::KafkaResult<ene_kafka::messages::kafka_message::Headers> { 55 | ene_kafka::messages::cloud_events::cloud_event::CloudEvent::cloud_event_headers(self) 56 | } 57 | }, 58 | HeaderType::Empty => quote::quote! { 59 | fn headers(&self) -> ene_kafka::KafkaResult<ene_kafka::messages::kafka_message::Headers> { 60 | Ok(std::collections::HashMap::new()) 61 | } 62 | }, 63 | }; 64 | // define impl variables 65 | let struct_name = &ast.ident; 66 | let (impl_generics, type_generics, where_clause) = ast.generics.split_for_impl(); 67 | let key_ident = syn::Ident::new(&key.to_string(), struct_name.span()); 68 | 69 | // generate 70 | Ok(quote::quote!
{ 71 | impl #impl_generics ene_kafka::messages::kafka_message::KafkaMessage<String, String> for #struct_name #type_generics #where_clause { 72 | fn topic(&self) -> ene_kafka::KafkaResult<ene_kafka::messages::kafka_message::KafkaTopic> { 73 | Ok(ene_kafka::messages::kafka_message::KafkaTopic { 74 | name: #topic.to_string(), 75 | content_type: ene_kafka::messages::kafka_message::ContentType::Json, 76 | }) 77 | } 78 | 79 | fn payload(&self) -> ene_kafka::KafkaResult<String> { 80 | serde_json::to_string(self).map_err(|e| anyhow::anyhow!("Failed to serialize payload: {}", e)) 81 | } 82 | 83 | fn key(&self) -> ene_kafka::KafkaResult<String> { 84 | Ok(self.#key_ident.to_string()) 85 | } 86 | 87 | 88 | #header_impl 89 | } 90 | }) 91 | } 92 | -------------------------------------------------------------------------------- /ene_kafka/src/consumers/rdkafka_impl.rs: -------------------------------------------------------------------------------- 1 | use async_trait::async_trait; 2 | use rdkafka::config::RDKafkaLogLevel; 3 | use rdkafka::consumer::{CommitMode, Consumer, StreamConsumer}; 4 | use rdkafka::ClientConfig; 5 | 6 | use crate::dispatchers::EventDispatcher; 7 | use crate::messages::kafka_message::KafkaTopic; 8 | use crate::producers::producer::{KafkaProducer, KafkaProducerInterface}; 9 | 10 | use super::consumer::KafkaConsumerInterface; 11 | 12 | #[async_trait] 13 | impl<Dispatcher: EventDispatcher, DlqProducer: KafkaProducerInterface> 14 | KafkaConsumerInterface<Dispatcher, DlqProducer> for StreamConsumer 15 | { 16 | fn new(consumer_group_id: String, bootstrap_servers: String) -> Self { 17 | tracing::info!("Creating consumer with group ID {}", consumer_group_id); 18 | // TODO: configure all properties 19 | ClientConfig::new() 20 | .set("group.id", consumer_group_id) 21 | .set("bootstrap.servers", bootstrap_servers) 22 | .set("enable.partition.eof", "false") 23 | .set("session.timeout.ms", "6000") 24 | .set_log_level(RDKafkaLogLevel::Debug) 25 | .create::<StreamConsumer>() 26 | .expect("Consumer creation failed") 27 | } 28 | 29 | async fn start<'a>( 30 | &'a self, 31 | dispatcher: &'a Dispatcher, 32 | dlq_producer: &'a KafkaProducer<DlqProducer>, 33 | topic: KafkaTopic, 34 | dlq_topic: KafkaTopic, 35 | ) { 36 | self.subscribe(&[topic.name.as_str()]) 37 | .map(|()| tracing::info!("Subscribed to {}", topic.name.as_str())) 38 | .expect("Can't subscribe to specified topics"); 39 | loop { 40 | match self.recv().await { 41 | Ok(event) => { 42 | tracing::debug!("event: {:?}", event); 43 | let result = &dispatcher.dispatch_event(&event).await; 44 | match result { 45 | Ok(_) => {} 46 | Err(error) => { 47 | tracing::error!("consumers::rdkafka_impl::error: {:?}", error); 48 | let unhandled_event = event.detach().set_topic(dlq_topic.name.clone()); 49 | match dlq_producer.send(unhandled_event).await { 50 | Ok(_) => { 51 | tracing::info!("Sent event to DLQ"); 52 | } 53 | Err(error) => { 54 | tracing::error!( 55 | "consumers::rdkafka_impl::dlq::error: {:?}", 56 | error 57 | ); 58 | } 59 | } 60 | } 61 | } 62 | match self.commit_message(&event, CommitMode::Async) { 63 | Ok(_) => {} 64 | Err(error) => { 65 | tracing::error!( 66 | "consumers::rdkafka_impl::commit_message::error: {:?}", 67 | error 68 | ); 69 | } 70 | } 71 | } 72 | Err(error) => { 73 | tracing::error!("Kafka error: {}", error); 74 | } 75 | } 76 | } 77 | } 78 | } 79 | -------------------------------------------------------------------------------- /ene_kafka_derive/src/lib.rs: -------------------------------------------------------------------------------- 1 | mod cloud_event; 2 | mod deserialize_from; 3 | mod handler; 4 | mod kafka_message; 5 | 6 | /// Derive the KafkaMessage trait for a struct 7 | /// It requires the following attributes: 8 |
/// - `key` - the name of the field that will be used as the message key 9 | /// - `topic` - the name of the topic that the message will be sent to 10 | /// - `headers` - how the message headers are generated. Possible values: `CloudEvent` or `Empty` (default) 11 | /// - `payload` - the payload is the struct itself, serialized according to the `serde` format 12 | /// - `serde` - the serialization format of the payload. Possible values: `Json` 13 | /// 14 | /// Example: 15 | /// ```rust,ignore 16 | /// #[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize)] 17 | /// #[kafka(topic = "test", serde = Json, key = message_id)] 18 | /// struct SomeMessage { 19 | /// pub message_id: i64, 20 | /// } 21 | /// ``` 22 | #[proc_macro_derive(KafkaMessage, attributes(kafka))] 23 | pub fn kafkamessage_derive_macro(input: proc_macro::TokenStream) -> proc_macro::TokenStream { 24 | kafka_message::kafkamessage_derive_macro2(input.into()) 25 | .unwrap() 26 | .into() 27 | } 28 | 29 | /// Derive the CloudEvent trait for a struct 30 | /// It requires the following attributes: 31 | /// - `content_type` - the content type of the event 32 | /// - `version` - the version of the event 33 | /// - `event_type` - the type of the event 34 | /// - `event_source` - the source of the event 35 | /// - `id` - the name of the field that will be used as the event id 36 | /// Implementing `KafkaMessage` is required for this trait to work. The `headers` field of the `KafkaMessage` trait should be set to `CloudEvent` 37 | /// 38 | /// Example: 39 | /// ```rust, ignore 40 | /// #[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize)] 41 | /// #[kafka(topic = "test", serde = Json, key = event_id, headers = CloudEvent)] 42 | /// #[cloud_event( 43 | /// content_type = "application/json", 44 | /// version = "1.0", 45 | /// event_type = "com.ene.SomeEvent.v1", 46 | /// event_source = "https://ene-kafka.com/docs/cloudevents/SomeEvent", id = event_id 47 | /// )] 48 | /// struct SomeEvent { 49 | /// pub event_id: i64, 50 | /// } 51 | /// ``` 52 | #[proc_macro_derive(CloudEvent, attributes(cloud_event))] 53 | pub fn cloudevent_derive_macro(input: proc_macro::TokenStream) -> proc_macro::TokenStream { 54 | cloud_event::cloudevent_derive_macro2(input.into()) 55 | .unwrap() 56 | .into() 57 | } 58 | 59 | /// Derive the EventHandler trait for a struct 60 | /// It requires the following attributes: 61 | /// - `event` - A concrete type that implements the `CloudEvent` trait 62 | /// - `handler` - The name of the handler function. This function should be implemented by the struct. It should take a reference to the event it can handle as input. 63 | /// 64 | /// The event type should implement `CloudEvent` as well as `DeserializeFrom` for this trait to work. 65 | /// Example: 66 | /// ```rust,ignore 67 | /// #[derive(EventHandler)] 68 | /// #[event_handler(event = crate::SomeEvent, handler = handle_some_event)] 69 | /// struct SomeEventHandler; 70 | /// 71 | /// impl SomeEventHandler { 72 | /// async fn handle_some_event(&self, event: &crate::SomeEvent) -> anyhow::Result<()> { 73 | /// println!("Handling event: {:?}", event); 74 | /// Ok(()) 75 | /// } 76 | /// } 77 | /// ``` 78 | #[proc_macro_derive(EventHandler, attributes(event_handler))] 79 | pub fn handler_derive_macro(input: proc_macro::TokenStream) -> proc_macro::TokenStream { 80 | handler::handler_derive_macro2(input.into()).unwrap().into() 81 | } 82 | 83 | /// Derive the DeserializeFrom trait for a struct 84 | /// It relies on the `KafkaMessage` trait and requires the following attributes: 85 | /// - `serde` - the serialization format of the payload.
Possible values: `Json` 86 | /// `DeserializeFrom` requires the struct to implement `Deserialize` from the `serde` crate. 87 | /// 88 | /// Example: 89 | /// ```rust, ignore 90 | /// #[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize, DeserializeFrom)] 91 | /// #[kafka(topic = "test", serde = Json, key = message_id)] 92 | /// struct SomeMessage { 93 | /// pub message_id: i64, 94 | /// } 95 | /// ``` 96 | #[proc_macro_derive(DeserializeFrom, attributes(kafka))] 97 | pub fn deserialize_from_derive_macro(input: proc_macro::TokenStream) -> proc_macro::TokenStream { 98 | deserialize_from::deserialize_from_derive_macro2(input.into()) 99 | .unwrap() 100 | .into() 101 | } 102 | -------------------------------------------------------------------------------- /ene_kafka/src/consumers/consumer.rs: -------------------------------------------------------------------------------- 1 | use async_trait::async_trait; 2 | 3 | use crate::dispatchers::EventDispatcher; 4 | use crate::messages::kafka_message::KafkaTopic; 5 | use crate::producers::producer::{KafkaProducer, KafkaProducerInterface}; 6 | use crate::{ConsumerImpl, ProducerImpl}; 7 | 8 | #[async_trait] 9 | pub trait KafkaConsumerInterface<Dispatcher: EventDispatcher, DlqProducer: KafkaProducerInterface> 10 | { 11 | fn new(consumer_group_id: String, bootstrap_servers: String) -> Self; 12 | async fn start<'a>( 13 | &'a self, 14 | dispatcher: &'a Dispatcher, 15 | dlq_producer: &'a KafkaProducer<DlqProducer>, 16 | topic: KafkaTopic, 17 | dlq_topic: KafkaTopic, 18 | ); 19 | } 20 | 21 | #[derive(Debug, Clone)] 22 | pub struct KafkaConsumer< 23 | Dispatcher: EventDispatcher, 24 | InnerConsumer: KafkaConsumerInterface<Dispatcher, InnerProducer> = ConsumerImpl, 25 | InnerProducer: KafkaProducerInterface = ProducerImpl, 26 | > { 27 | topic: KafkaTopic, 28 | dlq_topic: KafkaTopic, 29 | dispatcher: Dispatcher, 30 | inner_consumer: InnerConsumer, 31 | dlq_producer: KafkaProducer<InnerProducer>, 32 | } 33 | 34 | impl< 35 | Dispatcher: EventDispatcher, 36 | Consumer: KafkaConsumerInterface<Dispatcher, InnerProducer>, 37 | InnerProducer: KafkaProducerInterface, 38 | > KafkaConsumer<Dispatcher, Consumer, InnerProducer> 39 | { 40 | pub fn new( 41 | topic: KafkaTopic, 42 | dlq_topic: KafkaTopic, 43 | consumer_group_id: String, 44 | bootstrap_servers: String, 45 | handler: Dispatcher, 46 | ) -> Self { 47 | let dlq_producer = KafkaProducer::new(bootstrap_servers.clone()); 48 | Self { 49 | topic, 50 | dlq_topic, 51 | dispatcher: handler, 52 | inner_consumer: Consumer::new(consumer_group_id, bootstrap_servers), 53 | dlq_producer, 54 | } 55 | } 56 | 57 | /// Starts the consumer loop 58 | /// This function will block the current thread 59 | /// It will consume messages from the Kafka topic and dispatch them to the handlers. 60 | /// If the message could not be consumed, it will be sent to the dead letter queue. 61 | pub async fn start(self) { 62 | self.inner_consumer 63 | .start( 64 | &self.dispatcher, 65 | &self.dlq_producer, 66 | self.topic, 67 | self.dlq_topic, 68 | ) 69 | .await; 70 | } 71 | } 72 | 73 | /// 74 | /// Create a new Kafka consumer 75 | /// Arguments: 76 | /// - `topic` - a string representing the Kafka topic 77 | /// - `dlq_topic` - a string representing the Kafka dead letter queue topic. If an event could not be consumed, it will be sent to the dead letter queue. 78 | /// - `consumer_group_id` - a string representing the Kafka consumer group id 79 | /// - `bootstrap_servers` - a string representing the Kafka bootstrap servers 80 | /// - `handlers` - a list of handler declarations that will be used by this consumer 81 | /// The handlers need to implement the `EventHandler` trait.
82 | /// 83 | /// Example: 84 | /// ```rust,ignore 85 | /// let consumer = kafka_consumer!( 86 | /// topic = KafkaTopic { 87 | /// name: "test".to_string(), 88 | /// content_type: ContentType::Json 89 | /// }, 90 | /// dlq_topic = KafkaTopic { 91 | /// name: "test-dlq".to_string(), 92 | /// content_type: ContentType::Json 93 | /// }, 94 | /// consumer_group_id = "test-group", 95 | /// bootstrap_servers = bootstrap_servers, 96 | /// handlers = { 97 | /// entity_created_event_handler: EntityCreatedEventHandler = EntityCreatedEventHandler {} 98 | /// } 99 | /// ); 100 | /// consumer.start().await; 101 | /// ``` 102 | /// 103 | #[macro_export] 104 | macro_rules! kafka_consumer { 105 | ( 106 | topic = $topic: expr, 107 | dlq_topic = $dlq_topic: expr, 108 | consumer_group_id = $consumer_group_id: expr, 109 | bootstrap_servers = $bootstrap_servers: expr, 110 | handlers = {$($handler_name: ident: $handler_type: ident = $handler: expr),*}$(,)? 111 | $(,)? 112 | ) => { 113 | { 114 | 115 | ene_kafka::generate_event_dispatcher!($($handler_name: $handler_type),*); 116 | 117 | 118 | ene_kafka::consumers::consumer::KafkaConsumer::<CloudEventDispatcher>::new( 119 | $topic, 120 | $dlq_topic, 121 | $consumer_group_id.to_string(), 122 | $bootstrap_servers.to_string(), 123 | CloudEventDispatcher { $($handler_name: $handler),* } 124 | ) 125 | 126 | } 127 | 128 | }; 129 | } 130 | -------------------------------------------------------------------------------- /ene_kafka/src/admins/rdkafka_impl.rs: -------------------------------------------------------------------------------- 1 | use anyhow::anyhow; 2 | use async_trait::async_trait; 3 | use rdkafka::{ 4 | admin::{ 5 | AdminClient, AdminOptions, ConfigResource, NewTopic, OwnedResourceSpecifier, 6 | ResourceSpecifier, 7 | }, 8 | client::DefaultClientContext, 9 | config::FromClientConfig, 10 | types::RDKafkaErrorCode, 11 | }; 12 | use tracing::{error, info}; 13 | 14 | use super::KafkaAdminInterface; 15 | 16 | #[async_trait] 17 | impl KafkaAdminInterface for AdminClient<DefaultClientContext> { 18 | async fn verify_topic_existence(&self, topic: &str) -> anyhow::Result<bool> { 19 | let topic_config = ResourceSpecifier::Topic(topic); 20 | let admin_options = 21 | AdminOptions::new().request_timeout(Some(std::time::Duration::from_secs(1))); 22 | self.describe_configs(vec![&topic_config], &admin_options) 23 | .await 24 | .map(|res: Vec<Result<ConfigResource, RDKafkaErrorCode>>| { 25 | res.iter().any(|res| match res { 26 | Ok(config_resource) => match &config_resource.specifier { 27 | OwnedResourceSpecifier::Topic(received_topic) => { 28 | let topic_exists = 29 | (received_topic == topic) && !config_resource.entries.is_empty(); 30 | if topic_exists { 31 | info!("Topic {} exists", topic); 32 | }; 33 | topic_exists 34 | } 35 | something_else => { 36 | error!("Expected topic, got {:?}", something_else); 37 | false 38 | } 39 | }, 40 | Err(e) => { 41 | error!("Error describing topic: {:?}", e); 42 | false 43 | } 44 | }) 45 | }) 46 | .map_err(|e| { 47 | error!("Error describing topic: {:?}", e); 48 | anyhow!(e.to_string()) 49 | }) 50 | } 51 | async fn create_topic_if_not_exists( 52 | &self, 53 | topic: &str, 54 | partitions: i32, 55 | replication_factor: i32, 56 | ) -> anyhow::Result<()> { 57 | let topic_exists = self.verify_topic_existence(topic).await?; 58 | if !topic_exists { 59 | let new_topic = NewTopic::new( 60 | topic, 61 | partitions, 62 | rdkafka::admin::TopicReplication::Fixed(replication_factor), 63 | ); 64 | self.create_topics( 65 | &[new_topic], 66 | &AdminOptions::new() 67 | .request_timeout(Some(std::time::Duration::from_secs(1))) 68 |
.operation_timeout(Some(std::time::Duration::from_secs(1))), 69 | ) 70 | .await 71 | .map(|res| { 72 | res.iter().all(|res| match res { 73 | Ok(_) => true, 74 | Err(e) => { 75 | error!("Error creating topic: {:?}", e); 76 | false 77 | } 78 | }) 79 | }) 80 | .map_err(|e| { 81 | error!("Error creating topic: {:?}", e); 82 | anyhow!(e.to_string()) 83 | })?; 84 | } 85 | Ok(()) 86 | } 87 | async fn check_topic_liveness(&self, topic: &str) -> anyhow::Result<bool> { 88 | let topic_config = ResourceSpecifier::Topic(topic); 89 | let admin_options = AdminOptions::new(); 90 | self.describe_configs(vec![&topic_config], &admin_options) 91 | .await 92 | .map(|res: Vec<Result<ConfigResource, RDKafkaErrorCode>>| { 93 | res.iter().all(|res| res.is_ok()) 94 | && res.iter().any(|res| match res { 95 | Ok(config_resource) => match &config_resource.specifier { 96 | OwnedResourceSpecifier::Topic(received_topic) => { 97 | (received_topic == topic) && !config_resource.entries.is_empty() 98 | } 99 | something_else => { 100 | error!("Expected topic, got {:?}", something_else); 101 | false 102 | } 103 | }, 104 | Err(e) => { 105 | error!("Error describing topic: {:?}", e); 106 | false 107 | } 108 | }) 109 | }) 110 | .map_err(|e| { 111 | error!("Error describing topic: {:?}", e); 112 | anyhow!(e.to_string()) 113 | }) 114 | } 115 | 116 | fn new( 117 | bootstrap_servers: String, 118 | request_time_out_ms: String, 119 | connection_max_idle_ms: String, 120 | ) -> Self { 121 | AdminClient::from_config( 122 | &rdkafka::ClientConfig::new() 123 | .set("bootstrap.servers", bootstrap_servers) 124 | .set("request.timeout.ms", &request_time_out_ms) 125 | .set( 126 | "connections.max.idle.ms", 127 | &connection_max_idle_ms.to_string(), 128 | ), 129 | ) 130 | .expect("Failed to create admin client") 131 | } 132 | } 133 | -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Contributor Covenant Code of Conduct 2 | 3 | ## Our Pledge 4 | 5 | We as members, contributors, and leaders pledge to make participation in our 6 | community a harassment-free experience for everyone, regardless of age, body 7 | size, visible or invisible disability, ethnicity, sex characteristics, gender 8 | identity and expression, level of experience, education, socio-economic status, 9 | nationality, personal appearance, race, religion, or sexual identity 10 | and orientation. 11 | 12 | We pledge to act and interact in ways that contribute to an open, welcoming, 13 | diverse, inclusive, and healthy community.
14 | 15 | ## Our Standards 16 | 17 | Examples of behavior that contributes to a positive environment for our 18 | community include: 19 | 20 | * Demonstrating empathy and kindness toward other people 21 | * Being respectful of differing opinions, viewpoints, and experiences 22 | * Giving and gracefully accepting constructive feedback 23 | * Accepting responsibility and apologizing to those affected by our mistakes, 24 | and learning from the experience 25 | * Focusing on what is best not just for us as individuals, but for the 26 | overall community 27 | 28 | Examples of unacceptable behavior include: 29 | 30 | * The use of sexualized language or imagery, and sexual attention or 31 | advances of any kind 32 | * Trolling, insulting or derogatory comments, and personal or political attacks 33 | * Public or private harassment 34 | * Publishing others' private information, such as a physical or email 35 | address, without their explicit permission 36 | * Other conduct which could reasonably be considered inappropriate in a 37 | professional setting 38 | 39 | ## Enforcement Responsibilities 40 | 41 | Community leaders are responsible for clarifying and enforcing our standards of 42 | acceptable behavior and will take appropriate and fair corrective action in 43 | response to any behavior that they deem inappropriate, threatening, offensive, 44 | or harmful. 45 | 46 | Community leaders have the right and responsibility to remove, edit, or reject 47 | comments, commits, code, wiki edits, issues, and other contributions that are 48 | not aligned to this Code of Conduct, and will communicate reasons for moderation 49 | decisions when appropriate. 50 | 51 | ## Scope 52 | 53 | This Code of Conduct applies within all community spaces, and also applies when 54 | an individual is officially representing the community in public spaces. 55 | Examples of representing our community include using an official e-mail address, 56 | posting via an official social media account, or acting as an appointed 57 | representative at an online or offline event. 58 | 59 | ## Enforcement 60 | 61 | Instances of abusive, harassing, or otherwise unacceptable behavior may be 62 | reported to the community leaders responsible for enforcement via 63 | GitHub issues. 64 | All complaints will be reviewed and investigated promptly and fairly. 65 | 66 | All community leaders are obligated to respect the privacy and security of the 67 | reporter of any incident. 68 | 69 | ## Enforcement Guidelines 70 | 71 | Community leaders will follow these Community Impact Guidelines in determining 72 | the consequences for any action they deem in violation of this Code of Conduct: 73 | 74 | ### 1. Correction 75 | 76 | **Community Impact**: Use of inappropriate language or other behavior deemed 77 | unprofessional or unwelcome in the community. 78 | 79 | **Consequence**: A private, written warning from community leaders, providing 80 | clarity around the nature of the violation and an explanation of why the 81 | behavior was inappropriate. A public apology may be requested. 82 | 83 | ### 2. Warning 84 | 85 | **Community Impact**: A violation through a single incident or series 86 | of actions. 87 | 88 | **Consequence**: A warning with consequences for continued behavior. No 89 | interaction with the people involved, including unsolicited interaction with 90 | those enforcing the Code of Conduct, for a specified period of time. This 91 | includes avoiding interactions in community spaces as well as external channels 92 | like social media.
Violating these terms may lead to a temporary or 93 | permanent ban. 94 | 95 | ### 3. Temporary Ban 96 | 97 | **Community Impact**: A serious violation of community standards, including 98 | sustained inappropriate behavior. 99 | 100 | **Consequence**: A temporary ban from any sort of interaction or public 101 | communication with the community for a specified period of time. No public or 102 | private interaction with the people involved, including unsolicited interaction 103 | with those enforcing the Code of Conduct, is allowed during this period. 104 | Violating these terms may lead to a permanent ban. 105 | 106 | ### 4. Permanent Ban 107 | 108 | **Community Impact**: Demonstrating a pattern of violation of community 109 | standards, including sustained inappropriate behavior, harassment of an 110 | individual, or aggression toward or disparagement of classes of individuals. 111 | 112 | **Consequence**: A permanent ban from any sort of public interaction within 113 | the community. 114 | 115 | ## Attribution 116 | 117 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], 118 | version 2.0, available at 119 | https://www.contributor-covenant.org/version/2/0/code_of_conduct.html. 120 | 121 | Community Impact Guidelines were inspired by [Mozilla's code of conduct 122 | enforcement ladder](https://github.com/mozilla/diversity). 123 | 124 | [homepage]: https://www.contributor-covenant.org 125 | 126 | For answers to common questions about this code of conduct, see the FAQ at 127 | https://www.contributor-covenant.org/faq. Translations are available at 128 | https://www.contributor-covenant.org/translations. 129 | -------------------------------------------------------------------------------- /ene_kafka/src/messages/rdkafka_impl.rs: -------------------------------------------------------------------------------- 1 | use super::{ 2 | cloud_events::cloud_event::CloudEvent, 3 | kafka_message::{ContentType, Headers as KafkaHeaders, KafkaTopic}, 4 | }; 5 | use crate::messages::kafka_message::KafkaMessage; 6 | use anyhow::anyhow; 7 | use rdkafka::{ 8 | message::{BorrowedHeaders, BorrowedMessage, Header, Headers, OwnedHeaders, OwnedMessage}, 9 | Message, 10 | }; 11 | 12 | pub trait ToRdkafkaHeaders { 13 | fn to_rdkafka_headers(&self) -> anyhow::Result<OwnedHeaders>; 14 | } 15 | 16 | impl<'a> KafkaMessage for BorrowedMessage<'a> { 17 | fn topic(&self) -> anyhow::Result<KafkaTopic> { 18 | Ok(KafkaTopic { 19 | name: Message::topic(self).to_string(), 20 | content_type: ContentType::Json, 21 | }) 22 | } 23 | 24 | fn payload(&self) -> anyhow::Result<String> { 25 | Message::payload(self) 26 | .map(|bytes| -> anyhow::Result<String> { Ok(String::from_utf8(bytes.to_vec())?) }) 27 | .ok_or(anyhow!("Message has no payload"))? 28 | } 29 | 30 | fn key(&self) -> anyhow::Result<String> { 31 | Message::key(self) 32 | .map(|bytes| -> anyhow::Result<String> { Ok(String::from_utf8(bytes.to_vec())?) }) 33 | .ok_or(anyhow!("Message has no key"))? 34 | } 35 | 36 | fn headers(&self) -> anyhow::Result<KafkaHeaders> { 37 | Message::headers(self) 38 | .map(borrowed_headers_to_headers) 39 | .ok_or(anyhow!("Message has no headers"))?
40 | } 41 | } 42 | 43 | impl KafkaMessage for OwnedMessage { 44 | fn topic(&self) -> anyhow::Result<KafkaTopic> { 45 | Ok(KafkaTopic { 46 | name: Message::topic(self).to_string(), 47 | content_type: ContentType::Json, // TODO: only Json is supported for now 48 | }) 49 | } 50 | 51 | fn payload(&self) -> anyhow::Result<String> { 52 | Message::payload(self) 53 | .map(|bytes| -> anyhow::Result<String> { Ok(String::from_utf8(bytes.to_vec())?) }) 54 | .ok_or(anyhow!("Message has no payload"))? 55 | } 56 | 57 | fn key(&self) -> anyhow::Result<String> { 58 | Message::key(self) 59 | .map(|bytes| -> anyhow::Result<String> { Ok(String::from_utf8(bytes.to_vec())?) }) 60 | .ok_or(anyhow!("Message has no key"))? 61 | } 62 | 63 | fn headers(&self) -> anyhow::Result<KafkaHeaders> { 64 | Message::headers(self) 65 | .map(owned_headers_to_headers) 66 | .ok_or(anyhow!("Message has no headers"))? 67 | } 68 | } 69 | 70 | impl<'a> CloudEvent for BorrowedMessage<'a> { 71 | fn spec_version(&self) -> anyhow::Result<String> { 72 | KafkaMessage::headers(self)? 73 | .get("ce_specversion") 74 | .map(|value| value.clone()) 75 | .ok_or(anyhow!("ce_specversion header is missing")) 76 | } 77 | 78 | fn event_type(&self) -> anyhow::Result<String> { 79 | KafkaMessage::headers(self)? 80 | .get("ce_type") 81 | .map(|value| value.clone()) 82 | .ok_or(anyhow!("ce_type header is missing")) 83 | } 84 | 85 | fn event_source(&self) -> anyhow::Result<String> { 86 | KafkaMessage::headers(self)? 87 | .get("ce_source") 88 | .map(|value| value.clone()) 89 | .ok_or(anyhow!("ce_source header is missing")) 90 | } 91 | 92 | fn event_id(&self) -> anyhow::Result<String> { 93 | KafkaMessage::headers(self)? 94 | .get("ce_id") 95 | .map(|value| value.clone()) 96 | .ok_or(anyhow!("ce_id header is missing")) 97 | } 98 | 99 | fn event_time(&self) -> anyhow::Result<String> { 100 | KafkaMessage::headers(self)? 101 | .get("ce_time") 102 | .map(|value| value.clone()) 103 | .ok_or(anyhow!("ce_time header is missing")) 104 | } 105 | 106 | fn event_content_type(&self) -> anyhow::Result<String> { 107 | KafkaMessage::headers(self)? 108 | .get("content_type") 109 | .map(|value| value.clone()) 110 | .ok_or(anyhow!("content_type header is missing")) 111 | } 112 | 113 | fn entity_event_type() -> anyhow::Result<String> { 114 | Ok(String::from("lib.rdkafka.BorrowedMessage")) 115 | } 116 | } 117 | 118 | impl ToRdkafkaHeaders for KafkaHeaders { 119 | fn to_rdkafka_headers(&self) -> anyhow::Result<OwnedHeaders> { 120 | let mut owned_headers = rdkafka::message::OwnedHeaders::new(); 121 | for (key, value) in self.iter() { 122 | let header = Header { 123 | key: key.as_str(), 124 | value: Some(value.as_str()), 125 | }; 126 | owned_headers = owned_headers.insert(header); 127 | } 128 | Ok(owned_headers) 129 | } 130 | } 131 | 132 | pub fn borrowed_headers_to_headers(headers: &BorrowedHeaders) -> anyhow::Result<KafkaHeaders> { 133 | headers 134 | .iter() 135 | .filter(|header| header.value.is_some()) 136 | .map(|header| -> anyhow::Result<(String, String)> { 137 | let key = header.key.to_string(); 138 | let value = String::from_utf8( 139 | header 140 | .value 141 | .ok_or(anyhow!("header with key {key} not available"))?
142 | .to_vec(), 143 | )?; 144 | Ok((key, value)) 145 | }) 146 | .collect::<anyhow::Result<KafkaHeaders>>() 147 | } 148 | 149 | pub fn owned_headers_to_headers(headers: &OwnedHeaders) -> anyhow::Result<KafkaHeaders> { 150 | headers 151 | .iter() 152 | .filter(|header| header.value.is_some()) 153 | .map(|header| -> anyhow::Result<(String, String)> { 154 | let key = header.key.to_string(); 155 | let value = String::from_utf8( 156 | header 157 | .value 158 | .ok_or(anyhow!("header with key {key} not available"))? 159 | .to_vec(), 160 | )?; 161 | Ok((key, value)) 162 | }) 163 | .collect::<anyhow::Result<KafkaHeaders>>() 164 | } 165 | -------------------------------------------------------------------------------- /readme.md: -------------------------------------------------------------------------------- 1 | # Ene Kafka: The Opinionated Kafka Framework for Rust 2 | 3 |

4 | <img src="md_assets/ene-kafka.png" alt="Ene Kafka logo"> 7 |

8 | 9 | [![Crates.io][crates-badge]][crates-url] 10 | [![Crates.io][derive-crates-badge]][derive-crates-url] 11 | [![Apache licensed][license-badge]][license-url] 12 | [![Build Status][actions-badge]][actions-url] 13 | 14 | [crates-badge]: https://img.shields.io/crates/v/ene-kafka.svg?label=ene-kafka 15 | [crates-url]: https://crates.io/crates/ene-kafka 16 | [derive-crates-badge]: https://img.shields.io/crates/v/ene-kafka-derive.svg?label=ene-kafka-derive 17 | [derive-crates-url]: https://crates.io/crates/ene-kafka-derive 18 | [license-badge]: https://img.shields.io/badge/license-Apache%20v2-blue.svg 19 | [license-url]: https://github.com/ene-rs/ene-kafka/blob/main/LICENSE 20 | [actions-badge]: https://github.com/ene-rs/ene-kafka/actions/workflows/rust.yml/badge.svg 21 | [actions-url]: https://github.com/ene-rs/ene-kafka/actions?query=branch%3Amain 22 | 23 | Ene Kafka started as a small internal project built for https://enelyzer.com/ (hence the `Ene` in the name) to increase code readability and speed of delivery, but we liked it so much that we decided to make it public. 24 | 25 | ## Introduction 26 | Ene Kafka is a Kafka client for Rust that facilitates interaction with Apache Kafka. Its main goal is to make building event-driven microservices that use Kafka as their distributed messaging system easier and quicker. 27 | 28 | Ene Kafka uses CloudEvents to define the structure of the messages it consumes and produces. 29 | 30 | ## Examples 31 | 32 | Create an event: 33 | ```rust 34 | #[derive(KafkaMessage, Serialize, CloudEvent, Debug, Deserialize, DeserializeFrom)] 35 | #[kafka(topic = "test", serde = Json, key = entity_id, headers = CloudEvent)] 36 | #[cloud_event( 37 | content_type = "application/json", 38 | version = "1.0", 39 | event_type = "com.ene.entity.created.v1", 40 | event_source = "https://ene-kafka.com/docs/cloudevents/entity/created", 41 | id = entity_id, 42 | )] 43 | struct EntityCreated { 44 | pub entity_id: i64, 45 | pub organisation_id: i64, 46 | } 47 | ``` 48 | 49 | Produce an event: 50 | ```rust 51 | let producer: KafkaProducer = kafka_producer!(bootstrap_servers = bootstrap_servers.clone()); 52 | let event = EntityCreated { 53 | entity_id: 1755, 54 | organisation_id: 42, 55 | }; 56 | 57 | producer.send(event).await?; 58 | ``` 59 | 60 | Consume and handle an event: 61 | ```rust 62 | // Create a handler to process the event 63 | #[derive(EventHandler)] 64 | #[event_handler(event = EntityCreated, handler = handle_entity_created_event)] 65 | struct EntityCreatedEventHandler { 66 | // Optionally some state for the handler 67 | } 68 | 69 | impl EntityCreatedEventHandler { 70 | async fn handle_entity_created_event(&self, event: &EntityCreated) -> ene_kafka::KafkaResult<()> { 71 | println!("EntityCreatedEventHandler: {:?}", event); 72 | // Do something with the event 73 | Ok(()) 74 | } 75 | } 76 | 77 | // Create a consumer that listens to the topic and registers the handler 78 | let consumer = kafka_consumer!( 79 | topic = KafkaTopic { 80 | name: "test".to_string(), 81 | content_type: ContentType::Json 82 | }, 83 | // Dead letter queueing is included in the consumer!
84 | dlq_topic = KafkaTopic { 85 | name: "test-dlq".to_string(), 86 | content_type: ContentType::Json 87 | }, 88 | consumer_group_id = "test-group", 89 | bootstrap_servers = bootstrap_servers, 90 | handlers = { 91 | entity_created_event_handler: EntityCreatedEventHandler = EntityCreatedEventHandler { /*Handler state initialisation*/ } 92 | } 93 | ); 94 | ``` 95 | For more examples, check the [examples](ene_kafka_examples/) folder in the repository. 96 | 97 | ## Features 98 | - **Ease of Use thanks to Rust Macros**: Ene Kafka uses Rust macros to make it easier to define Kafka messages and to produce and consume them. The macros help abstract away a big chunk of the boilerplate, making it possible for you to focus on what matters most: your business logic. 99 | 100 | - **CloudEvents**: Ene Kafka supports the CloudEvents specification for event messages. 101 | 102 | - **Dead Letter Queueing**: Ene Kafka supports dead letter queueing for messages that fail to be handled. 103 | 104 | - **Automatic (De)serialization**: Ene Kafka automatically serializes and deserializes messages into the specified event type. 105 | 106 | - **Extensibility**: Ene Kafka is designed with extensibility in mind (though this is still a work in progress). It should be possible to use different underlying clients for Kafka, or to use other serialization libraries instead of serde. 107 | 108 | - **Async by default** 109 | 110 | ## Limitations 111 | - **Only the JSON format is supported**: Ene Kafka only supports JSON serialization and deserialization at the moment. There is no support for Avro or Protobuf. 112 | 113 | - **rdkafka is the only supported Kafka client implementation**: Ene Kafka is mostly a pretty interface implemented on top of an existing Kafka client for Rust. The intention is to make it possible for the developer to choose between several implementations. Currently, only rdkafka is supported, as it is the most stable option for Rust. We hope to have another, Rust-native, alternative in the future. 114 | 115 | - **Ene Kafka is built around CloudEvents**, which may not be suitable for all use cases. 116 | 117 | - **Limited consumer and producer configurations**: One of the goals of Ene Kafka is to abstract away the technicalities of Kafka and to make it easier to implement event-driven microservices quickly and reliably. It would, however, be nice if the consumer and producer configurations could be more flexible. This is something that can be implemented quite easily. 118 | 119 | ## Requirements 120 | Ene Kafka requires the following dependencies to be present: 121 | - anyhow 122 | - async-trait 123 | - serde and serde_json 124 | - tokio 125 | - rdkafka 126 | 127 | 128 | ## MSRV 129 | The minimum supported Rust version is 1.70. 130 | 131 | ## Open to Contributions 132 | Ene Kafka is an open-source project and we welcome contributions from the community. If you have any ideas for improvements or new features, please feel free to open an issue or a pull request. 133 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document.
11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. 
Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 
134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 
193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /md_assets/architecture.svg: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 |
[architecture.svg, diagram text] Consumer -> Event dispatcher -> Handler -> Business Logic, with a DLQ Producer attached to the consumer. The consumer consumes events from the topic and sends them to the event dispatcher; in case of failure, it produces the failed event to the DLQ. Based on the event type, the dispatcher tries to find the appropriate handler; the dispatcher fails when no handler can handle the incoming event, or when the handler fails to handle the event. The handler deserializes the event and executes the (business) logic needed for that event to be handled; it fails when it cannot deserialize the event, or when something goes wrong during the execution of the business logic. The business logic fails when it cannot handle the event for whatever reason.
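The diagram text above describes one routing rule: events flow from the consumer through the event dispatcher to a handler, and every failure along the way collapses into the same path, producing the failed event to the DLQ. As a rough illustration of that logic, here is a minimal, self-contained Rust sketch. It is not Ene Kafka's actual generated dispatcher: every name in it (`Event`, `Handler`, `Dispatcher`, `EntityCreatedHandler`) is hypothetical, the real crate is async, and it dispatches on the CloudEvents `ce_type` header rather than a plain struct field.

```rust
// Illustrative sketch only; not Ene Kafka's generated code.

#[derive(Debug)]
struct Event {
    event_type: String, // stands in for the ce_type header
    payload: String,
}

trait Handler {
    fn can_handle(&self, event_type: &str) -> bool;
    fn handle(&self, event: &Event) -> Result<(), String>;
}

struct EntityCreatedHandler;

impl Handler for EntityCreatedHandler {
    fn can_handle(&self, event_type: &str) -> bool {
        event_type == "com.ene.entity.created.v1"
    }

    fn handle(&self, event: &Event) -> Result<(), String> {
        // Deserialization and business logic would run here; any error
        // propagates back to the dispatcher unchanged.
        println!("handling {event:?}");
        Ok(())
    }
}

struct Dispatcher {
    handlers: Vec<Box<dyn Handler>>,
}

impl Dispatcher {
    /// Routes an event to the first handler that claims its type. An error
    /// (no matching handler, or a handler failure) tells the consumer to
    /// produce the event to the DLQ.
    fn dispatch(&self, event: &Event) -> Result<(), String> {
        self.handlers
            .iter()
            .find(|handler| handler.can_handle(&event.event_type))
            .ok_or_else(|| format!("no handler for event type {}", event.event_type))?
            .handle(event)
    }
}

fn main() {
    let dispatcher = Dispatcher {
        handlers: vec![Box::new(EntityCreatedHandler)],
    };
    let event = Event {
        event_type: "com.ene.entity.created.v1".to_string(),
        payload: "{}".to_string(),
    };
    if let Err(reason) = dispatcher.dispatch(&event) {
        // This is the point where the real consumer would hand the
        // failed event to its DLQ producer.
        eprintln!("dispatch failed ({reason}); producing event to DLQ");
    }
}
```

Note how "no matching handler" and "handler returned an error" funnel into a single `Result`; that is exactly the branching the diagram describes, with the DLQ producer as the one sink for both kinds of failure.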
--------------------------------------------------------------------------------