├── .env
├── project-overview.png
├── .mvn
│   └── wrapper
│       ├── maven-wrapper.jar
│       ├── maven-wrapper.properties
│       └── MavenWrapperDownloader.java
├── .editorconfig
├── stock-quote-consumer-avro
│   ├── README.md
│   ├── src
│   │   └── main
│   │       ├── java
│   │       │   └── io/stockgeeks/kafka/stock/quote/consumer/avro
│   │       │       ├── config
│   │       │       │   └── KafkaConfiguration.java
│   │       │       ├── StockQuoteConsumerAvroApplication.java
│   │       │       └── StockQuoteConsumer.java
│   │       └── resources
│   │           └── application.yml
│   ├── .gitignore
│   └── pom.xml
├── stock-quote-producer-avro
│   ├── README.md
│   ├── .gitignore
│   ├── src
│   │   └── main
│   │       ├── java
│   │       │   └── io/stockgeeks/kafka/stock/quote/producer/avro
│   │       │       ├── StockQuoteProducerAvroApplication.java
│   │       │       ├── StockQuoteProducer.java
│   │       │       ├── RandomStockQuoteProducer.java
│   │       │       └── RandomStockQuoteGenerator.java
│   │       └── resources
│   │           └── application.yml
│   └── pom.xml
├── .gitignore
├── stock-quote-kafka-streams-avro
│   ├── src
│   │   └── main
│   │       ├── java
│   │       │   └── io/stockgeeks/kafka/stock/quote/streams
│   │       │       ├── StockQuoteKafkaStreamsApplication.java
│   │       │       └── KafkaConfiguration.java
│   │       └── resources
│   │           └── application.yml
│   ├── .gitignore
│   └── pom.xml
├── pom.xml
├── stock-quote-avro-model
│   ├── src
│   │   └── main
│   │       └── resources
│   │           └── avro
│   │               └── stock-quote-v1.avsc
│   └── pom.xml
├── docker-compose.yml
├── mvnw.cmd
├── mvnw
└── README.md

/.env:
--------------------------------------------------------------------------------
TZ=Europe/Amsterdam

/project-overview.png (binary image, link to raw file):
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/stockgeeks/spring-kafka-poison-pill/HEAD/project-overview.png
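Read as a whole, the tree implies the usual local workflow. A hypothetical quick start — the commands below are an assumption based on the `docker-compose.yml`, the bundled Maven wrapper, and the module READMEs in this repository, not documented instructions:

```shell
# Start Kafka, ZooKeeper, Schema Registry, Kafka Manager and kafkacat locally
docker-compose up -d

# Build all modules (the avro-model module generates the StockQuote class first)
./mvnw clean install

# Run each application in its own terminal
./mvnw spring-boot:run -pl stock-quote-producer-avro       # port 8080
./mvnw spring-boot:run -pl stock-quote-consumer-avro       # port 8082
./mvnw spring-boot:run -pl stock-quote-kafka-streams-avro  # port 8083
```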
/.mvn/wrapper/maven-wrapper.jar (binary, link to raw file):
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/stockgeeks/spring-kafka-poison-pill/HEAD/.mvn/wrapper/maven-wrapper.jar

/.editorconfig:
--------------------------------------------------------------------------------
# Cross-IDE code style configuration
# * General info: http://EditorConfig.org
# * IntelliJ info: https://www.jetbrains.com/idea/help/configuring-code-style.html

root = true

[*]
indent_style = space
indent_size = 2

/.mvn/wrapper/maven-wrapper.properties:
--------------------------------------------------------------------------------
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.6.3/apache-maven-3.6.3-bin.zip
wrapperUrl=https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.5.6/maven-wrapper-0.5.6.jar

/stock-quote-consumer-avro/README.md:
--------------------------------------------------------------------------------
# Stock Quote Consumer Application

Simple consumer application (consuming from topic `stock-quotes-avro`) using:

* Spring Boot 2.3.0
* Spring Kafka 2.5
* Apache Avro
* Confluent Kafka
* Confluent Schema Registry

Running on port: 8082

/stock-quote-producer-avro/README.md:
--------------------------------------------------------------------------------
# Stock Quote Producer Application

Simple producer application producing random stock quotes to topic `stock-quotes-avro` using:

* Spring Boot 2.3.0
* Spring Kafka 2.5
* Apache Avro
* Confluent Kafka
* Confluent Schema Registry

Running on port: 8080

/stock-quote-consumer-avro/src/main/java/io/stockgeeks/kafka/stock/quote/consumer/avro/config/KafkaConfiguration.java:
--------------------------------------------------------------------------------
package io.stockgeeks.kafka.stock.quote.consumer.avro.config;

import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;

@Configuration
@EnableKafka
public class KafkaConfiguration {

}

/.gitignore:
--------------------------------------------------------------------------------
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**
!**/src/test/**

### STS ###
.apt_generated
.classpath
.factorypath
.project
.settings
.springBeans
.sts4-cache

### IntelliJ IDEA ###
.idea
*.iws
*.iml
*.ipr

### NetBeans ###
/nbproject/private/
/nbbuild/
/dist/
/nbdist/
/.nb-gradle/
build/

### VS Code ###
.vscode/

/stock-quote-consumer-avro/.gitignore:
--------------------------------------------------------------------------------
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**
!**/src/test/**

### STS ###
.apt_generated
.classpath
.factorypath
.project
.settings
.springBeans
.sts4-cache

### IntelliJ IDEA ###
.idea
*.iws
*.iml
*.ipr

### NetBeans ###
/nbproject/private/
/nbbuild/
/dist/
/nbdist/
/.nb-gradle/
build/

### VS Code ###
.vscode/
/stock-quote-producer-avro/.gitignore:
--------------------------------------------------------------------------------
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**
!**/src/test/**

### STS ###
.apt_generated
.classpath
.factorypath
.project
.settings
.springBeans
.sts4-cache

### IntelliJ IDEA ###
.idea
*.iws
*.iml
*.ipr

### NetBeans ###
/nbproject/private/
/nbbuild/
/dist/
/nbdist/
/.nb-gradle/
build/

### VS Code ###
.vscode/

/stock-quote-kafka-streams-avro/src/main/java/io/stockgeeks/kafka/stock/quote/streams/StockQuoteKafkaStreamsApplication.java:
--------------------------------------------------------------------------------
package io.stockgeeks.kafka.stock.quote.streams;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class StockQuoteKafkaStreamsApplication {

  public static void main(String[] args) {
    SpringApplication.run(StockQuoteKafkaStreamsApplication.class, args);
  }

}

/stock-quote-kafka-streams-avro/.gitignore:
--------------------------------------------------------------------------------
HELP.md
target/
!.mvn/wrapper/maven-wrapper.jar
!**/src/main/**/target/
!**/src/test/**/target/

### STS ###
.apt_generated
.classpath
.factorypath
.project
.settings
.springBeans
.sts4-cache

### IntelliJ IDEA ###
.idea
*.iws
*.iml
*.ipr

### NetBeans ###
/nbproject/private/
/nbbuild/
/dist/
/nbdist/
/.nb-gradle/
build/
!**/src/main/**/build/
!**/src/test/**/build/

### VS Code ###
.vscode/

/stock-quote-consumer-avro/src/main/java/io/stockgeeks/kafka/stock/quote/consumer/avro/StockQuoteConsumerAvroApplication.java:
--------------------------------------------------------------------------------
package io.stockgeeks.kafka.stock.quote.consumer.avro;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class StockQuoteConsumerAvroApplication {

  static final String TOPIC_NAME = "stock-quotes-avro";

  public static void main(String[] args) {
    SpringApplication.run(StockQuoteConsumerAvroApplication.class, args);
  }

}

/stock-quote-producer-avro/src/main/java/io/stockgeeks/kafka/stock/quote/producer/avro/StockQuoteProducerAvroApplication.java:
--------------------------------------------------------------------------------
package io.stockgeeks.kafka.stock.quote.producer.avro;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableScheduling;

@SpringBootApplication
@EnableScheduling
public class StockQuoteProducerAvroApplication {

  public static void main(String[] args) {
    SpringApplication.run(StockQuoteProducerAvroApplication.class, args);
  }

}

/pom.xml (XML markup reconstructed from the visible values; the extraction stripped the tags):
--------------------------------------------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>io.stockgeeks.kafka.stock.quote.avro</groupId>
  <artifactId>spring-kafka-poison-pill</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>pom</packaging>

  <name>spring-kafka-poison-pill</name>
  <description>Spring Kafka Example project</description>

  <modules>
    <module>stock-quote-avro-model</module>
    <module>stock-quote-producer-avro</module>
    <module>stock-quote-consumer-avro</module>
    <module>stock-quote-kafka-streams-avro</module>
  </modules>
</project>

/stock-quote-producer-avro/src/main/java/io/stockgeeks/kafka/stock/quote/producer/avro/StockQuoteProducer.java:
--------------------------------------------------------------------------------
package io.stockgeeks.kafka.stock.quote.producer.avro;

import io.stockgeeks.stock.quote.avro.StockQuote;
import lombok.extern.slf4j.Slf4j;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Slf4j
@Component
public class StockQuoteProducer {

  public static final String TOPIC_NAME = "stock-quotes-avro";

  private final KafkaTemplate<String, StockQuote> kafkaTemplate;

  public StockQuoteProducer(KafkaTemplate<String, StockQuote> kafkaTemplate) {
    this.kafkaTemplate = kafkaTemplate;
  }

  public void produce(StockQuote stockQuote) {
    log.info("Produce stock quote: {}, {} {}", stockQuote.getSymbol(), stockQuote.getCurrency(), stockQuote.getTradeValue());
    kafkaTemplate.send(TOPIC_NAME, stockQuote.getSymbol(), stockQuote);
  }
}

/stock-quote-producer-avro/src/main/resources/application.yml:
--------------------------------------------------------------------------------
server:
  port: 8080

spring:
  application:
    name: "stock-quote-producer-avro"

  kafka:
    bootstrap-servers: localhost:9092
    properties:
      # Schema Registry connection parameter
      schema.registry.url: http://localhost:8081
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer:
        io.confluent.kafka.serializers.KafkaAvroSerializer
      client-id: ${spring.application.name}
      properties:
        enable.idempotence: true

stockQuote:
  producer:
    # Mimic the stock exchange: create a random stock quote every `rateInMs` milliseconds
    rateInMs: 1000

# Open up all Spring Boot Actuator endpoints
management:
  endpoints:
    web:
      exposure:
        include: "*"

  endpoint:
    health:
      show-details: always

/stock-quote-avro-model/src/main/resources/avro/stock-quote-v1.avsc:
--------------------------------------------------------------------------------
{
  "namespace": "io.stockgeeks.stock.quote.avro",
  "type": "record",
  "name": "StockQuote",
  "version": 1,
  "doc": "Avro schema for our stock quote.",
  "fields": [
    {
      "name": "symbol",
      "type": "string",
      "doc": "The identifier of the stock."
    },
    {
      "name": "exchange",
      "type": "string",
      "doc": "The stock exchange the stock was traded on."
    },
    {
      "name": "tradeValue",
      "type": "string",
      "doc": "The value the stock was traded for."
    },
    {
      "name": "currency",
      "type": "string",
      "doc": "The currency the stock was traded in."
    },
    {
      "name": "tradeTime",
      "type": "long",
      "logicalType": "timestamp-millis",
      "doc": "Epoch millis timestamp at which the stock trade took place."
    }
  ]
}

/stock-quote-consumer-avro/src/main/java/io/stockgeeks/kafka/stock/quote/consumer/avro/StockQuoteConsumer.java:
--------------------------------------------------------------------------------
package io.stockgeeks.kafka.stock.quote.consumer.avro;

import io.stockgeeks.stock.quote.avro.StockQuote;
import lombok.extern.slf4j.Slf4j;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.handler.annotation.Header;
import org.springframework.stereotype.Component;

import static io.stockgeeks.kafka.stock.quote.consumer.avro.StockQuoteConsumerAvroApplication.TOPIC_NAME;
import static org.springframework.kafka.support.KafkaHeaders.RECEIVED_PARTITION_ID;

@Component
@Slf4j
public class StockQuoteConsumer {

  @KafkaListener(topics = TOPIC_NAME)
  public void listen(StockQuote stockQuote, @Header(RECEIVED_PARTITION_ID) Integer partitionId) {
    log.info("Consumed: {}, {} {} from partition: {}", stockQuote.getSymbol(), stockQuote.getCurrency(),
      stockQuote.getTradeValue(), partitionId);
  }

}

/stock-quote-producer-avro/src/main/java/io/stockgeeks/kafka/stock/quote/producer/avro/RandomStockQuoteProducer.java:
--------------------------------------------------------------------------------
package io.stockgeeks.kafka.stock.quote.producer.avro;

import io.stockgeeks.stock.quote.avro.StockQuote;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class RandomStockQuoteProducer {

  private final StockQuoteProducer stockQuoteProducer;
  private final RandomStockQuoteGenerator randomStockQuoteGenerator;

  public
RandomStockQuoteProducer(StockQuoteProducer stockQuoteProducer, RandomStockQuoteGenerator randomStockQuoteGenerator) {
    this.stockQuoteProducer = stockQuoteProducer;
    this.randomStockQuoteGenerator = randomStockQuoteGenerator;
  }

  @Scheduled(fixedRateString = "${stockQuote.producer.rateInMs}")
  public void produceRandomStockQuote() {
    StockQuote stockQuote = randomStockQuoteGenerator.generateRandomStockQuote();
    stockQuoteProducer.produce(stockQuote);
  }
}

/stock-quote-kafka-streams-avro/src/main/java/io/stockgeeks/kafka/stock/quote/streams/KafkaConfiguration.java (generic type parameters reconstructed; the extraction stripped the angle brackets):
--------------------------------------------------------------------------------
package io.stockgeeks.kafka.stock.quote.streams;

import io.stockgeeks.stock.quote.avro.StockQuote;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Printed;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafkaStreams;

@Configuration
@EnableKafkaStreams
public class KafkaConfiguration {

  private static final String INPUT_TOPIC_NAME = "stock-quotes-avro";
  public static final String OUTPUT_TOPIC_NAME = "stock-quotes-ing-avro";

  @Bean
  public KStream<String, StockQuote> kStream(StreamsBuilder kStreamBuilder) {
    KStream<String, StockQuote> input = kStreamBuilder.stream(INPUT_TOPIC_NAME);
    input.print(Printed.toSysOut());

    // Simple filter to stream all ING stock quotes to topic `stock-quotes-ing-avro`
    KStream<String, StockQuote> output = input.filter((key, value) -> value.getSymbol().equals("INGA"));
    output.to(OUTPUT_TOPIC_NAME);
    return output;
  }
}

--------------------------------------------------------------------------------
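The only business logic in the streams topology above is the symbol predicate. Pulled out of the `KStream`, it can be exercised without a broker or the generated Avro class — a minimal sketch, assuming a plain `String` symbol stands in for `StockQuote.getSymbol()` (the class name `IngFilterSketch` is hypothetical):

```java
import java.util.List;
import java.util.function.BiPredicate;
import java.util.stream.Collectors;

public class IngFilterSketch {

    // Same predicate the topology applies: keep only ING stock quotes (symbol "INGA")
    static final BiPredicate<String, String> IS_ING = (key, symbol) -> "INGA".equals(symbol);

    public static void main(String[] args) {
        List<String> symbols = List.of("AAPL", "INGA", "RDSA", "INGA");
        List<String> filtered = symbols.stream()
            .filter(symbol -> IS_ING.test(null, symbol))
            .collect(Collectors.toList());
        System.out.println(filtered); // prints [INGA, INGA]
    }
}
```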
/stock-quote-consumer-avro/src/main/resources/application.yml:
--------------------------------------------------------------------------------
server:
  port: 8082

spring:
  application:
    name: "stock-quote-consumer-avro"

  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      client-id: ${spring.application.name}
      group-id: ${spring.application.name}-group
      auto-offset-reset: earliest
      properties:
        schema.registry.url: http://localhost:8081
        # Tells Kafka / Schema Registry that we will be using a specific Avro type
        # (the StockQuote type in this case); otherwise Kafka will expect GenericRecord on the topic.
        specific.avro.reader: true

    # A topic missing on the broker will not fail the application at startup
    listener:
      missing-topics-fatal: false

  # Since Spring Boot 2.2 JMX is disabled by default, but
  # Micrometer depends on JMX for (Spring) Kafka consumer metrics!
  jmx:
    enabled: true

management:
  endpoints:
    web:
      exposure:
        include: "*"

  metrics:
    tags:
      application: ${spring.application.name}
      region: us-west-1

/stock-quote-kafka-streams-avro/src/main/resources/application.yml:
--------------------------------------------------------------------------------
server:
  port: 8083

spring:
  application:
    name: "stock-quote-kafka-streams"

  kafka:
    bootstrap-servers: localhost:9092
    streams:
      application-id: ${spring.application.name}
      client-id: ${spring.application.name}-stream
      properties:
        default.key.serde: org.apache.kafka.common.serialization.Serdes$StringSerde
        default.value.serde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
        # This is the default: log, fail and stop processing records (stop the stream)
        default.deserialization.exception.handler: org.apache.kafka.streams.errors.LogAndFailExceptionHandler
    properties:
      schema.registry.url: http://localhost:8081
      # Tells Kafka / Schema Registry that we will be using a specific Avro type
      # (the StockQuote type in this case); otherwise Kafka will expect GenericRecord on the topic.
      specific.avro.reader: true
      bootstrap.servers: ${spring.kafka.bootstrap-servers}

    # A topic missing on the broker will not fail the application at startup
    listener:
      missing-topics-fatal: false

management:
  endpoints:
    web:
      exposure:
        include: "*"

  metrics:
    tags:
      application: ${spring.application.name}
      region: us-west-1

/stock-quote-avro-model/pom.xml (XML markup reconstructed from the visible values; the extraction stripped the tags):
--------------------------------------------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>io.stockgeeks.kafka.stock.quote.avro</groupId>
  <artifactId>stock-quote-avro-model</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>stock-quote-avro-model</name>
  <description>Module for Avro files and Java Generated Code</description>

  <properties>
    <java.version>1.8</java.version>
    <avro.version>1.9.2</avro.version>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
      <version>${avro.version}</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.avro</groupId>
        <artifactId>avro-maven-plugin</artifactId>
        <version>${avro.version}</version>
        <executions>
          <execution>
            <phase>generate-sources</phase>
            <goals>
              <goal>schema</goal>
            </goals>
            <configuration>
              <sourceDirectory>src/main/resources/avro</sourceDirectory>
              <outputDirectory>${project.build.directory}/generated-sources</outputDirectory>
              <stringType>String</stringType>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

/stock-quote-producer-avro/src/main/java/io/stockgeeks/kafka/stock/quote/producer/avro/RandomStockQuoteGenerator.java:
--------------------------------------------------------------------------------
package io.stockgeeks.kafka.stock.quote.producer.avro;

import io.stockgeeks.stock.quote.avro.StockQuote;
import lombok.Value;
import org.apache.commons.math3.random.RandomDataGenerator;
import org.springframework.stereotype.Component;

import java.math.BigDecimal;
import java.math.RoundingMode;
import java.time.Instant;
import java.util.Arrays;
import java.util.List;
import java.util.Random;

@Component
public
class RandomStockQuoteGenerator {

  private List<Instrument> instruments;

  private static final String STOCK_EXCHANGE_NASDAQ = "NASDAQ";
  private static final String STOCK_EXCHANGE_AMSTERDAM = "AMS";

  private static final String CURRENCY_EURO = "EUR";
  private static final String CURRENCY_US_DOLLAR = "USD";

  public RandomStockQuoteGenerator() {
    instruments = Arrays.asList(new Instrument("AAPL", STOCK_EXCHANGE_NASDAQ, CURRENCY_US_DOLLAR),
      new Instrument("AMZN", STOCK_EXCHANGE_NASDAQ, CURRENCY_US_DOLLAR),
      new Instrument("GOOGL", STOCK_EXCHANGE_NASDAQ, CURRENCY_US_DOLLAR),
      new Instrument("NFLX", STOCK_EXCHANGE_NASDAQ, CURRENCY_US_DOLLAR),
      new Instrument("INGA", STOCK_EXCHANGE_AMSTERDAM, CURRENCY_EURO),
      new Instrument("AD", STOCK_EXCHANGE_AMSTERDAM, CURRENCY_EURO),
      new Instrument("RDSA", STOCK_EXCHANGE_AMSTERDAM, CURRENCY_EURO));
  }

  StockQuote generateRandomStockQuote() {
    Instrument randomInstrument = pickRandomInstrument();
    BigDecimal randomPrice = generateRandomPrice();
    return new StockQuote(randomInstrument.getSymbol(), randomInstrument.getExchange(), randomPrice.toPlainString(),
      randomInstrument.getCurrency(), Instant.now().toEpochMilli());
  }

  private BigDecimal generateRandomPrice() {
    double leftLimit = 1.000D;
    double rightLimit = 3000.000D;

    BigDecimal randomPrice = BigDecimal.valueOf(new RandomDataGenerator().nextUniform(leftLimit, rightLimit));
    randomPrice = randomPrice.setScale(3, RoundingMode.HALF_UP);
    return randomPrice;
  }

  private Instrument pickRandomInstrument() {
    int randomIndex = new Random().nextInt(instruments.size());
    return instruments.get(randomIndex);
  }

  @Value
  private static class Instrument {
    private String symbol;
    private String exchange;
    private String currency;
  }
}
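The price logic above depends on commons-math3 only for the uniform draw, so it can be reproduced and exercised with the JDK alone. A minimal sketch, assuming `java.util.Random#nextDouble` as a stand-in for `RandomDataGenerator#nextUniform` (the class name `PriceSketch` is hypothetical):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.util.Random;

public class PriceSketch {

    // Mirrors generateRandomPrice(): uniform value in [1, 3000), scaled to 3 decimals, HALF_UP
    static BigDecimal randomPrice(Random random) {
        double leftLimit = 1.000D;
        double rightLimit = 3000.000D;
        double raw = leftLimit + (rightLimit - leftLimit) * random.nextDouble();
        return BigDecimal.valueOf(raw).setScale(3, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        // toPlainString() always yields exactly three decimal places thanks to setScale(3, ...)
        System.out.println(randomPrice(new Random()).toPlainString());
    }
}
```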
/stock-quote-consumer-avro/pom.xml (XML markup reconstructed from the visible values; the extraction stripped the tags):
--------------------------------------------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.0.RELEASE</version>
  </parent>

  <groupId>io.stockgeeks.kafka.stock.quote.avro</groupId>
  <artifactId>stock-quote-consumer-avro</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>stock-quote-consumer-avro</name>
  <description>Demo project for Spring Boot + Spring Kafka</description>

  <properties>
    <java.version>1.8</java.version>
    <confluent.kafka.avro.serializer.version>5.5.0</confluent.kafka.avro.serializer.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>io.stockgeeks.kafka.stock.quote.avro</groupId>
      <artifactId>stock-quote-avro-model</artifactId>
      <version>${project.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-actuator</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.kafka</groupId>
      <artifactId>spring-kafka</artifactId>
    </dependency>
    <!-- Confluent Avro serializer, served from the Confluent Maven repository -->
    <dependency>
      <groupId>io.confluent</groupId>
      <artifactId>kafka-avro-serializer</artifactId>
      <version>${confluent.kafka.avro.serializer.version}</version>
    </dependency>
    <dependency>
      <groupId>org.projectlombok</groupId>
      <artifactId>lombok</artifactId>
      <optional>true</optional>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-test</artifactId>
      <scope>test</scope>
      <exclusions>
        <exclusion>
          <groupId>org.junit.vintage</groupId>
          <artifactId>junit-vintage-engine</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.springframework.kafka</groupId>
      <artifactId>spring-kafka-test</artifactId>
      <scope>test</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
      </plugin>
    </plugins>
  </build>

  <repositories>
    <repository>
      <id>confluent</id>
      <url>https://packages.confluent.io/maven/</url>
    </repository>
    <repository>
      <id>spring-milestones</id>
      <name>Spring Milestones</name>
      <url>https://repo.spring.io/milestone</url>
    </repository>
  </repositories>
</project>

/stock-quote-producer-avro/pom.xml (XML markup reconstructed from the visible values; the extraction stripped the tags):
--------------------------------------------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.0.RELEASE</version>
  </parent>

  <groupId>io.stockgeeks.kafka.stock.quote.avro</groupId>
  <artifactId>stock-quote-producer-avro</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>stock-quote-producer-avro</name>
  <description>Demo project for Spring Boot + Spring Kafka</description>

  <properties>
    <java.version>1.8</java.version>
    <confluent.kafka.avro.serializer.version>5.5.0</confluent.kafka.avro.serializer.version>
    <commons.math3.version>3.6.1</commons.math3.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>io.stockgeeks.kafka.stock.quote.avro</groupId>
      <artifactId>stock-quote-avro-model</artifactId>
      <version>${project.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-actuator</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.kafka</groupId>
      <artifactId>spring-kafka</artifactId>
    </dependency>
    <dependency>
      <groupId>org.projectlombok</groupId>
      <artifactId>lombok</artifactId>
    </dependency>
    <!-- Confluent Avro serializer, served from the Confluent Maven repository -->
    <dependency>
      <groupId>io.confluent</groupId>
      <artifactId>kafka-avro-serializer</artifactId>
      <version>${confluent.kafka.avro.serializer.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-test</artifactId>
      <scope>test</scope>
      <exclusions>
        <exclusion>
          <groupId>org.junit.vintage</groupId>
          <artifactId>junit-vintage-engine</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.springframework.kafka</groupId>
      <artifactId>spring-kafka-test</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-math3</artifactId>
      <version>${commons.math3.version}</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
      </plugin>
    </plugins>
  </build>

  <repositories>
    <repository>
      <id>confluent</id>
      <url>https://packages.confluent.io/maven/</url>
    </repository>
  </repositories>
</project>

/stock-quote-kafka-streams-avro/pom.xml (XML markup reconstructed from the visible values; the extraction stripped the tags):
--------------------------------------------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.3.RELEASE</version>
  </parent>

  <groupId>io.stockgeeks.kafka.stock.quote.avro</groupId>
  <artifactId>stock-quote-kafka-streams</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>stock-quote-kafka-streams</name>
  <description>Demo project for Spring Boot</description>

  <properties>
    <java.version>1.8</java.version>
    <confluent.kafka.avro.version>5.5.0</confluent.kafka.avro.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>io.stockgeeks.kafka.stock.quote.avro</groupId>
      <artifactId>stock-quote-avro-model</artifactId>
      <version>${project.version}</version>
    </dependency>
    <!-- Confluent Avro serializer and Streams Serde, served from the Confluent Maven repository -->
    <dependency>
      <groupId>io.confluent</groupId>
      <artifactId>kafka-avro-serializer</artifactId>
      <version>${confluent.kafka.avro.version}</version>
    </dependency>
    <dependency>
      <groupId>io.confluent</groupId>
      <artifactId>kafka-streams-avro-serde</artifactId>
      <version>${confluent.kafka.avro.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-actuator</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka-streams</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.kafka</groupId>
      <artifactId>spring-kafka</artifactId>
    </dependency>
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-test</artifactId>
      <scope>test</scope>
      <exclusions>
        <exclusion>
          <groupId>org.junit.vintage</groupId>
          <artifactId>junit-vintage-engine</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.springframework.kafka</groupId>
      <artifactId>spring-kafka-test</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.projectlombok</groupId>
      <artifactId>lombok</artifactId>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
      </plugin>
    </plugins>
  </build>

  <repositories>
    <repository>
      <id>confluent</id>
      <url>https://packages.confluent.io/maven/</url>
    </repository>
  </repositories>
</project>

/docker-compose.yml:
--------------------------------------------------------------------------------
# Docker Compose file format 3.7 requires Docker Engine 18.06.0+
# For more information see: https://docs.docker.com/compose/compose-file/compose-versioning/
version: '3.7'

services:

  # Confluent Kafka Docker image, see: https://hub.docker.com/r/confluentinc/cp-kafka/
  # Confluent Platform and Apache Kafka compatibility:
  # https://docs.confluent.io/current/installation/versions-interoperability.html
  # Confluent Platform 5.5.x = Apache Kafka 2.5.x
  kafka:
    image: confluentinc/cp-kafka:5.5.0
    container_name: kafka
    hostname: kafka
    # Just in case ZooKeeper isn't up yet: restart
    restart: always
    environment:
      # KAFKA_ADVERTISED_LISTENERS: comma-separated list of listeners with their host/ip and port.
      # This is the metadata that's passed back to clients.
      # LISTENER_DOCKER_INTERNAL: makes Kafka accessible to other Docker containers by advertising its location on the Docker network, port: 29092.
      # LISTENER_DOCKER_EXTERNAL: makes Kafka accessible from outside the Docker network (your machine), port: 9092.
      KAFKA_LISTENERS: LISTENER_DOCKER_INTERNAL://:29092,LISTENER_DOCKER_EXTERNAL://:9092
      KAFKA_ADVERTISED_LISTENERS: LISTENER_DOCKER_INTERNAL://kafka:29092,LISTENER_DOCKER_EXTERNAL://localhost:9092
      # Key/value pairs for the security protocol to use, per listener name
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: LISTENER_DOCKER_INTERNAL:PLAINTEXT,LISTENER_DOCKER_EXTERNAL:PLAINTEXT
      # The same ZooKeeper port as specified on the zookeeper container
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_INTER_BROKER_LISTENER_NAME: LISTENER_DOCKER_INTERNAL
      # KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR is set to 1 for a single-node cluster. Unless you have three or more
      # nodes you do not need to change this from the default.
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_DEFAULT_REPLICATION_FACTOR: 1
      KAFKA_NUM_PARTITIONS: 3
      # Whether or not to auto-create topics when data is published to them for the first time
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
      JMX_PORT: 9999
      KAFKA_JMX_OPTS: -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false -Djava.rmi.server.hostname=kafka -Dcom.sun.management.jmxremote.rmi.port=9999
      KAFKA_LOG4J_LOGGERS: "kafka.controller=INFO,kafka.producer.async.DefaultEventHandler=INFO,state.change.logger=INFO"
      CONFLUENT_SUPPORT_CUSTOMER_ID: 'anonymous'
    ports:
      - 9092:9092
      - 9999:9999
    depends_on:
      - zookeeper

  # Confluent ZooKeeper Docker image, see: https://hub.docker.com/r/confluentinc/cp-zookeeper/
  zookeeper:
    container_name: zookeeper
    hostname: zookeeper
    image: confluentinc/cp-zookeeper:5.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
    ports:
      - 2181:2181

  # Confluent Schema Registry Docker image, see: https://hub.docker.com/r/confluentinc/cp-schema-registry
  # Schema Registry: http://localhost:8081
  schema-registry:
    image: confluentinc/cp-schema-registry:5.5.0
    hostname: schema-registry
    container_name: schema-registry
    restart: always
    environment:
      # Connects to the Docker internal network, port: 29092
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: "kafka:29092"
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_LISTENERS: "http://0.0.0.0:8081"
    ports:
      - 8081:8081
    depends_on:
      - zookeeper

  # https://hub.docker.com/r/hlebalbau/kafka-manager/
  # Kafka Manager: http://localhost:9000
  kafka-manager:
    container_name: kafka-manager
    image: hlebalbau/kafka-manager
    environment:
      ZK_HOSTS: "zookeeper:2181"
    ports:
      - 9000:9000
    depends_on:
      - zookeeper

  # Confluent kafkacat (command line tool for interacting with Kafka brokers)
  # Docker image, see: https://hub.docker.com/r/confluentinc/cp-kafkacat/
  kafkacat:
    container_name: kafkacat
    image: confluentinc/cp-kafkacat:5.5.0
    command: sleep infinity

/.mvn/wrapper/MavenWrapperDownloader.java:
--------------------------------------------------------------------------------
/*
 * Copyright 2007-present the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | import java.net.*; 17 | import java.io.*; 18 | import java.nio.channels.*; 19 | import java.util.Properties; 20 | 21 | public class MavenWrapperDownloader { 22 | 23 | private static final String WRAPPER_VERSION = "0.5.6"; 24 | /** 25 | * Default URL to download the maven-wrapper.jar from, if no 'downloadUrl' is provided. 26 | */ 27 | private static final String DEFAULT_DOWNLOAD_URL = "https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/" 28 | + WRAPPER_VERSION + "/maven-wrapper-" + WRAPPER_VERSION + ".jar"; 29 | 30 | /** 31 | * Path to the maven-wrapper.properties file, which might contain a downloadUrl property to 32 | * use instead of the default one. 33 | */ 34 | private static final String MAVEN_WRAPPER_PROPERTIES_PATH = 35 | ".mvn/wrapper/maven-wrapper.properties"; 36 | 37 | /** 38 | * Path where the maven-wrapper.jar will be saved to. 39 | */ 40 | private static final String MAVEN_WRAPPER_JAR_PATH = 41 | ".mvn/wrapper/maven-wrapper.jar"; 42 | 43 | /** 44 | * Name of the property which should be used to override the default download url for the wrapper. 45 | */ 46 | private static final String PROPERTY_NAME_WRAPPER_URL = "wrapperUrl"; 47 | 48 | public static void main(String args[]) { 49 | System.out.println("- Downloader started"); 50 | File baseDirectory = new File(args[0]); 51 | System.out.println("- Using base directory: " + baseDirectory.getAbsolutePath()); 52 | 53 | // If the maven-wrapper.properties exists, read it and check if it contains a custom 54 | // wrapperUrl parameter. 
55 | File mavenWrapperPropertyFile = new File(baseDirectory, MAVEN_WRAPPER_PROPERTIES_PATH); 56 | String url = DEFAULT_DOWNLOAD_URL; 57 | if(mavenWrapperPropertyFile.exists()) { 58 | FileInputStream mavenWrapperPropertyFileInputStream = null; 59 | try { 60 | mavenWrapperPropertyFileInputStream = new FileInputStream(mavenWrapperPropertyFile); 61 | Properties mavenWrapperProperties = new Properties(); 62 | mavenWrapperProperties.load(mavenWrapperPropertyFileInputStream); 63 | url = mavenWrapperProperties.getProperty(PROPERTY_NAME_WRAPPER_URL, url); 64 | } catch (IOException e) { 65 | System.out.println("- ERROR loading '" + MAVEN_WRAPPER_PROPERTIES_PATH + "'"); 66 | } finally { 67 | try { 68 | if(mavenWrapperPropertyFileInputStream != null) { 69 | mavenWrapperPropertyFileInputStream.close(); 70 | } 71 | } catch (IOException e) { 72 | // Ignore ... 73 | } 74 | } 75 | } 76 | System.out.println("- Downloading from: " + url); 77 | 78 | File outputFile = new File(baseDirectory.getAbsolutePath(), MAVEN_WRAPPER_JAR_PATH); 79 | if(!outputFile.getParentFile().exists()) { 80 | if(!outputFile.getParentFile().mkdirs()) { 81 | System.out.println( 82 | "- ERROR creating output directory '" + outputFile.getParentFile().getAbsolutePath() + "'"); 83 | } 84 | } 85 | System.out.println("- Downloading to: " + outputFile.getAbsolutePath()); 86 | try { 87 | downloadFileFromURL(url, outputFile); 88 | System.out.println("Done"); 89 | System.exit(0); 90 | } catch (Throwable e) { 91 | System.out.println("- Error downloading"); 92 | e.printStackTrace(); 93 | System.exit(1); 94 | } 95 | } 96 | 97 | private static void downloadFileFromURL(String urlString, File destination) throws Exception { 98 | if (System.getenv("MVNW_USERNAME") != null && System.getenv("MVNW_PASSWORD") != null) { 99 | String username = System.getenv("MVNW_USERNAME"); 100 | char[] password = System.getenv("MVNW_PASSWORD").toCharArray(); 101 | Authenticator.setDefault(new Authenticator() { 102 | @Override 103 | protected 
PasswordAuthentication getPasswordAuthentication() { 104 | return new PasswordAuthentication(username, password); 105 | } 106 | }); 107 | } 108 | URL website = new URL(urlString); 109 | ReadableByteChannel rbc; 110 | rbc = Channels.newChannel(website.openStream()); 111 | FileOutputStream fos = new FileOutputStream(destination); 112 | fos.getChannel().transferFrom(rbc, 0, Long.MAX_VALUE); 113 | fos.close(); 114 | rbc.close(); 115 | } 116 | 117 | } 118 | -------------------------------------------------------------------------------- /mvnw.cmd: -------------------------------------------------------------------------------- 1 | @REM ---------------------------------------------------------------------------- 2 | @REM Licensed to the Apache Software Foundation (ASF) under one 3 | @REM or more contributor license agreements. See the NOTICE file 4 | @REM distributed with this work for additional information 5 | @REM regarding copyright ownership. The ASF licenses this file 6 | @REM to you under the Apache License, Version 2.0 (the 7 | @REM "License"); you may not use this file except in compliance 8 | @REM with the License. You may obtain a copy of the License at 9 | @REM 10 | @REM http://www.apache.org/licenses/LICENSE-2.0 11 | @REM 12 | @REM Unless required by applicable law or agreed to in writing, 13 | @REM software distributed under the License is distributed on an 14 | @REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY 15 | @REM KIND, either express or implied. See the License for the 16 | @REM specific language governing permissions and limitations 17 | @REM under the License. 
18 | @REM ---------------------------------------------------------------------------- 19 | 20 | @REM ---------------------------------------------------------------------------- 21 | @REM Maven Start Up Batch script 22 | @REM 23 | @REM Required ENV vars: 24 | @REM JAVA_HOME - location of a JDK home dir 25 | @REM 26 | @REM Optional ENV vars 27 | @REM M2_HOME - location of maven2's installed home dir 28 | @REM MAVEN_BATCH_ECHO - set to 'on' to enable the echoing of the batch commands 29 | @REM MAVEN_BATCH_PAUSE - set to 'on' to wait for a keystroke before ending 30 | @REM MAVEN_OPTS - parameters passed to the Java VM when running Maven 31 | @REM e.g. to debug Maven itself, use 32 | @REM set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 33 | @REM MAVEN_SKIP_RC - flag to disable loading of mavenrc files 34 | @REM ---------------------------------------------------------------------------- 35 | 36 | @REM Begin all REM lines with '@' in case MAVEN_BATCH_ECHO is 'on' 37 | @echo off 38 | @REM set title of command window 39 | title %0 40 | @REM enable echoing by setting MAVEN_BATCH_ECHO to 'on' 41 | @if "%MAVEN_BATCH_ECHO%" == "on" echo %MAVEN_BATCH_ECHO% 42 | 43 | @REM set %HOME% to equivalent of $HOME 44 | if "%HOME%" == "" (set "HOME=%HOMEDRIVE%%HOMEPATH%") 45 | 46 | @REM Execute a user defined script before this one 47 | if not "%MAVEN_SKIP_RC%" == "" goto skipRcPre 48 | @REM check for pre script, once with legacy .bat ending and once with .cmd ending 49 | if exist "%HOME%\mavenrc_pre.bat" call "%HOME%\mavenrc_pre.bat" 50 | if exist "%HOME%\mavenrc_pre.cmd" call "%HOME%\mavenrc_pre.cmd" 51 | :skipRcPre 52 | 53 | @setlocal 54 | 55 | set ERROR_CODE=0 56 | 57 | @REM To isolate internal variables from possible post scripts, we use another setlocal 58 | @setlocal 59 | 60 | @REM ==== START VALIDATION ==== 61 | if not "%JAVA_HOME%" == "" goto OkJHome 62 | 63 | echo. 64 | echo Error: JAVA_HOME not found in your environment. 
>&2 65 | echo Please set the JAVA_HOME variable in your environment to match the >&2 66 | echo location of your Java installation. >&2 67 | echo. 68 | goto error 69 | 70 | :OkJHome 71 | if exist "%JAVA_HOME%\bin\java.exe" goto init 72 | 73 | echo. 74 | echo Error: JAVA_HOME is set to an invalid directory. >&2 75 | echo JAVA_HOME = "%JAVA_HOME%" >&2 76 | echo Please set the JAVA_HOME variable in your environment to match the >&2 77 | echo location of your Java installation. >&2 78 | echo. 79 | goto error 80 | 81 | @REM ==== END VALIDATION ==== 82 | 83 | :init 84 | 85 | @REM Find the project base dir, i.e. the directory that contains the folder ".mvn". 86 | @REM Fallback to current working directory if not found. 87 | 88 | set MAVEN_PROJECTBASEDIR=%MAVEN_BASEDIR% 89 | IF NOT "%MAVEN_PROJECTBASEDIR%"=="" goto endDetectBaseDir 90 | 91 | set EXEC_DIR=%CD% 92 | set WDIR=%EXEC_DIR% 93 | :findBaseDir 94 | IF EXIST "%WDIR%"\.mvn goto baseDirFound 95 | cd .. 96 | IF "%WDIR%"=="%CD%" goto baseDirNotFound 97 | set WDIR=%CD% 98 | goto findBaseDir 99 | 100 | :baseDirFound 101 | set MAVEN_PROJECTBASEDIR=%WDIR% 102 | cd "%EXEC_DIR%" 103 | goto endDetectBaseDir 104 | 105 | :baseDirNotFound 106 | set MAVEN_PROJECTBASEDIR=%EXEC_DIR% 107 | cd "%EXEC_DIR%" 108 | 109 | :endDetectBaseDir 110 | 111 | IF NOT EXIST "%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config" goto endReadAdditionalConfig 112 | 113 | @setlocal EnableExtensions EnableDelayedExpansion 114 | for /F "usebackq delims=" %%a in ("%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config") do set JVM_CONFIG_MAVEN_PROPS=!JVM_CONFIG_MAVEN_PROPS! 
%%a 115 | @endlocal & set JVM_CONFIG_MAVEN_PROPS=%JVM_CONFIG_MAVEN_PROPS% 116 | 117 | :endReadAdditionalConfig 118 | 119 | SET MAVEN_JAVA_EXE="%JAVA_HOME%\bin\java.exe" 120 | set WRAPPER_JAR="%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.jar" 121 | set WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain 122 | 123 | set DOWNLOAD_URL="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.5.6/maven-wrapper-0.5.6.jar" 124 | 125 | FOR /F "tokens=1,2 delims==" %%A IN ("%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.properties") DO ( 126 | IF "%%A"=="wrapperUrl" SET DOWNLOAD_URL=%%B 127 | ) 128 | 129 | @REM Extension to allow automatically downloading the maven-wrapper.jar from Maven-central 130 | @REM This allows using the maven wrapper in projects that prohibit checking in binary data. 131 | if exist %WRAPPER_JAR% ( 132 | if "%MVNW_VERBOSE%" == "true" ( 133 | echo Found %WRAPPER_JAR% 134 | ) 135 | ) else ( 136 | if not "%MVNW_REPOURL%" == "" ( 137 | SET DOWNLOAD_URL="%MVNW_REPOURL%/io/takari/maven-wrapper/0.5.6/maven-wrapper-0.5.6.jar" 138 | ) 139 | if "%MVNW_VERBOSE%" == "true" ( 140 | echo Couldn't find %WRAPPER_JAR%, downloading it ... 141 | echo Downloading from: %DOWNLOAD_URL% 142 | ) 143 | 144 | powershell -Command "&{"^ 145 | "$webclient = new-object System.Net.WebClient;"^ 146 | "if (-not ([string]::IsNullOrEmpty('%MVNW_USERNAME%') -and [string]::IsNullOrEmpty('%MVNW_PASSWORD%'))) {"^ 147 | "$webclient.Credentials = new-object System.Net.NetworkCredential('%MVNW_USERNAME%', '%MVNW_PASSWORD%');"^ 148 | "}"^ 149 | "[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12; $webclient.DownloadFile('%DOWNLOAD_URL%', '%WRAPPER_JAR%')"^ 150 | "}" 151 | if "%MVNW_VERBOSE%" == "true" ( 152 | echo Finished downloading %WRAPPER_JAR% 153 | ) 154 | ) 155 | @REM End of extension 156 | 157 | @REM Provide a "standardized" way to retrieve the CLI args that will 158 | @REM work with both Windows and non-Windows executions. 
159 | set MAVEN_CMD_LINE_ARGS=%* 160 | 161 | %MAVEN_JAVA_EXE% %JVM_CONFIG_MAVEN_PROPS% %MAVEN_OPTS% %MAVEN_DEBUG_OPTS% -classpath %WRAPPER_JAR% "-Dmaven.multiModuleProjectDirectory=%MAVEN_PROJECTBASEDIR%" %WRAPPER_LAUNCHER% %MAVEN_CONFIG% %* 162 | if ERRORLEVEL 1 goto error 163 | goto end 164 | 165 | :error 166 | set ERROR_CODE=1 167 | 168 | :end 169 | @endlocal & set ERROR_CODE=%ERROR_CODE% 170 | 171 | if not "%MAVEN_SKIP_RC%" == "" goto skipRcPost 172 | @REM check for post script, once with legacy .bat ending and once with .cmd ending 173 | if exist "%HOME%\mavenrc_post.bat" call "%HOME%\mavenrc_post.bat" 174 | if exist "%HOME%\mavenrc_post.cmd" call "%HOME%\mavenrc_post.cmd" 175 | :skipRcPost 176 | 177 | @REM pause the script if MAVEN_BATCH_PAUSE is set to 'on' 178 | if "%MAVEN_BATCH_PAUSE%" == "on" pause 179 | 180 | if "%MAVEN_TERMINATE_CMD%" == "on" exit %ERROR_CODE% 181 | 182 | exit /B %ERROR_CODE% 183 | -------------------------------------------------------------------------------- /mvnw: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | # ---------------------------------------------------------------------------- 3 | # Licensed to the Apache Software Foundation (ASF) under one 4 | # or more contributor license agreements. See the NOTICE file 5 | # distributed with this work for additional information 6 | # regarding copyright ownership. The ASF licenses this file 7 | # to you under the Apache License, Version 2.0 (the 8 | # "License"); you may not use this file except in compliance 9 | # with the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, 14 | # software distributed under the License is distributed on an 15 | # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY 16 | # KIND, either express or implied. 
See the License for the 17 | # specific language governing permissions and limitations 18 | # under the License. 19 | # ---------------------------------------------------------------------------- 20 | 21 | # ---------------------------------------------------------------------------- 22 | # Maven Start Up Batch script 23 | # 24 | # Required ENV vars: 25 | # ------------------ 26 | # JAVA_HOME - location of a JDK home dir 27 | # 28 | # Optional ENV vars 29 | # ----------------- 30 | # M2_HOME - location of maven2's installed home dir 31 | # MAVEN_OPTS - parameters passed to the Java VM when running Maven 32 | # e.g. to debug Maven itself, use 33 | # set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 34 | # MAVEN_SKIP_RC - flag to disable loading of mavenrc files 35 | # ---------------------------------------------------------------------------- 36 | 37 | if [ -z "$MAVEN_SKIP_RC" ] ; then 38 | 39 | if [ -f /etc/mavenrc ] ; then 40 | . /etc/mavenrc 41 | fi 42 | 43 | if [ -f "$HOME/.mavenrc" ] ; then 44 | . "$HOME/.mavenrc" 45 | fi 46 | 47 | fi 48 | 49 | # OS specific support. $var _must_ be set to either true or false. 
50 | cygwin=false; 51 | darwin=false; 52 | mingw=false 53 | case "`uname`" in 54 | CYGWIN*) cygwin=true ;; 55 | MINGW*) mingw=true;; 56 | Darwin*) darwin=true 57 | # Use /usr/libexec/java_home if available, otherwise fall back to /Library/Java/Home 58 | # See https://developer.apple.com/library/mac/qa/qa1170/_index.html 59 | if [ -z "$JAVA_HOME" ]; then 60 | if [ -x "/usr/libexec/java_home" ]; then 61 | export JAVA_HOME="`/usr/libexec/java_home`" 62 | else 63 | export JAVA_HOME="/Library/Java/Home" 64 | fi 65 | fi 66 | ;; 67 | esac 68 | 69 | if [ -z "$JAVA_HOME" ] ; then 70 | if [ -r /etc/gentoo-release ] ; then 71 | JAVA_HOME=`java-config --jre-home` 72 | fi 73 | fi 74 | 75 | if [ -z "$M2_HOME" ] ; then 76 | ## resolve links - $0 may be a link to maven's home 77 | PRG="$0" 78 | 79 | # need this for relative symlinks 80 | while [ -h "$PRG" ] ; do 81 | ls=`ls -ld "$PRG"` 82 | link=`expr "$ls" : '.*-> \(.*\)$'` 83 | if expr "$link" : '/.*' > /dev/null; then 84 | PRG="$link" 85 | else 86 | PRG="`dirname "$PRG"`/$link" 87 | fi 88 | done 89 | 90 | saveddir=`pwd` 91 | 92 | M2_HOME=`dirname "$PRG"`/.. 93 | 94 | # make it fully qualified 95 | M2_HOME=`cd "$M2_HOME" && pwd` 96 | 97 | cd "$saveddir" 98 | # echo Using m2 at $M2_HOME 99 | fi 100 | 101 | # For Cygwin, ensure paths are in UNIX format before anything is touched 102 | if $cygwin ; then 103 | [ -n "$M2_HOME" ] && 104 | M2_HOME=`cygpath --unix "$M2_HOME"` 105 | [ -n "$JAVA_HOME" ] && 106 | JAVA_HOME=`cygpath --unix "$JAVA_HOME"` 107 | [ -n "$CLASSPATH" ] && 108 | CLASSPATH=`cygpath --path --unix "$CLASSPATH"` 109 | fi 110 | 111 | # For Mingw, ensure paths are in UNIX format before anything is touched 112 | if $mingw ; then 113 | [ -n "$M2_HOME" ] && 114 | M2_HOME="`(cd "$M2_HOME"; pwd)`" 115 | [ -n "$JAVA_HOME" ] && 116 | JAVA_HOME="`(cd "$JAVA_HOME"; pwd)`" 117 | fi 118 | 119 | if [ -z "$JAVA_HOME" ]; then 120 | javaExecutable="`which javac`" 121 | if [ -n "$javaExecutable" ] && ! 
[ "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then 122 | # readlink(1) is not available as standard on Solaris 10. 123 | readLink=`which readlink` 124 | if [ ! `expr "$readLink" : '\([^ ]*\)'` = "no" ]; then 125 | if $darwin ; then 126 | javaHome="`dirname \"$javaExecutable\"`" 127 | javaExecutable="`cd \"$javaHome\" && pwd -P`/javac" 128 | else 129 | javaExecutable="`readlink -f \"$javaExecutable\"`" 130 | fi 131 | javaHome="`dirname \"$javaExecutable\"`" 132 | javaHome=`expr "$javaHome" : '\(.*\)/bin'` 133 | JAVA_HOME="$javaHome" 134 | export JAVA_HOME 135 | fi 136 | fi 137 | fi 138 | 139 | if [ -z "$JAVACMD" ] ; then 140 | if [ -n "$JAVA_HOME" ] ; then 141 | if [ -x "$JAVA_HOME/jre/sh/java" ] ; then 142 | # IBM's JDK on AIX uses strange locations for the executables 143 | JAVACMD="$JAVA_HOME/jre/sh/java" 144 | else 145 | JAVACMD="$JAVA_HOME/bin/java" 146 | fi 147 | else 148 | JAVACMD="`which java`" 149 | fi 150 | fi 151 | 152 | if [ ! -x "$JAVACMD" ] ; then 153 | echo "Error: JAVA_HOME is not defined correctly." >&2 154 | echo " We cannot execute $JAVACMD" >&2 155 | exit 1 156 | fi 157 | 158 | if [ -z "$JAVA_HOME" ] ; then 159 | echo "Warning: JAVA_HOME environment variable is not set." 
160 | fi 161 | 162 | CLASSWORLDS_LAUNCHER=org.codehaus.plexus.classworlds.launcher.Launcher 163 | 164 | # traverses directory structure from process work directory to filesystem root 165 | # first directory with .mvn subdirectory is considered project base directory 166 | find_maven_basedir() { 167 | 168 | if [ -z "$1" ] 169 | then 170 | echo "Path not specified to find_maven_basedir" 171 | return 1 172 | fi 173 | 174 | basedir="$1" 175 | wdir="$1" 176 | while [ "$wdir" != '/' ] ; do 177 | if [ -d "$wdir"/.mvn ] ; then 178 | basedir=$wdir 179 | break 180 | fi 181 | # workaround for JBEAP-8937 (on Solaris 10/Sparc) 182 | if [ -d "${wdir}" ]; then 183 | wdir=`cd "$wdir/.."; pwd` 184 | fi 185 | # end of workaround 186 | done 187 | echo "${basedir}" 188 | } 189 | 190 | # concatenates all lines of a file 191 | concat_lines() { 192 | if [ -f "$1" ]; then 193 | echo "$(tr -s '\n' ' ' < "$1")" 194 | fi 195 | } 196 | 197 | BASE_DIR=`find_maven_basedir "$(pwd)"` 198 | if [ -z "$BASE_DIR" ]; then 199 | exit 1; 200 | fi 201 | 202 | ########################################################################################## 203 | # Extension to allow automatically downloading the maven-wrapper.jar from Maven-central 204 | # This allows using the maven wrapper in projects that prohibit checking in binary data. 205 | ########################################################################################## 206 | if [ -r "$BASE_DIR/.mvn/wrapper/maven-wrapper.jar" ]; then 207 | if [ "$MVNW_VERBOSE" = true ]; then 208 | echo "Found .mvn/wrapper/maven-wrapper.jar" 209 | fi 210 | else 211 | if [ "$MVNW_VERBOSE" = true ]; then 212 | echo "Couldn't find .mvn/wrapper/maven-wrapper.jar, downloading it ..." 
213 | fi 214 | if [ -n "$MVNW_REPOURL" ]; then 215 | jarUrl="$MVNW_REPOURL/io/takari/maven-wrapper/0.5.6/maven-wrapper-0.5.6.jar" 216 | else 217 | jarUrl="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.5.6/maven-wrapper-0.5.6.jar" 218 | fi 219 | while IFS="=" read key value; do 220 | case "$key" in (wrapperUrl) jarUrl="$value"; break ;; 221 | esac 222 | done < "$BASE_DIR/.mvn/wrapper/maven-wrapper.properties" 223 | if [ "$MVNW_VERBOSE" = true ]; then 224 | echo "Downloading from: $jarUrl" 225 | fi 226 | wrapperJarPath="$BASE_DIR/.mvn/wrapper/maven-wrapper.jar" 227 | if $cygwin; then 228 | wrapperJarPath=`cygpath --path --windows "$wrapperJarPath"` 229 | fi 230 | 231 | if command -v wget > /dev/null; then 232 | if [ "$MVNW_VERBOSE" = true ]; then 233 | echo "Found wget ... using wget" 234 | fi 235 | if [ -z "$MVNW_USERNAME" ] || [ -z "$MVNW_PASSWORD" ]; then 236 | wget "$jarUrl" -O "$wrapperJarPath" 237 | else 238 | wget --http-user=$MVNW_USERNAME --http-password=$MVNW_PASSWORD "$jarUrl" -O "$wrapperJarPath" 239 | fi 240 | elif command -v curl > /dev/null; then 241 | if [ "$MVNW_VERBOSE" = true ]; then 242 | echo "Found curl ... using curl" 243 | fi 244 | if [ -z "$MVNW_USERNAME" ] || [ -z "$MVNW_PASSWORD" ]; then 245 | curl -o "$wrapperJarPath" "$jarUrl" -f 246 | else 247 | curl --user $MVNW_USERNAME:$MVNW_PASSWORD -o "$wrapperJarPath" "$jarUrl" -f 248 | fi 249 | 250 | else 251 | if [ "$MVNW_VERBOSE" = true ]; then 252 | echo "Falling back to using Java to download" 253 | fi 254 | javaClass="$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.java" 255 | # For Cygwin, switch paths to Windows format before running javac 256 | if $cygwin; then 257 | javaClass=`cygpath --path --windows "$javaClass"` 258 | fi 259 | if [ -e "$javaClass" ]; then 260 | if [ ! -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then 261 | if [ "$MVNW_VERBOSE" = true ]; then 262 | echo " - Compiling MavenWrapperDownloader.java ..." 
263 | fi 264 | # Compiling the Java class 265 | ("$JAVA_HOME/bin/javac" "$javaClass") 266 | fi 267 | if [ -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then 268 | # Running the downloader 269 | if [ "$MVNW_VERBOSE" = true ]; then 270 | echo " - Running MavenWrapperDownloader.java ..." 271 | fi 272 | ("$JAVA_HOME/bin/java" -cp .mvn/wrapper MavenWrapperDownloader "$MAVEN_PROJECTBASEDIR") 273 | fi 274 | fi 275 | fi 276 | fi 277 | ########################################################################################## 278 | # End of extension 279 | ########################################################################################## 280 | 281 | export MAVEN_PROJECTBASEDIR=${MAVEN_BASEDIR:-"$BASE_DIR"} 282 | if [ "$MVNW_VERBOSE" = true ]; then 283 | echo $MAVEN_PROJECTBASEDIR 284 | fi 285 | MAVEN_OPTS="$(concat_lines "$MAVEN_PROJECTBASEDIR/.mvn/jvm.config") $MAVEN_OPTS" 286 | 287 | # For Cygwin, switch paths to Windows format before running java 288 | if $cygwin; then 289 | [ -n "$M2_HOME" ] && 290 | M2_HOME=`cygpath --path --windows "$M2_HOME"` 291 | [ -n "$JAVA_HOME" ] && 292 | JAVA_HOME=`cygpath --path --windows "$JAVA_HOME"` 293 | [ -n "$CLASSPATH" ] && 294 | CLASSPATH=`cygpath --path --windows "$CLASSPATH"` 295 | [ -n "$MAVEN_PROJECTBASEDIR" ] && 296 | MAVEN_PROJECTBASEDIR=`cygpath --path --windows "$MAVEN_PROJECTBASEDIR"` 297 | fi 298 | 299 | # Provide a "standardized" way to retrieve the CLI args that will 300 | # work with both Windows and non-Windows executions. 
301 | MAVEN_CMD_LINE_ARGS="$MAVEN_CONFIG $@" 302 | export MAVEN_CMD_LINE_ARGS 303 | 304 | WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain 305 | 306 | exec "$JAVACMD" \ 307 | $MAVEN_OPTS \ 308 | -classpath "$MAVEN_PROJECTBASEDIR/.mvn/wrapper/maven-wrapper.jar" \ 309 | "-Dmaven.home=${M2_HOME}" "-Dmaven.multiModuleProjectDirectory=${MAVEN_PROJECTBASEDIR}" \ 310 | ${WRAPPER_LAUNCHER} $MAVEN_CONFIG "$@" 311 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Spring Kafka beyond the basics - Poison Pill 2 | 3 | Example project used for my: 4 | 5 | * [Kafka Summit 2020 lightning talk](https://events.kafka-summit.org/2020-schedule#element-custom-block-2986770903) 6 | * [Confluent Online talks session](https://www.confluent.io/online-talks/spring-kafka-beyond-the-basics/) 7 | * [Confluent blog post - Spring for Apache Kafka – Beyond the Basics: Can Your Kafka Consumers Handle a Poison Pill?](https://www.confluent.io/blog/spring-kafka-can-your-kafka-consumers-handle-a-poison-pill/) 8 | 9 | Using: 10 | 11 | * Confluent Kafka 5.5 12 | * Confluent Schema Registry 5.5 13 | * Java 8 14 | * Spring Boot 2.3 15 | * Spring Kafka 2.5 16 | * Apache Avro 1.9.2 17 | 18 | ## Project modules and applications 19 | 20 | | Applications | Port | Avro | Topic(s) | Description | 21 | |-----------------------------------------------|------|-------|-------------------|--------------------------------------------------------------------------| 22 | | stock-quote-producer-avro | 8080 | YES | stock-quotes-avro | Simple producer of random stock quotes using Spring Kafka & Apache Avro. | 23 | | stock-quote-consumer-avro | 8082 | YES | stock-quotes-avro | Simple consumer of stock quotes using Spring Kafka & Apache Avro.
| 24 | | stock-quote-kafka-streams-avro | 8083 | YES | stock-quotes-avro | Simple Kafka Streams application using Spring Kafka & Apache Avro. | 25 | 26 | | Module | Description | 27 | |------------------------------------------|-------------------------------------------------------------------------| 28 | | stock-quote-avro-model | Holds the Avro schema for the Stock Quote, including the `avro-maven-plugin` to generate Java code from the Avro schema. This module is used by both the producer and consumer application. | 29 | 30 | Note: Confluent Schema Registry is running on port `8081` using Docker, see: [docker-compose.yml](docker-compose.yml). 31 | 32 | ![](project-overview.png) 33 | 34 | ## Goal 35 | 36 | The goal of this example project is to show how to protect your Kafka application against deserialization exceptions (a.k.a. poison pills) leveraging Spring Boot and Spring Kafka. 37 | 38 | This example project has 3 different branches: 39 | 40 | * `master` : no configuration to protect the consumer application (`stock-quote-consumer-avro`) against the poison pill scenario. 41 | * `handle-poison-pill-log-and-continue-consuming` : configuration to protect the consumer application (`stock-quote-consumer-avro`) against the poison pill scenario by simply logging the poison pill(s) and continuing to consume. 42 | * `handle-poison-pill-dead-letter-topic-and-continue-consuming` : configuration to protect the consumer application (`stock-quote-consumer-avro`) against the poison pill scenario by publishing the poison pill(s) to a dead letter topic `stock-quotes-avro.DLT` and continuing to consume.
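All three branches deal with the same failure mode: the consumer's `KafkaAvroDeserializer` expects every record to follow the Confluent wire format — byte 0 is the magic byte `0x0`, bytes 1–4 are the Schema Registry schema id, and the Avro payload follows. A record typed into the console producer starts with arbitrary UTF-8 bytes instead, which triggers the `Unknown magic byte!` exception. A minimal sketch of that first check (the class and helper method are illustrative, not the actual deserializer code):

```java
import java.nio.charset.StandardCharsets;

// Illustrative sketch of the first check the Confluent KafkaAvroDeserializer
// performs on every record; NOT the actual deserializer implementation.
public class MagicByteCheck {

  // Confluent wire format: [magic byte 0x0][4-byte schema id][Avro payload]
  private static final byte MAGIC_BYTE = 0x0;

  static boolean looksLikeConfluentAvro(byte[] payload) {
    // Needs at least the magic byte plus the 4-byte schema id,
    // and the very first byte must be 0x0.
    return payload != null && payload.length >= 5 && payload[0] == MAGIC_BYTE;
  }

  public static void main(String[] args) {
    // The poison pill published via the console producer: plain UTF-8 bytes.
    byte[] poisonPill = "\uD83D\uDC8A".getBytes(StandardCharsets.UTF_8); // 💊 = F0 9F 92 8A

    // First byte is 0xF0 (a UTF-8 lead byte), not 0x0 -> "Unknown magic byte!"
    System.out.println(looksLikeConfluentAvro(poisonPill)); // prints: false
  }
}
```

This is also why records from the Avro producer and records from a plain console producer are indistinguishable to Kafka itself (both are just bytes) but fatal to an unprotected Avro consumer.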
43 | 44 | ## Build the project 45 | 46 | ``` 47 | ./mvnw clean install 48 | ``` 49 | 50 | ## Start / Stop Kafka & Zookeeper 51 | 52 | ``` 53 | docker-compose up -d 54 | ``` 55 | 56 | ``` 57 | docker-compose down -v 58 | ``` 59 | 60 | ## Execute the Poison Pill scenario yourself 61 | 62 | First make sure all Docker containers started successfully (state: `Up`): 63 | 64 | ``` 65 | docker-compose ps 66 | ``` 67 | 68 | ### Start both producer and consumer 69 | 70 | * Spring Boot application: `stock-quote-producer-avro` 71 | 72 | ``` 73 | java -jar stock-quote-producer-avro/target/stock-quote-producer-avro-0.0.1-SNAPSHOT.jar 74 | ``` 75 | 76 | * Spring Boot application: `stock-quote-consumer-avro` 77 | 78 | ``` 79 | java -jar stock-quote-consumer-avro/target/stock-quote-consumer-avro-0.0.1-SNAPSHOT.jar 80 | ``` 81 | 82 | ### Attach to Kafka 83 | 84 | ``` 85 | docker exec -it kafka bash 86 | ``` 87 | 88 | ### Unset JMX Port 89 | 90 | To avoid the error below, first unset the JMX port inside the container: 91 | 92 | ``` 93 | unset JMX_PORT 94 | ``` 95 | 96 | ``` 97 | Error: JMX connector server communication error: service:jmx:rmi://kafka:9999 98 | sun.management.AgentConfigurationError: java.rmi.server.ExportException: Port already in use: 9999; nested exception is: 99 | java.net.BindException: Address already in use (Bind failed) 100 | at sun.management.jmxremote.ConnectorBootstrap.exportMBeanServer(ConnectorBootstrap.java:800) 101 | at sun.management.jmxremote.ConnectorBootstrap.startRemoteConnectorServer(ConnectorBootstrap.java:468) 102 | at sun.management.Agent.startAgent(Agent.java:262) 103 | at sun.management.Agent.startAgent(Agent.java:452) 104 | Caused by: java.rmi.server.ExportException: Port already in use: 9999; nested exception is: 105 | java.net.BindException: Address already in use (Bind failed) 106 | at sun.rmi.transport.tcp.TCPTransport.listen(TCPTransport.java:346) 107 | at sun.rmi.transport.tcp.TCPTransport.exportObject(TCPTransport.java:254) 108 | at
sun.rmi.transport.tcp.TCPEndpoint.exportObject(TCPEndpoint.java:411) 109 | at sun.rmi.transport.LiveRef.exportObject(LiveRef.java:147) 110 | at sun.rmi.server.UnicastServerRef.exportObject(UnicastServerRef.java:237) 111 | at sun.management.jmxremote.ConnectorBootstrap$PermanentExporter.exportObject(ConnectorBootstrap.java:199) 112 | at javax.management.remote.rmi.RMIJRMPServerImpl.export(RMIJRMPServerImpl.java:146) 113 | at javax.management.remote.rmi.RMIJRMPServerImpl.export(RMIJRMPServerImpl.java:122) 114 | at javax.management.remote.rmi.RMIConnectorServer.start(RMIConnectorServer.java:404) 115 | at sun.management.jmxremote.ConnectorBootstrap.exportMBeanServer(ConnectorBootstrap.java:796) 116 | ... 3 more 117 | Caused by: java.net.BindException: Address already in use (Bind failed) 118 | at java.net.PlainSocketImpl.socketBind(Native Method) 119 | at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387) 120 | at java.net.ServerSocket.bind(ServerSocket.java:375) 121 | at java.net.ServerSocket.(ServerSocket.java:237) 122 | at java.net.ServerSocket.(ServerSocket.java:128) 123 | at sun.rmi.transport.proxy.RMIDirectSocketFactory.createServerSocket(RMIDirectSocketFactory.java:45) 124 | at sun.rmi.transport.proxy.RMIMasterSocketFactory.createServerSocket(RMIMasterSocketFactory.java:345) 125 | at sun.rmi.transport.tcp.TCPEndpoint.newServerSocket(TCPEndpoint.java:666) 126 | at sun.rmi.transport.tcp.TCPTransport.listen(TCPTransport.java:335) 127 | ... 12 more 128 | ``` 129 | 130 | ### Start the Kafka console producer from the command line 131 | 132 | ``` 133 | ./usr/bin/kafka-console-producer --broker-list localhost:9092 --topic stock-quotes-avro 134 | ``` 135 | 136 | ### Publish the poison pill ;) 137 | 138 | The console is waiting for input. Now publish the poison pill: 139 | 140 | ``` 141 | 💊 142 | ``` 143 | 144 | ### Check the consumer application! 145 | 146 | The consumer application will try to deserialize the poison pill but fail. 
147 | By default the application will try again and again to deserialize the record, but it will never succeed. 148 | For every deserialization exception, a line like this is written to the log: 149 | 150 | ``` 151 | java.lang.IllegalStateException: This error handler cannot process 'SerializationException's directly; please consider configuring an 'ErrorHandlingDeserializer' in the value and/or key deserializer 152 | at org.springframework.kafka.listener.SeekUtils.seekOrRecover(SeekUtils.java:145) ~[spring-kafka-2.5.0.RELEASE.jar!/:2.5.0.RELEASE] 153 | at org.springframework.kafka.listener.SeekToCurrentErrorHandler.handle(SeekToCurrentErrorHandler.java:103) ~[spring-kafka-2.5.0.RELEASE.jar!/:2.5.0.RELEASE] 154 | at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.handleConsumerException(KafkaMessageListenerContainer.java:1241) ~[spring-kafka-2.5.0.RELEASE.jar!/:2.5.0.RELEASE] 155 | at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1002) ~[spring-kafka-2.5.0.RELEASE.jar!/:2.5.0.RELEASE] 156 | at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na] 157 | at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[na:na] 158 | at java.base/java.lang.Thread.run(Thread.java:834) ~[na:na] 159 | Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition stock-quotes-avro-1 at offset 69. If needed, please seek past the record to continue consumption. 160 | Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte! 161 | ``` 162 | 163 | Make sure to stop the consumer application! 164 | 165 | ## How to survive a poison pill scenario? 166 | 167 | Now it's time to protect the consumer application! 168 | Spring Kafka offers excellent support to protect your Kafka application(s) against the poison pill scenario.
169 | Depending on your use case you have different options. 170 | 171 | ### Log the poison pill(s) and continue consuming 172 | 173 | By configuring Spring Kafka's `org.springframework.kafka.support.serializer.ErrorHandlingDeserializer`. 174 | 175 | Branch with complete example: 176 | 177 | ``` 178 | git checkout handle-poison-pill-log-and-continue-consuming 179 | ``` 180 | 181 | Configuration of the consumer application (`application.yml`) to set up the `ErrorHandlingDeserializer`: 182 | 183 | ```yml 184 | spring: 185 | kafka: 186 | consumer: 187 | # Configures the Spring Kafka ErrorHandlingDeserializer that delegates to the 'real' deserializers 188 | key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer 189 | value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer 190 | properties: 191 | # Delegate deserializers 192 | spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer 193 | spring.deserializer.value.delegate.class: io.confluent.kafka.serializers.KafkaAvroDeserializer 194 | ``` 195 | 196 | ### Publish poison pill(s) to a dead letter topic and continue consuming 197 | 198 | By configuring Spring Kafka's `org.springframework.kafka.support.serializer.ErrorHandlingDeserializer` in combination 199 | with both the `org.springframework.kafka.listener.DeadLetterPublishingRecoverer` and `org.springframework.kafka.listener.SeekToCurrentErrorHandler`. 200 | 201 | Branch with complete example: 202 | 203 | ``` 204 | git checkout handle-poison-pill-dead-letter-topic-and-continue-consuming 205 | ``` 206 | 207 | ### Kafka Streams and Deserialization exceptions 208 | 209 | From the [Kafka Streams documentation](https://kafka.apache.org/10/documentation/streams/developer-guide/config-streams.html#default-deserialization-exception-handler): 210 | 211 | The default deserialization exception handler allows you to manage record exceptions that fail to deserialize.
This can be caused by corrupt data, incorrect serialization logic, or unhandled record types. These exception handlers are available:
212 | 
213 | * `LogAndContinueExceptionHandler`: This handler logs the deserialization exception and then signals the processing pipeline to continue processing more records. This log-and-skip strategy allows Kafka Streams to make progress instead of failing if there are records that fail to deserialize.
214 | 
215 | * `LogAndFailExceptionHandler`: This handler logs the deserialization exception and then signals the processing pipeline to stop processing more records.
216 | 
217 | The `LogAndFailExceptionHandler` is the default. When the Kafka Streams application hits the poison pill, it logs the deserialization exception and shuts down the stream thread:
218 | ```
219 | org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
220 | 
221 | 2020-08-15 22:29:43.745 ERROR 37514 --- [-StreamThread-1] o.a.k.s.p.internals.StreamThread : stream-thread [stock-quote-kafka-streams-stream-StreamThread-1] Encountered the following unexpected Kafka exception during processing, this usually indicate Streams internal errors:
222 | 
223 | org.apache.kafka.streams.errors.StreamsException: Deserialization exception handler is set to fail upon a deserialization error. If you would rather have the streaming pipeline continue after a deserialization error, please set the default.deserialization.exception.handler appropriately.
224 | at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:80) ~[kafka-streams-2.5.1.jar:na]
225 | at org.apache.kafka.streams.processor.internals.RecordQueue.updateHead(RecordQueue.java:175) ~[kafka-streams-2.5.1.jar:na]
226 | at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:112) ~[kafka-streams-2.5.1.jar:na]
227 | at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:162) ~[kafka-streams-2.5.1.jar:na]
228 | at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:765) ~[kafka-streams-2.5.1.jar:na]
229 | at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:943) ~[kafka-streams-2.5.1.jar:na]
230 | at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:764) ~[kafka-streams-2.5.1.jar:na]
231 | at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:697) ~[kafka-streams-2.5.1.jar:na]
232 | at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:670) ~[kafka-streams-2.5.1.jar:na]
233 | Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
234 | 
235 | 2020-08-15 22:29:43.745 INFO 37514 --- [-StreamThread-1] o.a.k.s.p.internals.StreamThread : stream-thread [stock-quote-kafka-streams-stream-StreamThread-1] State transition from RUNNING to PENDING_SHUTDOWN
236 | 2020-08-15 22:29:43.745 INFO 37514 --- [-StreamThread-1] o.a.k.s.p.internals.StreamThread : stream-thread [stock-quote-kafka-streams-stream-StreamThread-1] Shutting down
237 | ```
238 | To log the poison pill and continue processing instead, configure the `LogAndContinueExceptionHandler` (`application.yml`):
239 | ```yml
240 | spring:
241 |   kafka:
242 |     streams:
243 |       properties:
244 |         default.deserialization.exception.handler: org.apache.kafka.streams.errors.LogAndContinueExceptionHandler
245 | ```
246 | The poison pill is now logged and skipped, and the stream keeps running:
247 | ```
248 | org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
249 | 
250 | 2020-08-15 22:47:50.775 WARN 39647 --- [-StreamThread-1] o.a.k.s.p.internals.RecordDeserializer : stream-thread [stock-quote-kafka-streams-stream-StreamThread-1] task [0_0] Skipping record due to deserialization error. topic=[stock-quotes-avro] partition=[0] offset=[771]
251 | ```
252 | 
253 | Spring Kafka's `RecoveringDeserializationExceptionHandler`:
254 | 
255 | Spring Kafka provides the `org.springframework.kafka.streams.RecoveringDeserializationExceptionHandler` (an implementation of Kafka Streams' `DeserializationExceptionHandler`) to publish the record to a dead letter topic when a deserialization exception occurs.
256 | 
257 | The `RecoveringDeserializationExceptionHandler` is configured with a `ConsumerRecordRecoverer` implementation. Spring Kafka provides the `DeadLetterPublishingRecoverer`, which sends the failed record to a dead letter topic.
258 | 
259 | ```yml
260 | spring:
261 |   kafka:
262 |     streams:
263 |       properties:
264 |         default.deserialization.exception.handler: org.springframework.kafka.streams.RecoveringDeserializationExceptionHandler
265 | ```
--------------------------------------------------------------------------------
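Besides setting the handler class, the `RecoveringDeserializationExceptionHandler` needs to be pointed at a recoverer bean. A minimal configuration sketch, close to the example in the Spring Kafka reference documentation — the dead letter topic name, application id, and bootstrap servers here are illustrative, and a `KafkaTemplate` for raw `byte[]` records is assumed to be available (with Spring Boot, parts of this are auto-configured from `application.yml`):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.streams.StreamsConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafkaStreams;
import org.springframework.kafka.config.KafkaStreamsConfiguration;
import org.springframework.kafka.config.KafkaStreamsDefaultConfiguration;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
import org.springframework.kafka.streams.RecoveringDeserializationExceptionHandler;

@Configuration
@EnableKafkaStreams
public class KafkaStreamsDeadLetterConfiguration {

  @Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
  public KafkaStreamsConfiguration kStreamsConfig(KafkaTemplate<byte[], byte[]> template) {
    Map<String, Object> props = new HashMap<>();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "stock-quote-kafka-streams");
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    // Use the recovering handler instead of LogAndFail/LogAndContinue
    props.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG,
        RecoveringDeserializationExceptionHandler.class);
    // Point the handler at the recoverer defined below
    props.put(RecoveringDeserializationExceptionHandler.KSTREAM_DESERIALIZATION_RECOVERER,
        recoverer(template));
    return new KafkaStreamsConfiguration(props);
  }

  @Bean
  public DeadLetterPublishingRecoverer recoverer(KafkaTemplate<byte[], byte[]> template) {
    // Publish the raw poison pill to a dead letter topic;
    // partition -1 lets the producer pick the partition
    return new DeadLetterPublishingRecoverer(template,
        (record, exception) -> new TopicPartition("stock-quotes-avro.DLT", -1));
  }
}
```

A `byte[]` template is used because at this point the record could not be deserialized, so only its raw bytes can be forwarded to the dead letter topic.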