├── .github └── workflows │ └── clojure.yml ├── .gitignore ├── README.md ├── docker-compose.yml ├── kafka-producer-consumer-example ├── .gitignore ├── .hgignore ├── Dockerfile ├── LICENSE ├── README.md ├── bin │ ├── start-example.sh │ └── start.sh ├── project.clj ├── src │ ├── kafka_example │ │ └── core.clj │ └── log4j.properties └── test │ ├── kafka_example │ ├── core_test.clj~ │ └── integration │ │ └── example_integration_test.clj │ └── log4j.properties └── kafka-streams-example ├── .gitignore ├── CHANGELOG.md ├── LICENSE ├── README.md ├── project.clj ├── src ├── kafka_streams_example │ ├── core.clj │ ├── kstream_aggregate.clj │ ├── kstream_avro_join.clj │ ├── kstream_kstream_inner_join_example.clj │ ├── kstream_kstream_left_join_example.clj │ ├── kstream_kstream_outer_join_example.clj │ ├── kstream_reduce.clj │ ├── ktable_example.clj │ ├── processor_api_example.clj │ ├── processor_api_rekey.clj │ ├── serdes │ │ └── json_serdes.clj │ └── utils.clj └── log4j.properties └── test └── kafka_streams_example ├── core_test.clj ├── kstream_aggregate_test.clj ├── kstream_avro_join_test.clj ├── kstream_kstream_inner_join_example_test.clj ├── kstream_kstream_left_join_example_test.clj ├── kstream_kstream_outer_join_example_test.clj ├── kstream_reduce_test.clj ├── ktable_example_test.clj ├── processor_api_example_test.clj ├── processor_api_rekey_test.clj └── test_support.clj /.github/workflows/clojure.yml: -------------------------------------------------------------------------------- 1 | name: Clojure CI 2 | 3 | on: 4 | push: 5 | branches: [ main ] 6 | pull_request: 7 | branches: [ main ] 8 | 9 | jobs: 10 | build: 11 | 12 | runs-on: ubuntu-latest 13 | 14 | steps: 15 | - uses: actions/checkout@v2 16 | - name: Install dependencies and run tests 17 | working-directory: ./kafka-producer-consumer-example 18 | run: lein deps && lein test 19 | - name: Install dependencies and run tests 20 | working-directory: ./kafka-streams-example 21 | run: lein deps && lein test 22 | 
-------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | *~ 2 | kafka-streams-example/pom.xml 3 | kafka-producer-consumer-example/pom.xml 4 | *.jar 5 | *.class 6 | /lib/ 7 | /classes/ 8 | target 9 | /checkouts/ 10 | .lein-deps-sum 11 | .lein-repl-history 12 | .lein-plugins/ 13 | .lein-failures 14 | .nrepl-port 15 | *.iml 16 | /.idea 17 | .lsp/ 18 | .clj-kondo 19 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ## About 2 | 3 | A selection of examples using Kafka and Kafka Streams with Clojure. I wanted to learn the real Kafka API from Clojure rather than through a Clojure wrapper, so these are examples of using the raw API. This keeps the code clean and means you do not need to wait for Clojure wrapper libraries to catch up with each Kafka upgrade. 4 | 5 | A guide to the examples can be found in this post [here](https://perkss.github.io/#/clojure/KafkaClojure#text-body). 6 | 7 | ## Prerequisites 8 | You are expected to be able to set up Zookeeper and a Kafka broker to run the examples, or to use the Confluent Platform. 9 | The Kafka Streams examples all have relevant tests rather than requiring you to run a jar. Please check each module's README for how to run it. 10 | 11 | ## Unit Test 12 | Check out the unit tests for each file to see it in action without needing to run Zookeeper and Kafka. 13 | 14 | ## Examples 15 | * Kafka Producer and Consumer in Clojure using Java interop. 16 | * Search a topic and return records that match a key from the beginning to the current point in time. 17 | * Kafka Streams example that upper-cases strings. 18 | * Kafka Streams testing example. 19 | * Kafka Streams join of two streams (Left Join, Inner Join, Outer Join). 20 | * Kafka Streams join with a KTable. 21 | * Avro Kafka deserialization and serialization.
22 | * KStreamGroup Aggregate 23 | * TopologyTestDriver 24 | * More examples to come regularly, watch this space ... 25 | 26 | ## Integration Testcontainers 27 | Testcontainers provides the ability to test applications that have Docker interactions simply by starting up the containers 28 | you need, in our examples Kafka. We can then start our applications up against this container and run the tests in a fully 29 | integrated way with Kafka. Awesome! 30 | 31 | ## Kafka and Zookeeper 32 | 33 | We have provided our own docker-compose file that will start Kafka and Zookeeper exposed via localhost, so other containers or local apps can access the broker. 34 | 35 | Start this using the command `docker-compose up -d` 36 | 37 | ```yaml 38 | version: '3' 39 | services: 40 | zookeeper: 41 | image: confluentinc/cp-zookeeper:5.4.1 42 | hostname: zookeeper 43 | container_name: zookeeper 44 | ports: 45 | - "2181:2181" 46 | environment: 47 | ZOOKEEPER_CLIENT_PORT: 2181 48 | ZOOKEEPER_TICK_TIME: 2000 49 | broker: 50 | image: confluentinc/cp-enterprise-kafka:5.4.1 51 | hostname: broker 52 | container_name: broker 53 | depends_on: 54 | - zookeeper 55 | ports: 56 | - "29092:29092" 57 | - "9092:9092" 58 | environment: 59 | KAFKA_BROKER_ID: 1 60 | KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181' 61 | KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT 62 | KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092 63 | KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1 64 | KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0 65 | ``` 66 | 67 | ## Contributions 68 | 69 | * Please get involved and get recognition. If you want to add more examples, raise a PR, or raise issues with ideas for examples that would be beneficial.
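To sanity-check the listener set-up above from a REPL, the sketch below sends a single record. This is a hedged sketch, not part of the repository: it assumes `kafka-clients` is on the classpath and the docker-compose stack is running, and `quick-send!` is a name invented here. A client on the host machine uses the advertised `PLAINTEXT_HOST` listener (`localhost:9092`), while another container on the compose network would use `broker:29092`.

```clojure
(ns listener-check
  (:import (org.apache.kafka.clients.producer KafkaProducer ProducerRecord)
           (org.apache.kafka.common.serialization StringSerializer)))

(defn quick-send!
  "Send one record to verify the broker is reachable on the given listener."
  [bootstrap-server topic k v]
  (with-open [producer (KafkaProducer. {"bootstrap.servers" bootstrap-server
                                        "key.serializer"    StringSerializer
                                        "value.serializer"  StringSerializer})]
    ;; deref the returned Future to block until the send is acknowledged
    @(.send producer (ProducerRecord. topic k v))))

(comment
  ;; from the host machine, against the PLAINTEXT_HOST listener
  (quick-send! "localhost:9092" "example-topic" "1" "hello"))
```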
70 | -------------------------------------------------------------------------------- /docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: '3' 2 | services: 3 | zookeeper: 4 | image: confluentinc/cp-zookeeper:latest 5 | hostname: zookeeper 6 | container_name: zookeeper 7 | ports: 8 | - "2181:2181" 9 | environment: 10 | ZOOKEEPER_CLIENT_PORT: 2181 11 | ZOOKEEPER_TICK_TIME: 2000 12 | broker: 13 | image: confluentinc/cp-enterprise-kafka:latest 14 | hostname: broker 15 | container_name: broker 16 | depends_on: 17 | - zookeeper 18 | ports: 19 | - "29092:29092" 20 | - "9092:9092" 21 | environment: 22 | KAFKA_BROKER_ID: 1 23 | KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181' 24 | KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT 25 | KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092 26 | KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1 27 | KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0 -------------------------------------------------------------------------------- /kafka-producer-consumer-example/.gitignore: -------------------------------------------------------------------------------- 1 | pom.xml 2 | pom.xml.asc 3 | *jar 4 | /lib/ 5 | /classes/ 6 | /target/ 7 | /checkouts/ 8 | .lein-deps-sum 9 | .lein-repl-history 10 | .lein-plugins/ 11 | .lein-failures 12 | .nrepl-port 13 | .DS_Store 14 | .idea/ 15 | *.iml 16 | -------------------------------------------------------------------------------- /kafka-producer-consumer-example/.hgignore: -------------------------------------------------------------------------------- 1 | syntax: glob 2 | pom.xml 3 | pom.xml.asc 4 | *.jar 5 | *.class 6 | .gitignore 7 | .git/** 8 | 9 | syntax: regexp 10 | ^.nrepl-port 11 | ^.lein-.* 12 | ^target/ 13 | ^classes/ 14 | ^checkouts/ 15 | -------------------------------------------------------------------------------- /kafka-producer-consumer-example/Dockerfile: 
-------------------------------------------------------------------------------- 1 | FROM openjdk:8-jre-alpine 2 | # COPY the build jar from the target folder 3 | COPY target/uberjar/kafka-example*-standalone.jar kafka-example.jar 4 | CMD ["/usr/bin/java", "-jar", "/kafka-example.jar"] -------------------------------------------------------------------------------- /kafka-producer-consumer-example/LICENSE: -------------------------------------------------------------------------------- 1 | THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE PUBLIC 2 | LICENSE ("AGREEMENT"). ANY USE, REPRODUCTION OR DISTRIBUTION OF THE PROGRAM 3 | CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT. 4 | 5 | 1. DEFINITIONS 6 | 7 | "Contribution" means: 8 | 9 | a) in the case of the initial Contributor, the initial code and 10 | documentation distributed under this Agreement, and 11 | 12 | b) in the case of each subsequent Contributor: 13 | 14 | i) changes to the Program, and 15 | 16 | ii) additions to the Program; 17 | 18 | where such changes and/or additions to the Program originate from and are 19 | distributed by that particular Contributor. A Contribution 'originates' from 20 | a Contributor if it was added to the Program by such Contributor itself or 21 | anyone acting on such Contributor's behalf. Contributions do not include 22 | additions to the Program which: (i) are separate modules of software 23 | distributed in conjunction with the Program under their own license 24 | agreement, and (ii) are not derivative works of the Program. 25 | 26 | "Contributor" means any person or entity that distributes the Program. 27 | 28 | "Licensed Patents" mean patent claims licensable by a Contributor which are 29 | necessarily infringed by the use or sale of its Contribution alone or when 30 | combined with the Program. 31 | 32 | "Program" means the Contributions distributed in accordance with this 33 | Agreement. 
34 | 35 | "Recipient" means anyone who receives the Program under this Agreement, 36 | including all Contributors. 37 | 38 | 2. GRANT OF RIGHTS 39 | 40 | a) Subject to the terms of this Agreement, each Contributor hereby grants 41 | Recipient a non-exclusive, worldwide, royalty-free copyright license to 42 | reproduce, prepare derivative works of, publicly display, publicly perform, 43 | distribute and sublicense the Contribution of such Contributor, if any, and 44 | such derivative works, in source code and object code form. 45 | 46 | b) Subject to the terms of this Agreement, each Contributor hereby grants 47 | Recipient a non-exclusive, worldwide, royalty-free patent license under 48 | Licensed Patents to make, use, sell, offer to sell, import and otherwise 49 | transfer the Contribution of such Contributor, if any, in source code and 50 | object code form. This patent license shall apply to the combination of the 51 | Contribution and the Program if, at the time the Contribution is added by the 52 | Contributor, such addition of the Contribution causes such combination to be 53 | covered by the Licensed Patents. The patent license shall not apply to any 54 | other combinations which include the Contribution. No hardware per se is 55 | licensed hereunder. 56 | 57 | c) Recipient understands that although each Contributor grants the licenses 58 | to its Contributions set forth herein, no assurances are provided by any 59 | Contributor that the Program does not infringe the patent or other 60 | intellectual property rights of any other entity. Each Contributor disclaims 61 | any liability to Recipient for claims brought by any other entity based on 62 | infringement of intellectual property rights or otherwise. As a condition to 63 | exercising the rights and licenses granted hereunder, each Recipient hereby 64 | assumes sole responsibility to secure any other intellectual property rights 65 | needed, if any. 
For example, if a third party patent license is required to 66 | allow Recipient to distribute the Program, it is Recipient's responsibility 67 | to acquire that license before distributing the Program. 68 | 69 | d) Each Contributor represents that to its knowledge it has sufficient 70 | copyright rights in its Contribution, if any, to grant the copyright license 71 | set forth in this Agreement. 72 | 73 | 3. REQUIREMENTS 74 | 75 | A Contributor may choose to distribute the Program in object code form under 76 | its own license agreement, provided that: 77 | 78 | a) it complies with the terms and conditions of this Agreement; and 79 | 80 | b) its license agreement: 81 | 82 | i) effectively disclaims on behalf of all Contributors all warranties and 83 | conditions, express and implied, including warranties or conditions of title 84 | and non-infringement, and implied warranties or conditions of merchantability 85 | and fitness for a particular purpose; 86 | 87 | ii) effectively excludes on behalf of all Contributors all liability for 88 | damages, including direct, indirect, special, incidental and consequential 89 | damages, such as lost profits; 90 | 91 | iii) states that any provisions which differ from this Agreement are offered 92 | by that Contributor alone and not by any other party; and 93 | 94 | iv) states that source code for the Program is available from such 95 | Contributor, and informs licensees how to obtain it in a reasonable manner on 96 | or through a medium customarily used for software exchange. 97 | 98 | When the Program is made available in source code form: 99 | 100 | a) it must be made available under this Agreement; and 101 | 102 | b) a copy of this Agreement must be included with each copy of the Program. 103 | 104 | Contributors may not remove or alter any copyright notices contained within 105 | the Program. 
106 | 107 | Each Contributor must identify itself as the originator of its Contribution, 108 | if any, in a manner that reasonably allows subsequent Recipients to identify 109 | the originator of the Contribution. 110 | 111 | 4. COMMERCIAL DISTRIBUTION 112 | 113 | Commercial distributors of software may accept certain responsibilities with 114 | respect to end users, business partners and the like. While this license is 115 | intended to facilitate the commercial use of the Program, the Contributor who 116 | includes the Program in a commercial product offering should do so in a 117 | manner which does not create potential liability for other Contributors. 118 | Therefore, if a Contributor includes the Program in a commercial product 119 | offering, such Contributor ("Commercial Contributor") hereby agrees to defend 120 | and indemnify every other Contributor ("Indemnified Contributor") against any 121 | losses, damages and costs (collectively "Losses") arising from claims, 122 | lawsuits and other legal actions brought by a third party against the 123 | Indemnified Contributor to the extent caused by the acts or omissions of such 124 | Commercial Contributor in connection with its distribution of the Program in 125 | a commercial product offering. The obligations in this section do not apply 126 | to any claims or Losses relating to any actual or alleged intellectual 127 | property infringement. In order to qualify, an Indemnified Contributor must: 128 | a) promptly notify the Commercial Contributor in writing of such claim, and 129 | b) allow the Commercial Contributor to control, and cooperate with the 130 | Commercial Contributor in, the defense and any related settlement 131 | negotiations. The Indemnified Contributor may participate in any such claim 132 | at its own expense. 133 | 134 | For example, a Contributor might include the Program in a commercial product 135 | offering, Product X. That Contributor is then a Commercial Contributor. 
If 136 | that Commercial Contributor then makes performance claims, or offers 137 | warranties related to Product X, those performance claims and warranties are 138 | such Commercial Contributor's responsibility alone. Under this section, the 139 | Commercial Contributor would have to defend claims against the other 140 | Contributors related to those performance claims and warranties, and if a 141 | court requires any other Contributor to pay any damages as a result, the 142 | Commercial Contributor must pay those damages. 143 | 144 | 5. NO WARRANTY 145 | 146 | EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, THE PROGRAM IS PROVIDED ON 147 | AN "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER 148 | EXPRESS OR IMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR 149 | CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A 150 | PARTICULAR PURPOSE. Each Recipient is solely responsible for determining the 151 | appropriateness of using and distributing the Program and assumes all risks 152 | associated with its exercise of rights under this Agreement , including but 153 | not limited to the risks and costs of program errors, compliance with 154 | applicable laws, damage to or loss of data, programs or equipment, and 155 | unavailability or interruption of operations. 156 | 157 | 6. DISCLAIMER OF LIABILITY 158 | 159 | EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, NEITHER RECIPIENT NOR ANY 160 | CONTRIBUTORS SHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, 161 | SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION 162 | LOST PROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN 163 | CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) 164 | ARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE 165 | EXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE POSSIBILITY 166 | OF SUCH DAMAGES. 167 | 168 | 7. 
GENERAL 169 | 170 | If any provision of this Agreement is invalid or unenforceable under 171 | applicable law, it shall not affect the validity or enforceability of the 172 | remainder of the terms of this Agreement, and without further action by the 173 | parties hereto, such provision shall be reformed to the minimum extent 174 | necessary to make such provision valid and enforceable. 175 | 176 | If Recipient institutes patent litigation against any entity (including a 177 | cross-claim or counterclaim in a lawsuit) alleging that the Program itself 178 | (excluding combinations of the Program with other software or hardware) 179 | infringes such Recipient's patent(s), then such Recipient's rights granted 180 | under Section 2(b) shall terminate as of the date such litigation is filed. 181 | 182 | All Recipient's rights under this Agreement shall terminate if it fails to 183 | comply with any of the material terms or conditions of this Agreement and 184 | does not cure such failure in a reasonable period of time after becoming 185 | aware of such noncompliance. If all Recipient's rights under this Agreement 186 | terminate, Recipient agrees to cease use and distribution of the Program as 187 | soon as reasonably practicable. However, Recipient's obligations under this 188 | Agreement and any licenses granted by Recipient relating to the Program shall 189 | continue and survive. 190 | 191 | Everyone is permitted to copy and distribute copies of this Agreement, but in 192 | order to avoid inconsistency the Agreement is copyrighted and may only be 193 | modified in the following manner. The Agreement Steward reserves the right to 194 | publish new versions (including revisions) of this Agreement from time to 195 | time. No one other than the Agreement Steward has the right to modify this 196 | Agreement. The Eclipse Foundation is the initial Agreement Steward. 
The 197 | Eclipse Foundation may assign the responsibility to serve as the Agreement 198 | Steward to a suitable separate entity. Each new version of the Agreement will 199 | be given a distinguishing version number. The Program (including 200 | Contributions) may always be distributed subject to the version of the 201 | Agreement under which it was received. In addition, after a new version of 202 | the Agreement is published, Contributor may elect to distribute the Program 203 | (including its Contributions) under the new version. Except as expressly 204 | stated in Sections 2(a) and 2(b) above, Recipient receives no rights or 205 | licenses to the intellectual property of any Contributor under this 206 | Agreement, whether expressly, by implication, estoppel or otherwise. All 207 | rights in the Program not expressly granted under this Agreement are 208 | reserved. 209 | 210 | This Agreement is governed by the laws of the State of New York and the 211 | intellectual property laws of the United States of America. No party to this 212 | Agreement will bring a legal action under this Agreement more than one year 213 | after the cause of action arose. Each party waives its rights to a jury trial 214 | in any resulting litigation. 215 | -------------------------------------------------------------------------------- /kafka-producer-consumer-example/README.md: -------------------------------------------------------------------------------- 1 | # kafka-example 2 | 3 | An example showing how to use a Kafka producer and consumer from Clojure via Java interop. This simple example consumes from a topic, logs the input and then sends the value on to another topic for the user to see. You can run it as a plain jar with the Kafka broker and Zookeeper running, or you can use the Docker set-up we have provided and follow the blog post https://perkss.github.io/#/DevOps! Up to you.
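The consume, log, and send-on flow described above reduces to a small polling loop. The sketch below is a trimmed illustration of that pattern, not the full application (see `src/kafka_example/core.clj` for the real code); `pump-once` is a name invented here, and a constructed consumer and producer are assumed.

```clojure
(import '(org.apache.kafka.clients.producer ProducerRecord)
        '(java.time Duration))

(defn pump-once
  "Poll the consumer once, log each value, and forward it to the output topic."
  [consumer producer output-topic]
  (doseq [record (.poll consumer (Duration/ofMillis 100))]
    (println "Sending on value" (str "Processed Value: " (.value record)))
    (.send producer (ProducerRecord. output-topic (.key record)
                                     (str "Processed Value: " (.value record))))))
```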
4 | 5 | ## Integration Test 6 | To get started simply have Docker running and run `lein test`. This will run the integration test, which starts up Kafka, 7 | starts our application, drops a test message on the input topic and consumes it from the 8 | output topic after it has been processed by our application. 9 | 10 | ## Installation 11 | 12 | Requires Zookeeper and Kafka to be set up. Please check the main README to run the bundled `docker-compose` file, and check that the ports match the code. 13 | Check out the project, view the start script and follow those commands. 14 | 15 | Alternatively, you can install the Confluent Platform Docker version and start it, or install Kafka locally at the same locations as `/usr/local/bin/kafka/bin/` and `/usr/local/zookeeper/bin/` and then run the start.sh script from the bin directory. 16 | 17 | ## Usage 18 | 19 | In the project directory run: 20 | 21 | $ lein uberjar 22 | 23 | $ java -jar target/uberjar/kafka-example-0.1.0-SNAPSHOT-standalone.jar 24 | 25 | This starts the project and you should see it log out that it has started. 26 | 27 | If you simply want to run the code just do: 28 | 29 | $ lein run 30 | 31 | You then need to set up the Kafka topics, producer and consumer as specified in the start.sh script. Once set up, you can produce to the example topic, for example the message Hello 32 | 33 | With the app running it will log out: 34 | 35 | INFO kafka-example.core: Sending on value Value: Hello 36 | 37 | Then a consumer on the example-produced-topic will log out Value: Hello 38 | 39 | ## Running with Docker 40 | 41 | Docker makes the above very simple. 42 | 43 | To start the required Kafka and Zookeeper, run the following from the parent directory; this starts a separate Docker network with these two containers, exposing them on localhost: 44 | 45 | $ docker-compose up -d 46 | 47 | To build the example docker image and run it follow: 48 | 49 | $ docker build -t producer-consumer-example .
50 | 51 | Please note we use localhost in the bootstrap server and the zookeeper host, so we are required to pass --network="host" to make this container point to the localhost of our machine where we have exposed our Kafka and Zookeeper instances. 52 | 53 | $ docker run --network="host" -i -t producer-consumer-example 54 | 55 | Then we need to create the required topics. Note that we reference the docker-compose names for the broker and zookeeper, rather than localhost as when running on bare metal. 56 | 57 | $ docker exec broker kafka-topics --create --zookeeper zookeeper:2181 --replication-factor 1 --partitions 1 --topic example-consumer-topic 58 | $ docker exec broker kafka-topics --create --zookeeper zookeeper:2181 --replication-factor 1 --partitions 1 --topic example-produced-topic 59 | 60 | Now let's check they have been created by listing them, if running from the docker-compose file location: 61 | 62 | $ docker exec broker kafka-topics --zookeeper zookeeper:2181 --list 63 | 64 | Or pass localhost and run from the local machine: 65 | 66 | $ /usr/local/bin/kafka/bin/kafka-topics.sh --list --zookeeper localhost:2181 67 | 68 | Now create the consumer to listen to the messages you will eventually produce, which will be processed by the application.
69 | 70 | $ docker exec broker kafka-console-consumer --bootstrap-server broker:9092 --topic example-produced-topic --property print.key=true --from-beginning 71 | 72 | Or from the local machine instance: 73 | 74 | $ /usr/local/bin/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic example-produced-topic --from-beginning 75 | 76 | Now create the producer to send messages that will be processed: 77 | 78 | $ docker exec -it broker kafka-console-producer --broker-list broker:9092 --topic example-consumer-topic 79 | 80 | Or from the local machine: 81 | 82 | $ /usr/local/bin/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic example-consumer-topic 83 | 84 | 85 | ## Example 86 | 87 | Have fun with the example; it is purposely kept very simple to show the Java interop API of the Kafka clients in Clojure. 88 | 89 | 90 | ## Security TLS 91 | 92 | A great tutorial on TLS and Kafka is available from [Confluent](https://docs.confluent.io/current/security/security_tutorial.html#generating-keys-certs). 93 | 94 | 95 | -------------------------------------------------------------------------------- /kafka-producer-consumer-example/bin/start-example.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | echo "Running Kafka Producer Consumer. You must have the confluent platform on your 3 | path, installed from: https://docs.confluent.io/current/quickstart/index.html." 4 | 5 | ### Note: WE ADVISE USING DOCKER-COMPOSE.
Use this if you want to use the confluent install 6 | 7 | export BOOTSTRAP_SERVER=localhost:9092 8 | export ZOOKEEPER_HOSTS=localhost:2181 9 | 10 | # Expects confluent platform to be on the path 11 | confluent start 12 | 13 | # Run the app straight with lein 14 | #(cd ../ 15 | # lein run 16 | #) 17 | 18 | # Make the uberjar 19 | (cd ../ 20 | lein uberjar 21 | ) 22 | 23 | # Run the jar to start the app 24 | (cd ../ 25 | java -jar target/uberjar/kafka-example-0.1.0-SNAPSHOT-standalone.jar 26 | ) 27 | 28 | 29 | 30 | 31 | # Path needs to match for the Kafka console producer; run this and produce data 32 | #/usr/local/bin/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic example-consumer-topic 33 | -------------------------------------------------------------------------------- /kafka-producer-consumer-example/bin/start.sh: -------------------------------------------------------------------------------- 1 | #### Note: WE ADVISE USING DOCKER-COMPOSE. Use this if you want to start Kafka and Zookeeper on bare metal.
2 | 3 | # Start Zookeeper 4 | /usr/local/zookeeper/bin/zkServer.sh start 5 | # Start Kafka 6 | /usr/local/bin/kafka/bin/kafka-server-start.sh -daemon /usr/local/kafka/config/server.properties 7 | # Create the topic that the application consumes from 8 | /usr/local/bin/kafka/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic example-consumer-topic 9 | # Create the topic that the example produces to 10 | /usr/local/bin/kafka/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic example-produced-topic 11 | # List the created topics 12 | /usr/local/bin/kafka/bin/kafka-topics.sh --list --zookeeper localhost:2181 13 | # Produce messages for the app to consume 14 | /usr/local/bin/kafka/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic example-consumer-topic 15 | # Consume the output messages 16 | /usr/local/bin/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic example-produced-topic --from-beginning 17 | -------------------------------------------------------------------------------- /kafka-producer-consumer-example/project.clj: -------------------------------------------------------------------------------- 1 | (defproject kafka-example "0.1.0-SNAPSHOT" 2 | :description "Example Kafka Producer and Consumer using Plain Java Interop" 3 | :url "https://perkss.github.io/#/clojure/KafkaClojure#text-body" 4 | :dependencies [[environ "1.1.0"] 5 | [org.clojure/clojure "1.11.1"] 6 | [org.apache.kafka/kafka-clients "3.8.0"] 7 | [org.apache.kafka/kafka_2.12 "3.8.0"] 8 | [org.clojure/tools.logging "1.2.4"] 9 | [org.slf4j/slf4j-log4j12 "2.0.10"] 10 | [org.apache.logging.log4j/log4j-core "2.22.1"] 11 | [org.testcontainers/testcontainers "1.19.3"] 12 | [org.testcontainers/kafka "1.19.3"]] 13 | :main ^:skip-aot kafka-example.core 14 | :target-path "target/%s" 15 | :plugins [[lein-cljfmt "0.6.0"]] 16 | :profiles {:uberjar {:aot
:all}}) 17 | -------------------------------------------------------------------------------- /kafka-producer-consumer-example/src/kafka_example/core.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-example.core 2 | (:gen-class) 3 | (:require [clojure.tools.logging :as log] 4 | [environ.core :refer [env]]) 5 | (:import [org.apache.kafka.clients.admin AdminClientConfig NewTopic KafkaAdminClient] 6 | org.apache.kafka.clients.consumer.KafkaConsumer 7 | [org.apache.kafka.clients.producer KafkaProducer ProducerRecord] 8 | [org.apache.kafka.common.serialization StringDeserializer StringSerializer] 9 | (org.apache.kafka.common TopicPartition) 10 | (java.time Duration))) 11 | 12 | (defn create-topics! 13 | "Create the given topics" 14 | [bootstrap-server topics ^Integer partitions ^Short replication] 15 | (let [config {AdminClientConfig/BOOTSTRAP_SERVERS_CONFIG bootstrap-server} 16 | adminClient (KafkaAdminClient/create config) 17 | new-topics (map (fn [^String topic-name] (NewTopic. topic-name partitions replication)) topics)] 18 | (.createTopics adminClient new-topics))) 19 | 20 | (defn- pending-messages 21 | [end-offsets consumer] 22 | (some true? (doall 23 | (map 24 | (fn [topic-partition] 25 | (let [position (.position consumer (key topic-partition)) 26 | value (val topic-partition)] 27 | (< position value))) 28 | end-offsets)))) 29 | 30 | (defn search-topic-by-key 31 | "Searches through a Kafka topic and returns the records matching the given key" 32 | [^KafkaConsumer consumer topic search-key] 33 | (let [topic-partitions (->> (.partitionsFor consumer topic) 34 | (map #(TopicPartition. (.topic %) (.partition %)))) 35 | _ (.assign consumer topic-partitions) 36 | _ (.seekToBeginning consumer (.assignment consumer)) 37 | end-offsets (.endOffsets consumer (.assignment consumer)) 38 | found-records (transient [])] 39 | (log/infof "end offsets %s" end-offsets) 40 | (log/infof "Pending messages?
%s" (pending-messages end-offsets consumer)) 41 | (while (pending-messages end-offsets consumer) 42 | (log/infof "Pending messages? %s" (pending-messages end-offsets consumer)) 43 | (let [records (.poll consumer (Duration/ofMillis 50)) 44 | matched-search-key (filter #(= (.key %) search-key) records)] 45 | (conj! found-records matched-search-key) 46 | (doseq [record matched-search-key] 47 | (log/infof "Found Matching Key %s Value %s" (.key record) (.value record))))) 48 | (persistent! found-records))) 49 | 50 | (defn build-consumer 51 | "Create the consumer instance to consume 52 | from the provided kafka topic name" 53 | [bootstrap-server] 54 | (let [consumer-props 55 | {"bootstrap.servers", bootstrap-server 56 | "group.id", "example" 57 | "key.deserializer", StringDeserializer 58 | "value.deserializer", StringDeserializer 59 | "auto.offset.reset", "earliest" 60 | "enable.auto.commit", "true"}] 61 | (KafkaConsumer. consumer-props))) 62 | 63 | (defn consumer-subscribe 64 | [consumer topic] 65 | (.subscribe consumer [topic])) 66 | 67 | (defn ^KafkaProducer build-producer 68 | "Create the kafka producer to send on messages received" 69 | [bootstrap-server] 70 | (let [producer-props {"value.serializer" StringSerializer 71 | "key.serializer" StringSerializer 72 | "bootstrap.servers" bootstrap-server}] 73 | (KafkaProducer. producer-props))) 74 | 75 | (defn run-application 76 | "Create the simple read and write topology with Kafka" 77 | [bootstrap-server] 78 | (let [consumer-topic "example-consumer-topic" 79 | producer-topic "example-produced-topic" 80 | bootstrap-server (env :bootstrap-server bootstrap-server) 81 | replay-consumer (build-consumer bootstrap-server) 82 | consumer (build-consumer bootstrap-server) 83 | producer (build-producer bootstrap-server)] 84 | (log/infof "Creating the topics %s" [producer-topic consumer-topic]) 85 | (create-topics! bootstrap-server [producer-topic consumer-topic] 1 1) 86 | (log/infof "Starting the kafka example app.
With topic consuming topic %s and producing to %s" 87 | consumer-topic producer-topic) 88 | (search-topic-by-key replay-consumer consumer-topic "1") 89 | (consumer-subscribe consumer consumer-topic) 90 | (while true 91 | (let [records (.poll consumer (Duration/ofMillis 100))] 92 | (doseq [record records] 93 | (log/info "Sending on value" (str "Processed Value: " (.value record))) 94 | (.send producer (ProducerRecord. producer-topic "a" (str "Processed Value: " (.value record)))))) 95 | (.commitAsync consumer)))) 96 | 97 | (defn -main 98 | [& args] 99 | (.addShutdownHook (Runtime/getRuntime) (Thread. #(log/info "Shutting down"))) 100 | (run-application "localhost:9092")) -------------------------------------------------------------------------------- /kafka-producer-consumer-example/src/log4j.properties: -------------------------------------------------------------------------------- 1 | log4j.rootLogger=INFO, console 2 | log4j.logger.example=DEBUG 3 | log4j.appender.console=org.apache.log4j.ConsoleAppender 4 | log4j.appender.console.layout=org.apache.log4j.PatternLayout 5 | log4j.appender.console.layout.ConversionPattern=%-5p %c: %m%n -------------------------------------------------------------------------------- /kafka-producer-consumer-example/test/kafka_example/core_test.clj~: -------------------------------------------------------------------------------- 1 | (ns kafka-example.core-test 2 | (:require [clojure.test :refer :all] 3 | [kafka-example.core :refer :all])) 4 | 5 | (deftest a-test 6 | (testing "FIXME, I fail." 
7 | (is (= 0 1)))) 8 | -------------------------------------------------------------------------------- /kafka-producer-consumer-example/test/kafka_example/integration/example_integration_test.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-example.integration.example-integration-test 2 | (:require [clojure.test :refer [deftest testing is]] 3 | [kafka-example.core :refer [build-consumer build-producer run-application consumer-subscribe]]) 4 | (:import (org.testcontainers.containers KafkaContainer) 5 | (org.apache.kafka.clients.producer ProducerRecord) 6 | (org.testcontainers.utility DockerImageName))) 7 | 8 | (deftest example-kafka-integration-test 9 | (testing "Fire up test containers Kafka and then send and consume message" 10 | (let 11 | [kafka-container (KafkaContainer. (DockerImageName/parse "confluentinc/cp-kafka:5.5.3")) 12 | _ (.start kafka-container) 13 | bootstrap-server (.getBootstrapServers kafka-container) 14 | test-producer (build-producer bootstrap-server) 15 | _ (future (run-application bootstrap-server)) ; execute application in separate thread 16 | producer-topic "example-consumer-topic" 17 | test-consumer (build-consumer bootstrap-server) 18 | _ (consumer-subscribe test-consumer "example-produced-topic") 19 | input-data "hello" 20 | sent-result (.get (.send test-producer (ProducerRecord. 
producer-topic input-data))) 21 | records (.poll test-consumer 10000)] 22 | (is (= producer-topic (.topic sent-result))) 23 | (doseq [record records] 24 | (is (= (format "Processed Value: %s" input-data) (.value record))))))) 25 | -------------------------------------------------------------------------------- /kafka-producer-consumer-example/test/log4j.properties: -------------------------------------------------------------------------------- 1 | log4j.rootLogger=INFO, console 2 | log4j.logger.example=DEBUG 3 | log4j.appender.console=org.apache.log4j.ConsoleAppender 4 | log4j.appender.console.layout=org.apache.log4j.PatternLayout 5 | log4j.appender.console.layout.ConversionPattern=%-5p %c: %m%n -------------------------------------------------------------------------------- /kafka-streams-example/.gitignore: -------------------------------------------------------------------------------- 1 | /target 2 | /classes 3 | /checkouts 4 | pom.xml 5 | pom.xml.asc 6 | *.jar 7 | *.class 8 | /.lein-* 9 | /.nrepl-port 10 | .hgignore 11 | .hg/ 12 | .idea/ 13 | *.iml 14 | 15 | -------------------------------------------------------------------------------- /kafka-streams-example/CHANGELOG.md: -------------------------------------------------------------------------------- 1 | # Change Log 2 | All notable changes to this project will be documented in this file. This change log follows the conventions of [keepachangelog.com](http://keepachangelog.com/). 3 | 4 | ## [Unreleased] 5 | ### Changed 6 | - Add a new arity to `make-widget-async` to provide a different widget shape. 7 | 8 | ## [0.1.1] - 2018-03-12 9 | ### Changed 10 | - Documentation on how to make the widgets. 11 | 12 | ### Removed 13 | - `make-widget-sync` - we're all async, all the time. 14 | 15 | ### Fixed 16 | - Fixed widget maker to keep working when daylight savings switches over. 17 | 18 | ## 0.1.0 - 2018-03-12 19 | ### Added 20 | - Files from the new template. 21 | - Widget maker public API - `make-widget-sync`. 
22 | 23 | [Unreleased]: https://github.com/your-name/kafka-streams-example/compare/0.1.1...HEAD 24 | [0.1.1]: https://github.com/your-name/kafka-streams-example/compare/0.1.0...0.1.1 25 | -------------------------------------------------------------------------------- /kafka-streams-example/LICENSE: -------------------------------------------------------------------------------- 1 | THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE PUBLIC 2 | LICENSE ("AGREEMENT"). ANY USE, REPRODUCTION OR DISTRIBUTION OF THE PROGRAM 3 | CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT. 4 | 5 | 1. DEFINITIONS 6 | 7 | "Contribution" means: 8 | 9 | a) in the case of the initial Contributor, the initial code and 10 | documentation distributed under this Agreement, and 11 | 12 | b) in the case of each subsequent Contributor: 13 | 14 | i) changes to the Program, and 15 | 16 | ii) additions to the Program; 17 | 18 | where such changes and/or additions to the Program originate from and are 19 | distributed by that particular Contributor. A Contribution 'originates' from 20 | a Contributor if it was added to the Program by such Contributor itself or 21 | anyone acting on such Contributor's behalf. Contributions do not include 22 | additions to the Program which: (i) are separate modules of software 23 | distributed in conjunction with the Program under their own license 24 | agreement, and (ii) are not derivative works of the Program. 25 | 26 | "Contributor" means any person or entity that distributes the Program. 27 | 28 | "Licensed Patents" mean patent claims licensable by a Contributor which are 29 | necessarily infringed by the use or sale of its Contribution alone or when 30 | combined with the Program. 31 | 32 | "Program" means the Contributions distributed in accordance with this 33 | Agreement. 34 | 35 | "Recipient" means anyone who receives the Program under this Agreement, 36 | including all Contributors. 37 | 38 | 2. 
GRANT OF RIGHTS 39 | 40 | a) Subject to the terms of this Agreement, each Contributor hereby grants 41 | Recipient a non-exclusive, worldwide, royalty-free copyright license to 42 | reproduce, prepare derivative works of, publicly display, publicly perform, 43 | distribute and sublicense the Contribution of such Contributor, if any, and 44 | such derivative works, in source code and object code form. 45 | 46 | b) Subject to the terms of this Agreement, each Contributor hereby grants 47 | Recipient a non-exclusive, worldwide, royalty-free patent license under 48 | Licensed Patents to make, use, sell, offer to sell, import and otherwise 49 | transfer the Contribution of such Contributor, if any, in source code and 50 | object code form. This patent license shall apply to the combination of the 51 | Contribution and the Program if, at the time the Contribution is added by the 52 | Contributor, such addition of the Contribution causes such combination to be 53 | covered by the Licensed Patents. The patent license shall not apply to any 54 | other combinations which include the Contribution. No hardware per se is 55 | licensed hereunder. 56 | 57 | c) Recipient understands that although each Contributor grants the licenses 58 | to its Contributions set forth herein, no assurances are provided by any 59 | Contributor that the Program does not infringe the patent or other 60 | intellectual property rights of any other entity. Each Contributor disclaims 61 | any liability to Recipient for claims brought by any other entity based on 62 | infringement of intellectual property rights or otherwise. As a condition to 63 | exercising the rights and licenses granted hereunder, each Recipient hereby 64 | assumes sole responsibility to secure any other intellectual property rights 65 | needed, if any. 
For example, if a third party patent license is required to 66 | allow Recipient to distribute the Program, it is Recipient's responsibility 67 | to acquire that license before distributing the Program. 68 | 69 | d) Each Contributor represents that to its knowledge it has sufficient 70 | copyright rights in its Contribution, if any, to grant the copyright license 71 | set forth in this Agreement. 72 | 73 | 3. REQUIREMENTS 74 | 75 | A Contributor may choose to distribute the Program in object code form under 76 | its own license agreement, provided that: 77 | 78 | a) it complies with the terms and conditions of this Agreement; and 79 | 80 | b) its license agreement: 81 | 82 | i) effectively disclaims on behalf of all Contributors all warranties and 83 | conditions, express and implied, including warranties or conditions of title 84 | and non-infringement, and implied warranties or conditions of merchantability 85 | and fitness for a particular purpose; 86 | 87 | ii) effectively excludes on behalf of all Contributors all liability for 88 | damages, including direct, indirect, special, incidental and consequential 89 | damages, such as lost profits; 90 | 91 | iii) states that any provisions which differ from this Agreement are offered 92 | by that Contributor alone and not by any other party; and 93 | 94 | iv) states that source code for the Program is available from such 95 | Contributor, and informs licensees how to obtain it in a reasonable manner on 96 | or through a medium customarily used for software exchange. 97 | 98 | When the Program is made available in source code form: 99 | 100 | a) it must be made available under this Agreement; and 101 | 102 | b) a copy of this Agreement must be included with each copy of the Program. 103 | 104 | Contributors may not remove or alter any copyright notices contained within 105 | the Program. 
106 | 107 | Each Contributor must identify itself as the originator of its Contribution, 108 | if any, in a manner that reasonably allows subsequent Recipients to identify 109 | the originator of the Contribution. 110 | 111 | 4. COMMERCIAL DISTRIBUTION 112 | 113 | Commercial distributors of software may accept certain responsibilities with 114 | respect to end users, business partners and the like. While this license is 115 | intended to facilitate the commercial use of the Program, the Contributor who 116 | includes the Program in a commercial product offering should do so in a 117 | manner which does not create potential liability for other Contributors. 118 | Therefore, if a Contributor includes the Program in a commercial product 119 | offering, such Contributor ("Commercial Contributor") hereby agrees to defend 120 | and indemnify every other Contributor ("Indemnified Contributor") against any 121 | losses, damages and costs (collectively "Losses") arising from claims, 122 | lawsuits and other legal actions brought by a third party against the 123 | Indemnified Contributor to the extent caused by the acts or omissions of such 124 | Commercial Contributor in connection with its distribution of the Program in 125 | a commercial product offering. The obligations in this section do not apply 126 | to any claims or Losses relating to any actual or alleged intellectual 127 | property infringement. In order to qualify, an Indemnified Contributor must: 128 | a) promptly notify the Commercial Contributor in writing of such claim, and 129 | b) allow the Commercial Contributor to control, and cooperate with the 130 | Commercial Contributor in, the defense and any related settlement 131 | negotiations. The Indemnified Contributor may participate in any such claim 132 | at its own expense. 133 | 134 | For example, a Contributor might include the Program in a commercial product 135 | offering, Product X. That Contributor is then a Commercial Contributor. 
If 136 | that Commercial Contributor then makes performance claims, or offers 137 | warranties related to Product X, those performance claims and warranties are 138 | such Commercial Contributor's responsibility alone. Under this section, the 139 | Commercial Contributor would have to defend claims against the other 140 | Contributors related to those performance claims and warranties, and if a 141 | court requires any other Contributor to pay any damages as a result, the 142 | Commercial Contributor must pay those damages. 143 | 144 | 5. NO WARRANTY 145 | 146 | EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, THE PROGRAM IS PROVIDED ON 147 | AN "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER 148 | EXPRESS OR IMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR 149 | CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A 150 | PARTICULAR PURPOSE. Each Recipient is solely responsible for determining the 151 | appropriateness of using and distributing the Program and assumes all risks 152 | associated with its exercise of rights under this Agreement , including but 153 | not limited to the risks and costs of program errors, compliance with 154 | applicable laws, damage to or loss of data, programs or equipment, and 155 | unavailability or interruption of operations. 156 | 157 | 6. DISCLAIMER OF LIABILITY 158 | 159 | EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, NEITHER RECIPIENT NOR ANY 160 | CONTRIBUTORS SHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, 161 | SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION 162 | LOST PROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN 163 | CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) 164 | ARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE 165 | EXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE POSSIBILITY 166 | OF SUCH DAMAGES. 167 | 168 | 7. 
GENERAL 169 | 170 | If any provision of this Agreement is invalid or unenforceable under 171 | applicable law, it shall not affect the validity or enforceability of the 172 | remainder of the terms of this Agreement, and without further action by the 173 | parties hereto, such provision shall be reformed to the minimum extent 174 | necessary to make such provision valid and enforceable. 175 | 176 | If Recipient institutes patent litigation against any entity (including a 177 | cross-claim or counterclaim in a lawsuit) alleging that the Program itself 178 | (excluding combinations of the Program with other software or hardware) 179 | infringes such Recipient's patent(s), then such Recipient's rights granted 180 | under Section 2(b) shall terminate as of the date such litigation is filed. 181 | 182 | All Recipient's rights under this Agreement shall terminate if it fails to 183 | comply with any of the material terms or conditions of this Agreement and 184 | does not cure such failure in a reasonable period of time after becoming 185 | aware of such noncompliance. If all Recipient's rights under this Agreement 186 | terminate, Recipient agrees to cease use and distribution of the Program as 187 | soon as reasonably practicable. However, Recipient's obligations under this 188 | Agreement and any licenses granted by Recipient relating to the Program shall 189 | continue and survive. 190 | 191 | Everyone is permitted to copy and distribute copies of this Agreement, but in 192 | order to avoid inconsistency the Agreement is copyrighted and may only be 193 | modified in the following manner. The Agreement Steward reserves the right to 194 | publish new versions (including revisions) of this Agreement from time to 195 | time. No one other than the Agreement Steward has the right to modify this 196 | Agreement. The Eclipse Foundation is the initial Agreement Steward. 
The 197 | Eclipse Foundation may assign the responsibility to serve as the Agreement 198 | Steward to a suitable separate entity. Each new version of the Agreement will 199 | be given a distinguishing version number. The Program (including 200 | Contributions) may always be distributed subject to the version of the 201 | Agreement under which it was received. In addition, after a new version of 202 | the Agreement is published, Contributor may elect to distribute the Program 203 | (including its Contributions) under the new version. Except as expressly 204 | stated in Sections 2(a) and 2(b) above, Recipient receives no rights or 205 | licenses to the intellectual property of any Contributor under this 206 | Agreement, whether expressly, by implication, estoppel or otherwise. All 207 | rights in the Program not expressly granted under this Agreement are 208 | reserved. 209 | 210 | This Agreement is governed by the laws of the State of New York and the 211 | intellectual property laws of the United States of America. No party to this 212 | Agreement will bring a legal action under this Agreement more than one year 213 | after the cause of action arose. Each party waives its rights to a jury trial 214 | in any resulting litigation. 215 | -------------------------------------------------------------------------------- /kafka-streams-example/README.md: -------------------------------------------------------------------------------- 1 | # kafka-streams-example 2 | 3 | A number of Kafka Streams examples covering stream-stream joins and stream-KTable joins. Check out the tests. 4 | 5 | 6 | ## Usage 7 | 8 | Running the tests will give the user a good idea of how these joins work. 
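For orientation, the inner-join semantics these examples exercise can be sketched in plain Clojure with no Kafka dependency. The `inner-join` helper below is hypothetical (it is not part of this repo) and deliberately ignores join windows and event time; it simply pairs values whose keys match, combining them as `"left/right"` the way the examples' `ValueJoiner` does:

```clojure
;; A minimal, Kafka-free sketch of KStream-KStream inner-join semantics,
;; using plain seqs of [key value] pairs instead of Kafka topics.
;; Windowing and event time are deliberately ignored.
(defn inner-join
  "Pair every left value with every right value sharing its key,
  combining them as \"left/right\" like the examples' ValueJoiner."
  [left-stream right-stream]
  (let [rights (group-by first right-stream)]
    (vec (for [[k left-value] left-stream
               [_ right-value] (get rights k)]
           [k (str left-value "/" right-value)]))))

(inner-join [["user-1" "impression"]]
            [["user-1" "click"] ["user-2" "click"]])
;; => [["user-1" "impression/click"]]
```

Unmatched keys simply produce no output, which mirrors inner-join behaviour; the left- and outer-join examples differ only in also emitting the unmatched side with a `nil` partner.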
9 | 10 | $ java -jar kafka-streams-example-0.1.0-standalone.jar 11 | -------------------------------------------------------------------------------- /kafka-streams-example/project.clj: -------------------------------------------------------------------------------- 1 | (defproject kafka-streams-example "0.1.0-SNAPSHOT" 2 | :description "Kafka Streams Example" 3 | :url "https://perkss.github.io/#/clojure/KafkaClojure#text-body" 4 | :dependencies [[org.clojure/clojure "1.11.1"] 5 | [org.clojure/data.json "0.2.6"] 6 | [org.apache.kafka/kafka-streams "3.8.0"] 7 | [org.apache.kafka/kafka-clients "3.8.0"] 8 | [org.apache.kafka/kafka-streams-test-utils "3.8.0"] 9 | [org.apache.kafka/kafka-clients "3.8.0" :classifier "test"] 10 | [io.confluent/kafka-streams-avro-serde "7.7.1"] 11 | [org.clojure/tools.logging "1.2.4"] 12 | [org.slf4j/slf4j-log4j12 "2.0.10"] 13 | [org.apache.logging.log4j/log4j-core "2.22.1"] 14 | [danlentz/clj-uuid "0.1.9"]] 15 | 16 | :repositories [["confluent" {:url "https://packages.confluent.io/maven/"}]] 17 | :plugins [[lein-cljfmt "0.6.1"]] 18 | :main ^:skip-aot kafka-streams-example.core 19 | :target-path "target/%s" 20 | :profiles {:uberjar {:aot :all}}) 21 | -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/core.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.core 2 | (:require [clojure.tools.logging :as log]) 3 | (:import 4 | (org.apache.kafka.common.serialization Serdes) 5 | (org.apache.kafka.streams StreamsConfig KafkaStreams StreamsBuilder) 6 | (org.apache.kafka.streams.kstream ValueMapper)) 7 | (:gen-class)) 8 | 9 | (defn to-uppercase-topology [input-topic output-topic] 10 | (let [builder (StreamsBuilder.)] 11 | (-> 12 | (.stream builder input-topic) ;; Create the source node of the stream 13 | (.mapValues (reify 14 | ValueMapper 15 | (apply [_ v] 16 | (clojure.string/upper-case v)))) ;; map the 
strings to uppercase 17 | (.to output-topic)) 18 | builder)) 19 | 20 | (defn -main 21 | [& args] 22 | 23 | (.addShutdownHook (Runtime/getRuntime) (Thread. #(log/info "Shutting down"))) 24 | 25 | (log/info "Starting kafka stream application") 26 | 27 | (let [properties {StreamsConfig/APPLICATION_ID_CONFIG, "uppercase-processing-application" 28 | StreamsConfig/BOOTSTRAP_SERVERS_CONFIG, "broker:9092" 29 | StreamsConfig/DEFAULT_KEY_SERDE_CLASS_CONFIG, (.getName (.getClass (Serdes/String))) 30 | StreamsConfig/DEFAULT_VALUE_SERDE_CLASS_CONFIG, (.getName (.getClass (Serdes/String)))} 31 | input-topic "plaintext-input" 32 | output-topic "uppercase" 33 | streams (KafkaStreams. (.build (to-uppercase-topology input-topic output-topic)) properties)] 34 | (log/info "starting stream with topology to upper case") 35 | (.start streams) 36 | (Thread/sleep (* 60000 10)) 37 | (log/info "stopping"))) -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/kstream_aggregate.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-aggregate 2 | (:require [kafka-streams-example.utils :as kstream-utils]) 3 | (:import [org.apache.kafka.streams.kstream Aggregator Initializer KeyValueMapper KStream Materialized Produced KTable] 4 | org.apache.kafka.streams.StreamsBuilder 5 | (org.apache.kafka.common.serialization Serdes))) 6 | 7 | ; https://github.com/confluentinc/kafka-streams-examples/blob/aefdc2fa3fe820709b069685021728e7775b1788/src/test/java/io/confluent/examples/streams/AggregateTest.java 8 | 9 | ;KTable, Long> timeWindowedAggregatedStream = groupedStream.windowedBy(TimeUnit.MINUTES.toMillis(5)) 10 | ;.aggregate( 11 | ; () -> 0L, /* initializer */ 12 | ; (aggKey, newValue, aggValue) -> aggValue + newValue, /* adder */ 13 | ; Materialized.>as("time-windowed-aggregated-stream-store") /* state store name */ 14 | ; .withValueSerde(Serdes.Long())); /* 
serde for aggregate value */ 15 | (defn ^KTable aggregate-starting-letter-topology 16 | [^KStream input] 17 | (-> input 18 | (.groupBy (reify KeyValueMapper 19 | (apply [_ k v] 20 | ((fn [_ v] 21 | (.substring v 0 1)) k v)))) 22 | (.aggregate 23 | (reify Initializer 24 | (apply [_] 0)) 25 | (reify Aggregator 26 | (apply [_ _ new-value agg-value] 27 | (long (+ (.length new-value) agg-value)))) 28 | (Materialized/with (Serdes/String) (Serdes/Long))))) 29 | 30 | (defn build-aggregate-topology 31 | [] 32 | (let [builder (StreamsBuilder.) 33 | input-topic "input-topic" 34 | output-topic "output-topic" 35 | input (kstream-utils/build-stream builder input-topic)] 36 | 37 | (-> (aggregate-starting-letter-topology input) 38 | (.toStream) 39 | (.to output-topic (Produced/with (Serdes/String) (Serdes/Long)))) 40 | builder)) -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/kstream_avro_join.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-avro-join 2 | (:require [kafka-streams-example.utils :as kstream-utils]) 3 | (:import (org.apache.kafka.streams StreamsBuilder) 4 | (org.apache.kafka.streams.kstream KStream ValueJoiner JoinWindows ForeachAction Consumed Joined Produced StreamJoined) 5 | (org.apache.kafka.common.serialization Serdes) 6 | (org.apache.avro Schema Schema$Field Schema$Type) 7 | (java.util ArrayList) 8 | (org.apache.avro.generic GenericRecordBuilder) 9 | (java.time Duration))) 10 | 11 | (defn build-repayment-processed-schema 12 | [] 13 | (let [fields (doto (ArrayList.) 14 | (.add (Schema$Field. "id" (Schema/create (Schema$Type/STRING)) "id field" "")) 15 | (.add (Schema$Field. "repayment_amount" (Schema/create (Schema$Type/INT)) "repayment amount field" 0)) 16 | (.add (Schema$Field. "transaction_amount" (Schema/create (Schema$Type/INT)) "transaction amount field" 0)) 17 | (.add (Schema$Field. 
"account" (Schema/create (Schema$Type/INT)) "account field" 0)))] 18 | (doto (Schema/createRecord "RepaymentProcessedRecord" 19 | "The repayment processed schema record" 20 | "kafka.streams.example" 21 | false 22 | fields)))) 23 | 24 | (defn build-repayment-processed-record 25 | [repayment transaction ^Schema schema] 26 | (let [^GenericRecordBuilder builder (GenericRecordBuilder. schema)] 27 | (.set builder "id" (.get repayment "id")) 28 | (.set builder "repayment_amount" (.get repayment "amount")) 29 | (.set builder "transaction_amount" (.get transaction "amount")) 30 | (.set builder "account" (.get repayment "account")) 31 | (.build builder))) 32 | 33 | (defn ^KStream build-stream 34 | [^StreamsBuilder builder ^String input-topic 35 | key-serializer value-serializer] 36 | (.stream builder input-topic (Consumed/with key-serializer value-serializer))) 37 | 38 | (defn join-repayment-transaction-topology 39 | [^KStream repayments ^KStream transactions value-serializer] 40 | (-> repayments 41 | (.join transactions 42 | (reify ValueJoiner 43 | (apply [_ left right] 44 | ((fn [repayment-value transaction-value] 45 | (build-repayment-processed-record repayment-value transaction-value (build-repayment-processed-schema))) 46 | left right))) 47 | (. JoinWindows of (Duration/ofMillis 5000)) 48 | (. StreamJoined with (. Serdes String) 49 | value-serializer 50 | value-serializer)))) 51 | 52 | (defn repayment-transaction-topology 53 | [key-serializer value-serializer] 54 | (let [builder (StreamsBuilder.) 
55 | repayment-topic "repayment" 56 | transaction-topic "transaction" 57 | processed-repayment-topic "processed-repayment" 58 | repayment-stream (build-stream builder repayment-topic key-serializer value-serializer) 59 | transaction-stream (build-stream builder transaction-topic key-serializer value-serializer)] 60 | 61 | (-> (join-repayment-transaction-topology repayment-stream transaction-stream value-serializer) 62 | (kstream-utils/peek-stream) 63 | (.to processed-repayment-topic (Produced/with key-serializer value-serializer))) 64 | builder)) 65 | -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/kstream_kstream_inner_join_example.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-kstream-inner-join-example 2 | (:require [kafka-streams-example.utils :as kstream-utils]) 3 | (:import (org.apache.kafka.streams StreamsBuilder) 4 | (org.apache.kafka.streams.kstream KStream ValueJoiner JoinWindows) 5 | (java.time Duration))) 6 | 7 | ;; The click and impressions example topology where user clicks are joined to 8 | ;; impressions with a inner join. 9 | (defn impressions-clicks-topology 10 | [^KStream impressions ^KStream clicks] 11 | (-> impressions 12 | (.join clicks 13 | (reify ValueJoiner 14 | (apply [_ left right] 15 | ((fn [impression-value click-value] 16 | (str impression-value "/" click-value)) 17 | left right))) 18 | (. JoinWindows of (Duration/ofMillis 5000))))) 19 | 20 | (defn builder-streaming-join-topology 21 | [] 22 | (let [builder (StreamsBuilder.) 
23 | ad-impressions-topic "adImpressions" 24 | ad-clicks-topic "adClicks" 25 | output-topic "output-topic" 26 | impressions (kstream-utils/build-stream builder ad-impressions-topic) 27 | clicks (kstream-utils/build-stream builder ad-clicks-topic)] 28 | 29 | (-> (impressions-clicks-topology impressions clicks) 30 | (.to output-topic)) 31 | builder)) 32 | -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/kstream_kstream_left_join_example.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-kstream-left-join-example 2 | (:require [clojure.tools.logging :as log]) 3 | (:require [kafka-streams-example.utils :as kstream-utils]) 4 | (:import [org.apache.kafka.streams.kstream JoinWindows KStream ValueJoiner] 5 | org.apache.kafka.streams.StreamsBuilder 6 | (java.time Duration))) 7 | 8 | (defn impressions-clicks-topology 9 | [^KStream impressions ^KStream clicks] 10 | (-> impressions 11 | (.leftJoin clicks 12 | (reify ValueJoiner 13 | (apply [_ left right] 14 | ((fn [impression-value click-value] 15 | (log/infof "Received values impression value: %s click: %s" impression-value click-value) 16 | (str impression-value "/" click-value)) 17 | left right))) 18 | (. JoinWindows of (Duration/ofMillis (* 50 60 10000)))))) 19 | 20 | (defn builder-streaming-join-topology 21 | [] 22 | (let [builder (StreamsBuilder.) 
23 | ad-impressions-topic "adImpressions" 24 | ad-clicks-topic "adClicks" 25 | output-topic "output-topic" 26 | impressions (kstream-utils/build-stream builder ad-impressions-topic) 27 | clicks (kstream-utils/build-stream builder ad-clicks-topic)] 28 | 29 | (-> (impressions-clicks-topology impressions clicks) 30 | (kstream-utils/peek-stream) 31 | (.to output-topic)) 32 | builder)) 33 | -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/kstream_kstream_outer_join_example.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-kstream-outer-join-example 2 | (:require [kafka-streams-example.utils :as kstream-utils]) 3 | (:import (org.apache.kafka.streams StreamsBuilder) 4 | (org.apache.kafka.streams.kstream KStream ValueJoiner JoinWindows) 5 | (java.time Duration))) 6 | 7 | (defn impressions-clicks-topology 8 | [^KStream impressions ^KStream clicks] 9 | (-> impressions 10 | (.outerJoin clicks 11 | (reify ValueJoiner 12 | (apply [_ left right] 13 | ((fn [impression-value click-value] 14 | (str impression-value "/" click-value)) 15 | left right))) 16 | (. JoinWindows of (Duration/ofMillis 5000))))) 17 | 18 | (defn builder-streaming-join-topology 19 | [] 20 | (let [builder (StreamsBuilder.) 
21 | ad-impressions-topic "adImpressions" 22 | ad-clicks-topic "adClicks" 23 | output-topic "output-topic" 24 | impressions (kstream-utils/build-stream builder ad-impressions-topic) 25 | clicks (kstream-utils/build-stream builder ad-clicks-topic)] 26 | 27 | (-> (impressions-clicks-topology impressions clicks) 28 | (.to output-topic)) 29 | builder)) 30 | -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/kstream_reduce.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-reduce 2 | (:import (org.apache.kafka.streams StreamsBuilder) 3 | (org.apache.kafka.streams.kstream Consumed Grouped Reducer Produced) 4 | (org.apache.kafka.common.serialization Serdes))) 5 | 6 | ; https://github.com/confluentinc/kafka-streams-examples/blob/5.4.1-post/src/test/java/io/confluent/examples/streams/ReduceTest.java 7 | (defn build-group-by-reduce-topology 8 | [input-topic output-topic] 9 | (let [builder (StreamsBuilder.) 
10 | input (.stream builder input-topic (Consumed/with (Serdes/Long) (Serdes/String))) 11 | concatenated (-> input 12 | (.groupByKey (Grouped/with (Serdes/Long) (Serdes/String))) 13 | (.reduce (reify Reducer 14 | (apply [_ v1 v2] 15 | (str v1 " " v2))))) 16 | _ (-> concatenated 17 | (.toStream) 18 | (.to output-topic (Produced/with (Serdes/Long) (Serdes/String))))] 19 | builder)) 20 | -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/ktable_example.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.ktable-example 2 | (:require [kafka-streams-example.utils :as kstream-utils]) 3 | (:import (org.apache.kafka.streams StreamsBuilder KeyValue Topology) 4 | (org.apache.kafka.streams.kstream KStream KTable ValueJoiner KeyValueMapper Reducer))) 5 | 6 | (defn clicks-per-region 7 | [^KStream user-clicks-stream ^KTable user-regions-table] 8 | (-> user-clicks-stream 9 | ;; Joins on the Key which is the name 10 | (.leftJoin user-regions-table 11 | (reify ValueJoiner 12 | (apply [_ left right] 13 | ((fn [clicks region] 14 | {:region region :clicks clicks}) 15 | left right)))) 16 | (.map (reify KeyValueMapper 17 | (apply [_ k v] 18 | ((fn [_ clicks-with-regions] 19 | (let [value (KeyValue. 20 | (:region clicks-with-regions) 21 | (:clicks clicks-with-regions))] 22 | value)) k v)))) 23 | (.groupByKey) 24 | (.reduce (reify Reducer 25 | (apply [_ left right] 26 | ((fn [first-clicks second-clicks] 27 | (str (+ (Integer. first-clicks) (Integer. second-clicks)))) left right)))))) 28 | 29 | (defn build-join-topology 30 | [] 31 | (let [builder (StreamsBuilder.) 
32 | input-topic-clicks "user-clicks-topic" 33 | input-topic-regions "user-regions-topic" 34 | output-topic "clicks-per-region-topic" 35 | user-clicks (kstream-utils/build-stream builder input-topic-clicks) 36 | user-regions (kstream-utils/build-table builder input-topic-regions)] 37 | (-> (clicks-per-region user-clicks user-regions) 38 | (.toStream) 39 | (.to output-topic)) 40 | builder)) 41 | -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/processor_api_example.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.processor-api-example 2 | (:require [clojure.string :as str] 3 | [clojure.tools.logging :as log]) 4 | (:import (org.apache.kafka.streams.processor Processor 5 | ProcessorSupplier PunctuationType Punctuator ProcessorContext) 6 | (org.apache.kafka.streams Topology) 7 | org.apache.kafka.common.serialization.Serdes 8 | (org.apache.kafka.streams.state Stores) 9 | (java.time Duration))) 10 | 11 | ;; inspired by https://kafka.apache.org/11/documentation/streams/developer-guide/processor-api.html 12 | 13 | (defn add-word-count 14 | [word kvstore] 15 | (log/info "Calling add word count") 16 | (let [old-value (.get @kvstore word)] 17 | (if (nil? old-value) 18 | (do 19 | (log/infof "Entering initial value for %s" word) 20 | (.put @kvstore word (str 1))) 21 | (do 22 | (log/infof "Entering an updated value for word %s with old value %s" word old-value) 23 | ;; TODO fix horrible casting here 24 | (.put @kvstore word (str (inc (Integer.
old-value)))))))) 25 | 26 | (defn punctuator-forward-message 27 | [timestamp kvstore context] 28 | (reify 29 | Punctuator 30 | (punctuate [_ timestamp] 31 | (let [iter (iterator-seq (.all @kvstore))] 32 | (log/infof "Iter %s" (pr-str iter)) 33 | (dorun (map #(.forward @context (.key %) (str (.value %))) iter)) 34 | (.commit @context))))) 35 | 36 | (defn word-processor 37 | "The first argument to each reify method is this. Implements the Processor Java API." 38 | [store-name] 39 | (let [store (atom {}) 40 | context (atom nil) 41 | timestamp (Duration/ofMillis 10)] 42 | (reify 43 | Processor 44 | (close [_]) 45 | (init [this processor-context] 46 | (reset! context processor-context) 47 | (reset! store (.getStateStore @context store-name)) ;; Assign the state store to the atom 48 | 49 | (.schedule ^ProcessorContext @context 50 | timestamp 51 | PunctuationType/STREAM_TIME 52 | (punctuator-forward-message timestamp store context))) 53 | (process [_ key line] 54 | (log/infof "Process has been called with %s %s" key line) 55 | (let [words 56 | (-> (str line) 57 | (str/lower-case) 58 | (str/split #" "))] 59 | 60 | (log/infof "Words: %s" words) 61 | ;; Update the word count in the state store 62 | (dorun (map #(add-word-count % store) words)) 63 | 64 | (log/infof "Current value for word %s is: %s" 65 | (first words) 66 | (.get @store (first words)))))))) 67 | 68 | (defn word-processor-supplier 69 | [store-name] 70 | (reify 71 | ProcessorSupplier 72 | (get [_] (word-processor store-name)))) 73 | 74 | (defn word-processor-topology 75 | [] 76 | (log/info "Word Processor API Streaming Topology") 77 | (let [builder (Topology.)
78 | store-name "Counts" 79 | store (Stores/keyValueStoreBuilder 80 | (Stores/persistentKeyValueStore store-name) 81 | (Serdes/String) 82 | (Serdes/String))] 83 | (-> builder 84 | (.addSource "Source" (into-array String ["source-topic"])) 85 | (.addProcessor "Process" 86 | (word-processor-supplier store-name) 87 | (into-array String ["Source"])) 88 | ;; Remove the add state store and you will see my contribution to Kafka: 89 | ;; https://issues.apache.org/jira/browse/KAFKA-6659 90 | (.addStateStore store (into-array String ["Process"])) 91 | (.addSink "Sink" "sink-topic" (into-array String ["Process"]))))) 92 | -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/processor_api_rekey.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.processor-api-rekey 2 | (:require [clojure.tools.logging :as log] 3 | [kafka-streams-example.serdes.json-serdes :refer [json-serde]]) 4 | (:import (org.apache.kafka.streams.processor Processor 5 | ProcessorSupplier ProcessorContext To) 6 | org.apache.kafka.common.serialization.Serdes)) 7 | 8 | (defn re-key-stream-processor 9 | "A generic processor that rekeys messages" 10 | [re-key-fn sink] 11 | (let [context (atom nil)] 12 | (reify 13 | Processor 14 | (close [_]) 15 | (init [_ processor-context] 16 | (reset! 
context processor-context)) 17 | (process [_ _ message] 18 | (re-key-fn @context sink message))))) 19 | 20 | (defn re-key-stream-processor-supplier 21 | [re-key-fn sink] 22 | (reify 23 | ProcessorSupplier 24 | (get [_] (re-key-stream-processor re-key-fn sink)))) 25 | 26 | (defn re-key-trades 27 | "Function provided to rekey" 28 | [^ProcessorContext context sink-name {:keys [trade-id] :as message}] 29 | (log/infof "Sending on rekey trade Key: %s Message %s" trade-id message) 30 | (.forward context trade-id message (To/child sink-name))) 31 | 32 | (defn trade-rekey-topology 33 | "Topology that takes a trade and rekeys it by the provided function" 34 | [builder] 35 | (-> builder 36 | (.addSource "TradeInputs" 37 | (.deserializer (Serdes/String)) 38 | (.deserializer json-serde) 39 | (into-array String ["trade-input-topic"])) 40 | (.addProcessor "ReKeyTrades" 41 | (re-key-stream-processor-supplier re-key-trades "TradeSinkByTradeId") 42 | (into-array String ["TradeInputs"])) 43 | (.addSink "TradeSinkByTradeId" 44 | "trades-by-trade-id" 45 | (.serializer (Serdes/String)) 46 | (.serializer json-serde) 47 | (into-array String ["ReKeyTrades"])))) -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/serdes/json_serdes.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.serdes.json-serdes 2 | (:require [clojure.data.json :as json]) 3 | (:import java.nio.charset.StandardCharsets 4 | [org.apache.kafka.common.serialization Deserializer Serdes Serializer])) 5 | 6 | (defn bytes-to-string 7 | [data] 8 | (String. 
data StandardCharsets/UTF_8)) 9 | 10 | (defn string-to-bytes 11 | [data] 12 | (.getBytes ^String data StandardCharsets/UTF_8)) 13 | 14 | (def json-serializer 15 | (reify 16 | Serializer 17 | (close [_]) 18 | (configure [_ _ _]) 19 | (serialize [_ _ data] 20 | (when data 21 | (-> (json/write-str data) 22 | (string-to-bytes)))))) 23 | 24 | (def json-deserializer 25 | (reify 26 | Deserializer 27 | (close [_]) 28 | (configure [_ _ _]) 29 | (deserialize [_ _topic data] 30 | (when data 31 | (-> (bytes-to-string data) 32 | (json/read-str :key-fn keyword)))))) 33 | 34 | (def json-serde 35 | (Serdes/serdeFrom json-serializer json-deserializer)) -------------------------------------------------------------------------------- /kafka-streams-example/src/kafka_streams_example/utils.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.utils 2 | (:require [clojure.tools.logging :as log]) 3 | (:import (org.apache.kafka.streams.kstream KStream KTable ForeachAction))) 4 | 5 | (defn ^KStream build-stream 6 | [builder input-topic] 7 | (.stream builder input-topic)) 8 | 9 | (defn ^KTable build-table 10 | [builder input-topic] 11 | (.table builder input-topic)) 12 | 13 | (defn peek-stream 14 | [stream] 15 | (.peek stream 16 | (reify ForeachAction 17 | (apply [_ k v] 18 | (log/infof "Key: %s, Data: %s" k v))))) -------------------------------------------------------------------------------- /kafka-streams-example/src/log4j.properties: -------------------------------------------------------------------------------- 1 | log4j.rootLogger=INFO, console 2 | log4j.logger.example=DEBUG 3 | log4j.appender.console=org.apache.log4j.ConsoleAppender 4 | log4j.appender.console.layout=org.apache.log4j.PatternLayout 5 | log4j.appender.console.layout.ConversionPattern=%-5p %c: %m%n -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/core_test.clj: 
-------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.core-test 2 | (:require [clojure.test :refer :all] 3 | [kafka-streams-example.core :as sut] 4 | [kafka-streams-example.test-support :as support]) 5 | (:import org.apache.kafka.common.serialization.Serdes 6 | [org.apache.kafka.streams TopologyTestDriver Topology])) 7 | 8 | (deftest kafka-streams-to-uppercase-test 9 | (testing "Kafka Stream example one to test the uppercase topology" 10 | (let [input-topic "plaintext-input" 11 | output-topic-name "uppercase" 12 | ^Topology topology (.build (sut/to-uppercase-topology input-topic output-topic-name)) 13 | topology-test-driver (TopologyTestDriver. topology (support/properties "uppercase-topology")) 14 | serializer (.serializer (. Serdes String)) 15 | deserializer (.deserializer (. Serdes String)) 16 | output-topic (.createOutputTopic topology-test-driver output-topic-name deserializer deserializer) 17 | input-topic (.createInputTopic topology-test-driver input-topic serializer serializer) 18 | input "Hello my first stream testing to uppercase" 19 | expected "HELLO MY FIRST STREAM TESTING TO UPPERCASE"] 20 | (.pipeInput input-topic "key" input) 21 | (is (= expected (.value (.readKeyValue output-topic))))))) 22 | -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/kstream_aggregate_test.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-aggregate-test 2 | (:require [kafka-streams-example.kstream-aggregate :as sut] 3 | [clojure.test :refer [deftest testing is]] 4 | [kafka-streams-example.test-support :as support]) 5 | (:import org.apache.kafka.common.serialization.Serdes 6 | [org.apache.kafka.streams TopologyTestDriver])) 7 | 8 | (deftest kafka-streams-example-streaming-aggregate-test 9 | (testing "Aggregate a stream of words by summing the lengths of the words that share a starting letter" 10 | (let [topology (.build (sut/build-aggregate-topology)) 11 | topology-test-driver (TopologyTestDriver. topology (support/properties "aggregate-topology")) 12 | serializer (.serializer (Serdes/String)) 13 | input-topic (.createInputTopic topology-test-driver "input-topic" serializer serializer) 14 | output-topic (.createOutputTopic topology-test-driver "output-topic" (.deserializer (Serdes/String)) (.deserializer (Serdes/Long)))] 15 | 16 | (.pipeInput input-topic "1" "stream") 17 | 18 | (let [output (.readKeyValue output-topic)] 19 | (is (= "s" (.key output))) 20 | (is (= 6 (.value output)))) 21 | 22 | (.pipeInput input-topic "2" "all") 23 | 24 | (let [output (.readKeyValue output-topic)] 25 | (is (= "a" (.key output))) 26 | (is (= 3 (.value output)))) 27 | 28 | (.pipeInput input-topic "3" "the") 29 | 30 | (let [output (.readKeyValue output-topic)] 31 | (is (= "t" (.key output))) 32 | (is (= 3 (.value output)))) 33 | 34 | (.pipeInput input-topic "4" "things") 35 | 36 | ;; Aggregates the count for words beginning with t 37 | (let [output (.readKeyValue output-topic)] 38 | (is (= "t" (.key output))) 39 | (is (= 9 (.value output)))) 40 | 41 | (.close topology-test-driver)))) -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/kstream_avro_join_test.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-avro-join-test 2 | (:require [clojure.test :refer :all] 3 | [kafka-streams-example.kstream-avro-join :as sut] 4 | [kafka-streams-example.test-support :as support]) 5 | (:import (org.apache.kafka.streams TopologyTestDriver Topology KeyValue) 6 | (io.confluent.kafka.schemaregistry.client MockSchemaRegistryClient) 7 | (io.confluent.kafka.streams.serdes.avro GenericAvroSerde) 8 | (org.apache.kafka.common.serialization Serdes) 9 | (org.apache.avro Schema$Field Schema
Schema$Type) 10 | (java.util ArrayList) 11 | (org.apache.avro.generic GenericRecordBuilder))) 12 | 13 | (defn build-transaction-schema 14 | [] 15 | (let [fields (doto (ArrayList.) 16 | (.add (Schema$Field. "id" (Schema/create Schema$Type/STRING) "id field" "")) 17 | (.add (Schema$Field. "amount" (Schema/create Schema$Type/INT) "amount field" 0)))] 18 | (Schema/createRecord "TransactionRecord" 19 | "The transaction schema record" 20 | "kafka.streams.example" 21 | false 22 | fields))) 23 | 24 | (defn build-repayment-schema 25 | [] 26 | (let [fields (doto (ArrayList.) 27 | (.add (Schema$Field. "id" (Schema/create Schema$Type/STRING) "id field" "")) 28 | (.add (Schema$Field. "amount" (Schema/create Schema$Type/INT) "amount field" 0)) 29 | (.add (Schema$Field. "account" (Schema/create Schema$Type/INT) "account field" 0)))] 30 | (Schema/createRecord "RepaymentRecord" 31 | "The repayment schema record" 32 | "kafka.streams.example" 33 | false 34 | fields))) 35 | 36 | (defn build-repayment-record 37 | [data ^Schema schema] 38 | (let [^GenericRecordBuilder builder (GenericRecordBuilder. schema)] 39 | (.set builder "id" (:id data)) 40 | (.set builder "amount" (:amount data)) 41 | (.set builder "account" (:account data)) 42 | (.build builder))) 43 | 44 | (defn build-transaction-record 45 | [data ^Schema schema] 46 | (let [^GenericRecordBuilder builder (GenericRecordBuilder. schema)] 47 | (.set builder "id" (:id data)) 48 | (.set builder "amount" (:amount data)) 49 | (.build builder))) 50 | 51 | (deftest join-repayment-transaction-topology-test 52 | (testing "Joining a repayment and transaction" 53 | (let [repayment-id "1234" 54 | transaction-id "1234" 55 | amount 10 56 | repayment {:id repayment-id 57 | :amount amount 58 | :account 10241241} 59 | transaction {:id transaction-id 60 | :amount amount} 61 | 62 | schema-registry (MockSchemaRegistryClient.) 63 | serdes-config {"schema.registry.url" "http://localhost"} 64 | key-serdes (.
Serdes String) 65 | value-serdes (doto 66 | (GenericAvroSerde. schema-registry) 67 | (.configure serdes-config false)) 68 | key-serializer (.serializer key-serdes) 69 | key-deserializer (.deserializer key-serdes) 70 | value-serializer (.serializer value-serdes) 71 | value-deserializer (.deserializer value-serdes) 72 | ^Topology topology (.build (sut/repayment-transaction-topology 73 | key-serdes 74 | value-serdes)) 75 | topology-test-driver (TopologyTestDriver. topology (support/properties "repayments-application")) 76 | repayment-topic (.createInputTopic topology-test-driver "repayment" key-serializer value-serializer) 77 | transaction-topic (.createInputTopic topology-test-driver "transaction" key-serializer value-serializer) 78 | processed-repayment-topic (.createOutputTopic topology-test-driver "processed-repayment" key-deserializer value-deserializer) 79 | ] 80 | 81 | (.pipeInput transaction-topic transaction-id (build-transaction-record transaction (build-transaction-schema))) 82 | (.pipeInput repayment-topic repayment-id (build-repayment-record repayment (build-repayment-schema))) 83 | 84 | (let [^KeyValue output (.readKeyValue processed-repayment-topic)] 85 | (is (= repayment-id (.key output))) 86 | (is (= 10241241 (.get (.value output) "account"))) 87 | (is (= repayment-id (.toString (.get (.value output) "id")))) 88 | (is (= 10 (.get (.value output) "repayment_amount"))) 89 | (is (= 10 (.get (.value output) "transaction_amount")))) 90 | (.close topology-test-driver)))) 91 | -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/kstream_kstream_inner_join_example_test.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-kstream-inner-join-example-test 2 | (:require [kafka-streams-example.kstream-kstream-inner-join-example :as sut] 3 | [clojure.test :refer [deftest testing is]] 4 | 
[kafka-streams-example.test-support :as support]) 5 | (:import org.apache.kafka.common.serialization.Serdes 6 | [org.apache.kafka.streams TopologyTestDriver Topology] 7 | (java.util Properties NoSuchElementException))) 8 | 9 | ;; https://cwiki.apache.org/confluence/display/KAFKA/Kafka+Streams+Join+Semantics 10 | (def ^Properties properties (support/properties "click-impressions-application")) 11 | 12 | (deftest kafka-streams-example-streaming-inner-join-test 13 | (testing "Joining two KafkaStreams with a joining window with input from each side" 14 | (let [^Topology topology (.build (sut/builder-streaming-join-topology)) 15 | topology-test-driver (TopologyTestDriver. topology properties) 16 | serializer (.serializer (. Serdes String)) 17 | deserializer (.deserializer (. Serdes String)) 18 | ad-clicks-topic (.createInputTopic topology-test-driver "adClicks" serializer serializer) 19 | ad-impressions-topic (.createInputTopic topology-test-driver "adImpressions" serializer serializer) 20 | output-topic (.createOutputTopic topology-test-driver "output-topic" deserializer deserializer)] 21 | (.pipeInput ad-clicks-topic "newspaper-advertisement" "1") 22 | (.pipeInput ad-impressions-topic "newspaper-advertisement" "football-advert") 23 | (let [output (.readKeyValue output-topic)] 24 | (is (= "newspaper-advertisement" (.key output))) 25 | (is (= "football-advert/1" (.value output)))) 26 | (.close topology-test-driver)))) 27 | 28 | (deftest kafka-streams-example-streaming-inner-join-only-left-side-present-test 29 | (testing "Joining two KafkaStreams with a joining window with input from the left side 30 | as it is an inner join it will wait for both sides so will return nothing" 31 | (let [^Topology topology (.build (sut/builder-streaming-join-topology)) 32 | topology-test-driver (TopologyTestDriver. topology properties) 33 | serializer (.serializer (. Serdes String)) 34 | deserializer (.deserializer (. Serdes String)) 35 | ad-impressions-topic (.createInputTopic topology-test-driver "adImpressions" serializer serializer) 36 | output-topic (.createOutputTopic topology-test-driver "output-topic" deserializer deserializer)] 37 | (.pipeInput ad-impressions-topic "newspaper-advertisement" "football-advert") 38 | 39 | (let [output (try 40 | (.readKeyValue output-topic) 41 | (catch Exception e e))] 42 | (is (= NoSuchElementException (type output)))) 43 | (.close topology-test-driver)))) 44 | 45 | (deftest kafka-streams-example-streaming-inner-join-only-right-side-present-test 46 | (testing "Joining two KafkaStreams with a joining window with input from the right side 47 | as it is an inner join it will return nothing as it waits for both sides" 48 | (let [^Topology topology (.build (sut/builder-streaming-join-topology)) 49 | topology-test-driver (TopologyTestDriver. topology properties) 50 | serializer (.serializer (. Serdes String)) 51 | deserializer (.deserializer (. Serdes String)) 52 | ad-clicks-topic (.createInputTopic topology-test-driver "adClicks" serializer serializer) 53 | output-topic (.createOutputTopic topology-test-driver "output-topic" deserializer deserializer)] 54 | (.pipeInput ad-clicks-topic "newspaper-advertisement" "1") 55 | 56 | (let [output (try 57 | (.readKeyValue output-topic) 58 | (catch Exception e e))] 59 | (is (= NoSuchElementException (type output)))) 60 | (.close topology-test-driver)))) 61 | -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/kstream_kstream_left_join_example_test.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-kstream-left-join-example-test 2 | (:require [kafka-streams-example.kstream-kstream-left-join-example :as sut] 3 | [clojure.test :refer [deftest testing is]] 4 | [kafka-streams-example.test-support :as support]) 5 | (:import org.apache.kafka.common.serialization.Serdes 6
| [org.apache.kafka.streams TopologyTestDriver] 7 | (java.util NoSuchElementException))) 8 | 9 | (def properties (support/properties "click-impressions-application")) 10 | 11 | ;; https://cwiki.apache.org/confluence/display/KAFKA/Kafka+Streams+Join+Semantics 12 | (deftest kafka-streams-example-streaming-left-join-test 13 | (testing "Joining two KafkaStreams with a joining window with input from each side" 14 | (let [topology (.build (sut/builder-streaming-join-topology)) 15 | topology-test-driver (TopologyTestDriver. topology properties) 16 | serializer (.serializer (. Serdes String)) 17 | deserializer (.deserializer (. Serdes String)) 18 | ad-clicks-topic (.createInputTopic topology-test-driver "adClicks" serializer serializer) 19 | ad-impressions-topic (.createInputTopic topology-test-driver "adImpressions" serializer serializer) 20 | output-topic (.createOutputTopic topology-test-driver "output-topic" deserializer deserializer)] 21 | 22 | (.pipeInput ad-impressions-topic "newspaper-advertisement" "football-advert") 23 | 24 | ;; Outputs the record first as it received the left side first 25 | (let [output (.readKeyValue output-topic)] 26 | (is (= "newspaper-advertisement" (.key output))) 27 | (is (= "football-advert/" (.value output)))) 28 | 29 | ;; send the right side and it then joins 30 | (.pipeInput ad-clicks-topic "newspaper-advertisement" "1") 31 | (let [output (.readKeyValue output-topic)] 32 | (is (= "newspaper-advertisement" (.key output))) 33 | (is (= "football-advert/1" (.value output)))) 34 | (.close topology-test-driver)))) 35 | 36 | (deftest kafka-streams-example-streaming-left-join-only-left-side-present-test 37 | (testing "Joining two KafkaStreams with a joining window with input from the left side 38 | as it is a left join the single left record will flow through so will just be shown" 39 | (let [topology (.build (sut/builder-streaming-join-topology)) 40 | topology-test-driver (TopologyTestDriver. topology properties) 41 | serializer (.serializer (. Serdes String)) 42 | deserializer (.deserializer (. Serdes String)) 43 | ad-impressions-topic (.createInputTopic topology-test-driver "adImpressions" serializer serializer) 44 | output-topic (.createOutputTopic topology-test-driver "output-topic" deserializer deserializer)] 45 | 46 | (.pipeInput ad-impressions-topic "newspaper-advertisement" "football-advert") 47 | 48 | (let [output (.readKeyValue output-topic)] 49 | (is (= "newspaper-advertisement" (.key output))) 50 | (is (= "football-advert/" (.value output)))) 51 | (.close topology-test-driver)))) 52 | 53 | (deftest kafka-streams-example-streaming-left-join-only-right-side-present-test 54 | (testing "Joining two KafkaStreams with a joining window with input from the right side 55 | as it is a left join the lone right record will not flow through" 56 | (let [topology (.build (sut/builder-streaming-join-topology)) 57 | topology-test-driver (TopologyTestDriver. topology properties) 58 | serializer (.serializer (. Serdes String)) 59 | deserializer (.deserializer (. Serdes String)) 60 | ad-clicks-topic (.createInputTopic topology-test-driver "adClicks" serializer serializer) 61 | output-topic (.createOutputTopic topology-test-driver "output-topic" deserializer deserializer)] 62 | (.pipeInput ad-clicks-topic "newspaper-advertisement" "1") 63 | 64 | (let [output (try 65 | (.readKeyValue output-topic) 66 | (catch Exception e e))] 67 | (is (= NoSuchElementException (type output)))) 68 | (.close topology-test-driver)))) 69 | -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/kstream_kstream_outer_join_example_test.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-kstream-outer-join-example-test 2 | (:require [kafka-streams-example.kstream-kstream-outer-join-example :as sut] 3 | [clojure.test :refer [deftest is testing]] 4 | [kafka-streams-example.test-support :as support]) 5 | (:import org.apache.kafka.common.serialization.Serdes 6 | [org.apache.kafka.streams TopologyTestDriver Topology] 7 | (java.util Properties))) 8 | 9 | (def ^Properties properties (support/properties "click-impressions-application")) 10 | 11 | ;; https://cwiki.apache.org/confluence/display/KAFKA/Kafka+Streams+Join+Semantics 12 | (deftest kafka-streams-example-streaming-outer-join-test 13 | (testing "Joining two KafkaStreams with a joining window with input from each side 14 | as it is an outer join the first result will be released on its own and 15 | then as a pair.") 16 | (let [^Topology topology (.build (sut/builder-streaming-join-topology)) 17 | topology-test-driver (TopologyTestDriver. topology properties) 18 | serializer (.serializer (. Serdes String)) 19 | deserializer (.deserializer (.
Serdes String)) 20 | ad-clicks-topic (.createInputTopic topology-test-driver "adClicks" serializer serializer) 21 | ad-impressions-topic (.createInputTopic topology-test-driver "adImpressions" serializer serializer) 22 | output-topic (.createOutputTopic topology-test-driver "output-topic" deserializer deserializer)] 23 | (.pipeInput ad-clicks-topic "newspaper-advertisement" "1") 24 | (.pipeInput ad-impressions-topic "newspaper-advertisement" "football-advert") 25 | 26 | ;; Will release first input immediately as outer join 27 | (let [output (.readKeyValue output-topic)] 28 | (is (= "newspaper-advertisement" (.key output))) 29 | (is (= "/1" (.value output)))) 30 | 31 | (let [output (.readKeyValue output-topic)] 32 | (is (= "newspaper-advertisement" (.key output))) 33 | (is (= "football-advert/1" (.value output)))) 34 | (.close topology-test-driver))) 35 | 36 | (deftest kafka-streams-example-streaming-outer-join-only-left-side-present-test 37 | (testing "Joining two KafkaStreams with a joining window with input from the left side 38 | as it is an outer join the single left record will flow through" 39 | (let [^Topology topology (.build (sut/builder-streaming-join-topology)) 40 | topology-test-driver (TopologyTestDriver. topology properties) 41 | serializer (.serializer (. Serdes String)) 42 | deserializer (.deserializer (. Serdes String)) 43 | ad-impressions-topic (.createInputTopic topology-test-driver "adImpressions" serializer serializer) 44 | output-topic (.createOutputTopic topology-test-driver "output-topic" deserializer deserializer)] 45 | 46 | (.pipeInput ad-impressions-topic "newspaper-advertisement" "football-advert") 47 | 48 | (let [output (.readKeyValue output-topic)] 49 | (is (= "newspaper-advertisement" (.key output))) 50 | (is (= "football-advert/" (.value output)))) 51 | (.close topology-test-driver)))) 52 | 53 | (deftest kafka-streams-example-streaming-outer-join-only-right-side-present-test 54 | (testing "Joining two KafkaStreams with a joining window with input from the right side 55 | as it is an outer join the lone right record will just flow through" 56 | (let [^Topology topology (.build (sut/builder-streaming-join-topology)) 57 | topology-test-driver (TopologyTestDriver. topology properties) 58 | serializer (.serializer (. Serdes String)) 59 | deserializer (.deserializer (. Serdes String)) 60 | ad-clicks-topic (.createInputTopic topology-test-driver "adClicks" serializer serializer) 61 | output-topic (.createOutputTopic topology-test-driver "output-topic" deserializer deserializer)] 62 | 63 | (.pipeInput ad-clicks-topic "newspaper-advertisement" "1") 64 | 65 | (let [output (.readKeyValue output-topic)] 66 | (is (= "newspaper-advertisement" (.key output))) 67 | (is (= "/1" (.value output)))) 68 | (.close topology-test-driver)))) 69 | -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/kstream_reduce_test.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.kstream-reduce-test 2 | (:require [clojure.test :refer :all] 3 | [kafka-streams-example.kstream-reduce :refer [build-group-by-reduce-topology]] 4 | [kafka-streams-example.test-support :as support]) 5 | (:import (org.apache.kafka.streams TopologyTestDriver
KeyValue) 6 | (org.apache.kafka.common.serialization StringSerializer StringDeserializer LongSerializer LongDeserializer))) 7 | 8 | (deftest build-group-by-reduce-topology-test 9 | (testing "Group by a key then reduce the values into a single string" 10 | (let [input-topic-name "input-topic" 11 | output-topic-name "output-topic" 12 | topology (.build (build-group-by-reduce-topology input-topic-name output-topic-name)) 13 | topology-test-driver (TopologyTestDriver. topology (support/properties "reduce-topology")) 14 | input-topic (.createInputTopic topology-test-driver input-topic-name (LongSerializer.) (StringSerializer.)) 15 | output-topic (.createOutputTopic topology-test-driver output-topic-name (LongDeserializer.) (StringDeserializer.)) 16 | input-records [(KeyValue. 456 "stream") 17 | (KeyValue. 123 "hello") 18 | (KeyValue. 123 "world") 19 | (KeyValue. 456 "all") 20 | (KeyValue. 123 "kafka") 21 | (KeyValue. 456 "the") 22 | (KeyValue. 456 "things") 23 | (KeyValue. 123 "streams")] 24 | expected-output {456 "stream all the things" 25 | 123 "hello world kafka streams"}] 26 | (.pipeKeyValueList input-topic input-records) 27 | (is (= expected-output (.readKeyValuesToMap output-topic)))))) 28 | -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/ktable_example_test.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.ktable-example-test 2 | (:require [kafka-streams-example.ktable-example :as sut] 3 | [clojure.test :refer [deftest testing is]] 4 | [kafka-streams-example.test-support :as support]) 5 | (:import org.apache.kafka.common.serialization.Serdes 6 | [org.apache.kafka.streams TopologyTestDriver Topology])) 7 | 8 | (deftest kafka-streams-ktable-example-test 9 | (testing "Kafka Streams with KTABLE" 10 | (let [^Topology topology (.build (sut/build-join-topology)) 11 | topology-test-driver (TopologyTestDriver.
topology (support/properties "user-clicks-application")) 12 | serializer (.serializer (. Serdes String)) 13 | deserializer (.deserializer (. Serdes String)) 14 | user-clicks-topic (.createInputTopic topology-test-driver "user-clicks-topic" serializer serializer) 15 | user-regions-topic (.createInputTopic topology-test-driver "user-regions-topic" serializer serializer) 16 | output-topic (.createOutputTopic topology-test-driver "clicks-per-region-topic" deserializer deserializer) 17 | input-clicks "2" 18 | input-regions "England"] 19 | (.pipeInput user-regions-topic "alice" input-regions) 20 | (.pipeInput user-clicks-topic "alice" input-clicks) 21 | (.pipeInput user-clicks-topic "alice" input-clicks) 22 | (let [output (.readKeyValue output-topic) 23 | output-2 (.readKeyValue output-topic)] 24 | (is (= "England" (.key output))) 25 | (is (= "2" (.value output))) 26 | (is (= "England" (.key output-2))) 27 | (is (= "4" (.value output-2))))))) 28 | -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/processor_api_example_test.clj: -------------------------------------------------------------------------------- 1 | (ns kafka-streams-example.processor-api-example-test 2 | (:require [clojure.test :refer [deftest is testing]] 3 | [kafka-streams-example.processor-api-example :as sut] 4 | [kafka-streams-example.test-support :as support]) 5 | (:import org.apache.kafka.common.serialization.Serdes 6 | [org.apache.kafka.streams TopologyTestDriver])) 7 | 8 | (deftest word-count-test 9 | (testing "A word count processor API test" 10 | (let [topology (sut/word-processor-topology) 11 | topology-test-driver (TopologyTestDriver. topology (support/properties "word-count-application")) 12 | serializer (.serializer (. Serdes String)) 13 | deserializer (.deserializer (.
Serdes String))
14 |           input-topic (.createInputTopic topology-test-driver "source-topic" serializer serializer)
15 |           output-topic (.createOutputTopic topology-test-driver "sink-topic" deserializer deserializer)]
16 | 
17 |       (.pipeInput input-topic "word-entry" "perkss blog rocks clojure rocks kafka rocks")
18 | 
19 |       ;; Expected word counts, read in the order the topology emits them
20 |       (doseq [[expected-word expected-count] [["blog" "1"]
21 |                                               ["clojure" "1"]
22 |                                               ["kafka" "1"]
23 |                                               ["perkss" "1"]
24 |                                               ["rocks" "3"]]]
25 |         (let [output (.readKeyValue output-topic)]
26 |           (is (= expected-word (.key output)))
27 |           (is (= expected-count (.value output)))))
28 | 
29 |       (.close topology-test-driver))))
30 | 
-------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/processor_api_rekey_test.clj: --------------------------------------------------------------------------------
1 | (ns kafka-streams-example.processor-api-rekey-test
2 |   (:require
3 |    [clj-uuid :as uuid]
4 |    [clojure.test :refer [deftest is testing]]
5 |    [kafka-streams-example.serdes.json-serdes :refer [json-serde]]
6 |    [kafka-streams-example.processor-api-rekey :as sut]
7 |    [kafka-streams-example.test-support :as support])
8 |   (:import org.apache.kafka.common.serialization.Serdes
9 |            [org.apache.kafka.streams TopologyTestDriver Topology]))
10 | 
11 | (deftest trade-rekey-to-id-test
12 |   (testing "A test that rekeys the trade messages to trade-id from id"
13 |     (let [topology (sut/trade-rekey-topology (Topology.))
14 |           key-serializer (.serializer (. Serdes String))
15 |           key-deserializer (.deserializer (.
Serdes String))
16 |           value-serializer (.serializer json-serde)
17 |           value-deserializer (.deserializer json-serde)
18 |           topology-test-driver (TopologyTestDriver. topology (support/properties "trade-rekey-application"))
19 |           input-topic (.createInputTopic topology-test-driver "trade-input-topic" key-serializer value-serializer)
20 |           output-topic (.createOutputTopic topology-test-driver "trades-by-trade-id" key-deserializer value-deserializer)
21 |           trade-msg {:id (str (uuid/v4))
22 |                      :trade-id (str (uuid/v4))
23 |                      :buyer "perkss"
24 |                      :seller "stuart"}
25 |           trade-msg-key (:id trade-msg)
26 |           new-trade-msg-key (:trade-id trade-msg)]
27 | 
28 |       ;; Send in keyed by :id
29 |       (.pipeInput input-topic trade-msg-key trade-msg)
30 | 
31 |       ;; The new key is the trade-id
32 |       (let [output (.readKeyValue output-topic)]
33 |         (is (= new-trade-msg-key (.key output)))
34 |         (is (= trade-msg (.value output))))
35 | 
36 |       (.close topology-test-driver)))) -------------------------------------------------------------------------------- /kafka-streams-example/test/kafka_streams_example/test_support.clj: --------------------------------------------------------------------------------
1 | (ns kafka-streams-example.test-support
2 |   (:import (java.util Properties)
3 |            (org.apache.kafka.streams StreamsConfig)
4 |            (org.apache.kafka.common.serialization Serdes)
5 |            (org.apache.kafka.test TestUtils)))
6 | 
7 | (defn ^Properties properties [application-name]
8 |   (let [properties (Properties.)]
9 |     (.put properties StreamsConfig/APPLICATION_ID_CONFIG application-name)
10 |     (.put properties StreamsConfig/BOOTSTRAP_SERVERS_CONFIG "dummy:9092")
11 |     (.put properties StreamsConfig/DEFAULT_KEY_SERDE_CLASS_CONFIG (.getName (.getClass (Serdes/String))))
12 |     (.put properties StreamsConfig/DEFAULT_VALUE_SERDE_CLASS_CONFIG (.getName (.getClass (Serdes/String))))
13 |     (.put properties StreamsConfig/COMMIT_INTERVAL_MS_CONFIG (* 10 1000))
14 |     (.put properties StreamsConfig/STATE_DIR_CONFIG (.getAbsolutePath (.
TestUtils tempDirectory))) 15 | properties)) --------------------------------------------------------------------------------