├── LICENSE ├── README.md ├── example.jpg ├── pictures ├── Model_Inference_Stream_Processing_vs_Request_Response.png └── TensorFlow_Serving_Architecture.svg ├── pom.xml └── src └── main ├── java └── com │ └── github │ └── megachucky │ └── kafka │ └── streams │ └── machinelearning │ ├── InceptionBlockingStub.java │ ├── InceptionInference.java │ ├── Kafka_Streams_TensorFlow_Serving_gRPC_Example.java │ └── TensorflowObjectRecogniser.java └── resources ├── example.jpg └── log4j.properties /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 
29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 
61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. 
In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "{}" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. 
We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright {yyyy} {name of copyright owner} 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # TensorFlow Serving + gRPC + Java + Kafka Streams 2 | This project contains a demo to do **model inference with Apache Kafka, Kafka Streams and a TensorFlow model deployed using [TensorFlow Serving](https://www.tensorflow.org/serving/)**. The concepts are very similar for other ML frameworks and Cloud Providers, e.g. you could also use [Google Cloud ML Engine](https://cloud.google.com/ml-engine/docs/tensorflow/deploying-models) for TensorFlow (which uses TensorFlow Serving under the hood) or Apache MXNet and [AWS model server](https://github.com/awslabs/mxnet-model-server). 3 | 4 | Most ML servers for model serving are also extendible to serve other types of models and data, e.g. you could also deploy non-TensorFlow models to TensorFlow Serving. Many ML servers are available as cloud service and for local deployment. 5 | 6 | ## Model Serving: Stream Processing vs. 
Request Response 7 | Some background on model serving alternatives: 8 | 9 | Machine Learning / Deep Learning models can be used in different ways to make predictions. The preferred way is to deploy an analytic model directly into a stream processing application (like [Kafka Streams](https://kafka.apache.org/documentation/streams/)). You could e.g. use the [TensorFlow for Java API](https://www.tensorflow.org/install/install_java). This allows the best latency and independence from external services. Examples here: [Model Inference within Kafka Streams Microservices using TensorFlow, H2O.ai, Deeplearning4j](https://github.com/kaiwaehner/kafka-streams-machine-learning-examples). 10 | 11 | However, direct deployment of models is not always a feasible approach. Sometimes it makes sense, or is even necessary, to deploy a model in another serving infrastructure like TensorFlow Serving for TensorFlow models. Model inference is then done via RPC / request-response communication. Organisational or technical reasons might force this approach. Or you might want to leverage the built-in features for managing and versioning different models in the model server. This GitHub project shows an **example of how to access a model serving infrastructure from a stream processing microservice leveraging Apache Kafka and Kafka Streams**. 12 | 13 | ![Model Serving: Stream Processing vs. Request Response](pictures/Model_Inference_Stream_Processing_vs_Request_Response.png) 14 | 15 | **Pros of an external model serving infrastructure like TensorFlow Serving:** 16 | - Simple integration with existing technologies and organizational processes 17 | - Easier to understand if you come from a non-streaming world 18 | - Later migration to real streaming is also possible 19 | - Model management built in for deployment of different models, including versioning, A/B testing, etc. 
20 | 21 | **Cons:** 22 | - Often tied to specific ML technologies 23 | - Often tied to a specific cloud provider and introduces lock-in 24 | - Worse latency due to a remote call instead of local inference 25 | - More complex security setups (remote communication through firewalls and authorization management) 26 | - No offline inference (devices, edge processing, etc.) 27 | - Coupling the availability, scalability, and latency / throughput of your stream processing application with the SLAs of the RPC interface 28 | - Side effects (e.g. in case of failure) not covered by Kafka processing guarantees (e.g. exactly-once semantics). 29 | 30 | 31 | 32 | ### TensorFlow Serving 33 | Let’s discuss TensorFlow Serving briefly. It can be used to host your trained analytic models. As with most model servers, you do inference via a request-response paradigm. gRPC and REST / HTTP are the two common technologies and concepts used. 34 | 35 | The blog post "[How to deploy TensorFlow models to production using TF Serving](https://medium.freecodecamp.org/how-to-deploy-tensorflow-models-to-production-using-tf-serving-4b4b78d41700)" is a great explanation of how to export and deploy trained TensorFlow models to a TensorFlow Serving infrastructure. You can either deploy your own infrastructure anywhere or leverage a cloud service like Google Cloud ML Engine. A [SavedModel](https://www.tensorflow.org/programmers_guide/saved_model#build_and_load_a_savedmodel) is TensorFlow's recommended format for saving models, and it is the required format for deploying trained TensorFlow models using TensorFlow Serving or deploying on Google Cloud ML Engine. 
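As mentioned above, REST / HTTP is the second common way to talk to a model server besides gRPC (this demo uses gRPC). TensorFlow Serving's REST API expects a JSON body like `{"instances": [...]}`, with binary tensors such as a JPEG sent base64-encoded under a `"b64"` key. A minimal JDK-only sketch of building such a request body — the class and method names here are illustrative, not part of this project:

```java
import java.util.Base64;

public class RestPredictSketch {

    // Builds the JSON body for TensorFlow Serving's REST predict endpoint,
    // e.g. POST http://localhost:8501/v1/models/<model>:predict
    // Binary input tensors are base64-encoded under the "b64" key.
    static String buildPredictBody(byte[] jpegBytes) {
        String b64 = Base64.getEncoder().encodeToString(jpegBytes);
        return "{\"instances\": [{\"b64\": \"" + b64 + "\"}]}";
    }

    public static void main(String[] args) {
        // In a real client, jpegBytes would come from Files.readAllBytes(...)
        String body = buildPredictBody(new byte[] {1, 2, 3});
        System.out.println(body);
    }
}
```

The gRPC path used in this project skips JSON entirely and sends the raw bytes inside a Protobuf message instead, which is one reason it tends to be faster.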
36 | 37 | The core architecture is described in detail in [TensorFlow Serving's architecture overview](https://www.tensorflow.org/serving/architecture_overview): 38 | 39 | ![TensorFlow Serving Architecture](pictures/TensorFlow_Serving_Architecture.svg) 40 | 41 | This architecture allows deployment and management of different models and versions of these models, including additional features like A/B testing. In the following demo, we just deploy one single TensorFlow model for image recognition (based on the famous Inception neural network). 42 | 43 | ## Demo: Mixing Stream Processing with RPC: TensorFlow Serving + Kafka Streams 44 | 45 | ### Requirements 46 | - Install Java 8 47 | - Install Docker 48 | - Kafka Streams API 2.2 (configured in pom.xml and downloaded via Maven build; Kafka Streams API 2.1, 2.0 and 1.1 are also compatible) 49 | 50 | The code is developed and tested on Mac and Linux operating systems. As Kafka does not support Windows well, this has not been tested there at all. 51 | 52 | 53 | ### Things to do 54 | 1. Install and start an ML serving engine 55 | 2. Deploy a prebuilt TensorFlow model 56 | 3. Create a Kafka cluster 57 | 4. Implement the Kafka Streams application 58 | 5. Deploy the Kafka Streams application (e.g. locally on a laptop or to a Kubernetes cluster) 59 | 6. Generate streaming data to test the combination of Kafka Streams and TensorFlow Serving 60 | 61 | ### Step 1: Create a TensorFlow model and export it to 'SavedModel' format 62 | I simply added an existing pretrained image recognition model built with TensorFlow. You just need to export a model using TensorFlow's API and then use the exported folder. TensorFlow uses Protobuf to store the model graph and adds variables for the weights of the neural network. 63 | 64 | Google ML Engine shows how to create a simple TensorFlow model for predictions on census data using the "[ML Engine getting started guide](https://cloud.google.com/ml-engine/docs/tensorflow/getting-started-training-prediction)". 
In a second step, you can build a more advanced example for image recognition using transfer learning, following the guide "[Image Classification using Flowers dataset](https://cloud.google.com/ml-engine/docs/tensorflow/flowers-tutorial)". 65 | 66 | You can also combine cloud and local services, e.g. build the analytic model with Google ML Engine and then deploy it locally using TensorFlow Serving, as we do here. 67 | 68 | ### Step 2: Install and start TensorFlow Serving server + deploy model 69 | Different options are available. Installing TensorFlow Serving on a Mac is still a pain as of mid-2018. apt-get works much more easily on Linux operating systems. Unfortunately, there is nothing like a 'brew' command or a simple zip file you can use on a Mac. Alternatives: 70 | 71 | - You can **build the project and compile everything using the [Bazel build system](https://bazel.build/)** - which literally takes forever (on my laptop), i.e. many hours. 72 | 73 | - **Install and run TensorFlow Serving via a [Docker container](https://www.tensorflow.org/serving/docker)** 74 | 75 | docker build --pull -t tensorflow-serving-devel -f Dockerfile.devel . 76 | 77 | docker run -it tensorflow-serving-devel 78 | 79 | git clone --recurse-submodules https://github.com/tensorflow/serving 80 | cd serving/tensorflow 81 | ./configure 82 | cd .. 83 | bazel test tensorflow_serving/... 84 | This also requires building the project. In addition, the documentation is not very good and outdated. 85 | 86 | - **Preferred option for beginners => Use a prebuilt Docker container with TensorFlow Serving**. I used an [example from Thamme Gowda](https://github.com/thammegowda/tensorflow-grpc-java). Kudos to him for building a project which not only contains the TensorFlow Serving Docker image, but also shows an example of how to do gRPC communication between a Java application and TensorFlow Serving. 
87 | 88 | **Pull and start the container with TensorFlow Serving preinstalled (forward port 9000)** 89 | 90 | docker run -it -p 9000:9000 tgowda/inception_serving_tika 91 | 92 | **Inside the container, start the TensorFlow Serving server - this deploys the TensorFlow model for image recognition** 93 | 94 | root@8311ea4e8074:/# /serving/server.sh 95 | 96 | If you want to deploy your own model, read the guide "[Deploy TensorFlow model to TensorFlow serving](https://www.tensorflow.org/programmers_guide/saved_model#load_and_serve_a_savedmodel_in_tensorflow_serving)". Or to use a cloud service, e.g. take a look at "[Getting Started with Google ML Engine](https://cloud.google.com/ml-engine/docs/tensorflow/deploying-models)". 97 | 98 | ### Step 3: Create Kafka Cluster and Kafka topics 99 | Create a local Kafka environment (Apache Kafka broker + Zookeeper). The easiest way is the open source [Confluent CLI](https://github.com/confluentinc/confluent-cli) - which is also part of Confluent Open Source and Confluent Enterprise Platform: 100 | 101 | confluent start kafka 102 | 103 | You can also create a cluster using Kafka as a Service. The best option is [Confluent Cloud - Apache Kafka as a Service](https://www.confluent.io/confluent-cloud/). You can choose between Confluent Cloud Professional for "playing around" or Confluent Cloud Enterprise on AWS, GCP or Azure for mission-critical deployments, including a 99.95% SLA and very large scale up to 2 GByte/second throughput. The third option is to connect to your existing Kafka cluster on premise or in the cloud (note that you need to change the broker URL and port in the Kafka Streams Java code before building the project). 
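The broker change mentioned above boils down to the `bootstrap.servers` setting in the Kafka Streams configuration. A minimal sketch of the relevant properties — the helper class and method names are made up for illustration, but `bootstrap.servers` and `application.id` are the standard Kafka Streams configuration keys:

```java
import java.util.Properties;

public class StreamsConfigSketch {

    // Returns the Kafka Streams settings you would change when pointing the
    // application at a remote cluster instead of the default localhost:9092.
    static Properties remoteConfig(String brokerUrl) {
        Properties props = new Properties();
        // Unique id of this Kafka Streams application within the cluster
        props.put("application.id", "kafka-streams-tensorflow-serving-grpc-example");
        // Comma-separated list of broker host:port pairs, e.g. "broker1:9092,broker2:9092"
        props.put("bootstrap.servers", brokerUrl);
        return props;
    }

    public static void main(String[] args) {
        Properties props = remoteConfig("my-remote-broker:9092");
        System.out.println(props.getProperty("bootstrap.servers"));
    }
}
```

In the real application these properties would be passed to the `KafkaStreams` constructor; only the broker list needs to change for a remote or cloud cluster.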
104 | 105 | Next, create the two Kafka topics for this example ('ImageInputTopic' for URLs to the image and 'ImageOutputTopic' for the prediction result): 106 | 107 | kafka-topics --bootstrap-server localhost:9092 --create --topic ImageInputTopic --partitions 3 --replication-factor 1 108 | 109 | kafka-topics --bootstrap-server localhost:9092 --create --topic ImageOutputTopic --partitions 3 --replication-factor 1 110 | 111 | ### Step 4: Build and deploy Kafka Streams app + send test messages 112 | The Kafka Streams microservice [Kafka_Streams_TensorFlow_Serving_gRPC_Example](https://github.com/kaiwaehner/tensorflow-serving-java-grpc-kafka-streams/blob/master/src/main/java/com/github/megachucky/kafka/streams/machinelearning/Kafka_Streams_TensorFlow_Serving_gRPC_Example.java) is the Kafka Streams Java client. The microservice uses gRPC and Protobuf for request-response communication with the TensorFlow Serving server to do model inference and predict the content of the image. Note that the Java client does not need any TensorFlow APIs, just gRPC interfaces. 113 | 114 | Let's build the project: 115 | 116 | mvn clean package 117 | 118 | This example executes a Java main method, i.e. it starts a local Java process running the Kafka Streams microservice. It continuously waits for new events arriving at 'ImageInputTopic', does a model inference (via gRPC call to TensorFlow Serving) and then sends the prediction to 'ImageOutputTopic' - all in real time within milliseconds. 119 | 120 | java -cp target/tensorflow-serving-java-grpc-kafka-streams-1.0-jar-with-dependencies.jar com.github.megachucky.kafka.streams.machinelearning.Kafka_Streams_TensorFlow_Serving_gRPC_Example 121 | 122 | In the same way, you could deploy this Kafka Streams microservice anywhere - including Kubernetes (e.g. an on-premise OpenShift cluster or Google Kubernetes Engine), Mesosphere, Amazon ECS or even in a Java EE app - and scale it up and down dynamically. 123 | 124 | Now send messages, e.g. 
with kafkacat... 125 | 126 | echo -e "src/main/resources/example.jpg" | kafkacat -b localhost:9092 -P -t ImageInputTopic 127 | 128 | ... and consume predictions (start the consumer first if you want to see the 'real time' processing): 129 | 130 | kafka-console-consumer --bootstrap-server localhost:9092 --topic ImageOutputTopic --from-beginning 131 | 132 | To stop everything, *stop the Docker container with TensorFlow Serving*, *stop the Kafka consumer*, *shut down the Kafka Streams application* and finally stop Kafka using the Confluent CLI (which also deletes all configuration and topics): 133 | 134 | confluent destroy 135 | -------------------------------------------------------------------------------- /example.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kaiwaehner/tensorflow-serving-java-grpc-kafka-streams/71f93e0de13cc5ef752bc0c3e3beb7e753a55cc9/example.jpg -------------------------------------------------------------------------------- /pictures/Model_Inference_Stream_Processing_vs_Request_Response.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kaiwaehner/tensorflow-serving-java-grpc-kafka-streams/71f93e0de13cc5ef752bc0c3e3beb7e753a55cc9/pictures/Model_Inference_Stream_Processing_vs_Request_Response.png -------------------------------------------------------------------------------- /pom.xml: --------------------------------------------------------------------------------
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.github.kaiwaehner.kafka.streams.machinelearning</groupId>
    <artifactId>tensorflow-serving-java-grpc-kafka-streams</artifactId>
    <version>1.0</version>

    <repositories>
        <repository>
            <id>confluent</id>
            <url>http://packages.confluent.io/maven/</url>
        </repository>
    </repositories>

    <properties>
        <exec.mainClass>com.github.megachucky.kafka.streams.machinelearning.Kafka_Streams_TensorFlow_Serving_gRPC_Example</exec.mainClass>
        <grpc.version>1.13.1</grpc.version>
        <protobuf.version>3.6.0</protobuf.version>
        <java.version>1.8</java.version>
        <kafka.version>2.2.0</kafka.version>
        <kafka.scala.version>2.11</kafka.scala.version>
        <scala.version>${kafka.scala.version}.8</scala.version>
        <confluent.version>5.2.0</confluent.version>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-streams</artifactId>
            <version>${kafka.version}</version>
        </dependency>

        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>${protobuf.version}</version>
        </dependency>

        <dependency>
            <groupId>io.grpc</groupId>
            <artifactId>grpc-all</artifactId>
            <version>${grpc.version}</version>
        </dependency>

        <dependency>
            <groupId>io.grpc</groupId>
            <artifactId>grpc-protobuf</artifactId>
            <version>${grpc.version}</version>
        </dependency>

        <dependency>
            <groupId>io.grpc</groupId>
            <artifactId>grpc-stub</artifactId>
            <version>${grpc.version}</version>
        </dependency>

        <dependency>
            <groupId>io.grpc</groupId>
            <artifactId>grpc-netty</artifactId>
            <version>${grpc.version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.6.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.5.2</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <addClasspath>true</addClasspath>
                            <mainClass>${exec.mainClass}</mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>assemble-all</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
-------------------------------------------------------------------------------- /src/main/java/com/github/megachucky/kafka/streams/machinelearning/InceptionBlockingStub.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Licensed to the Apache Software Foundation (ASF) under one or more 3 | * contributor license agreements. See the NOTICE file distributed with 4 | * this work for additional information regarding copyright ownership. 5 | * The ASF licenses this file to You under the Apache License, Version 2.0 6 | * (the "License"); you may not use this file except in compliance with 7 | * the License. You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | */ 17 | 18 | package com.github.megachucky.kafka.streams.machinelearning; 19 | 20 | import io.grpc.CallOptions; 21 | import io.grpc.Channel; 22 | import io.grpc.ExperimentalApi; 23 | import io.grpc.MethodDescriptor; 24 | import io.grpc.MethodDescriptor.MethodType; 25 | import io.grpc.protobuf.ProtoUtils; 26 | import io.grpc.stub.AbstractStub; 27 | 28 | import static com.github.megachucky.kafka.streams.machinelearning.InceptionInference.InceptionRequest; 29 | import static com.github.megachucky.kafka.streams.machinelearning.InceptionInference.InceptionResponse; 30 | import static io.grpc.MethodDescriptor.generateFullMethodName; 31 | import static io.grpc.stub.ClientCalls.blockingUnaryCall; 32 | 33 | /** 34 | * Stub implementation for {@link #SERVICE_NAME} (TensorFlow's Inception service) 35 | */ 36 | public class InceptionBlockingStub extends AbstractStub<InceptionBlockingStub> { 37 | 38 | public static final String SERVICE_NAME = "tensorflow.serving.InceptionService"; 39 | public static final String METHOD_NAME = "Classify"; 40 | 41 | @ExperimentalApi("https://github.com/grpc/grpc-java/issues/1901") 42 | public static final MethodDescriptor<InceptionRequest, InceptionResponse> METHOD_DESCRIPTOR = MethodDescriptor 43 | .create(MethodType.UNARY, 44 | generateFullMethodName(SERVICE_NAME, METHOD_NAME), 45 | ProtoUtils.marshaller(InceptionRequest.getDefaultInstance()), 46 | ProtoUtils.marshaller(InceptionResponse.getDefaultInstance())); 47 | 48 | public InceptionBlockingStub(Channel channel) { 49 | super(channel); 50 | } 51 | 52 | private InceptionBlockingStub(Channel channel, CallOptions callOptions) { 53 | super(channel, callOptions); 54 | } 55 | 56 | @Override 57 | protected InceptionBlockingStub build(Channel channel, 58 | CallOptions callOptions) { 59 | return new InceptionBlockingStub(channel, callOptions); 60 | } 61 | 62 | public InceptionResponse classify(InceptionRequest request) { 63 | return 
blockingUnaryCall(getChannel(), METHOD_DESCRIPTOR, getCallOptions(), 64 | request); 65 | } 66 | } -------------------------------------------------------------------------------- /src/main/java/com/github/megachucky/kafka/streams/machinelearning/InceptionInference.java: -------------------------------------------------------------------------------- 1 | 2 | /* 3 | * Licensed to the Apache Software Foundation (ASF) under one or more 4 | * contributor license agreements. See the NOTICE file distributed with 5 | * this work for additional information regarding copyright ownership. 6 | * The ASF licenses this file to You under the Apache License, Version 2.0 7 | * (the "License"); you may not use this file except in compliance with 8 | * the License. You may obtain a copy of the License at 9 | * 10 | * http://www.apache.org/licenses/LICENSE-2.0 11 | * 12 | * Unless required by applicable law or agreed to in writing, software 13 | * distributed under the License is distributed on an "AS IS" BASIS, 14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 15 | * See the License for the specific language governing permissions and 16 | * limitations under the License. 17 | */ 18 | 19 | // Generated by the protocol buffer compiler. DO NOT EDIT! 
20 | // source: inception_inference.proto 21 | 22 | package com.github.megachucky.kafka.streams.machinelearning; 23 | 24 | import com.google.protobuf.*; 25 | import com.google.protobuf.Descriptors.Descriptor; 26 | import com.google.protobuf.Descriptors.FileDescriptor; 27 | import com.google.protobuf.Descriptors.FileDescriptor.InternalDescriptorAssigner; 28 | import com.google.protobuf.GeneratedMessage.FieldAccessorTable; 29 | 30 | import java.io.IOException; 31 | import java.io.InputStream; 32 | import java.util.ArrayList; 33 | import java.util.Collections; 34 | import java.util.List; 35 | 36 | public final class InceptionInference { 37 | private InceptionInference() { 38 | } 39 | 40 | public static void registerAllExtensions(ExtensionRegistry registry) { 41 | } 42 | 43 | public interface InceptionRequestOrBuilder extends 44 | // @@protoc_insertion_point(interface_extends:tensorflow.serving.InceptionRequest) 45 | MessageOrBuilder { 46 | 47 | /** 48 | *
  49 |      * JPEG encoded stream of the image to be classified.
  50 |      * 
  51 |      *
52 | * optional bytes jpeg_encoded = 1; 53 | */ 54 | ByteString getJpegEncoded(); 55 | } 56 | 57 | /** 58 | * Protobuf type {@code tensorflow.serving.InceptionRequest} 59 | */ 60 | public static final class InceptionRequest extends GeneratedMessage implements 61 | // @@protoc_insertion_point(message_implements:tensorflow.serving.InceptionRequest) 62 | InceptionRequestOrBuilder { 63 | // Use InceptionRequest.newBuilder() to construct. 64 | private InceptionRequest(GeneratedMessage.Builder builder) { 65 | super(builder); 66 | } 67 | 68 | private InceptionRequest() { 69 | jpegEncoded_ = ByteString.EMPTY; 70 | } 71 | 72 | @Override 73 | public final UnknownFieldSet getUnknownFields() { 74 | return UnknownFieldSet.getDefaultInstance(); 75 | } 76 | 77 | private InceptionRequest(CodedInputStream input, 78 | ExtensionRegistryLite extensionRegistry) 79 | throws InvalidProtocolBufferException { 80 | this(); 81 | int mutable_bitField0_ = 0; 82 | try { 83 | boolean done = false; 84 | while (!done) { 85 | int tag = input.readTag(); 86 | switch (tag) { 87 | case 0: 88 | done = true; 89 | break; 90 | default: { 91 | if (!input.skipField(tag)) { 92 | done = true; 93 | } 94 | break; 95 | } 96 | case 10: { 97 | 98 | jpegEncoded_ = input.readBytes(); 99 | break; 100 | } 101 | } 102 | } 103 | } catch (InvalidProtocolBufferException e) { 104 | throw e.setUnfinishedMessage(this); 105 | } catch (IOException e) { 106 | throw new InvalidProtocolBufferException(e).setUnfinishedMessage(this); 107 | } finally { 108 | makeExtensionsImmutable(); 109 | } 110 | } 111 | 112 | public static final Descriptor getDescriptor() { 113 | return InceptionInference.internal_static_tensorflow_serving_InceptionRequest_descriptor; 114 | } 115 | 116 | protected FieldAccessorTable internalGetFieldAccessorTable() { 117 | return InceptionInference.internal_static_tensorflow_serving_InceptionRequest_fieldAccessorTable 118 | .ensureFieldAccessorsInitialized(InceptionRequest.class, 119 | Builder.class); 120 | } 121 | 
122 | public static final int JPEG_ENCODED_FIELD_NUMBER = 1; 123 | private ByteString jpegEncoded_; 124 | 125 | /** 126 | *

 127 |      * JPEG encoded stream of the image to be classified.
 128 |      * 
 129 |      *
130 | * optional bytes jpeg_encoded = 1; 131 | */ 132 | public ByteString getJpegEncoded() { 133 | return jpegEncoded_; 134 | } 135 | 136 | private byte memoizedIsInitialized = -1; 137 | 138 | public final boolean isInitialized() { 139 | byte isInitialized = memoizedIsInitialized; 140 | if (isInitialized == 1) 141 | return true; 142 | if (isInitialized == 0) 143 | return false; 144 | 145 | memoizedIsInitialized = 1; 146 | return true; 147 | } 148 | 149 | public void writeTo(CodedOutputStream output) throws IOException { 150 | if (!jpegEncoded_.isEmpty()) { 151 | output.writeBytes(1, jpegEncoded_); 152 | } 153 | } 154 | 155 | public int getSerializedSize() { 156 | int size = memoizedSize; 157 | if (size != -1) 158 | return size; 159 | 160 | size = 0; 161 | if (!jpegEncoded_.isEmpty()) { 162 | size += CodedOutputStream.computeBytesSize(1, jpegEncoded_); 163 | } 164 | memoizedSize = size; 165 | return size; 166 | } 167 | 168 | private static final long serialVersionUID = 0L; 169 | 170 | public static InceptionRequest parseFrom(ByteString data) 171 | throws InvalidProtocolBufferException { 172 | return PARSER.parseFrom(data); 173 | } 174 | 175 | public static InceptionRequest parseFrom(ByteString data, 176 | ExtensionRegistryLite extensionRegistry) 177 | throws InvalidProtocolBufferException { 178 | return PARSER.parseFrom(data, extensionRegistry); 179 | } 180 | 181 | public static InceptionRequest parseFrom(byte[] data) 182 | throws InvalidProtocolBufferException { 183 | return PARSER.parseFrom(data); 184 | } 185 | 186 | public static InceptionRequest parseFrom(byte[] data, 187 | ExtensionRegistryLite extensionRegistry) 188 | throws InvalidProtocolBufferException { 189 | return PARSER.parseFrom(data, extensionRegistry); 190 | } 191 | 192 | public static InceptionRequest parseFrom(InputStream input) 193 | throws IOException { 194 | return GeneratedMessage.parseWithIOException(PARSER, input); 195 | } 196 | 197 | public static InceptionRequest parseFrom(InputStream 
input, 198 | ExtensionRegistryLite extensionRegistry) throws IOException { 199 | return GeneratedMessage 200 | .parseWithIOException(PARSER, input, extensionRegistry); 201 | } 202 | 203 | public static InceptionRequest parseDelimitedFrom(InputStream input) 204 | throws IOException { 205 | return GeneratedMessage.parseDelimitedWithIOException(PARSER, input); 206 | } 207 | 208 | public static InceptionRequest parseDelimitedFrom(InputStream input, 209 | ExtensionRegistryLite extensionRegistry) throws IOException { 210 | return GeneratedMessage 211 | .parseDelimitedWithIOException(PARSER, input, extensionRegistry); 212 | } 213 | 214 | public static InceptionRequest parseFrom(CodedInputStream input) 215 | throws IOException { 216 | return GeneratedMessage.parseWithIOException(PARSER, input); 217 | } 218 | 219 | public static InceptionRequest parseFrom(CodedInputStream input, 220 | ExtensionRegistryLite extensionRegistry) throws IOException { 221 | return GeneratedMessage 222 | .parseWithIOException(PARSER, input, extensionRegistry); 223 | } 224 | 225 | public Builder newBuilderForType() { 226 | return newBuilder(); 227 | } 228 | 229 | public static Builder newBuilder() { 230 | return DEFAULT_INSTANCE.toBuilder(); 231 | } 232 | 233 | public static Builder newBuilder(InceptionRequest prototype) { 234 | return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype); 235 | } 236 | 237 | public Builder toBuilder() { 238 | return this == DEFAULT_INSTANCE ? 
239 | new Builder() : 240 | new Builder().mergeFrom(this); 241 | } 242 | 243 | @Override 244 | protected Builder newBuilderForType(BuilderParent parent) { 245 | Builder builder = new Builder(parent); 246 | return builder; 247 | } 248 | 249 | /** 250 | * Protobuf type {@code tensorflow.serving.InceptionRequest} 251 | */ 252 | public static final class Builder extends GeneratedMessage.Builder 253 | implements 254 | // @@protoc_insertion_point(builder_implements:tensorflow.serving.InceptionRequest) 255 | InceptionRequestOrBuilder { 256 | public static final Descriptor getDescriptor() { 257 | return InceptionInference.internal_static_tensorflow_serving_InceptionRequest_descriptor; 258 | } 259 | 260 | protected FieldAccessorTable internalGetFieldAccessorTable() { 261 | return InceptionInference.internal_static_tensorflow_serving_InceptionRequest_fieldAccessorTable 262 | .ensureFieldAccessorsInitialized(InceptionRequest.class, 263 | Builder.class); 264 | } 265 | 266 | // Construct using tensorflow.serving.InceptionInference.InceptionRequest.newBuilder() 267 | private Builder() { 268 | maybeForceBuilderInitialization(); 269 | } 270 | 271 | private Builder(BuilderParent parent) { 272 | super(parent); 273 | maybeForceBuilderInitialization(); 274 | } 275 | 276 | private void maybeForceBuilderInitialization() { 277 | if (GeneratedMessage.alwaysUseFieldBuilders) { 278 | } 279 | } 280 | 281 | public Builder clear() { 282 | super.clear(); 283 | jpegEncoded_ = ByteString.EMPTY; 284 | 285 | return this; 286 | } 287 | 288 | public Descriptor getDescriptorForType() { 289 | return InceptionInference.internal_static_tensorflow_serving_InceptionRequest_descriptor; 290 | } 291 | 292 | public InceptionRequest getDefaultInstanceForType() { 293 | return getDefaultInstance(); 294 | } 295 | 296 | public InceptionRequest build() { 297 | InceptionRequest result = buildPartial(); 298 | if (!result.isInitialized()) { 299 | throw newUninitializedMessageException(result); 300 | } 301 | return 
result; 302 | } 303 | 304 | public InceptionRequest buildPartial() { 305 | InceptionRequest result = new InceptionRequest(this); 306 | result.jpegEncoded_ = jpegEncoded_; 307 | onBuilt(); 308 | return result; 309 | } 310 | 311 | public Builder mergeFrom(Message other) { 312 | if (other instanceof InceptionRequest) { 313 | return mergeFrom((InceptionRequest) other); 314 | } else { 315 | super.mergeFrom(other); 316 | return this; 317 | } 318 | } 319 | 320 | public Builder mergeFrom(InceptionRequest other) { 321 | if (other == getDefaultInstance()) 322 | return this; 323 | if (other.getJpegEncoded() != ByteString.EMPTY) { 324 | setJpegEncoded(other.getJpegEncoded()); 325 | } 326 | onChanged(); 327 | return this; 328 | } 329 | 330 | public final boolean isInitialized() { 331 | return true; 332 | } 333 | 334 | public Builder mergeFrom(CodedInputStream input, 335 | ExtensionRegistryLite extensionRegistry) throws IOException { 336 | InceptionRequest parsedMessage = null; 337 | try { 338 | parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry); 339 | } catch (InvalidProtocolBufferException e) { 340 | parsedMessage = (InceptionRequest) e.getUnfinishedMessage(); 341 | throw e.unwrapIOException(); 342 | } finally { 343 | if (parsedMessage != null) { 344 | mergeFrom(parsedMessage); 345 | } 346 | } 347 | return this; 348 | } 349 | 350 | private ByteString jpegEncoded_ = ByteString.EMPTY; 351 | 352 | /** 353 | *

 354 |        * JPEG encoded stream of the image to be classified.
 355 |        * 
 356 |        *
357 | * optional bytes jpeg_encoded = 1; 358 | */ 359 | public ByteString getJpegEncoded() { 360 | return jpegEncoded_; 361 | } 362 | 363 | /** 364 | *

 365 |        * JPEG encoded stream of the image to be classified.
 366 |        * 
 367 |        *
368 | * optional bytes jpeg_encoded = 1; 369 | */ 370 | public Builder setJpegEncoded(ByteString value) { 371 | if (value == null) { 372 | throw new NullPointerException(); 373 | } 374 | 375 | jpegEncoded_ = value; 376 | onChanged(); 377 | return this; 378 | } 379 | 380 | /** 381 | *

 382 |        * JPEG encoded stream of the image to be classified.
 383 |        * 
 384 |        *
385 | * optional bytes jpeg_encoded = 1; 386 | */ 387 | public Builder clearJpegEncoded() { 388 | 389 | jpegEncoded_ = getDefaultInstance().getJpegEncoded(); 390 | onChanged(); 391 | return this; 392 | } 393 | 394 | public final Builder setUnknownFields( 395 | final UnknownFieldSet unknownFields) { 396 | return this; 397 | } 398 | 399 | public final Builder mergeUnknownFields( 400 | final UnknownFieldSet unknownFields) { 401 | return this; 402 | } 403 | 404 | // @@protoc_insertion_point(builder_scope:tensorflow.serving.InceptionRequest) 405 | } 406 | 407 | // @@protoc_insertion_point(class_scope:tensorflow.serving.InceptionRequest) 408 | private static final InceptionRequest DEFAULT_INSTANCE; 409 | 410 | static { 411 | DEFAULT_INSTANCE = new InceptionRequest(); 412 | } 413 | 414 | public static InceptionRequest getDefaultInstance() { 415 | return DEFAULT_INSTANCE; 416 | } 417 | 418 | private static final Parser PARSER = new AbstractParser() { 419 | public InceptionRequest parsePartialFrom(CodedInputStream input, 420 | ExtensionRegistryLite extensionRegistry) 421 | throws InvalidProtocolBufferException { 422 | return new InceptionRequest(input, extensionRegistry); 423 | } 424 | }; 425 | 426 | public static Parser parser() { 427 | return PARSER; 428 | } 429 | 430 | @Override 431 | public Parser getParserForType() { 432 | return PARSER; 433 | } 434 | 435 | public InceptionRequest getDefaultInstanceForType() { 436 | return DEFAULT_INSTANCE; 437 | } 438 | 439 | } 440 | 441 | public interface InceptionResponseOrBuilder extends 442 | // @@protoc_insertion_point(interface_extends:tensorflow.serving.InceptionResponse) 443 | MessageOrBuilder { 444 | 445 | /** 446 | *

 447 |      * Human readable descriptions of the classes, in scores descending order.
 448 |      * 
 449 |      *
450 | * repeated string classes = 3; 451 | */ 452 | ProtocolStringList getClassesList(); 453 | 454 | /** 455 | *

 456 |      * Human readable descriptions of the classes, in scores descending order.
 457 |      * 
 458 |      *
459 | * repeated string classes = 3; 460 | */ 461 | int getClassesCount(); 462 | 463 | /** 464 | *

 465 |      * Human readable descriptions of the classes, in scores descending order.
 466 |      * 
 467 |      *
468 | * repeated string classes = 3; 469 | */ 470 | String getClasses(int index); 471 | 472 | /** 473 | *

 474 |      * Human readable descriptions of the classes, in scores descending order.
 475 |      * 
 476 |      *
477 | * repeated string classes = 3; 478 | */ 479 | ByteString getClassesBytes(int index); 480 | 481 | /** 482 | *

 483 |      * Scores of top matches, in same order as classes.
 484 |      * 
 485 |      *
486 | * repeated float scores = 2; 487 | */ 488 | List getScoresList(); 489 | 490 | /** 491 | *

 492 |      * Scores of top matches, in same order as classes.
 493 |      * 
 494 |      *
495 | * repeated float scores = 2; 496 | */ 497 | int getScoresCount(); 498 | 499 | /** 500 | *

 501 |      * Scores of top matches, in same order as classes.
 502 |      * 
 503 |      *
504 | * repeated float scores = 2; 505 | */ 506 | float getScores(int index); 507 | } 508 | 509 | /** 510 | * Protobuf type {@code tensorflow.serving.InceptionResponse} 511 | */ 512 | public static final class InceptionResponse extends GeneratedMessage 513 | implements 514 | // @@protoc_insertion_point(message_implements:tensorflow.serving.InceptionResponse) 515 | InceptionResponseOrBuilder { 516 | // Use InceptionResponse.newBuilder() to construct. 517 | private InceptionResponse(GeneratedMessage.Builder builder) { 518 | super(builder); 519 | } 520 | 521 | private InceptionResponse() { 522 | classes_ = LazyStringArrayList.EMPTY; 523 | scores_ = Collections.emptyList(); 524 | } 525 | 526 | @Override 527 | public final UnknownFieldSet getUnknownFields() { 528 | return UnknownFieldSet.getDefaultInstance(); 529 | } 530 | 531 | private InceptionResponse(CodedInputStream input, 532 | ExtensionRegistryLite extensionRegistry) 533 | throws InvalidProtocolBufferException { 534 | this(); 535 | int mutable_bitField0_ = 0; 536 | try { 537 | boolean done = false; 538 | while (!done) { 539 | int tag = input.readTag(); 540 | switch (tag) { 541 | case 0: 542 | done = true; 543 | break; 544 | default: { 545 | if (!input.skipField(tag)) { 546 | done = true; 547 | } 548 | break; 549 | } 550 | case 21: { 551 | if (!((mutable_bitField0_ & 0x00000002) == 0x00000002)) { 552 | scores_ = new ArrayList(); 553 | mutable_bitField0_ |= 0x00000002; 554 | } 555 | scores_.add(input.readFloat()); 556 | break; 557 | } 558 | case 18: { 559 | int length = input.readRawVarint32(); 560 | int limit = input.pushLimit(length); 561 | if (!((mutable_bitField0_ & 0x00000002) == 0x00000002) 562 | && input.getBytesUntilLimit() > 0) { 563 | scores_ = new ArrayList(); 564 | mutable_bitField0_ |= 0x00000002; 565 | } 566 | while (input.getBytesUntilLimit() > 0) { 567 | scores_.add(input.readFloat()); 568 | } 569 | input.popLimit(limit); 570 | break; 571 | } 572 | case 26: { 573 | String s = 
input.readStringRequireUtf8(); 574 | if (!((mutable_bitField0_ & 0x00000001) == 0x00000001)) { 575 | classes_ = new LazyStringArrayList(); 576 | mutable_bitField0_ |= 0x00000001; 577 | } 578 | classes_.add(s); 579 | break; 580 | } 581 | } 582 | } 583 | } catch (InvalidProtocolBufferException e) { 584 | throw e.setUnfinishedMessage(this); 585 | } catch (IOException e) { 586 | throw new InvalidProtocolBufferException(e).setUnfinishedMessage(this); 587 | } finally { 588 | if (((mutable_bitField0_ & 0x00000002) == 0x00000002)) { 589 | scores_ = Collections.unmodifiableList(scores_); 590 | } 591 | if (((mutable_bitField0_ & 0x00000001) == 0x00000001)) { 592 | classes_ = classes_.getUnmodifiableView(); 593 | } 594 | makeExtensionsImmutable(); 595 | } 596 | } 597 | 598 | public static final Descriptor getDescriptor() { 599 | return InceptionInference.internal_static_tensorflow_serving_InceptionResponse_descriptor; 600 | } 601 | 602 | protected FieldAccessorTable internalGetFieldAccessorTable() { 603 | return InceptionInference.internal_static_tensorflow_serving_InceptionResponse_fieldAccessorTable 604 | .ensureFieldAccessorsInitialized(InceptionResponse.class, 605 | Builder.class); 606 | } 607 | 608 | public static final int CLASSES_FIELD_NUMBER = 3; 609 | private LazyStringList classes_; 610 | 611 | /** 612 | *

 613 |      * Human readable descriptions of the classes, in scores descending order.
 614 |      * 
 615 |      *
616 | * repeated string classes = 3; 617 | */ 618 | public ProtocolStringList getClassesList() { 619 | return classes_; 620 | } 621 | 622 | /** 623 | *

 624 |      * Human readable descriptions of the classes, in scores descending order.
 625 |      * 
 626 |      *
627 | * repeated string classes = 3; 628 | */ 629 | public int getClassesCount() { 630 | return classes_.size(); 631 | } 632 | 633 | /** 634 | *

 635 |      * Human readable descriptions of the classes, in scores descending order.
 636 |      * 
 637 |      *
638 | * repeated string classes = 3; 639 | */ 640 | public String getClasses(int index) { 641 | return classes_.get(index); 642 | } 643 | 644 | /** 645 | *

 646 |      * Human readable descriptions of the classes, in scores descending order.
 647 |      * 
 648 |      *
649 | * repeated string classes = 3; 650 | */ 651 | public ByteString getClassesBytes(int index) { 652 | return classes_.getByteString(index); 653 | } 654 | 655 | public static final int SCORES_FIELD_NUMBER = 2; 656 | private List scores_; 657 | 658 | /** 659 | *

 660 |      * Scores of top matches, in same order as classes.
 661 |      * 
 662 |      *
663 | * repeated float scores = 2; 664 | */ 665 | public List getScoresList() { 666 | return scores_; 667 | } 668 | 669 | /** 670 | *

 671 |      * Scores of top matches, in same order as classes.
 672 |      * 
 673 |      *
674 | * repeated float scores = 2; 675 | */ 676 | public int getScoresCount() { 677 | return scores_.size(); 678 | } 679 | 680 | /** 681 | *

 682 |      * Scores of top matches, in same order as classes.
 683 |      * 
 684 |      *
685 | * repeated float scores = 2; 686 | */ 687 | public float getScores(int index) { 688 | return scores_.get(index); 689 | } 690 | 691 | private int scoresMemoizedSerializedSize = -1; 692 | 693 | private byte memoizedIsInitialized = -1; 694 | 695 | public final boolean isInitialized() { 696 | byte isInitialized = memoizedIsInitialized; 697 | if (isInitialized == 1) 698 | return true; 699 | if (isInitialized == 0) 700 | return false; 701 | 702 | memoizedIsInitialized = 1; 703 | return true; 704 | } 705 | 706 | public void writeTo(CodedOutputStream output) throws IOException { 707 | getSerializedSize(); 708 | if (getScoresList().size() > 0) { 709 | output.writeRawVarint32(18); 710 | output.writeRawVarint32(scoresMemoizedSerializedSize); 711 | } 712 | for (int i = 0; i < scores_.size(); i++) { 713 | output.writeFloatNoTag(scores_.get(i)); 714 | } 715 | for (int i = 0; i < classes_.size(); i++) { 716 | GeneratedMessage.writeString(output, 3, classes_.getRaw(i)); 717 | } 718 | } 719 | 720 | public int getSerializedSize() { 721 | int size = memoizedSize; 722 | if (size != -1) 723 | return size; 724 | 725 | size = 0; 726 | { 727 | int dataSize = 0; 728 | dataSize = 4 * getScoresList().size(); 729 | size += dataSize; 730 | if (!getScoresList().isEmpty()) { 731 | size += 1; 732 | size += CodedOutputStream.computeInt32SizeNoTag(dataSize); 733 | } 734 | scoresMemoizedSerializedSize = dataSize; 735 | } 736 | { 737 | int dataSize = 0; 738 | for (int i = 0; i < classes_.size(); i++) { 739 | dataSize += computeStringSizeNoTag(classes_.getRaw(i)); 740 | } 741 | size += dataSize; 742 | size += 1 * getClassesList().size(); 743 | } 744 | memoizedSize = size; 745 | return size; 746 | } 747 | 748 | private static final long serialVersionUID = 0L; 749 | 750 | public static InceptionResponse parseFrom(ByteString data) 751 | throws InvalidProtocolBufferException { 752 | return PARSER.parseFrom(data); 753 | } 754 | 755 | public static InceptionResponse parseFrom(ByteString data, 756 | 
ExtensionRegistryLite extensionRegistry) 757 | throws InvalidProtocolBufferException { 758 | return PARSER.parseFrom(data, extensionRegistry); 759 | } 760 | 761 | public static InceptionResponse parseFrom(byte[] data) 762 | throws InvalidProtocolBufferException { 763 | return PARSER.parseFrom(data); 764 | } 765 | 766 | public static InceptionResponse parseFrom(byte[] data, 767 | ExtensionRegistryLite extensionRegistry) 768 | throws InvalidProtocolBufferException { 769 | return PARSER.parseFrom(data, extensionRegistry); 770 | } 771 | 772 | public static InceptionResponse parseFrom(InputStream input) 773 | throws IOException { 774 | return GeneratedMessage.parseWithIOException(PARSER, input); 775 | } 776 | 777 | public static InceptionResponse parseFrom(InputStream input, 778 | ExtensionRegistryLite extensionRegistry) throws IOException { 779 | return GeneratedMessage 780 | .parseWithIOException(PARSER, input, extensionRegistry); 781 | } 782 | 783 | public static InceptionResponse parseDelimitedFrom(InputStream input) 784 | throws IOException { 785 | return GeneratedMessage.parseDelimitedWithIOException(PARSER, input); 786 | } 787 | 788 | public static InceptionResponse parseDelimitedFrom(InputStream input, 789 | ExtensionRegistryLite extensionRegistry) throws IOException { 790 | return GeneratedMessage 791 | .parseDelimitedWithIOException(PARSER, input, extensionRegistry); 792 | } 793 | 794 | public static InceptionResponse parseFrom(CodedInputStream input) 795 | throws IOException { 796 | return GeneratedMessage.parseWithIOException(PARSER, input); 797 | } 798 | 799 | public static InceptionResponse parseFrom(CodedInputStream input, 800 | ExtensionRegistryLite extensionRegistry) throws IOException { 801 | return GeneratedMessage 802 | .parseWithIOException(PARSER, input, extensionRegistry); 803 | } 804 | 805 | public Builder newBuilderForType() { 806 | return newBuilder(); 807 | } 808 | 809 | public static Builder newBuilder() { 810 | return 
DEFAULT_INSTANCE.toBuilder(); 811 | } 812 | 813 | public static Builder newBuilder(InceptionResponse prototype) { 814 | return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype); 815 | } 816 | 817 | public Builder toBuilder() { 818 | return this == DEFAULT_INSTANCE ? 819 | new Builder() : 820 | new Builder().mergeFrom(this); 821 | } 822 | 823 | @Override 824 | protected Builder newBuilderForType(BuilderParent parent) { 825 | Builder builder = new Builder(parent); 826 | return builder; 827 | } 828 | 829 | /** 830 | * Protobuf type {@code tensorflow.serving.InceptionResponse} 831 | */ 832 | public static final class Builder extends GeneratedMessage.Builder 833 | implements 834 | // @@protoc_insertion_point(builder_implements:tensorflow.serving.InceptionResponse) 835 | InceptionResponseOrBuilder { 836 | public static final Descriptor getDescriptor() { 837 | return InceptionInference.internal_static_tensorflow_serving_InceptionResponse_descriptor; 838 | } 839 | 840 | protected FieldAccessorTable internalGetFieldAccessorTable() { 841 | return InceptionInference.internal_static_tensorflow_serving_InceptionResponse_fieldAccessorTable 842 | .ensureFieldAccessorsInitialized(InceptionResponse.class, 843 | Builder.class); 844 | } 845 | 846 | // Construct using tensorflow.serving.InceptionInference.InceptionResponse.newBuilder() 847 | private Builder() { 848 | maybeForceBuilderInitialization(); 849 | } 850 | 851 | private Builder(BuilderParent parent) { 852 | super(parent); 853 | maybeForceBuilderInitialization(); 854 | } 855 | 856 | private void maybeForceBuilderInitialization() { 857 | if (GeneratedMessage.alwaysUseFieldBuilders) { 858 | } 859 | } 860 | 861 | public Builder clear() { 862 | super.clear(); 863 | classes_ = LazyStringArrayList.EMPTY; 864 | bitField0_ = (bitField0_ & ~0x00000001); 865 | scores_ = Collections.emptyList(); 866 | bitField0_ = (bitField0_ & ~0x00000002); 867 | return this; 868 | } 869 | 870 | public Descriptor getDescriptorForType() { 871 | return 
InceptionInference.internal_static_tensorflow_serving_InceptionResponse_descriptor; 872 | } 873 | 874 | public InceptionResponse getDefaultInstanceForType() { 875 | return getDefaultInstance(); 876 | } 877 | 878 | public InceptionResponse build() { 879 | InceptionResponse result = buildPartial(); 880 | if (!result.isInitialized()) { 881 | throw newUninitializedMessageException(result); 882 | } 883 | return result; 884 | } 885 | 886 | public InceptionResponse buildPartial() { 887 | InceptionResponse result = new InceptionResponse(this); 888 | int from_bitField0_ = bitField0_; 889 | if (((bitField0_ & 0x00000001) == 0x00000001)) { 890 | classes_ = classes_.getUnmodifiableView(); 891 | bitField0_ = (bitField0_ & ~0x00000001); 892 | } 893 | result.classes_ = classes_; 894 | if (((bitField0_ & 0x00000002) == 0x00000002)) { 895 | scores_ = Collections.unmodifiableList(scores_); 896 | bitField0_ = (bitField0_ & ~0x00000002); 897 | } 898 | result.scores_ = scores_; 899 | onBuilt(); 900 | return result; 901 | } 902 | 903 | public Builder mergeFrom(Message other) { 904 | if (other instanceof InceptionResponse) { 905 | return mergeFrom((InceptionResponse) other); 906 | } else { 907 | super.mergeFrom(other); 908 | return this; 909 | } 910 | } 911 | 912 | public Builder mergeFrom(InceptionResponse other) { 913 | if (other == getDefaultInstance()) 914 | return this; 915 | if (!other.classes_.isEmpty()) { 916 | if (classes_.isEmpty()) { 917 | classes_ = other.classes_; 918 | bitField0_ = (bitField0_ & ~0x00000001); 919 | } else { 920 | ensureClassesIsMutable(); 921 | classes_.addAll(other.classes_); 922 | } 923 | onChanged(); 924 | } 925 | if (!other.scores_.isEmpty()) { 926 | if (scores_.isEmpty()) { 927 | scores_ = other.scores_; 928 | bitField0_ = (bitField0_ & ~0x00000002); 929 | } else { 930 | ensureScoresIsMutable(); 931 | scores_.addAll(other.scores_); 932 | } 933 | onChanged(); 934 | } 935 | onChanged(); 936 | return this; 937 | } 938 | 939 | public final boolean 
isInitialized() { 940 | return true; 941 | } 942 | 943 | public Builder mergeFrom(CodedInputStream input, 944 | ExtensionRegistryLite extensionRegistry) throws IOException { 945 | InceptionResponse parsedMessage = null; 946 | try { 947 | parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry); 948 | } catch (InvalidProtocolBufferException e) { 949 | parsedMessage = (InceptionResponse) e.getUnfinishedMessage(); 950 | throw e.unwrapIOException(); 951 | } finally { 952 | if (parsedMessage != null) { 953 | mergeFrom(parsedMessage); 954 | } 955 | } 956 | return this; 957 | } 958 | 959 | private int bitField0_; 960 | 961 | private LazyStringList classes_ = LazyStringArrayList.EMPTY; 962 | 963 | private void ensureClassesIsMutable() { 964 | if (!((bitField0_ & 0x00000001) == 0x00000001)) { 965 | classes_ = new LazyStringArrayList(classes_); 966 | bitField0_ |= 0x00000001; 967 | } 968 | } 969 | 970 | /** 971 | *

 972 |        * Human readable descriptions of the classes, in scores descending order.
 973 |        * 
 974 |        *
975 | * repeated string classes = 3; 976 | */ 977 | public ProtocolStringList getClassesList() { 978 | return classes_.getUnmodifiableView(); 979 | } 980 | 981 | /** 982 | *

 983 |        * Human readable descriptions of the classes, in scores descending order.
 984 |        * 
 985 |        *
986 | * repeated string classes = 3; 987 | */ 988 | public int getClassesCount() { 989 | return classes_.size(); 990 | } 991 | 992 | /** 993 | *

 994 |        * Human readable descriptions of the classes, in scores descending order.
 995 |        * 
 996 |        *
997 | * repeated string classes = 3; 998 | */ 999 | public String getClasses(int index) { 1000 | return classes_.get(index); 1001 | } 1002 | 1003 | /** 1004 | *

1005 |        * Human readable descriptions of the classes, in scores descending order.
1006 |        * 
1007 |        *
1008 | * repeated string classes = 3; 1009 | */ 1010 | public ByteString getClassesBytes(int index) { 1011 | return classes_.getByteString(index); 1012 | } 1013 | 1014 | /** 1015 | *

1016 |        * Human readable descriptions of the classes, in scores descending order.
1017 |        * 
1018 |        *
1019 | * repeated string classes = 3; 1020 | */ 1021 | public Builder setClasses(int index, String value) { 1022 | if (value == null) { 1023 | throw new NullPointerException(); 1024 | } 1025 | ensureClassesIsMutable(); 1026 | classes_.set(index, value); 1027 | onChanged(); 1028 | return this; 1029 | } 1030 | 1031 | /** 1032 | *

1033 |        * Human readable descriptions of the classes, in scores descending order.
1034 |        * 
1035 |        *
1036 | * repeated string classes = 3; 1037 | */ 1038 | public Builder addClasses(String value) { 1039 | if (value == null) { 1040 | throw new NullPointerException(); 1041 | } 1042 | ensureClassesIsMutable(); 1043 | classes_.add(value); 1044 | onChanged(); 1045 | return this; 1046 | } 1047 | 1048 | /** 1049 | *

1050 |        * Human readable descriptions of the classes, in scores descending order.
1051 |        * 
1052 |        *
1053 | * repeated string classes = 3; 1054 | */ 1055 | public Builder addAllClasses(Iterable values) { 1056 | ensureClassesIsMutable(); 1057 | AbstractMessageLite.Builder.addAll(values, classes_); 1058 | onChanged(); 1059 | return this; 1060 | } 1061 | 1062 | /** 1063 | *

1064 |        * Human readable descriptions of the classes, in scores descending order.
1065 |        * 
1066 |        *
1067 | * repeated string classes = 3; 1068 | */ 1069 | public Builder clearClasses() { 1070 | classes_ = LazyStringArrayList.EMPTY; 1071 | bitField0_ = (bitField0_ & ~0x00000001); 1072 | onChanged(); 1073 | return this; 1074 | } 1075 | 1076 | /** 1077 | *

1078 |        * Human readable descriptions of the classes, in scores descending order.
1079 |        * 
1080 |        *
1081 | * repeated string classes = 3; 1082 | */ 1083 | public Builder addClassesBytes(ByteString value) { 1084 | if (value == null) { 1085 | throw new NullPointerException(); 1086 | } 1087 | checkByteStringIsUtf8(value); 1088 | ensureClassesIsMutable(); 1089 | classes_.add(value); 1090 | onChanged(); 1091 | return this; 1092 | } 1093 | 1094 | private List scores_ = Collections.emptyList(); 1095 | 1096 | private void ensureScoresIsMutable() { 1097 | if (!((bitField0_ & 0x00000002) == 0x00000002)) { 1098 | scores_ = new ArrayList(scores_); 1099 | bitField0_ |= 0x00000002; 1100 | } 1101 | } 1102 | 1103 | /** 1104 | *

1105 |        * Scores of top matches, in same order as classes.
1106 |        * 
1107 |        *
1108 | * repeated float scores = 2; 1109 | */ 1110 | public List getScoresList() { 1111 | return Collections.unmodifiableList(scores_); 1112 | } 1113 | 1114 | /** 1115 | *

1116 |        * Scores of top matches, in same order as classes.
1117 |        * 
1118 |        *
1119 | * repeated float scores = 2; 1120 | */ 1121 | public int getScoresCount() { 1122 | return scores_.size(); 1123 | } 1124 | 1125 | /** 1126 | *

1127 |        * Scores of top matches, in same order as classes.
1128 |        * 
1129 |        *
1130 | * repeated float scores = 2; 1131 | */ 1132 | public float getScores(int index) { 1133 | return scores_.get(index); 1134 | } 1135 | 1136 | /** 1137 | *

1138 |        * Scores of top matches, in same order as classes.
1139 |        * 
1140 |        *
1141 | * repeated float scores = 2; 1142 | */ 1143 | public Builder setScores(int index, float value) { 1144 | ensureScoresIsMutable(); 1145 | scores_.set(index, value); 1146 | onChanged(); 1147 | return this; 1148 | } 1149 | 1150 | /** 1151 | *

1152 |        * Scores of top matches, in same order as classes.
1153 |        * 
1154 |        *
1155 | * repeated float scores = 2; 1156 | */ 1157 | public Builder addScores(float value) { 1158 | ensureScoresIsMutable(); 1159 | scores_.add(value); 1160 | onChanged(); 1161 | return this; 1162 | } 1163 | 1164 | /** 1165 | *

1166 |        * Scores of top matches, in same order as classes.
1167 |        * 
1168 |        *
1169 |        * <code>repeated float scores = 2;</code>
1170 |        */
1171 |       public Builder addAllScores(Iterable<? extends Float> values) {
1172 |         ensureScoresIsMutable();
1173 |         AbstractMessageLite.Builder.addAll(values, scores_);
1174 |         onChanged();
1175 |         return this;
1176 |       }
1177 | 
1178 |       /**
1179 |        * <pre>
1180 |        * Scores of top matches, in same order as classes.
1181 |        * </pre>
1182 |        *
1183 |        * <code>repeated float scores = 2;</code>
1184 |        */
1185 |       public Builder clearScores() {
1186 |         scores_ = Collections.emptyList();
1187 |         bitField0_ = (bitField0_ & ~0x00000002);
1188 |         onChanged();
1189 |         return this;
1190 |       }
1191 | 
1192 |       public final Builder setUnknownFields(
1193 |           final UnknownFieldSet unknownFields) {
1194 |         return this;
1195 |       }
1196 | 
1197 |       public final Builder mergeUnknownFields(
1198 |           final UnknownFieldSet unknownFields) {
1199 |         return this;
1200 |       }
1201 | 
1202 |       // @@protoc_insertion_point(builder_scope:tensorflow.serving.InceptionResponse)
1203 |     }
1204 | 
1205 |     // @@protoc_insertion_point(class_scope:tensorflow.serving.InceptionResponse)
1206 |     private static final InceptionResponse DEFAULT_INSTANCE;
1207 | 
1208 |     static {
1209 |       DEFAULT_INSTANCE = new InceptionResponse();
1210 |     }
1211 | 
1212 |     public static InceptionResponse getDefaultInstance() {
1213 |       return DEFAULT_INSTANCE;
1214 |     }
1215 | 
1216 |     private static final Parser<InceptionResponse> PARSER = new AbstractParser<InceptionResponse>() {
1217 |       public InceptionResponse parsePartialFrom(CodedInputStream input,
1218 |           ExtensionRegistryLite extensionRegistry)
1219 |           throws InvalidProtocolBufferException {
1220 |         return new InceptionResponse(input, extensionRegistry);
1221 |       }
1222 |     };
1223 | 
1224 |     public static Parser<InceptionResponse> parser() {
1225 |       return PARSER;
1226 |     }
1227 | 
1228 |     @Override
1229 |     public Parser<InceptionResponse> getParserForType() {
1230 |       return PARSER;
1231 |     }
1232 | 
1233 |     public InceptionResponse getDefaultInstanceForType() {
1234 |       return DEFAULT_INSTANCE;
1235 |     }
1236 | 
1237 |   }
1238 | 
1239 |   private static final Descriptor internal_static_tensorflow_serving_InceptionRequest_descriptor;
1240 |   private static final FieldAccessorTable internal_static_tensorflow_serving_InceptionRequest_fieldAccessorTable;
1241 |   private static final Descriptor internal_static_tensorflow_serving_InceptionResponse_descriptor;
1242 |   private static final FieldAccessorTable internal_static_tensorflow_serving_InceptionResponse_fieldAccessorTable;
1243 | 
1244 |   public static FileDescriptor getDescriptor() {
1245 |     return descriptor;
1246 |   }
1247 | 
1248 |   private static FileDescriptor descriptor;
1249 | 
1250 |   static {
1251 |     String[] descriptorData = {
1252 |         "\n\031inception_inference.proto\022\022tensorflow." +
1253 |         "serving\"(\n\020InceptionRequest\022\024\n\014jpeg_enco" +
1254 |         "ded\030\001 \001(\014\"4\n\021InceptionResponse\022\017\n\007classe"
1255 |             +
1256 |         "s\030\003 \003(\t\022\016\n\006scores\030\002 \003(\0022k\n\020InceptionServ"
1257 |             +
1258 |         "ice\022W\n\010Classify\022$.tensorflow.serving.Inc" +
1259 |         "eptionRequest\032%.tensorflow.serving.Incep" +
1260 |         "tionResponseb\006proto3" };
1261 |     InternalDescriptorAssigner assigner = new InternalDescriptorAssigner() {
1262 |       public ExtensionRegistry assignDescriptors(FileDescriptor root) {
1263 |         descriptor = root;
1264 |         return null;
1265 |       }
1266 |     };
1267 |     FileDescriptor
1268 |         .internalBuildGeneratedFileFrom(descriptorData, new FileDescriptor[] {},
1269 |             assigner);
1270 |     internal_static_tensorflow_serving_InceptionRequest_descriptor = getDescriptor()
1271 |         .getMessageTypes().get(0);
1272 |     internal_static_tensorflow_serving_InceptionRequest_fieldAccessorTable = new FieldAccessorTable(
1273 |         internal_static_tensorflow_serving_InceptionRequest_descriptor,
1274 |         new String[] { "JpegEncoded", });
1275 |     internal_static_tensorflow_serving_InceptionResponse_descriptor = getDescriptor()
1276 |         .getMessageTypes().get(1);
1277 |     internal_static_tensorflow_serving_InceptionResponse_fieldAccessorTable = new FieldAccessorTable(
1278 |         internal_static_tensorflow_serving_InceptionResponse_descriptor,
1279 |         new String[] { "Classes", "Scores", });
1280 |   }
1281 | 
1282 |   // @@protoc_insertion_point(outer_class_scope)
1283 | }
1284 | 
--------------------------------------------------------------------------------
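Note: `InceptionInference.java` above is protoc-generated. For readability, the `inception_inference.proto` contract it was generated from can be reconstructed from the embedded `descriptorData` string; the message names, field names, field numbers, and types below are decoded from that descriptor (the `.proto` file itself is not in this repository, so comments and layout are a sketch):

```proto
syntax = "proto3";

package tensorflow.serving;

// A JPEG-encoded image to classify.
message InceptionRequest {
  bytes jpeg_encoded = 1;
}

// Top matching class names with their scores.
// Note the field numbers: classes = 3, scores = 2.
message InceptionResponse {
  repeated string classes = 3;
  repeated float scores = 2;
}

service InceptionService {
  rpc Classify (InceptionRequest) returns (InceptionResponse);
}
```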
/src/main/java/com/github/megachucky/kafka/streams/machinelearning/Kafka_Streams_TensorFlow_Serving_gRPC_Example.java:
--------------------------------------------------------------------------------
1 | /*
2 |  * Licensed to the Apache Software Foundation (ASF) under one or more
3 |  * contributor license agreements. See the NOTICE file distributed with
4 |  * this work for additional information regarding copyright ownership.
5 |  * The ASF licenses this file to You under the Apache License, Version 2.0
6 |  * (the "License"); you may not use this file except in compliance with
7 |  * the License. You may obtain a copy of the License at
8 |  *
9 |  *    http://www.apache.org/licenses/LICENSE-2.0
10 |  *
11 |  * Unless required by applicable law or agreed to in writing, software
12 |  * distributed under the License is distributed on an "AS IS" BASIS,
13 |  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
14 |  * See the License for the specific language governing permissions and
15 |  * limitations under the License.
16 |  */
17 | 
18 | package com.github.megachucky.kafka.streams.machinelearning;
19 | 
20 | import java.io.FileInputStream;
21 | import java.io.InputStream;
22 | import java.util.Collections;
23 | import java.util.List;
24 | import java.util.Map;
25 | import java.util.Properties;
26 | 
27 | import org.apache.kafka.common.serialization.Serdes;
28 | import org.apache.kafka.streams.KafkaStreams;
29 | import org.apache.kafka.streams.StreamsBuilder;
30 | import org.apache.kafka.streams.StreamsConfig;
31 | import org.apache.kafka.streams.kstream.KStream;
32 | 
33 | import com.github.megachucky.kafka.streams.machinelearning.TensorflowObjectRecogniser;
34 | 
35 | /**
36 |  * @author Kai Waehner
37 |  */
38 | public class Kafka_Streams_TensorFlow_Serving_gRPC_Example {
39 | 
40 |     private static final String imageInputTopic = "ImageInputTopic";
41 |     private static final String imageOutputTopic = "ImageOutputTopic";
42 | 
43 |     private static final String server = "localhost";
44 |     private static final Integer port = 9000;
45 | 
46 |     // Image path will be received from Kafka message to topic 'imageInputTopic'
47 |     private static String imagePath = null;
48 | 
49 |     public static void main(String[] args) throws Exception {
50 | 
51 |         // Configure Kafka Streams Application
52 |         final String bootstrapServers = args.length > 0 ? args[0] : "localhost:9092";
53 | 
54 |         final Properties streamsConfiguration = new Properties();
55 | 
56 |         // Give the Streams application a unique name. The name must be unique
57 |         // in the Kafka cluster against which the application is run.
58 |         streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams-tensorflow-serving-gRPC-example");
59 | 
60 |         // Where to find Kafka broker(s).
61 |         streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
62 | 
63 |         // Specify default (de)serializers for record keys and for record
64 |         // values.
65 |         streamsConfiguration.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
66 |         streamsConfiguration.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
67 | 
68 |         // In the subsequent lines we define the processing topology of the Streams
69 |         // application.
70 |         final StreamsBuilder builder = new StreamsBuilder();
71 | 
72 |         // Construct a `KStream` from the input topic "ImageInputTopic", where
73 |         // message values represent image file paths
74 |         final KStream<String, String> imageInputLines = builder.stream(imageInputTopic);
75 | 
76 | 
77 |         KStream<String, String> transformedMessage = imageInputLines.mapValues(value -> {
78 | 
79 |             System.out.println("Image path: " + value);
80 | 
81 |             imagePath = value;
82 | 
83 |             TensorflowObjectRecogniser recogniser = new TensorflowObjectRecogniser(server, port);
84 | 
85 |             System.out.println("Image = " + imagePath);
86 |             InputStream jpegStream;
87 |             try {
88 |                 jpegStream = new FileInputStream(imagePath);
89 | 
90 |                 // Prediction of the TensorFlow Image Recognition model:
91 |                 List<Map.Entry<String, Double>> list = recogniser.recognise(jpegStream);
92 |                 String prediction = list.toString();
93 |                 System.out.println("Prediction: " + prediction);
94 |                 recogniser.close();
95 |                 jpegStream.close();
96 | 
97 |                 return prediction;
98 |             } catch (Exception e) {
99 |                 e.printStackTrace();
100 | 
101 |                 return Collections.emptyList().toString();
102 |             }
103 | 
104 |         });
105 | 
106 | 
107 | 
108 |         // Send prediction information to Output Topic
109 |         transformedMessage.to(imageOutputTopic);
110 | 
111 |         // Start Kafka Streams Application to process new incoming images from the Input
112 |         // Topic
113 |         final KafkaStreams streams = new KafkaStreams(builder.build(), streamsConfiguration);
114 | 
115 |         streams.cleanUp();
116 | 
117 |         streams.start();
118 | 
119 |         System.out.println("Image Recognition Microservice is running...");
120 | 
121 |         System.out.println("Input images arrive at Kafka topic " + imageInputTopic + "; Output predictions going to Kafka topic "
122 |                 + imageOutputTopic);
123 | 
124 |         // Add shutdown hook to respond to SIGTERM and gracefully close Kafka
125 |         // Streams
126 |         Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
127 | 
128 |     }
129 | }
130 | 
--------------------------------------------------------------------------------
/src/main/java/com/github/megachucky/kafka/streams/machinelearning/TensorflowObjectRecogniser.java:
--------------------------------------------------------------------------------
1 | /*
2 |  * Licensed to the Apache Software Foundation (ASF) under one or more
3 |  * contributor license agreements. See the NOTICE file distributed with
4 |  * this work for additional information regarding copyright ownership.
5 |  * The ASF licenses this file to You under the Apache License, Version 2.0
6 |  * (the "License"); you may not use this file except in compliance with
7 |  * the License. You may obtain a copy of the License at
8 |  *
9 |  *    http://www.apache.org/licenses/LICENSE-2.0
10 |  *
11 |  * Unless required by applicable law or agreed to in writing, software
12 |  * distributed under the License is distributed on an "AS IS" BASIS,
13 |  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
14 |  * See the License for the specific language governing permissions and
15 |  * limitations under the License.
16 |  */
17 | 
18 | package com.github.megachucky.kafka.streams.machinelearning;
19 | 
20 | import com.google.protobuf.ByteString;
21 | import io.grpc.ManagedChannel;
22 | import io.grpc.netty.NettyChannelBuilder;
23 | import org.slf4j.Logger;
24 | import org.slf4j.LoggerFactory;
25 | 
26 | import static com.github.megachucky.kafka.streams.machinelearning.InceptionInference.InceptionRequest;
27 | import static com.github.megachucky.kafka.streams.machinelearning.InceptionInference.InceptionResponse;
28 | 
29 | import java.io.Closeable;
30 | import java.io.IOException;
31 | import java.io.InputStream;
32 | import java.util.*;
33 | 
34 | /**
35 |  * This class offers an image recognition implementation using gRPC communication
36 |  * with a TensorFlow Serving server where the neural network for image recognition
37 |  * is deployed.
38 |  * 
39 |  * Uses the gRPC stub and generated classes for RPC communication.
40 |  * 
41 |  */
42 | public class TensorflowObjectRecogniser implements Closeable {
43 | 
44 |     private static final Logger LOG = LoggerFactory.getLogger(TensorflowObjectRecogniser.class);
45 | 
46 |     private ManagedChannel channel;
47 |     private InceptionBlockingStub stub;
48 | 
49 |     public TensorflowObjectRecogniser(String host, int port) {
50 |         LOG.debug("Creating channel host:{}, port={}", host, port);
51 |         try {
52 |             channel = NettyChannelBuilder.forAddress(host, port).usePlaintext(true).build();
53 |             stub = new InceptionBlockingStub(channel);
54 |             // TODO: test channel here with a sample image
55 |         } catch (Exception e) {
56 |             throw new RuntimeException(e);
57 |         }
58 |     }
59 | 
60 |     public List<Map.Entry<String, Double>> recognise(InputStream stream) throws Exception {
61 | 
62 |         List<Map.Entry<String, Double>> objects = new ArrayList<>();
63 |         ByteString jpegData = ByteString.readFrom(stream);
64 |         InceptionRequest request = InceptionRequest.newBuilder().setJpegEncoded(jpegData).build();
65 |         long st = System.currentTimeMillis();
66 |         InceptionResponse response = stub.classify(request);
67 |         long timeTaken = System.currentTimeMillis() - st;
68 |         LOG.debug("Time taken : {}ms", timeTaken);
69 |         Iterator<String> classes = response.getClassesList().iterator();
70 |         Iterator<Float> scores = response.getScoresList().iterator();
71 |         while (classes.hasNext() && scores.hasNext()) {
72 |             String className = classes.next();
73 |             Float score = scores.next();
74 |             Map.Entry<String, Double> object = new AbstractMap.SimpleEntry<>(className, score.doubleValue());
75 |             objects.add(object);
76 |         }
77 |         return objects;
78 |     }
79 | 
80 |     @Override
81 |     public void close() throws IOException {
82 |         if (channel != null) {
83 |             LOG.debug("Closing the channel ");
84 |             channel.shutdownNow();
85 |         }
86 |     }
87 | }
88 | 
--------------------------------------------------------------------------------
/src/main/resources/example.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kaiwaehner/tensorflow-serving-java-grpc-kafka-streams/71f93e0de13cc5ef752bc0c3e3beb7e753a55cc9/src/main/resources/example.jpg
--------------------------------------------------------------------------------
/src/main/resources/log4j.properties:
--------------------------------------------------------------------------------
1 | # Root logger option
2 | log4j.rootLogger=INFO, stdout
3 | 
4 | # Direct log messages to stdout
5 | log4j.appender.stdout=org.apache.log4j.ConsoleAppender
6 | log4j.appender.stdout.Target=System.out
7 | log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
8 | log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
9 | log4j.category.edu.usc.irds=DEBUG
--------------------------------------------------------------------------------
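Note: the value that `Kafka_Streams_TensorFlow_Serving_gRPC_Example` writes to `ImageOutputTopic` is simply `List.toString()` over the `(class, score)` entries that `TensorflowObjectRecogniser.recognise()` builds. A minimal, dependency-free sketch of that formatting — the class name `PredictionFormat` and the example labels/scores are made up here, not real model output:

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class PredictionFormat {

    // Mirrors the pairing loop in TensorflowObjectRecogniser.recognise():
    // each class name is paired with its score as a Map.Entry<String, Double>.
    static String format(List<String> classes, List<Float> scores) {
        List<Map.Entry<String, Double>> objects = new ArrayList<>();
        for (int i = 0; i < classes.size() && i < scores.size(); i++) {
            objects.add(new AbstractMap.SimpleEntry<>(classes.get(i), scores.get(i).doubleValue()));
        }
        // The Kafka Streams example sends exactly this List.toString() value
        // to the output topic.
        return objects.toString();
    }

    public static void main(String[] args) {
        String prediction = format(
                List.of("cheeseburger", "plate"),
                List.of(0.5f, 0.25f));
        System.out.println(prediction); // prints: [cheeseburger=0.5, plate=0.25]
    }
}
```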