├── .gitignore
├── LICENSE
├── README.md
├── docker-compose.yml
├── sample.png
├── src
│   ├── streamlit
│   │   ├── Dockerfile
│   │   ├── constants.py
│   │   ├── requirements.txt
│   │   ├── ui.py
│   │   └── utils.py
│   └── tfserving
│       ├── Dockerfile
│       └── requirements.txt
└── ui-demo.gif

/.gitignore:
--------------------------------------------------------------------------------
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2021 Alvaro Bartolome del Canto, @alvarobartt at GitHub

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# TensorFlow Serving + Streamlit! :sparkles::framed_picture:

__Serve TensorFlow ML models with TF-Serving and then create a Streamlit UI to use them!__

This is a simple [Streamlit](https://www.streamlit.io/) UI that exposes a
[TensorFlow](https://www.tensorflow.org/) image classification CNN served with
[TensorFlow Serving](https://www.tensorflow.org/tfx/guide/serving).

We will serve the model developed at
[alvarobartt/serving-tensorflow-models](https://github.com/alvarobartt/serving-tensorflow-models),
an image classification CNN trained to recognize
[The Simpsons Characters](https://www.kaggle.com/alexattia/the-simpsons-characters-dataset).

---

## :tv: Demo

![](ui-demo.gif)

---

## :whale2: Deployment

To deploy the application you will need [Docker Compose](https://docs.docker.com/compose/),
which in turn requires [Docker](https://www.docker.com/) to be installed.

We will deploy the following Docker containers:

- `src/tfserving`: contains the TF-Serving API deployment that serves the model.
- `src/streamlit`: contains the Streamlit UI that connects to the deployed API.

That said, you can easily deploy both of them with Docker Compose.
The first step is to __build the images__ with the following command:

```
docker-compose build --force-rm
```

__Note__: the `--force-rm` flag removes the intermediate containers left behind by the build.

Once built, you can __start the containers__ with the following command:

```
docker-compose up
```

Whenever you want to __stop the containers__, use the following command:

```
docker-compose stop
```

Finally, once you no longer need the containers, you can __remove them__ with the following command:

```
docker-compose rm
```
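
---

## :mag: Querying the API directly (optional)

Once the containers are up, you can also sanity-check the TF-Serving API from the host without going through the UI. The snippet below is only an illustrative sketch (it is not part of this repository): it assumes the default port mapping from `docker-compose.yml` (REST API on `localhost:8501`), the `simpsonsnet` model name, and it reuses the same preprocessing that `src/streamlit/utils.py` applies, here on the bundled `sample.png`.

```python
# sanity_check.py -- illustrative only, not part of this repository
import requests
import tensorflow as tf

# REST endpoint published by docker-compose.yml (port 8501 on the host)
REST_BASE = "http://localhost:8501/v1/models/simpsonsnet"

# 1) Check that TF-Serving has loaded the SimpsonsNet model
print(requests.get(REST_BASE).json())

# 2) Preprocess an image the same way src/streamlit/utils.py does (resize + rescale)
with open("sample.png", "rb") as f:
    image = tf.io.decode_image(f.read(), channels=3)
image = tf.image.resize(image, [224, 224]) / 255.
payload = {"instances": tf.expand_dims(image, 0).numpy().tolist()}

# 3) Request a prediction and keep the most likely class index
response = requests.post(f"{REST_BASE}:predict", json=payload)
prediction = response.json()["predictions"][0]
print(int(tf.argmax(prediction)))  # map this index via MAPPING in src/streamlit/constants.py
```

Note that the gRPC API is exposed as well (port `8500`, which is what `GRPC_URL` in `src/streamlit/constants.py` points to), although the UI currently only uses the REST endpoint.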
--------------------------------------------------------------------------------
/docker-compose.yml:
--------------------------------------------------------------------------------
version: "3"

services:
  tfserving:
    build: src/tfserving/
    ports:
      - 8500:8500
      - 8501:8501
    networks:
      - backend
    container_name: tfserving

  streamlit:
    build: src/streamlit/
    depends_on:
      - tfserving
    ports:
      - 8502:8502
    networks:
      - backend
    container_name: streamlit

networks:
  backend:
    driver: bridge
--------------------------------------------------------------------------------
/sample.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/alvarobartt/tensorflow-serving-streamlit/3d3b6b938d70b7f22bab17f137f641b658f79c3c/sample.png
--------------------------------------------------------------------------------
/src/streamlit/Dockerfile:
--------------------------------------------------------------------------------
# Pull the specified base image and use it as the working image
ARG BASE_IMAGE="ubuntu:20.04"
FROM ${BASE_IMAGE}

# Allow log messages to be dumped in the stream (not buffered)
ENV PYTHONUNBUFFERED TRUE

# Install the Ubuntu dependencies and Python 3
RUN apt-get update \
    && apt-get install --no-install-recommends -y \
       ca-certificates \
       python3-dev \
       python3-distutils \
       python3-venv \
       curl \
    && rm -rf /var/lib/apt/lists/* \
    && cd /tmp \
    && curl -O https://bootstrap.pypa.io/get-pip.py \
    && python3 get-pip.py \
    && rm get-pip.py

# Create a new Python env and include it in the PATH
RUN python3 -m venv /home/venv
ENV PATH="/home/venv/bin:$PATH"

# Make Python 3 and its pip the system defaults
RUN update-alternatives --install /usr/bin/python python /usr/bin/python3 1
RUN update-alternatives --install /usr/local/bin/pip pip /usr/local/bin/pip3 1

# Create a new Ubuntu user
RUN useradd -m model-server

# Upgrade pip and install setuptools before proceeding
RUN pip install pip --upgrade
RUN pip install -U pip setuptools

# Copy the current directory into the Docker image
RUN mkdir /streamlit
COPY . /streamlit
WORKDIR /streamlit

# Install the requirements
RUN pip install -r requirements.txt

# Set the proper rights on the created Python env
RUN chown -R model-server /home/venv

# Create a directory for the logs and set permissions on it
RUN mkdir /home/logs \
    && chown -R model-server /home/logs

# Expose the UI port (8502)
ENV UI_PORT=8502
EXPOSE $UI_PORT

# Prepare the CMD that will be run on docker run
USER model-server
CMD streamlit run ui.py --server.port=$UI_PORT >> /home/logs/ui.log
--------------------------------------------------------------------------------
/src/streamlit/constants.py:
--------------------------------------------------------------------------------
# Copyright 2021 Alvaro Bartolome, alvarobartt @ GitHub
# See LICENSE for details.

# TF-Serving URLs for both the gRPC and REST APIs
GRPC_URL = "tfserving:8500"
REST_URL = "http://tfserving:8501/v1/models/simpsonsnet:predict"

# Mapping of ids to labels (The Simpsons characters)
MAPPING = {
    0: "abraham_grampa_simpson", 1: "apu_nahasapeemapetilon", 2: "barney_gumble", 3: "bart_simpson",
    4: "carl_carlson", 5: "charles_montgomery_burns", 6: "chief_wiggum", 7: "comic_book_guy",
    8: "disco_stu", 9: "edna_krabappel", 10: "groundskeeper_willie", 11: "homer_simpson",
    12: "kent_brockman", 13: "krusty_the_clown", 14: "lenny_leonard", 15: "lisa_simpson",
    16: "maggie_simpson", 17: "marge_simpson", 18: "martin_prince", 19: "mayor_quimby",
    20: "milhouse_van_houten", 21: "moe_szyslak", 22: "ned_flanders", 23: "nelson_muntz",
    24: "patty_bouvier", 25: "principal_skinner", 26: "professor_john_frink", 27: "ralph_wiggum",
    28: "selma_bouvier", 29: "sideshow_bob", 30: "snake_jailbird", 31: "waylon_smithers"
}
--------------------------------------------------------------------------------
/src/streamlit/requirements.txt:
--------------------------------------------------------------------------------
streamlit==0.76.0
tensorflow==2.4.1
requests==2.25.1
--------------------------------------------------------------------------------
/src/streamlit/ui.py:
--------------------------------------------------------------------------------
# Copyright 2021 Alvaro Bartolome, alvarobartt @ GitHub
# See LICENSE for details.

import streamlit as st

import requests

from utils import image2tensor, prediction2label
from constants import REST_URL, MAPPING

# General information about the UI
st.title("TensorFlow Serving + Streamlit! ✨🖼️")
st.header("UI to use a TensorFlow image classification model of The Simpsons characters (named SimpsonsNet) served with TensorFlow Serving.")

# Show the classes that the SimpsonsNet model can predict
if st.checkbox("Show classes"):
    st.write("The SimpsonsNet can predict the following characters:")
    st.write(MAPPING)

# Create a FileUploader so that the user can upload an image to the UI
uploaded_file = st.file_uploader(label="Upload an image of any of the available The Simpsons characters (please see Classes).",
                                 type=["png", "jpeg", "jpg"])

# Display the predict button only once an image has been uploaded
if not uploaded_file:
    st.warning("Please upload an image before proceeding!")
    st.stop()
else:
    image_as_bytes = uploaded_file.read()
    st.image(image_as_bytes, use_column_width=True)
    pred_button = st.button("Predict")

if pred_button:
    # Convert the input image into a Tensor
    image_tensor = image2tensor(image_as_bytes=image_as_bytes)

    # Prepare the data that is going to be sent in the POST request
    json_data = {
        "instances": image_tensor
    }

    # Send the request to the Prediction API
    response = requests.post(REST_URL, json=json_data)

    # Retrieve the highest probability index of the Tensor (actual prediction)
    prediction = response.json()['predictions'][0]
    label = prediction2label(prediction=prediction)

    # Write the predicted label for the input image
    st.write(f"Predicted The Simpsons character: {label}")
--------------------------------------------------------------------------------
/src/streamlit/utils.py:
--------------------------------------------------------------------------------
# Copyright 2021 Alvaro Bartolome, alvarobartt @ GitHub
# See LICENSE for details.

import tensorflow as tf

from constants import MAPPING


def image2tensor(image_as_bytes):
    """
    Receives an image as bytes, loads and preprocesses it, and turns it
    into a Tensor so that it can be included in the TF-Serving request data.
    """

    # Apply the same preprocessing as during training (resize and rescale)
    image = tf.io.decode_image(image_as_bytes, channels=3)
    image = tf.image.resize(image, [224, 224])
    image = image / 255.

    # Convert the Tensor to a batch of Tensors and then to a list
    image = tf.expand_dims(image, 0)
    image = image.numpy().tolist()
    return image


def prediction2label(prediction):
    """
    Receives the prediction Tensor, retrieves the index with the highest
    probability and maps that index to the predicted label.
    """

    # Retrieve the highest probability index of the Tensor (actual prediction)
    prediction = tf.argmax(prediction)
    return MAPPING[prediction.numpy()]
--------------------------------------------------------------------------------
/src/tfserving/Dockerfile:
--------------------------------------------------------------------------------
# Pull the specified base image and use it as the working image
ARG BASE_IMAGE="ubuntu:20.04"
FROM ${BASE_IMAGE}

# Allow log messages to be dumped in the stream (not buffered)
ENV PYTHONUNBUFFERED TRUE

# Install the Ubuntu dependencies and Python 3
RUN apt-get update \
    && apt-get install --no-install-recommends -y \
       ca-certificates \
       python3-dev \
       python3-distutils \
       python3-venv \
       curl \
       wget \
       unzip \
       gnupg \
    && rm -rf /var/lib/apt/lists/* \
    && cd /tmp \
    && curl -O https://bootstrap.pypa.io/get-pip.py \
    && python3 get-pip.py \
    && rm get-pip.py

# Create a new Python env and include it in the PATH
RUN python3 -m venv /home/venv
ENV PATH="/home/venv/bin:$PATH"

# Make Python 3 and its pip the system defaults
RUN update-alternatives --install /usr/bin/python python /usr/bin/python3 1
RUN update-alternatives --install /usr/local/bin/pip pip /usr/local/bin/pip3 1

# Create a new Ubuntu user
RUN useradd -m model-server

# Upgrade pip and install setuptools before proceeding
RUN pip install pip --upgrade
RUN pip install -U pip setuptools

# Copy the current directory (including requirements.txt) into the Docker image
RUN mkdir /tfserving
COPY . /tfserving
WORKDIR /tfserving

# Install the requirements and the TensorFlow Model Server package
RUN pip install -r requirements.txt
RUN echo "deb [arch=amd64] http://storage.googleapis.com/tensorflow-serving-apt stable tensorflow-model-server tensorflow-model-server-universal" | tee /etc/apt/sources.list.d/tensorflow-serving.list && \
    curl https://storage.googleapis.com/tensorflow-serving-apt/tensorflow-serving.release.pub.gpg | apt-key add -
RUN apt-get update && apt-get install tensorflow-model-server -y

# Download the trained TensorFlow model
RUN cd /home \
    && wget -nv "https://www.dropbox.com/s/4l0efitsuvfoxhj/simpsonsnet.zip" \
    && unzip simpsonsnet.zip \
    && rm simpsonsnet.zip \
    && mkdir /home/saved_models \
    && mv simpsonsnet /home/saved_models/

# Set the proper rights on the /home/saved_models dir and the created Python env
RUN chown -R model-server /home/saved_models \
    && chown -R model-server /home/venv

# Create a directory for the logs and set permissions on it
RUN mkdir /home/logs \
    && chown -R model-server /home/logs

# Define the model path and the model name environment variables
ENV MODEL_PATH=/home/saved_models/simpsonsnet
ENV MODEL_NAME=simpsonsnet

# Expose the ports 8500 (gRPC) and 8501 (REST)
ENV GRPC_PORT=8500
ENV REST_PORT=8501
EXPOSE $GRPC_PORT $REST_PORT

# Prepare the CMD that will be run on docker run
USER model-server
CMD tensorflow_model_server --port=$GRPC_PORT --rest_api_port=$REST_PORT --model_name=$MODEL_NAME --model_base_path=$MODEL_PATH >> /home/logs/server.log
--------------------------------------------------------------------------------
/src/tfserving/requirements.txt:
--------------------------------------------------------------------------------
tensorflow==2.4.1
--------------------------------------------------------------------------------
/ui-demo.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/alvarobartt/tensorflow-serving-streamlit/3d3b6b938d70b7f22bab17f137f641b658f79c3c/ui-demo.gif
--------------------------------------------------------------------------------