├── .dockerignore ├── .gitignore ├── Dockerfile ├── Dockerfile.train ├── LICENSE ├── README.md ├── data-download.sh ├── data └── .gitkeep ├── docker-compose.train.yml ├── docker-compose.yml ├── docs ├── quickdraw-preview.jpg └── web-preview.png ├── model ├── .gitkeep └── model-v1.h5 ├── src └── quickdraw-classification.ipynb └── web ├── .gitignore ├── Procfile ├── app.py ├── requirement.txt ├── runtime.txt ├── static └── sketch.min.js └── templates └── index.html /.dockerignore: -------------------------------------------------------------------------------- 1 | docs 2 | src 3 | docker-compose.yml 4 | Dockerfile 5 | .local 6 | .ipynb_checkpoints -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | data 2 | **/.ipynb_checkpoints/ 3 | **/.local/ -------------------------------------------------------------------------------- /Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:2.7 2 | 3 | RUN curl -o get-pip.py https://bootstrap.pypa.io/get-pip.py 4 | RUN python get-pip.py 5 | 6 | RUN mkdir /home/web 7 | WORKDIR /home/web 8 | 9 | COPY ./web/requirement.txt requirement.txt 10 | RUN pip install -r requirement.txt 11 | 12 | COPY ./model/ ../model/ 13 | COPY ./web/ . 14 | CMD ["gunicorn", "app:app"] 15 | -------------------------------------------------------------------------------- /Dockerfile.train: -------------------------------------------------------------------------------- 1 | FROM tensorflow/tensorflow 2 | 3 | RUN pip install keras 4 | RUN pip install h5py 5 | RUN pip install --upgrade jupyter 6 | 7 | RUN useradd -ms /bin/bash quickdraw-ten 8 | WORKDIR /home/quickdraw-ten 9 | 10 | COPY data-download.sh . 11 | RUN chown -R quickdraw-ten . 
12 | RUN chmod +x ./data-download.sh 13 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2017 Kosate Limpongsa 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | Quickdraw 10 CNN Classifier 2 | === 3 | 4 | ## Demo 5 | 6 | :point_right: [https://quickdraw-10-classification.herokuapp.com/](https://quickdraw-10-classification.herokuapp.com/) 7 | 8 | ## Introduction 9 | 10 | This project is a demo that takes a hand-drawn image as input and classifies which kind of object it depicts.
11 | 12 |  13 | 14 | It is a machine learning project that classifies 10 classes of [Google Quickdraw images](https://github.com/googlecreativelab/quickdraw-dataset) with a convolutional neural network, using [Keras](https://keras.io/) and [Tensorflow](https://www.tensorflow.org). 15 | It also serves a hand-drawing website with [Flask](http://flask.pocoo.org/). 16 | 17 |  18 | 19 | ## Website Usage 20 | 21 | 1. Compose Docker 22 | 23 | ``` 24 | docker-compose up 25 | ``` 26 | 27 | 2. Open `http://localhost:8000` 28 | 29 | By default, the Flask website loads the pre-trained model `model-v1.h5` that I have provided in the `model` folder and serves it on the website. 30 | 31 | You can train a new model with `docker-compose -f docker-compose.train.yml up`, which starts a Jupyter notebook. Customize the code, run it yourself, then save the result to the `model` directory to replace the old model, or change the code in the `web` folder if you prefer. 32 | 33 | ## Model Usage 34 | 35 | ### Training Data Preparation 36 | 37 | Run these commands to download the training dataset automatically: 38 | 39 | ``` 40 | chmod +x ./data-download.sh 41 | ./data-download.sh 42 | 43 | # Or 44 | docker-compose exec quickdraw ./data-download.sh 45 | ``` 46 | 47 | ### Jupyter Usage 48 | 49 | ``` 50 | docker-compose -f docker-compose.train.yml up 51 | ``` 52 | 53 | ## Heroku Deployment 54 | 55 | ``` 56 | heroku container:login 57 | heroku create 58 | heroku container:push web 59 | heroku open 60 | ``` 61 | 62 | ## License 63 | 64 | [MIT](LICENSE) © Kosate Limpongsa 65 | -------------------------------------------------------------------------------- /data-download.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | curl -o data/baseball.npy https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/baseball.npy 4 | curl -o data/bowtie.npy https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/bowtie.npy 5 | curl -o data/clock.npy
https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/clock.npy 6 | curl -o data/hand.npy https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/hand.npy 7 | curl -o data/hat.npy https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/hat.npy 8 | curl -o data/lightning.npy https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/lightning.npy 9 | curl -o data/lollipop.npy https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/lollipop.npy 10 | curl -o data/mountain.npy https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/mountain.npy 11 | curl -o data/pizza.npy https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/pizza.npy 12 | curl -o data/star.npy https://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/star.npy -------------------------------------------------------------------------------- /data/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/neungkl/quickdraw-10-CNN-classifier/7f3f0d408f6a4ea99d900bcefaa59dc5bbc3de56/data/.gitkeep -------------------------------------------------------------------------------- /docker-compose.train.yml: -------------------------------------------------------------------------------- 1 | version: '3.3' 2 | services: 3 | quickdraw: 4 | build: 5 | context: . 6 | dockerfile: Dockerfile.train 7 | volumes: 8 | - ./src:/home/quickdraw-ten/src 9 | - ./data:/home/quickdraw-ten/data 10 | - ./model:/home/quickdraw-ten/model 11 | ports: 12 | - 8888:8888 -------------------------------------------------------------------------------- /docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: '3.3' 2 | services: 3 | web: 4 | build: . 
5 | environment: 6 | - PORT=8000 7 | volumes: 8 | - ./web:/home/web/ 9 | ports: 10 | - 8000:8000 -------------------------------------------------------------------------------- /docs/quickdraw-preview.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/neungkl/quickdraw-10-CNN-classifier/7f3f0d408f6a4ea99d900bcefaa59dc5bbc3de56/docs/quickdraw-preview.jpg -------------------------------------------------------------------------------- /docs/web-preview.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/neungkl/quickdraw-10-CNN-classifier/7f3f0d408f6a4ea99d900bcefaa59dc5bbc3de56/docs/web-preview.png -------------------------------------------------------------------------------- /model/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/neungkl/quickdraw-10-CNN-classifier/7f3f0d408f6a4ea99d900bcefaa59dc5bbc3de56/model/.gitkeep -------------------------------------------------------------------------------- /model/model-v1.h5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/neungkl/quickdraw-10-CNN-classifier/7f3f0d408f6a4ea99d900bcefaa59dc5bbc3de56/model/model-v1.h5 -------------------------------------------------------------------------------- /src/quickdraw-classification.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": { 7 | "collapsed": true 8 | }, 9 | "outputs": [], 10 | "source": [ 11 | "import tensorflow as tf\n", 12 | "import os\n", 13 | "import numpy as np\n", 14 | "import matplotlib.pyplot as plt\n", 15 | "import pickle\n", 16 | "%matplotlib inline\n", 17 | "\n", 18 | "cwd = os.getcwd()" 19 | ] 20 | }, 21 | { 22 | "cell_type": "code", 23 | 
"execution_count": 4, 24 | "metadata": { 25 | "collapsed": true 26 | }, 27 | "outputs": [], 28 | "source": [ 29 | "img_rows, img_cols = 28, 28\n", 30 | "\n", 31 | "data_dir = os.path.join(cwd, \"..\", \"data\")\n", 32 | "model_dir = os.path.join(cwd, \"..\", \"model\")\n", 33 | "LABELS = np.array(\n", 34 | " map(\n", 35 | " (lambda x : x.replace(\".npy\", \"\")),\n", 36 | " filter(lambda x: x.endswith('.npy'), os.listdir(data_dir))\n", 37 | " )\n", 38 | ")\n", 39 | "\n", 40 | "num_classes = len(LABELS)" 41 | ] 42 | }, 43 | { 44 | "cell_type": "code", 45 | "execution_count": 3, 46 | "metadata": { 47 | "collapsed": true 48 | }, 49 | "outputs": [], 50 | "source": [ 51 | "def data_prepare():\n", 52 | " datas_path = filter(lambda x: x.endswith('.npy'), os.listdir(data_dir))\n", 53 | " dataset = np.array([]).reshape(0, img_rows * img_cols + 1)\n", 54 | " \n", 55 | " for i, d_path in enumerate(datas_path):\n", 56 | " data = np.load(os.path.join(data_dir, d_path))\n", 57 | " image_size = len(data)\n", 58 | " label = np.ones(image_size, dtype=int) * i\n", 59 | " data = np.concatenate((label[:, np.newaxis], data), axis=1)\n", 60 | " \n", 61 | " np.random.shuffle(data)\n", 62 | "\n", 63 | " dataset = np.append(dataset, data[0:5000], axis=0)\n", 64 | " print(\"Load {}\".format(d_path))\n", 65 | " \n", 66 | " np.random.shuffle(dataset)\n", 67 | " dataset_len = len(dataset)\n", 68 | " split_x = (int)(dataset_len * 0.9)\n", 69 | " \n", 70 | " print(\"Dataset {} images\".format(dataset_len))\n", 71 | " print(\"Train {} images\".format(split_x))\n", 72 | " print(\"Test {} images\".format(dataset_len - split_x))\n", 73 | " \n", 74 | " print(\"Write data to pickle files...\")\n", 75 | " \n", 76 | " pickle.dump(dataset[0:split_x], open(os.path.join(data_dir, \"train.pickle\"), \"wb\"))\n", 77 | " pickle.dump(dataset[split_x:-1], open(os.path.join(data_dir, \"test.pickle\"), \"wb\"))\n", 78 | " \n", 79 | " print(\"Finish\")\n", 80 | "\n", 81 | "if not 
os.path.exists(os.path.join(data_dir, \"train.pickle\")):\n", 82 | "    print(\"Prepare pickle data\")\n", 83 | "    data_prepare()" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": 15, 89 | "metadata": {}, 90 | "outputs": [ 91 | { 92 | "name": "stdout", 93 | "output_type": "stream", 94 | "text": [ 95 | "Load dataset complete\n" 96 | ] 97 | } 98 | ], 99 | "source": [ 100 | "x_train = None\n", 101 | "y_train = None\n", 102 | "x_test = None\n", 103 | "y_test = None\n", 104 | "\n", 105 | "def load_dataset():\n", 106 | "    global x_train, y_train, x_test, y_test\n", 107 | "    train_data = pickle.load(open(os.path.join(data_dir, \"train.pickle\"), \"rb\"))\n", 108 | "    test_data = pickle.load(open(os.path.join(data_dir, \"test.pickle\"), \"rb\"))\n", 109 | "    x_train = train_data[:,1:]\n", 110 | "    y_train = train_data[:,0]\n", 111 | "    x_test = test_data[:,1:]\n", 112 | "    y_test = test_data[:,0]\n", 113 | "    print(\"Load dataset complete\")\n", 114 | "\n", 115 | "load_dataset()" 116 | ] 117 | }, 118 | { 119 | "cell_type": "code", 120 | "execution_count": null, 121 | "metadata": { 122 | "collapsed": true 123 | }, 124 | "outputs": [], 125 | "source": [ 126 | "for i in range(10):\n", 127 | "    plt.imshow(x_train[i].reshape(28,28))\n", 128 | "    plt.title(LABELS[(int)(y_train[i])])\n", 129 | "    plt.show()" 130 | ] 131 | }, 132 | { 133 | "cell_type": "markdown", 134 | "metadata": {}, 135 | "source": [ 136 | "### Data Preprocessing" 137 | ] 138 | }, 139 | { 140 | "cell_type": "code", 141 | "execution_count": 2, 142 | "metadata": {}, 143 | "outputs": [ 144 | { 145 | "name": "stderr", 146 | "output_type": "stream", 147 | "text": [ 148 | "Using TensorFlow backend.\n" 149 | ] 150 | } 151 | ], 152 | "source": [ 153 | "import keras\n", 154 | "from keras.models import Sequential\n", 155 | "from keras.layers import Dense, Dropout, Flatten\n", 156 | "from keras.layers import Conv2D, MaxPooling2D" 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": 16, 162 |
"metadata": {}, 163 | "outputs": [ 164 | { 165 | "name": "stdout", 166 | "output_type": "stream", 167 | "text": [ 168 | "(45000, 28, 28, 1)\n", 169 | "(45000, 10)\n", 170 | "(4999, 28, 28, 1)\n", 171 | "(4999, 10)\n" 172 | ] 173 | } 174 | ], 175 | "source": [ 176 | "x_train = x_train.astype('float32')\n", 177 | "x_test = x_test.astype('float32')\n", 178 | "x_train /= 255\n", 179 | "x_test /= 255\n", 180 | "x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)\n", 181 | "x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)\n", 182 | "\n", 183 | "y_train_onehot = keras.utils.to_categorical(y_train, num_classes)\n", 184 | "y_test_onehot = keras.utils.to_categorical(y_test, num_classes)\n", 185 | "\n", 186 | "print(x_train.shape)\n", 187 | "print(y_train_onehot.shape)\n", 188 | "print(x_test.shape)\n", 189 | "print(y_test_onehot.shape)" 190 | ] 191 | }, 192 | { 193 | "cell_type": "markdown", 194 | "metadata": {}, 195 | "source": [ 196 | "### Model" 197 | ] 198 | }, 199 | { 200 | "cell_type": "code", 201 | "execution_count": 38, 202 | "metadata": {}, 203 | "outputs": [ 204 | { 205 | "name": "stdout", 206 | "output_type": "stream", 207 | "text": [ 208 | "Epoch 1/3\n", 209 | "45000/45000 [==============================] - 134s - loss: 0.4630 - acc: 0.8626 \n", 210 | "Epoch 2/3\n", 211 | "45000/45000 [==============================] - 134s - loss: 0.2798 - acc: 0.9182 \n", 212 | "Epoch 3/3\n", 213 | "45000/45000 [==============================] - 138s - loss: 0.2278 - acc: 0.9315 \n" 214 | ] 215 | } 216 | ], 217 | "source": [ 218 | "model = Sequential()\n", 219 | "\n", 220 | "model.add(Conv2D(32, kernel_size=(3, 3), padding='same', activation='relu', input_shape=(img_rows, img_cols, 1)))\n", 221 | "model.add(MaxPooling2D(pool_size=(2, 2)))\n", 222 | "model.add(Conv2D(64, kernel_size=(3, 3), padding='same', activation='relu'))\n", 223 | "model.add(MaxPooling2D(pool_size=(2, 2)))\n", 224 | "model.add(Dropout(0.25))\n", 225 | "model.add(Flatten())\n", 
226 | "model.add(Dense(256, activation='relu'))\n", 227 | "model.add(Dense(256, activation='relu'))\n", 228 | "model.add(Dropout(0.5))\n", 229 | "model.add(Dense(num_classes, activation='softmax'))\n", 230 | "\n", 231 | "model.compile(\n", 232 | " loss=keras.losses.categorical_crossentropy,\n", 233 | " optimizer=keras.optimizers.Adam(),\n", 234 | " metrics=['accuracy']\n", 235 | ")\n", 236 | "\n", 237 | "model.fit(\n", 238 | " x_train, y_train_onehot,\n", 239 | " batch_size = 32,\n", 240 | " epochs = 3,\n", 241 | " verbose = 1\n", 242 | ")\n", 243 | "\n", 244 | "model.save(os.path.join(model_dir, 'model.h5'))" 245 | ] 246 | }, 247 | { 248 | "cell_type": "code", 249 | "execution_count": 39, 250 | "metadata": {}, 251 | "outputs": [ 252 | { 253 | "name": "stdout", 254 | "output_type": "stream", 255 | "text": [ 256 | "4992/4999 [============================>.] - ETA: 0s\n", 257 | "Accuracy 0.931786357271\n" 258 | ] 259 | } 260 | ], 261 | "source": [ 262 | "model = keras.models.load_model(os.path.join(model_dir, 'model.h5'))\n", 263 | "\n", 264 | "score = model.evaluate(x_test, y_test_onehot, verbose = 1)\n", 265 | "print(\"\\nAccuracy {}\".format(score[1]))" 266 | ] 267 | }, 268 | { 269 | "cell_type": "code", 270 | "execution_count": 40, 271 | "metadata": {}, 272 | "outputs": [ 273 | { 274 | "name": "stdout", 275 | "output_type": "stream", 276 | "text": [ 277 | "4999/4999 [==============================] - 4s \n", 278 | "()\n", 279 | "[[410 5 25 8 2 6 0 2 12 2]\n", 280 | " [ 0 444 3 6 4 5 0 2 4 5]\n", 281 | " [ 9 3 483 0 2 0 0 0 7 1]\n", 282 | " [ 2 3 2 464 6 9 4 5 6 7]\n", 283 | " [ 2 7 4 5 452 4 0 7 8 5]\n", 284 | " [ 2 3 0 4 1 460 0 8 1 14]\n", 285 | " [ 2 2 0 4 2 6 519 1 2 2]\n", 286 | " [ 1 1 0 2 2 3 0 475 1 4]\n", 287 | " [ 23 4 6 0 4 7 4 0 456 1]\n", 288 | " [ 3 5 1 4 2 7 0 1 2 495]]\n" 289 | ] 290 | }, 291 | { 292 | "data": { 293 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAASEAAAEhCAYAAAAwHRYbAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3XmcXFWZ//HPl7AkGBLAhREFQVYhrCYsCsgiriCMbC9B\nIG7IMgI/BcffOCxuvxlFnRlEwOASBFzYEVBAdmQJBAgJgYRdUFABWeJAIMvz++OcItWdqnR33Xtz\nq+jv+/XKq6tu3fvU6U73U+eee+5zFBGYmdVlmbobYGbDm5OQmdXKScjMauUkZGa1chIys1o5CZlZ\nrZyEzKxWTkJmVisnITOr1bJ1N6Auo1ZZIcau/obS4750X+khAdCIEaXHjIULSo+ZAlcT1gCpmrgV\n3Dkxl//l1XhlwAYP2yQ0dvU3cOAvdik97rR3l58sAEaMHVN6zIVz5pQeEyDmz68kLstU87OlqmRc\nAa2wQiVx45VXSo85Ja4Z1H4+HTOzWjkJmVmtnITMrFZOQmZWKychM6uVk5CZ1arjJCRpLUn3ltmY\nptgTJZ0yxGMek/Sm/PgfVbTLzMrnnpCZ1apoElpW0jmS7pd0vqQVJR0v6Q5J90qaJKUpnpKOlHSf\npOmSfpW3vUHSTyXdLuluSXs0xV5D0vWSHpR0QmOjpIsl3SlppqRDCrbfzGpWNAltAJwaEe8CXgQO\nB06JiAkRMQ4YBeyW9/0KsEVEbAocmrd9Fbg2IrYCdgJOktS4l2IrYC9gU2AfSePz9k9HxLuB8cCR\nkt442MZKOkTSVElTX3qu/BmiZjZ0RZPQExFxc358NrAdsJOkKZJmADsDG+fXpwPnSPok0JjX/wHg\nK5KmAdcDI4E182u/j4hnI+Jl4MIcG1LiuQe4DVgDWG+wjY2ISRExPiLGr7hKNdPfzWxoit471v+u\ntwBOBcZHxBOSTiQlFoCPAjsAuwNflbQJIGCviJjdHETS1q1iS9oReD+wbUS8JOn6pvhm1oOK9oTW\nlLRtfrw/8If8+BlJo4G9ASQtA6wREdcB/wqMBUYDVwJfaBo32qIp9q6SVpU0CtgTuDkf91xOQBsC\n2xRsv5nVrGhPaDZwhKSfAvcBpwGrAPcCfwHuyPuNAM6WNJbU+zk5Ip6X9A3gv4HpOVE9yqIxpNuB\nC4C3A2dHxNR8ineopPvze99WsP1mVjMN1xVY/2njVcOlPFzKA3ApD6or5fFi/H3AekKeJ2RmtXIS\nMrNaOQmZWa2chMysVsO2xvRL98G0LQbeb6gOmPXH8oMC57xrjfKDqqLPoKqKsRvx6qvVBK5i0H+Q\n4/3uCZlZrZyEzKxWTkJmVisnITOrlZOQmdXKScjMauUkZGa1qiwJlV0IPxe/X73p+Y8lbVRWfDOr\nRy9NVpxIKhHyJEBEfLbW1phZKao+HWtVCH+XXNR+Ri5yv4KkCZIuBJC0h6SXJS0vaaSkRyTtTaop\nfY6kaZJG5SL44/MxH5B0q6S7JJ2XC6qZWQ+oOgn1L4T/RWAysF9EbELqiR0G3A1sno/ZntTjmQBs\nDUyJiPOBqcABEbF5rjsNQF5r7N+B90fElnm/L7ZqTHOh+3m40L1ZN6g6CfUvhL8L8GhEPJC3nQns\nEBHzgYclvYu0ysb3SfWotwduGuA9tgE2Am7OBfMPBt7RasfmQvfL4UL3Zt2g6jGh/mUbnwfaLdFz\nI/BhYB5wNanHNAI4doD3EGlljk903kwzq0vVPaH+hfCnAmtJWjdvOxC4IT++CTgauDUiniYlqw1I\np2YAc4CVWrzHbcB7GzHzgorrl/6dmFklqk5CjUL495MK4P8X8CngvFy0fiFwet53CrAaqUcEaZ2y\nGbGoCPZk4PTGwHTjDXLCmgj8UtJ04FZgwyq/KTMrz7AtdD9Gq8bWKr/Q/QGz/lR6TOixekKxsJq4\nVbW3hwrdV1arqYKf7ZQFV7nQvZl1PychM6uVk5CZ1cpJyMxq5
SRkZrXqpRtYyyWh5ZYvPWwlV7GA\n7z96S+kxv7jWtgPv1E2ih65iVUQjqlkKu7KluwfBPSEzq5WTkJnVyknIzGrlJGRmtXISMrNaOQmZ\nWa2chMysVl2VhCSdKOmYDo7bUdJlVbTJzKrVVUnIzIafWpOQpIMkTZd0j6Sz+r22uaTb8usXSVol\nb19X0tX5mLskrdPvuAl5NY8+282sO9WWhCRtTFolY+eI2Aw4qt8uPwf+NSI2BWYAJ+Tt5wA/zMe8\nB3iqKeZ7SJUa94iIh1u856LVNmJu6d+TmQ1dnT2hnYHzIuIZgIj4e+MFSWOBlSOiUX/6TGAHSSsB\nb4uIi/IxcyPipbzPu4BJwO4R8XirN+yz2oZGVvNdmdmQvJ7GhJ4C5gJb1N0QMxu8OpPQtcA+kt4I\nIGnVxgsR8QLwnKTt86YDgRsiYg7wJ0l75mNWkLRi3ud54KPAf0jacSl9D2ZWUG2lPCJipqRvATdI\nWkBahfWxpl0OJq2usSLwCGmVDkgJ6UeSvk5ao2yfpph/lbQb8DtJn46IKUvhWzGzAmqtJxQRZ5LG\ne1q9No20umr/7Q+SxpOaPQJcn19/HNi41IaaWWVeT2NCZtaDnITMrFZOQmZWKychM6vV8C10H0HM\ne7X8uBUt01tFUfovPTSz9JgA31t/00riVlfkfV4lcatYWjkWVrNsu5atIBUMsna+e0JmVisnITOr\nlZOQmdXKScjMauUkZGa1chIys1o5CZlZrZyEzKxWdZZ3XUvSvRXEnSxp77Ljmlk13BMys1rVnYRG\nSDpD0kxJV0kaJelzku7Iq2lc0KicmHs4J0u6RdIjjd6OklMkzZZ0NfCWWr8jMxuSupPQeqSVMzYm\nlWfdC7gwIibk1TTuBz7TtP9bge2A3YD/zNv+GdgA2Ag4iLQCR0t9VtvgldK/GTMburpvYH00V1AE\nuBNYCxgn6ZvAysBo4Mqm/S+OiIXAfZJWy9t2AH4ZEQuAJyVd2+7NImISaUUOxmjVau4ENLMhqbsn\n1NwdWUBKipOBf4mITYCvASPb7F/N7epmtlTVnYRaWQl4StJywAGD2P9GYD9JIyS9Fdip0taZWanq\nPh1r5ThgCvB0/rrSAPtfRCp8fx/wOHBrpa0zs1LVueTPY8C4puffbXr5tBb7T+z3fHT+GsC/VNJI\nM6tcN56Omdkw4iRkZrVyEjKzWjkJmVmtuvHqWE+rbEWIBQtKj1nVqhifmfVQJXF/ssE7K4lLVLSC\nxYjemcpWySoegwzpnpCZ1cpJyMxq5SRkZrVyEjKzWjkJmVmtnITMrFZOQmZWKychM6tVVyehoa7I\nIWlPSRtV2SYzK1dXJ6EO7EmqNW1mPaIXktCgVuSQ9B7gY8BJkqZJWqfuhpvZwHohCQ1qRY6IuAX4\nDXBsRGweEQ/3D+TVNsy6Ty/cwDrUFTna8mobZt2nF3pCQ12Rw8x6SC8koVbarcgxh4EL45tZF+nV\nJNRYkeNmYFbT9l8Bx0q62wPTZr2hq8eEOliR42Z8id6sp/RqT8jMXiechMysVk5CZlYrJyEzq1VX\nD0z3oipWxUiBq5hbubCCmPCT9deuJO6RD91fSdyT192wkrhViPnzK4m7zMjyp9pp7uBWG3FPyMxq\n5SRkZrVyEjKzWjkJmVmtnITMrFZOQmZWKychM6vVkJKQpH/kr6tLOn+w+7fYPqiC9JLGSzp5KG00\ns97SUU8oIp6MiL0LvO+gCtJHxNSIOLLA+5hZl+soCTUvxZOLzJ8r6T5JF0maIml8077fygXpb5O0\nWquC9JKul/RtSbdLekDS9vnYHSVdlh+fKOmned9HJB3Z9B7HSZot6Q+SfinpmCI/FDNbesoYEzoc\neC4iNiIVG3t302tvAG7LBelvBD63hIL0y0bEVsDRwAlt3mtD4IPAVsAJkpaTNIFU/H4z4MPA+DbH\nutC9WRcqIwltR6poSETcC
0xveu1V4LL8uFGkvp0LB7Hf5RHxSkQ8A/wNWA14L3BJRMyNiDnApe3e\nICImRcT4iBi/HCss8Zsys6Wj6qtj8yJeu/OyUaS+nVcGsV+rovdm1sPKSEI3A/sC5CtemwzimDIL\n0t8M7C5ppKTRwG4lxTWzpaCMJHQq8GZJ9wHfBGYCLwxwTGkF6SPiDtIY03Tgd8CMQby/mXUJRcE6\nNZJGAMtFxNycUK4GNoiIV8to4CDbMDoi/iFpRdIA+CERcdeSjhmjVWNr7VJFY8qPCdXUE+qltgJH\nPjRr4J06UFU9IS1b/mhBL9UTum3ub3lh4bMD/pKV8VNaEbgurwEm4PClmYCySflUcCRw5kAJyMy6\nR+EklK9Itb0svjRExP51vr+Zdc73jplZrZyEzKxWTkJmVitP9itbRVeGDE5ef+NK4l755J2VxP3g\n6ptXErcKC+fOLT3mYK+8uydkZrVyEjKzWjkJmVmtnITMrFZOQmZWKychM6uVk5CZ1ar0JNRuhY2m\n15vrUzfXkP6YpK+U3R4z625dM1kxIn5DqgtkZsNIZadjSk6SdK+kGZL2G2D/iZJOyY8nSzo9F6V/\nQNJueftIST/L8e6WtFPTsZfklTgelNSuUL6ZdZkqe0IfBzYnrYLxJuAOSTcO4fi1SKtqrEOqV7Qu\ncAQQEbGJpA2BqyStn/ffChgHvJTf6/KImNocUNIhwCEAI1mx42/MzMpT5cD0dsAvI2JBRPwVuAGY\nMITjz42IhRHxIPAIabmf7YCzASJiFvBHoJGEfh8Rz0bEy6SVO7brH9CrbZh1n26+Otb/7reB7oYb\n6v5m1gWqTEI3AftJGiHpzcAOwO1DOH4fScvkutXvBGbnmAcA5NOwNfN2gF0lrSppFGmZ6ZtL+j7M\nrEJVjgldBGwL3EPqlXw5Iv4iaa1BHv84KWmNAQ7NhfRPBU6TNAOYD0yMiFeUCrbfDlwAvB04u/94\nkJl1p9KTUESMzl8DODb/a379MdIAMhFxPXB9fjwZmNy069URcWi/Y+cCn2rz1n+KiD0LNt/MlrJu\nHhMys2GgayYrNouIiUPcfzJ9e1Fm1iPcEzKzWjkJmVmtnITMrFZdOSZkS0mvrQwSCysJW9WqGN94\n9I7SYx639lBuOhgCDbhk/NAN8tfLPSEzq5WTkJnVyknIzGrlJGRmtXISMrNaOQmZWa2chMysVl2X\nhHK96NUHsd/XJb1/abTJzKrTjZMVJwL3Ak8uaaeIOH6ptMbMKjVgTyivEzYrr4DxgKRzJL1f0s15\nZYutckXDiyVNl3SbpE3zsSdKOqYp1r053lqS7pd0hqSZkq6SNErS3sB44BxJ0/K24yXdkY+dpFzB\nLLdn7/z4MUlfk3RXXoljw2p+XGZWtsGejq0LfI9UbH5DYH9SIfljgH8DvgbcHRGb5uc/H0TM9YAf\nRsTGwPPAXhFxPjAVOCAiNs9F60+JiAkRMQ4YBezWJt4zEbElcFpu12IkHZKXEZo6j1cG9Y2bWbUG\nm4QejYgZEbEQmAlckysnziAtzbMdcBZARFwLvFHSmEHEnJYf35njtLKTpCm5pOvOwMZt9rtwoFhe\nbcOs+wx2TKi527Cw6fnCHGNem+Pm0zfRjWwTcwGpl9OHpJHAqcD4iHhC0on9YrRq4wK6c6zLzFoo\n6+pY8yoYO5JOjV4EHgO2zNu3BNYeRKw5wEr5cSPhPCNpNLB3Se01sy5RVo/hROCnkqaTVkA9OG+/\nADhI0kxgCvDAIGJNBk6X9DJptY4zSFfL/gKUXxvBzGql6LWaMiUZo1Vja+1SdzNsKKqoeQOV1VUa\n7vWEpiy8mhfj7wMG7rrJimY2vDgJmVmtnITMrFZOQmZWq2E7n0YjRjBizNjS4y544cXSYwKgCj4v\nFi4oPyZUNoCsESMqiRsLqvk5VDGIfNiDD5UeE+C09datJO5guCdkZrVyEjKzWjkJmVmtnIT
MrFZO\nQmZWKychM6uVk5CZ1cpJyMxq1XVJSNKPJW1UdzvMbOnouhnTEfHZuttgZktPbT2hplU8zskrb5wv\naUVJ10saL+ljecWNaZJmS3o0b29smyEpcqzP5RU57pF0gaQV6/q+zGxo6j4d2wA4NSLeBbwIHN54\nISJ+k1fc2By4B/huRExt2nYF8N28+4V5RY7NgPuBz7R6s+bVNl6Nl6v8vsxskOo+HXsiIm7Oj88G\njuy/g6QvAy9HxA+btu1Hql39gbxpnKRvAisDo4ErW71ZREwCJgGMXfbNw7OkpFmXqTsJ9U8EfZ7n\nZZ73AXZo2jaOVNN6h4ho3P48GdgzIu6RNBHYsZrmmlnZ6j4dW1PStvnx/sAfGi9IegfwQ2CfvAgi\nklYGfgkcFBFPN8VZCXhK0nLkVT/MrDfUnYRmA0dIuh9YhbR6asNE4I3AxXkg+rfAHsA7gDMaA9R5\n3+NIq3ncDMxaWo03s+LqPh2bHxGf7Ldtx/x1Kml56f7O7L8hIk6jbwIzsx5Rd0/IzIa52npCEfEY\nMK6u9zez7uCekJnVyknIzGpV98B0bWLBgmpWxqhiVQyoZmWMHltWuTI91N6qVsX4yeN/GHinIdrt\nI3MGtZ97QmZWKychM6uVk5CZ1cpJyMxq5SRkZrVyEjKzWjkJmVmteiIJSTraJVvNXp96IgkBRwND\nSkKSRlTUFjMrUdclIUlvkHR5Llp/r6QTgNWB6yRdl/c5LdeKninpa03HPibp25LuIlVkNLMu1423\nbXwIeDIiPgogaSzwKWCniHgm7/PViPh77u1cI2nTiJieX3s2IrZsFVjSIcAhACOH1rEys4p0XU8I\nmAHsmns020fECy322Tf3du4GNgaaF0v8dbvAETEpIsZHxPjlWKHcVptZR7quJxQRD0jaEvgI8E1J\n1zS/Lmlt4BhgQkQ8J2kyMLJpl/9dao01s8K6rickaXXgpYg4GziJtLTPHFIxe4AxpETzgqTVgA/X\n0lAzK0XX9YSATYCTJC0E5gGHAdsCV0h6MiJ2knQ3qaD9E6Ti9mbWo7ouCUXElSy+eOFU4AdN+0xs\nc+xalTXMzCrRdadjZja8OAmZWa2chMysVk5CZlYrJyEzq5Wih1YaKJOkp4E/DnL3NwHPDLjX0FQR\n03Gri+m4Q4/5joh480A7DdskNBSSpkbE+G6P6bjVxXTc6mL6dMzMauUkZGa1chIanEk9EtNxq4vp\nuBXF9JiQmdXKPSEzq5WTkJnVyknIzGrlJGRmteq6ekKvZ3nttC8Ba0bE5yStB2wQEZcVjLt2RDza\nb9uEiLijw3gfX9LrEXFhJ3Gb4rdq72LbukUuN7wdEMDNEXFXzU1aaiQtA+wdEedW9h6+OtaXpDmk\nX7bFXgIiIsYUiP1r4E7goIgYl5PSLRGxeacxc9y7gN0j4s/5+fuAUyJikw7j/Sw/fAvwHuDa/Hyn\n3N7dira3/4ooku6MiHcXiPnFFptfAO6MiGkF4h5PWj6qkXj3BM6LiG92GjPHXR84FngHTZ2BiNi5\nYNyRwGdIC0C8Vns9Ij5dIGYlM7ob3BPqJyJWGnivjq0TEftJ+kR+r5ckqYS4nwculrQ7qSb3f5AW\nCuhIRHwKQNJVwEYR8VR+/lZgcqdxJW1I+uMY26+3NYa+ixV0Ynz+d2l+vhswHThU0nkR8Z0O4x4A\nbBYRcwEk/ScwDSiUhIDzgNOBM4AFBWM1O4tU+viDwNdJ7b+/YMyrJR1DWsnmtYUkIuLvBeMCTkKL\nkbTqkl4v+IN/VdIock9L0jrAKwXiNdp0h6QjgauAucD7I+LponGBNRoJKPsrsGaBeBuQksPKwO5N\n2+cAnysQF+DtwJYR8Q+AvGjm5cAOpN5np0noSVKCnJufrwD8uVhTAZgfEaeVEKe/dSNiH0l7RMSZ\nkn4B3FQw5n756xFN2wJ4Z8G4gJNQK3eSfsCteihFf/A
nAFcAa0g6B3gvMLHTYJIupe+p44qkU5Cf\nSCIiPlagrZAWlrwS+GV+vh9wdafBIuIS4BJJ20bErQXb1t9b6JvQ5wGrRcTLkook+heAmZJ+T/pZ\n7wrcLulkgIg4ssO4l0o6HLioud0l9C7m5a/PSxoH/IX0s+lYRKxdsE1L5DGhpUzSG4FtSEnutqZV\nZTuJ9b4lvR4RN3Qau+k9Pg5sn5/eGBEXlRCzinGL44B/Bi7Jm3YHfgN8D5gUEQd0GPfgJb0eEWd2\nGLfVIHxERKHehaTPAheQVq2ZDIwGjo+I0wvGHUdaZLT5/+vnRWK+FttJqD1JqwDr0fcHf2MHcTaM\niFn5Kstiil5tyQtCPtU0bjGK1At4rEjcqkg6jzRusT9N4xYRcVTBuBNIA+mQrmJNLdTQRXGXB9bP\nT2dHxLwl7f96k09tdyQlod+S1vr7Q0TsXUp8J6HW8ifKUaSxhmmk3sutnVy9kDQpIg6RdF2Ll6OE\nKyJTgfdExKv5+fKkP8IJBeN+HPg2qTsvSrhCmOPeHRFbSJoeEZtKWg64KSK2KRh3BLAafa82PV4w\n5o7AmcBjpO9/DeDgTj6McrydI+LadtMgSpj+sIC0aOj/jfzH3epq5BBjzgA2A+6OiM3yoqNnR8Su\nRdra4DGh9o4CJpBOmXbKV3b+XyeBIuKQ/PDDjd5KQz41KWrZRgLK7/dqTkRFfYd06b/o1ZX+Sh+3\nkPQF0pjbX0lXm0Qaw9m0SFzS6dwHImJ2fp/1SWNknU4neB9pysPuLV4LFk0F6NRM0iTkqyTtl8eY\nil6BfTkiFkqaL2kM8DdSMi6Fk1B7cyNiriQkrZBPpzYoGPMW0iX0gbYN1dOSPhYRvwGQtAfllPX8\nawUJCGBSPtX9d9K4zWjguIIxjyJN/Hy2aOP6Wa6RgAAi4oHcc+tIRJyQv36qjMa1MD8ivixpP+Am\nSQfRet7bUEyVtDJpOsGdwD+A0i4sOAm196f8g78Y+L2k5xh8Teo+JP0T8DZglKQtWPTJNIZ0Rauo\nQ4FzJP0wP38COLCEuFPzBMuL6XsFp+in9VnAXsBapFMdSKdRRTxBupJVtqmSfgycnZ8fQFoRuDBJ\nH2XxwfmvFw2b4/xa0kzgFxSbVkFEHJ4fni7pCmBMREwv1sxFPCY0CPkq1FjgiubTniEcfzDpUvx4\n+v4CvwicWcIfdeN9RgM05sqUEO9nLTZHkatYOe4V5NnMNE3Ui4jvFYj5E9I8pMvpmzC/33lLQdIK\npPkx2+VNNwGnRkSh+V2STid9AO0E/BjYG7g9Ij5TMO6WzRc6JI0F9ihyJUvSNRGxy0DbOo7vJNRe\n2fcMSfp4WQmnX9yxpPGQHfKmG4CvR0QVPYPCJN0bEeNKjnlCq+0R8bUSYi8PvAtYSLo6NuQPohYx\nG4Pyja+jgd9FxPYDHrzkuA8DJzVfkpd0WSe32uTxyhWB60hXx5p78FdExIZF2trg07E2Wtwz9LM8\n/b/IdP2TJO1F+jS9KSJmFm1n9lPgXmDf/PxA4GfAEm9EHUgV83myWyRtEhEzCsZ5TRnJppV8ynQ6\n8DDpj3BtSZ+PiN8VDP1y/vqSpNWBZ4G3FowJadB/J0lbA5/PCfNtHcb6PHA0sDqp19oY7J8D/KCE\ntiYR4X8t/gGzgZFNz0eRPgWLxFyB1Fv5Kmm+xcPARSW0ddpgtnUQ9zzgG7mdB5NuC/mfAvFmkO7n\nuo/0xzI7P58BTO8w5n/nr5eSBrn7/CvhZzCLdCtE4/k6wKwS4h5Hun1lL9LVwaeAb5QQ96789cvA\nFNJ40F0FYx5PGgdqtPsi0i0yhdra+OeeUHtV3DO0gPTHt4DUtf9b/lfUy5K2i4g/AEh6L4s+aYso\n+z6kQnfft3FW/vrdCmIDzImIh5qeP0LqCRT1nUjjShdIuoy+v2tFNAamv6NUXeEqYIn3Qw7C3hHx\ndUnbATuTftanAVs
XjAv4dGwxkn5A6nK2vGeoYPgXSZ/63wfOiPIuJx8GnJnHhgT8nQL3pDUpdT5P\nRHR0dXGAmHfmr4VvUWljqqTfAueSfg/2Ae5oTDaMzsf4biVPzcjJ6JWcNIpO1zi+8SAirpb0QVIv\ntojGxYOPkn5vL5dUtIrAazww3U9V9wrl2HuQBrq3Al4lzRG6MSKu6TRmv/hjchtfLCleq/uQjouI\nH5URv0y593cii+rzNGZ3F70Xq9UVwoaIIY6PNU3XOJt020rzYO/p0eFgb5W3BuWe2p9JH8RbknrZ\nt0fEZp3G7BPfSai9fA/WmtE0Wa2kuBuS7r85GnhLRIzqME6rQl6viXIuTzfm8zQm6EUUn8tSOkmz\ngP/D4pf9y568WMgSpmvMASZ32rPqd2tQ8x91Ixl3fGuQUvG9DwEzIuJBpbpSm0TEVZ3G7BPfSag1\npQJh3wWWj4i1JW1OuuzdcXkMSReQ7sF5mHyFDJgS/W7lGEK8xmXpVqVHCieLKubzVEXSlIgoZYwi\nx/tyHldpnJ73EZ2X8GjE3ysiLigSo03cUcDhLJpachNwWqe/Y0uDk1Abku4kDcJdHxFb5G2F5rdI\nGk+6CbDMSnpIOhM4KiKez89XAb431FOFFnFLn89TFaWKhyNIUyqaJyt2dBoiafeIuLTd6XmR0/Ic\nv7mX2XzDbdEPjnNJY4/n5E37A2MjYt/2R9XLA9PtzYuIF9S3+urCgjHvAY6Q1Dyp8PQoXhpi00YC\nAoiI5/LtIUWVPp+nQo1eUHMt5CB9kAxZRFyavxZKNktwCYt6mYWrazYZFxEbNT2/TtJ9JcYvnZNQ\nezMl7Q+MUFoV40jSQHIRp5HGVk7Nzw/M2z5bMO4yklaJiOfgtRK1Hf/fKpVuiBzjU5IeIf2hNMYX\nit6ZXoXPRMQjzRskdTworcWrVvZR5LQ8e3tEfKhgjFbukrRNRNwGkCctlnKvW1WchNr7AmlS4Suk\n0g1XkibuFTGh3xWFayXdUzAmpHITtyoVC4N0GflbBeJVMZ+nauez+OXt8+i85EZV844aquplvjvH\nbtRRWhOY3fhg6cYPEI8JDYJSsaw3FL30neeB7BMRD+fn7wTOjwIFp5pib8SiU49rI6Kru+Bl0aIV\nPL5DWkKnYQxwbERsXEvDBpBPkdYFHqXEXqakdyzp9SrmahXlnlAbeXbwoaSrQncAYyT9T0ScVCDs\nsaRz9MZpw1pAKXVlctIZFomnn0pW8Gg6JV3sJcrpUXy44PEtdWOSGYh7Qm1ImhYRm0s6gNTN/wpp\nIb2Of/nU1INqAAADW0lEQVTyDaFfAnYBniclt//q5sunvUIlr+BRdY9CUssaP1GwHG0vck+oveWU\nKujtSVrNdJ6kohn756TLp42xpf1J9z7tUzCuwUOS/o3FL3l3NE2hOcko1VRu1Ou+PSLKuN/vchbN\n7xoJrE26obcrTx+r5CTU3o9Ixc3vAW7Mn4xFb4foucunPeQS0sS8qylxRVNJ+5IKx19PShg/kHRs\nRJxfJG70W6I7325xeJvdX9d8OjYEkpaNiPkFjj+b1Ktqvnx6REQcVFYbh6vG6XMFce8Bdm30fiS9\nGbi6rPum+r3XjP7JaThwT2gJ1KIGMGmdrKHGaQxyLseiy6dButlyVglNNbhM0kci4rclx12m3+nX\ns6TVLArpd9/fMqRxxyeLxu1FTkJtqE0N4A7D9eK8m54gaQ6Lxlb+TWnJ53lQzhppwBVafCnsMhLd\nSk2P55PGiEq/l6wX+HSsDVVUA9h6j1JJ3vfmpzdFCUthN8UudXGCXuSeUHtV1QC2CrSpo/MC8Mci\n43gA+W73UnspuUjcWeSqh5KeIa3sem+Z79MLnITau0xp3bHvkG4yhHRaZt3pVNK4SuM2iE1Ixf/H\nSjpsqLVvmk7zFnuJck7zJgFfjIjr8vvtmLe9p2DcnuPTsTZyXZbDgO3pkbosw5mkC
0lVH2fm5xuR\nLiJ8GbiwiitnRUi6p/8VtlbbhgP3hNo7kzT1/+T8fH/SZMOurcsyzK0fTUsoRcR9ueTpI/3KsXSL\nRyQdx6JC/Z8kFdEfdpyE2vPEwt4yU9JpwK/y8/2A+3LxsKL1mqrwaeBrLBpruomS7iPsNYXnO7yO\n3SVpm8aTXqjLMsxNBB4i1e0+mtSrmEheDLC2VrW3DrAG6W9wedL9hDfW2qKaeEyon34TCzcA+kws\n7Nc7MuuIpNnAMaTB89cqdvbiXfBF+XRscZ5Y2EMknRsR+7YrvdGNRbyypxslZIc794Ssp0l6a0Q8\n1a70Rrf2LCTtAnwCuIa+hfk7XUyxZzkJmdUg38y8ITCTRadjUXSFlF7kJGQ9bSlMKqyEpNkRsUHd\n7egGHhOynhYRKw28V1e6RdJGw6UW+JK4J2RWA0n3ky7Tl1rovhc5CZnVoNcG0qvkJGRmtfKMaTOr\nlZOQmdXKScjMauUkZGa1+v+jsl+6fC1KDgAAAABJRU5ErkJggg==\n", 294 | "text/plain": [ 295 | "" 296 | ] 297 | }, 298 | "metadata": {}, 299 | "output_type": "display_data" 300 | } 301 | ], 302 | "source": [ 303 | "from sklearn.metrics import confusion_matrix\n", 304 | "\n", 305 | "y_pred = np.argmax(model.predict(x_test, verbose=1), axis=1)\n", 306 | "confusion_matrix = confusion_matrix(y_test.astype(int), y_pred)\n", 307 | "\n", 308 | "print()\n", 309 | "print(confusion_matrix)\n", 310 | "\n", 311 | "plt.imshow(confusion_matrix, interpolation='nearest')\n", 312 | "plt.xticks(np.arange(0,num_classes), LABELS, rotation=90)\n", 313 | "plt.yticks(np.arange(0,num_classes), LABELS)\n", 314 | "\n", 315 | "plt.show()" 316 | ] 317 | } 318 | ], 319 | "metadata": { 320 | "kernelspec": { 321 | "display_name": "Python 2", 322 | "language": "python", 323 | "name": "python2" 324 | }, 325 | "language_info": { 326 | "codemirror_mode": { 327 | "name": "ipython", 328 | "version": 2 329 | }, 330 | "file_extension": ".py", 331 | "mimetype": "text/x-python", 332 | "name": "python", 333 | "nbconvert_exporter": "python", 334 | "pygments_lexer": "ipython2", 335 | "version": "2.7.12" 336 | } 337 | }, 338 | "nbformat": 4, 339 | "nbformat_minor": 2 340 | } 341 | -------------------------------------------------------------------------------- /web/.gitignore: -------------------------------------------------------------------------------- 1 | venv 2 | *.pyc 
-------------------------------------------------------------------------------- /web/Procfile: -------------------------------------------------------------------------------- 1 | web: gunicorn app:app --log-file=- -------------------------------------------------------------------------------- /web/app.py: -------------------------------------------------------------------------------- 1 | from flask import Flask, render_template, request 2 | import os 3 | import keras 4 | import numpy as np 5 | app = Flask(__name__) 6 | 7 | model_path = os.path.join(os.getcwd(), "..", "model/model-v1.h5") 8 | model = keras.models.load_model(model_path) 9 | 10 | @app.route("/") 11 | def hello(): 12 |     return render_template('index.html') 13 | 14 | @app.route('/predict', methods=['POST']) 15 | def predict(): 16 |     # Parse 784 comma-separated pixel values into a (1, 28, 28, 1) batch. 17 |     img = np.array(request.form['img'].split(","), np.float32).reshape(1, 28, 28, 1) 18 |     img /= 255 19 |     result = model.predict(img)[0] 20 |     # Return the class probabilities as a plain comma-separated string; 21 |     # passing model output through a template engine is unnecessary and unsafe. 22 |     return ",".join(result.astype(str)) 23 |     # return "0.0985102057457,0.0936895906925,0.0623816810548,0.102438829839,0.108508199453,0.180629864335,0.0768260806799,0.0812399238348,0.0883773490787,0.107398249209" 24 | 25 | if __name__ == '__main__': 26 |     app.run(debug=True, use_reloader=True) 27 | -------------------------------------------------------------------------------- /web/requirement.txt: -------------------------------------------------------------------------------- 1 | Flask 2 | gunicorn 3 | tensorflow 4 | keras 5 | h5py 6 | numpy 7 | -------------------------------------------------------------------------------- /web/runtime.txt: -------------------------------------------------------------------------------- 1 | python-2.7.13 -------------------------------------------------------------------------------- /web/static/sketch.min.js: -------------------------------------------------------------------------------- 1 | /* Copyright (C) 2013 Justin Windle, 
http://soulwire.co.uk */ 2 | !function(e,t){"object"==typeof exports?module.exports=t(e,e.document):"function"==typeof define&&define.amd?define(function(){return t(e,e.document)}):e.Sketch=t(e,e.document)}("undefined"!=typeof window?window:this,function(e,t){"use strict";function n(e){return"[object Array]"==Object.prototype.toString.call(e)}function o(e){return"function"==typeof e}function r(e){return"number"==typeof e}function i(e){return"string"==typeof e}function u(e){return C[e]||String.fromCharCode(e)}function a(e,t,n){for(var o in t)!n&&o in e||(e[o]=t[o]);return e}function c(e,t){return function(){e.apply(t,arguments)}}function l(e){var t={};for(var n in e)"webkitMovementX"!==n&&"webkitMovementY"!==n&&(o(e[n])?t[n]=c(e[n],e):t[n]=e[n]);return t}function s(e){function t(t){o(t)&&t.apply(e,[].splice.call(arguments,1))}function n(e){for(_=0;_ 2 | 3 | 4 | 5 | 6 | Quickdraw Clasification 7 | 8 | 9 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | Quickdraw Classification 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | 42 | 43 | 44 | Quickdraw 10 classes classification 45 | 46 | 47 | 48 | 49 | Clear 50 | 51 | 52 | Prediction result 53 | 54 | 55 | 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 67 | 231 | 232 | --------------------------------------------------------------------------------
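Supplementary note (not a file in the repository): the preprocessing that `src/quickdraw-classification.ipynb` applies before training can be reproduced with NumPy alone. The bitmap row below is a hypothetical stand-in for a real row from the `.npy` files fetched by `data-download.sh`, and the one-hot step mirrors what `keras.utils.to_categorical` produces for 10 classes:

```python
import numpy as np

img_rows, img_cols = 28, 28

# Stand-in for one flattened Quick, Draw! bitmap row (values 0-255);
# real rows come from the .npy files downloaded by data-download.sh.
raw = np.arange(img_rows * img_cols) % 256

# Scale to [0, 1] and reshape to the (batch, height, width, channels)
# layout expected by the Conv2D input layer.
x = raw.astype('float32') / 255
x = x.reshape(1, img_rows, img_cols, 1)

# One-hot encode a label with plain NumPy, matching
# keras.utils.to_categorical for num_classes = 10.
num_classes = 10
label = 3
onehot = np.eye(num_classes, dtype='float32')[label]
```

The `(batch, height, width, channels)` layout matches the `input_shape=(img_rows, img_cols, 1)` declared on the notebook's first `Conv2D` layer.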
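Supplementary note (not a file in the repository): `web/app.py` expects the POST form field `img` to contain 784 comma-separated grayscale values for a 28x28 sketch. A minimal sketch of building such a payload (a blank canvas stands in for a real drawing) and parsing it the same way the `/predict` handler does:

```python
import numpy as np

# Build the "img" form field the way the front-end canvas submits it:
# 784 comma-separated grayscale values for a 28x28 sketch.
pixels = np.zeros((28, 28), dtype=np.uint8)  # blank canvas as a stand-in
payload = ",".join(str(v) for v in pixels.flatten())

# Parse it the same way app.py's /predict handler does.
img = np.array(payload.split(","), np.float32).reshape(1, 28, 28, 1)
img /= 255
```

The parsed batch is then ready to pass to `model.predict`, which returns one probability per class.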