├── .gitignore ├── LICENSE ├── README.md ├── Spectral NET.ipynb ├── best-model.hdf5 ├── classification_report.txt ├── data ├── Indian_pines_corrected.mat ├── Indian_pines_gt.mat ├── PaviaU.mat ├── PaviaU_gt.mat ├── Salinas_corrected.mat └── Salinas_gt.mat ├── figure ├── Architecture.png ├── Architecture.svg ├── HSI-RN.jpg ├── IP-FC.jpg ├── IP-GT.jpg ├── IP-Pr.jpg ├── IP_legend.jpg ├── SA-FC.jpg ├── SA-GT.jpg ├── SA-Pr.jpg ├── SA_legend.jpg ├── UP-FC.jpg ├── UP-GT.jpg ├── UP-Pr.jpg └── UP_legend.jpg ├── predictions.jpg └── wavelet_cnn_0.5.png /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | wheels/ 23 | pip-wheel-metadata/ 24 | share/python-wheels/ 25 | *.egg-info/ 26 | .installed.cfg 27 | *.egg 28 | MANIFEST 29 | 30 | # PyInstaller 31 | # Usually these files are written by a python script from a template 32 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 33 | *.manifest 34 | *.spec 35 | 36 | # Installer logs 37 | pip-log.txt 38 | pip-delete-this-directory.txt 39 | 40 | # Unit test / coverage reports 41 | htmlcov/ 42 | .tox/ 43 | .nox/ 44 | .coverage 45 | .coverage.* 46 | .cache 47 | nosetests.xml 48 | coverage.xml 49 | *.cover 50 | *.py,cover 51 | .hypothesis/ 52 | .pytest_cache/ 53 | 54 | # Translations 55 | *.mo 56 | *.pot 57 | 58 | # Django stuff: 59 | *.log 60 | local_settings.py 61 | db.sqlite3 62 | db.sqlite3-journal 63 | 64 | # Flask stuff: 65 | instance/ 66 | .webassets-cache 67 | 68 | # Scrapy stuff: 69 | .scrapy 70 | 71 | # Sphinx documentation 72 | docs/_build/ 73 | 74 | # PyBuilder 75 | target/ 76 | 77 | # Jupyter Notebook 78 | .ipynb_checkpoints 79 | 80 | # IPython 81 | profile_default/ 82 | ipython_config.py 83 | 84 | # pyenv 85 | .python-version 86 | 87 | # pipenv 88 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control. 89 | # However, in case of collaboration, if having platform-specific dependencies or dependencies 90 | # having no cross-platform support, pipenv may install dependencies that don't work, or not 91 | # install all needed dependencies. 92 | #Pipfile.lock 93 | 94 | # PEP 582; used by e.g. 
github.com/David-OConnor/pyflow 95 | __pypackages__/ 96 | 97 | # Celery stuff 98 | celerybeat-schedule 99 | celerybeat.pid 100 | 101 | # SageMath parsed files 102 | *.sage.py 103 | 104 | # Environments 105 | .env 106 | .venv 107 | env/ 108 | venv/ 109 | ENV/ 110 | env.bak/ 111 | venv.bak/ 112 | 113 | # Spyder project settings 114 | .spyderproject 115 | .spyproject 116 | 117 | # Rope project settings 118 | .ropeproject 119 | 120 | # mkdocs documentation 121 | /site 122 | 123 | # mypy 124 | .mypy_cache/ 125 | .dmypy.json 126 | dmypy.json 127 | 128 | # Pyre type checker 129 | .pyre/ 130 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2021 eternal-vanguard 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # SpectralNET a 2D wavelet CNN for Hyperspectral Image Classification. 2 | [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) 3 | [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/spectralnet-exploring-spatial-spectral/hyperspectral-image-classification-on-indian)](https://paperswithcode.com/sota/hyperspectral-image-classification-on-indian?p=spectralnet-exploring-spatial-spectral) 4 | [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/spectralnet-exploring-spatial-spectral/hyperspectral-image-classification-on-pavia)](https://paperswithcode.com/sota/hyperspectral-image-classification-on-pavia?p=spectralnet-exploring-spatial-spectral) 5 | [![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/spectralnet-exploring-spatial-spectral/hyperspectral-image-classification-on-salinas)](https://paperswithcode.com/sota/hyperspectral-image-classification-on-salinas?p=spectralnet-exploring-spatial-spectral) 6 | 7 | ## Description 8 | Hyperspectral Image (HSI) classification using Convolutional 9 | Neural Networks (CNN) is widely found in the current 10 | literature. Approaches vary from using SVMs to 2D CNNs, 11 | 3D CNNs, 3D-2D CNNs, FuSENets. 
Besides 3D-2D CNNs and 12 | FuSENet, the other approaches do not consider both the spectral 13 | and spatial features together for the HSI classification task, thereby 14 | resulting in poor performance. 3D CNNs are computationally 15 | heavy and are not widely used, while 2D CNNs do not consider 16 | multi-resolution processing of images and limit themselves to 17 | the spatial features. Even though 3D-2D CNNs try to model the 18 | spectral and spatial features, their performance seems limited 19 | when applied over multiple datasets. In this article, we propose 20 | SpectralNET, a wavelet CNN, which is a variation of the 2D CNN 21 | for multi-resolution HSI classification. A wavelet CNN uses layers 22 | of wavelet transform to bring out spectral features. Computing 23 | a wavelet transform is lighter than computing a 3D CNN. The 24 | spectral features extracted are then connected to the 2D CNN, 25 | which brings out the spatial features, thereby creating a spatial-spectral 26 | feature vector for classification. Overall, a better model 27 | is achieved that can classify multi-resolution HSI data with 28 | high accuracy. Experiments performed with SpectralNET on 29 | benchmark datasets, i.e. Indian Pines, University of Pavia, and 30 | Salinas Scenes, confirm the superiority of the proposed SpectralNET 31 | with respect to the state-of-the-art methods. 32 | 33 | 34 | ## Model 35 | 36 | 37 | 38 | Fig: Proposed SpectralNet (Wavelet CNN) Model for hyperspectral image (HSI) classification. 39 | 40 | ## Prerequisites 41 | 42 | - [Anaconda 4.8.3](https://www.anaconda.com/download/#linux) 43 | - [Tensorflow 2.3.0](https://github.com/tensorflow/tensorflow/tree/r2.4) 44 | - [Keras 2.4.3](https://github.com/fchollet/keras) 45 | 46 | ## Results 47 | 48 | ### Salinas Scene (SS) dataset 49 | 50 | 51 | 52 | Fig. 4: The SA dataset classification result (Overall Accuracy 100%) of SpectralNet using 30% samples for training. (a) False color image. (b) Ground truth labels. (c) Classification map. 53 | 54 | 55 | 56 | ## Acknowledgement 57 | https://github.com/gokriznastic/HybridSN 58 | https://github.com/menon92/WaveletCNN 59 | 60 | 61 | ## License 62 | 63 | Copyright (c) 2021 Tanmay Chakraborty and Utkarsh Trehan. Released under the MIT License. See [LICENSE](LICENSE) for details.
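The wavelet transform described above is implemented in `Spectral NET.ipynb` by the `WaveletTransformAxisY`/`WaveletTransformAxisX` helpers inside the `Wavelet` Lambda layer. The snippet below is a minimal NumPy sketch of that Haar-style averaging/differencing step on a single 2-D band; the helper name `haar_level` and the standalone-array interface are illustrative assumptions, not part of the repository code.

```python
import numpy as np

def haar_level(img):
    """One Haar-style decomposition level of a 2-D array (even height/width).

    Mirrors the averaging / absolute-difference scheme used in the notebook:
    returns the LL (approximation), LH, HL, HH (detail) sub-bands, each half size.
    """
    # average / absolute difference of adjacent rows
    low_r = (img[0::2, :] + img[1::2, :]) / 2.0
    high_r = np.abs(img[0::2, :] - img[1::2, :])
    # average / absolute difference of adjacent columns
    ll = (low_r[:, 0::2] + low_r[:, 1::2]) / 2.0
    lh = np.abs(low_r[:, 0::2] - low_r[:, 1::2])
    hl = (high_r[:, 0::2] + high_r[:, 1::2]) / 2.0
    hh = np.abs(high_r[:, 0::2] - high_r[:, 1::2])
    return ll, lh, hl, hh

# A 24x24 patch (the window size used in the notebook) decomposes into 12x12
# sub-bands, matching the (None, 12, 12, 12) shape of the 'wavelet' layer
# reported in the model summary.
patch = np.random.rand(24, 24).astype(np.float32)
ll, lh, hl, hh = haar_level(patch)
print(ll.shape)  # (12, 12)
```

Repeating the same step on the LL sub-band yields the level-2, level-3, and level-4 inputs that the notebook's model feeds into its parallel convolutional branches before concatenation.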
64 | -------------------------------------------------------------------------------- /Spectral NET.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# SpectralNET Exploring Spatial Spectral Wavelet CNN for Hyper Spectral Image Classification\n", 8 | "\n", 9 | "**Authors:** Tanmay CHAKRABORTY and Utkarsh TREHAN" 10 | ] 11 | }, 12 | { 13 | "cell_type": "code", 14 | "execution_count": 3, 15 | "metadata": { 16 | "colab": { 17 | "base_uri": "https://localhost:8080/", 18 | "height": 50 19 | }, 20 | "executionInfo": { 21 | "elapsed": 5572, 22 | "status": "ok", 23 | "timestamp": 1602478409232, 24 | "user": { 25 | "displayName": "Tanmay Chakraborty", 26 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 27 | "userId": "10513402671331353489" 28 | }, 29 | "user_tz": -330 30 | }, 31 | "id": "r9imWZNCMoOM", 32 | "outputId": "e6ddc3e2-53c3-4e29-ceb6-5b303ac0b75b" 33 | }, 34 | "outputs": [ 35 | { 36 | "name": "stdout", 37 | "output_type": "stream", 38 | "text": [ 39 | "Collecting spectral\n", 40 | " Downloading spectral-0.22.1-py3-none-any.whl (212 kB)\n", 41 | "Requirement already satisfied: numpy in c:\\users\\utkarsh trehan\\anaconda3\\envs\\malis\\lib\\site-packages (from spectral) (1.19.2)\n", 42 | "Installing collected packages: spectral\n", 43 | "Successfully installed spectral-0.22.1\n" 44 | ] 45 | } 46 | ], 47 | "source": [ 48 | "import keras\n", 49 | "from keras.layers import Conv2D, Conv3D, Flatten, Dense, Reshape, BatchNormalization\n", 50 | "from keras.layers import Dropout, Input\n", 51 | "from tensorflow.keras.models import Model\n", 52 | "from keras.optimizers import Adam, SGD\n", 53 | "from keras.callbacks import ModelCheckpoint\n", 54 | "from keras.utils import np_utils\n", 55 | "from keras import backend as Kb\n", 56 | "from keras.layers import Lambda\n", 57 | "from keras.layers import Activation\n", 58 | "from keras.layers.merge import add, concatenate\n", 59 | "from keras.layers import AveragePooling2D\n", 60 | "from keras.utils import plot_model\n", 61 | " \n", 62 | "from sklearn.model_selection import train_test_split\n", 63 | "from sklearn.metrics import confusion_matrix, accuracy_score, classification_report, cohen_kappa_score\n", 64 | " \n", 65 | "from sklearn.decomposition import FactorAnalysis\n", 66 | "from sklearn.decomposition import PCA\n", 67 | "from operator import truediv\n", 68 | " \n", 69 | "from plotly.offline import init_notebook_mode\n", 70 | " \n", 71 | "import numpy as np\n", 72 | "import matplotlib.pyplot as plt\n", 73 | "import scipy.io as sio\n", 74 | "import os\n", 75 | "!pip install spectral\n", 76 | "import spectral" 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": 4, 82 | "metadata": { 83 | "executionInfo": { 84 | "elapsed": 5565, 85 | "status": "ok", 86 | "timestamp": 1602478409233, 87 | "user": { 88 | "displayName": "Tanmay Chakraborty", 89 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 90 | "userId": "10513402671331353489" 91 | }, 92 | "user_tz": -330 93 | }, 94 | "id": "HC83Bv1IPQfc" 95 | }, 96 | "outputs": [], 97 | "source": [ 98 | "def applyFA(X, numComponents=75):\n", 99 | " newX = np.reshape(X, (-1, X.shape[2]))\n", 100 | " fa = FactorAnalysis(n_components=numComponents, random_state=0)\n", 101 | " newX = fa.fit_transform(newX)\n", 102 | " newX = np.reshape(newX, 
(X.shape[0],X.shape[1], numComponents))\n", 103 | " return newX, fa\n", 104 | " " 105 | ] 106 | }, 107 | { 108 | "cell_type": "code", 109 | "execution_count": 5, 110 | "metadata": { 111 | "executionInfo": { 112 | "elapsed": 5559, 113 | "status": "ok", 114 | "timestamp": 1602478409234, 115 | "user": { 116 | "displayName": "Tanmay Chakraborty", 117 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 118 | "userId": "10513402671331353489" 119 | }, 120 | "user_tz": -330 121 | }, 122 | "id": "UGivdxXN1pCh" 123 | }, 124 | "outputs": [], 125 | "source": [ 126 | "# def applyPCA(X, numComponents=75):\n", 127 | "# newX = np.reshape(X, (-1, X.shape[2]))\n", 128 | "# pca = PCA(n_components=numComponents, whiten=True)\n", 129 | "# newX = pca.fit_transform(newX)\n", 130 | "# newX = np.reshape(newX, (X.shape[0],X.shape[1], numComponents))\n", 131 | "# return newX, pca" 132 | ] 133 | }, 134 | { 135 | "cell_type": "code", 136 | "execution_count": 6, 137 | "metadata": { 138 | "executionInfo": { 139 | "elapsed": 5555, 140 | "status": "ok", 141 | "timestamp": 1602478409235, 142 | "user": { 143 | "displayName": "Tanmay Chakraborty", 144 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 145 | "userId": "10513402671331353489" 146 | }, 147 | "user_tz": -330 148 | }, 149 | "id": "jEBfhIyVNHRQ" 150 | }, 151 | "outputs": [], 152 | "source": [ 153 | "## GLOBAL VARIABLES\n", 154 | "dataset = 'SA'\n", 155 | "test_ratio = 0.9\n", 156 | "windowSize = 24" 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": 7, 162 | "metadata": { 163 | "executionInfo": { 164 | "elapsed": 5550, 165 | "status": "ok", 166 | "timestamp": 1602478409236, 167 | "user": { 168 | "displayName": "Tanmay Chakraborty", 169 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 170 | "userId": "10513402671331353489" 171 | }, 172 | "user_tz": -330 173 | }, 174 | "id": "7wXNSkhfM3gs" 175 | }, 176 | "outputs": [], 177 | "source": [ 178 | "def loadData(name):\n", 179 | " data_path = os.path.join(os.getcwd(),'data')\n", 180 | " if name == 'IP':\n", 181 | " data = sio.loadmat(os.path.join(data_path, 'Indian_pines_corrected.mat'))['indian_pines_corrected']\n", 182 | " labels = sio.loadmat(os.path.join(data_path, 'Indian_pines_gt.mat'))['indian_pines_gt']\n", 183 | " elif name == 'SA':\n", 184 | " data = sio.loadmat(os.path.join(data_path, 'Salinas_corrected.mat'))['salinas_corrected']\n", 185 | " labels = sio.loadmat(os.path.join(data_path, 'Salinas_gt.mat'))['salinas_gt']\n", 186 | " elif name == 'PU':\n", 187 | " data = sio.loadmat(os.path.join(data_path, 'PaviaU.mat'))['paviaU']\n", 188 | " labels = sio.loadmat(os.path.join(data_path, 'PaviaU_gt.mat'))['paviaU_gt']\n", 189 | " \n", 190 | " return data, labels" 191 | ] 192 | }, 193 | { 194 | "cell_type": "code", 195 | "execution_count": 8, 196 | "metadata": { 197 | "executionInfo": { 198 | "elapsed": 5545, 199 | "status": "ok", 200 | "timestamp": 1602478409237, 201 | "user": { 202 | "displayName": "Tanmay Chakraborty", 203 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 204 | "userId": "10513402671331353489" 205 | }, 206 | "user_tz": -330 207 | }, 208 | "id": "iaIzfkQ3NDvS" 209 | }, 210 | "outputs": [], 211 | "source": [ 212 | "def splitTrainTestSet(X, y, testRatio, randomState=345):\n", 213 | " X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=testRatio, 
random_state=randomState,\n", 214 | " stratify=y)\n", 215 | " return X_train, X_test, y_train, y_test" 216 | ] 217 | }, 218 | { 219 | "cell_type": "code", 220 | "execution_count": 9, 221 | "metadata": { 222 | "executionInfo": { 223 | "elapsed": 5538, 224 | "status": "ok", 225 | "timestamp": 1602478409237, 226 | "user": { 227 | "displayName": "Tanmay Chakraborty", 228 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 229 | "userId": "10513402671331353489" 230 | }, 231 | "user_tz": -330 232 | }, 233 | "id": "M0he4FtONMU-" 234 | }, 235 | "outputs": [], 236 | "source": [ 237 | "def padWithZeros(X, margin=2):\n", 238 | " newX = np.zeros((X.shape[0] + 2 * margin, X.shape[1] + 2* margin, X.shape[2]))\n", 239 | " x_offset = margin\n", 240 | " y_offset = margin\n", 241 | " newX[x_offset:X.shape[0] + x_offset, y_offset:X.shape[1] + y_offset, :] = X\n", 242 | " return newX" 243 | ] 244 | }, 245 | { 246 | "cell_type": "code", 247 | "execution_count": 10, 248 | "metadata": { 249 | "executionInfo": { 250 | "elapsed": 5533, 251 | "status": "ok", 252 | "timestamp": 1602478409238, 253 | "user": { 254 | "displayName": "Tanmay Chakraborty", 255 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 256 | "userId": "10513402671331353489" 257 | }, 258 | "user_tz": -330 259 | }, 260 | "id": "l0wsTkhNNO04" 261 | }, 262 | "outputs": [], 263 | "source": [ 264 | "def createImageCubes(X, y, windowSize=8, removeZeroLabels = True):\n", 265 | " margin = int((windowSize) / 2)\n", 266 | " zeroPaddedX = padWithZeros(X, margin=margin)\n", 267 | " # split patches\n", 268 | " patchesData = np.zeros((X.shape[0] * X.shape[1], windowSize, windowSize, X.shape[2]))\n", 269 | " patchesLabels = np.zeros((X.shape[0] * X.shape[1]))\n", 270 | " patchIndex = 0\n", 271 | " for r in range(margin, zeroPaddedX.shape[0] - margin):\n", 272 | " for c in range(margin, zeroPaddedX.shape[1] - margin):\n", 273 | " patch = zeroPaddedX[r - margin:r + margin , c - margin:c + margin ] \n", 274 | " patchesData[patchIndex, :, :, :] = patch\n", 275 | " patchesLabels[patchIndex] = y[r-margin, c-margin]\n", 276 | " patchIndex = patchIndex + 1\n", 277 | " if removeZeroLabels:\n", 278 | " patchesData = patchesData[patchesLabels>0,:,:,:]\n", 279 | " patchesLabels = patchesLabels[patchesLabels>0]\n", 280 | " patchesLabels -= 1\n", 281 | " return patchesData, patchesLabels" 282 | ] 283 | }, 284 | { 285 | "cell_type": "code", 286 | "execution_count": 11, 287 | "metadata": { 288 | "colab": { 289 | "base_uri": "https://localhost:8080/", 290 | "height": 34 291 | }, 292 | "executionInfo": { 293 | "elapsed": 5525, 294 | "status": "ok", 295 | "timestamp": 1602478409238, 296 | "user": { 297 | "displayName": "Tanmay Chakraborty", 298 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 299 | "userId": "10513402671331353489" 300 | }, 301 | "user_tz": -330 302 | }, 303 | "id": "cnneGlFFNRVo", 304 | "outputId": "3b00138d-9d02-4b14-f6e2-2b1dc9ad8aeb" 305 | }, 306 | "outputs": [ 307 | { 308 | "data": { 309 | "text/plain": [ 310 | "((512, 217, 204), (512, 217))" 311 | ] 312 | }, 313 | "execution_count": 11, 314 | "metadata": {}, 315 | "output_type": "execute_result" 316 | } 317 | ], 318 | "source": [ 319 | "X, y = loadData(dataset)\n", 320 | "\n", 321 | "X.shape, y.shape" 322 | ] 323 | }, 324 | { 325 | "cell_type": "code", 326 | "execution_count": 12, 327 | "metadata": { 328 | "executionInfo": { 329 | "elapsed": 5518, 
330 | "status": "ok", 331 | "timestamp": 1602478409239, 332 | "user": { 333 | "displayName": "Tanmay Chakraborty", 334 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 335 | "userId": "10513402671331353489" 336 | }, 337 | "user_tz": -330 338 | }, 339 | "id": "TOGl1BmWNdyv" 340 | }, 341 | "outputs": [], 342 | "source": [ 343 | "K = X.shape[2]" 344 | ] 345 | }, 346 | { 347 | "cell_type": "code", 348 | "execution_count": 13, 349 | "metadata": { 350 | "colab": { 351 | "base_uri": "https://localhost:8080/", 352 | "height": 34 353 | }, 354 | "executionInfo": { 355 | "elapsed": 30165, 356 | "status": "ok", 357 | "timestamp": 1602478433894, 358 | "user": { 359 | "displayName": "Tanmay Chakraborty", 360 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 361 | "userId": "10513402671331353489" 362 | }, 363 | "user_tz": -330 364 | }, 365 | "id": "fTYzjiltNj8a", 366 | "outputId": "7bfa99db-15b8-40d1-83b2-b7d32d039c0e" 367 | }, 368 | "outputs": [ 369 | { 370 | "data": { 371 | "text/plain": [ 372 | "(512, 217, 3)" 373 | ] 374 | }, 375 | "execution_count": 13, 376 | "metadata": {}, 377 | "output_type": "execute_result" 378 | } 379 | ], 380 | "source": [ 381 | "K = 3 if dataset == 'IP' else 3\n", 382 | "X,fa = applyFA(X,numComponents=K)\n", 383 | "\n", 384 | "X.shape" 385 | ] 386 | }, 387 | { 388 | "cell_type": "code", 389 | "execution_count": 14, 390 | "metadata": { 391 | "colab": { 392 | "base_uri": "https://localhost:8080/", 393 | "height": 34 394 | }, 395 | "executionInfo": { 396 | "elapsed": 31758, 397 | "status": "ok", 398 | "timestamp": 1602478435499, 399 | "user": { 400 | "displayName": "Tanmay Chakraborty", 401 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 402 | "userId": "10513402671331353489" 403 | }, 404 | "user_tz": -330 405 | }, 406 | "id": "hiZiDsE0cN-O", 407 | "outputId": "acf836b7-666b-4104-c856-d4e0b1d1f84f" 408 | }, 409 | "outputs": [ 410 | { 411 | "data": { 412 | "text/plain": [ 413 | "((54129, 24, 24, 3), (54129,))" 414 | ] 415 | }, 416 | "execution_count": 14, 417 | "metadata": {}, 418 | "output_type": "execute_result" 419 | } 420 | ], 421 | "source": [ 422 | "X, y = createImageCubes(X, y, windowSize=windowSize)\n", 423 | "\n", 424 | "X.shape, y.shape" 425 | ] 426 | }, 427 | { 428 | "cell_type": "code", 429 | "execution_count": 15, 430 | "metadata": { 431 | "colab": { 432 | "base_uri": "https://localhost:8080/", 433 | "height": 34 434 | }, 435 | "executionInfo": { 436 | "elapsed": 31739, 437 | "status": "ok", 438 | "timestamp": 1602478435500, 439 | "user": { 440 | "displayName": "Tanmay Chakraborty", 441 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 442 | "userId": "10513402671331353489" 443 | }, 444 | "user_tz": -330 445 | }, 446 | "id": "OMYMZxnDcSHb", 447 | "outputId": "8e9b4564-aff1-40e2-b081-7154c18d5d77" 448 | }, 449 | "outputs": [ 450 | { 451 | "data": { 452 | "text/plain": [ 453 | "((5412, 24, 24, 3), (48717, 24, 24, 3), (5412,), (48717,))" 454 | ] 455 | }, 456 | "execution_count": 15, 457 | "metadata": {}, 458 | "output_type": "execute_result" 459 | } 460 | ], 461 | "source": [ 462 | "Xtrain, Xtest, ytrain, ytest = splitTrainTestSet(X, y, test_ratio)\n", 463 | "\n", 464 | "Xtrain.shape, Xtest.shape, ytrain.shape, ytest.shape" 465 | ] 466 | }, 467 | { 468 | "cell_type": "code", 469 | "execution_count": 16, 470 | "metadata": { 471 | "colab": { 472 | 
"base_uri": "https://localhost:8080/", 473 | "height": 34 474 | }, 475 | "executionInfo": { 476 | "elapsed": 31716, 477 | "status": "ok", 478 | "timestamp": 1602478435501, 479 | "user": { 480 | "displayName": "Tanmay Chakraborty", 481 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 482 | "userId": "10513402671331353489" 483 | }, 484 | "user_tz": -330 485 | }, 486 | "id": "VBvWzipfcZDK", 487 | "outputId": "af402f16-e7ee-4272-c243-ae05646c8317" 488 | }, 489 | "outputs": [ 490 | { 491 | "data": { 492 | "text/plain": [ 493 | "(5412, 24, 24, 3, 1)" 494 | ] 495 | }, 496 | "execution_count": 16, 497 | "metadata": {}, 498 | "output_type": "execute_result" 499 | } 500 | ], 501 | "source": [ 502 | "Xtrain = Xtrain.reshape(-1, windowSize, windowSize, K, 1)\n", 503 | "Xtrain.shape" 504 | ] 505 | }, 506 | { 507 | "cell_type": "code", 508 | "execution_count": 17, 509 | "metadata": { 510 | "colab": { 511 | "base_uri": "https://localhost:8080/", 512 | "height": 34 513 | }, 514 | "executionInfo": { 515 | "elapsed": 31703, 516 | "status": "ok", 517 | "timestamp": 1602478435502, 518 | "user": { 519 | "displayName": "Tanmay Chakraborty", 520 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 521 | "userId": "10513402671331353489" 522 | }, 523 | "user_tz": -330 524 | }, 525 | "id": "lxr7HMjocZvd", 526 | "outputId": "cb12450a-7ffc-44a2-d57b-2e42e0402ec4" 527 | }, 528 | "outputs": [ 529 | { 530 | "data": { 531 | "text/plain": [ 532 | "(5412, 16)" 533 | ] 534 | }, 535 | "execution_count": 17, 536 | "metadata": {}, 537 | "output_type": "execute_result" 538 | } 539 | ], 540 | "source": [ 541 | "ytrain = np_utils.to_categorical(ytrain)\n", 542 | "ytrain.shape" 543 | ] 544 | }, 545 | { 546 | "cell_type": "code", 547 | "execution_count": 18, 548 | "metadata": { 549 | "executionInfo": { 550 | "elapsed": 31691, 551 | "status": "ok", 552 | "timestamp": 1602478435502, 553 | "user": { 554 | "displayName": "Tanmay Chakraborty", 555 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 556 | "userId": "10513402671331353489" 557 | }, 558 | "user_tz": -330 559 | }, 560 | "id": "pAmd40PJcdog" 561 | }, 562 | "outputs": [], 563 | "source": [ 564 | "S1 = windowSize\n", 565 | "L1 = K\n", 566 | "output_units = 9 if (dataset == 'PU' or dataset == 'PC') else 16" 567 | ] 568 | }, 569 | { 570 | "cell_type": "code", 571 | "execution_count": 19, 572 | "metadata": { 573 | "executionInfo": { 574 | "elapsed": 31687, 575 | "status": "ok", 576 | "timestamp": 1602478435503, 577 | "user": { 578 | "displayName": "Tanmay Chakraborty", 579 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 580 | "userId": "10513402671331353489" 581 | }, 582 | "user_tz": -330 583 | }, 584 | "id": "wQDctC5hdHlP" 585 | }, 586 | "outputs": [], 587 | "source": [ 588 | "def WaveletTransformAxisY(batch_img):\n", 589 | " odd_img = batch_img[:,0::2]\n", 590 | " even_img = batch_img[:,1::2]\n", 591 | " L = (odd_img + even_img) / 2.0\n", 592 | " H = Kb.abs(odd_img - even_img)\n", 593 | " return L, H\n", 594 | "\n", 595 | "def WaveletTransformAxisX(batch_img):\n", 596 | " # transpose + fliplr\n", 597 | " tmp_batch = Kb.permute_dimensions(batch_img, [0, 2, 1])[:,:,::-1]\n", 598 | " _dst_L, _dst_H = WaveletTransformAxisY(tmp_batch)\n", 599 | " # transpose + flipud\n", 600 | " dst_L = Kb.permute_dimensions(_dst_L, [0, 2, 1])[:,::-1,...]\n", 601 | " dst_H = 
Kb.permute_dimensions(_dst_H, [0, 2, 1])[:,::-1,...]\n", 602 | " return dst_L, dst_H" 603 | ] 604 | }, 605 | { 606 | "cell_type": "code", 607 | "execution_count": 20, 608 | "metadata": { 609 | "executionInfo": { 610 | "elapsed": 31679, 611 | "status": "ok", 612 | "timestamp": 1602478435503, 613 | "user": { 614 | "displayName": "Tanmay Chakraborty", 615 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 616 | "userId": "10513402671331353489" 617 | }, 618 | "user_tz": -330 619 | }, 620 | "id": "NCAUyHZFdPfN" 621 | }, 622 | "outputs": [], 623 | "source": [ 624 | "def Wavelet(batch_image):\n", 625 | " # make channel first image\n", 626 | " batch_image = Kb.permute_dimensions(batch_image, [0, 3, 1, 2])\n", 627 | " r = batch_image[:,0]\n", 628 | " g = batch_image[:,1]\n", 629 | " b = batch_image[:,2]\n", 630 | "\n", 631 | " # level 1 decomposition\n", 632 | " wavelet_L, wavelet_H = WaveletTransformAxisY(r)\n", 633 | " r_wavelet_LL, r_wavelet_LH = WaveletTransformAxisX(wavelet_L)\n", 634 | " r_wavelet_HL, r_wavelet_HH = WaveletTransformAxisX(wavelet_H)\n", 635 | "\n", 636 | " wavelet_L, wavelet_H = WaveletTransformAxisY(g)\n", 637 | " g_wavelet_LL, g_wavelet_LH = WaveletTransformAxisX(wavelet_L)\n", 638 | " g_wavelet_HL, g_wavelet_HH = WaveletTransformAxisX(wavelet_H)\n", 639 | "\n", 640 | " wavelet_L, wavelet_H = WaveletTransformAxisY(b)\n", 641 | " b_wavelet_LL, b_wavelet_LH = WaveletTransformAxisX(wavelet_L)\n", 642 | " b_wavelet_HL, b_wavelet_HH = WaveletTransformAxisX(wavelet_H)\n", 643 | "\n", 644 | " wavelet_data = [r_wavelet_LL, r_wavelet_LH, r_wavelet_HL, r_wavelet_HH, \n", 645 | " g_wavelet_LL, g_wavelet_LH, g_wavelet_HL, g_wavelet_HH,\n", 646 | " b_wavelet_LL, b_wavelet_LH, b_wavelet_HL, b_wavelet_HH]\n", 647 | " transform_batch = Kb.stack(wavelet_data, axis=1)\n", 648 | "\n", 649 | " # level 2 decomposition\n", 650 | " wavelet_L2, wavelet_H2 = WaveletTransformAxisY(r_wavelet_LL)\n", 651 | " r_wavelet_LL2, r_wavelet_LH2 = WaveletTransformAxisX(wavelet_L2)\n", 652 | " r_wavelet_HL2, r_wavelet_HH2 = WaveletTransformAxisX(wavelet_H2)\n", 653 | "\n", 654 | " wavelet_L2, wavelet_H2 = WaveletTransformAxisY(g_wavelet_LL)\n", 655 | " g_wavelet_LL2, g_wavelet_LH2 = WaveletTransformAxisX(wavelet_L2)\n", 656 | " g_wavelet_HL2, g_wavelet_HH2 = WaveletTransformAxisX(wavelet_H2)\n", 657 | "\n", 658 | " wavelet_L2, wavelet_H2 = WaveletTransformAxisY(b_wavelet_LL)\n", 659 | " b_wavelet_LL2, b_wavelet_LH2 = WaveletTransformAxisX(wavelet_L2)\n", 660 | " b_wavelet_HL2, b_wavelet_HH2 = WaveletTransformAxisX(wavelet_H2)\n", 661 | "\n", 662 | "\n", 663 | " wavelet_data_l2 = [r_wavelet_LL2, r_wavelet_LH2, r_wavelet_HL2, r_wavelet_HH2, \n", 664 | " g_wavelet_LL2, g_wavelet_LH2, g_wavelet_HL2, g_wavelet_HH2,\n", 665 | " b_wavelet_LL2, b_wavelet_LH2, b_wavelet_HL2, b_wavelet_HH2]\n", 666 | " transform_batch_l2 = Kb.stack(wavelet_data_l2, axis=1)\n", 667 | "\n", 668 | " # level 3 decomposition\n", 669 | " wavelet_L3, wavelet_H3 = WaveletTransformAxisY(r_wavelet_LL2)\n", 670 | " r_wavelet_LL3, r_wavelet_LH3 = WaveletTransformAxisX(wavelet_L3)\n", 671 | " r_wavelet_HL3, r_wavelet_HH3 = WaveletTransformAxisX(wavelet_H3)\n", 672 | "\n", 673 | " wavelet_L3, wavelet_H3 = WaveletTransformAxisY(g_wavelet_LL2)\n", 674 | " g_wavelet_LL3, g_wavelet_LH3 = WaveletTransformAxisX(wavelet_L3)\n", 675 | " g_wavelet_HL3, g_wavelet_HH3 = WaveletTransformAxisX(wavelet_H3)\n", 676 | "\n", 677 | " wavelet_L3, wavelet_H3 = WaveletTransformAxisY(b_wavelet_LL2)\n", 678 | " 
b_wavelet_LL3, b_wavelet_LH3 = WaveletTransformAxisX(wavelet_L3)\n", 679 | " b_wavelet_HL3, b_wavelet_HH3 = WaveletTransformAxisX(wavelet_H3)\n", 680 | "\n", 681 | " wavelet_data_l3 = [r_wavelet_LL3, r_wavelet_LH3, r_wavelet_HL3, r_wavelet_HH3, \n", 682 | " g_wavelet_LL3, g_wavelet_LH3, g_wavelet_HL3, g_wavelet_HH3,\n", 683 | " b_wavelet_LL3, b_wavelet_LH3, b_wavelet_HL3, b_wavelet_HH3]\n", 684 | " transform_batch_l3 = Kb.stack(wavelet_data_l3, axis=1)\n", 685 | "\n", 686 | " # level 4 decomposition\n", 687 | " wavelet_L4, wavelet_H4 = WaveletTransformAxisY(r_wavelet_LL3)\n", 688 | " r_wavelet_LL4, r_wavelet_LH4 = WaveletTransformAxisX(wavelet_L4)\n", 689 | " r_wavelet_HL4, r_wavelet_HH4 = WaveletTransformAxisX(wavelet_H4)\n", 690 | "\n", 691 | " wavelet_L4, wavelet_H4 = WaveletTransformAxisY(g_wavelet_LL3)\n", 692 | " g_wavelet_LL4, g_wavelet_LH4 = WaveletTransformAxisX(wavelet_L4)\n", 693 | " g_wavelet_HL4, g_wavelet_HH4 = WaveletTransformAxisX(wavelet_H4)\n", 694 | "\n", 695 | " wavelet_L3, wavelet_H3 = WaveletTransformAxisY(b_wavelet_LL3)\n", 696 | " b_wavelet_LL4, b_wavelet_LH4 = WaveletTransformAxisX(wavelet_L4)\n", 697 | " b_wavelet_HL4, b_wavelet_HH4 = WaveletTransformAxisX(wavelet_H4)\n", 698 | "\n", 699 | "\n", 700 | " wavelet_data_l4 = [r_wavelet_LL4, r_wavelet_LH4, r_wavelet_HL4, r_wavelet_HH4, \n", 701 | " g_wavelet_LL4, g_wavelet_LH4, g_wavelet_HL4, g_wavelet_HH4,\n", 702 | " b_wavelet_LL4, b_wavelet_LH4, b_wavelet_HL4, b_wavelet_HH4]\n", 703 | " transform_batch_l4 = Kb.stack(wavelet_data_l4, axis=1)\n", 704 | "\n", 705 | " # print('shape before')\n", 706 | " # print(transform_batch.shape)\n", 707 | " # print(transform_batch_l2.shape)\n", 708 | " # print(transform_batch_l3.shape)\n", 709 | " # print(transform_batch_l4.shape)\n", 710 | "\n", 711 | " decom_level_1 = Kb.permute_dimensions(transform_batch, [0, 2, 3, 1])\n", 712 | " decom_level_2 = Kb.permute_dimensions(transform_batch_l2, [0, 2, 3, 1])\n", 713 | " decom_level_3 = Kb.permute_dimensions(transform_batch_l3, [0, 2, 3, 1])\n", 714 | " decom_level_4 = Kb.permute_dimensions(transform_batch_l4, [0, 2, 3, 1])\n", 715 | " \n", 716 | " # print('shape after')\n", 717 | " # print(decom_level_1.shape)\n", 718 | " # print(decom_level_2.shape)\n", 719 | " # print(decom_level_3.shape)\n", 720 | " # print(decom_level_4.shape)\n", 721 | " return [decom_level_1, decom_level_2, decom_level_3, decom_level_4]\n", 722 | "\n", 723 | "\n", 724 | "def Wavelet_out_shape(input_shapes):\n", 725 | " # print('in to shape')\n", 726 | " return [tuple([None, 112, 112, 12]), tuple([None, 56, 56, 12]), \n", 727 | " tuple([None, 28, 28, 12]), tuple([None, 14, 14, 12])]" 728 | ] 729 | }, 730 | { 731 | "cell_type": "code", 732 | "execution_count": 21, 733 | "metadata": { 734 | "colab": { 735 | "base_uri": "https://localhost:8080/", 736 | "height": 1000 737 | }, 738 | "executionInfo": { 739 | "elapsed": 31664, 740 | "status": "ok", 741 | "timestamp": 1602478435504, 742 | "user": { 743 | "displayName": "Tanmay Chakraborty", 744 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 745 | "userId": "10513402671331353489" 746 | }, 747 | "user_tz": -330 748 | }, 749 | "id": "GVN-IuF7e2OB", 750 | "outputId": "2d2cbade-23cf-4833-8750-3755e381c91f" 751 | }, 752 | "outputs": [ 753 | { 754 | "data": { 755 | "text/plain": [ 756 | "[,\n", 1064 | " ,\n", 1324 | " ,\n", 1428 | " ]" 1484 | ] 1485 | }, 1486 | "execution_count": 21, 1487 | "metadata": {}, 1488 | "output_type": "execute_result" 1489 | } 1490 | ], 1491 | 
"source": [ 1492 | "img_batch = Kb.zeros(shape=(8, 24, 24, 3), dtype='float32')\n", 1493 | "Wavelet(img_batch)" 1494 | ] 1495 | }, 1496 | { 1497 | "cell_type": "code", 1498 | "execution_count": 22, 1499 | "metadata": { 1500 | "executionInfo": { 1501 | "elapsed": 31657, 1502 | "status": "ok", 1503 | "timestamp": 1602478435506, 1504 | "user": { 1505 | "displayName": "Tanmay Chakraborty", 1506 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 1507 | "userId": "10513402671331353489" 1508 | }, 1509 | "user_tz": -330 1510 | }, 1511 | "id": "5taLY_ljgFq3" 1512 | }, 1513 | "outputs": [], 1514 | "source": [ 1515 | "def get_wavelet_cnn_model():\n", 1516 | " \n", 1517 | " input_shape = 24, 24, 3\n", 1518 | " \n", 1519 | " input_ = Input(input_shape, name='the_input')\n", 1520 | " # wavelet = Lambda(Wavelet, name='wavelet')\n", 1521 | " wavelet = Lambda(Wavelet, Wavelet_out_shape, name='wavelet')\n", 1522 | " input_l1, input_l2, input_l3, input_l4 = wavelet(input_)\n", 1523 | " # print(input_l1)\n", 1524 | " # print(input_l2)\n", 1525 | " # print(input_l3)\n", 1526 | " # print(input_l4)\n", 1527 | " # level one decomposition starts\n", 1528 | " conv_1 = Conv2D(64, kernel_size=(3, 3), padding='same', name='conv_1')(input_l1)\n", 1529 | " norm_1 = BatchNormalization(name='norm_1')(conv_1)\n", 1530 | " relu_1 = Activation('relu', name='relu_1')(norm_1)\n", 1531 | " \n", 1532 | " conv_1_2 = Conv2D(64, kernel_size=(3, 3), strides=(2, 2), padding='same', name='conv_1_2')(relu_1)\n", 1533 | " norm_1_2 = BatchNormalization(name='norm_1_2')(conv_1_2)\n", 1534 | " relu_1_2 = Activation('relu', name='relu_1_2')(norm_1_2)\n", 1535 | " \n", 1536 | " # level two decomposition starts\n", 1537 | " conv_a = Conv2D(filters=64, kernel_size=(3, 3), padding='same', name='conv_a')(input_l2)\n", 1538 | " norm_a = BatchNormalization(name='norm_a')(conv_a)\n", 1539 | " relu_a = Activation('relu', name='relu_a')(norm_a)\n", 1540 | " \n", 1541 | " # concate level one and level two decomposition\n", 1542 | " concate_level_2 = concatenate([relu_1_2, relu_a])\n", 1543 | " conv_2 = Conv2D(128, kernel_size=(3, 3), padding='same', name='conv_2')(concate_level_2)\n", 1544 | " norm_2 = BatchNormalization(name='norm_2')(conv_2)\n", 1545 | " relu_2 = Activation('relu', name='relu_2')(norm_2)\n", 1546 | " \n", 1547 | " conv_2_2 = Conv2D(128, kernel_size=(3, 3), strides=(2, 2), padding='same', name='conv_2_2')(relu_2)\n", 1548 | " norm_2_2 = BatchNormalization(name='norm_2_2')(conv_2_2)\n", 1549 | " relu_2_2 = Activation('relu', name='relu_2_2')(norm_2_2)\n", 1550 | " \n", 1551 | " # level three decomposition starts \n", 1552 | " conv_b = Conv2D(filters=64, kernel_size=(3, 3), padding='same', name='conv_b')(input_l3)\n", 1553 | " norm_b = BatchNormalization(name='norm_b')(conv_b)\n", 1554 | " relu_b = Activation('relu', name='relu_b')(norm_b)\n", 1555 | " \n", 1556 | " conv_b_2 = Conv2D(128, kernel_size=(3, 3), padding='same', name='conv_b_2')(relu_b)\n", 1557 | " norm_b_2 = BatchNormalization(name='norm_b_2')(conv_b_2)\n", 1558 | " relu_b_2 = Activation('relu', name='relu_b_2')(norm_b_2)\n", 1559 | " \n", 1560 | " # concate level two and level three decomposition \n", 1561 | " concate_level_3 = concatenate([relu_2_2, relu_b_2])\n", 1562 | " conv_3 = Conv2D(256, kernel_size=(3, 3), padding='same', name='conv_3')(concate_level_3)\n", 1563 | " norm_3 = BatchNormalization(name='nomr_3')(conv_3)\n", 1564 | " relu_3 = Activation('relu', name='relu_3')(norm_3)\n", 1565 | " \n", 1566 | " 
conv_3_2 = Conv2D(256, kernel_size=(3, 3), strides=(2, 2), padding='same', name='conv_3_2')(relu_3)\n", 1567 | " norm_3_2 = BatchNormalization(name='norm_3_2')(conv_3_2)\n", 1568 | " relu_3_2 = Activation('relu', name='relu_3_2')(norm_3_2)\n", 1569 | " \n", 1570 | " # level four decomposition start\n", 1571 | " conv_c = Conv2D(64, kernel_size=(3, 3), padding='same', name='conv_c')(input_l4)\n", 1572 | " norm_c = BatchNormalization(name='norm_c')(conv_c)\n", 1573 | " relu_c = Activation('relu', name='relu_c')(norm_c)\n", 1574 | " \n", 1575 | " conv_c_2 = Conv2D(256, kernel_size=(3, 3), padding='same', name='conv_c_2')(relu_c)\n", 1576 | " norm_c_2 = BatchNormalization(name='norm_c_2')(conv_c_2)\n", 1577 | " relu_c_2 = Activation('relu', name='relu_c_2')(norm_c_2)\n", 1578 | " \n", 1579 | " conv_c_3 = Conv2D(256, kernel_size=(3, 3), padding='same', name='conv_c_3')(relu_c_2)\n", 1580 | " norm_c_3 = BatchNormalization(name='norm_c_3')(conv_c_3)\n", 1581 | " relu_c_3 = Activation('relu', name='relu_c_3')(norm_c_3)\n", 1582 | " \n", 1583 | " # concate level level three and level four decomposition\n", 1584 | " concate_level_4 = concatenate([relu_3_2, relu_c_3])\n", 1585 | " conv_4 = Conv2D(256, kernel_size=(3, 3), padding='same', name='conv_4')(concate_level_4)\n", 1586 | " norm_4 = BatchNormalization(name='norm_4')(conv_4)\n", 1587 | " relu_4 = Activation('relu', name='relu_4')(norm_4)\n", 1588 | " \n", 1589 | " conv_4_2 = Conv2D(256, kernel_size=(3, 3), strides=(2, 2), padding='same', name='conv_4_2')(relu_4)\n", 1590 | " norm_4_2 = BatchNormalization(name='norm_4_2')(conv_4_2)\n", 1591 | " relu_4_2 = Activation('relu', name='relu_4_2')(norm_4_2)\n", 1592 | " \n", 1593 | " conv_5_1 = Conv2D(128, kernel_size=(3, 3), padding='same', name='conv_5_1')(relu_4_2)\n", 1594 | " norm_5_1 = BatchNormalization(name='norm_5_1')(conv_5_1)\n", 1595 | " relu_5_1 = Activation('relu', name='relu_5_1')(norm_5_1)\n", 1596 | " \n", 1597 | " pool_5_1 = AveragePooling2D(pool_size=(7, 7), strides=1, padding='same', name='avg_pool_5_1')(relu_5_1)\n", 1598 | " #flat_5_1 = Flatten(name='flat_5_1')(pool_5_1) \n", 1599 | " \n", 1600 | " #fc_5 = Dense(2048, name='fc_5')(flat_5_1)\n", 1601 | " #norm_5 = BatchNormalization(name='norm_5')(fc_5)\n", 1602 | " #relu_5 = Activation('relu', name='relu_5')(norm_5)\n", 1603 | " #drop_5 = Dropout(0.5, name='drop_5')(relu_5)\n", 1604 | " \n", 1605 | " #fc_6 = Dense(2048, name='fc_6')(drop_5)\n", 1606 | " #norm_6 = BatchNormalization(name='norm_6')(fc_6)\n", 1607 | " #relu_6 = Activation('relu', name='relu_6')(norm_6)\n", 1608 | " #drop_6 = Dropout(0.5, name='drop_6')(relu_6)\n", 1609 | " flatten_layer = Flatten()(pool_5_1)\n", 1610 | " \n", 1611 | " dense_layer1 = Dense(units=2048, activation='relu')(flatten_layer)\n", 1612 | " dense_layer1 = Dropout(0.4)(dense_layer1)\n", 1613 | " dense_layer2 = Dense(units=1024, activation='relu')(dense_layer1)\n", 1614 | " dense_layer2 = Dropout(0.4)(dense_layer2)\n", 1615 | " output_layer = Dense(units=output_units, activation='softmax')(dense_layer2)\n", 1616 | " \n", 1617 | " model = Model(inputs=input_, outputs=output_layer)\n", 1618 | " model.summary()\n", 1619 | " plot_model(model, to_file='wavelet_cnn_0.5.png')\n", 1620 | " \n", 1621 | " return model" 1622 | ] 1623 | }, 1624 | { 1625 | "cell_type": "code", 1626 | "execution_count": 23, 1627 | "metadata": { 1628 | "colab": { 1629 | "base_uri": "https://localhost:8080/", 1630 | "height": 1000 1631 | }, 1632 | "executionInfo": { 1633 | "elapsed": 32872, 1634 | "status": "ok", 1635 | 
"timestamp": 1602478436734, 1636 | "user": { 1637 | "displayName": "Tanmay Chakraborty", 1638 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 1639 | "userId": "10513402671331353489" 1640 | }, 1641 | "user_tz": -330 1642 | }, 1643 | "id": "iVgd4QmzgKKq", 1644 | "outputId": "32d4779c-5699-49b0-eaef-613ac3ef7968" 1645 | }, 1646 | "outputs": [ 1647 | { 1648 | "name": "stdout", 1649 | "output_type": "stream", 1650 | "text": [ 1651 | "Model: \"functional_1\"\n", 1652 | "__________________________________________________________________________________________________\n", 1653 | "Layer (type) Output Shape Param # Connected to \n", 1654 | "==================================================================================================\n", 1655 | "the_input (InputLayer) [(None, 24, 24, 3)] 0 \n", 1656 | "__________________________________________________________________________________________________\n", 1657 | "wavelet (Lambda) [(None, 12, 12, 12), 0 the_input[0][0] \n", 1658 | "__________________________________________________________________________________________________\n", 1659 | "conv_1 (Conv2D) (None, 12, 12, 64) 6976 wavelet[0][0] \n", 1660 | "__________________________________________________________________________________________________\n", 1661 | "norm_1 (BatchNormalization) (None, 12, 12, 64) 256 conv_1[0][0] \n", 1662 | "__________________________________________________________________________________________________\n", 1663 | "relu_1 (Activation) (None, 12, 12, 64) 0 norm_1[0][0] \n", 1664 | "__________________________________________________________________________________________________\n", 1665 | "conv_1_2 (Conv2D) (None, 6, 6, 64) 36928 relu_1[0][0] \n", 1666 | "__________________________________________________________________________________________________\n", 1667 | "conv_a (Conv2D) (None, 6, 6, 64) 6976 wavelet[0][1] \n", 1668 | "__________________________________________________________________________________________________\n", 1669 | "norm_1_2 (BatchNormalization) (None, 6, 6, 64) 256 conv_1_2[0][0] \n", 1670 | "__________________________________________________________________________________________________\n", 1671 | "norm_a (BatchNormalization) (None, 6, 6, 64) 256 conv_a[0][0] \n", 1672 | "__________________________________________________________________________________________________\n", 1673 | "relu_1_2 (Activation) (None, 6, 6, 64) 0 norm_1_2[0][0] \n", 1674 | "__________________________________________________________________________________________________\n", 1675 | "relu_a (Activation) (None, 6, 6, 64) 0 norm_a[0][0] \n", 1676 | "__________________________________________________________________________________________________\n", 1677 | "concatenate (Concatenate) (None, 6, 6, 128) 0 relu_1_2[0][0] \n", 1678 | " relu_a[0][0] \n", 1679 | "__________________________________________________________________________________________________\n", 1680 | "conv_2 (Conv2D) (None, 6, 6, 128) 147584 concatenate[0][0] \n", 1681 | "__________________________________________________________________________________________________\n", 1682 | "conv_b (Conv2D) (None, 3, 3, 64) 6976 wavelet[0][2] \n", 1683 | "__________________________________________________________________________________________________\n", 1684 | "norm_2 (BatchNormalization) (None, 6, 6, 128) 512 conv_2[0][0] \n", 1685 | "__________________________________________________________________________________________________\n", 1686 | "norm_b 
(BatchNormalization) (None, 3, 3, 64) 256 conv_b[0][0] \n", 1687 | "__________________________________________________________________________________________________\n", 1688 | "relu_2 (Activation) (None, 6, 6, 128) 0 norm_2[0][0] \n", 1689 | "__________________________________________________________________________________________________\n", 1690 | "relu_b (Activation) (None, 3, 3, 64) 0 norm_b[0][0] \n", 1691 | "__________________________________________________________________________________________________\n", 1692 | "conv_2_2 (Conv2D) (None, 3, 3, 128) 147584 relu_2[0][0] \n", 1693 | "__________________________________________________________________________________________________\n", 1694 | "conv_b_2 (Conv2D) (None, 3, 3, 128) 73856 relu_b[0][0] \n", 1695 | "__________________________________________________________________________________________________\n", 1696 | "norm_2_2 (BatchNormalization) (None, 3, 3, 128) 512 conv_2_2[0][0] \n", 1697 | "__________________________________________________________________________________________________\n", 1698 | "norm_b_2 (BatchNormalization) (None, 3, 3, 128) 512 conv_b_2[0][0] \n", 1699 | "__________________________________________________________________________________________________\n", 1700 | "conv_c (Conv2D) (None, 2, 2, 64) 6976 wavelet[0][3] \n", 1701 | "__________________________________________________________________________________________________\n", 1702 | "relu_2_2 (Activation) (None, 3, 3, 128) 0 norm_2_2[0][0] \n", 1703 | "__________________________________________________________________________________________________\n", 1704 | "relu_b_2 (Activation) (None, 3, 3, 128) 0 norm_b_2[0][0] \n", 1705 | "__________________________________________________________________________________________________\n", 1706 | "norm_c (BatchNormalization) (None, 2, 2, 64) 256 conv_c[0][0] \n", 1707 | "__________________________________________________________________________________________________\n", 1708 | "concatenate_1 (Concatenate) (None, 3, 3, 256) 0 relu_2_2[0][0] \n", 1709 | " relu_b_2[0][0] \n", 1710 | "__________________________________________________________________________________________________\n", 1711 | "relu_c (Activation) (None, 2, 2, 64) 0 norm_c[0][0] \n", 1712 | "__________________________________________________________________________________________________\n", 1713 | "conv_3 (Conv2D) (None, 3, 3, 256) 590080 concatenate_1[0][0] \n", 1714 | "__________________________________________________________________________________________________\n", 1715 | "conv_c_2 (Conv2D) (None, 2, 2, 256) 147712 relu_c[0][0] \n", 1716 | "__________________________________________________________________________________________________\n", 1717 | "nomr_3 (BatchNormalization) (None, 3, 3, 256) 1024 conv_3[0][0] \n", 1718 | "__________________________________________________________________________________________________\n", 1719 | "norm_c_2 (BatchNormalization) (None, 2, 2, 256) 1024 conv_c_2[0][0] \n", 1720 | "__________________________________________________________________________________________________\n", 1721 | "relu_3 (Activation) (None, 3, 3, 256) 0 nomr_3[0][0] \n", 1722 | "__________________________________________________________________________________________________\n", 1723 | "relu_c_2 (Activation) (None, 2, 2, 256) 0 norm_c_2[0][0] \n", 1724 | "__________________________________________________________________________________________________\n", 1725 | "conv_3_2 (Conv2D) (None, 2, 2, 256) 590080 relu_3[0][0] \n", 
1726 | "__________________________________________________________________________________________________\n", 1727 | "conv_c_3 (Conv2D) (None, 2, 2, 256) 590080 relu_c_2[0][0] \n", 1728 | "__________________________________________________________________________________________________\n", 1729 | "norm_3_2 (BatchNormalization) (None, 2, 2, 256) 1024 conv_3_2[0][0] \n", 1730 | "__________________________________________________________________________________________________\n", 1731 | "norm_c_3 (BatchNormalization) (None, 2, 2, 256) 1024 conv_c_3[0][0] \n", 1732 | "__________________________________________________________________________________________________\n", 1733 | "relu_3_2 (Activation) (None, 2, 2, 256) 0 norm_3_2[0][0] \n", 1734 | "__________________________________________________________________________________________________\n", 1735 | "relu_c_3 (Activation) (None, 2, 2, 256) 0 norm_c_3[0][0] \n", 1736 | "__________________________________________________________________________________________________\n", 1737 | "concatenate_2 (Concatenate) (None, 2, 2, 512) 0 relu_3_2[0][0] \n", 1738 | " relu_c_3[0][0] \n", 1739 | "__________________________________________________________________________________________________\n", 1740 | "conv_4 (Conv2D) (None, 2, 2, 256) 1179904 concatenate_2[0][0] \n", 1741 | "__________________________________________________________________________________________________\n", 1742 | "norm_4 (BatchNormalization) (None, 2, 2, 256) 1024 conv_4[0][0] \n", 1743 | "__________________________________________________________________________________________________\n", 1744 | "relu_4 (Activation) (None, 2, 2, 256) 0 norm_4[0][0] \n", 1745 | "__________________________________________________________________________________________________\n", 1746 | "conv_4_2 (Conv2D) (None, 1, 1, 256) 590080 relu_4[0][0] \n", 1747 | "__________________________________________________________________________________________________\n", 1748 | "norm_4_2 (BatchNormalization) (None, 1, 1, 256) 1024 conv_4_2[0][0] \n", 1749 | "__________________________________________________________________________________________________\n", 1750 | "relu_4_2 (Activation) (None, 1, 1, 256) 0 norm_4_2[0][0] \n", 1751 | "__________________________________________________________________________________________________\n", 1752 | "conv_5_1 (Conv2D) (None, 1, 1, 128) 295040 relu_4_2[0][0] \n", 1753 | "__________________________________________________________________________________________________\n", 1754 | "norm_5_1 (BatchNormalization) (None, 1, 1, 128) 512 conv_5_1[0][0] \n", 1755 | "__________________________________________________________________________________________________\n", 1756 | "relu_5_1 (Activation) (None, 1, 1, 128) 0 norm_5_1[0][0] \n", 1757 | "__________________________________________________________________________________________________\n", 1758 | "avg_pool_5_1 (AveragePooling2D) (None, 1, 1, 128) 0 relu_5_1[0][0] \n", 1759 | "__________________________________________________________________________________________________\n", 1760 | "flatten (Flatten) (None, 128) 0 avg_pool_5_1[0][0] \n", 1761 | "__________________________________________________________________________________________________\n", 1762 | "dense (Dense) (None, 2048) 264192 flatten[0][0] \n", 1763 | "__________________________________________________________________________________________________\n", 1764 | "dropout (Dropout) (None, 2048) 0 dense[0][0] \n", 1765 | 
"__________________________________________________________________________________________________\n", 1766 | "dense_1 (Dense) (None, 1024) 2098176 dropout[0][0] \n", 1767 | "__________________________________________________________________________________________________\n", 1768 | "dropout_1 (Dropout) (None, 1024) 0 dense_1[0][0] \n", 1769 | "__________________________________________________________________________________________________\n", 1770 | "dense_2 (Dense) (None, 16) 16400 dropout_1[0][0] \n", 1771 | "==================================================================================================\n", 1772 | "Total params: 6,805,072\n", 1773 | "Trainable params: 6,800,336\n", 1774 | "Non-trainable params: 4,736\n", 1775 | "__________________________________________________________________________________________________\n", 1776 | "('Failed to import pydot. You must `pip install pydot` and install graphviz (https://graphviz.gitlab.io/download/), ', 'for `pydotprint` to work.')\n" 1777 | ] 1778 | } 1779 | ], 1780 | "source": [ 1781 | "model = get_wavelet_cnn_model()" 1782 | ] 1783 | }, 1784 | { 1785 | "cell_type": "code", 1786 | "execution_count": 24, 1787 | "metadata": { 1788 | "executionInfo": { 1789 | "elapsed": 32861, 1790 | "status": "ok", 1791 | "timestamp": 1602478436735, 1792 | "user": { 1793 | "displayName": "Tanmay Chakraborty", 1794 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 1795 | "userId": "10513402671331353489" 1796 | }, 1797 | "user_tz": -330 1798 | }, 1799 | "id": "OYsuViCjhSA3" 1800 | }, 1801 | "outputs": [], 1802 | "source": [ 1803 | "#adam = Adam(lr=0.001, decay=1e-06)\n", 1804 | "sgd = SGD(learning_rate=0.01, momentum=0.9, nesterov=False)\n", 1805 | "model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])" 1806 | ] 1807 | }, 1808 | { 1809 | "cell_type": "code", 1810 | "execution_count": 25, 1811 | "metadata": { 1812 | "executionInfo": { 1813 | "elapsed": 32854, 1814 | "status": "ok", 1815 | "timestamp": 1602478436736, 1816 | "user": { 1817 | "displayName": "Tanmay Chakraborty", 1818 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 1819 | "userId": "10513402671331353489" 1820 | }, 1821 | "user_tz": -330 1822 | }, 1823 | "id": "sUQZa9UD-rsL" 1824 | }, 1825 | "outputs": [], 1826 | "source": [ 1827 | "filepath = \"best-model.hdf5\"\n", 1828 | "checkpoint = ModelCheckpoint(filepath, monitor='acc', verbose=1, save_best_only=False, mode='max')\n", 1829 | "callbacks_list = [checkpoint]" 1830 | ] 1831 | }, 1832 | { 1833 | "cell_type": "code", 1834 | "execution_count": 26, 1835 | "metadata": { 1836 | "colab": { 1837 | "base_uri": "https://localhost:8080/", 1838 | "height": 1000 1839 | }, 1840 | "executionInfo": { 1841 | "elapsed": 417831, 1842 | "status": "ok", 1843 | "timestamp": 1602478821720, 1844 | "user": { 1845 | "displayName": "Tanmay Chakraborty", 1846 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 1847 | "userId": "10513402671331353489" 1848 | }, 1849 | "user_tz": -330 1850 | }, 1851 | "id": "OVwsWGf1hfg3", 1852 | "outputId": "66ced090-c536-4f79-9bee-71ded4683b4d" 1853 | }, 1854 | "outputs": [ 1855 | { 1856 | "name": "stdout", 1857 | "output_type": "stream", 1858 | "text": [ 1859 | "Epoch 1/150\n", 1860 | "181/181 [==============================] - ETA: 0s - loss: 0.6525 - accuracy: 0.7882\n", 1861 | "Epoch 00001: saving model to best-model.hdf5\n", 1862 | 
"181/181 [==============================] - 19s 104ms/step - loss: 0.6525 - accuracy: 0.7882\n", 1863 | "Epoch 2/150\n", 1864 | "181/181 [==============================] - ETA: 0s - loss: 0.1720 - accuracy: 0.9448\n", 1865 | "Epoch 00002: saving model to best-model.hdf5\n", 1866 | "181/181 [==============================] - 19s 105ms/step - loss: 0.1720 - accuracy: 0.9448\n", 1867 | "Epoch 3/150\n", 1868 | "181/181 [==============================] - ETA: 0s - loss: 0.1618 - accuracy: 0.9429\n", 1869 | "Epoch 00003: saving model to best-model.hdf5\n", 1870 | "181/181 [==============================] - 18s 99ms/step - loss: 0.1618 - accuracy: 0.9429\n", 1871 | "Epoch 4/150\n", 1872 | "181/181 [==============================] - ETA: 0s - loss: 0.0926 - accuracy: 0.9695\n", 1873 | "Epoch 00004: saving model to best-model.hdf5\n", 1874 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0926 - accuracy: 0.9695\n", 1875 | "Epoch 5/150\n", 1876 | "181/181 [==============================] - ETA: 0s - loss: 0.0671 - accuracy: 0.9808\n", 1877 | "Epoch 00005: saving model to best-model.hdf5\n", 1878 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0671 - accuracy: 0.9808\n", 1879 | "Epoch 6/150\n", 1880 | "181/181 [==============================] - ETA: 0s - loss: 0.0748 - accuracy: 0.9763\n", 1881 | "Epoch 00006: saving model to best-model.hdf5\n", 1882 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0748 - accuracy: 0.9763\n", 1883 | "Epoch 7/150\n", 1884 | "181/181 [==============================] - ETA: 0s - loss: 0.0502 - accuracy: 0.9834\n", 1885 | "Epoch 00007: saving model to best-model.hdf5\n", 1886 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0502 - accuracy: 0.9834\n", 1887 | "Epoch 8/150\n", 1888 | "181/181 [==============================] - ETA: 0s - loss: 0.0267 - accuracy: 0.9911\n", 1889 | "Epoch 00008: saving model to best-model.hdf5\n", 1890 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0267 - accuracy: 0.9911\n", 1891 | "Epoch 9/150\n", 1892 | "181/181 [==============================] - ETA: 0s - loss: 0.0343 - accuracy: 0.9893\n", 1893 | "Epoch 00009: saving model to best-model.hdf5\n", 1894 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0343 - accuracy: 0.9893\n", 1895 | "Epoch 10/150\n", 1896 | "181/181 [==============================] - ETA: 0s - loss: 0.0257 - accuracy: 0.9921\n", 1897 | "Epoch 00010: saving model to best-model.hdf5\n", 1898 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0257 - accuracy: 0.9921\n", 1899 | "Epoch 11/150\n", 1900 | "181/181 [==============================] - ETA: 0s - loss: 0.0510 - accuracy: 0.9861\n", 1901 | "Epoch 00011: saving model to best-model.hdf5\n", 1902 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0510 - accuracy: 0.9861\n", 1903 | "Epoch 12/150\n", 1904 | "181/181 [==============================] - ETA: 0s - loss: 0.0394 - accuracy: 0.9898\n", 1905 | "Epoch 00012: saving model to best-model.hdf5\n", 1906 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0394 - accuracy: 0.9898\n", 1907 | "Epoch 13/150\n", 1908 | "181/181 [==============================] - ETA: 0s - loss: 0.0195 - accuracy: 0.9933\n", 1909 | "Epoch 00013: saving model to best-model.hdf5\n", 1910 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0195 - accuracy: 0.9933\n", 1911 | "Epoch 14/150\n", 1912 | "181/181 [==============================] - ETA: 
0s - loss: 0.0405 - accuracy: 0.9884\n", 1913 | "Epoch 00014: saving model to best-model.hdf5\n", 1914 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0405 - accuracy: 0.9884\n", 1915 | "Epoch 15/150\n", 1916 | "181/181 [==============================] - ETA: 0s - loss: 0.0145 - accuracy: 0.9969\n", 1917 | "Epoch 00015: saving model to best-model.hdf5\n", 1918 | "181/181 [==============================] - 20s 110ms/step - loss: 0.0145 - accuracy: 0.9969\n", 1919 | "Epoch 16/150\n", 1920 | "181/181 [==============================] - ETA: 0s - loss: 0.0238 - accuracy: 0.9913\n", 1921 | "Epoch 00016: saving model to best-model.hdf5\n", 1922 | "181/181 [==============================] - 22s 119ms/step - loss: 0.0238 - accuracy: 0.9913\n", 1923 | "Epoch 17/150\n", 1924 | "181/181 [==============================] - ETA: 0s - loss: 0.0423 - accuracy: 0.9884\n", 1925 | "Epoch 00017: saving model to best-model.hdf5\n", 1926 | "181/181 [==============================] - 18s 101ms/step - loss: 0.0423 - accuracy: 0.9884\n", 1927 | "Epoch 18/150\n", 1928 | "181/181 [==============================] - ETA: 0s - loss: 0.0203 - accuracy: 0.9948\n", 1929 | "Epoch 00018: saving model to best-model.hdf5\n", 1930 | "181/181 [==============================] - 20s 110ms/step - loss: 0.0203 - accuracy: 0.9948\n", 1931 | "Epoch 19/150\n", 1932 | "181/181 [==============================] - ETA: 0s - loss: 0.0278 - accuracy: 0.9917\n", 1933 | "Epoch 00019: saving model to best-model.hdf5\n", 1934 | "181/181 [==============================] - 19s 104ms/step - loss: 0.0278 - accuracy: 0.9917\n", 1935 | "Epoch 20/150\n", 1936 | "181/181 [==============================] - ETA: 0s - loss: 0.0102 - accuracy: 0.9972\n", 1937 | "Epoch 00020: saving model to best-model.hdf5\n", 1938 | "181/181 [==============================] - 21s 114ms/step - loss: 0.0102 - accuracy: 0.9972\n", 1939 | "Epoch 21/150\n", 1940 | "181/181 [==============================] - ETA: 0s - loss: 0.0539 - accuracy: 0.9871 E\n", 1941 | "Epoch 00021: saving model to best-model.hdf5\n", 1942 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0539 - accuracy: 0.9871\n", 1943 | "Epoch 22/150\n", 1944 | "181/181 [==============================] - ETA: 0s - loss: 0.0105 - accuracy: 0.9969\n", 1945 | "Epoch 00022: saving model to best-model.hdf5\n", 1946 | "181/181 [==============================] - 19s 105ms/step - loss: 0.0105 - accuracy: 0.9969\n", 1947 | "Epoch 23/150\n", 1948 | "181/181 [==============================] - ETA: 0s - loss: 0.0112 - accuracy: 0.9967\n", 1949 | "Epoch 00023: saving model to best-model.hdf5\n", 1950 | "181/181 [==============================] - 19s 104ms/step - loss: 0.0112 - accuracy: 0.9967\n", 1951 | "Epoch 24/150\n", 1952 | "181/181 [==============================] - ETA: 0s - loss: 0.0189 - accuracy: 0.9939\n", 1953 | "Epoch 00024: saving model to best-model.hdf5\n", 1954 | "181/181 [==============================] - 19s 104ms/step - loss: 0.0189 - accuracy: 0.9939\n", 1955 | "Epoch 25/150\n", 1956 | "180/181 [============================>.] 
- ETA: 0s - loss: 0.0294 - accuracy: 0.9919\n", 1957 | "Epoch 00025: saving model to best-model.hdf5\n", 1958 | "181/181 [==============================] - 20s 111ms/step - loss: 0.0294 - accuracy: 0.9919\n", 1959 | "Epoch 26/150\n", 1960 | "181/181 [==============================] - ETA: 0s - loss: 0.0182 - accuracy: 0.9956\n", 1961 | "Epoch 00026: saving model to best-model.hdf5\n", 1962 | "181/181 [==============================] - 21s 115ms/step - loss: 0.0182 - accuracy: 0.9956\n", 1963 | "Epoch 27/150\n", 1964 | "181/181 [==============================] - ETA: 0s - loss: 0.0084 - accuracy: 0.9972\n", 1965 | "Epoch 00027: saving model to best-model.hdf5\n", 1966 | "181/181 [==============================] - 19s 107ms/step - loss: 0.0084 - accuracy: 0.9972\n", 1967 | "Epoch 28/150\n", 1968 | "181/181 [==============================] - ETA: 0s - loss: 0.0052 - accuracy: 0.9983\n", 1969 | "Epoch 00028: saving model to best-model.hdf5\n", 1970 | "181/181 [==============================] - 18s 102ms/step - loss: 0.0052 - accuracy: 0.9983\n", 1971 | "Epoch 29/150\n", 1972 | "181/181 [==============================] - ETA: 0s - loss: 0.0102 - accuracy: 0.9965\n", 1973 | "Epoch 00029: saving model to best-model.hdf5\n", 1974 | "181/181 [==============================] - 20s 113ms/step - loss: 0.0102 - accuracy: 0.9965\n", 1975 | "Epoch 30/150\n", 1976 | "181/181 [==============================] - ETA: 0s - loss: 0.0122 - accuracy: 0.9958\n", 1977 | "Epoch 00030: saving model to best-model.hdf5\n", 1978 | "181/181 [==============================] - 21s 114ms/step - loss: 0.0122 - accuracy: 0.9958\n", 1979 | "Epoch 31/150\n", 1980 | "181/181 [==============================] - ETA: 0s - loss: 0.0184 - accuracy: 0.9950\n", 1981 | "Epoch 00031: saving model to best-model.hdf5\n", 1982 | "181/181 [==============================] - 19s 104ms/step - loss: 0.0184 - accuracy: 0.9950\n", 1983 | "Epoch 32/150\n", 1984 | "181/181 [==============================] - ETA: 0s - loss: 0.0054 - accuracy: 0.9987\n", 1985 | "Epoch 00032: saving model to best-model.hdf5\n", 1986 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0054 - accuracy: 0.9987\n", 1987 | "Epoch 33/150\n", 1988 | "181/181 [==============================] - ETA: 0s - loss: 0.0282 - accuracy: 0.9921\n", 1989 | "Epoch 00033: saving model to best-model.hdf5\n", 1990 | "181/181 [==============================] - 19s 103ms/step - loss: 0.0282 - accuracy: 0.9921\n", 1991 | "Epoch 34/150\n", 1992 | "181/181 [==============================] - ETA: 0s - loss: 0.0078 - accuracy: 0.9972\n", 1993 | "Epoch 00034: saving model to best-model.hdf5\n", 1994 | "181/181 [==============================] - 21s 114ms/step - loss: 0.0078 - accuracy: 0.9972\n", 1995 | "Epoch 35/150\n", 1996 | "181/181 [==============================] - ETA: 0s - loss: 0.0120 - accuracy: 0.9965\n", 1997 | "Epoch 00035: saving model to best-model.hdf5\n", 1998 | "181/181 [==============================] - 19s 107ms/step - loss: 0.0120 - accuracy: 0.9965\n" 1999 | ] 2000 | }, 2001 | { 2002 | "name": "stdout", 2003 | "output_type": "stream", 2004 | "text": [ 2005 | "Epoch 36/150\n", 2006 | "181/181 [==============================] - ETA: 0s - loss: 0.0082 - accuracy: 0.9974\n", 2007 | "Epoch 00036: saving model to best-model.hdf5\n", 2008 | "181/181 [==============================] - 19s 102ms/step - loss: 0.0082 - accuracy: 0.9974\n", 2009 | "Epoch 37/150\n", 2010 | "181/181 [==============================] - ETA: 0s - loss: 0.0060 - accuracy: 0.9982\n", 2011 | 
"Epoch 00037: saving model to best-model.hdf5\n", 2012 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0060 - accuracy: 0.9982\n", 2013 | "Epoch 38/150\n", 2014 | "181/181 [==============================] - ETA: 0s - loss: 0.0064 - accuracy: 0.9976\n", 2015 | "Epoch 00038: saving model to best-model.hdf5\n", 2016 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0064 - accuracy: 0.9976\n", 2017 | "Epoch 39/150\n", 2018 | "181/181 [==============================] - ETA: 0s - loss: 0.0062 - accuracy: 0.9978\n", 2019 | "Epoch 00039: saving model to best-model.hdf5\n", 2020 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0062 - accuracy: 0.9978\n", 2021 | "Epoch 40/150\n", 2022 | "181/181 [==============================] - ETA: 0s - loss: 0.0070 - accuracy: 0.9985\n", 2023 | "Epoch 00040: saving model to best-model.hdf5\n", 2024 | "181/181 [==============================] - 18s 102ms/step - loss: 0.0070 - accuracy: 0.9985\n", 2025 | "Epoch 41/150\n", 2026 | "181/181 [==============================] - ETA: 0s - loss: 0.0054 - accuracy: 0.9983\n", 2027 | "Epoch 00041: saving model to best-model.hdf5\n", 2028 | "181/181 [==============================] - 18s 101ms/step - loss: 0.0054 - accuracy: 0.9983\n", 2029 | "Epoch 42/150\n", 2030 | "181/181 [==============================] - ETA: 0s - loss: 0.0011 - accuracy: 0.9998\n", 2031 | "Epoch 00042: saving model to best-model.hdf5\n", 2032 | "181/181 [==============================] - 19s 106ms/step - loss: 0.0011 - accuracy: 0.9998\n", 2033 | "Epoch 43/150\n", 2034 | "181/181 [==============================] - ETA: 0s - loss: 0.0073 - accuracy: 0.9978\n", 2035 | "Epoch 00043: saving model to best-model.hdf5\n", 2036 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0073 - accuracy: 0.9978\n", 2037 | "Epoch 44/150\n", 2038 | "181/181 [==============================] - ETA: 0s - loss: 0.0174 - accuracy: 0.9959\n", 2039 | "Epoch 00044: saving model to best-model.hdf5\n", 2040 | "181/181 [==============================] - 19s 102ms/step - loss: 0.0174 - accuracy: 0.9959\n", 2041 | "Epoch 45/150\n", 2042 | "181/181 [==============================] - ETA: 0s - loss: 0.0099 - accuracy: 0.9974\n", 2043 | "Epoch 00045: saving model to best-model.hdf5\n", 2044 | "181/181 [==============================] - 18s 102ms/step - loss: 0.0099 - accuracy: 0.9974\n", 2045 | "Epoch 46/150\n", 2046 | "181/181 [==============================] - ETA: 0s - loss: 0.0029 - accuracy: 0.9993\n", 2047 | "Epoch 00046: saving model to best-model.hdf5\n", 2048 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0029 - accuracy: 0.9993\n", 2049 | "Epoch 47/150\n", 2050 | "181/181 [==============================] - ETA: 0s - loss: 0.0087 - accuracy: 0.9976\n", 2051 | "Epoch 00047: saving model to best-model.hdf5\n", 2052 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0087 - accuracy: 0.9976\n", 2053 | "Epoch 48/150\n", 2054 | "181/181 [==============================] - ETA: 0s - loss: 0.0017 - accuracy: 0.9996\n", 2055 | "Epoch 00048: saving model to best-model.hdf5\n", 2056 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0017 - accuracy: 0.9996\n", 2057 | "Epoch 49/150\n", 2058 | "181/181 [==============================] - ETA: 0s - loss: 7.0718e-04 - accuracy: 0.9996\n", 2059 | "Epoch 00049: saving model to best-model.hdf5\n", 2060 | "181/181 [==============================] - 18s 98ms/step - loss: 7.0718e-04 - accuracy: 0.9996\n", 
2061 | "Epoch 50/150\n", 2062 | "181/181 [==============================] - ETA: 0s - loss: 3.8591e-04 - accuracy: 1.0000\n", 2063 | "Epoch 00050: saving model to best-model.hdf5\n", 2064 | "181/181 [==============================] - 18s 98ms/step - loss: 3.8591e-04 - accuracy: 1.0000\n", 2065 | "Epoch 51/150\n", 2066 | "181/181 [==============================] - ETA: 0s - loss: 0.0013 - accuracy: 0.9994\n", 2067 | "Epoch 00051: saving model to best-model.hdf5\n", 2068 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0013 - accuracy: 0.9994\n", 2069 | "Epoch 52/150\n", 2070 | "181/181 [==============================] - ETA: 0s - loss: 0.0269 - accuracy: 0.9941\n", 2071 | "Epoch 00052: saving model to best-model.hdf5\n", 2072 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0269 - accuracy: 0.9941\n", 2073 | "Epoch 53/150\n", 2074 | "181/181 [==============================] - ETA: 0s - loss: 0.0043 - accuracy: 0.9993\n", 2075 | "Epoch 00053: saving model to best-model.hdf5\n", 2076 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0043 - accuracy: 0.9993\n", 2077 | "Epoch 54/150\n", 2078 | "181/181 [==============================] - ETA: 0s - loss: 0.0020 - accuracy: 0.9994\n", 2079 | "Epoch 00054: saving model to best-model.hdf5\n", 2080 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0020 - accuracy: 0.9994\n", 2081 | "Epoch 55/150\n", 2082 | "181/181 [==============================] - ETA: 0s - loss: 0.0028 - accuracy: 0.9993\n", 2083 | "Epoch 00055: saving model to best-model.hdf5\n", 2084 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0028 - accuracy: 0.9993\n", 2085 | "Epoch 56/150\n", 2086 | "181/181 [==============================] - ETA: 0s - loss: 0.0014 - accuracy: 0.9998\n", 2087 | "Epoch 00056: saving model to best-model.hdf5\n", 2088 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0014 - accuracy: 0.9998\n", 2089 | "Epoch 57/150\n", 2090 | "181/181 [==============================] - ETA: 0s - loss: 0.0027 - accuracy: 0.9994\n", 2091 | "Epoch 00057: saving model to best-model.hdf5\n", 2092 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0027 - accuracy: 0.9994\n", 2093 | "Epoch 58/150\n", 2094 | "181/181 [==============================] - ETA: 0s - loss: 0.0078 - accuracy: 0.9978\n", 2095 | "Epoch 00058: saving model to best-model.hdf5\n", 2096 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0078 - accuracy: 0.9978\n", 2097 | "Epoch 59/150\n", 2098 | "181/181 [==============================] - ETA: 0s - loss: 0.0087 - accuracy: 0.9969\n", 2099 | "Epoch 00059: saving model to best-model.hdf5\n", 2100 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0087 - accuracy: 0.9969\n", 2101 | "Epoch 60/150\n", 2102 | "181/181 [==============================] - ETA: 0s - loss: 0.0033 - accuracy: 0.9991\n", 2103 | "Epoch 00060: saving model to best-model.hdf5\n", 2104 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0033 - accuracy: 0.9991\n", 2105 | "Epoch 61/150\n", 2106 | "181/181 [==============================] - ETA: 0s - loss: 0.0052 - accuracy: 0.9985\n", 2107 | "Epoch 00061: saving model to best-model.hdf5\n", 2108 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0052 - accuracy: 0.9985\n", 2109 | "Epoch 62/150\n", 2110 | "181/181 [==============================] - ETA: 0s - loss: 6.0274e-04 - accuracy: 1.0000\n", 2111 | "Epoch 00062: saving model 
to best-model.hdf5\n", 2112 | "181/181 [==============================] - 18s 99ms/step - loss: 6.0274e-04 - accuracy: 1.0000\n", 2113 | "Epoch 63/150\n", 2114 | "181/181 [==============================] - ETA: 0s - loss: 8.9559e-04 - accuracy: 0.9996\n", 2115 | "Epoch 00063: saving model to best-model.hdf5\n", 2116 | "181/181 [==============================] - 18s 99ms/step - loss: 8.9559e-04 - accuracy: 0.9996\n", 2117 | "Epoch 64/150\n", 2118 | "181/181 [==============================] - ETA: 0s - loss: 0.0062 - accuracy: 0.9983 ETA: \n", 2119 | "Epoch 00064: saving model to best-model.hdf5\n", 2120 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0062 - accuracy: 0.9983\n", 2121 | "Epoch 65/150\n", 2122 | "181/181 [==============================] - ETA: 0s - loss: 0.0107 - accuracy: 0.9970\n", 2123 | "Epoch 00065: saving model to best-model.hdf5\n", 2124 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0107 - accuracy: 0.9970\n", 2125 | "Epoch 66/150\n", 2126 | "181/181 [==============================] - ETA: 0s - loss: 0.0040 - accuracy: 0.9985\n", 2127 | "Epoch 00066: saving model to best-model.hdf5\n", 2128 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0040 - accuracy: 0.9985\n", 2129 | "Epoch 67/150\n", 2130 | "181/181 [==============================] - ETA: 0s - loss: 0.0054 - accuracy: 0.9993\n", 2131 | "Epoch 00067: saving model to best-model.hdf5\n", 2132 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0054 - accuracy: 0.9993\n", 2133 | "Epoch 68/150\n", 2134 | "181/181 [==============================] - ETA: 0s - loss: 0.0029 - accuracy: 0.9989\n", 2135 | "Epoch 00068: saving model to best-model.hdf5\n", 2136 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0029 - accuracy: 0.9989\n", 2137 | "Epoch 69/150\n", 2138 | "181/181 [==============================] - ETA: 0s - loss: 0.0031 - accuracy: 0.9989\n", 2139 | "Epoch 00069: saving model to best-model.hdf5\n", 2140 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0031 - accuracy: 0.9989\n", 2141 | "Epoch 70/150\n", 2142 | "181/181 [==============================] - ETA: 0s - loss: 0.0032 - accuracy: 0.9991\n", 2143 | "Epoch 00070: saving model to best-model.hdf5\n", 2144 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0032 - accuracy: 0.9991\n" 2145 | ] 2146 | }, 2147 | { 2148 | "name": "stdout", 2149 | "output_type": "stream", 2150 | "text": [ 2151 | "Epoch 71/150\n", 2152 | "181/181 [==============================] - ETA: 0s - loss: 0.0059 - accuracy: 0.9980\n", 2153 | "Epoch 00071: saving model to best-model.hdf5\n", 2154 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0059 - accuracy: 0.9980\n", 2155 | "Epoch 72/150\n", 2156 | "181/181 [==============================] - ETA: 0s - loss: 0.0062 - accuracy: 0.9987\n", 2157 | "Epoch 00072: saving model to best-model.hdf5\n", 2158 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0062 - accuracy: 0.9987\n", 2159 | "Epoch 73/150\n", 2160 | "181/181 [==============================] - ETA: 0s - loss: 0.0038 - accuracy: 0.9991\n", 2161 | "Epoch 00073: saving model to best-model.hdf5\n", 2162 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0038 - accuracy: 0.9991\n", 2163 | "Epoch 74/150\n", 2164 | "181/181 [==============================] - ETA: 0s - loss: 0.0027 - accuracy: 0.9991\n", 2165 | "Epoch 00074: saving model to best-model.hdf5\n", 2166 | "181/181 
[==============================] - 18s 98ms/step - loss: 0.0027 - accuracy: 0.9991\n", 2167 | "Epoch 75/150\n", 2168 | "181/181 [==============================] - ETA: 0s - loss: 0.0048 - accuracy: 0.9991\n", 2169 | "Epoch 00075: saving model to best-model.hdf5\n", 2170 | "181/181 [==============================] - 20s 108ms/step - loss: 0.0048 - accuracy: 0.9991\n", 2171 | "Epoch 76/150\n", 2172 | "181/181 [==============================] - ETA: 0s - loss: 4.0937e-04 - accuracy: 1.0000\n", 2173 | "Epoch 00076: saving model to best-model.hdf5\n", 2174 | "181/181 [==============================] - 18s 101ms/step - loss: 4.0937e-04 - accuracy: 1.0000\n", 2175 | "Epoch 77/150\n", 2176 | "181/181 [==============================] - ETA: 0s - loss: 0.0021 - accuracy: 0.9991\n", 2177 | "Epoch 00077: saving model to best-model.hdf5\n", 2178 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0021 - accuracy: 0.9991\n", 2179 | "Epoch 78/150\n", 2180 | "181/181 [==============================] - ETA: 0s - loss: 0.0063 - accuracy: 0.9982\n", 2181 | "Epoch 00078: saving model to best-model.hdf5\n", 2182 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0063 - accuracy: 0.9982\n", 2183 | "Epoch 79/150\n", 2184 | "181/181 [==============================] - ETA: 0s - loss: 0.0025 - accuracy: 0.9996\n", 2185 | "Epoch 00079: saving model to best-model.hdf5\n", 2186 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0025 - accuracy: 0.9996\n", 2187 | "Epoch 80/150\n", 2188 | "181/181 [==============================] - ETA: 0s - loss: 0.0190 - accuracy: 0.9939\n", 2189 | "Epoch 00080: saving model to best-model.hdf5\n", 2190 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0190 - accuracy: 0.9939\n", 2191 | "Epoch 81/150\n", 2192 | "181/181 [==============================] - ETA: 0s - loss: 0.0071 - accuracy: 0.9974\n", 2193 | "Epoch 00081: saving model to best-model.hdf5\n", 2194 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0071 - accuracy: 0.9974\n", 2195 | "Epoch 82/150\n", 2196 | "181/181 [==============================] - ETA: 0s - loss: 0.0131 - accuracy: 0.9961\n", 2197 | "Epoch 00082: saving model to best-model.hdf5\n", 2198 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0131 - accuracy: 0.9961\n", 2199 | "Epoch 83/150\n", 2200 | "181/181 [==============================] - ETA: 0s - loss: 0.0052 - accuracy: 0.9985\n", 2201 | "Epoch 00083: saving model to best-model.hdf5\n", 2202 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0052 - accuracy: 0.9985\n", 2203 | "Epoch 84/150\n", 2204 | "181/181 [==============================] - ETA: 0s - loss: 0.0015 - accuracy: 0.9994\n", 2205 | "Epoch 00084: saving model to best-model.hdf5\n", 2206 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0015 - accuracy: 0.9994\n", 2207 | "Epoch 85/150\n", 2208 | "181/181 [==============================] - ETA: 0s - loss: 0.0045 - accuracy: 0.9987\n", 2209 | "Epoch 00085: saving model to best-model.hdf5\n", 2210 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0045 - accuracy: 0.9987\n", 2211 | "Epoch 86/150\n", 2212 | "181/181 [==============================] - ETA: 0s - loss: 0.0068 - accuracy: 0.9985\n", 2213 | "Epoch 00086: saving model to best-model.hdf5\n", 2214 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0068 - accuracy: 0.9985\n", 2215 | "Epoch 87/150\n", 2216 | "181/181 [==============================] 
- ETA: 0s - loss: 0.0024 - accuracy: 0.9993\n", 2217 | "Epoch 00087: saving model to best-model.hdf5\n", 2218 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0024 - accuracy: 0.9993\n", 2219 | "Epoch 88/150\n", 2220 | "181/181 [==============================] - ETA: 0s - loss: 0.0102 - accuracy: 0.9969\n", 2221 | "Epoch 00088: saving model to best-model.hdf5\n", 2222 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0102 - accuracy: 0.9969\n", 2223 | "Epoch 89/150\n", 2224 | "181/181 [==============================] - ETA: 0s - loss: 0.0089 - accuracy: 0.9970\n", 2225 | "Epoch 00089: saving model to best-model.hdf5\n", 2226 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0089 - accuracy: 0.9970\n", 2227 | "Epoch 90/150\n", 2228 | "181/181 [==============================] - ETA: 0s - loss: 0.0073 - accuracy: 0.9978\n", 2229 | "Epoch 00090: saving model to best-model.hdf5\n", 2230 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0073 - accuracy: 0.9978\n", 2231 | "Epoch 91/150\n", 2232 | "181/181 [==============================] - ETA: 0s - loss: 7.5099e-04 - accuracy: 0.9998\n", 2233 | "Epoch 00091: saving model to best-model.hdf5\n", 2234 | "181/181 [==============================] - 20s 108ms/step - loss: 7.5099e-04 - accuracy: 0.9998\n", 2235 | "Epoch 92/150\n", 2236 | "181/181 [==============================] - ETA: 0s - loss: 0.0012 - accuracy: 0.9994 \n", 2237 | "Epoch 00092: saving model to best-model.hdf5\n", 2238 | "181/181 [==============================] - 19s 107ms/step - loss: 0.0012 - accuracy: 0.9994\n", 2239 | "Epoch 93/150\n", 2240 | "181/181 [==============================] - ETA: 0s - loss: 0.0026 - accuracy: 0.9993\n", 2241 | "Epoch 00093: saving model to best-model.hdf5\n", 2242 | "181/181 [==============================] - 18s 101ms/step - loss: 0.0026 - accuracy: 0.9993\n", 2243 | "Epoch 94/150\n", 2244 | "181/181 [==============================] - ETA: 0s - loss: 5.6871e-04 - accuracy: 1.0000\n", 2245 | "Epoch 00094: saving model to best-model.hdf5\n", 2246 | "181/181 [==============================] - 18s 100ms/step - loss: 5.6871e-04 - accuracy: 1.0000\n", 2247 | "Epoch 95/150\n", 2248 | "181/181 [==============================] - ETA: 0s - loss: 1.7800e-04 - accuracy: 1.0000\n", 2249 | "Epoch 00095: saving model to best-model.hdf5\n", 2250 | "181/181 [==============================] - 18s 100ms/step - loss: 1.7800e-04 - accuracy: 1.0000\n", 2251 | "Epoch 96/150\n", 2252 | "181/181 [==============================] - ETA: 0s - loss: 2.1147e-04 - accuracy: 1.0000\n", 2253 | "Epoch 00096: saving model to best-model.hdf5\n", 2254 | "181/181 [==============================] - 18s 100ms/step - loss: 2.1147e-04 - accuracy: 1.0000\n", 2255 | "Epoch 97/150\n", 2256 | "181/181 [==============================] - ETA: 0s - loss: 2.4996e-04 - accuracy: 1.0000\n", 2257 | "Epoch 00097: saving model to best-model.hdf5\n", 2258 | "181/181 [==============================] - 18s 99ms/step - loss: 2.4996e-04 - accuracy: 1.0000\n", 2259 | "Epoch 98/150\n", 2260 | "181/181 [==============================] - ETA: 0s - loss: 0.0043 - accuracy: 0.9991\n", 2261 | "Epoch 00098: saving model to best-model.hdf5\n", 2262 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0043 - accuracy: 0.9991\n", 2263 | "Epoch 99/150\n", 2264 | "181/181 [==============================] - ETA: 0s - loss: 0.0026 - accuracy: 0.9994\n", 2265 | "Epoch 00099: saving model to best-model.hdf5\n", 2266 | "181/181 
[==============================] - 18s 99ms/step - loss: 0.0026 - accuracy: 0.9994\n", 2267 | "Epoch 100/150\n", 2268 | "181/181 [==============================] - ETA: 0s - loss: 7.3477e-04 - accuracy: 0.9998\n", 2269 | "Epoch 00100: saving model to best-model.hdf5\n", 2270 | "181/181 [==============================] - 18s 99ms/step - loss: 7.3477e-04 - accuracy: 0.9998\n", 2271 | "Epoch 101/150\n", 2272 | "181/181 [==============================] - ETA: 0s - loss: 0.0043 - accuracy: 0.9991\n", 2273 | "Epoch 00101: saving model to best-model.hdf5\n", 2274 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0043 - accuracy: 0.9991\n", 2275 | "Epoch 102/150\n", 2276 | "181/181 [==============================] - ETA: 0s - loss: 2.7936e-04 - accuracy: 1.0000\n", 2277 | "Epoch 00102: saving model to best-model.hdf5\n", 2278 | "181/181 [==============================] - 18s 100ms/step - loss: 2.7936e-04 - accuracy: 1.0000\n", 2279 | "Epoch 103/150\n", 2280 | "181/181 [==============================] - ETA: 0s - loss: 3.2449e-04 - accuracy: 1.0000\n", 2281 | "Epoch 00103: saving model to best-model.hdf5\n", 2282 | "181/181 [==============================] - 18s 99ms/step - loss: 3.2449e-04 - accuracy: 1.0000\n", 2283 | "Epoch 104/150\n", 2284 | "181/181 [==============================] - ETA: 0s - loss: 1.6277e-04 - accuracy: 1.0000\n", 2285 | "Epoch 00104: saving model to best-model.hdf5\n", 2286 | "181/181 [==============================] - 18s 99ms/step - loss: 1.6277e-04 - accuracy: 1.0000\n", 2287 | "Epoch 105/150\n", 2288 | "181/181 [==============================] - ETA: 0s - loss: 0.0024 - accuracy: 0.9996\n", 2289 | "Epoch 00105: saving model to best-model.hdf5\n" 2290 | ] 2291 | }, 2292 | { 2293 | "name": "stdout", 2294 | "output_type": "stream", 2295 | "text": [ 2296 | "181/181 [==============================] - 19s 102ms/step - loss: 0.0024 - accuracy: 0.9996\n", 2297 | "Epoch 106/150\n", 2298 | "181/181 [==============================] - ETA: 0s - loss: 9.4074e-04 - accuracy: 0.9998\n", 2299 | "Epoch 00106: saving model to best-model.hdf5\n", 2300 | "181/181 [==============================] - 18s 98ms/step - loss: 9.4074e-04 - accuracy: 0.9998\n", 2301 | "Epoch 107/150\n", 2302 | "181/181 [==============================] - ETA: 0s - loss: 6.9615e-04 - accuracy: 0.9996\n", 2303 | "Epoch 00107: saving model to best-model.hdf5\n", 2304 | "181/181 [==============================] - 18s 98ms/step - loss: 6.9615e-04 - accuracy: 0.9996\n", 2305 | "Epoch 108/150\n", 2306 | "181/181 [==============================] - ETA: 0s - loss: 0.0065 - accuracy: 0.9982\n", 2307 | "Epoch 00108: saving model to best-model.hdf5\n", 2308 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0065 - accuracy: 0.9982\n", 2309 | "Epoch 109/150\n", 2310 | "181/181 [==============================] - ETA: 0s - loss: 0.0066 - accuracy: 0.9987\n", 2311 | "Epoch 00109: saving model to best-model.hdf5\n", 2312 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0066 - accuracy: 0.9987\n", 2313 | "Epoch 110/150\n", 2314 | "181/181 [==============================] - ETA: 0s - loss: 0.0015 - accuracy: 0.9998\n", 2315 | "Epoch 00110: saving model to best-model.hdf5\n", 2316 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0015 - accuracy: 0.9998\n", 2317 | "Epoch 111/150\n", 2318 | "181/181 [==============================] - ETA: 0s - loss: 5.1282e-04 - accuracy: 0.9998\n", 2319 | "Epoch 00111: saving model to best-model.hdf5\n", 2320 | "181/181 
[==============================] - 18s 99ms/step - loss: 5.1282e-04 - accuracy: 0.9998\n", 2321 | "Epoch 112/150\n", 2322 | "181/181 [==============================] - ETA: 0s - loss: 0.0038 - accuracy: 0.9993\n", 2323 | "Epoch 00112: saving model to best-model.hdf5\n", 2324 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0038 - accuracy: 0.9993\n", 2325 | "Epoch 113/150\n", 2326 | "181/181 [==============================] - ETA: 0s - loss: 4.8119e-04 - accuracy: 1.0000\n", 2327 | "Epoch 00113: saving model to best-model.hdf5\n", 2328 | "181/181 [==============================] - 18s 99ms/step - loss: 4.8119e-04 - accuracy: 1.0000\n", 2329 | "Epoch 114/150\n", 2330 | "181/181 [==============================] - ETA: 0s - loss: 2.0273e-04 - accuracy: 1.0000\n", 2331 | "Epoch 00114: saving model to best-model.hdf5\n", 2332 | "181/181 [==============================] - 18s 99ms/step - loss: 2.0273e-04 - accuracy: 1.0000\n", 2333 | "Epoch 115/150\n", 2334 | "181/181 [==============================] - ETA: 0s - loss: 2.2046e-04 - accuracy: 1.0000\n", 2335 | "Epoch 00115: saving model to best-model.hdf5\n", 2336 | "181/181 [==============================] - 18s 98ms/step - loss: 2.2046e-04 - accuracy: 1.0000\n", 2337 | "Epoch 116/150\n", 2338 | "181/181 [==============================] - ETA: 0s - loss: 9.9546e-05 - accuracy: 1.0000\n", 2339 | "Epoch 00116: saving model to best-model.hdf5\n", 2340 | "181/181 [==============================] - 18s 100ms/step - loss: 9.9546e-05 - accuracy: 1.0000\n", 2341 | "Epoch 117/150\n", 2342 | "181/181 [==============================] - ETA: 0s - loss: 0.0054 - accuracy: 0.9980\n", 2343 | "Epoch 00117: saving model to best-model.hdf5\n", 2344 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0054 - accuracy: 0.9980\n", 2345 | "Epoch 118/150\n", 2346 | "181/181 [==============================] - ETA: 0s - loss: 9.0097e-04 - accuracy: 0.9996\n", 2347 | "Epoch 00118: saving model to best-model.hdf5\n", 2348 | "181/181 [==============================] - 18s 99ms/step - loss: 9.0097e-04 - accuracy: 0.9996\n", 2349 | "Epoch 119/150\n", 2350 | "181/181 [==============================] - ETA: 0s - loss: 0.0067 - accuracy: 0.9994\n", 2351 | "Epoch 00119: saving model to best-model.hdf5\n", 2352 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0067 - accuracy: 0.9994\n", 2353 | "Epoch 120/150\n", 2354 | "181/181 [==============================] - ETA: 0s - loss: 0.0031 - accuracy: 0.9991\n", 2355 | "Epoch 00120: saving model to best-model.hdf5\n", 2356 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0031 - accuracy: 0.9991\n", 2357 | "Epoch 121/150\n", 2358 | "181/181 [==============================] - ETA: 0s - loss: 0.0011 - accuracy: 0.9993\n", 2359 | "Epoch 00121: saving model to best-model.hdf5\n", 2360 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0011 - accuracy: 0.9993\n", 2361 | "Epoch 122/150\n", 2362 | "181/181 [==============================] - ETA: 0s - loss: 7.7555e-04 - accuracy: 0.9998\n", 2363 | "Epoch 00122: saving model to best-model.hdf5\n", 2364 | "181/181 [==============================] - 18s 100ms/step - loss: 7.7555e-04 - accuracy: 0.9998\n", 2365 | "Epoch 123/150\n", 2366 | "181/181 [==============================] - ETA: 0s - loss: 0.0022 - accuracy: 0.9991\n", 2367 | "Epoch 00123: saving model to best-model.hdf5\n", 2368 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0022 - accuracy: 0.9991\n", 2369 | "Epoch 
124/150\n", 2370 | "181/181 [==============================] - ETA: 0s - loss: 0.0042 - accuracy: 0.9989\n", 2371 | "Epoch 00124: saving model to best-model.hdf5\n", 2372 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0042 - accuracy: 0.9989\n", 2373 | "Epoch 125/150\n", 2374 | "181/181 [==============================] - ETA: 0s - loss: 0.0017 - accuracy: 0.9993\n", 2375 | "Epoch 00125: saving model to best-model.hdf5\n", 2376 | "181/181 [==============================] - 18s 99ms/step - loss: 0.0017 - accuracy: 0.9993\n", 2377 | "Epoch 126/150\n", 2378 | "181/181 [==============================] - ETA: 0s - loss: 0.0018 - accuracy: 0.9998\n", 2379 | "Epoch 00126: saving model to best-model.hdf5\n", 2380 | "181/181 [==============================] - 18s 101ms/step - loss: 0.0018 - accuracy: 0.9998\n", 2381 | "Epoch 127/150\n", 2382 | "181/181 [==============================] - ETA: 0s - loss: 0.0030 - accuracy: 0.9994\n", 2383 | "Epoch 00127: saving model to best-model.hdf5\n", 2384 | "181/181 [==============================] - 19s 102ms/step - loss: 0.0030 - accuracy: 0.9994\n", 2385 | "Epoch 128/150\n", 2386 | "181/181 [==============================] - ETA: 0s - loss: 4.7869e-04 - accuracy: 1.0000\n", 2387 | "Epoch 00128: saving model to best-model.hdf5\n", 2388 | "181/181 [==============================] - 19s 105ms/step - loss: 4.7869e-04 - accuracy: 1.0000\n", 2389 | "Epoch 129/150\n", 2390 | "181/181 [==============================] - ETA: 0s - loss: 3.3301e-04 - accuracy: 1.0000\n", 2391 | "Epoch 00129: saving model to best-model.hdf5\n", 2392 | "181/181 [==============================] - 18s 100ms/step - loss: 3.3301e-04 - accuracy: 1.0000\n", 2393 | "Epoch 130/150\n", 2394 | "181/181 [==============================] - ETA: 0s - loss: 1.0445e-04 - accuracy: 1.0000\n", 2395 | "Epoch 00130: saving model to best-model.hdf5\n", 2396 | "181/181 [==============================] - 18s 99ms/step - loss: 1.0445e-04 - accuracy: 1.0000\n", 2397 | "Epoch 131/150\n", 2398 | "181/181 [==============================] - ETA: 0s - loss: 1.8951e-04 - accuracy: 1.0000\n", 2399 | "Epoch 00131: saving model to best-model.hdf5\n", 2400 | "181/181 [==============================] - 18s 99ms/step - loss: 1.8951e-04 - accuracy: 1.0000\n", 2401 | "Epoch 132/150\n", 2402 | "181/181 [==============================] - ETA: 0s - loss: 1.3279e-04 - accuracy: 1.0000\n", 2403 | "Epoch 00132: saving model to best-model.hdf5\n", 2404 | "181/181 [==============================] - 18s 100ms/step - loss: 1.3279e-04 - accuracy: 1.0000\n", 2405 | "Epoch 133/150\n", 2406 | "181/181 [==============================] - ETA: 0s - loss: 8.5358e-05 - accuracy: 1.0000 ETA: 1s - loss:\n", 2407 | "Epoch 00133: saving model to best-model.hdf5\n", 2408 | "181/181 [==============================] - 18s 100ms/step - loss: 8.5358e-05 - accuracy: 1.0000\n", 2409 | "Epoch 134/150\n", 2410 | "181/181 [==============================] - ETA: 0s - loss: 0.0036 - accuracy: 0.9994\n", 2411 | "Epoch 00134: saving model to best-model.hdf5\n", 2412 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0036 - accuracy: 0.9994\n", 2413 | "Epoch 135/150\n", 2414 | "181/181 [==============================] - ETA: 0s - loss: 0.0019 - accuracy: 0.9998\n", 2415 | "Epoch 00135: saving model to best-model.hdf5\n", 2416 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0019 - accuracy: 0.9998\n", 2417 | "Epoch 136/150\n", 2418 | "181/181 [==============================] - ETA: 0s - loss: 
2.0812e-04 - accuracy: 1.0000\n", 2419 | "Epoch 00136: saving model to best-model.hdf5\n", 2420 | "181/181 [==============================] - 18s 99ms/step - loss: 2.0812e-04 - accuracy: 1.0000\n", 2421 | "Epoch 137/150\n", 2422 | "181/181 [==============================] - ETA: 0s - loss: 2.4933e-04 - accuracy: 1.0000\n", 2423 | "Epoch 00137: saving model to best-model.hdf5\n", 2424 | "181/181 [==============================] - 18s 101ms/step - loss: 2.4933e-04 - accuracy: 1.0000\n", 2425 | "Epoch 138/150\n", 2426 | "181/181 [==============================] - ETA: 0s - loss: 0.0036 - accuracy: 0.9989\n", 2427 | "Epoch 00138: saving model to best-model.hdf5\n", 2428 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0036 - accuracy: 0.9989\n", 2429 | "Epoch 139/150\n", 2430 | "181/181 [==============================] - ETA: 0s - loss: 3.0098e-04 - accuracy: 1.0000\n", 2431 | "Epoch 00139: saving model to best-model.hdf5\n", 2432 | "181/181 [==============================] - 18s 100ms/step - loss: 3.0098e-04 - accuracy: 1.0000\n" 2433 | ] 2434 | }, 2435 | { 2436 | "name": "stdout", 2437 | "output_type": "stream", 2438 | "text": [ 2439 | "Epoch 140/150\n", 2440 | "181/181 [==============================] - ETA: 0s - loss: 0.0026 - accuracy: 0.9993\n", 2441 | "Epoch 00140: saving model to best-model.hdf5\n", 2442 | "181/181 [==============================] - 20s 110ms/step - loss: 0.0026 - accuracy: 0.9993\n", 2443 | "Epoch 141/150\n", 2444 | "181/181 [==============================] - ETA: 0s - loss: 0.0101 - accuracy: 0.9972\n", 2445 | "Epoch 00141: saving model to best-model.hdf5\n", 2446 | "181/181 [==============================] - 19s 106ms/step - loss: 0.0101 - accuracy: 0.9972\n", 2447 | "Epoch 142/150\n", 2448 | "181/181 [==============================] - ETA: 0s - loss: 0.0018 - accuracy: 0.9998\n", 2449 | "Epoch 00142: saving model to best-model.hdf5\n", 2450 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0018 - accuracy: 0.9998\n", 2451 | "Epoch 143/150\n", 2452 | "181/181 [==============================] - ETA: 0s - loss: 6.1224e-04 - accuracy: 0.9998\n", 2453 | "Epoch 00143: saving model to best-model.hdf5\n", 2454 | "181/181 [==============================] - 18s 99ms/step - loss: 6.1224e-04 - accuracy: 0.9998\n", 2455 | "Epoch 144/150\n", 2456 | "181/181 [==============================] - ETA: 0s - loss: 0.0011 - accuracy: 0.9996\n", 2457 | "Epoch 00144: saving model to best-model.hdf5\n", 2458 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0011 - accuracy: 0.9996\n", 2459 | "Epoch 145/150\n", 2460 | "181/181 [==============================] - ETA: 0s - loss: 0.0013 - accuracy: 0.9996\n", 2461 | "Epoch 00145: saving model to best-model.hdf5\n", 2462 | "181/181 [==============================] - 18s 100ms/step - loss: 0.0013 - accuracy: 0.9996\n", 2463 | "Epoch 146/150\n", 2464 | "181/181 [==============================] - ETA: 0s - loss: 0.0012 - accuracy: 0.9996 \n", 2465 | "Epoch 00146: saving model to best-model.hdf5\n", 2466 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0012 - accuracy: 0.9996\n", 2467 | "Epoch 147/150\n", 2468 | "181/181 [==============================] - ETA: 0s - loss: 0.0029 - accuracy: 0.9991\n", 2469 | "Epoch 00147: saving model to best-model.hdf5\n", 2470 | "181/181 [==============================] - 18s 98ms/step - loss: 0.0029 - accuracy: 0.9991\n", 2471 | "Epoch 148/150\n", 2472 | "181/181 [==============================] - ETA: 0s - loss: 1.0186e-04 - 
accuracy: 1.0000\n", 2473 | "Epoch 00148: saving model to best-model.hdf5\n", 2474 | "181/181 [==============================] - 18s 98ms/step - loss: 1.0186e-04 - accuracy: 1.0000\n", 2475 | "Epoch 149/150\n", 2476 | "181/181 [==============================] - ETA: 0s - loss: 0.0036 - accuracy: 0.9993\n", 2477 | "Epoch 00149: saving model to best-model.hdf5\n", 2478 | "181/181 [==============================] - 18s 102ms/step - loss: 0.0036 - accuracy: 0.9993\n", 2479 | "Epoch 150/150\n", 2480 | "181/181 [==============================] - ETA: 0s - loss: 4.9897e-04 - accuracy: 1.0000\n", 2481 | "Epoch 00150: saving model to best-model.hdf5\n", 2482 | "181/181 [==============================] - 19s 107ms/step - loss: 4.9897e-04 - accuracy: 1.0000\n" 2483 | ] 2484 | } 2485 | ], 2486 | "source": [ 2487 | "history = model.fit(x=Xtrain, y=ytrain, batch_size = 30, epochs=150, callbacks=callbacks_list)" 2488 | ] 2489 | }, 2490 | { 2491 | "cell_type": "markdown", 2492 | "metadata": { 2493 | "id": "R5-BLNivkkD2" 2494 | }, 2495 | "source": [ 2496 | "# Validation" 2497 | ] 2498 | }, 2499 | { 2500 | "cell_type": "code", 2501 | "execution_count": 27, 2502 | "metadata": { 2503 | "colab": { 2504 | "base_uri": "https://localhost:8080/", 2505 | "height": 445 2506 | }, 2507 | "executionInfo": { 2508 | "elapsed": 417824, 2509 | "status": "ok", 2510 | "timestamp": 1602478821723, 2511 | "user": { 2512 | "displayName": "Tanmay Chakraborty", 2513 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2514 | "userId": "10513402671331353489" 2515 | }, 2516 | "user_tz": -330 2517 | }, 2518 | "id": "rzi0kGwckZIh", 2519 | "outputId": "860ced41-4c23-4c7b-d0d9-88b35f1d005f" 2520 | }, 2521 | "outputs": [ 2522 | { 2523 | "data": { 2524 | "text/plain": [ 2525 | "[]" 2526 | ] 2527 | }, 2528 | "execution_count": 27, 2529 | "metadata": {}, 2530 | "output_type": "execute_result" 2531 | }, 2532 | { 2533 | "data": { 2534 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAasAAAGbCAYAAAB6a7/AAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAA0kUlEQVR4nO3deXyV5Zn/8c91TvaEJCCQQBI2RZBFUSLuGmxVtIvVWovTdrrYsdpa2xntYjvTdmZ+HTvdnZm2DLW2tqNStS7YutOmWhdEVtkCYU2AsJOVLCfn/v1xTmIIgQSSePLc+b5fL17kPOfJc66cLN9z38917secc4iIiAxkoUQXICIi0h2FlYiIDHgKKxERGfAUViIiMuAprEREZMBLStQDDx8+3I0bN67Xx6mvryczM7P3BSVIkOsPcu2g+hMpyLWD6u9PS5cu3eecG9F5e8LCaty4cbz11lu9Pk5paSklJSW9LyhBglx/kGsH1Z9IQa4dVH9/MrNtXW3XNKCIiAx4CisRERnwFFYiIjLgKaxERGTAU1iJiMiAp7ASEZEBT2ElIiIDnsJKREQGPIWViIgMeAorEREZ8BRWIiIy4CmsRERkwFNYiYjIgKewEhGRAS/QYbV1Xz17G6KJLkNERPpZoMPqiw8v5//WNSe6DBER6WeBDquQQdQlugoREelvwQ6rkOEUViIi3gt2WJkRRWklIuK7gIcVGlmJiAwCAQ8r0zkrEZFBIPBhpawSEfFfoMMqHNLISkRkMAh0WJnOWYmIDAqBDqtYN6CIiPgu0GEV1vusREQGhUCHlVawEBEZHAIeVuoGFBEZDIIfVpoHFBHxXrDDKqRpQBGRwSDYYaVuQBGRQSHwYaVZQBER/wU6rLSChYjI4BDosDJD3YAiIoNAoMNKq66LiAwOgQ6rsM5ZiYgMCoEOq1AIdQOKiAwCwQ4rvSlYRGRQ8CCsEl2FiIj0t4CHlaYBRUQGgx6FlZnNMbMyMys3s68fY58SM1thZmvM7K99W2bXQnqflYjIoJDU3Q5mFgZ+BlwBVAJLzGyhc25th31ygZ8Dc5xz281sZD/VewRNA4qIDA49GVnNAsqdc5udc83AAuDaTvv8HfC4c247gHNuT9+W2TVNA4qIDA7WXTedmd1AbMT02fjtTwDnOedu77DPT4FkYCowBLjXOffbLo51C3ALQF5e3swFCxb0qvhHypp5cVszv7wyq1fHSaS6ujqysoJZf5BrB9WfSEGuHVR/f5o9e/ZS51xx5+3dTgMC1sW2zgmXBMwE3gOkA6+b2RvOuQ1HfJJz84H5AMXFxa6kpKQHD39sixvXE926id4eJ5FKS0sDW3+QawfVn0hBrh1UfyL0JKwqgaIOtwuBnV3ss885Vw/Um9nLwFnABvpRWFcKFhEZFHpyzmoJMNHMxptZCjAXWNhpn6eAS8wsycwygPOAdX1b6tFChhosREQGgW5HVs65iJndDjwPhIH7nXNrzOzW+P3znHPrzOw5YBWxnof7nHOr+7NwAIuPrJxzmHU1WykiIj7oyTQgzrlngGc6bZvX6fYPgB/0XWndC4diARV1EFZWiYh4K/ArWAC06p3BIiJeC3ZYtY+sFFYiIj4LdljFz1Mpq0RE/BbwsIr936q0EhHxWsDDStOAIiKDgR9hpQYLERGvBTqsOraui4iIvwIdVm3nrDQNKCLit0CHlWkaUERkUAh0WGkaUERkcAh0WKl1XURkcAh4WGkaUERkMPAirDSwEhHxW7DDKl69pgFFRPwW7LDSChYiIoOCH2Glc1YiIl7zI6yUVSIiXgt0WIXj1WsaUETEb4EOq7YVLHSlYBERvwU6rMJqXRcRGRQCHVZqXRcRGRwCHVam1nURkUEh0GH1zjSgwkpExGeBDqtQe4NFggsREZF+FeywUuu6iMigEOyw0jkrEZFBwY+w0jSgiIjXAh1WWsFCRGRwCHRYta9gobASEfFaoMNKresiIoNDoMNK56xERAaHQIdVPKs0DSgi4rlAh1U4pGlAEZHBINBhpRUsREQGh4CHVex/ta6LiPgt2GEV0goWIiKDQbDDSsstiYgMCoEOq7Ba10VEBoVAh5Va10VEBodAh1VIresiIoNCoMOqfRpQWSUi4rVAh1Vb63qr0kpExGvBDitNA4qIDArBDqv2FSwUViIiPutRWJnZHDMrM7NyM/t6F/eXmFm1ma2I//tW35d6tHdWsHg3Hk1ERBIlqbsdzCwM/Ay4AqgElpjZQufc2k67vuKce38/1HhMWsFCRGRw6MnIahZQ7pzb7JxrBhYA1/ZvWT2jFSxERAaHbkdWQAFQ0eF2JXBeF/tdYGYrgZ3AXc65NZ13MLNbgFsA8vLyKC0tPeGCO2pqjYVUefkmSqMV3ew9MNXV1fX6eUiUINcOqj+Rglw7qP5E6ElYWRfbOg9llgFjnXN1ZnYN8CQw8ahPcm4+MB+guLjYlZSUnFCxnTW2tMKLzzF2/ARKSk7r1bESpbS0lN4+D4kS5NpB9SdSkGsH1Z8IPZkGrASKOtwuJDZ6auecq3HO1cU/fgZINrPhfVblMbRNA6p1XUTEbz0JqyXARDMbb2YpwFxgYccdzCzfLJYcZjYrftz9fV1sZ+GQVrAQERkMup0GdM5FzOx24HkgDNzvnFtjZrfG758H3ADcZmYR4DAw170Lwx2tYCEiMjj05JxV29TeM522zevw8f8A/9O3pXXPzDA0DSgi4rtAr2ABscuEaGAlIuK34IcVup6ViIjvAh9WIdObgkVEfBf4sDKDqOYBRUS8FviwCqFzViIivgt8WJmmAUVEvBf4sAppGlBExHvBDys0DSgi4rvAh5WZWtdFRHznQViZVrAQEfFc4MMqBESjia5CRET6U+DDStOAIiL+C3xYaQULERH/BT6sDLWui4j4LvBhFdKq6yIi3gt8WGkFCxER/wU+rGJvClZYiYj4LPhhZWpdFxHxXeDDyszUui4i4rnghxVoBQsREc8FPqzUDSgi4r/Ah5UZtCqtRES8FviwUjegiIj/Ah9Wep+ViIj/Ah9Wal0XEfFf4MPK0MhKRMR3gQ8rrbouIuK/wIeVqXVdRMR7gQ+rEKaRlYiI5wIfVma6npWIiO8CH1ZawUJExH+BDytDK1iIiPgu+GGlbkAREe8FPqxCBsoqERG/BT6sDHQ9KxERzwU+rPSmYBER//kRVmqwEBHxWuDDKrY2YKKrEBGR/hT4sAqZVrAQEfFd4MNKK1iIiPgv8GGlFSxERPwX+LBS67qIiP+CH1YGTmElIuK1wIeVpgFFRPzXo7AyszlmVmZm5Wb29ePsd66ZtZrZDX1XYje1oYVsRUR8121YmVkY+BlwNTAFuMnMphxjv/8Enu/rIo9HK1iIiPivJyOrWUC5c26zc64ZWABc28V+XwT+AOzpw/q6FUKt6yIivkvqwT4FQEWH25XAeR13MLMC4DrgcuDcYx3IzG4BbgHIy8ujtLT0BMs9WkukhZZW65NjJUJdXZ1qTxDVnzhBrh1UfyL0JKysi22dhzI/Bb7mnGs162r3+Cc5Nx+YD1BcXOxKSk
p6VuVxPFL2Amat9MWxEqG0tFS1J4jqT5wg1w6qPxF6ElaVQFGH24XAzk77FAML4kE1HLjGzCLOuSf7osjjia0NqGlAERGf9SSslgATzWw8sAOYC/xdxx2cc+PbPjaz3wB/fDeCKvZ4al0XEfFdt2HlnIuY2e3EuvzCwP3OuTVmdmv8/nn9XONxhUyt6yIivuvJyArn3DPAM522dRlSzrlP9b6snms7Q+ac43jny0REJLi8WMECNBUoIuKzwIdV22BKU4EiIv4KfFi1fQHqCBQR8Vfwwyo+slJWiYj4K/Bh1dZUoWtaiYj4K/Bh9U6DhcJKRMRXgQ+rtmZ1LWYrIuKv4IeVWtdFRLwX+LDSNKCIiP8CH1aaBhQR8V/gw0orWIiI+C/wYdW+goWmAUVEvBX4sGpfwUJDKxERbwU/rLSChYiI9wIfVlrBQkTEf4EPKy1kKyLiv8CHVfubgnXOSkTEW/6ElbJKRMRbgQ8rTQOKiPgv8GGlKwWLiPgv8GGl1nUREf8FPqza1gZU67qIiL8CH1ZadV1ExH/ehJVTWImIeCvwYWXxicDWaIILERGRfhP8sNI0oIiI9wIfVjpnJSLiv8CH1TtXCk5oGSIi0o8CH1YaWYmI+C/wYaUrBYuI+C/wYdX2Bah1XUTEX8EPq/ZLhCS2DhER6T+BDytNA4qI+C/wYRWKp5WmAUVE/BX4sGpfyFbTgCIi3gp+WKl1XUTEe4EPK10pWETEf4EPK42sRET8F/iwUuu6iIj/Ah9WulKwiIj/Ah9WuviiiIj/vAmrqLJKRMRbgQ+rd95npbQSEfFV8MNKK1iIiHivR2FlZnPMrMzMys3s613cf62ZrTKzFWb2lpld3Peldq1tGlAjKxERfyV1t4OZhYGfAVcAlcASM1vonFvbYbdFwELnnDOzM4FHgMn9UfBR9cX/V1aJiPirJyOrWUC5c26zc64ZWABc23EH51yde2ceLhN416JDVwoWEfFftyMroACo6HC7Ejiv805mdh1wDzASeF9XBzKzW4BbAPLy8igtLT3Bco/WUF8PGBvLN1Haur3Xx3u31dXV9cnzkAhBrh1UfyIFuXZQ/YnQk7CyLrYdNYxxzj0BPGFmlwL/Dry3i33mA/MBiouLXUlJyQkV25XnX/oL0MD4CRMouezUXh/v3VZaWkpfPA+JEOTaQfUnUpBrB9WfCD2ZBqwEijrcLgR2Hmtn59zLwKlmNryXtfWIqcFCRMR7PQmrJcBEMxtvZinAXGBhxx3M7DSL95Cb2TlACrC/r4vtilawEBHxX7fTgM65iJndDjwPhIH7nXNrzOzW+P3zgA8Df29mLcBh4KPuXUoPrWAhIuK/npyzwjn3DPBMp23zOnz8n8B/9m1pPaMVLERE/KcVLEREZMALfFgBhEOmaUAREY95EVYh0/WsRER85klYmVawEBHxmD9hpXlAERFveRJWal0XEfGZH2EV0jSgiIjP/AgrTQOKiHjNi7BS67qIiN+8CCu1rouI+M2LsDIzrWAhIuIxL8IqbEY0mugqRESkv3gRVpoGFBHxmxdhZVrBQkTEa16EVTik1nUREZ95EVZawUJExG9+hJVWsBAR8ZofYaVzViIiXvMkrFDruoiIxzwJK1PruoiIx7wJK61gISLiLy/CSgvZioj4zYuwChm0Kq1ERLzlRVhpBQsREb95EVZhvc9KRMRrXoSVWtdFRPzmSVhpZCUi4jOFlYiIDHh+hFVIC9mKiPjMj7DSyEpExGv+hJWGViIi3vIirLSChYiI37wIK61gISLiNy/CSitYiIj4zYuwCpuhrBIR8ZcXYRUKoetZiYh4zIuw0jSgiIjfvAirsFrXRUS85kVYhUwrWIiI+MyPsNIlQkREvOZHWGkaUETEa56ElaYBRUR85kVYhUOm1nUREY95EVZmhlNYiYh4q0dhZWZzzKzMzMrN7Otd3P8xM1sV//eamZ3V96UeW9i0kK2IiM+6DSszCwM/A64GpgA3mdmUTrttAS5zzp0J/Dswv68LPR4tZCsi4reejKxmAeXOuc3OuWZgAXBtxx2cc6855w7Gb74BFPZtmcenFSxERPxm3Z3rMbMbgDnOuc/Gb38COM85d/sx9r8LmNy2f6f7bgFuAcjLy5u5YMGCXpYPdXV1PF2RzF8rI8y7IrPXx3u31dXVkZWVlegyTkqQawfVn0hBrh1Uf3+aPXv2UudcceftST34XOtiW5cJZ2azgZuBi7u63zk3n/gUYXFxsSspKenBwx9faWkpY8aMxHZupy+O924rLS0NZN0Q7NpB9SdSkGsH1Z8IPQmrSqCow+1CYGfnnczsTOA+4Grn3P6+Ka9ntIKFiIjfenLOagkw0czGm1kKMBdY2HEHMxsDPA58wjm3oe/LPL6QzlmJiHit25GVcy5iZrcDzwNh4H7n3BozuzV+/zzgW8ApwM/NDCDS1Zxjf9EKFiIifuvJNCDOuWeAZzptm9fh488CRzVUvFvCZmpdFxHxmDcrWABaxUJExFNehFU4FAsrDa5ERPzkRVjFs0pTgSIinvIirNqmAdURKCLiJy/Cqm0aUFklIuInL8KqfRpQaSUi4iVPwkrTgCIiPvMrrNRgISLiJU/CKva/skpExE9ehFVORjIAm/fWJbgSERHpD16E1ZVT8slOS+LXr25NdCkiItIPvAirzNQkbpo1hmdX76LyYEOiyxERkT7mRVgBfPLCcZgZv319W6JLERGRPuZNWI3OTefqafk8/OZ26psiiS5HRET6kDdhBXDzxeOpbYzw9MqjLmQsIiIB5lVYzSjKJRwyKnTeSkTEK16FlZmRnZZE9eGWRJciIiJ9yKuwAshJT6bmsM5ZiYj4xLuwyk5P1shKRMQz3oVVjsJKRMQ73oVVdnoyNY0KKxERn/gXVmnJ1GhkJSLiFe/Cqm0a0OnaViIi3vAyrFpaHY0t0USXIiIifcS7sMpOTwJQk4WIiEe8C6uc9Ni1rRRWIiL+8Das1BEoIuIP78IqOy0+smpQWImI+MK7sNI0oIiIf7wNK00Dioj4w7uwGpKmbkAREd94F1ZJ4RBZqUlaeV1ExCPehRWga1qJiHjGz7DSyusiIl7xMqxytPK6iIhXvAyr7HStvC4i4hMvw0oXYBQR8Yu3YaWRlYiIP7wMq+y0ZOqbW2lp1WVCRER84GVY5cQvE6LRlYiIH/wMq4y2JZf0xmARER94GVbtK69rZCUi4gUvw0orr4uI+MXrsNI5KxERP/QorMxsjpmVmVm5mX29i/snm9nrZtZkZnf1fZknJlsjKxERryR1t4OZhYGfAVcAlcASM1vonFvbYbcDwB3Ah/qjyBOlaUAREb/0ZGQ1Cyh3zm12zjUDC4BrO+7gnNvjnFsCDIh0SE0KkRIOaX1AERFP9CSsCoCKDrcr49sGLDPT+oAiIh7pdhoQsC62uZN5MDO7BbgFIC8vj9LS0pM5zBHq6uq6PE6ya2bjtp2Ulh7o9WP0p2PVHwRBrh1UfyIFuXZQ/YnQk7CqBIo63C4Edp7Mgznn5gPzAYqLi11JScnJHOYIpaWldHWc/LWvk
paSREnJeb1+jP50rPqDIMi1g+pPpCDXDqo/EXoyDbgEmGhm480sBZgLLOzfsnpPK6+LiPij25GVcy5iZrcDzwNh4H7n3BozuzV+/zwzywfeArKBqJl9GZjinKvpv9KPb1hmCut31Sbq4UVEpA/1ZBoQ59wzwDOdts3r8HEVsenBAeO0kVk8vmwH1Ydb2lvZRUQkmLxcwQJgcv4QADbs1uhKRCTovA2rSfnZAKyvUliJiASdt2E1OieNIWlJlFUl7LSZiIj0EW/DysyYlDeEMo2sREQCz9uwApiUP4T1VbU4d1LvYRYRkQHC67CaPCqb2sYIu6obE12KiIj0gt9hFe8I1FSgiEiweR1Wp+fFwmp9VS3RqOOOh5fz8JvbE1yViIicKK/DKic9mdE5aZRV1fDE8h0sXLmTJ5fvSHRZIiJygnq0gkWQTcofwoqKQ7y6aT8Aa3fV4JzDrKvF5EVEZCDyemQFsTcHb93fwN7aJj5aXERtY4TKg4cTXZaIiJwA78OqrcnixuJC5s6KXelk7S69UVhEJEi8D6uSSSP42Hlj+NqcyUzOzyZksE5hJSISKN6fs8rNSOG7101vvz1ueCZrdyqsRESCxPuRVWdTRmVrGlBEJGAGX1iNzqby4GFdRVhEJEAGX1iNil06ROetRESCQ2ElIiID3qALqxFDUhmelZLwJoumSCv/+PsV7GmIJrQOEZEgGHRhZWac0cMmi13Vh2mKtPZLHRt31/HE8h2s3Ns/xxcR8cmgCyuINVls3F1Hc+TYo5pIa5Q5P32Fe55Z3y817K6JXbbkUKOutSUi0p3BGVajsmlujbJpb90x99m6v4Hqwy38YVkljS19P/rZXdMEwMEmhZWISHcGbVjB8Zss2q6BVdsY4bnVVX1eQ/vIqknnrEREujMow2r88ExSk0LHbbJYX1VDyKAgN53fL6no8xrawuqgpgFFRLo1KMMqKRxicv6Q4zZZrK+qZfzwTG6aVcTrm/ezbX99j4//ysa9PL6s8rj7vDOyUliJiHRnUIYVxJos2q5t1ZX1VTVMHpXNDTOLCBk8+taR4fP7Jdt5ce3uLj/3hy9s4F+fXnvMY8M756wOR6C+KXKSX4WIyOAwaMPqjFHZHGpoYVd141H31TVFqDhwmMl5Q8jPSeOy00fw+LLK9vBxznHPs+v5RWn5UZ9b3xRh9Y5qqg+3UHHg2NfN2l3TSFZqUvvHIiJybIM2rI7XZLFhd6y5YnJ8n9mTR7KzurH9oo3b9jdwqKGFDbvrjho9Ldt+kNZobNvKykNdPnZzJMr++mamFcSOX6WwEhE5rkEbVm1B1NZkUXGggS37Yuel1u+Kh1X8wo3njBkKxIIIYEXFISA2AtvZaWT25pYDhENGSjjE2zuqu3zsvXWxKcAzC3MB2BOfEhQRka4N2rDKSk1i3CkZrN1VQ31ThBvmvcZH5r1OQ3OEsqoaslKTKMhNB2KhlZESZtm2I8MKYEO8xb3N4i0HmDY6mzNGZ7PqGCOrqnjAnVmYE7utkZWIyHEN2rCCd5os/vvP5eyuaWJfXRMPvLaNdVW1nJ6XRShkQKx7cEZRLkvjI6vlFYfaR11lu98Jq8aWVlZUHGLW+GGcVZjD6h01RKNHN1nsiYfThOFZpIV1zkpEpDuDOqzOyM9m2/4GfvW3zdwws5DZk0Yw76+bWLerpn2asM3MsUNZt6uWQw3NrNtZw2WnjyA/O+2IkdWqymqaI1FmjT+F6QU51DVF2Lzv6Jb3tnDKy05laJodN6yefXsX9zy7rk++3taoo6FZnYciEjyDOqymjI4FUlpSmK/NmcydV06i+nALtY2R9pFTm3PGDqU16nj4zQqaW6PMKMrl9PwhR4ys3tyyH4Bzxw1tPx/19o5DNEei3Pi/r3PfK5sBqKppIjlsDM1IITfV2tvYO4u0Rvl/f1rH//51c3vTR2/M++smZv+wtL0BJAgirVGeXL4jUDWLSN8b1GF1ZmEuKUkh7rpqEiOGpDKtIIdrpucDMDn/yJHVOUWxJovfvr4VgBljcpmUl8XGPXVEWmNLJi3ecoDJ+UPIzUjhtJFZpCeHWVlRza9f3cKbWw7w+LIdQGwacOSQNEIhY2haqP0cVmcvrdvDjkOxDsSHFm/v9df7xub97K5pal9KKghKy/by5d+v4C/r9yS6FBFJoEEdViOGpLL0n9/LJy8c177tG9ecwacvGseMotwj9s3JSGbiyCx2VTcyckgq+dlpnJ43hOZIlG0HGmiKtLJ020FmjR8GQDhkTCvI5uWNe7l30UZSkkKsq6rhUEMzu2sbyctOBSA31dhT29jlG4h/89oWCnLTed/0UTzeywV1nXPt3YnLKw6e9HHebW1vLThWZ6WIDA6DOqwAhqQlH3G7cGgG3/7AVFKSjn5qZo6Nja5mFOViZu2jrw1VtTz6ViUNza1cNTW/ff/pBbls3ltPJOq457rpOAdvbD7A7pom8rLTABiaZrS0Og7UNx/xWOuranhj8wE+ccFYPn7+WGoaI/xx1a6T/jp3HDrMoYYWAJZtO3TSx3m3rY9Pf65WWIkMaoM+rE7EOW1hNSYXgNNGZmEGa3bWMO+vm5hRlMuFp57Svv9ZRbHW9FsvO5UPnDWa9OQwr2/ax+7qxvawyk2NdRx2Pm/1wGtbSUsOMffcIs6fMIwJIzJ5aPG2k6697Y99fnZaoEZWbVOWGlnB4eZWKg40JLoMkYRQWJ2ASyeOYMKITN4zOQ+A9JQwY4dl8MDrW6k8eJgvXn4aZta+/1VT8/n3a6fy+ZJTSUkKUTxuKH8u20NtU+SIkRUc2b6+49BhHl+2gw/NKCA3IwUz4+9mjWHZ9kPHvaxJR6+W7+PyH5ZSeTD2x231jhrCIePG4kI2763nUENzN0c4eY0trTz79q4u2/ZPRFOklS376snNSGZPbVN7y/9g9dOXNnDVT1+mOj5CFhlMFFYnID8njT/fWcKkDp2Cp+cNobYxwhmjsrl88sgj9k9LDvOJC8aRlhwG4IJTT2lfLzA/J3bOamjq0WH1H39ahxncfvlp7dtumFlIenKY+17Z0m2d++qa+NKCFWzeV88T8aaO1TurmTgyi/PjI7/lHd7Y3Nfuf3ULtz24jIfe7F1TSPmeOlqjjmvPGg1odPXn9XtoaG7l6VU7E12KyLtOYdVLbcF1++wjR1VduWDCO1OEeUNiI6uceFi1rWLxWvk+/vT2Lm677DQKh2a075+bkcJHzy3iqRU72FV97AVyo1HHnY+spKaxhQnDM3l61U6cc6zeUc20ghzOKswlZLB8W/9MBTZHojzw2lYAfvB82VHn4k5E2xTgdecUYja4w2pX9WE27old2frRpe9cAaAp0tovV7IWGWgUVr10Y3ERd115OnOm5Xe77/SCnPaV1kfGpwGTQsbwrBR21zRxuLmV7zy9hsKh6XzusglHff7NF4/HAff/revR1YH6Zv7tj2v564a9/Mv7Yl2NG3bX8fLGfeyra2ba6GwyU5OYlJ/dPrJ6euVOnn375Bs3OvvT2zvZXdPEN685g/qmCN9/bv1JH6usqpaUcIip
o7OZMDyT1Tt6NgXqo1c27gNiI+yVFYfYuLuWxojj/f/1N+bOf6PXU64iA53CqpeKhmVw++UTCYeOP6qC2LJNba3t+Tlp7dvzstP4w9JKpnz7OTbsruNf3j+lfeqw82O9/8xRPLR4O9WH3zlv0dIa5d+eXsuF31vEb17bytxzi/j4+WO5evoowiHje8/GAmNaQazh45wxuazYfoj/XrSRLz68nNseXMb3n1vf6z94zjnue2ULE0dm8dlLxvPpi8bx+7cqWL795EZx66tqOXVkFsnhENMLct71jsBXy/fx4V+8xsY+eEN2b72ycR/Ds1L52pzJJIWMx5ZW8ru1zWzcU8eKikP8oZuLfXa2YXct//7HtZSW7aE5Eu2nqmWgaY5EeXDxNuqag/fiRmH1Lpt7bhFXTc1rH2EBfPKCcVw5NY87Lp/IA5+ZdUT7e2e3XDqB+uZW/ufPG3HOEY06vvLoSu5/dQsfOHM0L/7jpXzvw2diZgzPSuXCU09h3a4azN5ZseOcMUOpbYrwoxc3cN3ZBdw0aww/L93EFx5a1qt1Ct/YfIA1O2v4zMXjMTO+9N7TGTkklW88sZqW+Bunq6ob+cKDy3r0xuSyqtr2lUSmFeRQVdPI3tqer1D/1tYD/OiFMj7zmyV85dGVNEV6Pl32avk+PvObJSzddpA7H13Z/sbvRIhGHa+W7+PSicMZMSSV2ZNH8uvXtvLqzgh3XH4aM4py+cHzZT2+iGc06rjr0ZX86m9b+NSvl1D8/17kn36/ghfWVGlK0XP3v7qFbz6xmt+X9V+DVX9J6n4X6UtXTs3nyk5hdOO5Rdx4blGPPn/q6ByuP7uAX76yhfI9dRQMTefJFTv5ylWT+MLs047a/wNnjeaVjfs4dUQWGSmxb/d5E4aRnhzmxuJCvv2BqZjBhOGZ/Odz61m0fg83nVvEmFMy2VPTSENzK1lpSSSHjE376tm4u5b8nHSuPWs0GZF3Xp21Rh0/eWkDwzJTuO7sAiC2sv2/XTuNz/1uKfNf3sxnLxnPbQ8uZfn2Q2zZV89Tt19Ecrjr10vVDS1U1TS2nxOcHh8Vrt5RzexOjSxdeW51Fbc9uBQDxg/P5M/r9+CAH9xwZrfnFl/btI+bH1jCuFMy+dj5Y/jWU2uY/8pmPl9yWvubt7s7xuLN+9m6v566plaGZ6XwvumjSDrG19qVA/XNPLF8Bx8pLmT7/gYO1DdzyenDAfjIzEJeXLubSUND3PGeiVw2aSQf/sVrzPvrJu68ctIRx4m0RvnDskqeWL6Dr86ZzDljhvKHZZWsqqzme9dPZ3hWKs+tqeLFtbt5fPkOpozK5vHPX9jlyH4geeStCl7esJfmSJThQ1L5xjVnHPEC8GRFo659AWuITZO/tmkfd19zBtmd3pMZNDsPHea/Fm0kPTnM33ZE2LC7ltPzhnT/iQNEj767ZjYHuBcIA/c5577X6X6L338N0AB8yjm3rI9rlbgf3XgWZxXl8h/PrKMpEuVTF47j8yWndrnvVVPz+ecnVrf/sYfYG5+Xf+uKI/4g/cOlE5gzLZ+f/aWcBxdvJxJ1pCSFyEgJU9cYIRJ1FA1L5/SRsfUQ73x0JZnJMHLiQWaOHcovSst5c8sBvn/DmUcc96qp+VwzPZ97F23k7cpqlm8/xE2zinj4zQp+GQ+Arqyvip2fagurqfH6F2850G1YLd12kC8tWM5Zhbn87uZZDElL5icvbuDeRRsZPzyT3IxkHnxjO9k0ctEl0SMCc+PuWj7326WMGZbBQ/9wHsMyU3h9035++uJGqqobKS3by6GGZm4rOY1PXzTuqD/qFQca+Nen1/DSuiOXh7r3pY18+YrTuXJKHmnJYeqaIvx+SQXrd9Xwz++bQk7GO38Idx46zCd+tZhNe+tZ8Ob29qnji06LhdV7zsjjnuunk3lwE0nhEDPHDuWDZ41m/submTl2KCWTYs/PonW7+e4z69i8t57UpBCfuG8x//Oxc/j+82WcPSaXG4uLCIWM907Jo6U1ylMrdnLXoyv5yUsbuPvqM477HHfn7cpq1lfVcPX0UUeFSMWBBt7cFeEy57oN/a68sKaKrz62ilE5aeSkJ7No/R4qDx7mV58sPuJ76Zyj8uBhCoem9+hx/u+Nbfzg+TJ+8tGzuHxyHisrDnHnIytpbo2yePMB5v/9TE4beew/7s45Fm85QFMkyqUTh5/U19Yd5xxRR7enHRqaI+0vTtt890/raI06Hr31Aj4671W+/1wZ932yuFf1NEVaSU16d17YWFfL/Byxg1kY2ABcAVQCS4CbnHNrO+xzDfBFYmF1HnCvc+684x23uLjYvfXWW72rHigtLaWkpKTXx0mU3tRfvqeWN7ccZO65RUe8GuzstU37GDMs44juwuNpew9WTnoyZoZzjtaoax8ZOOdYuu0gX/jtG9RGYq/uf/B8Ge+bPop758446pd0T20jV/z4ZaoPt/C5Sydw9zVn8PkHl/LSuj08c8clnDYy66gafvv6Vr711Bpev/tyRuXErit24/++zptbDnDe+GFcf04BFQcOs2V/PdNG5zBnWj6pSSEWrdvNj1/cQE56Mn+47UJOyYq9RSAadXzhoWU8u7oKgAkjMtm8t55rpufzX3PPJikcYn9dEx/6+ascbo7y1O0XtV/PbF9dE1f95GVqGyNceNopGPCXsr3kZ6dRNCydxpYoTZFWmiJRdlU3khQyvvzeiVwzfRRDUpNZvGU/P3yhjA2760gOG1NH57Bpbx21jZHY9OyobH53cywY1+ys5h8eeIvaxghfeu9E/mvRRmriCys/9+VLj3iOOv7s7Klt5JP3L6GsqoY7r5xEWVUtC1fu5LSRWXzlqknMKMrl7375Bpv2xq4C8OQXLjpqSTGAux9fxYIlFTz6uQsoHhcLyb21Tby4djcHG5r5yMzC9uagY3lx7W5uf2gZTZEoGSlhrp0xmptmjWF6QQ6lZXv50oLl1DRGuGFmIfdcP73L0XVzJMpL63YzJC2J8yec0r7P1n31fOC//8a44Zk8eusFpCWHeWRJBV/9wyo+fE4hP/xIbORc3dDC1x9fxbOrq/jgWaP57nXTjlqt5ojnsmwPn/nNEpLDIZyDe66fzo9f3ADAtz8whW888TaHm1u5+eLxfPyCsaxd+kb7c1/fFOGldbv51d+2sKoydl61eOxQvjpnMqNy0mhujbJmZw2vle+jvrmV2y47tX1K/kQs3XaA2x9aTlMkyodmFHDDzMKjjtPY0sq/Pr2WBUu2c+tlp/JPV5zefo7zK4+t4s4rTueL75nIXb96gcc2tvDI5y5ofzF0LK1Rx+ub9vPHVTsZc0oGN188npRwiAVLKvjOwjXMnjSS7143rf13rbfMbKlz7qgU7UlYXQB8xzl3Vfz23QDOuXs67PO/QKlz7uH47TKgxDl3zDYzhVVMkOt/8vk/M29dEuuraikals6f7rjkmFMlpWV7eHnDPr5xzWSSwqH2AKtvijB1dDbTC3MYnpVKVmoSW/fX85f1e6ltbGHlt69sD7/GllYeWrydeX/dxJ7aJsIhIz87rX2x3zY
TR2bxy78vZtzwzCO2NzRHuP9vWzh/winMHDuUbzzwEg+vb+bi04ZTNCydpdsOsm1/AwtuOZ+z41eHbnOwvpmksLX/wXtt0z5++fJmGluipCWHSE0Kk5YcYlhmKp+9ZDyj40HXpjXqeHnjXt7YvJ9l2w6Sl53GzReP59DhFj73u9hILis1iRUVhzglM4UHPjOLaQU5bN5bxz8+spIPzRjNpy8af+Rz2ulnp6E5wp2PrOTZ1VUkh40vXj6RWy87tX3psL21TfzDb99iRlEu3/ng1C6/T3VNEa6+92VaIo4po7PZVd3I+qoa2v5MJIeND5w1mukFOZySlUp6cpioc+2v+Lfsq+dHL5QxvSCHu66axNMrd/L0yl0cbmnl1BGZbN5Xz+T8bMamHua5rS1cevoIPjKzkHDICJmRFDIqDjZw3ytb2r+vQ9KSOHfcMHLTk1lRcYgDDc388YsXH/Hi696XNvKTlzYwZlgGM8cO5c0tB9hd08jV00fxp1U7KRqWwacuHEduRjKZKUmxxwsZYTPqmyJ85bFVjBmWwa8+VcznfreUVZXVpIRDPHbbBZxZmMvOQ4f51lNrWLR+N0khY9wQY/zoETRFory+eT/NkSjjh2dyy6UTcA5+/GIZ++qOPC80JC0JA2qbIlw3o4CpBTkkh43kcIikkJGSFCIpFCI5bEfUFw4Zq3dU84PnyygYms6UUdksWreH5tYoU0Zlc/05BRQOTae51fHzv5SzvqqWWeOH8eaWA5w9JpdIa2xd0GkF2Tx2a2yK9/lFf+Fbi1vZV9fMmYU5TC/I4UB9M1XVjdQ2Rjjc0kqkNUo4bNQ3tXKgvpmMlDANza1MGJHJGaOy+dOqXZxZmMP6XbUMSUviP66fftzz7T3Vm7C6AZjjnPts/PYngPOcc7d32OePwPecc3+L314EfM05d8w0UljFBLn+0tJSZsy6kB+9sIGbZo054VeL66tqWLhiJ0u3HWTdrhpqGmMNAlmpSUwvyOG6swu6PJfX2BJbdqhoWAZpyWF2HDrMC2uqiLQ6Zk8eyakjMns0BVNaWkp5eAz3LtpIWnKYoRnJ3HnlpD75hTsRr5bv45bfvkXB0HQ+MrOI688p6NGr1K5+dqJRx8KVO5k6OpuJJ3k+Yum2A9z9+Nskh0MMz0rlrKJc5kzNJyMlzP2vbuHRtyo5fJxGjEtPH8EvPnYOmfHpv5rGFp5asZMnl+9gUv4Q/uV9U1j82ivsyTyVu594u8vLv8wcO5TbZ59GJOp4fk0Vq3dUUxdvILnn+ulcMnHEEfs75/j9kgr+UraHpdsOkpuRwo8+EpsuX7L1AF9esOKoFzUd5WWn8uQXLmJUTjrVh1v4xuNvc+XUPK6dUXDEflv21fPb17fy+rrtuORMos5xycQRXDElj1njh7VPz9U1RXhxbRWt0djbU8YNz2Ta6Gzqm1r5eWk5v35t6wl3Yb73jDx+dONZ5KQnc7C+madX7eSxpZXtozmAoRnJ/OSjMyiZNJKFK3fyjcffZmhmMl96z+l8aMbo9tmR0tJSxk47l8eXVfJq+T427q7jlKwU8rJjU6sZKWGSwiGi0dhU7ezJI3jvGXm8sXk/33pqDRUHG7jj8onc8Z6JlO+p458eWUFSyHj88xf1qDP6eHoTVh8BruoUVrOcc1/ssM+fgHs6hdVXnXNLOx3rFuAWgLy8vJkLFizo1RcFUFdXR1bW0dNIQRHk+vu69tao43AEMpIh1A/z/Z0NpOe+JepIsu4bNzpKVP1R56hvgZomR0vUYQZGrPawQX6mdfv9a6u9pslR2xIblcVGaLHR2+hMO+lzPq6Lc2FR56hrgYYWR2PEEQWiDpyL/V+QFSIr5d177iNRR1MrtLrYz30kGvs4Eo3d17m+pBCMzwl1+bzubYjS2Br7HpySbqQnvbPP4YgjJXT0Oa7e1N/c6jjY6MjLfGf6NhJ11DU7ctN632A+e/bsLsOqJw0WlUDHl7eFQOf1XnqyD865+cB8iI2s+mJEEeSRCQS7/iDXDqo/kYJcO6j+ROhJDC4BJprZeDNLAeYCCzvtsxD4e4s5H6g+3vkqERGRE9HtyMo5FzGz24HnibWu3++cW2Nmt8bvnwc8Q6wTsJxY6/qn+69kEREZbHr0Pivn3DPEAqnjtnkdPnbAF/q2NBERkRgttyQiIgOewkpERAY8hZWIiAx4CisRERnwFFYiIjLgKaxERGTAU1iJiMiAp7ASEZEBT2ElIiIDnsJKREQGPIWViIgMeAorEREZ8Lq9+GK/PbDZXmBbHxxqOLCvD46TKEGuP8i1g+pPpCDXDqq/P411zo3ovDFhYdVXzOytrq4qGRRBrj/ItYPqT6Qg1w6qPxE0DSgiIgOewkpERAY8H8JqfqIL6KUg1x/k2kH1J1KQawfV/64L/DkrERHxnw8jKxER8ZzCSkREBrzAhpWZzTGzMjMrN7OvJ7qe7phZkZn9xczWmdkaM/tSfPswM3vRzDbG/x+a6FqPxczCZrbczP4Yvx2k2nPN7DEzWx//HlwQsPr/Mf5zs9rMHjaztIFcv5ndb2Z7zGx1h23HrNfM7o7/LpeZ2VWJqbq9lq5q/0H8Z2eVmT1hZrkd7hswtcfrOar+DvfdZWbOzIZ32Dag6j+WQIaVmYWBnwFXA1OAm8xsSmKr6lYEuNM5dwZwPvCFeM1fBxY55yYCi+K3B6ovAes63A5S7fcCzznnJgNnEfs6AlG/mRUAdwDFzrlpQBiYy8Cu/zfAnE7buqw3/nswF5ga/5yfx3/HE+U3HF37i8A059yZwAbgbhiQtUPX9WNmRcAVwPYO2wZi/V0KZFgBs4By59xm51wzsAC4NsE1HZdzbpdzbln841pifywLiNX9QHy3B4APJaTAbphZIfA+4L4Om4NSezZwKfArAOdcs3PuEAGpPy4JSDezJCAD2MkArt859zJwoNPmY9V7LbDAOdfknNsClBP7HU+Irmp3zr3gnIvEb74BFMY/HlC1wzGfe4CfAF8FOnbVDbj6jyWoYVUAVHS4XRnfFghmNg44G1gM5DnndkEs0ICRCSzteH5K7Ac92mFbUGqfAOwFfh2fxrzPzDIJSP3OuR3AD4m9It4FVDvnXiAg9XdwrHqD9vv8GeDZ+MeBqN3MPgjscM6t7HRXIOqH4IaVdbEtED34ZpYF/AH4snOuJtH19ISZvR/Y45xbmuhaTlIScA7wC+fc2UA9A2vK7Lji53auBcYDo4FMM/t4YqvqU4H5fTazbxKb0n+wbVMXuw2o2s0sA/gm8K2u7u5i24Cqv01Qw6oSKOpwu5DYtMiAZmbJxILqQefc4/HNu81sVPz+UcCeRNV3HBcBHzSzrcSmXC83s/8jGLVD7Oel0jm3OH77MWLhFZT63wtscc7tdc61AI8DFxKc+tscq95A/D6b2SeB9wMfc++8QTUItZ9K7IXOyvjvcCGwzMzyCUb9QHDDagkw0czGm1kKsROECxNc03GZmRE7Z7LOOffjDnctBD4Z//iTwFPvdm3dcc7d7ZwrdM6NI/Zc/9k593ECUDuAc64KqDCzSfFN7wHWEp
D6iU3/nW9mGfGfo/cQO+cZlPrbHKvehcBcM0s1s/HARODNBNR3TGY2B/ga8EHnXEOHuwZ87c65t51zI51z4+K/w5XAOfHfiwFffzvnXCD/AdcQ68rZBHwz0fX0oN6LiQ2vVwEr4v+uAU4h1hm1Mf7/sETX2s3XUQL8Mf5xYGoHZgBvxZ//J4GhAav/X4H1wGrgd0DqQK4feJjY+bUWYn8cbz5evcSmqTYBZcDVA7D2cmLndtp+d+cNxNqPVX+n+7cCwwdq/cf6p+WWRERkwAvqNKCIiAwiCisRERnwFFYiIjLgKaxERGTAU1iJiMiAp7ASEZEBT2ElIiID3v8HgoWGiLxB7RMAAAAASUVORK5CYII=\n", 2535 | "text/plain": [ 2536 | "
" 2537 | ] 2538 | }, 2539 | "metadata": { 2540 | "needs_background": "light" 2541 | }, 2542 | "output_type": "display_data" 2543 | } 2544 | ], 2545 | "source": [ 2546 | "plt.figure(figsize=(7,7)) \n", 2547 | "plt.grid() \n", 2548 | "plt.plot(history.history['loss'])\n", 2549 | "#plt.plot(history.history['val_loss'])\n", 2550 | "#plt.ylabel('Loss') \n", 2551 | "#plt.xlabel('Epochs') \n", 2552 | "#plt.legend(['Training','Validation'], loc='upper right') \n", 2553 | "#plt.savefig(\"loss_curve.pdf\") \n", 2554 | "#plt.show()\n", 2555 | "#plt.figure(figsize=(5,5)) \n", 2556 | "#plt.ylim(0,1.1) \n", 2557 | "#plt.grid() \n", 2558 | "#plt.plot(history.history['acc'])\n", 2559 | "#plt.plot(history.history['val_acc'])\n", 2560 | "#plt.ylabel('Accuracy') \n", 2561 | "#plt.xlabel('Epochs') \n", 2562 | "#plt.legend(['Training','Validation']) \n", 2563 | "#plt.savefig(\"acc_curve.pdf\") \n", 2564 | "#plt.show()" 2565 | ] 2566 | }, 2567 | { 2568 | "cell_type": "code", 2569 | "execution_count": 28, 2570 | "metadata": { 2571 | "executionInfo": { 2572 | "elapsed": 418307, 2573 | "status": "ok", 2574 | "timestamp": 1602478822219, 2575 | "user": { 2576 | "displayName": "Tanmay Chakraborty", 2577 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2578 | "userId": "10513402671331353489" 2579 | }, 2580 | "user_tz": -330 2581 | }, 2582 | "id": "HKSPxOqckkD3" 2583 | }, 2584 | "outputs": [], 2585 | "source": [ 2586 | "# load best weights\n", 2587 | "model.load_weights(\"best-model.hdf5\")\n", 2588 | "model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])" 2589 | ] 2590 | }, 2591 | { 2592 | "cell_type": "code", 2593 | "execution_count": 29, 2594 | "metadata": { 2595 | "colab": { 2596 | "base_uri": "https://localhost:8080/", 2597 | "height": 34 2598 | }, 2599 | "executionInfo": { 2600 | "elapsed": 418301, 2601 | "status": "ok", 2602 | "timestamp": 1602478822220, 2603 | "user": { 2604 | "displayName": "Tanmay Chakraborty", 2605 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2606 | "userId": "10513402671331353489" 2607 | }, 2608 | "user_tz": -330 2609 | }, 2610 | "id": "tzGuqZILkkD5", 2611 | "outputId": "58b28fd2-e2a8-47a5-94bd-3bb222491aa0" 2612 | }, 2613 | "outputs": [ 2614 | { 2615 | "data": { 2616 | "text/plain": [ 2617 | "(48717, 24, 24, 3, 1)" 2618 | ] 2619 | }, 2620 | "execution_count": 29, 2621 | "metadata": {}, 2622 | "output_type": "execute_result" 2623 | } 2624 | ], 2625 | "source": [ 2626 | "Xtest = Xtest.reshape(-1, windowSize, windowSize, K, 1)\n", 2627 | "Xtest.shape" 2628 | ] 2629 | }, 2630 | { 2631 | "cell_type": "code", 2632 | "execution_count": 30, 2633 | "metadata": { 2634 | "colab": { 2635 | "base_uri": "https://localhost:8080/", 2636 | "height": 34 2637 | }, 2638 | "executionInfo": { 2639 | "elapsed": 418293, 2640 | "status": "ok", 2641 | "timestamp": 1602478822221, 2642 | "user": { 2643 | "displayName": "Tanmay Chakraborty", 2644 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2645 | "userId": "10513402671331353489" 2646 | }, 2647 | "user_tz": -330 2648 | }, 2649 | "id": "FIhcrDHQkkD7", 2650 | "outputId": "301bc50b-063d-4ad4-b84e-96658ec5bfee" 2651 | }, 2652 | "outputs": [ 2653 | { 2654 | "data": { 2655 | "text/plain": [ 2656 | "(48717, 16)" 2657 | ] 2658 | }, 2659 | "execution_count": 30, 2660 | "metadata": {}, 2661 | "output_type": "execute_result" 2662 | } 2663 | ], 2664 | "source": [ 2665 | "ytest 
= np_utils.to_categorical(ytest)\n", 2666 | "ytest.shape" 2667 | ] 2668 | }, 2669 | { 2670 | "cell_type": "code", 2671 | "execution_count": 31, 2672 | "metadata": { 2673 | "colab": { 2674 | "base_uri": "https://localhost:8080/", 2675 | "height": 403 2676 | }, 2677 | "executionInfo": { 2678 | "elapsed": 427183, 2679 | "status": "ok", 2680 | "timestamp": 1602478831122, 2681 | "user": { 2682 | "displayName": "Tanmay Chakraborty", 2683 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2684 | "userId": "10513402671331353489" 2685 | }, 2686 | "user_tz": -330 2687 | }, 2688 | "id": "x9gnEB8wkkD-", 2689 | "outputId": "38e44cce-3f72-40f7-b8e4-5914a2294f69" 2690 | }, 2691 | "outputs": [ 2692 | { 2693 | "name": "stdout", 2694 | "output_type": "stream", 2695 | "text": [ 2696 | " precision recall f1-score support\n", 2697 | "\n", 2698 | " 0 1.00 1.00 1.00 1808\n", 2699 | " 1 1.00 1.00 1.00 3354\n", 2700 | " 2 1.00 1.00 1.00 1779\n", 2701 | " 3 0.99 1.00 1.00 1255\n", 2702 | " 4 1.00 1.00 1.00 2410\n", 2703 | " 5 1.00 1.00 1.00 3563\n", 2704 | " 6 1.00 0.99 1.00 3221\n", 2705 | " 7 1.00 1.00 1.00 10144\n", 2706 | " 8 1.00 1.00 1.00 5583\n", 2707 | " 9 1.00 1.00 1.00 2950\n", 2708 | " 10 1.00 1.00 1.00 961\n", 2709 | " 11 1.00 1.00 1.00 1734\n", 2710 | " 12 1.00 1.00 1.00 825\n", 2711 | " 13 1.00 1.00 1.00 963\n", 2712 | " 14 1.00 1.00 1.00 6541\n", 2713 | " 15 0.99 1.00 0.99 1626\n", 2714 | "\n", 2715 | " accuracy 1.00 48717\n", 2716 | " macro avg 1.00 1.00 1.00 48717\n", 2717 | "weighted avg 1.00 1.00 1.00 48717\n", 2718 | "\n" 2719 | ] 2720 | } 2721 | ], 2722 | "source": [ 2723 | "Y_pred_test = model.predict(Xtest)\n", 2724 | "y_pred_test = np.argmax(Y_pred_test, axis=1)\n", 2725 | " \n", 2726 | "classification = classification_report(np.argmax(ytest, axis=1), y_pred_test)\n", 2727 | "print(classification)" 2728 | ] 2729 | }, 2730 | { 2731 | "cell_type": "code", 2732 | "execution_count": 32, 2733 | "metadata": { 2734 | "executionInfo": { 2735 | "elapsed": 427171, 2736 | "status": "ok", 2737 | "timestamp": 1602478831123, 2738 | "user": { 2739 | "displayName": "Tanmay Chakraborty", 2740 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2741 | "userId": "10513402671331353489" 2742 | }, 2743 | "user_tz": -330 2744 | }, 2745 | "id": "M8Z-62eCkkEA" 2746 | }, 2747 | "outputs": [], 2748 | "source": [ 2749 | "def AA_andEachClassAccuracy(confusion_matrix):\n", 2750 | " counter = confusion_matrix.shape[0]\n", 2751 | " list_diag = np.diag(confusion_matrix)\n", 2752 | " list_raw_sum = np.sum(confusion_matrix, axis=1)\n", 2753 | " each_acc = np.nan_to_num(truediv(list_diag, list_raw_sum))\n", 2754 | " average_acc = np.mean(each_acc)\n", 2755 | " return each_acc, average_acc" 2756 | ] 2757 | }, 2758 | { 2759 | "cell_type": "code", 2760 | "execution_count": 33, 2761 | "metadata": { 2762 | "executionInfo": { 2763 | "elapsed": 427165, 2764 | "status": "ok", 2765 | "timestamp": 1602478831124, 2766 | "user": { 2767 | "displayName": "Tanmay Chakraborty", 2768 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2769 | "userId": "10513402671331353489" 2770 | }, 2771 | "user_tz": -330 2772 | }, 2773 | "id": "Jw2j7mjQkkEC" 2774 | }, 2775 | "outputs": [], 2776 | "source": [ 2777 | "def reports (X_test,y_test,name):\n", 2778 | " #start = time.time()\n", 2779 | " Y_pred = model.predict(X_test)\n", 2780 | " y_pred = np.argmax(Y_pred, axis=1)\n", 2781 | " #end = 
time.time()\n", 2782 | " #print(end - start)\n", 2783 | " if name == 'IP':\n", 2784 | " target_names = ['Alfalfa', 'Corn-notill', 'Corn-mintill', 'Corn'\n", 2785 | " ,'Grass-pasture', 'Grass-trees', 'Grass-pasture-mowed', \n", 2786 | " 'Hay-windrowed', 'Oats', 'Soybean-notill', 'Soybean-mintill',\n", 2787 | " 'Soybean-clean', 'Wheat', 'Woods', 'Buildings-Grass-Trees-Drives',\n", 2788 | " 'Stone-Steel-Towers']\n", 2789 | " elif name == 'SA':\n", 2790 | " target_names = ['Brocoli_green_weeds_1','Brocoli_green_weeds_2','Fallow','Fallow_rough_plow','Fallow_smooth',\n", 2791 | " 'Stubble','Celery','Grapes_untrained','Soil_vinyard_develop','Corn_senesced_green_weeds',\n", 2792 | " 'Lettuce_romaine_4wk','Lettuce_romaine_5wk','Lettuce_romaine_6wk','Lettuce_romaine_7wk',\n", 2793 | " 'Vinyard_untrained','Vinyard_vertical_trellis']\n", 2794 | " elif name == 'PU':\n", 2795 | " target_names = ['Asphalt','Meadows','Gravel','Trees', 'Painted metal sheets','Bare Soil','Bitumen',\n", 2796 | " 'Self-Blocking Bricks','Shadows']\n", 2797 | " \n", 2798 | " classification = classification_report(np.argmax(y_test, axis=1), y_pred, target_names=target_names)\n", 2799 | " oa = accuracy_score(np.argmax(y_test, axis=1), y_pred)\n", 2800 | " confusion = confusion_matrix(np.argmax(y_test, axis=1), y_pred)\n", 2801 | " each_acc, aa = AA_andEachClassAccuracy(confusion)\n", 2802 | " kappa = cohen_kappa_score(np.argmax(y_test, axis=1), y_pred)\n", 2803 | " score = model.evaluate(X_test, y_test, batch_size=32)\n", 2804 | " Test_Loss = score[0]*100\n", 2805 | " Test_accuracy = score[1]*100\n", 2806 | " \n", 2807 | " return classification, confusion, Test_Loss, Test_accuracy, oa*100, each_acc*100, aa*100, kappa*100" 2808 | ] 2809 | }, 2810 | { 2811 | "cell_type": "code", 2812 | "execution_count": 34, 2813 | "metadata": { 2814 | "colab": { 2815 | "base_uri": "https://localhost:8080/", 2816 | "height": 34 2817 | }, 2818 | "executionInfo": { 2819 | "elapsed": 445851, 2820 | "status": "ok", 2821 | "timestamp": 1602478849818, 2822 | "user": { 2823 | "displayName": "Tanmay Chakraborty", 2824 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2825 | "userId": "10513402671331353489" 2826 | }, 2827 | "user_tz": -330 2828 | }, 2829 | "id": "wiez8wEtkkEE", 2830 | "outputId": "9955427d-ee60-4060-b1eb-9e79bbf5b67f" 2831 | }, 2832 | "outputs": [ 2833 | { 2834 | "name": "stdout", 2835 | "output_type": "stream", 2836 | "text": [ 2837 | "1523/1523 [==============================] - 26s 17ms/step - loss: 0.0041 - accuracy: 0.9994\n" 2838 | ] 2839 | } 2840 | ], 2841 | "source": [ 2842 | "classification, confusion, Test_loss, Test_accuracy, oa, each_acc, aa, kappa = reports(Xtest,ytest,dataset)" 2843 | ] 2844 | }, 2845 | { 2846 | "cell_type": "code", 2847 | "execution_count": 35, 2848 | "metadata": { 2849 | "executionInfo": { 2850 | "elapsed": 445840, 2851 | "status": "ok", 2852 | "timestamp": 1602478849819, 2853 | "user": { 2854 | "displayName": "Tanmay Chakraborty", 2855 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2856 | "userId": "10513402671331353489" 2857 | }, 2858 | "user_tz": -330 2859 | }, 2860 | "id": "kR-idaI8J5bl" 2861 | }, 2862 | "outputs": [], 2863 | "source": [ 2864 | "classification = str(classification)\n", 2865 | "confusion = str(confusion)\n", 2866 | "file_name = \"classification_report.txt\"\n", 2867 | "\n", 2868 | "with open(file_name, 'w') as x_file:\n", 2869 | " x_file.write('{} Test loss 
(%)'.format(Test_loss))\n", 2870 | " x_file.write('\\n')\n", 2871 | " x_file.write('{} Test accuracy (%)'.format(Test_accuracy))\n", 2872 | " x_file.write('\\n')\n", 2873 | " x_file.write('\\n')\n", 2874 | " x_file.write('{} Kappa accuracy (%)'.format(kappa))\n", 2875 | " x_file.write('\\n')\n", 2876 | " x_file.write('{} Overall accuracy (%)'.format(oa))\n", 2877 | " x_file.write('\\n')\n", 2878 | " x_file.write('{} Average accuracy (%)'.format(aa))\n", 2879 | " x_file.write('\\n')\n", 2880 | " x_file.write('\\n')\n", 2881 | " x_file.write('{}'.format(classification))\n", 2882 | " x_file.write('\\n')\n", 2883 | " x_file.write('{}'.format(confusion))" 2884 | ] 2885 | }, 2886 | { 2887 | "cell_type": "code", 2888 | "execution_count": 36, 2889 | "metadata": { 2890 | "executionInfo": { 2891 | "elapsed": 445833, 2892 | "status": "ok", 2893 | "timestamp": 1602478849820, 2894 | "user": { 2895 | "displayName": "Tanmay Chakraborty", 2896 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2897 | "userId": "10513402671331353489" 2898 | }, 2899 | "user_tz": -330 2900 | }, 2901 | "id": "xGcIixswkkEG" 2902 | }, 2903 | "outputs": [], 2904 | "source": [ 2905 | "def Patch(data,height_index,width_index):\n", 2906 | " height_slice = slice(height_index, height_index+PATCH_SIZE)\n", 2907 | " width_slice = slice(width_index, width_index+PATCH_SIZE)\n", 2908 | " patch = data[height_slice, width_slice, :]\n", 2909 | " \n", 2910 | " return patch" 2911 | ] 2912 | }, 2913 | { 2914 | "cell_type": "code", 2915 | "execution_count": 37, 2916 | "metadata": { 2917 | "executionInfo": { 2918 | "elapsed": 445826, 2919 | "status": "ok", 2920 | "timestamp": 1602478849821, 2921 | "user": { 2922 | "displayName": "Tanmay Chakraborty", 2923 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2924 | "userId": "10513402671331353489" 2925 | }, 2926 | "user_tz": -330 2927 | }, 2928 | "id": "r-HdxMrCkkEJ" 2929 | }, 2930 | "outputs": [], 2931 | "source": [ 2932 | "# load the original image\n", 2933 | "X, y = loadData(dataset)" 2934 | ] 2935 | }, 2936 | { 2937 | "cell_type": "code", 2938 | "execution_count": 38, 2939 | "metadata": { 2940 | "executionInfo": { 2941 | "elapsed": 445812, 2942 | "status": "ok", 2943 | "timestamp": 1602478849821, 2944 | "user": { 2945 | "displayName": "Tanmay Chakraborty", 2946 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2947 | "userId": "10513402671331353489" 2948 | }, 2949 | "user_tz": -330 2950 | }, 2951 | "id": "6GaaBm4BkkEL" 2952 | }, 2953 | "outputs": [], 2954 | "source": [ 2955 | "height = y.shape[0]\n", 2956 | "width = y.shape[1]\n", 2957 | "PATCH_SIZE = windowSize" 2958 | ] 2959 | }, 2960 | { 2961 | "cell_type": "code", 2962 | "execution_count": 39, 2963 | "metadata": { 2964 | "executionInfo": { 2965 | "elapsed": 470026, 2966 | "status": "ok", 2967 | "timestamp": 1602478874044, 2968 | "user": { 2969 | "displayName": "Tanmay Chakraborty", 2970 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2971 | "userId": "10513402671331353489" 2972 | }, 2973 | "user_tz": -330 2974 | }, 2975 | "id": "oumhazm3kkEN" 2976 | }, 2977 | "outputs": [], 2978 | "source": [ 2979 | "K = 3\n", 2980 | "X,fa = applyFA(X, numComponents=K)\n" 2981 | ] 2982 | }, 2983 | { 2984 | "cell_type": "code", 2985 | "execution_count": 40, 2986 | "metadata": { 2987 | "executionInfo": { 2988 | "elapsed": 470018, 2989 | 
"status": "ok", 2990 | "timestamp": 1602478874046, 2991 | "user": { 2992 | "displayName": "Tanmay Chakraborty", 2993 | "photoUrl": "https://lh3.googleusercontent.com/a-/AOh14Gi7eevbipSJbPNmjfEacGnvwa7ZJkT_EcljzNX6FQ=s64", 2994 | "userId": "10513402671331353489" 2995 | }, 2996 | "user_tz": -330 2997 | }, 2998 | "id": "tvZuH_0-kkEP" 2999 | }, 3000 | "outputs": [], 3001 | "source": [ 3002 | "X = padWithZeros(X, PATCH_SIZE//2)" 3003 | ] 3004 | }, 3005 | { 3006 | "cell_type": "code", 3007 | "execution_count": 42, 3008 | "metadata": { 3009 | "id": "4sHtc-12kkER" 3010 | }, 3011 | "outputs": [], 3012 | "source": [ 3013 | "# calculate the predicted image\n", 3014 | "outputs = np.zeros((height,width))\n", 3015 | "for i in range(height):\n", 3016 | " for j in range(width):\n", 3017 | " target = int(y[i,j])\n", 3018 | " if target == 0 :\n", 3019 | " continue\n", 3020 | " else :\n", 3021 | " image_patch=Patch(X,i,j)\n", 3022 | " X_test_image = image_patch.reshape(1,image_patch.shape[0],image_patch.shape[1], image_patch.shape[2], 1).astype('float32') \n", 3023 | " prediction = (model.predict(X_test_image))\n", 3024 | " prediction = np.argmax(prediction, axis=1)\n", 3025 | " outputs[i][j] = prediction+1" 3026 | ] 3027 | }, 3028 | { 3029 | "cell_type": "code", 3030 | "execution_count": 43, 3031 | "metadata": { 3032 | "id": "KpnGO-c_kkET" 3033 | }, 3034 | "outputs": [ 3035 | { 3036 | "name": "stderr", 3037 | "output_type": "stream", 3038 | "text": [ 3039 | "C:\\Users\\Utkarsh Trehan\\anaconda3\\envs\\MALIS\\lib\\site-packages\\spectral\\graphics\\spypylab.py:27: MatplotlibDeprecationWarning:\n", 3040 | "\n", 3041 | "\n", 3042 | "The keymap.all_axes rcparam was deprecated in Matplotlib 3.3 and will be removed two minor releases later.\n", 3043 | "\n", 3044 | "C:\\Users\\Utkarsh Trehan\\anaconda3\\envs\\MALIS\\lib\\site-packages\\spectral\\graphics\\spypylab.py:905: MatplotlibDeprecationWarning:\n", 3045 | "\n", 3046 | "Passing parameters norm and vmin/vmax simultaneously is deprecated since 3.3 and will become an error two minor releases later. 
Please pass vmin/vmax directly to the norm when creating it.\n", 3047 | "\n" 3048 | ] 3049 | }, 3050 | { 3051 | "data": { 3052 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAMkAAAGfCAYAAAD1ZvZbAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAA6xklEQVR4nO29bYw02XnX/bvYGJNgYG12bSa2Yxuy4LW/wPiWu6RYPMCjEJMeYYNktCCQP7h6pR1DggQia6ERkW4hDBLR8yVzi+1KFD9AYlaEEGsqwgRDFEVKdbJ3K4SsX/BCjLPx4I0h5lWE2Fx8OOd0V1VXdVdVV3W9zPlJe+9MT3dVTU/9+3o91xFVxePxlPNb+r4Aj2foeJF4PAfwIvF4DuBF4vEcwIvE4zmAF4nHc4DORCIi7xWRz4nISyLybFfn8Xi6Rrqok4jII8C/Bb4deBn4eeDPquqnWz+Zx9MxXVmSdwMvqeq/V9X/DXwceF9H5/J4OuUbOjruG4FfSX3/MjAre7KI+LI/8K6OjvuwpeM/PPyUsfMVVX08/2BXIpGCxzJCEJGngac7On+7lEm46Lc84hTSgeurAqICAg/VnClZ1LvwWaT2Vz3wOvNLoEibb80p+Q9FD3YlkpeBN6e+fxPwpfQTVPU54DkYtiVRQERR3f2zi/lpK+cR90/L74QoqNjbtsbxg6V5ooqU3/L22JtzAUkoSNTe+zIEuhLJzwNPiMjbgF8FngL+XEfnah1VIwzc7VEgkMzzN//Uw51jeyAxN3QnHxnmXCqwCsufFSzV/LpLI6rC3zz1C6/C7TOSxfbnWvbaEdKJSFT1ayLyF4FPAo8AP6iqL3ZxrjZRlAUrLi7WzK8vuLm0H75irEbemqgKsvmU1to6Uc3diB3dVaKgmxOVn2QWKURsrc4O5ndcLWTvcZKF0KKR7Z2uLAmq+hPAT3R1/FZRYJFwcX4B8Xzz8IoQiA7+rTc2x/rklRHdxAupBzuxJmI/3lWMtXBxSR23StQJpOL5JqKSTuoktS+ip5jEeQ0PLubEzHd+Pifm8ibe/K3L3qrtB3R9a4IKkruXVN1N1i5GjzYwcS5RmV+UEgZQO9gPljaAH5dOHqrqvfyDd1YkSQhRCLf3r/c+L75aQxCZb3RP6LG5Ger/KvnMUaeZLrDZp5JPevs7rha7P6qLiW9GlekqFMmd691SFMKE++fzgwIBuL5/i7vxVVywvYsYr998Mte9JpWMtg6EDo0RTYcKufgKZwe1FYGAsT5dWMRTc3dEouYmuLi+YH67zsQe+4iZk4Sp22qPCNT9uK5QRHdjGRs/tElZ/UJRRJVVKLXdqgonHb1M7oa7ZX2r+e262evnMfFlDGwSqcV1k01Kt0Fswm58YG7eJhecZW/sgW4+BLpitmQssckddLeSkOv5nPn6trlA3KEwxYUqmS7zRQO3a+eR5tZExV2C7Enpmp/Pom4/o8yHSqen6JRJikQVSELm61uTtaroWpUSz1nPb1N3cfEdpyr2k7/hnS26G5s0Eptxq0rFodlMnbBNBXdBsmhWRxoKExOJEpJw8WDO/P758eJIETPPlKrLbl3luDrazo1U80DbtGtJ5srlvcVdaYOTNGAVSkedBN0zCZEoECZwPb/gdl49KK/LfF0t0+V+3jzyzvZDHewrFPdfcV3CZa5MD5c5ds5oHT5JK4xTJeMP3K1b1ZUw8tzcXG5vxJLMzSaAzzUAVkYlK45cYS/zVKRcSO769lXwUy9WNNOL1QXBsJsfpxW4q3WtTikQgJVsXa591mQTmjSxJvmUcL4iL7iqjBFsaReisRz7dJp3urqMTTYnHBmjEolzp6/ncy6uu3Wtyrh/vQ3gS+MS3RbRmhrJ9M3rzpbJWJWJw9aDqp41q8f2azNFDMB5qcVo3C3FNBzen5+f4Ir2E5+dQxRsvi99CyXl+zd1u9I37T5PZY9LVuE0qTKjueDWi4opZtblGqDTNWJ3KwlZJAxCIADXt/czn9Vlf2xBj3e/0zd9Wfu6NbHHZY98pquMwYrEZWRcMbBKn9WpiJmzYLX5viw2UZVtbFIXu1Zlr+WwUj22Pyqf6dKTlMfHo5KBulvK9fyisH29d+bxJg6qk+lS09p7+Pg2s7Wng53jrUYJPtM1DncrIYRw1Vwg87jdC8qTShQstsZkf93EGYSyqNhG5Jt7dF8XSYf0k+nq/0P6EMMQybsw9Y7rOfevj+yzOmG263Y9r5TpYq/bZJcAb/qs9uPE1kUW6tSZrmQxjkb6QYjkd730Lds2krZu8q4tCkA8J1ylHyjv6do4Fu7OU0l15la7Gzef8h3evJnpJyqdW5NVKIM3JoMQydEUCeJEFuV2Pa+d6artVrnsVYXztMMB89j22Qae6pqGSE5cUMyfu2qmC6jsVoGNEXQbK2yyZVjJdORypUOFU2S6VuGw3a7xieQUblRNbudr3F1VdEuZsUM12Nylmg0U0g2KJedqg3RKWKDz9SbmnMOVyfhE0qfV2EN2paILxnXrW1U1HWgmy1T4lNR5ugquJeVEniTTBQw1OBmfSAbKxYOceHWz7OkgebfqUBEk86HbaQpqK5O7nOmarkj6cMvyy/2KnpL6d/OVKzTWCGDTma5T6OQuZ7qmK5JTu2XxnHBVfrduKuW4IRKp2MJMta5+LjlNpivTrnKHM13TFUkP3K7nJIXDqLfiKJqy0uQOPEWmC8gs8z1VpmtovfTjEMkAM1qFxHOiMNcs6P5fMGw7/cTa605Omely6WtOk+kCBuV2jUMkA81oFXF7/zp78+97cmqCiFI/uJhqpmtoQfxwRDIWa1GBfKarrEqyHUHU3BpMMdNlzzIYazIckYzIWlRCs4H1nmUh268brok/RU9X+kSnyHQNyZoMRyRToiDTtX8EUXuZrq6Ckz4yXWk3r0+8SDoin+mqfl8NPNN1wp6uZCGDaFfxIumKAvdRbbvKzuNjynS5r6me6TraNetZJ14kHbK+nZ8u05XrHRtSpuuYyStDiE28SDrELEHO/ok7y3SJVumKaQnd2K27kOnyIumY+YOLzPenynR1KZP0rO2TZbp6bFfxIjkBIUnm+6qZrtrkM11d2pPs3nKdEiy72tu+Gl4kXWNbVZpkafdOWCkhk+nqsENYXOcy3WW6gqUyW4JZj9PH/BiDF8kJmN+uSXKbdVbKdDW5L06U6QIgZUzMzdwem1Gonf8Sh/mGfk9/d7h/Pgfd7gl/ONMlm5u8rqehbvqjPULTGcEVz4ZbiYkel8lysY3Z1rrqlIzu8ZbkVMTzQmtSRDrT1XQ337TbdZpuleMyXWm3aiDa2OAtyQm5fz4Hto2c7mYo+pDf2gH7vRtyVxXRzFyvrqzJdmo+dg97rWxNNpaD4h26CjlRsiCNtyQnpkmmq+lNkcrU2hu4IxpkuoKl
[... remainder of base64-encoded PNG output omitted (ground-truth class map rendered by spectral.imshow) ...]\n", 3053 | "text/plain": [ 3054 | "
" 3055 | ] 3056 | }, 3057 | "metadata": { 3058 | "needs_background": "light" 3059 | }, 3060 | "output_type": "display_data" 3061 | } 3062 | ], 3063 | "source": [ 3064 | "ground_truth = spectral.imshow(classes = y,figsize =(7,7))" 3065 | ] 3066 | }, 3067 | { 3068 | "cell_type": "code", 3069 | "execution_count": 44, 3070 | "metadata": { 3071 | "id": "5yIVQ8-hkkEW" 3072 | }, 3073 | "outputs": [ 3074 | { 3075 | "data": { 3076 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAMkAAAGfCAYAAAD1ZvZbAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAA670lEQVR4nO2da4ws6Vnff08Wx1ycsHb22Glsgw0s8dpfkvGRuyQsQhIRDD2KnUiONlEif3D1SjtOIFIiWAuNgnQUxYkUlC/MUbYLhHMBswoQrCmEAScIIbna7GlxW1/iTXDM4okXB8gNcbHz5MP7vt3V1VXdVdVVXZd5f9KenenprqrpqX8/1/d5RVXxeDzF/ImuL8Dj6TteJB7PAbxIPJ4DeJF4PAfwIvF4DuBF4vEcoDWRiMjbROSTIvK8iDzV1nk8nraRNuokIvIQ8J+BbwFeAH4J+Fuq+rHGT+bxtExbluQtwPOq+l9V9Y+ADwBvb+lcHk+rfElLx3018Jup718ApkVPFhFf9gfe3NJxHzR0/AeHnzJ0Pq+qd7IPtiUSyXlsSwgi8gTwREvnb5YiCef9lkecQlpwfVVAVEDggZozJfNqFz6N1P6qB15nfgkUafKtOSX/Le/BtkTyAvDa1PevAT6bfoKqPg08Df22JAqIKKq7f3YxP23kPOL+afidEAUVe9tWOH6wME9UkeJb3h57fS4gCQWJmntf+kBbIvkl4FEReT3wW8DjwN9u6VyNo2qEgbs9cgSy9fz1P9Vw59gcSMwN3cpHhjmXCizD4mcFCzW/7sKIKvc3T/3Cy3DzjGS++bkWvXaAtCISVf2CiPw94EPAQ8APqupzbZyrSRRlzpLz8xWzq3OuL+yHrxirkbUmqoKsP6W1sk5UMzdiS3eVKOj6RMUnmUYKERurs4P5HZdz2XucZC40aGQ7py1Lgqr+FPBTbR2/URSYJ5yfnUM8Wz+8JASig3/rtc2xPnlpRNfxQurBVqyJ2I93FWMtXFxSxa0SdQIpeb6RqKSVOknli+goJnFew/3zGTGznZ/PiLm4jtd/66K3avMBXd2aoIJk7iVVd5M1i9GjDUycS1TkF6WEAVQO9oOFDeCHpZMHqno3++CtFUkSQhTCzb2rvc+LL1cQROYb3RN6rG+G6r9KNnPUaqYLbPap4JPe/o7L+e6PqmLim0FlunJFcut6txSFMOHe2eygQACu7t3gbnwVF2zvIsbrN5/MVa9JZUtbB0KH2oimQ4VMfIWzg9qIQMBYnzYs4qm5PSJRcxOcX50zu1ltxR77iJmRhKnbao8I1P24qlBEd2MZGz80SVH9QlFElWUold2qEicdvExuh7tlfavZzare62cx8UUMrBOp+XWTdUq3RmzCbnxgbt46F7zN3tgDXX8ItMV0wVBik1vobiUhV7MZs9VNfYG4Q2GKC2UyXeaLGm7XziP1rYmKuwTZk9I1P59G7X5GmQ+VVk/RKqMUiSqQhMxWNyZrVdK1KiSesZrdpO7i/DtOVewnf807W3Q3NqklNuNWFYpDtzN1wiYV3AbJvF4dqS+MTCRKSML5/Rmze2fHiyNFzGyrVF106yrH1dF2bqSKB9qkXQsyVy7vLe5Ka5ykBstQWuokaJ9RiESBMIGr2Tk3s/JBeVVmq3KZLvfz+pH3dj/Uwb5Ccf/l1yVc5sr0cJljZ4zW4ZM0wjBVMvzA3bpVbQkjy/X1xeZGLMjcrAP4TANgaVS2xZEp7G09FSkWkru+fRX81IsV3erFaoOg382P4wrc1bpWpxQIwFI2Ltc+a7IOTepYk2xKOFuRF1xVxgi2sAvRWI59Os06XW3GJusTDoxBicS501ezGedX7bpWRdy72gTwhXGJbopodY1k+uZ1Z9vKWBWJw9aDyp51W4/N12by6IHzUonBuFuKaTi8Nzs7wRXtJ56cQRSsvy98CyXl+9d1u9I37T5PZY9LVuI0qTKjueDGi4opptbl6qHTNWB3KwmZJ/RCIABXN/e2PquL/tiCHu9+p2/6ovZ1a2KPyx75TFcRvRWJy8i4YmCZPqtTETNjznL9fVFsoiqb2KQqdq3KXsthpXpsf1Q206UnKY8PRyU9dbeUq9l5bvt658zidRxUJdOlprX38PFtZmtPBzvHW40CfKZrGO5WQgjhsr5AZnGzF5QllSiYb4zJ/rqJMwhFUbGNyNf36L4ukhbpJtPV/Yf0Ifohkjdj6h1XM+5dHdlndcJs181qVirTxV63yS4BXvdZ7ceJrY0s1KkzXcl8GI30vRDJVz7/1Zs2kqZu8rYtCkA8I1ymHyju6Vo7Fu7OU0l15pa7G9ef8i3evFvTT1RatybLUHpvTHohkqPJE8SJLMrNalY501XZrXLZqxLnaYYD5rHps/U81TUOkZy4oJg9d9lMF1DarQIbI+gmVlhny7CSacnlSocKp8h0LcN+u13DE8kp3KiK3MxWuLsq75YyY4cqsL5LdTtQSDcoFpyrCdIpYYHW15uYc/ZXJsMTSZdWYw/bKxVdMK4b36qs6UC3sky5T0mdp63gWlJO5EkyXUBfg5PhiaSnnN/PiFfXy54OknWrDhVBtj50W01BbWRymzNd4xVJF25Zdrlf3lNS/66/coXGCgFsOtN1Cp3c5kzXeEVyarcsnhEui+/WdaUcN0QiFVuYqdblzyWnyXRttavc4kzXeEXSATerGUnuMOqNOPKmrNS5A0+R6QK2lvmeKtPVt176YYikhxmtXOIZUZhpFnT/zxm2nX5i5XUnp8x0ufQ1p8l0Ab1yu4Yhkp5mtPK4uXe1ffPve3JqgohSPbgYa6arb0F8f0QyFGtRgmymq6hKshlBVN8ajDHTZc/SG2vSH5EMyFqUQrcD6z3LQjZf11wTf4qervSJTpHp6pM16Y9IxkROpmv/CKLmMl1tBSddZLrSbl6XeJG0RDbTVf6+6nmm64Q9XclcetGu4kXSFjnuo9p2lZ3Hh5Tpcl9TPtN1tGvWsU68SFpkdTM7XaYr0zvWp0zXMZNX+hCbeJG0iFmCvP0nbi3TJVqmK6YhdG23bkOmy4ukZWb3z7e+P1Wmq02ZpGdtnyzT1WG7ihfJCQhJtr4vm+mqTDbT1aY92d5brlWCRVt725fDi
[... remainder of base64-encoded PNG output omitted (predicted classification map rendered by spectral.imshow) ...]\n", 3077 | "text/plain": [ 3078 | "
" 3079 | ] 3080 | }, 3081 | "metadata": { 3082 | "needs_background": "light" 3083 | }, 3084 | "output_type": "display_data" 3085 | } 3086 | ], 3087 | "source": [ 3088 | "predict_image = spectral.imshow(classes = outputs.astype(int),figsize =(7,7))" 3089 | ] 3090 | }, 3091 | { 3092 | "cell_type": "code", 3093 | "execution_count": 45, 3094 | "metadata": { 3095 | "id": "vBPvnosekkEZ" 3096 | }, 3097 | "outputs": [], 3098 | "source": [ 3099 | "spectral.save_rgb(\"predictions.jpg\", outputs.astype(int), colors=spectral.spy_colors)" 3100 | ] 3101 | }, 3102 | { 3103 | "cell_type": "markdown", 3104 | "metadata": { 3105 | "id": "JL4rV6j7kkEa" 3106 | }, 3107 | "source": [ 3108 | "spectral.save_rgb(str(dataset)+\"_ground_truth.jpg\", y, colors=spectral.spy_colors)" 3109 | ] 3110 | } 3111 | ], 3112 | "metadata": { 3113 | "accelerator": "GPU", 3114 | "colab": { 3115 | "authorship_tag": "ABX9TyN/aVtrq8w9XQ9tKxeZX/5h", 3116 | "collapsed_sections": [], 3117 | "name": "Untitled1.ipynb", 3118 | "provenance": [] 3119 | }, 3120 | "kernelspec": { 3121 | "display_name": "Python 3", 3122 | "language": "python", 3123 | "name": "python3" 3124 | }, 3125 | "language_info": { 3126 | "codemirror_mode": { 3127 | "name": "ipython", 3128 | "version": 3 3129 | }, 3130 | "file_extension": ".py", 3131 | "mimetype": "text/x-python", 3132 | "name": "python", 3133 | "nbconvert_exporter": "python", 3134 | "pygments_lexer": "ipython3", 3135 | "version": "3.8.5" 3136 | } 3137 | }, 3138 | "nbformat": 4, 3139 | "nbformat_minor": 1 3140 | } 3141 | -------------------------------------------------------------------------------- /best-model.hdf5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/best-model.hdf5 -------------------------------------------------------------------------------- /classification_report.txt: -------------------------------------------------------------------------------- 1 | 0.4067569971084595 Test loss (%) 2 | 99.94047284126282 Test accuracy (%) 3 | 4 | 99.93371857995123 Kappa accuracy (%) 5 | 99.94047252499128 Overall accuracy (%) 6 | 99.93818383736027 Average accuracy (%) 7 | 8 | precision recall f1-score support 9 | 10 | Brocoli_green_weeds_1 1.00 1.00 1.00 1808 11 | Brocoli_green_weeds_2 1.00 1.00 1.00 3354 12 | Fallow 1.00 1.00 1.00 1779 13 | Fallow_rough_plow 0.99 1.00 1.00 1255 14 | Fallow_smooth 1.00 1.00 1.00 2410 15 | Stubble 1.00 1.00 1.00 3563 16 | Celery 1.00 0.99 1.00 3221 17 | Grapes_untrained 1.00 1.00 1.00 10144 18 | Soil_vinyard_develop 1.00 1.00 1.00 5583 19 | Corn_senesced_green_weeds 1.00 1.00 1.00 2950 20 | Lettuce_romaine_4wk 1.00 1.00 1.00 961 21 | Lettuce_romaine_5wk 1.00 1.00 1.00 1734 22 | Lettuce_romaine_6wk 1.00 1.00 1.00 825 23 | Lettuce_romaine_7wk 1.00 1.00 1.00 963 24 | Vinyard_untrained 1.00 1.00 1.00 6541 25 | Vinyard_vertical_trellis 0.99 1.00 0.99 1626 26 | 27 | accuracy 1.00 48717 28 | macro avg 1.00 1.00 1.00 48717 29 | weighted avg 1.00 1.00 1.00 48717 30 | 31 | [[ 1808 0 0 0 0 0 0 0 0 0 0 0 32 | 0 0 0 0] 33 | [ 0 3354 0 0 0 0 0 0 0 0 0 0 34 | 0 0 0 0] 35 | [ 0 0 1779 0 0 0 0 0 0 0 0 0 36 | 0 0 0 0] 37 | [ 0 0 0 1255 0 0 0 0 0 0 0 0 38 | 0 0 0 0] 39 | [ 0 0 0 10 2400 0 0 0 0 0 0 0 40 | 0 0 0 0] 41 | [ 0 0 0 0 0 3563 0 0 0 0 0 0 42 | 0 0 0 0] 43 | [ 0 0 0 0 0 0 3203 0 0 0 0 0 44 | 0 0 0 18] 45 | [ 0 0 0 0 0 0 0 10144 0 0 0 0 46 | 0 0 0 0] 47 | [ 0 0 0 0 0 0 0 0 5583 0 0 0 48 | 0 0 0 0] 49 | [ 0 0 0 0 0 0 0 0 0 2950 0 0 50 | 0 0 0 0] 51 | 
[ 0 0 0 0 0 0 0 0 0 0 961 0 52 | 0 0 0 0] 53 | [ 0 0 0 0 0 0 0 0 0 0 0 1734 54 | 0 0 0 0] 55 | [ 0 0 0 0 0 0 0 0 0 0 0 0 56 | 825 0 0 0] 57 | [ 0 0 0 0 0 0 0 0 0 0 0 0 58 | 0 963 0 0] 59 | [ 0 0 0 0 0 0 0 1 0 0 0 0 60 | 0 0 6540 0] 61 | [ 0 0 0 0 0 0 0 0 0 0 0 0 62 | 0 0 0 1626]] -------------------------------------------------------------------------------- /data/Indian_pines_corrected.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/data/Indian_pines_corrected.mat -------------------------------------------------------------------------------- /data/Indian_pines_gt.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/data/Indian_pines_gt.mat -------------------------------------------------------------------------------- /data/PaviaU.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/data/PaviaU.mat -------------------------------------------------------------------------------- /data/PaviaU_gt.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/data/PaviaU_gt.mat -------------------------------------------------------------------------------- /data/Salinas_corrected.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/data/Salinas_corrected.mat -------------------------------------------------------------------------------- /data/Salinas_gt.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/data/Salinas_gt.mat -------------------------------------------------------------------------------- /figure/Architecture.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/Architecture.png -------------------------------------------------------------------------------- /figure/HSI-RN.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/HSI-RN.jpg -------------------------------------------------------------------------------- /figure/IP-FC.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/IP-FC.jpg -------------------------------------------------------------------------------- /figure/IP-GT.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/IP-GT.jpg -------------------------------------------------------------------------------- /figure/IP-Pr.jpg: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/IP-Pr.jpg -------------------------------------------------------------------------------- /figure/IP_legend.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/IP_legend.jpg -------------------------------------------------------------------------------- /figure/SA-FC.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/SA-FC.jpg -------------------------------------------------------------------------------- /figure/SA-GT.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/SA-GT.jpg -------------------------------------------------------------------------------- /figure/SA-Pr.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/SA-Pr.jpg -------------------------------------------------------------------------------- /figure/SA_legend.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/SA_legend.jpg -------------------------------------------------------------------------------- /figure/UP-FC.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/UP-FC.jpg -------------------------------------------------------------------------------- /figure/UP-GT.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/UP-GT.jpg -------------------------------------------------------------------------------- /figure/UP-Pr.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/UP-Pr.jpg -------------------------------------------------------------------------------- /figure/UP_legend.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/figure/UP_legend.jpg -------------------------------------------------------------------------------- /predictions.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/predictions.jpg -------------------------------------------------------------------------------- /wavelet_cnn_0.5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/eternal-vanguard/SpectralNET/513986a1f9a9a51d066a389f84724ed13610faec/wavelet_cnn_0.5.png 
--------------------------------------------------------------------------------
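For readers skimming the dump: the closing cells of `Spectral NET.ipynb` above render the predicted label map with the `spectral` (Spectral Python) package and write it to `predictions.jpg`. The sketch below mirrors those calls in a self-contained form; the `.mat` variable name `salinas_gt`, the output filename `Salinas_ground_truth.jpg`, and the reuse of the ground truth `y` as a stand-in for the model's `outputs` are assumptions made only so the snippet runs on its own.

```python
# Minimal sketch of the notebook's visualisation / saving step (Spectral Python).
# Assumptions: the Salinas ground truth lives in data/Salinas_gt.mat under the
# key "salinas_gt", and `outputs` is a (rows, cols) array of predicted class IDs.
import scipy.io as sio
import spectral

y = sio.loadmat("data/Salinas_gt.mat")["salinas_gt"]   # ground-truth label map
outputs = y.copy()                                      # stand-in for model predictions

# Display the predicted classification map, as in the notebook's final code cell.
view = spectral.imshow(classes=outputs.astype(int), figsize=(7, 7))

# Save prediction and ground-truth maps as RGB images using the default palette.
spectral.save_rgb("predictions.jpg", outputs.astype(int), colors=spectral.spy_colors)
spectral.save_rgb("Salinas_ground_truth.jpg", y, colors=spectral.spy_colors)
```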
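`best-model.hdf5` above is shipped only as a binary blob; judging by its name and extension it is most likely a Keras checkpoint saved during training (for example by a `ModelCheckpoint` callback), but this excerpt does not state that, so treat the loading sketch below as an assumption rather than the repository's documented API. The test-set name `X_test` is hypothetical.

```python
# Hedged sketch: load the shipped checkpoint, assuming it is a Keras HDF5 model.
from tensorflow.keras.models import load_model

model = load_model("best-model.hdf5")  # may need custom_objects=... if custom layers are used
model.summary()
# preds = model.predict(X_test)        # X_test: hypothetical preprocessed test patches
```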
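`classification_report.txt` above lists overall accuracy, average (per-class) accuracy, Cohen's kappa, a per-class precision/recall/F1 table and a confusion matrix for the Salinas test split. This excerpt does not show the code that produced the file, but such numbers are commonly computed with scikit-learn; a minimal sketch, assuming flat test-label arrays `y_true` and `y_pred` and the Salinas class names as `target_names`, is:

```python
# Sketch of reproducing the numbers in classification_report.txt with scikit-learn.
# Assumption: y_true / y_pred are 1-D integer label arrays for the test pixels.
import numpy as np
from sklearn.metrics import (accuracy_score, classification_report,
                             cohen_kappa_score, confusion_matrix)

def print_report(y_true, y_pred, target_names):
    oa = accuracy_score(y_true, y_pred)            # overall accuracy
    kappa = cohen_kappa_score(y_true, y_pred)      # Cohen's kappa
    cm = confusion_matrix(y_true, y_pred)
    aa = np.mean(np.diag(cm) / cm.sum(axis=1))     # average accuracy = mean per-class recall
    print(f"{kappa * 100} Kappa accuracy (%)")
    print(f"{oa * 100} Overall accuracy (%)")
    print(f"{aa * 100} Average accuracy (%)")
    print(classification_report(y_true, y_pred, target_names=target_names))
    print(cm)
```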