├── .gitattributes
├── 9781484266151.jpg
├── Chapter 1.ipynb
├── Chapter1
│   ├── Chapter1
│   ├── Coin.png
│   └── Mario.png
├── Chapter2
│   ├── Chapter2
│   ├── DogVsCatClassification.ipynb
│   ├── MNIST.png
│   └── MNISTImageUsingCNN.ipynb
├── Chapter3
│   ├── Chapter3
│   ├── GermanTrafficClassificationUsingLeNet-5.ipynb
│   ├── Important
│   └── LeNet MNIST Digit Classification.ipynb
├── Chapter4
│   ├── CIFAR10_AlexNet.ipynb
│   ├── CIFAR10_VGG.ipynb
│   └── Chapter4
├── Chapter5
│   ├── Chapter5
│   └── YOLO.ipynb
├── Chapter6
│   ├── Chapter6
│   ├── Chapter6.ipynb
│   └── Important
├── Chapter7
│   ├── Dataset and codes for Chapter7
│   ├── Important
│   └── VideoAnalyticsChapter7.ipynb
├── Chapter8
│   ├── Chapter 8.ipynb
│   ├── Chapter8
│   └── Hoover.jpg
├── Coin.png
├── Contributing.md
├── LICENSE.txt
├── Mario.png
└── README.md

/.gitattributes:
--------------------------------------------------------------------------------
# Auto detect text files and perform LF normalization
* text=auto

/9781484266151.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Apress/computer-vision-using-deep-learning/5c08fdbc4ffd67297512cc4d929c111c885e0c75/9781484266151.jpg

/Chapter1/Chapter1:
--------------------------------------------------------------------------------
(empty placeholder file)

/Chapter1/Coin.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Apress/computer-vision-using-deep-learning/5c08fdbc4ffd67297512cc4d929c111c885e0c75/Chapter1/Coin.png

/Chapter1/Mario.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Apress/computer-vision-using-deep-learning/5c08fdbc4ffd67297512cc4d929c111c885e0c75/Chapter1/Mario.png

/Chapter2/Chapter2:
--------------------------------------------------------------------------------
(empty placeholder file)

/Chapter2/DogVsCatClassification.ipynb:
--------------------------------------------------------------------------------
# Here we are building an image classifier using Keras. The dataset used is from the
# Kaggle Dogs vs. Cats image classification problem. First, let's build the dataset:
# Step 1: Download the dataset from Kaggle.
# Step 2: Unzip the dataset.
# Step 3: You will find two folders, test and train.
# Step 4: Delete the test folder; we will create our own test folder.
# Step 5: Inside both the train and test folders, create two sub-folders, cats and dogs.
# Step 6: Put all the cat images in the cats folder and all the dog images in the dogs folder.
# Step 7: Take some images (I took 2,000) from the train/cats folder and put them in test/cats.
# Step 8: Take some images (I took 2,000) from the train/dogs folder and put them in test/dogs.
# Step 9: Your dataset is ready.
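The folder shuffling in steps 7 and 8 can be scripted instead of done by hand. A minimal sketch follows; the base path and the count of 2,000 match the steps above, but the exact file names inside the Kaggle archive are not fixed by this repo, so the sketch just takes the first 2,000 files of each class:

import os
import shutil

# Assumed layout: train/cats and train/dogs already populated (steps 1-6).
base = '/Users/vaibhavverdhan/Book Writing/DogsCats'   # hypothetical base path

for label in ('cats', 'dogs'):
    src = os.path.join(base, 'train', label)
    dst = os.path.join(base, 'test', label)
    os.makedirs(dst, exist_ok=True)
    # Move the first 2,000 images of this class into the test folder.
    for name in sorted(os.listdir(src))[:2000]:
        shutil.move(os.path.join(src, name), os.path.join(dst, name))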
from keras.models import Sequential  # Import the Sequential model.
# Generally there are two ways of building Keras models, sequential and functional;
# Sequential is the most common one.
from keras.layers import Conv2D, Activation, MaxPooling2D, Dense, Flatten, Dropout
import numpy as np

# Initialize a catDogImageclassifier variable here.
catDogImageclassifier = Sequential()

# We are adding layers to our network here.
# Conv2D: 2-dimensional convolutional layer
# 32: number of filters required
# (3,3): size of each filter (3 rows, 3 columns)
# The input image shape is 64*64*3 - height*width*RGB. Each number represents a pixel intensity (0-255).
catDogImageclassifier.add(Conv2D(32,(3,3),input_shape=(64,64,3)))

# The output is a set of feature maps; training adjusts the filters that produce them.

# Let's add the activation function now. We are using ReLU (Rectified Linear Unit).
# In the feature maps output by the previous layer, the activation function
# replaces all the negative values with zero.
catDogImageclassifier.add(Activation('relu'))

# We do not want our network to be overly complex computationally; hence the pooling layer
# comes into the picture. The pooling layer reduces the dimensions: max pooling with a 2x2
# filter keeps only the maximum value in each window, so the spatial size shrinks while the
# significant features are retained.
catDogImageclassifier.add(MaxPooling2D(pool_size=(2,2)))
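To make the 2x2 max-pooling step concrete, here is a small hand-worked sketch in NumPy; the 4x4 input values are made up for illustration:

import numpy as np

fmap = np.array([[1, 3, 2, 0],
                 [5, 2, 1, 1],
                 [0, 4, 7, 2],
                 [1, 1, 3, 6]])

# Take the maximum of each non-overlapping 2x2 window, as MaxPooling2D(pool_size=(2,2)) does.
pooled = fmap.reshape(2, 2, 2, 2).max(axis=(1, 3))
print(pooled)   # [[5 2]
                #  [4 7]]

The 4x4 map shrinks to 2x2, yet each surviving value is the strongest response in its window, which is why the significant features are retained.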
# Three more convolutional blocks, each a Conv2D layer followed by a ReLU activation and a pooling layer.
catDogImageclassifier.add(Conv2D(32,(3,3)))               # Convolutional layer
catDogImageclassifier.add(Activation('relu'))             # Activation layer
catDogImageclassifier.add(MaxPooling2D(pool_size=(2,2)))  # Pooling layer
catDogImageclassifier.add(Conv2D(32,(3,3)))               # Convolutional layer
catDogImageclassifier.add(Activation('relu'))             # Activation layer
catDogImageclassifier.add(MaxPooling2D(pool_size=(2,2)))  # Pooling layer
catDogImageclassifier.add(Conv2D(32,(3,3)))               # Convolutional layer
catDogImageclassifier.add(Activation('relu'))             # Activation layer
catDogImageclassifier.add(MaxPooling2D(pool_size=(2,2)))  # Pooling layer

# Overfitting is a nuisance; we have to fight it using dropout. First prepare the data
# by flattening the feature maps to one dimension.
catDogImageclassifier.add(Flatten())

# Add a dense (fully connected) layer now, followed by a ReLU activation.
catDogImageclassifier.add(Dense(64))
catDogImageclassifier.add(Activation('relu'))

# Here we add the dropout layer.
# Overfitting means that the model works well on the training data but fails on the testing dataset.
catDogImageclassifier.add(Dropout(0.5))

# Add one more fully connected layer to get the output as a vector of the required dimension.
catDogImageclassifier.add(Dense(1))

# A sigmoid activation converts the final score to a probability.
catDogImageclassifier.add(Activation('sigmoid'))
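For reference, the sigmoid squashes any real-valued score into (0, 1), which is what lets the single output be read as a probability of "dog". A quick sketch with made-up scores:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

scores = np.array([-2.0, 0.0, 3.0])   # hypothetical raw outputs of Dense(1)
print(sigmoid(scores))                # approximately [0.119 0.5 0.953]
# With the 0.5 threshold used later in this notebook, these would be read as
# cat, the decision boundary, and dog respectively.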
"source": [ 239 | "catDogImageclassifier.add(Activation('sigmoid'))" 240 | ] 241 | }, 242 | { 243 | "cell_type": "code", 244 | "execution_count": null, 245 | "metadata": {}, 246 | "outputs": [], 247 | "source": [ 248 | "# Let us look how out network looks" 249 | ] 250 | }, 251 | { 252 | "cell_type": "code", 253 | "execution_count": 80, 254 | "metadata": {}, 255 | "outputs": [ 256 | { 257 | "name": "stdout", 258 | "output_type": "stream", 259 | "text": [ 260 | "_________________________________________________________________\n", 261 | "Layer (type) Output Shape Param # \n", 262 | "=================================================================\n", 263 | "conv2d_27 (Conv2D) (None, 62, 62, 32) 896 \n", 264 | "_________________________________________________________________\n", 265 | "activation_22 (Activation) (None, 62, 62, 32) 0 \n", 266 | "_________________________________________________________________\n", 267 | "max_pooling2d_20 (MaxPooling (None, 31, 31, 32) 0 \n", 268 | "_________________________________________________________________\n", 269 | "conv2d_28 (Conv2D) (None, 29, 29, 32) 9248 \n", 270 | "_________________________________________________________________\n", 271 | "activation_23 (Activation) (None, 29, 29, 32) 0 \n", 272 | "_________________________________________________________________\n", 273 | "max_pooling2d_21 (MaxPooling (None, 14, 14, 32) 0 \n", 274 | "_________________________________________________________________\n", 275 | "conv2d_29 (Conv2D) (None, 12, 12, 32) 9248 \n", 276 | "_________________________________________________________________\n", 277 | "activation_24 (Activation) (None, 12, 12, 32) 0 \n", 278 | "_________________________________________________________________\n", 279 | "max_pooling2d_22 (MaxPooling (None, 6, 6, 32) 0 \n", 280 | "_________________________________________________________________\n", 281 | "conv2d_30 (Conv2D) (None, 4, 4, 32) 9248 \n", 282 | "_________________________________________________________________\n", 283 | "activation_25 (Activation) (None, 4, 4, 32) 0 \n", 284 | "_________________________________________________________________\n", 285 | "max_pooling2d_23 (MaxPooling (None, 2, 2, 32) 0 \n", 286 | "_________________________________________________________________\n", 287 | "flatten_2 (Flatten) (None, 128) 0 \n", 288 | "_________________________________________________________________\n", 289 | "dense_3 (Dense) (None, 64) 8256 \n", 290 | "_________________________________________________________________\n", 291 | "activation_26 (Activation) (None, 64) 0 \n", 292 | "_________________________________________________________________\n", 293 | "dropout_2 (Dropout) (None, 64) 0 \n", 294 | "_________________________________________________________________\n", 295 | "dense_4 (Dense) (None, 1) 65 \n", 296 | "_________________________________________________________________\n", 297 | "activation_27 (Activation) (None, 1) 0 \n", 298 | "=================================================================\n", 299 | "Total params: 36,961\n", 300 | "Trainable params: 36,961\n", 301 | "Non-trainable params: 0\n", 302 | "_________________________________________________________________\n" 303 | ] 304 | } 305 | ], 306 | "source": [ 307 | "catDogImageclassifier.summary()" 308 | ] 309 | }, 310 | { 311 | "cell_type": "code", 312 | "execution_count": null, 313 | "metadata": {}, 314 | "outputs": [], 315 | "source": [ 316 | "# A quick look at the network summary states that total number of parameters in our network are 36,961. 
\n", 317 | "#Play around with different network structures and have a look how this number changes" 318 | ] 319 | }, 320 | { 321 | "cell_type": "code", 322 | "execution_count": 81, 323 | "metadata": {}, 324 | "outputs": [], 325 | "source": [ 326 | "catDogImageclassifier.compile(optimizer ='rmsprop',# rmsprop is the optimizer using Gradient Descent\n", 327 | " loss ='binary_crossentropy', # Loss or cost function for the model\n", 328 | " metrics =['accuracy']) # The KPI " 329 | ] 330 | }, 331 | { 332 | "cell_type": "code", 333 | "execution_count": null, 334 | "metadata": {}, 335 | "outputs": [], 336 | "source": [ 337 | "# Let us do some data augmentation here. It helps to fight overfitting. Zoom, scale etc. \n", 338 | "# There is a function ImageDataGenerator which is used here" 339 | ] 340 | }, 341 | { 342 | "cell_type": "code", 343 | "execution_count": 82, 344 | "metadata": {}, 345 | "outputs": [], 346 | "source": [ 347 | "from keras.preprocessing.image import ImageDataGenerator\n", 348 | "train_datagen = ImageDataGenerator(rescale =1./255,\n", 349 | " shear_range =0.25,\n", 350 | " zoom_range = 0.25,\n", 351 | " horizontal_flip =True)\n", 352 | "test_datagen = ImageDataGenerator(rescale = 1./255)" 353 | ] 354 | }, 355 | { 356 | "cell_type": "code", 357 | "execution_count": null, 358 | "metadata": {}, 359 | "outputs": [], 360 | "source": [ 361 | "# Load the training data" 362 | ] 363 | }, 364 | { 365 | "cell_type": "code", 366 | "execution_count": 83, 367 | "metadata": {}, 368 | "outputs": [ 369 | { 370 | "name": "stdout", 371 | "output_type": "stream", 372 | "text": [ 373 | "Found 23000 images belonging to 2 classes.\n" 374 | ] 375 | } 376 | ], 377 | "source": [ 378 | "training_set = train_datagen.flow_from_directory('/Users/vaibhavverdhan/Book Writing/DogsCats/train',target_size=(64,64),batch_size= 32,class_mode='binary')" 379 | ] 380 | }, 381 | { 382 | "cell_type": "code", 383 | "execution_count": null, 384 | "metadata": {}, 385 | "outputs": [], 386 | "source": [ 387 | "# Load the testing data" 388 | ] 389 | }, 390 | { 391 | "cell_type": "code", 392 | "execution_count": 84, 393 | "metadata": {}, 394 | "outputs": [ 395 | { 396 | "name": "stdout", 397 | "output_type": "stream", 398 | "text": [ 399 | "Found 2000 images belonging to 2 classes.\n" 400 | ] 401 | } 402 | ], 403 | "source": [ 404 | "test_set = test_datagen.flow_from_directory('/Users/vaibhavverdhan/Book Writing/DogsCats/test',\n", 405 | " target_size = (64,64),\n", 406 | " batch_size = 32,\n", 407 | " class_mode ='binary')" 408 | ] 409 | }, 410 | { 411 | "cell_type": "code", 412 | "execution_count": null, 413 | "metadata": {}, 414 | "outputs": [], 415 | "source": [ 416 | "# Let us begin the training now. Steps per epoch is 625 and number of epochs is 10. \n", 417 | "#Epoch is one full cycle of the training data\n", 418 | "# Steps and Batch size has to be understood next. For example: if we have 1000 images and batch size of 10, it means\n", 419 | "# number of steps = 1000/10 which is 100 steps required.\n", 420 | "# Depending on the complexity of the network, the number of epochs given etc., the compilation will take time.\n", 421 | "# The test dataset is passed as a validation_data here." 
from IPython.display import display
from PIL import Image
catDogImageclassifier.fit_generator(training_set,
                                    steps_per_epoch=625,
                                    epochs=10,
                                    validation_data=test_set,
                                    validation_steps=1000)

Epoch 1/10
625/625 [==============================] - 185s 296ms/step - loss: 0.6721 - acc: 0.5822 - val_loss: 0.6069 - val_acc: 0.6610
Epoch 2/10
625/625 [==============================] - 152s 243ms/step - loss: 0.5960 - acc: 0.6831 - val_loss: 0.5151 - val_acc: 0.7543
Epoch 3/10
625/625 [==============================] - 151s 242ms/step - loss: 0.5452 - acc: 0.7217 - val_loss: 0.4891 - val_acc: 0.7545
Epoch 4/10
625/625 [==============================] - 150s 239ms/step - loss: 0.5069 - acc: 0.7568 - val_loss: 0.4657 - val_acc: 0.7743
Epoch 5/10
625/625 [==============================] - 150s 240ms/step - loss: 0.4813 - acc: 0.7713 - val_loss: 0.4407 - val_acc: 0.7925
Epoch 6/10
625/625 [==============================] - 152s 243ms/step - loss: 0.4526 - acc: 0.7866 - val_loss: 0.4374 - val_acc: 0.7924
Epoch 7/10
625/625 [==============================] - 151s 241ms/step - loss: 0.4458 - acc: 0.7953 - val_loss: 0.3891 - val_acc: 0.8324
Epoch 8/10
625/625 [==============================] - 151s 242ms/step - loss: 0.4177 - acc: 0.8123 - val_loss: 0.3917 - val_acc: 0.8221
Epoch 9/10
625/625 [==============================] - 155s 248ms/step - loss: 0.4158 - acc: 0.8158 - val_loss: 0.3947 - val_acc: 0.8176
Epoch 10/10
625/625 [==============================] - 151s 241ms/step - loss: 0.4021 - acc: 0.8201 - val_loss: 0.3783 - val_acc: 0.8221

# We can see that in the final epoch we got a validation accuracy of 82.21%.
# We can also see that in epoch 7 we got an accuracy of 83.24%, which is better than
# the final accuracy. There are ways to set a checkpoint during training and save that
# version; we will look at this in subsequent chapters.
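The checkpointing idea deferred to later chapters looks roughly like the sketch below, using the Keras ModelCheckpoint callback; treat it as a preview under assumed settings, not this chapter's code:

from keras.callbacks import ModelCheckpoint

# Keep only the weights of the epoch with the best validation accuracy (e.g., epoch 7 above).
checkpoint = ModelCheckpoint('catdog_best.h5',   # hypothetical output file
                             monitor='val_acc',
                             save_best_only=True,
                             mode='max',
                             verbose=1)

catDogImageclassifier.fit_generator(training_set,
                                    steps_per_epoch=625,
                                    epochs=10,
                                    validation_data=test_set,
                                    validation_steps=1000,
                                    callbacks=[checkpoint])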
# We save the final model to a file here; it can then be loaded again as and when required.
# The model is saved as an HDF5 file and can be reused later.
catDogImageclassifier.save('catdog_cnn_model.h5')

# Load the saved model. The saved file is loaded using load_model.
from keras.models import load_model
catDogImageclassifier = load_model('catdog_cnn_model.h5')

# Check how the model predicts on an unseen image.
import numpy as np
from keras.preprocessing import image
an_image = image.load_img('/Users/vaibhavverdhan/Book Writing/2.jpg', target_size=(64,64))  # Load the image
# The image is now converted to an array of numbers.
an_image = image.img_to_array(an_image)
# Expand its dimensions to add the batch axis that predict() expects.
an_image = np.expand_dims(an_image, axis=0)
# Call the predict method here.
verdict = catDogImageclassifier.predict(an_image)
if verdict[0][0] >= 0.5:
    prediction = 'dog'
else:
    prediction = 'cat'
# Let us print our final prediction.
print(prediction)

dog

# In this example, we designed a neural network using Keras, trained it on images of
# cats and dogs, and then tested it. It is possible to train a multi-class system too;
# the onus lies in getting the images for each of the classes.
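For the multi-class extension mentioned in that closing comment, the binary pieces change in three places: the final layer, the loss, and the generator's class_mode. A minimal sketch, assuming a hypothetical dataset directory with one sub-folder per class:

from keras.models import Sequential
from keras.layers import Conv2D, Activation, MaxPooling2D, Dense, Flatten, Dropout
from keras.preprocessing.image import ImageDataGenerator

num_classes = 5                                    # assumption: five classes instead of two

model = Sequential()
model.add(Conv2D(32, (3, 3), input_shape=(64, 64, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(64))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(num_classes))                      # one output per class instead of Dense(1)
model.add(Activation('softmax'))                   # softmax instead of sigmoid
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',     # instead of binary_crossentropy
              metrics=['accuracy'])

train_datagen = ImageDataGenerator(rescale=1./255)
training_set = train_datagen.flow_from_directory('animals/train',           # hypothetical path
                                                 target_size=(64, 64),
                                                 batch_size=32,
                                                 class_mode='categorical')   # instead of 'binary'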
/Chapter2/MNIST.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Apress/computer-vision-using-deep-learning/5c08fdbc4ffd67297512cc4d929c111c885e0c75/Chapter2/MNIST.png

/Chapter2/MNISTImageUsingCNN.ipynb:
--------------------------------------------------------------------------------
## We are going to build an image classification solution here.
## The dataset for this is the MNIST digits dataset, which is also available on Kaggle.

import keras

Using TensorFlow backend.

#### Import all the Keras libraries here. Keras comes with many layers, and we import the ones required.
from keras.datasets import mnist     # The dataset is imported here.
from keras.models import Sequential  # Import the Sequential model.
# Generally there are two ways of building models, sequential and functional;
# Sequential is the most common one.
from keras.layers import Dense, Dropout, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras import backend as K

# Let's define the hyperparameters here.
batch_size = 256
num_classes = 10
epochs = 10

# Input image dimensions.
image_rows, image_cols = 28, 28

# Here we are dividing the dataset into train and test.
(x_train, y_train), (x_test, y_test) = mnist.load_data()
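To sanity-check the load, a quick hedged sketch that prints the array shapes and renders one digit (matplotlib is assumed to be installed, as it is elsewhere in this repo):

import matplotlib.pyplot as plt

print(x_train.shape, y_train.shape)   # (60000, 28, 28) (60000,)
plt.imshow(x_train[0], cmap='gray')   # first training digit, a 28x28 grayscale image
plt.title('label: %d' % y_train[0])
plt.show()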
# An image has a dimension for rows, columns, and channels, and there are two ways to order them.
# If the image data format is channels_first, the colour channel comes first: [channels][rows][cols];
# if it is channels_last, the channel comes last: [rows][cols][channels].
if K.image_data_format() == 'channels_first':
    x_train = x_train.reshape(x_train.shape[0], 1, image_rows, image_cols)
    x_test = x_test.reshape(x_test.shape[0], 1, image_rows, image_cols)
    input_shape = (1, image_rows, image_cols)
else:
    x_train = x_train.reshape(x_train.shape[0], image_rows, image_cols, 1)
    x_test = x_test.reshape(x_test.shape[0], image_rows, image_cols, 1)
    input_shape = (image_rows, image_cols, 1)

# Scale the pixel intensities from 0-255 down to 0-1.
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

x_train shape: (60000, 28, 28, 1)
60000 train samples
10000 test samples

# Convert class vectors to binary class matrices.
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
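The "binary class matrices" wording just means one-hot encoding. A two-line illustration of what to_categorical does to a label:

from keras.utils import to_categorical

print(to_categorical([3], num_classes=10))
# [[0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]] - a single 1 at index 3, zeros elsewhere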
# Let's create our network here by adding layers.
# Conv2D: 2-dimensional convolutional layer; 32: number of filters; (3,3): size of each filter.
# The input image shape here is 28*28*1 - height*width*channels; each number represents
# a pixel intensity (0-255). The output is a set of feature maps learned from the training data.
#
# ReLU (Rectified Linear Unit) is the activation function: in the feature maps output by the
# previous layer it replaces all the negative values with zero.
# We do not want our network to be overly complex computationally, hence the pooling layer:
# max pooling with a 2x2 filter reduces the dimensions while the significant features are retained.
# To fight overfitting we use dropout, and we flatten the feature maps to one dimension
# before the dense layers.
model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3),
                 activation='relu',
                 input_shape=input_shape))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.45))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.55))
model.add(Dense(num_classes, activation='softmax'))

[TensorFlow deprecation warnings about colocate_with and dropout's keep_prob argument are
printed here by this Keras/TensorFlow version; they are harmless and can be ignored.]

# Compile the model now. We are using categorical cross-entropy as the loss function
# and Adadelta for optimization.
model.compile(loss=keras.losses.categorical_crossentropy,
              optimizer=keras.optimizers.Adadelta(),
              metrics=['accuracy'])
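As a reference for what the compile step's loss means: categorical cross-entropy is -sum(y_true * log(y_pred)) over the 10 classes. A hand-worked sketch with a made-up prediction:

import numpy as np

y_true = np.array([0, 0, 0, 1, 0, 0, 0, 0, 0, 0])   # one-hot label for digit 3
y_pred = np.full(10, 0.02); y_pred[3] = 0.82         # hypothetical softmax output

loss = -np.sum(y_true * np.log(y_pred))
print(loss)   # ~0.198 - small because the network puts high probability on the true class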
# The network is ready. Fit the model now.
model.fit(x_train, y_train,
          batch_size=batch_size,
          epochs=epochs,
          verbose=1,
          validation_data=(x_test, y_test))
score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])

Train on 60000 samples, validate on 10000 samples
Epoch 1/10
60000/60000 [==============================] - 55s 916us/step - loss: 0.3784 - acc: 0.8814 - val_loss: 0.0852 - val_acc: 0.9711
Epoch 2/10
60000/60000 [==============================] - 56s 938us/step - loss: 0.1267 - acc: 0.9625 - val_loss: 0.0512 - val_acc: 0.9818
Epoch 3/10
60000/60000 [==============================] - 112s 2ms/step - loss: 0.0962 - acc: 0.9720 - val_loss: 0.0393 - val_acc: 0.9873
Epoch 4/10
60000/60000 [==============================] - 57s 956us/step - loss: 0.0788 - acc: 0.9763 - val_loss: 0.0335 - val_acc: 0.9884
Epoch 5/10
60000/60000 [==============================] - 57s 951us/step - loss: 0.0698 - acc: 0.9793 - val_loss: 0.0336 - val_acc: 0.9881
Epoch 6/10
60000/60000 [==============================] - 448s 7ms/step - loss: 0.0622 - acc: 0.9807 - val_loss: 0.0375 - val_acc: 0.9869
Epoch 7/10
60000/60000 [==============================] - 56s 931us/step - loss: 0.0579 - acc: 0.9823 - val_loss: 0.0286 - val_acc: 0.9903
Epoch 8/10
60000/60000 [==============================] - 57s 944us/step - loss: 0.0520 - acc: 0.9837 - val_loss: 0.0288 - val_acc: 0.9904
Epoch 9/10
60000/60000 [==============================] - 2774s 46ms/step - loss: 0.0479 - acc: 0.9854 - val_loss: 0.0307 - val_acc: 0.9898
Epoch 10/10
60000/60000 [==============================] - 56s 936us/step - loss: 0.0469 - acc: 0.9858 - val_loss: 0.0292 - val_acc: 0.9900
Test loss: 0.029244637563071593
Test accuracy: 0.99
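The 0.99 test accuracy can also be probed per image. A short hedged sketch: predict returns a 10-way probability vector per digit, and argmax recovers the predicted class:

import numpy as np

probs = model.predict(x_test[:5])       # probabilities for the first five test digits
print(np.argmax(probs, axis=1))         # predicted digit for each image
print(np.argmax(y_test[:5], axis=1))    # true digits (y_test is one-hot after to_categorical)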
/Chapter3/Chapter3:
--------------------------------------------------------------------------------
(empty placeholder file)

/Chapter3/Important:
--------------------------------------------------------------------------------
The dataset for German Traffic Classification can be downloaded from this link:
https://drive.google.com/drive/folders/1chK1QIeP0q5lUAvCjOjKqhZYTGDzgF4K?usp=sharing

/Chapter3/LeNet MNIST Digit Classification.ipynb:
--------------------------------------------------------------------------------
# Import Keras, the scikit-learn helpers used for evaluation, and matplotlib for plotting.
import keras
from keras.optimizers import SGD
from sklearn.preprocessing import LabelBinarizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from sklearn import datasets
from keras import backend as K
import matplotlib.pyplot as plt
import numpy as np

# Import the MNIST dataset and the layers that make up the LeNet-style network.
from keras.datasets import mnist     # The dataset is imported here.
from keras.models import Sequential
from keras.layers.convolutional import Conv2D
from keras.layers.convolutional import MaxPooling2D
from keras.layers.core import Activation
from keras.layers.core import Flatten
from keras.layers.core import Dense
from keras import backend as K

# Define the image dimensions and the hyperparameters.
image_rows, image_cols = 28, 28
batch_size = 256
num_classes = 10
epochs = 10

# Load the train and test splits.
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Scale the pixel intensities from 0-255 down to 0-1.
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

x_train shape: (60000, 28, 28)
60000 train samples
10000 test samples

# Convert class vectors to binary class matrices.
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
"outputs": [], 119 | "source": [ 120 | "if K.image_data_format() == 'channels_first':\n", 121 | " x_train = x_train.reshape(x_train.shape[0], 1, image_rows, image_cols)\n", 122 | " x_test = x_test.reshape(x_test.shape[0], 1, image_rows, image_cols)\n", 123 | " input_shape = (1, image_rows, image_cols)\n", 124 | "else:\n", 125 | " x_train = x_train.reshape(x_train.shape[0], image_rows, image_cols, 1)\n", 126 | " x_test = x_test.reshape(x_test.shape[0], image_rows, image_cols, 1)\n", 127 | " input_shape = (image_rows, image_cols, 1)" 128 | ] 129 | }, 130 | { 131 | "cell_type": "code", 132 | "execution_count": 84, 133 | "metadata": {}, 134 | "outputs": [], 135 | "source": [ 136 | "model = Sequential()" 137 | ] 138 | }, 139 | { 140 | "cell_type": "code", 141 | "execution_count": 85, 142 | "metadata": {}, 143 | "outputs": [], 144 | "source": [ 145 | "model.add(Conv2D(20, (5, 5), padding=\"same\",input_shape=input_shape))" 146 | ] 147 | }, 148 | { 149 | "cell_type": "code", 150 | "execution_count": 86, 151 | "metadata": {}, 152 | "outputs": [], 153 | "source": [ 154 | "model.add(Activation(\"relu\"))\n", 155 | "model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))" 156 | ] 157 | }, 158 | { 159 | "cell_type": "code", 160 | "execution_count": 87, 161 | "metadata": {}, 162 | "outputs": [], 163 | "source": [ 164 | "model.add(Conv2D(50, (5, 5), padding=\"same\"))\n", 165 | "model.add(Activation(\"relu\"))\n", 166 | "model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))" 167 | ] 168 | }, 169 | { 170 | "cell_type": "code", 171 | "execution_count": 88, 172 | "metadata": {}, 173 | "outputs": [], 174 | "source": [ 175 | "model.add(Flatten())\n", 176 | "model.add(Dense(500))\n", 177 | "model.add(Activation(\"relu\"))" 178 | ] 179 | }, 180 | { 181 | "cell_type": "code", 182 | "execution_count": 89, 183 | "metadata": {}, 184 | "outputs": [], 185 | "source": [ 186 | "model.add(Dense(num_classes))\n", 187 | "model.add(Activation(\"softmax\"))" 188 | ] 189 | }, 190 | { 191 | "cell_type": "code", 192 | "execution_count": 90, 193 | "metadata": {}, 194 | "outputs": [], 195 | "source": [ 196 | "model.compile(loss=keras.losses.categorical_crossentropy,\n", 197 | " optimizer=keras.optimizers.Adadelta(),\n", 198 | " metrics=['accuracy'])" 199 | ] 200 | }, 201 | { 202 | "cell_type": "code", 203 | "execution_count": 91, 204 | "metadata": {}, 205 | "outputs": [ 206 | { 207 | "name": "stdout", 208 | "output_type": "stream", 209 | "text": [ 210 | "Train on 60000 samples, validate on 10000 samples\n", 211 | "Epoch 1/10\n", 212 | "60000/60000 [==============================] - 36s 607us/step - loss: 0.2736 - acc: 0.9127 - val_loss: 0.1051 - val_acc: 0.9649\n", 213 | "Epoch 2/10\n", 214 | "60000/60000 [==============================] - 37s 622us/step - loss: 0.0590 - acc: 0.9813 - val_loss: 0.0490 - val_acc: 0.9835\n", 215 | "Epoch 3/10\n", 216 | "60000/60000 [==============================] - 37s 614us/step - loss: 0.0387 - acc: 0.9879 - val_loss: 0.0939 - val_acc: 0.9671\n", 217 | "Epoch 4/10\n", 218 | "60000/60000 [==============================] - 37s 625us/step - loss: 0.0285 - acc: 0.9910 - val_loss: 0.0267 - val_acc: 0.9905\n", 219 | "Epoch 5/10\n", 220 | "60000/60000 [==============================] - 37s 615us/step - loss: 0.0215 - acc: 0.9933 - val_loss: 0.0305 - val_acc: 0.9896\n", 221 | "Epoch 6/10\n", 222 | "60000/60000 [==============================] - 37s 614us/step - loss: 0.0164 - acc: 0.9949 - val_loss: 0.0228 - val_acc: 0.9920\n", 223 | "Epoch 7/10\n", 224 | "60000/60000 
# Train the network; keep the History object so the learning curves can be plotted afterwards.
theLeNetModel = model.fit(x_train, y_train,
                          batch_size=batch_size,
                          epochs=epochs,
                          verbose=1,
                          validation_data=(x_test, y_test))

Train on 60000 samples, validate on 10000 samples
Epoch 1/10
60000/60000 [==============================] - 36s 607us/step - loss: 0.2736 - acc: 0.9127 - val_loss: 0.1051 - val_acc: 0.9649
Epoch 2/10
60000/60000 [==============================] - 37s 622us/step - loss: 0.0590 - acc: 0.9813 - val_loss: 0.0490 - val_acc: 0.9835
Epoch 3/10
60000/60000 [==============================] - 37s 614us/step - loss: 0.0387 - acc: 0.9879 - val_loss: 0.0939 - val_acc: 0.9671
Epoch 4/10
60000/60000 [==============================] - 37s 625us/step - loss: 0.0285 - acc: 0.9910 - val_loss: 0.0267 - val_acc: 0.9905
Epoch 5/10
60000/60000 [==============================] - 37s 615us/step - loss: 0.0215 - acc: 0.9933 - val_loss: 0.0305 - val_acc: 0.9896
Epoch 6/10
60000/60000 [==============================] - 37s 614us/step - loss: 0.0164 - acc: 0.9949 - val_loss: 0.0228 - val_acc: 0.9920
Epoch 7/10
60000/60000 [==============================] - 37s 614us/step - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0236 - val_acc: 0.9918
Epoch 8/10
60000/60000 [==============================] - 37s 616us/step - loss: 0.0106 - acc: 0.9969 - val_loss: 0.0279 - val_acc: 0.9909
Epoch 9/10
60000/60000 [==============================] - 37s 617us/step - loss: 0.0082 - acc: 0.9976 - val_loss: 0.0246 - val_acc: 0.9917
Epoch 10/10
60000/60000 [==============================] - 37s 620us/step - loss: 0.0062 - acc: 0.9983 - val_loss: 0.0316 - val_acc: 0.9907

# Score the trained network on the held-out test set.
score = model.evaluate(x_test, y_test, verbose=0)

Text(0, 0.5, 'acc')
[Figure: training and validation accuracy per epoch, y-axis labelled 'acc'; the notebook
stores the rendered plot as an inline PNG, which is omitted from this text dump.]
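The source of the plotting cell did not survive in this dump; only its output (the 'acc' y-label text and the PNG) did. A hedged reconstruction of what it plausibly looked like, reading the learning curves from the History object returned by model.fit (in this Keras version the history keys are 'acc' and 'val_acc'):

plt.plot(theLeNetModel.history['acc'], label='train')
plt.plot(theLeNetModel.history['val_acc'], label='validation')
plt.legend()
plt.xlabel('epoch')
plt.ylabel('acc')
plt.show()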
9Qxz163sKX1ZwN5dGjcf8TcATInIxsBTIAxpSuIgMAF4BLjLGNCnQNMbMAeYATJw40fPYnZKVY+srQkJ69lWAUoHsjPFp3ZIcPE2YMIFRo0YxevRohg0bxlFHHdXpY/3xj3/k0ksv5aGHHuLYY49tWP673/2ODRs2MGbMGMLCwrjqqqu48soree2117jqqquoqKggOjqaTz/9tEdVcIsxXjnHNj2wyBHA3caYE535WwGMMQ+0sH0c8IMxpr6SOwH4DHjAGPNGW583ceJEs2LFii7FXF5Vy+i7F3HVMcO5yUe3xEr1RuvWrWPkyJH+DkO1obm/k4h8Z4yZ2Na+viyG+hY4QEQyRCQCOA94x30DEUkRkfoYbgWed5ZHAAuwld9tJgpvWZVXRG2d0cptpZTy4LNkYYypAa4FFgHrgNeNMWtE5F4ROd3ZbCqwXkR+BPoD9zvLzwWmABeLSKbzGuerWOvVD6OqldtKKdWYTxvlGWM+AD7wWHan2/R8YH4z+70KvOrL2JqTmeMiLSma1Pie9fyzUkr5mnYk6CYzx8W4wXpXoZRSnjRZOHaUVJDnKme8FkEppVQTmiwcWTlFgNZXKKVUczRZODJz9hAaIowe2POa4SvVoyx7DDYvbbxs81K7vJOmTp3apIHdY489xtVXX93qfnFxcQDk5+dz9tlnt3jsth7Lf+yxxygrK2uYP/nkk3G5fN47UbfSZOHIyinioP3iiY7QnmaV8qm0CfDGxfsSxualdj5tQqcPOWvWLObNm9do2bx585g1a1a79h84cCDz5zd51qbdPJPFBx98QFJS7yql0C7Kgbo6Q1aOi9PGDfR3KEr1fB/eAttWtb5N/AB45Uz7XlIAqQfBZ3+1r+bsdwhMe7DFw5199tncfvvtVFZWEhkZyZYtW8jPz2fy5MmUlpYyffp09uzZQ3V1Nffddx/Tpzfupm7Lli2ceuqprF69mvLyci655BLWrl3LyJEjKS/f1+nhVVddxbfffkt5eTlnn30299xzD48//jj5+fkce+yxpKSksGTJEoYOHcqKFStISUnh0Ucfbei19vLLL+fGG29ky5YtTJs2jcmTJ/Pll1+SlpbG22+/TXR0dKO43n33Xe677z6qqqpITk5m7ty59O/fn9LSUq677jpWrFiBiHDXXXdx1lln8dFHH3HbbbdRW1tLSkoKixcvxls0WQCbdpZSUlmjjfGU6i5RSTZRFOVA4iA73wXJyclMmjSJjz76iOnTpzNv3jxmzpyJiBAVFcWCBQtISEhg586dHH744Zx++uktduz31FNPERMTQ3Z2NtnZ2Q09xQLcf//99O3bl9raWo4//niys7O5/vrrefTRR1myZAkpKSmNjvXdd9/xwgsvsHz5cowxHHbYYRxzzDH06dOHDRs28Nprr/HMM89w7rnn8uabb3LBBRc02n/y5Ml8/fXXiAjPPvssDz30EH/729/485//TGJiIqtW2aS8Z88eCgsL+e1vf8vSpUvJyMhg9+7dXfpNPWmywA7GAuiTUEp5Qyt3AA3qi56m3AwrnoOpf4SMKV362PqiqPpkUX81b4zhtttuY+nSpYSEhJCXl8f27dvZb7/9mj3O0qVLuf766wEYM2YMY8aMaVj3+uuvM2fOHGpqaigoKGDt2rWN1ntatmwZZ555ZkPPtzNmzOCLL77g9NNPJyMjg3HjbFvjQw89lC1btjTZPzc3l5kzZ1JQUEBVVRUZGRmA7QbdvditT58+vPvuu0yZMqVhm759+7b3p2uXoK+zWLgyj7vfWQPARS98w8KVeX6OSKlerj5RnPMiHPcn++5eh9FJZ5xxBosXL24YBa/+jmDu3LkUFhby3XffkZmZSf/+/Zvtltxdc3cdmzdv5pFHHmHx4sVkZ2dzyimntHmc1vrecx/8qKVu0K+77jquvfZaVq1axdNPP+3XbtCDOlnUjwW8t8p2FZzvquDWt1ZpwlDKl/K+twmi/k4iY4qdz/u+S4eNi4tj6tSpXHrppY0qtouKiujXrx/h4eEsWbKEn3/+udXjTJkyhblz5wKwevVqsrOzAduteGxsLImJiWzfvp0PP/ywYZ/4+HhKSkqaPdbChQspKytj7969LFiwgKOPPrrd38m9G/SXXnqpYfkJJ5zAE0880TC/Z88ejjjiCD7//HM2b94M4PViqKBOFg8vWt8waHy98upaHl603k8RKRUEJt/YtMgpY4pd3kWzZs0iKyuL8847r2HZ+eefz4oVK5g4cSJz587loIMOavUYV111FaWlpYwZM4aHHnqISZMmATB27FjGjx/PwQcfzKWXXtqoe/MrrriCadOmNeqqHGyX6BdffDGTJk3isMMO4/LLL2f8+PHt/j53330355xzDkcffXSj+pDbb7+dPXv2MHr0aMaOHcuSJUtITU1lzpw5zJgxg7FjxzJzpndHo/ZZF+XdrTNdlGfc8n6TATbADsSx+cFTvBKXUsFAuyjvGQK1i/KANzApukPLlVIqWAV1sujusYCVUqqnCupHZ+uHdXx40XryXeUMTIpm9okj/DLco1I9na+fxlFd09Uqh6BOFuC/sYCV6k2ioqLYtWsXycnJmjACkDGGXbt2ERUV1eljBH2yUEp1XXp6Orm5uRQWFvo7FNWCqKgo0tPTO72/JgulVJeFh4c3tBxWvVNQV3ArpZRqH00WSiml2qTJQimlVJt6TQtuESkEWu/0JfClADv9HUQA0d+jMf099tHforGu/B5DjDGpbW3Ua5JFbyAiK9rT7D5Y6O/RmP4e++hv0Vh3/B5aDKWUUqpNmiyUUkq1SZNFYJnj7wACjP4ejenvsY/+Fo35/PfQOgullFJt0jsLpZRSbdJkoZRSqk2aLAKAiAwSkSUisk5E1ojIDf6Oyd9EJFREVorIe/6Oxd9EJElE5ovID86/kSP8HZM/icjvnf8nq0XkNRHpfFeqPZCIPC8iO0RktduyviLyiYhscN77ePtzNVkEhhrgD8aYkcDhwDUiMsrPMfnbDcA6fwcRIP4BfGSMOQgYSxD/LiKSBlwPTDTGjAZCgfNa36vXeRE4yWPZLcBiY8wBwGJn3qs0WQQAY0yBMeZ7Z7oEezII2kE2RCQdOAV41t+x+JuIJABTgOcAjDFVxhiXf6PyuzAgWkTCgBgg38/xdCtjzFJgt8fi6cBLzvRLwBne/lxNFgFGRIYC44Hl/o3Erx4Dbgbq/B1IABgGFAIvOMVyz4pIrL+D8hdjTB7wCLAVKACKjDEf+zeqgNDfGFMA9uIT6OftD9BkEUBEJA54E7jRGFPs73j8QUROBXYYY77zdywBIgyYADxljBkP7MUHRQw9hVMWPx3IAAYCsSJygX+jCg6aLAKEiIRjE8VcY8xb/o7Hj44CTheRLcA84DgRedW/IflVLpBrjKm/05yPTR7B6pfAZmNMoTGmGngLONLPMQWC7SIyAMB53+HtD9BkEQDEDlr8HLDOGPOov+PxJ2PMrcaYdGPMUGzF5afGmKC9cjTGbANyRGSEs+h4YK0fQ/K3rcDhIhLj/L85niCu8HfzDnCRM30R8La3P0
CHVQ0MRwG/AVaJSKaz7DZjzAd+jEkFjuuAuSISAWwCLvFzPH5jjFkuIvOB77FPEa4kyLr+EJHXgKlAiojkAncBDwKvi8hl2IR6jtc/V7v7UEop1RYthlJKKdUmTRZKKaXapMlCKaVUmzRZKKWUapMmC6WUUm3SZKFUB4hIrYhkur281ppaRIa69ySqVCDRdhZKdUy5MWacv4NQqrvpnYVSXiAiW0TkryLyjfPa31k+REQWi0i28z7YWd5fRBaISJbzqu+yIlREnnHGa/hYRKL99qWUcqPJQqmOifYohprptq7YGDMJeALbcy7O9MvGmDHAXOBxZ/njwOfGmLHYvp7WOMsPAJ40xhwMuICzfPx9lGoXbcGtVAeISKkxJq6Z5VuA44wxm5xOIbcZY5JFZCcwwBhT7SwvMMakiEghkG6MqXQ7xlDgE2cAG0Tkj0C4MeY+338zpVqndxZKeY9pYbqlbZpT6TZdi9YrqgChyUIp75np9v6VM/0l+4b9PB9Y5kwvBq6ChvHGE7orSKU6Q69alOqYaLeegcGOjV3/+GykiCzHXoTNcpZdDzwvIrOxI97V9xh7AzDH6SW0Fps4CnwevVKdpHUWSnmBU2cx0Riz09+xKOULWgyllFKqTXpnoZRSqk16Z6GUUqpNmiyUUkq1SZOFUkqpNmmyUEop1SZNFkoppdr0/wEecC+P9zYx6AAAAABJRU5ErkJggg==\n", 269 | "text/plain": [ 270 | "
" 271 | ] 272 | }, 273 | "metadata": { 274 | "needs_background": "light" 275 | }, 276 | "output_type": "display_data" 277 | } 278 | ], 279 | "source": [ 280 | "import matplotlib.pyplot as plt\n", 281 | "f, ax = plt.subplots()\n", 282 | "ax.plot([None] + theLeNetModel.history['acc'], 'o-')\n", 283 | "ax.plot([None] + theLeNetModel.history['val_acc'], 'x-')\n", 284 | "ax.legend(['Train acc', 'Validation acc'], loc = 0)\n", 285 | "ax.set_title('Training/Validation acc per Epoch')\n", 286 | "ax.set_xlabel('Epoch')\n", 287 | "ax.set_ylabel('acc')" 288 | ] 289 | }, 290 | { 291 | "cell_type": "code", 292 | "execution_count": 94, 293 | "metadata": {}, 294 | "outputs": [ 295 | { 296 | "data": { 297 | "text/plain": [ 298 | "Text(0, 0.5, 'acc')" 299 | ] 300 | }, 301 | "execution_count": 94, 302 | "metadata": {}, 303 | "output_type": "execute_result" 304 | }, 305 | { 306 | "data": { 307 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYsAAAEWCAYAAACXGLsWAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzt3Xl8VNX5+PHPk8lKtgkQ1gQJiMiWhBgRRUFcwQXUYgF3q7XWWrdWi13crVT9WbS1dYVaUai7fN2oC4pYRdkE2QTDFoIQliyE7Dm/P85NmIRJMklmMlme9+s1r7n7feYG7nPPOfeeK8YYlFJKqYaEBDsApZRSbZ8mC6WUUo3SZKGUUqpRmiyUUko1SpOFUkqpRmmyUEop1ShNFqpeIuISkYMi0s+fy7Y2EZkrIvc4w6eKyFpflm3GfgJ2DEQkW0RO9fd2Ows9fi2nyaIDcU5U1Z8qESn2GL+0qdszxlQaY2KMMdv9uayvRGS2iNwtIoUi0sXL/DUicn1TtmmM+dQYM8xP8S0Rkas8tu33Y9AROcetpM6/1zeDHZdqmCaLDsQ5UcUYY2KA7cD5HtNeqru8iIS2fpS+EREBzgaeAXYDF9WZnw4MAv7T+tEpX4mIq55Z13v+ezXGXNiqgakm02TRiYjIAyLyHxGZJyKFwGUicqKIfCUieSKyS0SeEJEwZ/lQETEi0t8Zn+vMf9+52v9SRFKauqwzf6KIfC8i+SLyNxH5wvMqHRgJ7DbG7AL+DVxR5+dcASwwxhwQkRAReU1EfnR+x6ciMqSeY3CGiGz1GD9ORFY5Mc4DIjzmdROR90QkV0QOiMj/iUhfZ95fgBOBp5wr41lejoHbOQ65IrJVRO50kiAicq2IfCYif3VizhKRs3z8O0Y6x3aXiOwUkcdEJNyZ18OJOU9E9ovIYo/1fi8iOSJSICIb6quWcWJ+UkQ+do7LIhFJ9pg/VEQ+cra/QUR+4mXdD0SkCDjFl9/ksf4ZzrG6S0T2icgWEZnmMb/eY+rM/4UTU6GIfCciaR6bzxBbGs13/g9EoHymyaLzuRB4GYjHXpVXADcD3YExwATgFw2sfwnwJ6ArtvRyf1OXFZEewCvA7c5+twCj6qx7DvCuM/xvYLzHidoFTHemV3sHW9LoBXwHvNhAXDjbiQDeBmY7Mb4NXOCxSAjwLNAPOAooBx4HMMb8DviSw1fIt3jZxT+ALsAA4DTgGmonvZOANUA34K/A843F7LgLyARSsUl1DHCnM+92IAtIxB6LPzm/dRj275phjIkDJmL/JvW5zNlPd2AdzvEUkVjgQ+yx7wFcCjwjIoM91r0EuBeIxR6jpkpy1u2DPWazReRoZ169x1REpgN/dGKKw5ZG93ts96fAmc66xwGXNyO2zssYo58O+AG2AmfUmfYA8Ekj6/0WeNUZDgUM0N8Znws85bHsJOC7Ziz7M+Bzj3kC7AKu8pj2JXCix/inwB3O8ERs1VRoPb+huxNLtEcs9zjDZwBbneHTgB2AeKz7dfWyXrabCeR6jC+pE3PNMQDCsIn4GI/5vwI+coavBTZ4zItz1u1ez76zgVOd4W3AWR7zzgU2O8N/Bt4ABtZZf7BzzE6v77h5LDsXmOsxHg9UAb2xJ+JFdZZ/HviDx7qzG9n+EuAQkOfxudvj71MGdPFY/g1sMmzsmH4M/KqB4zfNY/wx4O/B/D/a3j5asuh8dniOiMixIvKuU4VTANyHPdnW50eP4UNATDOW7eMZh7H/e7M9YuqGvfpb6rH+Cxy+Kr8ceMkYU+Es7xKRh52qnAJgs7NcQ7+jOo5sZ//VtnnEES0iz4nIdme7n/iwzWo9AJfn9pzhvh7jdY8PNHw8q/VuYLsznfGPReQHEbkdwBizEfgN9u+7x6mG6dXAPjz/PvlAPvZ4HQWMcaq58kQkD5jqxHTEug24wRjj9vjc6zFvnzHmkMf4NmffjR3TZOCHBvbZlH+7qg5NFp1P3W6Gn8ZW2xxtbPXEXdgr/UDaha1qAGoasz1PohOAD40xVR7TXgVSRGQcMJnaVVBXYKutTsNeBVdXWTT2O2rF4fC87fUOIAUY5Ryb0+os21CXzXuASuzJ1XPbOxuJyRe76tuuMabAGHOrMaY/tkrtd84xwxgz1xgzBvubXMBDDezDs40iHntcc7CJ4OM6J/oYY8yNHuu2tCvrbiISVef35dD4Md0BDGzhvlU9NFmoWOxVY5HTKNxQe4W/vINtbDxf7B1ZN2Pr2KudC7znuYIx5iC2OuIFbJXLKo/ZsUApsA9bn/2gj3EsAUJE5EancfpiIKPOdg8BB5zSzl111t+NLQEdwRhTDrwG/FlEYsQ27t+KraZpqXnAXSLSXUQSse0ScwGcYzrQScD52JNrpYgMEZHxTjtNsfOpbGAf54u9+SECW325xNibDRYAw0TkEhEJcz6j6rRZtFQIcI+IhDuN8BOB13w4ps8Bd4jISLEGeTbMq5bRZKF+A1wJFGJLGQG/FdUYsxtbdfEY9gQ/EFgJlIpICLZefaGXVV/AXlX+u870OdgrzxxgLfA/H+MoxTb4/xw4gG0QfctjkcewV9T7nG2+X2cTs4DpTnXMY152cQO2/n0L8JkTf93Ym+Ne4Fts4/hqbHVddSlhMLa
67CDwBfC4MWYJ9i6vh4G92OqYBGxjcH3mYpPEXmxD+uVQUyV1NrYBfJezrYfwuIvMR9V3kVV/vvaYlw0UOdt/AbjWGLPJmVfvMTXGzAP+gv03XIC9uEhoYlyqHlK7ulap1ufc3ZQDTMFe7T5qjDkpuFF1XiIyF1t6uycI+z4DeM6pRlNtiJYsVFCIyAQRiXeqOf6Evcvla+xdN/c2uLJSqtW12Sd4VYd3MvASEI6tOrrAqRb6KqhRKaW80moopZRSjdJqKKWUUo3qMNVQ3bt3N/379w92GEop1a4sX758rzEmsbHlOkyy6N+/P8uWLQt2GEop1a6IyLbGl9JqKKWUUj7QZKGUUqpRmiyUUko1qsO0WSilWld5eTnZ2dmUlJQEOxTlg8jISJKSkggLC2vW+poslFLNkp2dTWxsLP3798fjZXWqDTLGsG/fPrKzs0lJSWl8BS86fbJ4a+VOHlm4kZy8Yvq4o7j97MFcMLJv4ysq1cmVlJRoomgnRIRu3bqRm5vb7G106mTx1sqd3PnGGorLbU/NO/OKufONNQCaMJTygSaK9qOlf6tO3cD9yMKNNYmiWnF5JY8s3BikiJRSqm3q1MkiJ6+4SdOVUm3Hvn37SE9PJz09nV69etG3b9+a8bKyMp+2cfXVV7Nxo+8Xh8899xy33HJLc0Nu1zp1NVQfdxQ7vSSGPu4oL0srpVrC3+2D3bp1Y9Uq+8LEe+65h5iYGH7729/WWsYYgzGGkBDv18Vz5sxp9v47m05dsrj97MFEhblqTYsKc3H72f58Q6RSqrp9cGdeMYbD7YNvrfTHK8lr27x5M8OHD+f6668nIyODXbt2cd1115GZmcmwYcO47777apY9+eSTWbVqFRUVFbjdbmbMmEFaWhonnngie/bsaXA/W7ZsYfz48aSmpnLmmWeSnZ0NwPz58xk+fDhpaWmMHz8egDVr1nD88ceTnp5OamoqWVlZfv/dgdapSxbVVzV3L1hLfnE5PeMiuHPiEG3cVqqJ7v2/tazLKah3/srteZRVVtWaVlxeyR2vrWbe19u9rjO0Txx3nz+sWfGsW7eOOXPm8NRTTwEwc+ZMunbtSkVFBePHj2fKlCkMHTq01jr5+fmMGzeOmTNncttttzF79mxmzJhR7z5uuOEGrr32Wi699FKeeeYZbrnlFl577TXuvfdePv30U3r27EleXh4A//jHP/jtb3/L1KlTKS0tpT2+GqJTlyzAJoyXrj0BgD+cO1QThVIBUDdRNDa9pQYOHMjxxx9fMz5v3jwyMjLIyMhg/fr1rFu37oh1oqKimDhxIgDHHXccW7dubXAfS5cuZdq0aQBcccUVfP755wCMGTOGK664gueee46qKvv7TjrpJB544AEefvhhduzYQWRkpD9+Zqvq1CWLaoN7xRIRGsK3O/KYlNYn2OEo1e40VgIYM/MTr+2Dfd1R/OcXJ/o9nujo6JrhTZs28fjjj/P111/jdru57LLLvD51Hh4eXjPscrmoqKho1r6fffZZli5dyjvvvENaWhqrV6/m8ssv58QTT+Tdd9/lzDPP5IUXXmDs2LHN2n6wdPqSBUCYK4ThfeNZtSMv2KEo1SEFs32woKCA2NhY4uLi2LVrFwsXLvTLdkePHs0rr7wCwNy5c2tO/llZWYwePZr777+fhIQEdu7cSVZWFkcffTQ333wz5557LqtXr/ZLDK1JSxaO9GQ3c7/aRnllFWEuzaFK+VN19W4wekvIyMhg6NChDB8+nAEDBjBmzBi/bPfvf/8711xzDQ899BA9e/asubPq1ltvZcuWLRhjOOussxg+fDgPPPAA8+bNIywsjD59+vDAAw/4JYbW1GHewZ2ZmWla8vKjBd/mcNO8lbzz65MZ3jfej5Ep1TGtX7+eIUOGBDsM1QTe/mYistwYk9nYunoJ7RiZ7AbQqiillPJCk4UjKSGKrtHhmiyUUsoLTRYOESE92a3JQimlvNBk4SE92c0PuQcpKCkPdihKKdWmaLLwkJbsxhhYk50f7FCUUqpN0WThIS3J3gWlVVFKKVVbQJOFiEwQkY0isllEjuhkRURuE5F1IrJaRD4WkaM85lWKyCrnsyCQcVZzdwknpXu0Jgul2oFTTz31iAfsZs2axQ033NDgejExMQDk5OQwZcqUerfd2K34s2bN4tChQzXj55xzTk1fUC1xzz338Oijj7Z4O/4WsGQhIi7gSWAiMBSYLiJD6yy2Esg0xqQCrwEPe8wrNsakO59JgYqzrupG7o7y/IlSbcKSWbBlce1pWxbb6c00ffp05s+fX2va/PnzmT59uk/r9+nTh9dee63Z+6+bLN577z3cbnezt9fWBbJkMQrYbIzJMsaUAfOByZ4LGGMWGWOqj/ZXQFIA4/FJerKb3MJSduUf2XeMUqqZ+mbAq1cdThhbFtvxvhnN3uSUKVN45513KC0tBWDr1q3k5ORw8sknc/DgQU4//XQyMjIYMWIEb7/99hHrb926leHDhwNQXFzMtGnTSE1NZerUqRQXH+7H6pe//GVN9+Z33303AE888QQ5OTmMHz++phvy/v37s3fvXgAee+wxhg8fzvDhw5k1a1bN/oYMGcLPf/5zhg0bxllnnVVrP96sWrWK0aNHk5qayoUXXsiBAwdq9j906FBSU1NrOjP87LPPal7+NHLkSAoLC5t9bL0JZHcffYEdHuPZwAkNLH8N8L7HeKSILAMqgJnGmLfqriAi1wHXAfTr16/FAYNt5AbbbqEvQVLKR+/PgB/XNLxMbG948UL7XbgLEo+FT/9iP970GgETZ9a7uW7dujFq1Cg++OADJk+ezPz585k6dSoiQmRkJG+++SZxcXHs3buX0aNHM2nSpHrfQ/3Pf/6TLl26sHr1alavXk1GxuEk9uCDD9K1a1cqKys5/fTTWb16NTfddBOPPfYYixYtonv37rW2tXz5cubMmcPSpUsxxnDCCScwbtw4EhIS2LRpE/PmzePZZ5/lpz/9Ka+//jqXXXZZvb/xiiuu4G9/+xvjxo3jrrvu4t5772XWrFnMnDmTLVu2EBERUVP19eijj/Lkk08yZswYDh486PeebQNZsvD2V/FatyMilwGZwCMek/s5j6BfAswSkYFHbMyYZ4wxmcaYzMTERH/EzJDesYS7bA+0Sik/inTbRJG/w35HtrzKxrMqyrMKyhjD73//e1JTUznjjDPYuXMnu3fvrnc7ixcvrjlpp6amkpqaWjPvlVdeISMjg5EjR7J27Vqv3Zt7WrJkCRdeeCHR0dHExMRw0UUX1XRfnpKSQnp6OtB4N+j5+fnk5eUxbtw4AK688koWL15cE+Oll17K3LlzCQ211/xjxozhtttu44knniAvL69mur8EsmSRDSR7jCcBOXUXEpEzgD8A44wxpdXTjTE5zneWiHwKjAR+CGC8AESEuhjSJ46VmiyU8l0DJYAa1VVPY++AZc/Dqb+DlJZ1033BBRdw2223sWLFCoqLi2tKBC+99BK5ubksX76csLAw+vfv77Vbck/eSh1btmzh0Ucf5ZtvviEhIYGrrrqq0e001N4ZER
FRM+xyuRqthqrPu+++y+LFi1mwYAH3338/a9euZcaMGZx77rm89957jB49mo8++ohjjz22Wdv3JpAli2+AQSKSIiLhwDSg1l1NIjISeBqYZIzZ4zE9QUQinOHuwBig4XTuRyOT3azJzqciQC9mUarTqU4UF/8LTvuD/fZsw2immJgYTj31VH72s5/VatjOz8+nR48ehIWFsWjRIrZt29bgdsaOHctLL70EwHfffVfThXhBQQHR0dHEx8eze/du3n//cE15bGys13aBsWPH8tZbb3Ho0CGKiop48803OeWUU5r82+Lj40lISKgplbz44ouMGzeOqqoqduzYwfjx43n44YfJy8vj4MGD/PDDD4wYMYLf/e53ZGZmsmHDhibvsyEBK1kYYypE5EZgIeACZhtj1orIfcAyY8wCbLVTDPCqk9W3O3c+DQGeFpEqbEKbaYxptWSRlhzPv/63lU17DjKkd1xr7VapjmvnCpsgqksSKWPt+M4VLS5dTJ8+nYsuuqjWnVGXXnop559/PpmZmaSnpzd6hf3LX/6Sq6++mtTUVNLT0xk1ahQAaWlpjBw5kmHDhh3Rvfl1113HxIkT6d27N4sWLaqZnpGRwVVXXVWzjWuvvZaRI0c2+uY9b1544QWuv/56Dh06xIABA5gzZw6VlZVcdtll5OfnY4zh1ltvxe1286c//YlFixbhcrkYOnRozVv//EW7KPdiy94ixj/6KQ9dNILpo/zTcK5UR6NdlLc/2kW5n/Xv1oX4qDBt5FZKKYcmCy9EhDTtgVYppWposqhHerKb73cXUlTavJe2K9UZdJRq7M6gpX8rTRb1SE+Op8rAmp3aA61S3kRGRrJv3z5NGO2AMYZ9+/a16EG9QD5n0a6lJdkHhr7dkcfoAd2CHI1SbU9SUhLZ2dnk5uYGOxTlg8jISJKSmt+jkiaLenSLiaBf1y7abqFUPcLCwkhJSQl2GKqVaDVUA7SRWymlLE0WDUhPdrMrv4TdBdoDrVKqc9Nk0YD0ZH1znlJKgSaLBg3rE09oiOjDeUqpTk+TRQMiw1wc2ztWSxZKqU5Pk0Uj0pPdrM7Op7JK7yVXSnVemiwakZ6cwMHSCrJyDwY7FKWUChpNFo2obuTWlyEppTozTRaNGNA9htiIUG3kVkp1aposGhESIqQmx2sjt1KqU9Nk4YP0ZDcbfiykpLwy2KEopVRQaLLwQXpyApVVhu+0B1qlVCelycIHafokt1Kqk9Nk4YMesZH0dUdpslBKdVqaLHyUpo3cSqlOTJOFj9KT3WQfKGbvwdJgh6KUUq1Ok4WPPN+cp5RSnY0mCx+NSIrHpT3QKqU6KU0WPuoSHsoxPWO12w+lVKekyaIJ0pPj+XZHHlXaA61SqpPRZNEE6cluCkoq2LqvKNihKKVUqwposhCRCSKyUUQ2i8gML/NvE5F1IrJaRD4WkaM85l0pIpucz5WBjNNXacm2kVtvoVVKdTYBSxYi4gKeBCYCQ4HpIjK0zmIrgUxjTCrwGvCws25X4G7gBGAUcLeIJAQqVl8N6hFLdLhLG7mVUp1OIEsWo4DNxpgsY0wZMB+Y7LmAMWaRMeaQM/oVkOQMnw18aIzZb4w5AHwITAhgrD5xhQgjkvThPKVU5xPIZNEX2OExnu1Mq881wPvNXLfVpCW7WberQHugVUp1KoFMFuJlmtfbiETkMiATeKQp64rIdSKyTESW5ebmNjvQphiZ7Ka80rB+V0Gr7E8ppdqCQCaLbCDZYzwJyKm7kIicAfwBmGSMKW3KusaYZ4wxmcaYzMTERL8F3hBt5FZKdUaBTBbfAINEJEVEwoFpwALPBURkJPA0NlHs8Zi1EDhLRBKchu2znGlB1zs+ip5xEdrIrZTqVEIDtWFjTIWI3Ig9ybuA2caYtSJyH7DMGLMAW+0UA7wqIgDbjTGTjDH7ReR+bMIBuM8Ysz9QsTZVWpJbSxZKqU4lYMkCwBjzHvBenWl3eQyf0cC6s4HZgYuu+dL7ufnvut3kHSrD3SU82OEopVTA6RPczZCu7RZKqU5Gk0UzjOgbj4gmC6VU56HJohliI8MY1CNGG7mVUp2GJotmqm7kNkZ7oFVKdXyaLJopvZ+bA4fK2bG/ONihKKVUwGmyaKbqRu6VOw4EORKllAo8TRbNNLhnLJFhIXy7Iz/YoSilVMBpsmimUFcII/rGs0pLFkqpTkCTRQukJbn5LqeAsoqqYIeilFIBpcmiBdL7uSmrqGLjj4XBDkUppQJKk0ULpCVVP8mtVVFKqY5Nk0ULJCVE0T0mnFXayK2U6uA0WbSAiJCe7NaShVKqw9Nk0UJpSW5+yC0iv7g82KEopVTAaLJoofR+tt1iTbZWRSmlOi5NFi2Uqo3cSqlOQJNFC8VHhTEgMVobuZVSHZomCz+wjdzaA61SquPSZOEH6clu9h4sJSe/JNihKKVUQGiy8IOa16xu15chKaU6Jk0WfnBsrzjCQ0O0kVsp1WFpsvCD8NAQhvWJ0+7KlVIdliYLP0lLcrNmZz4VldoDrVKq49Fk4Scj+7kpLq/k+90Hgx2KUkr5nSYLP6lp5N6hjdxKqY5Hk4Wf9OvahYQuYXyryUIp1QFpsvATESHNeThPKaU6Gk0WfpSW5Ob7PYUcLK0IdihKKeVXAU0WIjJBRDaKyGYRmeFl/lgRWSEiFSIypc68ShFZ5XwWBDJOf0nv58YY7YFWKdXxBCxZiIgLeBKYCAwFpovI0DqLbQeuAl72soliY0y685kUqDj9KT1JG7mVUh1TaAC3PQrYbIzJAhCR+cBkYF31AsaYrc68DvFwQkJ0OEd166KN3EqpDieQ1VB9gR0e49nONF9FisgyEflKRC7wb2iBk66N3EqpDiiQyUK8TGtKH979jDGZwCXALBEZeMQORK5zEsqy3Nzc5sbpV2lJbn4sKOFH7YFWKdWB+JQsRORCEYn3GHf7cLWfDSR7jCcBOb4GZozJcb6zgE+BkV6WecYYk2mMyUxMTPR10wFV/ZpVLV0opToSX0sWdxtjam7xMcbkAXc3ss43wCARSRGRcGAa4NNdTSKSICIRznB3YAwebR1t2dDecYS5RJOFUqpD8TVZeFuuwcZxY0wFcCOwEFgPvGKMWSsi94nIJAAROV5EsoGLgadFZK2z+hBgmYh8CywCZhpj2kWyiAxzMaR3nDZyK6U6FF/vhlomIo9hb4U1wK+B5Y2tZIx5D3ivzrS7PIa/wVZP1V3vf8AIH2Nrc9KT3by+PJvKKoMrxFvTjVJKtS++lix+DZQB/wFeAYqBXwUqqPYuLclNUVklP+RqD7RKqY7Bp5KFMaYIOOIJbOVdTSP39jyO6Rkb5GiUUqrlfL0b6kMRcXuMJ4jIwsCF1b6ldIsmNjKUldpuoZTqIHythuru3AEFgDHmANAjMCG1fyEhQnqyWxu5lVIdhq/JokpE+lWPiEh/mvaAXaeTnuxm4+5Cissqgx2KUkq1mK93Q/0BWCIinznjY4HrAhNSx5CW5KayyvBdTj7H9+8a7HCUUqpFfCpZGGM+ADKBjdg7on6DvSNK1SMt+XAjt1JKt
Xc+lSxE5FrgZuwzEauA0cCXwGmBC619S4yNoK87ilXZmiyUUu2fr20WNwPHA9uMMeOx/TS1jZ772rD0fm4tWSilOgRfk0WJMaYEQEQijDEbgMGBC6tjSE9yszOvmNzC0mCHopRSLeJrssh2nrN4C/hQRN6mCT3IdlbVD+fpLbRKqfbO1ye4L3QG7xGRRUA88EHAouoghveJxxVie6A9Y2jPYIejlFLN1uTXqhpjPmt8KQUQFe5icM9YvtVGbqVUOxfIN+UpnEbuHXlUVekzjEqp9kuTRYClJ7kpLKkga29RsENRSqlm02QRYNrIrZTqCDRZBNjAxBhiIkL1NatKqXZNk0WAuUKEEX3jtZFbKdWuabJoBen93KzfVUBJufZAq5RqnzRZtIK0JDfllYZ1uwqCHYpSSjWLJotWMLKf9kCrlGrfNFm0gp5xkfSOj9RGbqVUu6XJopWkJbm1kVsp1W5psmgl6f3cbNt3iP1FZcEORSmlmkyTRStJS3IeztPShVKqHercyWLJLNiyuPa0LYvtdD9LTYonRLSRWynVPnXuZNE3A169CtYtAGNsonj1Kjvdz6IjQhnUQ3ugVUq1T03uorxDSRkLZz0Ir1wBA0+HXSvh4n/Z6QGQnuzmv+t+xBiDiARkH0opFQgBLVmIyAQR2Sgim0Vkhpf5Y0VkhYhUiMiUOvOuFJFNzufKgAWZOhW6DoAfPoKjzwhYogDbyH3gUDnb9h0K2D6UUioQApYsRMQFPAlMBIYC00VkaJ3FtgNXAS/XWbcrcDdwAjAKuFtEEgIS6LYlUHwAunSD1a/At68EZDegjdxKqfYrkCWLUcBmY0yWMaYMmA9M9lzAGLPVGLMaqKqz7tnAh8aY/caYA8CHwAS/R1jdRvHTF+CaDyGsC7x1PXy/0O+7AjimZwxRYS5WaiO3UqqdCWSy6Avs8BjPdqYFel3f7VxxuI2i20CY+iKYKvjwLqiqm79aLtQVoj3QKqXapUAmC28tuL6+W9SndUXkOhFZJiLLcnNzmxQcACffUruN4ujT4ewHIXcDfPaXpm/PB+n93KzNKaCswv/JSCmlAiWQySIbSPYYTwJy/LmuMeYZY0ymMSYzMTGx2YHWMvoGSLsEPptpb6n1s/RkN2UVVaxvaz3QtuIzJ0qp9ieQyeIbYJCIpIhIODAN8PXsuxA4S0QSnIbts5xpgScC5/0V+mbCm9fD7rV+3Xxachtt5K5+5qQ6YQTwmROlVPsTsGRhjKkAbsSe5NcDrxhj1orIfSIyCUBEjheRbOBi4GkRWeusux+4H5twvgHuc6a1jrBImDoXImJh3jQo2ue3TfeJjyQxNqLtPcmdMta23/znMphzjk3QrSuQAAAcTklEQVQUAXzmRCnVvgT0oTxjzHvAe3Wm3eUx/A22isnburOB2YGMr0FxvWHayzBnIrx6JVz+JrjCWrxZESEtyc2qtlayAHCFQ1kxbPsC+p+iiUIpVaNzd/fRmKTj4PzHYevnsPD3ftvsyH5usnKLyD9U7rdtttjWL+CFyWAqoOtA+5uXvxDsqJRSbYQmi8akT4cTb4Svn/HbybP64bzVO9tI6SLrM3jxQqgqhylz4NqPININ79wKmz8OdnRKqTZAk4UvzrgXBp4G7/4Gtn/V4s2lJscjbaUH2s0fw8s/hch420Yx7ALo0hUufApMJSz5a7AjVEq1AZosfOEKhSmzwZ1sG4Dzs1u0ubjIMAYmxgT/Navf/xfmTYdug+CGr2DopMPzBk+E9Mts+0X2suDFqJRqEzRZ+CoqAabPh/ISmH8JlLWsM8Dq16wa4+tzin628X34z6XQ41i4cgFEdztymQl/hri+8OYvWvx7lVLtmyaLpkgcDD95DnathgW/tu/AaKb0fm72Hiwj+0CxHwP00boFtoTUczhc8batdvImMh4m/x32bYZP7m/dGJVSbYomi6YaPAFO/xN89xp80fynm9OD1QPtd2/YZyj6ZMAVb9kSU0MGnAqjroOv/gFbPm+FAJVSbZEmi+Y4+TYYdhF8dG+ze6g9tncsEaEhrdvIvfpVeP0aSB4Fl79hSw6+OOMe+86Pt26A0sJARqiUaqM0WTSHCEx+EnqNgNevhdzvm7yJMFcIw1uzB9pV8+DN6+CoMXDZ6/bpdF+FR8MFT0FBNiz8Q+BiVEq1WZosmiu8i33C2xUO86dDcdNP+mlJbtbszKe8MsA90K74N7z1S/tE9iWv2JN/U/U7AU66CVa8AJs+9H+MSqk2TZNFS7iT7TswDmyz1TtVlU1aPb2fm5LyKjb+GMCqnW+et43xR59u7+YK79L8bY3/PSQOgbdvhEOt11WXUir4NFm01FEnwTmPwOaP4KN7mrRqwBu5lz4D794Gg86GqS9BWFTLthcaYR/WO7QX3r/DPzEqpdoFTRb+kHk1HH8t/O8J+PY/Pq+W3DWKrtHhgWnk/vJJeP92OPY824NuWKR/ttsnHcbeAWtehbVv+WebSqk2T5OFv0yYCUedbKt8di73aRXbA20AGrmXzLIdHw6dbLvwCA337/ZPuQ36jLSlloN7/LttpVSbpMnCX1xh8NMXIKYnzL8UCn/0abX05AQ27TlIYYmfeqD97BH46G4Y/hP4yWy/dKt+BFeYvTuq9KDtbDBYT6ErpVqNJgt/iu4O01+Gknz4z+VQUdroKun93BgDa7LzW7ZvY2DRQ7DoAUidChc+Y/u0CpQex9qHEze8A9/OD9x+lFJtgiYLf+s1wjYCZ38N79zW6FV3WpJ9MK5FL0MyxnbH8dlMSL8ULvhnYBNFtdE3QL8T4f3ftbhzRaVU26bJIhCGTraNwKvmwtKnG1zU3SWclO7RzW/kNgY+vAs+/3+QcSVM+juEuJq3raYKccEF/4CqCns7rVZHKdVhabIIlFPvtHciLfw9ZH3a4KLNbuQ2Bj64096Fdfy1cN4sCGnlP2nXAXDW/ZC1CJYF7y24SqnA0mQRKCEhtjqq+zHwypWwP6veRdOT3ewuKGVXfhN6oK2qgvduh6X/hBN+Cec82vqJolrmz2DAePjvnxr8nUqp9kuTRSBFxNoGbxGYd0m9nfCl97M9v37r68uQqqrg3Vvhm2fhpF/DhIfsPoJFxHZlHhJqOxts4pPsSqm2T5NFoHUdYJ912Ps9vPELe6KvY0jvWMJdIaz0JVlUVdpnOZb/y/Z+e+b9wU0U1eKTYOJfYPuXtjtzpVSHosmiNQw4Fc7+M2x8Fz596IjZ76/5EYPh6c+yGDPzE95audP7dqoq7ZX7qrkw7ndw+l1tI1FUS5tm22k+vh/2bAh2NEopP9Jk0VpO+IV9p/Xih2t1k/HWyp3c+cYayivtnUQ784q58401RyaMygp44+ewej6M/6Pt1K8tJQqw8Zz3V4iIgbeuh0o/PWiolAo6TRatRQTOewySRtnuwn9cA8AjCzdSXF67jr+4vJJHFm48PKGyHF7/GXz3un0R0bjbWy/uporpAec+Bjkr4fPHgh2NUspPNFm0ptAI26lfpNs2eBftJSfP+x1QO/OK2fBjAVSU2degrnsbznoQTr61dWNujmEX
wIiLbSkqZ1Wwo1FK+YEmi9YW2xOmvQRFe+CVK0mO9953kwCTZn3CykfPgw3vUHX2TDjpxtaNtSUmPgxdusOb1/vU7YlSqm3TZBEMfTNg0t9g2xL+1fsNosJqP3EdFeZi5qSj+bDP04wsWcofy6/mrC+HMu/r7ZSUt5PbUrt0tbfT5q6HRX8OdjRKqRYKaLIQkQkislFENovIDC/zI0TkP878pSLS35neX0SKRWSV83kqkHEGRepP4aSbGLB1Hi9nrKevOwoB+rqjeHjS0UzdfAdH7f+SinMf57gpvyUiNIQ731jDSTM/4bEPvye3sB1crQ8603ZB8r8nYPvSYEejlGoBMQHqz0dEXMD3wJlANvANMN0Ys85jmRuAVGPM9SIyDbjQGDPVSRrvGGOG+7q/zMxMs2zZMn/+hMCrqoQnR9mnnq961751r/QgzJ4Au9fYDgHTLwHAGMPSLft57vMtfLxhN2EhIUxO78M1p6RwbK+4IP+QBpQWwj9Psg/sXb+kee//VkoFjIgsN8ZkNrZcIEsWo4DNxpgsY0wZMB+YXGeZycALzvBrwOkibe1+0AAKcdmH6gBengp71sPzZ9pEccpvahIF2BcljR7QjeeuzOTj28Yx9fhk3lm9iwmzPufy55eyaOMeqqraYEd+EbEw+R82ITbxtbNKqbYjkMmiL7DDYzzbmeZ1GWNMBZAPdHPmpYjIShH5TEROCWCcwXXsOfZkWloI/zgR9qyDsbfbB+7qMSAxhvsvGM6Xd57GHRMG8/3uQq6e8w1nzVrcNts1Uk6x3Zl//UyjnSoqpdqmQCYLbyWEupe+9S2zC+hnjBkJ3Aa8LCJH1LWIyHUiskxEluXm5rY44KBJn27fbIex3Zuf9kefVnN3CeeGU4/m8ztO469T09p2u8bpd0G3QbYr85IWvuhJKdXqApkssoFkj/EkIKe+ZUQkFIgH9htjSo0x+wCMMcuBH4Bj6u7AGPOMMSbTGJOZmJgYgJ/QSrYstl18j70Dti6x400QHhrChSOTeOfXJzP/utFk9Evgb59sYszMT7j91W/t8xrBFhZle+Et2Akf/D7Y0SilmiiQr1P7BhgkIinATmAacEmdZRYAVwJfAlOAT4wxRkQSsUmjUkQGAIOAjtn39ZbF9qG7i/8FKWNtlY3neBNUt2uMHtCNrNyDzPliK68tz+bV5dmcMqg7Pzs5hXGDEgkJCVKzUFKmfajw8/8HQ86DwRODE4dSqskCdjcUgIicA8wCXMBsY8yDInIfsMwYs0BEIoEXgZHAfmCaMSZLRH4C3AdUAJXA3caY/2toX+3ybiiAJbPscxeeiWHLYti5Ak6+pcWbzztUxstfb+eF/21ld0EpR/eI4ZqTU7hwZF8iw1rpjXqeKsrg2fFwcA/8aql9HkMpFTS+3g0V0GTRmtptsmglZRVVvLsmh+c+38LanAK6Rodz2eijuHz0USTGRrRuMD+ugWfGw5Dz4eI5rbtvpVQtmiyUV8YYvsraz/NLjnxeY8OuQh5ZuJGcvGL6uKO4/ezBXDCy7g1sfrL4Ufjkfpgy22ncV0oFgyYL1ajqdo1Xl++gpLyKEAHPRzWiwlw8dNGIwCSMygqYfZZ9/uKGryC2l//3oZRqVFt4KE+1cdXPa3x15+nERYZS95m+4vJK/vze+sA87OcKhQuegvJi+L+boYNctCjVUWmyULi7hFNYUuF13p7CUtLv+y9Xzv6axz/axOebciks8dNLjRKPse/n+P4DWPWSf7aplAqIQN46q9qRPu4odnp5t4a7SxgTh/dixbY8Zn38PcbY9zgN7hnLyH4JZPRzk3FUAgO6R9OsnlpG/QLWvwPvz7B3hLn7+eHXKKX8TdssFHD49a6eb+2r22ZRUFLOqu15rNh+gBXb81i5/UBNicTdJYyRyW6OOyqBjH4JpCW7iY7w8VrkwFb45xh7C/Hlb0OIFniVapSfbrv3tc1CSxYKoCYhNHQ3VFxkGGOPSWTsMfZp+aoqww+5B23y2JbH8u0HWLTRdrsSIjC4V5wtefRL4LijEjiqWxfvpY+E/nD2g7bt4pvn4ITrAv57GxXg51+UarG+GbUf4PV8wDcAtGSh/Cr/UDkrdxwueazankdhqS19dI0OJ6Of26m+SiAtOZ4u4c71ijHw0sWwdQkfjnude74obZ1beOtT98n6uuNKBdOh/bB3E2x413bQmZRpOyFtXs8PeuusCr7KKsOmPYWs2FZdfXWArNwiAFwhwpDesWQ4yeP4rqUkzh3H2rJe/KT0Lqqc+y8CdgtvRSkU7YWi3MPfhzzGczfCrlUQ29tOO+5qGDoJEo/VJ8+DobOV9irLYf8W2LfJJobq772boHj/4eUkBEyV7VvutD80eTeaLFSbdaCojFU7bPJYvu0A3+7Io6jMtpU8EfY3Jrm+ZGb5NJ6qnATAiSFrObnLDn71x781vOHKCvufqCjXIwHsrT3umQxK6+lg0RUB0YkQ3d32kHtgC4SEQ1XZ4WViekGPYyFxSO3vyHh/HCLlTUcs7Rlj/y3WSgib7ff+LWA8XjcQ09P23Nz9aOd7EBTnwcI7IfMaWPa8lix8ocmi/aqsMmz8sZAV2w/w7oL5/CvsYYQqJpX9mSTZw6NhT/N4+UUk9x9I/6gS+oYdJDGkgPiqPFyeyaH4AEf2gg+Iy574oxOhSzcnETjJoHp69XiX7vaFTSKHT0bV/xEnPgKRcfYlVbkbDn+XHzq8r7i+tuTRY4jH92C7TeWbsiI4uNv2H1bz7QzvWQ85KyA8FsoKodcI6DoQohJsaS8qAaK61hlPsEk8xE99oTWnhFNRak/+e7+vnRD2boKSvMPLuSKg20CbCKoTQnWCqHsh4qfkqclCtUtjZn7CsILFPBU2ixCp/9/mfhPDfhNHoctNeaRNAOFxPYjp1ouEHn1J6N6HkBgnCUS6m36Hla//EauqIH877NkAuevt95519qRQUXJ4ufh+TgnEI5EkDm4/r5ltaRVQRRkU7amTBHKd7zqJobzoyPUlxP4tY3rYVw8f2AIJKTYhFB+wdfgl+Xi9WLAbgCh3PcnEc9xdezwizl44eKrv38aUOfbvum+T/ft7JoS8bbaqqFpsb+h2tEcyOMYmhPhk35NaK98NpclCtSnVt/D+yTzFJaGL+KxyBAtkPBeenMbJqcdSEtGVLUURZO0vIyv3IFl7i/gh9yBZuUUcLD38YGFUmIuU7tEMSIxmQGIMAxOjGdA9hgGJ0b7d0tvS/4hVlfaW4OoSSHUpZO/3UFldnSWQcFSdqqwh9sQRFumfOPylvhPkebOga0qdEsCeI5OA59Wzp6gEW70S0wOiexwervXd0564Q1xHlvY8k3dVpU0Y1cmj+ICtlmxw/ED91ZFgS6XVpRPP5FJWBJs/hD4ZkP2NfT7o4O7a2wqNPLLaqNvR9hN5xLvcgkaThWq3lvz3DYb972ZerDidy0M/Zu1Jj3PyWRc1uI4xhtzCUn7ILSJrr00e1clkx/5Dtboy6RUX6SSRwwlkYGIMfdxRuDze9fHWyp3+71ixssJeFddUZa2zpZF9m6HKeTJ
eQuxVc48hENYFNr4HZ//Zvutk+1L4YAaccS/0HmG3V1Vh162qqDNeaRtJjxj3tmwj45UV9mS461t7wiyqfjOll/NHWLT3E35MD49PT1tSCG1Cj8eBarOoLLd1/40ml+phJ8lUl4DCY21C737M4YTQfRDEJbWLZ4Y0Waj2KQAnhNKKSrbtO0RW7kGbTDwSSn7x4a5LwkNDGOCURsorqvj0+1zKKw///whsx4rlsO+H2lVZuRvsNBPgd6qHhIErDEJCD39cYfZKPiSs9vjBXCjMgV6pcMzZR5YEontARExg4mwrpazq/b56FYy8Alb+u103smuyUO1TK54QjDHsLypzEogthWQ5VVpZe73UmwMRoSGcm9qbPvFR9HZH0js+kt7xUfSJjyIuKrR5XZ40pKLUljo+nQnrF8Cx58OIKQ2f0JsyLiFH1snXp6EqoM6kg92VpclCqRZImfFuvU2lfeIj2V1YSmWd3ni7hLvoFR9pE0m8k0jcdriPO4pe8ZHERYY1OZbmVMv5XQc7QbZIWyrh+IF296FUC9TXsWJfdxRfzDiNyirbRpKTX8yuvBJ25RezK99+5+SV8PmmvewpLDmi2/eYiFB6x0ceTiqepRO3/fZsgF/y3zcY+sVN3FB+E19WDePLqqE8+cVNLIHWTRg7V9RODClj7fjOFZ0vWXhLCCljO/xx0JKFUl740rFiY8orq9hTWMquvNqJZFd+MT/ml5CTX0JuYekR68VGhtYkkuFb5rC8IoUvq4bVzD8xZC0nRW3nytv/Skx4KCEhfq76qkdAGvxV0Gk1lFIt1Bonx7KKKnYXlByRTKrHv9vZwG2d2OaGmIhQ4iLDiI2033FRocRGhhEXab9jI0OJizo8P9aZHhdlxyNCQxpta/FH8lRtkyYLpTqAMTM/8f6ekagwfjX+aApLyikoqaCgpJzCkgoKip1vZ7ywpPyIqrC6wl0htRKKZ1Kx32HM/iKL/OIjX5DVOz6S/804zf8N+6rVaJuFUh3A7WcP9npFf8+kYT5d0RtjKCqrtEml2CaP6mRS4JFcqpOOXa6cPQWlNcsdKqv/1t1d+SUM+sP7xEeFER8VRpzz7e0TV2s4lPioMGIimnYHmVaFBY8mC6XaMF/eM9IQESEmItRpWG9eDBWVVZzy8CJ25ZccMS8uMpTLRh9FfnF5zSfvUBnb9tlnWApKKo64a8yTK0SIiwz1Kdms2ZnP80u2UFphu83YmVfMnW+sAWj1hNEZk5ZWQymlGtXcNgtjDAdLK2olkwKP4cOfCq/zG0o01UIE+nXtQnREKNERocQ639ERocREuJzv2tNiIsKIjnDVTI+JCPWp7aYlx6Kt0moopZTfNLeEIyJOI3sYSQlN22d1FVp+cTn5h8o554nPvS5XZSA1yU1RaQUHSyv4saDEGa6kqLSi1km9Ia4QITrcVSuxxEaGEh1eO/HM/WrbEdssLq/kwffW29cJh7uICnfRJTy0VvcxgdCaJRwtWSil2oX6Gvurn32pT2WVoaisgqLSilpJpLDEmVZmk4ydX1kzXPv78PQKH0o71SJCQ+jiJI6ocFetRGKn1x6OCg+ttUx9y0eGhfD2qhy/lHC0ZKGU6lDqa+y//ezBDa5n20XCmvX0fF3GGMbM/IQcL+03XaPD+dN5QzhUVsmh0kr7XV5RM1xcbpNOcVklewpLDi9TVsGhssomJSERwBzZjWNxeSWPLNwYkNKFJgulVLvQ0sZ+fxAR7phwrNekddd5Q1sUS1lFFcVllRQ5yaN6uLjMJhXP4UNlFfztk81et5PjpfTlDwFNFiIyAXgccAHPGWNm1pkfAfwbOA7YB0w1xmx15t0JXANUAjcZYxYGMlalVNt3wci+QW9EDlTSCg8NITw0hPguvpWA3lix02u1XB93VIviqE/AkoWIuIAngTOBbOAbEVlgjFnnsdg1wAFjzNEiMg34CzBVRIYC04BhQB/gIxE5xphA99WslFKNawtJq7nVcs0VyDdzjAI2G2OyjDFlwHxgcp1lJgMvOMOvAaeLvXdtMjDfGFNqjNkCbHa2p5RSCpuwHrpoBH3dUQi2oT+Qt+8GshqqL7DDYzwbOKG+ZYwxFSKSD3Rzpn9VZ932dwOzUkoFUGuWcAJZsvB2g3Hdxvv6lvFlXUTkOhFZJiLLcnNzvayilFLKHwKZLLKBZI/xJCCnvmVEJBSIB/b7uC7GmGeMMZnGmMzExEQ/hq6UUspTIJPFN8AgEUkRkXBsg/WCOsssAK50hqcAnxj7lOACYJqIRIhICjAI+DqAsSqllGpAwNosnDaIG4GF2FtnZxtj1orIfcAyY8wC4HngRRHZjC1RTHPWXSsirwDrgArgV3onlFJKBY9296GUUp1Yp3v5kYjkAtuCHUcLdQf2BjuINkSPR216PA7TY1FbS47HUcaYRht9O0yy6AhEZJkvGb6z0ONRmx6Pw/RY1NYaxyOQDdxKKaU6CE0WSimlGqXJom15JtgBtDF6PGrT43GYHovaAn48tM1CKaVUo7RkoZRSqlGaLJRSSjVKk0UbICLJIrJIRNaLyFoRuTnYMQWbiLhEZKWIvBPsWIJNRNwi8pqIbHD+jZwY7JiCSURudf6ffCci80QkMtgxtSYRmS0ie0TkO49pXUXkQxHZ5Hwn+Hu/mizahgrgN8aYIcBo4FfOC6A6s5uB9cEOoo14HPjAGHMskEYnPi4i0he4Ccg0xgzHdiU0LbhRtbp/ARPqTJsBfGyMGQR87Iz7lSaLNsAYs8sYs8IZLsSeDDrt+ztEJAk4F3gu2LEEm4jEAWOx/ahhjCkzxuQFN6qgCwWinJ6qu+ClR+qOzBizGNuXnifPF8m9AFzg7/1qsmhjRKQ/MBJYGtxIgmoWcAdQFexA2oABQC4wx6mWe05EooMdVLAYY3YCjwLbgV1AvjHmv8GNqk3oaYzZBfbiE+jh7x1osmhDRCQGeB24xRhTEOx4gkFEzgP2GGOWBzuWNiIUyAD+aYwZCRQRgCqG9sKpi58MpAB9gGgRuSy4UXUOmizaCBEJwyaKl4wxbwQ7niAaA0wSka3Y97afJiJzgxtSUGUD2caY6pLma9jk0VmdAWwxxuQaY8qBN4CTghxTW7BbRHoDON97/L0DTRZtgIgItk56vTHmsWDHE0zGmDuNMUnGmP7YhstPjDGd9srRGPMjsENEBjuTTse+56Wz2g6MFpEuzv+b0+nEDf4ePF8kdyXwtr93ELCXH6kmGQNcDqwRkVXOtN8bY94LYkyq7fg18JLzxsks4OogxxM0xpilIvIasAJ7F+FKOlnXHyIyDzgV6C4i2cDdwEzgFRG5BptQL/b7frW7D6WUUo3RaiillFKN0mShlFKqUZoslFJKNUqThVJKqUZpslBKKdUoTRZKNYGIVIrIKo+P356mFpH+nj2JKtWW6HMWSjVNsTEmPdhBKNXatGShlB+IyFYR+YuIfO18jnamHyUiH4vIaue7nzO9p4i8KSLfOp/qLitcIvKs876G/4pIVNB+lFIeNFko1TRRdaqhpnrMKzDGjAL+ju05F2f438aYVOAl4Aln+h
PAZ8aYNGxfT2ud6YOAJ40xw4A84CcB/j1K+USf4FaqCUTkoDEmxsv0rcBpxpgsp1PIH40x3URkL9DbGFPuTN9ljOkuIrlAkjGm1GMb/YEPnRfYICK/A8KMMQ8E/pcp1TAtWSjlP6ae4fqW8abUY7gSbVdUbYQmC6X8Z6rH95fO8P84/NrPS4ElzvDHwC+h5n3jca0VpFLNoVctSjVNlEfPwGDfjV19+2yEiCzFXoRNd6bdBMwWkduxb7yr7jH2ZuAZp5fQSmzi2BXw6JVqJm2zUMoPnDaLTGPM3mDHolQgaDWUUkqpRmnJQimlVKO0ZKGUUqpRmiyUUko1SpOFUkqpRmmyUEop1ShNFkoppRr1/wFc8fhIJ+nIngAAAABJRU5ErkJggg==\n", 308 | "text/plain": [ 309 | "
" 310 | ] 311 | }, 312 | "metadata": { 313 | "needs_background": "light" 314 | }, 315 | "output_type": "display_data" 316 | } 317 | ], 318 | "source": [ 319 | "import matplotlib.pyplot as plt\n", 320 | "f, ax = plt.subplots()\n", 321 | "ax.plot([None] + theLeNetModel.history['loss'], 'o-')\n", 322 | "ax.plot([None] + theLeNetModel.history['val_loss'], 'x-')\n", 323 | "ax.legend(['Train loss', 'Validation loss'], loc = 0)\n", 324 | "ax.set_title('Training/Validation loss per Epoch')\n", 325 | "ax.set_xlabel('Epoch')\n", 326 | "ax.set_ylabel('acc')" 327 | ] 328 | }, 329 | { 330 | "cell_type": "code", 331 | "execution_count": null, 332 | "metadata": {}, 333 | "outputs": [], 334 | "source": [] 335 | }, 336 | { 337 | "cell_type": "code", 338 | "execution_count": null, 339 | "metadata": {}, 340 | "outputs": [], 341 | "source": [] 342 | }, 343 | { 344 | "cell_type": "code", 345 | "execution_count": null, 346 | "metadata": {}, 347 | "outputs": [], 348 | "source": [] 349 | }, 350 | { 351 | "cell_type": "code", 352 | "execution_count": null, 353 | "metadata": {}, 354 | "outputs": [], 355 | "source": [] 356 | }, 357 | { 358 | "cell_type": "code", 359 | "execution_count": null, 360 | "metadata": {}, 361 | "outputs": [], 362 | "source": [] 363 | } 364 | ], 365 | "metadata": { 366 | "kernelspec": { 367 | "display_name": "Python 3", 368 | "language": "python", 369 | "name": "python3" 370 | }, 371 | "language_info": { 372 | "codemirror_mode": { 373 | "name": "ipython", 374 | "version": 3 375 | }, 376 | "file_extension": ".py", 377 | "mimetype": "text/x-python", 378 | "name": "python", 379 | "nbconvert_exporter": "python", 380 | "pygments_lexer": "ipython3", 381 | "version": "3.6.8" 382 | } 383 | }, 384 | "nbformat": 4, 385 | "nbformat_minor": 2 386 | } 387 | -------------------------------------------------------------------------------- /Chapter4/Chapter4: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Chapter5/Chapter5: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Chapter5/YOLO.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 2, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "import cv2\n", 10 | "from imutils.video import VideoStream\n", 11 | "import os\n", 12 | "import numpy as np" 13 | ] 14 | }, 15 | { 16 | "cell_type": "code", 17 | "execution_count": 4, 18 | "metadata": {}, 19 | "outputs": [], 20 | "source": [ 21 | "localPath_labels = \"coco.names\"\n", 22 | "localPath_weights = \"yolov3.weights\"\n", 23 | "localPath_config = \"yolov3.cfg\"\n", 24 | "labels = open(localPath_labels).read().strip().split(\"\\n\")\n", 25 | "scaling = 0.005\n", 26 | "confidence_threshold = 0.5\n", 27 | "nms_threshold = 0.005 # Non Maxima Supression Threshold Vlue\n", 28 | "model = cv2.dnn.readNetFromDarknet(localPath_config, localPath_weights)" 29 | ] 30 | }, 31 | { 32 | "cell_type": "code", 33 | "execution_count": null, 34 | "metadata": {}, 35 | "outputs": [], 36 | "source": [] 37 | }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": null, 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": 10, 48 | "metadata": {}, 49 | "outputs": [ 50 | { 51 | "name": "stdout", 52 | 
"output_type": "stream", 53 | "text": [ 54 | "\n" 55 | ] 56 | } 57 | ], 58 | "source": [ 59 | "#Start the video streat\n", 60 | "cap = VideoStream(src=0).start()\n", 61 | "#Getting the layers here\n", 62 | "layers_name = model.getLayerNames()\n", 63 | "\n", 64 | "output_layer = [layers_name[i[0] - 1] for i in model.getUnconnectedOutLayers()]\n", 65 | "print(model.getLayerNames)\n" 66 | ] 67 | }, 68 | { 69 | "cell_type": "code", 70 | "execution_count": null, 71 | "metadata": {}, 72 | "outputs": [], 73 | "source": [ 74 | "while True:\n", 75 | " frame = cap.read()\n", 76 | " (h, w) = frame.shape[:2]\n", 77 | " blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)\n", 78 | " model.setInput(blob)\n", 79 | " nnoutputs = model.forward(output_layer)\n", 80 | " confidence_scores = []\n", 81 | " box_dimensions = []\n", 82 | " class_ids = []\n", 83 | "\n", 84 | " for output in nnoutputs:\n", 85 | " for detection in output:\n", 86 | " scores = detection[5:]\n", 87 | " class_id = np.argmax(scores)\n", 88 | " confidence = scores[class_id]\n", 89 | " if confidence > confidence_threshold :\n", 90 | " box = detection[0:4] * np.array([w, h, w, h])\n", 91 | " (center_x, center_y, width, height) = box.astype(\"int\")\n", 92 | " x = int(center_x - (width / 2))\n", 93 | " y = int(center_y - (height / 2))\n", 94 | " box_dimensions.append([x, y, int(width), int(height)])\n", 95 | " confidence_scores.append(float(confidence))\n", 96 | " class_ids.append(class_id)\n", 97 | " ind = cv2.dnn.NMSBoxes(box_dimensions, confidence_scores, confidence_threshold, nms_threshold)\n", 98 | " for i in ind:\n", 99 | " i = i[0]\n", 100 | " (x, y, w, h) = (box_dimensions[i][0], box_dimensions[i][1],box_dimensions[i][2], box_dimensions[i][3])\n", 101 | " cv2.rectangle(frame,(x, y), (x + w, y + h), (0, 255, 255), 2)\n", 102 | " label = \"{}: {:.4f}\".format(labels[class_ids[i]], confidence_scores[i])\n", 103 | " cv2.putText(frame, label, (x, y - 5), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255,0,255), 2)\n", 104 | " cv2.imshow(\"Yolo\", frame)\n", 105 | " if cv2.waitKey(1) & 0xFF == ord(\"q\"):\n", 106 | " break\n", 107 | "cv2.destroyAllWindows()\n", 108 | "cap.stop()" 109 | ] 110 | }, 111 | { 112 | "cell_type": "code", 113 | "execution_count": null, 114 | "metadata": {}, 115 | "outputs": [], 116 | "source": [] 117 | }, 118 | { 119 | "cell_type": "code", 120 | "execution_count": null, 121 | "metadata": {}, 122 | "outputs": [], 123 | "source": [] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "execution_count": null, 128 | "metadata": {}, 129 | "outputs": [], 130 | "source": [] 131 | } 132 | ], 133 | "metadata": { 134 | "kernelspec": { 135 | "display_name": "Python 3", 136 | "language": "python", 137 | "name": "python3" 138 | }, 139 | "language_info": { 140 | "codemirror_mode": { 141 | "name": "ipython", 142 | "version": 3 143 | }, 144 | "file_extension": ".py", 145 | "mimetype": "text/x-python", 146 | "name": "python", 147 | "nbconvert_exporter": "python", 148 | "pygments_lexer": "ipython3", 149 | "version": "3.7.4" 150 | } 151 | }, 152 | "nbformat": 4, 153 | "nbformat_minor": 2 154 | } 155 | -------------------------------------------------------------------------------- /Chapter6/Chapter6: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Chapter6/Chapter6.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": 
"code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [ 8 | { 9 | "name": "stderr", 10 | "output_type": "stream", 11 | "text": [ 12 | "Using TensorFlow backend.\n" 13 | ] 14 | } 15 | ], 16 | "source": [ 17 | "from keras.models import model_from_json\n" 18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": 19, 23 | "metadata": {}, 24 | "outputs": [], 25 | "source": [ 26 | "from inception_resnet_v1 import *\n", 27 | "import numpy as np\n", 28 | "\n", 29 | "from keras.models import Sequential\n", 30 | "from keras.models import load_model\n", 31 | "from keras.models import model_from_json\n", 32 | "from keras.layers.core import Dense, Activation\n", 33 | "from keras.utils import np_utils\n", 34 | "\n", 35 | "from keras.preprocessing.image import load_img, save_img, img_to_array\n", 36 | "from keras.applications.imagenet_utils import preprocess_input\n", 37 | "\n", 38 | "import matplotlib.pyplot as plt\n", 39 | "from keras.preprocessing import image" 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "execution_count": 30, 45 | "metadata": {}, 46 | "outputs": [], 47 | "source": [ 48 | "face_model = InceptionResNetV1()" 49 | ] 50 | }, 51 | { 52 | "cell_type": "code", 53 | "execution_count": 31, 54 | "metadata": {}, 55 | "outputs": [], 56 | "source": [ 57 | "face_model.load_weights('facenet_weights.h5')" 58 | ] 59 | }, 60 | { 61 | "cell_type": "code", 62 | "execution_count": 27, 63 | "metadata": {}, 64 | "outputs": [], 65 | "source": [ 66 | "def normalize(x):\n", 67 | " return x / np.sqrt(np.sum(np.multiply(x, x)))\n" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": 32, 73 | "metadata": {}, 74 | "outputs": [], 75 | "source": [ 76 | "def getEuclideanDistance(source, validate):\n", 77 | " euclidean_dist = source - validate\n", 78 | " euclidean_dist = np.sum(np.multiply(euclidean_dist, euclidean_dist))\n", 79 | " euclidean_dist = np.sqrt(euclidean_dist)\n", 80 | " return euclidean_dist" 81 | ] 82 | }, 83 | { 84 | "cell_type": "code", 85 | "execution_count": 34, 86 | "metadata": {}, 87 | "outputs": [], 88 | "source": [ 89 | "def preprocess_data(image_path):\n", 90 | " image = load_img(image_path, target_size=(160, 160))\n", 91 | " image = img_to_array(image)\n", 92 | " image = np.expand_dims(image, axis=0)\n", 93 | " image = preprocess_input(image)\n", 94 | " return image" 95 | ] 96 | }, 97 | { 98 | "cell_type": "code", 99 | "execution_count": 21, 100 | "metadata": {}, 101 | "outputs": [], 102 | "source": [ 103 | "img1_representation = normalize(face_model.predict(preprocess_data('image_1.jpeg'))[0,:])\n", 104 | "img2_representation = normalize(face_model.predict(preprocess_data('image_2.jpeg'))[0,:])\n", 105 | " \n", 106 | "euclidean_distance = getEuclideanDistance(img1_representation, img2_representation)" 107 | ] 108 | }, 109 | { 110 | "cell_type": "code", 111 | "execution_count": 22, 112 | "metadata": {}, 113 | "outputs": [ 114 | { 115 | "data": { 116 | "text/plain": [ 117 | "0.70589006" 118 | ] 119 | }, 120 | "execution_count": 22, 121 | "metadata": {}, 122 | "output_type": "execute_result" 123 | } 124 | ], 125 | "source": [ 126 | "euclidean_distance" 127 | ] 128 | }, 129 | { 130 | "cell_type": "code", 131 | "execution_count": 25, 132 | "metadata": {}, 133 | "outputs": [], 134 | "source": [ 135 | "def getCosineSimilarity(source, validate):\n", 136 | " a = np.matmul(np.transpose(source), validate)\n", 137 | " b = np.sum(np.multiply(source, source))\n", 138 | " c = np.sum(np.multiply(validate, validate))\n", 139 | " return 1 - (a / 
(np.sqrt(b) * np.sqrt(c)))" 140 | ] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "execution_count": 37, 145 | "metadata": {}, 146 | "outputs": [], 147 | "source": [ 148 | "img1_representation = (face_model.predict(preprocess_data('image_1.jpeg'))[0,:])\n", 149 | "img2_representation = (face_model.predict(preprocess_data('image_2.jpeg'))[0,:])\n", 150 | " \n", 151 | "cosine = getCosineSimilarity(img1_representation, img2_representation)" 152 | ] 153 | }, 154 | { 155 | "cell_type": "code", 156 | "execution_count": 38, 157 | "metadata": {}, 158 | "outputs": [ 159 | { 160 | "data": { 161 | "text/plain": [ 162 | "0.2491404414176941" 163 | ] 164 | }, 165 | "execution_count": 38, 166 | "metadata": {}, 167 | "output_type": "execute_result" 168 | } 169 | ], 170 | "source": [ 171 | "cosine" 172 | ] 173 | }, 174 | { 175 | "cell_type": "code", 176 | "execution_count": null, 177 | "metadata": {}, 178 | "outputs": [], 179 | "source": [] 180 | }, 181 | { 182 | "cell_type": "code", 183 | "execution_count": null, 184 | "metadata": {}, 185 | "outputs": [], 186 | "source": [] 187 | }, 188 | { 189 | "cell_type": "code", 190 | "execution_count": null, 191 | "metadata": {}, 192 | "outputs": [], 193 | "source": [ 194 | "# import all the necessary libraries\n", 195 | "import cv2\n", 196 | "import imutils\n", 197 | "import numpy as np\n", 198 | "from sklearn.metrics import pairwise\n", 199 | "\n", 200 | "\n", 201 | "# global variables\n", 202 | "bg = None" 203 | ] 204 | }, 205 | { 206 | "cell_type": "code", 207 | "execution_count": null, 208 | "metadata": {}, 209 | "outputs": [], 210 | "source": [ 211 | "#-------------------------------------------------------------------------------\n", 212 | "# Function - To find the running average over the background\n", 213 | "#-------------------------------------------------------------------------------\n", 214 | "def run_avg(image, accumWeight):\n", 215 | " global bg\n", 216 | " # initialize the background\n", 217 | " if bg is None:\n", 218 | " bg = image.copy().astype(\"float\")\n", 219 | " return\n", 220 | "\n", 221 | " # compute weighted average, accumulate it and update the background\n", 222 | " cv2.accumulateWeighted(image, bg, accumWeight)" 223 | ] 224 | }, 225 | { 226 | "cell_type": "code", 227 | "execution_count": null, 228 | "metadata": {}, 229 | "outputs": [], 230 | "source": [ 231 | "#------------------------------------------------------------------------------#\n", 232 | "#segment function starts, to segment the region of hand in the image\n", 233 | "#-------------------------------------------------------------------------------\n", 234 | "def segment(image, threshold=25):\n", 235 | " global bg\n", 236 | " # find the absolute difference between background and current frame\n", 237 | " diff = cv2.absdiff(bg.astype(\"uint8\"), image)\n", 238 | "\n", 239 | " # threshold the diff image so that we get the foreground\n", 240 | " thresholded = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)[1]\n", 241 | "\n", 242 | " # get the contours in the thresholded image\n", 243 | " (_, cnts, _) = cv2.findContours(thresholded.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n", 244 | "\n", 245 | " # return None, if no contours detected\n", 246 | " if len(cnts) == 0:\n", 247 | " return\n", 248 | " else:\n", 249 | " # based on contour area, get the maximum contour which is the hand\n", 250 | " segmented = max(cnts, key=cv2.contourArea)\n", 251 | " return (thresholded, segmented)" 252 | ] 253 | }, 254 | { 255 | "cell_type": "code", 256 | "execution_count": null, 
257 | "metadata": {}, 258 | "outputs": [], 259 | "source": [ 260 | "#------------------------------------------------------------------------------#\n", 261 | "#segment function ends----------------------------------------------------------------------#\n", 262 | "\n", 263 | "# Function - To count the number of fingers in the segmented hand region\n", 264 | "#-------------------------------------------------------------------------------\n", 265 | "from sklearn.metrics import pairwise\n", 266 | "def count(thresholded, segmented):\n", 267 | "\t# find the convex hull of the segmented hand region\n", 268 | "\tchull = cv2.convexHull(segmented)\n", 269 | "\n", 270 | "\t# find the most extreme points in the convex hull\n", 271 | "\textreme_top = tuple(chull[chull[:, :, 1].argmin()][0])\n", 272 | "\textreme_bottom = tuple(chull[chull[:, :, 1].argmax()][0])\n", 273 | "\textreme_left = tuple(chull[chull[:, :, 0].argmin()][0])\n", 274 | "\textreme_right = tuple(chull[chull[:, :, 0].argmax()][0])\n", 275 | "\n", 276 | "\t# find the center of the palm\n", 277 | "\tcX = int((extreme_left[0] + extreme_right[0]) / 2)\n", 278 | "\tcY = int((extreme_top[1] + extreme_bottom[1]) / 2)\n", 279 | "\n", 280 | "\t# find the maximum euclidean distance between the center of the palm\n", 281 | "\t# and the most extreme points of the convex hull\n", 282 | "\tdistance = pairwise.euclidean_distances([(cX, cY)], Y=[extreme_left, extreme_right, extreme_top, extreme_bottom])[0]\n", 283 | "\tmaximum_distance = distance[distance.argmax()]\n", 284 | "\t\n", 285 | "\t# calculate the radius of the circle with 80% of the max euclidean distance obtained\n", 286 | "\tradius = int(0.8 * maximum_distance)\n", 287 | "\t\n", 288 | "\t# find the circumference of the circle\n", 289 | "\tcircumference = (2 * np.pi * radius)\n", 290 | "\n", 291 | "\t# take out the circular region of interest which has \n", 292 | "\t# the palm and the fingers\n", 293 | "\tcircular_roi = np.zeros(thresholded.shape[:2], dtype=\"uint8\")\n", 294 | "\t\n", 295 | "\t# draw the circular ROI\n", 296 | "\tcv2.circle(circular_roi, (cX, cY), radius, 255, 1)\n", 297 | "\t\n", 298 | "\t# take bit-wise AND between thresholded hand using the circular ROI as the mask\n", 299 | "\t# which gives the cuts obtained using mask on the thresholded hand image\n", 300 | "\tcircular_roi = cv2.bitwise_and(thresholded, thresholded, mask=circular_roi)\n", 301 | "\n", 302 | "\t# compute the contours in the circular ROI\n", 303 | "\t(_, cnts, _) = cv2.findContours(circular_roi.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)\n", 304 | "\n", 305 | "\t# initalize the finger count\n", 306 | "\tcount = 0\n", 307 | "\n", 308 | "\t# loop through the contours found\n", 309 | "\tfor c in cnts:\n", 310 | "\t\t# compute the bounding box of the contour\n", 311 | "\t\t(x, y, w, h) = cv2.boundingRect(c)\n", 312 | "\n", 313 | "\t\t# increment the count of fingers only if -\n", 314 | "\t\t# 1. The contour region is not the wrist (bottom area)\n", 315 | "\t\t# 2. 
The number of points along the contour does not exceed\n", 316 | "\t\t# 20% of the circumference of the circular ROI\n", 317 | "\t\tif ((cY + (cY * 0.20)) > (y + h)) and ((circumference * 0.20) > c.shape[0]):\n", 318 | "\t\t\tcount += 1\n", 319 | "\n", 320 | "\treturn count" 321 | ] 322 | }, 323 | { 324 | "cell_type": "code", 325 | "execution_count": null, 326 | "metadata": {}, 327 | "outputs": [], 328 | "source": [ 329 | "#-------------------------------------------------------------------------------\n", 330 | "# Main function\n", 331 | "#-------------------------------------------------------------------------------\n", 332 | "if __name__ == \"__main__\":\n", 333 | "    # initialize accumulated weight\n", 334 | "    accumWeight = 0.5\n", 335 | "\n", 336 | "    # get the reference to the webcam\n", 337 | "    camera = cv2.VideoCapture(0)\n", 338 | "\n", 339 | "    # region of interest (ROI) coordinates\n", 340 | "    top, right, bottom, left = 20, 450, 325, 690\n", 341 | "\n", 342 | "    # initialize num of frames\n", 343 | "    num_frames = 0\n", 344 | "\n", 345 | "    # calibration indicator\n", 346 | "    calibrated = False\n", 347 | "\n", 348 | "    # keep looping, until interrupted\n", 349 | "    while(True):\n", 350 | "        # get the current frame\n", 351 | "        (grabbed, frame) = camera.read()\n", 352 | "\n", 353 | "        # resize the frame\n", 354 | "        frame = imutils.resize(frame, width=700)\n", 355 | "\n", 356 | "        # flip the frame so that it is not the mirror view\n", 357 | "        frame = cv2.flip(frame, 1)\n", 358 | "\n", 359 | "        # clone the frame\n", 360 | "        clone = frame.copy()\n", 361 | "\n", 362 | "        # get the height and width of the frame\n", 363 | "        (height, width) = frame.shape[:2]\n", 364 | "\n", 365 | "        # get the ROI\n", 366 | "        roi = frame[top:bottom, right:left]\n", 367 | "\n", 368 | "        # convert the roi to grayscale and blur it\n", 369 | "        gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)\n", 370 | "        gray = cv2.GaussianBlur(gray, (7, 7), 0)\n", 371 | "\n", 372 | "        # to get the background, keep looping till a threshold is reached\n", 373 | "        # so that our weighted average model gets calibrated\n", 374 | "        if num_frames < 30:\n", 375 | "            run_avg(gray, accumWeight)\n", 376 | "            if num_frames == 1:\n", 377 | "                print (\"Calibration is in progress...\")\n", 378 | "            elif num_frames == 29:\n", 379 | "                print (\"Calibration is successful...\")\n", 380 | "        else:\n", 381 | "            # segment the hand region\n", 382 | "            hand = segment(gray)\n", 383 | "\n", 384 | "            # check whether hand region is segmented\n", 385 | "            if hand is not None:\n", 386 | "                # if yes, unpack the thresholded image and\n", 387 | "                # segmented region\n", 388 | "                (thresholded, segmented) = hand\n", 389 | "\n", 390 | "                # draw the segmented region and display the frame\n", 391 | "                cv2.drawContours(clone, [segmented + (right, top)], -1, (0, 0, 255))\n", 392 | "                \n", 393 | "                # count the number of fingers\n", 394 | "                fingers = count(thresholded, segmented)\n", 395 | "\n", 396 | "                cv2.putText(clone, str(fingers), (70, 45), cv2.FONT_HERSHEY_SIMPLEX, 1, (0,0,255), 2)\n", 397 | "                \n", 398 | "                # show the thresholded image\n", 399 | "                cv2.imshow(\"Thresholded\", thresholded)\n", 400 | "\n", 401 | "        # draw the segmented hand\n", 402 | "        cv2.rectangle(clone, (left, top), (right, bottom), (0,255,0), 2)\n", 403 | "\n", 404 | "        # increment the number of frames\n", 405 | "        num_frames += 1\n", 406 | "\n", 407 | "        # display the frame with segmented hand\n", 408 | "        cv2.imshow(\"Video Feed\", clone)\n", 409 | "\n", 410 | "        # observe the keypress by the user\n", 411 | "        keypress = cv2.waitKey(1) 
& 0xFF\n", 412 | "\n", 413 | " # if the user pressed \"q\", then stop looping\n", 414 | " if keypress == ord(\"q\"):\n", 415 | " break" 416 | ] 417 | }, 418 | { 419 | "cell_type": "code", 420 | "execution_count": null, 421 | "metadata": {}, 422 | "outputs": [], 423 | "source": [] 424 | }, 425 | { 426 | "cell_type": "code", 427 | "execution_count": null, 428 | "metadata": {}, 429 | "outputs": [], 430 | "source": [] 431 | }, 432 | { 433 | "cell_type": "code", 434 | "execution_count": null, 435 | "metadata": {}, 436 | "outputs": [], 437 | "source": [] 438 | }, 439 | { 440 | "cell_type": "code", 441 | "execution_count": null, 442 | "metadata": {}, 443 | "outputs": [], 444 | "source": [] 445 | }, 446 | { 447 | "cell_type": "code", 448 | "execution_count": null, 449 | "metadata": {}, 450 | "outputs": [], 451 | "source": [] 452 | }, 453 | { 454 | "cell_type": "code", 455 | "execution_count": null, 456 | "metadata": {}, 457 | "outputs": [], 458 | "source": [] 459 | } 460 | ], 461 | "metadata": { 462 | "kernelspec": { 463 | "display_name": "Python 3", 464 | "language": "python", 465 | "name": "python3" 466 | }, 467 | "language_info": { 468 | "codemirror_mode": { 469 | "name": "ipython", 470 | "version": 3 471 | }, 472 | "file_extension": ".py", 473 | "mimetype": "text/x-python", 474 | "name": "python", 475 | "nbconvert_exporter": "python", 476 | "pygments_lexer": "ipython3", 477 | "version": "3.7.4" 478 | } 479 | }, 480 | "nbformat": 4, 481 | "nbformat_minor": 2 482 | } 483 | -------------------------------------------------------------------------------- /Chapter6/Important: -------------------------------------------------------------------------------- 1 | The 'facenet_weights.h5' can be downloaded from this Google Drive link 2 | https://drive.google.com/drive/folders/1yNNCw3n3DscEMc3yoALKDrolb6WEA9U0?usp=sharing 3 | -------------------------------------------------------------------------------- /Chapter7/Dataset and codes for Chapter7: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Chapter7/Important: -------------------------------------------------------------------------------- 1 | The dataset, compiled model, weights, images are uploaded to this Google Drive. Due to the size of the dataset, they cannot be uploaded to Github. 
2 | The link is https://drive.google.com/drive/u/6/folders/1QS9P5SjTqGofGp-ZCh69c8EQjCQ6VWme 3 | -------------------------------------------------------------------------------- /Chapter8/Chapter8: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Chapter8/Hoover.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Apress/computer-vision-using-deep-learning/5c08fdbc4ffd67297512cc4d929c111c885e0c75/Chapter8/Hoover.jpg -------------------------------------------------------------------------------- /Coin.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Apress/computer-vision-using-deep-learning/5c08fdbc4ffd67297512cc4d929c111c885e0c75/Coin.png -------------------------------------------------------------------------------- /Contributing.md: -------------------------------------------------------------------------------- 1 | # Contributing to Apress Source Code 2 | 3 | Copyright for Apress source code belongs to the author(s). However, under fair use you are encouraged to fork and contribute minor corrections and updates for the benefit of the author(s) and other readers. 4 | 5 | ## How to Contribute 6 | 7 | 1. Make sure you have a GitHub account. 8 | 2. Fork the repository for the relevant book. 9 | 3. Create a new branch on which to make your change, e.g. 10 | `git checkout -b my_code_contribution` 11 | 4. Commit your change. Include a commit message describing the correction. Please note that if your commit message is not clear, the correction will not be accepted. 12 | 5. Submit a pull request. 13 | 14 | Thank you for your contribution! -------------------------------------------------------------------------------- /LICENSE.txt: -------------------------------------------------------------------------------- 1 | Freeware License, some rights reserved 2 | 3 | Copyright (c) 2021 Vaibhav Verdhan 4 | 5 | Permission is hereby granted, free of charge, to anyone obtaining a copy 6 | of this software and associated documentation files (the "Software"), 7 | to work with the Software within the limits of freeware distribution and fair use. 8 | This includes the rights to use, copy, and modify the Software for personal use. 9 | Users are also allowed and encouraged to submit corrections and modifications 10 | to the Software for the benefit of other users. 11 | 12 | It is not allowed to reuse, modify, or redistribute the Software for 13 | commercial use in any way, or for a user’s educational materials such as books 14 | or blog articles without prior permission from the copyright holder. 15 | 16 | The above copyright notice and this permission notice need to be included 17 | in all copies or substantial portions of the software. 18 | 19 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 20 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 22 | AUTHORS OR COPYRIGHT HOLDERS OR APRESS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 24 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 25 | SOFTWARE. 
26 | 27 | 28 | -------------------------------------------------------------------------------- /Mario.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Apress/computer-vision-using-deep-learning/5c08fdbc4ffd67297512cc4d929c111c885e0c75/Mario.png -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Apress Source Code 2 | 3 | This repository accompanies [*Computer Vision Using Deep Learning: Neural Network Architectures with Python, Keras, and TensorFlow*](https://www.apress.com/9781484266151) by Vaibhav Verdhan (Apress, 2021). 4 | 5 | [comment]: #cover 6 | ![Cover image](9781484266151.jpg) 7 | 8 | Download the files as a zip using the green button, or clone the repository to your machine using Git. 9 | 10 | ## Releases 11 | 12 | Release v1.0 corresponds to the code in the published book, without corrections or updates. 13 | 14 | ## Contributions 15 | 16 | See the file Contributing.md for more information on how you can contribute to this repository. --------------------------------------------------------------------------------
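A note on the two thresholds set in Chapter5/YOLO.ipynb (confidence_threshold = 0.5 and nms_threshold = 0.005): the notebook delegates box pruning to cv2.dnn.NMSBoxes, and the sketch below is a simplified, illustrative re-implementation of that greedy non-maximum suppression written for intuition only, not the OpenCV source. The [x, y, width, height] box layout and the two default threshold values are carried over from the notebook; everything else here is an assumption of this sketch.

import numpy as np

def iou(box_a, box_b):
    # Boxes are [x, y, width, height]; IoU = intersection area / union area.
    xa, ya = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xb = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    yb = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0, xb - xa) * max(0, yb - ya)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, score_threshold=0.5, nms_threshold=0.005):
    # Greedy NMS: keep the highest-scoring box, discard every remaining box
    # whose IoU with it exceeds nms_threshold, then repeat. With a threshold
    # as small as 0.005, almost any overlap suppresses a box, so at most one
    # box survives per cluster of overlapping detections.
    order = [i for i in np.argsort(scores)[::-1] if scores[i] >= score_threshold]
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) <= nms_threshold]
    return keep

For example, nms([[0, 0, 100, 100], [10, 10, 100, 100]], [0.9, 0.8]) keeps only the first box, which is the aggressive pruning behaviour the notebook's very tight nms_threshold produces.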
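The FaceNet comparison in Chapter6/Chapter6.ipynb reports a normalised Euclidean distance of about 0.706 and a cosine distance of about 0.249 for image_1.jpeg and image_2.jpeg, but it never turns those numbers into a same-person/different-person decision. The helper below sketches that final step. The two threshold values are illustrative assumptions, not values taken from the book, and should be tuned on labelled pairs of faces.

import numpy as np

def normalize(x):
    # L2-normalise an embedding, as the notebook does before the Euclidean test.
    return x / np.sqrt(np.sum(np.multiply(x, x)))

def euclidean_distance(source, validate):
    # Same computation as the notebook's getEuclideanDistance.
    diff = source - validate
    return np.sqrt(np.sum(np.multiply(diff, diff)))

def cosine_distance(source, validate):
    # Same computation as the notebook's getCosineSimilarity; despite that
    # name it returns 1 - cosine similarity, i.e. a distance, not a similarity.
    a = np.matmul(np.transpose(source), validate)
    b = np.sum(np.multiply(source, source))
    c = np.sum(np.multiply(validate, validate))
    return 1 - (a / (np.sqrt(b) * np.sqrt(c)))

def is_same_person(emb1, emb2, euclidean_threshold=0.80, cosine_threshold=0.40):
    # Hypothetical decision rule: accept only if both distances are small.
    # Both threshold values here are assumptions for illustration.
    same_by_euclidean = euclidean_distance(normalize(emb1), normalize(emb2)) < euclidean_threshold
    same_by_cosine = cosine_distance(emb1, emb2) < cosine_threshold
    return same_by_euclidean and same_by_cosine

Under these assumed thresholds, the pair reported in the notebook (0.706 Euclidean, 0.249 cosine) would be accepted as the same person.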
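In the gesture-recognition cells of Chapter6/Chapter6.ipynb, run_avg calibrates the background with cv2.accumulateWeighted(image, bg, accumWeight). Written out in plain NumPy, that call is an exponentially weighted running average of the incoming frames; the sketch below is for intuition only, and the notebook should keep the OpenCV call.

import numpy as np

def accumulate_weighted(frame, background, accum_weight=0.5):
    # background <- (1 - w) * background + w * frame, with w = accum_weight.
    # A larger w adapts the background faster but also absorbs a stationary
    # hand sooner; the notebook uses w = 0.5 over its first 30 frames.
    return (1.0 - accum_weight) * background + accum_weight * frame.astype("float")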