├── Breast cancer Detection ├── Breast Cancer.ipynb ├── Readme.md └── Screenshot.png ├── Car Brand Classifier And Deployment ├── Procfile ├── README.md ├── Transfer Learning Resnet 50.ipynb ├── app.py ├── model-Resnet-50-h5 (Download Link).txt ├── requirements.txt ├── static │ ├── css │ │ └── main.css │ └── js │ │ └── main.js └── templates │ ├── base.html │ └── index.html ├── Cat Dog Images Classifier (CNN + Keras) ├── Cat_Dog_Classifier_Using_CNN.ipynb ├── Cat_Dog_Classifier_Using_CNN.py ├── DataSet │ └── How to Download Dataset....txt ├── Image.gif └── Readme.md ├── Covid19 FaceMask Detector (CNN & OpenCV) ├── Readme.md ├── face_mask_detection.ipynb ├── haarcascade_frontalface_default.xml ├── man-mask-protective.jpg ├── mask.py ├── video.mp4 ├── video1.mp4 ├── video2.mp4 └── women with mask.jpg ├── Digit Recognizer Kaggle ├── My_Submission.csv ├── Readme.md └── digit-recognizer-detailed-step-wise.ipynb ├── Image Classifier Using Resnet50 ├── README.md ├── Screenshot1.PNG ├── Screenshot2.PNG ├── image-classifier-using-resnet50.ipynb └── images │ ├── Image1.jpg │ ├── Image3.jpg │ ├── Scooter.jpg │ ├── banana.jpg │ ├── car.jpg │ ├── image10.jpg │ ├── image11.jpg │ ├── image2.jpg │ ├── image4.jpg │ ├── image6.jpg │ ├── image8.jpg │ └── image9.jpg ├── Keras Introduction Exploration ├── A Gentle Introduction to Keras.ipynb ├── BBC.csv ├── BBCN.csv ├── Readme.md └── Screenshot.jpeg └── Readme.md /Breast cancer Detection/Readme.md: -------------------------------------------------------------------------------- 1 | # Breast Cancer Prediction 2 | > Predicts whether the type of breast cancer is Malignant or Benign 3 | 4 | ## Focus of the Project 5 | #### > To predict whether a breast cancer is Malignant or Benign using an Image Dataset as well as Numerical Data 6 | #### > To apply ML and DL models to predict the severity of the breast cancer 7 | 8 | 9 | 10 | ## About Project: 11 | 12 | Breast cancer is the most common type of cancer in women. When cancers are found early, they can often be cured. 13 | There are devices that detect breast cancer, but they often produce false positives, which results 14 | in patients undergoing painful, expensive surgeries that were not even necessary. Such cancers are called 15 | **benign**; they do not require surgery, and we can reduce these unnecessary surgeries by using Machine Learning. 16 | I have taken a dataset of previous breast cancer patients and trained a model to predict whether the cancer is **benign** or **malignant**. These predictions will help doctors perform surgery only when the cancer is malignant, thus reducing unnecessary surgeries for women. 17 | 18 | For building the project I have used the Wisconsin Breast Cancer dataset, which has 569 rows, of which 357 are benign and 212 are malignant. 19 | The data is preprocessed and scaled. Among the models I trained, a Random Forest classifier gives the best accuracy, 95.0%. 20 | 21 | ## Languages & Libraries Used: 22 | 23 | * Python: programming language 24 | * NumPy: library for numerical calculations 25 | * Pandas: library for data manipulation and analysis 26 | * SkLearn: library featuring various classification, regression and clustering algorithms 27 | 28 | Thank You! 29 | 30 | #### Feel Free to contact me at➛ amark720@gmail.com for any help related to this Project!
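A minimal sketch of this workflow, using scikit-learn's built-in copy of the Wisconsin dataset (the project's actual notebook may load, split and tune things differently):

```python
# Sketch only: scale the Wisconsin data and fit a Random Forest classifier.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# 569 rows: 357 benign, 212 malignant
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Scale features before training, as described above
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Random Forest gave the best accuracy in this project (~95%)
clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```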
31 | -------------------------------------------------------------------------------- /Breast cancer Detection/Screenshot.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Breast cancer Detection/Screenshot.png -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/Procfile: -------------------------------------------------------------------------------- 1 | web: gunicorn app:main -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/README.md: -------------------------------------------------------------------------------- 1 | # (Deep Learning) Car-Brand-Classifier 2 | The "Car Brand Classifier App" classifies car images by brand using transfer learning with ResNet-50. The model recognizes 3 different brands (i.e. Audi/Lamborghini/Mercedes). For the training set we used 80 images, and for the validation set 52 images. 3 | 4 |
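The repo's app.py is not reproduced in this dump, so as a rough sketch only, a minimal Flask app consistent with the Procfile's `app:main` entry point might look like the following (every name below is an assumption, not the project's actual code):

```python
# Hypothetical app.py: gunicorn imports the WSGI callable "main" from module "app",
# matching the Procfile line "web: gunicorn app:main".
from flask import Flask, render_template

main = Flask(__name__)  # callable must be named "main" for "app:main" to resolve

@main.route('/')
def index():
    # Serves the landing page from the templates/ folder listed in the tree above
    return render_template('index.html')

if __name__ == '__main__':
    main.run(host='0.0.0.0', port=8080)  # the live link below uses port 8080
```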
5 | 6 | ### ScreenRecording Clip of Live App. 7 | [![Demo Doccou alpha](https://github.com/amark720/Amar-kumar/blob/master/ScreenShots/Car%20Brand%20Classifier%20GIF.gif)](http://ec2-18-220-203-245.us-east-2.compute.amazonaws.com:8080) 8 | 9 | ### Web App and Deployment 10 | 11 | This project uses Flask for the web app, and deployment is done on AWS EC2. 12 | 13 | ## Live Project Link: http://ec2-18-220-203-245.us-east-2.compute.amazonaws.com:8080 14 | 15 | 16 | 17 | ### ScreenShots: 18 | 19 | #### Landing Page- 20 | 21 | 22 | 23 | #### Result- 24 | 25 | 26 | 27 | #### Improvements 28 | * Here we've used a very small number of images to train the model, so it can be improved further by adding more images to the training set. 29 | * We can add more image classes to support predicting cars of more brands. 30 | * Adding more layers and training epochs to the neural network will further improve the accuracy. 31 | 32 | 33 | #### Feel Free to contact me at➛ amark720@gmail.com for any help related to this Project! 34 | 40 | -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/Transfer Learning Resnet 50.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Transfer Learning with ResNet-50 using Keras" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "Please download the dataset from the URL below" 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": 67, 20 | "metadata": {}, 21 | "outputs": [], 22 | "source": [ 23 | "# import the libraries as shown below\n", 24 | "\n", 25 | "from tensorflow.keras.layers import Input, Lambda, Dense, Flatten\n", 26 | "from tensorflow.keras.models import Model\n", 27 | "from tensorflow.keras.applications.resnet50 import ResNet50\n", 28 | "#from keras.applications.vgg16 import VGG16\n", 29 | "from tensorflow.keras.applications.resnet50 import preprocess_input\n", 30 | "from tensorflow.keras.preprocessing import image\n", 31 | "from tensorflow.keras.preprocessing.image import ImageDataGenerator,load_img\n", 32 | "from tensorflow.keras.models import Sequential\n", 33 | "import numpy as np\n", 34 | "from glob import glob\n", 35 | "import matplotlib.pyplot as plt" 36 | ] 37 | }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": 68, 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "# re-size all the images to this\n", 45 | "IMAGE_SIZE = [224, 224]\n", 46 | "\n", 47 | "train_path = 'Datasets/train'\n", 48 | "valid_path = 'Datasets/test'\n" 49 | ] 50 | }, 51 | { 52 | "cell_type": "code", 53 | "execution_count": 69, 54 | "metadata": {}, 55 | "outputs": [], 56 | "source": [ 57 | "# Import the ResNet-50 model as shown below and add a preprocessing layer to the front of ResNet\n", 58 | "# Here we will be using imagenet weights\n", 59 | "\n", 60 | "resnet = ResNet50(input_shape=IMAGE_SIZE + [3], weights='imagenet', include_top=False)\n", 61 | "\n", 62 | "\n" 63 | ] 64 | }, 65 | { 66 | "cell_type": "code", 67 | "execution_count": 70, 68 | "metadata": {}, 69 | "outputs": [], 70 | "source": [ 71 | "# don't train existing weights\n", 72 | "for layer in resnet.layers:\n", 73 | " layer.trainable = False" 74 | ] 75 | }, 76 | { 77 | "cell_type": "code", 78 | "execution_count": 71, 79 | "metadata": {}, 80 | "outputs": [ 81 | { 82 | "data": { 83 | "text/plain": [ 84 | "['Datasets/train\\\\audi',\n", 85 | 
'Datasets/train\\\\lamborghini',\n", 86 | " 'Datasets/train\\\\mercedes']" 87 | ] 88 | }, 89 | "execution_count": 71, 90 | "metadata": {}, 91 | "output_type": "execute_result" 92 | } 93 | ], 94 | "source": [ 95 | " # useful for getting number of output classes\n", 96 | "folders = glob('Datasets/train/*')\n", 97 | "folders" 98 | ] 99 | }, 100 | { 101 | "cell_type": "code", 102 | "execution_count": 72, 103 | "metadata": {}, 104 | "outputs": [], 105 | "source": [ 106 | "# our layers - you can add more if you want\n", 107 | "x = Flatten()(resnet.output)" 108 | ] 109 | }, 110 | { 111 | "cell_type": "code", 112 | "execution_count": 73, 113 | "metadata": {}, 114 | "outputs": [], 115 | "source": [ 116 | "prediction = Dense(len(folders), activation='softmax')(x)\n", 117 | "\n", 118 | "# create a model object\n", 119 | "model = Model(inputs=resnet.input, outputs=prediction)" 120 | ] 121 | }, 122 | { 123 | "cell_type": "code", 124 | "execution_count": 74, 125 | "metadata": {}, 126 | "outputs": [ 127 | { 128 | "name": "stdout", 129 | "output_type": "stream", 130 | "text": [ 131 | "Model: \"model_2\"\n", 132 | "__________________________________________________________________________________________________\n", 133 | "Layer (type) Output Shape Param # Connected to \n", 134 | "==================================================================================================\n", 135 | "input_3 (InputLayer) [(None, 224, 224, 3) 0 \n", 136 | "__________________________________________________________________________________________________\n", 137 | "conv1_pad (ZeroPadding2D) (None, 230, 230, 3) 0 input_3[0][0] \n", 138 | "__________________________________________________________________________________________________\n", 139 | "conv1_conv (Conv2D) (None, 112, 112, 64) 9472 conv1_pad[0][0] \n", 140 | "__________________________________________________________________________________________________\n", 141 | "conv1_bn (BatchNormalization) (None, 112, 112, 64) 256 conv1_conv[0][0] \n", 142 | "__________________________________________________________________________________________________\n", 143 | "conv1_relu (Activation) (None, 112, 112, 64) 0 conv1_bn[0][0] \n", 144 | "__________________________________________________________________________________________________\n", 145 | "pool1_pad (ZeroPadding2D) (None, 114, 114, 64) 0 conv1_relu[0][0] \n", 146 | "__________________________________________________________________________________________________\n", 147 | "pool1_pool (MaxPooling2D) (None, 56, 56, 64) 0 pool1_pad[0][0] \n", 148 | "__________________________________________________________________________________________________\n", 149 | "conv2_block1_1_conv (Conv2D) (None, 56, 56, 64) 4160 pool1_pool[0][0] \n", 150 | "__________________________________________________________________________________________________\n", 151 | "conv2_block1_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block1_1_conv[0][0] \n", 152 | "__________________________________________________________________________________________________\n", 153 | "conv2_block1_1_relu (Activation (None, 56, 56, 64) 0 conv2_block1_1_bn[0][0] \n", 154 | "__________________________________________________________________________________________________\n", 155 | "conv2_block1_2_conv (Conv2D) (None, 56, 56, 64) 36928 conv2_block1_1_relu[0][0] \n", 156 | "__________________________________________________________________________________________________\n", 157 | "conv2_block1_2_bn (BatchNormali (None, 56, 56, 64) 256 
conv2_block1_2_conv[0][0] \n", 158 | "__________________________________________________________________________________________________\n", 159 | "conv2_block1_2_relu (Activation (None, 56, 56, 64) 0 conv2_block1_2_bn[0][0] \n", 160 | "__________________________________________________________________________________________________\n", 161 | "conv2_block1_0_conv (Conv2D) (None, 56, 56, 256) 16640 pool1_pool[0][0] \n", 162 | "__________________________________________________________________________________________________\n", 163 | "conv2_block1_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block1_2_relu[0][0] \n", 164 | "__________________________________________________________________________________________________\n", 165 | "conv2_block1_0_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block1_0_conv[0][0] \n", 166 | "__________________________________________________________________________________________________\n", 167 | "conv2_block1_3_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block1_3_conv[0][0] \n", 168 | "__________________________________________________________________________________________________\n", 169 | "conv2_block1_add (Add) (None, 56, 56, 256) 0 conv2_block1_0_bn[0][0] \n", 170 | " conv2_block1_3_bn[0][0] \n", 171 | "__________________________________________________________________________________________________\n", 172 | "conv2_block1_out (Activation) (None, 56, 56, 256) 0 conv2_block1_add[0][0] \n", 173 | "__________________________________________________________________________________________________\n", 174 | "conv2_block2_1_conv (Conv2D) (None, 56, 56, 64) 16448 conv2_block1_out[0][0] \n", 175 | "__________________________________________________________________________________________________\n", 176 | "conv2_block2_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block2_1_conv[0][0] \n", 177 | "__________________________________________________________________________________________________\n", 178 | "conv2_block2_1_relu (Activation (None, 56, 56, 64) 0 conv2_block2_1_bn[0][0] \n", 179 | "__________________________________________________________________________________________________\n", 180 | "conv2_block2_2_conv (Conv2D) (None, 56, 56, 64) 36928 conv2_block2_1_relu[0][0] \n", 181 | "__________________________________________________________________________________________________\n", 182 | "conv2_block2_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block2_2_conv[0][0] \n", 183 | "__________________________________________________________________________________________________\n", 184 | "conv2_block2_2_relu (Activation (None, 56, 56, 64) 0 conv2_block2_2_bn[0][0] \n", 185 | "__________________________________________________________________________________________________\n", 186 | "conv2_block2_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block2_2_relu[0][0] \n", 187 | "__________________________________________________________________________________________________\n", 188 | "conv2_block2_3_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block2_3_conv[0][0] \n", 189 | "__________________________________________________________________________________________________\n", 190 | "conv2_block2_add (Add) (None, 56, 56, 256) 0 conv2_block1_out[0][0] \n", 191 | " conv2_block2_3_bn[0][0] \n", 192 | "__________________________________________________________________________________________________\n", 193 | "conv2_block2_out (Activation) (None, 56, 56, 256) 0 conv2_block2_add[0][0] \n", 194 | 
"__________________________________________________________________________________________________\n", 195 | "conv2_block3_1_conv (Conv2D) (None, 56, 56, 64) 16448 conv2_block2_out[0][0] \n", 196 | "__________________________________________________________________________________________________\n", 197 | "conv2_block3_1_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block3_1_conv[0][0] \n", 198 | "__________________________________________________________________________________________________\n", 199 | "conv2_block3_1_relu (Activation (None, 56, 56, 64) 0 conv2_block3_1_bn[0][0] \n", 200 | "__________________________________________________________________________________________________\n", 201 | "conv2_block3_2_conv (Conv2D) (None, 56, 56, 64) 36928 conv2_block3_1_relu[0][0] \n", 202 | "__________________________________________________________________________________________________\n", 203 | "conv2_block3_2_bn (BatchNormali (None, 56, 56, 64) 256 conv2_block3_2_conv[0][0] \n", 204 | "__________________________________________________________________________________________________\n", 205 | "conv2_block3_2_relu (Activation (None, 56, 56, 64) 0 conv2_block3_2_bn[0][0] \n", 206 | "__________________________________________________________________________________________________\n", 207 | "conv2_block3_3_conv (Conv2D) (None, 56, 56, 256) 16640 conv2_block3_2_relu[0][0] \n", 208 | "__________________________________________________________________________________________________\n", 209 | "conv2_block3_3_bn (BatchNormali (None, 56, 56, 256) 1024 conv2_block3_3_conv[0][0] \n", 210 | "__________________________________________________________________________________________________\n", 211 | "conv2_block3_add (Add) (None, 56, 56, 256) 0 conv2_block2_out[0][0] \n", 212 | " conv2_block3_3_bn[0][0] \n", 213 | "__________________________________________________________________________________________________\n", 214 | "conv2_block3_out (Activation) (None, 56, 56, 256) 0 conv2_block3_add[0][0] \n", 215 | "__________________________________________________________________________________________________\n", 216 | "conv3_block1_1_conv (Conv2D) (None, 28, 28, 128) 32896 conv2_block3_out[0][0] \n", 217 | "__________________________________________________________________________________________________\n", 218 | "conv3_block1_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block1_1_conv[0][0] \n", 219 | "__________________________________________________________________________________________________\n", 220 | "conv3_block1_1_relu (Activation (None, 28, 28, 128) 0 conv3_block1_1_bn[0][0] \n", 221 | "__________________________________________________________________________________________________\n", 222 | "conv3_block1_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block1_1_relu[0][0] \n", 223 | "__________________________________________________________________________________________________\n", 224 | "conv3_block1_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block1_2_conv[0][0] \n", 225 | "__________________________________________________________________________________________________\n", 226 | "conv3_block1_2_relu (Activation (None, 28, 28, 128) 0 conv3_block1_2_bn[0][0] \n", 227 | "__________________________________________________________________________________________________\n", 228 | "conv3_block1_0_conv (Conv2D) (None, 28, 28, 512) 131584 conv2_block3_out[0][0] \n", 229 | "__________________________________________________________________________________________________\n", 
230 | "conv3_block1_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block1_2_relu[0][0] \n", 231 | "__________________________________________________________________________________________________\n", 232 | "conv3_block1_0_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block1_0_conv[0][0] \n", 233 | "__________________________________________________________________________________________________\n", 234 | "conv3_block1_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block1_3_conv[0][0] \n", 235 | "__________________________________________________________________________________________________\n", 236 | "conv3_block1_add (Add) (None, 28, 28, 512) 0 conv3_block1_0_bn[0][0] \n", 237 | " conv3_block1_3_bn[0][0] \n", 238 | "__________________________________________________________________________________________________\n", 239 | "conv3_block1_out (Activation) (None, 28, 28, 512) 0 conv3_block1_add[0][0] \n", 240 | "__________________________________________________________________________________________________\n", 241 | "conv3_block2_1_conv (Conv2D) (None, 28, 28, 128) 65664 conv3_block1_out[0][0] \n", 242 | "__________________________________________________________________________________________________\n", 243 | "conv3_block2_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block2_1_conv[0][0] \n", 244 | "__________________________________________________________________________________________________\n", 245 | "conv3_block2_1_relu (Activation (None, 28, 28, 128) 0 conv3_block2_1_bn[0][0] \n", 246 | "__________________________________________________________________________________________________\n", 247 | "conv3_block2_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block2_1_relu[0][0] \n", 248 | "__________________________________________________________________________________________________\n", 249 | "conv3_block2_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block2_2_conv[0][0] \n", 250 | "__________________________________________________________________________________________________\n", 251 | "conv3_block2_2_relu (Activation (None, 28, 28, 128) 0 conv3_block2_2_bn[0][0] \n", 252 | "__________________________________________________________________________________________________\n", 253 | "conv3_block2_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block2_2_relu[0][0] \n", 254 | "__________________________________________________________________________________________________\n", 255 | "conv3_block2_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block2_3_conv[0][0] \n", 256 | "__________________________________________________________________________________________________\n", 257 | "conv3_block2_add (Add) (None, 28, 28, 512) 0 conv3_block1_out[0][0] \n", 258 | " conv3_block2_3_bn[0][0] \n", 259 | "__________________________________________________________________________________________________\n", 260 | "conv3_block2_out (Activation) (None, 28, 28, 512) 0 conv3_block2_add[0][0] \n", 261 | "__________________________________________________________________________________________________\n", 262 | "conv3_block3_1_conv (Conv2D) (None, 28, 28, 128) 65664 conv3_block2_out[0][0] \n", 263 | "__________________________________________________________________________________________________\n", 264 | "conv3_block3_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block3_1_conv[0][0] \n", 265 | "__________________________________________________________________________________________________\n", 266 | "conv3_block3_1_relu (Activation (None, 28, 28, 128) 0 
conv3_block3_1_bn[0][0] \n", 267 | "__________________________________________________________________________________________________\n", 268 | "conv3_block3_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block3_1_relu[0][0] \n", 269 | "__________________________________________________________________________________________________\n", 270 | "conv3_block3_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block3_2_conv[0][0] \n", 271 | "__________________________________________________________________________________________________\n", 272 | "conv3_block3_2_relu (Activation (None, 28, 28, 128) 0 conv3_block3_2_bn[0][0] \n", 273 | "__________________________________________________________________________________________________\n", 274 | "conv3_block3_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block3_2_relu[0][0] \n", 275 | "__________________________________________________________________________________________________\n", 276 | "conv3_block3_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block3_3_conv[0][0] \n", 277 | "__________________________________________________________________________________________________\n", 278 | "conv3_block3_add (Add) (None, 28, 28, 512) 0 conv3_block2_out[0][0] \n", 279 | " conv3_block3_3_bn[0][0] \n", 280 | "__________________________________________________________________________________________________\n", 281 | "conv3_block3_out (Activation) (None, 28, 28, 512) 0 conv3_block3_add[0][0] \n", 282 | "__________________________________________________________________________________________________\n", 283 | "conv3_block4_1_conv (Conv2D) (None, 28, 28, 128) 65664 conv3_block3_out[0][0] \n", 284 | "__________________________________________________________________________________________________\n", 285 | "conv3_block4_1_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block4_1_conv[0][0] \n", 286 | "__________________________________________________________________________________________________\n", 287 | "conv3_block4_1_relu (Activation (None, 28, 28, 128) 0 conv3_block4_1_bn[0][0] \n", 288 | "__________________________________________________________________________________________________\n", 289 | "conv3_block4_2_conv (Conv2D) (None, 28, 28, 128) 147584 conv3_block4_1_relu[0][0] \n", 290 | "__________________________________________________________________________________________________\n", 291 | "conv3_block4_2_bn (BatchNormali (None, 28, 28, 128) 512 conv3_block4_2_conv[0][0] \n", 292 | "__________________________________________________________________________________________________\n", 293 | "conv3_block4_2_relu (Activation (None, 28, 28, 128) 0 conv3_block4_2_bn[0][0] \n", 294 | "__________________________________________________________________________________________________\n", 295 | "conv3_block4_3_conv (Conv2D) (None, 28, 28, 512) 66048 conv3_block4_2_relu[0][0] \n", 296 | "__________________________________________________________________________________________________\n", 297 | "conv3_block4_3_bn (BatchNormali (None, 28, 28, 512) 2048 conv3_block4_3_conv[0][0] \n", 298 | "__________________________________________________________________________________________________\n", 299 | "conv3_block4_add (Add) (None, 28, 28, 512) 0 conv3_block3_out[0][0] \n", 300 | " conv3_block4_3_bn[0][0] \n", 301 | "__________________________________________________________________________________________________\n", 302 | "conv3_block4_out (Activation) (None, 28, 28, 512) 0 conv3_block4_add[0][0] \n", 303 | 
"__________________________________________________________________________________________________\n", 304 | "conv4_block1_1_conv (Conv2D) (None, 14, 14, 256) 131328 conv3_block4_out[0][0] \n", 305 | "__________________________________________________________________________________________________\n", 306 | "conv4_block1_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block1_1_conv[0][0] \n", 307 | "__________________________________________________________________________________________________\n", 308 | "conv4_block1_1_relu (Activation (None, 14, 14, 256) 0 conv4_block1_1_bn[0][0] \n", 309 | "__________________________________________________________________________________________________\n", 310 | "conv4_block1_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block1_1_relu[0][0] \n", 311 | "__________________________________________________________________________________________________\n", 312 | "conv4_block1_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block1_2_conv[0][0] \n", 313 | "__________________________________________________________________________________________________\n", 314 | "conv4_block1_2_relu (Activation (None, 14, 14, 256) 0 conv4_block1_2_bn[0][0] \n", 315 | "__________________________________________________________________________________________________\n", 316 | "conv4_block1_0_conv (Conv2D) (None, 14, 14, 1024) 525312 conv3_block4_out[0][0] \n", 317 | "__________________________________________________________________________________________________\n", 318 | "conv4_block1_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block1_2_relu[0][0] \n", 319 | "__________________________________________________________________________________________________\n", 320 | "conv4_block1_0_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block1_0_conv[0][0] \n", 321 | "__________________________________________________________________________________________________\n", 322 | "conv4_block1_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block1_3_conv[0][0] \n", 323 | "__________________________________________________________________________________________________\n", 324 | "conv4_block1_add (Add) (None, 14, 14, 1024) 0 conv4_block1_0_bn[0][0] \n", 325 | " conv4_block1_3_bn[0][0] \n", 326 | "__________________________________________________________________________________________________\n", 327 | "conv4_block1_out (Activation) (None, 14, 14, 1024) 0 conv4_block1_add[0][0] \n", 328 | "__________________________________________________________________________________________________\n", 329 | "conv4_block2_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block1_out[0][0] \n", 330 | "__________________________________________________________________________________________________\n", 331 | "conv4_block2_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block2_1_conv[0][0] \n", 332 | "__________________________________________________________________________________________________\n", 333 | "conv4_block2_1_relu (Activation (None, 14, 14, 256) 0 conv4_block2_1_bn[0][0] \n", 334 | "__________________________________________________________________________________________________\n", 335 | "conv4_block2_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block2_1_relu[0][0] \n", 336 | "__________________________________________________________________________________________________\n", 337 | "conv4_block2_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block2_2_conv[0][0] \n", 338 | 
"__________________________________________________________________________________________________\n", 339 | "conv4_block2_2_relu (Activation (None, 14, 14, 256) 0 conv4_block2_2_bn[0][0] \n", 340 | "__________________________________________________________________________________________________\n", 341 | "conv4_block2_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block2_2_relu[0][0] \n", 342 | "__________________________________________________________________________________________________\n", 343 | "conv4_block2_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block2_3_conv[0][0] \n", 344 | "__________________________________________________________________________________________________\n", 345 | "conv4_block2_add (Add) (None, 14, 14, 1024) 0 conv4_block1_out[0][0] \n", 346 | " conv4_block2_3_bn[0][0] \n", 347 | "__________________________________________________________________________________________________\n", 348 | "conv4_block2_out (Activation) (None, 14, 14, 1024) 0 conv4_block2_add[0][0] \n", 349 | "__________________________________________________________________________________________________\n", 350 | "conv4_block3_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block2_out[0][0] \n", 351 | "__________________________________________________________________________________________________\n", 352 | "conv4_block3_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block3_1_conv[0][0] \n", 353 | "__________________________________________________________________________________________________\n", 354 | "conv4_block3_1_relu (Activation (None, 14, 14, 256) 0 conv4_block3_1_bn[0][0] \n", 355 | "__________________________________________________________________________________________________\n", 356 | "conv4_block3_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block3_1_relu[0][0] \n", 357 | "__________________________________________________________________________________________________\n", 358 | "conv4_block3_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block3_2_conv[0][0] \n", 359 | "__________________________________________________________________________________________________\n", 360 | "conv4_block3_2_relu (Activation (None, 14, 14, 256) 0 conv4_block3_2_bn[0][0] \n", 361 | "__________________________________________________________________________________________________\n", 362 | "conv4_block3_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block3_2_relu[0][0] \n", 363 | "__________________________________________________________________________________________________\n", 364 | "conv4_block3_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block3_3_conv[0][0] \n", 365 | "__________________________________________________________________________________________________\n", 366 | "conv4_block3_add (Add) (None, 14, 14, 1024) 0 conv4_block2_out[0][0] \n", 367 | " conv4_block3_3_bn[0][0] \n", 368 | "__________________________________________________________________________________________________\n", 369 | "conv4_block3_out (Activation) (None, 14, 14, 1024) 0 conv4_block3_add[0][0] \n", 370 | "__________________________________________________________________________________________________\n", 371 | "conv4_block4_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block3_out[0][0] \n", 372 | "__________________________________________________________________________________________________\n", 373 | "conv4_block4_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block4_1_conv[0][0] \n", 374 | 
"__________________________________________________________________________________________________\n", 375 | "conv4_block4_1_relu (Activation (None, 14, 14, 256) 0 conv4_block4_1_bn[0][0] \n", 376 | "__________________________________________________________________________________________________\n", 377 | "conv4_block4_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block4_1_relu[0][0] \n", 378 | "__________________________________________________________________________________________________\n", 379 | "conv4_block4_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block4_2_conv[0][0] \n", 380 | "__________________________________________________________________________________________________\n", 381 | "conv4_block4_2_relu (Activation (None, 14, 14, 256) 0 conv4_block4_2_bn[0][0] \n", 382 | "__________________________________________________________________________________________________\n", 383 | "conv4_block4_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block4_2_relu[0][0] \n", 384 | "__________________________________________________________________________________________________\n", 385 | "conv4_block4_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block4_3_conv[0][0] \n", 386 | "__________________________________________________________________________________________________\n", 387 | "conv4_block4_add (Add) (None, 14, 14, 1024) 0 conv4_block3_out[0][0] \n", 388 | " conv4_block4_3_bn[0][0] \n", 389 | "__________________________________________________________________________________________________\n", 390 | "conv4_block4_out (Activation) (None, 14, 14, 1024) 0 conv4_block4_add[0][0] \n", 391 | "__________________________________________________________________________________________________\n", 392 | "conv4_block5_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block4_out[0][0] \n", 393 | "__________________________________________________________________________________________________\n", 394 | "conv4_block5_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block5_1_conv[0][0] \n", 395 | "__________________________________________________________________________________________________\n", 396 | "conv4_block5_1_relu (Activation (None, 14, 14, 256) 0 conv4_block5_1_bn[0][0] \n", 397 | "__________________________________________________________________________________________________\n", 398 | "conv4_block5_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block5_1_relu[0][0] \n", 399 | "__________________________________________________________________________________________________\n", 400 | "conv4_block5_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block5_2_conv[0][0] \n", 401 | "__________________________________________________________________________________________________\n", 402 | "conv4_block5_2_relu (Activation (None, 14, 14, 256) 0 conv4_block5_2_bn[0][0] \n", 403 | "__________________________________________________________________________________________________\n", 404 | "conv4_block5_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block5_2_relu[0][0] \n", 405 | "__________________________________________________________________________________________________\n", 406 | "conv4_block5_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block5_3_conv[0][0] \n", 407 | "__________________________________________________________________________________________________\n", 408 | "conv4_block5_add (Add) (None, 14, 14, 1024) 0 conv4_block4_out[0][0] \n", 409 | " conv4_block5_3_bn[0][0] \n", 410 | 
"__________________________________________________________________________________________________\n", 411 | "conv4_block5_out (Activation) (None, 14, 14, 1024) 0 conv4_block5_add[0][0] \n", 412 | "__________________________________________________________________________________________________\n", 413 | "conv4_block6_1_conv (Conv2D) (None, 14, 14, 256) 262400 conv4_block5_out[0][0] \n", 414 | "__________________________________________________________________________________________________\n", 415 | "conv4_block6_1_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block6_1_conv[0][0] \n", 416 | "__________________________________________________________________________________________________\n", 417 | "conv4_block6_1_relu (Activation (None, 14, 14, 256) 0 conv4_block6_1_bn[0][0] \n", 418 | "__________________________________________________________________________________________________\n", 419 | "conv4_block6_2_conv (Conv2D) (None, 14, 14, 256) 590080 conv4_block6_1_relu[0][0] \n", 420 | "__________________________________________________________________________________________________\n", 421 | "conv4_block6_2_bn (BatchNormali (None, 14, 14, 256) 1024 conv4_block6_2_conv[0][0] \n", 422 | "__________________________________________________________________________________________________\n", 423 | "conv4_block6_2_relu (Activation (None, 14, 14, 256) 0 conv4_block6_2_bn[0][0] \n", 424 | "__________________________________________________________________________________________________\n", 425 | "conv4_block6_3_conv (Conv2D) (None, 14, 14, 1024) 263168 conv4_block6_2_relu[0][0] \n", 426 | "__________________________________________________________________________________________________\n", 427 | "conv4_block6_3_bn (BatchNormali (None, 14, 14, 1024) 4096 conv4_block6_3_conv[0][0] \n", 428 | "__________________________________________________________________________________________________\n", 429 | "conv4_block6_add (Add) (None, 14, 14, 1024) 0 conv4_block5_out[0][0] \n", 430 | " conv4_block6_3_bn[0][0] \n", 431 | "__________________________________________________________________________________________________\n", 432 | "conv4_block6_out (Activation) (None, 14, 14, 1024) 0 conv4_block6_add[0][0] \n", 433 | "__________________________________________________________________________________________________\n", 434 | "conv5_block1_1_conv (Conv2D) (None, 7, 7, 512) 524800 conv4_block6_out[0][0] \n", 435 | "__________________________________________________________________________________________________\n", 436 | "conv5_block1_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block1_1_conv[0][0] \n", 437 | "__________________________________________________________________________________________________\n", 438 | "conv5_block1_1_relu (Activation (None, 7, 7, 512) 0 conv5_block1_1_bn[0][0] \n", 439 | "__________________________________________________________________________________________________\n", 440 | "conv5_block1_2_conv (Conv2D) (None, 7, 7, 512) 2359808 conv5_block1_1_relu[0][0] \n", 441 | "__________________________________________________________________________________________________\n", 442 | "conv5_block1_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block1_2_conv[0][0] \n", 443 | "__________________________________________________________________________________________________\n", 444 | "conv5_block1_2_relu (Activation (None, 7, 7, 512) 0 conv5_block1_2_bn[0][0] \n", 445 | 
"__________________________________________________________________________________________________\n", 446 | "conv5_block1_0_conv (Conv2D) (None, 7, 7, 2048) 2099200 conv4_block6_out[0][0] \n", 447 | "__________________________________________________________________________________________________\n", 448 | "conv5_block1_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block1_2_relu[0][0] \n", 449 | "__________________________________________________________________________________________________\n", 450 | "conv5_block1_0_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block1_0_conv[0][0] \n", 451 | "__________________________________________________________________________________________________\n", 452 | "conv5_block1_3_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block1_3_conv[0][0] \n", 453 | "__________________________________________________________________________________________________\n", 454 | "conv5_block1_add (Add) (None, 7, 7, 2048) 0 conv5_block1_0_bn[0][0] \n", 455 | " conv5_block1_3_bn[0][0] \n", 456 | "__________________________________________________________________________________________________\n", 457 | "conv5_block1_out (Activation) (None, 7, 7, 2048) 0 conv5_block1_add[0][0] \n", 458 | "__________________________________________________________________________________________________\n", 459 | "conv5_block2_1_conv (Conv2D) (None, 7, 7, 512) 1049088 conv5_block1_out[0][0] \n", 460 | "__________________________________________________________________________________________________\n", 461 | "conv5_block2_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block2_1_conv[0][0] \n", 462 | "__________________________________________________________________________________________________\n", 463 | "conv5_block2_1_relu (Activation (None, 7, 7, 512) 0 conv5_block2_1_bn[0][0] \n", 464 | "__________________________________________________________________________________________________\n", 465 | "conv5_block2_2_conv (Conv2D) (None, 7, 7, 512) 2359808 conv5_block2_1_relu[0][0] \n", 466 | "__________________________________________________________________________________________________\n", 467 | "conv5_block2_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block2_2_conv[0][0] \n", 468 | "__________________________________________________________________________________________________\n", 469 | "conv5_block2_2_relu (Activation (None, 7, 7, 512) 0 conv5_block2_2_bn[0][0] \n", 470 | "__________________________________________________________________________________________________\n", 471 | "conv5_block2_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block2_2_relu[0][0] \n", 472 | "__________________________________________________________________________________________________\n", 473 | "conv5_block2_3_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block2_3_conv[0][0] \n", 474 | "__________________________________________________________________________________________________\n", 475 | "conv5_block2_add (Add) (None, 7, 7, 2048) 0 conv5_block1_out[0][0] \n", 476 | " conv5_block2_3_bn[0][0] \n", 477 | "__________________________________________________________________________________________________\n", 478 | "conv5_block2_out (Activation) (None, 7, 7, 2048) 0 conv5_block2_add[0][0] \n", 479 | "__________________________________________________________________________________________________\n", 480 | "conv5_block3_1_conv (Conv2D) (None, 7, 7, 512) 1049088 conv5_block2_out[0][0] \n", 481 | 
"__________________________________________________________________________________________________\n", 482 | "conv5_block3_1_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block3_1_conv[0][0] \n", 483 | "__________________________________________________________________________________________________\n", 484 | "conv5_block3_1_relu (Activation (None, 7, 7, 512) 0 conv5_block3_1_bn[0][0] \n", 485 | "__________________________________________________________________________________________________\n", 486 | "conv5_block3_2_conv (Conv2D) (None, 7, 7, 512) 2359808 conv5_block3_1_relu[0][0] \n", 487 | "__________________________________________________________________________________________________\n", 488 | "conv5_block3_2_bn (BatchNormali (None, 7, 7, 512) 2048 conv5_block3_2_conv[0][0] \n", 489 | "__________________________________________________________________________________________________\n", 490 | "conv5_block3_2_relu (Activation (None, 7, 7, 512) 0 conv5_block3_2_bn[0][0] \n", 491 | "__________________________________________________________________________________________________\n", 492 | "conv5_block3_3_conv (Conv2D) (None, 7, 7, 2048) 1050624 conv5_block3_2_relu[0][0] \n", 493 | "__________________________________________________________________________________________________\n", 494 | "conv5_block3_3_bn (BatchNormali (None, 7, 7, 2048) 8192 conv5_block3_3_conv[0][0] \n", 495 | "__________________________________________________________________________________________________\n", 496 | "conv5_block3_add (Add) (None, 7, 7, 2048) 0 conv5_block2_out[0][0] \n", 497 | " conv5_block3_3_bn[0][0] \n", 498 | "__________________________________________________________________________________________________\n", 499 | "conv5_block3_out (Activation) (None, 7, 7, 2048) 0 conv5_block3_add[0][0] \n", 500 | "__________________________________________________________________________________________________\n", 501 | "flatten_2 (Flatten) (None, 100352) 0 conv5_block3_out[0][0] \n", 502 | "__________________________________________________________________________________________________\n", 503 | "dense_2 (Dense) (None, 3) 301059 flatten_2[0][0] \n", 504 | "==================================================================================================\n", 505 | "Total params: 23,888,771\n", 506 | "Trainable params: 301,059\n", 507 | "Non-trainable params: 23,587,712\n", 508 | "__________________________________________________________________________________________________\n" 509 | ] 510 | } 511 | ], 512 | "source": [ 513 | "\n", 514 | "# view the structure of the model\n", 515 | "model.summary()\n" 516 | ] 517 | }, 518 | { 519 | "cell_type": "code", 520 | "execution_count": 75, 521 | "metadata": {}, 522 | "outputs": [], 523 | "source": [ 524 | "# tell the model what cost and optimization method to use\n", 525 | "model.compile(\n", 526 | " loss='categorical_crossentropy',\n", 527 | " optimizer='adam',\n", 528 | " metrics=['accuracy']\n", 529 | ")\n" 530 | ] 531 | }, 532 | { 533 | "cell_type": "code", 534 | "execution_count": 76, 535 | "metadata": {}, 536 | "outputs": [], 537 | "source": [ 538 | "# Use the Image Data Generator to import the images from the dataset\n", 539 | "from tensorflow.keras.preprocessing.image import ImageDataGenerator\n", 540 | "\n", 541 | "train_datagen = ImageDataGenerator(rescale = 1./255,\n", 542 | " shear_range = 0.2,\n", 543 | " zoom_range = 0.2,\n", 544 | " horizontal_flip = True)\n", 545 | "\n", 546 | "test_datagen = ImageDataGenerator(rescale = 1./255)" 547 | ] 
548 | }, 549 | { 550 | "cell_type": "code", 551 | "execution_count": 77, 552 | "metadata": {}, 553 | "outputs": [ 554 | { 555 | "name": "stdout", 556 | "output_type": "stream", 557 | "text": [ 558 | "Found 64 images belonging to 3 classes.\n" 559 | ] 560 | } 561 | ], 562 | "source": [ 563 | "# Make sure you provide the same target size as initialized for the image size\n", 564 | "training_set = train_datagen.flow_from_directory('Datasets/Train',\n", 565 | " target_size = (224, 224),\n", 566 | " batch_size = 32,\n", 567 | " class_mode = 'categorical')" 568 | ] 569 | }, 570 | { 571 | "cell_type": "code", 572 | "execution_count": 78, 573 | "metadata": {}, 574 | "outputs": [ 575 | { 576 | "name": "stdout", 577 | "output_type": "stream", 578 | "text": [ 579 | "Found 58 images belonging to 3 classes.\n" 580 | ] 581 | } 582 | ], 583 | "source": [ 584 | "test_set = test_datagen.flow_from_directory('Datasets/Test',\n", 585 | " target_size = (224, 224),\n", 586 | " batch_size = 32,\n", 587 | " class_mode = 'categorical')" 588 | ] 589 | }, 590 | { 591 | "cell_type": "code", 592 | "execution_count": 79, 593 | "metadata": {}, 594 | "outputs": [ 595 | { 596 | "name": "stdout", 597 | "output_type": "stream", 598 | "text": [ 599 | "WARNING:tensorflow:sample_weight modes were coerced from\n", 600 | " ...\n", 601 | " to \n", 602 | " ['...']\n", 603 | "WARNING:tensorflow:sample_weight modes were coerced from\n", 604 | " ...\n", 605 | " to \n", 606 | " ['...']\n", 607 | "Train for 2 steps, validate for 2 steps\n", 608 | "Epoch 1/5\n", 609 | "2/2 [==============================] - 27s 14s/step - loss: 4.8157 - accuracy: 0.4062 - val_loss: 8.9823 - val_accuracy: 0.3276\n", 610 | "Epoch 2/5\n", 611 | "2/2 [==============================] - 22s 11s/step - loss: 6.7594 - accuracy: 0.5938 - val_loss: 12.2410 - val_accuracy: 0.3276\n", 612 | "Epoch 3/5\n", 613 | "2/2 [==============================] - 22s 11s/step - loss: 0.9376 - accuracy: 0.8906 - val_loss: 14.1595 - val_accuracy: 0.3276\n", 614 | "Epoch 4/5\n", 615 | "2/2 [==============================] - 21s 10s/step - loss: 1.1360 - accuracy: 0.9219 - val_loss: 15.6251 - val_accuracy: 0.3276\n", 616 | "Epoch 5/5\n", 617 | "2/2 [==============================] - 22s 11s/step - loss: 1.7595 - accuracy: 0.9062 - val_loss: 18.2626 - val_accuracy: 0.3276\n" 618 | ] 619 | } 620 | ], 621 | "source": [ 622 | "# fit the model\n", 623 | "# Run the cell. 
It will take some time to execute\n", 624 | "r = model.fit(\n", 625 | " training_set,\n", 626 | " validation_data=test_set,\n", 627 | " epochs=100,\n", 628 | " steps_per_epoch=len(training_set),\n", 629 | " validation_steps=len(test_set)\n", 630 | ")" 631 | ] 632 | }, 633 | { 634 | "cell_type": "code", 635 | "execution_count": 81, 636 | "metadata": {}, 637 | "outputs": [ 638 | { 639 | "data": { 640 | "image/png": "<base64-encoded matplotlib PNG output omitted; the remainder of this file is truncated here>
H5/aW1slauj1g3pwP9fQj08yHAz8d6XvbYh/BgfwL9Lt5/0TZ/HxoF+FmtcmeIR4QE4NMAu07spi16pS6jqKSU3yzZxgebUohtElwW7GXBW0uzYQaUhaVvudC0tpU99vchwNfVtosDOsBVAF/iPe0Y+qdqh7bolaqBAD8fnr+1J91jG7Ppx6yKA7jCAPU9F87+rgM4wNdHuylUndGgV6oSRITpyW2Znmx3JUpVnf6dppRSXk6DXimlvJwGvVJKeTkNeqWU8nIa9Eop5eU06JVSystp0CullJfToFdKKS/nllMgiEg6cLiaL48CMmqxnNqidVWN1lU1WlfVeGNdbYwx0a52uGXQ14SIbKhovgc7aV1Vo3VVjdZVNQ2tLu26UUopL6dBr5RSXs4bg36u3QVUQOuqGq2rarSuqmlQdXldH71SSqkLeWOLXiml1Hk06JVSyst5ZNCLyEgR2SMi+0TkURf7RURecu7fKiJ93KSuoSKSLSKbnV9P1FNd80QkTUS2V7Dfrut1ubrsul6tRGS5iOwSkR0i8gsXx9T7NatkXfV+zUQkSETWicgWZ11PuTjGjutVmbps+R1znttXRL4Xkf+42Fe718sY41FfgC+wH2gHBABbgK7ljhkN/BcQYADwnZvUNRT4jw3XbDDQB9hewf56v16VrMuu69UC6ON8HAb84Ca/Y5Wpq96vmfMahDof+wPfAQPc4HpVpi5bfsec534IeNvV+Wv7enlii74/sM8Yc8AYUwS8A9xU7pibgIXGshZoIiIt3KAuWxhjVgGnLnGIHderMnXZwhhzzBizyfk4B9gFxJY7rN6vWSXrqnfOa5DrfOrv/Co/ysOO61WZumwhInHAGOAfFRxSq9fLE4M+Fjhy3vMULv5lr8wxdtQFMND5p+R/RaRbHddUWXZcr8qy9XqJSDzQG6s1eD5br9kl6gIbrpmzG2IzkAZ8YYxxi+tVibrAnt+xvwG/Bkor2F+r18sTg15cbCv/r3RljqltlTnnJqz5KHoBLwMf1nFNlWXH9aoMW6+XiIQCHwAPGmNOl9/t4iX1cs0uU5ct18wY4zDGJAJxQH8R6V7uEFuuVyXqqvfrJSLXA2nGmI2XOszFtmpfL08M+hSg1XnP44DUahxT73UZY06f/VPSGLMM8BeRqDquqzLsuF6XZef1EhF/rDBdZIxZ4uIQW67Z5eqy+3fMGJMFrABGlttl6+9YRXXZdL2SgRtF5BBWF+9wEXmr3DG1er08MejXAx1FpK2IBAATgI/LHfMxMMV553oAkG2MOWZ3XSLSXETE+bg/1vU/Wcd1VYYd1+uy7LpeznP+E9hljPlrBYfV+zWrTF12XDMRiRaRJs7HwcDVwO5yh9lxvS5blx3XyxjzG2NMnDEmHisnvjLG3FnusFq9Xn7VL9cexpgSEXkA+AxrpMs8Y8wOEbnXuX8OsAzrrvU+IA+Y7iZ13QLcJyIlQD4wwThvsdclEVmMNbogSkRSgN9j3Ziy7XpVsi5brhdWi2sysM3ZvwvwGND6vNrsuGaVqcuOa9YCeENEfLGC8j1jzH/s/n+yknXZ9Tt2kbq8XjoFglJKeTlP7LpRSilVBRr0Sinl5TTolVLKy2nQK6WUl9OgV0opL6dBr5RSXk6DXimlvNz/A0qQCAOfE5RvAAAAAElFTkSuQmCC\n", 641 | "text/plain": [ 642 | "
" 643 | ] 644 | }, 645 | "metadata": { 646 | "needs_background": "light" 647 | }, 648 | "output_type": "display_data" 649 | }, 650 | { 651 | "data": { 652 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD4CAYAAAD8Zh1EAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8vihELAAAACXBIWXMAAAsTAAALEwEAmpwYAAAlxklEQVR4nO3deXxU1f3/8deHkJCFQEgIiwlLQATZCRHB3VoUxOKGimu1VUTBpbYqUq1W60+7aNWC8uVr+boFkbpCRa27VaslQNhRkDWsIZBASEK28/sjQ0xCQiaQ5E4m7+fjwcO5c8/MfHJM3nPm3HvPmHMOERFp+lp4XYCIiNQPBbqISJBQoIuIBAkFuohIkFCgi4gEiZZevXD79u1d9+7dvXp5EZEmadGiRbudc/HV7fMs0Lt3705aWppXLy8i0iSZ2aaa9mnKRUQkSCjQRUSChAJdRCRIeDaHXp2ioiIyMjIoKCjwupQmJzw8nMTEREJDQ70uRUQ8ElCBnpGRQXR0NN27d8fMvC6nyXDOkZWVRUZGBklJSV6XIyIeCagpl4KCAuLi4hTmdWRmxMXF6ZONSDMXUIEOKMyPkvpNRAJqykVEAp9zjj0HCtm0J4/NWXlszc6nZQsjJjKUthGhtI0Io21EaPl2ZFiIBhyNRIFeQXZ2NrNnz+bWW2+t82PPP/98Zs+eTUxMTP0XJtLISkod27Lz2bwnj01ZeWzac4DNWWW3N+/JI/dgsd/PFRpivqAP9QV9WJXtyv899IbQNiKUsJYBN4kQ0BToFWRnZ/Pss89WG+glJSWEhITU+NgFCxY0ZGki9a6gqOTHwM46UH578548MvbmUVTy45ffhIW0IDE2gm6xkQxLiqVrbCTd4sr+JbaLpLjUkZNfRE5eEdn5hezLLyI7r4ic/CKy84vK9+XkF7FrfwFrd+0nO6+I/QVHfmOIDAupIfxreFPwvRlEh7ekRYvm96lAgV7BlClT+OGHHxg8eDAjR45kzJgx/P73v6dz586kp6ezatUqLrroIrZs2UJBQQF33HEHEyZMAH5cyiA3N5fRo0dz2mmn8fXXX5OQkMA777xDREREpdeaP38+f/jDHygsLCQuLo7U1FQ6duxIbm4ut912G2lpaZgZDz74IJdeeinvv/8+U6dOpaSkhPbt2/Pxxx970UXShDjn2JtXVB7Wm7PyyqdJNu05wM59Byu1jw5vSbe4SPp2bsOo/p3oFhtJ17hIusVF0alNOCG1BGTrVi1JiIk4YpuqSkod+/IrB392Xs1vCBt355GdX0hOfhEFRaU1Pq8ZtAmvOOqvPvjbRh6+LyK06U4RmVdfQZeSkuKqruWyevVqTjzxRAB+P38lq7btq9fX7HtcGx78Wb8a92/cuJELLriAFStWAPDZZ58xZswYVqxYUX464J49e4iNjSU/P5+TTjqJzz//nLi4uEqBfvzxx5OWlsbgwYO5/PLLGTt2LNdcc02l19q7dy8xMTGYGc8//zyrV6/miSee4N577+XgwYM89dRT5e2Ki4tJTk7miy++ICkpqbyGqir2nzQPJaWO7Tn55WFdNsI+UPbfrDz2V5ka6dimFd1io8qCukJgd4uNJCYytEkFWUFRSeU3g7yK4V9Y5U2iqOxNwrddUlpz7oWFtKBNpfAPrRT+Mb7bMRFhldq1jQglNKThp4jMbJFzLqW6fX6N0M1sFPA0EAI875x7vMr+dsAsoCdQAPzCObfimKoOEMOGDat0bvczzzzDW2+9BcCWLVtYu3YtcXFxlR6TlJTE4MGDARg6dCgbN2487HkzMjK44oor2L59O4WFheWv8dFHHzFnzpzydu3atWP+/PmcccYZ5W2qC3MJXgVFJWwpn8vOY3PWgfKRdsbefApLfhyphoYYie0i6RobydBu7XxTI1F0i4ukS7tIIsJqnjZsasJDQwgPDaFDm/A6Pc45R+7B4mqD/tCnghzfp4DsvCJ27CtgzY797MsvOuwNsqqosBBiIsuCvm1Ey/JPAjGRoZXCv0+naI7vEH0sP361ag10MwsBpgMjgQxgoZnNc86tqtBsKpDunLvYzPr42p9zLIUdaSTdmKKiospvf/bZZ3z00Uf85z//ITIykrPOOqvac79btWpVfjskJIT8/PzD2tx2223cddddjB07ls8++4yHHnoIKPtlqzpKqu4+CS7ZeYWVA7vC9MiOfZV/x1q3aknX2Eh6d4pmZL+OdIstC+yusZEcFxNR69RIc2dmRIeHEh0eSmK7uj22uKSUfQXF5VNDOYc+EVScIqrwprB+dy7Zvk8OhcU/vvFOPLMnU0b3qeefzL8R+jBgnXNuPYCZzQEuBCoGel/gMQDn3Boz625mHZ1zO+u74IYUHR3N/v37a9yfk5NDu3btiIyMZM2aNXzzzTdH/Vo5OTkkJCQA8OKLL5bff+655zJt2rRKUy4jRoxg0qRJbNiw4YhTLhK4SksdO/YVVJoSKZ/PzjrAvioHBztEt6JbXCSnHt++/ADkoWmS2KgwvcF7pGVIC2KjwoiNCgOiam1fUUFRSXngt4lomMOX/jxrArClwnYGcHKVNkuBS4AvzWwY0A1IBJpUoMfFxXHqqafSv39/Ro8ezZgxYyrtHzVqFDNmzGDgwIH07t2b4cOHH/VrPfTQQ1x22WUkJCQwfPhwNmzYAMD999/PpEmT6N+/PyEhITz44INccsklzJw5k0suuYTS0lI6dOjAhx9+eEw/q9S/gqISMvYeOmuk7GyRsjNHDrBlb36lEVrLFkZiuwi6xkUxuEtM+Qi7W1wUXWIjiAzT+QrB5tAUUcc6ThHVRa0HRc3sMuA859yNvu1rgWHOudsqtGlD2Rz7EGA50Ae40Tm3tMpzTQAmAHTt2nXopk2V12nXQb1jo/5reDl5RWw6dNDRF9aHbu/YV0DFP6eosBC6+g44/jjCLpse6dw2nJaNcABNgs+xHhTNALpU2E4EtlVs4JzbB9zgezEDNvj+UaXdTGAmlJ3l4k/xIo0tK/cga3fllp/e92N455GTX1SpbfvWZVMjI3rE+c4YiaSrL7TjNDUijcyfQF8I9DKzJGArMB64qmIDM4sB8pxzhcCNwBe+kBdpUr7+YTfXz1pYfuZISAsjISaCbnGRXDCwc6XA7hobSVQrTY1I4Kj1t9E5V2xmk4EPKDttcZZzbqWZTfTtnwGcCLxkZiWUHSz9ZQPWLNIgduQUcPurS+gaF8nvLuhLt7iys0Ya49xikfrg1/DCObcAWFDlvhkVbv8H6FW/pYk0nqKSUibPXkxeYQlzJiQ3yDnCIg1NnxdFgMffW0Papr08c+UQhbk0WfosKc3eguXb+fuXG7j+lO6MHXSc1+WIHDUF+jFq3bq11yXIMfghM5d7Xl/GkK4xTD1f
p3xK06ZAl2Yrr7CYW15ZRFjLFky/Kllrb0uTp9/gCu69916effbZ8u2HHnqIJ554gtzcXM455xySk5MZMGAA77zzTq3PddFFFzF06FD69evHzJkzy+9///33SU5OZtCgQZxzTtlyN7m5udxwww0MGDCAgQMH8sYbb9T/DyeVOOf47VsrWLsrl6fHD+a4Oi77KhKIAveg6HtTYMfy+n3OTgNg9OM17h4/fjx33nln+RdczJ07l/fff5/w8HDeeust2rRpw+7duxk+fDhjx4494kUjs2bNqrTM7qWXXkppaSk33XRTpWVwAR555BHatm3L8uVlP+/evXvr8YeW6qR+u5m3lmzlrpEncHqveK/LEakXgRvoHhgyZAi7du1i27ZtZGZm0q5dO7p27UpRURFTp07liy++oEWLFmzdupWdO3fSqVOnGp+rumV2MzMzq10Gt7olc6XhLN2SzcPzV3FW73gmn3281+WI1JvADfQjjKQb0rhx43j99dfZsWMH48ePByA1NZXMzEwWLVpEaGgo3bt3r3bZ3ENqWma3pmVwtTxu49l7oJBbUxcTH92Kv14+uFl+TZkEL82hVzF+/HjmzJnD66+/zrhx44CypW47dOhAaGgon376KVUXFauqpmV2R4wYweeff16+suKhKZdDS+YeoimXhlFa6vjV3HQy9x/k2auTaRcV5nVJIvVKgV5Fv3792L9/PwkJCXTu3BmAq6++mrS0NFJSUkhNTaVPnyMvTD9q1CiKi4sZOHAgDzzwQPkyu/Hx8eXL4A4aNIgrrrgCKFsyd+/evfTv359Bgwbx6aefNuwP2UxN+3Qdn32Xye9+1pdBXWK8Lkek3gXsd4pK3an/avbvtZlcN+u/XDQ4gScvH6QpLmmyjrR8rkboEvS2Zedz+6tL6NWhNY9e3F9hLkFLgS5BrbC4lFtTF1NU4njumqH6JiAJagEX6F5NATV16rfq/b8Fq0nfks2fxg2kZ7yWaZDgFlCBHh4eTlZWlsKpjpxzZGVlER7ecN9V2BTNW7qNF77eyC9PS+L8AZ29LkekwQXU58/ExEQyMjLIzMz0upQmJzw8nMTERK/LCBhrd+5nyhvLSOnWjimjj3xWkkiwCKhADw0NLb+KUuRoHThYzC2pi4kMC2HaVcn6xiFpNgIq0EWOlXOOKW8uZ31mLq/ceDKd2moaSpoPDV0kqLz49UbmL93Gr8/tzSk923tdjkijUqBL0Fi8eS+PLljNOX06cMuZPb0uR6TRKdAlKGTlHmRS6mI6tQ3nSS26Jc2U5tClySspddz5WjpZBwp585ZTaBsZ6nVJIp7QCF2avKc/+p5/r93Nw2P70T+hrdfliHhGgS5N2qff7eKZT9Zx2dBErjipi9fliHhKgS5NVsbePH71Wjondm7DIxdp0S0RvwLdzEaZ2Xdmts7MplSzv62ZzTezpWa20sxuqP9SRX50sLiEW1MXU1LieO7qZMJDQ7wuScRztQa6mYUA04HRQF/gSjPrW6XZJGCVc24QcBbwhJnp62CkwTw8fxXLMnL4y+WD6N4+yutyRAKCPyP0YcA659x651whMAe4sEobB0Rb2Wfe1sAeoLheKxXxeWtJBqnfbubmM3twXr+av6hbpLnxJ9ATgC0VtjN891U0DTgR2AYsB+5wzpVWfSIzm2BmaWaWpgW45Gis2bGP+95czslJsdx9bm+vyxEJKP4EenVHmqqub3sekA4cBwwGpplZm8Me5NxM51yKcy4lPj6+jqVKc7e/oIhbXllMdHgof7tqCC216JZIJf78RWQAFc8HS6RsJF7RDcCbrsw6YAOgNUul3jjnuOf1ZWzek8e0K4fQIVqLbolU5U+gLwR6mVmS70DneGBelTabgXMAzKwj0BtYX5+FSvP29y838N6KHdw7qjcn94jzuhyRgFTrpf/OuWIzmwx8AIQAs5xzK81som//DOAR4AUzW07ZFM29zrndDVi3NCMLN+7h8ffWcF6/jtx0eg+vyxEJWH6t5eKcWwAsqHLfjAq3twHn1m9pIpC5v2zRrcR2Efz5skG6eEjkCLQ4lwSs4pJSbn91CTn5RbxwwzDahGvRLZEjUaBLwHryw+/5z/os/nLZIPoed9hJUyJShc77koD00aqdPPvZD1w5rAvjhurLr0X8oUCXgLM5K49fzU2nf0IbHvxZP6/LEWkyFOgSUAqKSrgldREGPHf1UC26JVIHmkOXgPLQvJWs3LaPWden0CU20utyRJoUjdAlYMxN28KchVuYdHZPftKno9fliDQ5CnQJCCu35fDA2ys4pWccd43UolsiR0OBLp7LyS/i1tTFxESG8syVQwhpoYuHRI6G5tDFU8457v7HUrbuzee1m4fTvnUrr0sSabI0QhdPzfxiPf9atZP7zj+Rod1ivS5HpElToItnvlmfxR/fX8OYAZ35xandvS5HpMlToIsndu0rYPLsJXRvH8Xjlw7Qolsi9UBz6NLoikpKmTx7CQcOFjP7ppOJ1qJbIvVCgS6N7s8ffMd/N+7hqSsGc0LHaK/LEQkamnKRRvX+ih3M/GI91w7vxkVDqn7XuIgcCwW6NJoNuw9w9z+WMqhLDPdfcKLX5YgEHQW6NIr8whJueWURISHG9KuG0KqlFt0SqW+aQ5cG55zj/rdX8N3O/fzf9SeR2E6Lbok0BI3QpcHNWbiFNxZncPtPenFW7w5elyMStBTo0qBWbM3hwXkrOb1Xe24/p5fX5YgENQW6NJjsvEImvrKI9lFhPD1ei26JNDTNoUuDKC113DV3KTv3FTD35hHERoV5XZJI0NMIXRrEc5//wCdrdvHABX0Z0rWd1+WINAt+BbqZjTKz78xsnZlNqWb/3WaW7vu3wsxKzExL5zVTX63bzRP/+o6xg47j2uHdvC5HpNmoNdDNLASYDowG+gJXmlnfim2cc392zg12zg0G7gM+d87taYB6JcDtyCng9leX0CO+NY9dokW3RBqTPyP0YcA659x651whMAe48AjtrwRerY/ipGkpKill0uzFFBSVMOOaoUS10iEakcbkT6AnAFsqbGf47juMmUUCo4A3atg/wczSzCwtMzOzrrVKgHtswRoWbdrLH8cN5PgOrb0uR6TZ8SfQq/vM7Gpo+zPgq5qmW5xzM51zKc65lPj4eH9rlCbg3WXbmfXVBq4/pTsXDDzO63JEmiV/Aj0D6FJhOxHYVkPb8Wi6pdlZtyuXe15fSnLXGKaer0W3RLziT6AvBHqZWZKZhVEW2vOqNjKztsCZwDv1W6IEsrzCYm5NXUSr0BCmX51MWEudCSvilVqPWjnnis1sMvABEALMcs6tNLOJvv0zfE0vBv7lnDvQYNVKQHHOMfXN5azdlcvLvziZzm0jvC5JpFnz6zQE59wCYEGV+2ZU2X4BeKG+CpPA98q3m3k7fRu/HnkCp/Vq73U5Is2ePh/LUUnfks3D81dydu94Jp19vNfliAgKdDkKew8UMil1MR2iw/nrFYNpoUW3RAKCrvyQOiktddz5WjqZ+w/y+i0jiInUolsigUIjdKmTv32yjs+/z+TBsX0ZmBj
jdTkiUoECXfz2+feZPPXx91ySnMBVw7p6XY6IVKFAF79szc7nzjlL6N0xmkcv0qJbIoFIgS61KiwuZVLqYopKHM9enUxEWIjXJYlINXRQVGr16LurSN+SzYxrkukRr0W3RAKVRuhyRO+kb+XF/2ziptOTGNW/s9fliMgRKNClRmt37mfKG8s5qXs77hnVx+tyRKQWCnSpVu7BYia+soioVi2ZdlUyoSH6VREJdPorlcM455jyxjI27D7A364cQsc24V6XJCJ+UKDLYV74eiP/XLadu8/rw4iecV6XIyJ+UqBLJYs27eXRd1fz0xM7MvHMHl6XIyJ1oECXclm5B5mUupjjYiJ44vJBunhIpInReegCQEmp4/Y5S9ibV8ibt55C24hQr0sSkTpSoAsAT330PV+ty+JPlw6k33FtvS5HRI6CplyET9fs4m+frOPylEQuP6lL7Q8QkYCkQG/mtuzJ487X0unbuQ0PX9jf63JE5Bgo0JuxgqISbk1dTKlzPHdNMuGhWnRLpCnTHHoz9vA/V7F8aw7/e10K3eKivC5HRI6RRujN1JuLM5j97WYmntmTkX07el2OiNQDBXoztGbHPqa+tZzhPWL5zbkneF2OiNQTBXozs6+giFteWUyb8FCeuXIILbXolkjQ8Ouv2cxGmdl3ZrbOzKbU0OYsM0s3s5Vm9nn9lin1wTnHPf9YxuY9eUy7KpkO0Vp0SySY1HpQ1MxCgOnASCADWGhm85xzqyq0iQGeBUY55zabWYcGqleOwd+/3MD7K3fw2/NPZFhSrNfliEg982eEPgxY55xb75wrBOYAF1ZpcxXwpnNuM4Bzblf9linH6r8b9vDYe2sY1a8TN56e5HU5ItIA/An0BGBLhe0M330VnQC0M7PPzGyRmV1X3ROZ2QQzSzOztMzMzKOrWOps1/4CJs9eTNfYSP582UAtuiUSpPwJ9Or++l2V7ZbAUGAMcB7wgJkddvqEc26mcy7FOZcSHx9f52Kl7opKSrn91SXsKyjiuWuSiQ7XolsiwcqfC4sygIoLfCQC26pps9s5dwA4YGZfAIOA7+ulSjkqew4UMil1Md+s38MTlw2iT6c2XpckIg3InxH6QqCXmSWZWRgwHphXpc07wOlm1tLMIoGTgdX1W6rUxXc79nPh9C9ZtHkvT14+iEuHJnpdkog0sFpH6M65YjObDHwAhACznHMrzWyib/8M59xqM3sfWAaUAs8751Y0ZOFSs3+t3MGvXksnqlVL5t48gsFdYrwuSUQagTlXdTq8caSkpLi0tDRPXjtYOeeY/uk6/vKv7xmU2Jb/uTaFTm11rrlIMDGzRc65lOr2aXGuIJFfWMJvXl/Ku8u2c/GQBB67ZIBWTxRpZhToQWBbdj43vZTGqu37uG90Hyac0UOnJoo0Qwr0Ji5t4x4mvrKIg0WlzPr5SZzdRxfpijRXCvQmbO7CLfz27eUkxEQwZ0IKx3eI9rokEfGQAr0JKi4p5dEFq/m/rzZyeq/2TLsymbaRumBIpLlToDcx2XmFTJ69hC/X7eaXpyVx3+g+WgJXRAAFepOydud+bnwpje3ZBfxp3EAuT+lS+4NEpNlQoDcRH6/eyR1z0gkPDeHVCScztJuWvxWRyhToAc45x4zP1/OnD9bQ77g2zLw2heNiIrwuS0QCkAI9gBUUlXDvG8t4J30bPxt0HH+6dCARYbpYSESqp0APUDtyCpjwchrLt+Zw93m9ufWsnrpYSESOSIEegBZv3svNLy8i72AxM69NYWTfjl6XJCJNgAI9wLy+KIOpby6nU9twUm88mRM66mIhEfGPAj1AlJQ6Hn9vNf/77w2c0jOO6Vcl0y4qzOuyRKQJUaAHgJz8Im5/dQmff5/J9ad057djTiRUFwuJSB0p0D32Q2YuN72Yxpa9eTx2yQCuHNbV65JEpIlSoHvos+92cdurSwgLaUHqjcMZlqSLhUTk6CnQPeCc4/l/b+Cx91bTu1Mb/ve6oSS2i/S6LBFp4hTojaygqISpby3nzcVbOX9AJ/5y2SAiw/S/QUSOnZKkEe3aV8CElxeRviWbu0aewG0/OV4XC4lIvVGgN5KlW7KZ8HIa+wuKmXHNUEb17+R1SSISZBTojeDtJVu5541ldIhuxRu3nMKJndt4XZKIBCEFegMqKXX8+YPvmPH5D5ycFMtz1wwlVhcLiUgDUaA3kH0FRdw5J51P1uzi6pO78tDYfrpYSEQalF8JY2ajzOw7M1tnZlOq2X+WmeWYWbrv3+/qv9SmY8PuA1w8/Su++D6TRy7qz6MXD1CYi0iDq3WEbmYhwHRgJJABLDSzec65VVWa/ts5d0ED1Nik/HttJpNSFxPSwnj5lyczomec1yWJSDPhz5TLMGCdc249gJnNAS4EqgZ6s+ac4/++2sgf3l1Frw7RPP/zFLrE6mIhEWk8/gR6ArClwnYGcHI17UaY2VJgG/Ab59zKeqivSThYXMIDb69gbloG5/btyJNXDKZ1Kx2eEJHG5U/qVHfli6uyvRjo5pzLNbPzgbeBXoc9kdkEYAJA167BsQhV5v6DTHxlEYs27eX2c3px5zm9aNFCFwuJSOPz50hdBtClwnYiZaPwcs65fc65XN/tBUCombWv+kTOuZnOuRTnXEp8fPwxlB0YVmzNYey0L1m5LYfpVyVz18gTFOYi4hl/An0h0MvMkswsDBgPzKvYwMw6me8adjMb5nverPouNpDMX7qNcTO+xoDXJ57CmIGdvS5JRJq5WqdcnHPFZjYZ+AAIAWY551aa2UTf/hnAOOAWMysG8oHxzrmq0zJBobTU8eSH3zPt03WkdGvHjGuH0r51K6/LEhHBvMrdlJQUl5aW5slrH63cg8X86rV0Ply1kytSuvDIRf0Ja6nzy0Wk8ZjZIudcSnX7dCqGnzZn5XHjSwv5IfMAD/2sLz8/pbtWShSRgKJA98PXP+zm1tTFOAcv/WIYpx5/2PFeERHPKdCPwDnHy99s4vfzV9GjfRTP/zyFbnFRXpclIlItBXoNCotLeXDeSl7972Z+emIH/nrFYKLDQ70uS0SkRgr0amTlHuSWVxbz3417mHR2T349srfOLxeRgKdAr2LVtn3c9FIau3MP8vT4wVw4OMHrkkRE/KJAr+C95du5a+5S2kaE8o+JIxiYGON1SSIiflOgU3ax0NMfr+Xpj9cypGsM/3PtUDpEh3tdlohInTT7QD9wsJhfz13K+yt3MG5oIo9e3J9WLUO8LktEpM6adaBv2ZPHTS+l8f3O/dw/5kR+eVqSLhYSkSar2Qb6t+uzuCV1McUlpbxwwzDOOKHpr/4oIs1bswz01G838eA7K+kaF8nz16XQI7611yWJiByzZhXoRSWlPDx/FS9/s4mzesfzzJVDaKOLhUQkSDSbQN9zoJBbUxfxzfo93HxmD+45rw8hulhIRIJIswj0NTvKLhbaue8gf71iEBcPSfS6JBGRehf0gf7Byh386rV0WrdqydybRzC4S4zXJYmINIigDXTnHNM+WccTH37PoMS2zLwuhY5tdLGQiASvoAz0vMJi7n59Ge8u28
7FQxJ47JIBhIfqYiERCW5BF+hbs/OZ8FIaq7bv477RfZhwRg9dLCQizUJQBXraxj1MfGURB4tKmfXzkzi7TwevSxIRaTRBE+ivLdzM/W+vILFdJHMmpHB8B10sJCLNS5MP9OKSUv7w7mpe+Hojp/dqz7Qrk2kbqYuFRKT5adKBnp1XyOTZS/hy3W5uPC2JKaP70DKkhddliYh4oskG+tqd+7nxpTS2Zxfwp3EDuTyli9cliYh4qkkG+serd3LHnHTCQ0N4dcJwhnZr53VJIiKe82t+wsxGmdl3ZrbOzKYcod1JZlZiZuPqr8TK5i3dxo0vpZHUPor5t52qMBcR8al1hG5mIcB0YCSQASw0s3nOuVXVtPsj8EFDFHrIace35xenJvGbc3sTEaaLhUREDvFnhD4MWOecW++cKwTmABdW0+424A1gVz3Wd5jYqDAeuKCvwlxEpAp/Aj0B2FJhO8N3XzkzSwAuBmYc6YnMbIKZpZlZWmZmZl1rFRGRI/An0Ku7bt5V2X4KuNc5V3KkJ3LOzXTOpTjnUuLj9ZVvIiL1yZ+zXDKAiucEJgLbqrRJAeb41kxpD5xvZsXOubfro0gREamdP4G+EOhlZknAVmA8cFXFBs65pEO3zewF4J8KcxGRxlVroDvnis1sMmVnr4QAs5xzK81som//EefNRUSkcfh1YZFzbgGwoMp91Qa5c+76Yy9LRETqSgufiIgECQW6iEiQUKCLiAQJBbqISJBQoIuIBAkFuohIkFCgi4gECQW6iEiQUKCLiAQJBbqISJBQoIuIBAkFuohIkFCgi4gECQW6iEiQUKCLiAQJBbqISJBQoIuIBAm/vrEooLw3BXYs97oKEZGj12kAjH683p9WI3QRkSDR9EboDfCuJiISDDRCFxEJEgp0EZEgoUAXEQkSCnQRkSDhV6Cb2Sgz+87M1pnZlGr2X2hmy8ws3czSzOy0+i9VRESOpNazXMwsBJgOjAQygIVmNs85t6pCs4+Bec45Z2YDgblAn4YoWEREqufPCH0YsM45t945VwjMAS6s2MA5l+ucc77NKMAhIiKNyp9ATwC2VNjO8N1XiZldbGZrgHeBX1T3RGY2wTclk5aZmXk09YqISA38ubDIqrnvsBG4c+4t4C0zOwN4BPhpNW1mAjMBzCzTzDbVrdxy7YHdR/nYhhSodUHg1qa66kZ11U0w1tWtph3+BHoG0KXCdiKwrabGzrkvzKynmbV3ztVYsHMu3o/XrpaZpTnnUo728Q0lUOuCwK1NddWN6qqb5laXP1MuC4FeZpZkZmHAeGBeleKONzPz3U4GwoCs+i5WRERqVusI3TlXbGaTgQ+AEGCWc26lmU307Z8BXApcZ2ZFQD5wRYWDpCIi0gj8WpzLObcAWFDlvhkVbv8R+GP9lnZEMxvxteoiUOuCwK1NddWN6qqbZlWXaSAtIhIcdOm/iEiQUKCLiASJgA50P9aQMTN7xrd/me8Mm0Co6ywzy/GtbZNuZr9rpLpmmdkuM1tRw36v+qu2uhq9v8ysi5l9amarzWylmd1RTZtG7y8/6/Kiv8LN7L9mttRX1++raeNFf/lTlyd/j77XDjGzJWb2z2r21X9/OecC8h9lZ9T8APSg7DTIpUDfKm3OB96j7OKn4cC3AVLXWcA/PeizM4BkYEUN+xu9v/ysq9H7C+gMJPtuRwPfB8jvlz91edFfBrT23Q4FvgWGB0B/+VOXJ3+Pvte+C5hd3es3RH8F8gi91jVkfNsvuTLfADFm1jkA6vKEc+4LYM8RmnjRX/7U1eicc9udc4t9t/cDqzl8SYtG7y8/62p0vj7I9W2G+v5VPaPCi/7ypy5PmFkiMAZ4voYm9d5fgRzo/qwh49c6Mx7UBTDC9zHwPTPr18A1+cuL/vKXZ/1lZt2BIZSN7irytL+OUBd40F++6YN0YBfwoXMuIPrLj7rAm9+vp4B7gNIa9td7fwVyoPuzhoxf68zUM39eczHQzTk3CPgb8HYD1+QvL/rLH571l5m1Bt4A7nTO7au6u5qHNEp/1VKXJ/3lnCtxzg2mbPmPYWbWv0oTT/rLj7oavb/M7AJgl3Nu0ZGaVXPfMfVXIAe6P2vI1Gmdmcaqyzm379DHQFd2UVaombVv4Lr84UV/1cqr/jKzUMpCM9U592Y1TTzpr9rq8vr3yzmXDXwGjKqyy9Pfr5rq8qi/TgXGmtlGyqZlf2Jmr1RpU+/9FciBXusaMr7t63xHi4cDOc657V7XZWadzMrXthlGWT8Hwto2XvRXrbzoL9/r/R1Y7Zx7soZmjd5f/tTlUX/Fm1mM73YEZauprqnSzIv+qrUuL/rLOXefcy7ROdedsoz4xDl3TZVm9d5ffl367wXn3xoyCyg7UrwOyANuCJC6xgG3mFkxZWvbjHe+w9oNycxepeyIfnszywAepOwgkWf95WddXvTXqcC1wHLf/CvAVKBrhbq86C9/6vKivzoDL1rZN5i1AOY65/7p9d+jn3V58vdYnYbuL136LyISJAJ5ykVEROpAgS4iEiQU6CIiQUKBLiISJBToIiJBQoEuIhIkFOgiIkHi/wM89L/iy4xBwAAAAABJRU5ErkJggg==\n", 653 | "text/plain": [ 654 | "
" 655 | ] 656 | }, 657 | "metadata": { 658 | "needs_background": "light" 659 | }, 660 | "output_type": "display_data" 661 | }, 662 | { 663 | "data": { 664 | "text/plain": [ 665 | "
" 666 | ] 667 | }, 668 | "metadata": {}, 669 | "output_type": "display_data" 670 | } 671 | ], 672 | "source": [ 673 | "# plot the loss\n", 674 | "plt.plot(r.history['loss'], label='train loss')\n", 675 | "plt.plot(r.history['val_loss'], label='val loss')\n", 676 | "plt.legend()\n", 677 | "plt.show()\n", 678 | "plt.savefig('LossVal_loss')\n", 679 | "\n", 680 | "# plot the accuracy\n", 681 | "plt.plot(r.history['accuracy'], label='train acc')\n", 682 | "plt.plot(r.history['val_accuracy'], label='val acc')\n", 683 | "plt.legend()\n", 684 | "plt.show()\n", 685 | "plt.savefig('AccVal_acc')" 686 | ] 687 | }, 688 | { 689 | "cell_type": "code", 690 | "execution_count": 82, 691 | "metadata": {}, 692 | "outputs": [], 693 | "source": [ 694 | "# save it as a h5 file\n", 695 | "\n", 696 | "\n", 697 | "from tensorflow.keras.models import load_model\n", 698 | "\n", 699 | "model.save('model_resnet50.h5')" 700 | ] 701 | }, 702 | { 703 | "cell_type": "code", 704 | "execution_count": null, 705 | "metadata": {}, 706 | "outputs": [], 707 | "source": [] 708 | }, 709 | { 710 | "cell_type": "code", 711 | "execution_count": 83, 712 | "metadata": {}, 713 | "outputs": [], 714 | "source": [ 715 | "\n", 716 | "y_pred = model.predict(test_set)\n" 717 | ] 718 | }, 719 | { 720 | "cell_type": "code", 721 | "execution_count": 84, 722 | "metadata": {}, 723 | "outputs": [ 724 | { 725 | "data": { 726 | "text/plain": [ 727 | "array([[1.72658485e-17, 3.15305914e-14, 1.00000000e+00],\n", 728 | " [1.44285156e-18, 1.04114977e-14, 1.00000000e+00],\n", 729 | " [1.72185482e-18, 2.45480757e-15, 1.00000000e+00],\n", 730 | " [1.69613023e-18, 5.67724129e-15, 1.00000000e+00],\n", 731 | " [1.58277143e-18, 3.13576086e-15, 1.00000000e+00],\n", 732 | " [4.46850329e-18, 1.18827149e-13, 1.00000000e+00],\n", 733 | " [3.54525992e-19, 4.03665452e-15, 1.00000000e+00],\n", 734 | " [6.27264810e-19, 4.79972067e-15, 1.00000000e+00],\n", 735 | " [1.86832241e-17, 2.78132331e-14, 1.00000000e+00],\n", 736 | " [2.50938722e-18, 1.34208999e-14, 1.00000000e+00],\n", 737 | " [1.44070657e-18, 6.03744714e-15, 1.00000000e+00],\n", 738 | " [1.29441017e-18, 4.11096641e-15, 1.00000000e+00],\n", 739 | " [4.26931158e-18, 6.29958190e-14, 1.00000000e+00],\n", 740 | " [3.45607537e-18, 8.87510704e-15, 1.00000000e+00],\n", 741 | " [2.06131713e-18, 1.81756974e-14, 1.00000000e+00],\n", 742 | " [1.03641626e-18, 5.46802458e-15, 1.00000000e+00],\n", 743 | " [1.00217139e-18, 5.56380283e-15, 1.00000000e+00],\n", 744 | " [1.82711066e-18, 1.86024292e-14, 1.00000000e+00],\n", 745 | " [3.87583438e-19, 4.07064299e-15, 1.00000000e+00],\n", 746 | " [8.47338694e-19, 2.99262055e-15, 1.00000000e+00],\n", 747 | " [5.06119888e-18, 2.16092199e-14, 1.00000000e+00],\n", 748 | " [1.59416336e-18, 9.65978142e-15, 1.00000000e+00],\n", 749 | " [5.72713180e-18, 1.50026831e-14, 1.00000000e+00],\n", 750 | " [2.28775348e-18, 8.07533006e-15, 1.00000000e+00],\n", 751 | " [4.63019435e-18, 1.41081325e-14, 1.00000000e+00],\n", 752 | " [7.96198894e-18, 2.15624502e-14, 1.00000000e+00],\n", 753 | " [1.04141781e-18, 1.15448947e-14, 1.00000000e+00],\n", 754 | " [1.15247621e-18, 4.19886599e-15, 1.00000000e+00],\n", 755 | " [4.73869439e-18, 1.06193571e-14, 1.00000000e+00],\n", 756 | " [1.18833418e-18, 6.53833270e-14, 1.00000000e+00],\n", 757 | " [2.84519008e-18, 7.93761859e-15, 1.00000000e+00],\n", 758 | " [1.64842321e-18, 7.50876751e-15, 1.00000000e+00],\n", 759 | " [6.47598998e-18, 1.88450939e-14, 1.00000000e+00],\n", 760 | " [8.19181879e-19, 2.95362887e-15, 1.00000000e+00],\n", 761 | " [6.70528073e-18, 
2.20390383e-14, 1.00000000e+00],\n", 762 | " [1.66077529e-18, 6.74443759e-15, 1.00000000e+00],\n", 763 | " [1.44466898e-18, 7.34333774e-15, 1.00000000e+00],\n", 764 | " [1.98024826e-18, 2.13992921e-15, 1.00000000e+00],\n", 765 | " [5.60118631e-19, 3.58009952e-15, 1.00000000e+00],\n", 766 | " [2.06639519e-18, 1.56281983e-14, 1.00000000e+00],\n", 767 | " [2.80224286e-19, 1.51491605e-14, 1.00000000e+00],\n", 768 | " [1.56633794e-18, 8.40833005e-15, 1.00000000e+00],\n", 769 | " [1.92835653e-18, 7.12104580e-15, 1.00000000e+00],\n", 770 | " [1.61547329e-18, 5.66715821e-15, 1.00000000e+00],\n", 771 | " [1.38129215e-18, 1.36457237e-14, 1.00000000e+00],\n", 772 | " [1.06465987e-17, 2.40037770e-14, 1.00000000e+00],\n", 773 | " [1.57657078e-18, 1.01950597e-14, 1.00000000e+00],\n", 774 | " [2.60430806e-18, 1.80379342e-14, 1.00000000e+00],\n", 775 | " [8.51574334e-18, 3.26405569e-14, 1.00000000e+00],\n", 776 | " [7.40295009e-19, 3.48502113e-15, 1.00000000e+00],\n", 777 | " [2.37361172e-18, 6.39510976e-15, 1.00000000e+00],\n", 778 | " [1.30336425e-17, 2.92999674e-14, 1.00000000e+00],\n", 779 | " [4.08764250e-19, 4.92643679e-15, 1.00000000e+00],\n", 780 | " [1.11583428e-18, 5.83212804e-15, 1.00000000e+00],\n", 781 | " [1.51674087e-17, 2.18138750e-14, 1.00000000e+00],\n", 782 | " [9.69172575e-19, 2.22222228e-15, 1.00000000e+00],\n", 783 | " [7.40182471e-18, 3.08916135e-14, 1.00000000e+00],\n", 784 | " [4.56019915e-19, 9.16357766e-15, 1.00000000e+00]], dtype=float32)" 785 | ] 786 | }, 787 | "execution_count": 84, 788 | "metadata": {}, 789 | "output_type": "execute_result" 790 | } 791 | ], 792 | "source": [ 793 | "y_pred" 794 | ] 795 | }, 796 | { 797 | "cell_type": "code", 798 | "execution_count": 85, 799 | "metadata": {}, 800 | "outputs": [], 801 | "source": [ 802 | "import numpy as np\n", 803 | "y_pred = np.argmax(y_pred, axis=1)" 804 | ] 805 | }, 806 | { 807 | "cell_type": "code", 808 | "execution_count": 86, 809 | "metadata": {}, 810 | "outputs": [ 811 | { 812 | "data": { 813 | "text/plain": [ 814 | "array([2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,\n", 815 | " 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,\n", 816 | " 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2], dtype=int64)" 817 | ] 818 | }, 819 | "execution_count": 86, 820 | "metadata": {}, 821 | "output_type": "execute_result" 822 | } 823 | ], 824 | "source": [ 825 | "y_pred" 826 | ] 827 | }, 828 | { 829 | "cell_type": "code", 830 | "execution_count": null, 831 | "metadata": {}, 832 | "outputs": [], 833 | "source": [] 834 | }, 835 | { 836 | "cell_type": "code", 837 | "execution_count": 87, 838 | "metadata": {}, 839 | "outputs": [], 840 | "source": [ 841 | "from tensorflow.keras.models import load_model\n", 842 | "from tensorflow.keras.preprocessing import image" 843 | ] 844 | }, 845 | { 846 | "cell_type": "code", 847 | "execution_count": 88, 848 | "metadata": {}, 849 | "outputs": [], 850 | "source": [ 851 | "model=load_model('model_resnet50.h5')" 852 | ] 853 | }, 854 | { 855 | "cell_type": "code", 856 | "execution_count": 89, 857 | "metadata": {}, 858 | "outputs": [], 859 | "source": [ 860 | "#img_data" 861 | ] 862 | }, 863 | { 864 | "cell_type": "code", 865 | "execution_count": 90, 866 | "metadata": {}, 867 | "outputs": [], 868 | "source": [ 869 | "img=image.load_img('Datasets/Test/lamborghini/11.jpg',target_size=(224,224))\n", 870 | "\n" 871 | ] 872 | }, 873 | { 874 | "cell_type": "code", 875 | "execution_count": 91, 876 | "metadata": {}, 877 | "outputs": [ 878 | { 879 | "data": { 880 | "text/plain": [ 881 | 
"array([[[252., 252., 252.],\n", 882 | " [252., 252., 252.],\n", 883 | " [252., 252., 252.],\n", 884 | " ...,\n", 885 | " [196., 187., 172.],\n", 886 | " [217., 208., 193.],\n", 887 | " [243., 234., 219.]],\n", 888 | "\n", 889 | " [[252., 252., 252.],\n", 890 | " [252., 252., 252.],\n", 891 | " [252., 252., 252.],\n", 892 | " ...,\n", 893 | " [245., 245., 237.],\n", 894 | " [243., 243., 235.],\n", 895 | " [242., 242., 234.]],\n", 896 | "\n", 897 | " [[252., 252., 252.],\n", 898 | " [252., 252., 252.],\n", 899 | " [252., 252., 252.],\n", 900 | " ...,\n", 901 | " [240., 249., 248.],\n", 902 | " [242., 251., 250.],\n", 903 | " [242., 251., 250.]],\n", 904 | "\n", 905 | " ...,\n", 906 | "\n", 907 | " [[189., 207., 229.],\n", 908 | " [190., 206., 229.],\n", 909 | " [190., 206., 229.],\n", 910 | " ...,\n", 911 | " [171., 180., 187.],\n", 912 | " [171., 180., 187.],\n", 913 | " [171., 180., 187.]],\n", 914 | "\n", 915 | " [[185., 206., 227.],\n", 916 | " [185., 206., 227.],\n", 917 | " [185., 206., 227.],\n", 918 | " ...,\n", 919 | " [171., 180., 187.],\n", 920 | " [171., 180., 187.],\n", 921 | " [171., 180., 187.]],\n", 922 | "\n", 923 | " [[185., 206., 227.],\n", 924 | " [185., 206., 227.],\n", 925 | " [185., 206., 227.],\n", 926 | " ...,\n", 927 | " [171., 180., 187.],\n", 928 | " [171., 180., 187.],\n", 929 | " [171., 180., 187.]]], dtype=float32)" 930 | ] 931 | }, 932 | "execution_count": 91, 933 | "metadata": {}, 934 | "output_type": "execute_result" 935 | } 936 | ], 937 | "source": [ 938 | "x=image.img_to_array(img)\n", 939 | "x" 940 | ] 941 | }, 942 | { 943 | "cell_type": "code", 944 | "execution_count": 92, 945 | "metadata": {}, 946 | "outputs": [ 947 | { 948 | "data": { 949 | "text/plain": [ 950 | "(224, 224, 3)" 951 | ] 952 | }, 953 | "execution_count": 92, 954 | "metadata": {}, 955 | "output_type": "execute_result" 956 | } 957 | ], 958 | "source": [ 959 | "x.shape" 960 | ] 961 | }, 962 | { 963 | "cell_type": "code", 964 | "execution_count": 93, 965 | "metadata": {}, 966 | "outputs": [], 967 | "source": [ 968 | "x=x/255" 969 | ] 970 | }, 971 | { 972 | "cell_type": "code", 973 | "execution_count": 94, 974 | "metadata": {}, 975 | "outputs": [ 976 | { 977 | "data": { 978 | "text/plain": [ 979 | "(1, 224, 224, 3)" 980 | ] 981 | }, 982 | "execution_count": 94, 983 | "metadata": {}, 984 | "output_type": "execute_result" 985 | } 986 | ], 987 | "source": [ 988 | "x=np.expand_dims(x,axis=0)\n", 989 | "img_data=preprocess_input(x)\n", 990 | "img_data.shape" 991 | ] 992 | }, 993 | { 994 | "cell_type": "code", 995 | "execution_count": 95, 996 | "metadata": {}, 997 | "outputs": [ 998 | { 999 | "data": { 1000 | "text/plain": [ 1001 | "array([[2.2220233e-10, 4.6077218e-07, 9.9999952e-01]], dtype=float32)" 1002 | ] 1003 | }, 1004 | "execution_count": 95, 1005 | "metadata": {}, 1006 | "output_type": "execute_result" 1007 | } 1008 | ], 1009 | "source": [ 1010 | "model.predict(img_data)" 1011 | ] 1012 | }, 1013 | { 1014 | "cell_type": "code", 1015 | "execution_count": 96, 1016 | "metadata": {}, 1017 | "outputs": [], 1018 | "source": [ 1019 | "a=np.argmax(model.predict(img_data), axis=1)" 1020 | ] 1021 | }, 1022 | { 1023 | "cell_type": "code", 1024 | "execution_count": 97, 1025 | "metadata": {}, 1026 | "outputs": [ 1027 | { 1028 | "data": { 1029 | "text/plain": [ 1030 | "array([False])" 1031 | ] 1032 | }, 1033 | "execution_count": 97, 1034 | "metadata": {}, 1035 | "output_type": "execute_result" 1036 | } 1037 | ], 1038 | "source": [ 1039 | "a==1" 1040 | ] 1041 | }, 1042 | { 1043 | "cell_type": "code", 
1044 | "execution_count": 98, 1045 | "metadata": {}, 1046 | "outputs": [ 1047 | { 1048 | "data": { 1049 | "text/plain": [ 1050 | "'2.1.0'" 1051 | ] 1052 | }, 1053 | "execution_count": 98, 1054 | "metadata": {}, 1055 | "output_type": "execute_result" 1056 | } 1057 | ], 1058 | "source": [ 1059 | "import tensorflow as tf\n", 1060 | "tf.version.VERSION" 1061 | ] 1062 | }, 1063 | { 1064 | "cell_type": "code", 1065 | "execution_count": null, 1066 | "metadata": {}, 1067 | "outputs": [], 1068 | "source": [] 1069 | } 1070 | ], 1071 | "metadata": { 1072 | "kernelspec": { 1073 | "display_name": "Python 3", 1074 | "language": "python", 1075 | "name": "python3" 1076 | }, 1077 | "language_info": { 1078 | "codemirror_mode": { 1079 | "name": "ipython", 1080 | "version": 3 1081 | }, 1082 | "file_extension": ".py", 1083 | "mimetype": "text/x-python", 1084 | "name": "python", 1085 | "nbconvert_exporter": "python", 1086 | "pygments_lexer": "ipython3", 1087 | "version": "3.7.6" 1088 | } 1089 | }, 1090 | "nbformat": 4, 1091 | "nbformat_minor": 2 1092 | } 1093 | -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/app.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | """ 3 | Created on Thu Jun 11 22:34:20 2020 4 | 5 | @author: Krish Naik 6 | """ 7 | 8 | from __future__ import division, print_function 9 | # coding=utf-8 10 | import sys 11 | import os 12 | import glob 13 | import re 14 | import numpy as np 15 | import keras 16 | 17 | # Keras 18 | from tensorflow.keras.applications.imagenet_utils import preprocess_input, decode_predictions 19 | from tensorflow.keras.models import load_model 20 | from tensorflow.keras.preprocessing import image 21 | 22 | # Flask utils 23 | from flask import Flask, redirect, url_for, request, render_template 24 | from werkzeug.utils import secure_filename 25 | #from gevent.pywsgi import WSGIServer 26 | 27 | # Define a flask app 28 | app = Flask(__name__) 29 | 30 | # Model saved with Keras model.save() 31 | MODEL_PATH ='model_resnet50.h5' 32 | 33 | # Load your trained model 34 | model = load_model('model_resnet50.h5') 35 | 36 | 37 | 38 | 39 | def model_predict(img_path, model): 40 | img = image.load_img(img_path, target_size=(224, 224)) 41 | 42 | # Preprocessing the image 43 | x = image.img_to_array(img) 44 | # x = np.true_divide(x, 255) 45 | ## Scaling 46 | x= x/255 47 | x = np.expand_dims(x, axis=0) 48 | 49 | 50 | 51 | 52 | preds = model.predict(x) 53 | preds=np.argmax(preds, axis=1) 54 | if preds==0: 55 | preds="The Car is Audi" 56 | elif preds==1: 57 | preds="The Car is Lamborghini" 58 | elif preds == 2: 59 | preds = "The Car is Mercedes" 60 | else: 61 | preds="Other Than Audi/Lamborghini/Mercedes" 62 | 63 | 64 | return preds 65 | 66 | 67 | @app.route('/', methods=['GET']) 68 | def index(): 69 | # Main page 70 | return render_template('index.html') 71 | 72 | 73 | @app.route('/predict', methods=['GET', 'POST']) 74 | def upload(): 75 | if request.method == 'POST': 76 | # Get the file from post request 77 | f = request.files['file'] 78 | 79 | # Save the file to ./uploads 80 | basepath = os.path.dirname(__file__) 81 | file_path = os.path.join( 82 | basepath, 'uploads', secure_filename(f.filename)) 83 | ''' 84 | try: 85 | os.mkdir(os.path.join(basepath, 'uploads')) 86 | except: 87 | pass 88 | ''' 89 | f.save(file_path) 90 | 91 | # Make prediction 92 | preds = model_predict(file_path, model) 93 | result=preds 94 | return result 95 | return None 96 | 97 | if 
__name__ == '__main__': 98 | app.run(host='0.0.0.0', port=8080, debug=True) 99 | 100 | ''' 101 | if __name__ == '__main__': 102 | # app.run(debug=True) 103 | ''' -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/model-Resnet-50-h5 (Download Link).txt: -------------------------------------------------------------------------------- 1 | Download Model.h5 File using the Below link- 2 | 3 | https://github.com/amark720/Car-Brand-Classifier-And-Deployment/blob/main/model_resnet50.h5 4 | 5 | 6 | Ps- GitHub is not allowing to upload files which is more than 25 MB to Subfolders that's why I've given the above main deployed Model link from where you can download the Model.h5 file. -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/requirements.txt: -------------------------------------------------------------------------------- 1 | 2 | 3 | jsonify 4 | requests 5 | gunicorn 6 | 7 | 8 | absl-py==0.9.0 9 | astunparse==1.6.3 10 | attrs==19.3.0 11 | backcall==0.1.0 12 | bleach==3.1.5 13 | cachetools==4.1.0 14 | certifi==2020.4.5.1 15 | chardet==3.0.4 16 | click==7.1.2 17 | colorama==0.4.3 18 | cycler==0.10.0 19 | decorator==4.4.2 20 | defusedxml==0.6.0 21 | entrypoints==0.3 22 | Flask==1.1.2 23 | Flask-Cors==3.0.8 24 | gast==0.3.3 25 | geojson==2.5.0 26 | google-auth==1.15.0 27 | google-auth-oauthlib==0.4.1 28 | google-pasta==0.2.0 29 | grpcio==1.29.0 30 | h5py==2.10.0 31 | idna==2.9 32 | importlib-metadata==1.6.0 33 | ipykernel==5.3.0 34 | ipython==7.14.0 35 | ipython-genutils==0.2.0 36 | ipywidgets==7.5.1 37 | itsdangerous==1.1.0 38 | jedi==0.17.0 39 | Jinja2==2.11.2 40 | joblib==0.15.1 41 | jsonschema==3.2.0 42 | jupyter==1.0.0 43 | jupyter-client==6.1.3 44 | jupyter-console==6.1.0 45 | jupyter-core==4.6.3 46 | Keras-Preprocessing==1.1.2 47 | kiwisolver==1.2.0 48 | lxml==4.5.1 49 | Markdown==3.2.2 50 | MarkupSafe==1.1.1 51 | matplotlib==3.2.1 52 | mistune==0.8.4 53 | nbconvert==5.6.1 54 | nbformat==5.0.6 55 | notebook==6.0.3 56 | numpy==1.18.4 57 | oauthlib==3.1.0 58 | opencv-python==4.2.0.34 59 | opt-einsum==3.2.1 60 | packaging==20.4 61 | pandas==1.0.3 62 | pandas-datareader==0.8.1 63 | pandocfilters==1.4.2 64 | parso==0.7.0 65 | pexpect==4.8.0 66 | pickleshare==0.7.5 67 | Pillow==7.1.2 68 | prometheus-client==0.7.1 69 | prompt-toolkit==3.0.5 70 | protobuf==3.8.0 71 | ptyprocess==0.6.0 72 | pyasn1==0.4.8 73 | pyasn1-modules==0.2.8 74 | Pygments==2.6.1 75 | pyparsing==2.4.7 76 | pyrsistent==0.16.0 77 | PySocks==1.7.1 78 | python-dateutil==2.8.1 79 | pytz==2020.1 80 | pywinpty==0.5.7 81 | pyzmq==19.0.1 82 | qtconsole==4.7.4 83 | QtPy==1.9.0 84 | requests-oauthlib==1.3.0 85 | rsa==4.0 86 | scikit-learn==0.23.1 87 | scipy==1.4.1 88 | seaborn==0.10.1 89 | Send2Trash==1.5.0 90 | six==1.15.0 91 | sklearn==0.0 92 | tensorboard==2.2.1 93 | tensorboard-plugin-wit==1.6.0.post3 94 | tensorflow-cpu==2.3.0 95 | tensorflow-estimator==2.2.0 96 | termcolor==1.1.0 97 | terminado==0.8.3 98 | testpath==0.4.4 99 | threadpoolctl==2.0.0 100 | tornado==6.0.4 101 | traitlets==4.3.3 102 | urllib3==1.25.9 103 | wcwidth==0.1.9 104 | webencodings==0.5.1 105 | Werkzeug==1.0.1 106 | widgetsnbextension==3.5.1 107 | wincertstore==0.2 108 | wrapt==1.12.1 109 | zipp==3.1.0 110 | -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/static/css/main.css: -------------------------------------------------------------------------------- 1 | 
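
# --- Usage note (not part of the repository) ---
# A minimal sketch of how a client could call the running Flask app's /predict
# endpoint from Python. It assumes the server is running locally on port 8080
# (as configured above) and that 'car.jpg' is a hypothetical local image file.
#
# import requests
#
# # POST the image as multipart form data under the 'file' field,
# # matching request.files['file'] in the upload() view above.
# with open('car.jpg', 'rb') as f:
#     resp = requests.post('http://localhost:8080/predict',
#                          files={'file': ('car.jpg', f, 'image/jpeg')})
#
# # The endpoint returns the prediction string directly, e.g. "The Car is Audi".
# print(resp.text)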
.img-preview { 2 | width: 256px; 3 | height: 256px; 4 | position: relative; 5 | border: 5px solid #F8F8F8; 6 | box-shadow: 0px 2px 4px 0px rgba(0, 0, 0, 0.1); 7 | margin-top: 1em; 8 | margin-bottom: 1em; 9 | } 10 | 11 | 12 | .img-preview>div { 13 | width: 100%; 14 | height: 100%; 15 | background-size: 256px 256px; 16 | background-repeat: no-repeat; 17 | background-position: center; 18 | } 19 | 20 | input[type="file"] { 21 | display: none; 22 | } 23 | 24 | .upload-label{ 25 | display: inline-block; 26 | padding: 12px 30px; 27 | background: #39D2B4; 28 | color: #fff; 29 | font-size: 1em; 30 | transition: all .4s; 31 | cursor: pointer; 32 | } 33 | 34 | .upload-label:hover{ 35 | background: #34495E; 36 | color: #39D2B4; 37 | } 38 | 39 | .loader { 40 | border: 8px solid #f3f3f3; /* Light grey */ 41 | border-top: 8px solid #3498db; /* Blue */ 42 | border-radius: 50%; 43 | width: 50px; 44 | height: 50px; 45 | animation: spin 1s linear infinite; 46 | } 47 | 48 | @keyframes spin { 49 | 0% { transform: rotate(0deg); } 50 | 100% { transform: rotate(360deg); } 51 | } -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/static/js/main.js: -------------------------------------------------------------------------------- 1 | $(document).ready(function () { 2 | // Init 3 | $('.image-section').hide(); 4 | $('.loader').hide(); 5 | $('#result').hide(); 6 | 7 | // Upload Preview 8 | function readURL(input) { 9 | if (input.files && input.files[0]) { 10 | var reader = new FileReader(); 11 | reader.onload = function (e) { 12 | $('#imagePreview').css('background-image', 'url(' + e.target.result + ')'); 13 | $('#imagePreview').hide(); 14 | $('#imagePreview').fadeIn(650); 15 | } 16 | reader.readAsDataURL(input.files[0]); 17 | } 18 | } 19 | $("#imageUpload").change(function () { 20 | $('.image-section').show(); 21 | $('#btn-predict').show(); 22 | $('#result').text(''); 23 | $('#result').hide(); 24 | readURL(this); 25 | }); 26 | 27 | // Predict 28 | $('#btn-predict').click(function () { 29 | var form_data = new FormData($('#upload-file')[0]); 30 | 31 | // Show loading animation 32 | $(this).hide(); 33 | $('.loader').show(); 34 | 35 | // Make prediction by calling api /predict 36 | $.ajax({ 37 | type: 'POST', 38 | url: '/predict', 39 | data: form_data, 40 | contentType: false, 41 | cache: false, 42 | processData: false, 43 | async: true, 44 | success: function (data) { 45 | // Get and display the result 46 | $('.loader').hide(); 47 | $('#result').fadeIn(600); 48 | $('#result').text(' Result: ' + data); 49 | console.log('Success!'); 50 | }, 51 | }); 52 | }); 53 | 54 | }); 55 | -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/templates/base.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | Car Brand Classifier! 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 45 |
46 |
{% block content %}{% endblock %}
47 |
48 | 49 | 50 |
51 | 52 | 53 |













54 | Deployed by: Amar Kumar 55 |
56 |
57 | 58 | -------------------------------------------------------------------------------- /Car Brand Classifier And Deployment/templates/index.html: -------------------------------------------------------------------------------- 1 | {% extends "base.html" %} {% block content %} 2 | 3 |

Find Your Car Brand by Simply uploading a Car Photo!

4 | 5 |
6 |
7 | 10 | 11 |
12 | 13 | 22 | 23 | 24 | 25 |

26 | 27 |

28 | 29 |
30 | 31 | {% endblock %} -------------------------------------------------------------------------------- /Cat Dog Images Classifier (CNN + Keras)/Cat_Dog_Classifier_Using_CNN.ipynb: -------------------------------------------------------------------------------- 1 | {"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"name":"Convolutional Neural Network","provenance":[],"collapsed_sections":[],"toc_visible":true,"authorship_tag":"ABX9TyN4RwM22jdD+NwpsDagcktL"},"kernelspec":{"name":"python3","display_name":"Python 3"}},"cells":[{"cell_type":"markdown","metadata":{"id":"3DR-eO17geWu","colab_type":"text"},"source":["# Convolutional Neural Network"]},{"cell_type":"markdown","metadata":{"id":"EMefrVPCg-60","colab_type":"text"},"source":["### Importing the libraries"]},{"cell_type":"code","metadata":{"id":"sCV30xyVhFbE","colab_type":"code","colab":{"base_uri":"https://localhost:8080/","height":34},"outputId":"41e39496-ad7b-45be-8cb5-5ae492405174","executionInfo":{"status":"ok","timestamp":1586435320041,"user_tz":-240,"elapsed":2561,"user":{"displayName":"Hadelin de Ponteves","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GhEuXdT7eQweUmRPW8_laJuPggSK6hfvpl5a6WBaA=s64","userId":"15047218817161520419"}}},"source":["import tensorflow as tf\n","from keras.preprocessing.image import ImageDataGenerator"],"execution_count":1,"outputs":[{"output_type":"stream","text":["Using TensorFlow backend.\n"],"name":"stderr"}]},{"cell_type":"code","metadata":{"id":"FIleuCAjoFD8","colab_type":"code","colab":{"base_uri":"https://localhost:8080/","height":34},"outputId":"9f4bbca7-a8c6-4a14-8354-82c989248f45","executionInfo":{"status":"ok","timestamp":1586435320042,"user_tz":-240,"elapsed":2558,"user":{"displayName":"Hadelin de Ponteves","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GhEuXdT7eQweUmRPW8_laJuPggSK6hfvpl5a6WBaA=s64","userId":"15047218817161520419"}}},"source":["tf.__version__"],"execution_count":2,"outputs":[{"output_type":"execute_result","data":{"text/plain":["'2.2.0-rc2'"]},"metadata":{"tags":[]},"execution_count":2}]},{"cell_type":"markdown","metadata":{"id":"oxQxCBWyoGPE","colab_type":"text"},"source":["## Part 1 - Data Preprocessing"]},{"cell_type":"markdown","metadata":{"id":"y8K74-1foOic","colab_type":"text"},"source":["### Generating images for the Training set"]},{"cell_type":"code","metadata":{"id":"OlH2WYQ5ocVO","colab_type":"code","colab":{}},"source":["train_datagen = ImageDataGenerator(rescale = 1./255,\n"," shear_range = 0.2,\n"," zoom_range = 0.2,\n"," horizontal_flip = True)"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"LXXei7qHornJ","colab_type":"text"},"source":["### Generating images for the Test set"]},{"cell_type":"code","metadata":{"id":"T9It49laowGX","colab_type":"code","colab":{}},"source":["test_datagen = ImageDataGenerator(rescale = 1./255)"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"MvE-heJNo3GG","colab_type":"text"},"source":["### Creating the Training set"]},{"cell_type":"code","metadata":{"id":"0koUcJMJpEBD","colab_type":"code","colab":{"base_uri":"https://localhost:8080/","height":34},"outputId":"9d177ed1-04eb-4b72-a6f6-d9da3528fb20","executionInfo":{"status":"ok","timestamp":1586435320043,"user_tz":-240,"elapsed":2544,"user":{"displayName":"Hadelin de Ponteves","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GhEuXdT7eQweUmRPW8_laJuPggSK6hfvpl5a6WBaA=s64","userId":"15047218817161520419"}}},"source":["training_set = train_datagen.flow_from_directory('dataset/training_set',\n"," 
target_size = (64, 64),\n"," batch_size = 32,\n"," class_mode = 'binary')"],"execution_count":5,"outputs":[{"output_type":"stream","text":["Found 334 images belonging to 3 classes.\n"],"name":"stdout"}]},{"cell_type":"markdown","metadata":{"id":"mrCMmGw9pHys","colab_type":"text"},"source":["### Creating the Test set"]},{"cell_type":"code","metadata":{"id":"SH4WzfOhpKc3","colab_type":"code","colab":{"base_uri":"https://localhost:8080/","height":34},"outputId":"ea549ecd-e7b7-408c-df58-cbbf77fb58db","executionInfo":{"status":"ok","timestamp":1586435320044,"user_tz":-240,"elapsed":2543,"user":{"displayName":"Hadelin de Ponteves","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GhEuXdT7eQweUmRPW8_laJuPggSK6hfvpl5a6WBaA=s64","userId":"15047218817161520419"}}},"source":["test_set = test_datagen.flow_from_directory('dataset/test_set',\n"," target_size = (64, 64),\n"," batch_size = 32,\n"," class_mode = 'binary')"],"execution_count":6,"outputs":[{"output_type":"stream","text":["Found 334 images belonging to 3 classes.\n"],"name":"stdout"}]},{"cell_type":"markdown","metadata":{"id":"af8O4l90gk7B","colab_type":"text"},"source":["## Part 2 - Building the CNN"]},{"cell_type":"markdown","metadata":{"id":"ces1gXY2lmoX","colab_type":"text"},"source":["### Initialising the CNN"]},{"cell_type":"code","metadata":{"id":"SAUt4UMPlhLS","colab_type":"code","colab":{}},"source":["cnn = tf.keras.models.Sequential()"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"u5YJj_XMl5LF","colab_type":"text"},"source":["### Step 1 - Convolution"]},{"cell_type":"code","metadata":{"id":"XPzPrMckl-hV","colab_type":"code","colab":{}},"source":["cnn.add(tf.keras.layers.Conv2D(filters=32, kernel_size=3, padding=\"same\", activation=\"relu\", input_shape=[64, 64, 3]))"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"tf87FpvxmNOJ","colab_type":"text"},"source":["### Step 2 - Pooling"]},{"cell_type":"code","metadata":{"id":"ncpqPl69mOac","colab_type":"code","colab":{}},"source":["cnn.add(tf.keras.layers.MaxPool2D(pool_size=2, strides=2, padding='valid'))"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"xaTOgD8rm4mU","colab_type":"text"},"source":["### Adding a second convolutional layer"]},{"cell_type":"code","metadata":{"id":"i_-FZjn_m8gk","colab_type":"code","colab":{}},"source":["cnn.add(tf.keras.layers.Conv2D(filters=32, kernel_size=3, padding=\"same\", activation=\"relu\"))\n","cnn.add(tf.keras.layers.MaxPool2D(pool_size=2, strides=2, padding='valid'))"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"tmiEuvTunKfk","colab_type":"text"},"source":["### Step 3 - Flattening"]},{"cell_type":"code","metadata":{"id":"6AZeOGCvnNZn","colab_type":"code","colab":{}},"source":["cnn.add(tf.keras.layers.Flatten())"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"dAoSECOm203v","colab_type":"text"},"source":["### Step 4 - Full Connection"]},{"cell_type":"code","metadata":{"id":"8GtmUlLd26Nq","colab_type":"code","colab":{}},"source":["cnn.add(tf.keras.layers.Dense(units=128, activation='relu'))"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"yTldFvbX28Na","colab_type":"text"},"source":["### Step 5 - Output Layer"]},{"cell_type":"code","metadata":{"id":"1p_Zj1Mc3Ko_","colab_type":"code","colab":{}},"source":["cnn.add(tf.keras.layers.Dense(units=1, 
activation='sigmoid'))"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"D6XkI90snSDl","colab_type":"text"},"source":["## Part 3 - Training the CNN"]},{"cell_type":"markdown","metadata":{"id":"vfrFQACEnc6i","colab_type":"text"},"source":["### Compiling the CNN"]},{"cell_type":"code","metadata":{"id":"NALksrNQpUlJ","colab_type":"code","colab":{}},"source":["cnn.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])"],"execution_count":0,"outputs":[]},{"cell_type":"markdown","metadata":{"id":"ehS-v3MIpX2h","colab_type":"text"},"source":["### Training the CNN on the Training set and evaluating it on the Test set"]},{"cell_type":"code","metadata":{"id":"XUj1W4PJptta","colab_type":"code","colab":{"base_uri":"https://localhost:8080/","height":924},"outputId":"3de830db-5a3c-41ea-c318-3dd55b8099aa","executionInfo":{"status":"ok","timestamp":1586438210865,"user_tz":-240,"elapsed":2893328,"user":{"displayName":"Hadelin de Ponteves","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GhEuXdT7eQweUmRPW8_laJuPggSK6hfvpl5a6WBaA=s64","userId":"15047218817161520419"}}},"source":["cnn.fit_generator(training_set,\n"," steps_per_epoch = 334,\n"," epochs = 25,\n"," validation_data = test_set,\n"," validation_steps = 334)"],"execution_count":15,"outputs":[{"output_type":"stream","text":["WARNING:tensorflow:From :5: Model.fit_generator (from tensorflow.python.keras.engine.training) is deprecated and will be removed in a future version.\n","Instructions for updating:\n","Please use Model.fit, which supports generators.\n","Epoch 1/25\n","334/334 [==============================] - 116s 346ms/step - loss: -37484704.0000 - accuracy: 0.4987 - val_loss: -218133216.0000 - val_accuracy: 0.4999\n","Epoch 2/25\n","334/334 [==============================] - 115s 343ms/step - loss: -1530877952.0000 - accuracy: 0.5002 - val_loss: -4208331776.0000 - val_accuracy: 0.5004\n","Epoch 3/25\n","334/334 [==============================] - 114s 342ms/step - loss: -10730311680.0000 - accuracy: 0.4992 - val_loss: -21080999936.0000 - val_accuracy: 0.5000\n","Epoch 4/25\n","334/334 [==============================] - 114s 343ms/step - loss: -38008516608.0000 - accuracy: 0.4999 - val_loss: -62359158784.0000 - val_accuracy: 0.4993\n","Epoch 5/25\n","334/334 [==============================] - 114s 342ms/step - loss: -95429894144.0000 - accuracy: 0.5003 - val_loss: -141409304576.0000 - val_accuracy: 0.5004\n","Epoch 6/25\n","334/334 [==============================] - 114s 342ms/step - loss: -195819356160.0000 - accuracy: 0.5002 - val_loss: -270405451776.0000 - val_accuracy: 0.5005\n","Epoch 7/25\n","334/334 [==============================] - 114s 343ms/step - loss: -350490820608.0000 - accuracy: 0.4992 - val_loss: -460973342720.0000 - val_accuracy: 0.4995\n","Epoch 8/25\n","334/334 [==============================] - 115s 343ms/step - loss: -567733321728.0000 - accuracy: 0.5008 - val_loss: -721733025792.0000 - val_accuracy: 0.5005\n","Epoch 9/25\n","334/334 [==============================] - 119s 357ms/step - loss: -871657046016.0000 - accuracy: 0.4992 - val_loss: -1073665343488.0000 - val_accuracy: 0.4997\n","Epoch 10/25\n","334/334 [==============================] - 115s 344ms/step - loss: -1252980817920.0000 - accuracy: 0.5005 - val_loss: -1517686751232.0000 - val_accuracy: 0.5002\n","Epoch 11/25\n","334/334 [==============================] - 115s 345ms/step - loss: -1743530622976.0000 - accuracy: 0.5000 - val_loss: -2079168659456.0000 - val_accuracy: 0.4998\n","Epoch 
12/25\n","334/334 [==============================] - 115s 345ms/step - loss: -2341885968384.0000 - accuracy: 0.5000 - val_loss: -2746375471104.0000 - val_accuracy: 0.5003\n","Epoch 13/25\n","334/334 [==============================] - 115s 344ms/step - loss: -3043618455552.0000 - accuracy: 0.5000 - val_loss: -3542708387840.0000 - val_accuracy: 0.4995\n","Epoch 14/25\n","334/334 [==============================] - 115s 343ms/step - loss: -3882742448128.0000 - accuracy: 0.5001 - val_loss: -4493544521728.0000 - val_accuracy: 0.4996\n","Epoch 15/25\n","334/334 [==============================] - 121s 361ms/step - loss: -4904350908416.0000 - accuracy: 0.4998 - val_loss: -5572023812096.0000 - val_accuracy: 0.4997\n","Epoch 16/25\n","334/334 [==============================] - 115s 344ms/step - loss: -6028249268224.0000 - accuracy: 0.4999 - val_loss: -6827078057984.0000 - val_accuracy: 0.5000\n","Epoch 17/25\n","334/334 [==============================] - 115s 345ms/step - loss: -7348382859264.0000 - accuracy: 0.5007 - val_loss: -8296905310208.0000 - val_accuracy: 0.4998\n","Epoch 18/25\n","334/334 [==============================] - 115s 345ms/step - loss: -8830923571200.0000 - accuracy: 0.4989 - val_loss: -9900669272064.0000 - val_accuracy: 0.5004\n","Epoch 19/25\n","334/334 [==============================] - 115s 345ms/step - loss: -10479195914240.0000 - accuracy: 0.5007 - val_loss: -11772465512448.0000 - val_accuracy: 0.5003\n","Epoch 20/25\n","334/334 [==============================] - 116s 346ms/step - loss: -12438010331136.0000 - accuracy: 0.5002 - val_loss: -13772319096832.0000 - val_accuracy: 0.4996\n","Epoch 21/25\n","334/334 [==============================] - 115s 345ms/step - loss: -14471482310656.0000 - accuracy: 0.4999 - val_loss: -15963112079360.0000 - val_accuracy: 0.4998\n","Epoch 22/25\n","334/334 [==============================] - 115s 345ms/step - loss: -16802979512320.0000 - accuracy: 0.4998 - val_loss: -18570987700224.0000 - val_accuracy: 0.5004\n","Epoch 23/25\n","334/334 [==============================] - 115s 345ms/step - loss: -19371355275264.0000 - accuracy: 0.4995 - val_loss: -21191286849536.0000 - val_accuracy: 0.4998\n","Epoch 24/25\n","334/334 [==============================] - 116s 346ms/step - loss: -22239346950144.0000 - accuracy: 0.5005 - val_loss: -24250689781760.0000 - val_accuracy: 0.5002\n","Epoch 25/25\n","334/334 [==============================] - 115s 345ms/step - loss: -25315573235712.0000 - accuracy: 0.5000 - val_loss: -27645225992192.0000 - val_accuracy: 0.4999\n"],"name":"stdout"},{"output_type":"execute_result","data":{"text/plain":[""]},"metadata":{"tags":[]},"execution_count":15}]}]} -------------------------------------------------------------------------------- /Cat Dog Images Classifier (CNN + Keras)/Cat_Dog_Classifier_Using_CNN.py: -------------------------------------------------------------------------------- 1 | # Convolutional Neural Network 2 | 3 | # Installing Theano 4 | # pip install --upgrade --no-deps git+git://github.com/Theano/Theano.git 5 | 6 | # Installing Tensorflow 7 | # Install Tensorflow from the website: https://www.tensorflow.org/versions/r0.12/get_started/os_setup.html 8 | 9 | # Installing Keras 10 | # pip install --upgrade keras 11 | 12 | # Part 1 - Building the CNN 13 | 14 | # Importing the Keras libraries and packages 15 | from keras.models import Sequential 16 | from keras.layers import Convolution2D 17 | from keras.layers import MaxPooling2D 18 | from keras.layers import Flatten 19 | from keras.layers import Dense 20 | 21 | # 
Initialising the CNN 22 | classifier = Sequential() 23 | 24 | # Step 1 - Convolution 25 | classifier.add(Convolution2D(32, 3, 3, input_shape = (64, 64, 3), activation = 'relu')) 26 | 27 | # Step 2 - Pooling 28 | classifier.add(MaxPooling2D(pool_size = (2, 2))) 29 | 30 | # Adding a second convolutional layer 31 | classifier.add(Convolution2D(32, 3, 3, activation = 'relu')) 32 | classifier.add(MaxPooling2D(pool_size = (2, 2))) 33 | 34 | # Step 3 - Flattening 35 | classifier.add(Flatten()) 36 | 37 | # Step 4 - Full connection 38 | classifier.add(Dense(output_dim = 128, activation = 'relu')) 39 | classifier.add(Dense(output_dim = 1, activation = 'sigmoid')) 40 | 41 | # Compiling the CNN 42 | classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy']) 43 | 44 | # Part 2 - Fitting the CNN to the images 45 | 46 | from keras.preprocessing.image import ImageDataGenerator 47 | 48 | train_datagen = ImageDataGenerator(rescale = 1./255, 49 | shear_range = 0.2, 50 | zoom_range = 0.2, 51 | horizontal_flip = True) 52 | 53 | test_datagen = ImageDataGenerator(rescale = 1./255) 54 | 55 | training_set = train_datagen.flow_from_directory('dataset/training_set', 56 | target_size = (64, 64), 57 | batch_size = 32, 58 | class_mode = 'binary') 59 | 60 | test_set = test_datagen.flow_from_directory('dataset/test_set', 61 | target_size = (64, 64), 62 | batch_size = 32, 63 | class_mode = 'binary') 64 | 65 | classifier.fit_generator(training_set, 66 | samples_per_epoch = 8000, 67 | nb_epoch = 25, 68 | validation_data = test_set, 69 | nb_val_samples = 2000) -------------------------------------------------------------------------------- /Cat Dog Images Classifier (CNN + Keras)/DataSet/How to Download Dataset....txt: -------------------------------------------------------------------------------- 1 | The dataset size is too large to upload. However I can upload that dataset here using GitBash but I dont want to OverLoad my Repo with Large Files. You can Download the Entire code and Dataset from This Repository-> https://github.com/MonicaGS/Machine-Learning-A-Z 2 | 3 | Just download the whole code from the repository link given above.. 4 | Then navigate to folder "Part 8- Deep Learning" -> " Section 40 - Convolutional Neural Networks (CNN)/Convolutional_Neural_Networks " then Take the dataset folder from there which will include a total of 10000 images of dogs and cats. -------------------------------------------------------------------------------- /Cat Dog Images Classifier (CNN + Keras)/Image.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Cat Dog Images Classifier (CNN + Keras)/Image.gif -------------------------------------------------------------------------------- /Cat Dog Images Classifier (CNN + Keras)/Readme.md: -------------------------------------------------------------------------------- 1 | ## Cat & Dog Classifier using Convolution Neural Network. 2 | ![Python 3.9](https://img.shields.io/badge/Python-3.9-brightgreen.svg) ![Problem Kaggle](https://img.shields.io/badge/Problem-Vision-blue.svg) ![Problem Kaggle](https://img.shields.io/badge/Data-Kaggle-orange.svg) 3 | 4 | ### Problem statement : 5 | 6 | In this Project we are implementing Convolution Neural Network(CNN) Classifier for Classifying images of dogs and cats. The Total number of images available for training is 8,000 and final testing is done on 2,000 images. 
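# --- Illustrative addition (a sketch, not part of the original script) ---
# After training, the fitted classifier can be used on a single image. The
# snippet below assumes a hypothetical test image at
# 'dataset/single_prediction/cat_or_dog_1.jpg' and reuses the 64x64 input
# size and binary class mode defined above.
#
# import numpy as np
# from keras.preprocessing import image
#
# # Load and preprocess one image to match the network's input shape.
# test_image = image.load_img('dataset/single_prediction/cat_or_dog_1.jpg',
#                             target_size=(64, 64))
# test_image = image.img_to_array(test_image)
# test_image = np.expand_dims(test_image, axis=0)  # add batch dimension
#
# # class_indices maps folder names to label indices (e.g. {'cats': 0, 'dogs': 1}).
# result = classifier.predict(test_image)
# prediction = 'dog' if result[0][0] == 1 else 'cat'
# print(prediction)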
7 | 8 | ### Dependencies 9 | * Jupyter notebook 10 | * Tensorflow 2.10 11 | * Python 3.7+ 12 | * Matplotlib 13 | * Scikit-Learn 14 | * Pandas 15 | 16 |

17 | 18 | ### Don't forget to ⭐ the repository, if it helped you in any way. 19 | 20 | #### Feel Free to contact me at➛ databoyamar@gmail.com for any help related to Projects in this Repository! 21 | -------------------------------------------------------------------------------- /Covid19 FaceMask Detector (CNN & OpenCV)/Readme.md: -------------------------------------------------------------------------------- 1 | # Covid19 FaceMask Detector using CNN & OpenCV. 2 | 3 | 4 |
5 |

A Face Mask Detection system built with OpenCV and Keras/TensorFlow, using Deep Learning and Computer Vision concepts to detect face masks in static images as well as in video streams.

6 |
7 | 8 |                                     9 | 10 | ## Live Demo: 11 |

12 | 13 | 14 | 15 | ## :innocent: Motivation 16 | In the present scenario due to Covid-19, there are no efficient face mask detection applications, which are now in high demand for means of transportation, densely populated areas, residential districts, large-scale manufacturers and other enterprises to ensure safety. Also, the absence of large datasets of __‘with_mask’__ images has made this task more cumbersome and challenging. 17 | 18 | 19 | ## :star: Features 20 | 21 | This system can be used in real-time applications which require face-mask detection for safety purposes due to the outbreak of Covid-19. This project can be integrated with embedded systems for application in airports, railway stations, offices, schools, and public places to ensure that public safety guidelines are followed. 22 | 23 | ## :file_folder: Dataset 24 | The dataset used can be downloaded here - [Click to Download](https://www.kaggle.com/prithwirajmitra/covid-face-mask-detection-dataset) 25 | 26 | This dataset consists of __1006 images__ belonging to two classes: 27 | * __with_mask: 500 images__ 28 | * __without_mask: 506 images__ 29 | 30 | 31 | 32 | ## 🚀 Installation 33 | 1. Download the files in this repository and extract them. 34 | 2. Run the Face_Mask_Detection.ipynb file first using Google Colab:-
35 | * Colab File link - https://colab.research.google.com/drive/1rX32L-EHFvdtulPbVlwllBve8bdKwC_m#scrollTo=pO9U0q_KNDsF 36 | 37 | 3. Running the above .ipynb file will generate a Model.h5 file. 38 | 4. Download that Model.h5 file from Colab to your local machine. 39 | 5. Now run the Mask.py file. 40 | 6. Done. 41 | 42 | Note: Make sure that you're using the same TensorFlow and Keras versions on your local machine that you're using on Google Colab, otherwise you'll get errors.
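Once Model.h5 is on your local machine, a quick sanity check on a single photo looks roughly like the sketch below. This snippet is not from the original repo: it assumes the notebook trained on 150x150 RGB inputs (the size mask.py uses) and that class index 0 means "with mask"; verify both against your own training run.

```python
import numpy as np
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing.image import load_img, img_to_array

model = load_model('model.h5')

# Load one repo sample image at the assumed training input size (150x150)
img = img_to_array(load_img('man-mask-protective.jpg', target_size=(150, 150)))
pred = model.predict(np.expand_dims(img, axis=0))

# Assumed label mapping: sigmoid output near 0 -> mask, near 1 -> no mask
print('Mask' if pred[0][0] < 0.5 else 'No Mask')
```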
43 | 44 | ## :key: Results 45 | 46 | #### Our model gave 92% accuracy for Face Mask Detection after training via tensorflow==2.3.0 47 | The model can be improved further with hyperparameter tuning. 48 | 49 | ![](https://github.com/chandrikadeb7/Face-Mask-Detection/blob/master/Readme_images/Screenshot%202020-06-01%20at%209.48.27%20PM.png) 50 | 51 | #### We got the following accuracy/loss training curve plot 52 | ![](https://github.com/chandrikadeb7/Face-Mask-Detection/blob/master/plot.png) 53 | 54 | ## :clap: And it's done! 55 | Feel free to mail me for any doubts/queries 56 | :email: amark720@gmail.com 57 | 58 | ## :heart: Owner 59 | Made with :heart:&nbsp; by [Amar Kumar](https://github.com/amark720) 60 | 61 | 62 | -------------------------------------------------------------------------------- /Covid19 FaceMask Detector (CNN & OpenCV)/man-mask-protective.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Covid19 FaceMask Detector (CNN & OpenCV)/man-mask-protective.jpg -------------------------------------------------------------------------------- /Covid19 FaceMask Detector (CNN & OpenCV)/mask.py: -------------------------------------------------------------------------------- 1 | import cv2 2 | import os 3 | import numpy as np 4 | import tensorflow as tf 5 | from tensorflow.keras.models import load_model 6 | from tensorflow.keras.preprocessing.image import load_img, img_to_array 7 | print(tf.version.VERSION) 8 | 9 | model = load_model('model.h5') 10 | 11 | img_width, img_height = 150, 150 12 | 13 | face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml') 14 | 15 | cap = cv2.VideoCapture('video.mp4') 16 | 17 | os.makedirs('input', exist_ok=True)  # cv2.imwrite fails silently if this folder is missing 18 | img_count_full = 0 19 | 20 | # Overlay text/drawing settings 21 | font = cv2.FONT_HERSHEY_SIMPLEX 22 | org = (1,1) 23 | class_label = '' 24 | fontScale = 1 25 | color = (0,0,255) 26 | thickness = 2 27 | 28 | while True: 29 | img_count_full += 1 30 | response, color_img = cap.read() 31 | 32 | if response == False: 33 | break 34 | 35 | # Downscale each frame to 50% for faster processing 36 | scale = 50 37 | width = int(color_img.shape[1]*scale/100) 38 | height = int(color_img.shape[0]*scale/100) 39 | dim = (width, height) 40 | 41 | color_img = cv2.resize(color_img, dim, interpolation=cv2.INTER_AREA) 42 | 43 | # Haar cascades operate on grayscale images 44 | gray_img = cv2.cvtColor(color_img, cv2.COLOR_BGR2GRAY) 45 | 46 | faces = face_cascade.detectMultiScale(gray_img, 1.1, 6) 47 | 48 | img_count = 0 49 | for (x,y,w,h) in faces: 50 | org = (x+20, y+85) 51 | img_count += 1 52 | # Crop the detected face, save it, then reload it at the model's input size 53 | color_face = color_img[y:y+h, x:x+w] 54 | cv2.imwrite('input/%d%dface.jpg'%(img_count_full,img_count), color_face) 55 | img = load_img('input/%d%dface.jpg'%(img_count_full,img_count), target_size=(img_width,img_height)) 56 | img = img_to_array(img) 57 | img = np.expand_dims(img, axis=0) 58 | prediction = model.predict(img) 59 | 60 | # Sigmoid output: values below 0.5 map to class 0 ('with mask') 61 | if prediction[0][0] < 0.5: 62 | class_label = "Mask" 63 | color = (0,255,0) 64 | else: 65 | class_label = "No Mask" 66 | color = (0,0,255) 67 | 68 | cv2.rectangle(color_img, (x,y), (x+w,y+h), (255,0,0), 3) 69 | cv2.putText(color_img, class_label, org, font, fontScale, color, thickness, cv2.LINE_AA) 70 | 71 | cv2.imshow('Face mask detection', color_img) 72 | if cv2.waitKey(1) & 0xFF == ord('q'): 73 | break 74 | 75 | cap.release() 76 | cv2.destroyAllWindows() 77 | -------------------------------------------------------------------------------- /Covid19 FaceMask Detector (CNN & OpenCV)/video.mp4: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Covid19 FaceMask Detector (CNN & 
OpenCV)/video.mp4 -------------------------------------------------------------------------------- /Covid19 FaceMask Detector (CNN & OpenCV)/video1.mp4: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Covid19 FaceMask Detector (CNN & OpenCV)/video1.mp4 -------------------------------------------------------------------------------- /Covid19 FaceMask Detector (CNN & OpenCV)/video2.mp4: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Covid19 FaceMask Detector (CNN & OpenCV)/video2.mp4 -------------------------------------------------------------------------------- /Covid19 FaceMask Detector (CNN & OpenCV)/women with mask.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Covid19 FaceMask Detector (CNN & OpenCV)/women with mask.jpg -------------------------------------------------------------------------------- /Digit Recognizer Kaggle/Readme.md: -------------------------------------------------------------------------------- 1 | # MNIST - Digit Recognizer 2 | This is a neural network designed to train on the MNIST data set for recognizing handwritten digits. It is run with an input layer of 784 inputs (28x28 images) and 2 hidden layers, each with 15 neurons. The output layer has 10 neurons, each of which represents the probability that the image is one of the digits 0-9, in order. The sigmoid activation function is used. 3 | 4 | The data set used to train this is the MNIST data set, found here: https://www.kaggle.com/c/digit-recognizer/data. The training data set (60,000 images) is used for everything; it is split 75/25% for training and testing, respectively. 5 | 6 | ### Python Implementation 7 | Dataset- MNIST dataset 8 | Images of size 28 X 28 9 | Classify digits from 0 to 9 10 | 11 | ## Kaggle Notebook Link: 12 | https://www.kaggle.com/datawarriors/digit-recognizer-detailed-step-wise 13 | -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/README.md: -------------------------------------------------------------------------------- 1 | # Image classification with ResNet50 2 | Doing cool things with data doesn't always need to be difficult. By using ResNet-50 you don't have to start from scratch when it comes to building a classifier model and making predictions with it. This project is a beginner's guide to ResNet-50. In the following you will get a short overall introduction to ResNet-50 and a simple project showing how to use it for image classification in Python. 3 | 4 | Here I've created a program that uses ResNet50 to predict which category an image belongs to. The model builds on the existing pretrained deep learning model, i.e. ResNet-50. I've also uploaded the code to Kaggle. 5 | 6 | ### What is ResNet-50 and why use it for image classification? 7 | ResNet-50 is a pretrained deep learning model for image classification. It is a Convolutional Neural Network (CNN, or ConvNet), a class of deep neural networks most commonly applied to analyzing visual imagery. ResNet-50 is 50 layers deep and is trained on a million images of 1000 categories from the ImageNet database. 
Furthermore, the model has over 23 million trainable parameters, which indicates a deep architecture that is well suited to image recognition. Using a pretrained model is a highly effective approach compared with building one from scratch, where you would need to collect vast amounts of data and train the model yourself. Of course, there are other pretrained deep models to use such as AlexNet, GoogLeNet or VGG19, but ResNet-50 is noted for excellent generalization performance with lower error rates on recognition tasks and is therefore a useful tool to know. 8 | 9 | ## ScreenShots: 10 | 11 | ### Single Image Classification- 12 | 13 | 14 | 15 | ### Multiple Image Classification- 16 | 17 | 18 | 19 | 20 | ### Kaggle Notebook Link -> https://www.kaggle.com/datawarriors/image-classifier-using-resnet50 21 | 22 | #### Feel Free to contact me at➛ amark720@gmail.com for any help related to this Project! 23 | -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/Screenshot1.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/Screenshot1.PNG -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/Screenshot2.PNG: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/Screenshot2.PNG -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/Image1.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/Image1.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/Image3.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/Image3.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/Scooter.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/Scooter.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/banana.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/banana.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/car.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/car.jpg 
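The single-image classification shown in the screenshots above boils down to a few lines with the standard Keras `applications` API. Here is a minimal sketch (assuming TensorFlow 2.x; `images/banana.jpg` is one of the sample images in this folder):

```python
import numpy as np
from tensorflow.keras.applications.resnet50 import ResNet50, preprocess_input, decode_predictions
from tensorflow.keras.preprocessing import image

# Load ResNet-50 with its ImageNet weights (downloaded on first use)
model = ResNet50(weights='imagenet')

# ResNet-50 expects 224x224 RGB inputs with its own preprocessing
img = image.load_img('images/banana.jpg', target_size=(224, 224))
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

# Print the top-3 ImageNet labels with their probabilities
print(decode_predictions(model.predict(x), top=3)[0])
```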
-------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/image10.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/image10.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/image11.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/image11.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/image2.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/image2.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/image4.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/image4.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/image6.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/image6.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/image8.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/image8.jpg -------------------------------------------------------------------------------- /Image Classifier Using Resnet50/images/image9.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Image Classifier Using Resnet50/images/image9.jpg -------------------------------------------------------------------------------- /Keras Introduction Exploration/A Gentle Introduction to Keras.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Welcome to A Gentle Introduction to Keras " 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "This course focuses on a specific sub-field of machine learning called **predictive modeling.**\n", 15 | "\n", 16 | "Within predictive modeling is a speciality or another sub-field called **deep learning.**\n", 17 | "\n", 18 | "We will be crafting deep learning models with a library called Keras. \n", 19 | "\n", 20 | ">**Predictive modeling** is focused on developing models that make accurate predictions at the expense of explaining why predictions are made. 
\n", 21 | "\n", 22 | "You and I don't need to be able to write a binary classification model. We need to know how to use and interpret the results of the model. " 23 | ] 24 | }, 25 | { 26 | "cell_type": "markdown", 27 | "metadata": {}, 28 | "source": [ 29 | "**Where does machine learning fit into data science?**\n", 30 | "\n", 31 | "Data science is a much broader discipline. \n", 32 | "\n", 33 | "> Data Scientists take the raw data, analyse it, connect the dots and tell a story often via several visualizations. They usually have a broader range of skill-set and may not have too much depth into more than one or two. They are more on the creative side. Like an Artist. An Engineer, on the other hand, is someone who looks at the data as something they have to take in and churn out an output in some appropriate form in the most efficient way possible. The implementation details and other efficiency hacks are usually on the tip of their fingers. There can be a lot of overlap between the two but it is more like A Data Scientist is a Machine Learning Engineer but not the other way round. -- Ria Chakraborty, Data Scientist\n", 34 | "\n", 35 | "\n", 36 | "\n" 37 | ] 38 | }, 39 | { 40 | "cell_type": "markdown", 41 | "metadata": {}, 42 | "source": [ 43 | "# Step 1. Import our modules\n", 44 | "\n", 45 | "Two important points here. Firstly, the **from** means we aren't importing the entire library, only a specific module. Secondly, notice we **are** imporing the entire numpy library. \n", 46 | "\n", 47 | "> If you get a message that states: WARNING (theano.configdefaults): g++ not detected, blah... blah. Run this in your Anaconda prompt. \n", 48 | "\n", 49 | "conda install mingw libpython\n", 50 | "\n" 51 | ] 52 | }, 53 | { 54 | "cell_type": "code", 55 | "execution_count": null, 56 | "metadata": { 57 | "collapsed": false 58 | }, 59 | "outputs": [], 60 | "source": [ 61 | "from keras.models import Sequential\n", 62 | "from keras.layers import Dense\n", 63 | "import numpy" 64 | ] 65 | }, 66 | { 67 | "cell_type": "markdown", 68 | "metadata": {}, 69 | "source": [ 70 | "# Step 2. Set our random seed\n" 71 | ] 72 | }, 73 | { 74 | "cell_type": "markdown", 75 | "metadata": {}, 76 | "source": [ 77 | "Run an algorithm on a dataset and you've built a great model. Can you produce the same model again given the same data?\n", 78 | "You should be able to. It should be a requirement that is high on the list for your modeling project.\n", 79 | "\n", 80 | "> We achieve reproducibility in applied machine learning by using the exact same code, data and sequence of random numbers.\n", 81 | "\n", 82 | "Random numbers are created using a random number generator. It’s a simple program that generates a sequence of numbers that are random enough for most applications.\n", 83 | "\n", 84 | "This math function is deterministic. If it uses the same starting point called a seed number, it will give the same sequence of random numbers.\n", 85 | "\n", 86 | "Hold on... what's **deterministic** mean? \n", 87 | "\n", 88 | "> \"a deterministic algorithm is an algorithm which, given a particular input, will always produce the same output, with the underlying machine always passing through the same sequence of states\"\n", 89 | "\n", 90 | "Let's apply an English translator to this: \n", 91 | "\n", 92 | "> The **only purpose of seeding** is to make sure that you get the **exact same result** when you run this code many times on the exact same data." 
93 | ] 94 | }, 95 | { 96 | "cell_type": "code", 97 | "execution_count": null, 98 | "metadata": { 99 | "collapsed": true 100 | }, 101 | "outputs": [], 102 | "source": [ 103 | "seed = 9\n", 104 | "numpy.random.seed(seed)" 105 | ] 106 | }, 107 | { 108 | "cell_type": "markdown", 109 | "metadata": {}, 110 | "source": [ 111 | "# Step 3. Import our data set\n" 112 | ] 113 | }, 114 | { 115 | "cell_type": "markdown", 116 | "metadata": {}, 117 | "source": [ 118 | "Let's import the object called read_csv. \n", 119 | "\n", 120 | "We define a variable called filename and put our data set in it. \n", 121 | "\n", 122 | "The last line does the work. It uses the function called **read_csv** to put the contents of our data set into a variable called dataframe. " 123 | ] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "execution_count": null, 128 | "metadata": { 129 | "collapsed": false 130 | }, 131 | "outputs": [], 132 | "source": [ 133 | "from pandas import read_csv\n", 134 | "filename = 'BBCN.csv'\n", 135 | "dataframe = read_csv(filename)" 136 | ] 137 | }, 138 | { 139 | "cell_type": "markdown", 140 | "metadata": {}, 141 | "source": [ 142 | "# Step 4. Split the Output Variables\n" 143 | ] 144 | }, 145 | { 146 | "cell_type": "markdown", 147 | "metadata": {}, 148 | "source": [ 149 | "The first thing we need to do is put our data in an array. \n", 150 | "\n", 151 | "> An array is a data structure that stores values of the **same data type**. \n", 152 | "\n", 153 | "In Python, this is the main difference between arrays and lists. While Python lists can contain values of different data types, arrays in Python can only contain values of the same data type." 154 | ] 155 | }, 156 | { 157 | "cell_type": "code", 158 | "execution_count": null, 159 | "metadata": { 160 | "collapsed": true 161 | }, 162 | "outputs": [], 163 | "source": [ 164 | "array = dataframe.values" 165 | ] 166 | }, 167 | { 168 | "cell_type": "markdown", 169 | "metadata": {}, 170 | "source": [ 171 | "The code below is the trickiest part of the exercise. Now, we are assigning X and y as output variables.\n", 172 | "\n", 173 | "> That looks pretty easy but keep in mind that an array starts at 0. \n", 174 | "\n", 175 | "If you take a look at the shape of our dataframe (shape means the number of rows and columns) you can see we have 12 columns. \n", 176 | "\n", 177 | "On the X array below we are saying... include the columns from index 0 up to, but not including, index 11. \n", 178 | "\n", 179 | "On the y array below we are saying... just use the column at index **11**, the **BikeBuyer** column. \n", 180 | "\n", 181 | "> Before we split X and Y out we are going to put them in an array. \n", 182 | "\n", 183 | "\n" 184 | ] 185 | }, 186 | { 187 | "cell_type": "code", 188 | "execution_count": null, 189 | "metadata": { 190 | "collapsed": false 191 | }, 192 | "outputs": [], 193 | "source": [ 194 | "X = array[:,0:11] \n", 195 | "Y = array[:,11]" 196 | ] 197 | }, 198 | { 199 | "cell_type": "code", 200 | "execution_count": null, 201 | "metadata": { 202 | "collapsed": false 203 | }, 204 | "outputs": [], 205 | "source": [ 206 | "dataframe.head()" 207 | ] 208 | }, 209 | { 210 | "cell_type": "markdown", 211 | "metadata": {}, 212 | "source": [ 213 | "# Step 5. Build the Model\n" 214 | ] 215 | }, 216 | { 217 | "cell_type": "markdown", 218 | "metadata": {}, 219 | "source": [ 220 | "We can piece it all together by adding each layer. \n", 221 | "\n", 222 | "> The first layer has 12 neurons and expects 11 input variables. 
\n", 223 | "\n", 224 | "The second hidden layer has 8 neurons.\n", 225 | "\n", 226 | "The third hidden layer has 8 neurons. \n", 227 | "\n", 228 | "The output layer has 1 neuron to predict the class. \n", 229 | "\n", 230 | "How many hidden layers are in our model? " 231 | ] 232 | }, 233 | { 234 | "cell_type": "code", 235 | "execution_count": null, 236 | "metadata": { 237 | "collapsed": true 238 | }, 239 | "outputs": [], 240 | "source": [ 241 | "model = Sequential()\n", 242 | "model.add(Dense(12, input_dim=11, init='uniform', activation='relu'))\n", 243 | "model.add(Dense(8, init='uniform', activation='relu'))\n", 244 | "model.add(Dense(1, init='uniform', activation='sigmoid'))" 245 | ] 246 | }, 247 | { 248 | "cell_type": "markdown", 249 | "metadata": {}, 250 | "source": [ 251 | "# Step 5. Compile the Model" 252 | ] 253 | }, 254 | { 255 | "cell_type": "markdown", 256 | "metadata": {}, 257 | "source": [ 258 | "A metric is a function that is used to judge the performance of your model. Metric functions are to be supplied in the metrics parameter when a model is compiled.\n", 259 | "\n", 260 | "> Lastly, we set the cost (or loss) function to categorical_crossentropy. The (binary) cross-entropy is just the technical term for the **cost function** in logistic regression, and the categorical cross-entropy is its generalization for multi-class predictions via softmax" 261 | ] 262 | }, 263 | { 264 | "cell_type": "markdown", 265 | "metadata": {}, 266 | "source": [ 267 | "Binary learning models are models which just predict one of two outcomes: positive or negative. These models are very well suited to drive decisions, such as whether to administer a patient a certain drug or to include a lead in a targeted marketing campaign.\n", 268 | "\n", 269 | "> Accuracy is perhaps the most intuitive performance measure. **It is simply the ratio of correctly predicted observations.**\n", 270 | "\n", 271 | "Using accuracy is only good for symmetric data sets where the class distribution is 50/50 and the cost of false positives and false negatives are roughly the same. In our case our classes are balanced. " 272 | ] 273 | }, 274 | { 275 | "cell_type": "markdown", 276 | "metadata": {}, 277 | "source": [ 278 | "Whenever you train a model with your data, you are actually producing some new values (predicted) for a specific feature. However, that specific feature already has some values which are real values in the dataset. \n", 279 | "\n", 280 | "> We know the the closer the predicted values to their corresponding real values, the better the model.\n", 281 | "\n", 282 | "We are using cost function to measure **how close the predicted values are to their corresponding real values.**\n", 283 | "\n", 284 | "So, for our model we choose binary_crossentropy. " 285 | ] 286 | }, 287 | { 288 | "cell_type": "code", 289 | "execution_count": null, 290 | "metadata": { 291 | "collapsed": true 292 | }, 293 | "outputs": [], 294 | "source": [ 295 | "model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])" 296 | ] 297 | }, 298 | { 299 | "cell_type": "markdown", 300 | "metadata": {}, 301 | "source": [ 302 | "# Step 5. Fit the Model" 303 | ] 304 | }, 305 | { 306 | "cell_type": "markdown", 307 | "metadata": {}, 308 | "source": [ 309 | "**Epoch:** A full pass over all of your training data.\n", 310 | "\n", 311 | "For example, let's say you have 1213 observations. 
So an epoch concludes when it has finished a training pass over all 1213 of your observations.\n", 312 | "\n", 313 | "> What you'd expect to see from running fit on your Keras model is a decrease in loss over n epochs." 314 | ] 315 | }, 316 | { 317 | "cell_type": "markdown", 318 | "metadata": {}, 319 | "source": [ 320 | "batch_size denotes the subset size of your training sample (e.g. 100 out of 1000) which is used to train the network during its learning process. \n", 321 | "\n", 322 | "Each batch trains the network in succession, taking into account the weights updated by the application of the previous batch. \n", 323 | "\n", 324 | ">Example: if you have 1000 training examples, and your batch size is 500, then it will take 2 iterations to complete 1 epoch.\n", 325 | "\n" 326 | ] 327 | }, 328 | { 329 | "cell_type": "code", 330 | "execution_count": null, 331 | "metadata": { 332 | "collapsed": false 333 | }, 334 | "outputs": [], 335 | "source": [ 336 | "model.fit(X, Y, epochs=200, batch_size=30)" 337 | ] 338 | }, 339 | { 340 | "cell_type": "markdown", 341 | "metadata": {}, 342 | "source": [ 343 | "# Step 8. Score the Model" 344 | ] 345 | }, 346 | { 347 | "cell_type": "code", 348 | "execution_count": null, 349 | "metadata": { 350 | "collapsed": false 351 | }, 352 | "outputs": [], 353 | "source": [ 354 | "scores = model.evaluate(X, Y)\n", 355 | "print(\"%s: %.2f%%\" % (model.metrics_names[1], scores[1]*100))\n" 356 | ] 357 | } 358 | ], 359 | "metadata": { 360 | "kernelspec": { 361 | "display_name": "Python 2", 362 | "language": "python", 363 | "name": "python2" 364 | }, 365 | "language_info": { 366 | "codemirror_mode": { 367 | "name": "ipython", 368 | "version": 2 369 | }, 370 | "file_extension": ".py", 371 | "mimetype": "text/x-python", 372 | "name": "python", 373 | "nbconvert_exporter": "python", 374 | "pygments_lexer": "ipython2", 375 | "version": "2.7.13" 376 | } 377 | }, 378 | "nbformat": 4, 379 | "nbformat_minor": 2 380 | } 381 | -------------------------------------------------------------------------------- /Keras Introduction Exploration/BBCN.csv: -------------------------------------------------------------------------------- 1 | MaritalStatus,Gender,YearlyIncome,TotalChildren,NumberChildrenAtHome,EnglishEducation,HouseOwnerFlag,NumberCarsOwned,CommuteDistance,Region,Age,BikeBuyer 2 | 5,1,9,2,0,5,1,0,2,2,5,1 3 | 5,1,6,3,3,5,0,1,1,2,4,1 4 | 5,1,6,3,3,5,1,1,5,2,4,1 5 | 5,2,7,0,0,5,0,1,10,2,5,1 6 | 5,2,8,5,5,5,1,4,2,2,5,1 7 | 5,1,7,0,0,5,1,1,10,2,4,1 8 | 5,2,7,0,0,5,1,1,10,2,4,1 9 | 5,1,6,3,3,5,1,2,1,2,4,1 10 | 5,2,6,4,4,5,1,3,20,2,4,1 11 | 5,1,7,0,0,5,0,1,10,2,4,1 12 | 5,2,7,0,0,5,0,1,10,2,4,1 13 | 5,1,6,4,4,5,1,4,20,2,4,1 14 | 5,2,9.5,2,0,5,1,2,2,1,5,0 15 | 5,1,9.5,2,0,5,1,3,1,1,5,0 16 | 5,2,9.5,3,0,5,0,3,2,1,5,0 17 | 3,2,3,0,0,3,0,1,10,1,3,1 18 | 3,1,3,0,0,3,1,1,10,1,3,1 19 | 1,2,2,4,0,1,1,2,10,2,8,1 20 | 3,1,3,2,0,3,1,2,10,2,8,1 21 | 1,1,4,0,0,1,0,2,10,1,3,0 22 | 1,1,4,0,0,1,0,2,2,1,3,1 23 | 3,2,4,0,0,3,0,1,2,1,3,1 24 | 3,1,4,0,0,3,1,1,10,1,3,1 25 | 3,1,4,0,0,3,1,1,2,1,3,0 26 | 3,1,6,0,0,3,1,2,10,1,3,0 27 | 2,1,1,2,1,2,1,2,2,2,8,1 28 | 3,1,3,2,0,3,0,2,2,2,7,1 29 | 3,1,3,2,0,3,1,2,10,2,7,1 30 | 3,2,3,2,0,3,1,2,2,2,7,1 31 | 3,1,3,2,0,3,1,2,2,2,7,1 32 | 2,2,1,2,1,2,1,2,2,2,7,1 33 | 1,2,2,4,0,1,1,2,2,2,7,1 34 | 1,2,2,4,0,1,1,2,2,2,7,1 35 | 1,1,2,4,0,1,1,2,10,2,7,1 36 | 1,2,2,4,0,1,1,2,10,2,7,1 37 | 2,2,1,2,1,2,1,2,2,2,7,1 38 | 3,2,6,0,0,3,1,2,2,1,3,1 39 | 2,2,4,0,0,2,0,2,10,1,3,1 40 | 2,2,1,2,1,2,1,2,10,2,7,1 41 | 3,1,3,3,0,3,1,2,10,2,7,1 
42 | 3,1,3,0,0,3,1,2,2,1,3,1 43 | 3,2,6,0,0,3,1,2,2,1,3,1 44 | 3,2,7,0,0,3,1,2,1,1,3,1 45 | 3,1,6,0,0,3,1,2,10,1,4,0 46 | 2,1,2,2,1,2,1,2,2,2,7,1 47 | 1,1,3,3,0,1,0,2,2,2,7,0 48 | 1,2,3,3,0,1,1,2,10,2,7,1 49 | 1,2,3,3,0,1,0,2,2,2,7,1 50 | 1,1,3,3,0,1,1,2,10,2,7,1 51 | 2,2,4,0,0,2,1,2,10,1,3,0 52 | 1,1,3,3,0,1,1,2,10,2,7,1 53 | 1,1,3,3,0,1,0,2,2,2,7,0 54 | 3,2,4,2,0,3,0,2,10,2,7,1 55 | 3,2,6,0,0,3,0,2,2,1,3,1 56 | 3,2,4,2,0,3,1,2,10,2,7,1 57 | 3,1,4,2,0,3,1,2,10,2,7,1 58 | 3,2,4,3,0,3,1,2,10,2,7,1 59 | 4,1,7,2,0,4,1,2,2,2,7,1 60 | 3,1,8,2,0,3,1,2,2,2,7,1 61 | 3,2,8,2,0,3,1,2,10,2,7,0 62 | 3,1,8,2,0,3,1,2,10,2,7,1 63 | 3,1,8,2,0,3,1,2,10,2,7,1 64 | 1,1,4,0,0,1,1,2,10,1,4,1 65 | 1,2,4,0,0,1,1,2,10,1,4,1 66 | 1,1,4,0,0,1,1,2,10,1,4,1 67 | 3,2,6,0,0,3,0,2,2,1,4,0 68 | 3,2,7,0,0,3,1,2,10,1,4,0 69 | 3,1,6,0,0,3,1,2,10,1,4,0 70 | 1,2,8,2,0,1,1,2,10,2,7,0 71 | 1,2,8,2,0,1,0,2,2,2,7,1 72 | 1,1,8,2,0,1,1,2,10,2,7,1 73 | 1,2,8,2,0,1,1,2,10,2,7,0 74 | 1,2,8,2,0,1,0,2,2,2,7,1 75 | 1,2,7,2,0,1,1,2,10,2,6,0 76 | 1,1,7,2,0,1,1,2,10,2,6,0 77 | 1,2,8,2,0,1,1,2,10,2,6,1 78 | 1,1,8,2,0,1,1,2,10,2,6,1 79 | 1,2,8,2,0,1,0,2,2,2,6,1 80 | 1,2,4,0,0,1,0,2,2,1,4,0 81 | 4,1,9.5,0,0,4,1,2,1,2,6,0 82 | 4,1,9.5,0,0,4,1,2,1,2,6,1 83 | 5,2,9.5,2,2,5,1,3,5,1,5,0 84 | 4,2,9.5,0,1,4,0,3,2,1,5,1 85 | 4,2,9.5,0,1,4,1,3,2,1,5,1 86 | 2,1,8,2,0,2,0,2,2,1,6,1 87 | 1,2,6,2,0,1,0,2,10,1,6,0 88 | 3,1,7,2,1,3,1,0,1,1,6,0 89 | 3,2,7,3,2,3,0,0,1,1,6,0 90 | 5,1,8,2,1,5,1,0,5,1,6,1 91 | 5,2,8,2,1,5,1,1,5,1,6,1 92 | 3,1,9,2,0,3,1,1,5,1,6,1 93 | 3,1,9,2,0,3,1,1,10,1,6,0 94 | 5,2,9,2,2,5,1,0,2,2,5,1 95 | 4,2,9.5,0,0,4,1,0,1,2,5,1 96 | 3,1,7,1,0,3,0,1,1,2,4,0 97 | 3,1,7,1,0,3,1,1,10,2,4,1 98 | 5,1,6,1,0,5,1,1,10,2,4,1 99 | 3,1,6,1,0,3,1,1,10,2,4,1 100 | 3,2,6,1,0,3,0,1,1,2,4,0 101 | 5,1,6,1,0,5,1,1,1,2,6,1 102 | 3,2,6,1,0,3,1,1,10,2,6,1 103 | 5,2,7,0,0,5,0,1,1,2,4,1 104 | 5,2,8,5,5,5,1,4,2,2,4,0 105 | 5,2,7,0,0,5,1,1,10,2,4,1 106 | 5,1,7,0,0,5,0,1,1,2,4,1 107 | 5,2,7,0,0,5,0,1,10,2,4,1 108 | 5,1,7,0,0,5,0,1,10,2,4,1 109 | 5,2,9,1,0,5,1,1,10,2,6,1 110 | 5,2,7,0,0,5,0,2,1,2,4,1 111 | 5,1,7,0,0,5,0,2,1,2,4,1 112 | 3,1,6,1,0,3,1,1,10,2,4,1 113 | 3,2,6,1,0,3,1,1,1,2,4,1 114 | 3,2,6,1,0,3,1,1,1,2,4,1 115 | 5,1,7,5,4,5,1,2,10,2,4,0 116 | 3,2,7,5,4,3,1,2,10,2,4,0 117 | 3,1,7,5,4,3,1,2,1,2,4,0 118 | 3,1,7,5,4,3,1,2,10,2,4,0 119 | 3,2,7,5,4,3,1,3,20,2,6,1 120 | 3,1,8,1,0,3,0,1,1,2,6,0 121 | 3,1,3,2,0,3,0,2,1,2,81,0 122 | 5,2,4,2,0,5,1,1,10,2,8,1 123 | 3,1,7,5,4,3,1,3,20,2,6,0 124 | 3,1,7,5,4,3,1,3,10,2,6,0 125 | 3,1,7,5,4,3,1,3,20,2,6,0 126 | 3,1,8,1,0,3,1,1,10,2,6,1 127 | 3,2,7,1,0,3,1,1,10,2,6,0 128 | 1,1,1,2,1,1,0,2,1,2,7,1 129 | 1,2,4,0,0,1,1,1,10,1,4,0 130 | 1,2,4,0,0,1,1,1,10,1,4,0 131 | 1,2,4,0,0,1,1,1,10,1,4,1 132 | 1,2,3,0,0,1,1,2,10,1,3,0 133 | 1,2,3,0,0,1,0,2,1,1,3,0 134 | 1,2,3,0,0,1,0,2,10,1,3,1 135 | 2,2,3,0,0,2,0,2,1,1,3,0 136 | 1,1,1,5,0,1,1,2,10,2,8,1 137 | 2,1,3,0,0,2,0,2,10,1,3,1 138 | 1,2,4,0,0,1,0,2,1,1,3,1 139 | 1,2,4,0,0,1,1,2,10,1,3,0 140 | 1,2,4,0,0,1,0,2,1,1,3,0 141 | 3,2,3,2,0,3,0,2,10,2,8,0 142 | 2,1,3,0,0,2,0,2,10,1,4,0 143 | 1,2,4,0,0,1,0,2,1,1,3,0 144 | 1,1,4,0,0,1,0,2,1,1,3,0 145 | 1,1,4,0,0,1,1,2,10,1,4,0 146 | 1,1,4,0,0,1,0,2,1,1,3,1 147 | 1,2,4,0,0,1,0,2,1,1,3,1 148 | 5,2,4,2,0,5,1,2,10,2,8,0 149 | 4,1,6,2,0,4,1,1,1,2,8,1 150 | 5,1,4,2,0,5,1,2,10,2,8,0 151 | 5,1,4,2,0,5,1,2,1,2,8,0 152 | 5,1,4,2,0,5,1,2,10,2,8,0 153 | 4,2,6,2,0,4,0,1,1,2,8,1 154 | 1,1,4,0,0,1,1,2,10,1,4,1 155 | 1,2,4,0,0,1,1,2,10,1,4,0 156 | 3,2,4,0,0,3,1,1,10,1,4,1 157 | 3,1,4,0,0,3,0,1,2,1,4,1 158 | 3,2,4,0,0,3,0,1,2,1,4,1 159 | 
1,2,4,0,0,1,1,2,10,1,4,0 160 | 1,1,4,0,0,1,1,2,10,1,4,1 161 | 1,2,4,0,0,1,0,2,2,1,4,0 162 | 3,1,4,0,0,3,0,1,10,1,4,1 163 | 3,2,4,0,0,3,0,1,2,1,4,0 164 | 3,1,4,0,0,3,1,1,10,1,4,1 165 | 3,1,4,0,0,3,1,1,10,1,4,0 166 | 3,1,4,0,0,3,0,1,2,1,4,0 167 | 3,2,6,0,0,3,0,1,2,1,4,0 168 | 3,2,7,0,0,3,0,2,1,1,4,0 169 | 5,2,8,0,0,5,0,1,1,1,4,1 170 | 5,1,8,0,0,5,0,1,1,1,5,1 171 | 5,1,9,4,4,5,1,1,1,1,5,0 172 | 5,2,9,4,4,5,1,1,1,1,5,0 173 | 5,1,9.5,1,0,5,1,2,2,1,5,1 174 | 5,2,9.5,1,0,5,1,2,2,1,5,0 175 | 4,2,9.5,0,0,4,0,4,1,1,5,0 176 | 4,1,9.5,0,1,4,1,0,5,1,5,0 177 | 3,1,7,0,0,3,1,2,10,1,4,1 178 | 5,1,9,4,4,5,1,1,2,1,5,0 179 | 4,1,9.5,2,4,4,1,2,10,1,8,0 180 | 4,1,9.5,2,3,4,1,3,2,1,8,0 181 | 4,2,9.5,2,4,4,1,3,10,1,8,0 182 | 5,2,9.5,1,3,5,1,4,1,1,8,0 183 | 4,1,9.5,2,3,4,1,2,1,1,8,0 184 | 5,2,7,4,0,5,1,2,20,1,8,0 185 | 5,2,7,4,0,5,1,2,20,1,8,0 186 | 5,2,7,4,0,5,1,2,20,1,8,0 187 | 5,2,7,4,0,5,1,2,20,1,8,0 188 | 5,2,7,4,0,5,1,2,20,1,8,0 189 | 4,2,6,4,0,4,1,2,20,1,8,0 190 | 4,2,6,4,0,4,0,2,2,1,8,0 191 | 4,1,6,4,0,4,1,2,20,1,8,1 192 | 5,1,7,5,0,5,1,2,20,1,8,0 193 | 5,2,7,5,0,5,1,3,20,1,8,1 194 | 5,1,6,4,0,5,1,2,5,1,8,0 195 | 5,1,6,4,0,5,1,2,5,1,8,0 196 | 5,2,6,4,0,5,1,2,5,1,8,0 197 | 5,2,6,5,0,5,1,3,20,1,8,0 198 | 5,1,6,5,0,5,1,3,20,1,7,0 199 | 5,2,7,3,1,5,0,1,2,1,7,0 200 | 5,2,7,4,1,5,1,1,20,1,7,0 201 | 5,2,7,4,1,5,1,1,20,1,7,0 202 | 5,1,7,4,1,5,1,1,2,1,7,0 203 | 5,2,7,4,1,5,1,1,20,1,7,0 204 | 5,2,8,5,0,5,0,2,2,1,7,0 205 | 1,1,4,2,1,1,1,2,20,1,7,0 206 | 1,2,4,2,1,1,1,2,5,1,7,0 207 | 1,2,4,2,1,1,1,2,20,1,7,0 208 | 3,1,6,2,1,3,1,1,20,1,7,0 209 | 3,2,6,2,1,3,1,1,20,1,7,0 210 | 5,2,6,2,1,5,1,0,20,1,7,0 211 | 5,2,6,2,1,5,1,0,20,1,7,0 212 | 3,1,7,4,1,3,1,2,20,1,7,0 213 | 1,2,4,2,1,1,1,2,5,1,7,0 214 | 1,2,4,2,1,1,1,2,5,1,7,0 215 | 3,2,6,2,1,3,1,2,20,1,7,0 216 | 3,1,6,2,1,3,1,2,20,1,7,0 217 | 2,2,4,2,1,2,1,2,20,1,7,0 218 | 3,2,6,2,1,3,1,2,20,1,7,1 219 | 3,2,6,2,1,3,1,2,20,1,7,1 220 | 3,2,6,2,1,3,0,2,5,1,7,0 221 | 1,1,6,2,1,1,1,2,20,1,7,0 222 | 3,2,6,2,1,3,1,1,20,1,7,0 223 | 3,1,7,4,2,3,1,1,2,1,7,0 224 | 5,2,8,3,1,5,1,1,2,1,7,0 225 | 1,2,7,2,1,1,1,2,5,1,7,0 226 | 1,2,7,2,1,1,1,2,20,1,7,1 227 | 3,2,6,2,1,3,1,1,20,1,7,0 228 | 3,2,6,2,1,3,1,1,5,1,7,0 229 | 3,1,6,2,1,3,1,2,20,1,7,1 230 | 3,2,6,2,1,3,1,2,20,1,7,0 231 | 3,1,7,4,2,3,1,1,20,1,7,0 232 | 3,2,7,4,2,3,0,2,5,1,7,0 233 | 1,1,6,2,1,1,1,2,20,1,7,0 234 | 1,2,6,3,1,1,0,2,5,1,7,0 235 | 3,1,7,4,2,3,1,2,20,1,7,0 236 | 3,2,7,4,2,3,1,2,20,1,7,0 237 | 3,1,7,4,2,3,1,2,20,1,7,0 238 | 3,1,7,5,2,3,1,2,20,1,7,0 239 | 5,1,9.5,2,3,5,0,4,1,3,6,1 240 | 5,2,9.5,2,3,5,0,4,1,3,6,1 241 | 3,2,9.5,2,4,3,0,2,10,3,6,1 242 | 3,2,9.5,2,3,3,1,4,10,3,6,1 243 | 3,2,9.5,2,3,3,1,4,10,3,6,1 244 | 3,1,9.5,2,4,3,1,3,10,3,6,1 245 | 3,2,9.5,2,3,3,1,3,1,3,6,1 246 | 5,2,9.5,2,3,5,0,3,1,3,6,1 247 | 1,1,9.5,3,4,1,0,4,10,3,6,1 248 | 1,2,9.5,3,4,1,0,4,10,3,6,1 249 | 3,2,9.5,3,4,3,0,3,1,3,6,1 250 | 3,1,9.5,3,4,3,1,4,20,3,7,0 251 | 3,2,9.5,3,4,3,1,3,1,3,7,1 252 | 1,2,9.5,3,4,1,1,2,10,3,7,1 253 | 1,1,3,4,0,1,0,2,10,1,84,1 254 | 4,1,9,3,0,4,1,1,5,1,84,1 255 | 4,1,6,2,0,4,1,2,2,1,84,0 256 | 4,1,7,4,0,4,1,2,10,1,84,0 257 | 4,1,7,4,0,4,1,2,10,1,83,0 258 | 4,2,7,4,0,4,1,2,10,1,83,0 259 | 4,2,9.5,1,2,4,1,4,5,1,84,1 260 | 4,1,9.5,1,2,4,1,4,5,1,83,0 261 | 3,2,9.5,4,1,3,1,4,5,1,4,1 262 | 5,2,9.5,2,2,5,1,3,5,1,4,0 263 | 4,2,9.5,1,2,4,1,1,1,1,4,1 264 | 5,2,8,4,3,5,1,0,1,1,4,0 265 | 3,2,9,5,5,3,1,3,5,1,4,1 266 | 3,1,9,5,5,3,0,3,1,1,4,1 267 | 3,2,9,5,5,3,1,3,1,1,4,1 268 | 3,2,9,0,0,3,0,1,10,1,4,1 269 | 3,1,9.5,4,2,3,1,4,5,1,4,1 270 | 4,2,9.5,2,2,4,1,2,2,1,4,0 271 | 4,1,9.5,2,2,4,1,2,1,1,4,0 272 | 4,1,9.5,2,2,4,1,3,1,1,4,1 273 | 
5,2,9.5,2,0,5,0,4,2,1,4,1 274 | 3,2,7,4,3,3,1,0,10,1,6,1 275 | 3,1,9,2,1,3,1,0,20,1,6,0 276 | 4,1,5,2,0,4,1,2,10,1,82,1 277 | 4,2,8,4,0,4,1,2,1,1,83,1 278 | 3,2,8,4,3,3,1,2,5,1,4,0 279 | 3,1,8,4,3,3,1,2,1,1,4,0 280 | 3,1,9,0,0,3,0,1,20,1,4,1 281 | 3,2,9,0,0,3,1,1,20,1,4,0 282 | 3,1,9,0,0,3,1,1,20,1,4,0 283 | 5,2,9.5,4,2,5,1,4,10,1,4,1 284 | 3,1,9,5,4,3,1,4,2,1,4,1 285 | 5,1,9.5,1,3,5,1,2,2,1,4,0 286 | 5,2,9.5,1,3,5,1,2,2,1,4,0 287 | 3,1,9.5,1,3,3,0,4,5,1,4,0 288 | 4,1,9.5,0,0,4,1,4,5,1,4,1 289 | 3,1,7,5,4,3,0,3,10,1,6,0 290 | 5,2,9.5,1,3,5,1,3,5,1,4,0 291 | 5,2,9.5,1,3,5,1,1,1,1,6,1 292 | 5,2,9.5,1,3,5,1,3,5,1,4,1 293 | 5,2,9.5,1,3,5,1,3,5,1,4,1 294 | 3,1,9.5,1,2,3,0,3,1,1,4,1 295 | 1,1,6,2,0,1,0,2,2,1,7,0 296 | 1,1,6,2,0,1,0,2,2,1,7,0 297 | 2,2,8,2,0,2,1,2,10,1,7,1 298 | 4,2,5,2,0,4,1,2,10,1,81,1 299 | 4,1,7,4,0,4,1,2,10,1,82,1 300 | 4,1,7,4,0,4,1,2,2,1,82,1 301 | 4,1,7,4,0,4,1,2,2,1,8,1 302 | 3,1,8,5,4,3,0,3,2,1,6,0 303 | 3,1,8,5,4,3,1,4,10,1,6,0 304 | 1,1,9,4,3,1,0,3,2,1,6,1 305 | 1,2,9,4,3,1,0,3,2,1,6,1 306 | 3,2,9.5,0,3,3,1,0,10,1,6,0 307 | 3,1,9.5,3,3,3,0,4,5,1,6,0 308 | 3,1,9.5,3,3,3,0,4,5,1,6,0 309 | 2,1,8,2,0,2,1,2,10,1,7,1 310 | 1,1,6,2,0,1,0,2,2,1,7,0 311 | 1,2,6,2,0,1,1,2,10,1,7,0 312 | 4,2,7,3,0,4,1,2,10,1,7,0 313 | 4,2,7,3,0,4,0,2,2,1,7,0 314 | 4,2,7,3,0,4,1,2,10,1,7,0 315 | 4,1,7,3,0,4,1,2,10,1,7,0 316 | 4,2,7,3,0,4,0,2,2,1,7,0 317 | 4,2,7,3,0,4,0,2,2,1,7,0 318 | 4,1,6,3,0,4,1,2,10,1,7,1 319 | 3,2,4,3,0,3,0,2,10,1,7,1 320 | 3,2,4,3,0,3,0,2,2,1,7,0 321 | 4,2,7,4,0,4,0,2,2,1,8,1 322 | 4,2,8,4,0,4,1,2,10,1,8,0 323 | 5,1,9,5,0,5,1,2,10,1,8,0 324 | 5,2,9,5,0,5,1,2,10,1,8,0 325 | 4,1,9.5,2,4,4,1,3,10,1,8,1 326 | 5,1,9,5,0,5,1,2,2,1,8,1 327 | 5,1,9,5,0,5,1,2,10,1,8,1 328 | 5,1,9.5,2,3,5,1,4,1,1,8,1 329 | 5,1,9,5,0,5,1,2,10,1,8,1 330 | 5,1,9,5,0,5,1,2,10,1,8,1 331 | 5,1,9,5,0,5,1,2,10,1,8,1 332 | 4,1,9.5,2,4,4,1,1,5,1,8,0 333 | 3,2,9,4,4,3,1,2,2,1,5,0 334 | 4,2,1,1,0,4,1,0,1,3,4,1 335 | 4,2,2,1,0,4,1,0,1,3,4,1 336 | 5,2,1,1,0,5,1,0,1,3,4,1 337 | 5,2,1,1,0,5,1,0,1,3,4,1 338 | 4,1,2,1,0,4,1,0,1,3,4,1 339 | 5,1,1,1,0,5,1,0,1,3,4,1 340 | 4,1,2,1,0,4,1,0,1,3,4,1 341 | 4,1,2,1,0,4,1,0,1,3,4,1 342 | 4,2,1,1,0,4,1,0,1,3,8,1 343 | 4,1,2,1,0,4,1,0,1,3,8,1 344 | 4,1,3,5,0,4,1,0,1,3,4,0 345 | 5,1,1,1,0,5,1,0,1,3,6,1 346 | 3,2,1,1,0,3,1,0,1,3,6,1 347 | 3,2,1,1,0,3,1,0,1,3,6,1 348 | 3,2,1,2,0,3,1,1,1,3,7,1 349 | 3,1,1,2,0,3,1,1,1,3,7,0 350 | 3,1,1,2,0,3,1,1,1,3,7,0 351 | 3,2,1,2,0,3,1,1,5,3,7,0 352 | 4,2,3,1,0,4,1,0,1,3,8,0 353 | 5,2,2,1,0,5,1,0,1,3,8,1 354 | 3,1,1,2,0,3,1,1,1,3,8,0 355 | 5,2,3,1,0,5,1,0,1,3,8,1 356 | 3,2,2,1,0,3,1,0,1,3,8,0 357 | 4,1,4,1,0,4,1,0,1,3,8,0 358 | 3,1,1,0,0,3,0,1,5,2,3,1 359 | 3,1,1,0,0,3,0,1,5,2,3,1 360 | 3,2,1,0,0,3,0,1,1,2,3,1 361 | 3,1,1,0,0,3,0,1,1,2,3,1 362 | 3,1,1,0,0,3,1,1,5,2,3,1 363 | 1,2,1,0,0,1,1,2,1,2,3,0 364 | 1,1,1,0,0,1,0,2,1,2,3,0 365 | 1,1,1,0,0,1,0,2,1,2,3,1 366 | 1,2,1,1,1,1,0,0,5,2,3,1 367 | 1,2,1,1,1,1,0,0,2,2,3,1 368 | 5,2,2,0,0,5,1,0,1,2,3,0 369 | 5,1,2,0,0,5,1,0,1,2,3,0 370 | 1,1,1,1,1,1,0,0,2,2,3,1 371 | 5,2,2,0,0,5,0,0,1,2,3,0 372 | 5,2,2,0,0,5,1,0,1,2,3,0 373 | 5,2,2,0,0,5,1,0,1,2,3,0 374 | 5,2,2,0,0,5,1,0,1,2,3,1 375 | 5,2,3,1,0,5,1,0,1,3,8,0 376 | 5,1,3,1,0,5,1,0,1,3,8,0 377 | 5,1,3,1,0,5,1,0,1,3,8,0 378 | 4,1,4,1,0,4,1,0,1,3,8,0 379 | 4,1,3,4,0,4,1,0,1,3,6,0 380 | 1,2,1,1,1,1,0,1,5,3,6,1 381 | 1,1,1,1,1,1,1,1,5,3,6,1 382 | 3,1,2,2,0,3,1,1,2,3,6,1 383 | 3,2,2,2,0,3,1,1,5,3,6,1 384 | 3,1,2,2,0,3,1,1,5,3,6,1 385 | 4,2,3,3,0,4,1,0,1,3,6,0 386 | 5,2,3,3,0,5,1,0,1,3,6,0 387 | 5,1,3,3,0,5,1,0,1,3,6,0 388 | 5,2,3,3,0,5,1,0,1,3,6,0 389 | 
4,2,4,3,0,4,1,0,1,3,6,1 390 | 4,1,4,3,0,4,1,0,1,3,6,1 391 | 1,1,1,1,1,1,0,1,5,3,5,0 392 | 1,2,1,1,1,1,0,1,5,3,5,0 393 | 1,2,1,2,2,1,1,0,1,3,5,0 394 | 1,1,1,2,2,1,1,0,1,3,5,0 395 | 1,1,1,2,2,1,1,0,1,3,5,0 396 | 1,1,1,2,2,1,1,0,1,3,5,1 397 | 5,2,3,0,0,5,1,0,1,3,5,1 398 | 1,1,1,2,2,1,0,1,1,3,5,0 399 | 3,2,2,0,0,3,0,1,1,3,5,0 400 | 5,1,3,0,0,5,1,0,1,3,5,1 401 | 5,2,3,1,0,5,1,0,1,3,5,0 402 | 4,1,4,0,0,4,1,0,1,3,5,1 403 | 4,2,4,0,0,4,1,0,1,3,5,1 404 | 4,2,4,0,0,4,1,0,1,3,5,1 405 | 4,2,4,0,0,4,1,0,1,3,5,1 406 | 3,2,1,3,0,3,0,2,1,3,7,0 407 | 3,2,1,3,0,3,0,2,1,3,7,0 408 | 3,2,2,2,0,3,1,1,5,3,7,1 409 | 3,1,2,2,0,3,1,1,1,3,7,0 410 | 1,1,1,2,1,1,0,2,1,3,7,0 411 | 1,2,1,2,2,1,1,1,1,3,5,1 412 | 3,1,2,1,1,3,1,0,5,3,5,1 413 | 3,1,8,3,2,3,0,1,5,3,7,0 414 | 3,2,8,4,2,3,1,1,20,3,7,1 415 | 3,2,8,4,2,3,0,1,5,3,7,1 416 | 3,1,8,4,2,3,1,1,20,3,7,1 417 | 3,1,9,5,0,3,0,2,20,3,8,0 418 | 2,2,6,3,1,2,1,4,5,3,8,1 419 | 1,2,7,5,1,1,1,2,20,3,8,1 420 | 3,1,9,5,0,3,0,2,20,3,8,0 421 | 4,1,9.5,3,4,4,1,4,1,3,8,0 422 | 5,1,9.5,2,3,5,1,4,20,3,7,1 423 | 5,2,9.5,3,4,5,1,4,10,3,7,1 424 | 5,1,9.5,4,4,5,1,3,10,3,7,1 425 | 3,2,8,4,2,3,0,1,5,3,7,1 426 | 3,2,8,4,2,3,1,2,20,3,7,0 427 | 1,2,9,4,2,1,1,2,20,3,7,1 428 | 5,1,9.5,3,4,5,1,4,20,3,7,0 429 | 3,2,9.5,2,5,3,1,2,20,3,7,1 430 | 5,2,9.5,2,5,5,1,2,20,3,7,1 431 | 3,1,8,4,1,3,1,2,20,3,7,1 432 | 1,2,9,4,1,1,1,2,20,3,7,0 433 | 3,1,9.5,4,5,3,1,2,20,3,7,1 434 | 1,2,8,4,2,1,1,2,20,3,7,1 435 | 1,1,8,5,2,1,0,2,5,3,7,1 436 | 3,1,9.5,5,5,3,1,4,1,3,7,0 437 | 3,2,9.5,3,4,3,1,4,20,3,7,0 438 | 3,2,9.5,3,5,3,0,4,20,3,7,0 439 | 1,1,9,4,1,1,1,3,20,3,7,0 440 | 1,2,9.5,4,5,1,1,4,1,3,7,0 441 | 1,2,9,4,1,1,0,3,10,3,7,1 442 | 2,1,8,5,0,2,0,2,10,3,7,0 443 | 2,2,9,5,0,2,0,2,10,3,7,0 444 | 2,1,9,5,0,2,1,2,20,3,7,0 445 | 5,2,6,0,0,5,0,4,5,2,4,1 446 | 5,2,7,0,0,5,1,3,20,2,4,1 447 | 5,2,7,0,0,5,0,3,20,2,4,1 448 | 5,2,7,0,0,5,0,3,20,2,4,1 449 | 5,2,7,0,0,5,1,4,20,2,4,1 450 | 5,1,8,0,0,5,1,3,20,2,4,1 451 | 5,1,8,0,0,5,1,3,20,2,4,1 452 | 1,1,9.5,4,5,1,1,4,1,2,4,1 453 | 5,1,7,0,0,5,0,4,20,2,4,1 454 | 5,2,8,0,0,5,0,3,20,2,4,1 455 | 5,1,8,0,0,5,0,3,20,2,4,1 456 | 5,2,8,0,0,5,0,3,20,2,4,1 457 | 1,1,9.5,0,5,1,1,2,20,2,4,1 458 | 2,1,9.5,5,5,2,1,4,20,2,4,1 459 | 5,2,9,0,0,5,1,3,20,2,4,1 460 | 1,2,9.5,0,5,1,1,2,20,2,4,0 461 | 5,2,8,0,0,5,0,3,20,2,4,1 462 | 5,1,9,0,0,5,0,3,20,2,4,1 463 | 5,1,9,0,0,5,0,3,20,2,4,1 464 | 1,2,9.5,0,5,1,1,4,20,2,5,1 465 | 1,2,9.5,0,5,1,0,4,20,2,5,0 466 | 5,1,9,0,0,5,1,4,20,2,5,1 467 | 5,1,9,0,0,5,0,4,20,2,5,1 468 | 2,1,9.5,0,5,2,1,4,20,2,5,0 469 | 4,1,9.5,1,5,4,1,4,1,2,5,0 470 | 5,2,4,1,1,5,1,0,2,3,4,1 471 | 2,1,1,0,0,2,0,2,2,3,4,0 472 | 2,1,2,0,0,2,0,2,1,3,4,1 473 | 3,2,3,0,0,3,1,1,5,3,4,0 474 | 5,1,6,3,2,5,1,2,10,2,4,1 475 | 2,2,2,0,0,2,0,2,2,3,3,0 476 | 2,1,2,0,0,2,1,2,2,3,3,0 477 | 1,1,3,0,0,1,0,1,2,3,3,0 478 | 5,2,4,1,1,5,1,0,2,3,4,0 479 | 1,1,3,0,0,1,0,1,2,3,4,0 480 | 1,1,3,0,0,1,0,1,5,3,4,0 481 | 5,1,4,1,1,5,1,0,2,3,4,1 482 | 5,2,4,1,1,5,1,0,2,3,4,1 483 | 5,2,4,1,1,5,1,0,2,3,4,1 484 | 1,2,3,0,0,1,0,1,2,3,4,1 485 | 5,1,4,1,1,5,1,1,1,3,4,1 486 | 3,1,4,2,2,3,1,0,1,3,5,1 487 | 3,1,4,2,2,3,0,0,1,3,4,1 488 | 2,2,1,0,0,2,1,2,2,3,5,0 489 | 2,2,1,0,0,2,0,2,2,3,5,0 490 | 2,1,2,0,0,2,1,2,2,3,5,1 491 | 2,2,2,0,0,2,0,2,2,3,5,1 492 | 2,1,2,0,0,2,0,2,1,3,5,0 493 | 2,1,2,0,0,2,0,2,1,3,5,1 494 | 1,1,3,0,0,1,0,1,5,3,5,1 495 | 3,2,4,2,2,3,1,0,2,3,5,1 496 | 3,1,4,2,2,3,1,0,2,3,5,1 497 | 3,1,4,2,2,3,1,1,2,3,5,0 498 | 3,1,4,2,2,3,1,2,1,3,5,1 499 | 3,2,4,2,2,3,1,2,2,3,5,0 500 | 3,1,4,3,3,3,1,2,2,1,4,0 501 | 3,1,4,3,3,3,1,2,2,1,4,0 502 | 
-------------------------------------------------------------------------------- /Keras Introduction Exploration/Readme.md: -------------------------------------------------------------------------------- 1 | # Introduction to Keras for Building Deep Learning Models 2 | 3 | This Python Notebook focuses on a specific sub-field of machine learning called **predictive modeling.** 4 | 5 | 6 | 7 | Within predictive modeling is a speciality or another sub-field called **deep learning.** 8 | 9 | This notebook includes deep learning models with a library called Keras. 10 | 11 | >**Predictive modeling** is focused on developing models that make accurate predictions at the expense of explaining why predictions are made. 12 | 13 | You and I don't need to be able to write a binary classification model. We need to know how to use and interpret the results of the model. 14 | 15 | Just download and open the .ipynb file; everything is explained there in a detailed manner.
16 | Alternatively, you can visit the link given below, where the notebook can be accessed through Kaggle. 17 | 18 | #### My Kaggle Notebook Link -> https://www.kaggle.com/datawarriors/a-gentle-introduction-to-keras 19 | -------------------------------------------------------------------------------- /Keras Introduction Exploration/Screenshot.jpeg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/amark720/Deep-Learning-Projects/0ae283356592451eb834af43494545324eead24c/Keras Introduction Exploration/Screenshot.jpeg -------------------------------------------------------------------------------- /Readme.md: -------------------------------------------------------------------------------- 1 | # Deep-Learning-Projects 2 | 3 | Python TensorFlow Flask Heroku Keras AWS 4 | 5 | 6 | ## Overview 7 | • This repository consists of Deep Learning projects made by me.
8 | • Datasets are provided in each of the folders above, along with solutions to the problem statements.
9 | • Visit each folder to access the projects in detail. 10 | 11 | Landing Page 12 | 13 | ### Don't forget to ⭐ the repository, if it helped you in any way.
14 | 15 | ### Repo Stats: 16 | [![GitHub](https://img.shields.io/github/followers/amark720?style=social)](https://github.com/amark720)   [![GitHub](https://img.shields.io/github/stars/amark720/Deep-Learning-Projects?style=social)](https://github.com/amark720/Deep-Learning-Projects)   [![GitHub](https://img.shields.io/github/forks/amark720/Deep-Learning-Projects?style=social)](https://github.com/amark720/Deep-Learning-Projects) 17 | 18 | #### Feel Free to contact me at➛ databoyamar@gmail.com for any help related to Projects in this Repository! 19 | --------------------------------------------------------------------------------