├── 2.1-a-first-look-at-a-neural-network.ipynb ├── 3.5-classifying-movie-reviews.ipynb ├── 3.6-classifying-newswires.ipynb ├── 3.7-predicting-house-prices.ipynb ├── 4.4-overfitting-and-underfitting.ipynb ├── 5.1-introduction-to-convnets.ipynb ├── 5.2-using-convnets-with-small-datasets.ipynb ├── 5.3-using-a-pretrained-convnet.ipynb ├── 5.4-visualizing-what-convnets-learn.ipynb ├── 6.1-one-hot-encoding-of-words-or-characters.ipynb ├── 6.1-using-word-embeddings.ipynb ├── 6.2-understanding-recurrent-neural-networks.ipynb ├── 6.3-advanced-usage-of-recurrent-neural-networks.ipynb ├── 6.4-sequence-processing-with-convnets.ipynb ├── 8.1-text-generation-with-lstm.ipynb ├── 8.2-deep-dream.ipynb ├── 8.3-neural-style-transfer.ipynb ├── 8.4-generating-images-with-vaes.ipynb ├── 8.5-introduction-to-gans.ipynb ├── LICENSE └── README.md /2.1-a-first-look-at-a-neural-network.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [ 8 | { 9 | "name": "stderr", 10 | "output_type": "stream", 11 | "text": [ 12 | "Using TensorFlow backend.\n" 13 | ] 14 | }, 15 | { 16 | "data": { 17 | "text/plain": [ 18 | "'2.0.8'" 19 | ] 20 | }, 21 | "execution_count": 1, 22 | "metadata": {}, 23 | "output_type": "execute_result" 24 | } 25 | ], 26 | "source": [ 27 | "import keras\n", 28 | "keras.__version__" 29 | ] 30 | }, 31 | { 32 | "cell_type": "markdown", 33 | "metadata": {}, 34 | "source": [ 35 | "# A first look at a neural network\n", 36 | "\n", 37 | "This notebook contains the code samples found in Chapter 2, Section 1 of [Deep Learning with Python](https://www.manning.com/books/deep-learning-with-python?a_aid=keras&a_bid=76564dff). 
Note that the original text features far more content, in particular further explanations and figures: in this notebook, you will only find source code and related comments.\n", 38 | "\n", 39 | "----\n", 40 | "\n", 41 | "We will now take a look at a first concrete example of a neural network, which makes use of the Python library Keras to learn to classify \n", 42 | "hand-written digits. Unless you already have experience with Keras or similar libraries, you will not understand everything about this \n", 43 | "first example right away. You probably haven't even installed Keras yet. Don't worry, that is perfectly fine. In the next chapter, we will \n", 44 | "review each element in our example and explain them in detail. So don't worry if some steps seem arbitrary or look like magic to you! \n", 45 | "We've got to start somewhere.\n", 46 | "\n", 47 | "The problem we are trying to solve here is to classify grayscale images of handwritten digits (28 pixels by 28 pixels), into their 10 \n", 48 | "categories (0 to 9). The dataset we will use is the MNIST dataset, a classic dataset in the machine learning community, which has been \n", 49 | "around for almost as long as the field itself and has been very intensively studied. It's a set of 60,000 training images, plus 10,000 test \n", 50 | "images, assembled by the National Institute of Standards and Technology (the NIST in MNIST) in the 1980s. You can think of \"solving\" MNIST \n", 51 | "as the \"Hello World\" of deep learning -- it's what you do to verify that your algorithms are working as expected. As you become a machine \n", 52 | "learning practitioner, you will see MNIST come up over and over again, in scientific papers, blog posts, and so on." 
53 | ] 54 | }, 55 | { 56 | "cell_type": "markdown", 57 | "metadata": {}, 58 | "source": [ 59 | "The MNIST dataset comes pre-loaded in Keras, in the form of a set of four Numpy arrays:" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": 2, 65 | "metadata": { 66 | "collapsed": true 67 | }, 68 | "outputs": [], 69 | "source": [ 70 | "from keras.datasets import mnist\n", 71 | "\n", 72 | "(train_images, train_labels), (test_images, test_labels) = mnist.load_data()" 73 | ] 74 | }, 75 | { 76 | "cell_type": "markdown", 77 | "metadata": {}, 78 | "source": [ 79 | "`train_images` and `train_labels` form the \"training set\", the data that the model will learn from. The model will then be tested on the \n", 80 | "\"test set\", `test_images` and `test_labels`. Our images are encoded as Numpy arrays, and the labels are simply an array of digits, ranging \n", 81 | "from 0 to 9. There is a one-to-one correspondence between the images and the labels.\n", 82 | "\n", 83 | "Let's have a look at the training data:" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": 3, 89 | "metadata": {}, 90 | "outputs": [ 91 | { 92 | "data": { 93 | "text/plain": [ 94 | "(60000, 28, 28)" 95 | ] 96 | }, 97 | "execution_count": 3, 98 | "metadata": {}, 99 | "output_type": "execute_result" 100 | } 101 | ], 102 | "source": [ 103 | "train_images.shape" 104 | ] 105 | }, 106 | { 107 | "cell_type": "code", 108 | "execution_count": 4, 109 | "metadata": {}, 110 | "outputs": [ 111 | { 112 | "data": { 113 | "text/plain": [ 114 | "60000" 115 | ] 116 | }, 117 | "execution_count": 4, 118 | "metadata": {}, 119 | "output_type": "execute_result" 120 | } 121 | ], 122 | "source": [ 123 | "len(train_labels)" 124 | ] 125 | }, 126 | { 127 | "cell_type": "code", 128 | "execution_count": 5, 129 | "metadata": {}, 130 | "outputs": [ 131 | { 132 | "data": { 133 | "text/plain": [ 134 | "array([5, 0, 4, ..., 5, 6, 8], dtype=uint8)" 135 | ] 136 | }, 137 | "execution_count": 5, 138 | 
"metadata": {}, 139 | "output_type": "execute_result" 140 | } 141 | ], 142 | "source": [ 143 | "train_labels" 144 | ] 145 | }, 146 | { 147 | "cell_type": "markdown", 148 | "metadata": {}, 149 | "source": [ 150 | "Let's have a look at the test data:" 151 | ] 152 | }, 153 | { 154 | "cell_type": "code", 155 | "execution_count": 6, 156 | "metadata": {}, 157 | "outputs": [ 158 | { 159 | "data": { 160 | "text/plain": [ 161 | "(10000, 28, 28)" 162 | ] 163 | }, 164 | "execution_count": 6, 165 | "metadata": {}, 166 | "output_type": "execute_result" 167 | } 168 | ], 169 | "source": [ 170 | "test_images.shape" 171 | ] 172 | }, 173 | { 174 | "cell_type": "code", 175 | "execution_count": 7, 176 | "metadata": {}, 177 | "outputs": [ 178 | { 179 | "data": { 180 | "text/plain": [ 181 | "10000" 182 | ] 183 | }, 184 | "execution_count": 7, 185 | "metadata": {}, 186 | "output_type": "execute_result" 187 | } 188 | ], 189 | "source": [ 190 | "len(test_labels)" 191 | ] 192 | }, 193 | { 194 | "cell_type": "code", 195 | "execution_count": 8, 196 | "metadata": {}, 197 | "outputs": [ 198 | { 199 | "data": { 200 | "text/plain": [ 201 | "array([7, 2, 1, ..., 4, 5, 6], dtype=uint8)" 202 | ] 203 | }, 204 | "execution_count": 8, 205 | "metadata": {}, 206 | "output_type": "execute_result" 207 | } 208 | ], 209 | "source": [ 210 | "test_labels" 211 | ] 212 | }, 213 | { 214 | "cell_type": "markdown", 215 | "metadata": {}, 216 | "source": [ 217 | "Our workflow will be as follows: first we will present our neural network with the training data, `train_images` and `train_labels`. The \n", 218 | "network will then learn to associate images and labels. Finally, we will ask the network to produce predictions for `test_images`, and we \n", 219 | "will verify if these predictions match the labels from `test_labels`.\n", 220 | "\n", 221 | "Let's build our network -- again, remember that you aren't supposed to understand everything about this example just yet."
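Before the Keras version, here is a rough NumPy sketch of what a single `Dense` layer with a `relu` activation computes: `output = relu(dot(input, W) + b)`. The shapes mirror the first layer we are about to add (784 inputs, 512 outputs); the weights here are random and purely for illustration, since a real layer learns them during training.

```python
import numpy as np

rng = np.random.RandomState(0)
x = rng.rand(784).astype('float32')        # one flattened 28*28 image
W = rng.rand(784, 512).astype('float32')   # the layer's kernel (random here)
b = np.zeros(512, dtype='float32')         # the layer's bias

# relu(dot(input, W) + b): an affine transform followed by element-wise max(0, .)
output = np.maximum(np.dot(x, W) + b, 0.)

assert output.shape == (512,)
assert (output >= 0).all()
```

This is only the forward pass; Keras additionally handles weight initialization and learning.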
222 | ] 223 | }, 224 | { 225 | "cell_type": "code", 226 | "execution_count": 9, 227 | "metadata": { 228 | "collapsed": true 229 | }, 230 | "outputs": [], 231 | "source": [ 232 | "from keras import models\n", 233 | "from keras import layers\n", 234 | "\n", 235 | "network = models.Sequential()\n", 236 | "network.add(layers.Dense(512, activation='relu', input_shape=(28 * 28,)))\n", 237 | "network.add(layers.Dense(10, activation='softmax'))" 238 | ] 239 | }, 240 | { 241 | "cell_type": "markdown", 242 | "metadata": {}, 243 | "source": [ 244 | "\n", 245 | "The core building block of neural networks is the \"layer\", a data-processing module which you can conceive as a \"filter\" for data. Some \n", 246 | "data comes in, and comes out in a more useful form. Precisely, layers extract _representations_ out of the data fed into them -- hopefully \n", 247 | "representations that are more meaningful for the problem at hand. Most of deep learning really consists of chaining together simple layers \n", 248 | "which will implement a form of progressive \"data distillation\". A deep learning model is like a sieve for data processing, made of a \n", 249 | "succession of increasingly refined data filters -- the \"layers\".\n", 250 | "\n", 251 | "Here our network consists of a sequence of two `Dense` layers, which are densely-connected (also called \"fully-connected\") neural layers. \n", 252 | "The second (and last) layer is a 10-way \"softmax\" layer, which means it will return an array of 10 probability scores (summing to 1). 
Each \n", 253 | "score will be the probability that the current digit image belongs to one of our 10 digit classes.\n", 254 | "\n", 255 | "To make our network ready for training, we need to pick three more things, as part of the \"compilation\" step:\n", 256 | "\n", 257 | "* A loss function: this is how the network will be able to measure how good a job it is doing on its training data, and thus how it will be \n", 258 | "able to steer itself in the right direction.\n", 259 | "* An optimizer: this is the mechanism through which the network will update itself based on the data it sees and its loss function.\n", 260 | "* Metrics to monitor during training and testing. Here we will only care about accuracy (the fraction of the images that were correctly \n", 261 | "classified).\n", 262 | "\n", 263 | "The exact purpose of the loss function and the optimizer will be made clear throughout the next two chapters." 264 | ] 265 | }, 266 | { 267 | "cell_type": "code", 268 | "execution_count": 10, 269 | "metadata": { 270 | "collapsed": true 271 | }, 272 | "outputs": [], 273 | "source": [ 274 | "network.compile(optimizer='rmsprop',\n", 275 | " loss='categorical_crossentropy',\n", 276 | " metrics=['accuracy'])" 277 | ] 278 | }, 279 | { 280 | "cell_type": "markdown", 281 | "metadata": {}, 282 | "source": [ 283 | "\n", 284 | "Before training, we will preprocess our data by reshaping it into the shape that the network expects, and scaling it so that all values are in \n", 285 | "the `[0, 1]` interval. Previously, our training images for instance were stored in an array of shape `(60000, 28, 28)` of type `uint8` with \n", 286 | "values in the `[0, 255]` interval. We transform it into a `float32` array of shape `(60000, 28 * 28)` with values between 0 and 1."
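The same reshape-and-scale transformation, applied to a small dummy batch as a sanity check (a sketch; the actual preprocessing operates on the full 60,000-image array):

```python
import numpy as np

# Dummy stand-in for train_images: uint8 values in [0, 255]
images = np.random.randint(0, 256, size=(5, 28, 28), dtype='uint8')

# Flatten each 28x28 image into a 784-vector and rescale to [0, 1]
flat = images.reshape((5, 28 * 28)).astype('float32') / 255

assert flat.shape == (5, 784)
assert flat.dtype == np.float32
assert flat.min() >= 0.0 and flat.max() <= 1.0
```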
287 | ] 288 | }, 289 | { 290 | "cell_type": "code", 291 | "execution_count": 11, 292 | "metadata": { 293 | "collapsed": true 294 | }, 295 | "outputs": [], 296 | "source": [ 297 | "train_images = train_images.reshape((60000, 28 * 28))\n", 298 | "train_images = train_images.astype('float32') / 255\n", 299 | "\n", 300 | "test_images = test_images.reshape((10000, 28 * 28))\n", 301 | "test_images = test_images.astype('float32') / 255" 302 | ] 303 | }, 304 | { 305 | "cell_type": "markdown", 306 | "metadata": {}, 307 | "source": [ 308 | "We also need to categorically encode the labels, a step which we explain in chapter 3:" 309 | ] 310 | }, 311 | { 312 | "cell_type": "code", 313 | "execution_count": 12, 314 | "metadata": { 315 | "collapsed": true 316 | }, 317 | "outputs": [], 318 | "source": [ 319 | "from keras.utils import to_categorical\n", 320 | "\n", 321 | "train_labels = to_categorical(train_labels)\n", 322 | "test_labels = to_categorical(test_labels)" 323 | ] 324 | }, 325 | { 326 | "cell_type": "markdown", 327 | "metadata": {}, 328 | "source": [ 329 | "We are now ready to train our network, which in Keras is done via a call to the `fit` method of the network: \n", 330 | "we \"fit\" the model to its training data." 
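A note on what `epochs=5, batch_size=128` implies: the network's weights are updated once per mini-batch of 128 samples, so we can estimate the total number of gradient updates with a quick back-of-the-envelope sketch (the final batch of each epoch is a partial one):

```python
import math

samples, batch_size, epochs = 60000, 128, 5

# 60000 / 128 = 468.75, so each epoch runs 469 batches (the last one partial)
updates_per_epoch = math.ceil(samples / batch_size)

assert updates_per_epoch == 469
assert updates_per_epoch * epochs == 2345   # total weight updates over training
```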
331 | ] 332 | }, 333 | { 334 | "cell_type": "code", 335 | "execution_count": 13, 336 | "metadata": {}, 337 | "outputs": [ 338 | { 339 | "name": "stdout", 340 | "output_type": "stream", 341 | "text": [ 342 | "Epoch 1/5\n", 343 | "60000/60000 [==============================] - 2s - loss: 0.2577 - acc: 0.9245 \n", 344 | "Epoch 2/5\n", 345 | "60000/60000 [==============================] - 1s - loss: 0.1042 - acc: 0.9690 \n", 346 | "Epoch 3/5\n", 347 | "60000/60000 [==============================] - 1s - loss: 0.0687 - acc: 0.9793 \n", 348 | "Epoch 4/5\n", 349 | "60000/60000 [==============================] - 1s - loss: 0.0508 - acc: 0.9848 \n", 350 | "Epoch 5/5\n", 351 | "60000/60000 [==============================] - 1s - loss: 0.0382 - acc: 0.9890 \n" 352 | ] 353 | }, 354 | { 355 | "data": { 356 | "text/plain": [ 357 | "" 358 | ] 359 | }, 360 | "execution_count": 13, 361 | "metadata": {}, 362 | "output_type": "execute_result" 363 | } 364 | ], 365 | "source": [ 366 | "network.fit(train_images, train_labels, epochs=5, batch_size=128)" 367 | ] 368 | }, 369 | { 370 | "cell_type": "markdown", 371 | "metadata": {}, 372 | "source": [ 373 | "Two quantities are being displayed during training: the \"loss\" of the network over the training data, and the accuracy of the network over \n", 374 | "the training data.\n", 375 | "\n", 376 | "We quickly reach an accuracy of 0.989 (i.e. 98.9%) on the training data. Now let's check that our model performs well on the test set too:" 377 | ] 378 | }, 379 | { 380 | "cell_type": "code", 381 | "execution_count": 14, 382 | "metadata": {}, 383 | "outputs": [ 384 | { 385 | "name": "stdout", 386 | "output_type": "stream", 387 | "text": [ 388 | " 9536/10000 [===========================>..] 
- ETA: 0s" 389 | ] 390 | } 391 | ], 392 | "source": [ 393 | "test_loss, test_acc = network.evaluate(test_images, test_labels)" 394 | ] 395 | }, 396 | { 397 | "cell_type": "code", 398 | "execution_count": 15, 399 | "metadata": {}, 400 | "outputs": [ 401 | { 402 | "name": "stdout", 403 | "output_type": "stream", 404 | "text": [ 405 | "test_acc: 0.9777\n" 406 | ] 407 | } 408 | ], 409 | "source": [ 410 | "print('test_acc:', test_acc)" 411 | ] 412 | }, 413 | { 414 | "cell_type": "markdown", 415 | "metadata": {}, 416 | "source": [ 417 | "\n", 418 | "Our test set accuracy turns out to be 97.8% -- that's quite a bit lower than the training set accuracy. \n", 419 | "This gap between training accuracy and test accuracy is an example of \"overfitting\", \n", 420 | "the fact that machine learning models tend to perform worse on new data than on their training data. \n", 421 | "Overfitting will be a central topic in chapter 3.\n", 422 | "\n", 423 | "This concludes our very first example -- you just saw how we could build and train a neural network to classify handwritten digits, in \n", 424 | "less than 20 lines of Python code. In the next chapter, we will go in detail over every moving piece we just previewed, and clarify what is really \n", 425 | "going on behind the scenes. You will learn about \"tensors\", the data-storing objects going into the network, about tensor operations, which \n", 426 | "layers are made of, and about gradient descent, which allows our network to learn from its training examples."
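As an aside, the "softmax" activation used in the last layer can be written in a few lines of NumPy (a minimal sketch; subtracting the maximum before exponentiating is a standard numerical-stability trick):

```python
import numpy as np

def softmax(z):
    # subtract the max for numerical stability; the result is unchanged
    e = np.exp(z - z.max())
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])   # arbitrary raw scores for 3 classes
probs = softmax(scores)

assert abs(probs.sum() - 1.0) < 1e-9   # a proper probability distribution
assert probs.argmax() == 0             # highest score -> highest probability
```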
427 | ] 428 | } 429 | ], 430 | "metadata": { 431 | "kernelspec": { 432 | "display_name": "Python 3", 433 | "language": "python", 434 | "name": "python3" 435 | }, 436 | "language_info": { 437 | "codemirror_mode": { 438 | "name": "ipython", 439 | "version": 3 440 | }, 441 | "file_extension": ".py", 442 | "mimetype": "text/x-python", 443 | "name": "python", 444 | "nbconvert_exporter": "python", 445 | "pygments_lexer": "ipython3", 446 | "version": "3.5.2" 447 | } 448 | }, 449 | "nbformat": 4, 450 | "nbformat_minor": 2 451 | } 452 | -------------------------------------------------------------------------------- /3.6-classifying-newswires.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [ 8 | { 9 | "name": "stderr", 10 | "output_type": "stream", 11 | "text": [ 12 | "Using TensorFlow backend.\n" 13 | ] 14 | }, 15 | { 16 | "data": { 17 | "text/plain": [ 18 | "'2.0.8'" 19 | ] 20 | }, 21 | "execution_count": 1, 22 | "metadata": {}, 23 | "output_type": "execute_result" 24 | } 25 | ], 26 | "source": [ 27 | "import keras\n", 28 | "keras.__version__" 29 | ] 30 | }, 31 | { 32 | "cell_type": "markdown", 33 | "metadata": {}, 34 | "source": [ 35 | "# Classifying newswires: a multi-class classification example\n", 36 | "\n", 37 | "This notebook contains the code samples found in Chapter 3, Section 5 of [Deep Learning with Python](https://www.manning.com/books/deep-learning-with-python?a_aid=keras&a_bid=76564dff). Note that the original text features far more content, in particular further explanations and figures: in this notebook, you will only find source code and related comments.\n", 38 | "\n", 39 | "----\n", 40 | "\n", 41 | "In the previous section we saw how to classify vector inputs into two mutually exclusive classes using a densely-connected neural network. \n", 42 | "But what happens when you have more than two classes? 
\n", 43 | "\n", 44 | "In this section, we will build a network to classify Reuters newswires into 46 different mutually-exclusive topics. Since we have many \n", 45 | "classes, this problem is an instance of \"multi-class classification\", and since each data point should be classified into only one \n", 46 | "category, the problem is more specifically an instance of \"single-label, multi-class classification\". If each data point could have \n", 47 | "belonged to multiple categories (in our case, topics) then we would be facing a \"multi-label, multi-class classification\" problem." 48 | ] 49 | }, 50 | { 51 | "cell_type": "markdown", 52 | "metadata": {}, 53 | "source": [ 54 | "## The Reuters dataset\n", 55 | "\n", 56 | "\n", 57 | "We will be working with the _Reuters dataset_, a set of short newswires and their topics, published by Reuters in 1986. It's a very simple, \n", 58 | "widely used toy dataset for text classification. There are 46 different topics; some topics are more represented than others, but each \n", 59 | "topic has at least 10 examples in the training set.\n", 60 | "\n", 61 | "Like IMDB and MNIST, the Reuters dataset comes packaged as part of Keras. 
Let's take a look right away:" 62 | ] 63 | }, 64 | { 65 | "cell_type": "code", 66 | "execution_count": 2, 67 | "metadata": { 68 | "collapsed": true 69 | }, 70 | "outputs": [], 71 | "source": [ 72 | "from keras.datasets import reuters\n", 73 | "\n", 74 | "(train_data, train_labels), (test_data, test_labels) = reuters.load_data(num_words=10000)" 75 | ] 76 | }, 77 | { 78 | "cell_type": "markdown", 79 | "metadata": {}, 80 | "source": [ 81 | "\n", 82 | "Like with the IMDB dataset, the argument `num_words=10000` restricts the data to the 10,000 most frequently occurring words found in the \n", 83 | "data.\n", 84 | "\n", 85 | "We have 8,982 training examples and 2,246 test examples:" 86 | ] 87 | }, 88 | { 89 | "cell_type": "code", 90 | "execution_count": 3, 91 | "metadata": {}, 92 | "outputs": [ 93 | { 94 | "data": { 95 | "text/plain": [ 96 | "8982" 97 | ] 98 | }, 99 | "execution_count": 3, 100 | "metadata": {}, 101 | "output_type": "execute_result" 102 | } 103 | ], 104 | "source": [ 105 | "len(train_data)" 106 | ] 107 | }, 108 | { 109 | "cell_type": "code", 110 | "execution_count": 4, 111 | "metadata": {}, 112 | "outputs": [ 113 | { 114 | "data": { 115 | "text/plain": [ 116 | "2246" 117 | ] 118 | }, 119 | "execution_count": 4, 120 | "metadata": {}, 121 | "output_type": "execute_result" 122 | } 123 | ], 124 | "source": [ 125 | "len(test_data)" 126 | ] 127 | }, 128 | { 129 | "cell_type": "markdown", 130 | "metadata": {}, 131 | "source": [ 132 | "As with the IMDB reviews, each example is a list of integers (word indices):" 133 | ] 134 | }, 135 | { 136 | "cell_type": "code", 137 | "execution_count": 5, 138 | "metadata": {}, 139 | "outputs": [ 140 | { 141 | "data": { 142 | "text/plain": [ 143 | "[1,\n", 144 | " 245,\n", 145 | " 273,\n", 146 | " 207,\n", 147 | " 156,\n", 148 | " 53,\n", 149 | " 74,\n", 150 | " 160,\n", 151 | " 26,\n", 152 | " 14,\n", 153 | " 46,\n", 154 | " 296,\n", 155 | " 26,\n", 156 | " 39,\n", 157 | " 74,\n", 158 | " 2979,\n", 159 | " 3554,\n", 160 | " 
14,\n", 161 | " 46,\n", 162 | " 4689,\n", 163 | " 4329,\n", 164 | " 86,\n", 165 | " 61,\n", 166 | " 3499,\n", 167 | " 4795,\n", 168 | " 14,\n", 169 | " 61,\n", 170 | " 451,\n", 171 | " 4329,\n", 172 | " 17,\n", 173 | " 12]" 174 | ] 175 | }, 176 | "execution_count": 5, 177 | "metadata": {}, 178 | "output_type": "execute_result" 179 | } 180 | ], 181 | "source": [ 182 | "train_data[10]" 183 | ] 184 | }, 185 | { 186 | "cell_type": "markdown", 187 | "metadata": {}, 188 | "source": [ 189 | "Here's how you can decode it back to words, in case you are curious:" 190 | ] 191 | }, 192 | { 193 | "cell_type": "code", 194 | "execution_count": 6, 195 | "metadata": { 196 | "collapsed": true 197 | }, 198 | "outputs": [], 199 | "source": [ 200 | "word_index = reuters.get_word_index()\n", 201 | "reverse_word_index = dict([(value, key) for (key, value) in word_index.items()])\n", 202 | "# Note that our indices were offset by 3\n", 203 | "# because 0, 1 and 2 are reserved indices for \"padding\", \"start of sequence\", and \"unknown\".\n", 204 | "decoded_newswire = ' '.join([reverse_word_index.get(i - 3, '?') for i in train_data[0]])" 205 | ] 206 | }, 207 | { 208 | "cell_type": "code", 209 | "execution_count": 7, 210 | "metadata": {}, 211 | "outputs": [ 212 | { 213 | "data": { 214 | "text/plain": [ 215 | "'? ? ? 
said as a result of its december acquisition of space co it expects earnings per share in 1987 of 1 15 to 1 30 dlrs per share up from 70 cts in 1986 the company said pretax net should rise to nine to 10 mln dlrs from six mln dlrs in 1986 and rental operation revenues to 19 to 22 mln dlrs from 12 5 mln dlrs it said cash flow per share this year should be 2 50 to three dlrs reuter 3'" 216 | ] 217 | }, 218 | "execution_count": 7, 219 | "metadata": {}, 220 | "output_type": "execute_result" 221 | } 222 | ], 223 | "source": [ 224 | "decoded_newswire" 225 | ] 226 | }, 227 | { 228 | "cell_type": "markdown", 229 | "metadata": {}, 230 | "source": [ 231 | "The label associated with an example is an integer between 0 and 45: a topic index." 232 | ] 233 | }, 234 | { 235 | "cell_type": "code", 236 | "execution_count": 8, 237 | "metadata": {}, 238 | "outputs": [ 239 | { 240 | "data": { 241 | "text/plain": [ 242 | "3" 243 | ] 244 | }, 245 | "execution_count": 8, 246 | "metadata": {}, 247 | "output_type": "execute_result" 248 | } 249 | ], 250 | "source": [ 251 | "train_labels[10]" 252 | ] 253 | }, 254 | { 255 | "cell_type": "markdown", 256 | "metadata": {}, 257 | "source": [ 258 | "## Preparing the data\n", 259 | "\n", 260 | "We can vectorize the data with the exact same code as in our previous example:" 261 | ] 262 | }, 263 | { 264 | "cell_type": "code", 265 | "execution_count": 9, 266 | "metadata": { 267 | "collapsed": true 268 | }, 269 | "outputs": [], 270 | "source": [ 271 | "import numpy as np\n", 272 | "\n", 273 | "def vectorize_sequences(sequences, dimension=10000):\n", 274 | " results = np.zeros((len(sequences), dimension))\n", 275 | " for i, sequence in enumerate(sequences):\n", 276 | " results[i, sequence] = 1.\n", 277 | " return results\n", 278 | "\n", 279 | "# Our vectorized training data\n", 280 | "x_train = vectorize_sequences(train_data)\n", 281 | "# Our vectorized test data\n", 282 | "x_test = vectorize_sequences(test_data)" 283 | ] 284 | }, 285 | { 286 | 
"cell_type": "markdown", 287 | "metadata": {}, 288 | "source": [ 289 | "\n", 290 | "To vectorize the labels, there are two possibilities: we could just cast the label list as an integer tensor, or we could use a \"one-hot\" \n", 291 | "encoding. One-hot encoding is a widely used format for categorical data, also called \"categorical encoding\". \n", 292 | "For a more detailed explanation of one-hot encoding, you can refer to Chapter 6, Section 1. \n", 293 | "In our case, one-hot encoding of our labels consists in embedding each label as an all-zero vector with a 1 in the place of the label index, e.g.:" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": 10, 299 | "metadata": { 300 | "collapsed": true 301 | }, 302 | "outputs": [], 303 | "source": [ 304 | "def to_one_hot(labels, dimension=46):\n", 305 | " results = np.zeros((len(labels), dimension))\n", 306 | " for i, label in enumerate(labels):\n", 307 | " results[i, label] = 1.\n", 308 | " return results\n", 309 | "\n", 310 | "# Our vectorized training labels\n", 311 | "one_hot_train_labels = to_one_hot(train_labels)\n", 312 | "# Our vectorized test labels\n", 313 | "one_hot_test_labels = to_one_hot(test_labels)" 314 | ] 315 | }, 316 | { 317 | "cell_type": "markdown", 318 | "metadata": {}, 319 | "source": [ 320 | "Note that there is a built-in way to do this in Keras, which you have already seen in action in our MNIST example:" 321 | ] 322 | }, 323 | { 324 | "cell_type": "code", 325 | "execution_count": 11, 326 | "metadata": { 327 | "collapsed": true 328 | }, 329 | "outputs": [], 330 | "source": [ 331 | "from keras.utils.np_utils import to_categorical\n", 332 | "\n", 333 | "one_hot_train_labels = to_categorical(train_labels)\n", 334 | "one_hot_test_labels = to_categorical(test_labels)" 335 | ] 336 | }, 337 | { 338 | "cell_type": "markdown", 339 | "metadata": {}, 340 | "source": [ 341 | "## Building our network\n", 342 | "\n", 343 | "\n", 344 | "This topic classification problem looks very 
similar to our previous movie review classification problem: in both cases, we are trying to \n", 345 | "classify short snippets of text. There is however a new constraint here: the number of output classes has gone from 2 to 46, i.e. the \n", 346 | "dimensionality of the output space is much larger. \n", 347 | "\n", 348 | "In a stack of `Dense` layers like what we were using, each layer can only access information present in the output of the previous layer. \n", 349 | "If one layer drops some information relevant to the classification problem, this information can never be recovered by later layers: each \n", 350 | "layer can potentially become an \"information bottleneck\". In our previous example, we were using 16-dimensional intermediate layers, but a \n", 351 | "16-dimensional space may be too limited to learn to separate 46 different classes: such small layers may act as information bottlenecks, \n", 352 | "permanently dropping relevant information.\n", 353 | "\n", 354 | "For this reason we will use larger layers. Let's go with 64 units:" 355 | ] 356 | }, 357 | { 358 | "cell_type": "code", 359 | "execution_count": 12, 360 | "metadata": { 361 | "collapsed": true 362 | }, 363 | "outputs": [], 364 | "source": [ 365 | "from keras import models\n", 366 | "from keras import layers\n", 367 | "\n", 368 | "model = models.Sequential()\n", 369 | "model.add(layers.Dense(64, activation='relu', input_shape=(10000,)))\n", 370 | "model.add(layers.Dense(64, activation='relu'))\n", 371 | "model.add(layers.Dense(46, activation='softmax'))" 372 | ] 373 | }, 374 | { 375 | "cell_type": "markdown", 376 | "metadata": {}, 377 | "source": [ 378 | "\n", 379 | "There are two other things you should note about this architecture:\n", 380 | "\n", 381 | "* We are ending the network with a `Dense` layer of size 46. This means that for each input sample, our network will output a \n", 382 | "46-dimensional vector. 
Each entry in this vector (each dimension) will encode a different output class.\n", 383 | "* The last layer uses a `softmax` activation. You have already seen this pattern in the MNIST example. It means that the network will \n", 384 | "output a _probability distribution_ over the 46 different output classes, i.e. for every input sample, the network will produce a \n", 385 | "46-dimensional output vector where `output[i]` is the probability that the sample belongs to class `i`. The 46 scores will sum to 1.\n", 386 | "\n", 387 | "The best loss function to use in this case is `categorical_crossentropy`. It measures the distance between two probability distributions: \n", 388 | "in our case, between the probability distribution output by our network, and the true distribution of the labels. By minimizing the \n", 389 | "distance between these two distributions, we train our network to output something as close as possible to the true labels." 390 | ] 391 | }, 392 | { 393 | "cell_type": "code", 394 | "execution_count": 13, 395 | "metadata": { 396 | "collapsed": true 397 | }, 398 | "outputs": [], 399 | "source": [ 400 | "model.compile(optimizer='rmsprop',\n", 401 | " loss='categorical_crossentropy',\n", 402 | " metrics=['accuracy'])" 403 | ] 404 | }, 405 | { 406 | "cell_type": "markdown", 407 | "metadata": {}, 408 | "source": [ 409 | "## Validating our approach\n", 410 | "\n", 411 | "Let's set apart 1,000 samples in our training data to use as a validation set:" 412 | ] 413 | }, 414 | { 415 | "cell_type": "code", 416 | "execution_count": 14, 417 | "metadata": { 418 | "collapsed": true 419 | }, 420 | "outputs": [], 421 | "source": [ 422 | "x_val = x_train[:1000]\n", 423 | "partial_x_train = x_train[1000:]\n", 424 | "\n", 425 | "y_val = one_hot_train_labels[:1000]\n", 426 | "partial_y_train = one_hot_train_labels[1000:]" 427 | ] 428 | }, 429 | { 430 | "cell_type": "markdown", 431 | "metadata": {}, 432 | "source": [ 433 | "Now let's train our network for 20 epochs:" 434 | 
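Before training, a quick aside on the loss chosen above: because the true distribution is one-hot, categorical crossentropy for a single sample reduces to the negative log of the probability the network assigns to the correct class. A small NumPy sketch (the 0.9 prediction is made up for illustration):

```python
import numpy as np

true = np.zeros(46); true[3] = 1.0            # one-hot label: class 3
pred = np.full(46, 0.1 / 45); pred[3] = 0.9   # network puts 0.9 on class 3

# crossentropy: -sum(true_i * log(pred_i)); only the true class term survives
loss = -np.sum(true * np.log(pred))

assert abs(loss - (-np.log(0.9))) < 1e-12
assert loss < 0.2   # a confident, correct prediction gives a small loss
```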
] 435 | }, 436 | { 437 | "cell_type": "code", 438 | "execution_count": 15, 439 | "metadata": {}, 440 | "outputs": [ 441 | { 442 | "name": "stdout", 443 | "output_type": "stream", 444 | "text": [ 445 | "Train on 7982 samples, validate on 1000 samples\n", 446 | "Epoch 1/20\n", 447 | "7982/7982 [==============================] - 1s - loss: 2.5241 - acc: 0.4952 - val_loss: 1.7263 - val_acc: 0.6100\n", 448 | "Epoch 2/20\n", 449 | "7982/7982 [==============================] - 0s - loss: 1.4500 - acc: 0.6854 - val_loss: 1.3478 - val_acc: 0.7070\n", 450 | "Epoch 3/20\n", 451 | "7982/7982 [==============================] - 0s - loss: 1.0979 - acc: 0.7643 - val_loss: 1.1736 - val_acc: 0.7460\n", 452 | "Epoch 4/20\n", 453 | "7982/7982 [==============================] - 0s - loss: 0.8723 - acc: 0.8178 - val_loss: 1.0880 - val_acc: 0.7490\n", 454 | "Epoch 5/20\n", 455 | "7982/7982 [==============================] - 0s - loss: 0.7045 - acc: 0.8477 - val_loss: 0.9822 - val_acc: 0.7760\n", 456 | "Epoch 6/20\n", 457 | "7982/7982 [==============================] - 0s - loss: 0.5660 - acc: 0.8792 - val_loss: 0.9379 - val_acc: 0.8030\n", 458 | "Epoch 7/20\n", 459 | "7982/7982 [==============================] - 0s - loss: 0.4569 - acc: 0.9037 - val_loss: 0.9039 - val_acc: 0.8050\n", 460 | "Epoch 8/20\n", 461 | "7982/7982 [==============================] - 0s - loss: 0.3668 - acc: 0.9238 - val_loss: 0.9279 - val_acc: 0.7890\n", 462 | "Epoch 9/20\n", 463 | "7982/7982 [==============================] - 0s - loss: 0.3000 - acc: 0.9326 - val_loss: 0.8835 - val_acc: 0.8070\n", 464 | "Epoch 10/20\n", 465 | "7982/7982 [==============================] - 0s - loss: 0.2505 - acc: 0.9434 - val_loss: 0.8967 - val_acc: 0.8150\n", 466 | "Epoch 11/20\n", 467 | "7982/7982 [==============================] - 0s - loss: 0.2155 - acc: 0.9473 - val_loss: 0.9080 - val_acc: 0.8110\n", 468 | "Epoch 12/20\n", 469 | "7982/7982 [==============================] - 0s - loss: 0.1853 - acc: 0.9506 - val_loss: 0.9025 
- val_acc: 0.8140\n", 470 | "Epoch 13/20\n", 471 | "7982/7982 [==============================] - 0s - loss: 0.1680 - acc: 0.9524 - val_loss: 0.9268 - val_acc: 0.8100\n", 472 | "Epoch 14/20\n", 473 | "7982/7982 [==============================] - 0s - loss: 0.1512 - acc: 0.9562 - val_loss: 0.9500 - val_acc: 0.8130\n", 474 | "Epoch 15/20\n", 475 | "7982/7982 [==============================] - 0s - loss: 0.1371 - acc: 0.9559 - val_loss: 0.9621 - val_acc: 0.8090\n", 476 | "Epoch 16/20\n", 477 | "7982/7982 [==============================] - 0s - loss: 0.1306 - acc: 0.9553 - val_loss: 1.0152 - val_acc: 0.8050\n", 478 | "Epoch 17/20\n", 479 | "7982/7982 [==============================] - 0s - loss: 0.1210 - acc: 0.9575 - val_loss: 1.0262 - val_acc: 0.8010\n", 480 | "Epoch 18/20\n", 481 | "7982/7982 [==============================] - 0s - loss: 0.1185 - acc: 0.9570 - val_loss: 1.0354 - val_acc: 0.8040\n", 482 | "Epoch 19/20\n", 483 | "7982/7982 [==============================] - 0s - loss: 0.1128 - acc: 0.9598 - val_loss: 1.0841 - val_acc: 0.8010\n", 484 | "Epoch 20/20\n", 485 | "7982/7982 [==============================] - 0s - loss: 0.1097 - acc: 0.9594 - val_loss: 1.0707 - val_acc: 0.8040\n" 486 | ] 487 | } 488 | ], 489 | "source": [ 490 | "history = model.fit(partial_x_train,\n", 491 | " partial_y_train,\n", 492 | " epochs=20,\n", 493 | " batch_size=512,\n", 494 | " validation_data=(x_val, y_val))" 495 | ] 496 | }, 497 | { 498 | "cell_type": "markdown", 499 | "metadata": {}, 500 | "source": [ 501 | "Let's display its loss and accuracy curves:" 502 | ] 503 | }, 504 | { 505 | "cell_type": "code", 506 | "execution_count": 16, 507 | "metadata": {}, 508 | "outputs": [ 509 | { 510 | "data": { 511 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYUAAAEWCAYAAACJ0YulAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3XmYFNW9//H3FxxBdgTEhWVweWQXxhE0iIAaoyZKUOIV\nMe5BfVyiZpHrFqOSq16vGtAYSaJxGSVejVvEoDeiaFRk+SGoaHABRFkGZBWMDPP9/XGqe5qhZ6Zn\nqememc/reerp6upT1d+u6alv1zmnTpm7IyIiAtAs2wGIiEjuUFIQEZEkJQUREUlSUhARkSQlBRER\nSVJSEBGRJCUFqVNm1tzMtphZj7osm01mdqCZ1XnfbTM71syWpjz/yMyGZ1K2Bu/1RzO7pqbrV7Ld\nW8zsz3W9Xcme3bIdgGSXmW1JedoK+DewI3p+obsXVWd77r4DaFPXZZsCdz+4LrZjZhcAZ7r7yJRt\nX1AX25bGT0mhiXP35EE5+iV6gbv/X0XlzWw3dy+pj9hEpP6p+kgqFVUP/MXMHjezzcCZZnaEmb1t\nZhvMbKWZTTazvKj8bmbmZpYfPX80ev1FM9tsZm+ZWa/qlo1eP8HM/mVmG81sipn908zOqSDuTGK8\n0Mw+NrP1ZjY5Zd3mZnaXma0zs0+B4yvZP9ea2bRyy+41szuj+QvMbHH0eT6JfsVXtK0VZjYymm9l\nZo9Esb0PHFqu7HVm9mm03ffN7ORo+QDgHmB4VDW3NmXf3piy/kXRZ19nZs+Y2T6Z7JuqmNmYKJ4N\nZvaKmR2c8to1ZvalmW0ysw9TPuvhZjY/Wr7azP470/eTGLi7Jk24O8BS4Nhyy24BvgVOIvyI2AM4\nDBhKONPcH/gXcGlUfjfAgfzo+aPAWqAQyAP+Ajxag7J7AZuB0dFrVwHbgXMq+CyZxPgs0B7IB75K\nfHbgUuB9oBvQCZgV/lXSvs/+wBagdcq21wCF0fOTojIGHA1sAwZGrx0LLE3Z1gpgZDR/B/Aq0BHo\nCXxQruxpwD7R3+SMKIau0WsXAK+Wi/NR4MZo/rgoxkFAS+B3wCuZ7Js0n/8W4M/RfJ8ojqOjv9E1\nwEfRfD9gGbB3VLYXsH80PwcYF823BYZm+3+hKU86U5BMvOHuz7t7qbtvc/c57j7b3Uvc/VNgKjCi\nkvWfdPe57r4dKCIcjKpb9gfAAnd/NnrtLkICSSvDGP/L3Te6+1LCATjxXqcBd7n7CndfB9xayft8\nCrxHSFYA3wXWu/vc6PXn3f1TD14B/gGkbUwu5zTgFndf7+7LCL/+U9/3CXdfGf1NHiMk9MIMtgsw\nHvijuy9w92+AicAIM+uWUqaifVOZ04Hn3P2V6G90KyGxDAVKCAmoX1QF+Vm07yAk94PMrJO7b3b3\n2Rl+DomBkoJk4vPUJ2bW28xeMLNVZrYJuAnoXMn6q1Lmt1J543JFZfdNjcPdnfDLOq0MY8zovQi/\ncCvzGDAumj8jep6I4wdmNtvMvjKzDYRf6ZXtq4R9KovBzM4xs3ejapoNQO8Mtwvh8yW35+6bgPXA\nfillqvM3q2i7pYS/0X7u/hHwM8LfYU1UHbl3VPRcoC/wkZm9Y2YnZvg5JAZKCpKJ8t0x7yf8Oj7Q\n3dsBNxCqR+K0klCdA4CZGTsfxMqrTYwrge4pz6vqMvsEcKyZ7Uc4Y3gsinEP4EngvwhVOx2AlzKM\nY1VFMZjZ/sB9wMVAp2i7H6Zst6rus18SqqQS22tLqKb6IoO4qrPdZoS/2RcA7v6ouw8jVB01J+wX\n3P0jdz+dUEX4P8BTZtaylrFIDSkpSE20BTYCX5tZH+DCenjPvwEFZnaSme0G/BToElOMTwBXmNl+\nZtYJuLqywu6+CngD+DPwkbsviV5qAewOFAM7zOwHwDHViOEaM+tg4TqOS1Nea0M48BcT8uNPCGcK\nCauBbomG9TQeB843s4Fm1oJwcH7d3Ss886pGzCeb2cjovX9Ba
AeabWZ9zGxU9H7boqmU8AF+bGad\nozOLjdFnK61lLFJDSgpSEz8Dzib8w99PaBCOlbuvBv4DuBNYBxwA/D/CdRV1HeN9hLr/RYRG0Ccz\nWOcxQsNxsurI3TcAVwJPExprxxKSWyZ+RThjWQq8CDycst2FwBTgnajMwUBqPfzLwBJgtZmlVgMl\n1v87oRrn6Wj9HoR2hlpx9/cJ+/w+QsI6Hjg5al9oAdxOaAdaRTgzuTZa9URgsYXebXcA/+Hu39Y2\nHqkZC1WzIg2LmTUnVFeMdffXsx2PSGOhMwVpMMzs+Kg6pQVwPaHXyjtZDkukUVFSkIbkSOBTQtXE\n94Ax7l5R9ZGI1ICqj0REJElnCiIiktTgBsTr3Lmz5+fnZzsMEZEGZd68eWvdvbJu3EADTAr5+fnM\nnTs322GIiDQoZlbVlfmAqo9ERCSFkoKIiCQpKYiISFKDa1MQkfq1fft2VqxYwTfffJPtUCQDLVu2\npFu3buTlVTT0VeWUFESkUitWrKBt27bk5+cTBqeVXOXurFu3jhUrVtCrV6+qV0ijSVQfFRVBfj40\naxYei6p1K3qRpu2bb76hU6dOSggNgJnRqVOnWp3VNfozhaIimDABtm4Nz5ctC88Bxtd6XEiRpkEJ\noeGo7d8qtjMFM+tuZjPN7IPoRt4/TVNmpIWbsC+IphvqOo5rry1LCAlbt4blIiKyszirj0qAn7l7\nX+Bw4BIz65um3OvuPiiabqrrIJYvr95yEckt69atY9CgQQwaNIi9996b/fbbL/n8228zu+3Cueee\ny0cffVRpmXvvvZeiOqpbPvLII1mwYEGdbKu+xVZ95O4rCTfwwN03m9liwu0TP4jrPdPp0SNUGaVb\nLiJ1r6gonIkvXx7+zyZNql1VbadOnZIH2BtvvJE2bdrw85//fKcy7o6706xZ+t+5Dz74YJXvc8kl\nl9Q8yEakXhqazSwfGMzOd4dKOCK6AfmLZtavgvUnmNlcM5tbXFxcrfeeNAlatdp5WatWYbmI1K1E\nG96yZeBe1oYXR+eOjz/+mL59+zJ+/Hj69evHypUrmTBhAoWFhfTr14+bbiqreEj8ci8pKaFDhw5M\nnDiRQw45hCOOOII1a9YAcN1113H33Xcny0+cOJEhQ4Zw8MEH8+abbwLw9ddfc+qpp9K3b1/Gjh1L\nYWFhlWcEjz76KAMGDKB///5cc801AJSUlPDjH/84uXzy5MkA3HXXXfTt25eBAwdy5pln1vk+y0Ts\nDc1m1gZ4CrjC3TeVe3k+0NPdt5jZicAzwEHlt+HuU4GpAIWFhdUa6zvxC6Uuf7mISHqVteHF8T/3\n4Ycf8vDDD1NYWAjArbfeyp577klJSQmjRo1i7Nix9O27c631xo0bGTFiBLfeeitXXXUVDzzwABMn\nTtxl2+7OO++8w3PPPcdNN93E3//+d6ZMmcLee+/NU089xbvvvktBQUGl8a1YsYLrrruOuXPn0r59\ne4499lj+9re/0aVLF9auXcuiRYsA2LBhAwC33347y5YtY/fdd08uq2+xnilEN+9+Cihy97+Wf93d\nN7n7lmh+OpBnZp3rOo7x42HpUigtDY9KCCLxqO82vAMOOCCZEAAef/xxCgoKKCgoYPHixXzwwa61\n1XvssQcnnHACAIceeihLly5Nu+1TTjlllzJvvPEGp59+OgCHHHII/fqlrdxImj17NkcffTSdO3cm\nLy+PM844g1mzZnHggQfy0UcfcfnllzNjxgzat28PQL9+/TjzzDMpKiqq8cVntRVn7yMD/gQsdvc7\nKyizd1QOMxsSxbMurphEJF4VtdXF1YbXunXr5PySJUv47W9/yyuvvMLChQs5/vjj0/bX33333ZPz\nzZs3p6SkJO22W7RoUWWZmurUqRMLFy5k+PDh3HvvvVx44YUAzJgxg4suuog5c+YwZMgQduzYUafv\nm4k4zxSGAT8Gjk7pcnqim
V1kZhdFZcYC75nZu8Bk4HTXreBEGqxstuFt2rSJtm3b0q5dO1auXMmM\nGTPq/D2GDRvGE088AcCiRYvSnomkGjp0KDNnzmTdunWUlJQwbdo0RowYQXFxMe7Oj370I2666Sbm\nz5/Pjh07WLFiBUcffTS33347a9euZWv5urh6EGfvozeASq+icPd7gHviikFE6lc22/AKCgro27cv\nvXv3pmfPngwbNqzO3+Oyyy7jrLPOom/fvskpUfWTTrdu3bj55psZOXIk7s5JJ53E97//febPn8/5\n55+Pu2Nm3HbbbZSUlHDGGWewefNmSktL+fnPf07btm3r/DNUpcHdo7mwsNB1kx2R+rN48WL69OmT\n7TByQklJCSUlJbRs2ZIlS5Zw3HHHsWTJEnbbLbcGh0j3NzOzee5eWMEqSbn1SUREctiWLVs45phj\nKCkpwd25//77cy4h1Fbj+jQiIjHq0KED8+bNy3YYsWoSo6SKiEhmlBRERCRJSUFERJKUFEREJElJ\nQURy2qhRo3a5EO3uu+/m4osvrnS9Nm3aAPDll18yduzYtGVGjhxJVV3c77777p0uIjvxxBPrZFyi\nG2+8kTvuuKPW26lrSgoiktPGjRvHtGnTdlo2bdo0xo0bl9H6++67L08++WSN3798Upg+fTodOnSo\n8fZynZKCiOS0sWPH8sILLyRvqLN06VK+/PJLhg8fnrxuoKCggAEDBvDss8/usv7SpUvp378/ANu2\nbeP000+nT58+jBkzhm3btiXLXXzxxclht3/1q18BMHnyZL788ktGjRrFqFGjAMjPz2ft2rUA3Hnn\nnfTv35/+/fsnh91eunQpffr04Sc/+Qn9+vXjuOOO2+l90lmwYAGHH344AwcOZMyYMaxfvz75/omh\ntBMD8b322mvJmwwNHjyYzZs313jfpqPrFEQkY1dcAXV9Q7FBgyA6nqa15557MmTIEF588UVGjx7N\ntGnTOO200zAzWrZsydNPP027du1Yu3Ythx9+OCeffHKF9ym+7777aNWqFYsXL2bhwoU7DX09adIk\n9txzT3bs2MExxxzDwoULufzyy7nzzjuZOXMmnTvvPIDzvHnzePDBB5k9ezbuztChQxkxYgQdO3Zk\nyZIlPP744/zhD3/gtNNO46mnnqr0/ghnnXUWU6ZMYcSIEdxwww38+te/5u677+bWW2/ls88+o0WL\nFskqqzvuuIN7772XYcOGsWXLFlq2bFmNvV01nSmISM5LrUJKrTpyd6655hoGDhzIscceyxdffMHq\n1asr3M6sWbOSB+eBAwcycODA5GtPPPEEBQUFDB48mPfff7/Kwe7eeOMNxowZQ+vWrWnTpg2nnHIK\nr7/+OgC9evVi0KBBQOXDc0O4v8OGDRsYMWIEAGeffTazZs1Kxjh+/HgeffTR5JXTw4YN46qrrmLy\n5Mls2LChzq+o1pmCiGSssl/0cRo9ejRXXnkl8+fPZ+vWrRx66KEAFBUVUVxczLx588jLyyM/Pz/t\ncNlV+eyzz7jjjjuYM2cOHTt25JxzzqnRdhISw25DGHq7quqjirzwwgvMmjWL559/nkmTJrFo0SIm\nTpzI97//faZPn86wYcOYMWMGvXv3rnGs5elMQURyXps2bRg1ahTnnXfeTg3MGzduZK+99iIvL4+Z\nM2eyLN0N2VMcddRRPPbYYwC89957LFy4EAjDbrdu3Zr27duzevVqXnzxxeQ6bdu2TVtvP3z4cJ55\n5hm2bt3K119/zdNPP83w4cOr/dnat29Px44dk2cZjzzyCCNGjKC0tJTPP/+cUaNGcdttt7Fx40a2\nbNnCJ598woABA7j66qs57LDD+PDDD6v9npXRmYKINAjjxo1jzJgxO/VEGj9+PCeddBIDBgygsLCw\nyl/MF198Meeeey59+vShT58+yTOOQw45hMGDB9O7d2+6d+++07DbEyZM4Pjjj2ffffdl5sy
ZyeUF\nBQWcc845DBkyBIALLriAwYMHV1pVVJGHHnqIiy66iK1bt7L//vvz4IMPsmPHDs4880w2btyIu3P5\n5ZfToUMHrr/+embOnEmzZs3o169f8i5ydUVDZ4tIpTR0dsNTm6GzVX0kIiJJSgoiIpKkpCAiVWpo\n1cxNWW3/VkoKIlKpli1bsm7dOiWGBsDdWbduXa0uaFPvIxGpVLdu3VixYgXFxcXZDkUy0LJlS7p1\n61bj9ZUURKRSeXl59OrVK9thSD1R9ZGIiCQpKYiISJKSgoiIJCkpiIhIkpKCiIgkKSmIiEiSkoKI\niCQpKYiISJKSgoiIJCkpiIhIUmxJwcy6m9lMM/vAzN43s5+mKWNmNtnMPjazhWZWEFc8IiJStTjH\nPioBfubu882sLTDPzF529w9SypwAHBRNQ4H7okcREcmC2M4U3H2lu8+P5jcDi4H9yhUbDTzswdtA\nBzPbJ66YRESkcvXSpmBm+cBgYHa5l/YDPk95voJdEwdmNsHM5prZXA3fKyISn9iTgpm1AZ4CrnD3\nTTXZhrtPdfdCdy/s0qVL3QYoIiJJsSYFM8sjJIQid/9rmiJfAN1TnneLlomISBbE2fvIgD8Bi939\nzgqKPQecFfVCOhzY6O4r44pJREQqF2fvo2HAj4FFZrYgWnYN0APA3X8PTAdOBD4GtgLnxhiPiIhU\nIbak4O5vAFZFGQcuiSsGERGpHl3RLCIiSUoKIiKSpKQgIiJJSgoiIpKkpCAiIklKCiIikqSkICIi\nSUoKIiKSpKQgIiJJSgoiIpKkpCAiIklKCiIikqSkICIiSUoKIiKSpKQgIiJJTSYplJbCa69lOwoR\nkdzWZJLCAw/AyJHwz39mOxIRkdzVZJLCuHHQtStcdx24ZzsaEZHc1GSSQuvWcO218Oqr8Mor2Y5G\nRCQ3NZmkADBhAnTvrrMFEZGKNKmk0KIFXH89vP02vPBCtqMREck9TSopAJxzDhxwQEgOpaXZjkZE\nJLc0uaSQlwc33ggLFsBf/5rtaEREckuTSwoQeiL17Qs33AA7dmQ7GhGR3NEkk0Lz5nDTTbB4MTz2\nWLajERHJHU0yKQCMGQODB4eqpO3bsx2NiEhuaLJJoVkzuPlm+PRTePDBbEcjIpIbmmxSADjxRDji\niJAcvvkm29GIiGRfk04KZnDLLbBiBdx/f7ajERHJviadFACOPhpGjYLf/Aa+/jrb0YiIZFeTTwoQ\nzhbWrIF77sl2JCIi2aWkAHznO6F94bbbYOPGbEcjIpI9sSUFM3vAzNaY2XsVvD7SzDaa2YJouiGu\nWDJx882wfj3cdVc2oxARya44zxT+DBxfRZnX3X1QNN0UYyxVKiiAU0+FO++EdeuyGYmISPbElhTc\nfRbwVVzbj8Ovfw1btsDtt2c7EhGR7Mh2m8IRZvaumb1oZv0qKmRmE8xsrpnNLS4uji2Yfv1g/HiY\nMgVWrYrtbUREclY2k8J8oKe7HwJMAZ6pqKC7T3X3Qncv7NKlS6xB/epX8O23oYtqQlER5OeHq6Dz\n88NzEZHGKGtJwd03ufuWaH46kGdmnbMVT8KBB8K554aL2ZYvDwlgwgRYtizcrW3ZsvBciUFEGqOs\nJQUz29vMLJofEsWSE028118fHm+5JdzXeevWnV/fujUsFxFpbHaLa8Nm9jgwEuhsZiuAXwF5AO7+\ne2AscLGZlQDbgNPdc+POyT16wIUXwu9+V/H9FpYvr9+YRETqg+XIcThjhYWFPnfu3NjfZ9Uq2H//\n0I6QbviLnj1h6dLYwxARqRNmNs/dC6sql+3eRzlr773hsstCVVHLlju/1qoVTJqUnbhEROKUUVIw\nswPMrEU0P9LMLjezDvGGln2//CW0aQMDBoQzA7PwOHVq6LoqItLYZHqm8BSww8wOBKYC3YFGfyPL\nTp3gyithzhz461+htDRUGSkhiEhjlWlSKHX3EmAMMMX
dfwHsE19YueOqq6BjR7ghqyMziYjUj0yT\nwnYzGwecDfwtWpYXT0i5pX37UI30wgvw1lvZjkZEJF6ZJoVzgSOASe7+mZn1Ah6JL6zcctllsNde\ncN112Y5ERCReGSUFd//A3S9398fNrCPQ1t1vizm2nNG6NVxzDbzySkgMDawXr4hIxjK6eM3MXgVO\njsrPA9aY2T/d/aoYY8spl1wC770XuqJ++WUYBiOvSVSgiUhTkmn1UXt33wScAjzs7kOBY+MLK/fs\ntlvoinrjjfDgg3DyyWGYbRGRxiTTpLCbme0DnEZZQ3OTYxZGUZ06FV56CUaOhNWrsx2ViEjdyTQp\n3ATMAD5x9zlmtj+wJL6wcttPfgLPPgsffBDu77ykye4JEWlsMm1o/l93H+juF0fPP3X3U+MNLbf9\n4AcwcyZs2hQSwzvvZDsiEZHay3SYi25m9rSZrYmmp8ysW9zB5bqhQ+HNN6FdOxg1KlzLICLSkGVa\nffQg8BywbzQ9Hy1r8g46KCSGPn1g9Gj44x+zHZGISM1lmhS6uPuD7l4STX8G4r0vZgPStSu8+ioc\ne2xob/j1r3Utg4g0TJkmhXVmdqaZNY+mM8mRu6TlijZt4Pnn4eyzQ7fVCROgpCTbUYmIVE+md147\nD5gC3AU48CZwTkwxNVh5eeEahm7dwkVuq1bBtGnhimgRkYYg095Hy9z9ZHfv4u57ufsPgSbd+6gi\nZuHezvfdB9Onw9FHQ3FxtqMSEclMbe681mSGuKiJiy4K92BYuDB0Wf3kk2xHJCJStdokBauzKBqp\n0aPhH/+Ar74KiaEebi0tIlIrtUkK6l+Tge98B/75T9hjDxgxAn72s3D3NhGRXFRpUjCzzWa2Kc20\nmXC9gmSgd+9wg57Ro2HyZDjgADj1VHj9dXVdFZHcUmlScPe27t4uzdTW3TPtuSTAPvvAY4/BZ5/B\n1VeH6xqOOgoKC+GRR+Dbb7MdoYhI7aqPpAa6dYPf/AY+/zzck2HbNjjrLOjZE26+GdasyXaEItKU\nKSnUg6IiyM+HZs3CY1ERtGoVLnB7/32YMQMGD4YbboAePeD880OvJRGR+qakELOionDwX7YstB8s\nWxaeFxWF183guOPCNQ2LF8N554UL3g45BI45JlwlXVqa3c8gIk2HkkLMrr0Wtm7dednWrWF5eb17\nw+9+F6qWbrsN/vWvcIe3gw+GKVNg8+b6iVlEmi7zBtb9pbCw0Oc2oA7/zZql72FkVvUZwPbt8PTT\ncPfdofdS+/bwi1/AFVdo6AyRhmLtWpg1K3RN37ED9twTOnbc+TEx37FjuPVvHMxsnrsXVllOSSFe\n+fmhyqi8nj2rd73CO++EBupnn4W99w6D7p1/fnxfIBGpmeLikARefRVeew0WLQrLW7YM46NVdcbf\ntm36xNGxY6hSPu64msWVaVLQISVmkyaFNoTUKqRWrcLy6hgyBJ55Jty74Ze/DMNo3HVXSBRjxoQz\nDxGpf2vWlCWBV18NnUcg/J8feSScfnq4n3thIey+e6gB2LAB1q8Pox1U9bh4cXj86quQVGqaFDKl\nM4V6UFQU2hCWLw+9iyZNgvHja74999AAPXFi+MIcfjjcfjsMH153MYtIeqtX75wEPvggLG/dOiSB\nESPKkkBeXt29r3uocm7evGbrq/qoCSgpgYceCl1Zv/wy3Df61luhX79sRyaS20pL4euvYcuWzKcN\nG2DOnPBDDMI9VI48MiSAESPg0EPrNgnUtawnBTN7APgBsMbd+6d53YDfAicCW4Fz3H1+VdtVUtjV\n1q1h+Ixbbw31lWefHe7+1r17tiMTqVvffgubNoVp48Zd59MtS51PHODL9wisTPPmoZ6/TRsYMKAs\nCRQU5HYSKC8XksJRwBbg4QqSwonAZYSkMBT4rbsPrWq7SgoVW7cutDHcc0/o9XT55aGKqWPHbEcm\nUjOffAIvvwwvvRS
qatavr3qdvLzQU69duzClzicO7tWZdt+9cbTZZT0pREHkA3+rICncD7zq7o9H\nzz8CRrr7ysq2qaRQtWXLQpXSI49Ahw5wzTVw6aWh94NILtuwAV55JSSBl1+GTz8Ny3v0CPdA33//\nnQ/06Q7+LVo0joN4XWsIvY/2Az5Peb4iWrZLUjCzCcAEgB49etRLcA1Zz56hreGqq+A//zNc2zB5\ncujG+sMfhu5tIrlg+3aYPbvsbOCdd0J9f5s24a6FV14ZetscdJAO9PWlQXRJdfepwFQIZwpZDqfB\nOOSQMHzGzJlhZNbzzw9T797hPg+J6eCDQ3WTSNzcYcmSsiQwc2ZoB2vWDA47LPTS++53Q4+6hlRf\n35hkMyl8AaQ2hXaLlkkdGzUq/Bp7/fVwVeWbb4ZrHh54ILzesSMccURZkhgyRFdMS+ZKS8saczds\nKJvKPy8uDt/BxMWc+flwxhkhCRx9tNq+ckU2k8JzwKVmNo3Q0LyxqvYEqTmzcP+Go44Kz93D2Epv\nvlk2TZ8eXmvePJxlpJ5N9Oih0/emZNu2MAbX8uXhIL58eXi+bt2uB/xNm6q+WVSbNqF9q7AwnLV+\n97vhZlP6TuWeOHsfPQ6MBDoDq4FfAXkA7v77qEvqPcDxhC6p57p7lS3IamiOz/r18PbbZUli9uzQ\nlxtg333D2cSgQdC/f+ia16uXqp0aIvdwcE8c7FMP/InH8vf1aNYs3Ciqc+dwcE9M7dtX/bx9ew3H\nkgtyovdRHJQU6k9JSRi35a23QpJ4++3QRTChVatwoVwiSSQeu3at+S9A93BAWrYsjA21bFnZ/OrV\n0Ldv2dlL3741v7qzMSsthVWrwj5L7MPEfCIJlO+n36pV6KDQo0fZY+r8fvupjr+hU1KQWGzZEi7r\nX7QI3nuv7HH16rIynTrtnCT69w9Tu3ZhlMiVK9Mf9BPz33yz83t26BAOTl26hJsPJX7FtmsXGiQT\nSWLo0LAsl2zfXlbfvnlzSGItW4Zuk+WnTBNpYh+mO+gnDvzlb+/apUvYh4kp9YDfs2fokaaqnMZN\nSaERqeuxk+JQXFyWJBKJ4r33QhJJ2GuvUEW1ffvO6yYOWPn5ZQet1Pn27cvKuoe+66ltIYsWheXN\nmoUklNoW0qtXzQ92paXhYJ4YnGz9+vA89erZ8vPln2/blvn75eVVnDBatAhVMKtWhe9B+X3YtWvY\nZ4kpsQ/z88N3Rh0HREmhkUjcua38KKtTp+ZeYiivtDQcwBJJ4uOPQ2JIPejXxQFr48bQvz2RJN56\nq2x44q5gdAYfAAAMfElEQVRdyxLEYYeFX9mJA3zqwT51PvF848aqG1Dbti27cCr1gqp0z9u0Cfvk\n3/8O0zfflM2Xn9K9tn17GDY93UF/jz1qtw+l8VNSaCTq6n4MTcmOHWH44tSzidS2kFR5eWU3N0kd\nt778fGJKNJ4mhkxQm4Y0FEoKjURt7twmZVavhgULQjVM6sG+dWvVpUvT0BCGuZAM9OiR/kxBo31U\nT9eu8L3vZTsKkdynXuY5btKk0IaQqiZ3bhMRyYSSQo4bPz40KvfsGao5evZsGI3MItIwqfqoARg/\nXklAROqHzhRERCRJSUFERJKUFEREJElJQUREkpQUREQkSUlBRESSlBSagKKiMIZSs2bhsago2xGJ\nSK7SdQqNXPlRVpctC89B1z6IyK50ptDIXXvtrnfZ2ro1LBcRKU9JoZFbvrx6y0WkaVNSaOQqGk1V\no6yKSDpKCo2cRlkVkepQUmjkNMqqiFSHeh81ARplVUQypTMFERFJUlIQEZEkJQUREUlSUhARkSQl\nBcmIxk8SaRrU+0iqpPGTRJoOnSlIlTR+kkjToaQgVdL4SSJNh5KCVEnjJ4k0HbEmBTM73sw+MrOP\nzWximtfPMbNiM1sQTRfEGY/UjMZPEmk6YksKZtYcuBc4AegLjDOzvmmK/sXdB0XTH
+OKR2pO4yeJ\nNB1x9j4aAnzs7p8CmNk0YDTwQYzvKTHR+EkiTUOc1Uf7AZ+nPF8RLSvvVDNbaGZPmln3dBsyswlm\nNtfM5hYXF8cRq8RM1zmINAzZbmh+Hsh394HAy8BD6Qq5+1R3L3T3wi5dutRrgFJ7iescli0D97Lr\nHJQYRHJPnEnhCyD1l3+3aFmSu69z939HT/8IHBpjPJIlus5BpOGIMynMAQ4ys15mtjtwOvBcagEz\n2yfl6cnA4hjjkSzRdQ4iDUdsDc3uXmJmlwIzgObAA+7+vpndBMx19+eAy83sZKAE+Ao4J654JHt6\n9AhVRumWi0huMXfPdgzVUlhY6HPnzs12GFIN5cdOgnCdg7q1itQfM5vn7oVVlct2Q7M0AbrOQaTh\n0CipUi90nYNIw6AzBWkQdJ2DSP3QmYLkPN3PQaT+6ExBcp6ucxCpP0oKkvN0nYNI/VFSkJyn+zmI\n1B8lBcl5dXE/BzVUi2RGSUFyXm2vc9CAfCKZ0xXN0ujl56cfZqNnT1i6tL6jEckOXdEsElFDtUjm\nlBSk0auLhmq1SUhToaQgjV5tG6rVJiFNiZKCNHq1bajWxXPSlCgpSJMwfnxoVC4tDY/VGR6jLtok\nVP0kDYWSgkgVatsmoeonaUiUFESqUNs2CVU/SUOipCBShdq2Saj6SRoSJQWRDNSmTSIXqp+UVCRT\nSgoiMct29ZPaNKQ6lBREYpbt6qe6aNPQmUbToaQgUg+yWf1U26SSC9VXSkr1R0lBJMfVtvqptkkl\n29VXSkr1zN0b1HTooYe6SFPz6KPuPXu6m4XHRx+t3rqtWrmHQ2qYWrXKfBtmO6+bmMwyW79nz/Tr\n9+xZP+vX9vPXdv3ENmr696uL9d3dgbmewTE26wf56k5KCiLVV5uDSm0PyrVNKkpKtU9K7pknBd1P\nQUQqlai+Sa1CatUq88by2t7PorbrN2sWDqXlmYU2nrjXz/bnT9D9FESkTtS291Rt20Sy3aaS7Yb+\n+r4fiJKCiFSpNr2naptUlJRqt361ZVLHlEuT2hREpLqy2dCrNoWYqU1BRBqaoqLQhXf58vALf9Kk\n6p1t1XZ9yLxNQUlBRKQJUEOziIhUW6xJwcyON7OPzOxjM5uY5vUWZvaX6PXZZpYfZzwiIlK52JKC\nmTUH7gVOAPoC48ysb7li5wPr3f1A4C7gtrjiERGRqsV5pjAE+NjdP3X3b4FpwOhyZUYDD0XzTwLH\nmJnFGJOIiFQizqSwH/B5yvMV0bK0Zdy9BNgIdCq/ITObYGZzzWxucXFxTOGKiMhu2Q4gE+4+FZgK\nYGbFZpbmou+c0BlYm+0gKpHr8UHux6j4akfx1U5t4uuZSaE4k8IXQPeU592iZenKrDCz3YD2wLrK\nNuruXeoyyLpkZnMz6fKVLbkeH+R+jIqvdhRf7dRHfHFWH80BDjKzXma2O3A68Fy5Ms8BZ0fzY4FX\nvKFdOCEi0ojEdqbg7iVmdikwA2gOPODu75vZTYTLrZ8D/gQ8YmYfA18REoeIiGRJrG0K7j4dmF5u\n2Q0p898AP4ozhno2NdsBVCHX44Pcj1Hx1Y7iq53Y42tww1yIiEh8NMyFiIgkKSmIiEiSkkI1mVl3\nM5tpZh+Y2ftm9tM0ZUaa2UYzWxBNN6TbVowxLjWzRdF77zKkrAWTozGnFppZQT3GdnDKfllgZpvM\n7IpyZep9/5nZA2a2xszeS1m2p5m9bGZLoseOFax7dlRmiZmdna5MTPH9t5l9GP0NnzazDhWsW+n3\nIcb4bjSzL1L+jidWsG6lY6TFGN9fUmJbamYLKlg31v1X0TEla9+/TG66oKlsAvYBCqL5tsC/gL7l\nyowE/pbFGJcCnSt5/UTgRcCAw4HZWYqzObAK6Jnt/QccBRQA76Usux2YGM1PBG5Ls96ewKfRY8do\nvmM9xXccsFs0f1u6+DL5PsQY343AzzP4DnwC7
A/sDrxb/v8prvjKvf4/wA3Z2H8VHVOy9f3TmUI1\nuftKd58fzW8GFrPr8B25bjTwsAdvAx3MbJ8sxHEM8Im7Z/0KdXefRegWnSp1bK6HgB+mWfV7wMvu\n/pW7rwdeBo6vj/jc/SUPw8MAvE24QDQrKth/mchkjLRaqyy+aLy104DH6/p9M1HJMSUr3z8lhVqI\nhvoeDMxO8/IRZvaumb1oZv3qNTBw4CUzm2dmE9K8nsm4VPXhdCr+R8zm/kvo6u4ro/lVQNc0ZXJl\nX55HOPtLp6rvQ5wujaq3Hqig+iMX9t9wYLW7L6ng9Xrbf+WOKVn5/ikp1JCZtQGeAq5w903lXp5P\nqBI5BJgCPFPP4R3p7gWEYcsvMbOj6vn9qxRd5X4y8L9pXs72/tuFh3P1nOy/bWbXAiVAUQVFsvV9\nuA84ABgErCRU0eSicVR+llAv+6+yY0p9fv+UFGrAzPIIf7wid/9r+dfdfZO7b4nmpwN5Zta5vuJz\n9y+ixzXA04RT9FSZjEsVtxOA+e6+uvwL2d5/KVYnqtWixzVpymR1X5rZOcAPgPHRgWMXGXwfYuHu\nq919h7uXAn+o4H2zvf92A04B/lJRmfrYfxUcU7Ly/VNSqKao/vFPwGJ3v7OCMntH5TCzIYT9XOlA\nf3UYX2sza5uYJzRGvleu2HPAWVEvpMOBjSmnqfWlwl9n2dx/5aSOzXU28GyaMjOA48ysY1Q9cly0\nLHZmdjzwS+Bkd99aQZlMvg9xxZfaTjWmgvfNZIy0OB0LfOjuK9K9WB/7r5JjSna+f3G1qDfWCTiS\ncBq3EFgQTScCFwEXRWUuBd4n9KR4G/hOPca3f/S+70YxXBstT43PCHfF+wRYBBTW8z5sTTjIt09Z\nltX9R0hQK4HthHrZ8wn39vgHsAT4P2DPqGwh8MeUdc8DPo6mc+sxvo8J9cmJ7+Hvo7L7AtMr+z7U\nU3yPRN+vhYQD3D7l44uen0jocfNJfcYXLf9z4nuXUrZe918lx5SsfP80zIWIiCSp+khERJKUFERE\nJElJQUREkpQUREQkSUlBRESSlBREIma2w3YewbXORuw0s/zUETpFclWst+MUaWC2ufugbAchkk06\nUxCpQjSe/u3RmPrvmNmB0fJ8M3slGvDtH2bWI1re1cL9Dd6Npu9Em2puZn+Ixsx/ycz2iMpfHo2l\nv9DMpmXpY4oASgoiqfYoV330HymvbXT3AcA9wN3RsinAQ+4+kDAY3eRo+WTgNQ8D+hUQroQFOAi4\n1937ARuAU6PlE4HB0XYuiuvDiWRCVzSLRMxsi7u3SbN8KXC0u38aDVy2yt07mdlawtAN26PlK929\ns5kVA93c/d8p28gnjHt/UPT8aiDP3W8xs78DWwijwT7j0WCAItmgMwWRzHgF89Xx75T5HZS16X2f\nMBZVATAnGrlTJCuUFEQy8x8pj29F828SRvUEGA+8Hs3/A7gYwMyam1n7ijZqZs2A7u4+E7gaaA/s\ncrYiUl/0i0SkzB62883b/+7uiW6pHc1sIeHX/rho2WXAg2b2C6AYODda/lNgqpmdTzgjuJgwQmc6\nzYFHo8RhwGR331Bnn0ikmtSmIFKFqE2h0N3XZjsWkbip+khERJJ0piAiIkk6UxARkSQlBRERSVJS\nEBGRJCUFERFJUlIQEZGk/w+dYzD20tOTDwAAAABJRU5ErkJggg==\n", 512 | "text/plain": [ 513 | "" 514 | ] 515 | }, 516 | "metadata": {}, 517 | "output_type": "display_data" 518 | } 519 | ], 520 | "source": [ 521 | "import matplotlib.pyplot as plt\n", 522 | "\n", 523 | "loss = history.history['loss']\n", 524 | "val_loss = 
history.history['val_loss']\n", 525 | "\n", 526 | "epochs = range(1, len(loss) + 1)\n", 527 | "\n", 528 | "plt.plot(epochs, loss, 'bo', label='Training loss')\n", 529 | "plt.plot(epochs, val_loss, 'b', label='Validation loss')\n", 530 | "plt.title('Training and validation loss')\n", 531 | "plt.xlabel('Epochs')\n", 532 | "plt.ylabel('Loss')\n", 533 | "plt.legend()\n", 534 | "\n", 535 | "plt.show()" 536 | ] 537 | }, 538 | { 539 | "cell_type": "code", 540 | "execution_count": 18, 541 | "metadata": {}, 542 | "outputs": [ 543 | { 544 | "data": { 545 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEWCAYAAACJ0YulAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3XmcFNW5//HPwyargAwqsg0xKuKCwgT1525c0Khckasi\nSdwI0Su4xCxEvGpUsvrzGo0/r0Qweh0lKheDiTsSlxiVQRlQUCEIOICAgCAOAgPP749T0zRNz0wP\nM9XdM/N9v1796u6qU1VP1/TU0+ecqlPm7oiIiAA0y3UAIiKSP5QUREQkQUlBREQSlBRERCRBSUFE\nRBKUFEREJEFJQXZhZs3NbKOZ9arPsrlkZt80s3o//9rMTjWzxUnvPzKz4zMpuxvbetDMbtzd5UUy\n0SLXAUjdmdnGpLdtgc3Atuj9D929uDbrc/dtQPv6LtsUuPtB9bEeMxsJfNfdT0pa98j6WLdIdZQU\nGgF3TxyUo1+iI9395arKm1kLd6/IRmwiNdH3Mb+o+agJMLM7zOzPZva4mX0JfNfMjjGzt8zsCzNb\nYWb3mFnLqHwLM3MzK4zePxrNf87MvjSzf5pZn9qWjeafaWYfm9l6M7vXzP5hZpdWEXcmMf7QzBaa\n2Tozuydp2eZm9l9mtsbMFgGDq9k/48xscsq0+8zsruj1SDObH32ef0W/4qtaV5mZnRS9bmtm/xPF\n9gEwMKXsTWa2KFrvB2Z2bjT9MOAPwPFR09znSfv21qTlr4w++xoze9rMumWyb2qznyvjMbOXzWyt\nmX1mZj9N2s5/Rvtkg5mVmNl+6ZrqzOyNyr9ztD9fi7azFrjJzA4wsxnRNj6P9lvHpOV7R59xdTT/\n92bWOor54KRy3cys3My6VPV5pQburkcjegCLgVNTpt0BbAHOIfwQaAN8CziKUFv8BvAxMDoq3wJw\noDB6/yjwOVAEtAT+DDy6G2X3Br4EhkTzfgRsBS6t4rNkEuNfgI5AIbC28rMDo4EPgB5AF+C18HVP\nu51vABuBdknrXgUURe/PicoYcAqwCTg8mncqsDhpXWXASdHrO4G/A52B3sC8lLIXAN2iv8nFUQz7\nRPNGAn9PifNR4Nbo9elRjEcArYH/B7ySyb6p5X7uCKwErgX2APYEBkXzfg6UAgdEn+EIYC/gm6n7\nGnij8u8cfbYK4CqgOeH7eCDwbaBV9D35B3Bn0ud5P9qf7aLyx0bzJgDjk7ZzAzA11/+HDfmR8wD0\nqOc/aNVJ4ZUalvsx8GT0Ot2B/r+Typ4LvL8bZS8HXk+aZ8AKqkgKGcZ4dNL8/wV+HL1+jdCMVjnv\nrNQDVcq63wIujl6fCXxUTdm/AldHr6tLCkuT/xbAfySXTbPe94HvRK9rSgoPA79MmrcnoR+pR037\nppb7+XvAzCrK/asy3pTpmSSFRTXEM
Kxyu8DxwGdA8zTljgU+ASx6PxsYWt//V03poeajpuPT5Ddm\n1tfM/hY1B2wAbgMKqln+s6TX5VTfuVxV2f2S4/DwX1xW1UoyjDGjbQFLqokX4DFgePT64uh9ZRxn\nm9nbUdPGF4Rf6dXtq0rdqovBzC41s9KoCeQLoG+G64Xw+RLrc/cNwDqge1KZjP5mNeznnoSDfzrV\nzatJ6vdxXzN7wsyWRTH8KSWGxR5OatiJu/+DUOs4zswOBXoBf9vNmAT1KTQlqadjPkD4ZfpNd98T\nuJnwyz1OKwi/ZAEwM2Png1iqusS4gnAwqVTTKbNPAKeaWXdC89ZjUYxtgKeAXxGadjoBL2YYx2dV\nxWBm3wDuJzShdInW+2HSems6fXY5oUmqcn0dCM1UyzKIK1V1+/lTYP8qlqtq3ldRTG2Tpu2bUib1\n8/2GcNbcYVEMl6bE0NvMmlcRxyPAdwm1mifcfXMV5SQDSgpNVwdgPfBV1FH3wyxs86/AADM7x8xa\nENqpu8YU4xPAdWbWPep0/Fl1hd39M0ITx58ITUcLoll7ENq5VwPbzOxsQtt3pjHcaGadLFzHMTpp\nXnvCgXE1IT/+gFBTqLQS6JHc4ZviceAKMzvczPYgJK3X3b3Kmlc1qtvP04BeZjbazPYwsz3NbFA0\n70HgDjPb34IjzGwvQjL8jHBCQ3MzG0VSAqsmhq+A9WbWk9CEVemfwBrglxY679uY2bFJ8/+H0Nx0\nMSFBSB0oKTRdNwCXEDp+HyB0CMfK3VcCFwJ3Ef7J9wfeI/xCrO8Y7wemA3OBmYRf+zV5jNBHkGg6\ncvcvgOuBqYTO2mGE5JaJWwg1lsXAcyQdsNx9DnAv8E5U5iDg7aRlXwIWACvNLLkZqHL55wnNPFOj\n5XsBIzKMK1WV+9nd1wOnAecTEtXHwInR7N8BTxP28wZCp2/rqFnwB8CNhJMOvpny2dK5BRhESE7T\ngClJMVQAZwMHE2oNSwl/h8r5iwl/583u/mYtP7ukqOycEcm6qDlgOTDM3V/PdTzScJnZI4TO61tz\nHUtDp4vXJKvMbDDhTJ9NhFMatxJ+LYvslqh/ZghwWK5jaQzUfCTZdhywiNCWfgZwnjoGZXeZ2a8I\n10r80t2X5jqexkDNRyIikqCagoiIJDS4PoWCggIvLCzMdRgiIg3KrFmzPnf36k4BBxpgUigsLKSk\npCTXYYiINChmVtNV/YCaj0REJImSgoiIJCgpiIhIgpKCiIgkKCmIiEiCkoKISMyKi6GwEJo1C8/F\nxdldvjaUFESk0cvlQbm4GEaNgiVLwD08jxqV+Trqunyt5frWb7V9DBw40EUkux591L13b3ez8Pzo\now1n+UcfdW/b1j0cUsOjbdvM11HX5Xv33nnZykfv3tlZvhJQ4hkcY3N+kK/tQ0lBpPYa8kG1oR+U\nzdIvb5ad5SspKYg0Irk8qOf6oNrQD8q5/vyVMk0K6lMQyXN1bVMeNw7Ky3eeVl4epmdiaRUDUlc1\nPd+W71XF3bmrml7fy48fD23b7jytbdswPRvL15aSgkjM6trJmeuDeq4Pqg39oDxiBEyYAL17g1l4\nnjAhTM/G8rWWSXUinx5qPpJsy2XTjXvumy9y3SdQH/sw1x3l+QD1KYjUXa7b4+tjHY3hoNoYDsq5\nlmlSaHB3XisqKnINnS3ZUlgY2vBT9e4NixfXvHyzZuEwnMoMtm/PLIbKPoXkJqS2bWvXhFBcHJqb\nli4NzS7jx8fY/CB5ycxmuXtRTeXUpyBSjVy3x0P9tCmPGBGS2Pbt4VkJQaqipCBSjVx3clbSQV2y\nRUlBGr26nP2T6zNPRLKtwd2OU6Q2UtvjK8/xh8wOzJVl6tIeP2KEkoA0HOpolkatrh3FIo2FOppF\nqHtHsUhTo6Qgea8ufQL1cfaPSFOipCB5ra7j/mR73BiRhk5JQfJaXcf90dk/IrWjjmbJa/VxRbCI\nq
KNZGgn1CYhkl5KC5DX1CYhkl5KC5DX1CYhkl65olrynK4JFskc1BYldXe88JiLZo5qCxKquYw+J\nSHbFWlMws8Fm9pGZLTSzsWnm9zaz6WY2x8z+bmY94oxHsq+u1xmISHbFlhTMrDlwH3Am0A8Ybmb9\nUordCTzi7ocDtwG/iiseyQ2NPSTSsMRZUxgELHT3Re6+BZgMDEkp0w94JXo9I818aeB0nYFIwxJn\nUugOfJr0viyalqwUGBq9Pg/oYGZdUldkZqPMrMTMSlavXh1LsBIPXWcg0rDk+uyjHwMnmtl7wInA\nMmBbaiF3n+DuRe5e1LVr12zHKHWg6wxEGpY4zz5aBvRMet8jmpbg7suJagpm1h44392/iDEmyQFd\nZyDScMRZU5gJHGBmfcysFXARMC25gJkVmFllDD8HJsUYj4iI1CC2pODuFcBo4AVgPvCEu39gZreZ\n2blRsZOAj8zsY2AfQC3NIiI5pKGzRUSaAA2dLfVGw1SINB0a5kKqpWEqRJoW1RSkWhqmQqRpUVKQ\nammYCpGmRUlBqqVhKkSaFiUFqZaGqRBpWpQUpFoapkKkadHZR1IjDVMh0nSopiAiIglKCiIikqCk\nICIiCUoKIiKSoKQgIiIJSgpNgAa0E5FM6ZTURk4D2olIbaim0MhpQDsRqQ0lhUZOA9qJSG0oKTRy\nGtBORGpDSaGR04B2IlIbSgqNnAa0E5Ha0NlHTYAGtBORTKmmICIiCUoKIiKSoKQgIiIJSgoiIpKg\npCAiIglKCg2ABrQTkWzRKal5TgPaiUg2qaaQ5zSgnYhkk2oKeU4D2uXO55/DO++Ex9tvw4cfQpcu\n0L079OgRnpMfPXpAhw513+62bbBuXdj+6tXh+fPPYf16KCjYeft77ln37WXCHbZsgVatwpXx0ngp\nKeS5Xr1Ck1G66VJ/Nm2C997bkQDeeQcWLQrzmjWDQw6Bo4+GL76ATz6BN96AtWt3XU+HDrsmisrX\nXbvChg07H+jTvV67FrZvzyzu9u3Tbyt52t57Q/PmOy+3dSusWbPrtquLbfNmaNeu+m117w777LPr\n9qThUFLIc+PH79ynABrQrq62bw+/+pNrAXPmQEVFmN+zJwwaBD/8IRx1FAwcGA6+qcrLYflyWLYs\nPMrKdrxetgxeeQVWrAi//NNp3jz88u/aNTwfeuiO15XPya/33DMcnFO3U7ntGTPC9io/R/J2unUL\nyeHLL8M6vvii6v3TseOObfbsCUceGV537BgSROX2Xn01fP5029t33/TJI/l96kCNkh/M3XMdQ60U\nFRV5SUlJrsPIquLi0IewdGmoIYwfr07m2lq4EB56CN56C0pKwi92CAfab30rJIGjjgrP3brV33a3\nbYNVq8KBdPVq6NRpx4G+Y8f6b4rZvn3H9lKTx6pVYZvJiSY1+XTpAi1b1m571SWqyteV+ztZp041\n1zoKCtRcVV/MbJa7F9VYLs6kYGaDgd8DzYEH3f3XKfN7AQ8DnaIyY9392erW2RSTQr6qqAi/PL/8\nMvzTb9iw43W6aWYwfDiccEL2/tHnzoVf/Qr+/OfQDNS/fzjwVyaBgw4K0yVeGzemTxzJCWTlyl2b\nzVq1gv32S584Kh/77Qd77BFP3O7h+/v556Gfp337kKg6d87O98Y97LvKZryePUMtbHfkPCmYWXPg\nY+A0oAyYCQx393lJZSYA77n7/WbWD3jW3QurW6+SQm489RTceWdo76482G/alNmybdqEX+Tl5eEf\n7LDDYMyYUNuJqwnh7bfhl7+EadPCP/JVV8H119dvLUDqV0UFfPZZ9TWOZct2PRsPQi2nqtpG5aNT\npx19Ken6TqrqT9myZdftNWsWalXVNfelPrdps3NfTqYxbN68Y7v33w9XXrl7+zfTpBBnn8IgYKG7\nL4oCmgwMAeYllXGg8vyJjsDyGOOR3bBxI1x7LUyaFDpbi4pCZ+q
ee4ZH5evU58rXHTpAi+hbVl4O\njz8O994b+kl++lO44gr4j/+Ab3yj7rG6h3b18eNDe37nznDrrSEB7bVX3dcv8WrRIhzIe/Souox7\nOAuruuaqd94JB9RUrVqlP8BX6tx5x4G9sDB815MP9p077/jVnnrwnj8/PK9ZU/VJAq1bw9dfV739\nqvpykpNL//5VL19f4qwpDAMGu/vI6P33gKPcfXRSmW7Ai0BnoB1wqrvPSrOuUcAogF69eg1cku50\nHKl3JSVw8cWhPf7GG+GWW2rX3lwV93D2zr33wv/+b/gnOvvscPA+9dTaNy25w1//GpLB22+H6vUN\nN4SO4vo4RVQans2bdz4JYNmyUAtp3z79L/m99qqf7/b27TtOJ06tAaxdG76P9dGXszsyrSng7rE8\ngGGEfoTK998D/pBS5kfADdHrYwi1iGbVrXfgwIEu8dq2zf03v3Fv0cK9Rw/3v/89vm2VlbnfdJN7\n167u4H7QQe733uu+YUPNy1ZUuD/+uPthh4VlCwvd77/ffdOm+OIVaaiAEs/g2B1nV8kyoGfS+x7R\ntGRXAE8AuPs/gdZAQYwxSQ2WLYPTToOf/Qz+7d+gtBROPDG+7XXvDrffDp9+Co88EpqdxowJ06+5\nBj76aNdltmyBiROhb9/QcV1REZb9+OPQ3tq6dXzxijR2cSaFmcABZtbHzFoBFwHTUsosBb4NYGYH\nE5JCmtZAyYann4bDDw+nbT74IDzxRPba4vfYA773vR3XDQwZAv/93+HAf8YZoXlo40a45x7Yf38Y\nOTIkkClT4P33w7JxV79FmoK4T0k9C7ibcLrpJHcfb2a3Eaox06Izjv4ItCd0Ov/U3V+sbp06+6j+\nlZfDj34EDzwAAwbAY4+FUzVzbeVKmDAhJIfly0NHZEVFOKX1xhvh9NN1DrtIpnJ+SmpclBTq1+zZ\noTN5/nz4yU/gjjvCWRr5ZOtWmDo1XEE7fDgcd1yuIxJpePLhlFTJY9u3w+9/D2PHhjMfXnopnPmT\nj1q2hAsuCA8RiZeSQhP02Wdw6aXwwgtw7rmh07ZA3fsigu6n0OT87W+hM/nVV8PVkU8/rYQgIjso\nKTQRX38dTvE8++ww1MOsWeH0TXXUikgyJYUm4P33w0ig994L110XTvns1y/XUYlIPlJSaMTc4Q9/\nCGO4rFoFzz4L//VfurhLRKqmpJAFxcVhgK1mzcJzcXH821y9OnQijxkDp5wSbiJz5pnxb1dEGjad\nfRSz4uKd75y2ZEl4D/HdKOfFF+GSS8LAXPfcA6NHq+9ARDKjmkLMxo3bdfz38vIwvb5t3hxGBz3j\njDA8xTvvhJqCEoKIZEo1hZgtXVq76btr/vxwZfLs2eH+BHfeGW7qISJSG6opxKxXr9pNry33MGbR\nwIFhpNG//AXuu08JQUR2j5JCzMaP3/WWk23bhul1tWYNDB0arjc49tjQmXzuuXVfr4g0XUoKMRsx\nIoz02bt3aNvv3Tu8r2sn8yuvhCuT//a30FT0wgvhBuYiInWhPoUsGDGi/s402rIFbr4ZfvtbOPBA\neOaZMNy1iEh9UFJoQD7+OHQmz5oFP/hBuBCtXbtcRyUijYmSQh7YtGnXG32nez1zZrhD2ZQpoS9B\nRKS+KSlkyZIl4aygFSt2Peh/9VX6ZZo1C/c6KCiArl3D/QRuvx169Mhu7CLSdGSUFMxsf6DM3Teb\n2UnA4cAj7v5FnME1FkuXhltIrlgRbkjftWt4HHxweK486BcU7Py6c+eQGEREsiXTmsIUoMjMvglM\nAP4CPAacFVdgjcXy5WHsofXr4a231CksIvkt09+h2929AjgPuNfdfwJ0iy+sxmHVKvj2t8MN6J9/\nXglBRPJfpjWFrWY2HLgEOCea1jKekBqHtWvhtNNCX8Lzz8PRR+c6IhGRmmVaU7gMOAYY7+6fmFkf\n4H/iC6thW78eTj8dPvoIpk0
L/QkiIg1BRjUFd58HXANgZp2BDu7+mzgDa6g2boSzzgpDTkydCqee\nmuuIREQyl1FNwcz+bmZ7mtlewLvAH83srnhDa3jKy+Gcc8LtLidPhu98J9cRiYjUTqbNRx3dfQMw\nlHAq6lGAfgMn2bwZzjsPXn0VHnlEF5eJSMOUaVJoYWbdgAuAv8YYT4O0dWu4sOzFF2HixDAUhYhI\nQ5RpUrgNeAH4l7vPNLNvAAviC6vhqKgIg91NmxauWL7sslxHJCKy+zLtaH4SeDLp/SLg/LiCaii2\nbQtJ4Mkn4a67wh3PREQaskw7mnuY2VQzWxU9pphZkx6BZ/v2cHObRx8NN8y5/vpcRyQiUneZNh89\nBEwD9osez0TTmiR3uPZaePBBuOkmuPHGXEckIlI/Mk0KXd39IXeviB5/ArrGGFfecoef/Qz+8Ae4\n4Qa47bZcRyQiUn8yTQprzOy7ZtY8enwXWBNnYPnq1lvhd7+Dq68Oz2a5jkhEpP5kmhQuJ5yO+hmw\nAhgGXBpTTHnr178ONYMrroB77lFCEJHGJ6Ok4O5L3P1cd+/q7nu7+7/RxM4+euop+PnPwzUIDzyg\n+xyISONUl0Pbj2oqYGaDzewjM1toZmPTzP8vM5sdPT42s7y9ac/dd8OBB8LDD0Pz5rmORkQkHnW5\nHWe1jSdm1hy4DzgNKANmmtm0aHA9ANz9+qTyY4Aj6xBPbD78EP7xD/jNb6CFbmAqIo1YXWoKXsP8\nQcBCd1/k7luAycCQasoPBx6vQzyxmTQpJIPvfz/XkYiIxKva371m9iXpD/4GtKlh3d2BT5PelwFH\nVbGd3kAf4JUq5o8CRgH06tWrhs3Wr61bQ5PR2WfDvvtmddMiIllXbVJw9w5ZiuMi4Cl331ZFHBMI\n94amqKiophpKvfrrX8NtNa+4IptbFRHJjTjPoVkG9Ex63yOals5F5GnT0cSJsN9+MHhwriMREYlf\nnElhJnCAmfUxs1aEA/+01EJm1hfoDPwzxlh2y7Jl8NxzUFQE3/xmOA21sBCKi3MdmYhIPGI7l8bd\nK8xsNGHI7ebAJHf/wMxuA0rcvTJBXARMdvesNgtl4k9/CgPfvfgifP11mLZkCYwaFV6PGJGz0ERE\nYmF5eCyuVlFRkZeUlMS+ne3b4YADQm1h8+Zd5/fuDYsXxx6GiEi9MLNZ7l5UUzmddV+Fv/8dFi2q\nev7SpVkLRUQkazRYQxUmToROnaBnz/Tzs3xmrIhIVigppLFuHUyZEvoMfvUraNt25/lt24Yb64iI\nNDZqPkqjuDj0I1xxBRwZDbwxblxoMurVKyQEdTKLSGOkjuY0jjwynH46a1asmxERyZpMO5rVfJTi\n3Xdh9mxdwSwiTZOSQooHH4TWrcN9E0REmholhSSbNsFjj8GwYeHMIxGRpkZJIcmUKbB+vZqORKTp\nUlJI8uCDsP/+cOKJuY5ERCQ3lBQiCxbAq6+GWoJVe085EZHGS0khMmlSOA31kktyHYmISO4oKQAV\nFeHuamedFe6dICLSVCkpEO6ZsGIFjByZ60hERHJLSYHQwbzPPqGmICLSlDX5pLBiBfztb6EvoWXL\nXEcjIpJbTT4pPPIIbNumaxNERKCJJwX3cN+E44+HAw/MdTQiIrnXpJPC66+H6xNUSxARCZp0Upg4\nEfbcM4x1JCIiTTgprF8PTz4Jw4dDu3a5jkZEJD802aTw+ONhVFQ1HYmI7NBkk8LEiXD44VBU432I\nRESajiaZFEpLoaREg9+JiKRqkklh4kRo1QpGjMh1JCIi+aXJJYWvv4ZHH4WhQ6FLl1xHIyKSX5pc\nUpg6FdatUweziEg6TS4pTJwIhYVwyim5jkREJP80qaTwyScwfTpcfnm4oY6IiOysSR0aJ00KZxtd\nemmuIxERyU9NJils2wZ/+hOccQb07JnraERE8lOTSQovvghlZbq7mohIdZpMUli+HA44AM45J
9eR\niIjkr1iTgpkNNrOPzGyhmY2toswFZjbPzD4ws8fiiuWKK+DDD8NFayIikl6LuFZsZs2B+4DTgDJg\npplNc/d5SWUOAH4OHOvu68xs77jiAZ1xJCJSkzgPk4OAhe6+yN23AJOBISllfgDc5+7rANx9VYzx\niIhIDeJMCt2BT5Pel0XTkh0IHGhm/zCzt8xscLoVmdkoMysxs5LVq1fHFK6IiOS6QaUFcABwEjAc\n+KOZdUot5O4T3L3I3Yu6du2a5RBFRJqOOJPCMiD5ioAe0bRkZcA0d9/q7p8AHxOShIiI5ECcSWEm\ncICZ9TGzVsBFwLSUMk8TagmYWQGhOWlRjDGJiEg1YksK7l4BjAZeAOYDT7j7B2Z2m5mdGxV7AVhj\nZvOAGcBP3H1NXDGJiEj1zN1zHUOtFBUVeUlJSa7DEBFpUMxslrvXeAPiXHc0i4hIHlFSEBGRBCUF\nERFJUFIQEZEEJQUREUlQUhARkQQlBRERSVBSEBGRBCUFERFJUFIQEZEEJQUREUmI7XacItK4bN26\nlbKyMr7++utchyLVaN26NT169KBly5a7tbySgohkpKysjA4dOlBYWIiZ5TocScPdWbNmDWVlZfTp\n02e31qHmIxHJyNdff02XLl2UEPKYmdGlS5c61eaUFEQkY0oI+a+ufyMlBRERSVBSEJFYFBdDYSE0\naxaei4vrtr41a9ZwxBFHcMQRR7DvvvvSvXv3xPstW7ZktI7LLruMjz76qNoy9913H8V1DbYBU0ez\niNS74mIYNQrKy8P7JUvCe4ARI3ZvnV26dGH27NkA3HrrrbRv354f//jHO5Vxd9ydZs3S/9596KGH\natzO1VdfvXsBNhKqKYhIvRs3bkdCqFReHqbXt4ULF9KvXz9GjBjBIYccwooVKxg1ahRFRUUccsgh\n3HbbbYmyxx13HLNnz6aiooJOnToxduxY+vfvzzHHHMOqVasAuOmmm7j77rsT5ceOHcugQYM46KCD\nePPNNwH46quvOP/88+nXrx/Dhg2jqKgokbCS3XLLLXzrW9/i0EMP5corr6Ty9scff/wxp5xyCv37\n92fAgAEsXrwYgF/+8pccdthh9O/fn3Fx7KwMKCmISL1burR20+vqww8/5Prrr2fevHl0796dX//6\n15SUlFBaWspLL73EvHnzdllm/fr1nHjiiZSWlnLMMccwadKktOt2d9555x1+97vfJRLMvffey777\n7su8efP4z//8T9577720y1577bXMnDmTuXPnsn79ep5//nkAhg8fzvXXX09paSlvvvkme++9N888\n8wzPPfcc77zzDqWlpdxwww31tHdqR0lBROpdr161m15X+++/P0VFO+5J//jjjzNgwAAGDBjA/Pnz\n0yaFNm3acOaZZwIwcODAxK/1VEOHDt2lzBtvvMFFF10EQP/+/TnkkEPSLjt9+nQGDRpE//79efXV\nV/nggw9Yt24dn3/+Oeeccw4QLjZr27YtL7/8Mpdffjlt2rQBYK+99qr9jqgHSgoiUu/Gj4e2bXee\n1rZtmB6Hdu3aJV4vWLCA3//+97zyyivMmTOHwYMHpz1vv1WrVonXzZs3p6KiIu2699hjjxrLpFNe\nXs7o0aOZOnUqc+bM4fLLL28QV4MrKYhIvRsxAiZMgN69wSw8T5iw+53MtbFhwwY6dOjAnnvuyYoV\nK3jhhRfqfRvHHnssTzzxBABz585NWxPZtGkTzZo1o6CggC+//JIpU6YA0LlzZ7p27cozzzwDhIsC\ny8vLOe2005g0aRKbNm0CYO3atfUedyZ09pGIxGLEiOwkgVQDBgygX79+9O3bl969e3PsscfW+zbG\njBnD978rmfjYAAANkklEQVT/ffr165d4dOzYcacyXbp04ZJLLqFfv35069aNo446KjGvuLiYH/7w\nh4wbN45WrVoxZcoUzj77bEpLSykqKqJly5acc8453H777
fUee02ssje8oSgqKvKSkpJchyHS5Myf\nP5+DDz4412HkhYqKCioqKmjdujULFizg9NNPZ8GCBbRokR+/s9P9rcxslrsXVbFIQn58AhGRBmTj\nxo18+9vfpqKiAnfngQceyJuEUFeN41OIiGRRp06dmDVrVq7DiIU6mkVEJEFJQUREEpQUREQkQUlB\nREQSlBREpEE4+eSTd7kQ7e677+aqq66qdrn27dsDsHz5coYNG5a2zEknnURNp7rffffdlCeN8nfW\nWWfxxRdfZBJ6g6KkICINwvDhw5k8efJO0yZPnszw4cMzWn6//fbjqaee2u3tpyaFZ599lk6dOu32\n+vKVTkkVkVq77jpIM1J0nRxxBEQjVqc1bNgwbrrpJrZs2UKrVq1YvHgxy5cv5/jjj2fjxo0MGTKE\ndevWsXXrVu644w6GDBmy0/KLFy/m7LPP5v3332fTpk1cdtlllJaW0rdv38TQEgBXXXUVM2fOZNOm\nTQwbNoxf/OIX3HPPPSxfvpyTTz6ZgoICZsyYQWFhISUlJRQUFHDXXXclRlkdOXIk1113HYsXL+bM\nM8/kuOOO480336R79+785S9/SQx4V+mZZ57hjjvuYMuWLXTp0oXi4mL22WcfNm7cyJgxYygpKcHM\nuOWWWzj//PN5/vnnufHGG9m2bRsFBQVMnz69/v4IxFxTMLPBZvaRmS00s7Fp5l9qZqvNbHb0GBln\nPCLScO21114MGjSI5557Dgi1hAsuuAAzo3Xr1kydOpV3332XGTNmcMMNN1DdaA33338/bdu2Zf78\n+fziF7/Y6ZqD8ePHU1JSwpw5c3j11VeZM2cO11xzDfvttx8zZsxgxowZO61r1qxZPPTQQ7z99tu8\n9dZb/PGPf0wMpb1gwQKuvvpqPvjgAzp16pQY/yjZcccdx1tvvcV7773HRRddxG9/+1sAbr/9djp2\n7MjcuXOZM2cOp5xyCqtXr+YHP/gBU6ZMobS0lCeffLLO+zVVbDUFM2sO3AecBpQBM81smrunjhz1\nZ3cfHVccIlL/qvtFH6fKJqQhQ4YwefJkJk6cCIR7Htx444289tprNGvWjGXLlrFy5Ur23XfftOt5\n7bXXuOaaawA4/PDDOfzwwxPznnjiCSZMmEBFRQUrVqxg3rx5O81P9cYbb3DeeeclRmodOnQor7/+\nOueeey59+vThiCOOAKoenrusrIwLL7yQFStWsGXLFvr06QPAyy+/vFNzWefOnXnmmWc44YQTEmXi\nGF47zprCIGChuy9y9y3AZGBIDcvEor7vFSsiuTFkyBCmT5/Ou+++S3l5OQMHDgTCAHOrV69m1qxZ\nzJ49m3322We3hqn+5JNPuPPOO5k+fTpz5szhO9/5Tp2Gu64cdhuqHnp7zJgxjB49mrlz5/LAAw/k\nfHjtOJNCd+DTpPdl0bRU55vZHDN7ysx6pluRmY0ysxIzK1m9enWtgqi8V+ySJeC+416xSgwiDU/7\n9u05+eSTufzyy3fqYF6/fj177703LVu2ZMaMGSxZsqTa9Zxwwgk89thjALz//vvMmTMHCMNut2vX\njo4dO7Jy5cpEUxVAhw4d+PLLL3dZ1/HHH8/TTz9NeXk5X331FVOnTuX444/P+DOtX7+e7t3DofHh\nhx9OTD/ttNO47777Eu/XrVvH0UcfzWuvvcYnn3wCxDO8dq7PPnoGKHT3w4GXgIfTFXL3Ce5e5O5F\nXbt2rdUGsnmvWBGJ3/DhwyktLd0pKYwYMYKSkhIOO+wwHnnkEfr27VvtOq666io2btzIwQcfzM03\n35yocfTv358jjzySvn37cvHFF+807PaoUaMYPHgwJ5988k7rGjBgAJdeeimDBg3iqKOOYuTIkRx5\n5JEZf55bb72Vf//3f2fgwIEUFBQkpt90002sW7eOQw89lP79+zNjxgy6du3KhAkTGDp0KP379+fC\nCy/MeDuZim3obDM7B
rjV3c+I3v8cwN1/VUX55sBad++Ybn6l2g6d3axZqCHsuj3Yvj3j1Yg0eRo6\nu+Goy9DZcdYUZgIHmFkfM2sFXARMSy5gZt2S3p4LzK/vILJ9r1gRkYYstqTg7hXAaOAFwsH+CXf/\nwMxuM7Nzo2LXmNkHZlYKXANcWt9xZPtesSIiDVmsF6+5+7PAsynTbk56/XPg53HGUHk7wHHjYOnS\nUEMYPz43twkUaejcHTPLdRhSjbp2CTSJK5pzda9YkcakdevWrFmzhi5duigx5Cl3Z82aNbRu3Xq3\n19EkkoKI1F2PHj0oKyujtqeFS3a1bt2aHj167PbySgoikpGWLVsmrqSVxivX1ymIiEgeUVIQEZEE\nJQUREUmI7YrmuJjZaqD6gU1ypwD4PNdBVEPx1U2+xwf5H6Piq5u6xNfb3WscJ6jBJYV8ZmYlmVxG\nniuKr27yPT7I/xgVX91kIz41H4mISIKSgoiIJCgp1K8JuQ6gBoqvbvI9Psj/GBVf3cQen/oUREQk\nQTUFERFJUFIQEZEEJYVaMrOeZjbDzOZF94K4Nk2Zk8xsvZnNjh43p1tXjDEuNrO50bZ3uU2dBfeY\n2cLo/tgDshjbQUn7ZbaZbTCz61LKZH3/mdkkM1tlZu8nTdvLzF4yswXRc+cqlr0kKrPAzC7JUmy/\nM7MPo7/fVDPrVMWy1X4XYo7xVjNblvR3PKuKZQeb2UfR93FsFuP7c1Jsi81sdhXLxroPqzqm5Oz7\n5+561OIBdAMGRK87AB8D/VLKnAT8NYcxLgYKqpl/FvAcYMDRwNs5irM58Bnhopqc7j/gBGAA8H7S\ntN8CY6PXY4HfpFluL2BR9Nw5et05C7GdDrSIXv8mXWyZfBdijvFW4McZfAf+BXwDaAWUpv4/xRVf\nyvz/C9yci31Y1TElV98/1RRqyd1XuPu70esvCXeV657bqGptCPCIB28BnVJujZot3wb+5e45v0Ld\n3V8D1qZMHgI8HL1+GPi3NIueAbzk7mvdfR3wEjA47tjc/UUPdzcEeAvY/bGS60EV+y8Tg4CF7r7I\n3bcAkwn7vV5VF5+Fm0NcADxe39vNRDXHlJx8/5QU6sDMCoEjgbfTzD7GzErN7DkzOySrgYEDL5rZ\nLDMblWZ+d+DTpPdl5CaxXUTV/4i53H+V9nH3FdHrz4B90pTJh315OaHml05N34W4jY6auCZV0fyR\nD/vveGCluy+oYn7W9mHKMSUn3z8lhd1kZu2BKcB17r4hZfa7hCaR/sC9wNNZDu84dx8AnAlcbWYn\nZHn7NTKzVsC5wJNpZud6/+3CQ109787fNrNxQAVQXEWRXH4X7gf2B44AVhCaaPLRcKqvJWRlH1Z3\nTMnm909JYTeYWUvCH6/Y3f83db67b3D3jdHrZ4GWZlaQrfjcfVn0vAqYSqiiJ1sG9Ex63yOalk1n\nAu+6+8rUGbnef0lWVjarRc+r0pTJ2b40s0uBs4ER0UFjFxl8F2Lj7ivdfZu7bwf+WMW2c/pdNLMW\nwFDgz1WVycY+rOKYkpPvn5JCLUXtjxOB+e5+VxVl9o3KYWaDCPt5TZbia2dmHSpfEzok308pNg34\nfnQW0tHA+qRqarZU+essl/svxTSg8myOS4C/pCnzAnC6mXWOmkdOj6bFyswGAz8FznX38irKZPJd\niDPG5H6q86rY9kzgADPrE9UeLyLs92w5FfjQ3cvSzczGPqzmmJKb719cPeqN9QEcR6jGzQFmR4+z\ngCuBK6Myo4EPCGdSvAX8nyzG941ou6VRDOOi6cnxGXAf4ayPuUBRlvdhO8JBvmPStJzuP0KCWgFs\nJbTLXgF0AaYDC4CXgb2iskXAg0nLXg4sjB6XZSm2hYS25Mrv4H9HZfcDnq3uu5DF/fc/0fdrDuEA\n1y01xuj9WYQzbv4VV4zp4oum/6nye5dUNqv7sJpjSk6+fxrmQkREEtR8JCIiCUoKIiK
SoKQgIiIJ\nSgoiIpKgpCAiIglKCiIRM9tmO4/gWm8jdppZYfIInSL5qkWuAxDJI5vc/YhcByGSS6opiNQgGk//\nEZFEUFIQEZEEJQUREUnY/0ypeWoxAVOLAAAAAElFTkSuQmCC\n", 546 | "text/plain": [ 547 | "" 548 | ] 549 | }, 550 | "metadata": {}, 551 | "output_type": "display_data" 552 | } 553 | ], 554 | "source": [ 555 | "plt.clf() # clear figure\n", 556 | "\n", 557 | "acc = history.history['acc']\n", 558 | "val_acc = history.history['val_acc']\n", 559 | "\n", 560 | "plt.plot(epochs, acc, 'bo', label='Training acc')\n", 561 | "plt.plot(epochs, val_acc, 'b', label='Validation acc')\n", 562 | "plt.title('Training and validation accuracy')\n", 563 | "plt.xlabel('Epochs')\n", 564 | "plt.ylabel('Accuracy')\n", 565 | "plt.legend()\n", 566 | "\n", 567 | "plt.show()" 568 | ] 569 | }, 570 | { 571 | "cell_type": "markdown", 572 | "metadata": {}, 573 | "source": [ 574 | "It seems that the network starts overfitting after 8 epochs. 
Let's train a new network from scratch for 8 epochs, then let's evaluate it on \n", 575 | "the test set:" 576 | ] 577 | }, 578 | { 579 | "cell_type": "code", 580 | "execution_count": 27, 581 | "metadata": {}, 582 | "outputs": [ 583 | { 584 | "name": "stdout", 585 | "output_type": "stream", 586 | "text": [ 587 | "Train on 7982 samples, validate on 1000 samples\n", 588 | "Epoch 1/8\n", 589 | "7982/7982 [==============================] - 0s - loss: 2.6118 - acc: 0.4667 - val_loss: 1.7207 - val_acc: 0.6360\n", 590 | "Epoch 2/8\n", 591 | "7982/7982 [==============================] - 0s - loss: 1.3998 - acc: 0.7107 - val_loss: 1.2645 - val_acc: 0.7360\n", 592 | "Epoch 3/8\n", 593 | "7982/7982 [==============================] - 0s - loss: 1.0343 - acc: 0.7839 - val_loss: 1.0994 - val_acc: 0.7700\n", 594 | "Epoch 4/8\n", 595 | "7982/7982 [==============================] - 0s - loss: 0.8114 - acc: 0.8329 - val_loss: 1.0252 - val_acc: 0.7820\n", 596 | "Epoch 5/8\n", 597 | "7982/7982 [==============================] - 0s - loss: 0.6466 - acc: 0.8628 - val_loss: 0.9536 - val_acc: 0.8070\n", 598 | "Epoch 6/8\n", 599 | "7982/7982 [==============================] - 0s - loss: 0.5271 - acc: 0.8894 - val_loss: 0.9187 - val_acc: 0.8110\n", 600 | "Epoch 7/8\n", 601 | "7982/7982 [==============================] - 0s - loss: 0.4193 - acc: 0.9126 - val_loss: 0.9051 - val_acc: 0.8120\n", 602 | "Epoch 8/8\n", 603 | "7982/7982 [==============================] - 0s - loss: 0.3478 - acc: 0.9258 - val_loss: 0.8891 - val_acc: 0.8160\n", 604 | "1952/2246 [=========================>....] 
- ETA: 0s" 605 | ] 606 | } 607 | ], 608 | "source": [ 609 | "model = models.Sequential()\n", 610 | "model.add(layers.Dense(64, activation='relu', input_shape=(10000,)))\n", 611 | "model.add(layers.Dense(64, activation='relu'))\n", 612 | "model.add(layers.Dense(46, activation='softmax'))\n", 613 | "\n", 614 | "model.compile(optimizer='rmsprop',\n", 615 | " loss='categorical_crossentropy',\n", 616 | " metrics=['accuracy'])\n", 617 | "model.fit(partial_x_train,\n", 618 | " partial_y_train,\n", 619 | " epochs=8,\n", 620 | " batch_size=512,\n", 621 | " validation_data=(x_val, y_val))\n", 622 | "results = model.evaluate(x_test, one_hot_test_labels)" 623 | ] 624 | }, 625 | { 626 | "cell_type": "code", 627 | "execution_count": 28, 628 | "metadata": {}, 629 | "outputs": [ 630 | { 631 | "data": { 632 | "text/plain": [ 633 | "[0.98764628548762257, 0.77693677651807869]" 634 | ] 635 | }, 636 | "execution_count": 28, 637 | "metadata": {}, 638 | "output_type": "execute_result" 639 | } 640 | ], 641 | "source": [ 642 | "results" 643 | ] 644 | }, 645 | { 646 | "cell_type": "markdown", 647 | "metadata": {}, 648 | "source": [ 649 | "\n", 650 | "Our approach reaches an accuracy of ~78%. 
With a balanced binary classification problem, the accuracy reached by a purely random classifier \n", 651 | "would be 50%, but in our case it is closer to 19%, so our results seem pretty good, at least when compared to a random baseline:" 652 | ] 653 | }, 654 | { 655 | "cell_type": "code", 656 | "execution_count": 29, 657 | "metadata": {}, 658 | "outputs": [ 659 | { 660 | "data": { 661 | "text/plain": [ 662 | "0.18477292965271594" 663 | ] 664 | }, 665 | "execution_count": 29, 666 | "metadata": {}, 667 | "output_type": "execute_result" 668 | } 669 | ], 670 | "source": [ 671 | "import copy\n", 672 | "\n", 673 | "test_labels_copy = copy.copy(test_labels)\n", 674 | "np.random.shuffle(test_labels_copy)\n", 675 | "float(np.sum(np.array(test_labels) == np.array(test_labels_copy))) / len(test_labels)" 676 | ] 677 | }, 678 | { 679 | "cell_type": "markdown", 680 | "metadata": {}, 681 | "source": [ 682 | "## Generating predictions on new data\n", 683 | "\n", 684 | "We can verify that the `predict` method of our model instance returns a probability distribution over all 46 topics. 
Let's generate topic \n", 685 | "predictions for all of the test data:" 686 | ] 687 | }, 688 | { 689 | "cell_type": "code", 690 | "execution_count": 30, 691 | "metadata": { 692 | "collapsed": true 693 | }, 694 | "outputs": [], 695 | "source": [ 696 | "predictions = model.predict(x_test)" 697 | ] 698 | }, 699 | { 700 | "cell_type": "markdown", 701 | "metadata": {}, 702 | "source": [ 703 | "Each entry in `predictions` is a vector of length 46:" 704 | ] 705 | }, 706 | { 707 | "cell_type": "code", 708 | "execution_count": 31, 709 | "metadata": {}, 710 | "outputs": [ 711 | { 712 | "data": { 713 | "text/plain": [ 714 | "(46,)" 715 | ] 716 | }, 717 | "execution_count": 31, 718 | "metadata": {}, 719 | "output_type": "execute_result" 720 | } 721 | ], 722 | "source": [ 723 | "predictions[0].shape" 724 | ] 725 | }, 726 | { 727 | "cell_type": "markdown", 728 | "metadata": {}, 729 | "source": [ 730 | "The coefficients in this vector sum to 1:" 731 | ] 732 | }, 733 | { 734 | "cell_type": "code", 735 | "execution_count": 32, 736 | "metadata": {}, 737 | "outputs": [ 738 | { 739 | "data": { 740 | "text/plain": [ 741 | "0.99999994" 742 | ] 743 | }, 744 | "execution_count": 32, 745 | "metadata": {}, 746 | "output_type": "execute_result" 747 | } 748 | ], 749 | "source": [ 750 | "np.sum(predictions[0])" 751 | ] 752 | }, 753 | { 754 | "cell_type": "markdown", 755 | "metadata": {}, 756 | "source": [ 757 | "The largest entry is the predicted class, i.e. 
the class with the highest probability:" 758 | ] 759 | }, 760 | { 761 | "cell_type": "code", 762 | "execution_count": 33, 763 | "metadata": {}, 764 | "outputs": [ 765 | { 766 | "data": { 767 | "text/plain": [ 768 | "3" 769 | ] 770 | }, 771 | "execution_count": 33, 772 | "metadata": {}, 773 | "output_type": "execute_result" 774 | } 775 | ], 776 | "source": [ 777 | "np.argmax(predictions[0])" 778 | ] 779 | }, 780 | { 781 | "cell_type": "markdown", 782 | "metadata": {}, 783 | "source": [ 784 | "## A different way to handle the labels and the loss\n", 785 | "\n", 786 | "We mentioned earlier that another way to encode the labels would be to cast them as an integer tensor, like so:" 787 | ] 788 | }, 789 | { 790 | "cell_type": "code", 791 | "execution_count": 35, 792 | "metadata": { 793 | "collapsed": true 794 | }, 795 | "outputs": [], 796 | "source": [ 797 | "y_train = np.array(train_labels)\n", 798 | "y_test = np.array(test_labels)" 799 | ] 800 | }, 801 | { 802 | "cell_type": "markdown", 803 | "metadata": {}, 804 | "source": [ 805 | "\n", 806 | "The only thing this would change is the choice of the loss function. Our previous loss, `categorical_crossentropy`, expects the labels to \n", 807 | "follow a categorical encoding. With integer labels, we should use `sparse_categorical_crossentropy`:" 808 | ] 809 | }, 810 | { 811 | "cell_type": "code", 812 | "execution_count": 36, 813 | "metadata": { 814 | "collapsed": true 815 | }, 816 | "outputs": [], 817 | "source": [ 818 | "model.compile(optimizer='rmsprop', loss='sparse_categorical_crossentropy', metrics=['acc'])" 819 | ] 820 | }, 821 | { 822 | "cell_type": "markdown", 823 | "metadata": {}, 824 | "source": [ 825 | "This new loss function is still mathematically the same as `categorical_crossentropy`; it just has a different interface." 
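To see this equivalence concretely, here is a minimal NumPy sketch, independent of Keras. The two helper functions below (`categorical_ce`, `sparse_categorical_ce`) are illustrative names for this sketch, not library APIs: they compute the same cross-entropy once from a one-hot target and once from an integer target.

```python
import numpy as np

def categorical_ce(onehot, probs):
    # Cross-entropy against a one-hot target: -sum(t * log(p))
    return -np.sum(onehot * np.log(probs))

def sparse_categorical_ce(label, probs):
    # Same quantity, read off directly at the integer label's index
    return -np.log(probs[label])

probs = np.array([0.1, 0.7, 0.2])    # a toy predicted distribution over 3 classes
label = 1                            # integer-encoded target
onehot = np.array([0.0, 1.0, 0.0])   # one-hot encoding of the same target

# Both reduce to -log(0.7): only the target's format differs.
same = np.isclose(categorical_ce(onehot, probs),
                  sparse_categorical_ce(label, probs))
```

This is why switching losses requires no change to the model itself, only to how the labels are fed in.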
826 | ] 827 | }, 828 | { 829 | "cell_type": "markdown", 830 | "metadata": {}, 831 | "source": [ 832 | "## On the importance of having sufficiently large intermediate layers\n", 833 | "\n", 834 | "\n", 835 | "We mentioned earlier that since our final outputs were 46-dimensional, we should avoid intermediate layers with much less than 46 hidden \n", 836 | "units. Now let's try to see what happens when we introduce an information bottleneck by having intermediate layers significantly less than \n", 837 | "46-dimensional, e.g. 4-dimensional." 838 | ] 839 | }, 840 | { 841 | "cell_type": "code", 842 | "execution_count": 42, 843 | "metadata": {}, 844 | "outputs": [ 845 | { 846 | "name": "stdout", 847 | "output_type": "stream", 848 | "text": [ 849 | "Train on 7982 samples, validate on 1000 samples\n", 850 | "Epoch 1/20\n", 851 | "7982/7982 [==============================] - 0s - loss: 3.1620 - acc: 0.2295 - val_loss: 2.6750 - val_acc: 0.2740\n", 852 | "Epoch 2/20\n", 853 | "7982/7982 [==============================] - 0s - loss: 2.2009 - acc: 0.3829 - val_loss: 1.7626 - val_acc: 0.5990\n", 854 | "Epoch 3/20\n", 855 | "7982/7982 [==============================] - 0s - loss: 1.4490 - acc: 0.6486 - val_loss: 1.4738 - val_acc: 0.6390\n", 856 | "Epoch 4/20\n", 857 | "7982/7982 [==============================] - 0s - loss: 1.2258 - acc: 0.6776 - val_loss: 1.3961 - val_acc: 0.6570\n", 858 | "Epoch 5/20\n", 859 | "7982/7982 [==============================] - 0s - loss: 1.0886 - acc: 0.7032 - val_loss: 1.3727 - val_acc: 0.6700\n", 860 | "Epoch 6/20\n", 861 | "7982/7982 [==============================] - 0s - loss: 0.9817 - acc: 0.7494 - val_loss: 1.3682 - val_acc: 0.6800\n", 862 | "Epoch 7/20\n", 863 | "7982/7982 [==============================] - 0s - loss: 0.8937 - acc: 0.7757 - val_loss: 1.3587 - val_acc: 0.6810\n", 864 | "Epoch 8/20\n", 865 | "7982/7982 [==============================] - 0s - loss: 0.8213 - acc: 0.7942 - val_loss: 1.3548 - val_acc: 0.6960\n", 866 | "Epoch 
9/20\n", 867 | "7982/7982 [==============================] - 0s - loss: 0.7595 - acc: 0.8088 - val_loss: 1.3883 - val_acc: 0.7050\n", 868 | "Epoch 10/20\n", 869 | "7982/7982 [==============================] - 0s - loss: 0.7072 - acc: 0.8193 - val_loss: 1.4216 - val_acc: 0.7020\n", 870 | "Epoch 11/20\n", 871 | "7982/7982 [==============================] - 0s - loss: 0.6642 - acc: 0.8254 - val_loss: 1.4405 - val_acc: 0.7020\n", 872 | "Epoch 12/20\n", 873 | "7982/7982 [==============================] - 0s - loss: 0.6275 - acc: 0.8281 - val_loss: 1.4938 - val_acc: 0.7080\n", 874 | "Epoch 13/20\n", 875 | "7982/7982 [==============================] - 0s - loss: 0.5915 - acc: 0.8353 - val_loss: 1.5301 - val_acc: 0.7110\n", 876 | "Epoch 14/20\n", 877 | "7982/7982 [==============================] - 0s - loss: 0.5637 - acc: 0.8419 - val_loss: 1.5400 - val_acc: 0.7080\n", 878 | "Epoch 15/20\n", 879 | "7982/7982 [==============================] - 0s - loss: 0.5389 - acc: 0.8523 - val_loss: 1.5826 - val_acc: 0.7090\n", 880 | "Epoch 16/20\n", 881 | "7982/7982 [==============================] - 0s - loss: 0.5162 - acc: 0.8588 - val_loss: 1.6391 - val_acc: 0.7080\n", 882 | "Epoch 17/20\n", 883 | "7982/7982 [==============================] - 0s - loss: 0.4950 - acc: 0.8623 - val_loss: 1.6469 - val_acc: 0.7060\n", 884 | "Epoch 18/20\n", 885 | "7982/7982 [==============================] - 0s - loss: 0.4771 - acc: 0.8670 - val_loss: 1.7258 - val_acc: 0.6950\n", 886 | "Epoch 19/20\n", 887 | "7982/7982 [==============================] - 0s - loss: 0.4562 - acc: 0.8718 - val_loss: 1.7667 - val_acc: 0.6930\n", 888 | "Epoch 20/20\n", 889 | "7982/7982 [==============================] - 0s - loss: 0.4428 - acc: 0.8742 - val_loss: 1.7785 - val_acc: 0.7060\n" 890 | ] 891 | }, 892 | { 893 | "data": { 894 | "text/plain": [ 895 | "" 896 | ] 897 | }, 898 | "execution_count": 42, 899 | "metadata": {}, 900 | "output_type": "execute_result" 901 | } 902 | ], 903 | "source": [ 904 | "model = 
models.Sequential()\n", 905 | "model.add(layers.Dense(64, activation='relu', input_shape=(10000,)))\n", 906 | "model.add(layers.Dense(4, activation='relu'))\n", 907 | "model.add(layers.Dense(46, activation='softmax'))\n", 908 | "\n", 909 | "model.compile(optimizer='rmsprop',\n", 910 | " loss='categorical_crossentropy',\n", 911 | " metrics=['accuracy'])\n", 912 | "model.fit(partial_x_train,\n", 913 | " partial_y_train,\n", 914 | " epochs=20,\n", 915 | " batch_size=128,\n", 916 | " validation_data=(x_val, y_val))" 917 | ] 918 | }, 919 | { 920 | "cell_type": "markdown", 921 | "metadata": {}, 922 | "source": [ 923 | "\n", 924 | "Our network now seems to peak at ~71% validation accuracy, an 8% absolute drop. This drop is mostly due to the fact that we are now trying to \n", 925 | "compress a lot of information (enough information to recover the separation hyperplanes of 46 classes) into an intermediate space that is \n", 926 | "too low-dimensional. The network is able to cram _most_ of the necessary information into these 4-dimensional representations, but not all \n", 927 | "of it." 928 | ] 929 | }, 930 | { 931 | "cell_type": "markdown", 932 | "metadata": {}, 933 | "source": [ 934 | "## Further experiments\n", 935 | "\n", 936 | "* Try using larger or smaller layers: 32 units, 128 units...\n", 937 | "* We were using two hidden layers. Now try to use a single hidden layer, or three hidden layers." 
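To run the experiments suggested above systematically, a small helper can enumerate the configurations to try. This is purely illustrative scaffolding (`experiment_grid` is not part of Keras); each returned dict would parameterize one model build-and-train run.

```python
from itertools import product

def experiment_grid(unit_choices=(32, 64, 128), depth_choices=(1, 2, 3)):
    # Cartesian product of layer widths and hidden-layer counts:
    # one dict per model configuration to train and compare.
    return [{'units': u, 'hidden_layers': d}
            for u, d in product(unit_choices, depth_choices)]

configs = experiment_grid()
# 3 widths x 3 depths = 9 configurations; loop over `configs`,
# build a model from each, and compare validation accuracies.
```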
938 | ] 939 | }, 940 | { 941 | "cell_type": "markdown", 942 | "metadata": {}, 943 | "source": [ 944 | "## Wrapping up\n", 945 | "\n", 946 | "\n", 947 | "Here's what you should take away from this example:\n", 948 | "\n", 949 | "* If you are trying to classify data points among N classes, your network should end with a `Dense` layer of size N.\n", 950 | "* In a single-label, multi-class classification problem, your network should end with a `softmax` activation, so that it will output a \n", 951 | "probability distribution over the N output classes.\n", 952 | "* _Categorical crossentropy_ is almost always the loss function you should use for such problems. It minimizes the distance between the \n", 953 | "probability distributions output by the network, and the true distribution of the targets.\n", 954 | "* There are two ways to handle labels in multi-class classification:\n", 955 | "    * Encoding the labels via \"categorical encoding\" (also known as \"one-hot encoding\") and using `categorical_crossentropy` as your loss \n", 956 | "function.\n", 957 | "    * Encoding the labels as integers and using the `sparse_categorical_crossentropy` loss function.\n", 958 | "* If you need to classify data into a large number of categories, then you should avoid creating information bottlenecks in your network by having \n", 959 | "intermediate layers that are too small." 
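The `softmax` takeaway above can be checked in a few lines of NumPy, a minimal sketch independent of Keras (the `softmax` function here is hand-rolled for illustration):

```python
import numpy as np

def softmax(z):
    # Shift by the max for numerical stability, then normalize the exponentials
    e = np.exp(z - np.max(z))
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1, -1.2])  # toy raw outputs of a final Dense layer
probs = softmax(scores)
# `probs` is a valid probability distribution: strictly positive and summing
# to 1, and np.argmax(probs) picks the predicted class, as in the notebook.
```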
960 | ] 961 | } 962 | ], 963 | "metadata": { 964 | "kernelspec": { 965 | "display_name": "Python 3", 966 | "language": "python", 967 | "name": "python3" 968 | }, 969 | "language_info": { 970 | "codemirror_mode": { 971 | "name": "ipython", 972 | "version": 3 973 | }, 974 | "file_extension": ".py", 975 | "mimetype": "text/x-python", 976 | "name": "python", 977 | "nbconvert_exporter": "python", 978 | "pygments_lexer": "ipython3", 979 | "version": "3.5.2" 980 | } 981 | }, 982 | "nbformat": 4, 983 | "nbformat_minor": 2 984 | } 985 | -------------------------------------------------------------------------------- /3.7-predicting-house-prices.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [ 8 | { 9 | "name": "stderr", 10 | "output_type": "stream", 11 | "text": [ 12 | "Using TensorFlow backend.\n" 13 | ] 14 | }, 15 | { 16 | "data": { 17 | "text/plain": [ 18 | "'2.0.8'" 19 | ] 20 | }, 21 | "execution_count": 1, 22 | "metadata": {}, 23 | "output_type": "execute_result" 24 | } 25 | ], 26 | "source": [ 27 | "import keras\n", 28 | "keras.__version__" 29 | ] 30 | }, 31 | { 32 | "cell_type": "markdown", 33 | "metadata": {}, 34 | "source": [ 35 | "# Predicting house prices: a regression example\n", 36 | "\n", 37 | "This notebook contains the code samples found in Chapter 3, Section 6 of [Deep Learning with Python](https://www.manning.com/books/deep-learning-with-python?a_aid=keras&a_bid=76564dff). Note that the original text features far more content, in particular further explanations and figures: in this notebook, you will only find source code and related comments.\n", 38 | "\n", 39 | "----\n", 40 | "\n", 41 | "\n", 42 | "In our two previous examples, we were considering classification problems, where the goal was to predict a single discrete label of an \n", 43 | "input data point. 
Another common type of machine learning problem is \"regression\", which consists of predicting a continuous value instead \n", 44 | "of a discrete label. For instance, predicting the temperature tomorrow, given meteorological data, or predicting the time that a \n", 45 | "software project will take to complete, given its specifications.\n", 46 | "\n", 47 | "Do not mix up \"regression\" with the algorithm \"logistic regression\": confusingly, \"logistic regression\" is not a regression algorithm, \n", 48 | "it is a classification algorithm." 49 | ] 50 | }, 51 | { 52 | "cell_type": "markdown", 53 | "metadata": {}, 54 | "source": [ 55 | "## The Boston Housing Price dataset\n", 56 | "\n", 57 | "\n", 58 | "We will be attempting to predict the median price of homes in a given Boston suburb in the mid-1970s, given a few data points about the \n", 59 | "suburb at the time, such as the crime rate, the local property tax rate, etc.\n", 60 | "\n", 61 | "The dataset we will be using has another interesting difference from our two previous examples: it has very few data points, only 506 in \n", 62 | "total, split between 404 training samples and 102 test samples, and each \"feature\" in the input data (e.g. the crime rate is a feature) has \n", 63 | "a different scale. 
For instance, some values are proportions, which take values between 0 and 1, others take values between 1 and 12, \n", 64 | "others between 0 and 100...\n", 65 | "\n", 66 | "Let's take a look at the data:" 67 | ] 68 | }, 69 | { 70 | "cell_type": "code", 71 | "execution_count": 2, 72 | "metadata": { 73 | "collapsed": true 74 | }, 75 | "outputs": [], 76 | "source": [ 77 | "from keras.datasets import boston_housing\n", 78 | "\n", 79 | "(train_data, train_targets), (test_data, test_targets) = boston_housing.load_data()" 80 | ] 81 | }, 82 | { 83 | "cell_type": "code", 84 | "execution_count": 3, 85 | "metadata": {}, 86 | "outputs": [ 87 | { 88 | "data": { 89 | "text/plain": [ 90 | "(404, 13)" 91 | ] 92 | }, 93 | "execution_count": 3, 94 | "metadata": {}, 95 | "output_type": "execute_result" 96 | } 97 | ], 98 | "source": [ 99 | "train_data.shape" 100 | ] 101 | }, 102 | { 103 | "cell_type": "code", 104 | "execution_count": 4, 105 | "metadata": {}, 106 | "outputs": [ 107 | { 108 | "data": { 109 | "text/plain": [ 110 | "(102, 13)" 111 | ] 112 | }, 113 | "execution_count": 4, 114 | "metadata": {}, 115 | "output_type": "execute_result" 116 | } 117 | ], 118 | "source": [ 119 | "test_data.shape" 120 | ] 121 | }, 122 | { 123 | "cell_type": "markdown", 124 | "metadata": {}, 125 | "source": [ 126 | "\n", 127 | "As you can see, we have 404 training samples and 102 test samples. The data comprises 13 features. The 13 features in the input data are as \n", 128 | "follows:\n", 129 | "\n", 130 | "1. Per capita crime rate.\n", 131 | "2. Proportion of residential land zoned for lots over 25,000 square feet.\n", 132 | "3. Proportion of non-retail business acres per town.\n", 133 | "4. Charles River dummy variable (= 1 if tract bounds river; 0 otherwise).\n", 134 | "5. Nitric oxides concentration (parts per 10 million).\n", 135 | "6. Average number of rooms per dwelling.\n", 136 | "7. Proportion of owner-occupied units built prior to 1940.\n", 137 | "8. 
Weighted distances to five Boston employment centres.\n", 138 | "9. Index of accessibility to radial highways.\n", 139 | "10. Full-value property-tax rate per $10,000.\n", 140 | "11. Pupil-teacher ratio by town.\n", 141 | "12. 1000 * (Bk - 0.63) ** 2 where Bk is the proportion of Black people by town.\n", 142 | "13. % lower status of the population.\n", 143 | "\n", 144 | "The targets are the median values of owner-occupied homes, in thousands of dollars:" 145 | ] 146 | }, 147 | { 148 | "cell_type": "code", 149 | "execution_count": 5, 150 | "metadata": {}, 151 | "outputs": [ 152 | { 153 | "data": { 154 | "text/plain": [ 155 | "array([ 15.2, 42.3, 50. , 21.1, 17.7, 18.5, 11.3, 15.6, 15.6,\n", 156 | " 14.4, 12.1, 17.9, 23.1, 19.9, 15.7, 8.8, 50. , 22.5,\n", 157 | " 24.1, 27.5, 10.9, 30.8, 32.9, 24. , 18.5, 13.3, 22.9,\n", 158 | " 34.7, 16.6, 17.5, 22.3, 16.1, 14.9, 23.1, 34.9, 25. ,\n", 159 | " 13.9, 13.1, 20.4, 20. , 15.2, 24.7, 22.2, 16.7, 12.7,\n", 160 | " 15.6, 18.4, 21. , 30.1, 15.1, 18.7, 9.6, 31.5, 24.8,\n", 161 | " 19.1, 22. , 14.5, 11. , 32. , 29.4, 20.3, 24.4, 14.6,\n", 162 | " 19.5, 14.1, 14.3, 15.6, 10.5, 6.3, 19.3, 19.3, 13.4,\n", 163 | " 36.4, 17.8, 13.5, 16.5, 8.3, 14.3, 16. , 13.4, 28.6,\n", 164 | " 43.5, 20.2, 22. , 23. , 20.7, 12.5, 48.5, 14.6, 13.4,\n", 165 | " 23.7, 50. , 21.7, 39.8, 38.7, 22.2, 34.9, 22.5, 31.1,\n", 166 | " 28.7, 46. , 41.7, 21. , 26.6, 15. , 24.4, 13.3, 21.2,\n", 167 | " 11.7, 21.7, 19.4, 50. , 22.8, 19.7, 24.7, 36.2, 14.2,\n", 168 | " 18.9, 18.3, 20.6, 24.6, 18.2, 8.7, 44. , 10.4, 13.2,\n", 169 | " 21.2, 37. , 30.7, 22.9, 20. , 19.3, 31.7, 32. , 23.1,\n", 170 | " 18.8, 10.9, 50. , 19.6, 5. , 14.4, 19.8, 13.8, 19.6,\n", 171 | " 23.9, 24.5, 25. , 19.9, 17.2, 24.6, 13.5, 26.6, 21.4,\n", 172 | " 11.9, 22.6, 19.6, 8.5, 23.7, 23.1, 22.4, 20.5, 23.6,\n", 173 | " 18.4, 35.2, 23.1, 27.9, 20.6, 23.7, 28. 
, 13.6, 27.1,\n", 174 | " 23.6, 20.6, 18.2, 21.7, 17.1, 8.4, 25.3, 13.8, 22.2,\n", 175 | " 18.4, 20.7, 31.6, 30.5, 20.3, 8.8, 19.2, 19.4, 23.1,\n", 176 | " 23. , 14.8, 48.8, 22.6, 33.4, 21.1, 13.6, 32.2, 13.1,\n", 177 | " 23.4, 18.9, 23.9, 11.8, 23.3, 22.8, 19.6, 16.7, 13.4,\n", 178 | " 22.2, 20.4, 21.8, 26.4, 14.9, 24.1, 23.8, 12.3, 29.1,\n", 179 | " 21. , 19.5, 23.3, 23.8, 17.8, 11.5, 21.7, 19.9, 25. ,\n", 180 | " 33.4, 28.5, 21.4, 24.3, 27.5, 33.1, 16.2, 23.3, 48.3,\n", 181 | " 22.9, 22.8, 13.1, 12.7, 22.6, 15. , 15.3, 10.5, 24. ,\n", 182 | " 18.5, 21.7, 19.5, 33.2, 23.2, 5. , 19.1, 12.7, 22.3,\n", 183 | " 10.2, 13.9, 16.3, 17. , 20.1, 29.9, 17.2, 37.3, 45.4,\n", 184 | " 17.8, 23.2, 29. , 22. , 18. , 17.4, 34.6, 20.1, 25. ,\n", 185 | " 15.6, 24.8, 28.2, 21.2, 21.4, 23.8, 31. , 26.2, 17.4,\n", 186 | " 37.9, 17.5, 20. , 8.3, 23.9, 8.4, 13.8, 7.2, 11.7,\n", 187 | " 17.1, 21.6, 50. , 16.1, 20.4, 20.6, 21.4, 20.6, 36.5,\n", 188 | " 8.5, 24.8, 10.8, 21.9, 17.3, 18.9, 36.2, 14.9, 18.2,\n", 189 | " 33.3, 21.8, 19.7, 31.6, 24.8, 19.4, 22.8, 7.5, 44.8,\n", 190 | " 16.8, 18.7, 50. , 50. , 19.5, 20.1, 50. , 17.2, 20.8,\n", 191 | " 19.3, 41.3, 20.4, 20.5, 13.8, 16.5, 23.9, 20.6, 31.5,\n", 192 | " 23.3, 16.8, 14. , 33.8, 36.1, 12.8, 18.3, 18.7, 19.1,\n", 193 | " 29. , 30.1, 50. , 50. , 22. , 11.9, 37.6, 50. , 22.7,\n", 194 | " 20.8, 23.5, 27.9, 50. , 19.3, 23.9, 22.6, 15.2, 21.7,\n", 195 | " 19.2, 43.8, 20.3, 33.2, 19.9, 22.5, 32.7, 22. , 17.1,\n", 196 | " 19. , 15. , 16.1, 25.1, 23.7, 28.7, 37.2, 22.6, 16.4,\n", 197 | " 25. , 29.8, 22.1, 17.4, 18.1, 30.3, 17.5, 24.7, 12.6,\n", 198 | " 26.5, 28.7, 13.3, 10.4, 24.4, 23. , 20. , 17.8, 7. 
,\n", 199 | " 11.8, 24.4, 13.8, 19.4, 25.2, 19.4, 19.4, 29.1])" 200 | ] 201 | }, 202 | "execution_count": 5, 203 | "metadata": {}, 204 | "output_type": "execute_result" 205 | } 206 | ], 207 | "source": [ 208 | "train_targets" 209 | ] 210 | }, 211 | { 212 | "cell_type": "markdown", 213 | "metadata": {}, 214 | "source": [ 215 | "\n", 216 | "The prices are typically between \\$10,000 and \\$50,000. If that sounds cheap, remember this was the mid-1970s, and these prices are not \n", 217 | "inflation-adjusted." 218 | ] 219 | }, 220 | { 221 | "cell_type": "markdown", 222 | "metadata": {}, 223 | "source": [ 224 | "## Preparing the data\n", 225 | "\n", 226 | "\n", 227 | "It would be problematic to feed into a neural network values that all take wildly different ranges. The network might be able to \n", 228 | "automatically adapt to such heterogeneous data, but it would definitely make learning more difficult. A widespread best practice to deal \n", 229 | "with such data is to do feature-wise normalization: for each feature in the input data (a column in the input data matrix), we \n", 230 | "will subtract the mean of the feature and divide by the standard deviation, so that the feature will be centered around 0 and will have a \n", 231 | "unit standard deviation. This is easily done in Numpy:" 232 | ] 233 | }, 234 | { 235 | "cell_type": "code", 236 | "execution_count": 6, 237 | "metadata": { 238 | "collapsed": true 239 | }, 240 | "outputs": [], 241 | "source": [ 242 | "mean = train_data.mean(axis=0)\n", 243 | "train_data -= mean\n", 244 | "std = train_data.std(axis=0)\n", 245 | "train_data /= std\n", 246 | "\n", 247 | "test_data -= mean\n", 248 | "test_data /= std" 249 | ] 250 | }, 251 | { 252 | "cell_type": "markdown", 253 | "metadata": {}, 254 | "source": [ 255 | "\n", 256 | "Note that the quantities that we use for normalizing the test data have been computed using the training data. 
We should never use any quantity computed on the test data in our \n", 257 | "workflow, even for something as simple as data normalization." 258 | ] 259 | }, 260 | { 261 | "cell_type": "markdown", 262 | "metadata": {}, 263 | "source": [ 264 | "## Building our network\n", 265 | "\n", 266 | "\n", 267 | "Because so few samples are available, we will be using a very small network with two \n", 268 | "hidden layers, each with 64 units. In general, the less training data you have, the worse overfitting will be, and using \n", 269 | "a small network is one way to mitigate overfitting." 270 | ] 271 | }, 272 | { 273 | "cell_type": "code", 274 | "execution_count": 7, 275 | "metadata": { 276 | "collapsed": true 277 | }, 278 | "outputs": [], 279 | "source": [ 280 | "from keras import models\n", 281 | "from keras import layers\n", 282 | "\n", 283 | "def build_model():\n", 284 | " # Because we will need to instantiate\n", 285 | " # the same model multiple times,\n", 286 | " # we use a function to construct it.\n", 287 | " model = models.Sequential()\n", 288 | " model.add(layers.Dense(64, activation='relu',\n", 289 | " input_shape=(train_data.shape[1],)))\n", 290 | " model.add(layers.Dense(64, activation='relu'))\n", 291 | " model.add(layers.Dense(1))\n", 292 | " model.compile(optimizer='rmsprop', loss='mse', metrics=['mae'])\n", 293 | " return model" 294 | ] 295 | }, 296 | { 297 | "cell_type": "markdown", 298 | "metadata": {}, 299 | "source": [ 300 | "\n", 301 | "Our network ends with a single unit, and no activation (i.e. it will be a linear layer). \n", 302 | "This is a typical setup for scalar regression (i.e. regression where we are trying to predict a single continuous value). \n", 303 | "Applying an activation function would constrain the range that the output can take; for instance, if \n", 304 | "we applied a `sigmoid` activation function to our last layer, the network could only learn to predict values between 0 and 1. 
Here, because \n", 305 | "the last layer is purely linear, the network is free to learn to predict values in any range.\n", 306 | "\n", 307 | "Note that we are compiling the network with the `mse` loss function -- Mean Squared Error, the mean of the squared differences between the \n", 308 | "predictions and the targets, a widely used loss function for regression problems.\n", 309 | "\n", 310 | "We are also monitoring a new metric during training: `mae`. This stands for Mean Absolute Error. It is simply the average absolute value of the \n", 311 | "differences between the predictions and the targets. For instance, an MAE of 0.5 on this problem would mean that our predictions are off by \n", 312 | "\\$500 on average." 313 | ] 314 | }, 315 | { 316 | "cell_type": "markdown", 317 | "metadata": {}, 318 | "source": [ 319 | "## Validating our approach using K-fold validation\n", 320 | "\n", 321 | "\n", 322 | "To evaluate our network while we keep adjusting its parameters (such as the number of epochs used for training), we could simply split the \n", 323 | "data into a training set and a validation set, as we were doing in our previous examples. However, because we have so few data points, the \n", 324 | "validation set would end up being very small (e.g. about 100 examples). A consequence is that our validation scores may change a lot \n", 325 | "depending on _which_ data points we choose to use for validation and which we choose for training, i.e. the validation scores may have a \n", 326 | "high _variance_ with regard to the validation split. This would prevent us from reliably evaluating our model.\n", 327 | "\n", 328 | "The best practice in such situations is to use K-fold cross-validation. It consists of splitting the available data into K partitions \n", 329 | "(typically K=4 or 5), then instantiating K identical models, and training each one on K-1 partitions while evaluating on the remaining \n", 330 | "partition. 
The validation score for the model used would then be the average of the K validation scores obtained." 331 | ] 332 | }, 333 | { 334 | "cell_type": "markdown", 335 | "metadata": {}, 336 | "source": [ 337 | "In terms of code, this is straightforward:" 338 | ] 339 | }, 340 | { 341 | "cell_type": "code", 342 | "execution_count": 8, 343 | "metadata": {}, 344 | "outputs": [ 345 | { 346 | "name": "stdout", 347 | "output_type": "stream", 348 | "text": [ 349 | "processing fold # 0\n", 350 | "processing fold # 1\n", 351 | "processing fold # 2\n", 352 | "processing fold # 3\n" 353 | ] 354 | } 355 | ], 356 | "source": [ 357 | "import numpy as np\n", 358 | "\n", 359 | "k = 4\n", 360 | "num_val_samples = len(train_data) // k\n", 361 | "num_epochs = 100\n", 362 | "all_scores = []\n", 363 | "for i in range(k):\n", 364 | " print('processing fold #', i)\n", 365 | " # Prepare the validation data: data from partition # k\n", 366 | " val_data = train_data[i * num_val_samples: (i + 1) * num_val_samples]\n", 367 | " val_targets = train_targets[i * num_val_samples: (i + 1) * num_val_samples]\n", 368 | "\n", 369 | " # Prepare the training data: data from all other partitions\n", 370 | " partial_train_data = np.concatenate(\n", 371 | " [train_data[:i * num_val_samples],\n", 372 | " train_data[(i + 1) * num_val_samples:]],\n", 373 | " axis=0)\n", 374 | " partial_train_targets = np.concatenate(\n", 375 | " [train_targets[:i * num_val_samples],\n", 376 | " train_targets[(i + 1) * num_val_samples:]],\n", 377 | " axis=0)\n", 378 | "\n", 379 | " # Build the Keras model (already compiled)\n", 380 | " model = build_model()\n", 381 | " # Train the model (in silent mode, verbose=0)\n", 382 | " model.fit(partial_train_data, partial_train_targets,\n", 383 | " epochs=num_epochs, batch_size=1, verbose=0)\n", 384 | " # Evaluate the model on the validation data\n", 385 | " val_mse, val_mae = model.evaluate(val_data, val_targets, verbose=0)\n", 386 | " all_scores.append(val_mae)" 387 | ] 388 | }, 389 | { 
390 | "cell_type": "code", 391 | "execution_count": 9, 392 | "metadata": {}, 393 | "outputs": [ 394 | { 395 | "data": { 396 | "text/plain": [ 397 | "[2.0750808349930412, 2.117215852926273, 2.9140411863232605, 2.4288365227161068]" 398 | ] 399 | }, 400 | "execution_count": 9, 401 | "metadata": {}, 402 | "output_type": "execute_result" 403 | } 404 | ], 405 | "source": [ 406 | "all_scores" 407 | ] 408 | }, 409 | { 410 | "cell_type": "code", 411 | "execution_count": 10, 412 | "metadata": {}, 413 | "outputs": [ 414 | { 415 | "data": { 416 | "text/plain": [ 417 | "2.3837935992396706" 418 | ] 419 | }, 420 | "execution_count": 10, 421 | "metadata": {}, 422 | "output_type": "execute_result" 423 | } 424 | ], 425 | "source": [ 426 | "np.mean(all_scores)" 427 | ] 428 | }, 429 | { 430 | "cell_type": "markdown", 431 | "metadata": {}, 432 | "source": [ 433 | "\n", 434 | "As you can see, the different runs do indeed show rather different validation scores, from 2.1 to 2.9. Their average (2.4) is a much more \n", 435 | "reliable metric than any single one of these scores -- that's the entire point of K-fold cross-validation. In this case, we are off by \\$2,400 on \n", 436 | "average, which is still significant considering that the prices range from \\$10,000 to \\$50,000. \n", 437 | "\n", 438 | "Let's try training the network for a bit longer: 500 epochs. 
To keep a record of how well the model did at each epoch, we will modify our training loop \n", 439 | "to save the per-epoch validation score log:" 440 | ] 441 | }, 442 | { 443 | "cell_type": "code", 444 | "execution_count": 21, 445 | "metadata": { 446 | "collapsed": true 447 | }, 448 | "outputs": [], 449 | "source": [ 450 | "from keras import backend as K\n", 451 | "\n", 452 | "# Some memory clean-up\n", 453 | "K.clear_session()" 454 | ] 455 | }, 456 | { 457 | "cell_type": "code", 458 | "execution_count": 22, 459 | "metadata": {}, 460 | "outputs": [ 461 | { 462 | "name": "stdout", 463 | "output_type": "stream", 464 | "text": [ 465 | "processing fold # 0\n", 466 | "processing fold # 1\n", 467 | "processing fold # 2\n", 468 | "processing fold # 3\n" 469 | ] 470 | } 471 | ], 472 | "source": [ 473 | "num_epochs = 500\n", 474 | "all_mae_histories = []\n", 475 | "for i in range(k):\n", 476 | " print('processing fold #', i)\n", 477 | " # Prepare the validation data: data from partition # k\n", 478 | " val_data = train_data[i * num_val_samples: (i + 1) * num_val_samples]\n", 479 | " val_targets = train_targets[i * num_val_samples: (i + 1) * num_val_samples]\n", 480 | "\n", 481 | " # Prepare the training data: data from all other partitions\n", 482 | " partial_train_data = np.concatenate(\n", 483 | " [train_data[:i * num_val_samples],\n", 484 | " train_data[(i + 1) * num_val_samples:]],\n", 485 | " axis=0)\n", 486 | " partial_train_targets = np.concatenate(\n", 487 | " [train_targets[:i * num_val_samples],\n", 488 | " train_targets[(i + 1) * num_val_samples:]],\n", 489 | " axis=0)\n", 490 | "\n", 491 | " # Build the Keras model (already compiled)\n", 492 | " model = build_model()\n", 493 | " # Train the model (in silent mode, verbose=0)\n", 494 | " history = model.fit(partial_train_data, partial_train_targets,\n", 495 | " validation_data=(val_data, val_targets),\n", 496 | " epochs=num_epochs, batch_size=1, verbose=0)\n", 497 | " mae_history = 
history.history['val_mean_absolute_error']\n", 498 |     "    all_mae_histories.append(mae_history)" 499 |    ] 500 |   }, 501 |   { 502 |    "cell_type": "markdown", 503 |    "metadata": {}, 504 |    "source": [ 505 |     "We can then compute the average of the per-epoch MAE scores for all folds:" 506 |    ] 507 |   }, 508 |   { 509 |    "cell_type": "code", 510 |    "execution_count": 23, 511 |    "metadata": { 512 |     "collapsed": true 513 |    }, 514 |    "outputs": [], 515 |    "source": [ 516 |     "average_mae_history = [\n", 517 |     "    np.mean([x[i] for x in all_mae_histories]) for i in range(num_epochs)]" 518 |    ] 519 |   }, 520 |   { 521 |    "cell_type": "markdown", 522 |    "metadata": {}, 523 |    "source": [ 524 |     "Let's plot this:" 525 |    ] 526 |   }, 527 |   { 528 |    "cell_type": "code", 529 |    "execution_count": 26, 530 |    "metadata": {}, 531 |    "outputs": [ 532 |     { 533 |      "data": { 534 |       "image/png": "<base64 PNG omitted: plot of validation MAE vs. epochs>", 535 |       "text/plain": [ 536 |        "" 537 |       ] 538 |      }, 539 |      "metadata": {}, 540 |      "output_type": "display_data" 541 |     } 542 |    ], 543 |    "source": [ 544 |     "import matplotlib.pyplot as plt\n", 545 |     "\n", 546 |     "plt.plot(range(1, len(average_mae_history) + 1), average_mae_history)\n", 547 |     "plt.xlabel('Epochs')\n", 548 |     "plt.ylabel('Validation MAE')\n", 549 |     "plt.show()" 550 |    ] 551 |   }, 552 |   { 553 |    "cell_type": "markdown", 554 |    "metadata": {}, 555 |    "source": [ 556 |     "\n", 557 |     "It may be a bit hard to see the plot due to scaling issues and relatively high variance. Let's:\n", 558 |     "\n", 559 |     "* Omit the first 10 data points, which are on a different scale from the rest of the curve.\n", 560 |     "* Replace each point with an exponential moving average of the previous points, to obtain a smooth curve." 561 |    ] 562 |   }, 563 |   { 564 |    "cell_type": "code", 565 |    "execution_count": 38, 566 |    "metadata": {}, 567 |    "outputs": [ 568 |     { 569 |      "data": { 570 |       "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYwAAAEKCAYAAAAB0GKPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3XecXHW9+P/Xe3vvm92UTTYJ6T0sSSCIISCEJiqgoCAi\nmouiF+5FpXi/FxX1pyIoCoooiF6ahSo9YOiQSnrvyZZs7333/fvjnJmd7bPZnZ3s7vv5eMxjz3zO\n58x8zmYz7/l0UVWMMcaY3oQEuwDGGGOGBgsYxhhj/GIBwxhjjF8sYBhjjPGLBQxjjDF+sYBhjDHG\nLxYwjDHG+MUChjHGGL9YwDDGGOOXsGAXYCClpaVpdnZ2sIthjDFDxoYNG4pVNd2fvMMqYGRnZ7N+\n/fpgF8MYY4YMETnsb15rkjLGGOMXCxjGGGP8YgHDGGOMXyxgGGOM8YsFDGOMMX4JWMAQkSwRWS0i\nO0Rku4jc1EWeRBH5l4hsdvNc53PuWhHZ6z6uDVQ5jTHG+CeQw2qbgVtUdaOIxAMbRGSVqu7wyXMj\nsENVLxGRdGC3iDwOxAF3AjmAute+oKplASyvMcaYHgSshqGq+aq60T2uAnYCYztmA+JFRHCCRClO\noDkfWKWqpW6QWAWsCFRZjRnuCirqeX5TbrCLYYa4QZm4JyLZwAJgTYdT9wMvAHlAPPAFVW0VkbHA\nUZ98x+gcbIwxfrr16S28vaeIT05NJykmItjFMUNUwDu9RSQOeBq4WVUrO5w+H9gEjAHmA/eLSEIf\nX3+liKwXkfVFRUUDUmZjhpMP95fw9h7n/8Ydz26lvqklyCUyQ1VAA4aIhOMEi8dV9ZkuslwHPKOO\nfcBBYDqQC2T55BvnpnWiqg+pao6q5qSn+7UcijEjyp/ePUBMRCgAL28t4P8+9HslCGPaCeQoKQEe\nBnaq6r3dZDsCnOPmzwCmAQeA14DzRCRZRJKB89w0Y0wf7cyvZPn0Ud7nh0trglgaM5QFsoaxFLgG\nWC4im9zHhSJyg4jc4Oa5CzhDRLYCbwK3qmqxqpa659a5jx+5acaYPqhuaCavop4ZoxNIigkH4LGP\njjDtf17hBy9sD3Lphq/SmkZaWjXYxRhwAev0VtX3AOklTx5O7aGrc48AjwSgaMaMGPsLqwGYnB7H\nuu+fyxs7jvPUuqN8sL+YF7fkceclM3EaA8xAqW9qYeFdq7hgdia/v/rUYBdnQNlMb2OGsOaWVo6W\n1nZ7fp8bME4ZFUd4aAgXzBnNX766iO9fOIPi6kYKKusHq6jDWkur8sN/bedgcQ0bjzjTxV7ZVsCR\nku7/bYYiCxjGDGG/fH0Pn/jFagoquv7g31tYTXiokJ0a0y59zrhEALbldhy4aE7EoZIa/vz+Ic7+\n5Vt88/GN3vSn1h0JYqkGngUMY4awl7bmAfCP9UfZfLScmoZmsm97iWc2HgOcGkZ2aixhoe3/q88c\nnUiIwNbcikEv83BUWNngPY4OD+WWT03l7GnpPPdxLqrDpy/DAoYxQ1RFXRMVtU0A3LNqD5c+8D6H\nSpwRUD99eRcA+4uqmZIR1+na6IhQThkVxzY/A0ZhVb3N3+jGnuNVXPXHj7zPn/nmGXz7nClcOGc0\neRX1bDk2fIKyBQxjhqgfv7iDyvpmpmXEe9Pe3VsMQHF1A6t2HOdwSQ2npHcOGACzxyay5VhFr6N5\nymsbOfvutzjnnrf54zsHKKlu6DH/SPM/z27zHm/6308xOjEagE9OdeaFbTpaHpRyBYIFDGOGqD2F\n1Zw+KZX/Pm+qN+217QXe46//dT2tCqf4BBRf587IoLi6gb98cKjH93n0g0PUNLaQW17HT17eyf88\nt63H/P1RVd/Exb99l0/evZoP9hd3Ol9a00h+RV3A3t8fqkphVT3H3QED+ZVt5fFddiUtLpKwEKGw\nyr+BBbWNzfzp3QN+1+Re317Akp++ya/f2EPrIA3htYBhzBBVW
FnPmKRozpuZwSNfySE1NoKPj3T+\nNttdDeOC2ZlMGRXH+/s6fzB7qCovbMrjjMmpfHT7OZw/K4NXthX0ODKrP3bmV7Ett5LDJbV8+4mP\nqahr8p4rr21k4V2r+MwD7wfkvf115wvbWfSTNznrF6s5VFzD0dKuA1hIiJAWF8nxSv9qZK9uK+DH\nL+3kh//yb37M+/uKKais59dv7OXNXYV+l78/LGAYMwS1tCqFVQ1kJkYiIiyfnsGnZmYAEBoifCGn\nbWWdSemxXb6GiDAhNZbc8u6/se8vquFAcQ0XzBlNZmIU379wJtC+JjOQPIHoR5fOoqSmka3HKjhU\nXEP2bS9x01ObADhe2UBVfVNPLxNQ7+4tZnpmPA3NrfziNaevaNm0dO66dFanvBkJkRRW+Rcwytz+\nqCfXHvWrlnGwpJapGXGEhQgfHxmcnR8sYBgzBJVUN9DSqmQmRHnTLp47BoAQgcWTUgBnxE5UeGi3\nrzMuOZrcsrpuR/JsOOwssLB0cioA41NjmJoRx1u7i9p9+x8I97y+m1v+sRmAM9z3y6+o8y6c6PkJ\nwRvdtfloOQeLa7hg9mgWTUzh5a1O4PzOedO45vTsTvnT46Mo9HOuS75P4H74vYPc+fw2tudVUNvY\n3ClvYVU9H+4vZmpGPNMy4wft9zEoy5sbYwaWZ8Jdhk/AOHNKGn/+ymmEh4YwJslJ//nlc3t8nbFJ\n0VQ1NFNZ10yiu3SIx6HiGm59eisxEaFMTGurpUzNiOfFLfnM++HrHPrZRf2+l4q6Jv7zyY/bBYRx\nyc68kYKKeg75TH67esl4nlhzhPf3FXPG5LR+v3dfXeo2h80YHU9keAhrDzoBNTut61pcRkKkdyJf\nb/Ir65mYFktYiHD3a7sB+MuHh4mLDOMnn53NpfPbdnj4j//bQFOL84UhLjKMV7YVoKoBn7VvAcOY\nISjfnajnGZHjcbbPIoN7fnwBEWE9NyKMTXauP1ZeS2JMYrtz96/eB8B5MzPafRBN8vlwrKpvIj6q\nfaDpqzd3Hm8XLACiwkNJiY3gtR0F7D1e7U3/znnTyC2r44HV+4mNDOOby07p13v3RXVD2zf907JT\nmJIRz89f3cWUUXHERXb9UZqZEEVpTSN1jS1ER3Rf0wOnhjE6MYrLTx3HnS9sJyU2gsMltTS3tnLT\nU5u47429zB+fxNLJad6+qvNmZZIeH8nXPjFp4G60BxYwjBlE9U0tvLHzOBfNGd2vb4Oetv6slOhu\n8/QWLACyU50P/wNFNcwa0xYwGptbeXlrPlecOo6fX9a+lpKV0jZrPL+ivt8B44h7L1eelsWXFk8g\nPMz5vYSFCNtyK8lKiebpG84gJjKMuMgwvrX8FFbvLuKFTXmDGjA8s+l//YX5JMdGkBwbwa67VhDS\nw7/jKaOcAQf7Cqu9s+u70tjcyuGSWj45LZ3PLRzHpfPHUt3QzPHKejYdKed7T2/hQLHTn/TMRmen\nh5vPncKiiSkDeIe9sz4MYwbRT17aybee+JgNh/vXSXm4pJaEqLB+7543eVQsoSHCrgJniZDmlla+\n9pd1/OSlHdQ2tnDmlDRCQtp/IGYmtjWD5bnt7sXVDVzx4Aes3l1IXWPfJvjtL6ohKyWan102lznj\nEpme6eyh5un4ffS6RYxym14ATp2QwtfOnMjB4pqArwhbVtPIn949wPHKem9fhG8zYGRYKOGh3X+M\nTnGHNO8+XtXl+T3Hq6hrbOGXr++mpKaRi+eOBpyBC4nR4UzNiOd0tz+no0ndjH4LJKthGDOIPJO4\n/r2rkAXjkwkN6Xst49Vt+fzfR4eZObpPm1N2KTIslMnpsewucD7Qnt54jDd2tg3R9HxD9nXmKWnc\neclMfvivHeSVOx+iq3cVsu5QGdf9eR1XLxnPjz8zx+8yHCiqZnIXH36PfnURBRX1XZ6bkhFHQ3Mr\nx8pqmZDadf/BQPjm4xv58
EAJNQ0tjE91anMZCZF+X5+dGkNEaAh7uggYpTWNnPerd5gzNpGtuRV8\nISeL5dMzOuXLSonh0etOY0FWMr97ex9/ePsAAJO7Gf0WSFbDMKYfnvs4l7d2+zcGXlW9Q1h/99Z+\n73pPfVFZ38QNjzmL2/k7Iaw30zIT2OP2E6w50H7bma4+rEWEa5ZMIESc5qQP95fw3X9u8Z5/Z0/3\n8zo8Gppb2F1QxZ7jVRwoqunyfRaOT+bCOaO7vN7zzX2PT/9GV3LL6/j6X9f7PVLJ1+6CKj48UALA\nsbJa73wK3xpGb8JCQ5iSEcfD7x3kqbXtFyL0TEz0jHC69ozsbl9n2bRRJMaEc+v50wkLEWaOThiQ\nLwx9ZTUMY07Qn949wI9f2gng12ihkppGSmsavc935Pd9pdgdec418ZFh/L+LZ/b5+q6MSYrite31\nqCp7C9s+gNPjI7sdkhsWGsIZk9N47KPDrDlY4k2/bmk2j685QmNza7d9KAeLa7jwvnep85lr0N1c\nke54aj5f/+t6Hrx6IStmdw4sb+w4ztf+uh5wOu6vyMnqlAegtVVpbGntdK9bjjm1wbS4CI6V1REZ\nHkJ8VBix3XRwd2deVhLb8yq57ZmtfD4ny9vE13HC5MRuRlr5CgkR1txxDonR4UHZxySQW7Rmichq\nEdkhIttF5KYu8nzXZze+bSLSIiIp7rlDIrLVPbc+UOU05kSU1TR6gwU4s5A9Wlu1y3kNh4qdhQEf\nvjaHsUnRFFY18Oq2fB5wRyP5w9NB/NJ/fqLdMMv+yIiPorG5lbLaJvYVVnPt6RP4wzWn8s8bTu/x\nuh98eibVDc3eETuPfCWHnAkpNDa3eptg3ttbzINv72933W/f3NsuWEDXNZmeJPh0tN/w2MYuJ/K9\n77O0yLGy7icn/vL13Uz/f6/y33/bRKXP63hqg4smpnCsvJaPj5Qzt4eO6+5M9Gky8/z7qap33S+P\n3kZReaTGRXZafXiwBPJdm4FbVHUmsAS4UUTafSVS1btVdb6qzgduB97usBXr2e75nACW05g+y3PX\nM7r+zIkAvLW7bVjoGT/7N199dF2naw64AeOUUXFMSo/lpS353PDYRu+Y+6aWVoq6mRX8j/VHWX7P\nW3zPbfoZneR/s0hvRrlt8puOllHX1MK0zATOn5XZa9/AKaPimeE2i1yzZALLp2d4V8bdX+TUVK5+\neA0/e2UXD769n6aWVgA2HyvnUzMzOPDTC73t8H0NGB1t7WJF2LzyOqZmxDEuOZr73tzLOx2G7nr8\n6d2DADzzcS6vbM3nH+uPcun97/HvXYVkJEQyKS2Oo6V1bM+r5NTxyX0u2+cWjmVsktP/saugiobm\nFtYeLOVYWR13XjKThKgw7yz9k13AAoaq5qvqRve4CtgJ9PSV6CrgyUCVx5j+UlXe2l1IQ3ML+W5n\n70VzRzMpPZY/vnsAVaWusYWCynpW7+784XSwuIbwUGFsUjSVHWZJt7Yq972xl9N+8gbHyjqv0/Td\nf27hQFGN93lPI3P6ytMm/95ep2mpq47u7vzsc3OYl5XkHd0zITWGEGnbGtab75VdPLsxl9rGZg4U\n1zBrTAIhIcL3L5rBpfPHkBbX99FeX/Fp89+RX8ljHx1ut0RGXnk9oxOjKal2an9ffmRtl7/b+Ki2\nJqa3dhdx69Nb2Hysgi3HKhibFM2sMW19BUu6GbHUk9S4SN74708iArsKKvnBC9v5wkPOcuifnJrO\nuv85l99/aWGfXzcYBqVeIyLZwAJgTTfnY4AVwNM+yQq8LiIbRGRloMtoTHeaWlr59P3v8Y3HNvKV\nP6/j3tf3sN4dFjsuKZobzprM9rxK3t1b3G6JhsoOzSQHi2rISokhLDSEbyw7hemZ8fzXuVO9eT9y\nO1h/++Y+aho6LwcRKBnxTsDwtKlP6UPAmJeVxPM3LmXxJOeDNDIslAmpsezI7zwq6Ndv7OE
efOo24E1VnQK86T4H5/cwxX2spPM+JUOJZ8fHmcAS4Eb333a433sDsFxV5wHzgRUisgRnV8tf\nqeopQBlwvZv/eqDMTf+Vm28ouwlnszaPkXLfZ6vqfJ/hs4H5O1fVEfsATgde83l+O3B7sMsVgPvM\nBrb5PN8NjHaPRwO73eM/AFd1lW+oP4DngU+NpHsHYoCNwGKciVthbrr37x54DTjdPQ5z80mwy36C\n9zvO/XBcDryIs5bdSLjvQ0Bah7SA/J2P6BoGI3dnvwxVzXePC4AM93hY/j467Pg47O/dbZbZhLPt\n8SpgP1Cuqs1uFt978963e74CSB3cEg+YXwPfAzwbsqUyMu5bgddFZIOIrHTTAvJ3HtANlMzJT1VV\nRIbtULmOOz66m3UBw/feVbUFmC8iScCzwPQgFyngRORioFBVN4jIsmCXZ5Cdqaq5IjIKWCUiu3xP\nDuTf+UivYeQCWT7Px7lpw91xERkN4P4sdNOH1e+jmx0fR8S9A6hqObAapykmSUQ8XxB978173+75\nREJP9RwAAAMbSURBVKBkkIs6EJYCnxaRQ8BTOM1S9zH87xtVzXV/FuJ8QVhEgP7OR3rAWAdMcUdS\nRABXAi8EuUyD4QXgWvf4Wpz2fU/6l92RFEuACp9q7ZAi0u2Oj8P63kUk3a1ZICLROP02O3ECx+Vu\nto737fl9XA78W93G7aFEVW9X1XGqmo3z//jfqvolhvl9i0isiMR7jnF2J91GoP7Og91hE+wHcCGw\nB6ed9/vBLk8A7u9JIB9owmmvvB6nrfZNYC/wBpDi5hWcUWP7ga1ATrDL34/7PhOnbXcLsMl9XDjc\n7x2YC3zs3vc24H/d9EnAWmAf8A8g0k2Pcp/vc89PCvY9DMDvYBnOds/D/r7d+9vsPrZ7PsMC9Xdu\nM72NMcb4ZaQ3SRljjPGTBQxjjDF+sYBhjDHGLxYwjDHG+MUChjHGGL9YwDCmFyLS4q4E6nkM2KrG\nIpItPisJG3Mys6VBjOldnarOD3YhjAk2q2EYc4LcfQh+4e5FsFZETnHTs0Xk3+5+A2+KyHg3PUNE\nnnX3qtgsIme4LxUqIn9096943Z2hjYj8pzj7eWwRkaeCdJvGeFnAMKZ30R2apL7gc65CVecA9+Os\nlgrwW+AvqjoXeBz4jZv+G+BtdfaqWIgzMxecvQkeUNVZQDlwmZt+G7DAfZ0bAnVzxvjLZnob0wsR\nqVbVuC7SD+FsVnTAXeiwQFVTRaQYZ4+BJjc9X1XTRKQIGKeqDT6vkQ2sUmejG0TkViBcVX8sIq8C\n1cBzwHOqWh3gWzWmR1bDMKZ/tJvjvmjwOW6hrW/xIpx1fxYC63xWXTUmKCxgGNM/X/D5+aF7/AHO\niqkAXwLedY/fBL4B3k2OErt7UREJAbJUdTVwK87y251qOcYMJvvGYkzvot0d7DxeVVXP0NpkEdmC\nU0u4yk37NvBnEfkuUARc56bfBDwkItfj1CS+gbOScFdCgcfcoCLAb9TZ38KYoLE+DGNOkNuHkaOq\nxcEuizGDwZqkjDHG+MVqGMYYY/xiNQxjjDF+sYBhjDHGLxYwjDHG+MUChjHGGL9YwDDGGOMXCxjG\nGGP88v8DYqN7v3gUkLQAAAAASUVORK5CYII=\n", 571 | "text/plain": [ 572 | "" 573 | ] 574 | }, 575 | "metadata": {}, 576 | "output_type": "display_data" 577 | } 578 | ], 579 | "source": [ 580 | "def smooth_curve(points, factor=0.9):\n", 581 | " smoothed_points = []\n", 582 | " for point in points:\n", 583 | " if smoothed_points:\n", 584 | " previous = 
smoothed_points[-1]\n", 585 | " smoothed_points.append(previous * factor + point * (1 - factor))\n", 586 | " else:\n", 587 | " smoothed_points.append(point)\n", 588 | " return smoothed_points\n", 589 | "\n", 590 | "smooth_mae_history = smooth_curve(average_mae_history[10:])\n", 591 | "\n", 592 | "plt.plot(range(1, len(smooth_mae_history) + 1), smooth_mae_history)\n", 593 | "plt.xlabel('Epochs')\n", 594 | "plt.ylabel('Validation MAE')\n", 595 | "plt.show()" 596 | ] 597 | }, 598 | { 599 | "cell_type": "markdown", 600 | "metadata": {}, 601 | "source": [ 602 | "\n", 603 | "According to this plot, it seems that validation MAE stops improving significantly after 80 epochs. Past that point, we start overfitting.\n", 604 | "\n", 605 | "Once we are done tuning other parameters of our model (besides the number of epochs, we could also adjust the size of the hidden layers), we \n", 606 | "can train a final \"production\" model on all of the training data, with the best parameters, then look at its performance on the test data:" 607 | ] 608 | }, 609 | { 610 | "cell_type": "code", 611 | "execution_count": 28, 612 | "metadata": {}, 613 | "outputs": [ 614 | { 615 | "name": "stdout", 616 | "output_type": "stream", 617 | "text": [ 618 | "\r", 619 | " 32/102 [========>.....................] 
- ETA: 0s" 620 | ] 621 | } 622 | ], 623 | "source": [ 624 | "# Get a fresh, compiled model.\n", 625 | "model = build_model()\n", 626 | "# Train it on the entirety of the data.\n", 627 | "model.fit(train_data, train_targets,\n", 628 | " epochs=80, batch_size=16, verbose=0)\n", 629 | "test_mse_score, test_mae_score = model.evaluate(test_data, test_targets)" 630 | ] 631 | }, 632 | { 633 | "cell_type": "code", 634 | "execution_count": 29, 635 | "metadata": {}, 636 | "outputs": [ 637 | { 638 | "data": { 639 | "text/plain": [ 640 | "2.5532484335057877" 641 | ] 642 | }, 643 | "execution_count": 29, 644 | "metadata": {}, 645 | "output_type": "execute_result" 646 | } 647 | ], 648 | "source": [ 649 | "test_mae_score" 650 | ] 651 | }, 652 | { 653 | "cell_type": "markdown", 654 | "metadata": {}, 655 | "source": [ 656 | "We are still off by about \\$2,550." 657 | ] 658 | }, 659 | { 660 | "cell_type": "markdown", 661 | "metadata": {}, 662 | "source": [ 663 | "## Wrapping up\n", 664 | "\n", 665 | "\n", 666 | "Here's what you should take away from this example:\n", 667 | "\n", 668 | "* Regression is done using different loss functions from classification; Mean Squared Error (MSE) is a commonly used loss function for \n", 669 | "regression.\n", 670 | "* Similarly, evaluation metrics to be used for regression differ from those used for classification; naturally the concept of \"accuracy\" \n", 671 | "does not apply for regression. 
A common regression metric is Mean Absolute Error (MAE).\n", 672 | "* When features in the input data have values in different ranges, each feature should be scaled independently as a preprocessing step.\n", 673 | "* When there is little data available, using K-Fold validation is a great way to reliably evaluate a model.\n", 674 | "* When little training data is available, it is preferable to use a small network with very few hidden layers (typically only one or two), \n", 675 | "in order to avoid severe overfitting.\n", 676 | "\n", 677 | "This example concludes our series of three introductory practical examples. You are now able to handle common types of problems with vector data input:\n", 678 | "\n", 679 | "* Binary (2-class) classification.\n", 680 | "* Multi-class, single-label classification.\n", 681 | "* Scalar regression.\n", 682 | "\n", 683 | "In the next chapter, you will acquire a more formal understanding of some of the concepts you have encountered in these first examples, \n", 684 | "such as data preprocessing, model evaluation, and overfitting." 
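The loss-versus-metric distinction in the wrap-up above can be made concrete with a few lines of NumPy; the targets and predictions here are made-up illustration values, not outputs of the model:

```python
import numpy as np

# Hypothetical house-price targets and predictions, in thousands of dollars.
y_true = np.array([15.0, 22.5, 31.0, 18.2])
y_pred = np.array([14.0, 24.0, 29.5, 18.2])

# Mean Squared Error: the quantity the network minimizes during training.
mse = np.mean((y_true - y_pred) ** 2)

# Mean Absolute Error: the metric we report, in the same units as the targets.
# An MAE of 1.0 here would mean predictions are off by $1,000 on average.
mae = np.mean(np.abs(y_true - y_pred))
```

Because MSE squares the residuals, it penalizes large errors disproportionately; MAE stays in the units of the targets, which is why it is the more interpretable number to report.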
685 | ] 686 | } 687 | ], 688 | "metadata": { 689 | "kernelspec": { 690 | "display_name": "Python 3", 691 | "language": "python", 692 | "name": "python3" 693 | }, 694 | "language_info": { 695 | "codemirror_mode": { 696 | "name": "ipython", 697 | "version": 3 698 | }, 699 | "file_extension": ".py", 700 | "mimetype": "text/x-python", 701 | "name": "python", 702 | "nbconvert_exporter": "python", 703 | "pygments_lexer": "ipython3", 704 | "version": "3.5.2" 705 | } 706 | }, 707 | "nbformat": 4, 708 | "nbformat_minor": 2 709 | } 710 | -------------------------------------------------------------------------------- /5.1-introduction-to-convnets.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [ 8 | { 9 | "name": "stderr", 10 | "output_type": "stream", 11 | "text": [ 12 | "Using TensorFlow backend.\n" 13 | ] 14 | }, 15 | { 16 | "data": { 17 | "text/plain": [ 18 | "'2.0.8'" 19 | ] 20 | }, 21 | "execution_count": 1, 22 | "metadata": {}, 23 | "output_type": "execute_result" 24 | } 25 | ], 26 | "source": [ 27 | "import keras\n", 28 | "keras.__version__" 29 | ] 30 | }, 31 | { 32 | "cell_type": "markdown", 33 | "metadata": { 34 | "collapsed": true 35 | }, 36 | "source": [ 37 | "# 5.1 - Introduction to convnets\n", 38 | "\n", 39 | "This notebook contains the code sample found in Chapter 5, Section 1 of [Deep Learning with Python](https://www.manning.com/books/deep-learning-with-python?a_aid=keras&a_bid=76564dff). Note that the original text features far more content, in particular further explanations and figures: in this notebook, you will only find source code and related comments.\n", 40 | "\n", 41 | "----\n", 42 | "\n", 43 | "First, let's take a practical look at a very simple convnet example. 
We will use our convnet to classify MNIST digits, a task that you've already been \n", 44 | "through in Chapter 2, using a densely-connected network (our test accuracy then was 97.8%). Even though our convnet will be very basic, its \n", 45 | "accuracy will still blow out of the water that of the densely-connected model from Chapter 2.\n", 46 | "\n", 47 | "The 6 lines of code below show you what a basic convnet looks like. It's a stack of `Conv2D` and `MaxPooling2D` layers. We'll see in a \n", 48 | "minute what they do concretely.\n", 49 | "Importantly, a convnet takes as input tensors of shape `(image_height, image_width, image_channels)` (not including the batch dimension). \n", 50 | "In our case, we will configure our convnet to process inputs of size `(28, 28, 1)`, which is the format of MNIST images. We do this via \n", 51 | "passing the argument `input_shape=(28, 28, 1)` to our first layer." 52 | ] 53 | }, 54 | { 55 | "cell_type": "code", 56 | "execution_count": 2, 57 | "metadata": {}, 58 | "outputs": [], 59 | "source": [ 60 | "from keras import layers\n", 61 | "from keras import models\n", 62 | "\n", 63 | "model = models.Sequential()\n", 64 | "model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))\n", 65 | "model.add(layers.MaxPooling2D((2, 2)))\n", 66 | "model.add(layers.Conv2D(64, (3, 3), activation='relu'))\n", 67 | "model.add(layers.MaxPooling2D((2, 2)))\n", 68 | "model.add(layers.Conv2D(64, (3, 3), activation='relu'))" 69 | ] 70 | }, 71 | { 72 | "cell_type": "markdown", 73 | "metadata": {}, 74 | "source": [ 75 | "Let's display the architecture of our convnet so far:" 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": 3, 81 | "metadata": {}, 82 | "outputs": [ 83 | { 84 | "name": "stdout", 85 | "output_type": "stream", 86 | "text": [ 87 | "_________________________________________________________________\n", 88 | "Layer (type) Output Shape Param # \n", 89 | 
"=================================================================\n", 90 | "conv2d_1 (Conv2D) (None, 26, 26, 32) 320 \n", 91 | "_________________________________________________________________\n", 92 | "max_pooling2d_1 (MaxPooling2 (None, 13, 13, 32) 0 \n", 93 | "_________________________________________________________________\n", 94 | "conv2d_2 (Conv2D) (None, 11, 11, 64) 18496 \n", 95 | "_________________________________________________________________\n", 96 | "max_pooling2d_2 (MaxPooling2 (None, 5, 5, 64) 0 \n", 97 | "_________________________________________________________________\n", 98 | "conv2d_3 (Conv2D) (None, 3, 3, 64) 36928 \n", 99 | "=================================================================\n", 100 | "Total params: 55,744\n", 101 | "Trainable params: 55,744\n", 102 | "Non-trainable params: 0\n", 103 | "_________________________________________________________________\n" 104 | ] 105 | } 106 | ], 107 | "source": [ 108 | "model.summary()" 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "metadata": { 114 | "collapsed": true 115 | }, 116 | "source": [ 117 | "You can see above that the output of every `Conv2D` and `MaxPooling2D` layer is a 3D tensor of shape `(height, width, channels)`. The width \n", 118 | "and height dimensions tend to shrink as we go deeper in the network. The number of channels is controlled by the first argument passed to \n", 119 | "the `Conv2D` layers (e.g. 32 or 64).\n", 120 | "\n", 121 | "The next step would be to feed our last output tensor (of shape `(3, 3, 64)`) into a densely-connected classifier network like those you are \n", 122 | "already familiar with: a stack of `Dense` layers. These classifiers process vectors, which are 1D, whereas our current output is a 3D tensor. 
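The shrinking output shapes in the summary above follow simple arithmetic: a 3x3 convolution with 'valid' padding (no padding, stride 1) trims one pixel from each border, and 2x2 max-pooling halves each spatial dimension, rounding down. A quick sketch of that chain, using the layer sizes from this model:

```python
def conv_output(size, kernel=3):
    # 'valid' convolution, stride 1: no padding, so the feature map shrinks.
    return size - kernel + 1

def pool_output(size, pool=2):
    # Non-overlapping 2x2 max-pooling halves the spatial dimensions.
    return size // pool

h = 28              # MNIST input height (and width)
h = conv_output(h)  # conv2d_1        -> 26
h = pool_output(h)  # max_pooling2d_1 -> 13
h = conv_output(h)  # conv2d_2        -> 11
h = pool_output(h)  # max_pooling2d_2 -> 5
h = conv_output(h)  # conv2d_3        -> 3

# The parameter counts in the summary follow the same kind of arithmetic:
# kernel_h * kernel_w * input_channels * filters + filters (one bias each).
assert 3 * 3 * 1 * 32 + 32 == 320      # conv2d_1
assert 3 * 3 * 32 * 64 + 64 == 18496   # conv2d_2
assert 3 * 3 * 64 * 64 + 64 == 36928   # conv2d_3
```

The final `(3, 3, 64)` feature map therefore holds `3 * 3 * 64 = 576` values once flattened.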
\n", 123 | "So first, we will have to flatten our 3D outputs to 1D, and then add a few `Dense` layers on top:" 124 | ] 125 | }, 126 | { 127 | "cell_type": "code", 128 | "execution_count": 4, 129 | "metadata": { 130 | "collapsed": true 131 | }, 132 | "outputs": [], 133 | "source": [ 134 | "model.add(layers.Flatten())\n", 135 | "model.add(layers.Dense(64, activation='relu'))\n", 136 | "model.add(layers.Dense(10, activation='softmax'))" 137 | ] 138 | }, 139 | { 140 | "cell_type": "markdown", 141 | "metadata": {}, 142 | "source": [ 143 | "We are going to do 10-way classification, so we use a final layer with 10 outputs and a softmax activation. Now here's what our network \n", 144 | "looks like:" 145 | ] 146 | }, 147 | { 148 | "cell_type": "code", 149 | "execution_count": 5, 150 | "metadata": {}, 151 | "outputs": [ 152 | { 153 | "name": "stdout", 154 | "output_type": "stream", 155 | "text": [ 156 | "_________________________________________________________________\n", 157 | "Layer (type) Output Shape Param # \n", 158 | "=================================================================\n", 159 | "conv2d_1 (Conv2D) (None, 26, 26, 32) 320 \n", 160 | "_________________________________________________________________\n", 161 | "max_pooling2d_1 (MaxPooling2 (None, 13, 13, 32) 0 \n", 162 | "_________________________________________________________________\n", 163 | "conv2d_2 (Conv2D) (None, 11, 11, 64) 18496 \n", 164 | "_________________________________________________________________\n", 165 | "max_pooling2d_2 (MaxPooling2 (None, 5, 5, 64) 0 \n", 166 | "_________________________________________________________________\n", 167 | "conv2d_3 (Conv2D) (None, 3, 3, 64) 36928 \n", 168 | "_________________________________________________________________\n", 169 | "flatten_1 (Flatten) (None, 576) 0 \n", 170 | "_________________________________________________________________\n", 171 | "dense_1 (Dense) (None, 64) 36928 \n", 172 | 
"_________________________________________________________________\n", 173 | "dense_2 (Dense) (None, 10) 650 \n", 174 | "=================================================================\n", 175 | "Total params: 93,322\n", 176 | "Trainable params: 93,322\n", 177 | "Non-trainable params: 0\n", 178 | "_________________________________________________________________\n" 179 | ] 180 | } 181 | ], 182 | "source": [ 183 | "model.summary()" 184 | ] 185 | }, 186 | { 187 | "cell_type": "markdown", 188 | "metadata": {}, 189 | "source": [ 190 | "As you can see, our `(3, 3, 64)` outputs were flattened into vectors of shape `(576,)`, before going through two `Dense` layers.\n", 191 | "\n", 192 | "Now, let's train our convnet on the MNIST digits. We will reuse a lot of the code we have already covered in the MNIST example from Chapter \n", 193 | "2." 194 | ] 195 | }, 196 | { 197 | "cell_type": "code", 198 | "execution_count": 6, 199 | "metadata": {}, 200 | "outputs": [], 201 | "source": [ 202 | "from keras.datasets import mnist\n", 203 | "from keras.utils import to_categorical\n", 204 | "\n", 205 | "(train_images, train_labels), (test_images, test_labels) = mnist.load_data()\n", 206 | "\n", 207 | "train_images = train_images.reshape((60000, 28, 28, 1))\n", 208 | "train_images = train_images.astype('float32') / 255\n", 209 | "\n", 210 | "test_images = test_images.reshape((10000, 28, 28, 1))\n", 211 | "test_images = test_images.astype('float32') / 255\n", 212 | "\n", 213 | "train_labels = to_categorical(train_labels)\n", 214 | "test_labels = to_categorical(test_labels)" 215 | ] 216 | }, 217 | { 218 | "cell_type": "code", 219 | "execution_count": 7, 220 | "metadata": {}, 221 | "outputs": [ 222 | { 223 | "name": "stdout", 224 | "output_type": "stream", 225 | "text": [ 226 | "Epoch 1/5\n", 227 | "60000/60000 [==============================] - 8s - loss: 0.1766 - acc: 0.9440 \n", 228 | "Epoch 2/5\n", 229 | "60000/60000 [==============================] - 7s - loss: 0.0462 - acc: 0.9855 
\n", 230 | "Epoch 3/5\n", 231 | "60000/60000 [==============================] - 7s - loss: 0.0322 - acc: 0.9902 \n", 232 | "Epoch 4/5\n", 233 | "60000/60000 [==============================] - 7s - loss: 0.0241 - acc: 0.9926 \n", 234 | "Epoch 5/5\n", 235 | "60000/60000 [==============================] - 7s - loss: 0.0187 - acc: 0.9943 \n" 236 | ] 237 | }, 238 | { 239 | "data": { 240 | "text/plain": [ 241 | "" 242 | ] 243 | }, 244 | "execution_count": 7, 245 | "metadata": {}, 246 | "output_type": "execute_result" 247 | } 248 | ], 249 | "source": [ 250 | "model.compile(optimizer='rmsprop',\n", 251 | " loss='categorical_crossentropy',\n", 252 | " metrics=['accuracy'])\n", 253 | "model.fit(train_images, train_labels, epochs=5, batch_size=64)" 254 | ] 255 | }, 256 | { 257 | "cell_type": "markdown", 258 | "metadata": {}, 259 | "source": [ 260 | "Let's evaluate the model on the test data:" 261 | ] 262 | }, 263 | { 264 | "cell_type": "code", 265 | "execution_count": 8, 266 | "metadata": {}, 267 | "outputs": [ 268 | { 269 | "name": "stdout", 270 | "output_type": "stream", 271 | "text": [ 272 | " 9536/10000 [===========================>..] - ETA: 0s" 273 | ] 274 | } 275 | ], 276 | "source": [ 277 | "test_loss, test_acc = model.evaluate(test_images, test_labels)" 278 | ] 279 | }, 280 | { 281 | "cell_type": "code", 282 | "execution_count": 9, 283 | "metadata": {}, 284 | "outputs": [ 285 | { 286 | "data": { 287 | "text/plain": [ 288 | "0.99129999999999996" 289 | ] 290 | }, 291 | "execution_count": 9, 292 | "metadata": {}, 293 | "output_type": "execute_result" 294 | } 295 | ], 296 | "source": [ 297 | "test_acc" 298 | ] 299 | }, 300 | { 301 | "cell_type": "markdown", 302 | "metadata": {}, 303 | "source": [ 304 | "While our densely-connected network from Chapter 2 had a test accuracy of 97.8%, our basic convnet has a test accuracy of 99.3%: we \n", 305 | "decreased our error rate by 68% (relative). Not bad! 
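The "68% (relative)" figure above is just arithmetic on the two error rates, using the rounded accuracies quoted in the text:

```python
dense_error = 1 - 0.978  # densely-connected baseline from Chapter 2
conv_error = 1 - 0.993   # the convnet above, rounded to 99.3%

# Relative reduction: what fraction of the baseline's errors disappeared.
relative_reduction = (dense_error - conv_error) / dense_error
print(round(relative_reduction, 2))  # 0.68
```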
" 306 | ] 307 | } 308 | ], 309 | "metadata": { 310 | "kernelspec": { 311 | "display_name": "Python 3", 312 | "language": "python", 313 | "name": "python3" 314 | }, 315 | "language_info": { 316 | "codemirror_mode": { 317 | "name": "ipython", 318 | "version": 3 319 | }, 320 | "file_extension": ".py", 321 | "mimetype": "text/x-python", 322 | "name": "python", 323 | "nbconvert_exporter": "python", 324 | "pygments_lexer": "ipython3", 325 | "version": "3.5.2" 326 | } 327 | }, 328 | "nbformat": 4, 329 | "nbformat_minor": 2 330 | } 331 | -------------------------------------------------------------------------------- /6.1-one-hot-encoding-of-words-or-characters.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [ 8 | { 9 | "name": "stderr", 10 | "output_type": "stream", 11 | "text": [ 12 | "Using TensorFlow backend.\n" 13 | ] 14 | }, 15 | { 16 | "data": { 17 | "text/plain": [ 18 | "'2.0.8'" 19 | ] 20 | }, 21 | "execution_count": 1, 22 | "metadata": {}, 23 | "output_type": "execute_result" 24 | } 25 | ], 26 | "source": [ 27 | "import keras\n", 28 | "keras.__version__" 29 | ] 30 | }, 31 | { 32 | "cell_type": "markdown", 33 | "metadata": {}, 34 | "source": [ 35 | "# One-hot encoding of words or characters\n", 36 | "\n", 37 | "This notebook contains the first code sample found in Chapter 6, Section 1 of [Deep Learning with Python](https://www.manning.com/books/deep-learning-with-python?a_aid=keras&a_bid=76564dff). Note that the original text features far more content, in particular further explanations and figures: in this notebook, you will only find source code and related comments.\n", 38 | "\n", 39 | "----\n", 40 | "\n", 41 | "One-hot encoding is the most common, most basic way to turn a token into a vector. 
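Stripped to its essentials, the encoding this section describes maps an integer token index to a basis vector; here is a minimal sketch of that mapping (the notebook's own examples below apply the same idea across whole sentences):

```python
def one_hot(index, vocabulary_size):
    # All zeros, except a single 1 at the token's index.
    vector = [0.0] * vocabulary_size
    vector[index] = 1.0
    return vector

one_hot(2, 5)  # [0.0, 0.0, 1.0, 0.0, 0.0]
```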
You already saw it in action in our initial IMDB and \n", 42 | "Reuters examples from chapter 3 (done with words, in our case). It consists of associating a unique integer index with every word, then \n", 43 | "turning this integer index i into a binary vector of size N, the size of the vocabulary, that would be all-zeros except for the i-th \n", 44 | "entry, which would be 1.\n", 45 | "\n", 46 | "Of course, one-hot encoding can be done at the character level as well. To unambiguously drive home what one-hot encoding is and how to \n", 47 | "implement it, here are two toy examples of one-hot encoding: one for words, the other for characters.\n", 48 | "\n" 49 | ] 50 | }, 51 | { 52 | "cell_type": "markdown", 53 | "metadata": {}, 54 | "source": [ 55 | "Word-level one-hot encoding (toy example):" 56 | ] 57 | }, 58 | { 59 | "cell_type": "code", 60 | "execution_count": 3, 61 | "metadata": {}, 62 | "outputs": [], 63 | "source": [ 64 | "import numpy as np\n", 65 | "\n", 66 | "# This is our initial data; one entry per \"sample\"\n", 67 | "# (in this toy example, a \"sample\" is just a sentence, but\n", 68 | "# it could be an entire document).\n", 69 | "samples = ['The cat sat on the mat.', 'The dog ate my homework.']\n", 70 | "\n", 71 | "# First, build an index of all tokens in the data.\n", 72 | "token_index = {}\n", 73 | "for sample in samples:\n", 74 | " # We simply tokenize the samples via the `split` method.\n", 75 | " # In real life, we would also strip punctuation and special characters\n", 76 | " # from the samples.\n", 77 | " for word in sample.split():\n", 78 | " if word not in token_index:\n", 79 | " # Assign a unique index to each unique word\n", 80 | " token_index[word] = len(token_index) + 1\n", 81 | " # Note that we don't attribute index 0 to anything.\n", 82 | "\n", 83 | "# Next, we vectorize our samples.\n", 84 | "# We will only consider the first `max_length` words in each sample.\n", 85 | "max_length = 10\n", 86 | "\n", 87 | "# This is where we store our
results:\n", 88 | "results = np.zeros((len(samples), max_length, max(token_index.values()) + 1))\n", 89 | "for i, sample in enumerate(samples):\n", 90 | " for j, word in list(enumerate(sample.split()))[:max_length]:\n", 91 | " index = token_index.get(word)\n", 92 | " results[i, j, index] = 1." 93 | ] 94 | }, 95 | { 96 | "cell_type": "markdown", 97 | "metadata": {}, 98 | "source": [ 99 | "Character-level one-hot encoding (toy example):" 100 | ] 101 | }, 102 | { 103 | "cell_type": "code", 104 | "execution_count": 5, 105 | "metadata": {}, 106 | "outputs": [], 107 | "source": [ 108 | "import string\n", 109 | "\n", 110 | "samples = ['The cat sat on the mat.', 'The dog ate my homework.']\n", 111 | "characters = string.printable # All printable ASCII characters.\n", 112 | "token_index = dict(zip(characters, range(1, len(characters) + 1)))\n", 113 | "\n", 114 | "max_length = 50\n", 115 | "results = np.zeros((len(samples), max_length, max(token_index.values()) + 1))\n", 116 | "for i, sample in enumerate(samples):\n", 117 | " for j, character in enumerate(sample[:max_length]):\n", 118 | " index = token_index.get(character)\n", 119 | " results[i, j, index] = 1." 120 | ] 121 | }, 122 | { 123 | "cell_type": "markdown", 124 | "metadata": {}, 125 | "source": [ 126 | "Note that Keras has built-in utilities for one-hot encoding text at the word level or character level, starting from raw text data. \n", 127 | "This is what you should actually be using, as it will take care of a number of important features, such as stripping special characters \n", 128 | "from strings, or only taking into account the top N most common words in your dataset (a common restriction to avoid dealing with very large input \n", 129 | "vector spaces)."
130 | ] 131 | }, 132 | { 133 | "cell_type": "markdown", 134 | "metadata": {}, 135 | "source": [ 136 | "Using Keras for word-level one-hot encoding:" 137 | ] 138 | }, 139 | { 140 | "cell_type": "code", 141 | "execution_count": 7, 142 | "metadata": {}, 143 | "outputs": [ 144 | { 145 | "name": "stdout", 146 | "output_type": "stream", 147 | "text": [ 148 | "Found 9 unique tokens.\n" 149 | ] 150 | } 151 | ], 152 | "source": [ 153 | "from keras.preprocessing.text import Tokenizer\n", 154 | "\n", 155 | "samples = ['The cat sat on the mat.', 'The dog ate my homework.']\n", 156 | "\n", 157 | "# We create a tokenizer, configured to only take\n", 158 | "# into account the top-1000 most common words\n", 159 | "tokenizer = Tokenizer(num_words=1000)\n", 160 | "# This builds the word index\n", 161 | "tokenizer.fit_on_texts(samples)\n", 162 | "\n", 163 | "# This turns strings into lists of integer indices.\n", 164 | "sequences = tokenizer.texts_to_sequences(samples)\n", 165 | "\n", 166 | "# You could also directly get the one-hot binary representations.\n", 167 | "# Note that other vectorization modes than one-hot encoding are supported!\n", 168 | "one_hot_results = tokenizer.texts_to_matrix(samples, mode='binary')\n", 169 | "\n", 170 | "# This is how you can recover the word index that was computed\n", 171 | "word_index = tokenizer.word_index\n", 172 | "print('Found %s unique tokens.' % len(word_index))" 173 | ] 174 | }, 175 | { 176 | "cell_type": "markdown", 177 | "metadata": {}, 178 | "source": [ 179 | "\n", 180 | "A variant of one-hot encoding is the so-called \"one-hot hashing trick\", which can be used when the number of unique tokens in your \n", 181 | "vocabulary is too large to handle explicitly. Instead of explicitly assigning an index to each word and keeping a reference of these \n", 182 | "indices in a dictionary, one may hash words into vectors of fixed size. This is typically done with a very lightweight hashing function. 
\n", 183 | "The main advantage of this method is that it does away with maintaining an explicit word index, which \n", 184 | "saves memory and allows online encoding of the data (starting to generate token vectors right away, before having seen all of the available \n", 185 | "data). The one drawback of this method is that it is susceptible to \"hash collisions\": two different words may end up with the same hash, \n", 186 | "and subsequently any machine learning model looking at these hashes won't be able to tell the difference between these words. The likelihood \n", 187 | "of hash collisions decreases when the dimensionality of the hashing space is much larger than the total number of unique tokens being hashed." 188 | ] 189 | }, 190 | { 191 | "cell_type": "markdown", 192 | "metadata": {}, 193 | "source": [ 194 | "Word-level one-hot encoding with hashing trick (toy example):" 195 | ] 196 | }, 197 | { 198 | "cell_type": "code", 199 | "execution_count": 9, 200 | "metadata": {}, 201 | "outputs": [], 202 | "source": [ 203 | "samples = ['The cat sat on the mat.', 'The dog ate my homework.']\n", 204 | "\n", 205 | "# We will store our words as vectors of size 1000.\n", 206 | "# Note that if you have close to 1000 words (or more)\n", 207 | "# you will start seeing many hash collisions, which\n", 208 | "# will decrease the accuracy of this encoding method.\n", 209 | "dimensionality = 1000\n", 210 | "max_length = 10\n", 211 | "\n", 212 | "results = np.zeros((len(samples), max_length, dimensionality))\n", 213 | "for i, sample in enumerate(samples):\n", 214 | " for j, word in list(enumerate(sample.split()))[:max_length]:\n", 215 | " # Hash the word into a \"random\" integer index\n", 216 | " # that is between 0 and 1000\n", 217 | " index = abs(hash(word)) % dimensionality\n", 218 | " results[i, j, index] = 1." 
219 | ] 220 | } 221 | ], 222 | "metadata": { 223 | "kernelspec": { 224 | "display_name": "Python 3", 225 | "language": "python", 226 | "name": "python3" 227 | }, 228 | "language_info": { 229 | "codemirror_mode": { 230 | "name": "ipython", 231 | "version": 3 232 | }, 233 | "file_extension": ".py", 234 | "mimetype": "text/x-python", 235 | "name": "python", 236 | "nbconvert_exporter": "python", 237 | "pygments_lexer": "ipython3", 238 | "version": "3.5.2" 239 | } 240 | }, 241 | "nbformat": 4, 242 | "nbformat_minor": 2 243 | } 244 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2017 François Chollet 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Companion Jupyter notebooks for the book "Deep Learning with Python" 2 | 3 | This repository contains Jupyter notebooks implementing the code samples found in the book [Deep Learning with Python (Manning Publications)](https://www.manning.com/books/deep-learning-with-python?a_aid=keras&a_bid=76564dff). Note that the original text of the book features far more content than you will find in these notebooks, in particular further explanations and figures. Here we have only included the code samples themselves and immediately related surrounding comments. 4 | 5 | These notebooks use Python 3.6 and Keras 2.0.8. They were generated on a p2.xlarge EC2 instance. 6 | 7 | ## Table of contents 8 | 9 | * Chapter 2: 10 | * [2.1: A first look at a neural network](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/2.1-a-first-look-at-a-neural-network.ipynb) 11 | * Chapter 3: 12 | * [3.5: Classifying movie reviews](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/3.5-classifying-movie-reviews.ipynb) 13 | * [3.6: Classifying newswires](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/3.6-classifying-newswires.ipynb) 14 | * [3.7: Predicting house prices](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/3.7-predicting-house-prices.ipynb) 15 | * Chapter 4: 16 | * [4.4: Underfitting and overfitting](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/4.4-overfitting-and-underfitting.ipynb) 17 | * Chapter 5: 18 | * [5.1: Introduction to convnets](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/5.1-introduction-to-convnets.ipynb) 19 | * [5.2: Using convnets 
with small datasets](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/5.2-using-convnets-with-small-datasets.ipynb) 20 | * [5.3: Using a pre-trained convnet](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/5.3-using-a-pretrained-convnet.ipynb) 21 | * [5.4: Visualizing what convnets learn](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/5.4-visualizing-what-convnets-learn.ipynb) 22 | * Chapter 6: 23 | * [6.1: One-hot encoding of words or characters](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/6.1-one-hot-encoding-of-words-or-characters.ipynb) 24 | * [6.1: Using word embeddings](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/6.1-using-word-embeddings.ipynb) 25 | * [6.2: Understanding RNNs](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/6.2-understanding-recurrent-neural-networks.ipynb) 26 | * [6.3: Advanced usage of RNNs](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/6.3-advanced-usage-of-recurrent-neural-networks.ipynb) 27 | * [6.4: Sequence processing with convnets](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/6.4-sequence-processing-with-convnets.ipynb) 28 | * Chapter 8: 29 | * [8.1: Text generation with LSTM](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/8.1-text-generation-with-lstm.ipynb) 30 | * [8.2: Deep dream](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/8.2-deep-dream.ipynb) 31 | * [8.3: Neural style transfer](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/8.3-neural-style-transfer.ipynb) 32 | * [8.4: Generating images with 
VAEs](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/8.4-generating-images-with-vaes.ipynb) 33 | * [8.5: Introduction to GANs](http://nbviewer.jupyter.org/github/fchollet/deep-learning-with-python-notebooks/blob/master/8.5-introduction-to-gans.ipynb) 34 | 35 | --------------------------------------------------------------------------------
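The word-level one-hot toy example from notebook 6.1 above can also be run outside a notebook. The following is a condensed, self-contained sketch of the same scheme (pure NumPy, no Keras required); variable names follow the notebook, and the final `print` is added here only to make the resulting tensor shape explicit:

```python
import numpy as np

# Toy corpus, as in notebook 6.1: one entry per "sample".
samples = ['The cat sat on the mat.', 'The dog ate my homework.']

# Build a word index, starting at 1 (the notebooks reserve index 0).
token_index = {}
for sample in samples:
    for word in sample.split():
        if word not in token_index:
            token_index[word] = len(token_index) + 1

# One-hot tensor of shape (num samples, max words per sample, vocab size + 1).
max_length = 10
results = np.zeros((len(samples), max_length, max(token_index.values()) + 1))
for i, sample in enumerate(samples):
    for j, word in list(enumerate(sample.split()))[:max_length]:
        results[i, j, token_index[word]] = 1.

print(results.shape)  # (2, 10, 11)
```

For this two-sentence corpus, naive `split` tokenization yields 10 unique tokens (punctuation kept, so `mat.` and `homework.` count as words), hence the last axis has size 11 with index 0 unused.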