├── 01_Linear_Regression.ipynb ├── 02_MLP.ipynb ├── 03_CNN.ipynb └── README.md /01_Linear_Regression.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Linear Regression\n", 8 | "To get familiar with automatic differentiation, we start by learning a simple linear regression model using Stochastic Gradient Descent (SGD).\n", 9 | "\n", 10 | "Recall that given a dataset $\\{(x_i, y_i)\\}_{i=0}^N$, with $x_i, y_i \\in \\mathbb{R}$, the objective of linear regression is to find two scalars $w$ and $b$ such that $y = w\\cdot x + b$ fits the dataset. In this tutorial we will learn $w$ and $b$ using SGD and a Mean Square Error (MSE) loss:\n", 11 | "\n", 12 | "$$L = \\frac{1}{N} \\sum_{i=0}^N (w\\cdot x_i + b - y_i)^2$$\n", 13 | "\n", 14 | "Starting from random values, parameters $w$ and $b$ will be updated at each iteration via the following rule:\n", 15 | "\n", 16 | "$$w_t = w_{t-1} - \\eta \\frac{\\partial L}{\\partial w}$$\n", 17 | "$$b_t = b_{t-1} - \\eta \\frac{\\partial L}{\\partial b}$$\n", 18 | "\n", 19 | "where $\\eta$ is the learning rate.\n", 20 | "\n", 21 | "## Placeholders and variables\n", 22 | "To implement and run this simple model, we will use the [Keras backend module](http://keras.io/backend/), which provides an abstraction over Theano and Tensorflow, two popular tensor manipulation libraries that provide automatic differentiation.\n", 23 | "\n", 24 | "First of all, we define the necessary variables and placeholders for our computational graph. Variables maintain state across executions of the computational graph, while placeholders are ways to feed the graph with external data.\n", 25 | "\n", 26 | "For the linear regression example, we need three variables: `w`, `b`, and the learning rate for SGD, `lr`. Two placeholders `x` and `target` are created to store $x_i$ and $y_i$ values." 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": 1, 32 | "metadata": { 33 | "collapsed": false 34 | }, 35 | "outputs": [ 36 | { 37 | "name": "stdout", 38 | "output_type": "stream", 39 | "text": [ 40 | "Using TensorFlow backend.\n" 41 | ] 42 | } 43 | ], 44 | "source": [ 45 | "import keras.backend as K\n", 46 | "import numpy as np\n", 47 | "\n", 48 | "# Placeholders and variables\n", 49 | "x = K.placeholder()\n", 50 | "target = K.placeholder()\n", 51 | "lr = K.variable(0.1)\n", 52 | "w = K.variable(np.random.rand())\n", 53 | "b = K.variable(np.random.rand())" 54 | ] 55 | }, 56 | { 57 | "cell_type": "markdown", 58 | "metadata": { 59 | "collapsed": true 60 | }, 61 | "source": [ 62 | "## Model definition\n", 63 | "Now we can define the $y = w\\cdot x + b$ relation as well as the MSE loss in the computational graph." 
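As a quick sanity check on what automatic differentiation will give us, the two gradients can also be derived by hand from the loss above:

$$\frac{\partial L}{\partial w} = \frac{2}{N} \sum_{i=0}^N (w\cdot x_i + b - y_i)\,x_i \qquad\qquad \frac{\partial L}{\partial b} = \frac{2}{N} \sum_{i=0}^N (w\cdot x_i + b - y_i)$$

These are exactly the quantities that the backend will build for us symbolically in the next cells.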
64 | ] 65 | }, 66 | { 67 | "cell_type": "code", 68 | "execution_count": 2, 69 | "metadata": { 70 | "collapsed": true 71 | }, 72 | "outputs": [], 73 | "source": [ 74 | "# Define model and loss\n", 75 | "y = w * x + b\n", 76 | "loss = K.mean(K.square(y-target))" 77 | ] 78 | }, 79 | { 80 | "cell_type": "markdown", 81 | "metadata": {}, 82 | "source": [ 83 | "Then, given the gradients of the MSE loss with respect to `w` and `b`, we can define how we update the parameters via SGD:" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": 4, 89 | "metadata": { 90 | "collapsed": false 91 | }, 92 | "outputs": [ 93 | { 94 | "name": "stdout", 95 | "output_type": "stream", 96 | "text": [ 97 | "Tensor(\"gradients_1/mul_grad/Reshape:0\", shape=TensorShape([]), dtype=float32)\n" 98 | ] 99 | } 100 | ], 101 | "source": [ 102 | "grads = K.gradients(loss, [w,b])\n", 103 | "print grads[0]" 104 | ] 105 | }, 106 | { 107 | "cell_type": "code", 108 | "execution_count": null, 109 | "metadata": { 110 | "collapsed": true 111 | }, 112 | "outputs": [], 113 | "source": [ 114 | "grads = K.gradients(loss, [w,b])\n", 115 | "updates = [[w, w-lr*grads[0]], [b, b-lr*grads[1]]]" 116 | ] 117 | }, 118 | { 119 | "cell_type": "markdown", 120 | "metadata": {}, 121 | "source": [ 122 | "The whole model can be encapsulated in a `function`, which takes as input `x` and `target`, returns the current loss value and updates its parameters according to `updates`." 123 | ] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "execution_count": null, 128 | "metadata": { 129 | "collapsed": true 130 | }, 131 | "outputs": [], 132 | "source": [ 133 | "train = K.function(inputs=[x, target], outputs=[loss], updates=updates)" 134 | ] 135 | }, 136 | { 137 | "cell_type": "markdown", 138 | "metadata": {}, 139 | "source": [ 140 | "## Training\n", 141 | "Training is now just a matter of calling the `function` we have just defined. Each time `train` is called, `w` and `b` will indeed be updated using the SGD rule.\n", 142 | "\n", 143 | "Having generated some random training data, we will call the `train` function for several epochs and observe the values of `w`, `b`, and the loss." 144 | ] 145 | }, 146 | { 147 | "cell_type": "code", 148 | "execution_count": null, 149 | "metadata": { 150 | "collapsed": false 151 | }, 152 | "outputs": [], 153 | "source": [ 154 | "# Generate data\n", 155 | "np_x = np.random.rand(1000)\n", 156 | "np_target = 0.96*np_x + 0.24\n", 157 | "\n", 158 | "# Training\n", 159 | "loss_history = []\n", 160 | "for epoch in range(200):\n", 161 | " current_loss = train([np_x, np_target])[0]\n", 162 | " loss_history.append(current_loss)\n", 163 | " if epoch % 20 == 0:\n", 164 | " print \"Loss: %.03f, w, b: [%.02f, %.02f]\" % (current_loss, K.eval(w), K.eval(b))" 165 | ] 166 | }, 167 | { 168 | "cell_type": "markdown", 169 | "metadata": {}, 170 | "source": [ 171 | "We can also plot the loss history:" 172 | ] 173 | }, 174 | { 175 | "cell_type": "code", 176 | "execution_count": null, 177 | "metadata": { 178 | "collapsed": false 179 | }, 180 | "outputs": [], 181 | "source": [ 182 | "# Plot loss history\n", 183 | "import matplotlib.pyplot as plt\n", 184 | "%matplotlib inline\n", 185 | "plt.plot(loss_history)" 186 | ] 187 | }, 188 | { 189 | "cell_type": "markdown", 190 | "metadata": {}, 191 | "source": [ 192 | "You should now have seen how a simple machine learning model can be implemented and trained using SGD and the Keras backend module. 
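As a sanity check, the whole procedure can also be written directly in NumPy. The sketch below is not part of the notebook — the variable names are only illustrative — but it mirrors what the compiled `train` function does on every call: evaluate the loss, compute the two gradients, and apply the SGD updates.

```python
import numpy as np

np.random.seed(0)
np_x = np.random.rand(1000)
np_target = 0.96 * np_x + 0.24

w, b, lr = np.random.rand(), np.random.rand(), 0.1
for epoch in range(200):
    error = w * np_x + b - np_target       # residuals of the current fit
    loss = np.mean(error ** 2)             # MSE, same as K.mean(K.square(y - target))
    grad_w = 2.0 * np.mean(error * np_x)   # dL/dw
    grad_b = 2.0 * np.mean(error)          # dL/db
    w -= lr * grad_w                       # the same updates listed in `updates`
    b -= lr * grad_b
# after 200 iterations w and b approach 0.96 and 0.24
```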
Recall that linear regression is indeed a simple neuron with a linear activation function.\n", 193 | "\n", 194 | "Following the same principles and looking at the [Keras backend implementation](https://github.com/fchollet/keras/tree/master/keras/backend), you should be able to implement a basic Neural Network. In practice, we will use Keras to build and train Neural Networks." 195 | ] 196 | } 197 | ], 198 | "metadata": { 199 | "kernelspec": { 200 | "display_name": "Python 2", 201 | "language": "python", 202 | "name": "python2" 203 | }, 204 | "language_info": { 205 | "codemirror_mode": { 206 | "name": "ipython", 207 | "version": 2 208 | }, 209 | "file_extension": ".py", 210 | "mimetype": "text/x-python", 211 | "name": "python", 212 | "nbconvert_exporter": "python", 213 | "pygments_lexer": "ipython2", 214 | "version": "2.7.8" 215 | } 216 | }, 217 | "nbformat": 4, 218 | "nbformat_minor": 0 219 | } 220 | -------------------------------------------------------------------------------- /02_MLP.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Multilayer Perceptron\n", 8 | "In this second tutorial we will implement a Multilayer Perceptron (MLP) for a classification task using [Keras](http://keras.io/).\n", 9 | "\n", 10 | "In an MLP, the output of each layer is computed using the activations from the previous one, as follows:\n", 11 | "\n", 12 | "$$\\mathbf{h}_{i} = \\sigma(\\mathbf{W}_i \\mathbf{h}_{i-1} + \\mathbf{b}_i)$$\n", 13 | "\n", 14 | "where $\\mathbf{h}_i$ is the activation vector from the $i$-th layer (or the input data for $i=0$), $\\mathbf{W}_i$ and $\\mathbf{b}_i$ are respectively the weight matrix and the bias vector for the $i$-th layer, and $\\sigma(\\cdot)$ is the activation function. In our example, we will use the ReLU activation function for the hidden layers and softmax for the last layer.\n", 15 | "\n", 16 | "To regularize the model, we will also insert a Dropout layer between consecutive hidden layers. Dropout works by “dropping out” some unit activations in a given layer, that is setting them to zero with a given probability.\n", 17 | "\n", 18 | "Our loss function will be the categorical crossentropy.\n", 19 | "\n", 20 | "## Model definition\n", 21 | "Keras supports two different kinds of models: the [Sequential](http://keras.io/models/#sequential) model and the [Graph](http://keras.io/models/#graph) model. The former is used to build linear stacks of layers (so each layer has one input and one output), and the latter supports any kind of connection graph.\n", 22 | "\n", 23 | "In our case we build a Sequential model with three [Dense](http://keras.io/layers/core/#dense) (aka fully connected) layers, with some [Dropout](http://keras.io/layers/core/#dropout). Notice that the output layer has the softmax activation function. \n", 24 | "The resulting model is actually a `function` of its own inputs implemented using the Keras backend (see previous tutorial). We apply the categorical crossentropy loss and choose SGD as the optimizer. Keras also supports a variety of other [optimizers](http://keras.io/optimizers/) and [loss functions](http://keras.io/objectives/), which you may want to check out. 
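To make the layer equation above concrete, here is a minimal NumPy sketch of a single forward pass through the same architecture (random weights, illustrative names, and the common "inverted" dropout variant); Keras builds this computation, together with the backward pass, for us:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def dropout(h, p=0.2):
    # zero each activation with probability p, rescaling the survivors
    mask = (np.random.rand(*h.shape) >= p) / (1.0 - p)
    return h * mask

x = np.random.rand(128, 784)                        # a batch of flattened 28x28 images
W1, b1 = 0.01 * np.random.randn(784, 512), np.zeros(512)
W2, b2 = 0.01 * np.random.randn(512, 512), np.zeros(512)
W3, b3 = 0.01 * np.random.randn(512, 10), np.zeros(10)

h1 = dropout(relu(x.dot(W1) + b1))                  # Dense(512) + ReLU + Dropout(0.2)
h2 = dropout(relu(h1.dot(W2) + b2))                 # Dense(512) + ReLU + Dropout(0.2)
probs = softmax(h2.dot(W3) + b3)                    # Dense(10) + softmax
```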
" 25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": 1, 30 | "metadata": { 31 | "collapsed": false 32 | }, 33 | "outputs": [ 34 | { 35 | "name": "stdout", 36 | "output_type": "stream", 37 | "text": [ 38 | "Using TensorFlow backend.\n" 39 | ] 40 | } 41 | ], 42 | "source": [ 43 | "from keras.models import Sequential\n", 44 | "from keras.layers.core import Dense, Dropout\n", 45 | "from keras.optimizers import SGD\n", 46 | "\n", 47 | "model = Sequential()\n", 48 | "model.add(Dense(512, activation='relu', input_shape=(784,)))\n", 49 | "model.add(Dropout(0.2))\n", 50 | "model.add(Dense(512, activation='relu'))\n", 51 | "model.add(Dropout(0.2))\n", 52 | "model.add(Dense(10, activation='softmax'))\n", 53 | "\n", 54 | "model.compile(loss='binary_crossentropy', optimizer=SGD())" 55 | ] 56 | }, 57 | { 58 | "cell_type": "markdown", 59 | "metadata": {}, 60 | "source": [ 61 | "## Data preparation\n", 62 | "We will train our MLP on the MNIST dataset, which consists of 60,000 28x28 grayscale images of the 10 digits, along with a test set of 10,000 images. Since this dataset is provided with Keras, we just ask the `keras.dataset` model for training and test data, reshape them to be in vectorial form, and normalize between 0 and 1.\n", 63 | "\n", 64 | "The `binary_crossentropy` loss expects a one-hot-vector as input, therefore we apply the `to_categorical` function from `keras.utilis` to convert integer labels to one-hot-vectors." 65 | ] 66 | }, 67 | { 68 | "cell_type": "code", 69 | "execution_count": 2, 70 | "metadata": { 71 | "collapsed": false 72 | }, 73 | "outputs": [], 74 | "source": [ 75 | "from keras.datasets import mnist\n", 76 | "from keras.utils import np_utils\n", 77 | "\n", 78 | "(X_train, y_train), (X_test, y_test) = mnist.load_data()\n", 79 | "X_train = X_train.reshape(60000, 784)\n", 80 | "X_test = X_test.reshape(10000, 784)\n", 81 | "X_train = X_train.astype(\"float32\")\n", 82 | "X_test = X_test.astype(\"float32\")\n", 83 | "X_train /= 255\n", 84 | "X_test /= 255\n", 85 | "\n", 86 | "# convert class vectors to binary class matrices\n", 87 | "Y_train = np_utils.to_categorical(y_train, 10)\n", 88 | "Y_test = np_utils.to_categorical(y_test, 10)" 89 | ] 90 | }, 91 | { 92 | "cell_type": "markdown", 93 | "metadata": {}, 94 | "source": [ 95 | "## Training\n", 96 | "Having defined and compiled the model, it can be trained using the `fit` function. We also specify a validation dataset to monitor validation loss and accuracy." 97 | ] 98 | }, 99 | { 100 | "cell_type": "code", 101 | "execution_count": 3, 102 | "metadata": { 103 | "collapsed": false 104 | }, 105 | "outputs": [], 106 | "source": [ 107 | "history = model.fit(X_train, Y_train,\n", 108 | " batch_size=128, nb_epoch=100,\n", 109 | " show_accuracy=True, verbose=0,\n", 110 | " validation_data=(X_test, Y_test))" 111 | ] 112 | }, 113 | { 114 | "cell_type": "markdown", 115 | "metadata": {}, 116 | "source": [ 117 | "The return value of the `fit` function is a `keras.callbacks.History` object which contains the entire history of training/validation loss and accuracy, for each epoch. We can therefore plot the behaviour of loss and accuracy during the training phase." 
118 | ] 119 | }, 120 | { 121 | "cell_type": "code", 122 | "execution_count": 4, 123 | "metadata": { 124 | "collapsed": false 125 | }, 126 | "outputs": [ 127 | { 128 | "data": { 129 | "text/plain": [ 130 | "" 131 | ] 132 | }, 133 | "execution_count": 4, 134 | "metadata": {}, 135 | "output_type": "execute_result" 136 | }, 137 | { 138 | "data": { 139 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY0AAAEPCAYAAAC+35gCAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xl4XWW99//3t0kzz006p0NogRYFZCjVMuQgSqFAUR8Z\nBEHBI3gE4eCjCB6l6MGjHngEfwqigMggw3UUqJTZYwC1Fgplbm1LW9KmaRuSZp6T7++PtffOTkjK\nTpOd8fO6rnV1r3HfWUo+uYd1L3N3REREYjFhuAsgIiKjh0JDRERiptAQEZGYKTRERCRmCg0REYmZ\nQkNERGIW19Aws6VmtsHMNpnZ1b3sX25mr5vZOjN7xcxOjNq3zczeCO17KZ7lFBGR2Fi8ntMwswTg\nn8BJQBnwMnCuu6+POibd3RtCnz8KPOLu80LrW4Ej3b0qLgUUEZF+i2dNYxGw2d23uXsb8CCwPPqA\ncGCEZADv97iGxbF8IiLST/EMjRnA9qj1HaFt3ZjZmWa2HngS+EbULgeeM7O1ZvavcSyniIjEKDGO\n146p3cvdHwUeNbPjgHuBg0K7lrh7uZkVAM+a2QZ3fzFOZRURkRjEMzTKgMKo9UKC2kav3P1FM0s0\ns0nuXunu5aHtFWb2CEFzV7fQMDNNnCUish/cfb+a/+PZPLUWmG9mc8wsCTgbWBl9gJkdYGYW+nwE\ngLtXmlmamWWGtqcDnwbe7O1L3F2LO9ddd92wl2GkLLoXuhe6F/teBiJuNQ13bzezy4CngQTgTndf\nb2aXhPbfDnwOuMDM2oB64JzQ6VOBP4byJBG4392fiVdZRUQkNvFsnsLdnyTo4I7ednvU558CP+3l\nvC3A4fEsm4iI9J+eCB8jiouLh7sII4buRRfdiy66F4Mjbg/3DQUz89FcfhGR4WBm+H52hMe1eUpE\nxodQ/6OMQIP9h7VCQ0QGhWr9I088wlx9GiIiEjOFhoiIxEyhISIiMVNoiIjE6NRTT+Xee+8d9GNH\nEw25FZEBCw3hHO5i9CojIyPSIdzQ0EBKSgoJCQkA/PrXv+bcc88dzuLFVV//uwxkyK1CQ0QGbCSH\nRrS5c+dy5513cuKJJ35gX3t7O4mJY2tAaTxCQ81TIjIulZSUMHPmTH76058ybdo0Lr74Yqqrqznt\ntNOYPHkyeXl5nH766ZSVlUXOKS4u5s477wTg7rvv5thjj+Vb3/oWeXl5FBUV8dRTT+3XsVu3buX4\n448nKyuLT33qU3z961/ni1/84hDdif5RaIjIuLV792727t1LaWkpt99+O52dnVx88cWUlpZSWlpK\namoql112WeR4M+v27MNLL73EwQcfTGVlJd/+9re5+OKL9+vYL3zhCyxevJiqqipWrFjBfffdN2If\nmFRoiMiQMBv4MtgmTJjA9ddfz8SJE0lJSSEvL4/PfOYzpKSkkJGRwbXXXsvzzz/f5/mzZ8/m4osv\nxsy44IILKC8vZ8+ePf06trS0lLVr1/KDH/yAxMRElixZwhlnnDFim/sUGiIyJNwHvgy2goICkpKS\nIuuNjY1ccsklzJkzh+zsbE444QRqamr6/AU+derUyOe0tDQA6uvr+3Xszp07ycvLIyUlJbK/sLDw\nA+ePFAoNERm3ejYB3XTTTWzcuJGXXnqJmpoann/++UF5cdG+TJs2jaqqKpqamiLbSktL4/Z9A6XQ\nEBEJqa+vJzU1lezsbKqqqrj++uvj/p2zZ8/mqKOOYsWKFbS1tbF69Woef/xx9WmIiIw0PX8xX3nl\nlTQ1NZGfn88nPvEJTjnllD5/effs6O7terEee//997N69WomTZrE9773Pc4+++xuzWYjiZ7TEJEB\nGy3PaYwWZ599NgsXLuS6664b0HX0nIaIyBi0du1a3n33XTo7O3nyySdZuXIlZ5555nAXq1dj6/FH\nEZFRaNeuXXz2s5+lsrKSwsJCfvWrX3HYYYcNd7F6peYpERkwNU+NTGqeEhGRYaXQEBGRmMU1NMxs\nqZltMLNNZnZ1L/uXm9nrZrbOzF4xsxNjPTds7954lV5ERHqKW5+GmSUA/wROAsqAl4Fz3X191DHp\n7t4Q+vxR4BF3nxfLuaFz/JVXnCOOiMuPICIxUp/GyDTa+jQWAZvdfZu7twEPAsujDwgHRkgG8H6s\n54Zt3Tro5RYRkT7EMzRmANuj1neEtnVjZmea2XrgSeAb/TkXFBoiEj8TJkxgy5YtAHzta1/jP//z\nP2M6tr/uv/9+Tj755P06d6jF8zmNmOqq7v4o8KiZHQfca2YH9+dLHnpoBeFJJYuLiykuLu5nMUVk\nLFu6dCnHHHPMB+aReuyxx7j00kspKytjwoQP//v5tttuG5TybNu2jaKiItrb2yPfe95553HeeecN\nyvV7U1JSQklJyaBcK56hUQZEz+9bSFBj6JW7v2hmiUBe6LiYzs3PX8GKFQMuq4iMUV/60pf47ne/\n+4HQuPfeezn//PNjCox4GMo+oJ5/UA9kIsZ43q21wHwzm2NmScDZwMroA8zsAAvN2mVmRwC4e2Us\n54apeUpE9mX58uVUVlby4osvRrbt3buXVatWcfrpp/Pxj3+c3Nxcpk+fzuWXX05bW1uv1/nSl77E\n9773vcj6f//3fzN9+nRmzpzJXXfd1e3YVatW8bGPfYzs7GxmzZrV7Zf08ccfD0BOTg5ZWVn84x//\n4O677+a4446LHPP3v/+do48+mpycHBYtWsTq1asj+4qLi/n+97/PscceS1ZWFieffDKVlZUDu0n9\nELfQcPd24DLgaeAd4CF3X29ml5jZJaHDPge8aWbrgFuAc/Z1bm/fs20bdHbG66cQkdEuNTWVs846\ni3vuuSey7eGHH2bBggVkZGRwyy23UFlZyerVq/nzn//Mrbfe2ut1omeqfeqpp7jpppt47rnn2Lhx\nI88991y3YzMyMrjvvvuoqalh1apV3HbbbTz22GMAkfCqqamhtraWxYsXdzu3qqqKZcuWceWVV1JV\nVcVVV13FsmXL2Bv1fMEDDzzA3XffzZ49e2htbeXGG28c+I2KUVznnnL3Jwk6uKO33R71+afAT2M9\ntzc5OVBeDjN67SYXkZHCrh/4+yH8uv1r0rnwwgs57bTT+OUvf0lSUhL33HMPF154IUdEj
defPXs2\nX/3qV3n++ee54oor9nm9hx9+mIsuuoiFCxcCQXPPgw8+GNl/wgknRD5/9KMf5ZxzzuH5559n+fLl\nH9ostWrVKg466KBIH8c555zDz3/+c1auXMmFF16ImfHlL3+ZefPmAXDWWWexcmWvDTFxMeonLJw7\nN2iiUmiIjGz7+wt/MCxZsoT8/HweeeQRjjrqKF5++WUeffRRNm7cyFVXXcUrr7xCY2Mj7e3tHHXU\nUR96vfLyco4++ujI+qxZs7rtX7NmDd/5znd4++23aW1tpaWlhbPOOiumsu7cufMD15s9ezY7d+6M\nrEe/OjY1NbXPV8zGw6ifRqSoCPZzlJuIjCMXXHAB99xzD/fddx9Lly6loKCAr33tayxcuJDNmzdT\nU1PDDTfcQGcM7d3Tpk3r9krWnq9n/cIXvsCZZ57Jjh07qK6u5tJLL41c98PeyDdjxgzee++9btve\ne+89ZoyQv4xHfWiEaxoiIvtywQUX8Oyzz3LHHXdw4YUXAsHrXTMzM0lLS2PDhg37HFYb/a7ws846\ni7vvvpv169fT2Nj4gdFI9fX15ObmkpSUxEsvvcTvf//7SFgUFBQwYcIE3n333V6/55RTTmHjxo08\n8MADtLe389BDD7FhwwZOO+20bmUZLqM+NIqKFBoi8uFmz57NkiVLaGxs5IwzzgDgxhtv5Pe//z1Z\nWVl89atf5ZxzzulWE+j5Oby+dOlSrrzySk488UQOPPBAPvnJT3Y79tZbb+X73/8+WVlZ/PCHP+Ts\ns8+O7EtLS+O73/0uS5YsIS8vjzVr1nS79qRJk3j88ce56aabyM/P58Ybb+Txxx8nLy/vQ8s1FEb9\n+zT+93+d666DF14Y7tKIjF+ae2pkGm1zTw0JNU+JiAydUV/TaGtz0tOhthaSk4e7RCLjk2oaI5Nq\nGr1o7qxn5kzoMdhARETiYNSHxvaa7WqiEhEZIqM/NGq361kNEZEhMvpDQzUNEZEhM+qnESmtKWVh\nEfzP/wx3SUTGt6F8VkCGz6gPje2121k2V81TIsNJI6fGj9HfPFWr5ikRkaEy+kOjZjv5+dDWBjU1\nw10aEZGxbfSHRu12wJk1C3pMNCkiIoNs1IfGxAkT2du8l9mz9YCfiEi8jfrQKMwupLSmVDUNEZEh\nMOpDY1b2LLbXbGfWLNU0RETibdSHRmFWIdtrtzN7tmoaIiLxNjZCI1TTUGiIiMTX6A+N7K6ahpqn\nRETia/SHRlbQET59OuzZEzyvISIi8RHX0DCzpWa2wcw2mdnVvew/z8xeN7M3zOxvZnZo1L5toe3r\nzOylvr5jVvYsttduJzERpk2DHTvi9dOIiEjc5p4yswTgF8BJQBnwspmtdPf1UYdtAY539xozWwr8\nGlgc2udAsbtX7et7ZmbNZGfdTjq9k1mzJlBaGrwCVkREBl88axqLgM3uvs3d24AHgeXRB7j7ancP\nT/6xBpjZ4xofOm1mcmIyOSk57K7frWG3IiJxFs/QmAFsj1rfEdrWl4uBJ6LWHXjOzNaa2b/u64s0\n7FZEZGjEc2r0mOdKNrN/AS4ClkRtXuLu5WZWADxrZhvc/cWe565YsYL6t+q58fUbmZn8b7z/fvGA\nCy4iMpaUlJRQUlIyKNeyeM2Db2aLgRXuvjS0fg3Q6e4/6XHcocAfgaXuvrmPa10H1Lv7TT22u7tz\nxZNXMCdnDgdX/zs33wxPPx2XH0lEZEwwM9x9v96aFc/mqbXAfDObY2ZJwNnAyugDzGwWQWCcHx0Y\nZpZmZpmhz+nAp4E3+/qi8LMaesBPRCS+4tY85e7tZnYZ8DSQANzp7uvN7JLQ/tuB7wO5wG2hV0W2\nufsiYCrwx9C2ROB+d3+mr+8qzCpkTdmaSGi4g948KSIy+OL6uld3fxJ4sse226M+fwX4Si/nbQEO\nj/V7CrODqUQyMyE5GSorIT9/AAUXEZFejfonwqFr9BSgYbciInE0JkJjWuY0KhoqaO1o1bBbEZE4\nGhOhkTghkSkZUyivK1dnuIhIHI2J0IBgOpGyujLNdisiEkdjKjR21O5QTUNEJI7GTmhkBqGhmoaI\nSPyMndBQTUNEJO7GXGhMmQI1NdDUNNwlEhEZe8ZcaEyYADNnwvbtH36OiIj0z5gLDUBNVCIicTJm\nQmNa5jR21e+io7NDneEiInEyZkIjKSGJSWmT2N2wmzlzYNu24S6RiMjYM2ZCA7qaqBQaIiLxMWZD\nY+vW4S6NiMjYM7ZCI/SA39y5qmmIiMTD2AqNUE1j+nSoqICWluEukYjI2DImQyMxEWbM0LMaIiKD\nbUyGBsDcuerXEBEZbGM2NDSCSkRk8I2p0JiRNYOyujI6vVOhISISB2MqNFISU8hKzuL9xvc17FZE\nJA7GVGhAVxOVht2KiAy+MRsaap4SERl8cQ0NM1tqZhvMbJOZXd3L/vPM7HUze8PM/mZmh8Z6bl/C\nD/hNmwZVVdDcPFg/jYiIxC00zCwB+AWwFFgInGtmC3octgU43t0PBX4I/Lof5/YqXNNISIDCQs12\nKyIymOJZ01gEbHb3be7eBjwILI8+wN1Xu3tNaHUNMDPWc/uiYbciIvETz9CYAUQ/k70jtK0vFwNP\n7Oe5EQoNEZH4SYzjtT3WA83sX4CLgCX9PXfFihWRz8XFxcz4yAw9FS4iEqWkpISSkpJBuVY8Q6MM\nKIxaLySoMXQT6vz+DbDU3ff251zoHhoAdS117KjdgbszZ46xcuV+l19EZEwoLi6muLg4sn799dfv\n97Xi2Ty1FphvZnPMLAk4G+j2K9zMZgF/BM539839ObcvmcmZJCUksbd5r5qnREQGWdxqGu7ebmaX\nAU8DCcCd7r7ezC4J7b8d+D6QC9xmZgBt7r6or3Nj/e6uB/zy1DwlIjKIzD3m7oMRx8y8t/IvvW8p\n3zjmGyw94FTS06GyEtLShqGAIiIjkJnh7rY/5465J8IBCrMK2V6znQkTYNYsPashIjJYxmRozM2d\ny9bqoF1K/RoiIoPnQ0PDzDJCT2hjZgeZ2RlmNjH+Rdt/RblFbNm7BdCwWxGRwRRLTeMFINnMZhB0\nTH8RuDuehRqo6NBQTUNEZPDEEhrm7o3AZ4Fb3f3zwEfiW6yBmZszV6EhIhIHMfVpmNnHgfOAVf05\nb7jkp+XT1tnG3qa9zJsHGzcOd4lERMaGWH75XwlcAzzi7m+b2QHAX+JbrIExM4pyi9havZWDDw5C\no6NjuEslIjL6fWhouPvz7n6Gu//EzCYAFe7+jSEo24AU5Raxde9WMjJg8mQ1UYmIDIZYRk89YGZZ\nZpYOvAWsN7Nvx79oA1OU09UZvnAhvPPOMBdIRGQMiKV5aqG71wJnAk8CcwhGUI1o0SOoFBoiIoMj\nltBIDD2XcSbwp9BLkUb83CNFuUVsqVZoiIgMplhC43ZgG5ABvGBmc4CafRw/IqimISIy+Po9YaEF\n09Emhmocw6qvCQsBmtubyf5xNo3XNlJfl8CM
GVBbCxNG9GBhEZH4i+uEhWaWY2Y/M7NXzOwV4EZg\nxM8Zm5KYQkFaATtqd5CdDTk5UFo63KUSERndYvm7+y6gFvg8cBZQB/w2noUaLGqiEhEZXLGExgHu\nfp27b3H3d919BXBAnMs1KMIP+IFCQ0RkMMQSGk1mdlx4xcyOBRrjV6TBE13TOOQQhYaIyEDFEhqX\nAr80s/fM7D3gF6FtI56ap0REBteHviPc3V8DDjWz7NB6jZldCbwe78INVHRoLFgQhIY72H6NGRAR\nkZgHoLp7jbuHn8/4ZpzKM6iiQyMvD9LToaxsmAslIjKKjemnFqakT6G+tZ66ljpATVQiIgM1pkPD\nzLq9L1yhISIyMH2GhpnVm1ldbwswfQjLOCA9O8PffnuYCyQiMor1GRrunuHumX0sCbFc3MyWmtkG\nM9tkZlf3sv9gM1ttZs1m9s0e+7aZ2Rtmts7MXur/jxbQFOkiIoPnQ0dP7S8zSyAYnnsSUAa8bGYr\n3X191GGVwOUEM+j25ECxu1cNpBxFuUVsrtoMdIWGRlCJiOyfePZpLAI2u/u20OSGDwLLow9w9wp3\nXwv0NfnhgH+1F+UWsXlvEBoFBZCUBDt2DPSqIiLjUzxDYwawPWp9R2hbrBx4zszWmtm/7m8hDpl8\nCG/ufjOyvmQJ/PWv+3s1EZHxLW7NUwz8RU1L3L3czAqAZ81sg7u/2POgFStWRD4XFxdTXFzcbf/c\nnLnUt9ZT0VBBQXoBxx8PL7wA5547wNKJiIwSJSUllJSUDMq1+v0+jZgvbLYYWOHuS0Pr1wCd7v6T\nXo69Dqh395v6uFav+/f1Po1oxXcXc+1x1/LpAz7Nq6/C+eerQ1xExq+4vk9jANYC881sjpklAWcD\nK/s4tlvhzSzNzDJDn9OBTwNv9nZiLI6YdgTrytcBcNhhwVPhe/bs79VERMavuIWGu7cDlwFPA+8A\nD7n7ejO7xMwuATCzqWa2Hfh34D/MrNTMMoCpwItm9hqwBnjc3Z/Z37J8bOrHWLcrCI2EBPVriIjs\nr7g1Tw2FWJun3trzFp996LNsvHwjAD/5CZSXw803x7uEIiIjz0htnhoxDs4/mLK6ssgcVOHOcBER\n6Z9xERqJExI5pOAQXt8dzOZ+5JGwaRPU1HzIiSIi0s24CA3o3hmelASLFqlfQ0Skv8ZNaER3hoOa\nqERE9sf4CY1pH+PV8lcj6yecoNAQEemvcRMaH538Uf5Z+U9a2lsAOOYYePNNaGgY5oKJiIwi4yY0\nUiemMi9vHm9XBC/USE2Fww+Hf/xjmAsmIjKKjJvQgFC/RnlXv8YnPwlPPDGMBRIRGWXGX2hEdYaf\ney488AB0dAxjoURERpHxFRrTuofGwQfD9Onwl78MY6FEREaRcRUah089nNd3vU5HZ1fV4vzz4b77\nhrFQIiKjyLgKjZyUHAqzC3lt12uRbeecA489Bo2Nw1gwEZFRYlyFBsAp807hiU1dvd9TpwZPh//p\nT8NYKBGRUWLchcay+ctYtWlVt21qohIRic24mBo9WmtHKwX/XcDmyzdTkF4AQH09zJwZTGJYUBCP\nkoqIjByaGr0fkhKS+OTcT/LU5qci2zIy4NRT4eGHh7FgIiKjwLgLDei9ieqLX4Q774RRXPESEYm7\ncRkap8w/hWfefYb2zvbItpNPhs7OYCSViIj0blyGxvTM6czJmcPq7asj2yZMgBtugP/4Dz0hLiLS\nl3EZGtB7E9Wpp0J2djC1iIiIfNC4DY1T55/a7XkNADP40Y/guuugtXWYCiYiMoKN29BYNGMR5fXl\nlNaUdtt+wgkwbx7cddcwFUxEZAQbt6GRMCGBpfOW8tiGD/Z833AD/PCHekGTiEhPcQ0NM1tqZhvM\nbJOZXd3L/oPNbLWZNZvZN/tz7mC46PCLuHXtrXR6Z7ftRx0FJ54I114bj28VERm94hYaZpYA/AJY\nCiwEzjWzBT0OqwQuB27cj3MHrHhOMUkJSTz77rMf2Pfzn8Mf/qBp00VEosWzprEI2Ozu29y9DXgQ\nWB59gLtXuPtaoK2/5w4GM+PKY67k5jU3f2Bfbi78+tdw0UVQVzfY3ywiMjrFMzRmANuj1neEtsX7\n3H4596Pn8mr5q6yvWP+BfaeeGjRT/d//G49vFhEZfRLjeO2BTMgR87krVqyIfC4uLqa4uLhfX5SS\nmMKlR17KLWtu4Ven/eoD+//f/4NDD4VVq2DZsn5dWkRkRCgpKaGkpGRQrhW3WW7NbDGwwt2Xhtav\nATrd/Se9HHsdUO/uN/Xn3P2Z5bY3u+p3seCXC9h8+WYmpU36wP6//hU+8xl48smgk1xEZDQbqbPc\nrgXmm9kcM0sCzgZW9nFsz8L359wBm5oxlTMOOoPfvPqbXvcfeyz85jdw+umwcWO8SiEiMvLF9X0a\nZnYKcDOQANzp7v9lZpcAuPvtZjYVeBnIAjqBOmChu9f3dm4v1x+UmgbA67te5+T7Tmb919eTm5rb\n6zF33hk8v/G3v8GMuPSwiIjE30BqGuPuJUz7cunjl5JgCfxy2S/7PObHP4Z77oGnnoJZswbtq0VE\nhsxIbZ4adX70yR/xh/V/YO3OtX0ec/XV8JWvwCc+AevWDWHhRERGAIVGlLzUPH580o/5t1X/Rkdn\n7/Ojm8FVV8EttwTv4HjyySEupIjIMFJo9HDBYReQlJDEHa/esc/jPve54IVNF10EN9+sN/6JyPig\nPo1evLH7DU665yTWXbKOGVn77vHeti0IkPnz4Y47gveNi4iMZOrTGGSHTjmUqz5+FcsfXE5jW+M+\nj50zJ3iOIy0NjjkG3npraMooIjIcVNPog7tzwaMX0NrRyoOfexCzfYeye/AOjmuuCaYfWbEiCBQR\nkZFGNY04MDN+c/pvKK0p5QfP/yCG4+Hii2HTJpg9G448Ei6/HHbtGoLCiogMEYXGPqQkpvDI2Y9w\n12t3ce/r98Z0TnY2XH89bNgAEyfCIYfAd78L1dVxLqyIyBBQaHyIqRlTeeILT3DNn6/htpdvi/m8\ngoJgssN164LaxoEHwk03QUtLHAsrIhJn6tOI0Za9W/jUvZ/iosMv4trjrv3QPo6e1q+Hb38b3nkn\neKr8//yfoElLRGSoaRqRIVJeV86n7/s0J845kRs/fSMTEyb2+xp//nPwfo7mZjjzTDjjjGDU1QTV\n+URkiCg0hlBVUxXn//F89jbv5fef/T1zc+f2+xqdnfDyy7ByZbBUVsKllwbL5MlxKLSISBSNnhpC\neal5PP6Fx/n8ws9zzB3H8NBbD/X7GhMmBLWLG26AN9+EZ56BHTvgoIPgy18OnjTfsycOhRcRGSDV\nNAbglZ2vcO4fzuWQyYfws5N/xpycOQO63vvvw29/GzRhrVkDeXlwwglBM9anPgWpqYNTbhEZ39Q8\nNYya25u58e838rN//IwrjrmCb33iW6ROHPhv987OYNjuc8/BI4/Aq6/CSSfB0qXBRImall1E9pdC\nYwR
4r/o9vvnMN1m9YzXf/Pg3ueTIS0hPSh+067//PjzxBDz9NDz7LEyaBMXFcNxxwVJYOGhfJSJj\nnEJjBFlXvo4f/fVHvPDeC1x29GV89civMiVjyqB+R2cnvPYaPP98MO/Viy9CSgosWRK852PxYjjg\nAMjN1bBeEfkghcYItL5iPTetvok/rP8DS+ct5WtHfY3jZh3X7+c7YuEeTF/y978Hy5o18N570NYW\n1EAOOwyOPz5YDjlEw3tFxjuFxghW3VzN7177Hb965Vd0dHZw4WEX8sXDvsis7Ph3StTVQWkprF0L\nL7wQLBUVcMQRwdxYRxwBCxcG07qnpcW9OCIyQig0RgF3Z03ZGn732u94+J2H+cjkj7D8oOWccdAZ\nzMubN2TlqKiAV14JlldfDTrbt2yBKVOgqCiYbHHWrODfoqKgmWv6dEhIGLIiikicKTRGmeb2Zv68\n5c+s/OdKVm5cSU5KDsvmL2PZ/GUcO+vY/XrSfCDa24OXSW3dGtRMSkuD9S1b4N13g8kWjz8eTj8d\nli3TlO8io51CYxTr9E7W7lzLE5ueYNWmVWyu2sySwiUcO+tYjpt1HEdOP5KUxJRhLWNNTTBi6/HH\ngxFc7e3B6K28PJg2DRYsCJq5FiyA/HzIzAyWpKRhLbaI9EGhMYbsrt/NX0v/youlL/LX0r/yTsU7\nHDL5EBZNX8TRM47miGlHsCB/wZDXRsI6O2Hv3mDqk8pK2LkzmIzxnXeCf6uqgr6U2togVD7ykWBZ\nsCBo6po7N2j6UqCIDJ8RGxpmthS4GUgA7nD3n/RyzM+BU4BG4Evuvi60fRtQC3QAbe6+qJdzx1xo\n9NTY1si68nW8VPYSL+98mXW71vFe9XssKFjAkdOO5OjpR3PU9KP4yOSPDFuQ9MYdysuD19++9VYQ\nKlu3BktZWVATKSgIlvz8IGDy8oLPc+cGAVNUFLyfRMOGRQbXiAwNM0sA/gmcBJQBLwPnuvv6qGNO\nBS5z91MwciOpAAAQ+0lEQVTN7BjgFndfHNq3FTjS3av28R1jPjR609DawBu73+CV8ldYu3MtL+98\nmXer3mVu7lwW5C9gYcFCFuQvYEHBAg7OP5i0iSNraFRHR1AjqagI5tiqrAzWq6qC9S1buvpTOjuD\nYJk8GWbMCN5LcuCBwYivuXODbeqkF+mfkRoaHweuc/elofXvALj7j6OO+RXwF3d/KLS+ATjB3XeH\nQuMod6/cx3eMy9DoTXN7MxsrN7K+Yj3r3w8tFevZVLWJgrQC5k+az/y8+czLm8ecnDmRZVLqpLg8\nOzJYGhqCIKmoCDroN22CjRuDZevWIGgKC4NaSloapKcHtZPJk4MRYZMnd6/RzJwZPAgpMp4NJDQS\nB7swUWYA26PWdwDHxHDMDGA34MBzZtYB3O7uv4ljWUe9lMQUDp1yKIdOObTb9vbOdkprStlUuYlN\nVZvYXLWZv23/G+9Vv8eWvVswM+bnzefASQcyN2cus7JnRZbC7EIykjKG6ScKpKcHNYq5c2HRBxoo\noakpeJCxuhoaG4OQqa4OQmb37mBI8fvvB+sVFUEfTH5+0PQ1fXoQNOGwKSzs+q7CwqAJTUS6i2do\nxFoF6CvtjnX3nWZWADxrZhvc/cWeB61YsSLyubi4mOLi4v6Wc0xLnJBIUW4RRblFnMzJ3fa5O5VN\nlWys3Mimyk1sq97GP3b8g4ffeZjSmlK212wnJTGFWdmzmJk1M7LMyJzBjKwZzMicwcysmWQlZw1b\nbSU1FQ4+OPbjOzqCPpUtW4IAaWoKwqauDt5+OxghtnVrMFV9QkLQ/DV5cvA9KSld/yYnB/9OmxZ8\n/8EHB0E0ceR0K4lElJSUUFJSMijXimfz1GJgRVTz1DVAZ3RneKh5qsTdHwytR5qnelzrOqDe3W/q\nsV3NU3EUDpXSmlLKasvYXrudHbU7KKsro6y2jLK6MnbU7gCgMKuQmVkzKUgvID81n/y0fCanT2Za\n5jSmZUxjasZUCtILhn34cKzcg6HGZWVBDaW5OViamoL3vIc/l5UFtZn164MaT3p6V6f+lCkwdWqw\npKcH1wRITAyGLOfnB0u4zyYnR1O8yNAYqX0aiQQd4Z8EdgIvse+O8MXAze6+2MzSgAR3rzOzdOAZ\n4Hp3f6bHdyg0RoCa5ppIoFQ0VFDZVElFQwV7GvZQXl9OeX05u+p3UdFQQXJiMgVpBUzLnMaMzBlM\nz5zO9MzpTE6fzJT0KUzJmEJBWsGoCpiwzs5gqHFVVdAktmcP7NoVjCJrbg6OMYPW1q5jwk1ne/ZA\nfX0wyWRmJmRlBUtubtcSDpiCgqDfJikpqPGkpgbbJk3SoACJzYgMDQAzO4WuIbd3uvt/mdklAO5+\ne+iYXwBLgQbgy+7+qpkVAX8MXSYRuN/d/6uX6ys0RhF3p661LgiTuvJIjaW8vpw9DXvY3bCb3fW7\nqWis4P3G90lKSCIvNY+clBxyU3LJTc0NAiUUKlPSpzA5fTKT0yczKW0SOSk5pCamjuiO/X1paQn6\nY+rqgqWmJlgPjyyL7puprQ0mpGxtDZrXKiqC52dyc7s6/SdNCmov6elBv014mPPUqUEtKCOjq6kt\nLS0IKTWvjQ8jNjTiTaExdrk7tS217G3eS3VzNdXN1VQ2VkYCZU/Dnsiyu2E3VU1VVDdX0+md5Kbk\nkpOSE1ny0/IjSziEopfw8YPx8qzh1N7eVXsJL+EBAuF+mz17ggECu3cHgwaam4OwamgIgigpKQie\noqJgWPO8ed0HBHR2Bv1C7e1Bc1u4NpSTEwwsmD07qAXJyKbQEAlpbm9mb9NealpqqG6upqqpKhI2\nFQ0VQQC1VEeCKLzsbdoLBO+Az03N7Qqa1HxyU3PJTMokMzmTzKRMclNzyUvN6xZA6RPTR20NJ8w9\nCJfKymCgwKZNsHlzsC3MLOiTCTeD1dUFNZy9e4OBBe+9F+ybOjUIoIkTg5pMdJNbTk7XkpUV1HbC\nS3gkW7hvaNIk9fPEg0JDZBA0tTUFIdNUSVVTFe83vk9lY/C5vrWeutY6altqg1pPUyWVjZVUN1dT\n01JDS3sL2SnZZCZlkpGUEQmY8L9ZyVlkJ2eTk5JDdkp2pIaTnZJNVnJWcE5SJulJ6Uyw0ftb0j1o\nStu9O2g+a2sLajL19UFzW3iprg6W2truAwvCw6YbGoLr1NYGgwQmTw5qMNnZQdAkJgYBZhb06YQH\nH2Rnd4VaYmKwHm6qy8wMtiUmBoGWGM+xoyOcQkNkmLV2tFLTXENDW0MQMC111LXWUdcSBE1tS22k\n9hMOmr1NQdNbXWtd5Jzm9maykrMi4ZKamErqxFRSE1PJTM4kKymLrOSuJTM5M/I5Ozmb7JRsMpIy\nSJ+YTnpSOkkJo3uSr9bWIID27OkKnNraoInMPViamoKAqawM9re3dzWhVVcH299/Pwiu8PbW1qCm\nM21aUCvKyAhqROHaUUJCsEycGIRO+CHR1NRg+4QJwfFZWUEw
ZWZ2DcceDYMRFBoiY0R7Z3ukNlPd\nXE1TWxNN7U00tTVR31rfLYDqWuqoba2lprmm2/b61noaWhtoaGsAiARIuDYTrgmFwyUjKeMDS3Qw\npU1MIzUxlbSJaZFlJM1ztj86O4MgKS8PRrg1NARB0tIS1I46OoJjWlqC0AnPStDcHOzr6Ag+hyfn\nrK0Nwqu5OajBZGR0NceFm+TC4dLZGXxXa2sQSuEmuYyM7k13KSldI+TCAxmysoJ/U1IGNiebQkNE\netXa0UpDa1D7CS/hGlC4VtTr/tauGlJjWyONbY00tTVFPpsZaRPTSElMiSzpE9M/0CwXHUzpSemk\nT0wnbWIayYnJJCckkzoxtVuQhQMuccLobDty7z6wIDwKLtwkV1sb1ETCNZqOjuDY8ECFcLNddXVw\nnfAS3h++ZltbV9BEh1JSUlcNLDGxazqdKVPg/PODQQswcqcREZFhlpSQRFJqErmpuYN63baONhrb\nGmlub6a5vZmm9iYaWhsiYRMdSvWt9VQ1VbG9djv1rfWRc5rbm2lsa+wWVuEaUuKExG41nNSJqSQn\nJJOSmEJyYnKkxpOamPqBUIoOp3Cghc+Jrk2lJqYOeo3JrKtTf9KkQb10N+3tQdiEgyQcTK2tXX09\nbW1do+U2bAjWB4NqGiIyorg7LR0t3Wo3Te1NtLS30NLREoRUaHtDW0Pwb1RtqbGtkcb2YFtze3Pk\nnOiACh83wSaQmphKUkISiRMSmZgwkZTElEiwpE9Mj/QppSamkpKYQurE1EgYRfc5hWtP4WOim/NS\nElMi+1ISU0ickDiso+3UPCUi0k/uTltnG01tTbR2tNLe2R5Zj64lhUMqHFxN7U3dtkX2hcKppb0r\npMJLdNg1tTcBwSSj4VAJ14ySEpJITkwOaogJSUycMDGyLSUhqDGFwyocTskJyZHACu/vuS05IZkD\n8g6IDIxQaIiIjCLtne2RAGlobYiES2tHK60drbR0tNDW0RZZb+1o7dYUGPncFoRVS0dLpCYWDq2e\n25447wmKcosAhcZwF0NEZFQZSGiM3qeIRERkyCk0REQkZgoNERGJmUJDRERiptAQEZGYKTRERCRm\nCg0REYmZQkNERGKm0BARkZgpNEREJGYKDRERiZlCQ0REYhbX0DCzpWa2wcw2mdnVfRzz89D+183s\nY/05V0REhlbcQsPMEoBfAEuBhcC5ZragxzGnAvPcfT7wVeC2WM+V7kpKSoa7CCOG7kUX3YsuuheD\nI541jUXAZnff5u5twIPA8h7HnAH8DsDd1wA5ZjY1xnMliv6D6KJ70UX3oovuxeCIZ2jMALZHre8I\nbYvlmOkxnCsiIkMsnqER69uRhu9FuSIi0i9xe3OfmS0GVrj70tD6NUCnu/8k6phfASXu/mBofQNw\nAjD3w84Nbddr+0RE9sP+vrkvcbALEmUtMN/M5gA7gbOBc3scsxK4DHgwFDLV7r7bzCpjOHe/f2gR\nEdk/cQsNd283s8uAp4EE4E53X29ml4T23+7uT5jZqWa2GWgAvryvc+NVVhERiU3cmqdERGTsGbVP\nhI/nh//MrNDM/mJmb5vZW2b2jdD2PDN71sw2mtkzZpYz3GUdCmaWYGbrzOxPofXxeh9yzOx/zGy9\nmb1jZseM43txTei/jzfN7Pdmljxe7oWZ3WVmu83szahtff7soXu1KfT79NMfdv1RGRp6+I824N/d\n/RBgMfD10M//HeBZdz8Q+HNofTy4AniHrhF74/U+3AI84e4LgEOBDYzDexHqC/1X4Ah3/yhBE/c5\njJ978VuC343Rev3ZzWwhQZ/xwtA5t5rZPnNhVIYG4/zhP3ff5e6vhT7XA+sJnmOJPCwZ+vfM4Snh\n0DGzmcCpwB10Dd8ej/chGzjO3e+CoF/Q3WsYh/cCqCX4wyrNzBKBNIIBNePiXrj7i8DeHpv7+tmX\nAw+4e5u7bwM2E/x+7dNoDY1YHhwcF0J/VX0MWANMcffdoV27gSnDVKyh9DPgW0Bn1LbxeB/mAhVm\n9lsze9XMfmNm6YzDe+HuVcBNQClBWFS7+7OMw3sRpa+ffTrB78+wD/1dOlpDQ733gJllAH8ArnD3\nuuh9HoxwGNP3ycxOA/a4+zr6eEh0PNyHkETgCOBWdz+CYDRit+aX8XIvzOwA4EpgDsEvxQwzOz/6\nmPFyL3oTw8++z/syWkOjDCiMWi+ke1qOeWY2kSAw7nX3R0Obd4fm7sLMpgF7hqt8Q+QTwBlmthV4\nADjRzO5l/N0HCP7/v8PdXw6t/w9BiOwah/fiKODv7l7p7u3AH4GPMz7vRVhf/030/F06M7StT6M1\nNCIPDppZEkFHzsphLtOQMTMD7gTecfebo3atBC4Mfb4QeLTnuWOJu1/r7oXuPpego/N/3f2LjLP7\nAEE/F7DdzA4MbToJeBv4E+PsXhAMAFhsZqmh/1ZOIhgoMR7vRVhf/02sBM4xsyQzmwvMB17a14VG\n7XMaZnYKcDNdD//91zAXaciY2bHAC8AbdFUlryH4H/thYBawDTjL3auHo4xDzcxOAL7p7meYWR7j\n8D6Y2WEEAwKSgHcJHpZNYHzei28T/HLsBF4FvgJkMg7uhZk9QDAdUz5B/8X3gcfo42c3s2uBi4B2\ngqbup/d5/dEaGiIiMvRGa/OUiIgMA4WGiIjETKEhIiIxU2iIiEjMFBoiIhIzhYaIiMRMoSHSCzPr\nCE23Hl6+PYjXnhM9bbXIaBLP172KjGaN7v6x4S6EyEijmoZIP5jZNjP7iZm9YWZrQpPjhWsP/2tm\nr5vZc2ZWGNo+xcweMbPXQsvi0KUSzOzXoZdoPW1mKaHjvxF6edDroSd7RUYUhYZI71J7NE99PrTd\nCabaPpTgRWDhub/+P+C37n4YcD/w89D2nwN/cffDCSYQfCe0fT7wC3f/CFANfC60/Wrg8NB1Lonj\nzyeyXzSNiEgvzKzO3TN72b4V+Bd33xaaabjc3fPNrAKY6u4doe073b3AzPYAM0IvCwtfYw7wTOgt\nauF5kia6+w1m9iRQTzCh3KPu3hDvn1WkP1TTEBmY6L+6en2nRx/bW6I+d9DVv7gM+CVBreTl0KuN\nRUYMhYZI/50d9e/fQ5//TjA9O8B5BLMQQ/A+5q9B8G57M8vq66KhabxnuXsJwQuUsoH0QS25yABp\n9JRI71LNbF3U+pPufm3oc66ZvQ40A+eGtl0O/NbMvkXwgpsvh7ZfAfzazC4mqFFcSjBddc92YSeY\nxvze0Pu+DbjF3WsH+ecSGRD1aYj0Q6hP48jQe6hFxh01T4n0j/7KknFNNQ0REYmZahoiIhIzhYaI\niMRMoSEiIjFTaIiISMwUGiIiEjOFhoiIxOz/B2bG47K6boPeAAAAAElFTkSuQmCC\n", 140 | "text/plain": [ 141 | "" 142 | ] 143 | }, 144 | "metadata": {}, 145 | "output_type": 
"display_data" 146 | }, 147 | { 148 | "data": { 149 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYcAAAEPCAYAAACp/QjLAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xl8VdW9///XJ/MMAcIUEETRSgcckWqrqdoa64Adrji0\nauutw7fW2n6/1drWitfbX6+t3nvbOg/VOg8dlDpW28baloIo4gQqIDJDCCGQOSfn8/tjnZyEDBAg\n5yQneT8fj/3I2fuss846W1mfvdbaay9zd0RERDpK6+8CiIjIwKPgICIiXSg4iIhIFwoOIiLShYKD\niIh0oeAgIiJdJDQ4mNmvzWyjmb25kzS/NLP3zWyxmR2SyPKIiEjvJLrlcA9Q3tObZvZ5YH93nwpc\nCNya4PKIiEgvJDQ4uPvLQPVOkpwG/CaWdj4w3MzGJLJMIiKya/095lAKrO6wvwaY0E9lERGRmP4O\nDgDWaV/P8xAR6WcZ/fz9a4GJHfYnxI7twMwUMERE9oC7d74A75X+bjnMBc4FMLOZwFZ339hdQnfX\n5s4111zT72UYKJvOhc6FzsXOt72R0JaDmT0MHAuMMrPVwDVAJoC73+7uz5jZ581sGVAHfC2R5RER\nkd5JaHBw97N6kebSRJZBRER2X393K8luKisr6+8iDBg6F+10LtrpXPQN29t+qWQwM0+FcoqIDCRm\nhqfogLSIiAxACg4iItKFgoOIiHSh4CAiIl0oOIiISBcKDiIi0oWCg4hICunu0RjuTlOkiZrGmj77\nnv5+8J6IyIDW0tpCdWM11Q3VNEQayMnIIScjh+z0bFqiLTS0NNAYaaQl2kLUo0Q9SktrC9ubt7O9\naTu1zbU0tzYTiUZoibbQGm2Np2v1ViLRSHgv9pmaphpqGmuob6mnubWZ5tZmGiIN1DTWUNNUw/am\n7QBkZ2STnZ5N1KPUt9STZmlMKZ7C0kuX9snv1iQ4Eek37k5DpIHtTdvZ3ryd5tbmHtPVt9THK9zq\nxmo21G5gY+1GtjRuIc3SyLAMMtIySE9LJ83SSLM0oh4NlXdrY7wSb4iEv5FoJF5JR6IRmiJNNLc2\n09TaFK+UmyJNRKIRhucMpzi3mLzMPJoiTTREGmiKNJGVnhUPFlnpWfHvzUzPpCCrgMKsQgqyCshK\nzyIzLZOMtFDGtnRplhY/1vaZYdnDGJYzjPzMfLIzsuPfMSx7GEXZRRRlF+F4vIwA+Zn5eGsm1dUw\npsNyaXszCU7BQWSQcvf41Wrb1Wltc238CrTj3+3N2zGMzPRQgTW3NlPbXMv2pu00RBriFZlhNEQa\nqG+pp66ljsZIY7wSbWptCpVvhyvptu/tWBF33nIycijMKqQwu5Ds9Owef09eZl6ocLMLKc4pZkz+\nGMYWjGVk3sgdfmvHvA0jNzOX3IxccjJyyM3MjVfmmWmZ8d+VnpZOdnqoiLPSs+JX5ZnpmeRm5GK2\n6/rVHVpbIRKBaBQyMiA9HcxgwwZYtQpWr4aampA2Gg2fy8qC7GzIzITt26G6GrZsCVvb65qakG9b\n/unpIf+MDKivD/nX1MBBB8Hixe1lUnAQSSHuzvbm7Wyq20RlXSUNkYZ4pZSRlkFTpCl+hVtZV8m6\n7etYX7ueyvrK+BV2XXPdDpVvS2tLvHJsbm2mMdJIY6QRw+L5ZqRlkJ+VH78ybftblFVEYXYhELpQ\nWqItZKVnxSvs3IxcHI9XuLkZueRl5pGflU9uRm687FnpWTtUxB2/Nz0tnXQLV/RmtsPrNEvs0GdL\nC6xdC+vWQV0dNDSErbk5bC0tYb9jhbxtW6ioa2tD5dvYGNK4w/DhYSsshK1bobISNm8O6dLTw5aW\n1l6Ru4er+X32gYkTw2fT0sLmHsrQ1BT+FhbCiBFhKy6GkSPD66KiEDza8o9GQ96RCOTkwLhxIW16\n+o6/XcFBpA+5Oy3RlnB13FxHbXMtm+o2sW77OtZtX8emuk1srt9MVUMV25q2xSu4NEujNdreh1zf\nUh+/Mq9trqXVQ19za7SV/Kx8SvJKKMkvIS8zj5bWFppaQxdGdnp2/Aq3JK+EcQXjGF84npL8kniF\nnZ+ZH69809PSyUzLjF/1Z6Zlxj+fkTYwhhXdQ6W7Zg1UVUFBAQwbFv5WVsLKlfDhh+F1bW2oxOvr\nw+cgVIZVVaGCX7cupMnJCVfc2dk7VsppsVhjFtJVVobKubQ0fF9ubtjartYzM8N+W6U8fHiojAsL\nw5aXF97PyQl51tSE37J9e/gNJSVhy88P73f323vR8EgIBQeRDtydupY6qhuq2dKwhaqGKirrKqms\nr2RLwxYaWhriXSMbajewZtsa1mxbw7ambUSiEVq9NVxlZ+bHr5DH5I9hfOF4xhWMY0zBGEbmjmRk\n3kiGZQ8DiF9Vp6elx6+W2/qJh+UMozCrcIe+5vS09F38iuSJRHa8aq6uDsfbKsesrFDJ1taGK+rl\ny+G99+Ddd8N+ZmZIk54ersLbrsjbtqamkGdmJkyYEK5w6+rar85HjYJJk2DyZBg9OlTgbZV4WodG\nxahRMH582AoKQr6NjeFva2sIIK2toTJuqy5yc0P6jIERI5NOwUFSUlOkiaqGKqobquN90q3eSk1j\nDZX1lVTWVbK9eXs8fWu0lZqmGqobqqlurI53r7T1f9c118X/ZqVnUZxbTHFOMaPyRlGSX0JJXgkj\nckeQl5lHTkYOeZl5jC0Yy4SiCZQWllKcWxyuxC29V33MydLaCps2hQq2pSVskUj7+22VYVs/9rZt\n8P77oQJfuTKkj0bD+y0toUJtbAwVdFVV+Dt8eHtXRnFxuNJt61ZpamqvsAsLYd994cADwzZ8+I5l\nysoKW2ZmuDJv608fNix8VpJLwUH6jbuzpWELG+s2sqF2Q/wOkg21G6isr4xX1vUt9WGAM3a3ydbG\nrTS3NjMybyTFOcVkpmfGBzyH5QwLXS55JRRlF8Ur6rb3inOKKc4tpii7KFzZZ+aTn5W/w5V+VnpW\nP5+ZUEmvWRO2zZtD5Th8eKgo2/qN2yrz1avDgOXatSHt5s2h4t6wIXSLFBeHyrut4s3I2LGrwixc\nZZuF79l/f5g6NVTkbd0hZu2VdU5O6C4ZOTKUJ00zngYlBQfZKy2tLWys28jG2o3xyntb0zZWb1vN\niuoVrKheQWV95Q6Dn3UtocKvb6mnKLuIsQVjGZM/hjEFYxibP5axBWMpyS+hIKsgXnm33dpXmF1I\nUXYRhVmFA+oKvY176O9u2xoaduyuqKoKV+bLloVKvaGh/Wq8ujq8v3lzCAATJoRt1KhwFV5TEwYx\no9H2yrygIAxW7rNPSFtSEirtkSNDX/mYMSEgiOwuBQfpUdSjrKpZxavrXmXhuoUs3riYrY1baYg0\n0NDSQHVj6JcfnT+a0fmj45V2YXYhE4smsl/xfkwpnsKovFFkpmfG79V
uu1LPzcwdMIOe3Wn736Yt\nBrW2wquvwvPPw0svha6T1taw1daGyn3r1nCFnZ/fPnjZdkuiWbiKnzo1XJ1PmhSuwNuuxtvuMBk1\nKhwX6U8DNjiYWTnwv0A6cJe7X9/p/WLg18AUoBH4uru/3U0+Cg7dqGmsYdmWZSzbsowV1SvY2rg1\nXPk3b2dj7UZWbl3JqppVjMwbyWHjDuOwcYdxyLhDGJU3KtzznZHL8JzhjM4fPaAGSLsTiYSr9I0b\n27tdIFTg+fmhz3vZsnBFv3x56KNvu8XQvb3PvK4uDFCeeCIcf3z77X/p6aEyLy4OXT9Z/d8rJbLX\nBmRwMLN04F3gBGAt8Apwlrsv6ZDm58A2d7/OzA4Ebnb3E7rJa8gFh7Z74dduW8sbG99g0YZFLNqw\niNU1q+NT+TPSMth/xP7sP2J/phRPYUTuiPhV/5j8MUwaPol9hu1DXubAvIRtbAwV+fLloSJv646p\nr28f5KyuDnfFrFgR7mQZOzZclY8aFa7i6+rClp7e3s++334hXVv3TFpaSFNbGyr9jjNIRQazvQkO\niewPmAEsc/eVAGb2CDALWNIhzUHAfwG4+7tmNtnMSty9MoHlGhDcPf4IgA21G1i2ZRlvb3qbdza/\nw7Ity9hQu4F0S2dc4Tg+PvrjHDL2EL4141tMHj45PiDb25mbyRKNhtsh2yrirVvDHTNLl4a/W7e2\nT/apqoL160O3zH77hQp71KhQmU+c2H575LBhcMABoeLPzd3zsmVlhVaBiPROIoNDKbC6w/4a4MhO\naRYDXwT+bmYzgEnABGDQBAd3p6aphnXb1/Hu5neZt2Ye89bM49V1r5KdkR0fxN2veD8+WvJRyvcv\nZ+rIqYwtGEtBVkF/F5+mplDhr1oF77wTttWrQyXedhX/wQfwyivw2mvhCr6gIHT1tFXsH/kInHNO\n+EzbrY7FxeG+9qF6/7nIQJfIf5q96Qf6L+AXZrYIeBNYBLR2l3DOnDnx12VlZZSVle19CRMgEo3w\nrzX/4un3nua55c/x7uZ3yUjLYHzheKYUT2HmhJlcc+w1zCidQVF2UX8XdweRCPz97/Dkk/DMMyEI\nRCLhFsrSUpg2LWwHHxy6ezZsgNdfD3fZfP/7cPjhIQCISP+oqKigoqKiT/JK5JjDTGCOu5fH9q8C\nop0HpTt95gPg4+5e2+n4gBxzaGlt4e3Kt1m0fhGLNy5m8cbFvL7hdSYPn8zJU0/mpP1PYvrY6f3e\nAti4ERYuDHfptD2Uq22AtqkpVPIbNoR+/X33hVmz4NRTQ/99T48EEJGBb6AOSGcQBqSPB9YBC+g6\nID0MaHD3ZjP7BnC0u5/fTV4DIji4O/9a8y8ef+dx/rH6H7y16S0mDZvEoeMOZfqY6UwfO52Dxx7M\n6PzR/VbGbdvgT3+CefPgzTfhjTdCH/9hh4Ur+4MPDl05tbXhNs7s7NA9NGZM6P/XYK3I4DEggwOA\nmZ1E+62sd7v7T83sIgB3v93MPgncS+iCegu4wN27LGXU38Hh7U1v85vFv+HRtx8lPzOf2R+dzfFT\njufgsQcnvVWwbVtoASxYEPr68/PDjNiMjHDf/vz5cPTRcOyx8IlPhK20VFf/IkPRgA0OfaU/gkNt\ncy0PvfkQv170a1ZvW825nziXsz9+Nh8b/bGk3CH04Yfw29+Gvv+259+0Pazs4INhxoxwB099fWgB\nNDTAUUfBCSeE7iIREQWHPhSJRrjrtbu49qVrmTlhJt849Bt8br/PJWUW8NKlYTD4978P/f+nnx62\n0tIwQSsvT49SEJHeU3DoA+7OH9/7I1e8cAWlRaX8/LM/59Bxhybge8Lg8LPPhufsNDaGq/9//jO0\nDGbNCgGhrExBQET2joLDXnpr01t85/nvsHbbWm743A2ctP9Jfdp1VFsbxgieew4efzxU+qefHloB\nOTlhmz49DBprbEBE+spAnSE94G1v2s5Vf76Kx95+jKuPuZqLD7+YzPS+uVzftAl+9aswZrB0aRgn\n+Mxn4IknwiCxgoCIDGRDNji8ufFNvvz4lzlq4lEs+eYSRub1zeyt9evh5z+He++FM8+EX/4ytAhy\ncvokexGRpBiSweGeRfdwxYtXcOPnbuTc6efudX61tTB3Ljz6aLid9PzzwxyD0tK9L6uISH8YUsGh\nNdrKZc9exl9W/oWK8yr46OiP7lk+rWGm8V/+An/9a3jkxKc+BbNnw333hWcKiYiksiEzIN0UaeLc\nJ85lU90mnjzzyT1+rtGf/gQXXxxmFh93XNg+85nw/CERkYFEA9K7UNtcyxce/QKFWYU8e86z5GTs\n/gBAVRV85zvwt7/BbbdBeXkCCioiMkAM+mXFW1pbOOnBk5g8bDKP/9vjux0Y3OGRR+BjHwtPHH3r\nLQUGERn8Bn3L4bq/XUdBVgF3nHrHbs9dWLcOLrkkLD/5xBNwZOfVKEREBqlB3XL4x6p/cOdrd3LP\nrHt2KzBEo3DHHWFi2vTpYREbBQYRGUoGbcthW9M2vvqHr3LbybcxtmBsrz+3eHFoLbjDn/8cJqyJ\niAw1g7blcNmzl3HClBOY9ZFZvUofjcKPfgSf/Sx87Wvwj38oMIjI0DUoWw4vLH+Bl1e9zBsXv9Gr\n9K2t4fbUt98OA86j+2+tHhGRAWHQBYeoR7nqz1fxX8f/F/lZ+btMH4mEGc1r1sDzz4eFc0REhrpB\nFxx+987vAPjStC/tMm1zM5xzTlhA55lnwnoJIiIyyIJDS2sLP/zLD7n58zeTZjsfTqmthS99CXJz\nwwI7ejCeiEi7QTUgfe/r9zJx2EROmHLCTtNVVYXlNEtLw1KcCgwiIjtKaHAws3IzW2pm75vZld28\nP8rMnjOz183sLTM7f0+/q6GlgWtfupafHv/Tnc5pWLsWPv1pOOYYuPtuyBhUbScRkb6RsOBgZunA\nTUA5MA04y8wO6pTsUmCRux8MlAE3mtkeVde3LbyNGaUzmFE6o8c0LS3w5S+HdRZ+9jMtuCMi0pNE\nthxmAMvcfaW7twCPAJ0nHawH2h6PWgRUuXtkd7/I3bll4S1876jv7TTdddeFx2n/6Ee7+w0iIkNL\nIjtVSoHVHfbXAJ0fQnEn8BczWwcUAmfsyRdVrKwgOz2bmRNm9pjm5Zfhzjth0SJIG1QjLSIifS+R\nwaE3CzD8AHjd3cvMbD/gBTOb7u7bOyecM2dO/HVZWRllZWXx/Ttfu5MLD7uwx7GG6mr4ylfgrrtg\nbO+fpCEiklIqKiqoqKjok7wSttiPmc0E5rh7eWz/KiDq7td3SPMM8BN3/0ds/8/Ale6+sFNePS72\nU1VfxX6/3I8V317BiNzuV9z5yleguBh+9au++GUiIqlhoC72sxCYamaTgXXAbOCsTmmWAicA/zCz\nMcCBwIrd+ZL7Ft/HqQee2mNgeOcdeOEF+OCD3Su8iMhQlrDg4O4RM7sUeB5IB+529yVmdlHs/duB\n/w+4x8wWEwbHr3D3LbvxHd
z52p3cdsptPab52c/gsss0+1lEZHek9BrSf1/1d/597r+z5JtLuh1v\nWLUKDjkkLNZTXJyMkoqIDBx7062U0vft3PnanXzj0G/0OBD93/8NF1ygwCAisrtStuUQ9SgjfzaS\nd/7PO4wrHNflM5s3wwEHhEdwjx+frJKKiAwcQ7Ll8H7V+wzPGd5tYAC46aYwG1qBQURk96Xsk4Xm\nr53f46My6urgllvCam4iIrL7UrblMH/NfI4s7TzhOnj6aTj0UJg6NcmFEhEZJFI2OCxYt6DH4PDk\nk3D66UkukIjIIJKSwaEx0sg7le9w6LhDu7zX0gLPPgunndYPBRMRGSRSMjgsWr+IA0ceSG5mbpf3\n/vY32H9/DUSLiOyNlAwO89f2PN7wxBPqUhIR2VspGRwWrF3AkRO6Bgf3MN4wq/OqESIisltSMjj0\n1HJ4/XXIyoJp0/qhUCIig0jKBYfKukqq6qs4cNSBXd5razVo+U8Rkb2TcsFhwdoFHD7+cNKsa9HV\npSQi0jdSMjh016X04YewZg0cdVQ/FEpEZJBJueAwf+38bgejn3oKTj4ZMlL2gSAiIgNHSgUHd2fB\n2gXdPlNp3jw49th+KJSIyCCUUsFhefVyCrMLGVswtst7r78OBx/cD4USERmEUio4fLj1Q/Yr3q/L\n8YYGWLECPvrRfiiUiMgglFLBobK+kpL8ki7H33orLOyTldUPhRIRGYQSGhzMrNzMlprZ+2Z2ZTfv\n/z8zWxTb3jSziJkN7ym/yrpKSvK6Bgd1KYmI9K2EBQczSwduAsqBacBZZnZQxzTufoO7H+LuhwBX\nARXuvrWnPCvrew4OhxzSp8UXERnSEtlymAEsc/eV7t4CPALsbIra2cDDO8uwsq77biW1HERE+lYi\ng0MpsLrD/prYsS7MLA84EfjdzjLsruXQ2gpvvAHTp+9dYUVEpF0ip4z5bqQ9Ffj7zrqU5syZwyuv\nv0LB5AJKTi+hrKwMgOXLoaQEhvc4UiEiMjRUVFRQUVHRJ3mZ++7U4buRsdlMYI67l8f2rwKi7n59\nN2n/ADzq7o/0kJe7O9Nunsbj//Y4Hx3dfs/qY4/BI4/A73+fkJ8hIpKyzAx336NHkSayW2khMNXM\nJptZFjAbmNs5kZkNA44BntxVht3dyrpokcYbRET6WsKCg7tHgEuB54F3CC2DJWZ2kZld1CHp6cDz\n7t6ws/xao61sbdzKyNyROxzXnUoiIn0vYd1KfcnMfFPtJg66+SA2X7F5h/fGjYMFC2DixH4qnIjI\nADVQu5X6VHddShs2QHMzTJjQT4USERmkUic4dDM7uq1LSSu/iYj0rdQJDt20HDT5TUQkMXYZHMzs\nNLNu1uRMsu5aDrpTSUQkMXpT6c8GlpnZz8zsI4kuUE+6mx2tmdEiIomxy+Dg7ucAhwArgHvNbJ6Z\nXWhmhQkvXQfdPVdp7VrYZ59klkJEZGjoVXeRu9cAvwUeBcYDXwAWmdllCSzbDjq3HOrqoKUFioqS\nVQIRkaGjN2MOs2KPt6gAMoEj3P0k4BPAdxNbvHadB6Q3bICxY3WnkohIIvTmwXtfBP7H3f/W8aC7\n15vZvyemWF11HpDesCFMgBMRkb7Xm+BwLbC+bcfMcoExsXUaXkxYyTrpqeUgIiJ9rzdjDo8BrR32\no4Txh6Sqqq9iVN6o+L6Cg4hI4vQmOGS4e3Pbjrs3EcYekiovM4+s9Kz4/vr16lYSEUmU3gSHzWYW\nX94z9nrzTtInRHfPVVLLQUQkMXoz5nAx8KCZ3RTbXwN8NXFF6l7nCXAKDiIiibPL4ODuy4AjY5Pe\n3N1rE1+srtRyEBFJnl6tIW1mpwDTgByLTSxw9/9IYLm66Nxy0JiDiEji9GYS3O3AGcBlgMVeT0pw\nubroGByiUaishNGjk10KEZGhoTcD0ke5+7nAFne/FpgJHJjYYnXVsVupqio8NiMraycfEBGRPdab\n4NC2tnO9mZUCESDpvf0dWw7r12u8QUQkkXoTHP5oZsXAz4FXgZXAw73J3MzKzWypmb1vZlf2kKbM\nzBaZ2VtmVtFTXp1nR2u8QUQkcXY6IB1b5Ocv7l4N/M7MngZy3H3rrjI2s3TgJuAEYC3wipnNdfcl\nHdIMB24GTnT3NWY2qvvc6PJcJbUcREQSZ6ctB3ePEirvtv3G3gSGmBnAstgzmFqAR4BZndKcDfzO\n3dfE8u9xcp2eqyQikjy96VZ60cy+bLbbD8cuBVZ32F8TO9bRVGCEmf3VzBaaWY+T6zqPOahbSUQk\ncXo7Q/q7QKuZNcaOubvvapkd70XemcChwPFAHjDPzP7l7u93Tnj9T66Pv168uIwjjijrRfYiIkNH\nRUUFFRUVfZKXufemDt+DjM1mAnPcvTy2fxUQdffrO6S5Esh19zmx/buA59z9t53y8o7l/Mxn4Oqr\n4bjjElJ0EZFBwcxw9z1aEm2XLQczO6a7450X/+nGQmCqmU0G1gGzgbM6pXkSuCk2eJ0NHAn8967K\npFtZRUQSqzfdSlfQ3kWUQxhofhXY6XW7u0fM7FLgeSAduNvdl5jZRbH3b3f3pWb2HPAGYZ2IO939\nnV0VSLeyiogk1m53K5nZROAX7v7FxBSp2++Mdys1NMDw4dDYqPWjRUR2Zm+6lXpzt1Jna4CD9uTL\n+sLGjaFLSYFBRCRxejPm8KsOu2nAwYRupX6h21hFRBKvN2MOr9I+5hABHnL3fySuSDunCXAiIonX\nm+DwW6DB3VshPBbDzPLcvT6xReuegoOISOL1aoY0kNthPy92rF8oOIiIJF5vgkNOx6VB3X07IUD0\nC405iIgkXm+CQ52ZHda2Y2aH077GQ9Kp5SAikni9GXO4HHjMzNbH9scRZjv3CwUHEZHE22VwcPdX\nzOwg2pcGfdfdmxNbrJ6pW0lEJPF22a0UewRGvru/6e5vAvlm9n8SX7SuotEwCW7MmP74dhGRoaM3\nYw7fiK0EB0Ds9YWJK1LPqquhoACys/vj20VEho7eBIe02HKhQHz5z8zEFalnGm8QEUmO3gxIPw88\nYma3AwZcBDyX0FL1YOvW8NA9ERFJrN4EhysJ3UiXEB6j8QbhjqWkq6+H/Pz++GYRkaFll91Kscdm\nzAdWEtZyOB5Ykthida+uDvL6bfqdiMjQ0WPLwcwOJKzcNhuoBB4nrP9QlpyidaWWg4hIcuysW2kJ\n8BRworuvAjCz7yalVD2or1fLQUQkGXbWrfRFwmMy/mZmt5nZ8YQB6X5TV6eWg4hIMvQYHNz9CXef\nDXwMeBn4DlBiZrea2eeSVcCO1HIQEUmO3gxI17r7g+5+CjARWAR8vzeZm1m5mS01s/fN7Mpu3i8z\nsxozWxTbfrSz/NRyEBFJjt7cyhrn7luAO2LbTsUmy90EnACsBV4xs7nu3vlOp5fc/bTefH99PYwY\nsTslFhGRPdGbGdJ7agawzN1XunsL8Agwq5t0vR7H0K2sIiLJkcjgUAqs7rC/JnasIwe
OMrPFZvaM\nmU3bWYa6lVVEJDl2q1tpN3kv0rwGTHT3ejM7CXgCOKC7hHPmzGHhQmhqgokTyygrK+vDooqIpL6K\nigoqKir6JC9z700dvgcZm80E5rh7eWz/KiDq7tfv5DMfAIfFxjY6Hnd3p7wcLr8cyssTUmQRkUHF\nzHD3PZqCkMhupYXAVDObbGZZhJnWczsmMLMxZmax1zMIwWpL16wCjTmIiCRHwrqV3D0SWyjoeSAd\nuNvdl5jZRbH3bwe+DFxiZhGgHjhzZ3lqzEFEJDkS1q3Ul9q6lT7yEfjDH+Cgg/q7RCIiA99A7Vbq\nc2o5iIgkR8oFB405iIgkXkoFBz0+Q0QkOVImOESjYY5DTk5/l0REZPBLmeBQXw+5uWD9+tBwEZGh\nIaWCg7qURESSI2WCgybAiYgkT8oEB7UcRESSJ2WCg1oOIiLJkzLBQS0HEZHkSZngoJaDiEjypExw\nUMtBRCR5Uio4qOUgIpIcKRMc1K0kIpI8KRMc1K0kIpI8KRMc1HIQEUmelAkOajmIiCRPygQHtRxE\nRJInZYKDWg4iIsmT0OBgZuVmttTM3jezK3eS7ggzi5jZF3tKo5aDiEjyJCw4mFk6cBNQDkwDzjKz\ng3pIdz3wHNDjag1qOYiIJE8iWw4zgGXuvtLdW4BHgFndpPsW8FugcmeZqeUgIpI8iQwOpcDqDvtr\nYsfizKwlBI5FAAAQoElEQVSUEDBujR3ynjJTy0FEJHkyEph3jxV9B/8LfN/d3cyMnXQrrVgxh9/8\nBp57DsrKyigrK+urcoqIDAoVFRVUVFT0SV7m3ps6fA8yNpsJzHH38tj+VUDU3a/vkGYF7QFhFFAP\nfMPd53bKy6dMcZ5/HvbfPyHFFREZdMwMd+/xontnEtlyWAhMNbPJwDpgNnBWxwTuPqXttZndA/yx\nc2Boo24lEZHkSVhwcPeImV0KPA+kA3e7+xIzuyj2/u27k58GpEVEkidh3Up9ycw8Pd1paIDMzP4u\njYhIatibbqWUmSGdlqbAICKSLCkTHDTeICKSPCkTHDTeICKSPIm8W6lPqeUg0v/CdCQZiPp6/Dhl\ngoNaDiIDQyrcxDLUJCJoq1tJRES6SJngoG4lEZHkSZngoJaDiEjypExwUMtBRCR5UiY4qOUgIsnw\n+c9/nvvvv7/P06aalLlbSS0HEelJQUFB/I6duro6cnJySE9PB+COO+7grLPO2tnHd/DMM88kJG2q\nSZngoJaDiPSktrY2/nrffffl7rvv5rjjjuuSLhKJkJGRMtVev0qZbiW1HERkd1VUVDBhwgR+9rOf\nMW7cOC644AK2bt3KKaecwujRoxkxYgSnnnoqa9eujX+mrKyMu+++G4B7772XT33qU3zve99jxIgR\nTJkyheeee26P0n7wwQccc8wxFBUV8dnPfpZvfvObfPWrX03Smdh9KRMc1HIQkT2xceNGqqurWbVq\nFbfffjvRaJQLLriAVatWsWrVKnJzc7n00kvj6c1sh0llCxYs4CMf+QhVVVVcccUVXHDBBXuU9uyz\nz2bmzJls2bKFOXPm8MADDwzoGecpExzUchAZ+Mz6ZutLaWlpXHvttWRmZpKTk8OIESP4whe+QE5O\nDgUFBfzgBz/gpZde6vHzkyZN4oILLsDMOPfcc1m/fj2bNm3arbSrVq1i4cKF/Md//AcZGRkcffTR\nnHbaaQN6tnnKBAe1HEQGPve+2fpSSUkJWVlZ8f36+nouuugiJk+ezLBhwzj22GOpqanpsaIeO3Zs\n/HVerCLqOMbRm7Tr1q1jxIgR5OTkxN+fOHHinv+oJFBwEJFBrXPXzY033sh7773HggULqKmp4aWX\nXsLdE3oVP27cOLZs2UJDQ0P82KpVqxL2fX0hZYKDupVEpC/U1taSm5vLsGHD2LJlC9dee23Cv3PS\npEkcfvjhzJkzh5aWFubNm8dTTz01dMcczKzczJaa2ftmdmU3788ys8VmtsjMXjWzrveexajlICJ7\nonMFfPnll9PQ0MCoUaM46qijOOmkk3qspDsPOHeXX2/TPvjgg8ybN4+RI0dy9dVXM3v27B26uwaa\nhK0hbWbpwLvACcBa4BXgLHdf0iFNvrvXxV5/HPiDu+/fTV6+cKFz2GEJKaqI9FJsTeL+LsagMHv2\nbKZNm8Y111yz13n19N9loK4hPQNY5u4r3b0FeASY1TFBW2CIKQA295SZWg4iksoWLlzI8uXLiUaj\nPPvss8ydO5fTTz+9v4vVo0ROFSwFVnfYXwMc2TmRmZ0O/BQYB3yup8w05iAiqWzDhg188YtfpKqq\niokTJ3Lbbbcxffr0/i5WjxIZHHrV9nT3J4AnzOzTwP3Agd2lU8tBRFLZKaecwimnnNLfxei1RAaH\ntUDHG3knEloP3XL3l80sw8xGuntV5/f/53/mkJkZXpeVlVFWVta3pRURSXEVFRVUVFT0SV6JHJDO\nIAxIHw+sAxbQdUB6P2CFu7uZHQo87u77dZOXR6Pe5zMnRWT3aEB6YErEgHTCWg7uHjGzS4HngXTg\nbndfYmYXxd6/HfgScK6ZtQC1wJk95afAICKSPAlrOfQlM/NUKKfIYKeWw8CUareyiohIilJwEJEh\nLS0tjRUrVgBwySWX8J//+Z+9Sru7HnzwQU488cQ9+mx/ULeSiPTaQO1WKi8v58gjj+zynKQnn3yS\niy++mLVr15KW1v21cFpaGsuWLWPKlCm7/J7epl25ciVTpkwhEon0+L19Sd1KIiLdOP/883nggQe6\nHL///vv5yle+kpQKujsDMZD2loKDiKS8WbNmUVVVxcsvvxw/Vl1dzdNPP82pp57KJz/5SYqLixk/\nfjzf+ta3aGlp6Taf888/n6uvvjq+//Of/5zx48czYcIEfv3rX++Q9umnn+aQQw5h2LBh7LPPPju0\nWo455hgAhg8fTlFREf/617+49957+fSnPx1P889//pMjjjiC4cOHM2PGDObNmxd/r6ysjB//+Md8\n6lOfoqioiBNPPJGqqi7TvxJKwUFEUl5ubi5nnHEG9913X/zYY489xkEHHURBQQG/+MUvqKqqYt68\nefz5z3/mlltu6Tafjk9Wfe6557jxxht58cUXee+993jxxRd3SFtQUMADDzxATU0NTz/9NLfeeitP\nPvkkQDxI1dTUsG3bNmbOnLnDZ7ds2cLJJ5/M5ZdfzpYtW/jud7/LySefTHV1dTzNww8/zL333sum\nTZtobm7mhhtu2PsTtRsSOUNaRIYYu7ZvJiT5NbvfHXPeeedxyimncPPNN5OVlcV9993Heeedx6GH\nHhpPM2nSJC688EJeeuklvv3tb+80v8cee4yvf/3rTJs2DYBrr72WRx55JP7+scceG3/98Y9/nDPP\nPJOXXnqJWbNm7bI76emnn+bAAw/knHPOAeDMM8/kl7/8JXPnzuW8887DzPja177G/vuHh1SfccYZ\nzJ07d/dOyF5ScBCRPrMnlXpfOfrooxk1ah
R/+MMfOPzww3nllVd44okneO+99/jud7/Lq6++Sn19\nPZFIhMMPP3yX+a1fv54jjjgivr/PPvvs8P78+fP5/ve/z9tvv01zczNNTU2cccYZvSrrunXruuQ3\nadIk1q1bF9/vuORobm5uj0uTJoq6lURk0Dj33HO57777eOCBBygvL6ekpIRLLrmEadOmsWzZMmpq\navjJT35CNBrdZV7jxo3bYSnPzst6nn322Zx++umsWbOGrVu3cvHFF8fz3dUKb6WlpXz44Yc7HPvw\nww8pLS3t7U9NOAUHERk0zj33XF544QXuuusuzjvvPCAsC1pYWEheXh5Lly7l1ltv7fHzHdeSPuOM\nM7j33ntZsmQJ9fX1XW6Tra2tpbi4mKysLBYsWMBDDz0UDwolJSWkpaWxfPnybr/npJNO4r333uPh\nhx8mEonw6KOPsnTp0h2e2trfdzopOIjIoDFp0iSOPvpo6uvrOe200wC44YYbeOihhygqKuLCCy/k\nzDPP3OHKvvPrtv3y8nIuv/xyjjvuOA444ACOP/74HdLecsst/PjHP6aoqIjrrruO2bNnx9/Ly8vj\nhz/8IUcffTQjRoxg/vz5O+Q9cuRInnrqKW688UZGjRrFDTfcwFNPPcWIESN2Wa5k0SQ4Eem1gToJ\nbqjTJDgREUkKBQcREelCwUFERLpQcBARkS4UHEREpAsFBxER6UKPzxCR3ZLs++2lfyQ8OJhZOfC/\nQDpwl7tf3+n9c4ArAAO2A5e4+xuJLpeI7D7NcRg6EtqtZGbpwE1AOTANOMvMDuqUbAVwjLt/ArgO\nuCORZUp1FRUV/V2EAUPnop3ORTudi76R6DGHGcAyd1/p7i3AI8CsjgncfZ6718R25wMTElymlKb/\n8dvpXLTTuWinc9E3Eh0cSoHVHfbXxI715ALgmYSWSEREdinRYw697qA0s88AXweOTlxxRESkNxL6\n4D0zmwnMcffy2P5VQLSbQelPAL8Hyt19WTf5aBRMRGQP7OmD9xLdclgITDWzycA6YDZwVscEZrYP\nITB8pbvAAHv+40REZM8kNDi4e8TMLgWeJ9zKere7LzGzi2Lv3w78GCgGbo3dP93i7jMSWS4REdm5\nlFjPQUREkmtAPz7DzMrNbKmZvW9mV/Z3eZLJzCaa2V/N7G0ze8vMLosdH2FmL5jZe2b2JzMb3t9l\nTRYzSzezRWb2x9j+kDwXZjbczH5rZkvM7B0zO3IIn4urYv9G3jSzh8wse6icCzP7tZltNLM3Oxzr\n8bfHztX7sTr1c7vKf8AGh15OoBvMWoDvuPtHgZnAN2O///vAC+5+APDn2P5Q8W3gHdrvghuq5+IX\nwDPufhDwCWApQ/BcxMYyvwEc6u4fJ3Rdn8nQORf3EOrHjrr97WY2jTDmOy32mVvMbKf1/4ANDvRi\nAt1g5u4b3P312OtaYAlhjshpwG9iyX4DnN4/JUwuM5sAfB64i/CoFRiC58LMhgGfdvdfQxjXi00i\nHXLnAthGuIjKM7MMII9w48uQOBfu/jJQ3elwT799FvCwu7e4+0pgGaGO7dFADg67O4Fu0IpdIR1C\nmEE+xt03xt7aCIzpp2Il2/8A3wOiHY4NxXOxL1BpZveY2WtmdqeZ5TMEz4W7bwFuBFYRgsJWd3+B\nIXguOujpt48n1KFtdlmfDuTgoJFywMwKgN8B33b37R3f83A3waA/T2Z2CrDJ3RfR3mrYwVA5F4Q7\nDA8FbnH3Q4E6OnWbDJVzYWb7AZcDkwmVX4GZfaVjmqFyLrrTi9++0/MykIPDWmBih/2J7Bj5Bj0z\nyyQEhvvd/YnY4Y1mNjb2/jhgU3+VL4mOAk4zsw+Ah4HjzOx+hua5WAOscfdXYvu/JQSLDUPwXBwO\n/NPdq9w9Qpgv9UmG5rlo09O/ic716YTYsR4N5OAQn0BnZlmEwZS5/VympLEw6eNu4B13/98Ob80F\nzou9Pg94ovNnBxt3/4G7T3T3fQkDjn9x968yNM/FBmC1mR0QO3QC8DbwR4bYuSAMxM80s9zYv5cT\nCDcsDMVz0aanfxNzgTPNLMvM9gWmAgt2ltGAnudgZifRvhbE3e7+034uUtKY2aeAvwFv0N78u4rw\nH/QxYB9gJXCGu2/tjzL2BzM7Fvi/7n6amY1gCJ4LM5tOGJjPApYDXyP8GxmK5+IKQiUYBV4D/h0o\nZAicCzN7GDgWGEUYX/gx8CQ9/HYz+wHh+XURQjf18zvNfyAHBxER6R8DuVtJRET6iYKDiIh0oeAg\nIiJdKDiIiEgXCg4iItKFgoOIiHSh4CBDlpm1xh4B3rZd0Yd5T+74KGWRVJPoZUJFBrJ6dz+kvwsh\nMhCp5SDSiZmtNLPrzewNM5sfe8BbW2vgL2a22MxeNLOJseNjzOwPZvZ6bJsZyyrdzO6ILdb0vJnl\nxNJfFlugZnFslqvIgKPgIENZbqdupX+LHXfC458/QVhwqu3ZVr8C7nH36cCDwC9jx38J/NXdDyY8\nBO+d2PGpwE3u/jFgK/Cl2PErgYNj+VyUwN8nssf0+AwZssxsu7sXdnP8A+Az7r4y9mTc9e4+yswq\ngbHu3ho7vs7dS8xsE1AaW5SqLY/JwJ9iK3K1PQMo091/YmbPArWEh6I94e51if6tIrtLLQeRXet4\nBdXtehI9HG/q8LqV9jG+k4GbCa2MV2JL4ooMKAoOIt2b3eHvP2Ov/0l4ZDjAOYSn5kJYq/cSCGuf\nm1lRT5nGHi29j7tXEBbpGQbk92nJRfqA7laSoSzXzBZ12H/W3X8Qe11sZouBRuCs2LFvAfeY2fcI\ni6h8LXb828AdZnYBoYVwMeERyp37bJ3waO37Y2tBG/ALd9/Wx79LZK9pzEGkk9iYw2GxNYpFhiR1\nK4l0pSsmGfLUchARkS7UchARkS4UHEREpAsFBxER6ULBQUREulBwEBGRLhQcRESki/8fG73mOBFg\nmPYAAAAASUVORK5CYII=\n", 150 | "text/plain": [ 151 | "" 152 | ] 153 | }, 154 | "metadata": {}, 155 | "output_type": "display_data" 156 | } 157 | ], 158 | "source": [ 159 | "import matplotlib.pyplot as plt\n", 160 | "%matplotlib inline\n", 161 | "\n", 162 | "plt.figure()\n", 163 | "plt.xlabel('Epochs')\n", 164 | "plt.ylabel('Loss')\n", 165 | "plt.plot(history.history['loss'])\n", 166 | "plt.plot(history.history['val_loss'])\n", 167 | "plt.legend(['Training', 'Validation'])\n", 168 | "\n", 169 | "plt.figure()\n", 170 | 
"plt.xlabel('Epochs')\n", 171 | "plt.ylabel('Accuracy')\n", 172 | "plt.plot(history.history['acc'])\n", 173 | "plt.plot(history.history['val_acc'])\n", 174 | "plt.legend(['Training', 'Validation'], loc='lower right')" 175 | ] 176 | }, 177 | { 178 | "cell_type": "markdown", 179 | "metadata": {}, 180 | "source": [ 181 | "After 100 epochs, we get a 95% validation accuracy. If you continue training, at some point the validation loss will start to increase: that is when the model starts to overfit. It always necessary to monitor training and validation loss during the training of any kind of Neural Network, either to detect overfitting or to evaluate the behaviour of the model.\n", 182 | "\n", 183 | "We have seen how to build a simple Neural Network with only fully-connected layers. In the next tutorial we will see how to build and train a Convolutional Neural Network with Keras." 184 | ] 185 | } 186 | ], 187 | "metadata": { 188 | "kernelspec": { 189 | "display_name": "Python 2", 190 | "language": "python", 191 | "name": "python2" 192 | }, 193 | "language_info": { 194 | "codemirror_mode": { 195 | "name": "ipython", 196 | "version": 2 197 | }, 198 | "file_extension": ".py", 199 | "mimetype": "text/x-python", 200 | "name": "python", 201 | "nbconvert_exporter": "python", 202 | "pygments_lexer": "ipython2", 203 | "version": "2.7.8" 204 | } 205 | }, 206 | "nbformat": 4, 207 | "nbformat_minor": 0 208 | } 209 | -------------------------------------------------------------------------------- /03_CNN.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Convolutional Neural Network\n", 8 | "In this third tutorial we will see how to define, train and visualize a Convolutional Neural Network (CNN) for image classification using the [Keras](http://keras.io/) library. \n", 9 | "\n", 10 | "As you should have seen, a CNN is a feed-forward neural network tipically composed of Convolutional, MaxPooling and Dense layers. If the task implemented by the CNN is a classification task, the last Dense layer shoud use the Softmax activation, and the loss should be the categorical crossentropy.\n", 11 | "\n", 12 | "## Define the network\n", 13 | "A simple CNN, with one input branch and one output branch can be defined using a [Sequential](http://keras.io/models/#sequential) model and stacking together all its layer. In this example the network will contain two [Convolution, Convolution, MaxPooling] stages, and two Dense layers.\n", 14 | "\n", 15 | "To test a different optimizer, we will use [AdaDelta](http://keras.io/optimizers/), which is a bit more complex than the simple Vanilla SGD with momentum." 
16 | ] 17 | }, 18 | { 19 | "cell_type": "code", 20 | "execution_count": 1, 21 | "metadata": { 22 | "collapsed": false 23 | }, 24 | "outputs": [ 25 | { 26 | "name": "stdout", 27 | "output_type": "stream", 28 | "text": [ 29 | "Using TensorFlow backend.\n" 30 | ] 31 | } 32 | ], 33 | "source": [ 34 | "from keras.models import Sequential\n", 35 | "from keras.layers.core import Dense, Dropout, Flatten, Activation\n", 36 | "from keras.layers.convolutional import Convolution2D, MaxPooling2D\n", 37 | "from keras.optimizers import Adadelta\n", 38 | "\n", 39 | "input_shape = (3, 32, 32)\n", 40 | "nb_classes = 10\n", 41 | "\n", 42 | "model = Sequential()\n", 43 | "model.add(Convolution2D(32, 3, 3, border_mode='same',\n", 44 | " input_shape=input_shape))\n", 45 | "model.add(Activation('relu'))\n", 46 | "model.add(Convolution2D(32, 3, 3))\n", 47 | "model.add(Activation('relu'))\n", 48 | "model.add(MaxPooling2D(pool_size=(2, 2)))\n", 49 | "model.add(Dropout(0.25))\n", 50 | "\n", 51 | "model.add(Convolution2D(64, 3, 3, border_mode='same'))\n", 52 | "model.add(Activation('relu'))\n", 53 | "model.add(Convolution2D(64, 3, 3))\n", 54 | "model.add(Activation('relu'))\n", 55 | "model.add(MaxPooling2D(pool_size=(2, 2)))\n", 56 | "model.add(Dropout(0.25))\n", 57 | "\n", 58 | "model.add(Flatten())\n", 59 | "model.add(Dense(512))\n", 60 | "model.add(Activation('relu'))\n", 61 | "model.add(Dropout(0.5))\n", 62 | "model.add(Dense(nb_classes))\n", 63 | "model.add(Activation('softmax'))\n", 64 | "\n", 65 | "model.compile(loss='categorical_crossentropy', optimizer=Adadelta())" 66 | ] 67 | }, 68 | { 69 | "cell_type": "markdown", 70 | "metadata": {}, 71 | "source": [ 72 | "### Understanding layer shapes\n", 73 | "An important feature of Keras layers is that each of them has an `input_shape` attribute, which you can use to visualize the shape of the input tensor, and an `output_shape` attribute, for inspecting the shape of the output tensor.\n", 74 | "\n", 75 | "As we can see, the input shape of the first convolutional layer corresponds to the `input_shape` attribute (which must be specified by the user). In this case, it is a 32x32 image with three color channels. Since this convolutional layer has the `border_mode` set to `same`, its output width and height will remain the same, and the number of output channels will be equal to the number of filters learned by the layer, 32. The following convolutional layers, instead, have the default `border_mode`, and therefore reduce width and height by $(k-1)$, where $k$ is the size of the kernel. \n", 76 | "\n", 77 | "MaxPooling layers, instead, reduce width and height of the input tensor, but keep the same number of channels. Activation layers, of course, don't change the shape." 
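The arithmetic behind these shapes is easy to check by hand. A small helper (illustrative only, assuming stride-1 convolutions as in the model above) reproduces the width/height progression of the table printed below:

```python
def conv_out(size, k=3, border_mode='valid'):
    # 'same' keeps the spatial size, 'valid' loses (k - 1) pixels
    return size if border_mode == 'same' else size - (k - 1)

def pool_out(size, pool=2):
    return size // pool

s = 32
s = conv_out(s, border_mode='same')   # 32
s = conv_out(s)                       # 30
s = pool_out(s)                       # 15
s = conv_out(s, border_mode='same')   # 15
s = conv_out(s)                       # 13
s = pool_out(s)                       # 6, flattened to 64 * 6 * 6 = 2304 features
```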
78 | ] 79 | }, 80 | { 81 | "cell_type": "code", 82 | "execution_count": 2, 83 | "metadata": { 84 | "collapsed": false 85 | }, 86 | "outputs": [ 87 | { 88 | "name": "stdout", 89 | "output_type": "stream", 90 | "text": [ 91 | "Layer 0 \t (None, 3, 32, 32) \t (None, 32, 32, 32)\n", 92 | "Layer 1 \t (None, 32, 32, 32) \t (None, 32, 32, 32)\n", 93 | "Layer 2 \t (None, 32, 32, 32) \t (None, 32, 30, 30)\n", 94 | "Layer 3 \t (None, 32, 30, 30) \t (None, 32, 30, 30)\n", 95 | "Layer 4 \t (None, 32, 30, 30) \t (None, 32, 15, 15)\n", 96 | "Layer 5 \t (None, 32, 15, 15) \t (None, 32, 15, 15)\n", 97 | "Layer 6 \t (None, 32, 15, 15) \t (None, 64, 15, 15)\n", 98 | "Layer 7 \t (None, 64, 15, 15) \t (None, 64, 15, 15)\n", 99 | "Layer 8 \t (None, 64, 15, 15) \t (None, 64, 13, 13)\n", 100 | "Layer 9 \t (None, 64, 13, 13) \t (None, 64, 13, 13)\n", 101 | "Layer 10 \t (None, 64, 13, 13) \t (None, 64, 6, 6)\n", 102 | "Layer 11 \t (None, 64, 6, 6) \t (None, 64, 6, 6)\n", 103 | "Layer 12 \t (None, 64, 6, 6) \t (None, 2304)\n", 104 | "Layer 13 \t (None, 2304) \t (None, 512)\n", 105 | "Layer 14 \t (None, 512) \t (None, 512)\n", 106 | "Layer 15 \t (None, 512) \t (None, 512)\n", 107 | "Layer 16 \t (None, 512) \t (None, 10)\n", 108 | "Layer 17 \t (None, 10) \t (None, 10)\n" 109 | ] 110 | } 111 | ], 112 | "source": [ 113 | "from __future__ import print_function\n", 114 | "for i, layer in enumerate(model.layers):\n", 115 | " print (\"Layer\", i, \"\\t\", layer.input_shape, \"\\t\", layer.output_shape)" 116 | ] 117 | }, 118 | { 119 | "cell_type": "markdown", 120 | "metadata": {}, 121 | "source": [ 122 | "### Understanding weights shape\n", 123 | "In the same way, we can visualize the shape of the weights learned by each layer. In particular, Keras lets you inspect weights by using the `get_weights` method of a layer object. This will return a list with two elements, the first one being the weight tensor and the second one being the bias vector.\n", 124 | "\n", 125 | "Of course, MaxPooling layer don't have any weight tensor, since they don't have learnable parameters. Convolutional layers, instead, learn a $(n_o, n_i, k, k)$ weight tensor, where $k$ is the size of the kernel, $n_i$ is the number of channels of the input tensor, and $n_o$ is the number of filters to be learned. For each of the $n_o$ filters, a bias is also learned. Dense layers learn a $(n_i, n_o)$ weight tensor, where $n_o$ is the output size and $n_i$ is the input size of the layer. Each of the $n_o$ neurons also has a bias." 126 | ] 127 | }, 128 | { 129 | "cell_type": "code", 130 | "execution_count": 3, 131 | "metadata": { 132 | "collapsed": false 133 | }, 134 | "outputs": [ 135 | { 136 | "name": "stdout", 137 | "output_type": "stream", 138 | "text": [ 139 | "Layer 0 \t (32, 3, 3, 3) \t (32,)\n", 140 | "Layer 2 \t (32, 32, 3, 3) \t (32,)\n", 141 | "Layer 6 \t (64, 32, 3, 3) \t (64,)\n", 142 | "Layer 8 \t (64, 64, 3, 3) \t (64,)\n", 143 | "Layer 13 \t (2304, 512) \t (512,)\n", 144 | "Layer 16 \t (512, 10) \t (10,)\n" 145 | ] 146 | } 147 | ], 148 | "source": [ 149 | "for i, layer in enumerate(model.layers):\n", 150 | " if len(layer.get_weights()) > 0:\n", 151 | " print(\"Layer\", i, \"\\t\", layer.get_weights()[0].shape, \"\\t\", layer.get_weights()[1].shape)" 152 | ] 153 | }, 154 | { 155 | "cell_type": "markdown", 156 | "metadata": {}, 157 | "source": [ 158 | "# Training the network\n", 159 | "We will train our network on the Cifar10 dataset, which contains 50,000 32x32 color training images, labeled over 10 categories, and 10,000 test images. 
As this dataset is also included in Keras datasets, we just ask the `keras.datasets` module for the dataset.\n", 160 | "\n", 161 | "Training and test images are normalized to lie in the $\\left[0,1\\right]$ interval." 162 | ] 163 | }, 164 | { 165 | "cell_type": "code", 166 | "execution_count": 4, 167 | "metadata": { 168 | "collapsed": false 169 | }, 170 | "outputs": [], 171 | "source": [ 172 | "from keras.datasets import cifar10\n", 173 | "from keras.utils import np_utils\n", 174 | "\n", 175 | "(X_train, y_train), (X_test, y_test) = cifar10.load_data()\n", 176 | "Y_train = np_utils.to_categorical(y_train, nb_classes)\n", 177 | "Y_test = np_utils.to_categorical(y_test, nb_classes)\n", 178 | "X_train = X_train.astype(\"float32\")\n", 179 | "X_test = X_test.astype(\"float32\")\n", 180 | "X_train /= 255\n", 181 | "X_test /= 255" 182 | ] 183 | }, 184 | { 185 | "cell_type": "markdown", 186 | "metadata": {}, 187 | "source": [ 188 | "To reduce the risk of overfitting, we also apply some image transformations, such as rotations, shifts and flips. All these can be easily implemented using the Keras [Image Data Generator](http://keras.io/preprocessing/image/)." 189 | ] 190 | }, 191 | { 192 | "cell_type": "code", 193 | "execution_count": 5, 194 | "metadata": { 195 | "collapsed": false 196 | }, 197 | "outputs": [], 198 | "source": [ 199 | "from keras.preprocessing.image import ImageDataGenerator\n", 200 | "datagen = ImageDataGenerator(\n", 201 | " featurewise_center=True, # set input mean to 0 over the dataset\n", 202 | " samplewise_center=False, # set each sample mean to 0\n", 203 | " featurewise_std_normalization=True, # divide inputs by std of the dataset\n", 204 | " samplewise_std_normalization=False, # divide each input by its std\n", 205 | " zca_whitening=False, # apply ZCA whitening\n", 206 | " rotation_range=0, # randomly rotate images in the range (degrees, 0 to 180)\n", 207 | " width_shift_range=0.2, # randomly shift images horizontally (fraction of total width)\n", 208 | " height_shift_range=0.2, # randomly shift images vertically (fraction of total height)\n", 209 | " horizontal_flip=True, # randomly flip images\n", 210 | " vertical_flip=False) # randomly flip images\n", 211 | "\n", 212 | "datagen.fit(X_train)" 213 | ] 214 | }, 215 | { 216 | "cell_type": "markdown", 217 | "metadata": {}, 218 | "source": [ 219 | "Now we can start training. At each iteration, a batch of 500 images is requested from the `ImageDataGenerator` object, and then fed to the network." 
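Each of those batches also gets the featurewise statistics applied. As a rough mental model (illustrative code, not the actual Keras implementation), `featurewise_center` and `featurewise_std_normalization` subtract the mean and divide by the standard deviation that `datagen.fit` computed once over the whole training set:

```python
import numpy as np

X = np.random.rand(500, 3, 32, 32).astype("float32")   # stands in for X_train
mean = X.mean(axis=0, keepdims=True)                    # statistics computed once over the dataset
std = X.std(axis=0, keepdims=True)
X_normalized = (X - mean) / (std + 1e-7)                # roughly what each generated batch receives
```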
220 | ] 221 | }, 222 | { 223 | "cell_type": "code", 224 | "execution_count": null, 225 | "metadata": { 226 | "collapsed": false 227 | }, 228 | "outputs": [], 229 | "source": [ 230 | "from keras.utils import generic_utils\n", 231 | "for e in range(200):\n", 232 | " print('Epoch', e)\n", 233 | " print('Training...')\n", 234 | " progbar = generic_utils.Progbar(X_train.shape[0])\n", 235 | " for X_batch, Y_batch in datagen.flow(X_train, Y_train, batch_size=500, shuffle=True):\n", 236 | " loss = model.train_on_batch(X_batch, Y_batch)\n", 237 | " progbar.add(X_batch.shape[0], values=[('train loss', loss[0])])" 238 | ] 239 | } 240 | ], 241 | "metadata": { 242 | "kernelspec": { 243 | "display_name": "Python 2", 244 | "language": "python", 245 | "name": "python2" 246 | }, 247 | "language_info": { 248 | "codemirror_mode": { 249 | "name": "ipython", 250 | "version": 2 251 | }, 252 | "file_extension": ".py", 253 | "mimetype": "text/x-python", 254 | "name": "python", 255 | "nbconvert_exporter": "python", 256 | "pygments_lexer": "ipython2", 257 | "version": "2.7.8" 258 | } 259 | }, 260 | "nbformat": 4, 261 | "nbformat_minor": 0 262 | } 263 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Deep Learning Tutorials 2 | This repository contains three Deep Learning tutorials which I wrote for the Machine Learning course at the University of Modena and Reggio Emilia. 3 | 4 | They have been designed for a three-hour class, in which students can get to know automatic differentiation and SGD and design some simple Neural Networks. 5 | 6 | * **[Tutorial 1](01_Linear_Regression.ipynb)**: Training a Linear Regression model with SGD and the Keras backend. 7 | * **[Tutorial 2](02_MLP.ipynb)**: Multilayer Perceptron in Keras on the MNIST dataset 8 | * **[Tutorial 3](03_CNN.ipynb)**: A Convolutional Neural Network trained on the CIFAR10 dataset 9 | --------------------------------------------------------------------------------