├── README.md ├── Training both Models and Architecture Experiments .ipynb ├── plots ├── CompressionRate_Accuracy.png ├── initial experiment │ ├── compressionRate_Accuracy.png │ └── parameterSize_Accuracy.png └── parameterSize_Accuracy.png ├── report.md ├── runme.py ├── stockweighs.h5 └── student_weights_6_0.2dopout.h5 /README.md: -------------------------------------------------------------------------------- 1 | # Keras Model Compression 2 | 3 | An implementation of "Distilling the Knowledge in a Neural Network" by Geoffrey Hinton et al., https://arxiv.org/abs/1503.02531 4 | 5 | 6 | #### Usage 7 | 8 | python runme.py 9 | 10 | 11 | ##### Initial Model Parameters: 12 | - Total params: 1,199,882 13 | - Trainable params: 1,199,882 14 | 15 | ### Model Evaluation before compression 16 | 17 | - Initial accuracy on test set ~= 0.99 18 | - Initial model parameters: 1,199,882 19 | - Memory footprint per image feed-forward ~= 4.577 MB 20 | 21 | ##### Evaluation of the Model after compression 22 | 23 | - Compressed model parameters: 74,188 24 | - Compression rate: 16.2x 25 | - Accuracy: 0.96 26 | 27 | ## Implementation of Logit Regression in Keras 28 | 29 | Experiments and thoughts investigating the effect of the number of parameters in the model.
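The 16.2x compression rate reported above follows directly from the parameter counts. A minimal sketch of the arithmetic in Python (the layer sizes used here, 9216 flattened features into a 6-neuron hidden layer and a 10-way output, follow the student architecture described in the notebook):

```python
# Parameter counts reported above
teacher_params = 1199882            # full convolutional model
conv_params = 320 + 18496           # the two convolutional layers, reused as-is

# Student head: Dense(9216 -> 6) plus Dense(6 -> 10), weights + biases
student_head = (9216 * 6 + 6) + (6 * 10 + 10)
student_params = conv_params + student_head

compression_rate = teacher_params / float(student_params)
print(student_params)               # 74188
print(round(compression_rate, 1))   # 16.2
```

This reproduces the "Compressed model parameters: 74,188" and "Compression rate: 16.2x" figures above.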
30 | 31 | ### Convolutional Network Model Architecture and Summary 32 | 33 | ____________________________________________________________________________________________________ 34 | | Layer (type) | Output Shape | Param # | Connected to | 35 | |----------------------------------:|----------------------:|------------:|------------------------:| 36 | | convolution2d_1 (Convolution2D) | (None, 26, 26, 32) | 320 | convolution2d_input_1[0][0] | 37 | | convolution2d_2 (Convolution2D) | (None, 24, 24, 64) | 18496 | convolution2d_1[0][0] | 38 | | maxpooling2d_1 (MaxPooling2D) | (None, 12, 12, 64) | 0 | convolution2d_2[0][0] | 39 | | dropout_1 (Dropout) | (None, 12, 12, 64) | 0 | maxpooling2d_1[0][0] | 40 | | flatten_1 (Flatten) | (None, 9216) | 0 | dropout_1[0][0] | 41 | | dense_1 (Dense) | (None, 128) | 1179776 | flatten_1[0][0] | 42 | | dropout_2 (Dropout) | (None, 128) | 0 | dense_1[0][0] | 43 | | dense_2 (Dense) | (None, 10) | 1290 | dropout_2[0][0] | 44 | | activation_1 (Activation) | (None, 10) | 0 | dense_2[0][0] | 45 | ____________________________________________________________________________________________________ 46 | 47 | - Total params: 1,199,882 48 | - Trainable params: 1,199,882 49 | - Non-trainable params: 0 50 | 51 | 52 | ### Reducing The Model 53 | If we observe the model weights and parameter counts, the most parameter-hungry layers are the two Dense layers, while the two convolutional layers have only 18,816 parameters, accounting for just (18816 / 1,199,882) x 100 ≈ 1.57 percent of all the parameters. 54 | 55 | So the first step is to replace them with a lighter model that generalizes well to new examples and learns to mimic the heavy dense layers of the original model. 56 | 57 | Replacing these two heavy dense layers with a single hidden layer of 6 neurons, we can achieve an accuracy of 0.9626. 58 | 59 | The logits from the last layer
before the Activation layer were used, as mentioned in Geoffrey Hinton's paper, to avoid encountering very small activation values after squashing through the softmax function. 60 | 61 | The Python notebook in this repository shows other architectures with varying numbers of hidden-layer neurons and the effect of the hidden-layer width on accuracy. See the plots/ subdirectory for findings. 62 | 63 | ### Evaluation of the Model after Reduction 64 | 65 | - Compressed model parameters: 74,188 66 | - Compression rate: 16.2x 67 | 68 | ### Experiments on trading accuracy for model size. The initial model accuracy is 99%; what is the minimum model size we can achieve without dropping below 0.95 accuracy? 69 | 70 | Experiment: 71 | The number of hidden-layer neurons in the small model was iteratively decreased; this showed an exponential decrease in model size and a linear decrease in accuracy. 72 | 73 | By examining the plot of the number of parameters (or model size) against accuracy, we can find the tradeoff and the number of hidden-layer neurons needed to achieve at least 0.95 accuracy. 74 | 75 | I was able to obtain 16.2x compression while keeping accuracy at 0.96. 76 | 77 | ![Compression-Rate-Accuracy](/plots/CompressionRate_Accuracy.png){:class="img-responsive"} 78 | ![ParameterSize-Accuracy](/plots/parameterSize_Accuracy.png){:class="img-responsive"} 79 | 80 | 81 | ### Experiment Settings and Hyperparameters for the student model 82 | Various dropout rates were tested; 0.2 was chosen to regularize a bit and showed good generalization accuracy. 83 | 84 | Loss function: mean squared error on the logits, as mentioned by Geoff Hinton in the paper; optimized with the Adadelta optimizer. 85 | 86 | Let s(f) be the minimum model size that can achieve accuracy f. Given 0 <= f_1 < f_2 < ... < f_n <= 1, does 0 < s(f_1) < s(f_2) < ... < s(f_n) hold? 87 | 88 | No, the above relation between model size and minimum accuracy does not hold.
During my experiments I found that accuracy rose or stayed equal even when the model size was reduced several times over. 89 | 90 | However, after plotting accuracy against model size, it is observed that accuracy decreases overall as model size is reduced. 91 | 92 | ### Future Work and Improvements: 93 | It is also worth noting that in the MNIST dataset the digits appear only in the center of the image, so convolution weights corresponding to the edges are blank/constant and likely to be highly redundant or noisy. Pruning these weight connections through the network is likely to reduce model size effectively. 94 | 95 | Ideas from Song Han's work on deep compression can be taken forward to establish a general framework for compressing deep-learning models in Keras. 96 | 97 | See Han's work here: 98 | https://arxiv.org/abs/1510.00149 "Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding" 99 | -------------------------------------------------------------------------------- /Training both Models and Architecture Experiments .ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": { 7 | "collapsed": false 8 | }, 9 | "outputs": [ 10 | { 11 | "name": "stderr", 12 | "output_type": "stream", 13 | "text": [ 14 | "Using TensorFlow backend.\n" 15 | ] 16 | } 17 | ], 18 | "source": [ 19 | "import keras\n", 20 | "from keras.datasets import mnist\n", 21 | "import numpy as np\n", 22 | "from sklearn import metrics\n", 23 | "import h5py\n", 24 | "from matplotlib import pyplot as plt \n", 25 | "%matplotlib inline\n", 26 | "plt.rcParams['figure.figsize'] = (8, 6)\n", 27 | "from skimage import io\n", 28 | "\n", 29 | "def show(img):\n", 30 | " io.imshow(img)\n", 31 | " io.show()\n", 32 | "def softmax_c(z):\n", 33 | " assert len(z.shape) == 2\n", 34 | " s = np.max(z, axis=1)\n", 35 | " s = s[:, np.newaxis]\n", 36
| " e_x = np.exp(z - s)\n", 37 | " div = np.sum(e_x, axis=1)\n", 38 | " div = div[:, np.newaxis] \n", 39 | " return e_x / div" 40 | ] 41 | }, 42 | { 43 | "cell_type": "markdown", 44 | "metadata": {}, 45 | "source": [ 46 | "### Model Compression/Reduction Techniques Studied\n", 47 | "\n", 48 | "#### Logit Regression : NIPS '14\n", 49 | "- Formulation: (Regression)\n", 50 | "- Loss Function : \n", 51 | "- https://arxiv.org/pdf/1312.6184.pdf\n", 52 | "- Do Deep Nets Really Need to be Deep?\n", 53 | "\n", 54 | "#### Song Hans et al. \n", 55 | "- Deep Compression, Pruning, trained-Quantization and Huffman Coding\n", 56 | "- Learning both Weights and Connections for Efficient Neural Networks\n", 57 | "- Formulation: \n", 58 | "- Loss Function : \n", 59 | "\n", 60 | "#### Quanitzation of Dense Layer. \n", 61 | "- Discretize the continuous Range of Values in a dense Layer to Clusters\n", 62 | "- Little or no Loss of accuracy. \n", 63 | "- Dense Layers are reduced.\n", 64 | "- a series of information theoretical vector quantization methods (Jegou et al., 2011; Chen et al., 2010) for compressing dense connected layers. \n", 65 | "- SqueezeNet (Implementation Available)\n", 66 | "\n", 67 | "\n", 68 | "- They Found simply applying kmeans-based scalar quantization achieves very impressive results. Than Matrix Factorization \n", 69 | "- Compression Rate to Accuracy Tradeoff using Kmeans saclar quantization := 32 / log2 (k)\n", 70 | "- Product Quantization .. breaking matrix into S submatrices... and kmeans clustering on each one of them . 
Compression Rate (32mn)/(32kn + log2(k)ms).\n", 71 | "\n", 72 | "\n", 73 | "- COMPRESSING DEEP CONVOLUTIONAL NETWORKS\n", 74 | "- USING VECTOR QUANTIZATION Yunchao Gong, Liu Liu ∗ , Ming Yang, Lubomir Bourdev" 75 | ] 76 | }, 77 | { 78 | "cell_type": "code", 79 | "execution_count": 2, 80 | "metadata": { 81 | "collapsed": false 82 | }, 83 | "outputs": [ 84 | { 85 | "name": "stdout", 86 | "output_type": "stream", 87 | "text": [ 88 | "('x_train shape:', (60000, 28, 28, 1))\n", 89 | "(60000, 'train samples')\n", 90 | "(10000, 'test samples')\n" 91 | ] 92 | } 93 | ], 94 | "source": [ 95 | "from keras.datasets import mnist\n", 96 | "from keras.models import Sequential\n", 97 | "from keras.layers import Dense, Dropout, Flatten\n", 98 | "from keras.layers import Conv2D, MaxPooling2D , Activation\n", 99 | "from keras import backend as K\n", 100 | "\n", 101 | "batch_size = 128\n", 102 | "num_classes = 10\n", 103 | "epochs = 15\n", 104 | "\n", 105 | "# input image dimensions\n", 106 | "img_rows, img_cols = 28, 28\n", 107 | "# the data, shuffled and split between train and test sets\n", 108 | "(x_train, y_train), (x_test, y_test) = mnist.load_data()\n", 109 | "\n", 110 | "if K.image_dim_ordering() == 'th':\n", 111 | " x_train = x_train.reshape(x_train.shape[0], 1, img_rows, img_cols)\n", 112 | " x_test = x_test.reshape(x_test.shape[0], 1, img_rows, img_cols)\n", 113 | " input_shape = (1, img_rows, img_cols)\n", 114 | "else:\n", 115 | " x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)\n", 116 | " x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)\n", 117 | " input_shape = (img_rows, img_cols, 1)\n", 118 | "\n", 119 | "x_train = x_train.astype('float32')\n", 120 | "x_test = x_test.astype('float32')\n", 121 | "x_train /= 255\n", 122 | "x_test /= 255\n", 123 | "print('x_train shape:', x_train.shape)\n", 124 | "print(x_train.shape[0], 'train samples')\n", 125 | "print(x_test.shape[0], 'test samples')\n", 126 | "\n", 127 | "# convert class vectors to 
binary class matrices\n", 128 | "y_train = keras.utils.np_utils.to_categorical(y_train, num_classes)\n", 129 | "y_test = keras.utils.np_utils.to_categorical(y_test, num_classes)\n", 130 | "\n", 131 | "model = Sequential()\n", 132 | "model.add(Conv2D(32,3,3, activation='relu', input_shape=input_shape))\n", 133 | "model.add(Conv2D(64,3,3,activation='relu'))\n", 134 | "model.add(MaxPooling2D(pool_size=(2, 2)))\n", 135 | "model.add(Dropout(0.25))\n", 136 | "model.add(Flatten())\n", 137 | "model.add(Dense(128, activation='relu'))\n", 138 | "model.add(Dropout(0.5))\n", 139 | "model.add(Dense(num_classes))\n", 140 | "model.add(Activation('softmax'))\n", 141 | "\n", 142 | "model.compile(loss='categorical_crossentropy',\n", 143 | " optimizer=keras.optimizers.Adadelta(),\n", 144 | " metrics=['accuracy'])" 145 | ] 146 | }, 147 | { 148 | "cell_type": "markdown", 149 | "metadata": {}, 150 | "source": [ 151 | "#### Train cumberSome Model " 152 | ] 153 | }, 154 | { 155 | "cell_type": "raw", 156 | "metadata": { 157 | "collapsed": true 158 | }, 159 | "source": [ 160 | "model.fit(x_train, y_train,\n", 161 | " batch_size=batch_size,\n", 162 | " nb_epoch=epochs,\n", 163 | " verbose=0,\n", 164 | " validation_data=(x_test, y_test))\n", 165 | "score = model.evaluate(x_test, y_test, verbose=0)\n", 166 | "print('Test loss:', score[0])\n", 167 | "print('Test accuracy:', score[1])\n", 168 | "\n", 169 | "\n", 170 | "print (\"Saving Model weights\")\n", 171 | "model.save_weights(\"stockweighs.h5\")\n", 172 | "print (\"Model Weights Saved\")" 173 | ] 174 | }, 175 | { 176 | "cell_type": "code", 177 | "execution_count": 3, 178 | "metadata": { 179 | "collapsed": false 180 | }, 181 | "outputs": [ 182 | { 183 | "name": "stdout", 184 | "output_type": "stream", 185 | "text": [ 186 | "____________________________________________________________________________________________________\n", 187 | "Layer (type) Output Shape Param # Connected to \n", 188 | 
"====================================================================================================\n", 189 | "convolution2d_1 (Convolution2D) (None, 26, 26, 32) 320 convolution2d_input_1[0][0] \n", 190 | "____________________________________________________________________________________________________\n", 191 | "convolution2d_2 (Convolution2D) (None, 24, 24, 64) 18496 convolution2d_1[0][0] \n", 192 | "____________________________________________________________________________________________________\n", 193 | "maxpooling2d_1 (MaxPooling2D) (None, 12, 12, 64) 0 convolution2d_2[0][0] \n", 194 | "____________________________________________________________________________________________________\n", 195 | "dropout_1 (Dropout) (None, 12, 12, 64) 0 maxpooling2d_1[0][0] \n", 196 | "____________________________________________________________________________________________________\n", 197 | "flatten_1 (Flatten) (None, 9216) 0 dropout_1[0][0] \n", 198 | "____________________________________________________________________________________________________\n", 199 | "dense_1 (Dense) (None, 128) 1179776 flatten_1[0][0] \n", 200 | "____________________________________________________________________________________________________\n", 201 | "dropout_2 (Dropout) (None, 128) 0 dense_1[0][0] \n", 202 | "____________________________________________________________________________________________________\n", 203 | "dense_2 (Dense) (None, 10) 1290 dropout_2[0][0] \n", 204 | "____________________________________________________________________________________________________\n", 205 | "activation_1 (Activation) (None, 10) 0 dense_2[0][0] \n", 206 | "====================================================================================================\n", 207 | "Total params: 1,199,882\n", 208 | "Trainable params: 1,199,882\n", 209 | "Non-trainable params: 0\n", 210 | "____________________________________________________________________________________________________\n" 211 | ] 
212 | } 213 | ], 214 | "source": [ 215 | "model.summary()" 216 | ] 217 | }, 218 | { 219 | "cell_type": "markdown", 220 | "metadata": {}, 221 | "source": [ 222 | "### Load pre-trained weights and Evaluate Model" 223 | ] 224 | }, 225 | { 226 | "cell_type": "code", 227 | "execution_count": 4, 228 | "metadata": { 229 | "collapsed": false 230 | }, 231 | "outputs": [ 232 | { 233 | "name": "stdout", 234 | "output_type": "stream", 235 | "text": [ 236 | "('Test loss:', 0.028270393712621807)\n", 237 | "('Test accuracy:', 0.99139999999999995)\n" 238 | ] 239 | } 240 | ], 241 | "source": [ 242 | "model.load_weights('stockweighs.h5')\n", 243 | "score = model.evaluate(x_test, y_test, verbose=0)\n", 244 | "print('Test loss:', score[0])\n", 245 | "print('Test accuracy:', score[1])" 246 | ] 247 | }, 248 | { 249 | "cell_type": "markdown", 250 | "metadata": {}, 251 | "source": [ 252 | "### Parameter Size of Initial Model and Memory Footprint \n", 253 | "- How many parameters does your initial model contain? \n", 254 | "- What is the memory footprint? \n", 255 | "- What is the accuracy on the testing data set?"
256 | ] 257 | }, 258 | { 259 | "cell_type": "code", 260 | "execution_count": 5, 261 | "metadata": { 262 | "collapsed": false 263 | }, 264 | "outputs": [ 265 | { 266 | "name": "stdout", 267 | "output_type": "stream", 268 | "text": [ 269 | "('Memory footprint per Image Feed Forward ~= ', 4.5771865844726562, 'Mb')\n" 270 | ] 271 | } 272 | ], 273 | "source": [ 274 | "convparams = 320 + 18496\n", 275 | "trainable_params = model.count_params()\n", 276 | "footprint = trainable_params * 4\n", 277 | "\n", 278 | "print (\"Memory footprint per Image Feed Forward ~= \" , footprint / 1024.0 /1024.0 ,\"Mb\") # 2x Backprop\n" 279 | ] 280 | }, 281 | { 282 | "cell_type": "markdown", 283 | "metadata": {}, 284 | "source": [ 285 | "### Distillation of Knowledge in Final Dense Layers , Through Logit Regression " 286 | ] 287 | }, 288 | { 289 | "cell_type": "markdown", 290 | "metadata": {}, 291 | "source": [ 292 | "### Todo:\n", 293 | "\n", 294 | "- See if Redundant Conv-Layers Connections from TL,TR,BL,BR should be pruned. 
\n", 295 | "- see indices from karpathy slides and learn pruning connections " 296 | ] 297 | }, 298 | { 299 | "cell_type": "markdown", 300 | "metadata": {}, 301 | "source": [ 302 | "### Intermediate Outputs" 303 | ] 304 | }, 305 | { 306 | "cell_type": "code", 307 | "execution_count": 6, 308 | "metadata": { 309 | "collapsed": false 310 | }, 311 | "outputs": [], 312 | "source": [ 313 | "from keras import backend as K\n", 314 | "\n", 315 | "def prepare_softtargets(model,X):\n", 316 | " inp = model.input # input placeholder\n", 317 | " outputs = []\n", 318 | " for layer in model.layers[:]:\n", 319 | " if layer.name == 'flatten_1':\n", 320 | " outputs.append(layer.output)\n", 321 | " if layer.name == 'dense_2':\n", 322 | " outputs.append(layer.output)\n", 323 | " \n", 324 | " functor = K.function([inp]+ [K.learning_phase()], outputs ) # evaluation function\n", 325 | " layer_outs = functor([X, 1.])\n", 326 | " return np.array(layer_outs[0]) , np.array(layer_outs[1])" 327 | ] 328 | }, 329 | { 330 | "cell_type": "markdown", 331 | "metadata": {}, 332 | "source": [ 333 | "### Prepare Train Data for small model " 334 | ] 335 | }, 336 | { 337 | "cell_type": "code", 338 | "execution_count": 7, 339 | "metadata": { 340 | "collapsed": false 341 | }, 342 | "outputs": [ 343 | { 344 | "name": "stdout", 345 | "output_type": "stream", 346 | "text": [ 347 | "Batch # : 0\n", 348 | "Batch # : 1\n", 349 | "Batch # : 2\n", 350 | "Batch # : 3\n", 351 | "Batch # : 4\n", 352 | "Batch # : 5\n", 353 | "Batch # : 6\n", 354 | "Batch # : 7\n", 355 | "Batch # : 8\n", 356 | "Batch # : 9\n", 357 | "Batch # : 10\n", 358 | "Batch # : 11\n", 359 | "Batch # : 12\n", 360 | "Batch # : 13\n", 361 | "Batch # : 14\n", 362 | "Batch # : 15\n", 363 | "Batch # : 16\n", 364 | "Batch # : 17\n", 365 | "Batch # : 18\n", 366 | "Batch # : 19\n", 367 | "Batch # : 20\n", 368 | "Batch # : 21\n", 369 | "Batch # : 22\n", 370 | "Batch # : 23\n", 371 | "Batch # : 24\n", 372 | "Batch # : 25\n", 373 | "Batch # : 26\n", 374 | 
"Batch # : 27\n", 375 | "Batch # : 28\n", 376 | "Batch # : 29\n", 377 | "Batch # : 30\n", 378 | "Batch # : 31\n", 379 | "Batch # : 32\n", 380 | "Batch # : 33\n", 381 | "Batch # : 34\n", 382 | "Batch # : 35\n", 383 | "Batch # : 36\n", 384 | "Batch # : 37\n", 385 | "Batch # : 38\n", 386 | "Batch # : 39\n", 387 | "Batch # : 40\n", 388 | "Batch # : 41\n", 389 | "Batch # : 42\n", 390 | "Batch # : 43\n", 391 | "Batch # : 44\n", 392 | "Batch # : 45\n", 393 | "Batch # : 46\n", 394 | "Batch # : 47\n", 395 | "Batch # : 48\n", 396 | "Batch # : 49\n", 397 | "Batch # : 50\n", 398 | "Batch # : 51\n", 399 | "Batch # : 52\n", 400 | "Batch # : 53\n", 401 | "Batch # : 54\n", 402 | "Batch # : 55\n", 403 | "Batch # : 56\n", 404 | "Batch # : 57\n", 405 | "Batch # : 58\n", 406 | "Batch # : 59\n", 407 | "(60000, 9216)\n", 408 | "(60000, 10)\n", 409 | "clean up \n" 410 | ] 411 | } 412 | ], 413 | "source": [ 414 | "lastconv_out = []\n", 415 | "logit_out = []\n", 416 | "for i in range(0,60):\n", 417 | " print \"Batch # : \",i\n", 418 | " l,l2 = ( prepare_softtargets(model,x_train[i*1000:(i+1)*1000]))\n", 419 | " lastconv_out.append(l)\n", 420 | " logit_out.append(l2)\n", 421 | " #lastconv_out , logit_out = prepare_softtargets(model,x_train[i*1000:(i+1)*1000])\n", 422 | "\n", 423 | "# lastconv_out.shape , logit_out.shape\n", 424 | "lastconv_out = np.array(lastconv_out)\n", 425 | "logit_out = np.array(logit_out)\n", 426 | "lastconv_out = lastconv_out.reshape((60000 , 9216))\n", 427 | "logit_out = logit_out.reshape((60000 , 10))\n", 428 | "print lastconv_out.shape\n", 429 | "print logit_out.shape\n", 430 | "\n", 431 | "print \"clean up \" \n", 432 | "x_train = 0 " 433 | ] 434 | }, 435 | { 436 | "cell_type": "markdown", 437 | "metadata": {}, 438 | "source": [ 439 | "### Write transferset to file " 440 | ] 441 | }, 442 | { 443 | "cell_type": "code", 444 | "execution_count": 8, 445 | "metadata": { 446 | "collapsed": true 447 | }, 448 | "outputs": [], 449 | "source": [ 450 | "h5f = 
h5py.File('lastconv_out.h5', 'w')\n", 451 | "h5f.create_dataset('dataset_1', data=lastconv_out)\n", 452 | "h5f.close()\n", 453 | "\n", 454 | "h5f2 = h5py.File('logit_out.h5', 'w')\n", 455 | "h5f2.create_dataset('dataset_1', data=logit_out)\n", 456 | "h5f2.close()" 457 | ] 458 | }, 459 | { 460 | "cell_type": "markdown", 461 | "metadata": {}, 462 | "source": [ 463 | "### Preparing Soft-targets and inputs for Evalutation of Model on Totally Held out x_test " 464 | ] 465 | }, 466 | { 467 | "cell_type": "code", 468 | "execution_count": 9, 469 | "metadata": { 470 | "collapsed": false 471 | }, 472 | "outputs": [ 473 | { 474 | "name": "stdout", 475 | "output_type": "stream", 476 | "text": [ 477 | "Batch # : 0\n", 478 | "Batch # : 1\n", 479 | "Batch # : 2\n", 480 | "Batch # : 3\n", 481 | "Batch # : 4\n", 482 | "Batch # : 5\n", 483 | "Batch # : 6\n", 484 | "Batch # : 7\n", 485 | "Batch # : 8\n", 486 | "Batch # : 9\n", 487 | "(10000, 9216)\n", 488 | "(10000, 10)\n" 489 | ] 490 | } 491 | ], 492 | "source": [ 493 | "# free up memory\n", 494 | "lastconv_out = 0\n", 495 | "logit_out = 0 \n", 496 | "\n", 497 | "test_lastconv_out = []\n", 498 | "test_logit_out = []\n", 499 | "\n", 500 | "for i in range(0,10):\n", 501 | " print \"Batch # : \",i\n", 502 | " l,l2 = prepare_softtargets(model,x_test[i*1000:(i+1)*1000])\n", 503 | " test_lastconv_out.append(l)\n", 504 | " test_logit_out.append(l2)\n", 505 | " \n", 506 | "\n", 507 | "# lastconv_out.shape , logit_out.shape\n", 508 | "test_lastconv_out = np.array(test_lastconv_out)\n", 509 | "test_logit_out = np.array(test_logit_out)\n", 510 | "\n", 511 | "test_lastconv_out = test_lastconv_out.reshape((10000 , 9216))\n", 512 | "test_logit_out = test_logit_out.reshape((10000 , 10))\n", 513 | "\n", 514 | "print test_lastconv_out.shape\n", 515 | "print test_logit_out.shape\n" 516 | ] 517 | }, 518 | { 519 | "cell_type": "code", 520 | "execution_count": 10, 521 | "metadata": { 522 | "collapsed": true 523 | }, 524 | "outputs": [], 525 | "source": 
[ 526 | "h5f = h5py.File('test_lastconv_out.h5', 'w')\n", 527 | "h5f.create_dataset('dataset_1', data=test_lastconv_out)\n", 528 | "h5f.close()\n", 529 | "\n", 530 | "h5f2 = h5py.File('test_logit_out.h5', 'w')\n", 531 | "h5f2.create_dataset('dataset_1', data=test_logit_out)\n", 532 | "h5f2.close()" 533 | ] 534 | }, 535 | { 536 | "cell_type": "markdown", 537 | "metadata": {}, 538 | "source": [ 539 | "### Experiment on different Compression Rates and Architectures (number of Hidden layer Neurons) for plotting results " 540 | ] 541 | }, 542 | { 543 | "cell_type": "raw", 544 | "metadata": { 545 | "collapsed": false 546 | }, 547 | "source": [ 548 | "## Logit Regression Method \n", 549 | "\n", 550 | "results = []\n", 551 | "\n", 552 | "for HiddenNeuron in [4,5,6,7,8,9,10,11,14,16,20]:\n", 553 | " \n", 554 | " # Load input,target to studentModel \n", 555 | " h5f = h5py.File('lastconv_out.h5' , 'r')\n", 556 | " lastconv_out = h5f['dataset_1'][:]\n", 557 | " h5f.close()\n", 558 | " \n", 559 | " h5f2 = h5py.File('logit_out.h5' , 'r')\n", 560 | " logit_out = h5f2['dataset_1'][:]\n", 561 | " h5f2.close()\n", 562 | "\n", 563 | " student_model = Sequential()\n", 564 | " student_model.add(Dense(HiddenNeuron,input_dim=9216,activation='relu'))\n", 565 | " student_model.add(Dropout(0.5))\n", 566 | " student_model.add(Dense(num_classes))\n", 567 | "\n", 568 | " student_model.compile(loss='mse',\n", 569 | " optimizer=keras.optimizers.Adadelta(),\n", 570 | " metrics=['accuracy'])\n", 571 | "\n", 572 | " student_model.fit(lastconv_out, logit_out,nb_epoch=10,verbose=0)\n", 573 | "# student_model.save_weights(\"student_weights_\"+str(HiddenNeuron)+\"hidden_0.5_dropout.h5\")\n", 574 | " \n", 575 | " # Compression Rate from Number of Parameters Reduced\n", 576 | " print \"HiddenNeurons : \" , HiddenNeuron\n", 577 | " print \"Initial Model Parameters : \" , model.count_params()\n", 578 | " print \"Compressed Model parameters: \", student_model.count_params() + convparams\n", 579 | " \n", 580 
| " compressionRate = model.count_params() / np.float(student_model.count_params() + convparams)\n", 581 | " print \"Compression Rate : \" , compressionRate\n", 582 | " \n", 583 | " lastconv_out = 0\n", 584 | " logit_out = 0 \n", 585 | " \n", 586 | " h5f = h5py.File('test_lastconv_out.h5' , 'r')\n", 587 | " test_lastconv_out = h5f['dataset_1'][:]\n", 588 | " h5f.close()\n", 589 | " h5f2 = h5py.File('test_logit_out.h5' , 'r')\n", 590 | " test_logit_out = h5f2['dataset_1'][:]\n", 591 | " h5f2.close()\n", 592 | " \n", 593 | " pred = student_model.predict(test_lastconv_out)\n", 594 | " probs = softmax_c(pred)\n", 595 | " pred_classes = np.argmax(probs,axis=1)\n", 596 | "\n", 597 | " accuracy_student = metrics.accuracy_score(y_pred=pred_classes,y_true=np.argmax(y_test,axis=1))\n", 598 | " print \"Accuracy : \" , accuracy_student\n", 599 | " \n", 600 | " out = {\n", 601 | " \"HiddenNeuron\" : HiddenNeuron,\n", 602 | " \"compressionRate\" : compressionRate,\n", 603 | " \"nparams_student\" : student_model.count_params() + convparams,\n", 604 | " \"accuracy_student\": accuracy_student\n", 605 | " }\n", 606 | " \n", 607 | " student_model = 0 \n", 608 | " lastconv_out = 0\n", 609 | " logit_out = 0 \n", 610 | " test_lastconv_out = 0\n", 611 | " test_logit_out = 0 \n", 612 | " \n", 613 | " results.append(out)\n", 614 | " # Free-up train Set " 615 | ] 616 | }, 617 | { 618 | "cell_type": "markdown", 619 | "metadata": {}, 620 | "source": [ 621 | "### Remove Below" 622 | ] 623 | }, 624 | { 625 | "cell_type": "code", 626 | "execution_count": 11, 627 | "metadata": { 628 | "collapsed": false 629 | }, 630 | "outputs": [ 631 | { 632 | "name": "stdout", 633 | "output_type": "stream", 634 | "text": [ 635 | "HiddenNeurons : 2\n", 636 | "Initial Model Parameters : 1199882\n", 637 | "Compressed Model parameters: 37280\n", 638 | "Compression Rate : 32.1856759657\n", 639 | "Accuracy : 0.4211\n", 640 | "HiddenNeurons : 3\n", 641 | "Initial Model Parameters : 1199882\n", 642
"Compressed Model parameters: 46507\n", 643 | "Compression Rate : 25.800030103\n", 644 | "Accuracy : 0.7185\n", 645 | "HiddenNeurons : 4\n", 646 | "Initial Model Parameters : 1199882\n", 647 | "Compressed Model parameters: 55734\n", 648 | "Compression Rate : 21.5287257329\n", 649 | "Accuracy : 0.9199\n", 650 | "HiddenNeurons : 5\n", 651 | "Initial Model Parameters : 1199882\n", 652 | "Compressed Model parameters: 64961\n", 653 | "Compression Rate : 18.4708055603\n", 654 | "Accuracy : 0.9423\n", 655 | "HiddenNeurons : 6\n", 656 | "Initial Model Parameters : 1199882\n", 657 | "Compressed Model parameters: 74188\n", 658 | "Compression Rate : 16.1735321076\n", 659 | "Accuracy : 0.9626\n", 660 | "HiddenNeurons : 7\n", 661 | "Initial Model Parameters : 1199882\n", 662 | "Compressed Model parameters: 83415\n", 663 | "Compression Rate : 14.3844872025\n", 664 | "Accuracy : 0.9699\n" 665 | ] 666 | } 667 | ], 668 | "source": [ 669 | "## Logit Regression Method \n", 670 | "\n", 671 | "results = []\n", 672 | "\n", 673 | "for HiddenNeuron in [2,3,4,5,6,7]:\n", 674 | " \n", 675 | " # Load input,target to studentModel \n", 676 | " h5f = h5py.File('lastconv_out.h5' , 'r')\n", 677 | " lastconv_out = h5f['dataset_1'][:]\n", 678 | " h5f.close()\n", 679 | " \n", 680 | " h5f2 = h5py.File('logit_out.h5' , 'r')\n", 681 | " logit_out = h5f2['dataset_1'][:]\n", 682 | " h5f2.close()\n", 683 | "\n", 684 | " student_model = Sequential()\n", 685 | " student_model.add(Dense(HiddenNeuron,input_dim=9216,activation='relu'))\n", 686 | " student_model.add(Dropout(0.2))\n", 687 | " student_model.add(Dense(num_classes))\n", 688 | "\n", 689 | " student_model.compile(loss='mse',\n", 690 | " optimizer=keras.optimizers.Adadelta(),\n", 691 | " metrics=['accuracy'])\n", 692 | "\n", 693 | " student_model.fit(lastconv_out, logit_out,nb_epoch=40,verbose=0)\n", 694 | "# student_model.save_weights(\"student_weights_\"+str(HiddenNeuron)+\"hidden_0.5_dropout.h5\")\n", 695 | " \n", 696 | " # Compression Rate from 
Number of Parameters Reduced\n", 697 | " print \"HiddenNeurons : \" , HiddenNeuron\n", 698 | " print \"Initial Model Parameters : \" , model.count_params()\n", 699 | " print \"Compressed Model parameters: \", student_model.count_params() + convparams\n", 700 | " \n", 701 | " compressionRate = model.count_params() / np.float(student_model.count_params() + convparams)\n", 702 | " print \"Compression Rate : \" , compressionRate\n", 703 | " \n", 704 | " lastconv_out = 0\n", 705 | " logit_out = 0 \n", 706 | " \n", 707 | " h5f = h5py.File('test_lastconv_out.h5' , 'r')\n", 708 | " test_lastconv_out = h5f['dataset_1'][:]\n", 709 | " h5f.close()\n", 710 | " h5f2 = h5py.File('test_logit_out.h5' , 'r')\n", 711 | " test_logit_out = h5f2['dataset_1'][:]\n", 712 | " h5f2.close()\n", 713 | " \n", 714 | " pred = student_model.predict(test_lastconv_out)\n", 715 | " probs = softmax_c(pred)\n", 716 | " pred_classes = np.argmax(probs,axis=1)\n", 717 | "\n", 718 | " accuracy_student = metrics.accuracy_score(y_pred=pred_classes,y_true=np.argmax(y_test,axis=1))\n", 719 | " print \"Accuracy : \" , accuracy_student\n", 720 | " \n", 721 | " out = {\n", 722 | " \"HiddenNeuron\" : HiddenNeuron,\n", 723 | " \"compressionRate\" : compressionRate,\n", 724 | " \"nparams_student\" : student_model.count_params() + convparams,\n", 725 | " \"accuracy_student\": accuracy_student\n", 726 | " }\n", 727 | " \n", 728 | " student_model = 0 \n", 729 | " lastconv_out = 0\n", 730 | " logit_out = 0 \n", 731 | " test_lastconv_out = 0\n", 732 | " test_logit_out = 0 \n", 733 | " \n", 734 | " results.append(out)\n", 735 | " # Free-up train Set" 736 | ] 737 | }, 738 | { 739 | "cell_type": "code", 740 | "execution_count": 13, 741 | "metadata": { 742 | "collapsed": false 743 | }, 744 | "outputs": [ 745 | { 746 | "data": { 747 | "text/plain": [ 748 | "[{'HiddenNeuron': 2,\n", 749 | " 'accuracy_student': 0.42109999999999997,\n", 750 | " 'compressionRate': 32.185675965665233,\n", 751 | " 'nparams_student': 37280},\n", 
752 | " {'HiddenNeuron': 3,\n", 753 | " 'accuracy_student': 0.71850000000000003,\n", 754 | " 'compressionRate': 25.800030102995247,\n", 755 | " 'nparams_student': 46507},\n", 756 | " {'HiddenNeuron': 4,\n", 757 | " 'accuracy_student': 0.91990000000000005,\n", 758 | " 'compressionRate': 21.528725732945777,\n", 759 | " 'nparams_student': 55734},\n", 760 | " {'HiddenNeuron': 5,\n", 761 | " 'accuracy_student': 0.94230000000000003,\n", 762 | " 'compressionRate': 18.470805560259233,\n", 763 | " 'nparams_student': 64961},\n", 764 | " {'HiddenNeuron': 6,\n", 765 | " 'accuracy_student': 0.96260000000000001,\n", 766 | " 'compressionRate': 16.173532107618481,\n", 767 | " 'nparams_student': 74188},\n", 768 | " {'HiddenNeuron': 7,\n", 769 | " 'accuracy_student': 0.96989999999999998,\n", 770 | " 'compressionRate': 14.384487202541509,\n", 771 | " 'nparams_student': 83415}]" 772 | ] 773 | }, 774 | "execution_count": 13, 775 | "metadata": {}, 776 | "output_type": "execute_result" 777 | } 778 | ], 779 | "source": [ 780 | "results " 781 | ] 782 | }, 783 | { 784 | "cell_type": "code", 785 | "execution_count": 14, 786 | "metadata": { 787 | "collapsed": false 788 | }, 789 | "outputs": [ 790 | { 791 | "data": { 792 | "image/png": 
qlh1L6ySm+c/ozkBM7OHRR0LJFmtfiYvpmlmFmR\nmZ3deJ5zbmdrCr6ZTTWzlWa22swO2ENgZteZ2SL/sszM6sysT6T3IyLSHhpG7TvusEyuf3op/1qx\nOehIIs1qcdH3i/o0IK0t7tgfyvce4Gt4v9Z3npnt93NWzrk7nHMTnXMT8foR/Ns5p46CItJpJMbF\ncv9FkxndP40r/7yQBR9vDzqSSJMi7b3/Pt4APG0hH1jtnFvrnKsBHgdOa6b9ecBjbXTfIiJtJiUx\njtkz8+iXlkThnBJWba4IOpJIWJEW/duBK8xsZBvc90C8U/4abPCnHcDMegJTgb82Mf9SM5tvZvNL\nS/XDfyLS8TJTEplXOIX42BimFxXz6Y49QUcSOUCkvfePwCvUS83sBWAV0Hggauec+0VbhAvxTeDt\npnbtO+cexB8WODc3VyfNikggBmf0ZG5hHuc+8F9mFBXz5OXH0KtnQtCxRPaJtOjfHPL/6U20cUBL\niv5GILSra44/LZxpaNe+iHQBYwak8+D0yRQUlTBr7nz+NGsKPRL0kyTSOUS6e39YCy7DW3hbJcDh\nZjbMzBLwCvvzjRv5pwOeADwXYVYRkUAcOyKT302byMJPtnPVowuprdNwvdI5RFT0nXMft+TSwtuq\nBa4CXgY+AJ5wzi03s8vN7PKQpqcDr/hjAoiIdAlfH9efn582ltdWbOHHzyzFOR15lOBFunu/TTnn\nXgRebDTt/kbX5wBzOi6ViEjbuOjoIZRWVHPXa6vISk3kupOPCDqSRLmIir6ZtWRcfeecm9XKPCIi\n3cr3Tjqc0opq7nl9DZkpicz8n2FBR5IoFuk3/YIWtHGAir6ICN5wvb/81li27arm5y+8T0ZKIqdO\nGBB0LIlSkR7Tj2l8AeKBUcBDwH+B3u2QU0Sky4qNMX4/7SjyhvThmicW8daqsqAjSZSKtPf+AZxz\ndc65Vc65y4CtwG2HHktEpHtJio/loRm5jMhK4bJH5rN0w86gI0kUOuSi38hLwJltfJsiIt1Ceo94\n5hbm06tnAgWzi1lXppOSpGO1ddHvA6S08W2KiHQb2WlJzJuVT71zTC96ly0VVUFHkijSJkXfzHqZ\n2VnA94AFbXGbIiLd1YisFGbPzKesooYZRSWUV+0NOpJEiYiKvpnV+79pv98F71j+E0A98P32CCoi\n0p1MHNSL+y6cxKrNFVw6bz5Ve+uCjiRRINJT9ubhnZIXygHbgA+Bx5xz+k1JEZEWOHFUX+44ezzf\n+8tivv/EIv5w3iRiYyzoWNKNRVT0nXMF7ZRDRCQqnX5UDlsra/jl3z8gI3k5Pz9tDGYq/NI+Ah2G\nV0RE4OIvDKe0opoH3lxLVmoi//vlw4OOJN1UpMf0v21mrzYz/xUzu+zQY4mIRJcfTj2CMyYN5M5/\nfsij734SdBzppiLtvV8ArGpm/odAYavTiIhEqZgY47Yzx3PiqCxueHYpLy/fFHQk6YYiLfqHA0ub\nmb/cbyMiIhGKj43h3gsmMT6nF1c/9h7vrt0adCTpZiIt+vFAUjPzkw4yX0REmtEzIY7ZBXkM6t2D\ni+fNZ8Wm8qAjSTcSadH/EPhKM/O/CqxpfRwREemdnMC8WVNITohj+sPFrN+2O+hI0k1EWvQfA75q\nZr8ws4SGiWYWb2Y/wyv6j7ZlQBGRaDSwVw/mFuZTtbeOGUXFbNtVE3Qk6QYiLfq/Bd4EfgJ8amZv\nmdlbwGfAjcBbwP+1bUQRkeg0ql8qDxfksXHHHmbOKWFXdW3QkaSLi6joO+f24n2bvx7YABzlX9YD\nPwBOcs5pc1REpI3kDe3D3edPYumGHVzx54XsrasPOpJ0YRH/4I5zbq9z7nbn3ETnXLJ/Oco59xt/\no0BERNrQV0Zn8+vTx/Hmh6X84Kkl1Nc3Hg1dpGU0Ip+ISBcwLX8wZZXV/OaVD8lMSeAn3xgddCTp\ngiIdke9nZ
rasmflLzOyGQ48lIiKNffuLhzHjmCE89J91PPimTpSSyEW6e/904J/NzP8ncFbr44iI\nSFPMjJu+OYZvjO/Pr19cwV8XbAg6knQxkRb9YcCKZuav9NuIiEg7iI0x7jxnAseOyOAHf13C6yu2\nBB1JupCIO/IBvZqZ1xuIbWUWERFpgcS4WB64aDJH9Evlyj8vZOEn24OOJF1EpEV/OXBauBnm/QD0\nqTS/J0BERNpAalI8c2bm0zctkcI5JazeUhF0JOkCIi36DwNHm9kcM8tqmOj/XwQc7bcREZF2lpWa\nyLzCfOJijOkPF7NpZ1XQkaSTi3RwnofwhtmdDmwysw1mtgHYBMwAnnDO3df2MUVEJJwhGcnMmZlP\neVUtM4qK2blbw6VI01ozOM+FwDTgBWCnf3keOMc5d17bxhMRkYMZOzCdBy+azLqyXVw8r4SqvXVB\nR5JOqjUd+XDOPeGcO805N8a/nO6ce6qtw4mISMsce1gmd547gfkfb+eqR9+jVsP1ShitGpHPzHKB\nKXi99RtvODjn3C8ONZiIiETmlPED2FpZw0+fX84Nzy7jljPG4fWxFvFEVPTNrAfwNN6P7hjg/L+E\n/O8AFX0RkQDMOHYoZZXV/OFfq8lKTeSar44KOpJ0IpF+078Jr+D/CngNeB2vA98W4EdAD7xOfiIi\nEpDvf2UkpRVe4c9MSWTGsUODjiSdRKTH9M8CnnTO3QQ0jMG/0Tn3MnASkAAUtF08ERGJlJnxy2+N\n5aQjs7n5b8t5YcmnQUeSTiLSoj8I+Lf/f0P30AQA51wt8Bhez34REQlQXGwMd59/FLlDevO9vyzi\n7dVlQUeSTiDSol/B54cEKoB6YEDI/J1AvzbIJSIihygpPpY/Ts9jWGYylz2ygGUbdwYdSQIWadFf\nA4wEcM7V4Q3LexbsG4b3DGB9WwYUEZHWS+8Zz7zCKaT3iKdgdjEfb90VdCQJUKRF/1XgTDNr+FGd\nB4CpZrYGWIV3XF/D8IqIdCL90pOYW5hPbb3jooeLKa2oDjqSBCTSon8r8EX80/Scc/cC1+Lt1t8O\n/Bi4vS0DiojIoTusbwqzC/IoraimYHYxFVUarjcaRTr2fqVzbqXfaa9h2p3OuUnOuTzn3G3OOdf2\nMUVE5FAdNbg39144iRWbKrjskQVU12q43mjTqmF4RUSka/riqL7cfuZ43lmzle8/sZj6en1Piyat\nGoZXRES6rjMn57B1VzW/fnEFmckJ3HzqGA3XGyVU9EVEotClx4+gtKKah/6zjr5pSXz7i4cFHUk6\ngIq+iEiU+tHXjqSssoY7Xl5JRnIC0/IHBx1J2pmKvohIlIqJMW4/azzbdtXw42eWkpGSyFdGZwcd\nS9pRoB35zGyqma00s9Vmdn0TbU40s0VmttzM/h2ujYiItE58bAz3XjCJcTm9uOrRhZR8tC3oSNKO\nAiv6/gA/9wBfA0YD55nZ6EZtegH3Aqc658YAZ3d4UBGRbi45MY7ZBXkM7N2DWXNKWLmpIuhI0k6C\n/KafD6x2zq11ztUAjwOnNWpzPvC0c+4TAOfclg7OKCISFfokJzCvMJ8eCbFML3qXDdt3Bx1J2kGQ\nRX8g+4/Tv8GfFmok0NvM3jCzBWY2PdwNmdmlZjbfzOaXlpa2U1wRke4tp3dP5hbms7umjulFxWzb\nVRN0JGljnX1wnjhgMvAN4GTgRjMb2biRc+5B51yucy43KyurozOKiHQbR/RL44/Tc9mwfQ+Fc0rY\nXVN78IWkywiy6G8EBoVcz/GnhdoAvOyc2+WcKwPeBCZ0UD4Rkag0ZXgGfzjvKJZs2MGVf17I3rr6\noCNJGwmy6JcAh5vZMDNLAKYBzzdq8xxwnJnFmVlPYArwQQfnFBGJOieP6ccvvzWON1aW8sOnlmi4\n3m4isPP0nXO1ZnYV8DIQCxQ555ab2eX+/Pudcx+Y2UvAEqAe+KNzbllQmUV
Eosn5UwZTVlnNnf/8\nkKzURH709SODjiSHKNDBeZxzLwIvNpp2f6PrdwB3dGQuERHxXP2lwyitqOaBN9eSmZLIJccPDzqS\nHAKNyCciIk0yM24+dQxbd1Xzqxc/IDM1gdOPygk6lrRSZ++9LyIiAYuNMX577kSOGZ7BdU8u4Y2V\nGjKlq1LRFxGRg0qMi+WB6ZMZmZ3KlX9eyKL1O4KOJK2goi8iIi2SlhTPnMI8MlISKJxTwprSyqAj\nSYRU9EVEpMX6pibxSOEUDJj+cDGby6uCjiQRUNEXEZGIDM1MZs7MfHbsrmFGUTE79+wNOpK0kIq+\niIhEbFxOOg9clMua0koumTufqr11QUeSFlDRFxGRVjnu8EzuPGciJR9v438fe486jdrX6anoi4hI\nq31zwgB+espoXnl/Mzc8uwznVPg7Mw3OIyIih6Tgf4ZRWlnNPa+vISs1ke9/5YAfQ5VOQkVfREQO\n2bVfHUVpRTV3vbaKrJQELjpmaNCRJAwVfREROWRmxq9PH8e2XTXc9PxyMlIS+fq4/kHHkkZ0TF9E\nRNpEXGwMfzhvEpMG9+a7jy/inTVlQUeSRlT0RUSkzfRIiOXhGbkMyejJpfMWsGzjzqAjSQgVfRER\naVO9eiYwb1Y+aUlxFMwu4ZOtu4OOJD4VfRERaXP903swb1Y+tfX1TC96l7LK6qAjCSr6IiLSTg7r\nm8rDM/LYVF7FzNklVFbXBh0p6qnoi4hIu5k8pDf3XjCJ9z8r5/JHFlBTWx90pKimoi8iIu3qS0dk\nc9uZ43lrdRnXPLmYeg3XGxidpy8iIu3urMk5lFVWc+s/VpCRnMBPvzkaMws6VtRR0RcRkQ5x2fHD\nKa2o5uG31tE3LZErTzws6EhRR0VfREQ6hJnxk68fSVllNbe/tJLMlETOyR0UdKyooqIvIiIdJibG\nuOOsCWzbVcOPnl5Kn54JnDQ6O+hYUUMd+UREpEMlxMVw/4WTGTsgjW8/upAFH28LOlLUUNEXEZEO\nl5wYR1FBHgN69aBwznw+3FwRdKSooKIvIiKByEhJZF5hPolxMcwoKubTHXuCjtTtqeiLiEhgBvXp\nydzCfCqrapleVMz2XTVBR+rWVPRFRCRQR/ZP46EZuXyybTeFc0vYXaPhetuLir6IiATu6OEZ3DVt\nIovW7+CqR99jb52G620PKvoiItIpTB3bn1+cNpZ/rdjC9X9dinMarret6Tx9ERHpNC48eghlldX8\n7tVV9E1L5IdTjwg6Ureioi8iIp3Kd758OKUV1dz3xhoyUxKZddywoCN1Gyr6IiLSqZgZPz9tLFsr\na/jFC++TmZLAaRMHBh2rW9AxfRER6XRiY4zfTZvIlGF9uPbJxbz5YWnQkboFFX0REemUkuJjeWhG\nLof1TeXyPy1g8fodQUfq8lT0RUSk00pLimfuzDz6JCcwc04Ja0srg47Upanoi4hIp9Y3LYlHZk3B\ngOlFxWwprwo6Upeloi8iIp3esMxkZs/MY9uuGqYXFVNetTfoSF2Sir6IiHQJ43N68cBFk1lTWskl\nc+dTtbcu6Ehdjoq+iIh0GV84PIvfnD2Bd9dt47uPL6KuXqP2RUJFX0REupTTJg7kplNG89LyTdz4\n3DIN1xsBDc4jIiJdTuFxwyit9Ebt65uayHdPGhl0pC5BRV9ERLqkH5w8itIKb5z+zJRELjx6SNCR\nOj0VfRER6ZLMjFvOGMe2XTXc+NwyMlMSmDq2f9CxOjUd0xcRkS4rPjaGe86fxMRBvfjfxxfx37Vb\ng47UqQVa9M1sqpmtNLPVZnZ9mPknmtlOM1vkX24KIqeIiHRePRJiKZqRx+A+Pblk7nze/7Q86Eid\nVmBF38xigXuArwGjgfPMbHSYpv9xzk30Lz/v0JAiItIl9E5OYF5hPilJccyYXcz6bbuDjtQpBflN\nPx9Y7Zxb65yrAR4HTgswj4iIdGEDevV
gbmE+NbX1TC8qZmtlddCROp0gi/5AYH3I9Q3+tMaONbMl\nZvYPMxsT7obM7FIzm29m80tL9fOLIiLRamR2KkUFuXy2cw8z55Swq7o26EidSmfvyLcQGOycGw/8\nAXg2XCPn3IPOuVznXG5WVlaHBhQRkc5l8pA+3H3eJJZ/Ws7lf1pATW190JE6jSCL/kZgUMj1HH/a\nPs65cudcpf//i0C8mWV2XEQREemKThqdzS1njOM/q8q47qnF1Gu4XiDY8/RLgMPNbBhesZ8GnB/a\nwMz6AZudc87M8vE2UnQ+hoiIHNQ5uYMoq6zm9pdWkpGcyI2nHImZBR0rUIEVfedcrZldBbwMxAJF\nzrnlZna5P/9+4CzgCjOrBfYA05wGWRYRkRa64oQRlFZUU/T2OvqmJXL5CSOCjhSoQEfk83fZv9ho\n2v0h/98N3N3RuUREpHswM278xmi2VtZw6z9WkJGcwNm5gw6+YDelYXhFRKRbi4kxfnP2BLbtquH6\np5eSkZLAl47IDjpWIDp7730REZFDlhAXw/0XTWZ0/zSu/PNCFny8PehIgVDRFxGRqJCSGMfsmXn0\nS0uicE4JqzZXBB2pw6noi4hI1MhMSeSRWVNIiIthelExn+3cE3SkDqWiLyIiUWVQn57MmZlHRVUt\n0x8uZsfumqAjdRgVfRERiTpjBqTz4PTJfLx1NxfPnc+emrqgI3UIFX0REYlKx47I5HfTJrLgk+1c\n/dhCauu6/3C9KvoiIhK1vj6uPz8/bSyvfrCFHz+zlO4+/pvO0xcRkah20dFDKK2o5q7XVpGVmsh1\nJx8RdKR2o6IvIiJR73snHU5pRTX3vL6GzJREZv7PsKAjtQsVfRERiXpmxi+/NZZtu6r5+Qvvk5GS\nyKkTBgQdq83pmL6IiAgQG2P8ftpR5A3twzVPLOKtVWVBR2pzKvoiIiK+pPhYHpqey4isFC57ZD5L\nN+wMOlKbUtEXEREJkd4jnrmF+fTqmUDB7GLWle0KOlKbUdEXERFpJDstiUdm5eOA6UXvsqWiKuhI\nbUJFX0REJIzhWSkUFeSxtbKGGUUllFftDTrSIVPRFxERacLEQb2478LJrNpcwaXz5lO1t2sP16ui\nLyIi0owTRmbxm7Mn8N+12/j+E4uoq++6o/bpPH0REZGD+NZRAymrrOaXf/+AzJTl/OzUMZhZ0LEi\npqIvIiLSAhd/YTilFdU88OZaslISufrLhwcdKWIq+iIiIi30w6lHUFpZzf/980MyUxM5L39w0JEi\noqIvIiLSQjExxm1njmfbrhp+8sxS+iQncPKYfkHHajF15BMREYlAfGwM914wifE5vbj6sfd4d+3W\noCO1mIq+iIhIhHomxDG7II9BvXtw8bz5rNhUHnSkFlHRFxERaYXeyQnMmzWF5IQ4pj9czPptu4OO\ndFAq+iIiIq00sFcP5hbmU7W3jhlFxWzbVRN0pGap6IuIiByCUf1Sebggj4079jBzTgm7qmuDjtQk\nFX0REZFDlDe0D3efP4mlG3ZwxZ8XsreuPuhIYanoi4iItIGvjM7mljPG8eaHpfzgqSXUd8LhenWe\nvoiISBs5N28wZZU13PHySjJTEvjJN0YHHWk/KvoiIiJt6MoTR1BaUc1D/1lHVmoilx4/IuhI+6jo\ni4iItCEz46ZTRlNaWc2vX1xBRnIiZ07OCToWoKIvIiLS5mJijDvPmcCO3TX84K9L6JOSwBdH9Q06\nljryiYiItIfEuFjuv3AyR/ZP5co/LeS9T7YHHUlFX0REpL2kJsUzuyCfvmmJFM4pYfWWykDzqOiL\niIi0o6zUROYV5hMbY8woKmbTzqrAsqjoi4iItLMhGcnMmZnPzj17mVFUzM7dewPJoaIvIiLSAcYO\nTOfBiybTq2c89S6YgXvUe19ERKSDHHtYJseMyMDMArl/fdMXERHpQEEVfFDRFxERiRoq+iIiIlFC\nRV9
ERCRKqOiLiIhECRV9ERGRKBFo0TezqWa20sxWm9n1zbTLM7NaMzurI/OJiIh0J4EVfTOLBe4B\nvgaMBs4zs9FNtLsNeKVjE4qIiHQvQX7TzwdWO+fWOudqgMeB08K0uxr4K7ClI8OJiIh0N0EW/YHA\n+pDrG/xp+5jZQOB04L7mbsjMLjWz+WY2v7S0tM2DioiIdAedvSPf74AfOufqm2vknHvQOZfrnMvN\nysrqoGgiIiJdS5Bj728EBoVcz/GnhcoFHveHLMwEvm5mtc65ZzsmooiISPcRZNEvAQ43s2F4xX4a\ncH5oA+fcsIb/zWwO8IIKvoiISOsEVvSdc7VmdhXwMhALFDnnlpvZ5f78+1tzuwsWLCgzs4/bMOqh\nyATKgg7RzrSO3UN3X8fuvn6gdewuWruOQ1rSyFxAv+kbDcxsvnMuN+gc7Unr2D1093Xs7usHWsfu\nor3XsbN35BMREZE2oqIvIiISJVT029eDQQfoAFrH7qG7r2N3Xz/QOnYX7bqOOqYvIiISJfRNX0RE\nJJ3a0xsAAAxWSURBVEqo6IuIiEQJFf0ImVmRmW0xs2Uh0242s41mtsi/fL2JZVv0U8JBa2Id/xKy\nfh+Z2aImlv3IzJb67eZ3XOrImNkgM3vdzN43s+Vm9h1/eh8z++f/b+/Mg++czjj++cpCSGpLJhNl\n/FCNoC1pa6nYRihpEcaUSpGWGhTVKtXQNlqNrZZoB7UHsbQhES1jCaFmGkuDSBMkJNrIz88aRPyQ\nePrHc27yevPe64rfdu99PjNn3vee85zlOeeee96zvPeRNCdd1y0Tv8u3ZQUdz5f0rKQZkiZKWqdM\n/C7flhV0rJs+WUHHuumTktaQ9Jikp5OOZyb/euqP5XTs2P5oZuE+gwN2AQYDMzN+o4FffEq8bsAL\nwKZAT+BpYMvO1qdaHXPhFwC/KRM2H+jb2TpUoeMAYHC67wM8j5t4Pg84LfmfBpxbq21ZQce9gO7J\n/9wiHWulLSvoWDd9spyOOZma7pOAgN7pvgfwKLBDnfXHcjp2aH+Mmf5nxMweBt5chajVmhLudCrp\nKDeE8D3g5g4tVBtjZs1mNj3dvwvMxq087g+MS2LjgOEF0WuiLcvpaGb3mtnSJDYNt3tRk1Rox2qo\n6XYshddDnzRncfrYIzmjvvpjoY4d3R9j0G87TkjLM9eUWYL6VFPCNcLOQIuZzSkTbsD9kv4t6egO\nLNcqI6kJ2BZ/8u5vZs0p6BWgf0GUmmvLnI5ZfgTcXSZaTbVlgY511yfLtGNd9ElJ3dIWxavAfWZW\nd/2xjI5Z2r0/xqDfNlyGLy1tAzTjS231yvepPKMYYmbbAPsAP5G0S8cUa9WQ1Bu4DTjJzN7Jhpmv\nqdX8O63ldJR0OrAUGF8mas20ZYGOddcnK3xX66JPmtmyVM4Nge0kbZ0Lr/n+WEnHjuqPMei3AWbW\nkhrzY+BKfLkpTzWmhLs0kroDBwK3lpMxs5fT9VVgIsV10SWQ1AP/ER1vZrcn7xZJA1L4APyJPE/N\ntGUZHZE0EvguMCL9mK5ErbRlkY711icrtGNd9UkAM1sEPAjsTZ31xxI5HTu0P8ag3waUvpSJA4CZ\nBWLLTQlL6ombEp7cEeVrQ4YCz5rZgqJASWtJ6lO6xw+oFNVFp5P2Qa8GZpvZhZmgycAR6f4I4I6C\n6DXRluV0lLQ3cCqwn5ktKRO3Jtqygo510ycrfFehTvqkpH6lU+uSegF7As9SX/2xUMcO74+f9yRg\nozl8Ga0Z+AjfOzoSuAF4BpiBf9kGJNkNgLsycYfhJ29fAE7vbF0+i47J/zrgmJzsch3x5dSnk/tP\nF9dxCL5UOAN4KrlhwPrAFGAOcD+wXq22ZQUd5+J7oCW/y2u1LSvoWDd9spyOKawu+iTwVeDJpONM\n0psIddYfy+nYof0x/oY3CIIgCBqEWN4PgiAIggYhBv0gCIIgaBBi0
A+CIAiCBiEG/SAIgiBoEGLQ\nD4IgCIIGIQb9IAg6BEkm6brOLkcQNDIx6AdB0DBImpoePkruI0kL5WZqt/70FCqm3SQ36btNW5U3\nCNqa7p1dgCAIGoZewLLOLgTwAXBUuu8FfB34ITBM0jfM7LlVTLcJ+C1uArXQtn0QdDYx6AdBA5D+\nu72bmbV2Vhk6M+8cS83sxsznKyXNAsYCxwMndE6xgqD9ieX9IMggqaekUyU9JWmJpLclPSHp+Jxc\nk6QbJLVI+kDSC5LGSFozJzc6LSNvKeliSc0p3SmSBiaZAyVNl/S+pPlFZjNL++GShkqaltJ4RdLY\nZH2tKM+tJF0oaQHQCuyQkRkq6V5JiyS1yk3QHlOQ77ck3Z3yapX0sqS7JGXTWk/SRakOWiW9ITf/\neUqRDgV5HJXR/+1UriEV6mBHSQ9Jei/ldVW+DlaBKem6eS7PPpLOkvSopNdTW8+VdE62reUGUx5M\nH6/NbB9MzchI0rGpbpZIWizpQUm7f86yB0HVxEw/CBLJWMc9wG7AvcCN+GD5FdyS2Z+T3MbAY8Da\nwKX4/4LvBvwK2EnSHma2NJf8OGAxMAboB5wM3CPp18B5uCnYa3BbDn+RNMvMHsmlMRg4CLcadz2w\nO3AisLWkPc0tymUZD7yPm5U13J4C6aHicmAa8AfgPdz4x2WSNjOzU5LcQOA+3I75WKAFt2c+BPha\nig/wN2CXlOYMfMl8UKqT8wsrOyHpXNzYyGPAKKAPcDTwoKT9zeyuXJRtgL8D1wI3pTyOBD5O8VaV\nzdL1zZz/F/GtgNtSfkuBXVOZtwW+neQextt2FHAF8M/k35JJ6wbcDO6EVP7VgRHAfZIONLMuZyQm\nqEM62whBuHBdxeE/5AaMKQhbLXM/PskNy8mcn/yPzPiNTn53gtu6SP4nJv93gI0y/v3wB42bc2mX\nbIkPz/mPTf6HFOQ5Feiekx+Q0r+pQMex+J77prkyblehztZOMpdWUb8GXJf5PBAfrB8Bemb8NwAW\n4Xvj3XLxPwa2z6X7D9w4VO8qyjAVf/jqm9xGwPCUV1Gb9gR6FKTz+3zd4A8gBowskD8ghR2d8+8O\nPAHMy34/woVrLxfL+0GwghHAW8Dv8gGWZtGSVgP2A560lWehZ+OD0gEFaV9iZlnrVqWZ4GQz+18m\nn9eA58gtMyeeM7NJOb9z0rUoz4tt5RWHg/AZ5tWS+mYd/mCyGm6uFeDtdN1f0hoF6YOvJHwAbC+p\nqYxMOfYHBJxnZh+WPM1sIT4T3hifTWf5l5k9mvN7AB88q81/LeC15P6L2ybvCRyRb1Mz+9DMPgK3\nXS9p3VRX9yeR7avM8wfAu8CkXJ2vg9d7E8VtHgRtSizvB8EKNgeessoHzvoBvXHzlp/AzN6U1Iyb\nwczzYu7zW+k6r0D2LXzAyzO7IM9mSYvK5Pl8gd+gdL2/IKxE/3S9BR+sRgE/kzQN3/64xcxeSvl/\nKOkkfJVgnvxA3APAJDObsnLSn2CTdF2pLjN+m+Iz4RL5egR4I13X/5T8SrQC+6b79YDD8e2NwkmQ\npOOAY4CtCmTWrTLPQfjWRUsFmf4Ut1kQtBkx6AdBx1DuVbVy/mqDPJdUSPdw0h5/AS8CmNkHwJ6S\ntsP3rnfBV0FGSzrUzCYmucsl3QF8B9/vPgg4XtKtZnZIG+iRpdIrf9XW2TIzW/7QI2kCfk7gCknT\nzWxGJuzn+JmIe4FLgIXAh/he/3VUfxha+MrCoRVkZlaZVhCsMjHoB8EKnge2kLR6GvCKeA1fpt0q\nHyBpXXzPvL3e0R6U95A0AF8iLpoBFzEnXV/PDnyVMLPH8IN2SNoIeBI4C18WL8k0A1cBV0nqRjq0\nJukCM3u8TNKlMm8FvJAL2zIn026Y2ceSfgrMAv4I7JUJPgzf79/HMgclJe1dlFSFbOYAXwammdni\nz13oIFhFYk8/CFYwHl+uPSMfI
EmwfG//TmDbgh/+0/A+NZH2YaCk4Tm/X6Zrfq+/HH/F9+DPlNQr\nHyhpbUmrp/u+BfEX4A8+6yWZNZV7TdHMluGn+CnJlWEyPlCeIv8fgVIZBuB/lvMS/oDR7pjZHPx0\n/p651wWXpTIuX0WQ1B1v6zylwbxI5+vx78bZRflL6l/kHwRtTcz0g2AFY/G93jMkfRNf0m3FZ6ID\nWXHAbRS+BzxJ0qXAXHzp+2D81a1x7VS+Z4AbJV2Jzxx3x5fSHwJurSYBM1sg6Vh8Vj5b0g344NoP\nfzVxOD7Lno/Xw1740vc8fODbF9gCf80QfPb6kKSJ+PL0W/iKxLEpTunAYlFZnpN0Pv7WxMOSbmXF\nK3u9gRHpAaKjGIOfYTgT2CP5TcAH6rsl3Q58AV+i/6gg/ix8Feg4SUvwNxBeNbMHzGyCpGvxbY/B\neJ2+DmwI7Ah8ieJzGUHQtnT26wPhwnUlB6wBnI4fJGvFf7gfB47LyW2CL2G/iu/xvogPGmvm5Ebj\nM8WmnH9T8h9dUIapwPycn+F7yEOBR/FT8y3An4A+1eSZk9kJX5EolX8h/ucyJwNrJJnd8IeJ+Sm/\nN1PeR5FeL8MPz12Eb2ksSnJzgYuBAUU6FJTlx/iMvhV/hfE+YOcCuXLxR6aw3apo36nA4grhN6e0\ndk2fu+H/vzAXXyF5CX/gGVTUfsAwYHrSxYCpufDD8Aehd5LMfOB24ODO/u6HawxX6rhBEHRhJBkw\nzsxGdnZZgiCoXWJPPwiCIAgahBj0gyAIgqBBiEE/CIIgCBqE2NMPgiAIggYhZvpBEARB0CDEoB8E\nQRAEDUIM+kEQBEHQIMSgHwRBEAQNQgz6QRAEQdAg/B9Z86DwqdNaRgAAAABJRU5ErkJggg==\n", 803 | "text/plain": [ 804 | "" 805 | ] 806 | }, 807 | "metadata": {}, 808 | "output_type": "display_data" 809 | } 810 | ], 811 | "source": [ 812 | "def plot_findings(results,save=False):\n", 813 | " fig = plt.figure()\n", 814 | " fig.suptitle('parameter Size and Accuracy', fontsize=20)\n", 815 | " plt.plot([r['nparams_student'] for r in results] , [r['accuracy_student'] for r in results])\n", 816 | " plt.xlabel('Parameter Size', fontsize=18)\n", 817 | " plt.ylabel('accuracy', fontsize=16)\n", 818 | " if save: \n", 819 | " fig.savefig('plots/parameterSize_Accuracy.png')\n", 820 | " \n", 821 | " plt.show()\n", 822 | " \n", 823 | " fig = plt.figure()\n", 824 | " fig.suptitle('Compression Rate and Accuracy', fontsize=20)\n", 825 | " plt.plot([r['compressionRate'] for r in results] , [r['accuracy_student'] for r in results])\n", 826 | " plt.xlabel('compression Rate', fontsize=18)\n", 827 | " plt.ylabel('accuracy', fontsize=18)\n", 828 | " \n", 829 | " if save: \n", 830 | " fig.savefig('plots/CompressionRate_Accuracy.png')\n", 831 | " plt.show()\n", 832 | "\n", 833 | "\n", 834 | 
"plot_findings(results,save=True)\n"
835 | ]
836 | },
837 | {
838 | "cell_type": "markdown",
839 | "metadata": {
840 | "collapsed": true
841 | },
842 | "source": [
843 | "### Conclusion \n",
844 | "- Achieved 16.2x compression keeping accuracy at 0.96"
845 | ]
846 | },
847 | {
848 | "cell_type": "code",
849 | "execution_count": null,
850 | "metadata": {
851 | "collapsed": true
852 | },
853 | "outputs": [],
854 | "source": []
855 | },
856 | {
857 | "cell_type": "code",
858 | "execution_count": null,
859 | "metadata": {
860 | "collapsed": true
861 | },
862 | "outputs": [],
863 | "source": []
864 | }
865 | ],
866 | "metadata": {
867 | "anaconda-cloud": {},
868 | "kernelspec": {
869 | "display_name": "Python [conda root]",
870 | "language": "python",
871 | "name": "conda-root-py"
872 | },
873 | "language_info": {
874 | "codemirror_mode": {
875 | "name": "ipython",
876 | "version": 2
877 | },
878 | "file_extension": ".py",
879 | "mimetype": "text/x-python",
880 | "name": "python",
881 | "nbconvert_exporter": "python",
882 | "pygments_lexer": "ipython2",
883 | "version": "2.7.13"
884 | }
885 | },
886 | "nbformat": 4,
887 | "nbformat_minor": 2
888 | }
889 | 
--------------------------------------------------------------------------------
/plots/CompressionRate_Accuracy.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Irtza/Keras_model_compression/230e077dc13257e1531587c3012ac6ddd409122b/plots/CompressionRate_Accuracy.png
--------------------------------------------------------------------------------
/plots/initial experiment/compressionRate_Accuracy.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Irtza/Keras_model_compression/230e077dc13257e1531587c3012ac6ddd409122b/plots/initial experiment/compressionRate_Accuracy.png
--------------------------------------------------------------------------------
/plots/initial experiment/parameterSize_Accuracy.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Irtza/Keras_model_compression/230e077dc13257e1531587c3012ac6ddd409122b/plots/initial experiment/parameterSize_Accuracy.png
--------------------------------------------------------------------------------
/plots/parameterSize_Accuracy.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Irtza/Keras_model_compression/230e077dc13257e1531587c3012ac6ddd409122b/plots/parameterSize_Accuracy.png
--------------------------------------------------------------------------------
/report.md:
--------------------------------------------------------------------------------
1 | ## Implementation of Logit Regression in Keras
2 | 
3 | Experiments and thoughts to investigate the effect of the number of parameters in a model.
4 | 
5 | 
6 | ### MNIST Model Evaluation
7 | 
8 | - Initial Accuracy on test set ~= 0.99
9 | - Initial Model Parameters : 1199882
10 | - Memory footprint per Image Feed Forward ~= 4.577 MB
11 | 
12 | 
13 | ### MNIST Model Architecture and Summary
14 | 
15 | ____________________________________________________________________________________________________
16 | | Layer (type) | Output Shape | Param # | Connected to |
17 | |----------------------------------:|----------------------:|------------:|------------------------:|
18 | | convolution2d_1 (Convolution2D) | (None, 26, 26, 32) | 320 | convolution2d_input_1[0][0] |
19 | | convolution2d_2 (Convolution2D) | (None, 24, 24, 64) | 18496 | convolution2d_1[0][0] |
20 | | maxpooling2d_1 (MaxPooling2D) | (None, 12, 12, 64) | 0 | convolution2d_2[0][0] |
21 | | dropout_1 (Dropout) | (None, 12, 12, 64) | 0 | maxpooling2d_1[0][0] |
22 | | flatten_1 (Flatten) | (None, 9216) | 0 | dropout_1[0][0] |
23 | | dense_1 (Dense) | (None, 128) | 1179776 | flatten_1[0][0] |
24 | | dropout_2 (Dropout) | (None, 128) | 0 | dense_1[0][0] |
25 | | dense_2 (Dense) | (None, 10) | 1290 | dropout_2[0][0] |
26 | | activation_1 (Activation) | (None, 10) | 0 | dense_2[0][0] |
27 | ____________________________________________________________________________________________________
28 | 
29 | - Total params: 1,199,882
30 | - Trainable params: 1,199,882
31 | - Non-trainable params: 0
32 | 
33 | 
34 | 
35 | ### Reducing The Model
36 | If we observe the model weights and the number of parameters, the most expensive layers are the 2 Dense layers; the two convolutional layers have only 18,816 parameters, accounting for only (18816 / 1,199,882) x 100 ~= 1.57 percent of all the parameters.
37 | 
38 | So the first step is either to reduce this weight matrix using some sort of compression or quantization approach, or to replace it with a lighter model that generalizes well to new examples and learns to model the heavy dense layers of the original model.
39 | 
40 | Replacing these 2 heavy dense layers with 1 hidden layer of 6 neurons, we can achieve an accuracy of 0.9626.
41 | 
42 | The logits from the last layer, before the Activation layer, were used as targets, as mentioned in Geoffrey Hinton's paper, to avoid encountering very small values after squashing through the softmax function.
43 | 
44 | The Python notebook in this repository shows several other architectures with varying numbers of hidden-layer neurons, and the effect of the hidden-layer size on accuracy through plots. See the plots/ subdirectory for findings.
45 | 
46 | ### Evaluation of the Reduced Model
47 | 
48 | - Compressed Model parameters: 74188
49 | - Compression Rate : 16.2x
50 | 
51 | ### Experiments on trading accuracy for model size: starting from 99% initial accuracy, how small can the model get without dropping below 0.95 accuracy?
52 | 
53 | Experiment:
54 | The number of hidden-layer neurons in the small model was iteratively decreased; this showed an exponential decrease in model size and a linear decrease in accuracy.
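The parameter accounting above can be sanity-checked with a few lines of plain Python. The helper functions below are illustrative and not part of runme.py; the layer shapes come from the model summary:

```python
# Reproduce the parameter counts quoted in this report from the layer
# shapes in the summary table (standard weights-plus-biases formulas).

def conv2d_params(filters, kh, kw, in_channels):
    # kh*kw*in_channels weights per filter, plus one bias per filter
    return filters * (kh * kw * in_channels + 1)

def dense_params(n_in, n_out):
    # weight matrix plus bias vector
    return n_in * n_out + n_out

conv = conv2d_params(32, 3, 3, 1) + conv2d_params(64, 3, 3, 32)   # 320 + 18496 = 18816
teacher = conv + dense_params(9216, 128) + dense_params(128, 10)  # 1199882
student = conv + dense_params(9216, 6) + dense_params(6, 10)      # 74188

print("conv share of teacher: %.2f%%" % (100.0 * conv / teacher))  # ~1.57%
print("compression rate: %.1fx" % (teacher / float(student)))      # 16.2x
```

Note that the student total counts the two convolutional layers as well, since the compressed model reuses them as a fixed feature extractor; this matches the `student_model.count_params() + convparams` computation in runme.py.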
55 | 
56 | By examining the plot of the number of parameters (or model size) against accuracy, we can find the trade-off: the number of hidden-layer neurons needed to achieve at least 0.95 accuracy.
57 | 
58 | I was able to obtain 16.2x compression while keeping accuracy at 0.96.
59 | 
60 | 
61 | ![Compression rate vs. accuracy](/plots/CompressionRate_Accuracy.png){:class="img-responsive"}
62 | ![Parameter size vs. accuracy](/plots/parameterSize_Accuracy.png){:class="img-responsive"}
63 | 
64 | 
65 | ### Experiment Settings and Hyperparameters for the Student Model
66 | Various dropout rates were tested; 0.2 was chosen to regularize a bit and showed good generalization accuracy.
67 | 
68 | Loss function: mean squared error on the logits, as mentioned by Geoff Hinton in the paper, optimized with the Adadelta optimizer.
69 | 
70 | Let s(f) be the minimum model size that can achieve accuracy f. Given 0 <= f_1 < f_2 < ... < f_n <= 1, does 0 < s(f_1) < s(f_2) < ... < s(f_n) hold?
71 | 
72 | No. The above relation between model size and minimum accuracy does not hold strictly. During my experiments I found that accuracy rose or stayed equal even when the model size was reduced several times over.
73 | 
74 | However, after plotting accuracy against model size, it is observed that, as a general trend, accuracy decreases linearly as model size is reduced.
75 | 
76 | ### Future Work and Improvements
77 | It is also worth noting that in the MNIST dataset the characters appear only in the center of the image, so the convolution weights corresponding to the edges are blank/constant and likely to be highly redundant or noisy. Pruning these weight connections through the network is likely to reduce the model size effectively, also owing to the simplicity of the structure in MNIST.
78 | 
79 | I think it is also interesting to explore quantization in the conv layers.
80 | 
81 | Ideas from Song Han's work on deep compression can be taken forward to establish a general framework for compression of deep-learning models.
https://arxiv.org/abs/1510.00149 "Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding"
82 | 
--------------------------------------------------------------------------------
/runme.py:
--------------------------------------------------------------------------------
1 | import keras
2 | import h5py
3 | import numpy as np
4 | from sklearn import metrics
5 | from matplotlib import pyplot as plt
6 | 
7 | 
8 | from keras import backend as K
9 | from keras.datasets import mnist
10 | from keras.models import Sequential
11 | from keras.layers import Dense, Dropout, Flatten
12 | from keras.layers import Conv2D, MaxPooling2D, Activation
13 | 
14 | plt.rcParams['figure.figsize'] = (8, 6)
15 | 
16 | 
17 | # from skimage import io
18 | # def show(img):
19 | #     io.imshow(img)
20 | #     io.show()
21 | 
22 | def softmax_c(z):  # numerically stable softmax: subtract the row max before exponentiating
23 |     assert len(z.shape) == 2
24 |     s = np.max(z, axis=1)
25 |     s = s[:, np.newaxis]
26 |     e_x = np.exp(z - s)
27 |     div = np.sum(e_x, axis=1)
28 |     div = div[:, np.newaxis]
29 |     return e_x / div
30 | 
31 | def prepare_softtargets(model, X):
32 |     inp = model.input  # input placeholder
33 |     outputs = []
34 |     for layer in model.layers[:]:
35 |         if layer.name == 'flatten_1':
36 |             outputs.append(layer.output)
37 |         if layer.name == 'dense_2':
38 |             outputs.append(layer.output)
39 | 
40 |     functor = K.function([inp] + [K.learning_phase()], outputs)  # evaluation function
41 |     layer_outs = functor([X, 1.])
42 |     return np.array(layer_outs[0]), np.array(layer_outs[1])
43 | 
44 | 
45 | # TODO: parse as cmd arguments
46 | TRAIN_BIG = False
47 | TRAIN_SMALL = False
48 | PRPARE_TRAININPUT = True
49 | PRPARE_TESTINPUT = True
50 | 
51 | # big model
52 | batch_size = 128
53 | num_classes = 10
54 | BIG_epochs = 20
55 | 
56 | # Small Model
57 | STUDENT_epochs = 30
58 | HiddenNeuron = 6  # found out after experimentation.
59 | 
60 | 
61 | ### Setup Data
62 | 
63 | # input image dimensions
64 | img_rows, img_cols = 28, 28
65 | # the data, shuffled and split between train and test sets
66 | (x_train, y_train), (x_test, y_test) = mnist.load_data()
67 | if K.image_dim_ordering() == 'th':
68 |     x_train = x_train.reshape(x_train.shape[0], 1, img_rows, img_cols)
69 |     x_test = x_test.reshape(x_test.shape[0], 1, img_rows, img_cols)
70 |     input_shape = (1, img_rows, img_cols)
71 | else:
72 |     x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)
73 |     x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)
74 |     input_shape = (img_rows, img_cols, 1)
75 | x_train = x_train.astype('float32')
76 | x_test = x_test.astype('float32')
77 | x_train /= 255
78 | x_test /= 255
79 | print('x_train shape:', x_train.shape)
80 | print(x_train.shape[0], 'train samples')
81 | print(x_test.shape[0], 'test samples')
82 | # convert class vectors to binary class matrices
83 | y_train = keras.utils.np_utils.to_categorical(y_train, num_classes)
84 | y_test = keras.utils.np_utils.to_categorical(y_test, num_classes)
85 | 
86 | 
87 | ### Build Model
88 | model = Sequential()
89 | model.add(Conv2D(32, 3, 3, activation='relu', input_shape=input_shape))
90 | model.add(Conv2D(64, 3, 3, activation='relu'))
91 | model.add(MaxPooling2D(pool_size=(2, 2)))
92 | model.add(Dropout(0.25))
93 | model.add(Flatten())
94 | model.add(Dense(128, activation='relu'))
95 | model.add(Dropout(0.5))
96 | model.add(Dense(num_classes))
97 | model.add(Activation('softmax'))
98 | 
99 | model.compile(loss='categorical_crossentropy',
100 |               optimizer=keras.optimizers.Adadelta(),
101 |               metrics=['accuracy'])
102 | 
103 | 
104 | if TRAIN_BIG:
105 |     print "Training Model TRAIN_BIG FLAG set as TRUE"
106 |     model.fit(x_train, y_train,
107 |               batch_size=batch_size,
108 |               nb_epoch=BIG_epochs,
109 |               verbose=1,
110 |               validation_data=(x_test, y_test))
111 | 
112 |     print "Evaluating model .."
113 |     score = model.evaluate(x_test, y_test, verbose=1)
114 |     print('Test loss:', score[0])
115 |     print('Test accuracy:', score[1])
116 |     print ("Saving Model weights")
117 |     model.save_weights("new_stockweighs.h5")
118 |     print ("Model Weights Saved")
119 | 
120 | else:
121 |     print "loading weights TRAIN_BIG FLAG set as FALSE"
122 |     model.load_weights('stockweighs.h5')
123 | 
124 | 
125 | 
126 | print "Evaluating Initial Model ...\n"
127 | score = model.evaluate(x_test, y_test, verbose=1)
128 | print('Test loss:', score[0])
129 | print('Test accuracy:', score[1])
130 | print ('-'*30)
131 | 
132 | print "Parameter Size of Initial Model and Memory Footprint"
133 | trainable_params = model.count_params()
134 | footprint = trainable_params * 4
135 | print "Memory footprint per Image Feed Forward ~= %.3f Mb" % (footprint / 1024.0 / 1024.0)  # 2x Backprop
136 | print ('-'*30)
137 | 
138 | 
139 | ## Obtaining the output of the last convolutional layer after Flatten (Caruana et al.)
140 | ## and preparing the soft targets (logits), as proposed by Geoffrey Hinton et al.
141 | 
142 | 
143 | 
144 | 
145 | if PRPARE_TRAININPUT:
146 |     print "Creating transfer set"
147 | 
148 |     lastconv_out = []
149 |     logit_out = []
150 |     for i in range(0, 60):
151 |         print "Batch # : ", i
152 |         l, l2 = prepare_softtargets(model, x_train[i*1000:(i+1)*1000])
153 |         lastconv_out.append(l)
154 |         logit_out.append(l2)
155 | 
156 |     lastconv_out = np.array(lastconv_out)
157 |     logit_out = np.array(logit_out)
158 |     lastconv_out = lastconv_out.reshape((60000, 9216))
159 |     logit_out = logit_out.reshape((60000, 10))
160 | 
161 |     print "clean up"
162 |     x_train = 0
163 |     print "Write to Disk"
164 |     h5f = h5py.File('new_lastconv_out.h5', 'w')
165 |     h5f.create_dataset('dataset_1', data=lastconv_out)
166 |     h5f.close()
167 |     h5f2 = h5py.File('new_logit_out.h5', 'w')
168 |     h5f2.create_dataset('dataset_1', data=logit_out)
169 |     h5f2.close()
170 | 
171 | else:
172 |     print "loading Transfer Set from lastconv_out.h5"
173 |     h5f = h5py.File('lastconv_out.h5', 'r')
174 |     lastconv_out = h5f['dataset_1'][:]
175 |     h5f.close()
176 | 
177 |     h5f2 = h5py.File('logit_out.h5', 'r')
178 |     logit_out = h5f2['dataset_1'][:]
179 |     h5f2.close()
180 | 
181 | 
182 | 
183 | print "Building minimal Model"
184 | 
185 | student_model = Sequential()
186 | student_model.add(Dense(HiddenNeuron, input_dim=9216, activation='relu'))
187 | student_model.add(Dropout(0.2))
188 | student_model.add(Dense(num_classes))
189 | 
190 | student_model.compile(loss='mse',
191 |                       optimizer=keras.optimizers.Adadelta(),
192 |                       metrics=['accuracy'])
193 | 
194 | 
195 | 
196 | if TRAIN_SMALL:
197 |     print "Training Small Model"
198 |     student_model.fit(lastconv_out, logit_out, nb_epoch=STUDENT_epochs, verbose=1, batch_size=batch_size)
199 |     student_model.save_weights("new_student_weights_6_0.2dopout.h5")
200 | else:
201 |     print "Loading Small Model Weights"
202 |     student_model.load_weights("student_weights_6_0.2dopout.h5")
203 | 
204 | 
205 | 
206 | print "Clean up small model training inputs and targets"
207 | lastconv_out = 0
208 | logit_out = 0
209 | 
210 | 
211 | ############ Preparing Test Input #########
212 | 
213 | if PRPARE_TESTINPUT:
214 |     print "creating test data from the big model on held-out data"
215 | 
216 |     test_lastconv_out = []
217 |     test_logit_out = []
218 |     for i in range(0, 10):
219 |         print "Batch # : ", i
220 |         l, l2 = prepare_softtargets(model, x_test[i*1000:(i+1)*1000])
221 |         test_lastconv_out.append(l)
222 |         test_logit_out.append(l2)
223 | 
224 |     # lastconv_out.shape , logit_out.shape
225 |     test_lastconv_out = np.array(test_lastconv_out)
226 |     test_logit_out = np.array(test_logit_out)
227 | 
228 |     test_lastconv_out = test_lastconv_out.reshape((10000, 9216))
229 |     test_logit_out = test_logit_out.reshape((10000, 10))
230 | 
231 |     print test_lastconv_out.shape
232 |     print test_logit_out.shape
233 | 
234 |     print "Write to Disk"
235 |     h5f = h5py.File('new_test_lastconv_out.h5', 'w')
236 |     h5f.create_dataset('dataset_1', data=test_lastconv_out)  # write the held-out features
237 |     h5f.close()
238 |     h5f2 = h5py.File('new_test_logit_out.h5', 'w')
239 |     h5f2.create_dataset('dataset_1', data=test_logit_out)  # write the held-out logits
240 |     h5f2.close()
241 | else:
242 |     print "Loading saved test data from .h5"
243 |     h5f = h5py.File('test_lastconv_out.h5', 'r')
244 |     test_lastconv_out = h5f['dataset_1'][:]
245 |     h5f.close()
246 |     h5f2 = h5py.File('test_logit_out.h5', 'r')
247 |     test_logit_out = h5f2['dataset_1'][:]
248 |     h5f2.close()
249 | 
250 | pred = student_model.predict(test_lastconv_out)
251 | probs = softmax_c(pred)
252 | pred_classes = np.argmax(probs, axis=1)
253 | 
254 | accuracy_student = metrics.accuracy_score(y_pred=pred_classes, y_true=np.argmax(y_test, axis=1))
255 | print "Small Model Test Set Accuracy : ", accuracy_student
256 | print "\n"
257 | # Compression Rate from Number of Parameters Reduced
258 | 
259 | # Parameters for the first two conv layers from the bigger model
260 | convparams = 320 + 18496
261 | 
262 | print "Evaluating Compression ..."
263 | print "HiddenNeurons : ", HiddenNeuron
264 | print "Initial Model Parameters : ", model.count_params()
265 | print "Compressed Model parameters + initial feature extractor part params : ", student_model.count_params() + convparams
266 | compressionRate = model.count_params() / np.float(student_model.count_params() + convparams)
267 | print "Compression Rate : ", compressionRate
268 | print "\n\n"
269 | 
270 | # if __name__ == '__main__':
271 | #     Set usage using flags.
272 | #     python runme.py --TRAIN_BIG=True --epochs=20
--------------------------------------------------------------------------------
/stockweighs.h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Irtza/Keras_model_compression/230e077dc13257e1531587c3012ac6ddd409122b/stockweighs.h5
--------------------------------------------------------------------------------
/student_weights_6_0.2dopout.h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Irtza/Keras_model_compression/230e077dc13257e1531587c3012ac6ddd409122b/student_weights_6_0.2dopout.h5
--------------------------------------------------------------------------------