├── .gitignore
├── LICENSE.md
├── README.md
├── notebooks
│   ├── .gitignore
│   ├── 01_basics.ipynb
│   ├── 02_linear_regression.ipynb
│   ├── 03_polynomial_regression.ipynb
│   ├── 04_logistic_regression.ipynb
│   ├── 05_basic_convnet.ipynb
│   ├── 06_modern_convnet.ipynb
│   ├── 07_autoencoder.ipynb
│   ├── 08_denoising_autoencoder.ipynb
│   ├── 09_convolutional_autoencoder.ipynb
│   ├── 10_residual_network.ipynb
│   ├── 11_variational_autoencoder.ipynb
│   ├── convert.py
│   └── libs
└── python
    ├── .gitignore
    ├── 01_basics.py
    ├── 02_linear_regression.py
    ├── 03_polynomial_regression.py
    ├── 04_logistic_regression.py
    ├── 05_basic_convnet.py
    ├── 06_modern_convnet.py
    ├── 07_autoencoder.py
    ├── 08_denoising_autoencoder.py
    ├── 09_convolutional_autoencoder.py
    ├── 10_residual_network.py
    ├── 11_variational_autoencoder.py
    └── libs
        ├── __init__.py
        ├── activations.py
        ├── batch_norm.py
        ├── connections.py
        ├── dataset_utils.py
        ├── datasets.py
        └── utils.py

/.gitignore:
--------------------------------------------------------------------------------
1 | MNIST_data
2 | *.pyc
3 | *.pyo
4 | __pycach*
--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
1 | Copyright 2016 Parag K. Mital
2 | 
3 | Licensed under the Apache License, Version 2.0 (the "License");
4 | you may not use this file except in compliance with the License.
5 | You may obtain a copy of the License at
6 | 
7 | http://www.apache.org/licenses/LICENSE-2.0
8 | 
9 | Unless required by applicable law or agreed to in writing, software
10 | distributed under the License is distributed on an "AS IS" BASIS,
11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | See the License for the specific language governing permissions and
13 | limitations under the License.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # TensorFlow Tutorials
2 | 
3 | You can find Python source code under the `python` directory, and associated notebooks under `notebooks`.
4 | 
5 | | | Source code | Description |
6 | | --- | --- | --- |
7 | |1| **[basics.py](python/01_basics.py)** | Setup with TensorFlow and graph computation.|
8 | |2| **[linear_regression.py](python/02_linear_regression.py)** | Performing regression with a single factor and bias.|
9 | |3| **[polynomial_regression.py](python/03_polynomial_regression.py)** | Performing regression using polynomial factors.|
10 | |4| **[logistic_regression.py](python/04_logistic_regression.py)** | Performing logistic regression using a single-layer neural network.|
11 | |5| **[basic_convnet.py](python/05_basic_convnet.py)** | Building a deep convolutional neural network.|
12 | |6| **[modern_convnet.py](python/06_modern_convnet.py)** | Building a deep convolutional neural network with batch normalization and leaky rectifiers.|
13 | |7| **[autoencoder.py](python/07_autoencoder.py)** | Building a deep autoencoder with tied weights.|
14 | |8| **[denoising_autoencoder.py](python/08_denoising_autoencoder.py)** | Building a deep denoising autoencoder which corrupts the input.|
15 | |9| **[convolutional_autoencoder.py](python/09_convolutional_autoencoder.py)** | Building a deep convolutional autoencoder.|
16 | |10| **[residual_network.py](python/10_residual_network.py)** | Building a deep residual network.|
17 | |11| **[variational_autoencoder.py](python/11_variational_autoencoder.py)** | Building an autoencoder with a variational encoding.|
18 | 
19 | # Installation Guides
20 | 
21 | * [TensorFlow Installation](https://github.com/tensorflow/tensorflow)
22 | * [OS specific setup](https://github.com/tensorflow/tensorFlow/blob/master/tensorflow/g3doc/get_started/os_setup.md)
23 | * [Installation on EC2 GPU Instances](http://eatcodeplay.com/installing-gpu-enabled-tensorflow-with-python-3-4-in-ec2/)
24 | 
25 | For Ubuntu users running Python 3.4+ with CUDA 7.5 and cuDNN 7.0, you can find compiled wheels under the `wheels` directory. Install with, e.g., `pip3 install tensorflow-0.8.0rc0-py3-none-any.whl`, and be sure to add `export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/local/cuda/lib64"`
26 | to your `.bashrc`. Note that this still requires you to install CUDA 7.5 and cuDNN 7.0 under `/usr/local/cuda`.
27 | 
28 | # Resources
29 | 
30 | * [Official Tensorflow Tutorials](https://www.tensorflow.org/versions/r0.7/tutorials/index.html)
31 | * [Tensorflow API](https://www.tensorflow.org/versions/r0.7/api_docs/python/index.html)
32 | * [Tensorflow Google Groups](https://groups.google.com/a/tensorflow.org/forum/#!forum/discuss)
33 | * [More Tutorials](https://github.com/nlintz/TensorFlow-Tutorials)
34 | 
35 | # Author
36 | 
37 | Parag K. Mital, Jan. 2016.
38 | 
39 | http://pkmital.com
40 | 
41 | # License
42 | 
43 | See LICENSE.md
44 | 
--------------------------------------------------------------------------------
/notebooks/.gitignore:
--------------------------------------------------------------------------------
1 | .ipynb_*
2 | .DS_Store
3 | X_*
--------------------------------------------------------------------------------
/notebooks/02_linear_regression.ipynb:
--------------------------------------------------------------------------------
1 | {
2 |  "cells": [
3 |   {
4 |    "cell_type": "code",
5 |    "execution_count": 1,
6 |    "metadata": {
7 |     "collapsed": false
8 |    },
9 |    "outputs": [
10 |     {
11 |      "data": {
12 |       "text/plain": [
13 |        "'Simple tutorial for using TensorFlow to compute a linear regression.\\n\\nParag K. Mital, Jan. 2016'"
14 |       ]
15 |      },
16 |      "execution_count": 1,
17 |      "metadata": {},
18 |      "output_type": "execute_result"
19 |     }
20 |    ],
21 |    "source": [
22 |     "\"\"\"Simple tutorial for using TensorFlow to compute a linear regression.\n",
23 |     "\n",
2016\"\"\"" 25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": 2, 30 | "metadata": { 31 | "collapsed": true 32 | }, 33 | "outputs": [], 34 | "source": [ 35 | "# %% imports\n", 36 | "%matplotlib inline\n", 37 | "import numpy as np\n", 38 | "import tensorflow as tf\n", 39 | "import matplotlib.pyplot as plt\n" 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "execution_count": 3, 45 | "metadata": { 46 | "collapsed": false 47 | }, 48 | "outputs": [ 49 | { 50 | "name": "stderr", 51 | "output_type": "stream", 52 | "text": [ 53 | "/home/heythisischo/anaconda2/lib/python2.7/site-packages/matplotlib/figure.py:397: UserWarning: matplotlib is currently using a non-GUI backend, so cannot show the figure\n", 54 | " \"matplotlib is currently using a non-GUI backend, \"\n" 55 | ] 56 | }, 57 | { 58 | "data": { 59 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAEACAYAAACwB81wAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAH5JJREFUeJzt3XuQXOV55/HvIyGhsaSRkXeMMSySgy+UndiMssJy2Vtq\nKRIRSmJIilp5kipX7CkHokBYjDFaoJbZJXYB5ayCIZQEq4QiFV1y8VpgywmoUCdFsmamdEHYlgjY\nzICJ0UxlbS2y2ysJnv2ju0Vrpi/ndJ8+t/59qrpquvvMOe9paZ5+z/M+73vM3RERkfyblXQDREQk\nHgr4IiI9QgFfRKRHKOCLiPQIBXwRkR6hgC8i0iM6DvhmdpGZPWVm3zWz58zsDxts91Uze8HMDpnZ\nZZ0eV0REwjkngn2cBj7v7ofMbAGw38yecPej1Q3M7ErgEnd/n5l9FNgCrIjg2CIiElDHPXx3f83d\nD1V+PgEcAS6cttlVwKOVbZ4BFpnZ+Z0eW0REgos0h29mS4HLgGemvXUh8ErN81eZ+aUgIiJdFFnA\nr6Rz/ga4sdLTFxGRFIkih4+ZnUM52P+Fu++us8mrwL+veX5R5bV6+9LiPiIiIbm7tdomqh7+nwHf\nc/f7Grz/GPBpADNbAfzE3Y812pm7p/px5513Jt4GtVPtVDvVzuojqI57+Gb2ceB3gOfM7CDgwG3A\nknLs9ofcfY+ZrTezF4GfAp/p9LgiIhJOxwHf3f8JmB1gu+s7PZaIiLRPM23bUCgUkm5CIGpntNTO\naKmd8bMw+Z84mJmnrU0iImlmZniMg7YiIpJyCvgiIj1CAV9EpEco4IuI9AgFfBGRHqGALyLSIxTw\nRUR6hAK+iEiPUMAXEekRCvgiIj1CAV9EpEco4IuI9AgFfBGRHqGALyLSIxTwRUR6hAK+iEiPUMAX\nEekRCvgiIj1CAV9EpEco4IuI9AgFfBGRHhFJwDezbWZ2zMwON3h/pZn9xMwOVB53RHFcEREJ7pyI\n9vPnwP3Ao022+Ud3/2RExxMRkZAi6eG7+9PAj1tsZlEcS0RE2hNnDv9jZnbIzL5pZh+M8bgiIkJ0\nKZ1W9gMXu/vPzOxK4OvA+xttPDIycubnQqFAoVDodvtERDKjWCxSLBZD/565eyQNMLMlwOPu/uEA\n274E/LK7/58673lUbRKR4KamphgfH2fp0qUMDAwk3RwJwcxw95Zp8yhTOkaDPL2ZnV/z8+WUv2hm\nBHsRScaOHbtYsuRS1q69jiVLLmXHjl1JN0m6IJIevpltBwrAO4BjwJ3AXMDd/SEz+wPg94FTQAm4\nyd2fabAv9fBFYjQ1NcWSJZdSKu0DPgwcpq9vFRMTR9XTz4igPfxIcvju/tst3v9T4E+jOJaIRGt8\nfJy5c5dSKlWzsR9mzpwljI+PK+DnjGbaivS4pUuXcvLkOFCdN3mYU6cmWLp0aUf7nZqaYmxsjKmp\nqaavSXwU8EV63MDAANu2PUhf3yr6+5fR17eKzZvvZnx8vO3AXG9MQOMEyYusSicqyuGLJKNapXPg\nwCFuumkTc+eWe/7btj3I0NCGUPuZPiYwb95KzGZpnKBLYs3hi0j2VQPvypXrKJX2VXL6hxkeXsWa\nNasBApVt1hsTmD37nUAf5WBffk3jBPFTSkdEzqgG6+mBeevWhwOnY+qNCZw+/RqnT08Q9TiBhKOU\njkiPCDKxKqp0zI4duxge3sicOUsolV7EbBazZy+mVHqNvr5LgH8NnSqSxpKYeCUiKdVswLS2cqbe\nAO7tt99St9c/Pj7e8HhDQxuYmDjKX//13ZxzzhxOnvxHSqUfAHt4882X2b//aQX7BKiHL5JzzSZW\n7d37FMPDG2cM0NZeDQBtT8waGxtj7drrOH58/5nX+vuXsXfvVpYvX96tU+456uGLCNA4L3/w4EGG\nhzdSKu3j+PH9lEr7GB7eeKanXw3I4+PjbN5891m9/m3bHgw02NpJjb9q9qOnKh2RnDs76JZ76CdP\nvsTzzz/POecsoVHlTDUPX+39b958N8uWXRZqcbVqimh4eBVz5izh1KmJQF8W04+tfH80lNIR6QH1\nBlHPPfdiXn/9ReDbTE/VQPtpnHrCrMSptX3CU0pHRM6oN4j6+uuHgRFgBQsXDp6VqmmUBmo2UNtM\nNUVUG7AbpWyiPra8RQFfJIfqBdOBgQHOO++8acH0iyxYcAn33/+fmZg4eiZt0q31daqaVQ11+9g9\nzd1T9Sg3SUTatX37Tu/rW+yLFi3zvr7Fvn37zjPvTU5Oel/fYodnHdzhWe/rW+yTk5MN99PfPzhj\nP50I0oZuHTuvKnGzZXxVDl8kR4Lkv2vz+dVB1EYDot24C1bQUk3dgSs4raUj0oOCrG0/NLSBNWtW\nBwqmAwMDkQfbelVD9VI23Th2r1MOXyRHgua/6w2ixqXebN7qYLFq77tLKR2RnAmTsknS9JSNau/b\nFzSlo4AvkiFB89pZy3+Hrb3P2vl1m+rwRXImzB2jkkzZtCNM7b3unNU+9fBFMiDvs0+Dnl/eP4d2\nqYcvkiN5n33abCC3Vt4/h25TD18kA3qlZ9sqN98rn0NYsfbwzWybmR0zs8NNtvmqmb1gZofM7LIo\njivSK4L2gLOu1dhDr3wO3RJJD9/MPgGcAB519w/Xef9K4Hp3/zUz+
yhwn7uvaLAv9fBFGlB1Spk+\nh7PFXpZpZkuAxxsE/C3APnffVXl+BCi4+7E62yrgi4iEkLZB2wuBV2qev1p5TUREYpLKtXRGRkbO\n/FwoFCgUCom1RUQkbYrFIsViMfTvJZXSOQqsVEpHRKRzSaR0rPKo5zHg05WGrQB+Ui/Yi0h4WnBM\ngoqqLHM78M/A+83sZTP7jJlda2a/B+Due4CXzOxFYCuwMYrjivQ6LTMgYWjilUjMoiopDLMcgUoY\n8y1tVToiQrQ98iDLDOgKQGqphy8Sk6iXBWi1Py1DcLY8X+mohy+SMlEv/NVqmQEtNPYWXemUqYcv\nEpNu9bgb9VzVwy/rhc9BPXyRlOnWwl+NFhxLaqGxtJWJ6krnLerhi8Qs7lxynMdL431p1cOv2S5t\nwVUBXySb0hxYs3Jj93YFDfipXEtHRLKnmjoplWamTpIO+ENDG1izZnVuq3SCUg5fJAXSlvduplFb\nly4tp3Ggeh+kw5w6NcHSpUu7cryw22Xtxu5d4e6pepSbJNI7tm/f6X19i33RomXe17fYt2/fmXST\nGmrV1ur7/f2DkZxL0M8mS59hN1TiZsv4qhy+SILSnPeeLu6lHMIcLyufYbeoLFMkA7JUMhi0rVGl\nToIeL0ufYdIU8EUS1K28dzfE3dagx8vSZ5g0BXyRBCU1Oaodcbc16PGy9BkmTTl8kRTI0sJeUbe1\n1f6CHi9Ln2HUNPFKRFIvjTNzs0gBX0RSrZvVNb3W21eVjkgKdHNCVRoma3XShm5V12gp5CaCFOvH\n+UATryQn2p0MNDk56aOjoz45ORn5vqPUaRsmJye9r2+xw7MO7vCs9/UtbnreSewzCwg48SrxAD+j\nQQr4kgPtBp4gQTQNQS2qNkQ9M3d0dNQXLVpWaVP50d8/6KOjox3tN+2CBnyldES6oJ10xdTUFMPD\nGymV9nH8+H5KpX0MD2+ckS5Jw0SjqNowNLSBiYmj7N27lYmJox0P2KomvzkFfJEuaCfwBA2iaQhq\nUbYhykXNoqjJT8PYSNcEuQyI84FSOpITYdMVYdIkUadC2pGGNjQSZByknjSMjbSDOBdPM7N1wJ9Q\nvmLY5u73THt/JbAb+EHlpa+5+x812JdH0SaRNAg7qSjMjTrSUHpYbcOCBQs4ceJEpssg65WJzpu3\nkt27dzE4OJjq8wpalhlFj3wW8CKwBJgDHAIunbbNSuCxgPvrzlegSMo06k222ztNSlZ7xdPNHPDd\n6fA2nz//I6k/L+Lq4ZvZCuBOd7+y8nxT5eD31GyzEviCu/9GgP15p20SSbu8LOmbl/OA6edyAfAB\noEgWzivOiVcXAq/UPP9h5bXpPmZmh8zsm2b2wQiOK5JZaai0iUJezgPOHvCdP/8TwDvIw3nViuue\ntvuBi939Z2Z2JfB14P2NNh4ZGTnzc6FQoFAodLt9IrE6u8ql3IPMYvlg1s9j+jhI9d63Bw8e5Oqr\nhyiV0nlexWKRYrEY/heD5H2aPYAVwN/VPN8E3Nrid14CFjd4L+r0lkgqpbnKJYysnkfct2vsJmLM\n4c8Gngd+BfgRMAoMufuRmm3Od/djlZ8vB/7K3Zc22J932iaRrEhDpU0U0noejdoV9+0auy1oDr/j\nlI67v2Fm1wNP8FZZ5hEzu7b8tj8EXGNmvw+cAkqA1j8VoZw3TnMgCSqN59Fs6eXq2EOpNDNHX3se\naTyvTmh5ZJEcyUqPtF1hbobSrAefp+oi0PLIIj0n78sChzm/VtVDvXpbRPXwRerIWk85bz3W6cKe\nX95y9K2ohy/SprT0lMMs4pWnevh6wp5fmBugR7VwWyYEKeWJ84HKMiVBnazzHuWSCGGXK0jDGvnd\n1O75ZW2ZinahG6CIhNfuDTSiXE+m05unZKFuvB15P79OBA34yuGL1GgnFx51/nxsbIy1a6/j+PH9\nZ15bsOCXeOCBL7B+/fqW1Sl5yEk3kvfza5dy+CJtaKd6I+r8+cybi9zLiRPf54Yb7ms5ppD3nHTe\nz6/b1MMXqSNMT7IbFTLVSUOzZ7+bEye+D3w7sn33ujxeJaiHL9KBMD3J6VcF8+at5Lbbbu7o+NV7\nvT7wwBdYuPBS8lp9E7e0VGAlRT18kQ7U9hYBtm59mC9/+Y/rTudvd/95rq+PQlSzb7MstjteRf1A\nVTqSEdMrc7ZseagrpZGqTmksTHVUuxVYWYCqdES6p15v8dxz/yNz517C668fOLNdf/8y9u7dyvLl\nyzs+Xt7yzp3q1uzbLFIOX6SL6lfmXMzJky/xVnXNWzfNCDNrth5Vp8zUrdm3eaYevkgbGvUWN2++\nm5tu2sScOUs4dWqCbdseBGi4TK+0r90eex6vloL28BXwRdpULZ2sDe5DQxtmDOTmNY2QBo3+DXqN\nAn4M8thTkHBa/R+oN2s2qry+lOnvUAG/65rdTUekKs8DhZIeCvhdpD/i/OpGb1FpB+k2Vel0Ud7X\nHu9V3ZqFWZ01u3fvViYmjirYS2LUw2+Devj5oMFVyQv18LtI9bzZN703v3Xrw7pqk9xTD78Dqg7I\npnpXaPPmrcRsVtMevv69Ja3Uw4+BZj9mU70xmLlz38Ntt93c8Kqt11dZlHyIpIdvZuuAP6H8BbLN\n3e+ps81XgSuBnwK/6+6HGuwrMz18yaZmYzDAjF68xmwk7YL28M+J4ECzgAeAXwH+FRgzs93ufrRm\nmyuBS9z9fWb2UWALsKLTY4u0ozoGMzy86qxSyWrwnh7Eq1cEpdLM/L4CvmRJxwEfuBx4wd0nAMxs\nJ3AVcLRmm6uARwHc/RkzW2Rm57v7sQiOHyvlcfNhaGgDa9asDvRvefYtB8s9/OqiaCJZEkUO/0Lg\nlZrnP6y81mybV+tsk3rK4+ZLmDGYZvl9kayIoocfuZGRkTM/FwoFCoVCYm2pmpqaYnh4I6XSvsql\n/WGGh1exZs1q/eHnWO0SGu5vcsst13DttZ/Tv7kkqlgsUiwWQ/9ex4O2ZrYCGHH3dZXnmyjffeWe\nmm22APvcfVfl+VFgZb2UTloHbbUIVu/RYK1kRZxlmWPAe81siZnNBT4FPDZtm8eAT1catgL4Sdby\n92fncUF53PzTEhqSNx0HfHd/A7geeAL4LrDT3Y+Y2bVm9nuVbfYAL5nZi8BWYGOnx42bZtf2Hn3J\nS95opm1IqtLpLVrpUrJAyyOLRERf8pJ2CvgiIj1Ca+lI7k1NTTE2NsbU1FTSTRHJBAV8ySRNghMJ\nTykdyZxGyxvv3r2LwcFB5dml5yilI7k1sz7+CD//+Ul+67e+qN6+SBPq4cdAVR7ROruHfwHwAaCI\nZsNKr1IPPyWUa45e7SS4+fM/AbwDzYaVTvRKAYB6+F2ktVi6a2pqioMHD3L11UP6jKVttQvknTw5\nnsnJderhp0Ana7H0So+jEwMDA1xxxRVa8kLaVrsK7vHj+ymV9jE8vDG3f3cK+F3U7losSgOFMzS0\ngYmJo+zdu5WJiaOZ651J
cnptgTyldLos7FosSgOJxCcvf2+x3dNWmgtzKz3Q/VNF4tTq/sZ5ox5+\nyuSlx5F3KrXNl6z/e2rQNqO07n76aYwlf8Lc3zjL1MNPqaz3OPJKV2CSRsrhZ9zAwIACSAppjEWy\nTCkdkRB020PJMgX8iGnCVL5pjEWyTDn8COVhirYEozEWSRPd4jBmGswTkaSoLDNmvTZFW0SyRwE/\nIhrME5G06yjgm9l5ZvaEmT1vZn9vZosabDduZs+a2UEzG+3kmGmlwbxs0KC69LKOcvhmdg/wb+5+\nr5ndCpzn7pvqbPcD4Jfd/ccB9pnJHH6VBvPSS4PqklexDNqa2VFgpbsfM7N3AUV3v7TOdi8B/8Hd\n/y3APjMd8NNAXzozaVBd8iyuQdt3uvsxAHd/DXhng+0ceNLMxszscx0eM3XSlCbQOi/1aVBdJMDS\nCmb2JHB+7UuUA/gddTZv1DX/uLv/yMwGKAf+I+7+dKNjjoyMnPm5UChQKBRaNTMxaUoT1N69pzz1\n/zDDw6tYs2Z1Jnqxza5MOr1qOXtQvfzZaFBdsqpYLFIsFsP/oru3/QCOAOdXfn4XcCTA79wJfL7J\n+54Vk5OT3te32OFZB3d41vv6Fvvk5GQibXnkkUd84cLBSlvKj/7+QR8dHY29PWFt377T+/oW+6JF\ny7yvb7Fv374z0HvtHKO/f7Cj/YikTSVuto7ZQTZq+MtwD3Br5edbgbvrbPM2YEHl5/nAPwFXNNln\nVz+YKI2OjvqiRcsSD7DVQLZw4S859KXiCyiMZl+cUX+pTk5O+ujoaOo/E5Ewggb8TlfLvAf4KzP7\nLDAB/CcAM7sAeNjdf51yOuh/mZlTTiH9pbs/0eFxUyENaYLaNE65DfcCK1i48AOcPv1yR6WhcQ3+\nNluBEoh0dcp6q5BqkFt6RpBvhTgfZKiH7558mqDeVcaCBb/ojzzySEe92KjSKEHE2cOfLs7zFOkW\n4kjpdOORtYDv3l6aIKrUQjcCYhJjE82+OMN8qYb5XNM0BiPSCQX8FIu6V9nuVUaj4JjU2ESzYB0k\nkIf9XNMyBiPSKQX8lOpWrzLsFUOz4JjFnm87bc7ieYrUEzTga/G0mHVrAlCYmzDXDvQeP76fUmkf\nw8Mbz0wcy+K6QO18rlk8T5FO6J62MUtDZU+Q+7IODW1gzZrVkVSvxFEF0+7nGuV5iqSdevgxq9er\n3Lz5bsbHx2NbmiHoUs5hrhoa6cZSD/WWsuiktx7FeYpkQpC8T5wPcp7Dr6rm3LdseSiRssA4ykm7\nkSNvNTCriVXSiwiYw9ctDhOU9AqO3U61jI2NsXbtdRw/vv/Ma/39y9i7dyvLly8P3UZAK16K1BF0\ntUzl8LuoVUANkkvv9BjN1Jt1GqV28+rVczpw4BA33bTpzMJ0t912c6SzbkV6TpDLgDgf5CSlE6Qm\nvNOURxZmiYZNHTVbF2jevLerjFKkDlSHn5xWSwXU5pg7mTSVleAXNK9+9jmNOnxkxqSou+76kla8\nFJlGAT9BjWZwVoPV9B55OwONWZwl2uo8zz6nSYdgX5qdHlck6xTwE1Sv9x11OiJLPXz3dlNc9zj0\n+cKFl7XVm5+cnGz4JSuSJwr4CZueqrnrri9F3iPvRmllN3rDYb6cpp/Tli0PtdWe7dt3+rx5b3d4\nW2a+FEXapYCfArXBMy1r6DTTrUHgsOmnTs/prc/6Lx2ylfYSaYcCfgp12iOPcknl6fvpZooo7vTT\nW18wjccBRPJEAT+l2g3aUd/Xdfp+uj0I3OjLrvsppJ0O5zlcohy+5JYCfo5E1UNO8s5S1ePXK0nt\nxoBq7RfMvHlv97vu+pJ69pJbCvg5ElXvu9V+4rxdYxJfMCJ5FTTga2mFDIhqSeVW+4lzqeAolpVo\npdtLR4hkjZZHzoCobtQRZD9xLRUcdIlmEYmOVsvMkKhWt4zjhiRB7Nixi+HhjcyZs4RTpybYtu1B\nhoY2JNYekawKulqmAr4kKi1fPiJZFjTgd5TSMbNrzOw7ZvaGmS1rst06MztqZv9iZrd2ckzJF91t\nSiQ+nebwnwN+E/iHRhuY2SzgAeBXgQ8BQ2Z2aYfHzaV6t+4TEYlKRwHf3Z939xeAZpcSlwMvuPuE\nu58CdgJXdXLcPOrGvV9FRGrFUaVzIfBKzfMfVl6TiqmpKYaHN1Iq7eP48f2USvsYHt6onr6IRKpl\nHb6ZPQmcX/sS4MDt7v54Nxo1MjJy5udCoUChUOjGYVIjjpr0VjR4KpIdxWKRYrEY+vciqdIxs33A\nze5+oM57K4ARd19Xeb6J8qywexrsq+eqdJK+mXm1PLJ679h2yyP1pSGSjFiqdKYfs8HrY8B7zWyJ\nmc0FPgU8FuFxMy+qiVX1tBoIjiqdpDEIkQwIsv5CowdwNeX8fAn4EfCtyusXAN+o2W4d8DzwArCp\nxT6jW2AiY6Je+yXI4mRRrNOTtbtvieQNAdfS0cSrnAqaJooinTQ2Nsbatddx/Pj+M6/19y9j796t\nLF++PNLzEpGZkkjpSIpUB4LLQRxqB4JrRZFO0ro4ItmgHn5Ohe25dzrgqnVxRJKjtXRyopNAXBuE\nT558idtvv4Vrr/1c1ypoVKUjkgyldHKgUeVL0CUYhoY2MDFxlFtuuQazWXzlK3/b1QoarYsjkm7q\n4adUo5TM5s13c9NNmwLXzCdV46/evkh81MPPuHqDrrNnv5sbb/xiqJr5oIO3UVJNvkg6KeCnVP3K\nl5eZO/c9hAneUVTQBJm8VX1f6wKJpJcCfkrVK5e8776vcPr0BGGCd6dll61669Pf37r14divKEQk\nGOXwU256Lrzd8sd2cuqt8v/13p83byVmsxJbF0ikFwXN4bdcLVOSNTAwcFagHBrawJo1q0MH7+n7\nCaLeKp6zZ7+bPXv2sH79+rrvz537Hm655Rq+/OVVZ30pKdiLJE89fGloZg/+XmCEhQsv5fTpiTMV\nQ/V684CqdERioolXEolqCmn27Hdz4sT3gW9Tr0xUM2xFkqOAL5GZmppiz5493HDDfbz+evWWB1PM\nn/8Jvva1+xkcHGxrfEBXACLRUB2+RGZgYID169fXVAjtAj7AT3/6JldfPcTevU+FmmGrOn2RZKiH\nL4Ht2LGLz372On7+85PA/6adKpyk7+4lkkfq4UvkhoY2sHv3LubPfx/t1tknMfNXRMoU8CWUwcFB\n3nzzFdqduau180WSo4AvoXQ6c7eb9+8VkeaUw5e2dFployodkeioLFNEpEdo0FZERM6igC8i0iM6\nCvhmdo2ZfcfM3jCzZU22GzezZ83soJmNdnJMERFpT6c9/OeA3wT+ocV2bwIFdx9098s7PGbiisVi\n0k0IRO2MltoZLbUzfh0FfHd/3t1fAFoNFlinx0qTrPwHUDujpXZGS+2MX1xB2IEnzWzMzD4X0zFF\nRKRGyxugmNmTwPm1L1EO4Le7++MBj/Nxd/+RmQ1QDvxH3P3p8M0VEZF2RVKHb2b7g
Jvd/UCAbe8E\nXnf3/9HgfRXhi4iEFPctDusezMzeBsxy9xNmNh+4AvhvjXYSpNEiIhJep2WZV5vZK8AK4Btm9q3K\n6xeY2Tcqm50PPG1mBynfLulxd3+ik+OKiEh4qVtaQUREuiO1pZJmdrOZvWlmi5NuSz1m9t9rJpP9\nnZm9K+k21WNm95rZETM7ZGZ/a2b9SbepnqCT+JJgZuvM7KiZ/YuZ3Zp0exoxs21mdszMDrfeOhlm\ndpGZPWVm3zWz58zsD5NuUz1mdq6ZPVP5+36uMvaYWmY2y8wOmNljzbZLZcA3s4uAtcBE0m1p4l53\n/4i7DwLfBNL6H+IJ4EPufhnwAvBfEm5PI0En8cXKzGYBDwC/CnwIGDKzS5NtVUN/TrmdaXYa+Ly7\nfwj4GPAHafw83f3/Aasqf9+XAVeaWZonjd4IfK/VRqkM+MBm4JakG9GMu5+oeTqf8mzi1HH3ve5e\nbdu3gYuSbE8jISbxxe1y4AV3n3D3U8BO4KqE21RXpdT5x0m3oxl3f83dD1V+PgEcAS5MtlX1ufvP\nKj+eS7nAJZX570oHeT3wP1ttm7qAb2afBF5x9+eSbksrZvZHZvYy8NvAf026PQF8FvhW0o3ImAuB\nV2qe/5CUBqisMbOllHvPzyTbkvoqaZKDwGvAk+4+lnSbGqh2kFt+IUVZlhlYk8lcdwC3UU7n1L6X\niFaTztz9DuCOSl73BmAk/lYGmxxnZrcDp9x9ewJNpNKGKCbxSQ6Y2QLgb4Abp10tp0blyniwMu71\ndTP7oLu3TJvEycx+DTjm7ofMrECLeJlIwHf3tfVeN7NfBJYCz5qZUU4/7Dezy919MsYmAo3bWcd2\nYA8JBfxW7TSz36V8ybc6lgY1EOLzTJNXgYtrnl9UeU3aZGbnUA72f+Huu5NuTyvu/n8rk0vXESBP\nHrOPA580s/VAH7DQzB5190/X2zhVKR13/467v8vdf8Hd30P58nkwiWDfipm9t+bp1ZRzkaljZuso\nX+59sjIQlQVpyuOPAe81syVmNhf4FNC0EiJhRro+v3r+DPieu9+XdEMaMbN/Z2aLKj/3Uc46HE22\nVTO5+23ufrG7/wLl/5tPNQr2kLKAX4eT3v+8d5vZYTM7BKyhPEqeRvcDCyivYXTAzB5MukH1NJrE\nlzR3fwO4nnK103eBne6e1i/37cA/A+83s5fN7DNJt2k6M/s48DvA6krJ44FKpyRtLgD2Vf6+nwH+\n3t33JNymjmnilYhIj0h7D19ERCKigC8i0iMU8EVEeoQCvohIj1DAFxHpEQr4IiI9QgFfRKRHKOCL\niPSI/w+536Vu0dYGYwAAAABJRU5ErkJggg==\n", 60 | "text/plain": [ 61 | "" 62 | ] 63 | }, 64 | "metadata": {}, 65 | "output_type": "display_data" 66 | } 67 | ], 68 | "source": [ 69 | "# %% Let's create some toy data\n", 70 | "plt.ion()\n", 71 | "n_observations = 100\n", 72 | "fig, ax = plt.subplots(1, 1)\n", 73 | "xs = np.linspace(-3, 3, n_observations)\n", 74 | "ys = np.sin(xs) + np.random.uniform(-0.5, 0.5, n_observations)\n", 75 | "ax.scatter(xs, ys)\n", 76 | "fig.show()\n", 77 | "plt.draw()" 78 | ] 79 | }, 80 | { 81 | "cell_type": "code", 82 | "execution_count": 4, 83 | "metadata": { 84 | "collapsed": true 85 | }, 86 | "outputs": [], 87 | "source": [ 88 | "# %% tf.placeholders for the input and output of the network. Placeholders are\n", 89 | "# variables which we need to fill in when we are ready to compute the graph.\n", 90 | "X = tf.placeholder(tf.float32)\n", 91 | "Y = tf.placeholder(tf.float32)" 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": 5, 97 | "metadata": { 98 | "collapsed": true 99 | }, 100 | "outputs": [], 101 | "source": [ 102 | "# %% We will try to optimize min_(W,b) ||(X*w + b) - y||^2\n", 103 | "# The `Variable()` constructor requires an initial value for the variable,\n", 104 | "# which can be a `Tensor` of any type and shape. The initial value defines the\n", 105 | "# type and shape of the variable. After construction, the type and shape of\n", 106 | "# the variable are fixed. 
The value can be changed using one of the assign\n", 107 | "# methods.\n", 108 | "W = tf.Variable(tf.random_normal([1]), name='weight')\n", 109 | "b = tf.Variable(tf.random_normal([1]), name='bias')\n", 110 | "Y_pred = tf.add(tf.mul(X, W), b)" 111 | ] 112 | }, 113 | { 114 | "cell_type": "code", 115 | "execution_count": 6, 116 | "metadata": { 117 | "collapsed": true 118 | }, 119 | "outputs": [], 120 | "source": [ 121 | "# %% Loss function will measure the distance between our observations\n", 122 | "# and predictions and average over them.\n", 123 | "cost = tf.reduce_sum(tf.pow(Y_pred - Y, 2)) / (n_observations - 1)" 124 | ] 125 | }, 126 | { 127 | "cell_type": "code", 128 | "execution_count": 7, 129 | "metadata": { 130 | "collapsed": true 131 | }, 132 | "outputs": [], 133 | "source": [ 134 | "# %% if we wanted to add regularization, we could add other terms to the cost,\n", 135 | "# e.g. ridge regression has a parameter controlling the amount of shrinkage\n", 136 | "# over the norm of activations. the larger the shrinkage, the more robust\n", 137 | "# to collinearity.\n", 138 | "# cost = tf.add(cost, tf.mul(1e-6, tf.global_norm([W])))" 139 | ] 140 | }, 141 | { 142 | "cell_type": "code", 143 | "execution_count": 8, 144 | "metadata": { 145 | "collapsed": true 146 | }, 147 | "outputs": [], 148 | "source": [ 149 | "# %% Use gradient descent to optimize W,b\n", 150 | "# Performs a single step in the negative gradient\n", 151 | "learning_rate = 0.01\n", 152 | "optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)" 153 | ] 154 | }, 155 | { 156 | "cell_type": "code", 157 | "execution_count": 9, 158 | "metadata": { 159 | "collapsed": false 160 | }, 161 | "outputs": [ 162 | { 163 | "name": "stdout", 164 | "output_type": "stream", 165 | "text": [ 166 | "1.43989\n", 167 | "1.30686\n", 168 | "1.18916\n", 169 | "1.08503\n", 170 | "0.992879\n", 171 | "0.911325\n", 172 | "0.839141\n", 173 | "0.77524\n", 174 | "0.718663\n", 175 | "0.668564\n", 176 | "0.624192\n", 177 | "0.584885\n", 178 | "0.550058\n", 179 | "0.519194\n", 180 | "0.491835\n", 181 | "0.467578\n", 182 | "0.446063\n", 183 | "0.426977\n", 184 | "0.410039\n", 185 | "0.395001\n", 186 | "0.381647\n", 187 | "0.369783\n", 188 | "0.359238\n", 189 | "0.349861\n", 190 | "0.341519\n", 191 | "0.334094\n", 192 | "0.327481\n", 193 | "0.321587\n", 194 | "0.316332\n", 195 | "0.311643\n", 196 | "0.307455\n", 197 | "0.303713\n", 198 | "0.300366\n", 199 | "0.29737\n", 200 | "0.294686\n", 201 | "0.292278\n", 202 | "0.290117\n", 203 | "0.288175\n", 204 | "0.286427\n", 205 | "0.284853\n", 206 | "0.283433\n", 207 | "0.282151\n", 208 | "0.280991\n", 209 | "0.279942\n", 210 | "0.27899\n", 211 | "0.278125\n", 212 | "0.277339\n", 213 | "0.276623\n", 214 | "0.27597\n", 215 | "0.275373\n", 216 | "0.274826\n", 217 | "0.274325\n", 218 | "0.273865\n", 219 | "0.273442\n", 220 | "0.273052\n", 221 | "0.272693\n", 222 | "0.27236\n", 223 | "0.272052\n", 224 | "0.271767\n", 225 | "0.271501\n", 226 | "0.271254\n", 227 | "0.271024\n", 228 | "0.27081\n", 229 | "0.270609\n", 230 | "0.270421\n", 231 | "0.270245\n", 232 | "0.27008\n", 233 | "0.269925\n", 234 | "0.269779\n", 235 | "0.269642\n", 236 | "0.269512\n", 237 | "0.26939\n", 238 | "0.269274\n", 239 | "0.269165\n", 240 | "0.269062\n", 241 | "0.268964\n", 242 | "0.268871\n", 243 | "0.268783\n", 244 | "0.268699\n", 245 | "0.26862\n", 246 | "0.268544\n", 247 | "0.268472\n", 248 | "0.268404\n", 249 | "0.268339\n", 250 | "0.268277\n", 251 | "0.268218\n", 252 | "0.268161\n", 253 | "0.268107\n", 254 | "0.268056\n", 
255 | "0.268007\n", 256 | "0.26796\n", 257 | "0.267916\n", 258 | "0.267873\n", 259 | "0.267832\n", 260 | "0.267794\n", 261 | "0.267756\n", 262 | "0.267721\n", 263 | "0.267687\n", 264 | "0.267654\n", 265 | "0.267623\n", 266 | "0.267594\n", 267 | "0.267565\n", 268 | "0.267538\n", 269 | "0.267512\n", 270 | "0.267487\n", 271 | "0.267464\n", 272 | "0.267441\n", 273 | "0.267419\n", 274 | "0.267398\n", 275 | "0.267378\n", 276 | "0.267359\n", 277 | "0.267341\n", 278 | "0.267324\n", 279 | "0.267307\n", 280 | "0.267291\n", 281 | "0.267276\n", 282 | "0.267261\n", 283 | "0.267247\n", 284 | "0.267234\n", 285 | "0.267221\n", 286 | "0.267209\n", 287 | "0.267197\n", 288 | "0.267186\n", 289 | "0.267176\n", 290 | "0.267165\n", 291 | "0.267156\n", 292 | "0.267146\n", 293 | "0.267137\n", 294 | "0.267129\n", 295 | "0.267121\n", 296 | "0.267113\n", 297 | "0.267105\n", 298 | "0.267098\n", 299 | "0.267092\n", 300 | "0.267085\n", 301 | "0.267079\n", 302 | "0.267073\n", 303 | "0.267067\n", 304 | "0.267062\n", 305 | "0.267057\n", 306 | "0.267052\n", 307 | "0.267047\n", 308 | "0.267043\n", 309 | "0.267039\n", 310 | "0.267034\n", 311 | "0.267031\n", 312 | "0.267027\n", 313 | "0.267023\n", 314 | "0.26702\n", 315 | "0.267017\n", 316 | "0.267014\n", 317 | "0.267011\n", 318 | "0.267008\n", 319 | "0.267006\n", 320 | "0.267003\n", 321 | "0.267001\n", 322 | "0.266998\n", 323 | "0.266996\n", 324 | "0.266994\n", 325 | "0.266992\n", 326 | "0.26699\n", 327 | "0.266989\n", 328 | "0.266987\n", 329 | "0.266985\n", 330 | "0.266984\n", 331 | "0.266982\n", 332 | "0.266981\n", 333 | "0.26698\n", 334 | "0.266979\n", 335 | "0.266978\n", 336 | "0.266976\n", 337 | "0.266975\n", 338 | "0.266974\n" 339 | ] 340 | }, 341 | { 342 | "data": { 343 | "text/plain": [ 344 | "" 345 | ] 346 | }, 347 | "metadata": {}, 348 | "output_type": "display_data" 349 | } 350 | ], 351 | "source": [ 352 | "# %% We create a session to use the graph\n", 353 | "n_epochs = 1000\n", 354 | "with tf.Session() as sess:\n", 355 | " # Here we tell tensorflow that we want to initialize all\n", 356 | " # the variables in the graph so we can use them\n", 357 | " sess.run(tf.initialize_all_variables())\n", 358 | "\n", 359 | " # Fit all training data\n", 360 | " prev_training_cost = 0.0\n", 361 | " for epoch_i in range(n_epochs):\n", 362 | " for (x, y) in zip(xs, ys):\n", 363 | " sess.run(optimizer, feed_dict={X: x, Y: y})\n", 364 | "\n", 365 | " training_cost = sess.run(\n", 366 | " cost, feed_dict={X: xs, Y: ys})\n", 367 | " print(training_cost)\n", 368 | "\n", 369 | " if epoch_i % 20 == 0:\n", 370 | " ax.plot(xs, Y_pred.eval(\n", 371 | " feed_dict={X: xs}, session=sess),\n", 372 | " 'k', alpha=epoch_i / n_epochs)\n", 373 | " fig.show()\n", 374 | " plt.draw()\n", 375 | "\n", 376 | " # Allow the training to quit if we've reached a minimum\n", 377 | " if np.abs(prev_training_cost - training_cost) < 0.000001:\n", 378 | " break\n", 379 | " prev_training_cost = training_cost\n", 380 | "fig.show()\n" 381 | ] 382 | } 383 | ], 384 | "metadata": { 385 | "kernelspec": { 386 | "display_name": "Python [Root]", 387 | "language": "python", 388 | "name": "Python [Root]" 389 | }, 390 | "language_info": { 391 | "codemirror_mode": { 392 | "name": "ipython", 393 | "version": 2 394 | }, 395 | "file_extension": ".py", 396 | "mimetype": "text/x-python", 397 | "name": "python", 398 | "nbconvert_exporter": "python", 399 | "pygments_lexer": "ipython2", 400 | "version": "2.7.11" 401 | } 402 | }, 403 | "nbformat": 4, 404 | "nbformat_minor": 0 405 | } 406 | 
-------------------------------------------------------------------------------- /notebooks/03_polynomial_regression.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "\"\"\"Simple tutorial for using TensorFlow to compute polynomial regression.\n", 10 | "\n", 11 | "Parag K. Mital, Jan. 2016\"\"\"" 12 | ] 13 | }, 14 | { 15 | "cell_type": "code", 16 | "execution_count": null, 17 | "metadata": {}, 18 | "outputs": [], 19 | "source": [ 20 | "# %% Imports\n", 21 | "import numpy as np\n", 22 | "import tensorflow as tf\n", 23 | "import matplotlib.pyplot as plt\n" 24 | ] 25 | }, 26 | { 27 | "cell_type": "code", 28 | "execution_count": null, 29 | "metadata": {}, 30 | "outputs": [], 31 | "source": [ 32 | "# %% Let's create some toy data\n", 33 | "plt.ion()\n", 34 | "n_observations = 100\n", 35 | "fig, ax = plt.subplots(1, 1)\n", 36 | "xs = np.linspace(-3, 3, n_observations)\n", 37 | "ys = np.sin(xs) + np.random.uniform(-0.5, 0.5, n_observations)\n", 38 | "ax.scatter(xs, ys)\n", 39 | "fig.show()\n", 40 | "plt.draw()" 41 | ] 42 | }, 43 | { 44 | "cell_type": "code", 45 | "execution_count": null, 46 | "metadata": {}, 47 | "outputs": [], 48 | "source": [ 49 | "# %% tf.placeholders for the input and output of the network. Placeholders are\n", 50 | "# variables which we need to fill in when we are ready to compute the graph.\n", 51 | "X = tf.placeholder(tf.float32)\n", 52 | "Y = tf.placeholder(tf.float32)" 53 | ] 54 | }, 55 | { 56 | "cell_type": "code", 57 | "execution_count": null, 58 | "metadata": {}, 59 | "outputs": [], 60 | "source": [ 61 | "# %% Instead of a single factor and a bias, we'll create a polynomial function\n", 62 | "# of different polynomial degrees. We will then learn the influence that each\n", 63 | "# degree of the input (X^0, X^1, X^2, ...) has on the final output (Y).\n", 64 | "Y_pred = tf.Variable(tf.random_normal([1]), name='bias')\n", 65 | "for pow_i in range(1, 5):\n", 66 | " W = tf.Variable(tf.random_normal([1]), name='weight_%d' % pow_i)\n", 67 | " Y_pred = tf.add(tf.multiply(tf.pow(X, pow_i), W), Y_pred)" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": null, 73 | "metadata": {}, 74 | "outputs": [], 75 | "source": [ 76 | "# %% Loss function will measure the distance between our observations\n", 77 | "# and predictions and average over them.\n", 78 | "cost = tf.reduce_sum(tf.pow(Y_pred - Y, 2)) / (n_observations - 1)" 79 | ] 80 | }, 81 | { 82 | "cell_type": "code", 83 | "execution_count": null, 84 | "metadata": {}, 85 | "outputs": [], 86 | "source": [ 87 | "# %% if we wanted to add regularization, we could add other terms to the cost,\n", 88 | "# e.g. ridge regression has a parameter controlling the amount of shrinkage\n", 89 | "# over the norm of activations. 
the larger the shrinkage, the more robust\n", 90 | "# to collinearity.\n", 91 | "# cost = tf.add(cost, tf.mul(1e-6, tf.global_norm([W])))" 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": null, 97 | "metadata": {}, 98 | "outputs": [], 99 | "source": [ 100 | "# %% Use gradient descent to optimize W,b\n", 101 | "# Performs a single step in the negative gradient\n", 102 | "learning_rate = 0.01\n", 103 | "optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)" 104 | ] 105 | }, 106 | { 107 | "cell_type": "code", 108 | "execution_count": null, 109 | "metadata": {}, 110 | "outputs": [], 111 | "source": [ 112 | "# %% We create a session to use the graph\n", 113 | "n_epochs = 1000\n", 114 | "with tf.Session() as sess:\n", 115 | " # Here we tell tensorflow that we want to initialize all\n", 116 | " # the variables in the graph so we can use them\n", 117 | " sess.run(tf.initialize_all_variables())\n", 118 | "\n", 119 | " # Fit all training data\n", 120 | " prev_training_cost = 0.0\n", 121 | " for epoch_i in range(n_epochs):\n", 122 | " for (x, y) in zip(xs, ys):\n", 123 | " sess.run(optimizer, feed_dict={X: x, Y: y})\n", 124 | "\n", 125 | " training_cost = sess.run(\n", 126 | " cost, feed_dict={X: xs, Y: ys})\n", 127 | " print(training_cost)\n", 128 | "\n", 129 | " if epoch_i % 100 == 0:\n", 130 | " ax.plot(xs, Y_pred.eval(\n", 131 | " feed_dict={X: xs}, session=sess),\n", 132 | " 'k', alpha=epoch_i / n_epochs)\n", 133 | " fig.show()\n", 134 | " plt.draw()\n", 135 | "\n", 136 | " # Allow the training to quit if we've reached a minimum\n", 137 | " if np.abs(prev_training_cost - training_cost) < 0.000001:\n", 138 | " break\n", 139 | " prev_training_cost = training_cost\n", 140 | "ax.set_ylim([-3, 3])\n", 141 | "fig.show()\n", 142 | "plt.waitforbuttonpress()" 143 | ] 144 | } 145 | ], 146 | "metadata": { 147 | "language": "python" 148 | }, 149 | "nbformat": 4, 150 | "nbformat_minor": 0 151 | } 152 | -------------------------------------------------------------------------------- /notebooks/04_logistic_regression.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 8, 6 | "metadata": { 7 | "collapsed": false 8 | }, 9 | "outputs": [ 10 | { 11 | "data": { 12 | "text/plain": [ 13 | "'Simple tutorial using code from the TensorFlow example for Regression.\\n\\nParag K. Mital, Jan. 2016'" 14 | ] 15 | }, 16 | "execution_count": 8, 17 | "metadata": {}, 18 | "output_type": "execute_result" 19 | } 20 | ], 21 | "source": [ 22 | "\"\"\"Simple tutorial using code from the TensorFlow example for Regression.\n", 23 | "\n", 24 | "Parag K. Mital, Jan. 
2016\"\"\"\n", 25 | "# pip3 install --upgrade\n", 26 | "# https://storage.googleapis.com/tensorflow/mac/tensorflow-0.6.0-py3-none-any.whl" 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": 9, 32 | "metadata": { 33 | "collapsed": false 34 | }, 35 | "outputs": [], 36 | "source": [ 37 | "# %%\n", 38 | "%matplotlib inline\n", 39 | "import tensorflow as tf\n", 40 | "import tensorflow.examples.tutorials.mnist.input_data as input_data\n", 41 | "import numpy as np\n", 42 | "import matplotlib.pyplot as plt\n" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": 10, 48 | "metadata": { 49 | "collapsed": false 50 | }, 51 | "outputs": [ 52 | { 53 | "name": "stdout", 54 | "output_type": "stream", 55 | "text": [ 56 | "Extracting MNIST_data/train-images-idx3-ubyte.gz\n", 57 | "Extracting MNIST_data/train-labels-idx1-ubyte.gz\n", 58 | "Extracting MNIST_data/t10k-images-idx3-ubyte.gz\n", 59 | "Extracting MNIST_data/t10k-labels-idx1-ubyte.gz\n" 60 | ] 61 | } 62 | ], 63 | "source": [ 64 | "# %%\n", 65 | "# get the classic mnist dataset\n", 66 | "# one-hot means a sparse vector for every observation where only\n", 67 | "# the class label is 1, and every other class is 0.\n", 68 | "# more info here:\n", 69 | "# https://www.tensorflow.org/versions/0.6.0/tutorials/mnist/download/index.html#dataset-object\n", 70 | "mnist = input_data.read_data_sets('MNIST_data/', one_hot=True)" 71 | ] 72 | }, 73 | { 74 | "cell_type": "code", 75 | "execution_count": 11, 76 | "metadata": { 77 | "collapsed": false 78 | }, 79 | "outputs": [ 80 | { 81 | "name": "stdout", 82 | "output_type": "stream", 83 | "text": [ 84 | "(55000, 10000, 5000)\n" 85 | ] 86 | } 87 | ], 88 | "source": [ 89 | "# %%\n", 90 | "# mnist is now a DataSet with accessors for:\n", 91 | "# 'train', 'test', and 'validation'.\n", 92 | "# within each, we can access:\n", 93 | "# images, labels, and num_examples\n", 94 | "print(mnist.train.num_examples,\n", 95 | " mnist.test.num_examples,\n", 96 | " mnist.validation.num_examples)" 97 | ] 98 | }, 99 | { 100 | "cell_type": "code", 101 | "execution_count": 12, 102 | "metadata": { 103 | "collapsed": false 104 | }, 105 | "outputs": [ 106 | { 107 | "name": "stdout", 108 | "output_type": "stream", 109 | "text": [ 110 | "((55000, 784), (55000, 10))\n" 111 | ] 112 | } 113 | ], 114 | "source": [ 115 | "# %% the images are stored as:\n", 116 | "# n_observations x n_features tensor (n-dim array)\n", 117 | "# the labels are stored as n_observations x n_labels,\n", 118 | "# where each observation is a one-hot vector.\n", 119 | "print(mnist.train.images.shape, mnist.train.labels.shape)" 120 | ] 121 | }, 122 | { 123 | "cell_type": "code", 124 | "execution_count": 13, 125 | "metadata": { 126 | "collapsed": false 127 | }, 128 | "outputs": [ 129 | { 130 | "name": "stdout", 131 | "output_type": "stream", 132 | "text": [ 133 | "(0.0, 1.0)\n" 134 | ] 135 | } 136 | ], 137 | "source": [ 138 | "# %% the range of the values of the images is from 0-1\n", 139 | "print(np.min(mnist.train.images), np.max(mnist.train.images))" 140 | ] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "execution_count": 14, 145 | "metadata": { 146 | "collapsed": false 147 | }, 148 | "outputs": [ 149 | { 150 | "data": { 151 | "text/plain": [ 152 | "" 153 | ] 154 | }, 155 | "execution_count": 14, 156 | "metadata": {}, 157 | "output_type": "execute_result" 158 | }, 159 | { 160 | "data": { 161 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAP4AAAD8CAYAAABXXhlaAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztnV+IfNlV77+r629X//v1hDszkNHxinAvCDIoBi5z4UYU\nbxBhxIfcEB+SK4gPRgVfEn2ZV+NDIFzwJY5hFINXhdwZXzSRIJcIXodr5jrRiREuMxrN/DLj7/fr\n7uru6urq2j50rdPrrNr7nNO/rqo+Xef7gc05dbq6and1fc9ae+2115YQAgghzWLjtjtACFk9FD4h\nDYTCJ6SBUPiENBAKn5AGQuET0kBuJHwR+ZCIfENEvikin1xUpwghy0Uedx5fRDYAfBPAjwL4FwCv\nAfhICOEb7nlMFCDklgghSOz6TSz+BwD8Qwjh7RDCOYDfB/DCDV6PELIibiL89wP4J/P4W7NrhJCa\nw+AeIQ3kJsL/ZwDfbR4/M7tGCKk5NxH+awC+T0SeFZEugI8AeHUx3SKELJP24/5iCOFCRD4B4Eu4\nvIG8FEJ4c2E9I4Qsjceezqv8BpzOI+TWWMZ0HiHkjkLhE9JAKHxCGgiFT0gDofAJaSAUPiENhMIn\npIFQ+IQ0EAqfkAZC4RPSQCh8QhoIhU9IA6HwCWkgFD4hDYTCJ6SBUPiENBAKn5AGQuET0kAofEIa\nCIVPSAOh8AlpIBQ+IQ2EwiekgVD4hDQQCp+QBkLhE9JAKHxCGgiFT0gDofAJaSAUPiENhMInpIG0\nb/LLIvIWgAMAUwDnIYQPLKJThJDlciPh41LwHwwhPFxEZwghq+Gmrr4s4DUIISvmpqINAL4sIq+J\nyM8tokOEkOVzU1f/+RDCt0Xk3+HyBvBmCOGri+gYIWR53MjihxC+PTu+C+CLABjcI+QO8NjCF5GB\niGzPzrcA/DiAry+qY4SQ5XETV/8pAF8UkTB7nd8LIXxpMd0ihCwTCSEs9w0ubwyEkFsghCCx65yK\nI6SBUPiENBAKn5AGctN5fFIBESl8XHZ9GZT1ycZ+rhsHEpHs9crOYz8LIWTv6c/L+qPP97/vrzcd\nCn8JVPnSF11bRf9SrUgwVUQjItjY2Mhaq9XKPS67BgDT6RTT6RQXFxfZubaY+O25/l6qlf1+U6Dw\nl4QKaWNjo1BosbbKfvkjcCkEFZkV3HQ6LX19FXG73Ua73a50bh8DwGQywcXFRfSofUh5ARcXFzg/\nP482ALmbh/6eveE1BQp/CVhxpQSWOq5C+N7aeosbs7Te4qZotVrodDrR1u12S68BSAr3/Pw860PK\nnZ9MJjg7O8N4PMbZ2RnOzs5ynoT+jvdumiZ+Cn/BeNGnWuo5qxB+q9XKhK7nMeGre2zPqwi/1+uh\n1+uh2+3mjv489jMAmWjH4/Hc+WQySQ5DQgg4Pz/H6ekpTk9P0Wq1MkFbjyE2bGmS6AEKf2n4G4AV\nWmq8q1/UZWLFri62PQeuXG3rZut5mUDa7Tb6/X7WNjc3c4+Lfra5uYkQAkajEc7OzjAajbKmj714\ntem18XiMbreLdruds/STyQTj8Tj6+TZN9ACFvxRioi+ysvbxKoRvx9Z+nA1cCt82Fb0dY6fodDrY\n3NzEYDDAYDDIzmPXYo9DCJnF1nZycpKde3ffxh/0ppESvX7eNl6hbr6eNwUKfwnYQF1M/N7axtzt\nZbGxsYFOp4N2u5076jlwOcaeTCa54/n5OVqtVqnwu90uBoMBtra2km17ezt5LYSAk5MTHB8fZ0d7\nPh6Po0FHPT89Pc3ErYG+s7MzdDqd7POdTqdzsyhNEj1A4UepMt8ce6xCj0WvY8fY+bItvr5PTPSd\nTicLkHnR63kV4RcJPHbdPtbX0KYBQD1PCV+PvV4PwNXMhPVWxuMxACRjF00K8FH4Dg1+pcbgZdeK\nhO6tfOzxbbr6djot1aq4+kWuvXX9NaCnrrkKTz/HTqeDXq+XCVtESl39TqcTDeDp59rtdnM3NG36\n2hR+Q1FBF4nDN7WY17HsMTd/Va6+74ftD4BoUO9xgntFgb1+v49er5dzwYGrG2+73Ua3281uNHpd\nxRkL7KnwY31Uj6zb7WbTfHaqTz2dpkDhz7Cuu52Ltu6mdTtTj/WLfB2xr1L4djovFly0LvIyp/Ps\n56afjf4P9MZrLb1e0z6kmk4J+r9ZW7vdxunpae49NR6wypTp24bCRz5N1lucXq+XWSfb9Jr9Wb/f\nTwrfP44F/FYl/KIUWuAq7fVxEng0eBhL1vHNeks2h0FFrqK3/4+LiwsA6Zz8brc710cbf9H3VNGX\nTfWtK40Xfmyxilr8brebc1k3NzdLm7VgVZu9CSxb+ADmEohSKbux4FkZalVjw56yG6Htn8Y7bDAy\nla9vj2rxrQcXS5gC5qf6KPyGEBO9tzAqeJ2islNVsfNer3dtsdvHq8rVL1ojcJOFOmUZi0U3He2D\nfg7W8scCdrFz6+qnFgype6+iH41GFH5TsW6md/U3NzezaSdtOzs7uaOelwnfW/dVC9/+ranzMnFV\nee2yY9lrxNKXq2Td9Xq9uYVI9gYA5C39aDTKuf5NoZHC9/Pu9rFaebXgKmjbYsLXoxV+TNipsfV1\ncvWLxFjkBtu/3x+rCLPqjSHmIfg029TjsnUOKU/Fnvf7/dxUnZ2OHI/HOD09zWYU/KxCU2ic8O08\nfcwSb25uYmdnB7u7u9GmySbezdcgX7fbLQye2ekjDaDFvrwpUkGt1BSXPwLz7rh3vWM3AT8UKHrf\nmzQbI0ilFRfdVH2fvcVP3UCaRiOFb6O7fi5+MBhgd3cXe3t72Nvby871uL29HQ3q9fv9LJIdCyZZ\n78KKBLi+O+3F7AVup+D80U/n+YCbFU/Mqur7x1JmbQENmwcQe+ynCfWxnRWINT8boOe2f17UVT2I\nJtFI4dtIsf9ibW1t5YS/t7eHe/fuZefqzvvpPBW+RqP9lyoWNU9Fz4sos6qpVXV6VIsay9e3c9up\nVnZzUZfautg+99/3yZ63Wq1cgk8s6cf/zwDMuer+c68i+CaJv3HCB66mi/w8fb/fx/b2dubWq+j3\n9/dx79493Lt3D4PBIGmJut1uVDgAcsIBrlx9b/HKpszsEMFbzpjw/FHjGD4HXs+t1Y8FyDQanipt\ndX5+jvF4nB1j50U3h1arlVy5Z1N9+/1+9lnZ6L/9rIvc/CZbe6CBwtd/up+n16bjey/6/f197O/v\nYzAYzLma1uW0lidmTax7b4VaNRfeu8cxq27FpgEuPReRuWQk22wyTWz8r8KPLdlVAduUWN9sn2Kt\n3W7PzaBsbW3h7Owse44u3wWuRB/LKiy6AaSChE2hkcK3rr5O12mwzgrfiv+JJ57AE088gc3NzbmA\nUtF0XCrqbgVsxVlF+DHR6bkKz1ausceNjY1kLv1kMknGKLSFEOZW7NmjL6Ch7fT0NCuoYT0A21T4\nu7u72f9Bf+/8/HwubdgOW3q9XtTiL2IWZR1ppPBjrr6d
ulNXX917L/yy6SQgXSLaBuSsWFWYmpKa\nYjqdFrrL+jq2ao09bmxsZDe60WiEwWCQlbSaTqeZ8FNTj/r+3qOw720LZ/jz0WiUvCmdnZ2h2+1m\nN9zT09PsRmFz9PXztv9Db/GruPu0+A3DirVo6i02zgXmp9SU1BSbFbtaxVQrWyGmFj8lfk1KSQlf\nROZEZ5ufjvQ3gBBCdAhhhW8Ffx3h2/e3lly9ExsctHGNWEafD576gKefmagym7JONFL4+g/3VlJX\nbh0fH2dBL5vcMZ1O0e/3owK3j1Nz6DZjzI557XkVi19ljF3k6tsgnAp1c3MTx8fHmcVPNfu5xdx9\nTZBJufvWrU+573aO3s6+2BV+PvlGb85W6PpZ6HufnJxkNyK9AWnfq8yorBONE76djvLC18U59ktl\no/EXFxfZWNJnnsWsTKyp8GMVZKsKv2h+3FtifxSR3N/sg3udTmcuoGePPrjnb0L2Rpa6ufmpPW91\nfa6FFb3Nl9Cgqh+rW+Hbm5uW8dIhhH4m2ocm0Ujhx8bXajlshph17fX5WhwiNQ9vLU7s3ArTB7aq\nCN9OA/p5dCv+1JSeiGRiiOUx2AUysaFObBrS5wykpvFU9P751tpai29nS3QGxt6gYqv7rGel7xmz\n+PYmZG8+TaFU+CLyEoCfBHA/hPADs2v7AP4ngGcBvAXgwyGEgyX2c6H4iHoqkGUTVvSL1Ol0kmvV\nU/Pr9rEPhvnzKsK/TgKPHw6ISDJ5x3s5y0jg8c+3DZh39e3Nybr6dvmzvzGlLL4KX4cedPWL+TyA\n/wHgd8y1TwH4sxDCb4jIJwH86uxa7bHC0S+jT1O11sNbsVarlUyeKbKE9lrZTjFV+l81Zdf3EUDp\nuvjYbIV3pVOeTiyQ5ltqiGQ/f+t9aZAvNsZPufo+2Bmz+HbGgK6+I4TwVRF51l1+AcB/mZ2/DODP\ncUeED+QF7VNpY8+xLmOr1SpMoCmytvb1Uscqwtdj0UKZ1DE2fo9NcwHxVXxF7xu7Cfhr/jP26Ptb\ni58K7sVuWCmLPxqNosE9uvrX48kQwn0ACCG8IyJPLrBPS8UH96zoU3Ps+sXp9/tZVLwoiaZq80tG\nq2Tu2b+j6Dx1VMqsun1e0fun3jt17t8v1vwYP+bqW+HH1kHExvhan98H9+jqPz536hOz4o/NzVvr\n5SP/IlIo+OuIP+YVNOHL5z0OIJ5tZ+v+X1f4ftoxljZsrT1d/WrcF5GnQgj3ReRpAN9ZZKeWScwt\ntTcAAHNuo/cQysbyMVEXJZA0zdoosVkDa+1TqyiLRB+Lc8RuzvZ/0MTPvqrwZdaUVwF8HMCnAXwM\nwCuL7dZyseNdG0W3rqgXvVp9TRKJRe5jAT7f/HN8EtC6U+Te21kVK3ptdgOOMmsfy3coEn6T/gdA\ntem8LwD4IID3icg/AngRwK8D+EMR+VkAbwP48DI7uWisZfD59YoXvVoZK/zUdF5smiv1O021OEqR\n+P08vs01SM1CpGYZ/A2g6Va/SlT/o4kf/diC+7Iy9MthH4cQssw0+2XRL6AtwZyKVpdFtIsi3U2z\nODGLn0rX9cKPpRID81OdVQXfpM9daWTmnnXl9Zp37VutFiaTSS65p9Vq5Z7vZwJSc+tF500VvR5j\nFt9a+5j4Y0MFYP7/SPGnaZzwgflpJ7XiKQtkj/Y1YnPq/nHV5zWNmOB1/j7l6usY37+OctMxfpNo\nrPCt5VeKElfs0X9JrvO47LlNosjix4J7avFTN1A9p/jLaaTwFYpw9ahVT+02bKsC+c1INT03FlDV\no13/rzUIYnP3TRY90HDhk9Vg3XG/S5HNyut2u9jf38+qGet+BXbVYAghl5jjqwAdHh7iwYMHePTo\nEQ4PDzEcDrOFObZun82UZFSfkAViBa/nmpFnK+vYo5Yy39nZwdbW1lzZcs3I89WF9Pzg4CAn/KOj\no2wNvi60irn7TbP6FD5ZCqn4yMbGVYVj3ZPQbkh67969bMeiwWCQrb33Ft8vtdU8fCv8g4ODzOL7\n+n2pIiBNgcInS8NP2wF5i69FP+2+hHbjkiJXX0t8DYdDHB0dZe3g4ACPHj3KufrHx8fZajytZOxj\nAxQ+ITfEu/hW+HaMr9WNreB1u7KdnZ1M+D5r0lr84XCIw8PDzMJrOzw8zFx9b/FjiVRNs/oUPlkK\nsQQbnbKz+xlY4Wtgz7r63uKr8MfjMU5OTjAcDnFwcICHDx/iwYMHODg4yKz/cDiMBvdSiVRNgsIn\nK8HO1acsvo3o212IY8K3BTQPDw/x8OFDvPfeezg4OMiKaurY3wf3FDv33zQofLJwYhl5evRbl2lg\nz+5ZqHvk2Yi+LX5qV0tqgE/H+oeHh7l6/rHaeoTCJwtGRZ/aw95P4cU2xbQJPH5vA7/m3hdKsdVz\n7ZRdE935Iih8snBsBR1fVMPu22ebndKz22HHaufbCjte+H5X3qZG7cug8MlCsRZfi2XatFwvfGvt\ntflqO76YZhWLH8vOI1dQ+GTh+PLYKmAd25eJ398srKsP5PdFiNXVS1l8Wv0rKHyyUGIW3y6rjbn5\n3tUvqvdfZPHV6tsKxnaMT66g8MnCKdoQIzbG98G92C69sdp6MVffBvZ8FV2K/woKnywUb/F9XXzv\n6sfG+D7pJ1Zsw66v92P8WIFTij4PhU8Wji+VHSuq4ZsN5hWVNrPj+FgkX1NyfWFTCj8PhU8WTlHp\nbJ/YE7PoReXJU+vsY269D+hR/FdQ+GSheBfdb5aREr0dv2tmXmxnYc3Ft8L3+9zHxE/R56HwydIo\nsvr+pqBY4dvSWdq8xffr7P3KO4o+DoVPFk6Rq5/amVex6+11IY5tuuTWrrP3wo+JnuLPQ+GTpVE2\n1o9F7q3FtyvwVOiPHj1Kuvp211u6+cVQ+GThlFl87+Zbq2+3LbPFNmyFnTJXP1V6m1xB4ZOlkAry\nFUX0NTPPCl/X1B8dHWVVdnxU3wb3YpugUvjzUPhkoVhx++2v7Kq72FbXap11jD8ajTJXX9fax6z9\neDxOJupQ9HEofLIwbNKOzdSzhTa0bPbm5ma0gq4ut1WLPxqNMquvLr9W17EpurF8fIo+DYVPboSP\nyotILj9f03K3t7exs7OD3d3dqPCB/PjeRvWtu2+tvbr5MeFT9MVQ+OSxSW2Yoavy1OKr8NXib29v\n54RvXX3NwVfhe3f/6Ogom9qzS3Ctm0/Rl7NR9gQReUlE7ovI35hrL4rIt0Tkr2ftQ8vtJqkbMdFb\nV99umuGFr8tvvatvF994i68Vc3VaL2XxKfpqlAofwOcB/NfI9c+EEH5w1v5kwf0id4TYjjnW1bdj\n/CJX30f0Nbhnx/jq6qvwy8b4JE2pqx9C+KqIPBv5kUSukQYRE7139WMWX6P7fowfc/XtGF+De7bq\njk/cIdWoYvFTfEJEXheR3xKRvYX1iNSelJtfRfhFFt8G92JR/ZTF1zl8Cr86jyv83wTwvSGE5wC8\nA+Azi+sSqTt
+rb1dT19WbMPXy7eVc9XS26aLc9Tt19x8vyKP4/vr8VhR/RDCu+bh5wD88WK6Q+qO\n3Q1HhW+LbWjUfmtrK7PsehPQ5B19DeBqUc7FxQU2NjZyK/F80czUenuK/vpUFb7AjOlF5OkQwjuz\nhz8N4OuL7hipJ1pWy2bl2fPt7e3cFlhaV8+O6RW7ik7xFj1WJrvJm10uilLhi8gXAHwQwPtE5B8B\nvAjgR0TkOQBTAG8B+Pkl9pHUCCt8697rUS2+bnpp3Xt9no7lrYD1mhV+yuJz2e3NqRLV/2jk8ueX\n0BdyB4htfGlbmavf6XQwmUxyordVcb3FT1XLpZt/M5i5R66FWnx17W3kfjAY5ISfsvgAsp1tYmm6\nqR1xuOx2cVD45FrELL7uc69JOn6Mby1+u93OrDwwX3En5urbJbep1FyK/3pQ+ORa+DF+r9fLbXXt\nXX0Vvwb3Wq0WJpNJFtW3q/Fi03V+Kywm6iyGmyTwkIZiLb5fiGOtvV9/79fe+0o7mqhjF+Fw19vl\nQItProVN3mm327n98GLj+aqC15V3RWW1KPrFQYtProWfx/dW37r2fn97oLiYplbY0WIbfrMMCn9x\nUPjkWtjgnhW+jexXtfi6Ak/z8A8ODnBwcJCz+MzFXw4UPqlMbCGOjez76bvY/vYpV7/I4lP4i4dj\nfFIJv/S2qqtfZYxvLf7h4WFWU8+P8cnioPBJKX4Zrl+V5y2+dfXV4lfZMOPw8BAHBwe5nXO00AYj\n+ouFwieFxIppqsW3U3pFY3wN7vk19z6qrxbf75dHV3/xUPhkjtimlto0EcdO49mma+19SS1fXccX\n0zw5Ocmaz9qjxV88FD7JYV35WNMqOpqhF5u7j03fjcdjTKfTrEim3wnXFt/w+fnMx188FD7J4Rfh\n+OaF73Pyu91uFvVX114ttq6+89V0/A2gaEUeWQwUPslhha8uvY7je70e9vb2sLu7GxW9XX1nS2rp\nSjwRSYo+tf7eip8sDgqf5LDCtxF7bWrtiyy+L7Bhi2dYV98uyLHi1wAgF+YsDwqf5PAWX5fc6go8\ntfhFwtcgnq2sYwttpFx8uytO7MZB8S8OCp/ksLn4dppOhW6Fn3L11WoD+fX24/E4avG9+G1lHYp+\nOVD4JIe3+HYnHG/tYxbfFtqwwT0/fVcU1VdYT295UPgkh192axN0/NLbsv3urZuvCTtFU3c6PCDL\nh4t0yBwqfrtphp3is0ttbZIPgJyL7sf3fturWDktshoofBLFi1/Frk1vBlb8ihW9tfi6FDc1T09W\nB4VPcth0XWvxY8L3ordW34tfRe+LZ9rMPIp/dVD4ZA4vfhW7Hcdbi39dV58W//ah8MkcqTG+F78+\nx7v6flGOd/U5xr99KHySw67EK3P1vcW3K/FiFr/K7jhkNVD4ZA5r8W1wz0b0U+N84GqMHwvuxdx9\nin71UPgkh7f2dnyvm2R64fsxvrf2fnssuvu3D4VPcnjR27p6tvlAnxe+F7xm7NmsPSt+in61MHOP\nAJgvpunLa9mimqlMPSDv5hcJXzP3uEvO7UCLT+aKaXqLb/P2fYquracH5C2+z9HXyrkxi89U3dVS\nKnwReUZEviIifysib4jIL82u74vIl0Tk70XkT0Vkb/ndJYvGi95H8+3Y3rr6sdLZwFWpLbX4mqMf\nc/XtZhm0+KulisWfAPiVEML3A/hPAH5BRP4jgE8B+LMQwn8A8BUAv7q8bpJl4EWvx1hgTy1+VVc/\nZvG1qGYswEeLv1pKhR9CeCeE8PrsfAjgTQDPAHgBwMuzp70M4KeW1UmyeFKiTwX3rMVfxBjf1tfj\nlN7quVZwT0S+B8BzAP4SwFMhhPvA5c1BRJ5ceO/I0vF59kVRfR3jx+byY8K3Fl+tfdEYn8JfHZWF\nLyLbAP4IwC+HEIYi4v9L/K/dEXytfNt8hp6tsKuWX28Mdlssa+XV0tv6+Wr1/fjelugiq6OS8EWk\njUvR/24I4ZXZ5fsi8lQI4b6IPA3gO8vqJFkcfgGOb7EAnl2HH9sOS4U7mUwygXuh+1JbzN67XapO\n5/02gL8LIXzWXHsVwMdn5x8D8Ir/JVJPUll5XvTa7NRdbANMH8RLid5bexvRp8VfLaUWX0SeB/Az\nAN4Qka/h0qX/NQCfBvAHIvKzAN4G8OFldpQshlRKrp7bufqUxbdpubYg5nQ6LRX9aDTKldqya/LJ\n6igVfgjhLwC0Ej/+scV2h6wCv+rOCttuoOHFr8LXLavtnL026857917P1UuwZbjp6q8Wpuw2DGvx\nYwtwvKvv6+xZi+/n7O0Yv8jqq3tv3Xy6+quFwm8gdrmtjdjbKbsii29r5vsVeDEr76/Fhgi0+KuF\nwm8YavGtq59KyY1Z/Ha7jclkAiCfpWen7VKC18f6u7EjWQ0UfgPxNfPt/ni6SYbdHadoKi+Wj2/F\n7ve61/gAuV0o/IZhXfx+v5/tiaeCv3fvXrYb7mAwyKy/it6P663oufru7kDhNwx189Xa6754Ozs7\n2N3dnRN+v9/PLL7d+trn4mtKroqfq+/qDdfjNwxNy7UWf3t7OxN9TPgxi2+Fr2N3Fb4+tptn0OLX\nC1r8hmEtvhX+3t4e9vf3c8Lf3NzMxvgx4ftFOLbR4tcbCr9hpIRvLb7uhBuz+DYv3y+7tRbfL7ul\n8OsFhd8wUq6+tfh+V9zUGD+27NYK3+fjk/pA4TcMjerb4J5a/P39fezt7c0l8qRc/ZTFt64+LX49\nofAbhl2g4zP2dC4/tirPrsjzGXuxOXxfWouirxcUfkOIFdX0JbdSzZPaLCNVaIMWv35Q+GtOSvDX\nFbvio/rXET6pDxT+GuPr6el5ytr730ndBFJTejHhW2tPi18fKPw1JSZkPZZZei94ex6z+Hac76fy\naPHrCTP31pgyK17V+ntirr5fkmt3xWVNvfpB4a8hRTXzq47vU0cVcCqJh1V07wZ09deUoqCev1b0\nnBhq8WNz+Zqjb5N3aPHrB4W/psQsua21V9bsvvd+jK9Ha/1ti1XVoejrBYW/Znih+2br69myW7FK\nO/4mYF/flu/ym2/Ym8HFxUXpFCFZPRT+GmKtuz+mBJ8qrKkCjw0LfLVebSr41O+T24fCX0NiFtlv\ne10k/pTLH3t9X6230+lkgT+Kvr5Q+GuGijJlkVXc1t33G2dYS23Fq6+fsvjqLUwmk+zof5/UAwp/\nDbGitNbY1tC3QvfX7I0jFeSzHoV/H03cSXkM5Pah8NeQVPAtZuVjN4Gy/H19HBO9jw9Q/PWEwl8z\nitxwb91TR30d+5qx109F9s/Pz+c22KTo6wWFv4ZY4fudcP1OOTEr7effbbPpuDYl126H5efxmbxT\nPyj8NUNdcOvaa5GNfr8/V0uv2+3OFdrwe9tZYR8dHeHw8BBHR0cYDoc4Pj7Omq2rH0vZpfjrA4W/\nZqi1V7fbVtfRzTNU+HaXnI2Ny2Ubaq1VtHrU86Ojo6wNh8Os+bJbfmku
RV8vShfpiMgzIvIVEflb\nEXlDRH5xdv1FEfmWiPz1rH1o+d0lZcSEr6K322P50tm2pp6uvLO19IbDYWbt1eJ7q5+y+KzAUz+q\nWPwJgF8JIbwuItsA/q+IfHn2s8+EED6zvO6RxyG2N55a+yKLb4XvN8vQVib609PTOU+B6/HrR6nw\nQwjvAHhndj4UkTcBvH/2Y4Zqa4YP7BUJ34/xgfzW1yp8deNPTk5yFl/dfC9+jQn4wB8tfn241np8\nEfkeAM8B+D+zS58QkddF5LdEZG/BfSOPgXf1/cYZukNOkcXXMb519Y+Pj+cCe9bq2zF+bNNMRvbr\nRWXhz9z8PwLwyyGEIYDfBPC9IYTncOkR0OWvAVb46urrmD7l6qfG+NbiD4fDzNpXierH6uqT+lAp\nqi8ibVyK/ndDCK8AQAjhXfOUzwH448V3jzwONtsutnoutvJO8dV1vNW3Ft7vk6cBPQutfD2pavF/\nG8DfhRA+qxdE5Gnz858G8PVFdow8PjZpxo63bfMJN1agsdLZfous2G64XuQUfX0ptfgi8jyAnwHw\nhoh8DUAA8GsAPioizwGYAngLwM8vsZ/kmvhiGF78PujmK+qktsGO7ZYTc+Up+npTJar/FwBakR/9\nyeK7Q24qubh7AAAEQUlEQVSKL4flRW/r4NlyWfb3UzXz1eqXbYNN0dcfVtldQ1Li93Prsc0uUhV0\nreh91N66+hT93YDCX0Ou4+bHxvkpi++3wbYWn2m5dwsKf82IVb9NBfeKLL4vna0RfG6DvR5Q+GuI\nD9J5q19k8b2r76f0UlthU/h3C67OW0Ni+9f7AJ2txmPn9s/Pz6MpuXbe3s7Z28w8cneg8NeM2CIb\nm4vvn3N2dpYl5xweHmIymeC9997L2oMHD3BwcIDhcJgUPdfb3z0o/DXDp9xqhh4wv/WVegCajru9\nvY2Liws8fPgw11T4qQ0xKfq7B4W/Ztgx+ng8ztJx9Zq6/zYHX/P3B4MBLi4ucjn56vYfHx9nQb1Y\nBiCn8u4WFP6aYd14u8NtLEp/fHycLdvVNp1O5xbe6LkK3wYMbaCQ3B0o/DXDuvr6WK1/p9PJ3Htf\nXVfPQwi5wht2F9zRaDQ3rqerfzeh8NcMu55ehRnbJdduq6Xn7XYbIYTcGD5WUde69n4qkNwNZNn/\nLBHht2HFpDbEKGp2ea4vk+1LZgNIHkm9CCFEq2TR4q8htL6kDGbuEdJAKHxCGgiFT0gDofAJaSBL\nj+oTQuoHLT4hDYTCJ6SBrEz4IvIhEfmGiHxTRD65qvetioi8JSL/T0S+JiJ/VYP+vCQi90Xkb8y1\nfRH5koj8vYj86W3uXpToX202Uo1s9vpLs+u1+AxvezPalYzxRWQDwDcB/CiAfwHwGoCPhBC+sfQ3\nr4iI/H8APxRCeHjbfQEAEfnPAIYAfieE8AOza58G8K8hhN+Y3Tz3QwifqlH/XgRwVIeNVGf7Pjxt\nN3sF8AKA/44afIYF/ftvWMFnuCqL/wEA/xBCeDuEcA7g93H5R9YJQY2GPiGErwLwN6EXALw8O38Z\nwE+ttFOGRP+AmmykGkJ4J4Tw+ux8COBNAM+gJp9hon8r24x2VV/09wP4J/P4W7j6I+tCAPBlEXlN\nRH7utjuT4MkQwn0g28X4yVvuT4zabaRqNnv9SwBP1e0zvI3NaGtj4WrA8yGEHwTwEwB+YebK1p26\nzcXWbiPVyGav/jO71c/wtjajXZXw/xnAd5vHz8yu1YYQwrdnx3cBfBGXw5O6cV9EngKyMeJ3brk/\nOUII74aroNHnAPzwbfYnttkravQZpjajXcVnuCrhvwbg+0TkWRHpAvgIgFdX9N6liMhgdueFiGwB\n+HHUYxNQQX689yqAj8/OPwbgFf8LKybXvxpupDq32Svq9Rne2ma0K8vcm01LfBaXN5uXQgi/vpI3\nroCI/HtcWvmAy6XKv3fb/RORLwD4IID3AbgP4EUA/wvAHwL4LgBvA/hwCOFRjfr3I7gcq2Ybqep4\n+hb69zyA/w3gDVz+X3Wz178C8Ae45c+woH8fxQo+Q6bsEtJAGNwjpIFQ+IQ0EAqfkAZC4RPSQCh8\nQhoIhU9IA6HwCWkgFD4hDeTfAKZ6ORNIn8/jAAAAAElFTkSuQmCC\n", 162 | "text/plain": [ 163 | "" 164 | ] 165 | }, 166 | "metadata": {}, 167 | "output_type": "display_data" 168 | } 169 | ], 170 | "source": [ 171 | "# %% we can visualize any one of the images by reshaping it to a 28x28 image\n", 172 | "plt.imshow(np.reshape(mnist.train.images[100, :], (28, 28)), cmap='gray')" 173 | ] 174 | }, 175 | { 176 | "cell_type": "code", 177 | "execution_count": 15, 178 | "metadata": { 179 | "collapsed": true 180 | }, 181 | "outputs": [], 182 | "source": [ 183 | "# %% We can create a container for an input image using tensorflow's graph:\n", 184 | "# We allow the first dimension to be None, since this will eventually\n", 185 | "# represent our mini-batches, or how many images we feed into a network\n", 186 | "# at a time during training/validation/testing.\n", 187 | "# The second dimension is the number of features that the image has.\n", 188 | "n_input = 784\n", 189 | "n_output = 10\n", 190 | "net_input = tf.placeholder(tf.float32, [None, n_input])" 191 | ] 192 | }, 193 | { 194 | "cell_type": "code", 195 | "execution_count": 16, 196 | "metadata": { 197 | "collapsed": true 198 | }, 199 | "outputs": [], 200 | "source": [ 201 | "# %% We can write a simple regression (y = W*x + b) as:\n", 202 | "W = tf.Variable(tf.zeros([n_input, n_output]))\n", 203 | "b = tf.Variable(tf.zeros([n_output]))\n", 204 | "net_output = 
204 |     "net_output = tf.nn.softmax(tf.matmul(net_input, W) + b)"
205 |    ]
206 |   },
207 |   {
208 |    "cell_type": "code",
209 |    "execution_count": 17,
210 |    "metadata": {
211 |     "collapsed": true
212 |    },
213 |    "outputs": [],
214 |    "source": [
215 |     "# %% We'll create a placeholder for the true output of the network\n",
216 |     "y_true = tf.placeholder(tf.float32, [None, 10])"
217 |    ]
218 |   },
219 |   {
220 |    "cell_type": "code",
221 |    "execution_count": 18,
222 |    "metadata": {
223 |     "collapsed": true
224 |    },
225 |    "outputs": [],
226 |    "source": [
227 |     "# %% And then write our loss function:\n",
228 |     "cross_entropy = -tf.reduce_sum(y_true * tf.log(net_output))"
229 |    ]
230 |   },
231 |   {
232 |    "cell_type": "code",
233 |    "execution_count": 19,
234 |    "metadata": {
235 |     "collapsed": true
236 |    },
237 |    "outputs": [],
238 |    "source": [
239 |     "# %% This compares the predicted label (the argmax of the network's output)\n",
240 |     "# with the true label (the argmax of the one-hot vector)\n",
241 |     "correct_prediction = tf.equal(\n",
242 |     "    tf.argmax(net_output, 1), tf.argmax(y_true, 1))"
243 |    ]
244 |   },
245 |   {
246 |    "cell_type": "code",
247 |    "execution_count": 20,
248 |    "metadata": {
249 |     "collapsed": true
250 |    },
251 |    "outputs": [],
252 |    "source": [
253 |     "# %% And now we can look at the mean of our network's correct guesses\n",
254 |     "accuracy = tf.reduce_mean(tf.cast(correct_prediction, \"float\"))"
255 |    ]
256 |   },
257 |   {
258 |    "cell_type": "code",
259 |    "execution_count": 21,
260 |    "metadata": {
261 |     "collapsed": true
262 |    },
263 |    "outputs": [],
264 |    "source": [
265 |     "# %% We can tell the tensorflow graph to train w/ gradient descent using\n",
266 |     "# our loss function and an input learning rate\n",
267 |     "optimizer = tf.train.GradientDescentOptimizer(\n",
268 |     "    0.01).minimize(cross_entropy)"
269 |    ]
270 |   },
271 |   {
272 |    "cell_type": "code",
273 |    "execution_count": 22,
274 |    "metadata": {
275 |     "collapsed": true
276 |    },
277 |    "outputs": [],
278 |    "source": [
279 |     "# %% We now create a new session to actually perform the initialization of\n",
280 |     "# the variables:\n",
281 |     "sess = tf.Session()\n",
282 |     "sess.run(tf.initialize_all_variables())"
283 |    ]
284 |   },
285 |   {
286 |    "cell_type": "code",
287 |    "execution_count": 23,
288 |    "metadata": {
289 |     "collapsed": false
290 |    },
291 |    "outputs": [
292 |     {
293 |      "name": "stdout",
294 |      "output_type": "stream",
295 |      "text": [
296 |       "0.8994\n",
297 |       "0.9236\n",
298 |       "0.9242\n",
299 |       "0.9152\n",
300 |       "0.9216\n",
301 |       "0.915\n",
302 |       "0.918\n",
303 |       "0.9256\n",
304 |       "0.926\n",
305 |       "0.9228\n"
306 |      ]
307 |     }
308 |    ],
309 |    "source": [
310 |     "# %% Now actually do some training:\n",
311 |     "batch_size = 100\n",
312 |     "n_epochs = 10\n",
313 |     "for epoch_i in range(n_epochs):\n",
314 |     "    for batch_i in range(mnist.train.num_examples // batch_size):\n",
315 |     "        batch_xs, batch_ys = mnist.train.next_batch(batch_size)\n",
316 |     "        sess.run(optimizer, feed_dict={\n",
317 |     "            net_input: batch_xs,\n",
318 |     "            y_true: batch_ys\n",
319 |     "        })\n",
320 |     "    print(sess.run(accuracy,\n",
321 |     "                   feed_dict={\n",
322 |     "                       net_input: mnist.validation.images,\n",
323 |     "                       y_true: mnist.validation.labels\n",
324 |     "                   }))"
325 |    ]
326 |   },
327 |   {
328 |    "cell_type": "code",
329 |    "execution_count": 24,
330 |    "metadata": {
331 |     "collapsed": false
332 |    },
333 |    "outputs": [
334 |     {
335 |      "name": "stdout",
336 |      "output_type": "stream",
337 |      "text": [
338 |       "0.923\n"
339 |      ]
340 |     }
341 |    ],
342 |    "source": [
343 |     "# %% Print final test accuracy:\n",
344 | 
"print(sess.run(accuracy,\n", 345 | " feed_dict={\n", 346 | " net_input: mnist.test.images,\n", 347 | " y_true: mnist.test.labels\n", 348 | " }))" 349 | ] 350 | }, 351 | { 352 | "cell_type": "code", 353 | "execution_count": 25, 354 | "metadata": { 355 | "collapsed": false 356 | }, 357 | "outputs": [ 358 | { 359 | "data": { 360 | "text/plain": [ 361 | "'\\n# We could do the same thing w/ Keras like so:\\nfrom keras.models import Sequential\\nmodel = Sequential()\\n\\nfrom keras.layers.core import Dense, Activation\\nmodel.add(Dense(output_dim=10, input_dim=784, init=\\'zero\\'))\\nmodel.add(Activation(\"softmax\"))\\n\\nfrom keras.optimizers import SGD\\nmodel.compile(loss=\\'categorical_crossentropy\\', \\n optimizer=SGD(lr=learning_rate))\\n\\nmodel.fit(mnist.train.images, mnist.train.labels, nb_epoch=n_epochs,\\n batch_size=batch_size, show_accuracy=True)\\n\\nobjective_score = model.evaluate(mnist.test.images, mnist.test.labels,\\n batch_size=100, show_accuracy=True)\\n'" 362 | ] 363 | }, 364 | "execution_count": 25, 365 | "metadata": {}, 366 | "output_type": "execute_result" 367 | } 368 | ], 369 | "source": [ 370 | "# %%\n", 371 | "\"\"\"\n", 372 | "# We could do the same thing w/ Keras like so:\n", 373 | "from keras.models import Sequential\n", 374 | "model = Sequential()\n", 375 | "\n", 376 | "from keras.layers.core import Dense, Activation\n", 377 | "model.add(Dense(output_dim=10, input_dim=784, init='zero'))\n", 378 | "model.add(Activation(\"softmax\"))\n", 379 | "\n", 380 | "from keras.optimizers import SGD\n", 381 | "model.compile(loss='categorical_crossentropy', \n", 382 | " optimizer=SGD(lr=learning_rate))\n", 383 | "\n", 384 | "model.fit(mnist.train.images, mnist.train.labels, nb_epoch=n_epochs,\n", 385 | " batch_size=batch_size, show_accuracy=True)\n", 386 | "\n", 387 | "objective_score = model.evaluate(mnist.test.images, mnist.test.labels,\n", 388 | " batch_size=100, show_accuracy=True)\n", 389 | "\"\"\"" 390 | ] 391 | } 392 | ], 393 | "metadata": { 394 | "kernelspec": { 395 | "display_name": "Python [Root]", 396 | "language": "python", 397 | "name": "Python [Root]" 398 | }, 399 | "language_info": { 400 | "codemirror_mode": { 401 | "name": "ipython", 402 | "version": 2 403 | }, 404 | "file_extension": ".py", 405 | "mimetype": "text/x-python", 406 | "name": "python", 407 | "nbconvert_exporter": "python", 408 | "pygments_lexer": "ipython2", 409 | "version": "2.7.11" 410 | } 411 | }, 412 | "nbformat": 4, 413 | "nbformat_minor": 0 414 | } 415 | -------------------------------------------------------------------------------- /notebooks/06_modern_convnet.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "\"\"\"Tutorial on how to build a convnet w/ modern changes, e.g.\n", 10 | "Batch Normalization, Leaky rectifiers, and strided convolution.\n", 11 | "\n", 12 | "Parag K. 
Mital, Jan 2016.\n",
13 |     "\"\"\""
14 |    ]
15 |   },
16 |   {
17 |    "cell_type": "code",
18 |    "execution_count": null,
19 |    "metadata": {},
20 |    "outputs": [],
21 |    "source": [
22 |     "# %%\n",
23 |     "import tensorflow as tf\n",
24 |     "from libs.batch_norm import batch_norm\n",
25 |     "from libs.activations import lrelu\n",
26 |     "from libs.connections import conv2d, linear\n",
27 |     "from libs.datasets import MNIST\n"
28 |    ]
29 |   },
30 |   {
31 |    "cell_type": "code",
32 |    "execution_count": null,
33 |    "metadata": {},
34 |    "outputs": [],
35 |    "source": [
36 |     "# %% Setup input to the network and true output label. These are\n",
37 |     "# simply placeholders which we'll fill in later.\n",
38 |     "mnist = MNIST()\n",
39 |     "x = tf.placeholder(tf.float32, [None, 784])\n",
40 |     "y = tf.placeholder(tf.float32, [None, 10])"
41 |    ]
42 |   },
43 |   {
44 |    "cell_type": "code",
45 |    "execution_count": null,
46 |    "metadata": {},
47 |    "outputs": [],
48 |    "source": [
49 |     "# %% We add a new type of placeholder to denote when we are training.\n",
50 |     "# This will be used to change the way we compute the network during\n",
51 |     "# training/testing.\n",
52 |     "is_training = tf.placeholder(tf.bool, name='is_training')"
53 |    ]
54 |   },
55 |   {
56 |    "cell_type": "code",
57 |    "execution_count": null,
58 |    "metadata": {},
59 |    "outputs": [],
60 |    "source": [
61 |     "# %% We'll convert our MNIST vector data to a 4-D tensor:\n",
62 |     "# N x W x H x C\n",
63 |     "x_tensor = tf.reshape(x, [-1, 28, 28, 1])"
64 |    ]
65 |   },
66 |   {
67 |    "cell_type": "code",
68 |    "execution_count": null,
69 |    "metadata": {},
70 |    "outputs": [],
71 |    "source": [
72 |     "# %% We'll use a new method called batch normalization.\n",
73 |     "# This process attempts to \"reduce internal covariate shift\",\n",
74 |     "# which is a fancy way of saying that it will normalize updates for each\n",
75 |     "# batch using a smoothed version of the batch mean and variance.\n",
76 |     "# The original paper proposes using this before any nonlinearities.\n",
77 |     "h_1 = lrelu(batch_norm(conv2d(x_tensor, 32, name='conv1'),\n",
78 |     "                       is_training, scope='bn1'), name='lrelu1')\n",
79 |     "h_2 = lrelu(batch_norm(conv2d(h_1, 64, name='conv2'),\n",
80 |     "                       is_training, scope='bn2'), name='lrelu2')\n",
81 |     "h_3 = lrelu(batch_norm(conv2d(h_2, 64, name='conv3'),\n",
82 |     "                       is_training, scope='bn3'), name='lrelu3')\n",
83 |     "h_3_flat = tf.reshape(h_3, [-1, 64 * 4 * 4])\n",
84 |     "h_4 = linear(h_3_flat, 10)\n",
85 |     "y_pred = tf.nn.softmax(h_4)"
86 |    ]
87 |   },
88 |   {
89 |    "cell_type": "code",
90 |    "execution_count": null,
91 |    "metadata": {},
92 |    "outputs": [],
93 |    "source": [
94 |     "# %% Define loss/eval/training functions\n",
95 |     "cross_entropy = -tf.reduce_sum(y * tf.log(y_pred))\n",
96 |     "train_step = tf.train.AdamOptimizer().minimize(cross_entropy)\n",
97 |     "\n",
98 |     "correct_prediction = tf.equal(tf.argmax(y_pred, 1), tf.argmax(y, 1))\n",
99 |     "accuracy = tf.reduce_mean(tf.cast(correct_prediction, 'float'))"
100 |    ]
101 |   },
102 |   {
103 |    "cell_type": "code",
104 |    "execution_count": null,
105 |    "metadata": {},
106 |    "outputs": [],
107 |    "source": [
108 |     "# %% We now create a new session to actually perform the initialization of\n",
109 |     "# the variables:\n",
110 |     "sess = tf.Session()\n",
111 |     "sess.run(tf.initialize_all_variables())"
112 |    ]
113 |   },
114 |   {
115 |    "cell_type": "code",
116 |    "execution_count": null,
117 |    "metadata": {},
118 |    "outputs": [],
119 |    "source": [
120 |     "# %% We'll train in minibatches and report accuracy:\n",
121 |     "n_epochs = 10\n",
122 |     "batch_size = 100\n",
123 | 
"for epoch_i in range(n_epochs):\n", 124 | " for batch_i in range(mnist.train.num_examples // batch_size):\n", 125 | " batch_xs, batch_ys = mnist.train.next_batch(batch_size)\n", 126 | " sess.run(train_step, feed_dict={\n", 127 | " x: batch_xs, y: batch_ys, is_training: True})\n", 128 | " print(sess.run(accuracy,\n", 129 | " feed_dict={\n", 130 | " x: mnist.validation.images,\n", 131 | " y: mnist.validation.labels,\n", 132 | " is_training: False\n", 133 | " }))" 134 | ] 135 | } 136 | ], 137 | "metadata": { 138 | "language": "python" 139 | }, 140 | "nbformat": 4, 141 | "nbformat_minor": 0 142 | } 143 | -------------------------------------------------------------------------------- /notebooks/07_autoencoder.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": { 7 | "collapsed": false 8 | }, 9 | "outputs": [ 10 | { 11 | "data": { 12 | "text/plain": [ 13 | "'Tutorial on how to create an autoencoder w/ Tensorflow.\\n\\nParag K. Mital, Jan 2016\\n'" 14 | ] 15 | }, 16 | "execution_count": 1, 17 | "metadata": {}, 18 | "output_type": "execute_result" 19 | } 20 | ], 21 | "source": [ 22 | "\"\"\"Tutorial on how to create an autoencoder w/ Tensorflow.\n", 23 | "\n", 24 | "Parag K. Mital, Jan 2016\n", 25 | "\"\"\"" 26 | ] 27 | }, 28 | { 29 | "cell_type": "code", 30 | "execution_count": 2, 31 | "metadata": { 32 | "collapsed": true 33 | }, 34 | "outputs": [], 35 | "source": [ 36 | "# %% Imports\n", 37 | "%matplotlib inline\n", 38 | "import tensorflow as tf\n", 39 | "import numpy as np\n", 40 | "import math\n" 41 | ] 42 | }, 43 | { 44 | "cell_type": "code", 45 | "execution_count": 3, 46 | "metadata": { 47 | "collapsed": true 48 | }, 49 | "outputs": [], 50 | "source": [ 51 | "# %% Autoencoder definition\n", 52 | "def autoencoder(dimensions=[784, 512, 256, 64]):\n", 53 | " \"\"\"Build a deep autoencoder w/ tied weights.\n", 54 | "\n", 55 | " Parameters\n", 56 | " ----------\n", 57 | " dimensions : list, optional\n", 58 | " The number of neurons for each layer of the autoencoder.\n", 59 | "\n", 60 | " Returns\n", 61 | " -------\n", 62 | " x : Tensor\n", 63 | " Input placeholder to the network\n", 64 | " z : Tensor\n", 65 | " Inner-most latent representation\n", 66 | " y : Tensor\n", 67 | " Output reconstruction of the input\n", 68 | " cost : Tensor\n", 69 | " Overall cost to use for training\n", 70 | " \"\"\"\n", 71 | " # %% input to the network\n", 72 | " x = tf.placeholder(tf.float32, [None, dimensions[0]], name='x')\n", 73 | " current_input = x\n", 74 | "\n", 75 | " # %% Build the encoder\n", 76 | " encoder = []\n", 77 | " for layer_i, n_output in enumerate(dimensions[1:]):\n", 78 | " n_input = int(current_input.get_shape()[1])\n", 79 | " W = tf.Variable(\n", 80 | " tf.random_uniform([n_input, n_output],\n", 81 | " -1.0 / math.sqrt(n_input),\n", 82 | " 1.0 / math.sqrt(n_input)))\n", 83 | " b = tf.Variable(tf.zeros([n_output]))\n", 84 | " encoder.append(W)\n", 85 | " output = tf.nn.tanh(tf.matmul(current_input, W) + b)\n", 86 | " current_input = output\n", 87 | "\n", 88 | " # %% latent representation\n", 89 | " z = current_input\n", 90 | " encoder.reverse()\n", 91 | "\n", 92 | " # %% Build the decoder using the same weights\n", 93 | " for layer_i, n_output in enumerate(dimensions[:-1][::-1]):\n", 94 | " W = tf.transpose(encoder[layer_i])\n", 95 | " b = tf.Variable(tf.zeros([n_output]))\n", 96 | " output = tf.nn.tanh(tf.matmul(current_input, W) + b)\n", 97 | " current_input = 
output\n", 98 | "\n", 99 | " # %% now have the reconstruction through the network\n", 100 | " y = current_input\n", 101 | "\n", 102 | " # %% cost function measures pixel-wise difference\n", 103 | " cost = tf.reduce_sum(tf.square(y - x))\n", 104 | " return {'x': x, 'z': z, 'y': y, 'cost': cost}\n" 105 | ] 106 | }, 107 | { 108 | "cell_type": "code", 109 | "execution_count": 4, 110 | "metadata": { 111 | "collapsed": false 112 | }, 113 | "outputs": [], 114 | "source": [ 115 | "# %% Basic test\n", 116 | "def test_mnist():\n", 117 | " \"\"\"Test the autoencoder using MNIST.\"\"\"\n", 118 | " import tensorflow as tf\n", 119 | " import tensorflow.examples.tutorials.mnist.input_data as input_data\n", 120 | " # from matplotlib import use\n", 121 | " # use('Agg')\n", 122 | " import matplotlib.pyplot as plt\n", 123 | "\n", 124 | " # %%\n", 125 | " # load MNIST as before\n", 126 | " mnist = input_data.read_data_sets('MNIST_data', one_hot=True)\n", 127 | " mean_img = np.mean(mnist.train.images, axis=0)\n", 128 | " ae = autoencoder(dimensions=[784, 256, 64])\n", 129 | "\n", 130 | " # %%\n", 131 | " learning_rate = 0.001\n", 132 | " optimizer = tf.train.AdamOptimizer(learning_rate).minimize(ae['cost'])\n", 133 | "\n", 134 | " # %%\n", 135 | " # We create a session to use the graph\n", 136 | " sess = tf.Session()\n", 137 | " sess.run(tf.initialize_all_variables())\n", 138 | "\n", 139 | " # %%\n", 140 | " # Fit all training data\n", 141 | " batch_size = 50\n", 142 | " n_epochs = 10\n", 143 | " for epoch_i in range(n_epochs):\n", 144 | " for batch_i in range(mnist.train.num_examples // batch_size):\n", 145 | " batch_xs, _ = mnist.train.next_batch(batch_size)\n", 146 | " train = np.array([img - mean_img for img in batch_xs])\n", 147 | " sess.run(optimizer, feed_dict={ae['x']: train})\n", 148 | " print(epoch_i, sess.run(ae['cost'], feed_dict={ae['x']: train}))\n", 149 | "\n", 150 | " # %%\n", 151 | " # Plot example reconstructions\n", 152 | " n_examples = 15\n", 153 | " test_xs, _ = mnist.test.next_batch(n_examples)\n", 154 | " test_xs_norm = np.array([img - mean_img for img in test_xs])\n", 155 | " recon = sess.run(ae['y'], feed_dict={ae['x']: test_xs_norm})\n", 156 | " fig, axs = plt.subplots(2, n_examples, figsize=(10, 2))\n", 157 | " for example_i in range(n_examples):\n", 158 | " axs[0][example_i].imshow(\n", 159 | " np.reshape(test_xs[example_i, :], (28, 28)))\n", 160 | " axs[1][example_i].imshow(\n", 161 | " np.reshape([recon[example_i, :] + mean_img], (28, 28)))\n", 162 | " fig.show()\n", 163 | " plt.draw()\n" 164 | ] 165 | }, 166 | { 167 | "cell_type": "code", 168 | "execution_count": 5, 169 | "metadata": { 170 | "collapsed": false 171 | }, 172 | "outputs": [ 173 | { 174 | "name": "stdout", 175 | "output_type": "stream", 176 | "text": [ 177 | "Extracting MNIST_data/train-images-idx3-ubyte.gz\n", 178 | "Extracting MNIST_data/train-labels-idx1-ubyte.gz\n", 179 | "Extracting MNIST_data/t10k-images-idx3-ubyte.gz\n", 180 | "Extracting MNIST_data/t10k-labels-idx1-ubyte.gz\n", 181 | "(0, 554.17175)\n", 182 | "(1, 451.15411)\n", 183 | "(2, 420.77158)\n", 184 | "(3, 437.03281)\n", 185 | "(4, 409.98288)\n", 186 | "(5, 363.77893)\n", 187 | "(6, 420.08453)\n", 188 | "(7, 396.18784)\n", 189 | "(8, 365.31839)\n", 190 | "(9, 407.51379)\n" 191 | ] 192 | }, 193 | { 194 | "name": "stderr", 195 | "output_type": "stream", 196 | "text": [ 197 | "/home/heythisischo/anaconda2/lib/python2.7/site-packages/matplotlib/figure.py:397: UserWarning: matplotlib is currently using a non-GUI backend, so cannot show the figure\n", 198 
| " \"matplotlib is currently using a non-GUI backend, \"\n" 199 | ] 200 | }, 201 | { 202 | "data": { 203 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAlAAAAB9CAYAAABtT12EAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzsvXd4VNX2//860zKZTHolCZAAIaH33kVAEQsoeFXUK7ar\nH8u1d/Fa7rVcrFdFrKAUERFEuvROSCAECOm99zZ95vz+OImEMGlkBuP3N+/nmQeyZ5+937P2Ouus\ns/faawuiKOKCCy644IILLrjgQvsh+7MJuOCCCy644IILLvzV4HKgXHDBBRdccMEFFzoIlwPlggsu\nuOCCCy640EG4HCgXXHDBBRdccMGFDsLlQLngggsuuOCCCy50EC4HygUXXHDBBRdccKGD6JQDJQjC\nNYIgnBcEIUUQhOccRaqz6Iq8uiIncPHqCLoiJ3Dx6gi6Iifomry6Iidw8eoIuiIn6Lq8OgxRFC/r\ng+R8pQE9ASVwCoi53PYc9emKvLoiJxevvz4nF6+/PqeuyqsrcnLx+utz6sq8LufTmRmo0UCqKIrZ\noiiagTXAjZ1oz1Hoiry6Iidw8fqrcwIXr786J+iavLoiJ3Dx+qtzgq7Lq8PojAMVBuQ2+TuvoezP\nRlfk1RU5gYtXR9AVOYGLV0fQFTlB1+TVFTmBi1dH0BU5Qdfl1WG4gshdcMEFF1xwwQUXOghFJ67N\nB3o0+Tu8oewiCILwpxy2JwjC/Q3/fYEuwqsJJ4Dn7Xz/Z8sKuiavS8awC3CCv4isGr7vcry6ACdw\njWGrcMmqY3Dpe/vRVcewEaIoCm3VERqCujoMQRDkgAFIBUxAFDBSFMWkZvVEmAJMbdbCXjtlLZV3\npO4e4CDgB5QB6FrmtdgJ/dsrtwHvIcXLuQMlAImiKA6+lFNnZdVSub0yG/AGEADIgWKA/p2TVUf6\nb6m87TFsWVaO6N9eWUdkdSX1vSmvrqLv0LkxdET/9sr+uvrecV6d5XolbVZH6jpD3x3By2Wz/vo2\nq2nZv9rlQF32DJQoilZBEMqQ7jAl8EZzAfw5EAANYG0s6DQvz2gl/R/0INKcRUx6PIeXSeVl4yqo\nmxtO2kYrtkOFrbQgQ/IvzyIpMjQ3RH8OZEhDZ0VSavh/dQw7j64qq6a8gC4hK3CNYUfQVWXV1W0W\n4BBZaeihreeOsO1kVsLA+Qre/HIyBlNH2uiqY9gV9b2r2qyOo7MxUAZgrCiKUaIovu0IQo6BArgP\ngM7ymjMjmZjgPCbIjjFJdpRJwlEmIn16CPnc7XuMoRo10o7M1uAHeAH3t1HPEVAicx/GCOEMveYp\n8Bjo3kpdOZKsHnUwBxU+gR7MezCdiarjDPfPo5dXIBDSzusdN4aOg7Nk1Vk08upKsgLXGAL4gccA\nCB0jfTxCQYhk0O0yfO+JgtFjQOVN15SVE2yWRgu9okHujSx4JFEPeTG0dwmT7tYT/ZgHQTf3hF5R\nbTTiSH13Z8jkWvr2LGEKR5kYeJphDwnI3S6nra44hv//s1maQJFRDxoZrTrBsNA8egVUOrL5i9BZ\nB0oEdgqCENtsnbUZItpZ5si63wOXrP+2AxfanDo+i3unHuC6+vME/fNnbC/FU7Jdgd/1gfSbKhAa\nX8+EV75iij4NvwEBDuLV2rXtLNcOQjFrGjPdDhNxdxCqkUGttKFo4PRlC9+3hlZ4Cb74+fbkrkkH\nmavYzgNh6Qya6oVbr6B2tNFYdjmyslOu1KD0Gs117ml4C4bW64YFg/8k7G8Kaa+s7PFqJ9fGsr5h\nBIWY6N6rBu0oLySD01Ibjbw6p+8SAonw68aUiQX0GOXGxS8GLfUfAfjDoD7g69uM6xXQ9w614SR9\nv6RoLANmGrlmQAmTfWro522in7eJCL8Y5viXccu4FMaMr6RfpAlfj0aHwVFj2EGubdZ1xBh6MrBX\nDcP7+TMuoJqpshzGexmYOjmbRcPyuWFcItOuymLC8FLGBVQxWlsIAyeA3N4jynH67jday7RZaUwd\nUkWmKoTjXjfw6amrMVqVLfwue7+taZmDbJbD6jrLZrmBOhT5wD4MmqOnu0ZJP84zOeY802bnETlK\nBoS2g5fj9T0w0MAjD53jOrft3BeVTkS0ssEutb+N9qIzQeQAE0RRLBQEIRDJkUoSRfFg+4jZK3NU\n3UWAJ/AvgP9rmVdLbQrgGchr935Fwdf5BJyFenUAJk0QYaMFip7rS2RGPJbyAuLPm/F4Po7BNgV7\nn3QEr5aubYnrpVBG9iLs3/n4HLGSd9SfyhxrK2080MCpHvgvgiBM7Jis7CEaLw8veppEshdmoQCq\nT4Ps9Sx8dwdT9HFbbURw+bK6tFwR5Ef0jFms2jmeKcULOWXp1mLdgNk9CfCcimFdDlk5zeMa2yur\nln5T21wlDKTntEgmJu9FE1bC8RF9SYgVWmmjkdfl6nsjPAgjmGeHGxj92Al+MHbnw/mjgOxWuMqQ\nM4h+blqMc/qQv8UPXVUaiBE4Xt8jAQ3hUWV4WeopL9NQXKvtYBvO0PcIwA2lRkNoZB5CUh3mebdw\nz7Vfct22rRSuquF4wwKFSg7zB8GJZ8FbvxIDcICxbO60zeoI147UdcwYdovQ8uCcc4wp3U7+6hUE\ny8Ba9CEpT4FVDepTWxkugrIKfMvBFBHNs0+tJOfpOGwVJSBamrTmKH2HUS/Wo92dz2C9jcKHr2bF\nmudg4bpWfldrv9dxNstxdR1ts2SoA3rjp1Xjow/EcG037p//M7UpxxBSahk3HdzuiGBD0g2srRpJ\nVmpbvByv757lVUz5/lfyTFCyD+r+MRB3lRb9nkq79Vsvax1tOlCCIHwNzAGKG9fABUHwBX4EegqC\nkAUsAH5BSpBlRwh7m5GMAJkAGiVetlpEBYgIGC1qzCYlWGyAuQ1mG5Hi1z2AhxrKkoHtSCuL6saK\nHeMFoHSD6YvY8eQXBNflM7K/mn3We3ju3COw9gisPY/SayZHJr+FvCIPuaWUQFsZ4O8AXi1wai8E\nGd1UpTzj9Tp9vHRo3t0E1gQ7nPRIbwBVDZz6N7bQMVk1g0wtoFL0YHZECf8d/SQ/f3PhO7VoxE1t\nReahxGYELOvs8HK8rLpFFXLTK2uo/9mM1WK3yh8IOPsYJalZeLnJgZeQYgecI6sW4T6Rt1e9hMa4\nmwRAt76KBNTASZyi7wAo0boN4BXzvwk9eJiaKAUjB/5EkziFFvAbopCCVazhq2XuPGf7gaPyMrD8\nr528WuPUBHIBwV2Dpy6axV/v45riwyxdOpG3dk0Hmk/R27sHnTyGykjCho/k7R8eRDV4P3nvL0X/\nvsSkEeuA81Z47zQ84wF4QooZ9hhOAgl0fgxbggCCChTgpjagspgRrDZMChUG1KBbD5zHafehTMUT\n/
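Similarly, the lrelu imported from libs/activations.py in that notebook is a leaky rectifier: identity for positive inputs, a small slope for negative ones, so units never go completely "dead". A one-line NumPy equivalent (the leak value here is illustrative, not necessarily the repo's default):

import numpy as np

def lrelu(x, leak=0.2):
    # max(x, leak*x): identity for x > 0, small slope for x < 0
    return np.maximum(x, leak * x)

print(lrelu(np.array([-2.0, -0.5, 0.0, 3.0])))  # [-0.4 -0.1  0.   3. ]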
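One caveat on the loss in 06_modern_convnet.ipynb above (cross_entropy = -tf.reduce_sum(y * tf.log(y_pred)), computed after an explicit softmax): if any predicted probability underflows to zero, tf.log returns -inf and training produces NaNs. The standard fix is to fuse softmax and log with the log-sum-exp trick, sketched here in NumPy:

# NumPy sketch: why log(softmax(...)) is usually computed in one fused, stable step.
import numpy as np

np.random.seed(0)
logits = np.array([[1000.0, -1000.0, 0.0]])  # extreme but legal pre-softmax scores
labels = np.array([[0.0, 1.0, 0.0]])         # one-hot target

# Naive route (what -sum(y * log(softmax(h))) does literally):
p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # exp overflows
naive = -np.sum(labels * np.log(p))
print(naive)  # nan

# Stable route: log-softmax via the log-sum-exp trick.
shifted = logits - logits.max(axis=1, keepdims=True)
log_softmax = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
stable = -np.sum(labels * log_softmax)
print(stable)  # 2000.0 -- finite and correct

The same fusion is what tf.nn.softmax_cross_entropy_with_logits provides in TensorFlow, applied to the pre-softmax h_4 directly.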
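Finally, the decoder in 07_autoencoder.ipynb reuses the encoder's weight matrices transposed (W = tf.transpose(encoder[layer_i])), so encoder and decoder share parameters ("tied weights"). A NumPy sketch of the same forward pass, using the notebook's [784, 256, 64] shapes and its pixel-wise squared-error cost:

# NumPy sketch of the tied-weight forward pass from 07_autoencoder.ipynb:
# the decoder reuses each encoder weight matrix, transposed.
import numpy as np

np.random.seed(0)
dimensions = [784, 256, 64]             # as in autoencoder(dimensions=[784, 256, 64])
x = np.random.rand(50, dimensions[0])   # stand-in for a (mean-subtracted) MNIST batch

encoder = []
current = x
for n_in, n_out in zip(dimensions[:-1], dimensions[1:]):
    W = np.random.uniform(-1.0 / np.sqrt(n_in), 1.0 / np.sqrt(n_in), (n_in, n_out))
    b = np.zeros(n_out)
    encoder.append(W)
    current = np.tanh(current.dot(W) + b)

z = current                      # inner-most latent code, shape (50, 64)
encoder.reverse()

for W in encoder:                # decoder: same weights, transposed
    b = np.zeros(W.shape[0])     # decoder biases are separate, as in the notebook
    current = np.tanh(current.dot(W.T) + b)

y = current                      # reconstruction, back to shape (50, 784)
cost = np.sum(np.square(y - x))  # the notebook's pixel-wise cost
print(z.shape, y.shape, cost)

Tying the weights halves the parameter count and acts as a mild regularizer; only the biases are learned separately on the decoder side, which matches the notebook's graph.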
rHGySqfH6jwQq5XfzY0TvnaMStC73WPQPV9kK/p\n8GA3smImMFv7PEv7/BvgPnQFqNr2O+AXgC3h//8J3KGRVzQh2oD/Bu5FF/gK4BVgMXAL8Nvjywog\nW7N3kyYvl0fY+wAvJGnPGQEOeKFUe/4AQldhhA7TvEImdoT3UBDyzvaKzyZEOTx0F8R9kH6ttPdj\n87LOAatHk7VHbAEvuLx6+ZF58nYvJHvFfS0a3yjQ6AW8Qiw2bYtoXGPaOQD+RyBwHzg0ew92o8PR\n94t7K0CuB/I8sMULk7yQIq+HuP4WL0z0Qocmrw6EvZV6hZzMCJklaedO9uoqrF0GG++CqA8GXQvb\ne/BZA+7X5SJt/pBX2LYNcd0YUO0V5UpC1Xgc8B5ZDuPAQa/wYxH0slH3CBy4D/I0ez/UjawccyDJ\nI+4hfUOTV5Qr6cdleW/xCltBOxYAOrzCv5nQ/VUc6PSC2yv8hxnwL4OO56HrdUi9HVp7kNUETYcA\n+R4Y4IENXjjdK/Qi6x0nsMkL53nFd7+2LffCYK/gKe1QAXZ5YbxX+FcLQodr7oKID4ZcC5u74ZU2\nBzI94nOGR3xWgXIvDEmQAcAe7f5OhAwk5w3ez/uy3V4Y5RW2HtbOO/gIlN8HxZoO93bns+YAGi88\n2uYFk1foQpYFExDzit9k/SiPpXhF2TAhyoAfUZblNeIAy4DngdeB24EedFio+SwFSPMIf7/fK+IB\nWRcqCHsv8uoc4xq3/ZoMQcgmpv1erclQ2mfrMlFm0zQZVD9wbDEdhS8SQK0HzlIUZQdCZUOB0455\npssjCpM0DDVhH0B/KGkMNoRxxtDrjwyEUEKIgh4DWtAFEEU4k8750PQbaH9F3v0/jssr3yv2RwcA\nJoTBpgK5oByMkn5pHVm2ZpymAPGACV/MTctHbVjn19PZmUK42gnVQBuiwnYhjCgOpN0sAqiWJ8Cc\nIe90DaL2PBJmjwh0ZP0qIR1eDFBV/XfpTKU8o+jGIyu5sMYpnrCZAPPNwG0QfUU7GYA3jykr6fSk\nrEwJeykzaU2ysDmBbCBP45CBKCYtGk9ZAUqnHwUcHnDNh9bfQEcPOkz1iEKgJMgi0QnLoFJyc2n3\nlFsk4TczeiVj1q4hg2gFyL0Z9t0Gza+A0oOsmA38+7F/IqJt8mYmhOyt+nMQ0QTVpP0nA8hCF758\n0NnADHSH1I2sAPK8Yi91J51PBGEfkpIVvbypiOC2GT1gkBWadFRSdxb04KxjPjT8Btp60KHVIypP\n0O0YdJ8gK99IDMIR6GwFqxXMJvF7VIVoGOIBwAE2RfcJTu3ZpB4dN4P/Ngj1QofjvPr9pawkl7D2\nrE5tSwFyNBkeAho0mXVo5yVp52QD+xB+pUPjleuBAfNhx2+guhc+a6D3SDlJSJ0m+gRLAm8QZmZH\nmJK07yjQiB58yTKce7MIoBp74bOcHtFgSuQAR/prqQO0e0n76UJveCaj+30Z9MYS/pvkAeds2PsU\ndPZCVpO9RwbgRwfkNiAFHMO7sLS3MficTZijUYJhJ9X+Ivzb4uI/jRpHWTdJuUnZDvBA8Xwo+w1U\n9sAr0wNDvbqeZBlMbNiaE46ZgTQgU5NNGKFDF8LOHRzZ0JOJCAUYeDPsvA3qXgFTTz7Lw+GgSCKx\nDlKPc0yWEVWFYAwUBSwKxJSEoOno+8wGnkI0/oDudDjIK/aH76MdT/TNUheJslQRupK6DmpbBF1O\nMgBUgXQPtHlEfRLn5AdQqqrGFEVpQqjOCvyXqqo7u//TUd9jiIcKaVeQ55jRnXEywoDdiAeTUXhE\n+5yEHnHHAcUEliyIR+Rdescr0XAcwGBIntRG0chqwu9XcN6CFylsPoDb10EsbKbJlMlHu8oY/60X\n2WYaz/ad42ldngk7EL27YfREg9kMyfOg/VVkpkFV1XHdcpJGICtV2bKIJ3jIxOSElFsqmExRBs6t\nJMnupzpWjL8iGbZqMpWF1AZEzEKAagSUqOTVvaykISfW5bLVb0V3NPlgGxWkaEgVA3IOsTewn9CM\nWpqXZhOvsojzUrX/uNACQ/kcJjD3QodHB+TSOcrsUmLFIYMC6YTagbY4BOMiiM/XzjMBNeiVj7yu\nagaTJiu1l7I6ghjoBi+jAxk4ObTGgyp+JgbOKCTFIByHji6NkPSU8iGlB0tCFIhuZAV6BjWx5SYD\nIPn3FIRuSrRL1wNBFcpViEQgGgWrRXNIKgRUIb8oooxatWuZtHKo9sArUUSSnxlhTwpaliICQR/Q\nAsEyCKai1xwxwAd0gc0GdrMIamQ2SpYhGxAz67LqSYeSi4x1pazkXjbqsoFybS8bUC3ocXIGOEd3\nkTGoidSMdur3N+CYvJ+m9izCDQ7UVjNYTGA/AZ8leclN2qusTKUupB+SZTQHUfbs6LG6iyMrGMyQ\nOg9a++CzEhsvsgxKc7cC6YgYd3aQDFcTabE2Yu0Wmlc1UjRmA860AFGXhcZ4NrVtBYTe1zgmBoGq\nGZSknu0K9GKXGDRZBAcZsNnGBSiZugf7rt3cOOJvWANhfMmpLLbPY81LEaIFKvgVEfi2IcxHylhm\nF1Xt4R29sHeJxAo/0afKAEpefyCiLKYiykITRzbw7ZqMZSAhG9lSVuYkYVe99VlHN0AP19kx8Wyx\nGMJBdAA2iMpIWAV/DPyJFZdZv+YRgVRCOexJVok+/uggzprw2Ywo90lgyQyTk1dPkX0/1eoBwrPq\naNmeSfwzq3AVdvQATNpXOOF+R8cp3eCLjoEKAtNUVW3u9iyX59jH5ENIIcQQrZmwqlX0iii7bo/u\npLu0vQ2RAlUR+uxEz2CYnoGha+GzLFRV/W2vnkTqO8MDhSqOcwJMvmwNV5teILX6Ay574RXU1WA9\noN0vBjNdcNbLn7D4knn8KXshn/jPgYo4hGbCgZioRBxaizjl++BfCcPXw/as4/Mwe47MLgGYPHq6\nXVqi1QQ2j3B8NkDVQupxCqlTx3PtT/7OwK5DPB39HltfnUTQ5xHGIwM7J0J7XcngWgtKJnT02OUr\nIAu7w6Nnumzo5SrTg2N2gOFX7uS69Cc5u+M9Vtjh4/kf8oF9Hr76VGhRIcsiskhDEAFNHSLLoXqg\n/RkYtBbKu9FhiufIIEcWiByPblc2RMU+fA6mMREsSVGiB63Eyy1QFoPwDI0zIoiyAQM9uoOT3UVh\nwJwM49aCNRNWdierWdpeekQToltP1lBoRB3AXLDEIUkRTi4YBetckieUkj4kj2hbnJatJkJ1XcKu\njmhdyMg6GVgKDO7e3hMDcqlDq0fvuskG0+lRrKYpOM5uIeY0E9zvIBq1QVkEmASWCJi1AMqnQiAu\nymw+okM/ArQCfg+0PgOj1sLmbnRo9RzJT5ZBO6JM+6MQbEPMenYDZZBUABa3yD6FwhArAiUuMiuy\nORf1CL3GtGsma9dvTIbstRDLhMZudJjY0pVqTPGI70GEveQgsquTPcI+2gBVxZQbxRY7DXteK44x\nAYac
vZtpRSsYEdrNtitbsM57jk+ZTeWW4bRvzCBa74GKZ+C8tfBaL31WYoWb5tGDI5lpNXkER8lL\ndqGlac/gQsgLYIxHVMxtiKDYDTgVyPo+dK6EsethYzc+y+k5kpfsbolrsnICJeCc0oUrfRTDv7OO\nCY51jA5+RkBJonJtNXNP95Ibr6PL5GKtaypvx+ZTljqV8A4VKhW9+0UFFE2Htb2UlfQRZmCQR+y1\nhGXu+EOck/ceIyfv4eaKdwn6bFSUltJqTmXb2ZNoz4iDwyz8017tecZ4oAC9Iu5CdBGWPwOXroWn\nu+GV4dHlJPUne09koO/WNosHRgHDtWffq0JdHMwzoTUu6scIwu9mevShEcGE5zUnw5S14MyEd7vz\nWZ4j22UghpPEJVE/omB3AEPB2oiSloo1y0QaXahdI0lxlRMmRlssiQ5fKjQ5QPHoAatWb4rnTwbz\nWoj1oMN4wl4Gm26PnsGUgVOBRwTopSpp57VwmedFrjj4GmvHBlgxfxlLnz6Xjl0ZQo/piHonSZOz\nLOPpnj4HUIq2LsMJQVGUSkSxiwGPqar6+DHOURmvHhlxH+5L1U5KQU+NNgGH4tAREy3dbO1/coxG\nJyIQsCIcREj7j0yVuxQ4WAKWNAhsBvhht7yUBB4OoEDFfE6UCd9fwy9t/8kZz6/B/UQnzRsgYDYx\ncDgwA5StKhXrVXIzofFXRfxpzEIefvhOeDIE1CK0lAwpZvExDdhVAqbj81IURSVZ1fkktshldyDa\ns7u1a/q038JtosVxiYPT/7iGdRVnwKfw9A3f5qH1d7DxyWmwGT37ZNdkV6txwgzRDcccOKcoispw\n9fOVL5rc7UCbqmUrAA9M/sk67h99H/Pu/ZC2NyD1Unjunqt56N3b2fTnIXCoE1IL4CwrDNM41SHK\nqA9YViIeMNSNrKYl2K5J05+0k8RMQbGK+YwoaefWkWer49DHg2l7IgveUYU9pSowATgDGKMdWwuU\nKXqWsxNYWwLmNFDM0NWNrGg/+qgmNJ9G0qodk3t5Th3QDqcP5cI/vsPN0x+lUcnmwRV3s33hRNgc\n0AScit72UYFS7VjZMWV1mNd4Vb9V4lgJE6KcnaGScWMt44dtoohqmtVMtlRP5MD9Q+GlDgjbIdMK\nOSYRvNRp1zoLlB+okANqpSLsrEKBxZpt+bvRYVaCDqV/SEMfh+JvhsgeoBIUKyjjUGYXowy0odYo\nqJ/5oaEZzA6YlAVpZtHduF8VezMiyMnWuhR2loCSJspKpBsdflvjZdLk5AM6tbJp0TgWKlCo/bYV\nCKiYZsRIv6eW2c7luJUOJrORc1d/xIhnKuEzYCJwI6waPYnnHNfyTtkCDiwdDA+Ugi0NmnvwWVMT\nyqFs9CUj/IELYatacaYTPTDqQsx1OoQwO/kft/Y8ncBuRMYzTYFcRVynTLP57nQ4JEFWoFdAAY1H\nMSjfiTNr3sfcnvVnLtuyGOUlULchypdT8FRatWc4E8pvLuF7mY+y+odzib1jFcG6BbArEND8e7gH\nWd2UYO+JXc1t2klz4Jwr3uJn7j9w1vpPUBsUts8dxYVpr3HogVJUh1kMDfIDbwBvKeLZLkYENk7t\ntzZt/0QJWI+vQ0VRVM47yt5lgxZNVi5EZjNH20v/UwEsVWF7ByLFmQ2qVeuqNkOBSehMBnQxhKvZ\nWSLsSjGDrzuflaBD6T9jKsRk/349wnhqQRmIMvg0HN+NUXLvTq7jOcazhSKqqKSEpxt+wL9ev4r4\nr0xQp4jrpaNnQyNAWPPv9KDDWUfFJ4kZa1eC3FQhK2V6hIk/WMsy5uK6LIqSA//zsx/z+PJb2PPY\naOGaZcMnCT0I9iHsVeIT5aQPIgeYqapqraIo2cAHiqLsVFV1xTHPlGk8OyLwSUYU5AHACIQsq4EP\nENkb2iBSBfU+wAZKKig2MdYhroKSB51a+BjrhFAaxF1gVWDwSkjOh00KwI+65SW5OTReQ8EyJ8I0\nyxrG1e/C3dmF6UIou+cMHo7ewmfPn46tKsw5j73FXf/1INH368jaWUPRuBo4HfjABtUDxEUdJmE4\nWQhljV8Jaj6s74HX0enmIKIwgd56zNF+lz0Y6kYgiqKMwBSPi3EYTlhhnkVV5WCoRE/lo10zCrhX\ngikfwo0QzUFRlFndykp2/0h9tiMKhK8ZIvuBXHJHwhlpHzPp5yv51xIIhODC/TCks4KcuQ1QeRq8\naoW2ZliaJwpANsKJyzETpSshkg/l3cjKnPBZSThmQ9iTGxgCA6Ye4NvDn+VK28tsZRyPNN3J5ros\n8fxp2nM0iGcaPayM9OImqkcPovrVoaLSkxm2M1eCNR9aG2F5d7JK7LCX/SlhIBd9ZLtMhaOdVwu8\nB+TDOUVctekNBj24iqKBZu64Iokf3PgE7HNAe5cYb3AEPkUYiOv4sgJ97I4cAyATYlagEEqv3sH1\ng55mFsspZxjP7b6B2r8MgtdVCEfBkSTGONQjutVMMZjgwPbDEN+Z8QyqDdYPO53KpGGEOpJh6koI\n58PqHnQog3E5NgZE5dQOROJAEWSNQZkXx313B/em/5wMSyv/ar6cJa/Mh/9KhvgGqJsDg8zCjiLN\nQBOoqRDK0xtuRSshmg/+RmjsRoeJXRmNiOCiTQU1BIoFhli0TI/G0w5ps1u4+IZ/8NN//Anz39sw\nNcdxKiF8fj/vtwr5D90DxQGYdN1nHJq8jJ0lI6npyid2x0poyYdHevANiV0aZoR/kElJNH3uQgS3\ntRo3CyK+DiHKWS66ydUifEmmdryuHXxJkGwXvmvCSoj34LMkF9kmsGg8Agi/NwXmz3ydW8of44z/\ntwrWabzPBxZAzAGmACifAu8DH0BxSzV//Y87mH77Wjp2WkU5lOU6XyuHO3uQlWwcyEAzhgiqazQ5\nDINxqVsoqqumPc/FYtd8fvHTu6lbvwS15XQ4cwIpC/ykTG8lkOumeVC+kO0Rg6S1z13AuSshKR+e\n7YZXYnekzGaGEfYux18GEfa2H1HWJiBGGF8KLHRiGZVKhruRpqcHEH/RBzUd0OoCV7aoV+X4OgUY\ntRLc+RBrhJW9sHf5WSa5Y9J/hYQy86eT8l0r37r6JR7Y92sst4dJqeygozrE5lCUvTRyZf5OLrlg\nMQ+8+AvKfzZGBH/S7ckucVaCOV+MlepOh9I/SJnJsWKWhOuFEWU+Bmq6BbXGicMfZddWGDwImvzZ\ndITd4v/5iHjDjKg369EngMnGUeJ8nh7QYwClKMqTwHygXvaBK4qSjhgBVqwoShVwJfAvxAJZnxdC\nrVfszUCxByZ4MI+NkDGmjilJ65njXkaatY2d/pGsPmMG5Y1D6Gh2EC4rhU8jkGyCXCvkm8AcEw+8\ndCHsfRfIActaYXSBNyH4E1DaRUtA4Pi86jReCiIFmuMBQPWbaFEz+CxtBDsvGMbH5WezdPksKj5O\nor2qGYd7LM5cP2ptDHsQyjNL2RMtFQXT/32wvAXmXBiwVUujL
4YNCyHarmV7uuEV8ib09XpEl54c\n9mJFT/HKrJ0JaOwSA2yJYLe2k+psRw1BcL6Z6lWDaXloIdR+JILOlK2ixRRvhdi5wD7RIrdcLhkc\nW1ZNmqxURFdGlpAVzehjvuwDsd1o4dyz3uDyJU+y9aMA+zqFX1YaYMzKXdxx/p8YfvsOlo6dy7Zf\n/hmal8C/sqFoC2Q7oHMxVPRSVvs1TiZEajzDo3e7uYFcyB1yiHOL3+GW+icpbDzAn0vvZP+nBbDF\nJ7opBrqhVhXB+y4I3ns7W9etx5GdQtZTm2mqL4CDbbDhXAjsEy3f7B5kxa8TPp8BnIle04GIyMJC\noTYzmP0QkB65HcesVlL/1UzTpwHM50BmqAX+/j1oewthUGs1Y3gH+BmilkztXlYgZk1KJ2nxQMAj\nHEehClfHuGXQI3zrsyVEMsy8Vz+Uz56YQPRjm8jK2ZMhagKfIhqiQTMMMeGMXI3p+jdZmRvj3i0X\nU+0opHL7bnj+PvD3QocdXn0ol8sjZsOmIQKAWCcQh1wXaReEmXXrMm7Z+DjjNmzFNjaMOttC3bQ8\nNk8eDhsj0LAeVgyB5p9D+B0hk/g/oV0Bixna5kFYs3d7Dzrc5hX7MBCcCU1DILqNwynWmRnkXFJH\nOMNKe1U6ScEgp01fzU8jf2LYs+V8ujlOKAD5CmQ7Ib0A7grCR82Q9z6suF+hyZnBgWe3E/vFj8DX\nLjI9PenwgFf/nOXRu+TaERVAJeDbAIFaCNogOhDyRqNcFyN36gHOsC1nsmMjxf6DtAbSWWWewtt/\neRPfmpVEUwrg3I2w0wLKYti6ECK90GFrAqcUjZPMNtgh6YI2LuJtZry/luTtXVTMGMyLc69i7Yez\n4FlQc0DRsi2jrt7Gxb5/8fffrGPxi7uwFQ0n48JttPgzoXox1N0pHrYnTgBlXj1wHuARXTQhRNCT\nDCVDdjGzYS0Dd9ayPGcGT3V8iwMfdEDrAMgZAiPt+NssRLY5iJVbROJnzU1w6C1IyoWzt4og+uBi\nWLcQwr2odyq9un93e0QXtgnhr+SsvjZgnwqNMSgyM+WHK7jK9RrDt5UT+kCl68UYSa4w9suc/O7R\n21n9h2eJbfkQky+b+DWfwatAdStUngsxzWfl9WDv8QQdWj2iWzaqiIHh7AH2gWsE6ddZuHzeG/xo\n9cNkPV1JcxWkDISkiTClFqzbo0RqAuSXfoq7bDtsbYBIJlg/hKgD4suAnwodxnqhwyqv3hZN9Qj7\nUjUZye5L2SOSCrgUVNUkEs9msJZAS3I6XZ2uww2dw8GXD5GhDQHBZRBapgeOvURvMlBPIdaOeDbh\n2D3AJ8CfgR8BvwSmoc1J/BxyvOIBsxH9uUNUHJO7GDuyjJv4G6cvKyO5qouapAI8ynLqa3IJNjuJ\nptrhAoTDbkNcYyiQBJX2AAy6kL/+92rUdCuxXQpEV0HSjTBkIbQ+DFX3AJx7XF6JSyCYEC1Nq0J0\nhZW1rll0DHITsiXxWcV4aj7Mg21tMMiH9dsRLtjwAW37O0gthJX5Z/Dpztkie5Z1I4y9DTZcL9L7\nKUDFChh6IxQshF098LImcJLjniwcGRg4EE5ApjGdFgjlQMxEninMVNNaYgETGwdNoOFPDtTOuVqA\n9ICIZvaAqOQvAtudwMOgNkgG248pqyyv3sKUAZzkqAI5LmzfjnDpRW9ww64nyH69gtoWGCTURVMd\nZL7sY8re1RSOPsjg3P385dpp7P3zGRD9FVSVi9R8bIWQYfpCqH8YGruRVbH3yEkHIFqCMh3rhmHJ\nu7k48AZFnxxk18yhlD0zmdalyWBTYZYNJgPLVFjWBJWpzL2gmCuv3c4tv+hk2MB1vNu+AMp+EPff\nVgAAIABJREFUC/YFwq5aHoZoD7I6PAMvceqWHCQGh/PkVovIUip2CLgRzcciLsj9gOyWg2TGYPOA\n01gUuBC60oGpwIPozdZPgKuBHwN/A7zHlxWIqeogKrYWTVYusI6PcM6Utzlv5UeUbKvmmcnX8M7u\nC2lbkQlDVPhOFJqtsE8R3UB+IGoibVwLEy4cxs9sSfzs1020xDJorsklsvIjGHUj5C6ErQ/D3m50\nmOTVxmIJfZGuiaoDiLSIL2PtDFqwjR/7/kzG4ysx7YmTWgPnZCzFP9xF9LZr2fbCKPi0FWrLEE12\nD2LpCLsIAhofB+cCyFoI7Q9DrAcdjtdkVQPsbIToZmCPqIxm27jssleZXbSSiozBvJd/HvXBXMyD\nI9S055J+bSN557bRZbUQsqRRYSvgQPowbDUuzjWn88HfnuMv+dfyaeVcav/5Pgy+EYo137ClB59V\n6NWDAivCzmWXTTtQ3QThALjSYGoag2a3sCDnDwyyHiSltYXh0d0UVVcRq28lYHYxsXQTw+c7SLpq\nNL/7/S7y7yxj9/IxBF5fAaU3Qp7G60A3vDK9R87O9QEdqrCTdCgt2MOw3XtIq/axecJYnjvtYt54\nZxL7ViZBVx64B0OyA5KDlJUOxz/JyVnXVHPrrnqu3tFK5uU72Fx7OqG6lWDV7KrjYWjoQVaTvXpw\nDtr4Sm1zwkz3coYuLSepMsi+yaVs7DoDkkIwrgSKsiFmIvqihWjQLoL6AcDlN0L0NnjpetGWsQHt\nK4S9j1oIWx6Gzd3wGurV/Xvi7FWZMY8g6jonuGZ2cc11z3Halo8YeGAVeXsbyKqApFZwpoJjGjAj\nzkc3O3FY5/DkLz8l9cydVHw8Anb/FpIXwNCF0PQwhHvyWV6xk0NhHEBA6wmiDLDCBcnMm/MBP6h5\nkrGLdhJsAdsFVl4981tsq5tEoD2JCWe9zaTdn5KpNnL9gmw2lb4Lv78Oom0QjwMfAdeB+W7gYYj1\noMMirx4gyS5PPyLwkePrstFnxCoQ71IIFlhIHRXFNBoiKVZiEbM+m/igdq1W9ExdhkfM8pT1yP5j\n0zkaPQZQqqquUBSl+KjDC4QUWIGovoYCXlVV3z/+hRAF3gnYwWSJ46aTEirJrmvFXh8hzV7OyIpy\nIh9CJAiuuUAh1FRBRzlk+yAjVYx1bjwAwRQnz6VkopZ24TuYBq1vQuqzsGsWescyi7vlJcfHBbRx\nWvUK8SYzlR3DOXTaACK1duJLrSKd6sjGMTKZ0ddvYNLjW6lqCxK5xsQe8wgqPiyBXWEYOQuy9wvJ\n5mvSqVkEc5+FpbNEV0h3vBILV0zVZjTF9fEhUVW0CjqBHLsYzJhqh640SHcwOO0z5u9fQrTcyjux\neTRVBqEjHzJMwtlmI1Kq8cXAsxCZJWbgxculvruXlQ1Rzydp312Igj/bieeGD/l+65NMeG8N1btg\noAI5VggromdVbQH3qgCjt+0hZXon/ott3Lv1SvjYBPFOaI9AZBEMehYqZh2eOXJcWclxIHJMSAgx\nVqJFcMyY3MRkdSPT1q6ja6+Lp6+6lob38lFbbDDDhOkqE5ahfsL1dljZAv4kJpuSGba9DYvfRnF4\nP8pOUFvfAPczsH0W
mKIQ6oWsAN1Dyv6MxNGdbiFExSTG9pAFlmKYOobLdz2GpW4/qWOgNWsib62Y\nBeX70HP8suP+XUTgdC69tnfQZ4jZgEFgPzvED+2PM2jRQUwDY2zdPpbt74+AcBBm2cW4neUIByTL\ncaFCycR9LPSsZ9irLYSw8/G6s9m/oYT48rdh4rPw4Swxa68nXnKAqVO7thx8HPeD00rRyBrOGvg2\nw3+/nHfXQUoKZNbBgCXVnF++GEdxgPU3Tefl3Hn439sDjXI0rDYynhqILYLwM1Cj+Yae7F0GBHKW\nMC5wDsE6LpfzbvmIWxv/yunbN/HBrDPZkjuOA/XFlEdH8pDjx4y/ZAvZba2Euiy0hNM5qAxkZ2gE\nO+Kj6VrWhsm/hMdW3UrTujyCy38Kxc/C25pt9UaHcgypjM3lUgB+oEQla6yd8UVVDErfyWD3buaF\nXmX8moMoJeArdGGvDdGyPoq9s42RQzcxdTRUHYQnAwqXTPsnT+eksu/RRTDlWVg2C2K9KIeyLIaB\njjh0hCHaCb4MxsW3kr63FQZAWclEXtq+gLq/Nwjh2rThBlq3X836LN7pmkv+nTXMfO6/cWwOMWXk\nGvadXkr9a4sg+Vk4oPms3shK8gN93F8QsIFHXUbWujpiMQjgwqcUijTAcESjfVUXVNlwTQhQMruC\nkeN2k57XTnttK2+/4yN1wT7qW/OJ/GMRLHgW3uiFrBJ1aEMft5Y4K9IMzuF+Rl+xnR+ZHiLywj46\n1TCNY9KovnQQndZi7C4FV3En01nHhUor1ZvhnyEHA/P3UxEaAcFFkPEsbNPs3d9LnyXHtNoQE1po\nBBqgcBqzL97ANaEXOP2dzRCAwHfTectzKc/vuYZNZUMIqDlceUM2w85oY+y6jVxY3MErxdtY/TdF\ndKW31QJLgL9DrA91NOhrdkURth4WciITMVPRyuFx7moLREYoODOFe3WpfmyuMF1JCDdcjwgQ5cxh\nOawhcexsL3GiY6ByVFVdj2jqoShKS69mQ8jp44cUQnuSKE8ZwTupF1I3cTv20RFMahyXzYdpbxct\nFifuYjuZHc3U2OvZV+rCNaKA0el+lO372PpajHFrYyS5/KiOLnz2NJFFSZkCuWWi0GxSep6lEUcI\ntSUOIS2dUm2DdhOhnclijMBBbVC7C9IHNHNF2ktYt4UZYIdD5xZSX5WDWmaCwigMs+nrHyZr1w40\nQM4UOKtM2OPybnglDtSWBONa9KQC/gj4A2K2TJbWMrIBShTLpABFQw8w5d3NdG5OYtmBM2kOtQoi\naclHLg4abwSmgrNMGFFDBqjho9l8XlZykKEpYSuAjEsbudH8FOPeXUPjWogoUOKA9Cww50FtSS4H\nh2WTovjI21NPYUUNl1Ys4bk7z2LnWhc4BkOXQ8vuTIEhZcI6y7qRleSjIApXK9o4bBVGKpxWso4z\n25aS/pGPitNKea7iBlrCmVBggeFgzghiaw4SDjhAdWIbFcHcFIX9oMYU6iM5UBEAtR6UcVBQJuS9\nPUOsPdQtVPRIRUYIfvRpfXGxjETIoi3h4cTkGMngOzop+ddODu7xEfi+m3oll44XFESzqYjDa1Uc\nboZNB7YgU3G9nsFlFZcxjYuSMbsJT+0K3Pu62HlRCXVvu+FtH4x0gMUBq6zC7+1FBDk2SJvYwrSR\nq1mwbQk7LXaa1QxWPT4e31or+BqgdQqMLBN62dhDOZSZzC5E5XY4hW6HIiunFWzk4r0vsuef4MwC\n59lOzNEwXStipL9QzXcn/p1r//gKW+4dx/byfEKNnSKrhwUcbggEgAYIDwfrBnBZoL0He5etXjPg\nTIPUcViHhCi9qY5flvwnhT/ehd0RpnlAFs31uQQ/TqEyLYXK9BEsGnyFyKDtQLR4ZWZhhQqb2omb\nLFQ/NlSoNNAAgSkwvkxkh1/vQVaJw99kj7Ask6Ng9Jxaps7cyNXuF5j5wRpiT8ZoCJqpG51JeeEo\nDo7OI21CBwXjaklpaKPB2klOVTPKcjC1qZzZ9jHvDTyffV0NkDcFzigTevmkB14yqIshZmZGOsB0\nAFLSOD22kay2JhonZlDROZS6xcWQqYJjLJhsEFA0k7ZDcytVWxz8c823uWrmH+BfcWZEVvHxsHOo\njzVA6hRwlwn73d0LTtLfoemzE/E8BTCmfgdJlW10jLETSLcLnRUhfPQSIMPPmAk7GDlvD3PGf8y8\n8LsM2n2IvT4TG0w2Zp2zhHcOXkRDVwNMmAKFZcLHPtADL2nfcgkXWYl3IBI+bsg7rZZvj3qREddW\nsi0tgvucAdSeP5V38s9ndfM0iCpkFTbxkG0h7mWthLeBGjHRUp8JVYgMq3MKTNT8+4cZEOnJZ6GP\nIbUixjjSDhTi+lYB3897kCmLlxPeAh0XpPPO9efz0+2/pfPnSVBfCXY3H44+hwFXH6LQWUNebS23\n5D7CxswwanoWkTK/llWeAmZNh8FeyCqWsMljTvSsoFvTaRsiOeNTsbdHiR0AsxUG+A6Rnt9Ka2mO\nyCjH0NeaTEeflSnFkzi2tgd80UHkEj1P5VMRFdxeoAvC9Q527J/AvVNG4UrzEapOwpYZZPwlWxh6\n6R72xwupJZ+zHUu5SH0TX2QwD/uuY0zyDm4adD+x59uIOxTC2FADNn12uJyS2BfEgHiIwzk9cw4o\nDpF12h2F1gjYAlhHhSmeWMXFTYuwVkVInwqPpV/KxvWnQZpdtF5KEWsaxhCFVS5g2YRQUk8Sl4Pm\nYlJo2sOYtPBYDYnNgj41uyEIAT85M1ooHLIP7gE100TV68MI1m8HczEk2YUzaUYvwHI8TDzxSzeQ\nA/jkyrxyxsdQOHPGewx7rZy6N8McaoACF2QOUgiMdeA7P5UXpl/O2+55TEtZw/WbXmTUSxUUfraf\nB6b+gmsK4sTHZRH72KL3Tacen8YxEUEETq2AKY5tfoSL8t7mW2Xv0hDJZPmF0+j4dRbxkEUUuM0q\nkU02ImE7tChgKabk1i1krGokvhWiZgurA6ejclAI2RIGa5IW3PZCVodLvIyG5ZQ3N/oUyCBEHBAN\ngeLH4RrA3XN+zMH/qqO+EcpzxrMxPgIOmoDhoBRzuIl0ODuXuOBVLyAHiCYD+eAc4WfM4E2YF8cg\nD57IuIlP3KMho0usxbUZMaC1HDFOvSCOY0yQOWcv5XL7y8T+aaL50kza2tOIv18NtR2CTwPCwfXG\nw8ieTimmw1OpszBPVRiQWcuQpRWsSDFx5oIcKu4cjFJZhTNQT1d1nOg+yPpbhMsffJFG9w1Um0rB\n2SFsokSFz+TFG8XMJVLo0d6l404CBlghzUz2xW1cf/mTjLpyN1t2hsm93cVG6xR2rBwPK9EXhx2G\niG/3aNcah6iY8xVIV4R9N6LP+nGjT2fvCbKFLAe7ysVg81XsNwT51Zyf43nsU1I+8hONWgmUJhOe\nncO7F8/gJ6/8jsC/p8A1NqZdtJIp+auY2ryBq998DRaJ57XEoqg+mxCXHATeG
14yQx4Dolpfiy2K\n6ZIYnuAq8q2NvJp2CR8zG7KzoDRLZM8rgL1xKFDEAox7O2BvI5HFbpoeSgVaGaduISOvRVzfjT7Z\npLeQa4rJSS8mFce5ARwfxTAdhPJvFbHPXgRrVVgF7ABbWpi0H8f51aT7OOfF97H+UazfCmA1x3G2\nB7mofTFb0sbTIH1h4gKgPUEGdodXwkZkRoCkM7uYPHsTty96BJMfSn6Xyl9mX8tzn91AxW9Gwlui\ne7Ty3hi+8Wn46qElx0a4zsre904XgTnoS8ep8oY9QDaMJeLa+AjHPCbcvJwh71USXRKgudDMuqkz\nuGvf7+i8ORnq6yFuh2QLLTvSeGf3fApGH+KOjkeZumYzGY4UVE8L9YdSoEZ7cFmX9AayPpQNZhl4\nZiACINl4bhPHlBjYqlTaK8DdCaPbP2Pg4ENUjhuuT55JQkzuksNh5Jsw5HjjXuJEA6h6RVFyVVWt\nVxQlD+EuuznbqxtYqweaPcJQo4DdRle7EyKHiCpJrEudzgb7FOKNXag2M1Xn/pin47cS37iLSL2N\nkh/twjQ1lZFpbUTOsdKx1E28M1Vrbbih/m4x4DXW2fNT1Hr1yVCm6cAEsJohzyEc3j4gfAhQYUoG\nExfu5Fej72HQv9WidKjEFsLa8Ez2HRwqegryEd1HTeivBOgETG5YeTeYkkVXVXcIeRO+zEGM5UBc\nH8BngoBZLGjWgXBAoaXAQM5mG1dGXxH1w78BfwLqsmGgCxxR8Zy70QKmXLC+DqGtEOqkR6tp0Hh1\nIAYfuz36goHfge8lP0nSqj3sq4A0G4wYBW23pvDaJd/iF+/9Ad8CiLa66fp1CmNm72LUhAqc+0OM\nW7KLgQ4rXFfFvsoh0OqG1rshkgxqD7KSA2rNiAHIsTlAFPIszDj7Q0r37YEd8NmIsdyx9lFCO+wi\n3VsJbAuALw4ml3DKV8C9+b9m+sGP8JlAtZhoeHcgqDuBTIi+BW0V4OsUeeFu8Rv04Hc6YuySE1Fi\n09DetQE0iW5aguCuwXb6UC7f8BZLu9qZADzVdQHvxqcCXWAaA0kdwp6kpyUZ8WoXlzinJ8jJHDHE\nYMxcD46iEEOowOSMw5lgzw9hvXgATC4WM2TKEA5mmHgU+1mdXDhoEbcue5zZf11LbVYuf8v9LmrD\nPyG+HRgJSjK03w2hZFB60GGXV18VWPWI9ZsOO30Hg8bvZnByOTkb4LQpuZx+2zo63krl/jP/g2sv\ne4FB/iboArUuyrnx9/nH4O9QnTERUg4I+xxtgR25oGYgujzrICRXk+0GZV4Or02X5cFx5mmMumYL\nC3c/Quv6MBOz4H/P+D5Ld80RYx/rEOVrAPrrpmyIGW5jENPeRyAGBT6D8BUmwOoWsgokQ30vfNYB\nr57tyfaIgdEpYhzbWRe9w9g/7yL0op/OaVB+60geLbqZV965jui4CMHIUxAdBKmzWJ8xg7q5+aSF\nOoS/0sYR7rcU07UvGcxuWHE3WJPB3wOvZq/Yx4GYB2LTASuKI4u0OxowL4qAA9ZEZrJu50zRPSZn\nxJUBkQCMtcFkKyQXwOIM/E3NbGUMKssZEDtEktsvsol1ml31xr9v9OqBSr4Hgh5oA2tBmPMmvEnK\n/7bhcsFHOfNYUnYBPB+DcBTzaDOTFq/iry/dTtUPdrJ4D0wYAcPnaNdaD9TAiPAuUuw+weuVuyHa\nC1mVa7JSNf1ZPUL+cvkWD1w1+0V+134HrX+BzB/BL6bfz6LGS6nbNVDUtK2gJIfJuL4e2xMhMlrg\nzdlTadlyULzNKQLYciH0OhzYilYJ9SAsrz4MosMDISErrMC5Fham/C/5a3aQVAt1lw1kdfoUmn+W\nBlVbQC0E8sBvg1VQbhnBkqvnc87w90h5cw9ptKKc00L9x7lQ4wLuBjVTdPH2hCqvHpi6PGKCggNR\ntizoK4zL0REuIAXUCqgNgiMA7lgHjqSAPndHzrmRa6OFgKZl0LhMz1r2Er2dhXcRwltLNAN7FEXZ\ni0g8H3/aO4jXuMh+yxjQqEJrANR9QCbEQtpTOYkFrMRMUYjGwJxBuNJCeH8Amh0wL5XMpChjXqnl\nh4qFt54PEe6oBqdVRKIdhdDxOIRKIVp/fD4SA7z6ayya4+J1EenALEWsE3EQQXqOi6uvXcR9gV+R\nfv1BdlWpDL8Vflj0KMv/czbxFWaxGORW4K2boHGRCJTkIDVroRjEai+FSA+87F59OiXo6Uo5Cq3K\nLqaQK0GIxUUgEDgEUwoZ6G5naM1+Godm8MrsS/H/3AqBbGi+Bw4sEZxMPkh1gz8TwjeCUiq6qMxz\nIfaP4/PK8Yq97GKsUaERFLdKysWN5L/ZRN3uMCkqDD0fdvxkDP+TspBP7ppD0/K9qPtrQZ3K3q4h\nbC0cTeO4VO55sJ1FeyN0EmH2pF3U5ecRKCuA8OPQ1QsdykHkIJyPTxX8psN3Ml5myqp17Moo5d0R\nZxK8Jwkmgv0mH9GNDmL/dIi1gopBGR6j6M4KBn9ygH//RxeLfQodcT/qaivCEHLB9yNwavpLnQvN\n3cjqiEHkaAJLAsUl1gXrAOKyxIeB/ZDWARcMxboiQnYHFH4bHO1JRNYAdED8duhcgr4QmhvRhH8a\nGExPbRhA6FCua6TZV8Rqod6SQ+d0K/aDQW52PM4V9teJp9s4lJPH6lmnYQrGGWCqIT+5hrzqZnIf\naiB9SRPRjBi3HQiwaOajqLFOYACYhoKlCAKPQ7QUYj3o0OEVexvCqbm1xwPIsDAuZTvjw2XE8xUC\n/26i48Zy/LWtbBiaw9TMoeTYmghshIgtzvDKSpJzgxC6C/YvFrIqMmtycgP3QawE4s1gmwuhbnQ4\n1qtng5MgO28fU8NrcKwMk2mH8B+tbNg+leo3B+tjC1sQ3rAEER/L9c26EL7g6Ztg0yIIdIqxGwrg\nLoRDj0NyKYR64bMKvHoWUc4msoAtK8yVtlfJWNNMchyWTp/DU6Hv88FD59O51w2pcfBdDdfaKby2\ngeKx25iesorzt7zLTX+FRbXQqcB21xhaW9PAWQgHHxc23xOvTE2HEbTX/YQgbsU2JJPvOB8jY08r\nwXNMBLfYiL2mQnsQXA7ojEJkB6hpkJQD6VZIdoLJQcP+H3H3jLVE2sHVGsGaHIT0gXDgcbD00r9P\n1njJpQG0jKBtdJirbC+jHmwlPAcadg2g/v0USGul8MpOvFfcx7S7N7J59T7ya2KMvQU2f+8s/pJx\nPlvvfIQdW/bRGYDB7TUUFx/AlJtPfMPj4C4Ffw+8Sry6zzIjbKMF0ajLgGkTl+Op/hD3HwK0Jifz\n++u/x/t/uJCG6gLiJi1zMw2c347wUtN3GblhGze1wT/+vJ6ugAqWeijJgaZM2Hej7rMy5kJ9dz7L\ne+SA+5AKURu4S+BHKsVLa6gq9xMugPWF03i19QrUzdrYJlsmRFQI7IUDtcTfyqHJkcXCliDbFkFn\nLMak0Y1YstOIkotoQQxBbwh2gyJN
hzILlbicgQ89EynXAMwVn+NVUBODgX5oiWXQ6U/W1+uS15GT\nCqKI2b9JHv2+X9YgckQuRgXsiqJUA/cjhpTmIOK9z4Cbe7yKTFVGEKP7o3LkowMcLnGp7CRwmyHk\nEF1WmRYoUqDeBmcN5DrPy8yrfo7aijD7nU4CYRtEfbClGAofgPRZ0NwA8S6wj4ZoTfecZErQAThN\nEDeJpx2kcW0BgrkMnlHFJFZjfWQPm7fDhEIbT192BR//8WxalsRB6QCfW1SMvmqxQmwsDGuKYOAD\nkDZLjO2Jd4FjNER64AX6uAYLIqjL0I51mCDgBKsN4lHw74a4St5FDeRl1OHYHmZvSS5/3fZvdHU1\ngpoHkVqt2ykM8dGgPgD2WRBvALULTKPB8Rh0dVPAEtd+CgFdcYhEsFhNzMpaTuoqH6Z6cF4Ny6++\ngEcb7mDTQ6NpXmaGpgKxRklGGr69NhZtvYSM3Daq1QdRYmLtvtXTr8OS90tgMij1ug793chKthZk\nEBxC2M1sKKhtIK2tgw9N5/Ha3guhehfMH8bg0krqAwNoLckWzYAAWCIx5uYtJWdDI4f8EDObiQaj\nYiFI9T5wnAFqM9AFrtEw5LEeAiiJhIUyTTZIsejr8QTTNIHWAYdISrEwee4azLfFKO6CHZ4pVO4s\nhq0yHVOF3sflAW5DvD6qCeEJRmrX6gYK+ut8IoAPAi1OyoITeD/1//P23uFx1cf+/+uc7ast6l2y\nLdmyreZu2eAim95sA6ZDCMRgQjCQBEjjEoWQ5CYkIYAhEDBNAQIhoQcbMLgX3Lsly2pW10paSavt\ne873j885WiW/a0v3ee7zO88jr7WSzpmdmc985jPznpmLmGPaR+7GdhI+aUfqgMlFdqZOO4Ekqzgy\nfNgzhrCvD+F5XyEQhFNLp7PtX1HUYKu4obQGkn4O5oWiV5aq6bvvHDIciaXTjaPONqOE0+DDpQ4Q\nslnoGJeFeiID/HY6gyH6bEl4AnCsH0w2qOgJIRsViNWLm8XC8MJ4UB8DyhjuKCmXgOsv0H0OGepR\nHm39uZP6KQrWIp0GczZ8MOUyar+eQrjLGge/6yDgYwjspJ5uaEbgoRqaxY2jYXglH+b+AnIWQKgL\nYkPgLoHAKLZBT0WNTBk5wZgXZV7wG2xeP4YsUPMNJBT3M8+6lQnBRjJu78QV7ifJ6iMp0ovzWD8Z\nXV3kbG2j2StaW4VC8GTJSygX5IJzAfR3CTttL4H+c9Cl2wa9gksxAgmYzwuy3PAh7n4vmxIWc7Lb\nIXjgzhD2rE//436oSRR4u5YhUAaJhpsxWWTCUZiwBDIf3YA8cR5KR7ewDeYx8ArieBZ9qLgZjBNi\nVDQfIDEc4NC0Uk4fzCfaaqN4yVHuW/o85//2MzxfeSmZq7Lne5fz3rSlHDhRQePxcQSPv0XMZCLs\ni1C0NMy0qo9JmDODwc5eCA9B4hhs1n86A/okjamwyL2N8uYtNHeCfXECf/3k25xZf4popwo546DY\nSubF7ay64nlKHt2LtH+Io0lOoqiCL5HZ4PwFyAugRdMrRwmU/mUUB4r4/jwMlTJiMCUwtfQQrg8G\nGOhUiV0IvZY0zqzPEE1uDTki5dCrCChCuA880NPpor/biCSJDjsH5q5ETnsMUZGi2VJKEKCkc1z/\niWHT9d6A2J9B6JLeiFvrJaeqwr9S3NBqyqEvlCTWqd74Vh87pvd/GunU/i9SeKP+qqqqFyFaRB5V\nVTVfVdVXEar4oqqqk1VVvVhVVe+570I836tvwphFA8cJDpjngFmpUGaHiZLAHKTbRdfZNsBoJOuq\nAeZGt5GxZT/1djsPPXcB0WsPglQCuY2QfQdYbZC1GqbUwIRRCqQgnrfXgXP6wE89dB8A8p1UTtjN\n9M4ttO2CoRQTvfeP55WGVXR8mIrqMYkusA6NK/O+gAf2QkopzG0G1x0g2yBvNVTUQOEY6NIdA31j\nsYx4z4ngT7YVjDFEPLyEpYUHmBXdC14IjLdz7JUyon4fJMow8UvI2QOUgqERLHeAZIOE1ZBaA87P\nRW+c0S59L48i0iCGAMakCAukbThafeQlweHKC3hBXcUXL59HzydO0b16Srboq6I6iG0xc6p6KttO\nLeTtF5xstkGpA36w7bskjlshBGG6BbJrIG8UXunyG+5bKYEk4ZrVi9UfQraDJ5rG6V2ZED4Ie2P4\n3nQT+cgItQHo80GHH0NXlPOV7dhb+lh/Pvzk59dgyJ0KrmZIuhuS7DBxNVxcA7M+F6XsY7pGeMGy\nMd6gTQKkNERKTwbspBkzWZ38PP2nQqTa4WvpCg57cqF/EKGYHyBSUJMQLdgu0O59OyIP8uHYyNHD\n01q4PtJupul4AW8pN/OJ43I60tIxOMBoBofJT57cSjjVyCb3Inba59M3lIQJ6K3I4Osyv5v2AAAg\nAElEQVSJS/CWbwblTWAqFDRC+d3gssG41TCjBiaOIkNddrpxDCLSmmoEhqKEQyZCFgvGQIyMbV6u\ne+RrrntsP8sStzCpoR7TgPBLnRJIZxDBpplvQcJ6sJXC/EZEmzoLsALkHWAdo77rqcVksGf4yYm1\nwYAQnceYRrDEIlp8TUesfx/Ch9XB43pFbUsEtoeg8Av4772QUwrfaYZpms0qWQ3X18BF/0vboGOO\nEDQqRhkmgjQdJkdPcdOpd7m/4VnuPv4X7uh4lW9J1dyy600WrvsXE5/ajmvtKdTNQ3x0hYWX3ywh\ncUou1pePE0laDWEbJGsynDoGuiA+5SBBhmwjhotDFHfWYrWF+KLvUo51J0G0AzJNYgZ2qQyGbCAE\nZ6ICXtDtAzpJmfg0P9h+NSVZEqd2QN6tS1ANdshcDUU1kDMGmvRiCR2jpAJOkMappLR4ccgxDqTM\npCE6AXItjJvfzTUNH5CywUvkLidflt7LO4H7+VvN7ezatYCOT1KIjH+HGf96geJCibp9MvPvmITR\nYYQlq+GOGrh8DPo+UobDXb+BVMhWO0jr68BvBHKNNLrHES5MQ15uIP9b9Vx046fcvugVbjtZTetG\nP2EnXPPQKtJWfSAOBhVn4LI7xRrMXw1za2Dm5yP7U/3P10g8bBjRA8ogYXCrzHF8g73RT4EC9WVz\n2BEqIfp5F0geMVLJKIGktxvPBzWZgGJl8TuPsPcuKM2EG/Y9gSP/Rk1BVoJUA4Yx6vvIyRz6l+6w\nuxDRXBfx6LrWQNUIMA5q1GI6urOF/PMEiTiJH771Rpp6vc//ZQrvHNd9kiTdBuwFfqiq6n/Or/j/\nXsNOgSzwQEku0YenSBLNGHvQMDaAXwVvFHoHweGibOJBUj9rxtQMgzeM56OkFfC+KlzNqBoPP7at\nhc5qsP7Pw53PeukOnj4jzgjkQnJlN+ezk7wjx2iwQNbUJP56843su6eCUECCySaYaRBRLA9if8vW\n7qk7GwAta6G9GixjoGvkRqcvfg/xEk69D1OndmSYVMyiyDrKWw7Tb3DSkJMHn0UhaINxBm36u
[... base64-encoded PNG image data elided (the cell's plot output) ...]\n", 204 | "text/plain": [ 205 | "" 206 | ] 207 | }, 208 | "metadata": {}, 209 | "output_type": "display_data" 210 | } 211 | ], 212 | "source": [ 213 | "# %%\n", 214 | "if __name__ == '__main__':\n", 215 | " test_mnist()" 216 | ] 217 | } 218 | ], 219 | "metadata": { 220 | "kernelspec": { 221 | "display_name": "Python [Root]", 222 | "language": "python", 223 | "name": "Python [Root]" 224 | }, 225 | "language_info": { 226 | "codemirror_mode": { 227 | "name": "ipython", 228 | "version": 2 229 | }, 230 | 
"file_extension": ".py", 231 | "mimetype": "text/x-python", 232 | "name": "python", 233 | "nbconvert_exporter": "python", 234 | "pygments_lexer": "ipython2", 235 | "version": "2.7.11" 236 | } 237 | }, 238 | "nbformat": 4, 239 | "nbformat_minor": 0 240 | } 241 | -------------------------------------------------------------------------------- /notebooks/10_residual_network.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "\"\"\"In progress.\n", 10 | "\n", 11 | "Parag K. Mital, Jan 2016.\n", 12 | "\"\"\"" 13 | ] 14 | }, 15 | { 16 | "cell_type": "code", 17 | "execution_count": null, 18 | "metadata": {}, 19 | "outputs": [], 20 | "source": [ 21 | "# %%\n", 22 | "import tensorflow as tf\n", 23 | "from libs.connections import conv2d, linear\n", 24 | "from collections import namedtuple\n", 25 | "from math import sqrt\n" 26 | ] 27 | }, 28 | { 29 | "cell_type": "code", 30 | "execution_count": null, 31 | "metadata": {}, 32 | "outputs": [], 33 | "source": [ 34 | "# %%\n", 35 | "def residual_network(x, n_outputs,\n", 36 | " activation=tf.nn.relu):\n", 37 | " \"\"\"Builds a residual network.\n", 38 | "\n", 39 | " Parameters\n", 40 | " ----------\n", 41 | " x : Placeholder\n", 42 | " Input to the network\n", 43 | " n_outputs : TYPE\n", 44 | " Number of outputs of final softmax\n", 45 | " activation : Attribute, optional\n", 46 | " Nonlinearity to apply after each convolution\n", 47 | "\n", 48 | " Returns\n", 49 | " -------\n", 50 | " net : Tensor\n", 51 | " Description\n", 52 | "\n", 53 | " Raises\n", 54 | " ------\n", 55 | " ValueError\n", 56 | " If a 2D Tensor is input, the Tensor must be square or else\n", 57 | " the network can't be converted to a 4D Tensor.\n", 58 | " \"\"\"\n", 59 | " # %%\n", 60 | " LayerBlock = namedtuple(\n", 61 | " 'LayerBlock', ['num_repeats', 'num_filters', 'bottleneck_size'])\n", 62 | " blocks = [LayerBlock(3, 128, 32),\n", 63 | " LayerBlock(3, 256, 64),\n", 64 | " LayerBlock(3, 512, 128),\n", 65 | " LayerBlock(3, 1024, 256)]\n", 66 | "\n", 67 | " # %%\n", 68 | " input_shape = x.get_shape().as_list()\n", 69 | " if len(input_shape) == 2:\n", 70 | " ndim = int(sqrt(input_shape[1]))\n", 71 | " if ndim * ndim != input_shape[1]:\n", 72 | " raise ValueError('input_shape should be square')\n", 73 | " x = tf.reshape(x, [-1, ndim, ndim, 1])\n", 74 | "\n", 75 | " # %%\n", 76 | " # First convolution expands to 64 channels and downsamples\n", 77 | " net = conv2d(x, 64, k_h=7, k_w=7,\n", 78 | " name='conv1',\n", 79 | " activation=activation)\n", 80 | "\n", 81 | " # %%\n", 82 | " # Max pool and downsampling\n", 83 | " net = tf.nn.max_pool(\n", 84 | " net, [1, 3, 3, 1], strides=[1, 2, 2, 1], padding='SAME')\n", 85 | "\n", 86 | " # %%\n", 87 | " # Setup first chain of resnets\n", 88 | " net = conv2d(net, blocks[0].num_filters, k_h=1, k_w=1,\n", 89 | " stride_h=1, stride_w=1, padding='VALID', name='conv2')\n", 90 | "\n", 91 | " # %%\n", 92 | " # Loop through all res blocks\n", 93 | " for block_i, block in enumerate(blocks):\n", 94 | " for repeat_i in range(block.num_repeats):\n", 95 | "\n", 96 | " name = 'block_%d/repeat_%d' % (block_i, repeat_i)\n", 97 | " conv = conv2d(net, block.bottleneck_size, k_h=1, k_w=1,\n", 98 | " padding='VALID', stride_h=1, stride_w=1,\n", 99 | " activation=activation,\n", 100 | " name=name + '/conv_in')\n", 101 | "\n", 102 | " conv = conv2d(conv, block.bottleneck_size, k_h=3, k_w=3,\n", 
103 | " padding='SAME', stride_h=1, stride_w=1,\n", 104 | " activation=activation,\n", 105 | " name=name + '/conv_bottleneck')\n", 106 | "\n", 107 | " conv = conv2d(conv, block.num_filters, k_h=1, k_w=1,\n", 108 | " padding='VALID', stride_h=1, stride_w=1,\n", 109 | " activation=activation,\n", 110 | " name=name + '/conv_out')\n", 111 | "\n", 112 | " net = conv + net\n", 113 | " try:\n", 114 | " # upscale to the next block size\n", 115 | " next_block = blocks[block_i + 1]\n", 116 | " net = conv2d(net, next_block.num_filters, k_h=1, k_w=1,\n", 117 | " padding='SAME', stride_h=1, stride_w=1, bias=False,\n", 118 | " name='block_%d/conv_upscale' % block_i)\n", 119 | " except IndexError:\n", 120 | " pass\n", 121 | "\n", 122 | " # %%\n", 123 | " net = tf.nn.avg_pool(net,\n", 124 | " ksize=[1, net.get_shape().as_list()[1],\n", 125 | " net.get_shape().as_list()[2], 1],\n", 126 | " strides=[1, 1, 1, 1], padding='VALID')\n", 127 | " net = tf.reshape(\n", 128 | " net,\n", 129 | " [-1, net.get_shape().as_list()[1] *\n", 130 | " net.get_shape().as_list()[2] *\n", 131 | " net.get_shape().as_list()[3]])\n", 132 | "\n", 133 | " net = linear(net, n_outputs, activation=tf.nn.softmax)\n", 134 | "\n", 135 | " # %%\n", 136 | " return net\n", 137 | "\n", 138 | "\n", 139 | "def test_mnist():\n", 140 | " \"\"\"Test the resnet on MNIST.\"\"\"\n", 141 | " import tensorflow.examples.tutorials.mnist.input_data as input_data\n", 142 | "\n", 143 | " mnist = input_data.read_data_sets('MNIST_data/', one_hot=True)\n", 144 | " x = tf.placeholder(tf.float32, [None, 784])\n", 145 | " y = tf.placeholder(tf.float32, [None, 10])\n", 146 | " y_pred = residual_network(x, 10)\n", 147 | "\n", 148 | " # %% Define loss/eval/training functions\n", 149 | " cross_entropy = -tf.reduce_sum(y * tf.log(y_pred))\n", 150 | " optimizer = tf.train.AdamOptimizer().minimize(cross_entropy)\n", 151 | "\n", 152 | " # %% Monitor accuracy\n", 153 | " correct_prediction = tf.equal(tf.argmax(y_pred, 1), tf.argmax(y, 1))\n", 154 | " accuracy = tf.reduce_mean(tf.cast(correct_prediction, 'float'))\n", 155 | "\n", 156 | " # %% We now create a new session to actually perform the initialization the\n", 157 | " # variables:\n", 158 | " sess = tf.Session()\n", 159 | " sess.run(tf.initialize_all_variables())\n", 160 | "\n", 161 | " # %% We'll train in minibatches and report accuracy:\n", 162 | " batch_size = 50\n", 163 | " n_epochs = 5\n", 164 | " for epoch_i in range(n_epochs):\n", 165 | " # Training\n", 166 | " train_accuracy = 0\n", 167 | " for batch_i in range(mnist.train.num_examples // batch_size):\n", 168 | " batch_xs, batch_ys = mnist.train.next_batch(batch_size)\n", 169 | " train_accuracy += sess.run([optimizer, accuracy], feed_dict={\n", 170 | " x: batch_xs, y: batch_ys})[1]\n", 171 | " train_accuracy /= (mnist.train.num_examples // batch_size)\n", 172 | "\n", 173 | " # Validation\n", 174 | " valid_accuracy = 0\n", 175 | " for batch_i in range(mnist.validation.num_examples // batch_size):\n", 176 | " batch_xs, batch_ys = mnist.validation.next_batch(batch_size)\n", 177 | " valid_accuracy += sess.run(accuracy,\n", 178 | " feed_dict={\n", 179 | " x: batch_xs,\n", 180 | " y: batch_ys\n", 181 | " })\n", 182 | " valid_accuracy /= (mnist.validation.num_examples // batch_size)\n", 183 | " print('epoch:', epoch_i, ', train:',\n", 184 | " train_accuracy, ', valid:', valid_accuracy)\n", 185 | "\n", 186 | "\n", 187 | "if __name__ == '__main__':\n", 188 | " test_mnist()" 189 | ] 190 | } 191 | ], 192 | "metadata": { 193 | "language": "python" 194 | }, 195 | 
"nbformat": 4, 196 | "nbformat_minor": 0 197 | } 198 | -------------------------------------------------------------------------------- /notebooks/11_variational_autoencoder.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "\"\"\"Training a variational autoencoder with 2 layer fully-connected\n", 10 | "encoder/decoders and gaussian noise distribution.\n", 11 | "\n", 12 | "Parag K. Mital, Jan 2016\n", 13 | "\"\"\"\n", 14 | "import tensorflow as tf\n", 15 | "import numpy as np\n", 16 | "from libs.utils import weight_variable, bias_variable, montage_batch\n" 17 | ] 18 | }, 19 | { 20 | "cell_type": "code", 21 | "execution_count": null, 22 | "metadata": {}, 23 | "outputs": [], 24 | "source": [ 25 | "# %%\n", 26 | "def VAE(input_shape=[None, 784],\n", 27 | " n_components_encoder=2048,\n", 28 | " n_components_decoder=2048,\n", 29 | " n_hidden=2,\n", 30 | " debug=False):\n", 31 | " # %%\n", 32 | " # Input placeholder\n", 33 | " if debug:\n", 34 | " input_shape = [50, 784]\n", 35 | " x = tf.Variable(np.zeros((input_shape), dtype=np.float32))\n", 36 | " else:\n", 37 | " x = tf.placeholder(tf.float32, input_shape)\n", 38 | "\n", 39 | " activation = tf.nn.softplus\n", 40 | "\n", 41 | " dims = x.get_shape().as_list()\n", 42 | " n_features = dims[1]\n", 43 | "\n", 44 | " W_enc1 = weight_variable([n_features, n_components_encoder])\n", 45 | " b_enc1 = bias_variable([n_components_encoder])\n", 46 | " h_enc1 = activation(tf.matmul(x, W_enc1) + b_enc1)\n", 47 | "\n", 48 | " W_enc2 = weight_variable([n_components_encoder, n_components_encoder])\n", 49 | " b_enc2 = bias_variable([n_components_encoder])\n", 50 | " h_enc2 = activation(tf.matmul(h_enc1, W_enc2) + b_enc2)\n", 51 | "\n", 52 | " W_enc3 = weight_variable([n_components_encoder, n_components_encoder])\n", 53 | " b_enc3 = bias_variable([n_components_encoder])\n", 54 | " h_enc3 = activation(tf.matmul(h_enc2, W_enc3) + b_enc3)\n", 55 | "\n", 56 | " W_mu = weight_variable([n_components_encoder, n_hidden])\n", 57 | " b_mu = bias_variable([n_hidden])\n", 58 | "\n", 59 | " W_log_sigma = weight_variable([n_components_encoder, n_hidden])\n", 60 | " b_log_sigma = bias_variable([n_hidden])\n", 61 | "\n", 62 | " z_mu = tf.matmul(h_enc3, W_mu) + b_mu\n", 63 | " z_log_sigma = 0.5 * (tf.matmul(h_enc3, W_log_sigma) + b_log_sigma)\n", 64 | "\n", 65 | " # %%\n", 66 | " # Sample from noise distribution p(eps) ~ N(0, 1)\n", 67 | " if debug:\n", 68 | " epsilon = tf.random_normal(\n", 69 | " [dims[0], n_hidden])\n", 70 | " else:\n", 71 | " epsilon = tf.random_normal(\n", 72 | " tf.pack([tf.shape(x)[0], n_hidden]))\n", 73 | "\n", 74 | " # Sample from posterior\n", 75 | " z = z_mu + tf.exp(z_log_sigma) * epsilon\n", 76 | "\n", 77 | " W_dec1 = weight_variable([n_hidden, n_components_decoder])\n", 78 | " b_dec1 = bias_variable([n_components_decoder])\n", 79 | " h_dec1 = activation(tf.matmul(z, W_dec1) + b_dec1)\n", 80 | "\n", 81 | " W_dec2 = weight_variable([n_components_decoder, n_components_decoder])\n", 82 | " b_dec2 = bias_variable([n_components_decoder])\n", 83 | " h_dec2 = activation(tf.matmul(h_dec1, W_dec2) + b_dec2)\n", 84 | "\n", 85 | " W_dec3 = weight_variable([n_components_decoder, n_components_decoder])\n", 86 | " b_dec3 = bias_variable([n_components_decoder])\n", 87 | " h_dec3 = activation(tf.matmul(h_dec2, W_dec3) + b_dec3)\n", 88 | "\n", 89 | " W_mu_dec = 
weight_variable([n_components_decoder, n_features])\n", 90 | " b_mu_dec = bias_variable([n_features])\n", 91 | " y = tf.nn.tanh(tf.matmul(h_dec3, W_mu_dec) + b_mu_dec)\n", 92 | "\n", 93 | " # p(x|z)\n", 94 | " log_px_given_z = -tf.reduce_sum(\n", 95 | " x * tf.log(y + 1e-10) +\n", 96 | " (1 - x) * tf.log(1 - y + 1e-10), 1)\n", 97 | "\n", 98 | " # d_kl(q(z|x)||p(z))\n", 99 | " # Appendix B: 0.5 * sum(1 + log(sigma^2) - mu^2 - sigma^2)\n", 100 | " kl_div = -0.5 * tf.reduce_sum(\n", 101 | " 1.0 + 2.0 * z_log_sigma - tf.square(z_mu) - tf.exp(2.0 * z_log_sigma),\n", 102 | " 1)\n", 103 | " loss = tf.reduce_mean(log_px_given_z + kl_div)\n", 104 | "\n", 105 | " return {'cost': loss, 'x': x, 'z': z, 'y': y}\n" 106 | ] 107 | }, 108 | { 109 | "cell_type": "code", 110 | "execution_count": null, 111 | "metadata": {}, 112 | "outputs": [], 113 | "source": [ 114 | "# %%\n", 115 | "def test_mnist():\n", 116 | " \"\"\"Summary\n", 117 | "\n", 118 | " Returns\n", 119 | " -------\n", 120 | " name : TYPE\n", 121 | " Description\n", 122 | " \"\"\"\n", 123 | " # %%\n", 124 | " import tensorflow as tf\n", 125 | " import tensorflow.examples.tutorials.mnist.input_data as input_data\n", 126 | " import matplotlib.pyplot as plt\n", 127 | "\n", 128 | " # %%\n", 129 | " # load MNIST as before\n", 130 | " mnist = input_data.read_data_sets('MNIST_data', one_hot=True)\n", 131 | " ae = VAE()\n", 132 | "\n", 133 | " # %%\n", 134 | " learning_rate = 0.001\n", 135 | " optimizer = tf.train.AdamOptimizer(learning_rate).minimize(ae['cost'])\n", 136 | "\n", 137 | " # %%\n", 138 | " # We create a session to use the graph\n", 139 | " sess = tf.Session()\n", 140 | " sess.run(tf.initialize_all_variables())\n", 141 | "\n", 142 | " # %%\n", 143 | " # Fit all training data\n", 144 | " t_i = 0\n", 145 | " batch_size = 100\n", 146 | " n_epochs = 50\n", 147 | " n_examples = 20\n", 148 | " test_xs, _ = mnist.test.next_batch(n_examples)\n", 149 | " xs, ys = mnist.test.images, mnist.test.labels\n", 150 | " fig_manifold, ax_manifold = plt.subplots(1, 1)\n", 151 | " fig_reconstruction, axs_reconstruction = plt.subplots(2, n_examples, figsize=(10, 2))\n", 152 | " fig_image_manifold, ax_image_manifold = plt.subplots(1, 1)\n", 153 | " for epoch_i in range(n_epochs):\n", 154 | " print('--- Epoch', epoch_i)\n", 155 | " train_cost = 0\n", 156 | " for batch_i in range(mnist.train.num_examples // batch_size):\n", 157 | " batch_xs, _ = mnist.train.next_batch(batch_size)\n", 158 | " train_cost += sess.run([ae['cost'], optimizer],\n", 159 | " feed_dict={ae['x']: batch_xs})[0]\n", 160 | " if batch_i % 2 == 0:\n", 161 | " # %%\n", 162 | " # Plot example reconstructions from latent layer\n", 163 | " imgs = []\n", 164 | " for img_i in np.linspace(-3, 3, n_examples):\n", 165 | " for img_j in np.linspace(-3, 3, n_examples):\n", 166 | " z = np.array([[img_i, img_j]], dtype=np.float32)\n", 167 | " recon = sess.run(ae['y'], feed_dict={ae['z']: z})\n", 168 | " imgs.append(np.reshape(recon, (1, 28, 28, 1)))\n", 169 | " imgs_cat = np.concatenate(imgs)\n", 170 | " ax_manifold.imshow(montage_batch(imgs_cat))\n", 171 | " fig_manifold.savefig('manifold_%08d.png' % t_i)\n", 172 | "\n", 173 | " # %%\n", 174 | " # Plot example reconstructions\n", 175 | " recon = sess.run(ae['y'], feed_dict={ae['x']: test_xs})\n", 176 | " print(recon.shape)\n", 177 | " for example_i in range(n_examples):\n", 178 | " axs_reconstruction[0][example_i].imshow(\n", 179 | " np.reshape(test_xs[example_i, :], (28, 28)),\n", 180 | " cmap='gray')\n", 181 | " axs_reconstruction[1][example_i].imshow(\n", 
182 | " np.reshape(\n", 183 | " np.reshape(recon[example_i, ...], (784,)),\n", 184 | " (28, 28)),\n", 185 | " cmap='gray')\n", 186 | " axs_reconstruction[0][example_i].axis('off')\n", 187 | " axs_reconstruction[1][example_i].axis('off')\n", 188 | " fig_reconstruction.savefig('reconstruction_%08d.png' % t_i)\n", 189 | "\n", 190 | " # %%\n", 191 | " # Plot manifold of latent layer\n", 192 | " zs = sess.run(ae['z'], feed_dict={ae['x']: xs})\n", 193 | " ax_image_manifold.clear()\n", 194 | " ax_image_manifold.scatter(zs[:, 0], zs[:, 1],\n", 195 | " c=np.argmax(ys, 1), alpha=0.2)\n", 196 | " ax_image_manifold.set_xlim([-6, 6])\n", 197 | " ax_image_manifold.set_ylim([-6, 6])\n", 198 | " ax_image_manifold.axis('off')\n", 199 | " fig_image_manifold.savefig('image_manifold_%08d.png' % t_i)\n", 200 | "\n", 201 | " t_i += 1\n", 202 | "\n", 203 | "\n", 204 | " print('Train cost:', train_cost /\n", 205 | " (mnist.train.num_examples // batch_size))\n", 206 | "\n", 207 | " valid_cost = 0\n", 208 | " for batch_i in range(mnist.validation.num_examples // batch_size):\n", 209 | " batch_xs, _ = mnist.validation.next_batch(batch_size)\n", 210 | " valid_cost += sess.run([ae['cost']],\n", 211 | " feed_dict={ae['x']: batch_xs})[0]\n", 212 | " print('Validation cost:', valid_cost /\n", 213 | " (mnist.validation.num_examples // batch_size))\n", 214 | "\n", 215 | "\n", 216 | "if __name__ == '__main__':\n", 217 | " test_mnist()" 218 | ] 219 | } 220 | ], 221 | "metadata": { 222 | "language": "python" 223 | }, 224 | "nbformat": 4, 225 | "nbformat_minor": 0 226 | } 227 | -------------------------------------------------------------------------------- /notebooks/convert.py: -------------------------------------------------------------------------------- 1 | import nbformat 2 | from nbformat.v4 import new_code_cell, new_notebook 3 | import codecs 4 | 5 | 6 | def parse_py(fn): 7 | with open(fn, "r") as f: 8 | lines = [] 9 | for line_i in f: 10 | if line_i.startswith('# %%'): 11 | lines[-1] = lines[-1].strip('\n') 12 | if len(lines[-1]) == 0: 13 | lines = lines[:(len(lines)-1)] 14 | lines[-1] = lines[-1].strip('\n') 15 | yield "".join(lines) 16 | lines = [] 17 | lines.append(line_i) 18 | if lines: 19 | lines[-1] = lines[-1].strip('\n') 20 | yield "".join(lines) 21 | 22 | 23 | def py_to_ipynb(source, dest): 24 | # Create the code cells by parsing the file in input 25 | cells = [] 26 | for c in parse_py(source): 27 | cells.append(new_code_cell(source=c)) 28 | 29 | # This creates a V4 Notebook with the code cells extracted above 30 | nb0 = new_notebook(cells=cells, 31 | metadata={'language': 'python'}) 32 | 33 | with codecs.open(dest, encoding='utf-8', mode='w') as f: 34 | nbformat.write(nb0, f, 4) 35 | 36 | 37 | if __name__ == '__main__': 38 | import os 39 | root = '../python' 40 | files = [file for file in os.listdir(root) if '.py' in file] 41 | for file in files: 42 | py_to_ipynb(os.path.join(root, file), file.strip('.py') + '.ipynb') 43 | -------------------------------------------------------------------------------- /notebooks/libs: -------------------------------------------------------------------------------- 1 | ../python/libs -------------------------------------------------------------------------------- /python/.gitignore: -------------------------------------------------------------------------------- 1 | X_* 2 | -------------------------------------------------------------------------------- /python/01_basics.py: -------------------------------------------------------------------------------- 1 | """Summary of 
-------------------------------------------------------------------------------- /notebooks/libs: -------------------------------------------------------------------------------- 1 | ../python/libs -------------------------------------------------------------------------------- /python/.gitignore: -------------------------------------------------------------------------------- 1 | X_* 2 | -------------------------------------------------------------------------------- /python/01_basics.py: -------------------------------------------------------------------------------- 1 | """Summary of TensorFlow basics. 2 | 3 | Parag K. Mital, Jan 2016.""" 4 | # %% Import tensorflow and pyplot 5 | import tensorflow as tf 6 | import matplotlib.pyplot as plt 7 | 8 | # %% tf.Graph represents a collection of tf.Operations 9 | # You can create operations by writing out equations. 10 | # By default, there is a graph: tf.get_default_graph() 11 | # and any new operations are added to this graph. 12 | # The result of a tf.Operation is a tf.Tensor, which holds 13 | # the values. 14 | 15 | # %% First a tf.Tensor 16 | n_values = 32 17 | x = tf.linspace(-3.0, 3.0, n_values) 18 | 19 | # %% Construct a tf.Session to execute the graph. 20 | sess = tf.Session() 21 | result = sess.run(x) 22 | 23 | # %% Alternatively pass a session to the eval fn: 24 | x.eval(session=sess) 25 | # x.eval() does not work, as it requires a session! 26 | 27 | # %% We can set up an interactive session if we don't 28 | # want to keep passing the session around: 29 | sess.close() 30 | sess = tf.InteractiveSession() 31 | 32 | # %% Now this will work! 33 | x.eval() 34 | 35 | # %% Now a tf.Operation 36 | # We'll use our values from [-3, 3] to create a Gaussian Distribution 37 | sigma = 1.0 38 | mean = 0.0 39 | z = (tf.exp(tf.negative(tf.pow(x - mean, 2.0) / 40 | (2.0 * tf.pow(sigma, 2.0)))) * 41 | (1.0 / (sigma * tf.sqrt(2.0 * 3.1415)))) 42 | 43 | # %% By default, new operations are added to the default Graph 44 | assert z.graph is tf.get_default_graph() 45 | 46 | # %% Execute the graph and plot the result 47 | plt.plot(z.eval()) 48 | 49 | # %% We can find out the shape of a tensor like so: 50 | print(z.get_shape()) 51 | 52 | # %% Or in a more friendly format 53 | print(z.get_shape().as_list()) 54 | 55 | # %% Sometimes we may not know the shape of a tensor 56 | # until it is computed in the graph. In that case 57 | # we should use the tf.shape fn, which will return a 58 | # Tensor which can be eval'ed, rather than a discrete 59 | # value of tf.Dimension 60 | print(tf.shape(z).eval()) 61 | 62 | # %% We can combine tensors like so: 63 | print(tf.stack([tf.shape(z), tf.shape(z), [3], [4]]).eval()) 64 | 65 | # %% Let's multiply the two to get a 2d gaussian 66 | z_2d = tf.matmul(tf.reshape(z, [n_values, 1]), tf.reshape(z, [1, n_values])) 67 | 
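# %% A quick sanity check on the static/dynamic shape distinction above.
# This is a minimal sketch, assuming the InteractiveSession opened earlier
# is still active: the static shape is known while building the graph,
# while the dynamic shape is computed by actually running it.
print(z_2d.get_shape().as_list())  # static shape: [32, 32]
print(tf.shape(z_2d).eval())       # dynamic shape: [32 32]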
68 | # %% Execute the graph and visualize the 2d gaussian: 69 | plt.imshow(z_2d.eval()) 70 | 71 | # %% For fun let's create a Gabor patch: 72 | x = tf.reshape(tf.sin(tf.linspace(-3.0, 3.0, n_values)), [n_values, 1]) 73 | y = tf.reshape(tf.ones_like(x), [1, n_values]) 74 | z = tf.multiply(tf.matmul(x, y), z_2d) 75 | plt.imshow(z.eval()) 76 | 77 | # %% We can also list all the operations of a graph: 78 | ops = tf.get_default_graph().get_operations() 79 | print([op.name for op in ops]) 80 | 81 | # %% Let's try creating a generic function for computing the same thing: 82 | def gabor(n_values=32, sigma=1.0, mean=0.0): 83 | x = tf.linspace(-3.0, 3.0, n_values) 84 | z = (tf.exp(tf.negative(tf.pow(x - mean, 2.0) / 85 | (2.0 * tf.pow(sigma, 2.0)))) * 86 | (1.0 / (sigma * tf.sqrt(2.0 * 3.1415)))) 87 | gauss_kernel = tf.matmul( 88 | tf.reshape(z, [n_values, 1]), tf.reshape(z, [1, n_values])) 89 | x = tf.reshape(tf.sin(tf.linspace(-3.0, 3.0, n_values)), [n_values, 1]) 90 | y = tf.reshape(tf.ones_like(x), [1, n_values]) 91 | gabor_kernel = tf.multiply(tf.matmul(x, y), gauss_kernel) 92 | return gabor_kernel 93 | 94 | # %% Confirm this does something: 95 | plt.imshow(gabor().eval()) 96 | 97 | # %% And another function which can convolve 98 | def convolve(img, W): 99 | # The W matrix is only 2D 100 | # But conv2d will need a tensor which is 4d: 101 | # height x width x n_input x n_output 102 | if len(W.get_shape()) == 2: 103 | dims = W.get_shape().as_list() + [1, 1] 104 | W = tf.reshape(W, dims) 105 | 106 | if len(img.get_shape()) == 2: 107 | # num x height x width x channels 108 | dims = [1] + img.get_shape().as_list() + [1] 109 | img = tf.reshape(img, dims) 110 | elif len(img.get_shape()) == 3: 111 | dims = [1] + img.get_shape().as_list() 112 | img = tf.reshape(img, dims) 113 | # if the image is 3 channels, then our convolution 114 | # kernel needs to be repeated for each input channel 115 | W = tf.concat(axis=2, values=[W, W, W]) 116 | 117 | # Stride is how many values to skip for the dimensions of 118 | # num, height, width, channels 119 | convolved = tf.nn.conv2d(img, W, 120 | strides=[1, 1, 1, 1], padding='SAME') 121 | return convolved 122 | 123 | # %% Load up an image: 124 | from skimage import data 125 | img = data.astronaut() 126 | plt.imshow(img) 127 | print(img.shape) 128 | 129 | # %% Now create a placeholder for our graph which can store any input: 130 | x = tf.placeholder(tf.float32, shape=img.shape) 131 | 132 | # %% And a graph which can convolve our image with a Gabor kernel 133 | out = convolve(x, gabor()) 134 | 135 | # %% Now send the image into the graph and compute the result 136 | result = tf.squeeze(out).eval(feed_dict={x: img}) 137 | plt.imshow(result) 138 | -------------------------------------------------------------------------------- /python/02_linear_regression.py: -------------------------------------------------------------------------------- 1 | """Simple tutorial for using TensorFlow to compute a linear regression. 2 | 3 | Parag K. Mital, Jan. 2016""" 4 | # %% imports 5 | import numpy as np 6 | import tensorflow as tf 7 | import matplotlib.pyplot as plt 8 | 9 | 10 | # %% Let's create some toy data 11 | plt.ion() 12 | n_observations = 100 13 | fig, ax = plt.subplots(1, 1) 14 | xs = np.linspace(-3, 3, n_observations) 15 | ys = np.sin(xs) + np.random.uniform(-0.5, 0.5, n_observations) 16 | ax.scatter(xs, ys) 17 | fig.show() 18 | plt.draw() 19 | 20 | # %% tf.placeholders for the input and output of the network. Placeholders are 21 | # variables which we need to fill in when we are ready to compute the graph. 
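# As a minimal sketch of the idea (the name `demo` is only for illustration,
# and the lines are left commented out so they add no nodes to the graph
# built below): a placeholder holds no value until one is fed at run time.
# demo = tf.placeholder(tf.float32)
# with tf.Session() as demo_sess:
#     print(demo_sess.run(demo * 2.0, feed_dict={demo: 3.0}))  # prints 6.0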
22 | X = tf.placeholder(tf.float32) 23 | Y = tf.placeholder(tf.float32) 24 | 25 | # %% We will try to optimize min_{W,b} ||(X*W + b) - Y||^2 26 | # The `Variable()` constructor requires an initial value for the variable, 27 | # which can be a `Tensor` of any type and shape. The initial value defines the 28 | # type and shape of the variable. After construction, the type and shape of 29 | # the variable are fixed. The value can be changed using one of the assign 30 | # methods. 31 | W = tf.Variable(tf.random_normal([1]), name='weight') 32 | b = tf.Variable(tf.random_normal([1]), name='bias') 33 | Y_pred = tf.add(tf.multiply(X, W), b) 34 | 35 | # %% Loss function will measure the distance between our observations 36 | # and predictions and average over them. 37 | cost = tf.reduce_sum(tf.pow(Y_pred - Y, 2)) / (n_observations - 1) 38 | 39 | # %% if we wanted to add regularization, we could add other terms to the cost, 40 | # e.g. ridge regression has a parameter controlling the amount of shrinkage 41 | # over the norm of the weights. The larger the shrinkage, the more robust 42 | # to collinearity. 43 | # cost = tf.add(cost, tf.multiply(1e-6, tf.global_norm([W]))) 44 | 45 | # %% Use gradient descent to optimize W,b 46 | # Performs a single step in the negative gradient 47 | learning_rate = 0.01 48 | optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost) 49 | 50 | # %% We create a session to use the graph 51 | n_epochs = 1000 52 | with tf.Session() as sess: 53 | # Here we tell tensorflow that we want to initialize all 54 | # the variables in the graph so we can use them 55 | sess.run(tf.global_variables_initializer()) 56 | 57 | # Fit all training data 58 | prev_training_cost = 0.0 59 | for epoch_i in range(n_epochs): 60 | for (x, y) in zip(xs, ys): 61 | sess.run(optimizer, feed_dict={X: x, Y: y}) 62 | 63 | training_cost = sess.run( 64 | cost, feed_dict={X: xs, Y: ys}) 65 | print(training_cost) 66 | 67 | if epoch_i % 20 == 0: 68 | ax.plot(xs, Y_pred.eval( 69 | feed_dict={X: xs}, session=sess), 70 | 'k', alpha=epoch_i / n_epochs) 71 | fig.show() 72 | plt.draw() 73 | 74 | # Allow the training to quit if we've reached a minimum 75 | if np.abs(prev_training_cost - training_cost) < 0.000001: 76 | break 77 | prev_training_cost = training_cost 78 | fig.show() 79 | plt.waitforbuttonpress() 80 | -------------------------------------------------------------------------------- /python/03_polynomial_regression.py: -------------------------------------------------------------------------------- 1 | """Simple tutorial for using TensorFlow to compute polynomial regression. 2 | 3 | Parag K. Mital, Jan. 2016""" 4 | # %% Imports 5 | import numpy as np 6 | import tensorflow as tf 7 | import matplotlib.pyplot as plt 8 | 9 | 10 | # %% Let's create some toy data 11 | plt.ion() 12 | n_observations = 100 13 | fig, ax = plt.subplots(1, 1) 14 | xs = np.linspace(-3, 3, n_observations) 15 | ys = np.sin(xs) + np.random.uniform(-0.5, 0.5, n_observations) 16 | ax.scatter(xs, ys) 17 | fig.show() 18 | plt.draw() 19 | 20 | # %% tf.placeholders for the input and output of the network. Placeholders are 21 | # variables which we need to fill in when we are ready to compute the graph. 22 | X = tf.placeholder(tf.float32) 23 | Y = tf.placeholder(tf.float32) 24 | 25 | # %% Instead of a single factor and a bias, we'll create a polynomial function 26 | # with terms of several degrees. We will then learn the influence that each 27 | # degree of the input (X^0, X^1, X^2, ...) has on the final output (Y). 
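# Concretely, the loop below builds the model
#   Y_pred = b + W_1*X + W_2*X^2 + W_3*X^3 + W_4*X^4
# with one scalar weight per power of X and the bias acting as the
# degree-0 term.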
28 | Y_pred = tf.Variable(tf.random_normal([1]), name='bias') 29 | for pow_i in range(1, 5): 30 | W = tf.Variable(tf.random_normal([1]), name='weight_%d' % pow_i) 31 | Y_pred = tf.add(tf.multiply(tf.pow(X, pow_i), W), Y_pred) 32 | 33 | # %% Loss function will measure the distance between our observations 34 | # and predictions and average over them. 35 | cost = tf.reduce_sum(tf.pow(Y_pred - Y, 2)) / (n_observations - 1) 36 | 37 | # %% if we wanted to add regularization, we could add other terms to the cost, 38 | # e.g. ridge regression has a parameter controlling the amount of shrinkage 39 | # over the norm of the weights. The larger the shrinkage, the more robust 40 | # to collinearity. 41 | # cost = tf.add(cost, tf.multiply(1e-6, tf.global_norm([W]))) 42 | 43 | # %% Use gradient descent to optimize W,b 44 | # Performs a single step in the negative gradient 45 | learning_rate = 0.01 46 | optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost) 47 | 48 | # %% We create a session to use the graph 49 | n_epochs = 1000 50 | with tf.Session() as sess: 51 | # Here we tell tensorflow that we want to initialize all 52 | # the variables in the graph so we can use them 53 | sess.run(tf.global_variables_initializer()) 54 | 55 | # Fit all training data 56 | prev_training_cost = 0.0 57 | for epoch_i in range(n_epochs): 58 | for (x, y) in zip(xs, ys): 59 | sess.run(optimizer, feed_dict={X: x, Y: y}) 60 | 61 | training_cost = sess.run( 62 | cost, feed_dict={X: xs, Y: ys}) 63 | print(training_cost) 64 | 65 | if epoch_i % 100 == 0: 66 | ax.plot(xs, Y_pred.eval( 67 | feed_dict={X: xs}, session=sess), 68 | 'k', alpha=epoch_i / n_epochs) 69 | fig.show() 70 | plt.draw() 71 | 72 | # Allow the training to quit if we've reached a minimum 73 | if np.abs(prev_training_cost - training_cost) < 0.000001: 74 | break 75 | prev_training_cost = training_cost 76 | ax.set_ylim([-3, 3]) 77 | fig.show() 78 | plt.waitforbuttonpress() 79 | -------------------------------------------------------------------------------- /python/04_logistic_regression.py: -------------------------------------------------------------------------------- 1 | """Simple tutorial using code from the TensorFlow example for Regression. 2 | 3 | Parag K. Mital, Jan. 2016""" 4 | # pip3 install --upgrade 5 | # https://storage.googleapis.com/tensorflow/mac/tensorflow-0.6.0-py3-none-any.whl 6 | # %% 7 | import tensorflow as tf 8 | import tensorflow.examples.tutorials.mnist.input_data as input_data 9 | import numpy as np 10 | import matplotlib.pyplot as plt 11 | 12 | 13 | # %% 14 | # get the classic mnist dataset 15 | # one-hot means a sparse vector for every observation where only 16 | # the class label is 1, and every other class is 0. 17 | # more info here: 18 | # https://www.tensorflow.org/versions/0.6.0/tutorials/mnist/download/index.html#dataset-object 19 | mnist = input_data.read_data_sets('MNIST_data/', one_hot=True) 20 | 21 | # %% 22 | # mnist is now a DataSet with accessors for: 23 | # 'train', 'test', and 'validation'. 24 | # within each, we can access: 25 | # images, labels, and num_examples 26 | print(mnist.train.num_examples, 27 | mnist.test.num_examples, 28 | mnist.validation.num_examples) 29 | 30 | # %% the images are stored as: 31 | # n_observations x n_features tensor (n-dim array) 32 | # the labels are stored as n_observations x n_labels, 33 | # where each observation is a one-hot vector. 
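# For example, an image of the digit 3 carries the one-hot label
#   [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
# i.e. a 1 at index 3 and a 0 everywhere else.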
34 | print(mnist.train.images.shape, mnist.train.labels.shape) 35 | 36 | # %% the range of the values of the images is from 0-1 37 | print(np.min(mnist.train.images), np.max(mnist.train.images)) 38 | 39 | # %% we can visualize any one of the images by reshaping it to a 28x28 image 40 | plt.imshow(np.reshape(mnist.train.images[100, :], (28, 28)), cmap='gray') 41 | 42 | # %% We can create a container for an input image using tensorflow's graph: 43 | # We allow the first dimension to be None, since this will eventually 44 | # represent our mini-batches, or how many images we feed into a network 45 | # at a time during training/validation/testing. 46 | # The second dimension is the number of features that the image has. 47 | n_input = 784 48 | n_output = 10 49 | net_input = tf.placeholder(tf.float32, [None, n_input]) 50 | 51 | # %% We can write a simple regression (y = W*x + b) as: 52 | W = tf.Variable(tf.zeros([n_input, n_output])) 53 | b = tf.Variable(tf.zeros([n_output])) 54 | net_output = tf.nn.softmax(tf.matmul(net_input, W) + b) 55 | 56 | # %% We'll create a placeholder for the true output of the network 57 | y_true = tf.placeholder(tf.float32, [None, 10]) 58 | 59 | # %% And then write our loss function: 60 | cross_entropy = -tf.reduce_sum(y_true * tf.log(net_output)) 61 | 62 | # %% This compares the predicted label, taken as the argmax of the network's 63 | # output, with the true label for each observation 64 | correct_prediction = tf.equal( 65 | tf.argmax(net_output, 1), tf.argmax(y_true, 1)) 66 | 67 | # %% And now we can look at the mean of our network's correct guesses 68 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float")) 69 | 70 | # %% We can tell the tensorflow graph to train w/ gradient descent using 71 | # our loss function and an input learning rate 72 | optimizer = tf.train.GradientDescentOptimizer( 73 | 0.01).minimize(cross_entropy) 74 | 75 | # %% We now create a new session to actually perform the initialization of the 76 | # variables: 77 | sess = tf.Session() 78 | sess.run(tf.global_variables_initializer()) 79 | 80 | # %% Now actually do some training: 81 | batch_size = 100 82 | n_epochs = 10 83 | for epoch_i in range(n_epochs): 84 | for batch_i in range(mnist.train.num_examples // batch_size): 85 | batch_xs, batch_ys = mnist.train.next_batch(batch_size) 86 | sess.run(optimizer, feed_dict={ 87 | net_input: batch_xs, 88 | y_true: batch_ys 89 | }) 90 | print(sess.run(accuracy, 91 | feed_dict={ 92 | net_input: mnist.validation.images, 93 | y_true: mnist.validation.labels 94 | })) 95 | 96 | # %% Print final test accuracy: 97 | print(sess.run(accuracy, 98 | feed_dict={ 99 | net_input: mnist.test.images, 100 | y_true: mnist.test.labels 101 | })) 102 | 103 | # %% 104 | """ 105 | # We could do the same thing w/ Keras like so: 106 | from keras.models import Sequential 107 | model = Sequential() 108 | 109 | from keras.layers.core import Dense, Activation 110 | model.add(Dense(output_dim=10, input_dim=784, init='zero')) 111 | model.add(Activation("softmax")) 112 | 113 | from keras.optimizers import SGD 114 | model.compile(loss='categorical_crossentropy', 115 | optimizer=SGD(lr=learning_rate)) 116 | 117 | model.fit(mnist.train.images, mnist.train.labels, nb_epoch=n_epochs, 118 | batch_size=batch_size, show_accuracy=True) 119 | 120 | objective_score = model.evaluate(mnist.test.images, mnist.test.labels, 121 | batch_size=100, show_accuracy=True) 122 | """ 123 | -------------------------------------------------------------------------------- /python/05_basic_convnet.py: 
-------------------------------------------------------------------------------- 1 | """Simple tutorial following the TensorFlow example of a Convolutional Network. 2 | 3 | Parag K. Mital, Jan. 2016""" 4 | # %% Imports 5 | import tensorflow as tf 6 | import tensorflow.examples.tutorials.mnist.input_data as input_data 7 | from libs.utils import * 8 | import matplotlib.pyplot as plt 9 | 10 | 11 | # %% Set up input to the network and true output label. These are 12 | # simply placeholders which we'll fill in later. 13 | mnist = input_data.read_data_sets('MNIST_data/', one_hot=True) 14 | x = tf.placeholder(tf.float32, [None, 784]) 15 | y = tf.placeholder(tf.float32, [None, 10]) 16 | 17 | # %% Since x is currently [batch, height*width], we need to reshape to a 18 | # 4-D tensor to use it in a convolutional graph. If one component of 19 | # `shape` is the special value -1, the size of that dimension is 20 | # computed so that the total size remains constant. Since we haven't 21 | # defined the batch dimension's shape yet, we use -1 to denote this 22 | # dimension should not change size. 23 | x_tensor = tf.reshape(x, [-1, 28, 28, 1]) 24 | 25 | # %% We'll set up the first convolutional layer 26 | # Weight matrix is [height x width x input_channels x output_channels] 27 | filter_size = 5 28 | n_filters_1 = 16 29 | W_conv1 = weight_variable([filter_size, filter_size, 1, n_filters_1]) 30 | 31 | # %% Bias is [output_channels] 32 | b_conv1 = bias_variable([n_filters_1]) 33 | 34 | # %% Now we can build a graph which does the first layer of convolution: 35 | # we define our stride as batch x height x width x channels 36 | # instead of pooling, we use strides of 2 and more layers 37 | # with smaller filters. 38 | h_conv1 = tf.nn.relu( 39 | tf.nn.conv2d(input=x_tensor, 40 | filter=W_conv1, 41 | strides=[1, 2, 2, 1], 42 | padding='SAME') + 43 | b_conv1) 44 | 45 | # %% And just like the first layer, add additional layers to create 46 | # a deep net 47 | n_filters_2 = 16 48 | W_conv2 = weight_variable([filter_size, filter_size, n_filters_1, n_filters_2]) 49 | b_conv2 = bias_variable([n_filters_2]) 50 | h_conv2 = tf.nn.relu( 51 | tf.nn.conv2d(input=h_conv1, 52 | filter=W_conv2, 53 | strides=[1, 2, 2, 1], 54 | padding='SAME') + 55 | b_conv2) 56 | 57 | # %% We'll now reshape so we can connect to a fully-connected layer: 58 | h_conv2_flat = tf.reshape(h_conv2, [-1, 7 * 7 * n_filters_2]) 59 | 60 | # %% Create a fully-connected layer: 61 | n_fc = 1024 62 | W_fc1 = weight_variable([7 * 7 * n_filters_2, n_fc]) 63 | b_fc1 = bias_variable([n_fc]) 64 | h_fc1 = tf.nn.relu(tf.matmul(h_conv2_flat, W_fc1) + b_fc1) 65 | 66 | # %% We can add dropout for regularization and to reduce overfitting like so: 67 | keep_prob = tf.placeholder(tf.float32) 68 | h_fc1_drop = tf.nn.dropout(h_fc1, keep_prob) 69 | 70 | # %% And finally our softmax layer: 71 | W_fc2 = weight_variable([n_fc, 10]) 72 | b_fc2 = bias_variable([10]) 73 | y_pred = tf.nn.softmax(tf.matmul(h_fc1_drop, W_fc2) + b_fc2) 74 | 75 | # %% Define loss/eval/training functions 76 | cross_entropy = -tf.reduce_sum(y * tf.log(y_pred)) 77 | optimizer = tf.train.AdamOptimizer().minimize(cross_entropy) 78 | 79 | # %% Monitor accuracy 80 | correct_prediction = tf.equal(tf.argmax(y_pred, 1), tf.argmax(y, 1)) 81 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, 'float')) 82 | 83 | # %% We now create a new session to actually perform the initialization of the 84 | # variables: 85 | sess = tf.Session() 86 | sess.run(tf.global_variables_initializer()) 87 | 88 | # %% We'll train in 
minibatches and report accuracy: 89 | batch_size = 100 90 | n_epochs = 5 91 | for epoch_i in range(n_epochs): 92 | for batch_i in range(mnist.train.num_examples // batch_size): 93 | batch_xs, batch_ys = mnist.train.next_batch(batch_size) 94 | sess.run(optimizer, feed_dict={ 95 | x: batch_xs, y: batch_ys, keep_prob: 0.5}) 96 | print(sess.run(accuracy, 97 | feed_dict={ 98 | x: mnist.validation.images, 99 | y: mnist.validation.labels, 100 | keep_prob: 1.0 101 | })) 102 | 103 | # %% Let's take a look at the kernels we've learned 104 | W = sess.run(W_conv1) 105 | plt.imshow(montage(W / np.max(W)), cmap='coolwarm') 106 | -------------------------------------------------------------------------------- /python/06_modern_convnet.py: -------------------------------------------------------------------------------- 1 | """Tutorial on how to build a convnet w/ modern changes, e.g. 2 | Batch Normalization, Leaky rectifiers, and strided convolution. 3 | 4 | Parag K. Mital, Jan 2016. 5 | """ 6 | # %% 7 | import tensorflow as tf 8 | from libs.batch_norm import batch_norm 9 | from libs.activations import lrelu 10 | from libs.connections import conv2d, linear 11 | from libs.datasets import MNIST 12 | 13 | 14 | # %% Set up input to the network and true output label. These are 15 | # simply placeholders which we'll fill in later. 16 | mnist = MNIST() 17 | x = tf.placeholder(tf.float32, [None, 784]) 18 | y = tf.placeholder(tf.float32, [None, 10]) 19 | 20 | # %% We add a new type of placeholder to denote when we are training. 21 | # This will be used to change the way we compute the network during 22 | # training/testing. 23 | is_training = tf.placeholder(tf.bool, name='is_training') 24 | 25 | # %% We'll convert our MNIST vector data to a 4-D tensor: 26 | # N x H x W x C 27 | x_tensor = tf.reshape(x, [-1, 28, 28, 1]) 28 | 29 | # %% We'll use a new method called batch normalization.
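# (As a rough sketch of the transform itself: for activations x with batch
#  mean m and variance v, batch norm computes
#      gamma * (x - m) / sqrt(v + eps) + beta,
#  where gamma and beta are learned and eps is a small constant,
#  1e-3 in libs/batch_norm.py.)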
30 | # This process attempts to "reduce internal covariate shift", 31 | # which is a fancy way of saying that it normalizes the activations of 32 | # each batch using a smoothed version of the batch mean and variance. 33 | # The original paper proposes applying this before any nonlinearities. 34 | h_1 = lrelu(batch_norm(conv2d(x_tensor, 32, name='conv1'), 35 | is_training, scope='bn1'), name='lrelu1') 36 | h_2 = lrelu(batch_norm(conv2d(h_1, 64, name='conv2'), 37 | is_training, scope='bn2'), name='lrelu2') 38 | h_3 = lrelu(batch_norm(conv2d(h_2, 64, name='conv3'), 39 | is_training, scope='bn3'), name='lrelu3') 40 | h_3_flat = tf.reshape(h_3, [-1, 64 * 4 * 4]) 41 | h_4 = linear(h_3_flat, 10) 42 | y_pred = tf.nn.softmax(h_4) 43 | 44 | # %% Define loss/eval/training functions 45 | cross_entropy = -tf.reduce_sum(y * tf.log(y_pred)) 46 | train_step = tf.train.AdamOptimizer().minimize(cross_entropy) 47 | 48 | correct_prediction = tf.equal(tf.argmax(y_pred, 1), tf.argmax(y, 1)) 49 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, 'float')) 50 | 51 | # %% We now create a new session to actually perform the initialization of 52 | # the variables: 53 | sess = tf.Session() 54 | sess.run(tf.global_variables_initializer()) 55 | 56 | # %% We'll train in minibatches and report accuracy: 57 | n_epochs = 10 58 | batch_size = 100 59 | for epoch_i in range(n_epochs): 60 | for batch_i in range(mnist.train.num_examples // batch_size): 61 | batch_xs, batch_ys = mnist.train.next_batch(batch_size) 62 | sess.run(train_step, feed_dict={ 63 | x: batch_xs, y: batch_ys, is_training: True}) 64 | print(sess.run(accuracy, 65 | feed_dict={ 66 | x: mnist.validation.images, 67 | y: mnist.validation.labels, 68 | is_training: False 69 | })) 70 | -------------------------------------------------------------------------------- /python/07_autoencoder.py: -------------------------------------------------------------------------------- 1 | """Tutorial on how to create an autoencoder w/ Tensorflow. 2 | 3 | Parag K. Mital, Jan 2016 4 | """ 5 | # %% Imports 6 | import tensorflow as tf 7 | import numpy as np 8 | import math 9 | 10 | 11 | # %% Autoencoder definition 12 | def autoencoder(dimensions=[784, 512, 256, 64]): 13 | """Build a deep autoencoder w/ tied weights. 14 | 15 | Parameters 16 | ---------- 17 | dimensions : list, optional 18 | The number of neurons for each layer of the autoencoder.
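e.g. the default [784, 512, 256, 64] gives a 784 -> 512 -> 256 -> 64 encoder, whose transposed weights are reused for the decoder.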
19 | 20 | Returns 21 | ------- 22 | x : Tensor 23 | Input placeholder to the network 24 | z : Tensor 25 | Inner-most latent representation 26 | y : Tensor 27 | Output reconstruction of the input 28 | cost : Tensor 29 | Overall cost to use for training 30 | """ 31 | # %% input to the network 32 | x = tf.placeholder(tf.float32, [None, dimensions[0]], name='x') 33 | current_input = x 34 | 35 | # %% Build the encoder 36 | encoder = [] 37 | for layer_i, n_output in enumerate(dimensions[1:]): 38 | n_input = int(current_input.get_shape()[1]) 39 | W = tf.Variable( 40 | tf.random_uniform([n_input, n_output], 41 | -1.0 / math.sqrt(n_input), 42 | 1.0 / math.sqrt(n_input))) 43 | b = tf.Variable(tf.zeros([n_output])) 44 | encoder.append(W) 45 | output = tf.nn.tanh(tf.matmul(current_input, W) + b) 46 | current_input = output 47 | 48 | # %% latent representation 49 | z = current_input 50 | encoder.reverse() 51 | 52 | # %% Build the decoder using the same weights 53 | for layer_i, n_output in enumerate(dimensions[:-1][::-1]): 54 | W = tf.transpose(encoder[layer_i]) 55 | b = tf.Variable(tf.zeros([n_output])) 56 | output = tf.nn.tanh(tf.matmul(current_input, W) + b) 57 | current_input = output 58 | 59 | # %% now have the reconstruction through the network 60 | y = current_input 61 | 62 | # %% cost function measures pixel-wise difference 63 | cost = tf.reduce_sum(tf.square(y - x)) 64 | return {'x': x, 'z': z, 'y': y, 'cost': cost} 65 | 66 | 67 | # %% Basic test 68 | def test_mnist(): 69 | """Test the autoencoder using MNIST.""" 70 | import tensorflow as tf 71 | import tensorflow.examples.tutorials.mnist.input_data as input_data 72 | import matplotlib.pyplot as plt 73 | 74 | # %% 75 | # load MNIST as before 76 | mnist = input_data.read_data_sets('MNIST_data', one_hot=True) 77 | mean_img = np.mean(mnist.train.images, axis=0) 78 | ae = autoencoder(dimensions=[784, 256, 64]) 79 | 80 | # %% 81 | learning_rate = 0.001 82 | optimizer = tf.train.AdamOptimizer(learning_rate).minimize(ae['cost']) 83 | 84 | # %% 85 | # We create a session to use the graph 86 | sess = tf.Session() 87 | sess.run(tf.global_variables_initializer()) 88 | 89 | # %% 90 | # Fit all training data 91 | batch_size = 50 92 | n_epochs = 10 93 | for epoch_i in range(n_epochs): 94 | for batch_i in range(mnist.train.num_examples // batch_size): 95 | batch_xs, _ = mnist.train.next_batch(batch_size) 96 | train = np.array([img - mean_img for img in batch_xs]) 97 | sess.run(optimizer, feed_dict={ae['x']: train}) 98 | print(epoch_i, sess.run(ae['cost'], feed_dict={ae['x']: train})) 99 | 100 | # %% 101 | # Plot example reconstructions 102 | n_examples = 15 103 | test_xs, _ = mnist.test.next_batch(n_examples) 104 | test_xs_norm = np.array([img - mean_img for img in test_xs]) 105 | recon = sess.run(ae['y'], feed_dict={ae['x']: test_xs_norm}) 106 | fig, axs = plt.subplots(2, n_examples, figsize=(10, 2)) 107 | for example_i in range(n_examples): 108 | axs[0][example_i].imshow( 109 | np.reshape(test_xs[example_i, :], (28, 28))) 110 | axs[1][example_i].imshow( 111 | np.reshape([recon[example_i, :] + mean_img], (28, 28))) 112 | fig.show() 113 | plt.draw() 114 | plt.waitforbuttonpress() 115 | 116 | 117 | # %% 118 | if __name__ == '__main__': 119 | test_mnist() 120 | -------------------------------------------------------------------------------- /python/08_denoising_autoencoder.py: -------------------------------------------------------------------------------- 1 | """Tutorial on how to create a denoising autoencoder w/ Tensorflow. 2 | 3 | Parag K. 
Mital, Jan 2016 4 | """ 5 | import tensorflow as tf 6 | import numpy as np 7 | import math 8 | from libs.utils import corrupt 9 | 10 | 11 | # %% 12 | def autoencoder(dimensions=[784, 512, 256, 64]): 13 | """Build a deep denoising autoencoder w/ tied weights. 14 | 15 | Parameters 16 | ---------- 17 | dimensions : list, optional 18 | The number of neurons for each layer of the autoencoder. 19 | 20 | Returns 21 | ------- 22 | x : Tensor 23 | Input placeholder to the network 24 | z : Tensor 25 | Inner-most latent representation 26 | y : Tensor 27 | Output reconstruction of the input 28 | cost : Tensor 29 | Overall cost to use for training 30 | """ 31 | # input to the network 32 | x = tf.placeholder(tf.float32, [None, dimensions[0]], name='x') 33 | 34 | # Probability that we will corrupt input. 35 | # This is the essence of the denoising autoencoder, and is pretty 36 | # basic. We'll feed forward a noisy input, which can help our network 37 | # generalize better, for example to occlusions of what we're 38 | # really interested in. But to measure accuracy, we'll still 39 | # enforce a training signal which measures the original image's 40 | # reconstruction cost. 41 | # 42 | # We'll set this to 1 during training, 43 | # and back to 0 for testing/production 44 | # environments. 45 | corrupt_prob = tf.placeholder(tf.float32, [1]) 46 | current_input = corrupt(x) * corrupt_prob + x * (1 - corrupt_prob) 47 | 48 | # Build the encoder 49 | encoder = [] 50 | for layer_i, n_output in enumerate(dimensions[1:]): 51 | n_input = int(current_input.get_shape()[1]) 52 | W = tf.Variable( 53 | tf.random_uniform([n_input, n_output], 54 | -1.0 / math.sqrt(n_input), 55 | 1.0 / math.sqrt(n_input))) 56 | b = tf.Variable(tf.zeros([n_output])) 57 | encoder.append(W) 58 | output = tf.nn.tanh(tf.matmul(current_input, W) + b) 59 | current_input = output 60 | # latent representation 61 | z = current_input 62 | encoder.reverse() 63 | # Build the decoder using the same weights 64 | for layer_i, n_output in enumerate(dimensions[:-1][::-1]): 65 | W = tf.transpose(encoder[layer_i]) 66 | b = tf.Variable(tf.zeros([n_output])) 67 | output = tf.nn.tanh(tf.matmul(current_input, W) + b) 68 | current_input = output 69 | # now have the reconstruction through the network 70 | y = current_input 71 | # cost function measures pixel-wise difference 72 | cost = tf.sqrt(tf.reduce_mean(tf.square(y - x))) 73 | return {'x': x, 'z': z, 'y': y, 74 | 'corrupt_prob': corrupt_prob, 75 | 'cost': cost} 76 | 77 | # %% 78 | 79 | 80 | def test_mnist(): 81 | import tensorflow as tf 82 | import tensorflow.examples.tutorials.mnist.input_data as input_data 83 | import matplotlib.pyplot as plt 84 | 85 | # %% 86 | # load MNIST as before 87 | mnist = input_data.read_data_sets('MNIST_data', one_hot=True) 88 | mean_img = np.mean(mnist.train.images, axis=0) 89 | ae = autoencoder(dimensions=[784, 256, 64]) 90 | 91 | # %% 92 | learning_rate = 0.001 93 | optimizer = tf.train.AdamOptimizer(learning_rate).minimize(ae['cost']) 94 | 95 | # %% 96 | # We create a session to use the graph 97 | sess = tf.Session() 98 | sess.run(tf.global_variables_initializer()) 99 | 100 | # %% 101 | # Fit all training data 102 | batch_size = 50 103 | n_epochs = 10 104 | for epoch_i in range(n_epochs): 105 | for batch_i in range(mnist.train.num_examples // batch_size): 106 | batch_xs, _ = mnist.train.next_batch(batch_size) 107 | train = np.array([img - mean_img for img in batch_xs]) 108 | sess.run(optimizer, feed_dict={ 109 | ae['x']: train,
ae['corrupt_prob']: [1.0]}) 110 | print(epoch_i, sess.run(ae['cost'], feed_dict={ 111 | ae['x']: train, ae['corrupt_prob']: [1.0]})) 112 | 113 | # %% 114 | # Plot example reconstructions 115 | n_examples = 15 116 | test_xs, _ = mnist.test.next_batch(n_examples) 117 | test_xs_norm = np.array([img - mean_img for img in test_xs]) 118 | recon = sess.run(ae['y'], feed_dict={ 119 | ae['x']: test_xs_norm, ae['corrupt_prob']: [0.0]}) 120 | fig, axs = plt.subplots(2, n_examples, figsize=(10, 2)) 121 | for example_i in range(n_examples): 122 | axs[0][example_i].imshow( 123 | np.reshape(test_xs[example_i, :], (28, 28))) 124 | axs[1][example_i].imshow( 125 | np.reshape([recon[example_i, :] + mean_img], (28, 28))) 126 | fig.show() 127 | plt.draw() 128 | plt.waitforbuttonpress() 129 | 130 | if __name__ == '__main__': 131 | test_mnist() 132 | -------------------------------------------------------------------------------- /python/09_convolutional_autoencoder.py: -------------------------------------------------------------------------------- 1 | """Tutorial on how to create a convolutional autoencoder w/ Tensorflow. 2 | 3 | Parag K. Mital, Jan 2016 4 | """ 5 | import tensorflow as tf 6 | import numpy as np 7 | import math 8 | from libs.activations import lrelu 9 | from libs.utils import corrupt 10 | 11 | 12 | # %% 13 | def autoencoder(input_shape=[None, 784], 14 | n_filters=[1, 10, 10, 10], 15 | filter_sizes=[3, 3, 3, 3], 16 | corruption=False): 17 | """Build a deep denoising autoencoder w/ tied weights. 18 | 19 | Parameters 20 | ---------- 21 | input_shape : list, optional 22 | Shape of the input placeholder, e.g. [None, 784]. 23 | n_filters : list, optional 24 | Number of channels per layer (the first entry is the number of input channels). 25 | filter_sizes : list, optional 26 | Convolution kernel size for each layer. 27 | corruption : bool, optional -- if True, corrupt the input w/ `corrupt`. 28 | Returns 29 | ------- 30 | x : Tensor 31 | Input placeholder to the network 32 | z : Tensor 33 | Inner-most latent representation 34 | y : Tensor 35 | Output reconstruction of the input 36 | cost : Tensor 37 | Overall cost to use for training 38 | 39 | Raises 40 | ------ 41 | ValueError 42 | If a 2-D input can't be reshaped into a square image. 43 | """ 44 | # %% 45 | # input to the network 46 | x = tf.placeholder( 47 | tf.float32, input_shape, name='x') 48 | 49 | 50 | # %% 51 | # ensure 2-d is converted to square tensor.
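# (e.g. a flattened MNIST vector has 784 values and sqrt(784) = 28, so it
#  reshapes to 28 x 28 x 1; a 500-dim input would fail the integer check
#  below and raise the ValueError.)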
52 | if len(x.get_shape()) == 2: 53 | x_dim = np.sqrt(x.get_shape().as_list()[1]) 54 | if x_dim != int(x_dim): 55 | raise ValueError('Unsupported input dimensions') 56 | x_dim = int(x_dim) 57 | x_tensor = tf.reshape( 58 | x, [-1, x_dim, x_dim, n_filters[0]]) 59 | elif len(x.get_shape()) == 4: 60 | x_tensor = x 61 | else: 62 | raise ValueError('Unsupported input dimensions') 63 | current_input = x_tensor 64 | 65 | # %% 66 | # Optionally corrupt the input (denoising autoencoder) 67 | if corruption: 68 | current_input = corrupt(current_input) 69 | 70 | # %% 71 | # Build the encoder 72 | encoder = [] 73 | shapes = [] 74 | for layer_i, n_output in enumerate(n_filters[1:]): 75 | n_input = current_input.get_shape().as_list()[3] 76 | shapes.append(current_input.get_shape().as_list()) 77 | W = tf.Variable( 78 | tf.random_uniform([ 79 | filter_sizes[layer_i], 80 | filter_sizes[layer_i], 81 | n_input, n_output], 82 | -1.0 / math.sqrt(n_input), 83 | 1.0 / math.sqrt(n_input))) 84 | b = tf.Variable(tf.zeros([n_output])) 85 | encoder.append(W) 86 | output = lrelu( 87 | tf.add(tf.nn.conv2d( 88 | current_input, W, strides=[1, 2, 2, 1], padding='SAME'), b)) 89 | current_input = output 90 | 91 | # %% 92 | # store the latent representation 93 | z = current_input 94 | encoder.reverse() 95 | shapes.reverse() 96 | 97 | # %% 98 | # Build the decoder using the same weights 99 | for layer_i, shape in enumerate(shapes): 100 | W = encoder[layer_i] 101 | b = tf.Variable(tf.zeros([W.get_shape().as_list()[2]])) 102 | output = lrelu(tf.add( 103 | tf.nn.conv2d_transpose( 104 | current_input, W, 105 | tf.stack([tf.shape(x)[0], shape[1], shape[2], shape[3]]), 106 | strides=[1, 2, 2, 1], padding='SAME'), b)) 107 | current_input = output 108 | 109 | # %% 110 | # now have the reconstruction through the network 111 | y = current_input 112 | # cost function measures pixel-wise difference 113 | cost = tf.reduce_sum(tf.square(y - x_tensor)) 114 | 115 | # %% 116 | return {'x': x, 'z': z, 'y': y, 'cost': cost} 117 | 118 | 119 | # %% 120 | def test_mnist(): 121 | """Test the convolutional autoencoder using MNIST.""" 122 | # %% 123 | import tensorflow as tf 124 | import tensorflow.examples.tutorials.mnist.input_data as input_data 125 | import matplotlib.pyplot as plt 126 | 127 | # %% 128 | # load MNIST as before 129 | mnist = input_data.read_data_sets('MNIST_data', one_hot=True) 130 | mean_img = np.mean(mnist.train.images, axis=0) 131 | ae = autoencoder() 132 | 133 | # %% 134 | learning_rate = 0.01 135 | optimizer = tf.train.AdamOptimizer(learning_rate).minimize(ae['cost']) 136 | 137 | # %% 138 | # We create a session to use the graph 139 | sess = tf.Session() 140 | sess.run(tf.global_variables_initializer()) 141 | 142 | # %% 143 | # Fit all training data 144 | batch_size = 100 145 | n_epochs = 10 146 | for epoch_i in range(n_epochs): 147 | for batch_i in range(mnist.train.num_examples // batch_size): 148 | batch_xs, _ = mnist.train.next_batch(batch_size) 149 | train = np.array([img - mean_img for img in batch_xs]) 150 | sess.run(optimizer, feed_dict={ae['x']: train}) 151 | print(epoch_i, sess.run(ae['cost'], feed_dict={ae['x']: train})) 152 | 153 | # %% 154 | # Plot example reconstructions 155 | n_examples = 10 156 | test_xs, _ = mnist.test.next_batch(n_examples) 157 | test_xs_norm = np.array([img - mean_img for img in test_xs]) 158 | recon = sess.run(ae['y'], feed_dict={ae['x']: test_xs_norm}) 159 | print(recon.shape) 160 | fig, axs = plt.subplots(2, n_examples, figsize=(10, 2)) 161 | for example_i in range(n_examples): 162 |
axs[0][example_i].imshow( 163 | np.reshape(test_xs[example_i, :], (28, 28))) 164 | axs[1][example_i].imshow( 165 | np.reshape( 166 | np.reshape(recon[example_i, ...], (784,)) + mean_img, 167 | (28, 28))) 168 | fig.show() 169 | plt.draw() 170 | plt.waitforbuttonpress() 171 | 172 | 173 | # %% 174 | if __name__ == '__main__': 175 | test_mnist() 176 | -------------------------------------------------------------------------------- /python/10_residual_network.py: -------------------------------------------------------------------------------- 1 | """Deep residual network w/ bottleneck blocks (in progress). 2 | 3 | Parag K. Mital, Jan 2016. 4 | """ 5 | # %% 6 | import tensorflow as tf 7 | from libs.connections import conv2d, linear 8 | from collections import namedtuple 9 | from math import sqrt 10 | 11 | 12 | # %% 13 | def residual_network(x, n_outputs, 14 | activation=tf.nn.relu): 15 | """Builds a residual network. 16 | 17 | Parameters 18 | ---------- 19 | x : Placeholder 20 | Input to the network 21 | n_outputs : int 22 | Number of outputs of final softmax 23 | activation : callable, optional 24 | Nonlinearity to apply after each convolution 25 | 26 | Returns 27 | ------- 28 | net : Tensor 29 | Output of the final softmax layer 30 | 31 | Raises 32 | ------ 33 | ValueError 34 | If a 2D Tensor is input, the Tensor must be square or else 35 | the network can't be converted to a 4D Tensor. 36 | """ 37 | # %% 38 | LayerBlock = namedtuple( 39 | 'LayerBlock', ['num_repeats', 'num_filters', 'bottleneck_size']) 40 | blocks = [LayerBlock(3, 128, 32), 41 | LayerBlock(3, 256, 64), 42 | LayerBlock(3, 512, 128), 43 | LayerBlock(3, 1024, 256)] 44 | 45 | # %% 46 | input_shape = x.get_shape().as_list() 47 | if len(input_shape) == 2: 48 | ndim = int(sqrt(input_shape[1])) 49 | if ndim * ndim != input_shape[1]: 50 | raise ValueError('input_shape should be square') 51 | x = tf.reshape(x, [-1, ndim, ndim, 1]) 52 | 53 | # %% 54 | # First convolution expands to 64 channels and downsamples 55 | net = conv2d(x, 64, k_h=7, k_w=7, 56 | name='conv1', 57 | activation=activation) 58 | 59 | # %% 60 | # Max pool and downsampling 61 | net = tf.nn.max_pool( 62 | net, [1, 3, 3, 1], strides=[1, 2, 2, 1], padding='SAME') 63 | 64 | # %% 65 | # Set up first chain of resnets 66 | net = conv2d(net, blocks[0].num_filters, k_h=1, k_w=1, 67 | stride_h=1, stride_w=1, padding='VALID', name='conv2') 68 | 69 | # %% 70 | # Loop through all res blocks 71 | for block_i, block in enumerate(blocks): 72 | for repeat_i in range(block.num_repeats): 73 | 74 | name = 'block_%d/repeat_%d' % (block_i, repeat_i) 75 | conv = conv2d(net, block.bottleneck_size, k_h=1, k_w=1, 76 | padding='VALID', stride_h=1, stride_w=1, 77 | activation=activation, 78 | name=name + '/conv_in') 79 | 80 | conv = conv2d(conv, block.bottleneck_size, k_h=3, k_w=3, 81 | padding='SAME', stride_h=1, stride_w=1, 82 | activation=activation, 83 | name=name + '/conv_bottleneck') 84 | 85 | conv = conv2d(conv, block.num_filters, k_h=1, k_w=1, 86 | padding='VALID', stride_h=1, stride_w=1, 87 | activation=activation, 88 | name=name + '/conv_out') 89 | 90 | net = conv + net 91 | try: 92 | # upscale to the next block size 93 | next_block = blocks[block_i + 1] 94 | net = conv2d(net, next_block.num_filters, k_h=1, k_w=1, 95 | padding='SAME', stride_h=1, stride_w=1, bias=False, 96 | name='block_%d/conv_upscale' % block_i) 97 | except IndexError: 98 | pass 99 | 100 | # %% 101 | net = tf.nn.avg_pool(net, 102 | ksize=[1, net.get_shape().as_list()[1], 103 | net.get_shape().as_list()[2], 1], 104 | strides=[1, 1, 1, 1], padding='VALID') 105 | net = tf.reshape( 106
| net, 107 | [-1, net.get_shape().as_list()[1] * 108 | net.get_shape().as_list()[2] * 109 | net.get_shape().as_list()[3]]) 110 | 111 | net = linear(net, n_outputs, activation=tf.nn.softmax) 112 | 113 | # %% 114 | return net 115 | 116 | 117 | def test_mnist(): 118 | """Test the resnet on MNIST.""" 119 | import tensorflow.examples.tutorials.mnist.input_data as input_data 120 | 121 | mnist = input_data.read_data_sets('MNIST_data/', one_hot=True) 122 | x = tf.placeholder(tf.float32, [None, 784]) 123 | y = tf.placeholder(tf.float32, [None, 10]) 124 | y_pred = residual_network(x, 10) 125 | 126 | # %% Define loss/eval/training functions 127 | cross_entropy = -tf.reduce_sum(y * tf.log(y_pred)) 128 | optimizer = tf.train.AdamOptimizer().minimize(cross_entropy) 129 | 130 | # %% Monitor accuracy 131 | correct_prediction = tf.equal(tf.argmax(y_pred, 1), tf.argmax(y, 1)) 132 | accuracy = tf.reduce_mean(tf.cast(correct_prediction, 'float')) 133 | 134 | # %% We now create a new session to actually perform the initialization of 135 | # the variables: 136 | sess = tf.Session() 137 | sess.run(tf.global_variables_initializer()) 138 | 139 | # %% We'll train in minibatches and report accuracy: 140 | batch_size = 50 141 | n_epochs = 5 142 | for epoch_i in range(n_epochs): 143 | # Training 144 | train_accuracy = 0 145 | for batch_i in range(mnist.train.num_examples // batch_size): 146 | batch_xs, batch_ys = mnist.train.next_batch(batch_size) 147 | train_accuracy += sess.run([optimizer, accuracy], feed_dict={ 148 | x: batch_xs, y: batch_ys})[1] 149 | train_accuracy /= (mnist.train.num_examples // batch_size) 150 | 151 | # Validation 152 | valid_accuracy = 0 153 | for batch_i in range(mnist.validation.num_examples // batch_size): 154 | batch_xs, batch_ys = mnist.validation.next_batch(batch_size) 155 | valid_accuracy += sess.run(accuracy, 156 | feed_dict={ 157 | x: batch_xs, 158 | y: batch_ys 159 | }) 160 | valid_accuracy /= (mnist.validation.num_examples // batch_size) 161 | print('epoch:', epoch_i, ', train:', 162 | train_accuracy, ', valid:', valid_accuracy) 163 | 164 | 165 | if __name__ == '__main__': 166 | test_mnist() 167 | -------------------------------------------------------------------------------- /python/11_variational_autoencoder.py: -------------------------------------------------------------------------------- 1 | """Training a variational autoencoder with 3-layer fully-connected 2 | encoder/decoders and a Gaussian noise distribution. 3 | 4 | Parag K.
Mital, Jan 2016 5 | """ 6 | import tensorflow as tf 7 | import numpy as np 8 | from libs.utils import weight_variable, bias_variable, montage_batch 9 | 10 | 11 | # %% 12 | def VAE(input_shape=[None, 784], 13 | n_components_encoder=2048, 14 | n_components_decoder=2048, 15 | n_hidden=2, 16 | debug=False): 17 | # %% 18 | # Input placeholder 19 | if debug: 20 | input_shape = [50, 784] 21 | x = tf.Variable(np.zeros((input_shape), dtype=np.float32)) 22 | else: 23 | x = tf.placeholder(tf.float32, input_shape) 24 | 25 | activation = tf.nn.softplus 26 | 27 | dims = x.get_shape().as_list() 28 | n_features = dims[1] 29 | 30 | W_enc1 = weight_variable([n_features, n_components_encoder]) 31 | b_enc1 = bias_variable([n_components_encoder]) 32 | h_enc1 = activation(tf.matmul(x, W_enc1) + b_enc1) 33 | 34 | W_enc2 = weight_variable([n_components_encoder, n_components_encoder]) 35 | b_enc2 = bias_variable([n_components_encoder]) 36 | h_enc2 = activation(tf.matmul(h_enc1, W_enc2) + b_enc2) 37 | 38 | W_enc3 = weight_variable([n_components_encoder, n_components_encoder]) 39 | b_enc3 = bias_variable([n_components_encoder]) 40 | h_enc3 = activation(tf.matmul(h_enc2, W_enc3) + b_enc3) 41 | 42 | W_mu = weight_variable([n_components_encoder, n_hidden]) 43 | b_mu = bias_variable([n_hidden]) 44 | 45 | W_log_sigma = weight_variable([n_components_encoder, n_hidden]) 46 | b_log_sigma = bias_variable([n_hidden]) 47 | 48 | z_mu = tf.matmul(h_enc3, W_mu) + b_mu 49 | z_log_sigma = 0.5 * (tf.matmul(h_enc3, W_log_sigma) + b_log_sigma) 50 | 51 | # %% 52 | # Sample from noise distribution p(eps) ~ N(0, 1) 53 | if debug: 54 | epsilon = tf.random_normal( 55 | [dims[0], n_hidden]) 56 | else: 57 | epsilon = tf.random_normal( 58 | tf.stack([tf.shape(x)[0], n_hidden])) 59 | 60 | # Sample from posterior 61 | z = z_mu + tf.exp(z_log_sigma) * epsilon 62 | 63 | W_dec1 = weight_variable([n_hidden, n_components_decoder]) 64 | b_dec1 = bias_variable([n_components_decoder]) 65 | h_dec1 = activation(tf.matmul(z, W_dec1) + b_dec1) 66 | 67 | W_dec2 = weight_variable([n_components_decoder, n_components_decoder]) 68 | b_dec2 = bias_variable([n_components_decoder]) 69 | h_dec2 = activation(tf.matmul(h_dec1, W_dec2) + b_dec2) 70 | 71 | W_dec3 = weight_variable([n_components_decoder, n_components_decoder]) 72 | b_dec3 = bias_variable([n_components_decoder]) 73 | h_dec3 = activation(tf.matmul(h_dec2, W_dec3) + b_dec3) 74 | 75 | W_mu_dec = weight_variable([n_components_decoder, n_features]) 76 | b_mu_dec = bias_variable([n_features]) 77 | y = tf.nn.sigmoid(tf.matmul(h_dec3, W_mu_dec) + b_mu_dec) 78 | 79 | # p(x|z) 80 | log_px_given_z = -tf.reduce_sum( 81 | x * tf.log(y + 1e-10) + 82 | (1 - x) * tf.log(1 - y + 1e-10), 1) 83 | 84 | # d_kl(q(z|x)||p(z)) 85 | # Appendix B of Kingma & Welling (2014): 0.5 * sum(1 + log(sigma^2) - mu^2 - sigma^2) 86 | kl_div = -0.5 * tf.reduce_sum( 87 | 1.0 + 2.0 * z_log_sigma - tf.square(z_mu) - tf.exp(2.0 * z_log_sigma), 88 | 1) 89 | loss = tf.reduce_mean(log_px_given_z + kl_div) 90 | 91 | return {'cost': loss, 'x': x, 'z': z, 'y': y} 92 | 93 | 94 | # %% 95 | def test_mnist(): 96 | """Train a VAE on MNIST, periodically saving manifold, 97 | reconstruction, and latent-scatter plots. 98 | 99 | Returns 100 | ------- 101 | None 102 | """ 103 | # %% 104 | import tensorflow as tf 105 | import tensorflow.examples.tutorials.mnist.input_data as input_data 106 | import matplotlib.pyplot as plt 107 | 108 | # %% 109 | # load MNIST as before 110 | mnist = input_data.read_data_sets('MNIST_data', one_hot=True) 111 | ae = VAE() 112 | 113 | # %% 114 | learning_rate = 0.001 115 | optimizer =
tf.train.AdamOptimizer(learning_rate).minimize(ae['cost']) 116 | 117 | # %% 118 | # We create a session to use the graph 119 | sess = tf.Session() 120 | sess.run(tf.global_variables_initializer()) 121 | 122 | # %% 123 | # Fit all training data 124 | t_i = 0 125 | batch_size = 100 126 | n_epochs = 50 127 | n_examples = 20 128 | test_xs, _ = mnist.test.next_batch(n_examples) 129 | xs, ys = mnist.test.images, mnist.test.labels 130 | fig_manifold, ax_manifold = plt.subplots(1, 1) 131 | fig_reconstruction, axs_reconstruction = plt.subplots(2, n_examples, figsize=(10, 2)) 132 | fig_image_manifold, ax_image_manifold = plt.subplots(1, 1) 133 | for epoch_i in range(n_epochs): 134 | print('--- Epoch', epoch_i) 135 | train_cost = 0 136 | for batch_i in range(mnist.train.num_examples // batch_size): 137 | batch_xs, _ = mnist.train.next_batch(batch_size) 138 | train_cost += sess.run([ae['cost'], optimizer], 139 | feed_dict={ae['x']: batch_xs})[0] 140 | if batch_i % 2 == 0: 141 | # %% 142 | # Plot example reconstructions from latent layer 143 | imgs = [] 144 | for img_i in np.linspace(-3, 3, n_examples): 145 | for img_j in np.linspace(-3, 3, n_examples): 146 | z = np.array([[img_i, img_j]], dtype=np.float32) 147 | recon = sess.run(ae['y'], feed_dict={ae['z']: z}) 148 | imgs.append(np.reshape(recon, (1, 28, 28, 1))) 149 | imgs_cat = np.concatenate(imgs) 150 | ax_manifold.imshow(montage_batch(imgs_cat)) 151 | fig_manifold.savefig('manifold_%08d.png' % t_i) 152 | 153 | # %% 154 | # Plot example reconstructions 155 | recon = sess.run(ae['y'], feed_dict={ae['x']: test_xs}) 156 | print(recon.shape) 157 | for example_i in range(n_examples): 158 | axs_reconstruction[0][example_i].imshow( 159 | np.reshape(test_xs[example_i, :], (28, 28)), 160 | cmap='gray') 161 | axs_reconstruction[1][example_i].imshow( 162 | np.reshape( 163 | np.reshape(recon[example_i, ...], (784,)), 164 | (28, 28)), 165 | cmap='gray') 166 | axs_reconstruction[0][example_i].axis('off') 167 | axs_reconstruction[1][example_i].axis('off') 168 | fig_reconstruction.savefig('reconstruction_%08d.png' % t_i) 169 | 170 | # %% 171 | # Plot manifold of latent layer 172 | zs = sess.run(ae['z'], feed_dict={ae['x']: xs}) 173 | ax_image_manifold.clear() 174 | ax_image_manifold.scatter(zs[:, 0], zs[:, 1], 175 | c=np.argmax(ys, 1), alpha=0.2) 176 | ax_image_manifold.set_xlim([-6, 6]) 177 | ax_image_manifold.set_ylim([-6, 6]) 178 | ax_image_manifold.axis('off') 179 | fig_image_manifold.savefig('image_manifold_%08d.png' % t_i) 180 | 181 | t_i += 1 182 | 183 | 184 | print('Train cost:', train_cost / 185 | (mnist.train.num_examples // batch_size)) 186 | 187 | valid_cost = 0 188 | for batch_i in range(mnist.validation.num_examples // batch_size): 189 | batch_xs, _ = mnist.validation.next_batch(batch_size) 190 | valid_cost += sess.run([ae['cost']], 191 | feed_dict={ae['x']: batch_xs})[0] 192 | print('Validation cost:', valid_cost / 193 | (mnist.validation.num_examples // batch_size)) 194 | 195 | 196 | if __name__ == '__main__': 197 | test_mnist() 198 | -------------------------------------------------------------------------------- /python/libs/__init__.py: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /python/libs/activations.py: -------------------------------------------------------------------------------- 1 | """Activations for TensorFlow. 2 | Parag K. 
Mital, Jan 2016.""" 3 | import tensorflow as tf 4 | 5 | 6 | def lrelu(x, leak=0.2, name="lrelu"): 7 | """Leaky rectifier. 8 | 9 | Parameters 10 | ---------- 11 | x : Tensor 12 | The tensor to apply the nonlinearity to. 13 | leak : float, optional 14 | Leakage parameter. 15 | name : str, optional 16 | Variable scope to use. 17 | 18 | Returns 19 | ------- 20 | x : Tensor 21 | Output of the nonlinearity. 22 | """ 23 | with tf.variable_scope(name): 24 | f1 = 0.5 * (1 + leak) 25 | f2 = 0.5 * (1 - leak) 26 | return f1 * x + f2 * abs(x) 27 | -------------------------------------------------------------------------------- /python/libs/batch_norm.py: -------------------------------------------------------------------------------- 1 | """Batch Normalization for TensorFlow. 2 | Parag K. Mital, Jan 2016. 3 | """ 4 | 5 | import tensorflow as tf 6 | 7 | 8 | def batch_norm(x, phase_train, scope='bn', affine=True): 9 | """ 10 | Batch normalization on convolutional maps. 11 | 12 | from: https://stackoverflow.com/questions/33949786/how-could-i- 13 | use-batch-normalization-in-tensorflow 14 | 15 | Only modified to infer shape from input tensor x. 16 | 17 | Parameters 18 | ---------- 19 | x 20 | Tensor, 4D BHWD input maps 21 | phase_train 22 | boolean tf.Variable, true indicates training phase 23 | scope 24 | string, variable scope 25 | affine 26 | whether to affine-transform outputs 27 | 28 | Returns 29 | ------- 30 | normed 31 | batch-normalized maps 32 | """ 33 | with tf.variable_scope(scope): 34 | shape = x.get_shape().as_list() 35 | 36 | beta = tf.Variable(tf.constant(0.0, shape=[shape[-1]]), 37 | name='beta', trainable=True) 38 | gamma = tf.Variable(tf.constant(1.0, shape=[shape[-1]]), 39 | name='gamma', trainable=affine) 40 | 41 | batch_mean, batch_var = tf.nn.moments(x, [0, 1, 2], name='moments') 42 | ema = tf.train.ExponentialMovingAverage(decay=0.9) 43 | ema_apply_op = ema.apply([batch_mean, batch_var])  # must precede average() 44 | ema_mean, ema_var = ema.average(batch_mean), ema.average(batch_var) 45 | 46 | def mean_var_with_update(): 47 | """Update the moving averages and return the batch statistics. 48 | 49 | Returns 50 | ------- 51 | mean, var : Tensor 52 | The batch mean and variance for this step. 53 | """ 54 | with tf.control_dependencies([ema_apply_op]): 55 | return tf.identity(batch_mean), tf.identity(batch_var) 56 | mean, var = tf.cond(phase_train, 57 | mean_var_with_update, 58 | lambda: (ema_mean, ema_var)) 59 | 60 | normed = tf.nn.batch_norm_with_global_normalization( 61 | x, mean, var, beta, gamma, 1e-3, affine) 62 | return normed 63 | -------------------------------------------------------------------------------- /python/libs/connections.py: -------------------------------------------------------------------------------- 1 | """APL 2.0 code from github.com/pkmital/tensorflow_tutorials w/ permission 2 | from Parag K. Mital. 3 | """ 4 | import math 5 | import tensorflow as tf 6 | 7 | 8 | def batch_norm(x, phase_train, scope='bn', affine=True): 9 | """ 10 | Batch normalization on convolutional maps. 11 | from: https://stackoverflow.com/questions/33949786/how-could-i- 12 | use-batch-normalization-in-tensorflow 13 | Only modified to infer shape from input tensor x.
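Unlike libs/batch_norm.py, this version also accepts 2-D inputs by temporarily reshaping them to 4-D, so it can follow fully-connected layers.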
14 | Parameters 15 | ---------- 16 | x 17 | Tensor, 4D BHWD input maps 18 | phase_train 19 | boolean tf.Variable, true indicates training phase 20 | scope 21 | string, variable scope 22 | affine 23 | whether to affine-transform outputs 24 | Returns 25 | ------- 26 | normed 27 | batch-normalized maps 28 | """ 29 | with tf.variable_scope(scope): 30 | og_shape = x.get_shape().as_list() 31 | if len(og_shape) == 2: 32 | x = tf.reshape(x, [-1, 1, 1, og_shape[1]]) 33 | shape = x.get_shape().as_list() 34 | beta = tf.Variable(tf.constant(0.0, shape=[shape[-1]]), 35 | name='beta', trainable=True) 36 | gamma = tf.Variable(tf.constant(1.0, shape=[shape[-1]]), 37 | name='gamma', trainable=affine) 38 | 39 | batch_mean, batch_var = tf.nn.moments(x, [0, 1, 2], name='moments') 40 | ema = tf.train.ExponentialMovingAverage(decay=0.9) 41 | ema_apply_op = ema.apply([batch_mean, batch_var]) 42 | ema_mean, ema_var = ema.average(batch_mean), ema.average(batch_var) 43 | 44 | def mean_var_with_update(): 45 | """Update the moving averages and return the batch statistics. 46 | Returns 47 | ------- 48 | mean, var : Tensor 49 | The batch mean and variance for this step. 50 | """ 51 | with tf.control_dependencies([ema_apply_op]): 52 | return tf.identity(batch_mean), tf.identity(batch_var) 53 | mean, var = tf.cond(phase_train, 54 | mean_var_with_update, 55 | lambda: (ema_mean, ema_var)) 56 | 57 | normed = tf.nn.batch_norm_with_global_normalization( 58 | x, mean, var, beta, gamma, 1e-3, affine) 59 | if len(og_shape) == 2: 60 | normed = tf.reshape(normed, [-1, og_shape[-1]]) 61 | return normed 62 | 63 | 64 | def lrelu(x, leak=0.2, name="lrelu"): 65 | """Leaky rectifier. 66 | Parameters 67 | ---------- 68 | x : Tensor 69 | The tensor to apply the nonlinearity to. 70 | leak : float, optional 71 | Leakage parameter. 72 | name : str, optional 73 | Variable scope to use. 74 | Returns 75 | ------- 76 | x : Tensor 77 | Output of the nonlinearity. 78 | """ 79 | with tf.variable_scope(name): 80 | f1 = 0.5 * (1 + leak) 81 | f2 = 0.5 * (1 - leak) 82 | return f1 * x + f2 * abs(x) 83 | 84 | 85 | def linear(x, n_units, scope=None, stddev=0.02, 86 | activation=lambda x: x): 87 | """Fully-connected layer (note: no bias term is added). 88 | Parameters 89 | ---------- 90 | x : Tensor 91 | Input tensor to the network. 92 | n_units : int 93 | Number of units to connect to. 94 | scope : str, optional 95 | Variable scope to use. 96 | stddev : float, optional 97 | Initialization's standard deviation. 98 | activation : callable, optional 99 | Function which applies a nonlinearity 100 | Returns 101 | ------- 102 | x : Tensor 103 | Fully-connected output. 104 | """ 105 | shape = x.get_shape().as_list() 106 | 107 | with tf.variable_scope(scope or "Linear"): 108 | matrix = tf.get_variable("Matrix", [shape[1], n_units], tf.float32, 109 | tf.random_normal_initializer(stddev=stddev)) 110 | return activation(tf.matmul(x, matrix)) 111 | 112 | 113 | def conv2d(x, n_filters, 114 | k_h=5, k_w=5, 115 | stride_h=2, stride_w=2, 116 | stddev=0.02, 117 | activation=None, 118 | bias=True, 119 | padding='SAME', 120 | name="Conv2D"): 121 | """2D Convolution with options for kernel size, stride, and init deviation. 122 | 123 | Parameters 124 | ---------- 125 | x : Tensor 126 | Input tensor to convolve. 127 | n_filters : int 128 | Number of filters to apply. 129 | k_h : int, optional 130 | Kernel height. 131 | k_w : int, optional 132 | Kernel width. 133 | stride_h : int, optional 134 | Stride in rows. 135 | stride_w : int, optional 136 | Stride in cols. 137 | stddev : float, optional 138 | Initialization's standard deviation.
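bias : bool, optional
Whether to add a learned bias term (default: True).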
139 | activation : callable, optional 140 | Function which applies a nonlinearity 141 | padding : str, optional 142 | 'SAME' or 'VALID' 143 | name : str, optional 144 | Variable scope to use. 145 | 146 | Returns 147 | ------- 148 | x : Tensor 149 | Convolved input. 150 | """ 151 | with tf.variable_scope(name): 152 | w = tf.get_variable( 153 | 'w', [k_h, k_w, x.get_shape()[-1], n_filters], 154 | initializer=tf.truncated_normal_initializer(stddev=stddev)) 155 | conv = tf.nn.conv2d( 156 | x, w, strides=[1, stride_h, stride_w, 1], padding=padding) 157 | if bias: 158 | b = tf.get_variable( 159 | 'b', [n_filters], 160 | initializer=tf.truncated_normal_initializer(stddev=stddev)) 161 | conv = tf.nn.bias_add(conv, b) 162 | if activation: 163 | conv = activation(conv) 164 | return conv 165 | -------------------------------------------------------------------------------- /python/libs/dataset_utils.py: -------------------------------------------------------------------------------- 1 | import os 2 | import pickle 3 | import numpy as np 4 | 5 | 6 | def cifar10_download(dst='cifar10'): 7 | from six.moves import urllib 8 | import tarfile 9 | if not os.path.exists(dst): 10 | os.makedirs(dst) 11 | path = 'http://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz' 12 | filepath, _ = urllib.request.urlretrieve(path, os.path.join(dst, 'cifar-10-python.tar.gz')) 13 | tarfile.open(filepath, 'r:gz').extractall(dst) 14 | 15 | 16 | def cifar10_load(dst='cifar10'): 17 | if not os.path.exists(dst): 18 | cifar10_download(dst) 19 | Xs = None 20 | ys = None 21 | for f in range(1, 6): 22 | cf = pickle.load(open( 23 | '%s/cifar-10-batches-py/data_batch_%d' % (dst, f), 'rb'), 24 | encoding='latin1') 25 | if Xs is not None: 26 | Xs = np.r_[Xs, cf['data']] 27 | ys = np.r_[ys, np.array(cf['labels'])] 28 | else: 29 | Xs = cf['data'] 30 | ys = cf['labels'] 31 | return Xs, ys 32 | 33 | 34 | def dense_to_one_hot(labels, n_classes=2): 35 | """Convert class labels from scalars to one-hot vectors.""" 36 | labels = np.array(labels) 37 | n_labels = labels.shape[0] 38 | index_offset = np.arange(n_labels) * n_classes 39 | labels_one_hot = np.zeros((n_labels, n_classes), dtype=np.float32) 40 | labels_one_hot.flat[index_offset + labels.ravel()] = 1 41 | return labels_one_hot 42 | 43 | 44 | class DatasetSplit(object): 45 | def __init__(self, images, labels): 46 | self.images = np.array(images).astype(np.float32) 47 | self.labels = np.array(labels).astype(np.int32) 48 | self.n_labels = len(np.unique(labels)) 49 | self.num_examples = len(self.images) 50 | 51 | def next_batch(self, batch_size=100): 52 | # Shuffle each epoch 53 | current_permutation = np.random.permutation(range(len(self.images))) 54 | epoch_images = self.images[current_permutation, ...]
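# (the same permutation indexes images and labels, so
#  input/label pairs stay aligned across the shuffle)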
55 | epoch_labels = dense_to_one_hot( 56 | self.labels[current_permutation, ...], self.n_labels) 57 | 58 | # Then iterate over the epoch 59 | self.current_batch_idx = 0 60 | while self.current_batch_idx < len(self.images): 61 | end_idx = min( 62 | self.current_batch_idx + batch_size, len(self.images)) 63 | this_batch = { 64 | 'images': epoch_images[self.current_batch_idx:end_idx], 65 | 'labels': epoch_labels[self.current_batch_idx:end_idx] 66 | } 67 | self.current_batch_idx += batch_size 68 | yield this_batch['images'], this_batch['labels'] 69 | 70 | 71 | class Dataset(object): 72 | def __init__(self, Xs, ys, split=[0.8, 0.1, 0.1]): 73 | 74 | self.all_idxs = [] 75 | self.all_labels = [] 76 | self.all_inputs = [] 77 | self.train_idxs = [] 78 | self.valid_idxs = [] 79 | self.test_idxs = [] 80 | self.n_labels = 0 81 | self.split = split 82 | 83 | # Now mix all the labels that are currently stored as blocks 84 | self.all_inputs = Xs 85 | self.all_labels = ys 86 | n_idxs = len(self.all_inputs) 87 | idxs = range(n_idxs) 88 | rand_idxs = np.random.permutation(idxs) 89 | self.all_inputs = self.all_inputs[rand_idxs, ...] 90 | self.all_labels = self.all_labels[rand_idxs, ...] 91 | 92 | # Get splits (test indices must start after both train and valid) 93 | self.train_idxs = idxs[:round(split[0] * n_idxs)] 94 | self.valid_idxs = idxs[len(self.train_idxs): 95 | len(self.train_idxs) + round(split[1] * n_idxs)] 96 | self.test_idxs = idxs[len(self.train_idxs) + len(self.valid_idxs): 97 | len(self.train_idxs) + len(self.valid_idxs) + round(split[2] * n_idxs)] 98 | 99 | @property 100 | def train(self): 101 | inputs = self.all_inputs[self.train_idxs, ...] 102 | labels = self.all_labels[self.train_idxs, ...] 103 | return DatasetSplit(inputs, labels) 104 | 105 | @property 106 | def valid(self): 107 | inputs = self.all_inputs[self.valid_idxs, ...] 108 | labels = self.all_labels[self.valid_idxs, ...] 109 | return DatasetSplit(inputs, labels) 110 | 111 | @property 112 | def test(self): 113 | inputs = self.all_inputs[self.test_idxs, ...] 114 | labels = self.all_labels[self.test_idxs, ...] 115 | return DatasetSplit(inputs, labels) 116 | 117 | def mean(self): 118 | return np.mean(self.all_inputs, axis=0) 119 | 120 | def std(self): 121 | return np.std(self.all_inputs, axis=0) 122 | -------------------------------------------------------------------------------- /python/libs/datasets.py: -------------------------------------------------------------------------------- 1 | """Loading datasets. 2 | 3 | Parag K. Mital, Jan. 2016 4 | """ 5 | import tensorflow.examples.tutorials.mnist.input_data as input_data 6 | from .dataset_utils import * 7 | 8 | 9 | def MNIST(one_hot=True): 10 | """Returns the MNIST dataset. 11 | 12 | Returns 13 | ------- 14 | mnist : DataSet 15 | DataSet object w/ convenience props for accessing 16 | train/validation/test sets and batches. 17 | """ 18 | return input_data.read_data_sets('MNIST_data/', one_hot=one_hot) 19 | 20 | 21 | def CIFAR10(): 22 | # plt.imshow(np.transpose(np.reshape(cifar.train.images[10], (3, 32, 32)), [1, 2, 0])) 23 | Xs, ys = cifar10_load() 24 | return Dataset(Xs, ys) 25 | -------------------------------------------------------------------------------- /python/libs/utils.py: -------------------------------------------------------------------------------- 1 | """Some useful utilities when dealing with neural nets w/ tensorflow. 2 | 3 | Parag K. Mital, Jan. 2016 4 | """ 5 | import tensorflow as tf 6 | import numpy as np 7 | 8 | 9 | def montage_batch(images): 10 | """Draws all images in the batch as a 11 | montage image separated by 1 pixel borders.
12 | 13 | Parameters 14 | ---------- 15 | images : numpy.ndarray 16 | Input array to create montage of. 17 | 18 | Returns 19 | ------- 20 | m : numpy.ndarray 21 | Montage image. 22 | """ 23 | img_h = images.shape[1] 24 | img_w = images.shape[2] 25 | n_plots = int(np.ceil(np.sqrt(images.shape[0]))) 26 | m = np.ones( 27 | (images.shape[1] * n_plots + n_plots + 1, 28 | images.shape[2] * n_plots + n_plots + 1, 3)) * 0.5 29 | 30 | for i in range(n_plots): 31 | for j in range(n_plots): 32 | this_filter = i * n_plots + j 33 | if this_filter < images.shape[0]: 34 | this_img = images[this_filter, ...] 35 | m[1 + i + i * img_h:1 + i + (i + 1) * img_h, 36 | 1 + j + j * img_w:1 + j + (j + 1) * img_w, :] = this_img 37 | return m 38 | 39 | 40 | # %% 41 | def montage(W): 42 | """Draws all filters (n_input * n_output filters) as a 43 | montage image separated by 1 pixel borders. 44 | 45 | Parameters 46 | ---------- 47 | W : numpy.ndarray 48 | Input array to create montage of. 49 | 50 | Returns 51 | ------- 52 | m : numpy.ndarray 53 | Montage image. 54 | """ 55 | W = np.reshape(W, [W.shape[0], W.shape[1], 1, W.shape[2] * W.shape[3]]) 56 | n_plots = int(np.ceil(np.sqrt(W.shape[-1]))) 57 | m = np.ones( 58 | (W.shape[0] * n_plots + n_plots + 1, 59 | W.shape[1] * n_plots + n_plots + 1)) * 0.5 60 | for i in range(n_plots): 61 | for j in range(n_plots): 62 | this_filter = i * n_plots + j 63 | if this_filter < W.shape[-1]: 64 | m[1 + i + i * W.shape[0]:1 + i + (i + 1) * W.shape[0], 65 | 1 + j + j * W.shape[1]:1 + j + (j + 1) * W.shape[1]] = ( 66 | np.squeeze(W[:, :, :, this_filter])) 67 | return m 68 | 69 | 70 | 71 | 72 | # %% 73 | def corrupt(x): 74 | """Take an input tensor and add uniform masking. 75 | 76 | Parameters 77 | ---------- 78 | x : Tensor/Placeholder 79 | Input to corrupt. 80 | 81 | Returns 82 | ------- 83 | x_corrupted : Tensor 84 | 50 pct of values corrupted. 85 | """ 86 | return tf.multiply(x, tf.cast(tf.random_uniform(shape=tf.shape(x), 87 | minval=0, 88 | maxval=2, 89 | dtype=tf.int32), tf.float32)) 90 | 91 | 92 | # %% 93 | def weight_variable(shape): 94 | '''Helper function to create a weight variable initialized with 95 | a normal distribution 96 | 97 | Parameters 98 | ---------- 99 | shape : list 100 | Size of weight variable 101 | ''' 102 | initial = tf.random_normal(shape, mean=0.0, stddev=0.01) 103 | return tf.Variable(initial) 104 | 105 | 106 | # %% 107 | def bias_variable(shape): 108 | '''Helper function to create a bias variable initialized with 109 | small random values from a normal distribution. 110 | 111 | Parameters 112 | ---------- 113 | shape : list 114 | Size of bias variable 115 | ''' 116 | initial = tf.random_normal(shape, mean=0.0, stddev=0.01) 117 | return tf.Variable(initial) 118 | --------------------------------------------------------------------------------
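A minimal usage sketch for the helpers in utils.py above (illustrative only, not a file from the repository; the names `x`, `x_noisy`, `W`, `b`, and `logits` are made up for the example): build one fully-connected layer with `weight_variable`/`bias_variable` and corrupt its input with `corrupt`, the same pattern the denoising autoencoder tutorials use.

import tensorflow as tf
from libs.utils import corrupt, weight_variable, bias_variable

x = tf.placeholder(tf.float32, [None, 784])
x_noisy = corrupt(x)            # zeros out roughly half of the input values
W = weight_variable([784, 10])  # random-normal init w/ stddev 0.01
b = bias_variable([10])         # also random-normal in this repo
logits = tf.matmul(x_noisy, W) + b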