├── .gitignore
├── README.md
├── assignments
│   ├── week_1
│   │   ├── assignment_1.md
│   │   └── poverty.txt
│   ├── week_2
│   │   ├── assignment_2.md
│   │   └── mnist.png
│   ├── week_3
│   │   ├── assignment_3.md
│   │   └── mnist.png
│   ├── week_4
│   │   ├── .assignment_4.md.swp
│   │   └── assignment_4.md
│   ├── week_5
│   │   ├── .assignment_5.md.swp
│   │   └── assignment_5.md
│   └── week_6
│       ├── celebA.png
│       ├── fashion-mnist.png
│       └── final_assignment.md
├── pytorch-tutorial
│   ├── 0. Computation Graphs.ipynb
│   ├── 1. Object Classification with CNNs.ipynb
│   ├── 2. Question Answering with RNNs.ipynb
│   ├── 3. Saliency Maps, Feature Visualisation and Adversarial Examples.ipynb
│   ├── 4. Generative Modelling with VAEs, GANs and Autoregressive Models.ipynb
│   ├── 5. Discrete and Continuous Control with A2C and PPO.ipynb
│   ├── assets
│   │   ├── panda.png
│   │   └── starry-night.jpg
│   └── todo.txt
└── tensorflow-tutorial
    ├── week_1
    │   ├── Week_1.ipynb
    │   └── lung_function.txt
    ├── week_2
    │   └── Week_2.ipynb
    ├── week_3
    │   └── Week_3.ipynb
    └── week_9
        ├── Week_9 pt 1 - Bijectors.ipynb
        └── Week_9 pt 2 - IAF.ipynb
/.gitignore:
--------------------------------------------------------------------------------
1 | *.npy
2 | *.xml
3 | *.pyc
4 | *.DS_Store
5 | *.mat
6 | *.npy
7 |
8 | **/solution.py
9 | **/.ipynb_checkpoints/
10 | **/checkpoint
11 | **/model.data-*
12 | *.index
13 | *.meta
14 |
15 | __pycache__/
16 | .idea/
17 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # dl-imperial-maths
2 | Code and assignment repository for the Imperial College Mathematics department Deep Learning course
3 |
4 | ## Course description
5 |
6 | Deep Learning is a fast-evolving field in artificial intelligence that has been driving breakthrough advances in many application areas in recent years. It has become one of the most in-demand skill sets in machine learning and AI, with demand far exceeding the supply of people with expertise in the field. This course is aimed at PhD students within the Mathematics department at Imperial College who have no prior knowledge or experience of the field. It will cover the foundations of Deep Learning, including the various types of neural networks used for supervised and unsupervised learning. Practical tutorials in TensorFlow/PyTorch are an integral part of the course, and will enable students to build and train their own deep neural networks for a range of applications. The course also aims to describe the current state of the art in various areas of Deep Learning, its theoretical underpinnings and outstanding problems.
7 |
8 | Topics covered in this course will include:
9 |
10 | * Convolutional and recurrent neural networks
11 | * Reinforcement Learning
12 | * Generative Adversarial Networks (GANs)
13 | * Variational autoencoders (VAEs)
14 | * Theoretical foundations of Deep Learning
15 |
16 | Registrations can be made and further logistical details found on the course website [here](https://www.deeplearningmathematics.com).
17 |
18 | ## Course tutors
19 |
20 | ### Kevin Webster
21 |
22 | [Kevin](https://www.linkedin.com/in/kevin-webster-095aba59/) obtained his PhD in 2003 from the Department of Mathematics at Imperial College, in the area of dynamical systems. He has also held postdoctoral positions at Imperial College, and was awarded a Marie Curie Individual Fellowship, which he spent at the Potsdam Institute for Climate Impact Research in Germany. During these positions his research interests became more focused on machine learning, and specifically on adapting ML technologies for numerical analysis problems in dynamical systems. He was the Head of Research at the London music AI startup Jukedeck, where he oversaw the development of the deep learning framework for automatic music composition. In 2018 he set up his own machine learning consultancy, [FeedForward](http://www.feedforwardai.com/), with a focus on music & the creative industries. His particular interest in the field of deep learning is generative modelling. [@kn_webster](https://twitter.com/kn_webster) / [kevin.webster@imperial.ac.uk](mailto:kevin.webster@imperial.ac.uk)
23 |
24 | ### Pierre Richemond
25 |
26 | [Pierre](https://www.linkedin.com/in/pierre-h-richemond-2353683/) is currently researching his PhD in deep reinforcement learning at the Data Science Institute of Imperial College. He also helps run the [Deep Learning Network](http://www.dlnetwork.org/) and organize thematic reading groups there. Prior to that, he has worked in electronics as a research engineer and in quantitative finance as a trader. He has studied electrical engineering at ENST, probability theory and stochastic processes at Universite Paris VI - Ecole Polytechnique, and business management at HEC. His other research interests in the field of deep learning include neural network theory, as well as stochastic optimization methods. [@KloudStrife](https://twitter.com/KloudStrife) / [p.richemond17@imperial.ac.uk](mailto:p.richemond17@imperial.ac.uk)
27 |
28 | ## Guest tutors
29 |
30 | We are grateful to **Kai Arulkumaran** for providing PyTorch notebooks for the course and teaching two of the demonstration classes on PyTorch.
31 |
32 | [Kai](https://www.linkedin.com/in/kailasharul) is currently researching his PhD in deep learning at the Department of Bioengineering at Imperial College. During his PhD he has been a research intern at Microsoft Research, Twitter Magic Pony, Facebook AI Research and DeepMind. He also founded the [Deep Learning Network](http://dlnetwork.org) at Imperial College to organise guest lectures and a reading group on the topic of deep learning. He is an advocate for open-source software and a well-known contributor to the Torch/PyTorch ecosystems. Before his PhD he studied computer science at the University of Cambridge and worked as a web developer. [@KaiLashArul](https://twitter.com/KaiLashArul) / [kailash.arulkumaran13@imperial.ac.uk](mailto:kailash.arulkumaran13@imperial.ac.uk)
33 |
34 | ## Coursework
35 |
36 | This repository contains the notebooks for the TensorFlow/PyTorch tutorials as well as details for the coursework, for students that wish to take this course for credit.
37 |
38 | Students are encouraged to fork this repository and add their solutions to the assignments (as Python scripts) to their forked repository. The coursework will be assessed orally following completion of the course.
39 |
40 | ### Software requirements
41 |
42 | To complete the coursework and run the notebooks you will need to install Tensorflow and PyTorch (as well as other scientific packages, especially numpy). These can be installed using pip; alternatively, Tensorflow/PyTorch can be installed using Anaconda (preferred for PyTorch). Jupyter is conveniently installed with Anaconda, or it can be installed using pip. Relevant links are given below:
43 |
44 | * Installing anaconda: [https://www.anaconda.com/download/](https://www.anaconda.com/download/)
45 | * Installing Jupyter: [https://jupyter.readthedocs.io/en/latest/install.html](https://jupyter.readthedocs.io/en/latest/install.html)
46 | * Installing Tensorflow via Anaconda: [https://www.anaconda.com/blog/developer-blog/tensorflow-in-anaconda/](https://www.anaconda.com/blog/developer-blog/tensorflow-in-anaconda/)
47 | * Installing Tensorflow using pip: [https://www.tensorflow.org/install/](https://www.tensorflow.org/install/)
48 | * Installing PyTorch via Anaconda/pip: [https://pytorch.org/get-started/locally/](https://pytorch.org/get-started/locally/)
49 | * Installing torchtext via pip (Anaconda install unavailable): [https://github.com/pytorch/text](https://github.com/pytorch/text)
50 | * Installing OpenAI Gym using pip (Anaconda install unavailable): [https://gym.openai.com/docs/](https://gym.openai.com/docs/)
51 |
52 | The PyTorch notebooks require Python 3. Different Python versions can be managed via Anaconda.
53 |
--------------------------------------------------------------------------------
/assignments/week_1/assignment_1.md:
--------------------------------------------------------------------------------
1 | # Assignment 1
2 |
3 | Suggested due date: 17th October 2018
4 |
5 | ## Linear regression
6 |
7 | The aim of this assignment is to familiarise yourself with the basics of a Tensorflow pipeline. We will use linear regression as a simple algorithm to work through building and executing a Tensorflow graph, purely as an exercise.
8 |
9 | Given a number of independent variables $x_1, \dots, x_d$, construct the matrix $X \in \mathbb{R}^{n \times (d+1)}$, where the data on the independent variables are stored in the rows of $X$, and the last column is filled with ones (to account for the bias term). Also let $y \in \mathbb{R}^{n}$ be the vector of target values from the dataset.
10 |
11 | Then the linear regression problem can be expressed as
12 |
13 | $y \approx X w,$
14 |
15 | where $w \in \mathbb{R}^{d+1}$ contains the coefficients for each independent variable, followed by the bias term.
16 |
17 | Provided the columns of $X$ are linearly independent, the solution can be expressed in closed form as the normal equation:
18 |
19 | $w = (X^\top X)^{-1} X^\top y.$
20 |
21 | ## Implementation in Tensorflow
22 |
23 | The assignment is to implement the normal equation as a graph in Tensorflow. Your solution should be written in a python script.
24 |
25 | The matrix of independent variables and the vector of targets should be defined as placeholders, allowing for a variable number of data points and features. The graph should define the solution to the linear regression problem using the normal equation.
26 |
27 | In this folder you will find the file 'poverty.txt', which contains data on poverty levels and teen birth rates in the US. The dataset has 51 datapoints: one for each of the 50 states plus the District of Columbia.
28 |
29 | Use your Tensorflow implementation to regress Brth15to17 against PovPct from the attached file. Report the equation expressing the solution. Plot the data together with the fitted line and include the plot as an image file.
30 |
31 | Use the **same Tensorflow graph** to regress Brth15to17 against PovPct and ViolCrime and report the equation expressing the solution.
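As a rough illustration of the kind of graph this calls for, the sketch below builds the normal equation with placeholders of unspecified shape, so the same graph can be fed either one regressor (PovPct) or two (PovPct and ViolCrime). The column indices follow the header of `poverty.txt`; the session and feeding code is only an example, not a prescribed solution.

```python
# Minimal sketch of the normal-equation graph (TF 1.x API).
import numpy as np
import tensorflow as tf

X = tf.placeholder(tf.float32, shape=[None, None])  # n x (d+1) design matrix, last column of ones
y = tf.placeholder(tf.float32, shape=[None, 1])     # n x 1 vector of targets

XtX = tf.matmul(X, X, transpose_a=True)
Xty = tf.matmul(X, y, transpose_a=True)
w = tf.matmul(tf.matrix_inverse(XtX), Xty)          # normal equation: (X^T X)^{-1} X^T y

with tf.Session() as sess:
    # column 1 = PovPct, column 2 = Brth15to17 in poverty.txt
    data = np.loadtxt('poverty.txt', skiprows=1, usecols=(1, 2))
    X_np = np.column_stack([data[:, 0], np.ones(len(data))])
    y_np = data[:, 1:2]
    print(sess.run(w, feed_dict={X: X_np, y: y_np}))
```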
32 |
--------------------------------------------------------------------------------
/assignments/week_1/poverty.txt:
--------------------------------------------------------------------------------
1 | Location PovPct Brth15to17 Brth18to19 ViolCrime TeenBrth
2 | Alabama 20.1 31.5 88.7 11.2 54.5
3 | Alaska 7.1 18.9 73.7 9.1 39.5
4 | Arizona 16.1 35 102.5 10.4 61.2
5 | Arkansas 14.9 31.6 101.7 10.4 59.9
6 | California 16.7 22.6 69.1 11.2 41.1
7 | Colorado 8.8 26.2 79.1 5.8 47
8 | Connecticut 9.7 14.1 45.1 4.6 25.8
9 | Delaware 10.3 24.7 77.8 3.5 46.3
10 | District_of_Columbia 22 44.8 101.5 65 69.1
11 | Florida 16.2 23.2 78.4 7.3 44.5
12 | Georgia 12.1 31.4 92.8 9.5 55.7
13 | Hawaii 10.3 17.7 66.4 4.7 38.2
14 | Idaho 14.5 18.4 69.1 4.1 39.1
15 | Illinois 12.4 23.4 70.5 10.3 42.2
16 | Indiana 9.6 22.6 78.5 8 44.6
17 | Iowa 12.2 16.4 55.4 1.8 32.5
18 | Kansas 10.8 21.4 74.2 6.2 43
19 | Kentucky 14.7 26.5 84.8 7.2 51
20 | Louisiana 19.7 31.7 96.1 17 58.1
21 | Maine 11.2 11.9 45.2 2 25.4
22 | Maryland 10.1 20 59.6 11.8 35.4
23 | Massachusetts 11 12.5 39.6 3.6 23.3
24 | Michigan 12.2 18 60.8 8.5 34.8
25 | Minnesota 9.2 14.2 47.3 3.9 27.5
26 | Mississippi 23.5 37.6 103.3 12.9 64.7
27 | Missouri 9.4 22.2 76.6 8.8 44.1
28 | Montana 15.3 17.8 63.3 3 36.4
29 | Nebraska 9.6 18.3 64.2 2.9 37
30 | Nevada 11.1 28 96.7 10.7 53.9
31 | New_Hampshire 5.3 8.1 39 1.8 20
32 | New_Jersey 7.8 14.7 46.1 5.1 26.8
33 | New_Mexico 25.3 37.8 99.5 8.8 62.4
34 | New_York 16.5 15.7 50.1 8.5 29.5
35 | North_Carolina 12.6 28.6 89.3 9.4 52.2
36 | North_Dakota 12 11.7 48.7 0.9 27.2
37 | Ohio 11.5 20.1 69.4 5.4 39.5
38 | Oklahoma 17.1 30.1 97.6 12.2 58
39 | Oregon 11.2 18.2 64.8 4.1 36.8
40 | Pennsylvania 12.2 17.2 53.7 6.3 31.6
41 | Rhode_Island 10.6 19.6 59 3.3 35.6
42 | South_Carolina 19.9 29.2 87.2 7.9 53
43 | South_Dakota 14.5 17.3 67.8 1.8 38
44 | Tennessee 15.5 28.2 94.2 10.6 54.3
45 | Texas 17.4 38.2 104.3 9 64.4
46 | Utah 8.4 17.8 62.4 3.9 36.8
47 | Vermont 10.3 10.4 44.4 2.2 24.2
48 | Virginia 10.2 19 66 7.6 37.6
49 | Washington 12.5 16.8 57.6 5.1 33
50 | West_Virginia 16.7 21.5 80.7 4.9 45.5
51 | Wisconsin 8.5 15.9 57.1 4.3 32.3
52 | Wyoming 12.2 17.7 72.1 2.1 39.9
--------------------------------------------------------------------------------
/assignments/week_2/assignment_2.md:
--------------------------------------------------------------------------------
1 | # Assignment 2
2 |
3 | Suggested due date: 24th October 2018
4 |
5 | ## Multilayer perceptron / Feedforward network
6 |
7 | The aims for this assignment are:
8 | * Implement a simple MLP classifier in Tensorflow
9 | * Train a neural network using backpropagation
10 |
11 | We will build a multilayer perceptron as a classifier, and train it using backpropagation. The MLP consists of several densely connected layers.
12 |
13 | ## MNIST MLP classifier
14 |
15 |
16 |
17 |
18 |
19 | For this assignment you will need to download the MNIST dataset, which is available [here](http://yann.lecun.com/exdb/mnist/ "MNIST dataset"). This dataset consists of 28x28 grayscale images, with associated labels for which digit the image contains (0-9). The training set consists of 60,000 examples and the test set is 10,000 examples.
20 |
21 | The MLP is a densely connected network, with layers $h^{(0)}, h^{(1)}, \dots, h^{(L)}$, where $h^{(k)} \in \mathbb{R}^{n_k}$. The input is $h^{(0)} = x$ and the output is $h^{(L)} = \hat{y}$. For $k = 1, \dots, L$, the pre-activations are given by
22 |
23 | $a^{(k)} = W^{(k)} h^{(k-1)} + b^{(k)},$
24 |
25 | where $W^{(k)} \in \mathbb{R}^{n_k \times n_{k-1}}$ and $b^{(k)} \in \mathbb{R}^{n_k}$. The post-activations are given by
26 |
27 | $h^{(k)} = \sigma\big(a^{(k)}\big),$
28 |
29 | where $\sigma$ is an activation function that is applied element-wise.
30 |
31 | For our classifier, we will flatten each input image into a 784-length vector, which will serve as input to the first hidden layer. You may also want to rescale the inputs. The output should be a 10-way softmax layer to predict the digit label.
32 |
33 | ## Implementation in Tensorflow
34 |
35 | The assignment is to implement the MLP classifier for MNIST in Tensorflow, train it with one of the available optimisers and test the classification performance on the test set. Write your solution as a python script.
36 |
37 | You should choose the number of layers for the network, the size of those layers and the activation functions (try testing a few options for these hyperparameters).
38 |
39 | * Use the ```tf.layers.dense``` function for the hidden layers in the network
40 | * We recommend using the ```tf.nn.sparse_softmax_cross_entropy_with_logits_v2``` function to compute the loss (see the sketch after this list)
41 | * Read the TF docs carefully: the above loss function requires logits as inputs. Therefore, if using this, the network output should be a linear layer
42 | * Create a train op in Tensorflow and train the network according to the schedule/criteria of your choice
43 | * Record and document the learning curves (train & test loss vs training iterations or epochs), and report the final train and test loss
44 | * Calculate the number of parameters used in the network, and record the time required to train the network
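The sketch below shows one possible shape for such a script using the TF 1.x layers API. The layer sizes, activation and optimiser are illustrative choices rather than recommendations, the MNIST arrays `x_train`/`y_train` are assumed to be already loaded, and the loss is computed here with `tf.nn.sparse_softmax_cross_entropy_with_logits` for simplicity.

```python
# Minimal MLP classifier sketch (TF 1.x). Hyperparameters are illustrative.
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 784])          # flattened 28x28 images
y = tf.placeholder(tf.int64, [None])                 # integer digit labels

h = tf.layers.dense(x, 256, activation=tf.nn.relu)   # hidden layer 1
h = tf.layers.dense(h, 128, activation=tf.nn.relu)   # hidden layer 2
logits = tf.layers.dense(h, 10)                      # linear output layer (logits)

loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits))
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
accuracy = tf.reduce_mean(tf.cast(tf.equal(tf.argmax(logits, axis=1), y), tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # loop over mini-batches of (x_train, y_train) running train_op,
    # and periodically evaluate loss/accuracy on the test set
```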
45 |
--------------------------------------------------------------------------------
/assignments/week_2/mnist.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/pukkapies/dl-imperial-maths/07b4a64026e855932ff8c38814946fcc9e4fd9f7/assignments/week_2/mnist.png
--------------------------------------------------------------------------------
/assignments/week_3/assignment_3.md:
--------------------------------------------------------------------------------
1 | # Assignment 3
2 |
3 | Suggested due date: 24th October 2018
4 |
5 | ## Convolutional neural network
6 |
7 | The aims for this assignment are:
8 | * Implement a CNN classifier in Tensorflow
9 | * Experiment with batch normalisation, dropout and residual connections
10 |
11 | This assignment follows directly from last week’s assignment. We will build a convolutional neural network (CNN) classifier on the MNIST dataset.
12 |
13 | ## MNIST CNN classifier
14 |
15 |
16 |
17 |
18 |
19 | You will have already downloaded the MNIST dataset, and trained an MLP classifier for last week’s assignment. You should also have recorded the network’s performance on the training and test sets, have an estimate for the number of parameters used and recorded the training time. For this week we will train a CNN on the same task and compare it to the MLP on all these benchmarks.
20 |
21 | Recall the MNIST dataset consists of 28x28 grayscale images, with associated labels for which digit the image contains (0-9). The training set consists of 60,000 examples and the test set is 10,000 examples.
22 |
23 | For the MLP, we flattened the inputs so the images were represented as 784-length vectors, and fed them through several dense layers, resulting in a final softmax layer to predict the digit. Note that this architecture disregards the spatial structure of the inputs, and is inefficient in terms of parameters.
24 |
25 | We exploit the CNN architecture to introduce an _infinitely strong prior_ into the network construction, which asserts the importance of local feature extraction and equivariant representations.
26 |
27 | In this week’s lecture we covered several standard ConvNet architectures, which should serve as inspiration for your own network design. The output of your network should again be a 10-way softmax layer to predict the digit label.
28 |
29 | ## Implementation in Tensorflow
30 |
31 | The assignment is to implement the CNN classifier for MNIST in Tensorflow, train it and test the classification performance on the test set. You should choose the number and types of layers in the network (try testing a few options).
32 |
33 | * We recommend using the ```tf.layers.conv2d``` function for the convolutional layers in the network (but compare with the lower-level ```tf.nn.conv2d```); a short sketch follows this list
34 | * Similarly, consider using ```tf.layers.max_pooling2d``` and ```tf.layers.dropout``` in your network
35 | * As before, use the ```tf.nn.sparse_softmax_cross_entropy_with_logits_v2``` to compute the loss
36 | * Follow the design principles of the architectures covered in the lecture: build blocks of convolutional and pooling layers, with batch normalisation
37 | * Use either fully connected layers leading to a softmax output at the backend of the network, or implement a global pooling layer (as in GoogLeNet / ResNet)
38 | * Watch out for the dependencies in Tensorflow when using batch normalisation, and also the mode (training/inference)
39 | * As before, record and document the learning curves (train & test loss vs training iterations or epochs), and report the final train and test loss.
40 | * Calculate the number of parameters used in the network, and record the time required to train the network
41 | * Try to beat your own MLP implementation on the same task! Compare the above benchmarks to your MLP network
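The sketch below illustrates one convolution/pooling/batch-norm block and the update-op dependency mentioned above, in the TF 1.x layers API. Filter counts, kernel sizes and the dropout rate are illustrative, and `training` is the boolean switch between training and inference mode.

```python
# Minimal CNN classifier sketch (TF 1.x). Architecture choices are illustrative.
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 28, 28, 1])
y = tf.placeholder(tf.int64, [None])
training = tf.placeholder(tf.bool)                    # switches batch norm / dropout mode

h = tf.layers.conv2d(x, filters=32, kernel_size=3, padding='same', activation=tf.nn.relu)
h = tf.layers.batch_normalization(h, training=training)
h = tf.layers.max_pooling2d(h, pool_size=2, strides=2)
h = tf.layers.conv2d(h, filters=64, kernel_size=3, padding='same', activation=tf.nn.relu)
h = tf.layers.batch_normalization(h, training=training)
h = tf.layers.max_pooling2d(h, pool_size=2, strides=2)
h = tf.layers.dropout(tf.layers.flatten(h), rate=0.5, training=training)
logits = tf.layers.dense(h, 10)

loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits))
# Batch norm keeps its moving averages in UPDATE_OPS; the train op must depend on them.
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
```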
42 |
--------------------------------------------------------------------------------
/assignments/week_3/mnist.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/pukkapies/dl-imperial-maths/07b4a64026e855932ff8c38814946fcc9e4fd9f7/assignments/week_3/mnist.png
--------------------------------------------------------------------------------
/assignments/week_4/assignment_4.md:
--------------------------------------------------------------------------------
1 | # Assignment 4
2 |
3 | Suggested due date : 7th November 2018
4 |
5 | ## Q-learning on Frozen Lake with OpenAI Gym
6 |
7 | The aims for this assignment are:
8 | * Get familiar with OpenAI Gym and more particularly the FrozenLake environment.
9 | * Implement the tabular Q-learning algorithm
10 |
11 | ## OpenAI Gym
12 |
13 | In order to focus on the algorithms, we will use standard environments provided by the OpenAI Gym suite. OpenAI Gym provides controllable environments for research in reinforcement learning (see the Gym documentation).
14 | We will use a simple toy problem to illustrate the properties of reinforcement learning algorithms. In particular, we will try to solve the FrozenLake-v0 environment.
15 |
16 | To get used to the OpenAI Gym suite, we will first try to load an environment and apply random actions to it. Once you have instantiated your environment, the most important command is **env.step(action)**.
17 | It applies the selected action to the environment and returns an observation (next state), a reward, a flag that is set to True if the episode has terminated, and some debug info.
18 | Try to use a different policy (for instance, a constant action) to understand the role of that command.
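For instance, a first interaction loop with a random policy could look like the sketch below (written against the classic Gym API, where `env.step` returns a 4-tuple); swapping in a constant action is a one-line change.

```python
# Apply random actions to FrozenLake-v0 and inspect what env.step returns.
import gym

env = gym.make('FrozenLake-v0')
obs = env.reset()
done = False
while not done:
    action = env.action_space.sample()           # random policy; try a constant action too
    obs, reward, done, info = env.step(action)   # next state, reward, episode finished?, debug info
    env.render()
```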
19 |
20 | Notice that the FrozenLake-v0 environment is non-deterministic (you can think of it as a slippery, or stochastic, GridWorld) and you can’t compute the transition probabilities easily. This is why we will use reinforcement learning.
21 |
22 | OpenAI considers the task solved if your success rate is over 76%.
23 |
24 | ## Q-learning
25 |
26 | Note that we are not using function approximation, since this is a tabular problem.
27 | Q-learning is based on an online update of the action-value function
28 |
29 | $Q(s_t, a_t) \leftarrow Q(s_t, a_t) + \alpha \big[ r_{t+1} + \gamma \max_{a'} Q(s_{t+1}, a') - Q(s_t, a_t) \big],$
30 |
31 | where $\alpha$ is a learning rate and $\gamma$ a discount factor (we recommend values of 0.1 and 0.99 here, respectively).
32 | Most of the time, Q-learning is implemented with an epsilon-greedy exploration strategy; this means selecting a random action (exploring) with probability $\epsilon$, and selecting the best action with probability $1-\epsilon$. A minimal tabular sketch is given after the checklist below.
33 |
34 | To this end:
35 |
36 | * Define a Q-table of the correct size to host the Q-function estimate (initialized randomly).
37 | * Implement the Q-learning algorithm with epsilon-greedy - try several constant values, from 0.5 to 0.1.
38 | * Measure your success rate by counting the number of successful trials on a sliding window of 100 episodes.
39 | * Anneal your epsilon rate with time (i.e. schedule its decay to a very small value over time). **Observe how drastically this impacts performance**.
40 | * Since the environment is stochastic, the correct evaluation involves measuring the quality of the algorithm on average, over a handful of trials. Use that as your measure.
41 | * Try changing the update rule to the SARSA algorithm, which shares the same logic but uses the on-policy update rule
42 |
43 |   $Q(s_t, a_t) \leftarrow Q(s_t, a_t) + \alpha \big[ r_{t+1} + \gamma Q(s_{t+1}, a_{t+1}) - Q(s_t, a_t) \big].$
44 |
45 |   Compare the performance you obtain that way.
46 | * Bonus: you are encouraged to move all your logic to TensorFlow once you have completed the assignment. You can then instantiate a new environment, such as an Atari game, and pick a function approximator of your choice to see how convergence goes (see the Nature DQN paper for implementation details).
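Here is a minimal tabular sketch along these lines, using the suggested $\alpha = 0.1$ and $\gamma = 0.99$ with a constant $\epsilon$; the episode count is a placeholder, and the annealing of $\epsilon$ and the sliding-window evaluation are deliberately left out.

```python
# Tabular Q-learning on FrozenLake-v0 (epsilon and episode count are illustrative).
import gym
import numpy as np

env = gym.make('FrozenLake-v0')
Q = np.random.uniform(size=(env.observation_space.n, env.action_space.n))
alpha, gamma, epsilon = 0.1, 0.99, 0.1

for episode in range(10000):
    s = env.reset()
    done = False
    while not done:
        # epsilon-greedy action selection
        if np.random.rand() < epsilon:
            a = env.action_space.sample()
        else:
            a = np.argmax(Q[s])
        s_next, r, done, _ = env.step(a)
        # Q-learning update (the bootstrap term is dropped at terminal states)
        target = r + gamma * np.max(Q[s_next]) * (not done)
        Q[s, a] += alpha * (target - Q[s, a])
        s = s_next
```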
47 |
--------------------------------------------------------------------------------
/assignments/week_5/assignment_5.md:
--------------------------------------------------------------------------------
1 | # Assignment 5
2 |
3 | Suggested due date : 7th November 2018
4 |
5 | ## Policy gradients on Cartpole with OpenAI Gym
6 |
7 | The aims for this assignment are:
8 | * Get familiar with the Cartpole environment in OpenAI Gym.
9 | * Implement REINFORCE and actor-critic policy gradient algorithms in TensorFlow.
10 |
11 | ## The Cartpole environment
12 |
13 | Cartpole is a classic simulated control environment, first described by Sutton and Barto, with a continuous state space and a discrete action space.
14 | The task consists in keeping a pole upright by moving the cart to which it is attached by a joint. No friction is considered. The task is considered solved if the pole stays upright (within 15 degrees) for 195 steps on average over 100 episodes, while keeping the cart position within reasonable bounds.
15 | The state consists of 4 scalars - the position of the cart and the angle of the pole with the vertical, together with the time derivatives of these quantities - but the aim of our RL algorithms is to solve the task without that knowledge. There are only two possible actions: left and right. See the Gym documentation for more details.
16 |
17 | ## Policy gradient methods
18 |
19 | Policy gradient methods (see the slides from RL lecture 2) tend to scale better to large state spaces, which is of interest here, since CartPole has a continuous state space.
20 | The policy gradient theorem lets us do away with knowledge of the system dynamics and build a stochastic gradient estimate from one-step transitions alone. We will use it - and its special case, REINFORCE - to solve our Cartpole problem.
21 |
22 | To this end:
23 |
24 | * Try the env.step(action) method with a constant policy, in order to get familiar with the environment.
25 | * Implement a sigmoid policy that selects the right action with probability
26 |   $\pi_\theta(\text{right} \mid s) = \sigma(\theta^\top s)$, with $\theta$ the parameter vector and $s$ the state vector. Then evaluate its performance by averaging over 100 episodes.
27 | * Compute analytically the gradient of the log-policy w.r.t. the parameters (for each action) - on paper and in closed form.
28 | * Implement REINFORCE by running rollouts with the current policy (a numpy sketch is given after this list). Use a fixed, maximal horizon of 250 steps. Loop the computation of the policy gradient, the parameter update via gradient ascent, and the check of whether the new policy has an average return >= 195.
29 | * Bonus: reduce the gradient variance, either by using a Monte-Carlo estimate of the average return, or by using a parametric value function to estimate average returns as a baseline.
30 | * Bonus 2: implement a softmax policy instead of the sigmoid above; move all the code to TensorFlow to use full neural network approximation and the standard backpropagation tools.
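The sketch below implements REINFORCE in plain numpy with the sigmoid policy above, a 250-step horizon and Monte-Carlo returns; the learning rate and episode count are illustrative, no baseline is used, and the 100-episode evaluation loop is omitted.

```python
# REINFORCE with a sigmoid policy on CartPole-v0 (numpy only; hyperparameters illustrative).
import gym
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

env = gym.make('CartPole-v0')
theta = np.zeros(4)                    # one weight per state dimension
lr, gamma = 0.01, 0.99

for episode in range(2000):
    states, actions, rewards = [], [], []
    s = env.reset()
    for t in range(250):               # fixed maximal horizon
        p_right = sigmoid(theta @ s)
        a = 1 if np.random.rand() < p_right else 0
        s_next, r, done, _ = env.step(a)
        states.append(s); actions.append(a); rewards.append(r)
        s = s_next
        if done:
            break
    # Monte-Carlo returns G_t for every step of the rollout
    G, returns = 0.0, []
    for r in reversed(rewards):
        G = r + gamma * G
        returns.append(G)
    returns.reverse()
    # gradient ascent: grad log pi(a|s) = (a - sigmoid(theta.s)) * s for this policy
    for s_t, a_t, G_t in zip(states, actions, returns):
        theta += lr * G_t * (a_t - sigmoid(theta @ s_t)) * s_t
```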
31 |
--------------------------------------------------------------------------------
/assignments/week_6/celebA.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/pukkapies/dl-imperial-maths/07b4a64026e855932ff8c38814946fcc9e4fd9f7/assignments/week_6/celebA.png
--------------------------------------------------------------------------------
/assignments/week_6/fashion-mnist.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/pukkapies/dl-imperial-maths/07b4a64026e855932ff8c38814946fcc9e4fd9f7/assignments/week_6/fashion-mnist.png
--------------------------------------------------------------------------------
/assignments/week_6/final_assignment.md:
--------------------------------------------------------------------------------
1 | # Final assignment
2 |
3 | Due date : 7th December 2018
4 |
5 | ## Generative model for an image dataset
6 |
7 | This final assignment covers the remainder of the course. The aims for the assignment are:
8 | * Design, build, train and test a generative model of your choosing for either the CelebA dataset or the Fashion-MNIST dataset
9 | * Explore more of Tensorflow or PyTorch’s functions for data processing and model building
10 | * Write a report to summarise the research work carried out for this assignment
11 | * Provide example generations from your trained model
12 |
13 | This assignment is intentionally quite open-ended and has a lot of scope for different model choices; you are encouraged to dig deeper into the area that interests you the most.
14 |
15 | ## The dataset: either CelebA or Fashion-MNIST
16 |
17 | ### CelebA dataset
18 |
19 | As a default, we recommend to use the CelebA dataset for this project. The dataset itself can be downloaded from [here](http://mmlab.ie.cuhk.edu.hk/projects/ "CelebA dataset"). (Note the dataset can also be downloaded from [Google Drive](https://drive.google.com/drive/folders/0B7EVK8r0v71pWEZsZE9oNnFzTm8 "CelebA dataset") or [Baidu Drive](https://pan.baidu.com/s/1eSNpdRG#list/path=%2FCelebA "CelebA dataset")). Make sure to download the aligned and cropped version of the dataset. In this version, the images have been roughly aligned using similarity transformation according to the two eye locations.
20 |
21 |
22 |
23 |
24 |
25 | The dataset consists of over 200,000 images of celebrity faces, comprising 10,177 different identities. Each image is 178 x 218 pixels. In order to greatly simplify the learning task and reduce training time, you should downsample the dataset to something like 32 x 40 pixels. Additionally, feel free to convert the dataset to black and white.
26 |
27 | ### Fashion-MNIST dataset
28 |
29 | Modelling the CelebA dataset will require more computing resources, and although it is possible to access free GPU compute time with Google Colab (see below), we would like to offer the option of using the Fashion-MNIST dataset, which is a much simpler dataset for this project.
30 |
31 |
32 |
33 |
34 |
35 | This dataset consists of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image. It is intended to serve as a direct drop-in replacement for the original MNIST dataset for benchmarking machine learning algorithms. It shares the same image size and structure of training and testing splits. The dataset can be downloaded [here](https://github.com/zalandoresearch/fashion-mnist "Fashion-MNIST").
36 |
37 | ## Choice of generative model
38 |
39 | In this course, we have covered several types of generative deep learning models: autoregressive models, variational autoencoders, generative adversarial networks and normalising flows. Each of these classes of generative models is actively researched and has a large and interesting body of literature to learn from.
40 |
41 | You are free to choose the type of generative model that interests you the most for this assignment. Part of the task is to explore more of the literature and experiment with some of the ideas and improvements that have been published.
42 |
43 | ## Framework
44 |
45 | In this course, we have covered the fundamentals of both Tensorflow and PyTorch. For this project, you can choose either of these frameworks.
46 |
47 | If using Tensorflow, you may want to familiarise yourself with the Dataset API, and make use of the tfrecords format. This enables Tensorflow to work with large datasets that cannot fit in memory. We recommend looking at the [Tensorflow guide to importing data](https://www.tensorflow.org/guide/datasets).
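As a rough starting point, a tf.data pipeline over a folder of images might look like the sketch below (TF 1.x); the directory glob, target resolution and batch size are placeholders, and a tfrecords-based pipeline would replace `list_files`/`map` with `tf.data.TFRecordDataset` and a parsing function.

```python
# Minimal tf.data input pipeline for a directory of JPEG images (TF 1.x).
import tensorflow as tf

def parse_image(filename):
    image = tf.image.decode_jpeg(tf.read_file(filename), channels=3)
    image = tf.image.resize_images(image, [40, 32])     # e.g. downsampled CelebA size
    return tf.cast(image, tf.float32) / 255.0           # scale pixels to [0, 1]

files = tf.data.Dataset.list_files('img_align_celeba/*.jpg')   # placeholder path
dataset = (files.map(parse_image, num_parallel_calls=4)
                .shuffle(1000)
                .batch(64)
                .prefetch(1))
images = dataset.make_one_shot_iterator().get_next()           # batch of images, shape [batch, 40, 32, 3]
```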
48 |
49 | ## Google Colab
50 |
51 | You may want to use GPUs for training your models for this assignment. You can get limited access to GPU hardware through Google Colab. It is easy to use and provides 12 hours at a time of GPU access. To get started with Colab, take a look through the [introductory notebook](https://colab.research.google.com/notebooks/welcome.ipynb).
52 |
53 | ## Submission
54 |
55 | Your final project should be available to view in your own private repository, together with all other assignments from the course. You will be required to provide a link to your repository prior to the final oral examination.
56 |
57 | ### Code
58 |
59 | All code used for the project should be included in your repository and clearly presented.
60 |
61 | ### Report
62 |
63 | A required component of this assignment is to write a short summary report (around a couple of pages) of the process that you followed during the completion of this assignment. Make sure to include:
64 |
65 | * Details of your final model architecture, including all hyperparameters, train/validation/test splits, optimizer used etc.
66 | * Hyperparameter searches that you performed during the project, including your method of validation and corresponding results
67 | * Model performance, metrics used and training curves for trained models
68 | * Lessons learned and any other points of interest from your project
69 |
70 | The report could be written in markdown format in your repository, or included as a pdf if you prefer.
71 |
72 | ### Example generations
73 |
74 | Finally, include a selection of example generations from your model for evaluation. You should of course aim for high quality models and samples, but this is not the main aim of the project. The oral examination will focus on your understanding of the course material and the process you followed for this assignment.
75 |
--------------------------------------------------------------------------------
/pytorch-tutorial/2. Question Answering with RNNs.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Question Answering with RNNs\n",
8 | "\n",
9 | "There are many structured prediction tasks in machine learning, and many of them involve sequences - particularly sequences of text - in some form. Some examples include sentiment analysis (text to single class), image captioning (single image to text) and machine translation (text to text). Recurrent neural networks (RNNs) are a good fit for such problems, particularly when the sequences involved have an explicit or implicit ordering to the items; conversely, one might find other architectures more suitable if the input is a set. RNNs not only take in and output a single input and output at a time, but also have a hidden state vector which can be used to integrate information over time. They can have more complex units, such as long short-term memory (LSTM) units, that alleviate problems such as vanishing gradients, and be combined with other modules to enable bidirectional reading or attention or memory mechanisms.\n",
10 | "\n",
11 | "We'll focus on text-based question answering, using RNNs to read a story, a query, and predict the answer. This can, broadly speaking, encapsulate several natural language processing (NLP) tasks."
12 | ]
13 | },
14 | {
15 | "cell_type": "markdown",
16 | "metadata": {},
17 | "source": [
18 | "## Data\n",
19 | "\n",
20 | "We'll use the first task from the 20 tasks of the bAbI dataset. This procedurally generated dataset was designed to test text understanding and reasoning, and includes tasks such as answering yes/no questions, counting items, performing coreference resolution, and even basic deduction. In each task there is a story to read, a question, and the answer, with 1000 examples for training and 1000 for testing per task. The first task is based on answering a question with a single supporting fact - we'll set up datasets to iterate over and show an example below. By the standards of the dataset, we'll be looking at a *weakly supervised* setting, as opposed to the *strongly supervised* setting where the indices of the supporting facts (out of all of the facts) are also provided during training.\n",
21 | "\n",
22 | "We'll use `torchtext`, which provides the dataset in a more convenient form. `torchtext` provides several NLP functions, such as tokenisation, as well as helpers for dealing with training on text data. There tends to be a lot of data processing for dealing with text, so it's worth consulting the documentation and other material for how to make use of the package. Here we'll get training and test datasets, along with metadata about the text (such as the vocabulary size)."
23 | ]
24 | },
25 | {
26 | "cell_type": "code",
27 | "execution_count": 1,
28 | "metadata": {},
29 | "outputs": [],
30 | "source": [
31 | "from collections import namedtuple\n",
32 | "import os\n",
33 | "from matplotlib import pyplot as plt\n",
34 | "from matplotlib import ticker\n",
35 | "import torch\n",
36 | "from torch import nn, optim\n",
37 | "from torch.nn import functional as F\n",
38 | "from torchtext import datasets\n",
39 | "from IPython.display import clear_output, display\n",
40 | "%matplotlib inline"
41 | ]
42 | },
43 | {
44 | "cell_type": "code",
45 | "execution_count": 12,
46 | "metadata": {},
47 | "outputs": [
48 | {
49 | "name": "stdout",
50 | "output_type": "stream",
51 | "text": [
52 | "Story:\n",
53 | "Mary moved to the bathroom\n",
54 | "John went to the hallway\n",
55 | "Daniel went back to the hallway\n",
56 | "Sandra moved to the garden\n",
57 | "John moved to the office\n",
58 | "Sandra journeyed to the bathroom\n",
59 | "-------------------------------------------------\n",
60 | "Query: Where is Daniel?\n",
61 | "Answer: hallway\n"
62 | ]
63 | }
64 | ],
65 | "source": [
66 | "def print_example(example):\n",
67 | " story, query, answer = '\\n'.join(' '.join(s) for s in example.story), ' '.join(example.query), ' '.join(example.answer)\n",
68 | " print('Story:\\n%s\\n-------------------------------------------------\\nQuery: %s?\\nAnswer: %s' % (story, query, answer))\n",
69 | "\n",
70 | "data_path = os.path.join(os.path.expanduser('~'), '.torch', 'datasets', 'babi')\n",
71 | "train_data, _, test_data = datasets.BABI20.iters(task=1, batch_size=32, root=data_path)\n",
72 | "STORY, QUERY, ANSWER = [train_data.dataset.fields[f] for f in ['story', 'query', 'answer']]\n",
73 | "print_example(train_data.dataset[2])"
74 | ]
75 | },
76 | {
77 | "cell_type": "markdown",
78 | "metadata": {},
79 | "source": [
80 | "## Model\n",
81 | "\n",
82 | "We'll use a combination of models to deal with the different inputs and produce an output. As the words are symbols, these are first passed through an embedding layer to map each symbol into a (learnable) real-valued vector. For the story, we'll use a bidirectional LSTM, as it will be able to better preserve information across larger sequences (that are provided at once - only unidirectional RNNs can be used for online sequences). For the question, we'll use a unidirectional LSTM, and use the final output to \"attend\" to the sentence states of the story RNN (through a multiplication operation). Finally, this will be passed to a fully-connected network to predict the output (the answers are single words here, so there is no need for an RNN)."
83 | ]
84 | },
85 | {
86 | "cell_type": "code",
87 | "execution_count": 8,
88 | "metadata": {},
89 | "outputs": [],
90 | "source": [
91 | "class Encoder(nn.Module):\n",
92 | " def __init__(self, vocab_size, hidden_size, zeros_idx, bidirectional=False):\n",
93 | " super().__init__()\n",
94 | " self.embedding = nn.Embedding(vocab_size, hidden_size, padding_idx=zeros_idx)\n",
95 | " self.rnn = nn.LSTM(hidden_size, hidden_size, bidirectional=bidirectional)\n",
96 | "\n",
97 | " def forward(self, x, h=None):\n",
98 | " x = self.embedding(x)\n",
99 | " if x.dim() == 4: # Sum embeddings over a sentence in the story encoder\n",
100 | " x = x.sum(2)\n",
101 | " x, _ = self.rnn(x, h)\n",
102 | " return x\n",
103 | "\n",
104 | "class QANetwork(nn.Module):\n",
105 | " def __init__(self, hidden_size):\n",
106 | " super().__init__()\n",
107 | " self.s_encoder = Encoder(len(STORY.vocab), hidden_size // 2, zeros_idx=STORY.vocab.stoi['pad'],\n",
108 | " bidirectional=True)\n",
109 | " self.q_encoder = Encoder(len(QUERY.vocab), hidden_size, zeros_idx=STORY.vocab.stoi['pad'])\n",
110 | " self.a_generator = nn.Sequential(nn.Linear(hidden_size, hidden_size),\n",
111 | " nn.Dropout(0.8),\n",
112 | " nn.ReLU(),\n",
113 | " nn.Linear(hidden_size, len(ANSWER.vocab)))\n",
114 | "\n",
115 | " def forward(self, x, h=None):\n",
116 | " s = self.s_encoder(x.story) # All hidden states\n",
117 | " q = self.q_encoder(x.query)[:, -1] # Final hidden state\n",
118 | " attention = F.softmax(torch.einsum('bsh,bh->bs', [s, q]), dim=1).unsqueeze(2) # Multiplicative attention mask\n",
119 | " a = torch.sum(attention * s, 1)\n",
120 | " a = self.a_generator(a)\n",
121 | " return a, attention"
122 | ]
123 | },
124 | {
125 | "cell_type": "markdown",
126 | "metadata": {},
127 | "source": [
128 | "## Training and Testing\n",
129 | "\n",
130 | "We'll train the network for a few epochs and plot the training and test losses. We can also visualise the attention of the network over the story, showing which parts it thinks are relevant for answering the question at hand."
131 | ]
132 | },
133 | {
134 | "cell_type": "code",
135 | "execution_count": 13,
136 | "metadata": {},
137 | "outputs": [
138 | {
139 | "data": {
140 | "text/plain": [
141 | "'Final test accuracy: 85.60%'"
142 | ]
143 | },
144 | "metadata": {},
145 | "output_type": "display_data"
146 | },
147 | {
148 | "data": {
149 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAA0QAAAHjCAYAAAADn99RAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4xLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvDW2N/gAAIABJREFUeJzs3XecFdX5x/HvYVmaImVBQEBBjS0YQTc2NCZKbNgbGrGCilETMVFjjSViNBYULLHXnyUqEYnGiEajougCqyCIoiJVmlKl7LLn98fsMDP3zu137r27+3m/Xrxm5syZM88SDTyeOc8x1loBAAAAQFPUrNgBAAAAAECxkBABAAAAaLJIiAAAAAA0WSREAAAAAJosEiIAAAAATRYJEQAAAIAmi4QIAAAAQJNFQgQAAACgySIhAgAAANBkNS92AJnq1KmT7dWrV7HDAAAAAFCiJk2atNRa2zmdvg0uIerVq5eqqqqKHQYAAACAEmWM+TbdvnwyBwAAAKDJIiECAAAA0GSREAEAAABoshrcGiIAAAAA4WpqajRv3jytW7eu2KEURKtWrdSjRw+Vl5dnPQYJEQAAANBIzJs3T23btlWvXr1kjCl2OJGy1mrZsmWaN2+eevfunfU4fDIHAAAANBLr1q1TRUVFo0+GJMkYo4qKipxnw0iIAAAAgEakKSRDrnz8rCREAAAAAJosEiIAAAAAebFs2TL17dtXffv2VdeuXdW9e/dN1xs2bEhrjLPOOkszZ86MOFIPRRUAAAAA5EVFRYWqq6slSdddd50233xz/fGPfwz0sdbKWqtmzcLnZh599NHI4/QjIQIAAAAaoYsvlupzk7zp21caOTLz52bNmqVjjjlG++23nyZOnKhx48bp+uuv1+TJk7V27VoNGjRI1157rSRpv/320+jRo9WnTx916tRJw4YN02uvvaY2bdro5Zdf1pZbbpnXn4lP5gAAAABEbvr06RoyZIimTJmi7t27669//auqqqr0ySef6I033tD06dPjnlmxYoUOOOAAffLJJ9pnn330yCOP5D0uZogAAACARiibmZwobbfddvr5z3++6fqZZ57Rww8/rNraWi1YsEDTp0/XLrvsEnimdevWOuywwyRJe+yxh9599928x0VCBAAAACBym2222abzL7/8UnfddZc++ugjtW/fXoMHDw7dT6hFixabzsvKylRbW5v3uPhkDgAAAEBBrVy5Um3bttUWW2yhhQsX6vXXXy9aLMwQAQAAACio3XffXbvssov69OmjbbfdVv379y9aLMZaW7SXZ6OystJWVVUVOwxJUocO0tZbS598UuxIAAAAAGnGjBnaeeedix1GQYX9zMaYSdbaynSeZ4YoS+XlUm2ttHx5sSMBAAAAkC3WEGWppsY7f++94sUBAAAAIHuRJUTGmFbGmI+MMZ8YYz4zxlwf0qelMeY5Y8wsY8xEY0yvqOKJ0v77FzsCAAAAANmIcoZovaQDrbW7Seor6VBjzN4xfYZI+sFau72kOyXdEmE8eTd0aPB6u+0kY5xfAAAAAEpfZAmRdayuvyyv/xVbweFoSY/Xn78g6SBjGk468eCD3rkx0tdfB6/DtGzpJU0N5ycFAAAAGqdI1xAZY8qMMdWSFkt6w1o7MaZLd0lzJclaWytphaSKkHHONcZUGWOqlixZEmXIGWuW5HfQGOmZZ5zzv/zFud6wIb4PAAAAgOKINCGy1m601vaV1EPSnsaYPjFdwtKBuDrg1toHrLWV1trKzp07RxFq1jZuDF5bK91zj3f9m984n9Jdc02wX3NffT9/UjRnDrNHAAAAaJiWLVumvn37qm/fvuratau6d+++6XpD7MxAEo888oi+++67CCP1FKTKnLV2uaS3JR0ac2uepJ6SZIxpLqmdpO8LEVM+bbutc3S3dPrtb71zKfgpnbXOr5oaqdJXGb1lS+e4zTbRxgoAAABEpaKiQtXV1aqurtawYcM0fPjwTdctWrRIe5xCJkSR7UNkjOksqcZau9wY01rSAMUXTRgr6QxJH0g6QdJbtqHtFCvpq6/C260NzvTE/mQff+zd37BB+vWvg/dbt5bWrs1fnAAAAGhCLr5Yqq7O75h9+0ojR2b16OOPP6577rlHGzZs0L777qvRo0errq5OZ511lqqrq2Wt1bnnnqsuXbqourpagwYNUuvWrfXRRx9llExlKsoZom6S/muM+VTSx3LWEI0zxtxgjDmqvs/DkiqMMbMkXSLpTxHGUxTWSt9+G58M+e+7xo8P3lu3zjlefnmwEMOECdHECgAAAERh2rRpGjNmjCZMmKDq6mrV1tbq2Wef1aRJk7R06VJNnTpV06ZN0+mnn65Bgwapb9++eu655zKeWcpGZDNE1tpPJfULab/Wd75O0olRxVAqtt46+f2jjpLGjvWuv/02+OncrbcG+/fvL91/v3TeefmLEQAAAI1MljM5URg/frw+/vhjVdavGVm7dq169uypQw45RDNnztTvf/97HX744Tr44IMLHltB1hAhuZdf9s5btgwmUP5P7g45xDsfNiz6uAAAAIB8sNbq7LPP3rSeaObMmbrmmmtUUVGhTz/9VPvtt5/uvvtunVeE/+JPQlQi3GIL7mdyYf79b+nZZwsXEwAAAJAPAwYM0PPPP6+lS5dKcqrRzZkzR0uWLJG1VieeeKKuv/56TZ48WZLUtm1brVq1qiCxRfbJHHIT+9mcu9Zo0CDp5JOdc2Oc9m7dJH8RDmOkurrCxQoAAAAks+uuu+rPf/6zBgwYoLq6OpWXl+v+++9XWVmZhgwZImutjDG65RanBttZZ52loUOHFqSogmloRd0qKyttVVVVscMoiEQV6nbfXZoyJfmzl10m3RJb0w8AAACN2owZM7TzzjsXO4yCCvuZjTGTrLWVCR4J4JO5EuauKaupCbbXzyTGGTzYO48txAAAAAAgHglRCXv9dWdmqHnIh41lZd55v35OvyefDM4k+WeYAAAAAMQjIWqgamulESOcBCh2xujYY73zuXMLGxcAAACKq6EticlFPn5WEqIG7Iorwttfesk7T7UHEgAAABqPVq1aadmyZU0iKbLWatmyZWrVqlVO41BlrpHq2jVYeQ4AAACNX48ePTRv3jwtWbKk2KEURKtWrdSjR4+cxiAhaqQWLvTWED33nFOuGwAAAI1beXm5evfuXewwGhQ+mWsC3H2LAAAAAASREDVihxxS7AgAAACA0kZC1Ij9+9/e+bBhxYsDAAAAKFUkRE3E3/9e7AgAAACA0kNC1Mg9+2yxIwAAAABKFwlRI+evLteM/7UBAACAAP6K3AQ0ry+u7u7PZYz3CwAAAGjKSIiagJoa7zw2CSIpAgAAQFNGQoTAjNELLxQ7GgAAAKBwSIiaiPvvD15bK+21V3y/E08sTDwAAABAKSAhaiLOO885tmjhrSX68ENp40aprk7ad9/ixQYAAAAUCwlRE2KttH59sK1ZM+dTufff99rmzStsXAAAAECxkBAhTs+e2T334YesQwIAAED
DQkKETTp2zO35ffZxjqxDAgAAQENBQoRNli0rdgQAAABAYZEQIRT7EwEAAKApICFCQPPmxY4AAAAAKBwSIgTU1BQ7AgAAAKBwSIiQEOW3AQAA0NiRECGhbMtvAwAAAA0FCREAAACAJouECHE++KDYEQAAAACFQUKEOHvvXewIAAAAgMIgIUJSvXun12/gwGjjAAAAAKJAQoSkZs9Or98bb0QaBgAAABAJEiKEOvDAzPqzfxEAAAAaIhIihHrzzWJHAAAAAESPhAgpffhh5s+wqSsAAAAaAhIipNS/f+bPDBqU/zgAAACAfCMhQkp1dZk/M2VK/uMAAAAA8o2ECAl16pT9s2vX5i8OAAAAICokREhoyZJiRwAAAABEi4QIAAAAQJNFQoSMGOP8AgAAABoDEiKkpaIieP3QQ8WJAwAAAMgnEiKk5fvvpYMO8q7POcc79+851KpV4WICAAAAckVChKTKyrzzt94K73Pssd55nz7RxgMAAADkEwkRknrvvdR9Jk/2zkeNii4WAAAAIN9IiJDU3nun7uPfuDW2/447OkUYHn00v3EBAAAA+UBChKx9+GHqPl984RzPPjvaWAAAAIBskBAha/vsU+wIAAAAgNyQECEjc+cmvrfTToWLAwAAAMgHEiJkpEcP6Yorwu+98Ubm47HRKwAAAIqJhAgpPfhg8HrEiPB+PXpkNu6YMdnFAwAAAOQLCRFSGjpUstb5FauiIvtxhw3L/lkAAAAgH0iIkJPvv0987+ijkz+7eHF+YwEAAAAyRUKEvAibPRo7Nv3n58/PXywAAABAukiIkBW32lzr1uHJUJhkSU/fvrnHBAAAAGSKhAhZ6dHDSYR+/DH9Z444IvG9pUtzjwkAAADIFAkRCqa6utgRAAAAAEEkRAAAAACarMgSImNMT2PMf40xM4wxnxljfh/S55fGmBXGmOr6X9dGFQ8AAAAAxGoe4di1kv5grZ1sjGkraZIx5g1r7fSYfu9aa5OsLkFDt+WWlNgGAABAaYpshshau9BaO7n+fJWkGZK6R/U+lK5XXil2BAAAAEC4gqwhMsb0ktRP0sSQ2/sYYz4xxrxmjPlpgufPNcZUGWOqlixZEmGkyIdmMf9U7blnceIAAAAAUok8ITLGbC7pRUkXW2tXxtyeLGkba+1ukkZJ+mfYGNbaB6y1ldbays6dO0cbMHLWsWN2z40Zk984AAAAgFQiTYiMMeVykqGnrbUvxd631q601q6uP39VUrkxplOUMSF6J52Uus/tt8e3nXJK/mMBAAAAkomyypyR9LCkGdbaOxL06VrfT8aYPevjWRZVTIjY5MmSMbrn8H+l7HrjjfFt69dHEBMAAACQRJRV5vpLOk3SVGOMuyXnlZK2liRr7f2STpB0vjGmVtJaSSdba22EMSFKe+zhHI84QpLzP+NRRwW7fPSRs6ZoxYrChgYAAACEMQ0t/6isrLRVVVXFDgOJOBN+Gqb79HcNk/uPV32zOnWSlizxriXppz+VPvvMOW9g/zgCAACgBBljJllrK9PpW5Aqc2hC7r5bkjRKF+nXej3u9tKl8Y9MmxZ1UAAAAEA4EiLk10UXSTvvrCXqpBd1gtSiRdLurVoVKC4AAAAgBAkR8m/6dHUyi7S5Vks1NdLeeyfseuqpBYwLAAAAiEFChEi0qKvTpmVCEydqwNafhPZ76KGChQQAAADEISFCdHwVEl6f0zetR/zFFgAAAICokRAhWvVJUTNJF2pUwm4dOxYoHgAAAMCHhAjRe+opSdKdGq5f6U2NGRPfZZlvO15miQAAAFAoJESI3qmnSn36aLU21/MapPHH3RXajVkiAAAAFBoJEQpj6lT9R7+UkdW1GqEttDyuC7NEAAAAKDQSIhTMSfafOl2Pq0LL9JxOVgutjevDLBEAAAAKiYQIBfWqjtAw3a9D9bre0MFx9/2zRAAAAEDUmhc7ADQ9D2uodtF0XaI7nW/jfOW5AQAAgEJihggFdfXVzvFS/U0L1cW5SLBg6NFHCxQUAAAAmiwSIhTUjTc6xzqV6RT9n3ejWfw/imefXaCgAAAA0GSREKFo5m13oNS8/qtNa6Vddy1uQGkwhgp4AAAAjQkJEQpu3jypXTtp1ixJNTXejWnTpKef1v77Fy20tI0aVewIAAAAkA/GNrAF7ZWVlbaqqqrYYSDf/NMu1m66LLV/PGPCBAAAQAkyxkyy1lam05cZIpQGf3bBN2kAAAAoEBIilI5x4zadDtUDkqSPPgp2KeYanoULg9dduhQnDgAAAOQPCRFKx8CB0i9+IUkarYu0g2Zq33292/Pne+exiVIhHHRQ8Hrx4sLHAAAAgPwiIUJpeecdqbxcRlav6Ej13Thx060ePbxuxSi88Pnn8W2xs0YAAABoWEiIUHo2bNAM7aht9K2e1amhJd02bCh8WGFFFLbaqvBxAAAAIH9IiFCSdrNTdYFGa3t9Jf3ud0mnYsrKnF8AAABApkiIULIe1jm6V8MkSRu26pGwX12d86tQWrUq3LsAAAAQLRIilLQLNVrL1EHlqtMh+nfc/WHDvPORIwsT08knOxvLpsOtirf77tHGBAAAgOywMStKVkWF9P33Ulst1wztos21Rm/qlzpeL0ty1vTEluCO8h9n/2axgwZJzz+f+p1s5AoAAFB4bMyKRmHZMue4Su21jz5QrZrrOI3VtvpSknTKKcWLLZvZKPabBQAAKD0kRChp7qzKXG2j32q0JOl/OkDl2qBnn038nPupmvsr35OK3bpl99zRR+c3DgAAAOSGhAglr2dP5/i8nCmh7lqo0bpQUm1o/112iW/7+c8jCi5DY8c6CRr7FwEAAJQGEiKUvDlznJkiayVZq9VqrXP1oEbp93F9FyyQZszwrj/+2DsPS5QKJXb90FZbeZ8EAgAAoHhIiNDgbD7pPY3TQJ2v+3WYXg3c23pr7/zll6VK31I6f6KUqXzM6FgrXXaZd03lOQAAgOKjyhwapC3MCr2tX2l7zdIsbaM9NDWuj/8f7VyrvR18sPTGG8Hn/VXnEgnrQ+U5AACAaFFlDo3eKrXTURqr9Wqp3TVN48sHJO2/ww65ve/tt3N73q9Fi/yNBQAAgNyQEKHBmq8eGqKHJEkH1bypllq76d7OOwf7zpyZfKzy8viy2I8+Kl15pXNeU5NrtJ6JE/M3FgAAAHJDQoQG7RV5daz/qWMlOd+gTZ+e+JkFC+LbausL1g0c6LWdfbZ0882pP2t7+uk0g63Xt29m/QEAABAdEiI0SL/5jXPs0UObMpZD9br+oNtTPrvddonvvfpqfFvnzt558+bx96+6KuUrAQAAUKJIiNAgPf20kwfNnVvfMGmSlqqjbtVl+kB7Jn123brU4/s3XvWXx95jj/i+8+eHj3H55anfc+qpqfsAAAAgOiREaBx2312dThqgZrLaWx/HLwiS1LVr+KP9+8e3ffddeN8xY+LbasP3h03rU7r/+7/UfQAAABAdEiI0Hs89J222mXfdpUvgdqK9hCZMCF6HrTFy+WeOUlm8OP2+UfvHP5wcMSRPBAAAaNJIiNC4rF7tnS9enN
53azG23TY/oSSrTPf73+fnHek666zCvg8AAKChICFC4+MvC3frraFTQ8lmgdavT+81YQUW0jVyZPbPZmPNmsK+DwAAoKEgIULj5E+Kttoq7nZYpbk//zm+LdG6I0nq3j2LuBLo0YPP2QAAAIqBhAiNl38aKCbbCKs0d911weuWLROvO5Kkm27KPjS/XXf1KtW1bZv586wNAgAAyB4JERqvbt2kM8/0ro3RllsGuyT7dC5Vee50S2a3aJH8/rRp3rl/CRQAAACiR0KExu3RR6V27TZdLqrpGLidbJPWfMn0HZnM9vjLg++4Y3rPfPppZvEAAAA0ZiREaPyWL/fOf/hBreVVGIidBWrVKv7x8vLcXn/VVeHtu+3mnS9cKO2wQ+Zjn322d/7FF+k9c8IJmb8HAACgsSIhQtPgK7Lwmg6TVKexY73bbiK0dq1zfPxx796GDdLddwfrNGQi0ad11dXO53SvveYUb5g507uX7izRm2+m7rNoUfD6q6/SGxsAAKApICFC01Gf0Rygd3WlbtbRR3u3/EmCtdLppwcfveiiaEJav1469FDv2r8/UTpJ0YYNwWv/J3Qu//iSVFeXfnwAAACNHQkRmpYFC2Ql3aSrdZxe2NQcUpm7KEaODCx5yrjqXFgp8Orq3GICAABozEiI0LR06yZzwQWqUZme1Onqp8nFjiiOf8lTplXnijH7Q9lvAADQkJEQoekZPVrlFe1Vpo0aqyO1Wx6SomT7FWUj0/VKffqk7tO5c+J73bpJzZtn9k4AAIDGgIQITdPSpWqpDdpKCzVZldJZZ+U03JVX5imuLL3xhnd+773hfaZOTfz8d99JGzdm/t4hQzJ/hhklAABQSkiI0HRZq2Zyfumxx3Ka5hkzJn9hxQorlBCra1fv3F+Ywa9Ll/B2/zqlQYPSj0uS/vGPzPoDAACUGhIiNG3+b9NyqKywYkUeYknAv9dQOmprM+vvX6f0/POZPbtqVWb9/W68MftnAQAA8oWECFiwwDsvwW+53nsvurFj9ygqpL/9rXjvBgAAcJEQAd26SZdd5l3nISmanMfidenOwrRsGd/25z+H93UTIf+ndrHctT4tWqT3/kzlMrsEAACQLyREgCTdcou05Zbe9eab5zTc8OE5xpOF44+Pb7v55vC+sWuF+vZNPG5NjZMY/f3vyd+/eHHy+wAAAKWIhAhw+b8fW7MmrQoDbdqEt0+ZkqeYkvj3v4PXTz8d36emJvzZ994LJjix8YZtCDtsWPJ4rrgi+X0pu6p0AAAAUSIhAvz8RRaefz7lt2/9+4e3Z7qhaphUX+6dd176Y5WXB683bkyc4Nx2WzD+009P7x1jx6bu88IL6Y0FAABQKCREQCx/UrTHHkm7Pv546iGy1b59fJt/D5+5c9MfK9nszZFHBq8vv9w7f/75xD9jrB9+SN1n5cr0xgIAACgUEiIgTJqV57p1iy6EsDVBfpkkXddf7xwrKuLvxc7s1NV55yeeGLzn3wA2VjYbuwIAABRbZAmRMaanMea/xpgZxpjPjDFx20Uax93GmFnGmE+NMbtHFQ+QkW7dghvlFKEcd7J9elIlS4ncd1/ie+l8Gnfwwdm9FwAAoFQZm49ve8IGNqabpG7W2snGmLaSJkk6xlo73dfncEkXSTpc0l6S7rLW7pVs3MrKSltVVRVJzECcHj2k+fOd81atpLVr47q4uZL/XyW3zZjgjEum/GNXV0v9+sX3adPGqQHh779woVNSO1lsklOqe9268Hv+51q08Ao0JBor9l6ynyed/l27SltsIX3xRfIxAQAAYhljJllrK9PpG9kMkbV2obV2cv35KkkzJHWP6Xa0pCes40NJ7esTKaA0zJvnna9bJx19dEaPh60DylaiWaFjjolvO/XU9Mb0J0OxDjjAOw9br/TWW+m9I5lEpbr3288p+vfll7m/AwAAIJmCrCEyxvSS1E/SxJhb3SX5/6o1T/FJk4wx5xpjqowxVUuWLIkqTCCcfxpj7NiMdl3df//8hTF7dnh7WLnt//1POuqo3N739tveeZcu8feHDs1tfEn6zW/C299/P/exAQAA0hF5QmSM2VzSi5IuttbG1pgKW5gR9xGNtfYBa22ltbayc+fOUYQJJJdG5bldd41vu//+/IWQyad3tbXSv/6Vv3f7VdZPPs+Zk/tY77wT31aE5VoAAKAJizQhMsaUy0mGnrbWvhTSZZ6knr7rHpIWhPQDii9F5blp0+IfibIKXSqpEqj//Ce7cSdNco75qCpXWxu8/t3vch8TAAAgE1FWmTOSHpY0w1p7R4JuYyWdXl9tbm9JK6y1C6OKCchJCVSeS6Vly/i2Fi2C1//5j1NR7te/ju/rToR98kn8vX32yT0+V6LfulGj4ttuvjl/7wUAAIgV5QxRf0mnSTrQGFNd/+twY8wwY8yw+j6vSvpa0ixJD0r6bYTxALm7+mppm22865YtQ/f2KZawwgt/+lPw+te/Tr7ZqrXSz34W3z5hQnj/Tp3Sj8+13XbJ75eVeed33ZX5+AAAAOmKrOx2VCi7jZLgm+JYttsv1OkTZzGM+69TWLnrXF5zzz3SBRc45y1aSBs2eH1i35FpKexs4vnuO6cstiTdeac0fHh673KfnzrVW3MVVsa7d2/pm2+8tlxKlwMAgKanJMpuA42a72/xFZ/8T81Um6Rz7m6/3TufGFursQi23947v/ji9J7xl+nu0yd539tu884b2H+zAQAADQwJEZAt39/UH9B5kb7KX9Gtb99IX5WW1aszf+bCC9Pve9xxwc/mAAAAokJCBOSivvLcED2ioXogstfEVmMrlmHDUvfxM8b7DC7TTVa7x+1IBgAAkH8kREAuunWT7r5bM/UT3aMLNd90KWqp7ajdd1/y+2GlxyWpR4/Mk7obbsisPwAAQDZIiIBcXXSRjtVLmq1e6qJl+vl3Ywr26lKbRfGvJ/InR/PnJ34mURJ1xhn5iUlyij4YI5WX529MAADQOJAQAXkwQ310pF6RldG9ukhttSKwZVG2YvcQclnr/Jo3L/d35NPHH3vnYaW7pfif6aijoovHNXKkcyyVTw8BAEDpICEC8qC8XPpCO+o6Xaeu+k7/0hG6ut+/ch53hx1yez5RQhWVVau880TV4fr1C17Pnh1ZOAAAACmREAF54JbFHqGrdJHu1v56TzriiJzH/dvfcnv+ootyDiFOspjSKZE9dmz4M1HEGuboowvzHgAA0DCwMSuQJ/4NUWdoJ+2kmc5Fjv+OZbrRqr9/VP96x74j9nrxYqlLl/Bn3ZhatvQ2mLVW6tBBWr482CffG9zGxgAAABonNmYFiuzBve7yLmL/Np6DZmn8G1voz+QkqXnz4HXPnqmfGT48eO0mQ2GWLMk8JgAAgHSQEAERuP3DQ6Q+fbyG2IwhS7vumrpPHr7US1uHDs6xU6dguzvzU1YmHXts+LN//Wv677n00sxjc02fnv2zmfDvuQQAABoOEiIgKlOnOhmBJG3cKO2+e85D/vvfqfu8+GLOr0lp2jSpTRvpu++c60QlshcskF56KbOxw2bBMh3Db//94
9sOPTT78VJZujS6sQEAQP6REAFR8td5njJFevrpnIbr2jX9vh075vSqpH76U2nNGu/zvESzPVtumfnYYRvb+qvXpeLuOeTO1nz/fXyf118Pf3aLLZznwpKodN16a3zbsmXOuHfemf24AAAgGiREQJ64f/mP23/Iv4J/8GDvb+s//3lksVjr/CW8GFq2zPyZxYu98+uvz+397p5Dsdq3l847L/mzbuL13nvZvz9shs79pPCSS7IfFwAARIOECMiTRYucROTqq0NuhpU1q6rykiNjQjKphsldP9S6dfrPDBzonQ8Z4p2Xl+cWy8sve+fvvy/df793PXhwbmMnMn9+NOMCAIBokBABhWKt8+uhh8LvX3ttMEGaNq2w8eXZjz965xdemLxvokr6O+2U2Ttvvjl4fcwx3vkuuwTvPf209PnnmY2fiH/dkJsQAgCAhoGECCi0IUO85Mha6aCDwvvtuqtkjO5RKUqHAAAgAElEQVTWhdpa3xY2xjwbNcr7cTN9LhNXXpm6jz+GnXfObPxETjwxfHxJ6t07P+8AAADRICECim38+GCCFFPD+iLdo2/VS4vVqSDrjwopVdGFAw7IbtwRI5Lf9+9rlM7eTql88EHie7Nn5z4+AACIDgkRUGqWLPGSo0WLdJoe1+90l5bJlyjFrj/605+KF2+IqVPT6/fmm/l7p3+90BVXJO/bqZNUUeGcuzM6uZTLXr8++2cBAEBxkRABpWzLLfWUTtco/U476/PE649uuaWk1h/596TNR79Urr46uF4oHbGV5H71q/zE4lesSn8AACB9JERAiXP3+jFG8euPDj88/KH69UcyxtlBtREzRrrpJu/aLa399tte209+Ev9cbMGGzz7Le2jq0iW+7ZRT8v8eAACQPRIioMStX+/kPnV1ITf/9a+k648kSWvXBmePfvazSOO9/PJIhw/Yb7/g9a67eqW1/euP3n8/9VixxRBy+YTOtXGjc9xqK69tzJjcxwUAAPlDQgQ0JjHrj1RWFt9n6tRI1h+5r/3rX/MyXBx/IQSXP9GxVvr00+D9bbd1fsTOnZOP/be/xbcde2zmMSbi35uoVNYbff+9tMcexY4CAIDiIyECGqstt5Rqa1Pvf1Ri648SGTQo8b3mzcPbv/oqwcxajKuuim+bODG9uNKVyUa1hVBRIU2eLHXtWuxIAAAoLhIioKloAOuPjEl8b8KExPfuvDO399bUpNeWi6OPzu94+bJoUbEjAACguEiIgKYqdv1R2KZAseuPdtkl72H4k6Bkn7Yl+9Tswguze/fBB8e3hX1l6GrWLPvfgmeeye45AAAQLRIiAI5Fi1KvP5oxI+/rj/yFEcL2DzrjjJxfkdDrr8e37bBDeN/LLnN+a2bMiC6efCkrk/7zn2JHAQBAw0BCBCBe7PqjF18M75eH9Uf/+593fvHF8fcfeyz8uenTM35VWl56KbzdX3jBLYUe2+6X62d82TLGWTd1yCHFeT8AAA0NCRGA1I47Lvh5XaISbBGvP/JXmttzz7wPLyl+f6Iw/vVFiRKiu+/OTzwAACBaJEQAMvfSS8EEKaxUWQTrj/zJypo1OQ+XE/fHSbRf0bx5iZ9dtiz1+O5vWyZ++cvM+gMAABIiAPmwcGHm648uumjTrWOPlQ46KPVrvv8+vm3rrXOIO4b/Uzi/LbbwzsvLnaO7lih2Q1dJGjfO+eIwkUQF/lynnJL8/qBB4WW833kn+XMAACAeCRGA/Ep3/dHo0ZuSo5fGGI2/8q2EQ553XuLXVVXlFq4/CTr33PA+q1Z5fTds8Nrvv987938heMst3nlYbjh5cvKYnn02+f3nn5fWrQsWTghLFgEAQGokRACile76o4MOchKkHXeMu+VPPGIlK9Wdjrvu8s5HjQrei/0cbv784PX553vn/h/rs8+88w4d4t+ZbPYoEyec4J136pSfMQEAaGpIiAAUVqr1R1984X1WF7LYxl9YIR+GDUt878QTg8mYm3Tcd19835EjvfOVK4NjuNq1yy7GRNyZK8n7dO+kk7y2WbMyG6+83Plt/8c/co+tUMrKpNmzix0FAKAhIyECUFz+9UetWgXvderk/A39kks2Ne2wg3TmmYUJbcKE4CyQKyyJ8s/QbNzond94o3f+5z+nfudVV6UfX5jnnvPOjz8+s2fdmauw8uelqF07p8R4797FjgQA0JCREAEoHWvXOolR7KKhO+/UDO2g7TRLy5dLTzyR39e6+Vgs/3qhTJMLV0WFdz58eOr+I0YEr7/4Inh93XXxzyRaP+T/dC8TDWU9kn+GDACAbJEQASg999/vZCi+RTw76kvN0k/0tn6h39gn1UwbMy5Lna5mIf/P+MILwet3343v07x5/mM566zgddi+R7vtFrx2f1/8M1XJxCZA69en91yxhSWxAABkioQIQOmqqNg0fWN69dIjOlM76gs9pdM1Vz21xraQ9tor76/t1St43a9ffJ/99otvy2SNUDp7EUnSJ58Er3/8Mb5P7J5H3bunH4cUv4ksiQYAoCkhIQLQMHzzjc62j6rrs05ZuK20UK1VI330kTMlEjatk6XHHw9epyqT7dp11/TfEVYV7qGH4ttSbUDr/4Suf3/n+PDDyZ/xl+uWpGeeSd4fAIDGLK2/QRhjtjPGtKw//6Ux5nfGmPbRhgYAIQYN8hb9+L9Rs9arThe7ECdD/tmfnXdO3M+teOd+ouYvoJDIwIGJ751zTurnY/l/1Pfec44HH5z8mdgqeXPnZv5eAAAai3T/k+qLkjYaY7aX9LCk3pL+L7KoACAdNTXhextddZWTpYRtApSh6dMT3+vUyXl9XZ1zHfYZXaxx47zzDz4I79O3r9S6dfJx3A1fa2pSvzNW7IzX2rXpFVJo1kw67bTM3wcAQClLNyGqs9bWSjpW0khr7XBJ3aILCwAy4O5tFLuT6vLl3qxRbLm2FBJVnsunfff1zv370U6Zkvrzu2yr3knxv02S9Kc/JX9mhx2c34+nnsr+vQAAlKJ0E6IaY8wpks6Q5P73zfJoQgKALPmKMGjLLYP3dtzRSYx+/evixOYTtreRm6+5M0P+jV7D+PcbylRYFblUm7F++WX27wMAoJSlmxCdJWkfSTdZa78xxvSWxH8nBFC6Fi1yEqM77gi2jx/vJEZR1MhO0733eucjRkjHHeddu1Xk9tkn/rnYYgjpCCvTHVaOe8WKzMcGAKAxMDbDb0KMMR0k9bTWfhpNSMlVVlbaqqqqYrwaQENXVuYt+PF78EFp6NC8vCJ2b6RE/xcbtoeSMcHw3D7uGF27Onme2+Yf46STgrNG7r327aUffkj97lixcfufKZWy3KUYEwCgNBhjJllrK9Ppm26VubeNMVsYYzpK+kTSo8aYO1I9BwAlZeNG52/Ov/pVsP2cc5y/XXfL79LIZJNQYX+BD8vV/NxkKEyiT+iWL0/8THmaHz5XpvXHSXG9/XaxIwAANFTpfjLXzlq7UtJxkh611u4haUB0YQFAhN56y8lIZs4Mtn/3nVeEId2dU5Po0SP5fXe50667Smeemf64qSrQSVLLlqn7tE9z
<remainder of base64 PNG data omitted: plot of training and test loss against iterations>\n",
150 | "text/plain": [
151 | ""
152 | ]
153 | },
154 | "metadata": {
155 | "needs_background": "light"
156 | },
157 | "output_type": "display_data"
158 | }
159 | ],
160 | "source": [
161 | "hidden_size = 128\n",
162 | "model = QANetwork(hidden_size)\n",
163 | "optimiser = optim.Adam(model.parameters(), lr=0.0005)\n",
164 | "train_losses, test_losses, test_acc = [], [], 0\n",
165 | "epochs, iters_per_epoch = 10, len(train_data)\n",
166 | "\n",
167 | "plt.figure(figsize=(14, 8))\n",
168 | "plt.xlabel('Iterations')\n",
169 | "plt.ylabel('Loss')\n",
170 | "plotted_legend = False\n",
171 | "\n",
172 | "\n",
173 | "def plot():\n",
174 | " global plotted_legend\n",
175 | " plt.plot(range(len(train_losses)), train_losses, 'b-', label='Train')\n",
176 | " plt.plot([(i + 1) * iters_per_epoch - 1 for i in range(len(test_losses))], test_losses, 'r-', label='Test')\n",
177 | " clear_output(wait=True)\n",
178 | " display(plt.gcf())\n",
179 | " if not plotted_legend:\n",
180 | " plt.legend(loc='upper right')\n",
181 | " plotted_legend = True\n",
182 | "\n",
183 | "\n",
184 | "def train():\n",
185 | " model.train()\n",
186 | " for i, x in enumerate(train_data):\n",
187 | " optimiser.zero_grad()\n",
188 | " y_hat, _ = model(x)\n",
189 | " loss = F.cross_entropy(y_hat, x.answer.squeeze(1))\n",
190 | " loss.backward()\n",
191 | " train_losses.append(loss.item())\n",
192 | " optimiser.step()\n",
193 | " if i % 10 == 0:\n",
194 | " plot()\n",
195 | "\n",
196 | "\n",
197 | "def test():\n",
198 | " model.eval()\n",
199 | " test_loss, correct = 0, 0\n",
200 | " with torch.no_grad():\n",
201 | " for x in test_data:\n",
202 | " y_hat, _ = model(x)\n",
203 | " test_loss += F.cross_entropy(y_hat, x.answer.squeeze(1), reduction='sum').item()\n",
204 | " pred = y_hat.argmax(1, keepdim=True)\n",
205 | " correct += pred.eq(x.answer).sum().item()\n",
206 | "\n",
207 | " test_losses.append(test_loss / len(test_data.dataset))\n",
208 | " return correct / len(test_data.dataset)\n",
209 | "\n",
210 | "\n",
211 | "for _ in range(epochs):\n",
212 | " train()\n",
213 | " test_acc = test()\n",
214 | "plot()\n",
215 | "clear_output(wait=True)\n",
216 | "display('Final test accuracy: %.2f%%' % (test_acc * 100))"
217 | ]
218 | },
219 | {
220 | "cell_type": "code",
221 | "execution_count": 14,
222 | "metadata": {},
223 | "outputs": [
224 | {
225 | "data": {
226 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAARoAAAEGCAYAAAC6p1paAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4xLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvDW2N/gAAIABJREFUeJztnXm8XdPd/9+fJAiCIGgkSB9Ta4iomKdow5O2ij6lppaUUlXVRyuqE0FrylOe1lDUL6giMZTSehpKEjGExJRIBEFUzKGGFKnkfn9/rHVyd87d5+x9cs8599xzvu/72q+795r3cL57rbXXWh+ZGY7jOLWkR1cXwHGc5scNjeM4NccNjeM4NccNjeM4NccNjeM4NccNjeM4NccNjeM4NccNjYOkSZK+3UV5/1TSlV2Rd14kmaRNurocnUGBqyT9U9Ij0e27kt6QtFDS2vH/f9Qifzc0DYikkZJmSvpQ0uuSLpW0Rs64O0t6X1LPhNvvS7hdVovyV4KZnW1mFRs5SbtJelDSe5LekfSApO1rUcbOIGkbSXfHMr4saWSOOJOiQVipikXZDdgbGGhmO0haAbgA2MfM+pjZ2/H/C1XMcyluaBoMST8CzgNGAWsAOwGDgLviw5HFdKAn8LmE2+7Aq0VuewD3VaHIS4lvzZo/U5JWB/4CXASsBQwAzgAW1TrvlLL0zAiyAXAJ0B84GLhc0vpl0htEuF8G7FedUgKwETDPzP4Vj9cDegOzqphHaczMtwbZgNWBhcDXi9z7AG8CR+ZM5x7gR3F/XeAFwg8x6WaEtxvAJOAs4AHgA+AuoF8ivZ2AB4F3gSeBYQm/ScCvYtyPgE0IBvL/Aa8BrwC/BHqWKOto4I9xvzfwR+DtmNc0YL2UOEOBd8uc/8bAvTGdBcB1QN+E/zzgZGAG8B4wHuid8B8Vy/4qcFS8VptEv6uB3wF3Av8ChgNfBh4H3gdeBkaXKFeveH+3LlP20+K1vAD4S4XPz/rA7cA7wFzgmOh+NPAxsCTmf0Msu8Xje2O45HmuDPwaeCleo/uBlbOeh5Jl6+ofl2/LPCgjgMVArxS/a4Dr4v5uGT+004E/x/0DgT8Qqs1JtxcS4ScBzwObxQdsEnBu9BsQf7BfItSA947H6yTi/gPYMv6QVgBuAy4HViUYtUeA75Qo62jaDc13gDuAVQi1su2A1VPirB7LcA3wRWDNIv9NYjlXAtYh1Nz+N+E/L5ZpfUKN6GnguMQ9eAPYKpb/ejoamveAXeP16A0MA7aOx4Nj/ANSyv3bmG+PMvduLnB8PPdPSBha4DBgRpm4k4FLY5mGAG8BX4h+I4H7E2EHxfPqlXBLnucl8d4OiPdil3g9yz4PJcvW1T8u35Z5UL4BvF7C71zgrpzpDIs3X8BvgGMItaI3Em5XJcJPAn6eOD4e+Fvc/zFwbVH6E4i1qxj3zITfeoQmzMoJt0OBiSXKOpp2Q3MU4U05OMc5fjb+6OcTjPPtpNR+YtgDgMcTx/OAbySOzwcui/tjiUY2Hm9GR0Pzh4yy/S9wYZHbKcAzwKfKxNstGpd+8XgOcFLOe74BocayWsLtHODquJ/b0EQD8hGwTUo+ZZ+HUpv30TQWC4B+knql+PUnvKHyMJVgWLYi9MVMMbOFhGp9wa24f+b1xP6HMT6Etv1Bkt4tbIQfRP9E+JcT+xsRajWvJcJfTqjZZHEt4aEdJ+lVSeeX6pcys6fNbKSZDYzntD7hB46kdSWNk/SKpPcJzbF+Oc93/aLzeSkl+6Q/knaUNFHSW5LeA45Lye+/CU2Z1ynNkYSXyYJ4fH10y8P6wDtm9kFR2QfkjJ+kH6FW9HyKX57noQNuaBqLhwi1gf9KOkpaldBEmJwnETP7mNC/sS/Q38zmRK8p0W0w+TuCXya8wfomtlXN7NxklkXhFxHeyoXwq5vZljnK/YmZnWFmWxCq6vsCR+SIN4dQ09gqOp0TyzTYzFYn1BSVfapA6JvZIHG8YVqWRcfXE2pUG5jZGsBlKfn1J/T5pCJpZeDrwJ7xS+PrwEnANpK2yVHuV4G1JK1WVPZXcsQtZgGhT2fjFL88z0MH3NA0EGb2HqHT9iJJIyStEL9C3ER7p2Ze7iO8RR9MuN0f3V43s7S3VRp/BL4i6T8l9ZTUW9IwSQNLnMNrhM7kX0taXVIPSRtL2jMrI0l7Sdo6fsl5n9CMWJIS7jOSflQog6QNCM2zqTHIaoROznclDSB07ublRmCkpC0krULo78piNUJt4mNJOxD6UorpD7xYJo0DCOe6BaF/ZQiheTiFfMb2ZcK9Pifeo8GETuBKnplCWm2EJuQFktaP933n+Lm9ouehgBuaBsPMzgd+CvwP4QvQi4TO0eEWP01K2l3SwoykJhOaK/cn3O6Pbrk/a8cHeP9YprcIb7RRlH92jgBWBGYD/wRuJqNqHflUDPs+oYN2MuHBLuYDYEfgYUn/IhiYp4AfRf8zCJ/y3wP+CvwpR94AmNn/EZpg9xI6Zu/NEe144ExJHxC+Gt2YEmYuodlRiiMJ/Wb/MLPXCxtwMXC4pF6SDpdU7nP0oYS+l1eBW4HTzezuHOVP42RgJqFm/A5hyEWP5XweUOzMcRoUSUcRfji7mtk/uro8jrM8uKHpBkj6JvCJmY3r6rI4zvLghsZxnJrjfTSO49QcNzSO49QcNzSO49QcNzSO49QcNzSO49QcNzSO49QcNzSO49QcNzSO49QcNzSO49QcNzSO49QcNzSO49QcNzSO49QcNzSO49SctLVpHafhkVTXZQfMLO9SoKmMGDHCFixYkBnu0UcfnWBmIzqTVyPihsZx6sCCBQuYNm1aZrgePXoUL2reFLihcZw60dbCaz+5oXGcOmBAKy8y54bGceqCYR1UWloHNzSOUw8MlrS5oXEcp4YY3kfjOE4d8D4ax3Fqjhsax3Fqipm1dNOp4ikIkn4maZakGZKekLRjNQoiaZKkoZ2If4CkLapRlox8Rkq6OO6PlnRyhfE7SNlK6ivp+ArTWSZO1D/+SyVpFKV3taQDU9yvLFxXST9d3nScYGyytmalIkMjaWdgX+BzZjYYGE7Q3q0ZUfA9DwcQBNLT0mj0mltfgn5zreNUjJl928xmx8NMQ+OkY8ASs8ytWam0RtMfWGBmiwDMbIGZvQog6TRJ0yQ9JekKSYrukySdJ+kRSc9K2j26ryxpXKwZjQdWLmQiaaGkMyU9DOxcKu1E+F2A/YAxsZa1ccz3bEmTgR9I+oqkhyU9LunvktaT1EPSPEl9E2nNjX7rSLol5jtN0q7lLkzM82+SHpU0RdJnovunJT0U0zirRPRzgY1j2ccoMCae70xJB2fFiW59JN0saY6k6xL3YDtJk2PZJkjqn3EuZ8WaSY9CTVPSucDKMb/rYrgj4v17UtK1iST2kPSgpBeStRtJo+J1mCHpjOg2SNLTkn6vUFO+S9LKNCGtXKPJdfKJi9AHeAJ4FrgU2DPht1Zi/1rgK3F/EvDruP8l4O9x/4fA2Lg/GFgMDI3HBnw
9K+2isl0NHJg4ngRcmjhek3YJ4G8nyvQb4Ftxf8dE+a4Hdov7GwJPx/2RwMVxfzRwcty/B9g0kc69cf924Ii4/z1gYUrZBwFPJY6/BtwN9ATWA/4B9M+IMwx4DxhIeIE8BOwGrAA8CKwTwx1cuO5p1w84H7g8ca0mJe7LwkT4LYFngH7JexTTuSmWYQtgbnTfB7gCUPT7C7BHPI/FwJAY7kbgGyWev2OB6XGzem6V/E7Stm223dbefP/9zA2Y3tm8GnGrqElhZgslbQfsDuwFjJd0qpldDewl6RRgFWAtYBZwR4z6p/j/0fhgER+y38Z0Z0iakchqCXBL4rhc2uUYn9gfGMvbH1gReDER5jTgKuCQRJzhwBaJytPqklZLy0RSH2AX4KZE+JXi/10JhgOCkTwvR7l3A24wsyXAG7FWtj3BaJXjETObH8v0BOFavwtsBdwdy9YTeK1E/F8AD5vZsTnK+HngZjNbAGBm7yT8bjOzNmC2pPWi2z5xezwe9wE2JRjRF83sieiefEaWwcyuIBirus/e7jTtxrIlqbjvIj78k4BJkmYCR0oaR6jhDDWzlyWNBnonoi2K/5cU5Vnqyn8c80FS74y0y/GvxP5FwAVmdrukYYTaCIQ3/yaS1iH08/wyuvcAdjazj5IJFrXaSIR918yGlChHpU/Y8i5JsCixX7jWAmaZ2c454k8DtpO0VpHhKFXGUue1qChc4f85Znb5MolIg1LK3XRNp2rOdZI0glAT7wlcaWbnFvlfSKgIQHg5r2tmfelCKu0M3lzSpgmnIcBLtP/wF8S3e56vDvcBh8d0tyI0n9LIm/YHQGqNI7IG8ErcP7LgaOHu3wpcQGgevR297gJOKISTVMqIYGbvAy9KOiiGlaRtovcDhJoSxPPNUfb7gIMl9YwGcA/gkYw4pXgGWEehIx9JK0jaskTYvxH6fv5aovb2iaQV4v49wNclrR3TXSujHBOAo+I9RNIASevmKH/TsKStLXPLQuHjyCXAFwlN00NV9LXVzE4ysyHxxXcR7S2KLqPSzuA+wDWSZsemzhbAaDN7F/g9MBO4jfBmzOJ3hM7LGcApdPwhAVBB2uOAUQqdvRun+I8mNG2mAMUrEI0HvsGyTa0TgaGx43I2cFzG+RwOHC3pSULTbv/o/gPge5KmEYxdB6JxeyB2/o4hGL4ZwJPAvcApZvZ6RpxUzOzfBON8XizbE4RmXqnwNxGu9+0pnbJXADMkXWdms4BfAZNjuheUSjOmexeh3+uhWBO+mXyGskmwXH852IHQ7/VCvLfjaH/W0jgUuKEKJ9Ap1MrtRqf7Uu8+GuvkCnuDhwyxv957b2a4Ddde+yWWfRFeEfumAIhf8UaY2bfj8TeBHc3sBIqQtBEwFRhY6IroKhp9fInjNA05X+oLzKzcwNU0g1cq4UMIHfZdamTADY3j1I0qtR7mAxskjgcCr5YIewhhSEWX44bGcepAFZeJmAZsKunThI8bhwCHFQeStDlh7NhD1ci0s7ihcZx6YJbrq1J2MrZY0gmEr3g9CYMvZ0k6kzDYrzDW6lBgnDVIJ6wbGsepE9X6zZvZncCdRW6nFR2PrkpmVcINjePUgTCXoSEqF12CGxrHqRMtvGSwGxrHqRcN0l3SJbihcZw64YbGcZyaYlX66tRdcUPjOHXCazSO49QU13VyHKcu+Odtx3Fqjn/edhynppgZbd4Z7DhOrfE+Gsdxao5/dXIcp+a0sqGpdHFyl8NtITncjDgL4/9Bkp5a3rxbBbOgvZ21NSu5DY1cDrdWNKwcrlNdqrQ4ebekkhqNy+GWQM0hh9tBxlZSH0n3SHoslqXcavtIulPS4Lj/uKTT4v5Zkr5dKr3o/4NEOr+SdGK5vLobBixps8ytackraYnL4UJzy+Gmydj2AlaP+/2AuYnruLC4LMCp8TxXJyw5OSG6TwQ2L5VeTOOx6N4DeB5YO+N57FaSuJttuaVNnD07c6PVJXHN5XCbXQ43TcZWwNmS9gDagAEE4/d6iTSmEPSwXgT+CuwtaRVgkJk9oyA+1yE9M5sn6W1J28b0H7d2Ib+lSDqWoL/dLWnmPpgsKtXedjncjjSLHG6ajO3hwDrAdmb2iaR5lL/+04ChwAuEWlk/4BjCSyYrvSsJNcZPAWPTEjfX3u62VNIZ7HK4KVjzyOGmsQbwZjQKewEblQtsQTnxZeDrBOGyKcDJ8X9WercCIwi1twkVlLFbUNDeztryIGmEpGdin+KpJcJ8XUFRdpak66t5LstDJZ3BLodbmqaQw03hOsJ1mB7PcU6OOFOAN8zsw7g/kHZDUzK9WNaJwI3WAIJntaAan7eVQ3s7Vgh+AuxqZlsC/139s6kMl8R1GgJJPYDHgIPM7Lkc4buVJO5mW25pv70hWwL7i9ts86iVUaqMtdPRZvaf8fgnsXznJMKcDzxrZld2pszVpKIBe45TC+IbeS5wTx4j0x0prEdThQF7A1h2/Nr86JZkM2AzSQ9ImippRHXOYvlp9MFsTgtgZrOB/+jqctSU/H0w/WLTssAVsRO8QB7t7V7ApoRhDwOBKZK2il0RXYIbGsepEzlrLAvKNZ3Ip709H5hqZp8QPlQ8QzA8efpPa4I3nRynDlTxq9NS7W1JKxK+ahaPsbqNMNYNSf0ITakXqnc2leM1GsepE3XU3p4A7BO/mC4BRqUNgKwnbmgcpy5Ub9KkZWhvx/FhP4xbQ+CGxnHqgFnYWhU3NI5TJ3yuk+M4NaeVB8e6oXGcOuACco7j1B5zuRXHceqB12gcx6k11sxLdWbghsZx6kQLV2jc0DhOPQjjaFrX0rihcZw64YbGcZwaY7Qt8a9OjuPUkFZvOnXpMhGSTNK1ieNekt5SJ+Rd64nKSPGW88ubnjohFazlkKpVkPxdP3E8Ly4z4FSBai1O3h3p6vVo/gVsJamgVLk37WoFuVDXSt6WlOLN8Fue9OrBSGD9rEBJuvj6dy8KMyvLbU1KVxsagP8Dvhz3DwWWruAsaQcFmdbH4//No/tISTdJugO4S9K1Ssi1KkjC7pfMRNKlBTdJt0oaG/ePlvTLuP8NBfneJyRdHlecL8j0/krSk3EN1vWUIsWbyCtNpndIjDsj5r9mUflKpXeQOkoK91SQzZ0W0/tOiWvbS9I1MczNCmJuqRLGCjK4Q4HrYv4F4/99tUvYFqR+R8d4dwF/kNRb0lUxzOMKUiqUcR8p6TZJd0h6UdIJkn4Yw0yVtFaph6U708J2piEMzTjgEAWxuMHAwwm/OcAeZrYtQVHy7ITfzsCRZvZ5gvjYtwAkrUGQFFlmvQ6CXtLucX8A7TWH3Qhrqn6WIBm7axSDW0K7FtOqhKURt4npHGNmDxJWNhtlZkPM7PlCRiX8/gD82MwGE+RjTk8Wrkx6vcxsB4JkRiHO0cB7ZrY9QQfpGEmfTrm2mxPWnB0MvA8cH90vNrPtzWwrgu75vmZ2MzAdODzmXxDPW2BmnyNI5JycSHs7YH8zO4wgg4uZbU14WVwT72cpdwgKmocBOwC/Aj6M9/kh4IiUc+neWOgMztqalS43NGY2gy
DfeigdjcMaBD2mp4ALgaT42d1m9k5MYzJBcXLdmM4tZra4KK0pwO6xD2Q2QW62P8FgPQh8gfDjmaYgKfsF2hfM/jdQ6DdKSvvmIhq/vrGcANcQhOHykCYpvA9wRCznw8DahDVhi3nZzB6I+38kGFUIMsMPK6iNfp5lr2ue/AFuTxij3QiSv5jZHIKw4GZl3AEmmtkHZvYWQTe8IHM8kxLXV9KxkqZr2cW7uwXVFJDrjjRK+/p24H8Iq7avnXA/i/BAflXSIIIcb4Gk5C2EB/pwwhqqRxVnYGavxObKCEKtZC2CouJCM/tAkoBrzOwnKeX7xNqfgmJp31qTJiks4PtmlqXoWPzkmiqXGS4laZy8/qU0j8ppISUleNsSx22UuL7WnSVx8a9OjcBY4Ewzm1nknpSyHZmRxtVERT4zm1UizEMxzH10lGu9Bzgw1oqQtJakshKwlJemXepnZu8B/yz0sQDfBCaXi5PBBOC7klaIZd1M0qop4TZUlMMl1PTup7zMcN78i0lKHG8GbEiQ4y3l3pK0co2mIQyNmc03s9+keJ0PnCPpAcJCzOXSeAN4GriqTLAphD6PuQRVxLWiW0Fb6OeEzuUZBJH6/hlFLyfFW+x3JKGjdwZBt/zMCtNLciWh+fdYbFZeTnot4GngyJjnWsDvrLzM8NXAZUWdwXm4FOgZm2LjgZFmtqiMe+thBm05tialaSRx4xeVmcDnYg3CaWLq3XTqrCTuRptsaqf+T9q7dFmO/+qXy0riAigoT/6G8PK90szOLfIfCYyhvTVwcVfL4zZKH02nkDSc0Py6wI2M04gY0FaFGksccnEJYczZfMLHi9tjjTzJeDM7odMZVommMDRm9ndC+99xGpPqTUHYAZhrZi8ASBoH7E9oSjcsDdFH4zitgLVZ5kbU3k5sxxYlMwB4OXE8P7oV87XEQM0NUvzrSlPUaByn8cn9VSlLezutr6g44TuAG8xskaTjCOO2Pp+vnLXBazSOUyeq9Hl7PpCsoQwEXi3K5+3E173fEwaidiluaBynDhSWiaiCoZkGbCrp05JWJAxQvT0ZII54L7AfYZhDl+JNJ8epE7ak853BZrZY0gmEQZs9gbFmNkvSmcB0M7sdOFFhAvFi4B2yB7vWHDc0jlMnqjVmzczupGheoJmdltj/CZA2labLcEPjdEtWWmllBg78TF3ymj9/TucTafIpBlm4oXGcOuGGxnGcmlJYJqJVcUPjOPXAwJp4Yass3NA4Tl3wPhrHcepAC9sZNzSOUy+8RuM4Tk0xozBpsiVxQ+M4dcJrNI7j1Bijra11vzrVfFKlpJ9JmhXXxnhC0o5VSne55WK7miigdnGK+zAFIblK0lomjqSroxhcQ9Cd71NVqd6kym5JTWs0cQX+fQnr+C5S0HFescZ59jSzJbXMo4YMAxYSdKZqGacmSOqVoqflFGjhPppa12j6ExbyWQRgZgvM7FVIl2WN7pMknaeOMrArSxoXa0bjCQqLRL+Fks6U9DCwc6m0k8Q3/+8kTZT0gqQ9JY2V9LSkqxPhDlWQdH1K0nnR7buSzk+EGSnporhfSlb3W/F8JgO7ppRnEHAccFKMu7ukjSTdE8/5HkkbZsWJXnsoSAi/kKzdSBqldhndM9JumIJE8LPxPvy+UPOS9BUF0bnHJf1d0nrRvVget9x92kfSQwoSuzcpyL0gaZ6kM1QkvdtMhJHBLolbK+4CNogP7qWS9kz4dZBlTfilycB+lyCbOpggoZpczGdV4Ckz29HM7s9IO8mahJXHTiKsSlZQw9xaQSt7feC8GGYIsL2kA4Cbgf9KpHMwMF4lZHXj+iBnEAzM3rTL8S7FzOYBlwEXWpCknQJcDPwhnvN1wG9zxIFg4HeL530uhB85Qc1yh3gu20laRi0znu8vgJ1iOZM/+PuBnSzI1o4DTkn4JeVxU+9TrM3+HBhuQWJ3OvDDRBqlpHebhlZuOtXU0JjZQsKDdizwFuHHODJ676XSsqxpMqx7EGRdCzK6MxLhlwC3JI7LpZ3kjqhAORN4w8xmmlkbMCvmuz0wyczeik2C6wha4G8BL0jaSdLaBI3rBygtq7tjIp1/EzSO8rAzcH3cv5Z2SdssbjOztrgy/nrRbZ+4PU7QtPoMHWV0dwAmm9k7ZvYJcFPCbyAwIV7TUSx7TZPyuKXu004EA/tAvDZHAkmBvlLSu0tRQhJ3yZJu1kJrce3tmn91iv0lk4BJ8SE9UmHl9nKyrKVkWEuZ/I8L/TKqTPI1KcNaLNHai7BwUCnGEyR15wC3mpnFJloHWd1YC6rG6ypvGslzUeL/OWZ2eZl45bSLLiLI2dwuaRgwOuFXLE+cVk4R9NIPLZF+qXvenmhCErd371W63eu/mWssWdS0RiNpc0nJt+YQgtB7OVnWUiTlVbcCBpcItzxpl+JhYE9J/WJfy6G0S9n+CTgguhVqKKVkdR8GhklaW0HG9qAS+RVL0j5IWKoRwrnfnyNOKSYARyX6RQYUypngkXi+a0rqBXwt4ZeUJz6yTD6l7tNUYFdJm0S/VRRkcluCwuztVm061bpG0we4SFJfQu1gLnCsmb0rqSDLOo9lZVlL8TvgKgV51ycIP4oOLGfaqZjZa5J+AkwkvJHvNLM/R79/SpoNbGFmj0S32ZIKsro9gE+A75nZ1Fizegh4jdB0SZP4vQO4WdL+wPeBE4GxkkYRmp7fyhGn1LncFfuQHgoVLxYC3wDeTIR5RdLZBMP4KkErqCDINxq4SdIrBKPx6RJZpd4nM3srNptvkLRSDPtz4NlSZW4qCr3BLUrTSOI61UFSHzNbGGs0txLWpL21q8tVTO/eq1g9V9j7+OMPOyWJ23/gRnbUiT/LDHf2j7+TKYnbHXEVBKeY0bGz9ingReC2Li5P02Bt2VseJI2Q9IykuZJOLRPuQEmmBhgw6VMQnGUws6b8tNzlGFWZgqCc2tuSViM0vR/udKZVwGs0jlMHqtgZvFR7Ow6VKGhvF3MWcD7wcdVOohO4oXGcOlElQ5OpvS1pW2ADM/tL9UrfObzp5Dh1wfKuR9NP0vTE8RVx/FCBstrb8WvnhTSAaFwSNzSOUw8s94C9BRlfnbK0t1cDtiIMkAX4FHC7pP3MLGnA6oobGsepF9UZSrJUe5swgPIQ4LD2LOw9oF/hWNIk4OSuNDLghsZx6oIBbVVYJsLyaW83HG5onG7JVlttwfTp9XlJDx1ahWEoVVwz2DK0t4vch1Ul007ihsZx6kJzz2XKwg2N49QJNzSO49QcNzSO49QUc+1tx3HqQQtXaNzQOE598M5gx3HqgBsax3FqS/4pCE2JGxrHqQNG9QbsdUdqskyEGlQGV9IBkjpoKlUbJSRvo8BaRYtJSVqY4tZX0vEVprNMHAX53KovHSBpHbWLy+0u6SAFIb6JkoZK+m12Ks2OYW1tmVuzUnVDo2VlcAcDw1l2/YyqE1cdy8MBpIi3xTQavXbXF6jI0CxnnOXhC8AcM9s2itgdDRxvZnuZ2XQzO7EOZWhsWlx7uxY1moaUwZW0C7AfMCbWsjaO+Z6tIFP7A6XIv
krqoSDZ2jeR1tzot46kW2K+0yR1kLotKsPGkv4m6VFJUxSlXyV9WkEqdpqks0pEPxfYOJZ9jAJj4vnOlHRwVpzo1kfSzZLmSLoucQ+2kzQ5lm2Cgrpmcfk7SPRKGkJYye1LMZ/TCUJ3l8XyLa1FSeoj6apY3hmSvhbdU6Vymw2XxK0uDSmDa2YPArcDoyzIxz4fvfqa2Z5m9mtSZF+jcuWfga8CKDQD55nZG8BvCHK02xM0kK7MuDZXAN83s+0Isq+XRvffAL+L6bxeIu6pwPOx7KMIkrxDgG0ItcYxKcahOA7AtoRrvAVBRXNXBa2pi4ADY9nGEq53MR0kes3sCeA0YHzM5wyC3O3hiTwL/AJ4z8y2jmncq2yp3KbB2ixza1aq3lyIUh3bAbsDexFkcE81s6sJUrWnAKsAaxGkZ++IUUvJ4P42pjtDQSuoQJoMbqm0y5GUpx0Yy9sfWJGgAlAIcxpt6egPAAAR4UlEQVRwFWH9j0Kc4cAWicrT6gqLQncgvqV3IWgjFZwL+ka70i7Wdi1B7zuL3YAbLCh0vhFrZdsTjGk5HjGz+bFMTxCu9buExZLujmXrSdCfKmZn2jXHryXUZCphOO2CeAVtrH1pl8qFcN0fSoss6ViCvDIbbrhhhVl3LYU1g1uVmvRLWGPL4BaTlHMtJfv6ELCJpHUI/Ty/jO49gJ2tXXeaWJ60fHoA75rZkBLlqPQpXF6doaRcbuFaC5hlZjtXmNbylLk4TpZUbntmCUncoUOHdq9fbYt/3q5FZ3Ajy+Bmycemyr5aeEJuBS4Anjazt6PXXcAJhXCxvyIVM3sfeFHSQTGsJG0TvR9gWenbPGW/DzhYUs9oAPego3pnXrncZ4B1FDrykbSCpC1TwuWR6C1H8fVak5aRyjXa2toyt2alFn00fYBrJM2OTZ0tgNFm9i5QkKq9jfwyuH1iOqdQRgY3Z9rjgFGxs3fjFP/RhKbNFGBBkd94goRssql1IjA0dmzOBo7LOJ/DgaMlPUlo2hVkMn4AfE/SNIKx60A0bg/Ezt8xBMM3A3gSuJfQn/R6RpxUomzHgcB5sWxPEJp5xZwIfCvej2/GclfCL4E1Y3meBPYys7cIC2nfENOdCtRHgrLOtHIfjUviOt2SoUOHWj1X2Js+fXqnJHH79Vvfvrz/tzPD/WHsWU0pidvoY0ccpykIncFdXYquww2N49SJVm49uFKl49QDM9qWtGVueZA0QtIzCgNHT03xPy4OinxC0v2qw7SbLNzQOE6dqMYUBIXpNpcAXyR8aDk0xZBcHwdFFkZtX1Dtc6kUNzSOUwcKA/aqMNdpB2Cumb0QvxaOo/3rZcgrDKUosCqVj3eqOt5H4zh1IqchydLeHsCyk5TnAx1WR5D0PcJUjhWBz1de2urihsZx6kLuWZNZ2ttpn9k7JGxmlwCXSDqMMJfsyA6x6og3nRynHhhYW/aWg/nABonjgcCrZcKPI0yb6VK8RuN0Sx599NFSc8oalipNMZgGbCrp04TpMocAhyUDSNrUzJ6Lh18GnqOLcUPjOHWgWrO3zWyxpBOACYRZ9mPNbJakM4HpZnY7cIKk4cAnwD/p4mYTuKFxnPpQxdnbZnYncGeR22mJ/UrnoNUcNzSOUxeae9JkFm5oHKdetPAUBDc0jlMnrOvHzXUZbmgcpw6YGW1tS7q6GF2GGxrHqROtPHvbDY3j1Ak3NI7j1Bw3NI7j1JQwO7t5Fx/PInOuk6QlcQGdWZKelPRDScs9R0rSlVkL8aiExnaeuI2KEnrcRe7DFFQ0K0lrmTiSrpaUR1WiVHodtL4zwndKW7xVMWvL3JqVPDWajwpaRJLWBa4nrNR/etlYJTCz7BWaaxA3iaReZra4GmlVgWHAQoKUSS3jOF1MKzedKqqZmNmbBKXAE6Iu0SAFDenH4rYLLH3jTlK6xvPS2ooq1FwuintoXK7wKUnnJcIsTOwfKOnquH+1pAskTSTIioyWNDam+YKkExPxvqGgA/6EpMujdtLRki5MhDlG0gWlwkf3bylIA08mqFEWn88ggkTLSTHu7krRt86KE732kPRgPJcDE+FHKWh6z5B0Rplr+6tYY50qab3o1kGLvEz8dSU9Gve3kWSFskt6XkGvqZS2+XMK2lTE47kKUrlNRZUWvuqWVNwEMrMXYrx1gTeBvaNm8sFE+dpIB43nZDrqhOaypPUJsrGfJwjUbS8pz1T4zWJ+P4rHnwH+k7Bq2ekKwmmfjeeya6zJLSHoMY0D9lPQqQb4FnBVqfAKsrpnxPPeO16HZTCzecBlBP3uIWY2hRR96xxxAPoTZHL3Bc6N12kfYNN4fkOA7STtkXJdVgWmmtk2BGG6Y6J7By3yUhc2voR6S1qdIIc8Hdhd0kbAm2b2YVp6Udv8j7QL5w0HnjSzYl0tJB0rabqWXRiqm2DedFoOCvPzVwAuVlBoXEL4IRdI03hOKhvuRE7N5RS2ByZF8TEkXUdQarwtI95NBRndyF/NbBGwSNKbwHrAF4DtgGmxXCsTfij/knQvsK+kp4EVzGymwkzaDuEJq54lyzi+6PqUYnn1rW+LP9rZiZrHPnF7PB73IRie+4ri/hv4S9x/lGAYobQWeSkeJBjWPYCzgRGEZ6VgDEulNxb4M/C/wFEEjfMOJCVxJXWr17+1uCRuxYZG0n8QjMqbhH6aN4BtCLWcjxNB0zSel0mKnJrLacUo45e8m8X62/8qOi6lQ32Nmf0kJe0rgZ8Cc2j/MaSGjzWsajxZedNInosS/88xs8sz4n5i7b+C5L0qpUVeiimE2sxGBMPxY0L5C0YsNb2ol/6GpM8TDHQpWeBuTSsbmoqaTrEdfRlwcXww1wBei2/SbxLWx8hLZzSXHwb2lNQv9occCkyOfm9I+qzCl7GvVlCeAvcAByp0fCNprVj9x8weJqxudhhwQ0b4h4FhktaOza2DSuRXrI+dR986r6b2BOAoxb4vSQMK5cxJqhZ5Ge4jyAY/F5+Jd4AvEbTFs9K7ktCEurGo1tkkGNbWlrk1K3kMzcqx03EW8HeCUHuhU/FS4EhJUwnNguIaQ0k6oblsZvYa8BNgIkF7+jEz+3P0P5XwBr0XeC1veRKJzyb0Hd0Vy3U3of+jwI3AA2b2z3LhYxlHE5qDfwceK5HlHcBXEx27efSti+OUOpe7CF8JH5I0E7iZfAaqwGhKa5Gn5Tcv7haaZvcD7xauVUZ6txOadqnNpmbAaMvcmpVupb0dfyz7mVlWX0Ety/AXQkfsPV1VhmZE4WvihWZW0nAWha/rg2tmnVo3dPXV17ahQ0dkhps48fqm1N7uNouTS7obmNlVRkZSX0nPEsYVuZGpIgpqi7cQaqlNSaEzuFU/b3ebKQhmtnd2qJrm/y75vho5FWJm5xI/yTcvzW1Isug2NRrH6e60tS3J3PKgbO3tH0qanRj0uVHVT6ZC3NA4Tp2oRtNJ+bS3HweGxkGfN5N/LFbNcEPjOPXALN+WTR7t7YlxJDaEr7kDq3ouy4EbGsepA0ZYMzjrLwdp2tsDyoQ/Gvi/5S95deg2ncGO093JOZep
X9Fcrivi1IsCubS3IUz2BYYCe+YuZI1wQ+N0S6Qe9O69al3y+vjj3ONQy5D7q9OCjHE0ubS3FZQqfwbsGefzdSluaBynTtRRe3tb4HJgRJxV3+W4oXGcOhD6ejtvaHJqb48hTOe4Ka4o8A8z26/TmXcCNzSOUxeqN2Avh/b28KpkVEXc0DhOvWjhkcFuaBynTrgkruM4NaeV5zq5oXGcOuDa247j1AWv0TiOU3Pc0DiOU3Na2dBUdVKlXD63KqiB5XPL5HOQpKcVBPqQdENcD+UkSWfGIfEtjIG1ZW9NSrVrNC6fW1uG0bjyuUcDx5vZREmfAnYxsy5fcKlRMIO2JjYkWdRsmQiXz21O+dy0aynpNIJK5mWSxhCUMtZNlHNpLUrS9jHvJ+N1WC1eszGJvL9T+s52X1p5zeCarkfj8rnNJZ9b6lqa2ZmEe3K4mY0C9gOeL8oTSSsC44EfWJDfHQ58RKgNvWdm2xNUSI9RmDTYRLgkbq1x+dzmkc9d3mtZYHOC4OA0ADN7P6azDzA4UbtaI+a9jOKFpGMJtWTi9etWNHONJYuaGhq5fG6zyed29tct0sso4PtmNqFcZEtob/fo0bPb/Wpb2dDUrOkkl89tRvncctcyD3OA9SVtH/NYTVKvmPd3C01NSZtJqs+qVnXCdZ2qy8qx6bMCsJhQnb8g+l0K3CLpIIKUbUXyuZJGEuRzV4rOPweezY5qr0kqyOcKuNM6yue+DDxFaC7kxsxmSyrI4fYAPgG+B7wUg9wIDLGEfG5aeDObKmk0oTn4GkE+N80Q3wHcLGl/4PsE+dyxkkYBbxH6grLilDqXu2If0kOxWbKQoKP9ZiJMuWuZiZn9W9LBwEWSVib0zwwn1P4GAY8pZP4WkKcfrRthNKWkeE66lSRuJcjlc5uaHj16Wj2X8mxrW9KpZmPv3qvahht+NjPcc8896pK43QW5fK7TgHjTqckwl891Go7mNiRZNKWhcZxGo1prBndX3NA4Tp1o5RpNU/bROE7jYVhbW+aWB0kjJD0jaa6kU1P891CYqrNYNZhAuzy4oXGcOlENSdw4fukS4IuE0fKHquMqBf8ARhImNTcE3nRynDpRpT6aHYC5cR4hksYB+wOz2/OxedGvYTqF3NA4Th0ojAzOQZb29gDCANMC8wlz5RoaNzROt6RHjx6stNIqdcnr3//+qAqpVE17O23gYMP3MruhcZw6USXt7fmEOXQFBgKvViPhWuKGxnHqRJX6aKYBm8b1el4hTKo9rBoJ1xL/6uQ49SB00mRvmcnYYuAEwoz3p4EbzWyWwrrM+8HSVQznE1YBuFzSrBqeWS68RuM4dcConiSumd0J3FnkdlpifxqhSdUwuKFxnDrRyiOD3dA4Tp3wuU6O49QYq9ZXp26JGxrHqQMVDNhrStzQOE6daGVD01Cft5UQdEvxGxaXxux2KIjnPVXCvaIxEMVxVEI+t4L0UiWFM/J/Ku5323tSf1pbErehDE0LMojKB1stTxynAajG7O3uSsMZGgXGKEiuzoyr5hfoo3Tp3HmSzohrcMyU9JmUdEdKuk3SHZJelHSCpB9KelzSVElrxXBD4vEMSbdKWjNKsjySSGuQpBlxfztJkyU9KmmCgupkwf1JSQ8RlBHSOBfYXUE69iRJvSVdFc/hcUl7ZcWJbutL+puk5yQtFZFTfhnhgxTkaZ9VlM1VCQnjUsQy9433721JR0T3ayUNL5Ve9N8/kc51hYFnzUYrrxnccIaGoLw4hCA0NxwYU/jxUl46d0GUy/0dcHKJtLci1AZ2AH4FfGhm2xJkTo6IYf4A/DjKzM4ETjezp4EVFQTxIMja3qigQ3QRcKCZbQeMjelCEI070cx2LnOupwJTonTshUSDZGZbEzSTrpFULGxXHId4vQ4GtgYOlrSBKpMR7mVmOxCu7enRrZyEcRoPEO7HlsALQEHneyeCLlep9K4kysRIWgPYhaLBaM2AmdHWtiRza1YasTN4N+CGKEf7hoLo/fbA+5SXzv1T/P8o7TKxxUw0sw+ADyS9R9A8gmBQBscHva+ZFUTRrgFuivs3Al8n1CgOjtvmBON1d6xc9QReS0nnWsJCRXnO/SIAM5sj6SXCIuczMuLdY2bvAUiaDWwE9CW/jHDy2g2K++UkjNOYQpDHfYlg7I+VNAB4x8wWxmvSIT0zmyzpEgWxuv8CbonD7DugZSRxG/EdWZ5mrrFk0YiGppx+Tjnp3EUl3EvFb0sct5WJU2A8cJOkPxGE6Z6TtDUwq7jWIqkvyzd1f3m1g0rJ9eaVEU67didRWsI4jfsINbINgZ8RlD8PJBigrPSuJShtHgIcVSqDpCRur14rdLtfbSsbmkZ8LdxHqP73VJDV3QN4JCNOVYi1gn8W+ikI0r2To9/zhB/iLwhGB+AZYB1JOwNIWkHSllFu5T1Ju8Vwh5fIsliu9r5CWAXJ3w1jHuXilKIzMsJQoYSxmb0M9AM2jau/3U9owhYMTbn0riY02zCzLp8AWCu8j6YBUNBgXgTcSmgqPAncC5xiZq/XsShHEvqFZhD6Ps5M+I0nyMTeCEHilfDWPk/Sk8AThD4GCP0Ol8TO4FIrJ80AFsdO45MIssE9FVQ2xwMjzWxRRpxUzOwtwrqxN8RzmQp06CQvw6XAkZKmEpo5eSSMH6ZdpngKYTW4QtO2ZHpm9gZhJvJVFZSv+1GF2dvdlYaRxJW0DfD72CnptBCSViH0k32u0NeURa9eK9hqq61d24JFPvjgbRYv/qRTkrhBwjd7RcCPPlrokri1QtJxwA2EryROCyFpODAHuCivkemutHLTqWFqNI5TCd2xRrPiisUjFTqyaNGHTVmjacSvTo7ThDR3jSULNzSOUyd8PRrHcWqKLxPhOE4dMK/ROI5Te9zQOI5Tc7zp5DjdjCVLFi949903XqpTdhtVIY0JZtYvR7gFVcir4fBxNI7j1JyGGBnsOE5z44bGcZya44bGcZya44bGcZya44bGcZya44bGcZya44bGcZya44bGcZya44bGcZya8/8BawO7zOHg89UAAAAASUVORK5CYII=\n",
227 | "text/plain": [
228 | ""
229 | ]
230 | },
231 | "metadata": {
232 | "needs_background": "light"
233 | },
234 | "output_type": "display_data"
235 | }
236 | ],
237 | "source": [
238 | "# Create example manually (dataset is re-sorted by length)\n",
239 | "example_id = 8\n",
240 | "example = test_data.dataset[example_id]\n",
241 | "X = namedtuple('X', ['story', 'query'])\n",
242 | "x = X(STORY.process([example.story]), QUERY.process([example.query]))\n",
243 | "\n",
244 | "model.eval()\n",
245 | "with torch.no_grad():\n",
246 | " y_hat, attention = model(x)\n",
247 | "\n",
248 | "# Plot attention heatmap\n",
249 | "fig, ax = plt.subplots()\n",
250 | "cax = ax.matshow(attention[0, :len(example.story)].numpy(), cmap='bone')\n",
251 | "story = [' '.join(s) for s in example.story]\n",
252 | "ax.set_title('Q: ' + ' '.join(example.query) + '? A: ' + example.answer[0])\n",
253 | "ax.set_yticks(torch.arange(len(story)))\n",
254 | "ax.set_yticklabels(story)\n",
255 | "ax.xaxis.set_major_locator(ticker.NullLocator())\n",
256 | "fig.colorbar(cax)\n",
257 | "display(plt.gcf())\n",
258 | "clear_output(wait=True)"
259 | ]
260 | }
261 | ],
262 | "metadata": {
263 | "kernelspec": {
264 | "display_name": "Python 3",
265 | "language": "python",
266 | "name": "python3"
267 | },
268 | "language_info": {
269 | "codemirror_mode": {
270 | "name": "ipython",
271 | "version": 3
272 | },
273 | "file_extension": ".py",
274 | "mimetype": "text/x-python",
275 | "name": "python",
276 | "nbconvert_exporter": "python",
277 | "pygments_lexer": "ipython3",
278 | "version": "3.7.0"
279 | }
280 | },
281 | "nbformat": 4,
282 | "nbformat_minor": 2
283 | }
284 |
--------------------------------------------------------------------------------
/pytorch-tutorial/assets/panda.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/pukkapies/dl-imperial-maths/07b4a64026e855932ff8c38814946fcc9e4fd9f7/pytorch-tutorial/assets/panda.png
--------------------------------------------------------------------------------
/pytorch-tutorial/assets/starry-night.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/pukkapies/dl-imperial-maths/07b4a64026e855932ff8c38814946fcc9e4fd9f7/pytorch-tutorial/assets/starry-night.jpg
--------------------------------------------------------------------------------
/pytorch-tutorial/todo.txt:
--------------------------------------------------------------------------------
1 | Normalising flows (https://lilianweng.github.io/lil-log/2018/10/13/flow-based-deep-generative-models.html)?
2 | Transformer for text?
3 | Language modelling with RNNs?
4 | Graph networks? Dealing with other structured data (even just sets)?
5 | Object detection?
6 | DQN and DDPG/SVG?
7 | ES/GA for black box optimisation?
--------------------------------------------------------------------------------
/tensorflow-tutorial/week_1/Week_1.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Week 1: Tensorflow Basics"
8 | ]
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": null,
13 | "metadata": {},
14 | "outputs": [],
15 | "source": [
16 | "import tensorflow as tf"
17 | ]
18 | },
19 | {
20 | "cell_type": "markdown",
21 | "metadata": {},
22 | "source": [
23 | "## Build and execute a simple graph"
24 | ]
25 | },
26 | {
27 | "cell_type": "code",
28 | "execution_count": null,
29 | "metadata": {},
30 | "outputs": [],
31 | "source": [
32 | "x = tf.constant([1, 2])\n",
33 | "y = tf.constant([4, 5])\n",
34 | "\n",
35 | "z = tf.multiply(x, y)\n",
36 | "\n",
37 | "print(z)"
38 | ]
39 | },
40 | {
41 | "cell_type": "code",
42 | "execution_count": null,
43 | "metadata": {},
44 | "outputs": [],
45 | "source": [
46 | "x = tf.constant([1, 2, 3])\n",
47 | "y = tf.constant([4, 5, 6])\n",
48 | "\n",
49 | "z = tf.multiply(x, y)\n",
50 | "\n",
51 | "sess = tf.Session()\n",
52 | "\n",
53 | "print(sess.run(z))\n",
54 | "\n",
55 | "sess.close()"
56 | ]
57 | },
58 | {
59 | "cell_type": "code",
60 | "execution_count": null,
61 | "metadata": {},
62 | "outputs": [],
63 | "source": [
64 | "x = tf.constant([1, 2, 3])\n",
65 | "y = tf.constant([4, 5, 6])\n",
66 | "\n",
67 | "print(x)"
68 | ]
69 | },
70 | {
71 | "cell_type": "code",
72 | "execution_count": null,
73 | "metadata": {},
74 | "outputs": [],
75 | "source": [
76 | "output1 = tf.multiply(x, y)\n",
77 | "\n",
78 | "with tf.Session() as sess:\n",
79 | " output = sess.run(output1)\n",
80 | " print(output)"
81 | ]
82 | },
83 | {
84 | "cell_type": "markdown",
85 | "metadata": {},
86 | "source": [
87 | "## Load and inspect data"
88 | ]
89 | },
90 | {
91 | "cell_type": "code",
92 | "execution_count": null,
93 | "metadata": {},
94 | "outputs": [],
95 | "source": [
96 | "import pandas as pd\n",
97 | "import numpy as np\n",
98 | "\n",
99 | "def load_space_csv_data(file_name):\n",
100 | " df = pd.read_csv(file_name, delim_whitespace=True)\n",
101 | " cols = list(df.columns.values)\n",
102 | " return df, cols\n",
103 | "\n",
104 | "df, cols = load_space_csv_data('lung_function.txt')\n",
105 | "print(cols)"
106 | ]
107 | },
108 | {
109 | "cell_type": "code",
110 | "execution_count": null,
111 | "metadata": {},
112 | "outputs": [],
113 | "source": [
114 | "df.head()"
115 | ]
116 | },
117 | {
118 | "cell_type": "code",
119 | "execution_count": null,
120 | "metadata": {},
121 | "outputs": [],
122 | "source": [
123 | "df['age']"
124 | ]
125 | },
126 | {
127 | "cell_type": "code",
128 | "execution_count": null,
129 | "metadata": {},
130 | "outputs": [],
131 | "source": [
132 | "type(df['age'])"
133 | ]
134 | },
135 | {
136 | "cell_type": "code",
137 | "execution_count": null,
138 | "metadata": {},
139 | "outputs": [],
140 | "source": [
141 | "age = df['age'].values\n",
142 | "print(type(age))"
143 | ]
144 | },
145 | {
146 | "cell_type": "code",
147 | "execution_count": null,
148 | "metadata": {},
149 | "outputs": [],
150 | "source": [
151 | "age.dtype"
152 | ]
153 | },
154 | {
155 | "cell_type": "code",
156 | "execution_count": null,
157 | "metadata": {},
158 | "outputs": [],
159 | "source": [
160 | "age.shape"
161 | ]
162 | },
163 | {
164 | "cell_type": "markdown",
165 | "metadata": {},
166 | "source": [
167 | "FEV = forced exhalation volume: a measure of how much air somebody can forcibly exhale from their lungs"
168 | ]
169 | },
170 | {
171 | "cell_type": "code",
172 | "execution_count": null,
173 | "metadata": {},
174 | "outputs": [],
175 | "source": [
176 | "age_fev = np.column_stack((df['age'].values, df['FEV'].values))\n",
177 | "age_fev.shape"
178 | ]
179 | },
180 | {
181 | "cell_type": "code",
182 | "execution_count": null,
183 | "metadata": {},
184 | "outputs": [],
185 | "source": [
186 | "age_fev.dtype"
187 | ]
188 | },
189 | {
190 | "cell_type": "markdown",
191 | "metadata": {},
192 | "source": [
193 | "## Creating trainable variables"
194 | ]
195 | },
196 | {
197 | "cell_type": "code",
198 | "execution_count": null,
199 | "metadata": {},
200 | "outputs": [],
201 | "source": [
202 | "a = tf.Variable(2.0, name='a')\n",
203 | "print(a)"
204 | ]
205 | },
206 | {
207 | "cell_type": "code",
208 | "execution_count": null,
209 | "metadata": {},
210 | "outputs": [],
211 | "source": [
212 | "output2 = tf.add(x, a)\n",
213 | "# output2 = tf.add(tf.cast(x, tf.float32), a)\n",
214 | "print(output2)"
215 | ]
216 | },
217 | {
218 | "cell_type": "code",
219 | "execution_count": null,
220 | "metadata": {},
221 | "outputs": [],
222 | "source": [
223 | "with tf.Session() as sess:\n",
224 | " output = sess.run(output2)\n",
225 | " print(output)"
226 | ]
227 | },
228 | {
229 | "cell_type": "markdown",
230 | "metadata": {},
231 | "source": [
232 | "### Initializing variables"
233 | ]
234 | },
235 | {
236 | "cell_type": "code",
237 | "execution_count": null,
238 | "metadata": {},
239 | "outputs": [],
240 | "source": [
241 | "with tf.Session() as sess:\n",
242 | " init_op = tf.global_variables_initializer()\n",
243 | " sess.run(init_op)\n",
244 | " output = sess.run(output2)\n",
245 | " print(output)"
246 | ]
247 | },
248 | {
249 | "cell_type": "code",
250 | "execution_count": null,
251 | "metadata": {
252 | "collapsed": true
253 | },
254 | "outputs": [],
255 | "source": [
256 | "b = tf.Variable(tf.random_normal([2, 2], stddev=0.1),\n",
257 | " name=\"b\")"
258 | ]
259 | },
260 | {
261 | "cell_type": "code",
262 | "execution_count": null,
263 | "metadata": {},
264 | "outputs": [],
265 | "source": [
266 | "with tf.Session() as sess:\n",
267 | " init_op = tf.global_variables_initializer()\n",
268 | " sess.run(init_op)\n",
269 | " output = sess.run(b)\n",
270 | " print(output)"
271 | ]
272 | },
273 | {
274 | "cell_type": "markdown",
275 | "metadata": {},
276 | "source": [
277 | "### `tf.get_variable`"
278 | ]
279 | },
280 | {
281 | "cell_type": "code",
282 | "execution_count": null,
283 | "metadata": {},
284 | "outputs": [],
285 | "source": [
286 | "with tf.variable_scope('layer1'):\n",
287 | " b = tf.get_variable(\"b\", initializer=tf.random_normal([2, 2], stddev=0.1))\n",
288 | " \n",
289 | "print(b)"
290 | ]
291 | },
292 | {
293 | "cell_type": "code",
294 | "execution_count": null,
295 | "metadata": {},
296 | "outputs": [],
297 | "source": [
298 | "with tf.Session() as sess:\n",
299 | " init_op = tf.global_variables_initializer()\n",
300 | " sess.run(init_op)\n",
301 | " output = sess.run(b)\n",
302 | " print(output)"
303 | ]
304 | },
305 | {
306 | "cell_type": "code",
307 | "execution_count": null,
308 | "metadata": {},
309 | "outputs": [],
310 | "source": [
311 | "print(tf.global_variables())"
312 | ]
313 | },
314 | {
315 | "cell_type": "code",
316 | "execution_count": null,
317 | "metadata": {},
318 | "outputs": [],
319 | "source": [
320 | "with tf.variable_scope('layer1'):\n",
321 | " b = tf.get_variable('b', shape=(2, 2), initializer=tf.random_normal_initializer())"
322 | ]
323 | },
324 | {
325 | "cell_type": "code",
326 | "execution_count": null,
327 | "metadata": {
328 | "collapsed": true
329 | },
330 | "outputs": [],
331 | "source": [
332 | "with tf.variable_scope('layer1', reuse=True):\n",
333 | " b = tf.get_variable('b', shape=(2, 2), initializer=tf.random_normal_initializer())"
334 | ]
335 | },
336 | {
337 | "cell_type": "code",
338 | "execution_count": null,
339 | "metadata": {},
340 | "outputs": [],
341 | "source": [
342 | "with tf.Session() as sess:\n",
343 | " init_op = tf.global_variables_initializer()\n",
344 | " sess.run(init_op)\n",
345 | " output = sess.run(b)\n",
346 | " print(output)"
347 | ]
348 | },
349 | {
350 | "cell_type": "code",
351 | "execution_count": null,
352 | "metadata": {},
353 | "outputs": [],
354 | "source": [
355 | "print(tf.global_variables())"
356 | ]
357 | },
358 | {
359 | "cell_type": "markdown",
360 | "metadata": {},
361 | "source": [
362 | "## Placeholders"
363 | ]
364 | },
365 | {
366 | "cell_type": "code",
367 | "execution_count": null,
368 | "metadata": {},
369 | "outputs": [],
370 | "source": [
371 | "c = tf.placeholder(tf.float32, shape=(2,), name='input')\n",
372 | "print(c)"
373 | ]
374 | },
375 | {
376 | "cell_type": "code",
377 | "execution_count": null,
378 | "metadata": {},
379 | "outputs": [],
380 | "source": [
381 | "with tf.Session() as sess:\n",
382 | " output = sess.run(c)\n",
383 | " print(output)"
384 | ]
385 | },
386 | {
387 | "cell_type": "code",
388 | "execution_count": null,
389 | "metadata": {},
390 | "outputs": [],
391 | "source": [
392 | "feed_dict = {c: np.array([3, 4])}\n",
393 | "\n",
394 | "with tf.Session() as sess:\n",
395 | " output = sess.run(c, feed_dict=feed_dict)\n",
396 | " print(output)"
397 | ]
398 | },
399 | {
400 | "cell_type": "code",
401 | "execution_count": null,
402 | "metadata": {},
403 | "outputs": [],
404 | "source": [
405 | "mat_inv = tf.matrix_inverse(b)\n",
406 | "mat_vec_multiply = tf.matmul(mat_inv, tf.expand_dims(c, axis=1))\n",
407 | "print(mat_vec_multiply)"
408 | ]
409 | },
410 | {
411 | "cell_type": "code",
412 | "execution_count": null,
413 | "metadata": {},
414 | "outputs": [],
415 | "source": [
416 | "squeezed = tf.squeeze(mat_vec_multiply)\n",
417 | "print(squeezed)"
418 | ]
419 | },
420 | {
421 | "cell_type": "code",
422 | "execution_count": null,
423 | "metadata": {},
424 | "outputs": [],
425 | "source": [
426 | "feed_dict = {c: np.array([1, 1])}\n",
427 | "\n",
428 | "with tf.Session() as sess:\n",
429 | " init_op = tf.global_variables_initializer()\n",
430 | " sess.run(init_op)\n",
431 | " output = sess.run([squeezed, mat_inv], feed_dict=feed_dict)\n",
432 | " print(output[0])\n",
433 | " print(output[1])"
434 | ]
435 | },
436 | {
437 | "cell_type": "code",
438 | "execution_count": null,
439 | "metadata": {
440 | "collapsed": true
441 | },
442 | "outputs": [],
443 | "source": []
444 | }
445 | ],
446 | "metadata": {
447 | "kernelspec": {
448 | "display_name": "Python 3",
449 | "language": "python",
450 | "name": "python3"
451 | },
452 | "language_info": {
453 | "codemirror_mode": {
454 | "name": "ipython",
455 | "version": 3
456 | },
457 | "file_extension": ".py",
458 | "mimetype": "text/x-python",
459 | "name": "python",
460 | "nbconvert_exporter": "python",
461 | "pygments_lexer": "ipython3",
462 | "version": "3.6.3"
463 | }
464 | },
465 | "nbformat": 4,
466 | "nbformat_minor": 2
467 | }
468 |
--------------------------------------------------------------------------------
/tensorflow-tutorial/week_1/lung_function.txt:
--------------------------------------------------------------------------------
1 | age FEV ht sex smoke
2 | 9 1.7080 57.0 0 0
3 | 8 1.7240 67.5 0 0
4 | 7 1.7200 54.5 0 0
5 | 9 1.5580 53.0 1 0
6 | 9 1.8950 57.0 1 0
7 | 8 2.3360 61.0 0 0
8 | 6 1.9190 58.0 0 0
9 | 6 1.4150 56.0 0 0
10 | 8 1.9870 58.5 0 0
11 | 9 1.9420 60.0 0 0
12 | 6 1.6020 53.0 0 0
13 | 8 1.7350 54.0 1 0
14 | 8 2.1930 58.5 0 0
15 | 8 2.1180 60.5 1 0
16 | 8 2.2580 58.0 1 0
17 | 7 1.9320 53.0 1 0
18 | 5 1.4720 50.0 1 0
19 | 6 1.8780 53.0 0 0
20 | 9 2.3520 59.0 1 0
21 | 9 2.6040 61.5 1 0
22 | 5 1.4000 49.0 0 0
23 | 5 1.2560 52.5 0 0
24 | 4 0.8390 48.0 0 0
25 | 7 2.5780 62.5 1 0
26 | 9 2.9880 65.0 0 0
27 | 3 1.4040 51.5 1 0
28 | 9 2.3480 60.0 1 0
29 | 5 1.7550 52.0 1 0
30 | 8 2.9800 60.0 0 0
31 | 9 2.1000 60.0 0 0
32 | 5 1.2820 49.0 0 0
33 | 9 3.0000 65.5 1 0
34 | 8 2.6730 60.0 0 0
35 | 7 2.0930 57.5 0 0
36 | 5 1.6120 52.0 0 0
37 | 8 2.1750 59.0 0 0
38 | 9 2.7250 59.0 1 0
39 | 8 2.0710 55.0 1 0
40 | 8 1.5470 57.0 1 0
41 | 8 2.0040 57.0 1 0
42 | 9 3.1350 60.0 0 0
43 | 8 2.4200 59.0 1 0
44 | 5 1.7760 51.0 1 0
45 | 8 1.9310 57.0 0 0
46 | 5 1.3430 50.0 0 0
47 | 9 2.0760 57.0 0 0
48 | 7 1.6240 54.0 1 0
49 | 8 1.3440 52.5 0 0
50 | 6 1.6500 55.0 1 0
51 | 8 2.7320 60.5 1 0
52 | 5 2.0170 54.5 1 0
53 | 9 2.7970 61.5 0 0
54 | 9 3.5560 62.0 1 0
55 | 8 1.7030 54.5 1 0
56 | 6 1.6340 54.0 1 0
57 | 9 2.5700 57.0 1 0
58 | 9 3.0160 62.5 0 0
59 | 7 2.4190 60.0 0 0
60 | 4 1.5690 50.0 0 0
61 | 8 1.6980 57.5 0 0
62 | 8 2.1230 60.0 1 0
63 | 8 2.4810 60.0 0 0
64 | 6 1.4810 51.0 0 0
65 | 4 1.5770 49.0 0 0
66 | 8 1.9400 59.0 1 0
67 | 6 1.7470 57.5 1 0
68 | 9 2.0690 58.0 1 0
69 | 7 1.6310 55.5 0 0
70 | 5 1.5360 52.0 0 0
71 | 9 2.5600 60.5 0 0
72 | 8 1.9620 57.0 1 0
73 | 8 2.5310 58.0 0 0
74 | 9 2.7150 60.0 1 0
75 | 9 2.4570 59.0 1 0
76 | 9 2.0900 59.5 1 0
77 | 7 1.7890 56.0 1 0
78 | 5 1.8580 53.0 1 0
79 | 5 1.4520 51.0 1 0
80 | 9 3.8420 69.0 1 0
81 | 6 1.7190 53.0 0 0
82 | 7 2.1110 57.0 0 0
83 | 6 1.6950 53.0 0 0
84 | 8 2.2110 63.0 1 0
85 | 8 1.7940 54.5 1 0
86 | 7 1.9170 58.0 0 0
87 | 8 2.1440 63.0 0 0
88 | 7 1.2530 52.0 1 0
89 | 9 2.6590 61.5 1 0
90 | 5 1.5800 52.5 1 0
91 | 9 2.1260 62.0 1 0
92 | 9 3.0290 61.5 0 0
93 | 9 2.9640 64.5 1 0
94 | 7 1.6110 57.5 1 0
95 | 8 2.2150 60.0 0 0
96 | 8 2.3880 60.0 0 0
97 | 9 2.1960 61.0 1 0
98 | 9 1.7510 58.0 1 0
99 | 9 2.1650 61.5 1 0
100 | 7 1.6820 55.0 1 0
101 | 8 1.5230 55.0 1 0
102 | 8 1.2920 52.0 0 0
103 | 7 1.6490 54.0 1 0
104 | 9 2.5880 63.0 1 0
105 | 4 0.7960 47.0 1 0
106 | 9 2.5740 60.5 0 0
107 | 6 1.9790 56.0 1 0
108 | 8 2.3540 58.5 1 0
109 | 6 1.7180 55.0 1 0
110 | 7 1.7420 58.5 0 0
111 | 7 1.6030 51.0 0 0
112 | 8 2.6390 59.5 0 0
113 | 7 1.8290 54.0 0 0
114 | 7 2.0840 58.0 1 0
115 | 7 2.2200 58.0 1 0
116 | 7 1.4730 52.5 0 0
117 | 8 2.3410 60.5 0 0
118 | 7 1.6980 54.5 0 0
119 | 5 1.1960 46.5 0 0
120 | 8 1.8720 56.5 0 0
121 | 7 2.2190 55.0 1 0
122 | 9 2.4200 57.0 1 0
123 | 7 1.8270 54.5 0 0
124 | 7 1.4610 54.0 0 0
125 | 6 1.3380 53.0 1 0
126 | 8 2.0900 57.0 1 0
127 | 8 1.6970 59.0 0 0
128 | 8 1.5620 55.0 1 0
129 | 9 2.0400 55.5 0 0
130 | 7 1.6090 51.5 0 0
131 | 8 2.4580 61.0 0 0
132 | 9 2.6500 63.5 1 0
133 | 8 1.4290 57.5 1 0
134 | 8 1.6750 53.0 1 0
135 | 9 1.9470 56.5 0 0
136 | 8 2.0690 54.0 1 0
137 | 6 1.5720 52.0 1 0
138 | 6 1.3480 53.0 1 0
139 | 8 2.2880 61.5 0 0
140 | 9 1.7730 58.5 1 0
141 | 5 0.7910 52.0 0 0
142 | 7 1.9050 58.0 1 0
143 | 9 2.4630 61.0 0 0
144 | 6 1.4310 51.0 1 0
145 | 9 2.6310 62.0 0 0
146 | 9 3.1140 64.5 1 0
147 | 9 2.1350 58.5 1 0
148 | 6 1.5270 52.5 1 0
149 | 8 2.2930 58.0 0 0
150 | 9 3.0420 66.0 0 0
151 | 8 2.9270 63.5 1 0
152 | 8 2.6650 64.0 0 0
153 | 9 2.3010 58.5 1 0
154 | 9 2.4600 64.0 1 0
155 | 9 2.5920 60.5 0 0
156 | 7 1.7500 55.0 0 0
157 | 8 1.7590 53.0 1 0
158 | 6 1.5360 48.0 1 0
159 | 9 2.2590 58.5 0 0
160 | 9 2.0480 64.5 0 0
161 | 9 2.5710 60.5 1 0
162 | 7 2.0460 56.0 1 0
163 | 8 1.7800 58.5 0 0
164 | 5 1.5520 54.0 0 0
165 | 8 1.9530 58.0 0 0
166 | 9 2.8930 64.5 1 0
167 | 6 1.7130 50.5 1 0
168 | 9 2.8510 60.0 0 0
169 | 6 1.6240 51.5 1 0
170 | 8 2.6310 59.0 1 0
171 | 5 1.8190 53.0 1 0
172 | 7 1.6580 53.0 1 0
173 | 7 2.1580 53.5 1 0
174 | 4 1.7890 52.0 1 0
175 | 9 3.0040 64.0 0 0
176 | 8 2.5030 63.0 1 0
177 | 9 1.9330 58.0 0 0
178 | 9 2.0910 58.5 0 0
179 | 9 2.3160 59.5 0 0
180 | 5 1.7040 51.0 0 0
181 | 9 1.6060 57.5 0 0
182 | 7 1.1650 47.0 1 0
183 | 6 2.1020 55.5 0 0
184 | 9 2.3200 57.0 0 0
185 | 9 2.2300 61.0 1 0
186 | 9 1.7160 55.5 1 0
187 | 7 1.7900 53.5 1 0
188 | 5 1.1460 50.0 0 0
189 | 8 2.1870 61.5 0 0
190 | 9 2.7170 61.5 1 0
191 | 7 1.7960 55.0 1 0
192 | 9 1.9530 58.0 1 1
193 | 8 1.3350 56.5 0 0
194 | 9 2.1190 57.0 1 0
195 | 6 1.6660 52.0 1 0
196 | 6 1.8260 52.5 1 0
197 | 8 2.7090 62.5 0 0
198 | 9 2.8710 65.0 1 0
199 | 5 1.0920 50.0 0 0
200 | 6 2.2620 57.5 1 0
201 | 6 2.1040 56.5 1 0
202 | 9 2.1660 57.5 0 0
203 | 7 1.6900 54.0 0 0
204 | 9 2.9730 59.5 1 0
205 | 8 2.1450 59.5 0 0
206 | 5 1.9710 58.0 1 0
207 | 7 2.0950 57.0 0 0
208 | 6 1.6970 55.0 0 0
209 | 9 2.4550 60.0 0 0
210 | 7 1.9200 56.5 1 0
211 | 9 2.1640 60.0 1 0
212 | 9 2.1300 59.0 0 0
213 | 8 2.9930 63.0 0 0
214 | 9 2.5290 59.0 0 0
215 | 7 1.7260 53.0 0 0
216 | 9 2.4420 61.5 0 0
217 | 4 1.1020 48.0 0 0
218 | 9 2.0560 63.0 0 0
219 | 5 1.8080 55.5 1 0
220 | 8 2.3050 64.5 0 0
221 | 9 1.9690 59.0 0 0
222 | 8 1.5560 58.5 0 0
223 | 3 1.0720 46.0 0 0
224 | 9 2.0420 62.0 1 0
225 | 8 1.5120 53.0 0 0
226 | 6 1.4230 49.5 1 0
227 | 9 3.6810 68.0 1 0
228 | 8 1.9910 59.5 1 0
229 | 8 1.8970 55.5 1 0
230 | 7 1.3700 55.0 0 0
231 | 6 1.3380 51.5 0 0
232 | 8 2.0160 56.0 1 0
233 | 9 2.6390 63.0 0 0
234 | 4 1.3890 48.0 0 0
235 | 7 1.6120 56.5 1 0
236 | 8 2.1350 59.0 0 0
237 | 8 2.6810 60.5 1 0
238 | 9 3.2230 65.0 0 0
239 | 6 1.7960 55.0 0 0
240 | 8 2.0100 55.0 1 0
241 | 6 1.5230 51.0 0 0
242 | 8 1.7440 52.5 1 0
243 | 9 2.4850 64.0 0 0
244 | 8 2.3350 59.0 0 0
245 | 7 1.4150 53.5 0 0
246 | 9 2.0760 60.5 1 0
247 | 8 2.4350 59.5 1 0
248 | 7 1.7280 56.5 0 0
249 | 9 2.8500 63.0 0 0
250 | 8 1.8440 56.5 0 0
251 | 9 1.7540 61.5 0 0
252 | 6 1.3430 52.0 0 0
253 | 8 2.3030 57.0 1 0
254 | 9 2.2460 63.5 1 0
255 | 8 2.4760 63.0 0 0
256 | 9 3.2390 65.0 1 0
257 | 9 2.4570 61.5 1 0
258 | 8 2.3820 62.0 0 0
259 | 7 1.6400 55.0 0 0
260 | 5 1.5890 51.0 0 0
261 | 7 2.0560 54.0 1 0
262 | 8 2.2260 57.0 1 0
263 | 9 1.8860 56.0 0 0
264 | 9 2.8330 61.5 1 0
265 | 6 1.7150 53.0 1 0
266 | 8 2.6310 59.0 1 0
267 | 7 2.5500 56.0 1 0
268 | 9 1.9120 59.0 0 0
269 | 7 1.8770 52.5 0 0
270 | 7 1.9350 52.5 0 0
271 | 5 1.5390 50.0 0 0
272 | 9 2.8030 59.5 1 0
273 | 9 2.9230 64.0 1 0
274 | 8 2.3580 61.0 0 0
275 | 8 2.0940 57.5 1 0
276 | 9 1.8550 60.0 1 0
277 | 6 1.5350 55.0 0 0
278 | 7 2.1350 56.0 1 0
279 | 5 1.9300 51.0 1 0
280 | 9 2.1820 59.5 0 0
281 | 5 1.3590 50.5 1 0
282 | 7 2.0020 57.5 0 0
283 | 6 1.6990 54.0 1 0
284 | 8 2.5000 57.0 1 0
285 | 7 2.3660 58.0 0 0
286 | 8 2.0690 60.0 0 0
287 | 4 1.4180 49.0 0 0
288 | 8 2.3330 57.0 0 0
289 | 5 1.5140 52.0 1 0
290 | 8 1.7580 52.0 0 0
291 | 7 2.5350 59.5 1 0
292 | 7 2.5640 58.0 0 0
293 | 9 2.4870 64.0 0 0
294 | 9 1.5910 57.0 0 0
295 | 8 1.6240 53.0 1 0
296 | 9 2.7980 62.0 1 0
297 | 6 1.6910 53.0 1 0
298 | 8 1.9990 56.5 0 0
299 | 9 1.8690 57.0 1 0
300 | 4 1.0040 48.0 1 0
301 | 6 1.4270 49.5 1 0
302 | 7 1.8260 51.0 1 0
303 | 9 2.6880 59.5 0 0
304 | 8 1.6570 56.0 1 0
305 | 6 1.6720 54.0 0 0
306 | 8 2.0150 57.5 0 0
307 | 7 2.3710 55.5 0 0
308 | 5 2.1150 50.0 1 0
309 | 8 2.3280 60.0 0 0
310 | 7 1.4950 57.0 0 0
311 | 11 2.8840 69.0 1 0
312 | 10 2.3280 64.0 1 0
313 | 14 3.3810 63.0 1 0
314 | 11 2.1700 58.0 0 0
315 | 11 3.4700 66.5 1 0
316 | 12 3.0580 60.5 0 0
317 | 10 1.8110 57.0 1 0
318 | 11 2.5240 64.0 1 0
319 | 10 2.6420 61.0 0 0
320 | 14 3.7410 68.5 1 0
321 | 13 4.3360 69.5 1 0
322 | 14 4.8420 72.0 1 0
323 | 12 4.5500 71.0 1 0
324 | 12 2.8410 63.0 0 0
325 | 10 3.1660 61.5 0 0
326 | 13 3.8160 63.5 0 0
327 | 10 2.5610 62.0 1 0
328 | 11 3.6540 65.0 0 0
329 | 10 2.4810 61.0 1 0
330 | 11 2.6650 63.0 0 0
331 | 10 3.2030 66.0 1 0
332 | 13 3.5490 68.0 1 0
333 | 14 2.2360 66.0 0 1
334 | 11 3.2220 72.0 1 0
335 | 10 3.1110 66.0 1 0
336 | 11 3.4900 67.0 0 0
337 | 13 3.1470 64.0 0 0
338 | 10 2.5200 60.5 0 0
339 | 10 2.2920 63.0 1 0
340 | 12 2.8890 64.0 0 0
341 | 10 2.2460 60.5 1 0
342 | 10 1.9370 62.0 1 0
343 | 10 2.6460 60.0 1 0
344 | 11 2.9570 64.5 1 0
345 | 11 4.0070 67.0 1 0
346 | 11 2.3860 61.5 0 0
347 | 10 3.2510 66.0 1 0
348 | 11 2.7620 60.0 0 0
349 | 11 3.0110 64.0 0 0
350 | 13 4.3050 68.5 1 0
351 | 13 3.9060 67.0 1 0
352 | 11 3.5830 67.0 1 0
353 | 11 3.2360 66.0 0 0
354 | 14 3.4360 62.5 1 0
355 | 11 3.0580 61.0 1 0
356 | 10 3.0070 62.0 1 0
357 | 10 3.4890 66.5 1 0
358 | 10 2.8640 60.0 0 0
359 | 14 3.4280 64.0 0 1
360 | 13 2.8190 62.0 0 0
361 | 10 2.2500 58.0 0 0
362 | 14 4.6830 68.5 1 0
363 | 10 2.3520 61.5 1 0
364 | 11 3.1080 64.5 1 0
365 | 13 3.9940 67.0 1 0
366 | 12 4.3930 68.5 1 0
367 | 13 3.2080 61.0 0 1
368 | 10 2.5920 65.0 1 0
369 | 13 3.1930 70.0 1 0
370 | 11 1.6940 60.0 1 1
371 | 14 3.9570 72.0 1 1
372 | 11 2.3460 59.0 0 0
373 | 13 4.7890 69.0 1 1
374 | 11 3.5150 67.5 1 0
375 | 11 2.7540 65.5 0 0
376 | 10 2.7200 65.5 1 0
377 | 11 2.4630 64.5 1 0
378 | 11 2.6330 62.0 0 0
379 | 10 3.0480 65.5 0 0
380 | 11 3.1110 67.5 1 0
381 | 13 3.7450 68.0 0 0
382 | 12 2.3840 63.5 0 1
383 | 10 2.0940 58.5 1 0
384 | 10 3.1830 65.5 0 0
385 | 14 3.0740 65.0 0 1
386 | 11 3.9770 70.5 1 0
387 | 10 3.3540 63.0 1 0
388 | 11 3.4110 63.5 0 0
389 | 10 2.3870 66.0 0 1
390 | 11 3.1710 63.0 0 0
391 | 13 3.8870 67.5 1 0
392 | 13 2.6460 61.5 0 0
393 | 10 2.5040 60.0 0 0
394 | 11 3.5870 64.5 1 0
395 | 11 3.8450 68.5 1 0
396 | 12 2.9710 64.5 1 0
397 | 10 2.8910 61.0 0 0
398 | 10 1.8230 57.0 0 0
399 | 11 2.4170 62.5 1 0
400 | 10 2.1750 58.0 0 0
401 | 11 2.7350 62.5 0 0
402 | 14 4.2730 72.5 1 0
403 | 13 2.9760 65.5 1 0
404 | 12 3.8350 69.5 0 1
405 | 11 4.0650 66.5 1 0
406 | 11 2.3180 59.0 0 0
407 | 11 3.5960 68.0 1 0
408 | 14 3.3950 67.0 0 0
409 | 12 2.7510 63.0 0 0
410 | 10 2.6730 64.5 0 0
411 | 12 2.5560 62.0 0 0
412 | 11 2.5420 62.0 0 0
413 | 10 2.6080 66.0 1 0
414 | 11 2.3540 62.0 0 0
415 | 13 2.5990 62.5 0 1
416 | 10 1.4580 57.0 0 0
417 | 10 3.7950 68.5 1 0
418 | 11 2.4910 59.0 0 0
419 | 13 3.0600 61.5 0 0
420 | 10 2.5450 65.0 1 0
421 | 11 2.9930 66.5 1 0
422 | 10 3.3050 65.0 0 0
423 | 13 4.7560 68.0 1 1
424 | 11 3.7740 67.0 0 0
425 | 10 2.8550 64.5 1 0
426 | 11 2.9880 70.0 1 0
427 | 11 2.4980 60.0 1 0
428 | 14 3.1690 64.0 0 0
429 | 11 2.8870 62.5 1 0
430 | 13 2.7040 61.0 0 0
431 | 11 3.5150 64.0 0 0
432 | 11 3.4250 65.5 1 0
433 | 10 2.2870 61.0 0 0
434 | 13 2.4340 65.4 0 0
435 | 10 2.3650 63.5 0 0
436 | 13 3.0860 67.5 0 1
437 | 10 2.6960 66.0 1 0
438 | 12 2.8680 62.0 0 0
439 | 10 2.8130 61.5 0 0
440 | 14 4.3090 69.0 1 1
441 | 12 3.2550 66.0 0 0
442 | 10 3.4130 66.0 0 1
443 | 11 4.5930 69.0 1 0
444 | 14 4.1110 71.0 1 0
445 | 12 1.9160 60.5 1 0
446 | 10 1.8580 58.0 1 0
447 | 10 2.9750 63.0 0 1
448 | 10 3.3500 69.0 1 0
449 | 10 2.9010 59.5 1 0
450 | 12 2.2410 64.0 1 0
451 | 13 4.2250 74.0 1 0
452 | 11 3.2230 64.5 0 0
453 | 12 5.2240 70.0 1 0
454 | 11 4.0730 67.0 1 0
455 | 12 4.0800 64.5 1 0
456 | 11 2.6060 65.0 0 0
457 | 11 3.1690 62.5 0 1
458 | 12 4.4110 68.0 1 0
459 | 12 3.7910 68.5 1 0
460 | 13 3.0890 67.5 1 0
461 | 11 2.4650 60.0 1 0
462 | 12 3.3430 68.0 1 1
463 | 10 3.2000 65.0 1 0
464 | 12 2.9130 64.0 1 0
465 | 13 4.8770 73.0 1 0
466 | 10 2.3580 59.0 0 0
467 | 12 3.2790 70.5 1 0
468 | 10 2.5810 66.0 1 0
469 | 12 2.3470 61.5 0 0
470 | 10 2.6910 67.0 0 0
471 | 11 2.8270 62.5 0 0
472 | 10 1.8730 52.5 1 0
473 | 12 3.7510 72.0 1 1
474 | 14 2.5380 71.0 0 0
475 | 10 2.7580 65.5 1 0
476 | 10 3.0500 60.0 0 0
477 | 12 3.0790 60.0 0 0
478 | 10 2.2010 60.5 1 0
479 | 10 1.8580 59.0 1 0
480 | 13 2.2160 68.0 0 1
481 | 12 3.4030 62.0 0 0
482 | 12 3.5010 64.5 0 0
483 | 11 2.5780 63.0 0 0
484 | 13 3.0780 66.0 0 1
485 | 12 3.1860 67.0 0 1
486 | 10 1.6650 57.0 1 0
487 | 11 2.0810 63.0 0 0
488 | 11 2.9740 62.0 0 0
489 | 13 3.2970 65.0 0 1
490 | 12 4.0730 68.5 1 0
491 | 13 4.4480 69.0 1 0
492 | 13 3.9840 71.0 1 0
493 | 10 2.2500 58.0 0 0
494 | 12 2.7520 63.5 0 0
495 | 12 2.3040 66.5 1 1
496 | 14 3.6800 67.0 1 0
497 | 11 3.1020 64.0 0 1
498 | 10 2.8620 61.0 0 0
499 | 13 2.6770 67.0 0 1
500 | 11 3.0230 67.5 0 0
501 | 11 3.6810 68.0 0 0
502 | 13 3.2550 66.5 0 0
503 | 12 3.6920 67.0 1 0
504 | 10 2.3560 60.5 0 0
505 | 10 4.5910 70.0 1 0
506 | 12 3.0820 63.5 0 0
507 | 13 3.2970 65.0 0 1
508 | 11 3.2580 63.0 0 0
509 | 10 2.2160 61.0 1 0
510 | 11 3.2470 65.5 1 0
511 | 11 4.3240 67.5 1 0
512 | 11 2.3620 61.0 0 0
513 | 11 2.5630 63.0 0 0
514 | 11 3.2060 63.5 1 0
515 | 14 3.5850 70.0 1 0
516 | 12 4.7200 71.5 1 0
517 | 13 3.3310 65.5 0 0
518 | 13 5.0830 74.0 1 0
519 | 10 3.4980 68.0 1 1
520 | 12 2.4170 61.0 0 0
521 | 10 2.3640 61.0 1 0
522 | 10 2.3410 61.0 1 0
523 | 12 2.7590 61.5 0 1
524 | 11 2.9530 67.0 0 1
525 | 12 3.2310 63.0 1 0
526 | 11 3.0780 67.5 1 0
527 | 11 3.3690 70.5 1 0
528 | 12 3.5290 70.5 1 0
529 | 12 2.8660 62.0 0 0
530 | 14 2.8910 62.0 0 0
531 | 11 3.0220 61.5 0 0
532 | 10 3.1270 62.0 1 0
533 | 11 2.8660 60.5 0 0
534 | 12 2.6050 62.5 0 0
535 | 13 3.0560 63.0 0 0
536 | 12 2.5690 63.0 0 0
537 | 11 2.5010 62.0 0 0
538 | 11 3.3200 65.5 1 0
539 | 11 2.1230 65.0 1 0
540 | 14 3.7800 70.0 1 0
541 | 11 3.8470 66.0 1 0
542 | 13 3.7850 63.0 0 1
543 | 12 3.9240 68.0 1 0
544 | 10 2.1320 59.0 1 0
545 | 12 2.7520 68.5 1 0
546 | 13 2.4490 63.0 0 0
547 | 10 3.4560 63.0 1 0
548 | 10 3.0730 66.0 0 0
549 | 10 2.6880 62.0 0 0
550 | 10 3.3290 68.0 1 0
551 | 14 4.2710 72.5 1 0
552 | 12 3.5300 64.0 1 0
553 | 11 2.9280 65.5 1 0
554 | 11 2.6890 61.5 0 0
555 | 12 2.3320 57.0 1 0
556 | 14 2.9340 64.0 0 0
557 | 14 2.2760 66.0 1 1
558 | 10 3.1100 64.5 1 0
559 | 11 2.8940 67.0 1 0
560 | 11 4.6370 72.0 1 1
561 | 10 2.4350 65.0 0 0
562 | 10 2.8380 63.0 0 0
563 | 12 3.0350 62.0 0 0
564 | 12 4.8310 71.0 1 0
565 | 11 2.8120 61.0 1 0
566 | 12 2.7140 65.5 0 0
567 | 10 3.0860 62.0 0 0
568 | 12 3.5190 65.5 0 0
569 | 13 4.2320 70.5 1 0
570 | 10 2.7700 62.0 1 0
571 | 12 3.3410 65.5 0 0
572 | 10 3.0900 65.0 1 0
573 | 13 2.5310 61.0 1 0
574 | 12 2.8220 69.5 1 0
575 | 10 3.0380 65.0 0 1
576 | 12 2.9350 65.5 1 0
577 | 10 2.5680 63.5 0 0
578 | 11 2.3870 60.5 1 0
579 | 12 2.4990 65.0 1 0
580 | 11 4.1300 67.0 1 0
581 | 12 3.0010 63.5 0 0
582 | 10 3.1320 59.5 0 0
583 | 13 3.5770 63.5 0 0
584 | 12 3.2220 61.0 0 0
585 | 11 3.2800 66.0 1 0
586 | 11 2.6590 64.0 1 0
587 | 11 2.8220 62.0 0 0
588 | 11 2.1400 60.5 0 0
589 | 12 4.2030 71.0 1 0
590 | 14 2.9970 64.5 0 0
591 | 11 3.1200 61.0 0 1
592 | 11 2.5620 62.5 0 0
593 | 12 3.0820 64.5 0 0
594 | 14 3.8060 68.0 1 0
595 | 11 3.3390 68.5 1 1
596 | 13 3.1520 62.0 0 1
597 | 11 2.4580 60.0 0 0
598 | 10 2.3910 59.5 1 0
599 | 13 3.1410 61.0 0 0
600 | 12 2.5790 63.0 0 0
601 | 11 3.1040 67.5 0 1
602 | 13 4.0450 69.0 1 1
603 | 14 4.7630 68.0 1 1
604 | 10 2.1000 58.0 1 0
605 | 11 3.0690 65.0 0 1
606 | 11 2.7850 69.0 1 0
607 | 15 4.2840 70.0 1 0
608 | 15 4.5060 71.0 1 1
609 | 18 2.9060 66.0 0 0
610 | 19 5.1020 72.0 1 0
611 | 19 3.5190 66.0 0 1
612 | 16 3.6880 68.0 1 1
613 | 17 4.4290 70.0 1 0
614 | 15 4.2790 67.5 1 0
615 | 15 4.5000 70.0 1 0
616 | 15 2.6350 64.0 0 0
617 | 15 2.6790 66.0 0 1
618 | 15 2.1980 62.0 0 1
619 | 19 3.3450 65.5 0 1
620 | 18 3.0820 64.5 0 0
621 | 16 3.3870 66.5 0 0
622 | 17 3.0820 67.0 1 1
623 | 16 2.9030 63.0 0 1
624 | 15 3.0040 64.0 0 1
625 | 15 5.7930 69.0 1 0
626 | 15 3.9850 71.0 1 0
627 | 18 4.2200 68.0 1 0
628 | 17 4.7240 70.5 1 0
629 | 15 3.7310 67.0 1 0
630 | 17 3.4060 69.0 1 1
631 | 17 3.5000 62.0 0 0
632 | 16 3.6740 67.5 0 0
633 | 17 5.6330 73.0 1 0
634 | 15 3.1220 64.0 0 1
635 | 15 3.3300 68.5 0 1
636 | 16 2.6080 62.0 0 1
637 | 16 3.6450 73.5 1 0
638 | 15 3.7990 66.5 1 1
639 | 18 4.0860 67.0 1 1
640 | 15 2.8870 63.0 0 0
641 | 16 4.0700 69.5 1 1
642 | 17 3.9600 70.0 1 0
643 | 16 4.2990 66.0 1 0
644 | 16 2.9810 66.0 0 0
645 | 15 2.2640 63.0 0 1
646 | 18 4.4040 70.5 1 1
647 | 15 2.2780 60.0 0 1
648 | 16 4.5040 72.0 1 0
649 | 17 5.6380 70.0 1 0
650 | 16 4.8720 72.0 1 1
651 | 16 4.2700 67.0 1 1
652 | 15 3.7270 68.0 1 1
653 | 18 2.8530 60.0 0 0
654 | 16 2.7950 63.0 0 1
655 | 15 3.2110 66.5 0 0
--------------------------------------------------------------------------------
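lung_function.txt is a whitespace-separated table with a single header row (age, FEV, ht, sex, smoke). The Week 1 notebook presumably consumes it for the regression exercise; one simple way to load it, shown here only as a sketch, is with NumPy:

```python
# Sketch: load lung_function.txt into a NumPy array, skipping the header row.
import numpy as np

data = np.loadtxt('lung_function.txt', skiprows=1)   # shape: (n_samples, 5)
age, fev, ht, sex, smoke = data.T
print(data.shape, fev.mean())
```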
/tensorflow-tutorial/week_2/Week_2.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Week 2: MLP classifier"
8 | ]
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": null,
13 | "metadata": {},
14 | "outputs": [],
15 | "source": [
16 | "import tensorflow as tf"
17 | ]
18 | },
19 | {
20 | "cell_type": "markdown",
21 | "metadata": {},
22 | "source": [
23 | "SVHN can be downloaded from http://ufldl.stanford.edu/housenumbers/"
24 | ]
25 | },
26 | {
27 | "cell_type": "markdown",
28 | "metadata": {},
29 | "source": [
30 | "## Import and preprocess the data"
31 | ]
32 | },
33 | {
34 | "cell_type": "code",
35 | "execution_count": null,
36 | "metadata": {
37 | "collapsed": true
38 | },
39 | "outputs": [],
40 | "source": [
41 | "from scipy.io import loadmat\n",
42 | "\n",
43 | "train = loadmat('../SVHN/train_32x32.mat')\n",
44 | "test = loadmat('../SVHN/test_32x32.mat')"
45 | ]
46 | },
47 | {
48 | "cell_type": "markdown",
49 | "metadata": {},
50 | "source": [
51 | "`train` and `test` are dictionaries with keys `'X'` and `'y'`. The values are numpy arrays."
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "execution_count": null,
57 | "metadata": {},
58 | "outputs": [],
59 | "source": [
60 | "print(train['X'].shape)\n",
61 | "print(train['y'].shape)"
62 | ]
63 | },
64 | {
65 | "cell_type": "code",
66 | "execution_count": null,
67 | "metadata": {},
68 | "outputs": [],
69 | "source": [
70 | "print(test['X'].shape)\n",
71 | "print(test['y'].shape)"
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "execution_count": null,
77 | "metadata": {
78 | "collapsed": true
79 | },
80 | "outputs": [],
81 | "source": [
82 | "import numpy as np\n",
83 | "\n",
84 | "training_set = np.transpose(train['X'], (3, 0, 1, 2)).astype(np.float32)\n",
85 | "training_labels = train['y']\n",
86 | "\n",
87 | "test_set = np.transpose(test['X'], (3, 0, 1, 2)).astype(np.float32)\n",
88 | "test_labels = test['y']"
89 | ]
90 | },
91 | {
92 | "cell_type": "code",
93 | "execution_count": null,
94 | "metadata": {
95 | "collapsed": true
96 | },
97 | "outputs": [],
98 | "source": [
99 | "n_train = training_set.shape[0]\n",
100 | "n_test = test_set.shape[0]"
101 | ]
102 | },
103 | {
104 | "cell_type": "markdown",
105 | "metadata": {},
106 | "source": [
107 | "### Inspect the data"
108 | ]
109 | },
110 | {
111 | "cell_type": "code",
112 | "execution_count": null,
113 | "metadata": {
114 | "collapsed": true
115 | },
116 | "outputs": [],
117 | "source": [
118 | "from matplotlib import pyplot as plt"
119 | ]
120 | },
121 | {
122 | "cell_type": "code",
123 | "execution_count": null,
124 | "metadata": {},
125 | "outputs": [],
126 | "source": [
127 | "example = np.random.choice(np.arange(n_train))\n",
128 | "\n",
129 | "image = training_set[example]\n",
130 | "label = training_labels[example][0]\n",
131 | "\n",
132 | "if label == 10:\n",
133 | " label = 0\n",
134 | "\n",
135 | "plt.imshow(image)\n",
136 | "plt.show()\n",
137 | "\n",
138 | "print(\"Digit: {}\".format(label))"
139 | ]
140 | },
141 | {
142 | "cell_type": "markdown",
143 | "metadata": {},
144 | "source": [
145 | "### Convert the images to grayscale"
146 | ]
147 | },
148 | {
149 | "cell_type": "code",
150 | "execution_count": null,
151 | "metadata": {
152 | "collapsed": true
153 | },
154 | "outputs": [],
155 | "source": [
156 | "def convert_to_grayscale(images):\n",
157 | " images = np.add.reduce(images, keepdims=True, axis=3)\n",
158 | " images = images / 3.0\n",
159 | " return images / 128.0 - 1.0"
160 | ]
161 | },
162 | {
163 | "cell_type": "code",
164 | "execution_count": null,
165 | "metadata": {
166 | "collapsed": true
167 | },
168 | "outputs": [],
169 | "source": [
170 | "training_set_gs = convert_to_grayscale(training_set)\n",
171 | "test_set_gs = convert_to_grayscale(test_set)"
172 | ]
173 | },
174 | {
175 | "cell_type": "code",
176 | "execution_count": null,
177 | "metadata": {},
178 | "outputs": [],
179 | "source": [
180 | "print(training_set_gs.shape)\n",
181 | "print(test_set_gs.shape)"
182 | ]
183 | },
184 | {
185 | "cell_type": "code",
186 | "execution_count": null,
187 | "metadata": {},
188 | "outputs": [],
189 | "source": [
190 | "example = np.random.choice(np.arange(n_train))\n",
191 | "\n",
192 | "image = training_set_gs[example]\n",
193 | "label = training_labels[example][0]\n",
194 | "\n",
195 | "if label == 10:\n",
196 | " label = 0\n",
197 | "\n",
198 | "plt.imshow(np.squeeze(image), cmap='gray')\n",
199 | "plt.show()\n",
200 | "\n",
201 | "print(\"Digit: {}\".format(label))"
202 | ]
203 | },
204 | {
205 | "cell_type": "markdown",
206 | "metadata": {},
207 | "source": [
208 | "Flatten the inputs to feed into an MLP"
209 | ]
210 | },
211 | {
212 | "cell_type": "code",
213 | "execution_count": null,
214 | "metadata": {},
215 | "outputs": [],
216 | "source": [
217 | "training_set_flat = training_set_gs.reshape((n_train, -1))\n",
218 | "test_set_flat = test_set_gs.reshape((n_test, -1))\n",
219 | "\n",
220 | "print(training_set_flat.shape)\n",
221 | "print(test_set_flat.shape)"
222 | ]
223 | },
224 | {
225 | "cell_type": "markdown",
226 | "metadata": {},
227 | "source": [
228 | "### Encode the labels as one-hot vectors"
229 | ]
230 | },
231 | {
232 | "cell_type": "code",
233 | "execution_count": null,
234 | "metadata": {
235 | "collapsed": true
236 | },
237 | "outputs": [],
238 | "source": [
239 | "def one_hot(labels):\n",
240 | " \"\"\"\n",
241 | " Encodes the labels as one-hot vectors. Zero is represented as 10 in SVHN.\n",
242 | " [10] -> [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]\n",
243 | " [2] -> [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]\n",
244 | " \n",
245 | " \"\"\"\n",
246 | " labels = np.squeeze(labels)\n",
247 | " one_hot_labels = []\n",
248 | " for num in labels:\n",
249 | " one_hot = [0.0] * 10\n",
250 | " if num == 10:\n",
251 | " one_hot[0] = 1.0\n",
252 | " else:\n",
253 | " one_hot[num] = 1.0\n",
254 | " one_hot_labels.append(one_hot)\n",
255 | " labels = np.array(one_hot_labels).astype(np.float32)\n",
256 | " return labels"
257 | ]
258 | },
259 | {
260 | "cell_type": "code",
261 | "execution_count": null,
262 | "metadata": {
263 | "collapsed": true
264 | },
265 | "outputs": [],
266 | "source": [
267 | "training_labels_one_hot = one_hot(training_labels)\n",
268 | "test_labels_one_hot = one_hot(test_labels)"
269 | ]
270 | },
271 | {
272 | "cell_type": "code",
273 | "execution_count": null,
274 | "metadata": {},
275 | "outputs": [],
276 | "source": [
277 | "print(training_labels_one_hot.shape)\n",
278 | "print(test_labels_one_hot.shape)"
279 | ]
280 | },
281 | {
282 | "cell_type": "markdown",
283 | "metadata": {},
284 | "source": [
285 | "## Build the network"
286 | ]
287 | },
288 | {
289 | "cell_type": "code",
290 | "execution_count": null,
291 | "metadata": {
292 | "collapsed": true
293 | },
294 | "outputs": [],
295 | "source": [
296 | "class SVHN_MLP:\n",
297 | " def __init__(self, wd_factor, learning_rate):\n",
298 | " self.wd_factor = wd_factor\n",
299 | " self.learning_rate = learning_rate\n",
300 | " self.train_pointer = 0\n",
301 | " self.test_pointer = 0\n",
302 | " \n",
303 | " self.sess = tf.Session()\n",
304 | " \n",
305 | " self.input = tf.placeholder(dtype=tf.float32, shape=[None, 1024], name='input')\n",
306 | " self.ground_truth = tf.placeholder(dtype=tf.float32, shape=[None, 10], name='ground_truth')\n",
307 | " print(self.input)\n",
308 | " \n",
309 | " self._build_graph()\n",
310 | " \n",
311 | " def _build_graph(self):\n",
312 | " weights = [] # for weight decay\n",
313 | " \n",
314 | " with tf.variable_scope('layers'):\n",
315 | " h = tf.layers.dense(self.input, 512, kernel_initializer=tf.glorot_uniform_initializer(), \n",
316 | " activation=tf.tanh, name='1')\n",
317 | " print(h)\n",
318 | " h = tf.layers.dense(h, 256, kernel_initializer=tf.glorot_uniform_initializer(), \n",
319 | " activation=tf.tanh, name='2')\n",
320 | " print(h)\n",
321 | " h = tf.layers.dense(h, 64, kernel_initializer=tf.glorot_uniform_initializer(), \n",
322 | " activation=tf.tanh, name='3')\n",
323 | " print(h)\n",
324 | " self.logits = tf.layers.dense(h, 10, kernel_initializer=tf.glorot_uniform_initializer(), \n",
325 | " activation=tf.identity, name='4')\n",
326 | " print(self.logits)\n",
327 | " self.prediction = tf.nn.softmax(self.logits, name='softmax_prediction')\n",
328 | " \n",
329 | " with tf.name_scope('loss'):\n",
330 | " self.loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(logits=self.logits, \n",
331 | " labels=self.ground_truth))\n",
332 | " self.loss += self.weight_decay()\n",
333 | " \n",
334 | " self.optimizer = tf.train.AdamOptimizer(self.learning_rate)\n",
335 | " self.train_op = self.optimizer.minimize(self.loss)\n",
336 | " \n",
337 | " def weight_decay(self):\n",
338 | " loss = 0\n",
339 | " for v in tf.global_variables():\n",
340 | " if 'Adam' in v.name:\n",
341 | " continue\n",
342 | " elif 'kernel' in v.name:\n",
343 | " loss += self.wd_factor * tf.nn.l2_loss(v)\n",
344 | " print(loss)\n",
345 | " return loss\n",
346 | " \n",
347 | " def train_minibatch(self, samples, labels, batch_size):\n",
348 | " if self.train_pointer + batch_size <= samples.shape[0]:\n",
349 | " samples_minibatch = samples[self.train_pointer: self.train_pointer + batch_size]\n",
350 | " labels_minibatch = labels[self.train_pointer: self.train_pointer + batch_size]\n",
351 | " self.train_pointer += batch_size\n",
352 | " else:\n",
353 | " samples_minibatch = samples[self.train_pointer:]\n",
354 | " labels_minibatch = labels[self.train_pointer: self.train_pointer + batch_size]\n",
355 | " self.train_pointer = 0\n",
356 | " return samples_minibatch, labels_minibatch\n",
357 | "\n",
358 | " def train(self, train_samples, train_labels, train_batch_size, iteration_steps):\n",
359 | " self.sess.run(tf.global_variables_initializer())\n",
360 | "\n",
361 | " print('Start Training')\n",
362 | " losses = []\n",
363 | " for i in range(iteration_steps):\n",
364 | " samples, labels = self.train_minibatch(train_samples, train_labels, train_batch_size)\n",
365 | " feed_dict = {self.input: samples, self.ground_truth: labels}\n",
366 | " _, loss = self.sess.run([self.train_op, self.loss], feed_dict=feed_dict)\n",
367 | " if i % 50 == 0:\n",
368 | " print(\"Minibatch loss at step {}: {}\".format(i, loss))\n",
369 | " losses.append([i, loss])\n",
370 | " return losses\n",
371 | " \n",
372 | " def test_minibatch(self, samples, labels, batch_size):\n",
373 | " if self.test_pointer + batch_size <= samples.shape[0]:\n",
374 | " samples_minibatch = samples[self.test_pointer: self.test_pointer + batch_size]\n",
375 | " labels_minibatch = labels[self.test_pointer: self.test_pointer + batch_size]\n",
376 | " self.test_pointer += batch_size\n",
377 | " end_of_epoch = False\n",
378 | " else:\n",
379 | " samples_minibatch = samples[self.test_pointer:]\n",
380 | " labels_minibatch = labels[self.test_pointer: self.test_pointer + batch_size]\n",
381 | " self.test_pointer = 0\n",
382 | " end_of_epoch = True\n",
383 | " return samples_minibatch, labels_minibatch, end_of_epoch\n",
384 | " \n",
385 | " def test(self, test_samples, test_labels, test_batch_size):\n",
386 | " end_of_epoch = False\n",
387 | " losses = []\n",
388 | " while not end_of_epoch:\n",
389 | " samples, labels, end_of_epoch = self.test_minibatch(test_samples, test_labels, test_batch_size)\n",
390 | " feed_dict = {self.input: samples, self.ground_truth: labels}\n",
391 | " losses.append(self.sess.run(self.loss, feed_dict=feed_dict)) \n",
392 | " print(\"Average test loss: {}\".format(np.mean(losses)))"
393 | ]
394 | },
395 | {
396 | "cell_type": "code",
397 | "execution_count": null,
398 | "metadata": {},
399 | "outputs": [],
400 | "source": [
401 | "WD_FACTOR = 0.0001\n",
402 | "LEARNING_RATE = 0.001\n",
403 | "model = SVHN_MLP(WD_FACTOR, LEARNING_RATE)"
404 | ]
405 | },
406 | {
407 | "cell_type": "code",
408 | "execution_count": null,
409 | "metadata": {},
410 | "outputs": [],
411 | "source": [
412 | "tf.global_variables()"
413 | ]
414 | },
415 | {
416 | "cell_type": "markdown",
417 | "metadata": {},
418 | "source": [
419 | "### Train the network"
420 | ]
421 | },
422 | {
423 | "cell_type": "code",
424 | "execution_count": null,
425 | "metadata": {},
426 | "outputs": [],
427 | "source": [
428 | "TRAIN_BATCH_SIZE = 128\n",
429 | "ITERATIONS = 10000\n",
430 | "\n",
431 | "import time\n",
432 | "start_time = time.time()\n",
433 | "\n",
434 | "losses = model.train(training_set_flat, training_labels_one_hot, TRAIN_BATCH_SIZE, ITERATIONS)\n",
435 | "\n",
436 | "end_time = time.time()\n",
437 | "print(\"Training time: {}s\".format(end_time - start_time))"
438 | ]
439 | },
440 | {
441 | "cell_type": "code",
442 | "execution_count": null,
443 | "metadata": {},
444 | "outputs": [],
445 | "source": [
446 | "losses = np.array(losses)\n",
447 | "print(losses.shape)"
448 | ]
449 | },
450 | {
451 | "cell_type": "code",
452 | "execution_count": null,
453 | "metadata": {},
454 | "outputs": [],
455 | "source": [
456 | "import matplotlib.pyplot as plt\n",
457 | "\n",
458 | "iterations = losses[:, 0]\n",
459 | "train_loss = losses[:, 1]\n",
460 | "plt.figure(figsize=(10, 5))\n",
461 | "plt.plot(iterations, train_loss)\n",
462 | "plt.xlabel(\"Iterations\")\n",
463 | "plt.ylabel(\"Loss\")\n",
464 | "plt.title(\"Training curve\")\n",
465 | "plt.show()"
466 | ]
467 | },
468 | {
469 | "cell_type": "markdown",
470 | "metadata": {},
471 | "source": [
472 | "### Test network predictions"
473 | ]
474 | },
475 | {
476 | "cell_type": "code",
477 | "execution_count": null,
478 | "metadata": {},
479 | "outputs": [],
480 | "source": [
481 | "TEST_BATCH_SIZE = 128\n",
482 | "\n",
483 | "model.test(test_set_flat, test_labels_one_hot, TEST_BATCH_SIZE)"
484 | ]
485 | },
486 | {
487 | "cell_type": "code",
488 | "execution_count": null,
489 | "metadata": {},
490 | "outputs": [],
491 | "source": [
492 | "example = np.random.choice(np.arange(n_test))\n",
493 | "\n",
494 | "sample = np.expand_dims(test_set_flat[example], axis=0)\n",
495 | "label = np.expand_dims(test_labels_one_hot[example], axis=0)\n",
496 | "\n",
497 | "digit = np.where(label[0]==1.0)[0][0]\n",
498 | "\n",
499 | "feed_dict = {model.input: sample, model.ground_truth: label}\n",
500 | "prediction = model.sess.run(model.prediction, feed_dict=feed_dict)[0]\n",
501 | "\n",
502 | "image = np.reshape(sample, (32, 32))\n",
503 | "\n",
504 | "print(\"Test sample digit: {}\".format(digit))\n",
505 | "fig, ax = plt.subplots(1, 2, figsize=(17, 5))\n",
506 | "ax[0].imshow(image, cmap='gray')\n",
507 | "ax[0].set_title(\"Test example\")\n",
508 | "\n",
509 | "classes = np.arange(10)\n",
510 | "width = 1.0\n",
511 | "\n",
512 | "#fig, ax = plt.subplots()\n",
513 | "ax[1].bar(classes, prediction, width, color='Blue')\n",
514 | "ax[1].set_ylabel('Probabilities')\n",
515 | "ax[1].set_title('Network categorical distribution')\n",
516 | "ax[1].set_xticks(classes)\n",
517 | "ax[1].set_xticklabels(('0', '1', '2', '3', '4', '5', '6', '7', '8', '9'))\n",
518 | "ax[1].set_xlabel('Digit class')\n",
519 | "\n",
520 | "plt.show()\n",
521 | "\n",
522 | "print(\"Network prediction probabilities:\")\n",
523 | "print(prediction)"
524 | ]
525 | },
526 | {
527 | "cell_type": "code",
528 | "execution_count": null,
529 | "metadata": {
530 | "collapsed": true
531 | },
532 | "outputs": [],
533 | "source": [
534 | "model.sess.close()"
535 | ]
536 | }
537 | ],
538 | "metadata": {
539 | "kernelspec": {
540 | "display_name": "Python 3",
541 | "language": "python",
542 | "name": "python3"
543 | },
544 | "language_info": {
545 | "codemirror_mode": {
546 | "name": "ipython",
547 | "version": 3
548 | },
549 | "file_extension": ".py",
550 | "mimetype": "text/x-python",
551 | "name": "python",
552 | "nbconvert_exporter": "python",
553 | "pygments_lexer": "ipython3",
554 | "version": "3.6.3"
555 | }
556 | },
557 | "nbformat": 4,
558 | "nbformat_minor": 2
559 | }
560 |
--------------------------------------------------------------------------------
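Week_2.ipynb reports only the average test loss. A natural extension is to also compute classification accuracy from the softmax output; the sketch below assumes `model`, `test_set_flat` and `test_labels_one_hot` exist exactly as defined in the notebook, and that it is run before `model.sess.close()`.

```python
# Sketch: test-set accuracy for the Week 2 MLP, using the notebook's `model` object.
# model.prediction depends only on model.input, so no labels need to be fed.
import numpy as np

probs = model.sess.run(model.prediction, feed_dict={model.input: test_set_flat})
accuracy = np.mean(np.argmax(probs, axis=1) == np.argmax(test_labels_one_hot, axis=1))
print("Test accuracy: {:.3f}".format(accuracy))
```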
/tensorflow-tutorial/week_3/Week_3.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "### Week 3: CNN classifier"
8 | ]
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": null,
13 | "metadata": {},
14 | "outputs": [],
15 | "source": [
16 | "import tensorflow as tf"
17 | ]
18 | },
19 | {
20 | "cell_type": "markdown",
21 | "metadata": {},
22 | "source": [
23 | "SVHN can be downloaded from http://ufldl.stanford.edu/housenumbers/"
24 | ]
25 | },
26 | {
27 | "cell_type": "markdown",
28 | "metadata": {},
29 | "source": [
30 | "## Import and preprocess the data"
31 | ]
32 | },
33 | {
34 | "cell_type": "code",
35 | "execution_count": null,
36 | "metadata": {
37 | "collapsed": true
38 | },
39 | "outputs": [],
40 | "source": [
41 | "from scipy.io import loadmat\n",
42 | "\n",
43 | "train = loadmat('../SVHN/train_32x32.mat')\n",
44 | "test = loadmat('../SVHN/test_32x32.mat')"
45 | ]
46 | },
47 | {
48 | "cell_type": "markdown",
49 | "metadata": {},
50 | "source": [
51 | "`train` and `test` are dictionaries with keys `'X'` and `'y'`. The values are numpy arrays."
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "execution_count": null,
57 | "metadata": {},
58 | "outputs": [],
59 | "source": [
60 | "print(train['X'].shape)\n",
61 | "print(train['y'].shape)"
62 | ]
63 | },
64 | {
65 | "cell_type": "code",
66 | "execution_count": null,
67 | "metadata": {},
68 | "outputs": [],
69 | "source": [
70 | "print(test['X'].shape)\n",
71 | "print(test['y'].shape)"
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "execution_count": null,
77 | "metadata": {
78 | "collapsed": true
79 | },
80 | "outputs": [],
81 | "source": [
82 | "import numpy as np\n",
83 | "\n",
84 | "training_set = np.transpose(train['X'], (3, 0, 1, 2)).astype(np.float32)\n",
85 | "training_labels = train['y']\n",
86 | "\n",
87 | "test_set = np.transpose(test['X'], (3, 0, 1, 2)).astype(np.float32)\n",
88 | "test_labels = test['y']"
89 | ]
90 | },
91 | {
92 | "cell_type": "code",
93 | "execution_count": null,
94 | "metadata": {
95 | "collapsed": true
96 | },
97 | "outputs": [],
98 | "source": [
99 | "n_train = training_set.shape[0]\n",
100 | "n_test = test_set.shape[0]"
101 | ]
102 | },
103 | {
104 | "cell_type": "markdown",
105 | "metadata": {},
106 | "source": [
107 | "### Inspect the data"
108 | ]
109 | },
110 | {
111 | "cell_type": "code",
112 | "execution_count": null,
113 | "metadata": {
114 | "collapsed": true
115 | },
116 | "outputs": [],
117 | "source": [
118 | "from matplotlib import pyplot as plt"
119 | ]
120 | },
121 | {
122 | "cell_type": "code",
123 | "execution_count": null,
124 | "metadata": {},
125 | "outputs": [],
126 | "source": [
127 | "example = np.random.choice(np.arange(n_train))\n",
128 | "\n",
129 | "image = training_set[example]\n",
130 | "label = training_labels[example][0]\n",
131 | "\n",
132 | "if label == 10:\n",
133 | " label = 0\n",
134 | "\n",
135 | "plt.imshow(image)\n",
136 | "plt.show()\n",
137 | "\n",
138 | "print(\"Digit: {}\".format(label))"
139 | ]
140 | },
141 | {
142 | "cell_type": "markdown",
143 | "metadata": {},
144 | "source": [
145 | "### Convert the images to grayscale"
146 | ]
147 | },
148 | {
149 | "cell_type": "code",
150 | "execution_count": null,
151 | "metadata": {
152 | "collapsed": true
153 | },
154 | "outputs": [],
155 | "source": [
156 | "def convert_to_grayscale(images):\n",
157 | " images = np.add.reduce(images, keepdims=True, axis=3)\n",
158 | " images = images / 3.0\n",
159 | " return images / 128.0 - 1.0"
160 | ]
161 | },
162 | {
163 | "cell_type": "code",
164 | "execution_count": null,
165 | "metadata": {
166 | "collapsed": true
167 | },
168 | "outputs": [],
169 | "source": [
170 | "training_set_gs = convert_to_grayscale(training_set)\n",
171 | "test_set_gs = convert_to_grayscale(test_set)"
172 | ]
173 | },
174 | {
175 | "cell_type": "code",
176 | "execution_count": null,
177 | "metadata": {},
178 | "outputs": [],
179 | "source": [
180 | "print(training_set_gs.shape)\n",
181 | "print(test_set_gs.shape)"
182 | ]
183 | },
184 | {
185 | "cell_type": "code",
186 | "execution_count": null,
187 | "metadata": {},
188 | "outputs": [],
189 | "source": [
190 | "example = np.random.choice(np.arange(n_train))\n",
191 | "\n",
192 | "image = training_set_gs[example]\n",
193 | "label = training_labels[example][0]\n",
194 | "\n",
195 | "if label == 10:\n",
196 | " label = 0\n",
197 | "\n",
198 | "plt.imshow(np.squeeze(image), cmap='gray')\n",
199 | "plt.show()\n",
200 | "\n",
201 | "print(\"Digit: {}\".format(label))"
202 | ]
203 | },
204 | {
205 | "cell_type": "markdown",
206 | "metadata": {},
207 | "source": [
208 | "Don't flatten the inputs! Use a CNN to process the image"
209 | ]
210 | },
211 | {
212 | "cell_type": "code",
213 | "execution_count": null,
214 | "metadata": {
215 | "collapsed": true
216 | },
217 | "outputs": [],
218 | "source": [
219 | "# training_set_flat = training_set_gs.reshape((n_train, -1))\n",
220 | "# test_set_flat = test_set_gs.reshape((n_test, -1))"
221 | ]
222 | },
223 | {
224 | "cell_type": "markdown",
225 | "metadata": {},
226 | "source": [
227 | "### Encode the labels as one-hot vectors"
228 | ]
229 | },
230 | {
231 | "cell_type": "code",
232 | "execution_count": null,
233 | "metadata": {
234 | "collapsed": true
235 | },
236 | "outputs": [],
237 | "source": [
238 | "def one_hot(labels):\n",
239 | " \"\"\"\n",
240 | " Encodes the labels as one-hot vectors. Zero is represented as 10 in SVHN.\n",
241 | " [10] -> [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]\n",
242 | " [2] -> [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]\n",
243 | " \n",
244 | " \"\"\"\n",
245 | " labels = np.squeeze(labels)\n",
246 | " one_hot_labels = []\n",
247 | " for num in labels:\n",
248 | " one_hot = [0.0] * 10\n",
249 | " if num == 10:\n",
250 | " one_hot[0] = 1.0\n",
251 | " else:\n",
252 | " one_hot[num] = 1.0\n",
253 | " one_hot_labels.append(one_hot)\n",
254 | " labels = np.array(one_hot_labels).astype(np.float32)\n",
255 | " return labels"
256 | ]
257 | },
258 | {
259 | "cell_type": "code",
260 | "execution_count": null,
261 | "metadata": {
262 | "collapsed": true
263 | },
264 | "outputs": [],
265 | "source": [
266 | "training_labels_one_hot = one_hot(training_labels)\n",
267 | "test_labels_one_hot = one_hot(test_labels)"
268 | ]
269 | },
270 | {
271 | "cell_type": "code",
272 | "execution_count": null,
273 | "metadata": {},
274 | "outputs": [],
275 | "source": [
276 | "print(training_labels_one_hot.shape)\n",
277 | "print(test_labels_one_hot.shape)"
278 | ]
279 | },
280 | {
281 | "cell_type": "markdown",
282 | "metadata": {},
283 | "source": [
284 | "## Build the network"
285 | ]
286 | },
287 | {
288 | "cell_type": "code",
289 | "execution_count": null,
290 | "metadata": {
291 | "collapsed": true
292 | },
293 | "outputs": [],
294 | "source": [
295 | "class SVHN_CNN:\n",
296 | " def __init__(self, wd_factor, learning_rate):\n",
297 | " self.wd_factor = wd_factor\n",
298 | " self.learning_rate = learning_rate\n",
299 | " self.train_pointer = 0\n",
300 | " self.test_pointer = 0\n",
301 | " \n",
302 | " self.input = tf.placeholder(dtype=tf.float32, shape=[None, 32, 32, 1], name='input')\n",
303 | " self.ground_truth = tf.placeholder(dtype=tf.float32, shape=[None, 10], name='ground_truth')\n",
304 | " \n",
305 | " # For batch norm and dropout\n",
306 | " self.is_training = tf.placeholder(tf.bool, name='is_training')\n",
307 | " print(self.input)\n",
308 | " \n",
309 | " self._build_graph()\n",
310 | " \n",
311 | " def _build_graph(self):\n",
312 | " weights = [] # for weight decay\n",
313 | " \n",
314 | " with tf.variable_scope('layers'):\n",
315 | " h = tf.layers.conv2d(self.input, 32, (11, 11), strides=(4, 4), padding='same', \n",
316 | " data_format='channels_last', activation=None, use_bias=True,\n",
317 | " kernel_initializer=tf.glorot_uniform_initializer(), name='conv1')\n",
318 | " print(h)\n",
319 | " \n",
320 | " h = tf.layers.batch_normalization(h, training=self.is_training)\n",
321 | " h = tf.nn.relu(h)\n",
322 | " h = tf.layers.conv2d(h, 64, (5, 5), strides=(1, 1), padding='same', \n",
323 | " data_format='channels_last', activation=None, use_bias=True,\n",
324 | " kernel_initializer=tf.glorot_uniform_initializer(), name='conv2')\n",
325 | " \n",
326 | " h = tf.layers.batch_normalization(h, training=self.is_training)\n",
327 | " h = tf.nn.relu(h)\n",
328 | " h = tf.layers.conv2d(h, 64, (3, 3), strides=(1, 1), padding='same', \n",
329 | " data_format='channels_last', activation=None, use_bias=True,\n",
330 | " kernel_initializer=tf.glorot_uniform_initializer(), name='conv3')\n",
331 | " \n",
332 | " # Downsample\n",
333 | " h = tf.layers.max_pooling2d(h, (2, 2), (2, 2), padding='valid', name='pool1')\n",
334 | " print(h)\n",
335 | " \n",
336 | " # Fully connected layers\n",
337 | " h = tf.layers.batch_normalization(h, training=self.is_training)\n",
338 | " h = tf.nn.relu(h)\n",
339 | " h = tf.layers.flatten(h)\n",
340 | " print(h)\n",
341 | " \n",
342 | " h = tf.layers.dense(h, 32, kernel_initializer=tf.glorot_uniform_initializer(), \n",
343 | " activation=tf.nn.relu, name='dense1')\n",
344 | " print(h)\n",
345 | " h = tf.layers.dropout(h, rate=0.25, training=self.is_training, name='dropout1')\n",
346 | " print(h)\n",
347 | " \n",
348 | " self.logits = tf.layers.dense(h, 10, kernel_initializer=tf.glorot_uniform_initializer(), \n",
349 | " activation=tf.identity, name='dense2')\n",
350 | " print(self.logits)\n",
351 | " self.prediction = tf.nn.softmax(self.logits, name='softmax_prediction')\n",
352 | " \n",
353 | " with tf.name_scope('loss'):\n",
354 | " self.loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits_v2(logits=self.logits, \n",
355 | " labels=self.ground_truth))\n",
356 | " self.loss += self.weight_decay()\n",
357 | " \n",
358 | " self.optimizer = tf.train.AdamOptimizer(self.learning_rate)\n",
359 | " \n",
360 | " update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)\n",
361 | " with tf.control_dependencies(update_ops):\n",
362 | " self.train_op = self.optimizer.minimize(self.loss)\n",
363 | " \n",
364 | " def weight_decay(self):\n",
365 | " loss = 0\n",
366 | " for v in tf.global_variables():\n",
367 | " if 'Adam' in v.name:\n",
368 | " continue\n",
369 | " elif 'kernel' in v.name:\n",
370 | " loss += self.wd_factor * tf.nn.l2_loss(v)\n",
371 | " print(loss)\n",
372 | " return loss\n",
373 | " \n",
374 | " def train_minibatch(self, samples, labels, batch_size):\n",
375 | " if self.train_pointer + batch_size <= samples.shape[0]:\n",
376 | " samples_minibatch = samples[self.train_pointer: self.train_pointer + batch_size]\n",
377 | " labels_minibatch = labels[self.train_pointer: self.train_pointer + batch_size]\n",
378 | " self.train_pointer += batch_size\n",
379 | " else:\n",
380 | " samples_minibatch = samples[self.train_pointer:]\n",
381 | " labels_minibatch = labels[self.train_pointer: self.train_pointer + batch_size]\n",
382 | " self.train_pointer = 0\n",
383 | " return samples_minibatch, labels_minibatch\n",
384 | "\n",
385 | " def train(self, train_samples, train_labels, train_batch_size, iteration_steps):\n",
386 | " print('Start Training')\n",
387 | " losses = []\n",
388 | " \n",
389 | " with tf.Session() as sess:\n",
390 | " sess.run(tf.global_variables_initializer())\n",
391 | " saver = tf.train.Saver()\n",
392 | " \n",
393 | " for i in range(iteration_steps):\n",
394 | " samples, labels = self.train_minibatch(train_samples, train_labels, train_batch_size)\n",
395 | " \n",
396 | " feed_dict = {self.input: samples, self.ground_truth: labels, self.is_training: True}\n",
397 | " _, loss = sess.run([self.train_op, self.loss], feed_dict=feed_dict)\n",
398 | " \n",
399 | " if i % 50 == 0:\n",
400 | " print(\"Minibatch loss at step {}: {}\".format(i, loss))\n",
401 | " losses.append([i, loss])\n",
402 | " \n",
403 | " saver.save(sess, './model')\n",
404 | " return losses\n",
405 | " \n",
406 | " def test_minibatch(self, samples, labels, batch_size):\n",
407 | " if self.test_pointer + batch_size <= samples.shape[0]:\n",
408 | " samples_minibatch = samples[self.test_pointer: self.test_pointer + batch_size]\n",
409 | " labels_minibatch = labels[self.test_pointer: self.test_pointer + batch_size]\n",
410 | " self.test_pointer += batch_size\n",
411 | " end_of_epoch = False\n",
412 | " else:\n",
413 | " samples_minibatch = samples[self.test_pointer:]\n",
414 | " labels_minibatch = labels[self.test_pointer: self.test_pointer + batch_size]\n",
415 | " self.test_pointer = 0\n",
416 | " end_of_epoch = True\n",
417 | " return samples_minibatch, labels_minibatch, end_of_epoch\n",
418 | " \n",
419 | " def test(self, test_samples, test_labels, test_batch_size):\n",
420 | " self.test_pointer = 0\n",
421 | " end_of_epoch = False\n",
422 | " losses = []\n",
423 | " \n",
424 | " with tf.Session() as sess:\n",
425 | " saver = tf.train.import_meta_graph(\"./model.meta\")\n",
426 | " saver.restore(sess, './model')\n",
427 | " while not end_of_epoch:\n",
428 | " samples, labels, end_of_epoch = self.test_minibatch(test_samples, test_labels, test_batch_size)\n",
429 | " feed_dict = {self.input: samples, self.ground_truth: labels, self.is_training: False}\n",
430 | " losses.append(sess.run(self.loss, feed_dict=feed_dict)) \n",
431 | " print(\"Average test loss: {}\".format(np.mean(losses)))"
432 | ]
433 | },
434 | {
435 | "cell_type": "code",
436 | "execution_count": null,
437 | "metadata": {},
438 | "outputs": [],
439 | "source": [
440 | "WD_FACTOR = 0.0\n",
441 | "LEARNING_RATE = 0.001\n",
442 | "model = SVHN_CNN(WD_FACTOR, LEARNING_RATE)"
443 | ]
444 | },
445 | {
446 | "cell_type": "code",
447 | "execution_count": null,
448 | "metadata": {},
449 | "outputs": [],
450 | "source": [
451 | "tf.global_variables()"
452 | ]
453 | },
454 | {
455 | "cell_type": "markdown",
456 | "metadata": {},
457 | "source": [
458 | "### Train the network"
459 | ]
460 | },
461 | {
462 | "cell_type": "code",
463 | "execution_count": null,
464 | "metadata": {},
465 | "outputs": [],
466 | "source": [
467 | "TRAIN_BATCH_SIZE = 128\n",
468 | "ITERATIONS = 10000\n",
469 | "\n",
470 | "import time\n",
471 | "start_time = time.time()\n",
472 | "\n",
473 | "losses = model.train(training_set_gs, training_labels_one_hot, TRAIN_BATCH_SIZE, ITERATIONS)\n",
474 | "\n",
475 | "end_time = time.time()\n",
476 | "print(\"Training time: {}s\".format(end_time - start_time))"
477 | ]
478 | },
479 | {
480 | "cell_type": "code",
481 | "execution_count": null,
482 | "metadata": {},
483 | "outputs": [],
484 | "source": [
485 | "try:\n",
486 | " losses = np.array(losses)\n",
487 | " np.save('./train_losses.npy', losses)\n",
488 | " print(losses.shape)\n",
489 | "except NameError:\n",
490 | " losses = np.load('./train_losses.npy')"
491 | ]
492 | },
493 | {
494 | "cell_type": "code",
495 | "execution_count": null,
496 | "metadata": {},
497 | "outputs": [],
498 | "source": [
499 | "import matplotlib.pyplot as plt\n",
500 | "\n",
501 | "iterations = losses[:, 0]\n",
502 | "train_loss = losses[:, 1]\n",
503 | "\n",
504 | "plt.figure(figsize=(10, 5))\n",
505 | "plt.plot(iterations, train_loss, 'b-')\n",
506 | "plt.xlabel(\"Iterations\")\n",
507 | "plt.ylabel(\"Loss\")\n",
508 | "plt.title(\"Training curve\")\n",
509 | "plt.show()"
510 | ]
511 | },
512 | {
513 | "cell_type": "markdown",
514 | "metadata": {},
515 | "source": [
516 | "### Test network predictions"
517 | ]
518 | },
519 | {
520 | "cell_type": "code",
521 | "execution_count": null,
522 | "metadata": {},
523 | "outputs": [],
524 | "source": [
525 | "TEST_BATCH_SIZE = 128\n",
526 | "\n",
527 | "model.test(test_set_gs, test_labels_one_hot, TEST_BATCH_SIZE)"
528 | ]
529 | },
530 | {
531 | "cell_type": "code",
532 | "execution_count": null,
533 | "metadata": {},
534 | "outputs": [],
535 | "source": [
536 | "example = np.random.choice(np.arange(n_test))\n",
537 | "\n",
538 | "sample = np.expand_dims(test_set_gs[example], axis=0)\n",
539 | "label = np.expand_dims(test_labels_one_hot[example], axis=0)\n",
540 | "\n",
541 | "digit = np.where(label[0]==1.0)[0][0]\n",
542 | "\n",
543 | "feed_dict = {model.input: sample, model.ground_truth: label, model.is_training: False}\n",
544 | "\n",
545 | "with tf.Session() as sess:\n",
546 | " saver = tf.train.import_meta_graph(\"./model.meta\")\n",
547 | " saver.restore(sess, './model')\n",
548 | " prediction = sess.run(model.prediction, feed_dict=feed_dict)[0]\n",
549 | "\n",
550 | "image = np.reshape(sample, (32, 32))\n",
551 | "\n",
552 | "print(\"Test sample digit: {}\".format(digit))\n",
553 | "fig, ax = plt.subplots(1, 2, figsize=(17, 5))\n",
554 | "ax[0].imshow(image, cmap='gray')\n",
555 | "ax[0].set_title(\"Test example\")\n",
556 | "\n",
557 | "classes = np.arange(10)\n",
558 | "width = 1.0\n",
559 | "\n",
560 | "#fig, ax = plt.subplots()\n",
561 | "ax[1].bar(classes, prediction, width, color='Blue')\n",
562 | "ax[1].set_ylabel('Probabilities')\n",
563 | "ax[1].set_title('Network categorical distribution')\n",
564 | "ax[1].set_xticks(classes)\n",
565 | "ax[1].set_xticklabels(('0', '1', '2', '3', '4', '5', '6', '7', '8', '9'))\n",
566 | "ax[1].set_xlabel('Digit class')\n",
567 | "\n",
568 | "plt.show()\n",
569 | "\n",
570 | "print(\"Network prediction probabilities:\")\n",
571 | "print(prediction)"
572 | ]
573 | },
574 | {
575 | "cell_type": "code",
576 | "execution_count": null,
577 | "metadata": {
578 | "collapsed": true
579 | },
580 | "outputs": [],
581 | "source": []
582 | }
583 | ],
584 | "metadata": {
585 | "kernelspec": {
586 | "display_name": "Python 3",
587 | "language": "python",
588 | "name": "python3"
589 | },
590 | "language_info": {
591 | "codemirror_mode": {
592 | "name": "ipython",
593 | "version": 3
594 | },
595 | "file_extension": ".py",
596 | "mimetype": "text/x-python",
597 | "name": "python",
598 | "nbconvert_exporter": "python",
599 | "pygments_lexer": "ipython3",
600 | "version": "3.6.3"
601 | }
602 | },
603 | "nbformat": 4,
604 | "nbformat_minor": 2
605 | }
606 |
--------------------------------------------------------------------------------
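As with Week 2, the Week 3 CNN only reports an average test loss. The sketch below adds accuracy, restoring the checkpoint that `model.train()` saves to './model'. It assumes `model`, `test_set_gs` and `test_labels_one_hot` exist as in the notebook, and it uses a plain `tf.train.Saver()` on the already-built graph rather than the `import_meta_graph` pattern the notebook uses.

```python
# Sketch: test-set accuracy for the Week 3 CNN, restored from the './model' checkpoint.
import numpy as np
import tensorflow as tf

BATCH = 128
correct = 0
with tf.Session() as sess:
    saver = tf.train.Saver()              # variables of the graph already built by SVHN_CNN
    saver.restore(sess, './model')
    for start in range(0, test_set_gs.shape[0], BATCH):
        images = test_set_gs[start:start + BATCH]
        labels = test_labels_one_hot[start:start + BATCH]
        probs = sess.run(model.prediction,
                         feed_dict={model.input: images, model.is_training: False})
        correct += np.sum(np.argmax(probs, axis=1) == np.argmax(labels, axis=1))

print("Test accuracy: {:.3f}".format(correct / test_set_gs.shape[0]))
```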
/tensorflow-tutorial/week_9/Week_9 pt 1 - Bijectors.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "### Week 9: Normalising flows pt 1 - bijectors"
8 | ]
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": null,
13 | "metadata": {},
14 | "outputs": [],
15 | "source": [
16 | "import tensorflow as tf\n",
17 | "import tensorflow_probability as tfp\n",
18 | "tfd = tfp.distributions\n",
19 | "tfb = tfp.bijectors\n",
20 | "\n",
21 | "import numpy as np\n",
22 | "import matplotlib.pyplot as plt"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "metadata": {
28 | "collapsed": true
29 | },
30 | "source": [
31 | "## Tensorflow bijectors"
32 | ]
33 | },
34 | {
35 | "cell_type": "markdown",
36 | "metadata": {},
37 | "source": [
38 | "### Base distribution"
39 | ]
40 | },
41 | {
42 | "cell_type": "code",
43 | "execution_count": null,
44 | "metadata": {
45 | "collapsed": true
46 | },
47 | "outputs": [],
48 | "source": [
49 | "base_dist = tfd.MultivariateNormalDiag(loc=tf.zeros([2], tf.float32), scale_diag=tf.constant([1, 1], tf.float32))"
50 | ]
51 | },
52 | {
53 | "cell_type": "code",
54 | "execution_count": null,
55 | "metadata": {
56 | "collapsed": true
57 | },
58 | "outputs": [],
59 | "source": [
60 | "SAMPLE_BATCH_SIZE = 512"
61 | ]
62 | },
63 | {
64 | "cell_type": "code",
65 | "execution_count": null,
66 | "metadata": {},
67 | "outputs": [],
68 | "source": [
69 | "z = base_dist.sample(SAMPLE_BATCH_SIZE)\n",
70 | "print(z)"
71 | ]
72 | },
73 | {
74 | "cell_type": "code",
75 | "execution_count": null,
76 | "metadata": {
77 | "collapsed": true
78 | },
79 | "outputs": [],
80 | "source": [
81 | "sess = tf.InteractiveSession()"
82 | ]
83 | },
84 | {
85 | "cell_type": "code",
86 | "execution_count": null,
87 | "metadata": {},
88 | "outputs": [],
89 | "source": [
90 | "z_samples = z.eval()\n",
91 | "print(type(z_samples))\n",
92 | "print(z_samples.shape)"
93 | ]
94 | },
95 | {
96 | "cell_type": "code",
97 | "execution_count": null,
98 | "metadata": {},
99 | "outputs": [],
100 | "source": [
101 | "fig = plt.figure(figsize=(5, 5))\n",
102 | "plt.scatter(z_samples[:, 0], z_samples[:, 1], s=10)\n",
103 | "plt.title(\"Base distribution: standard normal\")\n",
104 | "plt.xlim([-4, 4])\n",
105 | "plt.ylim([-4, 4])\n",
106 | "plt.show()"
107 | ]
108 | },
109 | {
110 | "cell_type": "markdown",
111 | "metadata": {},
112 | "source": [
113 | "### Transform the distribution"
114 | ]
115 | },
116 | {
117 | "cell_type": "markdown",
118 | "metadata": {},
119 | "source": [
120 | "A Bijector is used to transform distributions. Bijectors are the building blocks for a normalising flow. \n",
121 | "They are characterised by the following three main methods:\n",
122 | " 1. forward\n",
123 | " 2. inverse\n",
124 | " 3. log_det_jacobian\n",
125 | "\n",
126 | "Conventionally, think of the `forward` operation as acting on the base distribution (generate samples) and the `inverse` operation is used to calculate probabilities.\n",
127 | "\n",
128 | "For example, the Affine Bijector:"
129 | ]
130 | },
131 | {
132 | "cell_type": "code",
133 | "execution_count": null,
134 | "metadata": {
135 | "collapsed": true
136 | },
137 | "outputs": [],
138 | "source": [
139 | "affine_bijector = tfb.Affine(shift=[1., -1.], scale_diag=[0.5, 1.5])"
140 | ]
141 | },
142 | {
143 | "cell_type": "code",
144 | "execution_count": null,
145 | "metadata": {
146 | "collapsed": true
147 | },
148 | "outputs": [],
149 | "source": [
150 | "fwd_z = affine_bijector.forward(z)"
151 | ]
152 | },
153 | {
154 | "cell_type": "code",
155 | "execution_count": null,
156 | "metadata": {
157 | "collapsed": true
158 | },
159 | "outputs": [],
160 | "source": [
161 | "z_samples, x_samples = sess.run([z, fwd_z])"
162 | ]
163 | },
164 | {
165 | "cell_type": "code",
166 | "execution_count": null,
167 | "metadata": {},
168 | "outputs": [],
169 | "source": [
170 | "fig = plt.figure(figsize=(12, 5))\n",
171 | "ax = fig.add_subplot(121)\n",
172 | "ax2 = fig.add_subplot(122)\n",
173 | "\n",
174 | "ax.scatter(z_samples[:, 0], z_samples[:, 1], s=10)\n",
175 | "ax.set_title(\"Base distribution: standard normal\")\n",
176 | "ax.set_xlim([-5, 5])\n",
177 | "ax.set_ylim([-5, 5])\n",
178 | "\n",
179 | "ax2.scatter(x_samples[:, 0], x_samples[:, 1], s=10, color='r')\n",
180 | "ax2.set_title(\"Transformed distribution: shift [1, -1], scale [0.5, 1.5]\")\n",
181 | "ax2.set_xlim([-5, 5])\n",
182 | "ax2.set_ylim([-5, 5])\n",
183 | "plt.show()"
184 | ]
185 | },
186 | {
187 | "cell_type": "code",
188 | "execution_count": null,
189 | "metadata": {
190 | "collapsed": true
191 | },
192 | "outputs": [],
193 | "source": [
194 | "fwd_inv_z = affine_bijector.inverse(fwd_z)"
195 | ]
196 | },
197 | {
198 | "cell_type": "code",
199 | "execution_count": null,
200 | "metadata": {},
201 | "outputs": [],
202 | "source": [
203 | "latents = np.random.random((SAMPLE_BATCH_SIZE, 2))\n",
204 | "print(np.allclose(latents, sess.run(fwd_inv_z, feed_dict={z: latents})))"
205 | ]
206 | },
207 | {
208 | "cell_type": "markdown",
209 | "metadata": {},
210 | "source": [
211 | "### Computing probabilities"
212 | ]
213 | },
214 | {
215 | "cell_type": "code",
216 | "execution_count": null,
217 | "metadata": {},
218 | "outputs": [],
219 | "source": [
220 | "x = tf.placeholder(shape=(1, 2), dtype=tf.float32)\n",
221 | "\n",
222 | "log_det_dzdx = affine_bijector.inverse_log_det_jacobian(x, event_ndims=1)\n",
223 | "log_det_dzdx"
224 | ]
225 | },
226 | {
227 | "cell_type": "code",
228 | "execution_count": null,
229 | "metadata": {},
230 | "outputs": [],
231 | "source": [
232 | "inv_x = affine_bijector.inverse(x)\n",
233 | "inv_x"
234 | ]
235 | },
236 | {
237 | "cell_type": "code",
238 | "execution_count": null,
239 | "metadata": {},
240 | "outputs": [],
241 | "source": [
242 | "log_prob_inv_x = base_dist.log_prob(inv_x)\n",
243 | "log_prob_inv_x"
244 | ]
245 | },
246 | {
247 | "cell_type": "code",
248 | "execution_count": null,
249 | "metadata": {},
250 | "outputs": [],
251 | "source": [
252 | "x_fixed_sample = np.array([[1., -1.]]) # Mode of the transformed distribution\n",
253 | "\n",
254 | "sess.run(log_det_dzdx, feed_dict={x: x_fixed_sample})"
255 | ]
256 | },
257 | {
258 | "cell_type": "markdown",
259 | "metadata": {},
260 | "source": [
261 | "Check: Jacobian determinant is just the product of scaling factors"
262 | ]
263 | },
264 | {
265 | "cell_type": "code",
266 | "execution_count": null,
267 | "metadata": {},
268 | "outputs": [],
269 | "source": [
270 | "- np.log(0.5) - np.log(1.5)"
271 | ]
272 | },
273 | {
274 | "cell_type": "markdown",
275 | "metadata": {},
276 | "source": [
277 | "Calculate log probability of `x`:"
278 | ]
279 | },
280 | {
281 | "cell_type": "code",
282 | "execution_count": null,
283 | "metadata": {},
284 | "outputs": [],
285 | "source": [
286 | "sess.run(log_prob_inv_x + log_det_dzdx, feed_dict={x: np.array([[1., -1.]])})"
287 | ]
288 | },
289 | {
290 | "cell_type": "markdown",
291 | "metadata": {},
292 | "source": [
293 | "Check:"
294 | ]
295 | },
296 | {
297 | "cell_type": "code",
298 | "execution_count": null,
299 | "metadata": {},
300 | "outputs": [],
301 | "source": [
302 | "np.log(np.sqrt(1 / (2 * np.pi)**2)) - np.log(0.5) - np.log(1.5)"
303 | ]
304 | },
305 | {
306 | "cell_type": "code",
307 | "execution_count": null,
308 | "metadata": {
309 | "collapsed": true
310 | },
311 | "outputs": [],
312 | "source": [
313 | "sess.close()"
314 | ]
315 | }
316 | ],
317 | "metadata": {
318 | "kernelspec": {
319 | "display_name": "Python 3",
320 | "language": "python",
321 | "name": "python3"
322 | },
323 | "language_info": {
324 | "codemirror_mode": {
325 | "name": "ipython",
326 | "version": 3
327 | },
328 | "file_extension": ".py",
329 | "mimetype": "text/x-python",
330 | "name": "python",
331 | "nbconvert_exporter": "python",
332 | "pygments_lexer": "ipython3",
333 | "version": "3.6.3"
334 | }
335 | },
336 | "nbformat": 4,
337 | "nbformat_minor": 2
338 | }
339 |
--------------------------------------------------------------------------------
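The bijectors notebook assembles `log_prob_inv_x + log_det_dzdx` by hand. TensorFlow Probability can do the same change-of-variables bookkeeping via `tfd.TransformedDistribution`, which pairs a base distribution with a bijector. The sketch below re-creates the notebook's base distribution and Affine bijector and should reproduce the manually computed log probability at the mode.

```python
# Sketch: the manual change-of-variables computation in the notebook, expressed with
# tfd.TransformedDistribution. Uses the same base distribution and Affine bijector.
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp
tfd = tfp.distributions
tfb = tfp.bijectors

base_dist = tfd.MultivariateNormalDiag(loc=tf.zeros([2]), scale_diag=tf.ones([2]))
affine_bijector = tfb.Affine(shift=[1., -1.], scale_diag=[0.5, 1.5])
transformed = tfd.TransformedDistribution(distribution=base_dist, bijector=affine_bijector)

x = np.array([[1., -1.]], dtype=np.float32)   # mode of the transformed distribution
with tf.Session() as sess:
    # should match log_prob_inv_x + log_det_dzdx from the notebook (approximately -1.55)
    print(sess.run(transformed.log_prob(x)))
```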
/tensorflow-tutorial/week_9/Week_9 pt 2 - IAF.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "### Week 9: Normalising flows pt 3 - improved variational posterior with IAF"
8 | ]
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": null,
13 | "metadata": {},
14 | "outputs": [],
15 | "source": [
16 | "import tensorflow as tf\n",
17 | "import tensorflow_probability as tfp\n",
18 | "tfd = tfp.distributions\n",
19 | "tfb = tfp.bijectors\n",
20 | "\n",
21 | "import numpy as np\n",
22 | "import matplotlib.pyplot as plt\n",
23 | "from IPython import display\n",
24 | "%matplotlib inline"
25 | ]
26 | },
27 | {
28 | "cell_type": "markdown",
29 | "metadata": {},
30 | "source": [
31 | "## Improved variational posterior"
32 | ]
33 | },
34 | {
35 | "cell_type": "code",
36 | "execution_count": null,
37 | "metadata": {
38 | "collapsed": true
39 | },
40 | "outputs": [],
41 | "source": [
42 | "tf.set_random_seed(100)"
43 | ]
44 | },
45 | {
46 | "cell_type": "code",
47 | "execution_count": null,
48 | "metadata": {},
49 | "outputs": [],
50 | "source": [
51 | "class VAE():\n",
52 | "\n",
53 | " def __init__(self, use_iaf=False):\n",
54 | " self.sess = tf.Session()\n",
55 | " self.lambda_l2_reg = 0.01\n",
56 | " self.learning_rate = 0.001\n",
57 | " self.dropout = 1.\n",
58 | " self.use_iaf = use_iaf\n",
59 | "\n",
60 | " handles = self._buildGraph()\n",
61 | " self.sess.run(tf.global_variables_initializer())\n",
62 | "\n",
63 | " (self.x_in, self.dropout_, self.z_mean, self.z_log_sigma, self.z_sample,\n",
64 | " self.x_reconstructed, self.cost, self.global_step, self.train_op,\n",
65 | " self.rec_loss, self.kl_loss) = handles\n",
66 | "\n",
67 | " def _buildGraph(self):\n",
68 | " x_in = tf.placeholder(tf.float32, shape=[None, 2], name=\"x\")\n",
69 | " dropout = tf.placeholder_with_default(1., shape=[], name=\"dropout\")\n",
70 | "\n",
71 | " h = tf.layers.Dense(8, activation=tf.nn.tanh, name=\"encoding/1\")(x_in)\n",
72 | " h = tf.layers.Dense(8, activation=tf.nn.tanh, name=\"encoding/2\")(h)\n",
73 | " \n",
74 | " z_mean = tf.layers.Dense(2, activation=None, name=\"z_mean\")(h)\n",
75 | " z_log_sigma = tf.layers.Dense(2, activation=None, name=\"z_log_sigma\")(h)\n",
76 | " \n",
77 | " z = tfd.MultivariateNormalDiag(loc=z_mean, scale_diag=tf.exp(z_log_sigma))\n",
78 | " \n",
79 | " if not self.use_iaf:\n",
80 | " z_sample = z.sample()\n",
81 | " else: \n",
82 | " iaf_flow = self.build_iaf_flow(z)\n",
83 | " z_sample = iaf_flow.sample()\n",
84 | " \n",
85 | " h = tf.layers.Dense(8, activation=tf.nn.sigmoid, name=\"decoding/1\")(z_sample)\n",
86 |     "        h = tf.layers.Dense(8, activation=tf.nn.sigmoid, name=\"decoding/2\")(h)\n",
87 | " \n",
88 | " x_reconstructed = tf.layers.Dense(2, activation=None, name=\"decoding/out\")(h)\n",
89 | " \n",
90 | " with tf.name_scope(\"l2_loss\"):\n",
91 | " rec_loss = tf.reduce_sum(tf.square(x_reconstructed - x_in), 1)\n",
92 | "\n",
93 | " if not self.use_iaf:\n",
94 | " kl_loss = VAE.kullbackLeibler(z_mean, z_log_sigma)\n",
95 |     "        else:\n",
96 |     "            # Single-sample Monte Carlo estimate of KL(q(z|x) || p(z)) = E_q[log q(z) - log p(z)]\n",
97 |     "            prior = tfd.MultivariateNormalDiag(loc=tf.zeros([2]))\n",
98 |     "            kl_loss = iaf_flow.log_prob(z_sample) - tf.log(prior.prob(z_sample) + 1e-10)\n",
98 | "\n",
99 | " with tf.name_scope(\"l2_regularization\"):\n",
100 | " regularizers = [tf.nn.l2_loss(var) for var in self.sess.graph.get_collection(\n",
101 | " \"trainable_variables\") if (\"kernel\" in var.name and \"decoding\" not in var.name)]\n",
102 | " l2_reg = self.lambda_l2_reg * tf.add_n(regularizers)\n",
103 | "\n",
104 | " with tf.name_scope(\"cost\"):\n",
105 | " cost = tf.reduce_mean(rec_loss + kl_loss, name=\"vae_cost\")\n",
106 | " cost += l2_reg\n",
107 | "\n",
108 | " global_step = tf.Variable(0, trainable=False)\n",
109 | " with tf.name_scope(\"Adam_optimizer\"):\n",
110 | " optimizer = tf.train.AdamOptimizer(self.learning_rate)\n",
111 | " tvars = tf.trainable_variables()\n",
112 | " self.grads_and_vars = optimizer.compute_gradients(cost, tvars)\n",
113 | " clipped = [(tf.clip_by_value(grad, -0.1, 0.1), tvar)\n",
114 | " for grad, tvar in self.grads_and_vars]\n",
115 | " train_op = optimizer.apply_gradients(clipped, global_step=global_step,\n",
116 | " name=\"minimize_cost\")\n",
117 | "\n",
118 | " return (x_in, dropout, z_mean, z_log_sigma, z_sample, x_reconstructed,\n",
119 | " cost, global_step, train_op, tf.reduce_mean(rec_loss), tf.reduce_mean(kl_loss))\n",
120 | "\n",
121 | " @staticmethod\n",
122 | " def kullbackLeibler(mu, log_sigma):\n",
123 | " with tf.name_scope(\"KL_divergence\"):\n",
124 | " return -0.5 * tf.reduce_sum(1 + 2 * log_sigma - mu**2 -\n",
125 | " tf.exp(2 * log_sigma), 1)\n",
126 | " \n",
127 | " def build_iaf_flow(self, base_dist):\n",
128 | " bijectors = [\n",
129 | " tfb.MaskedAutoregressiveFlow(shift_and_log_scale_fn=tfb.masked_autoregressive_default_template(\n",
130 | " hidden_layers=[64, 64])),\n",
131 | " tfb.Permute(permutation=[1, 0]),\n",
132 | " tfb.MaskedAutoregressiveFlow(shift_and_log_scale_fn=tfb.masked_autoregressive_default_template(\n",
133 | " hidden_layers=[64, 64])),\n",
134 | " tfb.Permute(permutation=[1, 0]),\n",
135 | " tfb.MaskedAutoregressiveFlow(shift_and_log_scale_fn=tfb.masked_autoregressive_default_template(\n",
136 | " hidden_layers=[64, 64])),\n",
137 | " tfb.Permute(permutation=[1, 0]),\n",
138 | " tfb.MaskedAutoregressiveFlow(shift_and_log_scale_fn=tfb.masked_autoregressive_default_template(\n",
139 | " hidden_layers=[64, 64]))\n",
140 | " ]\n",
141 | "\n",
142 |     "        # An IAF is the inverse of a masked autoregressive flow (MAF): chain the MAF\n",
143 |     "        # bijectors (with permutations in between) and invert the chain.\n",
144 |     "        maf_bijector = tfb.Chain(list(reversed(bijectors)), name='maf_bijector')\n",
145 |     "        return tfd.TransformedDistribution(distribution=base_dist, bijector=tfb.Invert(maf_bijector))\n",
144 | "\n",
145 | " def encode(self, x):\n",
146 | " # Encodes data points to factorised Gaussian, before passing through IAF flow (if used)\n",
147 | " return self.sess.run([self.z_mean, self.z_log_sigma], feed_dict={self.x_in: x})\n",
148 | " \n",
149 | " def posterior_sample(self, x):\n",
150 | " # Samples from the full posterior (after IAF if used)\n",
151 | " return self.sess.run(self.z_sample, feed_dict={self.x_in: x})\n",
152 | "\n",
153 | " def decode(self, zs):\n",
154 | " return self.sess.run(self.x_reconstructed, feed_dict={self.z_sample: zs})\n",
155 | " \n",
156 |     "    def plot_posterior_distribution(self, X):\n",
157 |     "        # Scatter-plot posterior samples for the four input clusters using this model's sampler\n",
158 | " X1 = X[:64, :]\n",
159 | " X2 = X[64:128, :]\n",
160 | " X3 = X[128:192, :]\n",
161 | " X4 = X[192:, :]\n",
162 |     "        x1_posterior_samples = self.posterior_sample(X1)\n",
163 |     "        x2_posterior_samples = self.posterior_sample(X2)\n",
164 |     "        x3_posterior_samples = self.posterior_sample(X3)\n",
165 |     "        x4_posterior_samples = self.posterior_sample(X4)\n",
166 | " plt.close()\n",
167 | " plt.figure()\n",
168 | " plt.scatter(x1_posterior_samples[:, 0], x1_posterior_samples[:, 1], color='red', s=5)\n",
169 | " plt.scatter(x2_posterior_samples[:, 0], x2_posterior_samples[:, 1], color='blue', s=5)\n",
170 | " plt.scatter(x3_posterior_samples[:, 0], x3_posterior_samples[:, 1], color='green', s=5)\n",
171 | " plt.scatter(x4_posterior_samples[:, 0], x4_posterior_samples[:, 1], color='purple', s=5)\n",
172 | " plt.title(\"Posterior distributions\")\n",
173 | " display.display(plt.gcf())\n",
174 | " display.clear_output(wait=True)\n",
175 | "\n",
176 | " def train(self, x, max_iter=np.inf):\n",
177 | " losses = []\n",
178 | " iterations = []\n",
179 | " while True: \n",
180 | " feed_dict = {self.x_in: x, self.dropout_: self.dropout}\n",
181 | " x_reconstructed, cost, rec_loss, kl_loss, _, i = self.sess.run(\n",
182 | " [self.x_reconstructed, self.cost, self.rec_loss, \n",
183 | " self.kl_loss, self.train_op, self.global_step], feed_dict\n",
184 | " )\n",
185 | "\n",
186 | " if i%500 == 1:\n",
187 | " print(\"Iteration {}, cost: \".format(i), cost)\n",
188 | " print(\" rec_loss: {}, kl_loss: {}\".format(rec_loss, kl_loss))\n",
189 | " losses.append(cost)\n",
190 | " iterations.append(i)\n",
191 |     "                self.plot_posterior_distribution(x)\n",
192 | "\n",
193 | " if i >= max_iter:\n",
194 | " print(\"Finished training. Final cost at iteration {}: {}\".format(i, cost))\n",
195 | " print(\" rec_loss: {}, kl_loss: {}\".format(rec_loss, kl_loss))\n",
196 | " losses.append(cost)\n",
197 | " iterations.append(i)\n",
198 | " break\n",
199 | " return losses, iterations"
200 | ]
201 | },
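202 |   {
203 |    "cell_type": "markdown",
204 |    "metadata": {},
205 |    "source": [
206 |     "The KL term used with the IAF posterior is a single-sample Monte Carlo estimate of KL(q(z|x) || p(z)) = E_q[log q(z) - log p(z)]. As a quick sanity check (a sketch with arbitrary example distributions and illustrative variable names, not part of the model), the estimate agrees with the analytic KL between two diagonal Gaussians:"
207 |    ]
208 |   },
209 |   {
210 |    "cell_type": "code",
211 |    "execution_count": null,
212 |    "metadata": {},
213 |    "outputs": [],
214 |    "source": [
215 |     "# Monte Carlo estimate of KL(q || p) versus the closed-form value (illustration only)\n",
216 |     "q = tfd.MultivariateNormalDiag(loc=[1., -1.], scale_diag=[0.5, 0.5])\n",
217 |     "p = tfd.MultivariateNormalDiag(loc=tf.zeros([2]))\n",
218 |     "z_mc = q.sample(1000)\n",
219 |     "mc_kl = tf.reduce_mean(q.log_prob(z_mc) - p.log_prob(z_mc))\n",
220 |     "\n",
221 |     "with tf.Session() as check_sess:\n",
222 |     "    print(check_sess.run([mc_kl, tfd.kl_divergence(q, p)]))"
223 |    ]
224 |   },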
202 | {
203 | "cell_type": "code",
204 | "execution_count": null,
205 | "metadata": {},
206 | "outputs": [],
207 | "source": [
208 | "x1 = np.array([5., 5.])\n",
209 | "x2 = np.array([-5., 5.])\n",
210 | "x3 = np.array([-5., -5.])\n",
211 | "x4 = np.array([5., -5.])\n",
212 | "\n",
213 | "X1 = np.vstack((x1,) * 64)\n",
214 | "X2 = np.vstack((x2,) * 64)\n",
215 | "X3 = np.vstack((x3,) * 64)\n",
216 | "X4 = np.vstack((x4,) * 64)\n",
217 | "\n",
218 | "X_train = np.vstack((X1, X2, X3, X4))\n",
219 | "X_train.shape"
220 | ]
221 | },
222 | {
223 | "cell_type": "code",
224 | "execution_count": null,
225 | "metadata": {},
226 | "outputs": [],
227 | "source": [
228 |     "# Baseline model with a factorised Gaussian posterior (set use_iaf=True to use the IAF posterior)\n",
229 |     "model = VAE(use_iaf=False)"
229 | ]
230 | },
231 | {
232 | "cell_type": "code",
233 | "execution_count": null,
234 | "metadata": {},
235 | "outputs": [],
236 | "source": [
237 | "import time\n",
238 | "start_time = time.time()\n",
239 | "losses, iterations = model.train(X_train, max_iter=10000)\n",
240 | "end_time = time.time()\n",
241 | "\n",
242 | "print(\"Training time: {}\".format(end_time - start_time))"
243 | ]
244 | },
245 | {
246 | "cell_type": "code",
247 | "execution_count": null,
248 | "metadata": {},
249 | "outputs": [],
250 | "source": [
251 | "plt.plot(iterations, losses)\n",
252 | "plt.title(\"Training curve\")\n",
253 | "plt.show()"
254 | ]
255 | },
256 | {
257 | "cell_type": "code",
258 | "execution_count": null,
259 | "metadata": {},
260 | "outputs": [],
261 | "source": [
262 |     "# Plot the factorised Gaussian posteriors (before any IAF flow), sampled from the encoder outputs\n",
263 | "\n",
264 | "num_samples = 256\n",
265 | "\n",
266 | "x1_mean, x1_log_sigma = model.encode(np.expand_dims(x1, 0))\n",
267 | "x2_mean, x2_log_sigma = model.encode(np.expand_dims(x2, 0))\n",
268 | "x3_mean, x3_log_sigma = model.encode(np.expand_dims(x3, 0))\n",
269 | "x4_mean, x4_log_sigma = model.encode(np.expand_dims(x4, 0))\n",
270 | "x1_samples = np.random.normal(loc=np.vstack((x1_mean,) * num_samples), scale=np.vstack((np.exp(x1_log_sigma),) * num_samples))\n",
271 | "x2_samples = np.random.normal(loc=np.vstack((x2_mean,) * num_samples), scale=np.vstack((np.exp(x2_log_sigma),) * num_samples))\n",
272 | "x3_samples = np.random.normal(loc=np.vstack((x3_mean,) * num_samples), scale=np.vstack((np.exp(x3_log_sigma),) * num_samples))\n",
273 | "x4_samples = np.random.normal(loc=np.vstack((x4_mean,) * num_samples), scale=np.vstack((np.exp(x4_log_sigma),) * num_samples))\n",
274 | "plt.scatter(x1_samples[:, 0], x1_samples[:, 1], color='red', s=5)\n",
275 | "plt.scatter(x2_samples[:, 0], x2_samples[:, 1], color='blue', s=5)\n",
276 | "plt.scatter(x3_samples[:, 0], x3_samples[:, 1], color='green', s=5)\n",
277 | "plt.scatter(x4_samples[:, 0], x4_samples[:, 1], color='purple', s=5)\n",
278 | "plt.title(\"Posterior distributions before IAF flow\")\n",
279 | "plt.show()"
280 | ]
281 | },
282 | {
283 | "cell_type": "code",
284 | "execution_count": null,
285 | "metadata": {
286 | "collapsed": true
287 | },
288 | "outputs": [],
289 | "source": [
290 | "num_samples = 256\n",
291 | "\n",
292 | "x1_posterior_samples = model.posterior_sample(np.stack([x1] * num_samples))\n",
293 | "x2_posterior_samples = model.posterior_sample(np.stack([x2] * num_samples))\n",
294 | "x3_posterior_samples = model.posterior_sample(np.stack([x3] * num_samples))\n",
295 | "x4_posterior_samples = model.posterior_sample(np.stack([x4] * num_samples))"
296 | ]
297 | },
298 | {
299 | "cell_type": "code",
300 | "execution_count": null,
301 | "metadata": {},
302 | "outputs": [],
303 | "source": [
304 |     "# Plot samples from the full posterior (after the IAF flow, when use_iaf=True)\n",
305 | "\n",
306 | "plt.scatter(x1_posterior_samples[:, 0], x1_posterior_samples[:, 1], color='red', s=5)\n",
307 | "plt.scatter(x2_posterior_samples[:, 0], x2_posterior_samples[:, 1], color='blue', s=5)\n",
308 | "plt.scatter(x3_posterior_samples[:, 0], x3_posterior_samples[:, 1], color='green', s=5)\n",
309 | "plt.scatter(x4_posterior_samples[:, 0], x4_posterior_samples[:, 1], color='purple', s=5)\n",
310 | "plt.title(\"Posterior distributions after IAF flow\")\n",
311 | "plt.show()"
312 | ]
313 | },
314 | {
315 | "cell_type": "code",
316 | "execution_count": null,
317 | "metadata": {},
318 | "outputs": [],
319 | "source": [
320 | "x1_decoded = model.decode(x1_posterior_samples)\n",
321 | "x2_decoded = model.decode(x2_posterior_samples)\n",
322 | "x3_decoded = model.decode(x3_posterior_samples)\n",
323 | "x4_decoded = model.decode(x4_posterior_samples)\n",
324 | "plt.scatter(x1_decoded[:, 0], x1_decoded[:, 1], color='red', s=5)\n",
325 | "plt.scatter(x2_decoded[:, 0], x2_decoded[:, 1], color='blue', s=5)\n",
326 | "plt.scatter(x3_decoded[:, 0], x3_decoded[:, 1], color='green', s=5)\n",
327 | "plt.scatter(x4_decoded[:, 0], x4_decoded[:, 1], color='purple', s=5)\n",
328 | "plt.title(\"Reconstructions of data points\")\n",
329 | "plt.show()"
330 | ]
331 | },
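332 |   {
333 |    "cell_type": "markdown",
334 |    "metadata": {},
335 |    "source": [
336 |     "To see the effect of the IAF posterior, the experiment above can be repeated with `use_iaf=True`. A minimal sketch (the names `model_iaf`, `iaf_losses` and `iaf_iterations` are purely illustrative):"
337 |    ]
338 |   },
339 |   {
340 |    "cell_type": "code",
341 |    "execution_count": null,
342 |    "metadata": {},
343 |    "outputs": [],
344 |    "source": [
345 |     "# Sketch: rebuild and retrain with the IAF posterior for comparison with the run above.\n",
346 |     "# Close the baseline session and reset the default graph so the two models do not share\n",
347 |     "# variables (the model.sess.close() cell below then becomes a harmless no-op).\n",
348 |     "model.sess.close()\n",
349 |     "tf.reset_default_graph()\n",
350 |     "model_iaf = VAE(use_iaf=True)\n",
351 |     "iaf_losses, iaf_iterations = model_iaf.train(X_train, max_iter=10000)\n",
352 |     "model_iaf.sess.close()"
353 |    ]
354 |   },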
332 | {
333 | "cell_type": "code",
334 | "execution_count": null,
335 | "metadata": {
336 | "collapsed": true
337 | },
338 | "outputs": [],
339 | "source": [
340 | "model.sess.close()"
341 | ]
342 | }
343 | ],
344 | "metadata": {
345 | "kernelspec": {
346 | "display_name": "Python 3",
347 | "language": "python",
348 | "name": "python3"
349 | },
350 | "language_info": {
351 | "codemirror_mode": {
352 | "name": "ipython",
353 | "version": 3
354 | },
355 | "file_extension": ".py",
356 | "mimetype": "text/x-python",
357 | "name": "python",
358 | "nbconvert_exporter": "python",
359 | "pygments_lexer": "ipython3",
360 | "version": "3.6.3"
361 | }
362 | },
363 | "nbformat": 4,
364 | "nbformat_minor": 2
365 | }
366 |
--------------------------------------------------------------------------------