├── .gitignore
├── LICENSE.md
├── README.md
├── examples
│   ├── betavae.py
│   ├── images
│   │   ├── adgm.png
│   │   ├── conditional.png
│   │   ├── dgm.png
│   │   ├── gvae.png
│   │   ├── ladder.png
│   │   ├── ladderdgm.png
│   │   └── vae.png
│   ├── mnist_sslvae.py
│   └── notebooks
│       ├── Auxiliary Deep Generative Model.ipynb
│       ├── Beta Variational Autoencoder.ipynb
│       ├── Deep Generative Model.ipynb
│       ├── Gumbel Softmax.ipynb
│       ├── Ladder Deep Generative Model.ipynb
│       ├── Ladder Variational Autoencoder.ipynb
│       ├── Variational Autoencoder.ipynb
│       └── datautils.py
└── semi-supervised
    ├── inference
    │   ├── __init__.py
    │   ├── distributions.py
    │   └── variational.py
    ├── layers
    │   ├── __init__.py
    │   ├── flow.py
    │   └── stochastic.py
    ├── models
    │   ├── __init__.py
    │   ├── dgm.py
    │   └── vae.py
    └── utils.py
/.gitignore:
--------------------------------------------------------------------------------
1 | __pycache__
2 | raw
3 | processed
4 | .idea
5 | .DS_Store
6 | .ipynb_checkpoints
7 | mldata
8 | *.pt
9 | *.log
--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
1 | MIT License (MIT)
2 | =====================
3 |
4 | Copyright © 2017 Jesper Wohlert
5 |
6 | Permission is hereby granted, free of charge, to any person
7 | obtaining a copy of this software and associated documentation
8 | files (the “Software”), to deal in the Software without
9 | restriction, including without limitation the rights to use,
10 | copy, modify, merge, publish, distribute, sublicense, and/or sell
11 | copies of the Software, and to permit persons to whom the
12 | Software is furnished to do so, subject to the following
13 | conditions:
14 |
15 | The above copyright notice and this permission notice shall be
16 | included in all copies or substantial portions of the Software.
17 |
18 | THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND,
19 | EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
20 | OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
21 | NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
22 | HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
23 | WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
24 | FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
25 | OTHER DEALINGS IN THE SOFTWARE.
26 |
27 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Semi-supervised PyTorch
2 |
3 | A PyTorch-based package containing useful models for modern deep semi-supervised learning and deep generative models. Want to jump right into it? Look into the [notebooks](examples/notebooks).
4 |
5 | ### Latest additions
6 |
7 | *2018.04.17* - The Gumbel softmax notebook has been added to show how you can use discrete latent variables in VAEs.
8 | 
9 | *2018.02.28* - The β-VAE notebook was added to show how VAEs can learn disentangled representations.
10 |
11 | ## What is semi-supervised learning?
12 |
13 | Semi-supervised learning tries to bridge the gap between supervised and unsupervised learning by learning from both
14 | labelled and unlabelled data.
15 |
16 | Semi-supervised learning is typically applied in areas where data is easy to come by, but labelling is expensive.
17 | Normally, one would either use an unsupervised method or just the few labelled examples - both of which are
18 | likely to yield poor results.
19 |
20 | The current state-of-the-art method in semi-supervised learning achieves an accuracy of over 99% on the MNIST dataset using just **10 labelled examples per class**.
21 |
22 | ## Conditional generation
23 |
24 | Most semi-supervised models simultaneously train an inference network and a generator network. This means that it is
25 | not only possible to query these models for classification, but also to generate new data from the trained model.
26 | By separating out the label information, one can generate a new sample with a given digit, as shown in the image
27 | below from Kingma 2014. A usage sketch is given at the end of this README.
28 |
29 | 
30 |
31 | ## Implemented models and methods:
32 |
33 | * [Variational Autoencoder (Kingma 2013)](https://arxiv.org/abs/1312.6114)
34 | * [Importance Weighted Autoencoders (Burda 2015)](https://arxiv.org/abs/1509.00519)
35 | * [Variational Inference with Normalizing Flows (Rezende & Mohamed 2015)](https://arxiv.org/abs/1505.05770)
36 | * [Semi-supervised Learning with Deep Generative Models (Kingma 2014)](https://arxiv.org/abs/1406.5298)
37 | * [Auxiliary Deep Generative Models (Maaløe 2016)](https://arxiv.org/abs/1602.05473)
38 | * [Ladder Variational Autoencoders (Sønderby 2016)](https://arxiv.org/abs/1602.02282)
39 | * [β-VAE (Higgins 2017)](https://openreview.net/forum?id=Sy2fzU9gl)
40 |
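41 | ## Usage
42 | 
43 | A minimal sketch of how the pieces fit together, assuming the `semi-supervised` directory is on your path. It is distilled from the example scripts and notebooks - see those for complete, runnable training loops (they also wrap each batch in `torch.autograd.Variable`, as PyTorch required at the time).
44 | 
45 | ```python
46 | import torch
47 | from models import DeepGenerativeModel
48 | from inference import SVI, DeterministicWarmup
49 | from utils import onehot
50 | 
51 | # [x dim, y dim, z dim, hidden layer sizes]
52 | model = DeepGenerativeModel([784, 10, 50, [600, 600]])
53 | 
54 | def binary_cross_entropy(r, x):
55 |     return -torch.sum(x * torch.log(r + 1e-8) + (1 - x) * torch.log(1 - r + 1e-8), dim=-1)
56 | 
57 | # SVI computes the labelled bound when a one-hot label is given and the
58 | # unlabelled bound, enumerating over all labels, when it is not.
59 | elbo = SVI(model, likelihood=binary_cross_entropy, beta=DeterministicWarmup(n=200))
60 | L = -elbo(x, y)  # x, y: a labelled batch with one-hot labels
61 | U = -elbo(u)     # u: an unlabelled batch
62 | 
63 | # Conditional generation: decode noise together with a one-hot label,
64 | # here a batch of 16 fives.
65 | x_mu = model.sample(torch.randn(16, 50), onehot(10)(5).repeat(16, 1))
66 | ```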
--------------------------------------------------------------------------------
/examples/betavae.py:
--------------------------------------------------------------------------------
1 | from urllib import request
2 | 
3 | import torch
4 | import torch.nn as nn
5 | import numpy as np
6 | import sys
7 | from torch.utils.data import Dataset, DataLoader
8 | from torch.utils.data.sampler import SubsetRandomSampler
9 | sys.path.append("../semi-supervised")
10 |
11 | torch.manual_seed(1337)
12 | np.random.seed(1337)
13 |
14 | cuda = torch.cuda.is_available()
15 | print("CUDA: {}".format(cuda))
16 |
17 | def binary_cross_entropy(r, x):
18 | "Drop in replacement until PyTorch adds `reduce` keyword."
19 | return -torch.sum(x * torch.log(r + 1e-8) + (1 - x) * torch.log(1 - r + 1e-8), dim=-1)
20 |
21 |
22 | class SpriteDataset(Dataset):
23 | def __init__(self, transform=None, download=False):
24 | self.transform = transform
25 | url = "https://github.com/deepmind/dsprites-dataset/raw/master/dsprites_ndarray_co1sh3sc6or40x32y32_64x64.npz"
26 |
27 | if download:
28 | request.urlretrieve(url, "./dsprites.npz")
29 |
30 | try:
31 | self.dset = np.load("./dsprites.npz", encoding="bytes")["imgs"]
32 | except FileNotFoundError:
33 | raise FileNotFoundError("Dataset not found, have you set download=True?")
34 |
35 | def __len__(self):
36 | return len(self.dset)
37 |
38 | def __getitem__(self, idx):
39 | sample = self.dset[idx]
40 |
41 | if self.transform:
42 | sample = self.transform(sample)
43 |
44 | return sample
45 |
46 |
47 | if __name__ == "__main__":
48 | from itertools import repeat
49 | from torch.autograd import Variable
50 |
51 | dset = SpriteDataset(transform=lambda x: x.reshape(-1), download=True)
52 | unlabelled = DataLoader(dset, batch_size=16, sampler=SubsetRandomSampler(np.arange(len(dset)//3)))
53 |
54 | models = []
55 |
56 | from models import VariationalAutoencoder
57 | model = VariationalAutoencoder([64**2, 10, [1200, 1200]])
58 | model.decoder = nn.Sequential(
59 | nn.Linear(10, 1200),
60 | nn.Tanh(),
61 | nn.Linear(1200, 1200),
62 | nn.Tanh(),
63 | nn.Linear(1200, 1200),
64 | nn.Tanh(),
65 | nn.Linear(1200, 64**2),
66 | nn.Sigmoid(),
67 | )
68 |
69 | if cuda: model = model.cuda()
70 |
71 | beta = repeat(4.0)  # constant β = 4 weighting the KL term, as in the β-VAE objective
72 | optimizer = torch.optim.Adagrad(model.parameters(), lr=1e-2)
73 |
74 | epochs = 251
75 | best = np.inf
76 |
77 | file = open(model.__class__.__name__ + ".log", 'w+')
78 |
79 | for epoch in range(epochs):
80 | model.train()
81 | total_loss = 0
82 | for u in unlabelled:
83 | u = Variable(u.float())
84 |
85 | if cuda:
86 | u = u.cuda(device=0)
87 |
88 | reconstruction = model(u)
89 |
90 | likelihood = -binary_cross_entropy(reconstruction, u)
91 | elbo = likelihood - next(beta) * model.kl_divergence
92 |
93 | L = -torch.mean(elbo)
94 |
95 | L.backward()
96 | optimizer.step()
97 | optimizer.zero_grad()
98 |
99 | total_loss += L.data[0]
100 |
101 | m = len(unlabelled)
102 | print(total_loss / m, file=file)
103 |
104 | if total_loss < best:
105 | best = total_loss
106 | torch.save(model, '{}.pt'.format(model.__class__.__name__))
107 |
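108 | # Reloading the best model later is a one-liner, since torch.save above
109 | # persists the full module: model = torch.load("VariationalAutoencoder.pt")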
--------------------------------------------------------------------------------
/examples/images/adgm.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wohlert/semi-supervised-pytorch/7cde4959468d271552febdcfed5e1cfae9857613/examples/images/adgm.png
--------------------------------------------------------------------------------
/examples/images/conditional.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wohlert/semi-supervised-pytorch/7cde4959468d271552febdcfed5e1cfae9857613/examples/images/conditional.png
--------------------------------------------------------------------------------
/examples/images/dgm.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wohlert/semi-supervised-pytorch/7cde4959468d271552febdcfed5e1cfae9857613/examples/images/dgm.png
--------------------------------------------------------------------------------
/examples/images/gvae.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wohlert/semi-supervised-pytorch/7cde4959468d271552febdcfed5e1cfae9857613/examples/images/gvae.png
--------------------------------------------------------------------------------
/examples/images/ladder.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wohlert/semi-supervised-pytorch/7cde4959468d271552febdcfed5e1cfae9857613/examples/images/ladder.png
--------------------------------------------------------------------------------
/examples/images/ladderdgm.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wohlert/semi-supervised-pytorch/7cde4959468d271552febdcfed5e1cfae9857613/examples/images/ladderdgm.png
--------------------------------------------------------------------------------
/examples/images/vae.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/wohlert/semi-supervised-pytorch/7cde4959468d271552febdcfed5e1cfae9857613/examples/images/vae.png
--------------------------------------------------------------------------------
/examples/mnist_sslvae.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import numpy as np
3 | import sys
4 | sys.path.append("../semi-supervised")
5 |
6 | torch.manual_seed(1337)
7 | np.random.seed(1337)
8 |
9 | cuda = torch.cuda.is_available()
10 | print("CUDA: {}".format(cuda))
11 |
12 | def binary_cross_entropy(r, x):
13 | "Drop in replacement until PyTorch adds `reduce` keyword."
14 | return -torch.sum(x * torch.log(r + 1e-8) + (1 - x) * torch.log(1 - r + 1e-8), dim=-1)
15 |
16 | n_labels = 10
17 | def get_mnist(location="./", batch_size=64, labels_per_class=100):
18 | from functools import reduce
19 | from operator import __or__
20 | from torch.utils.data.sampler import SubsetRandomSampler
21 | from torchvision.datasets import MNIST
22 | import torchvision.transforms as transforms
23 | from utils import onehot
24 |
25 | flatten_bernoulli = lambda x: transforms.ToTensor()(x).view(-1).bernoulli()
26 |
27 | mnist_train = MNIST(location, train=True, download=True,
28 | transform=flatten_bernoulli, target_transform=onehot(n_labels))
29 | mnist_valid = MNIST(location, train=False, download=True,
30 | transform=flatten_bernoulli, target_transform=onehot(n_labels))
31 |
32 | def get_sampler(labels, n=None):
33 | # Only choose digits in n_labels
34 | (indices,) = np.where(reduce(__or__, [labels == i for i in np.arange(n_labels)]))
35 |
36 | # Ensure uniform distribution of labels
37 | np.random.shuffle(indices)
38 | indices = np.hstack([list(filter(lambda idx: labels[idx] == i, indices))[:n] for i in range(n_labels)])
39 |
40 | indices = torch.from_numpy(indices)
41 | sampler = SubsetRandomSampler(indices)
42 | return sampler
43 |
44 | # Dataloaders for MNIST
45 | labelled = torch.utils.data.DataLoader(mnist_train, batch_size=batch_size, num_workers=2, pin_memory=cuda,
46 | sampler=get_sampler(mnist_train.train_labels.numpy(), labels_per_class))
47 | unlabelled = torch.utils.data.DataLoader(mnist_train, batch_size=batch_size, num_workers=2, pin_memory=cuda,
48 | sampler=get_sampler(mnist_train.train_labels.numpy()))
49 | validation = torch.utils.data.DataLoader(mnist_valid, batch_size=batch_size, num_workers=2, pin_memory=cuda,
50 | sampler=get_sampler(mnist_valid.test_labels.numpy()))
51 |
52 | return labelled, unlabelled, validation
53 |
54 | if __name__ == "__main__":
55 | from itertools import repeat, cycle
56 | from torch.autograd import Variable
57 | from inference import SVI, DeterministicWarmup, ImportanceWeightedSampler
58 |
59 | labelled, unlabelled, validation = get_mnist(location="./", batch_size=100, labels_per_class=10)
60 | alpha = 0.1 * len(unlabelled) / len(labelled)
61 |
62 | models = []
63 |
64 | # Kingma 2014, M2 model. Reported: 88%, achieved: ??%
65 | # from models import DeepGenerativeModel
66 | # models += [DeepGenerativeModel([784, n_labels, 50, [600, 600]])]
67 |
68 | # Maaløe 2016, ADGM model. Reported: 99.4%, achieved: ??%
69 | # from models import AuxiliaryDeepGenerativeModel
70 | # models += [AuxiliaryDeepGenerativeModel([784, n_labels, 100, 100, [500, 500]])]
71 |
72 | from models import LadderDeepGenerativeModel
73 | models += [LadderDeepGenerativeModel([784, n_labels, [32, 16, 8], [128, 128, 128]])]
74 |
75 | for model in models:
76 | if cuda: model = model.cuda()
77 |
78 | beta = DeterministicWarmup(n=4*len(unlabelled)*100)
79 | sampler = ImportanceWeightedSampler(mc=1, iw=1)
80 |
81 | elbo = SVI(model, likelihood=binary_cross_entropy, beta=beta, sampler=sampler)
82 | optimizer = torch.optim.Adam(model.parameters(), lr=3e-4, betas=(0.9, 0.999))
83 |
84 | epochs = 251
85 | best = 0.0
86 |
87 | file = open(model.__class__.__name__ + ".log", 'w+')
88 |
89 | for epoch in range(epochs):
90 | model.train()
91 | total_loss, labelled_loss, unlabelled_loss, accuracy = (0, 0, 0, 0)
92 | for (x, y), (u, _) in zip(cycle(labelled), unlabelled):
93 | # Wrap in variables
94 | x, y, u = Variable(x), Variable(y), Variable(u)
95 |
96 | if cuda:
97 | # They need to be on the same device and be synchronized.
98 | x, y = x.cuda(device=0), y.cuda(device=0)
99 | u = u.cuda(device=0)
100 |
101 | L = -elbo(x, y)
102 | U = -elbo(u)
103 |
104 | # Add auxiliary classification loss q(y|x)
105 | logits = model.classify(x)
106 | classification_loss = -torch.sum(y * torch.log(logits + 1e-8), dim=1).mean()
107 | 
108 | J_alpha = L + alpha * classification_loss + U
109 |
110 | J_alpha.backward()
111 | optimizer.step()
112 | optimizer.zero_grad()
113 |
114 | total_loss += J_alpha.data[0]
115 | labelled_loss += L.data[0]
116 | unlabelled_loss += U.data[0]
117 |
118 | _, pred_idx = torch.max(logits, 1)
119 | _, lab_idx = torch.max(y, 1)
120 | accuracy += torch.mean((pred_idx.data == lab_idx.data).float())
121 |
122 | m = len(unlabelled)
123 | print(*(total_loss / m, labelled_loss / m, unlabelled_loss / m, accuracy / m), sep="\t", file=file)
124 |
125 | if epoch % 1 == 0:
126 | model.eval()
127 | print("Epoch: {}".format(epoch))
128 | print("[Train]\t\t J_a: {:.2f}, L: {:.2f}, U: {:.2f}, accuracy: {:.2f}".format(total_loss / m,
129 | labelled_loss / m,
130 | unlabelled_loss / m,
131 | accuracy / m))
132 |
133 | total_loss, labelled_loss, unlabelled_loss, accuracy = (0, 0, 0, 0)
134 | for x, y in validation:
135 | x, y = Variable(x), Variable(y)
136 |
137 | if cuda:
138 | x, y = x.cuda(device=0), y.cuda(device=0)
139 |
140 | L = -elbo(x, y)
141 | U = -elbo(x)
142 |
143 | logits = model.classify(x)
144 | classification_loss = -torch.sum(y * torch.log(logits + 1e-8), dim=1).mean()
145 | 
146 | J_alpha = L + alpha * classification_loss + U
147 |
148 | total_loss += J_alpha.data[0]
149 | labelled_loss += L.data[0]
150 | unlabelled_loss += U.data[0]
151 |
152 | _, pred_idx = torch.max(logits, 1)
153 | _, lab_idx = torch.max(y, 1)
154 | accuracy += torch.mean((pred_idx.data == lab_idx.data).float())
155 |
156 | m = len(validation)
157 | print(*(total_loss / m, labelled_loss / m, unlabelled_loss / m, accuracy / m), sep="\t", file=file)
158 | print("[Validation]\t J_a: {:.2f}, L: {:.2f}, U: {:.2f}, accuracy: {:.2f}".format(total_loss / m,
159 | labelled_loss / m,
160 | unlabelled_loss / m,
161 | accuracy / m))
162 |
163 | if accuracy > best:
164 | best = accuracy
165 | torch.save(model, '{}.pt'.format(model.__class__.__name__))
--------------------------------------------------------------------------------
/examples/notebooks/Auxiliary Deep Generative Model.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "code_folding": [
8 | 0
9 | ]
10 | },
11 | "outputs": [],
12 | "source": [
13 | "# Imports\n",
14 | "import torch\n",
15 | "cuda = torch.cuda.is_available()\n",
16 | "import numpy as np\n",
17 | "import matplotlib.pyplot as plt\n",
18 | "%matplotlib inline\n",
19 | "import sys\n",
20 | "sys.path.append(\"../../semi-supervised\")"
21 | ]
22 | },
23 | {
24 | "cell_type": "markdown",
25 | "metadata": {},
26 | "source": [
27 | "# Auxiliary Deep Generative Model\n",
28 | "\n",
29 | "The Auxiliary Deep Generative Model [[Maaløe, 2016]](https://arxiv.org/abs/1602.05473) posits a model that with an auxiliary latent variable $a$ that infers the variables $z$ and $y$. This helps in terms of semi-supervised learning by delegating causality to their respective variables. This model was state-of-the-art in semi-supervised until 2017, and is still very powerful with an MNIST accuracy of *99.4%* using just 10 labelled examples per class.\n",
30 | "\n",
31 | "
\n"
32 | ]
33 | },
34 | {
35 | "cell_type": "code",
36 | "execution_count": 5,
37 | "metadata": {},
38 | "outputs": [
39 | {
40 | "data": {
41 | "text/plain": [
42 | "AuxiliaryDeepGenerativeModel(\n",
43 | " (encoder): Encoder(\n",
44 | " (hidden): ModuleList(\n",
45 | " (0): Linear(in_features=826, out_features=256)\n",
46 | " (1): Linear(in_features=256, out_features=128)\n",
47 | " )\n",
48 | " (sample): GaussianSample(\n",
49 | " (mu): Linear(in_features=128, out_features=32)\n",
50 | " (log_var): Linear(in_features=128, out_features=32)\n",
51 | " )\n",
52 | " )\n",
53 | " (decoder): Decoder(\n",
54 | " (hidden): ModuleList(\n",
55 | " (0): Linear(in_features=42, out_features=128)\n",
56 | " (1): Linear(in_features=128, out_features=256)\n",
57 | " )\n",
58 | " (reconstruction): Linear(in_features=256, out_features=784)\n",
59 | " (output_activation): Sigmoid()\n",
60 | " )\n",
61 | " (classifier): Classifier(\n",
62 | " (dense): Linear(in_features=816, out_features=256)\n",
63 | " (logits): Linear(in_features=256, out_features=10)\n",
64 | " )\n",
65 | " (aux_encoder): Encoder(\n",
66 | " (hidden): ModuleList(\n",
67 | " (0): Linear(in_features=784, out_features=256)\n",
68 | " (1): Linear(in_features=256, out_features=128)\n",
69 | " )\n",
70 | " (sample): GaussianSample(\n",
71 | " (mu): Linear(in_features=128, out_features=32)\n",
72 | " (log_var): Linear(in_features=128, out_features=32)\n",
73 | " )\n",
74 | " )\n",
75 | " (aux_decoder): Encoder(\n",
76 | " (hidden): ModuleList(\n",
77 | " (0): Linear(in_features=826, out_features=128)\n",
78 | " (1): Linear(in_features=128, out_features=256)\n",
79 | " )\n",
80 | " (sample): GaussianSample(\n",
81 | " (mu): Linear(in_features=256, out_features=32)\n",
82 | " (log_var): Linear(in_features=256, out_features=32)\n",
83 | " )\n",
84 | " )\n",
85 | ")"
86 | ]
87 | },
88 | "execution_count": 5,
89 | "metadata": {},
90 | "output_type": "execute_result"
91 | }
92 | ],
93 | "source": [
94 | "from models import AuxiliaryDeepGenerativeModel\n",
95 | "\n",
96 | "y_dim = 10\n",
97 | "z_dim = 32\n",
98 | "a_dim = 32\n",
99 | "h_dim = [256, 128]\n",
100 | "\n",
101 | "model = AuxiliaryDeepGenerativeModel([784, y_dim, z_dim, a_dim, h_dim])\n",
102 | "model"
103 | ]
104 | },
105 | {
106 | "cell_type": "markdown",
107 | "metadata": {},
108 | "source": [
109 | "## Training\n",
110 | "\n",
111 | "The lower bound we derived in the notebook for the **deep generative model** is similar to the one for the ADGM. Here, we also need to integrate over a continuous auxiliary variable $a$.\n",
112 | "\n",
113 | "For labelled data, the lower bound is given by.\n",
114 | "\\begin{align}\n",
115 | "\\log p(x,y) &= \\log \\int \\int p(x, y, a, z) \\ dz \\ da\\\\\n",
116 | "&\\geq \\mathbb{E}_{q(a,z|x,y)} \\bigg [\\log \\frac{p(x,y,a,z)}{q(a,z|x,y)} \\bigg ] = - \\mathcal{L}(x,y)\n",
117 | "\\end{align}\n",
118 | "\n",
119 | "Again when no label information is available we sum out all of the labels.\n",
120 | "\n",
121 | "\\begin{align}\n",
122 | "\\log p(x) &= \\log \\int \\sum_{y} \\int p(x, y, a, z) \\ dz \\ da\\\\\n",
123 | "&\\geq \\mathbb{E}_{q(a,y,z|x)} \\bigg [\\log \\frac{p(x,y,a,z)}{q(a,y,z |x)} \\bigg ] = - \\mathcal{U}(x)\n",
124 | "\\end{align}\n",
125 | "\n",
126 | "Where we decompose the q-distribution into its constituent parts. $q(a, y, z|x) = q(z|a,y,x)q(y|a,x)q(a|x)$, which is also what can be seen in the figure.\n",
127 | "\n",
128 | "The distribution over $a$ is similar to $z$ in the sense that it is also a diagonal Gaussian distribution. However by introducing the auxiliary variable we allow for $z$ to become arbitrarily complex - something we can also see when using normalizing flows."
129 | ]
130 | },
131 | {
132 | "cell_type": "code",
133 | "execution_count": 7,
134 | "metadata": {},
135 | "outputs": [],
136 | "source": [
137 | "from datautils import get_mnist\n",
138 | "\n",
139 | "# Only use 10 labelled examples per class\n",
140 | "# The rest of the data is unlabelled.\n",
141 | "labelled, unlabelled, validation = get_mnist(location=\"./\", batch_size=64, labels_per_class=10)\n",
142 | "alpha = 0.1 * (len(unlabelled) + len(labelled)) / len(labelled)\n",
143 | "\n",
144 | "def binary_cross_entropy(r, x):\n",
145 | " return -torch.sum(x * torch.log(r + 1e-8) + (1 - x) * torch.log(1 - r + 1e-8), dim=-1)\n",
146 | "\n",
147 | "optimizer = torch.optim.Adam(model.parameters(), lr=3e-4, betas=(0.9, 0.999))"
148 | ]
149 | },
150 | {
151 | "cell_type": "code",
152 | "execution_count": 8,
153 | "metadata": {},
154 | "outputs": [],
155 | "source": [
156 | "from itertools import cycle\n",
157 | "from inference import SVI, DeterministicWarmup\n",
158 | "\n",
159 | "# We will need to use warm-up in order to achieve good performance.\n",
160 | "# Over 200 calls to SVI we change the autoencoder from\n",
161 | "# deterministic to stochastic.\n",
162 | "beta = DeterministicWarmup(n=200)\n",
163 | "\n",
164 | "\n",
165 | "if cuda: model = model.cuda()\n",
166 | "elbo = SVI(model, likelihood=binary_cross_entropy, beta=beta)"
167 | ]
168 | },
169 | {
170 | "cell_type": "markdown",
171 | "metadata": {},
172 | "source": [
173 | "The library is conventially packed with the `SVI` method that does all of the work of calculating the lower bound for both labelled and unlabelled data depending on whether the label is given. It also manages to perform the enumeration of all the labels.\n",
174 | "\n",
175 | "Remember that the labels have to be in a *one-hot encoded* format in order to work with SVI."
176 | ]
177 | },
178 | {
179 | "cell_type": "code",
180 | "execution_count": null,
181 | "metadata": {},
182 | "outputs": [],
183 | "source": [
184 | "from torch.autograd import Variable\n",
185 | "\n",
186 | "for epoch in range(10):\n",
187 | " model.train()\n",
188 | " total_loss, accuracy = (0, 0)\n",
189 | " for (x, y), (u, _) in zip(cycle(labelled), unlabelled):\n",
190 | " # Wrap in variables\n",
191 | " x, y, u = Variable(x), Variable(y), Variable(u)\n",
192 | "\n",
193 | " if cuda:\n",
194 | " # They need to be on the same device and be synchronized.\n",
195 | " x, y = x.cuda(device=0), y.cuda(device=0)\n",
196 | " u = u.cuda(device=0)\n",
197 | "\n",
198 | " L = -elbo(x, y)\n",
199 | " U = -elbo(u)\n",
200 | "\n",
201 | " # Add auxiliary classification loss q(y|x)\n",
202 | " logits = model.classify(x)\n",
203 | " \n",
204 | " # Regular cross entropy\n",
205 | " classication_loss = torch.sum(y * torch.log(logits + 1e-8), dim=1).mean()\n",
206 | "\n",
207 | " J_alpha = L - alpha * classication_loss + U\n",
208 | "\n",
209 | " J_alpha.backward()\n",
210 | " optimizer.step()\n",
211 | " optimizer.zero_grad()\n",
212 | "\n",
213 | " total_loss += J_alpha.data[0]\n",
214 | " accuracy += torch.mean((torch.max(logits, 1)[1].data == torch.max(y, 1)[1].data).float())\n",
215 | " \n",
216 | " if epoch % 1 == 0:\n",
217 | " model.eval()\n",
218 | " m = len(unlabelled)\n",
219 | " print(\"Epoch: {}\".format(epoch))\n",
220 | " print(\"[Train]\\t\\t J_a: {:.2f}, accuracy: {:.2f}\".format(total_loss / m, accuracy / m))\n",
221 | "\n",
222 | " total_loss, accuracy = (0, 0)\n",
223 | " for x, y in validation:\n",
224 | " x, y = Variable(x), Variable(y)\n",
225 | "\n",
226 | " if cuda:\n",
227 | " x, y = x.cuda(device=0), y.cuda(device=0)\n",
228 | "\n",
229 | " L = -elbo(x, y)\n",
230 | " U = -elbo(x)\n",
231 | "\n",
232 | " logits = model.classify(x)\n",
233 | " classication_loss = -torch.sum(y * torch.log(logits + 1e-8), dim=1).mean()\n",
234 | "\n",
235 | " J_alpha = L + alpha * classication_loss + U\n",
236 | "\n",
237 | " total_loss += J_alpha.data[0]\n",
238 | "\n",
239 | " _, pred_idx = torch.max(logits, 1)\n",
240 | " _, lab_idx = torch.max(y, 1)\n",
241 | " accuracy += torch.mean((torch.max(logits, 1)[1].data == torch.max(y, 1)[1].data).float())\n",
242 | "\n",
243 | " m = len(validation)\n",
244 | " print(\"[Validation]\\t J_a: {:.2f}, accuracy: {:.2f}\".format(total_loss / m, accuracy / m))"
245 | ]
246 | },
247 | {
248 | "cell_type": "markdown",
249 | "metadata": {},
250 | "source": [
251 | "## Conditional generation\n",
252 | "\n",
253 | "When the model is done training you can generate samples conditionally given some normal distributed noise $z$ and a label $y$.\n",
254 | "\n",
255 | "*The model below has only trained for 10 iterations, so the perfomance is not representative*."
256 | ]
257 | },
258 | {
259 | "cell_type": "code",
260 | "execution_count": 22,
261 | "metadata": {},
262 | "outputs": [],
263 | "source": [
264 | "from utils import onehot\n",
265 | "model.eval()\n",
266 | "\n",
267 | "z = Variable(torch.randn(16, 32))\n",
268 | "\n",
269 | "# Generate a batch of 5s\n",
270 | "y = Variable(onehot(10)(5).repeat(16, 1))\n",
271 | "\n",
272 | "x_mu = model.sample(z, y)"
273 | ]
274 | },
275 | {
276 | "cell_type": "code",
277 | "execution_count": 13,
278 | "metadata": {},
279 | "outputs": [
280 | {
281 | "data": {
282 | "image/png": "iVBORw0KGgoAAAANSUhEUgAABBIAAABXCAYAAAC5gmYKAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztnXd4XOWZ9u8zRRqNem9WsWW5G9sUG0yHUAMECBBINoUU\nNpue7CbZdJJsSF1SN+ELCWm00DuhQyi2ccE27lWWbfXeZqSZc873x/28R9KYIoilGeD5XZcvefp5\n2/O+56mW67pQFEVRFEVRFEVRFEWZCL5kX4CiKIqiKIqiKIqiKG8dVJGgKIqiKIqiKIqiKMqEUUWC\noiiKoiiKoiiKoigTRhUJiqIoiqIoiqIoiqJMGFUkKIqiKIqiKIqiKIoyYVSRoCiKoiiKoiiKoijK\nhFFFgqIoiqIoiqIoiqIoE0YVCYqiKIqiKIqiKIqiTBhVJCiKoiiKoiiKoiiKMmECU/ljZ/gudafy\n96aCx5zbrYm8753cduCd3f53ctuBd3b738ltB97Z7de2v73Qea9jPxHeye1/J7cdeGe3X9v+9uKN\nzHv1SFAURVEURVEURVEUZcKoIkFRFEVRFEVRFEVRlAmjigRFURRFURRFURRFUSaMKhIURXlnY1n8\npyiKoiiKoijKhFBFgqIoiqIoiqIoiqIoE2ZKqzZMGa9iXbT8fgCAa9sAAH9BPgBgePF0AEDgqfXj\nP+A6gPu2S8Y5Hh/7BI6d3OtQlKlC5ry/sAAA4Eajh7zFjQ7zb2xk6q5rCvGFQgAA5xXa/k7ACnDr\nc+NxeUL2jLe7vFeUdxpG3ufnAgDsrm4+/05b6+9kGfdObbuZ+wV5AAC7q4fPuw4A3hOZ+6G3bd8k\n3g++Xds5Bn8+721RUggAsHfuBZBwD3wY7/nUI0FRFEVRFEVRFEVRlAnz1vNIEO2S0awYy5Ivnxq3\nwSVVsEPUj1gONU9NJ/BxqIN/hyqoiZl+D61Roc0HAABubg4AwOnvl98KjtHWOfI3BbVZ0ie+cBhW\nWhoAwBka4ksJXhijHxEtnbwOH/vGHTnUAus9l4ptH4tlAVaCbsyMmzzv9Uc8Jq+neJsmivEsAQ5p\ns6d5fL08AG/FvjBtGjvurzLmvgxa4a3sLACAXUKtrZ0jaybA9/uH4vDFxGtpfxu/sreP73mrWvAT\nLDLGEm/kp9d/rgNX5GZKy7w3i9k/RE6avx5+/6g3yltBRlhWal9fkvE8TxK8EK1gkG8IpfP1YADW\nAPfMeHPLFF/lJJKw7q2gzPsg+8XrH9f1zgS2Of+k8rwau9fL/pbYNnMmdAcH+b5Cjn0gJ5uPI5Tl\n7lDEO+M4w1z7Kdt2GSP/3HrgAOepmdujZxuR7eEM/s3gX7ulbdxXjTvbjpH/49+Uev1gBQKHnGdf\n/b3B8Y9lvUM+78bioy9K21PWQm/GPi8PzsDg+JcS1jNKi8Z9Bm0d/Kx45bgjsrfFYqN7oLzXvJay\n3piv512ScCYMlLAvDjnzBAJAGudHfO++1/7OZDLWo8qMkcgun7RtuJaeB7vfz/cWrGEbnQDfP/iR\nYn7HMB9nNbrIPsBxTntqI7/zXxjvt44iwRwARTD4i8QteVA2/yp2VOPZPviKuBl8ZtHTAIB7Di4G\nAPQPy0bzLDu//z+7AACRvzG0If/ZRgCAzycTsbQIVu8AAMDp6eXvDZtD5hgBNNUkLBR/Pa+/f14h\nfDEuhPwvc2Fse2YGAGC4hMJx9h8jAIChUm4u0XxOvMI1FDRIZ/92z89B/ga6QVl72C9ms01q218B\nfw4VQFZONkamlwAAWo5l+yBdlbeT7c9+egcAwK0q5Qu79wMAHLMwM8P8WCgEywiZ5la+N5XCP0S4\njJx5JAAgvLUFrmySkek8MIU3NwMAXDkYms3Hl5XJ580NkxGeC+sBANGSDLh+dlzGoxv4HrPhJrsP\nEg/GCQpFAPDlyWZp3mOEb4yCc2hhJf8W8zMdS/i+UDvXk29pBIO9nD8lj3H9FL7AvnTMhpMMEjdQ\nmQPmEAEAvroa/qdT1q7cMNmt7XxdXBz7l1NmpHdLn5RRNvbU+WDJWbL270387H7+nfKDhVGOyQHP\nC0WRteodgIoLgG4qeqxMjpsjh2bvZiJCudd5wTwAwOCF8v4VnCvVNzcAAIYWVCDUTJnva+SY2318\nnPS5PwZfNm+G3EhktB+MwkxulJxuzgFH1nmgZhrfJ4/N674iHkLgunDDEvLSeJDPmQN3Ksn8MWcB\n7wBdwzXthGQez2cfZB3gnO2po2wcKpfPyhyvuX4nAODg++tR+SDH20oX5YK5qUxVLMsbe38Jzz9u\nBq/dGpKbZDkfWTmiPC3gXunJkBGOa9/8fGTt43sDrZwX9kHZP5Ix9gkKP7P/GBft3lNnItTF5/b8\nGz+S/bLsf6VsWy63euRewblcmUkZuPa+BQCAUBff17XERuFarp2Se3YBAOxOng1TZs1Lfzgn8Cw7\nkhlAbJ6E5uZw3woOytlvtSgYOtiG/R+eDQCoeIZ959vH84yVLn0bSvNuqNwGGtXgcIGkhOJc2h6o\nrAAgoYi2uOaLssQ7lw9Rzpuzqq+WMs+sB0ib91/I74plAUG5J6+6i/PE7ae8tzs6J6c9EyXB8OPP\n4/ghLQjf9CoAQKyc6znYxmv21r2EMAycOBMAkCVy0jXrSvaA6IxCxMOcP1mb5IwwzDNBfP+Bw9yg\nN48JxYTfD0sMQn2nSNsaOICxXK7/9Cbu7a0nyX2eHInSu9j2yqd4L7fjw1moeZDrO8MYimSvTwkl\nilEg5I7e30Dk4PCiWgDAYBnXbetpfL78MX7m+K+sBABEbM73snS2+YZ1xwMAPnPZo/jVnecBAKqH\n5/N3XtgM4M21XUMbFEVRFEVRFEVRFEWZMKnrkZBgdfeJxtTTUOdTS9PwcVoMZ5zOZBI3Vz+AAj+1\ncvf2HwEAaOmhdWKkhdZmfxE1t4N7aeEqkV5ws8QaLdo7Oz0Iq1A0+EZDbUIBkqGl94130e64jO0L\nXEbr2w9nXY9SPzVqMdER/f6ckwEAW3tpgT94CjW0Q+XU6Ba8zK+OldBiMZIjmmm/haHpbHtWj7iA\nt7RORqveMMb6bB+/EADQeAq10vHZQ3BaxXWtiJppJ8r35p9Oj4udl1I9+d9L/gEA+PVvLwYAlD8r\nFtxBsXbaDiAuXl6iGjMHkuH+JGMfqKblbev3qG394KIXAADPtNbjwEY+F+zn2E9v5HzuPkHGvIzP\n119MU03Pt+oAAHsu4prKaJXQn+o4rDjXX3Wccyz02EsADvV8nDKMW5+4JsO434vFNLq0HgOVbMdw\nAd+b3s33OLJkIyV8fiTXldfFypNPDezso+l5c0LBblz38okAgNYT+eHsRsoK//6Dh7lhr49x2fVP\nKwcA2AVcq63LaE0/8aOr8Y/HjgYAxMvZltwXxbVRVMVWvBoA0LNcrKySc+n7Z90LAFgeoqfFNw6c\nj5cemQsAaLyEVpuq66nNtrunS
EvveV6M90Roe88sAEA/nSngmKgMB6h6gms0ls3xioV57RkdlNNt\nS9iHg/VsQ8hmxyx8z3YAQNn7aJHY9N/l8HXy/6nk5OjNAeOmmcc9rWNZAbIOUk4N57FDhkpMCB9b\nEG5if+acRwvz4F3sQJ8Mp1+82HL2RuEEZcJUsj+DK7fw95LpkZCQMMytFK+zE/OR1sdrbzuVjTl/\nId00319Ai8zKCGXckx1z+FUyqnvu4/NDx9QCAKbd0eDJklTDeEh4j2X/c+bPwIHTOA+KTqPXUFkm\nEwkWpdFKt6adlsv+Z9ln8TDbn0dHDPTXcG5Ey2yEWvhd056UvUY82Lx9byowYZrSZktkXvtJZQCA\nriN4/Xe+55dojHMvmJNGS+qfFx8HAGgd5rnlE5c+AwAIWZy7Tw5Sri38AGX4qm6uA6erCDl76aXn\nVLOfLOOBmiyPBONtJt6xvlk853bP4Nnv5M+vxCk5WwEAn3nigwCA3M08uwXnyVhn8MwXLWSf2Zl8\n3amnbPQNs198A8OIVHMvCebyLOXfsHNSmjUhzFmnlF428Rq2ZySdz3fPCmFQPIsy2ti2opd53jPn\nV2NlbzmWf2v+Qfmw5xLpVwldrJ7TipjD9+wN82xVe4t4JskcmHIS14B4n1lyf7L33yoRrec5NXCQ\n7ynYxL/hdo5pLIvtHM5m28IZ4mk8j9/VS2M+0noszzvLtdjPORuMx8p4uTOVGI87TK8c9/zebwZx\nYs0eAMA5WTzDV6XRc2R7lLJiQx/Pux1yxonLfV6og33SeQTlQ+19I17bTVJCJISMJIPANLa568Sq\ncc+HPtqMhr1cu+G9HM+cfWxAsJ2Pu+Zy7ty1mZ5L31t6HwDgwAj74Jrj7gIA/O/OM1DxHM8Ng2Uc\n51zjtaIeCYqiKIqiKIqiKIqiTCap65HgWX1FZSSxvpbkRug4ihqkSDW1KldWPA8A+J/G85CXRu3k\n89tpdfB187OBEdH0ScKJ4rWi1ewQi840amVD2xljZsVsWM2SbM1nEjhKQpbD08o3hPGU2PYbWleW\nzaI17dbpTwIAvtyyBOk+aiTv2rUIABDpk9gim20NLWW8fOYKavwikiqgex7fV3cnNXKuLwTfiLRV\n4iwPSWQ41YimeujdzAuw/yw+/aezrgMAfOln/46zr+I8MP2wfYANXHuA2r3PLXoKAFAYoOfGpz51\nDwBg/Ydosd13MS1+Tn4WrFa22yuZ4yUkmnorhbFMtJxJbeW/L3kcAPDVQloOllx/MtyZHK+ahzjG\n3Yu5VsxkDZ9JTfPHy58FAPj+wPd32bRwf+eOywEAvzz9Rnxp9WUAgMwt1PY6xhNoqi2TCblRIFbJ\neBHnb9sxtCT1zo0jmM+5W5zHsW3t5Hq+bN5aAEBRkM//ZdcyAMDgLr5u9XNdbW2l1Wt6ZidssVgX\nrxALdxY70T+Fa8BYoX0z6UWz53tco+fVbQIA3Fv6IgDgp50L8Z2LbwMAfPvFCwAAgXPpgXNJDT1J\nrspjrosNIxzroFjpMi3KvofEWnd5ySqsmcu1UHqj5JTJE6+s7u7D28BEEmNCxSphz6AWfuR8rsPs\nBzkHpv0bLRMH/joDey7nuBS+yPGyg/yu/7qa/VKfRjn+k6azAQCb2znWjsv3bfwONfgDtQH4hykz\ngpsaAIx6wznRqV/3vjCtUINnMK47vUe8pL4jOSD+XICWj0tOoPmPAADOyNwGAGiIsZ9+uo9tzpV9\ncc5/7Br3G7c+wbjJtqXpKHmR/VGwVuKDTf6FJMZKN3x/KQAgZzcfF3+Q3jPDnX7cePT14977jX0X\nAgAeCHA8r8jjGjEeCRt2ch8oOIXrI3z/6Ji6Enft5YNJVsk443koOW96LqRXWNZ+jvOuD0sc7Lxd\n+HXlgwCAWUHKwb0xyriQXPp2yYnwo9A5AICAj22ceS6t+A/vYs6QWcWd2JHFed+/m3KmoEmsglPo\nkeDlvJG27/4w16k7k3vx3cf9PwDA1Y3n4yPl3OsfGmCM78ONbEtemPP8mCp2wm976Mlk5P7QEL87\n6zmureCQi6DERltb6dnq+sbLoqnGjL3xyBio51r+zdW/AgBc03geKtMpjz9x3D8BABsX8Gyw8WHO\n9aDkzCx+iXO8p47jGs8Qj70ezoX0vkz4h/n/QJdYZI01Ojr1eULMOdfkPTt4CvesCy/nueW5tjpU\nh3id6/dyPWdIzpuWbbSqB8o5X9IDbPuBqzivavIp1zoGuF7+t/42/Lz5TACAu1M8H6ZJUs6G/Ye/\nca9FwlnHyhrvJbP9Ks7XGXUHcGkFzzRlx9FrouECSbrn8LM376aHYsYdPOPEs+UsERvvpdZX7yB7\nr0lGT0u0OVv5JK/SVGK8IIIPiDx7gGfYn/77HwEAO4fL8KFc7m8d4kE26HK+FPs5B87K4vnoY130\n1HED4oF+JOVC8Z/EY68kiMCweKful4SlJvlsEvLj+DLZ5sEjeN7pWMQxet/ZzwEAHtw3H7XTue/v\nG6FcLF7PdVv+PN8bKeA8j/bRs+ibIxcBAALiseCbznVTXdSNAfH0LHiB3mzO6yVjf61rf9OfVBRF\nURRFURRFURTlHUfqeiQYEiyAUSlz0XEUNUmXHb0aANAepxZt8/ZpXqb+tHY2L9gvMaJ7qb0ZLDMl\nMKihCuxkdlKjAfQyHRtL/BgcUzZlKjExU5KR/bhZtMT9pOp+AMB9g9RY3vvIsbDEwJLRZkpC8nHF\nM9RENVxA7W6BaODynm3gGzxNHLWSWTvHWN8SSmglC6+sU5xj/7N3/R0AsCCNqvdffPl3GHKo0Xyg\nmxapoTg1sTO/RAvUn353LAAgnMZx7H2Cmr3CzXyc0c6kEW5zCxzHlMxKnpbSYCyTBVtpGTwuk54I\n6801OUDROo5561KuhZx9tKxlbaDGMf+ztLr8x1Mf4uNSanB7d1MDnx7l57/2p4+gZhW/t11yahTc\n1DwZzXpdEqu0dC7lXO9YzGs9+UTGRG/rKcEFlRy7O/YtAQD8dtmNAIDfNzNPSCxT4gY30rpT9wD7\nY7iIlprhPK7/562lKBEjZHo3+zC8h9YfOz4F69+s91yOY/diyrzPz+d6/2QeY3wb4xyjv+8+EmdW\nU0tfXkwLRX0erQnXPX8qAOCCs+mRcOU/rwQA5K7jOvGLRt4RC74v5qJ2E+Web1g8cpqmuByeyHwj\nb3wj/DtwkNbVusspr/ffzLjhoWkW8l7iPIlxCHHqB2mJ7rG5btZF6WXR2M+5Xn6VVOwJ8wOZLr11\nMrdYcMQS44hl2p2KMU/AK2ssOUHM+NT/jHHRaeJxde9J5fjDkTcDAJalU8Zn+dims5+hReaaZXcD\nAL6/6VwAwIHrGCDbM0u87DI4B+ruGYY/wrZaEc4te+jQPXCqMNap/CUcj69f9hAAYHmIj0P1fuyX\nvWBnjBa5r1c/AAD43Lc+CwB46DJaqT9UtwoA0HZ9LQCg4CU5JLTwu+xIdEy50ySVPTXZ6S
U2PDqX\nFubTv0LL+5IwPTGWhSjPm+x0zxNh2OW4Ffs5b37VRY/EZZn0Pknzcw1tbqSFe/Mwv9ufwed37C8F\nesT7RJrtNCchJ5Ip2SuZytMX0gvpE/WSC2iQFQimhXvw/e3MOh79J8c+Lmed4GOcsxddw/m+bQVz\nIdTdSBluOXzd3b/b+1mvIlWyq5R4VmnJA1XA8V3wTe5zVX6uy/rsNuT52Q7HZZ9tauXYlr3ItrQe\nzfE01S3yxMtouIZ7aV8111dG0yAsWwa9i/uHV15wKnNEmFwoUkVmKJ/XV3w697tPFqwAADzeNBsV\nGTy7DFVxjg6M8L0lXOZY8AWO7eOb6Gnn62V/HojzN0oL2c5LnvwUrAE5353Gts79meQEStIc8CrR\nyP3I/rPoVfA/p/C8m+2L4Kzw+PwNmwLcz27rofdWbgbPid3l3D9ydrNtmc2cG5bDuVH890G4Uvba\n7LN+qdoSn0LZ7+13M2sBAEvzeV752WfuBDDqcVUT2IphkctRl2MZkpueC+//FADg1GX0SDBORXN/\nzr6xJf+HJef6/LW7YEk1CFPRy0nGfpdQdrN/GvviPy9gfoNsyfl357oT0VDJ9xRu4JhFC/nZ/Jc5\nHzKaxZtNKrNA8h3E51MGuiJfI2UVyN0nOXBEzo8rhfoGUY8ERVEURVEURVEURVEmTMp7JHhZa6U2\nfO90atIq59JKtrmXWtjbtlETVzW9Hc3raWWueJYalrQeanH9Ev+VL1Z3V7SursSBWdm01sNP/Yrd\n1eP9vmuyxCchbb2pVLH/XGreL8inheKMVf8BAAg9TctlcZuN7D20svoP0gIfl/gqV9pU/3upCS81\nxG2jeTaWgOAY74NED4RkZTA2FlrRVu6/guP6tTs+AAD4+sXUWv7mp+/FQDXfW/f7BgCA3cZ+MJaG\nkosltlgs/Nkj7A+TGdce63Ugmk93OHmeGEZTa7LK+q6mBe3RPqlYEaHGOR4GMtukEsfdzLTuK+Zn\njLWl6z202s4LS51pqZ0bOovfkf8CqxYMLixH+oYGAEDaSmpDnan2RjFjLjkRovNoQWs9weRM4d8X\n9lPTWlPYhRVdtFB37mN7vrDyEwCAdElx0buBHkh1Uh8dUnM60MQ+Dudy/VsDQ4DkRDHrxpnCeEnj\nhdF9Rj0AIHq5WOVyeQ11f6fmPdQuGZlbXDyeS0+boo0SR51Lq2a1xER+9u+fAQDM6WabrQaJkzdy\nLW18XCYwKh/tkSmq1mAswJKDxMh8d4DXnHGAc6F/BedCz5HiMeS4iM3mPE1/mev60b2ME354+zEA\ngHgNX897lhaIzAFa+BwT/+3lPxmV76Myf+oz4niWUclN4H6c6z5DSi2s7aSHRcWjAXy3inkxmjZK\n3oewtCGTcvLPl9MyW9NAWWfkemEh10mknnMl2NILmHwwycqJAoxaZeu5tjt7OGZfXncJAOCl4/8A\nAPjsgdPQPsw12/cjxkqHV7AiTWG21EB/nhaaW05gjoDCx2ipdExW/rFzOxnVeMYQKGOOggPvoxyr\nfy/bcl7OegDAt/Yy/8PKfLZtc2853lu2DgDwy+30PEq7n2tEDI547vH58uXsh/oQ2+sbopxoPZlj\nn9bvouAFWn5dkQX2VHrgmao84nk2cATPdTMLmLPg2mclIVI6527BijTk7mFbSrbRU8Ptk6QAcmaL\nf5Jnn9o8sTLupfx0ZE67ryTXkjwHvDOHXFvjWZzf3y6kR8aTEXql3rHuKNw1wJwPxXTKRUY6221L\nH9XeLGtAzrvGyyAtneObViBx0wfaPVlvqpIlwwvLL14odjnnwMIf0Cp9cg5zgT0+xHXxrort2NjL\nPaDtdsrBAXYL0ivZB3v/m54rM3xSsSIkuRI+IPHyD1FWhgqA8hfYP3aa7PmSF23KMXuQjEHn8YyV\nzz+J57V+m3JwYfpBpFv8/1db6Xl724vc5/x9PMvkUXQgv41z3ZGKF321kh+kieMbaGwb9TjO4d5v\nt3GvMflKpgLjfda1mHvS0jDl9CMD9ChDFs+0q6LTsSPCsbv/phMAAOKkA3EuxIHPcd8o6aA8MB6G\n1h622VS9g98PW+6DTIWMZKx/08++THpMiLMhfrqOuTssqUTn5jnIX8exKlrHs7u/SXKYyVnejfCs\nZCecZ3wvbhn3G8FNozmBHCMH/4W2q0eCoiiKoiiKoiiKoigTJvU9EkRbFp1FbX3OPmpPlhRRC71v\nSOq75/D52F9KUbeTVnnfLtHIJsS9Geuql6FTYlO8mECpEAGMWoaM1mhKE/aLlr7nEsZ8F2zj9c9J\np0V1JMrrn7ZJLKvrdsFKF0uSfEWgkdpVY4Ex9jbTF57FKaF2uxuPjz6XrGoNJmZU6qrGM9m2Lyxh\nlYrso9nu7z77HgDAvEcaUdTCMYy/iiXNGBydSHTcE25fnzxOperxQOwkxroGu3m9F5U/DQB4Txa1\n9GeuvQoAUH37fthiaTe5HYynjakL63k39Ij3iVie8h7ld5lM1eFVu2F3SYZ+T7M5Rf1iakhXURvf\ndTzHfiSb1/ah5cxQ3RSl5e3xLbQ8dzxS7Wly67fI+o9KvHcrrc7GA8M2Vh+JCTMZsn3G+6CrB5Y/\nYc4b7fxkemaY2uHzWG3m0m88CgCISibmTofzPXcH+6LwZVragpv2ejHFiPP60sXbwumnVt5Y1z17\ne4JnlWd96O0/ROZN2dh7Vkl6XrmiPTfx+pFyXlcgKhp88bZI67UwlGWyUvOr8m6meSJntVghxfPA\nEY29k9imsRmLU0AG+PM4vyFybEEB1/aRWdz3HtpDS/O0pijwfXpuzGqn5WW4ko/TN9CaayyRtomB\nNFYKaWdoQCy2ju1ZNpIh8418Mh54llzvj45m7eu1g7UAgHO3sKLM4K3lKNjCtmWsZ34UR6wsGOQ4\nm7YWPsB2eevBrOMUGGtTM7313bSk/ecnWWlkW4QycEaQsn+xeCLc/QS9j2b9vhV3DdAiOS0inoa2\nnHlMzK+0zyeeRpbse1YJ11hWk8ROr26AIzLDPQwWqjeKme/2NHpImJj9D5UzLv6lTeyb8kcoC/Of\n3uPlc/LyeCR6je7k/PeJHDvE8pasqhyvganUYwgslko14omU7eMYFa4IIreBz6XvE68qOfuZvB+O\n2f8Tz0IHaOHO6RDvo3j80PPgFGK8Q3vfxeoa5hxybh69xhak0er6we30QB3+Qzlyt/E8Wxbh+Xak\ngjIv7eUGAICVwb3DLua8So+yr+p+QWt091yOecXDrfRABDyvPCc2xd4YxgPLVCwROdh+FF++soLW\n5JBsbp/bcTkam+htWvwEx3zGQbYv2C2W6u7+cd/t9vD5ok0i18UbzRmTB87khLMS18sk4s/huWXg\nNJ7jzBnumHSOb2WA1/2/re8CAOz9wizEcsQrvUNkelDOjNvoUWtkvDmpebLe5F7y8qE4nqxIZm4E\nfxXPuHaeeMRfzPm+KIdn8Kw6ruPtv52P3F2Sv
2o35bwt+5w543sk3Kia152IePqmp3v3B4ejGp16\nJCiKoiiKoiiKoiiKMmFS1yMhoaZlaDu1qFu+Qy395RLn3TpMjVZgFzWQ6T1x+Nsl/lE+64rG2Vhg\nYYvmL318HKipyOBpV8bkBEiGptbEShesopX9gvuYiXyJZOd2+vl6pFiqUoRCniXV6adV1ou1TbTC\nSl947TKvO2O08wkxy1OO0SCKFXHvRYyDfb6HFtsdnbRe1N3M67Nb21+/skSi54V5f7KyNL8KxjKR\n3kyta94fmOvhkmwGv331IGvDl/9UtOgdXaOaRUc0rgk5LbyxFsu3G5HXBzmfAjXsX6e9c+rH3ngi\niKXMlbUZl4Lox32cscDfLd4MALhwJ2Nm89ZIvW3HRf42aYdkHXalNrgXA5hoeTLaaGO5aZHYSMf1\nrCJughV3Mq1XJkbUSeP1/nrlaQCAzWf/lq9bIq/8vLZgGzXvVkYG3G7xOBo2XigJ89nMhVexwo16\nIWB0bcSmaOyNZl4ss65YByy55m6xVF17DqtwhCzK6ds7GBe6qqkGvgZ+tmQd228yUJt17fXHq2nf\nU8gqCQC2eAYMn30kACDdxzj5n2xl3GThrbTipe1v8jxtjEU2XWJ8jZXl1fYuY7nxiQXOjcfH5IUQ\nq9wUWm1kdqzTAAAgAElEQVS9Kh1h7uUHfyweU37xqJCduX+Y66BwewT+XRLXb6x54k3oeRSVM07e\nGpJcE93dk9qGN4KxPA6ezszyw3ns65X9rKpxddkTAIB+GZMXv835Pmsl9wB3KALb5HdKyOXk8846\nJtcG32dkHUTWhTvZHy7GWOqmKicKMGoxNfkYZJ6l/yc9cC7MpAz/olSXSOuXeRhKN8W5YJn5bWSY\nOdskyMBD2pViax4Y4z04mxbaDKkuNSReadfu5frPPhAb9UToNHlN5CzQN/7sN+pNK3NB+sGse2d4\n+NAcMYfBSvm6mDxIsufHQ/zNT3/jdgDA2WHOiWu7jgAA9P+dZ//ibT3w9Up+M/G0TZNcWJBYd6eD\nVl2/mQuyP/r28TuLBliNCq0dnteOZ6H1jb/3OOy8mkyV33XqaKF2stn3M9N5/v/mGuZICewOIU/S\nPBVs5Nhb4n3ppsstnXgmwuQ8M/c4vbK3mvuDWHx0fiCh/ZO5PuS8Z34rew3leNkdHKccHz1HVg1z\nL3h0Ez3w5jW3I7hXzjiOeNYZb0uT5yORhLOsY8SA63j3WIdY9KcCuS5H5m7/0cz9MLeQlZnygrzv\naRyk173rAwKS588xnghvMJeJWft2LH5Yz7TqkaAoiqIoiqIoiqIoyoRJXY8EoyU0f8PUUGXsowYp\n8yRqpXZ00yrtl9gPO+RDvIxxUX7RxvlM3KTE1Bl9oyNZfhNjAt34JGsk3yCxcranLEhtXa6PWrqL\nl60BADy3ktl7ndoy2CG2OW0XVZaeljUhA7PzahmZk1WZ4ZUQjZkX219LbWpVBq0o2x+h1j7QJ9rm\nzAw4opQ0mrrEzLM+kxlYNNSpaJUAAEvi9kfKaGn9dDlrCIcstufJTWz7vGaJh7Tt19dOJnhjJMaK\n2k2mnmwStLOCU8TYve5FnPNDpby2lzqppb8tl7GQG3YwW3ONZOKNFvgxXCDeCXG+x+Q+cE1dZsmA\nPzYukG9wDnnsSvy9Z+mLT/48MR5SOz4iBdGlPvI1HQyW/Fg+44UzW8TSJBY3tyDXk2lGM+yarPvG\n22KC2udx1uupskYnyHpLsioPHkvPo+YzeE3GOvn1VlqoFkpm/ln1rXgyl1m6d+RSqx9sp4yccTtj\nSdEuFqtUkm+vgV9i2g98gG3/QQELpK/+X86F7N2SpTkcAsTabndIG9/geJnszQC8/vHyB02Fp5bJ\nDWIsY2Xc0z89+xkAQL3se1/cR8v9QCNlePRLA4iN0LKY+Tz7yxFx7x9mH5Q/yn3QyLZUwngPmGzx\nwZNoRU2TQOGHBpkX4PoGZicP94t8N+skOwv+GspFtMjYG08EGTcjF0b3O/HCkv3fFxz1XJhSTwSD\nWL59Odzn2o7g3/gAr9+W683aamQ7r7HlrEqk94j3heRTMN5rhc+LuVbiwo13T6ru9eOQtTAwi3vY\nZ+pYkeqKp/4dAJCzkXK9ZHDIi+2Pe+17ZRmf6JnoPT92bSd6rE2BnPSswbI/D5VJxa00est025Rr\nf97F823FBsl91NELR7LtHzJnTW4UwW7kHpEYJx/oE8tuPO55bpnKAe5kVytJzNFh1oDse12zGSt/\n1BxWVvr6yosAAKFdfD3c4qLkn1KJwFimE+9tEs86ieNpj97jeF4w4gU76e3HmHOV7D3xKnqlLM7e\nBADwS5/c2Xk0AKD0Cc6RjhMqvHky7WF65Ph6xfPUfKfxRpzAevfOulOZL0XWuD+faxylbPtQMZ9/\ndis9MNOzOA5pq8RTsxgYquP5OGSqiL1Bz6Fx3kmHsa3qkaAoiqIoiqIoiqIoyoRJXY8EwTKaIol5\njM6ips1kMl9UxGzFj88QC2aNhYonaI2K19NK0cty7Ch/nhr9oGi7/RslxthoRsWiOy5PQBIxGkqr\nb3ybhyTI55gsZiXeeiWtcFv3VCC8m1rFCrAO856LqGWtepwaq4xntwGY4jjIN4vJkSDZpGFxXO7d\nTotkOocZnUfQQlU0XA7fQbE8iQY6chy1e7FsauKyd3HMjZ+CV50gVawVoq305VFbGS3kmAeNdbp9\nqbyPf+xCaiut1vYxmdap5TUx5yZu0Hg5ePGyppa6mfdJqB3NC7O8ue6TeE/L5Xp2pUmLCrnOm2LU\nyPoG2E/GYyFSbCEgVsiDp3I+ZLSx/cEIn8/rF3cVEytqLPrGWmGeH5NnYyo8ETxtuHgk5G6RtpWx\n8eecynraxX6K6573UwPfX0lrZFq/i9w9lHVpTRzTeLnkjumh7PDtY//ZJm9EooXileb/FK0Jz+NI\nYh7dSsa1dyzk81867kEAwO4Yr31OBtvyfB8F+57+IvRGOX/mzuBru5tYWPzgqVxH1Z2Mr7UPSHb7\nFMuJYjBWMbNmb17+ewBASNa/+0Fanlti7Jv8/8tFqK1TPizVWPzj4+VfLxbylTyQprJ/jHXKyLy+\nOfxbFqAseGiA3ibRKGXh0UfSUnde0QbvO6qX0jq1bZj73q0HmE/AeSE0qdf+r2CJJTZ3Pa2LrRdS\nXt37EisxnH0KK1EcVUTr02Oncp93zmR/nHLGejz+wgwAQFoPvTjsdI5x1WOS0X9jA3/Li/seHy/v\nDIg8SNL+Z/rAeKJ1HsW5et18WuIdyXS15CJaKp9bwPwRJUXtaO1if/ma5JwkIi2WybVe+qR4uphK\nHW7qVWlIxFgMi7+wBwBQHeS8/tryhwAAaxdSru3aNG8023/i2Jp1/zpeBa+4xqeyb+S6Y1JxIbqE\n1vPZEhtuTiNhyRMRLaKlPrM9cIiHgTnbmNwoxtJtzjpWmpz4TDy9eDSMzdY/FZb4V8KMn6m81DuT\nbVgk
udDcGB/HM3ntsbAFJ4vtjU2TCj0HuO97HgrR8VXJEnmlsU+GJ6qpJjOSQTlwVd4ueYWy/sRc\nVhTbcyU9C3siIQztZM6AjqWUGa6Pf4tXy1l4u1Qrkpx3h6yDV1oXUzjvvfEWLyw7g2097WMrAQD/\nbKaM6x/iGPfP47iEd6VhoEK8a0/lPVDGXrl/kbxItskT8WreSZO0r6e8IsEIBMsTmuyYx7rmAQC2\n3yzu7TV83h+x0LFISrss46Ex88d0HWk8R5LRFPM7QzsWAgBqfskN2x0wroOvvPimGnOzH8/mhPr2\nugsAADdX8GZ5y+paAECwX+62psUQnUtBsnu2bC597IuOIyRxVQb7LWcLNyh7myzcVNxcnfFlO4vv\nEpfvj3LRxNukbFAd29pxYhbSmihULj+fZQIf/DXb3TOH7et5P99bch0PYMFH10xqE94srggE4655\nXy+TrpmbqKLn2K7WZbz5LB+ahkA/N8WeY3mDmbuFm2XTaRTCQxXsg8pn2J9pj66TH0vy2Lvu6Cbu\n48ZY8AxL+QSiTADZ8C5uHtkBbpA5uySpaIDXHhxwMZwtLoIiKzNbuJ5HcniIiFXyOzCN/eHfuJs/\nL+5/ozdeU9wfJulOF8er7K9yyJlbCwDwf4CvPxnh9cdfZh8FZT/sONpG5xLOh2OP5LredT1vxocv\nZH/F1jBZUe3tsuFsNxt28vFKj5lEcXIQqMjhDcLfTqDybGsJbxKfu1USEHYZ12Ygq4mysvFo3lBl\nd/C13Mvp2rp1Oj876wY5bGxi+5N1eHw97Om8aVwXqQUAXCxJVj9QsxoAUBrkHPH/xsWPr2FZtLyd\nlP3+7VLysteUOuR3WpLEM9Xa7CWBk8PvSCbX8eeffT8A4GcnsBxiSR5vere0sW++Wvkw/HKjaYtT\nb7af39G4g+WiS+fz+YJ+jn+8oVF+NAX2u4TEwDV/oJxqOZYyfd3SWgDA+wt4wCy7mC7sBQH2wxU5\nu5B16fMAgG4pDfvHHioh7l1/OgAgvdiUx5aQgb0sH5oS7QdGEyS2U27lbOf11p9nkmLSWrDmoQUA\ngGypatezPOadBc85jXv4ql/RDbqbR0JEL+E8yWziXCj8+0sARt2+UxF/EdsftbkGbuui7FvdylC+\nxcVMStf2mQiybqsFAOTspaw3iUc9A4E73nU+1cK6jNIk2MJBnVvJeR1MKD1rO2JE/CznROxP5cjZ\nyrNfvJDzI7iBiheIe75lSVLxMu6VjoS2ObbsNZEU6AsT0idrIF5K44kp+dnQz3NK7ga2qX8p13jU\nDqF1GdsV5304qneLEiBgFCYmLDiJyQRfA9NmY8iLFtUCAFZEea9T4OeZLE20gy29lF9n1G5H7Qwm\nnf9bPddG/8vspwIJfwqWcT0gi3PE2bxdfjRFZJ7gtFKBDDEG9sR4vZlpHKuOJo6xNcL5X3xqEwZu\n5z7WNYfjWsmjEuIL5X6mSZLnStj+VBlKNbRBURRFURRFURRFUZQJk7oeCcbdV9xA4nnUPPr300r1\n8maqnXO6qHnrXSqli9rTEJ5JjWzPED/TeyGbGWql3iRUT2tvbFhc4iTZhZWM8kevgXHNSdtHbWr2\nU7TOdvTVAgCqu6hdHSyldqpspYuueeLvfzI1UUfPpiXrmRFq9Nty2AdZe6jlNCXnbNFipxSmPJBY\nUxyZrf7raXUs/mwDAODBujsAAF9rOhOzlrIs6DEZVNU9dCEtsWkrOcZ5YWp1AwPiIjSVScUmgim9\nJ1ZDO419cOdOWpqi3ccCALLO53hdUbcWAHD9khORXURt9Dfn3cq/a1kuyPJR0x9K53zK+Ju4tXol\nwyarMa/DmAQ3xhptrCmmJFSWJJXb92gtAGBrDZOrlXfwotMG2BZ/1IEt5aM6F0syG3GFzmzheyNl\nXO/Z23vG/b4plfaqCUinCFPm1JJEif5ByqEPvXglAKBYLLKYS21zTwXbU7A6gP5TqcG/etoDAIA/\nfp7r/46nOF9qT6FlfuQ59mtaK7XdKZWITCxmzjDHK7iF1tPCb9CyuDvEMKWqfbQ+mWS5TiTqhcbk\n5VLO9dRzDvSvoneOW8L13XwC2z2tk1bKeOPBcb+dKlgx9sFPnjoPADDn7OsBADsjtLaclbUFAPC9\ng+9G+8mcJ70zKfuLy+gamfuiaRu/y26V8qZTmVhqIiSUXyvYyDmZ2cL2/PAFelzEM3jdw5V8/+dv\n+iyCg2xb5ye5NhYUU/674qnUdSHXVP5a2TymoqzdBPE8McQt1VfCfa5gG5//06bjAAAzj2abjLfF\nPS3cC8qCvdg/Qmvcx3J3AgDOy94IAFj16VoAwOZ/ci7U3UKLv5EtqeKV4nmBmeSPkuT2R61nAGDY\nEgDYIfE+GuEcmPGtCNqX8xwQl7XevoxzIdzIx7PezT55aROTVhZsZl/4toj8GEpIuptMZE2aEnYD\nv6Ln6JqPi+v3E+yHoz5BD5StGaWo+DQ9U1dvoTUyo5EhL7W3SrJJCRO0U6jk6Vi8c7aEt2xupLX1\n8XLK5rogLbbnVrHkc2Ua23HfpxehuZ9rpX+QayW0mue83L18nL2V893ZJ14aIwlhmykk743XgL+J\n1xy6n94nu+fxbJ47It7W4oI4UhzHCKcDwpJ8vms5w3ny17DPfPn0bog3tUz25b85zN4j6z93M8f2\n43cxqeh738XE0re/wESbvijXxxOrlsKmCEO0nvLwiOO5njelcZ27pZSJ9b8Sr5QM7iOpst49z0sT\nctzPdqy7keEK8dO5bqvvY5v3s9I7Bv5e7nnb9i+QfW8ZZWDhCxznwXk8H4Tl/i4gbffmwSTNe/VI\nUBRFURRFURRFURRlwqSuR4IpcSRxVIF2Wp+mSRmQIbHCj2SLVVEsEHVLDmDPalrusw7wtZpd1EY2\nXGLKpvE75l5Aa33HFmqyQvubJqs1/xLGSlv8F9E2moRcJdS8BQaouWw6MRM+UbzaLzJXwAtB/sVs\nauPSX2Iczq7/osZq5rW08GJtCnokuOOt1PmP0sLQfxItC61/qQUArPoG++FnlY/j+Bc/DgB4KJuW\nSfdeKf9GoyYONvJxcT3nQMFaiSlLGY8EabNY5wpvYh6DkieoadzxaY5XJIvjt6aH2uvvnnAPfvHL\nSwEAXxu4mN8xxDbmlEjZJElWufWz1FbP+5HEDe/bP1mtmRiWNabMIsfBaaa1xRen6r32z9RYx6uo\ngY2WUNMaC1MX2niuD6EWjmXtg7R0OEF+Z88MyoqSteKJVMA1kNbJ73Dt1LBOeLkC5K9/Py1L9V/g\ndbafRcuTv4LtyhNngvi53Sj/Iy00n7v2kwAAJ8B+8Z/P90Z/XyHfIbHks2jxKr5h9bjfTAnEamys\nc9aOBv6V5Ji2V+ZJylb5/d71Zz/MfDc5GyUB7VcZczz7/2iZdsKcCw1XcB1V/5zWYCeaInPAtGkL\nczjM/T7X6lW9tNQE+9n2+6fTKv21Ex/EiiD3r3gW13d/lUklK
7lSXuJa8pezT+IHDk5mE940Jo7X\n5MhI38QxDSUkUDMJulBcAHRzEVR9hvJw+7n0VPzAp54DADz+0+MBAFaE1j5Tcsvu7Jq8hrxBTIk2\n3wucu9n1HM/sjZyT37qSuSLKn2N/DJVQrn/jgvcg0sIEdL/uP5ffVc15/sH5jCPe5Ode2XYs10Hx\nXsp612diqZM87005ygitciUrOZ57tnAc7SDHvsLPg016G9vXu6gIxSvpqbnvMe7ps4q5z+3+Mtf4\ntkeZjPXMC5gb4blSys/qD6RGDqxxGK8c8UrLfopJsf0j9MLy5fOaf7ziHADAL0+6GY7YAT9V/hQA\n4OMrPwwA2H8R9/XqO2S9iLdHqlhkDcYya+1qAADM+TY9EX66nPN92icpA3ODnBthH+XDV6ofxoef\n5jnP5MmQPOToE9mX3i2JltOYnNKXmIAvhfCSXcvYlzzF+5BAlOPYM5MyPzOD7f/5MTfipg56K62r\noIwf7uIa6DpGLNTPcJ175f6Svc5fBSPz3QZ6TM78ZgMA4OU/07tmzgjveQbm8xx48JJhhDfyPJS7\nSrzTuygvr3gXvXXuvflEAEDD+XJP2ELZlzLz38g8SQJrPGIrHmBf9O/nPt0zk3J++hz2TXNZDkpu\nYNvnXCO5bsy5TbzqM9cyB5BdzsdOHs+6PskR4g6rR4KiKIqiKIqiKIqiKEkmdT0SJGbMsxa2UDPl\n1FPzZkq6RUqoC1lcTQ3c2l01KNsocTc7JIY2SK1ceDctGWdcsp7fIRlBI1v4vGN+M1XiKBM0Vyae\n35T48Iu20ZdNrVO41UVGJ695sIyvdc8TTXcXvRjOvoKxR/c+wthpaxu9MlIkWvYVMVmWfUFq7nJW\ncaxzJEPtD99Ha8zQPaUo2k8N3UAJvTTSB9iHgYtYIi2+XjS26yUzforEir4axkJpN9GqWH8NLTZu\nDS3M286n1eW7hXXIyOX89R+gpja3gd8ROJ99Miufa2jb/fRSsQ82T/LVTwwrLW20BKB5TmLeXVO6\nS6yR/n38m9XG1ztOkhh4v400carpnCdBdBIKHpcKcK4pN5XD30qT+ZQyseIJ2NJ2v8SQFj9BbXNh\nAa0tnUdyHMPXZyFzo5Q1zOA690lcfPkLfNy5gN8Rr+RaKvutVD5JkVK3AEZj9wUz90flsWSiTjPj\nK8/7/V6spWuLVTvCdT37//FvLJ+TINjHx1WPcR35JB7XMdnsUwQTwx5voyVh+jc4XoEyXm/nabUA\ngOcWzkTBk2xbPCyVjaRkaV+t5AoZpjdT5orUqdTxipg48QTLkStGRFMa0+6RErH9/aPxppncA7Oa\nuP89cAOtUuUb6Hlg8i/Y3SnieWdZY8r1ylOmFOLuBj6WeT7jGo69KXEXljYXrSmB1cn39i2vBQC0\nOrRY3R5aAgCQ1AkIDor115OzKbrvbeIFBzMlJ5Z4KnjxxIWUeXn/7IAbG1+u2zfMtV96Fz0Pui7n\nGl+QSdm4/+vcK5wUlffAqGeYKdMbepQlTsM1vPbB97JfaoNdaIjR0moqu3x+8ZMAgNuLWNWm0eJn\nqv4opS5SxSJrMPmgjBemyLqCx2mZH1rPs36DeNNsv5Ry7Oj6Pcgp4Bk4nC75YXZRLkaOkcoGRZSJ\n1Y9LfgEz5ilSkQ3A6H5nypyLZ5IVkvPbVo5bfzW9qL44+3EAwHGhYdwkX+EXj4z+0wfks+J9skGq\nE3Xw3OvGUywvjsHI/MHBcY/trdyrAhVSeWWP5M1ZV4CylXxvtIQyYaCG+9xNq3hPU9Ajss4e77mY\nMrmBTMlSke+mWpfxtMtaSZmXtZmPd5VxHZettJG5gd4JkLOrm0OPNCfM74rU0QNnoIJ9UrKS3+1O\n8jlPPRIURVEURVEURVEURZkwKeuRYLQ1JsbHWBz8UWoU4xIbnd1IC0T31dTEzG3s9rLAWr3U0sVn\nUauVfhy1c37wO+7ZtojfcTa1vKWSgyDpWkuJX/TimySGCvLYH6YmzsRUGS1+wcsOfNMZM981h212\nRWP53uWMmTwjZxMA4MmGYyezBYcX6Q9jVXKNFlOsFZnnsH8ysdfL0GqftRAAcPBCvnZqAbOWrpsv\nc2MoNSpzHIK0zVinkKBJdCXGz9rLOOeaa2lJdWNx+KV+7px7aIF5cDczGcd7qLXsyKDGu/RBiRdM\ncsUKM7+tQGDUylwhNYDFoo4WWiksmfNe1vGwVCzYIJmpg3kYlHQfaZI7ILJUPHe2UrM7WMnPpPdI\nLoL04PjfSjZm7E0bjSbd5AaYnj/u7YW3M0M7XBeQyiZuPsfatM3ISVNSvOBZ9mPvUmq5sx+U+trJ\nzBFg2i3Zq52lzN/g37ibr5vs9uav9Idn0RoZ8eaSwZFqFFaheCY1S+14+Wy0llaujJYkZzQX2Wbi\nJL08GcYTz8S2Sh/FW+iZlHcHr7v9TgsFR3Cfa/m6VPEZkv2hlfN91oXMaj14dopaoU31FLE6G+8j\ns++ZeWHmwdj6675wWN4j1nypcjO0nH0S2cl1ERZrNTo6JqsVE8PM9THz1VdYMO4trngcmv3NJ54I\nZu4ajw1fiwWnkvIyHuL3FizmGSY7nWPd8C6pXLOC/ZQvOSIs+W7XnHWm0kpnWd7veZ4Gcr4zeVE8\nq50ZV/HQcRsavddNpRZvH5F+ajuSn4318vWbG48GAPgX0sKdJR4fKUGCF5b3tHjPmTaig541066m\nJ+JXcSX6ZlG2nfOtZwAALcMc27pczvG0S3jm2X+drJ9UyYuRsAa8tWDWt5GBkucnt4Frd+T39Cb9\nr7J/R/88vvd9Jz0NANh0Dvtnxcv00Bwp5utp+3jmRwH3TiM/k0rCfmcq01nm7FpOebD/DI6vyf9w\n9SOXAACuKR9EXPK82cPsu+8uvwcAcFfrUQCA5iX0yimQqhVWQv6lpGPuceScm7inm2pyxuvIauCZ\nturmbgzP5dllJFNyKaVRllTUcKz7SykH0u5nfiErYb4lrQ/kOgLVleOediUHnHfeM9c3xPu7uqvp\nOey6LiAeGm6PHHLlHOykSX/K/ULpPykDmk/nmqlolflvKjcdZnmvHgmKoiiKoiiKoiiKokyYlPVI\nMNpJS+JG3CxqrE2Ma1oHNdfxXImli4oWp6ffq0nsirYr7SA1Pt1t1MpszKRGaFEV400O9DGzsRc/\nmOwYmgR8WbQyWiZ7tVjmjQYe9qhF1YpSq5XRzjYMF1ALdn4eMxc3xdgH6b1i1UnhHAFmPDxrdYLV\n8ZU/JBluD9Bqk7uaFqllS2mV641RgxcZSM2p73ngZEvsU//A+NfFcun1xXTOZct1MTiNbb17hWg8\nc6jNXVrXAABY909mwp05sp3fneQMxp5m2PLBV8Iss/0zaVXpq+b45O6V+drNeR1olRjnOOVDrIT9\nNJJroXATn+udzr4JbpSYs4Oc61FZC2ZtWIOSyd+snxTJjWKsrDCyT8bJ38W5YGIpUU5rpNvcBrtb\n4uyMBl/WQXQJs37H
siVWVOqwh9tk/Sdb1lnWqFVS2tt8NNsfWMDKK6X30DPBaO69j4pFx7XtUQu+\nsWSaN5nqPz2MNx2ZyUzY/VWUKRnbDmdj3jie55GP123G3lid3QQrvGe9l76ysjLRtphrIP0e+dIL\nORc+d/Y/AAD/89z5AIC5IfHwSJVY6QRLka+YMmCkllYU/4BUsJCqHd5eNdaaZfpF5GJgSHKohPne\n9iXsm5p14+Vo0hjjTeHPo8UM2RzLkWliNco0lmP+Ca9gHiPPQ8GMfW4OEGEfDRVz/sSfpUxwTqBn\nQmWReGwdEG8Pib0/ZF5NJa4LKygep+YMU0QrrN94VnXS48a02cuPYtZ8POatEW/+yNfHc/nemipa\n5SoyKTdermUcfZaVgvazV7kmL09ONvvF18c2x4tz4Pq5fp748gkAgIYL+R23nPU7AMC/3fVpAMDs\nHKnKJN+VbMZ6IgKjHjneuVZyX1nSVjn9obue66VncQy+Ib7nlt20wJuKVMEePp+/RT4knkiO7I8p\ncbY3Yy2eCH7xlrDL6DETKRfvoR1co8bbaHCInyua14fGVlkvnVwPP9jAPGEj3TzfzjgouYJEVtjG\neznZJHjgeGvXeN6YfBFyvcYTHWHe5yEWR/oeqV4XZR90HMnXPlHLSj27olznj9pcF67kWEt2dS7v\nPkbOrk0X0ou+4gF5g3hjueKJYLw0fJIHAYNDXh4Rsxf6TN4v8VYdKOOaalnGvTRkHPDM/cIkzf8U\nlKiKoiiKoiiKoiiKoqQqqWeWTazWINaHWCW1kYNl1GAb62K6ZOjM3SYZvssLPS+FaDE1QPFPUC1T\n4+ff+XmMOdnRR+29LXGVRkOY7Oyexkrll3iYkUpqmwI91Kw5WWzXSB77ImM/42WGy7LQNZfPDZXx\n2v/6vt8AAHbH2NY/NFJLl7tJsnlOYjsOF+48xnv5IpKlWTTxvv2Mdxub5duqoDby4EnU4D/0+Z8A\nAO4dmAsAeGktvU9mR7ZO9mW/MRLj4ytoleudXz/ubS2ncV3kbaAm2hYlZ97uOPorOW8KaznPb114\nAwDgopc+AQCY+WepfGLix2W+JzsliJcDBKNj218nsXISAzicR411ltQVbz5OciRsF619BtA/TSwZ\nMqnjmeNnd1DCb+100Z+KFtxXQNlisue6QHLiSI01XeKYh2ZxDvhiUg98hH97xKpe8qjkx3DdQ3IE\n7CUy6SsAABa0SURBVP8IvU8CsjQskWWDVfyOcAt/K0v6wE6WZ9IYGWssVCOSCqKvhGPQtbgWADDv\nx4z5dcRaaSxYvrSgF1fvfZdYqK1uscCK1bftPylDR17iHlEcSkMy8eI3qxj77IZkXWdTjge3Mibc\n80KRbM3xOcyF0z07AwVb2abWY9imxaWMi32si7kmqu+TnDKy7lPCKgccYoUdXMB5PTCN86B7Hvug\ncP0RAICsZsqJ9FYu5EhlFuwQv6NjAed/xUn0Mhzo4Jq2izkvOs6pAwDkmzxIycZ1RwVvOudg83JZ\ni4tpOc54mnuYHZoDAOh4P+dA3t3iwZBtwRF52beQ55+KaYwVzw/RqrX3EdZYr+gQ7yvZK/3FtH7a\n7RJD7jpTOi88C6TIrcFZtC4O53A8Q11SaWQTz2qOiSM2lspw2PPiMVnu28/j3h5qZp8cyOccOKaI\nclKWSUphPCyMZ4bPxIabnB6+8ecCu4ivD9SEkbuVfWJLxvaaOj42Zz1/VNZ9KlnjMXq2N20aWEzZ\nZzLN+6Tp3fN5vXameKNILqNZdc3YuYVel9EIv2PhNMq8DWncO8OtkhMg0RKfKpn7Mbrf2dO4zw9W\n0fo+WGrWwPjKdD5x4OwcDCN9m/HE5nMDBewHfz/7sE8qGaTvF4u+EXuJ7R+Tr2Qq8XKASF6vWBnH\nLR6WPgnx+g+cJn3Rzr/RUgdVj3L+dBzBtXPeKcz/luenbOuJsc3tx7HDCm8VT5ck50gwnpeG9G7O\n567lnP+5O3hW8XdS/g/M571MRgvb5QT98Innvb+da3rLt/lZf5/sfwt5RooMsA/sFezXyfY8V48E\nRVEURVEURVEURVEmTOp5JJiYVhNDJ/FS/j6qKe1qap/7ZvJ9aT3UhXQtoKa2ZK2Drtmi2VxKrc23\n6lh/9bIsamyb49T8nLzzUwCA8nZquEzt3mRrK/0l1FAa7bMrVtihWj42mdgtm9cZn03Ne/NxPjjF\n1DxVV9DSMOiKplICLu3fUMvlblvHv6mSxXUMXuxcnmSYbqaVxSniY6OBN5PXL7G1ez9cg8wm9smn\nPnYvAODJoVoAwLmZ9ED4Rcnp/K7BFIkZE0xsdM8FrDbRW8cxHplHbWR4LV/P3MXnMzo4np0LqNlt\nXRaF00BL1Q9mPQIAuPgXXwEADFZLpZNCttnaKfM7RbI3w/LBFatJJJ8aVlOtoo+GRIQ6xPK4RDw2\n/GzTgXq2oWZaK1pW8LPxmRJjJs3sncU5Pizxg/N+SE8WV2SL3UZ1vWfVT1K/GCuhqSKR3i7xsGKd\nHirn37w94plUzHXvs20vu3HLZfREGFrEPrj7RMbLdtmcP5+64ZMARq07dkLegSnH5x+TpZ/jNJIn\nFqhMPp5XQ6vknp+xf6p/IPHCTeJh09MLn9SdNxZOp5bW7Y4llJlDZRJLKBbZyjXsL3tP42S0amL4\n/F58cNtyyvzgECdt9t7xdbV9RYXjPjogNbS/8NXbsHqAVueMAcrBlftqAQCxHs73uTsoP+1Uk/Wy\nzqwMyq2mkynRg/X0nLiyfi0fv4vvu+7FkwEA6QfosnLJe57Fg/tYmSbaxjnx7jJWJWrIY3+98Dgz\n9hfcIvtdClgiAQCW5e1BsVLGwGZI3pLevXzcV8910Hss52qGj49P/8rzAIBTs7fg4098FABw4nzm\nUTgtn0k/FqbTM+Pn558JAOi+0VRz4byxTfbuJGFLPHCghqV2AkOS80Yyse+7iO8LHUPPm2nPyN4l\n3lmB3c1wxGtvqJLzp/IjzIP0wfIVAIB/dHMvfaSRHh15B0zm+uTmBgLgedr6S9kGk+PDq1xQwLNO\nrERyRgSkGkkp97/W84aRVkpZNjDMz7ynjGO/caiK32G2+cHByWnDm8RYZuPHcK8yXii+83lm7dpF\nmehkinyI8PULjqc8uDT/RdxXcCQAIFtM8jc8Q9kgSwQ99eyntMcScqOkwvo3+T7Ee7Z7NudvNJ/t\nHBaPvEFJdZUmW/TADM7f0rQYSs7gXE8TN4U93ZR38UI5J0lOjaLbpEqFl/8pYQ9IUn/4y+lt3Xsk\n+8B4hbcv4V87i2Pvz+H9TGAGx/nTM1fh/vlc104v2/j+Aq73IYfzKuayrfO+z0oPcZMLLMnnXeOJ\nM1LPNrdza0JGDT0QetZzzUcreFYreV68S+S+r3e2jfyX2ba5H6EHzkMVv+TjNH6m16GcPOqWLwEA\nAuwi2B2dk9CiUdQjQVEURVEURVEURVGUCZN6HgmJiHU6LvkAOs4Ua9wgNbiVi
6mZOSKPf5dfugub\nItRyn5w1PiX3fYPU2vxf43kAgNpfU48S2LITAJBk+6yHiQfsPpWa5UgRtXQDi9l2fwv7onwRtY1f\nr3sIAFDgH0BDjFapPB+11d/YTtV+xq+o5sx8npZ5J9WsU6+AJZpr44nQcSQtsNFi9sfIF9g/9aW0\nTL4481q02mxXdYAWymN+/FkAwPfEKl//7Y38ztjIpF//G8FkrDea2ZF8Xu+CSlpjN/Qxw+s5R/H6\nCyTg/5ZNVGsuLGtFpIhrYleU2t5pt0i2e8la63ncJNsTIRHX8azSWS0cv97TqUU2sY8jDuXAUfm0\nIJcEabU8Mcy1uzpSi7vTlgAAtmyiFSvYK3rSOPt09n0SZ2u8UcwcMJmCk5XV13hmmGozuVQj986W\nvzN4fZEqY0kz1WU43kWrcr01MTBT+q2a8+bZIebY+NXdlHkz/8r+MzLGSXbMqOvA8o+vRDLtcfE2\neR+vqTvKtTyjiFr1PefSAl9yMtdAY1MVSh/ld+RtpGfL9qso69PFIBMP8zur/iL7iVg+vbjJJKwJ\ny2fByae1oWAr29Jfw+vum0l54K9i3HdMvNAWfPplAMD3Su8GABT7R1AVZL/8qI+Zu2d+X+Ih9zcA\nGM38nqqYqiROkOM9v5RxnrNCnMMlfl7/gpOZfb5e2jsrmIlTs5me/c6uYwAA//c4LfCF9XxPyW2b\nASQxB8ir4bqeZTx9w14AQPf5rLCSeYBjPbiEcjs3m/Lqq3PoaVYRZI6QTGsE3zzxfgDw5sDzg/yO\nO1qYzd75tNSib2vg7yY5c7nBxEgbGs+kBdmpYJt/d9zNAICXozzL3bt8EQDgyCLOgf54COtauKaH\nR9g/V5WtAQD4LT6/vYfx12U/EQ/GHdwPU6EHvPxEpqpAOa3w/bVc/13zxDtuAef+L5fcCgA4M8w5\n0xgftbQ/OcQcUv+z9t0AgOwXKC9n3kWrdaqe9Ix3ydCl3Is+MX0VAGBjMcf8A0W0NIcstnlGkHMj\n6rq4Io/vfWSAlX1CreyvUvE0y3iBcsE2eUhSwRPBYHIhDfCMHgtzLvTO5UgFJN590fE82yzP5zhe\nkcOzX79rIeryPT9vOQMAMBSVvGkR7oOzbkjwuE2R/G8Gk9+o6SQ+tgr4eGEVvQjy09k3/1X2KABg\nfhrn9IATxSXSDy9EeP5/ZpAeRzsGee7d9iPOiXDTGvmxJCcBE0zuu2A353FggPc3s4roHfb9j/4B\nAHBLz1IAwNLTKa8qAzzT5PlGcPDd9Cy7Q/a7a1s5/ucXsCrfF1ddCQCY/f8kn9R+9udkj7Z6JCiK\noiiKoiiKoiiKMmFS3iPBWM7SNlFjlfkSLWzBU5mZvrGDlvYPVlJ7+ZUX34sfHMOC2v+9/WIAQEAC\npwK/ZhxReI9kZz8gGmqTETtFLLUmR0DBeloeWo9nG90otZAXnE5tbLrER1WJxuqm7mU4JZseB5+7\niZn6s/dRFxV6mhorJ9UsM6+AyeZrsg23n0tr/PmfeQYAcCAqY17MWNHj0zm+EReoC1Jjt+gnzH8x\n7W5aYOMHJftziozxWKxAABCrXPsyXl9WA+dAf4xay0uWrgYArOmktX1fs8RMi6pxx6N1mHEGLVv/\nPIaeG5YlVudoCqarBka14j6/l1052Mt+CK+iZeal/loAQKiJmvYc8Ug6opiWqevaTwEAdI2EkSMW\ni8K11I+GJfdJ1lqJg5cs3yYvgOeBYOZEQo3jKSPBOmANieWlQPJhSOx0PEvEdS2t10dUUttceOQg\nflLxJADg2BVXAQC2HqR2vvsXXDszV7MP7HapQzwy8oq/nQwcmft+GY+sDWxXeC7nenshY/1b0nit\n51/MLM21UiQ5uyqKqhNokf3ihssAABWZfOw8Q6tkqJOyMmMNrTuOeOe4yfRM8vthtdCbauA07msS\n3onvfO9PAICDMakxLjr/4zO4Z/3gIL0PVr40C8uP3M4PXcR54UTY9snO1HzYEIvRrGsY57/2G7Sq\ng4YZnJi/CwDwbznc2/L99NbosAcx6DDGPOZwb5x5K/dyay0z9dupEA//esi8r7+R1ucdH+UeFtzH\neX/p+S8AAGqDnO8+a9TCNieda+Vjaz4MAAg9Ry+myvvpyeXsF6+0VPJAtCxP9saL6TGRw2WJgRjb\nfF31KQCAs4uZ8+KGuX8DANQEeDZ4KpKFK0ue5XubTwUA3N9Br4Wtd9BCWbRB4qu3c1+0uyXYPAVk\nnle5QHK7mGpFfbWcx8eeQ8+jS4q47w+5PAd8p51r468rjscvT78RAPDrn78XAFDSy3bl3M1cAvGR\n1PK6BMC9XuZi6zLxurI4TtuHuGeZc21jjF4aJhv/+ij3g02D07Aki+v76SvoeVPrcG04u2Xdp9D+\ndghy3nAlT5ETHH/uEGcD7Otl++dk063u2o4TAQBh/wi+XrQeAPDs88wRk7eV31H6CM9FZn83nscp\ns9+b3zcy/6/cs3J+IZ63B5kY4v1z6E2Q7WNfDbuU41HXRoGP56DtUeZBunEzN4r6z1PmZfatl59I\nrbO+LVWT/M1cyzOv4/3NwEPM7fXVH/J+9bIyrvl9I9zb0iy24z+2XIDfzbsJAPDwk/REzuWWiabH\neT8wO8I+MHm/pmq8ralMPnSG79I3/2MJ5fF8OdyATFKaniV06ffFXYSbeBDvmitJK1Yy2RTaJGmf\nuZEwh4x/oQ8ec26f0N3HG2q7tNUvCVO8TaeaE26kTFyea6WcyHt5cIw+U4SMdimZI1XNSm4SV35T\nIvEwjvdE2w68ubEPSEIWN4cHK+OeZWfzsBGXMpjp36IQ6hjKRM5P2TfRQt54Zt656o3+7IQ5nGMf\nKJMkmOLq2H4hD0M5jXxsSv9ZEgIQ2MFkWiYkwmlp8xK3eaEMnV0Tubw3xaSNvUlCJe0ypRCN23vP\nUewnk6ApbYBfnbexE1aEB5JoHQVwegtvFp1dDQAANyaH6cOwwUzmuvdlZclD+QkpleaWU8a1LKei\n6MSPcsN5vGE2ok3srzOOFbe/2xjmUX2n3FC0yUHrMMiBwz72RrZLKTRTGs4cfixTEq2Mm+W2T3KN\nZ5VxfAc6MlH8HA8XRR/hQXJXK/tq5mcpG1xx7zf73b9yk304x94kFTZtbriBh+Xz63gDdcfTxwIA\njjqGbq7rVlHhYNm8hPrrW8eX8gJg79wzkct7U0zmvHePZZnHQB/ll9XDMYvM57538GTOj4qlvHk+\nsK4C9X8WxVgax9/ZKOGMk3Cumez9zi/JhUeWMMNs2ktUAvSfxr2gdzpl4IgcfcpWxbwkhZ3zOH/K\nb5HQRTPfD6MC4bCOvch5rwyc7F1G5jlFVKB1L6as61zI5798AZMo//DZd6PkWSkT+m7ejGSsoNyc\ndp/IvGZJqmuSuf4LSsNJG3tz1qtneIJdQDneN4NnV5NMOTDEv5Wn8kYx8tsK7wY0EJWSvg9vAHB4\n9znDYZV5YigzZ/nYUs7vvhqeZweq
pE+O4o3W8FYpYSdK1rrb+zA0jf2U0SzlQDdxrTjiMp+MtgP/\n2jnf2+dy2LbOI7kmBqZJouAjOM8zn8+Eb0SMC5miQFglyUu3c36YUr+esSTV7nFk/cfexXNKepso\ngAc4nq2n8ew/VMGfPuM8nnXuX70EeZs5f0yJ+5nXibGwie78qTrvD2FMsnH+MecgSTYvMrH7BJ4J\n8l9sBuKSlLaa5xv/ep4Lkn1/p6ENiqIoiqIoiqIoiqJMmJQPbfBIsCTZHbREWOJdkC2JdMZqZArp\nEZgSyXXeCMYyZ/dRq2i0d9hK7VNgp1jf1tAtzvmL0UbtHP0S6YfUSDPyxvCLt0m8hRaFgFjr4nv3\njXufmbz209SHFeVkwe5hH2ROwXUeTry2imdC4Y10T8QicfNdPz5xqG00zWPKujgHmyb3IqcAo5V1\nImKVNFZp8VDIe47Pj8ykW1twC+eEcVUHgECjWKRMQqvDoJWfCoyFxoR5WPlSB8r0xR56oZTvp+Z9\n9730vKiNNnnu0fsG6GVVCbr/x50UKfX5WhjZLh5i9sD4a/WJdQXyd+7VYrmREnLuSMxLZOTcys9O\nj3E9OfJ8qrr5Gyups5il0Go+xH3sjh/QE2H2NXwsjjeYZYmrukmgmJExan1OkUR6bxRT+tZZSW8a\nr0ShjFmwiV4lM56X98l6mOHse8vt7a+EfyaThzr7KLeCXVK+VizLWQ/R0pw1qxYAYB1kci53cMiz\ntpc8I2vHfGmKyzrPvdsVLxrjimvK1IkHVd52nn0K/kGr7d3frQUAzIqs8b4q72/jPXLiqd72MRiZ\nb7yI/EW0NOauYVhr3j1SBj0sIRDXcm7kBHo9K6RZ9ylT2vR1MF4yxvIaeJ7eV0XtXAfFj9ATAX/m\n67YkEvbJvudGoghL2Ir5rrfiOddgSqFaIu/MPU2hhL0FT6EXWs69kkDRHYLVJ2FsEv5r5kBKh3SM\nwTdPQtQf4znXkpL3jnjRFv+JstBfxtDEXddy/s+OrD/EkzyFgrbeGF6Yhwl1kYdmTst5PvvvUsZy\nzEet/TwLpsq8V48ERVEURVEURVEURVEmzFvHIyERY8VKsTJ+h4ND2pRgTTQaKzfFy3q9WTxPDCG+\np+G1PyAaPZNE762M8UzwWLMpOReSJEatqqKlFTWsTxIlOiMcY5/0k/0amvcUqfozYRKt5nZ7e5Ku\nJEkkaOgNznBCCa9XSB76avuAO/zWsFm7q5lczczmmV9cCQBwxGr3qrHuI7HU9jaZAM7g4PjHr5Ic\nNvF9bxfsXXvHPXY3MM+B54loxnfjeK+0twWJ8jth7ZuEafYYz7sJf9dbgNeT+d66NzHQbyMS17m9\nmUljrUSZd0ASyE3dpU0NiRZps1fJGcgZoJdl5l30LnTEU9m17UMPN2+xue9sSvCwbaWX1ajM434e\nF8u7ktqoR4KiKIqiKIqiKIqiKBPmreuRoCjK249X0aw7b0OLjDJB3mLWlsPJ62bdf4t7IyivgY6t\n8g4kpUqVJoFXa//b0fv6EFTmvSVRjwRFURRFURRFURRFUSaM9VbJ9KooiqIoiqIoiqIoSvJRjwRF\nURRFURRFURRFUSaMKhIURVEURVEURVEURZkwqkhQFEVRFEVRFEVRFGXCqCJBURRFURRFURRFUZQJ\no4oERVEURVEURVEURVEmjCoSFEVRFEVRFEVRFEWZMKpIUBRFURRFURRFURRlwqgiQVEURVEURVEU\nRVGUCaOKBEVRFEVRFEVRFEVRJowqEhRFURRFURRFURRFmTCqSFAURVEURVEURVEUZcKoIkFRFEVR\nFEVRFEVRlAmjigRFURRFURRFURRFUSaMKhIURVEURVEURVEURZkwqkhQFEVRFEVRFEVRFGXCqCJB\nURRFURRFURRFUZQJo4oERVEURVEURVEURVEmjCoSFEVRFEVRFEVRFEWZMKpIUBRFURRFURRFURRl\nwqgiQVEURVEURVEURVGUCaOKBEVRFEVRFEVRFEVRJowqEhRFURRFURRFURRFmTCqSFAURVEURVEU\nRVEUZcL8f0TShDc/AQ7yAAAAAElFTkSuQmCC\n",
283 | "text/plain": [
284 | ""
285 | ]
286 | },
287 | "metadata": {},
288 | "output_type": "display_data"
289 | }
290 | ],
291 | "source": [
292 | "f, axarr = plt.subplots(1, 16, figsize=(18, 12))\n",
293 | "\n",
294 | "samples = x_mu.data.view(-1, 28, 28).numpy()\n",
295 | "\n",
296 | "for i, ax in enumerate(axarr.flat):\n",
297 | " ax.imshow(samples[i])\n",
298 | " ax.axis(\"off\")"
299 | ]
300 | },
301 | {
302 | "cell_type": "code",
303 | "execution_count": null,
304 | "metadata": {},
305 | "outputs": [],
306 | "source": []
307 | }
308 | ],
309 | "metadata": {
310 | "kernelspec": {
311 | "display_name": "Python 3",
312 | "language": "python",
313 | "name": "python3"
314 | },
315 | "language_info": {
316 | "codemirror_mode": {
317 | "name": "ipython",
318 | "version": 3
319 | },
320 | "file_extension": ".py",
321 | "mimetype": "text/x-python",
322 | "name": "python",
323 | "nbconvert_exporter": "python",
324 | "pygments_lexer": "ipython3",
325 | "version": "3.6.0"
326 | }
327 | },
328 | "nbformat": 4,
329 | "nbformat_minor": 2
330 | }
331 |
--------------------------------------------------------------------------------
/examples/notebooks/Ladder Deep Generative Model.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 2,
6 | "metadata": {
7 | "code_folding": [
8 | 0
9 | ]
10 | },
11 | "outputs": [],
12 | "source": [
13 | "# Imports\n",
14 | "import torch\n",
15 | "cuda = torch.cuda.is_available()\n",
16 | "import numpy as np\n",
17 | "import matplotlib.pyplot as plt\n",
18 | "%matplotlib inline\n",
19 | "import sys\n",
20 | "sys.path.append(\"../../semi-supervised\")"
21 | ]
22 | },
23 | {
24 | "cell_type": "markdown",
25 | "metadata": {},
26 | "source": [
27 | "# Ladder Deep Generative Model\n",
28 | "\n",
29 | "The final model we present is the ladder deep generative model which combines the idea of the ladder variational auto encoder with the deep generative model. It therefore learns a hierarchical relationship of latent variables and therefore has many of the properties of standard ladder VAEs such as strong generative performance and good disentanglement. The model below shows a three layer LadderDGM, but the number of stochastic layer can be arbitrary.\n",
30 | "\n",
31 | "
\n",
32 | "\n",
33 | "This model serves as an example of what you can do with the tools provided this library to create new and novel models."
34 | ]
35 | },
36 | {
37 | "cell_type": "code",
38 | "execution_count": 5,
39 | "metadata": {},
40 | "outputs": [
41 | {
42 | "data": {
43 | "text/plain": [
44 | "LadderDeepGenerativeModel(\n",
45 | " (encoder): ModuleList(\n",
46 | " (0): LadderEncoder(\n",
47 | " (linear): Linear(in_features=784, out_features=256)\n",
48 | " (batchnorm): BatchNorm1d(256, eps=1e-05, momentum=0.1, affine=True)\n",
49 | " (sample): GaussianSample(\n",
50 | " (mu): Linear(in_features=256, out_features=64)\n",
51 | " (log_var): Linear(in_features=256, out_features=64)\n",
52 | " )\n",
53 | " )\n",
54 | " (1): LadderEncoder(\n",
55 | " (linear): Linear(in_features=256, out_features=128)\n",
56 | " (batchnorm): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True)\n",
57 | " (sample): GaussianSample(\n",
58 | " (mu): Linear(in_features=128, out_features=32)\n",
59 | " (log_var): Linear(in_features=128, out_features=32)\n",
60 | " )\n",
61 | " )\n",
62 | " (2): LadderEncoder(\n",
63 | " (linear): Linear(in_features=138, out_features=64)\n",
64 | " (batchnorm): BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True)\n",
65 | " (sample): GaussianSample(\n",
66 | " (mu): Linear(in_features=64, out_features=16)\n",
67 | " (log_var): Linear(in_features=64, out_features=16)\n",
68 | " )\n",
69 | " )\n",
70 | " )\n",
71 | " (decoder): ModuleList(\n",
72 | " (0): LadderDecoder(\n",
73 | " (linear1): Linear(in_features=16, out_features=128)\n",
74 | " (batchnorm1): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True)\n",
75 | " (merge): GaussianMerge(\n",
76 | " (mu): Linear(in_features=128, out_features=32)\n",
77 | " (log_var): Linear(in_features=128, out_features=32)\n",
78 | " )\n",
79 | " (linear2): Linear(in_features=16, out_features=128)\n",
80 | " (batchnorm2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True)\n",
81 | " (sample): GaussianSample(\n",
82 | " (mu): Linear(in_features=128, out_features=32)\n",
83 | " (log_var): Linear(in_features=128, out_features=32)\n",
84 | " )\n",
85 | " )\n",
86 | " (1): LadderDecoder(\n",
87 | " (linear1): Linear(in_features=32, out_features=256)\n",
88 | " (batchnorm1): BatchNorm1d(256, eps=1e-05, momentum=0.1, affine=True)\n",
89 | " (merge): GaussianMerge(\n",
90 | " (mu): Linear(in_features=256, out_features=64)\n",
91 | " (log_var): Linear(in_features=256, out_features=64)\n",
92 | " )\n",
93 | " (linear2): Linear(in_features=32, out_features=256)\n",
94 | " (batchnorm2): BatchNorm1d(256, eps=1e-05, momentum=0.1, affine=True)\n",
95 | " (sample): GaussianSample(\n",
96 | " (mu): Linear(in_features=256, out_features=64)\n",
97 | " (log_var): Linear(in_features=256, out_features=64)\n",
98 | " )\n",
99 | " )\n",
100 | " )\n",
101 | " (classifier): Classifier(\n",
102 | " (dense): Linear(in_features=784, out_features=256)\n",
103 | " (logits): Linear(in_features=256, out_features=10)\n",
104 | " )\n",
105 | " (reconstruction): Decoder(\n",
106 | " (hidden): ModuleList(\n",
107 | " (0): Linear(in_features=74, out_features=256)\n",
108 | " (1): Linear(in_features=256, out_features=128)\n",
109 | " (2): Linear(in_features=128, out_features=64)\n",
110 | " )\n",
111 | " (reconstruction): Linear(in_features=64, out_features=784)\n",
112 | " (output_activation): Sigmoid()\n",
113 | " )\n",
114 | ")"
115 | ]
116 | },
117 | "execution_count": 5,
118 | "metadata": {},
119 | "output_type": "execute_result"
120 | }
121 | ],
122 | "source": [
123 | "from models import LadderDeepGenerativeModel\n",
124 | "\n",
125 | "y_dim = 10\n",
126 | "z_dim = [64, 32, 16]\n",
127 | "h_dim = [256, 128, 64]\n",
128 | "\n",
129 | "model = LadderDeepGenerativeModel([784, y_dim, z_dim, h_dim])\n",
130 | "model"
131 | ]
132 | },
133 | {
134 | "cell_type": "markdown",
135 | "metadata": {},
136 | "source": [
137 | "## Training\n",
138 | "\n",
139 | "We train the model similarly to the Ladder VAE, but add label information to the lower bound. The final bound is similar to that presented in the \"Deep Generative Model\" notebook. From a programming point of view, the model is trained in exactly the same way as the `DeepGenerativeModel` and the `AuxiliaryDeepGenerativeModel`.\n",
140 | "\n",
141 | "This simultanously highlights the power of the library, as it is model agnostic as long as the given model subclasses the `VariationalAutoencoder` class."
142 | ]
143 | },
144 | {
145 | "cell_type": "code",
146 | "execution_count": 6,
147 | "metadata": {},
148 | "outputs": [],
149 | "source": [
150 | "from datautils import get_mnist\n",
151 | "\n",
152 | "# Only use 10 labelled examples per class\n",
153 | "# The rest of the data is unlabelled.\n",
154 | "labelled, unlabelled, validation = get_mnist(location=\"./\", batch_size=64, labels_per_class=10)\n",
155 | "alpha = 0.1 * (len(unlabelled) + len(labelled)) / len(labelled)\n",
156 | "\n",
157 | "def binary_cross_entropy(r, x):\n",
158 | " return -torch.sum(x * torch.log(r + 1e-8) + (1 - x) * torch.log(1 - r + 1e-8), dim=-1)\n",
159 | "\n",
160 | "optimizer = torch.optim.Adam(model.parameters(), lr=3e-4, betas=(0.9, 0.999))"
161 | ]
162 | },
163 | {
164 | "cell_type": "code",
165 | "execution_count": 8,
166 | "metadata": {},
167 | "outputs": [],
168 | "source": [
169 | "from itertools import cycle\n",
170 | "from inference import SVI, DeterministicWarmup\n",
171 | "\n",
172 | "# We will need to use warm-up in order to achieve good performance.\n",
173 | "# Over 200 calls to SVI we change the autoencoder from\n",
174 | "# deterministic to stochastic.\n",
175 | "beta = DeterministicWarmup(n=200)\n",
176 | "\n",
177 | "if cuda: model = model.cuda()\n",
178 | "elbo = SVI(model, likelihood=binary_cross_entropy, beta=beta)"
179 | ]
180 | },
181 | {
182 | "cell_type": "markdown",
183 | "metadata": {},
184 | "source": [
185 | "The library is conventially packed with the `SVI` method that does all of the work of calculating the lower bound for both labelled and unlabelled data depending on whether the label is given. It also manages to perform the enumeration of all the labels.\n",
186 | "\n",
187 | "Remember that the labels have to be in a *one-hot encoded* format in order to work with SVI."
188 | ]
189 | },
190 | {
191 | "cell_type": "code",
192 | "execution_count": null,
193 | "metadata": {},
194 | "outputs": [],
195 | "source": [
196 | "from torch.autograd import Variable\n",
197 | "\n",
198 | "for epoch in range(10):\n",
199 | " model.train()\n",
200 | " total_loss, accuracy = (0, 0)\n",
201 | " for (x, y), (u, _) in zip(cycle(labelled), unlabelled):\n",
202 | " # Wrap in variables\n",
203 | " x, y, u = Variable(x), Variable(y), Variable(u)\n",
204 | "\n",
205 | " if cuda:\n",
206 | " # They need to be on the same device and be synchronized.\n",
207 | " x, y = x.cuda(device=0), y.cuda(device=0)\n",
208 | " u = u.cuda(device=0)\n",
209 | "\n",
210 | " L = -elbo(x, y)\n",
211 | " U = -elbo(u)\n",
212 | "\n",
213 | " # Add auxiliary classification loss q(y|x)\n",
214 | " logits = model.classify(x)\n",
215 | " \n",
216 | " # Regular cross entropy\n",
217 | " classication_loss = torch.sum(y * torch.log(logits + 1e-8), dim=1).mean()\n",
218 | "\n",
219 | " J_alpha = L - alpha * classication_loss + U\n",
220 | "\n",
221 | " J_alpha.backward()\n",
222 | " optimizer.step()\n",
223 | " optimizer.zero_grad()\n",
224 | "\n",
225 | " total_loss += J_alpha.data[0]\n",
226 | " accuracy += torch.mean((torch.max(logits, 1)[1].data == torch.max(y, 1)[1].data).float())\n",
227 | " \n",
228 | " if epoch % 1 == 0:\n",
229 | " model.eval()\n",
230 | " m = len(unlabelled)\n",
231 | " print(\"Epoch: {}\".format(epoch))\n",
232 | " print(\"[Train]\\t\\t J_a: {:.2f}, accuracy: {:.2f}\".format(total_loss / m, accuracy / m))\n",
233 | "\n",
234 | " total_loss, accuracy = (0, 0)\n",
235 | " for x, y in validation:\n",
236 | " x, y = Variable(x), Variable(y)\n",
237 | "\n",
238 | " if cuda:\n",
239 | " x, y = x.cuda(device=0), y.cuda(device=0)\n",
240 | "\n",
241 | " L = -elbo(x, y)\n",
242 | " U = -elbo(x)\n",
243 | "\n",
244 | " logits = model.classify(x)\n",
245 | " classication_loss = -torch.sum(y * torch.log(logits + 1e-8), dim=1).mean()\n",
246 | "\n",
247 | " J_alpha = L + alpha * classication_loss + U\n",
248 | "\n",
249 | " total_loss += J_alpha.data[0]\n",
250 | "\n",
251 | " _, pred_idx = torch.max(logits, 1)\n",
252 | " _, lab_idx = torch.max(y, 1)\n",
253 | " accuracy += torch.mean((torch.max(logits, 1)[1].data == torch.max(y, 1)[1].data).float())\n",
254 | "\n",
255 | " m = len(validation)\n",
256 | " print(\"[Validation]\\t J_a: {:.2f}, accuracy: {:.2f}\".format(total_loss / m, accuracy / m))"
257 | ]
258 | },
259 | {
260 | "cell_type": "markdown",
261 | "metadata": {},
262 | "source": [
263 | "## Conditional generation\n",
264 | "\n",
265 | "When the model is done training you can generate samples conditionally given some normal distributed noise $z$ and a label $y$.\n",
266 | "\n",
267 | "*The model below has only trained for 10 iterations, so the perfomance is not representative*."
268 | ]
269 | },
270 | {
271 | "cell_type": "code",
272 | "execution_count": 22,
273 | "metadata": {},
274 | "outputs": [],
275 | "source": [
276 | "from utils import onehot\n",
277 | "model.eval()\n",
278 | "\n",
279 | "z = Variable(torch.randn(16, 32))\n",
280 | "\n",
281 | "# Generate a batch of 5s\n",
282 | "y = Variable(onehot(10)(5).repeat(16, 1))\n",
283 | "\n",
284 | "x_mu = model.sample(z, y)"
285 | ]
286 | },
287 | {
288 | "cell_type": "code",
289 | "execution_count": 13,
290 | "metadata": {},
291 | "outputs": [
292 | {
293 | "data": {
294 | "image/png": "iVBORw0KGgoAAAANSUhEUgAABBIAAABXCAYAAAC5gmYKAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztnXd4XOWZ9u8zRRqNem9WsWW5G9sUG0yHUAMECBBINoUU\nNpue7CbZdJJsSF1SN+ELCWm00DuhQyi2ccE27lWWbfXeZqSZc873x/28R9KYIoilGeD5XZcvefp5\n2/O+56mW67pQFEVRFEVRFEVRFEWZCL5kX4CiKIqiKIqiKIqiKG8dVJGgKIqiKIqiKIqiKMqEUUWC\noiiKoiiKoiiKoigTRhUJiqIoiqIoiqIoiqJMGFUkKIqiKIqiKIqiKIoyYVSRoCiKoiiKoiiKoijK\nhFFFgqIoiqIoiqIoiqIoE0YVCYqiKIqiKIqiKIqiTBhVJCiKoiiKoiiKoiiKMmECU/ljZ/gudafy\n96aCx5zbrYm8753cduCd3f53ctuBd3b738ltB97Z7de2v73Qea9jPxHeye1/J7cdeGe3X9v+9uKN\nzHv1SFAURVEURVEURVEUZcKoIkFRFEVRFEVRFEVRlAmjigRFURRFURRFURRFUSaMKhIURXlnY1n8\npyiKoiiKoijKhFBFgqIoiqIoiqIoiqIoE2ZKqzZMGa9iXbT8fgCAa9sAAH9BPgBgePF0AEDgqfXj\nP+A6gPu2S8Y5Hh/7BI6d3OtQlKlC5ry/sAAA4Eajh7zFjQ7zb2xk6q5rCvGFQgAA5xXa/k7ACnDr\nc+NxeUL2jLe7vFeUdxpG3ufnAgDsrm4+/05b6+9kGfdObbuZ+wV5AAC7q4fPuw4A3hOZ+6G3bd8k\n3g++Xds5Bn8+721RUggAsHfuBZBwD3wY7/nUI0FRFEVRFEVRFEVRlAnz1vNIEO2S0awYy5Ivnxq3\nwSVVsEPUj1gONU9NJ/BxqIN/hyqoiZl+D61Roc0HAABubg4AwOnvl98KjtHWOfI3BbVZ0ie+cBhW\nWhoAwBka4ksJXhijHxEtnbwOH/vGHTnUAus9l4ptH4tlAVaCbsyMmzzv9Uc8Jq+neJsmivEsAQ5p\ns6d5fL08AG/FvjBtGjvurzLmvgxa4a3sLACAXUKtrZ0jaybA9/uH4vDFxGtpfxu/sreP73mrWvAT\nLDLGEm/kp9d/rgNX5GZKy7w3i9k/RE6avx5+/6g3yltBRlhWal9fkvE8TxK8EK1gkG8IpfP1YADW\nAPfMeHPLFF/lJJKw7q2gzPsg+8XrH9f1zgS2Of+k8rwau9fL/pbYNnMmdAcH+b5Cjn0gJ5uPI5Tl\n7lDEO+M4w1z7Kdt2GSP/3HrgAOepmdujZxuR7eEM/s3gX7ulbdxXjTvbjpH/49+Uev1gBQKHnGdf\n/b3B8Y9lvUM+78bioy9K21PWQm/GPi8PzsDg+JcS1jNKi8Z9Bm0d/Kx45bgjsrfFYqN7oLzXvJay\n3piv512ScCYMlLAvDjnzBAJAGudHfO++1/7OZDLWo8qMkcgun7RtuJaeB7vfz/cWrGEbnQDfP/iR\nYn7HMB9nNbrIPsBxTntqI7/zXxjvt44iwRwARTD4i8QteVA2/yp2VOPZPviKuBl8ZtHTAIB7Di4G\nAPQPy0bzLDu//z+7AACRvzG0If/ZRgCAzycTsbQIVu8AAMDp6eXvDZtD5hgBNNUkLBR/Pa+/f14h\nfDEuhPwvc2Fse2YGAGC4hMJx9h8jAIChUm4u0XxOvMI1FDRIZ/92z89B/ga6QVl72C9ms01q218B\nfw4VQFZONkamlwAAWo5l+yBdlbeT7c9+egcAwK0q5Qu79wMAHLMwM8P8WCgEywiZ5la+N5XCP0S4\njJx5JAAgvLUFrmySkek8MIU3NwMAXDkYms3Hl5XJ580NkxGeC+sBANGSDLh+dlzGoxv4HrPhJrsP\nEg/GCQpFAPDlyWZp3mOEb4yCc2hhJf8W8zMdS/i+UDvXk29pBIO9nD8lj3H9FL7AvnTMhpMMEjdQ\nmQPmEAEAvroa/qdT1q7cMNmt7XxdXBz7l1NmpHdLn5RRNvbU+WDJWbL270387H7+nfKDhVGOyQHP\nC0WRteodgIoLgG4qeqxMjpsjh2bvZiJCudd5wTwAwOCF8v4VnCvVNzcAAIYWVCDUTJnva+SY2318\nnPS5PwZfNm+G3EhktB+MwkxulJxuzgFH1nmgZhrfJ4/N674iHkLgunDDEvLSeJDPmQN3Ksn8MWcB\n7wBdwzXthGQez2cfZB3gnO2po2wcKpfPyhyvuX4nAODg++tR+SDH20oX5YK5qUxVLMsbe38Jzz9u\nBq/dGpKbZDkfWTmiPC3gXunJkBGOa9/8fGTt43sDrZwX9kHZP5Ix9gkKP7P/GBft3lNnItTF5/b8\nGz+S/bLsf6VsWy63euRewblcmUkZuPa+BQCAUBff17XERuFarp2Se3YBAOxOng1TZs1Lfzgn8Cw7\nkhlAbJ6E5uZw3woOytlvtSgYOtiG/R+eDQCoeIZ959vH84yVLn0bSvNuqNwGGtXgcIGkhOJc2h6o\nrAAgoYi2uOaLssQ7lw9Rzpuzqq+WMs+sB0ib91/I74plAUG5J6+6i/PE7ae8tzs6J6c9EyXB8OPP\n4/ghLQjf9CoAQKyc6znYxmv21r2EMAycOBMAkCVy0jXrSvaA6IxCxMOcP1mb5IwwzDNBfP+Bw9yg\nN48JxYTfD0sMQn2nSNsaOICxXK7/9Cbu7a0nyX2eHInSu9j2yqd4L7fjw1moeZDrO8MYimSvTwkl\nilEg5I7e30Dk4PCiWgDAYBnXbetpfL78MX7m+K+sBABEbM73snS2+YZ1xwMAPnPZo/jVnecBAKqH\n5/N3XtgM4M21XUMbFEVRFEVRFEVRFEWZMKnrkZBgdfeJxtTTUOdTS9PwcVoMZ5zOZBI3Vz+AAj+1\ncvf2HwEAaOmhdWKkhdZmfxE1t4N7aeEqkV5ws8QaLdo7Oz0Iq1A0+EZDbUIBkqGl94130e64jO0L\nXEbr2w9nXY9SPzVqMdER/f6ckwEAW3tpgT94CjW0Q+XU6Ba8zK+OldBiMZIjmmm/haHpbHtWj7iA\nt7RORqveMMb6bB+/EADQeAq10vHZQ3BaxXWtiJppJ8r35p9Oj4udl1I9+d9L/gEA+PVvLwYAlD8r\nFtxBsXbaDiAuXl6iGjMHkuH+JGMfqKblbev3qG394KIXAADPtNbjwEY+F+zn2E9v5HzuPkHGvIzP\n119MU03Pt+oAAHsu4prKaJXQn+o4rDjXX3Wccyz02EsADvV8nDKMW5+4JsO434vFNLq0HgOVbMdw\nAd+b3s33OLJkIyV8fiTXldfFypNPDezso+l5c0LBblz38okAgNYT+eHsRsoK//6Dh7lhr49x2fVP\nKwcA2AVcq63LaE0/8aOr8Y/HjgYAxMvZltwXxbVRVMVWvBoA0LNcrKySc+n7Z90LAFgeoqfFNw6c\nj5cemQsAaLyEVpuq66nNtrunS
EvveV6M90Roe88sAEA/nSngmKgMB6h6gms0ls3xioV57RkdlNNt\nS9iHg/VsQ8hmxyx8z3YAQNn7aJHY9N/l8HXy/6nk5OjNAeOmmcc9rWNZAbIOUk4N57FDhkpMCB9b\nEG5if+acRwvz4F3sQJ8Mp1+82HL2RuEEZcJUsj+DK7fw95LpkZCQMMytFK+zE/OR1sdrbzuVjTl/\nId00319Ai8zKCGXckx1z+FUyqnvu4/NDx9QCAKbd0eDJklTDeEh4j2X/c+bPwIHTOA+KTqPXUFkm\nEwkWpdFKt6adlsv+Z9ln8TDbn0dHDPTXcG5Ey2yEWvhd056UvUY82Lx9byowYZrSZktkXvtJZQCA\nriN4/Xe+55dojHMvmJNGS+qfFx8HAGgd5rnlE5c+AwAIWZy7Tw5Sri38AGX4qm6uA6erCDl76aXn\nVLOfLOOBmiyPBONtJt6xvlk853bP4Nnv5M+vxCk5WwEAn3nigwCA3M08uwXnyVhn8MwXLWSf2Zl8\n3amnbPQNs198A8OIVHMvCebyLOXfsHNSmjUhzFmnlF428Rq2ZySdz3fPCmFQPIsy2ti2opd53jPn\nV2NlbzmWf2v+Qfmw5xLpVwldrJ7TipjD9+wN82xVe4t4JskcmHIS14B4n1lyf7L33yoRrec5NXCQ\n7ynYxL/hdo5pLIvtHM5m28IZ4mk8j9/VS2M+0noszzvLtdjPORuMx8p4uTOVGI87TK8c9/zebwZx\nYs0eAMA5WTzDV6XRc2R7lLJiQx/Pux1yxonLfV6og33SeQTlQ+19I17bTVJCJISMJIPANLa568Sq\ncc+HPtqMhr1cu+G9HM+cfWxAsJ2Pu+Zy7ty1mZ5L31t6HwDgwAj74Jrj7gIA/O/OM1DxHM8Ng2Uc\n51zjtaIeCYqiKIqiKIqiKIqiTCap65HgWX1FZSSxvpbkRug4ihqkSDW1KldWPA8A+J/G85CXRu3k\n89tpdfB187OBEdH0ScKJ4rWi1ewQi840amVD2xljZsVsWM2SbM1nEjhKQpbD08o3hPGU2PYbWleW\nzaI17dbpTwIAvtyyBOk+aiTv2rUIABDpk9gim20NLWW8fOYKavwikiqgex7fV3cnNXKuLwTfiLRV\n4iwPSWQ41YimeujdzAuw/yw+/aezrgMAfOln/46zr+I8MP2wfYANXHuA2r3PLXoKAFAYoOfGpz51\nDwBg/Ydosd13MS1+Tn4WrFa22yuZ4yUkmnorhbFMtJxJbeW/L3kcAPDVQloOllx/MtyZHK+ahzjG\n3Yu5VsxkDZ9JTfPHy58FAPj+wPd32bRwf+eOywEAvzz9Rnxp9WUAgMwt1PY6xhNoqi2TCblRIFbJ\neBHnb9sxtCT1zo0jmM+5W5zHsW3t5Hq+bN5aAEBRkM//ZdcyAMDgLr5u9XNdbW2l1Wt6ZidssVgX\nrxALdxY70T+Fa8BYoX0z6UWz53tco+fVbQIA3Fv6IgDgp50L8Z2LbwMAfPvFCwAAgXPpgXNJDT1J\nrspjrosNIxzroFjpMi3KvofEWnd5ySqsmcu1UHqj5JTJE6+s7u7D28BEEmNCxSphz6AWfuR8rsPs\nBzkHpv0bLRMH/joDey7nuBS+yPGyg/yu/7qa/VKfRjn+k6azAQCb2znWjsv3bfwONfgDtQH4hykz\ngpsaAIx6wznRqV/3vjCtUINnMK47vUe8pL4jOSD+XICWj0tOoPmPAADOyNwGAGiIsZ9+uo9tzpV9\ncc5/7Br3G7c+wbjJtqXpKHmR/VGwVuKDTf6FJMZKN3x/KQAgZzcfF3+Q3jPDnX7cePT14977jX0X\nAgAeCHA8r8jjGjEeCRt2ch8oOIXrI3z/6Ji6Enft5YNJVsk443koOW96LqRXWNZ+jvOuD0sc7Lxd\n+HXlgwCAWUHKwb0xyriQXPp2yYnwo9A5AICAj22ceS6t+A/vYs6QWcWd2JHFed+/m3KmoEmsglPo\nkeDlvJG27/4w16k7k3vx3cf9PwDA1Y3n4yPl3OsfGmCM78ONbEtemPP8mCp2wm976Mlk5P7QEL87\n6zmureCQi6DERltb6dnq+sbLoqnGjL3xyBio51r+zdW/AgBc03geKtMpjz9x3D8BABsX8Gyw8WHO\n9aDkzCx+iXO8p47jGs8Qj70ezoX0vkz4h/n/QJdYZI01Ojr1eULMOdfkPTt4CvesCy/nueW5tjpU\nh3id6/dyPWdIzpuWbbSqB8o5X9IDbPuBqzivavIp1zoGuF7+t/42/Lz5TACAu1M8H6ZJUs6G/Ye/\nca9FwlnHyhrvJbP9Ks7XGXUHcGkFzzRlx9FrouECSbrn8LM376aHYsYdPOPEs+UsERvvpdZX7yB7\nr0lGT0u0OVv5JK/SVGK8IIIPiDx7gGfYn/77HwEAO4fL8KFc7m8d4kE26HK+FPs5B87K4vnoY130\n1HED4oF+JOVC8Z/EY68kiMCweKful4SlJvlsEvLj+DLZ5sEjeN7pWMQxet/ZzwEAHtw3H7XTue/v\nG6FcLF7PdVv+PN8bKeA8j/bRs+ibIxcBAALiseCbznVTXdSNAfH0LHiB3mzO6yVjf61rf9OfVBRF\nURRFURRFURTlHUfqeiQYEiyAUSlz0XEUNUmXHb0aANAepxZt8/ZpXqb+tHY2L9gvMaJ7qb0ZLDMl\nMKihCuxkdlKjAfQyHRtL/BgcUzZlKjExU5KR/bhZtMT9pOp+AMB9g9RY3vvIsbDEwJLRZkpC8nHF\nM9RENVxA7W6BaODynm3gGzxNHLWSWTvHWN8SSmglC6+sU5xj/7N3/R0AsCCNqvdffPl3GHKo0Xyg\nmxapoTg1sTO/RAvUn353LAAgnMZx7H2Cmr3CzXyc0c6kEW5zCxzHlMxKnpbSYCyTBVtpGTwuk54I\n6801OUDROo5561KuhZx9tKxlbaDGMf+ztLr8x1Mf4uNSanB7d1MDnx7l57/2p4+gZhW/t11yahTc\n1DwZzXpdEqu0dC7lXO9YzGs9+UTGRG/rKcEFlRy7O/YtAQD8dtmNAIDfNzNPSCxT4gY30rpT9wD7\nY7iIlprhPK7/562lKBEjZHo3+zC8h9YfOz4F69+s91yOY/diyrzPz+d6/2QeY3wb4xyjv+8+EmdW\nU0tfXkwLRX0erQnXPX8qAOCCs+mRcOU/rwQA5K7jOvGLRt4RC74v5qJ2E+Web1g8cpqmuByeyHwj\nb3wj/DtwkNbVusspr/ffzLjhoWkW8l7iPIlxCHHqB2mJ7rG5btZF6WXR2M+5Xn6VVOwJ8wOZLr11\nMrdYcMQS44hl2p2KMU/AK2ssOUHM+NT/jHHRaeJxde9J5fjDkTcDAJalU8Zn+dims5+hReaaZXcD\nAL6/6VwAwIHrGCDbM0u87DI4B+ruGYY/wrZaEc4te+jQPXCqMNap/CUcj69f9hAAYHmIj0P1fuyX\nvWBnjBa5r1c/AAD43Lc+CwB46DJaqT9UtwoA0HZ9LQCg4CU5JLTwu+xIdEy50ySVPTXZ6S
U2PDqX\nFubTv0LL+5IwPTGWhSjPm+x0zxNh2OW4Ffs5b37VRY/EZZn0Pknzcw1tbqSFe/Mwv9ufwed37C8F\nesT7RJrtNCchJ5Ip2SuZytMX0gvpE/WSC2iQFQimhXvw/e3MOh79J8c+Lmed4GOcsxddw/m+bQVz\nIdTdSBluOXzd3b/b+1mvIlWyq5R4VmnJA1XA8V3wTe5zVX6uy/rsNuT52Q7HZZ9tauXYlr3ItrQe\nzfE01S3yxMtouIZ7aV8111dG0yAsWwa9i/uHV15wKnNEmFwoUkVmKJ/XV3w697tPFqwAADzeNBsV\nGTy7DFVxjg6M8L0lXOZY8AWO7eOb6Gnn62V/HojzN0oL2c5LnvwUrAE5353Gts79meQEStIc8CrR\nyP3I/rPoVfA/p/C8m+2L4Kzw+PwNmwLcz27rofdWbgbPid3l3D9ydrNtmc2cG5bDuVH890G4Uvba\n7LN+qdoSn0LZ7+13M2sBAEvzeV752WfuBDDqcVUT2IphkctRl2MZkpueC+//FADg1GX0SDBORXN/\nzr6xJf+HJef6/LW7YEk1CFPRy0nGfpdQdrN/GvviPy9gfoNsyfl357oT0VDJ9xRu4JhFC/nZ/Jc5\nHzKaxZtNKrNA8h3E51MGuiJfI2UVyN0nOXBEzo8rhfoGUY8ERVEURVEURVEURVEmTMp7JHhZa6U2\nfO90atIq59JKtrmXWtjbtlETVzW9Hc3raWWueJYalrQeanH9Ev+VL1Z3V7SursSBWdm01sNP/Yrd\n1eP9vmuyxCchbb2pVLH/XGreL8inheKMVf8BAAg9TctlcZuN7D20svoP0gIfl/gqV9pU/3upCS81\nxG2jeTaWgOAY74NED4RkZTA2FlrRVu6/guP6tTs+AAD4+sXUWv7mp+/FQDXfW/f7BgCA3cZ+MJaG\nkosltlgs/Nkj7A+TGdce63Ugmk93OHmeGEZTa7LK+q6mBe3RPqlYEaHGOR4GMtukEsfdzLTuK+Zn\njLWl6z202s4LS51pqZ0bOovfkf8CqxYMLixH+oYGAEDaSmpDnan2RjFjLjkRovNoQWs9weRM4d8X\n9lPTWlPYhRVdtFB37mN7vrDyEwCAdElx0buBHkh1Uh8dUnM60MQ+Dudy/VsDQ4DkRDHrxpnCeEnj\nhdF9Rj0AIHq5WOVyeQ11f6fmPdQuGZlbXDyeS0+boo0SR51Lq2a1xER+9u+fAQDM6WabrQaJkzdy\nLW18XCYwKh/tkSmq1mAswJKDxMh8d4DXnHGAc6F/BedCz5HiMeS4iM3mPE1/mev60b2ME354+zEA\ngHgNX897lhaIzAFa+BwT/+3lPxmV76Myf+oz4niWUclN4H6c6z5DSi2s7aSHRcWjAXy3inkxmjZK\n3oewtCGTcvLPl9MyW9NAWWfkemEh10mknnMl2NILmHwwycqJAoxaZeu5tjt7OGZfXncJAOCl4/8A\nAPjsgdPQPsw12/cjxkqHV7AiTWG21EB/nhaaW05gjoDCx2ipdExW/rFzOxnVeMYQKGOOggPvoxyr\nfy/bcl7OegDAt/Yy/8PKfLZtc2853lu2DgDwy+30PEq7n2tEDI547vH58uXsh/oQ2+sbopxoPZlj\nn9bvouAFWn5dkQX2VHrgmao84nk2cATPdTMLmLPg2mclIVI6527BijTk7mFbSrbRU8Ptk6QAcmaL\nf5Jnn9o8sTLupfx0ZE67ryTXkjwHvDOHXFvjWZzf3y6kR8aTEXql3rHuKNw1wJwPxXTKRUY6221L\nH9XeLGtAzrvGyyAtneObViBx0wfaPVlvqpIlwwvLL14odjnnwMIf0Cp9cg5zgT0+xHXxrort2NjL\nPaDtdsrBAXYL0ivZB3v/m54rM3xSsSIkuRI+IPHyD1FWhgqA8hfYP3aa7PmSF23KMXuQjEHn8YyV\nzz+J57V+m3JwYfpBpFv8/1db6Xl724vc5/x9PMvkUXQgv41z3ZGKF321kh+kieMbaGwb9TjO4d5v\nt3GvMflKpgLjfda1mHvS0jDl9CMD9ChDFs+0q6LTsSPCsbv/phMAAOKkA3EuxIHPcd8o6aA8MB6G\n1h622VS9g98PW+6DTIWMZKx/08++THpMiLMhfrqOuTssqUTn5jnIX8exKlrHs7u/SXKYyVnejfCs\nZCecZ3wvbhn3G8FNozmBHCMH/4W2q0eCoiiKoiiKoiiKoigTJvU9EkRbFp1FbX3OPmpPlhRRC71v\nSOq75/D52F9KUbeTVnnfLtHIJsS9Geuql6FTYlO8mECpEAGMWoaM1mhKE/aLlr7nEsZ8F2zj9c9J\np0V1JMrrn7ZJLKvrdsFKF0uSfEWgkdpVY4Ex9jbTF57FKaF2uxuPjz6XrGoNJmZU6qrGM9m2Lyxh\nlYrso9nu7z77HgDAvEcaUdTCMYy/iiXNGBydSHTcE25fnzxOperxQOwkxroGu3m9F5U/DQB4Txa1\n9GeuvQoAUH37fthiaTe5HYynjakL63k39Ij3iVie8h7ld5lM1eFVu2F3SYZ+T7M5Rf1iakhXURvf\ndTzHfiSb1/ah5cxQ3RSl5e3xLbQ8dzxS7Wly67fI+o9KvHcrrc7GA8M2Vh+JCTMZsn3G+6CrB5Y/\nYc4b7fxkemaY2uHzWG3m0m88CgCISibmTofzPXcH+6LwZVragpv2ejHFiPP60sXbwumnVt5Y1z17\ne4JnlWd96O0/ROZN2dh7Vkl6XrmiPTfx+pFyXlcgKhp88bZI67UwlGWyUvOr8m6meSJntVghxfPA\nEY29k9imsRmLU0AG+PM4vyFybEEB1/aRWdz3HtpDS/O0pijwfXpuzGqn5WW4ko/TN9CaayyRtomB\nNFYKaWdoQCy2ju1ZNpIh8418Mh54llzvj45m7eu1g7UAgHO3sKLM4K3lKNjCtmWsZ34UR6wsGOQ4\nm7YWPsB2eevBrOMUGGtTM7313bSk/ecnWWlkW4QycEaQsn+xeCLc/QS9j2b9vhV3DdAiOS0inoa2\nnHlMzK+0zyeeRpbse1YJ11hWk8ROr26AIzLDPQwWqjeKme/2NHpImJj9D5UzLv6lTeyb8kcoC/Of\n3uPlc/LyeCR6je7k/PeJHDvE8pasqhyvganUYwgslko14omU7eMYFa4IIreBz6XvE68qOfuZvB+O\n2f8Tz0IHaOHO6RDvo3j80PPgFGK8Q3vfxeoa5hxybh69xhak0er6we30QB3+Qzlyt/E8Wxbh+Xak\ngjIv7eUGAICVwb3DLua8So+yr+p+QWt091yOecXDrfRABDyvPCc2xd4YxgPLVCwROdh+FF++soLW\n5JBsbp/bcTkam+htWvwEx3zGQbYv2C2W6u7+cd/t9vD5ok0i18UbzRmTB87khLMS18sk4s/huWXg\nNJ7jzBnumHSOb2WA1/2/re8CAOz9wizEcsQrvUNkelDOjNvoUWtkvDmpebLe5F7y8qE4nqxIZm4E\nfxXPuHaeeMRfzPm+KIdn8Kw6ruPtv52P3F2Sv
2o35bwt+5w543sk3Kia152IePqmp3v3B4ejGp16\nJCiKoiiKoiiKoiiKMmFS1yMhoaZlaDu1qFu+Qy395RLn3TpMjVZgFzWQ6T1x+Nsl/lE+64rG2Vhg\nYYvmL318HKipyOBpV8bkBEiGptbEShesopX9gvuYiXyJZOd2+vl6pFiqUoRCniXV6adV1ou1TbTC\nSl947TKvO2O08wkxy1OO0SCKFXHvRYyDfb6HFtsdnbRe1N3M67Nb21+/skSi54V5f7KyNL8KxjKR\n3kyta94fmOvhkmwGv331IGvDl/9UtOgdXaOaRUc0rgk5LbyxFsu3G5HXBzmfAjXsX6e9c+rH3ngi\niKXMlbUZl4Lox32cscDfLd4MALhwJ2Nm89ZIvW3HRf42aYdkHXalNrgXA5hoeTLaaGO5aZHYSMf1\nrCJughV3Mq1XJkbUSeP1/nrlaQCAzWf/lq9bIq/8vLZgGzXvVkYG3G7xOBo2XigJ89nMhVexwo16\nIWB0bcSmaOyNZl4ss65YByy55m6xVF17DqtwhCzK6ds7GBe6qqkGvgZ+tmQd228yUJt17fXHq2nf\nU8gqCQC2eAYMn30kACDdxzj5n2xl3GThrbTipe1v8jxtjEU2XWJ8jZXl1fYuY7nxiQXOjcfH5IUQ\nq9wUWm1kdqzTAAAgAElEQVS9Kh1h7uUHfyweU37xqJCduX+Y66BwewT+XRLXb6x54k3oeRSVM07e\nGpJcE93dk9qGN4KxPA6ezszyw3ns65X9rKpxddkTAIB+GZMXv835Pmsl9wB3KALb5HdKyOXk8846\nJtcG32dkHUTWhTvZHy7GWOqmKicKMGoxNfkYZJ6l/yc9cC7MpAz/olSXSOuXeRhKN8W5YJn5bWSY\nOdskyMBD2pViax4Y4z04mxbaDKkuNSReadfu5frPPhAb9UToNHlN5CzQN/7sN+pNK3NB+sGse2d4\n+NAcMYfBSvm6mDxIsufHQ/zNT3/jdgDA2WHOiWu7jgAA9P+dZ//ibT3w9Up+M/G0TZNcWJBYd6eD\nVl2/mQuyP/r28TuLBliNCq0dnteOZ6H1jb/3OOy8mkyV33XqaKF2stn3M9N5/v/mGuZICewOIU/S\nPBVs5Nhb4n3ppsstnXgmwuQ8M/c4vbK3mvuDWHx0fiCh/ZO5PuS8Z34rew3leNkdHKccHz1HVg1z\nL3h0Ez3w5jW3I7hXzjiOeNYZb0uT5yORhLOsY8SA63j3WIdY9KcCuS5H5m7/0cz9MLeQlZnygrzv\naRyk173rAwKS588xnghvMJeJWft2LH5Yz7TqkaAoiqIoiqIoiqIoyoRJXY8EoyU0f8PUUGXsowYp\n8yRqpXZ00yrtl9gPO+RDvIxxUX7RxvlM3KTE1Bl9oyNZfhNjAt34JGsk3yCxcranLEhtXa6PWrqL\nl60BADy3ktl7ndoy2CG2OW0XVZaeljUhA7PzahmZk1WZ4ZUQjZkX219LbWpVBq0o2x+h1j7QJ9rm\nzAw4opQ0mrrEzLM+kxlYNNSpaJUAAEvi9kfKaGn9dDlrCIcstufJTWz7vGaJh7Tt19dOJnhjJMaK\n2k2mnmwStLOCU8TYve5FnPNDpby2lzqppb8tl7GQG3YwW3ONZOKNFvgxXCDeCXG+x+Q+cE1dZsmA\nPzYukG9wDnnsSvy9Z+mLT/48MR5SOz4iBdGlPvI1HQyW/Fg+44UzW8TSJBY3tyDXk2lGM+yarPvG\n22KC2udx1uupskYnyHpLsioPHkvPo+YzeE3GOvn1VlqoFkpm/ln1rXgyl1m6d+RSqx9sp4yccTtj\nSdEuFqtUkm+vgV9i2g98gG3/QQELpK/+X86F7N2SpTkcAsTabndIG9/geJnszQC8/vHyB02Fp5bJ\nDWIsY2Xc0z89+xkAQL3se1/cR8v9QCNlePRLA4iN0LKY+Tz7yxFx7x9mH5Q/yn3QyLZUwngPmGzx\nwZNoRU2TQOGHBpkX4PoGZicP94t8N+skOwv+GspFtMjYG08EGTcjF0b3O/HCkv3fFxz1XJhSTwSD\nWL59Odzn2o7g3/gAr9+W683aamQ7r7HlrEqk94j3heRTMN5rhc+LuVbiwo13T6ru9eOQtTAwi3vY\nZ+pYkeqKp/4dAJCzkXK9ZHDIi+2Pe+17ZRmf6JnoPT92bSd6rE2BnPSswbI/D5VJxa00est025Rr\nf97F823FBsl91NELR7LtHzJnTW4UwW7kHpEYJx/oE8tuPO55bpnKAe5kVytJzNFh1oDse12zGSt/\n1BxWVvr6yosAAKFdfD3c4qLkn1KJwFimE+9tEs86ieNpj97jeF4w4gU76e3HmHOV7D3xKnqlLM7e\nBADwS5/c2Xk0AKD0Cc6RjhMqvHky7WF65Ph6xfPUfKfxRpzAevfOulOZL0XWuD+faxylbPtQMZ9/\ndis9MNOzOA5pq8RTsxgYquP5OGSqiL1Bz6Fx3kmHsa3qkaAoiqIoiqIoiqIoyoRJXY8EwTKaIol5\njM6ips1kMl9UxGzFj88QC2aNhYonaI2K19NK0cty7Ch/nhr9oGi7/RslxthoRsWiOy5PQBIxGkqr\nb3ybhyTI55gsZiXeeiWtcFv3VCC8m1rFCrAO856LqGWtepwaq4xntwGY4jjIN4vJkSDZpGFxXO7d\nTotkOocZnUfQQlU0XA7fQbE8iQY6chy1e7FsauKyd3HMjZ+CV50gVawVoq305VFbGS3kmAeNdbp9\nqbyPf+xCaiut1vYxmdap5TUx5yZu0Hg5ePGyppa6mfdJqB3NC7O8ue6TeE/L5Xp2pUmLCrnOm2LU\nyPoG2E/GYyFSbCEgVsiDp3I+ZLSx/cEIn8/rF3cVEytqLPrGWmGeH5NnYyo8ETxtuHgk5G6RtpWx\n8eecynraxX6K6573UwPfX0lrZFq/i9w9lHVpTRzTeLnkjumh7PDtY//ZJm9EooXileb/FK0Jz+NI\nYh7dSsa1dyzk81867kEAwO4Yr31OBtvyfB8F+57+IvRGOX/mzuBru5tYWPzgqVxH1Z2Mr7UPSHb7\nFMuJYjBWMbNmb17+ewBASNa/+0Fanlti7Jv8/8tFqK1TPizVWPzj4+VfLxbylTyQprJ/jHXKyLy+\nOfxbFqAseGiA3ibRKGXh0UfSUnde0QbvO6qX0jq1bZj73q0HmE/AeSE0qdf+r2CJJTZ3Pa2LrRdS\nXt37EisxnH0KK1EcVUTr02Oncp93zmR/nHLGejz+wgwAQFoPvTjsdI5x1WOS0X9jA3/Li/seHy/v\nDIg8SNL+Z/rAeKJ1HsW5et18WuIdyXS15CJaKp9bwPwRJUXtaO1if/ma5JwkIi2WybVe+qR4uphK\nHW7qVWlIxFgMi7+wBwBQHeS8/tryhwAAaxdSru3aNG8023/i2Jp1/zpeBa+4xqeyb+S6Y1JxIbqE\n1vPZEhtuTiNhyRMRLaKlPrM9cIiHgTnbmNwoxtJtzjpWmpz4TDy9eDSMzdY/FZb4V8KMn6m81DuT\nbVgk
udDcGB/HM3ntsbAFJ4vtjU2TCj0HuO97HgrR8VXJEnmlsU+GJ6qpJjOSQTlwVd4ueYWy/sRc\nVhTbcyU9C3siIQztZM6AjqWUGa6Pf4tXy1l4u1Qrkpx3h6yDV1oXUzjvvfEWLyw7g2097WMrAQD/\nbKaM6x/iGPfP47iEd6VhoEK8a0/lPVDGXrl/kbxItskT8WreSZO0r6e8IsEIBMsTmuyYx7rmAQC2\n3yzu7TV83h+x0LFISrss46Ex88d0HWk8R5LRFPM7QzsWAgBqfskN2x0wroOvvPimGnOzH8/mhPr2\nugsAADdX8GZ5y+paAECwX+62psUQnUtBsnu2bC597IuOIyRxVQb7LWcLNyh7myzcVNxcnfFlO4vv\nEpfvj3LRxNukbFAd29pxYhbSmihULj+fZQIf/DXb3TOH7et5P99bch0PYMFH10xqE94srggE4655\nXy+TrpmbqKLn2K7WZbz5LB+ahkA/N8WeY3mDmbuFm2XTaRTCQxXsg8pn2J9pj66TH0vy2Lvu6Cbu\n48ZY8AxL+QSiTADZ8C5uHtkBbpA5uySpaIDXHhxwMZwtLoIiKzNbuJ5HcniIiFXyOzCN/eHfuJs/\nL+5/ozdeU9wfJulOF8er7K9yyJlbCwDwf4CvPxnh9cdfZh8FZT/sONpG5xLOh2OP5LredT1vxocv\nZH/F1jBZUe3tsuFsNxt28vFKj5lEcXIQqMjhDcLfTqDybGsJbxKfu1USEHYZ12Ygq4mysvFo3lBl\nd/C13Mvp2rp1Oj876wY5bGxi+5N1eHw97Om8aVwXqQUAXCxJVj9QsxoAUBrkHPH/xsWPr2FZtLyd\nlP3+7VLysteUOuR3WpLEM9Xa7CWBk8PvSCbX8eeffT8A4GcnsBxiSR5vere0sW++Wvkw/HKjaYtT\nb7af39G4g+WiS+fz+YJ+jn+8oVF+NAX2u4TEwDV/oJxqOZYyfd3SWgDA+wt4wCy7mC7sBQH2wxU5\nu5B16fMAgG4pDfvHHioh7l1/OgAgvdiUx5aQgb0sH5oS7QdGEyS2U27lbOf11p9nkmLSWrDmoQUA\ngGypatezPOadBc85jXv4ql/RDbqbR0JEL+E8yWziXCj8+0sARt2+UxF/EdsftbkGbuui7FvdylC+\nxcVMStf2mQiybqsFAOTspaw3iUc9A4E73nU+1cK6jNIk2MJBnVvJeR1MKD1rO2JE/CznROxP5cjZ\nyrNfvJDzI7iBiheIe75lSVLxMu6VjoS2ObbsNZEU6AsT0idrIF5K44kp+dnQz3NK7ga2qX8p13jU\nDqF1GdsV5304qneLEiBgFCYmLDiJyQRfA9NmY8iLFtUCAFZEea9T4OeZLE20gy29lF9n1G5H7Qwm\nnf9bPddG/8vspwIJfwqWcT0gi3PE2bxdfjRFZJ7gtFKBDDEG9sR4vZlpHKuOJo6xNcL5X3xqEwZu\n5z7WNYfjWsmjEuIL5X6mSZLnStj+VBlKNbRBURRFURRFURRFUZQJk7oeCcbdV9xA4nnUPPr300r1\n8maqnXO6qHnrXSqli9rTEJ5JjWzPED/TeyGbGWql3iRUT2tvbFhc4iTZhZWM8kevgXHNSdtHbWr2\nU7TOdvTVAgCqu6hdHSyldqpspYuueeLvfzI1UUfPpiXrmRFq9Nty2AdZe6jlNCXnbNFipxSmPJBY\nUxyZrf7raXUs/mwDAODBujsAAF9rOhOzlrIs6DEZVNU9dCEtsWkrOcZ5YWp1AwPiIjSVScUmgim9\nJ1ZDO419cOdOWpqi3ccCALLO53hdUbcWAHD9khORXURt9Dfn3cq/a1kuyPJR0x9K53zK+Ju4tXol\nwyarMa/DmAQ3xhptrCmmJFSWJJXb92gtAGBrDZOrlXfwotMG2BZ/1IEt5aM6F0syG3GFzmzheyNl\nXO/Z23vG/b4plfaqCUinCFPm1JJEif5ByqEPvXglAKBYLLKYS21zTwXbU7A6gP5TqcG/etoDAIA/\nfp7r/46nOF9qT6FlfuQ59mtaK7XdKZWITCxmzjDHK7iF1tPCb9CyuDvEMKWqfbQ+mWS5TiTqhcbk\n5VLO9dRzDvSvoneOW8L13XwC2z2tk1bKeOPBcb+dKlgx9sFPnjoPADDn7OsBADsjtLaclbUFAPC9\ng+9G+8mcJ70zKfuLy+gamfuiaRu/y26V8qZTmVhqIiSUXyvYyDmZ2cL2/PAFelzEM3jdw5V8/+dv\n+iyCg2xb5ye5NhYUU/674qnUdSHXVP5a2TymoqzdBPE8McQt1VfCfa5gG5//06bjAAAzj2abjLfF\nPS3cC8qCvdg/Qmvcx3J3AgDOy94IAFj16VoAwOZ/ci7U3UKLv5EtqeKV4nmBmeSPkuT2R61nAGDY\nEgDYIfE+GuEcmPGtCNqX8xwQl7XevoxzIdzIx7PezT55aROTVhZsZl/4toj8GEpIuptMZE2aEnYD\nv6Ln6JqPi+v3E+yHoz5BD5StGaWo+DQ9U1dvoTUyo5EhL7W3SrJJCRO0U6jk6Vi8c7aEt2xupLX1\n8XLK5rogLbbnVrHkc2Ua23HfpxehuZ9rpX+QayW0mue83L18nL2V893ZJ14aIwlhmykk743XgL+J\n1xy6n94nu+fxbJ47It7W4oI4UhzHCKcDwpJ8vms5w3ny17DPfPn0bog3tUz25b85zN4j6z93M8f2\n43cxqeh738XE0re/wESbvijXxxOrlsKmCEO0nvLwiOO5njelcZ27pZSJ9b8Sr5QM7iOpst49z0sT\nctzPdqy7keEK8dO5bqvvY5v3s9I7Bv5e7nnb9i+QfW8ZZWDhCxznwXk8H4Tl/i4gbffmwSTNe/VI\nUBRFURRFURRFURRlwqSuR4IpcSRxVIF2Wp+mSRmQIbHCj2SLVVEsEHVLDmDPalrusw7wtZpd1EY2\nXGLKpvE75l5Aa33HFmqyQvubJqs1/xLGSlv8F9E2moRcJdS8BQaouWw6MRM+UbzaLzJXwAtB/sVs\nauPSX2Iczq7/osZq5rW08GJtCnokuOOt1PmP0sLQfxItC61/qQUArPoG++FnlY/j+Bc/DgB4KJuW\nSfdeKf9GoyYONvJxcT3nQMFaiSlLGY8EabNY5wpvYh6DkieoadzxaY5XJIvjt6aH2uvvnnAPfvHL\nSwEAXxu4mN8xxDbmlEjZJElWufWz1FbP+5HEDe/bP1mtmRiWNabMIsfBaaa1xRen6r32z9RYx6uo\ngY2WUNMaC1MX2niuD6EWjmXtg7R0OEF+Z88MyoqSteKJVMA1kNbJ73Dt1LBOeLkC5K9/Py1L9V/g\ndbafRcuTv4LtyhNngvi53Sj/Iy00n7v2kwAAJ8B+8Z/P90Z/XyHfIbHks2jxKr5h9bjfTAnEamys\nc9aOBv6V5Ji2V+ZJylb5/d71Zz/MfDc5GyUB7VcZczz7/2iZdsKcCw1XcB1V/5zWYCeaInPAtGkL\nczjM/T7X6lW9tNQE+9n2+6fTKv21Ex/EiiD3r3gW13d/lUklK
7lSXuJa8pezT+IHDk5mE940Jo7X\n5MhI38QxDSUkUDMJulBcAHRzEVR9hvJw+7n0VPzAp54DADz+0+MBAFaE1j5Tcsvu7Jq8hrxBTIk2\n3wucu9n1HM/sjZyT37qSuSLKn2N/DJVQrn/jgvcg0sIEdL/uP5ffVc15/sH5jCPe5Ode2XYs10Hx\nXsp612diqZM87005ygitciUrOZ57tnAc7SDHvsLPg016G9vXu6gIxSvpqbnvMe7ps4q5z+3+Mtf4\ntkeZjPXMC5gb4blSys/qD6RGDqxxGK8c8UrLfopJsf0j9MLy5fOaf7ziHADAL0+6GY7YAT9V/hQA\n4OMrPwwA2H8R9/XqO2S9iLdHqlhkDcYya+1qAADM+TY9EX66nPN92icpA3ODnBthH+XDV6ofxoef\n5jnP5MmQPOToE9mX3i2JltOYnNKXmIAvhfCSXcvYlzzF+5BAlOPYM5MyPzOD7f/5MTfipg56K62r\noIwf7uIa6DpGLNTPcJ175f6Svc5fBSPz3QZ6TM78ZgMA4OU/07tmzgjveQbm8xx48JJhhDfyPJS7\nSrzTuygvr3gXvXXuvflEAEDD+XJP2ELZlzLz38g8SQJrPGIrHmBf9O/nPt0zk3J++hz2TXNZDkpu\nYNvnXCO5bsy5TbzqM9cyB5BdzsdOHs+6PskR4g6rR4KiKIqiKIqiKIqiKEkmdT0SJGbMsxa2UDPl\n1FPzZkq6RUqoC1lcTQ3c2l01KNsocTc7JIY2SK1ceDctGWdcsp7fIRlBI1v4vGN+M1XiKBM0Vyae\n35T48Iu20ZdNrVO41UVGJ695sIyvdc8TTXcXvRjOvoKxR/c+wthpaxu9MlIkWvYVMVmWfUFq7nJW\ncaxzJEPtD99Ha8zQPaUo2k8N3UAJvTTSB9iHgYtYIi2+XjS26yUzforEir4axkJpN9GqWH8NLTZu\nDS3M286n1eW7hXXIyOX89R+gpja3gd8ROJ99Miufa2jb/fRSsQ82T/LVTwwrLW20BKB5TmLeXVO6\nS6yR/n38m9XG1ztOkhh4v400carpnCdBdBIKHpcKcK4pN5XD30qT+ZQyseIJ2NJ2v8SQFj9BbXNh\nAa0tnUdyHMPXZyFzo5Q1zOA690lcfPkLfNy5gN8Rr+RaKvutVD5JkVK3AEZj9wUz90flsWSiTjPj\nK8/7/V6spWuLVTvCdT37//FvLJ+TINjHx1WPcR35JB7XMdnsUwQTwx5voyVh+jc4XoEyXm/nabUA\ngOcWzkTBk2xbPCyVjaRkaV+t5AoZpjdT5orUqdTxipg48QTLkStGRFMa0+6RErH9/aPxppncA7Oa\nuP89cAOtUuUb6Hlg8i/Y3SnieWdZY8r1ylOmFOLuBj6WeT7jGo69KXEXljYXrSmB1cn39i2vBQC0\nOrRY3R5aAgCQ1AkIDor115OzKbrvbeIFBzMlJ5Z4KnjxxIWUeXn/7IAbG1+u2zfMtV96Fz0Pui7n\nGl+QSdm4/+vcK5wUlffAqGeYKdMbepQlTsM1vPbB97JfaoNdaIjR0moqu3x+8ZMAgNuLWNWm0eJn\nqv4opS5SxSJrMPmgjBemyLqCx2mZH1rPs36DeNNsv5Ry7Oj6Pcgp4Bk4nC75YXZRLkaOkcoGRZSJ\n1Y9LfgEz5ilSkQ3A6H5nypyLZ5IVkvPbVo5bfzW9qL44+3EAwHGhYdwkX+EXj4z+0wfks+J9skGq\nE3Xw3OvGUywvjsHI/MHBcY/trdyrAhVSeWWP5M1ZV4CylXxvtIQyYaCG+9xNq3hPU9Ajss4e77mY\nMrmBTMlSke+mWpfxtMtaSZmXtZmPd5VxHZettJG5gd4JkLOrm0OPNCfM74rU0QNnoIJ9UrKS3+1O\n8jlPPRIURVEURVEURVEURZkwKeuRYLQ1JsbHWBz8UWoU4xIbnd1IC0T31dTEzG3s9rLAWr3U0sVn\nUauVfhy1c37wO+7ZtojfcTa1vKWSgyDpWkuJX/TimySGCvLYH6YmzsRUGS1+wcsOfNMZM981h212\nRWP53uWMmTwjZxMA4MmGYyezBYcX6Q9jVXKNFlOsFZnnsH8ysdfL0GqftRAAcPBCvnZqAbOWrpsv\nc2MoNSpzHIK0zVinkKBJdCXGz9rLOOeaa2lJdWNx+KV+7px7aIF5cDczGcd7qLXsyKDGu/RBiRdM\ncsUKM7+tQGDUylwhNYDFoo4WWiksmfNe1vGwVCzYIJmpg3kYlHQfaZI7ILJUPHe2UrM7WMnPpPdI\nLoL04PjfSjZm7E0bjSbd5AaYnj/u7YW3M0M7XBeQyiZuPsfatM3ISVNSvOBZ9mPvUmq5sx+U+trJ\nzBFg2i3Zq52lzN/g37ibr5vs9uav9Idn0RoZ8eaSwZFqFFaheCY1S+14+Wy0llaujJYkZzQX2Wbi\nJL08GcYTz8S2Sh/FW+iZlHcHr7v9TgsFR3Cfa/m6VPEZkv2hlfN91oXMaj14dopaoU31FLE6G+8j\ns++ZeWHmwdj6675wWN4j1nypcjO0nH0S2cl1ERZrNTo6JqsVE8PM9THz1VdYMO4trngcmv3NJ54I\nZu4ajw1fiwWnkvIyHuL3FizmGSY7nWPd8C6pXLOC/ZQvOSIs+W7XnHWm0kpnWd7veZ4Gcr4zeVE8\nq50ZV/HQcRsavddNpRZvH5F+ajuSn4318vWbG48GAPgX0sKdJR4fKUGCF5b3tHjPmTaig541066m\nJ+JXcSX6ZlG2nfOtZwAALcMc27pczvG0S3jm2X+drJ9UyYuRsAa8tWDWt5GBkucnt4Frd+T39Cb9\nr7J/R/88vvd9Jz0NANh0Dvtnxcv00Bwp5utp+3jmRwH3TiM/k0rCfmcq01nm7FpOebD/DI6vyf9w\n9SOXAACuKR9EXPK82cPsu+8uvwcAcFfrUQCA5iX0yimQqhVWQv6lpGPuceScm7inm2pyxuvIauCZ\nturmbgzP5dllJFNyKaVRllTUcKz7SykH0u5nfiErYb4lrQ/kOgLVleOediUHnHfeM9c3xPu7uqvp\nOey6LiAeGm6PHHLlHOykSX/K/ULpPykDmk/nmqlolflvKjcdZnmvHgmKoiiKoiiKoiiKokyYlPVI\nMNpJS+JG3CxqrE2Ma1oHNdfxXImli4oWp6ffq0nsirYr7SA1Pt1t1MpszKRGaFEV400O9DGzsRc/\nmOwYmgR8WbQyWiZ7tVjmjQYe9qhF1YpSq5XRzjYMF1ALdn4eMxc3xdgH6b1i1UnhHAFmPDxrdYLV\n8ZU/JBluD9Bqk7uaFqllS2mV641RgxcZSM2p73ngZEvsU//A+NfFcun1xXTOZct1MTiNbb17hWg8\nc6jNXVrXAABY909mwp05sp3fneQMxp5m2PLBV8Iss/0zaVXpq+b45O6V+drNeR1olRjnOOVDrIT9\nNJJroXATn+udzr4JbpSYs4Oc61FZC2ZtWIOSyd+snxTJjWKsrDCyT8bJ38W5YGIpUU5rpNvcBrtb\n4uyMBl/WQXQJs37H
siVWVOqwh9tk/Sdb1lnWqFVS2tt8NNsfWMDKK6X30DPBaO69j4pFx7XtUQu+\nsWSaN5nqPz2MNx2ZyUzY/VWUKRnbDmdj3jie55GP123G3lid3QQrvGe9l76ysjLRtphrIP0e+dIL\nORc+d/Y/AAD/89z5AIC5IfHwSJVY6QRLka+YMmCkllYU/4BUsJCqHd5eNdaaZfpF5GJgSHKohPne\n9iXsm5p14+Vo0hjjTeHPo8UM2RzLkWliNco0lmP+Ca9gHiPPQ8GMfW4OEGEfDRVz/sSfpUxwTqBn\nQmWReGwdEG8Pib0/ZF5NJa4LKygep+YMU0QrrN94VnXS48a02cuPYtZ8POatEW/+yNfHc/nemipa\n5SoyKTdermUcfZaVgvazV7kmL09ONvvF18c2x4tz4Pq5fp748gkAgIYL+R23nPU7AMC/3fVpAMDs\nHKnKJN+VbMZ6IgKjHjneuVZyX1nSVjn9obue66VncQy+Ib7nlt20wJuKVMEePp+/RT4knkiO7I8p\ncbY3Yy2eCH7xlrDL6DETKRfvoR1co8bbaHCInyua14fGVlkvnVwPP9jAPGEj3TzfzjgouYJEVtjG\neznZJHjgeGvXeN6YfBFyvcYTHWHe5yEWR/oeqV4XZR90HMnXPlHLSj27olznj9pcF67kWEt2dS7v\nPkbOrk0X0ou+4gF5g3hjueKJYLw0fJIHAYNDXh4Rsxf6TN4v8VYdKOOaalnGvTRkHPDM/cIkzf8U\nlKiKoiiKoiiKoiiKoqQqqWeWTazWINaHWCW1kYNl1GAb62K6ZOjM3SYZvssLPS+FaDE1QPFPUC1T\n4+ff+XmMOdnRR+29LXGVRkOY7Oyexkrll3iYkUpqmwI91Kw5WWzXSB77ImM/42WGy7LQNZfPDZXx\n2v/6vt8AAHbH2NY/NFJLl7tJsnlOYjsOF+48xnv5IpKlWTTxvv2Mdxub5duqoDby4EnU4D/0+Z8A\nAO4dmAsAeGktvU9mR7ZO9mW/MRLj4ytoleudXz/ubS2ncV3kbaAm2hYlZ97uOPorOW8KaznPb114\nAwDgopc+AQCY+WepfGLix2W+JzsliJcDBKNj218nsXISAzicR411ltQVbz5OciRsF619BtA/TSwZ\nMqnjmeNnd1DCb+100Z+KFtxXQNlisue6QHLiSI01XeKYh2ZxDvhiUg98hH97xKpe8qjkx3DdQ3IE\n7CUy6SsAABa0SURBVP8IvU8CsjQskWWDVfyOcAt/K0v6wE6WZ9IYGWssVCOSCqKvhGPQtbgWADDv\nx4z5dcRaaSxYvrSgF1fvfZdYqK1uscCK1bftPylDR17iHlEcSkMy8eI3qxj77IZkXWdTjge3Mibc\n80KRbM3xOcyF0z07AwVb2abWY9imxaWMi32si7kmqu+TnDKy7lPCKgccYoUdXMB5PTCN86B7Hvug\ncP0RAICsZsqJ9FYu5EhlFuwQv6NjAed/xUn0Mhzo4Jq2izkvOs6pAwDkmzxIycZ1RwVvOudg83JZ\ni4tpOc54mnuYHZoDAOh4P+dA3t3iwZBtwRF52beQ55+KaYwVzw/RqrX3EdZYr+gQ7yvZK/3FtH7a\n7RJD7jpTOi88C6TIrcFZtC4O53A8Q11SaWQTz2qOiSM2lspw2PPiMVnu28/j3h5qZp8cyOccOKaI\nclKWSUphPCyMZ4bPxIabnB6+8ecCu4ivD9SEkbuVfWJLxvaaOj42Zz1/VNZ9KlnjMXq2N20aWEzZ\nZzLN+6Tp3fN5vXameKNILqNZdc3YuYVel9EIv2PhNMq8DWncO8OtkhMg0RKfKpn7Mbrf2dO4zw9W\n0fo+WGrWwPjKdD5x4OwcDCN9m/HE5nMDBewHfz/7sE8qGaTvF4u+EXuJ7R+Tr2Qq8XKASF6vWBnH\nLR6WPgnx+g+cJn3Rzr/RUgdVj3L+dBzBtXPeKcz/luenbOuJsc3tx7HDCm8VT5ck50gwnpeG9G7O\n567lnP+5O3hW8XdS/g/M571MRgvb5QT98Innvb+da3rLt/lZf5/sfwt5RooMsA/sFezXyfY8V48E\nRVEURVEURVEURVEmTOp5JJiYVhNDJ/FS/j6qKe1qap/7ZvJ9aT3UhXQtoKa2ZK2Drtmi2VxKrc23\n6lh/9bIsamyb49T8nLzzUwCA8nZquEzt3mRrK/0l1FAa7bMrVtihWj42mdgtm9cZn03Ne/NxPjjF\n1DxVV9DSMOiKplICLu3fUMvlblvHv6mSxXUMXuxcnmSYbqaVxSniY6OBN5PXL7G1ez9cg8wm9smn\nPnYvAODJoVoAwLmZ9ED4Rcnp/K7BFIkZE0xsdM8FrDbRW8cxHplHbWR4LV/P3MXnMzo4np0LqNlt\nXRaF00BL1Q9mPQIAuPgXXwEADFZLpZNCttnaKfM7RbI3w/LBFatJJJ8aVlOtoo+GRIQ6xPK4RDw2\n/GzTgXq2oWZaK1pW8LPxmRJjJs3sncU5Pizxg/N+SE8WV2SL3UZ1vWfVT1K/GCuhqSKR3i7xsGKd\nHirn37w94plUzHXvs20vu3HLZfREGFrEPrj7RMbLdtmcP5+64ZMARq07dkLegSnH5x+TpZ/jNJIn\nFqhMPp5XQ6vknp+xf6p/IPHCTeJh09MLn9SdNxZOp5bW7Y4llJlDZRJLKBbZyjXsL3tP42S0amL4\n/F58cNtyyvzgECdt9t7xdbV9RYXjPjogNbS/8NXbsHqAVueMAcrBlftqAQCxHs73uTsoP+1Uk/Wy\nzqwMyq2mkynRg/X0nLiyfi0fv4vvu+7FkwEA6QfosnLJe57Fg/tYmSbaxjnx7jJWJWrIY3+98Dgz\n9hfcIvtdClgiAQCW5e1BsVLGwGZI3pLevXzcV8910Hss52qGj49P/8rzAIBTs7fg4098FABw4nzm\nUTgtn0k/FqbTM+Pn558JAOi+0VRz4byxTfbuJGFLPHCghqV2AkOS80Yyse+7iO8LHUPPm2nPyN4l\n3lmB3c1wxGtvqJLzp/IjzIP0wfIVAIB/dHMvfaSRHh15B0zm+uTmBgLgedr6S9kGk+PDq1xQwLNO\nrERyRgSkGkkp97/W84aRVkpZNjDMz7ynjGO/caiK32G2+cHByWnDm8RYZuPHcK8yXii+83lm7dpF\nmehkinyI8PULjqc8uDT/RdxXcCQAIFtM8jc8Q9kgSwQ99eyntMcScqOkwvo3+T7Ee7Z7NudvNJ/t\nHBaPvEFJdZUmW/TADM7f0rQYSs7gXE8TN4U93ZR38UI5J0lOjaLbpEqFl/8pYQ9IUn/4y+lt3Xsk\n+8B4hbcv4V87i2Pvz+H9TGAGx/nTM1fh/vlc104v2/j+Aq73IYfzKuayrfO+z0oPcZMLLMnnXeOJ\nM1LPNrdza0JGDT0QetZzzUcreFYreV68S+S+r3e2jfyX2ba5H6EHzkMVv+TjNH6m16GcPOqWLwEA\nAuwi2B2dk9CiUdQjQVEURVEURVEURVGUCZN6HgmJiHU6LvkAOs4Ua9wgNbiVi
6mZOSKPf5dfugub\nItRyn5w1PiX3fYPU2vxf43kAgNpfU48S2LITAJBk+6yHiQfsPpWa5UgRtXQDi9l2fwv7onwRtY1f\nr3sIAFDgH0BDjFapPB+11d/YTtV+xq+o5sx8npZ5J9WsU6+AJZpr44nQcSQtsNFi9sfIF9g/9aW0\nTL4481q02mxXdYAWymN+/FkAwPfEKl//7Y38ztjIpF//G8FkrDea2ZF8Xu+CSlpjN/Qxw+s5R/H6\nCyTg/5ZNVGsuLGtFpIhrYleU2t5pt0i2e8la63ncJNsTIRHX8azSWS0cv97TqUU2sY8jDuXAUfm0\nIJcEabU8Mcy1uzpSi7vTlgAAtmyiFSvYK3rSOPt09n0SZ2u8UcwcMJmCk5XV13hmmGozuVQj986W\nvzN4fZEqY0kz1WU43kWrcr01MTBT+q2a8+bZIebY+NXdlHkz/8r+MzLGSXbMqOvA8o+vRDLtcfE2\neR+vqTvKtTyjiFr1PefSAl9yMtdAY1MVSh/ld+RtpGfL9qso69PFIBMP8zur/iL7iVg+vbjJJKwJ\ny2fByae1oWAr29Jfw+vum0l54K9i3HdMvNAWfPplAMD3Su8GABT7R1AVZL/8qI+Zu2d+X+Ih9zcA\nGM38nqqYqiROkOM9v5RxnrNCnMMlfl7/gpOZfb5e2jsrmIlTs5me/c6uYwAA//c4LfCF9XxPyW2b\nASQxB8ir4bqeZTx9w14AQPf5rLCSeYBjPbiEcjs3m/Lqq3PoaVYRZI6QTGsE3zzxfgDw5sDzg/yO\nO1qYzd75tNSib2vg7yY5c7nBxEgbGs+kBdmpYJt/d9zNAICXozzL3bt8EQDgyCLOgf54COtauKaH\nR9g/V5WtAQD4LT6/vYfx12U/EQ/GHdwPU6EHvPxEpqpAOa3w/bVc/13zxDtuAef+L5fcCgA4M8w5\n0xgftbQ/OcQcUv+z9t0AgOwXKC9n3kWrdaqe9Ix3ydCl3Is+MX0VAGBjMcf8A0W0NIcstnlGkHMj\n6rq4Io/vfWSAlX1CreyvUvE0y3iBcsE2eUhSwRPBYHIhDfCMHgtzLvTO5UgFJN590fE82yzP5zhe\nkcOzX79rIeryPT9vOQMAMBSVvGkR7oOzbkjwuE2R/G8Gk9+o6SQ+tgr4eGEVvQjy09k3/1X2KABg\nfhrn9IATxSXSDy9EeP5/ZpAeRzsGee7d9iPOiXDTGvmxJCcBE0zuu2A353FggPc3s4roHfb9j/4B\nAHBLz1IAwNLTKa8qAzzT5PlGcPDd9Cy7Q/a7a1s5/ucXsCrfF1ddCQCY/f8kn9R+9udkj7Z6JCiK\noiiKoiiKoiiKMmFS3iPBWM7SNlFjlfkSLWzBU5mZvrGDlvYPVlJ7+ZUX34sfHMOC2v+9/WIAQEAC\npwK/ZhxReI9kZz8gGmqTETtFLLUmR0DBeloeWo9nG90otZAXnE5tbLrER1WJxuqm7mU4JZseB5+7\niZn6s/dRFxV6mhorJ9UsM6+AyeZrsg23n0tr/PmfeQYAcCAqY17MWNHj0zm+EReoC1Jjt+gnzH8x\n7W5aYOMHJftziozxWKxAABCrXPsyXl9WA+dAf4xay0uWrgYArOmktX1fs8RMi6pxx6N1mHEGLVv/\nPIaeG5YlVudoCqarBka14j6/l1052Mt+CK+iZeal/loAQKiJmvYc8Ug6opiWqevaTwEAdI2EkSMW\ni8K11I+GJfdJ1lqJg5cs3yYvgOeBYOZEQo3jKSPBOmANieWlQPJhSOx0PEvEdS2t10dUUttceOQg\nflLxJADg2BVXAQC2HqR2vvsXXDszV7MP7HapQzwy8oq/nQwcmft+GY+sDWxXeC7nenshY/1b0nit\n51/MLM21UiQ5uyqKqhNokf3ihssAABWZfOw8Q6tkqJOyMmMNrTuOeOe4yfRM8vthtdCbauA07msS\n3onvfO9PAICDMakxLjr/4zO4Z/3gIL0PVr40C8uP3M4PXcR54UTY9snO1HzYEIvRrGsY57/2G7Sq\ng4YZnJi/CwDwbznc2/L99NbosAcx6DDGPOZwb5x5K/dyay0z9dupEA//esi8r7+R1ucdH+UeFtzH\neX/p+S8AAGqDnO8+a9TCNieda+Vjaz4MAAg9Ry+myvvpyeXsF6+0VPJAtCxP9saL6TGRw2WJgRjb\nfF31KQCAs4uZ8+KGuX8DANQEeDZ4KpKFK0ue5XubTwUA3N9Br4Wtd9BCWbRB4qu3c1+0uyXYPAVk\nnle5QHK7mGpFfbWcx8eeQ8+jS4q47w+5PAd8p51r468rjscvT78RAPDrn78XAFDSy3bl3M1cAvGR\n1PK6BMC9XuZi6zLxurI4TtuHuGeZc21jjF4aJhv/+ij3g02D07Aki+v76SvoeVPrcG04u2Xdp9D+\ndghy3nAlT5ETHH/uEGcD7Otl++dk063u2o4TAQBh/wi+XrQeAPDs88wRk7eV31H6CM9FZn83nscp\ns9+b3zcy/6/cs3J+IZ63B5kY4v1z6E2Q7WNfDbuU41HXRoGP56DtUeZBunEzN4r6z1PmZfatl59I\nrbO+LVWT/M1cyzOv4/3NwEPM7fXVH/J+9bIyrvl9I9zb0iy24z+2XIDfzbsJAPDwk/REzuWWiabH\neT8wO8I+MHm/pmq8ralMPnSG79I3/2MJ5fF8OdyATFKaniV06ffFXYSbeBDvmitJK1Yy2RTaJGmf\nuZEwh4x/oQ8ec26f0N3HG2q7tNUvCVO8TaeaE26kTFyea6WcyHt5cIw+U4SMdimZI1XNSm4SV35T\nIvEwjvdE2w68ubEPSEIWN4cHK+OeZWfzsBGXMpjp36IQ6hjKRM5P2TfRQt54Zt656o3+7IQ5nGMf\nKJMkmOLq2H4hD0M5jXxsSv9ZEgIQ2MFkWiYkwmlp8xK3eaEMnV0Tubw3xaSNvUlCJe0ypRCN23vP\nUewnk6ApbYBfnbexE1aEB5JoHQVwegtvFp1dDQAANyaH6cOwwUzmuvdlZclD+QkpleaWU8a1LKei\n6MSPcsN5vGE2ok3srzOOFbe/2xjmUX2n3FC0yUHrMMiBwz72RrZLKTRTGs4cfixTEq2Mm+W2T3KN\nZ5VxfAc6MlH8HA8XRR/hQXJXK/tq5mcpG1xx7zf73b9yk304x94kFTZtbriBh+Xz63gDdcfTxwIA\njjqGbq7rVlHhYNm8hPrrW8eX8gJg79wzkct7U0zmvHePZZnHQB/ll9XDMYvM57538GTOj4qlvHk+\nsK4C9X8WxVgax9/ZKOGMk3Cumez9zi/JhUeWMMNs2ktUAvSfxr2gdzpl4IgcfcpWxbwkhZ3zOH/K\nb5HQRTPfD6MC4bCOvch5rwyc7F1G5jlFVKB1L6as61zI5798AZMo//DZd6PkWSkT+m7ejGSsoNyc\ndp/IvGZJqmuSuf4LSsNJG3tz1qtneIJdQDneN4NnV5NMOTDEv5Wn8kYx8tsK7wY0EJWSvg9vAHB4\n9znDYZV5YigzZ/nYUs7vvhqeZweq
pE+O4o3W8FYpYSdK1rrb+zA0jf2U0SzlQDdxrTjiMp+MtgP/\n2jnf2+dy2LbOI7kmBqZJouAjOM8zn8+Eb0SMC5miQFglyUu3c36YUr+esSTV7nFk/cfexXNKepso\ngAc4nq2n8ew/VMGfPuM8nnXuX70EeZs5f0yJ+5nXibGwie78qTrvD2FMsnH+MecgSTYvMrH7BJ4J\n8l9sBuKSlLaa5xv/ep4Lkn1/p6ENiqIoiqIoiqIoiqJMmJQPbfBIsCTZHbREWOJdkC2JdMZqZArp\nEZgSyXXeCMYyZ/dRq2i0d9hK7VNgp1jf1tAtzvmL0UbtHP0S6YfUSDPyxvCLt0m8hRaFgFjr4nv3\njXufmbz209SHFeVkwe5hH2ROwXUeTry2imdC4Y10T8QicfNdPz5xqG00zWPKujgHmyb3IqcAo5V1\nImKVNFZp8VDIe47Pj8ykW1twC+eEcVUHgECjWKRMQqvDoJWfCoyFxoR5WPlSB8r0xR56oZTvp+Z9\n9730vKiNNnnu0fsG6GVVCbr/x50UKfX5WhjZLh5i9sD4a/WJdQXyd+7VYrmREnLuSMxLZOTcys9O\nj3E9OfJ8qrr5Gyups5il0Go+xH3sjh/QE2H2NXwsjjeYZYmrukmgmJExan1OkUR6bxRT+tZZSW8a\nr0ShjFmwiV4lM56X98l6mOHse8vt7a+EfyaThzr7KLeCXVK+VizLWQ/R0pw1qxYAYB1kci53cMiz\ntpc8I2vHfGmKyzrPvdsVLxrjimvK1IkHVd52nn0K/kGr7d3frQUAzIqs8b4q72/jPXLiqd72MRiZ\nb7yI/EW0NOauYVhr3j1SBj0sIRDXcm7kBHo9K6RZ9ylT2vR1MF4yxvIaeJ7eV0XtXAfFj9ATAX/m\n67YkEvbJvudGoghL2Ir5rrfiOddgSqFaIu/MPU2hhL0FT6EXWs69kkDRHYLVJ2FsEv5r5kBKh3SM\nwTdPQtQf4znXkpL3jnjRFv+JstBfxtDEXddy/s+OrD/EkzyFgrbeGF6Yhwl1kYdmTst5PvvvUsZy\nzEet/TwLpsq8V48ERVEURVEURVEURVEmzFvHIyERY8VKsTJ+h4ND2pRgTTQaKzfFy3q9WTxPDCG+\np+G1PyAaPZNE762M8UzwWLMpOReSJEatqqKlFTWsTxIlOiMcY5/0k/0amvcUqfozYRKt5nZ7e5Ku\nJEkkaOgNznBCCa9XSB76avuAO/zWsFm7q5lczczmmV9cCQBwxGr3qrHuI7HU9jaZAM7g4PjHr5Ic\nNvF9bxfsXXvHPXY3MM+B54loxnfjeK+0twWJ8jth7ZuEafYYz7sJf9dbgNeT+d66NzHQbyMS17m9\nmUljrUSZd0ASyE3dpU0NiRZps1fJGcgZoJdl5l30LnTEU9m17UMPN2+xue9sSvCwbaWX1ajM434e\nF8u7ktqoR4KiKIqiKIqiKIqiKBPmreuRoCjK249X0aw7b0OLjDJB3mLWlsPJ62bdf4t7IyivgY6t\n8g4kpUqVJoFXa//b0fv6EFTmvSVRjwRFURRFURRFURRFUSaM9VbJ9KooiqIoiqIoiqIoSvJRjwRF\nURRFURRFURRFUSaMKhIURVEURVEURVEURZkwqkhQFEVRFEVRFEVRFGXCqCJBURRFURRFURRFUZQJ\no4oERVEURVEURVEURVEmjCoSFEVRFEVRFEVRFEWZMKpIUBRFURRFURRFURRlwqgiQVEURVEURVEU\nRVGUCaOKBEVRFEVRFEVRFEVRJowqEhRFURRFURRFURRFmTCqSFAURVEURVEURVEUZcKoIkFRFEVR\nFEVRFEVRlAmjigRFURRFURRFURRFUSaMKhIURVEURVEURVEURZkwqkhQFEVRFEVRFEVRFGXCqCJB\nURRFURRFURRFUZQJo4oERVEURVEURVEURVEmjCoSFEVRFEVRFEVRFEWZMKpIUBRFURRFURRFURRl\nwqgiQVEURVEURVEURVGUCaOKBEVRFEVRFEVRFEVRJowqEhRFURRFURRFURRFmTCqSFAURVEURVEU\nRVEUZcL8f0TShDc/AQ7yAAAAAElFTkSuQmCC\n",
295 | "text/plain": [
296 | ""
297 | ]
298 | },
299 | "metadata": {},
300 | "output_type": "display_data"
301 | }
302 | ],
303 | "source": [
304 | "f, axarr = plt.subplots(1, 16, figsize=(18, 12))\n",
305 | "\n",
306 | "samples = x_mu.data.view(-1, 28, 28).numpy()\n",
307 | "\n",
308 | "for i, ax in enumerate(axarr.flat):\n",
309 | " ax.imshow(samples[i])\n",
310 | " ax.axis(\"off\")"
311 | ]
312 | },
313 | {
314 | "cell_type": "code",
315 | "execution_count": null,
316 | "metadata": {},
317 | "outputs": [],
318 | "source": []
319 | }
320 | ],
321 | "metadata": {
322 | "kernelspec": {
323 | "display_name": "Python 3",
324 | "language": "python",
325 | "name": "python3"
326 | },
327 | "language_info": {
328 | "codemirror_mode": {
329 | "name": "ipython",
330 | "version": 3
331 | },
332 | "file_extension": ".py",
333 | "mimetype": "text/x-python",
334 | "name": "python",
335 | "nbconvert_exporter": "python",
336 | "pygments_lexer": "ipython3",
337 | "version": "3.6.0"
338 | }
339 | },
340 | "nbformat": 4,
341 | "nbformat_minor": 2
342 | }
343 |
--------------------------------------------------------------------------------
/examples/notebooks/Ladder Variational Autoencoder.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 2,
6 | "metadata": {
7 | "code_folding": [
8 | 0
9 | ]
10 | },
11 | "outputs": [],
12 | "source": [
13 | "# Imports\n",
14 | "import torch\n",
15 | "cuda = torch.cuda.is_available()\n",
16 | "import numpy as np\n",
17 | "import matplotlib.pyplot as plt\n",
18 | "%matplotlib inline\n",
19 | "import sys\n",
20 | "sys.path.append(\"../../semi-supervised\")"
21 | ]
22 | },
23 | {
24 | "cell_type": "markdown",
25 | "metadata": {},
26 | "source": [
27 | "# Ladder Variational Autoencoder\n",
28 | "\n",
29 | "The ladder variational autoencoder (LVAE) [[Sønderby 2016a]](https://arxiv.org/abs/1602.02282) adds several stochastic layers to the VAE and performs both bottom-up and top-down merging of information to provide a better estimate of the log likelihood of data. The model is non-trivial, as evidenced by the diagram below.\n",
30 | "\n",
31 | "
\n",
32 | "\n",
33 | "Where left is the inference model and right is the generative model. The number of stochastic layers is not predetermined and can be chosen to be any number as long as the two parts agree."
34 | ]
35 | },
36 | {
37 | "cell_type": "code",
38 | "execution_count": 3,
39 | "metadata": {},
40 | "outputs": [
41 | {
42 | "data": {
43 | "text/plain": [
44 | "LadderVariationalAutoencoder(\n",
45 | " (encoder): ModuleList(\n",
46 | " (0): LadderEncoder(\n",
47 | " (linear): Linear(in_features=784, out_features=256)\n",
48 | " (batchnorm): BatchNorm1d(256, eps=1e-05, momentum=0.1, affine=True)\n",
49 | " (sample): GaussianSample(\n",
50 | " (mu): Linear(in_features=256, out_features=32)\n",
51 | " (log_var): Linear(in_features=256, out_features=32)\n",
52 | " )\n",
53 | " )\n",
54 | " (1): LadderEncoder(\n",
55 | " (linear): Linear(in_features=256, out_features=128)\n",
56 | " (batchnorm): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True)\n",
57 | " (sample): GaussianSample(\n",
58 | " (mu): Linear(in_features=128, out_features=16)\n",
59 | " (log_var): Linear(in_features=128, out_features=16)\n",
60 | " )\n",
61 | " )\n",
62 | " (2): LadderEncoder(\n",
63 | " (linear): Linear(in_features=128, out_features=64)\n",
64 | " (batchnorm): BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True)\n",
65 | " (sample): GaussianSample(\n",
66 | " (mu): Linear(in_features=64, out_features=8)\n",
67 | " (log_var): Linear(in_features=64, out_features=8)\n",
68 | " )\n",
69 | " )\n",
70 | " )\n",
71 | " (decoder): ModuleList(\n",
72 | " (0): LadderDecoder(\n",
73 | " (linear1): Linear(in_features=8, out_features=128)\n",
74 | " (batchnorm1): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True)\n",
75 | " (merge): GaussianMerge(\n",
76 | " (mu): Linear(in_features=128, out_features=16)\n",
77 | " (log_var): Linear(in_features=128, out_features=16)\n",
78 | " )\n",
79 | " (linear2): Linear(in_features=8, out_features=128)\n",
80 | " (batchnorm2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True)\n",
81 | " (sample): GaussianSample(\n",
82 | " (mu): Linear(in_features=128, out_features=16)\n",
83 | " (log_var): Linear(in_features=128, out_features=16)\n",
84 | " )\n",
85 | " )\n",
86 | " (1): LadderDecoder(\n",
87 | " (linear1): Linear(in_features=16, out_features=256)\n",
88 | " (batchnorm1): BatchNorm1d(256, eps=1e-05, momentum=0.1, affine=True)\n",
89 | " (merge): GaussianMerge(\n",
90 | " (mu): Linear(in_features=256, out_features=32)\n",
91 | " (log_var): Linear(in_features=256, out_features=32)\n",
92 | " )\n",
93 | " (linear2): Linear(in_features=16, out_features=256)\n",
94 | " (batchnorm2): BatchNorm1d(256, eps=1e-05, momentum=0.1, affine=True)\n",
95 | " (sample): GaussianSample(\n",
96 | " (mu): Linear(in_features=256, out_features=32)\n",
97 | " (log_var): Linear(in_features=256, out_features=32)\n",
98 | " )\n",
99 | " )\n",
100 | " )\n",
101 | " (reconstruction): Decoder(\n",
102 | " (hidden): ModuleList(\n",
103 | " (0): Linear(in_features=32, out_features=256)\n",
104 | " (1): Linear(in_features=256, out_features=128)\n",
105 | " (2): Linear(in_features=128, out_features=64)\n",
106 | " )\n",
107 | " (reconstruction): Linear(in_features=64, out_features=784)\n",
108 | " (output_activation): Sigmoid()\n",
109 | " )\n",
110 | ")"
111 | ]
112 | },
113 | "execution_count": 3,
114 | "metadata": {},
115 | "output_type": "execute_result"
116 | }
117 | ],
118 | "source": [
119 | "from models import LadderVariationalAutoencoder\n",
120 | "\n",
121 | "# Bottom to top\n",
122 | "z_dim = [32, 16, 8]\n",
123 | "h_dim = [256, 128, 64]\n",
124 | "\n",
125 | "model = LadderVariationalAutoencoder([784, z_dim, h_dim])\n",
126 | "model"
127 | ]
128 | },
129 | {
130 | "cell_type": "markdown",
131 | "metadata": {},
132 | "source": [
133 | "## Training\n",
134 | "\n",
135 | "We still want to maximise the same lower bound as described in the VAE notebook, the difference here is that both the generative model and the inference model are now hierarchical.\n",
136 | "\n",
137 | "$$p_{\\theta}(z) = p_{\\theta}(z_L) \\prod_{i=1}^{L-1} p_{\\theta}(z_i|z_{i+1})$$\n",
138 | "$$q_{\\phi}(z|x) = q_{\\phi}(z_1|x) \\prod_{i=2}^{L} q_{\\phi}(z_i|z_{i+1})$$\n",
139 | "\n",
140 | "Which results in a KL-divergence between the latent distributions of their respective p and q layers. All of this is handled directly within the model.\n",
141 | "\n",
142 | "Additionally, training hierarchical deep generative models is prone to collapsing fo the stochastic units - meaning that these become inactive during training. This problem can be avoided by gradually turning on the KL-term during turning [[Sønderby 2016b]](http://orbit.dtu.dk/files/121765928/1602.02282.pdf). Typically one starts out training the model with no, or little influence of the KL-term ($\\beta \\approxeq 0$). Then after each epoch, the temperature is raised to $\\beta = 1$.\n",
143 | "\n",
144 | "This warm-up scheme has been implemented as an iterator as `DeterministicWarmup`."
145 | ]
146 | },
147 | {
148 | "cell_type": "code",
149 | "execution_count": 5,
150 | "metadata": {},
151 | "outputs": [],
152 | "source": [
153 | "from datautils import get_mnist\n",
154 | "from inference import DeterministicWarmup\n",
155 | "\n",
156 | "_, train, validation = get_mnist(location=\"./\", batch_size=64)\n",
157 | "\n",
158 | "def binary_cross_entropy(r, x):\n",
159 | " return -torch.sum(x * torch.log(r + 1e-8) + (1 - x) * torch.log(1 - r + 1e-8), dim=-1)\n",
160 | "\n",
161 | "optimizer = torch.optim.Adam(model.parameters(), lr=3e-4, betas=(0.9, 0.999))\n",
162 | "beta = DeterministicWarmup(n=50, t_max=1) # Linear warm-up from 0 to 1 over 50 epochs"
163 | ]
164 | },
165 | {
166 | "cell_type": "code",
167 | "execution_count": null,
168 | "metadata": {},
169 | "outputs": [],
170 | "source": [
171 | "from torch.autograd import Variable\n",
172 | "\n",
173 | "for epoch in range(50):\n",
174 | " model.train()\n",
175 | " total_loss = 0\n",
176 | " for (u, _) in train:\n",
177 | " u = Variable(u)\n",
178 | "\n",
179 | " if cuda: u = u.cuda(device=0)\n",
180 | "\n",
181 | " reconstruction = model(u)\n",
182 | " \n",
183 | " likelihood = -binary_cross_entropy(reconstruction, u)\n",
184 | " elbo = likelihood - next(beta) * model.kl_divergence\n",
185 | " \n",
186 | " L = -torch.mean(elbo)\n",
187 | "\n",
188 | " L.backward()\n",
189 | " optimizer.step()\n",
190 | " optimizer.zero_grad()\n",
191 | "\n",
192 | " total_loss += L.data[0]\n",
193 | "\n",
194 | " m = len(train)\n",
195 | "\n",
196 | " if epoch % 10 == 0:\n",
197 | " print(f\"Epoch: {epoch+1}\\tL: {total_loss/m:.2f}\")"
198 | ]
199 | },
200 | {
201 | "cell_type": "markdown",
202 | "metadata": {},
203 | "source": [
204 | "## Sampling from the generative model\n",
205 | "\n",
206 | "To sample from the network we pass some normal distributed noise $z \\sim N(0, I)$ to the top most layer of the decoder and pass the representation through the layers to arrive at our final generated data."
207 | ]
208 | },
209 | {
210 | "cell_type": "code",
211 | "execution_count": 19,
212 | "metadata": {},
213 | "outputs": [],
214 | "source": [
215 | "model.eval()\n",
216 | "x_mu = model.sample(Variable(torch.randn(16, 8)))"
217 | ]
218 | },
219 | {
220 | "cell_type": "code",
221 | "execution_count": 20,
222 | "metadata": {},
223 | "outputs": [
224 | {
225 | "data": {
226 | "image/png": "iVBORw0KGgoAAAANSUhEUgAABA0AAABVCAYAAAAi7x1kAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztvWmQZGd2HXbfy32rfa/egUY39m0wmH3AGQ6Hi0yKi0yL\nIilblIJhOyzZEVaE9cf0D4ZlyXKEZco2KYVkaTgiQ+SQM0POQs6K2YAZNIBBdwONbvTe1dW1Z2VV\n5b685x/n3JeZr6swwKArMwHc8ycr13rf991vefeee67j+74YDAaDwWAwGAwGg8FgMITh9vsCDAaD\nwWAwGAwGg8FgMAwmzGlgMBgMBoPBYDAYDAaDYVeY08BgMBgMBoPBYDAYDAbDrjCngcFgMBgMBoPB\nYDAYDIZdYU4Dg8FgMBgMBoPBYDAYDLvCnAYGg8FgMBgMBoPBYDAYdoU5DQwGg8FgMBgMBoPBYDDs\nCnMaGAwGg8FgMBgMBoPBYNgV0V78k0+4f8vvxf/pJb7q/anzRj/7bm7/u7ntIu/u9r+b2y7y7m7/\nu7ntIu/u9lvb31kwu7exfyN4N7f/3dx2kXd3+99tbTemgcFgMBgMBoPBYDAYDIZd0ROmgcFgMBgM\nBoPBYDC8IThvINjrv+MCvQbDwMKYBgaDwWAwGAwGg8FgMBh2hTENDG9/dHqjnZAfzPdCz80rbXgH\n4PUiMGbjBoPhnQaueU4kEuzzTgxHWL/RxGe43/utFp/bWvi2ghsRkfa4OhE8Fz460Y5blghswNsp\n4jnH3Mbe8I6Gnv36ZN/GNDAYDAaDwWAwGAwGg8GwK95ZTIMflf+k3mkXn/O9kKdGo9LmoRwM0Ovs\nxmMiIuJkM3h9dFhERLyhFB4TUamNJ0RExKdj2q1hDGNlRCBiayX8xuY2vrOFR79ex2OzuW/NeMtQ\nu1b7Ve+72468OOwj9cgH32G7NBLj1xt4VG+819rHCzfcMYRs4PXhvf7bb7f1LWz/tHuJRNpzQT8T\nRJvCEcd3ydr+RnKARd75/fB2xl5zXW244/VgLgQf2WNc3w72H57nGm2Ox/GY4GMmLY35MRERaeSw\n70WrmOex5R38Rn5LRET8HTz3uO/Zfjdg2ONs46bTeJ1sgi6mQSjS6qpN6xlHzzy6FygL5e0y9uE1\nvHO+h85+e4LrwDv2nPej9rlBXufeDDrmx21jHx7jHrGqjWlgMBgMBoPBYDAYDAaDYVe8/ZgGmtcW\nhYfZzSDa7IwMS2NuVEREvDg8Mq0EfCK1EXqs1TNDT00jjcdUHp6a5DqizpGtqrgbBfzGeh7fadT3\nq0U/HkKets48v7An6jYPlOjLoff77Z3rzFkUEScB9oBzYFZEREr3ILqw/gDGs/EQ2AO/ePK0vD97\nSUREWj764IXSERER+ctrD4iISOQbEyIiMvN9/CZ9duJtIxIR9MUgeGRD/eDmcniegffdG8Hz2gyY\nF1vH4lKZwndaSbQjfQvPh68hwpI5vyYiIn5+E79RqeJ/tci6eTflAb4d9ABCNhBEWhyn+7m+L7uM\noX7Wdbu+q5EYr1rD+4Ng850IrfHKonFSSbxPplHt4KiUZ/gem5BZhF3H1pjnqhHHapWPaLPfZORx\nUMb7jWCXSGwQhVWmEXN+nSTWOR374Lvlioi01z2vgudvm37osA1tY9D2gG1ClolGHJVNFsp7H7g2\nh5h1wdgFc53Po9G2DUQj3Z/12Ha1c45vOPoaYJD6gG0KmIU6vsNDIiJSPzguIiKr70nL9v2Yv24K\n7Ypfwjlw5AK+M3qa/cN2O2oTgzr2oTUvQDii7Pm3M07c0F4RWuf3jEgOQh+EGQZ6ntf1i7bgk3Xi\nJRNBqNMtYk0P9sQY26tjrqxK/naw3w1S+0X23ut1bU9y3xvOSmssKyIirTTnCNvgcW2IltHmyDb7\nhvvf22K9d7vPNMH8T9MmMmlpDePM28qg/ZEabNqpY8zdHa53O7g3eNvu++EzUDIhLs//QsZVcH+n\na4Tau675bLNXU7u/M21+2zgNVAAloC3NTIqISP4JPj7gSPJe3OjrMhuPwqBSMXTmSJIHJh+fWCni\n5mthEYfQ7GX8dnI9JeNnOVhFGl+/jC1sPFwcggmVxSLip5PiZbC4uJxAakS6eAbQDVQnFA+SweGq\n1ervpAotml4Wba2O4PXqNK7/yUM3RETk47lX5KOpsoiIJBz00yOJZ0REpNDEmH5t/lEREamP4rdS\nt7jzhNvpOP1r+y4LhYiITMJZUj04IiIi6w/h9ejH1kVE5O8cfV4+mjkvIiJf3nlIRET+vzPvFxGR\nSB02kVyFnUR4qFSKkToPdM50HSoHYWHdi7LascGGb4aDNuhj+HAVuvHuhB9K5+jZDbWOPW3e5Q2y\nMzrSfX0q/JXkYcoT8WM8cJVr3b/J8dPPSpNjv401TQIHUp8PEmG7Zxtd3iz4dJKVj6Av1h7F51qP\n7Mhj81dEROTsKhyL6+exlo+fRf/lrsPuY7fQVn+DbS5331gOBPY6PEZDNO0htMkbyUp9nPuhZiOl\n8J1Glg7zIbwRoc97+DLme/zqKl6ATxwH6kE6TIcOkHozIfMzIiJSeGBU8vejbfURjmUc1x1fw3dy\n2B5k5BLmReIKHKdeHucEr4Q9A5OoD23eY713R2DDfpKHQ944NYdg037UFS+u+xd/itcfKfGmYQtz\n2t2iUzx8eG7pWHv9Dx5oP+j6HHIWVO+eEhGRG59Efzz10dPyD6aeFhGRr+0gKPBv5QMiItJYQB95\naXw2or+lB+mW9lufHaXh4Jeu92MjXe+HPy+tVkcwiI+6J/BRP+vqWJf1nAd793kT4Xc6IHo15uHA\nkO53Q1jjW/MI7rQ4fs10pOsxUm0HwOIFzhe9YazwnEvbdtRJwHkenHn0denzvA+v70y/ddgXjWms\nA4V7sMbnHxCJH8V8rlXpLKhwryzhtxLr6JPcdfzW6Hn20Q183ulM2ej3eh8OjnG+yziCv6UTcBKu\nPcz+eXRLHp65JSIiS2V89vo65ktrhfdw1/DdoRtoZ/Y1OE3clQ0REfFLvJ+r1wcrUBY657q6buV4\nfzc5JuUDsAsvis9WRzknGtz3dni/exMBE+fmCh6b3Wk7bxWWnmAwGAwGg8FgMBgMBoNhVww+00C9\npmQYeHcdFBGRhZ+GZ2nko8siIvI7R78lM1FEEHY8RCUu1qZFRCRCd/y1KjxX+Tq8cBsR/KZboceP\njqdUviUOvZZdJV56gT0oOuqFax5AG0oH0MbCXfhcbdwXb7bKn6D3OoLv1suMZGzjcfgCfEVj5+Fx\nTWoEhlE4qVT644ULCz8pHVPZeBpNKcMmnr1wl4iInLp+WNJpes/51fEMvMv1lqaq4J16Ds9TsT3G\ntY+e5zZVj4+kZTUn4G1cfYzUy59cEhGR3z3+WREReThekWEXn32hCoHHIAKlARVtl0aYgnQMRqM7\nBbQGyPN6WxoSUzW8ccyHVjoWi
GFF6mhbdAdhVXcb0SWnqWJAakhhyidfr9XFr+G7XhCV2eeIlFKS\nGWlyJ8Aqac7CY16eRbu3D5NiSdJAiyQUxxOpD1MQio5kt841s97+jIhIbgF/DF9gnzL65vRLDHSP\naLKyK6pHEXVae4RMoycQJfitB54WEZGPZs7LEwm09TszmM+/P/WUiIj8YORuERGp5/Cb4/yXMUaZ\ngjb3My0pHF0I0jDadEwRkdYk+2OKtnAEn6sPOVKeZZQ9ynmcRjtyE4hIZRJo58pl9KXTgp2NbzCq\nVWRUotHsCL72LwIVRN2YmuZOg0m4/TD28sVfgI3+/IPPy/84+bSIiIy4+M6lBvrz6fIJERH58jKi\n0NeewZlhJgM2SvYs/5lS+Wu13u53ezEMNNo6jfEuHsEYlSdgH9VJMqo6tskEt+xYUaNNMb6OSG2C\nLCM3T4FAZbEEdN2mOI7S1/VXezT+4X1Pxz6DdgcMg59CW379k98SEZG/PXJK7opiLnxZlY8ZYU3s\nMKKqbEttg6ZoqQi236eyZWFGWUjcuTGJfa0yQ8YEjylejGt6U0QvPVLf49r5fiKPM2x8abvz5Xab\n6/WeEy7CqafuGPa5BlNPto9xjTvKcooxjmcTV5/MR8Rp4rWERlopeh0lC8HVyOsmbNzdpF3pNXQJ\nJPZwrduLZcGzfWsGfVC4H+vAOsix8uEPviwiIn9j7LT8jQwi5i9z7F+pzYmIyOnSIRER+c4yzsSb\naaz3yQL6M7vOPYWsYmm1+k62Cfa9EHt87UmcgfIfw/3MP37siyIi8oH0ZXkojnnxah3ns1MH0e5v\nFU7icfK4iIi0EmhvfAvzKxlKVxDHFXF0zAeAYRdiGLg8AzUOoU82T6algG1NWlll2+AxtonvZm7B\nzpWBE3VvZ9LeCRjTwGAwGAwGg8FgMBgMBsOuGHimgUuPpMwj0nDxN+GF+9gHTouIyN8af05ERO6K\nbcr5Brxr39i6V0REbpTgsTpzfR6/tUIPDJ1tsR14/qYW4bEZugRvfKRYE2cHnqwWNQ323QulUbdY\nt3aDQw905Rjzexhtrj2GCNGHjyCn957MikzHmL9DD9RWC162XAQNPls6ICIiXz0Ml9VqBl7eaR/9\nltAoq+eJaN7bAOT9BJ5hRglSa/Q6b8BTm1yPSbzEfsnBD7Z8CG2rH2cuI/Ndm8lucbggyryHWGQv\n0VlKTqTNNCjPwrtaewy2+DfnYfsNRlm+VRmXH5aPiIjI568/KCIi8cv4bmKbQjEaMW9QLKUZFgUb\nAG+ryJ4MA2HEsXIQ82H7MMa+NupIE0QMiTJFObEBW4iVmc+u+evs31Zc87wZkdhiWc7tmkRWqIui\nkWg+7ku/OE6bYcCIS+0YNVpOog2bj+HaHrznuoiIjMTRyGoL/XNjezTQbKk2sXbUGng8MIz1oFBF\nHy5cxm/HSlwX1hltZs5nT7VMHOd2wSMVPDuEdVsZBuM/sygiIv/k6JdERORDScyDtBuXsofxGWPS\n/lOjF0REJPUo+uRpB3tBtELGwSb+h+Y4B/l+fu+1TMIRp0DwSXNaZ2Dr+fuwF2yBPCH+QURLxkaK\nMhQhsyCO9frkMPIY/8H4d0REZI17wD9P/rSIiCxePCIiIl6K80uju7FosAb0JQLlhqJvZNxsvhca\nBpu/jDH/fx79ExEReU8iLxHGPM418N1TlWMiIjISgT0/PgZRg/yj6L+NIqNvq4jiRLYpltlqdeT1\n73PjHUdu02fhuLdmsAZsPMhI4/thmyPTmMezZM8tFYakskMW4TDGMbbD3ySzLrWK57kkn0fxPEIp\nC+nUrgkJ5O37+O9RZk8ZNq0p9MPiR2ELf/MT3xcRkX849jzeF5HPlTCGX168T0RE4gX8VivGNiRI\nyYqFSxFrnKw/zKK98tjr89Qruh99UDzMCLuS47hdJ/OOuEzdd1rMa9cc50nNW+f5iNoeI2oDehmN\nDu2qXtm9CNqv2mTM1W6ydGb+Xs7R96Kh99x9E++TVnN9FZ/zL6UkWlLGDfdxMgsaGTLWuGUnC/hf\nQ9DIlih1ihyec8Xze8uuUdsLsS1Us6d0BH2y8iFcyy89AXv/tTHY/8FoQ16to02XG2Dh6Bn/RBqM\n6/w4nj89y72ULExfWWyq/9QZae853STEuOD6V5lF+9ffh3H6pXtxzj0YB7tiuZmTb5RwL7daJzuD\ne0CNlJxWHc9jXNojLLkutfB5zhuMM6/2RUjHqXYc+97ykzgflg+1JDGF9d9pKmuKZeXnMH7bSXx3\n6BrL0Op6vlcZ3h8TxjQwGAwGg8FgMBgMBoPBsCsGl2mgub7j8DAuPYUowf2PXRURkf926hsiIhKj\nK/b31p+SL15ADmP6FLyWuQV4YO5aY45zDdEKVRYW5jo7VXqhgkoJTfGCPN9Q5YH9wOt6oOF1X34f\nvJJDH0Co4B8fe1pERO6JI7J0pnYwYBJcLiKiqF7au3PQLDiUgFT24zPw4n7rMDxT5QV49OPLLO1X\nrd1Wrqcn3shwtJ+e2VaOysK8Fs3ZTm3g89mFqkRKGK/kCMvTCCO1Q3j0Usx3a4VyGd3B85upLah3\nuDSL58M52G02gkjjsyXkb332+kOyeQN2kr2Cz05cpZLqCnO4G3dWQfWOI+x91igcI/ClY3hcfQ+V\n8x+AK9lrudIs47Opq1RcztCbrum7Y3ifkibSGGYu2Db79TI+P1RriasRquC69k9t24lEgii7n8Oa\ntX2IdvsIxumffAj5fB9MXe767n/cfFJERM6vT0meeXvNGtqZG4FH+r2j10RE5IEU5vvvRT4mIiJb\nL8Jbnwur9EcivWMWOW5bu4Nrnp9FHxTn0Afl96Advz3/oohIUCFly8Oa/KnCUTlTQr56w2NULYbP\nHEkhOnH4GNbL9WvI/RxhJZZIjMr0nEtY53oYeXAjHVViGAXS6NsU1uX1h8gwOInrmTqBdXw4gWte\n3BqWjSK+U0hj/XtsdEFEROYi+M79cbTpD1PIa17UxN4wsaojAt7TCFQo0qL57OWTYBYuPYUL/RcP\nf15ERJ5kEv+Xyofk31z/sIiI3LiIz/ps88xhjP3HZ18TEZFHJsFU+Tr30to49ojMAuderaPqiBPa\nH/YBYUaZsBpG6RAeNz6Ksfytx74nIiIfyqAdnys8JiIi15bHJbIBm0muMfdbc99p1lqC1+Ei6DYQ\nzUuVGWnTiimNZkdZzv5E4cMaPlsncQ5JPYyx/kcTYM2MRtA//+v6Cfl3L6M6UOwC5ohqOwTltIfI\nWNnm2UE1ajTSGPzzHjOMQiwTbxRtLdyF69z+AMZlZgLskmKVa+F5MBFiRV9iJWo7kSlXHe22gdYx\n/EZ1VkuR4jFeQF9F8mok1TvZsh8Nx20zP7jvqXbDFvO1/86TiKq/J4Nz/vOloyIicuUGIuvZdUcS\nBdUwoE5VllVUQMyS2gTst5Kn5kEd7R7iGdEpa7np1u3sml7Mf13zWDqvyfPt5gme3Q6gKt
aTOez5\n18ig/tdr98upZeTwlyrov4lhnIOemASr6sEs1roXRnEv0EizGkc0vNe77bb3Sd8jKCHLfijNkXGX\n6K4C9cXNR0RE5JvXjkv9JvYHvQfQ/H6fDKPkTeq2XcUZIbbEaklaKUdZy/2uEKdQLQMyTr1p3O8u\nvw/PW4+D/Z4QkeoGdRrWMYb1edjz9Bw+sziD/vO1bGuYTXyHMHh3TAaDwWAwGAwGg8FgMBgGAgPL\nNFAl6epJKB5vPgHP0f8y920RESmwQsLvLz8lIiKvfPakHH0GTIHYDeT/+g2t26p127s9Lu16tept\n1Me2B7Ln3jetucvI4+ZJeJyq98A7+qvz50RExKO/518tf1xERL535h5JLuO7UebzqNL6K/cgyvbo\nXfBGDsfpac2hf+oZRlwYZQ0iIf0CGQWBJ5JoaASZL6tartNoR0acpqqJ47lGmx16Uz0VFFUtg0BR\nf0DqN4uIT5vz6YGuIxghH56B9308ggH+o4X3iohI4eKYDF/C9afyjKIXyaLRvomG8uh6HWX4UQjl\n+ynDqHwcXvabH8OAnnwCfTCZRB985/LdkroGD2tqXRXE8ajaBVX8hDROIAIzPIR1YnMTkavGEnPq\nW16bdaTRt/3Quuis3kBGldYVrw/hvbvvRo7ioRiipgUP1/jPFn5WRERee5r52xc9SXOJqg3jmrfu\npfHjI3IwCobR3UOIVD+XRBTC1wifMlscV3oWZfS9wM4dv3suqu5IiwySaxzA/6OJNfGzNx4WEZH8\nxbF2xGEa4/bAUURa3jcKO0lGsca1GFzzY1wA3G5mS8/UpDtq0mvEScLMonnsbbVRrllZtGFlDaG0\ntRVE3TK3HOGSKKUDsI811nKusk9f4h734grGPL1CxlWRkeagqkrnntdDfZdQXrtMYd4XjmPAnnoE\npQ5yLq73Xxcw9n/wrY/JxCl899AaIiolVtBYzYCpkTuANS6WZHQpoWs9/7dWRvJ62F6Rdps53t4w\n67Dfiz547/GLIiLyi0M/FBGRf5//gIiIfP4UmAYjZ6KSWVVjxWM9i99UhkF9CONaG+H8SOF9tf+g\nv1030LkJsN9nHv398DGDEcdGGm88OIWa7BTHl/89D2X4f/Pdj8rYaa14xbx+nnUaLEZQH8bYxrYx\nlyLbjGI6ZBw4+7i+vx503UmpXhEreT2O63jvUZxdPZ5XXjx7j4iITJ5BO3PXqxIp8tzKPSrGqirl\nKe6DWdh9ehzsoq1FMHFaSfRJZLfzXQ8i7CLtM52fpk4Tq1klj4NZcTSBPWq1iTn8x2ffIyIis18h\ni+5aMTjjaXtkjlFZijak5nA2qI6gP3Y28UaSFWPiyjD2vdvmvt/ap02gY88Pznd89FLMZycJplTF\ndf+/158SEZHrS6BIJi4lJUaJNZc6TmuQ7JHYFK7bdbrZAxEN2tc5xwN2gX/Hc91/XPic98oc0fIg\nZwpgRF5SlsmrCcmuUb8ih8+UZxipZ55/9ibeT90ia7wAu/K1WkzAnB6MtgfMWjLsNh7EHl97COvU\n7BBseeHahOQucv5yTOuz6INHxsEkXc4P8X2OcWN/WMXGNDAYDAaDwWAwGAwGg8GwKwaPaaCRGKpI\n7hyAF+pxet/no1A3f6aMfO5Tp+CJPfb9ssSuIDqnDAO/HmIaKLy2t63z/S4vY4/z3Nr5PYgYNaiC\nWplAfxyeRcTRo3v+Pywgp2/1G/DGzV9sSXIDHmbN/S/N8LeG0YdbB+FxPcpar6ksXFYNRrS9OHPA\n3Eg7+rhP3qrXhXpDVdk82s0wUI9yM4kXGkNxiVYYcaF3WVkJXpweWKqq7lnbWNHrHEeR4P+pF9Tl\nHPAYESgfRj98aAi5rRkX47ZTY95mwxGHv9Fk3XpVVG7m8Bkd2xht32G92r5oVyg6lcSVYTACT2v1\nOKIj138Or//yh5DrOMoSCZ86D5ZF6nRKRi7immM7aEukiucNKovvHMbjPXPQ/5hPw/v89Q3U9k1u\nok8i6ztBHV/1TO8rIpF2XmMWj2rbxTqen60iQvyXiw+JiMjWX4F5dfAU+iFarItPZfTyPHUR7sbz\nNEMXLa4Zm8zt1DrXQlZTkFvttrVV9l37wnEDm1OGl1tBn2dW8b8L6xi3v7qOcEqlgj6JnWeO6nq7\nv3bG0ObpJPL7JmJ43KphbdOKOQGLSXUUupTcafs9irrdBr87V1kRZT/Eiriu3HW8nyw0A3ZJGWQy\nOUwtB8VfF6Hzs/Ma8vnHl6hxUu1QUJeOvbBP0PHQqg47VI8fY2jt+TJoM//uLKLu4y+4MnaOuia0\n/50D+I3xMbw+GsV3b9QQqYsUta47zwUdEXZl1+3b8teh2dKumgA7rk5SPXwG/1wrgHx2G4XaP3Ma\nDIPp72o0rSqxAtYpXTcio1pNBr/dzHBucc+sjlBBP0llbbX/jkvctyjrXvBD5zA+Nsi2SkUwPloV\n689uoD/SC1FJ5pm3PoEGVse47yW4hzLymMijvRGus0GlEj1bdDZ5v+e94wbVwPxRnm8PYRxGDoIN\ndiyNfPb/9MrjIiIy9iq+OvIKK2MVioESvOoDJCJaDQhtOzyKPO57h3Ae/pM52H8zBRtIRHc59u9r\n1YAOVhfb3xymzU/iveEU7Hm9CabUZxfAKJr4KtbvkbPon0CPQERkAn2ozLTGBOzl8AgYFoUE/kee\nOkEJ1q/POThjJHy/vd/lC/zRHswBPWvRBiMlXHe0grYWl3D2v7aK9SG9gPFKbvjikT1ZnsPceWQe\nbJwP5nB/dLoMzYPiCqgIYwWyKsiuGBR2gcjt+74XMst8mSyhdTIRSu31TNm3yiiO5bk33sBvueuY\nLx7vAf1QhZi+I5gT1G1j5Zw8tmt59BC0iW4VYavxtagMX0EbVMNkcgZtfCADduUXajgjxjd5dt2n\nOW1MA4PBYDAYDAaDwWAwGAy7YuCYBu0cD3jZCszZ+Y0xeNIi9I2/uAOPWnqRkUpfxKcKsSrFO/UQ\n48Cjt02jiHtF0/oYdVFPeCOLoakcwDUfzMJ7fCp/WEREbr6A0NLcWbQhvbAT5K83R+ChqzPvpzmH\n9n5kEgVrNQo3mjkiIiKlBrxZbrUj76nXeZ4dCLyhagsNRiScbsZBZUzzmaJBhEW9cC0WUXAajDhs\nMP+PXl2tnNHxT/k4OJ5YjSKpmnmZee1rzPdLxTD2+VxLtk6gffE8Hpup7vwnl6aeSbPSACONjuZ6\n9YNRIu0In5uBzfoziIosfAJt/7WPfFdERA7EEWn4wxuoGpD4HlzNUy+UJbamIh4cw5hG29BP1Wm0\n7Ykx5IteLDJH7gwjD5fgsZXNrbbWQ4890soMCavaf2UFdcg3ngbDYO5FhMzjtxgZabXEZ9+1Elj/\n3Bm04VgClQOqPiJSlzfRt1EN1oTG3IlE2nOvF9F27WONdrPvIxW8nlrFNVQYGYpo/W0E4yRa8aWC\ncsZy5Aja+onRl0VE5GYdbV1jnt/QjuonsD1ea
L73IQKhkZbwfqVaHDp300t4rqrh2Vucu54fsMlG\nT4Jh8N+NIQ/+FiOt31iFLHl2gb9Z7o6yt/fG/ua4aiQoYNxk8HyhggjMq9sYaOcmFvb0ektTX6U8\nT/Xxh9Gfv3EAuj8zZCV+afVBfOcW18g1MBBUUdtveb3LdXXcQMtAdYtqo9jnRo9ijdPa4396BZH1\nkVNcp65j7sdWd4LrdOMq0gM0ybxxpzGXGjW832SFJM2pv4192XWN/WHaqKZHk3u3x43+XBVsyvVN\nrPm5LZHaEN4roniKNMYwh9wKXq/zfT1HxRNaLUVFjl6nUsQ+td9xnSDSXp/CWq3X/5Fp5CVfKqHy\nVfIcBnL0AmzU3UD03C8W22ypdPfvq47FJydfERGRmSj2tT9JQBdAdZ1uYxfuN4JKVY44VIkPGKGM\nGM9lca2Xytibt57D48GLmKvONpP5m82gyk5tDH1ZPIB2jU2jjw5kMO9d3ius57B/1HMx/k8yjiay\nElN9gx2cIYJqaZ26Q3cKwR7T1tEREfFUZ4RvJ6iO78X9rtebGUd2juI7730c90O/M/8FERHZ4tnw\n09u4L0gtUtOjyM0+XD2rU8Om1wjvudTb0jOQT1bwTpHnmiFq1sxHpclqCdFJMq1KZOFdIetmHWuk\nX+tm0g1bxCz3AAAgAElEQVTi+V5Egip5W3djIozcj338YAr3e89fOiIiIrOnPUkto82bJ/Cdj0wt\ndP1W/AaZx1uw//1qqTENDAaDwWAwGAwGg8FgMOyKwWMaaG5jDl4m9SBPxwpdn4szdFqdhD9l6YNp\n8SLwQCY3qKJepJLmOj6ruR6Rm1Bo9bbgmfT6FGUN4HvtWskxrSlP7+MQrrnchBfpyhoiaMkNeOWi\nZXjSWpl4oCa7cxCfzTPy8ncfRj74r488JyIip6pwb+eL6K/cTsgnFWlHL4Jczx7Wcg1Xb1CVd61B\nrUrJyibYOhaRpnrd1SubVe8284TpcI2w4kJQOaM1IDlOnVCF5cAji+eXqsjzL7bgVS4w52v6SF6i\nLtqxRIX12qSyFLp/uv4qXo/n4dl0NzmvKtJ7OG4w31XDZPU9qCv80IfgSX8ojYofX9hAjuP68+iD\nI8+CLRPZLAWR2kAbgj9fmcBvf/hRRB5nuYb84WWwFQ6fQ1QhsobXvVI5yIHrmWeanvAI2TQcWilT\nr2L5FiKt0wvUXVCmjOqyZNNSm0HUqnAX5sm988hlPRKF13qxBZvIL+NxrtTNyHBYM1tcR8Tp1nII\nIi93uh98T3yPa55G2luqTYEoQWKTUWfqdGjERTVbdo44kn4EbfyV+RdFRGQ+Cg/9t7egV9HaYiRC\nVaerjLj0kUmF/++LRjm1drxO1Qjl4pVZoAYdq1BtmxGZnYNx2XgS4/OZ+z8tIiJpB332nQrW+Is3\nEbGbXmf+KJkG7WhjR8WEfkSeQv8zWkRfJJexuL8yDYZBNILPeTH0QeFYVNyDWMO2TqIf3/cINF/u\nScL+nytBbf/MS6j1fuQ09RyWqA+kLKtWK2Ah3nGEI5auE1RtUP2G4hzmwf1joNCoBsPOAtbEaWqu\nRLept5KMSSujkVr8RoXnhfJhjO+Th7FuPn8NzERXA6i129sZVDHZb12HPRBUEdGII5ejJpkGizWs\ngd4G2lydEGlSPb41gzFMptDA6hYj+dRxaqa1agTXOtWO0jOG4+5/gzuqpOhaWx/B9TQO4fon49jP\nnl/GvE0v8dxSYsRUdQyiUXGoZRBE3McxV5pHYB9PpTEPXqpBD8fd0koS3OS7tLt6N+cdxwn6wuOa\nXp3BOpSkfsXZDTDqUstk0tS7z+b+UFYaM5j3OwfQD+Wj+O6HphF5PZbCPLq2M9b1XZqRlOrKNklI\npIQzlLOD6K3uQbom3zH4fgeDRfdf2Kgf0df5EKE+Vbq7Mkj8rm35H06CeflfDUP3JOvCBv79Ntb5\ny3nof8RJnowWud43QueGTpZJv6LvqifBualnn0gGYz42jE27lYPd3vXAujw6hDF+rYRz4DcvQNNO\n1zfVdwruIQZIw6ETTpSDOobz7saDuN6fn8O5d6mKs1rmZXRKerUiTa71tRG06XAS+5hWl0rf4o/r\neCpTW/s5YNN5b2nMjWlgMBgMBoPBYDAYDAaDYVcMDtPgtprVvLQmXv/cGhSEnxqDh22jRs/gQeR8\ntY61JJuC17bZoqI0I1lrl+C1Sa3Cszf/DXox6X1z6pr/cqcb9QbhuEH7fbc7quY10B/qQayVqJZM\nR9X2YUaUJS511mTePg5P3X/zka+LiMivD50WEZGxCLxWf1yDl66yDi/leJG5tRV6Vz2/XUu2H546\nVdRXhVFeQ6yoOV70VrMPKpO+tLKqfI4HN8OxXUebVZk1UumOSg+MmmontIoCqwBkriKS8McZ5Caq\nTThlPNYnIhKPd7c/MYN54VAPYYr1XheL8OSPXsRvJndTUu4RHNcJIi8tKiFv3433fmIc81xz08+u\nIlcvdxXva71xbzgdzBm3TnXZGawNKx/F80/NfVlERP5jARUXkhfQ9tQCI45kW/itjmjrvubya169\nH0S8I2XYpUbEd7YQAYkvx/RLIiLSGGJ/ZdAv9VxMCndR/+ReRJp+e/5pEREZY/TmTI1aLzXm+Wb4\nOIb5H9NIfzUq4mq+JSJf/n6RsHxfAlqQspjI8nAZEcstUmncZT17LnVNVkapHajLh5kLfFccmgYx\nB2P+/BoidpFyqGqKRtY12rYb46AXY98hXhGox3McEnky4upod505uKrlUpzH65v3+fJTDyF/eYSV\nMi5Rw+XPl5APH1lCH8bISHNCWg67Vk3oQ+RJtRUieaxTY+dhs2sprAvlGbQvQr2O1pGGxOP4zs9O\nQ6vkI8PnRUTku9uIPn3xPKSoZ79LxuEFVE9pbTM/XFk0+4nd+pL7mzLolCWneaxXS5jbTqu7Ek51\nFn3SyLpSZdWMVoqskyMY1yfuvyIiIr82BXbhS4vz/C3+a400hpkm0rHXa75/nw5EcZLfTnPNb/Ec\n5zY476dbEh3HGnF0Ah+OubjW1yo429RH0K7aMPtQq0xEujUgROR2DYP9yGcXRvzIpmgmqLVEhsQw\ny7uUypivGa5xzWE8j3qwCT8WkRbb0qA+0eY9WA8+eDf0XO5m5PavqbyeoM6R7o/Sya7s9Vg73VF1\nt4r/n+d5fnUN832chDcvwfPJBNrSGEvK9iFWA8M0l/ffB60uZRgoKg1WCuFZsEY2TqzI/S/tSivN\nijzRkF3sxxq4x28G59uS3pcoOwbjFDuINfG3TjwrvzoE1mTKwfngZhPvvVTC2bDJuRINba0yIFVy\nOuGE5pm23ytwfONo44kp6hWNnZOYg3XrVIMMKvZRnfO8dAh2NJRnpagtXev3pw1vGnpPQ/2W+jzs\nunkY+1qlhbZ/7zwOwrNkmHoRV0rTPAsewlrxoQzOyP/0xs/hN7UY4Dj2iYgyyLao11HC4dKv1zu0\nHt68
PQyO0yCMJjorewWXeGYGG8jLa6Ar7qzDONxtUv1mqtKgMJzHmTKaRucefxLpCOdW8N21AuhN\n0zycOkV2Zq9LDik6b1y19FyDCwipZWXSER3S7bU8U3VC75JForNoxy9x8/i7w2dERGQqAiP6Nin6\nf7FAYajrFAnaIm2zUywl3Be9XGwC6qCWTeRCSHquCoV5FH1yPAlulp0kD8ekePmTdCQVWVYyodS0\n/TkY3Anowh7dgn2m1rAAti6SSqcbrlYUXctKg7TdeL1tDyIilcNof2yUJWgOwAjKUziMpCjM1JWG\n0sOxdpIYl+oM2tiYRqNUBGssWuz6fGUa17nODcVttvtB7aNwHN/9F0+Bsq3iqX/0yhMiIjL/Ir+w\njkP6riVXe4XQzas6xoQbJ/eQoLRYnSU0WywzVRsRaR6Hg+jnT2DePxjH4WmaTsIbdTgclfbYyPK7\n4zxIV9iXIiIqEksKrFQ7SlzdaYTsTPvfZVmt6A7GKblF6jX7oDaG7+XGS3KUB8URF33wzSKEI7dI\nO41UdWF4fZvuuXO0k66qpSfZ19ENeo4crNteDJ8rzjH9DBrAMnXvmvzt8R+IiMi1Jg4fXyogjefi\nIuiqmTU61BohoT+3m7rYtxQ9dV7oHNxB25MbGPtEnmWDOfaP3A0HwS9O/VDuiq12/dQXth/B4zns\nb5Nfw3dHXlgSERFvA0KDfiidqSfoGGulyAeOTwqerdcx3lMJHvJG4SjZOcI1coJCblmR+jBv/qM8\nJ0zBdn7nwBe6/m29yhsnprw4zV2cZJ201V5Cb1g1NbOp5ZHxdHONKXR0iHs5vp9qynAW7U1FYScL\nBdB89Sa8SiHQ6rgKS7MfuK7pDcuuFnCn7SIQXZS2HWhGRrR7zznJssAv34Obohb3ufgWbKCVcoJy\ncxpYqkzjj78//W0REUk4aOPTG7irTivVf4drea/3uV3OWuroj9TwuLg13PV+M00R3Gme23inUpqN\nBA6y4buxf4/FGTjkmeGlbaRl5LdxpmhWmJrido9rtOpLpNydhnDH0xI60VXeV4L55tZ47g6l5UoW\nr8/z7DYcKctCE2P7ImsNn62i9HzT430D9zkVRFW6fkBV77T7fjkQ9BpUzJUlFzUtL04hyGoEjbjo\n4vzyp63HpdbCWC6sIvXEo4Nl5yRFhFlSNrVE8eQ1liCu9aCE9huApiVoafHN49ijHj+KlKK8lsWu\noA9KM2hfZTIhW8dhL3rOm4lQJJVjruVLC8dxb5we4Rn/Btc8Xd+9DuFfefNBMktPMBgMBoPBYDAY\nDAaDwbArBo9poN6nMjxDoxfhbSt49ETSGTe93O0Vb6ZSUhuFl6ZOoYjl42jeByZB2ZtJgnL7xfc8\nJCIimRVQvrKrpDUFImh9iDjS2+5Q8CaxRWo6SyXtsFRNwDtK8HEIjwdm83JiBJGXXxiFKNiwi+++\nUkfE+v+8+fMiIrL1Ejx3E5fx3fgKI1sqDNVZinK/BKI6EfLAavRLvaNejFTMeHe6QqeAjFNXDyo9\nrkmM5cw4xvxmAZG3+jBsIsEI++uWXuozHHpglbYV33a6nuuj47Xp105LxXPw2TWK7ayMMoo1DrpW\ncRY0zvFwKapeRCE6x5vpCQ0KVjlwnsrX1yFkd2IIkZfpHMbx0gOYB+Udpe2LJEjh16jFsZ9EDsMv\nZ9HW31mD8GH2GawP6YuIPPpBmSWNPL41gZg3C7/VCsq+uox4pPIsN7hMbzujSKV5fodsAW8Y9n3i\n6JKcHEYf/WcjKLd3KIqxLnv4zZd3wNLScmQMSgTzRymgrtv2IQdlafcTGnHQEmL6uqaNKY1WWcPK\n4qfI6b2jeZmLI9pU57y/UpkI/Q88aKRKo7v6v7romn2KvAQMF00TIdMiUqRdJ7UkFT5Wn0f//NdH\nvxX8xukKIpNfW0CJxcgi5kmMIrcqmBuUmu2IOrQvpI/UVV6Psi1iZL8l8lifSkzFuiuLvfpXsstB\nRPW5Gvrj2XUIHqbOo+1jZ5h2tE6GgUYRe7GnvR5CkVdd17939ZiIiDx2CGJfwyNYDAuzTD/i59PT\nJeHuJeU1RJUyafTX/YxMf52RKp+ROE1PCMa/1RFh6jnDwNn9uTKNWsqz5fpE1oCyCNPZ9tq0UkTY\nXWn9uQzsp06Kvp4FG0MsNdxPhqHXLmUdpSBzcQlr9Q+nkE710PAiPgvCkJyfxD69U8X1T0zuSINp\nS9WraHviEPax9yfQP5cbsJTzSzjzzKyRfcl1xessQ9eLsQ/YRO3/FSvhGpKrsNPCENISJK50czyt\n1DTSyhSd+yoyyTPdkWHM61Wm3203YQMv3cRm2dwgM1N1B7eYqkbziZa9ICXXZ/nVnkLp42T/6LlW\nmH7rcw4X61gDL1en5IXiEbxGYfTtOtp4KIO+SJBtXdSypYnutItgv+tnWm5HeqaIiF+BvSYKLB+/\nxNK4ZJE3F2EMVxNDEi2ToUKRSJ/inx86AQHB50awDxavYF0cucjzbTdhtfcIpSV447B3tetMFHY4\nHENfZOdxdt1hQYDsUEU+Pgtx298eB6NomMwULQrQyKFPSnN6n4SxTy10p1/5nWv+j7HnG9PAYDAY\nDAaDwWAwGAwGw64YDKaB47RLUKhHhlH/5BrcgmPN7ohJdJt5+Cwh5HRESGtzLN+2BU/sX8YgiPSz\nRyAicv89EM9aeBWRidxL+HygbdAPJxyja34Jnqb4Gh7TK8zH8ul1owdRyw2q4E9pPCaHU8jfeTiO\n775ah0/of7sFoYyzz0JcY/pFfCd3GR5bd5NiIZq/3Gj0Lr/bcdoMAy2LpN64MUYRKABSmYJttDRf\nS9P3myKxFXjVGsNkqjDvOR2r8znspTJOz12qo8xcv6E2Hy41yX7QyIvmfGnOp5YSjZaaQWmmIHLI\nqHE9w3J7R9Heu1nW68zMpIiItIYpKMrIdi9jjU7EDTyd6VV6Ws9jfM60MDdPJ5nAzQi70KYjjJpH\nqo5QQ0rKs2j7fz5zSkTaEchPnXq/iIjc8zzdzauYJxp57IvYJ6H51W4RjUiuwk5TtNOAZUPPupZh\nyozi8yOJiqRpEJMRMoYEi8NnSxC9fPGGlvHCbyU3dS3tzql0anXxG7SjXpbiCs/BmAofwia15GJ1\nnPNkHOtUNlaTPGuvuczNqzMcrw70VoI6KDFdLEJ+8gEqyRQwXrQEowrT+tT7yKENc3OIKnniyl8W\nkMf/g7UjIiJSvIr5nl5nFKLYXarTqda7/lcQ/RsUQViuXyqGmSxgfQoYI1z0T9dFPF8jq9AqKjEi\nl1phBK/EEoXa1n6OdWdee8CmxJhkKXZVHcc4P1c/go81Nee/+6ca9ag08xSyZaS2OoI5s0RhtJbP\n+nJkGgTR+3BkqTO/tcdME2UU6r7vpxPdH2AOejILm21QANj3HdlYx9nAZ1Q2wih7uQobiFIHoRnv\n1kGSSP/iZL7vB2wiXedHXoZ9vziKNdo9xP2QkceDk2BSqW0PJ
6pyaRN7d3yHY8t+2vRg7wtNnGdb\ny7Cn+BbXES271yly3Ysx7yw1WCGTqMB9bo2siQlllZIpyqWwQUHIxhBtIV2XeIRjzWi7y1PLK9Qs\na66g3RqVVqaBMnpSG2TYFmriUEPF208tA0Wg4dGt2aW6A9QtDpgQDs86a5uw9c9sPSINCqHr3Bib\nwBl+Mol5P5xE/9I0OtgLocd+Ilx6kntQrIBrzy127/uqxxPfbopbx2e3j2L9W5nFZ09mwLYcOorO\n+/b84yIiMkrNrH6Luwb3OHGMn9o5b+vkRpGltVN4Qc/p3ii+dzy3Kk9kwZi/Nw77vtrAmF/ZBGNe\n7SbB8132Fs8RFEIMyom/xXs7YxoYDAaDwWAwGAwGg8Fg2BUDwjRw21HmLDyvfjLe9ZHkKvM81xkV\nLzIHiaqYfr0elG9LMmo3JcgHuz4Nz2uEqquaN/bK/ch98r8Uzm/vMXw/iIY4bE9kEx7E3A30Q2Kb\n0TdGzCr0QJWoGNtsReQnsmBS1OjB+8wWSsydeg4quvPPUifhEvrQpXq8p/lc9EQHHqkeQSMOrir5\nU1m0dAjjtnOI3uacJjfjIb6FP7IL7YhiieWpGk3NB8NvBqqyVOX1aSv6v/3dPJHhUkz7Bf3fGnlh\nGUSPkZcmlfJ9Xk+E3tZoiUyD1e12/rdWwEjAXhLb8FKryuxMCnb10jjHOk1V6V6WXtRIR7MpPtk9\n0Q3M+7FzuE5l2Iij44OnDY5frKy6Fr5UGIFOH0TbZqJQG/6/lj4hIiJT32Ibr4Nh1GK0462Unbkj\n8L2g1JxfxhyM5fGYXWQ+O3NZ60Oc91Poj0oD4/qDQkomHkQf5liO6Ewd7ftXV54SEZHED9G3OUY0\n0yvwQMdX8T2Ha6m/XQy0DHoalQ1pmfhkATWGGIUaoRL6JK8/g2tcKg/JToaq4lwUKi2ukyy76zS7\n1crbJReVkaP51HJ7rnWv7SJc/jEoycfqEXN4/73UrvmL1YdlpQw7WFpGpCK+0617kihoRQrambJr\nAv2eAWEYhJX02fYg2jTRXcXj0xsfkPkENAsmotwrE7CL9WR3JC9AP9tKHQVf2ow+l0yI9ApseOQC\n9vraKqNjND8tyaiRJLcRDbQqNOKYZ9nVKl/e0dJCRLTCvGkyGTWy6jeb7X7p8X6nZz5JaBlZPiaU\nFcCoexINL1SxjpUXs21KHKsGedR68n3MlUiEz7VYkg69ajqEGUci+9d+HftGW0cntoyxHj/HPZ9s\noucPQcunmeH1s5S0VtHa2nRldJFMMeoirLDE8K0mPvNKDefaaAntie5wrMsVXkeHhk8v0FEhRbUD\nIloZioyZygbWbVX9d1RWS5cDVsEpr2Qkx9LqB9KY/5e3oWNTZgnxRIH2xWGMYXmQoevUxFjC96OL\nefGCymn7XGrZcQJGnZ7vtHJUk2WUm4nu9crVSlhL+Fx81RUGqIO9MDdPDTRSM7aq+GyUR3pXS4yH\nGYT91K/RtT7EvFDWeGKdLHKOidqKFLaDSH1iFDpNDksuHk4gMp/mIvl1EtT6ySzaFVoxo0im0UU8\nX3RR8eMqmeNCzSqH69rlzLgcPwk2xXrrloiIfLWM+7qdV1FJYvqM6mHht+NaUjyP+7xg72+13tL4\nD1iPGgwGg8FgMBgMBoPBYBgUDATTwHEdcbQ6wDAjo1k892NacF49Zoyk0lvfpfKtf9ODpR7WVgru\nufkEPC7zMTz+sQ9V9SDC08dIhEb2AoVnekBjK4ySbrM+e5p5XIygFw+jbcfH1+SBONp/oYHPfOY1\n5LtOo5S3ZF9FLqzkEYn1SvSyhj3PvVSRd9ygbrWTZh73EDzGxTm0rUavqpekJ22RSsiruMbURlNa\nSa3hzhxP9pcPJ5w0mQ+pNY79JBXj1evLvHDf7/D29ijfL+yBFmoZtDJ4rGfxvioKJ7ZD1+f7Qa6g\nT1vWPlWWQoqK049nr4mIyFfiiGioNzdQ1XXcnuV9+a1W4Ol3V+AVTTLqnVhR4QpNTu+ODOv6UBtL\nSOFujPV/efw5EREpeLCf770ET+xJqqi31OPa7M7t7Bt8v12fnmuXy7mZisMWomW0sz6C5+lVRlzI\nOCgeismJ9LKIiExH0A9/nEed+o3TUM+ePYf5nVpkX+8wDKG5brqW1uu9j0KJtKP9rrKAaPdDnMuT\nvCRGFVtkEZTqcYk53ba6wWhki/ngMVbUcKmJ4zRC1QN2Q5/tQteBxgjseOcgGRjUZVksYyG4sjIh\nyVR3Lq6jOd7Vbg0grcqjFYICuxuEHNcOaNubbPvWUYzj9BgWvetlLOgX1qfkpw6dFxGRw3FEmUYT\nsOslrvES1TDzAOjWKHyvPde3Mf8Si7i+IcG4tlZ07WbOejPMsGvn9lamcA6IMCKlVn2lzrz3NfRB\nYo2VYjTavJvOQ4/sPtAwUWYdo/5+XKsk4e14GrbaaLW1DERE3JoT5ALrvqVVIqKZUG46w82ukicD\n5XYvuJZgu9vv9vueeNynXVb0SPD5VBHR8pGLGE9d75Uq4VJhP71YFLcE+2lMQM9lxcN6udrC8wtl\n5PYnuVe4BebtKyu3j/ufXkOEjJfEBuZ5ao0VQqgmTwJtwKgJNClE5L1T10VE5GPDr4qIyKer78Mb\n/IhqHCnDILOCAc5exhyIrLGqSrGDWbdfGl6da4/aO6PlyqirjZJRR8ZkjXpcqusQ26YW0YYfKONH\nZrHWHR9eExGR8TjatrmF/S/LPogwoq37ejDfHaf34x/S7mqf+2EDXoq6dnw/km8zDESwdjn8DY/2\nkEjBlt+XhE18tYTqQV5MNbAGhFGn1YF4PQ7n/8g56vCQJaNnHi+qLEvqOU2k5NQcdL6OxDDmf3br\nMRERGX4N/2LoFd7frVHzqKh6HaE+eIvjbkwDg8FgMBgMBoPBYDAYDLtiIJgGEokE3qbmGPPWZhFp\nVPXURAGel7R6iBk50ciE32y2c4VGqRj/ADyvJ06ivuXJBHJBqj5+K3kTj06lD3m8YagnShW0GREI\nFIY9qtzTG1eh2mzjGDzVvz7zfdlh7ty/XfuYiIjEf4CQy/DL9EBRNd5j/rQMWrSJ49fKMbo6TC8t\n1dLVw6U5vkHd8lxE6GwPohSxEXxnPIW2rieosq6BVI12aF5lkGfVh7rVIThUj29mGHEk06CZwThV\nqWeRIVsiKhJ4D4NI3RTmQOEEPvvoDGz/chXR58hVzC+3iOh7W1m5h233/YBZ423iOpwdhAfUG98e\nH3qn2TeuVoc4kpbWvfCoHojDzj+9hGoJ08/Qk30dbQ8UkvvNMOiEzntGYFqc/y7HI7ENXY8oIwjN\nEeb90tO+9URDnkhBVbfo4zufvYEi36Ov4F9kLrNvN9see5HOHDdeQ6vVn5zHPdafeqbb7iWH9kWj\nWLeODW9IjBN6sYac/kIFdu2TaZTEkifpVfbrDts+SGrSCs2zJtOoPId1cAcBBhkfQzRpZYdsPM+V\n0g4ZeVVGJGniiW0q
Uq9j/QtUwmlnb1VBed/Atu8cxTjWTmK85rNg4FzKIyK7s52SM5vI3Z6IoV+S\nlB8PyGL7naf8Y0L3eG+bax3nepJnGo8VbZRVqJE3rXLi7lTbecCT+ExOQ4vEdzaO43UcfSS6jv91\n2/j3klUo+i+5V4U1PPR9EkQaFdjCoQmsXwnO+81IRjytosOqCUNs/2gaj7c2sf9FWVUptUJWpUbb\nOysI9Aq+H0T5W0UygMg0cGgLqRzOKal0tyaF1Dp0CTQSS7adsMqMxxPSpR2wTBIF9nOx+7zXPw0f\nP7gGbwt7UZTXMsJxiNRh+xphVaZkbRTv33/fgvyjyW+KiAjljeTPWWnCLaEfMovd6vGJJVIONsAw\n8HT/q9barIv9huMGUfIgz55rnbaxPEPdrTHqm6mNV5VVKFI7CPv94CFM7Ec4wZ/fPoLfXIDdDF9m\nRFtZxXpf0U+EtbuoK+YPYcxrY2RN1UN6Q/r1RFx8MtGLrJrwiaNgm03wN6/WYPvDl/Ed1dDoO2jn\nHpktDs9eLudDamtERESSZJ/4ZJqmxjCea+mErFTR9msNtPHqKqomzC9ybJVhsAV7v82279C8N6aB\nwWAwGAwGg8FgMBgMhl0xGEyDjnqx1UlGV5jHWZ1Q7whz844jopTcgGcmvsOIyk5TKpOMUhyGLyT3\nUahN/r0D3xURkZkIIhL/8NKviojI0FV6fzbhgex3hBmXEPLCq4eQnrTKDPpn42F87pMnkNc1F92U\nb5SPiIjIN74JLYOj32eUiR4ovxJSz23/0zvcijcB32vn0/OxlaSWAXO7ZifhLT2UQ8RhbQ7e+MtZ\nKKjG8xFppfjdaXjuT04jz3Uo1q26HaRAh2vDd6qo9xK7ef8016ultsCPagSGObvVcUzfSHVIZJK1\nmdN4bfVReCynP4Ao+wM5PGoUevwsf1u9k2pnvY5CdCgri3TYP6OE6m0XMgx0HnhkWRQPROTDR+FW\nfroAnYZLXz0mIiJHnsP819yuQYs4isht7dfId6D1oBEYftxjHV+tAnJobkMOM4nzc0Wq6b6AaOzB\nG2QrbTPKzGhW0NeDtA6IBGOrcDUwRhX13DDaM5NDOx4euhnks58rYi3YKlJpeoNVOFbRpuQtRptU\n+6PZB+2GPRCwyRh58nKILpQYTXEOYd+azLAu8xoiDL4n7Tr1ReZ009STeUQZXK2MwejabRUa+o2A\nQRjVGsUAABODSURBVMSjyAjWsdIs2jM1gYhkhtHE7SKZJJWIxFiv/UoF9n4+j2pJyTzXEEZ1vLCd\n9xMdOiaqzyRedyTY5evOMPY5j5Vw3CrZlfWGeBn0Qz3LcWe1AM3pfWVhVkREDt1gtEkj1Zrb3M+o\nc1CfvXuzdRp4XSPk5SL3N26AD0wsiYjI1PyO5CKYx2VSC7U60mtFMOmubMIWZi+x0tAqWVac/31j\nWgT/S5mlHSwvaa9LDqssBOyjRjtq6KQw9q0E1wcqyK82cTAoU9MqucXf7AerYg8EdsezqMf1KMrX\nh+tY29y70JbCce73BzBuvzLzgtwVw7z4IvV+vvva3fjuJZ77b+KziavYG7RihUadu5h1wYXtb984\nrhOMpcNqIc0cqwQpo24I1zMyTQYSbXqL+i7DQ2X5ydlrIiLyQAaVoMoefuOZ66CjjVzA/0tfAsUu\nWPc77V2kL3oGgZYBWaTOMNb6+hja18iRPa0HXR82ENOqSsmo5O+njX8EY/rr48+IiMg62/dn5x4V\nEZHjPySrRJlVg3L2U2apB1vwqO0RRO/LrPZGplGM5TKcVkKyUbTl1DbG2n0NDI2EVhTUNX6fGAYK\nYxoYDAaDwWAwGAwGg8Fg2BWDwTTwvaAaQnIFnpeN++BRqk3CgzR3FF7DmAtPzUiiO4fPdTw5HMdv\nPMY8n8dSV0VEJMdEzz9Y/4iIiNz6NmpiHn0ev9lWlxwQb1QnNM9/GJ6nzXvw/NhDCyIi8vHhcyIi\nstwclt+7BC2DsZfx1dgtROaDWrSDkNf0OtDrU6VgVTw+MoRo+G9OfU9ERMYjaM8zM8jbfLZwTOoe\nPHJ3ZTmmTGzVvNcG1fhHV/jbFf54OLe5D5HHILqu9kePYbSM/kgU4JlthnK88/ejzfl7M8FrzmH0\nzd3TiMoczMAGPnMN7BP5MtTHJ15mJQH1wvc7xzkUgdHE5EAxWyOS9MBWZuFl3b63IZt1vPbDc/DA\nHnuG3uVlqMz2vW1vBOH2a+SJuW8OI4Qeo0vlGfTHxyevyRrFPT63jDFOL+O3Ymuc96H63H446jAo\nCFgmVP6v6JzEg+Yr/8wMFriPZc5LhG++lDgsIiKNMuufQ0xZsgvUQ8mTZaGVIsJ6Lv3sC2U5aQSK\na30NZDrJpmHPaUbbx4cwrsVqQioVKnGTcZBeQzvitxh9CK39QZQt3G7f37869W8AgTZRAo+qVxNR\n9XuO84EJRpDGHRmJ86xQw1qwdgsddvimVljqqKw0gLhtXdJzSIyPrW6b9DXilktJc4j17SfJNOD4\n/4frUJJPnYUNJVbB0gsijgM05wNbZNQ5uolIcGYF86AyBSNYOoCI5MenkL/8S7nTMh3BeystjPXv\nb3xYREROXcAeMP4DvD/8KjV7VIE9XDWin/n9XWBfhFgxmgevbAEnEhEngTlfZeWMqUlElfOsnnBz\nHfPgUJljPSgK8iJtZp32P6/N4zlE9/lUFm0sziMKrUtToZWWT22DWfQvL+K8mz2DuTDxMuwofp3a\nXRuh+vR9ZNb5nt/WNNAKAIwiq2ZVbAx704NTYIUeSGKty0awjs3FN+UJVgm43AAj43df+zkREcl9\nBWvgxPO093x3pL3v9i4SaBoEa30WY1sbo07bON5X5sXOAbJMYlzrpn2ZewiVon73ri90/fR/cfbv\niYjI7GfIYriJfhoolpnI7UwjJZiSceDcdr1Y+7y4yJUtjHmU98BJ3OqIu63nu/1lGCiMaWAwGAwG\ng8FgMBgMBoNhVwwE08BvtYKISHQB0cHJl+BFdTwqCFNB+r8/+jUREbkvjnzlw1G8n3Bi0qDbZr0F\nz8sPaqhX+z9fgUdy7ctgGBx+mt7364t47JWC6htAUMNYc7hz8CCWDsErV3kE3vifmYY0+jh1Gr5V\nPCn5S4giH7vGaJrmxe0VaR0Ez7NIO6pKG4gtISow9gps4HvzyFl7eAjsil/InRERkZ/Nog8eTC7I\nMyWwDm4xPPf0Ar5TXgBjZfQc+jV3g5HHbc1v2yXq2vO8/u4og+YmRdcxfulV9ENpHtO1Pof3D86B\ngfETM69JwsF3k6RnPL+FyOtXXrlPRETGnsU8mfoBPNDOjQGuKNCBIN+bqtL1eWia5O9TddltOXsT\nbJKxF6l7cgNrSJDP9g6An2R0Sb3yRzBu9ySX5WIdebzXNjD/x5epB7EdUsz/UfO93zagc5HRp5hG\nypjjOJZEe96XgobFQ/Gk3GxijtysYt6nLqOfhq6jzdEVrvXKMAjyyHdpay/b7
zi3RV5UTVojL40c\nNV3IsNAIw2QK/RBxfCmVEIVJsSb70BW852ypWr5WyHgD7JJ+jL8Tilvw8hhck0QU43UohbXuA8OX\nRERkMrod5PP+xToZNldYFWkF+4e8UbvvNfZgVbXf5+taKUr3R0YmvVRUmllVXsdHd25gXYyUqAVx\nlYy6qua5DlDELaTj4lFnwKW+TobtrIyhTetz2MNfmMSeNhcrSIt99lcbD4qIyA9+AC2H2efwL4Zf\n5T63Qj0n1bUZVJZVGFyffFGGAdeKVDJQkC/chX4a5rrw4vYhfPcWjMJt6N7Otupc8weAeed3R1qd\nFq5No6URMkHjW9Sq2oG9f27xEdkosdLaD7Hmz78A+4lfXhWRdmWGQWAYKBzXac97ZY2EmESxGDrj\nyWEwpI/HEVUfcjk/HE/O16HV8c8ufhI/9RmwLqaeY4U0ZVeWwhUzBsjeact+FI/NFB6LNN/4fZi7\nB0bw+Ngozv0/kTsnx6LYz2+1YAN///RviohI6s9hC0Mv4VzbCnSsBqjdndiDcaDzQNGidld92Jc4\n31tdB/tg7hYrsJRhH16P9m9jGhgMBoPBYDAYDAaDwWDYFQPBNBDfb+fe5JGTk3oBnsb5PFWxbyKa\n9j/d+xsiIhJ/CF6ojx28KCIix1Mr8moJn31uFS6r0ndRz3LiLDyNB8/CY+WtUzG+XA7+f98RuoZA\nSXuEuduH4FW+ewaexIkovKk3Gogufn35hIy8ytyoVSqFM2K3Zx7rIKBTTZo24NBbOsrc9OwN5PL8\n+dc/ISIin56Gl1Vr93pRkWiFbWfTMxt4b7yA385cgYcyiL5uMAJRD0fi+qEmHaogoP3AaGH6Bmxh\nPIba0/kmImyFUUQUXt2ZCdTFX16HanbxeXig519mHeuX4LVu1yom02KQIlAit9fyZZ63TyXxygzz\nXadhw14xIdElRJdHLjOywpr0Mgh5fG8RyrTQms6tBO08jbl9sz4mr3lgVNVuoo8SBfZDM6RhsBf6\nnNcb2L2qx9NzntjAPMhdxxp4ehZMse+P3yUiIjHnNfl+5V4REfneZbw2eY0K7Gtc2wthZeFQ1YR+\nrYW+LxIOMHOM/Qj1Ozgsm2Xkp+dirAbALxYqSXEWsQYM3UA7Ipvd1RKCvP4gsvc6ttAPTYOQmrej\nLJMdPF/ZRlT16AFEEX86g1zVmDjyRzuILj97AWN/8FWMbWQFa1xLK2UMgGr8m4JWjuE80DxXnxUT\nHN+XRg62klrX+vZYJ1hUQNK3mCOrTAMvdAYYBHhaxYVPNa99gYyaGto9dB373qvnUR3nhdkT4sfQ\njswN7BeHX0U7k9dZCWuV6vH17nnf133+zSDMDtAqCo4jXg5zXitBLS6BkbFZgn1kFvFZt9bq/o1B\nwm2RVmXVkCFT5NlvCcZRvoY9/mZpRuKbaM/0i3gvQQ0Dv0h9Jq55t6114coBjrP/dtCp4RA637kV\n2GSKFX7WrmOt+0/Z94iIyL2jy10/tVAalVcvYA+ceA7zfVIZBrfIsghF2AfK3oOqKRxr1S5rkk2C\no12gXfTLMy+KiMhTabDLWuLIpwrvFRGRT519UkREZj7Ps98L0PBq3UKfBeySQWj36yE8D3Qf1Cpa\ncc7luiP5RbApMldx255eol4R2Tm9avNgOA1E2pOLE6qlZYd44zRyHhY19k3QUiSFhfNCBjT0C5Hj\n4pTx3ckyOnOijg3EL+F5axBuEN8kVCzFY7bCWgkH6K9tgna+VMaGuvzCjBzQm6Y8Kbn1ED3tjdKT\nHef1P3enoWPPjb21wzt/LTu3hIVg+HlMoGEt3aLX6bq3lWsLSovxt729ROAGyQZ4TR43PT0sOrwB\nGNrADVD2MuhJtRdwk7iUvVu8KOnJ61hAJlaxmQTlNknZG7gSNGGEbZRUNi+D+V+e4AFIaczLcRl+\nDX8nFnhg1LJag+Qc+zERbCK86XUbeN7MY/37/PUHpZCHHYyfoUOBN8xhYZyBdBp2IDjYc72O3cL2\nNMl5Ht/CofgPLv2siIj836mfkfgW3pt7FfM6e1Fpyd0lp24TgRwABKloOsYVzM3UEux39Bz2uk0f\njuEzs1l+kUKXNxMyBm04GTpP2+faHzjhfxQ9tXMd6OOa4If3+8vY1xbP4PFfxpBiePUQRDDPbs/J\nC2dRWvXgX+M3sudYYk3pyY2wg2jA1rzbROG6zyeOlt+j8F0gkpVOSiJPUUTaUHUMz6PVbjHRQASP\nUJvzmwPUF6H0PL3xcdgfCQraHbiMs483lBY/jrXB3aFjRUuLcu3wQqUVB2ne74rwzQNLsjnu7fPW\nqaOfsrfwXm0c9lHbwCExziGPbHOvH/S2S4fjmKKYDtexBIVRRy9gHWhedyS1hnYlr1PoUEsJ1/dI\nMw47YKQP64HvtVNOaJuRZexRow20Pb3CVOQfIgjwfAqPmqqVXmvKiRWmpt2i2CPb7oXPPAO41uk6\nFzhNGMAaOsf9rIgxLlxD4Oufz/yiiIj8U5ZTTy25MnQdfXXPBQYBb0H0vsV9I0g1H7T2v1EE5Xix\n9sW20FfDV2KSWsUaP3KJqZcsIxucd3uEAXRBGgwGg8FgMBgMBoPBYBgEDA7TIAylrgUUNnhePI1C\nv0MReOMYbY4W4EHP3kSUbf0Cok7fuQmqSvoGhnD2bFOS1xhVVtGvN0vJ62PJrS4EgjFqA3t8rEeX\n0zPc1u49bP8WqFixl+Hzi0lHBElpTYPqcX6zUKZInEwSmmjuGqPqJV+GLzE6sQ16YiscZX07IUTZ\n1+ijCpol85jTwxQJrS2Oy9gavjN8lR78dZZaHWQxpE6onddxfQHLjB70aAFRhYmbYNiMP6v1+CLt\ntjElRcVf92QWKQZgXgQRZo2wMkIe47WNkUWW3ARttToS7/p+YrsV0NDdLUZYNe1o0KPsioBlxtJr\nHOv4S7Dlo4tITWt+BX3wnQRKCkYLNbmvADFkf5MpVyxb9baLNqn9qyCi2mz4UZkHnidulKKvTGWJ\nbyuNFZ+NbDDyVgqV3Byk9ATFXvsex9FRdhyjibLkBCxDrxWiYQe/OeB2vxfCfaE2oazRSCSY69kF\nsM1iJa4L/Gp6gVHY5ZAg8CCu/2GGhaaP8QwbtPVKm/2qrwXlZEvdqZZ7Mur62X7fD85xwVrH/c1h\nunTiEuZwgiLoQbnNjnVA0ziag8yY3Quc17pHeWSVyApSK+LnmXbydaamqmCits3z22zctxFb/A0h\nxDrT+z93AX0zVq4H92fKwtHzgjKTtX/3G8Y0MBgMBoPBYDAYDAaDwbArBpdp8G5FyBvnsizkGL1L\n49+lMBxFsxxG2LxSOfDc/diaDe8Ur907HaFohMhgBhF+LKjHVSPF9KLGrkDXYroI0Sct1+NulUXo\nqW+9XSONuyHUD6015GsnqFMwdwXMIz8Wba8BzG9sBQKotI8eeaDfMsJRNgp5qRaNoyJPCscdbI2S\nH4Uwm07zelXThOOZWUQkMTMy1PV1
p+W1mRXsG0/zet8uY64IaxppdJTMA+cSoiyqXOOLSPPtNNZv\nBKH2hCOngcZNvRHo/UTWqHcQ7T7KqajgbbnObyeEtI663ur1tfQLYeZZpSKtZZZk3sTciFLfK2Bf\ncOxbg57n3old2iki4q3xOfd4cd22NlmYTRX6rYHFnoxSvh8uFT3o7flxEW7XXv3xboLe/9Vo02vQ\nrnDyhYE56xjTwGAwGAwGg8FgMBgMBsOuMKbBoEJzubUspD4aDO8GqMe11J2rLSvI03RiWLo06iAi\n70yPvBeKujP/r6vCyTux3Z14nYjjOwp76PhIkWOdL3R/vjPC9i6xgXcldrMLnf877+J+eTehcw3U\nKlMakd7e7tdV3XmE1vp3/Jq/G97Na50B2IONMggwpoHBYDAYDAaDwWAwGAyGXeH45tUyGAwGg8Fg\nMBgMBoPBsAuMaWAwGAwGg8FgMBgMBoNhV5jTwGAwGAwGg8FgMBgMBsOuMKeBwWAwGAwGg8FgMBgM\nhl1hTgODwWAwGAwGg8FgMBgMu8KcBgaDwWAwGAwGg8FgMBh2hTkNDAaDwWAwGAwGg8FgMOwKcxoY\nDAaDwWAwGAwGg8Fg2BXmNDAYDAaDwWAwGAwGg8GwK8xpYDAYDAaDwWAwGAwGg2FXmNPAYDAYDAaD\nwWAwGAwGw64wp4HBYDAYDAaDwWAwGAyGXWFOA4PBYDAYDAaDwWAwGAy7wpwGBoPBYDAYDAaDwWAw\nGHaFOQ0MBoPBYDAYDAaDwWAw7ApzGhgMBoPBYDAYDAaDwWDYFeY0MBgMBoPBYDAYDAaDwbArzGlg\nMBgMBoPBYDAYDAaDYVeY08BgMBgMBoPBYDAYDAbDrjCngcFgMBgMBoPBYDAYDIZdYU4Dg8FgMBgM\nBoPBYDAYDLvCnAYGg8FgMBgMBoPBYDAYdoU5DQwGg8FgMBgMBoPBYDDsiv8f+76dnEtnDGUAAAAA\nSUVORK5CYII=\n",
227 | "text/plain": [
228 | ""
229 | ]
230 | },
231 | "metadata": {},
232 | "output_type": "display_data"
233 | }
234 | ],
235 | "source": [
236 | "f, axarr = plt.subplots(1, 16, figsize=(18, 12))\n",
237 | "\n",
238 | "samples = x_mu.data.view(-1, 28, 28).numpy()\n",
239 | "\n",
240 | "for i, ax in enumerate(axarr.flat):\n",
241 | " ax.imshow(samples[i])\n",
242 | " ax.axis(\"off\")"
243 | ]
244 | },
245 | {
246 | "cell_type": "code",
247 | "execution_count": null,
248 | "metadata": {},
249 | "outputs": [],
250 | "source": []
251 | }
252 | ],
253 | "metadata": {
254 | "kernelspec": {
255 | "display_name": "Python 3",
256 | "language": "python",
257 | "name": "python3"
258 | },
259 | "language_info": {
260 | "codemirror_mode": {
261 | "name": "ipython",
262 | "version": 3
263 | },
264 | "file_extension": ".py",
265 | "mimetype": "text/x-python",
266 | "name": "python",
267 | "nbconvert_exporter": "python",
268 | "pygments_lexer": "ipython3",
269 | "version": "3.6.0"
270 | }
271 | },
272 | "nbformat": 4,
273 | "nbformat_minor": 2
274 | }
275 |
--------------------------------------------------------------------------------
/examples/notebooks/Variational Autoencoder.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "code_folding": [
8 | 0
9 | ]
10 | },
11 | "outputs": [],
12 | "source": [
13 | "# Imports\n",
14 | "import torch\n",
15 | "cuda = torch.cuda.is_available()\n",
16 | "import numpy as np\n",
17 | "import matplotlib.pyplot as plt\n",
18 | "%matplotlib inline\n",
19 | "import sys\n",
20 | "sys.path.append(\"../../semi-supervised\")"
21 | ]
22 | },
23 | {
24 | "cell_type": "markdown",
25 | "metadata": {},
26 | "source": [
27 | "# Variational Autoencoder\n",
28 | "\n",
29 | "The variational autoencoder (VAE) was described in its current form in [Kingma 2013](https://arxiv.org/abs/1312.6114). The model consists of an encoder/inference network $q_{\\phi}(z|x)$ and a decoder/generative network $p_{\\theta}(x|z)$. The main idea is that it is possible to both reconstruct and generate samples from from some input distribution by learning a variational distribution over the latent variable $z$.\n",
30 | "\n",
31 | "<img src=\"../images/vae.png\" />\n",
32 | "\n",
33 | "The VAE therefore has a bottleneck structure, where the input $x$ is encoded into a latent variable $z$. New data can then be generated by feeding a latent code into the generator network - $\\widehat{x} \\sim p_{\\theta}(z|x)$. The diagram above shows the generative model (right) and how the latent variable $z$ is inferred from $x$ (left).\n",
34 | "\n",
35 | "Below we will instantiate a new variational autoencoder with this bottleneck structure consisting of a 2-layer encoder network turning an input MNIST image into a latent code: $784 \\to 256 \\to 128 \\to 32$. We also have a decoder that performs the operation in reverse: $32 \\to 128 \\to 256 \\to 784$."
36 | ]
37 | },
38 | {
39 | "cell_type": "code",
40 | "execution_count": 2,
41 | "metadata": {},
42 | "outputs": [
43 | {
44 | "data": {
45 | "text/plain": [
46 | "VariationalAutoencoder(\n",
47 | " (encoder): Encoder(\n",
48 | " (hidden): ModuleList(\n",
49 | " (0): Linear(in_features=784, out_features=256)\n",
50 | " (1): Linear(in_features=256, out_features=128)\n",
51 | " )\n",
52 | " (sample): GaussianSample(\n",
53 | " (mu): Linear(in_features=128, out_features=32)\n",
54 | " (log_var): Linear(in_features=128, out_features=32)\n",
55 | " )\n",
56 | " )\n",
57 | " (decoder): Decoder(\n",
58 | " (hidden): ModuleList(\n",
59 | " (0): Linear(in_features=32, out_features=128)\n",
60 | " (1): Linear(in_features=128, out_features=256)\n",
61 | " )\n",
62 | " (reconstruction): Linear(in_features=256, out_features=784)\n",
63 | " (output_activation): Sigmoid()\n",
64 | " )\n",
65 | ")"
66 | ]
67 | },
68 | "execution_count": 2,
69 | "metadata": {},
70 | "output_type": "execute_result"
71 | }
72 | ],
73 | "source": [
74 | "from models import VariationalAutoencoder\n",
75 | "from layers import GaussianSample\n",
76 | "model = VariationalAutoencoder([784, 32, [256, 128]])\n",
77 | "model"
78 | ]
79 | },
80 | {
81 | "cell_type": "markdown",
82 | "metadata": {},
83 | "source": [
84 | "Notice how the middle most layer consists of a `GaussianSample` layer, in which we turn the input digit into the parameters of a Normal distribution with parameters $\\mu$ and $\\sigma$. This allows us to use the *reparametrization trick* to sample from this distribution to introduce stochasticity into the network."
85 | ]
86 | },
87 | {
88 | "cell_type": "code",
89 | "execution_count": 3,
90 | "metadata": {},
91 | "outputs": [
92 | {
93 | "name": "stdout",
94 | "output_type": "stream",
95 | "text": [
96 | "sample -1.56 drawn from N(-0.38, 1.48)\n"
97 | ]
98 | }
99 | ],
100 | "source": [
101 | "from torch.autograd import Variable\n",
102 | "\n",
103 | "gaussian = GaussianSample(10, 1)\n",
104 | "z, mu, log_var = gaussian(Variable(torch.ones(1, 10)))\n",
105 | "\n",
106 | "print(f\"sample {float(z.data):.2f} drawn from N({float(mu.data):.2f}, {float(log_var.exp().data):.2f})\")"
107 | ]
108 | },
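{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch (not part of the original notebook): the trick spelled out.\n",
"# z = mu + sigma * epsilon with epsilon ~ N(0, I), so gradients flow\n",
"# through mu and log_var while the randomness stays in epsilon.\n",
"epsilon = Variable(torch.randn(mu.size()), requires_grad=False)\n",
"z_manual = mu + log_var.mul(0.5).exp() * epsilon\n",
"print(z_manual)"
]
},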
109 | {
110 | "cell_type": "markdown",
111 | "metadata": {},
112 | "source": [
113 | "## Training\n",
114 | "\n",
115 | "How do we go about training a variational autoencoder then? We want to model the data distribution $p(x)$, from here we can introduce a variational distribution $q(z|x)$ by multiplying and dividing by this distribution. Now we employ Jensen's inequality to move the logarithm inside the integral, we can do this because $\\log$ is concave and because $q(z|x)$ is a probability distribution. From here on we just rearrange and we see that a lower bound on the marginal probability of the data $p(x)$ is just an expectation over the likelihood of the data minus the KL-divergence between the variational distribution and a prior $p(z)$.\n",
116 | "\n",
117 | "\\begin{align}\n",
118 | "\\log p(x) &= \\log \\int p(x, z) \\ dz = \\log \\int q(z|x) \\frac{p(x, z)}{q(z|x)} \\ dz\\\\\n",
119 | " &\\geq \\int q(z|x) \\log \\frac{p(x, z)}{q(z|x)} \\ dz = \\int q(z|x) \\log p(x|z) + \\log \\frac{p(z)}{q(z|x)} \\ dz\\\\\n",
120 | " &= \\int q(z|x) \\log p(x|z) \\ dz + \\int q(z|x) \\log \\frac{p(z)}{q(z|x)} \\ dz\\\\\n",
121 | " &= \\mathbb{E}_{q(z|x)} [\\log p(x|z)] - KL(q(z|x)||p(z)) = \\mathcal{L}(x)\n",
122 | "\\end{align}\n",
123 | "\n",
124 | "To make things even more concrete, we show how we can go from this equation to an actual algorithm. Recall that the expectation is just an arithmetic mean, which can be approximated using Monte-Carlo samples. In fact for most applications we can do with just a single sample, even though this provides infinite variance.\n",
125 | "\n",
126 | "$$\\mathbb{E}_{q(z|x)} [\\log p(x|z)] = \\lim_{N \\to \\infty} \\frac{1}{N} \\sum_{i=1}^{N} \\log p(x_i|z_i) \\approx \\frac{1}{M} \\sum_{i=1}^{M} \\log p(x_i|z_i)$$\n",
127 | "\n",
128 | "As you can see the likelihood is just the log probability of the data given the latent variable, but the latent variable is itself derived from the data - we can just use the reconstruction error! In the MNIST case, it is most fitting to use the Bernoulli / binary cross entropy.\n",
129 | "\n",
130 | "Finally, the second term is the Kullback-Leibler divergence. It states that whatever distribution we learn over $q(z|x)$ can never be very far from the prior $p(z)$. This is both good and bad news. Ideally we want a reconstruction that is as good as possible, i.e. only relying on the likelihood, which will only occur if $q(z|x) = p(z)$. This will never happen in practice as the q-distribution will be very complex in order to produce latent variables that result convincing samples. On the plus side, the KL-term acts as a regularizer that pulls the distribution towards the prior, which is the whole reason why we can create samples.\n",
131 | "\n",
132 | "This term can either be computed analytically, or by sampling similarly to the way we did it in the expectation, which you can see in the printed doc string below."
133 | ]
134 | },
135 | {
136 | "cell_type": "code",
137 | "execution_count": 4,
138 | "metadata": {},
139 | "outputs": [
140 | {
141 | "name": "stdout",
142 | "output_type": "stream",
143 | "text": [
144 | "\n",
145 | " Computes the KL-divergence of\n",
146 | " some element z.\n",
147 | "\n",
148 | " KL(q||p) = -∫ q(z) log [ p(z) / q(z) ]\n",
149 | " = -E[log p(z) - log q(z)]\n",
150 | "\n",
151 | " :param z: sample from q-distribuion\n",
152 | " :param q_param: (mu, log_var) of the q-distribution\n",
153 | " :param p_param: (mu, log_var) of the p-distribution\n",
154 | " :return: KL(q||p)\n",
155 | " \n"
156 | ]
157 | }
158 | ],
159 | "source": [
160 | "print(model._kld.__doc__)"
161 | ]
162 | },
163 | {
164 | "cell_type": "code",
165 | "execution_count": 5,
166 | "metadata": {},
167 | "outputs": [],
168 | "source": [
169 | "from datautils import get_mnist\n",
170 | "\n",
171 | "_, train, validation = get_mnist(location=\"./\", batch_size=64)\n",
172 | "\n",
173 | "# We use this custom BCE function until PyTorch implements reduce=False\n",
174 | "def binary_cross_entropy(r, x):\n",
175 | " return -torch.sum(x * torch.log(r + 1e-8) + (1 - x) * torch.log(1 - r + 1e-8), dim=-1)\n",
176 | "\n",
177 | "optimizer = torch.optim.Adam(model.parameters(), lr=3e-4, betas=(0.9, 0.999))"
178 | ]
179 | },
180 | {
181 | "cell_type": "code",
182 | "execution_count": null,
183 | "metadata": {},
184 | "outputs": [],
185 | "source": [
186 | "for epoch in range(50):\n",
187 | " model.train()\n",
188 | " total_loss = 0\n",
189 | " for (u, _) in train:\n",
190 | " u = Variable(u)\n",
191 | "\n",
192 | " if cuda: u = u.cuda(device=0)\n",
193 | "\n",
194 | " reconstruction = model(u)\n",
195 | " \n",
196 | " likelihood = -binary_cross_entropy(reconstruction, u)\n",
197 | " elbo = likelihood - model.kl_divergence\n",
198 | " \n",
199 | " L = -torch.mean(elbo)\n",
200 | "\n",
201 | " L.backward()\n",
202 | " optimizer.step()\n",
203 | " optimizer.zero_grad()\n",
204 | "\n",
205 | " total_loss += L.data[0]\n",
206 | "\n",
207 | " m = len(train)\n",
208 | "\n",
209 | " if epoch % 10 == 0:\n",
210 | " print(f\"Epoch: {epoch}\\tL: {total_loss/m:.2f}\")"
211 | ]
212 | },
213 | {
214 | "cell_type": "markdown",
215 | "metadata": {},
216 | "source": [
217 | "## Sampling from the generative model\n",
218 | "\n",
219 | "Now that we have trained the network, we can begin to sample from it. We simply give it some random noise distributed according to the prior $p(z) = \\mathcal{N}(0, I)$ and send it through the decoder. This process generates a slew of samples that look like they come from the original distribution $p(x)$."
220 | ]
221 | },
222 | {
223 | "cell_type": "code",
224 | "execution_count": 8,
225 | "metadata": {},
226 | "outputs": [],
227 | "source": [
228 | "model.eval()\n",
229 | "x_mu = model.sample(Variable(torch.randn(16, 32)))"
230 | ]
231 | },
232 | {
233 | "cell_type": "code",
234 | "execution_count": null,
235 | "metadata": {},
236 | "outputs": [],
237 | "source": [
238 | "f, axarr = plt.subplots(1, 16, figsize=(18, 12))\n",
239 | "\n",
240 | "samples = x_mu.data.view(-1, 28, 28).numpy()\n",
241 | "\n",
242 | "for i, ax in enumerate(axarr.flat):\n",
243 | " ax.imshow(samples[i])\n",
244 | " ax.axis(\"off\")"
245 | ]
246 | },
247 | {
248 | "cell_type": "markdown",
249 | "metadata": {},
250 | "source": [
251 | "## Side note: Normalizing flows\n",
252 | "\n",
253 | "We can get a more expressive variational approximation by using flows. In the regular variational autoencoder we are limited to a diagonal Gaussian distribution as the output of our encoder, but we can do better. Introducing a full-rank covariance matrix would be an option, but we will quickly see that it becomes computationally infeasible.\n",
254 | "\n",
255 | "What we instead can do is to use change-of-variables to calculate the probability of a random sample under some transformation. In fact, if this transformation is made in a specific way, it is simple to calculate the probability of a sample under a series of $K$ transforms.\n",
256 | "\n",
257 | "$$\\ln q_K(z_K) = \\ln q_0(z_0) − \\sum_{k=1}^{K} \\ln \\bigg | \\det \\frac{\\partial f_k}{\\partial z_{k-1}} \\bigg |$$\n",
258 | "\n",
259 | "This is the essence of [Normalizing Flows (Rezende & Mohammed, 2015)](https://arxiv.org/abs/1505.05770). This process increases the expressability of the variational distribution and will likely give a better estimate of the marginal likelihood of the data $p(x)$. However, the parameters of each flow must be learned, meaning that this process will take a long time in order to perform well."
260 | ]
261 | },
262 | {
263 | "cell_type": "code",
264 | "execution_count": 6,
265 | "metadata": {},
266 | "outputs": [],
267 | "source": [
268 | "from layers import NormalizingFlows\n",
269 | "\n",
270 | "# Add a series of 16 normalizing flows to the model\n",
271 | "# You can do this to *all* of the models in the library.\n",
272 | "flow = NormalizingFlows(32, n_flows=16)\n",
273 | "model.add_flow(flow)"
274 | ]
275 | },
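{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch (not part of the original notebook) of the change-of-variables bookkeeping.\n",
"# The flow returns the transformed sample along with the summed log |det J_k|.\n",
"from inference import log_standard_gaussian\n",
"\n",
"z0 = Variable(torch.randn(5, 32))\n",
"z_k, log_det = flow(z0)\n",
"\n",
"# ln q_K(z_K) = ln q_0(z_0) - sum_k ln |det df_k / dz|\n",
"log_q_k = log_standard_gaussian(z0) - log_det\n",
"print(log_q_k)"
]
},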
276 | {
277 | "cell_type": "code",
278 | "execution_count": null,
279 | "metadata": {},
280 | "outputs": [],
281 | "source": [
282 | "for epoch in range(50):\n",
283 | " model.train()\n",
284 | " total_loss = 0\n",
285 | " for (u, _) in train:\n",
286 | " u = Variable(u)\n",
287 | "\n",
288 | " if cuda: u = u.cuda(device=0)\n",
289 | "\n",
290 | " reconstruction = model(u)\n",
291 | " \n",
292 | " likelihood = -binary_cross_entropy(reconstruction, u)\n",
293 | " elbo = likelihood - model.kl_divergence\n",
294 | " \n",
295 | " L = -torch.mean(elbo)\n",
296 | "\n",
297 | " L.backward()\n",
298 | " optimizer.step()\n",
299 | " optimizer.zero_grad()\n",
300 | "\n",
301 | " total_loss += L.data[0]\n",
302 | "\n",
303 | " m = len(train)\n",
304 | "\n",
305 | " if epoch % 10 == 0:\n",
306 | " print(f\"Epoch: {epoch}\\tL: {total_loss/m:.2f}\")"
307 | ]
308 | },
309 | {
310 | "cell_type": "markdown",
311 | "metadata": {},
312 | "source": [
313 | "## Additional tips\n",
314 | "\n",
315 | "If you want to work with a problem in which the data distribution $x$ is continuous, you can do so by changing the output activation of the decoder and using a different likelihood function, as shown below."
316 | ]
317 | },
318 | {
319 | "cell_type": "code",
320 | "execution_count": 8,
321 | "metadata": {},
322 | "outputs": [],
323 | "source": [
324 | "import torch.nn as nn\n",
325 | "\n",
326 | "model.decoder.output_activation = nn.Softmax()\n",
327 | "loss = nn.CrossEntropyLoss()"
328 | ]
329 | }
330 | ],
331 | "metadata": {
332 | "kernelspec": {
333 | "display_name": "Python 3",
334 | "language": "python",
335 | "name": "python3"
336 | },
337 | "language_info": {
338 | "codemirror_mode": {
339 | "name": "ipython",
340 | "version": 3
341 | },
342 | "file_extension": ".py",
343 | "mimetype": "text/x-python",
344 | "name": "python",
345 | "nbconvert_exporter": "python",
346 | "pygments_lexer": "ipython3",
347 | "version": "3.6.0"
348 | }
349 | },
350 | "nbformat": 4,
351 | "nbformat_minor": 2
352 | }
353 |
--------------------------------------------------------------------------------
/examples/notebooks/datautils.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import numpy as np
3 | import sys
4 | from urllib import request
5 | from torch.utils.data import Dataset
6 | sys.path.append("../semi-supervised")
7 | n_labels = 10
8 | cuda = torch.cuda.is_available()
9 |
10 |
11 | class SpriteDataset(Dataset):
12 | """
13 | A PyTorch wrapper for the dSprites dataset by
14 | Matthey et al. 2017. The dataset provides a 2D scene
15 | with a sprite under different transformations:
16 | * color
17 | * shape
18 | * scale
19 | * orientation
20 | * x-position
21 | * y-position
22 | """
23 | def __init__(self, transform=None):
24 | self.transform = transform
25 | url = "https://github.com/deepmind/dsprites-dataset/raw/master/dsprites_ndarray_co1sh3sc6or40x32y32_64x64.npz"
26 |
27 | try:
28 | self.dset = np.load("./dsprites.npz", encoding="bytes")["imgs"]
29 | except FileNotFoundError:
30 | request.urlretrieve(url, "./dsprites.npz")
31 | self.dset = np.load("./dsprites.npz", encoding="bytes")["imgs"]
32 |
33 | def __len__(self):
34 | return len(self.dset)
35 |
36 | def __getitem__(self, idx):
37 | sample = self.dset[idx]
38 |
39 | if self.transform:
40 | sample = self.transform(sample)
41 |
42 | return sample
43 |
44 |
45 | def get_mnist(location="./", batch_size=64, labels_per_class=100):
46 | from functools import reduce
47 | from operator import __or__
48 | from torch.utils.data.sampler import SubsetRandomSampler
49 | from torchvision.datasets import MNIST
50 | import torchvision.transforms as transforms
51 | from utils import onehot
52 |
53 | flatten_bernoulli = lambda x: transforms.ToTensor()(x).view(-1).bernoulli()
54 |
55 | mnist_train = MNIST(location, train=True, download=True,
56 | transform=flatten_bernoulli, target_transform=onehot(n_labels))
57 | mnist_valid = MNIST(location, train=False, download=True,
58 | transform=flatten_bernoulli, target_transform=onehot(n_labels))
59 |
60 | def get_sampler(labels, n=None):
61 | # Only choose digits in n_labels
62 | (indices,) = np.where(reduce(__or__, [labels == i for i in np.arange(n_labels)]))
63 |
64 | # Ensure uniform distribution of labels
65 | np.random.shuffle(indices)
66 | indices = np.hstack([list(filter(lambda idx: labels[idx] == i, indices))[:n] for i in range(n_labels)])
67 |
68 | indices = torch.from_numpy(indices)
69 | sampler = SubsetRandomSampler(indices)
70 | return sampler
71 |
72 | # Dataloaders for MNIST
73 | labelled = torch.utils.data.DataLoader(mnist_train, batch_size=batch_size, num_workers=2, pin_memory=cuda,
74 | sampler=get_sampler(mnist_train.train_labels.numpy(), labels_per_class))
75 | unlabelled = torch.utils.data.DataLoader(mnist_train, batch_size=batch_size, num_workers=2, pin_memory=cuda,
76 | sampler=get_sampler(mnist_train.train_labels.numpy()))
77 | validation = torch.utils.data.DataLoader(mnist_valid, batch_size=batch_size, num_workers=2, pin_memory=cuda,
78 | sampler=get_sampler(mnist_valid.test_labels.numpy()))
79 |
80 | return labelled, unlabelled, validation
81 |
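# Usage sketch: the loaders are returned in the order (labelled, unlabelled, validation),
# e.g. `labelled, unlabelled, validation = get_mnist(location="./", batch_size=64)`.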
--------------------------------------------------------------------------------
/semi-supervised/inference/__init__.py:
--------------------------------------------------------------------------------
1 | from .distributions import log_standard_gaussian, log_gaussian, log_standard_categorical
2 | from .variational import SVI, DeterministicWarmup, ImportanceWeightedSampler
--------------------------------------------------------------------------------
/semi-supervised/inference/distributions.py:
--------------------------------------------------------------------------------
1 | import math
2 | import torch
3 | import torch.nn.functional as F
4 |
5 |
6 | def log_standard_gaussian(x):
7 | """
8 | Evaluates the log pdf of a standard normal distribution at x.
9 |
10 | :param x: point to evaluate
11 | :return: log N(x|0,I)
12 | """
13 | return torch.sum(-0.5 * math.log(2 * math.pi) - x ** 2 / 2, dim=-1)
14 |
15 |
16 | def log_gaussian(x, mu, log_var):
17 | """
18 | Returns the log pdf of a normal distribution parametrised
19 | by mu and log_var evaluated at x.
20 |
21 | :param x: point to evaluate
22 | :param mu: mean of distribution
23 | :param log_var: log variance of distribution
24 |     :return: log N(x|µ,σ²)
25 | """
26 | log_pdf = - 0.5 * math.log(2 * math.pi) - log_var / 2 - (x - mu)**2 / (2 * torch.exp(log_var))
27 | return torch.sum(log_pdf, dim=-1)
28 |
29 |
30 | def log_standard_categorical(p):
31 | """
32 | Calculates the cross entropy between a (one-hot) categorical vector
33 | and a standard (uniform) categorical distribution.
34 |
35 | :param p: one-hot categorical distribution
36 |     :return: H(p, u); for a one-hot p over K classes this equals log K
37 | """
38 | # Uniform prior over y
39 | prior = F.softmax(torch.ones_like(p), dim=1)
40 | prior.requires_grad = False
41 |
42 | cross_entropy = -torch.sum(p * torch.log(prior + 1e-8), dim=1)
43 |
44 | return cross_entropy
--------------------------------------------------------------------------------
/semi-supervised/inference/variational.py:
--------------------------------------------------------------------------------
1 | from itertools import repeat
2 |
3 | import torch
4 | from torch import nn
5 | import torch.nn.functional as F
6 |
7 | from utils import log_sum_exp, enumerate_discrete
8 | from .distributions import log_standard_categorical
9 |
10 | class ImportanceWeightedSampler(object):
11 | """
12 | Importance weighted sampler [Burda 2015] to
13 | be used in conjunction with SVI.
14 | """
15 | def __init__(self, mc=1, iw=1):
16 | """
17 | Initialise a new sampler.
18 | :param mc: number of Monte Carlo samples
19 | :param iw: number of Importance Weighted samples
20 | """
21 | self.mc = mc
22 | self.iw = iw
23 |
24 | def resample(self, x):
25 | return x.repeat(self.mc * self.iw, 1)
26 |
27 | def __call__(self, elbo):
28 | elbo = elbo.view(self.mc, self.iw, -1)
29 | elbo = torch.mean(log_sum_exp(elbo, dim=1, sum_op=torch.mean), dim=0)
30 | return elbo.view(-1)
31 |
32 |
33 | class DeterministicWarmup(object):
34 | """
35 | Linear deterministic warm-up as described in
36 | [Sønderby 2016].
37 | """
38 | def __init__(self, n=100, t_max=1):
39 | self.t = 0
40 | self.t_max = t_max
41 | self.inc = 1/n
42 |
43 | def __iter__(self):
44 | return self
45 |
46 | def __next__(self):
47 | t = self.t + self.inc
48 |
49 | self.t = self.t_max if t > self.t_max else t
50 | return self.t
51 |
52 |
53 | class SVI(nn.Module):
54 | """
55 | Stochastic variational inference (SVI).
56 | """
57 | base_sampler = ImportanceWeightedSampler(mc=1, iw=1)
58 | def __init__(self, model, likelihood=F.binary_cross_entropy, beta=repeat(1), sampler=base_sampler):
59 | """
60 | Initialises a new SVI optimizer for semi-
61 | supervised learning.
62 | :param model: semi-supervised model to evaluate
63 | :param likelihood: p(x|y,z) for example BCE or MSE
64 |     :param beta: warm-up/scaling of KL-term
65 |     :param sampler: sampler for x and y, e.g. for Monte Carlo
66 | """
67 | super(SVI, self).__init__()
68 | self.model = model
69 | self.likelihood = likelihood
70 | self.sampler = sampler
71 | self.beta = beta
72 |
73 | def forward(self, x, y=None):
74 |         is_labelled = y is not None
75 |
76 | # Prepare for sampling
77 | xs, ys = (x, y)
78 |
79 | # Enumerate choices of label
80 | if not is_labelled:
81 | ys = enumerate_discrete(xs, self.model.y_dim)
82 | xs = xs.repeat(self.model.y_dim, 1)
83 |
84 | # Increase sampling dimension
85 | xs = self.sampler.resample(xs)
86 | ys = self.sampler.resample(ys)
87 |
88 | reconstruction = self.model(xs, ys)
89 |
90 | # p(x|y,z)
91 | likelihood = -self.likelihood(reconstruction, xs)
92 |
93 | # p(y)
94 | prior = -log_standard_categorical(ys)
95 |
96 | # Equivalent to -L(x, y)
97 | elbo = likelihood + prior - next(self.beta) * self.model.kl_divergence
98 | L = self.sampler(elbo)
99 |
100 | if is_labelled:
101 | return torch.mean(L)
102 |
103 | logits = self.model.classify(x)
104 |
105 |         L = L.view_as(logits.t()).t()  # reshape from (y_dim * batch) to (batch, y_dim)
106 |
107 | # Calculate entropy H(q(y|x)) and sum over all labels
108 | H = -torch.sum(torch.mul(logits, torch.log(logits + 1e-8)), dim=-1)
109 | L = torch.sum(torch.mul(logits, L), dim=-1)
110 |
111 | # Equivalent to -U(x)
112 | U = L + H
113 | return torch.mean(U)
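
# Usage sketch (not part of the original file): wrap a semi-supervised model in SVI
# and minimise the negative bound for labelled and unlabelled batches alike:
#
#   elbo = SVI(model, likelihood=binary_cross_entropy, beta=DeterministicWarmup(n=200))
#   J = -elbo(x, y)   # labelled bound, -L(x, y)
#   U = -elbo(x)      # unlabelled bound, -U(x)
#
# Here `binary_cross_entropy` stands for any per-example likelihood; the default
# is F.binary_cross_entropy.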
--------------------------------------------------------------------------------
/semi-supervised/layers/__init__.py:
--------------------------------------------------------------------------------
1 | from .stochastic import GaussianSample, GaussianMerge, GumbelSoftmax
2 | from .flow import NormalizingFlows, PlanarNormalizingFlow
--------------------------------------------------------------------------------
/semi-supervised/layers/flow.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torch.nn.functional as F
4 |
5 |
6 | class PlanarNormalizingFlow(nn.Module):
7 | """
8 | Planar normalizing flow [Rezende & Mohamed 2015].
9 | Provides a tighter bound on the ELBO by giving more expressive
10 | power to the approximate distribution, such as by introducing
11 | covariance between terms.
12 | """
13 | def __init__(self, in_features):
14 | super(PlanarNormalizingFlow, self).__init__()
15 | self.u = nn.Parameter(torch.randn(in_features))
16 | self.w = nn.Parameter(torch.randn(in_features))
17 | self.b = nn.Parameter(torch.ones(1))
18 |
19 | def forward(self, z):
20 |         # Construct uhat from u so that dot(w, uhat) >= -1, keeping the flow invertible
21 | uw = torch.dot(self.u, self.w)
22 | muw = -1 + F.softplus(uw)
23 | uhat = self.u + (muw - uw) * torch.transpose(self.w, 0, -1) / torch.sum(self.w ** 2)
24 |
25 | # Equation 21 - Transform z
26 | zwb = torch.mv(z, self.w) + self.b
27 |
28 | f_z = z + (uhat.view(1, -1) * F.tanh(zwb).view(-1, 1))
29 |
30 | # Compute the Jacobian using the fact that
31 | # tanh(x) dx = 1 - tanh(x)**2
32 | psi = (1 - F.tanh(zwb)**2).view(-1, 1) * self.w.view(1, -1)
33 | psi_u = torch.mv(psi, uhat)
34 |
35 | # Return the transformed output along
36 |         # with the log determinant of J
37 | logdet_jacobian = torch.log(torch.abs(1 + psi_u) + 1e-8)
38 |
39 | return f_z, logdet_jacobian
40 |
41 |
42 | class NormalizingFlows(nn.Module):
43 | """
44 | Presents a sequence of normalizing flows as a torch.nn.Module.
45 | """
46 | def __init__(self, in_features, flow_type=PlanarNormalizingFlow, n_flows=1):
47 | super(NormalizingFlows, self).__init__()
48 | self.flows = nn.ModuleList([flow_type(in_features) for _ in range(n_flows)])
49 |
50 | def forward(self, z):
51 | log_det_jacobian = []
52 |
53 | for flow in self.flows:
54 | z, j = flow(z)
55 | log_det_jacobian.append(j)
56 |
57 | return z, sum(log_det_jacobian)
--------------------------------------------------------------------------------
/semi-supervised/layers/stochastic.py:
--------------------------------------------------------------------------------
1 | import torch
2 | from torch.autograd import Variable
3 | import torch.nn as nn
4 | import torch.nn.functional as F
5 |
6 |
7 | class Stochastic(nn.Module):
8 | """
9 | Base stochastic layer that uses the
10 | reparametrization trick [Kingma 2013]
11 | to draw a sample from a distribution
12 | parametrised by mu and log_var.
13 | """
14 | def reparametrize(self, mu, log_var):
15 | epsilon = Variable(torch.randn(mu.size()), requires_grad=False)
16 |
17 | if mu.is_cuda:
18 | epsilon = epsilon.cuda()
19 |
20 | # log_std = 0.5 * log_var
21 | # std = exp(log_std)
22 | std = log_var.mul(0.5).exp_()
23 |
24 | # z = std * epsilon + mu
25 | z = mu.addcmul(std, epsilon)
26 |
27 | return z
28 |
29 | class GaussianSample(Stochastic):
30 | """
31 | Layer that represents a sample from a
32 | Gaussian distribution.
33 | """
34 | def __init__(self, in_features, out_features):
35 | super(GaussianSample, self).__init__()
36 | self.in_features = in_features
37 | self.out_features = out_features
38 |
39 | self.mu = nn.Linear(in_features, out_features)
40 | self.log_var = nn.Linear(in_features, out_features)
41 |
42 | def forward(self, x):
43 | mu = self.mu(x)
44 | log_var = F.softplus(self.log_var(x))
45 |
46 | return self.reparametrize(mu, log_var), mu, log_var
47 |
48 |
49 | class GaussianMerge(GaussianSample):
50 | """
51 | Precision weighted merging of two Gaussian
52 | distributions.
53 | Merges information from z into the given
54 | mean and log variance and produces
55 | a sample from this new distribution.
56 | """
57 | def __init__(self, in_features, out_features):
58 | super(GaussianMerge, self).__init__(in_features, out_features)
59 |
60 | def forward(self, z, mu1, log_var1):
61 | # Calculate precision of each distribution
62 | # (inverse variance)
63 | mu2 = self.mu(z)
64 | log_var2 = F.softplus(self.log_var(z))
65 | precision1, precision2 = (1/torch.exp(log_var1), 1/torch.exp(log_var2))
66 |
67 | # Merge distributions into a single new
68 | # distribution
69 | mu = ((mu1 * precision1) + (mu2 * precision2)) / (precision1 + precision2)
70 |
71 | var = 1 / (precision1 + precision2)
72 | log_var = torch.log(var + 1e-8)
73 |
74 | return self.reparametrize(mu, log_var), mu, log_var
75 |
76 |
77 | class GumbelSoftmax(Stochastic):
78 | """
79 | Layer that represents a sample from a categorical
80 | distribution. Enables sampling and stochastic
81 | backpropagation using the Gumbel-Softmax trick.
82 | """
83 | def __init__(self, in_features, out_features, n_distributions):
84 | super(GumbelSoftmax, self).__init__()
85 | self.in_features = in_features
86 | self.out_features = out_features
87 | self.n_distributions = n_distributions
88 |
89 | self.logits = nn.Linear(in_features, n_distributions*out_features)
90 |
91 | def forward(self, x, tau=1.0):
92 | logits = self.logits(x).view(-1, self.n_distributions)
93 |
94 | # variational distribution over categories
95 |         softmax = F.softmax(logits, dim=-1)  # q_y
96 | sample = self.reparametrize(logits, tau).view(-1, self.n_distributions, self.out_features)
97 | sample = torch.mean(sample, dim=1)
98 |
99 | return sample, softmax
100 |
101 | def reparametrize(self, logits, tau=1.0):
102 | epsilon = Variable(torch.rand(logits.size()), requires_grad=False)
103 |
104 | if logits.is_cuda:
105 | epsilon = epsilon.cuda()
106 |
107 | # Gumbel distributed noise
108 | gumbel = -torch.log(-torch.log(epsilon+1e-8)+1e-8)
109 | # Softmax as a continuous approximation of argmax
110 | y = F.softmax((logits + gumbel)/tau, dim=1)
111 | return y
112 |
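# Usage sketch (hypothetical dimensions): 4 categorical distributions with 32
# categories each, inferred from a 64-dimensional input.
#   layer = GumbelSoftmax(64, 32, 4)
#   sample, q_y = layer(Variable(torch.randn(8, 64)), tau=0.5)
#   sample.size()  # (8, 32), averaged over the 4 distributions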
--------------------------------------------------------------------------------
/semi-supervised/models/__init__.py:
--------------------------------------------------------------------------------
1 | from .vae import VariationalAutoencoder, LadderVariationalAutoencoder, GumbelAutoencoder
2 | from .dgm import DeepGenerativeModel, StackedDeepGenerativeModel, AuxiliaryDeepGenerativeModel,\
3 | LadderDeepGenerativeModel
--------------------------------------------------------------------------------
/semi-supervised/models/dgm.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torch.nn.functional as F
4 | from torch.nn import init
5 |
6 | from .vae import VariationalAutoencoder
7 | from .vae import Encoder, Decoder, LadderEncoder, LadderDecoder
8 |
9 |
10 | class Classifier(nn.Module):
11 | def __init__(self, dims):
12 | """
13 | Single hidden layer classifier
14 | with softmax output.
15 | """
16 | super(Classifier, self).__init__()
17 | [x_dim, h_dim, y_dim] = dims
18 | self.dense = nn.Linear(x_dim, h_dim)
19 | self.logits = nn.Linear(h_dim, y_dim)
20 |
21 | def forward(self, x):
22 | x = F.relu(self.dense(x))
23 | x = F.softmax(self.logits(x), dim=-1)
24 | return x
25 |
26 |
27 | class DeepGenerativeModel(VariationalAutoencoder):
28 | def __init__(self, dims):
29 | """
30 | M2 code replication from the paper
31 | 'Semi-Supervised Learning with Deep Generative Models'
32 | (Kingma 2014) in PyTorch.
33 |
34 | The "Generative semi-supervised model" is a probabilistic
35 | model that incorporates label information in both
36 | inference and generation.
37 |
38 | Initialise a new generative model
39 | :param dims: dimensions of x, y, z and hidden layers.
40 | """
41 | [x_dim, self.y_dim, z_dim, h_dim] = dims
42 | super(DeepGenerativeModel, self).__init__([x_dim, z_dim, h_dim])
43 |
44 | self.encoder = Encoder([x_dim + self.y_dim, h_dim, z_dim])
45 | self.decoder = Decoder([z_dim + self.y_dim, list(reversed(h_dim)), x_dim])
46 | self.classifier = Classifier([x_dim, h_dim[0], self.y_dim])
47 |
48 | for m in self.modules():
49 | if isinstance(m, nn.Linear):
50 | init.xavier_normal(m.weight.data)
51 | if m.bias is not None:
52 | m.bias.data.zero_()
53 |
54 | def forward(self, x, y):
55 | # Add label and data and generate latent variable
56 | z, z_mu, z_log_var = self.encoder(torch.cat([x, y], dim=1))
57 |
58 | self.kl_divergence = self._kld(z, (z_mu, z_log_var))
59 |
60 | # Reconstruct data point from latent data and label
61 | x_mu = self.decoder(torch.cat([z, y], dim=1))
62 |
63 | return x_mu
64 |
65 | def classify(self, x):
66 | logits = self.classifier(x)
67 | return logits
68 |
69 | def sample(self, z, y):
70 | """
71 | Samples from the Decoder to generate an x.
72 | :param z: latent normal variable
73 | :param y: label (one-hot encoded)
74 | :return: x
75 | """
76 | y = y.float()
77 | x = self.decoder(torch.cat([z, y], dim=1))
78 | return x
79 |
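# Usage sketch (hypothetical MNIST-style dimensions): x_dim=784, y_dim=10, z_dim=32
# and two hidden layers.
#   model = DeepGenerativeModel([784, 10, 32, [256, 128]])
#   x_reconstruction = model(x, y)  # y is a one-hot Variable of shape (batch, 10)
#   y_logits = model.classify(x)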
80 |
81 | class StackedDeepGenerativeModel(DeepGenerativeModel):
82 | def __init__(self, dims, features):
83 | """
84 | M1+M2 model as described in [Kingma 2014].
85 |
86 | Initialise a new stacked generative model
87 | :param dims: dimensions of x, y, z and hidden layers
88 | :param features: a pretrained M1 model of class `VariationalAutoencoder`
89 | trained on the same dataset.
90 | """
91 | [x_dim, y_dim, z_dim, h_dim] = dims
92 | super(StackedDeepGenerativeModel, self).__init__([features.z_dim, y_dim, z_dim, h_dim])
93 |
94 | # Be sure to reconstruct with the same dimensions
95 | in_features = self.decoder.reconstruction.in_features
96 | self.decoder.reconstruction = nn.Linear(in_features, x_dim)
97 |
98 | # Make vae feature model untrainable by freezing parameters
99 | self.features = features
100 | self.features.train(False)
101 |
102 | for param in self.features.parameters():
103 | param.requires_grad = False
104 |
105 | def forward(self, x, y):
106 | # Sample a new latent x from the M1 model
107 | x_sample, _, _ = self.features.encoder(x)
108 |
109 | # Use the sample as new input to M2
110 | return super(StackedDeepGenerativeModel, self).forward(x_sample, y)
111 |
112 | def classify(self, x):
113 | _, x, _ = self.features.encoder(x)
114 | logits = self.classifier(x)
115 | return logits
116 |
117 | class AuxiliaryDeepGenerativeModel(DeepGenerativeModel):
118 | def __init__(self, dims):
119 | """
120 | Auxiliary Deep Generative Models [Maaløe 2016]
121 | code replication. The ADGM introduces an additional
122 | latent variable 'a', which enables the model to fit
123 | more complex variational distributions.
124 |
125 | :param dims: dimensions of x, y, z, a and hidden layers.
126 | """
127 | [x_dim, y_dim, z_dim, a_dim, h_dim] = dims
128 | super(AuxiliaryDeepGenerativeModel, self).__init__([x_dim, y_dim, z_dim, h_dim])
129 |
130 | self.aux_encoder = Encoder([x_dim, h_dim, a_dim])
131 |         self.aux_decoder = Encoder([x_dim + z_dim + y_dim, list(reversed(h_dim)), a_dim])  # parameterises p(a|x,y,z)
132 |
133 | self.classifier = Classifier([x_dim + a_dim, h_dim[0], y_dim])
134 |
135 | self.encoder = Encoder([a_dim + y_dim + x_dim, h_dim, z_dim])
136 | self.decoder = Decoder([y_dim + z_dim, list(reversed(h_dim)), x_dim])
137 |
138 | def classify(self, x):
139 | # Auxiliary inference q(a|x)
140 | a, a_mu, a_log_var = self.aux_encoder(x)
141 |
142 | # Classification q(y|a,x)
143 | logits = self.classifier(torch.cat([x, a], dim=1))
144 | return logits
145 |
146 | def forward(self, x, y):
147 | """
148 | Forward through the model
149 | :param x: features
150 | :param y: labels
151 | :return: reconstruction
152 | """
153 | # Auxiliary inference q(a|x)
154 | q_a, q_a_mu, q_a_log_var = self.aux_encoder(x)
155 |
156 | # Latent inference q(z|a,y,x)
157 | z, z_mu, z_log_var = self.encoder(torch.cat([x, y, q_a], dim=1))
158 |
159 | # Generative p(x|z,y)
160 | x_mu = self.decoder(torch.cat([z, y], dim=1))
161 |
162 | # Generative p(a|z,y,x)
163 | p_a, p_a_mu, p_a_log_var = self.aux_decoder(torch.cat([x, y, z], dim=1))
164 |
165 | a_kl = self._kld(q_a, (q_a_mu, q_a_log_var), (p_a_mu, p_a_log_var))
166 | z_kl = self._kld(z, (z_mu, z_log_var))
167 |
168 | self.kl_divergence = a_kl + z_kl
169 |
170 | return x_mu
171 |
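# A minimal instantiation sketch; the auxiliary dimension a_dim is an
# extra hyperparameter on top of the M2 dimensions, chosen here
# arbitrarily. After forward(), kl_divergence holds both KL terms:
#
#   model = AuxiliaryDeepGenerativeModel([784, 10, 32, 32, [256, 128]])
#   x_mu = model(x, y)
#   kl = model.kl_divergence          # KL for z plus KL for a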
172 |
173 | class LadderDeepGenerativeModel(DeepGenerativeModel):
174 | def __init__(self, dims):
175 | """
176 |         Ladder version of the Deep Generative Model.
177 |         Uses a hierarchy of stochastic layers that is
178 |         trained end-to-end and tends to yield well-disentangled
179 |         representations.
180 |
181 | :param dims: dimensions of x, y, z layers and h layers
182 | note that len(z) == len(h).
183 | """
184 | [x_dim, y_dim, z_dim, h_dim] = dims
185 | super(LadderDeepGenerativeModel, self).__init__([x_dim, y_dim, z_dim[0], h_dim])
186 |
187 | neurons = [x_dim, *h_dim]
188 | encoder_layers = [LadderEncoder([neurons[i - 1], neurons[i], z_dim[i - 1]]) for i in range(1, len(neurons))]
189 |
190 | e = encoder_layers[-1]
191 | encoder_layers[-1] = LadderEncoder([e.in_features + y_dim, e.out_features, e.z_dim])
192 |
193 | decoder_layers = [LadderDecoder([z_dim[i - 1], h_dim[i - 1], z_dim[i]]) for i in range(1, len(h_dim))][::-1]
194 |
195 | self.classifier = Classifier([x_dim, h_dim[0], y_dim])
196 |
197 | self.encoder = nn.ModuleList(encoder_layers)
198 | self.decoder = nn.ModuleList(decoder_layers)
199 | self.reconstruction = Decoder([z_dim[0]+y_dim, h_dim, x_dim])
200 |
201 | for m in self.modules():
202 | if isinstance(m, nn.Linear):
203 | init.xavier_normal(m.weight.data)
204 | if m.bias is not None:
205 | m.bias.data.zero_()
206 |
207 | def forward(self, x, y):
208 | # Gather latent representation
209 | # from encoders along with final z.
210 | latents = []
211 | for i, encoder in enumerate(self.encoder):
212 | if i == len(self.encoder)-1:
213 | x, (z, mu, log_var) = encoder(torch.cat([x, y], dim=1))
214 | else:
215 | x, (z, mu, log_var) = encoder(x)
216 | latents.append((mu, log_var))
217 |
218 | latents = list(reversed(latents))
219 |
220 | self.kl_divergence = 0
221 |         for i, decoder in enumerate([-1, *self.decoder]):  # -1 pads for the top layer
222 |             # At the top (i == 0) there is no decoder;
223 |             # the KL is taken against the prior p(z).
224 | l_mu, l_log_var = latents[i]
225 | if i == 0:
226 | self.kl_divergence += self._kld(z, (l_mu, l_log_var))
227 |
228 |             # Perform downward merge of information.
229 | else:
230 | z, kl = decoder(z, l_mu, l_log_var)
231 | self.kl_divergence += self._kld(*kl)
232 |
233 | x_mu = self.reconstruction(torch.cat([z, y], dim=1))
234 | return x_mu
235 |
236 | def sample(self, z, y):
237 |         for decoder in self.decoder:
238 | z = decoder(z)
239 | return self.reconstruction(torch.cat([z, y], dim=1))
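
# A minimal instantiation sketch; z_dim and h_dim must have the same
# length, one entry per stochastic layer in the ladder:
#
#   model = LadderDeepGenerativeModel([784, 10, [64, 32, 16], [256, 128, 64]])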
--------------------------------------------------------------------------------
/semi-supervised/models/vae.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torch.nn.functional as F
4 | from torch.autograd import Variable
5 | from torch.nn import init
6 |
7 | from layers import GaussianSample, GaussianMerge, GumbelSoftmax
8 | from inference import log_gaussian, log_standard_gaussian
9 |
10 |
11 | class Perceptron(nn.Module):
12 | def __init__(self, dims, activation_fn=F.relu, output_activation=None):
13 | super(Perceptron, self).__init__()
14 | self.dims = dims
15 | self.activation_fn = activation_fn
16 | self.output_activation = output_activation
17 |
18 |         self.layers = nn.ModuleList([nn.Linear(a, b) for a, b in zip(dims, dims[1:])])
19 |
20 | def forward(self, x):
21 | for i, layer in enumerate(self.layers):
22 | x = layer(x)
23 | if i == len(self.layers)-1 and self.output_activation is not None:
24 | x = self.output_activation(x)
25 | else:
26 | x = self.activation_fn(x)
27 |
28 | return x
29 |
30 |
31 |
32 | class Encoder(nn.Module):
33 | def __init__(self, dims, sample_layer=GaussianSample):
34 | """
35 | Inference network
36 |
37 | Attempts to infer the probability distribution
38 | p(z|x) from the data by fitting a variational
39 | distribution q_φ(z|x). Returns the two parameters
40 | of the distribution (µ, log σ²).
41 |
42 | :param dims: dimensions of the networks
43 | given by the number of neurons on the form
44 | [input_dim, [hidden_dims], latent_dim].
45 | """
46 | super(Encoder, self).__init__()
47 |
48 | [x_dim, h_dim, z_dim] = dims
49 | neurons = [x_dim, *h_dim]
50 | linear_layers = [nn.Linear(neurons[i-1], neurons[i]) for i in range(1, len(neurons))]
51 |
52 | self.hidden = nn.ModuleList(linear_layers)
53 | self.sample = sample_layer(h_dim[-1], z_dim)
54 |
55 | def forward(self, x):
56 | for layer in self.hidden:
57 | x = F.relu(layer(x))
58 | return self.sample(x)
59 |
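# A shape sketch, assuming the default GaussianSample layer, which
# applies the reparametrisation trick and returns (z, mu, log_var):
#
#   encoder = Encoder([784, [256, 128], 32])
#   z, mu, log_var = encoder(Variable(torch.randn(16, 784)))
#   # all three outputs have shape (16, 32)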
60 |
61 | class Decoder(nn.Module):
62 | def __init__(self, dims):
63 | """
64 | Generative network
65 |
66 | Generates samples from the original distribution
67 | p(x) by transforming a latent representation, e.g.
68 | by finding p_θ(x|z).
69 |
70 | :param dims: dimensions of the networks
71 | given by the number of neurons on the form
72 | [latent_dim, [hidden_dims], input_dim].
73 | """
74 | super(Decoder, self).__init__()
75 |
76 | [z_dim, h_dim, x_dim] = dims
77 |
78 | neurons = [z_dim, *h_dim]
79 | linear_layers = [nn.Linear(neurons[i-1], neurons[i]) for i in range(1, len(neurons))]
80 | self.hidden = nn.ModuleList(linear_layers)
81 |
82 | self.reconstruction = nn.Linear(h_dim[-1], x_dim)
83 |
84 | self.output_activation = nn.Sigmoid()
85 |
86 | def forward(self, x):
87 | for layer in self.hidden:
88 | x = F.relu(layer(x))
89 | return self.output_activation(self.reconstruction(x))
90 |
91 |
92 | class VariationalAutoencoder(nn.Module):
93 | def __init__(self, dims):
94 | """
95 | Variational Autoencoder [Kingma 2013] model
96 | consisting of an encoder/decoder pair for which
97 | a variational distribution is fitted to the
98 | encoder. Also known as the M1 model in [Kingma 2014].
99 |
100 | :param dims: x, z and hidden dimensions of the networks
101 | """
102 | super(VariationalAutoencoder, self).__init__()
103 |
104 | [x_dim, z_dim, h_dim] = dims
105 | self.z_dim = z_dim
106 | self.flow = None
107 |
108 | self.encoder = Encoder([x_dim, h_dim, z_dim])
109 | self.decoder = Decoder([z_dim, list(reversed(h_dim)), x_dim])
110 | self.kl_divergence = 0
111 |
112 | for m in self.modules():
113 | if isinstance(m, nn.Linear):
114 | init.xavier_normal(m.weight.data)
115 | if m.bias is not None:
116 | m.bias.data.zero_()
117 |
118 | def _kld(self, z, q_param, p_param=None):
119 | """
120 | Computes the KL-divergence of
121 | some element z.
122 |
123 | KL(q||p) = -∫ q(z) log [ p(z) / q(z) ]
124 | = -E[log p(z) - log q(z)]
125 |
126 |         :param z: sample from q-distribution
127 | :param q_param: (mu, log_var) of the q-distribution
128 | :param p_param: (mu, log_var) of the p-distribution
129 | :return: KL(q||p)
130 | """
131 | (mu, log_var) = q_param
132 |
133 | if self.flow is not None:
134 | f_z, log_det_z = self.flow(z)
135 | qz = log_gaussian(z, mu, log_var) - sum(log_det_z)
136 | z = f_z
137 | else:
138 | qz = log_gaussian(z, mu, log_var)
139 |
140 | if p_param is None:
141 | pz = log_standard_gaussian(z)
142 | else:
143 | (mu, log_var) = p_param
144 | pz = log_gaussian(z, mu, log_var)
145 |
146 | kl = qz - pz
147 |
148 | return kl
149 |
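    # For a Gaussian q and a standard-normal prior, the same divergence
    # has the closed form
    #   KL(q||p) = -0.5 * sum(1 + log_var - mu**2 - exp(log_var)),
    # which the single-sample estimate above matches in expectation.
    # The Monte Carlo form is used here so that normalizing flows and
    # conditional priors are handled by a single code path.
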
150 | def add_flow(self, flow):
151 | self.flow = flow
152 |
153 | def forward(self, x, y=None):
154 | """
155 | Runs a data point through the model in order
156 | to provide its reconstruction and q distribution
157 | parameters.
158 |
159 | :param x: input data
160 | :return: reconstructed input
161 | """
162 | z, z_mu, z_log_var = self.encoder(x)
163 |
164 | self.kl_divergence = self._kld(z, (z_mu, z_log_var))
165 |
166 | x_mu = self.decoder(z)
167 |
168 | return x_mu
169 |
170 | def sample(self, z):
171 | """
172 | Given z ~ N(0, I) generates a sample from
173 | the learned distribution based on p_θ(x|z).
174 | :param z: (torch.autograd.Variable) Random normal variable
175 | :return: (torch.autograd.Variable) generated sample
176 | """
177 | return self.decoder(z)
178 |
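# A minimal training-step sketch, assuming binarised MNIST inputs and
# the Variable-based API used throughout this file:
#
#   model = VariationalAutoencoder([784, 32, [256, 128]])
#   optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)
#
#   x = Variable(torch.bernoulli(torch.rand(16, 784)))
#   x_mu = model(x)
#   likelihood = -F.binary_cross_entropy(x_mu, x, size_average=False)
#   elbo = likelihood - torch.sum(model.kl_divergence)
#
#   optimizer.zero_grad()
#   (-elbo).backward()
#   optimizer.step()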
179 |
180 | class GumbelAutoencoder(nn.Module):
181 | def __init__(self, dims, n_samples=100):
182 | super(GumbelAutoencoder, self).__init__()
183 |
184 | [x_dim, z_dim, h_dim] = dims
185 | self.z_dim = z_dim
186 | self.n_samples = n_samples
187 |
188 | self.encoder = Perceptron([x_dim, *h_dim])
189 | self.sampler = GumbelSoftmax(h_dim[-1], z_dim, n_samples)
190 | self.decoder = Perceptron([z_dim, *reversed(h_dim), x_dim], output_activation=F.sigmoid)
191 |
192 | self.kl_divergence = 0
193 |
194 | for m in self.modules():
195 | if isinstance(m, nn.Linear):
196 | init.xavier_normal(m.weight.data)
197 | if m.bias is not None:
198 | m.bias.data.zero_()
199 |
200 | def _kld(self, qz):
201 | k = Variable(torch.FloatTensor([self.z_dim]), requires_grad=False)
202 | kl = qz * (torch.log(qz + 1e-8) - torch.log(1.0/k))
203 | kl = kl.view(-1, self.n_samples, self.z_dim)
204 | return torch.sum(torch.sum(kl, dim=1), dim=1)
205 |
206 | def forward(self, x, y=None, tau=1):
207 | x = self.encoder(x)
208 |
209 | sample, qz = self.sampler(x, tau)
210 | self.kl_divergence = self._kld(qz)
211 |
212 | x_mu = self.decoder(sample)
213 |
214 | return x_mu
215 |
216 | def sample(self, z):
217 | return self.decoder(z)
218 |
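# A minimal sketch of the temperature annealing the Gumbel-Softmax
# estimator relies on: start with a high tau (soft, low-variance
# gradients) and decay towards a small value (near-discrete samples).
# The schedule below follows [Jang 2017] and assumes `import math`
# and a global training-step counter `step`:
#
#   tau = max(0.5, math.exp(-3e-5 * step))
#   x_mu = model(x, tau=tau)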
219 |
220 | class LadderEncoder(nn.Module):
221 | def __init__(self, dims):
222 | """
223 | The ladder encoder differs from the standard encoder
224 | by using batch-normalization and LReLU activation.
225 |         It also returns the transformed input x.
226 |
227 | :param dims: dimensions [input_dim, [hidden_dims], [latent_dims]].
228 | """
229 | super(LadderEncoder, self).__init__()
230 | [x_dim, h_dim, self.z_dim] = dims
231 | self.in_features = x_dim
232 | self.out_features = h_dim
233 |
234 | self.linear = nn.Linear(x_dim, h_dim)
235 | self.batchnorm = nn.BatchNorm1d(h_dim)
236 | self.sample = GaussianSample(h_dim, self.z_dim)
237 |
238 | def forward(self, x):
239 | x = self.linear(x)
240 | x = F.leaky_relu(self.batchnorm(x), 0.1)
241 | return x, self.sample(x)
242 |
243 |
244 | class LadderDecoder(nn.Module):
245 | def __init__(self, dims):
246 | """
247 |         The ladder decoder differs from the standard decoder
248 |         by using batch-normalization and LReLU activation. It also
249 |         merges the encoder distribution with its own via GaussianMerge.
250 |
251 | :param dims: dimensions of the networks
252 | given by the number of neurons on the form
253 | [latent_dim, [hidden_dims], input_dim].
254 | """
255 | super(LadderDecoder, self).__init__()
256 |
257 | [self.z_dim, h_dim, x_dim] = dims
258 |
259 | self.linear1 = nn.Linear(x_dim, h_dim)
260 | self.batchnorm1 = nn.BatchNorm1d(h_dim)
261 | self.merge = GaussianMerge(h_dim, self.z_dim)
262 |
263 | self.linear2 = nn.Linear(x_dim, h_dim)
264 | self.batchnorm2 = nn.BatchNorm1d(h_dim)
265 | self.sample = GaussianSample(h_dim, self.z_dim)
266 |
267 | def forward(self, x, l_mu=None, l_log_var=None):
268 | if l_mu is not None:
269 | # Sample from this encoder layer and merge
270 | z = self.linear1(x)
271 | z = F.leaky_relu(self.batchnorm1(z), 0.1)
272 | q_z, q_mu, q_log_var = self.merge(z, l_mu, l_log_var)
273 |
274 | # Sample from the decoder and send forward
275 | z = self.linear2(x)
276 | z = F.leaky_relu(self.batchnorm2(z), 0.1)
277 | z, p_mu, p_log_var = self.sample(z)
278 |
279 | if l_mu is None:
280 | return z
281 |
282 | return z, (q_z, (q_mu, q_log_var), (p_mu, p_log_var))
283 |
284 |
285 | class LadderVariationalAutoencoder(VariationalAutoencoder):
286 | def __init__(self, dims):
287 | """
288 | Ladder Variational Autoencoder as described by
289 | [Sønderby 2016]. Adds several stochastic
290 | layers to improve the log-likelihood estimate.
291 |
292 | :param dims: x, z and hidden dimensions of the networks
293 | """
294 | [x_dim, z_dim, h_dim] = dims
295 | super(LadderVariationalAutoencoder, self).__init__([x_dim, z_dim[0], h_dim])
296 |
297 | neurons = [x_dim, *h_dim]
298 | encoder_layers = [LadderEncoder([neurons[i - 1], neurons[i], z_dim[i - 1]]) for i in range(1, len(neurons))]
299 | decoder_layers = [LadderDecoder([z_dim[i - 1], h_dim[i - 1], z_dim[i]]) for i in range(1, len(h_dim))][::-1]
300 |
301 | self.encoder = nn.ModuleList(encoder_layers)
302 | self.decoder = nn.ModuleList(decoder_layers)
303 | self.reconstruction = Decoder([z_dim[0], h_dim, x_dim])
304 |
305 | for m in self.modules():
306 | if isinstance(m, nn.Linear):
307 | init.xavier_normal(m.weight.data)
308 | if m.bias is not None:
309 | m.bias.data.zero_()
310 |
311 | def forward(self, x):
312 | # Gather latent representation
313 | # from encoders along with final z.
314 | latents = []
315 | for encoder in self.encoder:
316 | x, (z, mu, log_var) = encoder(x)
317 | latents.append((mu, log_var))
318 |
319 | latents = list(reversed(latents))
320 |
321 | self.kl_divergence = 0
322 |         for i, decoder in enumerate([-1, *self.decoder]):  # -1 pads for the top layer
323 |             # At the top (i == 0) there is no decoder;
324 |             # the KL is taken against the prior p(z).
325 | l_mu, l_log_var = latents[i]
326 | if i == 0:
327 | self.kl_divergence += self._kld(z, (l_mu, l_log_var))
328 |
329 |             # Perform downward merge of information.
330 | else:
331 | z, kl = decoder(z, l_mu, l_log_var)
332 | self.kl_divergence += self._kld(*kl)
333 |
334 | x_mu = self.reconstruction(z)
335 | return x_mu
336 |
337 | def sample(self, z):
338 | for decoder in self.decoder:
339 | z = decoder(z)
340 | return self.reconstruction(z)
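
# A minimal instantiation sketch; z_dim and h_dim must have the same
# length, with dimensions typically shrinking towards the top:
#
#   model = LadderVariationalAutoencoder([784, [64, 32, 16], [256, 128, 64]])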
--------------------------------------------------------------------------------
/semi-supervised/utils.py:
--------------------------------------------------------------------------------
1 | import torch
2 | from torch.autograd import Variable
3 |
4 |
5 | def enumerate_discrete(x, y_dim):
6 | """
7 |     Generates a `torch.Tensor` of size (batch_size * y_dim) x y_dim,
8 |     enumerating each of the y_dim one-hot labels once per batch element.
9 |
10 |     Example: enumerate_discrete(x, 3) for x with batch size 2
11 |              #=> [[1,0,0], [1,0,0], [0,1,0], [0,1,0], [0,0,1], [0,0,1]]
12 |     :param x: tensor with batch size to mimic
13 |     :param y_dim: number of total labels
14 |     :return: variable of enumerated one-hot labels
15 | """
16 | def batch(batch_size, label):
17 | labels = (torch.ones(batch_size, 1) * label).type(torch.LongTensor)
18 | y = torch.zeros((batch_size, y_dim))
19 | y.scatter_(1, labels, 1)
20 | return y.type(torch.LongTensor)
21 |
22 | batch_size = x.size(0)
23 | generated = torch.cat([batch(batch_size, i) for i in range(y_dim)])
24 |
25 | if x.is_cuda:
26 | generated = generated.cuda()
27 |
28 | return Variable(generated.float())
29 |
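# A sketch of the typical use: marginalising over all labels for
# unlabelled data by pairing every input with every possible class.
# `model`, `x` and `y_dim` are assumed from the surrounding training code:
#
#   ys = enumerate_discrete(x, y_dim)     # (batch * y_dim, y_dim)
#   xs = x.repeat(y_dim, 1)               # rows aligned with ys
#   x_mu = model(xs, ys)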
30 |
31 | def onehot(k):
32 | """
33 |     Returns a function that converts a number to its one-hot
34 |     or 1-of-K representation vector of length k.
35 |     :param k: (int) length of vector
36 |     :return: onehot encoding function
37 | """
38 | def encode(label):
39 | y = torch.zeros(k)
40 | if label < k:
41 | y[label] = 1
42 | return y
43 | return encode
44 |
45 |
46 | def log_sum_exp(tensor, dim=-1, sum_op=torch.sum):
47 | """
48 |     Computes the LogSumExp (LSE) in a numerically stable way, i.e. a sum performed in the log domain.
49 | :param tensor: Tensor to compute LSE over
50 | :param dim: dimension to perform operation over
51 | :param sum_op: reductive operation to be applied, e.g. torch.sum or torch.mean
52 | :return: LSE
53 | """
54 |     max_val, _ = torch.max(tensor, dim=dim, keepdim=True)
55 |     return torch.log(sum_op(torch.exp(tensor - max_val), dim=dim, keepdim=True) + 1e-8) + max_val
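
# Example: a numerically stable log-mean-exp, as used when averaging
# importance weights in log-space; shapes are illustrative:
#
#   log_w = Variable(torch.randn(16, 5))                     # 5 samples each
#   log_avg = log_sum_exp(log_w, dim=1, sum_op=torch.mean)   # (16, 1)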
--------------------------------------------------------------------------------