├── README.md ├── assets ├── background-card-chip.jpg ├── blur-blurry-close-up-167259.jpg ├── echo_logo.png ├── latex_brelu.png ├── latex_silu.png ├── latex_softexp.png ├── plot_brelu.png ├── plot_silu.png └── plot_softexp.pnd.png ├── custom_activations_example.py ├── extending-pytorch-with-custom-activation-functions.ipynb ├── gain └── activation_gain.py ├── in-place-operations-in-pytorch.ipynb └── landscapes ├── cifar-10-shake-shake-landscapes.ipynb ├── create_landscape.py ├── imrelu.png ├── imsmish.png └── imswish.png /README.md: -------------------------------------------------------------------------------- 1 | ![image](https://github.com/Lexie88rus/Activation-functions-examples-pytorch/raw/master/assets/blur-blurry-close-up-167259.jpg) 2 | 3 | # Custom Activation Functions Examples for PyTorch 4 | Repository containing examples of custom activation functions for PyTorch and the scripts used in the accompanying article. 5 | See the [article on Medium](https://towardsdatascience.com/extending-pytorch-with-custom-activation-functions-2d8b065ef2fa) and a [kernel on Kaggle](https://www.kaggle.com/aleksandradeis/extending-pytorch-with-custom-activation-functions). 6 | 7 | See also the [article about in-place operations in PyTorch](https://medium.com/p/in-place-operations-in-pytorch-f91d493e970e?source=email-9f0981e41a86--writer.postDistributed&sk=d44e1786ba9cadee76dcdf87e150a5af). 8 | 9 | ## Introduction 10 | Deep learning is now applied to a wide variety of machine learning problems such as image recognition, speech recognition, machine translation, and others. There is a wide range of highly customizable neural network architectures, which can suit almost any problem when given enough data. Still, each neural network has to be tailored to the given problem.
You have to fine-tune the hyperparameters of the network (the learning rate, dropout coefficients, weight decay, and many others) as well as the number of hidden layers and the number of units in each layer. __Choosing the right activation function for each layer is also crucial and may have a significant impact on metric scores and the training speed of the model.__ 11 | 12 | ## References 13 | * My original [article on Medium](https://towardsdatascience.com/extending-pytorch-with-custom-activation-functions-2d8b065ef2fa) 14 | * My related [kernel on Kaggle](https://www.kaggle.com/aleksandradeis/extending-pytorch-with-custom-activation-functions) 15 | * [Activation functions wiki](https://en.wikipedia.org/wiki/Activation_function) 16 | * [Tutorial](https://pytorch.org/docs/master/notes/extending.html) on extending PyTorch. 17 | * My [article on Medium](https://medium.com/p/in-place-operations-in-pytorch-f91d493e970e?source=email-9f0981e41a86--writer.postDistributed&sk=d44e1786ba9cadee76dcdf87e150a5af) explaining in-place operations in PyTorch.
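The pattern used throughout this repository for simple activation functions is: implement the function with built-in tensor operations (so autograd derives the backward pass automatically), then wrap it in an `nn.Module` so it can be dropped into any model. A minimal sketch of that pattern, assuming only that PyTorch is installed (see `custom_activations_example.py` for the full version):

```python
import torch
import torch.nn as nn

def silu(x):
    # SiLU(x) = x * sigmoid(x); built from PyTorch primitives,
    # so autograd handles the backward pass automatically
    return x * torch.sigmoid(x)

class SiLU(nn.Module):
    # module wrapper, so the activation can be used inside nn.Sequential
    def forward(self, x):
        return silu(x)

# drop the custom activation into a small classifier
model = nn.Sequential(nn.Linear(784, 256), SiLU(), nn.Linear(256, 10))
out = model(torch.randn(2, 784))
print(out.shape)  # torch.Size([2, 10])
```

The same wrapper pattern extends to activations with trainable parameters (register them as `nn.Parameter`) and to activations that need a custom backward step (subclass `torch.autograd.Function`), as the scripts in this repository demonstrate.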
18 | -------------------------------------------------------------------------------- /assets/background-card-chip.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/assets/background-card-chip.jpg -------------------------------------------------------------------------------- /assets/blur-blurry-close-up-167259.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/assets/blur-blurry-close-up-167259.jpg -------------------------------------------------------------------------------- /assets/echo_logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/assets/echo_logo.png -------------------------------------------------------------------------------- /assets/latex_brelu.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/assets/latex_brelu.png -------------------------------------------------------------------------------- /assets/latex_silu.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/assets/latex_silu.png -------------------------------------------------------------------------------- /assets/latex_softexp.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/assets/latex_softexp.png -------------------------------------------------------------------------------- /assets/plot_brelu.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/assets/plot_brelu.png -------------------------------------------------------------------------------- /assets/plot_silu.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/assets/plot_silu.png -------------------------------------------------------------------------------- /assets/plot_softexp.pnd.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/assets/plot_softexp.pnd.png -------------------------------------------------------------------------------- /custom_activations_example.py: -------------------------------------------------------------------------------- 1 | # Imports 2 | 3 | # Import basic libraries 4 | import numpy as np # linear algebra 5 | import pandas as pd # data processing, CSV file I/O (e.g. 
pd.read_csv) 6 | from collections import OrderedDict 7 | 8 | # Import PyTorch 9 | import torch # import main library 10 | from torch.autograd import Variable 11 | import torch.nn as nn # import modules 12 | from torch.autograd import Function # import Function to create custom activations 13 | from torch.nn.parameter import Parameter # import Parameter to create custom activations with learnable parameters 14 | from torch import optim # import optimizers for demonstrations 15 | import torch.nn.functional as F # import torch functions 16 | from torchvision import datasets, transforms # import transformations to use for demo 17 | 18 | # helper function to train a model 19 | def train_model(model, trainloader): 20 | ''' 21 | Function trains the model and prints out the training log. 22 | INPUT: 23 | model - initialized PyTorch model ready for training. 24 | trainloader - PyTorch dataloader for training data. 25 | ''' 26 | #setup training 27 | 28 | #define loss function 29 | criterion = nn.NLLLoss() 30 | #define learning rate 31 | learning_rate = 0.003 32 | #define number of epochs 33 | epochs = 5 34 | #initialize optimizer 35 | optimizer = optim.Adam(model.parameters(), lr=learning_rate) 36 | 37 | #run training and print out the loss to make sure that we are actually fitting to the training set 38 | print('Training the model. 
Make sure that loss decreases after each epoch.\n') 39 | for e in range(epochs): 40 | running_loss = 0 41 | for images, labels in trainloader: 42 | images = images.view(images.shape[0], -1) 43 | log_ps = model(images) 44 | loss = criterion(log_ps, labels) 45 | 46 | optimizer.zero_grad() 47 | loss.backward() 48 | optimizer.step() 49 | 50 | running_loss += loss.item() 51 | else: 52 | # print out the loss to make sure it is decreasing 53 | print(f"Training loss: {running_loss}") 54 | 55 | # simply define a silu function 56 | def silu(input): 57 | ''' 58 | Applies the Sigmoid Linear Unit (SiLU) function element-wise: 59 | 60 | SiLU(x) = x * sigmoid(x) 61 | ''' 62 | return input * torch.sigmoid(input) # use torch.sigmoid to make sure that we created the most efficient implementation based on built-in PyTorch functions 63 | 64 | # create a class wrapper from PyTorch nn.Module, so 65 | # the function now can be easily used in models 66 | class SiLU(nn.Module): 67 | ''' 68 | Applies the Sigmoid Linear Unit (SiLU) function element-wise: 69 | 70 | SiLU(x) = x * sigmoid(x) 71 | 72 | Shape: 73 | - Input: (N, *) where * means, any number of additional 74 | dimensions 75 | - Output: (N, *), same shape as the input 76 | 77 | References: 78 | - Related paper: 79 | https://arxiv.org/pdf/1606.08415.pdf 80 | 81 | Examples: 82 | >>> m = SiLU() 83 | >>> input = torch.randn(2) 84 | >>> output = m(input) 85 | 86 | ''' 87 | def __init__(self): 88 | ''' 89 | Init method. 90 | ''' 91 | super().__init__() # init the base class 92 | 93 | def forward(self, input): 94 | ''' 95 | Forward pass of the function.
96 | ''' 97 | return silu(input) # simply apply already implemented SiLU 98 | 99 | # create class for basic fully-connected deep neural network 100 | class ClassifierSiLU(nn.Module): 101 | ''' 102 | Demo classifier model class to demonstrate SiLU 103 | ''' 104 | def __init__(self): 105 | super().__init__() 106 | 107 | # initialize layers 108 | self.fc1 = nn.Linear(784, 256) 109 | self.fc2 = nn.Linear(256, 128) 110 | self.fc3 = nn.Linear(128, 64) 111 | self.fc4 = nn.Linear(64, 10) 112 | 113 | def forward(self, x): 114 | # make sure the input tensor is flattened 115 | x = x.view(x.shape[0], -1) 116 | 117 | # apply silu function 118 | x = silu(self.fc1(x)) 119 | 120 | # apply silu function 121 | x = silu(self.fc2(x)) 122 | 123 | # apply silu function 124 | x = silu(self.fc3(x)) 125 | 126 | x = F.log_softmax(self.fc4(x), dim=1) 127 | 128 | return x 129 | 130 | # Implementation of Soft Exponential activation function 131 | class soft_exponential(nn.Module): 132 | ''' 133 | Implementation of soft exponential activation. 134 | 135 | Shape: 136 | - Input: (N, *) where * means, any number of additional 137 | dimensions 138 | - Output: (N, *), same shape as the input 139 | 140 | Parameters: 141 | - alpha - trainable parameter 142 | 143 | References: 144 | - See related paper: 145 | https://arxiv.org/pdf/1602.01321.pdf 146 | 147 | Examples: 148 | >>> a1 = soft_exponential(256) 149 | >>> x = torch.randn(256) 150 | >>> x = a1(x) 151 | ''' 152 | def __init__(self, in_features, alpha = None): 153 | ''' 154 | Initialization. 
155 | INPUT: 156 | - in_features: shape of the input 157 | - alpha: trainable parameter 158 | alpha is initialized with zero value by default 159 | ''' 160 | super(soft_exponential,self).__init__() 161 | self.in_features = in_features 162 | 163 | # initialize alpha 164 | if alpha is None: 165 | self.alpha = Parameter(torch.tensor(0.0)) # create a tensor out of alpha 166 | else: 167 | self.alpha = Parameter(torch.tensor(alpha)) # create a tensor out of alpha 168 | 169 | self.alpha.requires_grad = True # make sure alpha is trainable (nn.Parameter sets requires_grad=True by default) 170 | 171 | def forward(self, x): 172 | ''' 173 | Forward pass of the function. 174 | Applies the function to the input elementwise. 175 | ''' 176 | if (self.alpha == 0.0): 177 | return x 178 | 179 | if (self.alpha < 0.0): 180 | return - torch.log(1 - self.alpha * (x + self.alpha)) / self.alpha 181 | 182 | if (self.alpha > 0.0): 183 | return (torch.exp(self.alpha * x) - 1)/ self.alpha + self.alpha 184 | 185 | # create class for basic fully-connected deep neural network 186 | class ClassifierSExp(nn.Module): 187 | ''' 188 | Basic fully-connected network to test Soft Exponential activation.
189 | ''' 190 | def __init__(self): 191 | super().__init__() 192 | 193 | # initialize layers 194 | self.fc1 = nn.Linear(784, 256) 195 | self.fc2 = nn.Linear(256, 128) 196 | self.fc3 = nn.Linear(128, 64) 197 | self.fc4 = nn.Linear(64, 10) 198 | 199 | # initialize Soft Exponential activation 200 | self.a1 = soft_exponential(256) 201 | self.a2 = soft_exponential(128) 202 | self.a3 = soft_exponential(64) 203 | 204 | def forward(self, x): 205 | # make sure the input tensor is flattened 206 | x = x.view(x.shape[0], -1) 207 | 208 | # apply Soft Exponential unit 209 | x = self.a1(self.fc1(x)) 210 | x = self.a2(self.fc2(x)) 211 | x = self.a3(self.fc3(x)) 212 | x = F.log_softmax(self.fc4(x), dim=1) 213 | 214 | return x 215 | 216 | # Implementation of BReLU activation function with custom backward step 217 | class brelu(Function): 218 | ''' 219 | Implementation of BReLU activation function. 220 | 221 | Shape: 222 | - Input: (N, *) where * means, any number of additional 223 | dimensions 224 | - Output: (N, *), same shape as the input 225 | 226 | References: 227 | - See BReLU paper: 228 | https://arxiv.org/pdf/1709.04054.pdf 229 | 230 | Examples: 231 | >>> brelu_activation = brelu.apply 232 | >>> t = torch.randn((5,5), dtype=torch.float, requires_grad = True) 233 | >>> t = brelu_activation(t) 234 | ''' 235 | #both forward and backward are @staticmethods 236 | @staticmethod 237 | def forward(ctx, input): 238 | """ 239 | In the forward pass we receive a Tensor containing the input and return 240 | a Tensor containing the output. ctx is a context object that can be used 241 | to stash information for backward computation. You can cache arbitrary 242 | objects for use in the backward pass using the ctx.save_for_backward method. 
243 | """ 244 | ctx.save_for_backward(input) # save input for backward pass 245 | 246 | # get lists of odd and even indices 247 | input_shape = input.shape[0] 248 | even_indices = [i for i in range(0, input_shape, 2)] 249 | odd_indices = [i for i in range(1, input_shape, 2)] 250 | 251 | # clone the input tensor 252 | output = input.clone() 253 | 254 | # apply ReLU to elements where i mod 2 == 0 255 | output[even_indices] = output[even_indices].clamp(min=0) 256 | 257 | # apply inverted ReLU, -ReLU(-x), to elements where i mod 2 != 0 258 | output[odd_indices] = 0 - output[odd_indices] # negate elements with odd indices 259 | output[odd_indices] = - output[odd_indices].clamp(min=0) # apply ReLU and negate the result back 260 | 261 | return output 262 | 263 | @staticmethod 264 | def backward(ctx, grad_output): 265 | """ 266 | In the backward pass we receive a Tensor containing the gradient of the loss 267 | with respect to the output, and we need to compute the gradient of the loss 268 | with respect to the input.
269 | """ 270 | grad_input = None # default to None; returned as-is if the input does not require gradients 271 | 272 | input, = ctx.saved_tensors # restore input from context 273 | 274 | # check that input requires grad 275 | # if not requires grad we will return None to speed up computation 276 | if ctx.needs_input_grad[0]: 277 | grad_input = grad_output.clone() 278 | 279 | # get lists of odd and even indices 280 | input_shape = input.shape[0] 281 | even_indices = [i for i in range(0, input_shape, 2)] 282 | odd_indices = [i for i in range(1, input_shape, 2)] 283 | 284 | # set grad_input for even_indices 285 | grad_input[even_indices] = (input[even_indices] >= 0).float() * grad_input[even_indices] 286 | 287 | # set grad_input for odd_indices 288 | grad_input[odd_indices] = (input[odd_indices] < 0).float() * grad_input[odd_indices] 289 | 290 | return grad_input 291 | 292 | # Simple model to demonstrate BReLU 293 | class ClassifierBReLU(nn.Module): 294 | ''' 295 | Simple fully-connected classifier model to demonstrate BReLU activation. 296 | ''' 297 | def __init__(self): 298 | super(ClassifierBReLU, self).__init__() 299 | 300 | # initialize layers 301 | self.fc1 = nn.Linear(784, 256) 302 | self.fc2 = nn.Linear(256, 128) 303 | self.fc3 = nn.Linear(128, 64) 304 | self.fc4 = nn.Linear(64, 10) 305 | 306 | # create shortcuts for BReLU 307 | self.a1 = brelu.apply 308 | self.a2 = brelu.apply 309 | self.a3 = brelu.apply 310 | 311 | def forward(self, x): 312 | # make sure the input tensor is flattened 313 | x = x.view(x.shape[0], -1) 314 | 315 | # apply BReLU 316 | x = self.a1(self.fc1(x)) 317 | x = self.a2(self.fc2(x)) 318 | x = self.a3(self.fc3(x)) 319 | x = F.log_softmax(self.fc4(x), dim=1) 320 | 321 | return x 322 | 323 | def main(): 324 | print('Loading the Fashion MNIST dataset.\n') 325 | 326 | # Define a transform 327 | transform = transforms.Compose([transforms.ToTensor()]) 328 | 329 | # Download and load the training data for Fashion MNIST 330 | trainset = datasets.FashionMNIST('~/.pytorch/F_MNIST_data/',
download=True, train=True, transform=transform) 331 | trainloader = torch.utils.data.DataLoader(trainset, batch_size=64, shuffle=True) 332 | 333 | # 1. SiLU demonstration with model created with Sequential 334 | # use SiLU with model created with Sequential 335 | 336 | # initialize activation function 337 | activation_function = SiLU() 338 | 339 | # Initialize the model using nn.Sequential 340 | model = nn.Sequential(OrderedDict([ 341 | ('fc1', nn.Linear(784, 256)), 342 | ('activation1', activation_function), # use SiLU 343 | ('fc2', nn.Linear(256, 128)), 344 | ('bn2', nn.BatchNorm1d(num_features=128)), 345 | ('activation2', activation_function), # use SiLU 346 | ('dropout', nn.Dropout(0.3)), 347 | ('fc3', nn.Linear(128, 64)), 348 | ('bn3', nn.BatchNorm1d(num_features=64)), 349 | ('activation3', activation_function), # use SiLU 350 | ('logits', nn.Linear(64, 10)), 351 | ('logsoftmax', nn.LogSoftmax(dim=1))])) 352 | 353 | # Run training 354 | print('Training model with SiLU activation.\n') 355 | train_model(model, trainloader) 356 | 357 | # 2. SiLU demonstration of a model defined as an nn.Module 358 | 359 | # Create demo model 360 | model = ClassifierSiLU() 361 | 362 | # Run training 363 | print('Training model with SiLU activation.\n') 364 | train_model(model, trainloader) 365 | 366 | # 3. Soft Exponential function demonstration (with trainable parameter alpha) 367 | 368 | # Create demo model 369 | model = ClassifierSExp() 370 | 371 | # Run training 372 | print('Training model with Soft Exponential activation.\n') 373 | train_model(model, trainloader) 374 | 375 | # 4.
BReLU activation function demonstration (with custom backward step) 376 | 377 | # Create demo model 378 | model = ClassifierBReLU() 379 | 380 | # Run training 381 | print('Training model with BReLU activation.\n') 382 | train_model(model, trainloader) 383 | 384 | if __name__ == '__main__': 385 | main() 386 | -------------------------------------------------------------------------------- /extending-pytorch-with-custom-activation-functions.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": { 7 | "_cell_guid": "b1076dfc-b9ad-4769-8c92-a6c4dae69d19", 8 | "_kg_hide-input": true, 9 | "_uuid": "8f2839f25d086af736a60e9eeb907d3b93b6e0e5" 10 | }, 11 | "outputs": [], 12 | "source": [ 13 | "# Imports\n", 14 | "\n", 15 | "# Import basic libraries\n", 16 | "import numpy as np # linear algebra\n", 17 | "import pandas as pd # data processing, CSV file I/O (e.g. pd.read_csv)\n", 18 | "%matplotlib inline\n", 19 | "import matplotlib.pyplot as plt\n", 20 | "plt.style.use('seaborn-whitegrid')\n", 21 | "from collections import OrderedDict\n", 22 | "from PIL import Image\n", 23 | "\n", 24 | "# Import PyTorch\n", 25 | "import torch # import main library\n", 26 | "from torch.autograd import Variable\n", 27 | "import torch.nn as nn # import modules\n", 28 | "from torch.autograd import Function # import Function to create custom activations\n", 29 | "from torch.nn.parameter import Parameter # import Parameter to create custom activations with learnable parameters\n", 30 | "from torch import optim # import optimizers for demonstrations\n", 31 | "import torch.nn.functional as F # import torch functions\n", 32 | "from torchvision import transforms # import transformations to use for demo\n", 33 | "from torch.utils.data import Dataset, DataLoader " 34 | ] 35 | }, 36 | { 37 | "cell_type": "markdown", 38 | "metadata": {}, 39 | "source": [ 40 | 
"![image](https://github.com/Lexie88rus/Activation-functions-examples-pytorch/raw/master/assets/blur-blurry-close-up-167259.jpg)" 41 | ] 42 | }, 43 | { 44 | "cell_type": "markdown", 45 | "metadata": { 46 | "_cell_guid": "79c7e3d0-c299-4dcb-8224-4455121ee9b0", 47 | "_uuid": "d629ff2d2480ee46fbb7e2d37f6b5fab8052498a", 48 | "collapsed": true 49 | }, 50 | "source": [ 51 | "# Extending PyTorch with Custom Activation Functions" 52 | ] 53 | }, 54 | { 55 | "cell_type": "markdown", 56 | "metadata": {}, 57 | "source": [ 58 | "## Introduction\n", 59 | "Deep learning is now applied to a variety of machine learning problems such as image recognition, speech recognition, machine translation, etc. There is a wide range of highly customizable neural network architectures, which can suit almost any problem when given enough data. Still, each neural network has to be customized to suit the given problem. You have to fine-tune the hyperparameters of the network for each task (the learning rate, dropout coefficients, weight decay, etc.) as well as the number of hidden layers and the number of units in each layer. __Choosing the right activation function for each layer is also crucial and may have a significant impact on learning speed.__" 60 | ] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": {}, 65 | "source": [ 66 | "## Activation Functions\n" 67 | ] 68 | }, 69 | { 70 | "cell_type": "markdown", 71 | "metadata": {}, 72 | "source": [ 73 | "The [activation function](https://www.analyticsvidhya.com/blog/2017/10/fundamentals-deep-learning-activation-functions-when-to-use-them/) is an essential building block for every neural network.
We can choose from a huge list of popular activation functions, which are already implemented in Deep Learning frameworks, like [ReLU](https://en.wikipedia.org/wiki/Rectifier_(neural_networks)), [Sigmoid](https://en.wikipedia.org/wiki/Sigmoid_function), [Tanh](https://en.wikipedia.org/wiki/Hyperbolic_function) and many others.\n", 74 | "\n", 75 | "But to create a state-of-the-art model tailored to your task, you may need to use a custom activation function, which is not yet implemented in the Deep Learning framework you are using. Activation functions can be roughly classified into the following groups by complexity:\n", 76 | "\n", 77 | "1. Simple activation functions like [SiLU](https://arxiv.org/pdf/1606.08415.pdf), [Inverse square root unit (ISRU)](https://arxiv.org/pdf/1710.09967.pdf). These functions can be easily implemented in any Deep Learning framework.\n", 78 | "2. Activation functions with __trainable parameters__ like [SoftExponential](https://arxiv.org/pdf/1602.01321.pdf) or [S-shaped rectified linear activation unit (SReLU)](https://arxiv.org/pdf/1512.07030.pdf). \n", 79 | "3. Activation functions, which are not differentiable at some points and require a __custom implementation of the backward step__, for example [Bipolar rectified linear unit (BReLU)](https://arxiv.org/pdf/1709.04054.pdf).\n", 80 | "\n", 81 | "In this kernel I will try to cover implementation and demo examples for all of these types of functions using the [Fashion MNIST dataset](https://www.kaggle.com/zalando-research/fashionmnist)."
82 | ] 83 | }, 84 | { 85 | "cell_type": "markdown", 86 | "metadata": {}, 87 | "source": [ 88 | "## Setting Up The Demo\n", 89 | "In this section I will prepare everything for the demonstration:\n", 90 | "* Load Fashion MNIST dataset from PyTorch,\n", 91 | "* Introduce transformations for Fashion MNIST images using PyTorch,\n", 92 | "* Prepare model training procedure.\n", 93 | "\n", 94 | "If you are familiar with PyTorch basics, just skip this part and go straight to the implementation of the activation functions." 95 | ] 96 | }, 97 | { 98 | "cell_type": "markdown", 99 | "metadata": {}, 100 | "source": [ 101 | "### Introduce Transformations" 102 | ] 103 | }, 104 | { 105 | "cell_type": "markdown", 106 | "metadata": {}, 107 | "source": [ 108 | "The most efficient way to transform the input data is to use built-in PyTorch transformations:" 109 | ] 110 | }, 111 | { 112 | "cell_type": "code", 113 | "execution_count": 2, 114 | "metadata": {}, 115 | "outputs": [], 116 | "source": [ 117 | "# Define a transform\n", 118 | "transform = transforms.Compose([transforms.ToTensor()])" 119 | ] 120 | }, 121 | { 122 | "cell_type": "markdown", 123 | "metadata": {}, 124 | "source": [ 125 | "### Load the Data" 126 | ] 127 | }, 128 | { 129 | "cell_type": "markdown", 130 | "metadata": {}, 131 | "source": [ 132 | "To load the data I used the standard Dataset and DataLoader classes from PyTorch and [FashionMNIST class code from this kernel](https://www.kaggle.com/arturlacerda/pytorch-conditional-gan):" 133 | ] 134 | }, 135 | { 136 | "cell_type": "code", 137 | "execution_count": 3, 138 | "metadata": {}, 139 | "outputs": [], 140 | "source": [ 141 | "class FashionMNIST(Dataset):\n", 142 | " '''\n", 143 | " Dataset class to load Fashion MNIST data from csv.\n", 144 | " Code from original kernel:\n", 145 | " https://www.kaggle.com/arturlacerda/pytorch-conditional-gan\n", 146 | " '''\n", 147 | " def __init__(self, transform=None):\n", 148 | " self.transform = transform\n", 149 | " fashion_df =
pd.read_csv('../input/fashion-mnist_train.csv')\n", 150 | " self.labels = fashion_df.label.values\n", 151 | " self.images = fashion_df.iloc[:, 1:].values.astype('uint8').reshape(-1, 28, 28)\n", 152 | "\n", 153 | " def __len__(self):\n", 154 | " return len(self.images)\n", 155 | "\n", 156 | " def __getitem__(self, idx):\n", 157 | " label = self.labels[idx]\n", 158 | " img = Image.fromarray(self.images[idx])\n", 159 | " \n", 160 | " if self.transform:\n", 161 | " img = self.transform(img)\n", 162 | "\n", 163 | " return img, label\n", 164 | "\n", 165 | "# Load the training data for Fashion MNIST\n", 166 | "trainset = FashionMNIST(transform=transform)\n", 167 | "# Define the dataloader\n", 168 | "trainloader = torch.utils.data.DataLoader(trainset, batch_size=64, shuffle=True)" 169 | ] 170 | }, 171 | { 172 | "cell_type": "markdown", 173 | "metadata": {}, 174 | "source": [ 175 | "### Setup Training Procedure\n", 176 | "I wrote a small training procedure, which runs 5 training epochs and prints the loss for each epoch, so we make sure that we are fitting the training set:" 177 | ] 178 | }, 179 | { 180 | "cell_type": "code", 181 | "execution_count": 4, 182 | "metadata": {}, 183 | "outputs": [], 184 | "source": [ 185 | "def train_model(model):\n", 186 | " '''\n", 187 | " Function trains the model and prints out the training log.\n", 188 | " '''\n", 189 | " #setup training\n", 190 | " \n", 191 | " #define loss function\n", 192 | " criterion = nn.NLLLoss()\n", 193 | " #define learning rate\n", 194 | " learning_rate = 0.003\n", 195 | " #define number of epochs\n", 196 | " epochs = 5\n", 197 | " #initialize optimizer\n", 198 | " optimizer = optim.Adam(model.parameters(), lr=learning_rate)\n", 199 | "\n", 200 | " #run training and print out the loss to make sure that we are actually fitting to the training set\n", 201 | " print('Training the model. 
Make sure that loss decreases after each epoch.\\n')\n", 202 | " for e in range(epochs):\n", 203 | " running_loss = 0\n", 204 | " for images, labels in trainloader:\n", 205 | " images = images.view(images.shape[0], -1)\n", 206 | " log_ps = model(images)\n", 207 | " loss = criterion(log_ps, labels)\n", 208 | "\n", 209 | " optimizer.zero_grad()\n", 210 | " loss.backward()\n", 211 | " optimizer.step()\n", 212 | "\n", 213 | " running_loss += loss.item()\n", 214 | " else:\n", 215 | " # print out the loss to make sure it is decreasing\n", 216 | " print(f\"Training loss: {running_loss}\")" 217 | ] 218 | }, 219 | { 220 | "cell_type": "markdown", 221 | "metadata": {}, 222 | "source": [ 223 | "## Implementing Simple Activation Functions\n", 224 | "The most simple activation functions\n", 225 | "* are differentiable and don't need the manual implementation of the backward step,\n", 226 | "* don't have any trainable parameters, all their parameters are set in advance.\n", 227 | "\n", 228 | "One of the examples of such simple functions is Sigmoid Linear Unit or just [SiLU](https://arxiv.org/pdf/1606.08415.pdf) also known as Swish-1:\n", 229 | "\n", 230 | "$$SiLU(x) = x * \\sigma(x) = x * \\frac{1}{1 + e^{-x}}$$\n", 231 | "\n", 232 | "Plot:" 233 | ] 234 | }, 235 | { 236 | "cell_type": "code", 237 | "execution_count": 5, 238 | "metadata": { 239 | "_kg_hide-input": true 240 | }, 241 | "outputs": [ 242 | { 243 | "data": { 244 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAksAAAGnCAYAAACn0xNmAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzt3Xd4VFX+x/FPGqQBIiBSxYLHoGtj7SA2LKhYgNCb8LNg17WsomtZy+q6dkXWArLSiyAgTVBBUDAqKsbDKqD0Ip0QSJnfHzPJxhhCGGbmzMx9v56HJ5N770w+J5eZfOecM/ck+Hw+AQAAoGKJrgMAAABEM4olAACASlAsAQAAVIJiCQAAoBIUSwAAAJWgWAIAAKhEsusAAGCMaSnpGUmN5H8T95ukeyTVk3SltfY6Y8wjkhpba/tXcH+fpCbW2lVltp0n6U1r7THhbwGAeEaxBMApY0yCpA8k/Z+1dkpg27WSJspfAE1wmQ8AGIYD4FpdSQ0kfV6ywVo7XtJJkrKNMbNcBQMAiWIJgHubJC2SNMcY088Y00ySyg6pAYBLFEsAnLLW+iS1lTRB0u2SlhtjlgSG4gDAOYolAM5Za7dZa/9mrT1R0uGS3pU0UlJaFR/Cpz++niVJKgpdSgBeRbEEwCljTGNjTKuS76216621/5D0naT0Kj7MOknNym07VtKvIQkJwNMolgC41kTS+4HLB0iSjDGnSWoqKbOKj/G6pAeNMZmB+x8t6S+S/hXirAA8iEsHAHDKWrvAGHO9pNeNMbXkHz5bL6mz/AVT6zKHdyzbCyXpG2ttF0lPSvqrpC+MMUmSdkn6q7X2w4g0AkBcS/D5fK4zAAAARC2G4QAAACpBsQQAAFAJiiUAAIBKUCwBAABUgmIJAACgEmG7dEBOTg4fswMAADGjZcuWCRVtD+t1llq2bLn/gw5Cbm6usrKywvozopmX2+/ltkvebj9t92bbJW+338ttlyLT/pycnH3uYxgOAACgEhRLAAAAlaBYAgAAqATFEgAAQCUolgAAACpBsQQAAFAJiiUAAIBKUCwBAABUgmIJAACgEhRLAAAAlaBYAgAAqESV1oYzxpwgaaKk5621rxhjmkgaJilJ0lpJPa21e8IXEwAAwI399iwZYzIkvSzpozKbH5P0qrW2taQVkq4LSzoAAADHqjIMt0dSO0lrymw7T9KkwO2Jki4KbSwAAIDosN9iyVpbaK3dXW5zRplht3WSGoQ8GQAA8LRt+dt038z7tCl/k9McVZqzVAFfmdsJ5b4vlZubG+TDV01+fn7Yf0Y083L7vdx2ydvtp+3ebLvk7fZ7se0+n093zL9Dc9bM0TkXnOO0/cEWS7uMMWmBHqdG8k/y/oOsrKygg1VFbm5u2H9GNPNy+73cdsnb7aft3my75O32e7HtL3/xsmaunql/tv2nmtduHvb25+Tk7HNfsJcOmCWpQ+B2B0nTgnwcAACA31m0epHunnG3rjz2St111l2u4+y/Z8kY01LSc5KaSSowxnSU1F3SEGPMDZJ+kTQ0nCEBAIA3bNm9Rdljs9WwRkMNuXqIEhISXEfaf7Fkrc2R/9Nv5bUNeRoAAOBZPp9PfSf21ertqzW371wdmnao60iSgp+zBAAAEFIvfvGiJtqJev6S53VG4zNcxynFcicAAMC5L1Z9oXtm3qOrj7tat59xu+s4v0OxBAAAnNq8e7Oyx2arcc3Gerv921ExT6kshuEAAIAzPp9Pfd7vo7U71uqz6z5T7bTariP9AcUSAABw5l8L/qUPln6gly59Sac1Os11nAoxDAcAAJxYsHKB7v/ofnXI6qBbTr/FdZx9olgCAAAR91veb+o8trOa1mqqt9q/FXXzlMpiGA4AAERUsa9Yvd7vpfW71mtBvwWqlVrLdaRKUSwBAICIevazZzX1v1P1artXdWqDU13H2S+G4QAAQMTM+3WeHpz9oDq16KSb/nyT6zhVQrEEAAA
...[base64 PNG data truncated: output plot of the SiLU cell]...\n", 245 | "text/plain": [ 246 | "
" 247 | ] 248 | }, 249 | "metadata": { 250 | "needs_background": "light" 251 | }, 252 | "output_type": "display_data" 253 | } 254 | ], 255 | "source": [ 256 | "def sigmoid(x):\n", 257 | " return 1 / (1 + np.exp(-x))\n", 258 | "\n", 259 | "fig = plt.figure(figsize=(10,7))\n", 260 | "ax = plt.axes()\n", 261 | "\n", 262 | "plt.title(\"SiLU\")\n", 263 | "\n", 264 | "x = np.linspace(-10, 10, 1000)\n", 265 | "ax.plot(x, x * sigmoid(x), '-g');" 266 | ] 267 | }, 268 | { 269 | "cell_type": "markdown", 270 | "metadata": {}, 271 | "source": [ 272 | "The implementation of SiLU:" 273 | ] 274 | }, 275 | { 276 | "cell_type": "code", 277 | "execution_count": 6, 278 | "metadata": {}, 279 | "outputs": [], 280 | "source": [ 281 | "# simply define a silu function\n", 282 | "def silu(input):\n", 283 | " '''\n", 284 | " Applies the Sigmoid Linear Unit (SiLU) function element-wise:\n", 285 | "\n", 286 | " SiLU(x) = x * sigmoid(x)\n", 287 | " '''\n", 288 | " return input * torch.sigmoid(input) # use torch.sigmoid to make sure that we use the most efficient implementation based on built-in PyTorch functions\n", 289 | "\n", 290 | "# create a class wrapper from PyTorch nn.Module, so\n", 291 | "# the function now can be easily used in models\n", 292 | "class SiLU(nn.Module):\n", 293 | " '''\n", 294 | " Applies the Sigmoid Linear Unit (SiLU) function element-wise:\n", 295 | " \n", 296 | " SiLU(x) = x * sigmoid(x)\n", 297 | "\n", 298 | " Shape:\n", 299 | " - Input: (N, *) where * means any number of additional\n", 300 | " dimensions\n", 301 | " - Output: (N, *), same shape as the input\n", 302 | "\n", 303 | " References:\n", 304 | " - Related paper:\n", 305 | " https://arxiv.org/pdf/1606.08415.pdf\n", 306 | "\n", 307 | " Examples:\n", 308 | " >>> m = SiLU()\n", 309 | " >>> input = torch.randn(2)\n", 310 | " >>> output = m(input)\n", 311 | "\n", 312 | " '''\n", 313 | " def __init__(self):\n", 314 | " '''\n", 315 | " Init method.\n", 316 | " '''\n", 317 | " super().__init__() # init the base 
class\n", 318 | "\n", 319 | " def forward(self, input):\n", 320 | " '''\n", 321 | " Forward pass of the function.\n", 322 | " '''\n", 323 | " return silu(input) # simply apply the already implemented SiLU" 324 | ] 325 | }, 326 | { 327 | "cell_type": "markdown", 328 | "metadata": {}, 329 | "source": [ 330 | "Now it's time for a small demo _(don't forget to enable GPU in kernel settings to make training faster)_" 331 | ] 332 | }, 333 | { 334 | "cell_type": "markdown", 335 | "metadata": {}, 336 | "source": [ 337 | "Here is a small example of building a model with nn.Sequential and our custom SiLU class:" 338 | ] 339 | }, 340 | { 341 | "cell_type": "code", 342 | "execution_count": 7, 343 | "metadata": {}, 344 | "outputs": [ 345 | { 346 | "name": "stdout", 347 | "output_type": "stream", 348 | "text": [ 349 | "Training the model. Make sure that loss decreases after each epoch.\n", 350 | "\n", 351 | "Training loss: 478.8589185029268\n", 352 | "Training loss: 350.0746013522148\n", 353 | "Training loss: 319.25493428111076\n", 354 | "Training loss: 294.4485657066107\n", 355 | "Training loss: 280.9470457062125\n" 356 | ] 357 | } 358 | ], 359 | "source": [ 360 | "# use SiLU with model created with Sequential\n", 361 | "\n", 362 | "# initialize activation function\n", 363 | "activation_function = SiLU()\n", 364 | "\n", 365 | "# Initialize the model using nn.Sequential\n", 366 | "model = nn.Sequential(OrderedDict([\n", 367 | " ('fc1', nn.Linear(784, 256)),\n", 368 | " ('activation1', activation_function), # use SiLU\n", 369 | " ('fc2', nn.Linear(256, 128)),\n", 370 | " ('bn2', nn.BatchNorm1d(num_features=128)),\n", 371 | " ('activation2', activation_function), # use SiLU\n", 372 | " ('dropout', nn.Dropout(0.3)),\n", 373 | " ('fc3', nn.Linear(128, 64)),\n", 374 | " ('bn3', nn.BatchNorm1d(num_features=64)),\n", 375 | " ('activation3', activation_function), # use SiLU\n", 376 | " ('logits', nn.Linear(64, 10)),\n", 377 | " ('logsoftmax', nn.LogSoftmax(dim=1))]))\n", 378 | "\n", 379 | "# 
Run training\n", 380 | "train_model(model)" 381 | ] 382 | }, 383 | { 384 | "cell_type": "markdown", 385 | "metadata": {}, 386 | "source": [ 387 | "We can also use the silu function inside a model class as follows:" 388 | ] 389 | }, 390 | { 391 | "cell_type": "code", 392 | "execution_count": 8, 393 | "metadata": {}, 394 | "outputs": [ 395 | { 396 | "name": "stdout", 397 | "output_type": "stream", 398 | "text": [ 399 | "Training the model. Make sure that loss decreases after each epoch.\n", 400 | "\n", 401 | "Training loss: 466.8241208344698\n", 402 | "Training loss: 349.4934533312917\n", 403 | "Training loss: 313.43123760819435\n", 404 | "Training loss: 293.65521355718374\n", 405 | "Training loss: 279.42011239379644\n" 406 | ] 407 | } 408 | ], 409 | "source": [ 410 | "# create a class for a basic fully-connected deep neural network\n", 411 | "class ClassifierSiLU(nn.Module):\n", 412 | " '''\n", 413 | " Demo classifier model class to demonstrate SiLU\n", 414 | " '''\n", 415 | " def __init__(self):\n", 416 | " super().__init__()\n", 417 | "\n", 418 | " # initialize layers\n", 419 | " self.fc1 = nn.Linear(784, 256)\n", 420 | " self.fc2 = nn.Linear(256, 128)\n", 421 | " self.fc3 = nn.Linear(128, 64)\n", 422 | " self.fc4 = nn.Linear(64, 10)\n", 423 | "\n", 424 | " def forward(self, x):\n", 425 | " # make sure the input tensor is flattened\n", 426 | " x = x.view(x.shape[0], -1)\n", 427 | "\n", 428 | " # apply silu function\n", 429 | " x = silu(self.fc1(x))\n", 430 | "\n", 431 | " # apply silu function\n", 432 | " x = silu(self.fc2(x))\n", 433 | " \n", 434 | " # apply silu function\n", 435 | " x = silu(self.fc3(x))\n", 436 | " \n", 437 | " x = F.log_softmax(self.fc4(x), dim=1)\n", 438 | "\n", 439 | " return x\n", 440 | "\n", 441 | "# Create demo model\n", 442 | "model = ClassifierSiLU()\n", 443 | " \n", 444 | "# Run training\n", 445 | "train_model(model)" 446 | ] 447 | }, 448 | { 449 | "cell_type": "markdown", 450 | "metadata": {}, 451 | "source": [ 452 | "## Implement Activation 
Function with Learnable Parameters\n", 453 | "\n", 454 | "There are lots of activation functions with parameters, which can be trained with gradient descent while training the model. A great example of one of these is the [SoftExponential](https://arxiv.org/pdf/1602.01321.pdf) function:\n", 455 | "\n", 456 | "$$SoftExponential(x, \\alpha) = \\left\\{\\begin{matrix} - \\frac{\\log(1 - \\alpha(x + \\alpha))}{\\alpha}, \\alpha < 0\\\\ x, \\alpha = 0\\\\ \\frac{e^{\\alpha x} - 1}{\\alpha} + \\alpha, \\alpha > 0 \\end{matrix}\\right.$$" 457 | ] 458 | }, 459 | { 460 | "cell_type": "markdown", 461 | "metadata": {}, 462 | "source": [ 463 | "Plot (image from Wikipedia):" 464 | ] 465 | }, 466 | { 467 | "cell_type": "markdown", 468 | "metadata": {}, 469 | "source": [ 470 | "![soft exponential plot](https://upload.wikimedia.org/wikipedia/commons/thumb/b/b5/Activation_soft_exponential.svg/2880px-Activation_soft_exponential.svg.png)" 471 | ] 472 | }, 473 | { 474 | "cell_type": "code", 475 | "execution_count": 9, 476 | "metadata": { 477 | "_kg_hide-input": true 478 | }, 479 | "outputs": [ 480 | { 481 | "name": "stderr", 482 | "output_type": "stream", 483 | "text": [ 484 | "/opt/conda/lib/python3.6/site-packages/ipykernel_launcher.py:6: RuntimeWarning: invalid value encountered in log\n", 485 | " \n" 486 | ] 487 | }, 488 | { 489 | "data": { 490 | "image/png": 
"iVBORw0KGgo...[base64 PNG data truncated: output plot of the SoftExponential cell]...
f3x8vLy+Cg4Np3749wcHBTJ8+PberLudckydP5vjx4wQHBxf5VFynTp1YuHAhw4YNo2fPnjzyyCNMmzaNWbNmFXkd17v77rsZNGgQw4YNw2azMW3aNOx2O19++SUnTpxg6NCh9O/fn9DQUGrUqEGTJk0ICwsr9vFFRKQE5s+HxERrXqgK1AoFYCuqe6esREdHO/N+yYqlY8eOhS5CHBcXh7+/vwsqKmjOnDm3NM9VzqD7N998swyruvPc6Z7INbov7kf3xD3d0fuSlAQtWsCvfw3ldPhEdHQ07du3LzT9qTtPiuXChQuFzoUlIiJSpNmzISUFruvFqSjUnecmCmuFcife3t75pg8ojo4dO9KxY8cyqkhERNzaiRPw5pvW+njZT7lXNGqJEhERkdI3bZq12PCMGa6upMwoRImIiEjp+u47WLoUnnsOingquyJQiBIREZHSNXky1Kpl/VmBKUSJiIhI6fnqK9iwAcaPh4YNXV1NmVKIEhERkdLhdMKECdC0KfzlL66upszp6TwREREpHR98YLVEvf02ZK+bWpGpJUpERERuX3o6jBsHrVrBiBGuruaOUEuUiIiI3L758+HoUfjkE/CoHPFCLVEiIiJye86ehZdfhkcftX4qCYUoERERuT3TpsGlSzB3rqsruaMUokRERKTkvvsO3nkHnn3WGg9ViShEiYiISMm98II1seb06a6u5I6rHCO/REREpPR9+qn189prFX5izcKoJUpERERu3dWrViuUry+MHu3qalxCLVEiIiJy6xYssMZDvf8+VKvm6mpcQi1RIiIicmvOnIHwcOjZEx5/3NXVuIxClIiIiNyaSZPg8mWYNw9sNldX4zIKUSIiIlJ8O3fCkiUwdiz4+bm6GpdSiBIREZHiycqyBpHffTdMnerqalxOA8tFRESkeBYtguhoWLkSatd2dTUup5YoERERubnz52HyZOjWDYKDXV2NW1CIEhERkZubMgVSUmD+/Eo9mDwvhSgRERG5sW+/hXffhbAwePBBV1fjNhSiREREpGhXr1qLCzdpAtOmuboat6KB5SIiIlK0+fNh715Yuxbq1nV1NW7ltkKUYRgPAhuAv5umOd8wjHnAr4FL2bv8zTTNTbdZo4iIiLjCjz9aM5M/9hgMGODqatxOiUOUYRhewDxga56XawEjTNPcf7uFiYiIiAs5ndacUE4nvPWWBpMX4nbGRKUDjwGJeV7TpBEiIiIVwQcfwIcfwvTp4OPj6mrcUolbokzTvApcNQwj78u1gJcMw6gPnACeN03zwu2VKCIiInfUzz9bT+IFBMCYMa6uxm2V9sDyd4BDpmkeMQxjCjAdCLt+p7i4uFI+bcWVlpamz8vN6J64J90X96N74p6Kc1+azJxJ/cREjs+dS1p8/B2qrPwp1RBlmmZUnr9GAW8Xtp+/v39pnrZCi4uL0+flZnRP3JPui/vRPXFPN70vu3dby7o89xz3DR585wpzU9HR0UVuK9V5ogzD2GgYxi+y//oIEFuaxxcREZEylJ4OTz9tLTD8yiuursbt3c7Tee2BuYAPkGkYxgDgn8B6wzAuA5eBP5VGkSIiInIHzJwJhw7BRx9pTqhiuJ2B5dFYrU3XW1viakRERMQ19u+HiAgICYHf/tbV1ZQLWvZFRESkssvMhD/9CRo0gDfecHU15YaWfREREans5syxWqLefx+8vV1dTbmhligREZHK7NAhmDEDBg+GJ55wdTXlikKUiIhIZXX1qtWNV6cOzJvn6mrKHXXniYiIVFavvw579sDq1dCokaurKXfUEiUiIlIZHTwIU6dC//4waJCrqymXFKJEREQqm/R0GDYM6teHd94Bm83VFZVL6s4TERGpbF58EWJirEk1GzZ0dTXlllqiREREKpEa334Lf/sb/N//q0k1b5NClIiISGXx88/cPWkStGgBc+e6uppyT915IiIilcXYsXieOgU7dkCtWq6uptxTS5SIiEhlsGEDLFnC+eHDoVMnV1d
TIShEiYiIVHQnT8Lw4RAYyNnnnnN1NRWGuvNEREQqsqwsazqDK1esSTUdDldXVGGoJUpERKQimz0bvvgC5s8Hw3B1NRWKQpSIiEhF9fXX8NJLEBwMTz3l6moqHIUoERGRiig5GYYOhV/8AhYs0KzkZUBjokRERCoap9OaTPPkSfjvf6FuXVdXVCEpRImIiFQ0ixbBv/8NERHQsaOrq6mw1J0nIiJSkezfD88/Dz17wvjxrq6mQlOIEhERqShSUmDAAPD2hhUrwK6v+bKk7jwREZGKwOmEP/0Jjh+H7duhcWNXV1ThKUSJiIhUBG+8AVFR1sLCv/mNq6upFNTOJyIiUt599ZU1/umJJ2DsWFdXU2koRImIiJRnZ87A4MFw772wZInmg7qD1J0nIiJSXl29ak2oee4c7NwJ9eq5uqJKRSFKRESkvJowAbZutVqgAgNdXU2lo+48ERGR8mjZMnj9dQgLs57KkztOIUpERKS8+fZbeOYZeOQR62k8cQmFKBERkfLk9GnrKbwmTWDtWvD0dHVFlZbGRImIiJQXGRnWjOTnz1vTGjRq5OqKKjWFKBERkfJi7Fj4739h5Upo187V1VR66s4TEREpDxYsgH/+E8aNgyFDXF2NoBAlIiLi/v7zHxg9Gh57DCIiXF2NZFOIEhERcWeHDsHAgdCqFaxeDVWquLoiyaYQJSIi4q5On4bf/hZq1oSPPoLatV1dkeShgeUiIiLu6MoVePxxa2287dvhF79wdUVyHYUoERERd+NwWLOQ79wJ69fDQw+5uiIphLrzRERE3M1LL8GaNfDqq9C/v6urkSIoRImIiLiTd96BV16B4cOt6QzEbd1Wd55hGA8CG4C/m6Y53zCM5sAyoApwCggxTTP99ssUERGpBD74AP78Z2sw+YIFYLO5uiK5gRK3RBmG4QXMA7bmeXkG8JZpml2A48DTt1WdiIhIZfHf/0JwsDX+ac0a8NCwZXd3O9156cBjQGKe1x4BNmb/vgHoeRvHFxERqRxiY+F3vwMfH2sqAy8vV1ckxVDimGua5lXgqmEYeV/2ytN99xNwV2HvjYuLK+lpK520tDR9Xm5G98Q96b64H92T4vE4dQqfoUPB05Pj8+dz9exZOHu2zM6n+1J6Srut0Jnnd9t1f8/l7+9fyqetuOLi4vR5uRndE/ek++J+dE+K4cIFePJJSEuDL7+kZdu2ZX5K3ZdbEx0dXeS20n4677JhGDWyf2+GNbhcRERErnfxIvTpA0ePwoYNcAcClJSu0g5RnwFPZv/+JPBpKR9fRESk/LtyBX7/e4iOhrVroVs3V1ckJVDi7jzDMNoDcwEfINMwjAHAH4D3DMN4FvgB+FdpFCkiIlJhZGRYXXjbt8OKFdCvn6srkhK6nYHl0VhP410vqMTViIiIVGRXr8LQofDJJ7BwIQwZ4uqK5DZoxnIREZE7weGwZiFfvx7+/ncYMcLVFcltUogSEREpa04nhIVBZCTMmAF/+YurK5JSoBAlIiJSlpxOGDMG/vlPay288HBXVySlRHPKi4iIlBWnE55/HubPhxdegFdf1Xp4FYhClIiISFlwOmH0aKsF6v/9P5gzRwGqglF3noiISGlzOOC556514SlAVUgKUSIiIqUpJ0C9/TaMH68uvApMIUpERKS0ZGXByJGwYAFMnAizZytAVWAaEyUiIlIaMjMhNBRWr4YpU+DllxWgKjiFKBERkdt15QoMGgQffWR1340f7+qK5A5QiBIREbkdFy9a69998YU1kHzUKFdXJHeIQpSIiEhJXbgAffpAdDQsWwZ/+IOrK5I7SCFKRESkJE6dgt694cgReP99+P3vXV2R3GEKUSIiIrfq8GF49FE4dw42bYIePVxdkbiAQpSIiMit+Ppr+N3vwMMDtm+H9u1dXZG4iOaJEhERKa4PPrBanRo0gG++UYCq5BSiREREimPBAnjySQgIsFqjWrRwdUXiYgpRIiIiN+JwWJNnjhoFjz0Gn38ODRu
6uipxAxoTJSIiUpTUVPjjH2HdOnjmGWseKA99dYpF/xJEREQKk5hoTaIZHQ1z58LYsVrGRfJRiBIREbnevn3WE3jJybBhg/W7yHU0JkpERCSvDz6Azp3BboevvlKAkiIpRImIiAA4ndbiwf37w4MPwu7d1pN4IkVQd56IiMilS/D00/Dvf8PgwbB0KdSo4eqqxM2pJUpERCq3//0PHn4Y1q+3WqJWrVKAkmJRS5SIiFReH30Ew4ZBlSrw6acQFOTqiqQcUUuUiIhUPg4HzJhhDRpv0cKaxkABSm6RWqJERKRyuXABnnoKPvwQQkLgnXfUfSclohAlIiKVx86d1sDxU6fgzTdh9GhNoCklpu48ERGp+BwOeO016NLFmv/pv/+FsDAFKLktaokSEZGK7fx5a/27TZvgiSdgyRKoV8/VVUkFoJYoERGpuL76CgIDYcsWq/tu/XoFKCk1ClEiIlLxXL0K06dDt25QtSp8/bW676TUqTtPREQqlvh4a+6nXbusP+fPh7p1XV2VVEBqiRIRkYrB6YTFi63uO9O0Zh5ftkwBSsqMQpSIiJR/587Bk0/CiBHwq1/BwYMQHOzqqqSCU4gSEZHy7cMPoU0b6+m7116Dzz6D5s1dXZVUAgpRIiJSPp0/D3/4A/z+99C4MezeDS+8YM0DJXIH6F+aiIiUP+vXQ6tWsHYtTJsGe/ZAQICrq5JKplSfzjMMoz2wAYjPfinGNM2w0jyHiIhUYmfOWEu1/Pvf8MtfWvM/tW3r6qqkkirtKQ5qAetM0/xLKR9XREQqM6cTVq+G55+Hn3+GmTNh3Djw9HR1ZVKJlXaIql3KxxMRkcouPh6eew7+8x946CFYuhRat3Z1VSJl0hLV2TCMTwAv4CXTNLeV8jlERKQySE+HOXOsVqeqVWHePBg1CqpUcXVlcoc5HHDsGOzdC/v2WT9JSVaurlPHdXXZnE5nqR3MMAx/oKVpmhsNw3gA+Ay43zTNjJx9oqOjnTVr1iy1c1Z0aWlpVK9e3dVlSB66J+5J98X93M49qblrF01nzKDasWP8/OijnJ44kauNG5dyhZWTu/+3kpkJ339fjbi46tk/1Th8uDqXLlnh2cPDia9vOu3bpzJu3BmqVSu9HFOY1NRU2rdvX+h6QaXaEmWaZhwQl/37EcMwfgKaAcfy7ufv71+ap63Q4uLi9Hm5Gd0T96T74n5KdE9On4bx4yEyEu67Dz75hDqPPooLGxsqHHf6b+XyZWte1JzWpX37IDbWaoQEqFnTeugyNBTatbN+Wre2ZYfA6oB3mdcYHR1d5LbSfjrvaaCWaZpvGobRFGgCnCzNc4iISAWUkWF1182YAVeuwOTJMGWK9S0qFcKFC/nD0r591uo8Doe13dvbCklhYdcC0wMPuHfvbWmPiYoCVhiGMQCoBozK25UnIiJSwMcfw9ixcOQIPPYYvP46GIarq5IScjrhxImCgenHH6/t07y5FZIGDboWmJo3B1uhnWbuq7S785KAx0rzmCIiUkGZphWePvnEanLYtMkKUVJuOBzwv/8VDEznzlnbbTbr1nbqZD1gmROYGjZ0bd2lpbRbokRERG7swgV45RWr+65mTZg715pAs2pVV1cmN5CeDocO5Q9LBw5Y45rAmrKrTRvo1+9aWGrbFmrVcm3dZUkhSkRE7owrV+DNNyEiwpowc/hwa/oCPXXndi5etAJS3sB06JD15BxYwSgwEJ5++lpgatWq8uVghSgRESlbWVnW03YvvmgNlvntb2H2bHjwQVdXJlgr6VzfHRcfb41tAmjUyFph59FHrwUmX1+t8wwKUSIiUlacTmuc08SJ1nPrv/oVLF8O3bq5urJKyemEH36Azz6rzapV1wLTyTzP0Pv4WCEpJORaYLr77vI34PtOUYgSEZHSt2MHv3jhBdizB+6/H9auhQED9G18h1y9ao3bz9u6tH+/Ncs33IPdDv7+0L37tbAUGAj167u68vJFIUpERErPV1/BSy/B1q1
Ua9AA3noLnnlGCwWXobQ0iInJH5gOHrSGoAFUr24N+B440ApL9eod4/e/v09TcJUChSgREbl9O3da4ek//7EGir/+OvHduuH3y1+6urIKJTnZalHKG5ji4qxhZwB161pBaeTIay1Mfn7gkefbPi4uTQGqlChEiYhIye3ebYWnTz+1Jv/529+sRYK9vHDGxbm6unLt1KmCA76///7a9rvuskJS3ikF7rtPPaZ3kkKUiIjcGqcTtm2znrDbsgUaNLB+f+65ij0pUBlxOKxwdH1gOn362j6+vtC+PYwYcS0wNWniuprFohAlIiLF43DAhg1WYNq92/oWnz0b/vxnqF3b1dWVC5mZVvfb9QO+f/7Z2u7hYc23lHc6gYAAq5tO3I9ClIiI3FhmJqxcCa++aiWAFi1gwQL44x+tUctSqMuXrQHeeQNTbKw18zdYk7UHBMCwYdcCU+vW+kjLE4UoEREpXHIyLF5szTL+44/WN/6qVdZUBR76+sjrwoX8YWnvXms9ZYfD2u7tbYWksLBrgemBB6BKFdfWLbdH/xWIiEh+//ufFZyWLrWaU7p1s1qeHn200o9adjqtSdevH7/044/X9mne3ApJgwdfC0zNm1f6j65CUogSEZFrg8XfeAM++sia12nIEBgzxkoBlZDDYeXJvK1L+/fDuXPWdpvNak3q1MkaU58TmBo2dG3dcucoRImIVGaXL8Pq1fCPf1gzNjZqZK1xN3IkNG3q6urumPR0a4HdvK1LBw5YHw9YmbJNm/zTCbRtq4cRKzuFKBGRyujQIXjnHWth4JQUKxEsWWK1PlXwkc0XL1oBKW9gOnTIGj8PVjAKDISnn74WmFq1gqpVXVu3uB+FKBGRyiI9Hdavt8Y37dhhpYKBA61Wp9/8pkIO2jlzpuD4pfh4q/cSrIa3X/4y/5QCvr5gt7u2bikfFKJERCq6I0dg0SJroPi5c1ZKmDMHnnrKShEVgNMJP/xQ8Am5xMRr+/j4WCEpJORaYLr77gqZHeUOUYgSEamIUlJg7Vp47z34+mvrWfp+/axWpx49ynVTy9WrYJoFJ6xMSrK22+3g7w//5/9cC0uBgVC/vmvrlopHIUpEpKJwOODzz63g9P77cOWKlSZefdVqfrnrLldXeMvS0qzx7nlbl2JirEsDa/hWmzZWr2ROYGrTBi2wW0E5nA4uZVwiOS2ZqlWq0rSWax9+UIgSESnv/vc/a4D4v/4FCQlQr57VVffUU/DQQ+Wmvyo52WpRytvCFBcHWVnW9rp1rZA0cuS1wOTnp3k/y5MsRxbJackkpSVZf16x/kxOSyYlPcX6My2F5HTrz3yvpSXzc/rPOLEGtNltdhL/mkiTWq5bRFD/9EREyqOEBKu7btUqiI62+rB69YK//c3qtnPzJ+xOncrfurRvHxw7dm37XXdZISnvlAL33Vdu8mCF5XQ6Sbuali8EFfp7EdsvZly84fHtNjt1q9WlXvV61K1el7rV6nJfvfuoW70u9apde61e9XrcV/8+lwYoUIgSESk/zp6Fdeus4LRjh/Vahw4wd641PXazZq6trxAOB3z/fcEn5E6fvraPr691Gc88cy0wNXHtd2OFdyXzCuevnOfClQu5P8UNRBlZGTc8dq2qtahfvT71qtejfo36+NTzoX71+vleq1e9nvVa9u/1qtejbrW61KpaC1s5SsoKUSIi7uzCBdi4EdasgS1brL6tVq3g5Zet4NSypasrzJWZaXW/5bQuffXVvRw5Ys3LBFa3W6tW+acTCAiwuumkZNKvpueGofOp10JRvtfS8m87l3qO9Kz0Io9ZxVYlN9jUr2GFn+Z1mucLQUUForrV6uJZxfMOfgKupRAlIuJuEhPhgw+sweFffGEFJx8fGDfOmgyzTRuX92tdvgwHD+ZvXYqNtaaiAmtg9wMP5J9OoHVrt+9ldBmH00HSlSTOpp7lXOq53NBzozB0/sp5UjNTizymp92TBjUb0KBGA7xreOPr7ctD1R/CecX
JA/c8gHcN79xt3jW8cwNR7aq1y1VrkCspRImIuIOjR63QFBUF33xjvWYYMF1Hz+wAACAASURBVGEC9O9vzQjpoi+2CxcKjl86csTqqgPw9rZCUljYtcD0wANw5MgP+Pv7u6RmV0u7msbZy1YgyglGef9+/Wvnr5zH4XQUeiwPu0e+wHNvvXtpd1c7vKt706Bmg3zbcv7uXcMbL0+vQsNQXFxcpb0vpU0hSkTEFbKyYM8e2LTJ6q47eNB6/Ze/hFdesYLTHf6iczrhxImC45d+/PHaPs2bWyFp8OBrgal5c5c3jJUpp9PJxYyLnL50mjOXzxQaiq4PS5czLxd6LLvNToMaDWjk1YiGNRvSqlErGtZsSKOa1t9zXm9Ys2FuGFLLkPtSiBIRuVOSk2HzZis4ffKJNXu43W4tufL6/2/vzuOjqu/9j78Ssu97SEgCIcAhgUAQCiiglboArRtutdYNpcr1apX7a3/1Vq+/3ttbre21m7b1tnavXttaa2/dqlWrKC5FdsIRCAECIYGE7Nskmd8fXzKTIWFJGHJOkvfz8TiPzJwzmflkxoS33/M9n++jcMUV5rTdEOjuNp0Reo8wbdhgSgITiqZMgXPOgTvv9AemtLQhKe+M6/Z2U9taS1VTFVXNJhz13K5qqqK6pTrgWFtnW7/PExse6ws+6THpFKYV9huIevYlRycTGjJ8G50ORmd3Jy2eFpo7mmnxtJAUlURqTCpNHU28Wf5mwLFmTzMXFVxEydgSdh/ZzX+89R++/S2eFlo8Lfz7J/+dCwsudPrHAhSiRETOHK8Xtm0zoemFF+Cdd8wIVGoqLF0Kn/40XHzxGW+l3d5uFtjtPbq0caOZ1wRmCb3p0wPbCcyYYRbiHU66urs41HKIysZKfxhqrja3j7l/qPkQXd6uPs8RFhpGRmwGGbEZZMZmUpheSGZspu9+z7GecBQdHu3AT3pmdHR10NTRRGhIKElRSXi9Xt4sf5NmTzNNHU00d5ivxZnFLM5fTFtnG3e+cCdNHv+xZk8zN8+8mTvn3snBpoOM/+74PlfzPXLBI3xpwZeobKzkkqcv6VNHfEQ8JWNLaPY081rZa8RGxBITHkNMeAwJkQmEhbonurinEhGRkaCqCv72N3Ml3WuvmfNjYNYd+cpXTHCaO9csw3IGNDaagNR7/tK2bebKOTDBqKQEVqzwB6aiIhOk3MrT5aGquYrKxkoqmyqpbKzkQOMBc7vX/erm6n6DUVRYFJmxmWTGZZKXmMcnsj9hQlFcpm9/T0gabiNFVU1VHGk7QmN7Iw3tDTR2NJIQmcDi/MUAPPLOI+xv2E9TR5Mv7OSF5/HDwh8CUPyjYioaKmjuaMbTbf4juXHmjfzy8l8SEhLCkt8u6ROCVs1ZxeL8xYSFhvHKrleIi4gjLiKO2IhYMmIzSIhMACAhMoF7599LbHhsQBCakz0HgLzEPD5c+aFvf0x4DLHhsb5gOj1jOnvv3YubKUSJiJyO5mbTs6knNPXMbUpJMWvUXXghLFt2Rno4VVf3nb+0c6cZAAOztvBZZ5lBr57AVFDgnmXzPF0eKpuOBqLewehoWOq5f6j5kK9LdW8ZsRlkxWWRFZ/FjMwZZMdn++73hKPM2EzX9B7yer20eFpo6mjyNYnccHAD5XXlJgAdDUJRYVHce/a9AHz1b1/l/f3v09hhjjW0NzA5ZTJv3vwmAMueWsZHlR8FvM6ivEW+EPXrTb+moqHChJzwWOIi4shM8DfhurjgYjq6OgKOT8+Y7jv+2g2vERUWRWxErO8x8ZHxgBm1q1hdcdyfNyY8hocvePi4xyPDIn2BarhSiBIRGYi2NvjgA/j73806de++Cx0dEBkJCxfCQw+Z4DRrVtDSitcLe/YEji6tX286IfSYMMG8ZO+WAtnZzk34bu9sp6KpgsN7DlPRUEFFQwX7Gvb5blc0VHCw6WCfcBQaEkpmbCbZ8dnkJuYyd9xcsuKyTECKzwoISUPdj6iru4uG9gb
q2uqYkDSBkJAQ1leuZ2PVRt+yJPXt9TR3NPPEJU8A8G9v/BtPbX7Kd6yzu5OU6BRqvlwDwNff+jrPlj4b8Dp5iXm+EHWk7QgtnhaSo5LJS8wjISKBSSmTfI998LwHaepoIiEygfiIeBIiE0iNSfUd37xqc5+fo7S01Hf72xd9+4Q/86Lxiwb4Lo0uClEiIifS1GRaDrz1lglO779vQlNIiOkU+cUvmtC0YEFQVr3t7ATbDhxd2rABjhwxx0NDzUV7ixf7w1JJyRmfVhWgrbPNH4zqewWjRn9Aqm6u7vN9CZEJ5CTkkJOQw4zMGeQk5DAufhzZ8dm+kJQek86Y0DNzqhNMEKpvryc+Ip7wMeHsrN3J+xXvB3Turm2r5XtLvkdKdAqPf/A4D7/zMPVt9QFLltT93zoSoxJ5esvTfOvdb/n2x0XEkRSVhKfLQ/iYcLLjs5mXM8+3VElSVBIp0Sm+x//n4v/kq4u+SnxkvC8IRYX5m2n98NM/POHPc6l1aRDfHRkohSgRkd5qa01o+vvfTXBat84kmzFjzLmxu++Gc881oSkl5eTPdwJtbbB5c+Do0qZNZj+YxpTFxXD11f7AVFwclKx2QvVt9eyp38Oeuj2BX4/ermqu6vM9KdEpvoA0J2sOOQk5hDaHMteaa8JSwjjfXJlgamhvYGv11oAQVNNawy0lt5CfnM9fd/2V+1+/33esrq0OL17W376ekrElvFb2GqteWOV7voTIBJKjkmlobyAlOoW8xDwumHiBb922niAUMcZMIlt99mrumHMHSVFJ/U56vmPOHdwx547j1m+lWUF/T2ToKESJyOjV1WUuW1u71mzvvWeGgQDCw2HePPjyl01oOucciI8f9EvV1ZkRpd4jTKWlpgQwS5/MmgWrVvkD09SpZqmUYPJ6vRxuOcye+j2U15X3CUh76vdQ11YX8D2RYyIZnzSe8YnjuWTKJYxPGk9eYh65Cbm+gBQT3jfZlZaWUlhwar2uenfsPtR8iPzkfHIScthTt4f/Wvtfvv09X3/8mR9zqXUpa/etZclvlwQ8VwghLMxbSH5yPlFhUaTGpDI5dTIpUSm+3ktj48YCcHXR1Zw/4XxSolNIikrqc4rwEusSLrH6XkHWo+d5ZHRSiBKR0ePwYXM6ric0ffCBOV0HZhb2/Plw001w9tkmQEUP7vL1ysrA0aX162H3bv/xrCwTknq3FMjPD978JU+Xhz31eyg7Usau2l3sOmK2siNllB0po6mjKeDx8RHxjE8az4SkCSzMW8j4xPG+0DQ+aTwZsRmDvmKtq7uL0sOlVDZWcrDpIJVN5uuFEy9k6eSllNeVM++n86hpqQm4su4HS3/AP8/9Z5o6mvjlxl+SHpNOemw64xPHMydrji+8zM6ezYufezGgU3dSVJKv3nPHn8u54889bn2pMakBc4hEBkIhSkRGptpak2TWrfNvZWXm2JgxZj5TT2CaPx8mThxwiunuNk957BVyVb3Odk2aBHPmwMqV/sCUmXn85zxVDe0N7Ko1wWjXkV2+sFR2pIy99XsDAknkmEgmJk+kIKWA8yecT35SPhOSJviCUlJU0oCuXuv2dtPqaSU2Ihav18uP//FjXzjq+Xq5dTnL05fT2tlK8Y+KA74/JjyGzNhMlk5eSmp0KldMvcIXknoaU/ZcITYtYxr1X6k/bi1pMWksnbx0gO+eSHAoRInI8FdbS+y778Lzz/sDU++hn/x8mD0bvvAFE5rmzBnwxCKPx5x+632F3IYNpi8TmNNuRUWwZIk/LM2caU7TDVZ7ZztlR8qwa2w+rvkY+7DNx7Xm66GWQwGPTYtJY2LyRObnzOf64uspSCkwwSm5gKz4rFMeSerq7qK6uZoWTwsFKQUAfO3Nr7G9ZrtvIvn+xv1cVXQVT1/5NCEhIdz3t/toaG8gIzaDsXFjGRs3lrQY09o8LiKO31/9ezJjM33Hei6RB4iPjOfHn/nx4N8kEQcpRInI8NHVZRbq3bTJv23
cCOXl5PU8ZuJEE5Juv90Ep7POGvAE8OZm89S9R5e2bDGdv8Hkr5kzA9sJTJtmJoIPlNfrZX/jfhOQaj72B6Yam/K68oBFaTNjM7HSLC6zLmNy6mQKkk1Qmpg8kcSoU0trXd1dHGg8QG1rLTPHzgTggdcf4K29b1FeV86BxgN0dneyIHcBa1asAeAvO/7CkdYj5CTksGj8InLic5g7bq7vOXfctYPk6OQ+k6p7LqW/quiqgb8xIsOAQpSIuFNNjUkymzf7A9OWLdDaao6HhoJlme7fd9zBnrQ0xl9xxYADU21t3/5LH39sTtWBebpZs+Cuu/yBacqUgTcc7/Z2s6duD1sPbWVr9Va2HtrKtkPb2H54e8BitTHhMUxJncInsj/B9cXXY6VaTEmdwpTUKacclFo8Leyt38vUtKkA/OD9H/C/H/8vu+t2s6duD55uD3mJeey5Zw8A+xv30+3t5tzx55KbkEtuQi5TUqf4nu/DlR+e8PXSY9MH9maIjBBBD1GWZX0HmA94gS/atn3i3z4RGb28XtN2u7Q0cNu2LbCTZFqaGfq54w6zqNuMGaZZUq+J3y2lpScMUF6vWYHl2PlLe3utKpGba0LStdf6A1Nu7sCmSnV7u9lbv9cXlHpCU+nhUlo8Lb7HZcdnMy19GreddRtWqoWVZsLSuPhxpzQ/qbO7kzEhYwgJCeHVXa/y3Pbn+LjmYz6u+Zh9DfsICw2j9authIWGsb9xP3VtdZyVdRZXFl5JflK+71QdwM8u+9mp/4Ai4hPUEGVZ1nnAZNu2z7Ysqwj4OTAvmK8hIsNQdzeUl/cNS9u3+7tIglnYbepUs1xKT1iaMcPMxB7IxOdu2LGj7xVyNaZJNCEhZjTpnHPgzjv9gSktbWA/VlNHE5urNrPh4AY2HNzAxqqNbKneEjCylBWXxbSMaaw8ayXT0qcxLWMaRelFJEUlnfLrHGo+xNqKtWyp3uLbth/eTvk95WTHZ/OPA//g6S1PY6VafHLCJ30jVz2nAk+09IaIDF6wR6I+BfwJwLbtbZZlJVuWlWDbdkOQX0dE3Ka7G/bvN+ll587AbccOfwdJgIwME5auucaMKPVsOTkDvkKuvd20enrxxUQOHjRhaeNGM68JzMK606fD5Zf7w9KMGSavnSqv18uBxgMBYWnDwQ3srN3pW7YkKSqJkrElrJi1gukZ0ylKL2Ja+jSSo0+9lXhzRzMbDm7gHwf+webqzaw+ezVF6UW8vPNlbvzTjQBMSJrA9IzpLJu8zDdZ/EsLvsRXFn7FFevDiYwmwQ5RY4F1ve5XHd2nECUyEnR2wr59ZnJ3TzjqCUq7dvlnXoNJLwUF5hr/Cy/0B6WpUyF1cH15Ghv7Nqzcts1cOQfZxMWZJVBWrPAHpqIiU8qp6pno/eH+D/nwgNnWV66nprXG95iC5AJmjp3JDTNuYObYmZSMLSE3IXdAIabF04Kny0NiVCJbq7dy7R+upfRwqW/0KD0mnSsLr6QovYglk5aw9ta1FKUX9dv1+9gJ3SIyNIL9m3fsX5AQ6Lv0du/FD+XE2tra9H65zIj+TDwewquqCN+/379VVvpvV1UR0uXvP9QdGUlHXh6evDw65s2jIy/PbOPH05mZ2f/s6+pqs51ETc0YSkujem2R7NkT6TuektJJUVEbN93URmFhGwUFDUyaFNJnzd9du078Okfaj7C5djNbarew9chWNtdu5nDbYQDCQsKYkjSF88eez9SkqVhJFlaSRVx4r2Gsbmg+0Mz2A9tP+DqHWg+xvmY9Hx36iPU16yk9Uso/Tfsn7ii6g4b2BtLC0ri98Hamp0xnWvI0MqIzoNP/9zKRRPY37mc/+0/63rnFiP5dGcb0uQRPsEPUfszIU49s4OCxDyosPLVlAOTosgl6v1xl2H4mXi/U15vZ1fv3m6979pi5SuXl5nZFhf+yNDCn1saNgwkTzIq348eb25MmwaRJhGZnExUayiCu7A8oq7y874Tv3vPKJ0wwnQp
uvdU/wpSdHUZISBxgAs2pfC6d3Z1sPLiRd/a9wzv73uGD/R9QXlduflRCsNIslk5ZyieyP8HccXOZOXZmwGKwA1HVVMXBpoPMHDsTT5eH2Q/PprWzlaiwKOaNm8eXi77MZVMvo3CcqfnNkjcH9TpuNmx/V0Y4fS4Ds27duuMeC3aI+ivwNeAJy7JmAQds2248yfeIyOnq7jajO70D0rFfKyqgpSXw+0JDzTykCRPgvPPM195bTs7AzoWdRGenWZru2MBUV+cvp7DQ5LWesFRSAsmnPq0oQH1bPe9VvOcLTe9XvO+b9J2TkMP8nPmsmrOKT2R/gtnZs09rgdxWTytv732bV3e9yl/L/sqmqk3MyZ7Dhys/JHxMOE9e+iQTkycyK2uWb/FaERneghqibNt+17KsdZZlvQt0A3cG8/lFRp3WVrOGyMGD/u3Y+wcOmK2zM/B7w8IgO9sEoZkz4dOfNrdzcszoUs8WHt7/aweh9M2bA8PSpk3++eVRUVBcbOaW9wSm4uIBNxIPcLjlMG+Wv8kbu99gzb41bK7ajBcvoSGhzMycyS0lt7AgbwELcheQm5h72j9jY3ujr/v29X+8nue2P0fEmAgW5i3koU89xEUFF/kee13xdaf9eiLiLkGfjWjb9leC/ZwiI0bPKbXDh/3boUOBoah3WKo/zpphaWnmsv+xY+HccwPDUc/tjAz6TBA6Q+rq+k74Li01DcbBLH0yaxasWuUPTFOnmpx3Ourb6nlrz1u8vvt1XrJfwq6zAYgNj+Wc3HNYft5yFuQtYN64eQFLjZyOxvZGni19lt9u/i1/L/87u7+4m3EJ4/iXs/+FlWet5LwJ5xETfhpJUESGDV3SIXI6WloCA1HvYNTf/sOH+44Y9UhIMKFo7FhzDqsnJPVsPfczMs7Y6NGpqKzs23+p9zJ1WVkmJF12mT8w5ecPuHNBvzq6Onhn7zu8susVXt/9Ousq19Ht7SYqLIqSlBK+fv7XWZy/mDnZcwgfE9z3qOxIGQ+88QDPlT5Ha2crBckFrD57te+KvAV5C4L6eiLifgpRMrp5vea805Ejp7xNrKoy4enIkcDeR72FhprL+NPSzDZ5sln4Nj3dv69nS083AalX92036O6GsrK+85eqqvyPmTTJLFO3cqU/MGVmBreOioYKXtrxEi/tfInXyl6jsaORsNAw5ufM56uLvsri/MXMz5nP7h27gz5Ztq6tjkPNh5icOpnosGj+uuuv3DTzJm6ceSPzc+arL5PIKKcQJcNXV5dpHNTQcPKt9+Pq6szWE4w6Ok78OomJZmbz0a29oIDIvDz/vvT0vuEoKWngi6s5yOMx/ZZ6h6UNG8zbBua0W1ERLFniD0szZ5q3Jti6urtYW7GWP9t/5qWdL7GlegsAuQm5XDf9OpZOXsqn8j8VtNNz/dlXv4+H1zzMLzb+grnj5vLGTW+QFZ9F5b9UqieTiPjor4EMjc5O00K6Z2tqCrzf376e+/0FoYYGf0vqk4mNNafKem85OQHB6LhbYmKfMLS/tJSEYXx5cHOzmeDdOzBt3uzPkjExJiDdcIM/ME2bZiaCnykdXR28vvt1nit9juft56lqriI8NJxF4xfxrQu/xdJJSylKLzrjIz976/fy0NsP8eT6JwH4/IzPc9fcu3zHFaBEpDf9RRjNvF7TYbq11b+1tQXeP5V9LS0nD0MnG+05VlSUCT+9A1Bqqplcc2wgOt4WH2+2YTQiFGw1NX1Px338sb8VVEqKCUl33+0PTFOmDM1b1tbZxos7XuTZ0mf5y8d/oaG9gdjwWJZNXsbywuUsm7zstFoODMYzW57hyfVPcuusW7lv0X3kJeYN6euLyPCiEDXUOjtNcOnoMF973+5nX/yuXeZfvmMfe5Lvo729b/jp7/5ghYSYOTw9W1ycP/RkZJivvff1bMfu6+8xsbGjOvgMhtdr2kD1DksffWRWaOmRm2tC0rXX+gNTbm5wJnyfqq7uLt4of4OnNj/Fs6X
P0tDeQGp0KlcWXsnywuVcMPGCQTe3HKw/238G4FLrUu6edzfXTLuG8Unjh7QGERmeRl6I8nhg7VoTIjyevltn55nd39+x3mGndzfoU5BzsgdERpotIqLv156Ak55uRnZ6h57o6L77BvKYiIih/ddXfLq6zJJ1x44w1Rxd2i0kxIwmLVjgD0uzZpmpWk7wer2sP7ie32z6Df+z5X+obKokPiKe5YXLub74es7PP9+R02QN7Q3c8/I9/HzDz1k2eRmXWpcSGRapACUip2zkhajHH4d77x3c94aGmkvHe29hYX33Hbs/JubE33NswDmV20e/lu3fz8SpU/t/bFiYgswI194OW7cGji5t2uSfDhYRAdOnw+WX+8PSjBlmgM9p9W31PLX5Kf77o/9mw8ENRIyJYNnkZXxu+uf4zJTPEB3u3NWIGw9u5IpnrmBP/R7+deG/8uAnH3SsFhEZvkZeiLrzTrPIVn+B6GShaIgaEw5Ee2wsWJbTZcgQaGzs27By61Z/W6m4ONM+asUKf2AqKgrqqiynzev18l7Fe/zko5/wzNZnaPG0UDK2hMeXPc51068jOXqQ67cE0Y6aHZzzs3NIjkpmzS1rODv3bKdLEpFhauSFqPBw08FZxMWqqwNHl9avh507/cfT083/Cyxd6g9MBQWuzPmAubrumS3P8Oh7j7Lh4AbiIuK4vvh6vjD7C8zOmu2qfkqTUiZx/6L7ubnkZrLis5wuR0SGsZEXokRcxOuF8vK+85cOHPA/ZsIEE5JuvNEfmLKzh8eZ2pqWGp5Y9wSPffAYlU2VFKUX8cRnnuC66ded0T5Og/Grjb9i3rh5WGkW9y26z+lyRGQEUIgSCZLOTrDtvoGprs4cDw2FwkJYvNgflkpKTDuq4WZ/w34eXvMwT65/ktbOVi4quIifX/ZzLiq4yFWjTj2e3/48N//pZm4puYUnL3vS6XJEZIRQiBIZhNZW06Cyd1jatMnfNSIqCoqL4Zpr/IGpuNhcgzCcVTRU8PCah/nJRz+h29vNDTNuYPXZq5meMd3p0o5rU9Umrv/j9czJnsNjyx5zuhwRGUEUokROoq4ucML32rX57N5tWg2AaWo+axasWuUPTFOnmmsVRoqDTQf5+ltf94WnFSUruG/RfUxImuB0aSfU1tnGZ//wWRKjEnn+s887ekWgiIw8I+jPvMjp8XqhsrLv6bjdu/2PycqCyZM7AxpW5ucPj/lLg9HqaeXRtY/y8DsP09bZNmzCU4/vvfc9Sg+X8vL1L2sSuYgEnUKUjErd3VBW1vcKuepq/2MmTYI5c2DlSn9gysyE0tJ9FA7jtfNORbe3m6c2P8W//u1f2dewjyumXsE3L/gmk1MnO13agNw9724KUgq4eNLFTpciIiOQQpSMeB4PbNsWOLq0YYPpywTmtFtRUWA7gZkzzWm60aj0UCm3/+V23t77NrOzZvOb5b/h3PHDr22Ip8tDdHg0VxVd5XQpIjJCKUTJiNLcbCZ49x5d2rLFv/5xTIwJSDfc4A9M06aZieCjXVtnGw+9/RAPrXmIuIg4fnrJT7ll1i2Ehri0OdUJvF/xPtc9ex0vfO4FCtNH9qihiDhHIUqGrZqavvOXbNvMbQJISTEh6e67/YFpyhStbdyfjyo/4vN//Dylh0u5vvh6Hr34UTJiM5wua9C+8953ONJ2hNzEXKdLEZERTCFKXM/rhYqKwNGl9eth3z7/Y3JzTUjqPeE7N3fkTvgOlq7uLr797rd54I0HSI9N56XrX2LJpCVOl3VaDjQe4A/b/sDqs1cTF+GCRQRFZMRSiBJX6eqCHTv6jjDV1JjjISFmNGnBAn9YmjUL0tKcrXs4qmqq4rPPfpY3y9/kqqKreOIzT5ASneJ0WaftD9v+QJe3ixWzVjhdioiMcApR4pj2drPAbu/RpU2bzLwmMAvrTp8Ol1/uD0szZpiFeOX0vF/xPlf+7kpqW2v52aU/4+aSm13ZaXwwfrf1d8zInMHUtKlOlyI
iI5xClAyJxsbAhpXr15sA1dlpjsfFmSVQVqzwB6aiIhOkJLie/OhJ/unFfyI7Ppt3b32XkrElTpcUVPfOv3fEBEIRcTeFKAm66uq+/Zd27vQfT0+Hs84KbClQUGDWlpMzx+v1cv/r9/ONNd/gwokX8vSVT5Mak+p0WUF3ZdGVTpcgIqOEQpQMmtcL5eV95y8dOOB/zIQJJiTdeKM/MGVna8L3UPN0ebj9L7fz8w0/57ZZt/Gjz/yIsNCR9+u/Zu8aEiMTKc4sdroUERkFRt5fUTkjOjtN+4Deo0sbNph15cCMIhUWwuLF/rBUUgLJyc7WLSZAXfuHa3lu+3M8eN6DPHjegyP2dNc9L99DfGQ8b9z0htOliMgooBAlfbS2wubNgaNLmzZBW5s5HhUFxcVwzTX+wFRcbBpZirt4ujxc9+x1PLf9Ob635HvcPe9up0s6Y1o6W9hwcANfWfgVp0sRkVFCIWqUq6szI0q9r5Dbvt20GgCz9MmsWbBqlT8wTZ1qlkoRd+vq7uLzz32eZ0uf5TsXf2dEByiAHfU76PJ2MXfcXKdLEZFRQv8UjhJeL1RW9p2/tHu3/zFZWSYk9W4pkJ+v+UvD1epXVvO7rb/jkQse4Z759zhdzhlX1lAGQGGalnkRkaGhEDUCdXdDWVng6NL69eaquR6TJsGcObBypT8wZWY6V7ME13ff+y7f/+D73Dv/Xr604EtOlzMkdjfuJjw0nPzkfKdLEZFRQiFqmPN4YNu2wLC0YYPpywTmtFtRUWA7gZkzzWk6GZle3fUqq19ZzfLC5Xz7om87Xc6QWWGtYOXClSPyqkMRcSf9tRlGmpth48bAwLRlC3R0mOMxMSYg3XCDPzBNm2YmgsvoUNFQwef++DmK0ov41eW/IjRk9DTfSopMonCcTuWJyNBRiHKpmhoTkl5+OYUDB8xt2zZzmwBSUkxIuvtuf2CaMgXGjHG2bnFOTyuDts42nr3mWWIjYp0uaUg9teMp4daILAAADNZJREFULo69mIV5C50uRURGCYUoh3m9UFER2H9p/XrYt6/nEZnk5pqQdO21/sCUm6sJ3xLoW+9+i3f3vctTy5/CSrOcLmdIdXu7eXjjw3TGdCpEiciQUYgaQl1dsGNH3yvkamrM8ZAQM5q0YIE/LMXGfsw550xxtnBxva3VW/na37/G1UVXc13xdU6XM+RqWmro7O4kOz7b6VJEZBRRiDpD2tvNAru9R5c2bTLzmsAsrDt9emA7gRkzzEK8vZWWdg198TKsdHV3seLPK0iITOCxZY85XY4jqpqrAMiM0yWmIjJ0FKKCoKGh74TvrVvNUilgglFJCaxY4Q9MRUUmSImcrifXP8kH+z/gt8t/S0ZshtPlOKKpowmAhMgEhysRkdFEIWqAqqr6no7budN/PD0dzjorsKVAQYFZW04k2Orb6rn/9ftZlLeI66aPvtN4PRrbTU+PuIi4kzxSRCR4FKKOw+uF8vK+genAAf9jJkwwIenGG/2BKTtbE75l6Hzj7W9wuOUw37n4OyN2UeFTcd6E83jjM28wJ3uO06WIyCiiEIU57bZ9e9+GlXV15nhoKBQWwuLF/rBUUgLJyc7WLaNbdXM1j334GJ8r/hyzs2c7XY6jIsZEkBmTSVSYmqKJyNAJWoiyLOtK4JtAxdFdr9q2/Z/Bev5gaW2FzZsDA9OmTdDWZo5HRUFxMVxzjT8wFRebRpYibvLo2kdp9bTywLkPOF2K49buW8tvt/6WRyY9Qky4fllFZGgEcyQqDnjMtu3vBvE5T0tdXd/Tcdu3m1YDYJY+mTULVq3yB6apU81SKSJuVttay+MfPs61068ddT2h+rNm7xoe3/o437z0m06XIiKjSDDjQnwQn2vQtm6Ff/s3E5h27/bvz8oyIal3S4H8fM1fkuHpZ+t/RlNHE/ctvM/pUlyhsaOREEKIDo92uhQRGUWCPRK1zLKsZUAI8H9s297Y3wNLS0uD+LKBXn89jk2b0pkypYP
LLmujsNBsaWmB/Zba282olNu1tbWd0fdLBs7pz6Tb2833136f2WmzCa8Np7RW/33sPbiX6DHR2Nttp0uRXpz+XZH+6XMJnkGFKMuybgNuO2b3c8CDtm2/YVnWIuDXwIz+vr+w8MwtElpYCHfdBRAFDP+eMaWlpWf0/ZKBc/ozeWXnK+xr2scjFz+i/zaOitgZQUx4jN4Pl3H6d0X6p89lYNatW3fcY4MKUbZt/xT46QmOv21ZVoZlWWNs21bLbZEgemLdE2TEZrC8cLnTpbhGU0cTMWGaUC4iQyuYV+c9AGyzbftZy7KmA4cUoESCq6G9gRd2vMCqOauIGKOW9z1+efkv2bi139kDIiJnTDDnRP0a+IVlWXcffd5bg/jcIgL82f4zHV0dXDPtGqdLcZXwMeHEhsc6XYaIjDJBC1G2bZcDnwzW84lIX7/f9ntyEnKYnzPf6VJEREY9regmMkzUt9Xz8s6XuarwKkJD9KsrIuI0/SUWGSZe2fUKHV0dXFl0pdOliIgIClEiw8ZrZa+RGJmoU3kiIi6hECUyDHi9Xl4te5Xz888nLFTrEomIuIFClMgwsOvILsrryrlw4oVOlyIiIkcpRIkMA2v2rgHgkxM+6WwhIiLioxAlMgy8V/EeiZGJTE2b6nQpIiJylEKUyDDwXsV7zB03V60NRERcRH+RRVyuuaOZzdWbdVWeiIjLKESJuNy6ynV0e7uZN26e06WIiEgvClEiLrelegsAJWNLHK5ERER6U4gScblth7aREJlAdny206WIiEgvClEiLrf10FaK0osICQlxuhQREelFIUrE5bYd2kZRWpHTZYiIyDEUokRc7HDLYaqbqylKV4gSEXEbhSgRFys9VAqgECUi4kIKUSIuVnakDIBJKZMcrkRERI6lECXiYhUNFQDkJOQ4XImIiBxLIUrExfY17CM1OpXo8GinSxERkWMoRIm4WEVDBbmJuU6XISIi/VCIEnGxfQ37dCpPRMSlFKJEXKyioYLcBI1EiYi4kUKUiEu1eFqoba3VSJSIiEspRIm4VM+VeRqJEhFxJ4UoEZfaV78PUHsDERG3UogScamDTQcByIrPcrgSERHpj0KUiEvVttYCkBqd6nAlIiLSH4UoEZeqaa0BIDk62eFKRESkPwpRIi5V01JDUlQSYaFhTpciIiL9UIgScama1hqdyhMRcTGFKBGXqm2tJTVGIUpExK0UokRcSiNRIiLuphAl4lL1bfUkRiU6XYaIiByHQpSISzV1NBEfEe90GSIichwKUSIu1dTRRFxEnNNliIjIcShEibiQ1+tViBIRcTmFKBEXau1sxYtXIUpExMUUokRcqKmjCUAhSkTExRSiRFxIIUpExP0GvZ6EZVnnAb8HVti2/Zej+2YCPwK8wCbbtlcFpUqRUUYhSkTE/QY1EmVZVgGwGlhzzKHvAl+0bXsBkGpZ1tLTrE9kVFKIEhFxv8GezqsElgMNPTssy4oA8m3b/vDorueBC06vPJHRSSFKRMT9BnU6z7btFgDLsnrvTgOO9Lp/EMgadGUio1hPiIoNj3W4EhEROZ6ThijLsm4Dbjtm94O2bb9yzL6Qfu57+3vO0tLSUy5wtGtra9P75TJD8ZmU7S0DYP+e/UQciTijrzVS6HfFffSZuJM+l+A5aYiybfunwE9P4bmqgd6rpY7DnPbro7Cw8JSKExM49X65y1B8Jh92mLPiU6dMZWLyxDP6WiOFflfcR5+JO+lzGZh169Yd91jQWhzYtu0BtluWtfDoruXAy8F6fpHRxNPlASA8NNzhSkRE5HgGe3Xepy3LehNYAjxkWdZfjx665+j9d4Bdtm2/FpwyRUYXT/fREDVGIUpExK0GO7H8BeCFfvZvAxadblEio13PSFRY6KBbuYmIyBmmjuUiLuQbidLpPBER11KIEnEh35wonc4TEXEthSgRF9JIlIiI+ylEibiQ5kSJiLifQpSIC3m6PYSFhhEScmwPWxERcQuFKBEX8nR
5dCpPRMTlFKJEXMjT7dGkchERl1OIEnEhjUSJiLifQpSIC2kkSkTE/RSiRFzI062RKBERt1OIEnEhT5dGokRE3E4hSsSFNBIlIuJ+ClEiLqSRKBER91OIEnEhjUSJiLifQpSIC2kkSkTE/RSiRFxII1EiIu6nECXiQhqJEhFxP4UoERfSSJSIiPspRIm4kEaiRETcTyFKxIU83R7CQsOcLkNERE5AIUrEhbQAsYiI++l/dUVcaNWcVeQm5jpdhoiInIBClIgL3TXvLqdLEBGRk9DpPBEREZFBUIgSERERGQSFKBEREZFBUIgSERERGQSFKBEREZFBUIgSERERGQSFKBEREZFBUIgSERERGQSFKBEREZFBUIgSERERGQSFKBEREZFBUIgSERERGQSFKBEREZFBUIgSERERGQSFKBEREZFBCPF6vUP6guvWrRvaFxQRERE5DbNnzw7pb/+QhygRERGRkUCn80REREQGQSFKREREZBDCnC5ATo1lWZnAduAK27bfdLicUc2yrDDgSWAiEA78H9u21zhb1ehlWdZ3gPmAF/iibdsfOlySAJZlPQIswvw785Bt2390uCQBLMuKBrYC/27b9i8cLmfY00jU8PEtoMzpIgSAG4Bm27YXAbcCjzpcz6hlWdZ5wGTbts8GbgMec7gkASzLOh+YfvRzWQJ81+GSxO9+oMbpIkYKhahhwLKsxUAjsNnpWgSA3wCrj94+BKQ6WMto9yngTwC2bW8Dki3LSnC2JAHeAq4+evsIEGtZ1hgH6xHAsqypQBHwgtO1jBQ6nedylmVFAA8Cl6H/m3MF27Y9gOfo3XuApxwsZ7QbC6zrdb/q6L4GZ8oRANu2u4Dmo3dvA148uk+c9V/APwM3OV3ISKEQ5SKWZd2G+YPT20vAT2zbrrMsy4GqRrfjfCYP2rb9imVZdwJnAZcMfWVy1LG9W0Iwc6PEBSzLugxzyvsip2sZ7SzLuhFYa9v2bv1bEjzqE+VylmW9A/QMgxdgTh9dbdv2VueqEsuybsWcrrjctu02p+sZrSzL+n9ApW3bTxy9XwbMtG270dHCBMuyLgb+A1hi23at0/WMdpZlPYO5GKYLyAHagdtt237N0cKGOYWoYcSyrF8Av9DVec6yLGsi8Axwnm3bLU7XM5pZlnUO8DXbti+0LGsW8APbthc6XddoZ1lWIvA2cIFt29VO1yOBjv7PR7muzjt9Op0nMnC3YSaTv9hrWPwi27Y7nCtpdLJt+13LstZZlvUu0A3c6XRNAsC1QBrwu16/Izfatr3XuZJEgk8jUSIiIiKDoBYHIiIiIoOgECUiIiIyCApRIiIiIoOgECUiIiIyCApRIiIiIoOgECUiIiIyCApRIiIiIoOgECUiIiIyCP8fMS2qSVrW0TIAAAAASUVORK5CYII=\n", 491 | "text/plain": [ 492 | "
" 493 | ] 494 | }, 495 | "metadata": { 496 | "needs_background": "light" 497 | }, 498 | "output_type": "display_data" 499 | } 500 | ], 501 | "source": [ 502 | "# Soft Exponential\n", 503 | "def soft_exponential_func(x, alpha):\n", 504 | " if alpha == 0.0:\n", 505 | " return x\n", 506 | " if alpha < 0.0:\n", 507 | " return - np.log(1 - alpha * (x + alpha)) / alpha\n", 508 | " if alpha > 0.0:\n", 509 | " return (np.exp(alpha * x) - 1) / alpha + alpha\n", 510 | "\n", 511 | "fig = plt.figure(figsize=(10,7))\n", 512 | "ax = plt.axes()\n", 513 | "\n", 514 | "plt.title(\"Soft Exponential\")\n", 515 | "\n", 516 | "x = np.linspace(-5, 5, 1000)\n", 517 | "ax.plot(x, soft_exponential_func(x, -1.0), '-g', label = 'Soft Exponential, alpha = -1.0', linestyle = 'dashed');\n", 518 | "ax.plot(x, soft_exponential_func(x, -0.5), '-g', label = 'Soft Exponential, alpha = -0.5');\n", 519 | "ax.plot(x, soft_exponential_func(x, 0), '-b', label = 'Soft Exponential, alpha = 0');\n", 520 | "ax.plot(x, soft_exponential_func(x, 0.5), '-r', label = 'Soft Exponential, alpha = 0.5');\n", 521 | "\n", 522 | "plt.legend();" 523 | ] 524 | }, 525 | { 526 | "cell_type": "markdown", 527 | "metadata": {}, 528 | "source": [ 529 | "To implement an activation function with trainable parameters, we have to:\n", 530 | "* derive a class from nn.Module and make alpha one of its members,\n", 531 | "* wrap alpha as a Parameter and set requires_grad to True.\n", 532 | "\n", 533 | "See an example:" 534 | ] 535 | }, 536 | { 537 | "cell_type": "code", 538 | "execution_count": 10, 539 | "metadata": {}, 540 | "outputs": [], 541 | "source": [ 542 | "class soft_exponential(nn.Module):\n", 543 | " '''\n", 544 | " Implementation of soft exponential activation.\n", 545 | "\n", 546 | " Shape:\n", 547 | " - Input: (N, *) where * means any number of additional\n", 548 | " dimensions\n", 549 | " - Output: (N, *), same shape as the input\n", 550 | "\n", 551 | " Parameters:\n", 552 | " - alpha - trainable parameter\n", 553 | "\n", 
554 | " References:\n", 555 | " - See related paper:\n", 556 | " https://arxiv.org/pdf/1602.01321.pdf\n", 557 | "\n", 558 | " Examples:\n", 559 | " >>> a1 = soft_exponential(256)\n", 560 | " >>> x = torch.randn(256)\n", 561 | " >>> x = a1(x)\n", 562 | " '''\n", 563 | " def __init__(self, in_features, alpha = None):\n", 564 | " '''\n", 565 | " Initialization.\n", 566 | " INPUT:\n", 567 | " - in_features: shape of the input\n", 568 | " - alpha: trainable parameter\n", 569 | " alpha is initialized with zero value by default\n", 570 | " '''\n", 571 | " super(soft_exponential, self).__init__()\n", 572 | " self.in_features = in_features\n", 573 | "\n", 574 | " # initialize alpha\n", 575 | " if alpha is None:\n", 576 | " self.alpha = Parameter(torch.tensor(0.0)) # create a tensor out of alpha\n", 577 | " else:\n", 578 | " self.alpha = Parameter(torch.tensor(alpha)) # create a tensor out of alpha\n", 579 | " \n", 580 | " self.alpha.requires_grad = True # set requires_grad to True!\n", 581 | "\n", 582 | " def forward(self, x):\n", 583 | " '''\n", 584 | " Forward pass of the function.\n", 585 | " Applies the function to the input elementwise.\n", 586 | " '''\n", 587 | " if (self.alpha == 0.0):\n", 588 | " return x\n", 589 | "\n", 590 | " if (self.alpha < 0.0):\n", 591 | " return - torch.log(1 - self.alpha * (x + self.alpha)) / self.alpha\n", 592 | "\n", 593 | " if (self.alpha > 0.0):\n", 594 | " return (torch.exp(self.alpha * x) - 1) / self.alpha + self.alpha" 595 | ] 596 | }, 597 | { 598 | "cell_type": "markdown", 599 | "metadata": {}, 600 | "source": [ 601 | "Let's make a small demo: create a simple model that uses the Soft Exponential activation:" 602 | ] 603 | }, 604 | { 605 | "cell_type": "code", 606 | "execution_count": 11, 607 | "metadata": {}, 608 | "outputs": [ 609 | { 610 | "name": "stdout", 611 | "output_type": "stream", 612 | "text": [ 613 | "Training the model. 
Make sure that the loss decreases after each epoch.\n", 614 | "\n", 615 | "Training loss: 558.482414022088\n", 616 | "Training loss: 472.3926571011543\n", 617 | "Training loss: 453.55578088760376\n", 618 | "Training loss: 438.11303447186947\n", 619 | "Training loss: 430.5142505913973\n" 620 | ] 621 | } 622 | ], 623 | "source": [ 624 | "# create a class for a basic fully-connected deep neural network\n", 625 | "class ClassifierSExp(nn.Module):\n", 626 | " '''\n", 627 | " Basic fully-connected network to test the Soft Exponential activation.\n", 628 | " '''\n", 629 | " def __init__(self):\n", 630 | " super().__init__()\n", 631 | "\n", 632 | " # initialize layers\n", 633 | " self.fc1 = nn.Linear(784, 256)\n", 634 | " self.fc2 = nn.Linear(256, 128)\n", 635 | " self.fc3 = nn.Linear(128, 64)\n", 636 | " self.fc4 = nn.Linear(64, 10)\n", 637 | "\n", 638 | " # initialize Soft Exponential activation\n", 639 | " self.a1 = soft_exponential(256)\n", 640 | " self.a2 = soft_exponential(128)\n", 641 | " self.a3 = soft_exponential(64)\n", 642 | "\n", 643 | " def forward(self, x):\n", 644 | " # make sure the input tensor is flattened\n", 645 | " x = x.view(x.shape[0], -1)\n", 646 | "\n", 647 | " # apply Soft Exponential unit\n", 648 | " x = self.a1(self.fc1(x))\n", 649 | " x = self.a2(self.fc2(x))\n", 650 | " x = self.a3(self.fc3(x))\n", 651 | " x = F.log_softmax(self.fc4(x), dim=1)\n", 652 | "\n", 653 | " return x\n", 654 | " \n", 655 | "model = ClassifierSExp()\n", 656 | "train_model(model)" 657 | ] 658 | }, 659 | { 660 | "cell_type": "markdown", 661 | "metadata": {}, 662 | "source": [ 663 | "## Implement an Activation Function with a Custom Backward Step\n", 664 | "A perfect example of an activation function that needs a custom backward step is [BReLU](https://arxiv.org/pdf/1709.04054.pdf) (Bipolar Rectified Linear Activation Unit):\n", 665 | "\n", 666 | "$$BReLU(x_i) = \\left\\{\\begin{matrix} f(x_i), i \\mod 2 = 0\\\\ - f(-x_i), i \\mod 2 \\neq 0 
\\end{matrix}\\right.$$\n", 667 | "\n", 668 | "This function is not differenciable at 0, so automatic gradient computation may fail.\n", 669 | "\n", 670 | "Plot:" 671 | ] 672 | }, 673 | { 674 | "cell_type": "code", 675 | "execution_count": 12, 676 | "metadata": { 677 | "_kg_hide-input": true 678 | }, 679 | "outputs": [ 680 | { 681 | "data": { 682 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAksAAAGnCAYAAACn0xNmAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzt3Xd4FOXCxuFfEhJ6LxEBFVCH0CFwjqgUBREUqSI9oESaiKhgoXwiCiog4gGlCEr3CAihVwUCAgqhSxi6EKWFZoD07PfHAIcSIqmz5bmvK1d2Zzazz2ZCePLO7DteDocDEREREUmet90BRERERJyZypKIiIhIClSWRERERFKgsiQiIiKSApUlERERkRSoLImIiIikIJvdAUTEMxiG4QAOA/FYf6gdAV4zTfPIbesTrn1JNmA90Mc0zSv/sO11wGTTNGfetrzeteUP37Z8CFDSNM3g9L0qEfEEGlkSkaxUzzTNANM0DWAn8GUy68uZplkOqAAUAgZkdUgRkZtpZElE7PIz0PRuK03TjDUMY8X1xxiG4QeMAhoBfsAk0zSHZ0VQEfFsGlkSkSx3rfh0BBal8JiCQHtg07VFfYDyQCWsUacXDcNokslRRUQ0siQiWWqdYRgJgD9wCmh2l/V+WIfgRgOfXVv3EvCFaZqxQKxhGNOBlsCSLEkuIh5LI0sikpWun5NUEOgOrDcMo/jt64F/AUnAD6ZpXj/huwDwqWEY+w3D2A+8AeT+h+dLIvnfcz5AYnpeiIh4DpUlEbGFaZqhwB9ArWTWRQL/AUbctPgvrHfPlbv2Udo0zTb/8DSngBKGYdw+iv4ocDzt6UXEk6gsiYgtDMN4FDCAPXd5yOfA44Zh1L12fxEQbBiGj2EYXoZhDDIMo1FKz2Ga5gFgAzDEMAyva8/bEKgLfJcRr0NE3J/OWRKRrHT9nCSAWKC7aZoHk3ugaZpRhmF8CowyDONfwDjgQeB3wAvYBoy56UtGGIYx6Kb7k0zTHA20wxqh2netMJ0AnjNN86+MfGEi4r68HA6H3RlEREREnJYOw4mIiIikQGVJREREJAUqSyIiIiIpUFkSERERSYHKkoiIiEgKMm3qgLCwML3NTkRERFxGYGCgV3LLM3WepcDAwMzcvFsJDw8nICDA7hhyG+0X56N94py0X5yP9knqhIWF3XWdDsOJiIiIpEBlSURERCQFKksiIiIiKVBZEhEREUmBypKIiIhIClSWRERERFKgsiQiIiKSApUlERERkRR4VFmKiIigWrVqdOrUiU6dOtGmTRu2bdsGwPz586lbt+6Nda1ateL7779PcXtjx45l5syZdyz/97//fcv9X3/9lT59+qQqa2hoKLNnz07V19yLtWvX8t57792xfMSIEbRp04ZWrVqxatWqNG17+PDhtGnThrZt27J79+70RhUREXEKmTqDtzMqXbo0M2bMAGDr1q2MHz+eKVOmAPDcc8/x7rvvAhAXF0fz5s2pXbs2JUuWzPKcderUybLn2rJlCwcPHuSHH37gwoULtGjRgoYNG6ZqG7/99ht//PEHP/zwA4cOHeL9999n7ty5mZRYREQk69hWlqbvms63O77N0G2+Uu0VgqoE3fPjIyM
jKVasWLLr/Pz8ePTRRzlx4gTFixdn8ODBnDhxgoSEBPr06UOtWrUyKjbjxo0jKiqKgIAAvvrqK3x9fSlSpAgHDx68Ud7AGv3aunUrFy5c4ODBg7z55pssWbKEw4cPM2rUKKpUqcK0adNYtmwZAPXr16dbt26Ypsm7776Lv79/sq+3Zs2aVK5cGYD8+fMTHR1NYmIiPj4+AJw+fZp+/frd8jWVKlXinXfeuXF/8+bNNGjQAICHH36Yv//+m8uXL5MnT54M+z6JiIjYweNGlo4ePUqnTp2IjY3l9OnTN0aVbhcZGcnu3bsZPHgwixcvpmjRogwfPpzz58/TuXNnFi9enGGZunXrRsuWLTlw4ADr169n1qxZd93+sWPHmD17NnPnzmXixImEhIQwf/58lixZQqFChViwYAHz5s0DoHXr1jRq1Iivv/6a3r1706BBAz744IM7tunj40OuXLkAmDt3LnXq1LlRlAD8/f1vjMbdTWRkJBUqVLhxv3Dhwpw9e1ZlSUREXJ5tZSmoSlCqRoEyys2H4Q4fPkzfvn1ZsGABAMuWLWPv3r3ExsYSGRnJoEGDKFy4MDt27CAsLIzt27cDEBsbS1xcXIZl8vPzo2PHjrRv354JEybg6+t718dWrFgRLy8vihYtimEY+Pj4UKRIEbZv3054eDhVqlQhWzZrt1auXJn9+/dz+PBhqlevDljnU4WGhia77TVr1jBv3jy+/Tb1I34Oh+OO+15eyV68WURExKV43MjSzcqWLUv27Nk5efIk8L9zlmJiYmjZsiXly5cHwNfXlx49etCkSZN72q6fnx9JSUl4e1vnz58/f/6uh/uuu3jxIvny5ePUqVMpPu56Ebr99vVycnNpcTgceHt731JckpKSkt3uhg0bmDBhApMnTyZv3ry3rLuXw3D+/v5ERkbeuH/mzBmKFCmS4msRERFxBR71brjbXbx4kbNnz+Lv73/L8hw5ctCrVy+GDx8OQJUqVVizZg0A586dY/To0Slut0aNGixduhSA+Ph4QkJCqF279l0fHxUVxeLFi5kzZw6TJ08mKioqTa8nICCAnTt3kpCQQEJCArt27SIgIIDSpUuzd+9ewHpnXnLPP2LECCZOnEiBAgXuWH/9MNzNHzcXJYAnnniClStXArBv3z6KFSumQ3AiIpIuUbFRDFk3hC0RW2zN4XEjS9fPWQLrcNrgwYPx8/O743FNmjRh1qxZbNy4kcaNG7Nlyxbatm1LYmIivXv3vvG46dOn3ygJ+fPnZ9y4cQwePJghQ4YwZ84c4uPjady4MXXr1gVg0qRJ1KxZk2rVqt3YxujRo2natClFihShY8eOjB49mkqVKqX6tZUsWZI2bdrQsWNHHA4HrVu3pkSJEvTs2ZMBAwYwY8YMSpYsSXx8/C1ft2zZMi5cuEDfvn1vLPvss8+4//777/m5q1evToUKFWjbti1eXl7JnhslIiJyr9YdW8fLC1/m+KXjVPavbGsWr9vPNckoYWFhjsDAwEzZtitbt24dOXPmvGMupvDwcAICAmxKJXej/eJ8tE+ck/aL83HVfRIdH837P73Pl79+ycOFHmZa82k8XurxTH/esLAwAgMDkz3Z1uNGluzm5+dH1apV7Y4hIiLidH778zeCFgRhnjPpXbM3nzb4lNx+ue2OpbKU1R5/PPPbsYiIiCuJS4xj6PqhfLLxE0rkLcGaTmuoX6a+3bFuUFkSERER2+w5vYegkCB2ntpJl6pdGPPsGPLnyG93rFuoLImIiEiWS0xKZOSmkfzf2v+jUM5CLGy7kKZGU7tjJUtlSURERLLUwXMHCQoJYkvEFl4s/yLjnx9PkVzOOzefypKIiIhkiSRHEl9v/Zp3Vr9Djmw5mN1yNm0rtnX6Kz541KSUERERVKtWjU6dOtGpUyfatGnDtm3bAOsitXXr1r2xrlWrVnz//fcpbm/s2LHMnDnzjuW3Twvw66+/0qdPn1RlDQ0NZfbs2an6mnuxdu1a3nvvvTu
WjxgxgjZt2tCqVStWrVp1x/pOnToRERFxT88RFRVFt27daNeuHV27duXixYvpzi0iIq7t+KXjPDPjGV5f/jr1HqrH3l57aVepndMXJfDAkaWbrw23detWxo8ff+NiutcvdwIQFxdH8+bNqV27NiVLlszynHXq1Mmy59qyZQsHDx7khx9+4MKFC7Ro0YKGDRumeXvTpk3jX//6F8HBwcyaNYtvvvmG/v37Z2BiERFxFQ6Hg2m7pvHGijdIciQxqckkgqsHu0RJus7WslSv3p3LXnoJevWCq1fhuefuXN+li/URGQkvvnjrunXrUvf8kZGRd71mm5+fH48++ignTpygePHiDB48mBMnTpCQkECfPn2oVatW6p4sBePGjSMqKoqAgAC++uorfH19KVKkCAcPHrxR3sAa/dq6dSsXLlzg4MGDvPnmmyxZsoTDhw8zatQoqlSpwrRp01i2bBkA9evXp1u3bpimybvvvou/v3+yr7dmzZpUrmzNjpo/f36io6NJTEzEx8fnjseuW7fuRrm87qWXXuKFF164cX/z5s03LhXToEEDevbsmf5vkoiIuJxTl0/RfUl3FpmLqPtgXb5r9h2lC5a2O1aqedzI0vXLncTGxnL69Ok7/uO/LjIykt27dzN48GAWL15M0aJFGT58OOfPn6dz584sXrw4wzJ169aNli1bcuDAAdavX8+sWbPuuv1jx44xe/Zs5s6dy8SJEwkJCWH+/PksWbKEQoUKsWDBAubNmwdA69atadSoEV9//TW9e/emQYMGyV6GxMfHh1y5cgEwd+5c6tSpk2xRAqhXrx71kmu5N4mMjKRQoUIAFClShDNnztzrt0JERNzE3N/n0nNpTy7HXWZ0w9G88dgbeHu55tk/tpallEaCcuVKeX2RIqkfSYJbD8MdPnyYvn37smDBAsC6RtrevXuJjY0lMjKSQYMGUbhwYXbs2EFYWBjbt28HrGvKxcXFpf7J78LPz4+OHTvSvn17JkyYgK+v710fW7FiRby8vChatCiGYeDj40ORIkXYvn074eHhVKlShWzZrN1auXJl9u/fz+HDh6levTpgnU8VGhqa7LbXrFnDvHnz+Pbbb9P1em6+hI7D4XCpoVYREUmf89Hn6b2sN9/v/Z6a99dkWvNpBBR1vcuu3MzjRpZuVrZsWbJnz87JkyeB/52zFBMTQ8uWLSlfvjwAvr6+9OjRgyZNmtzTdv38/EhKSsLb22rQ58+fv+vhvusuXrxIvnz5OHXqVIqPu16Ebr99vZTcXlS8vb1vKSxJSUnJbnfDhg1MmDCByZMnkzdv3rs+/70chvP39+fs2bPkzZuX06dPU7Ro0RRfk4iIuIdlB5cRvCiYs1fP8tFTH/Hek++Rzdv1q4ZrjodlkIsXL3L27Fn8/f1vWZ4jRw569ep147ybKlWqsGbNGgDOnTvH6NGjU9xujRo1WLp0KQDx8fGEhIRQu3btuz4+KiqKxYsXM2fOHCZPnkxUVFSaXk9AQAA7d+4kISGBhIQEdu3aRUBAAKVLl2bv3r2A9c685J5/xIgRTJw4kQIFCiS77RkzZlCyZEnq1avHjBkzbvm4uSgBPPHEE6xYsQKAVatWpfjaRUTE9UXFRtFtcTeen/08hXMV5rfg3xhUZ5BbFCXwwJGl6+csgXU4bfDgwfj5+d3xuCZNmjBr1iw2btxI48aN2bJlC23btiUxMZHevXvfeNz06dNZuXIlYJ0cPW7cOAYPHsyQIUOYM2cO8fHxNG7cmLp16wIwadIkatasSbVq1W5sY/To0TRt2pQiRYrQsWNHRo8eTaVKlVL92kqWLEmbNm3o2LEjDoeD1q1bU6JECXr27MmAAQNuFJ74+Phbvm7ZsmVcuHCBvn373lj22Wefcf/999+4P2zYMIKCgihVqtQ/5ujUqRP9+/enffv25MuXj5EjR6b6tYiIiGtYf2w9XRZ24fil47z7xLt8WO9DsmfLbnesDOV182GbjBQWFuYIDAzMlG2
7snXr1pEzZ8475mIKDw8nIMC1j+m6I+0X56N94py0X5xPZu+T6PhoBvw0gDG/juHhQg8zrfk0Hi/luheLDwsLIzAwMNmTbD1uZMlufn5+VK1a1e4YIiIiabb1z60EhQSxP3I/vWv25tMGn5LbL7fdsTKNylIWe/xx123dIiLi2eIS4/ho/Ud8svET7s97P6s7raZBmQZ2x8p0KksiIiLyj/ac3kNQSBA7T+2kc5XOfNnoS/LnyG93rCyhsiQiIiJ3lZiUyKhNo/i/df9HgRwFCGkTQrNyzeyOlaVUlkRERCRZB88dpHNIZzZHbObF8i8y/vnxFMlVxO5YWU5lSURERG6R5Ehi/NbxvLPmHfx8/JjVchbtKrbz2CsyqCyJiIjIDccvHafroq6sObKGRg83YkrTKdyf9/5//kI3prIkIiIiOBwOpu2axhsr3iAxKZGJTSbyavVXPXY06WYqSyIiIh7u1OVTdF/SnUXmIuo8WIfvmn1HmYJl7I7lNFSWREREPNi8ffPosaQHl+MuM7rhaN547A28vTz60rF3UFkSERHxQOejz/P68teZvWc2Ne6vwfTm0wkoqkvWJEdlSURExMMsP7ic4MXBnLlyhqH1hvJ+7ffJ5q1KcDf6zoiIiHiIqNgo3l71Nt9s/4YKRSuwuN1iqhevbncsp6eyJCIi4gHWH1tPl4Vd+OPiH7zz+DsMfWoo2bNltzuWS0hXWTIMIyfwOzDUNM2pGZJIREREMkx0fDQDfx7ImC1jKFOwDBtf2cjjpXRR99RI78jSIOBcRgQRERGRjLX1z60EhQSxP3I/r9V8jc8afEZuv9x2x3I5aS5LhmGUA8oDSzMujoiIiKRXXGIcY/eOZVL4JIrnLc7qTqtpUKaB3bFcVnpGlj4HegOdMyiLiIiIpNPeM3sJWhDEjlM76FylM2MajaFAjgJ2x3JpaSpLhmEEAZtN0zxqGMZdHxceHp7WXB4nJiZG3y8npP3ifLRPnJP2i/0SkxKZemAq/9n7H/L55uPzmp/TuHRjTh49yUlO2h3PpaV1ZOl5oIxhGE2AkkCsYRgRpmmuuflBAQGa3OpehYeH6/vlhLRfnI/2iXPSfrHXofOHCA4JZtOJTbQKaMX458cTeTxS+yQVwsLC7rouTWXJNM02128bhjEEOHZ7URIREZHMleRIYvzW8byz5h38fPyY1XIW7Sq2w8vLi0gi7Y7nNjTPkoiIiAs6cekEryx6hTVH1tDo4UZMfmEyJfKVsDuWW0p3WTJNc0gG5BAREZF74HA4mL5rOn1W9CExKZGJTSbyavVX8fLysjua29LIkoiIiIs4ffk03Zd0Z6G5kNoP1GZq86mUKVjG7lhuT2VJRETEBfy470d6LO1BVGwUnzf8nL6P9cXby9vuWB5BZUlERMSJXYi+QO/lvZm9ZzY17q/B9ObTCSiqd7llJZUlERERJ7Xi0Aq6LurKmStnGFpvKO89+R6+Pr52x/I4KksiIiJOJio2in6r+jFp+yQqFK3A4naLqV68ut2xPJbKkoiIiBMJ/SOULiFdOHbxGO88/g4fPvUhObLlsDuWR1NZEhERcQLR8dEM/HkgY7aMoUzBMmx4eQNPPPCE3bEElSURERHbbftrG0ELggiPDKdXjV6MeGYEuf1y2x1LrlFZEhERsUlcYhzDQocxbMMwiuctzqqOq3im7DN2x5LbqCyJiIjYYO+ZvQQtCGLHqR0EVQniy0ZfUiBHAbtjSTJUlkRERLJQYlIin2/+nMFrB1MgRwEWtFlA83LN7Y4lKVBZEhERySKHzh+ic0hnNp3YRMuAlkx4fgJFcxe1O5b8A5UlERGRTOZwOBi/bTz9V/fHz8ePmS1m0r5Se1381kWoLImIiGSiE5dO0HVRV1YfWc2zZZ9lStMplMhXwu5YkgoqSyIiIpnA4XAwY/cM+izvQ0JSAhOen0C3wG4aTXJBKksiIiIZ7PTl0/RY2oOQ/SHUfqA2U5tPpUzBMnbHkjRSWRI
REclAP+77kR5LexAVG8WoZ0bR97G++Hj72B1L0kFlSUREJANciL7A68tfZ9aeWQQWD2R6i+mUL1re7liSAVSWRERE0mnFoRV0XdSVM1fO8GG9D3n/yffx9fG1O5ZkEJUlERGRNLocd5l+q/oxMWwi5YuWZ1HbRQTeH2h3LMlgKksiIiJpEPpHKF1CunDs4jH6P96foU8NJUe2HHbHkkygsiQiIpIKMQkxDPxpIF9s+YLSBUsT+nIoTz7wpN2xJBOpLImIiNyjbX9tI2hBEOGR4fSq0YvPnvmMPH557I4lmUxlSURE5B/EJ8bzcejHDNswjPvy3MfKjitpWLah3bEki6gsiYiIpGDvmb0ELQhix6kddKrcif80/g8FchSwO5ZkIZUlERGRZCQmJTJ682gGrR1E/uz5mf/SfFoEtLA7lthAZUlEROQ2h84foktIF3458QstyrVgQpMJFMtdzO5YYhOVJRERkWscDgcTtk2g3+p++Hr7MqPFDDpU6qCL33o4lSURERHgxKUTdF3UldVHVtOwbEOmNJ1CyXwl7Y4lTkBlSUREPJrD4WDG7hn0Wd6HhKQExj8/nu6B3TWaJDeoLImIiMc6c+UM3Zd0J2R/CE8+8CRTm02lbKGydscSJ6OyJCIiHml++Hy6L+lOVGwUo54ZRd/H+uLj7WN3LHFCKksiIuJRLkRfoM+KPszcPZPqxaszvfl0KhSrYHcscWIqSyIi4jFWHlpJ10VdOX3lNEPqDmFA7QH4+vjaHUucnMqSiIi4vctxl+m3qh8TwyZSvmh5FrZdSOD9gXbHEhehsiQiIm5twx8b6LKwC0cvHKVfrX589PRH5MiWw+5Y4kJUlkRExC3FJMQw6OdBjN48mtIFSxP6cihPPvCk3bHEBaksiYiI29n21zaCFgQRHhlOj8AejGw4kjx+eeyOJS5KZUlERNxGfGI8wzYM4+PQj7kvz32s6LCCZx9+1u5Y4uJUlkRExC38fuZ3gkKC2H5yO50qd+LLRl9SMGdBu2OJG1BZEhERl5aYlMgXW75g0M+DyJc9Hz++9CMtA1raHUvciMqSiIi4rMPnD9NlYRc2Ht9Ii3ItmNBkAsVyF7M7lrgZlSUREXE5DoeDCdsm0G91P3y9fZnefDodK3fUxW8lU6gsiYiIS4n4O4Kui7qy6vAqGpZtyJSmUyiZr6TdscSNqSyJiIhLcDgczNw9k9eXv058Ujzjnx9P98DuGk2STKeyJCIiTu/MlTP0WNKDBfsX8OQDTzK12VTKFiprdyzxECpLIiLi1BaEL6D7ku5cir3EyGdG8uZjb+Lj7WN3LPEgKksiIuKULsZc5PXlrzNz90yqF6/O2uZrqVCsgt2xxAOpLImIiNNZdXgVryx8hVOXT/FB3Q8YWHsgvj6+dscSD6WyJCIiTuNy3GX6r+rPhLAJlC9anoVtFxJ4f6DdscTDqSyJiIhT2Hh8I51DOnP0wlH61erHR09/RI5sOeyOJaKyJCIi9opJiGHwz4P5fPPnlC5YmvVd1lP7wdp2xxK5QWVJRERsE/ZXGEEhQew7u48egT0Y2XAkefzy2B1L5BYqSyIikuXiE+MZvmE4H2/4GP/c/qzosIJnH37W7lgiyVJZEhGRLLXv7D6CFgQRdjKMjpU78p9G/6FgzoJ2xxK5K5UlERHJEolJiXyx5QsG/TyIfNnz8eNLP9IyoKXdsUT+kcqSiIhkusPnD/PywpfZcHwDzcs1Z2KTiRTLXczuWCL3RGVJREQyjcPhYGLYRPqt6kc272xMbz6djpU76uK34lJUlkREJFNE/B1B10VdWXV4Fc+UeYYpTadQKn8pu2OJpJrKkoiIZCiHw8GsPbPovaw38UnxfP3c1/So0UOjSeKy0lWWDMMYAdS+tp1PTNOcnyGpRETEJZ29cpYeS3swP3w+T5R6gqnNp/JwoYftjiWSLt5p/ULDMJ4CKpqmWQtoBIzJsFQiIuJyQvaHUOHrCiw5sIQRDUawvst6FSV
xC+kZWQoFfrt2+wKQ2zAMH9M0E9MfS0REXMXFmIu89+t7LPpjEdWLV2dt87VUKFbB7lgiGSbNZelaKbpy7W4wsExFSUTEs6w6vIpXFr7Cqcun+KDuBwysPRBfH1+7Y4lkKC+Hw5GuDRiG0QwYADQ0TfPS9eVhYWGOXLlypTOe54iJiSFHDl1d29lovzgf7RPncCX+Cp/v/pz/Hv4vZfKVYWjVoVS/r7rdseQm+reSOlevXiUwMDDZdyGk9wTvZ4GBQKObi9J1AQEB6dm8RwkPD9f3ywlpvzgf7RP7bTy+kS4hXThy4Qhv13qbj576iGOHjmm/OBn9W0mdsLCwu65Lc1kyDCM/MBJoYJrm+bRuR0REXENMQgz/t/b/GLVpFA8VeIh1XdZR58E6dscSyXTpGVlqAxQB5hiGcX1ZkGmax9OdSkREnMr2k9sJWhDE72d/p3tgd0Y1HEUevzx2xxLJEuk5wXsSMCkDs4iIiJOJT4xn+IbhfLzhY4rlLsbyDstp9HAju2OJZCnN4C0iIsnad3YfQQuCCDsZRodKHRjbeCwFcxa0O5ZIllNZEhGRWyQmJTJmyxgG/jyQvNnzMq/1PFqVb2V3LBHbqCyJiMgNRy4coUtIFzYc30AzoxkTm0zEP4+/3bFEbKWyJCIiOBwOJoVN4u1Vb+Pj7cO05tPoVLmTLn4rgsqSiIjH+/PvP+m6qCsrD6+kQZkGfNv0W0rlL2V3LBGnobIkIuKhHA4Hs/bM4vXlrxOXGMdXz31Fzxo9NZokchuVJRERD3T2yll6LO3B/PD5PFHqCaY2n8rDhR62O5aIU1JZEhHxMCH7Q+i2uBuXYi8xosEI3qr1Fj7ePnbHEnFaKksiIh7iYsxF3ljxBtN3TafafdX4ucXPVCxW0e5YIk5PZUlExAOsPryaVxa9wsmok/xfnf9jYJ2B+Pn42R1LxCWoLImIuLErcVfov7o/47eNp1yRcmzuupmaJWraHUvEpagsiYi4qV+O/0LnkM4cuXCEtx57i4+f/picvjntjiXiclSWRETcTExCDB+s/YCRm0byUIGHWNdlHXUerGN3LBGXpbIkIuJGtp/cTtCCIH4/+zvdqndjVMNR5M2e1+5YIi5NZUlExA3EJ8bzycZP+Cj0I4rlLsbyDstp9HAju2OJuAWVJRERF7fv7D46h3Rm21/b6FCpA2Mbj6VgzoJ2xxJxGypLIiIuKsmRxJgtYxjw0wDyZs/L3NZzebH8i3bHEnE7KksiIi7oyIUjvLzwZUL/CKWZ0YyJTSbin8ff7lgibkllSUTEhTgcDiaFTeLtVW/j4+3D1GZTCaoSpIvfimQilSURERfx599/Erw4mBWHVtCgTAO+bfotpfKXsjuWiNtTWRIRcXIOh4PZe2bTe3lv4hLj+Oq5r+hRowfeXt52RxPxCCpLIiJO7OyVs/Rc2pMfw3/k8VKPM7XZVB4p/IjdsUQ8isqSiIiTWrh/Id2WdOP7b5YBAAAeRklEQVRizEU+a/AZb9eyzlMSkaylsiQi4mQuxlzkjRVvMH3XdKreV5U1ndZQyb+S3bFEPJbKkoiIE1lzZA0vL3yZk1EnGVxnMIPqDMLPx8/uWCIeTWVJRMQJXIm7wjur3+HrbV9Trkg5NnfdTM0SNe2OJSKoLImI2G7TiU10DunM4fOHeeuxt/j46Y/J6ZvT7lgico3KkoiITWISYvhg7QeM2jyKB/I/wNrOa6n7UF27Y4nIbVSWRERssOPkDjot6MTvZ3+nW/VujGo4irzZ89odS0SSobIkIpKF4hPj+XTjpwwNHUrRXEVZ1n4ZjR9pbHcsEUmBypKISBYJPxtOUEgQ2/7aRvtK7RnbeCyFchayO5aI/AOVJRGRTJbkSGLMljEM+GkAefzyMLf1XF4s/6LdsUTkHqksiYhkoqMXjtJlYRdC/wilqdGUSU0m4Z/H3+5YIpIKKksiIpnA4XDwzfZveGvlW/h4+zC12VSCqgTh5eVldzQRSSWVJRG
RDPbn338SvDiYFYdWUL90fb5t9i0P5H/A7lgikkYqSyIiGcThcPD93u95bdlrxCbEMq7xOHrW7Im3l7fd0UQkHVSWREQywNkrZ+m5tCc/hv9IrZK1mNZ8Go8UfsTuWCKSAVSWRETSaZG5iFcXv8rFmIt8Wv9T+j3eDx9vH7tjiUgGUVkSEUmjSzGXeGPFG0zbNY2q91VlTac1VPKvZHcsEclgKksiImmw5sgaXln4Cn9F/cXgOoMZVGcQfj5+dscSkUygsiQikgpX4q7w7pp3+WrrV5QrUo5NXTfxrxL/sjuWiGQilSURkXu06cQmOod05tD5Q/T9d1+G1x9OTt+cdscSkUymsiQi8g9iE2L5YN0HjNw0kgfyP8Dazmup91A9u2OJSBZRWRIRScGOkzsICgli75m9vFr9VT5v+Dl5s+e1O5aIZCGVJRGRZCQkJfDpxk/5cP2HFM1VlKXtl/LcI8/ZHUtEbKCyJCJym/2R+wlaEMTWv7bSrmI7xj03jkI5C9kdS0RsorIkInJNkiOJL7d8yYCfB5DbNzdzXpxD6wqt7Y4lIjZTWRIRAY5eOMrLC19m/R/reeHRF5j0wiTuy3Of3bFExAmoLImIR3M4HEzePpm3Vr2Ft5c33zX7js5VOuPl5WV3NBFxEipLIuKx/or6i+BFwSw/tJynSz/Nd82+44H8D9gdS0ScjMqSiHgch8PBf/f+l9eWvUZMQgxjG4+lV81eeHt52x1NRJyQypKIeJTIq5H0XNqTefvm8VjJx5jWfBqPFn7U7lgi4sRUlkTEYywyF/Hq4le5GHORT+t/Sr/H++Hj7WN3LBFxcipLIuL2LsVcou/KvkzdOZUq/lVY02kNlfwr2R1LRFyEypKIuLWfjvzEywtf5q+ovxhUexCD6w7Gz8fP7lgi4kJUlkTELV2Ju8J7a95j3NZxGIUNNnXdxL9K/MvuWCLiglSWRMTtbD6xmaCQIA6dP0Tff/dleP3h5PTNaXcsEXFRKksi4jZiE2IZsm4IIzaNoFS+UqztvJZ6D9WzO5aIuDiVJRFxCztP7aTTgk7sPbOX4GrBjH52NHmz57U7loi4AZUlEXFpCUkJfLrxUz5c/yFFcxVlafulPPfIc3bHEhE3kuayZBjGF8BjgAN4wzTNrRmWSkTkHuyP3E/QgiC2/rWVdhXbMe65cRTKWcjuWCLiZtI0t79hGHWBR0zTrAUEA+MyNJWISAqSHEmM2TKGahOrceTCEea8OIfZrWarKIlIpkjrhZDqAyEApmnuAwoahpEvw1KJiNxFxOUInp72NG+ufJMGZRqwt9deWldobXcsEXFjaT0Mdx8QdtP909eW/Z3uRPcoJiGGnkt7cvry6ax6ykx1+fJl8mzPY3cMuY32i/NwOODPjU+xL6wwuVpu59um33JoXhdemeF1y+MefBDGj7duv/02hIffup1y5WD0aOt2z57wxx+3rq9WDYYNs2536QJnzty6/oknYOBA63bbtvD3bb/1GjSAt96ybjdtCgkJt65v0gR69YL4eGjW7M7X+dJL1vP+/be1/dt17gxt2sDp0/Dyy3eu79HDet5jx6znud2bb8Izz1jfl7ffvnP9gAHw5JOwfTsMGnTn+o8+gsBA2LgRhg//3/LLl0uRJw+MGgXly8Pq1fDFF3d+/ddfw0MPweLF/9tPN/vuO/D3hx9+gGnT7lz//feQPz9MnQpz5ty5fuFC8PW1tr148a3rsmWDRYus26NHw5o1t67Pm9d6XrB+Bn755db1xYpZzwvW92b79lvXO9vP3sWL1j65zlV/9kqVgokT73x8VkprWfJK5r7j9geF3/6TkoGiE6I5dvoY52LPZdpzZKWkpCQuxl20O4bcRvvFOcRfKkrE90O4vK82BR/5nTnPzKdEjhL89McFIiJunz8pjvDwvwA4dqw4ERHZb1mbI0cs4eEnATh+/H4iIm6dzbtgwWjCw09fW1+SyMhbf00ePXqF8PCz19aX4vLlW68td+T
IZcLDIwGIiHiQhASv29b/TXj4eeLjISLioTte6+HDlwgPv0BUlDcREQ/csf7QoQuEh1/izJlsRESUTGb9OcLDozh+3JeIiBLJrI+kZMnLHDiQnYiI4nesP3DgDIULX+XAgRxERNyXzPrT5MoVzcGDuYiIKHZjeVKSFxcvRnPgwEm8vGI5dCgPERFF7vj6/fv/JDo6noMH8xIRUTiZ9RGcP5/A4cP5iYgomMz64+TLl8SRIwWJiMh/x/p9+47h5wdHjhQiIuLWAx4+Pg7Cw62GcuRIESIibv1DKHfuJMLDjwNw7FhRIiJy37I+JiaB8PCIa+v9nf5nLy7O2if/W++aP3sORzzh4X/e8fis5OVw3NFx/pFhGEOAk6ZpTrx2/whQxTTNqOuPCQsLcwQGBmZUTrcXHh5OQECA3THkNtov9nI4rJGE3r0hJgZGjICnngqnQgXtE2ejfyvOR/skdcLCwggMDLx9MAhI+zlLq4AXAQzDqAb8dXNREhHJCOfOWYcrDAN27rRKk3daf2uJiKRRmn7tmKa5CQgzDGMTMBZ4LUNTiYhH27jRGlUqUsS6vXEjPPqo3alExFOleZ4l0zTfy8ggIiKXLkHfvtZJtDNnQocOUKmS3alExNNpBm8RcQo//WS9w+bPP613/LTWbAAi4iR09F9EbPfRR9bbnnPlgk2b4OOPwc/vn79ORCQrqCyJiO0eewzeeMOaX+Xf/7Y7jYjIrXQYTkSyXGwsDBkC2bNbn595xvoQEXFGGlkSkSy1cyfUqAGffmrNBJyGqd5ERLKUypKIZImEBOtcpJo1ITISliyxLg3hlewUcCIizkNlSUSyxIEDMHQotGoFe/fC88/bnUhE5N6oLIlIpklKgpUrrdvly8Pu3fDf/0LhOy8JJiLitFSWRCRTHD0KTz8NjRrB5s3WsnLl7M0kIpIWKksikqEcDvjmG6hc2ZoKYMoUa2oAERFXpakDRCRDdegA338PTz0F330HDz5odyIRkfRRWRKRdLv+9n8vL+vE7Vq14LXXwFtj1yLiBlSWRCRdIiOhVy+oXx+6d7dGlkRE3In+7hORNFu8GCpWhJAQiI62O42ISOZQWRKRVLt0CV5+GZo2hfvug23boG9fu1OJiGQOlSURSbVt22DGDBg4EH77zXrnm4iIu9I5SyJyT65ehXXr4LnnrPOTDh/WO91ExDNoZElE/tHmzVC1KjRrBidOWMtUlETEU6gsichdxcbC++/Dk09CXBysWgWlStmdSkQka+kwnIgkKyEBnngCwsKga1cYPRry5bM7lYhI1tPIkojcIinJ+pwtG3TpYk0PMHmyipKIeC6VJRG5Yf9+a/btxYut+717Q5Mm9mYSEbGbypKIkJQEY8ZAtWpw6BAkJtqdSETEeeicJREPd+yYdbht/Xrrum7ffAPFi9udSkTEeagsiXi4detg+3aYMsWaldvLy+5EIiLORYfhRDzQyZOwfLl1u3NnOHAAXnlFRUlEJDkqSyIe5r//hQoVrJJ09apVkO67z+5UIiLOS2VJxENERkKbNtCuHTz6KGzcCLly2Z1KRMT56ZwlEQ9w4QJUqgTnzsHw4dC/vzWPkoiI/DP9uhRxYwkJVikqWBDefBOefRaqVLE7lYiIa9FhOBE39fPPYBiwdat1/513VJRERNJCZUnEzVy9Cn36QP361qiS3uEmIpI+KksibmTzZqhaFcaOtQrTjh1Qo4bdqUREXJvOWRJxIz/9BHFx1iG4p56yO42IiHvQyJKIi9u1C9autW6/9x7s3q2iJCKSkVSWRFxUQgIMGwY1a8Lbb4PDYZ2jlC+f3clERNyLypKICzJNeOIJGDQIWrSA1at1IreISGbROUsiLiY8HKpXt2bf/u9/rVm5RUQk86gsibiI2FjInh3KlYPBg+Hll6F4cbtTiYi4Px2GE3FyDgdMmQKlS8ORI9bhtgEDVJRERLKKypKIEzt5El54AYKDrdm4dT03EZGsp7Ik4qR++AEqVLD
mTvryS+vzAw/YnUpExPPo71QRJ7VuHTzyCEyfbo0qiYiIPVSWRJzI0qVw330QGAijR4Ovrw69iYjYTYfhRJzA339D167QpAl89pm1LGdOFSUREWegsiRis59/hkqVYOpUeP99mDHD7kQiInIz/d0qYqOVK6FRI+vcpF9+gcceszuRiIjcTiNLIja4csX6XL8+jBgBO3eqKImIOCuVJZEsFBcHAwda7247d846J6l/f+vSJSIi4px0GE4ki+zaBUFBsHs3vPKK9U43ERFxfhpZEslkSUkwfDjUrAmnT8OiRdblS/LlszuZiIjcC5UlkUzm5QWbN0OLFvD779blS0RExHXoMJxIJkhKgq++guefhzJlYO5cyJHD7lQiIpIWGlkSyWB//AENGkCfPvDdd9YyFSUREdelkSWRDOJwwLffwptvWrcnT7ZO5BYREdemkSWRDPL11xAcDNWrw5491uVLvLzsTiUiIumlkSWRdLp4EQoUgM6drekAgoPBW3+GiIi4Df1KF0mjc+egXTt4/HGIjoY8eaBbNxUlERF3o1/rImmwdClUrAg//ggdOmiCSRERd6bDcCKpcPWq9S63KVOgUiVYvhyqVrU7lYiIZCaNLImkgq+vNbHk++/D1q0qSiIiniBNI0uGYWQDpgBlAF+gn2maGzMymIizuHoVhg2Dt96CwoUhNFSH3UREPElaR5Y6AVdM06wNdAVGZ1wkEeexa1cOqlWzru22dKm1TEVJRMSzpLUszQTeunb7LFA4Y+KIOIe4OBg4EDp0eIjoaFizBoKC7E4lIiJ28HI4HOnagGEYw4FE0zQH37w8LCzMkStXrnRt25PExMSQQ9fEcBqffOLPjBmFaNr0HAMHRpI3b5LdkeQa/VtxTtovzkf7JHWuXr1KYGBgslMJ/+M5S4ZhBAPBty3+wDTNlYZhvAZUB5K9jnpAQEBqs3qs8PBwfb9slpAAly5Z5yV99hm0bg0PP3xG+8XJ6N+Kc9J+cT7aJ6kTFhZ213X/WJZM05wMTL59uWEYXbFKUnPTNOPTE1DEbgcOWDNw+/nB2rVQvDi88AKEh9udTERE7Jamc5YMwygD9ABamqYZk7GRRLJOUhL85z/WFACmCT17agZuERG5VVonpQzGOql7mWEY15c1NE0zLkNSiWSBU6egfXtrJKlxY5g8Ge6/3+5UIiLibNJUlkzTHAAMyOAsIlkqd27r+m7ffANdu4JXsqf1iYiIp9MBB/EoJ0/C669DTAzkzQvbt0NwsIqSiIjcncqSeIw5c6yL306eDL/9Zi3z8bE3k4iIOD+VJXF7585B27bQpg2ULQs7dkCdOnanEhERV6GyJG6vSxf48Uf46CPYtAnKlbM7kYiIuJK0vhtOxKn9/bc1LUCBAjBypFWUqla1O5WIiLgijSyJ21m3DipXhtdes+6XK6eiJCIiaaeyJG4jOhr69oWnnrJm4u7d2+5EIiLiDnQYTtzC3r3w4ovWLNy9e8Onn1rzKImIiKSXypK4hcKFIWdOWLMG6te3O42IiLgTHYYTl7V7N/TqZZ3IXby4NcGkipKIiGQ0lSVxOQkJ1mG2GjVg/nw4etRarlm4RUQkM6gsiUs5cABq14b334dmzaxzlcqWtTuViIi4M52zJC4jKQlatoS//oLZs61ZuTWaJCIimU1lSZze8eNQrBjkyAEzZoC/P9x/v92pRETEU+gwnDgthwO++866+O2HH1rLqlVTURIRkaylsiRO6dQp65ykV16B6tWhWze7E4mIiKdSWRKns3q1NZq0ejV88QX8/DOULm13KhER8VQ6Z0mcTokSUL48TJpkXddNRETEThpZEqewbJl1XTewilJoqIqSiIg4B5UlsVVUFLz6Kjz/PPz0E1y6ZHciERGRW6ksiW3WrYPKleHbb+Hdd2HbNsif3+5UIiIit9I5S2KLK1fgxRehYEHYsAEef9zuRCIiIslTWZIstXevdU5S7tywfPn/bouIiDgrHYaTLBEXB4M
HQ9Wq8M031rKaNVWURETE+WlkSTLdnj0QFAQ7d0KXLtY13URERFyFRpYkU33zDQQGWhe/XbjQunyJTuIWERFXorIkmerRR6F5c/j9d2ja1O40IiIiqafDcJKhkpLg66/h7Fnr4rd161ofIiIirkojS5Jhjh+Hhg3h9dchLAwSE+1OJCIikn4qS5JuDgdMnQqVKsGWLTBxIixeDD4+dicTERFJPx2Gk3Q7fhx69IB//9s6gbtMGbsTiYiIZByNLEmabdlifX7wQfjlF1i7VkVJRETcj8qSpNr589C+PdSqBStWWMsCA8FbP00iIuKGdBhOUmXZMggOtt7tNnQoNGhgdyIREZHMpbEAuWf9+8Pzz0OhQvDrr9blS7KpbouIiJtTWZJ7FhgI77xjTQtQvbrdaURERLKGypLcVXQ0vPUWjB1r3W/bFj77DLJntzeXiIhIVlJZkmRt3WqNHn3xhTU1gIiIiKdSWZJbxMVZ5yLVqgVXrsDq1TBypN2pRERE7KOyJLcIC4Nhw6BjR9izR+92ExER0XuZhMRECA2Fp56yRpT27IEKFexOJSIi4hw0suThDh6E2rWhfn3Yt89apqIkIiLyPypLHiopCcaNgypVIDwcZs6EgAC7U4mIiDgfHYbzQA4HNG0KS5dCo0YweTKUKGF3KhEREeeksuRBHA7w8rI+GjWyCtOrr1r3RUREJHk6DOchTp2C5s1h3jzrfu/e0K2bipKIiMg/UVnyAPPmQcWKsHIlnD9vdxoRERHXorLkxs6fhw4doHVrKF0aduywRpNERETk3qksubE1a2DOHBg6FDZt0rvdRERE0kIneLuZqCj47Tdr3qTWrSEwEMqWtTuViIiI69LIkhtZvx4qV4ZmzaxDcF5eKkoiIiLppbLkBqKj4a23rMuV+PhYJ3IXKmR3KhEREfegw3AuLjoaatSwLlXSqxeMGAG5c9udSkRExH2oLLmo6xNM5sxpveOtZk145hm7U4mIiLgfHYZzQXv3WuVo0ybr/oABKkoiIiKZRWXJhSQmWofZAgPhxAm4fNnuRCIiIu5Ph+FcxKFD0LmzNZrUqhWMHw9Fi9qdSkRExP2pLLmIkBDrJO6ZM6F9e13TTUREJKvoMJwTO3HCmjsJ4M03rbLUoYOKkoiISFZSWXJCDgdMm2Zd/LZLF0hIsOZPKl7c7mQiIiKeJ11lyTAMf8MwLhiGUS+D8ni806ehRQurJFWpAj/9BNl0sFRERMQ26f1veCRwJCOCCEREQLVq1vXdPv8c3njDGlESERER+6S5LBmG8TQQBezJuDieKTHR+lyiBPTsCW3bQvny9mYSERERi5fD4Uj1FxmG4QesBpoBY4Cppmmuu/kxYWFhjly5cmVERrcWGpqb4cPv48svD2EYOnPb2cTExJAjRw67Y8hNtE+ck/aL89E+SZ2rV68SGBiY7H/E/ziyZBhGMBB82+LlwDemaV40DOOuXxsQEJCanB4lKgr69YNJk6BCBfDyykFAQBm7Y8ltwsPD9XPsZLRPnJP2i/PRPkmdsLCwu677x7JkmuZkYPLNywzD+AXwMQyjN1AW+JdhGK1N0/w9nVk9QmiodQL3sWPQvz8MHQpHj8baHUtERESSkaZzlkzTfOL6bcMwpmIdhlNRukfz51tzJYWGwpNP2p1GREREUqJ5lrLItm3w22/W7U8+gV27VJRERERcQbpn8DFNs0sG5HBbcXEwbJj1Ubs2rF0LOXPanUpERETulaY7zER790JQEOzYYX3+8ku7E4mIiEhqqSxlkt9+s0aS8ueHBQugeXO7E4mIiEha6JylDBYXZ30ODIR33rFGl1SUREREXJfKUgZJSoKvvwbDgDNnrMuUfPQRFCtmdzIRERFJD5WlDHDiBDz7LLz2Gjz66P8uXyIiIiKuT2UpHRwOmD4dKlWCzZthwgRYsQKKF7c7mYiIiGQUneCdTosWWWVp6lQoW9buNCIiIpLRVJbSYP58qFjROuQ2dao1b5KPj92
pREREJDPoMFwqXLgAHTtCq1YwapS1LE8eFSURERF3prJ0j1assEaTfvgBhgyBr76yO5GIiIhkBR2Guwdz5kCbNlC+vHWOUmCg3YlEREQkq2hkKQXR0dbnJk1g+HAIC1NREhER8TQqS8mIjoa334Zq1eDKFciVC95/H3LksDuZiIiIZDWVpdts22aNHo0eDU8/bXcaERERsZvK0jXx8fDBB/DYY/D337BypXX5kty57U4mIiIidlJZusbbG9asgfbtYc8eaNjQ7kQiIiLiDDz63XCJiTBuHLRrZ13wdvVq6/wkERERkes8dmTp0CGoWxf69rWu7wYqSiIiInInjytLDgeMHw9VqsDevTBjhvXONxEREZHkeFxZGj4cevWCJ5+0ylLHjuDlZXcqERERcVYecc6SwwFRUZAvH3TrBv7+0LWrSpKIiIj8M7cfWTpzBlq2hGefhYQEKFoUgoNVlEREROTeuHVZmj8fKlSA5cvhxRdVkERERCT13LIs/f03dOoErVrBAw9Y13R7+23w8bE7mYiIiLgatyxL2bLB9u3WjNxbtlijSyIiIiJp4TYneF++DJ99Bu+9Z12iZMcO8POzO5WIiIi4OrcYWdqwwZo3adgwaxZuUFESERGRjOHSZSkmBvr1s2biBggNhebN7c0kIiIi7sWly1KPHvD559C9O+zaZU00KSIiIpKRXPqcpYEDrYvgPvus3UlERETEXbl0WXrkEetDREREJLO49GE4ERERkcymsiQiIiKSApUlERERkRSoLImIiIikQGVJREREJAUqSyIiIiIpUFkSERERSYHKkoiIiEgKVJZEREREUqCyJCIiIpIClSURERGRFKgsiYiIiKRAZUlEREQkBSpLIiIiIilQWRIRERFJgZfD4ciUDYeFhWXOhkVEREQyQWBgoFdyyzOtLImIiIi4Ax2GExEREUmBypKIiIhICrLZHUBuZRiGP7AfaGGa5jqb43g0wzCyAVOAMoAv0M80zY32pvJshmF8ATwGOIA3TNPcanMkj2cYxgigNtb/J5+Ypjnf5khyjWEYOYHfgaGmaU61OY5L08iS8xkJHLE7hADQCbhimmZtoCsw2uY8Hs0wjLrAI6Zp1gKCgXE2R/J4hmE8BVS8tk8aAWNsjiS3GgScszuEO1BZciKGYTwNRAF77M4iAMwE3rp2+yxQ2MYsAvWBEADTNPcBBQ3DyGdvJI8XCrS+dvsCkNswDB8b88g1hmGUA8oDS+3O4g50GM5JGIbhB3wANEN/nTkF0zTjgfhrd/sCs22MI3AfEHbT/dPXlv1tTxwxTTMRuHLtbjCw7Noysd/nQG+gs91B3IHKkg0MwwjG+sVys+XAN6ZpXjQMw4ZUnu0u++QD0zRXGobxGlAdeCHrk8lNbp//xAvr3CWxmWEYzbAOVTe0O4uAYRhBwGbTNI/q/5OMoXmWnIRhGL8A14evy2Id9mltmubv9qUSwzC6Yh1maG6aZozdeTyZYRhDgJOmaU68dv8IUMU0zShbg3k4wzCeBT4CGpmmed7uPAKGYfyA9caURKAkEAt0N01zja3BXJjKkhMyDGMqMFXvhrOXYRhlgB+AuqZpXrU7j6czDONx4EPTNJ8xDKMaMNY0zSftzuXJDMPID2wAGpimecbuPHKna39kHNO74dJHh+FE7i4Y66TuZTcNZTc0TTPOvkieyzTNTYZhhBmGsQlIAl6zO5PQBigCzLnp30iQaZrH7YskkvE0siQiIiKSAk0dICIiIpIClSURERGRFKgsiYiIiKRAZUlEREQkBSpLIiIiIilQWRIRERFJgcqSiIiISApUlkRERERS8P+pOhCtDwjHbAAAAABJRU5ErkJggg==\n", 683 | "text/plain": [ 684 | "
<Figure size 720x504 with 1 Axes>" 685 | ] 686 | }, 687 | "metadata": { 688 | "needs_background": "light" 689 | }, 690 | "output_type": "display_data" 691 | } 692 | ], 693 | "source": [ 694 | "# ReLU function\n", 695 | "def relu(x):\n", 696 | " return (x >= 0) * x\n", 697 | "# inverted ReLU\n", 698 | "def inv_relu(x):\n", 699 | " return - relu(- x)\n", 700 | "\n", 701 | "fig = plt.figure(figsize=(10,7))\n", 702 | "ax = plt.axes()\n", 703 | "\n", 704 | "plt.title(\"BReLU\")\n", 705 | "\n", 706 | "x = np.linspace(-5, 5, 1000)\n", 707 | "ax.plot(x, relu(x), '-g', label = 'BReLU, i mod 2 = 0');\n", 708 | "ax.plot(x, inv_relu(x), '-b', label = 'BReLU, i mod 2 != 0', linestyle='dashed');\n", 709 | "\n", 710 | "plt.legend();" 711 | ] 712 | }, 713 | { 714 | "cell_type": "markdown", 715 | "metadata": {}, 716 | "source": [ 717 | "To implement a custom activation function with a backward step, we should:\n", 718 | "* create a class which inherits `Function` from `torch.autograd`,\n", 719 | "* override the static `forward` and `backward` methods. The `forward` method just applies the function to the input. 
The `backward` method should compute the gradient of the loss function with respect to the input, given the gradient of the loss function with respect to the output.\n", 720 | "\n", 721 | "Let's see an example for BReLU:" 722 | ] 723 | }, 724 | { 725 | "cell_type": "code", 726 | "execution_count": 13, 727 | "metadata": {}, 728 | "outputs": [], 729 | "source": [ 730 | "class brelu(Function):\n", 731 | " '''\n", 732 | " Implementation of the BReLU activation function.\n", 733 | "\n", 734 | " Shape:\n", 735 | " - Input: (N, *) where * means any number of additional\n", 736 | " dimensions\n", 737 | " - Output: (N, *), same shape as the input\n", 738 | "\n", 739 | " References:\n", 740 | " - See the BReLU paper:\n", 741 | " https://arxiv.org/pdf/1709.04054.pdf\n", 742 | "\n", 743 | " Examples:\n", 744 | " >>> brelu_activation = brelu.apply\n", 745 | " >>> t = torch.randn((5,5), dtype=torch.float, requires_grad = True)\n", 746 | " >>> t = brelu_activation(t)\n", 747 | " '''\n", 748 | " # both forward and backward are @staticmethods\n", 749 | " @staticmethod\n", 750 | " def forward(ctx, input):\n", 751 | " \"\"\"\n", 752 | " In the forward pass we receive a Tensor containing the input and return\n", 753 | " a Tensor containing the output. ctx is a context object that can be used\n", 754 | " to stash information for backward computation. 
You can cache arbitrary\n", 755 | " objects for use in the backward pass using the ctx.save_for_backward method.\n", 756 | " \"\"\"\n", 757 | " ctx.save_for_backward(input) # save input for backward pass\n", 758 | "\n", 759 | " # get lists of odd and even indices\n", 760 | " input_shape = input.shape[0]\n", 761 | " even_indices = [i for i in range(0, input_shape, 2)]\n", 762 | " odd_indices = [i for i in range(1, input_shape, 2)]\n", 763 | "\n", 764 | " # clone the input tensor\n", 765 | " output = input.clone()\n", 766 | "\n", 767 | " # apply ReLU to elements where i mod 2 == 0\n", 768 | " output[even_indices] = output[even_indices].clamp(min=0)\n", 769 | "\n", 770 | " # apply inverted ReLU to elements where i mod 2 != 0\n", 771 | " output[odd_indices] = 0 - output[odd_indices] # negate elements with odd indices\n", 772 | " output[odd_indices] = - output[odd_indices].clamp(min = 0) # apply ReLU and negate back\n", 773 | "\n", 774 | " return output\n", 775 | "\n", 776 | " @staticmethod\n", 777 | " def backward(ctx, grad_output):\n", 778 | " \"\"\"\n", 779 | " In the backward pass we receive a Tensor containing the gradient of the loss\n", 780 | " with respect to the output, and we need to compute the gradient of the loss\n", 781 | " with respect to the input.\n", 782 | " \"\"\"\n", 783 | " grad_input = None # set grad_input to None\n", 784 | "\n", 785 | " input, = ctx.saved_tensors # restore input from context\n", 786 | "\n", 787 | " # if the input does not require grad,\n", 788 | " # we return None to speed up the computation\n", 789 | " if ctx.needs_input_grad[0]:\n", 790 | " grad_input = grad_output.clone()\n", 791 | "\n", 792 | " # get lists of odd and even indices\n", 793 | " input_shape = input.shape[0]\n", 794 | " even_indices = [i for i in range(0, input_shape, 2)]\n", 795 | " odd_indices = [i for i in range(1, input_shape, 2)]\n", 796 | "\n", 797 | " # set grad_input for even_indices\n", 798 | " grad_input[even_indices] = 
(input[even_indices] >= 0).float() * grad_input[even_indices]\n", 799 | "\n", 800 | " # set grad_input for odd_indices\n", 801 | " grad_input[odd_indices] = (input[odd_indices] < 0).float() * grad_input[odd_indices]\n", 802 | "\n", 803 | " return grad_input" 804 | ] 805 | }, 806 | { 807 | "cell_type": "markdown", 808 | "metadata": {}, 809 | "source": [ 810 | "Create a simple classifier model for a demonstration and run training:" 811 | ] 812 | }, 813 | { 814 | "cell_type": "code", 815 | "execution_count": 14, 816 | "metadata": {}, 817 | "outputs": [ 818 | { 819 | "name": "stdout", 820 | "output_type": "stream", 821 | "text": [ 822 | "Training the model. Make sure that loss decreases after each epoch.\n", 823 | "\n", 824 | "Training loss: 585.9589667767286\n", 825 | "Training loss: 442.55006262660027\n", 826 | "Training loss: 414.7568979859352\n", 827 | "Training loss: 390.45185139775276\n", 828 | "Training loss: 372.7816904038191\n" 829 | ] 830 | } 831 | ], 832 | "source": [ 833 | "class ClassifierBReLU(nn.Module):\n", 834 | " '''\n", 835 | " Simple fully-connected classifier model to demonstrate BReLU activation.\n", 836 | " '''\n", 837 | " def __init__(self):\n", 838 | " super(ClassifierBReLU, self).__init__()\n", 839 | "\n", 840 | " # initialize layers\n", 841 | " self.fc1 = nn.Linear(784, 256)\n", 842 | " self.fc2 = nn.Linear(256, 128)\n", 843 | " self.fc3 = nn.Linear(128, 64)\n", 844 | " self.fc4 = nn.Linear(64, 10)\n", 845 | "\n", 846 | " # create shortcuts for BReLU\n", 847 | " self.a1 = brelu.apply\n", 848 | " self.a2 = brelu.apply\n", 849 | " self.a3 = brelu.apply\n", 850 | "\n", 851 | " def forward(self, x):\n", 852 | " # make sure the input tensor is flattened\n", 853 | " x = x.view(x.shape[0], -1)\n", 854 | "\n", 855 | " # apply BReLU\n", 856 | " x = self.a1(self.fc1(x))\n", 857 | " x = self.a2(self.fc2(x))\n", 858 | " x = self.a3(self.fc3(x))\n", 859 | " x = F.log_softmax(self.fc4(x), dim=1)\n", 860 | " \n", 861 | " return x\n", 862 | " \n", 
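A hand-written `backward` like the one above is easy to get subtly wrong, so it is worth validating against numerical differentiation with `torch.autograd.gradcheck`. Below is a minimal, self-contained sketch of the same `Function` pattern, using plain ReLU instead of BReLU for brevity (`MyReLU` is an illustrative name, not part of this notebook):

```python
import torch
from torch.autograd import Function, gradcheck

class MyReLU(Function):
    '''Minimal custom Function with an explicit backward, same pattern as brelu.'''
    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)  # stash the input for the backward pass
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        # pass the gradient through only where the input was positive
        return grad_output * (input > 0).type_as(grad_output)

# gradcheck compares the analytic backward against finite differences;
# it expects double-precision inputs for numerical stability
t = torch.randn(6, 4, dtype=torch.double, requires_grad=True)
print(gradcheck(MyReLU.apply, (t,), eps=1e-6, atol=1e-4))
```

The same call with `brelu.apply` and a double-precision input can be used to check the BReLU implementation itself.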
863 | "model = ClassifierBReLU()\n", 864 | "train_model(model)" 865 | ] 866 | }, 867 | { 868 | "cell_type": "markdown", 869 | "metadata": {}, 870 | "source": [ 871 | "## Conclusion\n", 872 | "In this tutorial I demonstrated:\n", 873 | "* How to create a simple custom activation function,\n", 874 | "* How to create an activation function with learnable parameters, which can be trained using gradient descent,\n", 875 | "* How to create an activation function with a custom backward step." 876 | ] 877 | }, 878 | { 879 | "cell_type": "markdown", 880 | "metadata": {}, 881 | "source": [ 882 | "## Improvement\n", 883 | "While building a lot of custom activation functions, I noticed that they often consume much more GPU memory. Creating in-place implementations of custom activations using PyTorch's in-place methods would improve this situation." 884 | ] 885 | }, 886 | { 887 | "cell_type": "markdown", 888 | "metadata": {}, 889 | "source": [ 890 | "## Additional References\n", 891 | "Links to additional resources and further reading:\n", 892 | "1. [Activation functions wiki page](https://en.wikipedia.org/wiki/Activation_function)\n", 893 | "2. [Tutorial on extending PyTorch](https://pytorch.org/docs/master/notes/extending.html)\n", 894 | "3. [Implementation of Maxout in PyTorch](https://github.com/Usama113/Maxout-PyTorch/blob/master/Maxout.ipynb)\n", 895 | "4. [PyTorch Comprehensive Overview](https://medium.com/@layog/a-comprehensive-overview-of-pytorch-7f70b061963f)\n", 896 | "5. 
[PyTorch and Fashion MNIST Kernel](https://www.kaggle.com/arturlacerda/pytorch-conditional-gan), from which I copied some code to load the data" 897 | ] 898 | }, 899 | { 900 | "cell_type": "markdown", 901 | "metadata": {}, 902 | "source": [ 903 | "## PS\n", 904 | "![echo logo](https://github.com/Lexie88rus/Activation-functions-examples-pytorch/blob/master/assets/echo_logo.png?raw=true)\n", 905 | "\n", 906 | "I am participating in the implementation of the __Echo package__, a mathematical backend for neural networks which can be used with the most popular existing packages (TensorFlow, Keras and [PyTorch](https://pytorch.org/)). We have implemented a lot of activation functions for PyTorch so far. Here is a [link to the repository on GitHub](https://github.com/digantamisra98/Echo/tree/Dev-adeis); __I would highly appreciate your feedback__ on it." 907 | ] 908 | } 909 | ], 910 | "metadata": { 911 | "kernelspec": { 912 | "display_name": "Python 3", 913 | "language": "python", 914 | "name": "python3" 915 | }, 916 | "language_info": { 917 | "codemirror_mode": { 918 | "name": "ipython", 919 | "version": 3 920 | }, 921 | "file_extension": ".py", 922 | "mimetype": "text/x-python", 923 | "name": "python", 924 | "nbconvert_exporter": "python", 925 | "pygments_lexer": "ipython3", 926 | "version": "3.6.6" 927 | } 928 | }, 929 | "nbformat": 4, 930 | "nbformat_minor": 1 931 | } 932 | -------------------------------------------------------------------------------- /gain/activation_gain.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn.functional as F 3 | import sys 4 | 5 | # see pytorch forum topic: 6 | # https://discuss.pytorch.org/t/calculate-gain-tanh/20854 7 | 8 | # 1.5212 looks like a good gain value for SiLU 9 | 10 | def silu(input): 11 | ''' 12 | Applies the Sigmoid Linear Unit (SiLU) function element-wise: 13 | SiLU(x) = x * sigmoid(x) 14 | ''' 15 | return input * torch.sigmoid(input) 16 | 17 | a = torch.randn(1000,1000, 
requires_grad=True) 18 | b = a 19 | print (f"in: {a.std().item():.4f}") 20 | for i in range(100): 21 | l = torch.nn.Linear(1000,1000, bias=False) 22 | torch.nn.init.xavier_normal_(l.weight, float(sys.argv[2])) 23 | #b = getattr(F, sys.argv[1])(l(b)) 24 | b = silu(l(b)) 25 | 26 | if i % 10 == 0: 27 | print (f"out: {b.std().item():.4f}", end=" ") 28 | a.grad = None 29 | b.sum().backward(retain_graph=True) 30 | print (f"grad: {a.grad.abs().mean().item():.4f}") 31 | -------------------------------------------------------------------------------- /in-place-operations-in-pytorch.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": { 7 | "_cell_guid": "b1076dfc-b9ad-4769-8c92-a6c4dae69d19", 8 | "_kg_hide-input": true, 9 | "_uuid": "8f2839f25d086af736a60e9eeb907d3b93b6e0e5" 10 | }, 11 | "outputs": [], 12 | "source": [ 13 | "# Imports\n", 14 | "\n", 15 | "# Import basic libraries\n", 16 | "import numpy as np # linear algebra\n", 17 | "import pandas as pd # data processing, CSV file I/O (e.g. 
pd.read_csv)\n", 18 | "%matplotlib inline\n", 19 | "import matplotlib.pyplot as plt\n", 20 | "from collections import OrderedDict\n", 21 | "from PIL import Image\n", 22 | "\n", 23 | "# Import PyTorch\n", 24 | "import torch # import main library\n", 25 | "from torch.autograd import Variable\n", 26 | "import torch.nn as nn # import modules\n", 27 | "from torch import optim # import optimizers for demonstrations\n", 28 | "import torch.nn.functional as F # import torch functions\n", 29 | "from torchvision import transforms # import transformations to use for demo\n", 30 | "from torch.utils.data import Dataset, DataLoader " 31 | ] 32 | }, 33 | { 34 | "cell_type": "markdown", 35 | "metadata": {}, 36 | "source": [ 37 | "![image](https://github.com/Lexie88rus/Activation-functions-examples-pytorch/raw/master/assets/background-card-chip.jpg)" 38 | ] 39 | }, 40 | { 41 | "cell_type": "markdown", 42 | "metadata": {}, 43 | "source": [ 44 | "[_Photo by Fancycrave.com from Pexels_](https://www.pexels.com/photo/green-ram-card-collection-825262/)" 45 | ] 46 | }, 47 | { 48 | "cell_type": "markdown", 49 | "metadata": { 50 | "_cell_guid": "79c7e3d0-c299-4dcb-8224-4455121ee9b0", 51 | "_uuid": "d629ff2d2480ee46fbb7e2d37f6b5fab8052498a", 52 | "collapsed": true 53 | }, 54 | "source": [ 55 | "# In-Place Operations in PyTorch\n", 56 | "_What are they and why avoid them_" 57 | ] 58 | }, 59 | { 60 | "cell_type": "markdown", 61 | "metadata": {}, 62 | "source": [ 63 | "## Introduction\n" 64 | ] 65 | }, 66 | { 67 | "cell_type": "markdown", 68 | "metadata": {}, 69 | "source": [ 70 | "Today's advanced deep neural networks have millions of parameters (for example, see the comparison in [this paper](https://arxiv.org/pdf/1905.11946.pdf)) and trying to train them on free GPUs like Kaggle or Google Colab often leads to running out of memory on the GPU. 
There are several simple ways to reduce the GPU memory occupied by the model, for example:\n", 71 | "* Consider changing the architecture of the model or using a type of model with fewer parameters (for example, choose [DenseNet](https://arxiv.org/pdf/1608.06993.pdf)-121 over DenseNet-169). This approach can affect the model's performance metrics.\n", 72 | "* Reduce the batch size or manually set the number of data loader workers. In this case, the model will take longer to train.\n", 73 | "\n", 74 | "Using in-place operations in neural networks may help to avoid the downsides of the approaches mentioned above while saving some GPU memory. However, it is strongly __not recommended to use in-place operations__ for several reasons.\n", 75 | "\n", 76 | "In this kernel I would like to:\n", 77 | "* Describe what in-place operations are and demonstrate how they might help to save GPU memory.\n", 78 | "* Explain why we should avoid in-place operations or use them with great caution." 79 | ] 80 | }, 81 | { 82 | "cell_type": "markdown", 83 | "metadata": {}, 84 | "source": [ 85 | "## In-place Operations\n", 86 | "`In-place operation is an operation that changes directly the content of a given linear algebra, vector, matrices(Tensor) without making a copy. The operators which helps to do the operation is called in-place operator.` See the [tutorial](https://www.tutorialspoint.com/inplace-operator-in-python) on in-place operations in Python.\n", 87 | "\n", 88 | "As the definition says, in-place operations don't make a copy of the input, which is why they can help to reduce memory usage when operating on high-dimensional data." 89 | ] 90 | }, 91 | { 92 | "cell_type": "markdown", 93 | "metadata": {}, 94 | "source": [ 95 | "I would like to run a simple model on the [Fashion MNIST dataset](https://www.kaggle.com/zalando-research/fashionmnist) to demonstrate how in-place operations help to consume less GPU memory. 
In this demonstration I use a simple fully-connected deep neural network with four linear layers and [ReLU](https://pytorch.org/docs/stable/nn.html#relu) activation after each hidden layer." 96 | ] 97 | }, 98 | { 99 | "cell_type": "markdown", 100 | "metadata": {}, 101 | "source": [ 102 | "## Setting Up The Demo\n", 103 | "In this section I will prepare everything for the demonstration:\n", 104 | "* Load the Fashion MNIST dataset,\n", 105 | "* Introduce transformations for Fashion MNIST images using PyTorch,\n", 106 | "* Prepare the model training procedure.\n", 107 | "\n", 108 | "If you are familiar with PyTorch basics, just skip this part and go straight to the rest of the kernel." 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "metadata": {}, 114 | "source": [ 115 | "### Introduce Transformations" 116 | ] 117 | }, 118 | { 119 | "cell_type": "markdown", 120 | "metadata": {}, 121 | "source": [ 122 | "The most efficient way to transform the input data is to use built-in PyTorch transformations:" 123 | ] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "execution_count": 2, 128 | "metadata": {}, 129 | "outputs": [], 130 | "source": [ 131 | "# Define a transform\n", 132 | "transform = transforms.Compose([transforms.ToTensor()])" 133 | ] 134 | }, 135 | { 136 | "cell_type": "markdown", 137 | "metadata": {}, 138 | "source": [ 139 | "### Load the Data" 140 | ] 141 | }, 142 | { 143 | "cell_type": "markdown", 144 | "metadata": {}, 145 | "source": [ 146 | "To load the data I used the standard Dataset and DataLoader classes from PyTorch and the [FashionMNIST class code from this kernel](https://www.kaggle.com/arturlacerda/pytorch-conditional-gan):" 147 | ] 148 | }, 149 | { 150 | "cell_type": "code", 151 | "execution_count": 3, 152 | "metadata": {}, 153 | "outputs": [], 154 | "source": [ 155 | "class FashionMNIST(Dataset):\n", 156 | " '''\n", 157 | " Dataset class to load Fashion MNIST data from CSV.\n", 158 | " Code from original kernel:\n", 159 | " 
https://www.kaggle.com/arturlacerda/pytorch-conditional-gan\n", 160 | " '''\n", 161 | " def __init__(self, transform=None):\n", 162 | " self.transform = transform\n", 163 | " fashion_df = pd.read_csv('../input/fashion-mnist_train.csv')\n", 164 | " self.labels = fashion_df.label.values\n", 165 | " self.images = fashion_df.iloc[:, 1:].values.astype('uint8').reshape(-1, 28, 28)\n", 166 | "\n", 167 | " def __len__(self):\n", 168 | " return len(self.images)\n", 169 | "\n", 170 | " def __getitem__(self, idx):\n", 171 | " label = self.labels[idx]\n", 172 | " img = Image.fromarray(self.images[idx])\n", 173 | " \n", 174 | " if self.transform:\n", 175 | " img = self.transform(img)\n", 176 | "\n", 177 | " return img, label\n", 178 | "\n", 179 | "# Load the training data for Fashion MNIST\n", 180 | "trainset = FashionMNIST(transform=transform)\n", 181 | "# Define the dataloader\n", 182 | "trainloader = torch.utils.data.DataLoader(trainset, batch_size=64, shuffle=True)" 183 | ] 184 | }, 185 | { 186 | "cell_type": "markdown", 187 | "metadata": {}, 188 | "source": [ 189 | "### Setup Training Procedure\n", 190 | "I wrote a small training procedure, which runs 5 training epochs and prints the loss for each epoch:" 191 | ] 192 | }, 193 | { 194 | "cell_type": "code", 195 | "execution_count": 4, 196 | "metadata": {}, 197 | "outputs": [], 198 | "source": [ 199 | "def train_model(model, device):\n", 200 | " '''\n", 201 | " Function trains the model and prints out the training log.\n", 202 | " '''\n", 203 | " #setup training\n", 204 | " \n", 205 | " #define loss function\n", 206 | " criterion = nn.NLLLoss()\n", 207 | " #define learning rate\n", 208 | " learning_rate = 0.003\n", 209 | " #define number of epochs\n", 210 | " epochs = 5\n", 211 | " #initialize optimizer\n", 212 | " optimizer = optim.Adam(model.parameters(), lr=learning_rate)\n", 213 | " \n", 214 | " model.to(device)\n", 215 | "\n", 216 | " #run training and print out the loss to make sure that we are actually fitting to the 
training set\n", 217 | " print('Training the model \\n')\n", 218 | " for e in range(epochs):\n", 219 | " running_loss = 0\n", 220 | " for images, labels in trainloader:\n", 221 | " \n", 222 | " images, labels = images.to(device), labels.to(device)\n", 223 | " images = images.view(images.shape[0], -1)\n", 224 | " log_ps = model(images)\n", 225 | " loss = criterion(log_ps, labels)\n", 226 | "\n", 227 | " optimizer.zero_grad()\n", 228 | " loss.backward()\n", 229 | " optimizer.step()\n", 230 | "\n", 231 | " running_loss += loss.item()\n", 232 | " else:\n", 233 | " # print out the loss to make sure it is decreasing\n", 234 | " print(f\"Training loss: {running_loss}\")" 235 | ] 236 | }, 237 | { 238 | "cell_type": "markdown", 239 | "metadata": {}, 240 | "source": [ 241 | "### Define the Model\n", 242 | "\n", 243 | "PyTorch provides an in-place implementation of the ReLU activation function. I will run training with the vanilla ReLU implementation and then with the in-place one." 244 | ] 245 | }, 246 | { 247 | "cell_type": "code", 248 | "execution_count": 5, 249 | "metadata": {}, 250 | "outputs": [], 251 | "source": [ 252 | "# create class for basic fully-connected deep neural network\n", 253 | "class Classifier(nn.Module):\n", 254 | " '''\n", 255 | " Demo classifier model class to demonstrate in-place operations\n", 256 | " '''\n", 257 | " def __init__(self, inplace = False):\n", 258 | " super().__init__()\n", 259 | "\n", 260 | " # initialize layers\n", 261 | " self.fc1 = nn.Linear(784, 256)\n", 262 | " self.fc2 = nn.Linear(256, 128)\n", 263 | " self.fc3 = nn.Linear(128, 64)\n", 264 | " self.fc4 = nn.Linear(64, 10)\n", 265 | " \n", 266 | " self.relu = nn.ReLU(inplace = inplace) # pass inplace as parameter to ReLU\n", 267 | "\n", 268 | " def forward(self, x):\n", 269 | " # make sure the input tensor is flattened\n", 270 | " x = x.view(x.shape[0], -1)\n", 271 | "\n", 272 | " # apply activation function\n", 273 | " x = self.relu(self.fc1(x))\n", 274 | "\n", 275 | " # apply 
activation function\n", 276 | " x = self.relu(self.fc2(x))\n", 277 | " \n", 278 | " # apply activation function\n", 279 | " x = self.relu(self.fc3(x))\n", 280 | " \n", 281 | " x = F.log_softmax(self.fc4(x), dim=1)\n", 282 | "\n", 283 | " return x" 284 | ] 285 | }, 286 | { 287 | "cell_type": "markdown", 288 | "metadata": {}, 289 | "source": [ 290 | "## Compare Memory Usage for In-place and Vanilla Operations\n" 291 | ] 292 | }, 293 | { 294 | "cell_type": "markdown", 295 | "metadata": {}, 296 | "source": [ 297 | "Let's compare memory usage for a single call of the ReLU activation function:" 298 | ] 299 | }, 300 | { 301 | "cell_type": "code", 302 | "execution_count": 6, 303 | "metadata": {}, 304 | "outputs": [ 305 | { 306 | "data": { 307 | "text/plain": [ 308 | "device(type='cuda', index=0)" 309 | ] 310 | }, 311 | "execution_count": 6, 312 | "metadata": {}, 313 | "output_type": "execute_result" 314 | } 315 | ], 316 | "source": [ 317 | "# empty caches and setup the device\n", 318 | "torch.cuda.empty_cache()\n", 319 | "\n", 320 | "device = torch.device('cuda:0' if torch.cuda.is_available() else \"cpu\")\n", 321 | "device" 322 | ] 323 | }, 324 | { 325 | "cell_type": "code", 326 | "execution_count": 7, 327 | "metadata": {}, 328 | "outputs": [], 329 | "source": [ 330 | "def get_memory_allocated(device, inplace = False):\n", 331 | " '''\n", 332 | " Function measures allocated memory before and after the ReLU function call.\n", 333 | " '''\n", 334 | " \n", 335 | " # Create a large tensor\n", 336 | " t = torch.randn(10000, 10000, device=device)\n", 337 | " \n", 338 | " # Measure allocated memory\n", 339 | " torch.cuda.synchronize()\n", 340 | " start_max_memory = torch.cuda.max_memory_allocated() / 1024**2\n", 341 | " start_memory = torch.cuda.memory_allocated() / 1024**2\n", 342 | " \n", 343 | " # Call in-place or normal ReLU\n", 344 | " if inplace:\n", 345 | " F.relu_(t)\n", 346 | " else:\n", 347 | " output = F.relu(t)\n", 348 | " \n", 349 | " # Measure allocated memory after 
the call\n", 350 | " torch.cuda.synchronize()\n", 351 | " end_max_memory = torch.cuda.max_memory_allocated() / 1024**2\n", 352 | " end_memory = torch.cuda.memory_allocated() / 1024**2\n", 353 | " \n", 354 | " # Return amount of memory allocated for ReLU call\n", 355 | " return end_memory - start_memory, end_max_memory - start_max_memory" 356 | ] 357 | }, 358 | { 359 | "cell_type": "markdown", 360 | "metadata": {}, 361 | "source": [ 362 | "Run out of place ReLU:" 363 | ] 364 | }, 365 | { 366 | "cell_type": "code", 367 | "execution_count": 8, 368 | "metadata": {}, 369 | "outputs": [ 370 | { 371 | "name": "stdout", 372 | "output_type": "stream", 373 | "text": [ 374 | "Allocated memory: 382.0\n", 375 | "Allocated max memory: 382.0\n" 376 | ] 377 | } 378 | ], 379 | "source": [ 380 | "memory_allocated, max_memory_allocated = get_memory_allocated(device, inplace = False)\n", 381 | "print('Allocated memory: {}'.format(memory_allocated))\n", 382 | "print('Allocated max memory: {}'.format(max_memory_allocated))" 383 | ] 384 | }, 385 | { 386 | "cell_type": "markdown", 387 | "metadata": {}, 388 | "source": [ 389 | "Run in-place ReLU:" 390 | ] 391 | }, 392 | { 393 | "cell_type": "code", 394 | "execution_count": 9, 395 | "metadata": {}, 396 | "outputs": [ 397 | { 398 | "name": "stdout", 399 | "output_type": "stream", 400 | "text": [ 401 | "Allocated memory: 0.0\n", 402 | "Allocated max memory: 0.0\n" 403 | ] 404 | } 405 | ], 406 | "source": [ 407 | "memory_allocated_inplace, max_memory_allocated_inplace = get_memory_allocated(device, inplace = True)\n", 408 | "print('Allocated memory: {}'.format(memory_allocated_inplace))\n", 409 | "print('Allocated max memory: {}'.format(max_memory_allocated_inplace))" 410 | ] 411 | }, 412 | { 413 | "cell_type": "markdown", 414 | "metadata": {}, 415 | "source": [ 416 | "Now let's do the same while training a simple classifier.\n", 417 | "Run training with vanilla ReLU:" 418 | ] 419 | }, 420 | { 421 | "cell_type": "code", 422 | 
"execution_count": 10, 423 | "metadata": {}, 424 | "outputs": [ 425 | { 426 | "name": "stdout", 427 | "output_type": "stream", 428 | "text": [ 429 | "Training the model \n", 430 | "\n", 431 | "Training loss: 490.5504989773035\n", 432 | "Training loss: 361.1345275044441\n", 433 | "Training loss: 329.05726308375597\n", 434 | "Training loss: 306.97832968086004\n", 435 | "Training loss: 292.6471059694886\n" 436 | ] 437 | } 438 | ], 439 | "source": [ 440 | "# initialize classifier\n", 441 | "model = Classifier(inplace = False)\n", 442 | "\n", 443 | "# measure allocated memory\n", 444 | "torch.cuda.synchronize()\n", 445 | "start_max_memory = torch.cuda.max_memory_allocated() / 1024**2\n", 446 | "start_memory = torch.cuda.memory_allocated() / 1024**2\n", 447 | "\n", 448 | "# train the classifier\n", 449 | "train_model(model, device)\n", 450 | "\n", 451 | "# measure allocated memory after training\n", 452 | "torch.cuda.synchronize()\n", 453 | "end_max_memory = torch.cuda.max_memory_allocated() / 1024**2\n", 454 | "end_memory = torch.cuda.memory_allocated() / 1024**2" 455 | ] 456 | }, 457 | { 458 | "cell_type": "code", 459 | "execution_count": 11, 460 | "metadata": {}, 461 | "outputs": [ 462 | { 463 | "name": "stdout", 464 | "output_type": "stream", 465 | "text": [ 466 | "Allocated memory: 1.853515625\n", 467 | "Allocated max memory: 0.0\n" 468 | ] 469 | } 470 | ], 471 | "source": [ 472 | "print('Allocated memory: {}'.format(end_memory - start_memory))\n", 473 | "print('Allocated max memory: {}'.format(end_max_memory - start_max_memory))" 474 | ] 475 | }, 476 | { 477 | "cell_type": "markdown", 478 | "metadata": {}, 479 | "source": [ 480 | "Run training with in-place ReLU:" 481 | ] 482 | }, 483 | { 484 | "cell_type": "code", 485 | "execution_count": 12, 486 | "metadata": {}, 487 | "outputs": [ 488 | { 489 | "name": "stdout", 490 | "output_type": "stream", 491 | "text": [ 492 | "Training the model \n", 493 | "\n", 494 | "Training loss: 485.5531446188688\n", 495 | "Training 
loss: 359.61066341400146\n", 496 | "Training loss: 329.1772850751877\n", 497 | "Training loss: 307.14213905483484\n", 498 | "Training loss: 292.3229675516486\n" 499 | ] 500 | } 501 | ], 502 | "source": [ 503 | "# initialize model with in-place ReLU\n", 504 | "model = Classifier(inplace = True)\n", 505 | "\n", 506 | "# measure allocated memory\n", 507 | "torch.cuda.synchronize()\n", 508 | "start_max_memory = torch.cuda.max_memory_allocated() / 1024**2\n", 509 | "start_memory = torch.cuda.memory_allocated() / 1024**2\n", 510 | "\n", 511 | "# train the classifier with in-place ReLU\n", 512 | "train_model(model, device)\n", 513 | "\n", 514 | "# measure allocated memory after training\n", 515 | "torch.cuda.synchronize()\n", 516 | "end_max_memory = torch.cuda.max_memory_allocated() / 1024**2\n", 517 | "end_memory = torch.cuda.memory_allocated() / 1024**2" 518 | ] 519 | }, 520 | { 521 | "cell_type": "code", 522 | "execution_count": 13, 523 | "metadata": {}, 524 | "outputs": [ 525 | { 526 | "name": "stdout", 527 | "output_type": "stream", 528 | "text": [ 529 | "Allocated memory: 1.853515625\n", 530 | "Allocated max memory: 0.0\n" 531 | ] 532 | } 533 | ], 534 | "source": [ 535 | "print('Allocated memory: {}'.format(end_memory - start_memory))\n", 536 | "print('Allocated max memory: {}'.format(end_max_memory - start_max_memory))" 537 | ] 538 | }, 539 | { 540 | "cell_type": "markdown", 541 | "metadata": {}, 542 | "source": [ 543 | "Looks like using in-place ReLU really helps us to save some GPU memory. But we should be __extremely cautious when using in-place operations and check twice__. In the next section I will show why." 
544 ] 545 | }, 546 | { 547 | "cell_type": "markdown", 548 | "metadata": {}, 549 | "source": [ 550 | "## Downsides of In-place Operations" 551 | ] 552 | }, 553 | { 554 | "cell_type": "markdown", 555 | "metadata": {}, 556 | "source": [ 557 | "The major downside of in-place operations is that __they might overwrite values required to compute gradients__, which breaks the training procedure of the model. That is what [the official PyTorch autograd documentation](https://pytorch.org/docs/stable/notes/autograd.html#in-place-operations-with-autograd) says:\n", 558 | "> Supporting in-place operations in autograd is a hard matter, and we discourage their use in most cases. Autograd’s aggressive buffer freeing and reuse makes it very efficient and there are very few occasions when in-place operations actually lower memory usage by any significant amount. Unless you’re operating under heavy memory pressure, you might never need to use them.\n", 559 | "\n", 560 | "> There are two main reasons that limit the applicability of in-place operations:\n", 561 | "\n", 562 | "> 1. In-place operations can potentially overwrite values required to compute gradients.\n", 563 | "> 2. Every in-place operation actually requires the implementation to rewrite the computational graph. Out-of-place versions simply allocate new objects and keep references to the old graph, while in-place operations, require changing the creator of all inputs to the Function representing this operation. This can be tricky, especially if there are many Tensors that reference the same storage (e.g. created by indexing or transposing), and in-place functions will actually raise an error if the storage of modified inputs is referenced by any other Tensor." 564 | ] 565 | }, 566 | { 567 | "cell_type": "markdown", 568 | "metadata": {}, 569 | "source": [ 570 | "The other reason to be careful with in-place operations is that their implementation is extremely tricky. 
That is why I would __recommend using PyTorch's standard in-place operations__ (like `torch.tanh_` or `torch.sigmoid_`) instead of implementing one manually.\n", 571 | "\n", 572 | "Let's see an example with the [SiLU](https://arxiv.org/pdf/1606.08415.pdf) (or Swish-1) activation function. This is the normal implementation of SiLU:" 573 | ] 574 | }, 575 | { 576 | "cell_type": "code", 577 | "execution_count": 14, 578 | "metadata": {}, 579 | "outputs": [], 580 | "source": [ 581 | "def silu(input):\n", 582 | " '''\n", 583 | " Normal implementation of SiLU activation function\n", 584 | " https://arxiv.org/pdf/1606.08415.pdf\n", 585 | " '''\n", 586 | " return input * torch.sigmoid(input)" 587 | ] 588 | }, 589 | { 590 | "cell_type": "markdown", 591 | "metadata": {}, 592 | "source": [ 593 | "Let's try to implement in-place SiLU using the `torch.sigmoid_` in-place function:" 594 | ] 595 | }, 596 | { 597 | "cell_type": "code", 598 | "execution_count": 15, 599 | "metadata": {}, 600 | "outputs": [], 601 | "source": [ 602 | "def silu_inplace_1(input):\n", 603 | " '''\n", 604 | " Incorrect implementation of in-place SiLU activation function\n", 605 | " https://arxiv.org/pdf/1606.08415.pdf\n", 606 | " '''\n", 607 | " return input * torch.sigmoid_(input) # THIS IS INCORRECT!!!" 608 | ] 609 | }, 610 | { 611 | "cell_type": "markdown", 612 | "metadata": {}, 613 | "source": [ 614 | "The code above __incorrectly__ implements in-place SiLU. 
We can verify this:" 615 | ] 616 | }, 617 | { 618 | "cell_type": "code", 619 | "execution_count": 16, 620 | "metadata": {}, 621 | "outputs": [ 622 | { 623 | "name": "stdout", 624 | "output_type": "stream", 625 | "text": [ 626 | "Original SiLU: tensor([ 0.0796, -0.2744, -0.2598])\n", 627 | "In-place SiLU: tensor([0.5370, 0.2512, 0.2897])\n" 628 | ] 629 | } 630 | ], 631 | "source": [ 632 | "t = torch.randn(3)\n", 633 | "\n", 634 | "# print result of original SiLU\n", 635 | "print(\"Original SiLU: {}\".format(silu(t)))\n", 636 | "\n", 637 | "# change the value of t with in-place function\n", 638 | "silu_inplace_1(t)\n", 639 | "print(\"In-place SiLU: {}\".format(t))" 640 | ] 641 | }, 642 | { 643 | "cell_type": "markdown", 644 | "metadata": {}, 645 | "source": [ 646 | "It is easy to see that the function `silu_inplace_1` in fact returns `sigmoid(input) * sigmoid(input)`!" 647 | ] 648 | }, 649 | { 650 | "cell_type": "markdown", 651 | "metadata": {}, 652 | "source": [ 653 | "A working in-place implementation of SiLU using `torch.sigmoid_` could be:" 654 | ] 655 | }, 656 | { 657 | "cell_type": "code", 658 | "execution_count": 17, 659 | "metadata": {}, 660 | "outputs": [], 661 | "source": [ 662 | "def silu_inplace_2(input):\n", 663 | " '''\n", 664 | " Example of implementation of in-place SiLU activation function using torch.sigmoid_\n", 665 | " https://arxiv.org/pdf/1606.08415.pdf\n", 666 | " '''\n", 667 | " result = input.clone()\n", 668 | " torch.sigmoid_(input)\n", 669 | " input *= result\n", 670 | " return input" 671 | ] 672 | }, 673 | { 674 | "cell_type": "code", 675 | "execution_count": 18, 676 | "metadata": {}, 677 | "outputs": [ 678 | { 679 | "name": "stdout", 680 | "output_type": "stream", 681 | "text": [ 682 | "Original SiLU: tensor([ 0.7774, -0.2767, 0.2967])\n", 683 | "In-place SiLU #2: tensor([ 0.7774, -0.2767, 0.2967])\n" 684 | ] 685 | } 686 | ], 687 | "source": [ 688 | "t = torch.randn(3)\n", 689 | "\n", 690 | "# print result of 
original SiLU\n", 691 | "print(\"Original SiLU: {}\".format(silu(t)))\n", 692 | "\n", 693 | "# change the value of t with in-place function\n", 694 | "silu_inplace_2(t)\n", 695 | "print(\"In-place SiLU #2: {}\".format(t))" 696 | ] 697 | }, 698 | { 699 | "cell_type": "markdown", 700 | "metadata": {}, 701 | "source": [ 702 | "This small example demonstrates why we should be extremely careful and check twice when using in-place operations." 703 | ] 704 | }, 705 | { 706 | "cell_type": "markdown", 707 | "metadata": {}, 708 | "source": [ 709 | "## Conclusion\n", 710 | "In this article: \n", 711 | "* I described in-place operations and their purpose, and demonstrated how in-place operations help to __consume less GPU memory__.\n", 712 | "* I described the major __downsides of in-place operations__. One should be very careful about using them and check the result twice." 713 | ] 714 | }, 715 | { 716 | "cell_type": "markdown", 717 | "metadata": {}, 718 | "source": [ 719 | "## Additional References\n", 720 | "Links to additional resources and further reading:\n", 721 | "\n", 722 | "1. [PyTorch Autograd documentation](https://pytorch.org/docs/stable/notes/autograd.html#in-place-operations-with-autograd)" 723 | ] 724 | }, 725 | { 726 | "cell_type": "markdown", 727 | "metadata": {}, 728 | "source": [ 729 | "## PS\n", 730 | "![echo logo](https://github.com/Lexie88rus/Activation-functions-examples-pytorch/blob/master/assets/echo_logo.png?raw=true)\n", 731 | "\n", 732 | "I participate in the implementation of the __Echo package__, a mathematical backend for neural networks that can be used with the most popular existing frameworks (TensorFlow, Keras, and [PyTorch](https://pytorch.org/)). We have done a lot for PyTorch and Keras so far. Here is a [link to the repository on GitHub](https://github.com/digantamisra98/Echo/tree/Dev-adeis); __I would highly appreciate your feedback__ on it." 
733 | ] 734 | } 735 | ], 736 | "metadata": { 737 | "kernelspec": { 738 | "display_name": "Python 3", 739 | "language": "python", 740 | "name": "python3" 741 | }, 742 | "language_info": { 743 | "codemirror_mode": { 744 | "name": "ipython", 745 | "version": 3 746 | }, 747 | "file_extension": ".py", 748 | "mimetype": "text/x-python", 749 | "name": "python", 750 | "nbconvert_exporter": "python", 751 | "pygments_lexer": "ipython3", 752 | "version": "3.6.6" 753 | } 754 | }, 755 | "nbformat": 4, 756 | "nbformat_minor": 1 757 | } 758 | -------------------------------------------------------------------------------- /landscapes/create_landscape.py: -------------------------------------------------------------------------------- 1 | import torch 2 | from torch import nn 3 | from torch import optim 4 | import torch.nn.functional as F 5 | 6 | from collections import OrderedDict 7 | import numpy as np 8 | 9 | from PIL import Image 10 | 11 | from sklearn.preprocessing import MinMaxScaler 12 | 13 | # import matplotlib for visualization 14 | from matplotlib.pyplot import imshow 15 | import matplotlib.pyplot as plt 16 | 17 | def fswish(input, beta=1.25): 18 | return input * torch.sigmoid(beta * input) 19 | 20 | class swish(nn.Module): 21 | def __init__(self, beta = 1.25): 22 | ''' 23 | Init method. 24 | ''' 25 | super().__init__() 26 | self.beta = beta 27 | 28 | 29 | def forward(self, input): 30 | ''' 31 | Forward pass of the function. 32 | ''' 33 | return fswish(input, self.beta) 34 | 35 | def fmish(input): 36 | return input * torch.tanh(F.softplus(input)) 37 | 38 | class mish(nn.Module): 39 | def __init__(self): 40 | ''' 41 | Init method. 42 | ''' 43 | super().__init__() 44 | 45 | def forward(self, input): 46 | ''' 47 | Forward pass of the function. 
48 | ''' 49 | return fmish(input) 50 | 51 | def build_model(activation_function): 52 | return nn.Sequential(OrderedDict([ 53 | ('fc1', nn.Linear(2, 64)), 54 | ('activation1', activation_function), # use custom activation function 55 | ('fc2', nn.Linear(64, 32)), 56 | ('activation2', activation_function), 57 | ('fc3', nn.Linear(32, 16)), 58 | ('activation3', activation_function), 59 | ('fc4', nn.Linear(16, 1)), 60 | ('activation4', activation_function)])) 61 | 62 | def convert_to_PIL(img, width, height): 63 | img = img.reshape(height, width) 64 | 65 | pil_img = Image.new('RGB', (height, width), 'white') 66 | pixels = pil_img.load() 67 | 68 | for i in range(0, height): 69 | for j in range(0, width): 70 | pixels[j, i] = int(img[i, j]), int(img[i, j]), int(img[i, j]) # PIL expects integer pixel values 71 | 72 | return pil_img 73 | 74 | def main(): 75 | model1 = build_model(nn.ReLU()) 76 | model2 = build_model(swish(beta = 1)) 77 | model3 = build_model(mish()) 78 | 79 | x = np.linspace(0.0, 10.0, num=100) 80 | y = np.linspace(0.0, 10.0, num=100) 81 | 82 | grid = [torch.tensor([xi, yi], dtype=torch.float32) for xi in x for yi in y] # float32 to match the model weights 83 | 84 | np_img_relu = np.array([model1(point).detach().numpy() for point in grid]).reshape(100, 100) 85 | np_img_swish = np.array([model2(point).detach().numpy() for point in grid]).reshape(100, 100) 86 | np_img_mish = np.array([model3(point).detach().numpy() for point in grid]).reshape(100, 100) 87 | 88 | scaler = MinMaxScaler(feature_range=(0, 255)) 89 | np_img_relu = scaler.fit_transform(np_img_relu) 90 | np_img_swish = scaler.fit_transform(np_img_swish) 91 | np_img_mish = scaler.fit_transform(np_img_mish) 92 | 93 | image_relu = convert_to_PIL(np_img_relu, 100, 100) 94 | image_swish = convert_to_PIL(np_img_swish, 100, 100) 95 | image_mish = convert_to_PIL(np_img_mish, 100, 100) 96 | 97 | image_relu.save('relu.png') 98 | image_swish.save('swish.png') 99 | image_mish.save('mish.png') 100 | 101 | plt.imsave('imrelu.png', np_img_relu) 102 | plt.imsave('imswish.png', np_img_swish) 103 | plt.imsave('imsmish.png', 
np_img_mish) 104 | 105 | return 106 | 107 | 108 | if __name__ == '__main__': 109 | main() 110 | -------------------------------------------------------------------------------- /landscapes/imrelu.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/landscapes/imrelu.png -------------------------------------------------------------------------------- /landscapes/imsmish.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/landscapes/imsmish.png -------------------------------------------------------------------------------- /landscapes/imswish.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Lexie88rus/Activation-functions-examples-pytorch/769ac4c23ac57c9d244f25e4ab2b96b07f1e8826/landscapes/imswish.png --------------------------------------------------------------------------------