├── LICENSE
├── README.md
├── cifar100_train.ipynb
├── efficientnet.py
├── efficientnet_v2.py
└── imagenet_eval.ipynb

/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 | 
3 | Copyright (c) 2019 abhuse
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # EfficientNetV2 & EfficientNetV1 in PyTorch with pretrained weights
2 | 
3 | A single-file implementation of EfficientNetV2 and EfficientNetV1 as introduced in:
4 | [\[Tan & Le 2021\]: EfficientNetV2: Smaller Models and Faster Training](https://arxiv.org/pdf/2104.00298.pdf)
5 | [\[Tan & Le 2019\]: EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks](https://arxiv.org/abs/1905.11946)
6 | 
7 | ## Pretrained Weights
8 | The original implementations of both [EfficientNetV2](https://github.com/google/automl/tree/master/efficientnetv2) and [EfficientNetV1](https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet) include pretrained weights in TensorFlow format.
9 | These weights were converted to PyTorch format and are provided in this repository.
10 | 
11 | ## Accuracy
12 | 
13 | ### EfficientNet V2
14 | | Model | ImageNet 1k Top-1 Accuracy, % |
15 | | --- | --- |
16 | | EfficientNetV2-b0 | 77.590% |
17 | | EfficientNetV2-b1 | 78.872% |
18 | | EfficientNetV2-b2 | 79.388% |
19 | | EfficientNetV2-b3 | 82.260% |
20 | | EfficientNetV2-S | 84.282% |
21 | | EfficientNetV2-M | 85.596% |
22 | | EfficientNetV2-L | 86.298% |
23 | | EfficientNetV2-XL | 86.414% |
24 | 
25 | ### EfficientNet V1
26 | | Model | ImageNet 1k Top-1 Accuracy, % |
27 | | --- | --- |
28 | | EfficientNet-B0 | 76.43% |
29 | | EfficientNet-B1 | 78.396% |
30 | | EfficientNet-B2 | 79.804% |
31 | | EfficientNet-B3 | 81.542% |
32 | | EfficientNet-B4 | 83.036% |
33 | | EfficientNet-B5 | 83.79% |
34 | | EfficientNet-B6 | 84.136% |
35 | | EfficientNet-B7 | 84.578% |
36 | 
37 | ## Usage
38 | 
39 | Check out [cifar100_train.ipynb](cifar100_train.ipynb) if you would like to experiment with the models.
40 | To evaluate pretrained models against the ImageNet validation set, run [imagenet_eval.ipynb](imagenet_eval.ipynb); a minimal sketch of that setup is shown below.
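For orientation, here is a minimal sketch of such an evaluation setup, based on the constructor parameters documented below. The full preprocessing pipeline lives in [imagenet_eval.ipynb](imagenet_eval.ipynb); the 384x384 input resolution used here is an illustrative assumption, not necessarily the size the notebook uses:

```python
import torch
from efficientnet_v2 import EfficientNetV2

# tf_style_conv=True simulates TensorFlow's "SAME" padding, which the
# converted pretrained weights expect; specifying in_spatial_shape is
# recommended whenever tf_style_conv is enabled (see Parameters below).
model = EfficientNetV2('s',
                       n_classes=1000,
                       tf_style_conv=True,
                       in_spatial_shape=384,  # assumed eval resolution
                       pretrained=True)
model.eval()

with torch.no_grad():
    x = torch.randn(8, 3, 384, 384)  # [batch, channels, height, width]
    logits = model(x)
print(logits.shape)  # torch.Size([8, 1000])
```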
41 | 
42 | ### EfficientNet V2
43 | The example below creates an EfficientNetV2-S model that takes a 3-channel image of shape [224, 224]
44 | as input and outputs a distribution over 50 classes; the model weights are initialized with weights
45 | pretrained on the ImageNet dataset:
46 | ```python
47 | import torch
48 | from efficientnet_v2 import EfficientNetV2
49 | 
50 | model = EfficientNetV2('s',
51 |                        in_channels=3,
52 |                        n_classes=50,
53 |                        pretrained=True)
54 | 
55 | # x - tensor of shape [batch_size, in_channels, image_height, image_width]
56 | x = torch.randn([10, 3, 224, 224])
57 | 
58 | # to get predictions:
59 | pred = model(x)
60 | print('out shape:', pred.shape)
61 | # >>> out shape: torch.Size([10, 50])
62 | 
63 | # to extract features:
64 | features = model.get_features(x)
65 | for i, feature in enumerate(features):
66 |     print('feature %d shape:' % i, feature.shape)
67 | # >>> feature 0 shape: torch.Size([10, 48, 56, 56])
68 | # >>> feature 1 shape: torch.Size([10, 64, 28, 28])
69 | # >>> feature 2 shape: torch.Size([10, 160, 14, 14])
70 | # >>> feature 3 shape: torch.Size([10, 256, 7, 7])
71 | ```
72 | 
73 | ### EfficientNet (V1, original)
74 | 
75 | The example below creates an EfficientNet-B0 model that takes a 3-channel image of shape [224, 224]
76 | as input and outputs a distribution over 50 classes; the model weights are initialized with weights
77 | pretrained on the ImageNet dataset:
78 | 
79 | ```python
80 | import torch
81 | from efficientnet import EfficientNet
82 | 
83 | model = EfficientNet(b=0,
84 |                      in_channels=3,
85 |                      n_classes=50,
86 |                      in_spatial_shape=(224, 224),
87 |                      pretrained=True
88 |                      )
89 | 
90 | # x - tensor of shape [batch_size, in_channels, image_height, image_width]
91 | x = torch.randn([10, 3, 224, 224])
92 | 
93 | # to get predictions:
94 | pred = model(x)
95 | print('out shape:', pred.shape)
96 | # >>> out shape: torch.Size([10, 50])
97 | 
98 | # to extract features:
99 | features = model.get_features(x)
100 | for i, feature in enumerate(features):
101 |     print('feature %d shape:' % i, feature.shape)
102 | # >>> feature 0 shape: torch.Size([10, 16, 112, 112])
103 | # >>> feature 1 shape: torch.Size([10, 24, 56, 56])
104 | # >>> feature 2 shape: torch.Size([10, 40, 28, 28])
105 | # >>> feature 3 shape: torch.Size([10, 80, 14, 14])
106 | # >>> feature 4 shape: torch.Size([10, 112, 14, 14])
107 | # >>> feature 5 shape: torch.Size([10, 192, 7, 7])
108 | # >>> feature 6 shape: torch.Size([10, 320, 7, 7])
109 | ```
110 | 
111 | 
112 | ## Parameters
113 | 
114 | 
115 | ### EfficientNet V2
116 | * ***model_name***, *(str)* - Model name, one of 'b0', 'b1', 'b2', 'b3', 's', 'm', 'l', 'xl'
117 | * ***in_channels***, *(int)*, *(Default=3)* - Number of channels in the input image
118 | * ***n_classes***, *(int)*, *(Default=1000)* - Number of output classes
119 | * ***tf_style_conv***, *(bool)*, *(Default=False)* - Whether to simulate the "SAME" padding of TensorFlow's convolution op. Set to *True* when evaluating pretrained models against the ImageNet dataset
120 | * ***in_spatial_shape***, *(int or iterable of ints)*,
121 | *(Default=None)* - Spatial dimensionality of the input image, tuple
122 | (height, width) or single integer *size* for shape (*size*, *size*).
123 | It is recommended to specify this parameter only when *tf_style_conv=True*
124 | * ***activation***, *(str)*, *(Default='silu')* - Activation function
125 | * ***activation_kwargs***, *(dict)*, *(Default=None)* - Keyword arguments to pass to the activation function
126 | * ***bias***, *(bool)*,
127 | *(Default=False)* - Enable bias in convolution operations
128 | * ***drop_connect_rate***, *(float)*,
129 | *(Default=0.2)* - DropConnect rate, set to 0 to disable DropConnect
130 | * ***dropout_rate***, *(float or None)*,
131 | *(Default=None)* - Dropout rate, set to *None* to use the default dropout rate of each model
132 | * ***bn_epsilon***, *(float)*,
133 | *(Default=0.001)* - Batch normalization epsilon
134 | * ***bn_momentum***, *(float)*,
135 | *(Default=0.01)* - Batch normalization momentum
136 | * ***pretrained***, *(bool)*,
137 | *(Default=False)* - Initialize the model with weights pretrained on the ImageNet dataset
138 | * ***progress***, *(bool)*,
139 | *(Default=False)* - Show a progress bar when downloading pretrained weights
140 | 
141 | The default parameter values are the ones used in the
142 | [original implementation](https://github.com/google/automl/tree/master/efficientnetv2).
143 | 
144 | 
145 | ### EfficientNet V1
146 | * ***b***, *(int)* - Model index, e.g. 1 for EfficientNet-B1
147 | * ***in_channels***, *(int)*, *(Default=3)* - Number of channels in the input image
148 | * ***n_classes***, *(int)*, *(Default=1000)* - Number of output classes
149 | * ***in_spatial_shape***, *(int or iterable of ints)*,
150 | *(Default=None)* - Spatial dimensionality of the input image, tuple
151 | (height, width) or single integer *size* for shape (*size*, *size*). If None, the default image shape will be used for
152 | each model index
153 | * ***activation***, *(callable)*,
154 | *(Default=Swish())* - Activation function
155 | * ***bias***, *(bool)*,
156 | *(Default=False)* - Enable bias in convolution operations
157 | * ***drop_connect_rate***, *(float)*,
158 | *(Default=0.2)* - DropConnect rate, set to 0 to disable DropConnect
159 | * ***dropout_rate***, *(float or None)*,
160 | *(Default=None)* - Dropout rate, set to *None* to use the default dropout rate of each model
161 | * ***bn_epsilon***, *(float)*,
162 | *(Default=0.001)* - Batch normalization epsilon
163 | * ***bn_momentum***, *(float)*,
164 | *(Default=0.01)* - Batch normalization momentum
165 | * ***pretrained***, *(bool)*,
166 | *(Default=False)* - Initialize the model with weights pretrained on the ImageNet dataset
167 | * ***progress***, *(bool)*,
168 | *(Default=False)* - Show a progress bar when downloading pretrained weights
169 | 
170 | The default parameter values are the ones used in the
171 | [original implementation](https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet).
172 | 
173 | 
174 | ## Requirements
175 | 
176 | * Python v3.5+
177 | * PyTorch v1.0+
--------------------------------------------------------------------------------
/cifar100_train.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "## CIFAR-100 training script\n",
8 | "Note: This script merely outlines the procedure for training a PyTorch model. It does not replicate any study and does not attempt to achieve state-of-the-art results in CIFAR-100 classification."
9 | ] 10 | }, 11 | { 12 | "cell_type": "code", 13 | "execution_count": 1, 14 | "metadata": {}, 15 | "outputs": [], 16 | "source": [ 17 | "from copy import deepcopy\n", 18 | "from datetime import datetime\n", 19 | "from os import makedirs\n", 20 | "from os.path import join, isfile, isdir\n", 21 | "\n", 22 | "import psutil\n", 23 | "import torch\n", 24 | "import torch.nn as nn\n", 25 | "import torch.optim as optim\n", 26 | "\n", 27 | "from efficientnet import EfficientNet\n", 28 | "from sklearn.metrics import accuracy_score\n", 29 | "from torch.optim import lr_scheduler\n", 30 | "from torch.utils.data import DataLoader\n", 31 | "from torchvision import datasets, transforms\n", 32 | "\n", 33 | "import matplotlib.pyplot as plt\n", 34 | "%matplotlib inline" 35 | ] 36 | }, 37 | { 38 | "cell_type": "code", 39 | "execution_count": 2, 40 | "metadata": {}, 41 | "outputs": [], 42 | "source": [ 43 | "def train_model(model, dataloader, device, criterion, optimizer):\n", 44 | " model.train()\n", 45 | " for xb, yb in dataloader:\n", 46 | " xb, yb = xb.to(device), yb.to(device)\n", 47 | " optimizer.zero_grad()\n", 48 | " out = model(xb)\n", 49 | " if out.size(1) == 1:\n", 50 | " # regression, squeeze output of shape [N,1] to [N]\n", 51 | " out = torch.squeeze(out, 1)\n", 52 | " loss = criterion(out, yb)\n", 53 | " loss.backward()\n", 54 | " optimizer.step()\n", 55 | "\n", 56 | "\n", 57 | "def eval_model(model, dataloader, device, criterion=None):\n", 58 | " loss_value = []\n", 59 | " y_pred = []\n", 60 | " y_true = []\n", 61 | "\n", 62 | " model.eval()\n", 63 | " with torch.no_grad():\n", 64 | " for xb, yb in dataloader:\n", 65 | " xb, yb = xb.to(device), yb.to(device)\n", 66 | " out = model(xb)\n", 67 | " if out.size(1) == 1:\n", 68 | " # regression, squeeze output of shape [N,1] to [N]\n", 69 | " out = torch.squeeze(out, 1)\n", 70 | "\n", 71 | " if criterion is not None:\n", 72 | " loss = criterion(out, yb)\n", 73 | " loss_value.append(loss.item())\n", 74 | "\n", 75 | " y_pred.append(out.detach().cpu())\n", 76 | " y_true.append(yb.detach().cpu())\n", 77 | "\n", 78 | " if criterion is not None:\n", 79 | " loss_value = sum(loss_value) / len(loss_value)\n", 80 | " return torch.cat(y_pred), torch.cat(y_true), loss_value\n", 81 | " else:\n", 82 | " return torch.cat(y_pred), torch.cat(y_true)\n", 83 | "\n", 84 | "\n", 85 | "def run_experiment(dl_train,\n", 86 | " dl_train_val,\n", 87 | " dl_validation,\n", 88 | " model,\n", 89 | " optimizer,\n", 90 | " criterion,\n", 91 | " device,\n", 92 | " max_epoch,\n", 93 | " metric_fn,\n", 94 | " init_epoch=0,\n", 95 | " scheduler=None,\n", 96 | " load_path=None,\n", 97 | " save_path=None,\n", 98 | " early_stopping=None,\n", 99 | " ):\n", 100 | " results = {\n", 101 | " \"train_loss\": [],\n", 102 | " \"valid_loss\": [],\n", 103 | " \"train_met\": [],\n", 104 | " \"valid_met\": [],\n", 105 | " \"state_dict\": None,\n", 106 | " }\n", 107 | "\n", 108 | " best_validation_metric = .0\n", 109 | " model_best_state_dict = None\n", 110 | " no_score_improvement = 0\n", 111 | " experiment_start = datetime.now()\n", 112 | "\n", 113 | " if load_path is not None:\n", 114 | " # load full experiment state to continue experiment\n", 115 | " load_path = join(load_path, \"full_state.pth\")\n", 116 | " if not isfile(load_path):\n", 117 | " raise ValueError(\"Checkpoint file {} does not exist\".format(load_path))\n", 118 | "\n", 119 | " checkpoint = torch.load(load_path)\n", 120 | "\n", 121 | " optimizer.load_state_dict(checkpoint['optimizer_state_dict'])\n", 122 | "\n", 123 | " 
model_best_state_dict = checkpoint['model_best_state_dict']\n", 124 | " model.load_state_dict(checkpoint['model_curr_state_dict'])\n", 125 | "\n", 126 | " init_epoch = checkpoint['epoch']\n", 127 | " best_validation_metric = checkpoint['best_validation_metric']\n", 128 | "\n", 129 | " if scheduler is not None:\n", 130 | " scheduler.load_state_dict(checkpoint[\"scheduler_state_dict\"])\n", 131 | " print(\"Successfully loaded checkpoint.\")\n", 132 | "\n", 133 | " s = \"Epoch/Max | Loss: Train / Validation | Metric: Train / Validation | Epoch time\"\n", 134 | " print(s)\n", 135 | "\n", 136 | " if save_path is not None and not isdir(save_path):\n", 137 | " makedirs(save_path)\n", 138 | " for epoch in range(init_epoch, max_epoch):\n", 139 | " now = datetime.now()\n", 140 | " train_model(model=model,\n", 141 | " dataloader=dl_train,\n", 142 | " device=device,\n", 143 | " criterion=criterion,\n", 144 | " optimizer=optimizer, )\n", 145 | "\n", 146 | " # evaluate subset of train set (in eval mode)\n", 147 | " train_val_results = eval_model(model=model,\n", 148 | " dataloader=dl_train_val,\n", 149 | " device=device,\n", 150 | " criterion=criterion, )\n", 151 | " train_y_pred, train_y_true, train_loss = train_val_results\n", 152 | " train_metric = metric_fn(train_y_pred, train_y_true)\n", 153 | " results[\"train_loss\"].append(train_loss)\n", 154 | " results[\"train_met\"].append(train_metric)\n", 155 | "\n", 156 | " # evaluate validation subset\n", 157 | " valid_results = eval_model(model=model,\n", 158 | " dataloader=dl_validation,\n", 159 | " device=device,\n", 160 | " criterion=criterion, )\n", 161 | " valid_y_pred, valid_y_true, valid_loss = valid_results\n", 162 | " validation_metric = metric_fn(valid_y_pred, valid_y_true)\n", 163 | " results[\"valid_loss\"].append(valid_loss)\n", 164 | " results[\"valid_met\"].append(validation_metric)\n", 165 | "\n", 166 | " # check if validation score is improved\n", 167 | " if validation_metric > best_validation_metric:\n", 168 | " model_best_state_dict = deepcopy(model.state_dict())\n", 169 | " best_validation_metric = validation_metric\n", 170 | " # reset early stopping counter\n", 171 | " no_score_improvement = 0\n", 172 | " # save best model weights\n", 173 | " if save_path is not None:\n", 174 | " torch.save(model_best_state_dict, join(save_path, \"best_weights.pth\"))\n", 175 | " else:\n", 176 | " no_score_improvement += 1\n", 177 | " if early_stopping is not None and no_score_improvement >= early_stopping:\n", 178 | " print(\"Early stopping at epoch %d\" % epoch)\n", 179 | " break\n", 180 | "\n", 181 | " if scheduler is not None:\n", 182 | " scheduler.step(validation_metric)\n", 183 | "\n", 184 | " if save_path is not None:\n", 185 | " # (optional) save model state dict at end of each epoch\n", 186 | " # torch.save(model.state_dict(), join(save_path, \"model_state_{}.pth\".format(epoch)))\n", 187 | "\n", 188 | " # save full experiment state at the end of each epoch\n", 189 | " checkpoint = {\n", 190 | " 'epoch': epoch + 1,\n", 191 | " 'model_curr_state_dict': model.state_dict(),\n", 192 | " 'model_best_state_dict': model_best_state_dict,\n", 193 | " 'optimizer_state_dict': optimizer.state_dict(),\n", 194 | " 'scheduler_state_dict': None if scheduler is None else scheduler.state_dict(),\n", 195 | " 'no_score_improvement': no_score_improvement,\n", 196 | " 'best_validation_metric': best_validation_metric,\n", 197 | " }\n", 198 | " torch.save(checkpoint, join(save_path, \"full_state.pth\"))\n", 199 | "\n", 200 | " s = \"{:>5}/{} | Loss: {:.4f} / 
{:.4f}\".format(epoch, max_epoch, train_loss, valid_loss)\n", 201 | " s += \" | Metric: {:.4f} / {:.4f}\".format(train_metric, validation_metric)\n", 202 | " s += \" | +{}\".format(datetime.now() - now)\n", 203 | " print(s)\n", 204 | "\n", 205 | " print(\"Experiment time: {}\".format(datetime.now() - experiment_start))\n", 206 | " return results" 207 | ] 208 | }, 209 | { 210 | "cell_type": "code", 211 | "execution_count": 3, 212 | "metadata": {}, 213 | "outputs": [], 214 | "source": [ 215 | "model_index = 0 # i.e. EfficientNet-B{model_index}\n", 216 | "batch_size = 128\n", 217 | "max_epoch = 100\n", 218 | "n_classes = 100\n", 219 | "pretrained=False\n", 220 | "num_workers = psutil.cpu_count()\n", 221 | "img_size = 32\n", 222 | "\n", 223 | "def metric_fn(y_pred, y_true):\n", 224 | " _, y_pred = torch.max(y_pred, 1)\n", 225 | " return accuracy_score(y_pred, y_true)" 226 | ] 227 | }, 228 | { 229 | "cell_type": "code", 230 | "execution_count": 4, 231 | "metadata": {}, 232 | "outputs": [ 233 | { 234 | "name": "stdout", 235 | "output_type": "stream", 236 | "text": [ 237 | "Files already downloaded and verified\n", 238 | "Files already downloaded and verified\n", 239 | "Files already downloaded and verified\n" 240 | ] 241 | } 242 | ], 243 | "source": [ 244 | "transform_train = transforms.Compose([\n", 245 | " transforms.RandomCrop(img_size, padding=4),\n", 246 | " transforms.RandomHorizontalFlip(),\n", 247 | " transforms.ToTensor(),\n", 248 | " transforms.Normalize(mean=[0.507075, 0.48655024, 0.44091907],\n", 249 | " std=[0.26733398, 0.25643876, 0.2761503]),\n", 250 | "])\n", 251 | "\n", 252 | "transform_validation = transforms.Compose([\n", 253 | " transforms.ToTensor(),\n", 254 | " transforms.Normalize(mean=[0.5070754, 0.48655024, 0.44091907],\n", 255 | " std=[0.26733398, 0.25643876, 0.2761503]),\n", 256 | "])\n", 257 | "\n", 258 | "dataset_train = datasets.CIFAR100(root='./data',\n", 259 | " train=True,\n", 260 | " download=True,\n", 261 | " transform=transform_train,\n", 262 | " )\n", 263 | "dataset_train_val = datasets.CIFAR100(root='./data',\n", 264 | " train=True,\n", 265 | " download=True,\n", 266 | " transform=transform_validation,\n", 267 | " )\n", 268 | "dataset_validation = datasets.CIFAR100(root='./data',\n", 269 | " train=False,\n", 270 | " download=True,\n", 271 | " transform=transform_validation,\n", 272 | " )\n", 273 | "\n", 274 | "dataloader_train = DataLoader(dataset_train,\n", 275 | " batch_size=batch_size,\n", 276 | " shuffle=True,\n", 277 | " num_workers=num_workers,\n", 278 | " )\n", 279 | "dataloader_train_val = DataLoader(dataset_train_val,\n", 280 | " batch_size=batch_size,\n", 281 | " shuffle=False,\n", 282 | " num_workers=num_workers,\n", 283 | " )\n", 284 | "dataloader_validation = DataLoader(dataset_validation,\n", 285 | " batch_size=batch_size,\n", 286 | " shuffle=False,\n", 287 | " num_workers=num_workers,\n", 288 | " )\n", 289 | "\n", 290 | "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n", 291 | "\n", 292 | "model = EfficientNet(b=model_index,\n", 293 | " in_spatial_shape=img_size,\n", 294 | " n_classes=n_classes,\n", 295 | " pretrained=pretrained,\n", 296 | " )\n", 297 | "model.to(device)\n", 298 | "\n", 299 | "optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9, weight_decay=1e-4)\n", 300 | "criterion = nn.CrossEntropyLoss()\n", 301 | "scheduler = lr_scheduler.MultiStepLR(optimizer, milestones=[30,60,90], gamma=0.1)" 302 | ] 303 | }, 304 | { 305 | "cell_type": "code", 306 | "execution_count": 5, 307 | "metadata": {}, 308 
| "outputs": [ 309 | { 310 | "name": "stdout", 311 | "output_type": "stream", 312 | "text": [ 313 | "Epoch/Max | Loss: Train / Validation | Metric: Train / Validation | Epoch time\n", 314 | " 0/100 | Loss: 4.6407 / 4.6411 | Metric: 0.0141 / 0.0145 | +0:00:25.398279\n", 315 | " 1/100 | Loss: 3.4596 / 3.4839 | Metric: 0.1654 / 0.1632 | +0:00:25.511929\n", 316 | " 2/100 | Loss: 3.2156 / 3.2652 | Metric: 0.2042 / 0.1926 | +0:00:25.510012\n", 317 | " 3/100 | Loss: 2.9651 / 3.0327 | Metric: 0.2499 / 0.2428 | +0:00:25.776397\n", 318 | " 4/100 | Loss: 2.8429 / 2.9313 | Metric: 0.2798 / 0.2601 | +0:00:25.733490\n", 319 | " 5/100 | Loss: 2.6884 / 2.7874 | Metric: 0.3061 / 0.2876 | +0:00:25.556902\n", 320 | " 6/100 | Loss: 2.6452 / 2.7454 | Metric: 0.3214 / 0.2998 | +0:00:25.513321\n", 321 | " 7/100 | Loss: 2.4652 / 2.6077 | Metric: 0.3521 / 0.3251 | +0:00:25.530684\n", 322 | " 8/100 | Loss: 2.3831 / 2.5385 | Metric: 0.3670 / 0.3369 | +0:00:25.537902\n", 323 | " 9/100 | Loss: 2.3387 / 2.5089 | Metric: 0.3823 / 0.3513 | +0:00:25.540766\n", 324 | " 10/100 | Loss: 2.2174 / 2.3946 | Metric: 0.4019 / 0.3645 | +0:00:25.510699\n", 325 | " 11/100 | Loss: 2.1654 / 2.3559 | Metric: 0.4215 / 0.3825 | +0:00:25.508304\n", 326 | " 12/100 | Loss: 2.0757 / 2.2894 | Metric: 0.4363 / 0.3941 | +0:00:25.623292\n", 327 | " 13/100 | Loss: 2.0408 / 2.2736 | Metric: 0.4533 / 0.4051 | +0:00:25.536239\n", 328 | " 14/100 | Loss: 1.9758 / 2.2310 | Metric: 0.4662 / 0.4121 | +0:00:25.539638\n", 329 | " 15/100 | Loss: 1.8560 / 2.1301 | Metric: 0.4842 / 0.4336 | +0:00:25.555950\n", 330 | " 16/100 | Loss: 1.8541 / 2.1502 | Metric: 0.4883 / 0.4267 | +0:00:25.516265\n", 331 | " 17/100 | Loss: 1.7353 / 2.0693 | Metric: 0.5102 / 0.4452 | +0:00:25.490337\n", 332 | " 18/100 | Loss: 1.7392 / 2.0899 | Metric: 0.5121 / 0.4434 | +0:00:25.516422\n", 333 | " 19/100 | Loss: 1.7272 / 2.1080 | Metric: 0.5171 / 0.4391 | +0:00:25.533143\n", 334 | " 20/100 | Loss: 1.7886 / 2.1671 | Metric: 0.5213 / 0.4404 | +0:00:25.480690\n", 335 | " 21/100 | Loss: 1.6183 / 2.0440 | Metric: 0.5479 / 0.4627 | +0:00:25.495024\n", 336 | " 22/100 | Loss: 1.6036 / 2.0365 | Metric: 0.5473 / 0.4624 | +0:00:25.475565\n", 337 | " 23/100 | Loss: 1.6070 / 2.0566 | Metric: 0.5441 / 0.4549 | +0:00:25.492129\n", 338 | " 24/100 | Loss: 1.8741 / 2.3108 | Metric: 0.5567 / 0.4611 | +0:00:25.520776\n", 339 | " 25/100 | Loss: 1.4591 / 1.9652 | Metric: 0.5847 / 0.4805 | +0:00:25.484019\n", 340 | " 26/100 | Loss: 1.4893 / 2.0240 | Metric: 0.5823 / 0.4774 | +0:00:25.546081\n", 341 | " 27/100 | Loss: 1.3791 / 1.9233 | Metric: 0.6107 / 0.4907 | +0:00:25.530611\n", 342 | " 28/100 | Loss: 1.3882 / 1.9783 | Metric: 0.5952 / 0.4772 | +0:00:25.555992\n", 343 | " 29/100 | Loss: 1.4314 / 2.0812 | Metric: 0.6025 / 0.4770 | +0:00:25.513177\n", 344 | " 30/100 | Loss: 1.4443 / 2.0862 | Metric: 0.6109 / 0.4828 | +0:00:25.481391\n", 345 | " 31/100 | Loss: 1.3089 / 1.9888 | Metric: 0.6425 / 0.4958 | +0:00:25.498638\n", 346 | " 32/100 | Loss: 1.6406 / 2.3167 | Metric: 0.6044 / 0.4676 | +0:00:25.481035\n", 347 | " 33/100 | Loss: 1.2996 / 2.0643 | Metric: 0.6382 / 0.4974 | +0:00:25.506845\n", 348 | " 34/100 | Loss: 1.1233 / 1.8935 | Metric: 0.6714 / 0.5125 | +0:00:25.511562\n", 349 | " 35/100 | Loss: 1.1297 / 1.9093 | Metric: 0.6669 / 0.5043 | +0:00:25.527171\n", 350 | " 36/100 | Loss: 1.0770 / 1.9064 | Metric: 0.6800 / 0.5069 | +0:00:25.521235\n", 351 | " 37/100 | Loss: 1.1357 / 1.9935 | Metric: 0.6757 / 0.4925 | +0:00:25.507412\n", 352 | " 38/100 | Loss: 1.0077 / 1.8913 | Metric: 0.6930 / 0.5111 | 
+0:00:25.524217\n", 353 | " 39/100 | Loss: 1.0088 / 1.9418 | Metric: 0.6951 / 0.5034 | +0:00:25.504317\n", 354 | " 40/100 | Loss: 0.9989 / 1.9648 | Metric: 0.6972 / 0.5018 | +0:00:25.521985\n", 355 | " 41/100 | Loss: 0.9202 / 1.8992 | Metric: 0.7220 / 0.5117 | +0:00:25.520785\n", 356 | " 42/100 | Loss: 0.9513 / 1.9598 | Metric: 0.7122 / 0.5062 | +0:00:25.476306\n", 357 | " 43/100 | Loss: 1.1995 / 2.2258 | Metric: 0.7097 / 0.4965 | +0:00:25.480849\n", 358 | " 44/100 | Loss: 0.8955 / 1.9741 | Metric: 0.7274 / 0.5104 | +0:00:25.500969\n", 359 | " 45/100 | Loss: 0.8821 / 1.9969 | Metric: 0.7390 / 0.5098 | +0:00:25.473217\n", 360 | " 46/100 | Loss: 0.8505 / 2.0071 | Metric: 0.7396 / 0.5051 | +0:00:25.514186\n", 361 | " 47/100 | Loss: 0.7809 / 1.9547 | Metric: 0.7605 / 0.5139 | +0:00:25.545920\n", 362 | " 48/100 | Loss: 0.7247 / 1.9603 | Metric: 0.7761 / 0.5194 | +0:00:25.488569\n", 363 | " 49/100 | Loss: 0.7574 / 2.0076 | Metric: 0.7688 / 0.5114 | +0:00:25.497039\n", 364 | " 50/100 | Loss: 1.0998 / 2.3453 | Metric: 0.7268 / 0.4854 | +0:00:26.040123\n", 365 | " 51/100 | Loss: 0.7521 / 2.0480 | Metric: 0.7694 / 0.5084 | +0:00:25.828677\n", 366 | " 52/100 | Loss: 1.5339 / 2.7158 | Metric: 0.6346 / 0.4254 | +0:00:25.815047\n", 367 | " 53/100 | Loss: 0.6663 / 2.0466 | Metric: 0.7918 / 0.5140 | +0:00:25.826483\n", 368 | " 54/100 | Loss: 0.6193 / 2.0372 | Metric: 0.8081 / 0.5146 | +0:00:25.676460\n", 369 | " 55/100 | Loss: 0.6105 / 2.0812 | Metric: 0.8096 / 0.5125 | +0:00:25.495489\n", 370 | " 56/100 | Loss: 0.6303 / 2.1622 | Metric: 0.8019 / 0.5090 | +0:00:25.558402\n", 371 | " 57/100 | Loss: 0.6239 / 2.1130 | Metric: 0.8040 / 0.5154 | +0:00:25.693560\n", 372 | " 58/100 | Loss: 0.6373 / 2.1264 | Metric: 0.8026 / 0.5037 | +0:00:25.591672\n", 373 | " 59/100 | Loss: 0.5696 / 2.1770 | Metric: 0.8213 / 0.5064 | +0:00:25.515435\n", 374 | " 60/100 | Loss: 0.5271 / 2.1327 | Metric: 0.8331 / 0.5101 | +0:00:25.548699\n", 375 | " 61/100 | Loss: 0.4884 / 2.1617 | Metric: 0.8464 / 0.5099 | +0:00:25.492324\n", 376 | " 62/100 | Loss: 0.5484 / 2.2190 | Metric: 0.8235 / 0.5099 | +0:00:25.512234\n", 377 | " 63/100 | Loss: 0.4893 / 2.1722 | Metric: 0.8443 / 0.5133 | +0:00:25.485472\n", 378 | " 64/100 | Loss: 0.4431 / 2.1613 | Metric: 0.8610 / 0.5143 | +0:00:25.490351\n", 379 | " 65/100 | Loss: 0.6444 / 2.4219 | Metric: 0.7960 / 0.4858 | +0:00:25.497480\n", 380 | " 66/100 | Loss: 0.3970 / 2.2098 | Metric: 0.8751 / 0.5135 | +0:00:25.486181\n", 381 | " 67/100 | Loss: 0.4105 / 2.2758 | Metric: 0.8712 / 0.5100 | +0:00:25.534624\n", 382 | " 68/100 | Loss: 0.3828 / 2.2263 | Metric: 0.8791 / 0.5215 | +0:00:25.548678\n", 383 | " 69/100 | Loss: 0.3694 / 2.2231 | Metric: 0.8835 / 0.5190 | +0:00:25.494058\n", 384 | " 70/100 | Loss: 0.6148 / 2.4313 | Metric: 0.8047 / 0.4892 | +0:00:25.559696\n", 385 | " 71/100 | Loss: 0.3615 / 2.2320 | Metric: 0.8884 / 0.5169 | +0:00:25.486475\n", 386 | " 72/100 | Loss: 0.3077 / 2.2765 | Metric: 0.9041 / 0.5216 | +0:00:25.551599\n", 387 | " 73/100 | Loss: 0.3263 / 2.2852 | Metric: 0.8973 / 0.5174 | +0:00:25.585349\n", 388 | " 74/100 | Loss: 0.3595 / 2.3874 | Metric: 0.8844 / 0.5094 | +0:00:25.507216\n", 389 | " 75/100 | Loss: 0.3023 / 2.3316 | Metric: 0.9051 / 0.5171 | +0:00:25.500517\n", 390 | " 76/100 | Loss: 0.4149 / 2.3949 | Metric: 0.8791 / 0.5073 | +0:00:25.490881\n", 391 | " 77/100 | Loss: 0.2814 / 2.3683 | Metric: 0.9099 / 0.5177 | +0:00:25.478628\n", 392 | " 78/100 | Loss: 0.2676 / 2.3558 | Metric: 0.9159 / 0.5198 | +0:00:25.454193\n", 393 | " 79/100 | Loss: 0.2627 / 2.3218 | Metric: 
0.9180 / 0.5201 | +0:00:25.481055\n", 394 | " 80/100 | Loss: 0.2776 / 2.4139 | Metric: 0.9184 / 0.5189 | +0:00:25.497929\n", 395 | " 81/100 | Loss: 0.2523 / 2.3382 | Metric: 0.9194 / 0.5195 | +0:00:25.479852\n", 396 | " 82/100 | Loss: 0.3507 / 2.5150 | Metric: 0.8849 / 0.5029 | +0:00:25.473072\n", 397 | " 83/100 | Loss: 0.2899 / 2.4763 | Metric: 0.9064 / 0.5065 | +0:00:25.467117\n", 398 | " 84/100 | Loss: 0.5574 / 2.7482 | Metric: 0.8247 / 0.4775 | +0:00:25.505151\n", 399 | " 85/100 | Loss: 0.2233 / 2.4334 | Metric: 0.9305 / 0.5148 | +0:00:25.472094\n", 400 | " 86/100 | Loss: 0.7211 / 2.8693 | Metric: 0.8071 / 0.4584 | +0:00:25.507260\n", 401 | " 87/100 | Loss: 0.2264 / 2.4550 | Metric: 0.9299 / 0.5186 | +0:00:25.500985\n", 402 | " 88/100 | Loss: 0.1885 / 2.4443 | Metric: 0.9414 / 0.5235 | +0:00:25.486188\n", 403 | " 89/100 | Loss: 0.1809 / 2.4752 | Metric: 0.9438 / 0.5163 | +0:00:25.488561\n", 404 | " 90/100 | Loss: 0.1809 / 2.4432 | Metric: 0.9433 / 0.5203 | +0:00:25.483951\n", 405 | " 91/100 | Loss: 0.1726 / 2.4504 | Metric: 0.9472 / 0.5268 | +0:00:25.510927\n", 406 | " 92/100 | Loss: 0.2003 / 2.4957 | Metric: 0.9369 / 0.5235 | +0:00:25.476014\n", 407 | " 93/100 | Loss: 0.1826 / 2.4584 | Metric: 0.9435 / 0.5224 | +0:00:25.658069\n", 408 | " 94/100 | Loss: 0.2839 / 2.6165 | Metric: 0.9190 / 0.5104 | +0:00:25.498089\n", 409 | " 95/100 | Loss: 0.3082 / 2.6734 | Metric: 0.9001 / 0.4982 | +0:00:25.532489\n", 410 | " 96/100 | Loss: 0.2176 / 2.4900 | Metric: 0.9348 / 0.5201 | +0:00:25.512353\n", 411 | " 97/100 | Loss: 0.1912 / 2.4989 | Metric: 0.9411 / 0.5237 | +0:00:25.469654\n", 412 | " 98/100 | Loss: 0.1680 / 2.5038 | Metric: 0.9497 / 0.5241 | +0:00:25.611206\n", 413 | " 99/100 | Loss: 0.2187 / 2.5380 | Metric: 0.9370 / 0.5218 | +0:00:25.836507\n", 414 | "Experiment time: 0:42:33.825519\n" 415 | ] 416 | } 417 | ], 418 | "source": [ 419 | "exp_results = run_experiment(dl_train=dataloader_train,\n", 420 | " dl_train_val=dataloader_train_val,\n", 421 | " dl_validation=dataloader_validation,\n", 422 | " model=model,\n", 423 | " optimizer=optimizer,\n", 424 | " criterion=criterion,\n", 425 | " device=device,\n", 426 | " max_epoch=max_epoch,\n", 427 | " metric_fn=metric_fn,\n", 428 | " scheduler=scheduler,\n", 429 | " load_path=None,\n", 430 | " save_path=None,\n", 431 | " )" 432 | ] 433 | }, 434 | { 435 | "cell_type": "code", 436 | "execution_count": 6, 437 | "metadata": {}, 438 | "outputs": [ 439 | { 440 | "data": { 441 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAXwAAAEWCAYAAABliCz2AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzs3XeY1NW5wPHvmdnZ3gu7bGHpdeldpYkFuygCltglosaYaG40yY3GxKjXxG5UjL1hi71FgQUBUXpvS90CbGV7n3P/OLONrSw7zDDzfp5nn52d+ZVzduGdM6e8R2mtEUII4fksri6AEEKIk0MCvhBCeAkJ+EII4SUk4AshhJeQgC+EEF5CAr4QQngJCfhCnCCl1H6l1FmuLocQ7ZGAL04p3hBclVKpSqmbXV0O4Xl8XF0AIYShlFKAcnU5hOeSFr7wGEqpW5RSaUqpfKXUZ0qpeMfzSin1hFIqWylVpJTarJRKcbx2vlJqm1KqWCmVqZS6p53rb3ccu00pNarRyyOUUpuUUoVKqfeUUv6OcyKUUl8opXKUUgWOx4mNrpmqlHpIKbUCKAPeBCYBzyqlSpRSzzrhVyW8lAR84RGUUmcCDwOzge7AAWCh4+VzgMlAfyDMcUye47WXgV9qrUOAFGBxK9e/AngAuBYIBS5udA0c15wB9AKGAdc7nrcArwLJQA+gHDg2iP8CmAeEOM77AbhDax2stb6jo78DIdojXTrCU1wNvKK1XgeglLoPKFBK9QSqMcF0IPCz1np7o/OqgcFKqY1a6wKgoJXr3wz8n9Z6tePntGNef1prneW49+fACACtdR7wUd1BSqmHgCXHnPua1npro2M6VGEhjpe08IWniMe06gHQWpdgWuAJWuvFmFb1c0C2UmqBUirUcejlwPnAAaXUUqXUxFaunwTsaeP+hxs9LgOCAZRSgUqpF5VSB5RSRcAyIFwpZW10fHqHaynECZCALzxFFqbbBAClVBAQBWQCaK2f1lqPBgZjunZ+53h+tdb6EqAb8AnwfivXTwf6dKJcdwMDgPFa61BM1xI0HZw9NmWtpLAVTiEBX5yKbEop/0ZfPsC7wA1KqRFKKT/g78BPWuv9SqmxSqnxSikbUApUAHallK9S6mqlVJjWuhooAuyt3PPfwD1KqdGOQeC+SqnkVo5tLATTb39UKRUJ3N+Bc44AvTtwnBDHRQK+OBV9hQmidV8PaK2/B/4X019+CNMan+s4PhR4CdM/fwDT1fOY47VfAPsd3S23YsYCmtFafwA8BLwDFGM+DUR2oKxPAgFALrAK+KYD5zwFzHLM6nm6A8cL0SFKNkARQgjvIC18IYTwEhLwhRDCS0jAF0IILyEBXwghvIRbrbSNjo7WPXv27NS5paWlBAUFdW2B3Jw31hm8s97eWGfwznofb53Xrl2bq7WO6cixbhXwe/bsyZo1azp1bmpqKlOnTu3aArk5b6wzeGe9vbHO4J31Pt46K6UOtH+UIV06QgjhJSTgCyGEl5CAL4QQXsKt+vCFEJ6hurqajIwMKioqTug6YWFhbN++vf0DPUhrdfb39ycxMRGbzdbpa0vAF0J0uYyMDEJCQujZs+cJ5fcvLi4mJCSkC0vm/lqqs9aavLw8MjIy6NWrV6evLV06QoguV1FRQVRUlGzm0kWUUkRFRZ3wJyYJ+EIIp5Bg37W64vfpEQH/p9d+T9G+zs3fF0IIb+ERAX/IvtcJzFnr6mIIIdxEXl4eI0aMYMSIEcTFxZGQkFD/c1VVVYeuccMNN7Bz504nl/Tk8ohB21IViK22zNXFEEK4iaioKDZs2ADAAw88QHBwMPfcc0+TY7TWaK2xWFpu97766qtOL+fJ5hEt/DJLMH61pa4uhhDCzaWlpTF48GCuvvpqhgwZwqFDh5g3bx5jxoxhyJAhPPjgg/XHnnHGGWzYsIGamhrCw8O59957GT58OBMnTiQ7O9uFteg8j2jhV1iD8JcWvhBu6S+fb2VbVlGnzq2trcVqtTZ7fnB8KPdfNKRT19yxYwdvvPEGY8aMAeCRRx4hMjKSmpoapk2bxqxZsxg8eHCTcwoLC5kyZQqPPPIIv/3tb3nllVe49957O3V/V/KIFn6VTzCBWgK+EKJ9ffr0qQ/2AO+++y6jRo1i1KhRbN++nW3btjU7JyAggPPOOw+A0aNHs3///pNV3C7lES38Kp9QInWHE8YJIU6izrbEwTkLrxqnHt69ezdPPfUUP//8M+Hh4VxzzTUtznX39fWtf2y1WqmpqenSMp0sHtHCr/UNJkha+EKI41RUVERISAihoaEcOnSIb7/91tVFciqPaOHb/cIIoQy0BlnsIYTooFGjRjF48GAGDhxIcnIyp59+uquL5FQeEfDxC8WmaqmsKMEvwLvybggh2vbAAw/UP+7bt2/9dE0wq1fffPPNFs9bvnx5/eOjR4/WP547dy5z587t+oKeBB7RpWMJCAWgpLDAxSURQgj35SEBPxyA8qJ8F5dECCHcl0cEfFugI+AXS8AXQojWeETA9w2OAKCyRLp0hBCiNR4R8P0cAb+69Gg7RwohhPfyiIAfGGICfm15oYtLIoQQ7ssjAn5QaBQA9nJp4QshYNq0ac0WUT355JPMnz+/1XOCg4MByMrKYtasWS0eM3XqVNasaXvvjSeffJKysoaFoOeff36TaZ2u5BEBPzgklBptQVd0LkGTEMKzXHnllSxcuLDJcwsXLuTKK69s99z4+Hg+/PDDTt/72ID/1VdfER4e3unrdSWPCPhWq4USArFUSpeOEAJmzZrFl19+Wb/Zyf79+8nKymLkyJFMnz6dUaNGMXToUD799NNm5+7fv5+UlBQAysvLmTt3LoMGDWLmzJmUl5fXHzd//vz6tMr3338/AE8//TRZWVlMmzaNadOmAdCzZ09yc3MBePzxx0lJSSElJYUnn3yy/n6DBg3illtuYciQIVxyySVN7tOVPGOlLVCsArFWFbu6GEKIY319Lxze3KlTA2prwNpCmIobCuc90up5kZGRjBs3jq+//ppLLrmEhQsXMnv2bAICAvj4448JDQ0lNzeXCRMmcPHFF7e6X+zzzz9PYGAg27dvZ9OmTYwaNar+tYceeojIyEhqa2uZPn06mzZt4s477+Txxx9nyZIlREdHN7nW2rVrefXVV/npp5/QWjN+/HimTJlCREQEu3fv5t133+Wll17isssu46OPPuKaa67p1O+sLR7RwgcoIxCfagn4QgijcbdOXXeO1po//OEPDBs2jLPOOovMzEyOHDnS6jWWLVtWH3iHDRvGsGHD6l97//33GTVqFCNHjmTr1q0tplVubPny5cycOZOgoCCCg4O57LLL+OGHHwDo1asXI0aMAGDEiBFOS7/sMS38MksQATUS8IVwO220xNtTfgLpkS+55BJ+85vfsG7dOsrKyhg9ejSvvfYaOTk5rF27FpvNRs+ePVtMh9yeffv28Y9//IPVq1cTERHB9ddf36nr1PHz86t/bLVaqa6u7vS12uIxLfwKFSjbHAoh6gUHBzNt2jRuvP
HG+sHawsJCunXrhs1mY8mSJRw40PY+GpMnT+add94BYMuWLWzatAkwaZWDgoIICwvjyJEjfP311/XnhISEUFzcvPE5adIkPvnkE8rKyigtLeXjjz9m0qRJXVXdDvGYFn6lJZDAmhJXF0MI4UauvPJKZs6cWd+1c/XVV3PRRRcxdOhQxowZw8CBA9s8f/78+dxwww0MGjSIQYMGMXr0aACGDx/OyJEjGThwIElJSU3SKs+bN48ZM2YQHx/PkiVL6p8fNWoU119/PePGjQPg5ptvZuTIkSd19yyltT5pN2vPmDFjdHtzXFvz30ev5LTyVIIfONTFpXJfqampTJ061dXFOOm8sd6nWp23b9/OoEGDTvg6ztjxyt21VeeWfq9KqbVa6zEtnnAMj+nSqbYGEqjLwW53dVGEEMItOT3gK6WsSqn1SqkvnHmfGp9ALEpTIfl0hBCiRSejhf9rYLuzb1LrYzYmLpWc+EK4BXfqLvYEXfH7dGrAV0olAhcA/3bmfQC0zQR82QRFCNfz9/cnLy9Pgn4X0VqTl5eHv7//CV3H2bN0ngT+B3D6qEtdwK8olpz4QrhaYmIiGRkZ5OTknNB1KioqTjjInWpaq7O/vz+JiYkndG2nBXyl1IVAttZ6rVJqahvHzQPmAcTGxpKamtqp+1VpU5WdW9aTUebXztGeoaSkpNO/r1OZN9bbG+sMpt51WSy9RVt1bm/dQHuc2cI/HbhYKXU+4A+EKqXe0lo3SRChtV4ALAAzLbOzU88+zk4HILFbOMNPoelrJ+JUm6rXVbyx3t5YZ/DOejuzzk7rw9da36e1TtRa9wTmAouPDfZdyeJn3hFlExQhhGiZx8zDt/mbPny7BHwhhGjRSUmtoLVOBVKdeQ9/XxsV2gYVEvCFEKIlHtPCtyhFMUGoKtn1SgghWuIxAR9MimQf2QRFCCFa5HkBXzZBEUKIFnlUwK+0BuMrKZKFEKJFHhXwq3yCCaiVgC+EEC3xqIBfbQslQMuuV0II0RKPCvi1viEEScAXQogWeVTA136h+FOFrql0dVGEEMLteFTAV/5hAFSWyCYoQghxLI8K+JaAUABKi/JcXBIhhHA/HhXwrYHhAJRLTnwhhGjGowK+b2AEABUlEvCFEOJYnhXwg00Lv0oCvhBCNONRAd8/JBKAmjLJmCmEEMfyqIAfGGq6dGrLpIUvhBDH8qiAHxwaiV0rdIWkSBZCiGN5VsD396UEf5CAL4QQzXhUwLdaFPmEE1SW7uqiCCGE2/GogA+wxjaaXkWroVKyZgohRGMeF/BzEs/GV1dRu+s7VxdFCCHciscF/B4jziRPh3B0/ceuLooQQrgVjwv4Z/TvziL7aIIPfA81Va4ujhBCuA2PC/hhgTbSoqbhV1sK+5a5ujhCCOE2PC7gA0SknEWJ9qd8k3TrCCFEHY8M+JMGJbHEPgK182uw17q6OEII4RY8MuAP7h7KSttp+FflQfpPri6OEEK4BY8M+BaLQvU7hyp8sG/7zNXFEUIIt+CRAR9g4uBkltemUL3tS9Da1cURQgiX89iAP6lfNIvto/ArPgi5u1xdHCGEcDmPDfjhgb5kd59qftj5tUvLIoQQ7sBjAz7AkIGD2WpPpnqHBHwhhPDogD+5fzSL7CPxyfgZyvJdXRwhhHApjw74wxLD+dk2DoUddksyNSGEd/PogG+1KCL6jiePcPSub1xdHCGEcCmPDvgAkwbE8n3NCOy7v4faalcXRwghXMbjA/7kfjEsto/EWlUEB390dXGEEMJlnBbwlVL+SqmflVIblVJblVJ/cda92hIX5s/h6IlUY4Od0q0jhPBezmzhVwJnaq2HAyOAGUqpCU68X6vGDUhiqX0Yev0bUJjhiiIIIYTLOS3ga6NuY1mb48slOQ4m94/hweprqK2thU9uA7vdFcUQQgiXUtqJeWaUUlZgLdAXeE5r/fsWjpkHzAOIjY0dvXDhwk7dq6SkhODg4BZfq6rV3LGojD9FpXJt8QJ2951HZuIFnbqPO2mrzp7MG+vtjXUG76z38dZ52rRpa7XWYzp0sNba6V9AOLAESGnruNGjR+vOWrJkSZuv3/z6aj38gW901euXaf3XWK1zdnf6Xu6ivTp7Km+stzfWWWvvrPfx1hlYozsYi0/KLB2t9VFHwJ9xMu7Xkt+c1Z/CihqeDb4TfPzgszski6YQwqs4c5ZOjFIq3PE4ADgb2OGs+7VncHwoc8cm8dyaUnLG/4+Zoil73gohvIgzW/jdgSVKqU3AauA7rfUXTrxfu3579gD8bVb+d/8ICOkOyx5zZXGEECdDTaXk0nJw5iydTVrrkVrrYVrrFK31g866V0fFhPhx+7S+fLPzKHv63QD7f4CDq1xdLCGEMy39P3j+NOnCxQtW2h7rhtN7khQZwD17R6EDo2DZP1xdJCGEM2WuheJDUJTl6pK4nNcFfH+blV+d2Y/1h6s40O96SPsOsja4ulhCCGep2/EuZ7try+EGvC7gA1wyIp6YED8ezpsEfmGw6C9QVebqYgkhulplMRRlmsfZLpsz4ja8MuD7+Vi5bmIy36aVkT3qLtizGJ4bDztkw3MhPEru7obH0sL3zoAPcPX4ZAJsVh4rmg7XfwW+QbDwKvjgOkm9IMSpaOWzsPhvTZ+r684JiZcWPl4c8COCfLliTCKfbMgkO3I03PoDTP4dbPsUdnzu6uKJU0naIqgqdXUpvJvWsOpf8NOCpg22nJ1g8YEBM8xjL/8E77UBH+DG03tRY9e8/uN+sNpg6n0Q1Q9SH5FWvuiYokPw1mWw4R1Xl8S75e81ffWVhQ2tejCPI/tAbApUFXt9tlyvDvg9o4OYMSSOBcv28ti3O6ioBabeC9nbYNsnri6eOBXUBZDCdNeWw9vtW9rwOGN1w+OcnRDTH7oNcvzs5G4drWHDu3B4s3Pv00leHfABHr5sKBcPT+C5JXs454llLPedBNEDYOmjYK91dfGEuyt2zO0uOuTacni7fT+Y1fP+4ZDxs3mupsq0/KP7Q8xA81y2Ewduq0rhwxvgk1vhi9847z4nwOsDfnigL/+cPZx3b5mAj1Vx4xvrKJpwt2kJbP3Y1cUT7q4u0BdLwHcZrc2q+V6TIXEsZKwxz+fvBV1rGnCBkRAc67wW/tF0eOVc2PoJ9DjNfMrI2emce50Arw/4dSb2ieLFa0ZTVWvnvdLREDPI9OXL/HzRlmIJ+C6XswNKcxoCfvZ2qGjUlx/T3/F9oHNa+NUV8MoMKDgAV70PV7wGygob3u76e50gCfiN9IsNYWSPcN5bm4k++0HIS4N350jQF62rC/RFh7x+BojL1GW97TkJksYC2qRTyHW0sKMdAb/bINPq7uoJGZs/gKIMmP069D8HQmKh3zmwcSHU1nTtvU5QhwK+UqqPUsrP8XiqUurOutTHnmbOmCTSsktY5zcWZr4I+5fDO7Nl2p1oWV1+lupSs6pTnHz7lkF4MkQkQ8JoQJlunZxdEJZk1tiAaeFXl3btALvWsOp5Mwuo97SG50deDSVHYM+irrtXF+hoC/8joFYp1RdYACQBHjkP7
cLh8QT6Wnl/dToMn2OC/oEV8NYstx15Fy5UfNh8fAfp1nEFe61plPWabH72DzOBPf1n08Kva92Dc2bq7FsK2VthwnxQquH5fudCYBSsf6ud8tvNWOGiv3ZdmdrQ0YBv11rXADOBZ7TWv8Pku/c4wX4+XDC0O19syqK0sgaGzYbLXoJDG+CFM+CV82D75/LxXRjFhyB2sHks2RhPju1fNKRBObwZKo42BHww3ToZq01ahcYB3xkzdX78FwTFQMqsps/7+MKwObDzayjNa35eTRVs/hCenwgfXA87voDq8q4rVyt8OnhctVLqSuA64CLHczbnFMn15oxN4oO1GXy56RCzxybB0FnQd7p5t/75JXjvGvMmMGy2q4sqXKmiCKpKIH6UCTzFh11dIs9Xlg8f3gi1lZA0AaL7mud7Tmo4JnEsrHvDPI5pFPADws3Uzc628PP2wE8vQo8JMPhSyN8Du7+FKfeCzb/58SOuNqt/P70NbIFw9ACUZEN5gfl3A9BtMMx6xVzPYu1cuY5DRwP+DcCtwENa631KqV7Am84rlmuNTo6gd0wQ761JNwEfICACTvsVTLgNXj4bvv2jGZgJ8MihDNERdQE+YRSse71hTr5wnk3vm2A/+XcmqKevMqvjQxt1OCSOa3gcPaDp+S3N1Ck/av5++5fDkMsg5XLTQq9TWwOrnoMlfze7Z/38InR/yrTsrb4w9qaWyxqXAslnmNQbYYlmjCG6PwREmngSOwT6zwDLyZs706GAr7XeBtwJoJSKAEK01o86s2CupJTiqnE9+NuX23lv9UHmjO3R8KLFChf8E146E5Y8BOfLNoleqy7AR/YxfcfSwncurU1gjh8JZ/4JTr8Lfl7Q0FVTJ7q/SXteWQgxxwT8boPMOW9eBtH9oLbazKapLoXgONj9X1j0IIy+3hxffMh0Dx3ZAgMugPP/zwwSL/m76eYdcQ0Ed2u9zNd/Adp+UlrvHdGhgK+USgUudhy/FshWSq3QWv/WiWVzqetO68nSXTn88eMtJEUGclqf6IYX40fC2Jth9b9hxFXmZ+F96hZdhcabrgJ37MPX2vR39zkTfAOP//xUR7tu6u+7tlwdceyUxsy1Ju3JhU+an/2CYVILIchigcTRZmOjoOimr42+3szZz90F61ZBbZXpsp1wG8QNhbTvYcVTkPp3c3xglGmdz3oVhsw0A7MjrjKfArZ/bn6vbVGqYVDfDXS0SydMa12klLoZeENrfb9jc3KPZbNaePaqUVz+/Ermv7WOj287jd4xwQ0HTPujWVX3xW/h5u/d5h1cnER1s3JC4kzAd8dZOoc2wHtXw5n/C5PvOb5zq8pgxZPmTWPi7SbAdpbdbvLRxwzs2P+V/L3w8jn0CxsNU6aYwLn2NbAFmQDdnun3t/z3iBkAl//bPNbadNE07n/vd7b5Ks0FvxDw8Wv5+j5+HSuHm+lo55GPUqo7MBv4wonlcSthATZeuW4sVovixtdWs+NwUcOLAeFw7kOQtc507+z7wXUFFa5RfMh0HfgGOQK+G3bp7Flivm/79PjP3f1fqC6DmnIz26QjNn8IC6Y23Ta0pgo+nmc2En9yqJmCmLen9WtUV8D710F5AQlZ35gd6SqLYct/IGWmCcTtiR8BA85r+xilWh5sBfPJoLVgfwrraMB/EPgW2KO1Xq2U6g3sbuccj9AjKpCXrh1DUUUNFz693GTVrHYkVRt6hZmtU5oLr18I717pnv/phXMUZTUMFoY6Ar67Jdzbm2q+H95kWs3HY+vHZmAyNMGsJm1PwQH4/NeQtd7kldn0PlSWwLtzzfnjfmlmpSx/HJ4ZBY/1hTcuhf/+CY5sa7jOt38w5Z39Jlndz4XlT8A7c00/+6jrj68OookOBXyt9Qda62Fa6/mOn/dqrS93btHcx+jkCL7/7RQuHhHPc0v2cP5TP5BdXGFaCMNmw6/WmI+Qe5eanBpHD7q6yKKr5eyCh5Pg8JaG54oPm+4cMC18XWv6h91FdTkcXAWDHDOpt33W8XOrSk0Lf9DFpr96zyIzJbI1djt8ert5fMtiSBgD/7nFbB26dwlc/KwZ8LzmQ/jNVpjxqFmcVF5gpjo+P9FMd/7hcVjzMpx2Jww8n139fwmDL4EDy82bReKYzv8+RIdTKyQqpT5WSmU7vj5SSiU6u3DuJDLIl8dnj+D1G8eRXlDGY980yoRnCzCDR9d+CuX5ZnFWWx9ZXcFeawaZK0ucd4+dX8M3f3De9V1pbypUFpkgWKf4kNk6D0zAr3vOXRz80UxhHHWdWStwPN06dd05Qy41fdX2mrbPX/2SyVh57t9NeoNrP4Hxt5okZnPehlG/aDg2NB4m3AqXPge/XAp37zTTLPcuNd03SeNh+p/NscpqPkWP+yWc89emq1nFcetol86rwGdAvOPrc8dzXmdK/xhuPL0XH67LYHNGYdMXk8bCdV+YPs9XZphFWoc2ukcCpb1L4Mu7O/bRvLPWvGLmK3tibvisdeb7wVXmu73WtPAbd+mAe9V9bypYbNBjomklZ63r+KfPrZ+Y7pzk0yFumJnquPnDpseUH4VDm8zz390Pfc+GUdea16w2OO9RuPcADDy/7XsFRppplndtNjNw5rxtzq/j42c+HfQ9q8NVFy3raMCP0Vq/qrWucXy9BsQ4sVxu7fYz+xIZ6MtfPt+KPjbFQvdhcMPX4B8KX90DL06GR3rAN/eZlZmuUjd4d2hD28d1ltYNecj3LHbOPVwp0xHw01eZ7ovSXNOFU9eyr2/hu9HUzL2pkDTOzK4ZfLF5rq5bJzcNnh0LHzta4Y1VlcKub82bhMVqWtUps0xOqaIsMxbwxqXwaDK8OAk+usnc4+JnmrfAj2f2WkA4jLkBgr02tDhdRwN+nlLqGqWU1fF1DdBCggjvEOpv455zB7DmQAFfbGpl6tcda0yL5fKXTR/qqufNf7AtH7kmD09dEM5a75zrF+w33VmN7+UpKovNvO2IXo486zsbAntdoA/qBsriPoP2pXmm9V2XwTGyt2mpb/vUjEO8OsNkc9z0vskRVffJBUx3Tk25We5fZ+gsQMPHv4R/TTRz4qfeB7PfgHmpcOeGpqtdhVvqaMC/ETMl8zBwCJgFXO+kMp0SZo9JYlD3UB75eodJsnYspSC8h/mPctmLcPMikyf7wxtNWoaTqeiQWbDiH2ZmQ9RUdv09Mtea77EppvvIlZvA//ySGa/oKoc2AtpkRATTN16/6MoR5Kw+Zkelk9mlU5oH695seb+GfUsBDb2nNjw3+GKz/d9r55uUADcvghu/ARS8eh68fI5Zgfr9X8wbWPJpDedG9TELDPctM10rt/9k9n8efIl5/kTm6IuTpqOzdA5orS/WWsdorbtprS8FvGaWTkusFsUDFw0mq7Ccmf9awe4j7eRCTxwNtyyB0TeYfu6T2Qre6+jOGTcP7NVwZGvX3yNzLfgEmBWLZXnO6zpqj90OqQ+b3cq66k2nrjtnyGUmqB9c1WjRVaNWbUjcyRu0PbIVXpoKn93haKH/1PT1valmjUDjVeB1LfaACNPtGN3PdPncutwM
ivr4mU8wPv5wxm+ad8fMXAC/+ATmvm0GXsUp50Sy9nhsWoWOGt87ijdvHE9+aRUXP7uCD9dmtH2CxQozHjYJnT653UxJ6yy7HZ/qDs642bPYtNhGXG1+dkYwzlwL3YebhHJ193SF7K3mDac0x+Q/6QpZ6yGsh+lb7jHBtPCLD5kunKBGeVRC4rsu4O/+DpY+1vLGOzu+NK3xmiq48AmTD+aVc8040b5lpltp7xLoNcl88qgT3c/MJLvpe5PIq45/KJz3CFz3OdyyCG5fBRNva37fmP7QZ1rz58Up40QCvsyPAs7oF81Xd05ieFIY93ywkX//0M7iFluA6eIpzYav/sc8V1Fo8nJkdTAQlxfAm5cy8ccbIX9f28fa7Sb49pkGET1N666r+/Frq023R+IYExTjhrku4O9d2vC4q3YbyloHCY6Wco+JZqZL5lrT2m8cUEPi2s88as1jAAAgAElEQVSnU1MJb882G+oUZjZ/vbrC/Lt4exYs+Rv8a4LJ76I1HFgJH9wAC68ys2bmpcKYG+G2lSZHzKp/wesXwT8HmDL2ntr8+r2nyqCoF+toLp2WyA4gDt1C/Xn75gnc/vY6Hv56B8MSwxnXK7L1E+JHwuT/MQma8tLMqkJ7DfgGmz7VuKGtn5u/F96Z4wj0Cr6/3wyctebwJtPi7XOmGVeIH9nxN5aOOrIVaipMmmAw9/rxWTPY2ZFl8F1p31KI6mu6JdIWma6JE1GWbwak67In9phgvu9damZkNRba3WzGUV1u3tiPpTV8eofJoe4TYFINXPSkScpVkmPeWBY9aD6ZTLjNpM796h5463Im+HWDpdlmHOa0O2HaHxru4RdirjPl9ybXe+5u80lj6BUnVnfhcdps4SulipVSRS18FWPm4wsHq0Xx2BXD6BEZyO3vrCO7qKLtEybdbQJjbZXJs3/V++Y/8ztzmg/8aW3+E//8Erw03XRXXPspB3tcbmZdHPix9fvUtbR7TzXfu48wA7jV7ZTveNQN2CaMNt/7TjdvYCc7v1BttWkF95pifrcHV534QrO6+ffxjjez2KEmgZeubVh0Vafu59a6dVIfgc3vm0Rm81eYgdAPrjcpBv7R1+ydXHwYrvrAdP31nmL616fcS3lAHFz0NPx2h1mA1NIbSmh380lu/Dw4637Zq0E002YLX2t9kptnp7YQfxsvXDOaS59bwR3vrOftW8Zjs7bynmr1gV983PS5q94zC7bemQ1Xf2hmVOz6BtIWN0wD7DYE5rwJUX1ITyqmV95S+PY+uHlxyxsp7FlsZs7UpQCIH2mC8ZGtZiC5o7J3mNZzWb7pUuo1GQZdaF7LXAeB0WYjaTArJW2BpkulvUU3XSlzrdlJqPcU8AuFlU+bueP9z+38Neu6v+JHmO9WH9N1tW9p82mIdb/jokNmGmRj69+CpY+Y/OmT7jaftm78FlY+Azk7zae67sObz3jx8YNp97FRTWTq6Kmdr4cQnFiXjmjBgLgQHr5sKHe9t4GLnlnOXWf145zBcVgsHRjyiBtq8m6/Owf+6diazS8M+kw1LfReU0wgcSxusVv9TUvu41+aluPwuU2vV5pnWrkTbm14ri5wHVrf8YBfmGkGCSsdC3R8/M20xxu/NauLM9eY1n3dohsfP7PlXNoi8+nkZC2H37sUUObetkDTbZK26MQCfuZ6s6OSf1jDcz0mmoBfF+DrhLbQwi/JMW/Imz8wb5IXPtHw+7DaWs7nLoSTOC3gK6WSgDeAWEx//wKt9VPOup87uXRkAlaL4onvdnHrW+sYGBfC3y8byqgeEe2f3P8ck6/70EYz4yVpfNNl5scaOht+esFkGNy3zAQdq68JfumrTGt+wAUNx4clmU0dOjpwq7WZ+mevhvkrzQyj6lJ4/gz4z81mel/OTjNlsbEB55m+6sy1Jy/h1T5Hv3qgY/yk5xknPnCbta7pfqnQ0I/frEvH0eJf+qhjZlQ0rH3dzLSZcq8J7o23zhPiJHPmZoo1wN1a68HABOB2pdRgJ97PrVw0PJ7//mYyT8wZTnFFDVe9tIqluzqYSTHlcjj7QROw2gr2YLpxLn7GzNrYswR++KfZerGi0IwN3PQ9JE9sOF4p04+ftdH8nL8P3pxpZgm1ZO1rJnid/aDZg9PqY1q7ly0wM0HevgLQzT8tDJ1lBqHXvNKxOp+oqlJI/9l8CqrTd7oZFC840Pz4msr2xzGKDpnWet1gdJ2ek8zvo65Lq45/mOmuCYw2v7MVT5kMj/NXwLT7PDK/uji1OK2Fr7U+hFmVi9a6WCm1HUgAtrV5ogfxsVqYOTKRSf1iuPbln7n59dU8NXck5w/t4iXocUMdKyYxidqqy8zc6tbEjzQ5xvcshg9vMikRMtaYN4LwpIbjCvabXOW9psCYYzZqTp4Ik+6BZf/nuOYxQdEvxKSO3vCO2SgmoJ1PN9UVsOVDszioM6s2D/5oPoX0bhTw+0w33/csNqtGt31qurjy0qAw3bwhXfy0mSVzrLJ8+MaxrV/CMZ9QrD5w+q+bn6NUQ5ZHMPPkpUUv3IhqlvzLGTdRqiewDEjRWhcd89o8YB5AbGzs6IULF3bqHiUlJQQHu+/y7tJqzZNrK0g7amf+cD/GdT/x99rO1jk6ZxUpWx8GoCwgnt395jFk6yMUh/Rj4/AHQVmwVRUxdPNfCSxLZ/XYZ6j0bz53W9lrGbHhPqy1FawZ+3Sz14NK9jF2zV2k9bmJjKSLWy+Q1gzY+TTdDy/mUNyZ7BzYQjB1HOdfkU1etR9BoU1noPTe8xqJGZ+z/Iy3zdiG4/gJq27Gt6oQi64GoCSoF6VBiZQHxBOZv57Q4l1kJFzAnj43oC020JrI/LUM2Pkstuoi9vecy8EeV7g8La+7//t2Fm+s9/HWedq0aWu11h3qN3V6wFdKBQNLgYe01v9p69gxY8boNWvWdOo+qampTJ06tVPnnizlVbVc+dIqDuaXseSeqYQFtNNd045O17noEDw1zIwPzH7D9HmvfR0+vxPOfdi03t/7BZRkm/GEwW0E66oyMwc/sJV1B/8+y6TRvWN160HzpwXw9e9M90f2Npj7btPZPYe3mEHPbZ9CwT6KQvoTOv/bhnsWHIBXzzerR2/4qum1Vz1vZjoNvNB8NZ5ZU1Nl8q//+GxD/3tZnpkq220IzHyh+Vx7FzkV/n07gzfW+3jrrJTqcMB36iwdpZQN+Ah4u71g7w0CfK387dIULnp2OU99v5s/X+SiIY3Q7vDrjSYtQN1K0VHXmg1Mvn/A/BzcDW76tmkulpb4Bpqv1oy5ET6ZD/uXm6X+x9q/wsxi6T8DrngdXj7LvPEkjTfdUov/avrCLT6ma2nYbIKX/dOsKP3Fx1CYYdYu1FbCWQ80v/6E+Q1Jz47l42u6m5JPh43vmvsFRpkVySOulj534XGcOUtHAS8D27XWjzvrPqealIQw5o7twRs/7ueq8Un07eaipQ7HJr9SyvRnvzjZpHe+/GUzy+REDZlpcrysfMYM8qb/ZFaDWn3N1MnMNSbAXrbAbCg980V4cYqZalqeb2b5jL7B9I07WvSbCwI
Zvv1R8+mhNMeU8/ovTLk7Y+D5J3e9gBAu4sxZOqcDvwDOVEptcHzJ/yrgnnP6E+Br5cEvtjffQMWJMgrK2r5fcDeTw//aT7sm2INZETriajNF89PbYNsnZhcme42ZARPVF+a+0zDPPXaISRuQ9p0ZXJ39hkkb0KjLqCByhCljxVEzO+mm7zsf7IXwIs6cpbMcSbDWoqhgP+46qz9//WIb//jvTpIiAvH1sXB632hiQ/2dcs+dh4uZ8dQyXr5uDGcOjG39wPamgXbG1N+brqG4oSZAt7QiuLHT7jQBvs+ZZk+BliSNM29OtqCmCcyEEK2S/ykucu3EZD7bmMVzSxo2O0+MCOD7307B33Yc28J1UOrObLSGjemFbQd8Z/APg2HHkcjL6tOQrKy96wohOkwCvovYrBY+unUiR8urqaqxsymjkFvfWsuCZXu5c3q/Lr/fyj1mR8rd2e1s1CKE8FjO7MMX7fCxWogO9iM+PIAZKXFcMKw7/0pNI/NoebvnHimqYHNOC1srtqCqxs7P+8x+s7uOnGD2SCHEKUsCvhv5w/mDAPj7V9vbPG5rViEXPbOcf66tZMfhojaPBdiQfpTy6lr6dQtmf24pVTUu3G9WCOEyEvDdSEJ4APOn9OXLTYf4Zsthth8qYvX+fDakH6XEsVH6D7tzmPPiKqwWhY8F3l51sN3rrtyTi0WZcYMau2Zfbgvb5gkhPJ704buZX07pzftr0rn1rbXNXkuMCOBwYQV9uwXz2g3j+M3rqXy8PpN7zxtIkF/rf8qVaXmkJIQxOtlMbdx1pJgBcbLVgRDeRgK+m/G3WXn9xnGsO1hAiJ8Pwf4+lFXVsvtIMTuPlBBos/LHCwcR6m/jzCQbP2ZV8OmGLK4a3/L0xbKqGtanF3DjGb3oHROERcHuIzJwK4Q3koDvhvp2C6Zvt6bJk84dEtf8uHALA+NCePunA1w5LgnVQq6a1fsLqK7VnN4nGn+blZ5RQTJwK4SXkj78U5hSiqsnJLM1q4gN6UdbPGZlWi42q2JsT9Od0y82mF0yNVMIryQB/xQ3c2QCQb5W3mpl8HblnjxG9oggwNcs5uofG8KBvDIqa2q7vCzvr0nntrebjz0IIdyDdOmc4oL9fLh0ZALvr0mnoKyK4YnhDOweQk2tpqiimi1Zhdw1vX/98f1iQ6i1a/bmlDKoexubpHTCpxsyWZGWR35pFZFBsvGHEO5GAr4HuOus/tTUatYdLGCJI4VCHYuC6YO61f/cP9aMDew6UtylAd9u12xKN5ucb84sZEr/5humCCFcSwK+B4gJ8ePRWWajjuKKavblluLnYyXQ10p4oI0Q/4aEaL2ig7BaFLu7eOB2b24pxY61ApszjkrAF8INScD3MCH+NoYlhrf6up+PleSoQHZ18dTMTRlm0DjAZmVjRmGXXtvd5ZVUEhHoi8UiyWGFe5NBWy/Uv1sIu7O7toW/Mf0ogb5Wzhocy2YvCviF5dWc/uhivth8yNVFEaJdEvC9UP/YYA7klVJRbWbqHMwro6b2xPLrbMgoZGhCGCOSwjlcVEF2UUVXFNXtZR0tp6Lazn5JVyFOARLwvVC/2BDsGl5evo9Zz69k8mNL+PNnWzt9vaoaO9uzihieFM6wRJOjfnOmd7Tyc0sqAcgvrXJxSYRonwR8L9Q/1uTReezbnRwprmBSv2je+ekgq/fnt3turV2zJ6ekSYDbcbiIqlo7wxPDGdw9FIuCTV7SrZNXYn4PEvDFqUAGbb1Qv27B/O7cAfTrFsz0QbFU1tRy9uPLuO8/m/nyzjPw82m641Z5VS0vLtvDkp057DxcREW1nT4xQXx712R8rJb6QdrhSWEE+fnQt1uwtPCFcEPSwvdCFovi9ml9OWdIHFaLItDXh7/NTCEtu4QXl+5tcmzqzmzOeXIpT36/Gz8fC1eNS+aXU3qzJ6eUj9dnAmbANjrYl4TwAACGJoSzKePoSd2g3VVyJOCLU4i08AUA0wZ046Lh8Ty7OA0FHCmuYPeREn7al0/vmCDevWUCE/tEAaC1ZkVaLk8v3s0lIxLYmH6UYYnh9cnbhieF8dG6DA4VVhDveBPwVLnF0qUjTh3Swhf1/nzhYIL8rPzzu118vvEQpVU13H12f77+9aT6YA8madvd5wwgPb+c11buIy2nhOGN5v4PTTADt97Qj1/fpVNW5RWfaMSpTVr4ol5MiB+pv5sGQFiArc1jp/aPYVSPcB77didam1Z9nUHdQ/GxKDZnHmVGSvO0zp4kr9QE/KoaO6VVtQS3sRGNEK4mLXzRRFiArd1gD6aVf885A6iuNa3axqt7/W1W+seGeEcLv7gKm9V0ZRVIt45wcxLwRaed1jea0/tG0bdbcLPsmMMSw9iUUUit3XO7ObTW5JVW0jvaJKTLk4Av3JwEfHFCXrhmNO/cMr7Z81P6x1BYXs2yXTkuKNXJUVheTXWtpr9jf+B8R/eOEO5KAr44ISH+NrqF+Dd7fvqgWKKCfHn355Y3ZvEEdQO2/R3bUeaXVruyOEK0SwK+cApfHwuzRieyaEe2x+bVyXWssu0XKy18cWqQgC+cZs7YJGrtmg/XZZzU++4+UkxaF2cDbUldC79XdBA2q5IWvnB7EvCF0/SOCWZ8r0jeW52O/SQO3t729jp++/4Gp98nt9gE/OhgXyICfaWFL9yeBHzhVFeO68GBvDJW7c07KffLKChjd3YJW7OKKHXswOUsuSVVWBREBPoSGeQrLXzh9iTgC6eakRJHWICNN1cdYPeRYlam5bIyLbfZqtS8kkqeXrSbo2UnNrVx2a5cwGT13OjYhctZ8koriQzyw2JRRAVLC1+4P1kWKJzK32Zl5sgEXlu5n6+3HK5//uzBsfxj1nDCAm3sOlLMja+tJqOgnOziCv526dBO32/Zrhyig33JLali7f4CTusT3RXVaFFOcRXRwWb9QUSgL1lHi5x2LyG6ggR84XS/nt6PPt2CCQ+wER3sx5bMQh79ZgcXPPMDt0zqzT++3Ym/r5XpA7vx7s/p3HB6L/rEBB/3fapr7axIy+XC4d1Ze6CANQcKuqwO+3JLmfPij7xzywT6OqZh5pZUEhPiB0BUkC95JdLCF+5NunSE00UE+fKLCclcNDyeiX2iuGVybz64dSJaw/2fbSUxMpBPbz+dRy4fhr+Phce+2dmp+6w/eJTiyhqm9I9hdHIk6w4WdNlg8fK0XLKLK1m+u2EhWW5JJdHBfvV1LKqoofoEt4oUwpmcFvCVUq8opbKVUlucdQ9x6hrZI4Iv7zyDv12awoe3TiQ+PICYED/mTe7DN1sPs7YTrfOlu7KxWhSn9Y1mdHIExRU1XbZZ+2bHeMAmx8YuWmtHwDddOlGO1BIFJzgGIYQzObOF/xoww4nXF6e48EBfrpmQTFCjDJM3T+pFdLAfj3y9/bjTDS/blcuoHuGE+tsYkxwB0Kk3jpbUJYLb4gj4ZVW1VFTbiWrUwgcokJk6wo05rQ9fa71MKdXTWdcXninIz4e7zurHnz7ZwuA/f0utXVOrNbEhfiRHBdEzOoghtubdJrkllWzOLOSec/
oDkBwVSHSwL2sO5HPV+B4nVKbyqlp2Z5cQYLOSll1CWVVN/aKrui6duuRxJl1yyAndTwhncfmgrVJqHjAPIDY2ltTU1E5dp6SkpNPnnqo8tc7d7ZrL+tkoqwarAqWgoKKGI/kFrD+QxxcWja91Md0CGz6grswyc+6Diw+Smmq2XuwRWMvyHVmkpp7Y9My0glpq7Zqx8YplGfD2l0vrX8vau4PU4jTSi82b0PLVG6hK7/r/Vp76t26PN9bbmXV2ecDXWi8AFgCMGTNGT506tVPXSU1NpbPnnqo8uc7TW3k+LbuES55ZygvbLHw0/zQignypqbXzztvriAoq4NqLzsRiMfnpd1n28PevdjBk9MT62TSdsX/FPmAbd18ynmXPrcCnW2+6hwXAT2s587SxpCSEkV1cwf+uWET35L5Mndiz0/dqjSf/rdvijfV2Zp1llo44pfTtFsxdo/zJOFrOTa+v5t8/7GXKY6n8d9sRLhoeXx/sAUYnRwIn3o+/KbOQmBA/hiWGERPix+bMwvqdrureSCICTZeOrLYV7kwCvjjl9Iuw8tScEaxPP8rfvtxOQkQAL107hj9fOLjJcSkJofj6WFh38MQC/uaMQoYlhKGUYmhCGFsyC+s3L6/ru7dZLYT6+8hqW+HWnNalo5R6F5gKRCulMoD7tdYvO+t+wrucN7Q7b980nhB/G0MTw1o8xs/HyrCEMN5adYDth4oYEh/G+N6RTOkX0+STQFtKK2tIyynhgmHdAUhJCCN1ZzYH88sID7Rhsza0mSKDfMkvkxa+cF/OnKVzpbOuLQSYLRbbc9/5g3hv9UG2ZhXx8vK9vLB0D71jgrhlUm9mjkzA32Zt8/ytWUVobbZsBBiWEIZdww+7c+pn6NQxCdSkhS/cl8sHbYVwptHJEYx2zMmvrKnl261HWLBsD/f9ZzPPLUnj7ZvHkxwV1Or5mxwLrlISTMCv+zSRXVxJ75im50UG+ZF5tNwZ1RCiS0gfvvAafj5WLh4ez+d3nMGbN42jtLKGuQtWcSCvtNVzNmcW0j3Mv34bx9hQ/4b8Oc1a+DZp4Qu3JgFfeB2lFJP6xfD2zRMor65tM+hvzihkaELTMYK6n2OaBXw/Ckqrj3uFsBAniwR84bUGx4fyjiPoz3lxVX3ahDpFFdXszS2t77+vU9e9U5dHp05kkI2qWjslTt54RYjOkoAvvNrg+FDevWUCFgWzXljJpxvMKt3ckkr++a3J2jk0MbzJOcPqA37zFj5IPh3hvmTQVni9Qd1D+exXZ3DbW+v49cINfLQuk5/25lFVa+fi4fFM7B3V5PhxvSOZ1C+acb0imzwfGWQDTD6dHlGBJ638om2llTUUlFWRGCF/Ewn4QmBa62/dPJ6/frGN91anM3NkAvOm9G5xI5ZQfxtv3jS+2fP1LXxJkexWnl60mw/XZrDmT2ehVMfWX3gqCfhCOPj6WPjrpSn8+aLBTRZUdVSkI71CXokEfHey43AxeaVV5JZUnVBOJU8gffhCHKMzwR4g0jGI++OePNn5yo3UzcBqa/qtt5CAL0QXCfbz4erxPfjP+kxm/msFOw8XNztGa01adjHrDhaQnl9GeVUtOcWVpO7M5rklaXyyPrNLy1RYXs0VL6xkQ/qJpYg+VdXU2skoMIvh9ueVubg0riddOkJ0oYdmDmVSvxj++PFmLnpmOWf0iyYpIoCEiAD25ZaxbFdOu6tx07JLuPuc/l3S3/z15kOs3l/Ah2vTGZEU3v4Jx1i2y+zhO7l/zAmXxRWyjlZQ49jXWFr4EvCF6HIzUuIY2zOCf/x3JxvSC1m9L5/iyhqC/Xw4vW8Ud5zZl7hQf3JKKskprsTPx8KQ+DAGdQ/hka938OySNKpq7dx33sATDvqfb8oCIHVnDlrr47qe1pr7/rMZiwWW/W7aKTngub9RkJcWvgR8IZwiKtiPhy8bBpjAWVReQ6Cftd3xgb/PHIqvj4UFy/ZSUlnD/14wmADfthO8tSa7uIIf9+SREB5ARkE5+3JL6d3CrKPW7Mkpqf80ciCvjJ7Rrecccld1rfoBsSHSwkf68IVwOqUUYcekUm6NxaL4y8VD+OWU3rzz00HOfmIpi7YfafHYXUeKeejLbeSVtJy/56tNh7Br+OulQwBY6uie6ajUnQ3H/7D7+M51hT98vJnb3l7b5Ln9eWX42yyM7RXBvtxSr097IQFfCDejlOK+8waxcN4EAmxWbnp9DU+tq2BfbkMLdfnuXC7/10pe+mEfl/5rBbuPNB8g/mxjFgPjQjhzYCy9o4OOO+Av3ZVD327BJEUGsHRX7gnXy5lq7ZovNmaxZEcONY1mSB3IKyM5MoieUUEUV9Rw1Mv3K5CAL4SbmtA7ii/vnMS95w1ke14tZz++lL98vpXXV+7n+ld/Jj48gBeuGU15lZ3L/rWySSs8Pb+MdQePctHweMAMuq7am0dFdW2H7l1WVcNPe/OZ2j+Gyf1i+HFPrltPNd2aVUhRRQ3l1bXszi6pf/5AXinJUYH0dKTA3u/l3ToS8IVwY74+Fm6d0odHJgdwxZgkXl+5n/s/28rEPlF8OH8iM1Li+PSO00mICOD6V1fzz//upKrGzhebDgFwsSPgTxkQQ0W1nZ/35XfovqscqSWmDIhhUr8YSqtqWX/Qfad2rtyTV/+4bg8Du11zIN+MPfSMNmkVDnj5wK0M2gpxCgj3s/DwuUO5/rSerN6fz5yxSfVjAgnhAXw4/zT+/MkWnlmcxnfbjlBRXcuIpHCSIk2gm9ArCl8fC0t35XRoimXqzhwCbFbG9oykqtaO1aJYtiunWf4gd7EiLZe+3YLJLqpgY0Yhc8bC4aIKqmrs9IgMJDEiEKWkhS8tfCFOIQPiQrhmQnKzAeBgPx8enzOCl68bQ35pFfvzyupb9wABvlbG94rsUD++1prUnTlM7BOFv81KqL+NkUnhbjtwW1VjZ/X+fM7oG82wxHA2OhaZ1QX3nlFB+NusxIcFeH0LXwK+EB5k+qBYvvvNFB6amcJV43s0eW1K/xjSsktYsz+fb7Yc4snvd/H15kPN+ub355VxML+MqQMaPglM6hfDpsxCCko7lyfoYF4Z9/1nU6fPb8v6gwVUVNs5rU8Uw5PC2Hm4mIrqWg46gnuyI3NpclSg17fwpUtHCA8TFmjj6vHJzZ6fOiCGv325nVkv/Njk+ehgP2aPSeSswbEMigsldWc2YN4g6kzqH80T3+9ixZ5cLhwWz/HQWvM/H21k1d58gv18+OMFgztRq9at2JOHRcH43lFooMau2XaoiP15ZdisivjwAACSo4L4duvhLr33qUYCvhBeok9MMH+9ZAhKKVISwujXLZif9uXxzk/pvLB0D/9K3YNFmb1/e0UHNdncfXhiOKH+PizdmdNmwC+qqOZgXln9rmAAH6/PZNXefBIjAnj9xwPcdEZv4sL8u6xeP+7JZWhCGGEBNoY7NqvZmH6UA3mlJEUEYrWYFcI9owLJL62isLyasABbl93/VCIBXwgvoZTiFxN7NnnuzIGxnDkwluziCtYfPMrWrCK2ZRUxIyWuyXFWi+Ksw
bF8uC6D5KhAbpvaF4ulIdVCbkklryzfx5s/HqC4sob5U/twzzkDKK6o5qEvtzOyRzhPzRnJ9MdTeWbxbh6aObRTdaiptXPHO+uJCfHjfy8cTHWtnfUHj3LL5N4AxIX5Exvqx6aMQvbnldV35wD1b2AH88oYesy2ld5CAr4Qgm4h/pw7JI5zh8S1eszfLk3Bbtf847+7WHfwKL87dwDrDx5lRVoui3YcobLGzvkp3Qn0tfJ86h52HS4mLMDG0fJq3rx0KD2iApk7tgfv/nyQX07u06ldwZ5dksY3jm6ZfbmlzB2XRI1dc1qfhl3J6gZujxRVML7RrKJe0Q1z8SXgCyFEGwJ9fXhizghGJ0fw4BfbWLzD9PXHhfpz2ahEbjqjV/0OYcOSwnngs63U2jW3TOrF4PhQAH51Zl8+WJvOk9/v4vE5I47r/usOFvDM4jRmjkzgtD5R3Pefzazam4ev1cKY5IbAPjwxjO+2mXQUPRu9qfSIrJuL3/UDt7V2zU978/hsYxbfbD3MuYPjeOTyoW6XcE4CvhCiw+q6hUYlR7AxvZBxvSLpExPULLD9YkIyfWKC+HxjFned1b/++W6h/lw3sScLfthLeXUtKQlhpCSEERvqR3iAL4F+VtKyS9hw8CjbDhVhLalm0KgKgvx8uGvhBuJC/fnLJUMI9bfRPSyAW99ay8ge4U0SzA1vlAa68ThEgK+VuFD/Ls+aWVxRzeXPrz3eiD4AAAnfSURBVGTXkRICfa0MjAvhvTXpDEsKa3Hw3JUk4AshjtuQ+DCGxLfdLXJan2hO6xPd7PnbpvUlt6SKNQfy+XpL67NmIgJtFJRV88HDi0iICCCzoJyF8yYS6m8GXM/oF03q76ZiPebNZlhC44DftNsoOSqwy1v4f/1iG2nZJTw2axgXDovHz8fC9a+t5i+fbWNoQhjDEo9/HwJnkYAvhDipwgJs/HP2cAAKy6rZfriI/NIqjpZVU1xRTXJUICOSIogL82fhl4vJtCXw5eZD3H3OgGYrfaODm+9RGxZoo2dUIAfzy0iMaBrwe0YFsWjHEWpq7fh0civLxr7bdoT312Rw+7Q+XDEmqf75J+eM4MKnf2D+W+v48s4zCHfsd9ySmlo7mUfLm3wacRYJ+EIIlwkLtDGhd1Srr8cFWZg7dQB3nzPguK47obdZJezr0zSoj06O4L016cx46gd+P2MgZw3q1uF+drtjfn9CeAARQb7klVRy3382Mah7KL+e3r/JsZFBvjx39Shmv/gjFz6znNljkrh8dCIJjjUBdTIKyrhr4QYyj5az6O4pBPo6NyRLwBdCeJz7LxrSYmbQK8YkEhpg4/++2cEtb6xhaEIYA+JC6B7mT0yIHxZH8LdaFLGhfsSHBxDk68Pnm7JY+HM6B/NN///AuBCsFkVReQ1v3jS82RsLwMgeEfz7urG8kLqHx7/bxRPf72JCryguGNadGSlxrN6Xz+8/2oRdw0MzU5we7EECvhDCAwX4WlvcKUwpxYyUOKYP6sZ7q9P5z7oMVqTlcqSoAns7e6NM6B3JHdP6cqSogh/35rH+4FF+f95ABnUPbfWcKf1jmNI/hvT8Mj5cm8HnG7P40ydb+POnW7BrM6Po6StHnpTuHJCAL4TwQjarhWsmJHPNBDOLpqbWztHyauyOHbFqajWHiyrILCgnv7SKSf2im2wP+avp/Y7rfkmRgfzm7P7cdVY/dhwu5uvNhwj08+HG03u1+OnAWSTgCyG8no/V0mwAOD48gFE9Irr0PkopBnUPbfNTgTNJtkwhhPASEvCFEMJLODXgK6VmKKV2KqXSlFL3OvNeQggh2ua0gK+UsgLPAecBg4ErlVJdmwhbCCFEhzmzhT8OSNNa79VaVwELgUuceD8hhBBtUFq3M/m0sxdWahYwQ2t9s+PnXwDjtdZ3HHPcPGAeQGxs7OiFCxd26n4lJSUEBwe3f6AH8cY6g3fW2xvrDN5Z7+Ot87Rp09Zqrcd05FiXT8vUWi8AFgCMGTNGT506tVPXSU1NpbPnnqq8sc7gnfX2xjqDd9bbmXV2ZpdOJpDU6OdEx3NCCCFcwJldOj7ALmA6JtCvBq7SWm9t45wc4EAnbxkN5Hby3FOVN9YZvLPe3lhn8M56H2+dk7XWMe0f5sQuHa11jVLqDuBbwAq80lawd5zToUK3RCm1pqP9WJ7CG+sM3llvb6wzeGe9nVlnp/bha62/Ar5y5j2EEEJ0jKy0FUIIL+FJAX+BqwvgAt5YZ/DOentjncE76+20Ojtt0FYIIYR78aQWvhBCiDZIwBdCCC9xygd8b8nIqZRKUkotUUptU0ptVUr92vF8pFLqO6XUbsf3rt2xwQ0opaxKqfVKqS8cP/dSSv3k+Ju/p5TydXUZu5pSKlwp9aFSaodSartSaqKn/62VUr9x/NveopR6Vynl74l/a6XUK0qpbKXUlkbPtfi3VcbTjvpvUkqNOpF7n9IB38syctYAd2utBwMTgNsddb0XWKS17gcscvzsaX4NbG/086PAE1rrvkABcJNLSuVcTwHfaK0HAsMx9ffYv7VSKgG4ExijtU7BrN35//buL0SLKozj+PeXmawKuhmIucUaLV1EpdKFVERsXZRGBQUWQhJC5EV/bsqiq6CriAgrhFLKSgoyM+lCqlUqqPwX29o/KGtJYzeV0NoIM/t1cc7asPqG2r47vjPPB4adOTPMew7P8rxnzsx75naqGeuXgOtHlDWK7Q1AV17uBlb8nw9u6YRPjWbktD1g+7O8/hspAcwktXd1Pmw1cEs5NWwOSR3AAmBl3hbQDazNh1SxzVOAq4FVALb/tH2Aisea9Lugtvwr/YnAABWMte0PgV9GFDeK7c3Ay04+BaZKmnGqn93qCX8msLuwvSeXVZqkTmAOsAWYbnsg7xoEppdUrWZ5GngI+DtvTwMO2P4rb1cx5rOAfcCLeShrpaRJVDjWtn8CngR+JCX6g8AOqh/rYY1iO6o5rtUTfu1Imgy8CTxg+9fiPqdnbCvznK2kG4G9tneUXZcxdiYwF1hhew7wOyOGbyoY63ZSb3YWcC4wiWOHPWqhmbFt9YRfqxk5JY0nJfs1ttfl4p+HL/Hy371l1a8JrgRuktRPGq7rJo1tT82X/VDNmO8B9tjekrfXkr4Aqhzr64AfbO+zfRhYR4p/1WM9rFFsRzXHtXrC3wZ05Tv5Z5Fu8mwouU5NkceuVwFf236qsGsDsDivLwbeHuu6NYvtR2x32O4kxXaT7UXAZuC2fFil2gxgexDYLemiXHQt8BUVjjVpKGeepIn5f324zZWOdUGj2G4A7sxP68wDDhaGfk6e7ZZegPmkaZh3AY+WXZ8mtvMq0mVeH9Cbl/mkMe0e4FvgfeDssuvapPZfA7yT1y8AtgLfAW8AE8quXxPaOxvYnuO9HmiveqyBx4BvgC+AV4AJVYw18BrpPsVh0tXckkaxBUR6EnEXsJP0FNMpf3ZMrRBCCDXR6kM6IYQQTlAk/BBCqIlI+CGEUBOR8EMIoSYi4YcQQk1Ewg+VJ+mIpN7CMmqTjknqLM56GMLprKkvMQ/h
NPGH7dllVyKEskUPP9SWpH5JT0jaKWmrpAtzeaekTXn+8R5J5+fy6ZLekvR5Xq7Ipxon6YU8l/u7ktry8ffl9xf0SXq9pGaGcFQk/FAHbSOGdBYW9h20fQnwLGlmToBngNW2LwXWAMtz+XLgA9uXkea2+TKXdwHP2b4YOADcmssfBubk89zTrMaFcKLil7ah8iQN2Z58nPJ+oNv293liukHb0yTtB2bYPpzLB2yfI2kf0GH7UOEcncB7Ti+uQNIyYLztxyVtBIZIUyOstz3U5KaG8J+ihx/qzg3WT8ahwvoR/r03toA0D8pcYFth1scQShEJP9TdwsLfT/L6x6TZOQEWAR/l9R5gKRx9z+6URieVdAZwnu3NwDJgCnDMVUYIYyl6HKEO2iT1FrY32h5+NLNdUh+pl35HLruX9LapB0lvnrorl98PPC9pCaknv5Q06+HxjANezV8KApY7vaYwhNLEGH6orTyGf7nt/WXXJYSxEEM6IYRQE9HDDyGEmogefggh1EQk/BBCqIlI+CGEUBOR8EMIoSYi4YcQQk38A7nYAJ7roZf2AAAAAElFTkSuQmCC\n", 442 | "text/plain": [ 443 | "
" 444 | ] 445 | }, 446 | "metadata": { 447 | "needs_background": "light" 448 | }, 449 | "output_type": "display_data" 450 | } 451 | ], 452 | "source": [ 453 | "epochs = list(range(max_epoch))\n", 454 | "train_loss = exp_results[\"train_loss\"]\n", 455 | "valid_loss = exp_results[\"valid_loss\"]\n", 456 | "lines = plt.plot(epochs, train_loss, epochs, valid_loss)\n", 457 | "\n", 458 | "plt.legend(('Train', 'Validation'), loc='upper right')\n", 459 | "plt.title('Loss chart')\n", 460 | "plt.xlabel('Epochs')\n", 461 | "plt.ylabel('Loss')\n", 462 | "plt.grid(True)\n", 463 | "plt.show()" 464 | ] 465 | }, 466 | { 467 | "cell_type": "code", 468 | "execution_count": 7, 469 | "metadata": {}, 470 | "outputs": [ 471 | { 472 | "data": { 473 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEWCAYAAACJ0YulAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzs3Xd4lFX68PHvSe+VFAKB0AmdEMBCFVR0USyIoKhgwbK6Kquur+vPsquru5a1rr0hKqKiooKdKCDSkd4JEEJCeq8z5/3jTBokZAIZZsLcn+uaK5mnzTkpz/2crrTWCCGEEAAezk6AEEII1yFBQQghRC0JCkIIIWpJUBBCCFFLgoIQQohaEhSEEELUkqAghAtTSj2ilJrr7HQI9yFBQbg8pVSKUipPKeXr7LS0dUqpGUqpZc5Oh3BdEhSES1NKJQAjAQ1cfIo/2+tUfp6jnW75EY4hQUG4umuB34F3gevq71BK+SulnlFK7VdKFSilliml/G37RiilflNK5SulDiqlZti2pyilbqx3jQZPzkoprZT6s1JqF7DLtu152zUKlVJrlVIj6x3vqZR6QCm1RylVZNsfr5R6WSn1zFHpXaiUuruxTCql+iqlflBK5SqlMpVSD9Tb7aOUmmO7/halVHK98+6v99lblVKXHpW35Uqp/yqlcoCPgVeBM5VSxUqpfPt+BcKdSFAQru5a4APb63ylVEy9fU8DQ4CzgAjgPsCqlOoMLAZeBKKAQcCGFnzmJcBwoI/t/WrbNSKAD4FPlFJ+tn2zgWnAhUAIcD1QCrwHTFNKeQAopdoB423nN6CUCgZ+BL4F4oDuwE/1DrkYmAeEAQuBl+rt24MpSYUCjwJzlVLt6+0fDuwFYoDpwC3ACq11kNY6rAU/E+EmJCgIl6WUGgF0BuZrrddiboBX2fZ5YG7Ad2qtD2mtLVrr37TWFbZjftRaf6S1rtJa52itWxIUntBa52qtywC01nNt16jWWj8D+AK9bMfeCDyotd6hjT9sx64CCoBxtuOmAila68xGPm8ikKG1fkZrXa61LtJar6y3f5nWepHW2gK8Dwys2aG1/kRrna61tmqtP8aUbobVOzdda/2iLe1lLfgZCDclQUG4suuA77XW2bb3H1JXhdQO8MMEiqPFN7HdXgfrv1FK3aOU2marosrHPJW3s+Oz3sM8nWP7+n4TxzWX3ox635cCfjXtA0qpa5VSG2zVZPlAv3ppOyYvQjRHGp6ES7K1DUwBPJVSNTdFXyBMKTUQ2ASUA92AP446/SANn5brKwEC6r2PbeSY2qmDbe0H92Ge+Ldora1KqTxA1fusbsDmRq4zF9hsS28i8EUTaTqIKUm0iK2a7A1b2lZorS1KqQ310tYgL028F6IBKSkIV3UJYMHU6w+yvRKBpcC1Wmsr8DbwrFIqztbge6at2+oHwHil1BSllJdSKlIpNch23Q3AZUqpAKVUd+CGZtIRDFQDWYCXUuohTNtBjTeBfyqleihjgFIqEkBrnYZpj3gf+Ow41TdfA+2VUncppXyVUsFKqeF2/IwCMTf5LACl1ExMSeF4MoGOSikfO64v3JAEBeGqrgPe0Vof0Fpn1LwwjaxX26pP7sGUGFYDucC/AQ+t9QFMw+9fbds3UFcP/1+gEnNzfA8TQI7nO0wD8E5gP6Z0Ur9K5llgPvA9UAi8BfjX2/8e0J+mq47QWhcB5wIXYaqKdgFjm0kXWuutwDPAClt++gPLmzntZ2ALkKGUym7mWOGGlCyyI4TjKKVGYaqROmv5ZxNtgJQUhHAQpZQ3cCfwpgQE0VZIUBDCAZRSiUA+0B54zsnJEcJuUn0khBCilpQUhBBC1Gpz4xTatWunExISTujckpISAgMDWzdBbYA75tsd8wzumW93zDO0PN9r167N1lpHNXdcmwsKCQkJrFmz5oTOTUlJYcyYMa2boDbAHfPtjnkG98y3O+YZWp5vpdR+e46T6iMhhBC1JCgIIYSoJUFBCCFErTbXptCYqqoq0tLSKC8vP+5xoaGhbNu27RSlynWcSL79/Pzo2LEj3t7eDkqVEMIVnRZBIS0tjeDgYBISElBKNXlcUVERwcHBpzBlrqGl+dZak5OTQ1paGl26dHFgyoQQrua0qD4qLy8nMjLyuAFB2E8pRWRkZLMlLyHE6ee0CAqABIRWJj9PIdzTaRMUhBDidJJfWsm7y/eRXVxxSj/3tGhTcLacnBzGjTNL8WZkZODp6UlUlBk4uGrVKnx8ml/PZObMmdx///306tWr2WOFEKdeaWU1e7NKKK+y4OftiZ+3B3uzSli6K5tlu7MJD/Dm35cPoEdMXftdeZWF0koLEYH2r2lksWo+Xn2Qp77bTl5pFW8vT+WdmUPpFhXkiGwdQ4JCK4iMjGTDBrMu/COPPEJQUBD33HNPg2O01mit8fBovHD2zjvvODydQoiWKSyv4rGvt/LbnhzS8hpfOC/Ax5PhXSLYmFbAxBeX8fc/JTJxQBxzf9/PnBWpVFRb+XH2aGJC/I77WdUWKz9uy+TlJXvYdKiAYV0iuHp4J/7x1VYuf+U33rg2maEJEQ7IZUMSFBxo9+7dXHzxxQwePJj169fzww8/8Oijj7Ju3TrKysq48soreeihhwAYMWIEL730Ev369aNdu3bccsstLF68mICAAL788kuio6OdnBsh3MuuzCJufn8t+3NLmdAvlinJ8XSPDiLYz4uySgtlVRZiQvxI6hSOj5cHR4rKufeTjTz05RYe/WorFqtmVM8oft+bw5OLt/PfKwc1+jnVFi
tvL9/He7/t51B+GR3C/Hl+6iAuHhiHUorB8eHMeGcVV7+5khemDmZCv8aWFW89p11QePSrLWxNL2x0n8ViwdPTs8XX7BMXwsMX9T2h9Gzfvp05c+aQnJwMwJNPPklERATV1dWMHTuWyZMn06dPnwbnFBQUMHr0aJ588klmz57N22+/zf33339Cny+EaNq3mzMY0ysKP2/PY7b/df4G/H08+fDG4QzvGtnstaKD/Xh35lA+WnWQXUeKmDasEz1jgnnm+x28+PNupg3rxLAuxz7pz/19P/9atJ3hXSL4v4l9OLdPDJ4edR09OkUG8NmtZ3HXxxvoEOZ/zPmtTRqaHaxbt261AQHgo48+IikpiaSkJLZt28bWrVuPOcff358LLrgAgCFDhpCamnqqkiuE29iVWcQtc9eycEN6g+3VFit3fbyerlFBfH3HSLsCQg2lFFcN78TDF/Wlp61t4bYx3ekQ5s9DX26m2mJtcLzWmjm/72dQfBgf33wmE/rFNggINcIDfXjv+mH07xh6AjltmdOupHC8J3pnDF6rP7Xtrl27eP7551m1ahVhYWFMnz690bEA9RumPT09qa6uPiVpFaKt0lpTVmVBofD3sa82YG92CQBpeaUNtmcUllNeZeXq4Z2IDT1+O4A9/H08efBPidz6wTo+XHWAa89MqN23fHcOe7NKeHbKwJP+nNZy2gUFV1ZYWEhwcDAhISEcPnyY7777jgkTJjg7WUK4vO0Zhfy+J4fUnFJSc0o4UlhBRbWFSouVguIyyn5YTJVFE+TrxZJ7xhAV7NvsNQ/kmGCQlt+wATk93zyoxbViVc2EfrGM6N6Op77bwTm9o+kYHgDAnBWpRAT6cGH/9q32WSdLgsIplJSURJ8+fejduzedO3fm7LPPdnaShHBZFdUWFm/K4P3f97N2fx4AgT6eJLQLpH2oH37envh6eZCbnUlit874eHrw/E+7mLfqAHeM69Hs9ffnmpJC+jFBwbxvzaCglOLxS/sx8YVl/PmDdcy/5Uyyiyv5cVsmN4/udkybhjNJUGhljzzySO333bt3r+2qCuYP4/3332/0vGXLltV+n5+fX/v91KlTmTp1ausnVAgXYbFqPNSxo+hvnbuOn7cfoUu7QB60dfOMCfE95jiz2ExvANYfzOeDlQe4ZUw3vD09aq+/OjWX4V0iGpy731ZSqCkZ1DhkCwqt3ajbOTKQp64YyC1z1/LY19sI9jO336uHd2rVzzlZ0tAshHCIg7mlTHltBU8s2kZmYePzaBWWVzHhuV/5vy83N9i+4WA+P28/wp3jevDT7NHcOLIrsaF+zU6/ct2ZnckoLOeHrZm1257/cSdTX/+9trRR40CuCQqHC8qwWnXt9kP5ZUQE+tjdNtESE/rFcvOorrz/+37eWraPcYkxtVVJrkKCghCi1WUVVXDNWyvZlFbAG0v3MuLfP3Pfp38cExweWbiFXUeKmfv7ATam1ZWQ/7dkN6H+3tw0qisejfTGacqYXtHER/jz3m+pAGxMy+fllD0AbD5UUHtclcVKWl4ZYQHeVFl0g6kk0vPLiAs7+Qbmptx7fi+GdYmgotrKtWd2dtjnnCgJCkKIVlVYXsV1b68io7CcuTcOI+WesUwd2okvN6RzycvL2ZFRBMDXG9NZsO4QN4zoQmSgD499sw2tNbsyi/h+aybXnZVAkG/Larg9PRTXnNGZlfty2ZiWz1/n/0FUkC8hfl7syCyqPS49vwyLVXNGF9PdtH5j86G8MoeOB/Dy9OC16UN45eokRnRv57DPOVESFIQQrabaYuXGd9ew60gRr12TzJDOEXSKDOCfl/Tj89vOxqo1k1/9jS/WH+Lvn29mUHwY91/Qm7vP7cmqfbl8tyWTV37Zg7+3JzPOSjihNExJjsfXy4Pr3l7FriPFPHl5f/rEhbDtcF1QqGlPOLObCQo1jctaa1tJwbGDxMIDfbigf3uXnI1YgoIQokUqq60s3ZXF+ytSG9TFA6w7kM+q1Fweubgvo3tGNdjXJy6EBbedTWyIH3d9vIEqi5X/XjkIb08Ppg6Np0d0EP/8eisLN6QzdVh8iyaRqy8swIdJg+LIK61i2rB4xvSKpndsCDszi2rTuz+38aBQWFZNSaXllIwcdlXS+0gIYZc1qbm8szyVX3ZmUVxhBlT2bh/SYJK2HRlmiplzejc+V1eHMH8+veUsHv1qC+f2iaFLOzO408vTgwf+lMjMd1bj7am4aWTXk0rrneN7EuLnzV3n9jTpjA2mtNLCwbxSOkcGciCnBF8vD7pHBRHs61XbAyktv7Q2ne5KSgqtYOzYsXz33XcNtj333HPceuutTZ4TFGSmwU1PT2fy5MmNHjNmzBjWrFlz3M9+7rnnKC2tG5F54YUXNujSKkRL/bYnmzs+Ws+cFansySpm95EibpqzhsmvruD3vTlcNLA9T00eAHDMPGPbM4oI8fMi9jgzgoYGePPslYO44KgBW2N6RjF5SEduHtXtpKtvOoT58+DEPrVtEr3bh9SmDyA1p5ROEQF4eCg6hPvXdkN1xMC1tkZKCq1g2rRpzJs3j/PPP79227x58/jPf/7T7LlxcXF8+umnJ/zZzz33HNOnTycgwHRrW7Ro0QlfSwiAj1Yd5Ks/0vnqj7o5gYJ8vbjnvJ5cP6ILAT5eaK3516JtbDvcMCjszCyiV2zwCdWVK6V4+grHTPfQMyYIpWD74SLO7xvLgZxSOkea/5m4MP/a6iNHDFxra6Sk0AomT57MN998Q2VlJQCpqamkp6czePBgxo0bR1JSEv379+fLL7885tzU1FT69esHQFlZGVOnTiUxMZFLL72UsrK6HhG33norycnJ9O3bl4cffhiAF154gfT0dMaOHcvYsWMBSEhIIDs7G4Bnn32Wfv36MXz4cJ577rnaz0tMTOSmm26ib9++nHfeeQ0+R4hthwsZnxjDr/eO5fFL+3HPeT359b6x3H5ODwJ8zHOkUorE9iENgoLWmu0ZJii4mgAfLzpHBLA9oxCtNQdyS+kUYaqu4sL8aksKh/LL8PHyoF3QibVnnA5Ov5LC4vshY1Oju/wt1eB5AlmO7Q8XPNnk7oiICIYNG8bixYuZNGkS8+bNY8qUKfj7+/P5558TEhJCdnY2Z5xxBhdffHGTT1GvvPIKAQEBbNu2jY0bN5KUlFS77/HHHyciIgKLxcK4cePYuHEjf/nLX3j22WdZsmQJ7do17Nq2du1a3nnnHVauXElhYSHjx49n9OjRhIeHs2vXLj766CPeeOMNpkyZwmeffcb06dNb/nMRbdpby/ZRbbFy8+hutdvKqyzszSrmwn6xdIoM4OrIpvvRJ7YPYe7v+6m2WPHy9CCjsJyi8mp6xbheUADoHRvCjowisooqKKuyNCgp5JdWUVJRXbuegSv2CjpVpKTQSmqqkMBUHU2bNg2tNQ888AADBgxg/PjxHDp0iMzMzCav8euvv9benAcMGMCAAQNq982fP5+kpCQGDx7Mli1bGp1yu75ly5Zx6aWXEhgYSFBQEJdddhlLly4FoEuXLgwaZBb8kKm53dO8VQf45
9db+V/KngY9iHZkFGHV5obfnD7tQ6iotpKaY+YQqqmv7xXb/LnO0Cs2mH05JbXjFWqCQk2j8uGCMocPXGsLTr+SwnGe6MscOHX2pEmTuPvuu1m3bh2lpaUMGTKEd999l6ysLNauXYu3tzcJCQmNTpXdnH379vH000+zevVqwsPDmTFjxgldp4avb90Mkp6enlJ95GZ+2ZnF37/YTGSgDzkllezNLqZ7tPm/qKkOsico1Byz9XAR3aODaweluWpJIbF9MFrDj7YpMDpHmuqjmqBwKL+cQ3lljOkV1eQ13IGUFFpJUFAQY8eO5frrr2fatGmAWUEtOjoab29vlixZwv79+497jVGjRvHhhx8CsHnzZjZu3AiYKbcDAwMJDQ0lMzOTxYsX154THBxMUVHRMdcaOXIkX3zxBaWlpZSUlPD5558zcuTI1squcGFVFivP/biT3/ZkH7Nva3oht81dS8+YYN6eMRSAdfvreqttO1xIoI8nnSKan4+ne3QQ3p6qNpDszCgiNsSP0ADvVspJ6+ptK8F8tyUTD1UXDGoalVOzSzhSVOHWjcxwOpYUnGjatGlceumltdVIV199NRdddBH9+/cnOTmZ3r17H/f8W2+9lZkzZ5KYmEhiYiJDhgwBYODAgQwePJjevXsTHx/fYMrtWbNmMWHCBOLi4liyZEnt9qSkJGbMmMGwYcOwWq3MmjWLwYMHS1XRaa7Copk1Zw1LdmQRGejDj7NHE24bBFZYXsWs99cQ4u/NOzOGEh3sS6i/N+sO5DFlaDwA2w6bhmJ75hvy8fKgW1RQbbfU7RlF9HTBRuYanSIC8Pf2JKOwnI7h/vh4mWfi6GBfPD1U7YR5EhREq7nkkkvQuq5+tl27dqxYsaLRY4uLiwHTW2jzZjNDpL+/f21AOdq7777b6PY77riDO+64o/Z9/Zv+7NmzmT17doMV5+p/HsA999zTfMZEm5BfWslTq8vZW1DKLaO78ebSvTz2zTaesa3q9cjCLaTnl/HJLWfVrig2uFNY7c1Qa822jEImDYqz+zP7xIWwbFc21RYru7OKGdHD9ebyqeHhoegZG8wfB/Nr2xPADJyLDfFjdWouAB0lKAgh2rr9OSXc+N4aUgusvHxVEhf0b4+Hgv+l7OGypA4UlFWxYN0h/nJOd4Z0Dq89b0incFJ2ZFFQVkVhWRVF5dV2tSfU6NM+hAXrDrHuQD6V1dbadYldVWJtUAhssD0uzI/VqVJSAAkKQrR5S3dlcfuH6wH4a7Jf7Ujhv4zrwaJNh7l/wUaKyqsZ2DH0mBXJkmwBYsPBfCqqLIB9jcw1ao79fP0hwEwn4cpqxlB0PqrNxAQCExRaY13mtuy0aWiuX20jTp78PNuGN5fu5bq3VxEb4sdXt48gMbJuYRg/b0/+dVl/DuaWUVFVN/lcfQPjw/BQsG5/HlsPF6JUy27sNUFh0abDeCjT+OzK+saFAtA1qmE6axqdo4J9XWppTGc4LUoKfn5+5OTkEBkZ6daDTlqL1pqcnBz8/Nz7icnVfbo2jce+2caEvrE8M2Uggb5e7D3qmLO6tePxS/sRF+Z/zI0QzPQVPWOCWXcgjwAfTxIiA2tHLdsjItCH2BA/MgrL6dou0OVvqEMTwnlnxlBGHTWDa9xRPZHc2WkRFDp27EhaWhpZWVnHPa68vNwtb3Qnkm8/Pz86duzooBSJk7UlvYC/f76JM7tG8tJVg/HybLrQf/Xw46/uNaRzOAs3pBMa4M2AjqEtTkti+2AyCstdcnqLoymlGNvIDK41JYUObj5wDRwcFJRSE4DnAU/gTa31k0ft7wS8B4TZjrlfa93iGd28vb3p0qVLs8elpKQwePDgll6+zXPXfLcFu48U8+bSvRwuKCezsJxqq2bGWQlcOTT+mKqeGgWlVdw6dx3hAT682ExAsEdSp3A+WHmAoopqrkyOb/H5ie1DWLIjy+UbmY8nrjYoSEnBYUFBKeUJvAycC6QBq5VSC7XW9edneBCYr7V+RSnVB1gEJDgqTUK4kvT8Mqa/uZKi8iq6RQfRMTyA7OIKHvxiM28s3cusUV2pqraSmlNKen4ZIf7etAvyZcPBPA4XlDFv1pm0C/Jt/oOakVSvN1JLGplr9Ikz57h6I/PxxEf4E+LnRb8OLS8pnW4cWVIYBuzWWu8FUErNAyYB9YOCBmr+CkOBdIRwAwVlVcx4ZxUlFdV8eutZtTdjrTVLdhzhP9/u4O+fm/EkgT6edAj3p7i8mqziCqwaHr24b4OupScjITKAiEAfcksqSYxreVA4p3c0t4/tzug2PD1EgI8Xq/4+Hl+v06bvzQlTjuplopSaDEzQWt9oe38NMFxrfXu9Y9oD3wPhQCAwXmu9tpFrzQJmAcTExAxpaoBXc4qLi2sXt3En7phvV85zlVXz9Opydudb+WuyH30ij22ctWrNwSIrob6KUB9V24FCa02VFXw8G+9QcaL5fn5dOTvzLLx0TkCb66zhyr9rR2ppvseOHbtWa53c7IFaa4e8gMmYdoSa99cALx11zGzgr7bvz8SUIjyOd90hQ4boE7VkyZITPrctc8d8u3KeX/tlt+78t6/1F+vTWv3aJ5rvvVnFesWe7NZNzCniyr9rR2ppvoE12o57tyOrjw4B9VutOtq21XcDMAFAa71CKeUHtAOOODBdQjjVvuwS2gX5MmlQB2cnpVaXdoG16yUL9+bICrTVQA+lVBellA8wFVh41DEHgHEASqlEwA84fr9SIdq4rKIKooJPvoFYCEdwWElBa12tlLod+A7T3fRtrfUWpdQ/MMWYhcBfgTeUUndjGp1n2Io5QrisimoLDyzYTKCvJzeP7tbiboxZxZVuvdyjcG0OHaegzZiDRUdte6je91uBs48+TwhXVW2x8peP1vPdlky8PRUfrTrAFcnx3DWuB9EhDQc+peWVUlxRXTuPf43sogq6RUlVjXBN0v9KCDtZrZq/fbaJ77Zk8vBFfUi5dyxXDo3n0zVpzJ7/xzHHz57/B3/+YF2DbVprqT4SLu20mOZCiFPhsW+28dm6NGaf25OZZ5sR9I9d0p/IQF9e+HkXmYXlxNhKC4cLyli1LxdfLw+01rXdPAvLqqm0WIlqhUFnQjiClBSEsMPiTYd5e/k+ZpyVwB3ndG+w76KBcWgN32w8XLut5vuKaisFZVW127OKKwCkpCBclgQFIZpxpLCcBz7fxICOofz9T4nHDO7qHh1En/YhLPyjbkD+VxsPU3PY4YLy2u1ZRbagICUF4aIkKAhxHFpr7vtsI2VVlkbXI6hx8aA4NhzM50BOKQdySvnjYD7n9YkBIKN+UJCSgnBxEhSEOIrWmkpbtc87y1NJ2ZHFAxcm0q2R9QhqXDTQrGv81cZ0vt5kSgw3juwKQEZhXVDItpUUWmMiOyEcQRqahajn5+2Z3PvJRnJKKmu3jeoZxTVnHH9Ngg5h/iTb1iXw8FAkdQpjUHwYSh1VfVRcgbenItTf22F5EOJkSFAQAtPd9KUlu/nvjztJjA1hxlkJ+Pt4EuznxYX929s1SdxFA+N4eOEWAB6a2AdvTw+i
gnzJKCirPSarqIJ2Qb54eLStSeeE+5CgINxeeZWFO+eZAWmXDIrjicsG4O/T8mUlL+zfnke/2oIG/jSgPWAWgc8orKg9Jru4QqqOhEuToCBOa6WV1azcl8voHlGNPp1rrXlggRmQ9uCfErlhRJcTnjo6KtiX8/rEUmWx1o5XiA3xIzWnpPaYrKKK2n1CuCIJCuK09sjCLcxfk8awhAieumIAnSMbTi/x2q97WbD+EH89t2dtw/DJ+N/VSQ3etw/1Y8XenNr3WUUV9IuT1b2E65LeR+K0teFgPvPXpDGyRzu2ZRQy4bmlvLl0L3uyiqm2WPlxayb//nY7Ewe05/ajBqSdKA8P1aBEEhPqR1F5NSUV1VitmpySSumOKlyalBTEacmqNQ8v3EJUsC//uzqJ4opq/vbZJh77ZhuPfbMNHy8P0NAvLpSnJg902Gpj7UNNVVFGYTlh/t5YrFpmSBUuTYKCOC0tP1TNHwdLeXbKQIL9vAn28+a9mUPZfKiQ7RmF7DpSTFF5FXeO63lCjcr2ig0x02pnFJRTZbECEBUsbQrCdUlQEKed7OIKPtlZRVKnMC6pt7qZUor+HUPp3/HU1enH1pQUCsqpWSlEqo+EK5OgINqswvIqlu7MJjWnhP05JaRml7I3u4Ts4goU8OjF/Zw+HiA2pK76yMPWgifVR8KVSVAQbVJReRVXvLKCHZlFgHn67hwRwDm9o+gaFYRXXuopLRE0xd/Hk1B/bw4XlOHtaQKUlBSEK5OgINoci1Xzl4/WszurmFenJzGyRxSBvg3/lFNSDjopdcdqH+pHRkEFAT5e+Hl7EOQr/3bCdclfp3B5y3dnk5ZXyuie0cSG+vH4N9tYsiOLxy/tx4R+7Z2dvGaZUc1lBPt5ERXs67CeTkK0BgkKwqW9/use/rVoe+37blGB7MkqYebZCVw9/PiT1LmK2BA/Nh8qIMzfR6a4EC5PgoJwGf/9YSeVFivn9olhQIdQnli8nbeW7eNPA9pz6+hu/Lori5TtWfTvEMrfL0x0dnLtFhvqR3ZxJekFZXQ/zvTbQrgCCQrCJWxKK+D5n3YB8ErKHoJ8vSiuqGbGWQk8NLEPHh6Kfh1CuW1M64w8PpVqBrDtyy7hzK6RTk6NEMcnQUG4hLeX7yPQx5Pv7h7FmtQ8ftmZxeBOYVxzRuc2XwdfMwGe1rK4jnB9EhSE02UWlvP1xnR5YvaTAAAgAElEQVSmn9GZjuEBdAwP4JLBHZo/sY1oH+pf+710RxWuTibEE073/or9VFs1M85KcHZSHCK23lTZEhSEq5OgIJyqvMrCByv3Mz4x5phprU8XIf5e+Hub+ZUkKAhXJ0FBOFxltZWXft5FdnHFMfs+X3+IvNIqbhjRxQkpOzWUUrWNzVHSpiBcnAQF4XCfr0/j6e938ubSfQ22a615e9k++rQPYXiXCCel7tSoaWyWhmbh6iQoCIeyWjWv/boXgM/WpVFtmz4aYPnuHHYdKWbm2QltvodRc9qH+hHs6+XQabqFaA0SFIRD/bgtk71ZJVw0MI6sogp+2ZlVu+/d3/YRGejDRQPjnJjCU+OmUV154vL+zk6GEM2SoCAcRmvNq7/soWO4P09NHkC7IB/mrzET1e3PKeGn7Ue4angn/LxP/6fnxPYhTBxw+gc/0fZJUBAOs2Z/HusO5HPTyK74eXty6eAO/LTtCNnFFcxZsR9PpdrM/EVCuAsJCsJhXvtlD+EB3lyR3BGAK5LjqbZqPlx5gPmrD3JB//a1K5MJIVyDQ4OCUmqCUmqHUmq3Uur+Jo6ZopTaqpTaopT60JHpEa1jyY4jTH19Ben5ZY3ur6i28Oove/hx2xGuPTOBAB8zcL5nTDCD4sN4/qddFFVUM/PshFOYaiGEPRwWFJRSnsDLwAVAH2CaUqrPUcf0AP4fcLbWui9wl6PSI1pHtcXKowu38PveXKa/tbLB2AOtNT9uzeT8//7Kk4u3M653NDeMbDj+YEpyPBarZmDHUAbHh53q5AshmuHIksIwYLfWeq/WuhKYB0w66pibgJe11nkAWusjDkyPaAUL1h8iNaeUW8d0Iz2/jGvfWkVBWRXLdmVzxasruHHOGjw9FO/OHMpbM4YS4ufd4PyJA9vTOzaYv4zrcdp3QxWiLVJaa8dcWKnJwASt9Y2299cAw7XWt9c75gtgJ3A24Ak8orX+tpFrzQJmAcTExAyZN2/eCaWpuLiYoCD3m8++tfJdbdXcv7SMYG/FQ2f6sSnbwvPrKvD1hNJqiPBTTOzqzaiOXnh5OPeGL79r9+GOeYaW53vs2LFrtdbJzR3n7FlSvYAewBigI/CrUqq/1jq//kFa69eB1wGSk5P1mDFjTujDUlJSONFz27LWyvcHK/eTXbaZp6YmM7ZXNGOBnomHefHn3Uwd1okpyR3x9XKN7qXyu3Yf7phncFy+HRkUDgHx9d53tG2rLw1YqbWuAvYppXZigsRqB6ZL2OlwQRlr9+fRPTqIDmH+vPTzbpI6hTGmZ1TtMRP6tW8T6yQLIezjyKCwGuihlOqCCQZTgauOOuYLYBrwjlKqHdAT2OvANAk7aa25/cP1rN2fB4BSZpGYp68YKG0BQpzGHBYUtNbVSqnbge8w7QVva623KKX+AazRWi+07TtPKbUVsAD3aq1zHJUmYb+UHVms3Z/H3eN70iUqkJ0ZRfj7eHJWN1lOUojTWbNBQSl1BzC3podQS2itFwGLjtr2UL3vNTDb9hIuwmrVPP39DjpFBHDb2G54e3rAQGenSghxKtjTJTUGWK2Umm8bjCZ1B6e5b7dksCW9kLvG9zABQQjhNpr9j9daP4hp/H0LmAHsUkr9SynVzcFpE05gsWqe/WEnPaKDmDTo9FknWQhhH7seA23VPBm2VzUQDnyqlPqPA9MmnOCzdWnsPlLM7HN74unksQZCiFPPnjaFO4FrgWzgTUxjcJVSygPYBdzn2CSKU+XLDYd48PPNDIoPY0K/WGcnRwjhBPb0PooALtNa76+/UWttVUpNdEyyhCO891sqOzKL8Pf2JMDHk/jwAJI6h9G1XRD/S9nN09/vZHiXCF6/Jlm6nQrhpuwJCouB3Jo3SqkQIFFrvVJrvc1hKROtasPBfB5euIVgXy8sWlNWZaFmhpMAH09KKy1cOrgDT17e32VGJQshTj17gsIrQFK998WNbBMuTGvNE4u2ERnowy/3jSXI1wurVbM3u4R1B/JYfyCf7tFBXO8GayULIY7PnqCgdL1Z82zVRs6eM0m0wB9ZFlbuy+Wfk/oS5Gt+dR4eiu7RQXSPDmJKcnwzVxBCuAt7eh/tVUr9RSnlbXvdiUxF0WZUW6zM31lJl3aBTB3WydnJEeL0VFUGpbnNH9cG2PPEfwvwAvAgoIGfsE1jLVzfZ+vSSC/WvDKplwxEEydPazMRlr2sFvBow21Upbmw+yeI7Aodhhy7vzAdVr0Oa96B8nyI7A7xwyGmHwS2A/8IiOoJYXY8kFUUw8GVkL4OKkuguhK0BULjoV1PaNfDXMfBP89mg4Jt4ZupDk2FaFW5JZX8svMIS7Zn8eO2TLq
FekgX09aUuw+2fA7hCdDnEvBwg2BrqYLvHzQ3v7B4iE6E+DNg+C3gedRtpPgIbP8Gti2Efb9CRFfocR70OBc6jzj2+KYCTUkOHPwdcvZA7h4oyTbpsFSChxcEREJgOxLSs6HyJ/O0HtgORswGL5+Ty+/O783Nfu8SsFabbd3GwZj7ISgG9vxkgsXOb0FbofdEaD8QDq012zZ8UHctD2+Y+F9Iuqbxz8rZA1/cBofW1H2Whzd4+YLygIrCumPPfwLOvO3k8tYMe8Yp+AE3AH2B2lXWtdbXOzBdooUKSqv4bksGX21MZ/nubKwa2gX5MKFfLGcE5baNBuTKEvDyc80nS6sVtiwwN8X9y+q2x70A5z0GCSPsv1bhYdj0CVQWmzyDCTCR3cwTYUiHuptkdaX53O1fw9l3Q8dGnlbtZamCwxshPxXK8qEsD+IGQffxDY8rzYXDGyAuCfzDoCgTPrkODqyAvpeBtQoyt8C2r8AvtOHNLmMzvDkeqstMMEi+AXJ2mRvsipegXS849x/Q83yT95WvwPIXIbwTXPIKxPY319m6EBbeYZ6+wQSAoFjw9DYvSxVkbYeSbBKqy+CQH3j7mzzlpZprNfc3b6mCpc/AkJkQHFO3vSwfPpoKwe3hzD+bG/6BFbD8BXjr3LrjQjrCsFnmFVFv2VmtTTpKc6E0G1KehIW3w5GtcO4/GwZFSzUsuMkEhrP+Yv6O4oeDb73Fc0pzIXunecUPb+63fNLsqT56H9gOnA/8A7gakK6oTlZYXsUTi7azPaOQg7lltWsld44M4NYx3Ti/byz94kLx8FCkpKQ4N7H2yNgE710EAe3MTbbn+c3/U//2Iuz/DUbd03jRvqoMfn7M3Lx6nAeDr4b2g469buFhc1x1GXgHgG8wdDrD3Cx9As2NdNE9pmgf0RXO+T8YcCWkLoOf/wnv/gnOvtPc7OoryzdpCDlqvYnfXoTfXzbfe/mbJ01L3VrXhHSAzmdBSBz8MQ+KM8HTxzy9TvyvyUd9JdmQuhT2r6DXgd1QuMAEVmUrwWgrZO+CtDUmj/V5B8JdmyCw3uy3X9wGOxcDylSDlByB8kK4/C3oP9l2TQ2vjzE31YHT6m50P/3DPKXf8L25wdf8rCtLYMdiWPIv+OhKU8rI3Wuu3f1cOPyHud6o+6AwDdbNMb+rCU+YUol/+LG/X5uUJT8zZuw55s0v/4Elj5sql3P+3uQ5gPm7SHnC/KxG3Vu3/eBKU21z6avQZaTZFj8Mht4IGz40VWLdzjHVOY39jSoFARHmRXe4+lNTyvr9f5C1Aya/bYItmL+DQ2sb/myPFhBh/h47nXH8/LQSe4JCd631FUqpSVrr95RSHwJLHZ0wcXyPf72NT9Ye5MxukYzrHU18hD+jekbRv0No2ygV1HdkO8y5xNwgwdw0uoyGjsnmiTRzq/mnnPQyeNsKqzsWm380Dy/YsQgSL4Kz7jRP2/7hBBfugtfug+wd0Plsc5NZ/QbE9IcL/2NuugB5+2HOxeZpOCTO3MTL880/sJe/ScP+5aZu+JJXYMDUuuqiQdOg7yXw1V3mRj9gKsT0Mfss1TBnknkave23hvnN2maqGm5KMdfSGooOm6fFI9vMU+m+X00w6DYOJv0P4gbDpzPhS1s1Q3gXyNxsbqZZ2811vQMJ9/CD4i22aoh6S+2GdoQh15kbS1Rvk5+SI/DaKFj+HJz3T3PcgZUmICTfAMGxJu8+gfCnZyC2X931lDI30o+vhs2fwcAr4cDvsOs7GPcwtB/QMM8+geam12cSrH0Xlj8PUb1g6gfmd1uaC4vuhZR/AQpG3A1jHrCvGkjVq74bdS/k74df/2OCu28QpC6HvH1wxbsN6/bXv2++7vu1YVDYv9xU33Q8auVKn0AYdlPz6Tmapxdc8KQJbt/MNqWNqz42fyM/P25KIv0ub/l1HcSeoFBl+5qvlOqHmf8o2nFJEs35ZWcWH685yK1juvG3Cb2dnZyTk7PH3DyVB1z3FYR3hjVvmye4/ctNdUpMX9j8qblZX/kBFB6CBTebG+v0BbD6LXNT3vaVuaaXP0nVFeamNn0BdB9nnto3f2ZuRu9caOrCB10FH02DyiKY8U1d1YylGg78Zqow9qZA8vVwzoONP616+5un2Z22IHXNArN95aumCkZ5mEDj7V93TtYOSBhZF1yUMgEpJM48mQ6fZQJFZbG5sdWYvgB+fNhUw4ApUcT0gwFTIGEUxA3i96XL7V+iMTgG+l8Bq96AM2+HoGjzpB8YbYKET+Dxz+91IUT3haVPmxt+zbnDb276HE9vc2M9+uYaEAGT3zKlDt9g6HSC1SRKwcTnTOnvh/8z24JioSwXfvm3ebAA8zCwZwn4hphAWP93tH+FKXnW/521hiHXmZLm/GvgjXMgOA58AuBPz7as8d7B7AkKryulwjG9jxYCQcD/OTRVokmF5VXc/9lGukcHcee4Hs5Ojn3K8k2d8pAZ5sZTo6oM3r/ENBzO+AbadTfbh99s6nnRprENYO178NWdMG+aear38IAp75uGxTF/M0X71F/NzaDwEAcOZdB52jN1xXT/MBh6g6n2+elRU5e98hVTXTXjm7q6bDBPdl1GmZc9AiJg9N/guwdg14+mWmHJ4xAYBSVZ5kk+brA5trzQBLWoXse/plINA0JNus5/3OTDL8xWPXGSRv8NNn1qSgvdx5v2kgueaj4ggPkdjLrHlGC+vssE8Quftu/cpvQY3/wxzfH0hivfNw2+7QeZG/G3/8/8DY68x9T/1zQEj38YvvkrHFwFXUdDZanp/XPWX04+HY3pMhJu/Mm0WRzZApe92bA9wwUcNyjYJr0rtC2w8yvQ9ZSkSjTpiUXbyCws57Nbz8LP20kNsrl7zRP04GuP7UlytMJ0mDvZ/APk7oNLX6nbt24O5B+Aa7+sq3apcXS1wZDrzFP3wjvM++mfmlJFjcBI6Htp7dt9KSl0rgkI9fkGwYVPQeLFsPpNGPtA8zdoewy9yTxxf/8ghHYAFFz+pikFZW6tCwrZO83XqJMo4UW04r9hZDfzdL76LdjzM4R2Mj9re/WZZEpz6+aYqpmkFpzrSD6BDatkRtwFa9+BX5+Gi1+A9XNNCbL/FFh0n6lC6joa0labqrea6kVHiOwGN/4I6etNNamLOe5/tG308n3A/FOUHtGIjWn5LNmexS87j7DuQD43j+rK4E5NN7w5VPp6c5MvzTY3kcvfqnuaP9qR7TD3cigvME+hG+eZuuKonlBdAcueg05nQdcx9n120jWmZFBVdmyPmZbqMrKuEbE1ePmYhub515g2gwlPmioiLz/T66RGTf1/awSi1jL6XvO7ydpu2i+a+n02pqaRdsFN9rcBOENwrCl9rnrdVDsWHjLVfn4h0CHJBAUwHReUh2nncCS/UPv/7k8xezpY/6iUukcpFa+Uiqh5OTxlAoCXl+zm4peW89xPO7Fo+Ou5Pbn73J6n5sML02H7IvMVYO8v8O5E00Nn5D2mDv+DK6CiqOF5+QdMA9rb55nuizMXwSWvmobblCfMMRs+gKJ0c0NqiV4XQL/LTj
5vjpB4kQlWnUeYbooenubmn7ml7pis7eDpa7qguorwBDjjVug4FAaewJCk/lfArF9O7NxTacRdpmrp27+ZasOeF5jtXUaZHkAVRaYKLLa/uWm7KXvaFK60ff1zvW0aqUpyuJ+3Z/L09zuYOKA9/5jUj4jAU/gUVlFsqj5qqjuC40zpILK7afAMaW++//LP8MY4U/2jPM0xe38x5/Q419Qx11TznHGL6cJ49l9g2X+hQzJ0HXvq8uRoSsFV8wFV14gc3dcMdKqRtcNUt7jaWIzzHjvxc5Uy4x1cXXCs6TTw+/9Mz7GaUk2XUebvcu8vpttu8kznptPJ7BnR3KW5Y0Tr232kmDs/2kDfuBCevmLgqW0/0No0vmXvgotegKpS88/i6W2K3DW9cAZNMw24S/5lBi1pi+lPP+peU9Vz9ND+s+6AVW/CB1NMd8gLn3apXhet4uibfUwf+ONDMzo3MNJUqTm6akI0bcRsU/IdVq+HVPxw83e7/HkzjsOR7QltgD0jmq9tbLvWek7rJ0eA6WE06/01+Hh58No1yae+QXnDB6aOefT9zTc69rrAvOzhH24Cw5LHTL1uj/NOPq2uLtrWgH5kixkhXHAAkhr9lxKnQlAUTHmv4TZvfxMYUm3DrzqdeerT5ULsqT4aWu97P2AcsA6QoOAARwrLmfHOag7klPLBjcPpENaKfaXXzbE98fuYp/6OydDn0oZz92Rsgm/uMY2kox2w0uoZt5hujyP/evqVEhoT09d8zdwKPrapC1ypkVkYXUaboBDV23RmcGP2VB/dUf+9UioMmOewFLmx3UeKuO7t1eSVVvLGdckM7xrZ/En22vWD6c7pHw4oqC43davRT5tumf4RsOo12Pa1mWfm8rccU+/tG2y6oLqLoBjzsz2yta7x8mS6owrH6DIKluD2pQSwr6RwtBJA2hla2foDeVz39ip8vDyZf/OZ9OvQer0fPKtL4at7zWRktyw1XQ6tVtj6uWkP+Hi6OdAvzEwANvwWlxtQ02YpZUoLR7aagOzh3XDyNOEaOiSZXlSDm5jJ1I3Y06bwFXWTqHgAfZBxC61qf04JN7y3hvBAH+beMJz4iIBWvX7XvXNMv+wbvq/rg+7hYQb3JE6CrV+YUcV9LjHD7kXrik40E6n5R5jRzp7ezk6ROJqntxlsKOwqKTxd7/tqYL/WOs1B6XE7BaVVzHx3NVateXfmMPsDQuYWWDDLDLQJ62T6mne2DQSrP81A6jI6pC+GM25rvNeLp1fTszOK1hHdx8xjlLoMerpB47po0+wJCgeAw1rrcgCllL9SKkFrnerQlLmBymort8xdy8HcUubeMJwu7eycM6YsH+ZdbaYjjhsEObvNgh8rXjIDozqfZUoExUcgZzdlfrH4n/OgYzMjmlbT2FxVIu0JwuXZExQ+Aep33LXYtg1t/HBhD4tVc++nf7Bibw7PThnYdKNyRRFs/dLMSBkQYdoCPr8ZCtLMRG41s0laqsyUyzu/M7M/eniYGSt7T2Sz93CGnswkZeLkRCfWfS89j4SLsycoeGmtK2veaK0rlVIuOsFJ22C1au77dCNfbkjnvgm9uCypY9MHf323WaXLO9DMMurhYWZ/vPDphtMLe3o3ObNnSVtYZOd05htsqvjyD0hJQbg8e4JCllLqYq31QgCl1CQg27HJOn1ZrZr/t2ATn61L4+7xPbltTPemD9660ASEoTeaEsPKV82o4QFTzTbRdkT3hYJDENHN2SkR4rjsCQq3AB8opWwre5AGyJDME/TE4m18vOYgd5zTnTvHH2c9hJJsU0poP8jMuOnpbcYT7PoBBk93j4Ffp5PkmaZtwVVnERXCxp7Ba3uAM5RSQbb3xQ5P1Wnquy0ZvLF0H9ec0ZnZx5vpVGuzaElFoVkntqYLY3jCiS0HKJyv5/nmJYSLa3bqbKXUv5RSYVrrYq11sVIqXCll15SKSqkJSqkdSqndSqn7j3Pc5UoprZRKbuqYti4tr5R7P/mD/h1CeXBiYtPrKOfuhS9vN9NSj32gYSOlEEI4mD3rKVygtc6veWNbhe3C5k5SSnkCLwMXYAa8TVNK9WnkuGDgTmClvYlua6osVu74aD1WDS9dNRhfr0amj8jbD5/dCC8OMe0Iw2913JKAQgjRBHvaFDyVUr5a6wow4xQAe5ZmGgbs1lrvtZ03D5gEbD3quH8C/wZauNpK2/H09ztYfyCfl64aTOfIRrqGbv8GvrjVLBh/5p/NIurBsac+oUIIt2dPUPgA+Ekp9Q6ggBnAe8c9w+gAHKz3Pg0YXv8ApVQSEK+1/kYp1WRQUErNAmYBxMTEkHKCXSyLi4tP+NwT9UdWNa+trWBMvBdBuTtJSdlZu09Zq+m6933i076gKKgbWwbeR7lPLKzdDmxvtTQ4I9/O5o55BvfMtzvmGRyXb3samv+tlPoDGI+ZA+k7oPPxz2qeUsoDeBYTZJpLw+vA6wDJycl6zJgxJ/SZKSkpnOi5J+JwQRl3P7+U3rHBvDrr7IbrIhRlwifXQdoKGHoTwec/zhktWRu3BU51vl2BO+YZ3DPf7phncFy+7Z0lNRMTEK4A9gGf2XHOISC+3vuOtm01goF+QIqt0TUWWGgbE7HGznS5rGqLlTs/2kBFtZWXrkpqGBAOrjYLvJflmymqZe4hIYSLaDIoKKV6AtNsr2zgY0Bpre1dVHc10EMp1QUTDKYCV9Xs1FoXALWrWSilUoB7ToeAAPDsDztZlZrLs1MG0j06qG7Hpk9N+0Fwe7jxB7NIuBBCuIjjlRS2A0uBiVrr3QBKqbvtvbDWulopdTumuskTeFtrvUUp9Q9gTc0I6dPR28v28b+UPUwdGt9wCouDq+DzW6DjUJj6gZnLSAghXMjxgsJlmKf7JUqpbzGrrbVoGK3WehGw6KhtDzVx7JiWXNtVzV99kH98vZXz+8bw2CX96nYUppvFbEI7SEAQQrisJscpaK2/0FpPBXpjFqq7C4hWSr2ilJJJ4Rvx1R/p/G3BRkb1jOKFaYPx8rT9eKvKTUCoKIapH0lAEEK4rGYHr2mtS7TWH2qtL8I0Fq8H/ubwlLUx324+zF0fb2BoQgSvTR9SN0DNUmXaEA6tNVNWxBwzfk8IIVyGPSOaa2mt87TWr2utxzkqQW3Rj1szuf3D9QzsGMrbM4bi72MLCBXF8OGVsGUBnPsP6HOxcxMqhBDNsLdLqmjCkh1HuO2DdfSNC+Hd64cR5Gv7kZbkwIdXQPp6uPhFSJKJZYUQrk+CwkkoKK3ijg/X0yMmiDnXDyfEzzabaVEmvHcR5O+HKz+A3s1OFSWEEC5BgsJJeG9FKsUV1TwzZSChATUBIQPenWh6G03/DBJGODWNQgjREhIUTlBpZTXvLN/HuN7R9I4NMRsL000JoSjDBITOZzo3kUII0UISFE7QR6sOkldaxW1jbctpZm6Bj6ZBaS5MX9Bw/WQhhGgjWtT7SBiV1Vbe+HUvw7tEMKRzuFlL+c1zoboCrvtSAoIQos2SoHACPl+fRkZhuSkl/PqUmdwuOhFmpUCHIc5OnhBCnDCpPmqh4opqX
knZQ9+4EEb57YGfH4N+k2HSy+Dt5+zkCSHESZGg0ALlVRZufG81B/PKeHfGENS3V5rZTi96XgKCEOK0IEHBTpXVVm6du5aV+3L575RBjCz9yQxMu/Q18A1q/gJCCNEGSJuCHbTW3D1/A0t2ZPH4Jf25pE8I/PioaT/oP8XZyRNCiFYjQcEOX208zDcbD/O3Cb25angnWPosFGfAhH+Dh/wIhRCnD7mjNaO8ysK/F2+nT/sQZo3qCuvmwG8vwIArIX6os5MnhBCtStoUmjFnRSqH8sv496V98fzhQVjxEnQ7By58ytlJE0KIVidB4TjySip58efdnNMzghHr7oIdi2DYzXD+v8BTfnRCiNOP3NmO44Wfd1FSUc3jXbdAyiI473E463ZnJ0sIIRxG2hSacDC3lLm/7+fqIdG0X/+c6Wl05p+dnSwhhHAoCQpN+F/KHhSKeyN/g4KDMO5hUMrZyRJCCIeSoNCI9PwyPl17kOlJEYSsfh66joGuo52dLCGEcDgJCo14/de9aA13BnwPpTkw7iFnJ0kIIU4JCQpHOVJUzkerDnB9Py9C178KiRfLzKdCCLchvY+O8ubSffhYSrgn5xlQHjD+EWcnSQghThkJCvXklVTy4e/7mBf5Fj652+HqTyCym7OTJYQQp4wEhXoWbT7MHda59Cv+DS58GrqPd3aShBDilJI2hXp2rV3CzV7foIfeBMNucnZyhBDilJOgYFNQVkXs4Z+wKE/UuP9zdnKEEMIpJCjYLNl+hNFqA6Wxw8Av1NnJEUIIp5CgYLNyw0YSPQ4Q2PdCZydFCCGcRoICZs0E730/AeDR8zwnp0YIIZxHggLw684sRuh1lAd2gKhezk6OEEI4jUODglJqglJqh1Jqt1Lq/kb2z1ZKbVVKbVRK/aSU6uzI9DTlh00HGOG5Ge/e58ukd0IIt+awoKCU8gReBi4A+gDTlFJ9jjpsPZCstR4AfAr8x1HpaUqVxUrh9l8IoALPnuef6o8XQgiX4siSwjBgt9Z6r9a6EpgHTKp/gNZ6ida61Pb2d6CjA9PTqN/35jCsei0WDx/oMvJUf7wQQrgURwaFDsDBeu/TbNuacgOw2IHpadSCdYc4x2sDJIwEn8BT/fFCCOFSXGKaC6XUdCAZaHTRAqXULGAWQExMDCkpKSf0OcXFxQ3OLa3SbPpjL118DrPLYyKHTvC6ru7ofLsDd8wzuGe+3THP4Lh8OzIoHALi673vaNvWgFJqPPB3YLTWuqKxC2mtXwdeB0hOTtZjxow5oQSlpKRQ/9x5qw4wxeMNtPKgx5/uoEe4U9q5He7ofLsDd8wzuGe+3THP4Lh8O7L6aDXQQynVRSnlA0wFFtY/QCk1GHgNuFhrfcSBaWnU96s3c43Xj9B/MpymAUEIIVrCYUFBa10N3A58B2wD5muttyil/qGUuth22FNAEPCJUmqDUmphE5drdXuzihl2+EP8qESNuu9UfawQQk9o6ekAAAmwSURBVLg0h7YpaK0XAYuO2vZQve+dNjf1opWbmen5PeW9LsG/XQ9nJUMIIVyKSzQ0n2oWqyZ4/Wv4q0o8xv8/ZydHCCFchltOc7Fm224ur17E4Y4TZFoLIYSoxy1LChWbFxKkytHj7nF2UoQQwqW4ZUlBFaYDEBTf38kpEUII1+KWQcGz5Ah5hKC8fJ2dFCGEcCluGRR8y7Mo8IxwdjKEEMLluGVQCKzKpsSnnbOTIYQQLsctg0KYJZcKvyhnJ0MIIVyO2wWFyqpqInU+lsAYZydFCCFcjtsFhZysw3grCx4hsc5OihBCuBy3CwoFWWkA+IS1d3JKhBDC9bhdUCjJNkHBP+J46/0IIYR7crugUJlvBq6FRsU3c6QQQrgftwsKloIMAMJiJCgIIcTR3C4oeJRkUkQA3n6yHrMQQhzN7YKCT9kR8jxkNLMQQjTG7YJCYGU2xd6Rzk6GEEK4JLcLCiHVuZT7ymhmIYRojFsFBW21EqlzqQ6MdnZShBDCJblVUCgvL8FPVUGQjGYWQojGuFVQqCrOBcBLRjMLIUSj3CooWErzAPAPj3NySoQQwjW5VVBQZTkABEd1dHJKhBDCNblVUPCqMCWF8OhOTk6JEEK4JrcKCj6VeZRpHwJDwp2dFCGEcEluFRQCqvLI9YgApZydFCGEcEluFRSCLXkUesloZiGEaIpbBYVQaz5lvu2cnQwhhHBZbhUUInUelf4ymlkIIZriNkGhrLiQYFWGDopxdlKEEMJluU1QyMs8CIBniIxmFkKIprhNUCjMNkHBL0JGMwshRFPcJiiU5Zq1mQMjOzg5JUII4brcJihU5R8GICxa1mYWQoimODQoKKUmKKV2KKV2K6Xub2S/r1LqY9v+lUqpBEelxSOmN997jyMsUqbNFkKIpjgsKCilPIGXgQuAPvD/27vfGDuqOozj38ctjQWSFoppsAVbQ6PBP9Bmo1WMaYovVIg1QS0EI2kwjURtNf4BfaHR6AuMEawakgpoVQIxFbExpErarpqoSLFYKDWxqRVKWtoqra7/oPj4Yk5vr8tud7vd2evOPJ9kszPnzr33/PLb3N+dM7PncI2ki4ccdj3wjO2LgFuAm+vqT/+yq5h+2Wr6+lpzchQRccrq/IR8HbDb9h7bzwL3AMuHHLMcWF+2NwCXS5mDIiKiV6bV+NpzgSe79vcBrx/pGNvHJB0FZgOHuw+StApYBTBnzhwGBgbG1aHBwcFxP3cqa2PcbYwZ2hl3G2OG+uKusyhMGNvrgHUA/f39Xrp06bheZ2BggPE+dyprY9xtjBnaGXcbY4b64q5z+OgpoPtWn3mlbdhjJE0DZgJ/rrFPERFxEnUWhYeAhZIWSJoOXA1sHHLMRuC6sv0uYItt19iniIg4idqGj8o1gg8BPwH6gDtt75T0eWCb7Y3AHcB3Je0G/kJVOCIiokdqvaZg+37g/iFtn+na/hfw7jr7EBERY5eb9iMiokNTbQhf0iHgT+N8+nkMud21JdoYdxtjhnbG3caY4dTjfpntl4x20JQrCqdD0jbb/b3ux2RrY9xtjBnaGXcbY4b64s7wUUREdKQoRERER9uKwrped6BH2hh3G2OGdsbdxpihprhbdU0hIiJOrm1nChERcRIpChER0dGaojDaKnBNIOkCSVslPS5pp6Q1pf1cSQ9I+kP5fU6v+zrRJPVJ2i7px2V/QVnNb3dZ3W96r/s40STNkrRB0u8l7ZL0hpbk+qPl7/sxSXdLenHT8i3pTkkHJT3W1TZsblVZW2LfIWnx6bx3K4rCGFeBa4JjwMdsXwwsAT5Y4rwJ2Gx7IbC57DfNGmBX1/7NwC1lVb9nqFb5a5qvAptsvxK4hCr+Ruda0lxgNdBv+9VU86pdTfPy/W3grUPaRsrt24CF5WcVcNvpvHErigJjWwVuyrO93/Zvy/bfqD4k5vK/K9ytB97Zmx7WQ9I84Arg9rIvYBnVan7QzJhnAm+mmlQS28/aPkLDc11MA2aU6fbPBPbTsHzb/jnVJKHdRsrtcuA7rvwamCXp/PG+d1uKwnCrwM3tUV8mhaT5wCLgQWCO
7f3loQPAnB51qy63Ap8E/lP2ZwNHbB8r+03M9wLgEPCtMmx2u6SzaHiubT8FfBl4gqoYHAUepvn5hpFzO6Gfb20pCq0i6WzgB8BHbP+1+7GyXkVj7kOWdCVw0PbDve7LJJsGLAZus70I+DtDhoqalmuAMo6+nKoovhQ4ixcOszRenbltS1EYyypwjSDpDKqCcJfte0vz08dPJ8vvg73qXw0uA94haS/VsOAyqrH2WWV4AZqZ733APtsPlv0NVEWiybkGeAvwR9uHbD8H3Ev1N9D0fMPIuZ3Qz7e2FIWxrAI35ZWx9DuAXba/0vVQ9wp31wE/muy+1cX2p2zPsz2fKq9bbF8LbKVazQ8aFjOA7QPAk5JeUZouBx6nwbkungCWSDqz/L0fj7vR+S5Gyu1G4H3lLqQlwNGuYaZT1pr/aJb0dqqx5+OrwH2xx12acJLeBPwCeJQT4+ufprqu8H3gQqppx99je+hFrClP0lLg47avlPRyqjOHc4HtwHtt/7uX/Ztoki6lurg+HdgDrKT6otfoXEv6HLCC6m677cD7qcbQG5NvSXcDS6mmx34a+CxwH8PkthTHr1MNo/0DWGl727jfuy1FISIiRteW4aOIiBiDFIWIiOhIUYiIiI4UhYiI6EhRiIiIjhSFiELS85Ie6fqZsMnkJM3vnvEy4v/VtNEPiWiNf9q+tNediOilnClEjELSXklfkvSopN9Iuqi0z5e0pcxhv1nShaV9jqQfSvpd+Xljeak+Sd8sawH8VNKMcvxqVWtg7JB0T4/CjABSFCK6zRgyfLSi67Gjtl9D9Z+jt5a2rwHrbb8WuAtYW9rXAj+zfQnVfEQ7S/tC4Bu2XwUcAa4q7TcBi8rrfKCu4CLGIv/RHFFIGrR99jDte4FltveUCQcP2J4t6TBwvu3nSvt+2+dJOgTM655moUxl/kBZIAVJNwJn2P6CpE3AINU0BvfZHqw51IgR5UwhYmw8wvap6J6L53lOXNO7gmplwMXAQ12zfUZMuhSFiLFZ0fX7V2X7l1QzswJcSzUZIVRLJd4AnbWjZ470opJeBFxgeytwIzATeMHZSsRkyTeSiBNmSHqka3+T7eO3pZ4jaQfVt/1rStuHqVY++wTVKmgrS/saYJ2k66nOCG6gWiVsOH3A90rhELC2LKsZ0RO5phAxinJNod/24V73JaJuGT6KiIiOnClERERHzhQiIqIjRSEiIjpSFCIioiNFISIiOlIUIiKi479JLMWHGrgIiAAAAABJRU5ErkJggg==\n", 474 | "text/plain": [ 475 | "
" 476 | ] 477 | }, 478 | "metadata": { 479 | "needs_background": "light" 480 | }, 481 | "output_type": "display_data" 482 | } 483 | ], 484 | "source": [ 485 | "epochs = list(range(max_epoch))\n", 486 | "train_metric = exp_results[\"train_met\"]\n", 487 | "valid_metric = exp_results[\"valid_met\"]\n", 488 | "lines = plt.plot(epochs, train_metric, epochs, valid_metric)\n", 489 | "\n", 490 | "plt.legend(('Train', 'Validation'), loc='upper left')\n", 491 | "plt.title('Accuracy chart')\n", 492 | "plt.xlabel('Epochs')\n", 493 | "plt.ylabel('Accuracy')\n", 494 | "plt.grid(True)\n", 495 | "plt.show()" 496 | ] 497 | } 498 | ], 499 | "metadata": { 500 | "kernelspec": { 501 | "display_name": "Python 3", 502 | "language": "python", 503 | "name": "python3" 504 | }, 505 | "language_info": { 506 | "codemirror_mode": { 507 | "name": "ipython", 508 | "version": 3 509 | }, 510 | "file_extension": ".py", 511 | "mimetype": "text/x-python", 512 | "name": "python", 513 | "nbconvert_exporter": "python", 514 | "pygments_lexer": "ipython3", 515 | "version": "3.6.7" 516 | } 517 | }, 518 | "nbformat": 4, 519 | "nbformat_minor": 4 520 | } 521 | -------------------------------------------------------------------------------- /efficientnet.py: -------------------------------------------------------------------------------- 1 | from math import ceil 2 | 3 | import torch 4 | import torch.nn as nn 5 | import torch.nn.functional as F 6 | import collections.abc as container_abcs 7 | from torch.utils import model_zoo 8 | 9 | 10 | def _pair(x): 11 | if isinstance(x, container_abcs.Iterable): 12 | return x 13 | return (x, x) 14 | 15 | 16 | class SamePaddingConv2d(nn.Module): 17 | def __init__(self, 18 | in_spatial_shape, 19 | in_channels, 20 | out_channels, 21 | kernel_size, 22 | stride, 23 | dilation=1, 24 | enforce_in_spatial_shape=False, 25 | **kwargs): 26 | super(SamePaddingConv2d, self).__init__() 27 | 28 | self._in_spatial_shape = _pair(in_spatial_shape) 29 | # e.g. 
throw exception if input spatial shape does not match in_spatial_shape 30 | # when calling self.forward() 31 | self.enforce_in_spatial_shape = enforce_in_spatial_shape 32 | kernel_size = _pair(kernel_size) 33 | stride = _pair(stride) 34 | dilation = _pair(dilation) 35 | 36 | in_height, in_width = self._in_spatial_shape 37 | filter_height, filter_width = kernel_size 38 | stride_heigth, stride_width = stride 39 | dilation_height, dilation_width = dilation 40 | 41 | out_height = int(ceil(float(in_height) / float(stride_heigth))) 42 | out_width = int(ceil(float(in_width) / float(stride_width))) 43 | 44 | pad_along_height = max((out_height - 1) * stride_heigth + 45 | filter_height + (filter_height - 1) * (dilation_height - 1) - in_height, 0) 46 | pad_along_width = max((out_width - 1) * stride_width + 47 | filter_width + (filter_width - 1) * (dilation_width - 1) - in_width, 0) 48 | 49 | pad_top = pad_along_height // 2 50 | pad_bottom = pad_along_height - pad_top 51 | pad_left = pad_along_width // 2 52 | pad_right = pad_along_width - pad_left 53 | 54 | paddings = (pad_left, pad_right, pad_top, pad_bottom) 55 | if any(p > 0 for p in paddings): 56 | self.zero_pad = nn.ZeroPad2d(paddings) 57 | else: 58 | self.zero_pad = None 59 | self.conv = nn.Conv2d(in_channels=in_channels, 60 | out_channels=out_channels, 61 | kernel_size=kernel_size, 62 | stride=stride, 63 | dilation=dilation, 64 | **kwargs) 65 | 66 | self._out_spatial_shape = (out_height, out_width) 67 | 68 | @property 69 | def in_spatial_shape(self): 70 | return self._in_spatial_shape 71 | 72 | @property 73 | def out_spatial_shape(self): 74 | return self._out_spatial_shape 75 | 76 | @property 77 | def in_channels(self): 78 | return self.conv.in_channels 79 | 80 | @property 81 | def out_channels(self): 82 | return self.conv.out_channels 83 | 84 | def check_spatial_shape(self, x): 85 | if x.size(2) != self.in_spatial_shape[0] or \ 86 | x.size(3) != self.in_spatial_shape[1]: 87 | raise ValueError( 88 | "Expected input spatial shape {}, got {} instead".format(self.in_spatial_shape, 89 | x.shape[2:])) 90 | 91 | def forward(self, x): 92 | if self.enforce_in_spatial_shape: 93 | self.check_spatial_shape(x) 94 | if self.zero_pad is not None: 95 | x = self.zero_pad(x) 96 | x = self.conv(x) 97 | return x 98 | 99 | 100 | class ConvBNAct(nn.Module): 101 | def __init__(self, 102 | out_channels, 103 | activation=None, 104 | bn_epsilon=None, 105 | bn_momentum=None, 106 | same_padding=False, 107 | **kwargs): 108 | super(ConvBNAct, self).__init__() 109 | 110 | _conv_cls = SamePaddingConv2d if same_padding else nn.Conv2d 111 | self.conv = _conv_cls(out_channels=out_channels, **kwargs) 112 | 113 | bn_kwargs = {} 114 | if bn_epsilon is not None: 115 | bn_kwargs["eps"] = bn_epsilon 116 | if bn_momentum is not None: 117 | bn_kwargs["momentum"] = bn_momentum 118 | 119 | self.bn = nn.BatchNorm2d(out_channels, **bn_kwargs) 120 | self.activation = activation 121 | 122 | @property 123 | def in_spatial_shape(self): 124 | if isinstance(self.conv, SamePaddingConv2d): 125 | return self.conv.in_spatial_shape 126 | else: 127 | return None 128 | 129 | @property 130 | def out_spatial_shape(self): 131 | if isinstance(self.conv, SamePaddingConv2d): 132 | return self.conv.out_spatial_shape 133 | else: 134 | return None 135 | 136 | @property 137 | def in_channels(self): 138 | return self.conv.in_channels 139 | 140 | @property 141 | def out_channels(self): 142 | return self.conv.out_channels 143 | 144 | def forward(self, x): 145 | x = self.conv(x) 146 | x = self.bn(x) 147 | if 
self.activation is not None: 148 | x = self.activation(x) 149 | return x 150 | 151 | 152 | class Swish(nn.Module): 153 | def __init__(self, 154 | beta=1.0, 155 | beta_learnable=False): 156 | super(Swish, self).__init__() 157 | 158 | if beta == 1.0 and not beta_learnable: 159 | self._op = self.simple_swish 160 | else: 161 | self.beta = nn.Parameter(torch.full([1], beta), 162 | requires_grad=beta_learnable) 163 | self._op = self.advanced_swish 164 | 165 | def simple_swish(self, x): 166 | return x * torch.sigmoid(x) 167 | 168 | def advanced_swish(self, x): 169 | return x * torch.sigmoid(self.beta * x) 170 | 171 | def forward(self, x): 172 | return self._op(x) 173 | 174 | 175 | class DropConnect(nn.Module): 176 | def __init__(self, rate=0.5): 177 | super(DropConnect, self).__init__() 178 | self.keep_prob = None 179 | self.set_rate(rate) 180 | 181 | def set_rate(self, rate): 182 | if not 0 <= rate < 1: 183 | raise ValueError("rate must be 0<=rate<1, got {} instead".format(rate)) 184 | self.keep_prob = 1 - rate 185 | 186 | def forward(self, x): 187 | if self.training: 188 | random_tensor = self.keep_prob + torch.rand([x.size(0), 1, 1, 1], 189 | dtype=x.dtype, 190 | device=x.device) 191 | binary_tensor = torch.floor(random_tensor) 192 | return torch.mul(torch.div(x, self.keep_prob), binary_tensor) 193 | else: 194 | return x 195 | 196 | 197 | class SqueezeExcitate(nn.Module): 198 | def __init__(self, 199 | in_channels, 200 | se_size, 201 | activation=None): 202 | super(SqueezeExcitate, self).__init__() 203 | self.dim_reduce = nn.Conv2d(in_channels=in_channels, 204 | out_channels=se_size, 205 | kernel_size=1) 206 | self.dim_restore = nn.Conv2d(in_channels=se_size, 207 | out_channels=in_channels, 208 | kernel_size=1) 209 | self.activation = F.relu if activation is None else activation 210 | 211 | def forward(self, x): 212 | inp = x 213 | x = F.adaptive_avg_pool2d(x, (1, 1)) 214 | x = self.dim_reduce(x) 215 | x = self.activation(x) 216 | x = self.dim_restore(x) 217 | x = torch.sigmoid(x) 218 | return torch.mul(inp, x) 219 | 220 | 221 | class MBConvBlock(nn.Module): 222 | def __init__(self, 223 | in_spatial_shape, 224 | in_channels, 225 | out_channels, 226 | kernel_size, 227 | stride, 228 | expansion_factor, 229 | activation, 230 | bn_epsilon=None, 231 | bn_momentum=None, 232 | se_size=None, 233 | drop_connect_rate=None, 234 | bias=False): 235 | """ 236 | Initialize new MBConv block 237 | :param in_spatial_shape: image shape, e.g. 
tuple [height, width] or int size for [size, size] 238 | :param in_channels: number of input channels 239 | :param out_channels: number of output channels 240 | :param kernel_size: kernel size for depth-wise convolution 241 | :param stride: stride for depth-wise convolution 242 | :param expansion_factor: expansion factor 243 | :param bn_epsilon: batch normalization epsilon 244 | :param bn_momentum: batch normalization momentum 245 | :param se_size: number of features in reduction layer of Squeeze-and-Excitate layer 246 | :param activation: activation function 247 | :param drop_connect_rate: DropConnect rate 248 | :param bias: enable bias in convolution operations 249 | """ 250 | super(MBConvBlock, self).__init__() 251 | 252 | if se_size is not None and se_size < 1: 253 | raise ValueError("se_size must be >=1, got {} instead".format(se_size)) 254 | 255 | if drop_connect_rate is not None and not 0 <= drop_connect_rate < 1: 256 | raise ValueError("drop_connect_rate must be in range [0,1), got {} instead".format(drop_connect_rate)) 257 | 258 | if not (isinstance(expansion_factor, int) and expansion_factor >= 1): 259 | raise ValueError("expansion factor must be int and >=1, got {} instead".format(expansion_factor)) 260 | 261 | exp_channels = in_channels * expansion_factor 262 | kernel_size = _pair(kernel_size) 263 | stride = _pair(stride) 264 | 265 | self.activation = activation 266 | 267 | # expansion convolution 268 | if expansion_factor != 1: 269 | self.expand_conv = ConvBNAct(in_channels=in_channels, 270 | out_channels=exp_channels, 271 | kernel_size=(1, 1), 272 | bias=bias, 273 | activation=self.activation, 274 | bn_epsilon=bn_epsilon, 275 | bn_momentum=bn_momentum) 276 | else: 277 | self.expand_conv = None 278 | 279 | # depth-wise convolution 280 | self.dp_conv = ConvBNAct(in_spatial_shape=in_spatial_shape, 281 | in_channels=exp_channels, 282 | out_channels=exp_channels, 283 | kernel_size=kernel_size, 284 | stride=stride, 285 | groups=exp_channels, 286 | bias=bias, 287 | activation=self.activation, 288 | same_padding=True, 289 | bn_epsilon=bn_epsilon, 290 | bn_momentum=bn_momentum) 291 | 292 | if se_size is not None: 293 | self.se = SqueezeExcitate(exp_channels, 294 | se_size, 295 | activation=self.activation) 296 | else: 297 | self.se = None 298 | 299 | if drop_connect_rate is not None: 300 | self.drop_connect = DropConnect(drop_connect_rate) 301 | else: 302 | self.drop_connect = None 303 | 304 | if in_channels == out_channels and all(s == 1 for s in stride): 305 | self.skip_enabled = True 306 | else: 307 | self.skip_enabled = False 308 | 309 | # projection convolution 310 | self.project_conv = ConvBNAct(in_channels=exp_channels, 311 | out_channels=out_channels, 312 | kernel_size=(1, 1), 313 | bias=bias, 314 | activation=None, 315 | bn_epsilon=bn_epsilon, 316 | bn_momentum=bn_momentum) 317 | 318 | @property 319 | def in_spatial_shape(self): 320 | return self.dp_conv.in_spatial_shape 321 | 322 | @property 323 | def out_spatial_shape(self): 324 | return self.dp_conv.out_spatial_shape 325 | 326 | @property 327 | def in_channels(self): 328 | if self.expand_conv is not None: 329 | return self.expand_conv.in_channels 330 | else: 331 | return self.dp_conv.in_channels 332 | 333 | @property 334 | def out_channels(self): 335 | return self.project_conv.out_channels 336 | 337 | def forward(self, x): 338 | inp = x 339 | 340 | if self.expand_conv is not None: 341 | # expansion convolution applied only if expansion ratio > 1 342 | x = self.expand_conv(x) 343 | 344 | # depth-wise convolution 345 | x = 
self.dp_conv(x) 346 | 347 | # squeeze-and-excitate 348 | if self.se is not None: 349 | x = self.se(x) 350 | 351 | # projection convolution 352 | x = self.project_conv(x) 353 | 354 | if self.skip_enabled: 355 | # drop-connect applied only if skip connection enabled 356 | if self.drop_connect is not None: 357 | x = self.drop_connect(x) 358 | x = x + inp 359 | return x 360 | 361 | 362 | class EnetStage(nn.Module): 363 | def __init__(self, 364 | num_layers, 365 | in_spatial_shape, 366 | in_channels, 367 | out_channels, 368 | stride, 369 | se_ratio, 370 | drop_connect_rates, 371 | **kwargs): 372 | super(EnetStage, self).__init__() 373 | 374 | if not (isinstance(num_layers, int) and num_layers >= 1): 375 | raise ValueError("num_layers must be int and >=1, got {} instead".format(num_layers)) 376 | 377 | if not (isinstance(drop_connect_rates, container_abcs.Iterable) and 378 | len(drop_connect_rates) == num_layers): 379 | raise ValueError("drop_connect_rates must be iterable of " 380 | "length num_layers ({}), got {} instead".format(num_layers, drop_connect_rates)) 381 | 382 | self.num_layers = num_layers 383 | self.layers = nn.ModuleList() 384 | spatial_shape = in_spatial_shape 385 | for i in range(self.num_layers): 386 | se_size = max(1, in_channels // se_ratio) 387 | layer = MBConvBlock(in_spatial_shape=spatial_shape, 388 | in_channels=in_channels, 389 | out_channels=out_channels, 390 | stride=stride, 391 | se_size=se_size, 392 | drop_connect_rate=drop_connect_rates[i], 393 | **kwargs) 394 | self.layers.append(layer) 395 | spatial_shape = layer.out_spatial_shape 396 | # remaining MBConv blocks have stride 1 and in_channels=out_channels 397 | stride = 1 398 | in_channels = out_channels 399 | 400 | @property 401 | def in_spatial_shape(self): 402 | return self.layers[0].in_spatial_shape 403 | 404 | @property 405 | def out_spatial_shape(self): 406 | return self.layers[-1].out_spatial_shape 407 | 408 | @property 409 | def in_channels(self): 410 | return self.layers[0].in_channels 411 | 412 | @property 413 | def out_channels(self): 414 | return self.layers[-1].out_channels 415 | 416 | def forward(self, x): 417 | for layer in self.layers: 418 | x = layer(x) 419 | return x 420 | 421 | 422 | def round_filters(filters, width_coefficient, depth_divisor=8): 423 | """Round number of filters based on depth multiplier.""" 424 | min_depth = depth_divisor 425 | 426 | filters *= width_coefficient 427 | new_filters = max(min_depth, int(filters + depth_divisor / 2) // depth_divisor * depth_divisor) 428 | # Make sure that round down does not go down by more than 10%. 
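# (added illustrative comment, not in the original: e.g. width_coefficient=1.15
#  and filters=10 give 11.5, which snaps down to 8 on the line above; since
#  8 < 0.9 * 11.5, the branch below bumps the result to 16, keeping the
#  rounded width within 10% of the scaled target)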
429 | if new_filters < 0.9 * filters: 430 | new_filters += depth_divisor 431 | return int(new_filters) 432 | 433 | 434 | def round_repeats(repeats, depth_coefficient): 435 | """Round number of filters based on depth multiplier.""" 436 | return int(ceil(depth_coefficient * repeats)) 437 | 438 | 439 | class EfficientNet(nn.Module): 440 | # (width_coefficient, depth_coefficient, dropout_rate, in_spatial_shape) 441 | coefficients = [ 442 | (1.0, 1.0, 0.2, 224), 443 | (1.0, 1.1, 0.2, 240), 444 | (1.1, 1.2, 0.3, 260), 445 | (1.2, 1.4, 0.3, 300), 446 | (1.4, 1.8, 0.4, 380), 447 | (1.6, 2.2, 0.4, 456), 448 | (1.8, 2.6, 0.5, 528), 449 | (2.0, 3.1, 0.5, 600), 450 | ] 451 | 452 | # block_repeat, kernel_size, stride, expansion_factor, input_channels, output_channels, se_ratio 453 | stage_args = [ 454 | [1, 3, 1, 1, 32, 16, 4], 455 | [2, 3, 2, 6, 16, 24, 4], 456 | [2, 5, 2, 6, 24, 40, 4], 457 | [3, 3, 2, 6, 40, 80, 4], 458 | [3, 5, 1, 6, 80, 112, 4], 459 | [4, 5, 2, 6, 112, 192, 4], 460 | [1, 3, 1, 6, 192, 320, 4], 461 | ] 462 | 463 | state_dict_urls = [ 464 | "https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmliYV9HaE5PWWVEbXVMd3c/root/content", 465 | "https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmlicV9HaE5PWWVEbXVMd3c/root/content", 466 | "https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmliNl9HaE5PWWVEbXVMd3c/root/content", 467 | "https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmljS19HaE5PWWVEbXVMd3c/root/content", 468 | "https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmljYV9HaE5PWWVEbXVMd3c/root/content", 469 | "https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmljcV9HaE5PWWVEbXVMd3c/root/content", 470 | "https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmljNl9HaE5PWWVEbXVMd3c/root/content", 471 | "https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmlkS19HaE5PWWVEbXVMd3c/root/content", 472 | ] 473 | 474 | dict_names = [ 475 | 'efficientnet-b0-d86f8792.pth', 476 | 'efficientnet-b1-82896633.pth', 477 | 'efficientnet-b2-e4b93854.pth', 478 | 'efficientnet-b3-3b9ca610.pth', 479 | 'efficientnet-b4-24436ca5.pth', 480 | 'efficientnet-b5-d8e577e8.pth', 481 | 'efficientnet-b6-f20845c7.pth', 482 | 'efficientnet-b7-86e8e374.pth' 483 | ] 484 | 485 | def __init__(self, 486 | b, 487 | in_channels=3, 488 | n_classes=1000, 489 | in_spatial_shape=None, 490 | activation=Swish(), 491 | bias=False, 492 | drop_connect_rate=0.2, 493 | dropout_rate=None, 494 | bn_epsilon=1e-3, 495 | bn_momentum=0.01, 496 | pretrained=False, 497 | progress=False): 498 | """ 499 | Initialize new EfficientNet model 500 | :param b: model index, i.e. 
0 for EfficientNet-B0 501 | :param in_channels: number of input channels 502 | :param n_classes: number of output classes 503 | :param in_spatial_shape: input image shape 504 | :param activation: activation function 505 | :param bias: enable bias in convolution operations 506 | :param drop_connect_rate: DropConnect rate 507 | :param dropout_rate: dropout rate, this will override default rate for each model 508 | :param bn_epsilon: batch normalization epsilon 509 | :param bn_momentum: batch normalization momentum 510 | :param pretrained: initialize model with weights pre-trained on ImageNet 511 | :param progress: show progress when downloading pre-trained weights 512 | """ 513 | 514 | super(EfficientNet, self).__init__() 515 | 516 | # verify all parameters 517 | EfficientNet.check_init_params(b, 518 | in_channels, 519 | n_classes, 520 | in_spatial_shape, 521 | activation, 522 | bias, 523 | drop_connect_rate, 524 | dropout_rate, 525 | bn_epsilon, 526 | bn_momentum, 527 | pretrained, 528 | progress) 529 | 530 | self.b = b 531 | self.in_channels = in_channels 532 | self.activation = activation 533 | self.drop_connect_rate = drop_connect_rate 534 | self._override_dropout_rate = dropout_rate 535 | 536 | width_coefficient, _, _, spatial_shape = EfficientNet.coefficients[self.b] 537 | 538 | if in_spatial_shape is not None: 539 | self.in_spatial_shape = _pair(in_spatial_shape) 540 | else: 541 | self.in_spatial_shape = _pair(spatial_shape) 542 | 543 | # initial convolution 544 | init_conv_out_channels = round_filters(32, width_coefficient) 545 | self.init_conv = ConvBNAct(in_spatial_shape=self.in_spatial_shape, 546 | in_channels=self.in_channels, 547 | out_channels=init_conv_out_channels, 548 | kernel_size=(3, 3), 549 | stride=(2, 2), 550 | bias=bias, 551 | activation=self.activation, 552 | same_padding=True, 553 | bn_epsilon=bn_epsilon, 554 | bn_momentum=bn_momentum) 555 | spatial_shape = self.init_conv.out_spatial_shape 556 | 557 | self.stages = nn.ModuleList() 558 | mbconv_idx = 0 559 | dc_rates = self.get_dc_rates() 560 | for stage_id in range(self.num_stages): 561 | kernel_size = self.get_stage_kernel_size(stage_id) 562 | stride = self.get_stage_stride(stage_id) 563 | expansion_factor = self.get_stage_expansion_factor(stage_id) 564 | stage_in_channels = self.get_stage_in_channels(stage_id) 565 | stage_out_channels = self.get_stage_out_channels(stage_id) 566 | stage_num_layers = self.get_stage_num_layers(stage_id) 567 | stage_dc_rates = dc_rates[mbconv_idx:mbconv_idx + stage_num_layers] 568 | stage_se_ratio = self.get_stage_se_ratio(stage_id) 569 | 570 | stage = EnetStage(num_layers=stage_num_layers, 571 | in_spatial_shape=spatial_shape, 572 | in_channels=stage_in_channels, 573 | out_channels=stage_out_channels, 574 | stride=stride, 575 | se_ratio=stage_se_ratio, 576 | drop_connect_rates=stage_dc_rates, 577 | kernel_size=kernel_size, 578 | expansion_factor=expansion_factor, 579 | activation=self.activation, 580 | bn_epsilon=bn_epsilon, 581 | bn_momentum=bn_momentum, 582 | bias=bias 583 | ) 584 | self.stages.append(stage) 585 | spatial_shape = stage.out_spatial_shape 586 | mbconv_idx += stage_num_layers 587 | 588 | head_conv_out_channels = round_filters(1280, width_coefficient) 589 | head_conv_in_channels = self.stages[-1].layers[-1].project_conv.out_channels 590 | self.head_conv = ConvBNAct(in_channels=head_conv_in_channels, 591 | out_channels=head_conv_out_channels, 592 | kernel_size=(1, 1), 593 | bias=bias, 594 | activation=self.activation, 595 | bn_epsilon=bn_epsilon, 596 | 
bn_momentum=bn_momentum) 597 | 598 | if self.dropout_rate > 0: 599 | self.dropout = nn.Dropout(p=self.dropout_rate) 600 | else: 601 | self.dropout = None 602 | 603 | self.avpool = nn.AdaptiveAvgPool2d((1, 1)) 604 | self.fc = nn.Linear(head_conv_out_channels, n_classes) 605 | 606 | if pretrained: 607 | self._load_state(self.b, in_channels, n_classes, progress) 608 | 609 | @property 610 | def num_stages(self): 611 | return len(EfficientNet.stage_args) 612 | 613 | @property 614 | def width_coefficient(self): 615 | return EfficientNet.coefficients[self.b][0] 616 | 617 | @property 618 | def depth_coefficient(self): 619 | return EfficientNet.coefficients[self.b][1] 620 | 621 | @property 622 | def dropout_rate(self): 623 | if self._override_dropout_rate is None: 624 | return EfficientNet.coefficients[self.b][2] 625 | else: 626 | return self._override_dropout_rate 627 | 628 | def get_stage_kernel_size(self, stage): 629 | return EfficientNet.stage_args[stage][1] 630 | 631 | def get_stage_stride(self, stage): 632 | return EfficientNet.stage_args[stage][2] 633 | 634 | def get_stage_expansion_factor(self, stage): 635 | return EfficientNet.stage_args[stage][3] 636 | 637 | def get_stage_in_channels(self, stage): 638 | width_coefficient = self.width_coefficient 639 | in_channels = EfficientNet.stage_args[stage][4] 640 | return round_filters(in_channels, width_coefficient) 641 | 642 | def get_stage_out_channels(self, stage): 643 | width_coefficient = self.width_coefficient 644 | out_channels = EfficientNet.stage_args[stage][5] 645 | return round_filters(out_channels, width_coefficient) 646 | 647 | def get_stage_se_ratio(self, stage): 648 | return EfficientNet.stage_args[stage][6] 649 | 650 | def get_stage_num_layers(self, stage): 651 | depth_coefficient = self.depth_coefficient 652 | num_layers = EfficientNet.stage_args[stage][0] 653 | return round_repeats(num_layers, depth_coefficient) 654 | 655 | def get_num_mbconv_layers(self): 656 | total = 0 657 | for i in range(self.num_stages): 658 | total += self.get_stage_num_layers(i) 659 | return total 660 | 661 | def get_dc_rates(self): 662 | total_mbconv_layers = self.get_num_mbconv_layers() 663 | return [self.drop_connect_rate * i / total_mbconv_layers 664 | for i in range(total_mbconv_layers)] 665 | 666 | def _load_state(self, b, in_channels, n_classes, progress): 667 | state_dict = model_zoo.load_url(EfficientNet.state_dict_urls[b], progress=progress, file_name=EfficientNet.dict_names[b]) 668 | strict = True 669 | if in_channels != 3: 670 | state_dict.pop('init_conv.conv.conv.weight') 671 | strict = False 672 | if n_classes != 1000: 673 | state_dict.pop('fc.weight') 674 | state_dict.pop('fc.bias') 675 | strict = False 676 | self.load_state_dict(state_dict, strict=strict) 677 | print("Model weights loaded successfully.") 678 | 679 | def check_input(self, x): 680 | if x.dim() != 4: 681 | raise ValueError("Input x must be 4 dimensional tensor, got {} instead".format(x.dim())) 682 | if x.size(1) != self.in_channels: 683 | raise ValueError("Input must have {} channels, got {} instead".format(self.in_channels, 684 | x.size(1))) 685 | 686 | @staticmethod 687 | def check_init_params(b, 688 | in_channels, 689 | n_classes, 690 | in_spatial_shape, 691 | activation, 692 | bias, 693 | drop_connect_rate, 694 | override_dropout_rate, 695 | bn_epsilon, 696 | bn_momentum, 697 | pretrained, 698 | progress): 699 | 700 | if not isinstance(b, int): 701 | raise ValueError("b must be int, got {} instead".format(type(b))) 702 | elif not 0 <= b < len(EfficientNet.coefficients): 
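# (added note: the coefficients table above has 8 entries, so the valid
#  range here is b = 0 (EfficientNet-B0) through b = 7 (EfficientNet-B7))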
703 | raise ValueError("b must be in range 0<=b<=7, got {} instead".format(b)) 704 | 705 | if not isinstance(in_channels, int): 706 | raise ValueError("in_channels must be int, got {} instead".format(type(in_channels))) 707 | elif not in_channels > 0: 708 | raise ValueError("in_channels must be > 0, got {} instead".format(in_channels)) 709 | 710 | if not isinstance(n_classes, int): 711 | raise ValueError("n_classes must be int, got {} instead".format(type(n_classes))) 712 | elif not n_classes > 0: 713 | raise ValueError("n_classes must be > 0, got {} instead".format(n_classes)) 714 | 715 | if not (in_spatial_shape is None or 716 | isinstance(in_spatial_shape, int) or 717 | (isinstance(in_spatial_shape, container_abcs.Iterable) and 718 | len(in_spatial_shape) == 2 and 719 | all(isinstance(s, int) for s in in_spatial_shape))): 720 | raise ValueError("in_spatial_shape must be either None, int or iterable of ints of length 2" 721 | ", got {} instead".format(in_spatial_shape)) 722 | 723 | if activation is not None and not callable(activation): 724 | raise ValueError("activation must be callable but is not") 725 | 726 | if not isinstance(bias, bool): 727 | raise ValueError("bias must be bool, got {} instead".format(type(bias))) 728 | 729 | if not isinstance(drop_connect_rate, float): 730 | raise ValueError("drop_connect_rate must be float, got {} instead".format(type(drop_connect_rate))) 731 | elif not 0 <= drop_connect_rate < 1.0: 732 | raise ValueError("drop_connect_rate must be within range 0 <= drop_connect_rate < 1.0, " 733 | "got {} instead".format(drop_connect_rate)) 734 | 735 | if override_dropout_rate is not None: 736 | if not isinstance(override_dropout_rate, float): 737 | raise ValueError("dropout_rate must be either None or float, " 738 | "got {} instead".format(type(override_dropout_rate))) 739 | elif not 0 <= override_dropout_rate < 1.0: 740 | raise ValueError("dropout_rate must be within range 0 <= dropout_rate < 1.0, " 741 | "got {} instead".format(override_dropout_rate)) 742 | 743 | if not isinstance(bn_epsilon, float): 744 | raise ValueError("bn_epsilon must be float, got {} instead".format(bn_epsilon)) 745 | 746 | if not isinstance(bn_momentum, float): 747 | raise ValueError("bn_momentum must be float, got {} instead".format(bn_momentum)) 748 | 749 | if not isinstance(pretrained, bool): 750 | raise ValueError("pretrained must be bool, got {} instead".format(type(pretrained))) 751 | 752 | if not isinstance(progress, bool): 753 | raise ValueError("progress must be bool, got {} instead".format(type(progress))) 754 | 755 | def get_features(self, x): 756 | 757 | self.check_input(x) 758 | 759 | x = self.init_conv(x) 760 | out = [] 761 | for stage in self.stages: 762 | x = stage(x) 763 | out.append(x) 764 | return out 765 | 766 | def forward(self, x): 767 | 768 | x = self.get_features(x)[-1] 769 | 770 | x = self.head_conv(x) 771 | 772 | x = self.avpool(x) 773 | x = torch.flatten(x, 1) 774 | 775 | if self.dropout is not None: 776 | x = self.dropout(x) 777 | x = self.fc(x) 778 | 779 | return x 780 | -------------------------------------------------------------------------------- /efficientnet_v2.py: -------------------------------------------------------------------------------- 1 | import collections.abc as container_abc 2 | from collections import OrderedDict 3 | from math import ceil, floor 4 | 5 | import torch 6 | import torch.nn as nn 7 | import torch.nn.functional as F 8 | from torch.utils import model_zoo 9 | 10 | 11 | def _pair(x): 12 | if isinstance(x, container_abc.Iterable): 
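# (added note: iterables such as (height, width) are returned unchanged;
#  a bare int n falls through below and is duplicated into the pair (n, n))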
13 | return x 14 | return (x, x) 15 | 16 | 17 | def torch_conv_out_spatial_shape(in_spatial_shape, kernel_size, stride): 18 | if in_spatial_shape is None: 19 | return None 20 | # in_spatial_shape -> [H,W] 21 | hin, win = _pair(in_spatial_shape) 22 | kh, kw = _pair(kernel_size) 23 | sh, sw = _pair(stride) 24 | 25 | # dilation and padding are ignored since they are always fixed in efficientnetV2 26 | hout = int(floor((hin - kh - 1) / sh + 1)) 27 | wout = int(floor((win - kw - 1) / sw + 1)) 28 | return hout, wout 29 | 30 | 31 | def get_activation(act_fn: str, **kwargs): 32 | if act_fn in ('silu', 'swish'): 33 | return nn.SiLU(**kwargs) 34 | elif act_fn == 'relu': 35 | return nn.ReLU(**kwargs) 36 | elif act_fn == 'relu6': 37 | return nn.ReLU6(**kwargs) 38 | elif act_fn == 'elu': 39 | return nn.ELU(**kwargs) 40 | elif act_fn == 'leaky_relu': 41 | return nn.LeakyReLU(**kwargs) 42 | elif act_fn == 'selu': 43 | return nn.SELU(**kwargs) 44 | elif act_fn == 'mish': 45 | return nn.Mish(**kwargs) 46 | else: 47 | raise ValueError('Unsupported act_fn {}'.format(act_fn)) 48 | 49 | 50 | def round_filters(filters, width_coefficient, depth_divisor=8): 51 | """Round number of filters based on depth multiplier.""" 52 | min_depth = depth_divisor 53 | filters *= width_coefficient 54 | new_filters = max(min_depth, int(filters + depth_divisor / 2) // depth_divisor * depth_divisor) 55 | return int(new_filters) 56 | 57 | 58 | def round_repeats(repeats, depth_coefficient): 59 | """Round number of filters based on depth multiplier.""" 60 | return int(ceil(depth_coefficient * repeats)) 61 | 62 | 63 | class DropConnect(nn.Module): 64 | def __init__(self, rate=0.5): 65 | super(DropConnect, self).__init__() 66 | self.keep_prob = None 67 | self.set_rate(rate) 68 | 69 | def set_rate(self, rate): 70 | if not 0 <= rate < 1: 71 | raise ValueError("rate must be 0<=rate<1, got {} instead".format(rate)) 72 | self.keep_prob = 1 - rate 73 | 74 | def forward(self, x): 75 | if self.training: 76 | random_tensor = self.keep_prob + torch.rand([x.size(0), 1, 1, 1], 77 | dtype=x.dtype, 78 | device=x.device) 79 | binary_tensor = torch.floor(random_tensor) 80 | return torch.mul(torch.div(x, self.keep_prob), binary_tensor) 81 | else: 82 | return x 83 | 84 | 85 | class SamePaddingConv2d(nn.Module): 86 | def __init__(self, 87 | in_spatial_shape, 88 | in_channels, 89 | out_channels, 90 | kernel_size, 91 | stride, 92 | dilation=1, 93 | enforce_in_spatial_shape=False, 94 | **kwargs): 95 | super(SamePaddingConv2d, self).__init__() 96 | 97 | self._in_spatial_shape = _pair(in_spatial_shape) 98 | # e.g. 
throw exception if input spatial shape does not match in_spatial_shape 99 | # when calling self.forward() 100 | self.enforce_in_spatial_shape = enforce_in_spatial_shape 101 | kernel_size = _pair(kernel_size) 102 | stride = _pair(stride) 103 | dilation = _pair(dilation) 104 | 105 | in_height, in_width = self._in_spatial_shape 106 | filter_height, filter_width = kernel_size 107 | stride_heigth, stride_width = stride 108 | dilation_height, dilation_width = dilation 109 | 110 | out_height = int(ceil(float(in_height) / float(stride_heigth))) 111 | out_width = int(ceil(float(in_width) / float(stride_width))) 112 | 113 | pad_along_height = max((out_height - 1) * stride_heigth + 114 | filter_height + (filter_height - 1) * (dilation_height - 1) - in_height, 0) 115 | pad_along_width = max((out_width - 1) * stride_width + 116 | filter_width + (filter_width - 1) * (dilation_width - 1) - in_width, 0) 117 | 118 | pad_top = pad_along_height // 2 119 | pad_bottom = pad_along_height - pad_top 120 | pad_left = pad_along_width // 2 121 | pad_right = pad_along_width - pad_left 122 | 123 | paddings = (pad_left, pad_right, pad_top, pad_bottom) 124 | if any(p > 0 for p in paddings): 125 | self.zero_pad = nn.ZeroPad2d(paddings) 126 | else: 127 | self.zero_pad = None 128 | self.conv = nn.Conv2d(in_channels=in_channels, 129 | out_channels=out_channels, 130 | kernel_size=kernel_size, 131 | stride=stride, 132 | dilation=dilation, 133 | **kwargs) 134 | 135 | self._out_spatial_shape = (out_height, out_width) 136 | 137 | @property 138 | def out_spatial_shape(self): 139 | return self._out_spatial_shape 140 | 141 | def check_spatial_shape(self, x): 142 | if x.size(2) != self._in_spatial_shape[0] or \ 143 | x.size(3) != self._in_spatial_shape[1]: 144 | raise ValueError( 145 | "Expected input spatial shape {}, got {} instead".format(self._in_spatial_shape, x.shape[2:])) 146 | 147 | def forward(self, x): 148 | if self.enforce_in_spatial_shape: 149 | self.check_spatial_shape(x) 150 | if self.zero_pad is not None: 151 | x = self.zero_pad(x) 152 | x = self.conv(x) 153 | return x 154 | 155 | 156 | class SqueezeExcitate(nn.Module): 157 | def __init__(self, 158 | in_channels, 159 | se_size, 160 | activation=None): 161 | super(SqueezeExcitate, self).__init__() 162 | self.dim_reduce = nn.Conv2d(in_channels=in_channels, 163 | out_channels=se_size, 164 | kernel_size=1) 165 | self.dim_restore = nn.Conv2d(in_channels=se_size, 166 | out_channels=in_channels, 167 | kernel_size=1) 168 | self.activation = F.relu if activation is None else activation 169 | 170 | def forward(self, x): 171 | inp = x 172 | x = F.adaptive_avg_pool2d(x, (1, 1)) 173 | x = self.dim_reduce(x) 174 | x = self.activation(x) 175 | x = self.dim_restore(x) 176 | x = torch.sigmoid(x) 177 | return torch.mul(inp, x) 178 | 179 | 180 | class MBConvBlockV2(nn.Module): 181 | def __init__(self, 182 | in_channels, 183 | out_channels, 184 | kernel_size, 185 | stride, 186 | expansion_factor, 187 | act_fn, 188 | act_kwargs=None, 189 | bn_epsilon=None, 190 | bn_momentum=None, 191 | se_size=None, 192 | drop_connect_rate=None, 193 | bias=False, 194 | tf_style_conv=False, 195 | in_spatial_shape=None): 196 | 197 | super().__init__() 198 | 199 | if act_kwargs is None: 200 | act_kwargs = {} 201 | exp_channels = in_channels * expansion_factor 202 | 203 | self.ops_lst = [] 204 | 205 | # expansion convolution 206 | if expansion_factor != 1: 207 | self.expand_conv = nn.Conv2d(in_channels=in_channels, 208 | out_channels=exp_channels, 209 | kernel_size=1, 210 | bias=bias) 211 | 212 | 
self.expand_bn = nn.BatchNorm2d(num_features=exp_channels, 213 | eps=bn_epsilon, 214 | momentum=bn_momentum) 215 | 216 | self.expand_act = get_activation(act_fn, **act_kwargs) 217 | self.ops_lst.extend([self.expand_conv, self.expand_bn, self.expand_act]) 218 | 219 | # depth-wise convolution 220 | if tf_style_conv: 221 | self.dp_conv = SamePaddingConv2d(in_spatial_shape=in_spatial_shape, 222 | in_channels=exp_channels, 223 | out_channels=exp_channels, 224 | kernel_size=kernel_size, 225 | stride=stride, 226 | groups=exp_channels, 227 | bias=bias) 228 | self.out_spatial_shape = self.dp_conv.out_spatial_shape 229 | else: 230 | self.dp_conv = nn.Conv2d(in_channels=exp_channels, 231 | out_channels=exp_channels, 232 | kernel_size=kernel_size, 233 | stride=stride, 234 | padding=1, 235 | groups=exp_channels, 236 | bias=bias) 237 | self.out_spatial_shape = torch_conv_out_spatial_shape(in_spatial_shape, kernel_size, stride) 238 | 239 | self.dp_bn = nn.BatchNorm2d(num_features=exp_channels, 240 | eps=bn_epsilon, 241 | momentum=bn_momentum) 242 | 243 | self.dp_act = get_activation(act_fn, **act_kwargs) 244 | self.ops_lst.extend([self.dp_conv, self.dp_bn, self.dp_act]) 245 | 246 | # Squeeze and Excitate 247 | if se_size is not None: 248 | self.se = SqueezeExcitate(exp_channels, 249 | se_size, 250 | activation=get_activation(act_fn, **act_kwargs)) 251 | self.ops_lst.append(self.se) 252 | 253 | # projection layer 254 | self.project_conv = nn.Conv2d(in_channels=exp_channels, 255 | out_channels=out_channels, 256 | kernel_size=1, 257 | bias=bias) 258 | 259 | self.project_bn = nn.BatchNorm2d(num_features=out_channels, 260 | eps=bn_epsilon, 261 | momentum=bn_momentum) 262 | 263 | # no activation function in projection layer 264 | 265 | self.ops_lst.extend([self.project_conv, self.project_bn]) 266 | 267 | self.skip_enabled = in_channels == out_channels and stride == 1 268 | 269 | if self.skip_enabled and drop_connect_rate is not None: 270 | self.drop_connect = DropConnect(drop_connect_rate) 271 | self.ops_lst.append(self.drop_connect) 272 | 273 | def forward(self, x): 274 | inp = x 275 | for op in self.ops_lst: 276 | x = op(x) 277 | if self.skip_enabled: 278 | return x + inp 279 | else: 280 | return x 281 | 282 | 283 | class FusedMBConvBlockV2(nn.Module): 284 | def __init__(self, 285 | in_channels, 286 | out_channels, 287 | kernel_size, 288 | stride, 289 | expansion_factor, 290 | act_fn, 291 | act_kwargs=None, 292 | bn_epsilon=None, 293 | bn_momentum=None, 294 | se_size=None, 295 | drop_connect_rate=None, 296 | bias=False, 297 | tf_style_conv=False, 298 | in_spatial_shape=None): 299 | 300 | super().__init__() 301 | 302 | if act_kwargs is None: 303 | act_kwargs = {} 304 | exp_channels = in_channels * expansion_factor 305 | 306 | self.ops_lst = [] 307 | 308 | # expansion convolution 309 | expansion_out_shape = in_spatial_shape 310 | if expansion_factor != 1: 311 | if tf_style_conv: 312 | self.expand_conv = SamePaddingConv2d(in_spatial_shape=in_spatial_shape, 313 | in_channels=in_channels, 314 | out_channels=exp_channels, 315 | kernel_size=kernel_size, 316 | stride=stride, 317 | bias=bias) 318 | expansion_out_shape = self.expand_conv.out_spatial_shape 319 | else: 320 | self.expand_conv = nn.Conv2d(in_channels=in_channels, 321 | out_channels=exp_channels, 322 | kernel_size=kernel_size, 323 | padding=1, 324 | stride=stride, 325 | bias=bias) 326 | expansion_out_shape = torch_conv_out_spatial_shape(in_spatial_shape, kernel_size, stride) 327 | 328 | self.expand_bn = nn.BatchNorm2d(num_features=exp_channels, 329 | 
eps=bn_epsilon, 330 | momentum=bn_momentum) 331 | 332 | self.expand_act = get_activation(act_fn, **act_kwargs) 333 | self.ops_lst.extend([self.expand_conv, self.expand_bn, self.expand_act]) 334 | 335 | # Squeeze and Excitate 336 | if se_size is not None: 337 | self.se = SqueezeExcitate(exp_channels, 338 | se_size, 339 | activation=get_activation(act_fn, **act_kwargs)) 340 | self.ops_lst.append(self.se) 341 | 342 | # projection layer 343 | kernel_size = 1 if expansion_factor != 1 else kernel_size 344 | stride = 1 if expansion_factor != 1 else stride 345 | if tf_style_conv: 346 | self.project_conv = SamePaddingConv2d(in_spatial_shape=expansion_out_shape, 347 | in_channels=exp_channels, 348 | out_channels=out_channels, 349 | kernel_size=kernel_size, 350 | stride=stride, 351 | bias=bias) 352 | self.out_spatial_shape = self.project_conv.out_spatial_shape 353 | else: 354 | self.project_conv = nn.Conv2d(in_channels=exp_channels, 355 | out_channels=out_channels, 356 | kernel_size=kernel_size, 357 | stride=stride, 358 | padding=1 if kernel_size > 1 else 0, 359 | bias=bias) 360 | self.out_spatial_shape = torch_conv_out_spatial_shape(expansion_out_shape, kernel_size, stride) 361 | 362 | self.project_bn = nn.BatchNorm2d(num_features=out_channels, 363 | eps=bn_epsilon, 364 | momentum=bn_momentum) 365 | 366 | self.ops_lst.extend( 367 | [self.project_conv, self.project_bn]) 368 | 369 | if expansion_factor == 1: 370 | self.project_act = get_activation(act_fn, **act_kwargs) 371 | self.ops_lst.append(self.project_act) 372 | 373 | self.skip_enabled = in_channels == out_channels and stride == 1 374 | 375 | if self.skip_enabled and drop_connect_rate is not None: 376 | self.drop_connect = DropConnect(drop_connect_rate) 377 | self.ops_lst.append(self.drop_connect) 378 | 379 | def forward(self, x): 380 | inp = x 381 | for op in self.ops_lst: 382 | x = op(x) 383 | if self.skip_enabled: 384 | return x + inp 385 | else: 386 | return x 387 | 388 | 389 | class EfficientNetV2(nn.Module): 390 | _models = {'b0': {'num_repeat': [1, 2, 2, 3, 5, 8], 391 | 'kernel_size': [3, 3, 3, 3, 3, 3], 392 | 'stride': [1, 2, 2, 2, 1, 2], 393 | 'expand_ratio': [1, 4, 4, 4, 6, 6], 394 | 'in_channel': [32, 16, 32, 48, 96, 112], 395 | 'out_channel': [16, 32, 48, 96, 112, 192], 396 | 'se_ratio': [None, None, None, 0.25, 0.25, 0.25], 397 | 'conv_type': [1, 1, 1, 0, 0, 0], 398 | 'is_feature_stage': [False, True, True, False, True, True], 399 | 'width_coefficient': 1.0, 400 | 'depth_coefficient': 1.0, 401 | 'train_size': 192, 402 | 'eval_size': 224, 403 | 'dropout': 0.2, 404 | 'weight_url': 'https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmlnUVBhWkZRcWNXR3dINmRLP2U9UUI5ZndH/root/content', 405 | 'model_name': 'efficientnet_v2_b0_21k_ft1k-a91e14c5.pth'}, 406 | 'b1': {'num_repeat': [1, 2, 2, 3, 5, 8], 407 | 'kernel_size': [3, 3, 3, 3, 3, 3], 408 | 'stride': [1, 2, 2, 2, 1, 2], 409 | 'expand_ratio': [1, 4, 4, 4, 6, 6], 410 | 'in_channel': [32, 16, 32, 48, 96, 112], 411 | 'out_channel': [16, 32, 48, 96, 112, 192], 412 | 'se_ratio': [None, None, None, 0.25, 0.25, 0.25], 413 | 'conv_type': [1, 1, 1, 0, 0, 0], 414 | 'is_feature_stage': [False, True, True, False, True, True], 415 | 'width_coefficient': 1.0, 416 | 'depth_coefficient': 1.1, 417 | 'train_size': 192, 418 | 'eval_size': 240, 419 | 'dropout': 0.2, 420 | 'weight_url': 'https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmlnUVJnVGV5UndSY2J2amwtP2U9dTBiV1lO/root/content', 421 | 'model_name': 'efficientnet_v2_b1_21k_ft1k-58f4fb47.pth'}, 
422 | 'b2': {'num_repeat': [1, 2, 2, 3, 5, 8], 423 | 'kernel_size': [3, 3, 3, 3, 3, 3], 424 | 'stride': [1, 2, 2, 2, 1, 2], 425 | 'expand_ratio': [1, 4, 4, 4, 6, 6], 426 | 'in_channel': [32, 16, 32, 48, 96, 112], 427 | 'out_channel': [16, 32, 48, 96, 112, 192], 428 | 'se_ratio': [None, None, None, 0.25, 0.25, 0.25], 429 | 'conv_type': [1, 1, 1, 0, 0, 0], 430 | 'is_feature_stage': [False, True, True, False, True, True], 431 | 'width_coefficient': 1.1, 432 | 'depth_coefficient': 1.2, 433 | 'train_size': 208, 434 | 'eval_size': 260, 435 | 'dropout': 0.3, 436 | 'weight_url': 'https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmlnUVY4M2NySVFZbU41X0tGP2U9ZERZVmxK/root/content', 437 | 'model_name': 'efficientnet_v2_b2_21k_ft1k-db4ac0ee.pth'}, 438 | 'b3': {'num_repeat': [1, 2, 2, 3, 5, 8], 439 | 'kernel_size': [3, 3, 3, 3, 3, 3], 440 | 'stride': [1, 2, 2, 2, 1, 2], 441 | 'expand_ratio': [1, 4, 4, 4, 6, 6], 442 | 'in_channel': [32, 16, 32, 48, 96, 112], 443 | 'out_channel': [16, 32, 48, 96, 112, 192], 444 | 'se_ratio': [None, None, None, 0.25, 0.25, 0.25], 445 | 'conv_type': [1, 1, 1, 0, 0, 0], 446 | 'is_feature_stage': [False, True, True, False, True, True], 447 | 'width_coefficient': 1.2, 448 | 'depth_coefficient': 1.4, 449 | 'train_size': 240, 450 | 'eval_size': 300, 451 | 'dropout': 0.3, 452 | 'weight_url': 'https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmlnUVpkamdZUzhhaDdtTTZLP2U9anA4VWN2/root/content', 453 | 'model_name': 'efficientnet_v2_b3_21k_ft1k-3da5874c.pth'}, 454 | 's': {'num_repeat': [2, 4, 4, 6, 9, 15], 455 | 'kernel_size': [3, 3, 3, 3, 3, 3], 456 | 'stride': [1, 2, 2, 2, 1, 2], 457 | 'expand_ratio': [1, 4, 4, 4, 6, 6], 458 | 'in_channel': [24, 24, 48, 64, 128, 160], 459 | 'out_channel': [24, 48, 64, 128, 160, 256], 460 | 'se_ratio': [None, None, None, 0.25, 0.25, 0.25], 461 | 'conv_type': [1, 1, 1, 0, 0, 0], 462 | 'is_feature_stage': [False, True, True, False, True, True], 463 | 'width_coefficient': 1.0, 464 | 'depth_coefficient': 1.0, 465 | 'train_size': 300, 466 | 'eval_size': 384, 467 | 'dropout': 0.2, 468 | 'weight_url': 'https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmllbFF5VWJOZzd0cmhBbm8/root/content', 469 | 'model_name': 'efficientnet_v2_s_21k_ft1k-dbb43f38.pth'}, 470 | 'm': {'num_repeat': [3, 5, 5, 7, 14, 18, 5], 471 | 'kernel_size': [3, 3, 3, 3, 3, 3, 3], 472 | 'stride': [1, 2, 2, 2, 1, 2, 1], 473 | 'expand_ratio': [1, 4, 4, 4, 6, 6, 6], 474 | 'in_channel': [24, 24, 48, 80, 160, 176, 304], 475 | 'out_channel': [24, 48, 80, 160, 176, 304, 512], 476 | 'se_ratio': [None, None, None, 0.25, 0.25, 0.25, 0.25], 477 | 'conv_type': [1, 1, 1, 0, 0, 0, 0], 478 | 'is_feature_stage': [False, True, True, False, True, False, True], 479 | 'width_coefficient': 1.0, 480 | 'depth_coefficient': 1.0, 481 | 'train_size': 384, 482 | 'eval_size': 480, 483 | 'dropout': 0.3, 484 | 'weight_url': 'https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmllN1ZDazRFb0o1bnlyNUE/root/content', 485 | 'model_name': 'efficientnet_v2_m_21k_ft1k-da8e56c0.pth'}, 486 | 'l': {'num_repeat': [4, 7, 7, 10, 19, 25, 7], 487 | 'kernel_size': [3, 3, 3, 3, 3, 3, 3], 488 | 'stride': [1, 2, 2, 2, 1, 2, 1], 489 | 'expand_ratio': [1, 4, 4, 4, 6, 6, 6], 490 | 'in_channel': [32, 32, 64, 96, 192, 224, 384], 491 | 'out_channel': [32, 64, 96, 192, 224, 384, 640], 492 | 'se_ratio': [None, None, None, 0.25, 0.25, 0.25, 0.25], 493 | 'conv_type': [1, 1, 1, 0, 0, 0, 0], 494 | 'is_feature_stage': [False, True, True, 
False, True, False, True], 495 | 'feature_stages': [1, 2, 4, 6], 496 | 'width_coefficient': 1.0, 497 | 'depth_coefficient': 1.0, 498 | 'train_size': 384, 499 | 'eval_size': 480, 500 | 'dropout': 0.4, 501 | 'weight_url': 'https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmlmcmIyRHEtQTBhUTBhWVE/root/content', 502 | 'model_name': 'efficientnet_v2_l_21k_ft1k-08121eee.pth'}, 503 | 'xl': {'num_repeat': [4, 8, 8, 16, 24, 32, 8], 504 | 'kernel_size': [3, 3, 3, 3, 3, 3, 3], 505 | 'stride': [1, 2, 2, 2, 1, 2, 1], 506 | 'expand_ratio': [1, 4, 4, 4, 6, 6, 6], 507 | 'in_channel': [32, 32, 64, 96, 192, 256, 512], 508 | 'out_channel': [32, 64, 96, 192, 256, 512, 640], 509 | 'se_ratio': [None, None, None, 0.25, 0.25, 0.25, 0.25], 510 | 'conv_type': [1, 1, 1, 0, 0, 0, 0], 511 | 'is_feature_stage': [False, True, True, False, True, False, True], 512 | 'feature_stages': [1, 2, 4, 6], 513 | 'width_coefficient': 1.0, 514 | 'depth_coefficient': 1.0, 515 | 'train_size': 384, 516 | 'eval_size': 512, 517 | 'dropout': 0.4, 518 | 'weight_url': 'https://api.onedrive.com/v1.0/shares/u!aHR0cHM6Ly8xZHJ2Lm1zL3UvcyFBdGlRcHc5VGNjZmlmVXQtRHJLa21taUkxWkE/root/content', 519 | 'model_name': 'efficientnet_v2_xl_21k_ft1k-1fcc9744.pth'}} 520 | 521 | def __init__(self, 522 | model_name, 523 | in_channels=3, 524 | n_classes=1000, 525 | tf_style_conv=False, 526 | in_spatial_shape=None, 527 | activation='silu', 528 | activation_kwargs=None, 529 | bias=False, 530 | drop_connect_rate=0.2, 531 | dropout_rate=None, 532 | bn_epsilon=1e-3, 533 | bn_momentum=0.01, 534 | pretrained=False, 535 | progress=False, 536 | ): 537 | super().__init__() 538 | 539 | self.blocks = nn.ModuleList() 540 | self.model_name = model_name 541 | self.cfg = self._models[model_name] 542 | 543 | if tf_style_conv and in_spatial_shape is None: 544 | in_spatial_shape = self.cfg['eval_size'] 545 | 546 | activation_kwargs = {} if activation_kwargs is None else activation_kwargs 547 | dropout_rate = self.cfg['dropout'] if dropout_rate is None else dropout_rate 548 | _input_ch = in_channels 549 | 550 | self.feature_block_ids = [] 551 | 552 | # stem 553 | if tf_style_conv: 554 | self.stem_conv = SamePaddingConv2d( 555 | in_spatial_shape=in_spatial_shape, 556 | in_channels=in_channels, 557 | out_channels=round_filters(self.cfg['in_channel'][0], self.cfg['width_coefficient']), 558 | kernel_size=3, 559 | stride=2, 560 | bias=bias 561 | ) 562 | in_spatial_shape = self.stem_conv.out_spatial_shape 563 | else: 564 | self.stem_conv = nn.Conv2d( 565 | in_channels=in_channels, 566 | out_channels=round_filters(self.cfg['in_channel'][0], self.cfg['width_coefficient']), 567 | kernel_size=3, 568 | stride=2, 569 | padding=1, 570 | bias=bias 571 | ) 572 | 573 | self.stem_bn = nn.BatchNorm2d( 574 | num_features=round_filters(self.cfg['in_channel'][0], self.cfg['width_coefficient']), 575 | eps=bn_epsilon, 576 | momentum=bn_momentum) 577 | 578 | self.stem_act = get_activation(activation, **activation_kwargs) 579 | 580 | drop_connect_rates = self.get_dropconnect_rates(drop_connect_rate) 581 | 582 | stages = zip(*[self.cfg[x] for x in 583 | ['num_repeat', 'kernel_size', 'stride', 'expand_ratio', 'in_channel', 'out_channel', 'se_ratio', 584 | 'conv_type', 'is_feature_stage']]) 585 | 586 | idx = 0 587 | 588 | for stage_args in stages: 589 | (num_repeat, kernel_size, stride, expand_ratio, 590 | in_channels, out_channels, se_ratio, conv_type, is_feature_stage) = stage_args 591 | 592 | in_channels = round_filters( 593 | in_channels, self.cfg['width_coefficient']) 594 | 
out_channels = round_filters( 595 | out_channels, self.cfg['width_coefficient']) 596 | num_repeat = round_repeats( 597 | num_repeat, self.cfg['depth_coefficient']) 598 | 599 | conv_block = MBConvBlockV2 if conv_type == 0 else FusedMBConvBlockV2 600 | 601 | for _ in range(num_repeat): 602 | se_size = None if se_ratio is None else max(1, int(in_channels * se_ratio)) 603 | _b = conv_block(in_channels=in_channels, 604 | out_channels=out_channels, 605 | kernel_size=kernel_size, 606 | stride=stride, 607 | expansion_factor=expand_ratio, 608 | act_fn=activation, 609 | act_kwargs=activation_kwargs, 610 | bn_epsilon=bn_epsilon, 611 | bn_momentum=bn_momentum, 612 | se_size=se_size, 613 | drop_connect_rate=drop_connect_rates[idx], 614 | bias=bias, 615 | tf_style_conv=tf_style_conv, 616 | in_spatial_shape=in_spatial_shape 617 | ) 618 | self.blocks.append(_b) 619 | idx += 1 620 | if tf_style_conv: 621 | in_spatial_shape = _b.out_spatial_shape 622 | in_channels = out_channels 623 | stride = 1 624 | 625 | if is_feature_stage: 626 | self.feature_block_ids.append(idx - 1) 627 | 628 | head_conv_out_channels = round_filters(1280, self.cfg['width_coefficient']) 629 | 630 | self.head_conv = nn.Conv2d(in_channels=in_channels, 631 | out_channels=head_conv_out_channels, 632 | kernel_size=1, 633 | bias=bias) 634 | self.head_bn = nn.BatchNorm2d(num_features=head_conv_out_channels, 635 | eps=bn_epsilon, 636 | momentum=bn_momentum) 637 | self.head_act = get_activation(activation, **activation_kwargs) 638 | 639 | self.dropout = nn.Dropout(p=dropout_rate) 640 | 641 | self.avpool = nn.AdaptiveAvgPool2d((1, 1)) 642 | self.fc = nn.Linear(head_conv_out_channels, n_classes) 643 | 644 | if pretrained: 645 | self._load_state(_input_ch, n_classes, progress, tf_style_conv) 646 | 647 | return 648 | 649 | def _load_state(self, in_channels, n_classes, progress, tf_style_conv): 650 | state_dict = model_zoo.load_url(self.cfg['weight_url'], 651 | progress=progress, 652 | file_name=self.cfg['model_name']) 653 | 654 | strict = True 655 | 656 | if not tf_style_conv: 657 | state_dict = OrderedDict( 658 | [(k.replace('.conv.', '.'), v) if '.conv.' 
in k else (k, v) for k, v in state_dict.items()]) 659 | 660 | if in_channels != 3: 661 | if tf_style_conv: 662 | state_dict.pop('stem_conv.conv.weight') 663 | else: 664 | state_dict.pop('stem_conv.weight') 665 | strict = False 666 | 667 | if n_classes != 1000: 668 | state_dict.pop('fc.weight') 669 | state_dict.pop('fc.bias') 670 | strict = False 671 | 672 | self.load_state_dict(state_dict, strict=strict) 673 | print("Model weights loaded successfully.") 674 | 675 | def get_dropconnect_rates(self, drop_connect_rate): 676 | nr = self.cfg['num_repeat'] 677 | dc = self.cfg['depth_coefficient'] 678 | total = sum(round_repeats(nr[i], dc) for i in range(len(nr))) 679 | return [drop_connect_rate * i / total for i in range(total)] 680 | 681 | def get_features(self, x): 682 | x = self.stem_act(self.stem_bn(self.stem_conv(x))) 683 | 684 | features = [] 685 | feat_idx = 0 686 | for block_idx, block in enumerate(self.blocks): 687 | x = block(x) 688 | if block_idx == self.feature_block_ids[feat_idx]: 689 | features.append(x) 690 | feat_idx += 1 691 | 692 | return features 693 | 694 | def forward(self, x): 695 | x = self.stem_act(self.stem_bn(self.stem_conv(x))) 696 | for block in self.blocks: 697 | x = block(x) 698 | x = self.head_act(self.head_bn(self.head_conv(x))) 699 | x = self.dropout(torch.flatten(self.avpool(x), 1)) 700 | x = self.fc(x) 701 | 702 | return x 703 | -------------------------------------------------------------------------------- /imagenet_eval.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "import torch\n", 10 | "import torchvision.transforms as transforms\n", 11 | "import torchvision.datasets as datasets\n", 12 | "from sklearn.metrics import accuracy_score\n", 13 | "from PIL import Image\n", 14 | "\n", 15 | "from efficientnet import EfficientNet\n", 16 | "from efficientnet_v2 import EfficientNetV2\n" 17 | ] 18 | }, 19 | { 20 | "cell_type": "code", 21 | "execution_count": null, 22 | "metadata": {}, 23 | "outputs": [], 24 | "source": [ 25 | "def eval_model(model, dataloader, device, criterion=None):\n", 26 | " loss_value = []\n", 27 | " y_pred = []\n", 28 | " y_true = []\n", 29 | "\n", 30 | " model.eval()\n", 31 | " with torch.no_grad():\n", 32 | " for xb, yb in dataloader:\n", 33 | " xb, yb = xb.to(device), yb.to(device)\n", 34 | " out = model(xb)\n", 35 | " if out.size(1) == 1:\n", 36 | " # regression\n", 37 | " out = torch.squeeze(out, 1)\n", 38 | "\n", 39 | " if criterion is not None:\n", 40 | " loss = criterion(out, yb)\n", 41 | " loss_value.append(loss.item())\n", 42 | "\n", 43 | " y_pred.append(out.detach().cpu())\n", 44 | " y_true.append(yb.detach().cpu())\n", 45 | "\n", 46 | " if criterion is not None:\n", 47 | " loss_value = sum(loss_value) / len(loss_value)\n", 48 | " return torch.cat(y_pred), torch.cat(y_true), loss_value\n", 49 | " else:\n", 50 | " return torch.cat(y_pred), torch.cat(y_true)\n" 51 | ] 52 | }, 53 | { 54 | "cell_type": "markdown", 55 | "metadata": {}, 56 | "source": [ 57 | "## EfficientNetV2" 58 | ] 59 | }, 60 | { 61 | "cell_type": "code", 62 | "execution_count": null, 63 | "metadata": {}, 64 | "outputs": [], 65 | "source": [ 66 | "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n", 67 | "\n", 68 | "modelname = 's'\n", 69 | "in_spatial_shape = EfficientNetV2._models[modelname]['eval_size']\n", 70 | "\n", 71 | "# Setting tf_style_conv=True and 
in_spatial_shape is only necessary when evaluating against the ImageNet dataset\n", 72 | "# Model names: 'b0', 'b1', 'b2', 'b3', 's', 'm', 'l', 'xl'\n", 73 | "model = EfficientNetV2(modelname,\n", 74 | " tf_style_conv=True,\n", 75 | " in_spatial_shape=in_spatial_shape,\n", 76 | " pretrained=True,\n", 77 | " progress=True)\n", 78 | "model.to(device)\n", 79 | "\n", 80 | "val_transforms = transforms.Compose([\n", 81 | " transforms.Resize(in_spatial_shape,\n", 82 | " interpolation=transforms.InterpolationMode.BICUBIC),\n", 83 | " transforms.CenterCrop(in_spatial_shape),\n", 84 | " transforms.ToTensor(),\n", 85 | " transforms.Normalize(mean=0.5,\n", 86 | " std=0.5),\n", 87 | "])\n", 88 | "\n", 89 | "val_dataset = datasets.ImageNet(root=\"/path/to/imagenet/val/subset\", split=\"val\",\n", 90 | " transform=val_transforms)\n", 91 | "\n", 92 | "val_loader = torch.utils.data.DataLoader(\n", 93 | " val_dataset,\n", 94 | " batch_size=32, shuffle=False,\n", 95 | " num_workers=2, pin_memory=True)\n" 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": null, 101 | "metadata": {}, 102 | "outputs": [], 103 | "source": [ 104 | "y_pred, y_true = eval_model(model, val_loader, device)\n", 105 | "_, y_pred = torch.max(y_pred, 1)\n", 106 | "\n", 107 | "score = accuracy_score(y_true, y_pred)\n", 108 | "print(\"Accuracy: {:.3%}\".format(score))\n" 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "metadata": {}, 114 | "source": [ 115 | "Expected evaluation metric values on ImageNet validation set \n", 116 | "\n", 117 | "EfficientNetV2-b0 - 77.590%
\n", 118 | "EfficientNetV2-b1 - 78.872%
\n", 119 | "EfficientNetV2-b2 - 79.388%
\n", 120 | "EfficientNetV2-b3 - 82.260%
\n", 121 | "EfficientNetV2-S - 84.282%
\n", 122 | "EfficientNetV2-M - 85.596%
\n", 123 | "EfficientNetV2-L - 86.298%
\n", 124 | "EfficientNetV2-XL - 86.414%
" 125 | ] 126 | }, 127 | { 128 | "cell_type": "markdown", 129 | "metadata": {}, 130 | "source": [ 131 | "## EfficientNetV1" 132 | ] 133 | }, 134 | { 135 | "cell_type": "code", 136 | "execution_count": null, 137 | "metadata": {}, 138 | "outputs": [], 139 | "source": [ 140 | "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n", 141 | "\n", 142 | "# EfficientNet model index, i.e. 0 for for EfficientNet-B0\n", 143 | "idx = 0\n", 144 | "model = EfficientNet(idx, pretrained=True, progress=True)\n", 145 | "model.to(device)\n", 146 | "\n", 147 | "val_trainsforms = transforms.Compose([\n", 148 | " transforms.Resize(model.in_spatial_shape[0], interpolation=Image.BICUBIC),\n", 149 | " transforms.CenterCrop(model.in_spatial_shape),\n", 150 | " transforms.ToTensor(),\n", 151 | " transforms.Normalize(mean=[0.485, 0.456, 0.406],\n", 152 | " std=[0.229, 0.224, 0.225]),\n", 153 | "])\n", 154 | "\n", 155 | "\n", 156 | "val_dataset = datasets.ImageNet(root=\"path/to/imagenet/dataset\", split=\"val\",\n", 157 | " transform=val_trainsforms)\n", 158 | "\n", 159 | "val_loader = torch.utils.data.DataLoader(\n", 160 | " val_dataset,\n", 161 | " batch_size=32, shuffle=False,\n", 162 | " num_workers=1, pin_memory=True)\n" 163 | ] 164 | }, 165 | { 166 | "cell_type": "code", 167 | "execution_count": null, 168 | "metadata": {}, 169 | "outputs": [], 170 | "source": [ 171 | "y_pred, y_true = eval_model(model, val_loader, device)\n", 172 | "_, y_pred = torch.max(y_pred, 1)\n", 173 | "\n", 174 | "score = accuracy_score(y_pred, y_true)\n", 175 | "print(\"Accuracy: {:.3%}\".format(score))\n" 176 | ] 177 | }, 178 | { 179 | "cell_type": "markdown", 180 | "metadata": {}, 181 | "source": [ 182 | "Expected evaluation metric values on ImageNet validation set \n", 183 | "\n", 184 | "EfficientNet-B0 - 76.43%
\n", 185 | "EfficientNet-B1 - 78.396%
\n", 186 | "EfficientNet-B2 - 79.804%
\n", 187 | "EfficientNet-B3 - 81.542%
\n", 188 | "EfficientNet-B4 - 83.036%
\n", 189 | "EfficientNet-B5 - 83.79%
\n", 190 | "EfficientNet-B6 - 84.136%
\n", 191 | "EfficientNet-B7 - 84.578%
" 192 | ] 193 | } 194 | ], 195 | "metadata": { 196 | "kernelspec": { 197 | "display_name": "Python 3", 198 | "language": "python", 199 | "name": "python3" 200 | }, 201 | "language_info": { 202 | "codemirror_mode": { 203 | "name": "ipython", 204 | "version": 3 205 | }, 206 | "file_extension": ".py", 207 | "mimetype": "text/x-python", 208 | "name": "python", 209 | "nbconvert_exporter": "python", 210 | "pygments_lexer": "ipython3", 211 | "version": "3.8.10" 212 | } 213 | }, 214 | "nbformat": 4, 215 | "nbformat_minor": 4 216 | } 217 | --------------------------------------------------------------------------------