├── README.md
└── EfficientNet_TransferLearning.ipynb
/README.md:
--------------------------------------------------------------------------------
1 | # EfficientNet - Transfer Learning Implementation
2 | EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks
3 |
4 | ---
5 |
6 | 📌 [**Google Colab Notebook**](https://colab.research.google.com/github/ayyucekizrak/EfficientNet-Transfer-Learning-Implementation/blob/master/EfficientNet_TransferLearning.ipynb)
7 |
8 |
9 | 📌 [**Jupyter Notebook (nbviewer)**](https://nbviewer.jupyter.org/github/ayyucekizrak/EfficientNet-Transfer-Learning-Implementation/blob/master/EfficientNet_TransferLearning.ipynb)
10 |
11 | ---
12 |
13 | ## Reviewing EfficientNet: Increasing the Accuracy and Robustness of CNNs
14 |
15 | 
16 |
17 | *Model Scaling:* (a) is a baseline network; (b)-(d) are conventional scaling methods that increase only one dimension of the network
18 | (width, depth, or resolution); (e) is the compound scaling method proposed in the paper, which uniformly scales all three dimensions with a fixed ratio.
19 |
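The compound scaling rule can be made concrete with the coefficients reported in the paper (found there by a small grid search, under the constraint that total FLOPS roughly double per unit of the compound coefficient); a minimal sketch:

```python
# Compound scaling (EfficientNet paper): for a compound coefficient phi,
# depth is scaled by alpha**phi, width by beta**phi, and input resolution
# by gamma**phi, subject to alpha * beta**2 * gamma**2 ~= 2.
ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15  # coefficients reported in the paper

def compound_scale(phi):
    """Return (depth, width, resolution) multipliers for coefficient phi."""
    return ALPHA ** phi, BETA ** phi, GAMMA ** phi

d, w, r = compound_scale(1)
print(f"phi=1: depth x{d:.2f}, width x{w:.2f}, resolution x{r:.2f}")
```

Increasing `phi` walks up the B0-B7 family: all three dimensions grow together instead of scaling depth, width, or resolution in isolation.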
20 | ---
21 | ### The Effect of Transfer Learning on EfficientNet
22 | ---
23 | :octocat: [EfficientNet: Increasing the Accuracy and Robustness of CNNs](https://medium.com/@ayyucekizrak/%C3%B6l%C3%A7eklendirme-ile-cnn-modelinin-do%C4%9Fruluk-ve-verimlili%C4%9Fini-art%C4%B1rma-efficientnet-cb6f2b6512de): in the application that accompanies this blog post, transfer learning with EfficientNet is performed on the [CIFAR10](https://www.cs.toronto.edu/~kriz/cifar.html) dataset.
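The transfer-learning pattern the notebook uses (a frozen pretrained backbone producing features, with only a newly attached classifier head being trained) can be sketched in miniature with numpy; everything here, including the random stand-in for pretrained weights, is purely illustrative:

```python
import numpy as np

# Transfer learning in miniature: keep a (stand-in) "pretrained" mapping
# frozen and fit only a small linear head on its features -- the same
# pattern the notebook applies with EfficientNet-B0 on CIFAR10.
rng = np.random.default_rng(0)
W_frozen = rng.normal(size=(32, 16))  # stands in for pretrained weights

def extract(x):
    """Frozen feature extractor: never updated during training."""
    return np.maximum(x @ W_frozen, 0.0)

x = rng.normal(size=(200, 32))
y = (x[:, 0] + x[:, 1] > 0).astype(float)  # toy binary labels

feats = extract(x)
w, b = np.zeros(16), 0.0  # only the head's parameters are trained
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # logistic-regression head
    grad = p - y
    w -= 0.5 * feats.T @ grad / len(y)
    b -= 0.5 * grad.mean()

acc = ((p > 0.5) == y).mean()
print(f"head-only training accuracy: {acc:.2f}")
```

In the notebook the frozen part is EfficientNet-B0 with ImageNet weights and the head is a small stack of Dense/BatchNormalization/Dropout layers, but the division of labor is the same.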
24 |
25 | ---
26 |
27 | :chocolate_bar: **References:**
28 |
29 | * [EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks](https://arxiv.org/pdf/1905.11946v3.pdf)
30 |
31 | * [Implementation of EfficientNet model. Keras and TensorFlow Keras](https://github.com/qubvel/efficientnet)
32 | * [Papers with Codes](https://paperswithcode.com/paper/efficientnet-rethinking-model-scaling-for)
33 | * [EfficientNet: Improving Accuracy and Efficiency through AutoML and Model Scaling](https://ai.googleblog.com/2019/05/efficientnet-improving-accuracy-and.html)
34 | * [How to do Transfer learning with Efficientnet](https://www.dlology.com/blog/transfer-learning-with-efficientnet/)
35 | * [Training EfficientNets on TPUs](https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet)
36 |
37 |
--------------------------------------------------------------------------------
/EfficientNet_TransferLearning.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 4,
3 | "nbformat_minor": 0,
4 | "metadata": {
5 | "colab": {
6 | "name": "EfficientNet_TransferLearning.ipynb",
7 | "provenance": [],
8 | "collapsed_sections": []
9 | },
10 | "kernelspec": {
11 | "name": "python3",
12 | "display_name": "Python 3"
13 | },
14 | "accelerator": "GPU"
15 | },
16 | "cells": [
17 | {
18 | "cell_type": "markdown",
19 | "metadata": {
20 | "id": "nyvQkiTu9f9I",
21 | "colab_type": "text"
22 | },
23 | "source": [
24 | "# The Effect of Transfer Learning on EfficientNet\n",
25 | "---\n",
26 | "This notebook accompanies the blog post [EfficientNet: Increasing the Accuracy and Robustness of CNNs](https://medium.com/@ayyucekizrak/%C3%B6l%C3%A7eklendirme-ile-cnn-modelinin-do%C4%9Fruluk-ve-verimlili%C4%9Fini-art%C4%B1rma-efficientnet-cb6f2b6512de); transfer learning with EfficientNet is performed on the [CIFAR10](https://www.cs.toronto.edu/~kriz/cifar.html) dataset.\n",
27 | "\n",
28 | "---\n",
29 | "\n",
30 | "### References: \n",
31 | "\n",
32 | "* [EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks](https://arxiv.org/pdf/1905.11946v3.pdf)\n",
33 | "\n",
34 | "* [Implementation of EfficientNet model. Keras and TensorFlow Keras](https://github.com/qubvel/efficientnet)\n",
35 | "* [Papers with Codes](https://paperswithcode.com/paper/efficientnet-rethinking-model-scaling-for)\n",
36 | "* [EfficientNet: Improving Accuracy and Efficiency through AutoML and Model Scaling](https://ai.googleblog.com/2019/05/efficientnet-improving-accuracy-and.html)\n",
37 | "* [How to do Transfer learning with Efficientnet](https://www.dlology.com/blog/transfer-learning-with-efficientnet/)\n",
38 | "* [Training EfficientNets on TPUs](https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet)\n",
39 | "\n",
40 | "\n",
41 | "\n",
42 | "\n",
43 | "\n",
44 | "\n"
45 | ]
46 | },
47 | {
48 | "cell_type": "markdown",
49 | "metadata": {
50 | "id": "AT0UnIdm_BJL",
51 | "colab_type": "text"
52 | },
53 | "source": [
54 | "**Google Colab Authentication**"
55 | ]
56 | },
57 | {
58 | "cell_type": "code",
59 | "metadata": {
60 | "id": "KkX1le9btip7",
61 | "colab_type": "code",
62 | "outputId": "c253e663-8f11-401e-e859-df4c855768ad",
63 | "colab": {
64 | "base_uri": "https://localhost:8080/",
65 | "height": 139
66 | }
67 | },
68 | "source": [
69 | "from google.colab import drive\n",
70 | "drive.mount('/gdrive')\n",
71 | "%cd /gdrive"
72 | ],
73 | "execution_count": 0,
74 | "outputs": [
75 | {
76 | "output_type": "stream",
77 | "text": [
78 | "Go to this URL in a browser: https://accounts.google.com/o/oauth2/auth?client_id=947318989803-6bn6qk8qdgf4n4g3pfee6491hc0brc4i.apps.googleusercontent.com&redirect_uri=urn%3aietf%3awg%3aoauth%3a2.0%3aoob&response_type=code&scope=email%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdocs.test%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive.photos.readonly%20https%3a%2f%2fwww.googleapis.com%2fauth%2fpeopleapi.readonly\n",
79 | "\n",
80 | "Enter your authorization code:\n",
81 | "··········\n",
82 | "Mounted at /gdrive\n",
83 | "/gdrive\n"
84 | ],
85 | "name": "stdout"
86 | }
87 | ]
88 | },
89 | {
90 | "cell_type": "markdown",
91 | "metadata": {
92 | "id": "WigvCP1TA7yU",
93 | "colab_type": "text"
94 | },
95 | "source": [
96 | "**Installing EfficientNet Source Model**"
97 | ]
98 | },
99 | {
100 | "cell_type": "code",
101 | "metadata": {
102 | "id": "AQ_FmGHutzev",
103 | "colab_type": "code",
104 | "outputId": "2b3092f6-945b-490e-a38c-dca5c906b913",
105 | "colab": {
106 | "base_uri": "https://localhost:8080/",
107 | "height": 513
108 | }
109 | },
110 | "source": [
111 | "import warnings\n",
112 | "warnings.filterwarnings(\"ignore\")\n",
113 | "\n",
114 | "!pip install -U git+https://github.com/qubvel/efficientnet"
115 | ],
116 | "execution_count": 0,
117 | "outputs": [
118 | {
119 | "output_type": "stream",
120 | "text": [
121 | "Collecting git+https://github.com/qubvel/efficientnet\n",
122 | " Cloning https://github.com/qubvel/efficientnet to /tmp/pip-req-build-g5u1k6ri\n",
123 | " Running command git clone -q https://github.com/qubvel/efficientnet /tmp/pip-req-build-g5u1k6ri\n",
124 | "Requirement already satisfied, skipping upgrade: keras_applications<=1.0.8,>=1.0.7 in /usr/local/lib/python3.6/dist-packages (from efficientnet==1.0.0) (1.0.8)\n",
125 | "Requirement already satisfied, skipping upgrade: scikit-image in /usr/local/lib/python3.6/dist-packages (from efficientnet==1.0.0) (0.15.0)\n",
126 | "Requirement already satisfied, skipping upgrade: h5py in /usr/local/lib/python3.6/dist-packages (from keras_applications<=1.0.8,>=1.0.7->efficientnet==1.0.0) (2.8.0)\n",
127 | "Requirement already satisfied, skipping upgrade: numpy>=1.9.1 in /usr/local/lib/python3.6/dist-packages (from keras_applications<=1.0.8,>=1.0.7->efficientnet==1.0.0) (1.17.4)\n",
128 | "Requirement already satisfied, skipping upgrade: matplotlib!=3.0.0,>=2.0.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0) (3.1.2)\n",
129 | "Requirement already satisfied, skipping upgrade: scipy>=0.17.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0) (1.3.3)\n",
130 | "Requirement already satisfied, skipping upgrade: networkx>=2.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0) (2.4)\n",
131 | "Requirement already satisfied, skipping upgrade: PyWavelets>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0) (1.1.1)\n",
132 | "Requirement already satisfied, skipping upgrade: imageio>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0) (2.4.1)\n",
133 | "Requirement already satisfied, skipping upgrade: pillow>=4.3.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image->efficientnet==1.0.0) (4.3.0)\n",
134 | "Requirement already satisfied, skipping upgrade: six in /usr/local/lib/python3.6/dist-packages (from h5py->keras_applications<=1.0.8,>=1.0.7->efficientnet==1.0.0) (1.12.0)\n",
135 | "Requirement already satisfied, skipping upgrade: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0) (0.10.0)\n",
136 | "Requirement already satisfied, skipping upgrade: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0) (1.1.0)\n",
137 | "Requirement already satisfied, skipping upgrade: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0) (2.6.1)\n",
138 | "Requirement already satisfied, skipping upgrade: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0) (2.4.5)\n",
139 | "Requirement already satisfied, skipping upgrade: decorator>=4.3.0 in /usr/local/lib/python3.6/dist-packages (from networkx>=2.0->scikit-image->efficientnet==1.0.0) (4.4.1)\n",
140 | "Requirement already satisfied, skipping upgrade: olefile in /usr/local/lib/python3.6/dist-packages (from pillow>=4.3.0->scikit-image->efficientnet==1.0.0) (0.46)\n",
141 | "Requirement already satisfied, skipping upgrade: setuptools in /usr/local/lib/python3.6/dist-packages (from kiwisolver>=1.0.1->matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet==1.0.0) (42.0.2)\n",
142 | "Building wheels for collected packages: efficientnet\n",
143 | " Building wheel for efficientnet (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
144 | " Created wheel for efficientnet: filename=efficientnet-1.0.0-cp36-none-any.whl size=17686 sha256=9d0d62deadd1272ba48f639789efa44741fa73c49ed45362d41444f4cb46a70c\n",
145 | " Stored in directory: /tmp/pip-ephem-wheel-cache-y8gz5bik/wheels/64/60/2e/30ebaa76ed1626e86bfb0cc0579b737fdb7d9ff8cb9522663a\n",
146 | "Successfully built efficientnet\n",
147 | "Installing collected packages: efficientnet\n",
148 | "Successfully installed efficientnet-1.0.0\n"
149 | ],
150 | "name": "stdout"
151 | }
152 | ]
153 | },
154 | {
155 | "cell_type": "markdown",
156 | "metadata": {
157 | "id": "0787Q6IOBAs4",
158 | "colab_type": "text"
159 | },
160 | "source": [
161 | "**Importing necessary libraries** "
162 | ]
163 | },
164 | {
165 | "cell_type": "code",
166 | "metadata": {
167 | "id": "1SqcWjgot-3P",
168 | "colab_type": "code",
169 | "outputId": "dc0ff868-5d30-4f0f-a2a6-c7723e043b48",
170 | "colab": {
171 | "base_uri": "https://localhost:8080/",
172 | "height": 80
173 | }
174 | },
175 | "source": [
176 | "import keras\n",
177 | "from keras.datasets import cifar10\n",
178 | "from keras.models import Model\n",
179 | "from keras.layers import Dense, Dropout, Activation, BatchNormalization, Flatten\n",
180 | "from keras.callbacks import ModelCheckpoint, ReduceLROnPlateau\n",
181 | "from keras.optimizers import Adam\n",
182 | "import efficientnet.keras as enet"
183 | ],
184 | "execution_count": 0,
185 | "outputs": [
186 | {
187 | "output_type": "stream",
188 | "text": [
189 | "Using TensorFlow backend.\n"
190 | ],
191 | "name": "stderr"
192 | },
193 | {
194 | "output_type": "display_data",
195 | "data": {
196 | "text/html": [
197 | "\n",
198 | "The default version of TensorFlow in Colab will soon switch to TensorFlow 2.x. \n",
199 | "We recommend you upgrade now \n",
200 | "or ensure your notebook will continue to use TensorFlow 1.x via the %tensorflow_version 1.x magic:\n",
201 | "more info.\n"
202 | ],
203 | "text/plain": [
204 | ""
205 | ]
206 | },
207 | "metadata": {
208 | "tags": []
209 | }
210 | }
211 | ]
212 | },
213 | {
214 | "cell_type": "markdown",
215 | "metadata": {
216 | "id": "wAsLNryOBHtt",
217 | "colab_type": "text"
218 | },
219 | "source": [
220 | "**Downloading the CIFAR10 Dataset**"
221 | ]
222 | },
223 | {
224 | "cell_type": "code",
225 | "metadata": {
226 | "id": "sgD7kEGNuDXN",
227 | "colab_type": "code",
228 | "outputId": "742a808d-ac81-4bb9-c8b5-1a72eba44c94",
229 | "colab": {
230 | "base_uri": "https://localhost:8080/",
231 | "height": 102
232 | }
233 | },
234 | "source": [
235 | "# CIFAR10\n",
236 | "\n",
237 | "(x_train, y_train), (x_test, y_test) = cifar10.load_data()\n",
238 | "print('x_train shape:', x_train.shape)\n",
239 | "print(x_train.shape[0], 'train samples')\n",
240 | "print(x_test.shape[0], 'test samples')\n",
241 | "\n",
242 | "# Converting class vectors to binary class matrices\n",
243 | "num_classes = 10\n",
244 | "y_train = keras.utils.to_categorical(y_train, num_classes)\n",
245 | "y_test = keras.utils.to_categorical(y_test, num_classes)\n",
246 | "\n",
247 | "x_train = x_train.astype('float32')\n",
248 | "x_test = x_test.astype('float32')\n",
249 | "x_train /= 255\n",
250 | "x_test /= 255"
251 | ],
252 | "execution_count": 0,
253 | "outputs": [
254 | {
255 | "output_type": "stream",
256 | "text": [
257 | "Downloading data from https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz\n",
258 | "170500096/170498071 [==============================] - 6s 0us/step\n",
259 | "x_train shape: (50000, 32, 32, 3)\n",
260 | "50000 train samples\n",
261 | "10000 test samples\n"
262 | ],
263 | "name": "stdout"
264 | }
265 | ]
266 | },
267 | {
268 | "cell_type": "markdown",
269 | "metadata": {
270 | "id": "KHsbhiAaBcl2",
271 | "colab_type": "text"
272 | },
273 | "source": [
274 | "**Definition of Swish Activation Function**\n",
275 | "\n",
276 | "---\n",
277 | "You can read more about Swish [here](https://towardsdatascience.com/comparison-of-activation-functions-for-deep-neural-networks-706ac4284c8a)!\n",
278 | "\n",
279 | "\n"
280 | ]
281 | },
282 | {
283 | "cell_type": "code",
284 | "metadata": {
285 | "id": "Aw96vBTCuGWH",
286 | "colab_type": "code",
287 | "outputId": "dbe864f1-aacf-4c81-eef1-ad7459633741",
288 | "colab": {
289 | "base_uri": "https://localhost:8080/",
290 | "height": 71
291 | }
292 | },
293 | "source": [
294 | "from keras.backend import sigmoid\n",
295 | "\n",
296 | "class SwishActivation(Activation):\n",
297 | " \n",
298 | " def __init__(self, activation, **kwargs):\n",
299 | " super(SwishActivation, self).__init__(activation, **kwargs)\n",
300 | " self.__name__ = 'swish_act'\n",
301 | "\n",
302 | "def swish_act(x, beta = 1):\n",
303 | " return (x * sigmoid(beta * x))\n",
304 | "\n",
305 | "from keras.utils.generic_utils import get_custom_objects\n",
306 | "from keras.layers import Activation\n",
307 | "get_custom_objects().update({'swish_act': SwishActivation(swish_act)})"
308 | ],
309 | "execution_count": 0,
310 | "outputs": [
311 | {
312 | "output_type": "stream",
313 | "text": [
314 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:66: The name tf.get_default_graph is deprecated. Please use tf.compat.v1.get_default_graph instead.\n",
315 | "\n"
316 | ],
317 | "name": "stdout"
318 | }
319 | ]
320 | },
321 | {
322 | "cell_type": "markdown",
323 | "metadata": {
324 | "id": "LTSv2jvsB3oL",
325 | "colab_type": "text"
326 | },
327 | "source": [
328 | "**Model of EfficientNet-B0** (pre-trained with imagenet)"
329 | ]
330 | },
331 | {
332 | "cell_type": "code",
333 | "metadata": {
334 | "id": "OFttl4isuJX1",
335 | "colab_type": "code",
336 | "outputId": "ebe5c4af-95e8-4291-9034-5a9926688c7b",
337 | "colab": {
338 | "base_uri": "https://localhost:8080/",
339 | "height": 1000
340 | }
341 | },
342 | "source": [
343 | "model = enet.EfficientNetB0(include_top=False, input_shape=(32,32,3), pooling='avg', weights='imagenet')\n",
344 | "\n",
345 | "# Adding 2 fully-connected layers to B0.\n",
346 | "x = model.output\n",
347 | "\n",
348 | "x = BatchNormalization()(x)\n",
349 | "x = Dropout(0.7)(x)\n",
350 | "\n",
351 | "x = Dense(512)(x)\n",
352 | "x = BatchNormalization()(x)\n",
353 | "x = Activation(swish_act)(x)\n",
354 | "x = Dropout(0.5)(x)\n",
355 | "\n",
356 | "x = Dense(128)(x)\n",
357 | "x = BatchNormalization()(x)\n",
358 | "x = Activation(swish_act)(x)\n",
359 | "\n",
360 | "# Output layer\n",
361 | "predictions = Dense(10, activation=\"softmax\")(x)\n",
362 | "\n",
363 | "model_final = Model(inputs = model.input, outputs = predictions)\n",
364 | "\n",
365 | "model_final.summary()"
366 | ],
367 | "execution_count": 0,
368 | "outputs": [
369 | {
370 | "output_type": "stream",
371 | "text": [
372 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:541: The name tf.placeholder is deprecated. Please use tf.compat.v1.placeholder instead.\n",
373 | "\n",
374 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:4479: The name tf.truncated_normal is deprecated. Please use tf.random.truncated_normal instead.\n",
375 | "\n",
376 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:190: The name tf.get_default_session is deprecated. Please use tf.compat.v1.get_default_session instead.\n",
377 | "\n",
378 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:197: The name tf.ConfigProto is deprecated. Please use tf.compat.v1.ConfigProto instead.\n",
379 | "\n",
380 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:203: The name tf.Session is deprecated. Please use tf.compat.v1.Session instead.\n",
381 | "\n",
382 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:207: The name tf.global_variables is deprecated. Please use tf.compat.v1.global_variables instead.\n",
383 | "\n",
384 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:216: The name tf.is_variable_initialized is deprecated. Please use tf.compat.v1.is_variable_initialized instead.\n",
385 | "\n",
386 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:223: The name tf.variables_initializer is deprecated. Please use tf.compat.v1.variables_initializer instead.\n",
387 | "\n",
388 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:2041: The name tf.nn.fused_batch_norm is deprecated. Please use tf.compat.v1.nn.fused_batch_norm instead.\n",
389 | "\n",
390 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:148: The name tf.placeholder_with_default is deprecated. Please use tf.compat.v1.placeholder_with_default instead.\n",
391 | "\n",
392 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:3733: calling dropout (from tensorflow.python.ops.nn_ops) with keep_prob is deprecated and will be removed in a future version.\n",
393 | "Instructions for updating:\n",
394 | "Please use `rate` instead of `keep_prob`. Rate should be set to `rate = 1 - keep_prob`.\n",
395 | "Downloading data from https://github.com/Callidior/keras-applications/releases/download/efficientnet/efficientnet-b0_weights_tf_dim_ordering_tf_kernels_autoaugment_notop.h5\n",
396 | "16809984/16804768 [==============================] - 1s 0us/step\n",
397 | "WARNING:tensorflow:Large dropout rate: 0.7 (>0.5). In TensorFlow 2.x, dropout() uses dropout rate instead of keep_prob. Please ensure that this is intended.\n",
398 | "WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py:4432: The name tf.random_uniform is deprecated. Please use tf.random.uniform instead.\n",
399 | "\n",
400 | "Model: \"model_1\"\n",
401 | "__________________________________________________________________________________________________\n",
402 | "Layer (type) Output Shape Param # Connected to \n",
403 | "==================================================================================================\n",
404 | "input_1 (InputLayer) (None, 32, 32, 3) 0 \n",
405 | "__________________________________________________________________________________________________\n",
406 | "stem_conv (Conv2D) (None, 16, 16, 32) 864 input_1[0][0] \n",
407 | "__________________________________________________________________________________________________\n",
408 | "stem_bn (BatchNormalization) (None, 16, 16, 32) 128 stem_conv[0][0] \n",
409 | "__________________________________________________________________________________________________\n",
410 | "stem_activation (Activation) (None, 16, 16, 32) 0 stem_bn[0][0] \n",
411 | "__________________________________________________________________________________________________\n",
412 | "block1a_dwconv (DepthwiseConv2D (None, 16, 16, 32) 288 stem_activation[0][0] \n",
413 | "__________________________________________________________________________________________________\n",
414 | "block1a_bn (BatchNormalization) (None, 16, 16, 32) 128 block1a_dwconv[0][0] \n",
415 | "__________________________________________________________________________________________________\n",
416 | "block1a_activation (Activation) (None, 16, 16, 32) 0 block1a_bn[0][0] \n",
417 | "__________________________________________________________________________________________________\n",
418 | "block1a_se_squeeze (GlobalAvera (None, 32) 0 block1a_activation[0][0] \n",
419 | "__________________________________________________________________________________________________\n",
420 | "block1a_se_reshape (Reshape) (None, 1, 1, 32) 0 block1a_se_squeeze[0][0] \n",
421 | "__________________________________________________________________________________________________\n",
422 | "block1a_se_reduce (Conv2D) (None, 1, 1, 8) 264 block1a_se_reshape[0][0] \n",
423 | "__________________________________________________________________________________________________\n",
424 | "block1a_se_expand (Conv2D) (None, 1, 1, 32) 288 block1a_se_reduce[0][0] \n",
425 | "__________________________________________________________________________________________________\n",
426 | "block1a_se_excite (Multiply) (None, 16, 16, 32) 0 block1a_activation[0][0] \n",
427 | " block1a_se_expand[0][0] \n",
428 | "__________________________________________________________________________________________________\n",
429 | "block1a_project_conv (Conv2D) (None, 16, 16, 16) 512 block1a_se_excite[0][0] \n",
430 | "__________________________________________________________________________________________________\n",
431 | "block1a_project_bn (BatchNormal (None, 16, 16, 16) 64 block1a_project_conv[0][0] \n",
432 | "__________________________________________________________________________________________________\n",
433 | "block2a_expand_conv (Conv2D) (None, 16, 16, 96) 1536 block1a_project_bn[0][0] \n",
434 | "__________________________________________________________________________________________________\n",
435 | "block2a_expand_bn (BatchNormali (None, 16, 16, 96) 384 block2a_expand_conv[0][0] \n",
436 | "__________________________________________________________________________________________________\n",
437 | "block2a_expand_activation (Acti (None, 16, 16, 96) 0 block2a_expand_bn[0][0] \n",
438 | "__________________________________________________________________________________________________\n",
439 | "block2a_dwconv (DepthwiseConv2D (None, 8, 8, 96) 864 block2a_expand_activation[0][0] \n",
440 | "__________________________________________________________________________________________________\n",
441 | "block2a_bn (BatchNormalization) (None, 8, 8, 96) 384 block2a_dwconv[0][0] \n",
442 | "__________________________________________________________________________________________________\n",
443 | "block2a_activation (Activation) (None, 8, 8, 96) 0 block2a_bn[0][0] \n",
444 | "__________________________________________________________________________________________________\n",
445 | "block2a_se_squeeze (GlobalAvera (None, 96) 0 block2a_activation[0][0] \n",
446 | "__________________________________________________________________________________________________\n",
447 | "block2a_se_reshape (Reshape) (None, 1, 1, 96) 0 block2a_se_squeeze[0][0] \n",
448 | "__________________________________________________________________________________________________\n",
449 | "block2a_se_reduce (Conv2D) (None, 1, 1, 4) 388 block2a_se_reshape[0][0] \n",
450 | "__________________________________________________________________________________________________\n",
451 | "block2a_se_expand (Conv2D) (None, 1, 1, 96) 480 block2a_se_reduce[0][0] \n",
452 | "__________________________________________________________________________________________________\n",
453 | "block2a_se_excite (Multiply) (None, 8, 8, 96) 0 block2a_activation[0][0] \n",
454 | " block2a_se_expand[0][0] \n",
455 | "__________________________________________________________________________________________________\n",
456 | "block2a_project_conv (Conv2D) (None, 8, 8, 24) 2304 block2a_se_excite[0][0] \n",
457 | "__________________________________________________________________________________________________\n",
458 | "block2a_project_bn (BatchNormal (None, 8, 8, 24) 96 block2a_project_conv[0][0] \n",
459 | "__________________________________________________________________________________________________\n",
460 | "block2b_expand_conv (Conv2D) (None, 8, 8, 144) 3456 block2a_project_bn[0][0] \n",
461 | "__________________________________________________________________________________________________\n",
462 | "block2b_expand_bn (BatchNormali (None, 8, 8, 144) 576 block2b_expand_conv[0][0] \n",
463 | "__________________________________________________________________________________________________\n",
464 | "block2b_expand_activation (Acti (None, 8, 8, 144) 0 block2b_expand_bn[0][0] \n",
465 | "__________________________________________________________________________________________________\n",
466 | "block2b_dwconv (DepthwiseConv2D (None, 8, 8, 144) 1296 block2b_expand_activation[0][0] \n",
467 | "__________________________________________________________________________________________________\n",
468 | "block2b_bn (BatchNormalization) (None, 8, 8, 144) 576 block2b_dwconv[0][0] \n",
469 | "__________________________________________________________________________________________________\n",
470 | "block2b_activation (Activation) (None, 8, 8, 144) 0 block2b_bn[0][0] \n",
471 | "__________________________________________________________________________________________________\n",
472 | "block2b_se_squeeze (GlobalAvera (None, 144) 0 block2b_activation[0][0] \n",
473 | "__________________________________________________________________________________________________\n",
474 | "block2b_se_reshape (Reshape) (None, 1, 1, 144) 0 block2b_se_squeeze[0][0] \n",
475 | "__________________________________________________________________________________________________\n",
476 | "block2b_se_reduce (Conv2D) (None, 1, 1, 6) 870 block2b_se_reshape[0][0] \n",
477 | "__________________________________________________________________________________________________\n",
478 | "block2b_se_expand (Conv2D) (None, 1, 1, 144) 1008 block2b_se_reduce[0][0] \n",
479 | "__________________________________________________________________________________________________\n",
480 | "block2b_se_excite (Multiply) (None, 8, 8, 144) 0 block2b_activation[0][0] \n",
481 | " block2b_se_expand[0][0] \n",
482 | "__________________________________________________________________________________________________\n",
483 | "block2b_project_conv (Conv2D) (None, 8, 8, 24) 3456 block2b_se_excite[0][0] \n",
484 | "__________________________________________________________________________________________________\n",
485 | "block2b_project_bn (BatchNormal (None, 8, 8, 24) 96 block2b_project_conv[0][0] \n",
486 | "__________________________________________________________________________________________________\n",
487 | "block2b_drop (FixedDropout) (None, 8, 8, 24) 0 block2b_project_bn[0][0] \n",
488 | "__________________________________________________________________________________________________\n",
489 | "block2b_add (Add) (None, 8, 8, 24) 0 block2b_drop[0][0] \n",
490 | " block2a_project_bn[0][0] \n",
491 | "__________________________________________________________________________________________________\n",
492 | "block3a_expand_conv (Conv2D) (None, 8, 8, 144) 3456 block2b_add[0][0] \n",
493 | "__________________________________________________________________________________________________\n",
494 | "block3a_expand_bn (BatchNormali (None, 8, 8, 144) 576 block3a_expand_conv[0][0] \n",
495 | "__________________________________________________________________________________________________\n",
496 | "block3a_expand_activation (Acti (None, 8, 8, 144) 0 block3a_expand_bn[0][0] \n",
497 | "__________________________________________________________________________________________________\n",
498 | "block3a_dwconv (DepthwiseConv2D (None, 4, 4, 144) 3600 block3a_expand_activation[0][0] \n",
499 | "__________________________________________________________________________________________________\n",
500 | "block3a_bn (BatchNormalization) (None, 4, 4, 144) 576 block3a_dwconv[0][0] \n",
501 | "__________________________________________________________________________________________________\n",
502 | "block3a_activation (Activation) (None, 4, 4, 144) 0 block3a_bn[0][0] \n",
503 | "__________________________________________________________________________________________________\n",
504 | "block3a_se_squeeze (GlobalAvera (None, 144) 0 block3a_activation[0][0] \n",
505 | "__________________________________________________________________________________________________\n",
506 | "block3a_se_reshape (Reshape) (None, 1, 1, 144) 0 block3a_se_squeeze[0][0] \n",
507 | "__________________________________________________________________________________________________\n",
508 | "block3a_se_reduce (Conv2D) (None, 1, 1, 6) 870 block3a_se_reshape[0][0] \n",
509 | "__________________________________________________________________________________________________\n",
510 | "block3a_se_expand (Conv2D) (None, 1, 1, 144) 1008 block3a_se_reduce[0][0] \n",
511 | "__________________________________________________________________________________________________\n",
512 | "block3a_se_excite (Multiply) (None, 4, 4, 144) 0 block3a_activation[0][0] \n",
513 | " block3a_se_expand[0][0] \n",
514 | "__________________________________________________________________________________________________\n",
515 | "block3a_project_conv (Conv2D) (None, 4, 4, 40) 5760 block3a_se_excite[0][0] \n",
516 | "__________________________________________________________________________________________________\n",
517 | "block3a_project_bn (BatchNormal (None, 4, 4, 40) 160 block3a_project_conv[0][0] \n",
518 | "__________________________________________________________________________________________________\n",
519 | "block3b_expand_conv (Conv2D) (None, 4, 4, 240) 9600 block3a_project_bn[0][0] \n",
520 | "__________________________________________________________________________________________________\n",
521 | "block3b_expand_bn (BatchNormali (None, 4, 4, 240) 960 block3b_expand_conv[0][0] \n",
522 | "__________________________________________________________________________________________________\n",
523 | "block3b_expand_activation (Acti (None, 4, 4, 240) 0 block3b_expand_bn[0][0] \n",
524 | "__________________________________________________________________________________________________\n",
525 | "block3b_dwconv (DepthwiseConv2D (None, 4, 4, 240) 6000 block3b_expand_activation[0][0] \n",
526 | "__________________________________________________________________________________________________\n",
527 | "block3b_bn (BatchNormalization) (None, 4, 4, 240) 960 block3b_dwconv[0][0] \n",
528 | "__________________________________________________________________________________________________\n",
529 | "block3b_activation (Activation) (None, 4, 4, 240) 0 block3b_bn[0][0] \n",
530 | "__________________________________________________________________________________________________\n",
531 | "block3b_se_squeeze (GlobalAvera (None, 240) 0 block3b_activation[0][0] \n",
532 | "__________________________________________________________________________________________________\n",
533 | "block3b_se_reshape (Reshape) (None, 1, 1, 240) 0 block3b_se_squeeze[0][0] \n",
534 | "__________________________________________________________________________________________________\n",
535 | "block3b_se_reduce (Conv2D) (None, 1, 1, 10) 2410 block3b_se_reshape[0][0] \n",
536 | "__________________________________________________________________________________________________\n",
537 | "block3b_se_expand (Conv2D) (None, 1, 1, 240) 2640 block3b_se_reduce[0][0] \n",
538 | "__________________________________________________________________________________________________\n",
539 | "block3b_se_excite (Multiply) (None, 4, 4, 240) 0 block3b_activation[0][0] \n",
540 | " block3b_se_expand[0][0] \n",
541 | "__________________________________________________________________________________________________\n",
542 | "block3b_project_conv (Conv2D) (None, 4, 4, 40) 9600 block3b_se_excite[0][0] \n",
543 | "__________________________________________________________________________________________________\n",
544 | "block3b_project_bn (BatchNormal (None, 4, 4, 40) 160 block3b_project_conv[0][0] \n",
545 | "__________________________________________________________________________________________________\n",
546 | "block3b_drop (FixedDropout) (None, 4, 4, 40) 0 block3b_project_bn[0][0] \n",
547 | "__________________________________________________________________________________________________\n",
548 | "block3b_add (Add) (None, 4, 4, 40) 0 block3b_drop[0][0] \n",
549 | " block3a_project_bn[0][0] \n",
550 | "__________________________________________________________________________________________________\n",
551 | "block4a_expand_conv (Conv2D) (None, 4, 4, 240) 9600 block3b_add[0][0] \n",
552 | "__________________________________________________________________________________________________\n",
553 | "block4a_expand_bn (BatchNormali (None, 4, 4, 240) 960 block4a_expand_conv[0][0] \n",
554 | "__________________________________________________________________________________________________\n",
555 | "block4a_expand_activation (Acti (None, 4, 4, 240) 0 block4a_expand_bn[0][0] \n",
556 | "__________________________________________________________________________________________________\n",
557 | "block4a_dwconv (DepthwiseConv2D (None, 2, 2, 240) 2160 block4a_expand_activation[0][0] \n",
558 | "__________________________________________________________________________________________________\n",
559 | "block4a_bn (BatchNormalization) (None, 2, 2, 240) 960 block4a_dwconv[0][0] \n",
560 | "__________________________________________________________________________________________________\n",
561 | "block4a_activation (Activation) (None, 2, 2, 240) 0 block4a_bn[0][0] \n",
562 | "__________________________________________________________________________________________________\n",
563 | "block4a_se_squeeze (GlobalAvera (None, 240) 0 block4a_activation[0][0] \n",
564 | "__________________________________________________________________________________________________\n",
565 | "block4a_se_reshape (Reshape) (None, 1, 1, 240) 0 block4a_se_squeeze[0][0] \n",
566 | "__________________________________________________________________________________________________\n",
567 | "block4a_se_reduce (Conv2D) (None, 1, 1, 10) 2410 block4a_se_reshape[0][0] \n",
568 | "__________________________________________________________________________________________________\n",
569 | "block4a_se_expand (Conv2D) (None, 1, 1, 240) 2640 block4a_se_reduce[0][0] \n",
570 | "__________________________________________________________________________________________________\n",
571 | "block4a_se_excite (Multiply) (None, 2, 2, 240) 0 block4a_activation[0][0] \n",
572 | " block4a_se_expand[0][0] \n",
573 | "__________________________________________________________________________________________________\n",
574 | "block4a_project_conv (Conv2D) (None, 2, 2, 80) 19200 block4a_se_excite[0][0] \n",
575 | "__________________________________________________________________________________________________\n",
576 | "block4a_project_bn (BatchNormal (None, 2, 2, 80) 320 block4a_project_conv[0][0] \n",
577 | "__________________________________________________________________________________________________\n",
578 | "block4b_expand_conv (Conv2D) (None, 2, 2, 480) 38400 block4a_project_bn[0][0] \n",
579 | "__________________________________________________________________________________________________\n",
580 | "block4b_expand_bn (BatchNormali (None, 2, 2, 480) 1920 block4b_expand_conv[0][0] \n",
581 | "__________________________________________________________________________________________________\n",
582 | "block4b_expand_activation (Acti (None, 2, 2, 480) 0 block4b_expand_bn[0][0] \n",
583 | "__________________________________________________________________________________________________\n",
584 | "block4b_dwconv (DepthwiseConv2D (None, 2, 2, 480) 4320 block4b_expand_activation[0][0] \n",
585 | "__________________________________________________________________________________________________\n",
586 | "block4b_bn (BatchNormalization) (None, 2, 2, 480) 1920 block4b_dwconv[0][0] \n",
587 | "__________________________________________________________________________________________________\n",
588 | "block4b_activation (Activation) (None, 2, 2, 480) 0 block4b_bn[0][0] \n",
589 | "__________________________________________________________________________________________________\n",
590 | "block4b_se_squeeze (GlobalAvera (None, 480) 0 block4b_activation[0][0] \n",
591 | "__________________________________________________________________________________________________\n",
592 | "block4b_se_reshape (Reshape) (None, 1, 1, 480) 0 block4b_se_squeeze[0][0] \n",
593 | "__________________________________________________________________________________________________\n",
594 | "block4b_se_reduce (Conv2D) (None, 1, 1, 20) 9620 block4b_se_reshape[0][0] \n",
595 | "__________________________________________________________________________________________________\n",
596 | "block4b_se_expand (Conv2D) (None, 1, 1, 480) 10080 block4b_se_reduce[0][0] \n",
597 | "__________________________________________________________________________________________________\n",
598 | "block4b_se_excite (Multiply) (None, 2, 2, 480) 0 block4b_activation[0][0] \n",
599 | " block4b_se_expand[0][0] \n",
600 | "__________________________________________________________________________________________________\n",
601 | "block4b_project_conv (Conv2D) (None, 2, 2, 80) 38400 block4b_se_excite[0][0] \n",
602 | "__________________________________________________________________________________________________\n",
603 | "block4b_project_bn (BatchNormal (None, 2, 2, 80) 320 block4b_project_conv[0][0] \n",
604 | "__________________________________________________________________________________________________\n",
605 | "block4b_drop (FixedDropout) (None, 2, 2, 80) 0 block4b_project_bn[0][0] \n",
606 | "__________________________________________________________________________________________________\n",
607 | "block4b_add (Add) (None, 2, 2, 80) 0 block4b_drop[0][0] \n",
608 | " block4a_project_bn[0][0] \n",
609 | "__________________________________________________________________________________________________\n",
610 | "block4c_expand_conv (Conv2D) (None, 2, 2, 480) 38400 block4b_add[0][0] \n",
611 | "__________________________________________________________________________________________________\n",
612 | "block4c_expand_bn (BatchNormali (None, 2, 2, 480) 1920 block4c_expand_conv[0][0] \n",
613 | "__________________________________________________________________________________________________\n",
614 | "block4c_expand_activation (Acti (None, 2, 2, 480) 0 block4c_expand_bn[0][0] \n",
615 | "__________________________________________________________________________________________________\n",
616 | "block4c_dwconv (DepthwiseConv2D (None, 2, 2, 480) 4320 block4c_expand_activation[0][0] \n",
617 | "__________________________________________________________________________________________________\n",
618 | "block4c_bn (BatchNormalization) (None, 2, 2, 480) 1920 block4c_dwconv[0][0] \n",
619 | "__________________________________________________________________________________________________\n",
620 | "block4c_activation (Activation) (None, 2, 2, 480) 0 block4c_bn[0][0] \n",
621 | "__________________________________________________________________________________________________\n",
622 | "block4c_se_squeeze (GlobalAvera (None, 480) 0 block4c_activation[0][0] \n",
623 | "__________________________________________________________________________________________________\n",
624 | "block4c_se_reshape (Reshape) (None, 1, 1, 480) 0 block4c_se_squeeze[0][0] \n",
625 | "__________________________________________________________________________________________________\n",
626 | "block4c_se_reduce (Conv2D) (None, 1, 1, 20) 9620 block4c_se_reshape[0][0] \n",
627 | "__________________________________________________________________________________________________\n",
628 | "block4c_se_expand (Conv2D) (None, 1, 1, 480) 10080 block4c_se_reduce[0][0] \n",
629 | "__________________________________________________________________________________________________\n",
630 | "block4c_se_excite (Multiply) (None, 2, 2, 480) 0 block4c_activation[0][0] \n",
631 | " block4c_se_expand[0][0] \n",
632 | "__________________________________________________________________________________________________\n",
633 | "block4c_project_conv (Conv2D) (None, 2, 2, 80) 38400 block4c_se_excite[0][0] \n",
634 | "__________________________________________________________________________________________________\n",
635 | "block4c_project_bn (BatchNormal (None, 2, 2, 80) 320 block4c_project_conv[0][0] \n",
636 | "__________________________________________________________________________________________________\n",
637 | "block4c_drop (FixedDropout) (None, 2, 2, 80) 0 block4c_project_bn[0][0] \n",
638 | "__________________________________________________________________________________________________\n",
639 | "block4c_add (Add) (None, 2, 2, 80) 0 block4c_drop[0][0] \n",
640 | " block4b_add[0][0] \n",
641 | "__________________________________________________________________________________________________\n",
642 | "block5a_expand_conv (Conv2D) (None, 2, 2, 480) 38400 block4c_add[0][0] \n",
643 | "__________________________________________________________________________________________________\n",
644 | "block5a_expand_bn (BatchNormali (None, 2, 2, 480) 1920 block5a_expand_conv[0][0] \n",
645 | "__________________________________________________________________________________________________\n",
646 | "block5a_expand_activation (Acti (None, 2, 2, 480) 0 block5a_expand_bn[0][0] \n",
647 | "__________________________________________________________________________________________________\n",
648 | "block5a_dwconv (DepthwiseConv2D (None, 2, 2, 480) 12000 block5a_expand_activation[0][0] \n",
649 | "__________________________________________________________________________________________________\n",
650 | "block5a_bn (BatchNormalization) (None, 2, 2, 480) 1920 block5a_dwconv[0][0] \n",
651 | "__________________________________________________________________________________________________\n",
652 | "block5a_activation (Activation) (None, 2, 2, 480) 0 block5a_bn[0][0] \n",
653 | "__________________________________________________________________________________________________\n",
654 | "block5a_se_squeeze (GlobalAvera (None, 480) 0 block5a_activation[0][0] \n",
655 | "__________________________________________________________________________________________________\n",
656 | "block5a_se_reshape (Reshape) (None, 1, 1, 480) 0 block5a_se_squeeze[0][0] \n",
657 | "__________________________________________________________________________________________________\n",
658 | "block5a_se_reduce (Conv2D) (None, 1, 1, 20) 9620 block5a_se_reshape[0][0] \n",
659 | "__________________________________________________________________________________________________\n",
660 | "block5a_se_expand (Conv2D) (None, 1, 1, 480) 10080 block5a_se_reduce[0][0] \n",
661 | "__________________________________________________________________________________________________\n",
662 | "block5a_se_excite (Multiply) (None, 2, 2, 480) 0 block5a_activation[0][0] \n",
663 | " block5a_se_expand[0][0] \n",
664 | "__________________________________________________________________________________________________\n",
665 | "block5a_project_conv (Conv2D) (None, 2, 2, 112) 53760 block5a_se_excite[0][0] \n",
666 | "__________________________________________________________________________________________________\n",
667 | "block5a_project_bn (BatchNormal (None, 2, 2, 112) 448 block5a_project_conv[0][0] \n",
668 | "__________________________________________________________________________________________________\n",
669 | "block5b_expand_conv (Conv2D) (None, 2, 2, 672) 75264 block5a_project_bn[0][0] \n",
670 | "__________________________________________________________________________________________________\n",
671 | "block5b_expand_bn (BatchNormali (None, 2, 2, 672) 2688 block5b_expand_conv[0][0] \n",
672 | "__________________________________________________________________________________________________\n",
673 | "block5b_expand_activation (Acti (None, 2, 2, 672) 0 block5b_expand_bn[0][0] \n",
674 | "__________________________________________________________________________________________________\n",
675 | "block5b_dwconv (DepthwiseConv2D (None, 2, 2, 672) 16800 block5b_expand_activation[0][0] \n",
676 | "__________________________________________________________________________________________________\n",
677 | "block5b_bn (BatchNormalization) (None, 2, 2, 672) 2688 block5b_dwconv[0][0] \n",
678 | "__________________________________________________________________________________________________\n",
679 | "block5b_activation (Activation) (None, 2, 2, 672) 0 block5b_bn[0][0] \n",
680 | "__________________________________________________________________________________________________\n",
681 | "block5b_se_squeeze (GlobalAvera (None, 672) 0 block5b_activation[0][0] \n",
682 | "__________________________________________________________________________________________________\n",
683 | "block5b_se_reshape (Reshape) (None, 1, 1, 672) 0 block5b_se_squeeze[0][0] \n",
684 | "__________________________________________________________________________________________________\n",
685 | "block5b_se_reduce (Conv2D) (None, 1, 1, 28) 18844 block5b_se_reshape[0][0] \n",
686 | "__________________________________________________________________________________________________\n",
687 | "block5b_se_expand (Conv2D) (None, 1, 1, 672) 19488 block5b_se_reduce[0][0] \n",
688 | "__________________________________________________________________________________________________\n",
689 | "block5b_se_excite (Multiply) (None, 2, 2, 672) 0 block5b_activation[0][0] \n",
690 | " block5b_se_expand[0][0] \n",
691 | "__________________________________________________________________________________________________\n",
692 | "block5b_project_conv (Conv2D) (None, 2, 2, 112) 75264 block5b_se_excite[0][0] \n",
693 | "__________________________________________________________________________________________________\n",
694 | "block5b_project_bn (BatchNormal (None, 2, 2, 112) 448 block5b_project_conv[0][0] \n",
695 | "__________________________________________________________________________________________________\n",
696 | "block5b_drop (FixedDropout) (None, 2, 2, 112) 0 block5b_project_bn[0][0] \n",
697 | "__________________________________________________________________________________________________\n",
698 | "block5b_add (Add) (None, 2, 2, 112) 0 block5b_drop[0][0] \n",
699 | " block5a_project_bn[0][0] \n",
700 | "__________________________________________________________________________________________________\n",
701 | "block5c_expand_conv (Conv2D) (None, 2, 2, 672) 75264 block5b_add[0][0] \n",
702 | "__________________________________________________________________________________________________\n",
703 | "block5c_expand_bn (BatchNormali (None, 2, 2, 672) 2688 block5c_expand_conv[0][0] \n",
704 | "__________________________________________________________________________________________________\n",
705 | "block5c_expand_activation (Acti (None, 2, 2, 672) 0 block5c_expand_bn[0][0] \n",
706 | "__________________________________________________________________________________________________\n",
707 | "block5c_dwconv (DepthwiseConv2D (None, 2, 2, 672) 16800 block5c_expand_activation[0][0] \n",
708 | "__________________________________________________________________________________________________\n",
709 | "block5c_bn (BatchNormalization) (None, 2, 2, 672) 2688 block5c_dwconv[0][0] \n",
710 | "__________________________________________________________________________________________________\n",
711 | "block5c_activation (Activation) (None, 2, 2, 672) 0 block5c_bn[0][0] \n",
712 | "__________________________________________________________________________________________________\n",
713 | "block5c_se_squeeze (GlobalAvera (None, 672) 0 block5c_activation[0][0] \n",
714 | "__________________________________________________________________________________________________\n",
715 | "block5c_se_reshape (Reshape) (None, 1, 1, 672) 0 block5c_se_squeeze[0][0] \n",
716 | "__________________________________________________________________________________________________\n",
717 | "block5c_se_reduce (Conv2D) (None, 1, 1, 28) 18844 block5c_se_reshape[0][0] \n",
718 | "__________________________________________________________________________________________________\n",
719 | "block5c_se_expand (Conv2D) (None, 1, 1, 672) 19488 block5c_se_reduce[0][0] \n",
720 | "__________________________________________________________________________________________________\n",
721 | "block5c_se_excite (Multiply) (None, 2, 2, 672) 0 block5c_activation[0][0] \n",
722 | " block5c_se_expand[0][0] \n",
723 | "__________________________________________________________________________________________________\n",
724 | "block5c_project_conv (Conv2D) (None, 2, 2, 112) 75264 block5c_se_excite[0][0] \n",
725 | "__________________________________________________________________________________________________\n",
726 | "block5c_project_bn (BatchNormal (None, 2, 2, 112) 448 block5c_project_conv[0][0] \n",
727 | "__________________________________________________________________________________________________\n",
728 | "block5c_drop (FixedDropout) (None, 2, 2, 112) 0 block5c_project_bn[0][0] \n",
729 | "__________________________________________________________________________________________________\n",
730 | "block5c_add (Add) (None, 2, 2, 112) 0 block5c_drop[0][0] \n",
731 | " block5b_add[0][0] \n",
732 | "__________________________________________________________________________________________________\n",
733 | "block6a_expand_conv (Conv2D) (None, 2, 2, 672) 75264 block5c_add[0][0] \n",
734 | "__________________________________________________________________________________________________\n",
735 | "block6a_expand_bn (BatchNormali (None, 2, 2, 672) 2688 block6a_expand_conv[0][0] \n",
736 | "__________________________________________________________________________________________________\n",
737 | "block6a_expand_activation (Acti (None, 2, 2, 672) 0 block6a_expand_bn[0][0] \n",
738 | "__________________________________________________________________________________________________\n",
739 | "block6a_dwconv (DepthwiseConv2D (None, 1, 1, 672) 16800 block6a_expand_activation[0][0] \n",
740 | "__________________________________________________________________________________________________\n",
741 | "block6a_bn (BatchNormalization) (None, 1, 1, 672) 2688 block6a_dwconv[0][0] \n",
742 | "__________________________________________________________________________________________________\n",
743 | "block6a_activation (Activation) (None, 1, 1, 672) 0 block6a_bn[0][0] \n",
744 | "__________________________________________________________________________________________________\n",
745 | "block6a_se_squeeze (GlobalAvera (None, 672) 0 block6a_activation[0][0] \n",
746 | "__________________________________________________________________________________________________\n",
747 | "block6a_se_reshape (Reshape) (None, 1, 1, 672) 0 block6a_se_squeeze[0][0] \n",
748 | "__________________________________________________________________________________________________\n",
749 | "block6a_se_reduce (Conv2D) (None, 1, 1, 28) 18844 block6a_se_reshape[0][0] \n",
750 | "__________________________________________________________________________________________________\n",
751 | "block6a_se_expand (Conv2D) (None, 1, 1, 672) 19488 block6a_se_reduce[0][0] \n",
752 | "__________________________________________________________________________________________________\n",
753 | "block6a_se_excite (Multiply) (None, 1, 1, 672) 0 block6a_activation[0][0] \n",
754 | " block6a_se_expand[0][0] \n",
755 | "__________________________________________________________________________________________________\n",
756 | "block6a_project_conv (Conv2D) (None, 1, 1, 192) 129024 block6a_se_excite[0][0] \n",
757 | "__________________________________________________________________________________________________\n",
758 | "block6a_project_bn (BatchNormal (None, 1, 1, 192) 768 block6a_project_conv[0][0] \n",
759 | "__________________________________________________________________________________________________\n",
760 | "block6b_expand_conv (Conv2D) (None, 1, 1, 1152) 221184 block6a_project_bn[0][0] \n",
761 | "__________________________________________________________________________________________________\n",
762 | "block6b_expand_bn (BatchNormali (None, 1, 1, 1152) 4608 block6b_expand_conv[0][0] \n",
763 | "__________________________________________________________________________________________________\n",
764 | "block6b_expand_activation (Acti (None, 1, 1, 1152) 0 block6b_expand_bn[0][0] \n",
765 | "__________________________________________________________________________________________________\n",
766 | "block6b_dwconv (DepthwiseConv2D (None, 1, 1, 1152) 28800 block6b_expand_activation[0][0] \n",
767 | "__________________________________________________________________________________________________\n",
768 | "block6b_bn (BatchNormalization) (None, 1, 1, 1152) 4608 block6b_dwconv[0][0] \n",
769 | "__________________________________________________________________________________________________\n",
770 | "block6b_activation (Activation) (None, 1, 1, 1152) 0 block6b_bn[0][0] \n",
771 | "__________________________________________________________________________________________________\n",
772 | "block6b_se_squeeze (GlobalAvera (None, 1152) 0 block6b_activation[0][0] \n",
773 | "__________________________________________________________________________________________________\n",
774 | "block6b_se_reshape (Reshape) (None, 1, 1, 1152) 0 block6b_se_squeeze[0][0] \n",
775 | "__________________________________________________________________________________________________\n",
776 | "block6b_se_reduce (Conv2D) (None, 1, 1, 48) 55344 block6b_se_reshape[0][0] \n",
777 | "__________________________________________________________________________________________________\n",
778 | "block6b_se_expand (Conv2D) (None, 1, 1, 1152) 56448 block6b_se_reduce[0][0] \n",
779 | "__________________________________________________________________________________________________\n",
780 | "block6b_se_excite (Multiply) (None, 1, 1, 1152) 0 block6b_activation[0][0] \n",
781 | " block6b_se_expand[0][0] \n",
782 | "__________________________________________________________________________________________________\n",
783 | "block6b_project_conv (Conv2D) (None, 1, 1, 192) 221184 block6b_se_excite[0][0] \n",
784 | "__________________________________________________________________________________________________\n",
785 | "block6b_project_bn (BatchNormal (None, 1, 1, 192) 768 block6b_project_conv[0][0] \n",
786 | "__________________________________________________________________________________________________\n",
787 | "block6b_drop (FixedDropout) (None, 1, 1, 192) 0 block6b_project_bn[0][0] \n",
788 | "__________________________________________________________________________________________________\n",
789 | "block6b_add (Add) (None, 1, 1, 192) 0 block6b_drop[0][0] \n",
790 | " block6a_project_bn[0][0] \n",
791 | "__________________________________________________________________________________________________\n",
792 | "block6c_expand_conv (Conv2D) (None, 1, 1, 1152) 221184 block6b_add[0][0] \n",
793 | "__________________________________________________________________________________________________\n",
794 | "block6c_expand_bn (BatchNormali (None, 1, 1, 1152) 4608 block6c_expand_conv[0][0] \n",
795 | "__________________________________________________________________________________________________\n",
796 | "block6c_expand_activation (Acti (None, 1, 1, 1152) 0 block6c_expand_bn[0][0] \n",
797 | "__________________________________________________________________________________________________\n",
798 | "block6c_dwconv (DepthwiseConv2D (None, 1, 1, 1152) 28800 block6c_expand_activation[0][0] \n",
799 | "__________________________________________________________________________________________________\n",
800 | "block6c_bn (BatchNormalization) (None, 1, 1, 1152) 4608 block6c_dwconv[0][0] \n",
801 | "__________________________________________________________________________________________________\n",
802 | "block6c_activation (Activation) (None, 1, 1, 1152) 0 block6c_bn[0][0] \n",
803 | "__________________________________________________________________________________________________\n",
804 | "block6c_se_squeeze (GlobalAvera (None, 1152) 0 block6c_activation[0][0] \n",
805 | "__________________________________________________________________________________________________\n",
806 | "block6c_se_reshape (Reshape) (None, 1, 1, 1152) 0 block6c_se_squeeze[0][0] \n",
807 | "__________________________________________________________________________________________________\n",
808 | "block6c_se_reduce (Conv2D) (None, 1, 1, 48) 55344 block6c_se_reshape[0][0] \n",
809 | "__________________________________________________________________________________________________\n",
810 | "block6c_se_expand (Conv2D) (None, 1, 1, 1152) 56448 block6c_se_reduce[0][0] \n",
811 | "__________________________________________________________________________________________________\n",
812 | "block6c_se_excite (Multiply) (None, 1, 1, 1152) 0 block6c_activation[0][0] \n",
813 | " block6c_se_expand[0][0] \n",
814 | "__________________________________________________________________________________________________\n",
815 | "block6c_project_conv (Conv2D) (None, 1, 1, 192) 221184 block6c_se_excite[0][0] \n",
816 | "__________________________________________________________________________________________________\n",
817 | "block6c_project_bn (BatchNormal (None, 1, 1, 192) 768 block6c_project_conv[0][0] \n",
818 | "__________________________________________________________________________________________________\n",
819 | "block6c_drop (FixedDropout) (None, 1, 1, 192) 0 block6c_project_bn[0][0] \n",
820 | "__________________________________________________________________________________________________\n",
821 | "block6c_add (Add) (None, 1, 1, 192) 0 block6c_drop[0][0] \n",
822 | " block6b_add[0][0] \n",
823 | "__________________________________________________________________________________________________\n",
824 | "block6d_expand_conv (Conv2D) (None, 1, 1, 1152) 221184 block6c_add[0][0] \n",
825 | "__________________________________________________________________________________________________\n",
826 | "block6d_expand_bn (BatchNormali (None, 1, 1, 1152) 4608 block6d_expand_conv[0][0] \n",
827 | "__________________________________________________________________________________________________\n",
828 | "block6d_expand_activation (Acti (None, 1, 1, 1152) 0 block6d_expand_bn[0][0] \n",
829 | "__________________________________________________________________________________________________\n",
830 | "block6d_dwconv (DepthwiseConv2D (None, 1, 1, 1152) 28800 block6d_expand_activation[0][0] \n",
831 | "__________________________________________________________________________________________________\n",
832 | "block6d_bn (BatchNormalization) (None, 1, 1, 1152) 4608 block6d_dwconv[0][0] \n",
833 | "__________________________________________________________________________________________________\n",
834 | "block6d_activation (Activation) (None, 1, 1, 1152) 0 block6d_bn[0][0] \n",
835 | "__________________________________________________________________________________________________\n",
836 | "block6d_se_squeeze (GlobalAvera (None, 1152) 0 block6d_activation[0][0] \n",
837 | "__________________________________________________________________________________________________\n",
838 | "block6d_se_reshape (Reshape) (None, 1, 1, 1152) 0 block6d_se_squeeze[0][0] \n",
839 | "__________________________________________________________________________________________________\n",
840 | "block6d_se_reduce (Conv2D) (None, 1, 1, 48) 55344 block6d_se_reshape[0][0] \n",
841 | "__________________________________________________________________________________________________\n",
842 | "block6d_se_expand (Conv2D) (None, 1, 1, 1152) 56448 block6d_se_reduce[0][0] \n",
843 | "__________________________________________________________________________________________________\n",
844 | "block6d_se_excite (Multiply) (None, 1, 1, 1152) 0 block6d_activation[0][0] \n",
845 | " block6d_se_expand[0][0] \n",
846 | "__________________________________________________________________________________________________\n",
847 | "block6d_project_conv (Conv2D) (None, 1, 1, 192) 221184 block6d_se_excite[0][0] \n",
848 | "__________________________________________________________________________________________________\n",
849 | "block6d_project_bn (BatchNormal (None, 1, 1, 192) 768 block6d_project_conv[0][0] \n",
850 | "__________________________________________________________________________________________________\n",
851 | "block6d_drop (FixedDropout) (None, 1, 1, 192) 0 block6d_project_bn[0][0] \n",
852 | "__________________________________________________________________________________________________\n",
853 | "block6d_add (Add) (None, 1, 1, 192) 0 block6d_drop[0][0] \n",
854 | " block6c_add[0][0] \n",
855 | "__________________________________________________________________________________________________\n",
856 | "block7a_expand_conv (Conv2D) (None, 1, 1, 1152) 221184 block6d_add[0][0] \n",
857 | "__________________________________________________________________________________________________\n",
858 | "block7a_expand_bn (BatchNormali (None, 1, 1, 1152) 4608 block7a_expand_conv[0][0] \n",
859 | "__________________________________________________________________________________________________\n",
860 | "block7a_expand_activation (Acti (None, 1, 1, 1152) 0 block7a_expand_bn[0][0] \n",
861 | "__________________________________________________________________________________________________\n",
862 | "block7a_dwconv (DepthwiseConv2D (None, 1, 1, 1152) 10368 block7a_expand_activation[0][0] \n",
863 | "__________________________________________________________________________________________________\n",
864 | "block7a_bn (BatchNormalization) (None, 1, 1, 1152) 4608 block7a_dwconv[0][0] \n",
865 | "__________________________________________________________________________________________________\n",
866 | "block7a_activation (Activation) (None, 1, 1, 1152) 0 block7a_bn[0][0] \n",
867 | "__________________________________________________________________________________________________\n",
868 | "block7a_se_squeeze (GlobalAvera (None, 1152) 0 block7a_activation[0][0] \n",
869 | "__________________________________________________________________________________________________\n",
870 | "block7a_se_reshape (Reshape) (None, 1, 1, 1152) 0 block7a_se_squeeze[0][0] \n",
871 | "__________________________________________________________________________________________________\n",
872 | "block7a_se_reduce (Conv2D) (None, 1, 1, 48) 55344 block7a_se_reshape[0][0] \n",
873 | "__________________________________________________________________________________________________\n",
874 | "block7a_se_expand (Conv2D) (None, 1, 1, 1152) 56448 block7a_se_reduce[0][0] \n",
875 | "__________________________________________________________________________________________________\n",
876 | "block7a_se_excite (Multiply) (None, 1, 1, 1152) 0 block7a_activation[0][0] \n",
877 | " block7a_se_expand[0][0] \n",
878 | "__________________________________________________________________________________________________\n",
879 | "block7a_project_conv (Conv2D) (None, 1, 1, 320) 368640 block7a_se_excite[0][0] \n",
880 | "__________________________________________________________________________________________________\n",
881 | "block7a_project_bn (BatchNormal (None, 1, 1, 320) 1280 block7a_project_conv[0][0] \n",
882 | "__________________________________________________________________________________________________\n",
883 | "top_conv (Conv2D) (None, 1, 1, 1280) 409600 block7a_project_bn[0][0] \n",
884 | "__________________________________________________________________________________________________\n",
885 | "top_bn (BatchNormalization) (None, 1, 1, 1280) 5120 top_conv[0][0] \n",
886 | "__________________________________________________________________________________________________\n",
887 | "top_activation (Activation) (None, 1, 1, 1280) 0 top_bn[0][0] \n",
888 | "__________________________________________________________________________________________________\n",
889 | "avg_pool (GlobalAveragePooling2 (None, 1280) 0 top_activation[0][0] \n",
890 | "__________________________________________________________________________________________________\n",
891 | "batch_normalization_1 (BatchNor (None, 1280) 5120 avg_pool[0][0] \n",
892 | "__________________________________________________________________________________________________\n",
893 | "dropout_1 (Dropout) (None, 1280) 0 batch_normalization_1[0][0] \n",
894 | "__________________________________________________________________________________________________\n",
895 | "dense_1 (Dense) (None, 512) 655872 dropout_1[0][0] \n",
896 | "__________________________________________________________________________________________________\n",
897 | "batch_normalization_2 (BatchNor (None, 512) 2048 dense_1[0][0] \n",
898 | "__________________________________________________________________________________________________\n",
899 | "activation_1 (Activation) (None, 512) 0 batch_normalization_2[0][0] \n",
900 | "__________________________________________________________________________________________________\n",
901 | "dropout_2 (Dropout) (None, 512) 0 activation_1[0][0] \n",
902 | "__________________________________________________________________________________________________\n",
903 | "dense_2 (Dense) (None, 128) 65664 dropout_2[0][0] \n",
904 | "__________________________________________________________________________________________________\n",
905 | "batch_normalization_3 (BatchNor (None, 128) 512 dense_2[0][0] \n",
906 | "__________________________________________________________________________________________________\n",
907 | "activation_2 (Activation) (None, 128) 0 batch_normalization_3[0][0] \n",
908 | "__________________________________________________________________________________________________\n",
909 | "dense_3 (Dense) (None, 10) 1290 activation_2[0][0] \n",
910 | "==================================================================================================\n",
911 | "Total params: 4,780,070\n",
912 | "Trainable params: 4,734,214\n",
913 | "Non-trainable params: 45,856\n",
914 | "__________________________________________________________________________________________________\n"
915 | ],
916 | "name": "stdout"
917 | }
918 | ]
919 | },
920 | {
921 | "cell_type": "code",
922 | "metadata": {
923 | "id": "7Y2h9pPN0Xwe",
924 | "colab_type": "code",
925 | "outputId": "14b0aeb1-cfb5-457e-85b8-6f674bf60a97",
926 | "colab": {
927 | "base_uri": "https://localhost:8080/",
928 | "height": 34
929 | }
930 | },
931 | "source": [
932 | "!pwd"
933 | ],
934 | "execution_count": 0,
935 | "outputs": [
936 | {
937 | "output_type": "stream",
938 | "text": [
939 | "/gdrive\n"
940 | ],
941 | "name": "stdout"
942 | }
943 | ]
944 | },
945 | {
946 | "cell_type": "markdown",
947 | "metadata": {
948 | "id": "JQrkkLyzCWZs",
949 | "colab_type": "text"
950 | },
951 | "source": [
952 | "**Compile the Model and Save the Best Results**"
953 | ]
954 | },
955 | {
956 | "cell_type": "code",
957 | "metadata": {
958 | "id": "lJbaIYeGuQ52",
959 | "colab_type": "code",
960 | "outputId": "56efa2e7-3ffe-44db-f045-9e777c6e780b",
961 | "colab": {
962 | "base_uri": "https://localhost:8080/",
963 | "height": 425
964 | }
965 | },
966 | "source": [
967 | "model_final.compile(loss='categorical_crossentropy',\n",
968 | " optimizer=Adam(0.0001),\n",
969 | " metrics=['accuracy'])\n",
970 | "\n",
971 | "mcp_save = ModelCheckpoint('/gdrive/My Drive/EnetB0_CIFAR10_TL.h5', save_best_only=True, monitor='val_acc')\n",
972 | "reduce_lr = ReduceLROnPlateau(monitor='val_acc', factor=0.5, patience=2, verbose=1)\n",
973 | "\n",
974 | "# Checkpoint the best val_acc model and halve the LR when val_acc plateaus\n",
975 | "model_final.fit(x_train, y_train,\n",
976 | " batch_size=32,\n",
977 | " epochs=10,\n",
978 | " validation_split=0.1,\n",
979 | " callbacks=[mcp_save, reduce_lr],\n",
980 | " shuffle=True,\n",
981 | " verbose=1)"
982 | ],
983 | "execution_count": 0,
984 | "outputs": [
985 | {
986 | "output_type": "stream",
987 | "text": [
988 | "Train on 45000 samples, validate on 5000 samples\n",
989 | "Epoch 1/10\n",
990 | "45000/45000 [==============================] - 178s 4ms/step - loss: 0.6078 - acc: 0.7971 - val_loss: 0.5557 - val_acc: 0.8186\n",
991 | "Epoch 2/10\n",
992 | "45000/45000 [==============================] - 147s 3ms/step - loss: 0.5454 - acc: 0.8156 - val_loss: 0.5343 - val_acc: 0.8220\n",
993 | "Epoch 3/10\n",
994 | "45000/45000 [==============================] - 146s 3ms/step - loss: 0.5012 - acc: 0.8334 - val_loss: 0.5257 - val_acc: 0.8300\n",
995 | "Epoch 4/10\n",
996 | "45000/45000 [==============================] - 147s 3ms/step - loss: 0.4570 - acc: 0.8468 - val_loss: 0.5119 - val_acc: 0.8248\n",
997 | "Epoch 5/10\n",
998 | "45000/45000 [==============================] - 146s 3ms/step - loss: 0.4123 - acc: 0.8616 - val_loss: 0.5077 - val_acc: 0.8262\n",
999 | "\n",
1000 | "Epoch 00005: ReduceLROnPlateau reducing learning rate to 4.999999873689376e-05.\n",
1001 | "Epoch 6/10\n",
1002 | "45000/45000 [==============================] - 147s 3ms/step - loss: 0.3567 - acc: 0.8803 - val_loss: 0.4945 - val_acc: 0.8320\n",
1003 | "Epoch 7/10\n",
1004 | "45000/45000 [==============================] - 147s 3ms/step - loss: 0.3305 - acc: 0.8891 - val_loss: 0.4936 - val_acc: 0.8310\n",
1005 | "Epoch 8/10\n",
1006 | "45000/45000 [==============================] - 151s 3ms/step - loss: 0.3016 - acc: 0.8986 - val_loss: 0.4873 - val_acc: 0.8362\n",
1007 | "Epoch 9/10\n",
1008 | "45000/45000 [==============================] - 151s 3ms/step - loss: 0.2919 - acc: 0.9023 - val_loss: 0.4881 - val_acc: 0.8374\n",
1009 | "Epoch 10/10\n",
1010 | "45000/45000 [==============================] - 151s 3ms/step - loss: 0.2713 - acc: 0.9095 - val_loss: 0.4877 - val_acc: 0.8364\n"
1011 | ],
1012 | "name": "stdout"
1013 | },
1014 | {
1015 | "output_type": "execute_result",
1016 | "data": {
1017 | "text/plain": [
1018 | ""
1019 | ]
1020 | },
1021 | "metadata": {
1022 | "tags": []
1023 | },
1024 | "execution_count": 19
1025 | }
1026 | ]
1027 | },
1028 | {
1029 | "cell_type": "code",
1030 | "metadata": {
1031 | "id": "zrAEgqQouWep",
1032 | "colab_type": "code",
1033 | "outputId": "1815ffd5-9994-4e1f-8444-9fd9148b0b30",
1034 | "colab": {
1035 | "base_uri": "https://localhost:8080/",
1036 | "height": 34
1037 | }
1038 | },
1039 | "source": [
1040 | "_, acc = model_final.evaluate(x_test, y_test)"
1041 | ],
1042 | "execution_count": 0,
1043 | "outputs": [
1044 | {
1045 | "output_type": "stream",
1046 | "text": [
1047 | "10000/10000 [==============================] - 5s 474us/step\n"
1048 | ],
1049 | "name": "stdout"
1050 | }
1051 | ]
1052 | },
1053 | {
1054 | "cell_type": "markdown",
1055 | "metadata": {
1056 | "id": "aQqE2Ne-Cldv",
1057 | "colab_type": "text"
1058 | },
1059 | "source": [
1060 | "**Printing the test accuracy**"
1061 | ]
1062 | },
1063 | {
1064 | "cell_type": "code",
1065 | "metadata": {
1066 | "id": "2SD6yma6ub9G",
1067 | "colab_type": "code",
1068 | "outputId": "993f9827-ad12-4e7e-aa93-34f21f8d9c11",
1069 | "colab": {
1070 | "base_uri": "https://localhost:8080/",
1071 | "height": 34
1072 | }
1073 | },
1074 | "source": [
1075 | "print(\"Test Accuracy: {}%\".format(acc*100))"
1076 | ],
1077 | "execution_count": 0,
1078 | "outputs": [
1079 | {
1080 | "output_type": "stream",
1081 | "text": [
1082 | "Test Accuracy: 82.78999999999999%\n"
1083 | ],
1084 | "name": "stdout"
1085 | }
1086 | ]
1087 | },
1088 | {
1089 | "cell_type": "markdown",
1090 | "metadata": {
1091 | "id": "uCftheoIDX4f",
1092 | "colab_type": "text"
1093 | },
1094 | "source": [
1095 | "**Visualization of Confusion Matrix**"
1096 | ]
1097 | },
1098 | {
1099 | "cell_type": "code",
1100 | "metadata": {
1101 | "id": "fT11oW_Muedu",
1102 | "colab_type": "code",
1103 | "colab": {}
1104 | },
1105 | "source": [
1106 | "import seaborn as sns\n",
1107 | "from sklearn.metrics import confusion_matrix\n",
1108 | "\n",
1109 | "test_pred = model_final.predict(x_test)"
1110 | ],
1111 | "execution_count": 0,
1112 | "outputs": []
1113 | },
1114 | {
1115 | "cell_type": "code",
1116 | "metadata": {
1117 | "id": "i-W9xmG3uhJq",
1118 | "colab_type": "code",
1119 | "outputId": "066ce53a-7682-42b8-b15d-7db596e91d38",
1120 | "colab": {
1121 | "base_uri": "https://localhost:8080/",
1122 | "height": 265
1123 | }
1124 | },
1125 | "source": [
1126 | "import numpy as np\n",
1127 | "\n",
1128 | "ax = sns.heatmap(confusion_matrix(np.argmax(y_test, axis=1), np.argmax(test_pred, axis=1)), cmap=\"binary\", annot=True, fmt=\"d\")"
1129 | ],
1130 | "execution_count": 0,
1131 | "outputs": [
1132 | {
1133 | "output_type": "display_data",
1134 | "data": {
1135 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAWAAAAD4CAYAAADSIzzWAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjIsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy8li6FKAAAgAElEQVR4nOydeXwN1///nyc3VCKURKS2D6F8qK1L\n7Hti+4gWEaHWKm0/VYIkBEFpG7SVUq0ulrbUUmrf15CoNZZaimpr+SDEliARWW7O74+bzDch+517\nI/md5+Mxj9w7y3nNTM6875kzZ94vIaVEoVAoFNbHpqB3QKFQKP5/RQVghUKhKCBUAFYoFIoCQgVg\nhUKhKCBUAFYoFIoCwtbSAqVKlbLKMIv79+9bQwaApKQkq2nZ2lr8X6RhNBqtplW8eHGr6FhzlI8Q\nwmpajx49spqWNetg8eLFzT6JQohc/9OllNb7p2WCagErFApFAWG9nzaFQqGwAta8EzEXFYAVCkWR\nwsam8NzYqwCsUCiKFKoFrFAoFAWECsAKhUJRQKgAnAs++OADBg0ahJSSP/74g/fff5+EhAQmT55M\njx49MBqNLFiwgO+++w6Azz77jI4dOxIfH89///tfTp48mWfNoKAg9u7di6OjIxs3bgQgJiYGPz8/\nrl+/TqVKlZg1axbPP/+82cd38+ZNgoKCuHv3LkIIevbsSf/+/QFYtmwZv/zyCwaDgVatWuHn52eW\nVlBQEGFhYTg6OrJhwwYAtm3bxty5c7l48SIrVqygXr16uhzThAkTtGPy9vamf//+hISEsHfvXooV\nK0aVKlX4+OOPKV26tNl6aYwfP569e/fi5OTEpk2bdCs3My5evJjh/3H16lV8fX0ZNGiQ7loJCQn0\n69ePxMREjEYjnTp1wtfXV7eyhwwZopXdvn173n//fQ4fPszs2bNJSUnB3t6eqVOn8q9//cssrYKq\nF1lRmAIwUkqLTg4ODvLJqWbNmvLSpUuyXLly0sHBQa5evVq+99578r///a9cunSpLFWqlHRwcJDV\nqlWTDg4O0svLS27fvl06ODjIdu3aySNHjjxVptFozHE6dOiQPHXqlOzSpYs2b8aMGfK7776TRqNR\nfvfdd/LTTz/NsZzHjx/nOF29elUeP35cPn78WN69e1d26NBB/vHHHzI8PFwOGDBAPnjwQD5+/Fhe\nv34923KSk5NznNIfV9q8P//8U/7111+yX79+8vfff89VOQkJCdlO165dkydOnJAJCQny3r17skOH\nDvLs2bNyz549Mi4uTiYkJMjp06fL6dOn51hWXjhy5Ig8c+aM9PT0zNN2UkqZkpKS7ykpKUk2b95c\nXr16NVfr52ffYmNjpZRSJiYmSm9vb3nixIlcbRsXF5ftFBsbK2/fvi3j4uJkTEyM9PLykgcPHpTt\n27eXp0+flnFxcfLHH3+U/v7+OZZl5XphdswpUaKEzO2kh545U46PC4UQtYUQgUKIOalToBCijrmB\n39bWFjs7OwwGA/b29ty4cYMhQ4bw6aefaoPn79y5A4CnpyfLly8HICIigjJlyuDi4pJnzUaNGlGm\nTJkM80JDQ+nWrRsA3bp1Y/fu3eYcloazszMvvfQSACVLlsTV1ZVbt26xcuVKhgwZor2I4OTkZLaW\nm5vbU632GjVq4OrqanbZ6cnsmKKiomjevLk2WL9hw4ZERUXpqtuoUSNd7kryysGDB6lSpQqVKlWy\nSPlCCEqWLAlAcnIyycnJurXehBDY29s/VbYQgri4OAAePnyIs7Oz2VoFVS+ywsbGJtdTQZPtHggh\nAoFfAAEcSZ0EsFwIMS6/ojdu3GDOnDmcPXuWv//+m/v37xMaGkr16tXx8vIiLCyM1atXU6NGDQAq\nVqzI9evXte2vX79OxYoV8yufgbt37
1K+fHnAVJHu3r2rS7npuX79OufPn6d+/fpcuXKFY8eO0bdv\nXwYPHsyZM2d017MGacfUoEGDDPPXrl1Ly5YtC2iv9GXLli14enpaVMNoNNKtWzeaN29O8+bNadiw\noa5l9+7dGw8PD5o2bUr9+vWZPHkyI0aMoFOnTmzevJnBgwfrpgfPRr1I+6HJzVTQ5PQTMARoJKWc\nIaVckjrNABqnLssUIcS7QoijQoijmb22W6ZMGTw9Palfvz41a9akZMmS9O7dm+LFi5OQkECbNm1Y\ntGgR33zzjVkHl1cs8U959OgRfn5+jB07FgcHB5KTk3nw4AFLly7Fz8+PgIAAq74uqwePHj1i9OjR\nBAYG4uDgoM2fN28eBoOBrl27FuDe6UNiYiKhoaF07tzZojoGg4H169cTFhbGqVOnuHDhgq5lr1ix\ngu3bt3PmzBn+/vtvli5dyldffcX27dvp1q0bISEhuuk9K/WiKAXgFCCzpmaF1GWZIqWcJ6V0k1K6\nFStW7Knlbdu25cqVK9y5c4fk5GQ2bNhAkyZNiIyM1B4ibdiwgbp16wIQGRmZ4TawUqVKREZG5nRs\nucLJyYlbt24BcOvWLRwdHXUpF0w5I/z8/PD09KR9+/YAuLi44OHhgRCC+vXrY2NjQ3R0tG6aliYp\nKYnRo0dnOCaAdevWERYWxowZM56Jim0u+/bt46WXXqJcuXJW0StdujRNmjRh3759upddqlQp3Nzc\n2L9/PxcuXKB+/foAdOzYMV8PszPjWaoXRSkAjwJ2CyG2CiHmpU7bgN3AyPyKXrt2jUaNGmFnZweY\nAvKff/7Jpk2baN26NQAtW7bk77//Bky3gm+++SZg6g+8f/++bv1J7u7urF+/HoD169fj7u6uS7lS\nSj788ENcXV0ZOHBgBr2IiAgALl++TFJSEmXLltVF09KkHVP16tUzjAr47bff+PHHH/nqq6+0/2lh\nZ/PmzRbvfrh37x4PHjwA4PHjxxw4cIDq1avrVvbDhw+1sg8fPoyrqyuxsbFcuXIFgEOHDunynOBZ\nqxeFKQCLnG5/hRA2mLoc0pqg14EIKWWuUmdllQ1twoQJ9OzZk+TkZE6ePMnw4cOxs7Nj4cKFVK5c\nmbi4OEaOHKn1kYaEhNC+fXvi4+N5//33OXHiRIbycpMNzd/fnyNHjhATE4OTkxPDhw/Hw8MDPz8/\nIiMjqVixIrNmzXrqQd2T5CYb2vHjx3nrrbeoWbOm1tnv6+tL06ZNmTx5MufPn6dYsWL4+/vTpEmT\nLMvJTSaqgICAp47r+eefJzg4mHv37lG6dGlq167N/Pnzsy0np2xox48fZ9CgQU8d04wZM0hMTNTO\nW4MGDZg8eXK2ZeUlG5qfnx9HjhwhOjoaJycnRowYQa9evXK1bX66dx49ekS7du3YtWsXpUqVyvV2\neb2gz58/z7hx4zAajUgp6dy5M8OHD8/1PmbHhQsXmDx5MikpKaSkpNChQwfee+89QkND+fbbbxFC\nULp0aaZMmULlypWzLSunOqhzvTA7KpYtWzbX//To6OgCjcI5BmBzUekozUOlozQPa/avq3SU5qNH\nAHZ0dMz1P/3evXsFGoDVm3AKhaJI8Sx0LeQWFYAVCkWRQgVghUKhKCBUAFYoFIoCQgVghUKhKCCe\nhVeMc4sKwAqFokihWsDpsNZbXgaDwSo6YN2hTdbUsuY5VJhHZm+YWorHjx9bTUuP4Yl6BmAhxGhg\nKCCB08BgTG8C/wI4AceAAVLKRCHEc8Bi4DXgLtBbSnk5u/ILT1tdoVAocoFeb8IJISoBvoCblLIe\nYAD6AJ8Cs6SULwLR/F9enCFAdOr8WanrZYsKwAqFokih86vItoCdEMIWsAduAO7AqtTli4DuqZ+7\npX4ndbmHyEFEBWCFQlGk0CsfsJTyOjAT+B+mwHsfU5dDjJQyOXW1a/xfmoZKwNXUbZNT18824bcK\nw
AqFokiRlxZw+tS5qdO76copi6lV64opK2RJQNf8pGoUhEKhKFLk5SGclHIeMC+Lxe2BS1LK26nl\nrgFaAGWEELaprdzKmBKUkfq3CnAttcvieUwP47JEtYAVCkWRQsc+4P8BTYUQ9ql9uR7AWWAP4J26\nziBgfernDanfSV0eKnMYxlTgAXjixIm0atVK82UDU9pILy8vvLy86NChA15eXmZp+Pr6cvr0ac6c\nOcPIkaY0xmXLlmXHjh1cuHCBHTt2aCnz2rRpQ0xMDCdOnODEiRNMmjTJLG0wWTANGDCALl264Onp\nyaJFi3LeyAyMRiM9evTgvffes6jO+PHjadasmcWdDqylkx5rnENLH9fEiRNp3bo13bt31+adP3+e\nvn370rNnT3x8fDh9+rRuekajkb59+zJq1CgAhg4dSt++fenbty+dO3fG399fN63s0CsASykPY3qY\ndhzTEDQbTK3lQMBPCPE3pj7ehambLAScUuf7ATnathV4AO7evTvff/99hnkhISGsWbOGNWvW0KFD\nhwwZ9vNK3bp1eeedd2jcuDENGzaka9eu1KhRg3HjxrF7925q1arF7t27GTfu/87Vvn37eOWVV3jl\nlVf4+OOP862dhsFgYNy4cWzZsoUVK1awbNkyLdm8JVi8eLFuib2zw8vLiwULFhQZnfRY4xxa+ri6\nd+/Od999l2FeSEgI77//PqtXr2b48OG6WhItX748Q4L3BQsWsGzZMpYtW0b9+vVp166dblrZoeco\nCCnlh1LK2lLKelLKAVLKBCnlRSllYynli1LKXlLKhNR1H6d+fzF1+cWcyi/wAJyZo28aUkq2b99u\nljNBnTp1OHz4MPHx8RiNRsLCwvDy8qJbt25aS3TRokUZWgl6U758ec1eycHBgerVq1vMIfbmzZuE\nhYXlOmG5OVjLrdjarsjWOoeWPq7Mri0hBLGxsQDExsZqhrTmEhUVxf79+zO9jmJjYzl69Cht27bV\nRSsniowrcnYIIfS1U82EY8eO4eTkRNWqVfNdxpkzZ2jVqhWOjo7Y2dnRpUsXqlSpgouLCzdv3gRM\nF1x6m/tmzZrx+++/s2XLFs1uWy+uXbvGuXPndHW/Tc+0adMICAgoVK9jPmsU5XMYGBhISEgIHh4e\nzJw5U+suMJeQkBB8fX0zPWd79+6lUaNGGYw6LUlhsiQy5ydgalYL0g/tyMkGJzu2bNlCly5d8r09\nmPq8Pv30U3bs2MG2bdv4/fffM3V+SOsrP378OFWrVuXll1/mq6++Yt26dWbppycuLg5fX18mTJhg\nkcq4Z88enJycqFevnu5l//9CUT+HK1asIDAwkN27dzN27NgcLYJyw759+3B0dKROnTqZLt+xYwed\nOnUyWye3FJkALIQ4lcV0GnDJarv0rsjvvPNOvnYsOTmZXbt26WIL/sMPP+Dm5kabNm2Ijo7mwoUL\nREVF8cILLwDwwgsvaM7IDx8+JC4uDoCtW7dSrFgxnJyyHUudK5KSkvD19eX111+nY8eOZpeXGceP\nHyc0NBR3d3f8/f05fPgwY8aMsYhWUaWon8MNGzZoz1Q6deqky0O4kydPEh4ezuuvv05QUBARERHa\nw+uYmBj++OMPWrZsabZObilMATinccAuQCdM7zunRwAHLLJHqRw8eBBXV1ctSJqDs7Mzt2/fpkqV\nKnh5edG0aVNcXV0ZNGgQn376KYMGDdKckV1cXLT+2UaNGmFjY8Pdu9kO5csRKSVBQUFUr16dwYMt\n13Pj7++vPWk+fPgwP/zwA59//rnF9IoiRf0cOjs7ExERQePGjTl8+LBZ3XtpDB8+XDMTPXr0KEuW\nLNEeXu/atYuWLVvy3HPPma2TW56FwJpbcgrAmwAHKeXvTy4QQuzVYwcCAgKIiIggJiYGd3d3Pvjg\nA3r27MnWrVvN7n5IY/Xq1Tg5OZGUlMQHH3zA/fv3mTFjBitXrmT
IkCFcuXIFHx8fALy9vXn//fdJ\nTk4mPj6ePn36mK1/7Ngx1q9fT61atbThdn5+frRp08bssguS9G7FrVu3zpNb8bOoY20sfVxjxozR\nri0PDw+GDRvG1KlTmTFjBsnJyTz33HN8+OGHuullxo4dO3jrrbcsqvEkhSkAW9wVOTk52Sr5FK2Z\nnq+opqMsTBU3txTV82dNZ25rpqMsVaqU2SexXr16uf6nnzlzRrkiKxQKhV4UpoaECsAKhaJIoQKw\nQqFQFBAqACsUCkUBoQKwQqFQFBDPwivGucXiATgxMdHSEoDpLTN7e3uraP373/+2ig7AkSNHrKZl\nzSfr1srtkJKSYhUdAFtb67VnrKllrVeI9UK1gAsAawVfhULxbKMCsEKhUBQQKgArFApFAaECsEKh\nUBQQKgArFApFAaFGQSgUCkUBoVrACoVCUUCoAJwHEhISGDJkCImJiRiNRtq3b8/777+PlJK5c+ey\nc+dODAYD3t7e9O3bV1ft8PBwgoODSUlJoVevXrz77rv5LsvV1ZVZs2Zp36tUqcKcOXNYtGgR/fv3\np1+/fpon3eeff06xYsWYOnUq9erVQ0pJcHBwvsf8duvWDXt7e2xsbDAYDCxevJh58+axfv16ze15\n2LBhtGjRIt/HB3DlypUMDgqRkZEMHTqU3r17AyZTxq+//prNmzdruvnlxo0bTJgwgbt37yKEwNvb\nmwEDBnD//n38/f2JjIykYsWKhISEmD2m+ObNmwQFBWlaPXv2pH///gAsW7aMX375BYPBQKtWrfDz\n8zNLKygoiLCwMBwdHdmwYQMAn3/+OXv37qVYsWJUqVKF4OBgSpcubZZOZhiNRry9vSlfvvxTRriF\nWetJVADOA8WLF2fevHnY29uTlJTE22+/TYsWLbh06RI3b95k7dq12NjYcO/ePV11jUYjH330ET/+\n+CMuLi54e3vj7u7Oiy++mK/yLl26pBkS2tjYEB4ezs6dO2nSpAkeHh688cYbJCUl4ejoCKDlfX3j\njTdwdHRk/vz5eHt75zt94rfffvtU0HvzzTe1QKIHVatW1YxMjUYj3bt313IaR0VFceTIkQzeeuZg\na2vLmDFjeOmll4iLi8PHx4fmzZuzbt06mjZtytChQ1mwYAELFy40OygaDAb8/f01rT59+tCsWTPu\n3r3Lnj17WLVqFcWLFzc7MT9Ajx496NevXwYX7ubNmzN69GhsbW0JCQlh/vz5FrFwT3N6TjPltCTW\n1HqSwhSAc+ytFkLUFkJ4CCEcnphvvleQqRztJYrk5GSSk5MRQvDrr7/y7rvvah3qaYFLL06dOkXV\nqlWpUqUKxYsXx9PTk927d+tSdrNmzbh69SqRkZG8+eabzJs3T3vLLO2H5MUXX+Tw4cPavIcPHxYq\nH7KjR49SqVIlzbFkzpw5DBs2TLfK7+zsrBmilixZUnOS3rNnj5bUvlu3boSGhuqu5erqyq1bt7SE\n/cWLFwfQxZoqM6fiFi1aaG+2NWzYUDOL1RNrumVbUyszCpMlUU6ecL7AemAEcEYI0S3d4ml67YTR\naKR37954eHjQtGlT6tevz7Vr19ixYwd9+/blgw8+4MqVK3rJAWTwhIOMVkTm4unpyaZNmwCoVq0a\nbm5urFy5kp9//pn69esDJrNQd3d3DAYDlStXpm7dulSoUCHfmiNGjGDgwIGsXbtWm/frr7/St29f\nPv74Yx48eGDeQT3B7t27NW+xffv24ezsTM2aNXXVSOP69eucO3eOBg0acPfuXZydnQEoV66cLq3S\nJ7XOnz9P/fr1uXLlCseOHaNv374MHjyYM2fO6KqVGWvWrKFVq1a6l2tNp+eCdpUuSrb07wCvSSm7\nA22BSUKIkanLsjy76V2Rf/jhhxx3wmAwsGLFCrZv386ZM2f4+++/SUxMpHjx4ixbtgwvLy+mTs3S\nhPmZolixYri7u7Nt2zbAdGz
PP/88Pj4+fPbZZ8yePRsw2STdvHmT1atXM2HCBE6cOJGpW3NumD9/\nPj///DOzZ8/m119/5fjx4/Ts2ZM1a9awZMkSnJyc+PLLL3U7xqSkJH777Tfc3d15/PgxixcvZujQ\nobqVn55Hjx4xevRoAgMDn8pJoHcr5tGjR/j5+TF27FgcHBxITk7mwYMHLF26FD8/PwICAizqsPHd\nd99hMBh4/fXXdS3Xmk7Pz4KrdGFqAefUB2wjpYwFkFJeFkK0BVYJIaqSTQCWUs4D5gE8evQo1zW2\nVKlSuLm5ceDAAVxcXPDw8ADA3d2dKVOm5LaYXOHi4pLhVi8qKkqX/svWrVvzxx9/aC2zqKgodu7c\nCcDp06dJSUmhbNmyREdHM336dG275cuXc/ny5Xxpli9fHjB107Rt25azZ8/y6quvasu7d+9udj9p\neg4dOkStWrVwdHTkn3/+ITIykkGDBgFw+/Zt3n77bebPn2/2LXtSUhKjRo3C09OTDh06AKZugNu3\nb2tGq3p1TSUlJeHn54enp6fWsk+rg0II6tevj42NDdHR0bp3hwGsXbuWsLAwfvjhB90DQ5rTc1hY\nGImJicTGxjJmzBiLmI1aUysrnoXAmltyagFHCSFeTvuSGoy7AuWA+nrsQFr/J5i8pw4fPky1atVo\n27YtERERgMnU8l//+pcechr169fn8uXLXL16lcTERDZv3oy7u7vZ5Xp6erJ582bt+65du2jSpAlg\n6o4oVqwY0dHRlChRAjs7O8D0EMZoNPLPP//kWS8+Pp64uDjt8+HDh6lRowZ37tzR1tm7dy81atQw\n57AysHPnTi0g1qhRg82bN7N69WpWr16Ns7MzP/zwg9nBV0rJ5MmTqV69uhbcAdq2bas5WK9fv552\n7dqZpZOm9eGHH+Lq6srAgQO1+e7u7lodvHz5MklJSZQtW9ZsvSfZt28fCxcuZO7cuVqd0BN/f3/C\nwsIIDQ0lJCSEJk2aWCwgWlMrK4pSC3ggkJx+hpQyGRgohNBlbMmdO3eYPHkyKSkppKSk0KFDB1q3\nbs0rr7zChAkTWLp0KXZ2dhmGP+mBra0tkydPZujQoRiNRnr27Gl2H6adnR3NmzfPsK+rV69m2rRp\nbNy4kaSkJO3pt5OTEwsXLiQlJYWoqCjGjh2bL8179+4xZswYwNSX3qlTJ5o1a8aHH37IhQsXEEJQ\noUIFxo8fb9axpREfH09ERES+9ze3nDhxgo0bN1KzZk169uwJwMiRIxk6dCj+/v6sWbNGG4amh9am\nTZuoWbOm9uDI19eXHj16MHnyZHr06EGxYsX45JNPzL5oAwICOHLkCDExMbRr147hw4drD2mHDBkC\nmB7E6X3H9/8Tz0JgzS0Wd0XOSxeEOVgzHaXKB2w+Kh+weTwLD5AsgdAhenbs2DHXMWfHjh3KFVmh\nUCj0ojC1gFUAVigURQoVgBUKhaKAUAFYoVAoCggVgBUKhaKAUAE4Hc8995ylJazOiRMnrKbVqVMn\nq2mlvT5tDa5du2YVnbTXlq2BNS/8whRk8oIex1WYRoioFrBCoShSFKYfJxWAFQpFkUIFYIVCoSgg\nVABWKBSKAqIwBeDC01utUCgUuUDPZDxCiDJCiFVCiPNCiHNCiGZCCEchxE4hxF+pf8umriuEEHOE\nEH8LIU4JIV7NqXwVgBUKRZFC54TsXwLbpJS1gYbAOWAcsFtKWRPYnfod4D9AzdTpXeDbnAov8C6I\nzEwK58yZQ2hoKEIInJycmDZtmpbzVi8SEhLo16+fZgbaqVMnfH19dS0/M7PRt99+W0sfee/ePerV\nq5fBzDO3ODg4EBgYiKurK1JKZsyYwePHjwkICMDOzo6bN2/y0Ucf8ejRIwD69++Pp6cnKSkpfPnl\nl7oagAKsWLGCVatWYWNjQ4sWLfJ1Lr/44gsOHz5MmTJlNCPH8PBwlixZwtWrV/nyyy+pVasWA
KGh\noaxatUrb9tKlS3z99df5TrtpNBoZMGAAzs7OGZLXf/bZZ2zYsIHffvstX+Wm5+bNm08Zjfbv35+Q\nkJAMppwff/yx7qacP/30E6tWrUIIQa1atZg2bZpuQ0SDgoLYu3cvjo6ObNy4EYCYmBj8/Py4fv06\nlSpVYtasWVZLwKRXF4QQ4nmgNfAWgJQyEUhMdQZqm7raImAvEAh0AxZLU4azQ6mt5wpSyhtZaRR4\nC7hHjx7Mmzcvw7y3336bdevWsXbtWtq0acM333yju27x4sVZtGgRGzZsYN26dezbt4/ff/9d1/Ln\nzZvHypUr+eWXXzhw4ACnTp3ihx9+YMWKFaxYsYIGDRrkOwexr68vhw8fpn///gwePJgrV64QGBjI\n999/z1tvvUV4eDhvvvkmYMpD7OHhwcCBAwkICMDPz8+ssZLffvstS5cu1YLv0aNHCQ8PZ+nSpaxY\nsSLfRqAdOnTgk08+yTCvWrVqTJo06SmHBXd3d7755hu++eYbxowZwwsvvGBWzuPly5dTrVq1DPPO\nnj2r5arWA4PBQEBAAOvXr2fp0qX88ssv/PPPPzRr1oy1a9eyZs0aqlatyoIFC3TTBJMpwJIlS1i1\nahUbN24kJSWFLVu26FZ+9+7dn7qG58+fT7Nmzdi+fTvNmjVj/vz5uunlhI5dEK7AbeBHIcQJIcQC\nIURJwCVdUL0JpDk5VAKuptv+Wuq8LCnwAJyZSWF665n4+HiLdKoLIShZsiSQ0QxUz/IzMxtNIzY2\nloiIiHwlFC9ZsiQNGzbUXpxITk4mNjaWKlWqaD8iR48epW3btgC0bNmS3bt3k5SUxI0bN7h+/Tp1\n6tQx8wj/j9WrVzNo0CDNvDK/jhH169enVKlSGeb961//okqVKtlut3fvXs2dOT9ERUXx22+/aa7W\nYGoRz549W9e7oszMP6OiomjevHkGU069vAnTYzQaefz4McnJycTHx+t6R9moUaOnHLlDQ0MzmKfq\nZXibG/ISgNPbp6VO76YryhZ4FfhWSvkKEMf/dTcAkNrazXfK3dy4IjcWQjRK/fySEMJPCNElv4K5\nZfbs2bi7u7Np0yZGjBhhEQ2j0Ui3bt1o3rw5zZs3p2HDhrqX/6TZaBp79uyhcePGT/mc5YYKFSoQ\nExPDhAkTWLhwIYGBgZQoUYJLly5pho7t2rXTLrJy5cpx69Ytbftbt26Z9YbYkwag//vf//j9998Z\nPHgw7733HmfPns132fkhPDxc+7HJDyEhIYwcOTLDXcGKFSto06aNxd6kSzP/bNCgQYb5a9eupWXL\nlrpqubi4MHjwYDw8PGjdujWlSpWiRYsWumo8yd27d7X65+zsrLt5anbkJQBLKedJKd3STemb8teA\na1LKw6nfV2EKyFFCiAqpWp0axwcAACAASURBVBWAtIvrOpC+tVA5dV6W5OSK/CEwB/hWCDEd+Boo\nCYwTQgRls532q5LfW49Ro0YRGhpK165dWbp0ab7KyAmDwcD69esJCwvj1KlTXLhwQffynzQbTWPb\ntm107tw53+XWqlWLdevWMWTIEOLj4+nXrx8zZsyge/fuLFiwADs7O4skWM/MANRoNHL//n1++OEH\nfH19GT9+vEXNK9Nz/vx5nnvuuae6D3JLeHg4ZcuWzXBHcPv2bXbt2kXv3r112suMZGU0Om/ePAwG\nA127dtVV7/79+4SGhrJz507CwsKIj4/XnrdYA2vb/+j1EE5KeRO4KoRIc2DwAM4CG4A0n6xBmJzj\nSZ0/MHU0RFPgfnb9v5DzQzhv4GXgOUx9HZWllA+EEDOBw0BwFjuumXIajUazrsSuXbvy3//+12Kt\nYIDSpUvTpEkT9u3bpz3k0ZP0ZqMvvvgi0dHR/PHHH3zxxRf5Ku/27dvcvn1ba2nu3buX/v37s3Dh\nQvz9/QGoUqUKzZo1A0y2T+lvOcuXL8/t27fzpZ2ZAWj58
uVp164dQgjq1q2LjY0NMTExFvFPe5Kw\nsDCzWr8nT54kPDyc/fv3ayaSvXr1onjx4lqXxOPHj+nWrZvmRWcOSUlJjB49OoP5J8C6desICwtj\nwYIFugergwcPUqlSJa1rqH379pw4cYI33nhDV530ODk5cevWLcqXL8+tW7csYmSaFTqfvxHAUiFE\nceAiMBhTw3WlEGIIcAXwSV13C9AF+Bt4lLputuTUBZEspTRKKR8B/0gpHwBIKeMBi3m9pHcHDg0N\npXr16rpr3Lt3jwcPHgCmC+zAgQO66mRlNgomo85WrVrl+yn0vXv3uHXrltY3+tprr3H58mWtH04I\nwcCBA7WA8dtvv+Hh4UGxYsWoUKEClStX5ty5c3nWzcoAtE2bNhw7dgyAK1eukJSU9FSfoCVISUkh\nPDzcrP7fESNGsHXrVjZt2sS0adNo1KgRe/fuZceOHWzatIlNmzZRokQJXYJvmvnnk0ajv/32Gz/+\n+CNfffWVRUw5K1SowMmTJ4mPj0dKyaFDh3Q1ac0Md3f3DOapehje5hY9xwFLKX9P7ZpoIKXsLqWM\nllLelVJ6SClrSinbSynvpa4rpZQfSClrSCnrSymP5lR+Ti3gRCGEfWoAfi3dAT6PTgE4M5PC8PBw\nLl26hI2NDRUrVuTDDz/UQyoDt27dYty4cRiNRqSUdO7cWReH3TSyMhsF2L59O4MH5/jjmC2zZ89m\n8uTJFCtWjMjISKZNm0bnzp3x8vICTC3DtCfdly9fJjQ0lJ9//hmj0cgXX3yRL6+0rAxAk5KS+Pjj\nj+nTpw/FihXjww8/zFcrZPr06Zw6dYoHDx7Qv39/+vfvT6lSpfj222+5f/++5pI8bdo0AE6fPo2z\nszMVKlTIs1ZBkN5o1NvbGzCNZpkxYwaJiYm8+67p+U+DBg10NaFt2LAhnTp1omfPnhgMBurUqYOP\nj0/OG+YSf39/7Rpu27Ytw4cPZ+jQofj5+bFq1SoqVqyYr6GW+aUwvQmXrSmnEOI5KWVCJvPLARWk\nlKdzEjC3CyK3GAwGa8gAaGNrrUFRTUd57949q+hYMx1l2igQa2BNA1BrYmNjY3b0HDhwYK5jzuLF\ni59dU87Mgm/q/DvAHYvskUKhUJhBYWoBF82fUYVC8f8tKiG7QqFQFBCqBaxQKBQFhArACoVCUUCo\nAKxQKBQFhArA6ShMJyO3WPOYtm3bZjWtihUrWk3LWrkB8jPeOb9Y+3VbReYUpnOjWsAKhaJIUZga\nfSoAKxSKIoUKwAqFQlFAqACsUCgUBYQKwAqFQlFAqACsUCgUBYQaBZEHCspR9caNG4wdO1ZzqPXx\n8cmQo9VcEhISePvtt0lKSiI5OZn27dszbNgwpkyZwtmzZ5FSUrVqVT766CPNO84cjEYj/fv3x9nZ\nmTlz5nD9+nXGjx9PTEwMderU4ZNPPqFYsWL5KnvYsGEMHDgQKSVnz55l2LBhzJo1i5YtW3L//n1t\nndOnT9OlSxeCgoJISUnBaDQybtw4Dh06lGfNiRMnam7ZaXll/f39uXTpEgAPHz6kVKlSrFmzJl/H\nlB5ruRXfuHHjKZ0BAwawfft2vvnmGy5evMjy5cufMiA1F0vX9ScJDw8nODiYlJQUevXqpaXZtBaF\nqQWcbTpKPUhJSclWICIiAnt7e8aNG6cF4M8//5wyZcrwzjvvMH/+fO7fv09AQEC2Onn91bt16xa3\nb9+mbt26xMbG0rNnT+bOncuLL76Y47bx8fE5riOlJD4+Hnt7e5KSkhg8eDBjx46levXqmg3NzJkz\ncXR05O23386ynNyOY12yZAlnz54lNjaWOXPmEBgYiLu7O506dSI4OJhatWrRq1evbMvIbBxwhQoV\n2L59O40bN+bx48f89NNP7Nixg5YtW7J9+/anEpWXLFlSS9pet25dfvrpJxo1avRUuTmNAz569Cj2\n9vaMHz8+02Ton332G
Q4ODgwbNizbcnJz/tIcRl566SXi4uLo3bs3X375JVFRUTRu3BhbW1vNvcTP\nzy/LcnK68J/U8fHxYc6cOYCp/k6dOpWAgIBcBeC8/JiaU9fzSlqe6B9//BEXFxe8vb354osv8qJl\ndvQcNWpUroPa7NmzCzRa57mtLoRYrOcOFJSjavny5albty5gcmGuXr26rm60WbkipwVfKSUJCQm6\n/FpHRUWxb98+zUJHSklERAQeHh6AydZpz549+S7fYDBgZ2en/b1582aW66YFXwB7e/t8e8Nl5pad\nhpSS7du34+npma+yn8RabsVP6qTVuRo1auDq6mreQWSDpet6ek6dOkXVqlWpUqUKxYsXx9PT06qO\nyKCvI4alycmUc8MT00bAK+27pXbK2o6q165d49y5cxZxRfbx8cHd3T2DK/LkyZPx8PDg0qVL9OnT\nx2ydmTNnZnD1jYmJwcHBQQseLi4u+faAu3HjBl999RVnzpzhwoULPHjwgNDQUAAmTZrE/v37mTZt\nWoZk5F27diUiIoJff/2VDz74wMyje5pjx47h5ORE1apVdS/bWm7F169f59y5c0/pWBpL1fU0oqKi\neOGFF7TvLi4uFgv2WVFkAjAmW+UHwBdASOr0MN3nTEnvijxv3rysVssVlj5RcXFx+Pr6MmHChHxZ\nxGeHwWBg5cqVT7kif/TRR+zcuRNXV1e2b99ulkZ4eDiOjo5ay0pvypQpg6enJw0aNODf//439vb2\n+Pj4MHXqVNzc3GjXrh1ly5Zl1KhR2jabNm2iUaNG9O3bl4kTJ+q+T1u2bKFLly66l2stt+KsdCyN\nJev6s0RRCsBuwDEgCJPF8l4gXkoZJqUMy2ojKeW8VCM7t/x0wKc5qgIWdVRNSkrC19eX119/nY4d\nO1pEA0yuy40aNWL//v3aPIPBQOfOnc2+PTt58iRhYWF4enoyfvx4jh49ysyZM4mNjSU5ORkwtUry\na83Ttm1brly5wt27d0lOTmbjxo00adJEa9UkJiaydOlSXnvttae2PXDgANWqVdP1/5ecnMyuXbvo\n3LmzbmVCzm7FM2bM0OWCTUpKYtSoUXh6etKhQwezy8uLrjXquouLS4YuqqioKFxcXCymlxl62dJb\nZV+zWyilTJFSzsJkrxwkhPgaK4ycsIajqpSSoKAgqlevbrZBZmY86bp86NAhqlWrxv/+9z9NPyws\nzOy+vxEjRrBt2zY2b97M9OnTcXNzIzg4GDc3Ny24b9q0Kd/W7VevXsXNzU1z623Tpg1//vlnhovK\n09NTc1lO7yzdsGFDihcvrqv/28GDB3F1dc1wm2su1nIrllJqxqKWHIWQma4l63p66tevz+XLl7l6\n9SqJiYls3rzZqo7IULhawLkKplLKa0AvIYQnpi4J3SgoR9Vjx46xfv16atWqpT3w8/PzM8viPD13\n7txh0qRJmityx44dadWqFYMHDyYuLg4pJbVq1SIoKEgXvSfx9fVl/PjxzJ07l9q1a2sP6PJK2nkK\nDw8nOTmZU6dO8dNPP7F69WqcnJwQQnD69GlGjx4NwBtvvEGfPn1ISkri8ePH+b7gAwICiIiIICYm\nBnd3dz744AN69uzJ1q1bde9+sJZbcXqdnj17AjBy5EgSExOZPn069+7dY9iwYdSuXRtzu+7SY+m6\nnh5bW1smT57M0KFDMRqN9OzZk5o1a+qukx3PQmDNLQU+DE0vrHk7kZthaHphzXSKKh2leVjzws/v\nmO5CgNkncdy4cbmOOTNmzHh2XZEVCoWisFGYWsAqACsUiiLFs/BwLbeoAKxQKIoUqgWsUCgUBYQK\nwAqFQlFAqACsUCgUBYQKwOkoTB3iucWaQ4COHz9uNa3IyEirabVr184qOml5K6yB0Wi0mpY166Cl\nh6qmR4/gqQKwQqFQFBCFqdGnArBCoShSqBawQqFQFBAqACsUCkUBoQKwQqFQFBAqACs
UCkUBoQJw\nPklISKBfv34kJiZq5n6+vr4W0Ro/fjx79+7FycmJTZs26V5+Zq6+AEuXLmX58uXY2NjQunXrHM1G\ns2L+/PmcOHGC0qVLM2PGDACuXLnCTz/9xOPHjylXrhzDhg3Dzs6Ohw8f8tVXX3Hx4kVatWpldi7a\nJx2Yg4KCOHv2LLa2ttStW5egoKB8D5NycHAgMDAQV1dXpJTMmDGDx48fExAQoPnRffTRRzx69IjS\npUvz8ccfU7t2bbZu3crs2bPzpZmVW/H9+/fx9/cnMjKSihUrEhISYpY7982bNwkKCtLyI3t7e9Ov\nXz/+/PNPPvnkEx49ekTFihWZPn26ro4V1ryu0jAajXh7e1O+fHm+//57i2o9SWEaBfFM7Wnx4sVZ\ntGgRGzZsYN26dezbt4/ff//dIlpeXl4sWLDAImUDdO/e/amKd/jwYUJDQ1mzZg0bNmwwKzl2q1at\nGDt2bIZ5CxcuxMfHR0vMvnnzZsA0ZrRnz568+eab+dZLz/LlyzMkkv/Pf/7DmjVrWLlyJQkJCaxb\nty7fZfv6+nL48GH69+/P4MGDuXLlCoGBgXz//fe89dZbhIeHa8eRmJjIggUL+Oabb8w6HltbW8aM\nGcOGDRtYtmwZv/zyC//88w8LFiygadOmbNmyhaZNm7Jw4UKzdAwGAwEBAaxdu5YlS5ZoOlOnTmXk\nyJGsXr0ad3d3fvrpJ7N0nsSa11UaixcvzpCc35oUpoTseQrAQoiWQgg/IYRFPE2EEJQsWRLI6CRs\nCRo1amRWayYnMnP1XbFiBUOHDtUMLJ2cnPJdfu3atbVzlcbNmzepXbs2APXq1SMiIgKAEiVK8O9/\n/1uXwftPOjADtGzZUqvQdevWzbcJY8mSJWnYsKF2R5KcnExsbCxVqlTRAsbRo0c1d4/Hjx9z+vRp\nEhMTzTqmrNyK9+zZk8Gd29yXOpydnalTp04GnVu3bnHlyhXN0qlZs2a6uwhb87oCUz0MCwujV69e\nFtPIjiITgIUQR9J9fgf4GigFfCiEGGeJHTIajXTr1o3mzZvTvHlzi7m3FgSXL1/m2LFj9OnTh0GD\nBnH69Gldy69UqRLHjh0D4MiRI7paAaXxpANzepKSktiyZQvNmzfPV9kVKlQgJiaGCRMmsHDhQgID\nAylRogSXLl2iVatWgOkNujTHbEuQ3q347t27mpdeuXLldE0in+a+XL9+fWrUqMGePXsA2LFjRwZP\nNb2w5nU1bdo0AgICCizA6R2AhRAGIcQJIcSm1O+uQojDQoi/hRArhBDFU+c/l/r979Tl1XIqO6cW\ncPom07tABynlVKAj0C+bHc63K7LBYGD9+vWEhYVx6tQpLly4kKftn2WMRiP3799n+fLl+Pv74+/v\nr+trnu+88w67d+9m0qRJxMfHa7b0epGTA/OMGTN45ZVXePXVV/NVvsFgoFatWqxbt44hQ4YQHx9P\nv379mDFjBt27d2fBggXY2dmRlJRkzmFkSXZuxXq2mB49eoS/vz9jxozBwcGBqVOnsmLFCvr06cOj\nR48s8pqxta6rPXv24OTkRL169SxSfm6wQAt4JHAu3fdPgVlSyheBaGBI6vwhQHTq/Fmp62VLTleo\njRCiLKZALaSUtwGklHFCiOSsNpJSzgPSIm++Ikzp0qVp0qQJ+/bto1atWvkp4pnDxcWF9u3bI4Sg\nQYMG2NjYEB0drZtrcMWKFQkMDARMD5ZOnjypS7lppDkw//bbbyQmJhIXF0dQUBDBwcF8//33REdH\nM3PmzHyXf/v2bW7fvs3Zs2cB2Lt3L/3792fhwoX4+/sDUKVKFZo1a6bL8aQnM7diJycnbt++jbOz\nM7dv39bl/5SUlISfnx9dunTR3JddXV215wWXL18mPDzcbJ2ssPR1dfz4cUJDQwkLCyMxMZHY2FjG\njBnD559/rrtWVuj5EE4IURnwBIIBP2GK2u5A39R
VFgFTgG+BbqmfAVYBXwshhMymlZXTnj6PyZb+\nKOAohKiQulMO6ODd9CRPOgkfOHCgwDryLYGHhwdHjph6dS5fvkxSUhJly5bVrfz79+8DJh80S7hJ\nZ+XAvHbtWg4ePMi0adPMqvz37t3j1q1bVKlSBYDXXnuNy5cvU6ZMGcDUshk4cGCGUSV6kJVbcdu2\nbTO4c5ubQEhKyZQpU6hevToDBw7U5qd1baSkpDB//nzd+06teV35+/sTFhZGaGgoISEhNGnSxKrB\nF/LWAk5/t546vftEcbOBsUCauaATECOlTGuAXgMqpX6uBFwFSF1+P3X9LMm2BSylrJbFohSgR3bb\n5odbt24xbtw4jEYjUko6d+5ssaxZfn5+HDlyhOjoaFq3bs2IESN0rfiZufr26NGDSZMm0a1bN4oV\nK0ZwcHC+b2vnzp3LuXPniI2NxdfXFy8vLxISEti1axdgegjYunVrbf3Ro0cTHx9PcnIyx44dIzAw\nkEqVKmVVfJ6YNm0aFSpU4K233gLA3d1dcxLOK7Nnz2by5MkUK1aMyMhIpk2bRufOnfHy8gIgLCyM\nLVu2aOuvXLmSkiVLYmtrS6tWrfD39+fy5ct50szKrXjo0KH4+/uzZs0abRiaOZw4cYJNmzZRs2ZN\nfHx8ANOP2v/+9z9++eUXwPQjnV8H66yw5nX1LJCXa+qJu/Uny+kK3JJSHhNCtNVn757QsEKqOevl\nsrMSyclZ9r7ojjXTUdatW9dqWp07d7aKTlFNR1miRAmraVk5HaXZd9Zz5szJ9Q77+vpmqSeEmA4M\nAJKBEkBpYC3QCXhBSpkshGgGTJFSdhJCbE/9fFAIYQvcBJzN6YJQKBSKQoVeD+GklOOllJVTewL6\nAKFSyn7AHsA7dbVBQFqf2IbU76QuD80u+IIKwAqFoohhhXHAgZgeyP2NqY837Q2dhYBT6nw/IMeh\nus/Uq8gKhUJhLpZ4FVlKuRfYm/r5ItA4k3UeA3l6kKQCsEKhKFI8C2+45RYVgBUKRZFCBeB0pKSk\n5LySTlgrC5Kl3sTKjLTcAdYgLUeFNbDkywbpsbOzs4oOmMbYFkUKU0CDwrW/RaYFXJhS0CkUCsuh\nArBCoVAUECoAKxQKRQFRmO6GVQBWKBRFCtUCVigUigJCBWCFQqEoIFQAVigUigJCBWAzWLx4Mb/+\n+itSSnr16mW2g29W3Lhxg7Fjx2pOuD4+PrpqJSQk8Pbbb5OUlERycjLt27dn2LBhjB8/XnMQrlev\nHhMnTtTFAcFoNDJgwADKly/P7NmzOXLkCF9++SVSSuzs7JgyZYqWZze/ZOUevH37dr755hsuXrzI\n8uXLLeKG8ODBAyZOnMhff/2FEILg4GBeeeWVfJc3YsQIBg8ejJSSP/74g3feeYctW7ZoThjOzs4c\nPXoUHx8fRo8eTZ8+fQCTgWft2rWpXLky0dHRZh2TpZ2507B0XS8orawoTA/hLJ6OMiUlJdcCFy5c\nwN/fn5UrV1KsWDHeeecdpkyZQtWqVXPcNq8n/datW9y+fZu6desSGxtLz549mTt3Li+++GKO28bH\nx+e4jpSS+Ph47O3tSUpKYvDgwYwdO5b79+/TsmVLwHQBvvrqq1pu2MzIberLJUuWcO7cOeLi4pg9\nezZeXl6EhITg6urKr7/+yh9//MGUKVOyLSOnFIdpjhUvvfQScXFx+Pj4MGfOHMB0/qdOnUpAQECu\nAnBe7ZICAwNxc3OjV69eJCYm8vjxY0qXLp3jdpm9iFGxYkVCQ0N5+eWXefz4MUuWLGH79u38/PPP\n2jrLly9n06ZNLF26NMO2Xbp0wdfXN9N0mnl9ESMiIgJ7e3sCAwMtGoDNqesFoGV283XJkiW5jjn9\n+/cv0OZyTqacTYQQpVM/2wkhpgohNgohPhVC6G4pfPHiRRo0aICdnR22trY0atSInTt36i0DQPny\n5bX8tw4ODpo
Trl4IIbC3twcyOtG2atVKFwfh9ERFRbF///6nEnnHxcUBEBsbq5lLmkNW7sE1atTI\nYFOvNw8fPuTo0aN4e5syABYvXjxXwTc7bG1tsbOzw2AwYG9vz40bN7RlpUqVom3btmzYsOGp7Xr3\n7s3KlSvN0k7D0s7caVi6rheUVlZYIRuabuTUbPwBeJT6+UtMFkWfps77Ue+dqVmzJseOHSM6Opr4\n+HjCw8Mt4hD7JNeuXePcuXO6O8UajUZ8fHxwd3enadOm1K9fX1uWlJTE5s2badGihdk6ISEh+Pr6\nZqhQkyZNYuTIkXTp0oUtW7bofhuY3j3Y0ly7dg1HR0fGjx9Pjx49mDhxIo8ePcp5wyyIjIxk1qxZ\n/PXXX1y+fJkHDx5oTiIAb7zxBnv27OHhw4cZtrOzs6NDhw6sXbs239oFjaXqekFrpacoBWCbdN5H\nblLKUVLK31KdkbM0lcqvK3KNGjUYOnQoQ4cO5Z133qF27doW78+Ji4vD19eXCRMmPOWEay4Gg4GV\nK1eyfft2zpw5w99//60tmzZtGq+++mq+HYTT2LdvH46Ojk/ljFi2bBlffvklW7Zs4fXXX2fWrFlm\n6aQnO/dgS5CcnMzZs2d58803Wbt2LXZ2dsyfPz/f5ZUpU4bXX3+d2rVr4+rqir29PW+++aa23MfH\nJ9NWrqenJwcPHjS777egsGRdL0itJylKAfiMEGJw6ueTQgg3ACFELSDLjDRSynlSSjcppVtevcG8\nvb1ZvXo1S5Ys4fnnn6datWp52j4vJCUl4evry+uvv07Hjh0tplO6dGkaNWrE/v37Afjuu++Ijo4m\nICDA7LJPnjxJeHg4r7/+OkFBQURERDBy5EguXLig9cV27NiRU6dOma0FmbsHW5oXXngBFxcXrSXV\nqVMnzTk5P7i7u3P58mXu3LlDcnIy69evp2nTpoDJCdnNzY2tW7c+tV2vXr10636wNtaq69bWyoyi\nFICHAm2EEP8ALwEHhRAXgfmpy3QnzSE2MjKSnTt30rVrV0vIIKUkKCiI6tWrM3jw4Jw3yCNPOtEe\nOnQIV1dX1qxZw4EDB5gxY4Yurfvhw4ezZcsWNm7cSHBwMI0aNSIkJITY2FiuXLkCwKFDh3T5IcvK\nPdjSODs7U6FCBS5evAjAwYMHqVGjRr7Lu3r1Ko0bN9Ye0LVr147z588D0KNHD7Zu3UpCQkKGbUqX\nLk2rVq3YuHFjvnULCkvX9YLSygobG5tcTwVNTq7I94G3Uh/Euaauf01KabFe9ZEjRxITE4OtrS2T\nJk0y+2FLVhw7doz169dTq1YtunXrBpicktu0aaNL+Xfu3GHSpEmkpKSQkpJCx44dad26Na+99hoV\nKlTQbMk9PDx47733dNFMw9bWlokTJzJ27FhsbGwoVaoUkydPNrvcrNyDExMTmT59Ovfu3WPYsGHU\nrl2bvHQ95YaJEycyZswYkpKSqFKlCtOmTct3WREREaxdu5ZDhw6RnJzMyZMnWbjQ5Crj4+OTqY16\nt27d2LVrl1l9z09iaWfuNCxd1wtKKyuehZZtbnmmhqGZgzV/zXIzDE0vrOnAbE2n3bwOQ8svKh9w\nocPs6Llq1apcxxxvb+8CjdbP3IsYCoVCYQ6FqQWsArBCoShSqACsUCgUBcSz8HAtt6gArFAoihSq\nBaxQKBQFhArA6ShMJyO36JG9LLdY01U6LXeENbDWiAtrjkxwcnKymlbaeHlrYOmRUunRI14Uppij\nWsAKhaJIoQKwQqFQFBAqACsUCkUBoUZBKBQKRQGhWsAKhUJRQKgArFAoFAWECsBmoLf5YlZY2jxw\n4sSJhIWF4ejoyPr16wGYO3cuq1atomzZsgCMGjWK1q1b66JnNBrp378/zs7OzJkzh+vXrzN+/Hhi\nYmKoU6cOn3zyiS7D5x4+fEhwcDAXL15ECMHEiRP517/+xcSJE4mMjKRixYoEB
webncUuISGBwYMH\na6amHTp0YNiwYSxfvpylS5dy9epV9u7dq51LvbCEUeZ///tfBgwYgJSSs2fPMmLECD777DNefvll\nhBD8888/DB8+nLi4OCpXrsxXX32Fk5MT0dHRvP/++0RGRpq9D9YyAAVTvuWSJUtiMBgwGAysXr3a\nonpPUpgC8DPXWx0cHEyrVq3YunUr69atMyvva3YYDAbGjRvHli1bWLFiBcuWLcvgWGEu3bt35/vv\nv39q/sCBA1mzZg1r1qzRLfiCyUQyvS/bnDlz6NevHxs2bKB06dKsW7dOF50vvviCZs2asXLlSpYs\nWUK1atVYvHgxbm5urF69Gjc3NxYvXmy2TvHixVmwYAG//vorK1euZP/+/Zw6dYqXX36Z77//nooV\nK+pwNE/j5eXFggULdCuvQoUKvPvuu3h4eNCyZUsMBgNeXl5MnDiRNm3a0Lp1a65du8bQoab02h99\n9BErVqygdevWzJw5k0mTJumyH3ofV04sXryYdevWWT34QtFKyG5VLGG+mBWWNg90c3OziuEimEw5\n9+3bp5lySimJiIjAw8MDgK5du7Jnzx6zdWJjYzlx4gRvvPEGYHohpVSpUoSHh+Pp6QmYbHvCwsLM\n1srM1BSgTp06VKpU8YqBYwAAGVBJREFUyezys8ISRpm2traUKFECg8GAnZ0dN27cyOA3Z2dnp73s\n8O9//5vw8HDAZDf1n//8R5d9sJYB6LNAYUrInpMrsq8Qooq1dkZv88W86FrLPHDZsmXasd2/f1+X\nMmfOnMnIkSO1ChUTE4ODg4OWc9fFxYXbt2+brRMZGUnZsmX5+OOPGTBgAMHBwcTHx3Pv3j3KlSsH\nmN4Gu3fvntla8H+mpu3ataNp06ZWMQDVmxs3bvD1119z8uRJzp49y4MHD9i7dy8AX331FefOnePF\nF1/UPO7OnDmjucB07dqVUqVK6d7NYmmEEAwZMgQvLy9WrFhRIPpFpQX8MXBYCLFPCDFMCJErb/P8\nmnLqbb6YG6xpHti7d2+2bdvG6tWrcXZ2ztR5Ia+Eh4fj6Oio2cVbEqPRyJ9//omXlxc///wzJUqU\nYNGiRRnW0bNip5ma7tixgzNnzvDXX3/pUq41ef755+nSpQuvvvoqdevWpWTJkprrxYgRI6hbty5/\n/fUXPXr0AODDDz+kRYsW7Nmzh+bNmxMZGYnRaCzIQ8gzy5YtY82aNcyfP59ly5YRERFhVf2iFIAv\nApUxBeLXgLNCiG1CiEFCiFJZbZRfU069zRdzwtrmgeXKlcNgMGBjY4O3tzenT582u8yTJ08SFhaG\np6cn48eP5+jRo8ycOZPY2Fjttj0qKgpn51z9dmZL+fLlKV++vGb26e7uzp9//omjoyN37twBTFZM\nerfY0kxNDxw4oGu51qBNmzZcuXKFu3fvkpyczKZNm2jcuLG2PCUlhTVr1mit3ps3bzJo0CDatWtH\ncHAwgOYtWFhwcXEBTHdD7du3180QNrcUpQAspZQpUsodUsohQEXgG6AzpuCsK3qbL2ZHQZgHpu8G\n2LVrFzVr1jS7zBEjRrBt2zY2b97M9OnTcXNzIzg4GDc3N3bv3g3Apk2baNu2rdlaTk5OlC9fXjP7\nPHr0KK6urrRq1YrNmzcDsHnzZl0eLmZmampJh2xLcf36ddzc3DRrpNatW3PhwoUMD0w7d+6ste4d\nHR21wDBq1CiWLl1q/Z02g0ePHhEbG6t93r9/P7Vq1bLqPhSmAJzTMLQMeyilTAI2ABuEEPaW2CE9\nzRezw9LmgQEBAURERBATE4O7uzsffPABERERnD9/HiEEFStWZMqUKbpoZYavry/jx49n7ty51K5d\nW3tAZy4BAQFMnjyZ5ORkKlasyKRJk5BSMmHCBDZs2ECFChW0lps53Llzh4kTJ2YwNW3Tpg1Lly7l\np59+4u7du/Tq1YuWLVvqeh71Nso8duwYG
zZsYM+ePSQnJ3P69GkWLVrEunXrKFWqFEIIzpw5w5gx\nYwBo0aKFdk4PHjzI2LFjn8njyoq7d+8yfPhwwNRl1bVrV1q1aqW7TnY8Cw/Xcku2ppxCiFpSygvm\nCEgr5bKz5q+ZNY0yn7RHtyRJSUlW07JWOkprGo2qdJTmI3S4kPfv35/rHW7RokWWeqkDEBYDLoAE\n5kkpvxRCOAIrgGrAZcBHShmduu9fAl2AR8BbUsrj2eln+1NhbvBVKBQKa6NjF0Qy4C+lfAloCnwg\nhHgJGAfsllLWBHanfgf4D1AzdXoX+DYngcLTVlcoFIpcoFcAllLeSGvBSikfAueASkA3IG34zyIg\nrX+vG7BYmjgElBFCVMhOQwVghUJRpMhLAE4/ZDZ1ynTYlhCiGvAKcBhwkVLeSF10E1MXBZiC89V0\nm11LnZclz1wuCIVCoTCHvHQjSynnAdm+rCCEcABWA6OklA/Sly+llEKIfHeSqwCsUCiKFHqOghBC\nFMMUfJdKKdekzo4SQlSQUt5I7WK4lTr/OpD+zeHKqfOy3lfd9lShUCieAfTqA04d1bAQOCel/CLd\nog1AWurEQcD6dPMHChNNgfvpuioyxeIt4Pj4eEtLAKbEPdbCYDBYTSttAL81SEt+Yw2ehUHwemPN\noWHWHOtqTWdpPa5jHetWC2AAcFoI8XvqvAnADGClEGIIcAXwSV22BdMQtL8xDUPL8Q0v1QWhUCiK\nFHoFYCnlbzzxMlo6PDJZXwIf5EVDBWCFQlGkKEx3VyoAK/5fe2ceXVV17/HPjwTC2IdEiUzSREAR\nIq9SZFIIkDAkEkzCAklFheW0CgQxxDJYXG9JB+hLK6u+VSpIZIpAg65ChaaksKDUAJpAIMhoZUgg\niExCCJn4vT/uzTXUABnOOTc33Z+1zsrNvffs79nnnPs7++x99u9rMDQoTAA2GAwGL+FLuSBMADYY\nDA0K0wI2GAwGL+FLAdjrbfXi4mKeffZZxo0bR1xcHH/4gyt/xe7du5kwYQLjx49n0qRJnDp1qs5a\nb775Jk8++aQn/SRAYmIisbGxxMbGEhERQWxsbJ11qmLo0KGMHj2ap59+mri4OFs0AD744AOeeuop\nRo8eTWJioq3Z1L799lsSEhIYNWoUkZGR7N271xad2bNn079/f0/ScrvZsWMHI0aMICIigpo4utQU\nO+qVkJDA/v37OXDgANOnTwfgnnvuIT09nSNHjpCenk7r1q0931+0aBFHjx5l3759tXYfLygoYPLk\nyYwZM4ann36aVatW3fL58uXLCQ0N5dKlS7WvWA3wpXzAqKqtS2Fhod5puXbtmp4/f14LCwv18uXL\nGhsbq5mZmRoeHq4HDhzQwsJCTUlJ0cTExDuWU1paetclMzNTc3JyNDIyssrPf/GLX+iiRYvuWs7N\nmzdrvISFhemFCxdqvF55eXm1lzNnzuiQIUO0sLBQy8vLNSEhQdPS0qq9fk23LSkpSdeuXas3b97U\nGzdu6OXLl6u9bk3Ys2eP5ubmalRUVI3Wqw1lZWU6bNgwPXXqlBYXF+vo0aP12LFjtmjVpV4i8r2l\nZ8+eeuDAAW3evLn6+/vrli1btEuXLrpw4UKdNWuWiojOmjVLFyxYoCKikZGRumnTJhUR7devn+7a\ntavKcouLi++45OXl6d69e7W4uFgvXryoERER+sUXX2hxcbGePHlSX3jhBR08eLAWFBTctSy1IOYc\nPHhQq7tYoVeX5W6mnE1E5DkRCXf/Hy8i74rIFPcUvTojVbjfVlydCgsLAZdbshWWOndyKlZV0tPT\nPe6+vkp5eTk3btygrKyMoqIi2rZta4uOkw7WTjr67t+/n86dO9OpUyeaNGlCVFSUx1nEaqyuV/fu\n3dmzZw9FRUWUl5ezY8cOYmNjiY6O9nj3LV++3HMHOGbMGFauXAm47jhbt27N/fffX2Pd++67z+NJ\n2KJFC
4KDgz0O4wsXLuT11193tLXpS67Id+sDTnF/p7mIPA+0BD7C9RDy43w3Ha9OlJeXEx8fz+nT\npxk/fjyhoaHMmzePadOmERAQQIsWLVixYoUVUrclKyuLwMBAOnfubEv54naKBZc55/jx4y3XCAoK\nYtKkSQwbNoyAgAAGDhzIwIEDLdeBWx2sjxw5Qo8ePZgzZ46js+ns4Ny5c7cEoaCgIMc9zWpLbm4u\n8+fPp02bNhQVFTFq1CiysrIICgqioKAAcHUXVHi2tW/fntOnv0velZeXR4cOHTzfrQ35+fkcPnyY\nRx99lK1bt9K2bVseeuihulWshtSLroVqcrdLQKiqjgdigOHAWFVdiWuK3W07jCqneFu2bNldN8LP\nz4+1a9eSnp5Obm4ux48fZ/Xq1fz+978nPT2dMWPGkJycXINq1ZxNmzYRGRlpW/lOOMVeuXKFrVu3\nsmXLFrZv305RUREbNmywXAe842BtuDOHDx9m4cKFpKens3nzZnJycqp0VFabHC6uX7/OjBkz+NnP\nfoafnx9Lly5lypQaTQyzBF/qA75bAG4kIk2AVkBzoOJ+KQC4bReEVnJFnjx5crU3plWrVvz4xz/m\nn//8J0ePHiU0NBSA4cOHk5OTU+1yakpZWRkZGRmMHDnSNg0nnGIzMzPp0KEDbdq0oXHjxoSHh9s2\nMOa0g7VTVG4tgqtFXHHsfIFly5bRp08fwsLCuHTpEkePHr2lVX///ffz9deu5F1nzpyhU6fvknd1\n7NiR/Pw7Ju+6LaWlpcyYMYOoqCjCw8M5ffo0+fn5jB07lhEjRnDu3DnGjRvncc+2k4YUgN8HDgP7\ngLnAn0RkCfAZsMaKDbh48SJXr14FXEk/du/eTXBwMNeuXfO47+7atesWF1mryczMJDg4uFb9X9XB\nKafYdu3akZOTQ1FREarKrl27bHOVdtLB2klCQ0M5ceIEp0+fpqSkhE8++YShQ4d6e7OqTcVYSadO\nnYiJiSE1NZWNGzfy/POu3sLnn3/ec1e0YcMGJk6cCEDfvn25cuVKrbofVJW33nqLkJAQj063bt3Y\nvn076enppKenExQUxLp167j33nutqOYd8aUAfMc+YFX9nYisdb8+IyIrgHBgiarusWIDvvnmG+bN\nm+dxv42IiGDQoEH8/Oc/Z+bMmYgIP/jBDyxxvq3KqTguLo7Nmzfb2v3glFNsr169GDFiBHFxcfj5\n+dG9e3fGjRt39xVriVMO1k45+gL4+/szb948XnzxRcrLy4mLi6Nr1662aNlRr7S0NAIDAyktLWXq\n1KlcuXKFX//616xdu5bJkydz8uRJz/hDRbfbsWPHuH79OjW5W63M3r172bhxI127dvUMyiYkJDBo\n0KA61aW21IfAWl3u6IpsBdevX3fEUrWhpqO0+/hUxskT15d+JPWRBpyOss4nxldffVXtH01wcLBX\nT0QzE85gMDQofOnibgKwwWBoUJgAbDAYDF7CBGCDwWDwEiYAGwwGg5cwAdhgMBi8RH3I8VBdbA/A\nAQEBdksAeBL3OIFdSWeqoqE+htYQcfJY2Zlm9N9p2rSpY1pVTZ2uKb50HpsWsMFgaFCYAGwwGAxe\nwgRgg8Fg8BImABsMBoOXMINwBoPB4CVMC9hgMBi8hC8FYK+31efOncsTTzxBdHT09z5LSUnhkUce\nsdRNNTo6mmeeeYb4+Hiee+45wOUMO3bsWCZMmEBSUpInP7FVnD17lokTJxIZGUlUVJTHn8sOnHRF\ndsqt2Mn957QDc3l5OTExMbzyyiuWlnv27FkmTZpEdHT0Ld5vFQ4zoaGh5Obm1klj+vTp7N+/n5yc\nHFavXk1AQABDhgzhs88+Iycnh5SUFE/mwMGDB3Px4kWysrLIysrizTffrHMdb4cv5QP2egCOiYmp\n0vr77NmzfPrpp7Rr185yzcWLF5Oamurxmevbty9r1qzhww8/5IEHHuC
DDz6wVM/Pz49Zs2axadMm\n1q5dS2pqKsePH7dUA1zuDatWrSItLY2NGzdy8+ZNNm3aZLlOBbGxsSxdutS28itwav+Bc3WqYMWK\nFYSEhFherr+/P0lJSWzYsIHU1FTWrFnDl19+SZcuXXjnnXfo3bt3ncpv374906ZN4/HHH6dXr174\n+fkRHx9PSkoK8fHx9OrVi5MnT3oStAPs3LmT3r1707t3b+bPn1/XKt6WBhWARSRERGaKyCIR+a2I\nvCoils1EuJ1T8YIFC0hMTHRkJ/Xr1w9/f1dvTM+ePT2OrlbRtm1bevToAUDLli0JCQmxXKMCp1yR\nwTm3Yif3n5MOzAUFBWzfvt2W5PL/7lRcsc8efPBBy9xl/P39adasGX5+fjRv3pzCwkJKSko4duwY\nABkZGcTGxlqiVRMaTAAWkQRgMdAU6IPLC64TsEtEwuzaqL///e+0bduWhx9+2PKyRYSpU6cyceJE\nPvroo+99vmHDBgYMGGC5bgV5eXkcOnTI46VmJZVdkQcNGkSrVq1sc0X2FnbuP6f55S9/6XF9sZP8\n/HwOHTrEo48+almZZ86cITk5mRMnTpCfn8+VK1dYt24d/v7+ntZ1XFwcHTt29KzTr18/srOz+eST\nTzwXBzvwJVv6u23BS8AoVZ2Py4qoh6rOBUYCv7vdSpVdkWvqlFtUVMR7773HtGnTarRedVmyZAmr\nVq1i0aJFpKWlkZ2d7fls2bJl+Pv7M2rUKFu0CwsLSUhIYM6cObRs2dLy8p10RfYGdu8/J9m2bRuB\ngYH07NnTVp3KTsVW7rPWrVsTHR3Ngw8+SMeOHWnRogU/+clPiI+PJzk5mczMTK5eveqZWpydnU1w\ncDCPPfYY7777bpWNH6toMC1gNxVPSgQALQFU9RTVdEV+6aWXarRBFW6qMTExhIeHc+7cOeLi4jh/\n/nyNyrkdFbfkbdq0ISwsjIMHDwKwceNGdu7cydtvv23LgSktLSUhIYHRo0czfPhwy8sHZ12RncaJ\n/eck2dnZbN26laFDh5KYmMju3btJSkqyVKO0tJTXXnuNqKgoIiIiLC07PDycEydO8M0331BWVsbH\nH39M//792bVrF2FhYfTv359//OMfnu6Iq1evevK1bN68mcaNGxMYGGjpNlXQkALwUuAztxNyJvB/\nACJyH3DRjg3q1q0bO3fuJCMjg4yMDIKCgli/fr3H7bUuFBUVeU6CoqIij2vwp59+ysqVK0lOTrYl\n8YiqMnfuXEJCQpg0aZLl5VfgpCuykzi1/5wkMTGR7du3s3XrVpKTk+nbty+/+c1vLCtfVZk3b94t\nTsVWcurUKfr27UuzZs0AGDp0KIcOHfL8Tps0aUJSUhJ//OMfAVf3WAV9+vShUaNGXLhwwfLtAt8K\nwHdzRV4kIhlAdyBZVQ+73z8PWGJ5OnPmTPbs2cPly5cZMmQIU6dOJS4uzoqiv8eFCxd44403ACgr\nK2PkyJEMGDCAmJgYSkpKmDJlCuCyJp89e7ZlullZWfz5z3+mW7dujBkzBnA54g4ePNgyDXDeFdkp\nt2Kn9l9FuU45MNtJZafiit/T9OnTKSkp4Ve/+hUXL17kpz/9KQ8//HCVTyHdjT179rB+/Xo+//xz\nysrK2LdvH0uWLOHtt98mKiqKRo0asXjxYrZt2wa4+oNfffVVz+BwfHy8pfWtTH0IrNXFdlfk8vJy\nR3L0NdR0lDdv3nRMqz4MSvgyTqajLCsrc0zL4XSUdY6eN27cqPaBaNq0qXFFNhgMBqvwpYaECcAG\ng6FB4UtdECYAGwyGBoUvBWDfaasbDAZDNbDyKQgRGSkiR0TkuIjMsnpbTQvYYDA0KKxqAYuIH65H\nbyOAPFyP5G5Q1S8sEcAEYIPB0MCwcBDuceC4qv4LQETWAGMA3wnAfn5+tbocicjLqlrtBxRr+2hY\nTXXqQm20ansy1fd61Xet2ujUtuV
VG63GjW87EdVyrdo6FTt5Xvy7dLW/KPIy8HKlt96rtM0dgNOV\nPssD+tZ9876jPvcBv3z3r/iUjtHyLa2GWKeGrFUrKqdNcC+OXjDqcwA2GAwGb5KPK/tjBR3d71mG\nCcAGg8FQNZ8BXUUkWESaAM8AlqYXrM+DcE7dCjh5y2G0fEerIdapIWtZjqqWichUIB3wA5ap6kEr\nNWzPBWEwGAyGqjFdEAaDweAlTAA2GAwGL1HvArDdU/8q6SwTka9FpG7e3NXT6iQi20TkCxE5KCLT\nbdRqKiJ7RCTHrfU/dmm59fxEZK+I/MVmnRMickBE9onI5zZrtRaRNBE5LCKHRKS/TToPuetTsXwr\nIq/ZpDXDfT7kisiHImJbjkkRme7WOWhXfRoMqlpvFlwd3V8CIUATIAd4xCatQcBjQK4D9WoHPOZ+\n3Qo4amO9BGjpft0Y2A30s7FurwOpwF9s3ocngHvtPlZureXAi+7XTYDWDmj6AQVAZxvK7gB8BTRz\n/78OeMGmevQEcoHmuAb5M4AuThw3X1zqWwvYM/VPVUuAiql/lqOqO7DJVqkKrbOqmu1+fRU4hOtH\nYYeWquo197+N3YstI60i0hGIwmVd1SAQkf/CdXF+H0BVS1T1sgPSw4AvVfWkTeX7A81ExB9XcDxj\nk053YLeqXlfVMmA74Lw3vY9Q3wJwVVP/bAlU3kJEfgj8CFfL1C4NPxHZB3wNbFFVu7TeAd4AnLDt\nUOBvIpLlnj5qF8HAeSDF3bWyVERa2KhXwTPAh3YUrKr5wP8Cp4CzwBVV/ZsdWrhav0+KSKCINAci\nuXUyg6ES9S0AN2hEpCWwHnhNVb+1S0dVy1X1v3HN3HlcRCz3PheRp4CvVTXL6rJvwxOq+hgwCpgi\nIpZ4ElaBP66uqT+o6o+AQsC2sQgA90P+0cCfbCr/Hlx3ksFAe6CFiDxrh5aqHgIWAH8D/grsA2qX\nTOI/gPoWgG2f+uctRKQxruC7WlU/ckLTfeu8DRhpQ/EDgWgROYGrq2ioiKyyQQfwtOJQ1a+Bj3F1\nV9lBHpBX6a4hDVdAtpNRQLaqnrOp/HDgK1U9r6qlwEfAAJu0UNX3VbW3qg4CLuEa8zBUQX0LwLZP\n/fMG4kqT9T5wSFV/a7PWfSLS2v26Ga5cpoet1lHV2araUVV/iOs4bVVVW1pVItJCRFpVvAaG47rV\ntRxVLQBOi8hD7reGYWH6wdswAZu6H9ycAvqJSHP3uTgM1ziELYhIW/ffB3D1/6bapeXr1KupyOrA\n1L8KRORDIAy4V0TygLdU9X07tHC1FicCB9x9swBzVHWTDVrtgOXuZNKNgHWqausjYg4QBHzsTvfo\nD6Sq6l9t1JsGrHY3Av4FTLJLyH1BiQBesUtDVXeLSBqQDZQBe7F3mvB6EQkESoEpDg1i+iRmKrLB\nYDB4ifrWBWEwGAz/MZgAbDAYDF7CBGCDwWDwEiYAGwwGg5cwAdhgMBi8hAnABoPB4CVMADYYDAYv\n8f/8tofo00sXwAAAAABJRU5ErkJggg==\n",
1136 | "text/plain": [
1137 | ""
1138 | ]
1139 | },
1140 | "metadata": {
1141 | "tags": []
1142 | }
1143 | }
1144 | ]
1145 | },
1146 | {
1147 | "cell_type": "markdown",
1148 | "metadata": {
1149 | "id": "w_qD224pDeos",
1150 | "colab_type": "text"
1151 | },
1152 | "source": [
1153 |         "**MBConv** Block"
1154 | ]
1155 | },
1156 | {
1157 | "cell_type": "code",
1158 | "metadata": {
1159 | "id": "6saiCur_ujfy",
1160 | "colab_type": "code",
1161 | "colab": {}
1162 | },
1163 | "source": [
1164 | "def mbConv_block(input_data, block_arg):\n",
1165 | " \"\"\"Mobile Inverted Residual block along with Squeeze and Excitation block.\"\"\"\n",
1166 | " kernel_size = block_arg.kernel_size\n",
1167 |         "  num_repeat = block_arg.num_repeat\n",
1168 |         "  input_filters = block_arg.input_filters\n",
1169 |         "  output_filters = block_arg.output_filters\n",
1170 |         "  expand_ratio = block_arg.expand_ratio\n",
1171 |         "  id_skip = block_arg.id_skip\n",
1172 |         "  strides = block_arg.strides\n",
1173 |         "  se_ratio = block_arg.se_ratio\n",
1174 |         "  # Expansion phase\n",
1175 | " expanded_filters = input_filters * expand_ratio\n",
1176 | " x = Conv2D(expanded_filters, 1, padding='same', use_bias=False)(input_data)\n",
1177 | " x = BatchNormalization()(x)\n",
1178 | " x = Activation(swish_activation)(x)\n",
1179 | " # Depthwise Convolution\n",
1180 | " x = DepthwiseConv2D(kernel_size, strides, padding='same', use_bias=False)(x)\n",
1181 | " x = BatchNormalization()(x)\n",
1182 | " x = Activation(swish_activation)(x)\n",
1183 |         "  # Squeeze-and-Excitation step\n",
1184 | " se = GlobalAveragePooling2D()(x)\n",
1185 |         "  se = Reshape((1, 1, expanded_filters))(se)\n",
1186 |         "  squeezed_filters = max(1, int(input_filters * se_ratio))\n",
1187 | " se = Conv2D(squeezed_filters , 1, activation=swish_activation, padding='same')(se)\n",
1188 | " se = Conv2D(expanded_filters, 1, activation='sigmoid', padding='same')(se)\n",
1189 | " x = multiply([x, se])\n",
1190 | " # Outputs\n",
1191 |         "  x = Conv2D(output_filters, 1, padding='same', use_bias=False)(x)\n",
1192 | " x = BatchNormalization()(x)\n",
1193 | " return x\n"
1194 | ],
1195 | "execution_count": 0,
1196 | "outputs": []
1197 | }
1198 | ]
1199 | }
--------------------------------------------------------------------------------
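The `mbConv_block` cell above can also be exercised outside the notebook. Below is a self-contained sketch assuming TensorFlow Keras; the `BlockArgs` namedtuple and the local `swish_activation` are stand-ins for the notebook's own definitions, which live in earlier cells. Unlike the notebook cell, this sketch also wires up the identity skip connection that the `id_skip` argument suggests:

```python
from collections import namedtuple

import tensorflow as tf
from tensorflow.keras.layers import (Input, Conv2D, DepthwiseConv2D,
                                     BatchNormalization, Activation,
                                     GlobalAveragePooling2D, Reshape,
                                     multiply, add)
from tensorflow.keras.models import Model

def swish_activation(x):
    # Stand-in for the notebook's swish definition: x * sigmoid(x)
    return x * tf.sigmoid(x)

# Hypothetical container mirroring the attributes mbConv_block reads from block_arg
BlockArgs = namedtuple('BlockArgs', ['kernel_size', 'num_repeat', 'input_filters',
                                     'output_filters', 'expand_ratio', 'id_skip',
                                     'strides', 'se_ratio'])

def mbConv_block(input_data, block_arg):
    """Mobile Inverted Residual block with Squeeze-and-Excitation."""
    expanded_filters = block_arg.input_filters * block_arg.expand_ratio
    # Expansion phase: 1x1 conv widens the channels by expand_ratio
    x = Conv2D(expanded_filters, 1, padding='same', use_bias=False)(input_data)
    x = BatchNormalization()(x)
    x = Activation(swish_activation)(x)
    # Depthwise convolution on the expanded representation
    x = DepthwiseConv2D(block_arg.kernel_size, block_arg.strides,
                        padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    x = Activation(swish_activation)(x)
    # Squeeze-and-Excitation: global pool, bottleneck, then channel-wise gating
    se = GlobalAveragePooling2D()(x)
    se = Reshape((1, 1, expanded_filters))(se)
    squeezed_filters = max(1, int(block_arg.input_filters * block_arg.se_ratio))
    se = Conv2D(squeezed_filters, 1, activation=swish_activation, padding='same')(se)
    se = Conv2D(expanded_filters, 1, activation='sigmoid', padding='same')(se)
    x = multiply([x, se])
    # Output phase: 1x1 conv projects back down to output_filters
    x = Conv2D(block_arg.output_filters, 1, padding='same', use_bias=False)(x)
    x = BatchNormalization()(x)
    # Identity skip connection when the block preserves spatial size and channels
    if block_arg.id_skip and block_arg.strides == 1 \
            and block_arg.input_filters == block_arg.output_filters:
        x = add([x, input_data])
    return x

args = BlockArgs(kernel_size=3, num_repeat=1, input_filters=16, output_filters=16,
                 expand_ratio=6, id_skip=True, strides=1, se_ratio=0.25)
inp = Input(shape=(32, 32, 16))
model = Model(inp, mbConv_block(inp, args))
print(model.output_shape)  # (None, 32, 32, 16)
```

With `strides == 1` and matching input/output filters, the residual branch is active and the block keeps the input shape, as the printed output shape shows.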