├── 1_0_load_and_display_data.ipynb
├── 2.1_data_loader_PyTorch.ipynb
├── 3.1_linearclassiferPytorch.ipynb
├── 4.1_resnet18_PyTorch.ipynb
├── DL0321EN-1-1-Loading-Data-py-v1.0.ipynb
├── DL0321EN-2-1-Data-Preparation-py-v1.0.ipynb
├── README.md
└── _AI-Capstone-Project-with-Deep-Learning.ipynb
/3.1_linearclassiferPytorch.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "\n",
8 | " \n",
9 | " "
10 | ]
11 | },
12 | {
13 | "cell_type": "markdown",
14 | "metadata": {},
15 | "source": [
16 | " "
17 | ]
18 | },
19 | {
20 | "cell_type": "markdown",
21 | "metadata": {},
22 | "source": [
23 | "
Linear Classifier with PyTorch "
24 | ]
25 | },
26 | {
27 | "cell_type": "markdown",
28 | "metadata": {},
29 | "source": [
30 | "Before you use a Deep neural network to solve the classification problem, it 's a good idea to try and solve the problem with the simplest method. You will need the dataset object from the previous section.\n",
31 | "In this lab, we solve the problem with a linear classifier.\n",
32 | " You will be asked to determine the maximum accuracy your linear classifier can achieve on the validation data for 5 epochs. We will give some free parameter values if you follow the instructions you will be able to answer the quiz. Just like the other labs there are several steps, but in this lab you will only be quizzed on the final result.
"
33 | ]
34 | },
35 | {
36 | "cell_type": "markdown",
37 | "metadata": {},
38 | "source": [
39 | "Table of Contents "
40 | ]
41 | },
42 | {
43 | "cell_type": "markdown",
44 | "metadata": {},
45 | "source": [
46 | "\n",
47 | "\n",
48 | "\n",
49 | "
\n",
56 | "
Estimated Time Needed: 25 min
\n",
57 | "
\n",
58 | " \n"
59 | ]
60 | },
61 | {
62 | "cell_type": "markdown",
63 | "metadata": {},
64 | "source": [
65 | "Download Data "
66 | ]
67 | },
68 | {
69 | "cell_type": "markdown",
70 | "metadata": {},
71 | "source": [
72 | "In this section, you are going to download the data from IBM object storage using wget , then unzip them. wget is a command the retrieves content from web servers, in this case its a zip file. Locally we store the data in the directory /resources/data . The -p creates the entire directory tree up to the given directory."
73 | ]
74 | },
75 | {
76 | "cell_type": "markdown",
77 | "metadata": {},
78 | "source": [
79 | "First, we download the file that contains the images, if you dint do this in your first lab uncomment:"
80 | ]
81 | },
82 | {
83 | "cell_type": "code",
84 | "execution_count": null,
85 | "metadata": {},
86 | "outputs": [],
87 | "source": [
88 | "#!wget https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/DL0321EN/data/images/concrete_crack_images_for_classification.zip -P /resources/data"
89 | ]
90 | },
91 | {
92 | "cell_type": "markdown",
93 | "metadata": {},
94 | "source": [
95 | "We then unzip the file, this ma take a while:"
96 | ]
97 | },
98 | {
99 | "cell_type": "code",
100 | "execution_count": null,
101 | "metadata": {},
102 | "outputs": [],
103 | "source": [
104 | "#!unzip -q /resources/data/concrete_crack_images_for_classification.zip -d /resources/data"
105 | ]
106 | },
107 | {
108 | "cell_type": "markdown",
109 | "metadata": {},
110 | "source": [
111 | "We then download the files that contain the negative images:"
112 | ]
113 | },
114 | {
115 | "cell_type": "markdown",
116 | "metadata": {},
117 | "source": [
118 | "Imports and Auxiliary Functions "
119 | ]
120 | },
121 | {
122 | "cell_type": "markdown",
123 | "metadata": {},
124 | "source": [
125 | "The following are the libraries we are going to use for this lab:"
126 | ]
127 | },
128 | {
129 | "cell_type": "code",
130 | "execution_count": 1,
131 | "metadata": {},
132 | "outputs": [],
133 | "source": [
134 | "from PIL import Image\n",
135 | "import matplotlib.pyplot as plt\n",
136 | "import os\n",
137 | "import glob\n",
138 | "import torch\n",
139 | "from torch.utils.data import Dataset, DataLoader\n",
140 | "import torchvision.transforms as transforms\n",
141 | "import torch.nn as nn\n",
142 | "from torch import optim "
143 | ]
144 | },
145 | {
146 | "cell_type": "markdown",
147 | "metadata": {},
148 | "source": [
149 | "Dataset Class "
150 | ]
151 | },
152 | {
153 | "cell_type": "code",
154 | "execution_count": 17,
155 | "metadata": {},
156 | "outputs": [
157 | {
158 | "name": "stdout",
159 | "output_type": "stream",
160 | "text": [
161 | "1.0_load_and_display_data.ipynb\n",
162 | "2.1_data_loader_PyTorch.ipynb\n",
163 | "3.1_linearclassiferPytorch.ipynb\n",
164 | "DL0321EN-1-1-Loading-Data-py-v1.0.ipynb\n",
165 | "DL0321EN-2-1-Data-Preparation-py-v1.0.ipynb\n",
166 | "DL0321EN-3-1-Pretrained-Models-py-v1.0.ipynb\n",
167 | "\u001b[0m\u001b[01;34m__MACOSX\u001b[0m/\n",
168 | "concrete_crack_images_for_classification.zip\n",
169 | "\u001b[01;34mconcrete_data_week2\u001b[0m/\n",
170 | "\u001b[01;34mconcrete_data_week2.2\u001b[0m/\n",
171 | "concrete_data_week2.zip\n"
172 | ]
173 | }
174 | ],
175 | "source": [
176 | "ls"
177 | ]
178 | },
179 | {
180 | "cell_type": "markdown",
181 | "metadata": {},
182 | "source": [
183 | "In this section, we will use the previous code to build a dataset class. As before, make sure the even samples are positive, and the odd samples are negative. If the parameter train
is set to True
, use the first 30 000 samples as training data; otherwise, the remaining samples will be used as validation data. Do not forget to sort your files so they are in the same order. "
184 | ]
185 | },
186 | {
187 | "cell_type": "code",
188 | "execution_count": 18,
189 | "metadata": {},
190 | "outputs": [
191 | {
192 | "data": {
193 | "text/plain": [
194 | "'concrete_data_week2.2'"
195 | ]
196 | },
197 | "execution_count": 18,
198 | "metadata": {},
199 | "output_type": "execute_result"
200 | }
201 | ],
202 | "source": [
203 | "directory=\"concrete_data_week2.2\"\n",
204 | "directory"
205 | ]
206 | },
207 | {
208 | "cell_type": "code",
209 | "execution_count": 19,
210 | "metadata": {},
211 | "outputs": [
212 | {
213 | "name": "stdout",
214 | "output_type": "stream",
215 | "text": [
216 | "\u001b[0m\u001b[01;34mNegative\u001b[0m/ \u001b[01;34mPositive\u001b[0m/\n"
217 | ]
218 | }
219 | ],
220 | "source": [
221 | "ls \"concrete_data_week2.2\""
222 | ]
223 | },
224 | {
225 | "cell_type": "code",
226 | "execution_count": 25,
227 | "metadata": {},
228 | "outputs": [
229 | {
230 | "data": {
231 | "text/plain": [
232 | "'concrete_data_week2.2/Negative'"
233 | ]
234 | },
235 | "execution_count": 25,
236 | "metadata": {},
237 | "output_type": "execute_result"
238 | }
239 | ],
240 | "source": [
241 | "negative_file_path=os.path.join(directory,negative)\n",
242 | "negative_file_path"
243 | ]
244 | },
245 | {
246 | "cell_type": "code",
247 | "execution_count": 22,
248 | "metadata": {},
249 | "outputs": [
250 | {
251 | "data": {
252 | "text/plain": [
253 | "['concrete_data_week2.2/Negative/00001.jpg',\n",
254 | " 'concrete_data_week2.2/Negative/00002.jpg',\n",
255 | " 'concrete_data_week2.2/Negative/00003.jpg']"
256 | ]
257 | },
258 | "execution_count": 22,
259 | "metadata": {},
260 | "output_type": "execute_result"
261 | }
262 | ],
263 | "source": [
264 | "negative='Negative'\n",
265 | "negative_file_path=os.path.join(directory,negative)\n",
266 | "negative_files=[os.path.join(negative_file_path,file) for file in os.listdir(negative_file_path) if file.endswith(\".jpg\")]\n",
267 | "negative_files.sort()\n",
268 | "negative_files[0:3]"
269 | ]
270 | },
271 | {
272 | "cell_type": "code",
273 | "execution_count": 23,
274 | "metadata": {},
275 | "outputs": [
276 | {
277 | "data": {
278 | "text/plain": [
279 | "['concrete_data_week2.2/Positive/00001.jpg',\n",
280 | " 'concrete_data_week2.2/Positive/00002.jpg',\n",
281 | " 'concrete_data_week2.2/Positive/00003.jpg']"
282 | ]
283 | },
284 | "execution_count": 23,
285 | "metadata": {},
286 | "output_type": "execute_result"
287 | }
288 | ],
289 | "source": [
290 | "positive=\"Positive\"\n",
291 | "positive_file_path=os.path.join(directory,positive)\n",
292 | "positive_files=[os.path.join(positive_file_path,file) for file in os.listdir(positive_file_path) if file.endswith(\".jpg\")]\n",
293 | "positive_files.sort()\n",
294 | "positive_files[0:3]"
295 | ]
296 | },
297 | {
298 | "cell_type": "code",
299 | "execution_count": 26,
300 | "metadata": {},
301 | "outputs": [
302 | {
303 | "data": {
304 | "text/plain": [
305 | "40000"
306 | ]
307 | },
308 | "execution_count": 26,
309 | "metadata": {},
310 | "output_type": "execute_result"
311 | }
312 | ],
313 | "source": [
314 | "number_of_samples = len(positive_files) + len(negative_files)\n",
315 | "number_of_samples"
316 | ]
317 | },
318 | {
319 | "cell_type": "code",
320 | "execution_count": 27,
321 | "metadata": {},
322 | "outputs": [],
323 | "source": [
324 | "class Dataset(Dataset):\n",
325 | "\n",
326 | " # Constructor\n",
327 | " def __init__(self,transform=None,train=True):\n",
328 | " directory=\"concrete_data_week2.2\"\n",
329 | " positive=\"Positive\"\n",
330 | " negative=\"Negative\"\n",
331 | "\n",
332 | " positive_file_path=os.path.join(directory,positive)\n",
333 | " negative_file_path=os.path.join(directory,negative)\n",
334 | " positive_files=[os.path.join(positive_file_path,file) for file in os.listdir(positive_file_path) if file.endswith(\".jpg\")]\n",
335 | " positive_files.sort()\n",
336 | " negative_files=[os.path.join(negative_file_path,file) for file in os.listdir(negative_file_path) if file.endswith(\".jpg\")]\n",
337 | " negative_files.sort()\n",
338 | " #idx\n",
339 | " self.all_files=[None]*number_of_samples\n",
340 | " self.all_files[::2]=positive_files\n",
341 | " self.all_files[1::2]=negative_files \n",
342 | " # The transform is goint to be used on image\n",
343 | " self.transform = transform\n",
344 | " #torch.LongTensor\n",
345 | " self.Y=torch.zeros([number_of_samples]).type(torch.LongTensor)\n",
346 | " self.Y[::2]=1\n",
347 | " self.Y[1::2]=0\n",
348 | " \n",
349 | " if train:\n",
350 | " self.all_files=self.all_files[0:30000]\n",
351 | " self.Y=self.Y[0:30000]\n",
352 | " self.len=len(self.all_files)\n",
353 | " else:\n",
354 | " self.all_files=self.all_files[30000:]\n",
355 | " self.Y=self.Y[30000:]\n",
356 | " self.len=len(self.all_files) \n",
357 | " \n",
358 | " # Get the length\n",
359 | " def __len__(self):\n",
360 | " return self.len\n",
361 | " \n",
362 | " # Getter\n",
363 | " def __getitem__(self, idx):\n",
364 | " \n",
365 | " \n",
366 | " image=Image.open(self.all_files[idx])\n",
367 | " y=self.Y[idx]\n",
368 | " \n",
369 | " \n",
370 | " # If there is any transform method, apply it onto the image\n",
371 | " if self.transform:\n",
372 | " image = self.transform(image)\n",
373 | "\n",
374 | " return image, y"
375 | ]
376 | },
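{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (an addition, not one of the graded steps), we can confirm the interleaving: even indices should be positive samples with label 1, and odd indices negative samples with label 0. This is a minimal sketch that assumes the directory above has been unzipped:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sanity check: even samples are positive (y=1), odd samples are negative (y=0)\n",
"dataset_check = Dataset(train=True)\n",
"print(dataset_check.all_files[0], int(dataset_check.Y[0]))  # expect a Positive path and 1\n",
"print(dataset_check.all_files[1], int(dataset_check.Y[1]))  # expect a Negative path and 0"
]
},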
377 | {
378 | "cell_type": "markdown",
379 | "metadata": {},
380 | "source": [
381 | ""
382 | ]
383 | },
384 | {
385 | "cell_type": "markdown",
386 | "metadata": {},
387 | "source": [
388 | "Create a transform object, that uses the Compose
function. First use the transform ToTensor()
and followed by Normalize(mean, std)
. The value for mean
and std
are provided for you."
389 | ]
390 | },
391 | {
392 | "cell_type": "code",
393 | "execution_count": 28,
394 | "metadata": {},
395 | "outputs": [],
396 | "source": [
397 | "mean = [0.485, 0.456, 0.406]\n",
398 | "std = [0.229, 0.224, 0.225]\n",
399 | "# transforms.ToTensor()\n",
400 | "#transforms.Normalize(mean, std)\n",
401 | "#transforms.Compose([])\n",
402 | "\n",
403 | "transform =transforms.Compose([ transforms.ToTensor(), transforms.Normalize(mean, std)])\n"
404 | ]
405 | },
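{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick check (an addition to the lab), we can apply the transform to a single image and confirm it produces a normalized 3 x 227 x 227 tensor; this assumes negative_files was built above:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Apply the transform to one image: PIL image -> normalized tensor\n",
"sample_image = Image.open(negative_files[0])\n",
"sample_tensor = transform(sample_image)\n",
"sample_tensor.shape  # expect torch.Size([3, 227, 227])"
]
},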
406 | {
407 | "cell_type": "markdown",
408 | "metadata": {},
409 | "source": [
410 | "Create object for the training data dataset_train
and validation dataset_val
. Use the transform object to convert the images to tensors using the transform object:"
411 | ]
412 | },
413 | {
414 | "cell_type": "code",
415 | "execution_count": 29,
416 | "metadata": {},
417 | "outputs": [],
418 | "source": [
419 | "dataset_train=Dataset(transform=transform,train=True)\n",
420 | "dataset_val=Dataset(transform=transform,train=False)"
421 | ]
422 | },
423 | {
424 | "cell_type": "markdown",
425 | "metadata": {},
426 | "source": [
427 | "We can find the shape of the image:"
428 | ]
429 | },
430 | {
431 | "cell_type": "code",
432 | "execution_count": 30,
433 | "metadata": {},
434 | "outputs": [
435 | {
436 | "data": {
437 | "text/plain": [
438 | "torch.Size([3, 227, 227])"
439 | ]
440 | },
441 | "execution_count": 30,
442 | "metadata": {},
443 | "output_type": "execute_result"
444 | }
445 | ],
446 | "source": [
447 | "dataset_train[0][0].shape"
448 | ]
449 | },
450 | {
451 | "cell_type": "markdown",
452 | "metadata": {},
453 | "source": [
454 | "We see that it's a color image with three channels:"
455 | ]
456 | },
457 | {
458 | "cell_type": "code",
459 | "execution_count": 31,
460 | "metadata": {},
461 | "outputs": [
462 | {
463 | "data": {
464 | "text/plain": [
465 | "154587"
466 | ]
467 | },
468 | "execution_count": 31,
469 | "metadata": {},
470 | "output_type": "execute_result"
471 | }
472 | ],
473 | "source": [
474 | "size_of_image=3*227*227\n",
475 | "size_of_image"
476 | ]
477 | },
478 | {
479 | "cell_type": "markdown",
480 | "metadata": {},
481 | "source": [
482 | " Question "
483 | ]
484 | },
485 | {
486 | "cell_type": "markdown",
487 | "metadata": {},
488 | "source": [
489 | " Create a custom module for Softmax for two classes,called model. The input size should be the size_of_image
, you should record the maximum accuracy achieved on the validation data for the different epochs. For example if the 5 epochs the accuracy was 0.5, 0.2, 0.64,0.77, 0.66 you would select 0.77. "
490 | ]
491 | },
492 | {
493 | "cell_type": "markdown",
494 | "metadata": {},
495 | "source": [
496 | "Train the model with the following free parameter values:"
497 | ]
498 | },
499 | {
500 | "cell_type": "markdown",
501 | "metadata": {},
502 | "source": [
503 | "Parameter Values \n",
504 | " learning rate:0.1 \n",
505 | " momentum term:0.1 \n",
506 | " batch size training:1000 \n",
507 | " Loss function:Cross Entropy Loss \n",
508 | " epochs:5 \n",
509 | " set: torch.manual_seed(0) "
510 | ]
511 | },
512 | {
513 | "cell_type": "code",
514 | "execution_count": 32,
515 | "metadata": {},
516 | "outputs": [
517 | {
518 | "data": {
519 | "text/plain": [
520 | ""
521 | ]
522 | },
523 | "execution_count": 32,
524 | "metadata": {},
525 | "output_type": "execute_result"
526 | }
527 | ],
528 | "source": [
529 | "torch.manual_seed(0)"
530 | ]
531 | },
532 | {
533 | "cell_type": "markdown",
534 | "metadata": {},
535 | "source": [
536 | "Custom Module: "
537 | ]
538 | },
539 | {
540 | "cell_type": "markdown",
541 | "metadata": {},
542 | "source": [
543 | "Model Object: "
544 | ]
545 | },
546 | {
547 | "cell_type": "code",
548 | "execution_count": 33,
549 | "metadata": {},
550 | "outputs": [],
551 | "source": [
552 | "class SoftMax(nn.Module):\n",
553 | " \n",
554 | " # Constructor\n",
555 | " def __init__(self, input_size, output_size):\n",
556 | " super(SoftMax, self).__init__()\n",
557 | " self.linear = nn.Linear(input_size, output_size)\n",
558 | " \n",
559 | " # Prediction\n",
560 | " def forward(self, x):\n",
561 | " z = self.linear(x)\n",
562 | " return z"
563 | ]
564 | },
565 | {
566 | "cell_type": "code",
567 | "execution_count": 34,
568 | "metadata": {},
569 | "outputs": [
570 | {
571 | "data": {
572 | "text/plain": [
573 | "torch.Size([3, 227, 227])"
574 | ]
575 | },
576 | "execution_count": 34,
577 | "metadata": {},
578 | "output_type": "execute_result"
579 | }
580 | ],
581 | "source": [
582 | "dataset_train[0][0].shape"
583 | ]
584 | },
585 | {
586 | "cell_type": "code",
587 | "execution_count": 35,
588 | "metadata": {},
589 | "outputs": [
590 | {
591 | "data": {
592 | "text/plain": [
593 | "154587"
594 | ]
595 | },
596 | "execution_count": 35,
597 | "metadata": {},
598 | "output_type": "execute_result"
599 | }
600 | ],
601 | "source": [
602 | "input_dim=3*227*227\n",
603 | "input_dim"
604 | ]
605 | },
606 | {
607 | "cell_type": "code",
608 | "execution_count": 36,
609 | "metadata": {},
610 | "outputs": [
611 | {
612 | "data": {
613 | "text/plain": [
614 | "2"
615 | ]
616 | },
617 | "execution_count": 36,
618 | "metadata": {},
619 | "output_type": "execute_result"
620 | }
621 | ],
622 | "source": [
623 | "output_dim=2\n",
624 | "output_dim"
625 | ]
626 | },
627 | {
628 | "cell_type": "code",
629 | "execution_count": 37,
630 | "metadata": {},
631 | "outputs": [
632 | {
633 | "name": "stdout",
634 | "output_type": "stream",
635 | "text": [
636 | "Print the model:\n",
637 | " SoftMax(\n",
638 | " (linear): Linear(in_features=154587, out_features=2, bias=True)\n",
639 | ")\n"
640 | ]
641 | }
642 | ],
643 | "source": [
644 | "model = SoftMax(input_dim, output_dim)\n",
645 | "print(\"Print the model:\\n \", model)"
646 | ]
647 | },
648 | {
649 | "cell_type": "code",
650 | "execution_count": 38,
651 | "metadata": {},
652 | "outputs": [
653 | {
654 | "name": "stdout",
655 | "output_type": "stream",
656 | "text": [
657 | "W: torch.Size([2, 154587])\n",
658 | "b: torch.Size([2])\n"
659 | ]
660 | }
661 | ],
662 | "source": [
663 | "print('W: ',list(model.parameters())[0].size())\n",
664 | "print('b: ',list(model.parameters())[1].size())"
665 | ]
666 | },
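{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note that although the module is named SoftMax, its forward method returns raw logits; nn.CrossEntropyLoss applies log-softmax internally, so no explicit softmax layer is needed for training. To inspect class probabilities you can apply the softmax yourself; a minimal sketch (an addition to the lab, using the model and dataset_train defined above):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import torch.nn.functional as F\n",
"\n",
"x, _ = dataset_train[0]\n",
"z = model(x.view(-1, 3 * 227 * 227))  # raw logits, shape [1, 2]\n",
"F.softmax(z, dim=1)  # class probabilities that sum to 1"
]
},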
667 | {
668 | "cell_type": "markdown",
669 | "metadata": {},
670 | "source": [
671 | "Optimizer: "
672 | ]
673 | },
674 | {
675 | "cell_type": "code",
676 | "execution_count": 39,
677 | "metadata": {},
678 | "outputs": [],
679 | "source": [
680 | "learning_rate = 0.1"
681 | ]
682 | },
683 | {
684 | "cell_type": "code",
685 | "execution_count": 40,
686 | "metadata": {},
687 | "outputs": [],
688 | "source": [
689 | "momentum = 0.1"
690 | ]
691 | },
692 | {
693 | "cell_type": "code",
694 | "execution_count": 41,
695 | "metadata": {},
696 | "outputs": [],
697 | "source": [
698 | "optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate, momentum=momentum)"
699 | ]
700 | },
701 | {
702 | "cell_type": "markdown",
703 | "metadata": {},
704 | "source": [
705 | "Criterion: "
706 | ]
707 | },
708 | {
709 | "cell_type": "code",
710 | "execution_count": 42,
711 | "metadata": {},
712 | "outputs": [],
713 | "source": [
714 | "criterion = nn.CrossEntropyLoss()"
715 | ]
716 | },
717 | {
718 | "cell_type": "markdown",
719 | "metadata": {},
720 | "source": [
721 | "Data Loader Training and Validation: "
722 | ]
723 | },
724 | {
725 | "cell_type": "code",
726 | "execution_count": 43,
727 | "metadata": {},
728 | "outputs": [],
729 | "source": [
730 | "train_dataset=dataset_train"
731 | ]
732 | },
733 | {
734 | "cell_type": "code",
735 | "execution_count": 44,
736 | "metadata": {},
737 | "outputs": [],
738 | "source": [
739 | "validation_dataset=dataset_val"
740 | ]
741 | },
742 | {
743 | "cell_type": "code",
744 | "execution_count": 45,
745 | "metadata": {},
746 | "outputs": [],
747 | "source": [
748 | "train_loader = torch.utils.data.DataLoader(dataset=train_dataset, batch_size=1000)"
749 | ]
750 | },
751 | {
752 | "cell_type": "code",
753 | "execution_count": 48,
754 | "metadata": {},
755 | "outputs": [],
756 | "source": [
757 | "validation_loader = torch.utils.data.DataLoader(dataset=validation_dataset, batch_size=1000)"
758 | ]
759 | },
760 | {
761 | "cell_type": "markdown",
762 | "metadata": {},
763 | "source": [
764 | "Train Model with 5 epochs, should take 35 minutes: "
765 | ]
766 | },
767 | {
768 | "cell_type": "code",
769 | "execution_count": 49,
770 | "metadata": {},
771 | "outputs": [],
772 | "source": [
773 | "n_epochs = 5\n",
774 | "loss_list = []\n",
775 | "accuracy_list = []\n",
776 | "N_test = len(validation_dataset)\n",
777 | "\n",
778 | "def train_model(n_epochs):\n",
779 | " for epoch in range(n_epochs):\n",
780 | " for x, y in train_loader:\n",
781 | " optimizer.zero_grad()\n",
782 | " z = model(x.view(-1, 3 * 227 * 227))\n",
783 | " loss = criterion(z, y)\n",
784 | " loss.backward()\n",
785 | " optimizer.step()\n",
786 | " \n",
787 | " correct = 0\n",
788 | " # perform a prediction on the validationdata \n",
789 | " for x_test, y_test in validation_loader:\n",
790 | " z = model(x_test.view(-1, 3 * 227 * 227))\n",
791 | " _, yhat = torch.max(z.data, 1)\n",
792 | " correct += (yhat == y_test).sum().item()\n",
793 | " accuracy = correct / N_test\n",
794 | " loss_list.append(loss.data)\n",
795 | " accuracy_list.append(accuracy)\n",
796 | "\n",
797 | "train_model(n_epochs)"
798 | ]
799 | },
800 | {
801 | "cell_type": "code",
802 | "execution_count": 50,
803 | "metadata": {},
804 | "outputs": [
805 | {
806 | "data": {
807 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAagAAAEYCAYAAAAJeGK1AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nOzdd3xUZfb48c9JL6QRQgeDgmRABQUVO1XRiAUsoGuXouuufY37Wxdcv+vGta2uhaLYFdCgoFFpitgRQhGYUIQACZCQkJBeJvP8/pgLG0NCBpLJTJLzfr3yysy997lzwro5ufee5zlijEEppZTyNX7eDkAppZSqiyYopZRSPkkTlFJKKZ+kCUoppZRP0gSllFLKJ2mCUkop5ZM8lqBEJEREVorIOhHZKCKPW9vbi8gSEdlqfY+pMeZREdkmIptF5JIa2weJyK/WvhdFRDwVt1JKKd/gySuoCmC4MWYAMBAYLSJDgCRgmTGmD7DMeo+I9APGA/2B0cArIuJvnetVYBLQx/oa7cG4lVJK+QCPJSjjUmy9DbS+DHAl8Ja1/S3gKuv1lcAcY0yFMWYHsA04S0S6AJHGmB+Na1bx2zXGKKWUaqUCPHly6wpoNdAbeNkY87OIdDLG7AUwxuwVkY7W4d2An2oMz7S2VVmva28/gj3BNgnXlRb9N6cPCg0La8ofRymlfEppaakxxrTaWgKPJihjTDUwUESigY9F5JSjHF7XcyVzlO1HsKXbZwIzAULDw01JSckxRqyUUi2HiJR5OwZPapbMa4wpAJbjenaUbd22w/qeYx2WCfSoMaw7sMfa3r2O7UoppVoxT1bxxVlXTohIKDASSAcWArdYh90CLLBeLwTGi0iwiPTCVQyx0rodWCQiQ6zqvZtrjFFKKdVKefIWXxfgLes5lB8wzxjzmYj8CMwTkTuAXcC1AMaYjSIyD9gEOIA/WrcIAe4C3gRCgS+sL6WUUq2YtNZ2G+H6DEop1cqJSKkxJtzbcXhKq63+UEop1bJpglJKKeWTNEEppZTySZqglFLKiw5m53o7BJ+lCUoppbxk2+qNDH7mO1Le0cLkumiCUkopLzBOJ++89ikOvwDOvGCgt8PxSZqglFLKCw58+BGLArpyVpShZ3wXb4fjkzRBKaVUM3Pk5vL16x+yLzyW6y453dvh+CxNUEop1cyyk59icVx/wgL9uPTUzh77HBEZbTWA3SYiSXXsf1hE1lpfG0SkWkTaW/syrEaxa0VkVY0x9TadbWqaoJRSqhkVf/sdOV8s4vueg7jstK6EBXlmxTlrmbmXgUuBfsAEqzHsYcaYp40xA40xA4FHgW+MMQdqHDLM2j+4xrY6m856giYopZRqJs6yMvY9/jgrTxtGifFj3BndGx50/M4CthljthtjKoE5uBrD1mcC8IEb562v6WyT0wSllFLNJPfV6VRlZvLN4ES6RYdydq/2nvy4bsDuGu/rbfYqImG42iGl1NhsgMUislpEJtXY/rums0BHPMSjDQuVUkq5lG/eQt7s2VRcdT0/5lbxp2En4OdXVz9W9/mFRgbEJ6WuqrFpZkZy4kzrtdvNXoExwPe1bu+dZ4zZY3U9XyIi6caYFY0K+BhpglJKKQ8zTif7pk7Fv107vrtoHOabXYxtgtt7zrJCR0Zy4uB6dtfXBLYu46l1e88Ys8f6niMiH+O6ZbgCq+msMWZvraazTU5v8SmllIcVzJtH2dq1xP3lET7elMfgE2KI7+DxLhm/AH1EpJeIBOFKQgtrHyQiUcBF1GgEKyLhIhJx6DVwMbDB2l1f09kmpwlKKaU8qConh5xnnyPs7LPJGHwRv+0vYdwgjxZHAGCMcQD3AIsAO66msRtFZIqITKlx6NXAYmNMzQZ6nYDvRGQdsBJINcZ8ae1LBkaJyFZglPXeI7RhoVJKeVDWAw9QtHQZvRZ8wv+tK2Heqt388reRRIYENvrc2rBQKaXUcSlesYLCz78gdspk6NGThev2cEn/zk2SnNoCTVBKKeUBztJS9k17nKATTyT2zjv5yp7DwbKqZrm911poFZ9SSnlA7iuvULVnDye88zZ+QUGkpGXSKTKY83t38HZoLYZeQSmlVBMrT08n7403ibpmHGFnnklucQXLN+/nqtO74d/IuU9tiSYopZRqQqa6mr1Tp+IfFUWnhx4CYMHaPTichms8u7RRq6MJSimlmlD+3LmUr1tPp6RH8I+OBiBldSandY+iT6cIL0fXsmiCUkqpJlKVncP+554n/NxziBwzBoBNewrZtLfQ0wvDtkqaoJRSqolkP/kkprKSzlOnIuJ61pSSlkmgv3DFgK5ejq7l0QSllFJNoOjrrylatIgOd99F0AknAFBV7WTB2iyGJ3QkJjzIyxG2PJqglFKqkZwlJex74gmCep9E7O23H96+Yst+cosr9fbecdJ5UEop1Uj7X3oZx569nPDeu0jQ/66UUtIyaR8exNC+HmuZ1KrpFZRSSjVC+aZNHHj7baKvu46wQYMOby8orWTpphyuGNCVoAD9VXs8PPavJiI9RORrEbGLyEYRudfaPk1EskRkrfV1WY0xj4rINhHZLCKX1Ng+SER+tfa9KIeePiqllBeZ6mr2/n0q/tHRdHzwgd/t+3T9XiqrnVyjSxsdN0/e4nMADxpj0qy+IqtFZIm173ljzDM1DxaRfrj6lfQHugJLReRkY0w18CowCfgJ+BxXa+IvPBi7Uko1KP/9DyjfsIGuzzyDf1TU7/alrM6kb6cI+neN9FJ0LZ/HrqCMMXuNMWnW6yJc/Ui6HWXIlcAcY0yFMWYHsA04y+rYGGmM+dG4eoO8DVzlqbiVUsodVfv2sf/55wk//3wiEy/73b7f9hezdncB4wZ1Q2/4HL9muTEqIvHA6cDP1qZ7RGS9iMwWkRhrWzdgd41hmda2btbr2tuVUsprsv/5T0x1NZ2n/v2IJJSyOhM/gasG6q+qxvB4ghKRdkAKcJ8xphDX7bqTgIHAXuDZQ4fWMdwcZfsR7Am2SfYE2yp7gm2VcTgaHbtSStWlaNkyipYspcMf/0hQjx6/21ftNHy8JosLT46jY2SIlyJsHTyaoEQkEFdyes8YMx/AGJNtjKk2xjiBWcBZ1uGZQM3/pbsDe6zt3evYfgRbun2mLd0+2JZuHywBWkGvlGp61cUl7Hvi/wju04fY2249Yv+Pv+Wx92C5T8x9EpHRVtHZNhFJqmP/wzUK1jaISLWItK+vyM0aU2+hW1PzZBWfAK8DdmPMczW2d6lx2NXABuv1QmC8iASLSC+gD7DSGLMXKBKRIdY5bwYWeCpupZQ6mtz/vohj3z46/+NxJPDIzrgpaZlEhAQwql8nL0T3PyLiD7wMXAr0AyZYxWiHGWOeNsYMNMYMBB4FvjHGHOB/RW42YAjwx1pjnz80zhjzuad+Bk9eZpwH3AT8KiJrrW1/xfWPNBDXbboMYDKAMWajiMwDNuH6x/mjVcEHcBfwJhCKq3pPK/iUUs2ubMNGDrzzLtHjryfs9NOP2F9
c4eDLDfu46vRuhAT6eyHC3zkL2GaM2Q4gInNwFaNtquf4CcAH4Cpyw/UIBmNMkYgcKnKrb6xHeCxBGWO+o+7nR/VmW2PMP4F/1rF9FXBK00WnlFLHxjgc7Pv73/GPbU/HBx6o85jPf91LWVW1r8x9qqvw7Oy6DhSRMFzTd+6pY188vy9yA1eh283AKlxXWvlNE/Lv6YMapZRyQ/5771G+aRPdnn8O/8i65zalrM6kV4dwzugZ3Swx+YVGBsQnpa6qsWlmRnLiTOu12wVmwBjge+v23mF1FLmBq9DtCetcT+AqdLsdD9AEpZRSDajau5ecF14k/KILiRg9us5jdh8o5ecdB3jo4pObbe6Ts6zQkZGcOLie3fUVntVlPNbtvUPqKnIDV6FbjWNmAZ8dR+hu0QWilFLqKIwx7Hvi/8DppPNjR855OmR+WhYicLUPVO9ZfgH6iEgvEQnClYQW1j5IRKKAi6hRfFZfkZu1r75CtyanV1BKKXUURUuXUvzVV3R8+CGCutc98dYYw/w1mZxzYizdokObOcK6GWMcInIPsAjwB2ZbxWhTrP3TrUOvBhYbY0pqDK+zyM2q2Pt3XYVuniCu1YNan/DwcFNSUtLwgUopVY/q4mK2J16Of0wMvT6cV2dZOcAvGQe4dvqPPHvtAMY1Y4GEiJQaY8Kb7QObmV5BKaVUPfb/5wUcOTl0f/GFepMTuIojwoL8GX1K52aMrvXTZ1BKKVWHsl9/Jf+994iZMIHQAQPqPa68qprU9Xu59JQuhAfr3/xNSROUUkrVYhwO9v59KgFxccTdf99Rj120cR9FFQ7GDdKFYZuapnullKrlwNvvUGG30+2FF/CPiDjqsR+tzqRbdChDesU2U3Rth15BKaVUDVVZWez/739pN2wYERePOuqx+w6W8/22XMae0Q0/P+371NQ0QSmllMUYw75/PAEidH7sbw1OuP14TRZOA2N9Z+5Tq6IJSimlLEWLFlP8zTfE/elPBHbtetRjjTGkpGUy6IQYenVotZXeXqUJSimlgOqiIrL/+U+C+9lof9MfGjx+feZBtuUU+0Tfp9ZKiySUUgrY//zzOPLy6P7KK7jT8DQlLZOgAD8ST+vS4LHq+OgVlFKqzStbu5b8D+YQc+ONhJ7acGefCkc1C9ft4eJ+nYgKrX8Cr2ocTVBKqTbNVFWxd+o0Ajp2JO7eexseAHydnkNBaVWzLmvUFuktPqVUm3bgrbeo2LyZ7i/9F/927hU7fLQ6i7iIYC7o3cHD0bVtegWllGqzKjMz2f/Sy7QbMYKIkSPdGpNXXMHyzTlcfXo3Avz1V6gn6b+uUqpNMsaw7/F/IH5+dP7b/3N73IK1e3A4jVbvNQNNUEqpNqnoiy8o+fZb4u67l8Au7lfipaRlcmq3KPp2PvoSSKrxNEEppdqc6sJC9j35L0L69yfmxhvdHpe+r5CNewoZd4YuDNsctEhCKdXm5Dz7HNUHDtBjxnTE39/tcSmrMwn0F64YqAmqOegVlFKqTSlNW0PB3Lm0v+kmQvv3d3uco9rJx2v2MKxvR9qHB3kwwqYjIqNFZLOIbBORpDr2Pywia62vDSJSLSLtjzZWRNqLyBIR2Wp9j/FU/JqglFJthqmqYt/UqQR06ULcn/90TGO/3ZpLbnFFi5n7JCL+wMvApUA/YIKI9Kt5jDHmaWPMQGPMQOBR4BtjzIEGxiYBy4wxfYBl1nuP0ASllGoz8ma/QcXWrXR+7DH8wo9tgdeP0jKJCQtkWN+OHoquyZ0FbDPGbDfGVAJzgCuPcvwE4AM3xl4JvGW9fgu4qskjt2iCUkq1CZW7dpH7yitEjBpFxPBhxzT2YGkVSzZlc+XAbgQFtJhfm92A3TXeZ1rbjiAiYcBoIMWNsZ2MMXsBrO8ey9haJKGUavUOz3kKCKDTMcx5OuSzX/dQ6XD63Nwnv9DIgPik1FU1Ns3MSE6cab2uq5mVqedUY4DvjTEHjmOsx2iCUkq1eoWfpVLy/fd0+tvfCOzU6ZjHp6zO5ORO7TilW6QHojt+zrJCR0Zy4uB6dmcCPWq87w7sqefY8fzv9l5DY7NFpIsxZq+IdAFyjj1y97SYa1XlXSU//cRvl4wm55lnqNy9u+EBSvmI6oICspOTCTntNGImjD/m8dv3F5O2q4BxZ3RvsMOuj/kF6CMivUQkCFcSWlj7IBGJAi4CFrg5diFwi/X6llrjmpQmKNUgYww5zz2PIzeXvDfe5LeLL2HXxEkULVuGcTi8HZ5SR5Xz7HNUFxTQ5fFpxzTn6ZD5aVn4CVx9esua+2SMcQD3AIsAOzDPGLNRRKaIyJQah14NLDbGlDQ01tqdDIwSka3AKOu9R4gxnrmtKCI9gLeBzoATmGmMecGqsZ8LxAMZwHXGmHxrzKPAHUA18GdjzCJr+yDgTSAU+By41zQQeHh4uCkpKTnaIcpNJT/8wK7b76Dz44/TbuhFFHz4EQUffogjO5uAzp2JvvYaoq+5lsBOLaa6SbURpatWsfMPN9H+9tvp9JeHj3m802k4/6mv6NMpgrduP8sDETaOiJQaY1ptv3lPJqguQBdjTJqIRACrcZUj3gocMMYkW5O/Yowxj1g19h/gKm/sCiwFTjbGVIvISuBe4CdcCepFY8wXR/t8TVBNZ+fNt1C5cycnLVmMX5BrgqJxOChevpz8D+ZQ8v334O9PxIgRxIy/nrAhQxA/vThX3mUqK9l+9VhMWRknfvYpfmFhx3yO77flcuNrP/PihNO5YkBXD0TZOK09QXmsSMIqPzxUilgkInZcZYpXAkOtw94ClgOPWNvnGGMqgB0isg04S0QygEhjzI8AIvI2rkR31ASlmkZp2hpKV66k06NJh5MTgAQEEDFyJBEjR1K5axf5c+dyMGU+RYsXE3TCCUSPH0/01VfhHx3txehVW5Y3ezaVv/1GjxnTjys5gas4IiIkgIv7HXthhWq8ZvkzV0TigdOBn6m/hr6+uvtu1uva249gT7BNsifYVtkTbKv02UjTyJ0xHf+YGKKvvbbeY4J69qTTww/T+5vldH363/jHxpLz1FNsvfAi9jySRNnatXjqSl2pulRmZJD7yqtEjB5Nu4suOq5zFFc4+GLDPi4/rQshgcf+7Eo1nsfLzEWkHa7JX/cZYwqPUgVTX9292/X4tnT7TGAmgISH62/ERirbuJGSb1YQd999bv0F6hccTNSYMUSNGUP55s3kz5lD4YKFHFywgOCEBGLGjydqzOXHPINfqWNhjGHv448jQUF0+uujx32eL37dS1lVtc/NfWpLPHoFJSKBuJLTe8aY+dbmbOv5FLVq6Ouru8+0Xtferjwsb8ZM/CIiiLnxhmMeG9K3L12mTqX3ihV0njYNgH3TprH1wovY+/jjlG/e0sTRKuVSuHAhpT/+RMcHHyCw4/EX7qSkZRIfG8agEzy2FqpqgMcSlLgulV4H7MaY52rsqq+GfiEwXkSCRaQX0AdYad0GLBKRIdY5b8aDdffKpeK33yhasoSYP9yIf8TxN2bzbxdOzPjr6fXxfOLnfEDEyJEcTJnPjiuvJOOGGz
m4cCHOioomjFy1ZY78fLKTnyJ0wACir7/+uM+z+0ApP20/wNiWN/epVfHkFdR5wE3A8BrLuV9GPTX0Vo39PGAT8CXwR2NMtXWuu4DXgG3Ab2iBhMflzZyJhITQ/uabm+R8IkLowIF0fSqZPiu+oeMjj1Cdl8eevzzCtouGkv3001Tu3Nkkn6XarpxnnqG6qIjO//hHoypJP16TBbS8uU+tjcfKzL1Ny8yPX+Xu3fw2+lLa33QTnZIe8djnGKeT0p9/Jv+DORQtWwbV1YSfdx7R468nYtgwJEBX4lLuK1m5kl0330LsxDvp+OCDx30eYwzDnllO56gQ5kw6pwkjbHpaZq7anLzXXkf8/Gh/220e/Rzx8yP8nHMIP+ccqrJzKEj5iIJ5H5L1pz8T0KkT0ddeS/S11xzX2mmqbXFWVrJv6jQCu3enw913N+pcq3fmk5FXyj3D+zRRdOp46WxK9TtV2dkcnD+fqHFjm3VliMBOHYm7+256L11C95dfIvjkk8l9+WW2DR9B5p/+RPH332OczmaLR7UsebNmUbljB52n/h2/0NBGnSslLZOwIH8uPaVzE0WnjpdeQanfOTD7DYzTSeydd3rl8yUggIgRI4gYMYLK3bspmDuXgpT5FC1ZSuAJPYm57nqixl5NQIxWVimXiu07yJs+g8jLLqPdBRc06lzlVdV8tm4vo0/pTHiw/nr0Nr2CUoc5Dhwgf+5coi6/nKDu3p/7EdSjBx0fesiaAPw0AR3iyHn6abZdNJSsv/yF0rQ1OgG4jTPGsG/aNCQ0lE6PNr7z+OJN2RRVOLhG5z75BP0TQR124K23MRUVxE6e5O1QfscvKIioMZcTNeZyyjdvoWDuXA4uWEDhwk8J7tuXmPHXEznmCvzbtdpnxaoeBz9ZQOnKlXR+/HEC4uIafb6U1Zl0iw5lyImxTRCdaiyt4lMAVBcWsm34CMLPP5/u/3ne2+E0yFlSwsHUVPLnzKFikx2/sDAirxhDzPjxhCQkeDs81Qwc+flsv/Qygk48kRPefafRCxRnF5Zzzr+WcffQ3jx0Sd8mitKztIpPtQn577+Ps7iYDj529VQfv/BwYq67juhrr6V8/Xry58zl4MefUDBnLqEDBxIzYTwRo0fjFxzs7VCVh+Q89W+qi4tdfZ6aYPX8T9Zk4TQw9gyd++Qr9ApK4SwtZdvwEYQOGECPGdO9Hc5xqy4ooOATV5KqzMjAPyqKqLFjibn+OoLi470dnmpCJT/9zK5bbyV28mQ63n9fo89njOGS/6ygXXAA8+8+rwkibB6t/QpKiyQU+fPmUV1QQOyUyd4OpVH8o6OJvfVWTvzic3q++QZhQ4Zw4J13+G30pey6/XYKFy/GVFV5O0zVSM6KCvZNnUpgz550uGtKwwPcsCGrkC3ZxYwbpMURvkQTVBvnrKzkwOw3CDv7bMJOP93b4TQJESF8yBC6v/Afen+1jLh7/0zFjgyy/nwv20aMZP+L/6Vq3z5vh6mOU96MmVTu3Oma8xQS0iTnTEnLJCjAj8tP872mhI0hIqNFZLOIbLMaxNZ1zFBrKbqNIvKNta1vjSXq1opIoYjcZ+2bJiJZtZaw80z8eouvbcufM5d906bR843ZhJ/j28u6NIZxOChesYL8OXMo+fY78POj3bChxFw/nvDzztUOwC1ExfbtbL/yKiJHj6bb0/9uknNWOpyc/eRSzu3dgZdvOKNJztlcjnaLT0T8gS241jzNBH4BJhhjNtU4Jhr4ARhtjNklIh2NMTl1nCcLONsYs1NEpgHFxphnPPJD1aBFEm2Yqaoib9YsQgacRtiQId4Ox6MkIICI4cOJGD7cNQF43jwKUuZTvHQZgT16EHP9dUSNHUtA+/beDlXVwzid7Pv7VPzCwpp0jcivN+eQX1rVGuc+nQVsM8ZsBxCRObg6l2+qccwNwHxjzC6A2snJMgL4zRjT7Ks5N/hnoz3Bdq09wRZhvf6bPcE2355ga1l/Zqg6HUxNpSoriw6Tp7SplgJBPXrQ8cEH6b38a7o+8wwBnTqS88yzrgnAD/+F0tWrdQKwDzr48ceUrlpFp4cfIiC26eYpfbQ6k7iIYC7o06HJzukj6utSXtPJQIyILBeR1SJSV/uC8cAHtbbdIyLrRWS2iHhsWRd3rqAes6XbP7Qn2M4HLgGeAV4FzvZUUMrzjNNJ3sxZBPftS7thQ70djlf4BQURdXkiUZcnUrF1q6tUfcECCj/9lOA+fYieMJ6oK67Av107b4fa5jny8sj+99OEDh5E1NixTXbevOIKvk7P4bbz4gnwb3m3ef1CIwPik1JX1dg0MyM5cab12p1u5AHAIFxXSaHAjyLykzFmC4CIBAFXADVbE78KPGGd6wngWeD2+mKMT0pNAWYDX2QkJx7TgpruJKhDPZkSgVdt6fYF9gTbtGP5EOV7ihYvoXL7dro9/1ybunqqT3CfPnR+7G90fOB+Dn7+OQUfzCH7H0+Q88yzRF1+OTETxhNis3k7zDYr+6mncJaW0uXxx5v0eeHCdXtwOE2Lrd5zlhU6MpITB9ezu74u5bWPyTXGlAAlIrICGIDr2RXApUCaMSb70ICar0VkFvBZA2G+CtwGvBiflPoh8GZGcmJ6A2Nc52/oVoY9wfYZrgdkI3Fl2jJgpS3dPsCdD/AWLZKonzGGHePGYcrKOfGzTxF/f2+H5HOMMZT/+iv5c+ZSmJqKqaggZMBpxIyfQOSlo5usekw1rOSHH9h1+x10uPsu4v785yY99+X//RZjIPXPjVtk1lsaKJIIwJVoRuD6Hf4LcIPVHPbQMTbgJVx3x4KAlcB4Y8wGa/8cYJEx5o0aY7pYnc4RkftxFU+MbyjW+KTUKGAC8P9w3XqcBbybkZxY79wPd/4UuQ5YBIy2pdsLgPbAw26MUz6qZMUKKjbZiZ04UZNTPUSE0NNOo+uT/6TPim/o9NdHcRYWsffRR9l60VCy/5VMxY4d3g6z1XOWl7N32uMEnXACsZObdp7e5n1FbMgqZFzrK44AwBjjAO7B9fvbDswzxmwUkSkiMsU6xo6rg/l6XMnptRrJKQxXBeD8Wqf+t4j8KiLrgWHA/Q3FEp+UGgvcCtwJrAFeAM4AlhxtnDtXUCcBmbZ0e4U9wTYUOA1420pWPkuvoOpmjGHnDTfiyM7mpEVfIoGB3g6pxTDGUPrzSvLnzKFo6VJwOAgbMoSY8eOJGDFc/y09IOc//yFv+gx6vvkG4U1cafrk53Zmf7eDn/86gth2LXNJrJawkkR8Uup8IAF4B9ftvb019q06yi1Kt55BpQCD7Qm23sDrwELgfcBjk7OU55Su/IWyNWvo9PfH9BfqMXJNAD6b8CFn49i/n4KUFPLnzSPrvvvwj+tAVOLlRIwaSejAgXpl2gQqtm4l7/XZRF15ZZMnJ0e1k4/XZDG0b8cWm5xakJcykhO/qmvH0ZITuHeLz2lLtzuAscB/bOn2+4Euxx6j8gV5M6bjH9eB6HHjvB1KixYQF0eHKVPovWQJ3V99hdD+p5D/3nvsvPEPbL1oKHunTqP42+8wlZXeDrVFMk4ne
6dOwz88nI6P/KXJz//ttlz2F1VwzSBdGLYZ2OKTUqMPvYlPSo2JT0q9252B7iSoKnuCbQJwM/+r1tA/vVugsnXrKPnhR2JvvU1X+W4i4u9PxLBh9Jj+Kn1+/IGuzz5D2ODBHPz0U3ZPnMiW884n6+G/ULh4Mc7SUm+H22IUfPQRZWlpdPzLXzwyeTpldSYxYYEMT+jU5OdWR5iYkZx4+JFQRnJiPjDRnYHu3OK7DZgC/NOWbt9hT7D1At49rjCVV+VOn4F/VBQx46/3diitkn+7dkQlJhKVmIizvJySH36kaMkSir/6isJPP0VCQgg//zwiR42i3dCh+EdFeTtkn+TIzSXnmWcJO+ssoq6+qsnPf7CsisWbsplwZg+CAlre3KcWyC8+KVUykhMNQHxSqj+uisEGNZigbOn2TfYE20PAyfYE2ynAZlu6PblR4apmV755M8Vff02HP92DX7hPP1NtFdZAn64AACAASURBVPxCQogYPoyI4cMwDgelq1ZRtGQpRUuXUrx0GQQEEH7WWUSMGkm7ESMI7NjR2yH7jOx/JWPKyug8bZpH5uilrt9LpcPZYuc+tUCLgHnxSanTcU3unYKrcrBB7lTxDQXeAjJwzUzuAdxiS7evOP54PU+r+H4v64EHKP5mBb2/WqZ/uXuRcTop//VXipYudU2W3rkTRAgdOJCIkSOJGDWSoJ49vR2m1xR/+x27J06kwz33EHfPHz3yGeNe/YHCsioW339hi5+k3kKq+PyAybjmYwmwGHgtIzmx+qgDcS9BrQZusKXbN1vvTwY+sKXbBzU2cE/SBPU/FTt2sP2yRGLvvJOODz7g7XCUxRhD5bZtFC5ZQtGSpVTY7QAE9+1LxKhRRIwaRfDJfVr8L1F3OcvK2D7mCiQwkF4LPsEvyK27QMdkR24Jw55ZTtKlCUy56KQmP39zawkJqjHceQYVeCg5AdjS7VvsCTYtkmhB8ma9hgQH0/7WW7wdiqpBRAju04e4Pn2Iu/tuKjMzD98GzH35ZXJfeonAnj2JGDWSiJEjCR0woFW3Bcl95VWqMjPp+fZbHklOAPPTMvETuPp0rd5rLvFJqX2AfwH9gMNLsGQkJ57Y0Fh3EtQqe4LtdVyTrABuBFYfR5zKC6qysji4cCExEyY06QrQqukFde9O7G23EnvbrTj276foq68pWrKEA2+9zYHXZxMQF0e7kSOIHDWKsDPPbFXz2Mo3byHvjTeIGjuW8LPO8shnOJ2G+WlZnN8njk6RulRVM3oDmAo8j2vliduoeyHbI7jz59hdwEbgz8C9uHqJNE2fZeVxea/PBhFi76h3sWHlgwLi4oi5/jp6vjaLk3/4nq5P/5vQgQM5+MkCdt1+B1vOv4A9jyRRtGwZzvJyb4fbKMbpZN/UqfhHRNDx4Yc89jk/7cgjq6CMcWfo1VMzC81ITlwGSEZy4s6M5MRpwHB3BrpTxVcBPGd9qRakKieHgo8+IvqqKwns3Nnb4ajj5B8ZSdSYMUSNGYOzrIyS77+naMkSir7+moMLFiChobS74AIiRo2i3dCL8I+I8HbIx6Rg3jzK1q6l61PJBMR4rLUQKauziAgO4JL++v+FZlZuFUpsjU9KvQfXwrVula3Wm6DsCbZfObJ3yGG2dPtpRzuxiMwGLgdyjDGnWNum4Zqgtd867K/GmM+tfY8Cd+Bq7/FnY8wia/sg4E1cvUo+B+412k3OLQfefAvjcBB7553eDkU1Eb/QUFe138iRmKoqSlaudFUELl1K0eLFEBhI+JAhrmNGDCegg2834avKySHn2ecIGzKEyCuu8NjnlFQ4+GLDXq4Y0JWQQF2GqpndB4Thugv3BK7bfG49EK+3is+eYDvhaANt6fajtv8VkQuBYuDtWgnqiF72ItIPV8fGs4CuwFLgZGNMtYisxHVr8SdcCepFY8wXDf1gbb2Kz5Gfz7YRI4kYPpxuzzzt7XCUhxmnk7J161xFFkuWULV7t6t8/YwzrCKLUQR1971bW5n330/xsq84ceECguLjPfY5KaszefDDdXw45RzOjG/6lSm8xder+KxJuckZyYnH1QGjwTLzxhCReOAzNxLUowDGmH9Z7xcB03DNvfraGJNgbZ8ADDXGNLjufltPUPtf/C+5r7xCr4ULCDn5ZG+Ho5qRMYaKLVsoWryEoqVLqdjsKsIN7mcj0ipfDzrpJK+Xrxd/8w27J08h7t4/0+Guuzz6WTfM+omsgjKWPzTU6z93U/L1BAUQn5T6FTDi0EoSx8KdKr6mdo/V934V8KAxJh/ohusK6ZBMa1uV9br29jrZE2yTgEkAxuFo4rBbjuriYg68+y4Ro0ZqcmqDRISQvn0J6duXuD/dQ+XOna5bgEuWsv+FF9n/wosExcdbc61GEnLqqc3+S9tZWsq+x/9B0EknEXvHHR79rKyCMn7cnsd9I05uVcmpBVkDLLC66R6+ashITqzdZ+oIzZ2g6utlX9d/NeYo2+tkS7fPBGYCSHh4m31Olf/BBzgLC4mdrMWWClezvzvuIPaOO6jKzqH4q2UULVlC3uzZ5M2aRUDnzoefa4UNHoQEeP7Xwv6XX6Zqzx5OePcdxENzng75OC0TY2CsVu95S3sgj99X7hmObIR4hGZNUEfpZZ+JawmlQ7oDe6zt3evYrurhLC/nwJtvEX7++YSe0t/b4SgfE9ipIzETJhAzYQLVBQUULV9O0ZKlFHz4Ifnvvot/dDTthg8nYtRIws891yOr3penp3PgzbeIvvYawgYftR1QoxljSEnL4uxe7enRPsyjn6XqlpGceNvxjj2eKj4BTENVfHWp2cseuBrYYL1eCLwvIs/hKpLoA6y0iiSKRGQI8DOulh//PdbPbUsKPvyI6rw8Okxp2vbYqvXxj44m+qqriL7qKpylpRR/+93hasCD8+fjFxZG+EUXEjlqFOEXXoR/u8Y/6jDV1eydOhX/6Gg6PvhgE/wUR5e2K58duSXcNbTlL2t0PERkNK726v642rkfsdC3iAwF/oOrjVKuMeYia3sGUISrstphjBlsbW8PzAXicdUJXGc9qqlTfFLqG9SRSzKSExucnHm0K6jLGxp8NCLyATAU6CAimbhmEg8VkYG4gs3AtYAgxpiNIjIP1yRgB/BHY8yhhQTv4n9l5l9YX6oOprKSvNdfJ3TwII//ZapaF7+wMCIvuZjISy7GVFZS8vNK11yrZcso+uJLJDCQ8HPPda2+Pnz4cfdoyp8zh/J16+n69NP4R0c3PKCRPlqdRWigP5ed2vZ6rIqIP/AyMArX3ahfRGShMWZTjWOigVeA0caYXSJSe37SMGNMbq1tScAyY0yyiCRZ7x85Siif1XgdguvixK07YR6t4vOmtljFl//hh+x77O/0mDWLdhec7+1wVCtgqqspW7v2cEVgVVYW+PkRNnjw4dXXA7u498u/KjuH7ZddRuiAAfR4/TWPFyyUV1Vz5j+XMtLWieevH+jRz/KWo1Xxicg5wDRjzCXW+99VS1vb7ga6GmP+Vsf4DGBw7QQlIptxVVPvFZEuwHJjTF93Y7Ym7S7NSE5scDWJ
Bp9B2RNsQ3DdVrPhajLlD5TY0u2R7gakPM84HOTNeo2Q/v0JP/88b4ejWgnx9yds0CDCBg2iY9IjVNjtFC5ZQvHSpWQ/+STZTz5JyCmnHK4IDD6x/vU/s598EuNw0Hna1GappluyKZuicgfjzmi9fZ/8QiMD4pNSV9XYNDMjOXGm9bobsLvGvkzg7FqnOBkIFJHlQATwgjHmbWufARaLiAFmGGMOnbfToUc1VpI61mZmfQC3esq4UyTxEjAe+BAYjOs5UO9jDEh5WOEXX1K1axcd//uiltIqjxARQvr1I6RfPzreey8VO3b8r3z9+efZ//zzBJ100uGJwSH9+x3+b7Ho668pWrSIuPvvb7Z+VylpmXSJCuGck1rvIsnOskJHRnJifffz3amCDgAG4erVFAr8KCI/GWO2AOcZY/ZYCWiJiKQbY465D2B8UmpRrc/dx9FvCf4uuAbZ0u3b7Ak2f1u6vRp4w55g++FYg1SeY5xO8mbOILhPbyJGjPB2OKqNCO7Vi+CJE+kwcSJV+/ZRtNQqX5/1GnnTZxDQtYvrNuDQoex74gmC+/Qm9rZbmyW2nMJyVmzZz11DT8Lfr83+wVZfdXTtY3KNMSVAiYisAAYAW4wxewCMMTki8jGulX5WANmHCt6sW3w5RwsiIznxuBeHdCdBldoTbEHAWnuC7d/AXsCnZy63NcVffUXF1m10ffrpVt0vSPmuwM6daf+HG2n/hxtx5OdT/NXXFC1dSsGcueS/7erU0+399zw+5+mQT9Zm4TQwthXf3nPDL0AfEemFa4HW8cANtY5ZALwkIgG4HuGcDTwvIuGAnzGmyHp9MfAPa8xCXGvpJVvfFxwtiPik1KuBrzKSEw9a76OBoRnJiZ809AO4k6BuwtWW4x7gflwZeawb41QzMMaQO30GgT17EnnpaG+HoxQBMTFEjxtL9LixVBeXUPLtChA/ws44o1k+3xhDyuosTu8ZzUlx7ZrlM32RMcYhIvcAi3DVDsy2KqanWPunG2PsIvIlsB5w4ipF3yAiJwIfW7doA4D3jTFfWqdOBuaJyB3ALuDaBkKZmpGc+PGhNxnJiQXxSalTgSZJUFfZ0u0vAOXA4wD2BNu9uGrrlZeVfP8D5Rs20PmJfzTLCgBKHQv/duFEXnpps37mxj2FbM4u4v+uOqVZP9cXWd0iPq+1bXqt908DT9fath3Xrb66zpmH65mVu+q6rePWLyt3DrqFI5PRrXVsU16QN306AZ07E33lld4ORSmf8NHqTIIC/BhzWldvh6JcVsUnpT6Ha06WAf6Em13Zj7aSxARc9yt72RNsC2vsisS1rpLystJVqyhdtYpOf/1rs93bV8qXVTqcLFy3h1G2TkSFBXo7HOXyJ+AxXKtPACwGjph3VZejXUH9gKsgogOuRV0PKcJ1v1J5We70Gfi3b0/0tdd4OxSlfMLyzTkcKKlk3CBdGNZXZCQnluBabeKY1VvyZUu377Sl25fb0u3nAOm4JnFFAJm2dHvb7WXhI8p+3UDJd9/R/tZb8QsN9XY4SvmElLRMOrQL5sI+cd4ORVnik1KXWJV7h97HxCelLnJnbIM1yfYE27XASlyVGtcBP9sTbPonu5flzZyBX2QkMTdM8HYoSvmE/JJKvkrP4aqBXQnw1+kWPqRDRnJiwaE3GcmJ+YBbq0+487/i34Azben2W2zp9ptxTdZ67LjCVE2iYutWipYspf0fbsS/Xdsto1WqpoXr9lBVbRg3qE3PffJFzvik1MPLh8QnpcZzlL5+NblTxednS7fXnCmch3uJTXlI7sxZSFgYMTfd5O1QlPIZKWmZ9OsSia2LLhPqY/4f8F18Uuo31vsLsTqfN8SdBPWlPcG2CPjAen892vLCayp37aIwNZX2t95KQEyMt8NRyidszS5ifeZBHru8n7dDUbVkJCd+GZ+UOhhXUlqLa+WJMnfGNpigbOn2h+0JtrHA+bgWH5xpS7d/3MAw5SF5s15DAgJof+st3g5FKZ/xUVomAX7ClQN17pOviU9KvRO4F9dagGuBIcCP/L4FfJ3cKZJ4ypZun29Ltz9gS7ffb0u3f2xPsD3V2KDVsavat4+CTz4h+ppxBHY81hXulWqdqp2GT9ZkMbRvHB3aNX2LetVo9wJnAjszkhOHAacD+90Z6M6zpFF1bGvetUsUAHmvzwZjiL3jDm+HopTP+G5bLtmFFa2671MLV56RnFgOEJ+UGpyRnJgOuNXg8GgrSdwF3A2caE+w1ZyYGwF834hg1XFw5OVR8OGHRI0ZQ2A3nYSo1CEpqzOJCg1kuE3vKvioTGse1CfAkvik1HzcbPl+tGdQ7+MqhvgXv58FXGRLtx843kjV8Tnw5luYigpiJ070dihK+YzC8ioWbdzHdYN7EBzg7+1wVB0ykhOvtl5Oi09K/RqIAr48ypDD6k1QtnT7QeAgoDNBvaz64EHy33+fiNGXEHxiL2+Ho5TPSF2/lwqHU+c+tRAZyYnfNHzU/+h8phbgwHvv4SwpocPkyd4ORSmfkrI6k5PiwhnQPcrboSgP0ATl45wlJeS/9Tbthg0jJCHB2+Eo5TMycktYtTOfcYO6YzXWU62MJigflz93HtUHD9Jhil49KVXT/LRM/ATGnq6391orTVA+zFlRQd4bswk7ZwihA+psbqlUm+R0GlLSsjivdwc6R4V4OxyfJSKjRWSziGwTkTpbXojIUBFZKyIbReQba1sPEflaROzW9ntrHD9NRLKsMWtF5DJPxa89wn1YQUoK1ftz6fD0M94ORSmf8vOOA2QVlPGX0W5Np2mTRMQfVxfbUUAm8IuILDTGbKpxTDTwCjDaGLNLRA7V6juAB40xaSISAawWkSU1xj5vjPH4Lya9gvJRpqqKA6+9TujAgYSdfZa3w1HKp6SkZdIuOICL+3X2dii+7CxgmzFmuzGmEpgDXFnrmBuA+caYXQDGmBzr+15jTJr1ugiwA80+AVOvoHzUwU8/o2rPHjr9/TF9AKxUDaWVDr74dS+Xn9aV0KC2PffJLzQyID4pdVWNTTMzkhNnWq+7Abtr7MsEzq51ipOBQBFZjmsRhheMMW/XPEBE4nEtT/Rzjc33iMjNwCpcV1r5jfxR6qQJygeZ6mryZs4k2Gaj3UUXeTscpXzKlxv2UVJZrXOfAGdZoSMjOXFwPbvr+su2dh+mAGAQMAIIBX4UkZ+MMVsARKQdkALcZ4wptMa8CjxhnesJ4Fng9kb9IPXQW3w+qGjxYiozMugweZJePSlVS0paJj3bh3FmvLabaUAm0KPG++4cucRQJvClMabEGJMLrAAGAIhIIK7k9J4xZv6hAcaYbGNMtTHGCczCdSvRIzyWoERktojkiMiGGtvai8gSEdlqfY+pse9Rq9Jks4hcUmP7IBH51dr3orTy39jGGHJnzCSoVy8iRtW1Tq9SbdeegjJ++C2PsWd00z/eGvYL0EdEeolIEDAeWFjrmAXABSISICJhuG4B2q3fs68DdmPMczUHiEiXGm+vBjbgIZ68gnoTGF1rWxKwzBjTB1hmvUdE+uH6x+tvjXnFqkAB1+X
kJKCP9VX7nE1mZ14Jew+61UfLY4qXL6ciPZ3YyZMQ/7Z9f12p2j5ek4Ux6MrlbjDGOIB7gEW4ihzmGWM2isgUEZliHWPHtS7eemAl8JoxZgNwHnATMLyOcvJ/WxcN64FhwP2e+hnEGLdawx/fyV0P1z4zxpxivd8MDDXG7LWy8HJjTF8ReRTAGPMv67hFwDQgA/jaGJNgbZ9gjW9w1mp4eLgpKSlxO1an03DpC99SWuXg/TuH0KN9mPs/aBMxxrBz/AQcubmc9OUXSGBgs8eglK8yxjDi2W/oEBHMvMnneDscnyAipcaYcG/H4SnN/QyqkzFmL7jKGIFDNfd1VZt0s74y69heJ3uCbZI9wbbKnmBbZRyOYwrMz0/49zWnUVjm4LoZP7Ij1/3k1lRKf/6ZsnXriJ14pyYnpWpZs7uA7bklXKNXT22GrxRJ1Fdt4k4VymG2dPtMW7p9sC3dPlgCjr1AcUCPaD6YOIRKh5PrZvzIluyiYz5HY+S+Op2AuDiirr664YOVamNSVmcSEujHpafq3Ke2orkTVPahB2zW9xxre33VJpnW69rbPaZf10jmTh6CANfP+JENWQc9+XGHla5ZQ+nPP9P+9tvxC9a21UrVVF5Vzafr9jC6f2ciQvTuQlvR3AlqIXCL9foWXBUkh7aPF5FgEemFqxhipXUbsEhEhlhVJTfXGOMxvTtGMG/yOYQFBTBh1k+k7fLIHLTfyZs+A//oaGKuv87jn6VUS7PMnkNhuUPnPrUxniwz/wD4EegrIpkicgeQDIwSka241odKBjDGbATmAZtwVZT80RhTbZ3qLuA1YBvwG64uvx4X3yGcuZOH0D48iJte+5mft+d57LPK7XaKv/mG9rfcjF9Y8xdnKOXrUtIy6RwZwrkndfB2KKoZebSKz5uOtYqvPtmF5dww6yeyCsqYdfNgLugT1wTR/V7mffdT8t139P5qGf6RkU1+fqVasv1FFQz51zImXXgij4zWnmg1aRVfG9cpMoS5k88hPjacO95cxdJN2U16/ort2ylatIiYG27Q5KRUHRaszaLaaXTuUxukCcoNHdoFM2fSEBK6RDDl3dWkrt/bZOfOmzkLCQ6m/a23NHywUm2MMYaPVmcyoEc0vTu283Y4qplpgnJTdFgQ7955NgN7RPOnD9L4eE1mw4MaUJmZxcFPPyXm+usIaN++CaJUqnXZuKeQ9H1FXHNGs3d6UD5AE9QxiAwJ5K3bz2LIibE8MG8dH6zc1ajz5b02C/Hzo/3tHlkIWKkWLyUtkyB/P8YM6OrtUJQXaII6RuHBAcy+9UwuOjmOR+f/yhvf7ziu81Rl53AwZT5RV19NYKdOTRylUi1fVbWThWv3MLJfR6LDgrwdjvICTVDHISTQnxk3DeKS/p14/NNNvLr8t2M+x4E33sA4ncROvNMDESrV8i3fvJ+8kkotjmjDNEEdp+AAf1664QyuGNCVp75M57klW3C3ZN+Rn0/+3LlEJl5GUI8eDQ9Qqg1KWZ1Jh3ZBXHhy00/tUC2DdtRthEB/P56/fiDBAX68uGwrFVXVJF2a0GCfmgNvv40pK6PDpEnNFKlSLUt+SSXL0rO5+Zx4Av317+i2ShNUI/n7CU+NO81122/Fdsqqqpk2pj9+fnUnqeqiIvLffY+IUaMI7t27maNVqmX4dP0eqqp17lNbpwmqCfj5Cf+4sj8hgX7M+nYHFVVOnhx7Kv51JKn89z/AWVRE7OQGW1op1WalrM7E1iWSfl118npbpgmqiYgIf73MRmigPy9+tY1yRzXPXjuAgBq3J5ylpRx4803CL7yA0FP6ezFapXzXtpwi1mUe5G+JNm+HorxMb+42IRHhgYv78vAlfVmwdg/3vL+GSofz8P6CDz+kOj+fDlOmeDFKpXzbR6uz8PcTrhyok3MbS0RGi8hmEdkmIkn1HDPUaum+UUS+aWisiLQXkSUistX6HuOp+DVBecAfh/Xm75f348uN+5j8zirKq6pxVlaS9/psws48k7AzzvB2iEr5pGqn4eM1mQw9OY64CO2L1hgi4g+8DFwK9AMmiEi/WsdEA68AVxhj+gPXujE2CVhmjOkDLLPee4QmKA+5/fxePHn1qSzfsp873vqFfR99giMnh9gp+uxJqfp8vy2X7MIK7fvUNM4CthljthtjKoE5wJW1jrkBmG+M2QVgjMlxY+yVwFvW67eAqzz1A+gzKA+64eyeBAf48fBH65hUWEDyaWcQfu653g5LKZ+VkpZJVGggI2wdvR1Ki+AXGhkQn5S6qsammRnJiTOt192A3TX2ZQJn1zrFyUCgiCwHIoAXjDFvNzC2k9VMFmPMXhHx2P9YmqA8bNyg7jjXpZGU3pmkbjfyblmVLtuiVB2KyqtYtHEf1wzqTnCAv7fDaRGcZYWOjOTEwfXsrmuuS+3VBAKAQcAIIBT4UUR+cnOsx+ktPg8zTidnzJ/JtMwlbCmBCbN+Jq+4wtthKeVzPv91L+VVTp371HQygZpL1XQH9tRxzJfGmBJjTC6wAhjQwNhsEekCYH3PwUM0QXlY0dKlVG77jStuGM1rt5zJjtxirp/5EzmF5d4OTSmfkrI6ixPjwhnYI9rbobQWvwB9RKSXiAQB44GFtY5ZAFwgIgEiEobrNp69gbELgUMN7G6xzuERmqA8yBhD3vQZBJ7Qk8hLR3PhyXG8edtZ7Cko47oZP5JVUObtEJXyCbvySlmZcYBxZ3RvcKkw5R5jjAO4B1iEK+nMM8ZsFJEpIjLFOsYOfAmsB1YCrxljNtQ31jp1MjBKRLYCo6z3HiHuLnDa0oSHh5uSkhKvxlD87bfsnjiJLv/8P6LHjTu8ffXOfG59YyWRIYG8P/FsTogN92KUSnnf80u28OJXW/khaThdokK9HU6LISKlxphW+wtEr6A8xBhD7qvTCejShagxY363b9AJMXwwcQillQ6um/Ej23KKvRSlUt7ndBrmr8nkvJM6aHJSv6MJykNKf/mFsrQ0Yu+4Awk6smrvlG5RzJl0DtVOuH7Gj9j3FnohSqW875eMA+w+UMa4QbpyhPo9TVAekjd9Bv4dOhB9zbh6j+nbOYK5k4cQ6O/HhFk/sT6zoBkjVMo3pKRlEh7kzyX9O3s7FOVjNEF5QNmvv1Lyww/E3noLfiEhRz32pLh2zJt8Du2CA7hx1s+s3nmgmaJUyvtKKx2krt/LZad2ISxIp2Wq39ME5QG502fgFxVF9PgJbh3fMzaMeZPPoUNEMDe9vpIffsv1cIRK+YZFG/dRUlnNNbq0kaqDJqgmVr55C8XLltH+D3/Av537xTVdo0OZO3kI3WNCue2NX1i+2WNz35TyGSmrs+jRPpQz49t7OxTlgzRBNbG8mTPxCwuj/U1/OOaxHSNCmDPpHE6Ka8fEt1exaOM+D0SolG/YU1DG97/lMvb07vV2oFZtmyaoJlSZkUHhF18QPWE8/tHHNxu+fXgQH0wcQv+uUdz9Xhqfrqu9MolSrcPHa7IwBl3aSNVLE1QTyp01CwkMJPbWWxt1nqiwQN6982wGnRDDvXPW8OGq3Q0PUqoFMcaQkp
bJWfHt6Rkb5u1wlI/ySoISkQwR+dXq4rjK2lZvl0YRedTq6rhZRC7xRswNqdqzh4MLFhJ9zTUExMU1+nztggN467azOK93Bx7+aD3v/LSzCaJUyjes3V3A9v0lOvdJHZU3r6CGGWMGGmMOLRVfZ5dGq4vjeKA/MBp4xer26FPyXp8NQOwdtzfZOUOD/Jl182BGJHTksU828Nq325vs3Ep5U0paJiGBflx2ahdvh6J8mC/d4quvS+OVwBxjTIUxZgewDVe3R5/hyM2l4KOPiLryCgK7dm3Sc4cE+vPqHwZx2amd+b9UOy99tbVJz69Uc6twVPPpur1c0r8zESGB3g5H+TBvzYwzwGIRMcAMY8xM6u/S2A34qcbYTGubzzjw5puYqio6TJzokfMHBfjx4vjTCQ5YzzOLt1BWVc1DF/fVVZ9Vi7TMnsPBsiotjlAN8laCOs8Ys8dKQktEJP0ox7rd2dGeYJsETAIwDkfjo3RDdUEB+e9/QOTo0QTFx3vscwL8/Xj22gGEBPrx8te/UV7l5G+JNk1SqsVJWZ1J58gQzuvdwduhKB/nlQRljNljfc8RkY9x3bLLFpEu1tVTzS6N7nSFBMCWbp8JzASQ8PBm6SNy4N33cJaWEjt5ssc/y89PePLqUwkO8Of173ZQXlXNE1eeonNIVIuxv6iC5Vv2M/GCE/HX/25VA5r9GZSIhItIxKHXwMXABurv0rgQGC8iwSLSC+iDq7GW11UXl3DgQFRkIQAAEtRJREFUnXdoN3w4IX1PbpbPFBGmjunHlItO4r2fd/HwR+updrbOnl6q9VmwNotqp+Eard5TbvDGFVQn4GPr1lQA8L4x5ksR+QWYJyJ3ALuAawGsDpDzgE2AA/ijMabaC3EfoWDuHJwHD9JhiuevnmoSER4Z3ZfQQH+eX7qFCkc1z18/kEB/X6p5UepIKWlZDOgeRe+OEd4ORbUAzZ6gjDHbgQF1bM8DRtQz5p/APz0c2jFxlpeT98abhJ97LqGnndbsny8i3DuyDyGBfvzri3QqHE5euuF0ggN8rgJfqf/f3p1HSVXdCRz//mqhadoGutlEFFpQU6WoLE1j4oyakMUtMRk3RMFIEJOMkzCZnInJySRqTs4hOZNMzEwmQoAEE0h0ok6MUaMhxixHkWYbxWo37GYNWxfdNE0vVfWbP96jU5TVdDWn6tWr7t/nnD68eu/Wq19dXtWv3n333QvAa3taie1t5f7rLih2KIOGiFwJPAAEcaZzX5qx/Qqc1qp33FWPqer9IvIe4OG0opOBr6nq90TkXuBO4IC77Suq+lQh4rfx7U/R4V8+SvLgQUZ99ztFjeOuy6cwNBzk609s486HNrLstpmUD7EkZfzn0U27CAeFj16U31sxTHbu/aI/AD6Ecy1/g4g8oaqvZRT9k6pem75CVV8HpqXtZzfweFqR/1DVfy9Y8C5rEzoF2tXFoZUrKZ8xg2GzZhU7HG5/Xw3fuv5C/vTmAe74ycsc7fSmB6MxuepOpvjVlt3MiYyjquLdM0ybgqgD3lLV7araBfwC577S/poDvK2qng9nY2dQp6Dl178msXcv4++71zfdvG+eNZGh4SBfeGQr81eu58d31DGi3G6CNP7wxzcOcLCti+tt3qe8CpQPD9Xc85v6tFXLG5des9xdngCkD+S5C5idZTfvFZGtOL2jv6iq2zK2zwV+nrHubhFZANQD/6Kq8VN+EychqgOzB1hFRYUePXo07/vVZJLtV19DoKKCmkd/6ZsEddzTr+zlc7/YzHtOr+SnC2fbr1XjC59ds5H125t56StzrDNPHolIu6pmnXhORG4EPqKqi9zH84E6Vf2ntDLDgZSqtonI1cAD7nBzx7cPwUlcF6jqPnfdOOAgzv2o3wDGq2r+xnhLY0dKP7U+8wxdTU2Muusu3yUngKsuHM+y+TN5Y18bc5e/xIEjncUOyQxyh9u7+N1r+/nYtDMsOXmrz3tIVbVVVdvc5aeAsIik30F9FbDpeHJyy+1T1aSqpoAfUcCh56yJrx80leLQsuUMmTKFyg99sNjh9OoDkXH8+JOzWLS6npuXvciaO2czfkR5scMyRaaqJFJKIql0JVN0J1Mkkkq3u9ydsZxIptxyf1vuKZ9SuhPuPlJKVyJFIuWU7VlOKN2pFLvjx+hKpmxoI+9tAM517x/djdNUNy+9gIicDuxTVRWROpyTlkNpRW4ho3nv+IAK7sNP4NzHWhCWoPqh7Q9/oPONNzjjW0uRgL9/CV56zmhWL6xj4U82cNOyF1m76BLOqrZ5dwqltaOblvZuEinnC9z5ks748ne/uLvcL/xsSSE9aZyQELKVS0sC3ckTl3v2cUIchW3ODwWEcDBAKCgMcf8NBwOEgwGum3YGF5wxvKCvb06kqgkRuRv4LU4381XufaWfdrc/CNwAfEZEEsAxYK66131EZBhOD8DMGz2/LSLTcJr4GrNszxu7BpUjVaXxpptJxuNMeeZpJFQauX3LzsMsWLmeirIQa++8hLNHZ22uNv2gquxsPsaGxmbqm+LUNzbz5v62vO0/GBDCQSEcCBAOBQgHhVAgwJBQoCcJhNO+/DMTQmZyCAeFkLs8JG05nFEmHHT3Hwo4rx1MWw65MfT6OuLLJu+B7mTXoAYCS1A5avvLX9j5qUWcft99VN18U97264Vte1qYv/JlggFhzaLZnDfO7uLvj0QyRWzvETchNVPfGGe/e22vcmiI2klVzJxUxbjhQ9/1hR/O+BJ/V0Jwk0tPuUDAxlY0ObMEVaLynaCa5i+ga8cOpjz3LIEhpdcz7s19R5i3Yj3JlPLQwjqmThhR7JB8q60zweYdcTY0xtnY1MzmHYdp73JG15owspxZNVXU1lRTW1PFeWMrLaGYorEEVaLymaDaN22iad6tjPvyPVTffnvfT/Cpdw4e5dYfvURbZ4LVC+uYPrGq2CH5wl9bOnrOjDY0NhPb20pKISAQHT+cWTXVzJxURW1NlXU2Mb5iCapE5TNB7Vi8mI5XXuWcdb8jMKy0OxrsbG7n1hXrOdTWyapPzmL25FHFDslTqZTy5v42p7nOvYa0K34MgPJwkOkTR1JbU82smiqmnTXSZnw1vmYJqkTlK0Ed27aNxutvYMySJZ6PWl4of23pYN6Kl9hz+BgrFszi784duBPHdXQn2brzcE9nho1NcVo7nKGgxlSWMaumipmTnIQUHT/c7tMxJcUSVInKV4La9bnPc/TFFznn9+sIVg6czgUHjnQyf+V6th88yg9vncGc6Lhih5QXh9o62dgU70lIr+xu6elefc7Y05zrR5Oc60cTq4dZzzNT0ixBlah8JKjOt99m+7UfZdRdixm7ZEmeIvOP+NEuFqx6mdjeVr5/y3SuvnB8sUPqF1Wl8VA7Gxqb2dgYZ0NTM9sPOP/nQ4IBLjpzhNOZwe1lZ8M+mYHGElSJykeC2vOlL9H67HOc8/t1hKoGZoeC1o5u7vjxBjbviPOdmy7mE9P9e7d/dzLFtj2t1Dc2O0mpKc7Bti4ARg4LM3NiVc/1o6kTRjA0bNOOmIFtoCeo0rjbt
Ai6du6k5cnfUH3bbQM2OQEMHxrmoYV1fGr1Br7wyFY6u1PMrZtY7LAAJ3luaopT3xinvqmZLTsP09GdAmBi9TAuO28Mte71oyljTrPu3sYMMJagenHoRyuQQIDqhQUZpNdXKspC/OSOOu766UbueewVOrqTfPLSsz2PY/fhY07POre79+v7jqDqjKxw/vjh3FI3kVluk93Y4UM9j88Y4y1LUFl079tHy+OPM+L6fyA8bmyxw/HE0HCQ5Qtmcvfazdz769foSKT49OVTCvZ6yZTy+l+PUN/U7NwQ29jMnpYOACqGBJkxqYqrpo6n1u3uXVFmh6oxg4196rNoXrUKTaUYtWhRsUPxVFkoyH/fOoN/fngLS59u4FhXkiUfPDcvPd3auxJs2XnY7cwQZ3NTnCPuzL/jhpdRW1PN4knONaTI6ZWErLu3MYOeJagMieZm4g8/wohrr2XImf7tMFAo4WCAB+ZOZ2g4yAPr3qQjkeSeKyP9TlIHjnSy0T07qm+Ks213C4mUIgLnja3kY9PO6Bmh4cyqcuvubYx5F0tQGZpXP4R2djLqrsXFDqVoggHh29dfxNBwgGUvbKejK8nXP3pBr50QVJW3DxztGZmhvrGZxkPtAJSFAlx81kgWXzaZWTXVzJhYxYhhNjqDMaZvlqDSJFtbia9ZQ+WHP0zZ5MnFDqeoAgHhG9dNpSwUZOWf36EzkeKbn7iQYEDoTCR5dXeL25nBGVA13t4NQNWwMLU11cybPZGZk6qZOmE4ZSHr7m2M6T9LUGnia9eSamtj9CA+e0onInz1mijl4SD/9fxb7Iy3051Qtuw6TFfC6e599ugK5kTH9YzwPXl0hTXXGWPywhKUS1U58vzzVFx+GUPPP7/Y4fiGiPDFj7yHYWVBfvj820weexoLLplErXv9aExlWbFDNMYMUDaSRBpNJEi2tBAaNbhG+M6VqtrZkTE+0tdIEiJyJfAAzpTvK1R1acb2K4BfAe+4qx5T1fvdbY3AESAJJFS11l1fDTwM1OBM+X6Tqsbz9qbSWF/eNBIKWXI6CUtOxpQOEQkCPwCuAs4HbhGRbM1Df1LVae7f/Rnb3u+ur01bdw+wTlXPBda5jwvCEpQxxgxMdcBbqrpdVbuAXwDX5WG/1wGr3eXVwMfzsM+s7BqUMcaUqED58FDNPb+pT1u1vHHpNcvd5QnAzrRtu4DZWXbzXhHZCuwBvqiq29z1CjwrIgosU9Xj+x2nqnsBVHWviBRsuB1LUMYYU6JSx1oTjUuvqe1lc7Y2+cxOB5uASaraJiJXA/8LnOtuu1RV97gJ6DkRaVDVP+Yn8tyUTBOfiFwpIq+LyFsiUrA2T2OMGSB2AWelPT4T5yyph6q2qmqbu/wUEBaR0e7jPe6/+4HHcZoMAfaJyHgA99/9hXoDJZGg+nGxzxhjjGMDcK6InC0iQ4C5wBPpBUTkdHF7P4lIHU5OOCQiFSJS6a6vAD4MvOo+7Qngdnf5dpxegAVRKk18PRf7AETk+MW+14oalTHG+JSqJkTkbuC3ON3MV6nqNhH5tLv9QeAG4DMikgCOAXNVVUVkHPC4m7tCwFpVfcbd9VLgERH5FLADuLFQ76Ek7oMSkRuAK1V1kft4PjBbVe9OLxeLRBcDiwFmvrN9Zntnp+exGmOMV2xGXX/I5WIf0YbYcmA5gFRU+D/zGmOM6VWpJKg+L/Zlam9vVxE5dgqvFQISp/C8QvFTPH6KBSyek/FTLGDx9OVU4ynPdyB+UipNfCHgDWAOsBvn4t+8tP76+Xyt+oy7povKT/H4KRaweE7GT7GAxdMXv8XjFyVxBtXbxb4ih2WMMaaASiJBQU8f/aeKHYcxxhhvlMR9UB5b3ncRT/kpHj/FAhbPyfgpFrB4+uK3eHyhJK5BGWOMGXzsDMoYY4wvWYIyxhjjS4M2QfU1+Kw4vu9u/z8RmVHEWK4QkRYR2eL+fa1Qsbivt0pE9ovIq71s97Ju+orF67o5S0SeF5GYiGwTkc9nKeNJ/eQYi2f1IyJDReRlEdnqxnNfljJeHju5xOP18RMUkc0i8mSWbZ7VTclQ1UH3h9NV/W1gMjAE2Aqcn1HmauBpnFEsLgHWFzGWK4AnPayfy4AZwKu9bPekbnKMxeu6GQ/McJcrce7PK9axk0ssntWP+35Pc5fDwHrgkiIeO7nE4/Xx8wVgbbbX9LJuSuVvsJ5B5TLT5HXAQ+p4CRgp7hDzRYjFU+rM+dJ8kiJe1U0usXhKVfeq6iZ3+QgQw5kYLp0n9ZNjLJ5x32+b+zDs/mX2wvLy2MklHs+IyJnANcCKXop4VjelYrAmqGwzTWZ+sHMp41Us4M56KSJPi8gFBYijP7yqm1wVpW5EpAaYjvPLPJ3n9XOSWMDD+nGbsLbgzBH0nKoWtW5yiAe8q5/vAf8KpHrZ7rfPVdEN1gSVy+CzOQ1Q61Esx2e9vBj4T5xZL4vJq7rJRVHqRkROAx4Flqhqa+bmLE8pWP30EYun9aOqSVWdhjNeZp2ITM0MN9vTihiPJ/UjItcC+1V148mKZVk3qO8DGqwJKpfBZ/s9QG2hYtGTzHpZJF7VTZ+KUTciEsZJCGtU9bEsRTyrn75iKdaxo6qHgT8AV2ZsKsqx01s8HtbPpcDHRKQRpxn/AyLys4wyvvlc+cVgTVB9zjTpPl7g9qy5BGhR1b3FiEV6mfWyALHkyqu66ZPXdeO+1kogpqrf7aWYJ/WTSyxe1o+IjBGRke5yOfBBoCGjmGfHTi7xeFU/qvplVT1TVWtwPuO/V9XbMor55nPlFyUzFl8+aW4zTT6F06vmLaAduKOIsWSd9bIQ8QCIyM9xejeNFpFdwNdxLjB7Wjc5xuJp3eD8Ep4PvOJe2wD4CjAxLSav6ieXWLysn/HAahEJ4nzRP6KqTxbjc9WPeLw+fk5QxLopCTbUkTHGGF8arE18xhhjfM4SlDHGGF+yBGWMMcaXLEEZY4zxJUtQxhhjfMkSlDEeiUWiV8Qi0XeNYm2Myc4SlDHGGF+y+6CMyRCLRG8DPocz/cl64LNAC7AMeD8QB+ZGG2IHYpHoNOBBYBjOtCkLow2xeCwSPcddPwZIAjfiDGNzL3AQmApsBG6LNsTsQ2hMFnYGZUyaWCQaBW4GLo02xKbhJJdbgQpgU7QhNgN4AWdEC4CHgC9FG2IXAa+krV8D/CDaELsYeB9wfMia6cAS4HycOcAuLfibMqZEDcqhjow5iTnATGBDLBIFKMeZqiEFPOyW+RnwWCwSHQGMjDbEXnDXrwb+JxaJVgITog2xxwGiDbEOAHd/L0cbYrvcx1uAGuDPhX9bxpQeS1DGnEiA1dGG2JfTV8Yi0X/LKHeyZrls0yYc15m2nMQ+g8b0ypr4jDnROuCGWCQ6FiAWiVbHItFJOJ+VG9wy84A/RxtiLUA8Fon+vbt+PvBCtCHWCuyKRaIfd/dRFotEh3n6LowZAOzXmzFpog2x12KR6FeBZ2ORaADoBv4ROApcEItEN+J0mLjZfcrtwINuAtrO30agng8si0Wi97v7uNHDt2HMgGC9+IzJQSwSbYs2xE4r
dhzGDCbWxGeMMcaX7AzKGGOML9kZlDHGGF+yBGWMMcaXLEEZY4zxJUtQxhhjfMkSlDHGGF/6f88f87IYlpEyAAAAAElFTkSuQmCC\n",
808 | "text/plain": [
809 | ""
810 | ]
811 | },
812 | "metadata": {
813 | "needs_background": "light"
814 | },
815 | "output_type": "display_data"
816 | }
817 | ],
818 | "source": [
819 | "fig, ax1 = plt.subplots()\n",
820 | "color = 'tab:red'\n",
821 | "ax1.plot(loss_list,color=color)\n",
822 | "ax1.set_xlabel('epoch',color=color)\n",
823 | "ax1.set_ylabel('total loss',color=color)\n",
824 | "ax1.tick_params(axis='y', color=color)\n",
825 | " \n",
826 | "ax2 = ax1.twinx() \n",
827 | "color = 'tab:blue'\n",
828 | "ax2.set_ylabel('accuracy', color=color) \n",
829 | "ax2.plot( accuracy_list, color=color)\n",
830 | "ax2.tick_params(axis='y', color=color)\n",
831 | "fig.tight_layout()"
832 | ]
833 | },
834 | {
835 | "cell_type": "code",
836 | "execution_count": 51,
837 | "metadata": {},
838 | "outputs": [
839 | {
840 | "data": {
841 | "text/plain": [
842 | "[0.6201, 0.554, 0.57, 0.5715, 0.7565]"
843 | ]
844 | },
845 | "execution_count": 51,
846 | "metadata": {},
847 | "output_type": "execute_result"
848 | }
849 | ],
850 | "source": [
851 | "accuracy_list"
852 | ]
853 | },
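{
"cell_type": "markdown",
"metadata": {},
"source": [
"The quiz asks for the maximum validation accuracy across the epochs, which can be read off directly:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Maximum validation accuracy achieved over the 5 epochs\n",
"max(accuracy_list)"
]
},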
854 | {
855 | "cell_type": "markdown",
856 | "metadata": {},
857 | "source": [
858 | "About the Authors: \n",
859 | " Joseph Santarcangelo has a PhD in Electrical Engineering, his research focused on using machine learning, signal processing, and computer vision to determine how videos impact human cognition. Joseph has been working for IBM since he completed his PhD."
860 | ]
861 | },
862 | {
863 | "cell_type": "markdown",
864 | "metadata": {},
865 | "source": [
866 | "Copyright © 2019 cognitiveclass.ai . This notebook and its source code are released under the terms of the MIT License "
867 | ]
868 | },
869 | {
870 | "cell_type": "code",
871 | "execution_count": null,
872 | "metadata": {},
873 | "outputs": [],
874 | "source": []
875 | }
876 | ],
877 | "metadata": {
878 | "kernelspec": {
879 | "display_name": "Python",
880 | "language": "python",
881 | "name": "conda-env-python-py"
882 | },
883 | "language_info": {
884 | "codemirror_mode": {
885 | "name": "ipython",
886 | "version": 3
887 | },
888 | "file_extension": ".py",
889 | "mimetype": "text/x-python",
890 | "name": "python",
891 | "nbconvert_exporter": "python",
892 | "pygments_lexer": "ipython3",
893 | "version": "3.6.7"
894 | }
895 | },
896 | "nbformat": 4,
897 | "nbformat_minor": 4
898 | }
899 |
--------------------------------------------------------------------------------
/4.1_resnet18_PyTorch.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "\n",
8 | " \n",
9 | " "
10 | ]
11 | },
12 | {
13 | "cell_type": "markdown",
14 | "metadata": {},
15 | "source": [
16 | " "
17 | ]
18 | },
19 | {
20 | "cell_type": "markdown",
21 | "metadata": {},
22 | "source": [
23 | "Pre-trained-Models with PyTorch "
24 | ]
25 | },
26 | {
27 | "cell_type": "markdown",
28 | "metadata": {},
29 | "source": [
30 | "In this lab, you will use pre-trained models to classify between the negative and positive samples; you will be provided with the dataset object. The particular pre-trained model will be resnet18; you will have three questions: \n",
31 | "\n",
32 | "change the output layer \n",
33 | " train the model \n",
34 | " identify several misclassified samples \n",
35 | " \n",
36 | "You will take several screenshots of your work and share your notebook. "
37 | ]
38 | },
39 | {
40 | "cell_type": "markdown",
41 | "metadata": {},
42 | "source": [
43 | "Table of Contents "
44 | ]
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "metadata": {},
49 | "source": [
50 | "\n",
51 | "\n",
52 | "\n",
53 | "
\n",
61 | "
Estimated Time Needed: 120 min
\n",
62 | "
\n",
63 | " "
64 | ]
65 | },
66 | {
67 | "cell_type": "markdown",
68 | "metadata": {},
69 | "source": [
70 | "Download Data "
71 | ]
72 | },
73 | {
74 | "cell_type": "markdown",
75 | "metadata": {},
76 | "source": [
77 | "Download the dataset and unzip the files in your data directory, unlike the other labs, all the data will be deleted after you close the lab, this may take some time:"
78 | ]
79 | },
80 | {
81 | "cell_type": "code",
82 | "execution_count": null,
83 | "metadata": {},
84 | "outputs": [],
85 | "source": [
86 | "!wget https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/DL0321EN/data/images/Positive_tensors.zip "
87 | ]
88 | },
89 | {
90 | "cell_type": "code",
91 | "execution_count": null,
92 | "metadata": {},
93 | "outputs": [],
94 | "source": [
95 | "!unzip -q Positive_tensors.zip "
96 | ]
97 | },
98 | {
99 | "cell_type": "code",
100 | "execution_count": null,
101 | "metadata": {},
102 | "outputs": [],
103 | "source": [
104 | "! wget https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/DL0321EN/data/images/Negative_tensors.zip\n",
105 | "!unzip -q Negative_tensors.zip"
106 | ]
107 | },
108 | {
109 | "cell_type": "markdown",
110 | "metadata": {},
111 | "source": [
112 | "We will install torchvision:"
113 | ]
114 | },
115 | {
116 | "cell_type": "code",
117 | "execution_count": null,
118 | "metadata": {},
119 | "outputs": [],
120 | "source": [
121 | "!pip install torchvision"
122 | ]
123 | },
124 | {
125 | "cell_type": "markdown",
126 | "metadata": {},
127 | "source": [
128 | "Imports and Auxiliary Functions "
129 | ]
130 | },
131 | {
132 | "cell_type": "markdown",
133 | "metadata": {},
134 | "source": [
135 | "The following are the libraries we are going to use for this lab. The torch.manual_seed()
is for forcing the random function to give the same number every time we try to recompile it."
136 | ]
137 | },
138 | {
139 | "cell_type": "code",
140 | "execution_count": null,
141 | "metadata": {},
142 | "outputs": [],
143 | "source": [
144 | "# These are the libraries will be used for this lab.\n",
145 | "import torchvision.models as models\n",
146 | "from PIL import Image\n",
147 | "import pandas\n",
148 | "from torchvision import transforms\n",
149 | "import torch.nn as nn\n",
150 | "import time\n",
151 | "import torch \n",
152 | "import matplotlib.pylab as plt\n",
153 | "import numpy as np\n",
154 | "from torch.utils.data import Dataset, DataLoader\n",
155 | "import h5py\n",
156 | "import os\n",
157 | "import glob\n",
158 | "torch.manual_seed(0)"
159 | ]
160 | },
161 | {
162 | "cell_type": "code",
163 | "execution_count": null,
164 | "metadata": {},
165 | "outputs": [],
166 | "source": [
167 | "from matplotlib.pyplot import imshow\n",
168 | "import matplotlib.pylab as plt\n",
169 | "from PIL import Image\n",
170 | "import pandas as pd\n",
171 | "import os"
172 | ]
173 | },
174 | {
175 | "cell_type": "markdown",
176 | "metadata": {},
177 | "source": [
178 | ""
179 | ]
180 | },
181 | {
182 | "cell_type": "markdown",
183 | "metadata": {},
184 | "source": [
185 | "Dataset Class "
186 | ]
187 | },
188 | {
189 | "cell_type": "markdown",
190 | "metadata": {},
191 | "source": [
192 | " This dataset class is essentially the same dataset you build in the previous section, but to speed things up, we are going to use tensors instead of jpeg images. Therefor for each iteration, you will skip the reshape step, conversion step to tensors and normalization step."
193 | ]
194 | },
195 | {
196 | "cell_type": "code",
197 | "execution_count": null,
198 | "metadata": {},
199 | "outputs": [],
200 | "source": [
201 | "# Create your own dataset object\n",
202 | "\n",
203 | "class Dataset(Dataset):\n",
204 | "\n",
205 | " # Constructor\n",
206 | " def __init__(self,transform=None,train=True):\n",
207 | " directory=\"/home/dsxuser/work\"\n",
208 | " positive=\"Positive_tensors\"\n",
209 | " negative='Negative_tensors'\n",
210 | "\n",
211 | " positive_file_path=os.path.join(directory,positive)\n",
212 | " negative_file_path=os.path.join(directory,negative)\n",
213 | " positive_files=[os.path.join(positive_file_path,file) for file in os.listdir(positive_file_path) if file.endswith(\".pt\")]\n",
214 | " negative_files=[os.path.join(negative_file_path,file) for file in os.listdir(negative_file_path) if file.endswith(\".pt\")]\n",
215 | " number_of_samples=len(positive_files)+len(negative_files)\n",
216 | " self.all_files=[None]*number_of_samples\n",
217 | " self.all_files[::2]=positive_files\n",
218 | " self.all_files[1::2]=negative_files \n",
219 | " # The transform is goint to be used on image\n",
220 | " self.transform = transform\n",
221 | " #torch.LongTensor\n",
222 | " self.Y=torch.zeros([number_of_samples]).type(torch.LongTensor)\n",
223 | " self.Y[::2]=1\n",
224 | " self.Y[1::2]=0\n",
225 | " \n",
226 | " if train:\n",
227 | " self.all_files=self.all_files[0:30000]\n",
228 | " self.Y=self.Y[0:30000]\n",
229 | " self.len=len(self.all_files)\n",
230 | " else:\n",
231 | " self.all_files=self.all_files[30000:]\n",
232 | " self.Y=self.Y[30000:]\n",
233 | " self.len=len(self.all_files) \n",
234 | " \n",
235 | " # Get the length\n",
236 | " def __len__(self):\n",
237 | " return self.len\n",
238 | " \n",
239 | " # Getter\n",
240 | " def __getitem__(self, idx):\n",
241 | " \n",
242 | " image=torch.load(self.all_files[idx])\n",
243 | " y=self.Y[idx]\n",
244 | " \n",
245 | " # If there is any transform method, apply it onto the image\n",
246 | " if self.transform:\n",
247 | " image = self.transform(image)\n",
248 | "\n",
249 | " return image, y\n",
250 | " \n",
251 | "print(\"done\")"
252 | ]
253 | },
254 | {
255 | "cell_type": "markdown",
256 | "metadata": {},
257 | "source": [
258 | "We create two dataset objects, one for the training data and one for the validation data."
259 | ]
260 | },
261 | {
262 | "cell_type": "code",
263 | "execution_count": null,
264 | "metadata": {},
265 | "outputs": [],
266 | "source": [
267 | "train_dataset = Dataset(train=True)\n",
268 | "validation_dataset = Dataset(train=False)\n",
269 | "print(\"done\")"
270 | ]
271 | },
272 | {
273 | "cell_type": "markdown",
274 | "metadata": {},
275 | "source": [
276 | "Question 1 "
277 | ]
278 | },
279 | {
280 | "cell_type": "markdown",
281 | "metadata": {},
282 | "source": [
283 | "Prepare a pre-trained resnet18 model : "
284 | ]
285 | },
286 | {
287 | "cell_type": "markdown",
288 | "metadata": {},
289 | "source": [
290 | "Step 1 : Load the pre-trained model resnet18
Set the parameter pretrained
to true:"
291 | ]
292 | },
293 | {
294 | "cell_type": "code",
295 | "execution_count": null,
296 | "metadata": {},
297 | "outputs": [],
298 | "source": [
299 | "# Step 1: Load the pre-trained model resnet18\n",
300 | "\n",
301 | "# Type your code here"
302 | ]
303 | },
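304 | {
305 | "cell_type": "markdown",
306 | "metadata": {},
307 | "source": [
308 | "For reference, here is a minimal sketch of one possible Step 1 solution; it uses the models module imported above:"
309 | ]
310 | },
311 | {
312 | "cell_type": "code",
313 | "execution_count": null,
314 | "metadata": {},
315 | "outputs": [],
316 | "source": [
317 | "# One possible solution sketch: load resnet18 with its pre-trained ImageNet weights\n",
318 | "model = models.resnet18(pretrained=True)"
319 | ]
320 | },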
304 | {
305 | "cell_type": "markdown",
306 | "metadata": {},
307 | "source": [
308 | "Step 2 : Set the attribute requires_grad
to False
. As a result, the parameters will not be affected by training."
309 | ]
310 | },
311 | {
312 | "cell_type": "code",
313 | "execution_count": null,
314 | "metadata": {},
315 | "outputs": [],
316 | "source": [
317 | "# Step 2: Set the parameter cannot be trained for the pre-trained model\n",
318 | "\n",
319 | "\n",
320 | "# Type your code here"
321 | ]
322 | },
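323 | {
324 | "cell_type": "markdown",
325 | "metadata": {},
326 | "source": [
327 | "For reference, a minimal sketch of one way to complete Step 2, freezing every parameter of the pre-trained network:"
328 | ]
329 | },
330 | {
331 | "cell_type": "code",
332 | "execution_count": null,
333 | "metadata": {},
334 | "outputs": [],
335 | "source": [
336 | "# One possible solution sketch: freeze all pre-trained parameters\n",
337 | "for param in model.parameters():\n",
338 | "    param.requires_grad = False"
339 | ]
340 | },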
323 | {
324 | "cell_type": "markdown",
325 | "metadata": {},
326 | "source": [
327 | "resnet18
is used to classify 1000 different objects; as a result, the last layer has 1000 outputs. The 512 inputs come from the fact that the previously hidden layer has 512 outputs. "
328 | ]
329 | },
330 | {
331 | "cell_type": "markdown",
332 | "metadata": {},
333 | "source": [
334 | "Step 3 : Replace the output layer model.fc
of the neural network with a nn.Linear
object, to classify 2 different classes. For the parameters in_features
remember the last hidden layer has 512 neurons."
335 | ]
336 | },
337 | {
338 | "cell_type": "code",
339 | "execution_count": null,
340 | "metadata": {},
341 | "outputs": [],
342 | "source": []
343 | },
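344 | {
345 | "cell_type": "markdown",
346 | "metadata": {},
347 | "source": [
348 | "For reference, a minimal sketch of one way to complete Step 3, swapping in a two-class linear layer with the 512 inputs described above:"
349 | ]
350 | },
351 | {
352 | "cell_type": "code",
353 | "execution_count": null,
354 | "metadata": {},
355 | "outputs": [],
356 | "source": [
357 | "# One possible solution sketch: replace the output layer with a 2-class linear layer\n",
358 | "model.fc = nn.Linear(512, 2)"
359 | ]
360 | },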
344 | {
345 | "cell_type": "markdown",
346 | "metadata": {},
347 | "source": [
348 | "Print out the model in order to show whether you get the correct answer. (Your peer reviewer is going to mark based on what you print here.) "
349 | ]
350 | },
351 | {
352 | "cell_type": "code",
353 | "execution_count": null,
354 | "metadata": {},
355 | "outputs": [],
356 | "source": [
357 | "print(model)"
358 | ]
359 | },
360 | {
361 | "cell_type": "markdown",
362 | "metadata": {},
363 | "source": [
364 | "Question 2: Train the Model "
365 | ]
366 | },
367 | {
368 | "cell_type": "markdown",
369 | "metadata": {},
370 | "source": [
371 | "In this question you will train your, model:"
372 | ]
373 | },
374 | {
375 | "cell_type": "markdown",
376 | "metadata": {},
377 | "source": [
378 | "Step 1 : Create a cross entropy criterion function "
379 | ]
380 | },
381 | {
382 | "cell_type": "code",
383 | "execution_count": null,
384 | "metadata": {},
385 | "outputs": [],
386 | "source": [
387 | "# Step 1: Create the loss function\n",
388 | "\n",
389 | "# Type your code here"
390 | ]
391 | },
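392 | {
393 | "cell_type": "markdown",
394 | "metadata": {},
395 | "source": [
396 | "For reference, a minimal sketch of Step 1, using PyTorch's built-in cross-entropy loss:"
397 | ]
398 | },
399 | {
400 | "cell_type": "code",
401 | "execution_count": null,
402 | "metadata": {},
403 | "outputs": [],
404 | "source": [
405 | "# One possible solution sketch: cross-entropy loss for the two classes\n",
406 | "criterion = nn.CrossEntropyLoss()"
407 | ]
408 | },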
392 | {
393 | "cell_type": "markdown",
394 | "metadata": {},
395 | "source": [
396 | "Step 2 : Create a training loader and validation loader object, the batch size should have 100 samples each."
397 | ]
398 | },
399 | {
400 | "cell_type": "code",
401 | "execution_count": null,
402 | "metadata": {},
403 | "outputs": [],
404 | "source": []
405 | },
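406 | {
407 | "cell_type": "markdown",
408 | "metadata": {},
409 | "source": [
410 | "For reference, a minimal sketch of Step 2; the names train_loader and validation_loader match those used in the training loop below:"
411 | ]
412 | },
413 | {
414 | "cell_type": "code",
415 | "execution_count": null,
416 | "metadata": {},
417 | "outputs": [],
418 | "source": [
419 | "# One possible solution sketch: data loaders with 100 samples per batch\n",
420 | "train_loader = DataLoader(dataset=train_dataset, batch_size=100)\n",
421 | "validation_loader = DataLoader(dataset=validation_dataset, batch_size=100)"
422 | ]
423 | },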
406 | {
407 | "cell_type": "markdown",
408 | "metadata": {},
409 | "source": [
410 | "Step 3 : Use the following optimizer to minimize the loss "
411 | ]
412 | },
413 | {
414 | "cell_type": "code",
415 | "execution_count": null,
416 | "metadata": {},
417 | "outputs": [],
418 | "source": [
419 | "optimizer = torch.optim.Adam([parameters for parameters in model.parameters() if parameters.requires_grad],lr=0.001)"
420 | ]
421 | },
422 | {
423 | "cell_type": "markdown",
424 | "metadata": {},
425 | "source": [
426 | ""
427 | ]
428 | },
429 | {
430 | "cell_type": "markdown",
431 | "metadata": {},
432 | "source": [
433 | "**Complete the following code to calculate the accuracy on the validation data for one epoch; this should take about 45 minutes. Make sure you calculate the accuracy on the validation data.**"
434 | ]
435 | },
436 | {
437 | "cell_type": "code",
438 | "execution_count": null,
439 | "metadata": {},
440 | "outputs": [],
441 | "source": [
442 | "n_epochs=1\n",
443 | "loss_list=[]\n",
444 | "accuracy_list=[]\n",
445 | "correct=0\n",
446 | "N_test=len(validation_dataset)\n",
447 | "N_train=len(train_dataset)\n",
448 | "start_time = time.time()\n",
449 | "#n_epochs\n",
450 | "\n",
451 | "Loss=0\n",
452 | "start_time = time.time()\n",
453 | "for epoch in range(n_epochs):\n",
454 | " for x, y in train_loader:\n",
455 | "\n",
456 | " model.train() \n",
457 | " #clear gradient \n",
458 | " \n",
459 | " #make a prediction \n",
460 | " \n",
461 | " # calculate loss \n",
462 | " \n",
463 | " # calculate gradients of parameters \n",
464 | " \n",
465 | " # update parameters \n",
466 | " \n",
467 | " loss_list.append(loss.data)\n",
468 | " correct=0\n",
469 | " for x_test, y_test in validation_loader:\n",
470 | " # set model to eval \n",
471 | " \n",
472 | " #make a prediction \n",
473 | " \n",
474 | " #find max \n",
475 | " \n",
476 | " \n",
477 | " #Calculate misclassified samples in mini-batch \n",
478 | " #hint +=(yhat==y_test).sum().item()\n",
479 | " \n",
480 | " \n",
481 | " accuracy=correct/N_test\n",
482 | "\n"
483 | ]
484 | },
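485 | {
486 | "cell_type": "markdown",
487 | "metadata": {},
488 | "source": [
489 | "For reference, a minimal sketch of one way to complete the training loop above; it assumes the model, criterion, data loaders, and optimizer created in the previous steps:"
490 | ]
491 | },
492 | {
493 | "cell_type": "code",
494 | "execution_count": null,
495 | "metadata": {},
496 | "outputs": [],
497 | "source": [
498 | "# One possible solution sketch for the training loop\n",
499 | "n_epochs = 1\n",
500 | "loss_list = []\n",
501 | "accuracy_list = []\n",
502 | "N_test = len(validation_dataset)\n",
503 | "start_time = time.time()\n",
504 | "for epoch in range(n_epochs):\n",
505 | "    for x, y in train_loader:\n",
506 | "        model.train()\n",
507 | "        optimizer.zero_grad()           # clear gradients\n",
508 | "        yhat = model(x)                 # make a prediction\n",
509 | "        loss = criterion(yhat, y)       # calculate loss\n",
510 | "        loss.backward()                 # calculate gradients of parameters\n",
511 | "        optimizer.step()                # update parameters\n",
512 | "        loss_list.append(loss.data)\n",
513 | "    correct = 0\n",
514 | "    for x_test, y_test in validation_loader:\n",
515 | "        model.eval()                    # set model to evaluation mode\n",
516 | "        z = model(x_test)               # make a prediction\n",
517 | "        _, yhat = torch.max(z.data, 1)  # find the index of the max output\n",
518 | "        correct += (yhat == y_test).sum().item()\n",
519 | "    accuracy = correct / N_test\n",
520 | "    accuracy_list.append(accuracy)\n",
521 | "print(\"elapsed time:\", time.time() - start_time)"
522 | ]
523 | },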
485 | {
486 | "cell_type": "markdown",
487 | "metadata": {},
488 | "source": [
489 | "Print out the Accuracy and plot the loss stored in the list loss_list
for every iteration and take a screen shot. "
490 | ]
491 | },
492 | {
493 | "cell_type": "code",
494 | "execution_count": null,
495 | "metadata": {},
496 | "outputs": [],
497 | "source": [
498 | "accuracy"
499 | ]
500 | },
501 | {
502 | "cell_type": "code",
503 | "execution_count": null,
504 | "metadata": {},
505 | "outputs": [],
506 | "source": [
507 | "plt.plot(loss_list)\n",
508 | "plt.xlabel(\"iteration\")\n",
509 | "plt.ylabel(\"loss\")\n",
510 | "plt.show()\n"
511 | ]
512 | },
513 | {
514 | "cell_type": "markdown",
515 | "metadata": {},
516 | "source": [
517 | "Question 3:Find the misclassified samples "
518 | ]
519 | },
520 | {
521 | "cell_type": "markdown",
522 | "metadata": {},
523 | "source": [
524 | "Identify the first four misclassified samples using the validation data: "
525 | ]
526 | },
527 | {
528 | "cell_type": "code",
529 | "execution_count": null,
530 | "metadata": {},
531 | "outputs": [],
532 | "source": [
533 | " "
534 | ]
535 | },
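536 | {
537 | "cell_type": "markdown",
538 | "metadata": {},
539 | "source": [
540 | "For reference, a minimal sketch of one way to find the first four misclassified samples; it assumes the trained model and the validation_dataset object from above:"
541 | ]
542 | },
543 | {
544 | "cell_type": "code",
545 | "execution_count": null,
546 | "metadata": {},
547 | "outputs": [],
548 | "source": [
549 | "# One possible solution sketch: scan the validation data for misclassified samples\n",
550 | "misclassified = 0\n",
551 | "model.eval()\n",
552 | "for idx, (x, y) in enumerate(validation_dataset):\n",
553 | "    z = model(x.unsqueeze(0))           # add a batch dimension\n",
554 | "    _, yhat = torch.max(z.data, 1)\n",
555 | "    if yhat.item() != y.item():\n",
556 | "        print(\"sample:\", idx, \"predicted:\", yhat.item(), \"actual:\", y.item())\n",
557 | "        misclassified += 1\n",
558 | "    if misclassified >= 4:\n",
559 | "        break"
560 | ]
561 | },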
536 | {
537 | "cell_type": "markdown",
538 | "metadata": {},
539 | "source": [
540 | " CLICK HERE Click here to see how to share your notebook."
541 | ]
542 | },
543 | {
544 | "cell_type": "markdown",
545 | "metadata": {},
546 | "source": [
547 | "About the Authors: \n",
548 | "\n",
549 | "Joseph Santarcangelo has a PhD in Electrical Engineering, his research focused on using machine learning, signal processing, and computer vision to determine how videos impact human cognition. Joseph has been working for IBM since he completed his PhD."
550 | ]
551 | },
552 | {
553 | "cell_type": "markdown",
554 | "metadata": {},
555 | "source": [
556 | "Copyright © 2018 cognitiveclass.ai . This notebook and its source code are released under the terms of the MIT License ."
557 | ]
558 | }
559 | ],
560 | "metadata": {
561 | "kernelspec": {
562 | "display_name": "Python 3",
563 | "language": "python",
564 | "name": "python3"
565 | },
566 | "language_info": {
567 | "codemirror_mode": {
568 | "name": "ipython",
569 | "version": 3
570 | },
571 | "file_extension": ".py",
572 | "mimetype": "text/x-python",
573 | "name": "python",
574 | "nbconvert_exporter": "python",
575 | "pygments_lexer": "ipython3",
576 | "version": "3.6.8"
577 | }
578 | },
579 | "nbformat": 4,
580 | "nbformat_minor": 2
581 | }
582 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # AI-Capstone-Project-with-Deep-Learning
2 | https://www.coursera.org/learn/ai-deep-learning-capstone/home/welcome
3 |
4 | ### Data:
5 |
6 | https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/DL0321EN/data/images/concrete_crack_images_for_classification.zip
7 |
--------------------------------------------------------------------------------