├── Coding论坛 ├── PyG │ └── TEST.md ├── Pytorch │ └── TEST.md ├── README.md └── 分布式 │ └── TEST.md ├── README.md ├── 图神经网络进阶 └── README.md ├── 基础知识速通 ├── Deep Learning教程 │ ├── Projects │ │ ├── LeNet │ │ │ ├── lenet.py │ │ │ ├── requirements.txt │ │ │ ├── run.py │ │ │ └── test.txt │ │ └── README.md │ └── README.md ├── Machine Learning教程 │ ├── NoteBooks │ │ ├── 1.linear_regression │ │ │ ├── ex1.linear_regreesion.ipynb │ │ │ ├── ex1data1.txt │ │ │ └── ex1data2.txt │ │ ├── 2.logistic_regression │ │ │ ├── ex2.logistic_regression.ipynb │ │ │ ├── ex2data1.txt │ │ │ ├── ex2data2.txt │ │ │ └── sample_for_scipy.optimize.minimize.ipynb │ │ ├── 4.nurual_network_back_propagation │ │ │ ├── ex4.nerual_network_backpropagation.ipynb │ │ │ ├── ex4data1.mat │ │ │ ├── ex4weights.mat │ │ │ └── img │ │ │ │ └── nn_model.png │ │ ├── 7.kmeans_and_PCA │ │ │ ├── data │ │ │ │ ├── bird_small.mat │ │ │ │ ├── bird_small.png │ │ │ │ ├── ex7data1.mat │ │ │ │ ├── ex7data2.mat │ │ │ │ └── ex7faces.mat │ │ │ └── ex7.K-means_and_PCA.ipynb │ │ ├── knn-notebook.ipynb │ │ └── test.txt │ └── README.md ├── Python教程 │ └── README.md ├── README.md └── 资料 │ └── README.me ├── 强化学习进阶 └── README.md └── 联邦学习进阶 ├── README.md └── 联邦学习图像分类 ├── README.md ├── client.py ├── datasets.py ├── figures ├── fig2.png └── fig31.png ├── main.py ├── models.py ├── server.py └── utils └── conf.json /Coding论坛/PyG/TEST.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Coding论坛/Pytorch/TEST.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /Coding论坛/README.md: -------------------------------------------------------------------------------- 1 | # 说明 2 | 大家可以将coding过程中调参经验、踩过的坑、代码优化技巧、注意事项等分享至本库,相互交流。为方便查阅,现分为Pytorch、torch_geometric、分布式三个部分,后续根据需求增删 3 | 
-------------------------------------------------------------------------------- /Coding论坛/分布式/TEST.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | # AI Course 4 | This course is organized around the lab's current research directions. The [deep learning fundamentals crash course](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A) is the part every student must master; beyond it, the material branches into three advanced tracks: [reinforcement learning](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%BC%BA%E5%8C%96%E5%AD%A6%E4%B9%A0%E8%BF%9B%E9%98%B6), [graph neural networks](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9B%BE%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C%E8%BF%9B%E9%98%B6), and [federated learning](https://github.com/UNIC-Lab/AI_Course/tree/main/%E8%81%94%E9%82%A6%E5%AD%A6%E4%B9%A0%E8%BF%9B%E9%98%B6). The fundamentals part is deliberately kept as basic as possible so you can get up to speed quickly; the advanced tracks are optional and can be chosen freely. The reinforcement learning and graph neural network tracks are built mainly around papers and projects. 5 | 6 | ## Fundamentals Crash Course 7 | To keep the material lean, this part skips mathematical foundations; it is built around selected open courses, textbooks, and code examples. 8 | - **Python programming basics** (skip this part if you already have a Python background) 9 | - [ ] [Python introduction and installation](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Python%E6%95%99%E7%A8%8B#python%E4%BB%8B%E7%BB%8D%E4%B8%8E%E5%AE%89%E8%A3%85) 10 | - [ ] [Managing Python virtual environments](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Python%E6%95%99%E7%A8%8B#python%E8%99%9A%E6%8B%9F%E7%8E%AF%E5%A2%83%E7%AE%A1%E7%90%86) 11 | - [ ] [Code editors](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Python%E6%95%99%E7%A8%8B#%E4%BB%A3%E7%A0%81%E7%BC%96%E8%BE%91%E5%99%A8) 12 | - [ ] [Basic syntax](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Python%E6%95%99%E7%A8%8B#%E7%AC%AC%E4%B8%80%E9%98%B6%E6%AE%B5%E5%9F%BA%E7%A1%80%E8%AF%AD%E6%B3%95) 13 | - [ ] [Advanced features (optional)](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Python%E6%95%99%E7%A8%8B#%E7%AC%AC%E4%BA%8C%E9%98%B6%E6%AE%B5%E9%AB%98%E9%98%B6%E7%89%B9%E6%80%A7) 14 | - [ ] [Python data processing basics](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Python%E6%95%99%E7%A8%8B#%E7%AC%AC%E4%B8%89%E9%98%B6%E6%AE%B5%E6%95%B0%E6%8D%AE%E5%A4%84%E7%90%86) 15 | - **Machine learning basics** 16 | - [ ] [From linear regression to neural networks](https://github.com/UNIC-Lab/AI_Course/blob/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B/README.md#%E7%BA%BF%E6%80%A7%E5%9B%9E%E5%BD%92%E5%88%B0%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C) 17 | - [ ] [Classical machine learning algorithms (optional)](https://github.com/UNIC-Lab/AI_Course/blob/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B/README.md#%E4%BC%A0%E7%BB%9F%E6%9C%BA%E5%99%A8%E5%AD%A6%E4%B9%A0%E7%AE%97%E6%B3%95) 18 | - [ ] [Model evaluation and tuning](https://github.com/UNIC-Lab/AI_Course/blob/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B/README.md#%E6%A8%A1%E5%9E%8B%E8%AF%84%E4%BC%B0%E4%B8%8E%E8%B0%83%E4%BC%98) 19 | - **Deep learning basics** 20 | - [ ] [PyTorch quick start](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Deep%20Learning%E6%95%99%E7%A8%8B#pytorch%E5%BF%AB%E9%80%9F%E5%85%A5%E9%97%A8) 21 | - [ ] [Convolutional neural networks](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Deep%20Learning%E6%95%99%E7%A8%8B#%E5%8D%B7%E7%A7%AF%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C) 22 | - [ ] [Recurrent neural networks](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Deep%20Learning%E6%95%99%E7%A8%8B#%E5%BE%AA%E7%8E%AF%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C) 23 | - [ ] 
[Attention mechanisms](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Deep%20Learning%E6%95%99%E7%A8%8B#%E6%B3%A8%E6%84%8F%E5%8A%9B%E6%9C%BA%E5%88%B6) 24 | - [ ] [Optimization algorithms](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Deep%20Learning%E6%95%99%E7%A8%8B#%E4%BC%98%E5%8C%96%E7%AE%97%E6%B3%95) 25 | - [ ] [Hyperparameter tuning](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Deep%20Learning%E6%95%99%E7%A8%8B#%E8%B0%83%E5%8F%82%E7%82%BC%E4%B8%B9) 26 | - [**Introduction to reinforcement learning**](https://www.bilibili.com/video/BV12o4y197US/?spm_id_from=333.337.search-card.all.click&vd_source=ef6bc9d073dccb208fb608bc99286677) 27 | 28 | ## Reinforcement Learning Track 29 | For the reinforcement learning track we list a number of must-read RL papers. If you already have some background, you can get started by reading the papers or related explanatory blog posts; otherwise we recommend starting with the [Spinning Up tutorial](https://spinningup.qiwihui.com/zh_CN/latest/) or [*Reinforcement Learning: An Introduction*](https://rl.qiwihui.com/zh_CN/latest/). 30 | ## Graph Neural Network Track 31 | The graph neural network material is comparatively small and simple; we suggest learning mainly by reading the listed papers. The main programming frameworks are PyG and DGL, which are easy to pick up with a PyTorch background, and the official documentation is the best textbook. 32 | ## Federated Learning Track 33 | -------------------------------------------------------------------------------- /图神经网络进阶/README.md: -------------------------------------------------------------------------------- 1 | # Graph Neural Network Track 2 | ## 1. Graph Convolutional Networks (GCN) 3 | GCN was the earliest graph neural network; its key idea is to generalize the convolution operation of neural networks to graphs. In a GCN, a node's representation is computed as a weighted sum of its neighbors' representations. GCNs are widely used for node classification, graph classification, and link prediction. 4 | - Paper: 5 | - [Semi-Supervised Classification with Graph Convolutional Networks](https://arxiv.org/abs/1609.02907) 6 | - [Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering](https://proceedings.neurips.cc/paper/2016/hash/04df4d434d481c5bb723be1b6df1ee65-Abstract.html) 7 | - [Spectral Networks and Locally Connected Networks on Graphs](https://arxiv.org/abs/1312.6203) 8 | 9 | ## 2. Graph Attention Networks (GAT) 10 | GAT is an attention-based graph neural network: when computing a node's representation, it assigns different weights to different neighbors. The attention mechanism lets GAT capture more complex relationships between nodes and improves the model's robustness. 11 | - Paper: 12 | - [Graph Attention Networks](https://arxiv.org/abs/1710.10903) 13 | ## 3. Graph Autoencoders (GAE) 14 | GAE is an unsupervised method that learns a low-dimensional representation of a graph while preserving its structural information. Training involves two parts: an encoder that maps the original graph to a low-dimensional representation, and a decoder that reconstructs the original graph from it. 15 | - Paper: 16 | - [Variational Graph Auto-Encoders](https://arxiv.org/abs/1611.07308) 17 | - [Deep Autoencoding Gaussian Mixture Model for Unsupervised Anomaly Detection](https://openreview.net/forum?id=BJJLHbb0-) 18 | ## 4. Graph Generative Models 19 | Graph generative models are used to generate graphs with a particular structure. Early classics build on random-walk ideas, as in DeepWalk and Node2Vec, learning similarities between nodes so as to produce graphs with similar structure; more recent models generate graphs directly, for example with adversarial training (GraphGAN) or variational autoencoders (Junction Tree VAE). 20 | - Paper: 21 | - [GraphGAN: Graph Representation Learning with Generative Adversarial Nets](https://ojs.aaai.org/index.php/AAAI/article/view/11872) 22 | - [Junction Tree Variational Autoencoder for Molecular Graph Generation](https://proceedings.mlr.press/v80/jin18a.html) 23 | ## 5. Graph Reinforcement Learning 24 | Graph reinforcement learning targets node-, edge-, and graph-level tasks. The core idea is to treat nodes or edges as agents that take actions to maximize cumulative reward. RL-based graph classification and graph generation methods have achieved good results in recent years. 25 | - Paper: 26 | - [Graph Convolutional Reinforcement Learning](https://arxiv.org/abs/1810.09202) 27 | 28 | Other algorithms: many further GNN variants exist, such as GraphSAGE, DiffPool, GIN, TAGCN, APPNP, and GNN-LSTM; they mainly differ in the graph convolution and feature aggregation schemes they adopt for different tasks. 29 | 30 | The above covers only the most common families; we recommend starting with GCN, GAT, and GAE and expanding to the other algorithms from there. 31 | 32 | ## 6. 
Learning Resources 33 | 34 | - Books: 《图神经网络实战》 (王磊), 《图神经网络:基础与前沿》 (左腾飞) 35 | 36 | 37 | - Video course: [CS224W: Machine Learning with Graphs (Stanford)](https://www.bilibili.com/video/BV1s54y1H76H/?vd_source=ef6bc9d073dccb208fb608bc99286677) 38 | 39 | - [PaperList](https://github.com/GRAND-Lab/Awesome-Graph-Neural-Networks) 40 | 41 | 42 | ## 7. Frameworks 43 | [PyTorch Geometric](https://pytorch-geometric.readthedocs.io/en/latest/), [DGL](https://docs.dgl.ai/) 44 | 45 | [Hands-on PyG tutorial](https://towardsdatascience.com/hands-on-graph-neural-networks-with-pytorch-pytorch-geometric-359487e221a8) 46 | 47 | ## Notes 48 | This guide is aimed at getting started quickly: it lists only a few classic papers per direction; see the PaperList project for more. For coding we recommend the PyTorch-based PyG; with basic PyTorch you can follow the hands-on tutorial above and try to get the demo running. Ask for help with environment setup and similar issues; there is no need to get stuck on problems that basic and tedious. 《图神经网络:基础与前沿》 is widely regarded as a high-quality Chinese textbook; it can be borrowed from the library or purchased by the lab. 49 | -------------------------------------------------------------------------------- /基础知识速通/Deep Learning教程/Projects/LeNet/lenet.py: -------------------------------------------------------------------------------- 1 | import torch.nn as nn 2 | from collections import OrderedDict 3 | 4 | 5 | class C1(nn.Module): 6 | def __init__(self): 7 | super(C1, self).__init__() 8 | 9 | self.c1 = nn.Sequential(OrderedDict([ 10 | ('c1', nn.Conv2d(1, 6, kernel_size=(5, 5))), 11 | ('relu1', nn.ReLU()), 12 | ('s1', nn.MaxPool2d(kernel_size=(2, 2), stride=2)) 13 | ])) 14 | 15 | def forward(self, img): 16 | output = self.c1(img) 17 | return output 18 | 19 | 20 | class C2(nn.Module): 21 | def __init__(self): 22 | super(C2, self).__init__() 23 | 24 | self.c2 = nn.Sequential(OrderedDict([ 25 | ('c2', nn.Conv2d(6, 16, kernel_size=(5, 5))), 26 | ('relu2', nn.ReLU()), 27 | ('s2', nn.MaxPool2d(kernel_size=(2, 2), stride=2)) 28 | ])) 29 | 30 | def forward(self, img): 31 | output = self.c2(img) 32 | return output 33 | 34 | 35 | class C3(nn.Module): 36 | def __init__(self): 37 | super(C3, self).__init__() 38 | 39 | self.c3 = nn.Sequential(OrderedDict([ 40 | ('c3', nn.Conv2d(16, 120, kernel_size=(5, 5))), 41 | ('relu3', nn.ReLU()) 42 | ])) 43 | 44 | def 
forward(self, img): 45 | output = self.c3(img) 46 | return output 47 | 48 | 49 | class F4(nn.Module): 50 | def __init__(self): 51 | super(F4, self).__init__() 52 | 53 | self.f4 = nn.Sequential(OrderedDict([ 54 | ('f4', nn.Linear(120, 84)), 55 | ('relu4', nn.ReLU()) 56 | ])) 57 | 58 | def forward(self, img): 59 | output = self.f4(img) 60 | return output 61 | 62 | 63 | class F5(nn.Module): 64 | def __init__(self): 65 | super(F5, self).__init__() 66 | 67 | self.f5 = nn.Sequential(OrderedDict([ 68 | ('f5', nn.Linear(84, 10)), 69 | ('sig5', nn.LogSoftmax(dim=-1)) 70 | ])) 71 | 72 | def forward(self, img): 73 | output = self.f5(img) 74 | return output 75 | 76 | 77 | class LeNet5(nn.Module): 78 | """ 79 | Input - 1x32x32 80 | Output - 10 81 | """ 82 | def __init__(self): 83 | super(LeNet5, self).__init__() 84 | 85 | self.c1 = C1() 86 | self.c2_1 = C2() 87 | self.c2_2 = C2() 88 | self.c3 = C3() 89 | self.f4 = F4() 90 | self.f5 = F5() 91 | 92 | def forward(self, img): 93 | output = self.c1(img) 94 | 95 | x = self.c2_1(output) 96 | output = self.c2_2(output) 97 | 98 | output += x 99 | 100 | output = self.c3(output) 101 | output = output.view(img.size(0), -1) 102 | output = self.f4(output) 103 | output = self.f5(output) 104 | return output 105 | -------------------------------------------------------------------------------- /基础知识速通/Deep Learning教程/Projects/LeNet/requirements.txt: -------------------------------------------------------------------------------- 1 | numpy>=1.17.0 2 | torch>=1.4.0 3 | torchvision>=0.4.0 4 | visdom>=0.1.6 5 | Pillow==6.2.0 6 | onnx==1.6.0 7 | -------------------------------------------------------------------------------- /基础知识速通/Deep Learning教程/Projects/LeNet/run.py: -------------------------------------------------------------------------------- 1 | from lenet import LeNet5 2 | import torch 3 | import torch.nn as nn 4 | import torch.optim as optim 5 | from torchvision.datasets.mnist import MNIST 6 | import torchvision.transforms as transforms 7 
| from torch.utils.data import DataLoader 8 | import visdom 9 | import onnx 10 | 11 | viz = visdom.Visdom() 12 | 13 | data_train = MNIST('./data/mnist', 14 | download=True, 15 | transform=transforms.Compose([ 16 | transforms.Resize((32, 32)), 17 | transforms.ToTensor()])) 18 | data_test = MNIST('./data/mnist', 19 | train=False, 20 | download=True, 21 | transform=transforms.Compose([ 22 | transforms.Resize((32, 32)), 23 | transforms.ToTensor()])) 24 | data_train_loader = DataLoader(data_train, batch_size=256, shuffle=True, num_workers=8) 25 | data_test_loader = DataLoader(data_test, batch_size=1024, num_workers=8) 26 | 27 | net = LeNet5() 28 | criterion = nn.NLLLoss()  # LeNet5 already ends with LogSoftmax, so use NLLLoss; CrossEntropyLoss would apply log-softmax twice 29 | optimizer = optim.Adam(net.parameters(), lr=2e-3) 30 | 31 | cur_batch_win = None 32 | cur_batch_win_opts = { 33 | 'title': 'Epoch Loss Trace', 34 | 'xlabel': 'Batch Number', 35 | 'ylabel': 'Loss', 36 | 'width': 1200, 37 | 'height': 600, 38 | } 39 | 40 | 41 | def train(epoch): 42 | global cur_batch_win 43 | net.train() 44 | loss_list, batch_list = [], [] 45 | for i, (images, labels) in enumerate(data_train_loader): 46 | optimizer.zero_grad() 47 | 48 | output = net(images) 49 | 50 | loss = criterion(output, labels) 51 | 52 | loss_list.append(loss.detach().cpu().item()) 53 | batch_list.append(i+1) 54 | 55 | if i % 10 == 0: 56 | print('Train - Epoch %d, Batch: %d, Loss: %f' % (epoch, i, loss.detach().cpu().item())) 57 | 58 | # Update Visualization 59 | if viz.check_connection(): 60 | cur_batch_win = viz.line(torch.Tensor(loss_list), torch.Tensor(batch_list), 61 | win=cur_batch_win, name='current_batch_loss', 62 | update=(None if cur_batch_win is None else 'replace'), 63 | opts=cur_batch_win_opts) 64 | 65 | loss.backward() 66 | optimizer.step() 67 | 68 | 69 | def test(): 70 | net.eval() 71 | total_correct = 0 72 | avg_loss = 0.0 73 | for i, (images, labels) in enumerate(data_test_loader): 74 | output = net(images) 75 | avg_loss += criterion(output, labels) * images.size(0)  # criterion returns the batch mean; scale back to a sum so the division below yields the dataset average 76 | pred = 
output.detach().max(1)[1] 77 | total_correct += pred.eq(labels.view_as(pred)).sum() 78 | 79 | avg_loss /= len(data_test) 80 | print('Test Avg. Loss: %f, Accuracy: %f' % (avg_loss.detach().cpu().item(), float(total_correct) / len(data_test))) 81 | 82 | 83 | def train_and_test(epoch): 84 | train(epoch) 85 | test() 86 | 87 | dummy_input = torch.randn(1, 1, 32, 32, requires_grad=True) 88 | torch.onnx.export(net, dummy_input, "lenet.onnx") 89 | 90 | onnx_model = onnx.load("lenet.onnx") 91 | onnx.checker.check_model(onnx_model) 92 | 93 | 94 | def main(): 95 | for e in range(1, 16): 96 | train_and_test(e) 97 | 98 | 99 | if __name__ == '__main__': 100 | main() 101 | -------------------------------------------------------------------------------- /基础知识速通/Deep Learning教程/Projects/LeNet/test.txt: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /基础知识速通/Deep Learning教程/Projects/README.md: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /基础知识速通/Deep Learning教程/README.md: -------------------------------------------------------------------------------- 1 | - [*Deep Learning Basics*](#deep-learning-basics) 2 | - [PyTorch Quick Start](#pytorch-quick-start) 3 | - [Installing PyTorch](#installing-pytorch) 4 | - [Getting Started with PyTorch](#getting-started-with-pytorch) 5 | - [Convolutional Neural Networks](#convolutional-neural-networks) 6 | - [Recurrent Neural Networks](#recurrent-neural-networks) 7 | - [Attention Mechanisms](#attention-mechanisms) 8 | - [Optimization Algorithms](#optimization-algorithms) 9 | - [Hyperparameter Tuning](#hyperparameter-tuning) 10 | 11 | # *Deep Learning Basics* 12 | The machine learning tutorial already covered neural network fundamentals such as loss functions and gradient descent. This tutorial is based on the PyTorch framework, the second edition of *Dive into Deep Learning* (《动手学深度学习》), and its companion videos. First work through the PyTorch quick start to learn PyTorch's basic conventions and code logic, and try debugging the handwritten digit recognition project. Then study the three main network families: convolutional neural networks, recurrent neural networks, and attention mechanisms; each part comes with a demo project you can run and debug as you learn. Next, study several optimization algorithms to understand how optimizers work and how to choose among them. Finally, learn deep learning hyperparameter tuning techniques. 13 | 14 | 15 | ## PyTorch Quick Start 16 | PyTorch is a large framework that cannot be covered in a short time, so this part introduces just enough of the basics to read and run the later example code. For more detail, see this [Chinese PyTorch tutorial](https://pytorch123.com/) 17 | ### Installing PyTorch 18 | 
The installation guide below mainly targets Windows. 19 | - 1. Install CUDA and cuDNN 20 | Follow this [tutorial blog post](https://blog.csdn.net/m0_45447650/article/details/123704930) 21 | Note: no problems have come up with this tutorial so far; please report any issues you run into 22 | - 2. Install PyTorch from the command line 23 | - 1. First activate the Python environment you created and check the CUDA version 24 | - 2. Open the [official get-started page](https://pytorch.org/get-started/locally/) and get the install command matching your CUDA version 25 | ![image](https://github.com/UNIC-Lab/AI_Course/assets/90789521/9c250e82-e75d-4db2-8983-589bedd49475) 26 | 27 | Here `-c pytorch` means downloading from the official PyTorch channel, which is slow; configure a mirror to speed it up: 28 | `conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/pytorch/` 29 | After configuring the mirror, simply drop `-c pytorch` from the command 30 | 31 | - 3. Install PyTorch offline 32 | When command-line installation fails, the cause is usually a download timeout or a mismatched version. In that case, download the offline package matching your OS, Python version, and CUDA version from the [official wheel index](https://download.pytorch.org/whl/torch/) (using a proxy if needed); the Python version must match exactly, while the CUDA version is backward compatible. Then `cd` into the download directory and install it. 33 | Install command: `pip install [torch_file] -i https://pypi.tuna.tsinghua.edu.cn/simple` or `conda install [torch_file]` 34 | ### Getting Started with PyTorch 35 | 1. [Reference tutorial](https://ptorch.com/docs/3/deep_learning_60min_blitz) 36 | 2. [Video course](https://www.bilibili.com/video/BV1AK4y1P7vs?p=5&vd_source=ef6bc9d073dccb208fb608bc99286677) 37 | 3. 
Run and debug the handwritten digit recognition project 38 | ## Convolutional Neural Networks 39 | - Textbook: [*Dive into Deep Learning*, Ch. 6: Convolutional Neural Networks](https://zh-v2.d2l.ai/chapter_convolutional-neural-networks/index.html) 40 | - Video: [动手学深度学习v2 [P19-P29]](https://www.bilibili.com/video/BV1264y1i7R1/?spm_id_from=333.999.0.0&vd_source=ef6bc9d073dccb208fb608bc99286677) 41 | - Demo: run and debug the [LeNet project](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Deep%20Learning%E6%95%99%E7%A8%8B/Projects/LeNet) 42 | ## Recurrent Neural Networks 43 | - Textbook: [*Dive into Deep Learning*, Ch. 9: Modern Recurrent Neural Networks](https://zh-v2.d2l.ai/chapter_recurrent-modern/index.html) 44 | - Video: [动手学深度学习v2 [P54-P59]](https://www.bilibili.com/video/BV1264y1i7R1/?spm_id_from=333.999.0.0&vd_source=ef6bc9d073dccb208fb608bc99286677) 45 | - Demo: [Seq2Seq example](https://zh.d2l.ai/chapter_recurrent-modern/seq2seq.html) 46 | ## Attention Mechanisms 47 | - Textbook: [*Dive into Deep Learning*, Ch. 10: Attention Mechanisms](https://zh-v2.d2l.ai/chapter_attention-mechanisms/index.html) 48 | - Video: [动手学深度学习v2 [P64-P68]](https://www.bilibili.com/video/BV1264y1i7R1/?spm_id_from=333.999.0.0&vd_source=ef6bc9d073dccb208fb608bc99286677) 49 | - Paper: [Attention Is All You Need](https://arxiv.org/abs/1706.03762) 50 | - Demo: see this [Transformer implementation](https://github.com/jadore801120/attention-is-all-you-need-pytorch) 51 | 52 | ## Optimization Algorithms 53 | Understand what optimization algorithms and network optimization are, survey the common algorithms, and understand how optimizers work in PyTorch 54 | - Textbook: [*Dive into Deep Learning*, Ch. 11: Optimization Algorithms](https://zh-v2.d2l.ai/chapter_optimization/index.html) 55 | - Course: [动手学深度学习v2 [P72]](https://www.bilibili.com/video/BV1bP4y1p7Gq/?spm_id_from=333.999.0.0&vd_source=ef6bc9d073dccb208fb608bc99286677) 56 | ## Hyperparameter Tuning 57 | Learn general tuning methodology: which parameters are tunable, what the basic directions for tuning are, and so on. See the [Deep Learning Tuning Playbook](https://github.com/google-research/tuning_playbook) and its [Chinese translation](https://github.com/chunqiangqian/deepLearningTuning/blob/main/%E6%B7%B1%E5%BA%A6%E5%AD%A6%E4%B9%A0%E8%B0%83%E5%8F%82%E6%96%B9%E6%B3%95.md) 58 | -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/1.linear_regression/ex1.linear_regreesion.ipynb: 
-------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "colab_type": "text", 7 | "id": "ax1r8W0rU8d1" 8 | }, 9 | "source": [ 10 | "# 线性回归 - Linear Regression\n", 11 | "\n", 12 | "此Notebook是配合Andrew Ng \"Machine Learning\"中[线性回归](https://github.com/loveunk/machine-learning-deep-learning-notes/blob/master/machine-learning/linear-regression.md)部分学习使用。\n", 13 | "\n", 14 | "测试用python版本为3.6\n", 15 | "* 机器学习路径:https://github.com/loveunk/machine-learning-deep-learning-notes/\n", 16 | "* 内容正文综合参考网络资源,使用中如果有疑问请联络:www.kaikai.ai" 17 | ] 18 | }, 19 | { 20 | "cell_type": "code", 21 | "execution_count": 0, 22 | "metadata": { 23 | "colab": {}, 24 | "colab_type": "code", 25 | "id": "kgTnKLvOU8d2" 26 | }, 27 | "outputs": [], 28 | "source": [ 29 | "import pandas as pd\n", 30 | "import seaborn as sns\n", 31 | "sns.set(context=\"notebook\", style=\"whitegrid\", palette=\"dark\")\n", 32 | "import matplotlib.pyplot as plt\n", 33 | "import numpy as np" 34 | ] 35 | }, 36 | { 37 | "cell_type": "code", 38 | "execution_count": 0, 39 | "metadata": { 40 | "colab": {}, 41 | "colab_type": "code", 42 | "id": "Nje4hmN0U8d6" 43 | }, 44 | "outputs": [], 45 | "source": [ 46 | "df = pd.read_csv('ex1data1.txt', names=['population', 'profit']) # 读取数据并赋予列名" 47 | ] 48 | }, 49 | { 50 | "cell_type": "code", 51 | "execution_count": 0, 52 | "metadata": { 53 | "colab": {}, 54 | "colab_type": "code", 55 | "id": "a9p1GK2iU8d8" 56 | }, 57 | "outputs": [], 58 | "source": [ 59 | "df.head() # 显示数据前五行" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": 0, 65 | "metadata": { 66 | "colab": {}, 67 | "colab_type": "code", 68 | "id": "Ly0Fs3unU8eA" 69 | }, 70 | "outputs": [], 71 | "source": [ 72 | "df.info() # 打印df的class信息" 73 | ] 74 | }, 75 | { 76 | "cell_type": "code", 77 | "execution_count": 0, 78 | "metadata": { 79 | "colab": {}, 80 | "colab_type": "code", 81 | "id": "WsQ2uPs6muT7" 82 | }, 83 | 
"outputs": [], 84 | "source": [ 85 | "df.describe() # 打印df的统计信息" 86 | ] 87 | }, 88 | { 89 | "cell_type": "markdown", 90 | "metadata": { 91 | "colab_type": "text", 92 | "id": "dtj0pJAOU8eE" 93 | }, 94 | "source": [ 95 | "***\n", 96 | "# 看下原始数据" 97 | ] 98 | }, 99 | { 100 | "cell_type": "code", 101 | "execution_count": 0, 102 | "metadata": { 103 | "colab": {}, 104 | "colab_type": "code", 105 | "id": "ON7EiaK7U8eE" 106 | }, 107 | "outputs": [], 108 | "source": [ 109 | "sns.lmplot('population', 'profit', df, size=6, fit_reg=False)\n", 110 | "plt.show()" 111 | ] 112 | }, 113 | { 114 | "cell_type": "code", 115 | "execution_count": 0, 116 | "metadata": { 117 | "colab": {}, 118 | "colab_type": "code", 119 | "id": "wRWRxgtAU8eH" 120 | }, 121 | "outputs": [], 122 | "source": [ 123 | "def get_X(df): # 读取特征\n", 124 | "# \"\"\"\n", 125 | "# use concat to add intersect feature to avoid side effect\n", 126 | "# not efficient for big dataset though\n", 127 | "# \"\"\"\n", 128 | " ones = pd.DataFrame({'ones': np.ones(len(df))})#ones是m行1列的dataframe\n", 129 | " data = pd.concat([ones, df], axis=1) # 合并数据,根据列合并\n", 130 | " return data.iloc[:, :-1].as_matrix() # 这个操作返回 ndarray,不是矩阵\n", 131 | "\n", 132 | "\n", 133 | "def get_y(df):#读取标签\n", 134 | "# '''assume the last column is the target'''\n", 135 | " return np.array(df.iloc[:, -1])#df.iloc[:, -1]是指df的最后一列\n", 136 | "\n", 137 | "\n", 138 | "def normalize_feature(df):\n", 139 | "# \"\"\"Applies function along input axis(default 0) of DataFrame.\"\"\"\n", 140 | " return df.apply(lambda column: (column - column.mean()) / column.std())#特征缩放" 141 | ] 142 | }, 143 | { 144 | "cell_type": "markdown", 145 | "metadata": { 146 | "colab_type": "text", 147 | "id": "GPFqxv_zU8eJ" 148 | }, 149 | "source": [ 150 | "多变量的假设 h 表示为:${{h}_{\\theta }}\\left( x \\right)={{\\theta }_{0}}+{{\\theta }_{1}}{{x}_{1}}+{{\\theta }_{2}}{{x}_{2}}+...+{{\\theta }_{n}}{{x}_{n}}$。\n", 151 | "\n", 152 | "这个公式中有n+1个参数和n个变量,为了使得公式能够简化一些,引入${{x}_{0}}=1$,则公式转化为: 
${{h}_{\\theta }}\\left( x \\right)={{\\theta }_{0}x_0}+{{\\theta }_{1}}{{x}_{1}}+{{\\theta }_{2}}{{x}_{2}}+...+{{\\theta }_{n}}{{x}_{n}}$。\n", 153 | "\n", 154 | "此时模型中的参数是一个n+1维的向量,任何一个训练实例也都是n+1维的向量,特征矩阵X的维度是 m*(n+1)。 因此公式可以简化为:${{h}_{\\theta }}\\left( x \\right)={{\\theta }^{T}}X$,其中上标T代表矩阵转置。\n" 155 | ] 156 | }, 157 | { 158 | "cell_type": "markdown", 159 | "metadata": { 160 | "colab_type": "text", 161 | "id": "on8_khfsU8eQ" 162 | }, 163 | "source": [ 164 | "# 计算代价函数\n", 165 | "$$J\\left( \\theta \\right)=\\frac{1}{2m}\\sum\\limits_{i=1}^{m}{{{\\left( {{h}_{\\theta }}\\left( {{x}^{(i)}} \\right)-{{y}^{(i)}} \\right)}^{2}}}$$\n", 166 | "\n", 167 | "其中:\n", 168 | "\n", 169 | "$${{h}_{\\theta }}\\left( x \\right)={{\\theta }^{T}}X={{\\theta }_{0}}{{x}_{0}}+{{\\theta }_{1}}{{x}_{1}}+{{\\theta }_{2}}{{x}_{2}}+...+{{\\theta }_{n}}{{x}_{n}}$$" 170 | ] 171 | }, 172 | { 173 | "cell_type": "code", 174 | "execution_count": 0, 175 | "metadata": { 176 | "colab": {}, 177 | "colab_type": "code", 178 | "id": "yomFBPelU8eQ" 179 | }, 180 | "outputs": [], 181 | "source": [ 182 | "# 查看数据维度\n", 183 | "data = df\n", 184 | "X = get_X(data)\n", 185 | "print(X.shape, type(X))\n", 186 | "\n", 187 | "y = get_y(data)\n", 188 | "print(y.shape, type(y))\n" 189 | ] 190 | }, 191 | { 192 | "cell_type": "code", 193 | "execution_count": 0, 194 | "metadata": { 195 | "colab": {}, 196 | "colab_type": "code", 197 | "id": "AeTfJ6UyU8eT" 198 | }, 199 | "outputs": [], 200 | "source": [ 201 | "theta = np.zeros(X.shape[1]) # X.shape[1]=2, 代表特征数n\n", 202 | "print(theta)" 203 | ] 204 | }, 205 | { 206 | "cell_type": "code", 207 | "execution_count": 0, 208 | "metadata": { 209 | "colab": {}, 210 | "colab_type": "code", 211 | "id": "ntnGgUoKU8eV" 212 | }, 213 | "outputs": [], 214 | "source": [ 215 | "def lr_cost(theta, X, y):\n", 216 | " \"\"\" 计算代价函数\n", 217 | " X: R(m*n), m 样本数, n 特征数\n", 218 | " y: R(m)\n", 219 | " theta : R(n), 线性回归的参数\n", 220 | " \"\"\"\n", 221 | " m = X.shape[0]#m为样本数\n", 222 | "\n", 223 
| " inner = X @ theta - y # R(m*1),X @ theta等价于X.dot(theta)\n", 224 | "\n", 225 | " # 1*m @ m*1 = 1*1 in matrix multiplication\n", 226 | " # but you know numpy didn't do transpose in 1d array, so here is just a\n", 227 | " # vector inner product to itselves\n", 228 | " square_sum = inner.T @ inner\n", 229 | " cost = square_sum / (2 * m)\n", 230 | "\n", 231 | " return cost" 232 | ] 233 | }, 234 | { 235 | "cell_type": "code", 236 | "execution_count": 0, 237 | "metadata": { 238 | "colab": {}, 239 | "colab_type": "code", 240 | "id": "BPFR1bKOU8eY" 241 | }, 242 | "outputs": [], 243 | "source": [ 244 | "lr_cost(theta, X, y) # 返回cost的值" 245 | ] 246 | }, 247 | { 248 | "cell_type": "markdown", 249 | "metadata": { 250 | "colab_type": "text", 251 | "id": "KZ9oOCHrU8eb" 252 | }, 253 | "source": [ 254 | "# 批量梯度下降 - Batch Gradient Decent\n", 255 | "$$\\begin{aligned}{{\\theta }_{j}} &:={{\\theta }_{j}}-\\alpha \\frac{\\partial }{\\partial {{\\theta }_{j}}}J\\left( \\theta \\right) \\\\ &:= {{\\theta }_{j}}-\\alpha \\frac{1}{m} \\sum^{m}_{i=1}\\left( h_\\theta \\left(x^{(i)}\\right) -y^{(i)} \\right)x^{(i)}_j \\end{aligned}$$\n", 256 | "注意:对于所有的$j$,需要同时更新$\\theta_j$。" 257 | ] 258 | }, 259 | { 260 | "cell_type": "code", 261 | "execution_count": 0, 262 | "metadata": { 263 | "colab": {}, 264 | "colab_type": "code", 265 | "id": "4_JZqMY3U8ec" 266 | }, 267 | "outputs": [], 268 | "source": [ 269 | "def gradient(theta, X, y):\n", 270 | " \"\"\"\n", 271 | " 计算梯度,也就是 J(θ)的偏导数\n", 272 | " \"\"\"\n", 273 | " m = X.shape[0]\n", 274 | "\n", 275 | " inner = X.T @ (X @ theta - y) # (m,n).T @ (m, 1) -> (n, 1),X @ theta等价于X.dot(theta)\n", 276 | "\n", 277 | " return inner / m" 278 | ] 279 | }, 280 | { 281 | "cell_type": "code", 282 | "execution_count": 0, 283 | "metadata": { 284 | "colab": {}, 285 | "colab_type": "code", 286 | "id": "IRexXn6EU8ee" 287 | }, 288 | "outputs": [], 289 | "source": [ 290 | "def batch_gradient_decent(theta, X, y, epoch, alpha=0.01):\n", 291 | " \"\"\"\n", 292 | " 
批量梯度下降函数。拟合线性回归,返回参数和代价\n", 293 | " epoch: 批处理的轮数\n", 294 | " \"\"\"\n", 295 | " cost_data = [lr_cost(theta, X, y)]\n", 296 | " _theta = theta.copy() # 拷贝一份,不和原来的theta混淆\n", 297 | "\n", 298 | " for _ in range(epoch):\n", 299 | " _theta = _theta - alpha * gradient(_theta, X, y)\n", 300 | " cost_data.append(lr_cost(_theta, X, y))\n", 301 | "\n", 302 | " return _theta, cost_data" 303 | ] 304 | }, 305 | { 306 | "cell_type": "code", 307 | "execution_count": 0, 308 | "metadata": { 309 | "colab": {}, 310 | "colab_type": "code", 311 | "id": "Lx_dnNrxU8ei" 312 | }, 313 | "outputs": [], 314 | "source": [ 315 | "epoch = 500\n", 316 | "final_theta, cost_data = batch_gradient_decent(theta, X, y, epoch)" 317 | ] 318 | }, 319 | { 320 | "cell_type": "code", 321 | "execution_count": 0, 322 | "metadata": { 323 | "colab": {}, 324 | "colab_type": "code", 325 | "id": "kCe-db1AU8en" 326 | }, 327 | "outputs": [], 328 | "source": [ 329 | "final_theta\n", 330 | "#最终的theta" 331 | ] 332 | }, 333 | { 334 | "cell_type": "code", 335 | "execution_count": 0, 336 | "metadata": { 337 | "colab": {}, 338 | "colab_type": "code", 339 | "id": "yigjBbL4U8eq" 340 | }, 341 | "outputs": [], 342 | "source": [ 343 | "cost_data\n", 344 | "# 看下代价数据" 345 | ] 346 | }, 347 | { 348 | "cell_type": "code", 349 | "execution_count": 0, 350 | "metadata": { 351 | "colab": {}, 352 | "colab_type": "code", 353 | "id": "ZrMX8THHU8et" 354 | }, 355 | "outputs": [], 356 | "source": [ 357 | "# 计算最终的代价\n", 358 | "lr_cost(final_theta, X, y)" 359 | ] 360 | }, 361 | { 362 | "cell_type": "markdown", 363 | "metadata": { 364 | "colab_type": "text", 365 | "id": "B9VOLMPqk2Os" 366 | }, 367 | "source": [ 368 | "scikit-learn model的预测表现" 369 | ] 370 | }, 371 | { 372 | "cell_type": "code", 373 | "execution_count": 0, 374 | "metadata": { 375 | "colab": {}, 376 | "colab_type": "code", 377 | "id": "5gM2jYk2T0Hv", 378 | "scrolled": true 379 | }, 380 | "outputs": [], 381 | "source": [ 382 | "from sklearn import linear_model\n", 383 | "model = 
linear_model.LinearRegression()\n", 384 | "model.fit(X, y)\n", 385 | "\n", 386 | "x = X[:, 1]\n", 387 | "f = model.predict(X).flatten()\n", 388 | "\n", 389 | "plt.scatter(X[:,1], y, label='Training Data')\n", 390 | "plt.plot(x, f, 'r', label='Prediction')\n", 391 | "plt.legend(loc=2)\n", 392 | "plt.show()" 393 | ] 394 | }, 395 | { 396 | "cell_type": "markdown", 397 | "metadata": { 398 | "colab_type": "text", 399 | "id": "rsQw66Y9U8ew" 400 | }, 401 | "source": [ 402 | "# 代价数据可视化" 403 | ] 404 | }, 405 | { 406 | "cell_type": "code", 407 | "execution_count": 0, 408 | "metadata": { 409 | "colab": {}, 410 | "colab_type": "code", 411 | "id": "YZyEbK4RU8ew" 412 | }, 413 | "outputs": [], 414 | "source": [ 415 | "ax = sns.lineplot(x=np.arange(epoch+1), y=cost_data)\n", 416 | "ax.set_xlabel('epoch')\n", 417 | "ax.set_ylabel('cost')\n", 418 | "plt.show()\n", 419 | "#可以看到从第二轮起代价数据变化很大,接下来趋于平稳" 420 | ] 421 | }, 422 | { 423 | "cell_type": "code", 424 | "execution_count": 0, 425 | "metadata": { 426 | "colab": {}, 427 | "colab_type": "code", 428 | "id": "5XhEi6SqU8ez" 429 | }, 430 | "outputs": [], 431 | "source": [ 432 | "b = final_theta[0] # intercept,Y轴上的截距\n", 433 | "m = final_theta[1] # slope,斜率\n", 434 | "\n", 435 | "plt.scatter(data.population, data.profit, label=\"Training data\")\n", 436 | "plt.plot(data.population, data.population*m + b, 'r', label=\"Prediction\")\n", 437 | "plt.legend(loc=2)\n", 438 | "plt.show()" 439 | ] 440 | }, 441 | { 442 | "cell_type": "markdown", 443 | "metadata": { 444 | "colab_type": "text", 445 | "id": "67DO0QyBU8e2" 446 | }, 447 | "source": [ 448 | "# 3. 选修章节" 449 | ] 450 | }, 451 | { 452 | "cell_type": "code", 453 | "execution_count": 0, 454 | "metadata": { 455 | "colab": {}, 456 | "colab_type": "code", 457 | "id": "Tb-CO9raU8e2" 458 | }, 459 | "outputs": [], 460 | "source": [ 461 | "raw_data = pd.read_csv('ex1data2.txt', names=['square', 'bedrooms', 'price'])\n", 462 | "raw_data.head()" 463 | ] 464 | }, 465 | { 466 | "cell_type": "markdown", 467 | 
"metadata": { 468 | "colab_type": "text", 469 | "collapsed": true, 470 | "id": "7WBXHMXzU8e6" 471 | }, 472 | "source": [ 473 | "# 标准化数据\n", 474 | "最简单的方法是令:\n", 475 | "\n", 476 | " \n", 477 | "\n", 478 | "其中 是平均值,sn 是标准差。\n" 479 | ] 480 | }, 481 | { 482 | "cell_type": "code", 483 | "execution_count": 0, 484 | "metadata": { 485 | "colab": {}, 486 | "colab_type": "code", 487 | "id": "sFLN1gSfU8e7" 488 | }, 489 | "outputs": [], 490 | "source": [ 491 | "def normalize_feature(df):\n", 492 | "# \"\"\"Applies function along input axis(default 0) of DataFrame.\"\"\"\n", 493 | " return df.apply(lambda column: (column - column.mean()) / column.std())" 494 | ] 495 | }, 496 | { 497 | "cell_type": "code", 498 | "execution_count": 0, 499 | "metadata": { 500 | "colab": {}, 501 | "colab_type": "code", 502 | "id": "SL2V8gRmU8e-" 503 | }, 504 | "outputs": [], 505 | "source": [ 506 | "data = normalize_feature(raw_data)\n", 507 | "data.head()" 508 | ] 509 | }, 510 | { 511 | "cell_type": "markdown", 512 | "metadata": { 513 | "colab_type": "text", 514 | "id": "pjP2SsV3U8fE" 515 | }, 516 | "source": [ 517 | "# 2. 
多变量批量梯度下降 - Multi-var batch gradient descent" 518 | ] 519 | }, 520 | { 521 | "cell_type": "code", 522 | "execution_count": 0, 523 | "metadata": { 524 | "colab": {}, 525 | "colab_type": "code", 526 | "id": "XHTA9KcXU8fE" 527 | }, 528 | "outputs": [], 529 | "source": [ 530 | "X = get_X(data)\n", 531 | "print(X.shape, type(X))\n", 532 | "\n", 533 | "y = get_y(data)\n", 534 | "print(y.shape, type(y)) #看下数据的维度和类型" 535 | ] 536 | }, 537 | { 538 | "cell_type": "code", 539 | "execution_count": 0, 540 | "metadata": { 541 | "colab": {}, 542 | "colab_type": "code", 543 | "id": "PUHGSR5jU8fI" 544 | }, 545 | "outputs": [], 546 | "source": [ 547 | "alpha = 0.01 #学习率\n", 548 | "theta = np.zeros(X.shape[1]) #X.shape[1]:特征数n\n", 549 | "epoch = 500 #迭代次数" 550 | ] 551 | }, 552 | { 553 | "cell_type": "code", 554 | "execution_count": 0, 555 | "metadata": { 556 | "colab": {}, 557 | "colab_type": "code", 558 | "id": "NbC5-6NfU8fL" 559 | }, 560 | "outputs": [], 561 | "source": [ 562 | "final_theta, cost_data = batch_gradient_decent(theta, X, y, epoch, alpha=alpha)" 563 | ] 564 | }, 565 | { 566 | "cell_type": "code", 567 | "execution_count": 0, 568 | "metadata": { 569 | "colab": {}, 570 | "colab_type": "code", 571 | "id": "DTvLgpImU8fN" 572 | }, 573 | "outputs": [], 574 | "source": [ 575 | "plt.plot(np.arange(len(cost_data)), cost_data)  # sns.tsplot 已从新版 seaborn 移除,改用 matplotlib\n", 576 | "plt.xlabel('epoch', fontsize=18)\n", 577 | "plt.ylabel('cost', fontsize=18)\n", 578 | "plt.show()" 579 | ] 580 | }, 581 | { 582 | "cell_type": "code", 583 | "execution_count": 0, 584 | "metadata": { 585 | "colab": {}, 586 | "colab_type": "code", 587 | "id": "XIAeC3aWU8fQ" 588 | }, 589 | "outputs": [], 590 | "source": [ 591 | "final_theta" 592 | ] 593 | }, 594 | { 595 | "cell_type": "markdown", 596 | "metadata": { 597 | "colab_type": "text", 598 | "id": "xhv7H1OSTZJs" 599 | }, 600 | "source": [ 601 | "Scikit-learn 的预测" 602 | ] 603 | }, 604 | { 605 | "cell_type": "code", 606 | "execution_count": 0, 607 | "metadata": { 608 | "colab": {}, 
609 | "colab_type": "code", 610 | "id": "Dx3gtarBk2Os", 611 | "scrolled": true 612 | }, 613 | "outputs": [], 614 | "source": [ 615 | "from mpl_toolkits.mplot3d import Axes3D\n", 616 | "from sklearn import linear_model\n", 617 | "\n", 618 | "model = linear_model.LinearRegression()\n", 619 | "model.fit(X, y)\n", 620 | "\n", 621 | "f = model.predict(X).flatten()\n", 622 | "\n", 623 | "fig = plt.figure()\n", 624 | "ax = fig.add_subplot(111, projection='3d')\n", 625 | "ax.plot(X[:,1], X[:,2], f, 'r', label='Prediction')\n", 626 | "ax.scatter(X[:,1], X[:,2], y, label='Training Data')\n", 627 | "ax.legend(loc=2)\n", 628 | "ax.set_xlabel('square')\n", 629 | "ax.set_ylabel('bedrooms')\n", 630 | "ax.set_zlabel('price')\n", 631 | "ax.set_title('square & bedrooms vs. price')\n", 632 | "#ax.view_init(30, 10)\n", 633 | "plt.show()" 634 | ] 635 | }, 636 | { 637 | "cell_type": "markdown", 638 | "metadata": { 639 | "colab_type": "text", 640 | "id": "IRrnttEHU8fS" 641 | }, 642 | "source": [ 643 | "# 3. 学习率 - Learning rate" 644 | ] 645 | }, 646 | { 647 | "cell_type": "code", 648 | "execution_count": 0, 649 | "metadata": { 650 | "colab": {}, 651 | "colab_type": "code", 652 | "id": "w9U-3YXMU8fT" 653 | }, 654 | "outputs": [], 655 | "source": [ 656 | "base = np.logspace(-1, -5, num=4)\n", 657 | "candidate = np.sort(np.concatenate((base, base*3)))\n", 658 | "print(candidate)" 659 | ] 660 | }, 661 | { 662 | "cell_type": "code", 663 | "execution_count": 0, 664 | "metadata": { 665 | "colab": {}, 666 | "colab_type": "code", 667 | "id": "ePELIHY-U8fV" 668 | }, 669 | "outputs": [], 670 | "source": [ 671 | "epoch=50\n", 672 | "\n", 673 | "fig, ax = plt.subplots(figsize=(8, 8))\n", 674 | "\n", 675 | "for alpha in candidate:\n", 676 | " _, cost_data = batch_gradient_decent(theta, X, y, epoch, alpha=alpha)\n", 677 | " ax.plot(np.arange(epoch+1), cost_data, label=alpha)\n", 678 | "\n", 679 | "ax.set_xlabel('epoch', fontsize=12)\n", 680 | "ax.set_ylabel('cost', fontsize=12)\n", 681 | 
"ax.legend(bbox_to_anchor=(1.05, 1), loc=2, borderaxespad=0.)\n", 682 | "ax.set_title('learning rate', fontsize=12)\n", 683 | "plt.show()" 684 | ] 685 | }, 686 | { 687 | "cell_type": "markdown", 688 | "metadata": { 689 | "colab_type": "text", 690 | "id": "SwpPCazZg3Ik" 691 | }, 692 | "source": [ 693 | "可以看到最合适的learning rate是0.3" 694 | ] 695 | }, 696 | { 697 | "cell_type": "markdown", 698 | "metadata": { 699 | "colab_type": "text", 700 | "id": "6-HlM4onU8fX" 701 | }, 702 | "source": [ 703 | "# 4. 正规方程 - Normal equation\n", 704 | "正规方程是通过求解下面的方程来找出使得代价函数最小的参数的:$\\frac{\\partial }{\\partial {{\\theta }_{j}}}J\\left( {{\\theta }_{j}} \\right)=0$ 。\n", 705 | " 假设我们的训练集特征矩阵为 X(包含了${{x}_{0}}=1$)并且我们的训练集结果为向量 y,则利用正规方程解出向量 $\\theta ={{\\left( {{X}^{T}}X \\right)}^{-1}}{{X}^{T}}y$ 。\n", 706 | "上标T代表矩阵转置,上标-1 代表矩阵的逆。设矩阵$A={{X}^{T}}X$,则:${{\\left( {{X}^{T}}X \\right)}^{-1}}={{A}^{-1}}$\n", 707 | "\n", 708 | "梯度下降与正规方程的比较:\n", 709 | "\n", 710 | "梯度下降:需要选择学习率α,需要多次迭代,当特征数量n大时也能较好适用,适用于各种类型的模型\t\n", 711 | "\n", 712 | "正规方程:不需要选择学习率α,一次计算得出,需要计算${{\\left( {{X}^{T}}X \\right)}^{-1}}$,如果特征数量n较大则运算代价大,因为矩阵逆的计算时间复杂度为O(n3),通常来说当n小于10000 时还是可以接受的,只适用于线性模型,不适合逻辑回归模型等其他模型\n", 713 | "\n" 714 | ] 715 | }, 716 | { 717 | "cell_type": "code", 718 | "execution_count": 0, 719 | "metadata": { 720 | "colab": {}, 721 | "colab_type": "code", 722 | "id": "BGjQLe7jU8fY" 723 | }, 724 | "outputs": [], 725 | "source": [ 726 | "# 正规方程\n", 727 | "def normalEqn(X, y):\n", 728 | " theta = np.linalg.inv(X.T@X)@X.T@y#X.T@X等价于X.T.dot(X)\n", 729 | " return theta" 730 | ] 731 | }, 732 | { 733 | "cell_type": "code", 734 | "execution_count": 0, 735 | "metadata": { 736 | "colab": {}, 737 | "colab_type": "code", 738 | "id": "3JB8AH_iU8fa" 739 | }, 740 | "outputs": [], 741 | "source": [ 742 | "final_theta2=normalEqn(X, y)#感觉和批量梯度下降的theta的值有点差距\n", 743 | "final_theta2" 744 | ] 745 | }, 746 | { 747 | "cell_type": "code", 748 | "execution_count": 0, 749 | "metadata": { 750 | "colab": {}, 751 | "colab_type": "code", 752 
| "id": "YfBLlOZnY6Qi" 753 | }, 754 | "outputs": [], 755 | "source": [ 756 | "f = final_theta2[0] + final_theta2[1] * X[:,1] + final_theta2[2] * X[:,2]\n", 757 | "\n", 758 | "fig = plt.figure()\n", 759 | "ax = fig.add_subplot(111, projection='3d')\n", 760 | "ax.plot(X[:,1], X[:,2], f, 'r', label='Prediction')\n", 761 | "ax.scatter(X[:,1], X[:,2], y, label='Traning Data')\n", 762 | "ax.legend(loc=2)\n", 763 | "ax.set_xlabel('square')\n", 764 | "ax.set_ylabel('bedrooms')\n", 765 | "ax.set_zlabel('price')\n", 766 | "ax.set_title('square & bedrooms vs. price')\n", 767 | "#ax.view_init(30, 10)\n", 768 | "plt.show()" 769 | ] 770 | } 771 | ], 772 | "metadata": { 773 | "colab": { 774 | "collapsed_sections": [], 775 | "name": "1.linear_regreesion_v1.ipynb", 776 | "provenance": [], 777 | "toc_visible": true, 778 | "version": "0.3.2" 779 | }, 780 | "kernelspec": { 781 | "display_name": "Python (Python 3.6)", 782 | "language": "python", 783 | "name": "python36" 784 | }, 785 | "language_info": { 786 | "codemirror_mode": { 787 | "name": "ipython", 788 | "version": 3 789 | }, 790 | "file_extension": ".py", 791 | "mimetype": "text/x-python", 792 | "name": "python", 793 | "nbconvert_exporter": "python", 794 | "pygments_lexer": "ipython3", 795 | "version": "3.6.8" 796 | } 797 | }, 798 | "nbformat": 4, 799 | "nbformat_minor": 1 800 | } 801 | -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/1.linear_regression/ex1data1.txt: -------------------------------------------------------------------------------- 1 | 6.1101,17.592 2 | 5.5277,9.1302 3 | 8.5186,13.662 4 | 7.0032,11.854 5 | 5.8598,6.8233 6 | 8.3829,11.886 7 | 7.4764,4.3483 8 | 8.5781,12 9 | 6.4862,6.5987 10 | 5.0546,3.8166 11 | 5.7107,3.2522 12 | 14.164,15.505 13 | 5.734,3.1551 14 | 8.4084,7.2258 15 | 5.6407,0.71618 16 | 5.3794,3.5129 17 | 6.3654,5.3048 18 | 5.1301,0.56077 19 | 6.4296,3.6518 20 | 7.0708,5.3893 21 | 6.1891,3.1386 22 | 20.27,21.767 23 | 
5.4901,4.263 24 | 6.3261,5.1875 25 | 5.5649,3.0825 26 | 18.945,22.638 27 | 12.828,13.501 28 | 10.957,7.0467 29 | 13.176,14.692 30 | 22.203,24.147 31 | 5.2524,-1.22 32 | 6.5894,5.9966 33 | 9.2482,12.134 34 | 5.8918,1.8495 35 | 8.2111,6.5426 36 | 7.9334,4.5623 37 | 8.0959,4.1164 38 | 5.6063,3.3928 39 | 12.836,10.117 40 | 6.3534,5.4974 41 | 5.4069,0.55657 42 | 6.8825,3.9115 43 | 11.708,5.3854 44 | 5.7737,2.4406 45 | 7.8247,6.7318 46 | 7.0931,1.0463 47 | 5.0702,5.1337 48 | 5.8014,1.844 49 | 11.7,8.0043 50 | 5.5416,1.0179 51 | 7.5402,6.7504 52 | 5.3077,1.8396 53 | 7.4239,4.2885 54 | 7.6031,4.9981 55 | 6.3328,1.4233 56 | 6.3589,-1.4211 57 | 6.2742,2.4756 58 | 5.6397,4.6042 59 | 9.3102,3.9624 60 | 9.4536,5.4141 61 | 8.8254,5.1694 62 | 5.1793,-0.74279 63 | 21.279,17.929 64 | 14.908,12.054 65 | 18.959,17.054 66 | 7.2182,4.8852 67 | 8.2951,5.7442 68 | 10.236,7.7754 69 | 5.4994,1.0173 70 | 20.341,20.992 71 | 10.136,6.6799 72 | 7.3345,4.0259 73 | 6.0062,1.2784 74 | 7.2259,3.3411 75 | 5.0269,-2.6807 76 | 6.5479,0.29678 77 | 7.5386,3.8845 78 | 5.0365,5.7014 79 | 10.274,6.7526 80 | 5.1077,2.0576 81 | 5.7292,0.47953 82 | 5.1884,0.20421 83 | 6.3557,0.67861 84 | 9.7687,7.5435 85 | 6.5159,5.3436 86 | 8.5172,4.2415 87 | 9.1802,6.7981 88 | 6.002,0.92695 89 | 5.5204,0.152 90 | 5.0594,2.8214 91 | 5.7077,1.8451 92 | 7.6366,4.2959 93 | 5.8707,7.2029 94 | 5.3054,1.9869 95 | 8.2934,0.14454 96 | 13.394,9.0551 97 | 5.4369,0.61705 98 | -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/1.linear_regression/ex1data2.txt: -------------------------------------------------------------------------------- 1 | 2104,3,399900 2 | 1600,3,329900 3 | 2400,3,369000 4 | 1416,2,232000 5 | 3000,4,539900 6 | 1985,4,299900 7 | 1534,3,314900 8 | 1427,3,198999 9 | 1380,3,212000 10 | 1494,3,242500 11 | 1940,4,239999 12 | 2000,3,347000 13 | 1890,3,329999 14 | 4478,5,699900 15 | 1268,3,259900 16 | 2300,4,449900 17 | 1320,2,299900 18 | 1236,3,199900 19 
| 2609,4,499998 20 | 3031,4,599000 21 | 1767,3,252900 22 | 1888,2,255000 23 | 1604,3,242900 24 | 1962,4,259900 25 | 3890,3,573900 26 | 1100,3,249900 27 | 1458,3,464500 28 | 2526,3,469000 29 | 2200,3,475000 30 | 2637,3,299900 31 | 1839,2,349900 32 | 1000,1,169900 33 | 2040,4,314900 34 | 3137,3,579900 35 | 1811,4,285900 36 | 1437,3,249900 37 | 1239,3,229900 38 | 2132,4,345000 39 | 4215,4,549000 40 | 2162,4,287000 41 | 1664,2,368500 42 | 2238,3,329900 43 | 2567,4,314000 44 | 1200,3,299000 45 | 852,2,179900 46 | 1852,4,299900 47 | 1203,3,239500 48 | -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/2.logistic_regression/ex2.logistic_regression.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "colab_type": "text", 7 | "id": "6HPvYgkOjlHg" 8 | }, 9 | "source": [ 10 | "# 逻辑回归 Logistic Regression\n", 11 | "\n", 12 | "此Notebook是配合Andrew Ng \"Machine Learning\"中[逻辑回归](https://github.com/loveunk/machine-learning-deep-learning-notes/blob/master/machine-learning/logistic-regression.md)部分学习使用。\n", 13 | "\n", 14 | "测试用python版本为3.6\n", 15 | "* 机器学习路径:https://github.com/loveunk/machine-learning-deep-learning-notes/\n", 16 | "* 内容正文综合参考网络资源,使用中如果有疑问请联络:www.kaikai.ai" 17 | ] 18 | }, 19 | { 20 | "cell_type": "markdown", 21 | "metadata": { 22 | "colab_type": "text", 23 | "id": "2bSRbRGTjlHi" 24 | }, 25 | "source": [ 26 | "在这一次练习中,我们将要实现逻辑回归并且应用到一个分类任务。我们还将通过将正则化加入训练算法,来提高算法的鲁棒性,并用更复杂的情形来测试它。\n" 27 | ] 28 | }, 29 | { 30 | "cell_type": "markdown", 31 | "metadata": { 32 | "colab_type": "text", 33 | "id": "WXk4cLxBjlHj" 34 | }, 35 | "source": [ 36 | "## 准备数据" 37 | ] 38 | }, 39 | { 40 | "cell_type": "markdown", 41 | "metadata": { 42 | "colab_type": "text", 43 | "id": "nfrEaVuajlHl" 44 | }, 45 | "source": [ 46 | 
"在训练的初始阶段,我们将要构建一个逻辑回归模型来预测,某个学生是否被大学录取。设想你是大学相关部分的管理者,想通过申请学生两次测试的评分,来决定他们是否被录取。现在你拥有之前申请学生的可以用于训练逻辑回归的训练样本集。对于每一个训练样本,你有他们两次测试的评分和最后是被录取的结果。为了完成这个预测任务,我们准备构建一个可以基于两次测试评分来评估录取可能性的分类模型。" 47 | ] 48 | }, 49 | { 50 | "cell_type": "markdown", 51 | "metadata": { 52 | "colab_type": "text", 53 | "id": "2zO2vzfujlHm" 54 | }, 55 | "source": [ 56 | "让我们从检查数据开始。" 57 | ] 58 | }, 59 | { 60 | "cell_type": "code", 61 | "execution_count": 0, 62 | "metadata": { 63 | "colab": {}, 64 | "colab_type": "code", 65 | "id": "meEb58oJjlHn" 66 | }, 67 | "outputs": [], 68 | "source": [ 69 | "import numpy as np\n", 70 | "import pandas as pd\n", 71 | "import matplotlib.pyplot as plt\n", 72 | "plt.style.use('fivethirtyeight')\n", 73 | "from sklearn.metrics import classification_report # 这个包是评价报告" 74 | ] 75 | }, 76 | { 77 | "cell_type": "code", 78 | "execution_count": 0, 79 | "metadata": { 80 | "colab": {}, 81 | "colab_type": "code", 82 | "id": "zqEIXn3sjlHr" 83 | }, 84 | "outputs": [], 85 | "source": [ 86 | "path = 'ex2data1.txt'\n", 87 | "data = pd.read_csv(path, header=None, names=['exam1', 'exam2', 'admitted'])\n", 88 | "data.head()" 89 | ] 90 | }, 91 | { 92 | "cell_type": "code", 93 | "execution_count": 0, 94 | "metadata": { 95 | "colab": {}, 96 | "colab_type": "code", 97 | "id": "8r4bOK-K8J4i" 98 | }, 99 | "outputs": [], 100 | "source": [ 101 | "data.describe()" 102 | ] 103 | }, 104 | { 105 | "cell_type": "markdown", 106 | "metadata": { 107 | "colab_type": "text", 108 | "id": "bvQFnaZojlH1" 109 | }, 110 | "source": [ 111 | "让我们创建两个分数的散点图,并使用颜色编码来可视化,如果样本是正的(被接纳)或负的(未被接纳)。" 112 | ] 113 | }, 114 | { 115 | "cell_type": "code", 116 | "execution_count": 0, 117 | "metadata": { 118 | "colab": {}, 119 | "colab_type": "code", 120 | "id": "4VZ-TIUmjlH3" 121 | }, 122 | "outputs": [], 123 | "source": [ 124 | "positive = data[data['admitted'].isin([1])]\n", 125 | "negative = data[data['admitted'].isin([0])]\n", 126 | "\n", 127 | "fig, ax = plt.subplots(figsize=(12,8))\n", 128 | 
"ax.scatter(positive['exam1'], positive['exam2'], s=50, c='b', marker='o', label='Admitted')\n", 129 | "ax.scatter(negative['exam1'], negative['exam2'], s=50, c='r', marker='x', label='Not Admitted')\n", 130 | "ax.legend()\n", 131 | "ax.set_xlabel('Exam 1 Score')\n", 132 | "ax.set_ylabel('Exam 2 Score')\n", 133 | "plt.show()" 134 | ] 135 | }, 136 | { 137 | "cell_type": "code", 138 | "execution_count": 0, 139 | "metadata": { 140 | "colab": {}, 141 | "colab_type": "code", 142 | "id": "2dp3JtvpjBYR" 143 | }, 144 | "outputs": [], 145 | "source": [ 146 | "def get_X(df):#读取特征\n", 147 | "# \"\"\"\n", 148 | "# use concat to add intersect feature to avoid side effect\n", 149 | "# not efficient for big dataset though\n", 150 | "# \"\"\"\n", 151 | " ones = pd.DataFrame({'ones': np.ones(len(df))})#ones是m行1列的dataframe\n", 152 | " data = pd.concat([ones, df], axis=1) # 合并数据,根据列合并\n", 153 | " return data.iloc[:, :-1].as_matrix() # 这个操作返回 ndarray,不是矩阵\n", 154 | "\n", 155 | "\n", 156 | "def get_y(df):#读取标签\n", 157 | "# '''assume the last column is the target'''\n", 158 | " return np.array(df.iloc[:, -1])#df.iloc[:, -1]是指df的最后一列\n", 159 | "\n", 160 | "\n", 161 | "def normalize_feature(df):\n", 162 | "# \"\"\"Applies function along input axis(default 0) of DataFrame.\"\"\"\n", 163 | " return df.apply(lambda column: (column - column.mean()) / column.std())#特征缩放" 164 | ] 165 | }, 166 | { 167 | "cell_type": "code", 168 | "execution_count": 0, 169 | "metadata": { 170 | "colab": {}, 171 | "colab_type": "code", 172 | "id": "mMuhs2-hjBYW" 173 | }, 174 | "outputs": [], 175 | "source": [ 176 | "X = get_X(data)\n", 177 | "print(X.shape)\n", 178 | "\n", 179 | "y = get_y(data)\n", 180 | "print(y.shape)" 181 | ] 182 | }, 183 | { 184 | "cell_type": "markdown", 185 | "metadata": { 186 | "colab_type": "text", 187 | "id": "8R0uQXSZjlH7" 188 | }, 189 | "source": [ 190 | "看起来在两类间,有一个清晰的决策边界。现在我们需要实现逻辑回归,那样就可以训练一个模型来预测结果。" 191 | ] 192 | }, 193 | { 194 | "cell_type": "markdown", 195 | "metadata": { 196 | 
"colab_type": "text", 197 | "id": "R_GK_8JhjlH8" 198 | }, 199 | "source": [ 200 | "## Sigmoid 函数\n", 201 | "$g$ 代表一个常用的逻辑函数(logistic function)为S形函数(Sigmoid function),公式为: \\\\[g\\left( z \\right)=\\frac{1}{1+{{e}^{-z}}}\\\\] \n", 202 | "合起来,我们得到逻辑回归模型的假设函数: \n", 203 | "\t\\\\[{{h}_{\\theta }}\\left( x \\right)=\\frac{1}{1+{{e}^{-{{\\theta }^{T}}X}}}\\\\] " 204 | ] 205 | }, 206 | { 207 | "cell_type": "code", 208 | "execution_count": 0, 209 | "metadata": { 210 | "colab": {}, 211 | "colab_type": "code", 212 | "id": "6yd5Ddl2jlH-" 213 | }, 214 | "outputs": [], 215 | "source": [ 216 | "def sigmoid(z):\n", 217 | " return 1 / (1 + np.exp(-z))" 218 | ] 219 | }, 220 | { 221 | "cell_type": "markdown", 222 | "metadata": { 223 | "colab_type": "text", 224 | "id": "4SG4dpr3jlID" 225 | }, 226 | "source": [ 227 | "让我们做一个快速的检查,来确保它可以工作。" 228 | ] 229 | }, 230 | { 231 | "cell_type": "code", 232 | "execution_count": 0, 233 | "metadata": { 234 | "colab": {}, 235 | "colab_type": "code", 236 | "id": "FN4JMcM2jlIE" 237 | }, 238 | "outputs": [], 239 | "source": [ 240 | "nums = np.arange(-10, 10, step=1)\n", 241 | "\n", 242 | "fig, ax = plt.subplots(figsize=(12,8))\n", 243 | "ax.plot(nums, sigmoid(nums), 'r')\n", 244 | "ax.set_xlabel('z', fontsize=18)\n", 245 | "ax.set_ylabel('g(z)', fontsize=18)\n", 246 | "ax.set_title('sigmoid function', fontsize=18)\n", 247 | "plt.show()" 248 | ] 249 | }, 250 | { 251 | "cell_type": "markdown", 252 | "metadata": { 253 | "colab_type": "text", 254 | "id": "apEHwiQmvoH_" 255 | }, 256 | "source": [ 257 | "## Cost Function 代价函数" 258 | ] 259 | }, 260 | { 261 | "cell_type": "markdown", 262 | "metadata": { 263 | "colab_type": "text", 264 | "id": "Zw3ZmdqVjlII" 265 | }, 266 | "source": [ 267 | "棒极了!现在,我们需要编写代价函数来评估结果。\n", 268 | "代价函数:\n", 269 | "$J\\left( \\theta \\right)=\\frac{1}{m}\\sum\\limits_{i=1}^{m}{[-{{y}^{(i)}}\\log \\left( {{h}_{\\theta }}\\left( {{x}^{(i)}} \\right) \\right)-\\left( 1-{{y}^{(i)}} \\right)\\log \\left( 1-{{h}_{\\theta }}\\left( 
{{x}^{(i)}} \right) \right)]}$" 270 | ] 271 | }, 272 | { 273 | "cell_type": "code", 274 | "execution_count": 0, 275 | "metadata": { 276 | "colab": {}, 277 | "colab_type": "code", 278 | "id": "R0Y4hVpkjlIJ" 279 | }, 280 | "outputs": [], 281 | "source": [ 282 | "def cost(theta, X, y):\n", 283 | " return np.mean(-y * np.log(sigmoid(X @ theta)) - (1 - y) * np.log(1 - sigmoid(X @ theta)))" 284 | ] 285 | }, 286 | { 287 | "cell_type": "markdown", 288 | "metadata": { 289 | "colab_type": "text", 290 | "id": "XEBSMFxujlIS" 291 | }, 292 | "source": [ 293 | "让我们检查一下矩阵的维度,确保一切正常。" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": 0, 299 | "metadata": { 300 | "colab": {}, 301 | "colab_type": "code", 302 | "id": "6Ax8oM2AjlIT" 303 | }, 304 | "outputs": [], 305 | "source": [ 306 | "theta = np.zeros(3)\n", 307 | "theta" 308 | ] 309 | }, 310 | { 311 | "cell_type": "code", 312 | "execution_count": 0, 313 | "metadata": { 314 | "colab": {}, 315 | "colab_type": "code", 316 | "id": "8OY_-JQ8jlIX" 317 | }, 318 | "outputs": [], 319 | "source": [ 320 | "X.shape, theta.shape, y.shape" 321 | ] 322 | }, 323 | { 324 | "cell_type": "markdown", 325 | "metadata": { 326 | "colab_type": "text", 327 | "id": "ZZxmEmkwjlIa" 328 | }, 329 | "source": [ 330 | "让我们计算初始化参数的代价函数(theta为0)。" 331 | ] 332 | }, 333 | { 334 | "cell_type": "code", 335 | "execution_count": 0, 336 | "metadata": { 337 | "colab": {}, 338 | "colab_type": "code", 339 | "id": "fXdMxsdrjlIb" 340 | }, 341 | "outputs": [], 342 | "source": [ 343 | "cost(theta, X, y)" 344 | ] 345 | }, 346 | { 347 | "cell_type": "markdown", 348 | "metadata": { 349 | "colab_type": "text", 350 | "id": "SSGkYiVXjlIf" 351 | }, 352 | "source": [ 353 | "看起来不错。接下来,我们需要一个函数,根据训练数据、标签和参数theta计算梯度。" 354 | ] 355 | }, 356 | { 357 | "cell_type": "markdown", 358 | "metadata": { 359 | "colab_type": "text", 360 | "id": "tI-AX8knjlIi" 361 | }, 362 | "source": [ 363 | "## Gradient descent 梯度下降\n", 364 
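上面的代价函数有一个方便的自检:当 theta 全为 0 时,h 恒等于 0.5,无论数据如何,代价都应等于 $\ln 2 \approx 0.6931$(与 notebook 中 `cost(theta, X, y)` 的输出一致)。自包含示意(其中的数据为随机构造的假设数据):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def cost(theta, X, y):
    # 逻辑回归交叉熵代价
    h = sigmoid(X @ theta)
    return np.mean(-y * np.log(h) - (1 - y) * np.log(1 - h))

# theta = 0 时 h 恒为 0.5,代价应等于 ln(2),与数据无关
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])
y = rng.integers(0, 2, size=20).astype(float)
print(cost(np.zeros(3), X, y))  # ≈ 0.6931
```

这一不依赖数据的常数是检查实现是否正确的快捷手段。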
| "* 转化为向量化计算: $\\frac{1}{m} X^T( Sigmoid(X\\theta) - y )$\n", 366 | "$$\\frac{\\partial J\\left( \\theta \\right)}{\\partial {{\\theta }_{j}}}=\\frac{1}{m}\\sum\\limits_{i=1}^{m}{\\left({h_{\\theta }}\\left( {{x}^{(i)}} \\right)-{{y}^{(i)}}\\right)x_{_{j}}^{(i)}}$$" 367 | ] 368 | }, 369 | { 370 | "cell_type": "code", 371 | "execution_count": 0, 372 | "metadata": { 373 | "colab": {}, 374 | "colab_type": "code", 375 | "id": "D1SA0lHXjlIj" 376 | }, 377 | "outputs": [], 378 | "source": [ 379 | "# Gradient的实现,利用的是循环的方法\n", 380 | "def gradient(theta, X, y):\n", 381 | " theta = np.matrix(theta)\n", 382 | " X = np.matrix(X)\n", 383 | " y = np.matrix(y).T\n", 384 | " print(\"X.shape = \", X.shape)\n", 385 | " print(\"y.shape = \", y.shape)\n", 386 | " print(\"theta.shape = \", theta.shape)\n", 387 | " \n", 388 | " parameters = int(theta.ravel().shape[1])\n", 389 | " grad = np.zeros(parameters)\n", 390 | " \n", 391 | " error = sigmoid(X @ theta.T) - y\n", 392 | " print(\"error.shape = \", error.shape)\n", 393 | " \n", 394 | " for i in range(parameters):\n", 395 | " term = X[:,i].T @ error\n", 396 | " grad[i] = term / len(X)\n", 397 | " \n", 398 | " return grad" 399 | ] 400 | }, 401 | { 402 | "cell_type": "code", 403 | "execution_count": 0, 404 | "metadata": { 405 | "colab": {}, 406 | "colab_type": "code", 407 | "id": "brjL4gt34jMb" 408 | }, 409 | "outputs": [], 410 | "source": [ 411 | "# 另外一种实现的方法\n", 412 | "def gradient(theta, X, y):\n", 413 | " return (1 / len(X)) * X.T @ (sigmoid(X @ theta) - y)" 414 | ] 415 | }, 416 | { 417 | "cell_type": "markdown", 418 | "metadata": { 419 | "colab_type": "text", 420 | "id": "6TEJDxaqjlIr" 421 | }, 422 | "source": [ 423 | "注意,我们实际上没有在这个函数中执行梯度下降,我们仅仅在计算一个梯度步长。在练习中,一个称为“fminunc”的Octave函数是用来优化函数来计算成本和梯度参数。由于我们使用Python,我们可以用SciPy的“optimize”命名空间来做同样的事情。" 424 | ] 425 | }, 426 | { 427 | "cell_type": "markdown", 428 | "metadata": { 429 | "colab_type": "text", 430 | "id": "FoLQqZg7jlIt" 431 | }, 432 | "source": [ 433 | 
"我们看看用我们的数据和初始参数为0的梯度下降法的结果。" 434 | ] 435 | }, 436 | { 437 | "cell_type": "code", 438 | "execution_count": 0, 439 | "metadata": { 440 | "colab": {}, 441 | "colab_type": "code", 442 | "id": "twQhHqydjlIw" 443 | }, 444 | "outputs": [], 445 | "source": [ 446 | "gradient(theta, X, y)" 447 | ] 448 | }, 449 | { 450 | "cell_type": "markdown", 451 | "metadata": { 452 | "colab_type": "text", 453 | "id": "M5cP5VjE-YdI" 454 | }, 455 | "source": [ 456 | "## 拟合参数" 457 | ] 458 | }, 459 | { 460 | "cell_type": "markdown", 461 | "metadata": { 462 | "colab_type": "text", 463 | "id": "3ye-zhoBjlI1" 464 | }, 465 | "source": [ 466 | "现在可以用SciPy's truncated newton(TNC)实现寻找最优参数。" 467 | ] 468 | }, 469 | { 470 | "cell_type": "code", 471 | "execution_count": 0, 472 | "metadata": { 473 | "colab": {}, 474 | "colab_type": "code", 475 | "id": "0UHRpG7ijlI2" 476 | }, 477 | "outputs": [], 478 | "source": [ 479 | "import scipy.optimize as opt\n", 480 | "res = opt.minimize(fun=cost, x0=theta, args=(X, y), method='Newton-CG', jac=gradient)\n", 481 | "print(res)" 482 | ] 483 | }, 484 | { 485 | "cell_type": "markdown", 486 | "metadata": { 487 | "colab_type": "text", 488 | "id": "o-z0iOvZjlI5" 489 | }, 490 | "source": [ 491 | "让我们看看在这个结论下代价函数计算结果是什么个样子~" 492 | ] 493 | }, 494 | { 495 | "cell_type": "code", 496 | "execution_count": 0, 497 | "metadata": { 498 | "colab": {}, 499 | "colab_type": "code", 500 | "id": "d7Sc0NlzjlI6" 501 | }, 502 | "outputs": [], 503 | "source": [ 504 | "cost(res.x, X, y)" 505 | ] 506 | }, 507 | { 508 | "cell_type": "markdown", 509 | "metadata": { 510 | "colab_type": "text", 511 | "id": "jqbJJue1-lK6" 512 | }, 513 | "source": [ 514 | "## 用训练集预测和验证" 515 | ] 516 | }, 517 | { 518 | "cell_type": "markdown", 519 | "metadata": { 520 | "colab_type": "text", 521 | "id": "F5xbkpoXjlJD" 522 | }, 523 | "source": [ 524 | "接下来,我们需要编写一个函数,用我们所学的参数theta来为数据集X输出预测。然后,我们可以使用这个函数来给我们的分类器的训练精度打分。\n", 525 | "逻辑回归模型的假设函数: \n", 526 | "\t\\\\[{{h}_{\\theta }}\\left( x 
\\right)=\\frac{1}{1+{{e}^{-{{\\theta }^{T}}X}}}\\\\] \n", 527 | "当${{h}_{\\theta }}$大于等于0.5时,预测 y=1\n", 528 | "\n", 529 | "当${{h}_{\\theta }}$小于0.5时,预测 y=0 。" 530 | ] 531 | }, 532 | { 533 | "cell_type": "code", 534 | "execution_count": 0, 535 | "metadata": { 536 | "colab": {}, 537 | "colab_type": "code", 538 | "id": "L5kMfIWijlJE" 539 | }, 540 | "outputs": [], 541 | "source": [ 542 | "def predict(theta, X):\n", 543 | " probability = sigmoid(X @ theta)\n", 544 | " return [1 if x >= 0.5 else 0 for x in probability]" 545 | ] 546 | }, 547 | { 548 | "cell_type": "code", 549 | "execution_count": 0, 550 | "metadata": { 551 | "colab": {}, 552 | "colab_type": "code", 553 | "id": "5yDGG6aXjlJI" 554 | }, 555 | "outputs": [], 556 | "source": [ 557 | "theta_min = res.x\n", 558 | "predictions = predict(theta_min, X)\n", 559 | "correct = [1 if ((a == 1 and b == 1) or (a == 0 and b == 0)) else 0 for (a, b) in zip(predictions, y)]\n", 560 | "accuracy = sum(map(int, correct)) / len(correct) * 100  # 原代码误用了取模运算 %\n", 561 | "print('accuracy = {0}%'.format(accuracy))\n", 562 | "print(classification_report(y, predictions))" 563 | ] 564 | }, 565 | { 566 | "cell_type": "markdown", 567 | "metadata": { 568 | "colab_type": "text", 569 | "id": "luDLSnm3jlJL" 570 | }, 571 | "source": [ 572 | "我们的逻辑回归分类器在预测学生是否被录取时达到了89%的准确率,不坏!记住,这是训练集上的准确率。我们没有保留测试集,也没有用交叉验证来估计真实的泛化性能,所以这个数字有可能高于其真实值(这个话题将在以后说明)。" 573 | ] 574 | }, 575 | { 576 | "cell_type": "markdown", 577 | "metadata": { 578 | "colab_type": "text", 579 | "id": "EO1hfP0j-0cc" 580 | }, 581 | "source": [] 582 | }, 583 | { 584 | "cell_type": "markdown", 585 | "metadata": { 586 | "colab_type": "text", 587 | "id": "xIvApW8HjBZg" 588 | }, 589 | "source": [ 590 | "## 寻找决策边界\n", 591 | "决策边界是满足下式的一系列$x$:$$\\frac{1}{1+e^{-\\theta^T x}}=0.5$$\n", 592 | "也就是说$$e^{-\\theta^T x} = 1$$\n", 593 | "意味着 $$\\theta^T x = 0$$\n", 594 | "\n", 595 | "这是一条直线。\n", 596 | "\n", 597 | 
"参考这里:http://stats.stackexchange.com/questions/93569/why-is-logistic-regression-a-linear-classifier\n" 598 | ] 599 | }, 600 | { 601 | "cell_type": "code", 602 | "execution_count": 0, 603 | "metadata": { 604 | "colab": {}, 605 | "colab_type": "code", 606 | "id": "oErM-l5djBZh" 607 | }, 608 | "outputs": [], 609 | "source": [ 610 | "print(res.x) # this is final theta" 611 | ] 612 | }, 613 | { 614 | "cell_type": "code", 615 | "execution_count": 0, 616 | "metadata": { 617 | "colab": {}, 618 | "colab_type": "code", 619 | "id": "Bmtqg98tjBZk" 620 | }, 621 | "outputs": [], 622 | "source": [ 623 | "# x0 = 1\n", 624 | "# 定义x1的range后,根据 θ0 + x1*θ1 + x2*θ2 = 0,可得x2\n", 625 | "x1 = np.arange(130, step=0.1)\n", 626 | "\n", 627 | "coef = -(res.x / res.x[2]) # find the equation\n", 628 | "x2 = coef[0] + coef[1]*x" 629 | ] 630 | }, 631 | { 632 | "cell_type": "code", 633 | "execution_count": 0, 634 | "metadata": { 635 | "colab": {}, 636 | "colab_type": "code", 637 | "id": "zW-rRiS-jBZu" 638 | }, 639 | "outputs": [], 640 | "source": [ 641 | "import seaborn as sns\n", 642 | "sns.set(context=\"notebook\", style=\"ticks\", font_scale=1.5)\n", 643 | "\n", 644 | "sns.lmplot('exam1', 'exam2', hue='admitted', data=data, \n", 645 | " size=6, \n", 646 | " fit_reg=False, \n", 647 | " scatter_kws={\"s\": 25}\n", 648 | " )\n", 649 | "\n", 650 | "plt.plot(x1, x2, 'grey')\n", 651 | "plt.xlim(0, 130)\n", 652 | "plt.ylim(0, 130)\n", 653 | "plt.title('Decision Boundary')\n", 654 | "plt.show()" 655 | ] 656 | }, 657 | { 658 | "cell_type": "markdown", 659 | "metadata": { 660 | "colab_type": "text", 661 | "id": "qgr6k1ybjlJM" 662 | }, 663 | "source": [ 664 | "## 正则化逻辑回归" 665 | ] 666 | }, 667 | { 668 | "cell_type": "markdown", 669 | "metadata": { 670 | "colab_type": "text", 671 | "id": "raPYPfoXjlJN" 672 | }, 673 | "source": [ 674 | 
"在训练的第二部分,我们将要通过加入正则项提升逻辑回归算法。如果你对正则化有点眼生,或者喜欢这一节的方程的背景,请参考在\"exercises\"文件夹中的\"ex2.pdf\"。简而言之,正则化是成本函数中的一个术语,它使算法更倾向于“更简单”的模型(在这种情况下,模型将更小的系数)。这个理论助于减少过拟合,提高模型的泛化能力。这样,我们开始吧。" 675 | ] 676 | }, 677 | { 678 | "cell_type": "markdown", 679 | "metadata": { 680 | "colab_type": "text", 681 | "id": "8HRDKZsYjlJO" 682 | }, 683 | "source": [ 684 | "设想你是工厂的生产主管,你有一些芯片在两次测试中的测试结果。对于这两次测试,你想决定是否芯片要被接受或抛弃。为了帮助你做出艰难的决定,你拥有过去芯片的测试数据集,从其中你可以构建一个逻辑回归模型。" 685 | ] 686 | }, 687 | { 688 | "cell_type": "markdown", 689 | "metadata": { 690 | "colab_type": "text", 691 | "id": "JuiBlOvjjlJR" 692 | }, 693 | "source": [ 694 | "和第一部分很像,从数据可视化开始吧!" 695 | ] 696 | }, 697 | { 698 | "cell_type": "code", 699 | "execution_count": 0, 700 | "metadata": { 701 | "colab": {}, 702 | "colab_type": "code", 703 | "id": "PPewmmIgjlJS" 704 | }, 705 | "outputs": [], 706 | "source": [ 707 | "data2 = pd.read_csv('ex2data2.txt', names=['test1', 'test2', 'accepted'])\n", 708 | "data2.head()" 709 | ] 710 | }, 711 | { 712 | "cell_type": "code", 713 | "execution_count": 0, 714 | "metadata": { 715 | "colab": {}, 716 | "colab_type": "code", 717 | "id": "PH37JgMmjlJV" 718 | }, 719 | "outputs": [], 720 | "source": [ 721 | "positive = data2[data2['accepted'].isin([1])]\n", 722 | "negative = data2[data2['accepted'].isin([0])]\n", 723 | "\n", 724 | "fig, ax = plt.subplots(figsize=(12,8))\n", 725 | "ax.scatter(positive['test1'], positive['test2'], s=50, c='b', marker='o', label='Accepted')\n", 726 | "ax.scatter(negative['test1'], negative['test2'], s=50, c='r', marker='x', label='Rejected')\n", 727 | "ax.legend()\n", 728 | "ax.set_xlabel('Test 1 Score')\n", 729 | "ax.set_ylabel('Test 2 Score')\n", 730 | "plt.show()" 731 | ] 732 | }, 733 | { 734 | "cell_type": "markdown", 735 | "metadata": { 736 | "colab_type": "text", 737 | "id": "nKZFQDlUjlJY" 738 | }, 739 | "source": [ 740 | "哇,这个数据看起来可比前一次的复杂得多。特别地,你会注意到其中没有线性决策界限,来良好的分开两类数据。一个方法是用像逻辑回归这样的线性技术来构造从原始特征的多项式中得到的特征。让我们通过创建一组多项式特征入手吧。" 741 | ] 742 | }, 743 | { 744 | "cell_type": 
"code", 745 | "execution_count": 0, 746 | "metadata": { 747 | "colab": {}, 748 | "colab_type": "code", 749 | "id": "0WuhZaH9PyJv" 750 | }, 751 | "outputs": [], 752 | "source": [ 753 | "def feature_mapping(x, y, power, as_ndarray=False):\n", 754 | "# \"\"\"return mapped features as ndarray or dataframe\"\"\"\n", 755 | " # data = {}\n", 756 | " # # inclusive\n", 757 | " # for i in np.arange(power + 1):\n", 758 | " # for p in np.arange(i + 1):\n", 759 | " # data[\"f{}{}\".format(i - p, p)] = np.power(x, i - p) * np.power(y, p)\n", 760 | "\n", 761 | " data = {\"f{}{}\".format(i - p, p): np.power(x, i - p) * np.power(y, p)\n", 762 | " for i in np.arange(power + 1)\n", 763 | " for p in np.arange(i + 1)\n", 764 | " }\n", 765 | "\n", 766 | " if as_ndarray:\n", 767 | " return pd.DataFrame(data).as_matrix()\n", 768 | " else:\n", 769 | " return pd.DataFrame(data)" 770 | ] 771 | }, 772 | { 773 | "cell_type": "code", 774 | "execution_count": 0, 775 | "metadata": { 776 | "colab": {}, 777 | "colab_type": "code", 778 | "id": "sISiRom-jlJZ" 779 | }, 780 | "outputs": [], 781 | "source": [ 782 | "degree = 5\n", 783 | "x1 = np.array(df.test1)\n", 784 | "x2 = np.array(df.test2)\n", 785 | "\n", 786 | "data = feature_mapping(x1, x2, power=6)\n", 787 | "print(data.shape)\n", 788 | "data.head()" 789 | ] 790 | }, 791 | { 792 | "cell_type": "markdown", 793 | "metadata": { 794 | "colab_type": "text", 795 | "id": "UaNs-uSEjlJe" 796 | }, 797 | "source": [ 798 | "现在,我们需要修改第1部分的成本和梯度函数,包括正则化项。首先是成本函数:" 799 | ] 800 | }, 801 | { 802 | "cell_type": "markdown", 803 | "metadata": { 804 | "colab_type": "text", 805 | "id": "YgE_bAV2jlJe" 806 | }, 807 | "source": [ 808 | "## Regularized cost(正则化代价函数)\n", 809 | "$$J\\left( \\theta \\right)=\\frac{1}{m}\\sum\\limits_{i=1}^{m}{[-{{y}^{(i)}}\\log \\left( {{h}_{\\theta }}\\left( {{x}^{(i)}} \\right) \\right)-\\left( 1-{{y}^{(i)}} \\right)\\log \\left( 1-{{h}_{\\theta }}\\left( {{x}^{(i)}} \\right) \\right)]}+\\frac{\\lambda 
}{2m}\sum\limits_{j=1}^{n}{\theta _{j}^{2}}$$" 810 | ] 811 | }, 812 | { 813 | "cell_type": "code", 814 | "execution_count": 0, 815 | "metadata": { 816 | "colab": {}, 817 | "colab_type": "code", 818 | "id": "qBnOVnA9RI1v" 819 | }, 820 | "outputs": [], 821 | "source": [ 822 | "theta = np.zeros(data.shape[1])\n", 823 | "X = feature_mapping(x1, x2, power=6, as_ndarray=True)\n", 824 | "print(X.shape)\n", 825 | "\n", 826 | "y = get_y(data2)\n", 827 | "print(y.shape)" 828 | ] 829 | }, 830 | { 831 | "cell_type": "code", 832 | "execution_count": 0, 833 | "metadata": { 834 | "colab": {}, 835 | "colab_type": "code", 836 | "id": "Kkw1EX4DjlJg" 837 | }, 838 | "outputs": [], 839 | "source": [ 840 | "def regularized_cost(theta, X, y, l=1):\n", 841 | "# '''you don't penalize theta_0'''\n", 842 | "    theta_j1_to_n = theta[1:]\n", 843 | "    regularized_term = (l / (2 * len(X))) * np.power(theta_j1_to_n, 2).sum()\n", 844 | "\n", 845 | "    return cost(theta, X, y) + regularized_term\n", 846 | "#正则化代价函数" 847 | ] 848 | }, 849 | { 850 | "cell_type": "code", 851 | "execution_count": 0, 852 | "metadata": { 853 | "colab": {}, 854 | "colab_type": "code", 855 | "id": "207-7syrRNE_" 856 | }, 857 | "outputs": [], 858 | "source": [ 859 | "regularized_cost(theta, X, y, l=1)" 860 | ] 861 | }, 862 | { 863 | "cell_type": "markdown", 864 | "metadata": { 865 | "colab_type": "text", 866 | "id": "5nkQHdeGRtF2" 867 | }, 868 | "source": [ 869 | "## Regularized gradient(正则化梯度) \n", 870 | "\n", 871 | "$$\frac{\partial J\left( \theta \right)}{\partial {{\theta }_{j}}}=\left( \frac{1}{m}\sum\limits_{i=1}^{m}{\left( {{h}_{\theta }}\left( {{x}^{\left( i \right)}} \right)-{{y}^{\left( i \right)}} \right)x_{j}^{\left( i \right)}} \right)+\frac{\lambda }{m}{{\theta }_{j}}\text{ }\text{ for j}\ge \text{1}$$" 872 | ] 873 | }, 874 | { 875 | "cell_type": "markdown", 876 | "metadata": { 877 | "colab_type": "text", 878 | "id": "1BnWF1tEjlJj" 879 | }, 880 | "source": [ 881 | "请注意等式中的\"reg\" 
项。还注意到式中新增的一个正则化参数。这是一个超参数,用来控制正则化项的强度。现在我们需要添加正则化梯度函数:" 882 | ] 883 | }, 884 | { 885 | "cell_type": "markdown", 886 | "metadata": { 887 | "colab_type": "text", 888 | "id": "ycKcQ0ndjlJk" 889 | }, 890 | "source": [ 891 | "如果我们要使用梯度下降法令这个代价函数最小化,因为我们未对${{\theta }_{0}}$ 进行正则化,所以梯度下降算法将分两种情形:\n", 892 | "\begin{align}\n", 893 | " & Repeat\text{ }until\text{ }convergence\text{ }\!\!\{\!\!\text{ } \\ \n", 894 | " & \text{  }{{\theta }_{0}}:={{\theta }_{0}}-a\frac{1}{m}\sum\limits_{i=1}^{m}{[{{h}_{\theta }}\left( {{x}^{(i)}} \right)-{{y}^{(i)}}]x_{_{0}}^{(i)}} \\ \n", 895 | " & \text{  }{{\theta }_{j}}:={{\theta }_{j}}-a\frac{1}{m}\sum\limits_{i=1}^{m}{[{{h}_{\theta }}\left( {{x}^{(i)}} \right)-{{y}^{(i)}}]x_{j}^{(i)}}+\frac{\lambda }{m}{{\theta }_{j}} \\ \n", 896 | " & \text{ }\!\!\}\!\!\text{ } \\ \n", 897 | " & Repeat \\ \n", 898 | "\end{align}\n", 899 | "\n", 900 | "对上面的算法中 j=1,2,...,n 时的更新式子进行调整可得: \n", 901 | "${{\theta }_{j}}:={{\theta }_{j}}(1-a\frac{\lambda }{m})-a\frac{1}{m}\sum\limits_{i=1}^{m}{({{h}_{\theta }}\left( {{x}^{(i)}} \right)-{{y}^{(i)}})x_{j}^{(i)}}$\n" 902 | ] 903 | }, 904 | { 905 | "cell_type": "code", 906 | "execution_count": 0, 907 | "metadata": { 908 | "colab": {}, 909 | "colab_type": "code", 910 | "id": "cwxZ64B4jlJm" 911 | }, 912 | "outputs": [], 913 | "source": [ 914 | "def regularized_gradient(theta, X, y, l=1):\n", 915 | "# '''still, leave theta_0 alone'''\n", 916 | "    theta_j1_to_n = theta[1:]\n", 917 | "    regularized_theta = (l / len(X)) * theta_j1_to_n\n", 918 | "\n", 919 | "    # by doing this, no offset is on theta_0\n", 920 | "    regularized_term = np.concatenate([np.array([0]), regularized_theta])\n", 921 | "\n", 922 | "    return gradient(theta, X, y) + regularized_term" 923 | ] 924 | }, 925 | { 926 | "cell_type": "markdown", 927 | "metadata": { 928 | "colab_type": "text", 929 | "id": "m3mVnObqjlJo" 930 | }, 931 | "source": [ 932 | "就像在第一部分中做的一样,初始化变量。" 933 | ] 934 | }, 935 | { 936 | 
"cell_type": "code", 937 | "execution_count": 0, 938 | "metadata": { 939 | "colab": {}, 940 | "colab_type": "code", 941 | "id": "SQJGX8C5jlJq" 942 | }, 943 | "outputs": [], 944 | "source": [ 945 | "regularized_gradient(theta, X, y)" 946 | ] 947 | }, 948 | { 949 | "cell_type": "markdown", 950 | "metadata": { 951 | "colab_type": "text", 952 | "id": "3HDWjlJrjlJs" 953 | }, 954 | "source": [ 955 | "让我们把正则化参数初始化为一个合理的值。如果有必要的话(即如果惩罚太强或不够强),我们可以之后再调整它。" 956 | ] 957 | }, 958 | { 959 | "cell_type": "code", 960 | "execution_count": 0, 961 | "metadata": { 962 | "colab": {}, 963 | "colab_type": "code", 964 | "id": "Ra_Rd5sFjlJs" 965 | }, 966 | "outputs": [], 967 | "source": [ 968 | "l = 1  # 正则化参数 lambda,注意它不是学习率" 969 | ] 970 | }, 971 | { 972 | "cell_type": "markdown", 973 | "metadata": { 974 | "colab_type": "text", 975 | "id": "c3oj6HgkjlJv" 976 | }, 977 | "source": [ 978 | "现在,让我们用初始化为 0 的 theta 调用新的正则化代价函数,以确保计算正常。" 979 | ] 980 | }, 981 | { 982 | "cell_type": "code", 983 | "execution_count": 0, 984 | "metadata": { 985 | "colab": {}, 986 | "colab_type": "code", 987 | "id": "haAzSuP3jlJw" 988 | }, 989 | "outputs": [], 990 | "source": [ 991 | "regularized_cost(theta, X, y, l)" 992 | ] 993 | }, 994 | { 995 | "cell_type": "markdown", 996 | "metadata": { 997 | "colab_type": "text", 998 | "id": "Co_DCQqPjlJ8" 999 | }, 1000 | "source": [ 1001 | "现在我们可以使用和第一部分相同的优化函数来计算优化后的结果。" 1002 | ] 1003 | }, 1004 | { 1005 | "cell_type": "markdown", 1006 | "metadata": { 1007 | "colab_type": "text", 1008 | "id": "twdWMetWSeAd" 1009 | }, 1010 | "source": [ 1011 | "## 拟合参数" 1012 | ] 1013 | }, 1014 | { 1015 | "cell_type": "code", 1016 | "execution_count": 0, 1017 | "metadata": { 1018 | "colab": {}, 1019 | "colab_type": "code", 1020 | "id": "rb6a5ASMSg09" 1021 | }, 1022 | "outputs": [], 1023 | "source": [ 1024 | "import scipy.optimize as opt" 1025 | ] 1026 | }, 1027 | { 1028 | "cell_type": "code", 1029 | "execution_count": 0, 1030 | "metadata": { 1031 | "colab": {}, 1032 | "colab_type": "code", 1033 | 
"id": "AXbKitbDSiqS" 1034 | }, 1035 | "outputs": [], 1036 | "source": [ 1037 | "print('init cost = {}'.format(regularized_cost(theta, X, y)))\n", 1038 | "\n", 1039 | "res = opt.minimize(fun=regularized_cost, x0=theta, args=(X, y), method='Newton-CG', jac=regularized_gradient)\n", 1040 | "res" 1041 | ] 1042 | }, 1043 | { 1044 | "cell_type": "markdown", 1045 | "metadata": { 1046 | "colab_type": "text", 1047 | "id": "UyNK7mUXjlJ_" 1048 | }, 1049 | "source": [ 1050 | "最后,我们可以使用第1部分中的预测函数来查看我们的方案在训练数据上的准确度。" 1051 | ] 1052 | }, 1053 | { 1054 | "cell_type": "markdown", 1055 | "metadata": { 1056 | "colab_type": "text", 1057 | "id": "8D7uUrYGSrW6" 1058 | }, 1059 | "source": [ 1060 | "## 预测" 1061 | ] 1062 | }, 1063 | { 1064 | "cell_type": "code", 1065 | "execution_count": 0, 1066 | "metadata": { 1067 | "colab": {}, 1068 | "colab_type": "code", 1069 | "id": "koAbf3UhStRU" 1070 | }, 1071 | "outputs": [], 1072 | "source": [ 1073 | "final_theta = res.x\n", 1074 | "y_pred = predict(final_theta, X)\n", 1075 | "\n", 1076 | "print(classification_report(y, y_pred))" 1077 | ] 1078 | }, 1079 | { 1080 | "cell_type": "markdown", 1081 | "metadata": { 1082 | "colab_type": "text", 1083 | "id": "_yLVQLTFjlKI" 1084 | }, 1085 | "source": [ 1086 | "虽然我们自己实现了这些算法,但值得注意的是,我们也可以使用像 scikit-learn 这样的高级 Python 库来解决这个问题。" 1087 | ] 1088 | }, 1089 | { 1090 | "cell_type": "code", 1091 | "execution_count": 0, 1092 | "metadata": { 1093 | "colab": {}, 1094 | "colab_type": "code", 1095 | "id": "cih8ekDVjlKK" 1096 | }, 1097 | "outputs": [], 1098 | "source": [ 1099 | "from sklearn import linear_model  # 调用 sklearn 的线性模型模块\n", 1100 | "model = linear_model.LogisticRegression(penalty='l2', C=1.0)\n", 1101 | "model.fit(X, y.ravel())" 1102 | ] 1103 | }, 1104 | { 1105 | "cell_type": "code", 1106 | "execution_count": 0, 1107 | "metadata": { 1108 | "colab": {}, 1109 | "colab_type": "code", 1110 | "id": "jaT8naLtjlKM" 1111 | }, 1112 | "outputs": [], 1113 | "source": [ 1114 | "model.score(X, y)" 1115 | ] 1116 | }, 1117 | { 1118 | 
"cell_type": "markdown", 1119 | "metadata": { 1120 | "colab_type": "text", 1121 | "id": "A-SkdKuQjlKP" 1122 | }, 1123 | "source": [ 1124 | "这个准确度和我们刚刚实现的相比差了好多,不过请记住,这是在默认参数下计算的结果。我们可能需要做一些参数调整,来获得和之前相同的精确度。" 1125 | ] 1126 | }, 1127 | { 1128 | "cell_type": "markdown", 1129 | "metadata": { 1130 | "colab_type": "text", 1131 | "id": "ioOB1lqPUlp7" 1132 | }, 1133 | "source": [ 1134 | "## 使用不同的 $\lambda$ \n", 1135 | "### 画出决策边界\n", 1136 | "* 我们找到所有满足 $X\times \theta = 0$ 的x\n", 1137 | "* instead of solving polynomial equation, just create a coordinate x,y grid that is dense enough, and find all those $X\times \theta$ that is close enough to 0, then plot them" 1138 | ] 1139 | }, 1140 | { 1141 | "cell_type": "code", 1142 | "execution_count": 0, 1143 | "metadata": { 1144 | "colab": {}, 1145 | "colab_type": "code", 1146 | "id": "wgDzdelZjBa6" 1147 | }, 1148 | "outputs": [], 1149 | "source": [ 1150 | "def draw_boundary(power, l):\n", 1151 | "# \"\"\"\n", 1152 | "# power: polynomial power for mapped feature\n", 1153 | "# l: lambda constant\n", 1154 | "# \"\"\"\n", 1155 | "    density = 1000\n", 1156 | "    threshhold = 2 * 10**-3\n", 1157 | "\n", 1158 | "    final_theta = feature_mapped_logistic_regression(power, l)\n", 1159 | "    x, y = find_decision_boundary(density, power, final_theta, threshhold)\n", 1160 | "\n", 1161 | "    df = pd.read_csv('ex2data2.txt', names=['test1', 'test2', 'accepted'])\n", 1162 | "    sns.lmplot(x='test1', y='test2', hue='accepted', data=df, height=6, fit_reg=False, scatter_kws={\"s\": 100})\n", 1163 | "\n", 1164 | "    plt.scatter(x, y, c='r', s=10)\n", 1165 | "    plt.title('Decision boundary')\n", 1166 | "    plt.show()" 1167 | ] 1168 | }, 1169 | { 1170 | "cell_type": "code", 1171 | "execution_count": 0, 1172 | "metadata": { 1173 | "colab": {}, 1174 | "colab_type": "code", 1175 | "id": "SK7PmPjkjBa-" 1176 | }, 1177 | "outputs": [], 1178 | "source": [ 1179 | "def feature_mapped_logistic_regression(power, l):\n", 1180 | "# \"\"\"for drawing purpose only.. 
not a well generalized logistic regression\n", 1181 | "# power: int\n", 1182 | "# raise x1, x2 to polynomial power\n", 1183 | "# l: int\n", 1184 | "# lambda constant for regularization term\n", 1185 | "# \"\"\"\n", 1186 | "    df = pd.read_csv('ex2data2.txt', names=['test1', 'test2', 'accepted'])\n", 1187 | "    x1 = np.array(df.test1)\n", 1188 | "    x2 = np.array(df.test2)\n", 1189 | "    y = get_y(df)\n", 1190 | "\n", 1191 | "    X = feature_mapping(x1, x2, power, as_ndarray=True)\n", 1192 | "    theta = np.zeros(X.shape[1])\n", 1193 | "\n", 1194 | "    res = opt.minimize(fun=regularized_cost,\n", 1195 | "                       x0=theta,\n", 1196 | "                       args=(X, y, l),\n", 1197 | "                       method='TNC',\n", 1198 | "                       jac=regularized_gradient)\n", 1199 | "    final_theta = res.x\n", 1200 | "\n", 1201 | "    return final_theta" 1202 | ] 1203 | }, 1204 | { 1205 | "cell_type": "code", 1206 | "execution_count": 0, 1207 | "metadata": { 1208 | "colab": {}, 1209 | "colab_type": "code", 1210 | "id": "GqbGlpA7jBbB" 1211 | }, 1212 | "outputs": [], 1213 | "source": [ 1214 | "def find_decision_boundary(density, power, theta, threshhold):\n", 1215 | "    t1 = np.linspace(-1, 1.5, density)\n", 1216 | "    t2 = np.linspace(-1, 1.5, density)\n", 1217 | "\n", 1218 | "    coordinates = [(x, y) for x in t1 for y in t2]\n", 1219 | "    x_cord, y_cord = zip(*coordinates)\n", 1220 | "    mapped_cord = feature_mapping(x_cord, y_cord, power)  # this is a dataframe\n", 1221 | "\n", 1222 | "    inner_product = mapped_cord.values @ theta\n", 1223 | "\n", 1224 | "    decision = mapped_cord[np.abs(inner_product) < threshhold]\n", 1225 | "\n", 1226 | "    return decision.f10, decision.f01\n", 1227 | "#寻找决策边界函数" 1228 | ] 1229 | }, 1230 | { 1231 | "cell_type": "code", 1232 | "execution_count": 0, 1233 | "metadata": { 1234 | "colab": {}, 1235 | "colab_type": "code", 1236 | "id": "JZWGQjWljBbJ", 1237 | "scrolled": true 1238 | }, 1239 | "outputs": [], 1240 | "source": [ 1241 | "draw_boundary(power=6, l=1)  # lambda=1" 1242 | ] 1243 | }, 1244 | { 1245 | "cell_type": "code", 1246 | 
"execution_count": 0, 1247 | "metadata": { 1248 | "colab": {}, 1249 | "colab_type": "code", 1250 | "id": "CLBBoc6ZjBbN" 1251 | }, 1252 | "outputs": [], 1253 | "source": [ 1254 | "draw_boundary(power=6, l=0) # no regularization, over fitting,#lambda=0,没有正则化,过拟合了" 1255 | ] 1256 | }, 1257 | { 1258 | "cell_type": "code", 1259 | "execution_count": 0, 1260 | "metadata": { 1261 | "colab": {}, 1262 | "colab_type": "code", 1263 | "id": "zkOZ5XMZjBbS" 1264 | }, 1265 | "outputs": [], 1266 | "source": [ 1267 | "draw_boundary(power=6, l=100) # underfitting,#lambda=100,欠拟合" 1268 | ] 1269 | } 1270 | ], 1271 | "metadata": { 1272 | "colab": { 1273 | "name": "ML-Exercise2.ipynb", 1274 | "provenance": [], 1275 | "toc_visible": true, 1276 | "version": "0.3.2" 1277 | }, 1278 | "kernelspec": { 1279 | "display_name": "Python (Python 3.6)", 1280 | "language": "python", 1281 | "name": "python36" 1282 | }, 1283 | "language_info": { 1284 | "codemirror_mode": { 1285 | "name": "ipython", 1286 | "version": 3 1287 | }, 1288 | "file_extension": ".py", 1289 | "mimetype": "text/x-python", 1290 | "name": "python", 1291 | "nbconvert_exporter": "python", 1292 | "pygments_lexer": "ipython3", 1293 | "version": "3.6.8" 1294 | } 1295 | }, 1296 | "nbformat": 4, 1297 | "nbformat_minor": 1 1298 | } 1299 | -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/2.logistic_regression/ex2data1.txt: -------------------------------------------------------------------------------- 1 | 34.62365962451697,78.0246928153624,0 2 | 30.28671076822607,43.89499752400101,0 3 | 35.84740876993872,72.90219802708364,0 4 | 60.18259938620976,86.30855209546826,1 5 | 79.0327360507101,75.3443764369103,1 6 | 45.08327747668339,56.3163717815305,0 7 | 61.10666453684766,96.51142588489624,1 8 | 75.02474556738889,46.55401354116538,1 9 | 76.09878670226257,87.42056971926803,1 10 | 84.43281996120035,43.53339331072109,1 11 | 95.86155507093572,38.22527805795094,0 12 | 
75.01365838958247,30.60326323428011,0 13 | 82.30705337399482,76.48196330235604,1 14 | 69.36458875970939,97.71869196188608,1 15 | 39.53833914367223,76.03681085115882,0 16 | 53.9710521485623,89.20735013750205,1 17 | 69.07014406283025,52.74046973016765,1 18 | 67.94685547711617,46.67857410673128,0 19 | 70.66150955499435,92.92713789364831,1 20 | 76.97878372747498,47.57596364975532,1 21 | 67.37202754570876,42.83843832029179,0 22 | 89.67677575072079,65.79936592745237,1 23 | 50.534788289883,48.85581152764205,0 24 | 34.21206097786789,44.20952859866288,0 25 | 77.9240914545704,68.9723599933059,1 26 | 62.27101367004632,69.95445795447587,1 27 | 80.1901807509566,44.82162893218353,1 28 | 93.114388797442,38.80067033713209,0 29 | 61.83020602312595,50.25610789244621,0 30 | 38.78580379679423,64.99568095539578,0 31 | 61.379289447425,72.80788731317097,1 32 | 85.40451939411645,57.05198397627122,1 33 | 52.10797973193984,63.12762376881715,0 34 | 52.04540476831827,69.43286012045222,1 35 | 40.23689373545111,71.16774802184875,0 36 | 54.63510555424817,52.21388588061123,0 37 | 33.91550010906887,98.86943574220611,0 38 | 64.17698887494485,80.90806058670817,1 39 | 74.78925295941542,41.57341522824434,0 40 | 34.1836400264419,75.2377203360134,0 41 | 83.90239366249155,56.30804621605327,1 42 | 51.54772026906181,46.85629026349976,0 43 | 94.44336776917852,65.56892160559052,1 44 | 82.36875375713919,40.61825515970618,0 45 | 51.04775177128865,45.82270145776001,0 46 | 62.22267576120188,52.06099194836679,0 47 | 77.19303492601364,70.45820000180959,1 48 | 97.77159928000232,86.7278223300282,1 49 | 62.07306379667647,96.76882412413983,1 50 | 91.56497449807442,88.69629254546599,1 51 | 79.94481794066932,74.16311935043758,1 52 | 99.2725269292572,60.99903099844988,1 53 | 90.54671411399852,43.39060180650027,1 54 | 34.52451385320009,60.39634245837173,0 55 | 50.2864961189907,49.80453881323059,0 56 | 49.58667721632031,59.80895099453265,0 57 | 97.64563396007767,68.86157272420604,1 58 | 
32.57720016809309,95.59854761387875,0 59 | 74.24869136721598,69.82457122657193,1 60 | 71.79646205863379,78.45356224515052,1 61 | 75.3956114656803,85.75993667331619,1 62 | 35.28611281526193,47.02051394723416,0 63 | 56.25381749711624,39.26147251058019,0 64 | 30.05882244669796,49.59297386723685,0 65 | 44.66826172480893,66.45008614558913,0 66 | 66.56089447242954,41.09209807936973,0 67 | 40.45755098375164,97.53518548909936,1 68 | 49.07256321908844,51.88321182073966,0 69 | 80.27957401466998,92.11606081344084,1 70 | 66.74671856944039,60.99139402740988,1 71 | 32.72283304060323,43.30717306430063,0 72 | 64.0393204150601,78.03168802018232,1 73 | 72.34649422579923,96.22759296761404,1 74 | 60.45788573918959,73.09499809758037,1 75 | 58.84095621726802,75.85844831279042,1 76 | 99.82785779692128,72.36925193383885,1 77 | 47.26426910848174,88.47586499559782,1 78 | 50.45815980285988,75.80985952982456,1 79 | 60.45555629271532,42.50840943572217,0 80 | 82.22666157785568,42.71987853716458,0 81 | 88.9138964166533,69.80378889835472,1 82 | 94.83450672430196,45.69430680250754,1 83 | 67.31925746917527,66.58935317747915,1 84 | 57.23870631569862,59.51428198012956,1 85 | 80.36675600171273,90.96014789746954,1 86 | 68.46852178591112,85.59430710452014,1 87 | 42.0754545384731,78.84478600148043,0 88 | 75.47770200533905,90.42453899753964,1 89 | 78.63542434898018,96.64742716885644,1 90 | 52.34800398794107,60.76950525602592,0 91 | 94.09433112516793,77.15910509073893,1 92 | 90.44855097096364,87.50879176484702,1 93 | 55.48216114069585,35.57070347228866,0 94 | 74.49269241843041,84.84513684930135,1 95 | 89.84580670720979,45.35828361091658,1 96 | 83.48916274498238,48.38028579728175,1 97 | 42.2617008099817,87.10385094025457,1 98 | 99.31500880510394,68.77540947206617,1 99 | 55.34001756003703,64.9319380069486,1 100 | 74.77589300092767,89.52981289513276,1 101 | -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/2.logistic_regression/ex2data2.txt: 
-------------------------------------------------------------------------------- 1 | 0.051267,0.69956,1 2 | -0.092742,0.68494,1 3 | -0.21371,0.69225,1 4 | -0.375,0.50219,1 5 | -0.51325,0.46564,1 6 | -0.52477,0.2098,1 7 | -0.39804,0.034357,1 8 | -0.30588,-0.19225,1 9 | 0.016705,-0.40424,1 10 | 0.13191,-0.51389,1 11 | 0.38537,-0.56506,1 12 | 0.52938,-0.5212,1 13 | 0.63882,-0.24342,1 14 | 0.73675,-0.18494,1 15 | 0.54666,0.48757,1 16 | 0.322,0.5826,1 17 | 0.16647,0.53874,1 18 | -0.046659,0.81652,1 19 | -0.17339,0.69956,1 20 | -0.47869,0.63377,1 21 | -0.60541,0.59722,1 22 | -0.62846,0.33406,1 23 | -0.59389,0.005117,1 24 | -0.42108,-0.27266,1 25 | -0.11578,-0.39693,1 26 | 0.20104,-0.60161,1 27 | 0.46601,-0.53582,1 28 | 0.67339,-0.53582,1 29 | -0.13882,0.54605,1 30 | -0.29435,0.77997,1 31 | -0.26555,0.96272,1 32 | -0.16187,0.8019,1 33 | -0.17339,0.64839,1 34 | -0.28283,0.47295,1 35 | -0.36348,0.31213,1 36 | -0.30012,0.027047,1 37 | -0.23675,-0.21418,1 38 | -0.06394,-0.18494,1 39 | 0.062788,-0.16301,1 40 | 0.22984,-0.41155,1 41 | 0.2932,-0.2288,1 42 | 0.48329,-0.18494,1 43 | 0.64459,-0.14108,1 44 | 0.46025,0.012427,1 45 | 0.6273,0.15863,1 46 | 0.57546,0.26827,1 47 | 0.72523,0.44371,1 48 | 0.22408,0.52412,1 49 | 0.44297,0.67032,1 50 | 0.322,0.69225,1 51 | 0.13767,0.57529,1 52 | -0.0063364,0.39985,1 53 | -0.092742,0.55336,1 54 | -0.20795,0.35599,1 55 | -0.20795,0.17325,1 56 | -0.43836,0.21711,1 57 | -0.21947,-0.016813,1 58 | -0.13882,-0.27266,1 59 | 0.18376,0.93348,0 60 | 0.22408,0.77997,0 61 | 0.29896,0.61915,0 62 | 0.50634,0.75804,0 63 | 0.61578,0.7288,0 64 | 0.60426,0.59722,0 65 | 0.76555,0.50219,0 66 | 0.92684,0.3633,0 67 | 0.82316,0.27558,0 68 | 0.96141,0.085526,0 69 | 0.93836,0.012427,0 70 | 0.86348,-0.082602,0 71 | 0.89804,-0.20687,0 72 | 0.85196,-0.36769,0 73 | 0.82892,-0.5212,0 74 | 0.79435,-0.55775,0 75 | 0.59274,-0.7405,0 76 | 0.51786,-0.5943,0 77 | 0.46601,-0.41886,0 78 | 0.35081,-0.57968,0 79 | 0.28744,-0.76974,0 80 | 0.085829,-0.75512,0 81 | 0.14919,-0.57968,0 
82 | -0.13306,-0.4481,0 83 | -0.40956,-0.41155,0 84 | -0.39228,-0.25804,0 85 | -0.74366,-0.25804,0 86 | -0.69758,0.041667,0 87 | -0.75518,0.2902,0 88 | -0.69758,0.68494,0 89 | -0.4038,0.70687,0 90 | -0.38076,0.91886,0 91 | -0.50749,0.90424,0 92 | -0.54781,0.70687,0 93 | 0.10311,0.77997,0 94 | 0.057028,0.91886,0 95 | -0.10426,0.99196,0 96 | -0.081221,1.1089,0 97 | 0.28744,1.087,0 98 | 0.39689,0.82383,0 99 | 0.63882,0.88962,0 100 | 0.82316,0.66301,0 101 | 0.67339,0.64108,0 102 | 1.0709,0.10015,0 103 | -0.046659,-0.57968,0 104 | -0.23675,-0.63816,0 105 | -0.15035,-0.36769,0 106 | -0.49021,-0.3019,0 107 | -0.46717,-0.13377,0 108 | -0.28859,-0.060673,0 109 | -0.61118,-0.067982,0 110 | -0.66302,-0.21418,0 111 | -0.59965,-0.41886,0 112 | -0.72638,-0.082602,0 113 | -0.83007,0.31213,0 114 | -0.72062,0.53874,0 115 | -0.59389,0.49488,0 116 | -0.48445,0.99927,0 117 | -0.0063364,0.99927,0 118 | 0.63265,-0.030612,0 119 | -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/2.logistic_regression/sample_for_scipy.optimize.minimize.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "colab_type": "text", 7 | "id": "usDG-zMJWI20" 8 | }, 9 | "source": [ 10 | "# 一个完整的scipy.optimize.minimize训练例子\n", 11 | "\n", 12 | "测试用python版本为3.6\n", 13 | "* 机器学习路径:https://github.com/loveunk/machine-learning-deep-learning-notes/\n", 14 | "* 内容正文综合参考网络资源,使用中如果有疑问请联络:www.kaikai.ai" 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": 7, 20 | "metadata": {}, 21 | "outputs": [], 22 | "source": [ 23 | "import numpy as np\n", 24 | "import matplotlib.pyplot as plt\n", 25 | "from scipy import optimize\n", 26 | "from mpl_toolkits.mplot3d import Axes3D" 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": 2, 32 | "metadata": {}, 33 | "outputs": [], 34 | "source": [ 35 | "\"\"\" X是训练集的数据 \"\"\"\n", 
36 | "X_train = np.array([[1., 1.],\n", 37 | " [1., 2.],\n", 38 | " [-1., -1.],\n", 39 | " [-1., -2.]])\n", 40 | "\"\"\" y是训练集的label \"\"\"\n", 41 | "y_train = np.array([1, 1, 0, 0])\n", 42 | "\n", 43 | "\"\"\" 处理训练集X,补上x_0 \"\"\"\n", 44 | "X_train = np.hstack((np.ones((X_train.shape[0], 1)), X_train))" 45 | ] 46 | }, 47 | { 48 | "cell_type": "code", 49 | "execution_count": 3, 50 | "metadata": {}, 51 | "outputs": [], 52 | "source": [ 53 | "\"\"\"Sigmoid 函数公式 \"\"\"\n", 54 | "def sigmoid(z):\n", 55 | " return 1/(1 + np.exp(-z))" 56 | ] 57 | }, 58 | { 59 | "cell_type": "code", 60 | "execution_count": 4, 61 | "metadata": {}, 62 | "outputs": [], 63 | "source": [ 64 | "\"\"\" 目标函数,也就是待最小化的 Cost function \"\"\"\n", 65 | "def cost(theta, X, y):\n", 66 | " first = - y.T @ np.log(sigmoid(X @ theta))\n", 67 | " second = (1 - y.T) @ np.log(1 - sigmoid(X @ theta))\n", 68 | " return ((first - second) / (len(X))).item()\n", 69 | "\n", 70 | "def hypothesis(X, theta):\n", 71 | " return sigmoid(X @ theta)\n", 72 | "\n", 73 | "def cost_wrapper(theta):\n", 74 | " return cost(theta, X_train, y_train)\n", 75 | "\n", 76 | "def hypothesis_wrapper(theta):\n", 77 | " return hypothesis(X_train, theta)\n", 78 | "\n", 79 | "\"\"\" 目标函数的梯度 \"\"\"\n", 80 | "def gradient(theta):\n", 81 | " ret = (1/X_train.shape[0])*((hypothesis_wrapper(theta) - y_train).T @ X_train)\n", 82 | " return ret" 83 | ] 84 | }, 85 | { 86 | "cell_type": "code", 87 | "execution_count": 5, 88 | "metadata": {}, 89 | "outputs": [ 90 | { 91 | "name": "stdout", 92 | "output_type": "stream", 93 | "text": [ 94 | " fun: 7.91774566928816e-06\n", 95 | " jac: array([ 2.15714436e-08, -7.91768325e-06, -7.93516986e-06])\n", 96 | " message: 'Optimization terminated successfully.'\n", 97 | " nfev: 34\n", 98 | " nit: 6\n", 99 | " njev: 34\n", 100 | " status: 0\n", 101 | " success: True\n", 102 | " x: array([2.72451378e-03, 4.94223996e+00, 6.11322367e+00])\n" 103 | ] 104 | } 105 | ], 106 | "source": [ 107 | "theta_train = np.array([1, 
1.,2.])\n", 108 | "\n", 109 | "theta_opt = optimize.minimize(cost_wrapper, theta_train, method='CG', jac=gradient)\n", 110 | "print(theta_opt)" 111 | ] 112 | }, 113 | { 114 | "cell_type": "code", 115 | "execution_count": 6, 116 | "metadata": { 117 | "colab": { 118 | "base_uri": "https://localhost:8080/", 119 | "height": 494 120 | }, 121 | "colab_type": "code", 122 | "executionInfo": { 123 | "elapsed": 2905, 124 | "status": "ok", 125 | "timestamp": 1551364998036, 126 | "user": { 127 | "displayName": "Wen QI", 128 | "photoUrl": "https://lh4.googleusercontent.com/-JqeVAdmmxc8/AAAAAAAAAAI/AAAAAAAAFLU/8N8KzOraE6M/s64/photo.jpg", 129 | "userId": "17732479532480119355" 130 | }, 131 | "user_tz": -480 132 | }, 133 | "id": "r3GaSVl9-TO5", 134 | "outputId": "7eb67a1f-bb1a-4760-ca66-08749ba58cce" 135 | }, 136 | "outputs": [ 137 | { 138 | "data": { 139 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAV0AAADnCAYAAAC9roUQAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzsfXmUXWWV/b73vnmueU4lNaUqSSUhcyAhYVREAVsbUbChHRClIdB223Tb64fatO0IraLSTuCI2q6FQVtQBoEgKUIglZBKJal5rjfP87v3+/1RdT7uq/HVFAm8vVatSurd+d277/n22ed8AmMMeeSRRx55nBuIf+0DyCOPPPJ4OyFPunnkkUce5xB50s0jjzzyOIfIk24eeeSRxzlEnnTzyCOPPM4hNPN8nrc25JFHHnksHMJsH+Qj3TzyyCOPc4g86eaRRx55nEPkSTePReGRRx7Bnj17Vmz7V111FX784x/z///7v/87iouLUV5ejsHBQVgsFsiyvOz7tVgs6O3tXfbt5pEHYT5NN488/ip44okn+L+Hhobw9a9/HQMDAygtLQUARCKRJe9j//79uOmmm/Cxj32M/205tptHHnNhwaSbTqcxPDyMRCKxEseTxwrDYDCguroaWq32r30oOWNgYABFRUWccPPI47wGY2yun2no7e1lbrebKYoy08d5vImhKApzu92st7d3QesNDg6y9773vay4uJgVFhay22+/nT388MPsoosu4svceeedrLq6mlmtVrZlyxb2wgsv8M9efvlltnXrVma1WllpaSm7++67GWOMxeNxduONN7LCwkJmt9vZtm3b2Pj4OGOMsX379rHvf//77KmnnmIGg4EJgsDMZjO7+eabWV9fHwPA0uk0Y4wxr9fLbrnlFlZRUcEcDge79tprGWOM+Xw+dvXVV7Pi4mLmcDjY1VdfzYaGhhhjjP3bv/0bE0WR6fV6Zjab2e23384YYwwA6+rqYowxFggE2Ic//GFWXFzMVq1axf7jP/6DybLMGGP8/D/96U8zh8PBVq9ezf7whz8s+DvJ4y2LWXl1wZpuIpFAUVERBGFWR8TMWAH9LY+FQRAEFBUVLWiUIssy3v3ud6O2
thb9/f0YGRnBDTfcMG257du3o729HT6fDx/60Ifwt3/7t3w/Bw4cwIEDBxAKhdDT04Prr78eAPDjH/8YwWAQQ0ND8Hq9eOihh2A0GrO2e/nll+OJJ55AZWUlIpEIHnnkkWn7/vCHP4xYLIaOjg64XC7cfffdAABFUfD3f//3GBgYwODgIIxGI/7hH/4BAPCf//mf2Lt3Lx588EFEIhE8+OCD07Z7xx13IBgMore3F88//zx+8pOf4OGHH+afv/zyy1i7di08Hg8+85nP4KMf/ShYvoFUHvNgUYm0BRPuwABQXg4MDi5md3ksIxb63R05cgSjo6P46le/CrPZDIPBMGMC7aabbkJRURE0Gg0+/elPI5lM4syZMwAArVaL7u5ueDweWCwW7Nq1i//d6/Wiu7sbkiRh69atsNlsCzq+sbExPPHEE3jooYdQUFAArVaLffv2AQCKiorwvve9DyaTCVarFZ/97Gfx/PPP57RdWZbxq1/9Cv/1X/8Fq9WK1atX49Of/jR++tOf8mVqa2vx8Y9/HJIk4eabb8bY2BicTueCjj+Ptx/OjXvh3nsBn2/idx7nFYaGhlBbWwuNZm75/+tf/zpaWlpgt9vhcDgQDAbh8XgAAD/84Q9x9uxZNDc3Y/v27fj9738PYCJCfcc73oEbbrgBlZWV+MxnPoN0Or3g4yssLERBQcG0z2KxGD7xiU+gtrYWNpsNF198MQKBQE6uB4/Hg1QqhdraWv632tpajIyM8P+Xl5fzf5tMJgD5RFwe82PlSXdgAPjVrwBFAX75yyVHu16vF5s3b8bmzZtRXl6Oqqoq/v9UKrWkbT/22GP46le/uqRtEG666SasWbMGmzZtQlNTE26++WaMjo7Ou97999//pkpS1tTUYHBwEJlMZtZlDh06hC9/+cv49a9/Db/fj0AgALvdzofajY2NePTRR+FyufAv//IveP/7349oNAqtVot7770Xp06dwksvvYTf//73+MlPfrLg4/P5fAgEAtM++/rXv44zZ87g5ZdfRigUwgsvvAAA/LjmivqLi4uh1WoxMDDA/zY4OIiqqqoFHV8eeUzFypPuvfe+oefK8pKj3aKiIrS3t6O9vR233XYb7r77bv5/nU4HYOKhUhRlwdt+73vfi3/+539e0vGp8cADD+D48eM4ffo0Wltbcemll84byb3ZSHfHjh2oqKjAPffcg2g0ikQigb/85S9Zy4TDYWg0GpSUlCCTyeALX/gCQqEQ//xnP/sZ3G43RFGEw+EAAEiShD//+c94/fXXIcsybDYbtFotJEla0PFVVFTgqquuwqc+9Sn4/X6k02lOruFwGEajEQ6HAz6fD5///Oez1i0rK5vVkytJEq6//np89rOfRTgcxsDAAO6//37cdNNNCzq+PPKYipUlXYpyiWjS6WWJdmdCd3c3NmzYgNtuuw1btmzB2NgYbr31Vmzbtg3r16/HF77wBb5sdXU1Pve5z+GCCy7Axo0bcfbsWQDAD37wA9x1110AJiLVAwcO4MILL0RdXR0ee+wxABNa32233Yb169fjPe95D975znfit7/97ZzHJooi/umf/gmFhYX405/+BAAzHtsDDzwAl8uFvXv34vLLL591uXMJSZLwu9/9Dt3d3Vi1ahWqq6vxq1/9KmuZd7zjHbjqqqvQ1NSE2tpaGAwG1NTU8M+ffPJJrF+/HhaLBQcOHMAvf/lLGAwGjI+P4/3vfz9sNhtaWlqwb9++RZHaT3/6U2i1WjQ3N6O0tBT//d//DQC46667EI/HUVxcjF27duGd73xn1noHDhzAb37zGxQUFODOO++ctt1vfetbMJvNqKurw549e/ChD30IH/nIRxZ8fHnkkYW5rA0z+SBOnTqVu2ni5psZ02oZA9740WoZu+WWxZkwpuDee+9lX/3qVxljjHV1dTFBENiRI0f4516vlzHGWDqdZnv27GEdHR2MMcaqqqrYd77zHcYYY9/4xjfYJz7xCcYYY9///vfZ
gQMHGGOM3XjjjeyGG25giqKw48ePs7Vr1zLGGHv00UfZu9/9bibLMhsZGWE2m4099thj047txhtvnPb322+/nX3ta1+b99j8fv+857AULOg7zCOPPBaD5bOM5YypUS5hBaPd+vp6bN++nf//0UcfxZYtW7BlyxZ0dnbi1KlT/LO/+Zu/AQBs3boV/f39M27vuuuugyAI2LhxI0+gvPjii7j++ushiiIqKyt5pjwXMJWdaK5jUyPX5f4aYIxBlmXEYjGEQiHEYjEkEgmk02nIspy3T+WRxwxYuTLg//f/ZvfmyvLE5zN4LpcCs9nM/93V1YVvfOMbOHLkCBwOB2666aYsrVSv1wOYGD7PliSiZYA3CHMpRNLe3o6rr7563mPL9Rz+WiCyzWQyXD9XFAWpVAqMsawElSiKkCSJ/4iiCFEUF247zCOPtwhWJtIdGAB+/evpUS4hnZ6IglWZ4eVGKBSC1WqFzWbD2NgY/vjHPy7Ldvfs2YPf/OY3YIxhbGyMJ23mAmMMDzzwALxeL6644oo5j81qtSIcDq/oOSwWjDFkMhkkk0m88sorSKVSnEAFQYAkSdBoNFkEyxhDOp1GPB5HR0cHnE4ngsEgQqEQT8zlI+M83k5YmUh3riiXQE6GZY52CVu2bMG6deuwYcMG1NXV4aKLLlqW7V5//fV49tlnsWHDBqxduxY7d+6E3W6fcdm7774b9957L+LxOHbv3o1nn30WWq12zmO79dZbcfnll6OmpgZPPfXUipzDQkFkS8QoCEJO7hAiYwJZ+kRx4l2fyWSmuTnykXEeb3UI80QX0z7s7OxES0vL7GsMDADNzUAuw2CDATh9GlAZ0M8HRCIRWCwWuN1u7Ny5Ey+//DJKSkr+2oeVM+b9DidBZEvyi5pEX331Vaxfvx4Gg4EvQ2Q6G06fPo3KyspZq87UEg790P7UkTQRsSRJeTLO482KWW/M5Y90c4lyCSsc7a4UrrrqKoRCIaTTaXz+858/rwg3FyiKwjVbYHrECoBLB/T5ckBNsGrQftQ6siAIE5ngOSLjPCHn8WbE8pLufFruVJC2+/nPn1fR7qFDh/7ah7AiUBSFywjAzGRLINID3ohM54N6nYUgVzI+ceIENm7cyJcVRZFrzHkyzuPNguUl3YVEuYTzNNp9q4AIM51Oc502F2ISBAGZTAaDg4MYnLT/iaIIk8kEi8UCk8nEG+SoSXM5k2VTyTiTyUCSJL4PRVGQTCanrUPRsVqqyJNxHucKy0e6igL87/9O/FtltcppvV//GvjRj4B5NME8lg9k+wqHwzAYDAByI1tggtyi0Sja29tRVVWFbdu28W3GYjFEo1EEg0GMjo4ikUhwMo7H49DpdNDr9VlkvNyYLTKmYyT5ZGqvDrVMQdFxnozzWG4sH+mKIuByAYtpOqPT5Qn3HIEi21QqhXQ6jRMnTmDnzp05EUsqlcLg4CCcTidEUURraysKCgogyzLS6TQkSYLVaoXVas1ajwooenp6EI1G0d3djXg8zslYHR0bjcYVJbnZSFRNxlO9xjNpxnlHRR6LxfLKCxbLsm4uj+XD1OIOxhgkSYKiKPOSRzKZRH9/PzweD1atWoXdu3ejs7NzXrcCgcjYYrGgsLAQhYWFACaG/xQZh0IhjI2N8eIPkifo581Axq+//jqam5v5VEd5e1sei8F5OTGlJElobW1FJpNBS0sLfvzjH/N+pgvFc889h6997Wv4/e9/j8cffxynTp3CPffcM+OygUAAv/jFL/CpT30KADA6Ooo777wTv/nNbxZ9LoT9+/djbGwMer0eqVQKl19+Oe677z7elWs2fPGLX8S//du/zfr5bJV0uQyb4/E4+vr6EAgEsHr1ajQ2NnKizdWrO3Wf6uMQRREWiwWWKS9rNRmHw2GMj4/PSsYGgyFn8l8M1NdJURSemFNr4alUKl+Fl0fOOC9J12g0or29HQBw44034qGHHsI//uM/8s95Y4kFPozXXHMNrrnmmlk/
DwQC+M53vsNJt7KyclkIl/Dzn/8c27ZtQyqVwr/+67/i2muvnXemg9lIN1dHwUyIRCLo6+tDNBrFmjVr0NLSMo0wFpsUY4whHA6jo6MD0WgUBQUFWLduHQKBAE6fPo1MJoOqqio0NTXxEm2DwYDi4mJoNBrE43FEo1FEIhE4nU7E43EAE/dEMpmEy+XikfFyk7H6fOeKjIF84Uces+OckG40GkUgEIDD4cjqj7Ac2Lt3L06cOIH+/n5cddVVuOSSS3D48GH89re/xZkzZ3DvvfcimUyivr4eDz/8MCwWC5588kncddddKC4uxpYtW/i2HnnkERw9ehQPPvggnE4nbrvtNt5v9bvf/S6++c1voqenB5s3b8YVV1yB22+/He9+97tx8uRJJBIJfPKTn8TRo0eh0Whw//3345JLLsEjjzyCxx9/nGua733ve/GVr3xlznPS6XT4yle+goaGBhw/fhybNm3Cddddh6GhISQSCRw4cAC33nor7rnnHsTjcWzevBnr16/Hz3/+86zl7rjjDtx66605X8tQKITe3l6k02msWbNmzrnw1KQ7F2kwxuB0OnmjcZ1Oh5MnT4IxBpPJBJ/Ph7a2NiQSCZjNZuh0OgwMDCCdTsPtdnPiMplM2L59O8bHx+FyuWAwGNDU1ASz2QxZlnnyLhqNwuVyZZHxVJliKWSci6tjpuXmI2Oyt+ULP976WHHSff311/H4449DkiTIsoxrr70WGzZsWJZtZzIZPPHEE7xP6pkzZ/Dwww/jO9/5DjweD+677z48/fTTMJvN+PKXv4z7778fn/nMZ/Dxj38czz77LBoaGvCBD3xgxm3feeed2LdvHx577DHIsoxIJIIvfelLOHnyJI+y1d3Jvv3tb/PzPX36NK688krep7e9vR3Hjh2DXq/H2rVrcccdd2T1m50JkiRh06ZNOH36NDZt2oQf/ehHKCwsRDwex/bt2/G+970PX/rSl/Dggw+ivb2da48/+MEP+HI7d+7E+973PhQVFc25L7/fz18udXV1M059MxXq4oi50N/fj66uLuh0Ok68Go2GzywhiiJGR0dhsVjgcDiQTqeRSCTw2muvoaioCMXFxWCMwev14sUXX0Q6nYbRaEQ4HIbf78fGjRvR0dHBO521trZizZo1iMViSCaTEAQB6XQakUgEbrcbsVgMwAQZq6UKk8k0LxlPTbAtBLl4jcfHxxGNRlFbW5sv/HgLY0VJNxqN4vHHH88qJT148CDWrFmzpIiXojtgItL96Ec/itHRUdTW1vJJD9va2nDq1CneryCVSmH37t04ffo01qxZg8bGRgATzcq/973vTdvHs88+y6eOkSQJdrsdfr9/1mN68cUXcccddwAAmpubUVtby0n3sssu4/0Z1q1bh4GBgXlJF8gezn7zm9/kjdSHhobQ1dXFyTSTyfAH8Vvf+hZvqj51uanb9nq9iEajGBgYQGNj44ImhcxF02WMoa+vD1arFZIkIZlMIhaLIZPJwGq1wuVyIRAIIJVKIR6PQ6vVwufzcWeFx+OBTqdDMBhEOBxGKpVCSUkJbDYbZFmGy+XCM888g4KCAthsNvj9fhw9ehTV1dXo7u6GIAjQaDTYsWMHampq4Ha74XA4YLfbIYoiotEootEoPB4PJ2ODwZAVGavJeCmkO9d1pN9qD/HUwo+p6+QLP85frCjpBgKBaa0TJUlCIBBYEumqNV011NtkjOGKK67Ao48+mrVMe3v7ityYc0V96haRc7WSVIOy5S0tLXjuuefw9NNP4/DhwzCZTNi/fz/i8TgnvUQiAYPBgOeffx7PPPMM/vKXv8BkMuHSSy+dsRWkLMt4+eWXeSKKXmALwUI0Xbre8XgcsVgMxcXFcLvd8Hq90Gq1qK6uRigUwuDgICRJgtFoRGFhIdxuNwYHB6HRaPjQmzqwRSIR3kqSMQadTodkMom+vj4MDQ2hpqYGOp0O8Xgcr7zyCvR6PYLBICfiXbt2QavVIhKJQBAErF27
oRAUCgWUSiUymQxisRisViu6u7tZypE4q1TBk/0OYWtut5sTVkdHB7RaLevlhsNh+P1+5tvevHkTExMTkMlkiMViXPnR9pPJJF+rcDiMQqEAk8mEQqGAdDqNmZkZvPLKK7wU7+3tZS4xsP6SFboIyOVy5onS92kYY3p6GvF4HGazmVW2qCJuaWlBZ2cnSqUSgsEg5ubm0N7eXsEnLhaL0Ov1sFqtCAQCXOl2dHRsiC2+9dZb/CKn0VmdTsfOG2J2PjRkIMSJAVTwWvV6/Z648YrFVmytdvL3m6GMbeYYAawbUv7+7/8+3nrrrYa0lrca+z7pUoUjFrslZE4hZAMMDw9X3ADbtVJv9Lu5XI6rwJGREchkspqXTCqV4iTj9XrR29sLg8GAcDiMtbU1puvQBBmxCiKRCMrlMiwWCxQKBQwGA2KxGJRKJdRqNXQ6HcLhMJLJJFpaWpjDm8vlMDY2xqLldrsdMpkM8Xic/c+oWUgMCNLjFZ6DUqnEv28wGGCxWCqwT5VKhbNnzzJn2G6318VGPR4P0uk08441Gg2i0SiefvpplMtlxlyDwSCWlpaQyWT4hdvb2wur1VoxgEG477PPPoubN28iFArBbrdjcnKSK2IhhY6+k06n+X6lRJ1OpzesXMWGDIQaCF6vF/Pz8xU4MSXi3XAXro69YEfsVKXbiGPEH//xHyORSOCLX/wigPVhmJ/85Cc7cizApyDpbhQ7LWROQQ4NkUikrqB5NWVsK9EIpitkRfT397POL1WeANhfa35+HhqNhqX51tbWYDAYkM1mEQqF2J/UjaVmAAAgAElEQVTLbDbD5/MxtkjJN5VKob29HclkkpNjb28vMpkMMpkMSqUSzGYzsyRSqRQ8Hg8MBgNPIKnVaigUCqTTaR4IuXPnDtLpNFKpFEtEUgOMkgopmQHAgQMHas6DUqlEd3c3isUi/H4/wuEwbDZbzZRh9fURYntCtTKHw4HFxUWu/L1eL6anp9HT0wOdTgelUskVMSW2xx57rAInrq6IiaEhkUjYEZmuMeHOW13+i2kglMvlimkvkmhMp9OYn5/fUZxYGLvNviDnkepo1qpnM8eI8+fPb30ntxD7PuludLF3YsBB+P1CoQCn0wmPx4P+/n6MjY3V3X71d7cSG2G6Qk+0jo6OGhybGkWhUAj3799nER2TyYS2tjZIJBIe1TUYDOjo6EA+n2c4gZb5pJVrt9sRjUbh8Xig1+vR2dmJaDSKpaUlTixmsxmrq6solUpsCkmjw2TXvra2hnA4DIPBgHw+j66uLkxOTmJ5eRmpVAq5XA53797F8vIyZDIZdDodXnzxRQSDQbbHISfgav3UQqGA999/n+EchUKBxx57rGLl0dbWxpbxVGULqWMEnywsLKCzsxO5XA5+vx86nY7lPl966SXo9XrGiQlzlclkFQ07wlzFGnbPPPMM3njjDYYIHn300ZqKuNkgappQorFcLuPy5cv8UqveZ6qKm8GJq7e9W1Gv0o3H400l3U869n3SBZoXvdksqFotlUpwuVxYWVlBV1dXQxq7uwEv0NgueaKJ6YuGQiG43W7Gu6xWKzQaDeLxOHK5HFde2WwWNpsNJpOJG1JGo5EdHkgkPJlMMt+YbHVoeMFut6O9vR0GgwFSqRTZbBadnZ3cwFQoFOjr62P6V3t7O8xmM7xeL5/b+/fvc1WazWbh9/vxxBNPcHNDrVbj+vXr8Hg8nJxOnDhRIazicrl4/0lM/fbt2zhz5gx/Rq1W46mnnsLU1BTS6TQ6OzvR398P4Jcylmq1GsePH4dSqcTFixd5/p+Ozev1wmazVTQQgV9irgsLC7h58ybK5TJsNhtXopSQZTIZlEolfud3fgepVApyuRyxWAxzc3Mol8s82EGYObB9rViqsu12u+g+ixlc0v5+kjixMOphurs9Pr1b8cmf0V0MWs42GzKZDMFgEAsLC2hpaRGlXtWL7WjbViddatTJZDJMTk7WSA4Wi0VeCvt8PhaS
CQQCvIylZo/dbkdvby8ikQhisRhXPeVyGZFIBH19fTCZTCyCY7PZ0N3djUQigUgkgng8jkKhwM4MAOB0OhGNRlEsFmE0GiGTyVj+UK1W83KfKlSpVAqVSoWBgQHWc4jH49BoNKzFu7q6ylSwlZUV2Gw2Tn537tzBuXPn+PgJsqBEVe+663S6CgvtQqGA+fl5xsOFVRMJmROOS/QusVAoFHC5XLhy5QpDBoVCAWNjY0gmk3C73WyLQ0mN7IMkEgkLHdELsR48QQl4K4m4XmKqhxOToagQJ9ZoNBVV8V7gxMLYSMB8P8anIunuRqUbDAaxvLwMpVIpSr1qZJ+aDToesulJpVIYGRkR7dQWCgXcvXsX0WgUPp8P2WwWhUIB2WwWyWSSYQSr1YpCocAPbyaTQTgchlKphEqlgsVi4eQtk8mg1Wq5IWQ0GhGJRODz+aDVatHW1ob29na4XC7Mzs5CrVZzRedyuZgN0NHRAalUCq/Xy7gsaSuQgA4lo46ODoY1xsfHeTlMVLRsNguJRAKlUoliscjHRVUcDXBIpVJ2aq4XhUIBN27cwOrqKvr7+3HixIma6/XII4/g/PnzPBzicDjqzt+XSiV88MEHPFxBGs2JRKKCiF8qlXjUeWlpie+pmZmZCpxYpVLxYIcQoqAXMf1bONhRLxFvhaMrhEkoqNFH/QFybSZsW6/X77rTsJiAOT3vv9bT/RWLjeQd60UsFsPMzAzkcjn6+vpYdm4vg3ix169fx+DgIBwOR83NlUwm4XQ64fV6kUwmudFFNujkDEyNLuKAxmIxRCIRbqzRVFipVILf70cmk4HZbEZPTw8nxcXFRchkMh7xpQGKQqHATTDChsltobe3F3K5nIclaKLN4/GgWCwyDPL444+zA0EkEoHRaMS1a9cwODiIoaEhKJVK+Hw+KJVKyGQyhMNhWCwWuFwudh7W6/Xo7u5m+ca+vr66bg2RSAT/+7//i2QyCY1Ggzt37oi6O3R2duLll19mi3ma1qt3vYS4IyXCXC5X8blYLIYHDx7Abrfj4MGDvMIQJrWVlRVks1moVCpOgKRNTIm4kYYdJeKt8MzFgqiBWq22AicmK59YLIZcLsdOwzuJE1OIwQuke7wf41OddMXkHetFKpXC7Owse4GZTKYKX6i9CMKOl5eXIZVKWdtWGITF3b17l6UKQ6EQVCoVJ0O/3w+r1Yr29nbGcWn5SgaENDUWi8WwvLwMnU7HTSCyS6dqmfindrsdKysrWFlZ4YkzooV5vV5Wg6IELpFIEAqFIJVKodFouHLW6XRob29njYLBwUGsrKygp6eH4ZCFhQXWezhx4gTu3LmDVCoFm82Gvr4+WK1W9uciDqtOp+PkdfnyZTgcDlitVvZPm5ubg8/nY00GYkncunULY2NjNee62sCxXigUCna9UCgUnOgoSZEbRy6Xw8TEREXVJpbUAPALkpb6hAELK2Kxhl11IqYmJJ37nWrYEc2Q3CImJyd5GKUeTkzJeKs4sVjSjUajDWkt/CrGpyLpbqSpu1mlm81mGdcj+tdWvr9ZNLLsEjpH2O12nD59GpcvX655QNbW1vDgwQPGVwcGBqDX6xEOh+FyuTgJEi5osVh4zJeSnV6vh1QqhUKhgFwur2iykZ1OIpGAz+eD0WiEw+FALpdDPp9nJa9cLgeZTIbu7m6Uy2V2jjCZTOjq6mL7GqJidXR0IBqNwu12o7OzE3a7nalksVgMDoeDG0gAuFqjF2ZLSwueeOIJXL9+HW63G7dv34ZEIsHJkyfhcDhgMplgMpnQ2dmJy5cvIxQKoVgswuVyYWhoiOEJevDp/FASoqV7vYREcIZUKoXJZBK9ns888wx+9rOfYW1tDUajEefOnYNOp8PKygpcLtemynLVUc+JgZgTTqeTdXopmdE/BMGsrq5idXW1QhaToImdatgJObrCYRQKYsXE43H4/X4sLCyI4sQbWb7XS7r7kbkAfEqSbr3YiKdLeg0+n6+C5yqM7dC+
6PubmVNGo1E8ePAAGo2GGyrCKJVK8Pl8CIVCWFxchM1mY0qY2+2GXC5HLpdDLBaDTqdDd3c3052WlpagUChYiwAAuru7sby8jHA4DJ1OB5vNBqvVilgsxs4KpVKJR19pjDedTrMGbHt7O1KpFAKBAJ9fSuRky0N8XqvVCrfbDQAMZ2QyGVYnI/NQnU6HRCIBrVaLUCjEVRMxEgKBANxuN//vXC6H27dv48knn+Rz5Xa7sby8zJ+JRCKYnp7GQw89hL6+PmSzWQQCAX7J0fK7paWFoY1qKCmTyeCtt95COBxGuVxGd3c3zp07V9Oc0mg0eP755yuu65UrV2CxWPDQQw/tSJe9nuh6IpFgIR2CfXK5HI+jm0wmHkKhe0qsKgZQUQ03kog3G4wQ6l9QbIQTC18gxCcWS7r71TUC+JQk3Y10aasZBKVSiauP7u5u0SU8xXYGHICNk64Qzjhw4IAoPlUqlTA9PY3V1VXkcjkWSaHl8traGrRaLTQaDXf+yUYnk8kwxmq326FSqViHlBpn5XIZKpWK8blgMAiLxYKuri4eFc7n80x3KhaLUCgUyGazCIfD3Fiz2Ww82ruwsMDKXNQIowpao9Egl8vB5XIhn8+ju7sbBoMBa2trGBsbg8/n48aZzWbD7du3EYlEMD4+XuGlBqwnIIJJ6G/EYiiXyzxGrNPpWBRFLpdDp9Ph5ZdfxpUrV5BIJGCz2TAwMIBIJMJ4qlA68s6dOwgGgwwJLC8vY2pqCuPj46LXPJ/Pc/Pz4MGDNZzinQ6ZTMaVfrFYxMLCAsLhMAYGBniMeXFxkZtRQj4xvWCESZgkMYHGGnbNjABvBKkQVOTz+ZhZk0qluBlLK7X96hoBfEqSbr2oljEk1996ylvVsVM8X2Hk83nMz88jHA5jeHhYVHoyk8nA4/Hgpz/9KUKhEHp6eqBWqxEOh3mwgZJNoVDgBpfT6UQsFmOxb5lMBo1GwzhpOp1GNptFd3c3L30JnzWZTMjn8yzTmMvlWNOgpaUFAwMDWFtbQyqVgkqlgtlsZkYBJdxwOAwAjDd2dHQwHmm32+FwOJDP5+HxeNDd3c3uwXTMExMTCAaDzHool9f92/r7+7l6zWazUCqViMViaG1trbjGRqMR6XQayWQSZrMZhUJBVNTabDaL2qUDv9SoIDyVEjFRr2jlUQ0b0UvQ6XSyklo0GuUpwN2OQCCAubk5dHZ24uTJk6ITeOl0mhupwoadkEtM1WWjDbvtCEpVh5h7RKFQYCre6uoqEokEvvWtb/HAz8WLFzE5OdlwAt7MNSKbzeK3f/u3ce3aNdhsNnz/+99HX1/fjhwfxac66VLQUIHZbMaJEycathbZrleZEJ4Qatv29fVhdHS05sEgPPXevXvIZDIwGo3I5/NYXV2F1WpFPp9ndS+bzQabzYZAIMBJQKFQ8G9Q9RYIBJBMJhlvTSQSSKfTrC2QzWYhk8nQ09ODSCQCr9eLhYUFdv+lDjxpKsRiMajVanR3d/NIcSQSYfUwgjFIl5caQMRYIG5wIBBAS0sLVldXEYvFuDIXWtdQdUX835MnT3L1S8yM2dlZDA4OIhwOY2FhAQMDA/B4PCgUCrDb7Th27Fjd65PJZHDr1i0eHZ6cnGRdCbVaDYfDAbfbjdnZWahUKsazs9ksPvroI9ajUCgU8Pv9MBqNGBwcxBtvvMHH0traihdffHHXhgyy2SwePHgAYN3xoB7TRlhd0ouIXjCEE5PzMx0XVcT04qhOxPl8nitQIcy0Uw07YP0+ksvlFdS7iYkJ/O3f/i18Ph9+9KMf4fXXX8dPf/rTTZ/rRlwjvv3tb8NisWBubg7f+9738Cd/8ic76hoBAJJNSMb7goFcKpVEsVvC1VpaWjA8PFzD9WskPvjgAzzyyCNN7df9+/fR1taGbDaLhYUFtLW1oa+vT7QyCAaDuHHjBnK5HNxuN0MI5MJLTS9gvVombqrH40EqlYJOp4NCoWAcW6VSsdFjsVjkJL24uMid
fqPRCKlUimQyCb1ez3P7xWIRFosFPT09WF5eRjQaZZFyGjwgScZIJMLVb3t7O4B1PzUAvA06hlKphEKhwJAGAO7y22w2xONxKJVKpr2RZc/IyAgmJiYYnrhw4QJyuRzkcjlze7u7uzEyMsIsjUQigaWlJaTTabS1taG3t7fiJVcsFnkloVKpkM1m0draiieffLLic5lMhj9XLpfR29uLs2fPQiaTsXIZ4em5XA73799HIpFgrm2hUMBjjz2GiYmJpu6hekGrgNXV1ZoG8HaDuNSUjJPJJDfsqCom/ZGOjg50dHQwNEH/CPdzOw27crmMq1ev4uTJkxV//5u/+RscPnwYv/mbv9nwb3344YecoAHga1/7GgDgT//0T/kzzzzzDF5//XU8/PDDvFIiJs4Wo+4XPpWVbjKZxOzsLIrFInQ6HQ4cOLAt47xmI5/P4/79+7BarThx4kRNFUI6CjTzT/QmMjrU6/X83zSVQ9Ws0+mERqNhTi6wLhBO3Np4PA6r1Yqenh4W1iZhb3r5GI1GxoojkQjrM6RSKRSLRaytrTHft1gscvWzuLgIl8vFEIZWq4VSqWSsjaohnU7H3Xf6fE9PD0wmE5LJJFwuFw9luN1upNNpdHV1sRmlRCJBe3s7wuEwrly5gkcffZStfAwGA6LRKMtOTkxM8INRLBZx6dIlJJNJyOVyLC8vI51OVwjm0DKbXIKpWqUXEIVarcaLL76IWCzGTSEADFX19PTg8OHDvO3p6WmeYqPKeGpqigVqCJfczpI8Ho9jenoaZrMZJ0+e3PFRWLIUqjaJjMfjCIfDuHPnDvcLYrEYyuVyBXOi0YbdZoMd9Bti/38zmG4jrhHCz8jlcqaO7uRL7VORdOmGF9K/hoeHYbPZcOPGDdExwt0MsldPJpPo7u4WxYSIPTEzMwOFQoFQKIRsNouOjg7I5XL4/X4mgBMsQBNPSqUSiUQCqVQKRqMRGo0GiUQCc3Nz0Gg0MJvNTIHKZrMMCygUCrS3t7MO7urqKlewhM8SLhoOhyGVSmGxWNDf349gMIhIJMLNKxoZHRwc5OZkIpGA0WhEa2srd9Sz2SwvWekBJEgkl8tBo9Ggvb2dm4Vk2U6SncFgECqVimlsdJ3p3DgcjhprI5/Px+cGWH/Qp6enKwSKxKYY69H7pFIpP+DJZLJGp0EYHR0dmJ+fZ7xaIpHg6NGjcDgciMViWF1dRTweBwDu1BuNxoZ0DorFIubn59nMcy+HA2iK0ePxYHh4mGl+yWQSsViM3VKoYSfEiYUNu3o4MSCeiOs16mKx2JaTrtiqfiP1uXqf2W58KpJusVjE7Ows/H4/BgYGKuhf25V3pKZCI8sioUj46OioqHMEuSQEg0G4XC5OgqRe5XQ6USwWmWtJSTQSiSAcDnNjjKoxggWo6UeMhFAohGg0ilAoBK1Wi5aWFp4Oi8ViPE2m0+nQ29uLeDzOtDSFQsF4skwmY8ZCPB6HQqGASqWCw+FAOp1GJBJhVgON4VosFuRyOXi9XshkMlitVjgcDng8Hh78UKlU6OrqwurqKmZmZnjU1mKxcAOKKq1YLMbTXIVCgZMsNYb6+vpqKh+xhCoMmqJbXV1l7L6vr68uBFUsFnHlyhVMT09Dp9Ph8OHDouL2jz/+OGvcAus469DQECQSSQWvlEaCY7EY1tbWEI/HK7QZKGnRNvx+P+bn59HV1YXh4eE9HX9Np9OYnp6GSqXCiRMneJ+ISVNNByP9BrJVEpqJ0rGRV1096yRKxNRzqH4Gd8s1gj7T1dWFQqGAaDTasP9ao/GpSLpSqRQ6nQ6Dg4M1yXGnhMw3qpSLxSKWlpbg8XgwMDCAgwcPQiKRMD4KgBPU7OwsEokETCYT5HJ5RfVGotQEC/h8PoTDYSQSCUgk68aJVA3LZDKuVqm6pOZYqVSCSqXiJKxQKCCVSjlJkWC5RqPhJEbyisSG6OnpgcvlQjQaZUyPKhZyVw6Hw4hGozwAQVXP8vIyVzzE46Tq
mITEOzs7eQDA7/eju7sbWq2WH0KLxYJYLMaDIDKZDEtLSzh69CgeffRRTE1N4cGDByzQPjMzg6GhIfT390OtVvPQBzEx6JpQSCQSnDlzBjMzM9xIq5fMyHF4YWEBKpUKqVQK77//PiQSCUZGRio+q1Kp8OqrryKbzXITSCzI/Vg4VSUcGKEkm8/nKxTb9tLZulwuY2VlBW63GyMjIw0lH6G8ZHXDjnBiYcNOWBELzUSpOJiZmUF7ezuPWlN4vd5dcY146aWX8O///u94+OGH8cMf/hDnzp37daUrFlKpVNTnCGhOf0EYGyVd0qp1Op3o7OyskXykcdpIJILLly8zE4EEs3U6HbxeL5aWlqDRaLhCJGoUTSBRE81ut7O1uUwm42pBIpHAbDbD4/GwFTrpJ9B0lkQi4Wk14JcVcjQaRTQaZdEaurnD4TA7QZRKJZhMJvT398PtdrMeAQli0zXI5/M8Zmyz2dDR0QGPx4NwOMyYqM1m4xcQwQQ0PdXV1cXYbi6XQ0dHB8MLLS0t0Ov1cLlcGBkZQW9vL5aXl3mcl+zTCcM2mUxM9bJY/v/2rjy6rfpKf0+7JdlabMtrFiexYzu7nUA4YdgKYcoSylbo0NIBelrOTGh60nJoSRfSDsl0gVIoBzpLoRzKMAVCy1CgBUJCgCROQkIW74l3SbZ2Wbue3ps/7HuRvAQ7ceyQ6DvHB7JJT7J83/3d+y0WdlpLn80plUqe8w4MDKC9vR16vR5lZWUQBAHRaBQtLS1QqVS8pEyfyzc3N48qusCn8eyTBS2rSDXX09PDijKi5dntdh7LjGWSM1UIhUJoamqakrkxvR/ECCEQP5zUarRzMBqNiMfjiMViWLRoEfLy8rgDjsViePTRR9HT0zNpT5SJpEbcc889+NrXvoYFCxbAarXixRdfPOXXPe51TPkjzhBO5jR2JtIjSLZrtVrHTPyl2SUtcXJycnh2FwqFWD1Efr2k9Ort7YXf70c4HGaFVzKZhNlsZjkuLZLKyso4ubezs5M/sNQZUbdLNC+LxQKr1Qqfz4dYLMbxLtSVGY1GXiYRNY0ifCjtN5VK8fy3vLwcCbsd9nAYnZ2d0Gg0sFgszFQgmhnp/8vLy5Gfn4/u7m7Y7XYu3JS2297eDoVCgcLCQuj1ejQ3N0OpVKKiooJvZqSWE0UxQyxB8+6ysjLuLkVRxK5du5iPnEqlUF5ejtmzZ3OXaTQa0draigMHDvDnZ968eSguLobb7UZlZSWsVit2796d8fmSJOkzZ7DJZBK9vb2QZZk7+4mARikj1WzpnSMlW4w0yaEinH6EnwwkSWJ/5fFEO1MFjUYzamHn8/nQ2NjIQo6mpiYAwCuvvAJgKNXhlltu4THYZPFZqRE6nQ4vvfTSqbycCeOcKbrjQa1Wj3J7mgxGFl068qjV6jG9bcmByW63o6WlBZIksdyRNvzhcJhzv4qKijgA0uFwsEENzXXLysrgcDjgdDqh0+m4uySaHI0FotEo8vPzUVJSArvdzksv6nrpuE/jhGAwCKPRiMLCQo7MoW0/0bZIdup0Otl4nObD0WgUfrcbCa8XKUlCctgisqysDAMDA8wPpu6GXjPJgGnJRJ2qXq9HIBBASUkJ85H1ej1Ts5RKJRfO3bt3Y9myZVAqlex56/V6+YZGx06/3w+Xy8WeB5RiUVpailAohL6+Pvj9fhw+fBg6nY5ZB0eOHGFeMBX7mpoatLW1MQ9ZoVBgxYoV435uYrEYXn75ZXZC02q1uOWWW05axMjfd3BwEDU1NeOq2UiunZOTk+H8RXN3mhOTdDu9Iya3srFAsuni4mKsXLlyWkQdBDI58vl8WL58eUbWXTQaxcsvv4xDhw5h+fLl2LFjB/bt28fUr88bzpmie7JOl9yOTgVUdImIH41GsXDhwjHNNpLJJD7++GMMDAygv7+fXbBoy3v8+HEoFArodDrm3prNZva9pYJHETG0vaciLEkSiouLMTg4yD4Der0eVquVO9dA
IMDdsFKpRElJCYxGI8/myItBrVbzsY8KIvFkqasmmhD9XVmW+agXDocR9PmgkyTkAUhqtUgNu4sRw4FmyyUlJez3QIXbarXC4/HA4XCwfFkURY7LoRED0csouZhMdj755BNccMEFOHLkCB+5lUol3nvvPdTX12P27NmssKMiQwsZo9HIS5hoNIoTJ05ApVJxhDwATqpIT3+44YYb0NLSglQqxZab4+HAgQPsG0zP8+GHH+If//Efx/z7tP2fPXs2qqqqJt2hph/h001yiLky0q0svSPWaDQ4ceIEwuEwli5dekp89tNBIBDIKPbpr/3gwYPYsGEDbrrpJjz22GPc3WZNzM9iTIVTWG9vL2KxGBYsWDCmU1QkEuHNv8fjgclkgtfrhd/v5wUXdbzkY0vCB5pjaTQaJBIJmM1m5OXlwW63w+PxsESXOLAul4vTY2mMUFpait7eXk7OJdURWS2SQIK2/nPmzOHOuquri2NkSLhgsVhYckkm4SaTiY25BUFAjlYLMZGAUq2GWaFAYHAQ/Uolz4dpphqJRNDb28u+CDRKodBLt9vNKjAy3unv74darUZRURFMJhN8Ph/MZnNGkaMbxMqVK/Huu++ipKSEu9mDBw9y0CZ1ylqtFuFwGAUFBRnHUuI+e71emM1myLIMlUqFiy++mEUCDocDra2tXLDz8vJ4VzDeiCEYDGb8WqFQMFUsHbFYDM3NzVCpVGNS0E4XGo1mzKge6oibm5sRCARY5UhmQ+RxcCZBFLhgMDjK8jIej+Pf//3f8cEHH+DZZ58d5XUxneyNqcY5U3TH+yacKmUs3RjHarWOa4wTCATw4YcfMkWKjnMGgwGBQADd3d3Q6XTcPVJ3OzAwAFEUmWROIwMqnORBS1Qjks/SWIKKaiwW45ke+RXk5uZy8oLL5eLHp+UZdbU0dyWTbvpB7OrqgkajgdVq5WJJ10pH1lKtFoJKhQEALgApQYBRq4UoCOxGRh67wWCQKWBkrOP3+5FIJNjpi4oCCTFKS0thsVh4oZJKpbhzpcc9cuQI5syZk9HNqlQqVr7l5OTg0ksvZXMbtVoNnU6H48ePo6KiAoFAAK2trViyZAnnrOXm5mLNmjV8fB/JLgiHw+jp6cEHH3yAaDQKs9mMBQsW8M2S/GLLy8vR2dnJHZkkSaNSJHp6euBwOJhTPl1Qq9W8lNRoNLj44ovZizkYDDLnGkAGzet0RR3poFFGaWnpKNbIgQMH8J3vfAe33nordu7ceVbktE0lzgkZMAAOTxyJaDSKpqamk2rw0yHLMh/1CgsLodPpIIoihxjS3yGz8d7eXqjV6gz2QF5eHjt9iaIIi8WC0tJStmgk7isplqxWK6ciEJWGNthUgGlRRN1oWVkZMwOIfmS1Wtm6UKFQ8MKJvBVEUURvb28GG4MMa0wmE1s4yrKMwsJC2Gw2dHV1IRKJ8PZerVYjGgqhIBZDQq2GO5FAUpJgUCoxS62G32pFYPhYTdQfcvsymUyw2+3c3dONgF4jAH6tsVgMNpuNOy+DwcB0uEQigcLCQmg0Gj4h0GnB5XJBqVTiwgsvxNy5c3kssmvXLhaDJJNJtsGsrq7O6LB6e3tht9uh1+tRVVU1qvMMBoPYtm0bj39EUURlZSUqKyv5GE8Lyq6uLnR1dUEQBFRWVuILX/gClMOngZaWFjZkn85wxXRjHjq5jYd0g3ii76V3+1SQJ1MUU6kU2tvbEQqFUFtby+wXYKjr37p1KyGZGqsAACAASURBVHbv3o3f/e53WLRo0Wm91hnG+SUDTsdkKGN+vx+tra3Q6/XsbTswMMDHciogvb29OHjwIHQ6HcteiSIUj8c5rrygoIAXXU6nE9FolLswGguQyThllZENYl5eHh9HqYvV6/Xsj9De3s6MA/IqINpWOBzmMYFOp4Pb7YbL5eKOn7pJMoohug5xK6mr7u/v55sZmbdotVp0+3xwpFJQCQKMKhVkAKlhA5R4KMSLPYvFgpKSEjidTr4J
UeoAzYfVajXTx9RqNQoKCthjoa+vDzqdDmVlZTAajaxiKysr42JIqQWtra3o6emBKIqwWq3Yv38/vF4v6urqEAgE4HA4kJuby7PiVCo1quA2NzejoaGBC/Xx48dxzTXXZIwj+vr6uIsGwIbtl156KdMWiW9bUFCA+fPnY3BwELIs4+jRo0xjq6mpmXYTbuI06/V6rFq16jOLZbptJIHn+cNjoPb2dr7JpM+Jx2IWeL1etLa2ory8fNTcet++fdi4cSNuu+027Nix45zrbtNxzryy8cYLEzEij0QiaG1tRSqVGkWToZlwIpHA/v374XQ64Xa7+cNFkeI9PT2QJImFCDRGiMfjGSbgRAETBIGTcMkkpKioiBMP+vr6mAJEHZ5er+fNPxVaSq11uVy8qCIzGjJ/SSQSfLwmKk44HEZ/fz8SiQR33XTNoVAIgUCAC7HFYmEhBFIpSIkERAAKWUaJXo+oKKIvEkG3LEMTCsFcUJAxRyZnLprnzp49Gw6Hg6+XWBmxWAwqlYrfL1K6EbuCjrxUFEl153Q6sXz5cra1pD/v7OxEbW0tCzOI81tQUMAxRARZlvlGSj/wgUCAXeEII0dM6a5o6X+H+LZkBkPpuhaLBQqFAq2trdzB02dpvGJ1uiCHO6fTierq6tPyoR3PlJxGVm63Gx0dHUgmk/xZMxgMfNNfvnx5Boc5Go1iy5Yt2LdvH55//vkMf4xzFedM0R0PJxu4k1OS3+9HVVXVmHM1URThcrmYBkWuWV6vl7mrNOvMy8tDaWkp82y7u7tZEUbOX0TBImWY0WiE1WqF1+tlMxfgU3em0tJSeL1eDAwMwG63c6Q5HdFJ9UTbf8rNotmjSqWCUqnk8YTNZmOeLpnqlJeVwWO383yYwjip0Ofm5vL1aQUBuUolVBoNBpNJeONxREQRkGWIsowcWUapwYABlQperxc9PT08SiCaWCQS4RmxJEkoLy9HTk4Ouru7MTAwwOYw9P7GYjGeQSsUChYs0MKnr68PHo9n1Peb6Ht9fX38fdBqtRwtP3J0QEU+/bMz0gR/7ty5OHjwIAYHB/m5TuZCF41GcejQIZ6TG41GNsghyWwwGBxVrNJZE6ezXBscHERTUxPy8/NxwQUXnJHlmCAIfJMhpzny77Xb7WhubmYGTFNTE1KpFFpaWpCbm4tf/OIXuOOOO7B9+/ZzurtNh/Khhx462Z+f9A/PNoz8ASH09PRkLDHI8LulpQXFxcWoqakZkybj8/mwc+dOOBwOthnMzc2FJEkIBAIccaPVarm4kfMS+cPSEY1UZhTpTXO8wsJCZjaQqY3ZbIZGo8kIh6TXRiYvpG8nU2+LxcLHXhqDUBdtNBq5Y6SbRE5ODsex5yUSCDqdCGPoRkSLOpIH0yxPqVBASCSQr9dDrVDAl0hgUBQhALBqtVArlUgAkEIhBIcXX8QPnjdvHj9WOBzOEFIQ2Z9m4LSI0uv1PFPU6/WYNWsWiouLWaNP4xPi9QLgRGSHw4FQKIQTJ06goqICq1evhiiKLIWmsEiLxcLdcTQahcPh4Hm4SqVCXV1dRtFTqVSYP38+f59WrFgxZty7JEno6upCU1MTjh07xrQ8GoHMmjULwvDSMTc3FwUFBSgtLUV5eTl/xvx+P7q7u9HV1QWv18vub/RZOxmIGdDb24va2loUFxdP68af6H/RaBTLly9HRUUFysvLYTab4Xa78fTTT+OFF16AIAyFl1qt1jHVfZ9jbB7vD86PWws+5fU5HA50dHSgpKQEq1evHvXhlWUZ3d3dcDgc6Ozs5A5RoVCwlJWOpmR3SJ6b1EHSkZnkmrQAI6ZAYWEhtFotix4AsOcosRU8Hg8XIkphIGZDX18fYrEYG0ubzWbo9XrumKirIGoa/RkxBajjIl+FzmgUKgAmSUJ8RNEmYYLNZkNOOIz+WAyueBwyAJUgQMDQxsCkViOYTMIvSYimUlBJEkzDooxUKgWn08kzXboR5OfnZ3BwiR5H1LhEIsFL
MmCo4JHog26AWq0Wfr8fAHg8c/z4ce7SgaFtuMViwbJly/DXv/6VkzWSySQ++OADXH/99cjJyWGDe7KhJOOkke5jOTk5qK+vBzDUSTqdTjb7BpCxKLNarfyeA0PF+PDhw+N2x+N5F9DNIt1EhtgVI+N3aHZaWlo6ivc6HSC15ty5czOKvSAIOHToEO6//3587WtfwxtvvAGlUgmHw3FaYQGfN5wzRfdkHyyySjxx4gRMJhNWrVo1rpdCW1sbDh06BLVaDY/Hw0d8j8fDBHMqquTURaTz9CWZzWaD2+2G3+/n5F2r1YpIJMJmMVTYqKhSokJPTw8v5+jIS8d9WkhRsScvXJ/Pxx/cZDKJ2bNnAxgKa3Q6ndBqtRzJQ25ggiBAkCQkBQEGQUCh3Y6uOXPgG1asUTcXi8UgiyLiwSASAJKiCI1CgVK9HqIkwR2Pwx6NIilJUMoyJFmGWhRRXFSE/oEBTqsgyS9FDtG8l2heZrMZOp2OaWUk1CCvW6LG0fvf1dUFvV7Pyjd6T7VaLYqKinisEwwG0dnZybNvEp+Q3WQoFGL3thUrVqCmpgbvvPMOtm/fDgBYsGABLrroolGfsWPHjmHPnj18s7zsssv4ZLJo0SIYDAY0NjZO6rM63md7rNQHCnekjpjm6AqFYtrNcYChzx2JR+rq6jJkz5FIBD/96U9x+PBhvPjiixldLY0kzhecM0V3PIRCIYRCIXR3d2PJkiUZ8kJCLBbDvn37eElWWFjIXQSl7hLNKpVKwWQy8WyWuLIkMCCGgNfr5aOzJEmw2WxQKBSIRCJwOp1sYkPFlNJvqRCTGGAgrWgBQ2kMxO0FPqX1kJrNarWyM1c6P5kkt36/HwMDAywrLoxEEFKpkFCpEDSbISSTEIaLoF6vh0qlGvKy7e+HUpahViqhpzGCLCOeSiE53N1qARSp1YgplQjGYujp6IA4PH4hihVxj+nmQbNLl8vFUfHpogNK0FAoFBl+vWq1mqWuBQUF0Gg0rDykAgqA/YQNBgOWLFnCycTA0CiCbAspcj4vLw+HDh2Cx+Phz0pbWxuKioowf/58/rfBYBB79uzho340GsUbb7yBdevWZUQxzZ07F1qtlrt8ABOmL54M6YW4qKiIF3WUpzc4OIjGxkYO2hzZEU91MSaa5bx58zLCJmVZxu7du3H//ffjrrvuwq9//etpocjFYjFccskliMfjEEURt9xyCzZvHvfEP604Z4tuPB5HW1sbwuEw8vLysHDhwjELLgA0NDSgv78fBoMBAwMDnLpLdC/6oaUlFHklpPvI0kaXiiRFkhuNRjYFp1jzVCoFnU6H0tJS7p77+vpYVEAOY+mpvpT5ZTabeXFEhYkMswsLCyGKIkRRhN/vZ0kouXnREZ/ksBpRRGEkgoTJBL8gIKbTISceh6WwEL5IhFkWSkGABACCAItSCUkUEZZl9IkiFADUw1+CUgmVVotEPI4kgHgkgjyTCcUlJWws3t3dzRQjmnlTRzQ4OAilUon8/HxotVq43W6mrREzhN53v9+PYDDI3T8w5G8QCARQUFDAtDeKJvrkk0/gdrtx0UUXYffu3WzCTuOJ8vJyCIKAvr4+NuMmdRpF3qcXXVqk0eeEZvsWiyWjoOn1etx6661oaGhAJBLBvHnzppR/GovF0NLSAqVSmZH/N9KTgTyJe3t7M8xxqBifijkOMLQDaG5uBoBRirpwOIzNmzejsbERL730EqcyTwe0Wi22b9/OlMqLL74YX/ziF7F69eppu4bxcM4UXfrAUCLDwMAAf8Cbm5vHpY1RLA0tU6xWK/r7+9Hb28s+vaTrB8C+BqR1J+5nLBbj6BiijZWXl/MCxefzcTdMlo/EjyXKmE6nQ3FxMbMbSA5rsVgQGS6CgUCAu2GaKdtsNvT397M3Lz0HGaPQQi4Wi0Gv1w9F4Hi9iHk86MnNRUwQoMSQEsYgSTB7vQgaDNxB5ygUsCgU8EsSYqIIURAgDLMVBEmCTZYRBeCVJHQn
k1AKAnIFAQkAqWFntGQyyTcwi8WCgoIC2O12jhEiA2/ypKBiR/4FJPN1Op3o7e3lzlmpVCIcDnP3TGIJijUqLS3l2bbdbkdNTQ2uv/56bN++HclkkpMyGhoacO2116K8vJwpgHRyicfjGBgYwN69e7lbFASBHdRoXJGeY5eOvLw8XHnllVP6eSdb0d7e3pMq2sayVRzpb0s7gsm6lFGI6UgfClmW8cEHH+CBBx7AN77xDfzmN7+ZVgEI8CmjAgB//s4W6fA5U3RJJdbV1YXy8vIM2e5Y9o507E/vVslUORQKIZlMsvLG4/EwI4EMa2KxGMsiBwcHkUgkeCOv1Wq5Y04/KpOSzOPxwOfzcbdKYoBwOMwx58RWKCwsRG5uLvr6+hAIBKBUKvkaKFmCVFCiKEKn06GoqIizz4guRYblgjCUBybE44gJAmIKBXJlGQWShH6lEn6NBuFIBIlhmpksy8grKIBOo4HPboc3kYBSoYBSrUaeVot4IoFYXt6Q+xZ1/1otjGYz/MEgwtEoLyQp0JCO/MR/FkWRX6fdbofb7eblJXUqJNAgOTAtJI1GI6vczGYzq/vIVzi9CNIoxmq1ciYcFSWyyDQajbjgggsyookWLFiA1atXszmQ0+mEw+FgK05S7K1Zs2ZafrDD4TCampqQm5t7Sl634/nbEjWPfHtp8ZveEdNnlWw305MkgKFx3k9+8hO0trbilVdeyTgdTDdSqRTq6+vR3t6Of/3Xf8WFF144Y9eSjnNGBgwAx48f53lfOsjrNT21lCSqgiDA4XBk+KVSmgFxXonrSR6qFosFHo+HN+sKhYK715KSErZypGO01WpFIpFge0fa6NMGvqCggOfDlBCRk5OTMSumpRMA2Gw25qfSUoxmvCSRJTMdeo7i4mKOQFcqFEAoBIUkQVQqoZdlFMgy+hQKRAUBCkmCWaWCcc4cuFwu9nQgJZcgCJg9ezZ3W+mMCPL8pYihRCLBDIq8vDxmeNDrNBgMbEhjMplYsKFUKlFcXAytVgu73Z6RqUXXU1RUxGm1sizDZDLBZrNxVAyp/HQ6HRwOBwRBgMViweLFi9lkhm4soVAIV1xxBS91RFGEz+fDgQMHmGFSWlrK1pOkZotGo8y08Hq9fPNNFzzQ5+h0QTS0gYEBVFdXT4uiLb0jpv8mk0nk5+ejqKgooyPetWsXvv/97+Ob3/wm7r333mm1hjwZ/H4/brzxRjzxxBOjjHPOIMa9+55TRZfI9iNBx9HZs2dzx5RugA2AExQ0Gg1sNhsOHz6MtrY2PkaSd2thYSG7+LtcLnbwVygUXBwEQWA6F0WME/sBAEuEPR4Pb+PpuiRJQn5+PnJyclg6TEWdvIHJF5deL1keknMY/R7R1oChiB232z00ApEk5EciKFSpYFcoEBkeLwgAVLKMmCDAGo9DV1EB57BEN53KRO8TdaDp10D5YGR8bjKZmOJlMpm4EAPgUE1S5pHYg+TOBoOBZ7v052QG3tvbC0mSeKFEuXA5OTnsX2EymRCNRvloabPZeBxRXV3NRH3qaPPz87Fq1SrMmzcPAHD06FHs27cPBoOBi09tbS0uu+yyjM+Oy+XCm2++ya9rzZo1KCkp4TkqzfjTj+6TLcRkf2iz2djkZzpBbmgUG0QUtmAwiG9961s8wvrud7+La665ZlrntxPB5s2bYTAY8L3vfW+6nvL89V4AhsYLRIhnqtSIY+BIjfmyZctgMpnQ3t4Os9mM2tpa7NmzB16vF8DQ8mrhwoXw+Xw8OyK/BPrBoi6or6+Pixb9YJKUlzpDk8nEkmKKt5EkifPFyOyb5r1k8EIiCo/Hw2Y1NFcuLi7mMYbD4RiigJlMiLpciKrVGAQQFwSIAFIACiUJZllGr1IJr0YDpdMJSaPh8Qg5bhHnWK1Wo7CwkGfN6baTdMOhqByPx4OBgQGmc5Hfgkaj4e+HLMswGAycduz1evkkQfNeEmyQx3D6EZlm63l5ebBYLHzdgiBkWHIKwlC8
0dVXX43du3cjHo/zdX700UdMbaOF6uDgIBfNdBYCfR/feustVhxS/Pstt9zCtD16z6hb7OzsRDgchlKpHNURj/xckkHM4OAgFi9ePO4y+EyBDHK6u7szZscGgwH5+fnYuXMnFAoFvvnNb2Lx4sU4ePAg/vu//xtbt26d1uscCZfLxUZU0WgU77zzDh544IEZvSbCOV10qXMkVsGBAwcyjKlNJtOYTvqxWAzt7e1IJBJYvXo168wvu+wy9Pf3I5VKIT8/H6lUCtu3b2dDE5vNBoPBkOHIRQop+uGi+bDdbs/Ig6LxBMXh0MiAjumUukusCEEQOAEhGo2yexlR0GKxGDMG6L3Q6/WwARhIJhHQaODEEOsgX5IwqFAgolAM8XYBiIIASRRhzs1FbkEBHA4H3G43z4TT/RpoXEIRP2Q7SfQ4ElhQR2wymRCLxRAIBFhyTDFCVLSBTz0WTCYTG7rTvJfGGeFwmIUkNMqgwkhiAvIiNplMcLvdCAaD2L17N1avXo1YLMa+u3SacblcKCgoQCqVQjgcZq5wJBIZlUBLAhYa71Cy8Mh0YhpLpf97ilIKBoPo6OgYVYjJ/nHWrFmnZGx+uojFYmhqaoJOp8PKlSszZLqDg4P44Q9/iO7ubrz22muYM2cOAIxr0j7dcDgc+PrXv84nsS9/+cu47rrrZvqyAJxj44V0R6z0OOf0zjaVSnG2VDAY5I03+YUGAgH4/X7Mnz8fBQUFn/lBj8ViXIyoO/roo4+44ywpKYHD4eDjJElUAXB3kz5GAD69WdhsNp4tEk84Pz+fzV60Wi3T0CRJ4rk1GagT5YkYDJBlGEMhhJRKpIZfV4EkwSrL6FUoEBYEKACoJAlaUUREpUKuWg0hP59n1CqVign6/f39XBipqAJDs08qeJIkIScnBzabjefgOp2ODXDopkgR7TTXViqVMJvNQzPo4ZsJjSkAZLwP1FVTlBEZ25BzGfllUJqyxWJhNzdSpul0OpZ3V1dXQxAE5Ofno6WlhX0d1Go1lixZgvLyci6ekiThueee4xsKWU/efPPNpxTdnUwm2RCfhCP0GujrZJE7U4F0ZsTIFGBZlrFjxw48+OCDWL9+Pe65556zZnZ7luH8mekSp3Osue14iMfj6OjogNPp5OMuLX5MJhPy8vImZcYRDAbh9/uh1Wphs9lw5MgRDtgDgNmzZ6Ozs5NpapFIBC6Xi9kItFCi10GMBVEUuYukopqTk8MGOIODg1zQqHDr9fpPRRbBIIRUCmpJQl4yieDwa9UBCAsCUgAgyygWRRgUCvQqlUjIMgStFhqtlgUIZC5OJjQUyRONRpmRQQWaKGAlJSXsVEZjhIKCArhcLlaR0eKPFl5WqzXDf5fEANRRU1dL4gmS3NINgtIodDodIpEImxKl84LnzZuHzs5OiKLIdDydTofZs2dj7dq1UCgUGBgYwPvvv8+pGQqFAldddRVbOfb19eFvf/sb32hXrlyJ5cuXT+aji6G3XobT6URnZ2eGyIBUdTSeoMid9NHEVBXiaDSKxsZGGI1GLFiwIIMZEQwG8cMf/hB9fX343e9+lzE+OdPo6enBnXfeCafTyeOMDRs2TNvznwLOj6J7//33w2g0YuXKlaivr+eidjJ4vV60t7fDYrFg7ty5LDQgcxg6/qVSKRiNRi7CtDybCGRZhsfjYaGGyWTCrl27uOMl/ix1YcDQxnVkNHp/fz/PS6mzpEWdz+fjRR3FtXu9Xp6bJhIJYHjmm5dKwZpIwKnRIKxWQ5BlaEQRufE4Ajk5UMkyVACiSiVSABRKJUqHlzd2u53HHOThG4/HYbVaEQ6HeSlFklUSMYwsxGq1GiUlJXC5XHzdtMR0u93cuaePJqxWK/Ly8jK8J8h/wev1sjQ7nZNpsVgQi8W4YOr1er5pkJBi2bJlaGpqQl9fH5vohMNhrFixAkuXLkVbWxt27drFhY2WmTfffDN/j+nzEovFuKuurKyccAIwKeO0Wi0qKys/0+KR
CjF9EW0tvSMm9d9EP6NkKTrS/lGWZWzfvh2bNm3Chg0bcNddd017d0umU3V1dRgcHER9fT3+/Oc/o7a2dlqvYxI4P4puS0sL9uzZg7179+Ljjz9GIpHA4sWLUV9fj1WrVmHRokX8Yfb5fOjs7IRSqURlZeWYpPZ0kEafRhP0Q5zeDU+m2xBFEUePHoXD4UBVVRVKS0vx7rvv8vGaeJSBQICNVCgendRvZBZDFCqar6pUKthsNhZfCIIAo8EAayKBgUQCKVmGUhAgDvskSAAKVSrkCAKcoog4hrxyNbIMnSAgpFBANxyySYwJlUqF0tJSiKIIp9PJJwviA1MEj9/v5wUfLajoBpTudUyFurCwkL0s6P3Nzc3lRAyafdPNh+LaHQ4HF2JStDmdTn4OCmP0eDysbFMqlcwEobBOeq+j0SjKy8tx6aWX4ujRo2hoaOAlFo2w7rjjjozvaW9vL958802m1RmNRtx8880Z/rEjIcsyh4aOPMpPFolEgrthKsQ0OqOueKxCHIlE0NjYiLy8PMyfPz+juw0EAnjwwQcxMDCAp59+OsOtbyZxww03YP369bjqqqtm+lLGw/lRdEciFovh0KFD2LNnD/bt24djx47xEkir1eKXv/wlqqurT/munb4IIdNvUvVQIR6r0/F4PGhvb0d+fj4qKir4Q045a7SUo60rSVIpOZiWaZIkMUeUzLD9fj9v24nFQI5UFoslIwmDZKs+n4+P6ETNkiQJRUVFEASBpbhkRUiFj9gYVIgpMp7GJcCnceHxeBzAEAeaFl/pHbHL5WJeMglWyIKSeMyU12YymdgYm4771N3SdWs0GgwMDLD3QGFhIdRqdYa1pFKpZDaJQqFghziyebTZbFizZg10Oh1ef/31DJ+FioqKUdSxP/3pTwgGg3xjj8ViuOiii7B06dIxPz+hUAhNTU0wm82YN2/eGVFtURowfZGZPhXiUCgEt9s9KslClmW8/fbb+PGPf4yNGzfizjvvPGtmt52dnbjkkktw9OjRjAy7swznZ9EdiVdeeQUPPfQQrrnmGuh0Ouzfv58VbKtWrUJ9fT1Wrlw5Sj8/GdCxlQoxHUVNJlNG9zWR7hoYUh9R7ldxcTH6+/uxa9euUYs6MnlJpVIYGBhgWa3RaOSCRkWJll+5ubkZMThEi8rJyYHb7c4IeQSGCmhxcTEbu1NHR3Nwor4RY4OWaLRYo4UUAL5pEJ2LOMTAEH3PYrGgv78/4yZANxCDwcCGQ9TdkhKQlpo006ZxRmFhIY956AaTn5/PbnK5ubnQ6/UsrFAoFMwEqaioQE5ODjo7O9lARafTwWAwYO3atdydPv/88xmS8Wg0yqesdEiSxMnRI5NKpgMkQe/o6GBKnlarRW5uLg4dOoT58+fjP/7jP+Dz+fD000+jrKxsWq/vZAiFQrj00kuxadMm3HTTTTN9OSdDtugCQwsPq9WaUewkSUJnZyf27t2LvXv3Yv/+/RgcHERNTQ0X4WXLlp30iHgyyLKMwcFBjpomXwZK7CXWxGS6CEquoAVWU1MTDh06xH8+a9YsdHd3M+8zmUxyt0odL3XItGykI7fFYuEukdRuZrMZquEkCK1Wy6II4NNCHI/HeQRA4xHi0prNZpZKU3dLizWKKqd5K7E4CgoKMgqxxWLhhGG6rvRCTO+n0+nk0UFBQQG0Wi3T/NKvjVRw1OlaLBY+YaR7ccTjceTl5eHyyy9Hf38/3nnnHS7adFO54YYbYDKZsHfvXhw+fJhvKLIsY926dRmuW5SCW1JSglmzZk1790jRPaRqo04xHo/D7/fjwQcfRENDA5LJJGpqanDFFVecNfzWZDKJ6667DldffTU2btw405fzWcgW3ckgmUziyJEjXIgPHz7MKQJ1dXVYuXIlKisrP/M4SCYr3d3dmDVrFsrKyrjbpNkbzYcpooZGE5NZggBD1nrBYBAGgwFFRUXYs2cPTpw4weYmJNZIT0kgoYfJZIJGo2HGBPBpSnAikeB4
dI/Hw0YzJCagmTPFpAPgmXIkEoHX6+UiRdaYAJh/nE4bo+VfuoMXqdOoiNLNAgCsViuMRmNGISYlXDwe507e5XKxqo641BSTQ12ySqViQx5KvAXAjlw33ngjnE4n3nnnHWZOkFPZqlWr2Mazv78fLpcLWq0WtbW1KC0t5cTdtrY2RKPRUYGY0wUaZ1itVlRUVGQUfJ/Ph+9///sIBoN46qmnUFpaCrvdjs7OzpPGEU0XZFnG17/+dVitVjz22GMzfTkTQbbong6oW92/fz/27t2LhoYGtLe3w2azcTe8cuVKnoECgNvtxokTJ2A2m1FRUTGhbTR5NtAShPwIqBBPJitLkiR88skn6O3txfz581FVVYXt27ejv78fwFBRnTt3Lk6cOMHHW7oR0AJLqVTybJZGAYIgIJVKoaioCKIowuPxMJvCarWyiTcVNjIaIiUa0d0AsAAkvdskM3ZZljNi5WmxNlYhTh+PWK1W6PV67tTpetMtOIntQIs54h37fD5+DuL5kjWkyWTC6tWrsX37dmi1WqbwpVIpfPWrX2V2xuDgILxeL3bt2oVAIMCd9bx58zBv3rwZkfDSaY5mtyNDJd98801s3rwZDzzwAP7pn/7prJndpuODDz7AP/zDP2DJkiV8fVu2bME111wzw1c2LrJFd6pBXSx1ww0NDXC7+t27QwAAGGFJREFU3SgrK4PP58Py5cuxadOm03Lvjw0nOFAhpnkmFWIqjCNBcTG0oCGOcSqVQnd3N3evubm5eOONN/iITykZPT093OlRyKZCoYDJZIIsyzzvpaJG70dRURHi8XiGYIHSKig5giS8ANgpLBAIMG2MnMFoBj2yEJNcOn1OTcs64q/m5+dnjE+sViurEqnjJfEF8YrJcc5oNHLR9fv97DdBi8OKigp88sknAIZuUuQsd/nll/MY4cCBA9i3bx+PPpLJJMrKylBZWcnXeKrUrsmCgikLCwtHFXyv14sHHngA0WgUTz755LQmONx99914/fXXYbPZcPTo0Wl73mlEtuhOBx555BE888wzWLt2LSKRCD7++GOkUiksXbqUu+GamppTTj2V5aGoayrEJD9OpwM5HA4kEomTmranI5FIoKuri4UXarUab7zxBidVqNVqWK1WuFwu9qUluS4VLlJRER+YlnzURcZiMe4sSQ4bCoWQSCRYykuFmAQNNHYBRjMgSDJMCz6z2QyTycSFmN4r+jcUlURCDKVSicLCQp7tEu2O3kMywaHcNnq8SCSCf/7nf4bX68Vbb72FSCTCZuyCIODWW2+FwWDA22+/jba2NsiyzKMIk8mEL3/5ywDG5tgSoyCd9XI6hViSJJw4cQI+nw81NTV8E6XX8te//hU/+9nP8OCDD+L222+fdonx+++/D6PRiDvvvDNbdEcgW3QngSNHjqC6uppHCSSyOHDgABoaGrB37140NzfDZDLxVnvlypUoKys75SMdyZq7urrg8/mYDpc+lphsKgDFx1M8eiqVwptvvsmjAo1GA4PBwAo46n7Tu1MqLHRMp1BMKsQ07wU+9SUIBoM8AgiFQhmF2GKxsF0lkFmIqZv1eDyjCjFdF/0bo9GIwcFBFhLQNQBgWhlFMKlUKqZRUfdeW1uLuro6vPLKKxnvazwex5VXXomCggK8++67nN8GDN3Yamtrcckll4z7nqenO5B3Mwlj6GuiQgtyJCsqKsKcOXMyvvcejwf3338/RFHEk08+mbHkm250dnbiuuuuyxbdEcgW3SkGFaj0sURfXx/mzp3L3XBdXR0bbH8WvF4v2traUFBQgLlz50KpVHLBo7FENBrlH2AqxpOZDwNDP8idnZ0QBAEVFRUIh8P4+9//zpQy8g6mhRm9TkpxMJlMHMFOhRgA07qKi4sRDoczPCPIe4ESMkh2DIAz7AKBAHflND9Nt7OkmbMsy7BYLDAajTwyAcCLNOIq00KN2Cq0MEulUrw8pMRitVrNLnCxWAx1dXVIJpOorKzExx9/jOPH
jwMAiouL8cUvfnFS7zktQEeOl0ieTl/pu4JUKoUTJ04gEAigpqYm46QjyzJee+01bNmyBZs2bcJtt9027d3tSGSL7tjIFt1pgCRJaG9v5yJ84MABRCIRLFq0iAvx4sWLMzqdaDTKR9iqqqqTcn7pBzidPyyKIgwGAxfi3NzcSZPze3p68PHHH0OtVmP16tUIBoPYsWMH82TpaE0iDVmW4XK5WAZMXrt0UyDBBBW/sQqxyWRi28vc3FwuygC4IJEcF8ikolFHTF0rADa/oVk0MOR3TAWLRjgGg4Hfn1gshsWLF+Pw4cOc0UYeFBdeeCHq6+vZU4P40lNR4OjklN4Ri6IIvV7PfOPy8vJR3a3b7cZ3v/tdCIKA3/72txlpETOJbNEdG9miO0NIJBI4dOgQF+KjR49Cp9NhyZIl8Pv90Gg0ePjhh5mONFnQfJgKMS2yaD5Mqq+xikUqleJteFVVVYZdocPhQEdHBzQaDaqrq9HX14ePPvqIea3kj0CG5ZIksUl5eiGOxWJsSE4FUqVSoaioCJFIBD6fDwBYoUaeD1R06d/o9XoYjUY+stO/SS/ERF+jnwXiTkcikYwlIlk/Ek2spKQE27dvh8/n46VmPB7H2rVrMXfu3FP6vkwWoiiiubkZwWCQaYE0cmpoaIBGo8FLL72En/zkJ7j11ltnvLtNR7bojo1s0T1LIMsyXn31VTzwwANYuHAhFAoFTpw4gbKyMtTV1bGibiJ2lOOBKE/ptpfUXVIhJqHHZMj9nZ2d6Orqgk6nw6JFi3D8+HEcOHCAjWxIpEDHdSrE5JlAzxuPx5kuRkVVrVajqKiIbyAAeClG4wjqjqkjzsnJgcFg4Ah2+jeU3JxKpTg2nQI+if8bjUY57p6ELoIg8HI0Go1i2bJl05LH5fP50NLSgrKyMk4zBoZOTocPH8bPfvYz9Pb28lLuX/7lX3DXXXed8euaKLJFd2xki+5ZhPfffx+VlZVM7SGT6z179qChoQH79u2D3+/HwoULeVG3bNmy06IkkXbf4/HA6XRCkiROZqBi/Fkc5JGQZRltbW3o6upCTk4Oli1bhpaWFnzyySdsjk4CB+LEplKpjFgf8qEgGh1R0UhdV1hYiEgkwoWYuMfp0UHphZhkvcQzpiJrNBqhUql4sUf+C/SYVLzJjzccDmPRokVYvXr1hBdfk4Uoimhvb0ckEkFNTU3GaEmWZWzbtg2/+MUv8NBDD+Gmm25iZd3g4CALXWYaX/nKV7Bjxw643W4UFRVh8+bNuOeee2b6sqYS2aJ7vkAURRw7doxNfg4dOgRBELB8+XIWcixcuHDC81uSjTqdTlRWVsJqtXLqA82Hid+aPh+eLBtDlmUcO3aMt/51dXVobGxEY2Mjx6mTpSQZwlMhpt8n34eRhRgAF2KyYASGiia5tdHNJH1Zp1ar2dSH5s0ajQY6nY69hauqqtDT08NxPMBQlI1Op2N58FTC6/WitbUVs2bNQmlpacbNtL+/H9/97neRk5OD3/zmN2dNgT1PkS265ytkeSjp9sCBA9i7dy/27duH1tZW5Ofno76+HvX19bjgggtQXFw8qhsmZsRnhSFSh5c+H6bO8mSxSJ8FSZKYZmcwGHD55ZejpaUFzc3NnKpM1o00mqBCnD6fJWUccYKJzaBWq1FQUMA8Y+DTjphieEioQb4NAHjEQUU4NzcX+fn56O7uZgZHPB7HrFmzpkwxJYoiWltbEY/HUVNTk+EFIkkSXnnlFfzqV7/CT3/6U3zpS1+a1tntW2+9hQ0bNiCVSuEb3/gGvv/970/bc5/FOD+L7o9+9CP85S9/gUKhgM1mw7PPPstu/+czKKGgoaGBO2Kn04kFCxagvr4e8+bNw7Zt27B+/XosXbp0Qm5oI3GyWCQaS5zMRIjoT36/H9XV1SxdTaVSOHDgADo7O5GTk4PVq1ejubkZLS0t7O1AkmVa1qVSKRZn0LItFouxPwMVYmBIlUdRQKSEAz71p6AFHnXGCoWC
zXNoBk43p9zcXNx+++2nbdnodrvR1taGOXPmoKSkJKOgOp1ObNy4Ebm5uXjsscc4OHK6kEqlUFVVhbfffpvd+v7nf/7nbDYXny6cn0WXNroA8Pjjj6OxsRFPP/30DF/V2YlUKoXGxkZs3boVb7/9NhYtWoRgMIglS5bwfLi2tnbS89t0JBKJjLEELcZGxiK53W60t7ePWhCNB1EUsXfvXnR2dkKn0+Giiy7C8ePHOYmBeLZU/GhunF6IjUYjpw2nG/hQ8Sb7RjLJSVerxWIxGAwGVrERb7ekpIRNjtKjdUhq/FlIJpNobW1lx6/0GbEkSfjTn/6EX//61/i3f/s3rFu3bkaYCbt378ZDDz2Ev/3tbwDAKcA/+MEPpv1azjKcnxHs6QbH4XD4rKLLnG0gaWxVVRX+67/+iwvPwYMHsWfPHjzxxBM4duwYjEZjhsnP7NmzJzy/Ja8FormlxyK5XC60tbWxN0FZWRny8vIyjvXjQaVSYc2aNVizZg3/ns1mY99arVaLiy66CHa7HceOHWNBA81ek8kkNBoNJElinwhyGhMEgRNDyElMpVKxj28ikeDod1r+KRQKVFRU4Morr+Tum1gh6fHr6TebkctOl8uF9vZ2VFRUZBgpAUPd7YYNG2C1WrFz587TSps4XfT19WWkSZSXl2Pv3r0zdj2fB5zTRRcANm3ahOeeew4mkwnvvffeTF/OWY3i4mI89NBD/GvqGi+66CIAn2a97du3D3v27MGLL76I7u5uzJ49m3Pp6uvrJ2wCT3xZivjxer1YtGgRdDodgsEgent7OQVjsrFIKpUKl156KS699FL+vVmzZiGZTKK9vR0qlQoXX3wxwuEwDh48yMozlUoFg8HARuX0++SvQN04za6J8kYLt+LiYlx77bUZ6jPyqEjPHUtXDVIqBj3O4OAglEol6uvrMx5HkiS8+OKLePzxx7FlyxZce+21M95IjHVSnulrOtvxuR8vXHnllXA6naN+/+GHH8YNN9zAv966dStisRg2b948nZd3zoO6yXQT+FAohNraWu6Ily5dOu78dnBwEM3NzSeNrBkrFkmj0cBkMp00Fmk8pHfPqVQK77//PpqamiBJEurr62E0GrF7924ueIlEgp3L6HVQEaYRCb0XAE6Z+tTb24uOjg7k5eVBFEWW/f7tb39DSUkJtm3bhtLSUjz66KMZgpSZRHa8MC7Oz5luOrq6unDttddOGRH7/vvvx//93/9Bo9Fg/vz5eOaZZzI6mfMZiUQiwwT+yJEjUKvVWLFiBc+HbTYbtm3bhiVLlmQsyiaKkb4EVPzSbS8n4uaWPj8m4yFZltHQ0IAjR44AAJYsWYJZs2bh9ddfBwA2oq+qqkJrayvUajWnBBcUFGSkBE/0/WpuboYgCFi4cCEXe1INbt68Gbt27eKkjKVLl+LZZ589KzpKURRRVVWFd999F2VlZVi1ahVeeOEFLFq0aKYvbaZxfhbdtrY2VFZWAgCeeOIJ7Ny5Ey+//PKUPPbf//53XHHFFVCpVBxn8vOf/3xKHvtcgyzLCAaDbAL/2muvoaWlhccRNB+22WynXEjIejG9EJM/w1ixSIlEAq2trUilUqiurh6zU6afDbqm7u5u7N+/H6Ioora2FrW1tdixYwfa2trY6GfdunUT7kIpaaKjowMLFiwYJem22+349re/jZKSEjzyyCMwm80QRREnTpxAVVXVKb1PZwJvvPEGvvOd7yCVSuHuu+/Gpk2bZvqSzgacn0X35ptvRktLCxQKBebMmXPGQvZeffVVvPzyy/jjH/845Y99ruH111/HH//4Rzz66KNIpVLYu3cv09Y8Hg+qqqp4PrxixYpT4vcSxotFUiqVCIVCqKiomBA74mSQZRl+vx+JRILz5SaCeDyO5uZmqFQqVFVVZbBCJEnC888/j6eeego///nPcfXVV58VXW0Wk8L5WXSnC9dffz1uu+02fPWrX53pSznrcTI2AtHWyOSHFlzpJvDV1dWn
bAIfjUbR2NjIwaDhcPi0Y5EmC1mW4XA40NXVhcrKylGqsd7eXnz729/GrFmz8Ktf/WrKFW0TxUsvvYSHHnoITU1NaGhowMqVK2fkOj7HyBbdU8FElnQPP/ww9u/fj23btmW7kSkGjQzSTeBbWlpgsVh4NLFq1Srmwp7scXp6emC321FVVTWKYnWqsUiTRSwWQ1NTE7RaLaqqqjJuHpIk4bnnnsPvfvc7/PKXv8RVV101o5+npqYmKBQKfOtb38KvfvWrbNGdPLJF90zgD3/4A55++mm8++67U57umu00xgZ58qabwNvtdlRUVGSYwOfl5UEQBDidTvT09MBisaCiomJCxZOKfSAQGDMWiWwvJ8pPpjy9np4eVFZWjlKN9fT04L777sO8efPwi1/8IoNfPtO47LLLzkjR/dGPfoSCggJs2LABwBC1s6ioCN/+9ren9HlmENmiO9V46623sHHjRuzcufOUPW1PhmynMXGQCTy5rZEJPPkm/PrXv8aKFStOa2xAuWnUDdN8+LNikaLRKJqamqDX67FgwYJR3e2zzz6L//zP/8QjjzyCL3zhC2fdaelMFd3Ozk7cdNNN+PjjjyFJEiorK9HQ0DDtMuYziPNTkXYmsX79esTjcVx11VUAgNWrV0+pxLimpmbKHutch0KhQFVVFaqqqnDnnXeivb0dt912Gy688EJUVFTg97//PZvA19XVcUdcUVEx4W6VCmz6jHUsgUN6rlk4HIbT6cTChQtHMRq6u7uxfv16VFVV4cMPP8wIjpwuTJTjfiYwd+5c5Ofn4+DBg+jv78eKFSvOpYJ7UmSL7imivb19pi8hi3FQVlaGl19+GRUVFfx7xDKg2fC2bdvQ0dGBsrIyLsL19fXIz8+fcLepVquRn5/PxYJikQYGBtDc3MxOZr29vQgGg+jv70d1dTX+93//F8888wweeeQRXHHFFTPW3b7zzjsz8ryEb3zjG3j22WfhdDpx9913z+i1TCeyRXcGMZOdxrmMnJycjIILgFOKr776alx99dUAPvUK3rNnD3bt2oVHH30UgUAA1dXVo0zgJ4qBgQHY7XYsWbIEZrM5Ixbp6aefxu7duxGLxXD99deju7ubfR/OR9x444348Y9/jGQyiRdeeGGmL2fakC26M4iZ6jSy/qdDUCgUmDt3LubOnYvbb78dwNDIgEzg//jHP+L++++HQqFgNd3KlStRVVU1aiEXDofR1NQEk8mEVatWZaRL5OTk4IUXXkBzczP+8Ic/YNWqVfjkk0+wf//+U6a/nWm8+uqruO++++ByuXDttddi+fLlLPWdKmg0Glx++eUwm81Twg75vCC7SDvLMdWLjKz/6eSQbgJPIo7W1lYUFhaivr4edXV12L17NxYsWIBbb711FK+2o6MD9913H5YsWYItW7ZkxKKf75AkCXV1dXjppZdYOXoO4eQcxpN8ZTFD2LZtm1xWViZrNBrZZrPJa9eunZLH/eijjzIea8uWLfKWLVum5LHPF0iSJPf19cmPP/64XF5eLtfV1clLliyRv/SlL8k/+9nP5DfffFO22+3yI488Ii9fvlzeuXOnLEnSjF3v9773PXnhwoV8jT6fb8auhXDs2DG5oqJC3rhx40xfypnCuHX17DzbZIEbb7wRN95445Q/btb/9PQhCAJKS0vh8Xjw2muvYcWKFUilUmhpacHevXvx5z//Gffeey8uuOACfPjhh1PO4Z4srrrqKmzdupV9QrZu3TrjPiG1tbU4ceLEjF7DTCFbdM8zyFn/0ylDuvewUqlkE5y77rprQubr04W1a9fy/69evXrKTJ+yODVMLrI1i889ysvL0dPTw7/u7e3N5sadAZwtBXckfv/73+OLX/ziTF/GeY1s0T3PsGrVKrS1taGjowOJRAIvvvgi1q1bN6XPcffdd8Nms2Hx4sVT+rhZjI8rr7wSixcvHvX1l7/8hf/Oww8/DJVKhTvuuGMGrzSLLHvhPMSZ9j99//33YTQaceed
d06ZaXwWp4cz6ROSxZjIei9kMb3o7OzEddddly26ZwHOtE9IFmNi3KKbHS9kkcU5jvXr12NwcBBXXXUVli9fjnvvvXemL+m8Rpa9kEUW5ziyPiFnF7KdbhafS/T09ODyyy9HTU0NFi1ahN/85jczfUlZZDEhZIvu5xz79u3D0qVLEYvFEA6HsWjRovNijqpSqfDII4+gqakJe/bswZNPPonGxsaZvqwzih/96EdYunQpli9fjrVr18Jut8/0JWVxCsgu0s4B/PCHP0QsFkM0GkV5eTl+8IMfzOj1fOUrX8GOHTvgdrtRVFSEzZs345577jmjz3nDDTdg/fr17G98LiIYDHKqxOOPP47GxsYp9XDOYkqRZS+cy0gkEli1ahV0Oh0++uij88qxCRhiSlxyySU4evToWRV1cyaxdetWdHd346mnnprpS8libGSTI85leL1ehEIhJJNJxGKx88rJKhQK4eabb8Zjjz12XhTcTZs24bnnnoPJZMJ7770305eTxSkg2+meA1i3bh1uv/12dHR0wOFw4Le//e1MX9K0IJlM4rrrrsPVV1+NjRs3Tvnjx2IxXHLJJYjH4xBFEbfccgs2b9485c+Tjoka22/duhWxWOyMX08Wp4xTHi9kcZZDEIQ7AXxJluWbBEFQAvgIwA9kWd4+w5d2RiEMmRv8AYBXluXvnMHnMMiyHBIEQQ3gAwAbZFnecyaebzIQBGEOgL/KspzVWn/OkGUvfM4hy/JzsizfNPz/KVmWLzzXC+4w1gD4GoArBEE4NPx1zVQ+wbAvamj4l+rhrxnrUgRBSHf6XgegeaauJYtTR7bTzSKLk2D49HAAwAIAT8qy/MAMXssrABYCkAB0AbhXluW+mbqeLE4N2aKbRRYTgCAIZgCvArhPluVznwidxRlDdryQRRYTgCzLfgA7APzjDF9KFp9zZItuFlmMA0EQCoc7XAiCkAPgSmTnqFmcJrI83SyyGB8lAP4wPNdVAPiTLMuvz/A1ZfE5x/8DP4VmVLVBi68AAAAASUVORK5CYII=\n", 140 | "text/plain": [ 141 | "
" 142 | ] 143 | }, 144 | "metadata": { 145 | "needs_background": "light" 146 | }, 147 | "output_type": "display_data" 148 | } 149 | ], 150 | "source": [ 151 | "\"\"\" 构造预测集数据 \"\"\"\n", 152 | "delta = 0.2\n", 153 | "px = np.arange(-3.0, 3.0, delta)\n", 154 | "py = np.arange(-3.0, 3.0, delta)\n", 155 | "px, py = np.meshgrid(px, py)\n", 156 | "px = px.reshape((px.size, 1))\n", 157 | "py = py.reshape((py.size, 1))\n", 158 | "pz = np.hstack((np.hstack((np.ones((px.size, 1)), px)), py))\n", 159 | "\n", 160 | "fig = plt.figure()\n", 161 | "ax = fig.add_subplot(111, projection='3d')\n", 162 | "ax.scatter(X_train[:, 1], X_train[:, 2], y_train, color='red', marker='^', s=200, label='Training Data') # plot训练集\n", 163 | "ax.scatter(px, py, (hypothesis(pz, theta_opt.x)), color='gray', label='Prediction Data') # plot预测集, 分类时加上 np.around\n", 164 | "ax.legend(loc=2)\n", 165 | "ax.set_xlabel('x')\n", 166 | "ax.set_ylabel('y')\n", 167 | "ax.set_zlabel('z')\n", 168 | "ax.set_title('classification')\n", 169 | "plt.show()" 170 | ] 171 | } 172 | ], 173 | "metadata": { 174 | "colab": { 175 | "collapsed_sections": [], 176 | "name": "logistic_regression_sample_with_optimize.minimize.ipynb", 177 | "provenance": [], 178 | "version": "0.3.2" 179 | }, 180 | "kernelspec": { 181 | "display_name": "Python (Python 3.6)", 182 | "language": "python", 183 | "name": "python36" 184 | }, 185 | "language_info": { 186 | "codemirror_mode": { 187 | "name": "ipython", 188 | "version": 3 189 | }, 190 | "file_extension": ".py", 191 | "mimetype": "text/x-python", 192 | "name": "python", 193 | "nbconvert_exporter": "python", 194 | "pygments_lexer": "ipython3", 195 | "version": "3.6.8" 196 | } 197 | }, 198 | "nbformat": 4, 199 | "nbformat_minor": 1 200 | } 201 | -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/4.nurual_network_back_propagation/ex4data1.mat: --------------------------------------------------------------------------------
https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/基础知识速通/Machine Learning教程/NoteBooks/4.nurual_network_back_propagation/ex4data1.mat -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/4.nurual_network_back_propagation/ex4weights.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/基础知识速通/Machine Learning教程/NoteBooks/4.nurual_network_back_propagation/ex4weights.mat -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/4.nurual_network_back_propagation/img/nn_model.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/基础知识速通/Machine Learning教程/NoteBooks/4.nurual_network_back_propagation/img/nn_model.png -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/bird_small.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/bird_small.mat -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/bird_small.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/bird_small.png -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/ex7data1.mat: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/ex7data1.mat -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/ex7data2.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/ex7data2.mat -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/ex7faces.mat: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/基础知识速通/Machine Learning教程/NoteBooks/7.kmeans_and_PCA/data/ex7faces.mat -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/NoteBooks/test.txt: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /基础知识速通/Machine Learning教程/README.md: -------------------------------------------------------------------------------- 1 | - [MachineLearning教程](#machinelearning教程) 2 | - [线性回归到神经网络](#线性回归到神经网络) 3 | - [传统机器学习算法](#传统机器学习算法) 4 | - [模型评估与调优](#模型评估与调优) 5 | 6 | # MachineLearning教程 7 | 在开始本教程之前,请先了解机器学习的大致概念,并掌握一定的python基础,能够阅读并运行jupyter notebook文件。基于实验室的方向特性,我们在机器学习部分安排三部分内容,**神经网络机器学习基础**,**传统机器学习算法**,**模型评估与调优**。由于我们的最终目的是掌握深度神经网络及其原理,而深度学习是机器学习的子集,因此第一部分学习如何从最基础的机器学习即线性回归引入到深度神经网络。第二部分我们学习一些简单的其他机器学习算法,如贝叶斯分类、高斯混合模型,这些方法既是常用的工具,也能够从数学上更深地理解机器学习。第三部分学习模型的评估与调优,了解机器学习相关的评估指标,理解机器学习中的偏差、方差的概念,并学习调优方法。 8 | 9 | ## 线性回归到神经网络 10 | 
第一部分从**线性回归**逐渐引入到神经网络,这一部分是学习重点。首先从**线性回归**和**Logistic回归**理解拟合与分类两种任务的联系与区别并对机器学习有一个基础的认识。**(单层)感知机**是神经网络的雏形,了解感知机权重的本质是分界线的参数,并掌握更新权重的方法。**神经网络**即多层感知机,学习这一部分,需要理解神经网络和感知机之间的联系,理解深度网络激活函数的作用,为何深度神经网络相对感知机能够实现非线性分类。这一部分主要学习吴恩达机器学习课程,感知机部分参考《机器学习》(周志华)5.2节。配套的其余参考提供了一些博客以便于更轻松地理解相关概念。 11 | |知识点|教材|视频|代码| 12 | |:------|------|------|----:| 13 | |线性回归|[机器学习讲义-线性回归](https://scruel.gitee.io/ml-andrewng-notes/week1.html)|[机器学习(吴恩达)[P10-P19]](https://www.bilibili.com/video/BV1cv4y1W7A3?p=9&vd_source=ef6bc9d073dccb208fb608bc99286677)|[线性回归notebook](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B/NoteBooks/1.linear_regression)| 14 | |Logistic回归|[机器学习讲义-Logistic回归](https://scruel.gitee.io/ml-andrewng-notes/week3.html)| [机器学习(吴恩达)[P32-P41]](https://www.bilibili.com/video/BV1cv4y1W7A3?p=32&vd_source=ef6bc9d073dccb208fb608bc99286677) |[Logistic回归notebook](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B/NoteBooks/2.logistic_regression)| 15 | |感知机|《机器学习》(周志华)|[解读西瓜书](https://www.bilibili.com/video/BV1dM411k7q5?p=24&vd_source=ef6bc9d073dccb208fb608bc99286677)| 16 | |神经网络|[机器学习讲义-神经网络](https://scruel.gitee.io/ml-andrewng-notes/week4.html)|[机器学习(吴恩达)[P51-P58]](https://www.bilibili.com/video/BV1cv4y1W7A3?p=51&vd_source=ef6bc9d073dccb208fb608bc99286677)|[神经网络notebook](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B/NoteBooks/4.nurual_network_back_propagation)| 17 | 18 | ## 传统机器学习算法 19 | 这一部分我们学习非神经网络的机器学习分支,只需要学习一些简单的算法,主要涉及分类、聚类、降维三部分。首先从**K近邻**算法入手,KNN是最简单的分类算法之一,掌握其具体的算法步骤和概念即可。**K-Means**是最为常用的聚类算法,只需要学习最简单的K-Means聚类即可。PCA算法是一种降维算法,可以实现数据的降维与压缩。**支持向量机**是除神经网络外最常用的机器学习分类算法,速通阶段只需要学习线性的SVM即可。这一部分的教程主要参考西瓜书,配套b站解读西瓜书系列视频学习。学习完之后阅读并运行相关代码。 20 | |知识点|教程|视频|代码| 21 | |:------|------|------|----:| 22 | 
|KNN|《机器学习》(周志华)[10.1]|[解读西瓜书-K近邻](https://www.bilibili.com/video/BV1dM411k7q5?p=52&vd_source=ef6bc9d073dccb208fb608bc99286677)|[KNN notebook](https://github.com/UNIC-Lab/AI_Course/blob/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B/NoteBooks/knn-notebook.ipynb)| 23 | |K-Means|机器学习讲义|[KMeans聚类](https://www.bilibili.com/video/BV17Y4y1v7XH?p=3&vd_source=ef6bc9d073dccb208fb608bc99286677)|[K-Means notebook](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B/NoteBooks/7.kmeans_and_PCA)| 24 | |PCA降维|《机器学习》(周志华)[10.3]|[PCA降维](https://www.bilibili.com/video/BV1QS4y1e7y6/?spm_id_from=333.337.search-card.all.click&vd_source=ef6bc9d073dccb208fb608bc99286677)|[PCA notebook](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B/NoteBooks/7.kmeans_and_PCA)| 25 | |贝叶斯分类 |《机器学习》(周志华)[7.1-7.3]|[解读西瓜书-贝叶斯分类器](https://www.bilibili.com/video/BV1dM411k7q5?p=35&vd_source=ef6bc9d073dccb208fb608bc99286677)| 26 | |*支持向量机|《机器学习》(周志华)[6.1-6.2]|[解读西瓜书-支持向量机](https://www.bilibili.com/video/BV1dM411k7q5?p=29&vd_source=ef6bc9d073dccb208fb608bc99286677)| 27 | 28 | 29 | ## 模型评估与调优 30 | 31 | 这一部分主要参考《机器学习》(周志华)。 32 | 模型评估方面:掌握常用的模型评估方法与交叉验证。 33 | 性能度量方面:了解混淆矩阵的概念,以及精准率和召回率等指标。 34 | 35 | |知识点|教程|视频| 36 | |:------|------|------:| 37 | |模型评估与交叉验证|《机器学习》(周志华)[2.2]|[机器学习初步(周志华)[P12]](https://www.bilibili.com/video/BV1xs4y1x7Uf?p=12&vd_source=ef6bc9d073dccb208fb608bc99286677)| 38 | |过拟合、偏差与方差|《机器学习》(周志华)[2.1]|[机器学习初步(周志华)[P10]](https://www.bilibili.com/video/BV1xs4y1x7Uf?p=10&vd_source=ef6bc9d073dccb208fb608bc99286677)| 39 | |性能度量|《机器学习》(周志华)[2.3]|[机器学习初步(周志华)[P14]](https://www.bilibili.com/video/BV1xs4y1x7Uf?p=14&vd_source=ef6bc9d073dccb208fb608bc99286677)| 40 | 41 | 42 | 43 | -------------------------------------------------------------------------------- /基础知识速通/Python教程/README.md: 
-------------------------------------------------------------------------------- 1 | - [写在前面](#写在前面) 2 | - [Python介绍与安装](#python介绍与安装) 3 | - [Python介绍](#python介绍) 4 | - [Python安装](#python安装) 5 | - [Python虚拟环境管理](#python虚拟环境管理) 6 | - [Anaconda安装](#anaconda安装) 7 | - [Anaconda使用](#anaconda使用) 8 | - [代码编辑器](#代码编辑器) 9 | - [Jupyter Notebook](#jupyter-notebook) 10 | - [VSCode](#vscode) 11 | - [Pycharm](#pycharm) 12 | - [Python基础知识](#python基础知识) 13 | - [第一阶段:基础语法](#第一阶段基础语法) 14 | - [第二阶段\*:高阶特性](#第二阶段高阶特性) 15 | - [第三阶段:数据处理](#第三阶段数据处理) 16 | - [基础语法视频教程](#基础语法视频教程) 17 | 18 | # 写在前面 19 | # Python介绍与安装 20 | Python版本、安装等 21 | ## Python介绍 22 | 23 | 24 | 25 | ## Python安装 26 | Python目前有两个版本,一个是2.x版,一个是3.x版,这两个版本是不兼容的。由于3.x版越来越普及,我们的教程建议使用Python 3.6及以上版本。 27 | - Windows安装: 28 | 首先,根据你的Windows版本(64位还是32位)从Python的官方网站下载Python 3.5对应的[64位安装程序](https://www.python.org/ftp/python/3.5.0/python-3.5.0-amd64.exe)或[32位安装程序](https://www.python.org/ftp/python/3.5.0/python-3.5.0.exe)(或者移步[国内镜像](http://pan.baidu.com/s/1sjqOkFF)),然后,运行下载的EXE安装包: 29 | 30 | ![Python基础教程-廖雪峰](https://user-images.githubusercontent.com/90789521/237056090-d8ba5995-edab-4885-a124-96d7fa9adac2.jpg) 31 | # Python虚拟环境管理 32 | 介绍python环境的概念、conda安装、命令集 33 | ## Anaconda安装 34 | - 1. conda[下载地址](https://mirrors.tuna.tsinghua.edu.cn/anaconda/archive/),注意后缀,下载符合系统的版本 35 | - 2. 安装过程选项: 36 | 37 | ![image](https://user-images.githubusercontent.com/90789521/236983479-e2d57e96-72a3-4cf3-87a1-83069a85661f.png) 38 | 39 | ![image](https://user-images.githubusercontent.com/90789521/236983517-c2456030-0923-4821-bda4-7e55da95f7ec.png) 40 | 41 | ![image](https://user-images.githubusercontent.com/90789521/236983670-3a8472fe-17e9-4ce8-95bf-0e5de8962f05.png) 42 | 43 | ![image](https://user-images.githubusercontent.com/90789521/236983691-85eb151f-f2e9-4245-8322-9b4da4b71d1a.png) 44 | 45 | ![image](https://user-images.githubusercontent.com/90789521/236983715-3c4bf730-bdc1-4490-b63b-04393dc6ff4c.png) 46 | - 3. 
测试: 47 | 48 | cmd命令行输入:`conda --version`,输出conda版本即安装成功且环境变量配置成功 49 | 50 | ## Anaconda使用 51 | 52 | conda可以理解为一个工具,也是一个可执行命令,其核心功能是包管理和环境管理。包管理与pip的使用方法类似,环境管理则是允许用户方便地安装不同版本的python环境并在不同环境之间快速地切换。 53 | - 查看conda中的环境使用命令: 54 | `conda info -e`或者`conda env list` 55 | 56 | ![conda list](https://user-images.githubusercontent.com/90789521/236985934-5118e2f3-f2c9-4837-8baf-0ec9cc50cd99.png) 57 | - 创建环境: 58 | `conda create -n [env_name] python=x.x`,env_name为环境名,创建环境时可以直接指定python的版本 59 | 60 | 或者克隆环境 61 | 62 | `conda create -n your_name --clone env_name`,将env_name的环境克隆到your_name环境 63 | - 激活环境: 64 | 65 | Linux: `source activate [env_name]` `conda activate [env_name]` 66 | 67 | Windows: `activate [env_name]` `conda activate [env_name]` 68 | 69 | 其中`[env_name]`换成需要激活的环境名字 70 | 71 | - conda环境中第三方库的安装: 72 | 73 | ```conda install -n env_name [package] # 未激活环境``` 74 | 75 | ```conda install [package] # 如果已经激活环境``` 76 | 77 | 此外,也可以在激活环境后使用`pip install [package]`安装,`[package]`换成需要安装的库的名字。但是注意尽量别混合使用`conda install`和`pip install`,存在依赖的包必须用同一种形式安装 78 | - 关闭环境:`conda deactivate` 79 | 80 | - 删除环境:`conda remove -n env_name --all` 81 | 82 | - 关于镜像:镜像网站主要作用是在安装第三方库的时候加速库的下载,使用清华源配置conda镜像,配置命令如下: 83 | ``` 84 | conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/msys2/ 85 | conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/conda-forge/ 86 | conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/ 87 | conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/menpo/ 88 | conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/bioconda/ 89 | conda config --add channels https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main/ 90 | conda config --set show_channel_urls yes 91 | ``` 92 | 93 | 但镜像不是万能的,具体项目可能会有些包不支持镜像,镜像删除命令如下: 94 | 95 | `conda config --remove-key channels` 96 | 97 | 如果并不是所有时候都用到镜像,可以通过在安装具体库的时候指定镜像:`pip install [package] -i 
https://pypi.tuna.tsinghua.edu.cn/simple`或者`conda install [package] -c [mirror]` 98 | 99 | # 代码编辑器 100 | 主要推荐的编辑器和IDE包括`Jupyter Notebook`、`Visual studio code`、`Pycharm`,其中Jupyter Notebook是以网页形式打开的程序,可以在页面中直接编写和运行代码,运行结果会直接显示在代码块下方。如在编程过程中需要编写说明文档,可在同一个页面中直接编写,便于作及时的说明和解释,对新手友好。VSCode是一个简洁高效的代码编辑器,同时支持调试、任务执行、版本管理等开发操作。Pycharm则是比较专业的针对Python的IDE。 101 | ## Jupyter Notebook 102 | Jupyter的安装和所有第三方库一样,先激活环境,再使用pip或者conda安装,推荐使用镜像: 103 | `pip install jupyter -i https://pypi.tuna.tsinghua.edu.cn/simple` 104 | `conda install jupyter`(注意conda不支持`-i`参数,镜像通过前述channel配置生效) 105 | 安装完成后在激活环境后的命令行面板中运行`jupyter notebook`,即可打开notebook页面。 106 | Jupyter Notebook的使用教程可以参考[Blog](https://zhuanlan.zhihu.com/p/32320214) 107 | 108 | ## VSCode 109 | VSCode是轻量级的开源代码编辑器,页面简洁,并且支持大量的扩展程序,可玩性较高,并且并不只是支持Python开发。VSCode的安装一般不会出现问题,直接[官网](https://code.visualstudio.com/)下载`.exe`文件进行安装即可。VSCode运行`.py`脚本首先需要安装python扩展,并且选择解释器,解释器可以选择conda中创建的环境。 110 | 具体的使用可以参考[Blog](https://zhuanlan.zhihu.com/p/112431369),更多玩法可以自己探索。 111 | 112 | ## Pycharm 113 | Pycharm相对于vscode更为专业,功能也更强大,但比较繁杂。Pycharm只有专业版可以连接远程服务器,所以尽量下载专业版,可以通过JetBrains的师生认证,也可以找一些都懂的资源。同时使用vscode和pycharm的话需要注意一点,两个编辑器识别文件路径的逻辑是不一样的。Pycharm可以针对项目创建环境(即解释器),也支持加载本地venv环境以及conda中环境。具体使用教程可以参考[Blog](https://zhuanlan.zhihu.com/p/231064736)。 114 | 115 | # Python基础知识 116 | 117 | ## 第一阶段:基础语法 118 | 119 | 第一阶段参照《Python编程:从入门到实践》中的第一部分,并参考知识点基于jupyter notebook进行编程运行,每个知识点有可供参考的notebook文件,教材中的知识点更细致。最好将项目git到本地进行运行,把一些可能遇到的环境配置的问题在学习初期就解决了。当然也可以使用Colab运行代码学习。 120 | [copyright](https://github.com/shibing624/python-tutorial/) 121 | 122 | | Notebook | 知识点 | Colab | 123 | |:----------|:-------------|---------:| 124 | | [01_字符串类型_str.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/01_字符串类型_str.ipynb) | Python字符串类型 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/01_字符串类型_str.ipynb) | 125 | | 
[02_列表类型_list.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/02_列表类型_list.ipynb) | Python列表类型 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/02_列表类型_list.ipynb) | 126 | | [03_元组类型_tuple.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/03_元组类型_tuple.ipynb) | Python元组 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/03_元组类型_tuple.ipynb) | 127 | | [04_字典类型_dict.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/04_字典类型_dict.ipynb) | Python字典 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/04_字典类型_dict.ipynb) | 128 | | [05_集合类型_set.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/05_集合类型_set.ipynb) | Python集合 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/05_集合类型_set.ipynb) | 129 | | [06_条件判断_if.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/06_条件判断_if.ipynb) | Python条件判断 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/06_条件判断_if.ipynb) | 130 | | [07_列表推导式.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/07_列表推导式.ipynb) | Python列表推导式 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/07_列表推导式.ipynb) | 131 | | [08_循环结构_loop.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/08_循环结构_loop.ipynb) | Python循环 |[![Open 
In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/08_循环结构_loop.ipynb) | 132 | | [09_函数和模块.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/09_函数和模块.ipynb) | Python函数 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/09_函数和模块.ipynb) | 133 | | [10_文件和异常.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/10_文件和异常.ipynb) | Python文件和异常 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/10_文件和异常.ipynb) | 134 | | [11_线程和进程.ipynb](https://github.com/shibing624/python-tutorial/blob/master/01_base/11_线程和进程.ipynb) | Python多线程和多进程 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/01_base/11_线程和进程.ipynb) | 135 | | [12_面向对象编程.ipynb](https://github.com/shibing624/python-tutorial/blob/master/02_advanced/07_面向对象编程.ipynb) | Python类 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/02_advanced/07_面向对象编程.ipynb) | 136 | 137 | ## 第二阶段*:高阶特性 138 | 139 | 掌握了Python的基础知识,并且对Python的运行逻辑具有一定的了解后,可以学习一些高级特性。这部分不作强制要求,如果感兴趣,可以选择性学习。推荐了解高阶函数和迭代器。 140 | [copyright](https://github.com/shibing624/python-tutorial/) 141 | 142 | | Notebook | 知识点 | Colab | 143 | |:----------|:-------------|---------:| 144 | | [01_系统交互_os.ipynb](https://github.com/shibing624/python-tutorial/blob/master/02_advanced/01_系统交互_os.ipynb) | Python系统交互操作 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/02_advanced/01_系统交互_os.ipynb) 
| 145 | | [02_数据库_sql.ipynb](https://github.com/shibing624/python-tutorial/blob/master/02_advanced/02_数据库_sql.ipynb) | Python操作mysql数据库 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/02_advanced/02_数据库_sql.ipynb) | 146 | | [03_高阶函数.ipynb](https://github.com/shibing624/python-tutorial/blob/master/02_advanced/03_高阶函数.ipynb) | map、filter、lambda高阶函数 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/02_advanced/03_高阶函数.ipynb) | 147 | | [04_迭代器与生成器.ipynb](https://github.com/shibing624/python-tutorial/blob/master/02_advanced/04_迭代器与生成器.ipynb) | 迭代器和yield生成器 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/02_advanced/04_迭代器与生成器.ipynb) | 148 | | [05_上下文管理器.ipynb](https://github.com/shibing624/python-tutorial/blob/master/02_advanced/05_上下文管理器.ipynb) | with语句 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/02_advanced/05_上下文管理器.ipynb) | 149 | | [06_装饰器.ipynb](https://github.com/shibing624/python-tutorial/blob/master/02_advanced/06_装饰器.ipynb) | Decorator装饰器 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/02_advanced/06_装饰器.ipynb) | 150 | 151 | ## 第三阶段:数据处理 152 | 153 | 学习一些机器学习和数据科学需要用到的库,在后续机器学习和深度学习过程中,numpy是非常必要的。Matplotlib是Python的画图工具。 154 | [copyright](https://github.com/shibing624/python-tutorial/) 155 | 156 | | Notebook | 知识点 | Colab | 157 | |:----------|:-------------|---------:| 158 | | [01_Numpy数组.ipynb](https://github.com/shibing624/python-tutorial/blob/master/03_data_science/01_Numpy数组.ipynb) | Numpy array数组 |[![Open In 
Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/03_data_science/01_Numpy数组.ipynb) | 159 | | [02_Numpy索引.ipynb](https://github.com/shibing624/python-tutorial/blob/master/03_data_science/02_Numpy索引.ipynb) | Numpy index索引 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/03_data_science/02_Numpy索引.ipynb) | 160 | | [03_Numpy方法.ipynb](https://github.com/shibing624/python-tutorial/blob/master/03_data_science/03_Numpy方法.ipynb) | Numpy 方法 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/03_data_science/03_Numpy方法.ipynb) | 161 | | [04_Matpoltlib画图.ipynb](https://github.com/shibing624/python-tutorial/blob/master/03_data_science/04_Matpoltlib画图.ipynb) | Matpoltlib画图 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/03_data_science/04_Matpoltlib画图.ipynb) | 162 | | [05_SciPy统计分布.ipynb](https://github.com/shibing624/python-tutorial/blob/master/03_data_science/05_SciPy统计分布.ipynb) | Scipy统计分布 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/03_data_science/05_SciPy统计分布.ipynb) | 163 | | [06_SciPy曲线拟合.ipynb](https://github.com/shibing624/python-tutorial/blob/master/03_data_science/06_SciPy曲线拟合.ipynb) | Scipy曲线 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/03_data_science/06_SciPy曲线拟合.ipynb) | 164 | | [07_Pandas数据类型.ipynb](https://github.com/shibing624/python-tutorial/blob/master/03_data_science/07_Pandas数据类型.ipynb) | Pandas数据类型 |[![Open In 
Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/03_data_science/07_Pandas数据类型.ipynb) | 165 | | [08_Pandas数据操作.ipynb](https://github.com/shibing624/python-tutorial/blob/master/03_data_science/08_Pandas数据操作.ipynb) | Pandas操作 |[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/shibing624/python-tutorial/blob/master/03_data_science/08_Pandas数据操作.ipynb) | 166 | 167 | 168 | # 基础语法视频教程 169 | [视频教程](https://www.bilibili.com/video/BV1qW4y1a7fU/?spm_id_from=333.337.search-card.all.click) 170 | 171 | -------------------------------------------------------------------------------- /基础知识速通/README.md: -------------------------------------------------------------------------------- 1 | # 从Python编程到强化学习 2 | 本教程主要针对有一定数学基础的同学快速入门深度学习与强化学习。学习路线制定为:[Python基础](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Python%E6%95%99%E7%A8%8B)----->[机器学习基础](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B)----->[深度学习基础](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Deep%20Learning%E6%95%99%E7%A8%8B)----->强化学习基础。本教程每一部分均配有相关notebook代码,需要自行调试运行,感兴趣的也可以自己复现,建议将教程中的代码down到本地运行。教材电子版已上传[资料]或给出了地址链接,视频课程每部分也给出了具体课程地址,具体查阅每部分所属文件夹。 3 | 4 | - [***Python基础***](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Python%E6%95%99%E7%A8%8B): 5 | 主要参考教材《Python编程:从入门到实践》第一部分和notebook代码学习,首先学习Python安装以及环境管理工具conda的安装与使用。然后本教程将Python的相关语法和工具的学习分为三个阶段。 6 | - **第一阶段主要学习基础语法部分**,如果能够看懂的话可以直接调试运行所给的代码例子,否则建议阅读《Python编程:从入门到实践》第一部分中的对应章节。 7 | - **第二阶段是学习Python的一些高阶特性**,这些高阶特性不是必要的,但是可以提高编程效率,并且以后会遇到的项目里面可能会存在高阶特性的语法。这一部分是选学内容,建议掌握迭代器与生成器、高阶函数这两节。 8 | - 
**第三阶段学习一些数据科学需要用到的库**,包括数据处理和绘图。掌握基本语法后,可以直接参考notebook中的内容进行学习,建议仔细学习numpy部分,后续可能会以较高的频率使用。这一部分如果理解有困难或者有兴趣进一步学习,可以参考教材[《Python数据科学手册》](https://github.com/wangyingsm/Python-Data-Science-Handbook/blob/master/printable/README.md) 9 | 10 | - [***机器学习基础***](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Machine%20Learning%E6%95%99%E7%A8%8B): 11 | 机器学习基础部分主要参考教程有 <机器学习-吴恩达> 视频课程和教材《机器学习》(周志华),并给出了每个部分的notebook代码。机器学习基础被分为三个部分, 12 | - **第一部分学习从线性回归到神经网络**,主要介绍神经网络的机器学习基础。这一部分主要参考吴恩达的机器学习课程,并配套课程讲义,这一部分比较重要,是需要重点掌握的部分。 13 | - **第二部分学习传统的机器学习算法**,介绍一些非神经网络分支的机器学习算法。主要聚焦于分类、聚类、降维任务,教程中所列出的均为比较简单且经典的机器学习算法,理解算法思路、用途即可。配套教材为西瓜书,西瓜书会涉及到一些数学推导,可直接按照所列出的视频课程<解读西瓜书>学习。 14 | - **第三部分学习一些模型评估与调优的方法**,这一部分主要涉及到一些理论,没有配套代码,但也是比较重要的部分。主要参考西瓜书学习,视频课程参考周志华老师的<机器学习初步>。 15 | 16 | - [***深度学习基础***](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%9F%BA%E7%A1%80%E7%9F%A5%E8%AF%86%E9%80%9F%E9%80%9A/Deep%20Learning%E6%95%99%E7%A8%8B): 17 | 基于上述Python基础和机器学习基础,可以很快入门深度学习。本教程主要参考《动手学深度学习》第二版教材和李沐老师的<动手学深度学习>课程,编程部分主要基于Pytorch框架。 18 | - 第一步请尝试**快速入门Pytorch框架**,教程中给出了快速入门教程地址,理解Pytorch的网络实现逻辑,以及基本的网络构建和训练过程即可,更具体的部分可以阅读官方文档,也可以参考后续教程所给出的项目中调用的API,自行查阅API用法。 19 | - 第二步按照《动手学深度学习》教材和配套课程学习**常用的一些深度神经网络**,主要分为卷积神经网络、循环神经网络、注意力机制,每一部分给出了教材地址和视频课程部分。每一部分学完之后,请阅读并运行所给出的项目,其中注意力机制中给出了完整的Transformer项目代码,请仔细阅读,理解项目的结构与编写风格,培养阅读和编写项目的能力。 20 | - 第三步主要学习**优化算法和调参炼丹技巧**。梯度下降是这些优化算法的基础,学习过程中需要了解一下这些优化算法的改进点,并了解对应的优化器以及优化器参数。调参炼丹技巧参考google研究院的教程,调参是提升网络性能的重要手段,这一部分是必要掌握的内容。 21 | 22 | - ***强化学习基础*** 23 | 24 | 掌握了前面的编程基础和深度学习基础,入门强化学习还需要一定的概率论知识,这部分请自行复习。本教程快速入门强化学习主要参考王树森的[强化学习课程](https://www.bilibili.com/video/BV12o4y197US/?spm_id_from=333.337.search-card.all.click&vd_source=ef6bc9d073dccb208fb608bc99286677),以及[对应讲义](https://github.com/wangshusen/DRL)。这个课程只有几节课程,虽然可以帮助快速入门强化学习,但不够深入,更深入的学习可以参考[强化学习进阶](https://github.com/UNIC-Lab/AI_Course/tree/main/%E5%BC%BA%E5%8C%96%E5%AD%A6%E4%B9%A0%E8%BF%9B%E9%98%B6)部分。 25 | 26 | 27 | 
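上述学习路线里反复出现"线性回归"与"梯度下降"两个概念,下面给出一个极简的纯Python示意代码,在进入框架学习之前先直观感受"用梯度下降拟合线性模型"的过程。示例中的数据与超参数(真实参数2和1、学习率0.5、迭代次数等)均为演示用的假设值,并非教程指定内容:

```python
import random

# 构造带少量噪声的训练数据:y = 2x + 1
random.seed(0)
data = [(x / 50.0, 2 * (x / 50.0) + 1 + random.gauss(0, 0.01))
        for x in range(-50, 51)]

# 用梯度下降最小化均方误差损失 L = mean((w*x + b - y)^2)
w, b, lr = 0.0, 0.0, 0.5
n = len(data)
for _ in range(500):
    # 先同时计算两个偏导数,再更新参数
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(f"w≈{w:.2f}, b≈{b:.2f}")  # 应接近真实参数 2 和 1
```

实际学习中,吴恩达课程及配套notebook里的实现使用numpy做向量化计算,原理与此一致,只是效率更高。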
-------------------------------------------------------------------------------- /基础知识速通/资料/README.me: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /强化学习进阶/README.md: -------------------------------------------------------------------------------- 1 | # ***强化学习进阶*** 2 | 文档给出了强化学习的大致概念和经典的论文,速通可以参考博客,当然博客质量参差不齐,需要细细斟酌。基础理论如Bellman方程等建议手动推一遍。相关算法部分列出的是需要重点掌握的强化学习模型,速通可以参考相关博客,给出的论文都是影响力比较高的,大多有比较严格的理论推导和证明,有兴趣可以细推,加粗的算法是必须掌握的,其余的是影响力比较高的选学内容。当然也可以跟着课程或者书籍更系统地学习,其中CS285和Reinforcement Learning是公认质量比较高的课程。比较推荐的是根据[openai spinningup教程](https://spinningup.qiwihui.com/zh_CN/latest/)系统地学习相关算法及其实现方法。 3 | 4 | 基础理论和几个核心概念是强化学习的重点基础,理解不清楚可能会在后面的学习中逐渐懵逼。建议先从Markov决策过程的原理和概念入手,掌握值函数、策略的概念,理解蒙特卡罗方法、时间差分方法和动态规划方法的异同与优劣。 5 | 6 | 基于值函数的强化学习算法部分,建议先学习Q-Learning算法和Sarsa算法,这两个算法是强化学习中比较基础的算法,对于理解和掌握基于值函数的强化学习算法有很大的帮助。然后学习DQN算法、Double DQN算法、Dueling DQN算法、Prioritized Experience Replay算法和Rainbow算法等高级算法。 7 | 8 | 基于策略梯度的强化学习算法部分,建议先学习REINFORCE算法和Actor-Critic算法,这两个算法是基于策略梯度的强化学习算法中比较基础的算法,对于理解和掌握基于策略梯度的强化学习算法有很大的帮助。然后学习A3C算法、TRPO算法和PPO算法等高级算法。 9 | 10 | 基于动作价值和策略梯度的强化学习算法部分,建议先学习DPG和DDPG算法,这两个算法是基于动作价值和策略梯度的强化学习算法中比较基础的算法,对于理解和掌握这类算法有很大的帮助。接着,可以学习TD3算法和SAC算法等高级算法。 11 | ## 一、强化学习基础理论与核心概念 12 | 13 | 第一节中所介绍的值函数、MC方法、TD方法等是对算法的归类,因此与后续介绍的QLearning、Sarsa等比较基础的算法有交叉,学习过程中需要对应具体算法的教程进行学习,理解算法具体流程的同时学习归纳基于值函数和基于策略的概念,以及基于MC和基于TD的区别与优劣。 14 | - 1.0 **强化学习基础理论**: 第一阶段需要理解强化学习的核心理论,学习强化学习的四要素以及策略迭代、策略梯度两种方式。可以参照教材,然后理解值函数和策略的概念,并掌握基于蒙特卡洛的强化学习方法和基于TD的方法,并理解基于蒙特卡洛和基于TD两种方法中方差和偏差的概念。-->西瓜书[Ch.16] [神经网络与深度学习[14.1]](https://nndl.github.io/) 15 | - 1.1 **Markov决策过程** 16 | - Blogs: [马尔可夫决策过程原理与代码实现](https://blog.csdn.net/qq_41297934/article/details/105104684) 17 | - Paper: [Markov decision processes: Concepts and 
algorithms](https://www.writebug.com/git/awan/aicar/raw/commit/f779d0e788f6ba0aeb45d8d31f1384c09c236afe/references/Markov%20Decision%20Processes%20Concepts%20and%20Algorithms.pdf) 18 | - 1.2 **值函数和策略** 19 | - [神经网络与深度学习[14.2-14.3]](https://nndl.github.io/) 20 | - Blogs: [基于值和策略的强化学习入坑](https://zhuanlan.zhihu.com/p/54825295) 21 | - 1.4 **蒙特卡罗方法** 22 | - Blogs: [强化学习 - 蒙特卡罗法(Monte Carlo Methods)](https://zhuanlan.zhihu.com/p/72715842) 23 | - 1.5 **时间差分方法** 24 | - Blogs:[强化学习 - 时间差分学习(Temporal-Difference Learning)](https://zhuanlan.zhihu.com/p/73083240) 25 | - Paper: [Temporal difference learning and TD-gammon](https://dl.acm.org/doi/10.1145/203330.203343) 26 | - 1.6 动态规划方法 27 | - Blogs: [强化学习 - 动态规划(Dynamic Programming)](https://zhuanlan.zhihu.com/p/72360992) 28 | ## 二、基于值函数的强化学习算法 29 | - **Q-Learning算法** 30 | - Paper: [Q Learning](https://link.springer.com/article/10.1007/BF00992698) 31 | - **Sarsa算法** 32 | - Paper: [On-line Q-learning using connectionist systems](https://www.researchgate.net/profile/Mahesan-Niranjan/publication/2500611_On-Line_Q-Learning_Using_Connectionist_Systems/links/5438d5db0cf204cab1d6db0f/On-Line-Q-Learning-Using-Connectionist-Systems.pdf?_sg%5B0%5D=HYd0h230b7WOR6m4hj5yx01K97aS61Z0DufUURMQr9ZqMqcEVZ0dNpG84h6uCfRl_M40FNkXgRX-GnpnxH31Ww.jBF3fgrlhaJYs3bDEaHQU22nRpKP0zKeF_oOsqh7WddL8pfxAomPSbeANzdmLP9YPB26HbLeSaEJqhFgzIxvWQ&_sg%5B1%5D=CZtZhHTEMgSwBZrpZU_7BACd8RH04JUKiITdXRQJ6MQ9SFS27jreZmcsuNcqYYWRoxcwBE-xBMbrfl1QobmEZ65bmkmpzonq5JoLRIIUKXne.jBF3fgrlhaJYs3bDEaHQU22nRpKP0zKeF_oOsqh7WddL8pfxAomPSbeANzdmLP9YPB26HbLeSaEJqhFgzIxvWQ&_iepl=) 33 | - **DQN算法** 34 | - Paper: [Human-level control through deep reinforcement learning](https://www.nature.com/articles/nature14236/?source=post_page---------------------------) 35 | - Double DQN算法 36 | - Paper: [Deep Reinforcement Learning with Double Q-Learning](https://ojs.aaai.org/index.php/AAAI/article/view/10295) 37 | - Dueling DQN算法 38 | - Paper: [Dueling Network Architectures for Deep Reinforcement 
Learning](http://proceedings.mlr.press/v48/wangf16.html) 39 | - Prioritized Experience Replay算法 40 | - Paper: [Prioritized Experience Replay](https://arxiv.org/abs/1511.05952) 41 | - Rainbow算法 42 | - Paper: [Rainbow: Combining Improvements in Deep Reinforcement Learning.](https://ojs.aaai.org/index.php/AAAI/article/view/11796) 43 | ## 三、基于策略梯度的强化学习算法 44 | - **REINFORCE算法** 45 | - Paper: [Simple Statistical Gradient-Following Algorithms for Connectionist Reinforcement Learning](https://link.springer.com/chapter/10.1007/978-1-4615-3618-5_2) 46 | - **Actor-Critic算法** 47 | - Paper: [Actor-Critic Algorithms](https://proceedings.neurips.cc/paper/1999/hash/6449f44a102fde848669bdd9eb6b76fa-Abstract.html) 48 | - A3C算法 49 | - Paper: [Asynchronous Methods for Deep Reinforcement Learning.](http://proceedings.mlr.press/v48/mniha16.html?ref=https://githubhelp.com) 50 | - TRPO算法 51 | - Paper: [Trust Region Policy Optimization](https://proceedings.mlr.press/v37/schulman15.html) 52 | - PPO算法 53 | - Paper: [Proximal Policy Optimization Algorithms](https://arxiv.org/abs/1707.06347) 54 | ## 四、基于动作价值和策略梯度的强化学习算法 55 | - **DDPG算法** 56 | - Paper: [Deterministic Policy Gradient Algorithms](http://proceedings.mlr.press/v32/silver14.html)、 [Continuous control with deep reinforcement learning](https://arxiv.org/abs/1509.02971) 57 | - TD3算法 58 | - Paper: [Addressing Function Approximation Error in Actor-Critic Methods](https://proceedings.mlr.press/v80/fujimoto18a.html) 59 | - SAC算法 60 | - Paper: [Soft Actor-Critic: Off-Policy Maximum Entropy Deep Reinforcement Learning with a Stochastic Actor](https://proceedings.mlr.press/v80/haarnoja18b) 61 | ## 五、基于模型的强化学习算法 62 | - Model-Based RL 63 | - MBRL with Planning and Prediction 64 | - MBRL with World Models 65 | 66 | ## 六、相关学习资源: 67 | 68 | - 书籍与教程 69 | 70 | - 《深度强化学习》(丁负、贾志刚等) 71 | 72 | - [《强化学习导论》(Sutton著)](https://rl.qiwihui.com/zh_CN/latest/) 73 | 74 | - 《强化学习:原理、算法与应用》(李宏毅、王宇飞、李剑飞著) 75 | 76 | - 《深度强化学习实战》(王汝建等著) 77 | 78 | - [OpenAI 
spinningup强化学习教程](https://spinningup.qiwihui.com/zh_CN/latest/)(英文原版:https://spinningup.openai.com/en/latest/spinningup/rl_intro.html) 79 | 80 | - 视频课程 81 | 82 | - [UCB CS285](https://www.bilibili.com/video/BV12341167kL/?spm_id_from=333.337.search-card.all.click&vd_source=ef6bc9d073dccb208fb608bc99286677) 83 | 84 | - [《Reinforcement Learning》(UCL. David Silver)](https://www.davidsilver.uk/teaching/) 85 | 86 | - [深度强化学习-李宏毅](https://www.bilibili.com/video/av24724071/?from=search&seid=9547815852611563503&vd_source=ef6bc9d073dccb208fb608bc99286677) 87 | 88 | -------------------------------------------------------------------------------- /联邦学习进阶/README.md: -------------------------------------------------------------------------------- 1 | # 联邦学习进阶 2 | ## 一.联邦学习基础概念理解 3 | 联邦学习(Federated Learning)最初由Google的研究科学家McMahan等人在2016年提出,其旨在通过在设备上进行本地训练,并仅将更新的模型参数聚合起来,从而在不公开任何个人数据的情况下进行机器学习。 4 | 5 | 联邦学习的起源可以追溯到分散式学习(Decentralized Learning)。分散式学习是在不同设备上训练模型并协作学习的过程,其中每个设备都拥有自己的本地数据集。然而,在分散式学习中,设备之间通常需要交换数据和模型,这可能导致隐私和安全问题。 6 | 7 | 因此,联邦学习被提出来解决这些问题:它通过让设备在本地训练模型,仅将更新的模型参数上传到云端进行聚合,从而保护个人隐私和数据安全。自提出以来,联邦学习已经被广泛应用于医疗、金融、智能交通等各种场景。 8 | 9 | 联邦学习的特点: 10 | 1.各方数据都保留在本地,不泄露隐私也不违反法规。 11 | 12 | 2.多个参与者联合各自数据建立虚拟的公有模型,形成共同获益的体系。 13 | 14 | 3.在联邦学习的体系下,各个参与者的身份和地位平等。 15 | 16 | 4.联邦学习的建模效果和将整个数据放在一处建模的效果相同或相差不大(在各个数据的用户对齐(user alignment)或特征对齐(feature alignment)的条件下)。 17 | 联邦迁移学习则是在用户或特征不对齐的情况下,通过在数据间交换加密参数达到知识迁移的效果。 18 | 19 | 5.联邦学习使多个参与方在保护数据隐私、满足合法合规要求的前提下持续进行机器学习,解决数据孤岛问题。 20 | ## 二.联邦学习的学习路径 21 | 1.机器学习基础知识:在学习联邦学习之前,需要具备一定的机器学习基础知识,包括监督学习、无监督学习、深度学习等。 22 | 23 | 2.分布式系统基础知识:联邦学习是一种分散式学习方法,需要一定的分布式系统基础知识,如分布式计算、分布式数据库、分布式文件系统等。 24 | 25 | 3.隐私保护:联邦学习涉及到多个参与者之间的数据共享,因此需要对数据进行隐私保护。需要学习隐私保护的基本概念和技术,如差分隐私、同态加密、多方安全计算等。 26 | 27 | 4.联邦学习框架:学习联邦学习框架,如Google的TensorFlow Federated框架、基于PyTorch的PySyft框架等。 28 | ## 三.联邦学习相关论文分类汇总 29 | 开坑论文:Communication-Efficient Learning of Deep Networks from Decentralized Data 30 | 31 | 综述: 32 | Advances and Open Problems in Federated Learning 33 | 34 | Federated machine
learning: Concept and applications 35 | 36 | 联邦学习算法与通信优化: 37 | Privacy-Preserving Deep Learning 38 | 39 | Agnostic Federated Learning 40 | 41 | Bayesian Nonparametric Federated Learning of Neural Networks 42 | 43 | Federated learning with matched averaging (该文内容基于上面一篇文章的工作) 44 | 45 | Federated learning: Strategies for improving communication efficiency 46 | 47 | Federated multi-task learning 48 | 49 | Federated optimization in heterogeneous networks 50 | 51 | Fair resource allocation in federated learning (与上面一篇同作者) 52 | 53 | Communication-Efficient Federated Learning with Sketching 54 | 55 | FedBoost: Communication-Efficient Algorithms for Federated Learning 56 | 57 | Federated Learning with Only Positive Labels 58 | 59 | Scaffold: Stochastic controlled averaging for federated learning 60 | 61 | Federated Meta-Learning for Fraudulent Credit Card Detection 62 | 63 | Federated Learning with Communication Delay in Edge Networks 64 | 65 | FLFE: A Communication-Efficient and Privacy-Preserving Federated Feature Engineering Framework 66 | 67 | 联邦元学习: 68 | Federated learning with personalization layers 69 | 70 | Improving federated learning personalization via model agnostic meta learning 71 | 72 | 联邦学习安全 73 | Deep models under the GAN: information leakage from collaborative deep learning 74 | 75 | How to backdoor federated learning 76 | 77 | Quantification of the Leakage in Federated Learning 78 | 79 | Analyzing federated learning through an adversarial lens 80 | 81 | Deep leakage from gradients 82 | 83 | 联邦学习公平与贡献评估 84 | A Multi-player Game for Studying Federated Learning Incentive Schemes 85 | 86 | A Real-time Contribution Measurement Method for Participants in Federated Learning 87 | 88 | Collaborative Fairness in Federated Learning 89 | 90 | Hierarchically Fair Federated Learning 91 | 92 | Incentive design for efficient federated learning in mobile networks: A contract theory approach 93 | 94 | Measure contribution of participants in federated learning 95 | 96 | 
Profit Allocation for Federated Learning 97 | 98 | Interpret federated learning with shapley values 99 | 100 | 联邦学习与计算机视觉 101 | Federated Learning for Vision-and-Language Grounding Problems 102 | 103 | Performance Optimization for Federated Person Re-identification via Benchmark Analysis 104 | 105 | 联邦学习与推荐系统 106 | FedFast: Going Beyond Average for Faster Training of Federated Recommender Systems 107 | 108 | Non-iid 109 | On the convergence of fedavg on non-iid data 110 | 111 | From Online to Non-iid Batch Learning 112 | 113 | 数据集 114 | Real-world image datasets for federated learning 115 | 116 | ## 四.联邦学习与现有研究的关系 117 | ### 联邦学习与差分隐私理论的区别 118 | 联邦学习的特点使其可以被用来保护用户数据的隐私,但是它和大数据、数据挖掘领域中常用的隐私保护理论如差分隐私(Differential Privacy)、k-匿名(k-Anonymity)和l-多样性(l-Diversity)等方法还是有较大的差别。首先,联邦学习与传统隐私保护方法的原理不同:联邦学习通过加密机制下的参数交换方式保护用户数据隐私,加密手段包括同态加密等。与差分隐私不同,其数据和模型本身不会进行传输,因此在数据层面上不存在泄露的可能,也不违反GDPR等更严格的数据保护法案。而差分隐私、k-匿名和l-多样性等方法是通过在数据里加噪音,或者采用概括化的方法模糊某些敏感属性,直到第三方不能区分个体为止,从而以较高的概率使数据无法被还原,以此来保护用户隐私。但是,从本质上来说这些方法还是进行了原始数据的传输,存在着潜在被攻击的可能性,并且在GDPR等更严格的数据保护法案下,这种数据隐私保护方式可能不再适用。与之相比,联邦学习是对用户数据隐私保护更为有力的手段。 119 | 120 | ### 联邦学习与分布式机器学习的区别 121 | 横向联邦学习中多方联合训练的方式与分布式机器学习(Distributed Machine Learning)有部分相似的地方。分布式机器学习涵盖了多个方面,包括把机器学习中的训练数据分布式存储、计算任务分布式运行、模型结果分布式发布等,参数服务器(Parameter Server)是分布式机器学习中一个典型的例子。参数服务器作为加速机器学习模型训练过程的一种工具,将数据存储在分布式的工作节点上,通过一个中心式的调度节点调配数据分布和分配计算资源,以便更高效地获得最终的训练模型。而对于联邦学习而言,首先,横向联邦学习中的工作节点代表的是模型训练的数据拥有方,其对本地的数据具有完全的自治权限,可以自主决定何时加入联邦学习进行建模;相对地,在参数服务器中,中心节点始终占据着主导地位,因此联邦学习面对的是一个更复杂的学习环境。其次,联邦学习强调模型训练过程中对数据拥有方的数据隐私保护,是一种应对数据隐私保护的有效措施,能够更好地应对未来愈加严格的数据隐私和数据安全监管环境。 122 | 123 | ### 联邦学习与联邦数据库的关系 124 | 联邦数据库系统(Federated Database System)是将多个不同的单元数据库进行集成,并对集成后的整体进行管理的系统,它的提出是为了实现对多个独立的数据库进行相互操作。联邦数据库系统对单元数据库往往采用分布式存储的方式,并且在实际中各个单元数据库中的数据是异构的,因此,它和联邦学习在数据的类型与存储方式上有很多相似之处。但是,联邦数据库系统在各个单元数据库交互的过程中不涉及任何隐私保护机制,所有单元数据库对管理系统都是完全可见的。此外,联邦数据库系统的工作重心在插入、删除、查找、合并等各种数据库基本操作上,而联邦学习的目的是在保护数据隐私的前提下对各方数据建立一个联合模型,使数据中蕴含的各种模式与规律更好地为我们服务。 125 | 126 | ### 联邦学习与区块链的关系 127 |
区块链是一个基于密码学安全的分布式账本,方便验证且不可篡改。区块链2.0是一个去中心化的应用,通过使用开源的代码及分布式的存储和运行,保证极高的透明度和安全性,使数据不会被篡改。区块链的典型应用包括比特币(BTC)、以太坊(ETH)等。区块链与联邦学习都属于去中心化的网络:区块链是一种完全P2P(peer to peer)的网络结构,而在联邦学习中,第三方会承担汇聚模型、管理等功能。联邦学习与区块链均涉及密码学、加密算法等基础技术:区块链技术使用的加密算法包括哈希算法、非对称加密等;联邦学习中则使用同态加密等。从数据角度上看,区块链通过加密的方式在各个节点上记录了完整的数据,而联邦学习中,各方的数据均仅保留在本地。从奖励机制上看,区块链中不同节点之间通过竞争记账来获得奖励;联邦学习中,多个参与方通过共同学习提高模型训练效果,依据每一方的贡献来分配奖励。 128 | 129 | ### 联邦学习与多方安全计算的关系 130 | 在联邦学习中,用户的隐私与安全是重中之重。为了保护用户隐私,防止联邦学习应用被恶意方攻击,多方安全计算技术可以在联邦学习中被应用,成为联邦学习技术框架中的一部分。学术界已经展开利用多方安全计算来增强联邦学习安全性的研究。McMahan等人指出,联邦学习可以通过差分隐私、多方安全计算或它们的结合等技术来提供更强的安全保障。Bonawitz等人指出,联邦学习中可以利用多方安全计算,以安全的方式计算来自用户设备的模型参数更新的总和。Truex等人提出了一种利用差分隐私和多方安全计算来保护隐私的联邦学习方法。Liu等人提出将加性同态加密(AHE)应用于神经网络的多方计算。微众银行提出的开源联邦学习框架FATE中包含了多方安全计算的相关算子,方便应用方对多方安全计算进行高效的开发。 131 | ## 五.联邦学习的分类 132 | ### 横向联邦学习 133 | 在两个数据集的用户特征重叠较多而用户重叠较少的情况下,我们把数据集按照横向(即用户维度)切分,并取出双方用户特征相同而用户不完全相同的那部分数据进行训练,这种方法叫做横向联邦学习。比如有两家不同地区的银行,它们的用户群体分别来自各自所在的地区,相互的交集很小;但是它们的业务很相似,因此记录的用户特征是相同的。此时,就可以使用横向联邦学习来构建联合模型。Google在2017年提出了一个针对安卓手机模型更新的数据联合建模方案:在单个用户使用安卓手机时,不断在本地更新模型参数并将参数上传到云端,从而使特征维度相同的各数据拥有方建立联合模型。 134 | ### 纵向联邦学习 135 | 在两个数据集的用户重叠较多而用户特征重叠较少的情况下,我们把数据集按照纵向(即特征维度)切分,并取出双方用户相同而用户特征不完全相同的那部分数据进行训练,这种方法叫做纵向联邦学习。比如有两个不同机构,一家是某地的银行,另一家是同一个地方的电商。它们的用户群体很有可能包含该地的大部分居民,因此用户的交集较大;但是,由于银行记录的都是用户的收支行为与信用评级,而电商则保有用户的浏览与购买历史,因此它们的用户特征交集较小。纵向联邦学习就是将这些不同特征在加密的状态下加以聚合,以增强模型能力的联邦学习。目前,逻辑回归模型、树型结构模型和神经网络模型等众多机器学习模型已经逐渐被证实能够建立在这个联邦体系上。 136 | ### 联邦迁移学习 137 | 在两个数据集的用户与用户特征重叠都较少的情况下,我们不对数据进行切分,而可以利用迁移学习来克服数据或标签不足的情况,这种方法叫作联邦迁移学习。 138 | 比如有两个不同机构,一家是位于中国的银行,另一家是位于美国的电商。由于受到地域限制,这两家机构的用户群体交集很小;同时,由于机构类型不同,二者的数据特征也只有小部分重合。在这种情况下,要想进行有效的联邦学习,就必须引入迁移学习来解决单边数据规模小和标签样本少的问题,从而提升模型的效果。 139 | ## 六.未来研究方向 140 | 将其他技术引入联邦学习之中,从而有效解决联邦学习的安全性、有效性问题。 141 | 142 | -------------------------------------------------------------------------------- /联邦学习进阶/联邦学习图像分类/README.md: -------------------------------------------------------------------------------- 1 | # 第3章:用Python从零实现横向联邦图像分类 2 | 3 | 4 |
本章是联邦学习实战书中第三章的配套代码。本章的代码运行需要首先安装[Python](https://www.anaconda.com/products/individual)、[Pytorch](https://pytorch.org/get-started/locally/)环境,并下载[cifar10](https://www.cs.toronto.edu/~kriz/cifar.html)数据集放置到data文件夹下面。 5 | 6 | 7 | 8 | ## 3.1 代码运行 9 | 10 | 在本目录下,在命令行中执行下面的命令: 11 | 12 | ``` 13 | python main.py -c ./utils/conf.json 14 | ``` 15 | 16 | 17 | 18 | ## 3.2 服务端 19 | 20 | 横向联邦学习的服务端的主要功能是将被选择的客户端上传的本地模型进行模型聚合。但这里需要特别注意的是,事实上,对于一个功能完善的联邦学习框架,比如我们将在后面介绍的FATE平台,服务端的功能要复杂得多,比如服务端需要对各个客户端节点进行网络监控、对失败节点发出重连信号等。本章由于是在本地模拟的,不涉及网络通信细节和失败故障等处理,因此不讨论这些功能细节,仅涉及模型聚合功能。 21 | 22 | 下面我们首先定义一个服务端类Server,类中的主要函数包括以下几个。 23 | 24 | - 定义构造函数。在构造函数中,服务端的工作包括:第一,将配置信息拷贝到服务端中;第二,按照配置中的模型信息获取模型,这里我们使用torchvision 的models模块内置的ResNet-18模型。 25 | 26 | ```python 27 | class Server(object): 28 | def __init__(self, conf, eval_dataset): 29 | 30 | self.conf = conf 31 | 32 | self.global_model = models.get_model(self.conf["model_name"]) 33 | 34 | self.eval_loader = torch.utils.data.DataLoader(eval_dataset, 35 | batch_size=self.conf["batch_size"], shuffle=True) 36 | ``` 37 | 38 | - 定义模型聚合函数。前面我们提到服务端的主要功能是进行模型的聚合,因此定义构造函数后,我们需要在类中定义模型聚合函数,通过接收客户端上传的模型,使用聚合函数更新全局模型。聚合方案有很多种,本节我们采用经典的FedAvg 算法。 39 | 40 | ```python 41 | def model_aggregate(self, weight_accumulator): 42 | for name, data in self.global_model.state_dict().items(): 43 | update_per_layer = weight_accumulator[name] * self.conf["lambda"] 44 | if data.type() != update_per_layer.type(): 45 | data.add_(update_per_layer.to(torch.int64)) 46 | else: 47 | data.add_(update_per_layer) 48 | ``` 49 | 50 | - 定义模型评估函数。对当前的全局模型,利用评估数据评估当前的全局模型性能。通常情况下,服务端的评估函数主要对当前聚合后的全局模型进行分析,用于判断当前的模型训练是需要进行下一轮迭代、还是提前终止,或者模型是否出现发散退化的现象。根据不同的结果,服务端可以采取不同的措施策略。 51 | 52 | ```python 53 | def model_eval(self): 54 | self.global_model.eval() 55 | total_loss = 0.0 56 | correct = 0 57 | dataset_size = 0 58 | for batch_id, batch in enumerate(self.eval_loader): 59 | data, target = batch 60 | dataset_size += data.size()[0] 61 | if torch.cuda.is_available(): 62 | data = data.cuda() 63 
| target = target.cuda() 64 | 65 | output = self.global_model(data) 66 | total_loss += torch.nn.functional.cross_entropy(output, target, 67 | reduction='sum').item() # sum up batch loss 68 | pred = output.data.max(1)[1] # get the index of the max log-probability 69 | correct += pred.eq(target.data.view_as(pred)).cpu().sum().item() 70 | 71 | acc = 100.0 * (float(correct) / float(dataset_size)) 72 | total_l = total_loss / dataset_size 73 | 74 | return acc, total_l 75 | ``` 76 | 77 | 78 | ## 3.3 客户端 79 | 80 | 横向联邦学习的客户端主要功能是接收服务端的下发指令和全局模型,利用本地数据进行局部模型训练。与前一节一样,对于一个功能完善的联邦学习框架,客户端的功能同样相当复杂,比如需要考虑本地的资源(CPU、内存等)是否满足训练需要、当前的网络中断、当前的训练由于受到外界因素影响而中断等。读者如果对这些设计细节感兴趣,可以查看当前流行的联邦学习框架源代码和文档,比如FATE,获取更多的实现细节。本节我们仅考虑客户端本地的模型训练细节。我们首先定义客户端类Client,类中的主要函数包括以下两种。 81 | 82 | - 定义构造函数。在客户端构造函数中,客户端的主要工作包括:首先,将配置信息拷贝到客户端中;然后,按照配置中的模型信息获取模型,通常由服务端将模型参数传递给客户端,客户端将该全局模型覆盖掉本地模型;最后,配置本地训练数据,在本案例中,我们通过torchvision 的datasets 模块获取cifar10 数据集后按客户端ID切分,不同的客户端拥有不同的子数据集,相互之间没有交集。 83 | 84 | ```python 85 | class Client(object): 86 | def __init__(self, conf, model, train_dataset, id = -1): 87 | self.conf = conf 88 | self.local_model = models.get_model(self.conf["model_name"]) 89 | self.client_id = id 90 | self.train_dataset = train_dataset 91 | all_range = list(range(len(self.train_dataset))) 92 | data_len = int(len(self.train_dataset) / self.conf['no_models']) 93 | train_indices = all_range[id * data_len: (id + 1) * data_len] 94 | 95 | self.train_loader = torch.utils.data.DataLoader(self.train_dataset, 96 | batch_size=conf["batch_size"], sampler=torch.utils.data.sampler.SubsetRandomSampler(train_indices)) 97 | ``` 98 | 99 | - 定义模型本地训练函数。本例是一个图像分类的例子,因此,我们使用交叉熵作为本地模型的损失函数,利用梯度下降来求解并更新参数值,实现细节如下面代码块所示。 100 | ```python 101 | def local_train(self, model): 102 | for name, param in model.state_dict().items(): 103 | self.local_model.state_dict()[name].copy_(param.clone()) 104 | optimizer = torch.optim.SGD(self.local_model.parameters(), lr=self.conf['lr'], 105 | momentum=self.conf['momentum']) 106 | 
self.local_model.train() 107 | for e in range(self.conf["local_epochs"]): 108 | for batch_id, batch in enumerate(self.train_loader): 109 | data, target = batch 110 | if torch.cuda.is_available(): 111 | data = data.cuda() 112 | target = target.cuda() 113 | optimizer.zero_grad() 114 | output = self.local_model(data) 115 | loss = torch.nn.functional.cross_entropy(output, target) 116 | loss.backward() 117 | optimizer.step() 118 | print("Epoch %d done." % e) 119 | 120 | diff = dict() 121 | for name, data in self.local_model.state_dict().items(): 122 | diff[name] = (data - model.state_dict()[name]) 123 | return diff 124 | ``` 125 | 126 | 127 | 128 | ## 3.4 整合 129 | 130 | 当配置文件、服务端类和客户端类都定义完毕,我们将这些信息组合起来。首先,读取配置文件信息。 131 | ```python 132 | with open(args.conf, 'r') as f: 133 | conf = json.load(f) 134 | ``` 135 | 接下来,我们将分别定义一个服务端对象和多个客户端对象,用来模拟横向联邦训练场景。 136 | ```python 137 | train_datasets, eval_datasets = datasets.get_dataset("./data/", conf["type"]) 138 | server = Server(conf, eval_datasets) 139 | clients = [] 140 | 141 | for c in range(conf["no_models"]): 142 | clients.append(Client(conf, server.global_model, train_datasets, c)) 143 | ``` 144 | 每一轮的迭代,服务端会从当前的客户端集合中随机挑选一部分参与本轮迭代训练,被选中的客户端调用本地训练接口local_train进行本地训练,最后服务端调用模型聚合函数model_aggregate来更新全局模型,代码如下所示。 145 | ```python 146 | for e in range(conf["global_epochs"]): 147 | candidates = random.sample(clients, conf["k"]) 148 | weight_accumulator = {} 149 | for name, params in server.global_model.state_dict().items(): 150 | weight_accumulator[name] = torch.zeros_like(params) 151 | for c in candidates: 152 | diff = c.local_train(server.global_model) 153 | for name, params in server.global_model.state_dict().items(): 154 | weight_accumulator[name].add_(diff[name]) 155 | server.model_aggregate(weight_accumulator) 156 | acc, loss = server.model_eval() 157 | print("Epoch %d, acc: %f, loss: %f\n" % (e, acc, loss)) 158 | ``` 159 | 160 | 161 | 162 | ## 3.5 配置信息 163 | 164 | 本案例的配置信息在:[conf.json](./utils/conf.json),读者可以根据实际需要修改。 165 | 
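其中 lambda 对应 3.2 节 model_aggregate 中对累加更新的缩放系数,可以用一个极简的数值例子来理解:当 lambda = 1/k 时,聚合更新恰好等价于对 k 个客户端本地模型直接取平均(即经典的 FedAvg)。下面的示意代码用标量代替真实网络参数,变量名与数值均为假设,并非书中源码:

```python
# 数值小例子:验证服务端聚合 global += sum(diff_i) * lambda
# 在 lambda = 1/k 时等价于对 k 个客户端本地模型取平均(FedAvg)。
k = 5
global_w = 1.0                            # 当前全局模型参数(标量示意)
client_ws = [0.8, 1.1, 1.3, 0.9, 1.4]     # k 个客户端本地训练后的参数

# 每个客户端上传 diff = 本地模型 - 全局模型,服务端累加后乘以 lambda
accumulator = sum(w - global_w for w in client_ws)
new_global = global_w + accumulator * (1.0 / k)

fedavg = sum(client_ws) / k               # 直接对 k 个本地模型取平均
assert abs(new_global - fedavg) < 1e-9
print(round(new_global, 6))               # 1.1
```

注意本例的 conf.json 中取 lambda=0.1 而 k=5,相当于在模型平均的基础上再乘以 0.5 的缩放,可以理解为服务端侧的一个学习率。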
166 | * model_name:模型名称 167 | * no_models:客户端数量 168 | * type:数据集信息 169 | * global_epochs:全局迭代次数,即服务端与客户端的通信迭代次数 170 | * local_epochs:本地模型训练迭代次数 171 | * k:每一轮迭代时,服务端会从所有客户端中挑选k个客户端参与训练。 172 | * batch_size:本地训练时每个批次(batch)的样本数 173 | * lr、momentum:本地训练优化器的超参数;lambda:服务端聚合模型更新时的缩放系数(在model_aggregate中使用) 174 | 175 | 176 | 177 | ## 3.6 联邦学习与中心化训练的效果对比 178 | 179 |
180 | (图:联邦学习与中心化训练的效果对比) 181 |
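本节对比的两种训练方式,其差别可以完全归结为配置项的不同。下面用一小段示意 Python 代码给出两份配置的差异(键名与 utils/conf.json 一致,数值取自本节的实验设置;字典写法仅作说明,并非工程代码):

```python
# 联邦训练配置与"等价集中式训练"配置:本例中二者仅相差 no_models 与 k 两个键
federated = {"no_models": 10, "k": 5, "local_epochs": 3, "global_epochs": 20}

centralized = dict(federated)
centralized.update({"no_models": 1, "k": 1})  # 仅 1 台设备参与,即等价于集中式训练

diff = {key: (federated[key], centralized[key])
        for key in federated if federated[key] != centralized[key]}
print(diff)  # {'no_models': (10, 1), 'k': (5, 1)}
```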
182 | 183 | 184 | 185 | - 联邦训练配置:一共10台客户端设备(no\_models=10),每一轮任意挑选其中的5台参与训练(k=5),每一次本地训练迭代次数为3次(local\_epochs=3),全局迭代次数为20次(global\_epochs=20)。 186 | 187 | - 集中式训练配置:我们不需要单独编写集中式训练代码,只需要修改联邦学习配置即可使其等价于集中式训练。具体来说,将客户端设备数no\_models和每一轮挑选的参与训练设备数k都设为1即可,这样只有1台设备参与的联邦训练就等价于集中式训练。其余参数配置信息与联邦学习训练一致。图中我们将局部迭代次数分别设置为1、2、3来进行比较。 188 | 189 | 190 | 191 | ## 3.7 联邦学习在模型推断上的效果对比 192 | 193 |
194 | (图:联邦学习在模型推断上的效果对比) 195 |
196 | 197 | 图中的单点训练指的是在某一个客户端下,利用本地的数据进行模型训练的结果。 198 | 199 | - 我们看到单点训练的模型效果(蓝色条)明显要低于联邦训练的效果(绿色条和红色条),这也说明了仅仅通过单个客户端的数据,不能够很好地学习到数据的全局分布特性,模型的泛化能力较差。 200 | - 此外,每一轮参与联邦训练的客户端数目(k值)不同,其性能也会有一定的差别:k值越大,每一轮参与训练的客户端数目越多,其性能也会越好,但每一轮的完成时间也会相对较长。 -------------------------------------------------------------------------------- /联邦学习进阶/联邦学习图像分类/client.py: -------------------------------------------------------------------------------- 1 | 2 | import models, torch, copy 3 | class Client(object): 4 | 5 | def __init__(self, conf, model, train_dataset, id = -1): 6 | 7 | self.conf = conf 8 | 9 | self.local_model = models.get_model(self.conf["model_name"]) 10 | 11 | self.client_id = id 12 | 13 | self.train_dataset = train_dataset 14 | 15 | all_range = list(range(len(self.train_dataset))) 16 | data_len = int(len(self.train_dataset) / self.conf['no_models']) 17 | train_indices = all_range[id * data_len: (id + 1) * data_len] 18 | 19 | self.train_loader = torch.utils.data.DataLoader(self.train_dataset, batch_size=conf["batch_size"], 20 | sampler=torch.utils.data.sampler.SubsetRandomSampler(train_indices)) 21 | 22 | 23 | def local_train(self, model): 24 | 25 | for name, param in model.state_dict().items(): 26 | self.local_model.state_dict()[name].copy_(param.clone()) 27 | 28 | #print(id(model)) 29 | optimizer = torch.optim.SGD(self.local_model.parameters(), lr=self.conf['lr'], 30 | momentum=self.conf['momentum']) 31 | #print(id(self.local_model)) 32 | self.local_model.train() 33 | for e in range(self.conf["local_epochs"]): 34 | 35 | for batch_id, batch in enumerate(self.train_loader): 36 | data, target = batch 37 | 38 | if torch.cuda.is_available(): 39 | data = data.cuda() 40 | target = target.cuda() 41 | 42 | optimizer.zero_grad() 43 | output = self.local_model(data) 44 | loss = torch.nn.functional.cross_entropy(output, target) 45 | loss.backward() 46 | 47 | optimizer.step() 48 | print("Epoch %d done."
% e) 49 | diff = dict() 50 | for name, data in self.local_model.state_dict().items(): 51 | diff[name] = (data - model.state_dict()[name]) 52 | #print(diff[name]) 53 | 54 | return diff 55 | -------------------------------------------------------------------------------- /联邦学习进阶/联邦学习图像分类/datasets.py: -------------------------------------------------------------------------------- 1 | 2 | 3 | import torch 4 | from torchvision import datasets, transforms 5 | 6 | def get_dataset(dir, name): 7 | 8 | if name=='mnist': 9 | train_dataset = datasets.MNIST(dir, train=True, download=True, transform=transforms.ToTensor()) 10 | eval_dataset = datasets.MNIST(dir, train=False, transform=transforms.ToTensor()) 11 | 12 | elif name=='cifar': 13 | transform_train = transforms.Compose([ 14 | transforms.RandomCrop(32, padding=4), 15 | transforms.RandomHorizontalFlip(), 16 | transforms.ToTensor(), 17 | transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)), 18 | ]) 19 | 20 | transform_test = transforms.Compose([ 21 | transforms.ToTensor(), 22 | transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)), 23 | ]) 24 | 25 | train_dataset = datasets.CIFAR10(dir, train=True, download=True, 26 | transform=transform_train) 27 | eval_dataset = datasets.CIFAR10(dir, train=False, transform=transform_test) 28 | 29 | 30 | return train_dataset, eval_dataset -------------------------------------------------------------------------------- /联邦学习进阶/联邦学习图像分类/figures/fig2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/联邦学习进阶/联邦学习图像分类/figures/fig2.png -------------------------------------------------------------------------------- /联邦学习进阶/联邦学习图像分类/figures/fig31.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/UNIC-Lab/AI_Course/2755f4c883b6c651ae29a50951aa8f4af83242b0/联邦学习进阶/联邦学习图像分类/figures/fig31.png -------------------------------------------------------------------------------- /联邦学习进阶/联邦学习图像分类/main.py: -------------------------------------------------------------------------------- 1 | import argparse, json 2 | import datetime 3 | import os 4 | import logging 5 | import torch, random 6 | 7 | from server import * 8 | from client import * 9 | import models, datasets 10 | 11 | 12 | 13 | if __name__ == '__main__': 14 | 15 | parser = argparse.ArgumentParser(description='Federated Learning') 16 | parser.add_argument('-c', '--conf', dest='conf') 17 | args = parser.parse_args() 18 | 19 | 20 | with open('./utils/conf.json', 'r') as f: 21 | conf = json.load(f) 22 | 23 | 24 | train_datasets, eval_datasets = datasets.get_dataset("./data/", conf["type"]) # cifar数据集 25 | 26 | server = Server(conf, eval_datasets) 27 | clients = [] 28 | 29 | for c in range(conf["no_models"]): #10 30 | clients.append(Client(conf, server.global_model, train_datasets, c)) # c是从0到10 也就是id 31 | 32 | print("\n\n") 33 | for e in range(conf["global_epochs"]): #global epoch 20 34 | 35 | candidates = random.sample(clients, conf["k"]) #k=5 随机采样5个 36 | 37 | weight_accumulator = {} 38 | 39 | for name, params in server.global_model.state_dict().items(): # 聚合过程 先置0 后面添加到weight accumulator 40 | weight_accumulator[name] = torch.zeros_like(params) 41 | 42 | for c in candidates: 43 | diff = c.local_train(server.global_model) 44 | 45 | for name, params in server.global_model.state_dict().items(): 46 | weight_accumulator[name].add_(diff[name]) 47 | 48 | 49 | server.model_aggregate(weight_accumulator) 50 | 51 | acc, loss = server.model_eval() 52 | 53 | print("Epoch %d, acc: %f, loss: %f\n" % (e, acc, loss)) 54 | 55 | 56 | 57 | 58 | 59 | 60 | 61 | -------------------------------------------------------------------------------- /联邦学习进阶/联邦学习图像分类/models.py: 
-------------------------------------------------------------------------------- 1 | 2 | import torch 3 | from torchvision import models 4 | 5 | def get_model(name="vgg16", pretrained=True): 6 | if name == "resnet18": 7 | model = models.resnet18(pretrained=pretrained) 8 | elif name == "resnet50": 9 | model = models.resnet50(pretrained=pretrained) 10 | elif name == "densenet121": 11 | model = models.densenet121(pretrained=pretrained) 12 | elif name == "alexnet": 13 | model = models.alexnet(pretrained=pretrained) 14 | elif name == "vgg16": 15 | model = models.vgg16(pretrained=pretrained) 16 | elif name == "vgg19": 17 | model = models.vgg19(pretrained=pretrained) 18 | elif name == "inception_v3": 19 | model = models.inception_v3(pretrained=pretrained) 20 | elif name == "googlenet": 21 | model = models.googlenet(pretrained=pretrained) 22 | 23 | if torch.cuda.is_available(): 24 | return model.cuda() 25 | else: 26 | return model -------------------------------------------------------------------------------- /联邦学习进阶/联邦学习图像分类/server.py: -------------------------------------------------------------------------------- 1 | 2 | import models, torch 3 | 4 | 5 | class Server(object): 6 | 7 | def __init__(self, conf, eval_dataset): 8 | 9 | self.conf = conf 10 | 11 | self.global_model = models.get_model(self.conf["model_name"]) 12 | 13 | self.eval_loader = torch.utils.data.DataLoader(eval_dataset, batch_size=self.conf["batch_size"], shuffle=True) 14 | 15 | 16 | def model_aggregate(self, weight_accumulator): 17 | for name, data in self.global_model.state_dict().items(): 18 | 19 | update_per_layer = weight_accumulator[name] * self.conf["lambda"] 20 | 21 | if data.type() != update_per_layer.type(): 22 | data.add_(update_per_layer.to(torch.int64)) 23 | else: 24 | data.add_(update_per_layer) 25 | 26 | def model_eval(self): 27 | self.global_model.eval() 28 | 29 | total_loss = 0.0 30 | correct = 0 31 | dataset_size = 0 32 | for batch_id, batch in enumerate(self.eval_loader): 33 | data, 
target = batch 34 | dataset_size += data.size()[0] 35 | 36 | if torch.cuda.is_available(): 37 | data = data.cuda() 38 | target = target.cuda() 39 | 40 | 41 | output = self.global_model(data) 42 | 43 | total_loss += torch.nn.functional.cross_entropy(output, target, 44 | reduction='sum').item() # sum up batch loss 45 | pred = output.data.max(1)[1] # get the index of the max log-probability 46 | correct += pred.eq(target.data.view_as(pred)).cpu().sum().item() 47 | 48 | acc = 100.0 * (float(correct) / float(dataset_size)) 49 | total_l = total_loss / dataset_size 50 | 51 | return acc, total_l -------------------------------------------------------------------------------- /联邦学习进阶/联邦学习图像分类/utils/conf.json: -------------------------------------------------------------------------------- 1 | { 2 | 3 | "model_name" : "resnet18", 4 | 5 | "no_models" : 10, 6 | 7 | "type" : "cifar", 8 | 9 | "global_epochs" : 20, 10 | 11 | "local_epochs" : 3, 12 | 13 | "k" : 5, 14 | 15 | "batch_size" : 32, 16 | 17 | "lr" : 0.001, 18 | 19 | "momentum" : 0.0001, 20 | 21 | "lambda" : 0.1 22 | } 23 | --------------------------------------------------------------------------------