├── LICENSE ├── README.md ├── lecun1989.png ├── modern.py ├── prepro.py ├── repro.py └── vis.ipynb /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2022 Andrej 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | 2 | # lecun1989-repro 3 | 4 | ![teaser](lecun1989.png) 5 | 6 | This code tries to reproduce the 1989 Yann LeCun et al. paper: [Backpropagation Applied to Handwritten Zip Code Recognition](http://yann.lecun.com/exdb/publis/pdf/lecun-89e.pdf). To my knowledge this is the earliest real-world application of a neural net trained with backpropagation (now 33 years ago). 
7 | 8 | #### run 9 | 10 | Since we don't have the exact dataset that was used in the paper, we take MNIST and randomly pick examples from it to generate an approximation of the dataset, which contains only 7291 training and 2007 testing digits, each of size 16x16 pixels (standard MNIST is 28x28). 11 | 12 | ``` 13 | $ python prepro.py 14 | ``` 15 | 16 | Now we can attempt to reproduce the paper. The original network trained for 3 days, but my (Apple Silicon M1) MacBook Air 33 years later chunks through it in about 90 seconds (non-emulated arm64 but CPU only; I don't believe PyTorch and Apple M1 are best friends ever just yet, but anyway still about a 3000X speedup). So now that we've run prepro we can run repro! (haha): 17 | 18 | ``` 19 | $ python repro.py 20 | ``` 21 | 22 | Running this prints (on the 23rd and final pass): 23 | 24 | ``` 25 | eval: split train. loss 4.073383e-03. error 0.62%. misses: 45 26 | eval: split test . loss 2.838382e-02. error 4.09%. misses: 82 27 | ``` 28 | 29 | This is close but not quite the same as what the paper reports. To match the paper exactly we'd expect the following instead: 30 | 31 | ``` 32 | eval: split train. loss 2.5e-3. error 0.14%. misses: 10 33 | eval: split test . loss 1.8e-2. error 5.00%. misses: 102 34 | ``` 35 | 36 | I expect that the majority of this discrepancy comes from the training dataset itself. We've only simulated the original dataset using what we have today 33 years later (MNIST). There are a number of other details that are not specified in the paper, so I also had to do some guessing (see notes below). For example, the specific sparse connectivity structure between layers H1 and H2 is not described; the paper just says that the inputs are "chosen according to a scheme that will not be discussed here".
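For concreteness, the block-structure connectivity I ended up assuming between H1 and H2 (my guess, not the paper's; it's implemented the same way in repro.py) can be sketched in isolation:

```python
import torch
import torch.nn.functional as F

# Sketch of the assumed H1 -> H2 connectivity (my invention, not from the paper):
# each group of 4 H2 planes sees a different, overlapping subset of 8 of the
# 12 H1 planes, implemented as three separate convolutions.
x = torch.randn(1, 12, 12, 12)  # H1 activations: 12 planes of 8x8, padded by 2
w = torch.randn(12, 8, 5, 5)    # 12 H2 planes, each with a 5x5 kernel over 8 planes
slice1 = F.conv2d(x[:, 0:8], w[0:4], stride=2)   # H2 planes 0-3  <- H1 planes 0-7
slice2 = F.conv2d(x[:, 4:12], w[4:8], stride=2)  # H2 planes 4-7  <- H1 planes 4-11
slice3 = F.conv2d(torch.cat((x[:, 0:4], x[:, 8:12]), dim=1), w[8:12], stride=2)  # H2 planes 8-11 <- H1 planes 0-3 and 8-11
h2 = torch.cat((slice1, slice2, slice3), dim=1)
print(h2.shape)  # torch.Size([1, 12, 4, 4])
```

A nice property of this grouping is its symmetry: every H2 plane draws from 8 of the 12 H1 planes, and every H1 plane feeds exactly 8 H2 planes.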
As another example, the paper uses a "special version of Newton's algorithm that uses a positive, diagonal approximation of Hessian", but I only used simple SGD in this implementation because it is significantly simpler and, according to the paper, "this algorithm is not believed to bring a tremendous increase in learning speed". Anyway, we are getting numbers on similar orders of magnitude... 37 | 38 | #### notes 39 | 40 | My notes from the paper: 41 | 42 | - 7291 digits are used for training 43 | - 2007 digits are used for testing 44 | - each image is 16x16 pixels grayscale (not binary) 45 | - images are scaled to range [-1, 1] 46 | - network has three hidden layers H1 H2 H3 47 | - H1 is 5x5 stride 2 conv with 12 planes. constant padding of -1. 48 | - not "standard": units do not share biases! (including in the same feature plane) 49 | - H1 has 768 units (8\*8\*12), 19,968 connections (768\*26), 1,068 parameters (768 biases + 25\*12 weights) 50 | - not "standard": H2 units all draw input from a 5x5 stride 2 conv, but each connects to only 8 of the 12 planes 51 | - H2 contains 192 units (4\*4\*12), 38,592 connections (192 units * 201 input lines), 2,592 parameters (12 * 200 weights + 192 biases) 52 | - H3 has 30 units fully connected to H2. So 5790 connections (30 * 192 + 30) 53 | - output layer has 10 units fully connected to H3. So 310 connections (30 * 10 + 10) 54 | - total: 1256 units, 64,660 connections, 9760 parameters 55 | - tanh activations on all units (including output units!) 56 | - weights of output chosen to be in quasi-linear regime 57 | - cost function: mean squared error 58 | - weight init: random values in U[-2.4/F, 2.4/F] where F is the fan-in.
"tends to keep total inputs in operating range of sigmoid" 59 | - training 60 | - patterns presented in constant order 61 | - SGD on single example at a time 62 | - use special version of Newton's algorithm that uses a positive, diagonal approximation of Hessian 63 | - trained for 23 passes over data, measuring train+test error after each pass. total 167,693 presentations (23 * 7291) 64 | - final error: 2.5e-3 train, 1.8e-2 test 65 | - percent misclassification: 0.14% on train (10 mistakes), 5.0% on test (102 mistakes). 66 | - compute: 67 | - run on SUN-4/260 workstation 68 | - digital signal co-processor: 69 | - 256 kbytes of local memory 70 | - peak performance of 12.5M MAC/s on fp32 (i.e. 25 MFLOPS) 71 | - trained for 3 days 72 | - throughput of 10-12 digits/s, "limited mainly by the normalization step" 73 | - throughput of 30 digits/s on normalized digits 74 | - "we have successfully applied backpropagation learning to a large, real-world task" 75 | 76 | **Open questions:** 77 | 78 | - The 12 -> 8 connections from H2 to H1 are not described in this paper... I will assume a sensible block structure connectivity 79 | - Not clear what exactly is the "MSE loss". Was the scaling factor of 1/2 included to simplify the gradient calculation? Will assume no. 80 | - What is the learning rate? I will run a sweep to determine the best one manually. 81 | - Was any learning rate decay used? Not mentioned, I am assuming no. 82 | - Was any weight decay used? Not mentioned, assuming no. 83 | - Is there a bug in the pdf where in weight init the fan-in should have a square root? The pdf's formatting is a bit messed up. Assuming yes. 84 | - The paper does not say, but what exactly are the targets? Assuming they are +1/-1 for pos/neg, as the output units have tanh too... 85 | 86 | One more note on the weight init conundrum. E.g. the "Kaiming init" is: 87 | 88 | ``` 89 | a = gain * sqrt(3 / fan_in) 90 | ~U(-a, a) 91 | ``` 92 | 93 | For tanh neurons the recommended gain is 5/3.
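Plugging in a concrete number makes the comparison easy to check (fan_in = 25 here is just a hypothetical example, e.g. a 5x5 kernel over a single input plane):

```python
import math

# Kaiming-uniform bound with the tanh gain of 5/3, versus the paper's
# 2.4 / sqrt(fan_in) (assuming the pdf dropped the square root).
# fan_in = 25 is a hypothetical example: a 5x5 kernel over one input plane.
fan_in = 25
kaiming_a = (5 / 3) * math.sqrt(3 / fan_in)  # == 2.886... / sqrt(fan_in)
paper_a = 2.4 / math.sqrt(fan_in)
print(f"{kaiming_a:.3f} vs {paper_a:.3f}")  # 0.577 vs 0.480
```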
So we would have `a = sqrt(3) * 5 / 3 * sqrt(1 / fan_in) = 2.89 * sqrt(1 / fan_in)`, which is close to what the paper does (gain 2.4). So if the original work in fact did use a sqrt and the pdf is just formatted wrong, then the (modern) Kaiming init and the originally used init are pretty close. 94 | 95 | #### todos 96 | 97 | - modernize the network using knowledge from 33 years of time travel. 98 | - include my janky hyperparameter sweeping code for tuning the learning rate potentially 99 | -------------------------------------------------------------------------------- /lecun1989.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/karpathy/lecun1989-repro/8553f52c8d0a51a4bbbabf98e005cef28a22a36b/lecun1989.png -------------------------------------------------------------------------------- /modern.py: -------------------------------------------------------------------------------- 1 | """ 2 | 3 | repro.py gives: 4 | 23 5 | eval: split train. loss 4.073383e-03. error 0.62%. misses: 45 6 | eval: split test . loss 2.838382e-02. error 4.09%. misses: 82 7 | 8 | we can try to use our knowledge from 33 years later to improve on this, 9 | but keeping the model size the same. 10 | 11 | Change 1: drop the tanh on the last (fully connected) layer and use a 12 | softmax with cross-entropy loss instead. Had to lower the learning rate 13 | to 0.01 as well. This improves the optimization quite a lot, we now crush the training set: 14 | 23 15 | eval: split train. loss 9.536698e-06. error 0.00%. misses: 0 16 | eval: split test . loss 9.536698e-06. error 4.38%. misses: 87 17 | 18 | Change 2: change from SGD to AdamW with LR 3e-4 because I find this 19 | to be significantly more stable and requires little to no tuning. Also 20 | double epochs to 46. I decay the LR to 1e-4 over course of training. 21 | These changes make it so optimization is not the culprit of bad performance 22 | with high probability.
We also seem to improve test set a bit: 23 | 46 24 | eval: split train. loss 0.000000e+00. error 0.00%. misses: 0 25 | eval: split test . loss 0.000000e+00. error 3.59%. misses: 72 26 | 27 | Change 3: since we are overfitting we can introduce data augmentation, 28 | e.g. let's intro a shift by at most 1 pixel in both x/y directions. Also 29 | because we are augmenting we again want to bump up training time, e.g. 30 | to 60 epochs: 31 | 60 32 | eval: split train. loss 8.780676e-04. error 1.70%. misses: 123 33 | eval: split test . loss 8.780676e-04. error 2.19%. misses: 43 34 | 35 | Change 4: we want to add dropout at the layer with most parameters (H3), 36 | but in addition we also have to shift the activation function to relu so 37 | that dropout makes sense. We also bring up iterations to 80: 38 | 80 39 | eval: split train. loss 2.601336e-03. error 1.47%. misses: 106 40 | eval: split test . loss 2.601336e-03. error 1.59%. misses: 32 41 | 42 | To be continued... 43 | """ 44 | 45 | import os 46 | import json 47 | import argparse 48 | 49 | import numpy as np 50 | import torch 51 | import torch.nn as nn 52 | import torch.nn.functional as F 53 | import torch.optim as optim 54 | from tensorboardX import SummaryWriter # pip install tensorboardX 55 | 56 | # ----------------------------------------------------------------------------- 57 | 58 | class Net(nn.Module): 59 | """ 1989 LeCun ConvNet per description in the paper """ 60 | 61 | def __init__(self): 62 | super().__init__() 63 | 64 | # initialization as described in the paper to my best ability, but it doesn't look right... 
65 | winit = lambda fan_in, *shape: (torch.rand(*shape) - 0.5) * 2 * 2.4 / fan_in**0.5 66 | macs = 0 # keep track of MACs (multiply accumulates) 67 | acts = 0 # keep track of number of activations 68 | 69 | # H1 layer parameters and their initialization 70 | self.H1w = nn.Parameter(winit(5*5*1, 12, 1, 5, 5)) 71 | self.H1b = nn.Parameter(torch.zeros(12, 8, 8)) # presumably init to zero for biases 72 | macs += (5*5*1) * (8*8) * 12 73 | acts += (8*8) * 12 74 | 75 | # H2 layer parameters and their initialization 76 | """ 77 | H2 neurons all connect to only 8 of the 12 input planes, with an unspecified pattern 78 | I am going to assume the most sensible block pattern where 4 planes at a time connect 79 | to differently overlapping groups of 8/12 input planes. We will implement this with 3 80 | separate convolutions that we concatenate the results of. 81 | """ 82 | self.H2w = nn.Parameter(winit(5*5*8, 12, 8, 5, 5)) 83 | self.H2b = nn.Parameter(torch.zeros(12, 4, 4)) # presumably init to zero for biases 84 | macs += (5*5*8) * (4*4) * 12 85 | acts += (4*4) * 12 86 | 87 | # H3 is a fully connected layer 88 | self.H3w = nn.Parameter(winit(4*4*12, 4*4*12, 30)) 89 | self.H3b = nn.Parameter(torch.zeros(30)) 90 | macs += (4*4*12) * 30 91 | acts += 30 92 | 93 | # output layer is also fully connected layer 94 | self.outw = nn.Parameter(winit(30, 30, 10)) 95 | self.outb = nn.Parameter(torch.zeros(10)) 96 | macs += 30 * 10 97 | acts += 10 98 | 99 | self.macs = macs 100 | self.acts = acts 101 | 102 | def forward(self, x): 103 | 104 | # poor man's data augmentation by 1 pixel along x/y directions 105 | if self.training: 106 | shift_x, shift_y = np.random.randint(-1, 2, size=2) 107 | x = torch.roll(x, (shift_x, shift_y), (2, 3)) 108 | 109 | # x has shape (1, 1, 16, 16) 110 | x = F.pad(x, (2, 2, 2, 2), 'constant', -1.0) # pad by two using constant -1 for background 111 | x = F.conv2d(x, self.H1w, stride=2) + self.H1b 112 | x = torch.relu(x) 113 | 114 | # x is now shape (1, 12, 8, 8) 115 
| x = F.pad(x, (2, 2, 2, 2), 'constant', -1.0) # pad by two using constant -1 for background 116 | slice1 = F.conv2d(x[:, 0:8], self.H2w[0:4], stride=2) # first 4 planes look at first 8 input planes 117 | slice2 = F.conv2d(x[:, 4:12], self.H2w[4:8], stride=2) # next 4 planes look at last 8 input planes 118 | slice3 = F.conv2d(torch.cat((x[:, 0:4], x[:, 8:12]), dim=1), self.H2w[8:12], stride=2) # last 4 planes are cross 119 | x = torch.cat((slice1, slice2, slice3), dim=1) + self.H2b 120 | x = torch.relu(x) 121 | x = F.dropout(x, p=0.25, training=self.training) 122 | 123 | # x is now shape (1, 12, 4, 4) 124 | x = x.flatten(start_dim=1) # (1, 12*4*4) 125 | x = x @ self.H3w + self.H3b 126 | x = torch.relu(x) 127 | 128 | # x is now shape (1, 30) 129 | x = x @ self.outw + self.outb 130 | 131 | # x is finally shape (1, 10) 132 | return x 133 | 134 | # ----------------------------------------------------------------------------- 135 | 136 | if __name__ == '__main__': 137 | 138 | parser = argparse.ArgumentParser(description="Train a 2022 but mini ConvNet on digits") 139 | parser.add_argument('--learning-rate', '-l', type=float, default=3e-4, help="Learning rate") 140 | parser.add_argument('--output-dir' , '-o', type=str, default='out/modern', help="output directory for training logs") 141 | args = parser.parse_args() 142 | print(vars(args)) 143 | 144 | # init rng 145 | torch.manual_seed(1337) 146 | np.random.seed(1337) 147 | torch.use_deterministic_algorithms(True) 148 | 149 | # set up logging 150 | os.makedirs(args.output_dir, exist_ok=True) 151 | with open(os.path.join(args.output_dir, 'args.json'), 'w') as f: 152 | json.dump(vars(args), f, indent=2) 153 | writer = SummaryWriter(args.output_dir) 154 | 155 | # init a model 156 | model = Net() 157 | print("model stats:") 158 | print("# params: ", sum(p.numel() for p in model.parameters())) # in paper total is 9,760 159 | print("# MACs: ", model.macs) 160 | print("# activations: ", model.acts) 161 | 162 | # init data 163 | 
Xtr, Ytr = torch.load('train1989.pt') 164 | Xte, Yte = torch.load('test1989.pt') 165 | 166 | # init optimizer 167 | optimizer = optim.AdamW(model.parameters(), lr=args.learning_rate) 168 | 169 | def eval_split(split): 170 | # eval the full train/test set, batched implementation for efficiency 171 | model.eval() 172 | X, Y = (Xtr, Ytr) if split == 'train' else (Xte, Yte) 173 | Yhat = model(X) 174 | loss = F.cross_entropy(Yhat, Y.argmax(dim=1)) 175 | err = torch.mean((Y.argmax(dim=1) != Yhat.argmax(dim=1)).float()) 176 | print(f"eval: split {split:5s}. loss {loss.item():e}. error {err.item()*100:.2f}%. misses: {int(err.item()*Y.size(0))}") 177 | writer.add_scalar(f'error/{split}', err.item()*100, pass_num) 178 | writer.add_scalar(f'loss/{split}', loss.item(), pass_num) 179 | 180 | # train 181 | for pass_num in range(80): 182 | 183 | # learning rate decay 184 | alpha = pass_num / 79 185 | for g in optimizer.param_groups: 186 | g['lr'] = (1 - alpha) * args.learning_rate + alpha * (args.learning_rate / 3) 187 | 188 | # perform one epoch of training 189 | model.train() 190 | for step_num in range(Xtr.size(0)): 191 | 192 | # fetch a single example into a batch of 1 193 | x, y = Xtr[[step_num]], Ytr[[step_num]] 194 | 195 | # forward the model and the loss 196 | yhat = model(x) 197 | loss = F.cross_entropy(yhat, y.argmax(dim=1)) 198 | 199 | # calculate the gradient and update the parameters 200 | optimizer.zero_grad(set_to_none=True) 201 | loss.backward() 202 | optimizer.step() 203 | 204 | # after each epoch evaluate the train and test error / metrics 205 | print(pass_num + 1) 206 | eval_split('train') 207 | eval_split('test') 208 | 209 | # save final model to file 210 | torch.save(model.state_dict(), os.path.join(args.output_dir, 'model.pt')) 211 | -------------------------------------------------------------------------------- /prepro.py: -------------------------------------------------------------------------------- 1 | """ 2 | Preprocess today's MNIST dataset into
1989 version's size/format (approximately) 3 | http://yann.lecun.com/exdb/publis/pdf/lecun-89e.pdf 4 | 5 | Some relevant notes for this part: 6 | - 7291 digits are used for training 7 | - 2007 digits are used for testing 8 | - each image is 16x16 pixels grayscale (not binary) 9 | - images are scaled to range [-1, 1] 10 | - paper doesn't say exactly, but reading between the lines I assume label targets to be {-1, 1} 11 | """ 12 | 13 | import numpy as np 14 | import torch 15 | import torch.nn.functional as F 16 | from torchvision import datasets 17 | 18 | # ----------------------------------------------------------------------------- 19 | 20 | torch.manual_seed(1337) 21 | np.random.seed(1337) 22 | 23 | for split in ['train', 'test']: # a list, not a set, so the iteration order (and hence the rng state per split) is deterministic 24 | 25 | data = datasets.MNIST('./data', train=split=='train', download=True) 26 | 27 | n = 7291 if split == 'train' else 2007 28 | rp = np.random.permutation(len(data))[:n] 29 | 30 | X = torch.full((n, 1, 16, 16), 0.0, dtype=torch.float32) 31 | Y = torch.full((n, 10), -1.0, dtype=torch.float32) 32 | for i, ix in enumerate(rp): 33 | I, yint = data[int(ix)] 34 | # PIL image -> numpy -> torch tensor -> [-1, 1] fp32 35 | xi = torch.from_numpy(np.array(I, dtype=np.float32)) / 127.5 - 1.0 36 | # add a fake batch dimension and a channel dimension of 1 or F.interpolate won't be happy 37 | xi = xi[None, None, ...] 38 | # resize to (16, 16) images with bilinear interpolation 39 | xi = F.interpolate(xi, (16, 16), mode='bilinear') 40 | X[i] = xi[0] # store 41 | 42 | # set the correct class to have target of +1.0 43 | Y[i, yint] = 1.0 44 | 45 | torch.save((X, Y), split + '1989.pt') 46 | -------------------------------------------------------------------------------- /repro.py: -------------------------------------------------------------------------------- 1 | """ 2 | Running this script eventually gives: 3 | 23 4 | eval: split train. loss 4.073383e-03. error 0.62%. misses: 45 5 | eval: split test . loss 2.838382e-02. error 4.09%.
misses: 82 6 | """ 7 | 8 | import os 9 | import json 10 | import argparse 11 | 12 | import numpy as np 13 | import torch 14 | import torch.nn as nn 15 | import torch.nn.functional as F 16 | import torch.optim as optim 17 | from tensorboardX import SummaryWriter # pip install tensorboardX 18 | 19 | # ----------------------------------------------------------------------------- 20 | 21 | class Net(nn.Module): 22 | """ 1989 LeCun ConvNet per description in the paper """ 23 | 24 | def __init__(self): 25 | super().__init__() 26 | 27 | # initialization as described in the paper to my best ability, but it doesn't look right... 28 | winit = lambda fan_in, *shape: (torch.rand(*shape) - 0.5) * 2 * 2.4 / fan_in**0.5 29 | macs = 0 # keep track of MACs (multiply accumulates) 30 | acts = 0 # keep track of number of activations 31 | 32 | # H1 layer parameters and their initialization 33 | self.H1w = nn.Parameter(winit(5*5*1, 12, 1, 5, 5)) 34 | self.H1b = nn.Parameter(torch.zeros(12, 8, 8)) # presumably init to zero for biases 35 | assert self.H1w.nelement() + self.H1b.nelement() == 1068 36 | macs += (5*5*1) * (8*8) * 12 37 | acts += (8*8) * 12 38 | 39 | # H2 layer parameters and their initialization 40 | """ 41 | H2 neurons all connect to only 8 of the 12 input planes, with an unspecified pattern 42 | I am going to assume the most sensible block pattern where 4 planes at a time connect 43 | to differently overlapping groups of 8/12 input planes. We will implement this with 3 44 | separate convolutions that we concatenate the results of. 
45 | """ 46 | self.H2w = nn.Parameter(winit(5*5*8, 12, 8, 5, 5)) 47 | self.H2b = nn.Parameter(torch.zeros(12, 4, 4)) # presumably init to zero for biases 48 | assert self.H2w.nelement() + self.H2b.nelement() == 2592 49 | macs += (5*5*8) * (4*4) * 12 50 | acts += (4*4) * 12 51 | 52 | # H3 is a fully connected layer 53 | self.H3w = nn.Parameter(winit(4*4*12, 4*4*12, 30)) 54 | self.H3b = nn.Parameter(torch.zeros(30)) 55 | assert self.H3w.nelement() + self.H3b.nelement() == 5790 56 | macs += (4*4*12) * 30 57 | acts += 30 58 | 59 | # output layer is also fully connected layer 60 | self.outw = nn.Parameter(winit(30, 30, 10)) 61 | self.outb = nn.Parameter(-torch.ones(10)) # 9/10 targets are -1, so makes sense to init slightly towards it 62 | assert self.outw.nelement() + self.outb.nelement() == 310 63 | macs += 30 * 10 64 | acts += 10 65 | 66 | self.macs = macs 67 | self.acts = acts 68 | 69 | def forward(self, x): 70 | 71 | # x has shape (1, 1, 16, 16) 72 | x = F.pad(x, (2, 2, 2, 2), 'constant', -1.0) # pad by two using constant -1 for background 73 | x = F.conv2d(x, self.H1w, stride=2) + self.H1b 74 | x = torch.tanh(x) 75 | 76 | # x is now shape (1, 12, 8, 8) 77 | x = F.pad(x, (2, 2, 2, 2), 'constant', -1.0) # pad by two using constant -1 for background 78 | slice1 = F.conv2d(x[:, 0:8], self.H2w[0:4], stride=2) # first 4 planes look at first 8 input planes 79 | slice2 = F.conv2d(x[:, 4:12], self.H2w[4:8], stride=2) # next 4 planes look at last 8 input planes 80 | slice3 = F.conv2d(torch.cat((x[:, 0:4], x[:, 8:12]), dim=1), self.H2w[8:12], stride=2) # last 4 planes are cross 81 | x = torch.cat((slice1, slice2, slice3), dim=1) + self.H2b 82 | x = torch.tanh(x) 83 | 84 | # x is now shape (1, 12, 4, 4) 85 | x = x.flatten(start_dim=1) # (1, 12*4*4) 86 | x = x @ self.H3w + self.H3b 87 | x = torch.tanh(x) 88 | 89 | # x is now shape (1, 30) 90 | x = x @ self.outw + self.outb 91 | x = torch.tanh(x) 92 | 93 | # x is finally shape (1, 10) 94 | return x 95 | 96 | # 
----------------------------------------------------------------------------- 97 | 98 | if __name__ == '__main__': 99 | 100 | parser = argparse.ArgumentParser(description="Train a 1989 LeCun ConvNet on digits") 101 | parser.add_argument('--learning-rate', '-l', type=float, default=0.03, help="SGD learning rate") 102 | parser.add_argument('--output-dir' , '-o', type=str, default='out/base', help="output directory for training logs") 103 | args = parser.parse_args() 104 | print(vars(args)) 105 | 106 | # init rng 107 | torch.manual_seed(1337) 108 | np.random.seed(1337) 109 | torch.use_deterministic_algorithms(True) 110 | 111 | # set up logging 112 | os.makedirs(args.output_dir, exist_ok=True) 113 | with open(os.path.join(args.output_dir, 'args.json'), 'w') as f: 114 | json.dump(vars(args), f, indent=2) 115 | writer = SummaryWriter(args.output_dir) 116 | 117 | # init a model 118 | model = Net() 119 | print("model stats:") 120 | print("# params: ", sum(p.numel() for p in model.parameters())) # in paper total is 9,760 121 | print("# MACs: ", model.macs) 122 | print("# activations: ", model.acts) 123 | 124 | # init data 125 | Xtr, Ytr = torch.load('train1989.pt') 126 | Xte, Yte = torch.load('test1989.pt') 127 | 128 | # init optimizer 129 | optimizer = optim.SGD(model.parameters(), lr=args.learning_rate) 130 | 131 | def eval_split(split): 132 | # eval the full train/test set, batched implementation for efficiency 133 | model.eval() 134 | X, Y = (Xtr, Ytr) if split == 'train' else (Xte, Yte) 135 | Yhat = model(X) 136 | loss = torch.mean((Y - Yhat)**2) 137 | err = torch.mean((Y.argmax(dim=1) != Yhat.argmax(dim=1)).float()) 138 | print(f"eval: split {split:5s}. loss {loss.item():e}. error {err.item()*100:.2f}%. 
misses: {int(err.item()*Y.size(0))}") 139 | writer.add_scalar(f'error/{split}', err.item()*100, pass_num) 140 | writer.add_scalar(f'loss/{split}', loss.item(), pass_num) 141 | 142 | # train 143 | for pass_num in range(23): 144 | 145 | # perform one epoch of training 146 | model.train() 147 | for step_num in range(Xtr.size(0)): 148 | 149 | # fetch a single example into a batch of 1 150 | x, y = Xtr[[step_num]], Ytr[[step_num]] 151 | 152 | # forward the model and the loss 153 | yhat = model(x) 154 | loss = torch.mean((y - yhat)**2) 155 | 156 | # calculate the gradient and update the parameters 157 | optimizer.zero_grad(set_to_none=True) 158 | loss.backward() 159 | optimizer.step() 160 | 161 | # after each epoch evaluate the train and test error / metrics 162 | print(pass_num + 1) 163 | eval_split('train') 164 | eval_split('test') 165 | 166 | # save final model to file 167 | torch.save(model.state_dict(), os.path.join(args.output_dir, 'model.pt')) 168 | -------------------------------------------------------------------------------- /vis.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 72, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "# imports\n", 10 | "import os\n", 11 | "\n", 12 | "import numpy as np\n", 13 | "import torch\n", 14 | "import torch.nn as nn\n", 15 | "import torch.nn.functional as F\n", 16 | "\n", 17 | "import matplotlib\n", 18 | "import matplotlib.pyplot as plt\n", 19 | "%matplotlib inline" 20 | ] 21 | }, 22 | { 23 | "cell_type": "code", 24 | "execution_count": 75, 25 | "metadata": {}, 26 | "outputs": [], 27 | "source": [ 28 | "# load model\n", 29 | "from repro import Net\n", 30 | "model_dir = 'out/base'\n", 31 | "model = Net()\n", 32 | "model.load_state_dict(torch.load(os.path.join(model_dir, 'model.pt')))\n", 33 | "model.eval();" 34 | ] 35 | }, 36 | { 37 | "cell_type": "code", 38 | "execution_count": 76, 39 | "metadata":
{}, 40 | "outputs": [], 41 | "source": [ 42 | "# load data\n", 43 | "Xtr, Ytr = torch.load('train1989.pt')\n", 44 | "Xte, Yte = torch.load('test1989.pt')" 45 | ] 46 | }, 47 | { 48 | "cell_type": "code", 49 | "execution_count": 85, 50 | "metadata": { 51 | "scrolled": false 52 | }, 53 | "outputs": [], 54 | "source": [ 55 | "\n", 56 | "def grid_mistakes(X, Y):\n", 57 | " \n", 58 | " plt.figure(figsize=(14, 4))\n", 59 | " ishow, nshow = 0, 14\n", 60 | " for ix in range(X.size(0)):\n", 61 | " x, y = X[[ix]], Y[[ix]]\n", 62 | " yhat = model(x)\n", 63 | " yi = y.argmax()\n", 64 | " yhati = yhat.argmax()\n", 65 | " if yi != yhati:\n", 66 | " plt.subplot(2, 7, ishow+1)\n", 67 | " plt.imshow(x[0,0], cmap='gray')\n", 68 | " plt.title(f'gt={yi}, pred={yhati}')\n", 69 | " plt.axis('off')\n", 70 | " ishow += 1\n", 71 | " if ishow >= nshow:\n", 72 | " break\n" 73 | ] 74 | }, 75 | { 76 | "cell_type": "code", 77 | "execution_count": 86, 78 | "metadata": {}, 79 | "outputs": [ 80 | { 81 | "data": { 82 | "image/png": 
"<matplotlib 3.5.1 PNG cell output; base64 image data omitted>"
[... base64 PNG data omitted (matplotlib figure: training-set mistakes) ...]\n", 83 | "text/plain": [ 84 | "
" 85 | ] 86 | }, 87 | "metadata": { 88 | "needs_background": "light" 89 | }, 90 | "output_type": "display_data" 91 | } 92 | ], 93 | "source": [ 94 | "grid_mistakes(Xtr, Ytr) # training set mistakes" 95 | ] 96 | }, 97 | { 98 | "cell_type": "code", 99 | "execution_count": 87, 100 | "metadata": {}, 101 | "outputs": [ 102 | { 103 | "data": { 104 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAxsAAAD0CAYAAADzJDsDAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAAzHUlEQVR4nO3deZgU5bn+8fuRfV8VFwRE3D0RExUVhVHcYlSixu0gMgkxrkeNEn/RxAhqNG7RqEkkRkXBfd9RUXE7bhEUNyTIoiCoKAjKDu/vj7c4tpOnhumerumZ5vu5rrkc7+querufqap+u7ofLIQgAAAAACi29Uo9AAAAAADlickGAAAAgEww2QAAAACQCSYbAAAAADLBZAMAAABAJphsAAAAAMgEk408mVmFmc0q9ThQO9SxPJhZDzMLZta41GNB4ahj+eDYWh7YJ8uHmVWa2UulHENmk43aHHDMbIiZvWlmC81slpld1tD/4M3smyo/q8zs2lKPa21qWcejzexDM/vazD43s1vMrG2xx1iXzGy8mS3NqeOHpR5TTdSyjmZmF5nZ7KSW481su2KPsS6Z2RVm9m8zW2Rmk83suFKPqSao438ys33MbIKZfZucL44s9Zhqopa1vL7K+WSZmS0q9hjrkpltY2bPJn+bU83s0FKPqSbYJ//TOrpPViav63L3y4rijrDumFk353VrMLOzCllffb2y0VLSGZI6S+ojaYCkYVlsyMwaZbHeqkIIrdf8SNpQ0hJJ99TFtkvoZUl9QwjtJPWU1FjSRVlsqK7qmDg1p55b1eF2S+UISb+QtKekjpJekTQ6iw3V4ZsK30o6WFI7SUMk/cXMdq+jbZdK2dXRzLaVdLuk3ynWcgdJb9bFtksphHBilXPKHcrofFIXx9bk7+UhSY8q/m3+StIYM9sy622XGPtkeXkld78MIYwv9gaSCWrmr91DCB9XOcb8l6TVku4rZH21GrCZ/dDMJibvDt5jZncls/RWkp6QtHHOjGjjmq43hPD3EMKLIYTlIYTZkm6T1LeGY6pIZtLnmtk8M5thZoNylo8ys7+b2eNm9q2kvcxsYzO7z8y+MLPpZnZazu1bJPeZb2bvS9q5po+jGodL+lzSi0VYV61lWMdPQgjzcqJVknrVcEwNoY71SlZ1lLSZpJdCCNNCCKskjZG0bQ3HtOZS/K/M7FMzm2Nmw3KWDzeze81sjJktlFRpZu3M7MbktrOTx9AouX0ji1cl5pnZNEk/yeNx/J8QwvkhhMkhhNUhhNcU98XdCllXsVHHvPxe0sgQwhMhhJUhhC9DCB8VuK6iy7CWudtopXhOuaWGt6+Px9atJW0s6aoQwqoQwrOKb1YNLmBdRcc+mZd1fp8sYEyVZvaymV1n8QrXZDMbkLN8vJn90cxelrRYUk8z29rMnjazryx+guTInNt3MrOHLX466HVJmxdhmMdJeiGEMKOge4cQCvqR1FTSTEmnS2oi6TBJyyVdlCyvkDSryn3+W9KCan66pWzrQUl/quG4KiStlPRnSc0k9Vd8F3OrZPkoSV8rTl7WU7yK8qakPySPqaekaZL2T27/J8UXIh0lbSrp3dzHpfhOTNrjeTRljM9KGl7oc1/Mn6zrKGmP5PkOSR32a8h1lDRe
0heS5imeDCtKXcOs6yipe/Lcbpms+zJJD9ZwXD2S2t8hqZXiuyNfSNonWT5c0gpJP03q2ELSA5JGJrffQNLrkk5Ibn+ipMlJDTtKei5Zf+Nk+d+qeTyTUsbYQtIcSQdQx4ZVR8V9/EJJ7yQ1HCOpY6nrmHUtq9znuOR5sBqOq0L17NgqaXtJ3+Q+BklPS3qgnOso9slyqmWl4n40T9IUSeetee5qMK5KxX3y18m4jlLcBzsmy8dL+ljSdoqfEGkn6RNJP0/+f8dku9smt79T0t1JnbeXNFtxUrtme5OqeTx/c8Znkj6SVFnwc1+LovVLHkDuweGl6opW4HZ+IWmWpM41vH1FUrRWOdndks5Lfh8l6dacZX0kfVxlHedIujlnxzkgZ9mvavO4FA8uqyRtlvWOVc/quIniwW/LhlzHZDttFE/SQyQtkrR5OddR8QD9F8UTz0pJ02v696vvTohb52SXSbox+X244rsla5Z1kbRMUouc7BhJzyW/PyvpxJxl+ynnhFjg47tF0ljV8MUadaw/dVR8oTBD8cVaa8VL/LeVuo5Z17LKdp5RHm9eqR4eWxVfYE2TdHby+35JbZ8s5zqyT5ZVLXsqXqlaT3Hi976kc2p430pJn1YZ1+uSBie/j5d0Qc6yoyS9WGUdIyWdL6mR4iQz9+/iYuVMNgp4bHsqvhnQutB11OYzfBtLmh2SkSQ+qcX6/oOZ/VTSJYoz9XlruXmu+SGEb3P+f6bieNfIHWd3xctmC3KyRvruI04bV7n9zDzG4RmsWPTptVxPsWReR0kKIcw2s7GKM+4f1vBu9a6OIX7kZo1bzOwYSQdKKvWX/bOs4x8UPxqxqaS5ko6V9KyZbRdCWFzDdVR97v8rZVl3xRcbc8xsTbZezm2Kuj+a2eWK7/zsVeW5KxXqmJ8lii96p0iSmV0saVyB6yq2ujhHdlN8gXR8nnetV8fWEMKK5Hx/raT/J+lfihOgZfmuKwPsk/lZJ/fJEMK0nP99x8wukPQbxdewNVF1XGvbJ/tU2ScbK37fZ/3k92K+bh0i6b4QwjeFrqA239mYI2kTy/mrVdxh1viPE7eZDbL//HZ77k+3nNseIOkGSQeHEN7Jc2wdks/frdFNcdboje0TSdNDCO1zftqEEA7MeZy5j6tbzu8ysyeqeTxPOGM7TjX8bG0dybSOVTRWfp8drM91zB2DVbO8rmRZx96S7gohzArxM7ijJHVQDT9b7IxlbXVcpnglc00d24YQ1nRoWVsdq3bpyf15r8ptR0j6seJH+xbm8ViyRB2VVx0nVdlufZgwrlEXx9bBkl6u8kKnJurdsTWEMCmE0D+E0CmEsL/iO8Wv5/m4ssA+KfbJAl7v5PvaoOq41lbL56vsk61DCCcpfpRupaqv5XvVPJ7rq9y2hWIjg9q9bq3FZZWmip8h+x/FF5ED9f3Pvm2tOMNtV8C695b0paR+KctHSRqVsqwieaKvSMa4p+Ln6LbOue9FObdvJGmC4rspLZL/317SzsnySyU9r3gA6Kq4IxV6mW33ZCxtCn3ei/2TcR0H6fufTX1e0v0NtY6S2kvaX1Lz5LkalIypRh8Na8B1PF/xUnMXxTcoBiePu32yfLik8Sn37aF4kLxN8bPf2yk2R9gv575jqtznIcWPFrRNtre5pP7JspMUL093TWr5jAq71H+OpH9L2rDUtaOOtarjLxQ/etIzGdfdkkaXuo5Z1zJnGx9K+oWTj1IDOrYm6/mB4rG1pWL3yemSmpVzHdkny6qWP5bUJWc970o6P2f5eKV83FHffWfjdMWrT0dIWiipU859f5lz+zaKVysGJ7dvoniFbJtk+V2KnyJpqThxnaUCP0al+J2VGarlx4wLvrIRQliu+OWaoYpfKjlW8Ytgy5LlkxW/uDTNzBZYft/qP0/xCzCPp7y7vKnil3PTzJU0X3FWeJvi5xAnpzyOVZIOUnyHYbril2z+mWxfkkYoFnW6pKdUu7Z0QxRfbNebfugZ
13FbSf9rsaPJy4onxtzL/Q2tjk0UW/eu+YL4/0j6aUguF5dSxnW8VNLbkt5K1v1rSYeHEBYky9dWRym+GJmqeAK7IoTwVDW3PU7xpPC+Yv3vlbRRsuwGSU8m45kg6f48HkeuixXf7Zmac4w5t8B1FQ11zE8I4SZJt0p6TXH/XibptGrvVEcyrqXMbDfFF4Zey9uGdmyV4gunOYovmAdI2jeEUPKPUbFP5mcd3icHSJqUvN55XPH5uzhn+dpq+ZqkLRT3rz9K+lkI4cuUx7FI8Ts1Ryvuw3MV/5aaJTc5VfH7MnMV3zy4OY/HUdUQxclira5QWS3v//2Vmb0m6foQQm0e2Nq20VRxZ/hBCGGFs7xCcTbfNasxlDvqWB7qoo7Jdt6SNMA7MJpZD8UXIU1CCCuzHEe5oo7lg2NreWCfLB91tE92lXR3CMH9t5zMrFLxysUeWY2h1Gr772z0N7MNzayxmQ1RvAw6tjhD84X4b29s4x1EURjqWB5KUUdJCiH0TnsHBvmjjuWDY2t5YJ8sHyXaJ2elTTTWFbX9FyW30ne9fKcpXvaZU+tRoa5Rx/JAHcsDdSwf1LI8UMfyQS1LoKgfowIAAACANWr1MSoAAAAASMNkAwAAAEAmqv3OhpnxGasSCiEU7R+Lo5alVaxaUsfSoo7lgTqWB86R5YN9sjyk1ZErGwAAAAAywWQDAAAAQCaYbAAAAADIBJMNAAAAAJlgsgEAAAAgE0w2AAAAAGSCyQYAAACATDDZAAAAAJAJJhsAAAAAMsFkAwAAAEAmGpd6AACAurHLLru4+Y033ujmrVu3dvOPPvrIzQ866CA3X7p0aQ1GB6BQXbp0cfMtttjCzY877jg3HzFihJvPnj27sIEB4soGAAAAgIww2QAAAACQCSYbAAAAADLBZAMAAABAJphsAAAAAMiEhRDSF5qlL8yQmbn5pptu6uZ9+/Z18wMOOMDN0x7zokWL3HzYsGFuvmzZMjcvlhCC/0QUoFS1THPMMce4+TXXXOPmnTt3dvO33nrLzXfccceCxpWVYtUy6zo+++yzbt6kSRM3//jjj918s802c/NRo0a5+T/+8Y+1D64eaCh17N27t5s/8sgjbt61a1c3//DDD918/fXXd/PPPvvMzXfaaSc3X7x4sZtnraHUMU3z5s3d/PLLL3fzhx56yM3HjRtXtDGVQjmfI9O6S+29995uPmDAADffZptt3Hz77bfP6/affvqpmxdLQ98nEaXVkSsbAAAAADLBZAMAAABAJphsAAAAAMgEkw0AAAAAmWCyAQAAACATjUu58datW7v5dddd5+ZHHXWUm0+ePNnN33nnHTdfvny5m5966qlufuWVV7r5jBkz3Bzf+dGPfuTmN998s5t/9dVXbj5x4kQ379atm5un/W198803bo6obdu2bv7Tn/7UzWfNmpXX+s8//3w3HzRokJvffvvtbl5dFz1I++23n5tvsskmbp7WWa9fv35untYd7tprr3XzVq1auXmpulE1dPvss4+bn3zyyW6+7777uvnbb7/t5ml1nDJlipunHbdXrlzp5uVsww03dPO0DoxptZwwYYKb33vvvW6+YMECN995553d/LjjjnPzrLtOrWsaN/ZfZqd1eGzfvr2b9+nTx83nzp3r5ltttZWbp+2T9913n5svXbrUzfPFlQ0AAAAAmWCyAQAAACATTDYAAAAAZILJBgAAAIBMMNkAAAAAkImSdqPadddd3XzVqlVu3qVLFzdfuHBhXttN65BUWVnp5nQwWrvtt9/ezceOHevmr776qpsfdNBBbn7SSSe5+bBhw9w87W8I1UvryPbKK6+4+TbbbOPmafvMiBEj3HzkyJFu/vTTT7v5559/7uaImjZt6uZm5uZXX321m6c9z2n5gAED1j441FhaHR955BE3v+uuu9z86KOPdvO0c15aZ5oNNtjAzS+55BI3P/fcc928HOywww5unrYvbbfddm6e1l3qvPPOc/OBAwe6+fDh
w908rRvVutgprBjSukWddtppbj548GA379WrV7GGlJe0c/Nrr73m5lOnTi3KdrmyAQAAACATTDYAAAAAZILJBgAAAIBMMNkAAAAAkAkmGwAAAAAyYSGE9IVm6QsbgA4dOrj5lClT3Hz06NFufuaZZxZtTPkIIfitYwpQrFrusccebv7444+7eVo3ld69e7v5jjvu6OY333yzmzdq1MjNzz77bDe/6qqr3DxrxaplqfbJAw44wM1vuOEGN99rr73cPK2zRY8ePdw8rYPLY4895uZZayh1THv+n3nmGTefP3++m6d17psxY0ZB46qtDTfc0M3nzp2b13oaSh3vvPNON0/bH7fccks3z7d7W9euXd38ww8/dPO0LnOXXXZZXtvNV308R2611VZu/tvf/tbN07paVVRUuPkLL7zg5mndkNJuX9/Ut32yRYsWbp52DN1tt92KsdlUace4//3f/3XztNdMkyZNcvNPPvnEzaubI6Tc3q0jVzYAAAAAZILJBgAAAIBMMNkAAAAAkAkmGwAAAAAywWQDAAAAQCYal3oAWTr++OPdvHnz5m5+3XXXZTmcstCkSRM3f/XVV938mmuucfPJkye7+SWXXOLmX3/9tZtPnz7dzS+//HI3/+qrr9z8lltucXNEY8eOdfMjjjjCzf/85z+7+SGHHJLXdrfZZhs3L1U3qoZi/Pjxbr7vvvu6+cMPP+zmaZ1sTj75ZDd/9NFH1z64HP369XPzSy+91M3TOsTstNNObr5y5cq8xlMqZ511lpsfddRRbv63v/3NzefNm1eU8bRr187NmzVr5uZp3QjXRWkdu4YOHermJ5xwgpu/9tprbt6+fXs332KLLdy8oXSjqm86d+7s5sXqOrV48WI3/9nPfubm48aNc/MVK1YUZTxZ48oGAAAAgEww2QAAAACQCSYbAAAAADLBZAMAAABAJphsAAAAAMiEhRDSF5qlL6xHWrVq5eYffPCBmz/zzDNu/vOf/7xoYyqGEIIVa10NpZY9evRw8zlz5rj5smXL3HyPPfZw85tuusnN+/Tp4+bz589383wVq5b1rY677rqrm6c9/6+88oqbp3VJGjVqlJvPmDFjrWPLQkOvY8+ePd38tNNOc/PTTz89r/U/9dRTbp7WdSqtM2Dafjd48GA3z7c7WX2r48iRI908rSPO4YcfXozNpnr77bfdfMGCBW7ev3//DEeTrhzOkR07dnTztGNcvp0Z33jjDTdP62g2bdo0N6/utWIx1Ld9smnTpm5+zz33uPlGG23k5g888ICb77777m6+dOlSN0/r/FjfpNWRKxsAAAAAMsFkAwAAAEAmmGwAAAAAyASTDQAAAACZYLIBAAAAIBONSz2AYhg2bJibm/nNDepb16l1UVoHsWJ1GXrppZfcPK3TxsCBA908rRsSohYtWrj5Zptt5uZHHnmkm++8885uPnbsWDcvVTeqhqJt27Zu/uKLL7r5xhtv7Ob5dqDZb7/98lpPWneyY4891s3T9t+G7oQTTijKejp16uTmacfb3r17u/kPfvADNz///PPd/NZbb3XzW265xc3TOkKuiw455BA3b9KkiZundfpLO+ZWVFS4edo5Mu311B133OHmq1evdvOGbvny5W6e9lohX61bt3bzcePGuXnauXbJkiVFGU/WuLIBAAAAIBNMNgAAAABkgskGAAAAgEww2QAAAACQCSYbAAAAADLRoLpRde3a1c3PPvtsNz/55JOzHA5qIa0TxtChQ918woQJRdnusmXL3DytWwuq99xzz+WVDx482M2vvvpqN3/44YfdfMiQIW7+xBNPuPm6ZpdddnHzjTbaKK/1pHXue/fdd918zJgxbr7VVlu5+ejRo928XLtOFUujRo3c/MEHH3Tzvn37unlax8Y0I0aMcPMHHnjAzRcsWJDX+tdFaV2Jmjdv7uYdOnRw8/Hjx7v5888/7+ZXXnmlm993331u/uGHH7r5v/71LzdH9dK6SPXq1cvNmzVrltd6
6huubAAAAADIBJMNAAAAAJlgsgEAAAAgE0w2AAAAAGSCyQYAAACATNTLblRNmzZ183vuucfNZ8+e7eZpXRVQehMnTnTzv/71r25+6KGHuvncuXPz2u7ixYvdPK1jB4rryCOPdPO0+r7xxhtu/uKLL7o53aiilStXFmU9y5cvd/MWLVq4eZcuXfJaf+PG9fIUVO+tWrXKzffcc083b9OmjZtPmTLFza+//no3T+tGhcJ9+eWXed3+jDPOcPPKyko3DyG4+aJFi9z8sMMOc/Pbb7/dzc8880w3nzp1qpsjStuHX3nlFTffa6+93DytE1x9w5UNAAAAAJlgsgEAAAAgE0w2AAAAAGSCyQYAAACATDDZAAAAAJAJS+tUIElmlr4wQx06dHDzmTNnuvnJJ5/s5g8//LCbDxgwwM2ffPJJN0/rYJS1EIIVa12lqmWarl27unlal6G0zg2/+c1v3HzXXXd181/+8pdu3rt3bzf/5JNP3DxfxaplfatjvpYtW+bmzZo1c/O0rkefffaZm7dt27awgdVQQ6ljWpena665xs3T9ot8u0WZ+U/PkiVL3HynnXZy8/fffz+v7earodSxWIYMGeLm//znP928V69ebp52Di6VcjhHph3jxowZ4+YVFRVufvXVV7v5euv57ylPnjzZzQ866CA37969u5v369fPzfO1ru2TaV544QU3v+qqq9y8vnWjSqsjVzYAAAAAZILJBgAAAIBMMNkAAAAAkAkmGwAAAAAywWQDAAAAQCbyazVSR1q1auXmLVu2dPM2bdq4+aRJk9z897//vZuXquvUumjWrFluft5557n5JZdc4ub333+/my9dutTNzznnHDcvVtcpVO+mm25y89dff93Nt9xySzc/66yzijamcrRy5Uo3T+vc97vf/c7NTznlFDdP6yI1f/58Nz/33HPdfM6cOW6O4ho8eLCbT5gwwc3rW9epcpbWqe3www93806dOrn5DTfc4ObvvPOOm2+22WZufumll7r5u+++6+aoXqNGjdw8rUtYWmfMtNc0DQVXNgAAAABkgskGAAAAgEww2QAAAACQCSYbAAAAADLBZAMAAABAJiyEkL7QLH1hhho39ptkPffcc27es2dPN//Vr37l5o899lhhA6tjIQQr1rpKVctiad++vZu3a9fOzZctW+bmn3/+uZuvXr26oHHVVLFq2dDr2KRJEzfv3Lmzmy9fvtzNv/rqKzev7nhWDNSxPKxrdRw+fLibV1RU5JXXN5wjy0e57pNp+97xxx/v5mmvddq2bevmq1atKmRYmUmrI1c2AAAAAGSCyQYAAACATDDZAAAAAJAJJhsAAAAAMsFkAwAAAEAm6mU3KkR02igf5dppY11DHcsDdSwPnCPLR7nuk3vssYeb33HHHW5++eWXu/k111xTtDFliW5UAAAAAOoUkw0AAAAAmWCyAQAAACATTDYAAAAAZILJBgAAAIBMVNuNCgAAAAAKxZUNAAAAAJlgsgEAAAAgE0w2AAAAAGSCyQYAAACATDDZAAAAAJAJJhsAAAAAMsFkAwAAAEAmmGzkycx6mFkws8alHgsKZ2aVZvZSqceB2qGO5cHMKsxsVqnHgdrjHFkeqGN5qC/H1swmG7V5gBZdZGazzexrMxtvZtsVe4ylYGYdzeyLhvICqZZ1rDSzVWb2Tc5PRXFHWBpm1j85EF9U6rHURG0POGb2azOba2YLzewmM2tWzPHVNTObYWZLcv4unyr1mGqilvvj9mb2pJnNM7Oy+NdczWyMmc1J/i6nmNkvSz2mmuIc+X1mdpmZfZLUcqaZnVvqMdUEdfw+M3uvyjl/pZk9UupxrQ3H1v9kZkeb2Qdm9q2ZfWRmexa6rvp6ZeMISb+QtKekjpJekTQ6iw2VYNZ+qaQP6nibpfRKCKF1zs/4Ym8gOWDX2d+ymTWR9BdJr9XVNkvJzPaX9FtJAyR1l9RT0ogMtlOndZR0cM7f5X51uN1SWSHpbklDs96QmTXK
ehuJSyT1CCG0lXSIpIvM7Ed1tO1SKsdz5I2Stk5qubukQWZ2WB1tu1TKro4hhO3WHFcltZH0iaR76mLbJVR2x1Yz21fx9erPFevYT9K0QtdXqxO7mf3QzCaa2SIzu8fM7kpm6a0kPSFp45zZ7cZ5rHozSS+FEKaFEFZJGiNp2xqOac2lv1+Z2afJu17DcpYPN7N7k3fEFkqqNLN2ZnZjctvZyWNolNy+kZldkcxYp0n6SR6Po+rYdpe0vaSbC11HFjKsY23GVGlmL5vZdck7PpPNbEDO8vFm9kcze1nSYkk9zWxrM3vazL4ysw/N7Mic23cys4eTd81el7R5LYZ3lqSnJE2uxTqKLsM6DpF0YwjhvRDCfEkXSqqs4Zjqcx3rpazqGEL4MIRwo6T3ChhThZnNMrNzk2PhDDMblLN8lJn93cweN7NvJe1lZhub2X0Wr+RON7PTcm7fIrnPfDN7X9LO+Y4peUzvhRCWrfnf5Kfe/E1wjqy55O/z25xotaRehayr2KhjwfpJ6izpviKsq9Y4tuZlhKQLQgivhhBWhxBmhxBmF7guKYRQ0I+kppJmSjpdUhNJh0laLumiZHmFpFlV7vPfkhZU89MtuV13SW9K2jJZ92WSHqzhuHoonnDukNRK0n9J+kLSPsny4Yqz0J8qTrZaSHpA0sjk9htIel3SCcntT1R8Ubmp4jsPzyXrb5ws/1s1j2dSzrgaSZog6UeKL9ReKvS5L+ZPxnWslPStpHmSpkg6b83zVoNxVUpaKenXybiOkvS1pI7J8vGSPpa0naTGktopvoPy8+T/d0y2u21y+zsV33lopTjhm51bA0mTqnk8f8u5XffksbSWNGrN81Tqn4zr+Lako3Lu11lxH+jUgOs4Q9JniseGpyTtUOoaZl3HnNv3khTyHFdFUsc/S2omqb/ivr1VsnxUUte+isfVlorH8D8kj6mn4rti+ye3/5OkFxWPqZtKejf3cUl6tJrH82iVsf1NcaIaFI+xrUtdxzrYJ8vuHJnc9reSvknuP01SV+rY8OqYM8abJI0qdQ2zrmPO7cvi2Kr4enW54v44VdIsSddJalHw81+LwvVTPNFbTvZSdYXL84/iL8kf+kpJ0yVtlucOuHVOdpniO7NS3AFfyFnWRdKy3CdR0jGSnkt+f1bSiTnL9lPODpjHY/q1pL8nv1eq/kw2sqxjT8V3btZTPBC+L+mcGt63UtKnVcb1uqTBye/jFWfda5YdJenFKusYKen8ZMdZUeVv4uJCaiDpISUvvFW/JhtZ1vEjSQfk/H+TZB/o0YDr2Ffx5NtS0jmS5kpqX851zFlfbU6IrXKyuyWdl/w+StKtOcv6SPq4yjrOkXRz8vu0Kn9Tv6rN40r+NvaQ9HtJTUpdx6xrqTI8R+bc3xTfZBghqQ11bLB1bClpoaSKUtcw6zrmrK8sjq2SNk5q/y9JGym+wfiypD8W+tzU5mNUG0uaHZKRJT6pxfpy/UHx0s+mkporHnSeNbOWeawjdywzFcfrLeuu+OJpjpktMLMFii9uNkiWb+ysKy/J5bjTJP0u3/vWgczqGOKl4ekhXoJ7R9IFkn6WxyqqjmttdeyzpoZJHQdJ2lDS+orvkte2jgcrnvzuyve+dSDL/fEbSW1z/n/N74tqeP96VUdJCiG8HEJYEkJYHEK4RPFdnYK//FZEWdaxtuaH73/MZW113LhKHc9VfMEjFeG4miuEsCqE8JKkrpJOqs26iohzZAFCNFHSEmXw3bACUMfCHCbpK0nP13I9xcKxteaWJP+9NoQwJ4QwT/HKy4EFrEtS7b6zMUfSJmZmOdmmOb+HKreXmQ2y73cpqPrTLblpb0l3hRBmhRBWhhBGSeqgGn6W0RlLN8V3V72xfaI42+8cQmif/LQNIazpCDHHWVfuY7q+msez5vN7uyjODt83s7mK72TsYrG7T119kTJNlnWsKii+a1VTVce1tjo+n1PD9iF+Se0kxUvLK1V9Hat20Mj9uT65
2QBJOyV1m6v4LvwZZvZQHo8pK1nW8T1JO+TcdQdJn4UQvqzh2OpbHT35/m1mpS73x3x1SD7bvMba6ji9Sh3bhBDWnKzWdlx9oprH80Q1Y2ys+vOdDc6RqvE50lNfakkdVVAdhyi+I/8fz0+JcGxVzY6tIX43c1aV7daujoVeElG8/PexpP9RPCgM1Pc//7a14uyoXQHrPl/x8lYXxQnRYMXPsLVPlg+XND7lvj2SJ+U2xct420n6XNJ+OfcdU+U+DylOANom29tcUv9k2UmKH//pqngQeEZ5XlpU/Bzehjk/pyt2Mtqw0Oe/WD8Z1/HHkrrkrOddSefnLB8vaXjKfSsVX1iervhuzBGKl2Q75dz3lzm3b6M4gx+c3L6J4jtG2yTL71L8vH9LxQP5LOX58ZtkG7l1vEvSVUq+f1DGdTxA8WNG20pqr3i5/U8NuI7dFD9G1VTx3cTfKE5k1vodlAZeR0se77aKx7DmkprlLB+llM9X67tL/VckY9xT8Zi8dc59L8q5/ZrvqP0/xY+rNVL8js3OyfJLFd/x7KB4bJ2k/C/1byDpaMXvTzWStH8ypkNKXcc6qGW5nSPXk3RCcn9TfINujqTTqGPDqWPOdroqHi82L3X96qiOZXVsTdZzgaQ3FI+zHRS/B3Jhoc9/wVc2QgjLFS+TDVX8CMKxil88WZYsn6z4xaVpFi/15NOh4VLFL6W+laz715IODyEsSJZvqvj5seo8r/jFlmckXRFCqK6P/nGKRX5f0nxJ9ypeiZCkGyQ9mYxngqT783gckqQQwrIQwtw1P4pf9lmR/F5SGddxgKRJFjsoPK743F2cs3xtdXxN0haKXxD+o6SfhZR300MIixQ/Y3q04jsCcxX/jtb8exCnKr4omau4896cx+P4v21UqeMSSd+GEL7Kd13FlmUdQwhjFT8L/JziwXqm4klyjQZVR8UJzd8V9/XZipOpH6eNqS5lvD92V/ybXfMu5BJJH+YsX1sd5yo+Z58qvsA5MRmP9zhWSTpI8Z3b6Yq1/6diAwApflxkZrLsKRXW7jMovkCalYzrCklnhBAeLmBdRcc5Mm+HKn4/bJFiV6Zrk5+Soo4FGazY9v6jWqyjqDi25u1CxcnGFMV/rmGi4vm7IJbMYIrCzF6TdH0IoZAXAPls5y1JA7wXB2bWQ/FJbhJCWJnlOMpVXdTRzLpKujuEsHvK8krFd7z3yGoM5Y46loc6qmNTxRcZPwghrHCWVyi+S9o1qzGsCzhHlgfqWB44ttad2v47G/3NbEMza2xmQyT9QNLY4gwtXQihd314F7JclKKOIX5G1X2BisJQx/JQojouDyFs450MUTjOkeWBOpYHjq2lU9t/UXIrfdfzfprixyPm1HpUqGvUsTxQx/JAHcsHtSwP1LE8UMcSKerHqAAAAABgjVp9jAoAAAAA0jDZAAAAAJCJar+zYWZ8xqqEQghF+0fGqGVpFauW1LG0qGN5oI7lgXNk+WCfLA9pdeTKBgAAAIBMMNkAAAAAkAkmGwAAAAAywWQDAAAAQCZq+4/6AQCABmi99fz3GzfZZBM3//zzz9182bJlRRsTgPLDlQ0AAAAAmWCyAQAAACATTDYAAAAAZILJBgAAAIBMMNkAAAAAkAm6UQFYq7SuNX/+85/d/Ouvv3bzG2+80c2ffvppN2/Xrp2b33nnnW5+xhlnuDmK65RTTnHzE044wc379u3r5osWLSramJCuSZMmbn7bbbe5+aGHHurmr776qpvvueeehQ0MwPdsvfXWbn777be7+eDBg938vffeK9qYioErGwAAAAAywWQDAAAAQCaYbAAAAADIBJMNAAAAAJlgsgEAAAAgEyXtRrX55pu7+eTJk918hx12cPP333+/aGNCaXXv3t3NTzzxRDfv06ePm++4445uvmLFCjefOHGim48bN87Nr732WjdfunSpmzcUafvkgw8+6OZDhw518zfeeMPNBw4c6Ob9+/d382+++cbNr7rqKjcf
NGiQm6d13UH10upy8cUXu3nbtm3dfO+993bzhx56qLCBwZX2/KftvxUVFW4eQnDzjh07uvn666/v5l988YWbl7O99trLzYcMGeLmb7/9tptPnz69KONJO+c99dRTed0exdW0aVM3P+ecc9x82223zSunGxUAAACAdQKTDQAAAACZYLIBAAAAIBNMNgAAAABkgskGAAAAgExYWtcJSTKz9IVFkNb5ZurUqW4+cuRINz/ppJPcvLrHlo/WrVu7eVqnnGIJIVix1pV1LdOkdUd55JFH3HyPPfZw8/XW8+fFq1evdvNZs2a5+ccff+zmb731lpsvXrzYzf/0pz+5+fz58928WLUsVh27devm5mPHjnXzyspKN3/99deLMZy8bbjhhm7+5JNPunlaJ7t81bc6FkuvXr3cPK1bVFoHlLRj7m677ebmr732Wg1GV3wNvY5px8Obb77ZzQcPHpzlcFI736R11nn00UeLst36eI5MO5cU6xhUKsOGDXPzK6+8sijrb+j7ZL722WcfN0/rEpb2mqZnz55uvnLlysIGVktpdeTKBgAAAIBMMNkAAAAAkAkmGwAAAAAywWQDAAAAQCaYbAAAAADIRONSbvyjjz5y88MOO8zNf/GLX7j5WWed5eZXXHFFXuNp3Nh/OoYOHermf/nLX/Ja/7ooretUv3793Dytg0Jat5O0LkkTJkxY++DWYWkd3O6++243L1XXqTRz585187RuY4iOPPJIN7/11lvdvEmTJm6+dOlSN2/atKmbz5kzpwajQ0116tTJzdPOnStWrHDzCy64wM3feecdN7/tttvcfLvttnPz008/3c2L1Y2qPrr88svdPK2DYbt27dy8TZs2eW131apVbv7pp5+6eVqnyEsuucTNi9V1ClHaOdjMb8qVdsxtKLiyAQAAACATTDYAAAAAZILJBgAAAIBMMNkAAAAAkAkmGwAAAAAyUdJuVGkeeOABN3/hhRfcfMyYMW6ebzeqCy+80M3ff//9vNazLkrrRtK3b9+81pPWceyuu+5y8+XLl+e1fkRHH320m++11151PJLCtG7d2s0rKirqdiANzM477+zmaV2nTjjhBDfffPPN3fzss88ubGDIS8uWLd28VatWbn7HHXe4+XXXXefmG220kZundTxK8/jjj+d1+3KQ1rHrzjvvdPMNNtjAzSdNmuTmnTt3zms8PXr0cPO0blRpr4NOPPFEN7/++uvzGg+iH/7wh24eQnDztC5naR086xuubAAAAADIBJMNAAAAAJlgsgEAAAAgE0w2AAAAAGSCyQYAAACATNTLblRpvvzySzc/55xz3LxPnz5uvsUWW7j5UUcd5eaHHnpoDUa3bujfv7+bjx492s0bNWrk5jfccIObp3VNaSgdFxqK1atXu/mMGTPqdiAF2nvvvd18ypQpdTyShiWtW9Rll13m5l988YWbp3UMRN045JBD8rr9kUce6eb77ruvm+fb8WjcuHFuntYpcl2U1slrzpw5br7tttu6+Y033ujmBx98sJun7du9evVy84EDB7r5yJEj3RzV23XXXd28S5cubr5o0SI3nzhxYtHGVApc2QAAAACQCSYbAAAAADLBZAMAAABAJphsAAAAAMgEkw0AAAAAmWhQ3ajSvPfee27+8ssvu/n222/v5kOHDnXzt99+u7CBlaH777/fzTt27JjXejbZZBM3v/fee9185syZbv7mm2+6+T333OPmS5YsqcHoyl9ax4uG4je/+Y2bX3fddXU8koYlhODmaV2n0rRr187N07rGrVixIq/1I0rrDJTWYSjNRx995OYtW7Z087S/kzRp59p58+bltR58J22fHD58uJundaM666yz8trurbfe6uYnnnhiXutBtP/++7t58+bN3Xz58uVuvnjx4qKNqRS4sgEAAAAgE0w2AAAAAGSCyQYAAACATDDZAAAAAJAJJhsAAAAAMlEW3ajOPPNMN//Rj37k5lOnTnXzH//4x27+4IMPuvm62Nko
rQtNvg488MCirCfNX//6Vzc/9thj3fyRRx5x89WrVxdtTPVJWnegLl26uPlnn32W5XBSde/e3c3T/g5vu+22LIeDRFr3ob59+7q5mWU5nAavUaNGbj5s2DA3b9asmZu/8MILbn7EEUe4eVpXwFdffdXNmzRp4ubnn3++m6d1ckw7p2LtPv74YzdP6zjWs2dPN0875w0ZMqSwga3j0rpLDRo0yM3TOr59+umnbv7vf/+7sIHVE1zZAAAAAJAJJhsAAAAAMsFkAwAAAEAmmGwAAAAAyASTDQAAAACZKItuVGmdOe688043T+tIlLaeZcuWFTawMnTKKae4eVrHi3yldUNK69aSVsvWrVu7+X333efmO+20k5u/9dZbbt7QTZw40c33228/Nx89enSWw1GHDh3cfMaMGW5+zDHHuPny5cuLNSRU41//+pebpz3/S5cuzXI4DV7v3r3dfLfddnPztG5yp59+upsvXLjQzc877zw3TzvepnUVS8vXxY6NWbvlllvcfPPNN3fztNpXVlYWa0hQ+j7Qpk2bvNaz2WabuXlap8Wjjz7azetbJ02ubAAAAADIBJMNAAAAAJlgsgEAAAAgE0w2AAAAAGSCyQYAAACATDSoblSbbLKJm3/11Vdufvzxx7t5CMHN0zqmpHU2+uabb9y8nI0cObIk211vPX9efOutt7r5o48+6uZNmzZ183WtG9Vll13m5g899JCbp3XCyLfjRY8ePdz8H//4h5tfeOGFbn7PPffktV1EzZs3d/NddtnFzTfYYAM3HzhwoJunHStvuOEGN3/jjTfc/NJLL3XztGN3Q5dvl6fGjf1Td9p+sdFGG7l5q1at3DzteV65cqWb/+QnP3Hzp59+2s2xdmmdAQ888MC81pNWy3LdlxqKtH07rStqixYtshxO5riyAQAAACATTDYAAAAAZILJBgAAAIBMMNkAAAAAkAkmGwAAAAAy0aC6US1fvtzNX3rpJTfv1KmTm6d10EnrdnX55Ze7+WGHHebmWLtGjRq5+aBBg9z8iCOOcPODDjoor+0uXLjQzd9999281tPQTZ061c0vuOACN//73//u5qNHj3bz3Xff3c1PPvlkN//ggw/c/I9//KObr1q1ys0RpR3LnnrqKTffdttt81p/Wie+L7/80s27devm5uuvv76bX3311W6e1jGwoUv7+580aZKb77DDDm7eq1cvN//ss8/c/LnnnnPzKVOmuPmrr77q5nSdKr7+/fuXegjIw4oVK9x84sSJbr7//vu7edo+lvYaKN+OkKXClQ0AAAAAmWCyAQAAACATTDYAAAAAZILJBgAAAIBMMNkAAAAAkIkG1Y3qiy++cPNvv/3Wzd988003b9zYf9iTJ0928zPOOGPtg0Ne0roJtWnTxs3TuhvNnDnTzV9++WU3T6tl2t/Wuuauu+5y87TuQ4899pibp9V3yJAhbv7II4/UYHSoqQULFrj5uHHj3LxFixZuPmLECDe/77773Dzt7wTVSzuH7bjjjnU8EtQXaZ0Z85XWiWzJkiVFWT+ilStXuvlJJ53k5pWVlW5+5513unlD78THlQ0AAAAAmWCyAQAAACATTDYAAAAAZILJBgAAAIBMMNkAAAAAkAkLIaQvNEtf2AC0bdvWzVu2bOnm8+bNc/O0LgNZCyFYsdbV0GuZ1qUqrUPDihUrshxO3opVy4Zex4aOOpYH6lgeyvkc+Yc//MHNTz31VDf/4IMP3Pzggw9284ULFxY2sIywT5aHtDpyZQMAAABAJphsAAAAAMgEkw0AAAAAmWCyAQAAACATTDYAAAAAZKKsu1E1dOXcaWNdQ6eN8kAdywN1LA+cI8sH+2R5oBsVAAAAgDrFZAMAAABAJphsAAAAAMgEkw0AAAAAmWCyAQAAACAT1XajAgAAAIBCcWUDAAAAQCaYbAAAAADIBJMNAAAAAJlgsgEAAAAgE0w2AAAAAGSCyQYAAACATPx/hhrZqRZdoeQAAAAASUVORK5C
YII=\n", 105 | "text/plain": [ 106 | "
" 107 | ] 108 | }, 109 | "metadata": { 110 | "needs_background": "light" 111 | }, 112 | "output_type": "display_data" 113 | } 114 | ], 115 | "source": [ 116 | "grid_mistakes(Xte, Yte) # test set mistakes" 117 | ] 118 | } 119 | ], 120 | "metadata": { 121 | "kernelspec": { 122 | "display_name": "Python 3 (ipykernel)", 123 | "language": "python", 124 | "name": "python3" 125 | }, 126 | "language_info": { 127 | "codemirror_mode": { 128 | "name": "ipython", 129 | "version": 3 130 | }, 131 | "file_extension": ".py", 132 | "mimetype": "text/x-python", 133 | "name": "python", 134 | "nbconvert_exporter": "python", 135 | "pygments_lexer": "ipython3", 136 | "version": "3.8.12" 137 | } 138 | }, 139 | "nbformat": 4, 140 | "nbformat_minor": 4 141 | } 142 | --------------------------------------------------------------------------------