├── LICENSE ├── README.md ├── fast_mnist └── README.md ├── feature_density ├── README.md └── density.py ├── imagenet_dogs_vs_notdogs ├── README.md ├── dataset.py ├── imagenet.txt ├── imagenet_dogs.txt ├── imagenet_notdogs.txt └── imagenet_notdogs_subset.txt ├── imagenot ├── README.md ├── in-domain.txt └── out-domain.txt ├── minimal_cifar ├── README.md └── train_cifar.py └── subset_of_imagenet ├── README.md └── subset_imagenet.py /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2021 Joost van Amersfoort 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | This is a collection of PyTorch snippets that have helped me (and others) significantly with doing ML research. 
2 | 3 | * [Minimal CIFAR-10](minimal_cifar): a very simple training script to get 94% accuracy with just ~150 lines of code, for an easy but strong baseline. 4 | * [FastMNIST](fast_mnist): a drop-in replacement for the standard MNIST dataset which speeds up training on the **GPU** (by 2-3x!) by avoiding unnecessary preprocessing that pegs the CPU at 100% for small models. 5 | * [Subset of ImageNet](subset_of_imagenet): it's remarkably difficult to train on a subset of the ImageNet classes with the default TorchVision datasets. This snippet makes the minimal changes to them to make it very easy. 6 | * [ImageNet dogs vs not dogs](imagenet_dogs_vs_notdogs): a standardized setup for the ImageNet Dogs vs Not Dogs out-of-distribution detection task. 7 | 8 | In a separate repository, [Slurm for ML](https://github.com/y0ast/slurm-for-ml), I explain how I use `slurm` job arrays without pain using a simple one-file shell script. 9 | -------------------------------------------------------------------------------- /fast_mnist/README.md: -------------------------------------------------------------------------------- 1 | # Fast MNIST 2 | 3 | The [PyTorch MNIST dataset](https://pytorch.org/docs/stable/torchvision/datasets.html#mnist) is **SLOW** by default, because it wants to conform to the usual interface of returning a PIL image. This is unnecessary if you just want a normalized MNIST and are not interested in image transforms (such as rotation, cropping). By folding the normalization into the dataset initialization you can **save your CPU and speed up training by 2-3x**. 4 | 5 | The bottleneck when training on MNIST with a GPU and a small-ish model is **the CPU**. In fact, even with six dataloader workers on a six core i7, the GPU utilization is only ~5-10%. Using FastMNIST increases GPU utilization to ~20-25% and reduces CPU utilization to near zero. On my particular model the steps per second with batch size 64 went from ~150 to ~500. 
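If you want to check the throughput numbers on your own hardware, a timing harness along these lines works. This is a minimal sketch: random tensors stand in for MNIST so it runs anywhere without a download, and the MLP is just a hypothetical placeholder model.

```python
import time

import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for (Fast)MNIST: 1000 fake 1x28x28 "images", already normalized
# and already placed on the target device, mimicking what FastMNIST does.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
data = torch.randn(1000, 1, 28, 28, device=device)
targets = torch.randint(0, 10, (1000,), device=device)
dataset = TensorDataset(data, targets)

# num_workers=0, as in the snippet: the data is already a device tensor,
# so worker processes would only add overhead.
loader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=0)

# Placeholder small MLP, roughly the "smallish model" regime discussed above.
model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(28 * 28, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 10),
).to(device)

start = time.perf_counter()
steps = 0
for img, target in loader:
    loss = torch.nn.functional.cross_entropy(model(img), target)
    loss.backward()
    model.zero_grad()
    steps += 1
elapsed = time.perf_counter() - start
print(f"{steps / elapsed:.0f} it/s over {steps} steps")
```

Swapping the `TensorDataset` for the default `MNIST` (with `ToTensor` + `Normalize` transforms) versus `FastMNIST` in this loop is what produces the ~150 vs ~500 it/s difference reported above.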
6 | 7 | Instead of the default MNIST dataset, use this: 8 | 9 | ``` 10 | import torch 11 | from torchvision.datasets import MNIST 12 | 13 | device = torch.device('cuda') 14 | 15 | class FastMNIST(MNIST): 16 | def __init__(self, *args, **kwargs): 17 | super().__init__(*args, **kwargs) 18 | 19 | # Scale data to [0,1] 20 | self.data = self.data.unsqueeze(1).float().div(255) 21 | 22 | # Normalize it with the usual MNIST mean and std 23 | self.data = self.data.sub_(0.1307).div_(0.3081) 24 | 25 | # Put both data and targets on GPU in advance 26 | self.data, self.targets = self.data.to(device), self.targets.to(device) 27 | 28 | def __getitem__(self, index): 29 | """ 30 | Args: 31 | index (int): Index 32 | 33 | Returns: 34 | tuple: (image, target) where target is index of the target class. 35 | """ 36 | img, target = self.data[index], self.targets[index] 37 | 38 | return img, target 39 | ``` 40 | 41 | And call the dataloader like this: 42 | 43 | ``` 44 | from torch.utils.data import DataLoader 45 | 46 | train_dataset = FastMNIST('data/MNIST', train=True, download=True) 47 | test_dataset = FastMNIST('data/MNIST', train=False, download=True) 48 | 49 | # num_workers=0 is very important! 50 | train_dataloader = DataLoader(train_dataset, batch_size=64, shuffle=True, num_workers=0) 51 | test_dataloader = DataLoader(test_dataset, batch_size=10000, shuffle=False, num_workers=0) 52 | ``` 53 | 54 | This results in a 2-3x speedup (~500 it/s on a 1080 Ti with a smallish MLP) and near-zero CPU usage (compared to full CPU usage normally). 55 | -------------------------------------------------------------------------------- /feature_density/README.md: -------------------------------------------------------------------------------- 1 | # Feature Density 2 | 3 | Several feature density methods are used in the literature for uncertainty estimation. 4 | This snippet allows computing them easily. 
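The underlying idea, as a self-contained toy sketch with plain `torch.distributions` (2-D fake embeddings, not the snippet's exact API): fit one Gaussian per class to the training embeddings, then score a new embedding by its log density — in-distribution points score high, far-away points score low.

```python
import torch

torch.manual_seed(0)

# Toy "embeddings": two classes, 2-D features, centered at +3 and -3.
emb = torch.cat([torch.randn(100, 2) + 3, torch.randn(100, 2) - 3])
labels = torch.cat([torch.zeros(100, dtype=torch.long), torch.ones(100, dtype=torch.long)])

# One Gaussian per class (the GDA flavor): class mean + class covariance.
gmms = []
for c in labels.unique():
    x = emb[labels == c]
    mean = x.mean(0)
    cov = torch.cov(x.T) + 1e-6 * torch.eye(2)  # small jitter for invertibility
    gmms.append(torch.distributions.MultivariateNormal(mean, covariance_matrix=cov))

def score(x):
    # Log-sum-exp over the per-class log densities.
    logps = torch.stack([g.log_prob(x) for g in gmms])
    return torch.logsumexp(logps, dim=0)

in_dist = score(torch.tensor([[3.0, 3.0]]))   # near a class mean
far_away = score(torch.tensor([[30.0, 30.0]]))  # OOD-like point
assert in_dist > far_away
```

The snippet below generalizes this with shared/diagonal covariance options and an automatic jitter search.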
5 | 6 | It takes special care to be numerically stable and will search for the minimum jitter value necessary for the inversion. The output of this function can be used with `log_prob` to compute feature densities for new embeddings. 7 | 8 | This snippet computes all flavors of feature density: 9 | - LDA: as suggested in "[A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks](https://arxiv.org/abs/1807.03888)" - also known as Mahalanobis distance 10 | - GDA: as suggested in "[Deep Deterministic Uncertainty: A Simple Baseline](https://arxiv.org/abs/2102.11582)" - also known as DDU (without spectral norm) 11 | - Marginal: as suggested in "[A Simple Fix to Mahalanobis Distance for Improving Near-OOD Detection](https://arxiv.org/abs/2106.09022)" - also known as relative Mahalanobis distance if combined with LDA. 12 | 13 | Credit to [Andreas](https://github.com/BlackHC) and [Jishnu](https://github.com/omegafragger) for parts of this snippet! 14 | -------------------------------------------------------------------------------- /feature_density/density.py: -------------------------------------------------------------------------------- 1 | import torch 2 | 3 | DOUBLE_INFO = torch.finfo(torch.double) 4 | JITTERS = [0, DOUBLE_INFO.tiny] + [10 ** exp for exp in range(-308, 0, 1)] 5 | 6 | 7 | def centered_cov(x): 8 | n = x.shape[0] 9 | # Pre-multiply here to try to avoid overflow. 
10 | x = (n - 1) ** (-0.5) * x 11 | 12 | # NOTE: on CPU with float16, mm is only single core 13 | res = x.t().mm(x) 14 | return res 15 | 16 | 17 | # Default parameters compute GDA 18 | # Set shared_cov to True for LDA 19 | # Set shared_mean/shared_cov true for "marginal" feature density, to be able to compute a relative score 20 | # Use diagonal_cov = True for Naive Bayes type classifier 21 | def fit_gmm( 22 | embeddings, # Shape Dataset size by Features 23 | labels=None, # Shape Dataset size (integer class indices) 24 | shared_mean=False, 25 | shared_cov=False, 26 | diagonal_cov=False, 27 | ): 28 | if not (shared_mean and shared_cov): 29 | assert ( 30 | labels is not None 31 | ), "labels are required in all cases except double shared." 32 | 33 | embeddings = embeddings.cuda() 34 | labels = labels.cuda() if labels is not None else None 35 | 36 | with torch.no_grad(): 37 | classes = torch.unique(labels, sorted=True) if labels is not None else None 38 | 39 | if shared_mean: 40 | mean_features = torch.mean(embeddings, dim=0) 41 | else: 42 | mean_features = torch.stack( 43 | [torch.mean(embeddings[labels == c], dim=0) for c in classes] 44 | ) 45 | 46 | if shared_cov: 47 | if shared_mean: 48 | if diagonal_cov: 49 | cov_features = torch.var(embeddings, dim=0, unbiased=True) 50 | else: 51 | cov_features = centered_cov(embeddings - mean_features) 52 | else: 53 | if diagonal_cov: 54 | cov_features = ( 55 | torch.stack( 56 | [ 57 | torch.var(embeddings[labels == c], dim=0, unbiased=True) 58 | for c in classes 59 | ] 60 | ) 61 | .mean(0, keepdim=True) 62 | .expand(len(classes), -1) 63 | ) 64 | else: 65 | cov_features = ( 66 | torch.stack( 67 | [ 68 | centered_cov(embeddings[labels == c] - mean_features[i]) 69 | for i, c in enumerate(classes) 70 | ] 71 | ) 72 | .mean(0, keepdim=True) 73 | .expand(len(classes), -1, -1) 74 | ) 75 | else: 76 | assert ( 77 | not shared_mean 78 | ), "shared mean + independent covariance is not supported" 79 | if diagonal_cov: 80 | cov_features = torch.stack( 81 | [ 82 | torch.var(embeddings[labels == c], dim=0, 
unbiased=True) 83 | for c in classes 84 | ] 85 | ) 86 | else: 87 | cov_features = torch.stack( 88 | [ 89 | centered_cov(embeddings[labels == c] - mean_features[i]) 90 | for i, c in enumerate(classes) 91 | ] 92 | ) 93 | 94 | mean_features = mean_features.cpu().float() 95 | cov_features = cov_features.cpu().float() 96 | 97 | assert torch.all( 98 | torch.isfinite(cov_features) 99 | ), "Covariance contains inf or nan." 100 | if diagonal_cov: 101 | # This could be a MVN combined with: 102 | # https://pytorch.org/docs/stable/distributions.html#independent 103 | gmm = [ 104 | torch.distributions.Normal(mean_features[i], cov_features[i].sqrt()) 105 | for i in range(len(classes)) 106 | ] 107 | return gmm, 0 108 | 109 | fit = False 110 | for jitter_eps in JITTERS: 111 | try: 112 | jitter = ( 113 | jitter_eps 114 | * torch.eye( 115 | cov_features.shape[1], 116 | device=cov_features.device, 117 | ).unsqueeze(0) 118 | ) 119 | gmm = torch.distributions.MultivariateNormal( 120 | loc=mean_features, 121 | covariance_matrix=(cov_features + jitter), 122 | ) 123 | except (RuntimeError, ValueError) as e: 124 | # Numerical issues -> increase jitter 125 | last_error = str(e) 126 | continue 127 | 128 | fit = True 129 | break 130 | 131 | assert fit, f"Creating gmm failed with jitter={jitter_eps} and: {last_error}" 132 | 133 | return gmm, jitter_eps 134 | -------------------------------------------------------------------------------- /imagenet_dogs_vs_notdogs/README.md: -------------------------------------------------------------------------------- 1 | ## ImageNet Dogs vs Not Dogs 2 | 3 | - [imagenet.txt](imagenet.txt) -> contains all 1000 classes from ImageNet 4 | - [imagenet_dogs.txt](imagenet_dogs.txt) -> contains 118 dog classes in ImageNet (obtained from https://arxiv.org/abs/1606.04080) 5 | - [imagenet_notdogs.txt](imagenet_notdogs.txt) -> contains 882 not dog classes in ImageNet 6 | - [imagenet_notdogs_subset.txt](imagenet_notdogs_subset.txt) -> contains 118 not dog classes in 
ImageNet (subset of above) 7 | 8 | [imagenet.txt](imagenet.txt) created using: 9 | 10 | ``` 11 | ls train > imagenet.txt 12 | ``` 13 | (train is the folder inside `ILSVRC2017_CLS-LOC.tar.gz`) 14 | 15 | [imagenet_notdogs.txt](imagenet_notdogs.txt) created using: 16 | 17 | ``` 18 | comm -3 <(sort imagenet.txt) <(sort imagenet_dogs.txt) > imagenet_notdogs.txt 19 | ``` 20 | 21 | [imagenet_notdogs_subset.txt](imagenet_notdogs_subset.txt) created using: 22 | 23 | ``` 24 | shuf -n 118 imagenet_notdogs.txt > imagenet_notdogs_subset.txt 25 | ``` 26 | 27 | ### Create dataset on disk 28 | 29 | In a folder with the `train/` and `val/` ImageNet sub-folders: 30 | 31 | ``` 32 | mkdir -p imagenet_notdogs/train 33 | mkdir -p imagenet_notdogs/val 34 | cat imagenet_notdogs.txt | xargs -I{} cp -R train/{} imagenet_notdogs/train/ 35 | cat imagenet_notdogs.txt | xargs -I{} cp -R val/{} imagenet_notdogs/val/ 36 | 37 | mkdir -p imagenet_dogs/train 38 | mkdir -p imagenet_dogs/val 39 | cat imagenet_dogs.txt | xargs -I{} cp -R train/{} imagenet_dogs/train/ 40 | cat imagenet_dogs.txt | xargs -I{} cp -R val/{} imagenet_dogs/val/ 41 | ``` 42 | 43 | (`val/` can be obtained using [Soumith's valprep.sh script](https://raw.githubusercontent.com/soumith/imagenetloader.torch/master/valprep.sh)) 44 | 45 | An example way to use these folders in PyTorch is provided in [dataset.py](dataset.py). 
46 | -------------------------------------------------------------------------------- /imagenet_dogs_vs_notdogs/dataset.py: -------------------------------------------------------------------------------- 1 | from os.path import join 2 | from torchvision import datasets, transforms 3 | 4 | 5 | def get_imagenet(root, dogs): 6 | input_size = 224 7 | num_classes = 118 8 | 9 | if dogs: 10 | root = join(root, "imagenet_dogs/") 11 | else: 12 | root = join(root, "imagenet_notdogs/") 13 | 14 | traindir = join(root, "train/") 15 | valdir = join(root, "val/") 16 | 17 | normalize = transforms.Normalize( 18 | # Standard ImageNet preprocessing, not specialized for Dogs 19 | mean=[0.485, 0.456, 0.406], 20 | std=[0.229, 0.224, 0.225], 21 | ) 22 | 23 | train_dataset = datasets.ImageFolder( 24 | traindir, 25 | transforms.Compose( 26 | [ 27 | transforms.RandomResizedCrop(224), 28 | transforms.RandomHorizontalFlip(), 29 | transforms.ToTensor(), 30 | normalize, 31 | ] 32 | ), 33 | ) 34 | 35 | val_dataset = datasets.ImageFolder( 36 | valdir, 37 | transforms.Compose( 38 | [ 39 | transforms.Resize(256), 40 | transforms.CenterCrop(224), 41 | transforms.ToTensor(), 42 | normalize, 43 | ] 44 | ), 45 | ) 46 | 47 | return input_size, num_classes, train_dataset, val_dataset 48 | 49 | 50 | def get_imagenet_dogs(root): 51 | return get_imagenet(root, True) 52 | 53 | 54 | def get_imagenet_notdogs(root): 55 | return get_imagenet(root, False) 56 | -------------------------------------------------------------------------------- /imagenet_dogs_vs_notdogs/imagenet.txt: -------------------------------------------------------------------------------- 1 | n01440764 2 | n01443537 3 | n01484850 4 | n01491361 5 | n01494475 6 | n01496331 7 | n01498041 8 | n01514668 9 | n01514859 10 | n01518878 11 | n01530575 12 | n01531178 13 | n01532829 14 | n01534433 15 | n01537544 16 | n01558993 17 | n01560419 18 | n01580077 19 | n01582220 20 | n01592084 21 | n01601694 22 | n01608432 23 | n01614925 24 | n01616318 25 | 
n01622779 26 | n01629819 27 | n01630670 28 | n01631663 29 | n01632458 30 | n01632777 31 | n01641577 32 | n01644373 33 | n01644900 34 | n01664065 35 | n01665541 36 | n01667114 37 | n01667778 38 | n01669191 39 | n01675722 40 | n01677366 41 | n01682714 42 | n01685808 43 | n01687978 44 | n01688243 45 | n01689811 46 | n01692333 47 | n01693334 48 | n01694178 49 | n01695060 50 | n01697457 51 | n01698640 52 | n01704323 53 | n01728572 54 | n01728920 55 | n01729322 56 | n01729977 57 | n01734418 58 | n01735189 59 | n01737021 60 | n01739381 61 | n01740131 62 | n01742172 63 | n01744401 64 | n01748264 65 | n01749939 66 | n01751748 67 | n01753488 68 | n01755581 69 | n01756291 70 | n01768244 71 | n01770081 72 | n01770393 73 | n01773157 74 | n01773549 75 | n01773797 76 | n01774384 77 | n01774750 78 | n01775062 79 | n01776313 80 | n01784675 81 | n01795545 82 | n01796340 83 | n01797886 84 | n01798484 85 | n01806143 86 | n01806567 87 | n01807496 88 | n01817953 89 | n01818515 90 | n01819313 91 | n01820546 92 | n01824575 93 | n01828970 94 | n01829413 95 | n01833805 96 | n01843065 97 | n01843383 98 | n01847000 99 | n01855032 100 | n01855672 101 | n01860187 102 | n01871265 103 | n01872401 104 | n01873310 105 | n01877812 106 | n01882714 107 | n01883070 108 | n01910747 109 | n01914609 110 | n01917289 111 | n01924916 112 | n01930112 113 | n01943899 114 | n01944390 115 | n01945685 116 | n01950731 117 | n01955084 118 | n01968897 119 | n01978287 120 | n01978455 121 | n01980166 122 | n01981276 123 | n01983481 124 | n01984695 125 | n01985128 126 | n01986214 127 | n01990800 128 | n02002556 129 | n02002724 130 | n02006656 131 | n02007558 132 | n02009229 133 | n02009912 134 | n02011460 135 | n02012849 136 | n02013706 137 | n02017213 138 | n02018207 139 | n02018795 140 | n02025239 141 | n02027492 142 | n02028035 143 | n02033041 144 | n02037110 145 | n02051845 146 | n02056570 147 | n02058221 148 | n02066245 149 | n02071294 150 | n02074367 151 | n02077923 152 | n02085620 153 | n02085782 154 | n02085936 
155 | n02086079 156 | n02086240 157 | n02086646 158 | n02086910 159 | n02087046 160 | n02087394 161 | n02088094 162 | n02088238 163 | n02088364 164 | n02088466 165 | n02088632 166 | n02089078 167 | n02089867 168 | n02089973 169 | n02090379 170 | n02090622 171 | n02090721 172 | n02091032 173 | n02091134 174 | n02091244 175 | n02091467 176 | n02091635 177 | n02091831 178 | n02092002 179 | n02092339 180 | n02093256 181 | n02093428 182 | n02093647 183 | n02093754 184 | n02093859 185 | n02093991 186 | n02094114 187 | n02094258 188 | n02094433 189 | n02095314 190 | n02095570 191 | n02095889 192 | n02096051 193 | n02096177 194 | n02096294 195 | n02096437 196 | n02096585 197 | n02097047 198 | n02097130 199 | n02097209 200 | n02097298 201 | n02097474 202 | n02097658 203 | n02098105 204 | n02098286 205 | n02098413 206 | n02099267 207 | n02099429 208 | n02099601 209 | n02099712 210 | n02099849 211 | n02100236 212 | n02100583 213 | n02100735 214 | n02100877 215 | n02101006 216 | n02101388 217 | n02101556 218 | n02102040 219 | n02102177 220 | n02102318 221 | n02102480 222 | n02102973 223 | n02104029 224 | n02104365 225 | n02105056 226 | n02105162 227 | n02105251 228 | n02105412 229 | n02105505 230 | n02105641 231 | n02105855 232 | n02106030 233 | n02106166 234 | n02106382 235 | n02106550 236 | n02106662 237 | n02107142 238 | n02107312 239 | n02107574 240 | n02107683 241 | n02107908 242 | n02108000 243 | n02108089 244 | n02108422 245 | n02108551 246 | n02108915 247 | n02109047 248 | n02109525 249 | n02109961 250 | n02110063 251 | n02110185 252 | n02110341 253 | n02110627 254 | n02110806 255 | n02110958 256 | n02111129 257 | n02111277 258 | n02111500 259 | n02111889 260 | n02112018 261 | n02112137 262 | n02112350 263 | n02112706 264 | n02113023 265 | n02113186 266 | n02113624 267 | n02113712 268 | n02113799 269 | n02113978 270 | n02114367 271 | n02114548 272 | n02114712 273 | n02114855 274 | n02115641 275 | n02115913 276 | n02116738 277 | n02117135 278 | n02119022 279 | n02119789 
280 | n02120079 281 | n02120505 282 | n02123045 283 | n02123159 284 | n02123394 285 | n02123597 286 | n02124075 287 | n02125311 288 | n02127052 289 | n02128385 290 | n02128757 291 | n02128925 292 | n02129165 293 | n02129604 294 | n02130308 295 | n02132136 296 | n02133161 297 | n02134084 298 | n02134418 299 | n02137549 300 | n02138441 301 | n02165105 302 | n02165456 303 | n02167151 304 | n02168699 305 | n02169497 306 | n02172182 307 | n02174001 308 | n02177972 309 | n02190166 310 | n02206856 311 | n02219486 312 | n02226429 313 | n02229544 314 | n02231487 315 | n02233338 316 | n02236044 317 | n02256656 318 | n02259212 319 | n02264363 320 | n02268443 321 | n02268853 322 | n02276258 323 | n02277742 324 | n02279972 325 | n02280649 326 | n02281406 327 | n02281787 328 | n02317335 329 | n02319095 330 | n02321529 331 | n02325366 332 | n02326432 333 | n02328150 334 | n02342885 335 | n02346627 336 | n02356798 337 | n02361337 338 | n02363005 339 | n02364673 340 | n02389026 341 | n02391049 342 | n02395406 343 | n02396427 344 | n02397096 345 | n02398521 346 | n02403003 347 | n02408429 348 | n02410509 349 | n02412080 350 | n02415577 351 | n02417914 352 | n02422106 353 | n02422699 354 | n02423022 355 | n02437312 356 | n02437616 357 | n02441942 358 | n02442845 359 | n02443114 360 | n02443484 361 | n02444819 362 | n02445715 363 | n02447366 364 | n02454379 365 | n02457408 366 | n02480495 367 | n02480855 368 | n02481823 369 | n02483362 370 | n02483708 371 | n02484975 372 | n02486261 373 | n02486410 374 | n02487347 375 | n02488291 376 | n02488702 377 | n02489166 378 | n02490219 379 | n02492035 380 | n02492660 381 | n02493509 382 | n02493793 383 | n02494079 384 | n02497673 385 | n02500267 386 | n02504013 387 | n02504458 388 | n02509815 389 | n02510455 390 | n02514041 391 | n02526121 392 | n02536864 393 | n02606052 394 | n02607072 395 | n02640242 396 | n02641379 397 | n02643566 398 | n02655020 399 | n02666196 400 | n02667093 401 | n02669723 402 | n02672831 403 | n02676566 404 | n02687172 
405 | n02690373 406 | n02692877 407 | n02699494 408 | n02701002 409 | n02704792 410 | n02708093 411 | n02727426 412 | n02730930 413 | n02747177 414 | n02749479 415 | n02769748 416 | n02776631 417 | n02777292 418 | n02782093 419 | n02783161 420 | n02786058 421 | n02787622 422 | n02788148 423 | n02790996 424 | n02791124 425 | n02791270 426 | n02793495 427 | n02794156 428 | n02795169 429 | n02797295 430 | n02799071 431 | n02802426 432 | n02804414 433 | n02804610 434 | n02807133 435 | n02808304 436 | n02808440 437 | n02814533 438 | n02814860 439 | n02815834 440 | n02817516 441 | n02823428 442 | n02823750 443 | n02825657 444 | n02834397 445 | n02835271 446 | n02837789 447 | n02840245 448 | n02841315 449 | n02843684 450 | n02859443 451 | n02860847 452 | n02865351 453 | n02869837 454 | n02870880 455 | n02871525 456 | n02877765 457 | n02879718 458 | n02883205 459 | n02892201 460 | n02892767 461 | n02894605 462 | n02895154 463 | n02906734 464 | n02909870 465 | n02910353 466 | n02916936 467 | n02917067 468 | n02927161 469 | n02930766 470 | n02939185 471 | n02948072 472 | n02950826 473 | n02951358 474 | n02951585 475 | n02963159 476 | n02965783 477 | n02966193 478 | n02966687 479 | n02971356 480 | n02974003 481 | n02977058 482 | n02978881 483 | n02979186 484 | n02980441 485 | n02981792 486 | n02988304 487 | n02992211 488 | n02992529 489 | n02999410 490 | n03000134 491 | n03000247 492 | n03000684 493 | n03014705 494 | n03016953 495 | n03017168 496 | n03018349 497 | n03026506 498 | n03028079 499 | n03032252 500 | n03041632 501 | n03042490 502 | n03045698 503 | n03047690 504 | n03062245 505 | n03063599 506 | n03063689 507 | n03065424 508 | n03075370 509 | n03085013 510 | n03089624 511 | n03095699 512 | n03100240 513 | n03109150 514 | n03110669 515 | n03124043 516 | n03124170 517 | n03125729 518 | n03126707 519 | n03127747 520 | n03127925 521 | n03131574 522 | n03133878 523 | n03134739 524 | n03141823 525 | n03146219 526 | n03160309 527 | n03179701 528 | n03180011 529 | n03187595 
530 | n03188531 531 | n03196217 532 | n03197337 533 | n03201208 534 | n03207743 535 | n03207941 536 | n03208938 537 | n03216828 538 | n03218198 539 | n03220513 540 | n03223299 541 | n03240683 542 | n03249569 543 | n03250847 544 | n03255030 545 | n03259280 546 | n03271574 547 | n03272010 548 | n03272562 549 | n03290653 550 | n03291819 551 | n03297495 552 | n03314780 553 | n03325584 554 | n03337140 555 | n03344393 556 | n03345487 557 | n03347037 558 | n03355925 559 | n03372029 560 | n03376595 561 | n03379051 562 | n03384352 563 | n03388043 564 | n03388183 565 | n03388549 566 | n03393912 567 | n03394916 568 | n03400231 569 | n03404251 570 | n03417042 571 | n03424325 572 | n03425413 573 | n03443371 574 | n03444034 575 | n03445777 576 | n03445924 577 | n03447447 578 | n03447721 579 | n03450230 580 | n03452741 581 | n03457902 582 | n03459775 583 | n03461385 584 | n03467068 585 | n03476684 586 | n03476991 587 | n03478589 588 | n03481172 589 | n03482405 590 | n03483316 591 | n03485407 592 | n03485794 593 | n03492542 594 | n03494278 595 | n03495258 596 | n03496892 597 | n03498962 598 | n03527444 599 | n03529860 600 | n03530642 601 | n03532672 602 | n03534580 603 | n03535780 604 | n03538406 605 | n03544143 606 | n03584254 607 | n03584829 608 | n03590841 609 | n03594734 610 | n03594945 611 | n03595614 612 | n03598930 613 | n03599486 614 | n03602883 615 | n03617480 616 | n03623198 617 | n03627232 618 | n03630383 619 | n03633091 620 | n03637318 621 | n03642806 622 | n03649909 623 | n03657121 624 | n03658185 625 | n03661043 626 | n03662601 627 | n03666591 628 | n03670208 629 | n03673027 630 | n03676483 631 | n03680355 632 | n03690938 633 | n03691459 634 | n03692522 635 | n03697007 636 | n03706229 637 | n03709823 638 | n03710193 639 | n03710637 640 | n03710721 641 | n03717622 642 | n03720891 643 | n03721384 644 | n03724870 645 | n03729826 646 | n03733131 647 | n03733281 648 | n03733805 649 | n03742115 650 | n03743016 651 | n03759954 652 | n03761084 653 | n03763968 654 | n03764736 
655 | n03769881 656 | n03770439 657 | n03770679 658 | n03773504 659 | n03775071 660 | n03775546 661 | n03776460 662 | n03777568 663 | n03777754 664 | n03781244 665 | n03782006 666 | n03785016 667 | n03786901 668 | n03787032 669 | n03788195 670 | n03788365 671 | n03791053 672 | n03792782 673 | n03792972 674 | n03793489 675 | n03794056 676 | n03796401 677 | n03803284 678 | n03804744 679 | n03814639 680 | n03814906 681 | n03825788 682 | n03832673 683 | n03837869 684 | n03838899 685 | n03840681 686 | n03841143 687 | n03843555 688 | n03854065 689 | n03857828 690 | n03866082 691 | n03868242 692 | n03868863 693 | n03871628 694 | n03873416 695 | n03874293 696 | n03874599 697 | n03876231 698 | n03877472 699 | n03877845 700 | n03884397 701 | n03887697 702 | n03888257 703 | n03888605 704 | n03891251 705 | n03891332 706 | n03895866 707 | n03899768 708 | n03902125 709 | n03903868 710 | n03908618 711 | n03908714 712 | n03916031 713 | n03920288 714 | n03924679 715 | n03929660 716 | n03929855 717 | n03930313 718 | n03930630 719 | n03933933 720 | n03935335 721 | n03937543 722 | n03938244 723 | n03942813 724 | n03944341 725 | n03947888 726 | n03950228 727 | n03954731 728 | n03956157 729 | n03958227 730 | n03961711 731 | n03967562 732 | n03970156 733 | n03976467 734 | n03976657 735 | n03977966 736 | n03980874 737 | n03982430 738 | n03983396 739 | n03991062 740 | n03992509 741 | n03995372 742 | n03998194 743 | n04004767 744 | n04005630 745 | n04008634 746 | n04009552 747 | n04019541 748 | n04023962 749 | n04026417 750 | n04033901 751 | n04033995 752 | n04037443 753 | n04039381 754 | n04040759 755 | n04041544 756 | n04044716 757 | n04049303 758 | n04065272 759 | n04067472 760 | n04069434 761 | n04070727 762 | n04074963 763 | n04081281 764 | n04086273 765 | n04090263 766 | n04099969 767 | n04111531 768 | n04116512 769 | n04118538 770 | n04118776 771 | n04120489 772 | n04125021 773 | n04127249 774 | n04131690 775 | n04133789 776 | n04136333 777 | n04141076 778 | n04141327 779 | n04141975 
780 | n04146614 781 | n04147183 782 | n04149813 783 | n04152593 784 | n04153751 785 | n04154565 786 | n04162706 787 | n04179913 788 | n04192698 789 | n04200800 790 | n04201297 791 | n04204238 792 | n04204347 793 | n04208210 794 | n04209133 795 | n04209239 796 | n04228054 797 | n04229816 798 | n04235860 799 | n04238763 800 | n04239074 801 | n04243546 802 | n04251144 803 | n04252077 804 | n04252225 805 | n04254120 806 | n04254680 807 | n04254777 808 | n04258138 809 | n04259630 810 | n04263257 811 | n04264628 812 | n04265275 813 | n04266014 814 | n04270147 815 | n04273569 816 | n04275548 817 | n04277352 818 | n04285008 819 | n04286575 820 | n04296562 821 | n04310018 822 | n04311004 823 | n04311174 824 | n04317175 825 | n04325704 826 | n04326547 827 | n04328186 828 | n04330267 829 | n04332243 830 | n04335435 831 | n04336792 832 | n04344873 833 | n04346328 834 | n04347754 835 | n04350905 836 | n04355338 837 | n04355933 838 | n04356056 839 | n04357314 840 | n04366367 841 | n04367480 842 | n04370456 843 | n04371430 844 | n04371774 845 | n04372370 846 | n04376876 847 | n04380533 848 | n04389033 849 | n04392985 850 | n04398044 851 | n04399382 852 | n04404412 853 | n04409515 854 | n04417672 855 | n04418357 856 | n04423845 857 | n04428191 858 | n04429376 859 | n04435653 860 | n04442312 861 | n04443257 862 | n04447861 863 | n04456115 864 | n04458633 865 | n04461696 866 | n04462240 867 | n04465501 868 | n04467665 869 | n04476259 870 | n04479046 871 | n04482393 872 | n04483307 873 | n04485082 874 | n04486054 875 | n04487081 876 | n04487394 877 | n04493381 878 | n04501370 879 | n04505470 880 | n04507155 881 | n04509417 882 | n04515003 883 | n04517823 884 | n04522168 885 | n04523525 886 | n04525038 887 | n04525305 888 | n04532106 889 | n04532670 890 | n04536866 891 | n04540053 892 | n04542943 893 | n04548280 894 | n04548362 895 | n04550184 896 | n04552348 897 | n04553703 898 | n04554684 899 | n04557648 900 | n04560804 901 | n04562935 902 | n04579145 903 | n04579432 904 | n04584207 
905 | n04589890 906 | n04590129 907 | n04591157 908 | n04591713 909 | n04592741 910 | n04596742 911 | n04597913 912 | n04599235 913 | n04604644 914 | n04606251 915 | n04612504 916 | n04613696 917 | n06359193 918 | n06596364 919 | n06785654 920 | n06794110 921 | n06874185 922 | n07248320 923 | n07565083 924 | n07579787 925 | n07583066 926 | n07584110 927 | n07590611 928 | n07613480 929 | n07614500 930 | n07615774 931 | n07684084 932 | n07693725 933 | n07695742 934 | n07697313 935 | n07697537 936 | n07711569 937 | n07714571 938 | n07714990 939 | n07715103 940 | n07716358 941 | n07716906 942 | n07717410 943 | n07717556 944 | n07718472 945 | n07718747 946 | n07720875 947 | n07730033 948 | n07734744 949 | n07742313 950 | n07745940 951 | n07747607 952 | n07749582 953 | n07753113 954 | n07753275 955 | n07753592 956 | n07754684 957 | n07760859 958 | n07768694 959 | n07802026 960 | n07831146 961 | n07836838 962 | n07860988 963 | n07871810 964 | n07873807 965 | n07875152 966 | n07880968 967 | n07892512 968 | n07920052 969 | n07930864 970 | n07932039 971 | n09193705 972 | n09229709 973 | n09246464 974 | n09256479 975 | n09288635 976 | n09332890 977 | n09399592 978 | n09421951 979 | n09428293 980 | n09468604 981 | n09472597 982 | n09835506 983 | n10148035 984 | n10565667 985 | n11879895 986 | n11939491 987 | n12057211 988 | n12144580 989 | n12267677 990 | n12620546 991 | n12768682 992 | n12985857 993 | n12998815 994 | n13037406 995 | n13040303 996 | n13044778 997 | n13052670 998 | n13054560 999 | n13133613 1000 | n15075141 1001 | -------------------------------------------------------------------------------- /imagenet_dogs_vs_notdogs/imagenet_dogs.txt: -------------------------------------------------------------------------------- 1 | n02085620 2 | n02085782 3 | n02085936 4 | n02086079 5 | n02086240 6 | n02086646 7 | n02086910 8 | n02087046 9 | n02087394 10 | n02088094 11 | n02088238 12 | n02088364 13 | n02088466 14 | n02088632 15 | n02089078 16 | n02089867 17 | n02089973 18 
| n02090379 19 | n02090622 20 | n02090721 21 | n02091032 22 | n02091134 23 | n02091244 24 | n02091467 25 | n02091635 26 | n02091831 27 | n02092002 28 | n02092339 29 | n02093256 30 | n02093428 31 | n02093647 32 | n02093754 33 | n02093859 34 | n02093991 35 | n02094114 36 | n02094258 37 | n02094433 38 | n02095314 39 | n02095570 40 | n02095889 41 | n02096051 42 | n02096177 43 | n02096294 44 | n02096437 45 | n02096585 46 | n02097047 47 | n02097130 48 | n02097209 49 | n02097298 50 | n02097474 51 | n02097658 52 | n02098105 53 | n02098286 54 | n02098413 55 | n02099267 56 | n02099429 57 | n02099601 58 | n02099712 59 | n02099849 60 | n02100236 61 | n02100583 62 | n02100735 63 | n02100877 64 | n02101006 65 | n02101388 66 | n02101556 67 | n02102040 68 | n02102177 69 | n02102318 70 | n02102480 71 | n02102973 72 | n02104029 73 | n02104365 74 | n02105056 75 | n02105162 76 | n02105251 77 | n02105412 78 | n02105505 79 | n02105641 80 | n02105855 81 | n02106030 82 | n02106166 83 | n02106382 84 | n02106550 85 | n02106662 86 | n02107142 87 | n02107312 88 | n02107574 89 | n02107683 90 | n02107908 91 | n02108000 92 | n02108089 93 | n02108422 94 | n02108551 95 | n02108915 96 | n02109047 97 | n02109525 98 | n02109961 99 | n02110063 100 | n02110185 101 | n02110341 102 | n02110627 103 | n02110806 104 | n02110958 105 | n02111129 106 | n02111277 107 | n02111500 108 | n02111889 109 | n02112018 110 | n02112137 111 | n02112350 112 | n02112706 113 | n02113023 114 | n02113186 115 | n02113624 116 | n02113712 117 | n02113799 118 | n02113978 119 | -------------------------------------------------------------------------------- /imagenet_dogs_vs_notdogs/imagenet_notdogs.txt: -------------------------------------------------------------------------------- 1 | n01440764 2 | n01443537 3 | n01484850 4 | n01491361 5 | n01494475 6 | n01496331 7 | n01498041 8 | n01514668 9 | n01514859 10 | n01518878 11 | n01530575 12 | n01531178 13 | n01532829 14 | n01534433 15 | n01537544 16 | n01558993 17 | n01560419 18 | 
n01580077 19 | n01582220 20 | n01592084 21 | n01601694 22 | n01608432 23 | n01614925 24 | n01616318 25 | n01622779 26 | n01629819 27 | n01630670 28 | n01631663 29 | n01632458 30 | n01632777 31 | n01641577 32 | n01644373 33 | n01644900 34 | n01664065 35 | n01665541 36 | n01667114 37 | n01667778 38 | n01669191 39 | n01675722 40 | n01677366 41 | n01682714 42 | n01685808 43 | n01687978 44 | n01688243 45 | n01689811 46 | n01692333 47 | n01693334 48 | n01694178 49 | n01695060 50 | n01697457 51 | n01698640 52 | n01704323 53 | n01728572 54 | n01728920 55 | n01729322 56 | n01729977 57 | n01734418 58 | n01735189 59 | n01737021 60 | n01739381 61 | n01740131 62 | n01742172 63 | n01744401 64 | n01748264 65 | n01749939 66 | n01751748 67 | n01753488 68 | n01755581 69 | n01756291 70 | n01768244 71 | n01770081 72 | n01770393 73 | n01773157 74 | n01773549 75 | n01773797 76 | n01774384 77 | n01774750 78 | n01775062 79 | n01776313 80 | n01784675 81 | n01795545 82 | n01796340 83 | n01797886 84 | n01798484 85 | n01806143 86 | n01806567 87 | n01807496 88 | n01817953 89 | n01818515 90 | n01819313 91 | n01820546 92 | n01824575 93 | n01828970 94 | n01829413 95 | n01833805 96 | n01843065 97 | n01843383 98 | n01847000 99 | n01855032 100 | n01855672 101 | n01860187 102 | n01871265 103 | n01872401 104 | n01873310 105 | n01877812 106 | n01882714 107 | n01883070 108 | n01910747 109 | n01914609 110 | n01917289 111 | n01924916 112 | n01930112 113 | n01943899 114 | n01944390 115 | n01945685 116 | n01950731 117 | n01955084 118 | n01968897 119 | n01978287 120 | n01978455 121 | n01980166 122 | n01981276 123 | n01983481 124 | n01984695 125 | n01985128 126 | n01986214 127 | n01990800 128 | n02002556 129 | n02002724 130 | n02006656 131 | n02007558 132 | n02009229 133 | n02009912 134 | n02011460 135 | n02012849 136 | n02013706 137 | n02017213 138 | n02018207 139 | n02018795 140 | n02025239 141 | n02027492 142 | n02028035 143 | n02033041 144 | n02037110 145 | n02051845 146 | n02056570 147 | n02058221 148 | 
n02066245 149 | n02071294 150 | n02074367 151 | n02077923 152 | n02114367 153 | n02114548 154 | n02114712 155 | n02114855 156 | n02115641 157 | n02115913 158 | n02116738 159 | n02117135 160 | n02119022 161 | n02119789 162 | n02120079 163 | n02120505 164 | n02123045 165 | n02123159 166 | n02123394 167 | n02123597 168 | n02124075 169 | n02125311 170 | n02127052 171 | n02128385 172 | n02128757 173 | n02128925 174 | n02129165 175 | n02129604 176 | n02130308 177 | n02132136 178 | n02133161 179 | n02134084 180 | n02134418 181 | n02137549 182 | n02138441 183 | n02165105 184 | n02165456 185 | n02167151 186 | n02168699 187 | n02169497 188 | n02172182 189 | n02174001 190 | n02177972 191 | n02190166 192 | n02206856 193 | n02219486 194 | n02226429 195 | n02229544 196 | n02231487 197 | n02233338 198 | n02236044 199 | n02256656 200 | n02259212 201 | n02264363 202 | n02268443 203 | n02268853 204 | n02276258 205 | n02277742 206 | n02279972 207 | n02280649 208 | n02281406 209 | n02281787 210 | n02317335 211 | n02319095 212 | n02321529 213 | n02325366 214 | n02326432 215 | n02328150 216 | n02342885 217 | n02346627 218 | n02356798 219 | n02361337 220 | n02363005 221 | n02364673 222 | n02389026 223 | n02391049 224 | n02395406 225 | n02396427 226 | n02397096 227 | n02398521 228 | n02403003 229 | n02408429 230 | n02410509 231 | n02412080 232 | n02415577 233 | n02417914 234 | n02422106 235 | n02422699 236 | n02423022 237 | n02437312 238 | n02437616 239 | n02441942 240 | n02442845 241 | n02443114 242 | n02443484 243 | n02444819 244 | n02445715 245 | n02447366 246 | n02454379 247 | n02457408 248 | n02480495 249 | n02480855 250 | n02481823 251 | n02483362 252 | n02483708 253 | n02484975 254 | n02486261 255 | n02486410 256 | n02487347 257 | n02488291 258 | n02488702 259 | n02489166 260 | n02490219 261 | n02492035 262 | n02492660 263 | n02493509 264 | n02493793 265 | n02494079 266 | n02497673 267 | n02500267 268 | n02504013 269 | n02504458 270 | n02509815 271 | n02510455 272 | n02514041 273 | 
n02526121 274 | n02536864 275 | n02606052 276 | n02607072 277 | n02640242 278 | n02641379 279 | n02643566 280 | n02655020 281 | n02666196 282 | n02667093 283 | n02669723 284 | n02672831 285 | n02676566 286 | n02687172 287 | n02690373 288 | n02692877 289 | n02699494 290 | n02701002 291 | n02704792 292 | n02708093 293 | n02727426 294 | n02730930 295 | n02747177 296 | n02749479 297 | n02769748 298 | n02776631 299 | n02777292 300 | n02782093 301 | n02783161 302 | n02786058 303 | n02787622 304 | n02788148 305 | n02790996 306 | n02791124 307 | n02791270 308 | n02793495 309 | n02794156 310 | n02795169 311 | n02797295 312 | n02799071 313 | n02802426 314 | n02804414 315 | n02804610 316 | n02807133 317 | n02808304 318 | n02808440 319 | n02814533 320 | n02814860 321 | n02815834 322 | n02817516 323 | n02823428 324 | n02823750 325 | n02825657 326 | n02834397 327 | n02835271 328 | n02837789 329 | n02840245 330 | n02841315 331 | n02843684 332 | n02859443 333 | n02860847 334 | n02865351 335 | n02869837 336 | n02870880 337 | n02871525 338 | n02877765 339 | n02879718 340 | n02883205 341 | n02892201 342 | n02892767 343 | n02894605 344 | n02895154 345 | n02906734 346 | n02909870 347 | n02910353 348 | n02916936 349 | n02917067 350 | n02927161 351 | n02930766 352 | n02939185 353 | n02948072 354 | n02950826 355 | n02951358 356 | n02951585 357 | n02963159 358 | n02965783 359 | n02966193 360 | n02966687 361 | n02971356 362 | n02974003 363 | n02977058 364 | n02978881 365 | n02979186 366 | n02980441 367 | n02981792 368 | n02988304 369 | n02992211 370 | n02992529 371 | n02999410 372 | n03000134 373 | n03000247 374 | n03000684 375 | n03014705 376 | n03016953 377 | n03017168 378 | n03018349 379 | n03026506 380 | n03028079 381 | n03032252 382 | n03041632 383 | n03042490 384 | n03045698 385 | n03047690 386 | n03062245 387 | n03063599 388 | n03063689 389 | n03065424 390 | n03075370 391 | n03085013 392 | n03089624 393 | n03095699 394 | n03100240 395 | n03109150 396 | n03110669 397 | n03124043 398 | 
n03124170 399 | n03125729 400 | n03126707 401 | n03127747 402 | n03127925 403 | n03131574 404 | n03133878 405 | n03134739 406 | n03141823 407 | n03146219 408 | n03160309 409 | n03179701 410 | n03180011 411 | n03187595 412 | n03188531 413 | n03196217 414 | n03197337 415 | n03201208 416 | n03207743 417 | n03207941 418 | n03208938 419 | n03216828 420 | n03218198 421 | n03220513 422 | n03223299 423 | n03240683 424 | n03249569 425 | n03250847 426 | n03255030 427 | n03259280 428 | n03271574 429 | n03272010 430 | n03272562 431 | n03290653 432 | n03291819 433 | n03297495 434 | n03314780 435 | n03325584 436 | n03337140 437 | n03344393 438 | n03345487 439 | n03347037 440 | n03355925 441 | n03372029 442 | n03376595 443 | n03379051 444 | n03384352 445 | n03388043 446 | n03388183 447 | n03388549 448 | n03393912 449 | n03394916 450 | n03400231 451 | n03404251 452 | n03417042 453 | n03424325 454 | n03425413 455 | n03443371 456 | n03444034 457 | n03445777 458 | n03445924 459 | n03447447 460 | n03447721 461 | n03450230 462 | n03452741 463 | n03457902 464 | n03459775 465 | n03461385 466 | n03467068 467 | n03476684 468 | n03476991 469 | n03478589 470 | n03481172 471 | n03482405 472 | n03483316 473 | n03485407 474 | n03485794 475 | n03492542 476 | n03494278 477 | n03495258 478 | n03496892 479 | n03498962 480 | n03527444 481 | n03529860 482 | n03530642 483 | n03532672 484 | n03534580 485 | n03535780 486 | n03538406 487 | n03544143 488 | n03584254 489 | n03584829 490 | n03590841 491 | n03594734 492 | n03594945 493 | n03595614 494 | n03598930 495 | n03599486 496 | n03602883 497 | n03617480 498 | n03623198 499 | n03627232 500 | n03630383 501 | n03633091 502 | n03637318 503 | n03642806 504 | n03649909 505 | n03657121 506 | n03658185 507 | n03661043 508 | n03662601 509 | n03666591 510 | n03670208 511 | n03673027 512 | n03676483 513 | n03680355 514 | n03690938 515 | n03691459 516 | n03692522 517 | n03697007 518 | n03706229 519 | n03709823 520 | n03710193 521 | n03710637 522 | n03710721 523 | 
n03717622 524 | n03720891 525 | n03721384 526 | n03724870 527 | n03729826 528 | n03733131 529 | n03733281 530 | n03733805 531 | n03742115 532 | n03743016 533 | n03759954 534 | n03761084 535 | n03763968 536 | n03764736 537 | n03769881 538 | n03770439 539 | n03770679 540 | n03773504 541 | n03775071 542 | n03775546 543 | n03776460 544 | n03777568 545 | n03777754 546 | n03781244 547 | n03782006 548 | n03785016 549 | n03786901 550 | n03787032 551 | n03788195 552 | n03788365 553 | n03791053 554 | n03792782 555 | n03792972 556 | n03793489 557 | n03794056 558 | n03796401 559 | n03803284 560 | n03804744 561 | n03814639 562 | n03814906 563 | n03825788 564 | n03832673 565 | n03837869 566 | n03838899 567 | n03840681 568 | n03841143 569 | n03843555 570 | n03854065 571 | n03857828 572 | n03866082 573 | n03868242 574 | n03868863 575 | n03871628 576 | n03873416 577 | n03874293 578 | n03874599 579 | n03876231 580 | n03877472 581 | n03877845 582 | n03884397 583 | n03887697 584 | n03888257 585 | n03888605 586 | n03891251 587 | n03891332 588 | n03895866 589 | n03899768 590 | n03902125 591 | n03903868 592 | n03908618 593 | n03908714 594 | n03916031 595 | n03920288 596 | n03924679 597 | n03929660 598 | n03929855 599 | n03930313 600 | n03930630 601 | n03933933 602 | n03935335 603 | n03937543 604 | n03938244 605 | n03942813 606 | n03944341 607 | n03947888 608 | n03950228 609 | n03954731 610 | n03956157 611 | n03958227 612 | n03961711 613 | n03967562 614 | n03970156 615 | n03976467 616 | n03976657 617 | n03977966 618 | n03980874 619 | n03982430 620 | n03983396 621 | n03991062 622 | n03992509 623 | n03995372 624 | n03998194 625 | n04004767 626 | n04005630 627 | n04008634 628 | n04009552 629 | n04019541 630 | n04023962 631 | n04026417 632 | n04033901 633 | n04033995 634 | n04037443 635 | n04039381 636 | n04040759 637 | n04041544 638 | n04044716 639 | n04049303 640 | n04065272 641 | n04067472 642 | n04069434 643 | n04070727 644 | n04074963 645 | n04081281 646 | n04086273 647 | n04090263 648 | 
n04099969 649 | n04111531 650 | n04116512 651 | n04118538 652 | n04118776 653 | n04120489 654 | n04125021 655 | n04127249 656 | n04131690 657 | n04133789 658 | n04136333 659 | n04141076 660 | n04141327 661 | n04141975 662 | n04146614 663 | n04147183 664 | n04149813 665 | n04152593 666 | n04153751 667 | n04154565 668 | n04162706 669 | n04179913 670 | n04192698 671 | n04200800 672 | n04201297 673 | n04204238 674 | n04204347 675 | n04208210 676 | n04209133 677 | n04209239 678 | n04228054 679 | n04229816 680 | n04235860 681 | n04238763 682 | n04239074 683 | n04243546 684 | n04251144 685 | n04252077 686 | n04252225 687 | n04254120 688 | n04254680 689 | n04254777 690 | n04258138 691 | n04259630 692 | n04263257 693 | n04264628 694 | n04265275 695 | n04266014 696 | n04270147 697 | n04273569 698 | n04275548 699 | n04277352 700 | n04285008 701 | n04286575 702 | n04296562 703 | n04310018 704 | n04311004 705 | n04311174 706 | n04317175 707 | n04325704 708 | n04326547 709 | n04328186 710 | n04330267 711 | n04332243 712 | n04335435 713 | n04336792 714 | n04344873 715 | n04346328 716 | n04347754 717 | n04350905 718 | n04355338 719 | n04355933 720 | n04356056 721 | n04357314 722 | n04366367 723 | n04367480 724 | n04370456 725 | n04371430 726 | n04371774 727 | n04372370 728 | n04376876 729 | n04380533 730 | n04389033 731 | n04392985 732 | n04398044 733 | n04399382 734 | n04404412 735 | n04409515 736 | n04417672 737 | n04418357 738 | n04423845 739 | n04428191 740 | n04429376 741 | n04435653 742 | n04442312 743 | n04443257 744 | n04447861 745 | n04456115 746 | n04458633 747 | n04461696 748 | n04462240 749 | n04465501 750 | n04467665 751 | n04476259 752 | n04479046 753 | n04482393 754 | n04483307 755 | n04485082 756 | n04486054 757 | n04487081 758 | n04487394 759 | n04493381 760 | n04501370 761 | n04505470 762 | n04507155 763 | n04509417 764 | n04515003 765 | n04517823 766 | n04522168 767 | n04523525 768 | n04525038 769 | n04525305 770 | n04532106 771 | n04532670 772 | n04536866 773 | 
n04540053 774 | n04542943 775 | n04548280 776 | n04548362 777 | n04550184 778 | n04552348 779 | n04553703 780 | n04554684 781 | n04557648 782 | n04560804 783 | n04562935 784 | n04579145 785 | n04579432 786 | n04584207 787 | n04589890 788 | n04590129 789 | n04591157 790 | n04591713 791 | n04592741 792 | n04596742 793 | n04597913 794 | n04599235 795 | n04604644 796 | n04606251 797 | n04612504 798 | n04613696 799 | n06359193 800 | n06596364 801 | n06785654 802 | n06794110 803 | n06874185 804 | n07248320 805 | n07565083 806 | n07579787 807 | n07583066 808 | n07584110 809 | n07590611 810 | n07613480 811 | n07614500 812 | n07615774 813 | n07684084 814 | n07693725 815 | n07695742 816 | n07697313 817 | n07697537 818 | n07711569 819 | n07714571 820 | n07714990 821 | n07715103 822 | n07716358 823 | n07716906 824 | n07717410 825 | n07717556 826 | n07718472 827 | n07718747 828 | n07720875 829 | n07730033 830 | n07734744 831 | n07742313 832 | n07745940 833 | n07747607 834 | n07749582 835 | n07753113 836 | n07753275 837 | n07753592 838 | n07754684 839 | n07760859 840 | n07768694 841 | n07802026 842 | n07831146 843 | n07836838 844 | n07860988 845 | n07871810 846 | n07873807 847 | n07875152 848 | n07880968 849 | n07892512 850 | n07920052 851 | n07930864 852 | n07932039 853 | n09193705 854 | n09229709 855 | n09246464 856 | n09256479 857 | n09288635 858 | n09332890 859 | n09399592 860 | n09421951 861 | n09428293 862 | n09468604 863 | n09472597 864 | n09835506 865 | n10148035 866 | n10565667 867 | n11879895 868 | n11939491 869 | n12057211 870 | n12144580 871 | n12267677 872 | n12620546 873 | n12768682 874 | n12985857 875 | n12998815 876 | n13037406 877 | n13040303 878 | n13044778 879 | n13052670 880 | n13054560 881 | n13133613 882 | n15075141 883 | -------------------------------------------------------------------------------- /imagenet_dogs_vs_notdogs/imagenet_notdogs_subset.txt: -------------------------------------------------------------------------------- 1 | n04501370 2 | 
n07693725 3 | n03884397 4 | n04485082 5 | n02480495 6 | n07768694 7 | n02442845 8 | n13052670 9 | n04325704 10 | n01530575 11 | n04389033 12 | n01773549 13 | n03697007 14 | n03062245 15 | n04208210 16 | n01773797 17 | n03063599 18 | n04275548 19 | n04040759 20 | n04254777 21 | n03977966 22 | n03777754 23 | n03709823 24 | n07742313 25 | n09246464 26 | n02930766 27 | n13037406 28 | n02280649 29 | n04009552 30 | n03259280 31 | n01641577 32 | n01776313 33 | n03720891 34 | n03297495 35 | n04266014 36 | n04493381 37 | n02607072 38 | n02487347 39 | n01990800 40 | n03673027 41 | n04462240 42 | n01795545 43 | n01622779 44 | n02509815 45 | n02992529 46 | n01744401 47 | n02002724 48 | n03825788 49 | n02782093 50 | n02119789 51 | n02906734 52 | n01616318 53 | n03388549 54 | n02422699 55 | n03786901 56 | n04162706 57 | n04591157 58 | n03992509 59 | n04141975 60 | n01560419 61 | n04557648 62 | n04259630 63 | n07871810 64 | n02259212 65 | n01860187 66 | n02708093 67 | n04532106 68 | n02120079 69 | n03759954 70 | n01518878 71 | n01534433 72 | n03417042 73 | n02128925 74 | n04429376 75 | n02066245 76 | n02859443 77 | n03733131 78 | n02815834 79 | n02790996 80 | n02823428 81 | n03770439 82 | n03717622 83 | n07717556 84 | n02776631 85 | n02493509 86 | n09399592 87 | n03026506 88 | n01829413 89 | n03920288 90 | n03467068 91 | n04310018 92 | n13133613 93 | n02669723 94 | n01698640 95 | n03216828 96 | n06596364 97 | n02966687 98 | n03485794 99 | n03982430 100 | n02025239 101 | n04116512 102 | n04487394 103 | n03956157 104 | n03970156 105 | n04311174 106 | n01644373 107 | n03832673 108 | n01491361 109 | n01944390 110 | n04507155 111 | n02504458 112 | n02536864 113 | n02325366 114 | n02978881 115 | n04476259 116 | n04273569 117 | n04023962 118 | n02814533 119 | -------------------------------------------------------------------------------- /imagenot/README.md: -------------------------------------------------------------------------------- 1 | ## ImageNot 2 | 3 | From: ["What classifiers 
know what they don't?"](https://arxiv.org/abs/2107.06217) 4 | 5 | ### Create dataset on disk 6 | 7 | In a folder with the `train/` and `val/` ImageNet sub-folders: 8 | 9 | ``` 10 | mkdir -p imagenot_in-domain/train 11 | mkdir -p imagenot_in-domain/val 12 | cat in-domain.txt | xargs -I{} cp -R train/{} imagenot_in-domain/train/ 13 | cat in-domain.txt | xargs -I{} cp -R val/{} imagenot_in-domain/val/ 14 | 15 | mkdir -p imagenot_out-domain/train 16 | mkdir -p imagenot_out-domain/val 17 | cat out-domain.txt | xargs -I{} cp -R train/{} imagenot_out-domain/train/ 18 | cat out-domain.txt | xargs -I{} cp -R val/{} imagenot_out-domain/val/ 19 | ``` 20 | 21 | (`val/` can be obtained using [Soumith's valprep.sh script](https://raw.githubusercontent.com/soumith/imagenetloader.torch/master/valprep.sh)) 22 | 23 | An easy-to-adapt example of using these folders in PyTorch is provided in [dataset.py](../imagenet_dogs_vs_notdogs/dataset.py). 24 | -------------------------------------------------------------------------------- /imagenot/in-domain.txt: -------------------------------------------------------------------------------- 1 | n02666196 2 | n02669723 3 | n02672831 4 | n02690373 5 | n02699494 6 | n02776631 7 | n02783161 8 | n02786058 9 | n02791124 10 | n02793495 11 | n02794156 12 | n02797295 13 | n02799071 14 | n02804610 15 | n02808304 16 | n02808440 17 | n02814860 18 | n02817516 19 | n02834397 20 | n02835271 21 | n02837789 22 | n02840245 23 | n02859443 24 | n02860847 25 | n02869837 26 | n02871525 27 | n02877765 28 | n02883205 29 | n02892767 30 | n02894605 31 | n02895154 32 | n02909870 33 | n02927161 34 | n02951358 35 | n02951585 36 | n02966687 37 | n02974003 38 | n02977058 39 | n02979186 40 | n02980441 41 | n02981792 42 | n02992211 43 | n03000684 44 | n03026506 45 | n03028079 46 | n03032252 47 | n03042490 48 | n03045698 49 | n03063599 50 | n03075370 51 | n03085013 52 | n03095699 53 | n03124170 54 | n03127747 55 | n03127925 56 | n03131574 57 
| n03160309 58 | n03180011 59 | n03187595 60 | n03201208 61 | n03207743 62 | n03207941 63 | n03216828 64 | n03223299 65 | n03240683 66 | n03249569 67 | n03272010 68 | n03272562 69 | n03290653 70 | n03291819 71 | n03325584 72 | n03337140 73 | n03344393 74 | n03347037 75 | n03372029 76 | n03384352 77 | n03388549 78 | n03393912 79 | n03394916 80 | n03417042 81 | n03425413 82 | n03445777 83 | n03445924 84 | n03447721 85 | n03452741 86 | n03459775 87 | n03461385 88 | n03467068 89 | n03478589 90 | n03481172 91 | n03482405 92 | n03495258 93 | n03496892 94 | n03498962 95 | n03527444 96 | n03534580 97 | n03535780 98 | n03584254 99 | n03590841 100 | n03594734 101 | n03594945 102 | n03627232 103 | n03633091 104 | n03657121 105 | n03661043 106 | n03670208 107 | n03680355 108 | n03690938 109 | n03706229 110 | n03709823 111 | n03710193 112 | n03710721 113 | n03717622 114 | n03720891 115 | n03721384 116 | n03729826 117 | n03733281 118 | n03733805 119 | n03742115 120 | n03743016 121 | n03759954 122 | n03761084 123 | n03763968 124 | n03769881 125 | n03773504 126 | n03775071 127 | n03775546 128 | n03777568 129 | n03781244 130 | n03782006 131 | n03785016 132 | n03786901 133 | n03787032 134 | n03788365 135 | n03791053 136 | n03793489 137 | n03794056 138 | n03803284 139 | n03814639 140 | n03841143 141 | n03843555 142 | n03857828 143 | n03866082 144 | n03868863 145 | n03874293 146 | n03874599 147 | n03877845 148 | n03884397 149 | n03887697 150 | n03891332 151 | n03895866 152 | n03903868 153 | n03920288 154 | n03924679 155 | n03930313 156 | n03933933 157 | n03938244 158 | n03947888 159 | n03950228 160 | n03956157 161 | n03958227 162 | n03967562 163 | n03976657 164 | n03982430 165 | n03983396 166 | n03995372 167 | n04004767 168 | n04008634 169 | n04019541 170 | n04023962 171 | n04033995 172 | n04037443 173 | n04039381 174 | n04041544 175 | n04065272 176 | n04067472 177 | n04070727 178 | n04081281 179 | n04086273 180 | n04090263 181 | n04118776 182 | n04131690 183 | n04141327 184 | 
n04146614 185 | n04153751 186 | n04154565 187 | n04162706 188 | n04179913 189 | n04192698 190 | n04201297 191 | n04208210 192 | n04209133 193 | n04228054 194 | n04235860 195 | n04238763 196 | n04243546 197 | n04254120 198 | n04254777 199 | n04258138 200 | n04259630 201 | n04264628 202 | n04266014 203 | n04270147 204 | n04273569 205 | n04286575 206 | n04311004 207 | n04311174 208 | n04317175 209 | n04325704 210 | n04326547 211 | n04330267 212 | n04332243 213 | n04346328 214 | n04347754 215 | n04355933 216 | n04356056 217 | n04357314 218 | n04371430 219 | n04371774 220 | n04372370 221 | n04376876 222 | n04399382 223 | n04404412 224 | n04409515 225 | n04418357 226 | n04447861 227 | n04456115 228 | n04458633 229 | n04462240 230 | n04465501 231 | n04476259 232 | n04482393 233 | n04485082 234 | n04487394 235 | n04501370 236 | n04517823 237 | n04522168 238 | n04525038 239 | n04525305 240 | n04540053 241 | n04548280 242 | n04548362 243 | n04550184 244 | n04552348 245 | n04553703 246 | n04560804 247 | n04579145 248 | n04584207 249 | n04590129 250 | n04591157 251 | n04591713 252 | n04592741 253 | n04604644 254 | n04612504 255 | n04613696 256 | n06359193 257 | n07802026 258 | n07930864 259 | n09193705 260 | n09246464 261 | n09288635 262 | n09332890 263 | n09421951 264 | n09472597 265 | n10148035 266 | n15075141 267 | -------------------------------------------------------------------------------- /imagenot/out-domain.txt: -------------------------------------------------------------------------------- 1 | n01440764 2 | n01443537 3 | n01484850 4 | n01491361 5 | n01494475 6 | n01496331 7 | n01498041 8 | n01514668 9 | n01514859 10 | n01518878 11 | n01530575 12 | n01531178 13 | n01532829 14 | n01534433 15 | n01537544 16 | n01558993 17 | n01560419 18 | n01580077 19 | n01582220 20 | n01592084 21 | n01601694 22 | n01608432 23 | n01614925 24 | n01616318 25 | n01622779 26 | n01629819 27 | n01630670 28 | n01631663 29 | n01632458 30 | n01632777 31 | n01641577 32 | n01644373 33 | 
n01644900 34 | n01664065 35 | n01665541 36 | n01667114 37 | n01667778 38 | n01669191 39 | n01675722 40 | n01677366 41 | n01682714 42 | n01685808 43 | n01687978 44 | n01688243 45 | n01689811 46 | n01692333 47 | n01693334 48 | n01694178 49 | n01695060 50 | n01697457 51 | n01698640 52 | n01704323 53 | n01728572 54 | n01728920 55 | n01729322 56 | n01729977 57 | n01734418 58 | n01735189 59 | n01737021 60 | n01739381 61 | n01740131 62 | n01742172 63 | n01744401 64 | n01748264 65 | n01749939 66 | n01751748 67 | n01753488 68 | n01755581 69 | n01756291 70 | n01768244 71 | n01770081 72 | n01770393 73 | n01773157 74 | n01773549 75 | n01773797 76 | n01774384 77 | n01774750 78 | n01775062 79 | n01776313 80 | n01784675 81 | n01795545 82 | n01796340 83 | n01797886 84 | n01798484 85 | n01806143 86 | n01806567 87 | n01807496 88 | n01817953 89 | n01818515 90 | n01819313 91 | n01820546 92 | n01824575 93 | n01828970 94 | n01829413 95 | n01833805 96 | n01843065 97 | n01843383 98 | n01847000 99 | n01855032 100 | n01855672 101 | n01860187 102 | n01871265 103 | n01872401 104 | n01873310 105 | n01877812 106 | n01882714 107 | n01883070 108 | n01910747 109 | n01914609 110 | n01917289 111 | n01924916 112 | n01930112 113 | n01943899 114 | n01944390 115 | n01945685 116 | n01950731 117 | n01955084 118 | n01968897 119 | n01978287 120 | n01978455 121 | n01980166 122 | n01981276 123 | n01983481 124 | n01984695 125 | n01985128 126 | n01986214 127 | n01990800 128 | n02002556 129 | n02002724 130 | n02006656 131 | n02007558 132 | n02009229 133 | n02009912 134 | n02011460 135 | n02012849 136 | n02013706 137 | n02017213 138 | n02018207 139 | n02018795 140 | n02025239 141 | n02027492 142 | n02028035 143 | n02033041 144 | n02037110 145 | n02051845 146 | n02056570 147 | n02058221 148 | n02066245 149 | n02071294 150 | n02074367 151 | n02077923 152 | n02085620 153 | n02085782 154 | n02085936 155 | n02086079 156 | n02086240 157 | n02086646 158 | n02086910 159 | n02087046 160 | n02087394 161 | n02088094 162 | 
n02088238 163 | n02088364 164 | n02088466 165 | n02088632 166 | n02089078 167 | n02089867 168 | n02089973 169 | n02090379 170 | n02090622 171 | n02090721 172 | n02091032 173 | n02091134 174 | n02091244 175 | n02091467 176 | n02091635 177 | n02091831 178 | n02092002 179 | n02092339 180 | n02093256 181 | n02093428 182 | n02093647 183 | n02093754 184 | n02093859 185 | n02093991 186 | n02094114 187 | n02094258 188 | n02094433 189 | n02095314 190 | n02095570 191 | n02095889 192 | n02096051 193 | n02096177 194 | n02096294 195 | n02096437 196 | n02096585 197 | n02097047 198 | n02097130 199 | n02097209 200 | n02097298 201 | n02097474 202 | n02097658 203 | n02098105 204 | n02098286 205 | n02098413 206 | n02099267 207 | n02099429 208 | n02099601 209 | n02099712 210 | n02099849 211 | n02100236 212 | n02100583 213 | n02100735 214 | n02100877 215 | n02101006 216 | n02101388 217 | n02101556 218 | n02102040 219 | n02102177 220 | n02102318 221 | n02102480 222 | n02102973 223 | n02104029 224 | n02104365 225 | n02105056 226 | n02105162 227 | n02105251 228 | n02105412 229 | n02105505 230 | n02105641 231 | n02105855 232 | n02106030 233 | n02106166 234 | n02106382 235 | n02106550 236 | n02106662 237 | n02107142 238 | n02107312 239 | n02107574 240 | n02107683 241 | n02107908 242 | n02108000 243 | n02108089 244 | n02108422 245 | n02108551 246 | n02108915 247 | n02109047 248 | n02109525 249 | n02109961 250 | n02110063 251 | n02110185 252 | n02110341 253 | n02110627 254 | n02110806 255 | n02110958 256 | n02111129 257 | n02111277 258 | n02111500 259 | n02111889 260 | n02112018 261 | n02112137 262 | n02112350 263 | n02112706 264 | n02113023 265 | n02113186 266 | n02113624 267 | -------------------------------------------------------------------------------- /minimal_cifar/README.md: -------------------------------------------------------------------------------- 1 | Getting high accuracy on CIFAR-10 is not straightforward. 
This self-contained script gets to 94% accuracy with a minimum of code. 2 | 3 | You can download a model trained with this script from: https://www.cs.ox.ac.uk/people/joost.vanamersfoort/cifar_model.pt 4 | -------------------------------------------------------------------------------- /minimal_cifar/train_cifar.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | from tqdm import tqdm 3 | 4 | import torch 5 | import torch.nn.functional as F 6 | 7 | from torchvision import models, datasets, transforms 8 | 9 | 10 | def get_CIFAR10(root="./"): 11 | input_size = 32 12 | num_classes = 10 13 | normalize = transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)) 14 | 15 | train_transform = transforms.Compose( 16 | [ 17 | transforms.RandomCrop(32, padding=4), 18 | transforms.RandomHorizontalFlip(), 19 | transforms.ToTensor(), 20 | normalize, 21 | ] 22 | ) 23 | train_dataset = datasets.CIFAR10( 24 | root + "data/CIFAR10", train=True, transform=train_transform, download=True 25 | ) 26 | 27 | test_transform = transforms.Compose( 28 | [ 29 | transforms.ToTensor(), 30 | normalize, 31 | ] 32 | ) 33 | test_dataset = datasets.CIFAR10( 34 | root + "data/CIFAR10", train=False, transform=test_transform, download=True 35 | ) 36 | 37 | return input_size, num_classes, train_dataset, test_dataset 38 | 39 | 40 | class Model(torch.nn.Module): 41 | def __init__(self): 42 | super().__init__() 43 | 44 | self.resnet = models.resnet18(pretrained=False, num_classes=10) 45 | 46 | self.resnet.conv1 = torch.nn.Conv2d( 47 | 3, 64, kernel_size=3, stride=1, padding=1, bias=False 48 | ) 49 | self.resnet.maxpool = torch.nn.Identity() 50 | 51 | def forward(self, x): 52 | x = self.resnet(x) 53 | x = F.log_softmax(x, dim=1) 54 | 55 | return x 56 | 57 | 58 | def train(model, train_loader, optimizer, epoch): 59 | model.train() 60 | 61 | total_loss = [] 62 | 63 | for data, target in tqdm(train_loader): 64 | data = data.cuda() 65 | target 
= target.cuda() 66 | 67 | optimizer.zero_grad() 68 | 69 | prediction = model(data) 70 | loss = F.nll_loss(prediction, target) 71 | 72 | loss.backward() 73 | optimizer.step() 74 | 75 | total_loss.append(loss.item()) 76 | 77 | avg_loss = sum(total_loss) / len(total_loss) 78 | print(f"Epoch: {epoch}:") 79 | print(f"Train Set: Average Loss: {avg_loss:.2f}") 80 | 81 | 82 | def test(model, test_loader): 83 | model.eval() 84 | 85 | loss = 0 86 | correct = 0 87 | 88 | for data, target in test_loader: 89 | with torch.no_grad(): 90 | data = data.cuda() 91 | target = target.cuda() 92 | 93 | prediction = model(data) 94 | loss += F.nll_loss(prediction, target, reduction="sum") 95 | 96 | prediction = prediction.max(1)[1] 97 | correct += prediction.eq(target.view_as(prediction)).sum().item() 98 | 99 | loss /= len(test_loader.dataset) 100 | 101 | percentage_correct = 100.0 * correct / len(test_loader.dataset) 102 | 103 | print( 104 | "Test set: Average loss: {:.4f}, Accuracy: {}/{} ({:.2f}%)".format( 105 | loss, correct, len(test_loader.dataset), percentage_correct 106 | ) 107 | ) 108 | 109 | return loss, percentage_correct 110 | 111 | 112 | def main(): 113 | parser = argparse.ArgumentParser() 114 | parser.add_argument( 115 | "--epochs", type=int, default=50, help="number of epochs to train (default: 50)" 116 | ) 117 | parser.add_argument( 118 | "--lr", type=float, default=0.05, help="learning rate (default: 0.05)" 119 | ) 120 | parser.add_argument("--seed", type=int, default=1, help="random seed (default: 1)") 121 | args = parser.parse_args() 122 | print(args) 123 | 124 | torch.manual_seed(args.seed) 125 | 126 | input_size, num_classes, train_dataset, test_dataset = get_CIFAR10() 127 | 128 | kwargs = {"num_workers": 2, "pin_memory": True} 129 | 130 | train_loader = torch.utils.data.DataLoader( 131 | train_dataset, batch_size=128, shuffle=True, **kwargs 132 | ) 133 | test_loader = torch.utils.data.DataLoader( 134 | test_dataset, batch_size=5000, shuffle=False, **kwargs 135 | ) 136 
| 137 | model = Model() 138 | model = model.cuda() 139 | 140 | milestones = [25, 40] 141 | 142 | optimizer = torch.optim.SGD( 143 | model.parameters(), lr=args.lr, momentum=0.9, weight_decay=5e-4 144 | ) 145 | scheduler = torch.optim.lr_scheduler.MultiStepLR( 146 | optimizer, milestones=milestones, gamma=0.1 147 | ) 148 | 149 | for epoch in range(1, args.epochs + 1): 150 | train(model, train_loader, optimizer, epoch) 151 | test(model, test_loader) 152 | 153 | scheduler.step() 154 | 155 | torch.save(model.state_dict(), "cifar_model.pt") 156 | 157 | 158 | if __name__ == "__main__": 159 | main() -------------------------------------------------------------------------------- /subset_of_imagenet/README.md: -------------------------------------------------------------------------------- 1 | Select a subset of ImageNet **classes** in PyTorch in a very simple way. 2 | -------------------------------------------------------------------------------- /subset_of_imagenet/subset_imagenet.py: -------------------------------------------------------------------------------- 1 | import os 2 | 3 | from torchvision import datasets, transforms 4 | from torchvision.datasets.folder import IMG_EXTENSIONS, default_loader 5 | 6 | def get_imagenet(root, num_classes=100): 7 | class SubDatasetFolder(datasets.DatasetFolder): 8 | def _find_classes(self, dir): 9 | classes = [d.name for d in os.scandir(dir) if d.is_dir()] 10 | classes.sort() 11 | classes = classes[:num_classes] # overwritten from original to take subset 12 | class_to_idx = {cls_name: i for i, cls_name in enumerate(classes)} 13 | return classes, class_to_idx 14 | 15 | input_size = 224 16 | 17 | traindir = os.path.join(root, "train/") 18 | valdir = os.path.join(root, "val/") 19 | 20 | normalize = transforms.Normalize( 21 | mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225] 22 | ) 23 | 24 | # Reimplements ImageFolder arguments 25 | train_dataset = SubDatasetFolder( 26 | traindir, 27 | default_loader, 28 | IMG_EXTENSIONS, 29 | 
transforms.Compose( 30 | [ 31 | transforms.RandomResizedCrop(input_size), 32 | transforms.RandomHorizontalFlip(), 33 | transforms.ToTensor(), 34 | normalize, 35 | ] 36 | ), 37 | ) 38 | 39 | test_dataset = SubDatasetFolder( 40 | valdir, 41 | default_loader, 42 | IMG_EXTENSIONS, 43 | transforms.Compose( 44 | [ 45 | transforms.Resize(256), 46 | transforms.CenterCrop(input_size), 47 | transforms.ToTensor(), 48 | normalize, 49 | ] 50 | ), 51 | ) 52 | 53 | return train_dataset, test_dataset --------------------------------------------------------------------------------
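The class-subsetting trick in `subset_imagenet.py` above is deliberately tiny: sort the class directories, keep only the first `num_classes`, and build `class_to_idx` from that truncated list. A minimal sketch of just that mapping logic, with a hypothetical list of WordNet IDs standing in for a real `os.scandir` of the ImageNet train directory:

```python
def subset_classes(all_classes, num_classes):
    # Mirrors SubDatasetFolder._find_classes: sort the directory names,
    # truncate to the first num_classes, then index the survivors.
    classes = sorted(all_classes)[:num_classes]
    class_to_idx = {name: i for i, name in enumerate(classes)}
    return classes, class_to_idx

# Hypothetical WordNet IDs standing in for real ImageNet class folders.
dirs = ["n01443537", "n01440764", "n01484850", "n01491361"]
classes, class_to_idx = subset_classes(dirs, num_classes=2)
# classes      -> ["n01440764", "n01443537"]
# class_to_idx -> {"n01440764": 0, "n01443537": 1}
```

Because the list is sorted before truncation, the same `num_classes` always selects the same classes with the same labels, regardless of filesystem enumeration order.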