├── .gitignore
├── README.md
├── datasets
│   ├── celebA
│   │   ├── celebA_ood.csv
│   │   ├── celebA_original.csv
│   │   └── celebA_split.csv
│   ├── celebA_dataset.py
│   ├── color_mnist.py
│   ├── cub_dataset.py
│   ├── dataset_utils.py
│   ├── gaussian_dataset.py
│   ├── generate_placebg.py
│   └── generate_waterbird.py
├── main.png
├── models
│   ├── __init__.py
│   └── resnet.py
├── present_results.py
├── test_bg.py
├── train_bg.py
└── utils
    ├── __init__.py
    ├── anom_utils.py
    ├── common.py
    └── svhn_loader.py
/.gitignore:
--------------------------------------------------------------------------------
1 | **/datasets/MNIST
2 | **/datasets/ood_datasets
3 | **/checkpoints
4 | **/energy_results
5 | **/output
6 | **/plot_mnist
7 | **/plot_waterbird
8 | *.DS_Store
9 | *.pyc
10 | *.npy
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | 
2 | # On the Impact of Spurious Correlation for Out-of-distribution Detection
3 | This codebase provides a PyTorch implementation for the AAAI-22 paper: On the Impact of Spurious Correlation for Out-of-distribution Detection. Some parts of the codebase are adapted from [GDRO](https://github.com/kohpangwei/group_DRO).
4 | 
5 | ## Abstract
6 | Modern neural networks can assign high confidence to inputs drawn from outside the training distribution, posing threats to models in real-world deployments. While much research attention has been placed on designing new out-of-distribution (OOD) detection methods, the precise definition of OOD is often left vague and falls short of the desired notion of OOD in reality. In this paper, we present a new formalization and model the data shifts by taking into account both the invariant and environmental (spurious) features. Under such formalization, we systematically investigate how spurious correlation in the training set impacts OOD detection.
Our results suggest that the detection performance is severely worsened when the correlation between spurious features and labels is increased in the training set. We further provide insights into detection methods that are more effective in reducing the impact of spurious correlation, and give a theoretical analysis of why reliance on environmental features leads to high OOD detection error. Our work aims to facilitate a better understanding of OOD samples and their formalization, as well as the exploration of methods that enhance OOD detection.
7 | 
8 | ## Illustration
9 | ![main](main.png)
10 | 
11 | ## Required Packages
12 | Our experiments are conducted on Ubuntu Linux 20.04 with Python 3.8 and PyTorch 1.6. In addition, the following packages need to be installed:
13 | * SciPy
14 | * NumPy
15 | * scikit-learn
16 | * Pandas
17 | 
18 | ## Datasets
19 | ### In-distribution Datasets
20 | 
21 | - In-distribution training sets:
22 |   - WaterBirds: Similar to the construction in [Group_DRO](https://github.com/kohpangwei/group_DRO), this dataset is constructed by cropping birds out of photos in the Caltech-UCSD Birds-200-2011 (CUB) dataset (Wah et al., 2011) and transferring them onto backgrounds from the Places dataset (Zhou et al., 2017).
23 |   - [CelebA](http://mmlab.ie.cuhk.edu.hk/projects/CelebA.html): Large-scale CelebFaces Attributes dataset. The data used for this task is listed in `datasets/celebA/celebA_split.csv`; after downloading the dataset, please place the images in `datasets/celebA/img_align_celeba/`.
24 |   - ColorMNIST: a color-biased version of the original [MNIST](http://yann.lecun.com/exdb/mnist/) dataset.
25 | 
26 | ### Out-of-distribution Test Datasets
27 | 
28 | #### Non-spurious OOD Test Sets
29 | 
30 | Following common practice, we choose several datasets with diverse semantics as non-spurious OOD test sets.
We provide links and instructions to download each dataset:
31 | * [Textures](https://www.robots.ox.ac.uk/~vgg/data/dtd/download/dtd-r1.0.1.tar.gz): download it and place it in `datasets/ood_datasets/dtd`.
32 | * [LSUN-R](https://www.dropbox.com/s/moqh2wh8696c3yl/LSUN_resize.tar.gz): download it and place it in `datasets/ood_datasets/LSUN_resize`.
33 | * [iSUN](https://www.dropbox.com/s/ssz7qxfqae0cca5/iSUN.tar.gz): download it and place it in `datasets/ood_datasets/iSUN`.
34 | * [SVHN](http://ufldl.stanford.edu/housenumbers/test_32x32.mat): download it and place it in `datasets/ood_datasets/svhn`.
35 | * Synthetic Images: code for generating a Gaussian noise dataset is contained in `datasets/gaussian_dataset.py`.
36 | 
37 | For example, run the following commands in the **root** directory to download **LSUN-R**:
38 | ```
39 | cd datasets/ood_datasets
40 | wget https://www.dropbox.com/s/moqh2wh8696c3yl/LSUN_resize.tar.gz
41 | tar -xvzf LSUN_resize.tar.gz
42 | ```
43 | 
44 | #### Spurious OOD Test Sets
45 | * Color MNIST: can be downloaded [here](https://www.dropbox.com/s/kqqm9doda33f4tt/partial_color_mnist_0%261.zip?dl=0); place it under `datasets/ood_datasets`.
46 | * WaterBirds: refer to [WaterBirds](#WaterBirds); the dataset should be placed in `datasets/ood_datasets/placesbg`.
47 | * CelebA: the metadata for this dataset is already included in the provided CelebA zip file as `datasets/CelebA/celebA_ood.csv`.
48 | 
49 | 
50 | ## Quick Start
51 | To run the experiments, you first need to download and place the datasets in the specified folders as instructed in [Datasets](#Datasets). We provide the following commands and general descriptions for the related files.
52 | 
53 | ### ColorMNIST
54 | * `datasets/color_mnist.py` downloads the original MNIST dataset and applies the color biases to the images itself. No extra preparation is needed on the user side.
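To make the color-bias idea concrete, here is a minimal, self-contained sketch of imposing a correlation r between digit labels and colors. This is only an illustration: the function name `colorize`, the red/green channels, and the 0–4 vs. 5–9 label grouping are assumptions made for this example, not necessarily what `datasets/color_mnist.py` does.

```python
import numpy as np

def colorize(images, labels, r, rng=None):
    """Toy sketch (names and color scheme are illustrative, not the repo's):
    tint each grayscale digit so its color agrees with the label group
    with probability r, and disagrees with probability 1 - r."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(images)
    colored = np.zeros((n, 28, 28, 3), dtype=np.float32)
    # assumed grouping: digits 0-4 -> color 0 (red), digits 5-9 -> color 1 (green)
    intended = (labels >= 5).astype(int)
    flip = rng.random(n) >= r                 # with probability 1 - r, use the other color
    color = np.where(flip, 1 - intended, intended)
    colored[color == 0, :, :, 0] = images[color == 0]   # red channel
    colored[color == 1, :, :, 1] = images[color == 1]   # green channel
    return colored, color

# with r=1.0 every digit deterministically gets its label-matching color
imgs = np.random.rand(8, 28, 28).astype(np.float32)
labels = np.array([0, 1, 2, 3, 5, 6, 7, 8])
out, color = colorize(imgs, labels, r=1.0)
```

An r close to 0.5 would make the color uninformative about the label, while an r near 0 or 1 induces a strong spurious correlation.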
55 | 
56 | Here is an example of training on the ColorMNIST dataset followed by OOD evaluation:
57 | ```bash
58 | python train_bg.py --gpu-ids 0 --in-dataset color_mnist --model resnet18 --epochs 30 --save-epoch 10 --data_label_correlation 0.45 --domain-num 8 --method erm --name erm_r_0_45 --exp-name erm_r_0_45_2021-08-31
59 | python test_bg.py --gpu-ids 0 --in-dataset color_mnist --model resnet18 --test_epochs 30 --data_label_correlation 0.45 --method erm --name erm_r_0_45 --exp-name erm_r_0_45_2021-08-31
60 | python present_results.py --in-dataset color_mnist --name erm_r_0_45 --exp-name erm_r_0_45_2021-08-31 --test_epochs 30
61 | ```
62 | Notes on some of the arguments:
63 | * `--data_label_correlation`: the correlation between the labels and the spurious feature (here, the background color), as explained in the paper.
64 | * `--method`: selected from 'erm', 'irm', 'gdro', 'rex', 'dann', 'cdann', and 'rebias'. The same applies to the experiments below. The uploaded codebase currently contains only the 'erm' option; the complete list of options will be available soon.
65 | * `--name`: by convention, we specify the name as Method_Correlation. Users are welcome to use other names for convenience.
66 | * `--gpu-ids`: the index of the GPU to use. Currently we support running on a single GPU; support for distributed training will be provided soon.
67 | * `--domain-num`: used by the domain-invariance learning methods, i.e., methods other than 'erm'.
68 | ### WaterBirds
69 | * `datasets/cub_dataset.py`: provides the dataloaders for the WaterBirds datasets of multiple correlations.
70 | * `datasets/generate_waterbird.py`: generates the combination of bird and background images with a preset correlation.
You can simply run `python generate_waterbird.py` to generate the dataset; it will be stored as `datasets/waterbird_completexx_forest2water2`, where `xx` stands for the two digits after the decimal point of the correlation r (for example, when r=0.9, `xx`=90).
71 | * `datasets/generate_placebg.py`: subsamples background images of specific types as the OOD data. You can simply run `python generate_placebg.py` to generate the OOD dataset; it will be stored as `datasets/ood_datasets/placesbg/`.
72 | 
73 | (Note: before generating the WaterBirds dataset, you need to download the CUB and Places datasets and update their paths as specified in `generate_waterbird.py`.)
74 | 
75 | A sample script to run the model training and OOD evaluation tasks on WaterBirds is as follows:
76 | ```bash
77 | python train_bg.py --gpu-ids 0 --in-dataset waterbird --model resnet18 --epochs 30 --save-epoch 10 --data_label_correlation 0.9 --domain-num 4 --method erm --name erm_r_0_9 --exp-name erm_r_0_9_2021-08-31
78 | python test_bg.py --gpu-ids 0 --in-dataset waterbird --model resnet18 --test_epochs 30 --data_label_correlation 0.9 --method erm --name erm_r_0_9 --exp-name erm_r_0_9_2021-08-31
79 | python present_results.py --in-dataset waterbird --name erm_r_0_9 --exp-name erm_r_0_9_2021-08-31 --test_epochs 30
80 | ```
81 | Notes on some of the arguments:
82 | * `--data_label_correlation`: can be selected from [0.5, 0.7, 0.9].
83 | 
84 | ### CelebA
85 | * `datasets/celebA_dataset.py`: provides the dataloaders for the CelebA datasets and OOD datasets.
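For reference, `datasets/celebA/celebA_ood.csv` is a plain CSV with an unnamed index column, an `image_id` column, and the binary `Male`/`Bald` attributes. A minimal standard-library sketch of reading that format is shown below; the helper name `load_ood_image_ids` is ours for illustration, not part of the codebase.

```python
import csv
import io

# Three rows in the same format as datasets/celebA/celebA_ood.csv:
# an unnamed index column, the image file name, and binary Male/Bald attributes.
sample = """,image_id,Male,Bald
78,000079.jpg,1,1
114,000115.jpg,1,1
133,000134.jpg,1,1
"""

def load_ood_image_ids(fh):
    """Return the image file names whose Male and Bald attributes are both set."""
    reader = csv.DictReader(fh)
    return [row["image_id"] for row in reader
            if row["Male"] == "1" and row["Bald"] == "1"]

image_ids = load_ood_image_ids(io.StringIO(sample))
# the listed files are expected under datasets/celebA/img_align_celeba/
```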
86 | 87 | A sample script to run model training and ood evaluation task on CelebA is as follows: 88 | ```bash 89 | python train_bg.py --gpu-ids 0 --in-dataset celebA --model resnet18 --epochs 30 --save-epoch 10 --data_label_correlation 0.8 --domain-num 4 --method erm --name erm_r_0_8 --exp-name erm_r_0_8_2021-08-31 90 | python test_bg.py --gpu-ids 0 --in-dataset celebA --model resnet18 --test_epochs 30 --data_label_correlation 0.8 --method erm --name erm_r_0_8 --exp-name erm_r_0_8_2021-08-31 91 | python present_results_py --in-dataset waterbird --name erm_r_0_8 --exp-name erm_r_0_8_2021-08-31 --test_epochs 30 92 | ``` 93 | Notes for some of the arguments: 94 | * `--data_label_correlation`: the correlation for this experiment and can be selected from[0.7, 0.8]. 95 | 96 | ### For bibtex citation 97 | 98 | ``` 99 | @inproceedings{ming2022impact, 100 | title={On the Impact of Spurious Correlation for Out-of-distribution Detection}, 101 | author={Yifei Ming and Hang Yin and Yixuan Li}, 102 | booktitle={The AAAI Conference on Artificial Intelligence (AAAI)}, 103 | year={2022} 104 | } 105 | ``` 106 | -------------------------------------------------------------------------------- /datasets/celebA/celebA_ood.csv: -------------------------------------------------------------------------------- 1 | ,image_id,Male,Bald 2 | 78,000079.jpg,1,1 3 | 114,000115.jpg,1,1 4 | 133,000134.jpg,1,1 5 | 181,000182.jpg,1,1 6 | 225,000226.jpg,1,1 7 | 298,000299.jpg,1,1 8 | 385,000386.jpg,1,1 9 | 401,000402.jpg,1,1 10 | 424,000425.jpg,1,1 11 | 622,000623.jpg,1,1 12 | 728,000729.jpg,1,1 13 | 901,000902.jpg,1,1 14 | 904,000905.jpg,1,1 15 | 906,000907.jpg,1,1 16 | 925,000926.jpg,1,1 17 | 937,000938.jpg,1,1 18 | 1071,001072.jpg,1,1 19 | 1099,001100.jpg,1,1 20 | 1104,001105.jpg,1,1 21 | 1116,001117.jpg,1,1 22 | 1148,001149.jpg,1,1 23 | 1191,001192.jpg,1,1 24 | 1206,001207.jpg,1,1 25 | 1207,001208.jpg,1,1 26 | 1321,001322.jpg,1,1 27 | 1367,001368.jpg,1,1 28 | 1382,001383.jpg,1,1 29 | 
1397,001398.jpg,1,1 30 | 1399,001400.jpg,1,1 31 | 1470,001471.jpg,1,1 32 | 1518,001519.jpg,1,1 33 | 1655,001656.jpg,1,1 34 | 1668,001669.jpg,1,1 35 | 1683,001684.jpg,1,1 36 | 1705,001706.jpg,1,1 37 | 1765,001766.jpg,1,1 38 | 1916,001917.jpg,1,1 39 | 1923,001924.jpg,1,1 40 | 1998,001999.jpg,1,1 41 | 2003,002004.jpg,1,1 42 | 2029,002030.jpg,1,1 43 | 2080,002081.jpg,1,1 44 | 2275,002276.jpg,1,1 45 | 2304,002305.jpg,1,1 46 | 2358,002359.jpg,1,1 47 | 2362,002363.jpg,1,1 48 | 2515,002516.jpg,1,1 49 | 2527,002528.jpg,1,1 50 | 2672,002673.jpg,1,1 51 | 2722,002723.jpg,1,1 52 | 2743,002744.jpg,1,1 53 | 2771,002772.jpg,1,1 54 | 2827,002828.jpg,1,1 55 | 2873,002874.jpg,1,1 56 | 2934,002935.jpg,1,1 57 | 2959,002960.jpg,1,1 58 | 2960,002961.jpg,1,1 59 | 2979,002980.jpg,1,1 60 | 3026,003027.jpg,1,1 61 | 3316,003317.jpg,1,1 62 | 3383,003384.jpg,1,1 63 | 3453,003454.jpg,1,1 64 | 3510,003511.jpg,1,1 65 | 3526,003527.jpg,1,1 66 | 3575,003576.jpg,1,1 67 | 3643,003644.jpg,1,1 68 | 3677,003678.jpg,1,1 69 | 4022,004023.jpg,1,1 70 | 4057,004058.jpg,1,1 71 | 4108,004109.jpg,1,1 72 | 4190,004191.jpg,1,1 73 | 4248,004249.jpg,1,1 74 | 4249,004250.jpg,1,1 75 | 4316,004317.jpg,1,1 76 | 4329,004330.jpg,1,1 77 | 4381,004382.jpg,1,1 78 | 4448,004449.jpg,1,1 79 | 4454,004455.jpg,1,1 80 | 4491,004492.jpg,1,1 81 | 4610,004611.jpg,1,1 82 | 4811,004812.jpg,1,1 83 | 4851,004852.jpg,1,1 84 | 4975,004976.jpg,1,1 85 | 5001,005002.jpg,1,1 86 | 5011,005012.jpg,1,1 87 | 5025,005026.jpg,1,1 88 | 5041,005042.jpg,1,1 89 | 5076,005077.jpg,1,1 90 | 5139,005140.jpg,1,1 91 | 5145,005146.jpg,1,1 92 | 5157,005158.jpg,1,1 93 | 5162,005163.jpg,1,1 94 | 5164,005165.jpg,1,1 95 | 5199,005200.jpg,1,1 96 | 5294,005295.jpg,1,1 97 | 5317,005318.jpg,1,1 98 | 5352,005353.jpg,1,1 99 | 5590,005591.jpg,1,1 100 | 5701,005702.jpg,1,1 101 | 5720,005721.jpg,1,1 102 | 5725,005726.jpg,1,1 103 | 5730,005731.jpg,1,1 104 | 5788,005789.jpg,1,1 105 | 5838,005839.jpg,1,1 106 | 5958,005959.jpg,1,1 107 | 6037,006038.jpg,1,1 108 | 
6173,006174.jpg,1,1 109 | 6211,006212.jpg,1,1 110 | 6218,006219.jpg,1,1 111 | 6241,006242.jpg,1,1 112 | 6299,006300.jpg,1,1 113 | 6341,006342.jpg,1,1 114 | 6388,006389.jpg,1,1 115 | 6390,006391.jpg,1,1 116 | 6493,006494.jpg,1,1 117 | 6554,006555.jpg,1,1 118 | 6602,006603.jpg,1,1 119 | 6645,006646.jpg,1,1 120 | 6669,006670.jpg,1,1 121 | 6672,006673.jpg,1,1 122 | 6710,006711.jpg,1,1 123 | 6878,006879.jpg,1,1 124 | 6941,006942.jpg,1,1 125 | 6965,006966.jpg,1,1 126 | 7050,007051.jpg,1,1 127 | 7151,007152.jpg,1,1 128 | 7386,007387.jpg,1,1 129 | 7398,007399.jpg,1,1 130 | 7400,007401.jpg,1,1 131 | 7426,007427.jpg,1,1 132 | 7431,007432.jpg,1,1 133 | 7571,007572.jpg,1,1 134 | 7593,007594.jpg,1,1 135 | 7717,007718.jpg,1,1 136 | 7723,007724.jpg,1,1 137 | 7728,007729.jpg,1,1 138 | 7768,007769.jpg,1,1 139 | 7771,007772.jpg,1,1 140 | 7786,007787.jpg,1,1 141 | 7999,008000.jpg,1,1 142 | 8024,008025.jpg,1,1 143 | 8082,008083.jpg,1,1 144 | 8188,008189.jpg,1,1 145 | 8195,008196.jpg,1,1 146 | 8224,008225.jpg,1,1 147 | 8302,008303.jpg,1,1 148 | 8360,008361.jpg,1,1 149 | 8488,008489.jpg,1,1 150 | 8507,008508.jpg,1,1 151 | 8547,008548.jpg,1,1 152 | 8570,008571.jpg,1,1 153 | 8606,008607.jpg,1,1 154 | 8643,008644.jpg,1,1 155 | 8665,008666.jpg,1,1 156 | 8694,008695.jpg,1,1 157 | 8732,008733.jpg,1,1 158 | 8764,008765.jpg,1,1 159 | 8780,008781.jpg,1,1 160 | 8854,008855.jpg,1,1 161 | 9244,009245.jpg,1,1 162 | 9289,009290.jpg,1,1 163 | 9349,009350.jpg,1,1 164 | 9350,009351.jpg,1,1 165 | 9356,009357.jpg,1,1 166 | 9402,009403.jpg,1,1 167 | 9469,009470.jpg,1,1 168 | 9496,009497.jpg,1,1 169 | 9591,009592.jpg,1,1 170 | 9594,009595.jpg,1,1 171 | 9617,009618.jpg,1,1 172 | 9627,009628.jpg,1,1 173 | 9695,009696.jpg,1,1 174 | 9699,009700.jpg,1,1 175 | 9706,009707.jpg,1,1 176 | 9741,009742.jpg,1,1 177 | 9792,009793.jpg,1,1 178 | 9920,009921.jpg,1,1 179 | 9949,009950.jpg,1,1 180 | 9972,009973.jpg,1,1 181 | 10006,010007.jpg,1,1 182 | 10185,010186.jpg,1,1 183 | 10211,010212.jpg,1,1 184 | 10212,010213.jpg,1,1 
185 | 10265,010266.jpg,1,1 186 | 10391,010392.jpg,1,1 187 | 10411,010412.jpg,1,1 188 | 10580,010581.jpg,1,1 189 | 10637,010638.jpg,1,1 190 | 10645,010646.jpg,1,1 191 | 10653,010654.jpg,1,1 192 | 10752,010753.jpg,1,1 193 | 10784,010785.jpg,1,1 194 | 10882,010883.jpg,1,1 195 | 10914,010915.jpg,1,1 196 | 10928,010929.jpg,1,1 197 | 11169,011170.jpg,1,1 198 | 11207,011208.jpg,1,1 199 | 11296,011297.jpg,1,1 200 | 11477,011478.jpg,1,1 201 | 11489,011490.jpg,1,1 202 | 11580,011581.jpg,1,1 203 | 11613,011614.jpg,1,1 204 | 11663,011664.jpg,1,1 205 | 11668,011669.jpg,1,1 206 | 11684,011685.jpg,1,1 207 | 11999,012000.jpg,1,1 208 | 12006,012007.jpg,1,1 209 | 12012,012013.jpg,1,1 210 | 12030,012031.jpg,1,1 211 | 12064,012065.jpg,1,1 212 | 12113,012114.jpg,1,1 213 | 12192,012193.jpg,1,1 214 | 12212,012213.jpg,1,1 215 | 12288,012289.jpg,1,1 216 | 12406,012407.jpg,1,1 217 | 12412,012413.jpg,1,1 218 | 12422,012423.jpg,1,1 219 | 12443,012444.jpg,1,1 220 | 12482,012483.jpg,1,1 221 | 12553,012554.jpg,1,1 222 | 12601,012602.jpg,1,1 223 | 12636,012637.jpg,1,1 224 | 12660,012661.jpg,1,1 225 | 12722,012723.jpg,1,1 226 | 12747,012748.jpg,1,1 227 | 12768,012769.jpg,1,1 228 | 12777,012778.jpg,1,1 229 | 12789,012790.jpg,1,1 230 | 12903,012904.jpg,1,1 231 | 12928,012929.jpg,1,1 232 | 12976,012977.jpg,1,1 233 | 13032,013033.jpg,1,1 234 | 13205,013206.jpg,1,1 235 | 13237,013238.jpg,1,1 236 | 13256,013257.jpg,1,1 237 | 13337,013338.jpg,1,1 238 | 13403,013404.jpg,1,1 239 | 13443,013444.jpg,1,1 240 | 13451,013452.jpg,1,1 241 | 13509,013510.jpg,1,1 242 | 13552,013553.jpg,1,1 243 | 13585,013586.jpg,1,1 244 | 13711,013712.jpg,1,1 245 | 13738,013739.jpg,1,1 246 | 13825,013826.jpg,1,1 247 | 13865,013866.jpg,1,1 248 | 13868,013869.jpg,1,1 249 | 13984,013985.jpg,1,1 250 | 14067,014068.jpg,1,1 251 | 14189,014190.jpg,1,1 252 | 14251,014252.jpg,1,1 253 | 14293,014294.jpg,1,1 254 | 14306,014307.jpg,1,1 255 | 14497,014498.jpg,1,1 256 | 14578,014579.jpg,1,1 257 | 14586,014587.jpg,1,1 258 | 14625,014626.jpg,1,1 
259 | 14897,014898.jpg,1,1 260 | 14904,014905.jpg,1,1 261 | 14916,014917.jpg,1,1 262 | 14935,014936.jpg,1,1 263 | 14940,014941.jpg,1,1 264 | 14953,014954.jpg,1,1 265 | 14962,014963.jpg,1,1 266 | 15081,015082.jpg,1,1 267 | 15138,015139.jpg,1,1 268 | 15187,015188.jpg,1,1 269 | 15245,015246.jpg,1,1 270 | 15264,015265.jpg,1,1 271 | 15346,015347.jpg,1,1 272 | 15389,015390.jpg,1,1 273 | 15423,015424.jpg,1,1 274 | 15502,015503.jpg,1,1 275 | 15570,015571.jpg,1,1 276 | 15572,015573.jpg,1,1 277 | 15634,015635.jpg,1,1 278 | 15696,015697.jpg,1,1 279 | 15713,015714.jpg,1,1 280 | 15733,015734.jpg,1,1 281 | 15980,015981.jpg,1,1 282 | 15981,015982.jpg,1,1 283 | 16040,016041.jpg,1,1 284 | 16092,016093.jpg,1,1 285 | 16167,016168.jpg,1,1 286 | 16211,016212.jpg,1,1 287 | 16219,016220.jpg,1,1 288 | 16313,016314.jpg,1,1 289 | 16327,016328.jpg,1,1 290 | 16345,016346.jpg,1,1 291 | 16390,016391.jpg,1,1 292 | 16425,016426.jpg,1,1 293 | 16426,016427.jpg,1,1 294 | 16568,016569.jpg,1,1 295 | 16620,016621.jpg,1,1 296 | 16676,016677.jpg,1,1 297 | 16730,016731.jpg,1,1 298 | 16904,016905.jpg,1,1 299 | 16911,016912.jpg,1,1 300 | 16928,016929.jpg,1,1 301 | 16977,016978.jpg,1,1 302 | 17103,017104.jpg,1,1 303 | 17116,017117.jpg,1,1 304 | 17130,017131.jpg,1,1 305 | 17187,017188.jpg,1,1 306 | 17362,017363.jpg,1,1 307 | 17379,017380.jpg,1,1 308 | 17413,017414.jpg,1,1 309 | 17482,017483.jpg,1,1 310 | 17596,017597.jpg,1,1 311 | 17651,017652.jpg,1,1 312 | 17667,017668.jpg,1,1 313 | 17835,017836.jpg,1,1 314 | 17955,017956.jpg,1,1 315 | 18144,018145.jpg,1,1 316 | 18193,018194.jpg,1,1 317 | 18271,018272.jpg,1,1 318 | 18274,018275.jpg,1,1 319 | 18329,018330.jpg,1,1 320 | 18492,018493.jpg,1,1 321 | 18506,018507.jpg,1,1 322 | 18509,018510.jpg,1,1 323 | 18784,018785.jpg,1,1 324 | 18889,018890.jpg,1,1 325 | 18923,018924.jpg,1,1 326 | 19120,019121.jpg,1,1 327 | 19148,019149.jpg,1,1 328 | 19170,019171.jpg,1,1 329 | 19200,019201.jpg,1,1 330 | 19249,019250.jpg,1,1 331 | 19256,019257.jpg,1,1 332 | 19392,019393.jpg,1,1 
333 | 19414,019415.jpg,1,1 334 | 19436,019437.jpg,1,1 335 | 19509,019510.jpg,1,1 336 | 19546,019547.jpg,1,1 337 | 19665,019666.jpg,1,1 338 | 19779,019780.jpg,1,1 339 | 19970,019971.jpg,1,1 340 | 20001,020002.jpg,1,1 341 | 20036,020037.jpg,1,1 342 | 20148,020149.jpg,1,1 343 | 20172,020173.jpg,1,1 344 | 20211,020212.jpg,1,1 345 | 20221,020222.jpg,1,1 346 | 20222,020223.jpg,1,1 347 | 20348,020349.jpg,1,1 348 | 20355,020356.jpg,1,1 349 | 20376,020377.jpg,1,1 350 | 20431,020432.jpg,1,1 351 | 20469,020470.jpg,1,1 352 | 20477,020478.jpg,1,1 353 | 20543,020544.jpg,1,1 354 | 20552,020553.jpg,1,1 355 | 20618,020619.jpg,1,1 356 | 20674,020675.jpg,1,1 357 | 20740,020741.jpg,1,1 358 | 20837,020838.jpg,1,1 359 | 20934,020935.jpg,1,1 360 | 21022,021023.jpg,1,1 361 | 21044,021045.jpg,1,1 362 | 21082,021083.jpg,1,1 363 | 21131,021132.jpg,1,1 364 | 21141,021142.jpg,1,1 365 | 21160,021161.jpg,1,1 366 | 21310,021311.jpg,1,1 367 | 21318,021319.jpg,1,1 368 | 21319,021320.jpg,1,1 369 | 21332,021333.jpg,1,1 370 | 21348,021349.jpg,1,1 371 | 21352,021353.jpg,1,1 372 | 21361,021362.jpg,1,1 373 | 21461,021462.jpg,1,1 374 | 21466,021467.jpg,1,1 375 | 21560,021561.jpg,1,1 376 | 21597,021598.jpg,1,1 377 | 21684,021685.jpg,1,1 378 | 21745,021746.jpg,1,1 379 | 21870,021871.jpg,1,1 380 | 21974,021975.jpg,1,1 381 | 22164,022165.jpg,1,1 382 | 22182,022183.jpg,1,1 383 | 22368,022369.jpg,1,1 384 | 22461,022462.jpg,1,1 385 | 22467,022468.jpg,1,1 386 | 22516,022517.jpg,1,1 387 | 22590,022591.jpg,1,1 388 | 22629,022630.jpg,1,1 389 | 22678,022679.jpg,1,1 390 | 22733,022734.jpg,1,1 391 | 22756,022757.jpg,1,1 392 | 22935,022936.jpg,1,1 393 | 22945,022946.jpg,1,1 394 | 22953,022954.jpg,1,1 395 | 22961,022962.jpg,1,1 396 | 23024,023025.jpg,1,1 397 | 23055,023056.jpg,1,1 398 | 23096,023097.jpg,1,1 399 | 23116,023117.jpg,1,1 400 | 23168,023169.jpg,1,1 401 | 23180,023181.jpg,1,1 402 | 23215,023216.jpg,1,1 403 | 23283,023284.jpg,1,1 404 | 23317,023318.jpg,1,1 405 | 23452,023453.jpg,1,1 406 | 23470,023471.jpg,1,1 
407 | 23644,023645.jpg,1,1 408 | 23674,023675.jpg,1,1 409 | 23702,023703.jpg,1,1 410 | 23728,023729.jpg,1,1 411 | 23785,023786.jpg,1,1 412 | 23793,023794.jpg,1,1 413 | 23814,023815.jpg,1,1 414 | 24033,024034.jpg,1,1 415 | 24035,024036.jpg,1,1 416 | 24078,024079.jpg,1,1 417 | 24232,024233.jpg,1,1 418 | 24427,024428.jpg,1,1 419 | 24436,024437.jpg,1,1 420 | 24469,024470.jpg,1,1 421 | 24501,024502.jpg,1,1 422 | 24545,024546.jpg,1,1 423 | 24605,024606.jpg,1,1 424 | 24676,024677.jpg,1,1 425 | 24725,024726.jpg,1,1 426 | 24740,024741.jpg,1,1 427 | 24937,024938.jpg,1,1 428 | 24959,024960.jpg,1,1 429 | 24964,024965.jpg,1,1 430 | 25020,025021.jpg,1,1 431 | 25081,025082.jpg,1,1 432 | 25170,025171.jpg,1,1 433 | 25206,025207.jpg,1,1 434 | 25248,025249.jpg,1,1 435 | 25345,025346.jpg,1,1 436 | 25354,025355.jpg,1,1 437 | 25452,025453.jpg,1,1 438 | 25478,025479.jpg,1,1 439 | 25516,025517.jpg,1,1 440 | 25589,025590.jpg,1,1 441 | 25624,025625.jpg,1,1 442 | 25704,025705.jpg,1,1 443 | 25814,025815.jpg,1,1 444 | 25897,025898.jpg,1,1 445 | 25898,025899.jpg,1,1 446 | 25923,025924.jpg,1,1 447 | 25967,025968.jpg,1,1 448 | 26010,026011.jpg,1,1 449 | 26026,026027.jpg,1,1 450 | 26062,026063.jpg,1,1 451 | 26190,026191.jpg,1,1 452 | 26219,026220.jpg,1,1 453 | 26251,026252.jpg,1,1 454 | 26257,026258.jpg,1,1 455 | 26420,026421.jpg,1,1 456 | 26510,026511.jpg,1,1 457 | 26574,026575.jpg,1,1 458 | 26826,026827.jpg,1,1 459 | 26910,026911.jpg,1,1 460 | 26928,026929.jpg,1,1 461 | 27087,027088.jpg,1,1 462 | 27130,027131.jpg,1,1 463 | 27131,027132.jpg,1,1 464 | 27164,027165.jpg,1,1 465 | 27463,027464.jpg,1,1 466 | 27537,027538.jpg,1,1 467 | 27560,027561.jpg,1,1 468 | 27563,027564.jpg,1,1 469 | 27639,027640.jpg,1,1 470 | 27804,027805.jpg,1,1 471 | 27896,027897.jpg,1,1 472 | 27925,027926.jpg,1,1 473 | 28062,028063.jpg,1,1 474 | 28209,028210.jpg,1,1 475 | 28219,028220.jpg,1,1 476 | 28332,028333.jpg,1,1 477 | 28352,028353.jpg,1,1 478 | 28353,028354.jpg,1,1 479 | 28442,028443.jpg,1,1 480 | 28481,028482.jpg,1,1 
481 | 28519,028520.jpg,1,1 482 | 28609,028610.jpg,1,1 483 | 28610,028611.jpg,1,1 484 | 28636,028637.jpg,1,1 485 | 28666,028667.jpg,1,1 486 | 28677,028678.jpg,1,1 487 | 28723,028724.jpg,1,1 488 | 28831,028832.jpg,1,1 489 | 28898,028899.jpg,1,1 490 | 28923,028924.jpg,1,1 491 | 28937,028938.jpg,1,1 492 | 28945,028946.jpg,1,1 493 | 29102,029103.jpg,1,1 494 | 29109,029110.jpg,1,1 495 | 29210,029211.jpg,1,1 496 | 29270,029271.jpg,1,1 497 | 29352,029353.jpg,1,1 498 | 29475,029476.jpg,1,1 499 | 29498,029499.jpg,1,1 500 | 29535,029536.jpg,1,1 501 | 29555,029556.jpg,1,1 502 | 29621,029622.jpg,1,1 503 | 29637,029638.jpg,1,1 504 | 29646,029647.jpg,1,1 505 | 29682,029683.jpg,1,1 506 | 29684,029685.jpg,1,1 507 | 29710,029711.jpg,1,1 508 | 29882,029883.jpg,1,1 509 | 29922,029923.jpg,1,1 510 | 30077,030078.jpg,1,1 511 | 30428,030429.jpg,1,1 512 | 30621,030622.jpg,1,1 513 | 30703,030704.jpg,1,1 514 | 30717,030718.jpg,1,1 515 | 30733,030734.jpg,1,1 516 | 30826,030827.jpg,1,1 517 | 30887,030888.jpg,1,1 518 | 31001,031002.jpg,1,1 519 | 31033,031034.jpg,1,1 520 | 31136,031137.jpg,1,1 521 | 31246,031247.jpg,1,1 522 | 31269,031270.jpg,1,1 523 | 31274,031275.jpg,1,1 524 | 31467,031468.jpg,1,1 525 | 31494,031495.jpg,1,1 526 | 31530,031531.jpg,1,1 527 | 31579,031580.jpg,1,1 528 | 31615,031616.jpg,1,1 529 | 31625,031626.jpg,1,1 530 | 31629,031630.jpg,1,1 531 | 31635,031636.jpg,1,1 532 | 31650,031651.jpg,1,1 533 | 31684,031685.jpg,1,1 534 | 31810,031811.jpg,1,1 535 | 31953,031954.jpg,1,1 536 | 31979,031980.jpg,1,1 537 | 32057,032058.jpg,1,1 538 | 32163,032164.jpg,1,1 539 | 32188,032189.jpg,1,1 540 | 32303,032304.jpg,1,1 541 | 32337,032338.jpg,1,1 542 | 32359,032360.jpg,1,1 543 | 32447,032448.jpg,1,1 544 | 32661,032662.jpg,1,1 545 | 32835,032836.jpg,1,1 546 | 32875,032876.jpg,1,1 547 | 33222,033223.jpg,1,1 548 | 33277,033278.jpg,1,1 549 | 33302,033303.jpg,1,1 550 | 33359,033360.jpg,1,1 551 | 33533,033534.jpg,1,1 552 | 33641,033642.jpg,1,1 553 | 33686,033687.jpg,1,1 554 | 33699,033700.jpg,1,1 
555 | 33706,033707.jpg,1,1 556 | 33717,033718.jpg,1,1 557 | 33745,033746.jpg,1,1 558 | 33815,033816.jpg,1,1 559 | 33944,033945.jpg,1,1 560 | 34010,034011.jpg,1,1 561 | 34021,034022.jpg,1,1 562 | 34064,034065.jpg,1,1 563 | 34088,034089.jpg,1,1 564 | 34097,034098.jpg,1,1 565 | 34413,034414.jpg,1,1 566 | 34418,034419.jpg,1,1 567 | 34528,034529.jpg,1,1 568 | 34534,034535.jpg,1,1 569 | 34542,034543.jpg,1,1 570 | 34558,034559.jpg,1,1 571 | 34589,034590.jpg,1,1 572 | 34648,034649.jpg,1,1 573 | 34656,034657.jpg,1,1 574 | 34689,034690.jpg,1,1 575 | 34793,034794.jpg,1,1 576 | 34813,034814.jpg,1,1 577 | 34850,034851.jpg,1,1 578 | 34888,034889.jpg,1,1 579 | 34928,034929.jpg,1,1 580 | 35083,035084.jpg,1,1 581 | 35114,035115.jpg,1,1 582 | 35131,035132.jpg,1,1 583 | 35150,035151.jpg,1,1 584 | 35192,035193.jpg,1,1 585 | 35257,035258.jpg,1,1 586 | 35263,035264.jpg,1,1 587 | 35272,035273.jpg,1,1 588 | 35281,035282.jpg,1,1 589 | 35409,035410.jpg,1,1 590 | 35491,035492.jpg,1,1 591 | 35584,035585.jpg,1,1 592 | 35667,035668.jpg,1,1 593 | 35671,035672.jpg,1,1 594 | 35765,035766.jpg,1,1 595 | 35781,035782.jpg,1,1 596 | 35946,035947.jpg,1,1 597 | 36054,036055.jpg,1,1 598 | 36093,036094.jpg,1,1 599 | 36167,036168.jpg,1,1 600 | 36173,036174.jpg,1,1 601 | 36204,036205.jpg,1,1 602 | 36256,036257.jpg,1,1 603 | 36296,036297.jpg,1,1 604 | 36325,036326.jpg,1,1 605 | 36396,036397.jpg,1,1 606 | 36404,036405.jpg,1,1 607 | 36499,036500.jpg,1,1 608 | 36553,036554.jpg,1,1 609 | 36604,036605.jpg,1,1 610 | 36731,036732.jpg,1,1 611 | 36776,036777.jpg,1,1 612 | 36778,036779.jpg,1,1 613 | 36868,036869.jpg,1,1 614 | 36900,036901.jpg,1,1 615 | 36906,036907.jpg,1,1 616 | 36926,036927.jpg,1,1 617 | 36939,036940.jpg,1,1 618 | 36948,036949.jpg,1,1 619 | 37090,037091.jpg,1,1 620 | 37210,037211.jpg,1,1 621 | 37256,037257.jpg,1,1 622 | 37307,037308.jpg,1,1 623 | 37342,037343.jpg,1,1 624 | 37419,037420.jpg,1,1 625 | 37441,037442.jpg,1,1 626 | 37456,037457.jpg,1,1 627 | 37468,037469.jpg,1,1 628 | 37476,037477.jpg,1,1 
629 | 37508,037509.jpg,1,1 630 | 37612,037613.jpg,1,1 631 | 37799,037800.jpg,1,1 632 | 38049,038050.jpg,1,1 633 | 38060,038061.jpg,1,1 634 | 38072,038073.jpg,1,1 635 | 38080,038081.jpg,1,1 636 | 38175,038176.jpg,1,1 637 | 38226,038227.jpg,1,1 638 | 38323,038324.jpg,1,1 639 | 38334,038335.jpg,1,1 640 | 38401,038402.jpg,1,1 641 | 38410,038411.jpg,1,1 642 | 38434,038435.jpg,1,1 643 | 38500,038501.jpg,1,1 644 | 38663,038664.jpg,1,1 645 | 38784,038785.jpg,1,1 646 | 38847,038848.jpg,1,1 647 | 38868,038869.jpg,1,1 648 | 38911,038912.jpg,1,1 649 | 38952,038953.jpg,1,1 650 | 38959,038960.jpg,1,1 651 | 38996,038997.jpg,1,1 652 | 38999,039000.jpg,1,1 653 | 39009,039010.jpg,1,1 654 | 39017,039018.jpg,1,1 655 | 39018,039019.jpg,1,1 656 | 39071,039072.jpg,1,1 657 | 39079,039080.jpg,1,1 658 | 39160,039161.jpg,1,1 659 | 39267,039268.jpg,1,1 660 | 39385,039386.jpg,1,1 661 | 39401,039402.jpg,1,1 662 | 39402,039403.jpg,1,1 663 | 39470,039471.jpg,1,1 664 | 39518,039519.jpg,1,1 665 | 39534,039535.jpg,1,1 666 | 39568,039569.jpg,1,1 667 | 39614,039615.jpg,1,1 668 | 39642,039643.jpg,1,1 669 | 39696,039697.jpg,1,1 670 | 39712,039713.jpg,1,1 671 | 39722,039723.jpg,1,1 672 | 39859,039860.jpg,1,1 673 | 39862,039863.jpg,1,1 674 | 39869,039870.jpg,1,1 675 | 40053,040054.jpg,1,1 676 | 40099,040100.jpg,1,1 677 | 40100,040101.jpg,1,1 678 | 40250,040251.jpg,1,1 679 | 40326,040327.jpg,1,1 680 | 40352,040353.jpg,1,1 681 | 40415,040416.jpg,1,1 682 | 40565,040566.jpg,1,1 683 | 40594,040595.jpg,1,1 684 | 40615,040616.jpg,1,1 685 | 40791,040792.jpg,1,1 686 | 40850,040851.jpg,1,1 687 | 40921,040922.jpg,1,1 688 | 40951,040952.jpg,1,1 689 | 40961,040962.jpg,1,1 690 | 40963,040964.jpg,1,1 691 | 40987,040988.jpg,1,1 692 | 40998,040999.jpg,1,1 693 | 41178,041179.jpg,1,1 694 | 41195,041196.jpg,1,1 695 | 41236,041237.jpg,1,1 696 | 41291,041292.jpg,1,1 697 | 41487,041488.jpg,1,1 698 | 41590,041591.jpg,1,1 699 | 41809,041810.jpg,1,1 700 | 41854,041855.jpg,1,1 701 | 41948,041949.jpg,1,1 702 | 41969,041970.jpg,1,1 
703 | 42074,042075.jpg,1,1 704 | 42203,042204.jpg,1,1 705 | 42204,042205.jpg,1,1 706 | 42274,042275.jpg,1,1 707 | 42297,042298.jpg,1,1 708 | 42338,042339.jpg,1,1 709 | 42490,042491.jpg,1,1 710 | 42565,042566.jpg,1,1 711 | 42646,042647.jpg,1,1 712 | 42749,042750.jpg,1,1 713 | 42824,042825.jpg,1,1 714 | 42967,042968.jpg,1,1 715 | 42984,042985.jpg,1,1 716 | 43109,043110.jpg,1,1 717 | 43118,043119.jpg,1,1 718 | 43299,043300.jpg,1,1 719 | 43335,043336.jpg,1,1 720 | 43339,043340.jpg,1,1 721 | 43358,043359.jpg,1,1 722 | 43376,043377.jpg,1,1 723 | 43418,043419.jpg,1,1 724 | 43428,043429.jpg,1,1 725 | 43618,043619.jpg,1,1 726 | 43619,043620.jpg,1,1 727 | 43895,043896.jpg,1,1 728 | 43909,043910.jpg,1,1 729 | 44030,044031.jpg,1,1 730 | 44104,044105.jpg,1,1 731 | 44150,044151.jpg,1,1 732 | 44207,044208.jpg,1,1 733 | 44298,044299.jpg,1,1 734 | 44310,044311.jpg,1,1 735 | 44358,044359.jpg,1,1 736 | 44392,044393.jpg,1,1 737 | 44413,044414.jpg,1,1 738 | 44532,044533.jpg,1,1 739 | 44538,044539.jpg,1,1 740 | 44572,044573.jpg,1,1 741 | 44612,044613.jpg,1,1 742 | 44616,044617.jpg,1,1 743 | 44642,044643.jpg,1,1 744 | 44765,044766.jpg,1,1 745 | 44804,044805.jpg,1,1 746 | 44916,044917.jpg,1,1 747 | 44954,044955.jpg,1,1 748 | 45044,045045.jpg,1,1 749 | 45142,045143.jpg,1,1 750 | 45147,045148.jpg,1,1 751 | 45162,045163.jpg,1,1 752 | 45229,045230.jpg,1,1 753 | 45354,045355.jpg,1,1 754 | 45363,045364.jpg,1,1 755 | 45402,045403.jpg,1,1 756 | 45502,045503.jpg,1,1 757 | 45585,045586.jpg,1,1 758 | 45609,045610.jpg,1,1 759 | 45743,045744.jpg,1,1 760 | 45774,045775.jpg,1,1 761 | 45807,045808.jpg,1,1 762 | 45817,045818.jpg,1,1 763 | 45846,045847.jpg,1,1 764 | 45895,045896.jpg,1,1 765 | 45935,045936.jpg,1,1 766 | 46226,046227.jpg,1,1 767 | 46245,046246.jpg,1,1 768 | 46290,046291.jpg,1,1 769 | 46344,046345.jpg,1,1 770 | 46400,046401.jpg,1,1 771 | 46425,046426.jpg,1,1 772 | 46431,046432.jpg,1,1 773 | 46440,046441.jpg,1,1 774 | 46514,046515.jpg,1,1 775 | 46567,046568.jpg,1,1 776 | 46597,046598.jpg,1,1 
777 | 46598,046599.jpg,1,1 778 | 46625,046626.jpg,1,1 779 | 46755,046756.jpg,1,1 780 | 46825,046826.jpg,1,1 781 | 46893,046894.jpg,1,1 782 | 47021,047022.jpg,1,1 783 | 47026,047027.jpg,1,1 784 | 47046,047047.jpg,1,1 785 | 47129,047130.jpg,1,1 786 | 47181,047182.jpg,1,1 787 | 47214,047215.jpg,1,1 788 | 47215,047216.jpg,1,1 789 | 47226,047227.jpg,1,1 790 | 47300,047301.jpg,1,1 791 | 47367,047368.jpg,1,1 792 | 47402,047403.jpg,1,1 793 | 47494,047495.jpg,1,1 794 | 47574,047575.jpg,1,1 795 | 47615,047616.jpg,1,1 796 | 47634,047635.jpg,1,1 797 | 47703,047704.jpg,1,1 798 | 47776,047777.jpg,1,1 799 | 47785,047786.jpg,1,1 800 | 47853,047854.jpg,1,1 801 | 47875,047876.jpg,1,1 802 | 47959,047960.jpg,1,1 803 | 47981,047982.jpg,1,1 804 | 48009,048010.jpg,1,1 805 | 48130,048131.jpg,1,1 806 | 48177,048178.jpg,1,1 807 | 48224,048225.jpg,1,1 808 | 48236,048237.jpg,1,1 809 | 48464,048465.jpg,1,1 810 | 48576,048577.jpg,1,1 811 | 48632,048633.jpg,1,1 812 | 48742,048743.jpg,1,1 813 | 48772,048773.jpg,1,1 814 | 48905,048906.jpg,1,1 815 | 48906,048907.jpg,1,1 816 | 48953,048954.jpg,1,1 817 | 48991,048992.jpg,1,1 818 | 49084,049085.jpg,1,1 819 | 49106,049107.jpg,1,1 820 | 49155,049156.jpg,1,1 821 | 49222,049223.jpg,1,1 822 | 49258,049259.jpg,1,1 823 | 49299,049300.jpg,1,1 824 | 49341,049342.jpg,1,1 825 | 49587,049588.jpg,1,1 826 | 49593,049594.jpg,1,1 827 | 49623,049624.jpg,1,1 828 | 49653,049654.jpg,1,1 829 | 49656,049657.jpg,1,1 830 | 49741,049742.jpg,1,1 831 | 49755,049756.jpg,1,1 832 | 49867,049868.jpg,1,1 833 | 49870,049871.jpg,1,1 834 | 50040,050041.jpg,1,1 835 | 50052,050053.jpg,1,1 836 | 50239,050240.jpg,1,1 837 | 50267,050268.jpg,1,1 838 | 50394,050395.jpg,1,1 839 | 50582,050583.jpg,1,1 840 | 50600,050601.jpg,1,1 841 | 50735,050736.jpg,1,1 842 | 50803,050804.jpg,1,1 843 | 50835,050836.jpg,1,1 844 | 50900,050901.jpg,1,1 845 | 50926,050927.jpg,1,1 846 | 50966,050967.jpg,1,1 847 | 51054,051055.jpg,1,1 848 | 51230,051231.jpg,1,1 849 | 51257,051258.jpg,1,1 850 | 51310,051311.jpg,1,1 
851 | 51403,051404.jpg,1,1 852 | 51407,051408.jpg,1,1 853 | 51445,051446.jpg,1,1 854 | 51491,051492.jpg,1,1 855 | 51513,051514.jpg,1,1 856 | 51527,051528.jpg,1,1 857 | 51591,051592.jpg,1,1 858 | 51598,051599.jpg,1,1 859 | 51691,051692.jpg,1,1 860 | 51766,051767.jpg,1,1 861 | 51868,051869.jpg,1,1 862 | 51932,051933.jpg,1,1 863 | 51935,051936.jpg,1,1 864 | 52015,052016.jpg,1,1 865 | 52236,052237.jpg,1,1 866 | 52254,052255.jpg,1,1 867 | 52418,052419.jpg,1,1 868 | 52446,052447.jpg,1,1 869 | 52460,052461.jpg,1,1 870 | 52532,052533.jpg,1,1 871 | 52584,052585.jpg,1,1 872 | 52749,052750.jpg,1,1 873 | 52807,052808.jpg,1,1 874 | 52865,052866.jpg,1,1 875 | 52942,052943.jpg,1,1 876 | 52950,052951.jpg,1,1 877 | 52964,052965.jpg,1,1 878 | 52981,052982.jpg,1,1 879 | 53101,053102.jpg,1,1 880 | 53160,053161.jpg,1,1 881 | 53199,053200.jpg,1,1 882 | 53256,053257.jpg,1,1 883 | 53266,053267.jpg,1,1 884 | 53345,053346.jpg,1,1 885 | 53393,053394.jpg,1,1 886 | 53399,053400.jpg,1,1 887 | 53412,053413.jpg,1,1 888 | 53527,053528.jpg,1,1 889 | 53628,053629.jpg,1,1 890 | 53634,053635.jpg,1,1 891 | 53711,053712.jpg,1,1 892 | 53748,053749.jpg,1,1 893 | 53772,053773.jpg,1,1 894 | 53793,053794.jpg,1,1 895 | 53879,053880.jpg,1,1 896 | 54035,054036.jpg,1,1 897 | 54058,054059.jpg,1,1 898 | 54164,054165.jpg,1,1 899 | 54200,054201.jpg,1,1 900 | 54281,054282.jpg,1,1 901 | 54315,054316.jpg,1,1 902 | 54317,054318.jpg,1,1 903 | 54389,054390.jpg,1,1 904 | 54539,054540.jpg,1,1 905 | 54713,054714.jpg,1,1 906 | 54717,054718.jpg,1,1 907 | 54766,054767.jpg,1,1 908 | 54788,054789.jpg,1,1 909 | 54827,054828.jpg,1,1 910 | 54915,054916.jpg,1,1 911 | 54924,054925.jpg,1,1 912 | 55022,055023.jpg,1,1 913 | 55098,055099.jpg,1,1 914 | 55129,055130.jpg,1,1 915 | 55203,055204.jpg,1,1 916 | 55228,055229.jpg,1,1 917 | 55245,055246.jpg,1,1 918 | 55273,055274.jpg,1,1 919 | 55290,055291.jpg,1,1 920 | 55320,055321.jpg,1,1 921 | 55409,055410.jpg,1,1 922 | 55476,055477.jpg,1,1 923 | 55586,055587.jpg,1,1 924 | 55663,055664.jpg,1,1 
925 | 55767,055768.jpg,1,1 926 | 55889,055890.jpg,1,1 927 | 55989,055990.jpg,1,1 928 | 56065,056066.jpg,1,1 929 | 56110,056111.jpg,1,1 930 | 56128,056129.jpg,1,1 931 | 56141,056142.jpg,1,1 932 | 56254,056255.jpg,1,1 933 | 56268,056269.jpg,1,1 934 | 56269,056270.jpg,1,1 935 | 56319,056320.jpg,1,1 936 | 56352,056353.jpg,1,1 937 | 56395,056396.jpg,1,1 938 | 56492,056493.jpg,1,1 939 | 56653,056654.jpg,1,1 940 | 56805,056806.jpg,1,1 941 | 56845,056846.jpg,1,1 942 | 56920,056921.jpg,1,1 943 | 56924,056925.jpg,1,1 944 | 57006,057007.jpg,1,1 945 | 57092,057093.jpg,1,1 946 | 57176,057177.jpg,1,1 947 | 57197,057198.jpg,1,1 948 | 57234,057235.jpg,1,1 949 | 57445,057446.jpg,1,1 950 | 57594,057595.jpg,1,1 951 | 57601,057602.jpg,1,1 952 | 57723,057724.jpg,1,1 953 | 57838,057839.jpg,1,1 954 | 57938,057939.jpg,1,1 955 | 58040,058041.jpg,1,1 956 | 58044,058045.jpg,1,1 957 | 58091,058092.jpg,1,1 958 | 58126,058127.jpg,1,1 959 | 58128,058129.jpg,1,1 960 | 58247,058248.jpg,1,1 961 | 58250,058251.jpg,1,1 962 | 58398,058399.jpg,1,1 963 | 58437,058438.jpg,1,1 964 | 58493,058494.jpg,1,1 965 | 58496,058497.jpg,1,1 966 | 58528,058529.jpg,1,1 967 | 58602,058603.jpg,1,1 968 | 58642,058643.jpg,1,1 969 | 58653,058654.jpg,1,1 970 | 58737,058738.jpg,1,1 971 | 58789,058790.jpg,1,1 972 | 58867,058868.jpg,1,1 973 | 58935,058936.jpg,1,1 974 | 58946,058947.jpg,1,1 975 | 58951,058952.jpg,1,1 976 | 59012,059013.jpg,1,1 977 | 59063,059064.jpg,1,1 978 | 59293,059294.jpg,1,1 979 | 59346,059347.jpg,1,1 980 | 59380,059381.jpg,1,1 981 | 59440,059441.jpg,1,1 982 | 59496,059497.jpg,1,1 983 | 59570,059571.jpg,1,1 984 | 59578,059579.jpg,1,1 985 | 59613,059614.jpg,1,1 986 | 59732,059733.jpg,1,1 987 | 59770,059771.jpg,1,1 988 | 59774,059775.jpg,1,1 989 | 59792,059793.jpg,1,1 990 | 59897,059898.jpg,1,1 991 | 59967,059968.jpg,1,1 992 | 60040,060041.jpg,1,1 993 | 60056,060057.jpg,1,1 994 | 60057,060058.jpg,1,1 995 | 60079,060080.jpg,1,1 996 | 60127,060128.jpg,1,1 997 | 60228,060229.jpg,1,1 998 | 60270,060271.jpg,1,1 
999 | 60308,060309.jpg,1,1 1000 | 60386,060387.jpg,1,1 1001 | 60388,060389.jpg,1,1 1002 | 60392,060393.jpg,1,1 1003 | 60434,060435.jpg,1,1 1004 | 60548,060549.jpg,1,1 1005 | 60600,060601.jpg,1,1 1006 | 60616,060617.jpg,1,1 1007 | 60756,060757.jpg,1,1 1008 | 60771,060772.jpg,1,1 1009 | 60778,060779.jpg,1,1 1010 | 60873,060874.jpg,1,1 1011 | 60958,060959.jpg,1,1 1012 | 60983,060984.jpg,1,1 1013 | 61017,061018.jpg,1,1 1014 | 61030,061031.jpg,1,1 1015 | 61059,061060.jpg,1,1 1016 | 61135,061136.jpg,1,1 1017 | 61177,061178.jpg,1,1 1018 | 61199,061200.jpg,1,1 1019 | 61228,061229.jpg,1,1 1020 | 61270,061271.jpg,1,1 1021 | 61341,061342.jpg,1,1 1022 | 61342,061343.jpg,1,1 1023 | 61343,061344.jpg,1,1 1024 | 61357,061358.jpg,1,1 1025 | 61358,061359.jpg,1,1 1026 | 61370,061371.jpg,1,1 1027 | 61375,061376.jpg,1,1 1028 | 61456,061457.jpg,1,1 1029 | 61628,061629.jpg,1,1 1030 | 61638,061639.jpg,1,1 1031 | 61760,061761.jpg,1,1 1032 | 61782,061783.jpg,1,1 1033 | 61790,061791.jpg,1,1 1034 | 61812,061813.jpg,1,1 1035 | 61950,061951.jpg,1,1 1036 | 61976,061977.jpg,1,1 1037 | 62099,062100.jpg,1,1 1038 | 62105,062106.jpg,1,1 1039 | 62390,062391.jpg,1,1 1040 | 62448,062449.jpg,1,1 1041 | 62490,062491.jpg,1,1 1042 | 62513,062514.jpg,1,1 1043 | 62678,062679.jpg,1,1 1044 | 62748,062749.jpg,1,1 1045 | 62773,062774.jpg,1,1 1046 | 62811,062812.jpg,1,1 1047 | 62822,062823.jpg,1,1 1048 | 62903,062904.jpg,1,1 1049 | 62952,062953.jpg,1,1 1050 | 62956,062957.jpg,1,1 1051 | 62992,062993.jpg,1,1 1052 | 63064,063065.jpg,1,1 1053 | 63065,063066.jpg,1,1 1054 | 63213,063214.jpg,1,1 1055 | 63254,063255.jpg,1,1 1056 | 63297,063298.jpg,1,1 1057 | 63443,063444.jpg,1,1 1058 | 63537,063538.jpg,1,1 1059 | 63556,063557.jpg,1,1 1060 | 63610,063611.jpg,1,1 1061 | 63767,063768.jpg,1,1 1062 | 63804,063805.jpg,1,1 1063 | 63820,063821.jpg,1,1 1064 | 63867,063868.jpg,1,1 1065 | 63893,063894.jpg,1,1 1066 | 63930,063931.jpg,1,1 1067 | 64010,064011.jpg,1,1 1068 | 64070,064071.jpg,1,1 1069 | 64351,064352.jpg,1,1 1070 | 
64466,064467.jpg,1,1 1071 | 64534,064535.jpg,1,1 1072 | 64559,064560.jpg,1,1 1073 | 64626,064627.jpg,1,1 1074 | 64671,064672.jpg,1,1 1075 | 64836,064837.jpg,1,1 1076 | 64921,064922.jpg,1,1 1077 | 65080,065081.jpg,1,1 1078 | 65122,065123.jpg,1,1 1079 | 65137,065138.jpg,1,1 1080 | 65171,065172.jpg,1,1 1081 | 65189,065190.jpg,1,1 1082 | 65271,065272.jpg,1,1 1083 | 65289,065290.jpg,1,1 1084 | 65361,065362.jpg,1,1 1085 | 65486,065487.jpg,1,1 1086 | 65546,065547.jpg,1,1 1087 | 65551,065552.jpg,1,1 1088 | 65616,065617.jpg,1,1 1089 | 65680,065681.jpg,1,1 1090 | 65864,065865.jpg,1,1 1091 | 65903,065904.jpg,1,1 1092 | 65976,065977.jpg,1,1 1093 | 65999,066000.jpg,1,1 1094 | 66111,066112.jpg,1,1 1095 | 66202,066203.jpg,1,1 1096 | 66211,066212.jpg,1,1 1097 | 66274,066275.jpg,1,1 1098 | 66276,066277.jpg,1,1 1099 | 66310,066311.jpg,1,1 1100 | 66329,066330.jpg,1,1 1101 | 66452,066453.jpg,1,1 1102 | 66534,066535.jpg,1,1 1103 | 66623,066624.jpg,1,1 1104 | 66662,066663.jpg,1,1 1105 | 66674,066675.jpg,1,1 1106 | 66715,066716.jpg,1,1 1107 | 66756,066757.jpg,1,1 1108 | 66768,066769.jpg,1,1 1109 | 66821,066822.jpg,1,1 1110 | 66946,066947.jpg,1,1 1111 | 67025,067026.jpg,1,1 1112 | 67036,067037.jpg,1,1 1113 | 67118,067119.jpg,1,1 1114 | 67151,067152.jpg,1,1 1115 | 67276,067277.jpg,1,1 1116 | 67520,067521.jpg,1,1 1117 | 67542,067543.jpg,1,1 1118 | 67643,067644.jpg,1,1 1119 | 67819,067820.jpg,1,1 1120 | 67864,067865.jpg,1,1 1121 | 67894,067895.jpg,1,1 1122 | 68048,068049.jpg,1,1 1123 | 68229,068230.jpg,1,1 1124 | 68278,068279.jpg,1,1 1125 | 68625,068626.jpg,1,1 1126 | 68627,068628.jpg,1,1 1127 | 68703,068704.jpg,1,1 1128 | 68978,068979.jpg,1,1 1129 | 69087,069088.jpg,1,1 1130 | 69123,069124.jpg,1,1 1131 | 69163,069164.jpg,1,1 1132 | 69236,069237.jpg,1,1 1133 | 69260,069261.jpg,1,1 1134 | 69262,069263.jpg,1,1 1135 | 69281,069282.jpg,1,1 1136 | 69319,069320.jpg,1,1 1137 | 69331,069332.jpg,1,1 1138 | 69355,069356.jpg,1,1 1139 | 69438,069439.jpg,1,1 1140 | 69549,069550.jpg,1,1 1141 | 
69584,069585.jpg,1,1 1142 | 69618,069619.jpg,1,1 1143 | 69696,069697.jpg,1,1 1144 | 69771,069772.jpg,1,1 1145 | 69790,069791.jpg,1,1 1146 | 69830,069831.jpg,1,1 1147 | 69891,069892.jpg,1,1 1148 | 69939,069940.jpg,1,1 1149 | 70015,070016.jpg,1,1 1150 | 70046,070047.jpg,1,1 1151 | 70068,070069.jpg,1,1 1152 | 70098,070099.jpg,1,1 1153 | 70165,070166.jpg,1,1 1154 | 70244,070245.jpg,1,1 1155 | 70277,070278.jpg,1,1 1156 | 70280,070281.jpg,1,1 1157 | 70288,070289.jpg,1,1 1158 | 70290,070291.jpg,1,1 1159 | 70330,070331.jpg,1,1 1160 | 70420,070421.jpg,1,1 1161 | 70443,070444.jpg,1,1 1162 | 70458,070459.jpg,1,1 1163 | 70485,070486.jpg,1,1 1164 | 70496,070497.jpg,1,1 1165 | 70573,070574.jpg,1,1 1166 | 70697,070698.jpg,1,1 1167 | 70769,070770.jpg,1,1 1168 | 70801,070802.jpg,1,1 1169 | 70839,070840.jpg,1,1 1170 | 71118,071119.jpg,1,1 1171 | 71122,071123.jpg,1,1 1172 | 71168,071169.jpg,1,1 1173 | 71186,071187.jpg,1,1 1174 | 71200,071201.jpg,1,1 1175 | 71396,071397.jpg,1,1 1176 | 71402,071403.jpg,1,1 1177 | 71493,071494.jpg,1,1 1178 | 71633,071634.jpg,1,1 1179 | 71650,071651.jpg,1,1 1180 | 71756,071757.jpg,1,1 1181 | 71796,071797.jpg,1,1 1182 | 71866,071867.jpg,1,1 1183 | 71875,071876.jpg,1,1 1184 | 72017,072018.jpg,1,1 1185 | 72031,072032.jpg,1,1 1186 | 72032,072033.jpg,1,1 1187 | 72072,072073.jpg,1,1 1188 | 72087,072088.jpg,1,1 1189 | 72101,072102.jpg,1,1 1190 | 72154,072155.jpg,1,1 1191 | 72225,072226.jpg,1,1 1192 | 72235,072236.jpg,1,1 1193 | 72248,072249.jpg,1,1 1194 | 72264,072265.jpg,1,1 1195 | 72310,072311.jpg,1,1 1196 | 72510,072511.jpg,1,1 1197 | 72544,072545.jpg,1,1 1198 | 72652,072653.jpg,1,1 1199 | 72709,072710.jpg,1,1 1200 | 72772,072773.jpg,1,1 1201 | 72987,072988.jpg,1,1 1202 | 73023,073024.jpg,1,1 1203 | 73120,073121.jpg,1,1 1204 | 73128,073129.jpg,1,1 1205 | 73358,073359.jpg,1,1 1206 | 73370,073371.jpg,1,1 1207 | 73382,073383.jpg,1,1 1208 | 73502,073503.jpg,1,1 1209 | 73571,073572.jpg,1,1 1210 | 73738,073739.jpg,1,1 1211 | 73776,073777.jpg,1,1 1212 | 
73824,073825.jpg,1,1 1213 | 73842,073843.jpg,1,1 1214 | 73907,073908.jpg,1,1 1215 | 73969,073970.jpg,1,1 1216 | 74002,074003.jpg,1,1 1217 | 74094,074095.jpg,1,1 1218 | 74106,074107.jpg,1,1 1219 | 74176,074177.jpg,1,1 1220 | 74208,074209.jpg,1,1 1221 | 74212,074213.jpg,1,1 1222 | 74281,074282.jpg,1,1 1223 | 74302,074303.jpg,1,1 1224 | 74344,074345.jpg,1,1 1225 | 74495,074496.jpg,1,1 1226 | 74675,074676.jpg,1,1 1227 | 74727,074728.jpg,1,1 1228 | 74902,074903.jpg,1,1 1229 | 74905,074906.jpg,1,1 1230 | 74947,074948.jpg,1,1 1231 | 74968,074969.jpg,1,1 1232 | 75021,075022.jpg,1,1 1233 | 75147,075148.jpg,1,1 1234 | 75151,075152.jpg,1,1 1235 | 75196,075197.jpg,1,1 1236 | 75327,075328.jpg,1,1 1237 | 75337,075338.jpg,1,1 1238 | 75379,075380.jpg,1,1 1239 | 75416,075417.jpg,1,1 1240 | 75421,075422.jpg,1,1 1241 | 75505,075506.jpg,1,1 1242 | 75598,075599.jpg,1,1 1243 | 75628,075629.jpg,1,1 1244 | 75690,075691.jpg,1,1 1245 | 75847,075848.jpg,1,1 1246 | 75980,075981.jpg,1,1 1247 | 76076,076077.jpg,1,1 1248 | 76082,076083.jpg,1,1 1249 | 76137,076138.jpg,1,1 1250 | 76138,076139.jpg,1,1 1251 | 76442,076443.jpg,1,1 1252 | 76484,076485.jpg,1,1 1253 | 76518,076519.jpg,1,1 1254 | 76520,076521.jpg,1,1 1255 | 76542,076543.jpg,1,1 1256 | 76574,076575.jpg,1,1 1257 | 76649,076650.jpg,1,1 1258 | 76696,076697.jpg,1,1 1259 | 76833,076834.jpg,1,1 1260 | 76897,076898.jpg,1,1 1261 | 76901,076902.jpg,1,1 1262 | 76912,076913.jpg,1,1 1263 | 76930,076931.jpg,1,1 1264 | 76986,076987.jpg,1,1 1265 | 77003,077004.jpg,1,1 1266 | 77099,077100.jpg,1,1 1267 | 77163,077164.jpg,1,1 1268 | 77206,077207.jpg,1,1 1269 | 77276,077277.jpg,1,1 1270 | 77308,077309.jpg,1,1 1271 | 77311,077312.jpg,1,1 1272 | 77339,077340.jpg,1,1 1273 | 77354,077355.jpg,1,1 1274 | 77505,077506.jpg,1,1 1275 | 77536,077537.jpg,1,1 1276 | 77592,077593.jpg,1,1 1277 | 77607,077608.jpg,1,1 1278 | 77677,077678.jpg,1,1 1279 | 77716,077717.jpg,1,1 1280 | 77733,077734.jpg,1,1 1281 | 77801,077802.jpg,1,1 1282 | 77862,077863.jpg,1,1 1283 | 
77863,077864.jpg,1,1 1284 | 77900,077901.jpg,1,1 1285 | 77904,077905.jpg,1,1 1286 | 78017,078018.jpg,1,1 1287 | 78079,078080.jpg,1,1 1288 | 78156,078157.jpg,1,1 1289 | 78184,078185.jpg,1,1 1290 | 78225,078226.jpg,1,1 1291 | 78302,078303.jpg,1,1 1292 | 78326,078327.jpg,1,1 1293 | 78359,078360.jpg,1,1 1294 | 78385,078386.jpg,1,1 1295 | 78610,078611.jpg,1,1 1296 | 78665,078666.jpg,1,1 1297 | 78796,078797.jpg,1,1 1298 | 78975,078976.jpg,1,1 1299 | 79013,079014.jpg,1,1 1300 | 79045,079046.jpg,1,1 1301 | 79062,079063.jpg,1,1 1302 | 79100,079101.jpg,1,1 1303 | 79124,079125.jpg,1,1 1304 | 79158,079159.jpg,1,1 1305 | 79174,079175.jpg,1,1 1306 | 79202,079203.jpg,1,1 1307 | 79282,079283.jpg,1,1 1308 | 79378,079379.jpg,1,1 1309 | 79386,079387.jpg,1,1 1310 | 79492,079493.jpg,1,1 1311 | 79589,079590.jpg,1,1 1312 | 79593,079594.jpg,1,1 1313 | 79643,079644.jpg,1,1 1314 | 79785,079786.jpg,1,1 1315 | 79955,079956.jpg,1,1 1316 | 80083,080084.jpg,1,1 1317 | 80137,080138.jpg,1,1 1318 | 80151,080152.jpg,1,1 1319 | 80220,080221.jpg,1,1 1320 | 80260,080261.jpg,1,1 1321 | 80345,080346.jpg,1,1 1322 | 80356,080357.jpg,1,1 1323 | 80501,080502.jpg,1,1 1324 | 80533,080534.jpg,1,1 1325 | 80618,080619.jpg,1,1 1326 | 80632,080633.jpg,1,1 1327 | 80720,080721.jpg,1,1 1328 | 80807,080808.jpg,1,1 1329 | 80841,080842.jpg,1,1 1330 | 80907,080908.jpg,1,1 1331 | 80926,080927.jpg,1,1 1332 | 80940,080941.jpg,1,1 1333 | 80948,080949.jpg,1,1 1334 | 80953,080954.jpg,1,1 1335 | 80977,080978.jpg,1,1 1336 | 80991,080992.jpg,1,1 1337 | 81086,081087.jpg,1,1 1338 | 81144,081145.jpg,1,1 1339 | 81206,081207.jpg,1,1 1340 | 81337,081338.jpg,1,1 1341 | 81345,081346.jpg,1,1 1342 | 81373,081374.jpg,1,1 1343 | 81533,081534.jpg,1,1 1344 | 81566,081567.jpg,1,1 1345 | 81595,081596.jpg,1,1 1346 | 81672,081673.jpg,1,1 1347 | 81696,081697.jpg,1,1 1348 | 81703,081704.jpg,1,1 1349 | 81713,081714.jpg,1,1 1350 | 81826,081827.jpg,1,1 1351 | 81840,081841.jpg,1,1 1352 | 81955,081956.jpg,1,1 1353 | 81982,081983.jpg,1,1 1354 | 
82093,082094.jpg,1,1 1355 | 82155,082156.jpg,1,1 1356 | 82440,082441.jpg,1,1 1357 | 82532,082533.jpg,1,1 1358 | 82575,082576.jpg,1,1 1359 | 82649,082650.jpg,1,1 1360 | 82654,082655.jpg,1,1 1361 | 82705,082706.jpg,1,1 1362 | 82979,082980.jpg,1,1 1363 | 83035,083036.jpg,1,1 1364 | 83090,083091.jpg,1,1 1365 | 83148,083149.jpg,1,1 1366 | 83263,083264.jpg,1,1 1367 | 83267,083268.jpg,1,1 1368 | 83295,083296.jpg,1,1 1369 | 83358,083359.jpg,1,1 1370 | 83491,083492.jpg,1,1 1371 | 83524,083525.jpg,1,1 1372 | 83535,083536.jpg,1,1 1373 | 83544,083545.jpg,1,1 1374 | 83629,083630.jpg,1,1 1375 | 83854,083855.jpg,1,1 1376 | 83861,083862.jpg,1,1 1377 | 84194,084195.jpg,1,1 1378 | 84254,084255.jpg,1,1 1379 | 84269,084270.jpg,1,1 1380 | 84309,084310.jpg,1,1 1381 | 84359,084360.jpg,1,1 1382 | 84448,084449.jpg,1,1 1383 | 84460,084461.jpg,1,1 1384 | 84497,084498.jpg,1,1 1385 | 84593,084594.jpg,1,1 1386 | 84602,084603.jpg,1,1 1387 | 84758,084759.jpg,1,1 1388 | 84780,084781.jpg,1,1 1389 | 84826,084827.jpg,1,1 1390 | 84867,084868.jpg,1,1 1391 | 84868,084869.jpg,1,1 1392 | 84902,084903.jpg,1,1 1393 | 85061,085062.jpg,1,1 1394 | 85140,085141.jpg,1,1 1395 | 85149,085150.jpg,1,1 1396 | 85159,085160.jpg,1,1 1397 | 85245,085246.jpg,1,1 1398 | 85284,085285.jpg,1,1 1399 | 85377,085378.jpg,1,1 1400 | 85444,085445.jpg,1,1 1401 | 85464,085465.jpg,1,1 1402 | 85535,085536.jpg,1,1 1403 | 85602,085603.jpg,1,1 1404 | 85622,085623.jpg,1,1 1405 | 85779,085780.jpg,1,1 1406 | 85811,085812.jpg,1,1 1407 | 85877,085878.jpg,1,1 1408 | 85909,085910.jpg,1,1 1409 | 86025,086026.jpg,1,1 1410 | 86067,086068.jpg,1,1 1411 | 86173,086174.jpg,1,1 1412 | 86176,086177.jpg,1,1 1413 | 86184,086185.jpg,1,1 1414 | 86508,086509.jpg,1,1 1415 | 86567,086568.jpg,1,1 1416 | 86587,086588.jpg,1,1 1417 | 86687,086688.jpg,1,1 1418 | 86691,086692.jpg,1,1 1419 | 86755,086756.jpg,1,1 1420 | 86852,086853.jpg,1,1 1421 | 86936,086937.jpg,1,1 1422 | 87035,087036.jpg,1,1 1423 | 87164,087165.jpg,1,1 1424 | 87313,087314.jpg,1,1 1425 | 
87384,087385.jpg,1,1 1426 | 87410,087411.jpg,1,1 1427 | 87466,087467.jpg,1,1 1428 | 87481,087482.jpg,1,1 1429 | 87540,087541.jpg,1,1 1430 | 87558,087559.jpg,1,1 1431 | 87618,087619.jpg,1,1 1432 | 87663,087664.jpg,1,1 1433 | 87691,087692.jpg,1,1 1434 | 87842,087843.jpg,1,1 1435 | 87965,087966.jpg,1,1 1436 | 87983,087984.jpg,1,1 1437 | 88102,088103.jpg,1,1 1438 | 88123,088124.jpg,1,1 1439 | 88142,088143.jpg,1,1 1440 | 88174,088175.jpg,1,1 1441 | 88184,088185.jpg,1,1 1442 | 88341,088342.jpg,1,1 1443 | 88379,088380.jpg,1,1 1444 | 88396,088397.jpg,1,1 1445 | 88468,088469.jpg,1,1 1446 | 88630,088631.jpg,1,1 1447 | 88663,088664.jpg,1,1 1448 | 88724,088725.jpg,1,1 1449 | 88745,088746.jpg,1,1 1450 | 88771,088772.jpg,1,1 1451 | 88778,088779.jpg,1,1 1452 | 88797,088798.jpg,1,1 1453 | 88911,088912.jpg,1,1 1454 | 88913,088914.jpg,1,1 1455 | 88920,088921.jpg,1,1 1456 | 89046,089047.jpg,1,1 1457 | 89092,089093.jpg,1,1 1458 | 89182,089183.jpg,1,1 1459 | 89327,089328.jpg,1,1 1460 | 89437,089438.jpg,1,1 1461 | 89665,089666.jpg,1,1 1462 | 89772,089773.jpg,1,1 1463 | 89798,089799.jpg,1,1 1464 | 89853,089854.jpg,1,1 1465 | 89917,089918.jpg,1,1 1466 | 90037,090038.jpg,1,1 1467 | 90054,090055.jpg,1,1 1468 | 90159,090160.jpg,1,1 1469 | 90180,090181.jpg,1,1 1470 | 90277,090278.jpg,1,1 1471 | 90349,090350.jpg,1,1 1472 | 90350,090351.jpg,1,1 1473 | 90370,090371.jpg,1,1 1474 | 90375,090376.jpg,1,1 1475 | 90381,090382.jpg,1,1 1476 | 90502,090503.jpg,1,1 1477 | 90561,090562.jpg,1,1 1478 | 90583,090584.jpg,1,1 1479 | 90652,090653.jpg,1,1 1480 | 90830,090831.jpg,1,1 1481 | 90894,090895.jpg,1,1 1482 | 90982,090983.jpg,1,1 1483 | 91011,091012.jpg,1,1 1484 | 91022,091023.jpg,1,1 1485 | 91115,091116.jpg,1,1 1486 | 91194,091195.jpg,1,1 1487 | 91266,091267.jpg,1,1 1488 | 91279,091280.jpg,1,1 1489 | 91283,091284.jpg,1,1 1490 | 91320,091321.jpg,1,1 1491 | 91357,091358.jpg,1,1 1492 | 91366,091367.jpg,1,1 1493 | 91380,091381.jpg,1,1 1494 | 91387,091388.jpg,1,1 1495 | 91407,091408.jpg,1,1 1496 | 
91409,091410.jpg,1,1 1497 | 91423,091424.jpg,1,1 1498 | 91432,091433.jpg,1,1 1499 | 91438,091439.jpg,1,1 1500 | 91475,091476.jpg,1,1 1501 | 91502,091503.jpg,1,1 1502 | 91513,091514.jpg,1,1 1503 | 91575,091576.jpg,1,1 1504 | 91619,091620.jpg,1,1 1505 | 91649,091650.jpg,1,1 1506 | 91728,091729.jpg,1,1 1507 | 91745,091746.jpg,1,1 1508 | 91754,091755.jpg,1,1 1509 | 91755,091756.jpg,1,1 1510 | 91941,091942.jpg,1,1 1511 | 91992,091993.jpg,1,1 1512 | 92053,092054.jpg,1,1 1513 | 92090,092091.jpg,1,1 1514 | 92161,092162.jpg,1,1 1515 | 92183,092184.jpg,1,1 1516 | 92188,092189.jpg,1,1 1517 | 92264,092265.jpg,1,1 1518 | 92311,092312.jpg,1,1 1519 | 92410,092411.jpg,1,1 1520 | 92413,092414.jpg,1,1 1521 | 92527,092528.jpg,1,1 1522 | 92555,092556.jpg,1,1 1523 | 92834,092835.jpg,1,1 1524 | 92848,092849.jpg,1,1 1525 | 92893,092894.jpg,1,1 1526 | 92949,092950.jpg,1,1 1527 | 92997,092998.jpg,1,1 1528 | 93008,093009.jpg,1,1 1529 | 93024,093025.jpg,1,1 1530 | 93028,093029.jpg,1,1 1531 | 93321,093322.jpg,1,1 1532 | 93397,093398.jpg,1,1 1533 | 93425,093426.jpg,1,1 1534 | 93488,093489.jpg,1,1 1535 | 93778,093779.jpg,1,1 1536 | 93800,093801.jpg,1,1 1537 | 93866,093867.jpg,1,1 1538 | 93867,093868.jpg,1,1 1539 | 93964,093965.jpg,1,1 1540 | 94027,094028.jpg,1,1 1541 | 94306,094307.jpg,1,1 1542 | 94390,094391.jpg,1,1 1543 | 94434,094435.jpg,1,1 1544 | 94559,094560.jpg,1,1 1545 | 94587,094588.jpg,1,1 1546 | 94624,094625.jpg,1,1 1547 | 94661,094662.jpg,1,1 1548 | 94686,094687.jpg,1,1 1549 | 94775,094776.jpg,1,1 1550 | 94877,094878.jpg,1,1 1551 | 94917,094918.jpg,1,1 1552 | 95111,095112.jpg,1,1 1553 | 95199,095200.jpg,1,1 1554 | 95338,095339.jpg,1,1 1555 | 95366,095367.jpg,1,1 1556 | 95483,095484.jpg,1,1 1557 | 95502,095503.jpg,1,1 1558 | 95544,095545.jpg,1,1 1559 | 95562,095563.jpg,1,1 1560 | 95700,095701.jpg,1,1 1561 | 95775,095776.jpg,1,1 1562 | 95901,095902.jpg,1,1 1563 | 95944,095945.jpg,1,1 1564 | 95985,095986.jpg,1,1 1565 | 95986,095987.jpg,1,1 1566 | 95994,095995.jpg,1,1 1567 | 
96053,096054.jpg,1,1 1568 | 96118,096119.jpg,1,1 1569 | 96232,096233.jpg,1,1 1570 | 96241,096242.jpg,1,1 1571 | 96261,096262.jpg,1,1 1572 | 96282,096283.jpg,1,1 1573 | 96411,096412.jpg,1,1 1574 | 96419,096420.jpg,1,1 1575 | 96525,096526.jpg,1,1 1576 | 96544,096545.jpg,1,1 1577 | 96584,096585.jpg,1,1 1578 | 96632,096633.jpg,1,1 1579 | 96635,096636.jpg,1,1 1580 | 96658,096659.jpg,1,1 1581 | 96742,096743.jpg,1,1 1582 | 96847,096848.jpg,1,1 1583 | 96884,096885.jpg,1,1 1584 | 96997,096998.jpg,1,1 1585 | 97066,097067.jpg,1,1 1586 | 97263,097264.jpg,1,1 1587 | 97265,097266.jpg,1,1 1588 | 97317,097318.jpg,1,1 1589 | 97330,097331.jpg,1,1 1590 | 97354,097355.jpg,1,1 1591 | 97584,097585.jpg,1,1 1592 | 97700,097701.jpg,1,1 1593 | 97737,097738.jpg,1,1 1594 | 97768,097769.jpg,1,1 1595 | 97772,097773.jpg,1,1 1596 | 97873,097874.jpg,1,1 1597 | 97883,097884.jpg,1,1 1598 | 97908,097909.jpg,1,1 1599 | 98095,098096.jpg,1,1 1600 | 98173,098174.jpg,1,1 1601 | 98183,098184.jpg,1,1 1602 | 98191,098192.jpg,1,1 1603 | 98216,098217.jpg,1,1 1604 | 98277,098278.jpg,1,1 1605 | 98504,098505.jpg,1,1 1606 | 98510,098511.jpg,1,1 1607 | 98551,098552.jpg,1,1 1608 | 98557,098558.jpg,1,1 1609 | 98614,098615.jpg,1,1 1610 | 98698,098699.jpg,1,1 1611 | 98743,098744.jpg,1,1 1612 | 98793,098794.jpg,1,1 1613 | 98938,098939.jpg,1,1 1614 | 98940,098941.jpg,1,1 1615 | 99043,099044.jpg,1,1 1616 | 99194,099195.jpg,1,1 1617 | 99199,099200.jpg,1,1 1618 | 99235,099236.jpg,1,1 1619 | 99267,099268.jpg,1,1 1620 | 99406,099407.jpg,1,1 1621 | 99419,099420.jpg,1,1 1622 | 99505,099506.jpg,1,1 1623 | 99517,099518.jpg,1,1 1624 | 99548,099549.jpg,1,1 1625 | 99557,099558.jpg,1,1 1626 | 99589,099590.jpg,1,1 1627 | 99601,099602.jpg,1,1 1628 | 99625,099626.jpg,1,1 1629 | 99653,099654.jpg,1,1 1630 | 99657,099658.jpg,1,1 1631 | 99669,099670.jpg,1,1 1632 | 99673,099674.jpg,1,1 1633 | 99759,099760.jpg,1,1 1634 | 99837,099838.jpg,1,1 1635 | 99913,099914.jpg,1,1 1636 | 99949,099950.jpg,1,1 1637 | 100077,100078.jpg,1,1 1638 | 
100083,100084.jpg,1,1 1639 | 100307,100308.jpg,1,1 1640 | 100331,100332.jpg,1,1 1641 | 100342,100343.jpg,1,1 1642 | 100374,100375.jpg,1,1 1643 | 100379,100380.jpg,1,1 1644 | 100439,100440.jpg,1,1 1645 | 100708,100709.jpg,1,1 1646 | 100769,100770.jpg,1,1 1647 | 100838,100839.jpg,1,1 1648 | 100930,100931.jpg,1,1 1649 | 101039,101040.jpg,1,1 1650 | 101061,101062.jpg,1,1 1651 | 101076,101077.jpg,1,1 1652 | 101085,101086.jpg,1,1 1653 | 101092,101093.jpg,1,1 1654 | 101104,101105.jpg,1,1 1655 | 101240,101241.jpg,1,1 1656 | 101381,101382.jpg,1,1 1657 | 101412,101413.jpg,1,1 1658 | 101474,101475.jpg,1,1 1659 | 101600,101601.jpg,1,1 1660 | 101617,101618.jpg,1,1 1661 | 101634,101635.jpg,1,1 1662 | 101688,101689.jpg,1,1 1663 | 101722,101723.jpg,1,1 1664 | 101835,101836.jpg,1,1 1665 | 101851,101852.jpg,1,1 1666 | 101946,101947.jpg,1,1 1667 | 102012,102013.jpg,1,1 1668 | 102113,102114.jpg,1,1 1669 | 102126,102127.jpg,1,1 1670 | 102139,102140.jpg,1,1 1671 | 102212,102213.jpg,1,1 1672 | 102336,102337.jpg,1,1 1673 | 102479,102480.jpg,1,1 1674 | 102482,102483.jpg,1,1 1675 | 102501,102502.jpg,1,1 1676 | 102591,102592.jpg,1,1 1677 | 102714,102715.jpg,1,1 1678 | 102737,102738.jpg,1,1 1679 | 102814,102815.jpg,1,1 1680 | 103138,103139.jpg,1,1 1681 | 103192,103193.jpg,1,1 1682 | 103246,103247.jpg,1,1 1683 | 103281,103282.jpg,1,1 1684 | 103503,103504.jpg,1,1 1685 | 103527,103528.jpg,1,1 1686 | 103570,103571.jpg,1,1 1687 | 103661,103662.jpg,1,1 1688 | 103670,103671.jpg,1,1 1689 | 103687,103688.jpg,1,1 1690 | 103694,103695.jpg,1,1 1691 | 103777,103778.jpg,1,1 1692 | 103782,103783.jpg,1,1 1693 | 103811,103812.jpg,1,1 1694 | 103850,103851.jpg,1,1 1695 | 103858,103859.jpg,1,1 1696 | 103880,103881.jpg,1,1 1697 | 103976,103977.jpg,1,1 1698 | 104020,104021.jpg,1,1 1699 | 104024,104025.jpg,1,1 1700 | 104027,104028.jpg,1,1 1701 | 104088,104089.jpg,1,1 1702 | 104107,104108.jpg,1,1 1703 | 104188,104189.jpg,1,1 1704 | 104197,104198.jpg,1,1 1705 | 104199,104200.jpg,1,1 1706 | 104313,104314.jpg,1,1 1707 
| 104417,104418.jpg,1,1 1708 | 104446,104447.jpg,1,1 1709 | 104511,104512.jpg,1,1 1710 | 104565,104566.jpg,1,1 1711 | 104744,104745.jpg,1,1 1712 | 104748,104749.jpg,1,1 1713 | 104943,104944.jpg,1,1 1714 | 104971,104972.jpg,1,1 1715 | 105018,105019.jpg,1,1 1716 | 105058,105059.jpg,1,1 1717 | 105098,105099.jpg,1,1 1718 | 105119,105120.jpg,1,1 1719 | 105185,105186.jpg,1,1 1720 | 105244,105245.jpg,1,1 1721 | 105289,105290.jpg,1,1 1722 | 105343,105344.jpg,1,1 1723 | 105366,105367.jpg,1,1 1724 | 105373,105374.jpg,1,1 1725 | 105378,105379.jpg,1,1 1726 | 105394,105395.jpg,1,1 1727 | 105408,105409.jpg,1,1 1728 | 105483,105484.jpg,1,1 1729 | 105677,105678.jpg,1,1 1730 | 105698,105699.jpg,1,1 1731 | 105748,105749.jpg,1,1 1732 | 105784,105785.jpg,1,1 1733 | 105801,105802.jpg,1,1 1734 | 105957,105958.jpg,1,1 1735 | 106063,106064.jpg,1,1 1736 | 106372,106373.jpg,1,1 1737 | 106493,106494.jpg,1,1 1738 | 106594,106595.jpg,1,1 1739 | 106642,106643.jpg,1,1 1740 | 106720,106721.jpg,1,1 1741 | 106798,106799.jpg,1,1 1742 | 106842,106843.jpg,1,1 1743 | 106877,106878.jpg,1,1 1744 | 106975,106976.jpg,1,1 1745 | 107001,107002.jpg,1,1 1746 | 107078,107079.jpg,1,1 1747 | 107127,107128.jpg,1,1 1748 | 107158,107159.jpg,1,1 1749 | 107293,107294.jpg,1,1 1750 | 107339,107340.jpg,1,1 1751 | 107364,107365.jpg,1,1 1752 | 107611,107612.jpg,1,1 1753 | 107646,107647.jpg,1,1 1754 | 107927,107928.jpg,1,1 1755 | 108016,108017.jpg,1,1 1756 | 108093,108094.jpg,1,1 1757 | 108531,108532.jpg,1,1 1758 | 108698,108699.jpg,1,1 1759 | 108717,108718.jpg,1,1 1760 | 108839,108840.jpg,1,1 1761 | 109094,109095.jpg,1,1 1762 | 109227,109228.jpg,1,1 1763 | 109416,109417.jpg,1,1 1764 | 109429,109430.jpg,1,1 1765 | 109439,109440.jpg,1,1 1766 | 109483,109484.jpg,1,1 1767 | 109757,109758.jpg,1,1 1768 | 109759,109760.jpg,1,1 1769 | 109843,109844.jpg,1,1 1770 | 109862,109863.jpg,1,1 1771 | 109910,109911.jpg,1,1 1772 | 109948,109949.jpg,1,1 1773 | 109961,109962.jpg,1,1 1774 | 110028,110029.jpg,1,1 1775 | 110344,110345.jpg,1,1 
1776 | 110396,110397.jpg,1,1 1777 | 110550,110551.jpg,1,1 1778 | 110628,110629.jpg,1,1 1779 | 110649,110650.jpg,1,1 1780 | 110755,110756.jpg,1,1 1781 | 110823,110824.jpg,1,1 1782 | 110877,110878.jpg,1,1 1783 | 110898,110899.jpg,1,1 1784 | 110986,110987.jpg,1,1 1785 | 111012,111013.jpg,1,1 1786 | 111015,111016.jpg,1,1 1787 | 111076,111077.jpg,1,1 1788 | 111111,111112.jpg,1,1 1789 | 111115,111116.jpg,1,1 1790 | 111147,111148.jpg,1,1 1791 | 111151,111152.jpg,1,1 1792 | 111215,111216.jpg,1,1 1793 | 111319,111320.jpg,1,1 1794 | 111419,111420.jpg,1,1 1795 | 111425,111426.jpg,1,1 1796 | 111436,111437.jpg,1,1 1797 | 111505,111506.jpg,1,1 1798 | 111597,111598.jpg,1,1 1799 | 111649,111650.jpg,1,1 1800 | 111674,111675.jpg,1,1 1801 | 111711,111712.jpg,1,1 1802 | 111718,111719.jpg,1,1 1803 | 111765,111766.jpg,1,1 1804 | 111906,111907.jpg,1,1 1805 | 111911,111912.jpg,1,1 1806 | 111929,111930.jpg,1,1 1807 | 111942,111943.jpg,1,1 1808 | 111981,111982.jpg,1,1 1809 | 111991,111992.jpg,1,1 1810 | 112050,112051.jpg,1,1 1811 | 112076,112077.jpg,1,1 1812 | 112153,112154.jpg,1,1 1813 | 112186,112187.jpg,1,1 1814 | 112238,112239.jpg,1,1 1815 | 112304,112305.jpg,1,1 1816 | 112308,112309.jpg,1,1 1817 | 112342,112343.jpg,1,1 1818 | 112343,112344.jpg,1,1 1819 | 112365,112366.jpg,1,1 1820 | 112392,112393.jpg,1,1 1821 | 112510,112511.jpg,1,1 1822 | 112549,112550.jpg,1,1 1823 | 112596,112597.jpg,1,1 1824 | 112855,112856.jpg,1,1 1825 | 113018,113019.jpg,1,1 1826 | 113122,113123.jpg,1,1 1827 | 113607,113608.jpg,1,1 1828 | 113633,113634.jpg,1,1 1829 | 113796,113797.jpg,1,1 1830 | 113909,113910.jpg,1,1 1831 | 113969,113970.jpg,1,1 1832 | 114009,114010.jpg,1,1 1833 | 114038,114039.jpg,1,1 1834 | 114044,114045.jpg,1,1 1835 | 114080,114081.jpg,1,1 1836 | 114094,114095.jpg,1,1 1837 | 114128,114129.jpg,1,1 1838 | 114142,114143.jpg,1,1 1839 | 114148,114149.jpg,1,1 1840 | 114171,114172.jpg,1,1 1841 | 114175,114176.jpg,1,1 1842 | 114238,114239.jpg,1,1 1843 | 114254,114255.jpg,1,1 1844 | 
114260,114261.jpg,1,1 1845 | 114265,114266.jpg,1,1 1846 | 114289,114290.jpg,1,1 1847 | 114306,114307.jpg,1,1 1848 | 114539,114540.jpg,1,1 1849 | 114585,114586.jpg,1,1 1850 | 114598,114599.jpg,1,1 1851 | 114667,114668.jpg,1,1 1852 | 114700,114701.jpg,1,1 1853 | 114758,114759.jpg,1,1 1854 | 114830,114831.jpg,1,1 1855 | 115054,115055.jpg,1,1 1856 | 115078,115079.jpg,1,1 1857 | 115187,115188.jpg,1,1 1858 | 115266,115267.jpg,1,1 1859 | 115437,115438.jpg,1,1 1860 | 115456,115457.jpg,1,1 1861 | 115474,115475.jpg,1,1 1862 | 115496,115497.jpg,1,1 1863 | 115581,115582.jpg,1,1 1864 | 115605,115606.jpg,1,1 1865 | 115662,115663.jpg,1,1 1866 | 115685,115686.jpg,1,1 1867 | 115699,115700.jpg,1,1 1868 | 115702,115703.jpg,1,1 1869 | 115710,115711.jpg,1,1 1870 | 115716,115717.jpg,1,1 1871 | 115765,115766.jpg,1,1 1872 | 115811,115812.jpg,1,1 1873 | 115831,115832.jpg,1,1 1874 | 115864,115865.jpg,1,1 1875 | 115892,115893.jpg,1,1 1876 | 115984,115985.jpg,1,1 1877 | 115991,115992.jpg,1,1 1878 | 116231,116232.jpg,1,1 1879 | 116265,116266.jpg,1,1 1880 | 116573,116574.jpg,1,1 1881 | 116640,116641.jpg,1,1 1882 | 116656,116657.jpg,1,1 1883 | 116731,116732.jpg,1,1 1884 | 116733,116734.jpg,1,1 1885 | 116749,116750.jpg,1,1 1886 | 116954,116955.jpg,1,1 1887 | 117112,117113.jpg,1,1 1888 | 117142,117143.jpg,1,1 1889 | 117271,117272.jpg,1,1 1890 | 117277,117278.jpg,1,1 1891 | 117324,117325.jpg,1,1 1892 | 117392,117393.jpg,1,1 1893 | 117414,117415.jpg,1,1 1894 | 117493,117494.jpg,1,1 1895 | 117670,117671.jpg,1,1 1896 | 117696,117697.jpg,1,1 1897 | 117793,117794.jpg,1,1 1898 | 117798,117799.jpg,1,1 1899 | 117817,117818.jpg,1,1 1900 | 117884,117885.jpg,1,1 1901 | 117890,117891.jpg,1,1 1902 | 117911,117912.jpg,1,1 1903 | 117943,117944.jpg,1,1 1904 | 118116,118117.jpg,1,1 1905 | 118122,118123.jpg,1,1 1906 | 118224,118225.jpg,1,1 1907 | 118481,118482.jpg,1,1 1908 | 118499,118500.jpg,1,1 1909 | 118614,118615.jpg,1,1 1910 | 118752,118753.jpg,1,1 1911 | 118823,118824.jpg,1,1 1912 | 118853,118854.jpg,1,1 1913 
| 118873,118874.jpg,1,1 1914 | 118890,118891.jpg,1,1 1915 | 118961,118962.jpg,1,1 1916 | 119009,119010.jpg,1,1 1917 | 119048,119049.jpg,1,1 1918 | 119166,119167.jpg,1,1 1919 | 119170,119171.jpg,1,1 1920 | 119266,119267.jpg,1,1 1921 | 119267,119268.jpg,1,1 1922 | 119291,119292.jpg,1,1 1923 | 119305,119306.jpg,1,1 1924 | 119342,119343.jpg,1,1 1925 | 119358,119359.jpg,1,1 1926 | 119381,119382.jpg,1,1 1927 | 119399,119400.jpg,1,1 1928 | 119559,119560.jpg,1,1 1929 | 119567,119568.jpg,1,1 1930 | 119589,119590.jpg,1,1 1931 | 119590,119591.jpg,1,1 1932 | 119614,119615.jpg,1,1 1933 | 119641,119642.jpg,1,1 1934 | 119773,119774.jpg,1,1 1935 | 119779,119780.jpg,1,1 1936 | 119827,119828.jpg,1,1 1937 | 119873,119874.jpg,1,1 1938 | 119883,119884.jpg,1,1 1939 | 119981,119982.jpg,1,1 1940 | 119988,119989.jpg,1,1 1941 | 120000,120001.jpg,1,1 1942 | 120007,120008.jpg,1,1 1943 | 120039,120040.jpg,1,1 1944 | 120197,120198.jpg,1,1 1945 | 120421,120422.jpg,1,1 1946 | 120432,120433.jpg,1,1 1947 | 120459,120460.jpg,1,1 1948 | 120487,120488.jpg,1,1 1949 | 120511,120512.jpg,1,1 1950 | 120517,120518.jpg,1,1 1951 | 120551,120552.jpg,1,1 1952 | 120593,120594.jpg,1,1 1953 | 120675,120676.jpg,1,1 1954 | 120735,120736.jpg,1,1 1955 | 120863,120864.jpg,1,1 1956 | 120895,120896.jpg,1,1 1957 | 120915,120916.jpg,1,1 1958 | 120940,120941.jpg,1,1 1959 | 121031,121032.jpg,1,1 1960 | 121041,121042.jpg,1,1 1961 | 121154,121155.jpg,1,1 1962 | 121421,121422.jpg,1,1 1963 | 121437,121438.jpg,1,1 1964 | 121506,121507.jpg,1,1 1965 | 121563,121564.jpg,1,1 1966 | 121597,121598.jpg,1,1 1967 | 121598,121599.jpg,1,1 1968 | 121623,121624.jpg,1,1 1969 | 121632,121633.jpg,1,1 1970 | 121658,121659.jpg,1,1 1971 | 121828,121829.jpg,1,1 1972 | 121854,121855.jpg,1,1 1973 | 121922,121923.jpg,1,1 1974 | 121932,121933.jpg,1,1 1975 | 121949,121950.jpg,1,1 1976 | 122023,122024.jpg,1,1 1977 | 122039,122040.jpg,1,1 1978 | 122074,122075.jpg,1,1 1979 | 122146,122147.jpg,1,1 1980 | 122275,122276.jpg,1,1 1981 | 122329,122330.jpg,1,1 
1982 | 122422,122423.jpg,1,1 1983 | 122487,122488.jpg,1,1 1984 | 122501,122502.jpg,1,1 1985 | 122647,122648.jpg,1,1 1986 | 122676,122677.jpg,1,1 1987 | 122719,122720.jpg,1,1 1988 | 122782,122783.jpg,1,1 1989 | 122796,122797.jpg,1,1 1990 | 122907,122908.jpg,1,1 1991 | 122951,122952.jpg,1,1 1992 | 122997,122998.jpg,1,1 1993 | 123057,123058.jpg,1,1 1994 | 123091,123092.jpg,1,1 1995 | 123188,123189.jpg,1,1 1996 | 123223,123224.jpg,1,1 1997 | 123270,123271.jpg,1,1 1998 | 123563,123564.jpg,1,1 1999 | 123575,123576.jpg,1,1 2000 | 123579,123580.jpg,1,1 2001 | 123592,123593.jpg,1,1 2002 | 123698,123699.jpg,1,1 2003 | 123778,123779.jpg,1,1 2004 | 123861,123862.jpg,1,1 2005 | 123876,123877.jpg,1,1 2006 | 124072,124073.jpg,1,1 2007 | 124077,124078.jpg,1,1 2008 | 124103,124104.jpg,1,1 2009 | 124106,124107.jpg,1,1 2010 | 124116,124117.jpg,1,1 2011 | 124187,124188.jpg,1,1 2012 | 124268,124269.jpg,1,1 2013 | 124269,124270.jpg,1,1 2014 | 124307,124308.jpg,1,1 2015 | 124342,124343.jpg,1,1 2016 | 124350,124351.jpg,1,1 2017 | 124399,124400.jpg,1,1 2018 | 124411,124412.jpg,1,1 2019 | 124448,124449.jpg,1,1 2020 | 124482,124483.jpg,1,1 2021 | 124499,124500.jpg,1,1 2022 | 124593,124594.jpg,1,1 2023 | 124679,124680.jpg,1,1 2024 | 124724,124725.jpg,1,1 2025 | 124737,124738.jpg,1,1 2026 | 124806,124807.jpg,1,1 2027 | 124823,124824.jpg,1,1 2028 | 124826,124827.jpg,1,1 2029 | 124884,124885.jpg,1,1 2030 | 124969,124970.jpg,1,1 2031 | 125000,125001.jpg,1,1 2032 | 125008,125009.jpg,1,1 2033 | 125164,125165.jpg,1,1 2034 | 125272,125273.jpg,1,1 2035 | 125282,125283.jpg,1,1 2036 | 125284,125285.jpg,1,1 2037 | 125330,125331.jpg,1,1 2038 | 125473,125474.jpg,1,1 2039 | 125485,125486.jpg,1,1 2040 | 125516,125517.jpg,1,1 2041 | 125530,125531.jpg,1,1 2042 | 125652,125653.jpg,1,1 2043 | 125719,125720.jpg,1,1 2044 | 125872,125873.jpg,1,1 2045 | 125932,125933.jpg,1,1 2046 | 125934,125935.jpg,1,1 2047 | 126074,126075.jpg,1,1 2048 | 126100,126101.jpg,1,1 2049 | 126252,126253.jpg,1,1 2050 | 
126286,126287.jpg,1,1 2051 | 126307,126308.jpg,1,1 2052 | 126309,126310.jpg,1,1 2053 | 126325,126326.jpg,1,1 2054 | 126395,126396.jpg,1,1 2055 | 126438,126439.jpg,1,1 2056 | 126446,126447.jpg,1,1 2057 | 126510,126511.jpg,1,1 2058 | 126523,126524.jpg,1,1 2059 | 126527,126528.jpg,1,1 2060 | 126608,126609.jpg,1,1 2061 | 126682,126683.jpg,1,1 2062 | 126702,126703.jpg,1,1 2063 | 126809,126810.jpg,1,1 2064 | 126818,126819.jpg,1,1 2065 | 126839,126840.jpg,1,1 2066 | 126928,126929.jpg,1,1 2067 | 127067,127068.jpg,1,1 2068 | 127093,127094.jpg,1,1 2069 | 127122,127123.jpg,1,1 2070 | 127217,127218.jpg,1,1 2071 | 127243,127244.jpg,1,1 2072 | 127259,127260.jpg,1,1 2073 | 127380,127381.jpg,1,1 2074 | 127399,127400.jpg,1,1 2075 | 127452,127453.jpg,1,1 2076 | 127635,127636.jpg,1,1 2077 | 127696,127697.jpg,1,1 2078 | 127756,127757.jpg,1,1 2079 | 127826,127827.jpg,1,1 2080 | 127902,127903.jpg,1,1 2081 | 127994,127995.jpg,1,1 2082 | 128037,128038.jpg,1,1 2083 | 128046,128047.jpg,1,1 2084 | 128096,128097.jpg,1,1 2085 | 128100,128101.jpg,1,1 2086 | 128136,128137.jpg,1,1 2087 | 128167,128168.jpg,1,1 2088 | 128202,128203.jpg,1,1 2089 | 128231,128232.jpg,1,1 2090 | 128259,128260.jpg,1,1 2091 | 128315,128316.jpg,1,1 2092 | 128330,128331.jpg,1,1 2093 | 128377,128378.jpg,1,1 2094 | 128434,128435.jpg,1,1 2095 | 128478,128479.jpg,1,1 2096 | 128556,128557.jpg,1,1 2097 | 128623,128624.jpg,1,1 2098 | 128647,128648.jpg,1,1 2099 | 128672,128673.jpg,1,1 2100 | 128713,128714.jpg,1,1 2101 | 128735,128736.jpg,1,1 2102 | 128856,128857.jpg,1,1 2103 | 128877,128878.jpg,1,1 2104 | 128968,128969.jpg,1,1 2105 | 129010,129011.jpg,1,1 2106 | 129074,129075.jpg,1,1 2107 | 129078,129079.jpg,1,1 2108 | 129085,129086.jpg,1,1 2109 | 129137,129138.jpg,1,1 2110 | 129188,129189.jpg,1,1 2111 | 129228,129229.jpg,1,1 2112 | 129238,129239.jpg,1,1 2113 | 129293,129294.jpg,1,1 2114 | 129312,129313.jpg,1,1 2115 | 129365,129366.jpg,1,1 2116 | 129422,129423.jpg,1,1 2117 | 129697,129698.jpg,1,1 2118 | 129728,129729.jpg,1,1 2119 
| 129813,129814.jpg,1,1 2120 | 129907,129908.jpg,1,1 2121 | 129944,129945.jpg,1,1 2122 | 129948,129949.jpg,1,1 2123 | 130001,130002.jpg,1,1 2124 | 130049,130050.jpg,1,1 2125 | 130065,130066.jpg,1,1 2126 | 130077,130078.jpg,1,1 2127 | 130206,130207.jpg,1,1 2128 | 130271,130272.jpg,1,1 2129 | 130328,130329.jpg,1,1 2130 | 130358,130359.jpg,1,1 2131 | 130439,130440.jpg,1,1 2132 | 130461,130462.jpg,1,1 2133 | 130467,130468.jpg,1,1 2134 | 130538,130539.jpg,1,1 2135 | 130548,130549.jpg,1,1 2136 | 130553,130554.jpg,1,1 2137 | 130580,130581.jpg,1,1 2138 | 130581,130582.jpg,1,1 2139 | 130728,130729.jpg,1,1 2140 | 130741,130742.jpg,1,1 2141 | 130759,130760.jpg,1,1 2142 | 130784,130785.jpg,1,1 2143 | 130823,130824.jpg,1,1 2144 | 130847,130848.jpg,1,1 2145 | 130896,130897.jpg,1,1 2146 | 130954,130955.jpg,1,1 2147 | 131027,131028.jpg,1,1 2148 | 131111,131112.jpg,1,1 2149 | 131154,131155.jpg,1,1 2150 | 131221,131222.jpg,1,1 2151 | 131226,131227.jpg,1,1 2152 | 131231,131232.jpg,1,1 2153 | 131248,131249.jpg,1,1 2154 | 131346,131347.jpg,1,1 2155 | 131348,131349.jpg,1,1 2156 | 131374,131375.jpg,1,1 2157 | 131449,131450.jpg,1,1 2158 | 131456,131457.jpg,1,1 2159 | 131462,131463.jpg,1,1 2160 | 131463,131464.jpg,1,1 2161 | 131475,131476.jpg,1,1 2162 | 131491,131492.jpg,1,1 2163 | 131501,131502.jpg,1,1 2164 | 131656,131657.jpg,1,1 2165 | 131683,131684.jpg,1,1 2166 | 131704,131705.jpg,1,1 2167 | 131822,131823.jpg,1,1 2168 | 131944,131945.jpg,1,1 2169 | 131957,131958.jpg,1,1 2170 | 131960,131961.jpg,1,1 2171 | 131973,131974.jpg,1,1 2172 | 132007,132008.jpg,1,1 2173 | 132041,132042.jpg,1,1 2174 | 132049,132050.jpg,1,1 2175 | 132211,132212.jpg,1,1 2176 | 132272,132273.jpg,1,1 2177 | 132289,132290.jpg,1,1 2178 | 132331,132332.jpg,1,1 2179 | 132420,132421.jpg,1,1 2180 | 132467,132468.jpg,1,1 2181 | 132564,132565.jpg,1,1 2182 | 132773,132774.jpg,1,1 2183 | 132806,132807.jpg,1,1 2184 | 132809,132810.jpg,1,1 2185 | 132814,132815.jpg,1,1 2186 | 132847,132848.jpg,1,1 2187 | 133089,133090.jpg,1,1 
2188 | 133424,133425.jpg,1,1 2189 | 133428,133429.jpg,1,1 2190 | 133446,133447.jpg,1,1 2191 | 133457,133458.jpg,1,1 2192 | 133474,133475.jpg,1,1 2193 | 133512,133513.jpg,1,1 2194 | 133666,133667.jpg,1,1 2195 | 133688,133689.jpg,1,1 2196 | 133897,133898.jpg,1,1 2197 | 133912,133913.jpg,1,1 2198 | 133919,133920.jpg,1,1 2199 | 133987,133988.jpg,1,1 2200 | 134183,134184.jpg,1,1 2201 | 134239,134240.jpg,1,1 2202 | 134252,134253.jpg,1,1 2203 | 134321,134322.jpg,1,1 2204 | 134332,134333.jpg,1,1 2205 | 134412,134413.jpg,1,1 2206 | 134422,134423.jpg,1,1 2207 | 134490,134491.jpg,1,1 2208 | 134569,134570.jpg,1,1 2209 | 134652,134653.jpg,1,1 2210 | 134655,134656.jpg,1,1 2211 | 134701,134702.jpg,1,1 2212 | 134716,134717.jpg,1,1 2213 | 134755,134756.jpg,1,1 2214 | 134759,134760.jpg,1,1 2215 | 134876,134877.jpg,1,1 2216 | 134912,134913.jpg,1,1 2217 | 135015,135016.jpg,1,1 2218 | 135056,135057.jpg,1,1 2219 | 135096,135097.jpg,1,1 2220 | 135112,135113.jpg,1,1 2221 | 135131,135132.jpg,1,1 2222 | 135184,135185.jpg,1,1 2223 | 135285,135286.jpg,1,1 2224 | 135320,135321.jpg,1,1 2225 | 135324,135325.jpg,1,1 2226 | 135372,135373.jpg,1,1 2227 | 135422,135423.jpg,1,1 2228 | 135436,135437.jpg,1,1 2229 | 135449,135450.jpg,1,1 2230 | 135466,135467.jpg,1,1 2231 | 135529,135530.jpg,1,1 2232 | 135597,135598.jpg,1,1 2233 | 135640,135641.jpg,1,1 2234 | 135677,135678.jpg,1,1 2235 | 135679,135680.jpg,1,1 2236 | 135729,135730.jpg,1,1 2237 | 135912,135913.jpg,1,1 2238 | 135925,135926.jpg,1,1 2239 | 135951,135952.jpg,1,1 2240 | 135971,135972.jpg,1,1 2241 | 136011,136012.jpg,1,1 2242 | 136074,136075.jpg,1,1 2243 | 136143,136144.jpg,1,1 2244 | 136216,136217.jpg,1,1 2245 | 136235,136236.jpg,1,1 2246 | 136360,136361.jpg,1,1 2247 | 136394,136395.jpg,1,1 2248 | 136422,136423.jpg,1,1 2249 | 136479,136480.jpg,1,1 2250 | 136496,136497.jpg,1,1 2251 | 136516,136517.jpg,1,1 2252 | 136585,136586.jpg,1,1 2253 | 136596,136597.jpg,1,1 2254 | 136597,136598.jpg,1,1 2255 | 136659,136660.jpg,1,1 2256 | 
136681,136682.jpg,1,1 2257 | 136693,136694.jpg,1,1 2258 | 136712,136713.jpg,1,1 2259 | 136715,136716.jpg,1,1 2260 | 136763,136764.jpg,1,1 2261 | 136831,136832.jpg,1,1 2262 | 136857,136858.jpg,1,1 2263 | 136883,136884.jpg,1,1 2264 | 136948,136949.jpg,1,1 2265 | 137088,137089.jpg,1,1 2266 | 137095,137096.jpg,1,1 2267 | 137188,137189.jpg,1,1 2268 | 137270,137271.jpg,1,1 2269 | 137370,137371.jpg,1,1 2270 | 137455,137456.jpg,1,1 2271 | 137491,137492.jpg,1,1 2272 | 137643,137644.jpg,1,1 2273 | 137714,137715.jpg,1,1 2274 | 137962,137963.jpg,1,1 2275 | 137976,137977.jpg,1,1 2276 | 138041,138042.jpg,1,1 2277 | 138132,138133.jpg,1,1 2278 | 138299,138300.jpg,1,1 2279 | 138322,138323.jpg,1,1 2280 | 138373,138374.jpg,1,1 2281 | 138393,138394.jpg,1,1 2282 | 138452,138453.jpg,1,1 2283 | 138471,138472.jpg,1,1 2284 | 138547,138548.jpg,1,1 2285 | 138654,138655.jpg,1,1 2286 | 138682,138683.jpg,1,1 2287 | 138685,138686.jpg,1,1 2288 | 138750,138751.jpg,1,1 2289 | 138802,138803.jpg,1,1 2290 | 138905,138906.jpg,1,1 2291 | 138955,138956.jpg,1,1 2292 | 138958,138959.jpg,1,1 2293 | 139038,139039.jpg,1,1 2294 | 139103,139104.jpg,1,1 2295 | 139135,139136.jpg,1,1 2296 | 139183,139184.jpg,1,1 2297 | 139230,139231.jpg,1,1 2298 | 139382,139383.jpg,1,1 2299 | 139390,139391.jpg,1,1 2300 | 139423,139424.jpg,1,1 2301 | 139427,139428.jpg,1,1 2302 | 139467,139468.jpg,1,1 2303 | 139489,139490.jpg,1,1 2304 | 139535,139536.jpg,1,1 2305 | 139643,139644.jpg,1,1 2306 | 139656,139657.jpg,1,1 2307 | 139792,139793.jpg,1,1 2308 | 139933,139934.jpg,1,1 2309 | 140020,140021.jpg,1,1 2310 | 140043,140044.jpg,1,1 2311 | 140077,140078.jpg,1,1 2312 | 140123,140124.jpg,1,1 2313 | 140182,140183.jpg,1,1 2314 | 140262,140263.jpg,1,1 2315 | 140266,140267.jpg,1,1 2316 | 140343,140344.jpg,1,1 2317 | 140349,140350.jpg,1,1 2318 | 140375,140376.jpg,1,1 2319 | 140381,140382.jpg,1,1 2320 | 140428,140429.jpg,1,1 2321 | 140464,140465.jpg,1,1 2322 | 140525,140526.jpg,1,1 2323 | 140534,140535.jpg,1,1 2324 | 140567,140568.jpg,1,1 2325 
| 140596,140597.jpg,1,1 2326 | 140645,140646.jpg,1,1 2327 | 140703,140704.jpg,1,1 2328 | 140790,140791.jpg,1,1 2329 | 140989,140990.jpg,1,1 2330 | 140992,140993.jpg,1,1 2331 | 141018,141019.jpg,1,1 2332 | 141034,141035.jpg,1,1 2333 | 141173,141174.jpg,1,1 2334 | 141209,141210.jpg,1,1 2335 | 141230,141231.jpg,1,1 2336 | 141233,141234.jpg,1,1 2337 | 141301,141302.jpg,1,1 2338 | 141352,141353.jpg,1,1 2339 | 141368,141369.jpg,1,1 2340 | 141530,141531.jpg,1,1 2341 | 141542,141543.jpg,1,1 2342 | 141546,141547.jpg,1,1 2343 | 141582,141583.jpg,1,1 2344 | 141644,141645.jpg,1,1 2345 | 141724,141725.jpg,1,1 2346 | 141803,141804.jpg,1,1 2347 | 141825,141826.jpg,1,1 2348 | 141842,141843.jpg,1,1 2349 | 141869,141870.jpg,1,1 2350 | 142062,142063.jpg,1,1 2351 | 142063,142064.jpg,1,1 2352 | 142161,142162.jpg,1,1 2353 | 142199,142200.jpg,1,1 2354 | 142317,142318.jpg,1,1 2355 | 142335,142336.jpg,1,1 2356 | 142384,142385.jpg,1,1 2357 | 142415,142416.jpg,1,1 2358 | 142586,142587.jpg,1,1 2359 | 142591,142592.jpg,1,1 2360 | 142613,142614.jpg,1,1 2361 | 142654,142655.jpg,1,1 2362 | 142697,142698.jpg,1,1 2363 | 142700,142701.jpg,1,1 2364 | 142721,142722.jpg,1,1 2365 | 142734,142735.jpg,1,1 2366 | 142741,142742.jpg,1,1 2367 | 142794,142795.jpg,1,1 2368 | 142821,142822.jpg,1,1 2369 | 142892,142893.jpg,1,1 2370 | 142966,142967.jpg,1,1 2371 | 143011,143012.jpg,1,1 2372 | 143032,143033.jpg,1,1 2373 | 143086,143087.jpg,1,1 2374 | 143126,143127.jpg,1,1 2375 | 143197,143198.jpg,1,1 2376 | 143275,143276.jpg,1,1 2377 | 143522,143523.jpg,1,1 2378 | 143537,143538.jpg,1,1 2379 | 143563,143564.jpg,1,1 2380 | 143578,143579.jpg,1,1 2381 | 143728,143729.jpg,1,1 2382 | 143790,143791.jpg,1,1 2383 | 143916,143917.jpg,1,1 2384 | 144048,144049.jpg,1,1 2385 | 144054,144055.jpg,1,1 2386 | 144074,144075.jpg,1,1 2387 | 144076,144077.jpg,1,1 2388 | 144078,144079.jpg,1,1 2389 | 144122,144123.jpg,1,1 2390 | 144143,144144.jpg,1,1 2391 | 144151,144152.jpg,1,1 2392 | 144175,144176.jpg,1,1 2393 | 144179,144180.jpg,1,1 
2394 | 144203,144204.jpg,1,1 2395 | 144390,144391.jpg,1,1 2396 | 144442,144443.jpg,1,1 2397 | 144501,144502.jpg,1,1 2398 | 144558,144559.jpg,1,1 2399 | 144708,144709.jpg,1,1 2400 | 144715,144716.jpg,1,1 2401 | 144738,144739.jpg,1,1 2402 | 144758,144759.jpg,1,1 2403 | 144771,144772.jpg,1,1 2404 | 144784,144785.jpg,1,1 2405 | 144803,144804.jpg,1,1 2406 | 144928,144929.jpg,1,1 2407 | 144933,144934.jpg,1,1 2408 | 144934,144935.jpg,1,1 2409 | 144949,144950.jpg,1,1 2410 | 145150,145151.jpg,1,1 2411 | 145243,145244.jpg,1,1 2412 | 145258,145259.jpg,1,1 2413 | 145384,145385.jpg,1,1 2414 | 145459,145460.jpg,1,1 2415 | 145570,145571.jpg,1,1 2416 | 145581,145582.jpg,1,1 2417 | 145604,145605.jpg,1,1 2418 | 145661,145662.jpg,1,1 2419 | 145712,145713.jpg,1,1 2420 | 145728,145729.jpg,1,1 2421 | 145783,145784.jpg,1,1 2422 | 146184,146185.jpg,1,1 2423 | 146188,146189.jpg,1,1 2424 | 146237,146238.jpg,1,1 2425 | 146356,146357.jpg,1,1 2426 | 146401,146402.jpg,1,1 2427 | 146404,146405.jpg,1,1 2428 | 146422,146423.jpg,1,1 2429 | 146429,146430.jpg,1,1 2430 | 146449,146450.jpg,1,1 2431 | 146454,146455.jpg,1,1 2432 | 146486,146487.jpg,1,1 2433 | 146552,146553.jpg,1,1 2434 | 146653,146654.jpg,1,1 2435 | 146667,146668.jpg,1,1 2436 | 146694,146695.jpg,1,1 2437 | 146700,146701.jpg,1,1 2438 | 146708,146709.jpg,1,1 2439 | 146729,146730.jpg,1,1 2440 | 146789,146790.jpg,1,1 2441 | 146820,146821.jpg,1,1 2442 | 146871,146872.jpg,1,1 2443 | 146875,146876.jpg,1,1 2444 | 146940,146941.jpg,1,1 2445 | 146995,146996.jpg,1,1 2446 | 147027,147028.jpg,1,1 2447 | 147097,147098.jpg,1,1 2448 | 147111,147112.jpg,1,1 2449 | 147116,147117.jpg,1,1 2450 | 147199,147200.jpg,1,1 2451 | 147268,147269.jpg,1,1 2452 | 147380,147381.jpg,1,1 2453 | 147385,147386.jpg,1,1 2454 | 147419,147420.jpg,1,1 2455 | 147458,147459.jpg,1,1 2456 | 147470,147471.jpg,1,1 2457 | 147518,147519.jpg,1,1 2458 | 147528,147529.jpg,1,1 2459 | 147556,147557.jpg,1,1 2460 | 147581,147582.jpg,1,1 2461 | 147599,147600.jpg,1,1 2462 | 
147606,147607.jpg,1,1 2463 | 147626,147627.jpg,1,1 2464 | 147692,147693.jpg,1,1 2465 | 147728,147729.jpg,1,1 2466 | 147787,147788.jpg,1,1 2467 | 147805,147806.jpg,1,1 2468 | 148004,148005.jpg,1,1 2469 | 148084,148085.jpg,1,1 2470 | 148170,148171.jpg,1,1 2471 | 148176,148177.jpg,1,1 2472 | 148265,148266.jpg,1,1 2473 | 148338,148339.jpg,1,1 2474 | 148350,148351.jpg,1,1 2475 | 148405,148406.jpg,1,1 2476 | 148439,148440.jpg,1,1 2477 | 148503,148504.jpg,1,1 2478 | 148504,148505.jpg,1,1 2479 | 148505,148506.jpg,1,1 2480 | 148576,148577.jpg,1,1 2481 | 148622,148623.jpg,1,1 2482 | 148674,148675.jpg,1,1 2483 | 148759,148760.jpg,1,1 2484 | 148778,148779.jpg,1,1 2485 | 148853,148854.jpg,1,1 2486 | 148903,148904.jpg,1,1 2487 | 148952,148953.jpg,1,1 2488 | 149013,149014.jpg,1,1 2489 | 149098,149099.jpg,1,1 2490 | 149113,149114.jpg,1,1 2491 | 149182,149183.jpg,1,1 2492 | 149186,149187.jpg,1,1 2493 | 149225,149226.jpg,1,1 2494 | 149254,149255.jpg,1,1 2495 | 149274,149275.jpg,1,1 2496 | 149353,149354.jpg,1,1 2497 | 149364,149365.jpg,1,1 2498 | 149439,149440.jpg,1,1 2499 | 149480,149481.jpg,1,1 2500 | 149524,149525.jpg,1,1 2501 | 149536,149537.jpg,1,1 2502 | 149628,149629.jpg,1,1 2503 | 149734,149735.jpg,1,1 2504 | 149844,149845.jpg,1,1 2505 | 149948,149949.jpg,1,1 2506 | 149960,149961.jpg,1,1 2507 | 149963,149964.jpg,1,1 2508 | 149993,149994.jpg,1,1 2509 | 150027,150028.jpg,1,1 2510 | 150059,150060.jpg,1,1 2511 | 150128,150129.jpg,1,1 2512 | 150243,150244.jpg,1,1 2513 | 150265,150266.jpg,1,1 2514 | 150304,150305.jpg,1,1 2515 | 150542,150543.jpg,1,1 2516 | 150637,150638.jpg,1,1 2517 | 150645,150646.jpg,1,1 2518 | 150764,150765.jpg,1,1 2519 | 150811,150812.jpg,1,1 2520 | 150835,150836.jpg,1,1 2521 | 150898,150899.jpg,1,1 2522 | 150967,150968.jpg,1,1 2523 | 150984,150985.jpg,1,1 2524 | 150989,150990.jpg,1,1 2525 | 151071,151072.jpg,1,1 2526 | 151082,151083.jpg,1,1 2527 | 151232,151233.jpg,1,1 2528 | 151251,151252.jpg,1,1 2529 | 151352,151353.jpg,1,1 2530 | 151437,151438.jpg,1,1 2531 
| 151442,151443.jpg,1,1 2532 | 151456,151457.jpg,1,1 2533 | 151535,151536.jpg,1,1 2534 | 151593,151594.jpg,1,1 2535 | 151762,151763.jpg,1,1 2536 | 151808,151809.jpg,1,1 2537 | 151833,151834.jpg,1,1 2538 | 151869,151870.jpg,1,1 2539 | 151969,151970.jpg,1,1 2540 | 152029,152030.jpg,1,1 2541 | 152065,152066.jpg,1,1 2542 | 152182,152183.jpg,1,1 2543 | 152201,152202.jpg,1,1 2544 | 152373,152374.jpg,1,1 2545 | 152506,152507.jpg,1,1 2546 | 152567,152568.jpg,1,1 2547 | 152610,152611.jpg,1,1 2548 | 152658,152659.jpg,1,1 2549 | 152710,152711.jpg,1,1 2550 | 152797,152798.jpg,1,1 2551 | 153009,153010.jpg,1,1 2552 | 153135,153136.jpg,1,1 2553 | 153148,153149.jpg,1,1 2554 | 153181,153182.jpg,1,1 2555 | 153193,153194.jpg,1,1 2556 | 153221,153222.jpg,1,1 2557 | 153238,153239.jpg,1,1 2558 | 153279,153280.jpg,1,1 2559 | 153339,153340.jpg,1,1 2560 | 153372,153373.jpg,1,1 2561 | 153381,153382.jpg,1,1 2562 | 153392,153393.jpg,1,1 2563 | 153416,153417.jpg,1,1 2564 | 153485,153486.jpg,1,1 2565 | 153506,153507.jpg,1,1 2566 | 153528,153529.jpg,1,1 2567 | 153609,153610.jpg,1,1 2568 | 153642,153643.jpg,1,1 2569 | 153741,153742.jpg,1,1 2570 | 153766,153767.jpg,1,1 2571 | 153944,153945.jpg,1,1 2572 | 154031,154032.jpg,1,1 2573 | 154109,154110.jpg,1,1 2574 | 154167,154168.jpg,1,1 2575 | 154189,154190.jpg,1,1 2576 | 154231,154232.jpg,1,1 2577 | 154318,154319.jpg,1,1 2578 | 154359,154360.jpg,1,1 2579 | 154372,154373.jpg,1,1 2580 | 154487,154488.jpg,1,1 2581 | 154618,154619.jpg,1,1 2582 | 154621,154622.jpg,1,1 2583 | 154674,154675.jpg,1,1 2584 | 154692,154693.jpg,1,1 2585 | 154714,154715.jpg,1,1 2586 | 154833,154834.jpg,1,1 2587 | 154864,154865.jpg,1,1 2588 | 155024,155025.jpg,1,1 2589 | 155073,155074.jpg,1,1 2590 | 155180,155181.jpg,1,1 2591 | 155217,155218.jpg,1,1 2592 | 155306,155307.jpg,1,1 2593 | 155358,155359.jpg,1,1 2594 | 155498,155499.jpg,1,1 2595 | 155515,155516.jpg,1,1 2596 | 155519,155520.jpg,1,1 2597 | 155553,155554.jpg,1,1 2598 | 155562,155563.jpg,1,1 2599 | 155666,155667.jpg,1,1 
2600 | 155738,155739.jpg,1,1 2601 | 155801,155802.jpg,1,1 2602 | 155834,155835.jpg,1,1 2603 | 155836,155837.jpg,1,1 2604 | 155877,155878.jpg,1,1 2605 | 155904,155905.jpg,1,1 2606 | 156009,156010.jpg,1,1 2607 | 156037,156038.jpg,1,1 2608 | 156131,156132.jpg,1,1 2609 | 156179,156180.jpg,1,1 2610 | 156200,156201.jpg,1,1 2611 | 156202,156203.jpg,1,1 2612 | 156292,156293.jpg,1,1 2613 | 156395,156396.jpg,1,1 2614 | 156474,156475.jpg,1,1 2615 | 156502,156503.jpg,1,1 2616 | 156554,156555.jpg,1,1 2617 | 156568,156569.jpg,1,1 2618 | 156791,156792.jpg,1,1 2619 | 156817,156818.jpg,1,1 2620 | 156872,156873.jpg,1,1 2621 | 156900,156901.jpg,1,1 2622 | 157050,157051.jpg,1,1 2623 | 157125,157126.jpg,1,1 2624 | 157143,157144.jpg,1,1 2625 | 157285,157286.jpg,1,1 2626 | 157324,157325.jpg,1,1 2627 | 157350,157351.jpg,1,1 2628 | 157404,157405.jpg,1,1 2629 | 157413,157414.jpg,1,1 2630 | 157453,157454.jpg,1,1 2631 | 157476,157477.jpg,1,1 2632 | 157507,157508.jpg,1,1 2633 | 157713,157714.jpg,1,1 2634 | 157805,157806.jpg,1,1 2635 | 157974,157975.jpg,1,1 2636 | 157986,157987.jpg,1,1 2637 | 158031,158032.jpg,1,1 2638 | 158125,158126.jpg,1,1 2639 | 158147,158148.jpg,1,1 2640 | 158333,158334.jpg,1,1 2641 | 158447,158448.jpg,1,1 2642 | 158453,158454.jpg,1,1 2643 | 158463,158464.jpg,1,1 2644 | 158570,158571.jpg,1,1 2645 | 158642,158643.jpg,1,1 2646 | 158648,158649.jpg,1,1 2647 | 158718,158719.jpg,1,1 2648 | 158733,158734.jpg,1,1 2649 | 158810,158811.jpg,1,1 2650 | 158956,158957.jpg,1,1 2651 | 158965,158966.jpg,1,1 2652 | 158978,158979.jpg,1,1 2653 | 158989,158990.jpg,1,1 2654 | 159011,159012.jpg,1,1 2655 | 159079,159080.jpg,1,1 2656 | 159165,159166.jpg,1,1 2657 | 159204,159205.jpg,1,1 2658 | 159214,159215.jpg,1,1 2659 | 159296,159297.jpg,1,1 2660 | 159431,159432.jpg,1,1 2661 | 159497,159498.jpg,1,1 2662 | 159629,159630.jpg,1,1 2663 | 159637,159638.jpg,1,1 2664 | 159701,159702.jpg,1,1 2665 | 159704,159705.jpg,1,1 2666 | 159709,159710.jpg,1,1 2667 | 159731,159732.jpg,1,1 2668 | 
159737,159738.jpg,1,1 2669 | 159756,159757.jpg,1,1 2670 | 159793,159794.jpg,1,1 2671 | 159812,159813.jpg,1,1 2672 | 159956,159957.jpg,1,1 2673 | 159984,159985.jpg,1,1 2674 | 160007,160008.jpg,1,1 2675 | 160013,160014.jpg,1,1 2676 | 160014,160015.jpg,1,1 2677 | 160020,160021.jpg,1,1 2678 | 160095,160096.jpg,1,1 2679 | 160246,160247.jpg,1,1 2680 | 160312,160313.jpg,1,1 2681 | 160329,160330.jpg,1,1 2682 | 160331,160332.jpg,1,1 2683 | 160402,160403.jpg,1,1 2684 | 160535,160536.jpg,1,1 2685 | 160547,160548.jpg,1,1 2686 | 160590,160591.jpg,1,1 2687 | 160603,160604.jpg,1,1 2688 | 160607,160608.jpg,1,1 2689 | 160631,160632.jpg,1,1 2690 | 160678,160679.jpg,1,1 2691 | 160828,160829.jpg,1,1 2692 | 160843,160844.jpg,1,1 2693 | 160889,160890.jpg,1,1 2694 | 160923,160924.jpg,1,1 2695 | 160962,160963.jpg,1,1 2696 | 161006,161007.jpg,1,1 2697 | 161169,161170.jpg,1,1 2698 | 161224,161225.jpg,1,1 2699 | 161271,161272.jpg,1,1 2700 | 161365,161366.jpg,1,1 2701 | 161383,161384.jpg,1,1 2702 | 161461,161462.jpg,1,1 2703 | 161477,161478.jpg,1,1 2704 | 161484,161485.jpg,1,1 2705 | 161499,161500.jpg,1,1 2706 | 161525,161526.jpg,1,1 2707 | 161537,161538.jpg,1,1 2708 | 161574,161575.jpg,1,1 2709 | 161642,161643.jpg,1,1 2710 | 161749,161750.jpg,1,1 2711 | 161908,161909.jpg,1,1 2712 | 162020,162021.jpg,1,1 2713 | 162263,162264.jpg,1,1 2714 | 162282,162283.jpg,1,1 2715 | 162331,162332.jpg,1,1 2716 | 162350,162351.jpg,1,1 2717 | 162395,162396.jpg,1,1 2718 | 162467,162468.jpg,1,1 2719 | 162565,162566.jpg,1,1 2720 | 162745,162746.jpg,1,1 2721 | 162752,162753.jpg,1,1 2722 | 162941,162942.jpg,1,1 2723 | 163011,163012.jpg,1,1 2724 | 163156,163157.jpg,1,1 2725 | 163224,163225.jpg,1,1 2726 | 163241,163242.jpg,1,1 2727 | 163292,163293.jpg,1,1 2728 | 163304,163305.jpg,1,1 2729 | 163305,163306.jpg,1,1 2730 | 163358,163359.jpg,1,1 2731 | 163393,163394.jpg,1,1 2732 | 163396,163397.jpg,1,1 2733 | 163439,163440.jpg,1,1 2734 | 163520,163521.jpg,1,1 2735 | 163708,163709.jpg,1,1 2736 | 163741,163742.jpg,1,1 2737 
| 163761,163762.jpg,1,1 2738 | 163776,163777.jpg,1,1 2739 | 163804,163805.jpg,1,1 2740 | 163820,163821.jpg,1,1 2741 | 163827,163828.jpg,1,1 2742 | 163919,163920.jpg,1,1 2743 | 164071,164072.jpg,1,1 2744 | 164084,164085.jpg,1,1 2745 | 164086,164087.jpg,1,1 2746 | 164163,164164.jpg,1,1 2747 | 164236,164237.jpg,1,1 2748 | 164277,164278.jpg,1,1 2749 | 164330,164331.jpg,1,1 2750 | 164417,164418.jpg,1,1 2751 | 164426,164427.jpg,1,1 2752 | 164490,164491.jpg,1,1 2753 | 164681,164682.jpg,1,1 2754 | 164759,164760.jpg,1,1 2755 | 164875,164876.jpg,1,1 2756 | 164902,164903.jpg,1,1 2757 | 164908,164909.jpg,1,1 2758 | 165015,165016.jpg,1,1 2759 | 165023,165024.jpg,1,1 2760 | 165158,165159.jpg,1,1 2761 | 165316,165317.jpg,1,1 2762 | 165367,165368.jpg,1,1 2763 | 165454,165455.jpg,1,1 2764 | 165470,165471.jpg,1,1 2765 | 165528,165529.jpg,1,1 2766 | 165623,165624.jpg,1,1 2767 | 165663,165664.jpg,1,1 2768 | 165812,165813.jpg,1,1 2769 | 165959,165960.jpg,1,1 2770 | 166024,166025.jpg,1,1 2771 | 166040,166041.jpg,1,1 2772 | 166083,166084.jpg,1,1 2773 | 166126,166127.jpg,1,1 2774 | 166156,166157.jpg,1,1 2775 | 166210,166211.jpg,1,1 2776 | 166272,166273.jpg,1,1 2777 | 166326,166327.jpg,1,1 2778 | 166569,166570.jpg,1,1 2779 | 166745,166746.jpg,1,1 2780 | 166788,166789.jpg,1,1 2781 | 166852,166853.jpg,1,1 2782 | 166858,166859.jpg,1,1 2783 | 166860,166861.jpg,1,1 2784 | 167004,167005.jpg,1,1 2785 | 167021,167022.jpg,1,1 2786 | 167061,167062.jpg,1,1 2787 | 167245,167246.jpg,1,1 2788 | 167334,167335.jpg,1,1 2789 | 167400,167401.jpg,1,1 2790 | 167563,167564.jpg,1,1 2791 | 167632,167633.jpg,1,1 2792 | 167644,167645.jpg,1,1 2793 | 167760,167761.jpg,1,1 2794 | 167840,167841.jpg,1,1 2795 | 167887,167888.jpg,1,1 2796 | 167895,167896.jpg,1,1 2797 | 167922,167923.jpg,1,1 2798 | 167926,167927.jpg,1,1 2799 | 167971,167972.jpg,1,1 2800 | 168040,168041.jpg,1,1 2801 | 168045,168046.jpg,1,1 2802 | 168112,168113.jpg,1,1 2803 | 168199,168200.jpg,1,1 2804 | 168270,168271.jpg,1,1 2805 | 168283,168284.jpg,1,1 
2806 | 168345,168346.jpg,1,1 2807 | 168354,168355.jpg,1,1 2808 | 168532,168533.jpg,1,1 2809 | 168542,168543.jpg,1,1 2810 | 168607,168608.jpg,1,1 2811 | 168635,168636.jpg,1,1 2812 | 168671,168672.jpg,1,1 2813 | 168724,168725.jpg,1,1 2814 | 168780,168781.jpg,1,1 2815 | 168959,168960.jpg,1,1 2816 | 169016,169017.jpg,1,1 2817 | 169082,169083.jpg,1,1 2818 | 169098,169099.jpg,1,1 2819 | 169176,169177.jpg,1,1 2820 | 169346,169347.jpg,1,1 2821 | 169537,169538.jpg,1,1 2822 | 169547,169548.jpg,1,1 2823 | 169555,169556.jpg,1,1 2824 | 169568,169569.jpg,1,1 2825 | 169603,169604.jpg,1,1 2826 | 169689,169690.jpg,1,1 2827 | 169781,169782.jpg,1,1 2828 | 169783,169784.jpg,1,1 2829 | 169789,169790.jpg,1,1 2830 | 169811,169812.jpg,1,1 2831 | 169843,169844.jpg,1,1 2832 | 169941,169942.jpg,1,1 2833 | 169978,169979.jpg,1,1 2834 | 169980,169981.jpg,1,1 2835 | 170094,170095.jpg,1,1 2836 | 170124,170125.jpg,1,1 2837 | 170156,170157.jpg,1,1 2838 | 170259,170260.jpg,1,1 2839 | 170291,170292.jpg,1,1 2840 | 170518,170519.jpg,1,1 2841 | 170551,170552.jpg,1,1 2842 | 170557,170558.jpg,1,1 2843 | 170616,170617.jpg,1,1 2844 | 170798,170799.jpg,1,1 2845 | 170832,170833.jpg,1,1 2846 | 170856,170857.jpg,1,1 2847 | 170896,170897.jpg,1,1 2848 | 170899,170900.jpg,1,1 2849 | 170900,170901.jpg,1,1 2850 | 170932,170933.jpg,1,1 2851 | 171007,171008.jpg,1,1 2852 | 171104,171105.jpg,1,1 2853 | 171121,171122.jpg,1,1 2854 | 171151,171152.jpg,1,1 2855 | 171199,171200.jpg,1,1 2856 | 171215,171216.jpg,1,1 2857 | 171221,171222.jpg,1,1 2858 | 171257,171258.jpg,1,1 2859 | 171430,171431.jpg,1,1 2860 | 171504,171505.jpg,1,1 2861 | 171567,171568.jpg,1,1 2862 | 171693,171694.jpg,1,1 2863 | 171717,171718.jpg,1,1 2864 | 171725,171726.jpg,1,1 2865 | 171735,171736.jpg,1,1 2866 | 171756,171757.jpg,1,1 2867 | 171803,171804.jpg,1,1 2868 | 171819,171820.jpg,1,1 2869 | 171831,171832.jpg,1,1 2870 | 171894,171895.jpg,1,1 2871 | 171907,171908.jpg,1,1 2872 | 172012,172013.jpg,1,1 2873 | 172251,172252.jpg,1,1 2874 | 
172283,172284.jpg,1,1 2875 | 172395,172396.jpg,1,1 2876 | 172422,172423.jpg,1,1 2877 | 172519,172520.jpg,1,1 2878 | 172595,172596.jpg,1,1 2879 | 172601,172602.jpg,1,1 2880 | 172641,172642.jpg,1,1 2881 | 172713,172714.jpg,1,1 2882 | 172762,172763.jpg,1,1 2883 | 172775,172776.jpg,1,1 2884 | 172785,172786.jpg,1,1 2885 | 172867,172868.jpg,1,1 2886 | 172896,172897.jpg,1,1 2887 | 172917,172918.jpg,1,1 2888 | 172965,172966.jpg,1,1 2889 | 173002,173003.jpg,1,1 2890 | 173012,173013.jpg,1,1 2891 | 173091,173092.jpg,1,1 2892 | 173200,173201.jpg,1,1 2893 | 173244,173245.jpg,1,1 2894 | 173255,173256.jpg,1,1 2895 | 173268,173269.jpg,1,1 2896 | 173429,173430.jpg,1,1 2897 | 173436,173437.jpg,1,1 2898 | 173526,173527.jpg,1,1 2899 | 173635,173636.jpg,1,1 2900 | 173650,173651.jpg,1,1 2901 | 173674,173675.jpg,1,1 2902 | 173774,173775.jpg,1,1 2903 | 173779,173780.jpg,1,1 2904 | 173852,173853.jpg,1,1 2905 | 173864,173865.jpg,1,1 2906 | 173884,173885.jpg,1,1 2907 | 173923,173924.jpg,1,1 2908 | 173924,173925.jpg,1,1 2909 | 173982,173983.jpg,1,1 2910 | 174050,174051.jpg,1,1 2911 | 174076,174077.jpg,1,1 2912 | 174308,174309.jpg,1,1 2913 | 174631,174632.jpg,1,1 2914 | 174667,174668.jpg,1,1 2915 | 174747,174748.jpg,1,1 2916 | 174822,174823.jpg,1,1 2917 | 174952,174953.jpg,1,1 2918 | 175009,175010.jpg,1,1 2919 | 175023,175024.jpg,1,1 2920 | 175034,175035.jpg,1,1 2921 | 175141,175142.jpg,1,1 2922 | 175235,175236.jpg,1,1 2923 | 175250,175251.jpg,1,1 2924 | 175279,175280.jpg,1,1 2925 | 175286,175287.jpg,1,1 2926 | 175434,175435.jpg,1,1 2927 | 175603,175604.jpg,1,1 2928 | 175611,175612.jpg,1,1 2929 | 175680,175681.jpg,1,1 2930 | 175694,175695.jpg,1,1 2931 | 175858,175859.jpg,1,1 2932 | 175921,175922.jpg,1,1 2933 | 176195,176196.jpg,1,1 2934 | 176258,176259.jpg,1,1 2935 | 176325,176326.jpg,1,1 2936 | 176378,176379.jpg,1,1 2937 | 176460,176461.jpg,1,1 2938 | 176499,176500.jpg,1,1 2939 | 176625,176626.jpg,1,1 2940 | 176678,176679.jpg,1,1 2941 | 176749,176750.jpg,1,1 2942 | 176884,176885.jpg,1,1 2943 
| 176987,176988.jpg,1,1 2944 | 177078,177079.jpg,1,1 2945 | 177085,177086.jpg,1,1 2946 | 177093,177094.jpg,1,1 2947 | 177222,177223.jpg,1,1 2948 | 177251,177252.jpg,1,1 2949 | 177270,177271.jpg,1,1 2950 | 177305,177306.jpg,1,1 2951 | 177318,177319.jpg,1,1 2952 | 177340,177341.jpg,1,1 2953 | 177555,177556.jpg,1,1 2954 | 177722,177723.jpg,1,1 2955 | 177844,177845.jpg,1,1 2956 | 178010,178011.jpg,1,1 2957 | 178075,178076.jpg,1,1 2958 | 178077,178078.jpg,1,1 2959 | 178134,178135.jpg,1,1 2960 | 178138,178139.jpg,1,1 2961 | 178155,178156.jpg,1,1 2962 | 178294,178295.jpg,1,1 2963 | 178370,178371.jpg,1,1 2964 | 178421,178422.jpg,1,1 2965 | 178579,178580.jpg,1,1 2966 | 178697,178698.jpg,1,1 2967 | 178777,178778.jpg,1,1 2968 | 178964,178965.jpg,1,1 2969 | 179118,179119.jpg,1,1 2970 | 179158,179159.jpg,1,1 2971 | 179234,179235.jpg,1,1 2972 | 179276,179277.jpg,1,1 2973 | 179333,179334.jpg,1,1 2974 | 179382,179383.jpg,1,1 2975 | 179457,179458.jpg,1,1 2976 | 179494,179495.jpg,1,1 2977 | 179512,179513.jpg,1,1 2978 | 179551,179552.jpg,1,1 2979 | 179624,179625.jpg,1,1 2980 | 179777,179778.jpg,1,1 2981 | 179909,179910.jpg,1,1 2982 | 180008,180009.jpg,1,1 2983 | 180211,180212.jpg,1,1 2984 | 180245,180246.jpg,1,1 2985 | 180337,180338.jpg,1,1 2986 | 180661,180662.jpg,1,1 2987 | 180701,180702.jpg,1,1 2988 | 180705,180706.jpg,1,1 2989 | 180713,180714.jpg,1,1 2990 | 180838,180839.jpg,1,1 2991 | 180963,180964.jpg,1,1 2992 | 181084,181085.jpg,1,1 2993 | 181126,181127.jpg,1,1 2994 | 181381,181382.jpg,1,1 2995 | 181390,181391.jpg,1,1 2996 | 181396,181397.jpg,1,1 2997 | 181433,181434.jpg,1,1 2998 | 181467,181468.jpg,1,1 2999 | 181597,181598.jpg,1,1 3000 | 181602,181603.jpg,1,1 3001 | 181643,181644.jpg,1,1 3002 | 181737,181738.jpg,1,1 3003 | 181949,181950.jpg,1,1 3004 | 181957,181958.jpg,1,1 3005 | 181968,181969.jpg,1,1 3006 | 182002,182003.jpg,1,1 3007 | 182043,182044.jpg,1,1 3008 | 182108,182109.jpg,1,1 3009 | 182147,182148.jpg,1,1 3010 | 182373,182374.jpg,1,1 3011 | 182473,182474.jpg,1,1 
3012 | 182475,182476.jpg,1,1 3013 | 182534,182535.jpg,1,1 3014 | 182671,182672.jpg,1,1 3015 | 182703,182704.jpg,1,1 3016 | 182796,182797.jpg,1,1 3017 | 182966,182967.jpg,1,1 3018 | 183075,183076.jpg,1,1 3019 | 183090,183091.jpg,1,1 3020 | 183098,183099.jpg,1,1 3021 | 183113,183114.jpg,1,1 3022 | 183143,183144.jpg,1,1 3023 | 183145,183146.jpg,1,1 3024 | 183194,183195.jpg,1,1 3025 | 183259,183260.jpg,1,1 3026 | 183262,183263.jpg,1,1 3027 | 183372,183373.jpg,1,1 3028 | 183381,183382.jpg,1,1 3029 | 183453,183454.jpg,1,1 3030 | 183465,183466.jpg,1,1 3031 | 183467,183468.jpg,1,1 3032 | 183637,183638.jpg,1,1 3033 | 183731,183732.jpg,1,1 3034 | 183747,183748.jpg,1,1 3035 | 183844,183845.jpg,1,1 3036 | 183849,183850.jpg,1,1 3037 | 183864,183865.jpg,1,1 3038 | 183999,184000.jpg,1,1 3039 | 184060,184061.jpg,1,1 3040 | 184234,184235.jpg,1,1 3041 | 184245,184246.jpg,1,1 3042 | 184272,184273.jpg,1,1 3043 | 184288,184289.jpg,1,1 3044 | 184300,184301.jpg,1,1 3045 | 184339,184340.jpg,1,1 3046 | 184354,184355.jpg,1,1 3047 | 184376,184377.jpg,1,1 3048 | 184397,184398.jpg,1,1 3049 | 184423,184424.jpg,1,1 3050 | 184445,184446.jpg,1,1 3051 | 184467,184468.jpg,1,1 3052 | 184586,184587.jpg,1,1 3053 | 184634,184635.jpg,1,1 3054 | 184641,184642.jpg,1,1 3055 | 184670,184671.jpg,1,1 3056 | 184715,184716.jpg,1,1 3057 | 184859,184860.jpg,1,1 3058 | 184929,184930.jpg,1,1 3059 | 185006,185007.jpg,1,1 3060 | 185036,185037.jpg,1,1 3061 | 185039,185040.jpg,1,1 3062 | 185058,185059.jpg,1,1 3063 | 185125,185126.jpg,1,1 3064 | 185160,185161.jpg,1,1 3065 | 185212,185213.jpg,1,1 3066 | 185297,185298.jpg,1,1 3067 | 185548,185549.jpg,1,1 3068 | 185599,185600.jpg,1,1 3069 | 185636,185637.jpg,1,1 3070 | 185681,185682.jpg,1,1 3071 | 185752,185753.jpg,1,1 3072 | 185782,185783.jpg,1,1 3073 | 185830,185831.jpg,1,1 3074 | 185983,185984.jpg,1,1 3075 | 186207,186208.jpg,1,1 3076 | 186277,186278.jpg,1,1 3077 | 186283,186284.jpg,1,1 3078 | 186388,186389.jpg,1,1 3079 | 186437,186438.jpg,1,1 3080 | 
186558,186559.jpg,1,1 … (rows 3082–3352 of celebA_ood.csv omitted; every row has the form index,image_id,1,1) … 202524,202525.jpg,1,1 3353 |
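A note before the CelebA loader code that follows: its training split is rebalanced so that a chosen fraction `r` (`args.data_label_correlation`) of the grey-hair class carries the spurious attribute (male). A minimal sketch of that arithmetic — `undersample_size` is a hypothetical helper name, not a function from this repo — keeps the minority group (grey-hair females) whole and cuts each correlated group down to `r/(1-r)` times its size:

```python
def undersample_size(r: float, n_minority: int) -> int:
    """Size to which each correlated group is cut so that, within the
    grey-hair class, a fraction r of images shows the spurious attribute.
    Sketch of the arithmetic in celebADataset.subsample, not the repo's code."""
    return int(r / (1 - r) * n_minority)

if __name__ == "__main__":
    r = 0.8           # target spurious correlation (args.data_label_correlation)
    n_minority = 100  # grey-hair females, kept in full
    s = undersample_size(r, n_minority)  # grey-hair males kept after subsampling
    frac = s / (s + n_minority)          # fraction of grey-hair images that are male
    print(s, round(frac, 3))             # 400 0.8
```

With `r = 0.8` and 100 grey-hair females, 400 grey-hair males are kept, so 80% of the grey-hair class shows the spurious attribute.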
-------------------------------------------------------------------------------- /datasets/celebA_dataset.py: -------------------------------------------------------------------------------- 1 | import os 2 | import torch 3 | import numpy as np 4 | import pandas as pd 5 | import torchvision.transforms as transforms 6 | 7 | from PIL import Image 8 | from torch.utils.data import Dataset, DataLoader 9 | from torch.utils.data.distributed import DistributedSampler 10 | 11 | # Ignore warnings 12 | import warnings 13 | warnings.filterwarnings("ignore") 14 | 15 | 16 | class celebADataset(Dataset): 17 | def __init__(self, args, split): 18 | self.split_dict = { 19 | 'train': 0, 20 | 'val': 1, 21 | 'test': 2 22 | } 23 | # (y, gender) 24 | self.env_dict = { 25 | (0, 0): 0, # nongrey hair, female 26 | (0, 1): 1, # nongrey hair, male 27 | (1, 0): 2, # gray hair, female 28 | (1, 1): 3 # gray hair, male 29 | } 30 | self.split = split 31 | self.dataset_name = 'celebA' 32 | self.dataset_dir = os.path.join("datasets/", self.dataset_name) 33 | if not os.path.exists(self.dataset_dir): 34 | raise ValueError( 35 | f'{self.dataset_dir} does not exist yet. 
Please generate the dataset first.') 36 | self.metadata_df = pd.read_csv( 37 | os.path.join(self.dataset_dir, 'celebA_split.csv')) 38 | self.metadata_df = self.metadata_df[self.metadata_df['split']==self.split_dict[self.split]] 39 | 40 | self.y_array = self.metadata_df['Gray_Hair'].values 41 | self.gender_array = self.metadata_df['Male'].values 42 | self.filename_array = self.metadata_df['image_id'].values 43 | self.transform = get_transform_cub(self.split=='train') 44 | if self.split == 'train': 45 | self.subsample(args.data_label_correlation) 46 | 47 | def subsample(self, ratio = 0.6): 48 | np.random.seed(1) 49 | train_group_idx = { 50 | (0, 0): np.array([]), # nongrey hair, female 51 | (0, 1): np.array([]), # nongrey hair, male 52 | (1, 0): np.array([]), # gray hair, female 53 | (1, 1): np.array([]) # gray hair, male 54 | } 55 | for idx, (y, gender) in enumerate(zip(self.y_array, self.gender_array)): 56 | train_group_idx[(y, gender)] = np.append(train_group_idx[(y, gender)],idx) 57 | sample_size = int(ratio/(1-ratio)*len(train_group_idx[(1, 0)])) 58 | undersampled_idx_00 = np.random.choice(train_group_idx[(0, 0)], sample_size, replace = False) 59 | undersampled_idx_11 = np.random.choice(train_group_idx[(1, 1)], sample_size, replace = False) 60 | undersampled_idx = np.concatenate( (train_group_idx[(1, 0)], undersampled_idx_00, undersampled_idx_11, train_group_idx[(0, 1)]) ) 61 | undersampled_idx = undersampled_idx.astype(int) 62 | self.y_array = self.y_array[undersampled_idx] 63 | self.gender_array = self.gender_array[undersampled_idx] 64 | self.filename_array = self.filename_array[undersampled_idx] 65 | 66 | 67 | 68 | def __len__(self): 69 | return len(self.filename_array) 70 | 71 | def __getitem__(self, idx): 72 | y = self.y_array[idx] 73 | gender = self.gender_array[idx] 74 | img_filename = os.path.join( 75 | self.dataset_dir, 76 | 'img_align_celeba', 77 | self.filename_array[idx]) 78 | img = Image.open(img_filename).convert('RGB') 79 | img = 
self.transform(img) 80 | 81 | return img, y, self.env_dict[(y, gender)] 82 | 83 | 84 | class celebAOodDataset(Dataset): 85 | def __init__(self): 86 | self.dataset_name = 'celebA' 87 | self.dataset_dir = os.path.join("datasets/", self.dataset_name) 88 | if not os.path.exists(self.dataset_dir): 89 | raise ValueError( 90 | f'{self.dataset_dir} does not exist yet. Please generate the dataset first.') 91 | self.metadata_df = pd.read_csv( 92 | os.path.join(self.dataset_dir, 'celebA_ood.csv')) 93 | 94 | self.filename_array = self.metadata_df['image_id'].values 95 | self.transform = get_transform_cub(train=False) 96 | 97 | def __len__(self): 98 | return len(self.filename_array) 99 | 100 | def __getitem__(self, idx): 101 | img_filename = os.path.join( 102 | self.dataset_dir, 103 | 'img_align_celeba', 104 | self.filename_array[idx]) 105 | img = Image.open(img_filename).convert('RGB') 106 | img = self.transform(img) 107 | 108 | return img, img 109 | 110 | 111 | def get_transform_cub(train): 112 | orig_w = 178 113 | orig_h = 218 114 | orig_min_dim = min(orig_w, orig_h) 115 | target_resolution = (224, 224) 116 | 117 | if not train: 118 | transform = transforms.Compose([ 119 | transforms.CenterCrop(orig_min_dim), 120 | transforms.Resize(target_resolution), 121 | transforms.ToTensor(), 122 | transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]) 123 | ]) 124 | else: 125 | # Orig aspect ratio is 0.81, so we don't squish it in that direction any more 126 | transform = transforms.Compose([ 127 | transforms.RandomResizedCrop( 128 | target_resolution, 129 | scale=(0.7, 1.0), 130 | ratio=(1.0, 1.3333333333333333), 131 | interpolation=2), 132 | transforms.RandomHorizontalFlip(), 133 | transforms.ToTensor(), 134 | transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]) 135 | ]) 136 | return transform 137 | 138 | 139 | def get_celebA_dataloader(args, split): 140 | kwargs = {'pin_memory': True, 'num_workers': 8, 'drop_last': True} 141 | dataset = celebADataset(args, 
split) 142 | dataloader = DataLoader(dataset=dataset, 143 | batch_size=args.batch_size, 144 | shuffle=True, 145 | **kwargs) 146 | return dataloader 147 | 148 | 149 | 150 | 151 | def get_celebA_ood_dataloader(args): 152 | kwargs = {'pin_memory': True, 'num_workers': 8, 'drop_last': True} 153 | dataset = celebAOodDataset() 154 | dataloader = DataLoader(dataset=dataset, 155 | batch_size=args.ood_batch_size, 156 | shuffle=True, 157 | **kwargs) 158 | return dataloader 159 | 160 | 161 | if __name__ == "__main__": 162 | import argparse 163 | parser = argparse.ArgumentParser(description='OOD training for multi-label classification') 164 | parser.add_argument('-b', '--batch-size', default=64, type=int, 165 | help='mini-batch size (default: 64) used for training') 166 | parser.add_argument('--ood-batch-size', default=64, type=int, 167 | help='mini-batch size (default: 64) used for testing') 168 | args = parser.parse_args() 169 | 170 | dataloader = get_celebA_dataloader(args, split='train') 171 | ood_dataloader = get_celebA_ood_dataloader(args) -------------------------------------------------------------------------------- /datasets/color_mnist.py: -------------------------------------------------------------------------------- 1 | """ 2 | Color MNIST Dataset. Adapted from https://github.com/clovaai/rebias 3 | """ 4 | import os 5 | import numpy as np 6 | from PIL import Image 7 | 8 | import torch 9 | from torch.utils import data 10 | 11 | from torchvision import transforms 12 | from torchvision.datasets import MNIST 13 | from torch.utils.data.distributed import DistributedSampler 14 | from utils import UnNormalize 15 | 16 | class BiasedMNIST(MNIST): 17 | """A base class for Biased-MNIST. 18 | We manually select ten colours to synthesize colour bias. (See `COLOUR_MAP` for the colour configuration) 19 | Usage is exactly the same as the torchvision MNIST dataset class. 20 | 21 | You have two parameters to control the level of bias.
22 | 23 | Parameters 24 | ---------- 25 | root : str 26 | path to MNIST dataset. 27 | data_label_correlation : float, default=1.0 28 | Here, each class has the pre-defined colour (bias). 29 | data_label_correlation, or `rho`, controls the level of the dataset bias. 30 | 31 | A sample is coloured with 32 | - the pre-defined colour with probability `rho`, 33 | - one of the other colours with probability `1 - rho`. 34 | The number of ``other colours'' is controlled by `n_confusing_labels` (default: 9). 35 | Note that the colour is injected into the background of the image (see `_binary_to_colour`). 36 | 37 | Hence, we have 38 | - a perfectly biased dataset with rho=1.0, 39 | - a perfectly unbiased dataset with rho=0.1 (1/10) ==> our ``unbiased'' setting at test time. 40 | In the paper, we explore high correlations but with small hints, e.g., rho=0.999. 41 | 42 | n_confusing_labels : int, default=9 43 | In real-world cases, biases are not equally distributed, but highly unbalanced. 44 | We mimic the unbalanced biases by changing the number of confusing colours for each class. 45 | In the paper, we use n_confusing_labels=9, i.e., during training, the model can observe 46 | all colours for each class. However, you can make the problem harder by setting a smaller n_confusing_labels, e.g., 2. 47 | We suggest that researchers consider this benchmark for future research.
48 | """ 49 | 50 | COLOUR_MAP1 = [[255, 0, 0], [0, 255, 0], [0, 0, 255], [225, 225, 0], [225, 0, 225], 51 | [255, 0, 0], [255, 0, 0],[255, 0, 0], [255, 0, 0], [255, 0, 0]] 52 | COLOUR_MAP2 = [[128, 0, 255], [255, 0, 128], [0, 0, 255], [225, 225, 0], [225, 0, 225], 53 | [255, 0, 0], [255, 0, 0],[0, 255, 0], [0, 255, 0], [0, 255, 0]] 54 | 55 | 56 | def __init__(self, root, cmap, train=True, transform=None, target_transform=None, 57 | download=False, data_label_correlation=1.0, n_confusing_labels=9, partial=False): 58 | super().__init__(root, train=train, transform=transform, 59 | target_transform=target_transform, 60 | download=download) 61 | self.cmap = cmap 62 | self.random = True 63 | self.Partial = partial 64 | self.data_label_correlation = data_label_correlation 65 | self.n_confusing_labels = n_confusing_labels 66 | self.data, self.targets, self.biased_targets = self.build_biased_mnist() 67 | 68 | indices = np.arange(len(self.data)) 69 | self._shuffle(indices) 70 | 71 | self.data = self.data[indices].numpy() 72 | self.targets = self.targets[indices] 73 | self.biased_targets = self.biased_targets[indices] 74 | # y, bg, cmap 75 | self.env_dict = { 76 | (0, 0, "1"): 0, #red 0 77 | (0, 1, "1"): 1, #green 0 78 | (1, 0, "1"): 2, #red 1 79 | (1, 1, "1"): 3, #green 1 80 | (0, 0, "2"): 4, #purple 0 81 | (0, 1, "2"): 5, #magenta 0 82 | (1, 0, "2"): 6, #purple 1 83 | (1, 1, "2"): 7 #magenta 1 84 | } 85 | @property 86 | def raw_folder(self): 87 | return os.path.join(self.root, 'raw') 88 | 89 | @property 90 | def processed_folder(self): 91 | return os.path.join(self.root, 'processed') 92 | 93 | def _shuffle(self, iteratable): 94 | if self.random: 95 | np.random.shuffle(iteratable) 96 | 97 | def _make_biased_mnist(self, indices, label, cmap): 98 | raise NotImplementedError 99 | 100 | def _update_bias_indices(self, bias_indices, label): 101 | if self.n_confusing_labels > 9 or self.n_confusing_labels < 1: 102 | raise ValueError(self.n_confusing_labels) 103 | 104 | indices = 
np.where((self.targets == label).numpy())[0] 105 | self._shuffle(indices) 106 | indices = torch.LongTensor(indices) 107 | 108 | n_samples = len(indices) 109 | n_correlated_samples = int(n_samples * self.data_label_correlation) 110 | n_decorrelated_per_class = int(np.ceil((n_samples - n_correlated_samples) / (self.n_confusing_labels))) 111 | correlated_indices = indices[:n_correlated_samples] 112 | bias_indices[label] = torch.cat([bias_indices[label], correlated_indices]) 113 | 114 | decorrelated_indices = torch.split(indices[n_correlated_samples:], n_decorrelated_per_class) 115 | if self.Partial: 116 | other_labels = [_label % 2 for _label in range(label + 1, label + 1 + self.n_confusing_labels)] 117 | else: 118 | other_labels = [_label % 10 for _label in range(label + 1, label + 1 + self.n_confusing_labels)] 119 | self._shuffle(other_labels) 120 | 121 | for idx, _indices in enumerate(decorrelated_indices): 122 | _label = other_labels[idx] 123 | bias_indices[_label] = torch.cat([bias_indices[_label], _indices]) 124 | 125 | def build_biased_mnist(self): 126 | """Build biased MNIST. 
127 | """ 128 | if self.Partial: 129 | n_labels = 2 130 | else: 131 | n_labels = self.targets.max().item() + 1 132 | 133 | bias_indices = {label: torch.LongTensor() for label in range(n_labels)} 134 | for label in range(n_labels): 135 | self._update_bias_indices(bias_indices, label) 136 | 137 | data = torch.ByteTensor() 138 | targets = torch.LongTensor() 139 | biased_targets = [] 140 | 141 | for bias_label, indices in bias_indices.items(): 142 | _data, _targets = self._make_biased_mnist(indices, bias_label, self.cmap) 143 | data = torch.cat([data, _data]) 144 | targets = torch.cat([targets, _targets]) 145 | biased_targets.extend([bias_label] * len(indices)) 146 | 147 | biased_targets = torch.LongTensor(biased_targets) 148 | return data, targets, biased_targets 149 | 150 | def __getitem__(self, index): 151 | img, target = self.data[index], int(self.targets[index]) 152 | img = Image.fromarray(img.astype(np.uint8), mode='RGB') 153 | 154 | if self.transform is not None: 155 | img = self.transform(img) 156 | 157 | if self.target_transform is not None: 158 | target = self.target_transform(target) 159 | 160 | bg_label = int(self.biased_targets[index]) 161 | return img, target, self.env_dict[(target, bg_label, self.cmap)] 162 | 163 | 164 | class ColourBiasedMNIST(BiasedMNIST): 165 | def __init__(self, root, cmap, train=True, transform=None, target_transform=None, 166 | download=False, data_label_correlation=1.0, n_confusing_labels=9, partial = False): 167 | super(ColourBiasedMNIST, self).__init__(root, train=train, transform=transform, 168 | target_transform=target_transform, 169 | download=download, 170 | data_label_correlation=data_label_correlation, 171 | n_confusing_labels=n_confusing_labels, 172 | partial = partial, 173 | cmap = cmap) 174 | 175 | def _binary_to_colour(self, data, colour): 176 | fg_data = torch.zeros_like(data) 177 | fg_data[data != 0] = 255 178 | fg_data[data == 0] = 0 179 | fg_data = torch.stack([fg_data, fg_data, fg_data], dim=1) 180 | 181 | bg_data 
= torch.zeros_like(data) 182 | bg_data[data == 0] = 1 183 | bg_data[data != 0] = 0 184 | bg_data = torch.stack([bg_data, bg_data, bg_data], dim=3) 185 | bg_data = bg_data * torch.ByteTensor(colour) 186 | bg_data = bg_data.permute(0, 3, 1, 2) 187 | 188 | data = fg_data + bg_data 189 | return data.permute(0, 2, 3, 1) 190 | 191 | def _make_biased_mnist(self, indices, label, cmap): 192 | if cmap == "1": 193 | label = self.COLOUR_MAP1[label] 194 | elif cmap == "2": 195 | label = self.COLOUR_MAP2[label] 196 | 197 | return self._binary_to_colour(self.data[indices], label), self.targets[indices] 198 | 199 | 200 | def get_biased_mnist_dataloader(args, root, batch_size, data_label_correlation, cmap, 201 | n_confusing_labels=9, train=True, partial=False): 202 | kwargs = {'pin_memory': False, 'num_workers': 8, 'drop_last': True} 203 | transform = transforms.Compose([ 204 | transforms.ToTensor(), 205 | transforms.Normalize(mean=(0.5, 0.5, 0.5), 206 | std=(0.5, 0.5, 0.5))]) 207 | dataset = ColourBiasedMNIST(root, train=train, transform=transform, 208 | download=True, data_label_correlation=data_label_correlation*2, 209 | n_confusing_labels=n_confusing_labels, partial=partial, cmap = cmap) 210 | dataloader = data.DataLoader(dataset=dataset, batch_size=batch_size, shuffle=True, **kwargs) 211 | 212 | return dataloader 213 | -------------------------------------------------------------------------------- /datasets/cub_dataset.py: -------------------------------------------------------------------------------- 1 | import os 2 | import torch 3 | import numpy as np 4 | import pandas as pd 5 | import torchvision.transforms as transforms 6 | 7 | from PIL import Image 8 | from torch.utils.data import Dataset, DataLoader 9 | from torch.utils.data.distributed import DistributedSampler 10 | 11 | # Ignore warnings 12 | import warnings 13 | warnings.filterwarnings("ignore") 14 | 15 | class WaterbirdDataset(Dataset): 16 | def __init__(self, data_correlation, split, root_dir = 'datasets'): 17 | 
self.split_dict = { 18 | 'train': 0, 19 | 'val': 1, 20 | 'test': 2 21 | } 22 | self.env_dict = { 23 | (0, 0): 0, 24 | (0, 1): 1, 25 | (1, 0): 2, 26 | (1, 1): 3 27 | } 28 | self.split = split 29 | self.root_dir = root_dir 30 | self.dataset_name = "waterbird_complete"+"{:0.2f}".format(data_correlation)[-2:]+"_forest2water2" 31 | self.dataset_dir = os.path.join(self.root_dir, self.dataset_name) 32 | if not os.path.exists(self.dataset_dir): 33 | raise ValueError( 34 | f'{self.dataset_dir} does not exist yet. Please generate the dataset first.') 35 | self.metadata_df = pd.read_csv( 36 | os.path.join(self.dataset_dir, 'metadata.csv')) 37 | self.metadata_df = self.metadata_df[self.metadata_df['split']==self.split_dict[self.split]] 38 | 39 | self.y_array = self.metadata_df['y'].values 40 | self.place_array = self.metadata_df['place'].values 41 | self.filename_array = self.metadata_df['img_filename'].values 42 | self.transform = get_transform_cub(self.split=='train') 43 | 44 | def __len__(self): 45 | return len(self.filename_array) 46 | 47 | def __getitem__(self, idx): 48 | y = self.y_array[idx] 49 | place = self.place_array[idx] 50 | img_filename = os.path.join( 51 | self.dataset_dir, 52 | self.filename_array[idx]) 53 | img = Image.open(img_filename).convert('RGB') 54 | img = self.transform(img) 55 | 56 | return img, y, self.env_dict[(y, place)] 57 | 58 | def get_transform_cub(train): 59 | scale = 256.0/224.0 60 | target_resolution = (224, 224) 61 | assert target_resolution is not None 62 | 63 | if not train: 64 | # Resizes the image to a slightly larger square then crops the center. 
65 | transform = transforms.Compose([ 66 | transforms.Resize((int(target_resolution[0]*scale), int(target_resolution[1]*scale))), 67 | transforms.CenterCrop(target_resolution), 68 | transforms.ToTensor(), 69 | transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]) 70 | ]) 71 | else: 72 | transform = transforms.Compose([ 73 | transforms.RandomResizedCrop( 74 | target_resolution, 75 | scale=(0.7, 1.0), 76 | ratio=(0.75, 1.3333333333333333), 77 | interpolation=2), 78 | transforms.RandomHorizontalFlip(), 79 | transforms.ToTensor(), 80 | transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]) 81 | ]) 82 | return transform 83 | 84 | def get_waterbird_dataloader(args, data_label_correlation, split): 85 | kwargs = {'pin_memory': False, 'num_workers': 8, 'drop_last': True} 86 | dataset = WaterbirdDataset(data_correlation=data_label_correlation, split=split) 87 | dataloader = DataLoader(dataset=dataset, 88 | batch_size=args.batch_size, 89 | shuffle=True, 90 | **kwargs) 91 | return dataloader 92 | 93 | 94 | if __name__ == "__main__": 95 | import argparse 96 | parser = argparse.ArgumentParser(description='OOD training for multi-label classification') 97 | parser.add_argument('-b', '--batch-size', default=64, type=int, 98 | help='mini-batch size (default: 64) used for training') 99 | args = parser.parse_args() 100 | 101 | dataloader = get_waterbird_dataloader(args, 0.9, split='train') -------------------------------------------------------------------------------- /datasets/dataset_utils.py: -------------------------------------------------------------------------------- 1 | from PIL import Image 2 | import numpy as np 3 | 4 | def crop_and_resize(source_img, target_img): 5 | """ 6 | Make source_img exactly the same as target_img by expanding/shrinking and 7 | cropping appropriately. 
8 | 9 | If source_img's dimensions are strictly greater than or equal to the 10 | corresponding target img dimensions, we crop left/right or top/bottom 11 | depending on aspect ratio, then shrink down. 12 | 13 | If any of source img's dimensions are smaller than target img's dimensions, 14 | we expand the source img and then crop accordingly 15 | 16 | Modified from 17 | https://stackoverflow.com/questions/4744372/reducing-the-width-height-of-an-image-to-fit-a-given-aspect-ratio-how-python 18 | """ 19 | source_width = source_img.size[0] 20 | source_height = source_img.size[1] 21 | 22 | target_width = target_img.size[0] 23 | target_height = target_img.size[1] 24 | 25 | # Check if source does not completely cover target 26 | if (source_width < target_width) or (source_height < target_height): 27 | # Try matching width 28 | width_resize = (target_width, int((target_width / source_width) * source_height)) 29 | if (width_resize[0] >= target_width) and (width_resize[1] >= target_height): 30 | source_resized = source_img.resize(width_resize, Image.ANTIALIAS) 31 | else: 32 | height_resize = (int((target_height / source_height) * source_width), target_height) 33 | assert (height_resize[0] >= target_width) and (height_resize[1] >= target_height) 34 | source_resized = source_img.resize(height_resize, Image.ANTIALIAS) 35 | # Rerun the cropping 36 | return crop_and_resize(source_resized, target_img) 37 | 38 | source_aspect = source_width / source_height 39 | target_aspect = target_width / target_height 40 | 41 | if source_aspect > target_aspect: 42 | # Crop left/right 43 | new_source_width = int(target_aspect * source_height) 44 | offset = (source_width - new_source_width) // 2 45 | resize = (offset, 0, source_width - offset, source_height) 46 | else: 47 | # Crop top/bottom 48 | new_source_height = int(source_width / target_aspect) 49 | offset = (source_height - new_source_height) // 2 50 | resize = (0, offset, source_width, source_height - offset) 51 | 52 | source_resized = 
source_img.crop(resize).resize((target_width, target_height), Image.ANTIALIAS) 53 | return source_resized 54 | 55 | 56 | def combine_and_mask(img_new, mask, img_black): 57 | """ 58 | Combine img_new, mask, and img_black based on the mask 59 | 60 | img_new: new (unmasked) image 61 | mask: binary mask of bird image 62 | img_black: already-masked bird image (bird only) 63 | """ 64 | # Warp new img to match black img 65 | img_resized = crop_and_resize(img_new, img_black) 66 | img_resized_np = np.asarray(img_resized) 67 | 68 | # Mask new img 69 | img_masked_np = np.around(img_resized_np * (1 - mask)).astype(np.uint8) 70 | 71 | # Combine 72 | img_combined_np = np.asarray(img_black) + img_masked_np 73 | img_combined = Image.fromarray(img_combined_np) 74 | 75 | return img_combined 76 | -------------------------------------------------------------------------------- /datasets/gaussian_dataset.py: -------------------------------------------------------------------------------- 1 | from torch.utils.data.dataset import Dataset 2 | from torchvision import transforms 3 | import torch 4 | 5 | class GaussianDataset(torch.utils.data.Dataset): 6 | def __init__(self, dataset_size, img_size = 32, labels = None, transform = None, num_classes = 2): 7 | if labels is None: 8 | self.labels = (torch.ones(dataset_size) * num_classes).long() 9 | else: 10 | self.labels = labels 11 | images = torch.normal(0.5, 0.5, size=(dataset_size,3,img_size,img_size)) 12 | self.images = torch.clamp(images, 0, 1) 13 | self.transform = transform 14 | 15 | def __len__(self): 16 | return len(self.images) 17 | 18 | def __getitem__(self, index): 19 | # Load data and get label 20 | if self.transform: 21 | X = self.transform(self.images[index]) 22 | else: 23 | X = self.images[index] 24 | y = self.labels[index] 25 | 26 | return X, y -------------------------------------------------------------------------------- /datasets/generate_placebg.py: 
-------------------------------------------------------------------------------- 1 | import os 2 | import numpy as np 3 | import random 4 | import shutil 5 | import pandas as pd 6 | from PIL import Image 7 | from tqdm import tqdm 8 | from dataset_utils import crop_and_resize, combine_and_mask 9 | 10 | ################ Paths and other configs - Set these ################################# 11 | places_dir = '/nobackup-slow/dataset/places365_standard' 12 | output_dir = 'datasets/ood_datasets' 13 | 14 | target_places = [ 15 | ['bamboo_forest', 'forest/broadleaf'], # Land backgrounds 16 | ['ocean', 'lake/natural']] # Water backgrounds 17 | 18 | confounder_strength = 0.6 # Determines relative size of majority vs. minority groups 19 | dataset_name = 'placesbg' 20 | ###################################################################################### 21 | 22 | ### Assign places to train, val, and test set 23 | place_ids_df = pd.read_csv( 24 | os.path.join(places_dir, 'categories_places365.txt'), 25 | sep=" ", 26 | header=None, 27 | names=['place_name', 'place_id'], 28 | index_col='place_id') 29 | 30 | target_place_ids = [] 31 | 32 | for idx, target_places in enumerate(target_places): 33 | place_filenames = [] 34 | 35 | for target_place in target_places: 36 | 37 | # Read place filenames associated with target_place 38 | place_filenames += [ 39 | f'/{target_place[0]}/{target_place}/{filename}' for filename in os.listdir( 40 | os.path.join(places_dir, 'data_large', 'train', target_place[0], target_place)) 41 | if filename.endswith('.jpg')] 42 | 43 | random.shuffle(place_filenames) 44 | 45 | ### Write dataset to disk 46 | output_subfolder = os.path.join(output_dir, dataset_name) 47 | os.makedirs(output_subfolder, exist_ok=True) 48 | os.makedirs(os.path.join(output_subfolder, 'land'), exist_ok=True) 49 | os.makedirs(os.path.join(output_subfolder, 'water'), exist_ok=True) 50 | 51 | for i in tqdm(range(len(place_filenames))): 52 | place_filepath = place_filenames[i][1:] 53 | 
place_category = place_filepath.split("/")[1] 54 | place_name = place_filepath.split("/")[2] 55 | 56 | place_path = os.path.join(places_dir, 'data_large', 'train', place_filepath) 57 | if place_category in target_places[0]: 58 | output_path = os.path.join(output_subfolder, 'land', place_name) 59 | elif place_category in target_places[1]: 60 | output_path = os.path.join(output_subfolder, 'water', place_name) 61 | else: 62 | raise Exception(f'Category {place_category} not found') 63 | shutil.copyfile(place_path, output_path) 64 | -------------------------------------------------------------------------------- /datasets/generate_waterbird.py: -------------------------------------------------------------------------------- 1 | import os 2 | import numpy as np 3 | import random 4 | import pandas as pd 5 | from PIL import Image 6 | from tqdm import tqdm 7 | from dataset_utils import crop_and_resize, combine_and_mask 8 | 9 | ################ Paths and other configs - Set these ################################# 10 | cub_dir = '/nobackup-slow/dataset/CUB_200_2011' 11 | places_dir = '/nobackup-slow/dataset/places365_standard' 12 | output_dir = 'datasets/' 13 | 14 | target_places = [ 15 | ['bamboo_forest', 'forest/broadleaf'], # Land backgrounds 16 | ['ocean', 'lake/natural']] # Water backgrounds 17 | 18 | val_frac = 0.2 # What fraction of the training data to use as validation 19 | confounder_strength = 0.9 # Determines relative size of majority vs. minority groups 20 | dataset_name = 'waterbird_complete'+"{:0.2f}".format(confounder_strength)[-2:]+'_forest2water2' 21 | ###################################################################################### 22 | 23 | images_path = os.path.join(cub_dir, 'images.txt') 24 | 25 | df = pd.read_csv( 26 | images_path, 27 | sep=" ", 28 | header=None, 29 | names=['img_id', 'img_filename'], 30 | index_col='img_id') 31 | 32 | ### Set up labels of waterbirds vs. landbirds 33 | # We consider water birds = seabirds and waterfowl. 
34 | species = np.unique([img_filename.split('/')[0].split('.')[1].lower() for img_filename in df['img_filename']]) 35 | water_birds_list = [ 36 | 'Albatross', # Seabirds 37 | 'Auklet', 38 | 'Cormorant', 39 | 'Frigatebird', 40 | 'Fulmar', 41 | 'Gull', 42 | 'Jaeger', 43 | 'Kittiwake', 44 | 'Pelican', 45 | 'Puffin', 46 | 'Tern', 47 | 'Gadwall', # Waterfowl 48 | 'Grebe', 49 | 'Mallard', 50 | 'Merganser', 51 | 'Guillemot', 52 | 'Pacific_Loon' 53 | ] 54 | 55 | water_birds = {} 56 | for species_name in species: 57 | water_birds[species_name] = 0 58 | for water_bird in water_birds_list: 59 | if water_bird.lower() in species_name: 60 | water_birds[species_name] = 1 61 | species_list = [img_filename.split('/')[0].split('.')[1].lower() for img_filename in df['img_filename']] 62 | df['y'] = [water_birds[species] for species in species_list] 63 | 64 | ### Assign train/test/valid splits 65 | # In the original CUB dataset split, split = 0 is test and split = 1 is train 66 | # We want to change it to 67 | # split = 0 is train, 68 | # split = 1 is val, 69 | # split = 2 is test 70 | 71 | train_test_df = pd.read_csv( 72 | os.path.join(cub_dir, 'train_test_split.txt'), 73 | sep=" ", 74 | header=None, 75 | names=['img_id', 'split'], 76 | index_col='img_id') 77 | 78 | df = df.join(train_test_df, on='img_id') 79 | test_ids = df.loc[df['split'] == 0].index 80 | train_ids = np.array(df.loc[df['split'] == 1].index) 81 | val_ids = np.random.choice( 82 | train_ids, 83 | size=int(np.round(val_frac * len(train_ids))), 84 | replace=False) 85 | 86 | df.loc[train_ids, 'split'] = 0 87 | df.loc[val_ids, 'split'] = 1 88 | df.loc[test_ids, 'split'] = 2 89 | 90 | ### Assign confounders (place categories) 91 | 92 | # Confounders are set up as the following: 93 | # Y = 0, C = 0: confounder_strength 94 | # Y = 0, C = 1: 1 - confounder_strength 95 | # Y = 1, C = 0: 1 - confounder_strength 96 | # Y = 1, C = 1: confounder_strength 97 | 98 | df['place'] = 0 99 | train_ids = np.array(df.loc[df['split'] == 
0].index) 100 | val_ids = np.array(df.loc[df['split'] == 1].index) 101 | test_ids = np.array(df.loc[df['split'] == 2].index) 102 | for split_idx, ids in enumerate([train_ids, val_ids, test_ids]): 103 | for y in (0, 1): 104 | if split_idx == 0: # train 105 | if y == 0: 106 | pos_fraction = 1 - confounder_strength 107 | else: 108 | pos_fraction = confounder_strength 109 | else: 110 | pos_fraction = 0.5 111 | subset_df = df.loc[ids, :] 112 | y_ids = np.array((subset_df.loc[subset_df['y'] == y]).index) 113 | pos_place_ids = np.random.choice( 114 | y_ids, 115 | size=int(np.round(pos_fraction * len(y_ids))), 116 | replace=False) 117 | df.loc[pos_place_ids, 'place'] = 1 118 | 119 | for split, split_label in [(0, 'train'), (1, 'val'), (2, 'test')]: 120 | print(f"{split_label}:") 121 | split_df = df.loc[df['split'] == split, :] 122 | print(f"waterbirds are {np.mean(split_df['y']):.3f} of the examples") 123 | print(f"y = 0, c = 0: {np.mean(split_df.loc[split_df['y'] == 0, 'place'] == 0):.3f}, n = {np.sum((split_df['y'] == 0) & (split_df['place'] == 0))}") 124 | print(f"y = 0, c = 1: {np.mean(split_df.loc[split_df['y'] == 0, 'place'] == 1):.3f}, n = {np.sum((split_df['y'] == 0) & (split_df['place'] == 1))}") 125 | print(f"y = 1, c = 0: {np.mean(split_df.loc[split_df['y'] == 1, 'place'] == 0):.3f}, n = {np.sum((split_df['y'] == 1) & (split_df['place'] == 0))}") 126 | print(f"y = 1, c = 1: {np.mean(split_df.loc[split_df['y'] == 1, 'place'] == 1):.3f}, n = {np.sum((split_df['y'] == 1) & (split_df['place'] == 1))}") 127 | 128 | ### Assign places to train, val, and test set 129 | place_ids_df = pd.read_csv( 130 | os.path.join(places_dir, 'categories_places365.txt'), 131 | sep=" ", 132 | header=None, 133 | names=['place_name', 'place_id'], 134 | index_col='place_id') 135 | 136 | target_place_ids = [] 137 | 138 | for idx, target_places in enumerate(target_places): 139 | place_filenames = [] 140 | 141 | for target_place in target_places: 142 | target_place_full = 
f'/{target_place[0]}/{target_place}'
143 |         assert (np.sum(place_ids_df['place_name'] == target_place_full) == 1)
144 |         target_place_ids.append(place_ids_df.index[place_ids_df['place_name'] == target_place_full][0])
145 |         print(f'train category {idx} {target_place_full} has id {target_place_ids[idx]}')
146 | 
147 |         # Read place filenames associated with target_place
148 |         place_filenames += [
149 |             f'/{target_place[0]}/{target_place}/{filename}' for filename in os.listdir(
150 |                 os.path.join(places_dir, 'data_large', 'train', target_place[0], target_place))
151 |             if filename.endswith('.jpg')]
152 | 
153 |     random.shuffle(place_filenames)
154 | 
155 |     # Assign each filename to an image
156 |     indices = (df.loc[:, 'place'] == idx)
157 |     assert len(place_filenames) >= np.sum(indices),\
158 |         f"Not enough places ({len(place_filenames)}) to fit the dataset ({np.sum(df.loc[:, 'place'] == idx)})"
159 |     df.loc[indices, 'place_filename'] = place_filenames[:np.sum(indices)]
160 | 
161 | ### Write dataset to disk
162 | output_subfolder = os.path.join(output_dir, dataset_name)
163 | os.makedirs(output_subfolder, exist_ok=True)
164 | 
165 | df.to_csv(os.path.join(output_subfolder, 'metadata.csv'))
166 | 
167 | for i in tqdm(df.index):
168 |     # Load bird image and segmentation
169 |     img_path = os.path.join(cub_dir, 'images', df.loc[i, 'img_filename'])
170 |     seg_path = os.path.join(cub_dir, 'segmentations', df.loc[i, 'img_filename'].replace('.jpg','.png'))
171 |     img_np = np.asarray(Image.open(img_path).convert('RGB'))
172 |     seg_np = np.asarray(Image.open(seg_path).convert('RGB')) / 255
173 | 
174 |     # Load place background
175 |     # Strip the leading '/' from the stored place_filename
176 |     place_path = os.path.join(places_dir, 'data_large', 'train', df.loc[i, 'place_filename'][1:])
177 |     place = Image.open(place_path).convert('RGB')
178 | 
179 |     img_black = Image.fromarray(np.around(img_np * seg_np).astype(np.uint8))
180 |     combined_img = combine_and_mask(place, seg_np, img_black)
181 | 
182 |     output_path = 
os.path.join(output_subfolder, df.loc[i, 'img_filename']) 183 | os.makedirs('/'.join(output_path.split('/')[:-1]), exist_ok=True) 184 | 185 | combined_img.save(output_path) 186 | -------------------------------------------------------------------------------- /main.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/deeplearning-wisc/Spurious_OOD/7403c95a134aa5c02783cd932cd031d4024c4566/main.png -------------------------------------------------------------------------------- /models/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import -------------------------------------------------------------------------------- /models/resnet.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import numpy as np 3 | import torch.nn as nn 4 | import torch.nn.functional as F 5 | 6 | model_urls = { 7 | 'resnet18': 'https://download.pytorch.org/models/resnet18-f37072fd.pth', 8 | 'resnet34': 'https://download.pytorch.org/models/resnet34-b627a593.pth', 9 | 'resnet50': 'https://download.pytorch.org/models/resnet50-0676ba61.pth', 10 | 'resnet101': 'https://download.pytorch.org/models/resnet101-63fe2227.pth', 11 | 'resnet152': 'https://download.pytorch.org/models/resnet152-394f9c45.pth', 12 | 'resnext50_32x4d': 'https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth', 13 | 'resnext101_32x8d': 'https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth', 14 | 'wide_resnet50_2': 'https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth', 15 | 'wide_resnet101_2': 'https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth', 16 | } 17 | 18 | class Identity(nn.Module): 19 | def __init__(self): 20 | super(Identity, self).__init__() 21 | 22 | def forward(self, x): 23 | return x 24 | 25 | 26 | def conv3x3(in_planes, out_planes, stride=1): 27 | return 
nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride, padding=1, bias=False)
28 | 
29 | class BasicBlock(nn.Module):
30 |     expansion = 1
31 | 
32 |     def __init__(self, in_planes, planes, stride=1):
33 |         super(BasicBlock, self).__init__()
34 |         self.conv1 = conv3x3(in_planes, planes, stride)
35 |         self.bn1 = nn.BatchNorm2d(planes)
36 |         self.conv2 = conv3x3(planes, planes)
37 |         self.bn2 = nn.BatchNorm2d(planes)
38 | 
39 |         self.downsample = nn.Sequential()
40 |         if stride != 1 or in_planes != self.expansion*planes:
41 |             self.downsample = nn.Sequential(
42 |                 nn.Conv2d(in_planes, self.expansion*planes, kernel_size=1, stride=stride, bias=False),
43 |                 nn.BatchNorm2d(self.expansion*planes)
44 |             )
45 | 
46 |     def forward(self, x):
47 |         t = self.conv1(x)
48 |         out = F.relu(self.bn1(t))
49 |         t = self.conv2(out)
50 |         out = self.bn2(t)  # reuse t instead of recomputing self.conv2(out)
51 |         t = self.downsample(x)
52 |         out += t
53 |         out = F.relu(out)
54 | 
55 |         return out
56 | 
57 | class ResNet(nn.Module):
58 |     def __init__(self, block, num_blocks, num_classes=2):
59 |         super(ResNet, self).__init__()
60 |         self.in_planes = 64
61 | 
62 |         # self.conv1 = conv3x3(3,64)
63 |         # for large input size
64 |         self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
65 |                                bias=False)
66 | 
67 |         self.bn1 = nn.BatchNorm2d(64)
68 |         self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
69 |         # end
70 |         self.layer1 = self._make_layer(block, 64, num_blocks[0], stride=1)
71 |         self.layer2 = self._make_layer(block, 128, num_blocks[1], stride=2)
72 |         self.layer3 = self._make_layer(block, 256, num_blocks[2], stride=2)
73 |         self.layer4 = self._make_layer(block, 512, num_blocks[3], stride=2)
74 |         self.linear = nn.Linear(512*block.expansion, num_classes)
75 | 
76 |         self.avgpool = nn.AdaptiveAvgPool2d((1, 1))
77 | 
78 |     def _make_layer(self, block, planes, num_blocks, stride):
79 |         strides = [stride] + [1]*(num_blocks-1)
80 |         layers = []
81 |         for stride in strides:
82 |             layers.append(block(self.in_planes, planes, stride))
83 | 
self.in_planes = planes * block.expansion
84 |         return nn.Sequential(*layers)
85 | 
86 |     def forward(self, x):
87 |         out = F.relu(self.bn1(self.conv1(x)))
88 |         out = self.maxpool(out)
89 |         out = self.layer1(out)
90 |         out = self.layer2(out)
91 |         out = self.layer3(out)
92 |         out = self.layer4(out)
93 |         out = self.avgpool(out)
94 |         out = out.view(out.size(0), -1)
95 |         y = self.linear(out)
96 |         return out, y
97 | 
98 |     # function to extract the features at multiple layers
99 |     def feature_list(self, x):
100 |         out_list = []
101 |         out = F.relu(self.bn1(self.conv1(x)))
102 |         out = self.maxpool(out)
103 |         out_list.append(out)
104 |         out = self.layer1(out)
105 |         out_list.append(out)
106 |         out = self.layer2(out)
107 |         out_list.append(out)
108 |         out = self.layer3(out)
109 |         out_list.append(out)
110 |         out = self.layer4(out)
111 |         out_list.append(out)
112 |         out = self.avgpool(out)
113 |         out = out.view(out.size(0), -1)
114 |         y = self.linear(out)
115 |         return y, out_list
116 | 
117 |     # function to extract a specific feature
118 |     def intermediate_forward(self, x, layer_index):
119 |         out = F.relu(self.bn1(self.conv1(x)))
120 |         out = self.maxpool(out)
121 |         if layer_index == 1:
122 |             out = self.layer1(out)
123 |         elif layer_index == 2:
124 |             out = self.layer1(out)
125 |             out = self.layer2(out)
126 |         elif layer_index == 3:
127 |             out = self.layer1(out)
128 |             out = self.layer2(out)
129 |             out = self.layer3(out)
130 |         elif layer_index == 4:
131 |             out = self.layer1(out)
132 |             out = self.layer2(out)
133 |             out = self.layer3(out)
134 |             out = self.layer4(out)
135 |         return out
136 | 
137 |     def load(self, path="resnet_svhn.pth"):
138 |         tm = torch.load(path, map_location="cpu")
139 |         self.load_state_dict(tm)
140 | 
141 | 
142 | 
143 | 
144 | def load_model(pretrained = False):
145 |     '''
146 |     load resnet18
147 |     '''
148 |     torch_model = ResNet(BasicBlock, [2,2,2,2], num_classes=2)
149 |     arch = 'resnet18'
150 |     if pretrained:
151 |         model_dict = torch_model.state_dict()
152 |         pretrained_dict = 
torch.hub.load_state_dict_from_url(model_urls[arch], 153 | progress = True) 154 | # 1. filter out unnecessary keys 155 | pretrained_dict = {k: v for k, v in pretrained_dict.items() if k in model_dict} 156 | # 2. overwrite entries in the existing state dict 157 | model_dict.update(pretrained_dict) 158 | # 3. load the new state dict 159 | torch_model.load_state_dict(model_dict) 160 | print("ResNet Loading Done") 161 | return torch_model 162 | if __name__ == "__main__": 163 | load_model(True) 164 | -------------------------------------------------------------------------------- /present_results.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | from utils import anom_utils 3 | import os 4 | import argparse 5 | from collections import defaultdict 6 | 7 | 8 | parser = argparse.ArgumentParser(description='Present OOD Detection metrics for Energy-score') 9 | parser.add_argument('--name', '-n', default = 'erm_rebuttal', type=str, 10 | help='name of experiment') 11 | parser.add_argument('--exp-name', default = 'erm_new_0.7', type=str, 12 | help='help identify checkpoint') 13 | parser.add_argument('--in-dataset', default='celebA', type=str, help='in-distribution dataset e.g. 
color_mnist') 14 | parser.add_argument('--test_epochs', "-e", default = "15 20 25", type=str, 15 | help='# epoch to test performance') 16 | args = parser.parse_args() 17 | 18 | def main(): 19 | if args.in_dataset == "color_mnist" or args.in_dataset == "color_mnist_multi": 20 | out_datasets = ['partial_color_mnist_0&1', 'gaussian', 'dtd', 'iSUN', 'LSUN_resize'] 21 | elif args.in_dataset == "waterbird": 22 | out_datasets = ['gaussian', 'placesbg', 'SVHN', 'iSUN', 'LSUN_resize', 'dtd'] 23 | elif args.in_dataset == "celebA": 24 | out_datasets = ['celebA_ood', 'gaussian', 'SVHN', 'iSUN', 'LSUN_resize'] 25 | fprs = dict() 26 | for test_epoch in args.test_epochs.split(): 27 | all_results_ntom = [] 28 | save_dir = f"./energy_results/{args.in_dataset}/{args.name}/{args.exp_name}" 29 | with open(os.path.join(save_dir, f'energy_score_at_epoch_{test_epoch}.npy'), 'rb') as f: 30 | id_sum_energy = np.load(f) 31 | all_results = defaultdict(int) 32 | for out_dataset in out_datasets: 33 | with open(os.path.join(save_dir, f'energy_score_{out_dataset}_at_epoch_{test_epoch}.npy'), 'rb') as f: 34 | ood_sum_energy = np.load(f) 35 | auroc, aupr, fpr = anom_utils.get_and_print_results(-1 * id_sum_energy, -1 * ood_sum_energy, f"{out_dataset}", f" Energy Sum at epoch {test_epoch}") 36 | results = cal_metric(known = -1 * id_sum_energy, novel = -1* ood_sum_energy, method = "energy sum") 37 | all_results_ntom.append(results) 38 | all_results["AUROC"] += auroc 39 | all_results["AUPR"] += aupr 40 | all_results["FPR95"] += fpr 41 | print("Avg FPR95: ", round(100 * all_results["FPR95"]/len(out_datasets),2)) 42 | print("Avg AUROC: ", round(all_results["AUROC"]/len(out_datasets),4)) 43 | print("Avg AUPR: ", round(all_results["AUPR"]/len(out_datasets),4)) 44 | fprs[test_epoch] = 100 * all_results["FPR95"]/len(out_datasets) 45 | avg_results = compute_average_results(all_results_ntom) 46 | print_results(avg_results, args.in_dataset, "All", args.name, "energy sum") 47 | 48 | def print_results(results, 
in_dataset, out_dataset, name, method): 49 | mtypes = ['FPR', 'DTERR', 'AUROC', 'AUIN', 'AUOUT'] 50 | 51 | print('in_distribution: ' + in_dataset) 52 | print('out_distribution: '+ out_dataset) 53 | print('Model Name: ' + name) 54 | print('') 55 | 56 | print(' OOD detection method: ' + method) 57 | for mtype in mtypes: 58 | print(' {mtype:6s}'.format(mtype=mtype), end='') 59 | print('\n{val:6.2f}'.format(val=100.*results['FPR']), end='') 60 | print(' {val:6.2f}'.format(val=100.*results['DTERR']), end='') 61 | print(' {val:6.2f}'.format(val=100.*results['AUROC']), end='') 62 | print(' {val:6.2f}'.format(val=100.*results['AUIN']), end='') 63 | print(' {val:6.2f}\n'.format(val=100.*results['AUOUT']), end='') 64 | print('') 65 | 66 | def cal_metric(known, novel, method): 67 | tp, fp, fpr_at_tpr95 = get_curve(known, novel, method) 68 | results = dict() 69 | 70 | # FPR 71 | mtype = 'FPR' 72 | results[mtype] = fpr_at_tpr95 73 | 74 | # AUROC 75 | mtype = 'AUROC' 76 | tpr = np.concatenate([[1.], tp/tp[0], [0.]]) 77 | fpr = np.concatenate([[1.], fp/fp[0], [0.]]) 78 | results[mtype] = -np.trapz(1.-fpr, tpr) 79 | 80 | # DTERR 81 | mtype = 'DTERR' 82 | results[mtype] = ((tp[0] - tp + fp) / (tp[0] + fp[0])).min() 83 | 84 | # AUIN 85 | mtype = 'AUIN' 86 | denom = tp+fp 87 | denom[denom == 0.] = -1. 88 | pin_ind = np.concatenate([[True], denom > 0., [True]]) 89 | pin = np.concatenate([[.5], tp/denom, [0.]]) 90 | results[mtype] = -np.trapz(pin[pin_ind], tpr[pin_ind]) 91 | 92 | # AUOUT 93 | mtype = 'AUOUT' 94 | denom = tp[0]-tp+fp[0]-fp 95 | denom[denom == 0.] = -1. 
96 | pout_ind = np.concatenate([[True], denom > 0., [True]]) 97 | pout = np.concatenate([[0.], (fp[0]-fp)/denom, [.5]]) 98 | results[mtype] = np.trapz(pout[pout_ind], 1.-fpr[pout_ind]) 99 | 100 | return results 101 | 102 | def get_curve(known, novel, method): 103 | tp, fp = dict(), dict() 104 | fpr_at_tpr95 = dict() 105 | 106 | known.sort() 107 | novel.sort() 108 | 109 | end = np.max([np.max(known), np.max(novel)]) 110 | start = np.min([np.min(known),np.min(novel)]) 111 | 112 | all = np.concatenate((known, novel)) 113 | all.sort() 114 | 115 | num_k = known.shape[0] 116 | num_n = novel.shape[0] 117 | 118 | if method == 'row': 119 | threshold = -0.5 120 | else: 121 | threshold = known[round(0.05 * num_k)] 122 | 123 | tp = -np.ones([num_k+num_n+1], dtype=int) 124 | fp = -np.ones([num_k+num_n+1], dtype=int) 125 | tp[0], fp[0] = num_k, num_n 126 | k, n = 0, 0 127 | for l in range(num_k+num_n): 128 | if k == num_k: 129 | tp[l+1:] = tp[l] 130 | fp[l+1:] = np.arange(fp[l]-1, -1, -1) 131 | break 132 | elif n == num_n: 133 | tp[l+1:] = np.arange(tp[l]-1, -1, -1) 134 | fp[l+1:] = fp[l] 135 | break 136 | else: 137 | if novel[n] < known[k]: 138 | n += 1 139 | tp[l+1] = tp[l] 140 | fp[l+1] = fp[l] - 1 141 | else: 142 | k += 1 143 | tp[l+1] = tp[l] - 1 144 | fp[l+1] = fp[l] 145 | 146 | j = num_k+num_n-1 147 | for l in range(num_k+num_n-1): 148 | if all[j] == all[j-1]: 149 | tp[j] = tp[j+1] 150 | fp[j] = fp[j+1] 151 | j -= 1 152 | 153 | fpr_at_tpr95 = np.sum(novel > threshold) / float(num_n) 154 | 155 | return tp, fp, fpr_at_tpr95 156 | 157 | def compute_average_results(all_results): 158 | mtypes = ['FPR', 'DTERR', 'AUROC', 'AUIN', 'AUOUT'] 159 | avg_results = dict() 160 | 161 | for mtype in mtypes: 162 | avg_results[mtype] = 0.0 163 | 164 | for results in all_results: 165 | for mtype in mtypes: 166 | avg_results[mtype] += results[mtype] 167 | 168 | print("len of all results", float(len(all_results))) 169 | for mtype in mtypes: 170 | avg_results[mtype] /= float(len(all_results)) 
171 | 172 | return avg_results 173 | 174 | 175 | if __name__ == '__main__': 176 | main() 177 | -------------------------------------------------------------------------------- /test_bg.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | import os 3 | import logging 4 | 5 | import numpy as np 6 | import torch 7 | import torchvision 8 | import torch.nn.parallel 9 | import torch.optim 10 | import torch.utils.data 11 | import torchvision.transforms as transforms 12 | from torch.utils.data import DataLoader 13 | 14 | 15 | from models.resnet import load_model 16 | from datasets.color_mnist import get_biased_mnist_dataloader 17 | from datasets.cub_dataset import WaterbirdDataset 18 | from datasets.celebA_dataset import get_celebA_dataloader, celebAOodDataset 19 | from utils import AverageMeter, accuracy 20 | import utils.svhn_loader as svhn 21 | from datasets.gaussian_dataset import GaussianDataset 22 | 23 | parser = argparse.ArgumentParser(description='OOD Detection Evaluation based on Energy-score') 24 | parser.add_argument('--name', default = 'erm_rebuttal', type=str, help='help identify checkpoint') 25 | parser.add_argument('--exp_name', '-n', default = 'erm_new_0.7', type=str, help='name of experiment') 26 | parser.add_argument('--in-dataset', default="celebA", type=str, help='name of the in-distribution dataset') 27 | parser.add_argument('--root_dir', required = True, type=str, help='the root directory that contains the OOD test datasets') 28 | parser.add_argument('--model-arch', default='resnet18', type=str, help='model architecture e.g. 
resnet18')
29 | parser.add_argument('--method', default='erm', type=str, help='method used for model training')
30 | parser.add_argument('--print-freq', '-p', default=10, type=int, help='print frequency (default: 10)')
31 | parser.add_argument('--domain-num', default=4, type=int,
32 |                     help='the number of environments for model training')
33 | parser.add_argument('-b', '--batch-size', default= 64, type=int,
34 |                     help='mini-batch size (default: 64) used for training id and ood')
35 | parser.add_argument('--num-classes', default=2, type=int,
36 |                     help='number of classes for model training')
37 | parser.add_argument('--ood-batch-size', default= 64, type=int,
38 |                     help='mini-batch size (default: 64) used for testing')
39 | parser.add_argument('--data_label_correlation', default= 0.7, type=float,
40 |                     help='data_label_correlation')
41 | parser.add_argument('--test_epochs', "-e", default = "15 20 25", type=str,
42 |                     help='epochs at which to test performance')
43 | parser.add_argument('--log_name',
44 |                     help='Name of the Log File', type = str, default = "info_val.log")
45 | parser.add_argument('--base-dir', default='output/ood_scores', type=str, help='result directory')
46 | parser.add_argument('--gpu-ids', default='6', type=str,
47 |                     help='id(s) for CUDA_VISIBLE_DEVICES')
48 | parser.add_argument('--manualSeed', type=int, help='manual seed')
49 | parser.add_argument('--multi-gpu', default=False, type=bool)
50 | parser.add_argument('--local_rank', default=-1, type=int,
51 |                     help='rank for the current node')
52 | 
53 | args = parser.parse_args()
54 | 
55 | state = {k: v for k, v in args._get_kwargs()}
56 | directory = "checkpoints/{in_dataset}/{name}/".format(in_dataset=args.in_dataset, name=args.name)
57 | if not os.path.exists(directory):
58 |     os.makedirs(directory)
59 | save_state_file = os.path.join(directory, 'test_args.txt')
60 | fw = open(save_state_file, 'w')
61 | print(state, file=fw)
62 | fw.close()
63 | 
64 | os.environ['CUDA_VISIBLE_DEVICES'] = args.gpu_ids
65 | if 
torch.cuda.is_available(): 66 | torch.cuda.set_device(args.local_rank) 67 | device = torch.device(f"cuda" if torch.cuda.is_available() else "cpu") 68 | 69 | 70 | def get_ood_energy(args, model, val_loader, epoch, log, method): 71 | in_energy = AverageMeter() 72 | model.eval() 73 | init = True 74 | log.debug("######## Start collecting energy score ########") 75 | with torch.no_grad(): 76 | for i, (images, labels) in enumerate(val_loader): 77 | images = images.cuda() 78 | _, outputs = model(images) 79 | e_s = -torch.logsumexp(outputs, dim=1) 80 | e_s = e_s.data.cpu().numpy() 81 | in_energy.update(e_s.mean(), len(labels)) 82 | if init: 83 | sum_energy = e_s 84 | init = False 85 | else: 86 | sum_energy = np.concatenate((sum_energy, e_s)) 87 | if i % args.print_freq == 0: 88 | log.debug('Epoch: [{0}] Batch#[{1}/{2}]\t' 89 | 'Energy Sum {in_energy.val:.4f} ({in_energy.avg:.4f})'.format( 90 | epoch, i, len(val_loader), in_energy=in_energy)) 91 | return sum_energy 92 | 93 | def get_id_energy(args, model, val_loader, epoch, log, method): 94 | in_energy = AverageMeter() 95 | top1 = AverageMeter() 96 | env_E = {0:AverageMeter(), 97 | 1: AverageMeter(), 98 | 2: AverageMeter(), 99 | 3: AverageMeter() } 100 | NUM_ENV = 4 101 | all_preds = torch.tensor([]) 102 | all_targets = torch.tensor([]) 103 | energy = np.empty(0) 104 | energy_grey = np.empty(0) 105 | energy_nongrey = np.empty(0) 106 | 107 | model.eval() 108 | log.debug("######## Start collecting energy score ########") 109 | with torch.no_grad(): 110 | for i, (images, labels, envs) in enumerate(val_loader): 111 | images = images.cuda() 112 | _, outputs = model(images) 113 | all_targets = torch.cat((all_targets, labels),dim=0) 114 | all_preds = torch.cat((all_preds, outputs.argmax(dim=1).cpu()),dim=0) 115 | prec1 = accuracy(outputs.cpu().data, labels, topk=(1,))[0] 116 | top1.update(prec1, images.size(0)) 117 | e_s = -torch.logsumexp(outputs, dim=1) 118 | e_s = e_s.data.cpu().numpy() 119 | for j in range(NUM_ENV): 120 | 
env_E[j].update(e_s[envs == j].mean(), len(labels[envs == j])) 121 | in_energy.update(e_s.mean(), len(labels)) 122 | energy = np.concatenate((energy, e_s)) 123 | energy_grey = np.concatenate((energy_grey, e_s[labels == 1])) 124 | energy_nongrey = np.concatenate((energy_nongrey, e_s[labels == 0])) 125 | if i % args.print_freq == 0: 126 | if args.in_dataset == 'color_mnist': 127 | log.debug('Epoch: [{0}] Batch#[{1}/{2}]\t' 128 | 'ID Energy {in_energy.val:.4f} ({in_energy.avg:.4f})\t' 129 | 'red 0 {env_E[0].val:.4f} ({env_E[0].avg:.4f})\t' 130 | 'green 0 {env_E[1].val:.4f} ({env_E[1].avg:.4f})\t' 131 | 'red 1 {env_E[2].val:.4f} ({env_E[2].avg:.4f})\t' 132 | 'green 1 {env_E[3].val:.4f} ({env_E[3].avg:.4f})\t'.format( 133 | epoch, i, len(val_loader), in_energy=in_energy, env_E = env_E)) 134 | elif args.in_dataset == 'celebA': 135 | log.debug('Epoch: [{0}] Batch#[{1}/{2}]\t' 136 | 'ID Energy {in_energy.val:.4f} ({in_energy.avg:.4f})\t' 137 | 'nongrey hair F {env_E[0].val:.4f} ({env_E[0].avg:.4f})\t' 138 | 'nongrey hair M {env_E[1].val:.4f} ({env_E[1].avg:.4f})\t' 139 | 'gray hair F {env_E[2].val:.4f} ({env_E[2].avg:.4f})\t' 140 | 'gray hair M {env_E[3].val:.4f} ({env_E[3].avg:.4f})\t'.format( 141 | epoch, i, len(val_loader), in_energy=in_energy, env_E = env_E)) 142 | log.debug(' * Prec@1 {top1.avg:.3f}'.format(top1=top1)) 143 | return energy, energy_grey, energy_nongrey 144 | 145 | 146 | def get_ood_loader(args, out_dataset, in_dataset = 'color_mnist'): 147 | # for mnist 148 | small_transform = transforms.Compose([ 149 | transforms.Resize(32), 150 | transforms.CenterCrop(32), 151 | transforms.ToTensor(), 152 | transforms.Normalize(mean=(0.5, 0.5, 0.5), 153 | std=(0.5, 0.5, 0.5))]) 154 | # for celebA 155 | scale = 256.0/224.0 156 | target_resolution = (224, 224) 157 | large_transform = transforms.Compose([ 158 | transforms.Resize((int(target_resolution[0]*scale), int(target_resolution[1]*scale))), 159 | transforms.CenterCrop(target_resolution), 160 | 
transforms.ToTensor(), 161 | transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]) 162 | ]) 163 | 164 | root_dir = args.root_dir 165 | 166 | if in_dataset == 'color_mnist': 167 | if out_dataset == 'gaussian': 168 | testsetout = GaussianDataset(dataset_size =10000, img_size = 32, 169 | transform=transforms.Normalize(mean=(0.5, 0.5, 0.5), 170 | std=(0.5, 0.5, 0.5))) 171 | else: 172 | testsetout = torchvision.datasets.ImageFolder(f"{root_dir}/{out_dataset}", 173 | transform=small_transform) 174 | subset = torch.utils.data.Subset(testsetout, np.random.choice(len(testsetout), 2000, replace=False)) 175 | testloaderOut = torch.utils.data.DataLoader(subset, batch_size=args.ood_batch_size, 176 | shuffle=True, num_workers=4) 177 | elif in_dataset == 'waterbird' or in_dataset == 'celebA': 178 | if out_dataset == "SVHN": 179 | testsetout = svhn.SVHN(f"{root_dir}/{out_dataset}", split='test', 180 | transform=large_transform, download=False) 181 | elif out_dataset == 'gaussian': 182 | testsetout = GaussianDataset(dataset_size =10000, img_size = 224, 183 | transform=transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])) 184 | elif out_dataset == 'celebA_ood': 185 | testsetout = celebAOodDataset() 186 | else: 187 | testsetout = torchvision.datasets.ImageFolder(f"{root_dir}/{out_dataset}", 188 | transform=large_transform) 189 | if out_dataset == 'celebA_ood': 190 | subset = torch.utils.data.Subset(testsetout, np.random.choice(len(testsetout), 2000, replace=True)) 191 | else: 192 | subset = torch.utils.data.Subset(testsetout, np.random.choice(len(testsetout), 2000, replace=False)) 193 | testloaderOut = torch.utils.data.DataLoader(subset, batch_size=args.ood_batch_size, 194 | shuffle=True, num_workers=4) 195 | else: 196 | testsetout = torchvision.datasets.ImageFolder("datasets/ood_datasets/{}".format(out_dataset), 197 | transform= small_transform) 198 | testloaderOut = torch.utils.data.DataLoader(testsetout, batch_size=args.ood_batch_size, 199 | 
shuffle=True, num_workers=4) 200 | return testloaderOut 201 | 202 | 203 | def main(): 204 | log = logging.getLogger(__name__) 205 | formatter = logging.Formatter('%(asctime)s : %(message)s') 206 | fileHandler = logging.FileHandler(os.path.join(directory, args.log_name), mode='w') 207 | fileHandler.setFormatter(formatter) 208 | streamHandler = logging.StreamHandler() 209 | streamHandler.setFormatter(formatter) 210 | log.setLevel(logging.DEBUG) 211 | log.addHandler(fileHandler) 212 | log.addHandler(streamHandler) 213 | 214 | 215 | if args.in_dataset == "color_mnist": 216 | val_loader = get_biased_mnist_dataloader(args, root = './datasets/MNIST', batch_size=args.batch_size, 217 | data_label_correlation= args.data_label_correlation, 218 | n_confusing_labels= args.num_classes - 1, 219 | train=False, partial=True, cmap = "1") 220 | elif args.in_dataset == "waterbird": 221 | val_dataset = WaterbirdDataset(data_correlation=args.data_label_correlation, split='test') 222 | val_loader = DataLoader(val_dataset, batch_size=args.batch_size, shuffle=True) 223 | elif args.in_dataset == "celebA": 224 | val_loader = get_celebA_dataloader(args, split='test') 225 | 226 | # create model 227 | if args.model_arch == 'resnet18': 228 | model = load_model() 229 | 230 | model = model.cuda() 231 | 232 | test_epochs = args.test_epochs.split() 233 | if args.in_dataset == 'color_mnist': 234 | out_datasets = ['partial_color_mnist_0&1', 'gaussian', 'dtd', 'iSUN', 'LSUN_resize'] 235 | elif args.in_dataset == 'waterbird': 236 | out_datasets = [ 'gaussian', 'placesbg', 'SVHN', 'iSUN', 'LSUN_resize', 'dtd'] 237 | elif args.in_dataset == 'color_mnist_multi': 238 | out_datasets = ['partial_color_mnist_0&1'] 239 | elif args.in_dataset == 'celebA': 240 | out_datasets = ['celebA_ood', 'gaussian', 'SVHN', 'iSUN', 'LSUN_resize'] 241 | 242 | if args.in_dataset == 'color_mnist': 243 | cpts_directory = "./checkpoints/{in_dataset}/{name}/{exp}".format(in_dataset=args.in_dataset, name=args.name, 
exp=args.exp_name) 244 | else: 245 | cpts_directory = "./checkpoints/{in_dataset}/{name}/{exp}".format(in_dataset=args.in_dataset, name=args.name, exp=args.exp_name) 246 | 247 | for test_epoch in test_epochs: 248 | cpts_dir = os.path.join(cpts_directory, "checkpoint_{epochs}.pth.tar".format(epochs=test_epoch)) 249 | checkpoint = torch.load(cpts_dir) 250 | state_dict = checkpoint['state_dict_model'] 251 | if torch.cuda.device_count() == 1: 252 | new_state_dict = {} 253 | for k, v in state_dict.items(): 254 | k = k.replace("module.", "") 255 | new_state_dict[k] = v 256 | state_dict = new_state_dict 257 | model.load_state_dict(state_dict) 258 | model.eval() 259 | model.cuda() 260 | save_dir = f"./energy_results/{args.in_dataset}/{args.name}/{args.exp_name}" 261 | if not os.path.exists(save_dir): 262 | os.makedirs(save_dir) 263 | print("processing ID dataset") 264 | 265 | #********** normal procedure ********** 266 | id_energy, _, _ = get_id_energy(args, model, val_loader, test_epoch, log, method=args.method) 267 | with open(os.path.join(save_dir, f'energy_score_at_epoch_{test_epoch}.npy'), 'wb') as f: 268 | np.save(f, id_energy) 269 | for out_dataset in out_datasets: 270 | print("processing OOD dataset ", out_dataset) 271 | testloaderOut = get_ood_loader(args, out_dataset, args.in_dataset) 272 | ood_energy = get_ood_energy(args, model, testloaderOut, test_epoch, log, method=args.method) 273 | with open(os.path.join(save_dir, f'energy_score_{out_dataset}_at_epoch_{test_epoch}.npy'), 'wb') as f: 274 | np.save(f, ood_energy) 275 | 276 | if __name__ == '__main__': 277 | main() 278 | 279 | -------------------------------------------------------------------------------- /train_bg.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | import os 3 | import time 4 | import random 5 | import logging 6 | import itertools 7 | import numpy as np 8 | import torch 9 | import torch.nn as nn 10 | import torch.backends.cudnn as 
cudnn
11 | import torch.nn.functional as F
12 | from models.resnet import load_model
13 | from utils import AverageMeter, save_checkpoint, accuracy
14 | from datasets.color_mnist import get_biased_mnist_dataloader
15 | from datasets.cub_dataset import get_waterbird_dataloader
16 | from datasets.celebA_dataset import get_celebA_dataloader
17 | import math
18 | 
19 | parser = argparse.ArgumentParser(description=' use resnet (pretrained)')
20 | 
21 | parser.add_argument('--in-dataset', default="celebA", type=str, choices = ['celebA', 'color_mnist', 'waterbird'], help='in-distribution dataset e.g. IN-9')
22 | parser.add_argument('--model-arch', default='resnet18', type=str, help='model architecture e.g. resnet50')
23 | parser.add_argument('--domain-num', default=4, type=int,
24 |                     help='the number of environments for model training')
25 | parser.add_argument('--method', default='erm', type=str, help='method used for model training')
26 | parser.add_argument('--save-epoch', default=5, type=int,
27 |                     help='save the model every save_epoch, default = 5')
28 | parser.add_argument('--print-freq', '-p', default=10, type=int,
29 |                     help='print frequency (default: 10)')
30 | # ID train & val batch size
31 | parser.add_argument('-b', '--batch-size', default=256, type=int,
32 |                     help='mini-batch size (default: 256) used for training')
33 | # training schedule
34 | parser.add_argument('--start-epoch', default=0, type=int,
35 |                     help='manual epoch number (useful on restarts)')
36 | parser.add_argument('--epochs', default=30, type=int,
37 |                     help='number of total epochs to run, default = 30')
38 | parser.add_argument('--lr', '--learning-rate', default=1e-4, type=float,
39 |                     help='initial learning rate')
40 | parser.add_argument('--num-classes', default=2, type=int,
41 |                     help='number of classes for model training')
42 | parser.add_argument('--momentum', default=0.9, type=float, help='momentum')
43 | parser.add_argument('--weight-decay', '--wd', default=0.005, type=float,
44 |                     help='weight 
decay (default: 0.005)') 45 | parser.add_argument('--data_label_correlation', default=0.9, type=float, 46 | help='data_label_correlation') 47 | # saving, naming and logging 48 | parser.add_argument('--exp-name', default = 'erm_new_0.9', type=str, 49 | help='help identify checkpoint') 50 | parser.add_argument('--name', default="erm_rebuttal", type=str, 51 | help='name of experiment') 52 | parser.add_argument('--log_name', type = str, default = "info.log", 53 | help='Name of the Log File') 54 | # Device options 55 | parser.add_argument('--gpu-ids', default='6', type=str, 56 | help='id(s) for CUDA_VISIBLE_DEVICES') 57 | parser.add_argument('--local_rank', default=-1, type=int, 58 | help='rank for the current node') 59 | # Miscs 60 | parser.add_argument('--manualSeed', type=int, help='manual seed') 61 | parser.add_argument('--n-g-nets', default=1, type=int, 62 | help="the number of networks for g_model in ReBias") 63 | parser.add_argument('--penalty-multiplier', default=1.1, type=float, 64 | help="the penalty multiplier used in IRM training") 65 | 66 | parser.add_argument('--cosine', action='store_false', 67 | help='cosine annealing is on by default; pass --cosine to disable it (store_false)') 68 | parser.add_argument('--lr_decay_epochs', type=str, default='15,25', 69 | help='15, 25, 40 for waterbirds; 10, 15, 20 for color_mnist') 70 | parser.add_argument('--lr_decay_rate', type=float, default=0.1, 71 | help='decay rate for learning rate') 72 | 73 | args = parser.parse_args() 74 | 75 | state = {k: v for k, v in args._get_kwargs()} 76 | 77 | directory = "checkpoints/{in_dataset}/{name}/{exp}/".format(in_dataset=args.in_dataset, 78 | name=args.name, exp=args.exp_name) 79 | os.makedirs(directory, exist_ok=True) 80 | save_state_file = os.path.join(directory, 'args.txt') 81 | fw = open(save_state_file, 'w') 82 | print(state, file=fw) 83 | fw.close() 84 | 85 | # CUDA Specification 86 | os.environ['CUDA_VISIBLE_DEVICES'] = args.gpu_ids 87 | if torch.cuda.is_available(): 88 | torch.cuda.set_device(args.local_rank) 89 | device
= torch.device(f"cuda" if torch.cuda.is_available() else "cpu") 90 | 91 | # Set random seed 92 | def set_random_seed(seed): 93 | random.seed(seed) 94 | np.random.seed(seed) 95 | torch.manual_seed(seed) 96 | torch.cuda.manual_seed_all(seed) 97 | 98 | if args.manualSeed is None: 99 | args.manualSeed = random.randint(1, 10000) 100 | set_random_seed(args.manualSeed) 101 | 102 | 103 | def flatten(list_of_lists): 104 | return itertools.chain.from_iterable(list_of_lists) 105 | 106 | def train(model, train_loaders, criterion, optimizer, epoch, log): 107 | """Train for one epoch on the training set""" 108 | batch_time = AverageMeter() 109 | 110 | nat_losses = AverageMeter() 111 | nat_top1 = AverageMeter() 112 | 113 | # switch to train mode 114 | model.train() 115 | end = time.time() 116 | batch_idx = 0 117 | train_loaders = [iter(x) for x in train_loaders] 118 | len_dataloader = 0 119 | for x in train_loaders: 120 | len_dataloader += len(x) 121 | while True: 122 | for loader in train_loaders: 123 | input, target, _ = next(loader, (None, None, None)) 124 | if input is None: 125 | return 126 | input = input.cuda() 127 | target = target.cuda() 128 | 129 | _, nat_output = model(input) 130 | 131 | nat_loss = criterion(nat_output, target) 132 | 133 | # measure accuracy and record loss 134 | nat_prec1 = accuracy(nat_output.data, target, topk=(1,))[0] 135 | nat_losses.update(nat_loss.data, input.size(0)) 136 | nat_top1.update(nat_prec1, input.size(0)) 137 | 138 | # compute gradient and do SGD step 139 | loss = nat_loss 140 | 141 | optimizer.zero_grad() 142 | loss.backward() 143 | optimizer.step() 144 | 145 | # measure elapsed time 146 | batch_time.update(time.time() - end) 147 | end = time.time() 148 | 149 | if batch_idx % 10 == 0: 150 | log.debug('Epoch: [{0}][{1}/{2}]\t' 151 | 'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\t' 152 | 'Loss {loss.val:.4f} ({loss.avg:.4f})\t' 153 | 'Prec@1 {top1.val:.3f} ({top1.avg:.3f})'.format( 154 | epoch, batch_idx, len_dataloader, 
batch_time=batch_time, 155 | loss=nat_losses, top1=nat_top1)) 156 | batch_idx += 1 157 | 158 | def validate(val_loader, model, criterion, epoch, log, method): 159 | """Perform validation on the validation set""" 160 | batch_time = AverageMeter() 161 | losses = AverageMeter() 162 | top1 = AverageMeter() 163 | 164 | # switch to evaluate mode 165 | model.eval() 166 | with torch.no_grad(): 167 | end = time.time() 168 | for i, (input, target, _) in enumerate(val_loader): 169 | input = input.cuda() 170 | target = target.cuda() 171 | # compute output 172 | _, output = model(input) 173 | loss = criterion(output, target) 174 | 175 | # measure accuracy and record loss 176 | prec1 = accuracy(output.data, target, topk=(1,))[0] 177 | losses.update(loss.data, input.size(0)) 178 | top1.update(prec1, input.size(0)) 179 | 180 | # measure elapsed time 181 | batch_time.update(time.time() - end) 182 | end = time.time() 183 | 184 | if i % args.print_freq == 0: 185 | log.debug('Validate: [{0}/{1}]\t' 186 | 'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\t' 187 | 'Loss {loss.val:.4f} ({loss.avg:.4f})\t' 188 | 'Prec@1 {top1.val:.3f} ({top1.avg:.3f})'.format( 189 | i, len(val_loader), batch_time=batch_time, loss=losses, 190 | top1=top1)) 191 | 192 | log.debug(' * Prec@1 {top1.avg:.3f}'.format(top1=top1)) 193 | return top1.avg 194 | 195 | def adjust_learning_rate(args, optimizer, epoch): 196 | lr = args.lr 197 | if args.cosine: 198 | eta_min = lr * (args.lr_decay_rate ** 3) 199 | lr = eta_min + (lr - eta_min) * ( 200 | 1 + math.cos(math.pi * epoch / args.epochs)) / 2 201 | else: 202 | steps = np.sum(epoch > np.asarray(args.lr_decay_epochs)) 203 | if steps > 0: 204 | lr = lr * (args.lr_decay_rate ** steps) 205 | 206 | for param_group in optimizer.param_groups: 207 | param_group['lr'] = lr 208 | 209 | def main(): 210 | 211 | log = logging.getLogger(__name__) 212 | formatter = logging.Formatter('%(asctime)s : %(message)s') 213 | fileHandler = logging.FileHandler(os.path.join(directory, 
args.log_name), mode='w') 214 | fileHandler.setFormatter(formatter) 215 | streamHandler = logging.StreamHandler() 216 | streamHandler.setFormatter(formatter) 217 | log.setLevel(logging.DEBUG) 218 | log.addHandler(fileHandler) 219 | log.addHandler(streamHandler) 220 | 221 | if args.in_dataset == "color_mnist": 222 | train_loader1 = get_biased_mnist_dataloader(args, root = './datasets/MNIST', batch_size=args.batch_size, 223 | data_label_correlation= args.data_label_correlation, 224 | n_confusing_labels= args.num_classes - 1, 225 | train=True, partial=True, cmap = "1") 226 | train_loader2 = get_biased_mnist_dataloader(args, root = './datasets/MNIST', batch_size=args.batch_size, 227 | data_label_correlation= args.data_label_correlation, 228 | n_confusing_labels= args.num_classes - 1, 229 | train=True, partial=True, cmap = "2") 230 | val_loader = get_biased_mnist_dataloader(args, root = './datasets/MNIST', batch_size=args.batch_size, 231 | data_label_correlation= args.data_label_correlation, 232 | n_confusing_labels= args.num_classes - 1, 233 | train=False, partial=True, cmap = "1") 234 | elif args.in_dataset == "waterbird": 235 | train_loader = get_waterbird_dataloader(args, data_label_correlation=args.data_label_correlation, split="train") 236 | val_loader = get_waterbird_dataloader(args, data_label_correlation=args.data_label_correlation, split="val") 237 | elif args.in_dataset == "celebA": 238 | train_loader = get_celebA_dataloader(args, split="train") 239 | val_loader = get_celebA_dataloader(args, split="val") 240 | 241 | if args.model_arch == 'resnet18': 242 | pretrained = True 243 | if args.in_dataset == 'color_mnist': 244 | pretrained = False #True for celebA & waterbird ; False for Color_MNIST 245 | base_model = load_model(pretrained) 246 | if torch.cuda.device_count() > 1: 247 | base_model = torch.nn.DataParallel(base_model) 248 | 249 | 250 | if args.method == "erm": 251 | model = base_model.cuda() 252 | criterion = nn.CrossEntropyLoss().cuda() 253 | optimizer 
= torch.optim.Adam(model.parameters(), lr=args.lr, weight_decay=args.weight_decay) 254 | else: 255 | assert False, 'Not supported method: {}'.format(args.method) 256 | 257 | cudnn.benchmark = True 258 | 259 | freeze_bn_affine = False 260 | def freeze_bn(model, freeze_bn_affine=True): 261 | for m in model.modules(): 262 | if isinstance(m, nn.BatchNorm2d): 263 | m.eval() 264 | if freeze_bn_affine: 265 | m.weight.requires_grad = False 266 | m.bias.requires_grad = False 267 | freeze_bn(model, freeze_bn_affine) 268 | 269 | if args.in_dataset == "color_mnist": 270 | train_loaders = [train_loader1, train_loader2] 271 | elif args.in_dataset == "waterbird" or args.in_dataset == "celebA": 272 | train_loaders = [train_loader] 273 | 274 | for epoch in range(args.start_epoch, args.epochs): 275 | print(f"Start training epoch {epoch}") 276 | adjust_learning_rate(args, optimizer, epoch) 277 | train(model, train_loaders, criterion, optimizer, epoch, log) 278 | prec1 = validate(val_loader, model, criterion, epoch, log, args.method) 279 | if (epoch + 1) % args.save_epoch == 0: 280 | save_checkpoint(args, { 281 | 'epoch': epoch + 1, 282 | 'state_dict_model': model.state_dict(), 283 | }, epoch + 1) 284 | if __name__ == '__main__': 285 | main() 286 | -------------------------------------------------------------------------------- /utils/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | from .anom_utils import * 3 | from .common import * 4 | from .svhn_loader import * 5 | -------------------------------------------------------------------------------- /utils/anom_utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import sklearn.metrics as sk 3 | 4 | recall_level_default = 0.95 5 | 6 | def stable_cumsum(arr, rtol=1e-05, atol=1e-08): 7 | """Use high precision for cumsum and check that final value matches sum 8 | Parameters 9 | 
---------- 10 | arr : array-like 11 | To be cumulatively summed as flat 12 | rtol : float 13 | Relative tolerance, see ``np.allclose`` 14 | atol : float 15 | Absolute tolerance, see ``np.allclose`` 16 | """ 17 | out = np.cumsum(arr, dtype=np.float64) 18 | expected = np.sum(arr, dtype=np.float64) 19 | if not np.allclose(out[-1], expected, rtol=rtol, atol=atol): 20 | raise RuntimeError('cumsum was found to be unstable: ' 21 | 'its last element does not correspond to sum') 22 | return out 23 | 24 | def fpr_and_fdr_at_recall(y_true, y_score, recall_level=recall_level_default, pos_label=None): 25 | classes = np.unique(y_true) 26 | if (pos_label is None and 27 | not (np.array_equal(classes, [0, 1]) or 28 | np.array_equal(classes, [-1, 1]) or 29 | np.array_equal(classes, [0]) or 30 | np.array_equal(classes, [-1]) or 31 | np.array_equal(classes, [1]))): 32 | raise ValueError("Data is not binary and pos_label is not specified") 33 | elif pos_label is None: 34 | pos_label = 1. 35 | 36 | # make y_true a boolean vector 37 | y_true = (y_true == pos_label) 38 | 39 | # sort scores and corresponding truth values 40 | desc_score_indices = np.argsort(y_score, kind="mergesort")[::-1] 41 | y_score = y_score[desc_score_indices] 42 | y_true = y_true[desc_score_indices] 43 | 44 | # y_score typically has many tied values. Here we extract 45 | # the indices associated with the distinct values. We also 46 | # concatenate a value for the end of the curve. 
47 | distinct_value_indices = np.where(np.diff(y_score))[0] 48 | threshold_idxs = np.r_[distinct_value_indices, y_true.size - 1] 49 | 50 | # accumulate the true positives with decreasing threshold 51 | tps = stable_cumsum(y_true)[threshold_idxs] 52 | fps = 1 + threshold_idxs - tps # add one because of zero-based indexing 53 | 54 | thresholds = y_score[threshold_idxs] 55 | 56 | recall = tps / tps[-1] 57 | 58 | last_ind = tps.searchsorted(tps[-1]) 59 | sl = slice(last_ind, None, -1) # [last_ind::-1] 60 | recall, fps, tps, thresholds = np.r_[recall[sl], 1], np.r_[fps[sl], 0], np.r_[tps[sl], 0], thresholds[sl] 61 | 62 | cutoff = np.argmin(np.abs(recall - recall_level)) 63 | return fps[cutoff] / (np.sum(np.logical_not(y_true))), fps[cutoff]/(fps[cutoff] + tps[cutoff]) 64 | 65 | def get_measures(_pos, _neg, recall_level=recall_level_default): 66 | pos = np.array(_pos[:]).reshape((-1, 1)) 67 | neg = np.array(_neg[:]).reshape((-1, 1)) 68 | examples = np.squeeze(np.vstack((pos, neg))) 69 | labels = np.zeros(len(examples), dtype=np.int32) 70 | labels[:len(pos)] += 1 71 | 72 | auroc = sk.roc_auc_score(labels, examples) 73 | aupr = sk.average_precision_score(labels, examples) 74 | fpr, fdr = fpr_and_fdr_at_recall(labels, examples, recall_level) # second value is the false-discovery rate at the cutoff, not a threshold 75 | 76 | return auroc, aupr, fpr, fdr 77 | 78 | 79 | def print_measures(auroc, aupr, fpr, ood, method, recall_level=recall_level_default): 80 | print('\t\t\t' + ood+'_'+method) 81 | print('FPR{:d}:\t\t\t{:.2f}'.format(int(100 * recall_level), 100 * fpr)) 82 | print('AUROC: \t\t\t{:.2f}'.format(100 * auroc)) 83 | print('AUPR: \t\t\t{:.2f}'.format(100 * aupr)) 84 | 85 | def get_and_print_results(out_score, in_score, ood, method): 86 | 87 | aurocs, auprs, fprs = [], [], [] 88 | measures = get_measures(out_score, in_score) 89 | aurocs.append(measures[0]); auprs.append(measures[1]); fprs.append(measures[2]) 90 | 91 | auroc = np.mean(aurocs); aupr = np.mean(auprs); fpr = np.mean(fprs) 92 | print_measures(auroc, aupr, fpr, ood,
method) 93 | return auroc, aupr, fpr -------------------------------------------------------------------------------- /utils/common.py: -------------------------------------------------------------------------------- 1 | import os 2 | import torch 3 | 4 | class UnNormalize(object): 5 | def __init__(self, mean, std): 6 | self.mean = mean 7 | self.std = std 8 | 9 | def __call__(self, tensor): 10 | """ 11 | Args: 12 | tensor (Tensor): Normalized tensor image of size (C, H, W). 13 | Returns: 14 | Tensor: Un-normalized image. 15 | """ 16 | for t, m, s in zip(tensor, self.mean, self.std): 17 | t.mul_(s).add_(m) 18 | return tensor 19 | 20 | 21 | class AverageMeter(object): 22 | """Computes and stores the average and current value""" 23 | def __init__(self): 24 | self.reset() 25 | 26 | def reset(self): 27 | self.val = 0 28 | self.avg = 0 29 | self.sum = 0 30 | self.count = 0 31 | 32 | def update(self, val, n=1): 33 | self.val = val 34 | self.sum += self.val * n 35 | self.count += n 36 | self.avg = self.sum / self.count 37 | 38 | 39 | def accuracy(output, target, topk=(1,)): 40 | """Computes the precision@k for the specified values of k""" 41 | maxk = max(topk) 42 | batch_size = target.size(0) 43 | 44 | _, pred = output.topk(maxk, 1, True, True) 45 | pred = pred.t() 46 | correct = pred.eq(target.view(1, -1).expand_as(pred)) 47 | 48 | res = [] 49 | for k in topk: 50 | correct_k = correct[:k].reshape(-1).float().sum(0) # reshape: correct[:k] may be non-contiguous after t() 51 | res.append(correct_k.mul_(100.0 / batch_size)) 52 | return res 53 | 54 | 55 | def save_checkpoint(args, state, epoch, name = None): 56 | """Saves checkpoint to disk""" 57 | directory = "checkpoints/{in_dataset}/{name}/{exp}/".format(in_dataset=args.in_dataset, name=args.name, exp=args.exp_name) 58 | os.makedirs(directory, exist_ok=True) 59 | if name is None: 60 | filename = directory + 'checkpoint_{}.pth.tar'.format(epoch) 61 | else: 62 | filename = directory + 'checkpoint_{}.pth.tar'.format(name) 63 | torch.save(state, filename) 64 |
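As a standalone sanity check on the metric conventions in `utils/anom_utils.py` (ID scores form the positive class, and FPR is reported at the recall level, 95% by default), here is a minimal sketch built directly on `sklearn`; the helper name `ood_metrics` and the synthetic scores are illustrative, not part of the repo:

```python
import numpy as np
import sklearn.metrics as sk

def ood_metrics(id_scores, ood_scores, recall_level=0.95):
    """AUROC / AUPR / FPR@recall with ID samples as the positive class,
    mirroring the label convention of get_measures() in utils/anom_utils.py."""
    scores = np.concatenate([id_scores, ood_scores])
    labels = np.concatenate([np.ones(len(id_scores)), np.zeros(len(ood_scores))])
    auroc = sk.roc_auc_score(labels, scores)
    aupr = sk.average_precision_score(labels, scores)
    # FPR at the first threshold where TPR (recall on ID) reaches recall_level
    fpr_curve, tpr_curve, _ = sk.roc_curve(labels, scores)
    fpr_at_recall = float(fpr_curve[np.searchsorted(tpr_curve, recall_level)])
    return auroc, aupr, fpr_at_recall

# Perfectly separated synthetic scores: every metric should sit at its optimum.
auroc, aupr, fpr95 = ood_metrics(np.array([5.0, 6.0, 7.0]),
                                 np.array([-1.0, 0.0, 1.0]))
```

With fully separated scores this returns AUROC 1.0, AUPR 1.0, and FPR95 0.0, which is a quick way to confirm the positive-class orientation before feeding in real energy scores.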
-------------------------------------------------------------------------------- /utils/svhn_loader.py: -------------------------------------------------------------------------------- 1 | import torch.utils.data as data 2 | from PIL import Image 3 | import os 4 | import os.path 5 | import numpy as np 6 | 7 | 8 | class SVHN(data.Dataset): 9 | url = "" 10 | filename = "" 11 | file_md5 = "" 12 | 13 | split_list = { 14 | 'train': ["http://ufldl.stanford.edu/housenumbers/train_32x32.mat", 15 | "train_32x32.mat", "e26dedcc434d2e4c54c9b2d4a06d8373"], 16 | 'test': ["http://ufldl.stanford.edu/housenumbers/test_32x32.mat", 17 | "selected_test_32x32.mat", "eb5a983be6a315427106f1b164d9cef3"], 18 | 'extra': ["http://ufldl.stanford.edu/housenumbers/extra_32x32.mat", 19 | "extra_32x32.mat", "a93ce644f1a588dc4d68dda5feec44a7"], 20 | 'train_and_extra': [ 21 | ["http://ufldl.stanford.edu/housenumbers/train_32x32.mat", 22 | "train_32x32.mat", "e26dedcc434d2e4c54c9b2d4a06d8373"], 23 | ["http://ufldl.stanford.edu/housenumbers/extra_32x32.mat", 24 | "extra_32x32.mat", "a93ce644f1a588dc4d68dda5feec44a7"]]} 25 | 26 | def __init__(self, root, split='train', 27 | transform=None, target_transform=None, download=False): 28 | self.root = root 29 | self.transform = transform 30 | self.target_transform = target_transform 31 | self.split = split # training set or test set or extra set 32 | 33 | if self.split not in self.split_list: 34 | raise ValueError('Wrong split entered! 
Please use split="train" ' 35 | 'or split="extra" or split="test" ' 36 | 'or split="train_and_extra" ') 37 | 38 | if self.split == "train_and_extra": 39 | self.url = self.split_list[split][0][0] 40 | self.filename = self.split_list[split][0][1] 41 | self.file_md5 = self.split_list[split][0][2] 42 | else: 43 | self.url = self.split_list[split][0] 44 | self.filename = self.split_list[split][1] 45 | self.file_md5 = self.split_list[split][2] 46 | 47 | # import here rather than at top of file because this is 48 | # an optional dependency for torchvision 49 | import scipy.io as sio 50 | 51 | # reading(loading) mat file as array 52 | loaded_mat = sio.loadmat(os.path.join(root, self.filename)) 53 | 54 | if self.split == "test": 55 | self.data = loaded_mat['X'] 56 | self.targets = loaded_mat['y'] 57 | # Note label 10 == 0 so modulo operator required 58 | self.targets = (self.targets % 10).squeeze() # convert to zero-based indexing 59 | self.data = np.transpose(self.data, (3, 2, 0, 1)) 60 | else: 61 | self.data = loaded_mat['X'] 62 | self.targets = loaded_mat['y'] 63 | 64 | if self.split == "train_and_extra": 65 | extra_filename = self.split_list[split][1][1] 66 | loaded_mat = sio.loadmat(os.path.join(root, extra_filename)) 67 | self.data = np.concatenate([self.data, 68 | loaded_mat['X']], axis=3) 69 | self.targets = np.vstack((self.targets, 70 | loaded_mat['y'])) 71 | # Note label 10 == 0 so modulo operator required 72 | self.targets = (self.targets % 10).squeeze() # convert to zero-based indexing 73 | self.data = np.transpose(self.data, (3, 2, 0, 1)) 74 | 75 | def __getitem__(self, index): 76 | if self.split == "test": 77 | img, target = self.data[index], self.targets[index] 78 | else: 79 | img, target = self.data[index], self.targets[index] 80 | 81 | # doing this so that it is consistent with all other datasets 82 | # to return a PIL Image 83 | img = Image.fromarray(np.transpose(img, (1, 2, 0))) 84 | 85 | if self.transform is not None: 86 | img = self.transform(img) 87 | 
88 | if self.target_transform is not None: 89 | target = self.target_transform(target) 90 | 91 | return img, target.astype(np.int64) # np.long is deprecated/removed in recent NumPy 92 | 93 | def __len__(self): 94 | if self.split == "test": 95 | return len(self.data) 96 | else: 97 | return len(self.data) 98 | --------------------------------------------------------------------------------
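One detail of the SVHN loader above that is easy to trip over: the raw `.mat` files label the digit 0 as class 10, which is why the loader applies a modulo before returning targets. A minimal illustration with synthetic labels (not the real `.mat` data):

```python
import numpy as np

# Raw SVHN targets are 1..10, where 10 stands for the digit 0,
# stored with shape (N, 1) in the .mat files.
raw_targets = np.array([[1], [2], [9], [10], [10]])

# Same remapping as in utils/svhn_loader.py: modulo 10 gives
# zero-based labels, squeeze flattens the (N, 1) array to 1-D.
targets = (raw_targets % 10).squeeze()
print(targets)  # -> [1 2 9 0 0]
```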