├── LICENSE ├── README.md ├── data ├── aircraft_test.txt ├── aircraft_train.txt ├── bird_test.txt ├── bird_train.txt ├── car_test.txt ├── car_train.txt ├── dog_test.txt ├── dog_train.txt └── link_data.sh ├── dataset ├── __init__.py └── custom_dataset.py ├── model ├── __init__.py ├── bap.py ├── inception.py ├── inception_bap.py └── resnet.py ├── test_bap.sh ├── train_bap.py ├── train_bap.sh └── utils ├── __init__.py ├── attention.py ├── config.py ├── convert_data.py ├── engine.py ├── meter.py └── utils.py /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2019 wvinzh 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # PyTorch Implementation Of WS-DAN 2 | 3 | ## Introduction 4 | This is a PyTorch implementation of the paper 5 | "[See Better Before Looking Closer: Weakly Supervised Data Augmentation Network for Fine-Grained Visual Classification](https://arxiv.org/abs/1901.09891)". The official TensorFlow implementation is [WS_DAN](https://github.com/tau-yihouxiang/WS_DAN). The core of this code follows the official version, and its performance nearly matches the results reported in the paper. 6 | 7 | ## Environment 8 | 9 | - Ubuntu 16.04, GTX 1080 8G * 2, CUDA 8.0 10 | - Anaconda with Python=3.6.5, PyTorch=0.4.1, torchvision=0.2.1, etc. 11 | - Some **third-party dependencies** may be installed with **pip** or **conda** when needed. 12 | 13 | ## Result 14 | 15 | | Dataset | ACC(this repo) | ACC Refine(this repo) | ACC(paper) | 16 | | ------------- | ------ | ----------- | ----------- | 17 | | CUB-200-2011 | 88.20 | 89.30 | 89.4 | 18 | | FGVC-Aircraft | 93.15 | 93.22 | 93.0 | 19 | | Stanford Cars | 94.13 | 94.43 | 94.5 | 20 | | Stanford Dogs | 86.03 | 86.46 | 92.2 | 21 | 22 | You can download pretrained models from [WS_DAN_Onedrive](https://1drv.ms/u/s!AseTbxZ7P87UknnvrfLAsIFlhAmb?e=XC0DFn). 23 | 24 | ## Install 25 | 26 | 1. Clone the repo 27 | ``` 28 | git clone https://github.com/wvinzh/WS_DAN_PyTorch 29 | ``` 30 | 2. Prepare the datasets 31 | 32 | - Download the following datasets. 
33 | 34 | Dataset | Object | Category | Training | Testing 35 | ---|--- |--- |--- |--- 36 | [CUB-200-2011](http://www.vision.caltech.edu/visipedia/CUB-200-2011.html) | Bird | 200 | 5994 | 5794 37 | [Stanford-Cars](https://ai.stanford.edu/~jkrause/cars/car_dataset.html) | Car | 196 | 8144 | 8041 38 | [fgvc-aircraft](http://www.robots.ox.ac.uk/~vgg/data/fgvc-aircraft/) | Aircraft | 100 | 6667 | 3333 39 | [Stanford-Dogs](http://vision.stanford.edu/aditya86/ImageNetDogs/) | Dogs | 120 | 12000 | 8580 40 | 41 | - Extract the data so the directory layout looks like the following: 42 | ``` 43 | Fine-grained 44 | ├── CUB_200_2011 45 | │   ├── attributes 46 | │   ├── bounding_boxes.txt 47 | │   ├── classes.txt 48 | │   ├── image_class_labels.txt 49 | │   ├── images 50 | │   ├── images.txt 51 | │   ├── parts 52 | │   ├── README 53 | ├── Car 54 | │   ├── cars_test 55 | │   ├── cars_train 56 | │   ├── devkit 57 | │   └── tfrecords 58 | ├── fgvc-aircraft-2013b 59 | │   ├── data 60 | │   ├── evaluation.m 61 | │   ├── example_evaluation.m 62 | │   ├── README.html 63 | │   ├── README.md 64 | │   ├── vl_argparse.m 65 | │   ├── vl_pr.m 66 | │   ├── vl_roc.m 67 | │   └── vl_tpfp.m 68 | ├── dogs 69 | │   ├── file_list.mat 70 | │   ├── Images 71 | │   ├── test_list.mat 72 | │   └── train_list.mat 73 | ``` 74 | - Prepare the ./data folder: generate the file list txt files (**using ./utils/convert_data.py**) and create the soft links. 
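Each generated list file holds one `image_name label` pair per line, where the label is an integer class id (see `data/aircraft_test.txt`). As a minimal sketch, a parser for this format might look like the following; `load_image_list` is a hypothetical helper, not part of this repo:

```python
from pathlib import Path

def load_image_list(list_file):
    """Parse a list file with one 'image_name label' pair per line."""
    samples = []
    for line in Path(list_file).read_text().splitlines():
        if not line.strip():
            continue  # skip blank lines
        name, label = line.rsplit(" ", 1)  # split off the trailing class id
        samples.append((name, int(label)))
    return samples

# For example, with data/aircraft_test.txt:
#   samples = load_image_list("data/aircraft_test.txt")
#   samples[0] -> ("1514522.jpg", 0)
```

The image names are resolved relative to the symlinked dataset folder (e.g. `data/Aircraft`), which is what `dataset/custom_dataset.py` relies on.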
75 | ``` 76 | python utils/convert_data.py --dataset_name bird --root_path .../Fine-grained/CUB_200_2011 77 | ``` 78 | 79 | ``` 80 | ├── data 81 | │   ├── Aircraft -> /your_root_path/Fine-grained/fgvc-aircraft-2013b/data 82 | │   ├── aircraft_test.txt 83 | │   ├── aircraft_train.txt 84 | │   ├── Bird -> /your_root_path/Fine-grained/CUB_200_2011 85 | │   ├── bird_test.txt 86 | │   ├── bird_train.txt 87 | │   ├── Car -> /your_root_path/Fine-grained/Car 88 | │   ├── car_test.txt 89 | │   ├── car_train.txt 90 | │   ├── Dog -> /your_root_path/Fine-grained/dogs 91 | │   ├── dog_test.txt 92 | │   └── dog_train.txt 93 | 94 | ``` 95 | 96 | 97 | 98 | ## Usage 99 | 100 | - Train 101 | 102 | ``` 103 | python train_bap.py train \ 104 | --model-name inception \ 105 | --batch-size 12 \ 106 | --dataset car \ 107 | --image-size 512 \ 108 | --input-size 448 \ 109 | --checkpoint-path checkpoint/car \ 110 | --optim sgd \ 111 | --scheduler step \ 112 | --lr 0.001 \ 113 | --momentum 0.9 \ 114 | --weight-decay 1e-5 \ 115 | --workers 4 \ 116 | --parts 32 \ 117 | --epochs 80 \ 118 | --use-gpu \ 119 | --multi-gpu \ 120 | --gpu-ids 0,1 121 | ``` 122 | A simpler way is to run `sh train_bap.sh`, or to run it in the background with logs: `nohup sh train_bap.sh 1>train.log 2>error.log &` 123 | - Test 124 | 125 | ``` 126 | python train_bap.py test \ 127 | --model-name inception \ 128 | --batch-size 12 \ 129 | --dataset car \ 130 | --image-size 512 \ 131 | --input-size 448 \ 132 | --checkpoint-path checkpoint/car/model_best.pth.tar \ 133 | --workers 4 \ 134 | --parts 32 \ 135 | --use-gpu \ 136 | --multi-gpu \ 137 | --gpu-ids 0,1 138 | ``` 139 | 140 | -------------------------------------------------------------------------------- /data/aircraft_test.txt: -------------------------------------------------------------------------------- 1 | 1514522.jpg 0 2 | 0747566.jpg 0 3 | 1008575.jpg 0 4 | 0717480.jpg 0 5 | 0991569.jpg 0 6 | 1446335.jpg 0 7 | 1345732.jpg 0 8 | 0064932.jpg 0 9 | 1453508.jpg 0 10 | 
0536515.jpg 0 11 | 0809560.jpg 0 12 | 0812092.jpg 0 13 | 0979377.jpg 0 14 | 0062765.jpg 0 15 | 1191359.jpg 0 16 | 0894381.jpg 0 17 | 0574359.jpg 0 18 | 0447936.jpg 0 19 | 0658110.jpg 0 20 | 0725155.jpg 0 21 | 0691159.jpg 0 22 | 1707891.jpg 0 23 | 1338874.jpg 0 24 | 0812908.jpg 0 25 | 1187784.jpg 0 26 | 0895165.jpg 0 27 | 0197342.jpg 0 28 | 0789828.jpg 0 29 | 0602227.jpg 0 30 | 2127935.jpg 0 31 | 0869739.jpg 0 32 | 0939480.jpg 0 33 | 0537688.jpg 0 34 | 0116175.jpg 1 35 | 0704269.jpg 1 36 | 1192237.jpg 1 37 | 1169900.jpg 1 38 | 0328235.jpg 1 39 | 0923507.jpg 1 40 | 0306187.jpg 1 41 | 0907384.jpg 1 42 | 0064005.jpg 1 43 | 0907370.jpg 1 44 | 1337431.jpg 1 45 | 0061581.jpg 1 46 | 1338360.jpg 1 47 | 2208641.jpg 1 48 | 0218041.jpg 1 49 | 0907433.jpg 1 50 | 0694478.jpg 1 51 | 0699286.jpg 1 52 | 1713236.jpg 1 53 | 1047756.jpg 1 54 | 0256039.jpg 1 55 | 2021695.jpg 1 56 | 0907405.jpg 1 57 | 0724783.jpg 1 58 | 1215395.jpg 1 59 | 0920409.jpg 1 60 | 1833887.jpg 1 61 | 0713927.jpg 1 62 | 0538261.jpg 1 63 | 0109655.jpg 1 64 | 1639641.jpg 1 65 | 0457894.jpg 1 66 | 0846103.jpg 1 67 | 0227058.jpg 2 68 | 0950275.jpg 2 69 | 0114122.jpg 2 70 | 1754503.jpg 2 71 | 0127648.jpg 2 72 | 0225218.jpg 2 73 | 1713238.jpg 2 74 | 1295799.jpg 2 75 | 1455998.jpg 2 76 | 1102089.jpg 2 77 | 0067619.jpg 2 78 | 0713752.jpg 2 79 | 1361876.jpg 2 80 | 0704268.jpg 2 81 | 0494478.jpg 2 82 | 1398436.jpg 2 83 | 0037512.jpg 2 84 | 0698787.jpg 2 85 | 0875303.jpg 2 86 | 1831096.jpg 2 87 | 1565290.jpg 2 88 | 1297843.jpg 2 89 | 0694480.jpg 2 90 | 2054409.jpg 2 91 | 1860176.jpg 2 92 | 0075050.jpg 2 93 | 1110805.jpg 2 94 | 0702845.jpg 2 95 | 1471735.jpg 2 96 | 1395798.jpg 2 97 | 0880593.jpg 2 98 | 0062679.jpg 2 99 | 0198440.jpg 2 100 | 1258401.jpg 2 101 | 0448049.jpg 3 102 | 0357873.jpg 3 103 | 1196360.jpg 3 104 | 0218033.jpg 3 105 | 0395652.jpg 3 106 | 1320676.jpg 3 107 | 1100234.jpg 3 108 | 2067870.jpg 3 109 | 0681541.jpg 3 110 | 0133440.jpg 3 111 | 0218801.jpg 3 112 | 1298958.jpg 3 113 | 0746112.jpg 3 114 | 
0953559.jpg 3 115 | 0545210.jpg 3 116 | 1043473.jpg 3 117 | 0894189.jpg 3 118 | 0447745.jpg 3 119 | 1682001.jpg 3 120 | 0522856.jpg 3 121 | 1456004.jpg 3 122 | 0460770.jpg 3 123 | 1115297.jpg 3 124 | 0653804.jpg 3 125 | 2221342.jpg 3 126 | 1363673.jpg 3 127 | 1352205.jpg 3 128 | 0882609.jpg 3 129 | 1446516.jpg 3 130 | 1615914.jpg 3 131 | 1958758.jpg 3 132 | 1810855.jpg 3 133 | 2126017.jpg 3 134 | 1891967.jpg 4 135 | 0076843.jpg 4 136 | 1119094.jpg 4 137 | 1606542.jpg 4 138 | 0923504.jpg 4 139 | 1080160.jpg 4 140 | 0979665.jpg 4 141 | 1483195.jpg 4 142 | 2217839.jpg 4 143 | 0075752.jpg 4 144 | 0807961.jpg 4 145 | 0195013.jpg 4 146 | 0682201.jpg 4 147 | 1163484.jpg 4 148 | 0063284.jpg 4 149 | 1506226.jpg 4 150 | 0973082.jpg 4 151 | 0176136.jpg 4 152 | 1163792.jpg 4 153 | 0237107.jpg 4 154 | 0523207.jpg 4 155 | 2007890.jpg 4 156 | 0066410.jpg 4 157 | 0558343.jpg 4 158 | 0218042.jpg 4 159 | 1382601.jpg 4 160 | 0482804.jpg 4 161 | 1115400.jpg 4 162 | 1696061.jpg 4 163 | 1559372.jpg 4 164 | 1879889.jpg 4 165 | 1237623.jpg 4 166 | 0851302.jpg 4 167 | 0072369.jpg 5 168 | 1783685.jpg 5 169 | 1149871.jpg 5 170 | 1110779.jpg 5 171 | 1267074.jpg 5 172 | 1224910.jpg 5 173 | 0361471.jpg 5 174 | 0944257.jpg 5 175 | 1336460.jpg 5 176 | 1736209.jpg 5 177 | 2002331.jpg 5 178 | 0876440.jpg 5 179 | 0522417.jpg 5 180 | 0894188.jpg 5 181 | 1599381.jpg 5 182 | 0097111.jpg 5 183 | 0974734.jpg 5 184 | 1572350.jpg 5 185 | 1548404.jpg 5 186 | 1377012.jpg 5 187 | 0175746.jpg 5 188 | 1372809.jpg 5 189 | 1358066.jpg 5 190 | 1545325.jpg 5 191 | 2031384.jpg 5 192 | 1842045.jpg 5 193 | 1577681.jpg 5 194 | 1340279.jpg 5 195 | 1163793.jpg 5 196 | 0944247.jpg 5 197 | 2132361.jpg 5 198 | 1476642.jpg 5 199 | 1376842.jpg 5 200 | 1423426.jpg 5 201 | 0890396.jpg 6 202 | 0259167.jpg 6 203 | 1078482.jpg 6 204 | 1129244.jpg 6 205 | 1268068.jpg 6 206 | 2128746.jpg 6 207 | 0755282.jpg 6 208 | 1398987.jpg 6 209 | 0675711.jpg 6 210 | 0104601.jpg 6 211 | 1650662.jpg 6 212 | 1087920.jpg 6 213 | 0669870.jpg 6 214 | 
1244625.jpg 6 215 | 2031421.jpg 6 216 | 0192625.jpg 6 217 | 1418231.jpg 6 218 | 1561990.jpg 6 219 | 0097578.jpg 6 220 | 0931601.jpg 6 221 | 1237650.jpg 6 222 | 1857204.jpg 6 223 | 1361693.jpg 6 224 | 0450740.jpg 6 225 | 1429846.jpg 6 226 | 2260617.jpg 6 227 | 1842040.jpg 6 228 | 0730798.jpg 6 229 | 2263029.jpg 6 230 | 1212753.jpg 6 231 | 0635825.jpg 6 232 | 1758619.jpg 6 233 | 2094229.jpg 6 234 | 1950707.jpg 7 235 | 0681537.jpg 7 236 | 1927317.jpg 7 237 | 0618966.jpg 7 238 | 1454446.jpg 7 239 | 0225227.jpg 7 240 | 2227525.jpg 7 241 | 1876022.jpg 7 242 | 1551279.jpg 7 243 | 1709124.jpg 7 244 | 1143441.jpg 7 245 | 2223909.jpg 7 246 | 1192642.jpg 7 247 | 1139751.jpg 7 248 | 0336540.jpg 7 249 | 1238449.jpg 7 250 | 1393634.jpg 7 251 | 1790921.jpg 7 252 | 2054442.jpg 7 253 | 0522857.jpg 7 254 | 0673596.jpg 7 255 | 1736223.jpg 7 256 | 1548400.jpg 7 257 | 0308863.jpg 7 258 | 2038532.jpg 7 259 | 2106086.jpg 7 260 | 1510253.jpg 7 261 | 0770474.jpg 7 262 | 1550106.jpg 7 263 | 2250798.jpg 7 264 | 1852007.jpg 7 265 | 1884738.jpg 7 266 | 1946040.jpg 7 267 | 2236183.jpg 8 268 | 1261350.jpg 8 269 | 1085719.jpg 8 270 | 0257552.jpg 8 271 | 1142906.jpg 8 272 | 1318450.jpg 8 273 | 2229397.jpg 8 274 | 0391437.jpg 8 275 | 0325574.jpg 8 276 | 0147057.jpg 8 277 | 1931271.jpg 8 278 | 1899099.jpg 8 279 | 1935750.jpg 8 280 | 0457841.jpg 8 281 | 1337849.jpg 8 282 | 2195493.jpg 8 283 | 2147231.jpg 8 284 | 1616720.jpg 8 285 | 2067982.jpg 8 286 | 1459210.jpg 8 287 | 1403158.jpg 8 288 | 0074246.jpg 8 289 | 1355815.jpg 8 290 | 1146021.jpg 8 291 | 1538844.jpg 8 292 | 2236964.jpg 8 293 | 1421021.jpg 8 294 | 0973167.jpg 8 295 | 2198761.jpg 8 296 | 2207626.jpg 8 297 | 1881962.jpg 8 298 | 1535044.jpg 8 299 | 1016570.jpg 8 300 | 0857549.jpg 8 301 | 1398526.jpg 9 302 | 1668558.jpg 9 303 | 1291783.jpg 9 304 | 0987802.jpg 9 305 | 0305545.jpg 9 306 | 1879110.jpg 9 307 | 0218852.jpg 9 308 | 0582863.jpg 9 309 | 1356890.jpg 9 310 | 1771959.jpg 9 311 | 1768538.jpg 9 312 | 0701628.jpg 9 313 | 2191957.jpg 9 314 | 
0522819.jpg 9 315 | 1251519.jpg 9 316 | 1577934.jpg 9 317 | 1419049.jpg 9 318 | 0174054.jpg 9 319 | 0514744.jpg 9 320 | 1354835.jpg 9 321 | 2238661.jpg 9 322 | 1795553.jpg 9 323 | 0275806.jpg 9 324 | 1529574.jpg 9 325 | 0496901.jpg 9 326 | 1133684.jpg 9 327 | 1662032.jpg 9 328 | 0467798.jpg 9 329 | 1365786.jpg 9 330 | 2145082.jpg 9 331 | 2218546.jpg 9 332 | 2133249.jpg 9 333 | 2230427.jpg 9 334 | 2217964.jpg 10 335 | 1821602.jpg 10 336 | 0136486.jpg 10 337 | 0338531.jpg 10 338 | 1626629.jpg 10 339 | 1415611.jpg 10 340 | 1013534.jpg 10 341 | 0851311.jpg 10 342 | 1041645.jpg 10 343 | 0814397.jpg 10 344 | 1292533.jpg 10 345 | 1009765.jpg 10 346 | 0705084.jpg 10 347 | 0062223.jpg 10 348 | 1427673.jpg 10 349 | 0127505.jpg 10 350 | 0149352.jpg 10 351 | 0188405.jpg 10 352 | 2162684.jpg 10 353 | 0540389.jpg 10 354 | 0485231.jpg 10 355 | 0980549.jpg 10 356 | 0641196.jpg 10 357 | 0901513.jpg 10 358 | 0677586.jpg 10 359 | 0155579.jpg 10 360 | 0396525.jpg 10 361 | 0598112.jpg 10 362 | 0783900.jpg 10 363 | 1231746.jpg 10 364 | 2167150.jpg 10 365 | 0104828.jpg 10 366 | 0733462.jpg 10 367 | 0618964.jpg 11 368 | 2226313.jpg 11 369 | 1237674.jpg 11 370 | 1707900.jpg 11 371 | 1320257.jpg 11 372 | 0687392.jpg 11 373 | 0869743.jpg 11 374 | 1645747.jpg 11 375 | 1340195.jpg 11 376 | 1292378.jpg 11 377 | 0836147.jpg 11 378 | 0812094.jpg 11 379 | 0540023.jpg 11 380 | 0115565.jpg 11 381 | 0636590.jpg 11 382 | 0989831.jpg 11 383 | 1739526.jpg 11 384 | 1402021.jpg 11 385 | 1627935.jpg 11 386 | 0064890.jpg 11 387 | 1513616.jpg 11 388 | 0702839.jpg 11 389 | 1519194.jpg 11 390 | 1624193.jpg 11 391 | 1116410.jpg 11 392 | 0964344.jpg 11 393 | 0396535.jpg 11 394 | 1351594.jpg 11 395 | 0885277.jpg 11 396 | 0065286.jpg 11 397 | 1896036.jpg 11 398 | 0725997.jpg 11 399 | 0482823.jpg 11 400 | 0939547.jpg 11 401 | 0143388.jpg 12 402 | 0183641.jpg 12 403 | 0487369.jpg 12 404 | 1161004.jpg 12 405 | 0894166.jpg 12 406 | 1137626.jpg 12 407 | 1409250.jpg 12 408 | 0727357.jpg 12 409 | 0426810.jpg 12 410 | 
0946798.jpg 12 411 | 0693511.jpg 12 412 | 1469918.jpg 12 413 | 0751008.jpg 12 414 | 1329664.jpg 12 415 | 0880576.jpg 12 416 | 1663953.jpg 12 417 | 0181737.jpg 12 418 | 0894385.jpg 12 419 | 0063295.jpg 12 420 | 0959346.jpg 12 421 | 1106454.jpg 12 422 | 0066420.jpg 12 423 | 1145204.jpg 12 424 | 1305520.jpg 12 425 | 0064924.jpg 12 426 | 1864991.jpg 12 427 | 0948344.jpg 12 428 | 1149068.jpg 12 429 | 1619901.jpg 12 430 | 1124247.jpg 12 431 | 1647303.jpg 12 432 | 1836113.jpg 12 433 | 0181726.jpg 12 434 | 1922915.jpg 13 435 | 1572354.jpg 13 436 | 1375148.jpg 13 437 | 1244348.jpg 13 438 | 2233262.jpg 13 439 | 0946093.jpg 13 440 | 1527503.jpg 13 441 | 0062769.jpg 13 442 | 0929568.jpg 13 443 | 0458628.jpg 13 444 | 1860132.jpg 13 445 | 1663372.jpg 13 446 | 1336043.jpg 13 447 | 0688083.jpg 13 448 | 0989281.jpg 13 449 | 0062672.jpg 13 450 | 2102584.jpg 13 451 | 1564952.jpg 13 452 | 0894240.jpg 13 453 | 1796764.jpg 13 454 | 2230119.jpg 13 455 | 1450008.jpg 13 456 | 1650652.jpg 13 457 | 1818227.jpg 13 458 | 2094829.jpg 13 459 | 1505652.jpg 13 460 | 2119978.jpg 13 461 | 1237717.jpg 13 462 | 0967862.jpg 13 463 | 0175817.jpg 13 464 | 1623096.jpg 13 465 | 1313993.jpg 13 466 | 0973263.jpg 13 467 | 1160955.jpg 14 468 | 0678976.jpg 14 469 | 1124487.jpg 14 470 | 0973098.jpg 14 471 | 1704745.jpg 14 472 | 1298901.jpg 14 473 | 2263603.jpg 14 474 | 1237861.jpg 14 475 | 0740882.jpg 14 476 | 0488869.jpg 14 477 | 1000413.jpg 14 478 | 0174942.jpg 14 479 | 0166974.jpg 14 480 | 1017810.jpg 14 481 | 0075760.jpg 14 482 | 0376937.jpg 14 483 | 1447274.jpg 14 484 | 1673067.jpg 14 485 | 0056323.jpg 14 486 | 1380492.jpg 14 487 | 1123364.jpg 14 488 | 0749479.jpg 14 489 | 0990829.jpg 14 490 | 0296937.jpg 14 491 | 1366112.jpg 14 492 | 2119983.jpg 14 493 | 1815692.jpg 14 494 | 0325676.jpg 14 495 | 1365500.jpg 14 496 | 1200651.jpg 14 497 | 1173016.jpg 14 498 | 1623903.jpg 14 499 | 2031403.jpg 14 500 | 0134598.jpg 14 501 | 1550890.jpg 15 502 | 2137126.jpg 15 503 | 1338255.jpg 15 504 | 0957893.jpg 15 505 | 
2199948.jpg 15 506 | 0709566.jpg 15 507 | 0914475.jpg 15 508 | 1375987.jpg 15 509 | 1471690.jpg 15 510 | 2140530.jpg 15 511 | 1630526.jpg 15 512 | 1938051.jpg 15 513 | 1519381.jpg 15 514 | 1152153.jpg 15 515 | 1568750.jpg 15 516 | 0594827.jpg 15 517 | 1463774.jpg 15 518 | 0469653.jpg 15 519 | 0950442.jpg 15 520 | 2127992.jpg 15 521 | 1416312.jpg 15 522 | 0158891.jpg 15 523 | 0522873.jpg 15 524 | 0257029.jpg 15 525 | 1892614.jpg 15 526 | 0386937.jpg 15 527 | 0130597.jpg 15 528 | 2002363.jpg 15 529 | 0708642.jpg 15 530 | 0477671.jpg 15 531 | 1774577.jpg 15 532 | 0205784.jpg 15 533 | 1245209.jpg 15 534 | 0744907.jpg 16 535 | 2026733.jpg 16 536 | 1935709.jpg 16 537 | 0487319.jpg 16 538 | 1798594.jpg 16 539 | 0147029.jpg 16 540 | 0658065.jpg 16 541 | 0237109.jpg 16 542 | 0907375.jpg 16 543 | 0955450.jpg 16 544 | 0115066.jpg 16 545 | 0457808.jpg 16 546 | 1736099.jpg 16 547 | 0311183.jpg 16 548 | 1375367.jpg 16 549 | 0529781.jpg 16 550 | 0063055.jpg 16 551 | 0730739.jpg 16 552 | 1355824.jpg 16 553 | 0476348.jpg 16 554 | 1292174.jpg 16 555 | 1355224.jpg 16 556 | 1318171.jpg 16 557 | 1564408.jpg 16 558 | 0187247.jpg 16 559 | 1230274.jpg 16 560 | 1381995.jpg 16 561 | 0894313.jpg 16 562 | 1564822.jpg 16 563 | 0907407.jpg 16 564 | 0490174.jpg 16 565 | 2221715.jpg 16 566 | 0204622.jpg 16 567 | 0492495.jpg 17 568 | 0127496.jpg 17 569 | 1876024.jpg 17 570 | 0901358.jpg 17 571 | 0939563.jpg 17 572 | 0570746.jpg 17 573 | 1191062.jpg 17 574 | 1112644.jpg 17 575 | 1313184.jpg 17 576 | 1662018.jpg 17 577 | 1561406.jpg 17 578 | 2066993.jpg 17 579 | 1530082.jpg 17 580 | 1856979.jpg 17 581 | 1665263.jpg 17 582 | 1876023.jpg 17 583 | 0783932.jpg 17 584 | 0985425.jpg 17 585 | 0558232.jpg 17 586 | 1634276.jpg 17 587 | 0367941.jpg 17 588 | 1992339.jpg 17 589 | 1366758.jpg 17 590 | 2101703.jpg 17 591 | 0256997.jpg 17 592 | 0066423.jpg 17 593 | 1146228.jpg 17 594 | 0851325.jpg 17 595 | 1474094.jpg 17 596 | 0863693.jpg 17 597 | 1364162.jpg 17 598 | 1053316.jpg 17 599 | 1298911.jpg 17 600 | 
1636534.jpg 17 601 | 0985290.jpg 18 602 | 2243382.jpg 18 603 | 0127032.jpg 18 604 | 0176131.jpg 18 605 | 1506812.jpg 18 606 | 2026511.jpg 18 607 | 0467448.jpg 18 608 | 2024757.jpg 18 609 | 2131508.jpg 18 610 | 2171175.jpg 18 611 | 1232270.jpg 18 612 | 1363578.jpg 18 613 | 1349689.jpg 18 614 | 1906071.jpg 18 615 | 2082475.jpg 18 616 | 1635235.jpg 18 617 | 0255294.jpg 18 618 | 1861593.jpg 18 619 | 1539768.jpg 18 620 | 1520426.jpg 18 621 | 1127995.jpg 18 622 | 0225221.jpg 18 623 | 1271552.jpg 18 624 | 0981955.jpg 18 625 | 0087248.jpg 18 626 | 0940358.jpg 18 627 | 2259027.jpg 18 628 | 2021838.jpg 18 629 | 0905077.jpg 18 630 | 0867007.jpg 18 631 | 2054456.jpg 18 632 | 2209439.jpg 18 633 | 0097107.jpg 18 634 | 1921655.jpg 19 635 | 0174925.jpg 19 636 | 2080693.jpg 19 637 | 2215391.jpg 19 638 | 1587526.jpg 19 639 | 0894406.jpg 19 640 | 0305715.jpg 19 641 | 0543862.jpg 19 642 | 1258139.jpg 19 643 | 1088532.jpg 19 644 | 1458705.jpg 19 645 | 1145215.jpg 19 646 | 1325459.jpg 19 647 | 1700645.jpg 19 648 | 1930685.jpg 19 649 | 0894021.jpg 19 650 | 0275909.jpg 19 651 | 1448175.jpg 19 652 | 2221641.jpg 19 653 | 0890198.jpg 19 654 | 1472187.jpg 19 655 | 1692543.jpg 19 656 | 1842168.jpg 19 657 | 0136489.jpg 19 658 | 1615623.jpg 19 659 | 0846180.jpg 19 660 | 0445392.jpg 19 661 | 1530788.jpg 19 662 | 0361566.jpg 19 663 | 1534012.jpg 19 664 | 0941858.jpg 19 665 | 1956397.jpg 19 666 | 1376815.jpg 19 667 | 2060563.jpg 20 668 | 0637452.jpg 20 669 | 1823530.jpg 20 670 | 0885092.jpg 20 671 | 2083822.jpg 20 672 | 2226247.jpg 20 673 | 1860127.jpg 20 674 | 1551732.jpg 20 675 | 2094475.jpg 20 676 | 1398649.jpg 20 677 | 2251121.jpg 20 678 | 1418226.jpg 20 679 | 1379395.jpg 20 680 | 1179713.jpg 20 681 | 2243952.jpg 20 682 | 1997314.jpg 20 683 | 1718901.jpg 20 684 | 1818832.jpg 20 685 | 1728005.jpg 20 686 | 1870557.jpg 20 687 | 1105732.jpg 20 688 | 0846636.jpg 20 689 | 1914684.jpg 20 690 | 1700930.jpg 20 691 | 2035598.jpg 20 692 | 1630525.jpg 20 693 | 1984373.jpg 20 694 | 1230443.jpg 20 695 | 
1603124.jpg 20 696 | 2225263.jpg 20 697 | 2026614.jpg 20 698 | 1919891.jpg 20 699 | 1760948.jpg 20 700 | 2234645.jpg 20 701 | 1161624.jpg 21 702 | 0454802.jpg 21 703 | 0063105.jpg 21 704 | 1135502.jpg 21 705 | 0680027.jpg 21 706 | 1879887.jpg 21 707 | 0675845.jpg 21 708 | 0434370.jpg 21 709 | 0104832.jpg 21 710 | 0203220.jpg 21 711 | 1511271.jpg 21 712 | 1447577.jpg 21 713 | 0901367.jpg 21 714 | 0923710.jpg 21 715 | 0064072.jpg 21 716 | 0931107.jpg 21 717 | 0763193.jpg 21 718 | 0091354.jpg 21 719 | 1608203.jpg 21 720 | 0924311.jpg 21 721 | 1031476.jpg 21 722 | 0894382.jpg 21 723 | 0066532.jpg 21 724 | 0065780.jpg 21 725 | 0995410.jpg 21 726 | 1429799.jpg 21 727 | 2175626.jpg 21 728 | 0487371.jpg 21 729 | 0171397.jpg 21 730 | 1230196.jpg 21 731 | 1517855.jpg 21 732 | 0063096.jpg 21 733 | 0187226.jpg 21 734 | 2259299.jpg 22 735 | 0368357.jpg 22 736 | 1277988.jpg 22 737 | 1797113.jpg 22 738 | 1154647.jpg 22 739 | 0383679.jpg 22 740 | 0947462.jpg 22 741 | 0894387.jpg 22 742 | 0869735.jpg 22 743 | 0062719.jpg 22 744 | 0467411.jpg 22 745 | 1695905.jpg 22 746 | 0115062.jpg 22 747 | 0699302.jpg 22 748 | 0136480.jpg 22 749 | 0174858.jpg 22 750 | 1149969.jpg 22 751 | 0894203.jpg 22 752 | 0454743.jpg 22 753 | 0926148.jpg 22 754 | 1223291.jpg 22 755 | 0934703.jpg 22 756 | 1539455.jpg 22 757 | 0123340.jpg 22 758 | 0525862.jpg 22 759 | 0201677.jpg 22 760 | 0104855.jpg 22 761 | 0668306.jpg 22 762 | 1200647.jpg 22 763 | 0367786.jpg 22 764 | 0978286.jpg 22 765 | 1195650.jpg 22 766 | 0903308.jpg 22 767 | 2168374.jpg 23 768 | 1113895.jpg 23 769 | 1535566.jpg 23 770 | 1910729.jpg 23 771 | 1663628.jpg 23 772 | 2239589.jpg 23 773 | 1707575.jpg 23 774 | 1552592.jpg 23 775 | 1882508.jpg 23 776 | 1393354.jpg 23 777 | 1927186.jpg 23 778 | 1813838.jpg 23 779 | 0864663.jpg 23 780 | 2164953.jpg 23 781 | 2173639.jpg 23 782 | 1335573.jpg 23 783 | 1222822.jpg 23 784 | 1055199.jpg 23 785 | 2033184.jpg 23 786 | 1723838.jpg 23 787 | 1093247.jpg 23 788 | 1185069.jpg 23 789 | 0527682.jpg 23 790 | 
1069925.jpg 23 791 | 0695327.jpg 23 792 | 1252026.jpg 23 793 | 1537100.jpg 23 794 | 1055149.jpg 23 795 | 0923888.jpg 23 796 | 1242658.jpg 23 797 | 1570498.jpg 23 798 | 1807932.jpg 23 799 | 0939541.jpg 23 800 | 1810937.jpg 23 801 | 1092270.jpg 24 802 | 1871273.jpg 24 803 | 1714240.jpg 24 804 | 0198104.jpg 24 805 | 1966096.jpg 24 806 | 1358664.jpg 24 807 | 0701751.jpg 24 808 | 2171602.jpg 24 809 | 2183458.jpg 24 810 | 1591976.jpg 24 811 | 1178011.jpg 24 812 | 0957883.jpg 24 813 | 2192006.jpg 24 814 | 1691816.jpg 24 815 | 1898311.jpg 24 816 | 0847415.jpg 24 817 | 1351643.jpg 24 818 | 0649605.jpg 24 819 | 2058107.jpg 24 820 | 0454592.jpg 24 821 | 1172324.jpg 24 822 | 1281985.jpg 24 823 | 2173352.jpg 24 824 | 1940832.jpg 24 825 | 1288222.jpg 24 826 | 1838645.jpg 24 827 | 2246817.jpg 24 828 | 1903505.jpg 24 829 | 0383395.jpg 24 830 | 1222076.jpg 24 831 | 0959262.jpg 24 832 | 1169718.jpg 24 833 | 2170860.jpg 24 834 | 1396941.jpg 25 835 | 1109377.jpg 25 836 | 1597449.jpg 25 837 | 1211309.jpg 25 838 | 0412223.jpg 25 839 | 0894324.jpg 25 840 | 1739566.jpg 25 841 | 1551668.jpg 25 842 | 1458698.jpg 25 843 | 0067618.jpg 25 844 | 1876402.jpg 25 845 | 1139750.jpg 25 846 | 1471543.jpg 25 847 | 0784484.jpg 25 848 | 0668444.jpg 25 849 | 1659448.jpg 25 850 | 0523219.jpg 25 851 | 0676942.jpg 25 852 | 1470663.jpg 25 853 | 0225210.jpg 25 854 | 1885833.jpg 25 855 | 0957899.jpg 25 856 | 2186944.jpg 25 857 | 2211026.jpg 25 858 | 0361109.jpg 25 859 | 1662015.jpg 25 860 | 0699565.jpg 25 861 | 0688076.jpg 25 862 | 1768872.jpg 25 863 | 0716437.jpg 25 864 | 1878567.jpg 25 865 | 0802263.jpg 25 866 | 1036863.jpg 25 867 | 1555884.jpg 26 868 | 0447769.jpg 26 869 | 0936133.jpg 26 870 | 1461862.jpg 26 871 | 0862739.jpg 26 872 | 0083150.jpg 26 873 | 1559373.jpg 26 874 | 0182727.jpg 26 875 | 1526019.jpg 26 876 | 1921311.jpg 26 877 | 2245660.jpg 26 878 | 1496971.jpg 26 879 | 2191945.jpg 26 880 | 0136472.jpg 26 881 | 1231168.jpg 26 882 | 1222601.jpg 26 883 | 1327939.jpg 26 884 | 1019107.jpg 26 885 | 
2194912.jpg 26 886 | 2069707.jpg 26 887 | 1469738.jpg 26 888 | 0740504.jpg 26 889 | 0181710.jpg 26 890 | 1129148.jpg 26 891 | 2006296.jpg 26 892 | 1366856.jpg 26 893 | 0361611.jpg 26 894 | 1218624.jpg 26 895 | 1611219.jpg 26 896 | 1107447.jpg 26 897 | 1886463.jpg 26 898 | 0498140.jpg 26 899 | 0740884.jpg 26 900 | 0863885.jpg 26 901 | 1600343.jpg 27 902 | 1855382.jpg 27 903 | 2175403.jpg 27 904 | 1856980.jpg 27 905 | 2084801.jpg 27 906 | 1143133.jpg 27 907 | 1459418.jpg 27 908 | 0786219.jpg 27 909 | 1935749.jpg 27 910 | 2239290.jpg 27 911 | 1682813.jpg 27 912 | 2120035.jpg 27 913 | 0723567.jpg 27 914 | 2147399.jpg 27 915 | 2207935.jpg 27 916 | 1413316.jpg 27 917 | 0324845.jpg 27 918 | 1945458.jpg 27 919 | 0457837.jpg 27 920 | 1810201.jpg 27 921 | 1884889.jpg 27 922 | 1253455.jpg 27 923 | 1550643.jpg 27 924 | 0337277.jpg 27 925 | 1193361.jpg 27 926 | 1240501.jpg 27 927 | 1106412.jpg 27 928 | 0744228.jpg 27 929 | 0350198.jpg 27 930 | 2026736.jpg 27 931 | 0067863.jpg 27 932 | 1884741.jpg 27 933 | 2208639.jpg 27 934 | 1547722.jpg 28 935 | 1925949.jpg 28 936 | 0328368.jpg 28 937 | 1396726.jpg 28 938 | 1894690.jpg 28 939 | 1985096.jpg 28 940 | 1966252.jpg 28 941 | 1283779.jpg 28 942 | 2140540.jpg 28 943 | 0065291.jpg 28 944 | 2225269.jpg 28 945 | 1500991.jpg 28 946 | 1320344.jpg 28 947 | 0249916.jpg 28 948 | 1604652.jpg 28 949 | 0487330.jpg 28 950 | 1952115.jpg 28 951 | 1910575.jpg 28 952 | 2234653.jpg 28 953 | 1616511.jpg 28 954 | 2120031.jpg 28 955 | 1427275.jpg 28 956 | 1149067.jpg 28 957 | 0460482.jpg 28 958 | 0819081.jpg 28 959 | 0517745.jpg 28 960 | 2103081.jpg 28 961 | 1855285.jpg 28 962 | 1591877.jpg 28 963 | 1037982.jpg 28 964 | 2250823.jpg 28 965 | 1952832.jpg 28 966 | 0846181.jpg 28 967 | 0056782.jpg 29 968 | 0553601.jpg 29 969 | 0188372.jpg 29 970 | 0842387.jpg 29 971 | 0136468.jpg 29 972 | 0346400.jpg 29 973 | 0719790.jpg 29 974 | 0950119.jpg 29 975 | 0862199.jpg 29 976 | 0437571.jpg 29 977 | 0458627.jpg 29 978 | 1383088.jpg 29 979 | 1634020.jpg 29 980 | 
0445601.jpg 29 981 | 1732551.jpg 29 982 | 1054022.jpg 29 983 | 0944357.jpg 29 984 | 1764565.jpg 29 985 | 1097486.jpg 29 986 | 0995290.jpg 29 987 | 0658059.jpg 29 988 | 1278085.jpg 29 989 | 1602273.jpg 29 990 | 1543509.jpg 29 991 | 0852822.jpg 29 992 | 0350851.jpg 29 993 | 0115074.jpg 29 994 | 0710667.jpg 29 995 | 0155576.jpg 29 996 | 0842399.jpg 29 997 | 0064930.jpg 29 998 | 0939501.jpg 29 999 | 0721177.jpg 29 1000 | 0143336.jpg 29 1001 | 2061742.jpg 30 1002 | 1469768.jpg 30 1003 | 1122585.jpg 30 1004 | 0558237.jpg 30 1005 | 0283609.jpg 30 1006 | 2235586.jpg 30 1007 | 1718898.jpg 30 1008 | 0143338.jpg 30 1009 | 1891293.jpg 30 1010 | 0906211.jpg 30 1011 | 2215390.jpg 30 1012 | 1075856.jpg 30 1013 | 1112645.jpg 30 1014 | 1774584.jpg 30 1015 | 0072869.jpg 30 1016 | 1373938.jpg 30 1017 | 0422938.jpg 30 1018 | 1589412.jpg 30 1019 | 0447935.jpg 30 1020 | 1094693.jpg 30 1021 | 1250249.jpg 30 1022 | 0949944.jpg 30 1023 | 1290785.jpg 30 1024 | 0973041.jpg 30 1025 | 1750520.jpg 30 1026 | 0494477.jpg 30 1027 | 1250523.jpg 30 1028 | 0214193.jpg 30 1029 | 1194421.jpg 30 1030 | 2170875.jpg 30 1031 | 1950557.jpg 30 1032 | 1951612.jpg 30 1033 | 2154733.jpg 30 1034 | 0803688.jpg 31 1035 | 2054362.jpg 31 1036 | 0727657.jpg 31 1037 | 2081083.jpg 31 1038 | 2084805.jpg 31 1039 | 1298374.jpg 31 1040 | 1205178.jpg 31 1041 | 1241522.jpg 31 1042 | 1325220.jpg 31 1043 | 1240315.jpg 31 1044 | 2047539.jpg 31 1045 | 0936094.jpg 31 1046 | 1698608.jpg 31 1047 | 1205272.jpg 31 1048 | 1358180.jpg 31 1049 | 1469692.jpg 31 1050 | 0727647.jpg 31 1051 | 1331448.jpg 31 1052 | 0967566.jpg 31 1053 | 1308386.jpg 31 1054 | 1110734.jpg 31 1055 | 2193516.jpg 31 1056 | 0807071.jpg 31 1057 | 0633233.jpg 31 1058 | 0704984.jpg 31 1059 | 0927368.jpg 31 1060 | 1119079.jpg 31 1061 | 1514251.jpg 31 1062 | 1110733.jpg 31 1063 | 0959085.jpg 31 1064 | 2029900.jpg 31 1065 | 1291233.jpg 31 1066 | 2177828.jpg 31 1067 | 0488533.jpg 32 1068 | 1847505.jpg 32 1069 | 1231196.jpg 32 1070 | 1661601.jpg 32 1071 | 1346446.jpg 32 
1072 | 1842173.jpg 32 1073 | 2264249.jpg 32 1074 | 1885942.jpg 32 1075 | 1373275.jpg 32 1076 | 1624887.jpg 32 1077 | 1587527.jpg 32 1078 | 0770351.jpg 32 1079 | 1699455.jpg 32 1080 | 2130085.jpg 32 1081 | 0230833.jpg 32 1082 | 1794927.jpg 32 1083 | 2212521.jpg 32 1084 | 1009039.jpg 32 1085 | 1260329.jpg 32 1086 | 0683548.jpg 32 1087 | 0458635.jpg 32 1088 | 1691545.jpg 32 1089 | 0710665.jpg 32 1090 | 1567288.jpg 32 1091 | 0821375.jpg 32 1092 | 1820323.jpg 32 1093 | 1508910.jpg 32 1094 | 1201556.jpg 32 1095 | 2040834.jpg 32 1096 | 0478341.jpg 32 1097 | 0464756.jpg 32 1098 | 1615560.jpg 32 1099 | 1573406.jpg 32 1100 | 0199985.jpg 32 1101 | 1617066.jpg 33 1102 | 1941697.jpg 33 1103 | 1788036.jpg 33 1104 | 2046831.jpg 33 1105 | 1055151.jpg 33 1106 | 1757833.jpg 33 1107 | 1836183.jpg 33 1108 | 2054198.jpg 33 1109 | 1647304.jpg 33 1110 | 1718618.jpg 33 1111 | 2210073.jpg 33 1112 | 2131734.jpg 33 1113 | 2115336.jpg 33 1114 | 2084810.jpg 33 1115 | 2234610.jpg 33 1116 | 1195232.jpg 33 1117 | 2257341.jpg 33 1118 | 1921950.jpg 33 1119 | 1951063.jpg 33 1120 | 1990751.jpg 33 1121 | 1379570.jpg 33 1122 | 2247793.jpg 33 1123 | 1811111.jpg 33 1124 | 2188600.jpg 33 1125 | 1987428.jpg 33 1126 | 1941845.jpg 33 1127 | 1456849.jpg 33 1128 | 2087371.jpg 33 1129 | 2122835.jpg 33 1130 | 0977324.jpg 33 1131 | 2199048.jpg 33 1132 | 2251118.jpg 33 1133 | 1979068.jpg 33 1134 | 0995392.jpg 34 1135 | 0523021.jpg 34 1136 | 0967960.jpg 34 1137 | 1801880.jpg 34 1138 | 1220586.jpg 34 1139 | 0386931.jpg 34 1140 | 1429835.jpg 34 1141 | 1346448.jpg 34 1142 | 0822346.jpg 34 1143 | 1054467.jpg 34 1144 | 0800867.jpg 34 1145 | 0114437.jpg 34 1146 | 1570251.jpg 34 1147 | 0130601.jpg 34 1148 | 0693418.jpg 34 1149 | 1459209.jpg 34 1150 | 1377098.jpg 34 1151 | 0730742.jpg 34 1152 | 0702843.jpg 34 1153 | 1149116.jpg 34 1154 | 1023312.jpg 34 1155 | 1218546.jpg 34 1156 | 0311117.jpg 34 1157 | 1926943.jpg 34 1158 | 1277981.jpg 34 1159 | 0337958.jpg 34 1160 | 0858701.jpg 34 1161 | 1121149.jpg 34 1162 | 2097905.jpg 
34 1163 | 2176403.jpg 34 1164 | 1548487.jpg 34 1165 | 0066977.jpg 34 1166 | 0977357.jpg 34 1167 | 0907304.jpg 35 1168 | 2072271.jpg 35 1169 | 1842068.jpg 35 1170 | 0109458.jpg 35 1171 | 0921924.jpg 35 1172 | 1253449.jpg 35 1173 | 0302719.jpg 35 1174 | 1471166.jpg 35 1175 | 1901216.jpg 35 1176 | 1113454.jpg 35 1177 | 0155611.jpg 35 1178 | 1912972.jpg 35 1179 | 1629114.jpg 35 1180 | 1381988.jpg 35 1181 | 1318533.jpg 35 1182 | 2196365.jpg 35 1183 | 2125186.jpg 35 1184 | 2184062.jpg 35 1185 | 0136180.jpg 35 1186 | 1004978.jpg 35 1187 | 0931109.jpg 35 1188 | 2191964.jpg 35 1189 | 1323666.jpg 35 1190 | 0483586.jpg 35 1191 | 1022616.jpg 35 1192 | 0688078.jpg 35 1193 | 1222169.jpg 35 1194 | 0193711.jpg 35 1195 | 0177480.jpg 35 1196 | 1358696.jpg 35 1197 | 1950558.jpg 35 1198 | 0683605.jpg 35 1199 | 2218547.jpg 35 1200 | 0182732.jpg 35 1201 | 1525661.jpg 36 1202 | 0103329.jpg 36 1203 | 1144556.jpg 36 1204 | 0737315.jpg 36 1205 | 0767799.jpg 36 1206 | 0592094.jpg 36 1207 | 0490170.jpg 36 1208 | 1335478.jpg 36 1209 | 1691785.jpg 36 1210 | 0894379.jpg 36 1211 | 0755283.jpg 36 1212 | 0918727.jpg 36 1213 | 0812906.jpg 36 1214 | 0811680.jpg 36 1215 | 0987259.jpg 36 1216 | 1119088.jpg 36 1217 | 0878583.jpg 36 1218 | 1018992.jpg 36 1219 | 0756079.jpg 36 1220 | 1273373.jpg 36 1221 | 1639639.jpg 36 1222 | 1314034.jpg 36 1223 | 1332980.jpg 36 1224 | 0738255.jpg 36 1225 | 0645934.jpg 36 1226 | 0988940.jpg 36 1227 | 0483241.jpg 36 1228 | 1581651.jpg 36 1229 | 0582373.jpg 36 1230 | 0737371.jpg 36 1231 | 0582376.jpg 36 1232 | 1448054.jpg 36 1233 | 0814904.jpg 36 1234 | 0934710.jpg 37 1235 | 0979639.jpg 37 1236 | 0435276.jpg 37 1237 | 0959053.jpg 37 1238 | 1660306.jpg 37 1239 | 2004307.jpg 37 1240 | 0894309.jpg 37 1241 | 0920416.jpg 37 1242 | 1407338.jpg 37 1243 | 1492329.jpg 37 1244 | 1746124.jpg 37 1245 | 1631127.jpg 37 1246 | 1782057.jpg 37 1247 | 1407337.jpg 37 1248 | 1913801.jpg 37 1249 | 1926379.jpg 37 1250 | 0562110.jpg 37 1251 | 0257376.jpg 37 1252 | 0923557.jpg 37 1253 | 
0702871.jpg 37 1254 | 1538840.jpg 37 1255 | 0977327.jpg 37 1256 | 1531908.jpg 37 1257 | 1738905.jpg 37 1258 | 0895456.jpg 37 1259 | 1607357.jpg 37 1260 | 0418706.jpg 37 1261 | 0894325.jpg 37 1262 | 0136140.jpg 37 1263 | 0967804.jpg 37 1264 | 2053595.jpg 37 1265 | 2031402.jpg 37 1266 | 0582368.jpg 37 1267 | 1631130.jpg 38 1268 | 0324852.jpg 38 1269 | 1950528.jpg 38 1270 | 1288673.jpg 38 1271 | 1417568.jpg 38 1272 | 0963962.jpg 38 1273 | 1486401.jpg 38 1274 | 0094398.jpg 38 1275 | 0586973.jpg 38 1276 | 1740617.jpg 38 1277 | 1008176.jpg 38 1278 | 0136185.jpg 38 1279 | 2147406.jpg 38 1280 | 1415721.jpg 38 1281 | 0894206.jpg 38 1282 | 1407350.jpg 38 1283 | 0356192.jpg 38 1284 | 2007883.jpg 38 1285 | 0076782.jpg 38 1286 | 0799585.jpg 38 1287 | 1941995.jpg 38 1288 | 1259710.jpg 38 1289 | 1637445.jpg 38 1290 | 2107214.jpg 38 1291 | 1414866.jpg 38 1292 | 0814348.jpg 38 1293 | 0808883.jpg 38 1294 | 1966243.jpg 38 1295 | 1103334.jpg 38 1296 | 0694269.jpg 38 1297 | 1152152.jpg 38 1298 | 1169470.jpg 38 1299 | 0462061.jpg 38 1300 | 1312740.jpg 38 1301 | 0683604.jpg 39 1302 | 0917345.jpg 39 1303 | 2072230.jpg 39 1304 | 1158578.jpg 39 1305 | 1025167.jpg 39 1306 | 1272007.jpg 39 1307 | 1419958.jpg 39 1308 | 1627449.jpg 39 1309 | 2143407.jpg 39 1310 | 1688324.jpg 39 1311 | 1655409.jpg 39 1312 | 0913417.jpg 39 1313 | 1903393.jpg 39 1314 | 1069041.jpg 39 1315 | 1784216.jpg 39 1316 | 1047313.jpg 39 1317 | 0525265.jpg 39 1318 | 1801417.jpg 39 1319 | 0753360.jpg 39 1320 | 2186370.jpg 39 1321 | 1358537.jpg 39 1322 | 0297025.jpg 39 1323 | 1340322.jpg 39 1324 | 0697871.jpg 39 1325 | 0713916.jpg 39 1326 | 1535637.jpg 39 1327 | 1527350.jpg 39 1328 | 2171063.jpg 39 1329 | 1545981.jpg 39 1330 | 0939614.jpg 39 1331 | 1014363.jpg 39 1332 | 0677428.jpg 39 1333 | 1670149.jpg 39 1334 | 0495198.jpg 40 1335 | 1776868.jpg 40 1336 | 0935361.jpg 40 1337 | 0457849.jpg 40 1338 | 1094607.jpg 40 1339 | 0175726.jpg 40 1340 | 0337963.jpg 40 1341 | 0127621.jpg 40 1342 | 0457880.jpg 40 1343 | 0362870.jpg 40 1344 
| 0091357.jpg 40 1345 | 0306188.jpg 40 1346 | 0126154.jpg 40 1347 | 0082497.jpg 40 1348 | 0067035.jpg 40 1349 | 0457881.jpg 40 1350 | 0337959.jpg 40 1351 | 0302935.jpg 40 1352 | 0263496.jpg 40 1353 | 1341837.jpg 40 1354 | 1592286.jpg 40 1355 | 1767102.jpg 40 1356 | 1355218.jpg 40 1357 | 1337523.jpg 40 1358 | 1415430.jpg 40 1359 | 0770350.jpg 40 1360 | 0730396.jpg 40 1361 | 1312418.jpg 40 1362 | 1036862.jpg 40 1363 | 0227120.jpg 40 1364 | 0592099.jpg 40 1365 | 0396119.jpg 40 1366 | 1129974.jpg 40 1367 | 0181728.jpg 41 1368 | 0525841.jpg 41 1369 | 1474933.jpg 41 1370 | 2039044.jpg 41 1371 | 1474064.jpg 41 1372 | 1146068.jpg 41 1373 | 0176115.jpg 41 1374 | 1838535.jpg 41 1375 | 0445395.jpg 41 1376 | 0435287.jpg 41 1377 | 1092246.jpg 41 1378 | 1088602.jpg 41 1379 | 0918624.jpg 41 1380 | 0061136.jpg 41 1381 | 1535190.jpg 41 1382 | 0165200.jpg 41 1383 | 2175632.jpg 41 1384 | 1379290.jpg 41 1385 | 0706180.jpg 41 1386 | 0885552.jpg 41 1387 | 1914874.jpg 41 1388 | 1341384.jpg 41 1389 | 1102466.jpg 41 1390 | 0799590.jpg 41 1391 | 1984367.jpg 41 1392 | 0574356.jpg 41 1393 | 0302020.jpg 41 1394 | 0708637.jpg 41 1395 | 1000107.jpg 41 1396 | 0383871.jpg 41 1397 | 0239583.jpg 41 1398 | 2245398.jpg 41 1399 | 0315569.jpg 41 1400 | 1053968.jpg 41 1401 | 1202350.jpg 42 1402 | 0736232.jpg 42 1403 | 0963485.jpg 42 1404 | 1246917.jpg 42 1405 | 1244362.jpg 42 1406 | 1586062.jpg 42 1407 | 1593383.jpg 42 1408 | 0759282.jpg 42 1409 | 0774763.jpg 42 1410 | 1869952.jpg 42 1411 | 1331516.jpg 42 1412 | 1564407.jpg 42 1413 | 1621666.jpg 42 1414 | 2110996.jpg 42 1415 | 0911010.jpg 42 1416 | 0326603.jpg 42 1417 | 0547019.jpg 42 1418 | 0773394.jpg 42 1419 | 2228007.jpg 42 1420 | 0755464.jpg 42 1421 | 0851299.jpg 42 1422 | 2175630.jpg 42 1423 | 0773399.jpg 42 1424 | 1762100.jpg 42 1425 | 2139346.jpg 42 1426 | 0875337.jpg 42 1427 | 0681901.jpg 42 1428 | 0464476.jpg 42 1429 | 1564427.jpg 42 1430 | 1304098.jpg 42 1431 | 1144110.jpg 42 1432 | 0883341.jpg 42 1433 | 1256472.jpg 42 1434 | 0683550.jpg 43 
1435 | 1785637.jpg 43 1436 | 0618973.jpg 43 1437 | 1345223.jpg 43 1438 | 0870655.jpg 43 1439 | 1147907.jpg 43 1440 | 1222772.jpg 43 1441 | 1149069.jpg 43 1442 | 0441014.jpg 43 1443 | 1745957.jpg 43 1444 | 0458771.jpg 43 1445 | 0773538.jpg 43 1446 | 1404441.jpg 43 1447 | 1044764.jpg 43 1448 | 1102454.jpg 43 1449 | 0879882.jpg 43 1450 | 1639640.jpg 43 1451 | 0730690.jpg 43 1452 | 0068811.jpg 43 1453 | 0457908.jpg 43 1454 | 0952726.jpg 43 1455 | 0961452.jpg 43 1456 | 1592586.jpg 43 1457 | 0944161.jpg 43 1458 | 0551411.jpg 43 1459 | 1538889.jpg 43 1460 | 1869320.jpg 43 1461 | 1446336.jpg 43 1462 | 1237817.jpg 43 1463 | 0879888.jpg 43 1464 | 2169306.jpg 43 1465 | 1028184.jpg 43 1466 | 1089013.jpg 43 1467 | 1636446.jpg 44 1468 | 1696057.jpg 44 1469 | 0916246.jpg 44 1470 | 2183366.jpg 44 1471 | 0650211.jpg 44 1472 | 0880309.jpg 44 1473 | 0065406.jpg 44 1474 | 0181723.jpg 44 1475 | 2231271.jpg 44 1476 | 0958484.jpg 44 1477 | 0977248.jpg 44 1478 | 2198151.jpg 44 1479 | 1060226.jpg 44 1480 | 0487363.jpg 44 1481 | 1426707.jpg 44 1482 | 0227144.jpg 44 1483 | 1531909.jpg 44 1484 | 1921769.jpg 44 1485 | 1182683.jpg 44 1486 | 0841712.jpg 44 1487 | 0922754.jpg 44 1488 | 0701629.jpg 44 1489 | 0866582.jpg 44 1490 | 1601262.jpg 44 1491 | 1288668.jpg 44 1492 | 1299978.jpg 44 1493 | 0928613.jpg 44 1494 | 0225469.jpg 44 1495 | 0523111.jpg 44 1496 | 0895892.jpg 44 1497 | 0907437.jpg 44 1498 | 1990014.jpg 44 1499 | 1742638.jpg 44 1500 | 1414836.jpg 44 1501 | 1246713.jpg 45 1502 | 1698615.jpg 45 1503 | 1387820.jpg 45 1504 | 1109944.jpg 45 1505 | 1473973.jpg 45 1506 | 0243476.jpg 45 1507 | 0383407.jpg 45 1508 | 1206458.jpg 45 1509 | 1608287.jpg 45 1510 | 1240500.jpg 45 1511 | 2124891.jpg 45 1512 | 0386926.jpg 45 1513 | 0312400.jpg 45 1514 | 0490833.jpg 45 1515 | 0354453.jpg 45 1516 | 0361101.jpg 45 1517 | 2123149.jpg 45 1518 | 1100466.jpg 45 1519 | 0988559.jpg 45 1520 | 0209760.jpg 45 1521 | 0521261.jpg 45 1522 | 1692500.jpg 45 1523 | 2239753.jpg 45 1524 | 1759327.jpg 45 1525 | 1152155.jpg 
45 1526 | 1717925.jpg 45 1527 | 0328525.jpg 45 1528 | 1426526.jpg 45 1529 | 1982724.jpg 45 1530 | 0914134.jpg 45 1531 | 1001818.jpg 45 1532 | 0550295.jpg 45 1533 | 0312421.jpg 45 1534 | 1956504.jpg 46 1535 | 2167369.jpg 46 1536 | 1358472.jpg 46 1537 | 1873333.jpg 46 1538 | 1053558.jpg 46 1539 | 1155014.jpg 46 1540 | 2200403.jpg 46 1541 | 2076551.jpg 46 1542 | 1088809.jpg 46 1543 | 1259151.jpg 46 1544 | 1727705.jpg 46 1545 | 1393350.jpg 46 1546 | 2194210.jpg 46 1547 | 1258559.jpg 46 1548 | 2176477.jpg 46 1549 | 1401606.jpg 46 1550 | 1163812.jpg 46 1551 | 1457541.jpg 46 1552 | 1364804.jpg 46 1553 | 1255844.jpg 46 1554 | 1464135.jpg 46 1555 | 1869389.jpg 46 1556 | 1806584.jpg 46 1557 | 1834899.jpg 46 1558 | 1975737.jpg 46 1559 | 1810935.jpg 46 1560 | 1540396.jpg 46 1561 | 1221335.jpg 46 1562 | 1719924.jpg 46 1563 | 1292386.jpg 46 1564 | 1283646.jpg 46 1565 | 1861815.jpg 46 1566 | 1483143.jpg 46 1567 | 1605923.jpg 47 1568 | 1560636.jpg 47 1569 | 1396258.jpg 47 1570 | 2172533.jpg 47 1571 | 1726922.jpg 47 1572 | 1892613.jpg 47 1573 | 1956487.jpg 47 1574 | 1217074.jpg 47 1575 | 1658851.jpg 47 1576 | 1829871.jpg 47 1577 | 0744300.jpg 47 1578 | 1194001.jpg 47 1579 | 1324595.jpg 47 1580 | 2111679.jpg 47 1581 | 1093290.jpg 47 1582 | 1125994.jpg 47 1583 | 1731827.jpg 47 1584 | 1283206.jpg 47 1585 | 2072892.jpg 47 1586 | 1955377.jpg 47 1587 | 1357987.jpg 47 1588 | 1170071.jpg 47 1589 | 2240083.jpg 47 1590 | 1928115.jpg 47 1591 | 1164982.jpg 47 1592 | 1623632.jpg 47 1593 | 1495768.jpg 47 1594 | 1690728.jpg 47 1595 | 0779327.jpg 47 1596 | 1395799.jpg 47 1597 | 0544844.jpg 47 1598 | 1703860.jpg 47 1599 | 1103400.jpg 47 1600 | 2204066.jpg 47 1601 | 1606037.jpg 48 1602 | 2259451.jpg 48 1603 | 1526601.jpg 48 1604 | 1217071.jpg 48 1605 | 1259623.jpg 48 1606 | 1308317.jpg 48 1607 | 1525673.jpg 48 1608 | 1158363.jpg 48 1609 | 0979612.jpg 48 1610 | 2111853.jpg 48 1611 | 1510644.jpg 48 1612 | 2100018.jpg 48 1613 | 0492475.jpg 48 1614 | 1969049.jpg 48 1615 | 0078400.jpg 48 1616 | 
1337428.jpg 48 1617 | 2239587.jpg 48 1618 | 0302204.jpg 48 1619 | 2151013.jpg 48 1620 | 2107238.jpg 48 1621 | 1382000.jpg 48 1622 | 0523033.jpg 48 1623 | 0761960.jpg 48 1624 | 2067268.jpg 48 1625 | 0157278.jpg 48 1626 | 2139351.jpg 48 1627 | 1031503.jpg 48 1628 | 1215875.jpg 48 1629 | 1340613.jpg 48 1630 | 1158325.jpg 48 1631 | 1926830.jpg 48 1632 | 0082500.jpg 48 1633 | 1767276.jpg 48 1634 | 2018768.jpg 49 1635 | 1927638.jpg 49 1636 | 2190740.jpg 49 1637 | 1861737.jpg 49 1638 | 1889566.jpg 49 1639 | 1381998.jpg 49 1640 | 1095184.jpg 49 1641 | 1796970.jpg 49 1642 | 1762803.jpg 49 1643 | 1046468.jpg 49 1644 | 2190848.jpg 49 1645 | 2266336.jpg 49 1646 | 1305611.jpg 49 1647 | 1967054.jpg 49 1648 | 1308404.jpg 49 1649 | 1807114.jpg 49 1650 | 0796349.jpg 49 1651 | 1806227.jpg 49 1652 | 1476061.jpg 49 1653 | 2228978.jpg 49 1654 | 2072231.jpg 49 1655 | 1561986.jpg 49 1656 | 1769000.jpg 49 1657 | 1468399.jpg 49 1658 | 0859229.jpg 49 1659 | 1304344.jpg 49 1660 | 1655252.jpg 49 1661 | 2013386.jpg 49 1662 | 1480422.jpg 49 1663 | 2161158.jpg 49 1664 | 1310123.jpg 49 1665 | 2135566.jpg 49 1666 | 1124590.jpg 49 1667 | 0446944.jpg 50 1668 | 1333183.jpg 50 1669 | 0901473.jpg 50 1670 | 1647665.jpg 50 1671 | 2255571.jpg 50 1672 | 1468400.jpg 50 1673 | 2116329.jpg 50 1674 | 0225634.jpg 50 1675 | 1363662.jpg 50 1676 | 2188604.jpg 50 1677 | 0880582.jpg 50 1678 | 1062970.jpg 50 1679 | 1040656.jpg 50 1680 | 1388499.jpg 50 1681 | 1336582.jpg 50 1682 | 1321494.jpg 50 1683 | 1329158.jpg 50 1684 | 1963559.jpg 50 1685 | 2207610.jpg 50 1686 | 1508811.jpg 50 1687 | 2229401.jpg 50 1688 | 0741298.jpg 50 1689 | 0951126.jpg 50 1690 | 1768807.jpg 50 1691 | 2137161.jpg 50 1692 | 2234272.jpg 50 1693 | 1641833.jpg 50 1694 | 1608084.jpg 50 1695 | 1545982.jpg 50 1696 | 2175263.jpg 50 1697 | 1934357.jpg 50 1698 | 1362530.jpg 50 1699 | 1032301.jpg 50 1700 | 1279709.jpg 50 1701 | 0126141.jpg 51 1702 | 2045488.jpg 51 1703 | 1230221.jpg 51 1704 | 1379044.jpg 51 1705 | 0781083.jpg 51 1706 | 1364299.jpg 51 1707 
| 1240939.jpg 51 1708 | 2185923.jpg 51 1709 | 1275786.jpg 51 1710 | 2169120.jpg 51 1711 | 0780417.jpg 51 1712 | 0571575.jpg 51 1713 | 2033052.jpg 51 1714 | 1069042.jpg 51 1715 | 0988971.jpg 51 1716 | 0683594.jpg 51 1717 | 0789676.jpg 51 1718 | 1223342.jpg 51 1719 | 0065328.jpg 51 1720 | 1468661.jpg 51 1721 | 1028854.jpg 51 1722 | 1023005.jpg 51 1723 | 1444712.jpg 51 1724 | 1853951.jpg 51 1725 | 0525849.jpg 51 1726 | 0909856.jpg 51 1727 | 1151616.jpg 51 1728 | 1423648.jpg 51 1729 | 0705082.jpg 51 1730 | 1158365.jpg 51 1731 | 1394145.jpg 51 1732 | 2213541.jpg 51 1733 | 1522559.jpg 51 1734 | 0091257.jpg 52 1735 | 0043890.jpg 52 1736 | 0065207.jpg 52 1737 | 0678983.jpg 52 1738 | 0536340.jpg 52 1739 | 0333816.jpg 52 1740 | 0205769.jpg 52 1741 | 0574287.jpg 52 1742 | 0923550.jpg 52 1743 | 1002749.jpg 52 1744 | 0063838.jpg 52 1745 | 1448066.jpg 52 1746 | 0143351.jpg 52 1747 | 0136174.jpg 52 1748 | 1363954.jpg 52 1749 | 1314181.jpg 52 1750 | 0487317.jpg 52 1751 | 1089016.jpg 52 1752 | 0143354.jpg 52 1753 | 0187248.jpg 52 1754 | 1113208.jpg 52 1755 | 1190656.jpg 52 1756 | 0066521.jpg 52 1757 | 0677579.jpg 52 1758 | 0901363.jpg 52 1759 | 1013524.jpg 52 1760 | 1017202.jpg 52 1761 | 0894213.jpg 52 1762 | 0450544.jpg 52 1763 | 0536338.jpg 52 1764 | 0063059.jpg 52 1765 | 0568803.jpg 52 1766 | 0540272.jpg 52 1767 | 2223262.jpg 53 1768 | 0632012.jpg 53 1769 | 0708633.jpg 53 1770 | 0525555.jpg 53 1771 | 0539185.jpg 53 1772 | 0544833.jpg 53 1773 | 1036828.jpg 53 1774 | 1751866.jpg 53 1775 | 0547337.jpg 53 1776 | 0539191.jpg 53 1777 | 0548761.jpg 53 1778 | 1338570.jpg 53 1779 | 1744484.jpg 53 1780 | 1036827.jpg 53 1781 | 0539189.jpg 53 1782 | 1236289.jpg 53 1783 | 1828039.jpg 53 1784 | 0554722.jpg 53 1785 | 1139864.jpg 53 1786 | 0548499.jpg 53 1787 | 0554934.jpg 53 1788 | 0422856.jpg 53 1789 | 0806578.jpg 53 1790 | 0767912.jpg 53 1791 | 0236037.jpg 53 1792 | 1002746.jpg 53 1793 | 0773531.jpg 53 1794 | 2223256.jpg 53 1795 | 0563774.jpg 53 1796 | 0713823.jpg 53 1797 | 1031443.jpg 53 
1798 | 0551835.jpg 53 1799 | 0593819.jpg 53 1800 | 1274483.jpg 53 1801 | 1316035.jpg 54 1802 | 0725711.jpg 54 1803 | 0727631.jpg 54 1804 | 1292390.jpg 54 1805 | 0619977.jpg 54 1806 | 0725666.jpg 54 1807 | 1762224.jpg 54 1808 | 0554928.jpg 54 1809 | 0554640.jpg 54 1810 | 0548478.jpg 54 1811 | 0956802.jpg 54 1812 | 0563866.jpg 54 1813 | 2121969.jpg 54 1814 | 0576250.jpg 54 1815 | 0950429.jpg 54 1816 | 0545058.jpg 54 1817 | 0538371.jpg 54 1818 | 1338873.jpg 54 1819 | 0554718.jpg 54 1820 | 1785545.jpg 54 1821 | 1456007.jpg 54 1822 | 1517850.jpg 54 1823 | 1937368.jpg 54 1824 | 0730398.jpg 54 1825 | 0620009.jpg 54 1826 | 0102560.jpg 54 1827 | 1568883.jpg 54 1828 | 0547008.jpg 54 1829 | 0648034.jpg 54 1830 | 0438668.jpg 54 1831 | 0901533.jpg 54 1832 | 0539791.jpg 54 1833 | 1316960.jpg 54 1834 | 1002431.jpg 55 1835 | 1428985.jpg 55 1836 | 0765750.jpg 55 1837 | 0640660.jpg 55 1838 | 1201877.jpg 55 1839 | 0956798.jpg 55 1840 | 0177063.jpg 55 1841 | 1426920.jpg 55 1842 | 0219265.jpg 55 1843 | 0993951.jpg 55 1844 | 1050604.jpg 55 1845 | 1343589.jpg 55 1846 | 0608863.jpg 55 1847 | 0395643.jpg 55 1848 | 1349066.jpg 55 1849 | 1818866.jpg 55 1850 | 0092971.jpg 55 1851 | 1193567.jpg 55 1852 | 0109630.jpg 55 1853 | 0718917.jpg 55 1854 | 0987741.jpg 55 1855 | 0901457.jpg 55 1856 | 0880592.jpg 55 1857 | 0721412.jpg 55 1858 | 0065405.jpg 55 1859 | 0179069.jpg 55 1860 | 1168800.jpg 55 1861 | 0989090.jpg 55 1862 | 0643700.jpg 55 1863 | 0522981.jpg 55 1864 | 0536312.jpg 55 1865 | 1317750.jpg 55 1866 | 0694464.jpg 55 1867 | 0133434.jpg 56 1868 | 0248131.jpg 56 1869 | 0477027.jpg 56 1870 | 1329317.jpg 56 1871 | 0116702.jpg 56 1872 | 0062134.jpg 56 1873 | 1197204.jpg 56 1874 | 1713266.jpg 56 1875 | 1377250.jpg 56 1876 | 0143378.jpg 56 1877 | 0564155.jpg 56 1878 | 0182190.jpg 56 1879 | 0136193.jpg 56 1880 | 0907380.jpg 56 1881 | 0923548.jpg 56 1882 | 0441018.jpg 56 1883 | 0064793.jpg 56 1884 | 0108172.jpg 56 1885 | 1745942.jpg 56 1886 | 0704547.jpg 56 1887 | 0133438.jpg 56 1888 | 1376993.jpg 
56 1889 | 1707909.jpg 56 1890 | 1312496.jpg 56 1891 | 0457831.jpg 56 1892 | 1776869.jpg 56 1893 | 1296942.jpg 56 1894 | 0551249.jpg 56 1895 | 0813373.jpg 56 1896 | 1190643.jpg 56 1897 | 1002382.jpg 56 1898 | 0901518.jpg 56 1899 | 0409535.jpg 56 1900 | 0875346.jpg 56 1901 | 0788163.jpg 57 1902 | 1529298.jpg 57 1903 | 1195235.jpg 57 1904 | 1499332.jpg 57 1905 | 0730318.jpg 57 1906 | 0885288.jpg 57 1907 | 1696301.jpg 57 1908 | 1928769.jpg 57 1909 | 1706500.jpg 57 1910 | 2106391.jpg 57 1911 | 1553876.jpg 57 1912 | 2195297.jpg 57 1913 | 0911710.jpg 57 1914 | 2215054.jpg 57 1915 | 1934561.jpg 57 1916 | 1260718.jpg 57 1917 | 1744483.jpg 57 1918 | 1980186.jpg 57 1919 | 1384679.jpg 57 1920 | 1979290.jpg 57 1921 | 1302570.jpg 57 1922 | 1115130.jpg 57 1923 | 0877674.jpg 57 1924 | 1369785.jpg 57 1925 | 0914477.jpg 57 1926 | 1422217.jpg 57 1927 | 1001809.jpg 57 1928 | 1472410.jpg 57 1929 | 1767081.jpg 57 1930 | 1673535.jpg 57 1931 | 1928845.jpg 57 1932 | 1773158.jpg 57 1933 | 1979306.jpg 57 1934 | 1369769.jpg 58 1935 | 1622728.jpg 58 1936 | 2180329.jpg 58 1937 | 1773176.jpg 58 1938 | 0743270.jpg 58 1939 | 0981290.jpg 58 1940 | 2158197.jpg 58 1941 | 1674645.jpg 58 1942 | 2074174.jpg 58 1943 | 1382206.jpg 58 1944 | 2199430.jpg 58 1945 | 1570575.jpg 58 1946 | 1915066.jpg 58 1947 | 1738367.jpg 58 1948 | 1570890.jpg 58 1949 | 2165387.jpg 58 1950 | 1368002.jpg 58 1951 | 1513734.jpg 58 1952 | 1237798.jpg 58 1953 | 1591269.jpg 58 1954 | 1533267.jpg 58 1955 | 1378086.jpg 58 1956 | 1523427.jpg 58 1957 | 1738373.jpg 58 1958 | 1191372.jpg 58 1959 | 1119476.jpg 58 1960 | 1177272.jpg 58 1961 | 0765178.jpg 58 1962 | 1376745.jpg 58 1963 | 1768028.jpg 58 1964 | 2152684.jpg 58 1965 | 1337995.jpg 58 1966 | 1770257.jpg 58 1967 | 0076795.jpg 59 1968 | 2005735.jpg 59 1969 | 0908027.jpg 59 1970 | 1043782.jpg 59 1971 | 2003458.jpg 59 1972 | 1053502.jpg 59 1973 | 1008467.jpg 59 1974 | 0303914.jpg 59 1975 | 1997544.jpg 59 1976 | 2004744.jpg 59 1977 | 2220794.jpg 59 1978 | 1992125.jpg 59 1979 | 
0738985.jpg 59 1980 | 2066356.jpg 59 1981 | 1252819.jpg 59 1982 | 0712752.jpg 59 1983 | 1173007.jpg 59 1984 | 0564318.jpg 59 1985 | 0302877.jpg 59 1986 | 2006288.jpg 59 1987 | 0182103.jpg 59 1988 | 0338375.jpg 59 1989 | 1009081.jpg 59 1990 | 1991742.jpg 59 1991 | 1977444.jpg 59 1992 | 0857553.jpg 59 1993 | 2008338.jpg 59 1994 | 1999441.jpg 59 1995 | 2002994.jpg 59 1996 | 0913620.jpg 59 1997 | 1992617.jpg 59 1998 | 0735947.jpg 59 1999 | 2005736.jpg 59 2000 | 0734890.jpg 59 2001 | 1338364.jpg 60 2002 | 0095379.jpg 60 2003 | 0167124.jpg 60 2004 | 0313722.jpg 60 2005 | 0949750.jpg 60 2006 | 0321703.jpg 60 2007 | 0218791.jpg 60 2008 | 1257119.jpg 60 2009 | 1533629.jpg 60 2010 | 1222395.jpg 60 2011 | 1136120.jpg 60 2012 | 0091403.jpg 60 2013 | 1043797.jpg 60 2014 | 0192881.jpg 60 2015 | 0299623.jpg 60 2016 | 1334566.jpg 60 2017 | 0576232.jpg 60 2018 | 0072491.jpg 60 2019 | 0210366.jpg 60 2020 | 1251898.jpg 60 2021 | 0302290.jpg 60 2022 | 1772163.jpg 60 2023 | 0114407.jpg 60 2024 | 1378720.jpg 60 2025 | 1422135.jpg 60 2026 | 1534695.jpg 60 2027 | 0413384.jpg 60 2028 | 0537423.jpg 60 2029 | 2118163.jpg 60 2030 | 1023006.jpg 60 2031 | 1088366.jpg 60 2032 | 1043726.jpg 60 2033 | 0693515.jpg 60 2034 | 0229943.jpg 61 2035 | 1983628.jpg 61 2036 | 1300481.jpg 61 2037 | 1146067.jpg 61 2038 | 1351312.jpg 61 2039 | 2009849.jpg 61 2040 | 0066278.jpg 61 2041 | 1225610.jpg 61 2042 | 0688075.jpg 61 2043 | 0799381.jpg 61 2044 | 0836152.jpg 61 2045 | 1784213.jpg 61 2046 | 1291373.jpg 61 2047 | 2032655.jpg 61 2048 | 2163949.jpg 61 2049 | 2213539.jpg 61 2050 | 2256413.jpg 61 2051 | 1837424.jpg 61 2052 | 2169125.jpg 61 2053 | 2147235.jpg 61 2054 | 0255361.jpg 61 2055 | 0485221.jpg 61 2056 | 0302924.jpg 61 2057 | 1600892.jpg 61 2058 | 0457863.jpg 61 2059 | 1053554.jpg 61 2060 | 0522866.jpg 61 2061 | 0413905.jpg 61 2062 | 1143419.jpg 61 2063 | 2219553.jpg 61 2064 | 0256999.jpg 61 2065 | 1885840.jpg 61 2066 | 1326471.jpg 61 2067 | 1357567.jpg 62 2068 | 2231688.jpg 62 2069 | 2166979.jpg 62 2070 
| 1592961.jpg 62 2071 | 1829314.jpg 62 2072 | 2003512.jpg 62 2073 | 2229409.jpg 62 2074 | 1928761.jpg 62 2075 | 2074641.jpg 62 2076 | 1329269.jpg 62 2077 | 1295011.jpg 62 2078 | 2236569.jpg 62 2079 | 2162056.jpg 62 2080 | 1408685.jpg 62 2081 | 1276588.jpg 62 2082 | 2011071.jpg 62 2083 | 1378082.jpg 62 2084 | 1374699.jpg 62 2085 | 2083823.jpg 62 2086 | 1339197.jpg 62 2087 | 1423710.jpg 62 2088 | 1586850.jpg 62 2089 | 1385878.jpg 62 2090 | 1581524.jpg 62 2091 | 1348956.jpg 62 2092 | 1521970.jpg 62 2093 | 1388205.jpg 62 2094 | 1428350.jpg 62 2095 | 1398420.jpg 62 2096 | 1821967.jpg 62 2097 | 2030080.jpg 62 2098 | 1827450.jpg 62 2099 | 1625116.jpg 62 2100 | 2185964.jpg 62 2101 | 1270714.jpg 63 2102 | 1426175.jpg 63 2103 | 0155602.jpg 63 2104 | 1763667.jpg 63 2105 | 1092039.jpg 63 2106 | 0278439.jpg 63 2107 | 2236758.jpg 63 2108 | 0909855.jpg 63 2109 | 0517628.jpg 63 2110 | 0077517.jpg 63 2111 | 0908026.jpg 63 2112 | 0921985.jpg 63 2113 | 1549180.jpg 63 2114 | 1268480.jpg 63 2115 | 0838486.jpg 63 2116 | 1931138.jpg 63 2117 | 0326676.jpg 63 2118 | 0761149.jpg 63 2119 | 1281620.jpg 63 2120 | 2166855.jpg 63 2121 | 1535205.jpg 63 2122 | 0628815.jpg 63 2123 | 1152778.jpg 63 2124 | 1202541.jpg 63 2125 | 1052538.jpg 63 2126 | 0143367.jpg 63 2127 | 2134405.jpg 63 2128 | 0357874.jpg 63 2129 | 0574360.jpg 63 2130 | 0690637.jpg 63 2131 | 0517627.jpg 63 2132 | 0796384.jpg 63 2133 | 1261329.jpg 63 2134 | 1591312.jpg 64 2135 | 1634373.jpg 64 2136 | 1579853.jpg 64 2137 | 1461218.jpg 64 2138 | 0998786.jpg 64 2139 | 2107061.jpg 64 2140 | 1943476.jpg 64 2141 | 2190977.jpg 64 2142 | 1444748.jpg 64 2143 | 1312536.jpg 64 2144 | 0381609.jpg 64 2145 | 1296201.jpg 64 2146 | 1879121.jpg 64 2147 | 1901160.jpg 64 2148 | 1259554.jpg 64 2149 | 1966298.jpg 64 2150 | 1008903.jpg 64 2151 | 1714236.jpg 64 2152 | 1602628.jpg 64 2153 | 1676582.jpg 64 2154 | 0661734.jpg 64 2155 | 2117298.jpg 64 2156 | 0713593.jpg 64 2157 | 0944994.jpg 64 2158 | 0642071.jpg 64 2159 | 1169979.jpg 64 2160 | 0962643.jpg 64 
2161 | 1900937.jpg 64 2162 | 1375264.jpg 64 2163 | 0717995.jpg 64 2164 | 1953630.jpg 64 2165 | 2063913.jpg 64 2166 | 0972415.jpg 64 2167 | 2241970.jpg 65 2168 | 1644649.jpg 65 2169 | 1088603.jpg 65 2170 | 1776825.jpg 65 2171 | 2057962.jpg 65 2172 | 1926205.jpg 65 2173 | 2220612.jpg 65 2174 | 1647313.jpg 65 2175 | 1826956.jpg 65 2176 | 1364919.jpg 65 2177 | 2158858.jpg 65 2178 | 1529275.jpg 65 2179 | 1943358.jpg 65 2180 | 1535181.jpg 65 2181 | 2056248.jpg 65 2182 | 1860580.jpg 65 2183 | 1714026.jpg 65 2184 | 2264828.jpg 65 2185 | 1351213.jpg 65 2186 | 2113949.jpg 65 2187 | 1458030.jpg 65 2188 | 1806208.jpg 65 2189 | 1510217.jpg 65 2190 | 2200427.jpg 65 2191 | 1086833.jpg 65 2192 | 2182128.jpg 65 2193 | 1999292.jpg 65 2194 | 1586117.jpg 65 2195 | 2152231.jpg 65 2196 | 2187801.jpg 65 2197 | 1545068.jpg 65 2198 | 1759298.jpg 65 2199 | 2134406.jpg 65 2200 | 1018933.jpg 65 2201 | 1520357.jpg 66 2202 | 2137129.jpg 66 2203 | 1549035.jpg 66 2204 | 1981537.jpg 66 2205 | 2044260.jpg 66 2206 | 2169104.jpg 66 2207 | 1922722.jpg 66 2208 | 1594758.jpg 66 2209 | 2195128.jpg 66 2210 | 1469699.jpg 66 2211 | 1801182.jpg 66 2212 | 1659671.jpg 66 2213 | 1841489.jpg 66 2214 | 2253635.jpg 66 2215 | 2114112.jpg 66 2216 | 2060949.jpg 66 2217 | 1420016.jpg 66 2218 | 1736148.jpg 66 2219 | 1788034.jpg 66 2220 | 2209714.jpg 66 2221 | 0939519.jpg 66 2222 | 1971194.jpg 66 2223 | 1330195.jpg 66 2224 | 1796759.jpg 66 2225 | 2156795.jpg 66 2226 | 1679201.jpg 66 2227 | 1398982.jpg 66 2228 | 1367016.jpg 66 2229 | 2259453.jpg 66 2230 | 2094472.jpg 66 2231 | 1912564.jpg 66 2232 | 1917223.jpg 66 2233 | 1687909.jpg 66 2234 | 1594735.jpg 67 2235 | 1103348.jpg 67 2236 | 0657816.jpg 67 2237 | 0881386.jpg 67 2238 | 0850831.jpg 67 2239 | 0574357.jpg 67 2240 | 1966263.jpg 67 2241 | 0911625.jpg 67 2242 | 1255661.jpg 67 2243 | 1347334.jpg 67 2244 | 2070644.jpg 67 2245 | 0573365.jpg 67 2246 | 0072617.jpg 67 2247 | 0275905.jpg 67 2248 | 0338316.jpg 67 2249 | 1226558.jpg 67 2250 | 1776806.jpg 67 2251 | 1170134.jpg 
67 2252 | 0091361.jpg 67 2253 | 0169635.jpg 67 2254 | 0400394.jpg 67 2255 | 0977345.jpg 67 2256 | 1511412.jpg 67 2257 | 1149051.jpg 67 2258 | 0885624.jpg 67 2259 | 0996456.jpg 67 2260 | 0334474.jpg 67 2261 | 0066281.jpg 67 2262 | 1407531.jpg 67 2263 | 0067037.jpg 67 2264 | 1253530.jpg 67 2265 | 0525387.jpg 67 2266 | 1338346.jpg 67 2267 | 1847170.jpg 68 2268 | 0785251.jpg 68 2269 | 1267455.jpg 68 2270 | 0625195.jpg 68 2271 | 2167865.jpg 68 2272 | 0993482.jpg 68 2273 | 1328376.jpg 68 2274 | 1537569.jpg 68 2275 | 0708639.jpg 68 2276 | 1788041.jpg 68 2277 | 1601042.jpg 68 2278 | 1531429.jpg 68 2279 | 1276718.jpg 68 2280 | 1546826.jpg 68 2281 | 0313723.jpg 68 2282 | 1474763.jpg 68 2283 | 1486386.jpg 68 2284 | 1418270.jpg 68 2285 | 1898331.jpg 68 2286 | 0440148.jpg 68 2287 | 1850439.jpg 68 2288 | 1552844.jpg 68 2289 | 1689074.jpg 68 2290 | 0975833.jpg 68 2291 | 0249029.jpg 68 2292 | 1736110.jpg 68 2293 | 2160086.jpg 68 2294 | 1385706.jpg 68 2295 | 1445486.jpg 68 2296 | 2021346.jpg 68 2297 | 0619179.jpg 68 2298 | 0695328.jpg 68 2299 | 1090346.jpg 68 2300 | 1172651.jpg 68 2301 | 0958304.jpg 69 2302 | 1746035.jpg 69 2303 | 1935311.jpg 69 2304 | 1236292.jpg 69 2305 | 0062125.jpg 69 2306 | 0752805.jpg 69 2307 | 1014303.jpg 69 2308 | 0701145.jpg 69 2309 | 1676591.jpg 69 2310 | 2195119.jpg 69 2311 | 1658584.jpg 69 2312 | 0907301.jpg 69 2313 | 1343003.jpg 69 2314 | 0109449.jpg 69 2315 | 0345784.jpg 69 2316 | 1519774.jpg 69 2317 | 0447766.jpg 69 2318 | 1482188.jpg 69 2319 | 0382232.jpg 69 2320 | 1106865.jpg 69 2321 | 0709959.jpg 69 2322 | 0699600.jpg 69 2323 | 1306976.jpg 69 2324 | 1062510.jpg 69 2325 | 1723821.jpg 69 2326 | 1156842.jpg 69 2327 | 0182730.jpg 69 2328 | 1739105.jpg 69 2329 | 0931966.jpg 69 2330 | 2253091.jpg 69 2331 | 0085153.jpg 69 2332 | 0950305.jpg 69 2333 | 0925498.jpg 69 2334 | 1376762.jpg 70 2335 | 0609174.jpg 70 2336 | 1482335.jpg 70 2337 | 1655564.jpg 70 2338 | 2080763.jpg 70 2339 | 0395611.jpg 70 2340 | 1137368.jpg 70 2341 | 2255569.jpg 70 2342 | 
1896163.jpg 70 2343 | 1455184.jpg 70 2344 | 1602275.jpg 70 2345 | 1053510.jpg 70 2346 | 1376737.jpg 70 2347 | 2173324.jpg 70 2348 | 1320584.jpg 70 2349 | 0779581.jpg 70 2350 | 1163997.jpg 70 2351 | 1763830.jpg 70 2352 | 1806212.jpg 70 2353 | 1895796.jpg 70 2354 | 1206417.jpg 70 2355 | 1465199.jpg 70 2356 | 1521727.jpg 70 2357 | 2079015.jpg 70 2358 | 1724870.jpg 70 2359 | 1900829.jpg 70 2360 | 0404487.jpg 70 2361 | 2031391.jpg 70 2362 | 1670312.jpg 70 2363 | 1206352.jpg 70 2364 | 2080691.jpg 70 2365 | 1030353.jpg 70 2366 | 1992923.jpg 70 2367 | 1739390.jpg 71 2368 | 1469089.jpg 71 2369 | 2005574.jpg 71 2370 | 1085153.jpg 71 2371 | 1752996.jpg 71 2372 | 1901689.jpg 71 2373 | 2024661.jpg 71 2374 | 1799133.jpg 71 2375 | 1454285.jpg 71 2376 | 0607514.jpg 71 2377 | 1499562.jpg 71 2378 | 1240217.jpg 71 2379 | 1940260.jpg 71 2380 | 1240216.jpg 71 2381 | 2105989.jpg 71 2382 | 0382062.jpg 71 2383 | 1357292.jpg 71 2384 | 1709263.jpg 71 2385 | 1317747.jpg 71 2386 | 1794569.jpg 71 2387 | 1997801.jpg 71 2388 | 0368356.jpg 71 2389 | 1499880.jpg 71 2390 | 1265382.jpg 71 2391 | 1716063.jpg 71 2392 | 1757760.jpg 71 2393 | 2189411.jpg 71 2394 | 1267012.jpg 71 2395 | 2186726.jpg 71 2396 | 1698551.jpg 71 2397 | 2158616.jpg 71 2398 | 1720365.jpg 71 2399 | 2210227.jpg 71 2400 | 1333639.jpg 71 2401 | 1614884.jpg 72 2402 | 1736085.jpg 72 2403 | 2118960.jpg 72 2404 | 0948007.jpg 72 2405 | 1362024.jpg 72 2406 | 1601600.jpg 72 2407 | 0737927.jpg 72 2408 | 0729054.jpg 72 2409 | 1375880.jpg 72 2410 | 1912107.jpg 72 2411 | 1499818.jpg 72 2412 | 1302868.jpg 72 2413 | 1611465.jpg 72 2414 | 1596547.jpg 72 2415 | 1859060.jpg 72 2416 | 0725553.jpg 72 2417 | 0895308.jpg 72 2418 | 1367983.jpg 72 2419 | 1089047.jpg 72 2420 | 1255006.jpg 72 2421 | 0920469.jpg 72 2422 | 2114078.jpg 72 2423 | 1764575.jpg 72 2424 | 0747565.jpg 72 2425 | 1299189.jpg 72 2426 | 1764574.jpg 72 2427 | 1120190.jpg 72 2428 | 2118961.jpg 72 2429 | 1945975.jpg 72 2430 | 1253784.jpg 72 2431 | 1321941.jpg 72 2432 | 1372357.jpg 72 2433 
| 1946049.jpg 72 2434 | 1810600.jpg 73 2435 | 0141433.jpg 73 2436 | 2118959.jpg 73 2437 | 1596394.jpg 73 2438 | 0780680.jpg 73 2439 | 1120498.jpg 73 2440 | 1768791.jpg 73 2441 | 1376226.jpg 73 2442 | 1371565.jpg 73 2443 | 1133481.jpg 73 2444 | 2118958.jpg 73 2445 | 1804604.jpg 73 2446 | 1722177.jpg 73 2447 | 1931272.jpg 73 2448 | 1243068.jpg 73 2449 | 1415519.jpg 73 2450 | 1357182.jpg 73 2451 | 1818391.jpg 73 2452 | 0286125.jpg 73 2453 | 2038744.jpg 73 2454 | 1817155.jpg 73 2455 | 1592595.jpg 73 2456 | 0874976.jpg 73 2457 | 1764706.jpg 73 2458 | 1600188.jpg 73 2459 | 1125696.jpg 73 2460 | 1564413.jpg 73 2461 | 1369990.jpg 73 2462 | 1373011.jpg 73 2463 | 0218834.jpg 73 2464 | 0890438.jpg 73 2465 | 1120136.jpg 73 2466 | 0440929.jpg 73 2467 | 1320915.jpg 74 2468 | 0757651.jpg 74 2469 | 1616391.jpg 74 2470 | 1258577.jpg 74 2471 | 1973254.jpg 74 2472 | 1382002.jpg 74 2473 | 0973382.jpg 74 2474 | 1889569.jpg 74 2475 | 1158575.jpg 74 2476 | 1236365.jpg 74 2477 | 1046908.jpg 74 2478 | 1743336.jpg 74 2479 | 1201875.jpg 74 2480 | 1283645.jpg 74 2481 | 1095664.jpg 74 2482 | 1418775.jpg 74 2483 | 1938067.jpg 74 2484 | 2035396.jpg 74 2485 | 1223416.jpg 74 2486 | 2104253.jpg 74 2487 | 1355813.jpg 74 2488 | 2223316.jpg 74 2489 | 1425610.jpg 74 2490 | 1714324.jpg 74 2491 | 1364327.jpg 74 2492 | 0741747.jpg 74 2493 | 0841849.jpg 74 2494 | 0945365.jpg 74 2495 | 1942913.jpg 74 2496 | 1236413.jpg 74 2497 | 2178627.jpg 74 2498 | 0927431.jpg 74 2499 | 0440127.jpg 74 2500 | 2137097.jpg 74 2501 | 1344355.jpg 75 2502 | 1533873.jpg 75 2503 | 2227042.jpg 75 2504 | 0759487.jpg 75 2505 | 2123950.jpg 75 2506 | 0944920.jpg 75 2507 | 1601176.jpg 75 2508 | 0624127.jpg 75 2509 | 0553608.jpg 75 2510 | 1258704.jpg 75 2511 | 1390758.jpg 75 2512 | 1838164.jpg 75 2513 | 1260868.jpg 75 2514 | 1498794.jpg 75 2515 | 1251098.jpg 75 2516 | 1918575.jpg 75 2517 | 1386248.jpg 75 2518 | 1758584.jpg 75 2519 | 1870831.jpg 75 2520 | 1493116.jpg 75 2521 | 1723914.jpg 75 2522 | 1126453.jpg 75 2523 | 1391897.jpg 75 
2524 | 1026502.jpg 75 2525 | 0822350.jpg 75 2526 | 1935338.jpg 75 2527 | 1539226.jpg 75 2528 | 0547319.jpg 75 2529 | 1231579.jpg 75 2530 | 1353412.jpg 75 2531 | 0489925.jpg 75 2532 | 1430016.jpg 75 2533 | 1206825.jpg 75 2534 | 2266079.jpg 76 2535 | 1541506.jpg 76 2536 | 0706174.jpg 76 2537 | 0065294.jpg 76 2538 | 1455403.jpg 76 2539 | 1358045.jpg 76 2540 | 0980397.jpg 76 2541 | 1296872.jpg 76 2542 | 0123337.jpg 76 2543 | 1448036.jpg 76 2544 | 1779674.jpg 76 2545 | 0184352.jpg 76 2546 | 1564786.jpg 76 2547 | 0337125.jpg 76 2548 | 1348803.jpg 76 2549 | 0995426.jpg 76 2550 | 1236355.jpg 76 2551 | 0573392.jpg 76 2552 | 2031381.jpg 76 2553 | 2166443.jpg 76 2554 | 1449580.jpg 76 2555 | 0134602.jpg 76 2556 | 0944174.jpg 76 2557 | 1559632.jpg 76 2558 | 2007888.jpg 76 2559 | 1288743.jpg 76 2560 | 1457085.jpg 76 2561 | 1294505.jpg 76 2562 | 1355825.jpg 76 2563 | 0879141.jpg 76 2564 | 1115393.jpg 76 2565 | 1208469.jpg 76 2566 | 0688081.jpg 76 2567 | 2259848.jpg 77 2568 | 1730963.jpg 77 2569 | 0458625.jpg 77 2570 | 1578336.jpg 77 2571 | 0487379.jpg 77 2572 | 1407349.jpg 77 2573 | 0952377.jpg 77 2574 | 0744239.jpg 77 2575 | 0497539.jpg 77 2576 | 0384109.jpg 77 2577 | 1539458.jpg 77 2578 | 1891657.jpg 77 2579 | 0923501.jpg 77 2580 | 0561949.jpg 77 2581 | 1036873.jpg 77 2582 | 0115064.jpg 77 2583 | 0067031.jpg 77 2584 | 1115299.jpg 77 2585 | 0225206.jpg 77 2586 | 0550328.jpg 77 2587 | 1312101.jpg 77 2588 | 0143349.jpg 77 2589 | 0114414.jpg 77 2590 | 0901515.jpg 77 2591 | 2097522.jpg 77 2592 | 0576234.jpg 77 2593 | 0945369.jpg 77 2594 | 1117094.jpg 77 2595 | 1204728.jpg 77 2596 | 2175631.jpg 77 2597 | 0856823.jpg 77 2598 | 0892223.jpg 77 2599 | 1127319.jpg 77 2600 | 1206350.jpg 77 2601 | 2243379.jpg 78 2602 | 0428259.jpg 78 2603 | 1313340.jpg 78 2604 | 1007850.jpg 78 2605 | 1825883.jpg 78 2606 | 0064919.jpg 78 2607 | 1560605.jpg 78 2608 | 1385467.jpg 78 2609 | 1000027.jpg 78 2610 | 1314535.jpg 78 2611 | 0761965.jpg 78 2612 | 1997674.jpg 78 2613 | 1103350.jpg 78 2614 | 1107750.jpg 
78 2615 | 1261752.jpg 78 2616 | 0955770.jpg 78 2617 | 0848172.jpg 78 2618 | 0078141.jpg 78 2619 | 1409946.jpg 78 2620 | 0445578.jpg 78 2621 | 0792175.jpg 78 2622 | 1640693.jpg 78 2623 | 1092252.jpg 78 2624 | 0443952.jpg 78 2625 | 1503577.jpg 78 2626 | 1129248.jpg 78 2627 | 0870094.jpg 78 2628 | 0901157.jpg 78 2629 | 0325518.jpg 78 2630 | 0070035.jpg 78 2631 | 2078845.jpg 78 2632 | 0973158.jpg 78 2633 | 1163454.jpg 78 2634 | 1824491.jpg 79 2635 | 0497788.jpg 79 2636 | 1218932.jpg 79 2637 | 2003239.jpg 79 2638 | 1811490.jpg 79 2639 | 1619900.jpg 79 2640 | 0984408.jpg 79 2641 | 1898513.jpg 79 2642 | 1524818.jpg 79 2643 | 0457845.jpg 79 2644 | 2127891.jpg 79 2645 | 2006677.jpg 79 2646 | 2088437.jpg 79 2647 | 1158131.jpg 79 2648 | 1291218.jpg 79 2649 | 1930872.jpg 79 2650 | 0923416.jpg 79 2651 | 1037466.jpg 79 2652 | 2057960.jpg 79 2653 | 1823327.jpg 79 2654 | 1922919.jpg 79 2655 | 1883938.jpg 79 2656 | 0708626.jpg 79 2657 | 1413237.jpg 79 2658 | 2091951.jpg 79 2659 | 1210469.jpg 79 2660 | 0880314.jpg 79 2661 | 1973455.jpg 79 2662 | 2054780.jpg 79 2663 | 1624196.jpg 79 2664 | 1083206.jpg 79 2665 | 1812931.jpg 79 2666 | 1200888.jpg 79 2667 | 1685202.jpg 80 2668 | 1645736.jpg 80 2669 | 0918179.jpg 80 2670 | 1125617.jpg 80 2671 | 1081565.jpg 80 2672 | 1698699.jpg 80 2673 | 0492490.jpg 80 2674 | 1354235.jpg 80 2675 | 1903311.jpg 80 2676 | 1425231.jpg 80 2677 | 1500676.jpg 80 2678 | 1565325.jpg 80 2679 | 0958065.jpg 80 2680 | 1622762.jpg 80 2681 | 2180164.jpg 80 2682 | 1173014.jpg 80 2683 | 0606801.jpg 80 2684 | 1985180.jpg 80 2685 | 2106207.jpg 80 2686 | 1066524.jpg 80 2687 | 0764239.jpg 80 2688 | 2023598.jpg 80 2689 | 1362533.jpg 80 2690 | 1455400.jpg 80 2691 | 1088415.jpg 80 2692 | 0543870.jpg 80 2693 | 1686317.jpg 80 2694 | 0890827.jpg 80 2695 | 1385093.jpg 80 2696 | 2160088.jpg 80 2697 | 2099873.jpg 80 2698 | 1425707.jpg 80 2699 | 1614686.jpg 80 2700 | 0795231.jpg 80 2701 | 2211444.jpg 81 2702 | 1400629.jpg 81 2703 | 0965713.jpg 81 2704 | 1300002.jpg 81 2705 | 
1636521.jpg 81 2706 | 1806210.jpg 81 2707 | 0422472.jpg 81 2708 | 2223311.jpg 81 2709 | 1663379.jpg 81 2710 | 1619270.jpg 81 2711 | 1272698.jpg 81 2712 | 1771969.jpg 81 2713 | 1950559.jpg 81 2714 | 0093510.jpg 81 2715 | 0071754.jpg 81 2716 | 1471641.jpg 81 2717 | 1047407.jpg 81 2718 | 1658586.jpg 81 2719 | 1077924.jpg 81 2720 | 2139678.jpg 81 2721 | 1921656.jpg 81 2722 | 0497629.jpg 81 2723 | 1244306.jpg 81 2724 | 1546282.jpg 81 2725 | 1244687.jpg 81 2726 | 1091663.jpg 81 2727 | 0759316.jpg 81 2728 | 0498138.jpg 81 2729 | 0447762.jpg 81 2730 | 0396507.jpg 81 2731 | 2030079.jpg 81 2732 | 1063148.jpg 81 2733 | 0492487.jpg 81 2734 | 1244766.jpg 82 2735 | 1376413.jpg 82 2736 | 1969071.jpg 82 2737 | 1745999.jpg 82 2738 | 0680033.jpg 82 2739 | 1711730.jpg 82 2740 | 1560108.jpg 82 2741 | 2160566.jpg 82 2742 | 1779625.jpg 82 2743 | 1912356.jpg 82 2744 | 2169747.jpg 82 2745 | 1754180.jpg 82 2746 | 1105223.jpg 82 2747 | 2228412.jpg 82 2748 | 2065885.jpg 82 2749 | 1567619.jpg 82 2750 | 1203397.jpg 82 2751 | 1088411.jpg 82 2752 | 1203398.jpg 82 2753 | 1736173.jpg 82 2754 | 1258663.jpg 82 2755 | 2178477.jpg 82 2756 | 2163696.jpg 82 2757 | 1258557.jpg 82 2758 | 0681463.jpg 82 2759 | 1244757.jpg 82 2760 | 1280976.jpg 82 2761 | 2171897.jpg 82 2762 | 1570911.jpg 82 2763 | 0903057.jpg 82 2764 | 1780364.jpg 82 2765 | 1634646.jpg 82 2766 | 1759330.jpg 82 2767 | 0065838.jpg 83 2768 | 0523183.jpg 83 2769 | 0950056.jpg 83 2770 | 0523246.jpg 83 2771 | 0522938.jpg 83 2772 | 1869891.jpg 83 2773 | 0127650.jpg 83 2774 | 1654785.jpg 83 2775 | 1240215.jpg 83 2776 | 0809559.jpg 83 2777 | 1446340.jpg 83 2778 | 0195016.jpg 83 2779 | 0523110.jpg 83 2780 | 0577856.jpg 83 2781 | 1623084.jpg 83 2782 | 0973255.jpg 83 2783 | 1476163.jpg 83 2784 | 0902147.jpg 83 2785 | 0756076.jpg 83 2786 | 0523228.jpg 83 2787 | 0447776.jpg 83 2788 | 0869721.jpg 83 2789 | 0979926.jpg 83 2790 | 1869890.jpg 83 2791 | 1283846.jpg 83 2792 | 0812685.jpg 83 2793 | 1594759.jpg 83 2794 | 2117431.jpg 83 2795 | 1725715.jpg 83 2796 
| 1459176.jpg 83 2797 | 0913985.jpg 83 2798 | 0523093.jpg 83 2799 | 0993066.jpg 83 2800 | 1398527.jpg 83 2801 | 1164431.jpg 84 2802 | 0677585.jpg 84 2803 | 0482820.jpg 84 2804 | 0061401.jpg 84 2805 | 0909293.jpg 84 2806 | 0923566.jpg 84 2807 | 0065387.jpg 84 2808 | 0851668.jpg 84 2809 | 0127508.jpg 84 2810 | 0851323.jpg 84 2811 | 0875300.jpg 84 2812 | 1879800.jpg 84 2813 | 0677582.jpg 84 2814 | 0814401.jpg 84 2815 | 1880976.jpg 84 2816 | 0396497.jpg 84 2817 | 0103331.jpg 84 2818 | 0808778.jpg 84 2819 | 0356128.jpg 84 2820 | 1345582.jpg 84 2821 | 0174920.jpg 84 2822 | 0114397.jpg 84 2823 | 1530572.jpg 84 2824 | 1084453.jpg 84 2825 | 0149342.jpg 84 2826 | 1192640.jpg 84 2827 | 0875308.jpg 84 2828 | 0220011.jpg 84 2829 | 0807408.jpg 84 2830 | 1707732.jpg 84 2831 | 0089346.jpg 84 2832 | 2084966.jpg 84 2833 | 0350210.jpg 84 2834 | 2241569.jpg 85 2835 | 0980197.jpg 85 2836 | 0812098.jpg 85 2837 | 0813852.jpg 85 2838 | 0950331.jpg 85 2839 | 0467409.jpg 85 2840 | 2107060.jpg 85 2841 | 2009833.jpg 85 2842 | 0668320.jpg 85 2843 | 1083205.jpg 85 2844 | 1226609.jpg 85 2845 | 0871198.jpg 85 2846 | 0996676.jpg 85 2847 | 0487353.jpg 85 2848 | 1647701.jpg 85 2849 | 0457822.jpg 85 2850 | 0944945.jpg 85 2851 | 2221709.jpg 85 2852 | 0275804.jpg 85 2853 | 0130594.jpg 85 2854 | 0894211.jpg 85 2855 | 1048572.jpg 85 2856 | 2211315.jpg 85 2857 | 1338572.jpg 85 2858 | 0395603.jpg 85 2859 | 0950117.jpg 85 2860 | 1227422.jpg 85 2861 | 1053311.jpg 85 2862 | 1122540.jpg 85 2863 | 1031465.jpg 85 2864 | 0276024.jpg 85 2865 | 0901378.jpg 85 2866 | 0066530.jpg 85 2867 | 2013771.jpg 86 2868 | 0338178.jpg 86 2869 | 0929790.jpg 86 2870 | 1355228.jpg 86 2871 | 1469839.jpg 86 2872 | 1310046.jpg 86 2873 | 1373071.jpg 86 2874 | 1221605.jpg 86 2875 | 1149056.jpg 86 2876 | 0255360.jpg 86 2877 | 0921024.jpg 86 2878 | 0176138.jpg 86 2879 | 0955444.jpg 86 2880 | 0171387.jpg 86 2881 | 0447810.jpg 86 2882 | 0233762.jpg 86 2883 | 1996875.jpg 86 2884 | 2248944.jpg 86 2885 | 2116518.jpg 86 2886 | 0222915.jpg 86 
2887 | 0167802.jpg 86 2888 | 0151656.jpg 86 2889 | 1251237.jpg 86 2890 | 0950436.jpg 86 2891 | 1113446.jpg 86 2892 | 1467698.jpg 86 2893 | 0979472.jpg 86 2894 | 0132979.jpg 86 2895 | 0328522.jpg 86 2896 | 1705400.jpg 86 2897 | 0821363.jpg 86 2898 | 1291370.jpg 86 2899 | 1386647.jpg 86 2900 | 0866586.jpg 86 2901 | 2194203.jpg 87 2902 | 1589252.jpg 87 2903 | 0255295.jpg 87 2904 | 0742025.jpg 87 2905 | 2021840.jpg 87 2906 | 0977340.jpg 87 2907 | 1783575.jpg 87 2908 | 0756253.jpg 87 2909 | 1026719.jpg 87 2910 | 1197109.jpg 87 2911 | 0078427.jpg 87 2912 | 1426711.jpg 87 2913 | 1194590.jpg 87 2914 | 0984407.jpg 87 2915 | 0880568.jpg 87 2916 | 1099443.jpg 87 2917 | 1551551.jpg 87 2918 | 1283328.jpg 87 2919 | 0487359.jpg 87 2920 | 0874951.jpg 87 2921 | 1519495.jpg 87 2922 | 0337279.jpg 87 2923 | 1063410.jpg 87 2924 | 2115316.jpg 87 2925 | 0817414.jpg 87 2926 | 0136383.jpg 87 2927 | 1292318.jpg 87 2928 | 0174941.jpg 87 2929 | 0866092.jpg 87 2930 | 0435307.jpg 87 2931 | 0114427.jpg 87 2932 | 0716452.jpg 87 2933 | 0197002.jpg 87 2934 | 1543498.jpg 88 2935 | 0275792.jpg 88 2936 | 0227589.jpg 88 2937 | 0175704.jpg 88 2938 | 0184347.jpg 88 2939 | 0092378.jpg 88 2940 | 0979579.jpg 88 2941 | 1073029.jpg 88 2942 | 1256035.jpg 88 2943 | 1670091.jpg 88 2944 | 1133482.jpg 88 2945 | 0765747.jpg 88 2946 | 1745932.jpg 88 2947 | 1429165.jpg 88 2948 | 1218930.jpg 88 2949 | 0259164.jpg 88 2950 | 0174041.jpg 88 2951 | 0699510.jpg 88 2952 | 1284591.jpg 88 2953 | 0257374.jpg 88 2954 | 0363591.jpg 88 2955 | 2210238.jpg 88 2956 | 0524796.jpg 88 2957 | 1154577.jpg 88 2958 | 1302093.jpg 88 2959 | 1091650.jpg 88 2960 | 1602022.jpg 88 2961 | 0733747.jpg 88 2962 | 1950564.jpg 88 2963 | 1917856.jpg 88 2964 | 1283508.jpg 88 2965 | 1315760.jpg 88 2966 | 2237178.jpg 88 2967 | 0738976.jpg 89 2968 | 0573469.jpg 89 2969 | 1090725.jpg 89 2970 | 0873946.jpg 89 2971 | 1634013.jpg 89 2972 | 1769898.jpg 89 2973 | 1135124.jpg 89 2974 | 0573366.jpg 89 2975 | 0312409.jpg 89 2976 | 0312438.jpg 89 2977 | 0262656.jpg 
89 2978 | 0447752.jpg 89 2979 | 1059772.jpg 89 2980 | 0979622.jpg 89 2981 | 0824980.jpg 89 2982 | 1225993.jpg 89 2983 | 1136155.jpg 89 2984 | 1620007.jpg 89 2985 | 0334277.jpg 89 2986 | 2188372.jpg 89 2987 | 1155023.jpg 89 2988 | 0068810.jpg 89 2989 | 0523019.jpg 89 2990 | 2116114.jpg 89 2991 | 1575768.jpg 89 2992 | 1146074.jpg 89 2993 | 1546042.jpg 89 2994 | 0074462.jpg 89 2995 | 0359779.jpg 89 2996 | 0347785.jpg 89 2997 | 0921760.jpg 89 2998 | 0640534.jpg 89 2999 | 0573942.jpg 89 3000 | 1790920.jpg 89 3001 | 2186731.jpg 90 3002 | 1265817.jpg 90 3003 | 2072258.jpg 90 3004 | 1420123.jpg 90 3005 | 1058210.jpg 90 3006 | 1158360.jpg 90 3007 | 1103635.jpg 90 3008 | 1316268.jpg 90 3009 | 1501746.jpg 90 3010 | 0953562.jpg 90 3011 | 1425043.jpg 90 3012 | 1144972.jpg 90 3013 | 1797935.jpg 90 3014 | 1426173.jpg 90 3015 | 1571390.jpg 90 3016 | 1689606.jpg 90 3017 | 1366119.jpg 90 3018 | 0118439.jpg 90 3019 | 2091946.jpg 90 3020 | 1800296.jpg 90 3021 | 1237648.jpg 90 3022 | 1823272.jpg 90 3023 | 1620836.jpg 90 3024 | 2247510.jpg 90 3025 | 1009344.jpg 90 3026 | 1163471.jpg 90 3027 | 2137721.jpg 90 3028 | 1925943.jpg 90 3029 | 1584040.jpg 90 3030 | 1782799.jpg 90 3031 | 1193562.jpg 90 3032 | 1471860.jpg 90 3033 | 1356910.jpg 90 3034 | 1673633.jpg 91 3035 | 2262617.jpg 91 3036 | 1646253.jpg 91 3037 | 0996666.jpg 91 3038 | 0779048.jpg 91 3039 | 0879367.jpg 91 3040 | 1188309.jpg 91 3041 | 1928766.jpg 91 3042 | 1254806.jpg 91 3043 | 0100263.jpg 91 3044 | 1231580.jpg 91 3045 | 1534326.jpg 91 3046 | 1490907.jpg 91 3047 | 0885590.jpg 91 3048 | 2153672.jpg 91 3049 | 0192630.jpg 91 3050 | 1374624.jpg 91 3051 | 1398419.jpg 91 3052 | 1465113.jpg 91 3053 | 1667002.jpg 91 3054 | 1739709.jpg 91 3055 | 1066879.jpg 91 3056 | 1216713.jpg 91 3057 | 1090095.jpg 91 3058 | 1451197.jpg 91 3059 | 1307739.jpg 91 3060 | 1282206.jpg 91 3061 | 1581472.jpg 91 3062 | 1010412.jpg 91 3063 | 0509495.jpg 91 3064 | 1646930.jpg 91 3065 | 1767221.jpg 91 3066 | 1292176.jpg 91 3067 | 2233941.jpg 92 3068 | 
1560399.jpg 92 3069 | 1658296.jpg 92 3070 | 2037389.jpg 92 3071 | 2144149.jpg 92 3072 | 1605922.jpg 92 3073 | 2148743.jpg 92 3074 | 2123466.jpg 92 3075 | 1771266.jpg 92 3076 | 1323647.jpg 92 3077 | 1627409.jpg 92 3078 | 1538918.jpg 92 3079 | 1927265.jpg 92 3080 | 1673599.jpg 92 3081 | 1625091.jpg 92 3082 | 2265511.jpg 92 3083 | 2018707.jpg 92 3084 | 1636570.jpg 92 3085 | 1849793.jpg 92 3086 | 1168408.jpg 92 3087 | 1598048.jpg 92 3088 | 2072464.jpg 92 3089 | 2065042.jpg 92 3090 | 1853950.jpg 92 3091 | 1979665.jpg 92 3092 | 2177217.jpg 92 3093 | 1476645.jpg 92 3094 | 2006049.jpg 92 3095 | 1465151.jpg 92 3096 | 1658839.jpg 92 3097 | 2151715.jpg 92 3098 | 1713365.jpg 92 3099 | 1468767.jpg 92 3100 | 2143595.jpg 92 3101 | 2094478.jpg 93 3102 | 0492491.jpg 93 3103 | 1748171.jpg 93 3104 | 1538520.jpg 93 3105 | 0127634.jpg 93 3106 | 1476428.jpg 93 3107 | 1191125.jpg 93 3108 | 2127890.jpg 93 3109 | 2136389.jpg 93 3110 | 0749032.jpg 93 3111 | 0936826.jpg 93 3112 | 1338706.jpg 93 3113 | 1122577.jpg 93 3114 | 1383369.jpg 93 3115 | 1187417.jpg 93 3116 | 0343276.jpg 93 3117 | 1818869.jpg 93 3118 | 0967849.jpg 93 3119 | 1271556.jpg 93 3120 | 2057425.jpg 93 3121 | 0788134.jpg 93 3122 | 0936113.jpg 93 3123 | 2218842.jpg 93 3124 | 1597800.jpg 93 3125 | 0939487.jpg 93 3126 | 0920052.jpg 93 3127 | 1386609.jpg 93 3128 | 0257281.jpg 93 3129 | 1912967.jpg 93 3130 | 1818870.jpg 93 3131 | 1162919.jpg 93 3132 | 1290986.jpg 93 3133 | 1219250.jpg 93 3134 | 1149058.jpg 94 3135 | 1807121.jpg 94 3136 | 1343264.jpg 94 3137 | 0174870.jpg 94 3138 | 1092262.jpg 94 3139 | 0095325.jpg 94 3140 | 2119896.jpg 94 3141 | 1425386.jpg 94 3142 | 0184349.jpg 94 3143 | 0345877.jpg 94 3144 | 0174021.jpg 94 3145 | 0202465.jpg 94 3146 | 0447764.jpg 94 3147 | 0488490.jpg 94 3148 | 0255288.jpg 94 3149 | 1540063.jpg 94 3150 | 0143371.jpg 94 3151 | 1762143.jpg 94 3152 | 1351194.jpg 94 3153 | 1571407.jpg 94 3154 | 1407502.jpg 94 3155 | 1008788.jpg 94 3156 | 2031661.jpg 94 3157 | 0960098.jpg 94 3158 | 1895028.jpg 94 3159 
| 1827565.jpg 94 3160 | 1334600.jpg 94 3161 | 0097765.jpg 94 3162 | 0977245.jpg 94 3163 | 0973105.jpg 94 3164 | 0842128.jpg 94 3165 | 0339425.jpg 94 3166 | 1136122.jpg 94 3167 | 1743122.jpg 95 3168 | 2211016.jpg 95 3169 | 0880060.jpg 95 3170 | 1578664.jpg 95 3171 | 2043684.jpg 95 3172 | 2156226.jpg 95 3173 | 1832165.jpg 95 3174 | 0881972.jpg 95 3175 | 1967993.jpg 95 3176 | 1180306.jpg 95 3177 | 1117178.jpg 95 3178 | 2131704.jpg 95 3179 | 1106369.jpg 95 3180 | 0877086.jpg 95 3181 | 1887787.jpg 95 3182 | 1135019.jpg 95 3183 | 1237253.jpg 95 3184 | 2151483.jpg 95 3185 | 1898052.jpg 95 3186 | 1485890.jpg 95 3187 | 2046587.jpg 95 3188 | 1859259.jpg 95 3189 | 1251385.jpg 95 3190 | 0881971.jpg 95 3191 | 1723907.jpg 95 3192 | 1167955.jpg 95 3193 | 2041565.jpg 95 3194 | 0912525.jpg 95 3195 | 0881964.jpg 95 3196 | 1244763.jpg 95 3197 | 1384345.jpg 95 3198 | 2106824.jpg 95 3199 | 0717614.jpg 95 3200 | 1859059.jpg 95 3201 | 1611099.jpg 96 3202 | 1280040.jpg 96 3203 | 1125997.jpg 96 3204 | 1927270.jpg 96 3205 | 1374697.jpg 96 3206 | 1115161.jpg 96 3207 | 1045405.jpg 96 3208 | 1961222.jpg 96 3209 | 2162821.jpg 96 3210 | 0691437.jpg 96 3211 | 1750220.jpg 96 3212 | 2118955.jpg 96 3213 | 1588762.jpg 96 3214 | 2118872.jpg 96 3215 | 2031890.jpg 96 3216 | 1240010.jpg 96 3217 | 1611095.jpg 96 3218 | 1940190.jpg 96 3219 | 1266018.jpg 96 3220 | 1376307.jpg 96 3221 | 1715547.jpg 96 3222 | 0877084.jpg 96 3223 | 0749198.jpg 96 3224 | 1614885.jpg 96 3225 | 1966899.jpg 96 3226 | 0749197.jpg 96 3227 | 0648036.jpg 96 3228 | 2118873.jpg 96 3229 | 1564418.jpg 96 3230 | 1979664.jpg 96 3231 | 1349345.jpg 96 3232 | 1060704.jpg 96 3233 | 2080681.jpg 96 3234 | 0829813.jpg 97 3235 | 0523278.jpg 97 3236 | 1590457.jpg 97 3237 | 0523139.jpg 97 3238 | 1140416.jpg 97 3239 | 1297059.jpg 97 3240 | 1083282.jpg 97 3241 | 0677638.jpg 97 3242 | 0974359.jpg 97 3243 | 0063293.jpg 97 3244 | 1061208.jpg 97 3245 | 1351441.jpg 97 3246 | 0923714.jpg 97 3247 | 0522967.jpg 97 3248 | 1691610.jpg 97 3249 | 0836257.jpg 97 
3250 | 0894205.jpg 97 3251 | 0958066.jpg 97 3252 | 0094396.jpg 97 3253 | 0719053.jpg 97 3254 | 0738982.jpg 97 3255 | 0522975.jpg 97 3256 | 0693457.jpg 97 3257 | 0324987.jpg 97 3258 | 0504826.jpg 97 3259 | 1594717.jpg 97 3260 | 0522969.jpg 97 3261 | 0659112.jpg 97 3262 | 1619881.jpg 97 3263 | 0066418.jpg 97 3264 | 0551251.jpg 97 3265 | 0195009.jpg 97 3266 | 0845036.jpg 97 3267 | 1940728.jpg 98 3268 | 0066405.jpg 98 3269 | 1412918.jpg 98 3270 | 0582362.jpg 98 3271 | 0808929.jpg 98 3272 | 0907404.jpg 98 3273 | 0523052.jpg 98 3274 | 0903662.jpg 98 3275 | 0064200.jpg 98 3276 | 0879893.jpg 98 3277 | 0836159.jpg 98 3278 | 0983420.jpg 98 3279 | 1204567.jpg 98 3280 | 0337964.jpg 98 3281 | 1298949.jpg 98 3282 | 0184897.jpg 98 3283 | 1177961.jpg 98 3284 | 0229256.jpg 98 3285 | 0523189.jpg 98 3286 | 1415738.jpg 98 3287 | 1388984.jpg 98 3288 | 1273293.jpg 98 3289 | 0391828.jpg 98 3290 | 1481131.jpg 98 3291 | 2007881.jpg 98 3292 | 0255359.jpg 98 3293 | 0384438.jpg 98 3294 | 0979615.jpg 98 3295 | 0984879.jpg 98 3296 | 0522973.jpg 98 3297 | 0066406.jpg 98 3298 | 0936117.jpg 98 3299 | 1185509.jpg 98 3300 | 1633282.jpg 98 3301 | 0800390.jpg 99 3302 | 1320565.jpg 99 3303 | 0923499.jpg 99 3304 | 1298910.jpg 99 3305 | 0523227.jpg 99 3306 | 0973160.jpg 99 3307 | 0104604.jpg 99 3308 | 0198448.jpg 99 3309 | 1616362.jpg 99 3310 | 1296580.jpg 99 3311 | 1051041.jpg 99 3312 | 0906979.jpg 99 3313 | 0457162.jpg 99 3314 | 2158992.jpg 99 3315 | 0198446.jpg 99 3316 | 0944176.jpg 99 3317 | 1176955.jpg 99 3318 | 0784557.jpg 99 3319 | 0197892.jpg 99 3320 | 1695906.jpg 99 3321 | 1544178.jpg 99 3322 | 0445606.jpg 99 3323 | 1885835.jpg 99 3324 | 0784479.jpg 99 3325 | 0725715.jpg 99 3326 | 1430022.jpg 99 3327 | 0995387.jpg 99 3328 | 0487393.jpg 99 3329 | 0383400.jpg 99 3330 | 1117062.jpg 99 3331 | 0329381.jpg 99 3332 | 0523192.jpg 99 3333 | 0810303.jpg 99 3334 | -------------------------------------------------------------------------------- /data/link_data.sh: 
-------------------------------------------------------------------------------- 1 | ln -s /home/zengh/Dataset/Fine-grained/fgvc-aircraft-2013b/data/ Aircaft 2 | ln -s /home/zengh/Dataset/Fine-grained/CUB_200_2011/ Bird 3 | ln -s /home/zengh/Dataset/Fine-grained/Car/ Car 4 | ln -s /home/zengh/Dataset/Fine-grained/dogs/ Dog -------------------------------------------------------------------------------- /dataset/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/wvinzh/WS_DAN_PyTorch/cbf7ff5fd3abe3154d4f0fc972eedb056a205550/dataset/__init__.py -------------------------------------------------------------------------------- /dataset/custom_dataset.py: -------------------------------------------------------------------------------- 1 | ############################################################ 2 | # File: custom_dataset.py # 3 | # Created: 2019-10-31 19:28:59 # 4 | # Author : wvinzh # 5 | # Email : wvinzh@qq.com # 6 | # ------------------------------------------ # 7 | # Description:custom_dataset.py # 8 | # Copyright@2019 wvinzh, HUST # 9 | ############################################################ 10 | 11 | from __future__ import print_function, division 12 | from PIL import Image 13 | from torch.utils.data import Dataset,DataLoader 14 | import torchvision 15 | import os 16 | import random 17 | 18 | class CustomDataset(Dataset): 19 | def __init__(self, txt_file, root_dir, transform=None, training=False): 20 | self.image_list = [] 21 | self.id_list = [] 22 | self.root_dir = root_dir 23 | self.transform = transform 24 | self.num_classes = 0 25 | self.training = training 26 | with open(txt_file, 'r') as f: 27 | line = f.readline() 28 | # self.datas = f.readlines() 29 | while line: 30 | img_name = line.split()[0] 31 | label = int(line.split()[1]) 32 | # label = int(label) 33 | self.image_list.append(img_name) 34 | self.id_list.append(label) 35 | line = f.readline() 36 | self.num_classes = 
max(self.id_list)+1 37 | 38 | def __len__(self): 39 | return len(self.id_list) 40 | 41 | def __getitem__(self, idx): 42 | img_name = self.image_list[idx] 43 | label = self.id_list[idx] 44 | img_name = os.path.join(self.root_dir,img_name) 45 | image = Image.open(img_name).convert('RGB') 46 | 47 | if self.transform: 48 | image = self.transform(image) 49 | return image,label 50 | 51 | 52 | def test_dataset(): 53 | root = '/home/zengh/Dataset/Fine-grained/CUB_200_2011/images' 54 | txt = '/home/zengh/Dataset/Fine-grained/CUB_200_2011/test_pytorch.txt' 55 | from torchvision import transforms 56 | rgb_mean = [0.5,0.5,0.5] 57 | rgb_std = [0.5,0.5,0.5] 58 | transform_val = transforms.Compose([ 59 | transforms.Resize((299,299)), 60 | transforms.ToTensor(), 61 | transforms.Normalize(rgb_mean, rgb_std), 62 | ]) 63 | carData = CustomDataset(txt,root,transform_val,True) 64 | print(carData.num_classes) 65 | dataloader = DataLoader(carData,batch_size=16,shuffle=True) 66 | for data in dataloader: 67 | images,labels = data 68 | # print(images.size(),labels.size(),labels) 69 | 70 | 71 | if __name__=='__main__': 72 | test_dataset() -------------------------------------------------------------------------------- /model/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/wvinzh/WS_DAN_PyTorch/cbf7ff5fd3abe3154d4f0fc972eedb056a205550/model/__init__.py -------------------------------------------------------------------------------- /model/bap.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn as nn 3 | ### Bilinear Attention Pooling 4 | class BAP(nn.Module): 5 | def __init__(self, **kwargs): 6 | super(BAP, self).__init__() 7 | def forward(self,feature_maps,attention_maps): 8 | feature_shape = feature_maps.size() ## 12*768*26*26* 9 | attention_shape = attention_maps.size() ## 12*num_parts*26*26 10 | # print(feature_shape,attention_shape) 11 | 
phi_I = torch.einsum('imjk,injk->imn', (attention_maps, feature_maps)) ## 12*32*768 12 | phi_I = torch.div(phi_I, float(attention_shape[2] * attention_shape[3])) 13 | phi_I = torch.mul(torch.sign(phi_I), torch.sqrt(torch.abs(phi_I) + 1e-12)) 14 | phi_I = phi_I.view(feature_shape[0],-1) 15 | raw_features = torch.nn.functional.normalize(phi_I, dim=-1) ##12*(32*768) 16 | pooling_features = raw_features*100 17 | # print(pooling_features.shape) 18 | return raw_features,pooling_features 19 | class ResizeCat(nn.Module): 20 | def __init__(self, **kwargs): 21 | super(ResizeCat, self).__init__() 22 | 23 | def forward(self,at1,at3,at5): 24 | N,C,H,W = at1.size() 25 | resized_at3 = nn.functional.interpolate(at3,(H,W)) 26 | resized_at5 = nn.functional.interpolate(at5,(H,W)) 27 | cat_at = torch.cat((at1,resized_at3,resized_at5),dim=1) 28 | return cat_at 29 | 30 | if __name__ == '__main__': 31 | # a = BAP() 32 | a = ResizeCat() 33 | a1 = torch.Tensor(4,3,14,14) 34 | a3 = torch.Tensor(4,5,12,12) 35 | a5 = torch.Tensor(4,9,9,9) 36 | ret = a(a1,a3,a5) 37 | print(ret.size()) -------------------------------------------------------------------------------- /model/inception.py: -------------------------------------------------------------------------------- 1 | from collections import namedtuple 2 | import torch 3 | import torch.nn as nn 4 | import torch.nn.functional as F 5 | try: 6 | from torch.hub import load_state_dict_from_url 7 | except ImportError: 8 | from torch.utils.model_zoo import load_url as load_state_dict_from_url 9 | 10 | __all__ = ['Inception3', 'inception_v3'] 11 | 12 | 13 | model_urls = { 14 | # Inception v3 ported from TensorFlow 15 | 'inception_v3_google': 'https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth', 16 | } 17 | 18 | _InceptionOutputs = namedtuple('InceptionOutputs', ['logits', 'aux_logits']) 19 | 20 | 21 | def inception_v3(pretrained=False, progress=True, **kwargs): 22 | r"""Inception v3 model architecture from 23 | `"Rethinking the 
Inception Architecture for Computer Vision" <http://arxiv.org/abs/1512.00567>`_. 24 | .. note:: 25 | **Important**: In contrast to the other models the inception_v3 expects tensors with a size of 26 | N x 3 x 299 x 299, so ensure your images are sized accordingly. 27 | Args: 28 | pretrained (bool): If True, returns a model pre-trained on ImageNet 29 | progress (bool): If True, displays a progress bar of the download to stderr 30 | aux_logits (bool): If True, add an auxiliary branch that can improve training. 31 | Default: *True* 32 | transform_input (bool): If True, preprocesses the input according to the method with which it 33 | was trained on ImageNet. Default: *False* 34 | """ 35 | kwargs['transform_input'] = True 36 | if pretrained: 37 | # if 'transform_input' not in kwargs: 38 | # kwargs['transform_input'] = True 39 | if 'aux_logits' in kwargs: 40 | original_aux_logits = kwargs['aux_logits'] 41 | kwargs['aux_logits'] = True 42 | else: 43 | original_aux_logits = True 44 | model = Inception3(**kwargs) 45 | state_dict = load_state_dict_from_url(model_urls['inception_v3_google'], 46 | progress=progress) 47 | model.load_state_dict(state_dict) 48 | if not original_aux_logits: 49 | model.aux_logits = False 50 | del model.AuxLogits 51 | return model 52 | 53 | return Inception3(**kwargs) 54 | 55 | 56 | class Inception3(nn.Module): 57 | 58 | def __init__(self, num_classes=1000, aux_logits=True, transform_input=False): 59 | super(Inception3, self).__init__() 60 | self.aux_logits = aux_logits 61 | self.transform_input = transform_input 62 | self.Conv2d_1a_3x3 = BasicConv2d(3, 32, kernel_size=3, stride=2) 63 | self.Conv2d_2a_3x3 = BasicConv2d(32, 32, kernel_size=3) 64 | self.Conv2d_2b_3x3 = BasicConv2d(32, 64, kernel_size=3, padding=1) 65 | self.Conv2d_3b_1x1 = BasicConv2d(64, 80, kernel_size=1) 66 | self.Conv2d_4a_3x3 = BasicConv2d(80, 192, kernel_size=3) 67 | self.Mixed_5b = InceptionA(192, pool_features=32) 68 | self.Mixed_5c = InceptionA(256, pool_features=64) 69 | self.Mixed_5d = InceptionA(288,
pool_features=64) 70 | self.Mixed_6a = InceptionB(288) 71 | self.Mixed_6b = InceptionC(768, channels_7x7=128) 72 | self.Mixed_6c = InceptionC(768, channels_7x7=160) 73 | self.Mixed_6d = InceptionC(768, channels_7x7=160) 74 | self.Mixed_6e = InceptionC(768, channels_7x7=192) 75 | if aux_logits: 76 | self.AuxLogits = InceptionAux(768, num_classes) 77 | self.Mixed_7a = InceptionD(768) 78 | self.Mixed_7b = InceptionE(1280) 79 | self.Mixed_7c = InceptionE(2048) 80 | self.fc = nn.Linear(2048, num_classes) 81 | 82 | for m in self.modules(): 83 | if isinstance(m, nn.Conv2d) or isinstance(m, nn.Linear): 84 | import scipy.stats as stats 85 | stddev = m.stddev if hasattr(m, 'stddev') else 0.1 86 | X = stats.truncnorm(-2, 2, scale=stddev) 87 | values = torch.as_tensor(X.rvs(m.weight.numel()), dtype=m.weight.dtype) 88 | values = values.view(m.weight.size()) 89 | with torch.no_grad(): 90 | m.weight.copy_(values) 91 | elif isinstance(m, nn.BatchNorm2d): 92 | nn.init.constant_(m.weight, 1) 93 | nn.init.constant_(m.bias, 0) 94 | 95 | def forward(self, x): 96 | if self.transform_input: 97 | x_ch0 = torch.unsqueeze(x[:, 0], 1) * (0.229 / 0.5) + (0.485 - 0.5) / 0.5 98 | x_ch1 = torch.unsqueeze(x[:, 1], 1) * (0.224 / 0.5) + (0.456 - 0.5) / 0.5 99 | x_ch2 = torch.unsqueeze(x[:, 2], 1) * (0.225 / 0.5) + (0.406 - 0.5) / 0.5 100 | x = torch.cat((x_ch0, x_ch1, x_ch2), 1) 101 | # N x 3 x 299 x 299 102 | x = self.Conv2d_1a_3x3(x) 103 | # N x 32 x 149 x 149 104 | x = self.Conv2d_2a_3x3(x) 105 | # N x 32 x 147 x 147 106 | x = self.Conv2d_2b_3x3(x) 107 | # N x 64 x 147 x 147 108 | x = F.max_pool2d(x, kernel_size=3, stride=2) 109 | # N x 64 x 73 x 73 110 | x = self.Conv2d_3b_1x1(x) 111 | # N x 80 x 73 x 73 112 | x = self.Conv2d_4a_3x3(x) 113 | # N x 192 x 71 x 71 114 | x = F.max_pool2d(x, kernel_size=3, stride=2) 115 | # N x 192 x 35 x 35 116 | x = self.Mixed_5b(x) 117 | # N x 256 x 35 x 35 118 | x = self.Mixed_5c(x) 119 | # N x 288 x 35 x 35 120 | x = self.Mixed_5d(x) 121 | # N x 288 x 35 x 35 
122 | x = self.Mixed_6a(x) 123 | # N x 768 x 17 x 17 124 | x = self.Mixed_6b(x) 125 | # N x 768 x 17 x 17 126 | x = self.Mixed_6c(x) 127 | # N x 768 x 17 x 17 128 | x = self.Mixed_6d(x) 129 | # N x 768 x 17 x 17 130 | x = self.Mixed_6e(x) 131 | # N x 768 x 17 x 17 132 | if self.training and self.aux_logits: 133 | aux = self.AuxLogits(x) 134 | # N x 768 x 17 x 17 135 | x = self.Mixed_7a(x) 136 | # N x 1280 x 8 x 8 137 | x = self.Mixed_7b(x) 138 | # N x 2048 x 8 x 8 139 | x = self.Mixed_7c(x) 140 | # N x 2048 x 8 x 8 141 | # Adaptive average pooling 142 | x = F.adaptive_avg_pool2d(x, (1, 1)) 143 | # N x 2048 x 1 x 1 144 | x = F.dropout(x, training=self.training) 145 | # N x 2048 x 1 x 1 146 | x = torch.flatten(x, 1) 147 | # N x 2048 148 | x = self.fc(x) 149 | # N x 1000 (num_classes) 150 | if self.training and self.aux_logits: 151 | return _InceptionOutputs(x, aux) 152 | return x 153 | 154 | 155 | class InceptionA(nn.Module): 156 | 157 | def __init__(self, in_channels, pool_features): 158 | super(InceptionA, self).__init__() 159 | self.branch1x1 = BasicConv2d(in_channels, 64, kernel_size=1) 160 | 161 | self.branch5x5_1 = BasicConv2d(in_channels, 48, kernel_size=1) 162 | self.branch5x5_2 = BasicConv2d(48, 64, kernel_size=5, padding=2) 163 | 164 | self.branch3x3dbl_1 = BasicConv2d(in_channels, 64, kernel_size=1) 165 | self.branch3x3dbl_2 = BasicConv2d(64, 96, kernel_size=3, padding=1) 166 | self.branch3x3dbl_3 = BasicConv2d(96, 96, kernel_size=3, padding=1) 167 | 168 | self.branch_pool = BasicConv2d(in_channels, pool_features, kernel_size=1) 169 | 170 | def forward(self, x): 171 | branch1x1 = self.branch1x1(x) 172 | 173 | branch5x5 = self.branch5x5_1(x) 174 | branch5x5 = self.branch5x5_2(branch5x5) 175 | 176 | branch3x3dbl = self.branch3x3dbl_1(x) 177 | branch3x3dbl = self.branch3x3dbl_2(branch3x3dbl) 178 | branch3x3dbl = self.branch3x3dbl_3(branch3x3dbl) 179 | 180 | branch_pool = F.avg_pool2d(x, kernel_size=3, stride=1, padding=1) 181 | branch_pool = 
self.branch_pool(branch_pool) 182 | 183 | outputs = [branch1x1, branch5x5, branch3x3dbl, branch_pool] 184 | return torch.cat(outputs, 1) 185 | 186 | 187 | class InceptionB(nn.Module): 188 | 189 | def __init__(self, in_channels): 190 | super(InceptionB, self).__init__() 191 | self.branch3x3 = BasicConv2d(in_channels, 384, kernel_size=3, stride=2) 192 | 193 | self.branch3x3dbl_1 = BasicConv2d(in_channels, 64, kernel_size=1) 194 | self.branch3x3dbl_2 = BasicConv2d(64, 96, kernel_size=3, padding=1) 195 | self.branch3x3dbl_3 = BasicConv2d(96, 96, kernel_size=3, stride=2) 196 | 197 | def forward(self, x): 198 | branch3x3 = self.branch3x3(x) 199 | 200 | branch3x3dbl = self.branch3x3dbl_1(x) 201 | branch3x3dbl = self.branch3x3dbl_2(branch3x3dbl) 202 | branch3x3dbl = self.branch3x3dbl_3(branch3x3dbl) 203 | 204 | branch_pool = F.max_pool2d(x, kernel_size=3, stride=2) 205 | 206 | outputs = [branch3x3, branch3x3dbl, branch_pool] 207 | return torch.cat(outputs, 1) 208 | 209 | 210 | class InceptionC(nn.Module): 211 | 212 | def __init__(self, in_channels, channels_7x7, attention=False): 213 | super(InceptionC, self).__init__() 214 | self.attention = attention 215 | self.branch1x1 = BasicConv2d(in_channels, 192, kernel_size=1) 216 | 217 | c7 = channels_7x7 218 | self.branch7x7_1 = BasicConv2d(in_channels, c7, kernel_size=1) 219 | self.branch7x7_2 = BasicConv2d(c7, c7, kernel_size=(1, 7), padding=(0, 3)) 220 | self.branch7x7_3 = BasicConv2d(c7, 192, kernel_size=(7, 1), padding=(3, 0)) 221 | 222 | self.branch7x7dbl_1 = BasicConv2d(in_channels, c7, kernel_size=1) 223 | self.branch7x7dbl_2 = BasicConv2d(c7, c7, kernel_size=(7, 1), padding=(3, 0)) 224 | self.branch7x7dbl_3 = BasicConv2d(c7, c7, kernel_size=(1, 7), padding=(0, 3)) 225 | self.branch7x7dbl_4 = BasicConv2d(c7, c7, kernel_size=(7, 1), padding=(3, 0)) 226 | self.branch7x7dbl_5 = BasicConv2d(c7, 192, kernel_size=(1, 7), padding=(0, 3)) 227 | 228 | self.branch_pool = BasicConv2d(in_channels, 192, kernel_size=1) 229 | 230 | def 
forward(self, x): 231 | branch1x1 = self.branch1x1(x) 232 | if self.attention: 233 | self.atm = branch1x1 234 | branch7x7 = self.branch7x7_1(x) 235 | branch7x7 = self.branch7x7_2(branch7x7) 236 | branch7x7 = self.branch7x7_3(branch7x7) 237 | 238 | branch7x7dbl = self.branch7x7dbl_1(x) 239 | branch7x7dbl = self.branch7x7dbl_2(branch7x7dbl) 240 | branch7x7dbl = self.branch7x7dbl_3(branch7x7dbl) 241 | branch7x7dbl = self.branch7x7dbl_4(branch7x7dbl) 242 | branch7x7dbl = self.branch7x7dbl_5(branch7x7dbl) 243 | 244 | branch_pool = F.avg_pool2d(x, kernel_size=3, stride=1, padding=1) 245 | branch_pool = self.branch_pool(branch_pool) 246 | 247 | outputs = [branch1x1, branch7x7, branch7x7dbl, branch_pool] 248 | return torch.cat(outputs, 1) 249 | 250 | 251 | class InceptionD(nn.Module): 252 | 253 | def __init__(self, in_channels): 254 | super(InceptionD, self).__init__() 255 | self.branch3x3_1 = BasicConv2d(in_channels, 192, kernel_size=1) 256 | self.branch3x3_2 = BasicConv2d(192, 320, kernel_size=3, stride=2) 257 | 258 | self.branch7x7x3_1 = BasicConv2d(in_channels, 192, kernel_size=1) 259 | self.branch7x7x3_2 = BasicConv2d(192, 192, kernel_size=(1, 7), padding=(0, 3)) 260 | self.branch7x7x3_3 = BasicConv2d(192, 192, kernel_size=(7, 1), padding=(3, 0)) 261 | self.branch7x7x3_4 = BasicConv2d(192, 192, kernel_size=3, stride=2) 262 | 263 | def forward(self, x): 264 | branch3x3 = self.branch3x3_1(x) 265 | branch3x3 = self.branch3x3_2(branch3x3) 266 | 267 | branch7x7x3 = self.branch7x7x3_1(x) 268 | branch7x7x3 = self.branch7x7x3_2(branch7x7x3) 269 | branch7x7x3 = self.branch7x7x3_3(branch7x7x3) 270 | branch7x7x3 = self.branch7x7x3_4(branch7x7x3) 271 | 272 | branch_pool = F.max_pool2d(x, kernel_size=3, stride=2) 273 | outputs = [branch3x3, branch7x7x3, branch_pool] 274 | return torch.cat(outputs, 1) 275 | 276 | 277 | class InceptionE(nn.Module): 278 | 279 | def __init__(self, in_channels): 280 | super(InceptionE, self).__init__() 281 | self.branch1x1 = BasicConv2d(in_channels, 
320, kernel_size=1) 282 | 283 | self.branch3x3_1 = BasicConv2d(in_channels, 384, kernel_size=1) 284 | self.branch3x3_2a = BasicConv2d(384, 384, kernel_size=(1, 3), padding=(0, 1)) 285 | self.branch3x3_2b = BasicConv2d(384, 384, kernel_size=(3, 1), padding=(1, 0)) 286 | 287 | self.branch3x3dbl_1 = BasicConv2d(in_channels, 448, kernel_size=1) 288 | self.branch3x3dbl_2 = BasicConv2d(448, 384, kernel_size=3, padding=1) 289 | self.branch3x3dbl_3a = BasicConv2d(384, 384, kernel_size=(1, 3), padding=(0, 1)) 290 | self.branch3x3dbl_3b = BasicConv2d(384, 384, kernel_size=(3, 1), padding=(1, 0)) 291 | 292 | self.branch_pool = BasicConv2d(in_channels, 192, kernel_size=1) 293 | 294 | def forward(self, x): 295 | branch1x1 = self.branch1x1(x) 296 | # self.attention_map = branch1x1 297 | branch3x3 = self.branch3x3_1(x) 298 | branch3x3 = [ 299 | self.branch3x3_2a(branch3x3), 300 | self.branch3x3_2b(branch3x3), 301 | ] 302 | branch3x3 = torch.cat(branch3x3, 1) 303 | 304 | branch3x3dbl = self.branch3x3dbl_1(x) 305 | branch3x3dbl = self.branch3x3dbl_2(branch3x3dbl) 306 | branch3x3dbl = [ 307 | self.branch3x3dbl_3a(branch3x3dbl), 308 | self.branch3x3dbl_3b(branch3x3dbl), 309 | ] 310 | branch3x3dbl = torch.cat(branch3x3dbl, 1) 311 | 312 | branch_pool = F.avg_pool2d(x, kernel_size=3, stride=1, padding=1) 313 | branch_pool = self.branch_pool(branch_pool) 314 | 315 | outputs = [branch1x1, branch3x3, branch3x3dbl, branch_pool] 316 | return torch.cat(outputs, 1) 317 | 318 | 319 | class InceptionAux(nn.Module): 320 | 321 | def __init__(self, in_channels, num_classes): 322 | super(InceptionAux, self).__init__() 323 | self.conv0 = BasicConv2d(in_channels, 128, kernel_size=1) 324 | self.conv1 = BasicConv2d(128, 768, kernel_size=5) 325 | self.conv1.stddev = 0.01 326 | self.fc = nn.Linear(768, num_classes) 327 | self.fc.stddev = 0.001 328 | 329 | def forward(self, x): 330 | # N x 768 x 17 x 17 331 | x = F.avg_pool2d(x, kernel_size=5, stride=3) 332 | # N x 768 x 5 x 5 333 | x = self.conv0(x) 334 | 
# N x 128 x 5 x 5 335 | x = self.conv1(x) 336 | # N x 768 x 1 x 1 337 | # Adaptive average pooling 338 | x = F.adaptive_avg_pool2d(x, (1, 1)) 339 | # N x 768 x 1 x 1 340 | x = torch.flatten(x, 1) 341 | # N x 768 342 | x = self.fc(x) 343 | # N x 1000 344 | return x 345 | 346 | 347 | class BasicConv2d(nn.Module): 348 | 349 | def __init__(self, in_channels, out_channels, **kwargs): 350 | super(BasicConv2d, self).__init__() 351 | self.conv = nn.Conv2d(in_channels, out_channels, bias=False, **kwargs) 352 | self.bn = nn.BatchNorm2d(out_channels, eps=0.001) 353 | 354 | def forward(self, x): 355 | x = self.conv(x) 356 | x = self.bn(x) 357 | return F.relu(x, inplace=True) 358 | 359 | #### 360 | # Bilinear Attention Pooling 361 | class BAP(nn.Module): 362 | def __init__(self, **kwargs): 363 | super(BAP, self).__init__() 364 | def forward(self,feature_maps,attention_maps): 365 | feature_shape = feature_maps.size() ## 12*768*26*26 366 | attention_shape = attention_maps.size() ## 12*32*26*26 367 | # print(feature_shape,attention_shape) 368 | phi_I = torch.einsum('imjk,injk->imn', (attention_maps, feature_maps)) ## 12*32*768 369 | phi_I = torch.div(phi_I, float(attention_shape[2] * attention_shape[3])) # average over spatial size H*W, as in model/bap.py 370 | phi_I = torch.mul(torch.sign(phi_I), torch.sqrt(torch.abs(phi_I) + 1e-12)) 371 | phi_I = phi_I.view(feature_shape[0],-1) 372 | raw_features = torch.nn.functional.normalize(phi_I, dim=-1) # L2-normalize the flattened feature vector 373 | pooling_features = (raw_features * 100.0).view(feature_shape[0],-1,1,1) 374 | return pooling_features 375 | 376 | if __name__=='__main__': 377 | bap = BAP() 378 | ftm = torch.randn(12,768,26,26) # feature maps 379 | atm = torch.randn(12,32,26,26) # attention maps 380 | rft = bap(ftm,atm) 381 | # criterion = torch.nn.CrossEntropyLoss() 382 | # print(bap.parameters()) 383 | 384 | print(rft.size()) 385 | # import torchvision 386 | # import torchvision.transforms as transforms 387 | # from PIL import Image 388 | # # input_transform = transforms([]) 389 | # img =
Image.open('/home/zengh/Dataset/oxy/oxySensitive/Sensitive_train_img/n01608432/n01608432_322.JPEG') 390 | # rgb_mean = [0.5,0.5,0.5] 391 | # rgb_std = [0.5,0.5,0.5] 392 | # # rgb_mean = [0.485, 0.456, 0.406] 393 | # # rgb_std = [0.229, 0.224, 0.225] 394 | # transform_val = transforms.Compose([ 395 | # transforms.Resize((299,299)), 396 | # # transforms.CenterCrop(args.crop_size), 397 | # transforms.ToTensor(), 398 | # transforms.Normalize(rgb_mean, rgb_std), 399 | # ]) 400 | # net = inception_v3(pretrained=True,aux_logits=False) 401 | # # net = torchvision.models.inception_v3(pretrained=True) 402 | 403 | # net.eval() 404 | # # print(net) 405 | # # input = torch.Tensor(4,3,448,448) 406 | # input = transform_val(img).unsqueeze(0) 407 | # output = net(input) 408 | # print('OK',output.size(),torch.argmax(output)) -------------------------------------------------------------------------------- /model/inception_bap.py: -------------------------------------------------------------------------------- 1 | ############################################################ 2 | # File: inception_bap.py # 3 | # Created: 2019-10-31 15:50:02 # 4 | # Author : wvinzh # 5 | # Email : wvinzh@qq.com # 6 | # ------------------------------------------ # 7 | # Description:inception_bap.py # 8 | # Copyright@2019 wvinzh, HUST # 9 | ############################################################ 10 | 11 | from collections import namedtuple 12 | import torch 13 | import torch.nn as nn 14 | import torch.nn.functional as F 15 | from utils.attention import attention_crop,attention_drop,attention_crop_drop 16 | from .bap import BAP 17 | try: 18 | from torch.hub import load_state_dict_from_url 19 | except ImportError: 20 | from torch.utils.model_zoo import load_url as load_state_dict_from_url 21 | 22 | __all__ = ['Inception3', 'inception_v3'] 23 | 24 | 25 | model_urls = { 26 | # Inception v3 ported from TensorFlow 27 | 'inception_v3_google': 
'https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth', 28 | } 29 | 30 | _InceptionOutputs = namedtuple('InceptionOutputs', ['logits', 'aux_logits']) 31 | 32 | 33 | def inception_v3(pretrained=False, progress=True, **kwargs): 34 | r"""Inception v3 model architecture from 35 | `"Rethinking the Inception Architecture for Computer Vision" <http://arxiv.org/abs/1512.00567>`_. 36 | .. note:: 37 | **Important**: In contrast to the other models the inception_v3 expects tensors with a size of 38 | N x 3 x 299 x 299, so ensure your images are sized accordingly. 39 | Args: 40 | pretrained (bool): If True, returns a model pre-trained on ImageNet 41 | progress (bool): If True, displays a progress bar of the download to stderr 42 | aux_logits (bool): If True, add an auxiliary branch that can improve training. 43 | Default: *True* 44 | transform_input (bool): If True, preprocesses the input according to the method with which it 45 | was trained on ImageNet. Default: *False* 46 | """ 47 | kwargs['transform_input'] = True 48 | if pretrained: 49 | # if 'transform_input' not in kwargs: 50 | # kwargs['transform_input'] = True 51 | if 'aux_logits' in kwargs: 52 | original_aux_logits = kwargs['aux_logits'] 53 | kwargs['aux_logits'] = True 54 | else: 55 | original_aux_logits = True 56 | model = Inception3(**kwargs) 57 | state_dict = load_state_dict_from_url(model_urls['inception_v3_google'], 58 | progress=progress) 59 | model.load_state_dict(state_dict) 60 | if not original_aux_logits: 61 | model.aux_logits = False 62 | del model.AuxLogits 63 | return model 64 | 65 | return Inception3(**kwargs) 66 | 67 | def inception_v3_bap(pretrained=False, progress=True, **kwargs): 68 | r"""Inception v3 model architecture from 69 | `"Rethinking the Inception Architecture for Computer Vision" <http://arxiv.org/abs/1512.00567>`_. 70 | .. note:: 71 | **Important**: In contrast to the other models the inception_v3 expects tensors with a size of 72 | N x 3 x 299 x 299, so ensure your images are sized accordingly.
73 | Args: 74 | pretrained (bool): If True, returns a model pre-trained on ImageNet 75 | progress (bool): If True, displays a progress bar of the download to stderr 76 | aux_logits (bool): If True, add an auxiliary branch that can improve training. 77 | Default: *True* 78 | transform_input (bool): If True, preprocesses the input according to the method with which it 79 | was trained on ImageNet. Default: *False* 80 | """ 81 | kwargs['transform_input'] = True 82 | if pretrained: 83 | # if 'transform_input' not in kwargs: 84 | # kwargs['transform_input'] = True 85 | if 'aux_logits' in kwargs: 86 | original_aux_logits = kwargs['aux_logits'] 87 | kwargs['aux_logits'] = True 88 | else: 89 | original_aux_logits = True 90 | model = Inception3(**kwargs) 91 | pretrained_dict = load_state_dict_from_url(model_urls['inception_v3_google'], 92 | progress=progress) 93 | model_dict = model.state_dict() 94 | state_dict = {k:v for k,v in pretrained_dict.items() if k in model_dict.keys()} 95 | # model.load_state_dict(state_dict) 96 | model_dict.update(state_dict) 97 | model.load_state_dict(model_dict) 98 | if not original_aux_logits: 99 | model.aux_logits = False 100 | del model.AuxLogits 101 | return model 102 | 103 | return Inception3(**kwargs) 104 | 105 | 106 | 107 | class Inception3(nn.Module): 108 | 109 | def __init__(self, num_classes=1000, aux_logits=True, transform_input=False, num_parts=32): 110 | super(Inception3, self).__init__() 111 | self.aux_logits = aux_logits 112 | self.transform_input = transform_input 113 | self.Conv2d_1a_3x3 = BasicConv2d(3, 32, kernel_size=3, stride=2) 114 | self.Conv2d_2a_3x3 = BasicConv2d(32, 32, kernel_size=3) 115 | self.Conv2d_2b_3x3 = BasicConv2d(32, 64, kernel_size=3, padding=1) 116 | self.Conv2d_3b_1x1 = BasicConv2d(64, 80, kernel_size=1) 117 | self.Conv2d_4a_3x3 = BasicConv2d(80, 192, kernel_size=3) 118 | self.Mixed_5b = InceptionA(192, pool_features=32) 119 | self.Mixed_5c = InceptionA(256, pool_features=64) 120 | self.Mixed_5d = 
InceptionA(288, pool_features=64) 121 | self.Mixed_6a = InceptionB(288) 122 | self.Mixed_6b = InceptionC(768, channels_7x7=128) 123 | self.Mixed_6c = InceptionC(768, channels_7x7=160) 124 | self.Mixed_6d = InceptionC(768, channels_7x7=160) 125 | self.Mixed_6e = InceptionC(768, channels_7x7=192) 126 | ### Mixed_6e output feature map 127 | if aux_logits: 128 | self.AuxLogits = InceptionAux(768, num_classes) 129 | self.Mixed_7a = InceptionD(768, attention=True,num_parts=num_parts) 130 | ### Mixed_7a output attention map 131 | self.bap = BAP() 132 | # self.Mixed_7b = InceptionE(1280) 133 | # self.Mixed_7c = InceptionE(2048) 134 | 135 | self.fc_new = nn.Linear(768*num_parts, num_classes) 136 | 137 | for m in self.modules(): 138 | if isinstance(m, nn.Conv2d) or isinstance(m, nn.Linear): 139 | import scipy.stats as stats 140 | stddev = m.stddev if hasattr(m, 'stddev') else 0.1 141 | X = stats.truncnorm(-2, 2, scale=stddev) 142 | values = torch.as_tensor(X.rvs(m.weight.numel()), dtype=m.weight.dtype) 143 | values = values.view(m.weight.size()) 144 | with torch.no_grad(): 145 | m.weight.copy_(values) 146 | elif isinstance(m, nn.BatchNorm2d): 147 | nn.init.constant_(m.weight, 1) 148 | nn.init.constant_(m.bias, 0) 149 | 150 | def forward(self, x): 151 | if self.transform_input: 152 | x_ch0 = torch.unsqueeze(x[:, 0], 1) * (0.229 / 0.5) + (0.485 - 0.5) / 0.5 153 | x_ch1 = torch.unsqueeze(x[:, 1], 1) * (0.224 / 0.5) + (0.456 - 0.5) / 0.5 154 | x_ch2 = torch.unsqueeze(x[:, 2], 1) * (0.225 / 0.5) + (0.406 - 0.5) / 0.5 155 | x = torch.cat((x_ch0, x_ch1, x_ch2), 1) 156 | # N x 3 x 299 x 299 157 | x = self.Conv2d_1a_3x3(x) 158 | # N x 32 x 149 x 149 159 | x = self.Conv2d_2a_3x3(x) 160 | # N x 32 x 147 x 147 161 | x = self.Conv2d_2b_3x3(x) 162 | # N x 64 x 147 x 147 163 | x = F.max_pool2d(x, kernel_size=3, stride=2) 164 | # N x 64 x 73 x 73 165 | x = self.Conv2d_3b_1x1(x) 166 | # N x 80 x 73 x 73 167 | x = self.Conv2d_4a_3x3(x) 168 | # N x 192 x 71 x 71 169 | x = F.max_pool2d(x, 
kernel_size=3, stride=2) 170 | # N x 192 x 35 x 35 171 | x = self.Mixed_5b(x) 172 | # N x 256 x 35 x 35 173 | x = self.Mixed_5c(x) 174 | # N x 288 x 35 x 35 175 | x = self.Mixed_5d(x) 176 | # N x 288 x 35 x 35 177 | x = self.Mixed_6a(x) 178 | # N x 768 x 17 x 17 179 | x = self.Mixed_6b(x) 180 | # N x 768 x 17 x 17 181 | x = self.Mixed_6c(x) 182 | # N x 768 x 17 x 17 183 | x = self.Mixed_6d(x) 184 | # N x 768 x 17 x 17 185 | ftm = self.Mixed_6e(x) #N x 768 x 17 x 17 186 | 187 | # N x 768 x 17 x 17 188 | # if self.training and self.aux_logits: 189 | # aux = self.AuxLogits(x) 190 | # N x 768 x 17 x 17 191 | x = self.Mixed_7a(ftm) 192 | 193 | ### get the attention out 194 | atm = self.Mixed_7a.atm ## N*num_parts*17*17 195 | 196 | raw_features, pooling_features = self.bap(ftm,atm) 197 | # pooling_features. 198 | # N x 1280 x 8 x 8 199 | # x = self.Mixed_7b(x) 200 | # N x 2048 x 8 x 8 201 | # x = self.Mixed_7c(x) 202 | # N x 2048 x 8 x 8 203 | # Adaptive average pooling 204 | # x = F.adaptive_avg_pool2d(x, (1, 1)) 205 | # N x 2048 x 1 x 1 206 | # x = F.dropout(x, training=self.training) 207 | # N x 2048 x 1 x 1 208 | x = torch.flatten(pooling_features, 1) 209 | 210 | # N x 2048 211 | x = self.fc_new(x) 212 | # N x 1000 (num_classes) 213 | # if self.training and self.aux_logits: 214 | # return _InceptionOutputs(x, aux) 215 | return atm,raw_features,x 216 | 217 | 218 | class InceptionA(nn.Module): 219 | 220 | def __init__(self, in_channels, pool_features): 221 | super(InceptionA, self).__init__() 222 | self.branch1x1 = BasicConv2d(in_channels, 64, kernel_size=1) 223 | 224 | self.branch5x5_1 = BasicConv2d(in_channels, 48, kernel_size=1) 225 | self.branch5x5_2 = BasicConv2d(48, 64, kernel_size=5, padding=2) 226 | 227 | self.branch3x3dbl_1 = BasicConv2d(in_channels, 64, kernel_size=1) 228 | self.branch3x3dbl_2 = BasicConv2d(64, 96, kernel_size=3, padding=1) 229 | self.branch3x3dbl_3 = BasicConv2d(96, 96, kernel_size=3, padding=1) 230 | 231 | self.branch_pool = 
BasicConv2d(in_channels, pool_features, kernel_size=1) 232 | 233 | def forward(self, x): 234 | branch1x1 = self.branch1x1(x) 235 | 236 | branch5x5 = self.branch5x5_1(x) 237 | branch5x5 = self.branch5x5_2(branch5x5) 238 | 239 | branch3x3dbl = self.branch3x3dbl_1(x) 240 | branch3x3dbl = self.branch3x3dbl_2(branch3x3dbl) 241 | branch3x3dbl = self.branch3x3dbl_3(branch3x3dbl) 242 | 243 | branch_pool = F.avg_pool2d(x, kernel_size=3, stride=1, padding=1) 244 | branch_pool = self.branch_pool(branch_pool) 245 | 246 | outputs = [branch1x1, branch5x5, branch3x3dbl, branch_pool] 247 | return torch.cat(outputs, 1) 248 | 249 | 250 | class InceptionB(nn.Module): 251 | 252 | def __init__(self, in_channels): 253 | super(InceptionB, self).__init__() 254 | self.branch3x3 = BasicConv2d(in_channels, 384, kernel_size=3, stride=2) 255 | 256 | self.branch3x3dbl_1 = BasicConv2d(in_channels, 64, kernel_size=1) 257 | self.branch3x3dbl_2 = BasicConv2d(64, 96, kernel_size=3, padding=1) 258 | self.branch3x3dbl_3 = BasicConv2d(96, 96, kernel_size=3, stride=2) 259 | 260 | def forward(self, x): 261 | branch3x3 = self.branch3x3(x) 262 | 263 | branch3x3dbl = self.branch3x3dbl_1(x) 264 | branch3x3dbl = self.branch3x3dbl_2(branch3x3dbl) 265 | branch3x3dbl = self.branch3x3dbl_3(branch3x3dbl) 266 | 267 | branch_pool = F.max_pool2d(x, kernel_size=3, stride=2) 268 | 269 | outputs = [branch3x3, branch3x3dbl, branch_pool] 270 | return torch.cat(outputs, 1) 271 | 272 | 273 | class InceptionC(nn.Module): 274 | 275 | def __init__(self, in_channels, channels_7x7): 276 | super(InceptionC, self).__init__() 277 | 278 | self.branch1x1 = BasicConv2d(in_channels, 192, kernel_size=1) 279 | 280 | c7 = channels_7x7 281 | self.branch7x7_1 = BasicConv2d(in_channels, c7, kernel_size=1) 282 | self.branch7x7_2 = BasicConv2d(c7, c7, kernel_size=(1, 7), padding=(0, 3)) 283 | self.branch7x7_3 = BasicConv2d(c7, 192, kernel_size=(7, 1), padding=(3, 0)) 284 | 285 | self.branch7x7dbl_1 = BasicConv2d(in_channels, c7, kernel_size=1) 
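The (1, 7)/(7, 1) kernel pairs above implement the factorized 7x7 convolutions from the Inception v3 paper: a 1x7 followed by a 7x1 conv covers the same 7x7 receptive field with far fewer weights. A minimal sketch of the savings, assuming bias-free convolutions as in BasicConv2d (the channel count 128 is just Mixed_6b's channels_7x7):

```python
def conv_weights(c_in, c_out, kh, kw):
    # weight count of a bias-free Conv2d: out_channels * in_channels * kh * kw
    return c_out * c_in * kh * kw

c = 128  # channels_7x7 for Mixed_6b
full = conv_weights(c, c, 7, 7)                                   # single 7x7 conv
factorized = conv_weights(c, c, 1, 7) + conv_weights(c, c, 7, 1)  # 1x7 then 7x1
print(full, factorized)  # the factorized pair needs 2/7 of the weights
```

The same trade-off is why the branch7x7dbl path stacks four such asymmetric convs rather than two full 7x7 layers.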
286 | self.branch7x7dbl_2 = BasicConv2d(c7, c7, kernel_size=(7, 1), padding=(3, 0)) 287 | self.branch7x7dbl_3 = BasicConv2d(c7, c7, kernel_size=(1, 7), padding=(0, 3)) 288 | self.branch7x7dbl_4 = BasicConv2d(c7, c7, kernel_size=(7, 1), padding=(3, 0)) 289 | self.branch7x7dbl_5 = BasicConv2d(c7, 192, kernel_size=(1, 7), padding=(0, 3)) 290 | 291 | self.branch_pool = BasicConv2d(in_channels, 192, kernel_size=1) 292 | 293 | def forward(self, x): 294 | branch1x1 = self.branch1x1(x) 295 | 296 | branch7x7 = self.branch7x7_1(x) 297 | branch7x7 = self.branch7x7_2(branch7x7) 298 | branch7x7 = self.branch7x7_3(branch7x7) 299 | 300 | branch7x7dbl = self.branch7x7dbl_1(x) 301 | branch7x7dbl = self.branch7x7dbl_2(branch7x7dbl) 302 | branch7x7dbl = self.branch7x7dbl_3(branch7x7dbl) 303 | branch7x7dbl = self.branch7x7dbl_4(branch7x7dbl) 304 | branch7x7dbl = self.branch7x7dbl_5(branch7x7dbl) 305 | 306 | branch_pool = F.avg_pool2d(x, kernel_size=3, stride=1, padding=1) 307 | branch_pool = self.branch_pool(branch_pool) 308 | 309 | outputs = [branch1x1, branch7x7, branch7x7dbl, branch_pool] 310 | return torch.cat(outputs, 1) 311 | 312 | 313 | class InceptionD(nn.Module): 314 | 315 | def __init__(self, in_channels,attention=False,num_parts=32): 316 | super(InceptionD, self).__init__() 317 | self.attention = attention 318 | self.num_parts = num_parts 319 | self.branch3x3_1 = BasicConv2d(in_channels, 192, kernel_size=1) 320 | self.branch3x3_2 = BasicConv2d(192, 320, kernel_size=3, stride=2) 321 | 322 | self.branch7x7x3_1 = BasicConv2d(in_channels, 192, kernel_size=1) 323 | self.branch7x7x3_2 = BasicConv2d(192, 192, kernel_size=(1, 7), padding=(0, 3)) 324 | self.branch7x7x3_3 = BasicConv2d(192, 192, kernel_size=(7, 1), padding=(3, 0)) 325 | self.branch7x7x3_4 = BasicConv2d(192, 192, kernel_size=3, stride=2) 326 | 327 | def forward(self, x): 328 | branch3x3 = self.branch3x3_1(x) 329 | if self.attention: 330 | self.atm = branch3x3[:,:self.num_parts,:,:] 331 | branch3x3 = 
self.branch3x3_2(branch3x3) 332 | 333 | branch7x7x3 = self.branch7x7x3_1(x) 334 | branch7x7x3 = self.branch7x7x3_2(branch7x7x3) 335 | branch7x7x3 = self.branch7x7x3_3(branch7x7x3) 336 | branch7x7x3 = self.branch7x7x3_4(branch7x7x3) 337 | 338 | branch_pool = F.max_pool2d(x, kernel_size=3, stride=2) 339 | outputs = [branch3x3, branch7x7x3, branch_pool] 340 | return torch.cat(outputs, 1) 341 | 342 | 343 | class InceptionE(nn.Module): 344 | 345 | def __init__(self, in_channels): 346 | super(InceptionE, self).__init__() 347 | self.branch1x1 = BasicConv2d(in_channels, 320, kernel_size=1) 348 | 349 | self.branch3x3_1 = BasicConv2d(in_channels, 384, kernel_size=1) 350 | self.branch3x3_2a = BasicConv2d(384, 384, kernel_size=(1, 3), padding=(0, 1)) 351 | self.branch3x3_2b = BasicConv2d(384, 384, kernel_size=(3, 1), padding=(1, 0)) 352 | 353 | self.branch3x3dbl_1 = BasicConv2d(in_channels, 448, kernel_size=1) 354 | self.branch3x3dbl_2 = BasicConv2d(448, 384, kernel_size=3, padding=1) 355 | self.branch3x3dbl_3a = BasicConv2d(384, 384, kernel_size=(1, 3), padding=(0, 1)) 356 | self.branch3x3dbl_3b = BasicConv2d(384, 384, kernel_size=(3, 1), padding=(1, 0)) 357 | 358 | self.branch_pool = BasicConv2d(in_channels, 192, kernel_size=1) 359 | 360 | def forward(self, x): 361 | branch1x1 = self.branch1x1(x) 362 | # self.attention_map = branch1x1 363 | branch3x3 = self.branch3x3_1(x) 364 | branch3x3 = [ 365 | self.branch3x3_2a(branch3x3), 366 | self.branch3x3_2b(branch3x3), 367 | ] 368 | branch3x3 = torch.cat(branch3x3, 1) 369 | 370 | branch3x3dbl = self.branch3x3dbl_1(x) 371 | branch3x3dbl = self.branch3x3dbl_2(branch3x3dbl) 372 | branch3x3dbl = [ 373 | self.branch3x3dbl_3a(branch3x3dbl), 374 | self.branch3x3dbl_3b(branch3x3dbl), 375 | ] 376 | branch3x3dbl = torch.cat(branch3x3dbl, 1) 377 | 378 | branch_pool = F.avg_pool2d(x, kernel_size=3, stride=1, padding=1) 379 | branch_pool = self.branch_pool(branch_pool) 380 | 381 | outputs = [branch1x1, branch3x3, branch3x3dbl, branch_pool] 382 
| return torch.cat(outputs, 1) 383 | 384 | 385 | class InceptionAux(nn.Module): 386 | 387 | def __init__(self, in_channels, num_classes): 388 | super(InceptionAux, self).__init__() 389 | self.conv0 = BasicConv2d(in_channels, 128, kernel_size=1) 390 | self.conv1 = BasicConv2d(128, 768, kernel_size=5) 391 | self.conv1.stddev = 0.01 392 | self.fc = nn.Linear(768, num_classes) 393 | self.fc.stddev = 0.001 394 | 395 | def forward(self, x): 396 | # N x 768 x 17 x 17 397 | x = F.avg_pool2d(x, kernel_size=5, stride=3) 398 | # N x 768 x 5 x 5 399 | x = self.conv0(x) 400 | # N x 128 x 5 x 5 401 | x = self.conv1(x) 402 | # N x 768 x 1 x 1 403 | # Adaptive average pooling 404 | x = F.adaptive_avg_pool2d(x, (1, 1)) 405 | # N x 768 x 1 x 1 406 | x = torch.flatten(x, 1) 407 | # N x 768 408 | x = self.fc(x) 409 | # N x 1000 410 | return x 411 | 412 | 413 | class BasicConv2d(nn.Module): 414 | 415 | def __init__(self, in_channels, out_channels, **kwargs): 416 | super(BasicConv2d, self).__init__() 417 | self.conv = nn.Conv2d(in_channels, out_channels, bias=False, **kwargs) 418 | self.bn = nn.BatchNorm2d(out_channels, eps=0.001) 419 | 420 | def forward(self, x): 421 | x = self.conv(x) 422 | x = self.bn(x) 423 | return F.relu(x, inplace=True) 424 | 425 | if __name__=='__main__': 426 | # bap = BAP() 427 | # atm = torch.Tensor(12,768,26,26) 428 | # ftm = torch.Tensor(12,32,26,26) 429 | # rft = bap(atm,ftm) 430 | # criterion = torch.nn.CrossEntropyLoss() 431 | # print(bap.parameters()) 432 | 433 | # print(rft.size()) 434 | import torchvision 435 | import torchvision.transforms as transforms 436 | from PIL import Image 437 | # input_transform = transforms([]) 438 | img = Image.open('/home/zengh/Dataset/oxy/oxySensitive/Sensitive_train_img/n01608432/n01608432_322.JPEG') 439 | # rgb_mean = [0.5,0.5,0.5] 440 | # rgb_std = [0.5,0.5,0.5] 441 | rgb_mean = [0.485, 0.456, 0.406] 442 | rgb_std = [0.229, 0.224, 0.225] 443 | transform_val = transforms.Compose([ 444 | transforms.Resize((299,299)), 445 
| # transforms.CenterCrop(args.crop_size), 446 | transforms.ToTensor(), 447 | transforms.Normalize(rgb_mean, rgb_std), 448 | ]) 449 | net = inception_v3_bap(pretrained=True,aux_logits=False,num_parts=16) 450 | # net = torchvision.models.inception_v3(pretrained=True) 451 | 452 | net.eval() 453 | # print(net) 454 | # input = torch.Tensor(4,3,448,448) 455 | input = transform_val(img).unsqueeze(0) 456 | atm, raw_features, output = net(input) 457 | print('OK',output.size(),torch.argmax(output)) -------------------------------------------------------------------------------- /model/resnet.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn as nn 3 | from .bap import BAP 4 | try: 5 | from torch.hub import load_state_dict_from_url 6 | except ImportError: 7 | from torch.utils.model_zoo import load_url as load_state_dict_from_url 8 | 9 | 10 | __all__ = ['ResNet', 'resnet18', 'resnet34', 'resnet50', 'resnet101', 11 | 'resnet152', 'resnext50_32x4d', 'resnext101_32x8d', 12 | 'wide_resnet50_2', 'wide_resnet101_2'] 13 | 14 | 15 | model_urls = { 16 | 'resnet18': 'https://download.pytorch.org/models/resnet18-5c106cde.pth', 17 | 'resnet34': 'https://download.pytorch.org/models/resnet34-333f7ec4.pth', 18 | 'resnet50': 'https://download.pytorch.org/models/resnet50-19c8e357.pth', 19 | 'resnet101': 'https://download.pytorch.org/models/resnet101-5d3b4d8f.pth', 20 | 'resnet152': 'https://download.pytorch.org/models/resnet152-b121ed2d.pth', 21 | 'resnext50_32x4d': 'https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth', 22 | 'resnext101_32x8d': 'https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth', 23 | 'wide_resnet50_2': 'https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth', 24 | 'wide_resnet101_2': 'https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth', 25 | } 26 | 27 | 28 | def conv3x3(in_planes, out_planes, stride=1, groups=1, dilation=1): 29 | """3x3 convolution with padding""" 30 |
return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride, 31 | padding=dilation, groups=groups, bias=False, dilation=dilation) 32 | 33 | 34 | def conv1x1(in_planes, out_planes, stride=1): 35 | """1x1 convolution""" 36 | return nn.Conv2d(in_planes, out_planes, kernel_size=1, stride=stride, bias=False) 37 | 38 | 39 | class BasicBlock(nn.Module): 40 | expansion = 1 41 | 42 | def __init__(self, inplanes, planes, stride=1, downsample=None, groups=1, 43 | base_width=64, dilation=1, norm_layer=None): 44 | super(BasicBlock, self).__init__() 45 | if norm_layer is None: 46 | norm_layer = nn.BatchNorm2d 47 | if groups != 1 or base_width != 64: 48 | raise ValueError('BasicBlock only supports groups=1 and base_width=64') 49 | if dilation > 1: 50 | raise NotImplementedError("Dilation > 1 not supported in BasicBlock") 51 | # Both self.conv1 and self.downsample layers downsample the input when stride != 1 52 | self.conv1 = conv3x3(inplanes, planes, stride) 53 | self.bn1 = norm_layer(planes) 54 | self.relu = nn.ReLU(inplace=True) 55 | self.conv2 = conv3x3(planes, planes) 56 | self.bn2 = norm_layer(planes) 57 | self.downsample = downsample 58 | self.stride = stride 59 | 60 | def forward(self, x): 61 | identity = x 62 | 63 | out = self.conv1(x) 64 | out = self.bn1(out) 65 | out = self.relu(out) 66 | 67 | out = self.conv2(out) 68 | out = self.bn2(out) 69 | 70 | if self.downsample is not None: 71 | identity = self.downsample(x) 72 | 73 | out += identity 74 | out = self.relu(out) 75 | 76 | return out 77 | 78 | 79 | class Bottleneck_bk(nn.Module): 80 | expansion = 4 81 | 82 | def __init__(self, inplanes, planes, stride=1, downsample=None, groups=1, 83 | base_width=64, dilation=1, norm_layer=None): 84 | super(Bottleneck_bk, self).__init__() 85 | if norm_layer is None: 86 | norm_layer = nn.BatchNorm2d 87 | width = int(planes * (base_width / 64.)) * groups 88 | # Both self.conv2 and self.downsample layers downsample the input when stride != 1 89 | self.conv1 = conv1x1(inplanes,
width) 90 | self.bn1 = norm_layer(width) 91 | self.conv2 = conv3x3(width, width, stride, groups, dilation) 92 | self.bn2 = norm_layer(width) 93 | self.conv3 = conv1x1(width, planes * self.expansion) 94 | self.bn3 = norm_layer(planes * self.expansion) 95 | self.relu = nn.ReLU(inplace=True) 96 | self.downsample = downsample 97 | self.stride = stride 98 | 99 | def forward(self, x): 100 | identity = x 101 | 102 | out = self.conv1(x) 103 | out = self.bn1(out) 104 | out = self.relu(out) 105 | 106 | out = self.conv2(out) 107 | out = self.bn2(out) 108 | out = self.relu(out) 109 | 110 | out = self.conv3(out) 111 | out = self.bn3(out) 112 | 113 | if self.downsample is not None: 114 | identity = self.downsample(x) 115 | 116 | out += identity 117 | out = self.relu(out) 118 | 119 | return out 120 | 121 | 122 | class Bottleneck(nn.Module): 123 | expansion = 4 124 | 125 | def __init__(self, inplanes, planes, stride=1, downsample=None, groups=1, 126 | base_width=64, dilation=1, norm_layer=None, use_bap=False): 127 | super(Bottleneck, self).__init__() 128 | 129 | ## add by zengh 130 | self.use_bap = use_bap 131 | if norm_layer is None: 132 | norm_layer = nn.BatchNorm2d 133 | width = int(planes * (base_width / 64.)) * groups 134 | # Both self.conv2 and self.downsample layers downsample the input when stride != 1 135 | self.conv1 = conv1x1(inplanes, width) 136 | self.bn1 = norm_layer(width) 137 | 138 | self.conv2 = conv3x3(width, width, stride, groups, dilation) 139 | self.bn2 = norm_layer(width) 140 | self.conv3 = conv1x1(width, planes * self.expansion) 141 | self.bn3 = norm_layer(planes * self.expansion) 142 | self.relu = nn.ReLU(inplace=True) 143 | if self.use_bap: 144 | self.bap = BAP() 145 | self.downsample = downsample 146 | self.stride = stride 147 | 148 | def forward(self, x): 149 | identity = x ## feature map 150 | if self.downsample is not None: 151 | identity = self.downsample(x) 152 | out = self.conv1(x) 153 | out = self.bn1(out) 154 | out = self.relu(out) 155 | 
feature_map = out 156 | out = self.conv2(out) 157 | out = self.bn2(out) 158 | out = self.relu(out) 159 | if self.use_bap: 160 | attention = out[:,:32,:,:] 161 | raw_features,pooling_features = self.bap(feature_map,attention) 162 | return attention,raw_features,pooling_features 163 | out = self.conv3(out) 164 | out = self.bn3(out) 165 | 166 | out += identity 167 | out = self.relu(out) 168 | 169 | return out 170 | 171 | 172 | class ResNet(nn.Module): 173 | 174 | def __init__(self, block, layers, num_classes=1000, zero_init_residual=False, 175 | groups=1, width_per_group=64, replace_stride_with_dilation=None, 176 | norm_layer=None,use_bap = False): 177 | super(ResNet, self).__init__() 178 | self.use_bap = use_bap 179 | if norm_layer is None: 180 | norm_layer = nn.BatchNorm2d 181 | self._norm_layer = norm_layer 182 | 183 | self.inplanes = 64 184 | self.dilation = 1 185 | if replace_stride_with_dilation is None: 186 | # each element in the tuple indicates if we should replace 187 | # the 2x2 stride with a dilated convolution instead 188 | replace_stride_with_dilation = [False, False, False] 189 | if len(replace_stride_with_dilation) != 3: 190 | raise ValueError("replace_stride_with_dilation should be None " 191 | "or a 3-element tuple, got {}".format(replace_stride_with_dilation)) 192 | self.groups = groups 193 | self.base_width = width_per_group 194 | self.conv1 = nn.Conv2d(3, self.inplanes, kernel_size=7, stride=2, padding=3, 195 | bias=False) 196 | self.bn1 = norm_layer(self.inplanes) 197 | self.relu = nn.ReLU(inplace=True) 198 | self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1) 199 | self.layer1 = self._make_layer(block, 64, layers[0]) 200 | self.layer2 = self._make_layer(block, 128, layers[1], stride=2, 201 | dilate=replace_stride_with_dilation[0]) 202 | self.layer3 = self._make_layer(block, 256, layers[2], stride=2, 203 | dilate=replace_stride_with_dilation[1]) 204 | self.layer4 = self._make_layer(block, 512, layers[3], stride=2, 205 | 
dilate=replace_stride_with_dilation[2],use_bap=use_bap) 206 | self.avgpool = nn.AdaptiveAvgPool2d((1, 1)) 207 | # self.fc = nn.Linear(512 * block.expansion, num_classes) 208 | self.fc_new = nn.Linear(512*32,num_classes) 209 | 210 | for m in self.modules(): 211 | if isinstance(m, nn.Conv2d): 212 | nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu') 213 | elif isinstance(m, (nn.BatchNorm2d, nn.GroupNorm)): 214 | nn.init.constant_(m.weight, 1) 215 | nn.init.constant_(m.bias, 0) 216 | 217 | # Zero-initialize the last BN in each residual branch, 218 | # so that the residual branch starts with zeros, and each residual block behaves like an identity. 219 | # This improves the model by 0.2~0.3% according to https://arxiv.org/abs/1706.02677 220 | if zero_init_residual: 221 | for m in self.modules(): 222 | if isinstance(m, Bottleneck): 223 | nn.init.constant_(m.bn3.weight, 0) 224 | elif isinstance(m, BasicBlock): 225 | nn.init.constant_(m.bn2.weight, 0) 226 | 227 | def _make_layer(self, block, planes, blocks, stride=1, dilate=False, use_bap = False): 228 | norm_layer = self._norm_layer 229 | downsample = None 230 | previous_dilation = self.dilation 231 | if dilate: 232 | self.dilation *= stride 233 | stride = 1 234 | if stride != 1 or self.inplanes != planes * block.expansion: 235 | downsample = nn.Sequential( 236 | conv1x1(self.inplanes, planes * block.expansion, stride), 237 | norm_layer(planes * block.expansion), 238 | ) 239 | 240 | layers = [] 241 | # if use_bap: 242 | layers.append(block(self.inplanes, planes, stride, downsample, self.groups, 243 | self.base_width, previous_dilation, norm_layer)) 244 | self.inplanes = planes * block.expansion 245 | 246 | layers.append(block(self.inplanes, planes, groups=self.groups, 247 | base_width=self.base_width, dilation=self.dilation, 248 | norm_layer=norm_layer,use_bap=use_bap)) 249 | if use_bap: 250 | return nn.Sequential(*layers) 251 | 252 | for _ in range(2, blocks): 253 | layers.append(block(self.inplanes, 
planes, groups=self.groups, 254 | base_width=self.base_width, dilation=self.dilation, 255 | norm_layer=norm_layer)) 256 | 257 | return nn.Sequential(*layers) 258 | 259 | def forward(self, x): 260 | x = self.conv1(x) 261 | x = self.bn1(x) 262 | x = self.relu(x) 263 | x = self.maxpool(x) 264 | 265 | x = self.layer1(x) 266 | x = self.layer2(x) 267 | x = self.layer3(x) 268 | x = self.layer4(x) 269 | if self.use_bap: 270 | attention,raw_features,x = x 271 | # print(attention.shape,raw_features.shape,x.shape) 272 | if not self.use_bap: 273 | x = self.avgpool(x) 274 | x = torch.flatten(x, 1) 275 | x = self.fc_new(x) 276 | 277 | if self.use_bap: 278 | return attention,raw_features,x 279 | return x 280 | 281 | 282 | def _resnet(arch, block, layers, pretrained, progress, **kwargs): 283 | model = ResNet(block, layers, **kwargs) 284 | if pretrained: 285 | pretrained_dict = load_state_dict_from_url(model_urls[arch], 286 | progress=progress) 287 | model_dict = model.state_dict() 288 | state_dict = {k:v for k,v in pretrained_dict.items() if k in model_dict.keys()} 289 | # model.load_state_dict(state_dict) 290 | model_dict.update(state_dict) 291 | model.load_state_dict(model_dict) 292 | return model 293 | 294 | 295 | def resnet18(pretrained=False, progress=True, **kwargs): 296 | r"""ResNet-18 model from 297 | `"Deep Residual Learning for Image Recognition" <https://arxiv.org/pdf/1512.03385.pdf>`_ 298 | Args: 299 | pretrained (bool): If True, returns a model pre-trained on ImageNet 300 | progress (bool): If True, displays a progress bar of the download to stderr 301 | """ 302 | return _resnet('resnet18', BasicBlock, [2, 2, 2, 2], pretrained, progress, 303 | **kwargs) 304 | 305 | 306 | def resnet34(pretrained=False, progress=True, **kwargs): 307 | r"""ResNet-34 model from 308 | `"Deep Residual Learning for Image Recognition" <https://arxiv.org/pdf/1512.03385.pdf>`_ 309 | Args: 310 | pretrained (bool): If True, returns a model pre-trained on ImageNet 311 | progress (bool): If True, displays a progress bar of the download to stderr 312 | """ 313 | return
_resnet('resnet34', BasicBlock, [3, 4, 6, 3], pretrained, progress, 314 | **kwargs) 315 | 316 | 317 | def resnet50(pretrained=False, progress=True, **kwargs): 318 | r"""ResNet-50 model from 319 | `"Deep Residual Learning for Image Recognition" <https://arxiv.org/pdf/1512.03385.pdf>`_ 320 | Args: 321 | pretrained (bool): If True, returns a model pre-trained on ImageNet 322 | progress (bool): If True, displays a progress bar of the download to stderr 323 | """ 324 | return _resnet('resnet50', Bottleneck, [3, 4, 6, 3], pretrained, progress, 325 | **kwargs) 326 | 327 | 328 | def resnet101(pretrained=False, progress=True, **kwargs): 329 | r"""ResNet-101 model from 330 | `"Deep Residual Learning for Image Recognition" <https://arxiv.org/pdf/1512.03385.pdf>`_ 331 | Args: 332 | pretrained (bool): If True, returns a model pre-trained on ImageNet 333 | progress (bool): If True, displays a progress bar of the download to stderr 334 | """ 335 | return _resnet('resnet101', Bottleneck, [3, 4, 23, 3], pretrained, progress, 336 | **kwargs) 337 | 338 | 339 | def resnet152(pretrained=False, progress=True, **kwargs): 340 | r"""ResNet-152 model from 341 | `"Deep Residual Learning for Image Recognition" <https://arxiv.org/pdf/1512.03385.pdf>`_ 342 | Args: 343 | pretrained (bool): If True, returns a model pre-trained on ImageNet 344 | progress (bool): If True, displays a progress bar of the download to stderr 345 | """ 346 | return _resnet('resnet152', Bottleneck, [3, 8, 36, 3], pretrained, progress, 347 | **kwargs) 348 | 349 | 350 | def resnext50_32x4d(pretrained=False, progress=True, **kwargs): 351 | r"""ResNeXt-50 32x4d model from 352 | `"Aggregated Residual Transformation for Deep Neural Networks" <https://arxiv.org/pdf/1611.05431.pdf>`_ 353 | Args: 354 | pretrained (bool): If True, returns a model pre-trained on ImageNet 355 | progress (bool): If True, displays a progress bar of the download to stderr 356 | """ 357 | kwargs['groups'] = 32 358 | kwargs['width_per_group'] = 4 359 | return _resnet('resnext50_32x4d', Bottleneck, [3, 4, 6, 3], 360 | pretrained, progress, **kwargs) 361 | 362 | 363 | def
resnext101_32x8d(pretrained=False, progress=True, **kwargs): 364 | r"""ResNeXt-101 32x8d model from 365 | `"Aggregated Residual Transformation for Deep Neural Networks" <https://arxiv.org/pdf/1611.05431.pdf>`_ 366 | Args: 367 | pretrained (bool): If True, returns a model pre-trained on ImageNet 368 | progress (bool): If True, displays a progress bar of the download to stderr 369 | """ 370 | kwargs['groups'] = 32 371 | kwargs['width_per_group'] = 8 372 | return _resnet('resnext101_32x8d', Bottleneck, [3, 4, 23, 3], 373 | pretrained, progress, **kwargs) 374 | 375 | 376 | def wide_resnet50_2(pretrained=False, progress=True, **kwargs): 377 | r"""Wide ResNet-50-2 model from 378 | `"Wide Residual Networks" <https://arxiv.org/pdf/1605.07146.pdf>`_ 379 | The model is the same as ResNet except for the bottleneck number of channels 380 | which is twice as large in every block. The number of channels in outer 1x1 381 | convolutions is the same, e.g. last block in ResNet-50 has 2048-512-2048 382 | channels, and in Wide ResNet-50-2 has 2048-1024-2048. 383 | Args: 384 | pretrained (bool): If True, returns a model pre-trained on ImageNet 385 | progress (bool): If True, displays a progress bar of the download to stderr 386 | """ 387 | kwargs['width_per_group'] = 64 * 2 388 | return _resnet('wide_resnet50_2', Bottleneck, [3, 4, 6, 3], 389 | pretrained, progress, **kwargs) 390 | 391 | 392 | def wide_resnet101_2(pretrained=False, progress=True, **kwargs): 393 | r"""Wide ResNet-101-2 model from 394 | `"Wide Residual Networks" <https://arxiv.org/pdf/1605.07146.pdf>`_ 395 | The model is the same as ResNet except for the bottleneck number of channels 396 | which is twice as large in every block. The number of channels in outer 1x1 397 | convolutions is the same, e.g. last block in ResNet-50 has 2048-512-2048 398 | channels, and in Wide ResNet-50-2 has 2048-1024-2048.
399 | Args: 400 | pretrained (bool): If True, returns a model pre-trained on ImageNet 401 | progress (bool): If True, displays a progress bar of the download to stderr 402 | """ 403 | kwargs['width_per_group'] = 64 * 2 404 | return _resnet('wide_resnet101_2', Bottleneck, [3, 4, 23, 3], 405 | pretrained, progress, **kwargs) 406 | 407 | 408 | if __name__ == '__main__': 409 | net = resnet50(use_bap=True,pretrained=True) 410 | input = torch.Tensor(4,3,224,224) 411 | out = net(input) 412 | # print(net) -------------------------------------------------------------------------------- /test_bap.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | python train_bap.py test\ 4 | --model-name inception \ 5 | --batch-size 12 \ 6 | --dataset bird \ 7 | --image-size 512 \ 8 | --input-size 448 \ 9 | --checkpoint-path checkpoint/bird/model_best.pth.tar \ 10 | --use-gpu \ 11 | --multi-gpu \ 12 | --gpu-ids 0,1 \ 13 | -------------------------------------------------------------------------------- /train_bap.py: -------------------------------------------------------------------------------- 1 | ############################################################ 2 | # File: train_bap.py # 3 | # Created: 2019-11-06 13:22:23 # 4 | # Author : wvinzh # 5 | # Email : wvinzh@qq.com # 6 | # ------------------------------------------ # 7 | # Description:train_bap.py # 8 | # Copyright@2019 wvinzh, HUST # 9 | ############################################################ 10 | 11 | # system 12 | import os 13 | import time 14 | import shutil 15 | import random 16 | import numpy as np 17 | 18 | # my implementation 19 | from model.inception_bap import inception_v3_bap 20 | from model.resnet import resnet50 21 | from dataset.custom_dataset import CustomDataset 22 | 23 | from utils import calculate_pooling_center_loss, mask2bbox 24 | from utils import attention_crop, attention_drop, attention_crop_drop 25 | from utils import getDatasetConfig, getConfig 26 | from 
utils import accuracy, get_lr, save_checkpoint, AverageMeter, set_seed 27 | from utils import Engine 28 | 29 | # pytorch 30 | import torch 31 | import torchvision.transforms as transforms 32 | from torch.utils.data import DataLoader 33 | import torchvision.models as models 34 | import torch.nn.functional as F 35 | from tensorboardX import SummaryWriter 36 | 37 | GLOBAL_SEED = 1231 38 | def _init_fn(worker_id): 39 | set_seed(GLOBAL_SEED+worker_id) 40 | 41 | def train(): 42 | # input params 43 | set_seed(GLOBAL_SEED) 44 | config = getConfig() 45 | data_config = getDatasetConfig(config.dataset) 46 | sw_log = 'logs/%s' % config.dataset 47 | sw = SummaryWriter(log_dir=sw_log) 48 | best_prec1 = 0. 49 | rate = 0.875 50 | 51 | # define train_dataset and loader 52 | transform_train = transforms.Compose([ 53 | transforms.Resize((int(config.input_size//rate), int(config.input_size//rate))), 54 | transforms.RandomCrop((config.input_size,config.input_size)), 55 | transforms.RandomHorizontalFlip(), 56 | transforms.ColorJitter(brightness=32./255.,saturation=0.5), 57 | transforms.ToTensor(), 58 | transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]), 59 | ]) 60 | train_dataset = CustomDataset( 61 | data_config['train'], data_config['train_root'], transform=transform_train) 62 | train_loader = DataLoader( 63 | train_dataset, batch_size=config.batch_size, shuffle=True, num_workers=config.workers, pin_memory=True, worker_init_fn=_init_fn) 64 | 65 | transform_test = transforms.Compose([ 66 | transforms.Resize((config.image_size, config.image_size)), 67 | transforms.CenterCrop(config.input_size), 68 | transforms.ToTensor(), 69 | transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]), 70 | ]) 71 | val_dataset = CustomDataset( 72 | data_config['val'], data_config['val_root'], transform=transform_test) 73 | val_loader = DataLoader( 74 | val_dataset, batch_size=config.batch_size, shuffle=False, num_workers=config.workers, pin_memory=True, worker_init_fn=_init_fn) 
75 | # logging dataset info 76 | print('Dataset Name:{dataset_name}, Train:[{train_num}], Val:[{val_num}]'.format( 77 | dataset_name=config.dataset, 78 | train_num=len(train_dataset), 79 | val_num=len(val_dataset))) 80 | print('Batch Size:[{0}], Total:::Train Batches:[{1}],Val Batches:[{2}]'.format( 81 | config.batch_size, len(train_loader), len(val_loader) 82 | )) 83 | # define model 84 | if config.model_name == 'inception': 85 | net = inception_v3_bap(pretrained=True, aux_logits=False,num_parts=config.parts) 86 | elif config.model_name == 'resnet50': 87 | net = resnet50(pretrained=True,use_bap=True) 88 | 89 | 90 | in_features = net.fc_new.in_features 91 | new_linear = torch.nn.Linear( 92 | in_features=in_features, out_features=train_dataset.num_classes) 93 | net.fc_new = new_linear 94 | # feature center 95 | feature_len = 768 if config.model_name == 'inception' else 512 96 | center_dict = {'center': torch.zeros( 97 | train_dataset.num_classes, feature_len*config.parts)} 98 | 99 | # gpu config 100 | use_gpu = torch.cuda.is_available() and config.use_gpu 101 | if use_gpu: 102 | net = net.cuda() 103 | center_dict['center'] = center_dict['center'].cuda() 104 | gpu_ids = [int(r) for r in config.gpu_ids.split(',')] 105 | if use_gpu and config.multi_gpu: 106 | net = torch.nn.DataParallel(net, device_ids=gpu_ids) 107 | 108 | # define optimizer 109 | assert config.optim in ['sgd', 'adam'], 'optim name not found!' 110 | if config.optim == 'sgd': 111 | optimizer = torch.optim.SGD( 112 | net.parameters(), lr=config.lr, momentum=config.momentum, weight_decay=config.weight_decay) 113 | elif config.optim == 'adam': 114 | optimizer = torch.optim.Adam( 115 | net.parameters(), lr=config.lr, weight_decay=config.weight_decay) 116 | 117 | # define learning scheduler 118 | assert config.scheduler in ['plateau', 119 | 'step'], 'scheduler not supported!!!' 
120 | if config.scheduler == 'plateau': 121 | scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau( 122 | optimizer, 'min', patience=3, factor=0.1) 123 | elif config.scheduler == 'step': 124 | scheduler = torch.optim.lr_scheduler.StepLR( 125 | optimizer, step_size=2, gamma=0.9) 126 | 127 | # define loss 128 | criterion = torch.nn.CrossEntropyLoss() 129 | if use_gpu: 130 | criterion = criterion.cuda() 131 | 132 | # train val parameters dict 133 | state = {'model': net, 'train_loader': train_loader, 134 | 'val_loader': val_loader, 'criterion': criterion, 135 | 'center': center_dict['center'], 'config': config, 136 | 'optimizer': optimizer} 137 | ## train and val 138 | engine = Engine() 139 | print(config) 140 | for e in range(config.epochs): 141 | if config.scheduler == 'step': 142 | scheduler.step() 143 | lr_val = get_lr(optimizer) 144 | print("Start epoch %d ==========,lr=%f" % (e, lr_val)) 145 | train_prec, train_loss = engine.train(state, e) 146 | prec1, val_loss = engine.validate(state) 147 | is_best = prec1 > best_prec1 148 | best_prec1 = max(prec1, best_prec1) 149 | save_checkpoint({ 150 | 'epoch': e + 1, 151 | 'state_dict': net.state_dict(), 152 | 'best_prec1': best_prec1, 153 | 'optimizer': optimizer.state_dict(), 154 | 'center': center_dict['center'] 155 | }, is_best, config.checkpoint_path) 156 | sw.add_scalars("Accuracy", {'train': train_prec, 'val': prec1}, e) 157 | sw.add_scalars("Loss", {'train': train_loss, 'val': val_loss}, e) 158 | if config.scheduler == 'plateau': 159 | scheduler.step(val_loss) 160 | 161 | def test(): 162 | ## 163 | engine = Engine() 164 | config = getConfig() 165 | data_config = getDatasetConfig(config.dataset) 166 | # define dataset 167 | transform_test = transforms.Compose([ 168 | transforms.Resize((config.image_size, config.image_size)), 169 | transforms.CenterCrop(config.input_size), 170 | transforms.ToTensor(), 171 | transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]), 172 | ]) 173 | val_dataset = 
CustomDataset( 174 | data_config['val'], data_config['val_root'], transform=transform_test) 175 | val_loader = DataLoader( 176 | val_dataset, batch_size=config.batch_size, shuffle=False, num_workers=config.workers, pin_memory=True) 177 | # define model 178 | if config.model_name == 'inception': 179 | net = inception_v3_bap(pretrained=True, aux_logits=False) 180 | elif config.model_name == 'resnet50': 181 | net = resnet50(pretrained=True) 182 | 183 | in_features = net.fc_new.in_features 184 | new_linear = torch.nn.Linear( 185 | in_features=in_features, out_features=val_dataset.num_classes) 186 | net.fc_new = new_linear 187 | 188 | # load checkpoint 189 | use_gpu = torch.cuda.is_available() and config.use_gpu 190 | if use_gpu: 191 | net = net.cuda() 192 | gpu_ids = [int(r) for r in config.gpu_ids.split(',')] 193 | if use_gpu and len(gpu_ids) > 1: 194 | net = torch.nn.DataParallel(net, device_ids=gpu_ids) 195 | #checkpoint_path = os.path.join(config.checkpoint_path,'model_best.pth.tar') 196 | net.load_state_dict(torch.load(config.checkpoint_path)['state_dict']) 197 | 198 | # define loss 199 | 200 | criterion = torch.nn.CrossEntropyLoss() 201 | if use_gpu: 202 | criterion = criterion.cuda() 203 | prec1, prec5 = engine.test(val_loader, net, criterion) 204 | 205 | 206 | if __name__ == '__main__': 207 | config = getConfig() 208 | engine = Engine() 209 | if config.action == 'train': 210 | train() 211 | else: 212 | test() 213 | -------------------------------------------------------------------------------- /train_bap.sh: -------------------------------------------------------------------------------- 1 | 2 | 3 | python train_bap.py train\ 4 | --model-name inception \ 5 | --batch-size 12 \ 6 | --dataset bird \ 7 | --image-size 512 \ 8 | --input-size 448 \ 9 | --checkpoint-path checkpoint/bird \ 10 | --optim sgd \ 11 | --scheduler step \ 12 | --lr 0.001 \ 13 | --momentum 0.9 \ 14 | --weight-decay 1e-5 \ 15 | --workers 4 \ 16 | --parts 32 \ 17 | --epochs 80 \ 18 
| --use-gpu \ 19 | --multi-gpu \ 20 | --gpu-ids 0,1 \ 21 | -------------------------------------------------------------------------------- /utils/__init__.py: -------------------------------------------------------------------------------- 1 | from .utils import accuracy,get_lr,set_seed,save_checkpoint,_init_fn 2 | from .attention import mask2bbox, calculate_pooling_center_loss,attention_crop,attention_drop,attention_crop_drop 3 | from .meter import AverageMeter 4 | from .config import getConfig,getDatasetConfig 5 | from .utils import getLogger 6 | from .engine import Engine -------------------------------------------------------------------------------- /utils/attention.py: -------------------------------------------------------------------------------- 1 | ############################################################ 2 | # File: attention.py # 3 | # Created: 2019-11-05 19:19:08 # 4 | # Author : wvinzh # 5 | # Email : wvinzh@qq.com # 6 | # ------------------------------------------ # 7 | # Description:attention.py # 8 | # Copyright@2019 wvinzh, HUST # 9 | ############################################################ 10 | 11 | import numpy as np 12 | import random 13 | import torch 14 | import torchvision.transforms as transforms 15 | import torch.nn.functional as F 16 | import time 17 | 18 | 19 | def attention_crop(attention_maps,input_image): 20 | 21 | # start = time.time() 22 | B,N,W,H = input_image.shape 23 | input_tensor = input_image 24 | batch_size, num_parts, height, width = attention_maps.shape 25 | attention_maps = torch.nn.functional.interpolate(attention_maps.detach(),size=(W,H),mode='bilinear') 26 | part_weights = F.avg_pool2d(attention_maps,(W,H)).reshape(batch_size,-1) 27 | part_weights = torch.add(torch.sqrt(part_weights),1e-12) 28 | part_weights = torch.div(part_weights,torch.sum(part_weights,dim=1).unsqueeze(1)).cpu() 29 | part_weights = part_weights.numpy() 30 | ret_imgs = [] 31 | # print(part_weights[3]) 32 | for i in range(batch_size): 33 | 
attention_map = attention_maps[i] 34 | part_weight = part_weights[i] 35 | selected_index = np.random.choice( 36 | np.arange(0, num_parts), 1, p=part_weight)[0] 37 | mask = attention_map[selected_index, :, :] 38 | # print(type(mask)) 39 | # mask = (mask-mask.min())/(mask.max()-mask.min()) 40 | threshold = random.uniform(0.4, 0.6) 41 | # threshold = 0.5 42 | # itemindex = np.where(mask >= threshold) 43 | itemindex = np.where(mask >= mask.max() * threshold) 44 | 45 | # itemindex = torch.nonzero(mask >= threshold) 46 | padding_h = int(0.1*H) 47 | padding_w = int(0.1*W) 48 | height_min = itemindex[0].min() 49 | height_min = max(0,height_min-padding_h) 50 | height_max = itemindex[0].max() + padding_h 51 | width_min = itemindex[1].min() 52 | width_min = max(0,width_min-padding_w) 53 | width_max = itemindex[1].max() + padding_w 54 | out_img = input_tensor[i][:,height_min:height_max,width_min:width_max].unsqueeze(0) 55 | out_img = torch.nn.functional.interpolate(out_img,size=(W,H),mode='bilinear',align_corners=True) 56 | out_img = out_img.squeeze(0) 57 | # print(out_img.shape) 58 | ret_imgs.append(out_img) 59 | ret_imgs = torch.stack(ret_imgs) 60 | return ret_imgs 61 | 62 | 63 | def attention_drop(attention_maps,input_image): 64 | B,N,W,H = input_image.shape 65 | input_tensor = input_image 66 | batch_size, num_parts, height, width = attention_maps.shape 67 | attention_maps = torch.nn.functional.interpolate(attention_maps.detach(),size=(W,H),mode='bilinear') 68 | part_weights = F.avg_pool2d(attention_maps,(W,H)).reshape(batch_size,-1) 69 | part_weights = torch.add(torch.sqrt(part_weights),1e-12) 70 | part_weights = torch.div(part_weights,torch.sum(part_weights,dim=1).unsqueeze(1)).cpu().numpy() 71 | # attention_maps = torch.nn.functional.interpolate(attention_maps,size=(W,H),mode='bilinear', align_corners=True) 72 | # print(part_weights.shape) 73 | masks = [] 74 | for i in range(batch_size): 75 | attention_map = attention_maps[i].detach() 76 | part_weight = part_weights[i] 
77 | selected_index = np.random.choice( 78 | np.arange(0, num_parts), 1, p=part_weight)[0] 79 | mask = attention_map[selected_index:selected_index + 1, :, :] 80 | 81 | # soft mask 82 | # threshold = random.uniform(0.2, 0.5) 83 | # threshold = 0.5 84 | # mask = (mask-mask.min())/(mask.max()-mask.min()) 85 | # mask = (mask < threshold).float() 86 | threshold = random.uniform(0.2, 0.5) 87 | mask = (mask < threshold * mask.max()).float() 88 | masks.append(mask) 89 | masks = torch.stack(masks) 90 | # print(masks.shape) 91 | ret = input_tensor*masks 92 | return ret 93 | 94 | def attention_crop_drop(attention_maps,input_image): 95 | # start = time.time() 96 | B,N,W,H = input_image.shape 97 | input_tensor = input_image 98 | batch_size, num_parts, height, width = attention_maps.shape 99 | attention_maps = torch.nn.functional.interpolate(attention_maps.detach(),size=(W,H),mode='bilinear') 100 | part_weights = F.avg_pool2d(attention_maps.detach(),(W,H)).reshape(batch_size,-1) 101 | part_weights = torch.add(torch.sqrt(part_weights),1e-12) 102 | part_weights = torch.div(part_weights,torch.sum(part_weights,dim=1).unsqueeze(1)).cpu() 103 | part_weights = part_weights.numpy() 104 | # print(part_weights.shape) 105 | ret_imgs = [] 106 | masks = [] 107 | # print(part_weights[3]) 108 | for i in range(batch_size): 109 | attention_map = attention_maps[i] 110 | part_weight = part_weights[i] 111 | selected_index = np.random.choice(np.arange(0, num_parts), 1, p=part_weight)[0] 112 | selected_index2 = np.random.choice(np.arange(0, num_parts), 1, p=part_weight)[0] 113 | ## create crop imgs 114 | mask = attention_map[selected_index, :, :] 115 | # mask = (mask-mask.min())/(mask.max()-mask.min()) 116 | threshold = random.uniform(0.4, 0.6) 117 | # threshold = 0.5 118 | itemindex = np.where(mask >= mask.max()*threshold) 119 | # print(itemindex.shape) 120 | # itemindex = torch.nonzero(mask >= threshold*mask.max()) 121 | padding_h = int(0.1*H) 122 | padding_w = int(0.1*W) 123 | height_min = 
itemindex[0].min() 124 | height_min = max(0,height_min-padding_h) 125 | height_max = itemindex[0].max() + padding_h 126 | width_min = itemindex[1].min() 127 | width_min = max(0,width_min-padding_w) 128 | width_max = itemindex[1].max() + padding_w 129 | # print('numpy',height_min,height_max,width_min,width_max) 130 | out_img = input_tensor[i][:,height_min:height_max,width_min:width_max].unsqueeze(0) 131 | out_img = torch.nn.functional.interpolate(out_img,size=(W,H),mode='bilinear',align_corners=True) 132 | out_img = out_img.squeeze(0) 133 | ret_imgs.append(out_img) 134 | 135 | ## create drop imgs 136 | mask2 = attention_map[selected_index2:selected_index2 + 1, :, :] 137 | threshold = random.uniform(0.2, 0.5) 138 | mask2 = (mask2 < threshold * mask2.max()).float() 139 | masks.append(mask2) 140 | # bboxes = np.asarray(bboxes, np.float32) 141 | crop_imgs = torch.stack(ret_imgs) 142 | masks = torch.stack(masks) 143 | drop_imgs = input_tensor*masks 144 | return (crop_imgs,drop_imgs) 145 | 146 | def mask2bbox(attention_maps,input_image): 147 | input_tensor = input_image 148 | B,C,H,W = input_tensor.shape 149 | batch_size, num_parts, Hh, Ww = attention_maps.shape 150 | attention_maps = torch.nn.functional.interpolate(attention_maps,size=(W,H),mode='bilinear') 151 | ret_imgs = [] 152 | # print(part_weights[3]) 153 | for i in range(batch_size): 154 | attention_map = attention_maps[i] 155 | # print(attention_map.shape) 156 | mask = attention_map.mean(dim=0) 157 | # print(type(mask)) 158 | # mask = (mask-mask.min())/(mask.max()-mask.min()) 159 | # threshold = random.uniform(0.4, 0.6) 160 | threshold = 0.1 161 | max_activate = mask.max() 162 | min_activate = threshold * max_activate 163 | itemindex = torch.nonzero(mask >= min_activate) 164 | 165 | padding_h = int(0.05*H) 166 | padding_w = int(0.05*W) 167 | height_min = itemindex[:, 0].min() 168 | height_min = max(0,height_min-padding_h) 169 | height_max = itemindex[:, 0].max() + padding_h 170 | width_min = itemindex[:, 1].min() 
171 | width_min = max(0,width_min-padding_w) 172 | width_max = itemindex[:, 1].max() + padding_w 173 | # print(height_min,height_max,width_min,width_max) 174 | out_img = input_tensor[i][:,height_min:height_max,width_min:width_max].unsqueeze(0) 175 | out_img = torch.nn.functional.interpolate(out_img,size=(W,H),mode='bilinear',align_corners=True) 176 | out_img = out_img.squeeze(0) 177 | # print(out_img.shape) 178 | ret_imgs.append(out_img) 179 | ret_imgs = torch.stack(ret_imgs) 180 | # print(ret_imgs.shape) 181 | return ret_imgs 182 | 183 | def calculate_pooling_center_loss(features, centers, label, alfa=0.95): 184 | # centers = model.centers 185 | # print('111111111',sum(sum(centers))) 186 | # mse_loss = torch.nn.MSELoss() 187 | features = features.reshape(features.shape[0], -1) 188 | # print(features.shape) 189 | centers_batch = centers[label] 190 | # print(centers_batch) 191 | # print(centers_batch.shape,centers.shape) 192 | centers_batch = torch.nn.functional.normalize(centers_batch, dim=-1) 193 | diff = (1-alfa)*(features.detach() - centers_batch) 194 | distance = torch.pow(features - centers_batch,2) 195 | distance = torch.sum(distance, dim=-1) 196 | center_loss = torch.mean(distance) 197 | # loss2 = mse_loss(features,centers_batch) 198 | # print('================',center_loss.item(),loss2.item()) 199 | return center_loss, diff 200 | 201 | def attention_crop_drop2(attention_maps,input_image): 202 | # start = time.time() 203 | B,N,W,H = input_image.shape 204 | input_tensor = input_image 205 | batch_size, num_parts, height, width = attention_maps.shape 206 | attention_maps = torch.nn.functional.interpolate(attention_maps.detach(),size=(W,H),mode='bilinear') 207 | part_weights = F.avg_pool2d(attention_maps.detach(),(W,H)).reshape(batch_size,-1) 208 | part_weights = torch.add(torch.sqrt(part_weights),1e-12) 209 | part_weights = torch.div(part_weights,torch.sum(part_weights,dim=1).unsqueeze(1)).cpu() 210 | part_weights = part_weights.numpy() 211 | # 
print(part_weights.shape) 212 | ret_imgs = [] 213 | masks = [] 214 | # print(part_weights[3]) 215 | for i in range(batch_size): 216 | attention_map = attention_maps[i] 217 | part_weight = part_weights[i] 218 | selected_index = np.random.choice(np.arange(0, num_parts), 1, p=part_weight)[0] 219 | selected_index2 = np.random.choice(np.arange(0, num_parts), 1, p=part_weight)[0] 220 | ## create crop imgs 221 | mask = attention_map[selected_index, :, :] 222 | # mask = (mask-mask.min())/(mask.max()-mask.min()) 223 | threshold = random.uniform(0.4, 0.6) 224 | # threshold = 0.5 225 | # itemindex = np.where(mask >= mask.max()*threshold) 226 | # print(itemindex.shape) 227 | itemindex = torch.nonzero(mask >= threshold*mask.max()) 228 | padding_h = int(0.1*H) 229 | padding_w = int(0.1*W) 230 | height_min = itemindex[:,0].min() 231 | height_min = max(0,height_min-padding_h) 232 | height_max = itemindex[:,0].max() + padding_h 233 | width_min = itemindex[:,1].min() 234 | width_min = max(0,width_min-padding_w) 235 | width_max = itemindex[:,1].max() + padding_w 236 | # print(height_min,height_max,width_min,width_max) 237 | out_img = input_tensor[i][:,height_min:height_max,width_min:width_max].unsqueeze(0) 238 | out_img = torch.nn.functional.interpolate(out_img,size=(W,H),mode='bilinear',align_corners=True) 239 | out_img = out_img.squeeze(0) 240 | ret_imgs.append(out_img) 241 | 242 | ## create drop imgs 243 | mask2 = attention_map[selected_index2:selected_index2 + 1, :, :] 244 | threshold = random.uniform(0.2, 0.5) 245 | mask2 = (mask2 < threshold * mask2.max()).float() 246 | masks.append(mask2) 247 | # bboxes = np.asarray(bboxes, np.float32) 248 | crop_imgs = torch.stack(ret_imgs) 249 | masks = torch.stack(masks) 250 | drop_imgs = input_tensor*masks 251 | return (crop_imgs,drop_imgs) 252 | 253 | 254 | 255 | 256 | 257 | 258 | if __name__ == '__main__': 259 | import torch 260 | a = torch.rand(4*26*26*32).reshape(4, 32, 26, 26) 261 | # a = torch.Tensor((4, 32, 26, 26)) 262 | img = 
torch.arange(4*3*448*448.0).reshape(4, 3, 448, 448) 263 | # a = torch.arange(4*1*1*8.0).reshape(4, 8, 1, 1) 264 | # b = torch.ones(10*1*1*8).reshape(10, 8) 265 | # label = torch.LongTensor([1, 2, 3, 4]) 266 | # a = torch.div(a,4*26*26*8) 267 | # ret = attention_drop2(a,img) 268 | ret1 = attention_crop_drop(a,img) 269 | ret2 = attention_crop_drop2(a,img) 270 | # ret2 = attention_crop2(a,img) 271 | # ret = calculate_pooling_center_loss(a, b, label) 272 | # print(ret) 273 | # print(ret.shape,ret2.shape) 274 | # print(type(ret),type(ret2)) 275 | -------------------------------------------------------------------------------- /utils/config.py: -------------------------------------------------------------------------------- 1 | import os 2 | import argparse 3 | 4 | 5 | def getConfig(): 6 | parser = argparse.ArgumentParser() 7 | 8 | # train or test 9 | # action = parser.add_subparsers() 10 | # action.add_parser('train', action='store_true', help='run train') 11 | # action.add_parser('test', action='store_true', help='run test') 12 | parser.add_argument('action', choices=('train', 'test')) 13 | # dataset 14 | parser.add_argument('--dataset', metavar='DIR', 15 | default='bird', help='name of the dataset') 16 | parser.add_argument('--image-size', '-i', default=512, type=int, 17 | metavar='N', help='image size (default: 512)') 18 | parser.add_argument('--input-size', '-cs', default=448, type=int, 19 | metavar='N', help='the input size of the model (default: 448)') 20 | parser.add_argument('-j', '--workers', default=4, type=int, metavar='N', 21 | help='number of data loading workers (default: 4)') 22 | 23 | # optimizer config 24 | parser.add_argument('--optim', default='sgd', type=str, 25 | help='the name of optimizer(adam,sgd)') 26 | parser.add_argument('--scheduler', default='plateau', type=str, 27 | help='the name of scheduler(step,plateau)') 28 | parser.add_argument('--lr', '--learning-rate', default=0.001, type=float, 29 | metavar='LR', help='initial learning rate') 30 | 
parser.add_argument('--momentum', default=0.9, type=float, metavar='M', 31 | help='momentum') 32 | parser.add_argument('--weight-decay', '--wd', default=1e-5, type=float, 33 | metavar='W', help='weight decay (default: 1e-5)') 34 | 35 | # model config 36 | parser.add_argument('--parts', default=32, type=int, 37 | metavar='N', help='number of parts (default: 32)') 38 | parser.add_argument('--alpha', default=0.95, type=float, 39 | metavar='N', help='weight for BAP loss') 40 | parser.add_argument('--model-name', default='inception', type=str, 41 | help='model name') 42 | 43 | # training config 44 | parser.add_argument('--use-gpu', action="store_true", default=True, 45 | help='whether to use the gpu or not, default True') 46 | parser.add_argument('--multi-gpu', action="store_true", default=True, 47 | help='whether to use multiple gpus or not, default True') 48 | parser.add_argument('--gpu-ids', default='0,1', 49 | help='gpu id list(eg: 0,1,2...)') 50 | parser.add_argument('--epochs', default=80, type=int, metavar='N', 51 | help='number of total epochs to run') 52 | parser.add_argument('-b', '--batch-size', default=16, type=int, 53 | metavar='N', help='mini-batch size (default: 16)') 54 | parser.add_argument('--print-freq', '-pf', default=100, type=int, 55 | metavar='N', help='print frequency (default: 100)') 56 | parser.add_argument('--resume', default='', type=str, metavar='PATH', 57 | help='path to latest checkpoint (default: none)') 58 | parser.add_argument('--checkpoint-path', default='checkpoint', type=str, metavar='checkpoint_path', 59 | help='path to save checkpoint (default: checkpoint)') 60 | 61 | args = parser.parse_args() 62 | 63 | return args 64 | 65 | 66 | def getDatasetConfig(dataset_name): 67 | assert dataset_name in ['bird', 'car', 68 | 'aircraft','dog'], 'No dataset named %s!' 
% dataset_name 69 | dataset_dict = { 70 | 'bird': {'train_root': 'data/Bird/images', # the root path where the training images are stored 71 | 'val_root': 'data/Bird/images', # the root path where the validation images are stored 72 | # training list file (arranged as: filename label) 73 | 'train': 'data/bird_train.txt', 74 | 'val': 'data/bird_test.txt'}, # validation list file 75 | 'car': {'train_root': 'data/Car/cars_train', 76 | 'val_root': 'data/Car/cars_test', 77 | 'train': 'data/car_train.txt', 78 | 'val': 'data/car_test.txt'}, 79 | 'aircraft': {'train_root': 'data/Aircraft/images', 80 | 'val_root': 'data/Aircraft/images', 81 | 'train': 'data/aircraft_train.txt', 82 | 'val': 'data/aircraft_test.txt'}, 83 | 'dog': {'train_root': 'data/Dog/Images', 84 | 'val_root': 'data/Dog/Images', 85 | 'train': 'data/dog_train.txt', 86 | 'val': 'data/dog_test.txt'}, 87 | } 88 | return dataset_dict[dataset_name] 89 | 90 | 91 | if __name__ == '__main__': 92 | config = getConfig() 93 | config = vars(config) 94 | dataConfig = getDatasetConfig('bird') 95 | # for k,v in config.items(): 96 | # print(k,v) 97 | 
98 | print(config) 99 | -------------------------------------------------------------------------------- /utils/convert_data.py: -------------------------------------------------------------------------------- 1 | ############################################################ 2 | # File: convert_data.py # 3 | # Created: 2019-10-31 19:06:54 # 4 | # Author : wvinzh # 5 | # Email : wvinzh@qq.com # 6 | # ------------------------------------------ # 7 | # Description:convert_data.py # 8 | # Copyright@2019 wvinzh, HUST # 9 | ############################################################ 10 | 11 | import os 12 | import random 13 | from scipy import io as scio 14 | import argparse 15 | def convert_bird(data_root): 16 | images_txt = os.path.join(data_root,'images.txt') 17 | train_val_txt = os.path.join(data_root,'train_test_split.txt') 18 | labels_txt = os.path.join(data_root,'image_class_labels.txt') 19 | 20 | id_name_dict = {} 21 | id_class_dict = {} 22 | id_train_val = {} 23 | with open(images_txt,'r',encoding='utf-8') as f: 24 | line = f.readline() 25 | while line: 26 | id,name = line.strip().split() 27 | id_name_dict[id] = name 28 | line = f.readline() 29 | 30 | with open(train_val_txt,'r',encoding='utf-8') as f: 31 | line = f.readline() 32 | while line: 33 | id,trainval = line.strip().split() 34 | id_train_val[id] = trainval 35 | line = f.readline() 36 | 37 | with open(labels_txt,'r',encoding='utf-8') as f: 38 | line = f.readline() 39 | while line: 40 | id,class_id = line.strip().split() 41 | id_class_dict[id] = int(class_id) 42 | line = f.readline() 43 | 44 | train_txt = os.path.join(data_root,'bird_train.txt') 45 | test_txt = os.path.join(data_root,'bird_test.txt') 46 | if os.path.exists(train_txt): 47 | os.remove(train_txt) 48 | if os.path.exists(test_txt): 49 | os.remove(test_txt) 50 | 51 | f1 = open(train_txt,'a',encoding='utf-8') 52 | f2 = open(test_txt,'a',encoding='utf-8') 53 | 54 | for id,trainval in id_train_val.items(): 55 | if trainval == '1': 56 | 
f1.write('%s %d\n' % (id_name_dict[id],id_class_dict[id]-1)) 57 | else: 58 | f2.write('%s %d\n' % (id_name_dict[id],id_class_dict[id]-1)) 59 | f1.close() 60 | f2.close() 61 | 62 | 63 | def convert_car(data_root): 64 | train_mat = data_root+'/cars_train_annos.mat' 65 | test_mat = data_root+'/cars_test_annos_withlabels.mat' 66 | 67 | # print(train_data) 68 | 69 | train_txt = data_root+'/car_train.txt' 70 | test_txt = data_root+'/car_test.txt' 71 | ### train txt 72 | train_data = scio.loadmat(train_mat) 73 | anno = train_data['annotations'] 74 | train_f = open(train_txt,'a') 75 | for r in anno[0]: 76 | # print(r,'============') 77 | _,_,_,_,label,name = r 78 | # print(label,name) 79 | train_f.write('%s %d\n' % (name[0],label[0][0]-1)) 80 | train_f.close() 81 | 82 | ### test txt 83 | test_data = scio.loadmat(test_mat) 84 | anno = test_data['annotations'] 85 | test_f = open(test_txt,'a') 86 | for r in anno[0]: 87 | # print(r,'============') 88 | _,_,_,_,label,name = r 89 | # print(label,name) 90 | test_f.write('%s %d\n' % (name[0],label[0][0]-1)) 91 | test_f.close() 92 | 93 | def convert_aircraft(root): 94 | # root = '.../Fine-grained/fgvc-aircraft-2013b/data' 95 | train_txt = root + '/images_variant_trainval.txt' 96 | test_txt = root + '/images_variant_test.txt' 97 | variant_txt = root + '/variants.txt' 98 | 99 | ### 100 | variants_dict = {} 101 | with open(variant_txt,'r') as f: 102 | lines = f.readlines() 103 | index = 0 104 | for line in lines: 105 | variant = line.strip() 106 | if variant in variants_dict: 107 | continue 108 | else: 109 | variants_dict[variant] = index 110 | index += 1 111 | # print(index) 112 | 113 | ### 114 | train_lst = root + '/aircraft_train.txt' 115 | test_lst = root + '/aircraft_test.txt' 116 | 117 | train_f = open(train_lst,'a') 118 | with open(train_txt,'r') as f: 119 | lines = f.readlines() 120 | for line in lines: 121 | # print(line) 122 | lst= line.strip().split(' ',1) 123 | # print(lst) 124 | name,label = lst 125 | name = name +'.jpg' 
126 | label = variants_dict[label] 127 | train_f.write('%s %d\n'%(name,label)) 128 | train_f.close() 129 | test_f = open(test_lst,'a') 130 | with open(test_txt,'r') as f: 131 | lines = f.readlines() 132 | for line in lines: 133 | name,label = line.strip().split(' ',1) 134 | name = name+'.jpg' 135 | label = variants_dict[label] 136 | test_f.write('%s %d\n'%(name,label)) 137 | test_f.close() 138 | 139 | def convert_dog(data_root): 140 | train_lst = data_root+'/train_list.mat' 141 | train_txt = data_root+'/dog_train.txt' 142 | info = scio.loadmat(train_lst)['file_list'] 143 | name_dict = {} 144 | index = 0 145 | # print(info) 146 | for i in info: 147 | # print(i[0]) 148 | name = i[0][0] 149 | cate = name.split('/')[0] 150 | if cate in name_dict: 151 | label = name_dict[cate] 152 | else: 153 | label = index 154 | name_dict[cate] = index 155 | index += 1 156 | # print(name,label) 157 | with open(train_txt,'a') as f: 158 | f.write('%s %d\n'%(name,label)) 159 | 160 | test_lst = data_root+'/test_list.mat' 161 | test_txt = data_root+'/dog_test.txt' 162 | info = scio.loadmat(test_lst)['file_list'] 163 | # print(info) 164 | for i in info: 165 | # print(i[0]) 166 | name = i[0][0] 167 | cate = name.split('/')[0] 168 | label = name_dict[cate] 169 | # print(name,label) 170 | with open(test_txt,'a') as f: 171 | f.write('%s %d\n'%(name,label)) 172 | 173 | 174 | if __name__ == '__main__': 175 | # convert_bird('/home/XXX/Dataset/Fine-grained/CUB_200_2011') 176 | # convert_car('/home/XXX/Dataset/Fine-grained/Car/devkit') 177 | # convert_aircraft('/home/XXX/Dataset/Fine-grained/fgvc-aircraft-2013b/data') 178 | # convert_dog('/home/XXX/Dataset/Fine-grained/dogs') 179 | parser = argparse.ArgumentParser() 180 | parser.add_argument('--dataset_name',type=str,default='bird') 181 | parser.add_argument('--root_path',type=str,default='.') 182 | arg = parser.parse_args() 183 | func = eval('convert_'+arg.dataset_name) 184 | func(arg.root_path) 185 | 
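
All four converters in `utils/convert_data.py` emit plain-text list files with one sample per line in the form `image_name label` (labels zero-based), which the `train`/`val` entries in `utils/config.py` point to and `CustomDataset` consumes. A minimal, self-contained sketch of a parser for that format (the helper name `read_list_file` is illustrative, not part of the repo):

```python
# Sketch of reading the "<image name> <integer label>" list files produced
# by convert_data.py (e.g. data/bird_train.txt). The real loading code lives
# in dataset/custom_dataset.py; this only illustrates the file format.
import os
import tempfile


def read_list_file(list_path):
    """Return a list of (image_name, label) pairs, one per non-empty line."""
    samples = []
    with open(list_path, 'r', encoding='utf-8') as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            # Split on the last space only, so an image name that happens
            # to contain spaces still parses correctly.
            name, label = line.rsplit(' ', 1)
            samples.append((name, int(label)))
    return samples


if __name__ == '__main__':
    # Tiny demo with a fabricated two-line list file.
    with tempfile.TemporaryDirectory() as d:
        p = os.path.join(d, 'bird_train.txt')
        with open(p, 'w', encoding='utf-8') as f:
            f.write('001.Black_footed_Albatross/img_0001.jpg 0\n')
            f.write('002.Laysan_Albatross/img_0002.jpg 1\n')
        print(read_list_file(p))
```

Note that the converters subtract 1 from the original class ids (`id_class_dict[id]-1`, `label[0][0]-1`) precisely so that these files carry zero-based labels suitable for `CrossEntropyLoss`.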
-------------------------------------------------------------------------------- /utils/engine.py: -------------------------------------------------------------------------------- 1 | ############################################################ 2 | # File: engine.py # 3 | # Created: 2019-11-20 15:02:13 # 4 | # Author : wvinzh # 5 | # Email : wvinzh@qq.com # 6 | # ------------------------------------------ # 7 | # Description:engine.py # 8 | # Copyright@2019 wvinzh, HUST # 9 | ############################################################ 10 | import time 11 | from utils import calculate_pooling_center_loss, mask2bbox 12 | from utils import attention_crop, attention_drop, attention_crop_drop 13 | from utils import getDatasetConfig, getConfig, getLogger 14 | from utils import accuracy, get_lr, save_checkpoint, AverageMeter, set_seed 15 | 16 | import torch 17 | import torch.nn.functional as F 18 | 19 | class Engine(): 20 | def __init__(self,): 21 | pass 22 | 23 | def train(self,state,epoch): 24 | batch_time = AverageMeter() 25 | data_time = AverageMeter() 26 | losses = AverageMeter() 27 | top1 = AverageMeter() 28 | top5 = AverageMeter() 29 | config = state['config'] 30 | print_freq = config.print_freq 31 | model = state['model'] 32 | criterion = state['criterion'] 33 | optimizer = state['optimizer'] 34 | train_loader = state['train_loader'] 35 | model.train() 36 | end = time.time() 37 | for i, (img, label) in enumerate(train_loader): 38 | # measure data loading time 39 | data_time.update(time.time() - end) 40 | 41 | target = label.cuda() 42 | input = img.cuda() 43 | # compute output 44 | attention_maps, raw_features, output1 = model(input) 45 | features = raw_features.reshape(raw_features.shape[0], -1) 46 | 47 | feature_center_loss, center_diff = calculate_pooling_center_loss( 48 | features, state['center'], target, alfa=config.alpha) 49 | 50 | # update model.centers 51 | state['center'][target] += center_diff 52 | 53 | # compute refined loss 54 | # img_drop = 
attention_drop(attention_maps,input) 55 | # img_crop = attention_crop(attention_maps, input) 56 | img_crop, img_drop = attention_crop_drop(attention_maps, input) 57 | _, _, output2 = model(img_drop) 58 | _, _, output3 = model(img_crop) 59 | 60 | loss1 = criterion(output1, target) 61 | loss2 = criterion(output2, target) 62 | loss3 = criterion(output3, target) 63 | 64 | loss = (loss1+loss2+loss3)/3 + feature_center_loss 65 | # measure accuracy and record loss 66 | prec1, prec5 = accuracy(output1, target, topk=(1, 5)) 67 | losses.update(loss.item(), input.size(0)) 68 | top1.update(prec1[0], input.size(0)) 69 | top5.update(prec5[0], input.size(0)) 70 | 71 | # compute gradient and do SGD step 72 | optimizer.zero_grad() 73 | loss.backward() 74 | optimizer.step() 75 | 76 | # measure elapsed time 77 | batch_time.update(time.time() - end) 78 | end = time.time() 79 | 80 | if i % print_freq == 0: 81 | print('Epoch: [{0}][{1}/{2}]\t' 82 | 'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\t' 83 | 'Data {data_time.val:.3f} ({data_time.avg:.3f})\t' 84 | 'Loss {loss.val:.4f} ({loss.avg:.4f})\t' 85 | 'Prec@1 {top1.val:.3f} ({top1.avg:.3f})\t' 86 | 'Prec@5 {top5.val:.3f} ({top5.avg:.3f})'.format( 87 | epoch, i, len(train_loader), batch_time=batch_time, 88 | data_time=data_time, loss=losses, top1=top1, top5=top5)) 89 | print("loss1,loss2,loss3,feature_center_loss", loss1.item(), loss2.item(), loss3.item(), 90 | feature_center_loss.item()) 91 | return top1.avg, losses.avg 92 | 93 | def validate(self,state): 94 | batch_time = AverageMeter() 95 | losses = AverageMeter() 96 | top1 = AverageMeter() 97 | top5 = AverageMeter() 98 | 99 | config = state['config'] 100 | print_freq = config.print_freq 101 | model = state['model'] 102 | val_loader = state['val_loader'] 103 | criterion = state['criterion'] 104 | # switch to evaluate mode 105 | model.eval() 106 | with torch.no_grad(): 107 | end = time.time() 108 | for i, (input, target) in enumerate(val_loader): 109 | target = target.cuda() 110 | 
input = input.cuda() 111 | # forward 112 | attention_maps, raw_features, output1 = model(input) 113 | features = raw_features.reshape(raw_features.shape[0], -1) 114 | feature_center_loss, _ = calculate_pooling_center_loss( 115 | features, state['center'], target, alfa=config.alpha) 116 | 117 | img_crop, img_drop = attention_crop_drop(attention_maps, input) 118 | # img_drop = attention_drop(attention_maps,input) 119 | # img_crop = attention_crop(attention_maps,input) 120 | _, _, output2 = model(img_drop) 121 | _, _, output3 = model(img_crop) 122 | loss1 = criterion(output1, target) 123 | loss2 = criterion(output2, target) 124 | loss3 = criterion(output3, target) 125 | # loss = loss1 + feature_center_loss 126 | loss = (loss1+loss2+loss3)/3+feature_center_loss 127 | # measure accuracy and record loss 128 | prec1, prec5 = accuracy(output1, target, topk=(1, 5)) 129 | losses.update(loss.item(), input.size(0)) 130 | top1.update(prec1[0], input.size(0)) 131 | top5.update(prec5[0], input.size(0)) 132 | 133 | # measure elapsed time 134 | batch_time.update(time.time() - end) 135 | end = time.time() 136 | 137 | if i % print_freq == 0: 138 | print('Test: [{0}/{1}]\t' 139 | 'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\t' 140 | 'Loss {loss.val:.4f} ({loss.avg:.4f})\t' 141 | 'Prec@1 {top1.val:.3f} ({top1.avg:.3f})\t' 142 | 'Prec@5 {top5.val:.3f} ({top5.avg:.3f})'.format( 143 | i, len(val_loader), batch_time=batch_time, loss=losses, 144 | top1=top1, top5=top5)) 145 | 146 | print(' * Prec@1 {top1.avg:.3f} Prec@5 {top5.avg:.3f}' 147 | .format(top1=top1, top5=top5)) 148 | 149 | return top1.avg, losses.avg 150 | 151 | def test(self,val_loader, model, criterion): 152 | top1 = AverageMeter() 153 | top5 = AverageMeter() 154 | print_freq = 100 155 | # switch to evaluate mode 156 | model.eval() 157 | with torch.no_grad(): 158 | for i, (input, target) in enumerate(val_loader): 159 | target = target.cuda() 160 | input = input.cuda() 161 | # forward 162 | attention_maps, _, output1 = 
model(input) 163 | refined_input = mask2bbox(attention_maps, input) 164 | _, _, output2 = model(refined_input) 165 | output = (F.softmax(output1, dim=-1)+F.softmax(output2, dim=-1))/2 166 | # measure accuracy and record loss 167 | prec1, prec5 = accuracy(output, target, topk=(1, 5)) 168 | top1.update(prec1[0], input.size(0)) 169 | top5.update(prec5[0], input.size(0)) 170 | 171 | if i % print_freq == 0: 172 | print('Test: [{0}/{1}]\t' 173 | 'Prec@1 {top1.val:.3f} ({top1.avg:.3f})\t' 174 | 'Prec@5 {top5.val:.3f} ({top5.avg:.3f})'.format( 175 | i, len(val_loader), 176 | top1=top1, top5=top5)) 177 | 178 | print(' * Prec@1 {top1.avg:.3f} Prec@5 {top5.avg:.3f}' 179 | .format(top1=top1, top5=top5)) 180 | 181 | return top1.avg, top5.avg 182 | 183 | 184 | if __name__ == '__main__': 185 | 186 | engine = Engine() 187 | # Engine.train/validate/test require a populated state dict (model, loaders, optimizer, ...); see train_bap.py 188 | -------------------------------------------------------------------------------- /utils/meter.py: -------------------------------------------------------------------------------- 1 | ############################################################ 2 | # File: meter.py # 3 | # Created: 2019-11-05 11:36:34 # 4 | # Author : wvinzh # 5 | # Email : wvinzh@qq.com # 6 | # ------------------------------------------ # 7 | # Description:meter.py # 8 | # Copyright@2019 wvinzh, HUST # 9 | ############################################################ 10 | 11 | class AverageMeter(object): 12 | """Computes and stores the average and current value""" 13 | def __init__(self): 14 | self.reset() 15 | 16 | def reset(self): 17 | self.val = 0 18 | self.avg = 0 19 | self.sum = 0 20 | self.count = 0 21 | 22 | def update(self, val, n=1): 23 | self.val = val 24 | self.sum += val * n 25 | self.count += n 26 | self.avg = self.sum / self.count -------------------------------------------------------------------------------- /utils/utils.py: -------------------------------------------------------------------------------- 1 | 
############################################################ 2 | # File: utils.py # 3 | # Created: 2019-11-18 20:50:50 # 4 | # Author : wvinzh # 5 | # Email : wvinzh@qq.com # 6 | # ------------------------------------------ # 7 | # Description:utils.py # 8 | # Copyright@2019 wvinzh, HUST # 9 | ############################################################ 10 | 11 | 12 | import os 13 | import random 14 | import numpy as np 15 | import torch 16 | import shutil 17 | import logging 18 | def getLogger(name='logger',filename=''): 19 | # get a logger with the given name 20 | logger = logging.getLogger(name) 21 | 22 | # set the logger level to DEBUG 23 | logger.setLevel(logging.DEBUG) 24 | 25 | # log to a file when a filename is given, otherwise to the console 26 | if filename: 27 | if os.path.exists(filename): 28 | os.remove(filename) 29 | hdl = logging.FileHandler(filename, mode='a', encoding='utf-8', delay=False) 30 | else: 31 | hdl = logging.StreamHandler() 32 | 33 | format = '%(asctime)s [%(levelname)s] at %(filename)s,%(lineno)d: %(message)s' 34 | datefmt = '%Y-%m-%d(%a)%H:%M:%S' 35 | formatter = logging.Formatter(format,datefmt) 36 | hdl.setFormatter(formatter) 37 | # attach the handler to the logger 38 | logger.addHandler(hdl) 39 | return logger 40 | def set_seed(seed=0): 41 | os.environ['PYTHONHASHSEED'] = str(seed) 42 | random.seed(seed) 43 | np.random.seed(seed) 44 | torch.manual_seed(seed) 45 | torch.cuda.manual_seed(seed) 46 | torch.cuda.manual_seed_all(seed) # if you are using multi-GPU. 
47 | torch.backends.cudnn.benchmark = False 48 | torch.backends.cudnn.deterministic = True 49 | 50 | def _init_fn(worker_id): 51 | set_seed(worker_id) 52 | # np.random.seed() 53 | 54 | 55 | def get_lr(optimizer): 56 | for param_group in optimizer.param_groups: 57 | old_lr = float(param_group['lr']) 58 | return old_lr 59 | 60 | 61 | def accuracy(output, target, topk=(1,)): 62 | """Computes the precision@k for the specified values of k""" 63 | with torch.no_grad(): 64 | maxk = max(topk) 65 | batch_size = target.size(0) 66 | 67 | _, pred = output.topk(maxk, 1, True, True) 68 | pred = pred.t() 69 | correct = pred.eq(target.view(1, -1).expand_as(pred)) 70 | 71 | res = [] 72 | for k in topk: 73 | correct_k = correct[:k].view(-1).float().sum(0, keepdim=True) 74 | res.append(correct_k.mul_(100.0 / batch_size)) 75 | return res 76 | 77 | 78 | def save_checkpoint(state, is_best, path='checkpoint', filename='checkpoint.pth.tar'): 79 | if not os.path.exists(path): 80 | os.makedirs(path) 81 | full_path = os.path.join(path, filename) 82 | torch.save(state, full_path) 83 | if is_best: 84 | shutil.copyfile(full_path, os.path.join(path, 'model_best.pth.tar')) 85 | print("Save best model at %s" % 86 | os.path.join(path, 'model_best.pth.tar')) --------------------------------------------------------------------------------
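
A quick standalone sketch of how `AverageMeter` (from `utils/meter.py` above) is driven by the train/validate loops in `utils/engine.py`: `update(val, n)` weights each reading by the batch size `n`, which is why the loops pass `input.size(0)`. The class body is copied from the repo so the snippet is self-contained; the loss values are made up for illustration.

```python
class AverageMeter(object):
    """Computes and stores the average and current value (as in utils/meter.py)."""
    def __init__(self):
        self.reset()

    def reset(self):
        self.val = 0
        self.avg = 0
        self.sum = 0
        self.count = 0

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n      # weight the per-batch mean by the batch size n
        self.count += n
        self.avg = self.sum / self.count

losses = AverageMeter()
losses.update(0.5, n=32)   # first batch: 32 samples, mean loss 0.5
losses.update(1.0, n=16)   # smaller final batch: 16 samples, mean loss 1.0
print(losses.val)          # 1.0  (most recent reading)
print(losses.avg)          # 0.666...  = (0.5*32 + 1.0*16) / 48, not (0.5+1.0)/2
```

The per-sample weighting matters when the last batch of an epoch is smaller than the rest: a plain mean of per-batch means would over-weight it, while `sum / count` stays a true per-sample average.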