├── .gitattributes
├── README.md
├── VOC_data
│   └── VOC_mask
│       ├── create_dataset.py
│       ├── data_list
│       │   ├── test.txt
│       │   ├── train.txt
│       │   ├── trainval.txt
│       │   └── val.txt
│       ├── data_log.py
│       ├── data_proc.py
│       └── test_img
│           ├── 01.png
│           ├── 01_out.jpg
│           ├── 01_out_show.png
│           ├── 02.jpg
│           ├── 02_out.jpg
│           └── mask_00025_o1.jpg
├── _mask_test.txt
├── _mask_train.txt
├── _mask_val.txt
├── coco_annotation.py
├── convert.py
├── darknet53.cfg
├── detect_batch.py
├── font
│   ├── FiraMono-Medium.otf
│   └── SIL Open Font License.txt
├── kmeans.py
├── model_data
│   ├── coco_classes.txt
│   ├── tiny_yolo_anchors.txt
│   ├── voc_classes.txt
│   └── yolo_anchors.txt
├── train.py
├── train_bottleneck.py
├── voc_annotation.py
├── yolo.py
├── yolo3
│   ├── __init__.py
│   ├── __pycache__
│   │   ├── __init__.cpython-35.pyc
│   │   ├── __init__.cpython-36.pyc
│   │   ├── model.cpython-35.pyc
│   │   ├── model.cpython-36.pyc
│   │   ├── utils.cpython-35.pyc
│   │   └── utils.cpython-36.pyc
│   ├── model.py
│   └── utils.py
├── yolo_video.py
├── yolov3-tiny.cfg
└── yolov3.cfg

--------------------------------------------------------------------------------
/.gitattributes:
--------------------------------------------------------------------------------
1 | # Auto detect text files and perform LF normalization
2 | * text=auto
3 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # MaskDetect
2 | 
3 | Deep-learning-based face mask wearing detection, implemented with Keras-YOLOv3.
4 | 
5 | ## Preview
6 | 
7 | ![nomask_sample](VOC_data/VOC_mask/test_img/01_out_show.png)
8 | 
9 | ![rightmask_sample](VOC_data/VOC_mask/test_img/02_out.jpg)
10 | 
11 | ## Testing
12 | 
13 | ### Download the model files
14 | 
15 | The model files are hosted on Baidu Cloud: [MaskDetect_model (extraction code: yphs)](https://pan.baidu.com/s/1L9g8dvM8tn0wZkHM47lsfw)
16 | 
17 | The model files are described below:
18 | | Model file | Description |
19 | |-----------------------------|--------------|
20 | | trained\_weights\_final\_12385\.h5 | Model trained on the large dataset |
21 | | trained\_weights\_final\_147\.h5 | Model trained on the initial dataset * |
22 | | yolo\_weights\.h5 | Official YOLO pre-trained weights |
23 | 
24 | After downloading, place the model files in the `model_data` directory.
25 | 
26 | \* When using the initial model, change the class names in `voc_annotation.py` and `model_data/voc_classes.txt` to `rightmask` `wrongmask` `nomask`.
27 | 
28 | ### Run a test
29 | 
30 | For images: `python3 yolo_video.py [OPTIONS...] --image`
31 | 
32 | For videos: `python3 yolo_video.py [video_path] [output_path (optional)]`
33 | 
34 | The full list of options for `yolo_video.py` is shown by `python3 yolo_video.py --help`.
35 | 
36 | ## Training
37 | 
38 | ### Prepare the dataset
39 | 
40 | Prepare your dataset in the VOC dataset format; the `VOC_data/VOC_mask` directory holds the data.
41 | 
42 | ```sh
43 | VOC_mask
44 | ├─data_list # dataset list files, preferably generated by create_dataset.py
45 | ├─img # image files
46 | ├─label # image annotations in XML format
47 | └─test_img # two sample images, unrelated to training
48 | ```
49 | 
50 | ### Change the class names
51 | 
52 | Edit the classes array on line 7 of `voc_annotation.py` in the repository root to match the classes in your data.
53 | Edit `voc_classes.txt` in the `model_data` directory to match the classes in your data.
54 | 
55 | ### Process the data and generate the index
56 | 
57 | `data_proc.py` in `VOC_data/VOC_mask` helps clean up a dataset by renaming files uniformly and matching images to labels; if your dataset is already well-formed, it is not needed.
58 | 
59 | After placing the images in `img` and the labels in `label`, run `create_dataset.py` in `VOC_data/VOC_mask`; it generates the four list files `test.txt` `train.txt` `trainval.txt` `val.txt` in the `data_list` directory.
60 | 
61 | In the repository root, run `voc_annotation.py`; it generates the data lists used for training in the root directory.
62 | 
63 | ### Start training
64 | 
65 | In the repository root, run `train.py` to start training. Adjust the parameters in `train.py` as needed.
66 | 
67 | ## Details
68 | 
69 | ### Environment
70 | 
71 | Training and testing were done in the following environment:
72 | 
73 | * **Python:** 3.6.9
74 | * **Keras:** 2.2.0
75 | * **TensorFlow:** 1.6.0
76 | 
77 | ### Dataset
78 | 
79 | #### Updated dataset (12385 images)
80 | 
81 | We trained on a new, larger dataset, composed as follows:
82 | 
83 | | Face sample category | Count |
84 | |-|-|
85 | | Masked face samples | 7056 |
86 | | Unmasked face samples | 20318 |
87 | | Total | 27374 |
88 | 
89 | | Image category | Count |
90 | |-|-|
91 | | Images with only masked samples | 3943 |
92 | | Images with only unmasked samples | 8110 |
93 | | Images with both masked and unmasked samples | 329 |
94 | | Total * | 12385 |
95 | 
96 | \* Includes 3 unprocessed invalid samples (`03580` `05301` `06124`)
97 | 
98 | The new dataset is a union of the following three open-source datasets; we thank their contributors:
99 | 
100 | * The updated [hikariming/virus-mask-dataset](https://github.com/hikariming/virus-mask-dataset)
101 | * The mask dataset open-sourced by AIZOO
102 | * The mask dataset open-sourced by [hamlinzheng](https://github.com/hamlinzheng)
103 | 
104 | #### Initial dataset (147 images)
105 | 
106 |
The dataset used for training: [hikariming/virus-mask-dataset](https://github.com/hikariming/virus-mask-dataset)
107 | The dataset has three classes: mask worn correctly (rightmask), mask worn incorrectly (wrongmask), and no mask (nomask). After data cleaning, 147 valid samples remained for training and evaluation.
108 | 
109 | ### Training
110 | 
111 | To address the imbalance between positive and negative samples, oversampling was applied.
112 | 
113 | For transfer learning, training continued from the official YOLO pre-trained weights for 100 epochs, bringing the loss down to around 22.
114 | 
115 | ### Limitations
116 | 
117 | In practice, some faces in a single image may be missed.
118 | 
119 | ## Citations and acknowledgements
120 | 
121 | Training used the public mask-detection datasets released by hikariming, AIZOO, and hamlinzheng.
122 | 
123 | The Keras-YOLOv3 framework comes from [qqwweee/keras-yolo3](https://github.com/qqwweee/keras-yolo3).
124 | 
125 | Many thanks to them!
126 | 
--------------------------------------------------------------------------------
/VOC_data/VOC_mask/create_dataset.py:
--------------------------------------------------------------------------------
1 | import os
2 | import random
3 | 
4 | # Fraction of samples held out into trainval.txt; the rest go to train.txt.
5 | trainval_percent = 0.2
6 | # Fraction of the held-out pool written to test.txt; the rest go to val.txt.
7 | train_percent = 0.8
8 | xmlfilepath = 'label'
9 | total_xml = os.listdir(xmlfilepath)
10 | 
11 | num = len(total_xml)
12 | indices = range(num)
13 | tv = int(num * trainval_percent)
14 | tr = int(tv * train_percent)
15 | trainval = random.sample(indices, tv)
16 | test = random.sample(trainval, tr)
17 | 
18 | ftrainval = open('data_list/trainval.txt', 'w')
19 | ftest = open('data_list/test.txt', 'w')
20 | ftrain = open('data_list/train.txt', 'w')
21 | fval = open('data_list/val.txt', 'w')
22 | 
23 | for i in indices:
24 |     name = total_xml[i][:-4] + '\n'  # strip the '.xml' extension
25 |     if i in trainval:
26 |         ftrainval.write(name)
27 |         if i in test:
28 |             ftest.write(name)
29 |         else:
30 |             fval.write(name)
31 |     else:
32 |         ftrain.write(name)
33 | 
34 | ftrainval.close()
35 | ftrain.close()
36 | fval.close()
37 | ftest.close()
38 | 
--------------------------------------------------------------------------------
/VOC_data/VOC_mask/data_list/test.txt:
--------------------------------------------------------------------------------
1 | mask_07320_o1
2 | mask_06115_o2
3 | mask_07586
4 | mask_12062
5 | mask_05304_o2
6 | mask_01274
7 | mask_09099_o1
8 | mask_06630_o2
9 | mask_01836
10 |
mask_03653 11 | mask_01770 12 | mask_04542 13 | mask_07704_o2 14 | mask_08415_o1 15 | mask_01017 16 | mask_07377_o1 17 | mask_01805 18 | mask_10018 19 | mask_03933 20 | mask_00846 21 | mask_11696 22 | mask_00449_o2 23 | mask_11125 24 | mask_09847 25 | mask_11042_o2 26 | mask_05555 27 | mask_06405_o1 28 | mask_08579_o2 29 | mask_11234_o2 30 | mask_06223_o2 31 | mask_01438 32 | mask_06851 33 | mask_07151 34 | mask_00117_o2 35 | mask_08713_o2 36 | mask_11436_o2 37 | mask_11539_o2 38 | mask_05603_o2 39 | mask_08058_o2 40 | mask_07841 41 | mask_07408 42 | mask_09131 43 | mask_07167_o2 44 | mask_05011 45 | mask_09293 46 | mask_06926 47 | mask_08954 48 | mask_06674 49 | mask_06701_o2 50 | mask_06650 51 | mask_07382_o1 52 | mask_11000_o1 53 | mask_09734 54 | mask_02681 55 | mask_06208_o1 56 | mask_08766_o2 57 | mask_12216 58 | mask_06503 59 | mask_09103_o1 60 | mask_04724 61 | mask_03851 62 | mask_10691 63 | mask_06920_o1 64 | mask_01440 65 | mask_00140_o2 66 | mask_10492 67 | mask_06290_o2 68 | mask_08399 69 | mask_00303 70 | mask_08537_o1 71 | mask_00089_o1 72 | mask_03308 73 | mask_05858_o2 74 | mask_11274_o1 75 | mask_05429_o1 76 | mask_07397 77 | mask_04965 78 | mask_06631_o1 79 | mask_09598 80 | mask_04978 81 | mask_00424_o1 82 | mask_06683_o2 83 | mask_02631 84 | mask_01800 85 | mask_03328 86 | mask_02185 87 | mask_00417_o1 88 | mask_07433_o2 89 | mask_08486 90 | mask_08335 91 | mask_08516 92 | mask_06533_o2 93 | mask_11506 94 | mask_07356_o1 95 | mask_07731_o2 96 | mask_10578 97 | mask_03044 98 | mask_00480 99 | mask_05492 100 | mask_04955 101 | mask_05121_o2 102 | mask_05394 103 | mask_03990 104 | mask_05780_o2 105 | mask_04841 106 | mask_06631_o2 107 | mask_09490_o1 108 | mask_05185 109 | mask_06017_o1 110 | mask_06676_o2 111 | mask_07861_o2 112 | mask_07058 113 | mask_07931_o1 114 | mask_11235_o1 115 | mask_07829_o2 116 | mask_06373 117 | mask_00459_o1 118 | mask_04677 119 | mask_01082 120 | mask_08449 121 | mask_06514_o2 122 | mask_08846_o2 123 | mask_09439 124 
| mask_02339 125 | mask_06996 126 | mask_02940 127 | mask_05843 128 | mask_03386 129 | mask_03643 130 | mask_05797_o2 131 | mask_06521 132 | mask_00255 133 | mask_00251_o2 134 | mask_07233 135 | mask_09336 136 | mask_06826 137 | mask_07135 138 | mask_01688 139 | mask_11081_o2 140 | mask_11636 141 | mask_00025_o1 142 | mask_02256 143 | mask_08272_o1 144 | mask_09529 145 | mask_08680 146 | mask_05252 147 | mask_09574 148 | mask_11129 149 | mask_11326 150 | mask_11851 151 | mask_01682 152 | mask_04689 153 | mask_10042 154 | mask_04913 155 | mask_04686 156 | mask_02710 157 | mask_03409 158 | mask_06411 159 | mask_03976 160 | mask_01073 161 | mask_11676 162 | mask_05158_o1 163 | mask_08818 164 | mask_00999 165 | mask_06685_o2 166 | mask_09400 167 | mask_07548 168 | mask_05694 169 | mask_05426_o2 170 | mask_06103_o2 171 | mask_05461 172 | mask_03269 173 | mask_06053_o2 174 | mask_10998_o1 175 | mask_04592 176 | mask_08782_o2 177 | mask_07996 178 | mask_11573_o2 179 | mask_03461 180 | mask_09575 181 | mask_05798_o1 182 | mask_05222_o2 183 | mask_00324 184 | mask_11454 185 | mask_09441 186 | mask_07636_o2 187 | mask_05565 188 | mask_05762 189 | mask_10933 190 | mask_08126_o1 191 | mask_07962 192 | mask_00210_o2 193 | mask_07812_o1 194 | mask_00088_o2 195 | mask_06953_o1 196 | mask_00196 197 | mask_06936_o2 198 | mask_08771 199 | mask_00648 200 | mask_02156 201 | mask_08692 202 | mask_00223 203 | mask_04715 204 | mask_05150_o2 205 | mask_07881 206 | mask_02024 207 | mask_06796_o1 208 | mask_04779 209 | mask_03801 210 | mask_09130 211 | mask_07360_o2 212 | mask_10027 213 | mask_08237_o1 214 | mask_11537_o2 215 | mask_05570_o1 216 | mask_01510 217 | mask_04289 218 | mask_10288 219 | mask_11297_o1 220 | mask_08328_o1 221 | mask_05648 222 | mask_03517 223 | mask_08201 224 | mask_06047_o2 225 | mask_07967_o1 226 | mask_06068 227 | mask_06299_o1 228 | mask_05662_o1 229 | mask_00206_o1 230 | mask_05300_o2 231 | mask_08401_o2 232 | mask_00916 233 | mask_05043 234 | mask_10117 235 | 
mask_06664 236 | mask_00240_o2 237 | mask_05093_o2 238 | mask_08899_o1 239 | mask_01319 240 | mask_06031_o1 241 | mask_00493_o2 242 | mask_12011 243 | mask_05181_o1 244 | mask_00307_o1 245 | mask_04615 246 | mask_06762 247 | mask_06942 248 | mask_06588 249 | mask_09925 250 | mask_11427 251 | mask_03232 252 | mask_05443 253 | mask_06010 254 | mask_05124_o1 255 | mask_11818 256 | mask_08342 257 | mask_01914 258 | mask_07120 259 | mask_05076 260 | mask_05270 261 | mask_02307 262 | mask_07680_o1 263 | mask_08624 264 | mask_07791 265 | mask_02581 266 | mask_04184 267 | mask_10064 268 | mask_08940_o2 269 | mask_05065_o2 270 | mask_00951 271 | mask_12161 272 | mask_10575 273 | mask_05344_o1 274 | mask_09994 275 | mask_04866 276 | mask_03673 277 | mask_11476_o1 278 | mask_06982 279 | mask_11188 280 | mask_03996 281 | mask_04644 282 | mask_00246_o1 283 | mask_08055 284 | mask_04055 285 | mask_01072 286 | mask_08822_o1 287 | mask_08991_o2 288 | mask_09851 289 | mask_09062 290 | mask_12129 291 | mask_00357 292 | mask_03060 293 | mask_06268 294 | mask_07833 295 | mask_09209 296 | mask_00049 297 | mask_00469_o1 298 | mask_05377_o1 299 | mask_02532 300 | mask_08710 301 | mask_10781 302 | mask_11222 303 | mask_00778 304 | mask_00228 305 | mask_07251_o2 306 | mask_09316 307 | mask_06229 308 | mask_08820_o1 309 | mask_06610_o2 310 | mask_07060_o1 311 | mask_05973 312 | mask_07181_o1 313 | mask_07305_o2 314 | mask_11066_o1 315 | mask_01756 316 | mask_07658 317 | mask_04007 318 | mask_05707 319 | mask_08469 320 | mask_05150_o1 321 | mask_07150_o2 322 | mask_09685 323 | mask_04194 324 | mask_03062 325 | mask_00345_o2 326 | mask_07107 327 | mask_07942_o2 328 | mask_08925 329 | mask_06785_o1 330 | mask_07706_o1 331 | mask_00470_o1 332 | mask_06119_o1 333 | mask_05535 334 | mask_04397 335 | mask_10525 336 | mask_01086 337 | mask_07887 338 | mask_04241 339 | mask_07972 340 | mask_00092_o1 341 | mask_03627 342 | mask_05364 343 | mask_05700_o1 344 | mask_00870 345 | mask_12094 346 | 
mask_05903_o2 347 | mask_06685_o1 348 | mask_07008_o1 349 | mask_11155_o2 350 | mask_10834 351 | mask_08935 352 | mask_09005_o1 353 | mask_10565 354 | mask_02346 355 | mask_04586 356 | mask_08681 357 | mask_07099_o2 358 | mask_06119 359 | mask_01641 360 | mask_08521_o2 361 | mask_00039_o2 362 | mask_09898 363 | mask_00295_o2 364 | mask_04266 365 | mask_11762 366 | mask_06274_o2 367 | mask_09087_o2 368 | mask_11007_o1 369 | mask_06138_o2 370 | mask_11875 371 | mask_08849_o1 372 | mask_07940_o1 373 | mask_06756_o1 374 | mask_08275 375 | mask_12242 376 | mask_08551 377 | mask_11217_o2 378 | mask_07390 379 | mask_00094 380 | mask_06612 381 | mask_09975 382 | mask_06186_o1 383 | mask_06792_o2 384 | mask_09553 385 | mask_09365 386 | mask_05645 387 | mask_08477_o1 388 | mask_08331 389 | mask_01239 390 | mask_09312 391 | mask_10152 392 | mask_06779_o1 393 | mask_11480 394 | mask_06020_o1 395 | mask_08520 396 | mask_10931 397 | mask_02899 398 | mask_06864_o1 399 | mask_09212 400 | mask_06894_o1 401 | mask_04253 402 | mask_06858 403 | mask_11060 404 | mask_10184 405 | mask_11602_o2 406 | mask_00854 407 | mask_07884 408 | mask_08356 409 | mask_05205_o2 410 | mask_05778_o1 411 | mask_08762 412 | mask_00385_o2 413 | mask_08215 414 | mask_03919 415 | mask_03724 416 | mask_00480_o1 417 | mask_05140_o1 418 | mask_08967_o2 419 | mask_00726 420 | mask_07159 421 | mask_03888 422 | mask_02351 423 | mask_05538 424 | mask_10498 425 | mask_08979 426 | mask_10995_o2 427 | mask_07745_o2 428 | mask_06392 429 | mask_07792_o2 430 | mask_11136_o1 431 | mask_00561 432 | mask_08108_o2 433 | mask_06950_o2 434 | mask_05576_o2 435 | mask_00661 436 | mask_06482_o1 437 | mask_02981 438 | mask_05658 439 | mask_09538 440 | mask_05503_o2 441 | mask_08370 442 | mask_06851_o2 443 | mask_01969 444 | mask_09965 445 | mask_06203 446 | mask_01600 447 | mask_01026 448 | mask_06686_o1 449 | mask_10822 450 | mask_10275 451 | mask_11983 452 | mask_02721 453 | mask_00180 454 | mask_00131_o2 455 | mask_10238 456 | 
mask_05756_o2 457 | mask_00146 458 | mask_06445_o2 459 | mask_08400_o2 460 | mask_08382 461 | mask_05442_o2 462 | mask_07808_o1 463 | mask_08166 464 | mask_00164 465 | mask_08856 466 | mask_00098_o2 467 | mask_07175 468 | mask_03431 469 | mask_07789_o1 470 | mask_08186_o1 471 | mask_06651_o1 472 | mask_03119 473 | mask_08798_o1 474 | mask_00452 475 | mask_11562 476 | mask_11857 477 | mask_06519_o2 478 | mask_06719_o2 479 | mask_00321_o2 480 | mask_07504_o2 481 | mask_05842 482 | mask_07277_o1 483 | mask_05407_o2 484 | mask_07320 485 | mask_06146_o1 486 | mask_05179_o2 487 | mask_08932_o1 488 | mask_00145_o1 489 | mask_04292 490 | mask_09386 491 | mask_05540_o1 492 | mask_05750_o1 493 | mask_05513 494 | mask_10030 495 | mask_10715 496 | mask_08517_o2 497 | mask_00462 498 | mask_00242_o2 499 | mask_08707_o2 500 | mask_10225 501 | mask_06545 502 | mask_06079 503 | mask_06221_o2 504 | mask_00144_o1 505 | mask_08171_o1 506 | mask_00474 507 | mask_07496_o2 508 | mask_11035_o1 509 | mask_00569 510 | mask_10146 511 | mask_08025 512 | mask_05702 513 | mask_00488_o1 514 | mask_07807_o2 515 | mask_08355 516 | mask_09935 517 | mask_07212_o2 518 | mask_05144 519 | mask_09085_o2 520 | mask_09745 521 | mask_00438_o1 522 | mask_03857 523 | mask_12092 524 | mask_06884_o1 525 | mask_05379_o1 526 | mask_07452_o1 527 | mask_06699 528 | mask_01419 529 | mask_02148 530 | mask_00862 531 | mask_07855_o1 532 | mask_01203 533 | mask_07425 534 | mask_00032_o2 535 | mask_07903 536 | mask_04396 537 | mask_11433_o2 538 | mask_07811_o1 539 | mask_02414 540 | mask_01263 541 | mask_05448_o1 542 | mask_02142 543 | mask_11774 544 | mask_07358 545 | mask_06117_o2 546 | mask_10888 547 | mask_03094 548 | mask_05741_o1 549 | mask_08732_o2 550 | mask_00570 551 | mask_10683 552 | mask_07883 553 | mask_07069_o1 554 | mask_07054_o1 555 | mask_11008 556 | mask_11247_o2 557 | mask_01337 558 | mask_06658_o2 559 | mask_00348_o2 560 | mask_04784 561 | mask_00155_o2 562 | mask_10973_o1 563 | mask_07470_o2 564 | 
mask_00287_o1 565 | mask_11415 566 | mask_05601_o2 567 | mask_06714 568 | mask_05877 569 | mask_08393 570 | mask_11539_o1 571 | mask_07227_o2 572 | mask_00696 573 | mask_05217 574 | mask_08852_o2 575 | mask_05585_o1 576 | mask_06267_o1 577 | mask_08626 578 | mask_00174_o1 579 | mask_08906_o2 580 | mask_09540 581 | mask_05620_o1 582 | mask_07745_o1 583 | mask_09159 584 | mask_11257_o2 585 | mask_07985 586 | mask_05679 587 | mask_00849 588 | mask_05891 589 | mask_10409 590 | mask_07654_o2 591 | mask_08729_o1 592 | mask_03105 593 | mask_06162_o1 594 | mask_10482 595 | mask_00806 596 | mask_10915 597 | mask_12105 598 | mask_06081 599 | mask_10429 600 | mask_06750 601 | mask_07595 602 | mask_06646_o2 603 | mask_05554_o1 604 | mask_08738_o1 605 | mask_08314_o1 606 | mask_08543_o2 607 | mask_06819_o1 608 | mask_11435_o2 609 | mask_11564_o1 610 | mask_08491 611 | mask_00522 612 | mask_10986_o1 613 | mask_10037 614 | mask_05908 615 | mask_07381 616 | mask_00267_o2 617 | mask_02646 618 | mask_10966_o1 619 | mask_05830 620 | mask_04234 621 | mask_06952_o2 622 | mask_07959_o1 623 | mask_07015_o2 624 | mask_01854 625 | mask_11123_o1 626 | mask_07684_o1 627 | mask_11137 628 | mask_07084 629 | mask_07372 630 | mask_06086_o2 631 | mask_07751_o2 632 | mask_00113 633 | mask_07351 634 | mask_08520_o1 635 | mask_01629 636 | mask_08943_o2 637 | mask_08687 638 | mask_05166_o1 639 | mask_05875 640 | mask_08618_o2 641 | mask_06028 642 | mask_06305_o2 643 | mask_00258_o2 644 | mask_05571_o1 645 | mask_08395 646 | mask_11300 647 | mask_04675 648 | mask_00095 649 | mask_06607_o2 650 | mask_07732_o1 651 | mask_11272 652 | mask_00388_o1 653 | mask_08435 654 | mask_00441 655 | mask_05251 656 | mask_07971 657 | mask_05556 658 | mask_09093 659 | mask_09242 660 | mask_00402 661 | mask_06005_o2 662 | mask_07412 663 | mask_06098_o2 664 | mask_03885 665 | mask_11459_o2 666 | mask_04262 667 | mask_05137_o1 668 | mask_08317 669 | mask_05142_o2 670 | mask_11249 671 | mask_06502 672 | mask_04560 673 | 
mask_07026_o2 674 | mask_03197 675 | mask_10326 676 | mask_07890 677 | mask_00041_o1 678 | mask_05787_o1 679 | mask_05666_o2 680 | mask_09014_o2 681 | mask_00484_o1 682 | mask_04153 683 | mask_03694 684 | mask_08234_o1 685 | mask_06771 686 | mask_05088_o2 687 | mask_00476 688 | mask_08771_o1 689 | mask_03230 690 | mask_00079_o2 691 | mask_07663 692 | mask_08003 693 | mask_00261_o1 694 | mask_08829_o1 695 | mask_11374 696 | mask_00369_o2 697 | mask_10599 698 | mask_02575 699 | mask_04674 700 | mask_06033 701 | mask_05950_o2 702 | mask_07441_o2 703 | mask_01876 704 | mask_04629 705 | mask_03745 706 | mask_11920 707 | mask_10164 708 | mask_10804 709 | mask_09544 710 | mask_12296 711 | mask_00082_o1 712 | mask_01242 713 | mask_11867 714 | mask_06343_o2 715 | mask_01177 716 | mask_06085_o2 717 | mask_02102 718 | mask_00699 719 | mask_01213 720 | mask_05012 721 | mask_06718 722 | mask_05314_o1 723 | mask_11970 724 | mask_02258 725 | mask_06068_o1 726 | mask_00087_o2 727 | mask_11069 728 | mask_08833 729 | mask_05732 730 | mask_07928_o1 731 | mask_11589_o1 732 | mask_08342_o2 733 | mask_02985 734 | mask_00587 735 | mask_05339_o2 736 | mask_08728_o2 737 | mask_03317 738 | mask_05458 739 | mask_00552 740 | mask_10452 741 | mask_05192_o1 742 | mask_05350_o2 743 | mask_00468_o2 744 | mask_05327_o1 745 | mask_06358 746 | mask_01853 747 | mask_00125 748 | mask_05773 749 | mask_06600_o2 750 | mask_01000 751 | mask_06355_o2 752 | mask_00644 753 | mask_09926 754 | mask_08866_o2 755 | mask_00145_o2 756 | mask_01398 757 | mask_00222 758 | mask_10905 759 | mask_11475 760 | mask_07906_o1 761 | mask_05416_o2 762 | mask_08332_o2 763 | mask_03635 764 | mask_06309 765 | mask_07197_o2 766 | mask_05941_o1 767 | mask_08002 768 | mask_06397_o2 769 | mask_06858_o1 770 | mask_06985_o1 771 | mask_06060 772 | mask_08553_o1 773 | mask_00281_o1 774 | mask_05235 775 | mask_10315 776 | mask_06652 777 | mask_04445 778 | mask_11492 779 | mask_07350_o1 780 | mask_06839_o2 781 | mask_10534 782 | 
mask_06983_o2 783 | mask_05512_o2 784 | mask_11150 785 | mask_08018 786 | mask_09322 787 | mask_02268 788 | mask_01300 789 | mask_07744_o1 790 | mask_08093 791 | mask_04492 792 | mask_06452 793 | mask_05395_o2 794 | mask_08730 795 | mask_09044_o2 796 | mask_06289_o1 797 | mask_09808 798 | mask_02946 799 | mask_05378_o1 800 | mask_05774_o1 801 | mask_05662 802 | mask_00252_o2 803 | mask_05763 804 | mask_05686_o1 805 | mask_08406_o2 806 | mask_06224_o1 807 | mask_10981 808 | mask_02434 809 | mask_07834 810 | mask_11486_o1 811 | mask_05805_o2 812 | mask_06718_o1 813 | mask_06775 814 | mask_06720 815 | mask_09600 816 | mask_00336 817 | mask_05163_o2 818 | mask_06194_o2 819 | mask_12283 820 | mask_07737_o1 821 | mask_04524 822 | mask_07327_o2 823 | mask_11006_o1 824 | mask_05450_o1 825 | mask_04780 826 | mask_07193 827 | mask_11529_o1 828 | mask_10296 829 | mask_02394 830 | mask_10431 831 | mask_07047_o1 832 | mask_01066 833 | mask_02973 834 | mask_05180_o2 835 | mask_00435_o2 836 | mask_05594 837 | mask_05485_o2 838 | mask_06754_o2 839 | mask_00979 840 | mask_07366 841 | mask_05085_o1 842 | mask_08180_o2 843 | mask_04533 844 | mask_08044_o2 845 | mask_07219 846 | mask_09194 847 | mask_07188 848 | mask_01411 849 | mask_07522_o1 850 | mask_09022_o1 851 | mask_01587 852 | mask_06039 853 | mask_01903 854 | mask_11633 855 | mask_10535 856 | mask_06877 857 | mask_06154_o2 858 | mask_08933_o1 859 | mask_11979 860 | mask_05297_o1 861 | mask_11367 862 | mask_01586 863 | mask_06460_o2 864 | mask_07220_o1 865 | mask_00046_o1 866 | mask_02316 867 | mask_00128 868 | mask_11561_o2 869 | mask_06878_o1 870 | mask_05236 871 | mask_01894 872 | mask_07075_o1 873 | mask_08125 874 | mask_11776 875 | mask_01104 876 | mask_12271 877 | mask_07342 878 | mask_07768 879 | mask_08284_o1 880 | mask_05112_o1 881 | mask_06904_o2 882 | mask_06321_o2 883 | mask_05532_o1 884 | mask_09149 885 | mask_10808 886 | mask_05770 887 | mask_00282_o2 888 | mask_05824_o1 889 | mask_03935 890 | mask_00402_o2 891 | 
mask_09094 892 | mask_06179_o1 893 | mask_02070 894 | mask_06809_o2 895 | mask_05704 896 | mask_03459 897 | mask_11574_o2 898 | mask_08690 899 | mask_04112 900 | mask_11619 901 | mask_10975_o1 902 | mask_05991_o1 903 | mask_11947 904 | mask_06935_o2 905 | mask_06407 906 | mask_03572 907 | mask_06175 908 | mask_06315_o1 909 | mask_01135 910 | mask_07423 911 | mask_01744 912 | mask_05705_o1 913 | mask_05568 914 | mask_03132 915 | mask_09920 916 | mask_01672 917 | mask_07810 918 | mask_06661 919 | mask_03100 920 | mask_07359_o2 921 | mask_01468 922 | mask_01793 923 | mask_00229_o1 924 | mask_00226 925 | mask_01972 926 | mask_06473 927 | mask_01345 928 | mask_00393_o1 929 | mask_00017_o1 930 | mask_08404 931 | mask_00333_o2 932 | mask_09869 933 | mask_09182 934 | mask_10298 935 | mask_08215_o1 936 | mask_09628 937 | mask_02260_o2 938 | mask_08343 939 | mask_08870_o1 940 | mask_00202 941 | mask_06960_o2 942 | mask_10468 943 | mask_10156 944 | mask_03056 945 | mask_02878 946 | mask_08523 947 | mask_04960 948 | mask_04100 949 | mask_06223 950 | mask_02131 951 | mask_08338_o1 952 | mask_05896_o2 953 | mask_04681 954 | mask_01172 955 | mask_08503 956 | mask_09191 957 | mask_11661_o1 958 | mask_06160_o2 959 | mask_05507_o2 960 | mask_02811 961 | mask_05936_o1 962 | mask_06676 963 | mask_08908_o1 964 | mask_08444_o1 965 | mask_10497 966 | mask_07837 967 | mask_09379 968 | mask_08499_o2 969 | mask_10469 970 | mask_00134_o2 971 | mask_05660_o2 972 | mask_09778 973 | mask_02796 974 | mask_07773_o1 975 | mask_06946_o2 976 | mask_08163 977 | mask_06928 978 | mask_08898_o1 979 | mask_07274_o2 980 | mask_06866_o2 981 | mask_10435 982 | mask_07301 983 | mask_09083_o1 984 | mask_07318 985 | mask_07361 986 | mask_11547 987 | mask_10934 988 | mask_08854_o1 989 | mask_11410_o1 990 | mask_06649_o2 991 | mask_11967 992 | mask_03597 993 | mask_06538 994 | mask_02118 995 | mask_06653_o1 996 | mask_05183_o2 997 | mask_11098_o2 998 | mask_06811_o2 999 | mask_08578 1000 | mask_08214_o1 1001 | 
mask_10524 1002 | mask_04033 1003 | mask_05067_o1 1004 | mask_05831_o1 1005 | mask_00801 1006 | mask_04230 1007 | mask_10796 1008 | mask_01246 1009 | mask_03934 1010 | mask_01330 1011 | mask_00389_o2 1012 | mask_00091_o1 1013 | mask_00338 1014 | mask_09089_o1 1015 | mask_08622 1016 | mask_00061 1017 | mask_07569_o2 1018 | mask_06254 1019 | mask_11275_o1 1020 | mask_08254 1021 | mask_03091 1022 | mask_02364 1023 | mask_03316 1024 | mask_05308_o2 1025 | mask_06437 1026 | mask_04358 1027 | mask_00426 1028 | mask_08061 1029 | mask_03265 1030 | mask_12049 1031 | mask_11581 1032 | mask_08968 1033 | mask_00625 1034 | mask_08654 1035 | mask_08225_o2 1036 | mask_04436 1037 | mask_05158 1038 | mask_02998 1039 | mask_04254 1040 | mask_07807 1041 | mask_09068_o2 1042 | mask_06460_o1 1043 | mask_05065 1044 | mask_11991 1045 | mask_12321 1046 | mask_07153_o1 1047 | mask_00030_o2 1048 | mask_10183 1049 | mask_07824_o1 1050 | mask_00034 1051 | mask_02798 1052 | mask_07124 1053 | mask_09071 1054 | mask_00160 1055 | mask_10926 1056 | mask_01755 1057 | mask_00060 1058 | mask_10985_o1 1059 | mask_12262 1060 | mask_08762_o1 1061 | mask_02034 1062 | mask_02925 1063 | mask_03768 1064 | mask_08886_o1 1065 | mask_06435_o2 1066 | mask_09867 1067 | mask_00698 1068 | mask_03771 1069 | mask_10941 1070 | mask_09055_o2 1071 | mask_08927_o1 1072 | mask_03251 1073 | mask_03749 1074 | mask_03262 1075 | mask_09330 1076 | mask_03866 1077 | mask_00976 1078 | mask_00096_o1 1079 | mask_11489 1080 | mask_08120 1081 | mask_06650_o1 1082 | mask_01712 1083 | mask_06364_o1 1084 | mask_05585_o2 1085 | mask_11744 1086 | mask_11357 1087 | mask_00260_o2 1088 | mask_08930 1089 | mask_07315_o1 1090 | mask_05880_o2 1091 | mask_03415 1092 | mask_09277 1093 | mask_00234 1094 | mask_09924 1095 | mask_03394 1096 | mask_05673 1097 | mask_07451_o1 1098 | mask_01891 1099 | mask_10197 1100 | mask_07833_o1 1101 | mask_05220 1102 | mask_10337 1103 | mask_05041 1104 | mask_08099 1105 | mask_05616 1106 | mask_04559 1107 | 
mask_05472 1108 | mask_09082_o2 1109 | mask_09382 1110 | mask_01782 1111 | mask_07146 1112 | mask_05800 1113 | mask_06255 1114 | mask_05150 1115 | mask_06121_o1 1116 | mask_07098 1117 | mask_05306_o2 1118 | mask_07644 1119 | mask_06171_o2 1120 | mask_01518 1121 | mask_05063_o2 1122 | mask_03880 1123 | mask_05390_o1 1124 | mask_12223 1125 | mask_07054 1126 | mask_06716_o1 1127 | mask_05461_o1 1128 | mask_08746_o1 1129 | mask_00637 1130 | mask_06129 1131 | mask_07461_o1 1132 | mask_01166 1133 | mask_09063_o1 1134 | mask_03339 1135 | mask_00189_o1 1136 | mask_00192 1137 | mask_06934 1138 | mask_02594 1139 | mask_10634 1140 | mask_05484 1141 | mask_06857_o1 1142 | mask_07808_o2 1143 | mask_07879_o2 1144 | mask_11489_o2 1145 | mask_07053_o2 1146 | mask_08678_o2 1147 | mask_00136_o2 1148 | mask_06207_o1 1149 | mask_08271 1150 | mask_06500 1151 | mask_06414_o1 1152 | mask_08473 1153 | mask_08676 1154 | mask_08194 1155 | mask_10553 1156 | mask_06673_o2 1157 | mask_00122 1158 | mask_08510 1159 | mask_09282 1160 | mask_05907 1161 | mask_11591 1162 | mask_00425_o1 1163 | mask_00378_o1 1164 | mask_00137_o2 1165 | mask_05907_o2 1166 | mask_11757 1167 | mask_07066 1168 | mask_03471 1169 | mask_04334 1170 | mask_02890 1171 | mask_11135 1172 | mask_09417 1173 | mask_03744 1174 | mask_05293_o1 1175 | mask_05629_o2 1176 | mask_08812_o2 1177 | mask_08379_o1 1178 | mask_07386_o1 1179 | mask_11475_o2 1180 | mask_00252 1181 | mask_09843 1182 | mask_05888_o2 1183 | mask_05074 1184 | mask_08840_o2 1185 | mask_10264 1186 | mask_06285_o1 1187 | mask_10597 1188 | mask_02997 1189 | mask_04898 1190 | mask_08797 1191 | mask_09885 1192 | mask_08883_o2 1193 | mask_07730 1194 | mask_02716 1195 | mask_05788 1196 | mask_01019 1197 | mask_05298_o2 1198 | mask_06941_o1 1199 | mask_06210_o2 1200 | mask_00466_o1 1201 | mask_08477 1202 | mask_06514_o1 1203 | mask_05319 1204 | mask_05505_o2 1205 | mask_07186 1206 | mask_04859 1207 | mask_08624_o1 1208 | mask_01880 1209 | mask_05445_o1 1210 | mask_12193 
1211 | mask_05338_o2 1212 | mask_05828 1213 | mask_05311_o2 1214 | mask_06041 1215 | mask_09413 1216 | mask_08823_o2 1217 | mask_06333 1218 | mask_06948 1219 | mask_10124 1220 | mask_02297 1221 | mask_05587_o1 1222 | mask_08486_o1 1223 | mask_06031_o2 1224 | mask_08645 1225 | mask_05118_o2 1226 | mask_07431 1227 | mask_11936 1228 | mask_00504_o2 1229 | mask_07184 1230 | mask_12266 1231 | mask_05187 1232 | mask_07571 1233 | mask_05599 1234 | mask_07471 1235 | mask_08885 1236 | mask_07863 1237 | mask_06765 1238 | mask_00203_o2 1239 | mask_02054 1240 | mask_11000_o2 1241 | mask_08716_o1 1242 | mask_00016_o1 1243 | mask_11537_o1 1244 | mask_07187_o1 1245 | mask_04132 1246 | mask_06102_o1 1247 | mask_10179 1248 | mask_00016 1249 | mask_06188_o2 1250 | mask_08247_o1 1251 | mask_05676 1252 | mask_05384 1253 | mask_04136 1254 | mask_07517_o2 1255 | mask_00365_o1 1256 | mask_00193_o1 1257 | mask_00475 1258 | mask_05286_o2 1259 | mask_08646_o1 1260 | mask_05512_o1 1261 | mask_11168 1262 | mask_02498 1263 | mask_08892 1264 | mask_10459 1265 | mask_03387 1266 | mask_11945 1267 | mask_07172 1268 | mask_08951_o1 1269 | mask_11907 1270 | mask_00468 1271 | mask_00172_o1 1272 | mask_08673 1273 | mask_00514_o1 1274 | mask_01209 1275 | mask_11576_o2 1276 | mask_07246 1277 | mask_09059 1278 | mask_09992 1279 | mask_02350 1280 | mask_02683 1281 | mask_04339 1282 | mask_10287 1283 | mask_05316_o2 1284 | mask_00392 1285 | mask_01977 1286 | mask_00354_o1 1287 | mask_01160 1288 | mask_03861 1289 | mask_04771 1290 | mask_12229 1291 | mask_11734 1292 | mask_06540_o1 1293 | mask_08785_o2 1294 | mask_00057_o2 1295 | mask_05495_o2 1296 | mask_02978 1297 | mask_06885 1298 | mask_00477_o1 1299 | mask_08207_o2 1300 | mask_11495 1301 | mask_04404 1302 | mask_05438_o1 1303 | mask_05586_o1 1304 | mask_10486 1305 | mask_08918_o2 1306 | mask_09471 1307 | mask_05245_o1 1308 | mask_00103_o1 1309 | mask_09802 1310 | mask_07230 1311 | mask_01391 1312 | mask_06026 1313 | mask_08269 1314 | mask_07321 1315 | 
mask_00473_o1 1316 | mask_00379_o2 1317 | mask_07235_o1 1318 | mask_00777 1319 | mask_02525 1320 | mask_08797_o1 1321 | mask_01772 1322 | mask_06055_o2 1323 | mask_01493 1324 | mask_08612_o2 1325 | mask_10919 1326 | mask_10367 1327 | mask_05769_o2 1328 | mask_06401_o2 1329 | mask_00594 1330 | mask_08068_o2 1331 | mask_06751 1332 | mask_06975 1333 | mask_01180 1334 | mask_11294_o2 1335 | mask_08615 1336 | mask_08310_o1 1337 | mask_06744_o2 1338 | mask_08847_o2 1339 | mask_06087_o2 1340 | mask_06457_o2 1341 | mask_09178 1342 | mask_06770_o2 1343 | mask_09570 1344 | mask_05443_o1 1345 | mask_01127 1346 | mask_06546 1347 | mask_11479_o2 1348 | mask_06229_o1 1349 | mask_05724 1350 | mask_08340_o1 1351 | mask_11578_o1 1352 | mask_06338_o1 1353 | mask_05136 1354 | mask_03671 1355 | mask_00081_o2 1356 | mask_01423 1357 | mask_02374 1358 | mask_11503_o2 1359 | mask_00671 1360 | mask_05361_o2 1361 | mask_09023_o2 1362 | mask_03171_o1 1363 | mask_07063_o1 1364 | mask_10185 1365 | mask_05638 1366 | mask_01916 1367 | mask_06198_o2 1368 | mask_07285_o2 1369 | mask_05927 1370 | mask_07570_o1 1371 | mask_00213_o1 1372 | mask_05282_o1 1373 | mask_11545_o2 1374 | mask_00400_o1 1375 | mask_05334_o2 1376 | mask_07249 1377 | mask_03408 1378 | mask_10952 1379 | mask_09979 1380 | mask_02797 1381 | mask_05928_o2 1382 | mask_08433 1383 | mask_00113_o2 1384 | mask_06657 1385 | mask_01882 1386 | mask_08384_o1 1387 | mask_02602 1388 | mask_07096 1389 | mask_02302 1390 | mask_06946_o1 1391 | mask_01344 1392 | mask_08978 1393 | mask_11268_o2 1394 | mask_02974 1395 | mask_01328 1396 | mask_04595 1397 | mask_00481 1398 | mask_00178_o2 1399 | mask_06586_o1 1400 | mask_07220_o2 1401 | mask_08258 1402 | mask_06302_o2 1403 | mask_12255 1404 | mask_04435 1405 | mask_07657 1406 | mask_03185 1407 | mask_05398_o1 1408 | mask_06844 1409 | mask_02458 1410 | mask_06494 1411 | mask_06044 1412 | mask_02314 1413 | mask_07239_o2 1414 | mask_05017 1415 | mask_11803 1416 | mask_05410_o1 1417 | mask_05288_o1 1418 
| mask_07430_o1 1419 | mask_05661_o1 1420 | mask_05279_o2 1421 | mask_02484 1422 | mask_05067 1423 | mask_05767 1424 | mask_04088 1425 | mask_08337_o2 1426 | mask_00436 1427 | mask_09566 1428 | mask_10310 1429 | mask_03315 1430 | mask_08779_o2 1431 | mask_08980 1432 | mask_03846 1433 | mask_04634 1434 | mask_09446 1435 | mask_09057_o2 1436 | mask_07440_o2 1437 | mask_10208 1438 | mask_02698 1439 | mask_08220 1440 | mask_02692 1441 | mask_03067 1442 | mask_07642 1443 | mask_06780_o2 1444 | mask_09767 1445 | mask_05853_o1 1446 | mask_10263 1447 | mask_04598 1448 | mask_03726 1449 | mask_06506 1450 | mask_03898 1451 | mask_11842 1452 | mask_05239_o1 1453 | mask_05275 1454 | mask_09174 1455 | mask_07201_o2 1456 | mask_11547_o2 1457 | mask_08360_o2 1458 | mask_09941 1459 | mask_08714_o1 1460 | mask_10744 1461 | mask_09649 1462 | mask_06050_o2 1463 | mask_11141_o2 1464 | mask_07505 1465 | mask_08234_o2 1466 | mask_00558 1467 | mask_07949_o1 1468 | mask_01448 1469 | mask_03470 1470 | mask_11614_o1 1471 | mask_06778 1472 | mask_01971 1473 | mask_05200_o2 1474 | mask_07378 1475 | mask_11180_o1 1476 | mask_06718_o2 1477 | mask_09329 1478 | mask_08487_o2 1479 | mask_06676_o1 1480 | mask_00213_o2 1481 | mask_08581_o2 1482 | mask_04909 1483 | mask_01191 1484 | mask_01341 1485 | mask_04378 1486 | mask_05566_o2 1487 | mask_00309_o1 1488 | mask_01981 1489 | mask_07175_o1 1490 | mask_04793 1491 | mask_05266_o1 1492 | mask_08272 1493 | mask_06030_o2 1494 | mask_00410 1495 | mask_09049_o2 1496 | mask_09096 1497 | mask_02082 1498 | mask_11827 1499 | mask_12204 1500 | mask_00221 1501 | mask_08544_o1 1502 | mask_01707 1503 | mask_03891 1504 | mask_00222_o2 1505 | mask_10756 1506 | mask_05840_o2 1507 | mask_08118_o1 1508 | mask_02778 1509 | mask_06801_o2 1510 | mask_05172_o2 1511 | mask_09206 1512 | mask_11542 1513 | mask_07670 1514 | mask_04756 1515 | mask_00902 1516 | mask_11078 1517 | mask_06337 1518 | mask_02260_o1 1519 | mask_11308 1520 | mask_03489 1521 | mask_00075 1522 | 
mask_07515 1523 | mask_00396_o1 1524 | mask_03981 1525 | mask_07264_o1 1526 | mask_09836 1527 | mask_11677 1528 | mask_06955 1529 | mask_02665 1530 | mask_05404_o2 1531 | mask_11283_o2 1532 | mask_11248_o2 1533 | mask_06547 1534 | mask_10123 1535 | mask_07171_o2 1536 | mask_08534 1537 | mask_09311 1538 | mask_05418 1539 | mask_00407_o2 1540 | mask_09075_o2 1541 | mask_03281 1542 | mask_09436 1543 | mask_10735_o1 1544 | mask_08871_o1 1545 | mask_04708 1546 | mask_08093_o1 1547 | mask_00169_o2 1548 | mask_09004_o1 1549 | mask_07170 1550 | mask_05319_o1 1551 | mask_01683 1552 | mask_04928 1553 | mask_01818 1554 | mask_07456 1555 | mask_02544 1556 | mask_10560 1557 | mask_05627 1558 | mask_09753 1559 | mask_03224 1560 | mask_03539 1561 | mask_08961_o1 1562 | mask_11592_o1 1563 | mask_01966 1564 | mask_12001 1565 | mask_04747 1566 | mask_00058_o1 1567 | mask_07932_o1 1568 | mask_11596_o1 1569 | mask_11018_o1 1570 | mask_11182 1571 | mask_09763 1572 | mask_00626 1573 | mask_05915 1574 | mask_10207 1575 | mask_05882_o2 1576 | mask_07300 1577 | mask_03242 1578 | mask_12319 1579 | mask_10098 1580 | mask_00993 1581 | mask_07309_o2 1582 | mask_10244 1583 | mask_02673 1584 | mask_00243 1585 | mask_00241_o2 1586 | mask_03904 1587 | mask_03827 1588 | mask_08814_o1 1589 | mask_09894 1590 | mask_00056_o2 1591 | mask_05388_o1 1592 | mask_04455 1593 | mask_05490_o2 1594 | mask_05186_o1 1595 | mask_07099_o1 1596 | mask_00783 1597 | mask_06335_o1 1598 | mask_07749_o2 1599 | mask_05027 1600 | mask_05894_o1 1601 | mask_11603_o1 1602 | mask_08691 1603 | mask_07082 1604 | mask_06209 1605 | mask_09488 1606 | mask_09886 1607 | mask_04870 1608 | mask_09658 1609 | mask_08440 1610 | mask_08846_o1 1611 | mask_00547 1612 | mask_05885_o2 1613 | mask_11177_o1 1614 | mask_03003 1615 | mask_00418_o1 1616 | mask_07897_o2 1617 | mask_06048_o2 1618 | mask_05352_o2 1619 | mask_06396_o1 1620 | mask_00852 1621 | mask_11142 1622 | mask_00243_o2 1623 | mask_07260_o2 1624 | mask_08807 1625 | mask_00401 1626 
| mask_09059_o2 1627 | mask_06128 1628 | mask_10542 1629 | mask_00486 1630 | mask_00317 1631 | mask_10943 1632 | mask_10954 1633 | mask_04940 1634 | mask_11072_o1 1635 | mask_05531 1636 | mask_00398_o1 1637 | mask_08069 1638 | mask_04363 1639 | mask_12072 1640 | mask_11194_o1 1641 | mask_06788_o1 1642 | mask_12032 1643 | mask_01358 1644 | mask_05278_o2 1645 | mask_08319_o1 1646 | mask_07056_o2 1647 | mask_11269 1648 | mask_11112 1649 | mask_08966 1650 | mask_05847_o2 1651 | mask_06490_o2 1652 | mask_00703 1653 | mask_06503_o2 1654 | mask_07495_o1 1655 | mask_10548 1656 | mask_09402 1657 | mask_11589_o2 1658 | mask_01215 1659 | mask_11474 1660 | mask_11152_o1 1661 | mask_07337_o2 1662 | mask_05076_o2 1663 | mask_07066_o2 1664 | mask_00328_o1 1665 | mask_11752 1666 | mask_05432_o2 1667 | mask_03359 1668 | mask_09024 1669 | mask_08688_o2 1670 | mask_09942 1671 | mask_00114 1672 | mask_06181 1673 | mask_07859 1674 | mask_04800 1675 | mask_11551 1676 | mask_06284 1677 | mask_05838 1678 | mask_04310 1679 | mask_06564_o1 1680 | mask_06073 1681 | mask_01585 1682 | mask_06467_o1 1683 | mask_11863 1684 | mask_05772_o2 1685 | mask_08589 1686 | mask_07087_o1 1687 | mask_00416 1688 | mask_06589_o1 1689 | mask_06913_o2 1690 | mask_09034_o1 1691 | mask_07236 1692 | mask_05707_o2 1693 | mask_00081 1694 | mask_03705 1695 | mask_00355_o1 1696 | mask_05977_o2 1697 | mask_05147_o2 1698 | mask_11055 1699 | mask_04104 1700 | mask_08268_o1 1701 | mask_06894_o2 1702 | mask_10186 1703 | mask_00214_o1 1704 | mask_05107_o2 1705 | mask_03331 1706 | mask_09014 1707 | mask_03390 1708 | mask_07946_o2 1709 | mask_01067 1710 | mask_08062 1711 | mask_01489 1712 | mask_04695 1713 | mask_05269 1714 | mask_05740 1715 | mask_08017 1716 | mask_08635 1717 | mask_03310 1718 | mask_07214_o1 1719 | mask_06313 1720 | mask_02465 1721 | mask_11885 1722 | mask_00344 1723 | mask_08139 1724 | mask_03730 1725 | mask_07866 1726 | mask_01039 1727 | mask_03655 1728 | mask_05578 1729 | mask_02695 1730 | mask_05483 
1731 | mask_07325 1732 | mask_07327_o1 1733 | mask_10073 1734 | mask_04257 1735 | mask_08758_o1 1736 | mask_04550 1737 | mask_00265_o1 1738 | mask_04344 1739 | mask_01606 1740 | mask_11534_o2 1741 | mask_06106_o1 1742 | mask_05779_o1 1743 | mask_00476_o1 1744 | mask_01858 1745 | mask_12023 1746 | mask_11414 1747 | mask_07311 1748 | mask_08645_o1 1749 | mask_04847 1750 | mask_01816 1751 | mask_02396 1752 | mask_09255 1753 | mask_00642 1754 | mask_11344 1755 | mask_05180_o1 1756 | mask_00476_o2 1757 | mask_11731 1758 | mask_06662 1759 | mask_07261 1760 | mask_01006 1761 | mask_07215 1762 | mask_11600_o1 1763 | mask_07278_o2 1764 | mask_00442_o2 1765 | mask_06394 1766 | mask_09639 1767 | mask_11343 1768 | mask_07252_o2 1769 | mask_05754_o2 1770 | mask_10437 1771 | mask_03839 1772 | mask_02782 1773 | mask_12232 1774 | mask_00416_o1 1775 | mask_06693_o1 1776 | mask_10113 1777 | mask_03678 1778 | mask_09068_o1 1779 | mask_00342 1780 | mask_04951 1781 | mask_00331_o1 1782 | mask_09496 1783 | mask_08213_o1 1784 | mask_11323 1785 | mask_01105 1786 | mask_10286 1787 | mask_09332 1788 | mask_08825_o1 1789 | mask_06026_o1 1790 | mask_05136_o2 1791 | mask_05921 1792 | mask_11755 1793 | mask_07275_o2 1794 | mask_03661 1795 | mask_05088 1796 | mask_07153_o2 1797 | mask_02360 1798 | mask_00630 1799 | mask_11615_o2 1800 | mask_01352 1801 | mask_09669 1802 | mask_05852 1803 | mask_00619 1804 | mask_10224 1805 | mask_09515 1806 | mask_05162 1807 | mask_08302_o1 1808 | mask_03475 1809 | mask_06123 1810 | mask_07946 1811 | mask_07330_o1 1812 | mask_03203 1813 | mask_05512 1814 | mask_04118 1815 | mask_03856 1816 | mask_05047 1817 | mask_01993 1818 | mask_07680_o2 1819 | mask_06436_o2 1820 | mask_11139_o2 1821 | mask_08683 1822 | mask_03522 1823 | mask_08202 1824 | mask_09309 1825 | mask_07766_o2 1826 | mask_05613 1827 | mask_03406 1828 | mask_00179_o2 1829 | mask_10951 1830 | mask_07869_o2 1831 | mask_11098_o1 1832 | mask_04638 1833 | mask_06701 1834 | mask_00143 1835 | mask_07382_o2 
1836 | mask_07128 1837 | mask_05589 1838 | mask_06152_o2 1839 | mask_07314 1840 | mask_10314 1841 | mask_08525 1842 | mask_06964 1843 | mask_11658 1844 | mask_03783 1845 | mask_09937 1846 | mask_10507 1847 | mask_08817 1848 | mask_05811_o1 1849 | mask_12380 1850 | mask_05097_o1 1851 | mask_11768 1852 | mask_07079_o1 1853 | mask_11090_o2 1854 | mask_07476_o1 1855 | mask_04773 1856 | mask_07988_o2 1857 | mask_09497 1858 | mask_10832 1859 | mask_10440 1860 | mask_06553 1861 | mask_00414 1862 | mask_08323 1863 | mask_11265_o1 1864 | mask_08052 1865 | mask_06432 1866 | mask_06420 1867 | mask_05301 1868 | mask_00286 1869 | mask_08041 1870 | mask_02999 1871 | mask_08554 1872 | mask_08689 1873 | mask_03188 1874 | mask_07720 1875 | mask_11117 1876 | mask_00283_o2 1877 | mask_03367 1878 | mask_06229_o2 1879 | mask_06073_o1 1880 | mask_01693 1881 | mask_05223 1882 | mask_10602 1883 | mask_06917_o1 1884 | mask_06950_o1 1885 | mask_01043 1886 | mask_05699 1887 | mask_02799 1888 | mask_02433 1889 | mask_10731 1890 | mask_11522 1891 | mask_07588 1892 | mask_05891_o1 1893 | mask_08727 1894 | mask_05918 1895 | mask_05806 1896 | mask_09699 1897 | mask_11168_o2 1898 | mask_00123_o1 1899 | mask_01414 1900 | mask_02427 1901 | mask_05655 1902 | mask_00024_o2 1903 | mask_00381 1904 | mask_07945_o1 1905 | mask_07025 1906 | mask_05792_o2 1907 | mask_09897 1908 | mask_06628 1909 | mask_11267_o2 1910 | mask_00329 1911 | mask_08244_o1 1912 | mask_00147 1913 | mask_04126 1914 | mask_05533_o2 1915 | mask_08960 1916 | mask_10290 1917 | mask_05094_o2 1918 | mask_02674 1919 | mask_05083_o2 1920 | mask_07100_o2 1921 | mask_08875 1922 | mask_06893 1923 | mask_02008 1924 | mask_11484 1925 | mask_00274_o1 1926 | mask_03007 1927 | mask_07583 1928 | mask_11048 1929 | mask_06088_o1 1930 | mask_12056 1931 | mask_09223 1932 | mask_07309_o1 1933 | mask_05793_o2 1934 | mask_00205 1935 | mask_08714_o2 1936 | mask_05197_o1 1937 | mask_08694_o2 1938 | mask_12113 1939 | mask_06536_o1 1940 | mask_11118_o2 1941 | 
mask_08213 1942 | mask_01520 1943 | mask_06869 1944 | mask_08202_o2 1945 | mask_00465_o1 1946 | mask_05955_o2 1947 | mask_00369 1948 | mask_07193_o2 1949 | mask_02295 1950 | mask_07814_o2 1951 | mask_11058 1952 | mask_06942_o2 1953 | mask_04774 1954 | mask_08588_o2 1955 | mask_09271 1956 | mask_07574_o1 1957 | mask_00259_o1 1958 | mask_08268_o2 1959 | mask_05825_o2 1960 | mask_08815 1961 | mask_07949 1962 | mask_09910_o1 1963 | mask_09670 1964 | mask_02994 1965 | mask_11220 1966 | mask_11083_o2 1967 | mask_03707 1968 | mask_11128 1969 | mask_03363 1970 | mask_09820 1971 | mask_07523 1972 | mask_00257 1973 | mask_06145 1974 | mask_08937_o2 1975 | mask_04259 1976 | mask_09397 1977 | mask_07706 1978 | mask_00376_o2 1979 | mask_11377 1980 | mask_03384 1981 | mask_09189 1982 | mask_08752 1983 | mask_10632 1984 | mask_01555 1985 | mask_07771_o2 1986 | mask_06806_o1 1987 | mask_06311 1988 | mask_02842 1989 | mask_03120 1990 | mask_05113 1991 | mask_08565 1992 | mask_08128_o2 1993 | mask_07276_o2 1994 | mask_07818_o2 1995 | mask_04369 1996 | mask_03506 1997 | mask_11637_o1 1998 | mask_00485_o1 1999 | mask_06668 2000 | mask_11059_o2 2001 | mask_09116_o2 2002 | mask_00110_o1 2003 | mask_05779 2004 | mask_02313 2005 | mask_09621 2006 | mask_08195 2007 | mask_11440_o2 2008 | mask_07002_o1 2009 | mask_07407_o1 2010 | mask_08184_o2 2011 | mask_06340 2012 | mask_09874 2013 | mask_06120 2014 | mask_07401_o1 2015 | mask_07189_o1 2016 | mask_02413 2017 | mask_00336_o1 2018 | mask_11280 2019 | mask_02666 2020 | mask_04068 2021 | mask_02380 2022 | mask_06705_o1 2023 | mask_08703 2024 | mask_00782 2025 | mask_08866 2026 | mask_06925_o2 2027 | mask_12211 2028 | mask_06617_o2 2029 | mask_07332_o2 2030 | mask_02524 2031 | mask_11794 2032 | mask_04529 2033 | mask_08870 2034 | mask_11648 2035 | mask_07764 2036 | mask_11605 2037 | mask_02136 2038 | mask_11624_o1 2039 | mask_04376 2040 | mask_07392 2041 | mask_11566 2042 | mask_00981 2043 | mask_06156 2044 | mask_07448 2045 | mask_04811 2046 
| mask_11383 2047 | mask_10949 2048 | mask_01118 2049 | mask_05740_o1 2050 | mask_07148_o2 2051 | mask_06268_o2 2052 | mask_09106_o1 2053 | mask_11881 2054 | mask_11501_o1 2055 | mask_08644_o1 2056 | mask_02770 2057 | mask_11081_o1 2058 | mask_09019 2059 | mask_07157_o2 2060 | mask_00464_o2 2061 | mask_06970 2062 | mask_09437 2063 | mask_01096 2064 | mask_00273_o2 2065 | mask_02972 2066 | mask_11432 2067 | mask_10601 2068 | mask_07585_o1 2069 | mask_01031 2070 | mask_04371 2071 | mask_01655 2072 | mask_00159_o2 2073 | mask_08956_o2 2074 | mask_03169 2075 | mask_03914 2076 | mask_09037_o1 2077 | mask_07606 2078 | mask_04141 2079 | mask_10665 2080 | mask_05530_o1 2081 | mask_09001_o1 2082 | mask_01346 2083 | mask_07314_o2 2084 | mask_05723_o2 2085 | mask_11195 2086 | mask_02507 2087 | mask_04991 2088 | mask_10019 2089 | mask_05127 2090 | mask_05583 2091 | mask_08893 2092 | mask_04526 2093 | mask_09500 2094 | mask_10456 2095 | mask_07386_o2 2096 | mask_05570_o2 2097 | mask_10252 2098 | mask_06184 2099 | mask_00440_o1 2100 | mask_03722 2101 | mask_07082_o2 2102 | mask_09714 2103 | mask_08211_o2 2104 | mask_00271 2105 | mask_08599_o1 2106 | mask_00898 2107 | mask_00285 2108 | mask_05439 2109 | mask_02277 2110 | mask_05184_o2 2111 | mask_06572_o1 2112 | mask_00221_o2 2113 | mask_08435_o2 2114 | mask_03379 2115 | mask_00749 2116 | mask_05228_o1 2117 | mask_12385 2118 | mask_05790_o1 2119 | mask_07840_o1 2120 | mask_00109_o1 2121 | mask_09154 2122 | mask_06305 2123 | mask_02013 2124 | mask_07461_o2 2125 | mask_08659 2126 | mask_06780_o1 2127 | mask_11393 2128 | mask_05586 2129 | mask_08083 2130 | mask_00505_o1 2131 | mask_05049 2132 | mask_08977_o2 2133 | mask_11181 2134 | mask_07410_o1 2135 | mask_06374_o1 2136 | mask_07817_o2 2137 | mask_04541 2138 | mask_06075_o2 2139 | mask_11220_o1 2140 | mask_09373 2141 | mask_01704 2142 | mask_11528_o2 2143 | mask_11124_o2 2144 | mask_04010 2145 | mask_09959 2146 | mask_06476_o2 2147 | mask_05651_o1 2148 | mask_00490 2149 | 
mask_07355_o1 2150 | mask_07939_o2 2151 | mask_03640 2152 | mask_11231_o1 2153 | mask_03391 2154 | mask_10371 2155 | mask_05299 2156 | mask_01266 2157 | mask_07827 2158 | mask_00875 2159 | mask_07766 2160 | mask_03691 2161 | mask_03011 2162 | mask_06169 2163 | mask_07863_o1 2164 | mask_08930_o1 2165 | mask_06433 2166 | mask_06915 2167 | mask_05296_o2 2168 | mask_02057 2169 | mask_00909 2170 | mask_02820 2171 | mask_03337 2172 | mask_02260 2173 | mask_03534 2174 | mask_07565_o2 2175 | mask_03523 2176 | mask_00753 2177 | mask_06082 2178 | mask_10385 2179 | mask_07828 2180 | mask_05748_o1 2181 | mask_03797 2182 | mask_06551 2183 | mask_07438 2184 | mask_02424 2185 | mask_05718_o1 2186 | mask_10624 2187 | mask_00174_o2 2188 | mask_10470 2189 | mask_01394 2190 | mask_06938 2191 | mask_12231 2192 | mask_08350 2193 | mask_05206 2194 | mask_05290_o1 2195 | mask_07230_o2 2196 | mask_10008 2197 | mask_02954 2198 | mask_08841_o1 2199 | mask_06488 2200 | mask_08343_o1 2201 | mask_07140_o2 2202 | mask_10251 2203 | mask_01722 2204 | mask_00201 2205 | mask_09493 2206 | mask_03751 2207 | mask_08342_o1 2208 | mask_06482 2209 | mask_02365 2210 | mask_00735 2211 | mask_11044_o2 2212 | mask_03721 2213 | mask_06017_o2 2214 | mask_05989 2215 | mask_00820 2216 | mask_08654_o1 2217 | mask_11872 2218 | mask_08723 2219 | mask_00022_o2 2220 | mask_06575_o1 2221 | mask_07157 2222 | mask_07344_o1 2223 | mask_08929 2224 | mask_03616 2225 | mask_06435 2226 | mask_11482_o1 2227 | mask_01260 2228 | mask_07718_o2 2229 | mask_06646 2230 | mask_02519 2231 | mask_04752 2232 | mask_09791 2233 | mask_11265_o2 2234 | mask_11634_o2 2235 | mask_11688 2236 | mask_05365_o1 2237 | mask_07512 2238 | mask_11392 2239 | mask_03034 2240 | mask_11481 2241 | mask_05129_o1 2242 | mask_00090_o1 2243 | mask_08944 2244 | mask_07946_o1 2245 | mask_04994 2246 | mask_08667_o1 2247 | mask_08088_o2 2248 | mask_00941 2249 | mask_10140 2250 | mask_05827_o1 2251 | mask_11202 2252 | mask_11598_o2 2253 | mask_11239_o1 2254 | 
mask_03024 2255 | mask_06131_o1 2256 | mask_07509 2257 | mask_07853_o2 2258 | mask_10161 2259 | mask_11119_o1 2260 | mask_05545_o1 2261 | mask_07087 2262 | mask_11062 2263 | mask_10567 2264 | mask_07344_o2 2265 | mask_09532 2266 | mask_05273_o1 2267 | mask_10327 2268 | mask_08065 2269 | mask_00499_o1 2270 | mask_06382 2271 | mask_06431 2272 | mask_02606 2273 | mask_07524 2274 | mask_00371 2275 | mask_05442_o1 2276 | mask_00227_o2 2277 | mask_02876 2278 | mask_08050 2279 | mask_05325_o2 2280 | mask_11138 2281 | mask_09374 2282 | mask_08378 2283 | mask_05929_o2 2284 | mask_08668_o2 2285 | mask_08485 2286 | mask_06962_o1 2287 | mask_08364_o2 2288 | mask_03670 2289 | mask_12183 2290 | mask_07486_o2 2291 | mask_08777_o2 2292 | mask_08625 2293 | mask_06136_o2 2294 | mask_02457 2295 | mask_08631_o2 2296 | mask_00456_o1 2297 | mask_07596 2298 | mask_00398 2299 | mask_05999_o1 2300 | mask_08627_o2 2301 | mask_09452 2302 | mask_08385 2303 | mask_02774_o1 2304 | mask_12230 2305 | mask_01375 2306 | mask_06149 2307 | mask_00062 2308 | mask_07692_o1 2309 | mask_00995 2310 | mask_08869 2311 | mask_10325 2312 | mask_07009_o1 2313 | mask_06042_o2 2314 | mask_03066 2315 | mask_02931 2316 | mask_08563 2317 | mask_08571 2318 | mask_11073 2319 | mask_01738 2320 | mask_06591 2321 | mask_08653 2322 | mask_07900_o2 2323 | mask_08717 2324 | mask_07006_o1 2325 | mask_05547_o2 2326 | mask_11061 2327 | mask_05832_o2 2328 | mask_08974_o2 2329 | mask_12046 2330 | mask_07409 2331 | mask_02933 2332 | mask_11928 2333 | mask_08981_o1 2334 | mask_06987_o1 2335 | mask_06084_o1 2336 | mask_07567 2337 | mask_00260 2338 | mask_00113_o1 2339 | mask_07194_o1 2340 | mask_07077_o2 2341 | mask_01554 2342 | mask_10204 2343 | mask_00242_o1 2344 | mask_11080_o2 2345 | mask_07520 2346 | mask_11058_o2 2347 | mask_00974 2348 | mask_09010_o1 2349 | mask_10963_o1 2350 | mask_12149 2351 | mask_05066 2352 | mask_05324_o1 2353 | mask_09394 2354 | mask_01779 2355 | mask_01968 2356 | mask_05081_o1 2357 | mask_12120 2358 
| mask_06845_o1 2359 | mask_03152 2360 | mask_05194_o1 2361 | mask_09061_o1 2362 | mask_11521_o1 2363 | mask_05108 2364 | mask_07439 2365 | mask_05938 2366 | mask_07069_o2 2367 | mask_05070_o1 2368 | mask_06768_o1 2369 | mask_05793 2370 | mask_09938 2371 | mask_00598 2372 | mask_08715 2373 | mask_06884 2374 | mask_04308 2375 | mask_03004 2376 | mask_06046 2377 | mask_03536 2378 | mask_10523 2379 | mask_00869 2380 | mask_02368 2381 | mask_09510 2382 | mask_05190_o1 2383 | mask_00093 2384 | mask_04269 2385 | mask_04119 2386 | mask_11478_o2 2387 | mask_00871 2388 | mask_07191_o1 2389 | mask_02737 2390 | mask_07707_o1 2391 | mask_07384 2392 | mask_01960 2393 | mask_05077 2394 | mask_05922 2395 | mask_00153_o1 2396 | mask_07113 2397 | mask_10772 2398 | mask_06541_o2 2399 | mask_05404_o1 2400 | mask_07772_o1 2401 | mask_01927 2402 | mask_03854 2403 | mask_00845 2404 | mask_05970_o1 2405 | mask_06524 2406 | mask_08747 2407 | mask_05083 2408 | mask_00692 2409 | mask_03584 2410 | mask_05107 2411 | mask_00199_o2 2412 | mask_01015 2413 | mask_10226 2414 | mask_04670 2415 | mask_09122_o1 2416 | mask_08146_o2 2417 | mask_11136_o2 2418 | mask_08257 2419 | mask_10309 2420 | mask_03136 2421 | mask_08805_o2 2422 | mask_12008 2423 | mask_10968_o2 2424 | mask_04865 2425 | mask_04737 2426 | mask_05717_o2 2427 | mask_01758 2428 | mask_03492 2429 | mask_03860 2430 | mask_09863 2431 | mask_07610_o1 2432 | mask_11116_o1 2433 | mask_09895 2434 | mask_01439 2435 | mask_05218_o2 2436 | mask_08887 2437 | mask_06093_o2 2438 | mask_03424 2439 | mask_05855 2440 | mask_01561 2441 | mask_08948_o1 2442 | mask_05345_o1 2443 | mask_08040 2444 | mask_05709_o1 2445 | mask_07242 2446 | mask_07446 2447 | mask_01788 2448 | mask_02598 2449 | mask_09222 2450 | mask_06000 2451 | mask_05014 2452 | mask_01327 2453 | mask_11609 2454 | mask_00075_o1 2455 | mask_08235 2456 | mask_02906 2457 | mask_09172 2458 | mask_09337 2459 | mask_02599 2460 | mask_08128 2461 | mask_02635 2462 | mask_00417 2463 | mask_06683_o1 
2464 | mask_07926 2465 | mask_11214_o2 2466 | mask_02520 2467 | mask_08488_o1 2468 | mask_12325 2469 | mask_06007 2470 | mask_05154_o1 2471 | mask_05102 2472 | mask_11179 2473 | mask_05491_o2 2474 | mask_05978 2475 | mask_07028_o2 2476 | mask_08748_o1 2477 | mask_08945_o1 2478 | mask_06000_o2 2479 | mask_01868 2480 | mask_02875 2481 | mask_02959 2482 | mask_05079 2483 | mask_02762 2484 | mask_01523 2485 | mask_11077 2486 | mask_11473_o2 2487 | mask_05001 2488 | mask_05358_o2 2489 | mask_08573 2490 | mask_07220 2491 | mask_06568_o1 2492 | mask_07414_o1 2493 | mask_02322 2494 | mask_12100 2495 | mask_05689 2496 | mask_05528 2497 | mask_00254_o2 2498 | mask_00196_o2 2499 | mask_05718_o2 2500 | mask_11525_o1 2501 | mask_07724 2502 | mask_01371 2503 | mask_06846 2504 | mask_05317 2505 | mask_11587 2506 | mask_04427 2507 | mask_08741 2508 | mask_05908_o1 2509 | mask_02844 2510 | mask_03795 2511 | mask_08856_o2 2512 | mask_07378_o2 2513 | mask_06747 2514 | mask_03430 2515 | mask_08848 2516 | mask_07386 2517 | mask_05601 2518 | mask_06403_o2 2519 | mask_09653 2520 | mask_08862 2521 | mask_00289_o2 2522 | mask_08138 2523 | mask_08972 2524 | mask_07115_o2 2525 | mask_12111 2526 | mask_05451_o2 2527 | mask_08605 2528 | mask_02047 2529 | mask_00535 2530 | mask_09722 2531 | mask_02766 2532 | mask_04665 2533 | mask_00067_o2 2534 | mask_06837_o2 2535 | mask_10611 2536 | mask_07291 2537 | mask_01132 2538 | mask_06444 2539 | mask_09934 2540 | mask_11504_o1 2541 | mask_09103 2542 | mask_05578_o1 2543 | mask_06992_o1 2544 | mask_07032_o1 2545 | mask_07632_o2 2546 | mask_11666 2547 | mask_05856 2548 | mask_10643 2549 | mask_10025 2550 | mask_10845 2551 | mask_11238 2552 | mask_02509 2553 | mask_08174 2554 | mask_06881_o1 2555 | mask_05824_o2 2556 | mask_07502_o2 2557 | mask_08876_o2 2558 | mask_01679 2559 | mask_06622 2560 | mask_00055_o2 2561 | mask_01994 2562 | mask_08383 2563 | mask_05433 2564 | mask_01617 2565 | mask_07656_o1 2566 | mask_00505 2567 | mask_07689 2568 | 
mask_06615_o1 2569 | mask_03290 2570 | mask_08263 2571 | mask_05354 2572 | mask_07556_o2 2573 | mask_04977 2574 | mask_08882 2575 | mask_09852 2576 | mask_09766 2577 | mask_06362_o1 2578 | mask_05502_o2 2579 | mask_11029_o2 2580 | mask_06422 2581 | mask_11282 2582 | mask_08961_o2 2583 | mask_09035 2584 | mask_04199 2585 | mask_00122_o1 2586 | mask_05595 2587 | mask_07612_o2 2588 | mask_00509_o1 2589 | mask_00369_o1 2590 | mask_08203_o1 2591 | mask_00005_o2 2592 | mask_09970 2593 | mask_05468_o1 2594 | mask_08802_o1 2595 | mask_02503 2596 | mask_06232_o1 2597 | mask_09433 2598 | mask_04931 2599 | mask_06171_o1 2600 | mask_08944_o1 2601 | mask_06895_o1 2602 | mask_00376 2603 | mask_04781 2604 | mask_08737 2605 | mask_07221 2606 | mask_05532_o2 2607 | mask_08857_o1 2608 | mask_07085_o1 2609 | mask_10992_o2 2610 | mask_06636 2611 | mask_05351_o2 2612 | mask_11480_o1 2613 | mask_00542 2614 | mask_03779 2615 | mask_01004 2616 | mask_07315 2617 | mask_03959 2618 | mask_09883 2619 | mask_06625 2620 | mask_02199 2621 | mask_05295_o1 2622 | mask_02669 2623 | mask_08368 2624 | mask_00380_o1 2625 | mask_06150 2626 | mask_07432_o2 2627 | mask_03838 2628 | mask_03863 2629 | mask_04362 2630 | mask_02879 2631 | mask_05517 2632 | mask_02063 2633 | mask_06275 2634 | mask_09058 2635 | mask_02639 2636 | mask_03737 2637 | mask_11411_o1 2638 | mask_08015_o1 2639 | mask_07725 2640 | mask_05899 2641 | mask_07820_o2 2642 | mask_06423 2643 | mask_06363 2644 | mask_09299 2645 | mask_04987 2646 | mask_08605_o2 2647 | mask_09891 2648 | mask_07294_o1 2649 | mask_08000_o2 2650 | mask_10076 2651 | mask_05071 2652 | mask_05727 2653 | mask_01470 2654 | mask_06103 2655 | mask_11196_o2 2656 | mask_08780_o1 2657 | mask_05913_o2 2658 | mask_06714_o1 2659 | mask_02707 2660 | mask_06317_o1 2661 | mask_06312 2662 | mask_01778 2663 | mask_03652 2664 | mask_11548_o1 2665 | mask_08183_o1 2666 | mask_11709 2667 | mask_02776 2668 | mask_06482_o2 2669 | mask_05382_o1 2670 | mask_05304_o1 2671 | mask_07960 2672 
| mask_08156 2673 | mask_10380 2674 | mask_08849 2675 | mask_09593 2676 | mask_10615 2677 | mask_02167 2678 | mask_08801_o2 2679 | mask_08130 2680 | mask_01013 2681 | mask_07108_o1 2682 | mask_08293 2683 | mask_11043_o1 2684 | mask_03753 2685 | mask_08747_o1 2686 | mask_03765 2687 | mask_05710_o2 2688 | mask_05398_o2 2689 | mask_06388_o1 2690 | mask_08519_o1 2691 | mask_11272_o2 2692 | mask_03167 2693 | mask_11191 2694 | mask_00531 2695 | mask_05431_o2 2696 | mask_09528 2697 | mask_05615 2698 | mask_11432_o1 2699 | mask_06680_o1 2700 | mask_08087 2701 | mask_11442_o2 2702 | mask_05508 2703 | mask_01716 2704 | mask_04590 2705 | mask_04425 2706 | mask_11609_o1 2707 | mask_04000 2708 | mask_08289 2709 | mask_10701 2710 | mask_06426_o2 2711 | mask_07109_o2 2712 | mask_06310_o2 2713 | mask_07620_o2 2714 | mask_08004 2715 | mask_07715_o1 2716 | mask_05237_o1 2717 | mask_07495 2718 | mask_09113 2719 | mask_07870_o1 2720 | mask_07403 2721 | mask_04458 2722 | mask_00358 2723 | mask_08609_o1 2724 | mask_05423_o1 2725 | mask_08252_o1 2726 | mask_10621 2727 | mask_03537 2728 | mask_02905 2729 | mask_05645_o1 2730 | mask_00258 2731 | mask_06670_o1 2732 | mask_11938 2733 | mask_04039 2734 | mask_10282 2735 | mask_08394 2736 | mask_00477_o2 2737 | mask_02951 2738 | mask_11454_o2 2739 | mask_08478_o1 2740 | mask_03746 2741 | mask_00037_o2 2742 | mask_07106_o1 2743 | mask_07483_o1 2744 | mask_01496 2745 | mask_06034_o1 2746 | mask_09243 2747 | mask_05170 2748 | mask_06397 2749 | mask_07259_o2 2750 | mask_12301 2751 | mask_08879_o2 2752 | mask_11075 2753 | mask_10533 2754 | mask_09175 2755 | mask_01387 2756 | mask_03480 2757 | mask_00238 2758 | mask_11592_o2 2759 | mask_06149_o1 2760 | mask_12168 2761 | mask_07619 2762 | mask_06716_o2 2763 | mask_08223 2764 | mask_06759_o2 2765 | mask_10985_o2 2766 | mask_05427_o1 2767 | mask_05341_o1 2768 | mask_06852 2769 | mask_07641 2770 | mask_12051 2771 | mask_06043 2772 | mask_04293 2773 | mask_07103 2774 | mask_08811 2775 | mask_09998 2776 | 
mask_01454 2777 | mask_06081_o2 2778 | mask_02501 2779 | mask_04169 2780 | mask_05684 2781 | mask_02887 2782 | mask_00475_o2 2783 | mask_05476 2784 | mask_06579_o1 2785 | mask_03358 2786 | mask_08475 2787 | mask_12009 2788 | mask_09730 2789 | mask_11270 2790 | mask_11209_o1 2791 | mask_04746 2792 | mask_09387 2793 | mask_05169_o1 2794 | mask_02830 2795 | mask_11884 2796 | mask_11932 2797 | mask_05360_o1 2798 | mask_04128 2799 | mask_03200 2800 | mask_04195 2801 | mask_05651 2802 | mask_01299 2803 | mask_08261_o2 2804 | mask_11014 2805 | mask_07688 2806 | mask_07214 2807 | mask_05128 2808 | mask_00009 2809 | mask_08252 2810 | mask_02818_o1 2811 | mask_12323 2812 | mask_06033_o2 2813 | mask_08012 2814 | mask_10247 2815 | mask_08097_o1 2816 | mask_05253_o1 2817 | mask_04548 2818 | mask_11403 2819 | mask_04528 2820 | mask_02789 2821 | mask_05453 2822 | mask_06154 2823 | mask_08455 2824 | mask_07073 2825 | mask_08758_o2 2826 | mask_05640 2827 | mask_07737_o2 2828 | mask_11002 2829 | mask_05421 2830 | mask_11397 2831 | mask_03040 2832 | mask_04353 2833 | mask_00448_o2 2834 | mask_04878 2835 | mask_06480 2836 | mask_09228 2837 | mask_11442_o1 2838 | mask_10465 2839 | mask_03696 2840 | mask_11158_o2 2841 | mask_11515_o1 2842 | mask_01901 2843 | mask_03800 2844 | mask_10413 2845 | mask_08841_o2 2846 | mask_08251 2847 | mask_00546 2848 | mask_10357 2849 | mask_07526_o2 2850 | mask_07976_o2 2851 | mask_05473 2852 | mask_06807 2853 | mask_06522_o1 2854 | mask_10308 2855 | mask_06620_o1 2856 | mask_11486_o2 2857 | mask_04954 2858 | mask_08753_o1 2859 | mask_00442 2860 | mask_08310_o2 2861 | mask_03442 2862 | mask_07147 2863 | mask_03922 2864 | mask_00008_o2 2865 | mask_08575_o1 2866 | mask_05538_o1 2867 | mask_10212 2868 | mask_09037_o2 2869 | mask_05153_o1 2870 | mask_06322_o2 2871 | mask_06158_o2 2872 | mask_07036_o1 2873 | mask_00479_o1 2874 | mask_11150_o2 2875 | mask_08875_o2 2876 | mask_04461 2877 | mask_08478_o2 2878 | mask_05888_o1 2879 | mask_00764 2880 | mask_01746 
2881 | mask_06216 2882 | mask_05659_o1 2883 | mask_05309_o1 2884 | mask_00237 2885 | mask_05592 2886 | mask_03841 2887 | mask_02060 2888 | mask_01537 2889 | mask_08116_o1 2890 | mask_01511 2891 | mask_05276 2892 | mask_03449 2893 | mask_04440 2894 | mask_08090 2895 | mask_06402 2896 | mask_02794 2897 | mask_07326 2898 | mask_08641_o2 2899 | mask_03547 2900 | mask_01089 2901 | mask_08362 2902 | mask_08043 2903 | mask_00407_o1 2904 | mask_05007 2905 | mask_01064 2906 | mask_08932 2907 | mask_05750_o2 2908 | mask_06236_o1 2909 | mask_11408_o1 2910 | mask_00960 2911 | mask_10678 2912 | mask_05603 2913 | mask_05758_o2 2914 | mask_08901_o2 2915 | mask_06554 2916 | mask_11146 2917 | mask_05031 2918 | mask_05499_o1 2919 | mask_06564_o2 2920 | mask_07942 2921 | mask_06581_o1 2922 | mask_00346_o1 2923 | mask_00039_o1 2924 | mask_00011_o1 2925 | mask_11526 2926 | mask_07261_o2 2927 | mask_00050_o2 2928 | mask_11131_o1 2929 | mask_10494 2930 | mask_05944 2931 | mask_05892 2932 | mask_07969 2933 | mask_09302 2934 | mask_07334_o2 2935 | mask_08456_o1 2936 | mask_06861 2937 | mask_05176_o2 2938 | mask_04261 2939 | mask_06413 2940 | mask_06551_o2 2941 | mask_06341 2942 | mask_10627 2943 | mask_08711 2944 | mask_05884 2945 | mask_00685 2946 | mask_06836_o1 2947 | mask_04947 2948 | mask_11490 2949 | mask_02833 2950 | mask_07094_o1 2951 | mask_03340 2952 | mask_05243_o2 2953 | mask_02950 2954 | mask_05346_o1 2955 | mask_00163_o2 2956 | mask_10794 2957 | mask_00077_o1 2958 | mask_07561 2959 | mask_06255_o2 2960 | mask_10270 2961 | mask_02337 2962 | mask_11563 2963 | mask_00622 2964 | mask_06303 2965 | mask_09785 2966 | mask_00466_o2 2967 | mask_06880_o2 2968 | mask_05964 2969 | mask_02076 2970 | mask_08712 2971 | mask_07768_o2 2972 | mask_11775 2973 | mask_07099 2974 | mask_04073 2975 | mask_05697_o2 2976 | mask_06544 2977 | mask_04051 2978 | mask_08336 2979 | mask_04690 2980 | mask_07081_o1 2981 | mask_08915_o1 2982 | mask_06788_o2 2983 | mask_09871 2984 | mask_05802_o1 2985 | 
mask_06240_o2 2986 | mask_05630 2987 | mask_09646 2988 | mask_07478_o2 2989 | mask_12257 2990 | mask_03752 2991 | mask_10306 2992 | mask_09091_o1 2993 | mask_06705_o2 2994 | mask_11116_o2 2995 | mask_08512 2996 | mask_03742 2997 | mask_08726_o1 2998 | mask_08836_o1 2999 | mask_05085_o2 3000 | mask_06985_o2 3001 | mask_08804 3002 | mask_09211 3003 | mask_08388_o2 3004 | mask_08263_o2 3005 | mask_06285 3006 | mask_05455 3007 | mask_03369 3008 | mask_07004_o2 3009 | mask_12311 3010 | mask_05009 3011 | mask_07508 3012 | mask_09853 3013 | mask_10588 3014 | mask_11950 3015 | mask_01369 3016 | mask_07247_o2 3017 | mask_10556 3018 | mask_11419_o1 3019 | mask_01303 3020 | mask_00526 3021 | mask_09716 3022 | mask_02184 3023 | mask_06176 3024 | mask_06693_o2 3025 | mask_08386 3026 | mask_07376_o2 3027 | mask_06223_o1 3028 | mask_03018 3029 | mask_07327 3030 | mask_05092_o2 3031 | mask_06622_o1 3032 | mask_05963 3033 | mask_11456_o1 3034 | mask_03641 3035 | mask_02044 3036 | mask_03513 3037 | mask_05675_o2 3038 | mask_06790 3039 | mask_00176 3040 | mask_00434 3041 | mask_03964 3042 | mask_04225 3043 | mask_08782_o1 3044 | mask_00128_o1 3045 | mask_11398 3046 | mask_03806 3047 | mask_03300 3048 | mask_05800_o2 3049 | mask_08581_o1 3050 | mask_03526 3051 | mask_05739_o2 3052 | mask_03865 3053 | mask_07216_o1 3054 | mask_11236_o1 3055 | mask_09495 3056 | mask_02659 3057 | mask_05923 3058 | mask_06004 3059 | mask_05816_o2 3060 | mask_01549 3061 | mask_01170 3062 | mask_06024_o1 3063 | mask_04061 3064 | mask_02493 3065 | mask_03235 3066 | mask_05079_o2 3067 | mask_00346_o2 3068 | mask_11067_o2 3069 | mask_06190_o2 3070 | mask_08670_o2 3071 | mask_11610 3072 | mask_00404_o1 3073 | mask_08535 3074 | mask_09672 3075 | mask_01333 3076 | mask_07464_o2 3077 | mask_06879_o2 3078 | mask_09103_o2 3079 | mask_07155_o2 3080 | mask_09135 3081 | mask_09390 3082 | mask_07201 3083 | mask_06580 3084 | mask_06136_o1 3085 | mask_11176 3086 | mask_06726_o2 3087 | mask_08387_o1 3088 | mask_02697 3089 
| mask_08031 3090 | mask_02492 3091 | mask_00225_o2 3092 | mask_08754_o2 3093 | mask_08268 3094 | mask_00085_o2 3095 | mask_06767 3096 | mask_07403_o1 3097 | mask_11266_o1 3098 | mask_06575 3099 | mask_05363_o1 3100 | mask_10585 3101 | mask_07584 3102 | mask_05353 3103 | mask_02385 3104 | mask_05193_o1 3105 | mask_11170_o2 3106 | mask_06821_o2 3107 | mask_09101_o1 3108 | mask_06074 3109 | mask_08327 3110 | mask_04842 3111 | mask_06543_o2 3112 | mask_05817_o2 3113 | mask_10962_o2 3114 | mask_03738 3115 | mask_11139_o1 3116 | mask_01938 3117 | mask_00121_o1 3118 | mask_11028_o1 3119 | mask_08890_o2 3120 | mask_11621 3121 | mask_03332 3122 | mask_04920 3123 | mask_02913 3124 | mask_04736 3125 | mask_04407 3126 | mask_03130 3127 | mask_08032_o1 3128 | mask_09017 3129 | mask_07126_o1 3130 | mask_06152 3131 | mask_06338_o2 3132 | mask_06573_o1 3133 | mask_05853_o2 3134 | mask_05405 3135 | mask_06354 3136 | mask_01516 3137 | mask_01176 3138 | mask_05255_o1 3139 | mask_07978 3140 | mask_04340 3141 | mask_05913 3142 | mask_04170 3143 | mask_06646_o1 3144 | mask_03999 3145 | mask_08683_o2 3146 | mask_06316 3147 | mask_10463 3148 | mask_07278 3149 | mask_08008 3150 | mask_07654 3151 | mask_05320_o1 3152 | mask_10755 3153 | mask_02988 3154 | mask_08631 3155 | mask_05340 3156 | mask_03278 3157 | mask_09104 3158 | mask_08317_o1 3159 | mask_08110 3160 | mask_06068_o2 3161 | mask_05551 3162 | mask_07253_o2 3163 | mask_05706_o1 3164 | mask_07679_o1 3165 | mask_08219 3166 | mask_11562_o2 3167 | mask_08600_o1 3168 | mask_07078 3169 | mask_06447 3170 | mask_11265 3171 | mask_01112 3172 | mask_05977 3173 | mask_05858 3174 | mask_11474_o1 3175 | mask_10007 3176 | mask_08587_o2 3177 | mask_07046 3178 | mask_08845_o2 3179 | mask_06989_o2 3180 | mask_08033_o2 3181 | mask_08061_o1 3182 | mask_08299 3183 | mask_05965_o2 3184 | mask_07798 3185 | mask_06012_o2 3186 | mask_00445_o1 3187 | mask_04014 3188 | mask_06200 3189 | mask_07192_o1 3190 | mask_09418 3191 | mask_02093 3192 | mask_08329_o2 
3193 | mask_11671 3194 | mask_07923 3195 | mask_04177 3196 | mask_06218_o1 3197 | mask_00412_o1 3198 | mask_12261 3199 | mask_02831 3200 | mask_03304 3201 | mask_11905 3202 | mask_03928 3203 | mask_05505_o1 3204 | mask_05359_o1 3205 | mask_07976_o1 3206 | mask_06412_o2 3207 | mask_00046 3208 | mask_09370 3209 | mask_00432 3210 | mask_05386 3211 | mask_06608 3212 | mask_08633 3213 | mask_07143_o2 3214 | mask_08986 3215 | mask_08733 3216 | mask_05383_o1 3217 | mask_09290 3218 | mask_09040_o2 3219 | mask_10504 3220 | mask_11639_o2 3221 | mask_07256 3222 | mask_06139 3223 | mask_11210_o2 3224 | mask_06809 3225 | mask_00097_o2 3226 | mask_04732 3227 | mask_09151 3228 | mask_05579_o1 3229 | mask_10271 3230 | mask_07445_o2 3231 | mask_05136_o1 3232 | mask_03958 3233 | mask_00500_o1 3234 | mask_07228_o1 3235 | mask_05996_o2 3236 | mask_05605 3237 | mask_05173 3238 | mask_05882_o1 3239 | mask_00540 3240 | mask_08420_o2 3241 | mask_09005 3242 | mask_06179 3243 | mask_05379 3244 | -------------------------------------------------------------------------------- /VOC_data/VOC_mask/data_list/val.txt: -------------------------------------------------------------------------------- 1 | mask_02133 2 | mask_07623 3 | mask_08788 4 | mask_08298 5 | mask_06522 6 | mask_00308_o1 7 | mask_08574 8 | mask_05322_o1 9 | mask_03477 10 | mask_03196 11 | mask_09125 12 | mask_02467 13 | mask_03264 14 | mask_00013_o2 15 | mask_07531_o2 16 | mask_05619 17 | mask_02062 18 | mask_00007_o1 19 | mask_08995 20 | mask_02023 21 | mask_11169_o1 22 | mask_06767_o2 23 | mask_09065_o1 24 | mask_05517_o2 25 | mask_05929_o1 26 | mask_02319 27 | mask_11210 28 | mask_00795 29 | mask_03089 30 | mask_05320_o2 31 | mask_08374_o1 32 | mask_00002_o1 33 | mask_05401_o1 34 | mask_10221 35 | mask_05494 36 | mask_05645_o2 37 | mask_02626 38 | mask_11353 39 | mask_03890 40 | mask_01400 41 | mask_06911_o1 42 | mask_00104 43 | mask_01616 44 | mask_00704 45 | mask_09988 46 | mask_05148_o1 47 | mask_11490_o2 48 | mask_02870 
49 | mask_06574_o2 50 | mask_03330 51 | mask_00381_o1 52 | mask_00041_o2 53 | mask_04588 54 | mask_10344 55 | mask_08595 56 | mask_11412_o1 57 | mask_03808 58 | mask_08679 59 | mask_07364_o1 60 | mask_08245_o1 61 | mask_01222 62 | mask_09530 63 | mask_05499_o2 64 | mask_09443 65 | mask_10004 66 | mask_08184_o1 67 | mask_09344 68 | mask_08816_o1 69 | mask_04802 70 | mask_01397 71 | mask_11435_o1 72 | mask_04009 73 | mask_03735 74 | mask_08937 75 | mask_01567 76 | mask_03054 77 | mask_09319 78 | mask_00997 79 | mask_05411_o2 80 | mask_07375_o2 81 | mask_11880 82 | mask_06137_o1 83 | mask_08349_o2 84 | mask_03574 85 | mask_06912 86 | mask_00110_o2 87 | mask_02578 88 | mask_07279_o2 89 | mask_00248_o1 90 | mask_00906 91 | mask_03325 92 | mask_03444 93 | mask_10333 94 | mask_09008 95 | mask_12245 96 | mask_11587_o1 97 | mask_10671 98 | mask_05912_o1 99 | mask_08063 100 | mask_02758 101 | mask_00332_o2 102 | mask_11670 103 | mask_03405 104 | mask_04364 105 | mask_07151_o2 106 | mask_08252_o2 107 | mask_04597 108 | mask_01575 109 | mask_11522_o1 110 | mask_07539 111 | mask_09114 112 | mask_07654_o1 113 | mask_00038 114 | mask_06329 115 | mask_11559_o2 116 | mask_07089_o1 117 | mask_00054 118 | mask_08245 119 | mask_01919 120 | mask_09142 121 | mask_06296 122 | mask_07801_o2 123 | mask_08742 124 | mask_05743 125 | mask_04052 126 | mask_07335 127 | mask_08619_o2 128 | mask_11382 129 | mask_04064 130 | mask_11068_o1 131 | mask_08750 132 | mask_08985 133 | mask_05434_o1 134 | mask_03630 135 | mask_09031_o2 136 | mask_02449 137 | mask_02624 138 | mask_01363 139 | mask_05577_o1 140 | mask_06058_o1 141 | mask_06583_o1 142 | mask_11612_o2 143 | mask_08188_o1 144 | mask_11573 145 | mask_07847 146 | mask_00938 147 | mask_12015 148 | mask_07973_o2 149 | mask_05460 150 | mask_05399 151 | mask_01860 152 | mask_00453_o1 153 | mask_05353_o1 154 | mask_07543_o2 155 | mask_01005 156 | mask_05862 157 | mask_04135 158 | mask_05696_o1 159 | mask_08637 160 | mask_08759 161 | mask_11628 162 | 
mask_11121 163 | mask_12006 164 | mask_07366_o2 165 | mask_00027_o2 166 | mask_02356 167 | mask_03814 168 | mask_00061_o1 169 | mask_09473 170 | mask_03734 171 | mask_06445 172 | mask_00377_o1 173 | mask_11285_o1 174 | mask_06887_o1 175 | mask_00922 176 | mask_06774_o2 177 | mask_04480 178 | mask_11140 179 | mask_07778_o2 180 | mask_08334_o2 181 | mask_11444_o2 182 | mask_10039 183 | mask_11070_o1 184 | mask_11461 185 | mask_07162_o1 186 | mask_11569_o2 187 | mask_09679 188 | mask_08920_o1 189 | mask_07123_o2 190 | mask_11982 191 | mask_11206 192 | mask_05759_o1 193 | mask_06737_o2 194 | mask_04648 195 | mask_07211_o2 196 | mask_00259 197 | mask_05590_o1 198 | mask_06797 199 | mask_07945_o2 200 | mask_04684 201 | mask_08266_o1 202 | mask_06616 203 | mask_06415 204 | mask_03741 205 | mask_05965_o1 206 | mask_02630 207 | mask_05911_o1 208 | mask_03174 209 | mask_01356 210 | mask_12076 211 | mask_12217 212 | mask_07570 213 | mask_05289 214 | mask_03619 215 | mask_07476_o2 216 | mask_05866_o2 217 | mask_02733 218 | mask_10736 219 | mask_06587_o2 220 | mask_02821 221 | mask_11258_o2 222 | mask_05376 223 | mask_06368 224 | mask_09756 225 | mask_05381 226 | mask_05316 227 | mask_03171 228 | mask_08767 229 | mask_11247 230 | mask_11198_o1 231 | mask_07441_o1 232 | mask_06281_o1 233 | mask_02291 234 | mask_09419 235 | mask_10125 236 | mask_01859 237 | mask_06234_o2 238 | mask_07551_o2 239 | mask_06011_o1 240 | mask_04963 241 | mask_01406 242 | mask_07317 243 | mask_09227 244 | mask_01870 245 | mask_02265 246 | mask_07363 247 | mask_05722_o2 248 | mask_05955 249 | mask_11871 250 | mask_02138 251 | mask_10753 252 | mask_01121 253 | mask_06168_o2 254 | mask_00251_o1 255 | mask_11013_o1 256 | mask_02324 257 | mask_06922_o2 258 | mask_10998 259 | mask_00739 260 | mask_00072_o2 261 | mask_11007_o2 262 | mask_10171 263 | mask_01578 264 | mask_02439 265 | mask_04966 266 | mask_00123_o2 267 | mask_07593_o2 268 | mask_06412 269 | mask_06557_o1 270 | mask_06625_o1 271 | mask_05524 272 
| mask_07861 273 | mask_06280 274 | mask_06151 275 | mask_01685 276 | mask_10404 277 | mask_00227_o1 278 | mask_06395 279 | mask_05611 280 | mask_07290 281 | mask_02568 282 | mask_11560 283 | mask_06859_o1 284 | mask_07444_o1 285 | mask_10957 286 | mask_07816_o1 287 | mask_08983 288 | mask_12209 289 | mask_11255 290 | mask_04563 291 | mask_08279 292 | mask_03464 293 | mask_08787 294 | mask_04091 295 | mask_08990_o2 296 | mask_05327_o2 297 | mask_08827_o1 298 | mask_09094_o1 299 | mask_09427 300 | mask_03550 301 | mask_02101 302 | mask_06561_o1 303 | mask_08159 304 | mask_05188_o1 305 | mask_10965_o1 306 | mask_01592 307 | mask_05341_o2 308 | mask_06399 309 | mask_08497 310 | mask_06449 311 | mask_05068_o2 312 | mask_06359 313 | mask_05263 314 | mask_00068_o1 315 | mask_09879 316 | mask_03307 317 | mask_07218_o2 318 | mask_06175_o2 319 | mask_00121 320 | mask_05642_o1 321 | mask_01866 322 | mask_02119 323 | mask_01238 324 | mask_01257 325 | mask_11708 326 | mask_00399_o1 327 | mask_06430 328 | mask_00453 329 | mask_05924 330 | mask_07018_o2 331 | mask_11504_o2 332 | mask_06857 333 | mask_07526_o1 334 | mask_09721 335 | mask_05551_o2 336 | mask_06441_o2 337 | mask_05028 338 | mask_02442 339 | mask_01705 340 | mask_01353 341 | mask_11606 342 | mask_06689_o2 343 | mask_01665 344 | mask_11544_o1 345 | mask_06361 346 | mask_06569 347 | mask_00459 348 | mask_00149_o1 349 | mask_06498 350 | mask_02444 351 | mask_02282 352 | mask_12345 353 | mask_07303 354 | mask_00139 355 | mask_06576_o2 356 | mask_11056 357 | mask_00187_o1 358 | mask_03279 359 | mask_05784 360 | mask_03027 361 | mask_11085 362 | mask_00147_o2 363 | mask_06732_o1 364 | mask_07581_o2 365 | mask_08493 366 | mask_07830_o2 367 | mask_11425_o2 368 | mask_01322 369 | mask_06650_o2 370 | mask_00221_o1 371 | mask_04678 372 | mask_06066_o2 373 | mask_06843_o2 374 | mask_03055 375 | mask_05334_o1 376 | mask_02392 377 | mask_02540 378 | mask_08901 379 | mask_00740 380 | mask_07926_o1 381 | mask_00811 382 | mask_03698 
383 | mask_10554 384 | mask_08417 385 | mask_07373_o1 386 | mask_05324 387 | mask_09056_o1 388 | mask_08571_o2 389 | mask_02205 390 | mask_07254 391 | mask_06203_o1 392 | mask_11481_o1 393 | mask_10433 394 | mask_07610_o2 395 | mask_05971 396 | mask_01560 397 | mask_10085 398 | mask_05334 399 | mask_04810 400 | mask_03926 401 | mask_07200 402 | mask_08781 403 | mask_03322 404 | mask_09326 405 | mask_03901 406 | mask_05674_o1 407 | mask_11262 408 | mask_08110_o1 409 | mask_05294_o1 410 | mask_05482_o2 411 | mask_02828 412 | mask_10436 413 | mask_02304 414 | mask_08154 415 | mask_00843 416 | mask_07197_o1 417 | mask_07789 418 | mask_05868_o2 419 | mask_01195 420 | mask_00150 421 | mask_01999 422 | mask_09230 423 | mask_01098 424 | mask_06208 425 | mask_09048_o1 426 | mask_11009_o2 427 | mask_08725 428 | mask_05228 429 | mask_11612 430 | mask_07637 431 | mask_03039 432 | mask_08256 433 | mask_05308_o1 434 | mask_11302 435 | mask_11024_o1 436 | mask_08212 437 | mask_00864 438 | mask_01037 439 | mask_08748_o2 440 | mask_06104_o1 441 | mask_07633 442 | mask_04777 443 | mask_07048_o1 444 | mask_05717_o1 445 | mask_11145_o2 446 | mask_05541_o2 447 | mask_07158_o1 448 | mask_07009 449 | mask_05088_o1 450 | mask_10997 451 | mask_06485_o1 452 | mask_11727 453 | mask_01351 454 | mask_11632 455 | mask_07612_o1 456 | mask_07162_o2 457 | mask_06040 458 | mask_11082 459 | mask_05754_o1 460 | mask_08842_o1 461 | mask_11255_o1 462 | mask_08507 463 | mask_06226_o1 464 | mask_05275_o1 465 | mask_05245 466 | mask_07495_o2 467 | mask_09198 468 | mask_08609 469 | mask_00433_o2 470 | mask_10830 471 | mask_11063 472 | mask_08823_o1 473 | mask_08543 474 | mask_05517_o1 475 | mask_06754 476 | mask_03191 477 | mask_00276_o2 478 | mask_11114_o1 479 | mask_02614 480 | mask_00119_o1 481 | mask_05884_o2 482 | mask_06081_o1 483 | mask_05078_o1 484 | mask_10735 485 | mask_11166 486 | mask_02744 487 | mask_07332_o1 488 | mask_00067_o1 489 | mask_12317 490 | mask_07443_o2 491 | mask_11248_o1 492 | 
mask_10959 493 | mask_04161 494 | mask_03681 495 | mask_04776 496 | mask_00100_o2 497 | mask_11878 498 | mask_01557 499 | mask_05331_o1 500 | mask_05948_o2 501 | mask_11388 502 | mask_09950 503 | mask_02237 504 | mask_07777_o1 505 | mask_02529 506 | mask_11617 507 | mask_08510_o2 508 | mask_09116 509 | mask_07064_o2 510 | mask_11550_o1 511 | mask_00313 512 | mask_08142 513 | mask_07933_o2 514 | mask_08229 515 | mask_08103 516 | mask_07490 517 | mask_11599 518 | mask_03812 519 | mask_07955 520 | mask_05187_o1 521 | mask_12205 522 | mask_00347_o1 523 | mask_06686_o2 524 | mask_07202_o1 525 | mask_00084_o2 526 | mask_05951 527 | mask_07699_o1 528 | mask_06531 529 | mask_00495_o2 530 | mask_11049 531 | mask_05210_o1 532 | mask_07456_o2 533 | mask_09939 534 | mask_00229 535 | mask_03690 536 | mask_02178 537 | mask_10790 538 | mask_01740 539 | mask_11023_o2 540 | mask_06177 541 | mask_11558 542 | mask_09046_o1 543 | mask_06116_o2 544 | mask_11553_o2 545 | mask_09247 546 | mask_03793 547 | mask_02723 548 | mask_00930 549 | mask_08766_o1 550 | mask_07740 551 | mask_08468_o2 552 | mask_10613 553 | mask_02641 554 | mask_01340 555 | mask_06735 556 | mask_05303_o1 557 | mask_05587 558 | mask_08127 559 | mask_11174_o1 560 | mask_01061 561 | mask_11396 562 | mask_04558 563 | mask_07543 564 | mask_00463 565 | mask_06433_o2 566 | mask_11331 567 | mask_02067 568 | mask_12190 569 | mask_06378_o2 570 | mask_03297 571 | mask_00220 572 | mask_00033_o2 573 | mask_05541_o1 574 | mask_09796 575 | mask_05781_o1 576 | mask_11208 577 | mask_10669 578 | mask_12052 579 | mask_00419 580 | mask_11295 581 | mask_00142_o2 582 | mask_01583 583 | mask_08936 584 | mask_11151_o2 585 | mask_07285_o1 586 | mask_06232_o2 587 | mask_08859_o1 588 | mask_11540_o1 589 | mask_04905 590 | mask_06129_o2 591 | mask_05140 592 | mask_01748 593 | mask_01057 594 | mask_08300 595 | mask_12237 596 | mask_10913 597 | mask_06927_o2 598 | mask_11823 599 | mask_05263_o1 600 | mask_02642 601 | mask_01348 602 | mask_01469 
603 | mask_00405_o1 604 | mask_06196_o1 605 | mask_03017 606 | mask_00016_o2 607 | mask_11079 608 | mask_04896 609 | mask_06847_o1 610 | mask_00248 611 | mask_00425_o2 612 | mask_06048_o1 613 | mask_03995 614 | mask_04498 615 | mask_07065_o1 616 | mask_05158_o2 617 | mask_00508_o1 618 | mask_01331 619 | mask_06230_o1 620 | mask_00490_o1 621 | mask_00071_o1 622 | mask_07124_o2 623 | mask_07472_o2 624 | mask_06279 625 | mask_07223_o2 626 | mask_07020 627 | mask_08596_o1 628 | mask_06877_o2 629 | mask_03968 630 | mask_06093 631 | mask_09752 632 | mask_11289 633 | mask_11268 634 | mask_08730_o2 635 | mask_08335_o1 636 | mask_05421_o1 637 | mask_06619_o2 638 | mask_08140_o1 639 | mask_09380 640 | mask_08994 641 | mask_06864_o2 642 | mask_07022_o1 643 | mask_04801 644 | mask_05857_o1 645 | mask_05456_o2 646 | mask_06839_o1 647 | mask_07131 648 | mask_06598 649 | mask_07847_o1 650 | mask_08854_o2 651 | mask_02230 652 | mask_02182 653 | mask_01285 654 | mask_04511 655 | mask_09403 656 | mask_06515 657 | mask_08488 658 | mask_04632 659 | mask_08844_o2 660 | mask_01935 661 | mask_04006 662 | mask_04200 663 | mask_05222 664 | mask_07913_o1 665 | mask_07632 666 | mask_02709 667 | mask_07880 668 | mask_00270_o2 669 | mask_10999 670 | mask_02511 671 | mask_11546_o2 672 | mask_06941 673 | mask_01769 674 | mask_10116 675 | mask_06334 676 | mask_07496_o1 677 | mask_08408 678 | mask_11548_o2 679 | mask_05920_o2 680 | mask_02703 681 | mask_11033_o1 682 | mask_04027 683 | mask_03088 684 | mask_00238_o1 685 | mask_00883 686 | mask_04370 687 | mask_08828_o1 688 | mask_06437_o2 689 | mask_07272 690 | mask_12360 691 | mask_04813 692 | mask_06407_o1 693 | mask_00987 694 | mask_10359 695 | mask_02292 696 | mask_00106_o2 697 | mask_05457_o2 698 | mask_04554 699 | mask_09084 700 | mask_09663 701 | mask_06761_o1 702 | mask_10002 703 | mask_05708 704 | mask_05443_o2 705 | mask_06955_o1 706 | mask_00449_o1 707 | mask_02159 708 | mask_05134_o1 709 | mask_02420 710 | mask_05456 711 | mask_06635 
712 | mask_03023 713 | mask_11683 714 | mask_00319_o1 715 | mask_09590 716 | mask_08458_o1 717 | mask_07661_o1 718 | mask_11644 719 | mask_03202 720 | mask_11241 721 | mask_00064_o1 722 | mask_01056 723 | mask_02836 724 | mask_10925 725 | mask_03115 726 | mask_01852 727 | mask_12248 728 | mask_00379 729 | mask_02926 730 | mask_02039 731 | mask_08878 732 | mask_11243 733 | mask_12373 734 | mask_08997_o2 735 | mask_12002 736 | mask_08449_o1 737 | mask_08819_o2 738 | mask_00102 739 | mask_03373 740 | mask_06858_o2 741 | mask_11083 742 | mask_06805_o2 743 | mask_01256 744 | mask_00269_o2 745 | mask_01851 746 | mask_04472 747 | mask_03593 748 | mask_11252_o2 749 | mask_08430_o2 750 | mask_00659 751 | mask_04373 752 | mask_08694 753 | mask_00083_o2 754 | mask_07450_o1 755 | mask_10029 756 | mask_06777_o2 757 | mask_07906 758 | mask_07195_o2 759 | mask_00500 760 | mask_05765_o1 761 | mask_06750_o1 762 | mask_05122_o2 763 | mask_08860_o2 764 | mask_10119 765 | mask_00756 766 | mask_11244_o2 767 | mask_05816_o1 768 | mask_04306 769 | mask_00121_o2 770 | mask_06867_o2 771 | mask_11304 772 | mask_05175_o2 773 | mask_09033_o1 774 | mask_00270_o1 775 | mask_04535 776 | mask_08649 777 | mask_07328 778 | mask_08868 779 | mask_08548 780 | mask_03764 781 | mask_10454 782 | mask_06548_o2 783 | mask_04543 784 | mask_04050 785 | mask_11301 786 | mask_07354_o1 787 | mask_08912_o2 788 | mask_06365_o1 789 | mask_00197_o1 790 | mask_05415 791 | mask_09943 792 | mask_01696 793 | mask_07053_o1 794 | mask_08078 795 | mask_11021_o2 796 | mask_00017_o2 797 | mask_11636_o2 798 | mask_01442 799 | mask_03618 800 | mask_08619 801 | mask_07399 802 | mask_08490 803 | mask_02849 804 | mask_10689 805 | mask_10783 806 | mask_06721 807 | mask_09152 808 | mask_00405_o2 809 | mask_06046_o1 810 | mask_05962_o2 811 | mask_06465 812 | -------------------------------------------------------------------------------- /VOC_data/VOC_mask/data_log.py: 
-------------------------------------------------------------------------------- 1 | def start(logname): 2 | import time 3 | localtime = time.strftime("-%Y-%m-%d-%H-%M-%S", time.localtime()) 4 | filename = logname + localtime + ".txt" 5 | open(filename,"w+").close() # create the empty log file 6 | return filename 7 | 8 | def add(content,filename): 9 | with open(filename,"a") as logfile: 10 | logfile.write(content+"\n") -------------------------------------------------------------------------------- /VOC_data/VOC_mask/data_proc.py: -------------------------------------------------------------------------------- 1 | import glob 2 | import os 3 | import shutil 4 | import xml.etree.ElementTree as et 5 | import data_log 6 | 7 | def img_cmp(): 8 | imgDir = "img/" 9 | xmlDir = "label/" 10 | imgDirNew = "img_new/" 11 | xmlDirNew = "label_new/" 12 | 13 | curID = 0 14 | 15 | # print(os.path.exists("label/_2020011619353286084.xml")) 16 | 17 | for imgPath in glob.glob(imgDir+"*jpg"): 18 | onlyName = imgPath.replace("img/","").replace(".jpg","") 19 | xmlPath = xmlDir+onlyName+".xml" 20 | if os.path.exists(xmlPath): 21 | newName = "mask_"+str(curID) 22 | print(imgPath,imgDirNew+newName+".jpg") 23 | shutil.copyfile(imgPath,imgDirNew+newName+".jpg") 24 | shutil.copyfile(xmlPath,xmlDirNew+newName+".xml") 25 | curID = curID + 1 26 | # print(imgPath) 27 | 28 | def modify_text(): 29 | xmldir = "label_nomask/" 30 | 31 | log = data_log.start("modifylog") 32 | 33 | for xmlpath in glob.glob(xmldir+"*xml"): 34 | tree = et.parse(xmlpath) 35 | root = tree.getroot() 36 | all_obj = root.findall('object') 37 | for curobj in all_obj: 38 | name = curobj.find("name") 39 | if name.text == "have_mask": 40 | name.text = "havemask" 41 | print(xmlpath) 42 | elif name.text == "no_mask": 43 | name.text = "nomask" 44 | print(xmlpath) 45 | else: 46 | data_log.add(xmlpath,log) 47 | tree.write(xmlpath) 48 | 49 | def modify_tag(): 50 | xmldir = "label_nomask/" 51 | 52 | log = data_log.start("modifylog") 53 | 54 | for xmlpath in
glob.glob(xmldir+"*xml"): 55 | tree = et.parse(xmlpath) 56 | root = tree.getroot() 57 | all_obj = root.findall('object') 58 | for curobj in all_obj: 59 | name = curobj.find("Difficult") 60 | if name is not None: # some objects have no "Difficult" element 61 | name.tag = 'difficult' 62 | tree.write(xmlpath) 63 | 64 | def check(): 65 | xmldir = "label_nomask/" 66 | log = data_log.start("checklog") 67 | checklist = ["have_mask","no_mask"] 68 | for xmlpath in glob.glob(xmldir+"*xml"): 69 | for checkword in checklist: 70 | with open(xmlpath,"r") as checkfile: 71 | if checkword in checkfile.read(): 72 | data_log.add(xmlpath,log) 73 | -------------------------------------------------------------------------------- /VOC_data/VOC_mask/test_img/01.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/VOC_data/VOC_mask/test_img/01.png -------------------------------------------------------------------------------- /VOC_data/VOC_mask/test_img/01_out.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/VOC_data/VOC_mask/test_img/01_out.jpg -------------------------------------------------------------------------------- /VOC_data/VOC_mask/test_img/01_out_show.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/VOC_data/VOC_mask/test_img/01_out_show.png -------------------------------------------------------------------------------- /VOC_data/VOC_mask/test_img/02.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/VOC_data/VOC_mask/test_img/02.jpg -------------------------------------------------------------------------------- /VOC_data/VOC_mask/test_img/02_out.jpg:
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/VOC_data/VOC_mask/test_img/02_out.jpg -------------------------------------------------------------------------------- /VOC_data/VOC_mask/test_img/mask_00025_o1.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/VOC_data/VOC_mask/test_img/mask_00025_o1.jpg -------------------------------------------------------------------------------- /coco_annotation.py: -------------------------------------------------------------------------------- 1 | import json 2 | from collections import defaultdict 3 | 4 | name_box_id = defaultdict(list) 5 | id_name = dict() 6 | f = open( 7 | "mscoco2017/annotations/instances_train2017.json", 8 | encoding='utf-8') 9 | data = json.load(f) 10 | 11 | annotations = data['annotations'] 12 | for ant in annotations: 13 | id = ant['image_id'] 14 | name = 'mscoco2017/train2017/%012d.jpg' % id 15 | cat = ant['category_id'] 16 | 17 | if cat >= 1 and cat <= 11: 18 | cat = cat - 1 19 | elif cat >= 13 and cat <= 25: 20 | cat = cat - 2 21 | elif cat >= 27 and cat <= 28: 22 | cat = cat - 3 23 | elif cat >= 31 and cat <= 44: 24 | cat = cat - 5 25 | elif cat >= 46 and cat <= 65: 26 | cat = cat - 6 27 | elif cat == 67: 28 | cat = cat - 7 29 | elif cat == 70: 30 | cat = cat - 9 31 | elif cat >= 72 and cat <= 82: 32 | cat = cat - 10 33 | elif cat >= 84 and cat <= 90: 34 | cat = cat - 11 35 | 36 | name_box_id[name].append([ant['bbox'], cat]) 37 | 38 | f = open('train.txt', 'w') 39 | for key in name_box_id.keys(): 40 | f.write(key) 41 | box_infos = name_box_id[key] 42 | for info in box_infos: 43 | x_min = int(info[0][0]) 44 | y_min = int(info[0][1]) 45 | x_max = x_min + int(info[0][2]) 46 | y_max = y_min + int(info[0][3]) 47 | 48 | box_info = " %d,%d,%d,%d,%d" % ( 
49 | x_min, y_min, x_max, y_max, int(info[1])) 50 | f.write(box_info) 51 | f.write('\n') 52 | f.close() 53 | -------------------------------------------------------------------------------- /convert.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python 2 | """ 3 | Reads Darknet config and weights and creates Keras model with TF backend. 4 | 5 | """ 6 | 7 | import argparse 8 | import configparser 9 | import io 10 | import os 11 | from collections import defaultdict 12 | 13 | import numpy as np 14 | from keras import backend as K 15 | from keras.layers import (Conv2D, Input, ZeroPadding2D, Add, 16 | UpSampling2D, MaxPooling2D, Concatenate) 17 | from keras.layers.advanced_activations import LeakyReLU 18 | from keras.layers.normalization import BatchNormalization 19 | from keras.models import Model 20 | from keras.regularizers import l2 21 | from keras.utils.vis_utils import plot_model as plot 22 | 23 | 24 | parser = argparse.ArgumentParser(description='Darknet To Keras Converter.') 25 | parser.add_argument('config_path', help='Path to Darknet cfg file.') 26 | parser.add_argument('weights_path', help='Path to Darknet weights file.') 27 | parser.add_argument('output_path', help='Path to output Keras model file.') 28 | parser.add_argument( 29 | '-p', 30 | '--plot_model', 31 | help='Plot generated Keras model and save as image.', 32 | action='store_true') 33 | parser.add_argument( 34 | '-w', 35 | '--weights_only', 36 | help='Save as Keras weights file instead of model file.', 37 | action='store_true') 38 | 39 | def unique_config_sections(config_file): 40 | """Convert all config sections to have unique names. 41 | 42 | Adds unique suffixes to config sections for compatibility with configparser.
43 | """ 44 | section_counters = defaultdict(int) 45 | output_stream = io.StringIO() 46 | with open(config_file) as fin: 47 | for line in fin: 48 | if line.startswith('['): 49 | section = line.strip().strip('[]') 50 | _section = section + '_' + str(section_counters[section]) 51 | section_counters[section] += 1 52 | line = line.replace(section, _section) 53 | output_stream.write(line) 54 | output_stream.seek(0) 55 | return output_stream 56 | 57 | # %% 58 | def _main(args): 59 | config_path = os.path.expanduser(args.config_path) 60 | weights_path = os.path.expanduser(args.weights_path) 61 | assert config_path.endswith('.cfg'), '{} is not a .cfg file'.format( 62 | config_path) 63 | assert weights_path.endswith( 64 | '.weights'), '{} is not a .weights file'.format(weights_path) 65 | 66 | output_path = os.path.expanduser(args.output_path) 67 | assert output_path.endswith( 68 | '.h5'), 'output path {} is not a .h5 file'.format(output_path) 69 | output_root = os.path.splitext(output_path)[0] 70 | 71 | # Load weights and config. 
72 | print('Loading weights.') 73 | weights_file = open(weights_path, 'rb') 74 | major, minor, revision = np.ndarray( 75 | shape=(3, ), dtype='int32', buffer=weights_file.read(12)) 76 | if (major*10+minor)>=2 and major<1000 and minor<1000: 77 | seen = np.ndarray(shape=(1,), dtype='int64', buffer=weights_file.read(8)) 78 | else: 79 | seen = np.ndarray(shape=(1,), dtype='int32', buffer=weights_file.read(4)) 80 | print('Weights Header: ', major, minor, revision, seen) 81 | 82 | print('Parsing Darknet config.') 83 | unique_config_file = unique_config_sections(config_path) 84 | cfg_parser = configparser.ConfigParser() 85 | cfg_parser.read_file(unique_config_file) 86 | 87 | print('Creating Keras model.') 88 | input_layer = Input(shape=(None, None, 3)) 89 | prev_layer = input_layer 90 | all_layers = [] 91 | 92 | weight_decay = float(cfg_parser['net_0']['decay'] 93 | ) if 'net_0' in cfg_parser.sections() else 5e-4 94 | count = 0 95 | out_index = [] 96 | for section in cfg_parser.sections(): 97 | print('Parsing section {}'.format(section)) 98 | if section.startswith('convolutional'): 99 | filters = int(cfg_parser[section]['filters']) 100 | size = int(cfg_parser[section]['size']) 101 | stride = int(cfg_parser[section]['stride']) 102 | pad = int(cfg_parser[section]['pad']) 103 | activation = cfg_parser[section]['activation'] 104 | batch_normalize = 'batch_normalize' in cfg_parser[section] 105 | 106 | padding = 'same' if pad == 1 and stride == 1 else 'valid' 107 | 108 | # Setting weights. 
109 | # Darknet serializes convolutional weights as: 110 | # [bias/beta, [gamma, mean, variance], conv_weights] 111 | prev_layer_shape = K.int_shape(prev_layer) 112 | 113 | weights_shape = (size, size, prev_layer_shape[-1], filters) 114 | darknet_w_shape = (filters, weights_shape[2], size, size) 115 | weights_size = np.product(weights_shape) 116 | 117 | print('conv2d', 'bn' 118 | if batch_normalize else ' ', activation, weights_shape) 119 | 120 | conv_bias = np.ndarray( 121 | shape=(filters, ), 122 | dtype='float32', 123 | buffer=weights_file.read(filters * 4)) 124 | count += filters 125 | 126 | if batch_normalize: 127 | bn_weights = np.ndarray( 128 | shape=(3, filters), 129 | dtype='float32', 130 | buffer=weights_file.read(filters * 12)) 131 | count += 3 * filters 132 | 133 | bn_weight_list = [ 134 | bn_weights[0], # scale gamma 135 | conv_bias, # shift beta 136 | bn_weights[1], # running mean 137 | bn_weights[2] # running var 138 | ] 139 | 140 | conv_weights = np.ndarray( 141 | shape=darknet_w_shape, 142 | dtype='float32', 143 | buffer=weights_file.read(weights_size * 4)) 144 | count += weights_size 145 | 146 | # DarkNet conv_weights are serialized Caffe-style: 147 | # (out_dim, in_dim, height, width) 148 | # We would like to set these to Tensorflow order: 149 | # (height, width, in_dim, out_dim) 150 | conv_weights = np.transpose(conv_weights, [2, 3, 1, 0]) 151 | conv_weights = [conv_weights] if batch_normalize else [ 152 | conv_weights, conv_bias 153 | ] 154 | 155 | # Handle activation. 156 | act_fn = None 157 | if activation == 'leaky': 158 | pass # Add advanced activation later. 
159 | elif activation != 'linear': 160 | raise ValueError( 161 | 'Unknown activation function `{}` in section {}'.format( 162 | activation, section)) 163 | 164 | # Create Conv2D layer 165 | if stride>1: 166 | # Darknet uses left and top padding instead of 'same' mode 167 | prev_layer = ZeroPadding2D(((1,0),(1,0)))(prev_layer) 168 | conv_layer = (Conv2D( 169 | filters, (size, size), 170 | strides=(stride, stride), 171 | kernel_regularizer=l2(weight_decay), 172 | use_bias=not batch_normalize, 173 | weights=conv_weights, 174 | activation=act_fn, 175 | padding=padding))(prev_layer) 176 | 177 | if batch_normalize: 178 | conv_layer = (BatchNormalization( 179 | weights=bn_weight_list))(conv_layer) 180 | prev_layer = conv_layer 181 | 182 | if activation == 'linear': 183 | all_layers.append(prev_layer) 184 | elif activation == 'leaky': 185 | act_layer = LeakyReLU(alpha=0.1)(prev_layer) 186 | prev_layer = act_layer 187 | all_layers.append(act_layer) 188 | 189 | elif section.startswith('route'): 190 | ids = [int(i) for i in cfg_parser[section]['layers'].split(',')] 191 | layers = [all_layers[i] for i in ids] 192 | if len(layers) > 1: 193 | print('Concatenating route layers:', layers) 194 | concatenate_layer = Concatenate()(layers) 195 | all_layers.append(concatenate_layer) 196 | prev_layer = concatenate_layer 197 | else: 198 | skip_layer = layers[0] # only one layer to route 199 | all_layers.append(skip_layer) 200 | prev_layer = skip_layer 201 | 202 | elif section.startswith('maxpool'): 203 | size = int(cfg_parser[section]['size']) 204 | stride = int(cfg_parser[section]['stride']) 205 | all_layers.append( 206 | MaxPooling2D( 207 | pool_size=(size, size), 208 | strides=(stride, stride), 209 | padding='same')(prev_layer)) 210 | prev_layer = all_layers[-1] 211 | 212 | elif section.startswith('shortcut'): 213 | index = int(cfg_parser[section]['from']) 214 | activation = cfg_parser[section]['activation'] 215 | assert activation == 'linear', 'Only linear activation supported.' 
216 | all_layers.append(Add()([all_layers[index], prev_layer])) 217 | prev_layer = all_layers[-1] 218 | 219 | elif section.startswith('upsample'): 220 | stride = int(cfg_parser[section]['stride']) 221 | assert stride == 2, 'Only stride=2 supported.' 222 | all_layers.append(UpSampling2D(stride)(prev_layer)) 223 | prev_layer = all_layers[-1] 224 | 225 | elif section.startswith('yolo'): 226 | out_index.append(len(all_layers)-1) 227 | all_layers.append(None) 228 | prev_layer = all_layers[-1] 229 | 230 | elif section.startswith('net'): 231 | pass 232 | 233 | else: 234 | raise ValueError( 235 | 'Unsupported section header type: {}'.format(section)) 236 | 237 | # Create and save model. 238 | if len(out_index)==0: out_index.append(len(all_layers)-1) 239 | model = Model(inputs=input_layer, outputs=[all_layers[i] for i in out_index]) 240 | print(model.summary()) 241 | if args.weights_only: 242 | model.save_weights('{}'.format(output_path)) 243 | print('Saved Keras weights to {}'.format(output_path)) 244 | else: 245 | model.save('{}'.format(output_path)) 246 | print('Saved Keras model to {}'.format(output_path)) 247 | 248 | # Check to see if all weights have been read. 
249 | remaining_weights = len(weights_file.read()) / 4 250 | weights_file.close() 251 | print('Read {} of {} from Darknet weights.'.format(count, count + 252 | remaining_weights)) 253 | if remaining_weights > 0: 254 | print('Warning: {} unused weights'.format(remaining_weights)) 255 | 256 | if args.plot_model: 257 | plot(model, to_file='{}.png'.format(output_root), show_shapes=True) 258 | print('Saved model plot to {}.png'.format(output_root)) 259 | 260 | 261 | if __name__ == '__main__': 262 | _main(parser.parse_args()) 263 | -------------------------------------------------------------------------------- /darknet53.cfg: -------------------------------------------------------------------------------- 1 | [net] 2 | # Testing 3 | batch=1 4 | subdivisions=1 5 | # Training 6 | # batch=64 7 | # subdivisions=16 8 | width=416 9 | height=416 10 | channels=3 11 | momentum=0.9 12 | decay=0.0005 13 | angle=0 14 | saturation = 1.5 15 | exposure = 1.5 16 | hue=.1 17 | 18 | learning_rate=0.001 19 | burn_in=1000 20 | max_batches = 500200 21 | policy=steps 22 | steps=400000,450000 23 | scales=.1,.1 24 | 25 | [convolutional] 26 | batch_normalize=1 27 | filters=32 28 | size=3 29 | stride=1 30 | pad=1 31 | activation=leaky 32 | 33 | # Downsample 34 | 35 | [convolutional] 36 | batch_normalize=1 37 | filters=64 38 | size=3 39 | stride=2 40 | pad=1 41 | activation=leaky 42 | 43 | [convolutional] 44 | batch_normalize=1 45 | filters=32 46 | size=1 47 | stride=1 48 | pad=1 49 | activation=leaky 50 | 51 | [convolutional] 52 | batch_normalize=1 53 | filters=64 54 | size=3 55 | stride=1 56 | pad=1 57 | activation=leaky 58 | 59 | [shortcut] 60 | from=-3 61 | activation=linear 62 | 63 | # Downsample 64 | 65 | [convolutional] 66 | batch_normalize=1 67 | filters=128 68 | size=3 69 | stride=2 70 | pad=1 71 | activation=leaky 72 | 73 | [convolutional] 74 | batch_normalize=1 75 | filters=64 76 | size=1 77 | stride=1 78 | pad=1 79 | activation=leaky 80 | 81 | [convolutional] 82 | batch_normalize=1 83 
| filters=128 84 | size=3 85 | stride=1 86 | pad=1 87 | activation=leaky 88 | 89 | [shortcut] 90 | from=-3 91 | activation=linear 92 | 93 | [convolutional] 94 | batch_normalize=1 95 | filters=64 96 | size=1 97 | stride=1 98 | pad=1 99 | activation=leaky 100 | 101 | [convolutional] 102 | batch_normalize=1 103 | filters=128 104 | size=3 105 | stride=1 106 | pad=1 107 | activation=leaky 108 | 109 | [shortcut] 110 | from=-3 111 | activation=linear 112 | 113 | # Downsample 114 | 115 | [convolutional] 116 | batch_normalize=1 117 | filters=256 118 | size=3 119 | stride=2 120 | pad=1 121 | activation=leaky 122 | 123 | [convolutional] 124 | batch_normalize=1 125 | filters=128 126 | size=1 127 | stride=1 128 | pad=1 129 | activation=leaky 130 | 131 | [convolutional] 132 | batch_normalize=1 133 | filters=256 134 | size=3 135 | stride=1 136 | pad=1 137 | activation=leaky 138 | 139 | [shortcut] 140 | from=-3 141 | activation=linear 142 | 143 | [convolutional] 144 | batch_normalize=1 145 | filters=128 146 | size=1 147 | stride=1 148 | pad=1 149 | activation=leaky 150 | 151 | [convolutional] 152 | batch_normalize=1 153 | filters=256 154 | size=3 155 | stride=1 156 | pad=1 157 | activation=leaky 158 | 159 | [shortcut] 160 | from=-3 161 | activation=linear 162 | 163 | [convolutional] 164 | batch_normalize=1 165 | filters=128 166 | size=1 167 | stride=1 168 | pad=1 169 | activation=leaky 170 | 171 | [convolutional] 172 | batch_normalize=1 173 | filters=256 174 | size=3 175 | stride=1 176 | pad=1 177 | activation=leaky 178 | 179 | [shortcut] 180 | from=-3 181 | activation=linear 182 | 183 | [convolutional] 184 | batch_normalize=1 185 | filters=128 186 | size=1 187 | stride=1 188 | pad=1 189 | activation=leaky 190 | 191 | [convolutional] 192 | batch_normalize=1 193 | filters=256 194 | size=3 195 | stride=1 196 | pad=1 197 | activation=leaky 198 | 199 | [shortcut] 200 | from=-3 201 | activation=linear 202 | 203 | 204 | [convolutional] 205 | batch_normalize=1 206 | filters=128 207 | 
size=1 208 | stride=1 209 | pad=1 210 | activation=leaky 211 | 212 | [convolutional] 213 | batch_normalize=1 214 | filters=256 215 | size=3 216 | stride=1 217 | pad=1 218 | activation=leaky 219 | 220 | [shortcut] 221 | from=-3 222 | activation=linear 223 | 224 | [convolutional] 225 | batch_normalize=1 226 | filters=128 227 | size=1 228 | stride=1 229 | pad=1 230 | activation=leaky 231 | 232 | [convolutional] 233 | batch_normalize=1 234 | filters=256 235 | size=3 236 | stride=1 237 | pad=1 238 | activation=leaky 239 | 240 | [shortcut] 241 | from=-3 242 | activation=linear 243 | 244 | [convolutional] 245 | batch_normalize=1 246 | filters=128 247 | size=1 248 | stride=1 249 | pad=1 250 | activation=leaky 251 | 252 | [convolutional] 253 | batch_normalize=1 254 | filters=256 255 | size=3 256 | stride=1 257 | pad=1 258 | activation=leaky 259 | 260 | [shortcut] 261 | from=-3 262 | activation=linear 263 | 264 | [convolutional] 265 | batch_normalize=1 266 | filters=128 267 | size=1 268 | stride=1 269 | pad=1 270 | activation=leaky 271 | 272 | [convolutional] 273 | batch_normalize=1 274 | filters=256 275 | size=3 276 | stride=1 277 | pad=1 278 | activation=leaky 279 | 280 | [shortcut] 281 | from=-3 282 | activation=linear 283 | 284 | # Downsample 285 | 286 | [convolutional] 287 | batch_normalize=1 288 | filters=512 289 | size=3 290 | stride=2 291 | pad=1 292 | activation=leaky 293 | 294 | [convolutional] 295 | batch_normalize=1 296 | filters=256 297 | size=1 298 | stride=1 299 | pad=1 300 | activation=leaky 301 | 302 | [convolutional] 303 | batch_normalize=1 304 | filters=512 305 | size=3 306 | stride=1 307 | pad=1 308 | activation=leaky 309 | 310 | [shortcut] 311 | from=-3 312 | activation=linear 313 | 314 | 315 | [convolutional] 316 | batch_normalize=1 317 | filters=256 318 | size=1 319 | stride=1 320 | pad=1 321 | activation=leaky 322 | 323 | [convolutional] 324 | batch_normalize=1 325 | filters=512 326 | size=3 327 | stride=1 328 | pad=1 329 | activation=leaky 330 | 331 
| [shortcut] 332 | from=-3 333 | activation=linear 334 | 335 | 336 | [convolutional] 337 | batch_normalize=1 338 | filters=256 339 | size=1 340 | stride=1 341 | pad=1 342 | activation=leaky 343 | 344 | [convolutional] 345 | batch_normalize=1 346 | filters=512 347 | size=3 348 | stride=1 349 | pad=1 350 | activation=leaky 351 | 352 | [shortcut] 353 | from=-3 354 | activation=linear 355 | 356 | 357 | [convolutional] 358 | batch_normalize=1 359 | filters=256 360 | size=1 361 | stride=1 362 | pad=1 363 | activation=leaky 364 | 365 | [convolutional] 366 | batch_normalize=1 367 | filters=512 368 | size=3 369 | stride=1 370 | pad=1 371 | activation=leaky 372 | 373 | [shortcut] 374 | from=-3 375 | activation=linear 376 | 377 | [convolutional] 378 | batch_normalize=1 379 | filters=256 380 | size=1 381 | stride=1 382 | pad=1 383 | activation=leaky 384 | 385 | [convolutional] 386 | batch_normalize=1 387 | filters=512 388 | size=3 389 | stride=1 390 | pad=1 391 | activation=leaky 392 | 393 | [shortcut] 394 | from=-3 395 | activation=linear 396 | 397 | 398 | [convolutional] 399 | batch_normalize=1 400 | filters=256 401 | size=1 402 | stride=1 403 | pad=1 404 | activation=leaky 405 | 406 | [convolutional] 407 | batch_normalize=1 408 | filters=512 409 | size=3 410 | stride=1 411 | pad=1 412 | activation=leaky 413 | 414 | [shortcut] 415 | from=-3 416 | activation=linear 417 | 418 | 419 | [convolutional] 420 | batch_normalize=1 421 | filters=256 422 | size=1 423 | stride=1 424 | pad=1 425 | activation=leaky 426 | 427 | [convolutional] 428 | batch_normalize=1 429 | filters=512 430 | size=3 431 | stride=1 432 | pad=1 433 | activation=leaky 434 | 435 | [shortcut] 436 | from=-3 437 | activation=linear 438 | 439 | [convolutional] 440 | batch_normalize=1 441 | filters=256 442 | size=1 443 | stride=1 444 | pad=1 445 | activation=leaky 446 | 447 | [convolutional] 448 | batch_normalize=1 449 | filters=512 450 | size=3 451 | stride=1 452 | pad=1 453 | activation=leaky 454 | 455 | [shortcut] 
456 | from=-3 457 | activation=linear 458 | 459 | # Downsample 460 | 461 | [convolutional] 462 | batch_normalize=1 463 | filters=1024 464 | size=3 465 | stride=2 466 | pad=1 467 | activation=leaky 468 | 469 | [convolutional] 470 | batch_normalize=1 471 | filters=512 472 | size=1 473 | stride=1 474 | pad=1 475 | activation=leaky 476 | 477 | [convolutional] 478 | batch_normalize=1 479 | filters=1024 480 | size=3 481 | stride=1 482 | pad=1 483 | activation=leaky 484 | 485 | [shortcut] 486 | from=-3 487 | activation=linear 488 | 489 | [convolutional] 490 | batch_normalize=1 491 | filters=512 492 | size=1 493 | stride=1 494 | pad=1 495 | activation=leaky 496 | 497 | [convolutional] 498 | batch_normalize=1 499 | filters=1024 500 | size=3 501 | stride=1 502 | pad=1 503 | activation=leaky 504 | 505 | [shortcut] 506 | from=-3 507 | activation=linear 508 | 509 | [convolutional] 510 | batch_normalize=1 511 | filters=512 512 | size=1 513 | stride=1 514 | pad=1 515 | activation=leaky 516 | 517 | [convolutional] 518 | batch_normalize=1 519 | filters=1024 520 | size=3 521 | stride=1 522 | pad=1 523 | activation=leaky 524 | 525 | [shortcut] 526 | from=-3 527 | activation=linear 528 | 529 | [convolutional] 530 | batch_normalize=1 531 | filters=512 532 | size=1 533 | stride=1 534 | pad=1 535 | activation=leaky 536 | 537 | [convolutional] 538 | batch_normalize=1 539 | filters=1024 540 | size=3 541 | stride=1 542 | pad=1 543 | activation=leaky 544 | 545 | [shortcut] 546 | from=-3 547 | activation=linear 548 | 549 | -------------------------------------------------------------------------------- /detect_batch.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import argparse 3 | from yolo import YOLO, detect_video 4 | from PIL import Image 5 | 6 | def detect_img(yolo): 7 | list_file = open("_mask_test.txt","r") 8 | detect_list = [i.split(" ")[0] for i in list_file.readlines()] 9 | output_dir = "./logs/output/" 10 | print("Testing on 
%d images." % len(detect_list)) 11 | cnt = 1 12 | for img in detect_list: 13 | print("%d/%d" % (cnt,len(detect_list))) 14 | cnt = cnt + 1 15 | try: 16 | image = Image.open(img) 17 | except Exception: 18 | print('Open Error! Skipping %s' % img) 19 | continue 20 | else: 21 | r_image = yolo.detect_image(image) 22 | # r_image.show() 23 | r_image.save(output_dir+img.split("/")[-1]) 24 | yolo.close_session() 25 | 26 | 27 | if __name__ == '__main__': 28 | detect_img(YOLO()) -------------------------------------------------------------------------------- /font/FiraMono-Medium.otf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/font/FiraMono-Medium.otf -------------------------------------------------------------------------------- /font/SIL Open Font License.txt: -------------------------------------------------------------------------------- 1 | Copyright (c) 2014, Mozilla Foundation https://mozilla.org/ with Reserved Font Name Fira Mono. 2 | 3 | Copyright (c) 2014, Telefonica S.A. 4 | 5 | This Font Software is licensed under the SIL Open Font License, Version 1.1. 6 | This license is copied below, and is also available with a FAQ at: http://scripts.sil.org/OFL 7 | 8 | ----------------------------------------------------------- 9 | SIL OPEN FONT LICENSE Version 1.1 - 26 February 2007 10 | ----------------------------------------------------------- 11 | 12 | PREAMBLE 13 | The goals of the Open Font License (OFL) are to stimulate worldwide development of collaborative font projects, to support the font creation efforts of academic and linguistic communities, and to provide a free and open framework in which fonts may be shared and improved in partnership with others. 14 | 15 | The OFL allows the licensed fonts to be used, studied, modified and redistributed freely as long as they are not sold by themselves.
The fonts, including any derivative works, can be bundled, embedded, redistributed and/or sold with any software provided that any reserved names are not used by derivative works. The fonts and derivatives, however, cannot be released under any other type of license. The requirement for fonts to remain under this license does not apply to any document created using the fonts or their derivatives. 16 | 17 | DEFINITIONS 18 | "Font Software" refers to the set of files released by the Copyright Holder(s) under this license and clearly marked as such. This may include source files, build scripts and documentation. 19 | 20 | "Reserved Font Name" refers to any names specified as such after the copyright statement(s). 21 | 22 | "Original Version" refers to the collection of Font Software components as distributed by the Copyright Holder(s). 23 | 24 | "Modified Version" refers to any derivative made by adding to, deleting, or substituting -- in part or in whole -- any of the components of the Original Version, by changing formats or by porting the Font Software to a new environment. 25 | 26 | "Author" refers to any designer, engineer, programmer, technical writer or other person who contributed to the Font Software. 27 | 28 | PERMISSION & CONDITIONS 29 | Permission is hereby granted, free of charge, to any person obtaining a copy of the Font Software, to use, study, copy, merge, embed, modify, redistribute, and sell modified and unmodified copies of the Font Software, subject to the following conditions: 30 | 31 | 1) Neither the Font Software nor any of its individual components, in Original or Modified Versions, may be sold by itself. 32 | 33 | 2) Original or Modified Versions of the Font Software may be bundled, redistributed and/or sold with any software, provided that each copy contains the above copyright notice and this license. 
These can be included either as stand-alone text files, human-readable headers or in the appropriate machine-readable metadata fields within text or binary files as long as those fields can be easily viewed by the user. 34 | 35 | 3) No Modified Version of the Font Software may use the Reserved Font Name(s) unless explicit written permission is granted by the corresponding Copyright Holder. This restriction only applies to the primary font name as presented to the users. 36 | 37 | 4) The name(s) of the Copyright Holder(s) or the Author(s) of the Font Software shall not be used to promote, endorse or advertise any Modified Version, except to acknowledge the contribution(s) of the Copyright Holder(s) and the Author(s) or with their explicit written permission. 38 | 39 | 5) The Font Software, modified or unmodified, in part or in whole, must be distributed entirely under this license, and must not be distributed under any other license. The requirement for fonts to remain under this license does not apply to any document created using the Font Software. 40 | 41 | TERMINATION 42 | This license becomes null and void if any of the above conditions are not met. 43 | 44 | DISCLAIMER 45 | THE FONT SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT OF COPYRIGHT, PATENT, TRADEMARK, OR OTHER RIGHT. IN NO EVENT SHALL THE COPYRIGHT HOLDER BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, INCLUDING ANY GENERAL, SPECIAL, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL DAMAGES, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF THE USE OR INABILITY TO USE THE FONT SOFTWARE OR FROM OTHER DEALINGS IN THE FONT SOFTWARE. 
-------------------------------------------------------------------------------- /kmeans.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | 3 | 4 | class YOLO_Kmeans: 5 | 6 | def __init__(self, cluster_number, filename): 7 | self.cluster_number = cluster_number 8 | self.filename = filename 9 | 10 | def iou(self, boxes, clusters): # 1 box -> k clusters 11 | n = boxes.shape[0] 12 | k = self.cluster_number 13 | 14 | box_area = boxes[:, 0] * boxes[:, 1] 15 | box_area = box_area.repeat(k) 16 | box_area = np.reshape(box_area, (n, k)) 17 | 18 | cluster_area = clusters[:, 0] * clusters[:, 1] 19 | cluster_area = np.tile(cluster_area, [1, n]) 20 | cluster_area = np.reshape(cluster_area, (n, k)) 21 | 22 | box_w_matrix = np.reshape(boxes[:, 0].repeat(k), (n, k)) 23 | cluster_w_matrix = np.reshape(np.tile(clusters[:, 0], (1, n)), (n, k)) 24 | min_w_matrix = np.minimum(cluster_w_matrix, box_w_matrix) 25 | 26 | box_h_matrix = np.reshape(boxes[:, 1].repeat(k), (n, k)) 27 | cluster_h_matrix = np.reshape(np.tile(clusters[:, 1], (1, n)), (n, k)) 28 | min_h_matrix = np.minimum(cluster_h_matrix, box_h_matrix) 29 | inter_area = np.multiply(min_w_matrix, min_h_matrix) 30 | 31 | result = inter_area / (box_area + cluster_area - inter_area) 32 | return result 33 | 34 | def avg_iou(self, boxes, clusters): 35 | accuracy = np.mean([np.max(self.iou(boxes, clusters), axis=1)]) 36 | return accuracy 37 | 38 | def kmeans(self, boxes, k, dist=np.median): 39 | box_number = boxes.shape[0] 40 | distances = np.empty((box_number, k)) 41 | last_nearest = np.zeros((box_number,)) 42 | np.random.seed() 43 | clusters = boxes[np.random.choice( 44 | box_number, k, replace=False)] # init k clusters 45 | while True: 46 | 47 | distances = 1 - self.iou(boxes, clusters) 48 | 49 | current_nearest = np.argmin(distances, axis=1) 50 | if (last_nearest == current_nearest).all(): 51 | break # clusters won't change 52 | for cluster in range(k): 53 |
clusters[cluster] = dist( # update clusters 54 | boxes[current_nearest == cluster], axis=0) 55 | 56 | last_nearest = current_nearest 57 | 58 | return clusters 59 | 60 | def result2txt(self, data): 61 | f = open("yolo_anchors.txt", 'w') 62 | row = np.shape(data)[0] 63 | for i in range(row): 64 | if i == 0: 65 | x_y = "%d,%d" % (data[i][0], data[i][1]) 66 | else: 67 | x_y = ", %d,%d" % (data[i][0], data[i][1]) 68 | f.write(x_y) 69 | f.close() 70 | 71 | def txt2boxes(self): 72 | f = open(self.filename, 'r') 73 | dataSet = [] 74 | for line in f: 75 | infos = line.split(" ") 76 | length = len(infos) 77 | for i in range(1, length): 78 | width = int(infos[i].split(",")[2]) - \ 79 | int(infos[i].split(",")[0]) 80 | height = int(infos[i].split(",")[3]) - \ 81 | int(infos[i].split(",")[1]) 82 | dataSet.append([width, height]) 83 | result = np.array(dataSet) 84 | f.close() 85 | return result 86 | 87 | def txt2clusters(self): 88 | all_boxes = self.txt2boxes() 89 | result = self.kmeans(all_boxes, k=self.cluster_number) 90 | result = result[np.lexsort(result.T[0, None])] 91 | self.result2txt(result) 92 | print("K anchors:\n {}".format(result)) 93 | print("Accuracy: {:.2f}%".format( 94 | self.avg_iou(all_boxes, result) * 100)) 95 | 96 | 97 | if __name__ == "__main__": 98 | cluster_number = 9 99 | filename = "_mask_train.txt" 100 | kmeans = YOLO_Kmeans(cluster_number, filename) 101 | kmeans.txt2clusters() 102 | -------------------------------------------------------------------------------- /model_data/coco_classes.txt: -------------------------------------------------------------------------------- 1 | nomask 2 | havemask 3 | -------------------------------------------------------------------------------- /model_data/tiny_yolo_anchors.txt: -------------------------------------------------------------------------------- 1 | 10,14, 23,27, 37,58, 81,82, 135,169, 344,319 2 | -------------------------------------------------------------------------------- /model_data/voc_classes.txt: 
-------------------------------------------------------------------------------- 1 | havemask 2 | nomask 3 | -------------------------------------------------------------------------------- /model_data/yolo_anchors.txt: -------------------------------------------------------------------------------- 1 | 17,22, 34,42, 53,63, 74,92, 104,132, 142,177, 192,244, 259,331, 381,483 -------------------------------------------------------------------------------- /train.py: -------------------------------------------------------------------------------- 1 | """ 2 | Retrain the YOLO model for your own dataset. 3 | """ 4 | 5 | import numpy as np 6 | import keras.backend as K 7 | from keras.layers import Input, Lambda 8 | from keras.models import Model 9 | from keras.optimizers import Adam 10 | from keras.callbacks import TensorBoard, ModelCheckpoint, ReduceLROnPlateau, EarlyStopping 11 | 12 | from yolo3.model import preprocess_true_boxes, yolo_body, tiny_yolo_body, yolo_loss 13 | from yolo3.utils import get_random_data 14 | 15 | 16 | def _main(): 17 | annotation_path = '_mask_train.txt' 18 | log_dir = 'logs/002_over/' 19 | classes_path = 'model_data/voc_classes.txt' 20 | anchors_path = 'model_data/yolo_anchors.txt' 21 | class_names = get_classes(classes_path) 22 | num_classes = len(class_names) 23 | anchors = get_anchors(anchors_path) 24 | 25 | input_shape = (416,416) # multiple of 32, hw 26 | 27 | is_tiny_version = len(anchors)==6 # default setting 28 | if is_tiny_version: 29 | model = create_tiny_model(input_shape, anchors, num_classes, 30 | freeze_body=2, weights_path='model_data/tiny_yolo_weights.h5') 31 | else: 32 | model = create_model(input_shape, anchors, num_classes, 33 | freeze_body=2, weights_path='model_data/yolo_weights.h5') # make sure you know what you freeze 34 | 35 | logging = TensorBoard(log_dir=log_dir) 36 | checkpoint = ModelCheckpoint(log_dir + 'ep{epoch:03d}-loss{loss:.3f}-val_loss{val_loss:.3f}.h5', 37 | monitor='val_loss', save_weights_only=True, 
save_best_only=True, period=3) 38 | reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=3, verbose=1) 39 | early_stopping = EarlyStopping(monitor='val_loss', min_delta=0, patience=10, verbose=1) 40 | 41 | val_split = 0.1 42 | with open(annotation_path) as f: 43 | lines = f.readlines() 44 | np.random.seed(10101) 45 | np.random.shuffle(lines) 46 | np.random.seed(None) 47 | num_val = int(len(lines)*val_split) 48 | num_train = len(lines) - num_val 49 | 50 | # Train with frozen layers first, to get a stable loss. 51 | # Adjust num epochs to your dataset. This step is enough to obtain a not bad model. 52 | if True: 53 | model.compile(optimizer=Adam(lr=1e-3), loss={ 54 | # use custom yolo_loss Lambda layer. 55 | 'yolo_loss': lambda y_true, y_pred: y_pred}) 56 | 57 | batch_size = 32 58 | print('Train on {} samples, val on {} samples, with batch size {}.'.format(num_train, num_val, batch_size)) 59 | model.fit_generator(data_generator_wrapper(lines[:num_train], batch_size, input_shape, anchors, num_classes), 60 | steps_per_epoch=max(1, num_train//batch_size), 61 | validation_data=data_generator_wrapper(lines[num_train:], batch_size, input_shape, anchors, num_classes), 62 | validation_steps=max(1, num_val//batch_size), 63 | epochs=100, 64 | initial_epoch=0, 65 | callbacks=[logging, checkpoint]) 66 | model.save_weights(log_dir + 'trained_weights_stage_1.h5') 67 | 68 | # Unfreeze and continue training, to fine-tune. 69 | # Train longer if the result is not good. 
70 | if True: 71 | for i in range(len(model.layers)): 72 | model.layers[i].trainable = True 73 | model.compile(optimizer=Adam(lr=1e-4), loss={'yolo_loss': lambda y_true, y_pred: y_pred}) # recompile to apply the change 74 | print('Unfreeze all of the layers.') 75 | 76 | batch_size = 32 # note that more GPU memory is required after unfreezing the body 77 | print('Train on {} samples, val on {} samples, with batch size {}.'.format(num_train, num_val, batch_size)) 78 | model.fit_generator(data_generator_wrapper(lines[:num_train], batch_size, input_shape, anchors, num_classes), 79 | steps_per_epoch=max(1, num_train//batch_size), 80 | validation_data=data_generator_wrapper(lines[num_train:], batch_size, input_shape, anchors, num_classes), 81 | validation_steps=max(1, num_val//batch_size), 82 | epochs=100, 83 | initial_epoch=100, 84 | callbacks=[logging, checkpoint, reduce_lr, early_stopping]) 85 | model.save_weights(log_dir + 'trained_weights_final.h5') 86 | 87 | # Further training if needed. 
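The fixed-seed shuffle-and-split used in `_main` above (seed 10101, `val_split = 0.1`) can be reproduced in isolation. The sketch below is a hypothetical standalone helper, not part of this repository, that mirrors that logic so the 90/10 partition stays identical across runs:

```python
import numpy as np

def split_annotations(lines, val_split=0.1, seed=10101):
    """Deterministically shuffle annotation lines and split off a validation set,
    mirroring the val_split logic in _main. Hypothetical helper for illustration."""
    lines = list(lines)
    np.random.seed(seed)
    np.random.shuffle(lines)
    np.random.seed(None)  # re-randomize global state afterwards, as _main does
    num_val = int(len(lines) * val_split)
    num_train = len(lines) - num_val
    return lines[:num_train], lines[num_train:]

train_lines, val_lines = split_annotations(
    ["img%d.jpg 0,0,10,10,0" % i for i in range(100)])
print(len(train_lines), len(val_lines))  # 90 10
```

Because the seed is fixed before the shuffle, restarting training (e.g. for the unfreeze stage) keeps the same images in the validation set.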
88 | 89 | 90 | def get_classes(classes_path): 91 | '''loads the classes''' 92 | with open(classes_path) as f: 93 | class_names = f.readlines() 94 | class_names = [c.strip() for c in class_names] 95 | return class_names 96 | 97 | def get_anchors(anchors_path): 98 | '''loads the anchors from a file''' 99 | with open(anchors_path) as f: 100 | anchors = f.readline() 101 | anchors = [float(x) for x in anchors.split(',')] 102 | return np.array(anchors).reshape(-1, 2) 103 | 104 | 105 | def create_model(input_shape, anchors, num_classes, load_pretrained=True, freeze_body=2, 106 | weights_path='model_data/yolo_weights.h5'): 107 | '''create the training model''' 108 | K.clear_session() # get a new session 109 | image_input = Input(shape=(None, None, 3)) 110 | h, w = input_shape 111 | num_anchors = len(anchors) 112 | 113 | y_true = [Input(shape=(h//{0:32, 1:16, 2:8}[l], w//{0:32, 1:16, 2:8}[l], \ 114 | num_anchors//3, num_classes+5)) for l in range(3)] 115 | 116 | model_body = yolo_body(image_input, num_anchors//3, num_classes) 117 | print('Create YOLOv3 model with {} anchors and {} classes.'.format(num_anchors, num_classes)) 118 | 119 | if load_pretrained: 120 | model_body.load_weights(weights_path, by_name=True, skip_mismatch=True) 121 | print('Load weights {}.'.format(weights_path)) 122 | if freeze_body in [1, 2]: 123 | # Freeze darknet53 body or freeze all but 3 output layers. 
124 | num = (185, len(model_body.layers)-3)[freeze_body-1] 125 | for i in range(num): model_body.layers[i].trainable = False 126 | print('Freeze the first {} layers of total {} layers.'.format(num, len(model_body.layers))) 127 | 128 | model_loss = Lambda(yolo_loss, output_shape=(1,), name='yolo_loss', 129 | arguments={'anchors': anchors, 'num_classes': num_classes, 'ignore_thresh': 0.5})( 130 | [*model_body.output, *y_true]) 131 | model = Model([model_body.input, *y_true], model_loss) 132 | 133 | return model 134 | 135 | def create_tiny_model(input_shape, anchors, num_classes, load_pretrained=True, freeze_body=2, 136 | weights_path='model_data/tiny_yolo_weights.h5'): 137 | '''create the training model, for Tiny YOLOv3''' 138 | K.clear_session() # get a new session 139 | image_input = Input(shape=(None, None, 3)) 140 | h, w = input_shape 141 | num_anchors = len(anchors) 142 | 143 | y_true = [Input(shape=(h//{0:32, 1:16}[l], w//{0:32, 1:16}[l], \ 144 | num_anchors//2, num_classes+5)) for l in range(2)] 145 | 146 | model_body = tiny_yolo_body(image_input, num_anchors//2, num_classes) 147 | print('Create Tiny YOLOv3 model with {} anchors and {} classes.'.format(num_anchors, num_classes)) 148 | 149 | if load_pretrained: 150 | model_body.load_weights(weights_path, by_name=True, skip_mismatch=True) 151 | print('Load weights {}.'.format(weights_path)) 152 | if freeze_body in [1, 2]: 153 | # Freeze the darknet body or freeze all but 2 output layers. 
154 | num = (20, len(model_body.layers)-2)[freeze_body-1] 155 | for i in range(num): model_body.layers[i].trainable = False 156 | print('Freeze the first {} layers of total {} layers.'.format(num, len(model_body.layers))) 157 | 158 | model_loss = Lambda(yolo_loss, output_shape=(1,), name='yolo_loss', 159 | arguments={'anchors': anchors, 'num_classes': num_classes, 'ignore_thresh': 0.7})( 160 | [*model_body.output, *y_true]) 161 | model = Model([model_body.input, *y_true], model_loss) 162 | 163 | return model 164 | 165 | def data_generator(annotation_lines, batch_size, input_shape, anchors, num_classes): 166 | '''data generator for fit_generator''' 167 | n = len(annotation_lines) 168 | i = 0 169 | while True: 170 | image_data = [] 171 | box_data = [] 172 | for b in range(batch_size): 173 | if i==0: 174 | np.random.shuffle(annotation_lines) 175 | image, box = get_random_data(annotation_lines[i], input_shape, random=True) 176 | image_data.append(image) 177 | box_data.append(box) 178 | i = (i+1) % n 179 | image_data = np.array(image_data) 180 | box_data = np.array(box_data) 181 | y_true = preprocess_true_boxes(box_data, input_shape, anchors, num_classes) 182 | yield [image_data, *y_true], np.zeros(batch_size) 183 | 184 | def data_generator_wrapper(annotation_lines, batch_size, input_shape, anchors, num_classes): 185 | n = len(annotation_lines) 186 | if n==0 or batch_size<=0: return None 187 | return data_generator(annotation_lines, batch_size, input_shape, anchors, num_classes) 188 | 189 | if __name__ == '__main__': 190 | _main() 191 | -------------------------------------------------------------------------------- /train_bottleneck.py: -------------------------------------------------------------------------------- 1 | """ 2 | Retrain the YOLO model for your own dataset. 
3 | """ 4 | import os 5 | import numpy as np 6 | import keras.backend as K 7 | from keras.layers import Input, Lambda 8 | from keras.models import Model 9 | from keras.optimizers import Adam 10 | from keras.callbacks import TensorBoard, ModelCheckpoint, ReduceLROnPlateau, EarlyStopping 11 | 12 | from yolo3.model import preprocess_true_boxes, yolo_body, tiny_yolo_body, yolo_loss 13 | from yolo3.utils import get_random_data 14 | 15 | 16 | def _main(): 17 | annotation_path = 'train.txt' 18 | log_dir = 'logs/000/' 19 | classes_path = 'model_data/coco_classes.txt' 20 | anchors_path = 'model_data/yolo_anchors.txt' 21 | class_names = get_classes(classes_path) 22 | num_classes = len(class_names) 23 | anchors = get_anchors(anchors_path) 24 | 25 | input_shape = (416,416) # multiple of 32, hw 26 | 27 | model, bottleneck_model, last_layer_model = create_model(input_shape, anchors, num_classes, 28 | freeze_body=2, weights_path='model_data/yolo_weights.h5') # make sure you know what you freeze 29 | 30 | logging = TensorBoard(log_dir=log_dir) 31 | checkpoint = ModelCheckpoint(log_dir + 'ep{epoch:03d}-loss{loss:.3f}-val_loss{val_loss:.3f}.h5', 32 | monitor='val_loss', save_weights_only=True, save_best_only=True, period=3) 33 | reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=3, verbose=1) 34 | early_stopping = EarlyStopping(monitor='val_loss', min_delta=0, patience=10, verbose=1) 35 | 36 | val_split = 0.1 37 | with open(annotation_path) as f: 38 | lines = f.readlines() 39 | np.random.seed(10101) 40 | np.random.shuffle(lines) 41 | np.random.seed(None) 42 | num_val = int(len(lines)*val_split) 43 | num_train = len(lines) - num_val 44 | 45 | # Train with frozen layers first, to get a stable loss. 46 | # Adjust num epochs to your dataset. This step is enough to obtain a not bad model. 
47 | if True: 48 | # perform bottleneck training 49 | if not os.path.isfile("bottlenecks.npz"): 50 | print("calculating bottlenecks") 51 | batch_size=8 52 | bottlenecks=bottleneck_model.predict_generator(data_generator_wrapper(lines, batch_size, input_shape, anchors, num_classes, random=False, verbose=True), 53 | steps=(len(lines)//batch_size)+1, max_queue_size=1) 54 | np.savez("bottlenecks.npz", bot0=bottlenecks[0], bot1=bottlenecks[1], bot2=bottlenecks[2]) 55 | 56 | # load bottleneck features from file 57 | dict_bot=np.load("bottlenecks.npz") 58 | bottlenecks_train=[dict_bot["bot0"][:num_train], dict_bot["bot1"][:num_train], dict_bot["bot2"][:num_train]] 59 | bottlenecks_val=[dict_bot["bot0"][num_train:], dict_bot["bot1"][num_train:], dict_bot["bot2"][num_train:]] 60 | 61 | # train last layers with fixed bottleneck features 62 | batch_size=8 63 | print("Training last layers with bottleneck features") 64 | print('with {} samples, val on {} samples and batch size {}.'.format(num_train, num_val, batch_size)) 65 | last_layer_model.compile(optimizer='adam', loss={'yolo_loss': lambda y_true, y_pred: y_pred}) 66 | last_layer_model.fit_generator(bottleneck_generator(lines[:num_train], batch_size, input_shape, anchors, num_classes, bottlenecks_train), 67 | steps_per_epoch=max(1, num_train//batch_size), 68 | validation_data=bottleneck_generator(lines[num_train:], batch_size, input_shape, anchors, num_classes, bottlenecks_val), 69 | validation_steps=max(1, num_val//batch_size), 70 | epochs=30, 71 | initial_epoch=0, max_queue_size=1) 72 | model.save_weights(log_dir + 'trained_weights_stage_0.h5') 73 | 74 | # train last layers with random augmented data 75 | model.compile(optimizer=Adam(lr=1e-3), loss={ 76 | # use custom yolo_loss Lambda layer. 
77 | 'yolo_loss': lambda y_true, y_pred: y_pred}) 78 | batch_size = 16 79 | print('Train on {} samples, val on {} samples, with batch size {}.'.format(num_train, num_val, batch_size)) 80 | model.fit_generator(data_generator_wrapper(lines[:num_train], batch_size, input_shape, anchors, num_classes), 81 | steps_per_epoch=max(1, num_train//batch_size), 82 | validation_data=data_generator_wrapper(lines[num_train:], batch_size, input_shape, anchors, num_classes), 83 | validation_steps=max(1, num_val//batch_size), 84 | epochs=50, 85 | initial_epoch=0, 86 | callbacks=[logging, checkpoint]) 87 | model.save_weights(log_dir + 'trained_weights_stage_1.h5') 88 | 89 | # Unfreeze and continue training, to fine-tune. 90 | # Train longer if the result is not good. 91 | if True: 92 | for i in range(len(model.layers)): 93 | model.layers[i].trainable = True 94 | model.compile(optimizer=Adam(lr=1e-4), loss={'yolo_loss': lambda y_true, y_pred: y_pred}) # recompile to apply the change 95 | print('Unfreeze all of the layers.') 96 | 97 | batch_size = 4 # note that more GPU memory is required after unfreezing the body 98 | print('Train on {} samples, val on {} samples, with batch size {}.'.format(num_train, num_val, batch_size)) 99 | model.fit_generator(data_generator_wrapper(lines[:num_train], batch_size, input_shape, anchors, num_classes), 100 | steps_per_epoch=max(1, num_train//batch_size), 101 | validation_data=data_generator_wrapper(lines[num_train:], batch_size, input_shape, anchors, num_classes), 102 | validation_steps=max(1, num_val//batch_size), 103 | epochs=100, 104 | initial_epoch=50, 105 | callbacks=[logging, checkpoint, reduce_lr, early_stopping]) 106 | model.save_weights(log_dir + 'trained_weights_final.h5') 107 | 108 | # Further training if needed. 
109 | 110 | 111 | def get_classes(classes_path): 112 | '''loads the classes''' 113 | with open(classes_path) as f: 114 | class_names = f.readlines() 115 | class_names = [c.strip() for c in class_names] 116 | return class_names 117 | 118 | def get_anchors(anchors_path): 119 | '''loads the anchors from a file''' 120 | with open(anchors_path) as f: 121 | anchors = f.readline() 122 | anchors = [float(x) for x in anchors.split(',')] 123 | return np.array(anchors).reshape(-1, 2) 124 | 125 | 126 | def create_model(input_shape, anchors, num_classes, load_pretrained=True, freeze_body=2, 127 | weights_path='model_data/yolo_weights.h5'): 128 | '''create the training model''' 129 | K.clear_session() # get a new session 130 | image_input = Input(shape=(None, None, 3)) 131 | h, w = input_shape 132 | num_anchors = len(anchors) 133 | 134 | y_true = [Input(shape=(h//{0:32, 1:16, 2:8}[l], w//{0:32, 1:16, 2:8}[l], \ 135 | num_anchors//3, num_classes+5)) for l in range(3)] 136 | 137 | model_body = yolo_body(image_input, num_anchors//3, num_classes) 138 | print('Create YOLOv3 model with {} anchors and {} classes.'.format(num_anchors, num_classes)) 139 | 140 | if load_pretrained: 141 | model_body.load_weights(weights_path, by_name=True, skip_mismatch=True) 142 | print('Load weights {}.'.format(weights_path)) 143 | if freeze_body in [1, 2]: 144 | # Freeze darknet53 body or freeze all but 3 output layers. 
145 | num = (185, len(model_body.layers)-3)[freeze_body-1] 146 | for i in range(num): model_body.layers[i].trainable = False 147 | print('Freeze the first {} layers of total {} layers.'.format(num, len(model_body.layers))) 148 | 149 | # get output of second last layers and create bottleneck model of it 150 | out1=model_body.layers[246].output 151 | out2=model_body.layers[247].output 152 | out3=model_body.layers[248].output 153 | bottleneck_model = Model([model_body.input, *y_true], [out1, out2, out3]) 154 | 155 | # create last layer model of last layers from yolo model 156 | in0 = Input(shape=bottleneck_model.output[0].shape[1:].as_list()) 157 | in1 = Input(shape=bottleneck_model.output[1].shape[1:].as_list()) 158 | in2 = Input(shape=bottleneck_model.output[2].shape[1:].as_list()) 159 | last_out0=model_body.layers[249](in0) 160 | last_out1=model_body.layers[250](in1) 161 | last_out2=model_body.layers[251](in2) 162 | model_last=Model(inputs=[in0, in1, in2], outputs=[last_out0, last_out1, last_out2]) 163 | model_loss_last =Lambda(yolo_loss, output_shape=(1,), name='yolo_loss', 164 | arguments={'anchors': anchors, 'num_classes': num_classes, 'ignore_thresh': 0.5})( 165 | [*model_last.output, *y_true]) 166 | last_layer_model = Model([in0,in1,in2, *y_true], model_loss_last) 167 | 168 | 169 | model_loss = Lambda(yolo_loss, output_shape=(1,), name='yolo_loss', 170 | arguments={'anchors': anchors, 'num_classes': num_classes, 'ignore_thresh': 0.5})( 171 | [*model_body.output, *y_true]) 172 | model = Model([model_body.input, *y_true], model_loss) 173 | 174 | return model, bottleneck_model, last_layer_model 175 | 176 | def data_generator(annotation_lines, batch_size, input_shape, anchors, num_classes, random=True, verbose=False): 177 | '''data generator for fit_generator''' 178 | n = len(annotation_lines) 179 | i = 0 180 | while True: 181 | image_data = [] 182 | box_data = [] 183 | for b in range(batch_size): 184 | if i==0 and random: 185 | np.random.shuffle(annotation_lines) 
186 | image, box = get_random_data(annotation_lines[i], input_shape, random=random) 187 | image_data.append(image) 188 | box_data.append(box) 189 | i = (i+1) % n 190 | image_data = np.array(image_data) 191 | if verbose: 192 | print("Progress: ",i,"/",n) 193 | box_data = np.array(box_data) 194 | y_true = preprocess_true_boxes(box_data, input_shape, anchors, num_classes) 195 | yield [image_data, *y_true], np.zeros(batch_size) 196 | 197 | def data_generator_wrapper(annotation_lines, batch_size, input_shape, anchors, num_classes, random=True, verbose=False): 198 | n = len(annotation_lines) 199 | if n==0 or batch_size<=0: return None 200 | return data_generator(annotation_lines, batch_size, input_shape, anchors, num_classes, random, verbose) 201 | 202 | def bottleneck_generator(annotation_lines, batch_size, input_shape, anchors, num_classes, bottlenecks): 203 | n = len(annotation_lines) 204 | i = 0 205 | while True: 206 | box_data = [] 207 | b0=np.zeros((batch_size,bottlenecks[0].shape[1],bottlenecks[0].shape[2],bottlenecks[0].shape[3])) 208 | b1=np.zeros((batch_size,bottlenecks[1].shape[1],bottlenecks[1].shape[2],bottlenecks[1].shape[3])) 209 | b2=np.zeros((batch_size,bottlenecks[2].shape[1],bottlenecks[2].shape[2],bottlenecks[2].shape[3])) 210 | for b in range(batch_size): 211 | _, box = get_random_data(annotation_lines[i], input_shape, random=False, proc_img=False) 212 | box_data.append(box) 213 | b0[b]=bottlenecks[0][i] 214 | b1[b]=bottlenecks[1][i] 215 | b2[b]=bottlenecks[2][i] 216 | i = (i+1) % n 217 | box_data = np.array(box_data) 218 | y_true = preprocess_true_boxes(box_data, input_shape, anchors, num_classes) 219 | yield [b0, b1, b2, *y_true], np.zeros(batch_size) 220 | 221 | if __name__ == '__main__': 222 | _main() 223 | -------------------------------------------------------------------------------- /voc_annotation.py: -------------------------------------------------------------------------------- 1 | import xml.etree.ElementTree as ET 2 | from os import 
getcwd 3 | 4 | sets=[('_mask', 'train'), ('_mask', 'val'), ('_mask', 'test')] 5 | 6 | # classes = ["aeroplane", "bicycle", "bird", "boat", "bottle", "bus", "car", "cat", "chair", "cow", "diningtable", "dog", "horse", "motorbike", "person", "pottedplant", "sheep", "sofa", "train", "tvmonitor"] 7 | classes = ["nomask","havemask"] 8 | 9 | 10 | def convert_annotation(year, image_id, list_file): 11 | in_file = open('VOC_data/VOC%s/label/%s.xml'%(year, image_id), encoding='utf-8') 12 | tree=ET.parse(in_file) 13 | root = tree.getroot() 14 | 15 | for obj in root.iter('object'): 16 | difficult = obj.find('difficult').text 17 | cls = obj.find('name').text 18 | if cls not in classes or int(difficult)==1: 19 | continue 20 | cls_id = classes.index(cls) 21 | xmlbox = obj.find('bndbox') 22 | b = (int(xmlbox.find('xmin').text), int(xmlbox.find('ymin').text), int(xmlbox.find('xmax').text), int(xmlbox.find('ymax').text)) 23 | list_file.write(" " + ",".join([str(a) for a in b]) + ',' + str(cls_id)) 24 | 25 | wd = getcwd() 26 | 27 | for year, image_set in sets: 28 | image_ids = open('VOC_data/VOC%s/data_list/%s.txt'%(year, image_set)).read().strip().split() 29 | list_file = open('%s_%s.txt'%(year, image_set), 'w') 30 | for image_id in image_ids: 31 | print(image_id) 32 | list_file.write('%s/VOC_data/VOC%s/img/%s.jpg'%(wd, year, image_id)) 33 | convert_annotation(year, image_id, list_file) 34 | list_file.write('\n') 35 | list_file.close() 36 | 37 | -------------------------------------------------------------------------------- /yolo.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | """ 3 | Class definition of YOLO_v3 style detection model on image and video 4 | """ 5 | 6 | import colorsys 7 | import os 8 | from timeit import default_timer as timer 9 | 10 | import numpy as np 11 | from keras import backend as K 12 | from keras.models import load_model 13 | from keras.layers import Input 14 | from PIL import Image, 
ImageFont, ImageDraw 15 | 16 | from yolo3.model import yolo_eval, yolo_body, tiny_yolo_body 17 | from yolo3.utils import letterbox_image 18 | import os 19 | from keras.utils import multi_gpu_model 20 | 21 | class YOLO(object): 22 | _defaults = { 23 | "model_path": 'model_data/trained_weights_final.h5', 24 | "anchors_path": 'model_data/yolo_anchors.txt', 25 | "classes_path": 'model_data/voc_classes.txt', 26 | "score" : 0.3, 27 | "iou" : 0.45, 28 | "model_image_size" : (416,416), 29 | "gpu_num" : 0, 30 | } 31 | 32 | @classmethod 33 | def get_defaults(cls, n): 34 | if n in cls._defaults: 35 | return cls._defaults[n] 36 | else: 37 | return "Unrecognized attribute name '" + n + "'" 38 | 39 | def __init__(self, **kwargs): 40 | self.__dict__.update(self._defaults) # set up default values 41 | self.__dict__.update(kwargs) # and update with user overrides 42 | self.class_names = self._get_class() 43 | self.anchors = self._get_anchors() 44 | self.sess = K.get_session() 45 | self.boxes, self.scores, self.classes = self.generate() 46 | 47 | def _get_class(self): 48 | classes_path = os.path.expanduser(self.classes_path) 49 | with open(classes_path) as f: 50 | class_names = f.readlines() 51 | class_names = [c.strip() for c in class_names] 52 | return class_names 53 | 54 | def _get_anchors(self): 55 | anchors_path = os.path.expanduser(self.anchors_path) 56 | with open(anchors_path) as f: 57 | anchors = f.readline() 58 | anchors = [float(x) for x in anchors.split(',')] 59 | return np.array(anchors).reshape(-1, 2) 60 | 61 | def generate(self): 62 | model_path = os.path.expanduser(self.model_path) 63 | assert model_path.endswith('.h5'), 'Keras model or weights must be a .h5 file.' 64 | 65 | # Load model, or construct model and load weights. 
66 | num_anchors = len(self.anchors) 67 | num_classes = len(self.class_names) 68 | is_tiny_version = num_anchors==6 # default setting 69 | try: 70 | self.yolo_model = load_model(model_path, compile=False) 71 | except: 72 | self.yolo_model = tiny_yolo_body(Input(shape=(None,None,3)), num_anchors//2, num_classes) \ 73 | if is_tiny_version else yolo_body(Input(shape=(None,None,3)), num_anchors//3, num_classes) 74 | self.yolo_model.load_weights(self.model_path) # make sure model, anchors and classes match 75 | else: 76 | assert self.yolo_model.layers[-1].output_shape[-1] == \ 77 | num_anchors/len(self.yolo_model.output) * (num_classes + 5), \ 78 | 'Mismatch between model and given anchor and class sizes' 79 | 80 | print('{} model, anchors, and classes loaded.'.format(model_path)) 81 | 82 | # Generate colors for drawing bounding boxes. 83 | hsv_tuples = [(x / len(self.class_names), 1., 1.) 84 | for x in range(len(self.class_names))] 85 | self.colors = list(map(lambda x: colorsys.hsv_to_rgb(*x), hsv_tuples)) 86 | self.colors = list( 87 | map(lambda x: (int(x[0] * 255), int(x[1] * 255), int(x[2] * 255)), 88 | self.colors)) 89 | np.random.seed(10101) # Fixed seed for consistent colors across runs. 90 | np.random.shuffle(self.colors) # Shuffle colors to decorrelate adjacent classes. 91 | np.random.seed(None) # Reset seed to default. 92 | 93 | # Generate output tensor targets for filtered bounding boxes. 
94 | self.input_image_shape = K.placeholder(shape=(2, )) 95 | if self.gpu_num>=2: 96 | self.yolo_model = multi_gpu_model(self.yolo_model, gpus=self.gpu_num) 97 | boxes, scores, classes = yolo_eval(self.yolo_model.output, self.anchors, 98 | len(self.class_names), self.input_image_shape, 99 | score_threshold=self.score, iou_threshold=self.iou) 100 | return boxes, scores, classes 101 | 102 | def detect_image(self, image): 103 | start = timer() 104 | 105 | if self.model_image_size != (None, None): 106 | assert self.model_image_size[0]%32 == 0, 'Multiples of 32 required' 107 | assert self.model_image_size[1]%32 == 0, 'Multiples of 32 required' 108 | boxed_image = letterbox_image(image, tuple(reversed(self.model_image_size))) 109 | else: 110 | new_image_size = (image.width - (image.width % 32), 111 | image.height - (image.height % 32)) 112 | boxed_image = letterbox_image(image, new_image_size) 113 | image_data = np.array(boxed_image, dtype='float32') 114 | 115 | print(image_data.shape) 116 | image_data /= 255. 117 | image_data = np.expand_dims(image_data, 0) # Add batch dimension. 
118 | 119 | out_boxes, out_scores, out_classes = self.sess.run( 120 | [self.boxes, self.scores, self.classes], 121 | feed_dict={ 122 | self.yolo_model.input: image_data, 123 | self.input_image_shape: [image.size[1], image.size[0]], 124 | K.learning_phase(): 0 125 | }) 126 | 127 | print('Found {} boxes for {}'.format(len(out_boxes), 'img')) 128 | 129 | font = ImageFont.truetype(font='font/FiraMono-Medium.otf', 130 | size=np.floor(3e-2 * image.size[1] + 0.5).astype('int32')) 131 | thickness = (image.size[0] + image.size[1]) // 300 132 | 133 | for i, c in reversed(list(enumerate(out_classes))): 134 | predicted_class = self.class_names[c] 135 | box = out_boxes[i] 136 | score = out_scores[i] 137 | 138 | label = '{} {:.2f}'.format(predicted_class, score) 139 | draw = ImageDraw.Draw(image) 140 | label_size = draw.textsize(label, font) 141 | 142 | top, left, bottom, right = box 143 | top = max(0, np.floor(top + 0.5).astype('int32')) 144 | left = max(0, np.floor(left + 0.5).astype('int32')) 145 | bottom = min(image.size[1], np.floor(bottom + 0.5).astype('int32')) 146 | right = min(image.size[0], np.floor(right + 0.5).astype('int32')) 147 | print(label, (left, top), (right, bottom)) 148 | 149 | if top - label_size[1] >= 0: 150 | text_origin = np.array([left, top - label_size[1]]) 151 | else: 152 | text_origin = np.array([left, top + 1]) 153 | 154 | # My kingdom for a good redistributable image drawing library. 
155 | for i in range(thickness): 156 | draw.rectangle( 157 | [left + i, top + i, right - i, bottom - i], 158 | outline=self.colors[c]) 159 | draw.rectangle( 160 | [tuple(text_origin), tuple(text_origin + label_size)], 161 | fill=self.colors[c]) 162 | draw.text(text_origin, label, fill=(0, 0, 0), font=font) 163 | del draw 164 | 165 | end = timer() 166 | print(end - start) 167 | return image 168 | 169 | def close_session(self): 170 | self.sess.close() 171 | 172 | def detect_video(yolo, video_path, output_path=""): 173 | import cv2 174 | vid = cv2.VideoCapture(video_path) 175 | if not vid.isOpened(): 176 | raise IOError("Couldn't open webcam or video") 177 | video_FourCC = int(vid.get(cv2.CAP_PROP_FOURCC)) 178 | video_fps = vid.get(cv2.CAP_PROP_FPS) 179 | video_size = (int(vid.get(cv2.CAP_PROP_FRAME_WIDTH)), 180 | int(vid.get(cv2.CAP_PROP_FRAME_HEIGHT))) 181 | isOutput = True if output_path != "" else False 182 | if isOutput: 183 | print("!!! TYPE:", type(output_path), type(video_FourCC), type(video_fps), type(video_size)) 184 | out = cv2.VideoWriter(output_path, video_FourCC, video_fps, video_size) 185 | accum_time = 0 186 | curr_fps = 0 187 | fps = "FPS: ??" 
188 | prev_time = timer() 189 | while True: 190 | return_value, frame = vid.read() 191 | image = Image.fromarray(frame) 192 | image = yolo.detect_image(image) 193 | result = np.asarray(image) 194 | curr_time = timer() 195 | exec_time = curr_time - prev_time 196 | prev_time = curr_time 197 | accum_time = accum_time + exec_time 198 | curr_fps = curr_fps + 1 199 | if accum_time > 1: 200 | accum_time = accum_time - 1 201 | fps = "FPS: " + str(curr_fps) 202 | curr_fps = 0 203 | cv2.putText(result, text=fps, org=(3, 15), fontFace=cv2.FONT_HERSHEY_SIMPLEX, 204 | fontScale=0.50, color=(255, 0, 0), thickness=2) 205 | cv2.namedWindow("result", cv2.WINDOW_NORMAL) 206 | cv2.imshow("result", result) 207 | if isOutput: 208 | out.write(result) 209 | if cv2.waitKey(1) & 0xFF == ord('q'): 210 | break 211 | yolo.close_session() 212 | 213 | -------------------------------------------------------------------------------- /yolo3/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/yolo3/__init__.py -------------------------------------------------------------------------------- /yolo3/__pycache__/__init__.cpython-35.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/yolo3/__pycache__/__init__.cpython-35.pyc -------------------------------------------------------------------------------- /yolo3/__pycache__/__init__.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/yolo3/__pycache__/__init__.cpython-36.pyc -------------------------------------------------------------------------------- /yolo3/__pycache__/model.cpython-35.pyc: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/yolo3/__pycache__/model.cpython-35.pyc -------------------------------------------------------------------------------- /yolo3/__pycache__/model.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/yolo3/__pycache__/model.cpython-36.pyc -------------------------------------------------------------------------------- /yolo3/__pycache__/utils.cpython-35.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/yolo3/__pycache__/utils.cpython-35.pyc -------------------------------------------------------------------------------- /yolo3/__pycache__/utils.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ztjryg4/MaskDetect/bc625cd95cfbfc9a228f0d57063c0ae18c88d88c/yolo3/__pycache__/utils.cpython-36.pyc -------------------------------------------------------------------------------- /yolo3/model.py: -------------------------------------------------------------------------------- 1 | """YOLO_v3 Model Defined in Keras.""" 2 | 3 | from functools import wraps 4 | 5 | import numpy as np 6 | import tensorflow as tf 7 | from keras import backend as K 8 | from keras.layers import Conv2D, Add, ZeroPadding2D, UpSampling2D, Concatenate, MaxPooling2D 9 | from keras.layers.advanced_activations import LeakyReLU 10 | from keras.layers.normalization import BatchNormalization 11 | from keras.models import Model 12 | from keras.regularizers import l2 13 | 14 | from yolo3.utils import compose 15 | 16 | 17 | @wraps(Conv2D) 18 | def DarknetConv2D(*args, **kwargs): 19 | """Wrapper to set 
Darknet parameters for Convolution2D."""
20 |     darknet_conv_kwargs = {'kernel_regularizer': l2(5e-4)}
21 |     darknet_conv_kwargs['padding'] = 'valid' if kwargs.get('strides')==(2,2) else 'same'
22 |     darknet_conv_kwargs.update(kwargs)
23 |     return Conv2D(*args, **darknet_conv_kwargs)
24 | 
25 | def DarknetConv2D_BN_Leaky(*args, **kwargs):
26 |     """Darknet Convolution2D followed by BatchNormalization and LeakyReLU."""
27 |     no_bias_kwargs = {'use_bias': False}
28 |     no_bias_kwargs.update(kwargs)
29 |     return compose(
30 |         DarknetConv2D(*args, **no_bias_kwargs),
31 |         BatchNormalization(),
32 |         LeakyReLU(alpha=0.1))
33 | 
34 | def resblock_body(x, num_filters, num_blocks):
35 |     '''A series of resblocks starting with a downsampling Convolution2D'''
36 |     # Darknet uses left and top padding instead of 'same' mode
37 |     x = ZeroPadding2D(((1,0),(1,0)))(x)
38 |     x = DarknetConv2D_BN_Leaky(num_filters, (3,3), strides=(2,2))(x)
39 |     for i in range(num_blocks):
40 |         y = compose(
41 |                 DarknetConv2D_BN_Leaky(num_filters//2, (1,1)),
42 |                 DarknetConv2D_BN_Leaky(num_filters, (3,3)))(x)
43 |         x = Add()([x,y])
44 |     return x
45 | 
46 | def darknet_body(x):
47 |     '''Darknet body having 52 Convolution2D layers'''
48 |     x = DarknetConv2D_BN_Leaky(32, (3,3))(x)
49 |     x = resblock_body(x, 64, 1)
50 |     x = resblock_body(x, 128, 2)
51 |     x = resblock_body(x, 256, 8)
52 |     x = resblock_body(x, 512, 8)
53 |     x = resblock_body(x, 1024, 4)
54 |     return x
55 | 
56 | def make_last_layers(x, num_filters, out_filters):
57 |     '''6 Conv2D_BN_Leaky layers followed by a Conv2D_linear layer'''
58 |     x = compose(
59 |             DarknetConv2D_BN_Leaky(num_filters, (1,1)),
60 |             DarknetConv2D_BN_Leaky(num_filters*2, (3,3)),
61 |             DarknetConv2D_BN_Leaky(num_filters, (1,1)),
62 |             DarknetConv2D_BN_Leaky(num_filters*2, (3,3)),
63 |             DarknetConv2D_BN_Leaky(num_filters, (1,1)))(x)
64 |     y = compose(
65 |             DarknetConv2D_BN_Leaky(num_filters*2, (3,3)),
66 |             DarknetConv2D(out_filters, (1,1)))(x)
67 |     return x, y
68 | 
69 | 
70 | def yolo_body(inputs,
num_anchors, num_classes): 71 | """Create YOLO_V3 model CNN body in Keras.""" 72 | darknet = Model(inputs, darknet_body(inputs)) 73 | x, y1 = make_last_layers(darknet.output, 512, num_anchors*(num_classes+5)) 74 | 75 | x = compose( 76 | DarknetConv2D_BN_Leaky(256, (1,1)), 77 | UpSampling2D(2))(x) 78 | x = Concatenate()([x,darknet.layers[152].output]) 79 | x, y2 = make_last_layers(x, 256, num_anchors*(num_classes+5)) 80 | 81 | x = compose( 82 | DarknetConv2D_BN_Leaky(128, (1,1)), 83 | UpSampling2D(2))(x) 84 | x = Concatenate()([x,darknet.layers[92].output]) 85 | x, y3 = make_last_layers(x, 128, num_anchors*(num_classes+5)) 86 | 87 | return Model(inputs, [y1,y2,y3]) 88 | 89 | def tiny_yolo_body(inputs, num_anchors, num_classes): 90 | '''Create Tiny YOLO_v3 model CNN body in keras.''' 91 | x1 = compose( 92 | DarknetConv2D_BN_Leaky(16, (3,3)), 93 | MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='same'), 94 | DarknetConv2D_BN_Leaky(32, (3,3)), 95 | MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='same'), 96 | DarknetConv2D_BN_Leaky(64, (3,3)), 97 | MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='same'), 98 | DarknetConv2D_BN_Leaky(128, (3,3)), 99 | MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='same'), 100 | DarknetConv2D_BN_Leaky(256, (3,3)))(inputs) 101 | x2 = compose( 102 | MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='same'), 103 | DarknetConv2D_BN_Leaky(512, (3,3)), 104 | MaxPooling2D(pool_size=(2,2), strides=(1,1), padding='same'), 105 | DarknetConv2D_BN_Leaky(1024, (3,3)), 106 | DarknetConv2D_BN_Leaky(256, (1,1)))(x1) 107 | y1 = compose( 108 | DarknetConv2D_BN_Leaky(512, (3,3)), 109 | DarknetConv2D(num_anchors*(num_classes+5), (1,1)))(x2) 110 | 111 | x2 = compose( 112 | DarknetConv2D_BN_Leaky(128, (1,1)), 113 | UpSampling2D(2))(x2) 114 | y2 = compose( 115 | Concatenate(), 116 | DarknetConv2D_BN_Leaky(256, (3,3)), 117 | DarknetConv2D(num_anchors*(num_classes+5), (1,1)))([x2,x1]) 118 | 119 | return Model(inputs, [y1,y2]) 120 | 121 | 
122 | def yolo_head(feats, anchors, num_classes, input_shape, calc_loss=False):
123 |     """Convert final layer features to bounding box parameters."""
124 |     num_anchors = len(anchors)
125 |     # Reshape to batch, height, width, num_anchors, box_params.
126 |     anchors_tensor = K.reshape(K.constant(anchors), [1, 1, 1, num_anchors, 2])
127 | 
128 |     grid_shape = K.shape(feats)[1:3] # height, width
129 |     grid_y = K.tile(K.reshape(K.arange(0, stop=grid_shape[0]), [-1, 1, 1, 1]),
130 |         [1, grid_shape[1], 1, 1])
131 |     grid_x = K.tile(K.reshape(K.arange(0, stop=grid_shape[1]), [1, -1, 1, 1]),
132 |         [grid_shape[0], 1, 1, 1])
133 |     grid = K.concatenate([grid_x, grid_y])
134 |     grid = K.cast(grid, K.dtype(feats))
135 | 
136 |     feats = K.reshape(
137 |         feats, [-1, grid_shape[0], grid_shape[1], num_anchors, num_classes + 5])
138 | 
139 |     # Adjust predictions to each spatial grid point and anchor size.
140 |     box_xy = (K.sigmoid(feats[..., :2]) + grid) / K.cast(grid_shape[::-1], K.dtype(feats))
141 |     box_wh = K.exp(feats[..., 2:4]) * anchors_tensor / K.cast(input_shape[::-1], K.dtype(feats))
142 |     box_confidence = K.sigmoid(feats[..., 4:5])
143 |     box_class_probs = K.sigmoid(feats[..., 5:])
144 | 
145 |     if calc_loss:
146 |         return grid, feats, box_xy, box_wh
147 |     return box_xy, box_wh, box_confidence, box_class_probs
148 | 
149 | 
150 | def yolo_correct_boxes(box_xy, box_wh, input_shape, image_shape):
151 |     '''Get corrected boxes'''
152 |     box_yx = box_xy[..., ::-1]
153 |     box_hw = box_wh[..., ::-1]
154 |     input_shape = K.cast(input_shape, K.dtype(box_yx))
155 |     image_shape = K.cast(image_shape, K.dtype(box_yx))
156 |     new_shape = K.round(image_shape * K.min(input_shape/image_shape))
157 |     offset = (input_shape-new_shape)/2./input_shape
158 |     scale = input_shape/new_shape
159 |     box_yx = (box_yx - offset) * scale
160 |     box_hw *= scale
161 | 
162 |     box_mins = box_yx - (box_hw / 2.)
163 |     box_maxes = box_yx + (box_hw / 2.)
164 | boxes = K.concatenate([ 165 | box_mins[..., 0:1], # y_min 166 | box_mins[..., 1:2], # x_min 167 | box_maxes[..., 0:1], # y_max 168 | box_maxes[..., 1:2] # x_max 169 | ]) 170 | 171 | # Scale boxes back to original image shape. 172 | boxes *= K.concatenate([image_shape, image_shape]) 173 | return boxes 174 | 175 | 176 | def yolo_boxes_and_scores(feats, anchors, num_classes, input_shape, image_shape): 177 | '''Process Conv layer output''' 178 | box_xy, box_wh, box_confidence, box_class_probs = yolo_head(feats, 179 | anchors, num_classes, input_shape) 180 | boxes = yolo_correct_boxes(box_xy, box_wh, input_shape, image_shape) 181 | boxes = K.reshape(boxes, [-1, 4]) 182 | box_scores = box_confidence * box_class_probs 183 | box_scores = K.reshape(box_scores, [-1, num_classes]) 184 | return boxes, box_scores 185 | 186 | 187 | def yolo_eval(yolo_outputs, 188 | anchors, 189 | num_classes, 190 | image_shape, 191 | max_boxes=20, 192 | score_threshold=.6, 193 | iou_threshold=.5): 194 | """Evaluate YOLO model on given input and return filtered boxes.""" 195 | num_layers = len(yolo_outputs) 196 | anchor_mask = [[6,7,8], [3,4,5], [0,1,2]] if num_layers==3 else [[3,4,5], [1,2,3]] # default setting 197 | input_shape = K.shape(yolo_outputs[0])[1:3] * 32 198 | boxes = [] 199 | box_scores = [] 200 | for l in range(num_layers): 201 | _boxes, _box_scores = yolo_boxes_and_scores(yolo_outputs[l], 202 | anchors[anchor_mask[l]], num_classes, input_shape, image_shape) 203 | boxes.append(_boxes) 204 | box_scores.append(_box_scores) 205 | boxes = K.concatenate(boxes, axis=0) 206 | box_scores = K.concatenate(box_scores, axis=0) 207 | 208 | mask = box_scores >= score_threshold 209 | max_boxes_tensor = K.constant(max_boxes, dtype='int32') 210 | boxes_ = [] 211 | scores_ = [] 212 | classes_ = [] 213 | for c in range(num_classes): 214 | # TODO: use keras backend instead of tf. 
215 |         class_boxes = tf.boolean_mask(boxes, mask[:, c])
216 |         class_box_scores = tf.boolean_mask(box_scores[:, c], mask[:, c])
217 |         nms_index = tf.image.non_max_suppression(
218 |             class_boxes, class_box_scores, max_boxes_tensor, iou_threshold=iou_threshold)
219 |         class_boxes = K.gather(class_boxes, nms_index)
220 |         class_box_scores = K.gather(class_box_scores, nms_index)
221 |         classes = K.ones_like(class_box_scores, 'int32') * c
222 |         boxes_.append(class_boxes)
223 |         scores_.append(class_box_scores)
224 |         classes_.append(classes)
225 |     boxes_ = K.concatenate(boxes_, axis=0)
226 |     scores_ = K.concatenate(scores_, axis=0)
227 |     classes_ = K.concatenate(classes_, axis=0)
228 | 
229 |     return boxes_, scores_, classes_
230 | 
231 | 
232 | def preprocess_true_boxes(true_boxes, input_shape, anchors, num_classes):
233 |     '''Preprocess true boxes to training input format
234 | 
235 |     Parameters
236 |     ----------
237 |     true_boxes: array, shape=(m, T, 5)
238 |         Absolute x_min, y_min, x_max, y_max, class_id relative to input_shape.
239 |     input_shape: array-like, hw, multiples of 32
240 |     anchors: array, shape=(N, 2), wh
241 |     num_classes: integer
242 | 
243 |     Returns
244 |     -------
245 |     y_true: list of array, shape like yolo_outputs, xywh are relative values
246 | 
247 |     '''
248 |     assert (true_boxes[..., 4]<num_classes).all(), 'class id must be less than num_classes'
249 |     num_layers = len(anchors)//3 # default setting
250 |     anchor_mask = [[6,7,8], [3,4,5], [0,1,2]] if num_layers==3 else [[3,4,5], [1,2,3]]
251 | 
252 |     true_boxes = np.array(true_boxes, dtype='float32')
253 |     input_shape = np.array(input_shape, dtype='int32')
254 |     boxes_xy = (true_boxes[..., 0:2] + true_boxes[..., 2:4]) // 2
255 |     boxes_wh = true_boxes[..., 2:4] - true_boxes[..., 0:2]
256 |     true_boxes[..., 0:2] = boxes_xy/input_shape[::-1]
257 |     true_boxes[..., 2:4] = boxes_wh/input_shape[::-1]
258 | 
259 |     m = true_boxes.shape[0]
260 |     grid_shapes = [input_shape//{0:32, 1:16, 2:8}[l] for l in range(num_layers)]
261 |     y_true = [np.zeros((m,grid_shapes[l][0],grid_shapes[l][1],len(anchor_mask[l]),5+num_classes),
262 |         dtype='float32') for l in range(num_layers)]
263 | 
264 |     # Expand dim to apply broadcasting.
265 |     anchors = np.expand_dims(anchors, 0)
266 |     anchor_maxes = anchors / 2.
267 |     anchor_mins = -anchor_maxes
268 |     valid_mask = boxes_wh[..., 0]>0
269 | 
270 |     for b in range(m):
271 |         # Discard zero rows.
272 |         wh = boxes_wh[b, valid_mask[b]]
273 |         if len(wh)==0: continue
274 |         # Expand dim to apply broadcasting.
275 |         wh = np.expand_dims(wh, -2)
276 |         box_maxes = wh / 2.
277 |         box_mins = -box_maxes
278 | 
279 |         intersect_mins = np.maximum(box_mins, anchor_mins)
280 |         intersect_maxes = np.minimum(box_maxes, anchor_maxes)
281 |         intersect_wh = np.maximum(intersect_maxes - intersect_mins, 0.)
282 | intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1] 283 | box_area = wh[..., 0] * wh[..., 1] 284 | anchor_area = anchors[..., 0] * anchors[..., 1] 285 | iou = intersect_area / (box_area + anchor_area - intersect_area) 286 | 287 | # Find best anchor for each true box 288 | best_anchor = np.argmax(iou, axis=-1) 289 | 290 | for t, n in enumerate(best_anchor): 291 | for l in range(num_layers): 292 | if n in anchor_mask[l]: 293 | i = np.floor(true_boxes[b,t,0]*grid_shapes[l][1]).astype('int32') 294 | j = np.floor(true_boxes[b,t,1]*grid_shapes[l][0]).astype('int32') 295 | k = anchor_mask[l].index(n) 296 | c = true_boxes[b,t, 4].astype('int32') 297 | y_true[l][b, j, i, k, 0:4] = true_boxes[b,t, 0:4] 298 | y_true[l][b, j, i, k, 4] = 1 299 | y_true[l][b, j, i, k, 5+c] = 1 300 | 301 | return y_true 302 | 303 | 304 | def box_iou(b1, b2): 305 | '''Return iou tensor 306 | 307 | Parameters 308 | ---------- 309 | b1: tensor, shape=(i1,...,iN, 4), xywh 310 | b2: tensor, shape=(j, 4), xywh 311 | 312 | Returns 313 | ------- 314 | iou: tensor, shape=(i1,...,iN, j) 315 | 316 | ''' 317 | 318 | # Expand dim to apply broadcasting. 319 | b1 = K.expand_dims(b1, -2) 320 | b1_xy = b1[..., :2] 321 | b1_wh = b1[..., 2:4] 322 | b1_wh_half = b1_wh/2. 323 | b1_mins = b1_xy - b1_wh_half 324 | b1_maxes = b1_xy + b1_wh_half 325 | 326 | # Expand dim to apply broadcasting. 327 | b2 = K.expand_dims(b2, 0) 328 | b2_xy = b2[..., :2] 329 | b2_wh = b2[..., 2:4] 330 | b2_wh_half = b2_wh/2. 331 | b2_mins = b2_xy - b2_wh_half 332 | b2_maxes = b2_xy + b2_wh_half 333 | 334 | intersect_mins = K.maximum(b1_mins, b2_mins) 335 | intersect_maxes = K.minimum(b1_maxes, b2_maxes) 336 | intersect_wh = K.maximum(intersect_maxes - intersect_mins, 0.) 
337 |     intersect_area = intersect_wh[..., 0] * intersect_wh[..., 1]
338 |     b1_area = b1_wh[..., 0] * b1_wh[..., 1]
339 |     b2_area = b2_wh[..., 0] * b2_wh[..., 1]
340 |     iou = intersect_area / (b1_area + b2_area - intersect_area)
341 | 
342 |     return iou
343 | 
344 | 
345 | def yolo_loss(args, anchors, num_classes, ignore_thresh=.5, print_loss=False):
346 |     '''Return yolo_loss tensor
347 | 
348 |     Parameters
349 |     ----------
350 |     yolo_outputs: list of tensor, the output of yolo_body or tiny_yolo_body
351 |     y_true: list of array, the output of preprocess_true_boxes
352 |     anchors: array, shape=(N, 2), wh
353 |     num_classes: integer
354 |     ignore_thresh: float, IoU threshold above which a box's object confidence loss is ignored
355 | 
356 |     Returns
357 |     -------
358 |     loss: tensor, shape=(1,)
359 | 
360 |     '''
361 |     num_layers = len(anchors)//3 # default setting
362 |     yolo_outputs = args[:num_layers]
363 |     y_true = args[num_layers:]
364 |     anchor_mask = [[6,7,8], [3,4,5], [0,1,2]] if num_layers==3 else [[3,4,5], [1,2,3]]
365 |     input_shape = K.cast(K.shape(yolo_outputs[0])[1:3] * 32, K.dtype(y_true[0]))
366 |     grid_shapes = [K.cast(K.shape(yolo_outputs[l])[1:3], K.dtype(y_true[0])) for l in range(num_layers)]
367 |     loss = 0
368 |     m = K.shape(yolo_outputs[0])[0] # batch size, tensor
369 |     mf = K.cast(m, K.dtype(yolo_outputs[0]))
370 | 
371 |     for l in range(num_layers):
372 |         object_mask = y_true[l][..., 4:5]
373 |         true_class_probs = y_true[l][..., 5:]
374 | 
375 |         grid, raw_pred, pred_xy, pred_wh = yolo_head(yolo_outputs[l],
376 |              anchors[anchor_mask[l]], num_classes, input_shape, calc_loss=True)
377 |         pred_box = K.concatenate([pred_xy, pred_wh])
378 | 
379 |         # Darknet raw box to calculate loss.
380 |         raw_true_xy = y_true[l][..., :2]*grid_shapes[l][::-1] - grid
381 |         raw_true_wh = K.log(y_true[l][..., 2:4] / anchors[anchor_mask[l]] * input_shape[::-1])
382 |         raw_true_wh = K.switch(object_mask, raw_true_wh, K.zeros_like(raw_true_wh)) # avoid log(0)=-inf
383 |         box_loss_scale = 2 - y_true[l][...,2:3]*y_true[l][...,3:4]
384 | 
385 |         # Find ignore mask, iterate over each of batch.
386 |         ignore_mask = tf.TensorArray(K.dtype(y_true[0]), size=1, dynamic_size=True)
387 |         object_mask_bool = K.cast(object_mask, 'bool')
388 |         def loop_body(b, ignore_mask):
389 |             true_box = tf.boolean_mask(y_true[l][b,...,0:4], object_mask_bool[b,...,0])
390 |             iou = box_iou(pred_box[b], true_box)
391 |             best_iou = K.max(iou, axis=-1)
392 |             ignore_mask = ignore_mask.write(b, K.cast(best_iou<ignore_thresh, K.dtype(true_box)))
393 |             return b+1, ignore_mask
394 |         _, ignore_mask = K.control_flow_ops.while_loop(lambda b,*args: b<m, loop_body, [0, ignore_mask])
395 |         ignore_mask = ignore_mask.stack()
396 |         ignore_mask = K.expand_dims(ignore_mask, -1)
397 | 
398 |         # K.binary_crossentropy is helpful to avoid exp overflow.
399 |         xy_loss = object_mask * box_loss_scale * K.binary_crossentropy(raw_true_xy, raw_pred[...,0:2], from_logits=True)
400 |         wh_loss = object_mask * box_loss_scale * 0.5 * K.square(raw_true_wh-raw_pred[...,2:4])
401 |         confidence_loss = object_mask * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True) + \
402 |             (1-object_mask) * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True) * ignore_mask
403 |         class_loss = object_mask * K.binary_crossentropy(true_class_probs, raw_pred[...,5:], from_logits=True)
404 | 
405 |         xy_loss = K.sum(xy_loss) / mf
406 |         wh_loss = K.sum(wh_loss) / mf
407 |         confidence_loss = K.sum(confidence_loss) / mf
408 |         class_loss = K.sum(class_loss) / mf
409 |         loss += xy_loss + wh_loss + confidence_loss + class_loss
410 |         if print_loss:
411 |             loss = tf.Print(loss, [loss, xy_loss, wh_loss, confidence_loss, class_loss, K.sum(ignore_mask)], message='loss: ')
412 |     return loss
413 | 
--------------------------------------------------------------------------------
/yolo3/utils.py:
--------------------------------------------------------------------------------
1 | """Miscellaneous utility functions."""
2 | 
3 | from functools import reduce
4 | 
5 | from PIL import Image
6 | import numpy as np
7 | from matplotlib.colors import rgb_to_hsv, hsv_to_rgb
8 | 
9 | def compose(*funcs):
10 |     """Compose arbitrarily many functions, evaluated left to right.
11 | 
12 |     Reference: https://mathieularose.com/function-composition-in-python/
13 |     """
14 |     # return lambda x: reduce(lambda v, f: f(v), funcs, x)
15 |     if funcs:
16 |         return reduce(lambda f, g: lambda *a, **kw: g(f(*a, **kw)), funcs)
17 |     else:
18 |         raise ValueError('Composition of empty sequence not supported.')
19 | 
20 | def letterbox_image(image, size):
21 |     '''resize image with unchanged aspect ratio using padding'''
22 |     iw, ih = image.size
23 |     w, h = size
24 |     scale = min(w/iw, h/ih)
25 |     nw = int(iw*scale)
26 |     nh = int(ih*scale)
27 | 
28 |     image = image.resize((nw,nh), Image.BICUBIC)
29 |     new_image = Image.new('RGB', size, (128,128,128))
30 |     new_image.paste(image, ((w-nw)//2, (h-nh)//2))
31 |     return new_image
32 | 
33 | def rand(a=0, b=1):
34 |     return np.random.rand()*(b-a) + a
35 | 
36 | def get_random_data(annotation_line, input_shape, random=True, max_boxes=20, jitter=.3, hue=.1, sat=1.5, val=1.5, proc_img=True):
37 |     '''random preprocessing for real-time data augmentation'''
38 |     line = annotation_line.split()
39 |     image = Image.open(line[0])
40 |     iw, ih = image.size
41 |     h, w = input_shape
42 |     box = np.array([np.array(list(map(int,box.split(',')))) for box in line[1:]])
43 | 
44 |     if not random:
45 |         # resize image
46 |         scale = min(w/iw, h/ih)
47 |         nw = int(iw*scale)
48 |         nh = int(ih*scale)
49 |         dx = (w-nw)//2
50 |         dy = (h-nh)//2
51 |         image_data=0
52 |         if proc_img:
53 |             image = image.resize((nw,nh), Image.BICUBIC)
54 |             new_image = Image.new('RGB', (w,h), (128,128,128))
55 |             new_image.paste(image, (dx, dy))
56 |             image_data = np.array(new_image)/255.
57 | 
58 |         # correct boxes
59 |         box_data = np.zeros((max_boxes,5))
60 |         if len(box)>0:
61 |             np.random.shuffle(box)
62 |             if len(box)>max_boxes: box = box[:max_boxes]
63 |             box[:, [0,2]] = box[:, [0,2]]*scale + dx
64 |             box[:, [1,3]] = box[:, [1,3]]*scale + dy
65 |             box_data[:len(box)] = box
66 | 
67 |         return image_data, box_data
68 | 
69 |     # resize image
70 |     new_ar = w/h * rand(1-jitter,1+jitter)/rand(1-jitter,1+jitter)
71 |     scale = rand(.25, 2)
72 |     if new_ar < 1:
73 |         nh = int(scale*h)
74 |         nw = int(nh*new_ar)
75 |     else:
76 |         nw = int(scale*w)
77 |         nh = int(nw/new_ar)
78 |     image = image.resize((nw,nh), Image.BICUBIC)
79 | 
80 |     # place image
81 |     dx = int(rand(0, w-nw))
82 |     dy = int(rand(0, h-nh))
83 |     new_image = Image.new('RGB', (w,h), (128,128,128))
84 |     new_image.paste(image, (dx, dy))
85 |     image = new_image
86 | 
87 |     # flip image or not
88 |     flip = rand()<.5
89 |     if flip: image = image.transpose(Image.FLIP_LEFT_RIGHT)
90 | 
91 |     # distort image
92 |     hue = rand(-hue, hue)
93 |     sat = rand(1, sat) if rand()<.5 else 1/rand(1, sat)
94 |     val = rand(1, val) if rand()<.5 else 1/rand(1, val)
95 |     x = rgb_to_hsv(np.array(image)/255.)
96 | x[..., 0] += hue 97 | x[..., 0][x[..., 0]>1] -= 1 98 | x[..., 0][x[..., 0]<0] += 1 99 | x[..., 1] *= sat 100 | x[..., 2] *= val 101 | x[x>1] = 1 102 | x[x<0] = 0 103 | image_data = hsv_to_rgb(x) # numpy array, 0 to 1 104 | 105 | # correct boxes 106 | box_data = np.zeros((max_boxes,5)) 107 | if len(box)>0: 108 | np.random.shuffle(box) 109 | box[:, [0,2]] = box[:, [0,2]]*nw/iw + dx 110 | box[:, [1,3]] = box[:, [1,3]]*nh/ih + dy 111 | if flip: box[:, [0,2]] = w - box[:, [2,0]] 112 | box[:, 0:2][box[:, 0:2]<0] = 0 113 | box[:, 2][box[:, 2]>w] = w 114 | box[:, 3][box[:, 3]>h] = h 115 | box_w = box[:, 2] - box[:, 0] 116 | box_h = box[:, 3] - box[:, 1] 117 | box = box[np.logical_and(box_w>1, box_h>1)] # discard invalid box 118 | if len(box)>max_boxes: box = box[:max_boxes] 119 | box_data[:len(box)] = box 120 | 121 | return image_data, box_data 122 | -------------------------------------------------------------------------------- /yolo_video.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import argparse 3 | from yolo import YOLO, detect_video 4 | from PIL import Image 5 | 6 | def detect_img(yolo): 7 | while True: 8 | img = input('Input image filename:') 9 | try: 10 | image = Image.open(img) 11 | except: 12 | print('Open Error! 
Try again!') 13 | continue 14 | else: 15 | r_image = yolo.detect_image(image) 16 | r_image.show() 17 | yolo.close_session() 18 | 19 | FLAGS = None 20 | 21 | if __name__ == '__main__': 22 | # class YOLO defines the default value, so suppress any default here 23 | parser = argparse.ArgumentParser(argument_default=argparse.SUPPRESS) 24 | ''' 25 | Command line options 26 | ''' 27 | parser.add_argument( 28 | '--model', type=str, 29 | help='path to model weight file, default ' + YOLO.get_defaults("model_path") 30 | ) 31 | 32 | parser.add_argument( 33 | '--anchors', type=str, 34 | help='path to anchor definitions, default ' + YOLO.get_defaults("anchors_path") 35 | ) 36 | 37 | parser.add_argument( 38 | '--classes', type=str, 39 | help='path to class definitions, default ' + YOLO.get_defaults("classes_path") 40 | ) 41 | 42 | parser.add_argument( 43 | '--gpu_num', type=int, 44 | help='Number of GPU to use, default ' + str(YOLO.get_defaults("gpu_num")) 45 | ) 46 | 47 | parser.add_argument( 48 | '--image', default=False, action="store_true", 49 | help='Image detection mode, will ignore all positional arguments' 50 | ) 51 | ''' 52 | Command line positional arguments -- for video detection mode 53 | ''' 54 | parser.add_argument( 55 | "--input", nargs='?', type=str,required=False,default='./path2your_video', 56 | help = "Video input path" 57 | ) 58 | 59 | parser.add_argument( 60 | "--output", nargs='?', type=str, default="", 61 | help = "[Optional] Video output path" 62 | ) 63 | 64 | FLAGS = parser.parse_args() 65 | 66 | if FLAGS.image: 67 | """ 68 | Image detection mode, disregard any remaining command line arguments 69 | """ 70 | print("Image detection mode") 71 | if "input" in FLAGS: 72 | print(" Ignoring remaining command line arguments: " + FLAGS.input + "," + FLAGS.output) 73 | detect_img(YOLO(**vars(FLAGS))) 74 | elif "input" in FLAGS: 75 | detect_video(YOLO(**vars(FLAGS)), FLAGS.input, FLAGS.output) 76 | else: 77 | print("Must specify at least video_input_path. 
See usage with --help.") 78 | -------------------------------------------------------------------------------- /yolov3-tiny.cfg: -------------------------------------------------------------------------------- 1 | [net] 2 | # Testing 3 | batch=1 4 | subdivisions=1 5 | # Training 6 | # batch=64 7 | # subdivisions=2 8 | width=416 9 | height=416 10 | channels=3 11 | momentum=0.9 12 | decay=0.0005 13 | angle=0 14 | saturation = 1.5 15 | exposure = 1.5 16 | hue=.1 17 | 18 | learning_rate=0.001 19 | burn_in=1000 20 | max_batches = 500200 21 | policy=steps 22 | steps=400000,450000 23 | scales=.1,.1 24 | 25 | [convolutional] 26 | batch_normalize=1 27 | filters=16 28 | size=3 29 | stride=1 30 | pad=1 31 | activation=leaky 32 | 33 | [maxpool] 34 | size=2 35 | stride=2 36 | 37 | [convolutional] 38 | batch_normalize=1 39 | filters=32 40 | size=3 41 | stride=1 42 | pad=1 43 | activation=leaky 44 | 45 | [maxpool] 46 | size=2 47 | stride=2 48 | 49 | [convolutional] 50 | batch_normalize=1 51 | filters=64 52 | size=3 53 | stride=1 54 | pad=1 55 | activation=leaky 56 | 57 | [maxpool] 58 | size=2 59 | stride=2 60 | 61 | [convolutional] 62 | batch_normalize=1 63 | filters=128 64 | size=3 65 | stride=1 66 | pad=1 67 | activation=leaky 68 | 69 | [maxpool] 70 | size=2 71 | stride=2 72 | 73 | [convolutional] 74 | batch_normalize=1 75 | filters=256 76 | size=3 77 | stride=1 78 | pad=1 79 | activation=leaky 80 | 81 | [maxpool] 82 | size=2 83 | stride=2 84 | 85 | [convolutional] 86 | batch_normalize=1 87 | filters=512 88 | size=3 89 | stride=1 90 | pad=1 91 | activation=leaky 92 | 93 | [maxpool] 94 | size=2 95 | stride=1 96 | 97 | [convolutional] 98 | batch_normalize=1 99 | filters=1024 100 | size=3 101 | stride=1 102 | pad=1 103 | activation=leaky 104 | 105 | ########### 106 | 107 | [convolutional] 108 | batch_normalize=1 109 | filters=256 110 | size=1 111 | stride=1 112 | pad=1 113 | activation=leaky 114 | 115 | [convolutional] 116 | batch_normalize=1 117 | filters=512 118 | size=3 119 | 
stride=1 120 | pad=1 121 | activation=leaky 122 | 123 | [convolutional] 124 | size=1 125 | stride=1 126 | pad=1 127 | filters=255 128 | activation=linear 129 | 130 | 131 | 132 | [yolo] 133 | mask = 3,4,5 134 | anchors = 10,14, 23,27, 37,58, 81,82, 135,169, 344,319 135 | classes=80 136 | num=6 137 | jitter=.3 138 | ignore_thresh = .7 139 | truth_thresh = 1 140 | random=1 141 | 142 | [route] 143 | layers = -4 144 | 145 | [convolutional] 146 | batch_normalize=1 147 | filters=128 148 | size=1 149 | stride=1 150 | pad=1 151 | activation=leaky 152 | 153 | [upsample] 154 | stride=2 155 | 156 | [route] 157 | layers = -1, 8 158 | 159 | [convolutional] 160 | batch_normalize=1 161 | filters=256 162 | size=3 163 | stride=1 164 | pad=1 165 | activation=leaky 166 | 167 | [convolutional] 168 | size=1 169 | stride=1 170 | pad=1 171 | filters=255 172 | activation=linear 173 | 174 | [yolo] 175 | mask = 1,2,3 176 | anchors = 10,14, 23,27, 37,58, 81,82, 135,169, 344,319 177 | classes=80 178 | num=6 179 | jitter=.3 180 | ignore_thresh = .7 181 | truth_thresh = 1 182 | random=1 183 | -------------------------------------------------------------------------------- /yolov3.cfg: -------------------------------------------------------------------------------- 1 | [net] 2 | # Testing 3 | batch=1 4 | subdivisions=1 5 | # Training 6 | # batch=64 7 | # subdivisions=16 8 | width=416 9 | height=416 10 | channels=3 11 | momentum=0.9 12 | decay=0.0005 13 | angle=0 14 | saturation = 1.5 15 | exposure = 1.5 16 | hue=.1 17 | 18 | learning_rate=0.001 19 | burn_in=1000 20 | max_batches = 500200 21 | policy=steps 22 | steps=400000,450000 23 | scales=.1,.1 24 | 25 | [convolutional] 26 | batch_normalize=1 27 | filters=32 28 | size=3 29 | stride=1 30 | pad=1 31 | activation=leaky 32 | 33 | # Downsample 34 | 35 | [convolutional] 36 | batch_normalize=1 37 | filters=64 38 | size=3 39 | stride=2 40 | pad=1 41 | activation=leaky 42 | 43 | [convolutional] 44 | batch_normalize=1 45 | filters=32 46 | size=1 47 | 
stride=1 48 | pad=1 49 | activation=leaky 50 | 51 | [convolutional] 52 | batch_normalize=1 53 | filters=64 54 | size=3 55 | stride=1 56 | pad=1 57 | activation=leaky 58 | 59 | [shortcut] 60 | from=-3 61 | activation=linear 62 | 63 | # Downsample 64 | 65 | [convolutional] 66 | batch_normalize=1 67 | filters=128 68 | size=3 69 | stride=2 70 | pad=1 71 | activation=leaky 72 | 73 | [convolutional] 74 | batch_normalize=1 75 | filters=64 76 | size=1 77 | stride=1 78 | pad=1 79 | activation=leaky 80 | 81 | [convolutional] 82 | batch_normalize=1 83 | filters=128 84 | size=3 85 | stride=1 86 | pad=1 87 | activation=leaky 88 | 89 | [shortcut] 90 | from=-3 91 | activation=linear 92 | 93 | [convolutional] 94 | batch_normalize=1 95 | filters=64 96 | size=1 97 | stride=1 98 | pad=1 99 | activation=leaky 100 | 101 | [convolutional] 102 | batch_normalize=1 103 | filters=128 104 | size=3 105 | stride=1 106 | pad=1 107 | activation=leaky 108 | 109 | [shortcut] 110 | from=-3 111 | activation=linear 112 | 113 | # Downsample 114 | 115 | [convolutional] 116 | batch_normalize=1 117 | filters=256 118 | size=3 119 | stride=2 120 | pad=1 121 | activation=leaky 122 | 123 | [convolutional] 124 | batch_normalize=1 125 | filters=128 126 | size=1 127 | stride=1 128 | pad=1 129 | activation=leaky 130 | 131 | [convolutional] 132 | batch_normalize=1 133 | filters=256 134 | size=3 135 | stride=1 136 | pad=1 137 | activation=leaky 138 | 139 | [shortcut] 140 | from=-3 141 | activation=linear 142 | 143 | [convolutional] 144 | batch_normalize=1 145 | filters=128 146 | size=1 147 | stride=1 148 | pad=1 149 | activation=leaky 150 | 151 | [convolutional] 152 | batch_normalize=1 153 | filters=256 154 | size=3 155 | stride=1 156 | pad=1 157 | activation=leaky 158 | 159 | [shortcut] 160 | from=-3 161 | activation=linear 162 | 163 | [convolutional] 164 | batch_normalize=1 165 | filters=128 166 | size=1 167 | stride=1 168 | pad=1 169 | activation=leaky 170 | 171 | [convolutional] 172 | batch_normalize=1 173 | 
filters=256 174 | size=3 175 | stride=1 176 | pad=1 177 | activation=leaky 178 | 179 | [shortcut] 180 | from=-3 181 | activation=linear 182 | 183 | [convolutional] 184 | batch_normalize=1 185 | filters=128 186 | size=1 187 | stride=1 188 | pad=1 189 | activation=leaky 190 | 191 | [convolutional] 192 | batch_normalize=1 193 | filters=256 194 | size=3 195 | stride=1 196 | pad=1 197 | activation=leaky 198 | 199 | [shortcut] 200 | from=-3 201 | activation=linear 202 | 203 | 204 | [convolutional] 205 | batch_normalize=1 206 | filters=128 207 | size=1 208 | stride=1 209 | pad=1 210 | activation=leaky 211 | 212 | [convolutional] 213 | batch_normalize=1 214 | filters=256 215 | size=3 216 | stride=1 217 | pad=1 218 | activation=leaky 219 | 220 | [shortcut] 221 | from=-3 222 | activation=linear 223 | 224 | [convolutional] 225 | batch_normalize=1 226 | filters=128 227 | size=1 228 | stride=1 229 | pad=1 230 | activation=leaky 231 | 232 | [convolutional] 233 | batch_normalize=1 234 | filters=256 235 | size=3 236 | stride=1 237 | pad=1 238 | activation=leaky 239 | 240 | [shortcut] 241 | from=-3 242 | activation=linear 243 | 244 | [convolutional] 245 | batch_normalize=1 246 | filters=128 247 | size=1 248 | stride=1 249 | pad=1 250 | activation=leaky 251 | 252 | [convolutional] 253 | batch_normalize=1 254 | filters=256 255 | size=3 256 | stride=1 257 | pad=1 258 | activation=leaky 259 | 260 | [shortcut] 261 | from=-3 262 | activation=linear 263 | 264 | [convolutional] 265 | batch_normalize=1 266 | filters=128 267 | size=1 268 | stride=1 269 | pad=1 270 | activation=leaky 271 | 272 | [convolutional] 273 | batch_normalize=1 274 | filters=256 275 | size=3 276 | stride=1 277 | pad=1 278 | activation=leaky 279 | 280 | [shortcut] 281 | from=-3 282 | activation=linear 283 | 284 | # Downsample 285 | 286 | [convolutional] 287 | batch_normalize=1 288 | filters=512 289 | size=3 290 | stride=2 291 | pad=1 292 | activation=leaky 293 | 294 | [convolutional] 295 | batch_normalize=1 296 | 
filters=256 297 | size=1 298 | stride=1 299 | pad=1 300 | activation=leaky 301 | 302 | [convolutional] 303 | batch_normalize=1 304 | filters=512 305 | size=3 306 | stride=1 307 | pad=1 308 | activation=leaky 309 | 310 | [shortcut] 311 | from=-3 312 | activation=linear 313 | 314 | 315 | [convolutional] 316 | batch_normalize=1 317 | filters=256 318 | size=1 319 | stride=1 320 | pad=1 321 | activation=leaky 322 | 323 | [convolutional] 324 | batch_normalize=1 325 | filters=512 326 | size=3 327 | stride=1 328 | pad=1 329 | activation=leaky 330 | 331 | [shortcut] 332 | from=-3 333 | activation=linear 334 | 335 | 336 | [convolutional] 337 | batch_normalize=1 338 | filters=256 339 | size=1 340 | stride=1 341 | pad=1 342 | activation=leaky 343 | 344 | [convolutional] 345 | batch_normalize=1 346 | filters=512 347 | size=3 348 | stride=1 349 | pad=1 350 | activation=leaky 351 | 352 | [shortcut] 353 | from=-3 354 | activation=linear 355 | 356 | 357 | [convolutional] 358 | batch_normalize=1 359 | filters=256 360 | size=1 361 | stride=1 362 | pad=1 363 | activation=leaky 364 | 365 | [convolutional] 366 | batch_normalize=1 367 | filters=512 368 | size=3 369 | stride=1 370 | pad=1 371 | activation=leaky 372 | 373 | [shortcut] 374 | from=-3 375 | activation=linear 376 | 377 | [convolutional] 378 | batch_normalize=1 379 | filters=256 380 | size=1 381 | stride=1 382 | pad=1 383 | activation=leaky 384 | 385 | [convolutional] 386 | batch_normalize=1 387 | filters=512 388 | size=3 389 | stride=1 390 | pad=1 391 | activation=leaky 392 | 393 | [shortcut] 394 | from=-3 395 | activation=linear 396 | 397 | 398 | [convolutional] 399 | batch_normalize=1 400 | filters=256 401 | size=1 402 | stride=1 403 | pad=1 404 | activation=leaky 405 | 406 | [convolutional] 407 | batch_normalize=1 408 | filters=512 409 | size=3 410 | stride=1 411 | pad=1 412 | activation=leaky 413 | 414 | [shortcut] 415 | from=-3 416 | activation=linear 417 | 418 | 419 | [convolutional] 420 | batch_normalize=1 421 | 
filters=256 422 | size=1 423 | stride=1 424 | pad=1 425 | activation=leaky 426 | 427 | [convolutional] 428 | batch_normalize=1 429 | filters=512 430 | size=3 431 | stride=1 432 | pad=1 433 | activation=leaky 434 | 435 | [shortcut] 436 | from=-3 437 | activation=linear 438 | 439 | [convolutional] 440 | batch_normalize=1 441 | filters=256 442 | size=1 443 | stride=1 444 | pad=1 445 | activation=leaky 446 | 447 | [convolutional] 448 | batch_normalize=1 449 | filters=512 450 | size=3 451 | stride=1 452 | pad=1 453 | activation=leaky 454 | 455 | [shortcut] 456 | from=-3 457 | activation=linear 458 | 459 | # Downsample 460 | 461 | [convolutional] 462 | batch_normalize=1 463 | filters=1024 464 | size=3 465 | stride=2 466 | pad=1 467 | activation=leaky 468 | 469 | [convolutional] 470 | batch_normalize=1 471 | filters=512 472 | size=1 473 | stride=1 474 | pad=1 475 | activation=leaky 476 | 477 | [convolutional] 478 | batch_normalize=1 479 | filters=1024 480 | size=3 481 | stride=1 482 | pad=1 483 | activation=leaky 484 | 485 | [shortcut] 486 | from=-3 487 | activation=linear 488 | 489 | [convolutional] 490 | batch_normalize=1 491 | filters=512 492 | size=1 493 | stride=1 494 | pad=1 495 | activation=leaky 496 | 497 | [convolutional] 498 | batch_normalize=1 499 | filters=1024 500 | size=3 501 | stride=1 502 | pad=1 503 | activation=leaky 504 | 505 | [shortcut] 506 | from=-3 507 | activation=linear 508 | 509 | [convolutional] 510 | batch_normalize=1 511 | filters=512 512 | size=1 513 | stride=1 514 | pad=1 515 | activation=leaky 516 | 517 | [convolutional] 518 | batch_normalize=1 519 | filters=1024 520 | size=3 521 | stride=1 522 | pad=1 523 | activation=leaky 524 | 525 | [shortcut] 526 | from=-3 527 | activation=linear 528 | 529 | [convolutional] 530 | batch_normalize=1 531 | filters=512 532 | size=1 533 | stride=1 534 | pad=1 535 | activation=leaky 536 | 537 | [convolutional] 538 | batch_normalize=1 539 | filters=1024 540 | size=3 541 | stride=1 542 | pad=1 543 | 
activation=leaky 544 | 545 | [shortcut] 546 | from=-3 547 | activation=linear 548 | 549 | ###################### 550 | 551 | [convolutional] 552 | batch_normalize=1 553 | filters=512 554 | size=1 555 | stride=1 556 | pad=1 557 | activation=leaky 558 | 559 | [convolutional] 560 | batch_normalize=1 561 | size=3 562 | stride=1 563 | pad=1 564 | filters=1024 565 | activation=leaky 566 | 567 | [convolutional] 568 | batch_normalize=1 569 | filters=512 570 | size=1 571 | stride=1 572 | pad=1 573 | activation=leaky 574 | 575 | [convolutional] 576 | batch_normalize=1 577 | size=3 578 | stride=1 579 | pad=1 580 | filters=1024 581 | activation=leaky 582 | 583 | [convolutional] 584 | batch_normalize=1 585 | filters=512 586 | size=1 587 | stride=1 588 | pad=1 589 | activation=leaky 590 | 591 | [convolutional] 592 | batch_normalize=1 593 | size=3 594 | stride=1 595 | pad=1 596 | filters=1024 597 | activation=leaky 598 | 599 | [convolutional] 600 | size=1 601 | stride=1 602 | pad=1 603 | filters=255 604 | activation=linear 605 | 606 | 607 | [yolo] 608 | mask = 6,7,8 609 | anchors = 10,13, 16,30, 33,23, 30,61, 62,45, 59,119, 116,90, 156,198, 373,326 610 | classes=80 611 | num=9 612 | jitter=.3 613 | ignore_thresh = .5 614 | truth_thresh = 1 615 | random=1 616 | 617 | 618 | [route] 619 | layers = -4 620 | 621 | [convolutional] 622 | batch_normalize=1 623 | filters=256 624 | size=1 625 | stride=1 626 | pad=1 627 | activation=leaky 628 | 629 | [upsample] 630 | stride=2 631 | 632 | [route] 633 | layers = -1, 61 634 | 635 | 636 | 637 | [convolutional] 638 | batch_normalize=1 639 | filters=256 640 | size=1 641 | stride=1 642 | pad=1 643 | activation=leaky 644 | 645 | [convolutional] 646 | batch_normalize=1 647 | size=3 648 | stride=1 649 | pad=1 650 | filters=512 651 | activation=leaky 652 | 653 | [convolutional] 654 | batch_normalize=1 655 | filters=256 656 | size=1 657 | stride=1 658 | pad=1 659 | activation=leaky 660 | 661 | [convolutional] 662 | batch_normalize=1 663 | size=3 664 | 
stride=1 665 | pad=1 666 | filters=512 667 | activation=leaky 668 | 669 | [convolutional] 670 | batch_normalize=1 671 | filters=256 672 | size=1 673 | stride=1 674 | pad=1 675 | activation=leaky 676 | 677 | [convolutional] 678 | batch_normalize=1 679 | size=3 680 | stride=1 681 | pad=1 682 | filters=512 683 | activation=leaky 684 | 685 | [convolutional] 686 | size=1 687 | stride=1 688 | pad=1 689 | filters=255 690 | activation=linear 691 | 692 | 693 | [yolo] 694 | mask = 3,4,5 695 | anchors = 10,13, 16,30, 33,23, 30,61, 62,45, 59,119, 116,90, 156,198, 373,326 696 | classes=80 697 | num=9 698 | jitter=.3 699 | ignore_thresh = .5 700 | truth_thresh = 1 701 | random=1 702 | 703 | 704 | 705 | [route] 706 | layers = -4 707 | 708 | [convolutional] 709 | batch_normalize=1 710 | filters=128 711 | size=1 712 | stride=1 713 | pad=1 714 | activation=leaky 715 | 716 | [upsample] 717 | stride=2 718 | 719 | [route] 720 | layers = -1, 36 721 | 722 | 723 | 724 | [convolutional] 725 | batch_normalize=1 726 | filters=128 727 | size=1 728 | stride=1 729 | pad=1 730 | activation=leaky 731 | 732 | [convolutional] 733 | batch_normalize=1 734 | size=3 735 | stride=1 736 | pad=1 737 | filters=256 738 | activation=leaky 739 | 740 | [convolutional] 741 | batch_normalize=1 742 | filters=128 743 | size=1 744 | stride=1 745 | pad=1 746 | activation=leaky 747 | 748 | [convolutional] 749 | batch_normalize=1 750 | size=3 751 | stride=1 752 | pad=1 753 | filters=256 754 | activation=leaky 755 | 756 | [convolutional] 757 | batch_normalize=1 758 | filters=128 759 | size=1 760 | stride=1 761 | pad=1 762 | activation=leaky 763 | 764 | [convolutional] 765 | batch_normalize=1 766 | size=3 767 | stride=1 768 | pad=1 769 | filters=256 770 | activation=leaky 771 | 772 | [convolutional] 773 | size=1 774 | stride=1 775 | pad=1 776 | filters=255 777 | activation=linear 778 | 779 | 780 | [yolo] 781 | mask = 0,1,2 782 | anchors = 10,13, 16,30, 33,23, 30,61, 62,45, 59,119, 116,90, 156,198, 373,326 783 | 
classes=80 784 | num=9 785 | jitter=.3 786 | ignore_thresh = .5 787 | truth_thresh = 1 788 | random=1 789 | 790 | --------------------------------------------------------------------------------
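
The box-correction step at the end of `get_random_data` in `yolo3/utils.py` above maps ground-truth boxes from the source image onto the augmented canvas. As a standalone sketch of that same logic (the helper name `correct_boxes` is illustrative, not part of the repo, which performs this inline; the random shuffle is omitted here so the result is deterministic):

```python
import numpy as np

def correct_boxes(box, iw, ih, nw, nh, dx, dy, w, h, flip=False, max_boxes=20):
    """Rescale boxes (x_min, y_min, x_max, y_max, class_id) from an
    (iw, ih) source image onto a (w, h) canvas where the image was
    resized to (nw, nh) and pasted at offset (dx, dy), optionally
    mirrored horizontally, then clipped to the canvas."""
    box = box.astype(float).copy()
    box_data = np.zeros((max_boxes, 5))
    if len(box) > 0:
        box[:, [0, 2]] = box[:, [0, 2]] * nw / iw + dx   # scale + shift x
        box[:, [1, 3]] = box[:, [1, 3]] * nh / ih + dy   # scale + shift y
        if flip:
            box[:, [0, 2]] = w - box[:, [2, 0]]          # horizontal mirror
        box[:, 0:2][box[:, 0:2] < 0] = 0                 # clip to canvas
        box[:, 2][box[:, 2] > w] = w
        box[:, 3][box[:, 3] > h] = h
        box_w = box[:, 2] - box[:, 0]
        box_h = box[:, 3] - box[:, 1]
        box = box[np.logical_and(box_w > 1, box_h > 1)]  # discard invalid box
        if len(box) > max_boxes:
            box = box[:max_boxes]
        box_data[:len(box)] = box
    return box_data

# A 200x200 image stretched to the full 416x416 canvas, no offset, no flip:
out = correct_boxes(np.array([[50, 50, 150, 150, 0]]),
                    iw=200, ih=200, nw=416, nh=416, dx=0, dy=0, w=416, h=416)
print(out[0])  # box scaled by 416/200: [104, 104, 312, 312, class 0]
```

The `> 1` pixel threshold matches the repo's `# discard invalid box` check, and the fixed-size zero-padded `(max_boxes, 5)` output is what the training generator feeds to `preprocess_true_boxes`.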