├── README.md
├── config_CPC.py
├── config_GAICD.py
├── cropping_dataset.py
├── cropping_model.py
├── figures
├── motivation.jpg
├── pipeline.png
└── qualitative_comparison.png
├── generate_pseudo_heatmap.py
├── human_bboxes
├── CPC
│ ├── human_bboxes.json
│ └── image_crop.json
├── FCDB
│ ├── data_split.json
│ ├── human_bboxes.json
│ └── image_crop.json
├── FLMS
│ ├── human_bboxes.json
│ └── image_crop.json
└── GAICD
│ ├── human_bboxes.json
│ ├── human_data_split.json
│ ├── image_crop.json
│ └── original_data_split.json
├── requirements.txt
├── test.py
├── train_on_CPC.py
└── train_on_GAICD.py
/README.md:
--------------------------------------------------------------------------------
1 | # Human-Centric-Image-Cropping
2 |
3 | This is the official repository for the following paper:
4 |
5 | > **Human-centric Image Cropping with Partition-aware and Content-preserving Features** [[arxiv]](https://arxiv.org/pdf/2207.10269.pdf)
6 | >
7 | > Bo Zhang, Li Niu, Xing Zhao, Liqing Zhang
8 | > Accepted by **ECCV2022**.
9 |
10 | We consider a specific and practical application: human-centric image cropping, which focuses on the depiction of a person.
11 | To this end, we propose a human-centric image cropping method with human-centric partition and important content preservation.
12 | As illustrated in the figure below, the proposed method uses two novel feature designs for the candidate crop: partition-aware feature and content-preserving feature.
13 | The partition-aware feature allows to treat different regions in a candidate crop differently conditioned on the human information,
14 | and the content-preserving feature helps to preserve the important content to be included in a good crop.
15 |
16 |
17 |

18 |
19 |
20 | ## Results
21 |
22 | In the below figure, we show the source image and the returned best crops by different methods, which demonstrates that our method can perform more reliable content preservation and removal. For example, in the first row, our method preserves more content on the left of human, probably because the person walks right to left, and reduces the top area that may hurt the image composition quality. In the second row, given the opposite face orientations to the first row, our model performs obviously different content preservation on the left/right sides of the human, yielding visually appealing crop. More qualitative and quantitative results are shown in our paper and supplementary.
23 |
24 |
25 |

26 |
27 |
28 |
29 | ## Usage
30 |
31 | Here we not only release the code of our method, but also provide the selected human-centric samples in these frequently used image cropping datasets,
32 | as well as their human bounding boxes under the folder
33 | [``human_bboxes``](https://github.com/bcmi/Human-Centric-Image-Cropping/tree/main/human_bboxes).
34 |
35 | 1. Download the source code and related image cropping datasets including CPC, GAICD, FCDB, and FLMS datasets.
36 | The homepages of these datasets have been summarized in our another repository
37 | [``Awesome-Aesthetic-Evaluation-and-Cropping``](https://github.com/bcmi/Awesome-Aesthetic-Evaluation-and-Cropping).
38 |
39 | 2. Change the pathes to above datasets and annotation files in ``config_GAICD.py`` and ``config_CPC.py``.
40 |
41 | 3. Run ``generate_pseudo_heatmap.py`` to generate pseudo heatmaps for GAICD or CPC dataset.
42 |
43 | 4. Install the RoI&RoDAlign libraries following the instruction of [GAICD](https://github.com/HuiZeng/Grid-Anchor-based-Image-Cropping-Pytorch).
44 |
45 | 5. Run ``train_on_GAICD.py`` (*resp.*, ``train_on_CPC.py``) to train a new model on GAICD dataset (*resp.*, CPC dataset).
46 |
47 | ### Requirements
48 | Please see [``requirement.txt``](./requirements.txt).
49 |
50 | ## Other Resources
51 |
52 | + [Awesome-Aesthetic-Evaluation-and-Cropping](https://github.com/bcmi/Awesome-Aesthetic-Evaluation-and-Cropping)
53 |
54 | ## Acknowledgement
55 |
56 | This implementation borrows from [GAICD](https://github.com/HuiZeng/Grid-Anchor-based-Image-Cropping-Pytorch).
57 |
58 |
--------------------------------------------------------------------------------
/config_CPC.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | class Config:
4 | data_root = '/workspace/aesthetic_cropping/human_centric/dataset/'
5 | # the pre-defined candidate crops are downloaded from:
6 | # https://github.com/luwr1022/listwise-view-ranking/blob/master/pdefined_anchors.pkl
7 | predefined_pkl = os.path.join(data_root, 'pdefined_anchors.pkl')
8 | CPC_path = os.path.join(data_root, 'CPCDataset')
9 | CPC_image = os.path.join(CPC_path, 'images')
10 | CPC_anno = os.path.join(CPC_path, 'image_crop.json')
11 | CPC_human = os.path.join(CPC_path, 'human_bboxes.json')
12 |
13 | FCDB_path = os.path.join(data_root, 'FCDB')
14 | FCDB_image= os.path.join(FCDB_path, 'data')
15 | FCDB_anno = os.path.join(FCDB_path, 'image_crop.json')
16 | FCDB_human= os.path.join(FCDB_path, 'human_bboxes.json')
17 | FCDB_split= os.path.join(FCDB_path, 'data_split.json')
18 |
19 | FLMS_path = os.path.join(data_root, 'FLMS')
20 | FLMS_image= os.path.join(FLMS_path, 'image')
21 | FLMS_anno = os.path.join(FLMS_path, 'image_crop.json')
22 | FLMS_human= os.path.join(FLMS_path, 'human_bboxes.json')
23 |
24 | GAIC_path = os.path.join(data_root, 'GAICD')
25 | GAIC_image= os.path.join(GAIC_path, 'images')
26 | GAIC_anno = os.path.join(GAIC_path, 'image_crop.json')
27 | GAIC_human= os.path.join(GAIC_path, 'human_bboxes.json')
28 | GAIC_split= os.path.join(GAIC_path, 'original_data_split.json')
29 | GAIC_human_split = os.path.join(GAIC_path, 'human_data_split.json')
30 |
31 | heat_map_dir = '/workspace/aesthetic_cropping/human_centric/code/heat_map/heat_map_gt'
32 | CPC_heat_map = os.path.join(heat_map_dir, 'CPC', 'mask')
33 | GAIC_heat_map= os.path.join(heat_map_dir, 'GAICD', 'mask')
34 |
35 | heat_map_size = 1./4
36 | image_size = (256,256)
37 | backbone = 'vgg16'
38 | backbone_weight_path = ('/workspace/pretrained_models/{}.pth'.format(backbone))
39 | # training
40 | training_set = 'CPC' # ['GAICD', 'CPC']
41 | loss_type = ['L1Loss','RankLoss']
42 | gpu_id = 0
43 | num_workers = 8
44 | only_human = False
45 | data_augmentation = True
46 | keep_aspect_ratio = False
47 |
48 | use_partition_aware = True
49 | partition_aware_type = 9 # [0,1,2,9]
50 | visualize_partition_feature = False
51 |
52 | use_content_preserve = True
53 | only_content_preserve = False
54 | content_preserve_type = 'gcn'
55 | content_loss_weight = 1.
56 | visualize_heat_map = False
57 |
58 | use_rod_feature = True
59 | reduced_dim = 32
60 |
61 | batch_size = 8
62 | lr_scheduler= 'cosine'
63 | max_epoch = 30
64 | lr_decay_epoch = [max_epoch+1]
65 | eval_freq = 1
66 |
67 | view_per_image = 16
68 | lr = 1e-4
69 | lr_decay = 0.1
70 | weight_decay = 1e-4
71 |
72 | save_freq = max_epoch+1
73 | if only_human:
74 | display_freq = 10
75 | else:
76 | display_freq = 100
77 | visualize_freq = 100
78 |
79 | prefix = training_set
80 | if only_human:
81 | prefix += '-Human'
82 | if not data_augmentation:
83 | prefix += '_wodataaug'
84 | if reduced_dim != 32:
85 | prefix += f'_{reduced_dim}redim'
86 | if loss_type != ['L1Loss','RankLoss']:
87 | if isinstance(loss_type, list) and len(loss_type) > 1:
88 | for i in range(len(loss_type)):
89 | if loss_type[i] == 'L1Loss':
90 | continue
91 | prefix += f'_{loss_type[i]}'
92 | if use_partition_aware:
93 | prefix += ('_PA')
94 | if partition_aware_type != 9:
95 | prefix += f'-{partition_aware_type}part'
96 | if use_content_preserve:
97 | prefix += ('_CP')
98 | if content_preserve_type != 'gcn':
99 | prefix += f'-{content_preserve_type}'
100 | if only_content_preserve:
101 | prefix += f'_onlycontent'
102 | exp_root = os.path.join(os.getcwd(), './experiments')
103 |
104 | # prefix = f'{content_loss_weight}_content_loss'
105 | # exp_root = os.path.join(os.getcwd(), './experiments/hyper_parameter')
106 |
107 | exp_name = prefix
108 | exp_path = os.path.join(exp_root, prefix)
109 | while os.path.exists(exp_path):
110 | index = os.path.basename(exp_path).split(prefix)[-1].split('repeat')[-1]
111 | try:
112 | index = int(index) + 1
113 | except:
114 | index = 1
115 | exp_name = prefix + ('_repeat{}'.format(index))
116 | exp_path = os.path.join(exp_root, exp_name)
117 | # print('Experiment name {} \n'.format(os.path.basename(exp_path)))
118 | checkpoint_dir = os.path.join(exp_path, 'checkpoints')
119 | log_dir = os.path.join(exp_path, 'logs')
120 |
121 | def create_path(self):
122 | print('Create experiment directory: ', self.exp_path)
123 | os.makedirs(self.exp_path)
124 | os.makedirs(self.checkpoint_dir)
125 | os.makedirs(self.log_dir)
126 |
127 | cfg = Config()
128 |
129 | if __name__ == '__main__':
130 | cfg = Config()
--------------------------------------------------------------------------------
/config_GAICD.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | class Config:
4 | data_root = '/workspace/aesthetic_cropping/human_centric/dataset/'
5 | # the pre-defined candidate crops are downloaded from:
6 | # https://github.com/luwr1022/listwise-view-ranking/blob/master/pdefined_anchors.pkl
7 | predefined_pkl = os.path.join(data_root, 'pdefined_anchors.pkl')
8 | CPC_path = os.path.join(data_root, 'CPCDataset')
9 | CPC_image = os.path.join(CPC_path, 'images')
10 | CPC_anno = os.path.join(CPC_path, 'image_crop.json')
11 | CPC_human = os.path.join(CPC_path, 'human_bboxes.json')
12 |
13 | FCDB_path = os.path.join(data_root, 'FCDB')
14 | FCDB_image= os.path.join(FCDB_path, 'data')
15 | FCDB_anno = os.path.join(FCDB_path, 'image_crop.json')
16 | FCDB_human= os.path.join(FCDB_path, 'human_bboxes.json')
17 | FCDB_split= os.path.join(FCDB_path, 'data_split.json')
18 |
19 | FLMS_path = os.path.join(data_root, 'FLMS')
20 | FLMS_image= os.path.join(FLMS_path, 'image')
21 | FLMS_anno = os.path.join(FLMS_path, 'image_crop.json')
22 | FLMS_human= os.path.join(FLMS_path, 'human_bboxes.json')
23 |
24 | GAIC_path = os.path.join(data_root, 'GAICD')
25 | GAIC_image= os.path.join(GAIC_path, 'images')
26 | GAIC_anno = os.path.join(GAIC_path, 'image_crop.json')
27 | GAIC_human= os.path.join(GAIC_path, 'human_bboxes.json')
28 | GAIC_split= os.path.join(GAIC_path, 'original_data_split.json')
29 | GAIC_human_split = os.path.join(GAIC_path, 'human_data_split.json')
30 |
31 | heat_map_dir = '/workspace/aesthetic_cropping/human_centric/code/heat_map/heat_map_gt'
32 | CPC_heat_map = os.path.join(heat_map_dir, 'CPC', 'mask')
33 | GAIC_heat_map= os.path.join(heat_map_dir, 'GAICD', 'mask')
34 |
35 | heat_map_size = 1./4
36 | image_size = (256,256)
37 | backbone = 'vgg16'
38 | backbone_weight_path = ('/workspace/pretrained_models/{}.pth'.format(backbone))
39 | # training
40 | training_set = 'GAICD' # ['GAICD', 'CPC']
41 | loss_type = ['L1Loss','RankLoss']
42 | gpu_id = 0
43 | num_workers = 4
44 | only_human = False
45 | data_augmentation = True
46 | keep_aspect_ratio = True
47 |
48 | use_partition_aware = True
49 | partition_aware_type = 9 # [0,1,2,9]
50 | concat_with_human = True
51 | visualize_partition_feature = False
52 |
53 | use_content_preserve = True
54 | only_content_preserve = False
55 | content_preserve_type = 'gcn'
56 | content_loss_weight = 1.
57 | visualize_heat_map = False
58 |
59 | use_rod_feature = True
60 | reduced_dim = 32
61 |
62 | batch_size = 1
63 | lr_scheduler= 'cosine'
64 | max_epoch = 80
65 | lr_decay_epoch = [max_epoch + 1]
66 | eval_freq = 1
67 |
68 | view_per_image = 64
69 | lr = 1e-4
70 | lr_decay = 0.1
71 | weight_decay = 1e-4
72 |
73 | save_freq = max_epoch+1
74 | if only_human:
75 | display_freq = 20
76 | else:
77 | display_freq = 50
78 | visualize_freq = 50
79 |
80 | prefix = training_set
81 | if only_human:
82 | prefix += '-Human'
83 | if not data_augmentation:
84 | prefix += '_wodataaug'
85 | if reduced_dim != 32:
86 | prefix += f'_{reduced_dim}redim'
87 | if loss_type != ['L1Loss','RankLoss']:
88 | if isinstance(loss_type, list) and len(loss_type) > 1:
89 | for i in range(len(loss_type)):
90 | if loss_type[i] == 'L1Loss':
91 | continue
92 | prefix += f'_{loss_type[i]}'
93 | if use_partition_aware:
94 | prefix += ('_PA')
95 | if partition_aware_type != 9:
96 | prefix += f'-{partition_aware_type}part'
97 | if use_content_preserve:
98 | prefix += ('_CP')
99 | if content_preserve_type != 'gcn':
100 | prefix += f'-{content_preserve_type}'
101 | if only_content_preserve:
102 | prefix += f'_onlycontent'
103 | exp_root = os.path.join(os.getcwd(), './experiments')
104 |
105 | # prefix = f'{content_loss_weight}_content_loss'
106 | # exp_root = os.path.join(os.getcwd(), './experiments/hyper_parameter')
107 |
108 | exp_name = prefix
109 | exp_path = os.path.join(exp_root, prefix)
110 | while os.path.exists(exp_path):
111 | index = os.path.basename(exp_path).split(prefix)[-1].split('repeat')[-1]
112 | try:
113 | index = int(index) + 1
114 | except:
115 | index = 1
116 | exp_name = prefix + ('_repeat{}'.format(index))
117 | exp_path = os.path.join(exp_root, exp_name)
118 | # print('Experiment name {} \n'.format(os.path.basename(exp_path)))
119 | checkpoint_dir = os.path.join(exp_path, 'checkpoints')
120 | log_dir = os.path.join(exp_path, 'logs')
121 |
122 | def create_path(self):
123 | print('Create experiment directory: ', self.exp_path)
124 | os.makedirs(self.exp_path)
125 | os.makedirs(self.checkpoint_dir)
126 | os.makedirs(self.log_dir)
127 |
128 | cfg = Config()
129 |
130 | if __name__ == '__main__':
131 | cfg = Config()
--------------------------------------------------------------------------------
/cropping_dataset.py:
--------------------------------------------------------------------------------
1 | import os
2 | import cv2
3 | import numpy as np
4 | import pickle
5 | import lmdb
6 | import datetime
7 | import torch
8 | from PIL import Image, ImageOps
9 | from torch.utils.data import DataLoader, Dataset
10 | import torchvision.transforms as transforms
11 | import json
12 | import matplotlib.pyplot as plt
13 | import math
14 | import random
15 | from config_GAICD import cfg
16 |
17 | MOS_MEAN = 2.95
18 | MOS_STD = 0.8
19 | IMAGE_NET_MEAN = [0.485, 0.456, 0.406]
20 | IMAGE_NET_STD = [0.229, 0.224, 0.225]
21 |
22 | # debug_dir = './test_dataset'
23 | # os.makedirs(debug_dir, exist_ok=True)
24 |
25 | def rescale_bbox(bbox, ratio_w, ratio_h):
26 | bbox = np.array(bbox).reshape(-1, 4)
27 | bbox[:, 0] = np.floor(bbox[:, 0] * ratio_w)
28 | bbox[:, 1] = np.floor(bbox[:, 1] * ratio_h)
29 | bbox[:, 2] = np.ceil(bbox[:, 2] * ratio_w)
30 | bbox[:, 3] = np.ceil(bbox[:, 3] * ratio_h)
31 | return bbox.astype(np.float32)
32 |
33 | def generate_crop_mask(bbox, width, height, downsample):
34 | bbox = np.array(bbox).reshape(-1, 4)
35 | target_w, target_h = int(width / downsample), int(height / downsample)
36 | bbox[:,0::2] *= (float(target_w) / width)
37 | bbox[:,1::2] *= (float(target_h) / height)
38 | bbox = bbox.astype(np.int32)
39 | mask = np.zeros((bbox.shape[0], target_h, target_w))
40 | for i in range(bbox.shape[0]):
41 | x1,y1,x2,y2 = bbox[i]
42 | mask[i, y1:y2, x1:x2] = 1
43 | mask = mask.astype(np.float32)
44 | return mask
45 |
46 | def generate_partition_mask(bbox, width, height, downsample):
47 | bbox = np.array(bbox).reshape(-1, 4)
48 | target_w, target_h = int(width / downsample), int(height / downsample)
49 | bbox[:, 0::2] *= (float(target_w) / width)
50 | bbox[:, 1::2] *= (float(target_h) / height)
51 | bbox = bbox.astype(np.int32)
52 | x1, y1, x2, y2 = bbox[0]
53 | hor = [0, x1, x2, target_w]
54 | ver = [0, y1, y2, target_h]
55 | mask = np.zeros((9, target_h, target_w))
56 | if x1 >= 0:
57 | if x2 - x1 == 0:
58 | x2 = x1 + 1
59 | if y2 - y1 == 0:
60 | y2 = y1 + 1
61 | for i in range(len(ver) - 1):
62 | for j in range(len(hor) - 1):
63 | ch = i * 3 + j
64 | mask[ch, ver[i]: ver[i + 1], hor[j]: hor[j + 1]] = 1
65 | mask = mask.astype(np.float32)
66 | return mask
67 |
68 | def generate_target_size_crop_mask(bbox, current_w, current_h, target_w, target_h):
69 | bbox = np.array(bbox).reshape(-1, 4)
70 | bbox[:,0::2] *= (float(target_w) / current_w)
71 | bbox[:,1::2] *= (float(target_h) / current_h)
72 | bbox = bbox.astype(np.int32)
73 | mask = np.zeros((bbox.shape[0], target_h, target_w))
74 | for i in range(bbox.shape[0]):
75 | x1,y1,x2,y2 = bbox[i]
76 | if x1 >= 0:
77 | mask[i, y1:y2, x1:x2] = 1
78 | mask = mask.astype(np.float32)
79 | return mask
80 |
81 | class CPCDataset(Dataset):
82 | def __init__(self, only_human_images=False,
83 | keep_aspect_ratio=True):
84 | self.only_human = only_human_images
85 | self.image_dir = cfg.CPC_image
86 | self.heat_map_dir = cfg.CPC_heat_map
87 | self.heat_map_scale = cfg.heat_map_size
88 | self.keep_aspect = keep_aspect_ratio
89 | assert os.path.exists(self.image_dir), self.image_dir
90 | self.human_bboxes = json.load(open(cfg.CPC_human, 'r'))
91 | self.annotations = json.load(open(cfg.CPC_anno, 'r'))
92 | self.score_mean, self.score_std = self.statistic_score()
93 | if self.only_human:
94 | self.image_list = list(self.human_bboxes.keys())
95 | else:
96 | self.image_list = list(self.annotations.keys())
97 | self.augmentation = cfg.data_augmentation
98 | self.PhotometricDistort = transforms.ColorJitter(
99 | brightness=0.125, contrast=0.5, saturation=0.5, hue=0.05)
100 | self.image_transformer = transforms.Compose([
101 | transforms.ToTensor(),
102 | transforms.Normalize(mean=IMAGE_NET_MEAN, std=IMAGE_NET_STD)])
103 | self.heat_map_transformer = transforms.ToTensor()
104 | self.crop_mask_downsample = 4
105 | self.human_mask_downsample = 16
106 |
107 | def statistic_crop(self):
108 | overlap_thresh = 0.1
109 | total_cnt = 0
110 | nonhuman_cnt = 0
111 | nonhuman_score = []
112 | for image_name in self.human_bboxes.keys():
113 | crop = self.annotations[image_name]['bboxes']
114 | score = self.annotations[image_name]['scores']
115 | hbox = self.human_bboxes[image_name]['bbox']
116 | crop = np.array(crop).reshape(-1, 4)
117 | hbox = np.array(hbox).reshape(-1, 4)
118 | score = np.array(score).reshape(-1)
119 | overlap = compute_overlap(crop, hbox)
120 | total_cnt += overlap.shape[0]
121 | nonhuman_cnt += (overlap < overlap_thresh).sum()
122 | nonhuman_score.extend(score[overlap < overlap_thresh].tolist())
123 | print('{} human images, {} crops, {} non-human, {:.2%}'.format(
124 | len(self.human_bboxes.keys()), total_cnt, nonhuman_cnt, float(nonhuman_cnt) / total_cnt
125 | ))
126 | nonhuman_score = np.array(nonhuman_score).reshape(-1)
127 | print('{} scores, mean={:.2f}, median={:.2f}, max={:.2f}, min={:.2f}'.format(
128 | nonhuman_score.shape[0], np.mean(nonhuman_score), np.median(nonhuman_score),
129 | np.max(nonhuman_score), np.min(nonhuman_score)
130 | ))
131 |
132 | def statistic_score(self):
133 | score_list = []
134 | for image_name in self.annotations.keys():
135 | score = self.annotations[image_name]['scores']
136 | score_list.extend(score)
137 | score = np.array(score_list).reshape(-1)
138 | mean = np.mean(score)
139 | std = np.std(score)
140 | print('CPC has {} score annotations, mean={:.2f}, std={:.2f}'.format(
141 | len(score), mean, std))
142 | return mean, std
143 |
144 | def __len__(self):
145 | return len(self.image_list)
146 |
147 | def __getitem__(self, index):
148 | image_name = self.image_list[index]
149 | image_file = os.path.join(self.image_dir, image_name)
150 | assert os.path.exists(image_file), image_file
151 | image = Image.open(image_file).convert('RGB')
152 | im_width, im_height = image.size
153 | if self.keep_aspect:
154 | scale = float(cfg.image_size[0]) / min(im_height, im_width)
155 | h = round(im_height * scale / 32.0) * 32
156 | w = round(im_width * scale / 32.0) * 32
157 | else:
158 | h = cfg.image_size[1]
159 | w = cfg.image_size[0]
160 | resized_image = image.resize((w,h), Image.ANTIALIAS)
161 | rs_width, rs_height = resized_image.size
162 | ratio_h = float(rs_height) / im_height
163 | ratio_w = float(rs_width) / im_width
164 |
165 | heat_map_file = os.path.join(self.heat_map_dir, os.path.splitext(image_name)[0] + '.png')
166 | assert os.path.exists(heat_map_file), heat_map_file
167 | heat_map = Image.open(heat_map_file)
168 | # hm_w, hm_h = int(rs_width * self.heat_map_scale), int(rs_height * self.heat_map_scale)
169 | hm_w = hm_h = cfg.image_size[0] // 4
170 | heat_map = heat_map.resize((hm_w, hm_h))
171 | crop = self.annotations[image_name]['bboxes']
172 | crop = rescale_bbox(crop, ratio_w, ratio_h)
173 | score = self.annotations[image_name]['scores']
174 | score = torch.tensor(score).reshape(-1).float()
175 | if image_name in self.human_bboxes:
176 | hbox = self.human_bboxes[image_name]['bbox']
177 | hbox = rescale_bbox(hbox, ratio_w, ratio_h)
178 | else:
179 | hbox = np.array([[-1, -1, -1, -1]]).astype(np.float32)
180 |
181 | if self.augmentation:
182 | if random.uniform(0,1) > 0.5:
183 | resized_image = ImageOps.mirror(resized_image)
184 | heat_map = ImageOps.mirror(heat_map)
185 | temp_x1 = crop[:,0].copy()
186 | crop[:,0] = rs_width - crop[:,2]
187 | crop[:,2] = rs_width - temp_x1
188 |
189 | if image_name in self.human_bboxes:
190 | # print('human mirror_before', hbox[0])
191 | temp_x1 = hbox[:,0].copy()
192 | hbox[:,0] = rs_width - hbox[:,2]
193 | hbox[:,2] = rs_width - temp_x1
194 | # print('human mirror after', hbox)
195 | resized_image = self.PhotometricDistort(resized_image)
196 | # debug
197 | # if hbox[0,0] > 0:
198 | # plt.subplot(2,2,1)
199 | # plt.imshow(resized_image)
200 | # plt.title('input image')
201 | # plt.axis('off')
202 | #
203 | # plt.subplot(2,2,2)
204 | # x1,y1,x2,y2 = crop[0].astype(np.int32)
205 | # best_crop = np.asarray(resized_image)[y1:y2,x1:x2]
206 | # # print('best_crop', crop[0], best_crop.shape)
207 | # plt.imshow(best_crop)
208 | # plt.title('best crop')
209 | # plt.axis('off')
210 | #
211 | # plt.subplot(2, 2, 3)
212 | # x1, y1, x2, y2 = crop[-1].astype(np.int32)
213 | # worst_crop = np.asarray(resized_image)[y1:y2, x1:x2]
214 | # # print('worst_crop', crop[-1], worst_crop.shape)
215 | # plt.imshow(worst_crop)
216 | # plt.title('worst crop')
217 | # plt.axis('off')
218 | #
219 | # if hbox[0,0] >= 0:
220 | # plt.subplot(2, 2, 4)
221 | # x1,y1,x2,y2 = hbox[0].astype(np.int32)
222 | # human = np.asarray(resized_image)[y1:y2,x1:x2]
223 | # # print('human bbox', hbox, human.shape)
224 | # plt.imshow(human)
225 | # plt.title('human')
226 | # plt.axis('off')
227 | # fig_file = os.path.join(debug_dir, os.path.splitext(image_name)[0] + '_crop.jpg')
228 | # plt.savefig(fig_file)
229 | # plt.close()
230 | im = self.image_transformer(resized_image)
231 | heat_map = self.heat_map_transformer(heat_map)
232 | # crop_mask = generate_crop_mask(crop, rs_width, rs_height,
233 | # self.crop_mask_downsample)
234 | crop_mask = generate_target_size_crop_mask(crop, rs_width, rs_height, 64, 64)
235 | partition_mask = generate_partition_mask(hbox, rs_width, rs_height,
236 | self.human_mask_downsample)
237 | # debug
238 | # if hbox[0,0] > 0:
239 | # plt.imshow(heat_map[0], cmap ='gray')
240 | # plt.axis('off')
241 | # fig_file = os.path.join(debug_dir, os.path.splitext(image_name)[0] + '_heat_map.jpg')
242 | # plt.savefig(fig_file)
243 | # plt.close()
244 | #
245 | # for i in range(9):
246 | # plt.subplot(3,3,i+1)
247 | # plt.imshow(partition_mask[i], cmap ='gray')
248 | # plt.axis('off')
249 | # fig_file = os.path.join(debug_dir, os.path.splitext(image_name)[0] + '_partition_mask.jpg')
250 | # plt.savefig(fig_file)
251 | # plt.close()
252 |
253 | return im, crop, hbox, heat_map, crop_mask, partition_mask, score, im_width, im_height
254 |
255 | class FCDBDataset(Dataset):
256 | def __init__(self, only_human_images=True,
257 | keep_aspect_ratio=False):
258 | self.only_human = only_human_images
259 | self.keep_aspect = keep_aspect_ratio
260 | self.image_dir = cfg.FCDB_image
261 | assert os.path.exists(self.image_dir), self.image_dir
262 | self.human_bboxes = json.load(open(cfg.FCDB_human, 'r'))
263 | self.annotations = json.load(open(cfg.FCDB_anno, 'r'))
264 | if self.only_human:
265 | self.image_list = list(self.human_bboxes.keys())
266 | else:
267 | self.image_list = json.load(open(cfg.FCDB_split, 'r'))['test']
268 | self.image_transformer = transforms.Compose([
269 | transforms.ToTensor(),
270 | transforms.Normalize(mean=IMAGE_NET_MEAN, std=IMAGE_NET_STD)])
271 | self.crop_mask_downsample = 4
272 | self.human_mask_downsample = 16
273 |
274 | def __len__(self):
275 | return len(self.image_list)
276 |
277 | def __getitem__(self, index):
278 | image_name = self.image_list[index]
279 | image_file = os.path.join(self.image_dir, image_name)
280 | assert os.path.exists(image_file), image_file
281 | image = Image.open(image_file).convert('RGB')
282 | im_width, im_height = image.size
283 | if self.keep_aspect:
284 | scale = float(cfg.image_size[0]) / min(im_height, im_width)
285 | h = round(im_height * scale / 32.0) * 32
286 | w = round(im_width * scale / 32.0) * 32
287 | else:
288 | h = cfg.image_size[1]
289 | w = cfg.image_size[0]
290 | resized_image = image.resize((w, h), Image.ANTIALIAS)
291 | im = self.image_transformer(resized_image)
292 | rs_width, rs_height = resized_image.size
293 | ratio_h = float(rs_height) / im_height
294 | ratio_w = float(rs_width) / im_width
295 | if image_name in self.human_bboxes:
296 | hbox = self.human_bboxes[image_name]
297 | hbox = rescale_bbox(hbox, ratio_w, ratio_h)
298 | else:
299 | hbox = np.array([[-1, -1, -1, -1]]).astype(np.float32)
300 | partition_mask = generate_partition_mask(hbox, rs_width, rs_height,
301 | self.human_mask_downsample)
302 |
303 | crop = self.annotations[image_name]
304 | x,y,w,h = crop
305 | crop = torch.tensor([x,y,x+w,y+h])
306 | return im, crop, hbox, partition_mask, im_width, im_height
307 |
308 | class FLMSDataset(Dataset):
309 | def __init__(self, only_human_images=True,
310 | keep_aspect_ratio=False):
311 | self.only_human = only_human_images
312 | self.keep_aspect = keep_aspect_ratio
313 | self.image_dir = cfg.FLMS_image
314 | assert os.path.exists(self.image_dir), self.image_dir
315 | self.human_bboxes = json.load(open(cfg.FLMS_human, 'r'))
316 | self.annotations = json.load(open(cfg.FLMS_anno, 'r'))
317 | if self.only_human:
318 | self.image_list = list(self.human_bboxes.keys())
319 | else:
320 | self.image_list = list(self.annotations.keys())
321 | self.image_transformer = transforms.Compose([
322 | transforms.ToTensor(),
323 | transforms.Normalize(mean=IMAGE_NET_MEAN, std=IMAGE_NET_STD)])
324 | self.crop_mask_downsample = 4
325 | self.human_mask_downsample = 16
326 |
327 | def __len__(self):
328 | return len(self.image_list)
329 |
330 | def __getitem__(self, index):
331 | image_name = self.image_list[index]
332 | image_file = os.path.join(self.image_dir, image_name)
333 | assert os.path.exists(image_file), image_file
334 | image = Image.open(image_file).convert('RGB')
335 | im_width, im_height = image.size
336 | if self.keep_aspect:
337 | scale = float(cfg.image_size[0]) / min(im_height, im_width)
338 | h = round(im_height * scale / 32.0) * 32
339 | w = round(im_width * scale / 32.0) * 32
340 | else:
341 | h = cfg.image_size[1]
342 | w = cfg.image_size[0]
343 | resized_image = image.resize((w, h), Image.ANTIALIAS)
344 | im = self.image_transformer(resized_image)
345 | rs_width, rs_height = resized_image.size
346 | ratio_h = float(rs_height) / im_height
347 | ratio_w = float(rs_width) / im_width
348 | if image_name in self.human_bboxes:
349 | hbox = self.human_bboxes[image_name]
350 | hbox = rescale_bbox(hbox, ratio_w, ratio_h)
351 | else:
352 | hbox = np.array([[-1, -1, -1, -1]]).astype(np.float32)
353 | partition_mask = generate_partition_mask(hbox, rs_width, rs_height,
354 | self.human_mask_downsample)
355 | crop = self.annotations[image_name]
356 | keep_crop = []
357 | for box in crop:
358 | x1, y1, x2, y2 = box
359 | if (x2 > im_width or y2 > im_height):
360 | continue
361 | keep_crop.append(box)
362 | for i in range(10 - len(keep_crop)):
363 | keep_crop.append([-1, -1, -1, -1])
364 | crop = torch.tensor(keep_crop)
365 | return im, crop, hbox, partition_mask, im_width, im_height
366 |
367 | class GAICDataset(Dataset):
368 | def __init__(self, only_human_images=False, split='all',
369 | keep_aspect_ratio=True):
370 | self.only_human = only_human_images
371 | self.split = split
372 | self.keep_aspect = keep_aspect_ratio
373 | self.image_size = cfg.image_size
374 | self.image_dir = cfg.GAIC_image
375 | self.heat_map_dir = cfg.GAIC_heat_map
376 | assert os.path.exists(self.image_dir), self.image_dir
377 | self.human_bboxes = json.load(open(cfg.GAIC_human, 'r'))
378 | self.annotations = json.load(open(cfg.GAIC_anno, 'r'))
379 | self.data_split = json.load(open(cfg.GAIC_split, 'r'))
380 | if self.only_human:
381 | if self.split == 'all':
382 | self.image_list = list(self.human_bboxes.keys())
383 | else:
384 | self.image_list = json.load(open(cfg.GAIC_human_split, 'r'))[self.split]
385 | else:
386 | if self.split == 'all':
387 | self.image_list = self.data_split['test'] + self.data_split['train']
388 | else:
389 | self.image_list = self.data_split[split]
390 | self.augmentation = (cfg.data_augmentation and split == 'train')
391 | self.PhotometricDistort = transforms.ColorJitter(
392 | brightness=0.125, contrast=0.5, saturation=0.5, hue=0.05)
393 | self.image_transformer = transforms.Compose([
394 | transforms.ToTensor(),
395 | transforms.Normalize(mean=IMAGE_NET_MEAN, std=IMAGE_NET_STD)])
396 | self.heat_map_transformer = transforms.ToTensor()
397 | self.crop_mask_downsample = 4
398 | self.human_mask_downsample = 16
399 | self.heat_map_scale = cfg.heat_map_size
400 | self.view_per_image = 64
401 |
402 | def statistic_crop(self):
403 | overlap_thresh = 0.1
404 | total_cnt = 0
405 | nonhuman_cnt = 0
406 | nonhuman_score = []
407 | for image_name in self.human_bboxes.keys():
408 | crop = self.annotations[image_name]['bbox']
409 | score = self.annotations[image_name]['score']
410 | hbox = self.human_bboxes[image_name]
411 | crop = np.array(crop).reshape(-1, 4)
412 | hbox = np.array(hbox).reshape(-1, 4)
413 | score = np.array(score).reshape(-1)
414 | overlap = compute_overlap(crop, hbox)
415 | total_cnt += overlap.shape[0]
416 | nonhuman_cnt += (overlap < overlap_thresh).sum()
417 | nonhuman_score.extend(score[overlap < overlap_thresh].tolist())
418 | print('{} human images, {} crops, {} non-human, {:.2%}'.format(
419 | len(self.human_bboxes.keys()), total_cnt, nonhuman_cnt, float(nonhuman_cnt) / total_cnt
420 | ))
421 | nonhuman_score = np.array(nonhuman_score).reshape(-1)
422 | print('{} scores, mean={:.2f}, median={:.2f}, max={:.2f}, min={:.2f}'.format(
423 | nonhuman_score.shape[0], np.mean(nonhuman_score), np.median(nonhuman_score),
424 | np.max(nonhuman_score), np.min(nonhuman_score)
425 | ))
426 |
427 | def __len__(self):
428 | return len(self.image_list)
429 |
430 | def __getitem__(self, index):
431 | image_name = self.image_list[index]
432 | image_file = os.path.join(self.image_dir, image_name)
433 | image = Image.open(image_file).convert('RGB')
434 | im_width, im_height = image.size
435 | if self.keep_aspect:
436 | scale = float(cfg.image_size[0]) / min(im_height, im_width)
437 | h = round(im_height * scale / 32.0) * 32
438 | w = round(im_width * scale / 32.0) * 32
439 | else:
440 | h = cfg.image_size[1]
441 | w = cfg.image_size[0]
442 | resized_image = image.resize((w, h), Image.ANTIALIAS)
443 | rs_width, rs_height = resized_image.size
444 | ratio_h = float(rs_height) / im_height
445 | ratio_w = float(rs_width) / im_width
446 | if image_name in self.human_bboxes:
447 | hbox = self.human_bboxes[image_name]
448 | hbox = rescale_bbox(hbox, ratio_w, ratio_h)
449 | else:
450 | hbox = np.array([[-1, -1, -1, -1]]).astype(np.float32)
451 | heat_map_file = os.path.join(self.heat_map_dir, os.path.splitext(image_name)[0] + '.png')
452 | heat_map = Image.open(heat_map_file)
453 | # hm_w, hm_h = int(rs_width * self.heat_map_scale), int(rs_height * self.heat_map_scale)
454 | hm_w = hm_h = 64
455 | heat_map = heat_map.resize((hm_w, hm_h))
456 | crop = self.annotations[image_name]['bbox']
457 | crop = rescale_bbox(crop, ratio_w, ratio_h)
458 | score = self.annotations[image_name]['score']
459 | # score = [float(s - MOS_MEAN) / MOS_STD for s in score]
460 | score = torch.tensor(score).reshape(-1)
461 | if self.augmentation:
462 | if random.uniform(0,1) > 0.5:
463 | resized_image = ImageOps.mirror(resized_image)
464 | heat_map = ImageOps.mirror(heat_map)
465 | temp_x1 = crop[:, 0].copy()
466 | crop[:, 0] = rs_width - crop[:, 2]
467 | crop[:, 2] = rs_width - temp_x1
468 |
469 | if image_name in self.human_bboxes:
470 | # print('human mirror_before', hbox[0])
471 | temp_x1 = hbox[:,0].copy()
472 | hbox[:,0] = rs_width - hbox[:,2]
473 | hbox[:,2] = rs_width - temp_x1
474 | # print('human mirror after', hbox)
475 | resized_image = self.PhotometricDistort(resized_image)
476 | im = self.image_transformer(resized_image)
477 | heat_map = self.heat_map_transformer(heat_map)
478 | # crop_mask = generate_crop_mask(crop, rs_width, rs_height,
479 | # self.crop_mask_downsample)
480 | crop_mask = generate_target_size_crop_mask(crop, rs_width, rs_height, 64, 64)
481 | partition_mask = generate_partition_mask(hbox, rs_width, rs_height,
482 | self.human_mask_downsample)
483 | return im, crop, hbox, heat_map, crop_mask, partition_mask, score, im_width, im_height
484 |
485 | def compute_overlap(crops, human_bbox):
486 | human_bbox = human_bbox.reshape(-1)
487 | if not isinstance(crops, np.ndarray):
488 | crops = np.array(crops).reshape((-1,4))
489 | over_x1 = np.maximum(crops[:,0], human_bbox[0])
490 | over_x2 = np.minimum(crops[:,2], human_bbox[2])
491 | over_y1 = np.maximum(crops[:,1], human_bbox[1])
492 | over_y2 = np.minimum(crops[:,3], human_bbox[3])
493 |
494 | over_w = np.maximum(0, over_x2 - over_x1)
495 | over_h = np.maximum(0, over_y2 - over_y1)
496 | over_area = over_w * over_h
497 | overlap = over_area / ((human_bbox[2] - human_bbox[0]) * (human_bbox[3] - human_bbox[1]))
498 | return overlap
499 |
500 | def count_GAICD():
501 | human_bboxes = json.load(open(cfg.GAIC_human, 'r'))
502 | human_lists = list(human_bboxes.keys())
503 | human_split = dict()
504 | data_split = dict()
505 |
506 | GAICD_path = '/workspace/aesthetic_cropping/dataset/GAICD/images'
507 | assert os.path.exists(GAICD_path), GAICD_path
508 | for split in ['train', 'test']:
509 | subpath = os.path.join(GAICD_path, split)
510 | data_split[split] = os.listdir(subpath)
511 | humans = [im for im in data_split[split] if im in human_lists]
512 | human_split[split] = humans
513 | print('{} set {} images, {} human-centric images'.format(split, len(data_split[split]), len(human_split[split])))
514 | with open(os.path.join(cfg.GAIC_path, 'original_data_split.json'), 'w') as f:
515 | json.dump(data_split,f)
516 | with open(os.path.join(cfg.GAIC_path, 'human_data_split.json'), 'w') as f:
517 | json.dump(human_split,f)
518 |
519 | if __name__ == '__main__':
520 | cpc_dataset = CPCDataset(only_human_images=False, keep_aspect_ratio=False)
521 | cpc_dataset.statistic_crop()
522 | print('CPC Dataset contains {} images'.format(len(cpc_dataset)))
523 | dataloader = DataLoader(cpc_dataset, batch_size=1, num_workers=0, shuffle=False)
524 | for batch_idx, data in enumerate(dataloader):
525 | im, crop, hbox, heat_map, crop_mask, partition_mask, score, im_width, im_height = data
526 | print(im.shape, crop.shape, hbox.shape, heat_map.shape, crop_mask.shape,
527 | partition_mask.shape, score.shape, im_width.shape, im_height.shape)
528 |
529 | # fcdb_testset = FCDBDataset(only_human_images=False, keep_aspect_ratio=True)
530 | # print('FCDB testset has {} images'.format(len(fcdb_testset)))
531 | # dataloader = DataLoader(fcdb_testset, batch_size=1, num_workers=4)
532 | # for batch_idx, data in enumerate(dataloader):
533 | # im, crop, partition_mask, w, h = data
534 | # print(im.shape, crop.shape, partition_mask.shape, w.shape, h.shape)
535 | #
536 | # FLMS_testset = FLMSDataset(keep_aspect_ratio=True, only_human_images=False)
537 | # print('FLMS testset has {} images'.format(len(FLMS_testset)))
538 | # dataloader = DataLoader(FLMS_testset, batch_size=1, num_workers=4)
539 | # for batch_idx, data in enumerate(dataloader):
540 | # im, crop, partition_mask, w, h = data
541 | # print(im.shape, crop.shape, partition_mask.shape, w.shape, h.shape)
542 |
543 | # GAICD_testset = GAICDataset(only_human_images=False, keep_aspect_ratio=True, split='train')
544 | # print('GAICD testset has {} images'.format(len(GAICD_testset)))
545 | # dataloader = DataLoader(GAICD_testset, batch_size=1, num_workers=0, shuffle=False)
546 | # for batch_idx, data in enumerate(dataloader):
547 | # im, crop, heat_map, crop_mask, partition_mask, score, im_width, im_height = data
548 | # print(im.shape, crop.shape, heat_map.shape, crop_mask.shape,
549 | # partition_mask.shape, score.shape, im_width.shape, im_height.shape)
550 | # print(crop)
551 | # print(score)
552 | # GAICD_testset.statistic_crop()
--------------------------------------------------------------------------------
/cropping_model.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torchvision.models as models
4 | import torch.nn.functional as F
5 | import torch.nn.init as init
6 | from einops import rearrange, repeat
7 | import sys
8 | sys.path.insert(0, './..')
9 | '''
10 | install RoIAlign and RoIAlign libraries following the repository:
11 | https://github.com/lld533/Grid-Anchor-based-Image-Cropping-Pytorch
12 | '''
13 | from roi_align.modules.roi_align import RoIAlignAvg, RoIAlign
14 | from rod_align.modules.rod_align import RoDAlignAvg, RoDAlign
15 |
16 | import os
17 | import torchvision.transforms as transforms
18 | from PIL import Image
19 | import math
20 | import numpy as np
21 |
22 | class vgg_base(nn.Module):
23 |
24 | def __init__(self, loadweights=True, downsample=4, model_path=None):
25 | super(vgg_base, self).__init__()
26 |
27 | vgg = models.vgg16(pretrained=loadweights)
28 |
29 | # if downsample == 4:
30 | # self.feature = nn.Sequential(vgg.features[:-1])
31 | # elif downsample == 5:
32 | # self.feature = nn.Sequential(vgg.features)
33 |
34 | self.feature3 = nn.Sequential(vgg.features[:23])
35 | self.feature4 = nn.Sequential(vgg.features[23:30])
36 | self.feature5 = nn.Sequential(vgg.features[30:])
37 |
38 | #flops, params = profile(self.feature, input_size=(1, 3, 256,256))
39 |
40 | def forward(self, x):
41 | f3 = self.feature3(x)
42 | f4 = self.feature4(f3)
43 | f5 = self.feature5(f4)
44 | return f3, f4, f5
45 |
46 | class resnet50_base(nn.Module):
47 |
48 | def __init__(self, loadweights=True, downsample=4, model_path=None):
49 | super(resnet50_base, self).__init__()
50 |
51 | resnet50 = models.resnet50(pretrained=loadweights)
52 | self.feature3 = nn.Sequential(resnet50.conv1,resnet50.bn1,resnet50.relu,resnet50.maxpool,resnet50.layer1,resnet50.layer2)
53 | self.feature4 = nn.Sequential(resnet50.layer3)
54 | self.feature5 = nn.Sequential(resnet50.layer4)
55 |
56 | def forward(self, x):
57 | #return self.feature(x)
58 | f3 = self.feature3(x)
59 | f4 = self.feature4(f3)
60 | f5 = self.feature5(f4)
61 | return f3, f4, f5
62 |
63 | class PartitionAwareModule(nn.Module):
64 | def __init__(self, in_dim, partiition=9, concat_with_human=True):
65 | super(PartitionAwareModule, self).__init__()
66 | alignsize = 3
67 | downsample = 4
68 | human_dim = in_dim // 2 if concat_with_human else 0
69 | if partiition in [0,9]:
70 | self.group_conv = nn.Conv2d(in_dim + human_dim, in_dim * 9, kernel_size=3, stride=1, padding=1)
71 | else:
72 | self.group_conv = nn.Conv2d(in_dim + human_dim, in_dim * partiition, kernel_size=3, stride=1, padding=1)
73 | if concat_with_human:
74 | self.RoIAlign = RoIAlignAvg(alignsize, alignsize, 1.0 / 2 ** downsample)
75 | self.RoIConv = nn.Conv2d(in_dim, human_dim, kernel_size=3)
76 | self.partition = partiition
77 | self.concat_with_human = concat_with_human
78 |
79 | def forward(self, x, human_box, partition_mask):
80 | # x: (b,c,h,w)
81 | # p_mask: # b,9,1,h,w
82 | if self.concat_with_human:
83 | humanRoI = self.RoIAlign(x, human_box)
84 | humanRoI = self.RoIConv(humanRoI)
85 | humanRoI = repeat(humanRoI, 'b c 1 1 -> b c h w', h=x.shape[-2], w=x.shape[-1])
86 | cat_feat = torch.cat([x, humanRoI], dim=1)
87 | p_conv = F.relu(self.group_conv(cat_feat))
88 | else:
89 | p_conv = F.relu(self.group_conv(x))
90 | if self.partition in [0,9]:
91 | p_feat = torch.chunk(p_conv, 9, dim=1) # tuple, each of shape (b,c,h,w)
92 | p_feat = torch.stack(p_feat, dim=1) # b,9,c,h,w
93 | p_mean = torch.mean(p_feat, dim=2)
94 | if self.partition == 0:
95 | fused = torch.mean(p_feat, dim=1)
96 | else:
97 | fused = torch.sum(p_feat * partition_mask.unsqueeze(2), dim=1)
98 | else:
99 | human_mask = partition_mask[:,4].unsqueeze(1) # b,1,h,w
100 | if self.partition == 1:
101 | fused = p_conv * human_mask
102 | p_mean = torch.mean(p_conv, dim=1).unsqueeze(1) # b,1,h,w
103 | else:
104 | p_feat = torch.chunk(p_conv, self.partition, dim=1) # tuple, each of shape (b,c,h,w)
105 | p_feat = torch.stack(p_feat, dim=1) # b,2,c,h,w
106 | p_mean = torch.mean(p_feat, dim=2) # b,2,h,w
107 | non_mask = 1 - human_mask
108 | cat_mask = torch.stack([human_mask, non_mask], dim=1) # b,2,1,h,w
109 | fused = torch.sum(cat_mask * p_feat, dim=1) # b,c,h,w
110 | out = x + fused
111 | return out, p_mean
112 |
113 | class ContentAwareModule(nn.Module):
114 | def __init__(self, in_dim):
115 | super(ContentAwareModule, self).__init__()
116 | self.conv1 = nn.Sequential(
117 | nn.Conv2d(in_dim, in_dim, kernel_size=3, padding=1),
118 | nn.ReLU(True))
119 | self.conv2 = nn.Sequential(
120 | nn.Conv2d(in_dim, in_dim, kernel_size=3, padding=1),
121 | nn.ReLU(True))
122 | self.conv3 = nn.Sequential(
123 | nn.Conv2d(in_dim, 1, kernel_size=1),
124 | nn.Sigmoid()
125 | )
126 |
127 | def forward(self, x):
128 | x = F.interpolate(x, scale_factor=2, mode='bilinear', align_corners=True)
129 | x = self.conv1(x)
130 | x = F.interpolate(x, scale_factor=2, mode='bilinear', align_corners=True)
131 | x = self.conv2(x)
132 | out = self.conv3(x)
133 | return out
134 |
135 | class ContentPreserveModule(nn.Module):
136 | def __init__(self, in_dim, inter_dim, out_dim, mode='gcn'):
137 | super(ContentPreserveModule, self).__init__()
138 | if mode == 'conv':
139 | self.relation_encoder = nn.Sequential(
140 | nn.Conv2d(in_dim, in_dim, kernel_size=3, padding=1),
141 | nn.ReLU(True),
142 | nn.Conv2d(in_dim, inter_dim, kernel_size=1))
143 | else:
144 | self.relation_encoder = GraphResoningModule(in_dim, in_dim, inter_dim)
145 |
146 | self.preserve_predict = nn.Sequential(
147 | nn.Upsample(size=(64, 64), mode='bilinear', align_corners=True),
148 | nn.Conv2d(inter_dim, inter_dim, kernel_size=3, padding=1),
149 | nn.ReLU(True),
150 | nn.Conv2d(inter_dim, 1, kernel_size=1),
151 | nn.Sigmoid()
152 | )
153 | self.content_feat_layer = nn.Sequential(
154 | nn.Conv2d(2, 64, kernel_size=5),
155 | nn.MaxPool2d(kernel_size=2),
156 | nn.Conv2d(64, 32, kernel_size=5),
157 | nn.MaxPool2d(kernel_size=2),
158 | nn.Flatten(1),
159 | nn.Linear(5408, out_dim)
160 | )
161 |
162 | def forward(self, feat_map, crop_mask):
163 | '''
164 | :param feat_map: b,c,h,w
165 | :param crop_mask: b,n,h,w
166 | :return:
167 | '''
168 | relation_feat = self.relation_encoder(feat_map)
169 | heat_map = self.preserve_predict(relation_feat)
170 | B, N, H, W = crop_mask.shape
171 | rep_heat = repeat(heat_map, 'b 1 h w -> b n h w', n=N)
172 | cat_map = torch.stack([rep_heat, crop_mask], dim=2)
173 | cat_map = rearrange(cat_map, 'b n c h w -> (b n) c h w', b=B, n=N)
174 | content_feat = self.content_feat_layer(cat_map)
175 | return heat_map, content_feat
176 |
177 | class GraphResoningModule(nn.Module):
178 | def __init__(self, in_dim, inter_dim, out_dim):
179 | super(GraphResoningModule, self).__init__()
180 | self.phi = nn.Linear(in_dim, inter_dim, bias=False)
181 | self.theta = nn.Linear(in_dim, inter_dim, bias=False)
182 | self.weight= nn.Parameter(torch.empty(in_dim, out_dim))
183 | init.xavier_uniform_(self.weight.data)
184 |
185 | def forward(self, feat_map):
186 | '''
187 | :param feat_map: b,c,h,w
188 | :param crop_mask: b,n,h,w
189 | :return:
190 | '''
191 | B,D,H,W = feat_map.shape
192 | feat_vec = rearrange(feat_map, 'b c h w -> b (h w) c')
193 | phi = self.phi(feat_vec) # b,n,d
194 | theta = self.theta(feat_vec) # b,n,d
195 | product = torch.matmul(phi, theta.permute(0,2,1)) # b,n,n
196 | phi_norm = torch.norm(phi, p=2, dim=-1, keepdim=True) # b,n,1
197 | theta_norm = torch.norm(theta, p=2, dim=-1, keepdim=True) # b,n,1
198 | norms = torch.matmul(phi_norm, theta_norm.permute(0,2,1)) # b,n,n
199 | sim = torch.softmax(product / norms, dim=-1) # b,n,n
200 | out = torch.matmul(sim, feat_vec) # b,n,d
201 | out = torch.matmul(out, self.weight) # b,n,d'
202 | updated = rearrange(out, 'b (h w) d -> b d h w', h=H, w=W)
203 | updated = F.relu(updated)
204 | return updated
205 |
206 | class HumanCentricCroppingModel(nn.Module):
207 | def __init__(self, loadweights=True, cfg=None):
208 | super(HumanCentricCroppingModel, self).__init__()
209 | if cfg is None:
210 | from config_GAICD import cfg
211 | reduce_dim = cfg.reduced_dim
212 | alignsize = 9
213 | cp_dim = 256
214 | downsample = 4
215 | self.cfg = cfg
216 | self.backbone = vgg_base(loadweights=loadweights)
217 | self.f3_layer = nn.Conv2d(512, 256, kernel_size=1)
218 | self.f4_layer = nn.Conv2d(512, 256, kernel_size=1)
219 | self.f5_layer = nn.Conv2d(512, 256, kernel_size=1)
220 | self.feat_ext = nn.Conv2d(256, reduce_dim, kernel_size=1, padding=0)
221 | if cfg.use_partition_aware:
222 | self.partition_aware = PartitionAwareModule(reduce_dim, partiition=cfg.partition_aware_type,
223 | concat_with_human=cfg.concat_with_human)
224 | else:
225 | self.partition_aware = None
226 |
227 | if cfg.use_content_preserve:
228 | self.content_preserve = ContentPreserveModule(reduce_dim, 8, cp_dim, mode=cfg.content_preserve_type)
229 | else:
230 | self.content_preserve = None
231 |
232 | fc1_dim = 512
233 | fc2_dim = 256
234 | self.RoIAlign = RoIAlignAvg(alignsize, alignsize, 1.0 / 2 ** downsample)
235 | self.RoDAlign = RoDAlignAvg(alignsize, alignsize, 1.0 / 2 ** downsample)
236 | reduce_dim *= 2
237 | self.roi_feat_layer = nn.Sequential(
238 | nn.Conv2d(reduce_dim, fc1_dim, kernel_size=alignsize),
239 | nn.ReLU(True),
240 | nn.Flatten(1),
241 | nn.Linear(fc1_dim, fc2_dim, bias=False),
242 | )
243 | if cfg.use_content_preserve and not cfg.only_content_preserve:
244 | self.fc_layer = nn.Linear(fc2_dim+cp_dim, 1)
245 | else:
246 | self.fc_layer = nn.Linear(fc2_dim, 1)
247 |
248 | def forward(self, im, crop_box, human_box, crop_mask, human_mask):
249 | B, N, _ = crop_box.shape
250 | if crop_box.shape[-1] == 4:
251 | index = torch.arange(B).view(-1, 1).repeat(1, N).reshape(B, N, 1).to(im.device)
252 | crop_box = torch.cat((index, crop_box), dim=-1).contiguous()
253 | if human_box.shape[-1] == 4:
254 | hindex = torch.arange(B).view(-1, 1, 1).to(im.device)
255 | human_box = torch.cat((hindex, human_box), dim=-1).contiguous()
256 | if crop_box.dim() == 3:
257 | crop_box = crop_box.reshape(-1, 5)
258 | human_box = human_box.reshape(-1, 5)
259 |
260 | f3, f4, f5 = self.backbone(im)
261 | f3 = F.interpolate(f3, size=f4.shape[2:], mode='bilinear', align_corners=True)
262 | f5 = F.interpolate(f5, size=f4.shape[2:], mode='bilinear', align_corners=True)
263 | agg_feat = self.f3_layer(f3) + self.f4_layer(f4) + self.f5_layer(f5)
264 | red_feat = self.feat_ext(agg_feat)
265 | contain_human = (torch.count_nonzero(human_mask[0,4]) > 0)
266 | part_feat = torch.zeros_like(human_mask.detach())
267 |
268 | if self.partition_aware and contain_human:
269 | red_feat, part_feat = self.partition_aware(red_feat, human_box, human_mask)
270 |
271 | RoI_feat = self.RoIAlign(red_feat, crop_box)
272 | RoD_feat = self.RoDAlign(red_feat, crop_box)
273 | cat_feat = torch.cat([RoI_feat, RoD_feat], dim=1)
274 | crop_feat = self.roi_feat_layer(cat_feat)
275 | heat_map = torch.zeros_like(crop_mask[:, 0:1])
276 |
277 | if self.content_preserve:
278 | heat_map, cont_feat = self.content_preserve(red_feat, crop_mask)
279 | if self.cfg.only_content_preserve:
280 | crop_feat = cont_feat
281 | else:
282 | crop_feat = torch.cat([crop_feat, cont_feat], dim=1)
283 |
284 | crop_score = self.fc_layer(crop_feat)
285 | score = rearrange(crop_score, '(b n) 1 -> b n', b=B, n=N)
286 | return part_feat, heat_map, score
287 |
288 | def weights_init(m):
289 | if isinstance(m, (nn.Conv2d, nn.Linear)):
290 | init.xavier_uniform_(m.weight.data)
291 | m.bias.data.zero_()
292 |
293 | def listwise_view_ranking_loss(pre_score, gt_score):
294 | if pre_score.dim() > 1:
295 | pre_score = pre_score.reshape(-1)
296 | if gt_score.dim() > 1:
297 | gt_score = gt_score.reshape(-1)
298 | assert pre_score.shape == gt_score.shape, '{} vs. {}'.format(pre_score.shape, gt_score.shape)
299 | scores = nn.LogSoftmax(dim=-1)(pre_score)
300 | score_list = gt_score.detach().cpu().numpy().tolist()
301 | sort_scores = torch.sort(torch.unique(gt_score))[0].detach().cpu().numpy().tolist()
302 | label_list = [sort_scores.index(e) + 1 for e in score_list]
303 | labels = torch.tensor(label_list).float().to(gt_score.device)
304 | labels = F.softmax(labels, dim=-1)
305 | loss = torch.sum(-(scores * labels), dim=-1)
306 | return loss
307 |
308 | def score_weighted_regression_loss(pre_score, gt_score, score_mean):
309 | if pre_score.dim() > 1:
310 | pre_score = pre_score.reshape(-1)
311 | if gt_score.dim() > 1:
312 | gt_score = gt_score.reshape(-1)
313 | assert pre_score.shape == gt_score.shape, '{} vs. {}'.format(pre_score.shape, gt_score.shape)
314 | l1_loss = F.smooth_l1_loss(pre_score, gt_score, reduction='none')
315 | weight = torch.exp((gt_score - score_mean).clip(min=0,max=100))
316 | reg_loss= torch.mean(weight * l1_loss)
317 | return reg_loss
318 |
319 | def score_regression_loss(pre_score, gt_score):
320 | if pre_score.dim() > 1:
321 | pre_score = pre_score.reshape(-1)
322 | if gt_score.dim() > 1:
323 | gt_score = gt_score.reshape(-1)
324 | assert pre_score.shape == gt_score.shape, '{} vs. {}'.format(pre_score.shape, gt_score.shape)
325 | l1_loss = F.smooth_l1_loss(pre_score, gt_score, reduction='mean')
326 | return l1_loss
327 |
328 | def score_rank_loss(pre_score, gt_score):
329 | if pre_score.dim() > 1:
330 | pre_score = pre_score.reshape(-1)
331 | if gt_score.dim() > 1:
332 | gt_score = gt_score.reshape(-1)
333 | assert pre_score.shape == gt_score.shape, '{} vs. {}'.format(pre_score.shape, gt_score.shape)
334 | N = pre_score.shape[0]
335 | pair_num = N * (N-1) / 2
336 | pre_diff = pre_score[:,None] - pre_score[None,:]
337 | gt_diff = gt_score[:,None] - gt_score[None,:]
338 | indicat = -1 * torch.sin(gt_diff) * (pre_diff - gt_diff)
339 | diff = torch.maximum(indicat, torch.zeros_like(indicat))
340 | rank_loss= torch.sum(diff) / pair_num
341 | return rank_loss
342 |
343 | def partition_ce_loss(pre_patition, gt_partition):
344 | '''
345 | :param pre_patition: b,9,h,w
346 | :param gt_partition: b,9,h,w
347 | :return:
348 | '''
349 | pre_logit = rearrange(pre_patition, 'b c h w -> (b h w) c')
350 | gt_mask = rearrange(gt_partition, 'b c h w -> (b h w) c')
351 | gt_label = torch.argmax(gt_mask, dim=-1) # (b h w)
352 | loss = F.cross_entropy(pre_logit, gt_label)
353 | return loss
354 |
355 | if __name__ == '__main__':
356 | device = torch.device('cuda:0')
357 | net = HumanCentricCroppingModel(loadweights=True).to(device)
358 | w,h = 256, 256
359 | x = torch.randn(2,3,h,w).to(device)
360 | boxes = torch.tensor([[64, 64, 223, 223],
361 | [64, 64, 223, 223]]).float().to(device)
362 | boxes = boxes.unsqueeze(0).repeat(2,1,1)
363 | human_box = torch.tensor([32, 32, 64, 64]).reshape(1,1,-1).float().to(device)
364 | human_box = human_box.repeat(2,1,1)
365 | crop_mask = torch.randn(2, 2,h//4,w//4).to(device)
366 | human_mask = torch.randn(2,9,h//16,w//16).to(device)
367 | part_mask, heat_map, score = net(x, boxes, human_box, crop_mask, human_mask)
368 | print(part_mask.shape, heat_map.shape, score.shape)
369 | print(score)
--------------------------------------------------------------------------------
/figures/motivation.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/bcmi/Human-Centric-Image-Cropping/4720b68551ecc68a5cac6e915eecc60e93b73f76/figures/motivation.jpg
--------------------------------------------------------------------------------
/figures/pipeline.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/bcmi/Human-Centric-Image-Cropping/4720b68551ecc68a5cac6e915eecc60e93b73f76/figures/pipeline.png
--------------------------------------------------------------------------------
/figures/qualitative_comparison.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/bcmi/Human-Centric-Image-Cropping/4720b68551ecc68a5cac6e915eecc60e93b73f76/figures/qualitative_comparison.png
--------------------------------------------------------------------------------
/generate_pseudo_heatmap.py:
--------------------------------------------------------------------------------
1 | import os
2 | import cv2
3 | import numpy as np
4 | import pickle
5 | import lmdb
6 | import datetime
7 | import torch
8 | import PIL.Image as Image
9 | from torch.utils.data import DataLoader, Dataset
10 | import torchvision.transforms as transforms
11 | import json
12 | import matplotlib.pyplot as plt
13 | import math
14 | import shutil
15 | from tqdm import tqdm
16 |
17 | from config_GAICD import cfg
18 |
19 | heat_map_dir = './heat_map_gt'
20 | os.makedirs(heat_map_dir, exist_ok=True)
21 |
22 | class HeatMap:
23 | def __init__(self, dataset, top_k=10):
24 | self.dataset = dataset
25 | self.topk = top_k
26 | self.weighted_sum = True
27 | if self.dataset == 'CPC':
28 | self.image_dir = cfg.CPC_image
29 | self.human_bboxes = json.load(open(cfg.CPC_human, 'r'))
30 | self.annotations = json.load(open(cfg.CPC_anno, 'r'))
31 | else:
32 | self.image_dir = cfg.GAIC_image
33 | self.human_bboxes = json.load(open(cfg.GAIC_human, 'r'))
34 | self.annotations = json.load(open(cfg.GAIC_anno, 'r'))
35 | assert os.path.exists(self.image_dir), self.image_dir
36 | self.image_list = list(self.annotations.keys())
37 | self.human_image_list = list(self.human_bboxes.keys())
38 | self.heat_map_path = os.path.join(heat_map_dir, self.dataset)
39 | if os.path.exists(self.heat_map_path):
40 | shutil.rmtree(self.heat_map_path)
41 | os.makedirs(self.heat_map_path, exist_ok=True)
42 | self.display_path = os.path.join(self.heat_map_path, 'display')
43 | self.mask_path = os.path.join(self.heat_map_path, 'mask')
44 | os.makedirs(self.display_path)
45 | os.makedirs(self.mask_path)
46 | self.score_mean, self.score_std = self.statistic_score()
47 | self.generate_heat_map()
48 |
49 | def statistic_score(self):
50 | score_list = []
51 | for image_name in self.image_list:
52 | if self.dataset == 'CPC':
53 | crop = self.annotations[image_name]['bboxes']
54 | score = self.annotations[image_name]['scores']
55 | else:
56 | crop = self.annotations[image_name]['bbox']
57 | score = self.annotations[image_name]['score']
58 | score_list.extend(score)
59 | score = np.array(score_list).reshape(-1)
60 | mean = np.mean(score)
61 | std = np.std(score)
62 | print('{} has {} score annotations, mean={:.2f}, std={:.2f}'.format(
63 | self.dataset, len(score), mean, std))
64 | return mean, std
65 |
66 | def normalize_mask(self, mask, score, weighted_sum=False):
67 | height, width = mask.shape[1:]
68 | if not weighted_sum:
69 | mask = np.sum(mask, axis=0)
70 | mask = (mask - mask.min()) / (mask.max() - mask.min())
71 | # else:
72 | # weights = (score - score.min()) / (score.max() - score.min())
73 | # weighted_mask = np.sum(weights[:, None, None] * mask, axis=0)
74 | # weighted_mask = (weighted_mask - weighted_mask.min()) / (weighted_mask.max() - weighted_mask.min())
75 | # return weighted_mask
76 | else:
77 | if len(mask) > 10:
78 | pos = score > self.score_mean
79 | mask = mask[pos]
80 | score = score[pos]
81 | area = mask.sum(2).sum(1) / (height * width)
82 | score = score + area * 2
83 | weight = torch.softmax(torch.from_numpy(score).reshape(-1, 1), dim=0).unsqueeze(2).numpy()
84 | pos_mask = np.sum(weight * mask, axis=0)
85 | mask = (pos_mask - pos_mask.min()) / (pos_mask.max() - pos_mask.min())
86 | exp_mask = np.expand_dims(mask, axis=2)
87 | norm_mask = np.zeros((height, width, 1))
88 | cv2.normalize(exp_mask, norm_mask, 0, 255, cv2.NORM_MINMAX)
89 | norm_mask = np.asarray(norm_mask, dtype=np.uint8)
90 | return norm_mask
91 |
92 | def mask2hotmap(self, src, mask):
93 | heat_im = cv2.applyColorMap(mask, cv2.COLORMAP_JET)
94 | heat_im = cv2.cvtColor(heat_im, cv2.COLOR_BGR2RGB)
95 | fuse_im = cv2.addWeighted(src, 0.3, heat_im, 0.7, 0)
96 | return fuse_im
97 |
98 | def generate_heat_map(self):
99 | for image_name in tqdm(self.image_list):
100 | image_file = os.path.join(self.image_dir, image_name)
101 | assert os.path.exists(image_file), image_file
102 | src = cv2.imread(image_file)
103 | height, width = src.shape[0:2]
104 |
105 | if self.dataset == 'CPC':
106 | crop = self.annotations[image_name]['bboxes']
107 | score = self.annotations[image_name]['scores']
108 | else:
109 | crop = self.annotations[image_name]['bbox']
110 | score = self.annotations[image_name]['score']
111 |
112 | crop = np.array(crop).astype(np.int).reshape((-1, 4))
113 | score = np.array(score).reshape(-1)
114 | rank = np.argsort(score)[::-1]
115 | crop = crop[rank]
116 | score = score[rank]
117 | topk = min(self.topk, len(crop))
118 | mask = np.zeros((len(crop), height, width))
119 | for i in range(len(crop)):
120 | x1,y1,x2,y2 = [int(c) for c in crop[i]]
121 | mask[i, y1:y2, x1:x2] += 1
122 | fuse_mask = self.normalize_mask(mask, score, weighted_sum=self.weighted_sum)
123 | mask_path = os.path.join(self.mask_path, image_name.replace('.jpg', '.png'))
124 | cv2.imwrite(mask_path, fuse_mask)
125 |
126 | if image_name not in self.human_image_list:
127 | continue
128 | plt.figure(figsize=(10,10))
129 | plt.subplots_adjust(hspace=0.5, wspace=0.5)
130 | plt.subplot(2,3,1)
131 | plt.imshow(src[:,:,::-1])
132 | plt.title('input image')
133 | plt.axis('off')
134 |
135 | fuse_im = self.mask2hotmap(src, self.normalize_mask(mask[:5], score[:5], weighted_sum=self.weighted_sum))
136 | plt.subplot(2,3,2)
137 | plt.imshow(fuse_im, vmin=0, vmax=255)
138 | plt.title('top-5 heat map')
139 | plt.axis('off')
140 |
141 | fuse_im = self.mask2hotmap(src, self.normalize_mask(mask[:10], score[:10], weighted_sum=self.weighted_sum))
142 | plt.subplot(2, 3, 3)
143 | plt.imshow(fuse_im, vmin=0, vmax=255)
144 | plt.title('top-10 heat map')
145 | plt.axis('off')
146 |
147 | top_mask = mask
148 | fuse_im = self.mask2hotmap(src, self.normalize_mask(top_mask, score, weighted_sum=self.weighted_sum))
149 | plt.subplot(2, 3, 4)
150 | plt.imshow(fuse_im, vmin=0, vmax=255)
151 | plt.title('score > mean_score heat map'.format(mask.shape[0]))
152 | plt.axis('off')
153 |
154 | best_crop = [int(x) for x in crop[0]]
155 | best_im = fuse_im[best_crop[1] : best_crop[3], best_crop[0] : best_crop[2]]
156 | plt.subplot(2,3,5)
157 | plt.imshow(best_im)
158 | plt.title('best crop')
159 | plt.axis('off')
160 |
161 | bad_crop = [int(x) for x in crop[-1]]
162 | bad_im = fuse_im[bad_crop[1] : bad_crop[3], bad_crop[0] : bad_crop[2]]
163 | plt.subplot(2, 3, 6)
164 | plt.imshow(bad_im)
165 | plt.title('worst crop')
166 | plt.axis('off')
167 |
168 | plt.tight_layout()
169 | save_fig = os.path.join(self.display_path, image_name)
170 | plt.savefig(save_fig)
171 | # plt.show()
172 | plt.close()
173 | # print(save_fig, human_numer, nonhuman_number)
174 |
175 |
176 | if __name__ == '__main__':
177 | HeatMap('GAICD')
--------------------------------------------------------------------------------
/human_bboxes/FCDB/data_split.json:
--------------------------------------------------------------------------------
1 | {"train": ["4910188666_04cf9f487d_b.jpg", "541402014_6d9f21ba8b_b.jpg", "23361824710_4cf1345188_c.jpg", "734680941_ceabdc5e42_o.jpg", "8303815506_ea79387c42_c.jpg", "5808034069_019dd83c99_b.jpg", "5782605225_ef43b80482_b.jpg", "2808329980_c1557c3972_b.jpg", "3404902674_1c50e85c39_b.jpg", "8585778497_e5868c2b7b_c.jpg", "2870093392_ac396a069f_b.jpg", "4887697425_ed7ed96a7b_b.jpg", "4593715956_9d0de87929_b.jpg", "2042696446_2556d13c0b_b.jpg", "2361817609_91c4eff8df_b.jpg", "5237687896_bdf349c7e7_b.jpg", "21145828833_a434343ed9_c.jpg", "2839993476_580db550b9_o.jpg", "4806571783_8726a66bc2_b.jpg", "8455922606_b0a908b5ec_c.jpg", "6099740343_ecaaf53945_b.jpg", "3886467178_840c4da178_b.jpg", "3214216374_1449711a32_o.jpg", "2781187081_b367c6e5fc_b.jpg", "5005574607_e50e541b18_b.jpg", "6157997326_c12673698b_b.jpg", "5794711715_9abfdaaa53_b.jpg", "3968099779_1b0dc1f006_b.jpg", "13179700414_73169bd82b_c.jpg", "6077338965_cd4038ff7a_b.jpg", "3808769144_49b168a92e_b.jpg", "23124963609_91b8ce37d0_c.jpg", "3180017589_a2398322c4_o.jpg", "4160433824_e94e7877dc_b.jpg", "3447521051_be19081d79_o.jpg", "3262317059_f825784f84_b.jpg", "4340753319_fc32a0f15d_o.jpg", "4154715706_8e13f89920_o.jpg", "6961245926_482e9b4d04_c.jpg", "4430103244_17b9af5027_b.jpg", "17125637175_da317ff236_c.jpg", "8062088404_cbffd05196_c.jpg", "4644633023_f0fab6f767_b.jpg", "92421229_581b3d51b2_b.jpg", "5646496634_0f015f98f9_b.jpg", "3548805431_5bb192077b_b.jpg", "7996394837_d34820a411_c.jpg", "4439304888_80106281df_b.jpg", "88254138_78553d2659_b.jpg", "2679602215_dcc3424fb3_b.jpg", "549585097_cd088bbce3_b.jpg", "9415194618_20c9391083_c.jpg", "5990412799_1042de23f2_b.jpg", "4201093965_69819acc08_b.jpg", "19831654502_0f94a8c19b_c.jpg", "7384141950_b54024bece_c.jpg", "5394494396_e3201e467d_b.jpg", "7006823703_8652a88ae9_c.jpg", "4758188186_13c8990964_b.jpg", "85428860_c4a6823fdb_b.jpg", "3238325867_7cdf361ca2_o.jpg", "3050738966_e6073cc043_b.jpg", "18293352926_e5c53b7266_c.jpg", "7945129608_42f9a167c5_c.jpg", "22466866528_ec88a7db7f_c.jpg", "3407264561_dc5e39995f_b.jpg", "5168441662_37b5b02c72_b.jpg", "6894381065_75990a0fd8_b.jpg", "23048920955_0bced2a9b6_c.jpg", "6971154728_c4a8899269_c.jpg", "2701744003_e8d110e23d_o.jpg", "3139484304_0113d6e6d8_b.jpg", "4223943233_b82cd3201f_b.jpg", "4214515986_8c7aaa2015_b.jpg", "16582975724_349010e75f_c.jpg", "4193752608_4bd65e5090_o.jpg", "3591314704_65404afc41_b.jpg", "9279438911_8927bc068c_c.jpg", "8104201739_498358f3c4_c.jpg", "2949538889_7d48c2b9c7_o.jpg", "2917961861_52d311168f_b.jpg", "16485688749_f83f5f38ef_c.jpg", "8691306529_254648651f_c.jpg", "6827998334_eeb8867fe7_c.jpg", "2321177732_dd1aa71f17_o.jpg", "20118462063_32620a6f27_c.jpg", "7618778560_1a583b29af_c.jpg", "12048375993_b9b0f5d98c_c.jpg", "14602488995_ef6c0499ce_c.jpg", "4311721622_895cfe2892_o.jpg", "2876188365_d88ed8a052_b.jpg", "10884871684_476ba6ec8f_c.jpg", "5653821068_98dc141fc1_b.jpg", "18618684938_89dd714278_c.jpg", "3888323100_a77d78113d_b.jpg", "8226759241_9d9eb9ac8a_c.jpg", "19511672486_d4b5bbe78e_c.jpg", "10325183785_6320f354b8_c.jpg", "8445387618_8038d9b2fd_c.jpg", "22100157642_fc59368616_c.jpg", "8997607849_7012ee5097_c.jpg", "221438622_47dce1bf16_b.jpg", "3231838631_624bd72e6f_b.jpg", "8535410869_6a36409c2d_c.jpg", "8515708132_d8d5108d7f_c.jpg", "533566477_404685b444_b.jpg", "7819141294_973823ebf5_c.jpg", "2343276303_b027b34479_b.jpg", "21213987492_abb1c407e5_c.jpg", "5714272946_e9d29ba375_b.jpg", "8988812214_4190ef843b_c.jpg", "12348984654_4491f5ece0_c.jpg", "2306940887_aa51f069d6_b.jpg", "5908304120_6025c8c5e8_b.jpg", "2471737725_446bd52553_b.jpg", "12437304993_d476230f39_c.jpg", "3676438605_7afd203f4a_b.jpg", "5214682472_26ce2de36e_b.jpg", "14325995004_9034faf97e_c.jpg", "321969873_e72ade4bbe_o.jpg", "12547204835_b5f5861369_c.jpg", "8025202759_e71744d049_c.jpg", "17795792259_1a8966a438_c.jpg", "4974583259_89498085d5_b.jpg", "7218959918_78a210cff5_c.jpg", "517780041_1f8efd9de4_b.jpg", "260498979_bdf19e9616_o.jpg", "611183750_ff730e9657_b.jpg", "10100461493_2597206933_c.jpg", "4319200031_440b82e4d2_b.jpg", "23737186965_ede686946a_c.jpg", "22130886504_b7422c45eb_c.jpg", "16736557879_aa5cbb916f_c.jpg", "5867894824_9d7ff02e0d_b.jpg", "4738622703_17af11f904_b.jpg", "8660467798_d400d9482f_c.jpg", "5944832841_2894f958c4_b.jpg", "4553095608_6cce7bc5a0_b.jpg", "22759645545_72fec31c63_c.jpg", "4571926312_921b986892_b.jpg", "16318498382_b81b195930_c.jpg", "3763662290_5c0235478e_o.jpg", "3550939215_dd8614074d_b.jpg", "23065226594_2901f31963_c.jpg", "3057515884_58cc8e97c9_b.jpg", "22625808588_910655610c_c.jpg", "8907227716_b0d21d916f_c.jpg", "7090253263_75cdb2d02d_c.jpg", "3795372681_431c6b0e4c_b.jpg", "2217501518_1def8069a8_o.jpg", "20760675403_5f3b837c82_c.jpg", "4800497503_87fe223b26_b.jpg", "3579364195_b0f8b621db_o.jpg", "3219088943_f4220a7256_o.jpg", "14636071375_35f5dfef66_c.jpg", "20282673201_9c25796a80_c.jpg", "14517844877_cf49c387b6_c.jpg", "6060249948_51746b754f_b.jpg", "19318012258_1768f0efd4_c.jpg", "5832677747_dec1c418a8_b.jpg", "338519791_225567b998_b.jpg", "4168657309_3020ec59dd_o.jpg", "15681509401_e979547659_c.jpg", "6142667969_acae837ab9_b.jpg", "12002411854_f502633084_c.jpg", "16122360749_1a47b0a26d_c.jpg", "3374250836_4afe25397c_b.jpg", "2673169487_e60535d90a_b.jpg", "499297998_19d35e8927_b.jpg", "3421194821_c5a68a422c_o.jpg", "13229886735_ac786fb3e6_c.jpg", "731583713_2fde35c209_b.jpg", "3225388186_f759bb2efd_b.jpg", "4695258103_ea5ff1d4c7_b.jpg", "9721013193_9d7e2394a8_c.jpg", "258887093_abc7a6fd8a_o.jpg", "8101845854_aae5a203ae_c.jpg", "2199328642_b2720f5d73_b.jpg", "9513142218_e1fa17486e_c.jpg", "20655939975_1fa66b2d56_c.jpg", "3907147179_860df9fca5_b.jpg", "472745267_4e76c71e98_b.jpg", "14975779256_a948bbde8d_c.jpg", "2172394144_e49dd5da35_o.jpg", "5174655209_e27b8bccb9_b.jpg", "17956751690_a8a2eaa85c_c.jpg", "196182059_c204b061b6_b.jpg", "7573374532_a3ef8a9998_c.jpg", "3867151516_8392cd122a_b.jpg", "3595779777_3d5b588753_b.jpg", "4107531094_7010756df3_b.jpg", "3353695115_6b20a2b4f8_b.jpg", "433008570_6cb9124dd5_b.jpg", "22786348460_840c295bc6_c.jpg", "4919948876_0c15e52c57_b.jpg", "6764514689_316285ebef_b.jpg", "23432512185_b02fd2fa29_c.jpg", "2981123893_3a7025563b_o.jpg", "6030477272_750c0944bc_b.jpg", "5294381748_83946401aa_b.jpg", "13935524706_a74aa1d8d9_c.jpg", "2321763853_60ae688ca9_b.jpg", "8398094803_a2c7645d7d_c.jpg", "2350134559_7cd22143aa_b.jpg", "13668001834_71b52151c0_c.jpg", "4514059239_24e5ae0bc6_b.jpg", "8370960443_5e71b0ff92_c.jpg", "5117362953_69e2d91509_b.jpg", "9316530032_3367bfe767_c.jpg", "13842086385_fe5ae0c52a_c.jpg", "6781596358_8efa351164_b.jpg", "18798136865_2b500f0c85_c.jpg", "20468036586_3f312b1944_c.jpg", "3754883861_9fe62cce81_b.jpg", "4473894201_080494bc02_b.jpg", "3750276693_8e507ce1a4_o.jpg", "7032333741_3b3ab17286_c.jpg", "14814725920_ef551a2983_c.jpg", "2487200620_72e8b5f8b8_o.jpg", "5120859921_24064343e9_b.jpg", "2265919893_8b3e1da67e_b.jpg", "5719365374_98107152f3_b.jpg", "22857366457_1b93dd3d46_c.jpg", "8489921021_955180ec6a_c.jpg", "6758326771_9f6d355471_b.jpg", "5583124033_903e07266a_b.jpg", "4580886884_5135a20f85_b.jpg", "3929747967_3847a855b5_b.jpg", "17389910412_5c4a5713e7_c.jpg", "6129815981_713bea4532_b.jpg", "7981113830_1e9ba412e1_c.jpg", "18930815778_8689d60ccc_c.jpg", "5817471014_48dd232d68_b.jpg", "3757927409_35783b5a91_b.jpg", "16985182529_2ecc3b639a_c.jpg", "4033480137_17740a92e7_b.jpg", "6979735930_991247c039_c.jpg", "9646966948_39441cfd6f_c.jpg", "2701790832_83be776320_b.jpg", "5760275886_16494f1660_b.jpg", "6349127012_d1f451147f_b.jpg", "4565128220_227ccc7b09_b.jpg", "6574497687_2a83fbe12c_b.jpg", "250923071_f9d723d5db_b.jpg", "417109550_bf17adfa53_b.jpg", "9607029021_e2a1a1c8c9_c.jpg", "4080103963_5f604a0fbc_b.jpg", "4331287532_53f782d060_b.jpg", "21865146804_48826a186d_c.jpg", "4230098407_471982e795_o.jpg", "22563940740_554168a5c0_c.jpg", "5339768652_b7af342ebc_b.jpg", "17203994010_82e2178a18_c.jpg", "5662433518_83f818358b_b.jpg", "166298286_59820f896b_b.jpg", "4345894455_579f8e1ff6_o.jpg", "282623264_76b510870e_b.jpg", "272574164_6dcd859832_o.jpg", "4789883049_ed6e0cd494_b.jpg", "1450945425_2582b7a71c_b.jpg", "7398570468_32c9a7a2dc_c.jpg", "14628488609_89faa6d2e1_c.jpg", "2768184473_0b7159aea9_b.jpg", "229616557_abdf43735d_b.jpg", "5446368398_9c3ff74443_b.jpg", "22074736561_a339549eff_c.jpg", "5065501279_5e90de539e_b.jpg", "2310026828_791fb76056_b.jpg", "19805500203_74baa5a377_c.jpg", "14489719271_4f39612e9d_c.jpg", "3432940096_3f2127f722_b.jpg", "20199964351_6c333205f2_c.jpg", "5580248427_391888e9e9_b.jpg", "65820149_661ced7986_b.jpg", "7527951864_ce43040f4f_c.jpg", "3654967608_95e3fdf2ab_b.jpg", "3572976568_35e3db1280_b.jpg", "3995375705_5e6dda9e98_b.jpg", "5480737643_a344ce65ff_b.jpg", "389843181_5e2e0f96c5_o.jpg", "9449620384_ca42c8bd1c_c.jpg", "14054681074_22a502c535_c.jpg", "6658369867_24541b25a9_b.jpg", "16104276216_d7008267e7_c.jpg", "18955211639_00da0648ed_c.jpg", "2549935435_9bb1a3495f_o.jpg", "2425917510_db2c9be582_b.jpg", "5105716511_0f825de125_b.jpg", "2078669736_321eb7a5ea_b.jpg", "4671070513_1a92b49972_b.jpg", "4552409726_72ae03c6ce_o.jpg", "3571923089_4a36f10599_b.jpg", "11033441755_a02969ab5e_c.jpg", "4134550606_af9c9e7624_b.jpg", "10992814763_7563d43113_c.jpg", "4088954880_7ca0db8964_o.jpg", "16350626170_6b94890ff4_c.jpg", "4330738202_1c4ecc6b6b_b.jpg", "942547180_d3d73e5f2d_b.jpg", "3464043302_ddaec5ef60_b.jpg", "16835674072_5772e4f71c_c.jpg", "3171690973_d3b1154982_b.jpg", "8719070035_9a60a50f99_c.jpg", "23844334716_b0cfe76a4f_c.jpg", "4796970560_cfc52bbb6a_b.jpg", "9216097266_b7aaf62a3c_c.jpg", "4031167404_1f745a5df8_b.jpg", "14052205626_b6956e8ab9_c.jpg", "22320702303_7a82e4d044_c.jpg", "5611899143_5a31675d1b_b.jpg", "14821085701_d8e9b0f313_c.jpg", "6063042881_bf8b71b0af_b.jpg", "21413393506_a93001248d_c.jpg", "4774374808_55d1cf3a9b_b.jpg", "510780119_de27c6c81d_b.jpg", "6505463509_87cf8fc6a0_b.jpg", "4152775143_b89ee954fb_b.jpg", "4088481124_f995260ec3_o.jpg", "8929358399_0b97d7d2bb_c.jpg", "15158955193_e6e60d7317_c.jpg", "15191152919_3ce117ae29_c.jpg", "3140972374_29e9bd2a61_b.jpg", "21610132779_4ac0a9bbe1_c.jpg", "14422598959_6bd4ee3210_c.jpg", "25007270_063635c757_o.jpg", "2595359018_ea4e82e900_b.jpg", "7456410340_5a36fbd8bd_c.jpg", "3027979734_1558b07965_o.jpg", "13134316073_69cc9522dd_c.jpg", "9710260188_81d08dfe1a_c.jpg", "13914779771_b12f2d40fb_c.jpg", "8591771654_31cece470a_c.jpg", "14969425154_855426e5d8_c.jpg", "2127512366_d58aee2eca_b.jpg", "5645923191_68b48b44dc_b.jpg", "2513741825_a5f891ef01_b.jpg", "3142501503_6373604b4b_b.jpg", "6822588809_fd760a3c44_b.jpg", "9285562214_7099bed346_c.jpg", "21745708903_38eb372588_c.jpg", "20175952908_15be5b0c56_c.jpg", "22641062728_7b8d4dd0af_c.jpg", "12597940315_2684c1294f_c.jpg", "22567764082_36e76bb119_c.jpg", "3463555899_c248235d5d_b.jpg", "15612144043_b44411aea8_c.jpg", "5346664759_768b577739_b.jpg", "2175529797_10581d9683_o.jpg", "15174050021_0458bfb326_c.jpg", "35799548_0e4c79a5ca_b.jpg", "17859637_4fad4671ee_o.jpg", "6545067517_374883d482_b.jpg", "790051208_489fa8b645_o.jpg", "7455263026_cab9a377e4_c.jpg", "1924960814_5e2db61668_o.jpg", "5730609369_83160675a2_b.jpg", "6494258083_cf29f81330_b.jpg", "16817876069_733911e9a2_c.jpg", "4144086193_b38fe2d0df_b.jpg", "8715261793_d8aea22745_c.jpg", "6042508733_3855f35f21_b.jpg", "2118649278_1dc2aafb38_o.jpg", "21057819188_fd2e408170_c.jpg", "5836587543_7d77fc35be_b.jpg", "20871482255_94eeabe412_c.jpg", "138337087_66f4bb95a5_o.jpg", "8953633778_d90cd5ff6c_c.jpg", "4098326327_ed2b183116_b.jpg", "3405723563_6c749858f0_o.jpg", "16462709402_5f53fc4f6f_c.jpg", "13529981875_45b3e6d6b7_c.jpg", "4470684420_24d96f30fb_b.jpg", "6804681516_e720e5e634_c.jpg", "13793558185_7e19276dcd_c.jpg", "3557130210_78db3e2a13_b.jpg", "5697770046_aaee118f43_b.jpg", "3605642532_48fe89fe8f_b.jpg", "4777516537_6869763487_b.jpg", "14751374335_57a5c559aa_c.jpg", "4297432961_d41babc7b6_b.jpg", "6234359505_469bd11200_b.jpg", "10836587473_c68812c646_c.jpg", "18367166002_303778e5d5_c.jpg", "7385211384_d22765c7b7_c.jpg", "3290499879_368de49bd2_b.jpg", "14482014885_bf167dcd8e_c.jpg", "4143182565_5e7942b7d2_o.jpg", "320747983_59345ed818_o.jpg", "2163214948_02e2f53933_b.jpg", "5242605879_76852f3f16_b.jpg", "3644493741_095c8a76c3_b.jpg", "7998462783_405c725176_c.jpg", "7679448330_4ccde7cee9_c.jpg", "7080123509_d1046fde4d_c.jpg", "3326333792_7447ca00d1_b.jpg", "155228771_6290ce615e_o.jpg", "62820937_95e2d9c160_b.jpg", "3748400673_f25bc9b15b_b.jpg", "4219766886_11738b19f7_o.jpg", "6170771700_964c7bcfb5_b.jpg", "6169500283_2ea147a7ff_b.jpg", "3657323794_baed9e6fdb_b.jpg", "12454739714_587862492f_c.jpg", "3261115703_5c1da8dfcb_b.jpg", "23109508062_738086eee9_c.jpg", "4219703656_8389643e5b_b.jpg", "2668824043_513173d885_b.jpg", "12280012566_75f0c31eca_c.jpg", "8543975849_0d4d9d4869_c.jpg", "136283100_fb0f3ac693_o.jpg", "6436580141_4409f92b4d_b.jpg", "6979316042_fb5e599baa_c.jpg", "12198922813_64720f3feb_c.jpg", "2076799798_b0fd92b172_b.jpg", "4039336841_670d9152bb_b.jpg", "22511479958_102c5d2949_c.jpg", "8540875360_297d3d6fd8_c.jpg", "4816483149_4c62a5a07e_b.jpg", "15095294843_29aa2cc92b_c.jpg", "1015166507_da4d031d37_b.jpg", "2649005361_1188c5caf2_o.jpg", "5146852769_5136a70ce5_b.jpg", "5546720280_836fd37153_b.jpg", "19414569600_f84d6e5fa2_c.jpg", "5903072845_54a1dbf3a5_b.jpg", "5874531319_0f96fb358f_b.jpg", "3841402906_c7647ab603_o.jpg", "18802968963_7a7b24f056_c.jpg", "14922745441_9a99bd70ce_c.jpg", "21216722218_a45baa4cdf_c.jpg", "505798928_c03d2f4aa2_b.jpg", "8671601673_4705c06c72_c.jpg", "2350951486_c5547b5ac5_b.jpg", "3890560449_39b9f11431_b.jpg", "6150754537_a663b1f561_b.jpg", "4182811993_4e436b5595_b.jpg", "186444540_6d3691ded0_o.jpg", "9568277728_53997f75a5_c.jpg", "8744314992_af64ccb5d1_c.jpg", "8419143272_31fcec4905_c.jpg", "2936069202_9b45fcd978_b.jpg", "2811176120_3307806e32_b.jpg", "22858277217_915604c95b_c.jpg", "8866475585_7b903fbd90_c.jpg", "5474936674_d51c5e92b2_b.jpg", "14398343526_d9b38711eb_c.jpg", "6475843833_6cc44cb755_b.jpg", "3423786182_ae58f66387_b.jpg", "3486391527_a3d33fb591_b.jpg", "6164155958_b6e3bd79da_b.jpg", "4264894805_04753e9ee7_b.jpg", "8553222748_eb3684795c_c.jpg", "2545039943_bfc4677672_o.jpg", "2283171365_0eb807c077_b.jpg", "10486758315_e04771f77c_c.jpg", "1246316133_fa1ee44152_b.jpg", "22096446580_39effe28ba_c.jpg", "6122763779_dfbe63a23f_b.jpg", "10599149634_c85839d7db_c.jpg", "14117102802_5a3847d1c4_c.jpg", "6073945366_687688897f_b.jpg", "2872010087_9c83c73b48_b.jpg", "16382417532_9db2d9e979_c.jpg", "15967676189_f944cd0337_c.jpg", "5681422401_040e9bd00f_b.jpg", "9578195214_b99b2cd4ed_c.jpg", "433742163_c39ed0466d_b.jpg", "6437018235_aaf7569cb3_b.jpg", "4284391580_ccfd6e6d8c_b.jpg", "8716971858_0a428005cc_c.jpg", "2250257794_5baaca4c86_o.jpg", "6206714636_03e088a69e_b.jpg", "2529498636_37e672a1ee_b.jpg", "20631084213_b02b61f2e8_c.jpg", "8590217293_b67c32e874_c.jpg", "17552718331_0327cd3991_c.jpg", "2920719332_d4dfdb0c2a_o.jpg", "227428598_04f1120e84_o.jpg", "3642085361_f81fa8a01c_o.jpg", "5991396675_a46c4cf7fd_b.jpg", "441606427_2a849b835c_b.jpg", "20990870298_b59f7793fb_c.jpg", "7974254890_44a6b40c1d_c.jpg", "2945568464_c88b0ed4a0_b.jpg", "6310686169_250df1c71f_b.jpg", "7573366056_48c4227856_c.jpg", "5123207448_159a36c100_b.jpg", "2231484792_39ae10bdcb_b.jpg", "6193738389_89d3290f6d_b.jpg", "158393639_9b20639850_b.jpg", "7282929694_059ba3b700_c.jpg", "443130992_4346b25248_o.jpg", "4383363989_3a1870cdce_b.jpg", "2498179656_a80f8d399f_b.jpg", "15915282895_43fd6cb540_c.jpg", "14457164070_8a7869faed_c.jpg", "2726785274_4883e95185_b.jpg", "3487265655_ede37e3961_b.jpg", "94372383_0492150614_b.jpg", "6779995537_7ed7917c2b_b.jpg", "4201617232_580a272818_b.jpg", "16023952754_b293d3241f_c.jpg", "2766580132_be1145c1d2_o.jpg", "5647398425_6caa72659e_b.jpg", "918557850_2593d4bc96_o.jpg", "14886878957_a0f7943da7_c.jpg", "22262628418_8e76d76064_c.jpg", "4057486434_9c6498d9cd_o.jpg", "6333155115_66cf4954c1_b.jpg", "396255335_973db6630d_o.jpg", "9216061054_9b15b5e220_c.jpg", "4514617552_a6a61c7722_b.jpg", "232204804_178413f4c7_o.jpg", "6886971749_bee7038a9f_b.jpg", "5580638659_5989163c09_b.jpg", "4970246596_3e75cf6928_b.jpg", "22849247812_e0c8227b9f_c.jpg", "15596624559_bf8a5675e0_c.jpg", "5531821521_d35ecd157f_b.jpg", "23568241852_365fe1c51d_c.jpg", "501620457_707c789a6f_o.jpg", "5159114203_b0d460fe6f_b.jpg", "6789951093_3fe8a0248b_b.jpg", "3837927496_01ac5b3604_b.jpg", "14115330352_ed2ae74f31_c.jpg", "3087215888_cf73d4d938_b.jpg", "1007927655_c82216f632_o.jpg", "15744559202_3e18d12266_c.jpg", "3565677857_bbf2ca268f_b.jpg", "2802338077_5b3acb3926_b.jpg", "15167429951_dc7efb58e3_c.jpg", "3067123663_91cd9bbd3e_o.jpg", "4835383168_c8c6d4e82e_b.jpg", "15310274345_a1435a9ccc_c.jpg", "4613697256_70b970f15f_b.jpg", "3235545970_d187a39bb3_b.jpg", "15994071703_2224dc5ee8_c.jpg", "10034946104_a36ac3f97c_c.jpg", "4847424052_40e8ffeaf2_b.jpg", "19426709018_4897aa9e8c_c.jpg", "736381880_bf0f2287d2_b.jpg", "22427434721_8b9d299d7d_c.jpg", "17097202822_0001f4fb65_c.jpg", "17107351976_7ca3d58d0c_c.jpg", "6594611713_2432a86997_b.jpg", "10776560556_001570785b_c.jpg", "4039074965_a1d446c4ba_b.jpg", "4437964585_5b405183f4_b.jpg", "4996322507_39d1745631_b.jpg", "2595693901_a9940583f2_b.jpg", "8442474472_956eca7dae_c.jpg", "5004198469_4bd4143c81_b.jpg", "6411715157_206e0dd94b_b.jpg", "5472541306_5d565c1c3c_b.jpg", "7062702129_e6ee513d16_c.jpg", "11828169965_d3f414a926_c.jpg", "9266127805_41cf51eeab_c.jpg", "7978132022_b3c6797601_c.jpg", "9579806575_68b791cb45_c.jpg", "13696697463_33e63788c8_c.jpg", "3172826466_0abc4be981_b.jpg", "6051241498_e5d201961f_b.jpg", "21660181146_21caf60eff_c.jpg", "2725106340_e9324fa942_b.jpg", "14502857169_d6aa16e73f_c.jpg", "15616599050_c0a8f455b6_c.jpg", "369954151_acd0eb3b5a_b.jpg", "15216317530_5d82b96247_c.jpg", "21519139605_2c57f9e33b_c.jpg", "4461603817_ae8b41cefc_o.jpg", "12261092194_0337313ef7_c.jpg", "8571888279_6790b8b991_c.jpg", "8240157886_1e58ca60c5_c.jpg", "2553360340_acebf99383_b.jpg", "16025151662_e0a90b6e3a_c.jpg", "2720971512_ba70c41ecf_b.jpg", "12591232453_79280b5308_c.jpg", "2393357553_0d91a5c3b0_b.jpg", "2775489209_31487b9eee_b.jpg", "21879582656_15be3183c1_c.jpg", "14305192492_e35078cd9c_c.jpg", "22058867264_24123161d1_c.jpg", "3057811714_3459034f69_b.jpg", "3086172537_f7c6d8071f_b.jpg", "15318363674_9ee4a87348_c.jpg", "6004716540_7e431fdc74_b.jpg", "3299747870_6dc0f5fc11_b.jpg", "8303966392_b37ff68bda_c.jpg", "16101620000_2a1aae8a6f_c.jpg", "9359365312_c83fd7a661_c.jpg", "21923340111_67d9cb41a2_c.jpg", "5518190521_f700996cb2_b.jpg", "22844705840_04147c7e19_c.jpg", "15449964032_7a427e0e43_c.jpg", "3000969263_1a210f0095_b.jpg", "4658781235_817f3d8217_b.jpg", "19798364581_5dc60ba4da_c.jpg", "5314479722_e5ee0e9c0d_b.jpg", "12436811303_731e3c5aea_c.jpg", "22145377164_e5b0e75bf8_c.jpg", "4385892363_acaf64cc92_b.jpg", "2672494444_dd6a1297a7_b.jpg", "17367295070_8301e79974_c.jpg", "6140022240_a9e8263846_b.jpg", "2037016346_1ceb5d2240_b.jpg", "18306427344_564361fdfa_c.jpg", "8697461715_12fa0164b3_c.jpg", "20491000189_8f950ec4d0_c.jpg", "8766760310_b18f471243_c.jpg", "19683990688_dbcfd5ed77_c.jpg", "8636518865_0b73280c27_c.jpg", "5594910071_a23b3eff35_b.jpg", "4500229383_93ed07ee90_o.jpg", "5402958608_4d4862f6a3_b.jpg", "6378849167_3e0027b771_b.jpg", "356845181_fba26f432c_b.jpg", "493504461_55da56e03b_b.jpg", "4754573870_c22e474fb1_b.jpg", "4772065108_25885cbf75_b.jpg", "2551410890_6385081d69_o.jpg", "3972047638_dc66f62fd4_b.jpg", "5600283438_640ebde9c1_b.jpg", "2794323261_e0800e5617_b.jpg", "3579127770_024e34836b_b.jpg", "263077669_64c15e99bc_o.jpg", "335977630_0b01a71ca7_o.jpg", "22145962524_e9bbb16223_c.jpg", "22448892228_5f824c80b8_c.jpg", "21372421124_dc9a8437d1_c.jpg", "6626869413_9124647cbd_b.jpg", "5138316920_b7cd1260b3_b.jpg", "16515815430_6b585347ee_c.jpg", "131441072_2dcf11a29f_o.jpg", "14980631939_7e3f17af03_c.jpg", "5418593624_18e58e932b_b.jpg", "23406898429_1a0fa7c3e6_c.jpg", "329137116_69d33a38af_b.jpg", "4391499291_56473b074d_b.jpg", "4049548625_c4f0cfd35d_o.jpg", "5244375725_31bd21d487_b.jpg", "9732277791_636e21f67d_c.jpg", "7022084509_e4d26e887a_c.jpg", "15253232277_b40f75c654_c.jpg", "16196327935_5118d61352_c.jpg", "4561151545_19bdfd57fe_o.jpg", "7592772032_e8514edd92_c.jpg", "1373321769_cbab5e8613_o.jpg", "5446171666_cecc301312_b.jpg", "14518923604_15189e5355_c.jpg", "4158482001_c479431427_b.jpg", "11020299_16c8146f6e_o.jpg", "3632042172_e54d431b86_o.jpg", "5628058623_50aceef706_b.jpg", "3548084231_c878e0c3ae_b.jpg", "2851807028_ff9efc6d3e_b.jpg", "3990212739_376105b273_o.jpg", "9063432552_2e1547938b_c.jpg", "6480239819_6ac068af0d_b.jpg", "3224201087_6c9445b1a4_b.jpg", "2728358229_641fc014ce_b.jpg", "21221410746_d7b9316814_c.jpg", "4009150377_0672f7eb84_o.jpg", "4593367568_26f0a46d3d_o.jpg", "3795215261_f7f640d239_b.jpg", "6806512426_0f205a9994_c.jpg", "23051735325_70957abb10_c.jpg", "10944126843_a18331d12d_c.jpg", "18296653572_978088a103_c.jpg", "19474567506_74825d3c60_c.jpg", "1286424018_0f9a9d45f0_o.jpg", "17178259296_d788350455_c.jpg", "14523168818_48edc791c5_c.jpg", "8211017070_01a85a5776_c.jpg", "3469853873_037b17861f_b.jpg", "19499853981_3d4180f13e_c.jpg", "23068947649_5f9f31f963_c.jpg", "13586145634_1d21057f2b_c.jpg", "8542192758_0f6a976f91_c.jpg", "8666924888_2ee610fa8b_c.jpg", "19019845676_b662cfd6a9_c.jpg", "3245866915_a21a9eceec_o.jpg", "2706299732_954b3434b3_b.jpg", "16564946939_c663803cb2_c.jpg", "4419560333_148fd8baa9_b.jpg", "4028196999_b75cb129a1_b.jpg", "15710841374_5f3b0f915c_c.jpg", "15571622215_58908db3ea_c.jpg", "2221870918_6e51fcfea7_b.jpg", "336582042_3d2a75120c_b.jpg", "7019834365_7ef9ef5d45_c.jpg", "17240529508_7b10fb0be4_c.jpg", "2908896449_2cf41496b1_b.jpg", "8263228534_33115a7611_c.jpg", "10899044693_3efff6e72b_c.jpg", "5853678140_68fc05f306_b.jpg", "7743949198_50665b8a1f_c.jpg", "4265715322_1da3c84c44_o.jpg", "4682189889_346a9a5e5d_b.jpg", "3052034269_59df728771_b.jpg", "1875472091_14e359c5a8_b.jpg", "8670499783_6a7bdc8e1c_c.jpg", "4235157696_98efbc6d79_b.jpg", "20521963755_a358574408_c.jpg", "197317408_ca84d8f678_o.jpg", "6053015128_04e7a5571c_b.jpg", "4788329446_cd54c7c0cc_b.jpg", "11579722535_c897ee0cd9_c.jpg", "4220199715_42e2ddd7a0_b.jpg", "1073749328_47e6e18563_o.jpg", "7686656798_561ec4df09_c.jpg", "8211108484_a32557f84b_c.jpg", "7190401968_c7cce151fe_c.jpg", "16637357449_9ab457ebf1_c.jpg", "4783145950_54b15ec115_b.jpg", "14902090397_0fa2fc6cfe_c.jpg", "5701898862_2e1092c7b1_b.jpg", "4050814215_7b9fc2119c_b.jpg", "4264963835_b6f423c003_o.jpg", "7326770668_89b629d826_c.jpg", "5172555508_5b5621979a_b.jpg", "6062760677_a009a62b95_b.jpg", "5158260461_1fabe0f941_b.jpg", "16355930576_314743d574_c.jpg", "2450222724_6653ccf01f_b.jpg", "6158658505_df6799dd68_b.jpg", "2805238576_fe5c065374_b.jpg", "3996332252_d76a74d8ec_b.jpg", "10718279544_77e2ac8abd_c.jpg", "2397488162_270085716f_b.jpg", "7160394722_9bfd834b70_c.jpg", "3530912181_d2e9f373a6_b.jpg", "4793226836_f543c0dbce_b.jpg", "17748207196_cbe702bfbb_c.jpg", "472191335_29fe4641f7_o.jpg", "5008643347_ca9b71683c_b.jpg", "151112475_815eed9608_o.jpg", "2141847448_96e2ac358f_b.jpg", "875781439_89b935a723_o.jpg", "13851336345_b787ac9f24_c.jpg", "8699298095_7492df0209_c.jpg", "6093330238_91f906d71e_b.jpg", "7929343632_7f3df413fe_c.jpg", "3506060926_f7aed81086_o.jpg", "5451330835_39ce788055_b.jpg", "4390281551_e6dd0e73db_b.jpg", "18334954013_cd20a555d1_c.jpg", "4924944893_356ffa4ef4_b.jpg", "14045963274_96aa7fb614_c.jpg", "8905782098_f29bc900fe_c.jpg", "14432878908_6008bc46ec_c.jpg", "3499089227_aa7472d736_b.jpg", "3259119347_637a6381ca_b.jpg", "529119133_09525cd2ea_b.jpg", "143644215_46424fca56_b.jpg", "11438228353_b0942c1965_c.jpg", "7567773422_7a542cfe17_c.jpg", "8208839397_9e31448da9_c.jpg", "8072522194_367c7a5f36_c.jpg", "15447316874_8e96b4dc9b_c.jpg", "7976676470_2ceaf8b087_c.jpg", "4319789414_e569a644dd_b.jpg", "3901098229_c1038ce1f9_o.jpg", "16486630499_9b252ee18f_c.jpg", "14531258584_9cb2c92220_c.jpg", "15863687784_be044118ab_c.jpg", "3062694667_489c73b06e_b.jpg", "4568663797_78cececfff_o.jpg", "2698130574_18a12c1936_b.jpg", "20117911144_abed3d90db_c.jpg", "4967726312_c3c0477537_b.jpg", "2754966958_b0b6645ffd_o.jpg", "8542655941_4eaa9fb6cf_c.jpg", "18287066206_9ae3904cd9_c.jpg", "3303060510_1f7466dff4_b.jpg", "98680400_0fd467f41e_o.jpg", "6303397657_d01cb1f27d_b.jpg", "7034221605_a87696850a_c.jpg", "15906209252_ba493449ff_c.jpg", "5433197802_091ebbcde8_b.jpg", "13768642103_d29e5c609e_c.jpg", "20845385026_7641a83464_c.jpg", "12075608606_20a081d551_c.jpg", "13922256400_63c37088d9_c.jpg", "2566611344_aa410a9793_o.jpg", "23225140375_70aeb546cf_c.jpg", "7836602166_f34841cd1d_c.jpg", "1183803694_4adcb15447_b.jpg", "5641146275_89ec726de0_b.jpg", "512378915_292944e588_b.jpg", "3005969494_ea80356f06_b.jpg", "849433883_0c7a29e9ba_o.jpg", "3949306444_597d377b6c_b.jpg", "729239880_eb232a9532_o.jpg", "2486096065_78712dee1c_b.jpg", "15526653210_28f4f07be8_c.jpg", "3953796531_189337015a_b.jpg", "14166439083_3ba8538815_c.jpg", "16873844981_ecf05eb296_c.jpg", "22596812838_72e37920cc_c.jpg", "168426970_e1003414b7_b.jpg", "517443384_c3be3d72f4_b.jpg", "428453286_c5a262144b_b.jpg", "444039695_f2d09cbdeb_b.jpg", "15708751093_37835eea54_c.jpg", "6255410035_a73e989f8f_b.jpg", "6871436940_735eee36be_c.jpg", "472783773_43c208c6db_o.jpg", "6222717324_28654b3266_b.jpg", "7939590778_aaea39faec_c.jpg", "14978138550_62efe2b8a3_c.jpg", "331728936_20b82983ca_b.jpg", "15848385284_3d345ec8c6_c.jpg", "2159019383_8e460867a1_o.jpg", "2898780582_161fc0d6d1_o.jpg", "14158410392_7fbc1da8f6_c.jpg", "11292895816_cbf1b5d1fa_c.jpg", "4930816396_112414579a_b.jpg", "23221615936_d9592cb24c_c.jpg", "8305719865_10518aed77_c.jpg", "6139759614_9e6a85cb02_b.jpg", "2764917287_acb188f501_o.jpg", "7687398084_b538310ac8_c.jpg", "6257187481_2f1bf781e0_b.jpg", "2770512723_9e90b5935a_b.jpg", "5287584095_351bf9591f_b.jpg", "20123509415_0eb31dd700_c.jpg", "2582906496_326b64e94a_o.jpg", "16729217971_e988fe704d_c.jpg", "840960137_550714d41c_o.jpg", "3879937894_d128cc90e6_o.jpg", "3537485224_7b599a88be_b.jpg", "5692588554_b5876164a0_b.jpg", "22630416313_baa04fe8f0_c.jpg", "8010003295_5c0e299ac7_c.jpg", "8454334115_7f1e8fc5e5_c.jpg", "3064529250_e7e14c8e33_b.jpg", "4458876892_8ce21a1418_o.jpg", "3222878365_072592c006_o.jpg", "3432219275_83ca1b3566_b.jpg", "18608090692_d65a382f1e_c.jpg", "15919764513_21f7012013_c.jpg", "20566069885_f8c23ef2e4_c.jpg", "3232481441_af0eafc0ff_b.jpg", "17685991492_066d6c9803_c.jpg", "4256845104_8e6dce3f25_b.jpg", "17902401000_d8f361c6b6_c.jpg", "15961409258_5f95ec3a5a_c.jpg", "4164747569_aecddf6578_b.jpg", "6119920436_d7017fab9a_b.jpg", "21017514760_7c4e5930c8_c.jpg", "15355918731_a51c56c31f_c.jpg", "9383886161_77d821566d_c.jpg", "15274713521_14c6751aca_c.jpg", "9736449059_6004a23e79_c.jpg", "4560668795_6769b27330_b.jpg", "5646023604_16f17f6152_b.jpg", "4609429866_cf4384c2a9_b.jpg", "4598193825_6cf06e8b64_b.jpg", "7745592928_a2b00a055e_c.jpg", "3088916340_cfe55a8e80_b.jpg", "14400676147_564b7cab22_c.jpg", "2855180925_125241a121_b.jpg", "16046198423_7d2191cb71_c.jpg", "15794054873_1a6246fb2d_c.jpg", "11240564926_0998d96c3d_c.jpg", "9395016052_0fcc25cb76_c.jpg", "22894722109_1a15bff1d2_c.jpg", "3210010263_905229c87f_b.jpg", "17052947508_a3409482af_c.jpg", "2259554888_f710ac08c3_b.jpg", "21142113183_0a8c160a7f_c.jpg", "2047992435_1d3173b70a_b.jpg", "3907582121_3b45b2768a_o.jpg", "14656454858_e36a407da7_c.jpg", "13243755383_d18cf48492_c.jpg", "6351738850_e7f46a55cf_b.jpg", "3538295066_a48884edb8_b.jpg", "16235802101_b55d5ccae9_c.jpg", "4114847222_169fb18012_o.jpg", "17893025753_9e26dbef49_c.jpg", "15064813784_121a046feb_c.jpg", "6167623189_8b7fa60113_b.jpg", "5241072704_af98926b27_b.jpg", "13973566409_0840333a3c_c.jpg", "6239791578_384bb29fcb_b.jpg", "7827501016_9b8e1e62e2_c.jpg", "5219568961_1313fb60f9_b.jpg", "17194704076_2b359d05a7_c.jpg", "8397684029_eb421965c2_c.jpg", "6060054886_86a522a3cb_b.jpg", "12779118034_8ba5ca3c7d_c.jpg", "2567370748_55cd8634ba_o.jpg", "7307325804_a636bd0d05_c.jpg", "3500680792_589680c69a_b.jpg", "22602604234_0575fe23db_c.jpg", "7841134984_5ce7c47590_c.jpg", "2474665651_ab41be60fd_o.jpg", "4354120530_b86da9ff3f_b.jpg", "7293215700_61ddbdf952_c.jpg", "8689565132_84164b7c29_c.jpg", "6863423728_b3cf4d619a_c.jpg", "12591597574_2d32bf2ed6_c.jpg", "20935855725_e93b4d498c_c.jpg", "5347079242_25614b41b6_b.jpg", "3026775477_2e0fa1a038_b.jpg", "14618305846_3ab8661c10_c.jpg", "22614679143_9ae418023d_c.jpg", "5648238464_6c98e356a4_b.jpg", "8530261244_45c5019ce9_c.jpg", "5954177501_6137dde316_b.jpg", "21485977208_7826afbf33_c.jpg", "14220236353_bfea633b03_c.jpg", "4134769500_b487e5d845_b.jpg", "458780979_e44eacf13b_b.jpg", "22963751835_fc81b21191_c.jpg", "8749307317_47cf6e9190_c.jpg", "15819083662_7bcac2080e_c.jpg", "4931996373_26ab5ae3e4_b.jpg", "416361186_f71c53ff12_b.jpg", "5894041481_0ab428612e_b.jpg", "16077522676_08d062215e_c.jpg", "6010636200_98029b29ab_b.jpg", "173320067_5295e29578_b.jpg", "542240616_a1a607b9ea_o.jpg", "14973219267_e88612875a_c.jpg", "3438663153_d7f161b046_b.jpg", "8671734303_7a5d0fa932_c.jpg", "12106712145_9575e3340d_c.jpg", "21713168185_c11a18a067_c.jpg", "16722159573_4f764c43f3_c.jpg", "16043105558_2eb5903619_c.jpg", "4187814015_eaf5b968d4_b.jpg", "5717490370_616525d7e2_b.jpg", "14290721685_335804a964_c.jpg", "3661221854_43d66a7738_b.jpg", "61629073_ecff7bf1d2_b.jpg", "4834648363_89e3205c97_b.jpg", "4824421971_ca15f55b6c_b.jpg", "4507187611_f0d063ed06_b.jpg", "5955594271_75ffa33ae1_b.jpg", "4534381033_0f31d90ec6_b.jpg", "5293714429_d9fa2c7dfd_b.jpg", "5683113685_68651d47aa_b.jpg", "3982685077_a7cfa929d9_b.jpg", "16920830325_751daab4d2_c.jpg", "4500451551_3e18a6ecdd_o.jpg", "293991010_b2a8d7cf8e_o.jpg", "2455356735_98e3716d18_b.jpg", "308649033_992398ed9f_b.jpg", "2998129808_3f6c95ae8f_b.jpg", "338478912_bf806bb3b9_b.jpg", "19409483762_d83b9bd9e6_c.jpg", "7192013430_c1cee0ddbf_c.jpg", "10602119956_4e25f63ef0_c.jpg", "6012795004_34c90a2963_b.jpg", "3711907342_0a3a3d93d7_b.jpg", "8044143106_cf9ae28ca9_c.jpg", "22902043471_dd84f0aa6a_c.jpg", "21679174348_6426878fda_c.jpg", "20220037833_745a63521c_c.jpg", "4582550712_2bb27ebd56_b.jpg", "4167582989_448dded8ed_b.jpg", "22449410953_2c4e7e6ec7_c.jpg", "5855561646_eb79959244_b.jpg", "6191834481_d2536a6e82_b.jpg", "8431423189_1a3ccf8fd5_c.jpg", "76537045_5997edc2af_b.jpg", "5337183180_537eca0f32_b.jpg", "17205204850_9bb07acd4a_c.jpg", "15533820687_644f9dfa16_c.jpg", "372309722_2da8bd2b56_b.jpg", "248619430_7aad1ab4aa_o.jpg", "6806661893_d44a891c33_b.jpg", "300988253_0ffc103b42_b.jpg", "4708188866_635918cae9_b.jpg", "14319497155_fab2acf0c6_c.jpg", "3786949855_89e7323e37_b.jpg", "5816840555_f4f6ce28b3_b.jpg", "14861857412_d624f3c051_c.jpg", "17936057513_f5b412fa41_c.jpg", "4570425063_ca5c0be6b8_b.jpg", "8546281996_9f2493838d_c.jpg", "2375009502_3839a9c02d_o.jpg", "16595445945_db5f230376_c.jpg", "11003679623_0d72ac7f83_c.jpg", "15098324773_34d6d1c2f4_c.jpg", "22221870136_7e4c1fea84_c.jpg", "10944401125_64b62a8722_c.jpg", "2859335587_70eca1832d_b.jpg", "2695004242_bee67b8d7c_b.jpg", "5437590144_985200fea8_b.jpg", "15634469860_e909dd119a_c.jpg", "5233021752_0bbaf9efa9_b.jpg", "18622440113_64b087efeb_c.jpg", "17794523236_f25133dff0_c.jpg", "2231129523_62b9dd014b_b.jpg", "6813158842_96b165e072_c.jpg", "11110877253_3ab4e59965_c.jpg", "4632093019_0da63295b9_b.jpg", "6218416327_c26d62ca1a_b.jpg", "5275792379_0512d6601f_b.jpg", "10594271873_611e07a39d_c.jpg", "3443314023_4afd41d59e_b.jpg", "21030021806_2ff303fa50_c.jpg", "4789597255_8241112a67_b.jpg", "4767711595_e4f6139e0d_b.jpg", "5613106640_e564d747e5_b.jpg", "5927142626_868f00d1c0_b.jpg", "21720221453_c56946b979_c.jpg", "9117010836_93abf2bebf_c.jpg", "11841122786_49d767ed87_c.jpg", "5508316836_a8a44d20b7_b.jpg", "3885687307_2c5d147721_b.jpg", "12952643943_219aaa692f_c.jpg", "9020181876_d702011bc7_c.jpg", "278999481_169ae79ab8_b.jpg", "2113000836_1b59022c50_b.jpg", "8289557046_804639f287_c.jpg", "386074707_3b83678316_b.jpg", "2637106116_f7be48f6fe_b.jpg", "8386945977_e2e157f347_c.jpg", "4470391891_b17fb29a9d_b.jpg", "8210103616_4f0c04be92_c.jpg", "2641514781_4c3973e0ca_b.jpg", "9609906443_7a300a6d92_c.jpg", "16205795878_0ce64dbd55_c.jpg", "20489938215_ef085b73df_c.jpg", "7206649152_c87d89c67f_c.jpg", "22973257175_b82b734376_c.jpg", "4303693266_1440066080_o.jpg", "6389472685_ee702a440c_b.jpg", "15251367120_9bdca6b5c3_c.jpg", "8662438577_cb89b0ecb6_c.jpg", "8637952912_3ca0cfe9b9_c.jpg", "9064919235_9ece90beed_c.jpg", "5073517839_15c97cc5d4_b.jpg", "25760133_985004b241_b.jpg", "10769837753_0db64cd9a0_c.jpg", "2440034269_8c30e74ff1_b.jpg", "258271835_ddfdacf203_b.jpg", "9889267043_8034fe7a14_c.jpg", "16908221218_a1db9a2889_c.jpg", "22664735478_75f2f624d3_c.jpg", "2626529468_e0f7f67197_b.jpg", "9918469313_ebfae40ac0_c.jpg", "4182275506_b289327455_b.jpg", "21784510820_fb32123219_c.jpg", "535431307_3dd811ff4a_b.jpg", "3438685303_bd8b01678e_b.jpg", "4990942661_d307a7f565_b.jpg", "7811340070_0cf453bb02_c.jpg", "3245188637_d9bd87728f_b.jpg", "6972983070_59647b1162_c.jpg", "5198068642_283d49be13_b.jpg", "22624406863_9b48c82a9f_c.jpg", "23189351459_455c9d7e72_c.jpg", "422994213_5558209da9_b.jpg", "7230791168_e56dfdc96e_c.jpg", "4692987771_1a54723bd6_b.jpg", "14984927867_9fcd13198e_c.jpg", "241567473_b54493a0bf_o.jpg", "366651998_6f96d06844_o.jpg", "4144346049_8662f555e2_o.jpg", "536627868_c401a97b60_o.jpg", "16782892834_c66c5f56e7_c.jpg", "15737777606_ce1e72b6fc_c.jpg", "4094615903_8e1cfffee3_b.jpg", "2455746657_a6c0a50fd0_b.jpg", "2168215999_310704cb1e_o.jpg", "9463391568_067e6c902a_c.jpg", "6158390864_58b3225ca7_b.jpg", "8450809710_0d3ca4658b_c.jpg", "1440357219_46766df023_b.jpg", "16725286276_370120f3d7_c.jpg", "14420598388_dd82f0778b_c.jpg", "64965251_6f3ecb5bf9_b.jpg", "10243649146_5c4b100ab1_c.jpg", "7491071536_c89220dcb1_c.jpg", "21341768233_1d9ac995df_c.jpg", "17158252174_20cdec9c60_c.jpg", "22891220689_cbb9fbcbea_c.jpg", "2219073456_7cba41ce59_o.jpg", "15630714639_7d8b9102ba_c.jpg", "16232719988_4b5ffb7e10_c.jpg", "3966686401_ac83243400_b.jpg", "10430880325_6fee019f41_c.jpg", "4536387873_378f2e2bf1_b.jpg", "8386187086_eb2b6da050_c.jpg", "4742030237_f4b286b00b_b.jpg", "2739164059_9a5c1f877b_o.jpg", "2689623868_3c12964fff_b.jpg", "9211747895_8a3c2e1fb9_c.jpg", "9132013188_caffe33355_c.jpg", "3862441614_0e53ee6048_b.jpg", "15436653502_2abc4b0375_c.jpg", "2081333542_3a215c60d9_b.jpg", "20046209539_e9dbe46a2a_c.jpg", "5656900857_511c81f5da_b.jpg", "104769628_9fffcfd48d_o.jpg", "9900189186_7b3cb19674_c.jpg", "5819425814_4349ba535e_b.jpg", "5067276708_19ab4efbb4_b.jpg", "13574185995_3084894e56_c.jpg", "8715252625_5137778c0e_c.jpg", "3313105641_e9553b5670_b.jpg", "15926950302_6befd6d3c9_c.jpg", "194309096_9a5aa7a4e3_b.jpg", "14420219_4095ea950c_o.jpg", "18315493242_4ce64eec14_c.jpg", "2873630594_ae10d84c28_b.jpg", "496779320_bdff593dd7_o.jpg", "5878819521_acac758dbd_b.jpg", "16268619079_b4eacd7b6c_c.jpg", "18865738132_82c043c025_c.jpg", "9111614820_e235f3e5b6_c.jpg", "3099806844_8b40c22025_o.jpg", "12390198474_25c76a8b41_c.jpg", "18876709178_a08f754267_c.jpg", "3702992789_90997cacbb_b.jpg", "3050784426_97ec96ff66_b.jpg", "22110258556_5affb194ce_c.jpg", "22449190473_690d363009_c.jpg", "406219801_65e93cee17_b.jpg", "5756906041_2a9484400a_b.jpg", "5066633387_da8a53b287_b.jpg", "4940438691_ed77abc791_b.jpg", "19542987629_571cb00830_c.jpg", "5667426580_bfba6156b8_b.jpg", "4278065243_ddb2b04451_b.jpg", "2380629494_3cc96ab658_b.jpg", "8587946681_8ab74251be_c.jpg", "21845023794_2a9dbe745e_c.jpg", "23222865636_e835e4ae3d_c.jpg", "7920231842_f35e3ab973_c.jpg", "15675136195_2017203103_c.jpg", "8048787037_249cd05c59_c.jpg", "15880218281_5ce5729a89_c.jpg", "14985082448_618a16d1fa_c.jpg", "17210871975_cd29743390_c.jpg", "4707184449_2c1dedc3fc_b.jpg", "10510635434_312ddfecc1_c.jpg", "13103653185_64efe96ce3_c.jpg", "17001945481_3333f764a2_c.jpg", "7714321772_7bdd4ef1e5_c.jpg", "3097068962_601f782a65_b.jpg", "5152936486_cdf9c19d63_b.jpg", "144847289_7b2e1c876d_b.jpg", "15004028030_e365e07737_c.jpg", "9229422581_1e1f63f78d_c.jpg", "8117710419_472e1727f6_c.jpg", "21007557339_57aa82b081_c.jpg", "4174108565_60b10e62bf_b.jpg", "5539461827_baf0ee31fb_b.jpg", "1246312638_d7d04ef096_o.jpg", "6376583173_fdb52e3bd5_b.jpg", "16227971328_02c500530a_c.jpg", "5830937201_d028056d03_b.jpg", "7075587655_060c69610a_c.jpg", "2634991554_fa7196f057_b.jpg", "493504411_966dc9cec4_b.jpg", "4417198517_2b67a1b8ce_b.jpg", "7138246913_cf7b0f84bb_c.jpg", "2084003810_3ae7b356da_b.jpg", "6925667075_95e15b39c7_b.jpg", "2294077710_a6531d5f90_b.jpg", "21499445706_668f8f22d8_c.jpg", "5945921454_8d0fcbc4fc_b.jpg", "7319506376_5d04e832fa_c.jpg", "8349513923_ef5f1453d0_c.jpg", "21048444369_c30aaf4d34_c.jpg", "17623540659_f5858ba3eb_c.jpg", "7909766306_ff028c26dd_c.jpg", "22487539367_d4db74d070_c.jpg", "2794729330_69101390e5_b.jpg", "6822873338_ec8e41ee29_c.jpg", "9656254626_7221ed8610_c.jpg", "15753480320_bda82670e5_c.jpg", "6493922857_8cf324a1f2_b.jpg", "5077353745_c3d77626dd_b.jpg", "22940946360_f5d702c0d0_c.jpg", "4890770222_6b6e7944da_b.jpg", "8204388070_253ea269c9_c.jpg", "15229026647_54ea79c64a_c.jpg", "6102762874_42f370ed12_b.jpg", "4086880172_375b8671da_b.jpg", "23607991491_04c20a02bd_c.jpg", "13053373624_6368bb3d13_c.jpg", "16227084228_1b7c5a3681_c.jpg", "40428218_69064961be_b.jpg", "3960681147_627a480f7f_b.jpg", "18768918376_a37060694c_c.jpg", "4405112072_1b2580e218_b.jpg", "7580088000_afbe175726_c.jpg", "22616497234_05a087aa6a_c.jpg", "4768398769_9f1155c68e_b.jpg", "5374495086_e9fff0ecc1_b.jpg", "21552192372_523cc07fdf_c.jpg", "20958441694_c982ee7e48_c.jpg", "9094583464_b3b7c35e81_c.jpg", "4715604333_eaa8295760_b.jpg", "4098064412_91a1524f0a_b.jpg", "4381287304_b2a7a81529_b.jpg", "2548660076_89a8ebb4d8_b.jpg", "15658851113_89d19c3265_c.jpg", "3101486742_01092cd31c_o.jpg", "4504983566_fbd276ce64_b.jpg", "6196298243_4046d3a187_b.jpg", "22418900424_e17148a70c_c.jpg", "4337379201_a83c3b1fbf_o.jpg", "2694845675_20471200f4_b.jpg", "31272765_42f5c218ae_o.jpg", "15643642828_91bc19dc9c_c.jpg", "4024641972_bc25f5bdd3_o.jpg", "8679621385_3b13d3f8a9_c.jpg", "6218293882_a8cd133091_b.jpg", "2872799631_376e780d92_b.jpg", "2877019418_d3ce429d02_b.jpg", "257881290_a1a1dfa7ee_b.jpg", "13132472_3d65412050_b.jpg", "13312195854_8abce0c428_c.jpg", "2122122702_d23a9384b1_o.jpg", "18447556469_eb51a6a53f_c.jpg", "9516657232_338435a8a7_c.jpg", "441132464_d8e4f7bfc8_b.jpg", "3065222973_a6ee385483_b.jpg", "14080756514_cae6420358_c.jpg", "12988495184_72eeddcea0_c.jpg", "15331823319_6fd2928cf9_c.jpg", "3633341803_14ff758aca_o.jpg", "15088379414_d2036ef019_c.jpg", "8414334819_9d9ddff48a_c.jpg", "16268482091_b2a9b23a62_c.jpg", "21281616980_43c35b4b37_c.jpg", "8566320903_75ace877aa_c.jpg", "8631160559_b3b39e720f_c.jpg", "5741195859_042bf6d366_b.jpg", "512879929_5d1b9cf330_b.jpg", "16087231719_73798f4858_c.jpg", "3851569001_79d7aaafab_o.jpg", "18679586886_24651381ff_c.jpg", "5104021436_aa1368c46d_b.jpg", "154067078_3b0fd61945_o.jpg", "6262234676_4e761730a8_b.jpg", "3889047692_d41fb582bc_o.jpg", "6102943164_8cd1da3eae_b.jpg", "2911844831_443a3627a9_b.jpg", "19490819674_8c4485c3d1_c.jpg", "6319551297_deb1a6684b_b.jpg", "2428184093_c11504766d_o.jpg", "18801942494_84d834ae72_c.jpg", "21276725809_f6458c6dbc_c.jpg", "19329237141_ca83e94539_c.jpg", "3262595806_5ecaa4d8ab_b.jpg", "4034973293_2996ca9e06_b.jpg", "22040383895_c8e228fdc0_c.jpg", "4268247224_1d7e375060_b.jpg", "5638668425_7c178128fa_b.jpg", "7800731300_16101eff71_c.jpg", "22621733003_421d165f68_c.jpg", "13903858187_0d5b4ac524_c.jpg", "14305489401_6cc869c4f6_c.jpg"], "test": ["6931423716_4fb31feac6_c.jpg", "6356033521_b93c3cf5d6_b.jpg", "18436750958_b6f650e6d4_c.jpg", "4624005299_6d98f4d909_b.jpg", "5465125897_db85858f42_b.jpg", "9349007347_3078171d06_c.jpg", "22088464206_ef667e9762_c.jpg", "4145405326_82b18b2693_b.jpg", "22531710197_6eb71ef7b3_c.jpg", "4341115136_cd77268a52_b.jpg", "475589928_d7320addb3_b.jpg", "3374926549_f1c5aa7567_b.jpg", "294699501_0e4bf6f556_b.jpg", "2233151_16b7db3ae2_b.jpg", "8096365964_33484939e8_c.jpg", "15920025378_768dc5a84f_c.jpg", "10593669606_2541644cd8_c.jpg", "10758495053_2d04960440_c.jpg", "4810815511_c6a6c14033_b.jpg", "3528768150_10acc82bb4_o.jpg", "4154418862_40e3cf2b6f_b.jpg", "2551085398_fbb77b329a_b.jpg", "7333806116_02c1e3eab5_c.jpg", "7154096652_f483efaefd_c.jpg", "17403692_3972cc299a_o.jpg", "4363660067_5ed3206b78_b.jpg", "7862449580_81eb1bdd1a_c.jpg", "514599235_3490a79faa_o.jpg", "15364035210_4b35691e73_c.jpg", "4932170466_2e3979be27_b.jpg", "3589708304_e31587797b_o.jpg", "3846608461_14638b661b_b.jpg", "8747633714_bea8dd1f01_c.jpg", "3602624992_491158ee96_b.jpg", "15008159818_883cae6852_c.jpg", "22818256357_db7e3f780e_c.jpg", "8161839238_a852ab5455_c.jpg", "519315108_6b08000ff4_b.jpg", "8977137296_55d7c7196d_c.jpg", "11966278053_9451dc7865_c.jpg", "3270208497_f320ec2c9e_b.jpg", "4033692759_4616dec071_b.jpg", "3399152245_3f145004f2_b.jpg", "9137219415_1917d5b471_c.jpg", "8233948739_20e398afd5_c.jpg", "3318848219_64dd301926_b.jpg", "15058648637_2bf19790c6_c.jpg", "23664996672_aa0ce6541f_c.jpg", "22369223203_6aee81471b_c.jpg", "4973476179_97e87a471f_b.jpg", "4010140017_5b6105336b_b.jpg", "16444655_25f23ba920_b.jpg", "20769982335_b9091efb76_c.jpg", "2164327810_43db533ef0_o.jpg", "3870703092_b4ee81577c_b.jpg", "14139164702_12ffb8c993_c.jpg", "13091565704_a6bb18d1cb_c.jpg", "2354630583_7fc7207d18_o.jpg", "2870531206_48c962f2df_b.jpg", "74089747_a49efd02c4_b.jpg", "5741481015_2aba6470df_b.jpg", "2271644237_8777931787_b.jpg", "2667951742_9d339d07e1_o.jpg", "10765458356_7bbbb4e1d2_c.jpg", "20479970986_c7a8aa9da4_c.jpg", "16004123231_c2778c70a9_c.jpg", "8728498204_0c7fa0f7e3_c.jpg", "17274878389_10d7bb8b48_c.jpg", "4579512461_283669a2de_b.jpg", "8002377254_20aa629eb0_c.jpg", "963512134_2287414f0a_b.jpg", "11341534884_733b7df026_c.jpg", "12319362274_44ea464962_c.jpg", "19784931422_5eed1ca091_c.jpg", "1347575495_a15a137508_o.jpg", "1516190930_c7fc4cb39c_b.jpg", "15173889880_15279495d4_c.jpg", "16912243896_7e55b74afb_c.jpg", "14972066772_e4c22056b4_c.jpg", "22883293679_2197e01f0e_c.jpg", "6868170793_a0ecbdbc81_b.jpg", "2722630483_7f11abef57_b.jpg", "386528941_d65beee352_o.jpg", "2088415293_876fbdb149_b.jpg", "14608268091_8b8a47d35e_c.jpg", "16883470242_145f7056ff_c.jpg", "15604716778_e20489efb5_c.jpg", "4676698337_78eda90dff_b.jpg", "2157516909_de12572d95_b.jpg", "4845209890_2e634f876a_b.jpg", "520996315_62069e5bbe_b.jpg", "22889017612_773f95ff05_c.jpg", "8162635893_2f011e8095_c.jpg", "14373990807_4ba6cfa7c2_c.jpg", "4121654669_ebfe71dfe3_b.jpg", "3599667280_950504e9fe_b.jpg", "291420183_a7662cc76a_o.jpg", "10003475154_f6a656262c_c.jpg", "13949737227_1ae78a857e_c.jpg", "7718054472_64e6c01e1b_c.jpg", "9365940223_79083d5673_c.jpg", "22875211982_eb58e697ff_c.jpg", "4246642833_9f9db1f77e_b.jpg", "7170541977_ee28f86db1_c.jpg", "2494908972_d75c7db209_b.jpg", "7112857929_a5080e0c55_c.jpg", "15747766454_c44ac18783_c.jpg", "8262039475_16e7b0e393_c.jpg", "3215982805_aa553c9310_b.jpg", "8474352659_84b8936f36_c.jpg", "15972446375_bd81c2a8fa_c.jpg", "8049761989_3fd9bbef78_c.jpg", "193354531_f993136e11_o.jpg", "4071195254_b7b01b7a95_b.jpg", "4661172818_e69682a8d9_b.jpg", "3081857872_df78aaf77a_b.jpg", "3882311594_17e33ca3f9_b.jpg", "3605785348_b05dafbfe8_b.jpg", "21515068813_1cb03ac6a6_c.jpg", "2814541145_8b7051f871_o.jpg", "3697316544_059ff65eeb_o.jpg", "7609786270_51981aa5f5_c.jpg", "5390823683_faa7667285_b.jpg", "17566672810_d645ef4018_c.jpg", "21503034159_89f360f492_c.jpg", "3533704049_1dc579093e_b.jpg", "9466513966_d0a12053f5_c.jpg", "6563812829_81e5c65416_b.jpg", "6994235745_04af1b5e8e_c.jpg", "23325240806_98d1b79be1_c.jpg", "5248594050_b5f9b5f210_b.jpg", "8455922255_eddf254972_c.jpg", "3956691038_7706fc91ed_b.jpg", "7121235191_bceff622a8_c.jpg", "8381496243_d3787fcb99_c.jpg", "3688834469_18f4fe36f6_b.jpg", "15797803775_95cc5a5b3b_c.jpg", "21889659319_3fd3311255_c.jpg", "15387321983_4db4ebc38c_c.jpg", "4156283447_2180167b8b_b.jpg", "15161716774_41b50e0d73_c.jpg", "15144658677_f7d3329dca_c.jpg", "13071839595_dc5977e585_c.jpg", "3830736783_225517cb16_o.jpg", "15946832492_33e17a859b_c.jpg", "4593950257_e4be73904a_b.jpg", "2903041478_9ca96942e9_b.jpg", "8568454618_a891f5dc7c_c.jpg", "8605044054_51faca459d_c.jpg", "3600949753_ff0d884177_o.jpg", "6141794105_173c4abf79_b.jpg", "3294446343_070405d6ff_b.jpg", "14217671141_4177d42b42_c.jpg", "6892829410_bdee9582ca_c.jpg", "5128123530_de2576599f_b.jpg", "185452253_658456ed55_o.jpg", "3130821222_0e2a8e4f74_b.jpg", "7783570278_c0fe088a6d_c.jpg", "3759553615_0ba28c9e15_b.jpg", "14428338601_cdabceddd3_c.jpg", "6006260787_0336e721db_b.jpg", "182325986_6274cceb2e_o.jpg", "16216650043_64385b3b91_c.jpg", "7380311256_8714609bd8_c.jpg", "6176821244_1ceec31f5f_b.jpg", "4348794196_05d5ef653f_b.jpg", "16621106381_040c4f395a_c.jpg", "15171811828_f06124e20a_c.jpg", "11709125105_882c822076_c.jpg", "5321157581_28ea897fef_b.jpg", "6863898922_9d9beaa8d9_c.jpg", "7128129839_654ce448c6_c.jpg", "10495354685_5fb9280b19_c.jpg", "19656076351_cb30f43a41_c.jpg", "4905497799_8c22a0407a_b.jpg", "3370400296_1f9899c300_b.jpg", "5971458664_9588aaee90_b.jpg", "16672426446_24c72ed9e6_c.jpg", "9634586723_633fa1be93_c.jpg", "1849556624_fe7ccb917b_b.jpg", "2115746090_0a963d702a_b.jpg", "8421105901_3cc2d6d15a_c.jpg", "8504226368_c55a266cda_c.jpg", "3593754070_d44e8cfbb7_b.jpg", "8208913350_1eed448f89_c.jpg", "22253222970_44cb7cffa8_c.jpg", "3799722947_74b04a500c_b.jpg", "3197239548_c0087172ca_b.jpg", "4218887379_906f7b758b_o.jpg", "8606745821_bfc7d7d738_c.jpg", "9540370303_e7cba551b4_c.jpg", "5987290194_d3dd371fc9_b.jpg", "4402619069_805599b4f8_b.jpg", "11978346203_8c63a8f34e_c.jpg", "4806911870_5c1a0e1aae_b.jpg", "9194628292_703b475a95_c.jpg", "19676062654_045e8e517e_c.jpg", "2924070719_29040ae280_o.jpg", "14840525964_e190fc0872_c.jpg", "10831984204_7a152e0fae_c.jpg", "3065677022_2c2f511c9b_b.jpg", "20033485558_868b8515db_c.jpg", "23131646942_2460fd031a_c.jpg", "4678059440_6e5bf34da7_b.jpg", "2966610236_8404994d37_o.jpg", "9586520865_c6f685ec4d_c.jpg", "1149089847_737b78a031_b.jpg", "4820322211_1f425ecf01_b.jpg", "206358137_fedf01dc2d_b.jpg", "4211286840_f16b8e609c_b.jpg", "11875410586_3f1ed4d1b1_c.jpg", "22908115460_6caef62718_c.jpg", "5163600450_ce80028039_b.jpg", "8239033953_2e835e0696_c.jpg", "16002047337_1b8f93be96_c.jpg", "21931672282_38034a0466_c.jpg", "22666564194_56b757a6bb_c.jpg", "4354120560_17eabe4270_b.jpg", "5014563983_c63ff280af_b.jpg", "3707164863_1f31a56b56_b.jpg", "5410671051_1e094dc2e2_b.jpg", "4309837660_3e11a2c61b_b.jpg", "6632066133_9371b5dc05_b.jpg", "37856869_ab1a736364_o.jpg", "5517181929_820eaa7b76_b.jpg", "9255252452_2b8326a370_c.jpg", "9379145287_b25dbbc451_c.jpg", "16569865263_37a149b163_c.jpg", "22093736505_10d5efac6e_c.jpg", "633671322_f1f438739c_b.jpg", "16712322764_cf5b9cc405_c.jpg", "22458485522_f4aa924492_c.jpg", "1308100275_29d4dbf4be_b.jpg", "1992069_0751b406c5_o.jpg", "5307582791_ccd41ccc50_b.jpg", "5755997394_ef4540f89f_b.jpg", "3737292250_9bf839df28_o.jpg", "13977006428_2f2c26120a_c.jpg", "2894528580_23758072c3_b.jpg", "373221129_2d319cf1cd_b.jpg", "16823035549_b1a6e773e3_c.jpg", "2031554795_7651951aa9_b.jpg", "10961968454_5abfe19802_c.jpg", "5587064356_3b9bb134e7_b.jpg", "5436335130_45cabeceec_b.jpg", "1855075703_e2d71b8a96_b.jpg", "2756522575_4f1fda2bc9_b.jpg", "7186342794_59291165bf_c.jpg", "22113603504_230e2672e0_c.jpg", "9610827736_bd4891c1d0_c.jpg", "7072914403_4063901b51_c.jpg", "2250029763_59f61096d8_o.jpg", "2805897111_fb6ef61983_b.jpg", "9264311376_686a82d1c6_c.jpg", "14795583817_184313818a_c.jpg", "1331914821_3ec59a5ec0_b.jpg", "21883376114_c0fabb1619_c.jpg", "16289018812_bcab1dfff6_c.jpg", "5039566604_c43f4a35e1_b.jpg", "11771734935_b4222e1296_c.jpg", "19021734563_a4391bebd0_c.jpg", "4590568467_ec251f8ab9_b.jpg", "230224963_5eccc7f3cf_o.jpg", "5444209637_75b41a6e15_b.jpg", "6809501442_60d584fcd4_c.jpg", "5935689010_7fa4c35126_b.jpg", "12276579123_b6de16ca02_c.jpg", "7237158370_bc98299b03_c.jpg", "22648075619_87d39c4405_c.jpg", "12437100285_6f3de0d77e_c.jpg", "539582366_3ed23b0711_b.jpg", "5235153707_b01718c197_b.jpg", "9036218399_47c0509c78_c.jpg", "3974373424_321ce6ee7b_o.jpg", "3214670909_c3387eeb52_b.jpg", "3966816742_c5f66a0050_b.jpg", "6124790801_ceec6164ab_b.jpg", "4526889617_7424b610ea_b.jpg", "6540157697_78e2ef7b18_b.jpg", "19087618072_06a31cbb14_c.jpg", "5798838070_3cbe124617_b.jpg", "4094533283_74fff1a29d_b.jpg", "2357500954_ae0e75e04b_b.jpg", "4528429967_3a530ea823_b.jpg", "2344365160_77e93576fe_b.jpg", "2400838721_047e93c1da_b.jpg", "14500205073_6ee2813ca3_c.jpg", "11947867996_41f588562b_c.jpg", "3904086268_ff29c9b39b_o.jpg", "7465747792_6d9a49d04c_c.jpg", "1832112770_07c67b274d_b.jpg", "14785045305_64b847d1c2_c.jpg", "22378175864_89d56ba96d_c.jpg", "2747380017_8fb3d0a7ff_o.jpg", "3064273548_b7b5391456_b.jpg", "5525988501_709a4ce1dc_b.jpg", "15137530922_b9462b9cd9_c.jpg", "4192603791_ef6a86762b_b.jpg", "5196233321_4586cd07fa_b.jpg", "3002420275_af08388d5c_b.jpg", "5926737743_f6e0bcf3f5_b.jpg", "15495627874_eeec984393_c.jpg", "317480749_328fee973d_b.jpg", "11001262846_ae5c2790ba_c.jpg", "4666417792_b87cb9affe_b.jpg", "22861201839_233687f96f_c.jpg", "10598232615_dd407dff97_c.jpg", "4670893327_a2e431be6f_b.jpg", "5845827826_ee1eaa6d77_b.jpg", "8545650602_341f4dcb3f_c.jpg", "7783121058_fdb9f2ea33_c.jpg", "2366903269_503dab4814_b.jpg", "6292047264_d0020104bd_b.jpg", "3954380986_32f82052b1_b.jpg", "6601866919_c70c634836_b.jpg", "20692989538_ae285e181c_c.jpg", "3776584858_91ae007539_b.jpg", "7980794043_29af27a4b4_c.jpg", "112592728_2b95307ab9_b.jpg"], "human": ["5128123530_de2576599f_b.jpg", "5935689010_7fa4c35126_b.jpg", "20220037833_745a63521c_c.jpg", "16712322764_cf5b9cc405_c.jpg", "14975779256_a948bbde8d_c.jpg", "16002047337_1b8f93be96_c.jpg", "16908221218_a1db9a2889_c.jpg", "20692989538_ae285e181c_c.jpg", "8671601673_4705c06c72_c.jpg", "13922256400_63c37088d9_c.jpg", "5307582791_ccd41ccc50_b.jpg", "21923340111_67d9cb41a2_c.jpg", "21679174348_6426878fda_c.jpg", "2141847448_96e2ac358f_b.jpg", "8591771654_31cece470a_c.jpg", "23124963609_91b8ce37d0_c.jpg", "4528429967_3a530ea823_b.jpg", "4887697425_ed7ed96a7b_b.jpg", "16382417532_9db2d9e979_c.jpg", "3707164863_1f31a56b56_b.jpg", "17001945481_3333f764a2_c.jpg", "4905497799_8c22a0407a_b.jpg", "2047992435_1d3173b70a_b.jpg", "9449620384_ca42c8bd1c_c.jpg", "5955594271_75ffa33ae1_b.jpg", "25760133_985004b241_b.jpg", "22889017612_773f95ff05_c.jpg", "2250257794_5baaca4c86_o.jpg", "3996332252_d76a74d8ec_b.jpg", "7075587655_060c69610a_c.jpg", "6545067517_374883d482_b.jpg", "3210010263_905229c87f_b.jpg", "6758326771_9f6d355471_b.jpg", "14400676147_564b7cab22_c.jpg", "241567473_b54493a0bf_o.jpg", "22378175864_89d56ba96d_c.jpg", "3990212739_376105b273_o.jpg", "21610132779_4ac0a9bbe1_c.jpg", "4009150377_0672f7eb84_o.jpg", "8010003295_5c0e299ac7_c.jpg", "6411715157_206e0dd94b_b.jpg", "5172555508_5b5621979a_b.jpg", "10599149634_c85839d7db_c.jpg", "19318012258_1768f0efd4_c.jpg", "2701744003_e8d110e23d_o.jpg", "520996315_62069e5bbe_b.jpg", "22858277217_915604c95b_c.jpg", "8117710419_472e1727f6_c.jpg", "5314479722_e5ee0e9c0d_b.jpg", "14821085701_d8e9b0f313_c.jpg", "3593754070_d44e8cfbb7_b.jpg", "5233021752_0bbaf9efa9_b.jpg", "3225388186_f759bb2efd_b.jpg", "4580886884_5135a20f85_b.jpg", "6063042881_bf8b71b0af_b.jpg", "7743949198_50665b8a1f_c.jpg", "5005574607_e50e541b18_b.jpg", "5465125897_db85858f42_b.jpg", "15919764513_21f7012013_c.jpg", "3399152245_3f145004f2_b.jpg", "15880218281_5ce5729a89_c.jpg", "2548660076_89a8ebb4d8_b.jpg", "18447556469_eb51a6a53f_c.jpg", "8568454618_a891f5dc7c_c.jpg", "5697770046_aaee118f43_b.jpg", "14489719271_4f39612e9d_c.jpg", "329137116_69d33a38af_b.jpg", "5991396675_a46c4cf7fd_b.jpg", "9216097266_b7aaf62a3c_c.jpg", "9132013188_caffe33355_c.jpg", "11438228353_b0942c1965_c.jpg", "17389910412_5c4a5713e7_c.jpg", "4582550712_2bb27ebd56_b.jpg", "2545039943_bfc4677672_o.jpg", "7862449580_81eb1bdd1a_c.jpg", "338478912_bf806bb3b9_b.jpg", "4071195254_b7b01b7a95_b.jpg", "9656254626_7221ed8610_c.jpg", "9216061054_9b15b5e220_c.jpg", "8997607849_7012ee5097_c.jpg", "5451330835_39ce788055_b.jpg", "335977630_0b01a71ca7_o.jpg", "7237158370_bc98299b03_c.jpg", "138337087_66f4bb95a5_o.jpg", "1855075703_e2d71b8a96_b.jpg", "3224201087_6c9445b1a4_b.jpg", "6196298243_4046d3a187_b.jpg", "14117102802_5a3847d1c4_c.jpg", "15095294843_29aa2cc92b_c.jpg", "4010140017_5b6105336b_b.jpg", "5219568961_1313fb60f9_b.jpg", "472783773_43c208c6db_o.jpg", "4094533283_74fff1a29d_b.jpg", "493504461_55da56e03b_b.jpg", "193354531_f993136e11_o.jpg", "5346664759_768b577739_b.jpg", "3759553615_0ba28c9e15_b.jpg", "4219766886_11738b19f7_o.jpg", "4158482001_c479431427_b.jpg", "3632042172_e54d431b86_o.jpg", "13935524706_a74aa1d8d9_c.jpg", "3888323100_a77d78113d_b.jpg", "7006823703_8652a88ae9_c.jpg", "5653821068_98dc141fc1_b.jpg", "258887093_abc7a6fd8a_o.jpg", "76537045_5997edc2af_b.jpg", "4742030237_f4b286b00b_b.jpg", "10899044693_3efff6e72b_c.jpg", "37856869_ab1a736364_o.jpg", "15533820687_644f9dfa16_c.jpg", "17552718331_0327cd3991_c.jpg", "2455356735_98e3716d18_b.jpg", "10100461493_2597206933_c.jpg", "9415194618_20c9391083_c.jpg", "18622440113_64b087efeb_c.jpg", "3064529250_e7e14c8e33_b.jpg", "2219073456_7cba41ce59_o.jpg", "2380629494_3cc96ab658_b.jpg", "8670499783_6a7bdc8e1c_c.jpg", "15604716778_e20489efb5_c.jpg", "9513142218_e1fa17486e_c.jpg", "278999481_169ae79ab8_b.jpg", "5104021436_aa1368c46d_b.jpg", "4661172818_e69682a8d9_b.jpg", "5830937201_d028056d03_b.jpg", "11709125105_882c822076_c.jpg", "4682189889_346a9a5e5d_b.jpg", "8697461715_12fa0164b3_c.jpg", "2037016346_1ceb5d2240_b.jpg", "17794523236_f25133dff0_c.jpg", "8455922606_b0a908b5ec_c.jpg", "5242605879_76852f3f16_b.jpg", "8606745821_bfc7d7d738_c.jpg", "8504226368_c55a266cda_c.jpg", "19087618072_06a31cbb14_c.jpg", "3432940096_3f2127f722_b.jpg", "154067078_3b0fd61945_o.jpg", "4565128220_227ccc7b09_b.jpg", "3565677857_bbf2ca268f_b.jpg", "15058648637_2bf19790c6_c.jpg", "2494908972_d75c7db209_b.jpg", "4536387873_378f2e2bf1_b.jpg", "6626869413_9124647cbd_b.jpg", "3197239548_c0087172ca_b.jpg", "3589708304_e31587797b_o.jpg", "8431423189_1a3ccf8fd5_c.jpg", "5782605225_ef43b80482_b.jpg", "2163214948_02e2f53933_b.jpg", "20175952908_15be5b0c56_c.jpg", "9609906443_7a300a6d92_c.jpg", "3879937894_d128cc90e6_o.jpg", "14518923604_15189e5355_c.jpg", "23664996672_aa0ce6541f_c.jpg", "2949538889_7d48c2b9c7_o.jpg", "12348984654_4491f5ece0_c.jpg", "19499853981_3d4180f13e_c.jpg", "16985182529_2ecc3b639a_c.jpg", "5587064356_3b9bb134e7_b.jpg", "2400838721_047e93c1da_b.jpg", "8239033953_2e835e0696_c.jpg", "230224963_5eccc7f3cf_o.jpg", "3890560449_39b9f11431_b.jpg", "840960137_550714d41c_o.jpg", "731583713_2fde35c209_b.jpg", "13903858187_0d5b4ac524_c.jpg", "8262039475_16e7b0e393_c.jpg", "2903041478_9ca96942e9_b.jpg", "3889047692_d41fb582bc_o.jpg", "2157516909_de12572d95_b.jpg", "8450809710_0d3ca4658b_c.jpg", "6632066133_9371b5dc05_b.jpg", "4458876892_8ce21a1418_o.jpg", "21213987492_abb1c407e5_c.jpg", "22625808588_910655610c_c.jpg", "7783121058_fdb9f2ea33_c.jpg", "2911844831_443a3627a9_b.jpg"]}
--------------------------------------------------------------------------------
/human_bboxes/FCDB/human_bboxes.json:
--------------------------------------------------------------------------------
1 | {"5128123530_de2576599f_b.jpg": [101, 37, 1024, 974], "5935689010_7fa4c35126_b.jpg": [323, 0, 1020, 675], "20220037833_745a63521c_c.jpg": [381, 281, 475, 502], "16712322764_cf5b9cc405_c.jpg": [105, 20, 722, 622], "14975779256_a948bbde8d_c.jpg": [223, 175, 562, 536], "16002047337_1b8f93be96_c.jpg": [137, 137, 384, 528], "16908221218_a1db9a2889_c.jpg": [400, 212, 553, 486], "20692989538_ae285e181c_c.jpg": [348, 312, 530, 531], "8671601673_4705c06c72_c.jpg": [563, 244, 771, 593], "13922256400_63c37088d9_c.jpg": [460, 290, 519, 434], "5307582791_ccd41ccc50_b.jpg": [508, 316, 650, 845], "21923340111_67d9cb41a2_c.jpg": [357, 29, 611, 527], "21679174348_6426878fda_c.jpg": [137, 34, 340, 707], "2141847448_96e2ac358f_b.jpg": [186, 412, 665, 1003], "8591771654_31cece470a_c.jpg": [304, 137, 345, 240], "23124963609_91b8ce37d0_c.jpg": [69, 159, 732, 480], "4528429967_3a530ea823_b.jpg": [601, 20, 1012, 567], "4887697425_ed7ed96a7b_b.jpg": [348, 239, 569, 523], "16382417532_9db2d9e979_c.jpg": [130, 10, 609, 614], "3707164863_1f31a56b56_b.jpg": [576, 15, 938, 669], "17001945481_3333f764a2_c.jpg": [144, 110, 599, 528], "4905497799_8c22a0407a_b.jpg": [133, 245, 752, 679], "2047992435_1d3173b70a_b.jpg": [7, 32, 1024, 689], "9449620384_ca42c8bd1c_c.jpg": [126, 302, 426, 626], "5955594271_75ffa33ae1_b.jpg": [102, 370, 626, 701], "25760133_985004b241_b.jpg": [271, 192, 412, 675], "22889017612_773f95ff05_c.jpg": [525, 93, 638, 369], "2250257794_5baaca4c86_o.jpg": [0, 209, 659, 1188], "3996332252_d76a74d8ec_b.jpg": [513, 226, 786, 669], "7075587655_060c69610a_c.jpg": [208, 52, 795, 517], "6545067517_374883d482_b.jpg": [40, 146, 678, 1005], "3210010263_905229c87f_b.jpg": [208, 154, 410, 597], "6758326771_9f6d355471_b.jpg": [115, 283, 343, 641], "14400676147_564b7cab22_c.jpg": [530, 308, 789, 591], "241567473_b54493a0bf_o.jpg": [6, 124, 517, 790], "22378175864_89d56ba96d_c.jpg": [36, 137, 496, 620], "3990212739_376105b273_o.jpg": [202, 312, 312, 702], "21610132779_4ac0a9bbe1_c.jpg": [64, 229, 417, 708], "4009150377_0672f7eb84_o.jpg": [126, 51, 639, 999], "8010003295_5c0e299ac7_c.jpg": [241, 383, 387, 793], "6411715157_206e0dd94b_b.jpg": [65, 20, 669, 671], "5172555508_5b5621979a_b.jpg": [90, 100, 590, 901], "10599149634_c85839d7db_c.jpg": [7, 189, 533, 791], "19318012258_1768f0efd4_c.jpg": [134, 128, 511, 795], "2701744003_e8d110e23d_o.jpg": [539, 183, 1112, 1024], "520996315_62069e5bbe_b.jpg": [197, 113, 738, 666], "22858277217_915604c95b_c.jpg": [394, 272, 429, 362], "8117710419_472e1727f6_c.jpg": [273, 182, 467, 409], "5314479722_e5ee0e9c0d_b.jpg": [378, 227, 510, 529], "14821085701_d8e9b0f313_c.jpg": [338, 20, 577, 532], "3593754070_d44e8cfbb7_b.jpg": [106, 370, 556, 931], "5233021752_0bbaf9efa9_b.jpg": [242, 801, 298, 932], "3225388186_f759bb2efd_b.jpg": [200, 114, 374, 392], "4580886884_5135a20f85_b.jpg": [315, 465, 436, 606], "6063042881_bf8b71b0af_b.jpg": [383, 442, 518, 695], "7743949198_50665b8a1f_c.jpg": [675, 386, 764, 579], "5005574607_e50e541b18_b.jpg": [564, 0, 1014, 751], "5465125897_db85858f42_b.jpg": [19, 36, 678, 1019], "15919764513_21f7012013_c.jpg": [431, 67, 705, 527], "3399152245_3f145004f2_b.jpg": [130, 92, 452, 616], "15880218281_5ce5729a89_c.jpg": [166, 211, 217, 405], "2548660076_89a8ebb4d8_b.jpg": [353, 540, 393, 638], "18447556469_eb51a6a53f_c.jpg": [45, 236, 308, 537], "8568454618_a891f5dc7c_c.jpg": [366, 165, 611, 524], "5697770046_aaee118f43_b.jpg": [67, 5, 835, 656], "14489719271_4f39612e9d_c.jpg": [399, 37, 673, 515], "329137116_69d33a38af_b.jpg": [369, 42, 598, 568], "5991396675_a46c4cf7fd_b.jpg": [315, 149, 699, 679], "9216097266_b7aaf62a3c_c.jpg": [402, 104, 678, 586], "9132013188_caffe33355_c.jpg": [347, 280, 405, 352], "11438228353_b0942c1965_c.jpg": [522, 261, 633, 421], "17389910412_5c4a5713e7_c.jpg": [262, 371, 293, 454], "4582550712_2bb27ebd56_b.jpg": [356, 22, 1016, 758], "2545039943_bfc4677672_o.jpg": [77, 282, 648, 996], "7862449580_81eb1bdd1a_c.jpg": [385, 225, 451, 394], "338478912_bf806bb3b9_b.jpg": [136, 300, 588, 850], "4071195254_b7b01b7a95_b.jpg": [416, 62, 751, 566], "9656254626_7221ed8610_c.jpg": [404, 167, 632, 533], "9216061054_9b15b5e220_c.jpg": [436, 180, 637, 596], "8997607849_7012ee5097_c.jpg": [345, 128, 470, 546], "5451330835_39ce788055_b.jpg": [327, 296, 599, 638], "335977630_0b01a71ca7_o.jpg": [393, 71, 1261, 1066], "7237158370_bc98299b03_c.jpg": [142, 249, 478, 773], "138337087_66f4bb95a5_o.jpg": [305, 183, 441, 543], "1855075703_e2d71b8a96_b.jpg": [0, 119, 685, 918], "3224201087_6c9445b1a4_b.jpg": [85, 300, 373, 591], "6196298243_4046d3a187_b.jpg": [383, 314, 442, 381], "14117102802_5a3847d1c4_c.jpg": [382, 239, 402, 294], "15095294843_29aa2cc92b_c.jpg": [216, 79, 411, 685], "4010140017_5b6105336b_b.jpg": [92, 359, 657, 946], "5219568961_1313fb60f9_b.jpg": [496, 109, 805, 425], "472783773_43c208c6db_o.jpg": [144, 9, 1001, 665], "4094533283_74fff1a29d_b.jpg": [25, 101, 671, 674], "493504461_55da56e03b_b.jpg": [385, 86, 601, 713], "193354531_f993136e11_o.jpg": [13, 261, 281, 843], "5346664759_768b577739_b.jpg": [621, 113, 769, 550], "3759553615_0ba28c9e15_b.jpg": [43, 16, 508, 534], "4219766886_11738b19f7_o.jpg": [43, 312, 164, 600], "4158482001_c479431427_b.jpg": [397, 266, 543, 587], "3632042172_e54d431b86_o.jpg": [475, 295, 614, 507], "13935524706_a74aa1d8d9_c.jpg": [257, 104, 553, 534], "3888323100_a77d78113d_b.jpg": [431, 213, 740, 581], "7006823703_8652a88ae9_c.jpg": [460, 158, 655, 526], "5653821068_98dc141fc1_b.jpg": [213, 85, 727, 748], "258887093_abc7a6fd8a_o.jpg": [0, 33, 488, 739], "76537045_5997edc2af_b.jpg": [210, 41, 894, 669], "4742030237_f4b286b00b_b.jpg": [69, 216, 566, 608], "10899044693_3efff6e72b_c.jpg": [373, 12, 747, 526], "37856869_ab1a736364_o.jpg": [304, 183, 738, 673], "15533820687_644f9dfa16_c.jpg": [208, 20, 423, 527], "17552718331_0327cd3991_c.jpg": [285, 112, 650, 519], "2455356735_98e3716d18_b.jpg": [376, 19, 747, 449], "10100461493_2597206933_c.jpg": [11, 0, 711, 512], "9415194618_20c9391083_c.jpg": [2, 81, 456, 793], "18622440113_64b087efeb_c.jpg": [370, 113, 624, 527], "3064529250_e7e14c8e33_b.jpg": [289, 90, 950, 682], "2219073456_7cba41ce59_o.jpg": [167, 59, 430, 902], "2380629494_3cc96ab658_b.jpg": [99, 110, 333, 1004], "8670499783_6a7bdc8e1c_c.jpg": [51, 86, 135, 339], "15604716778_e20489efb5_c.jpg": [0, 74, 594, 794], "9513142218_e1fa17486e_c.jpg": [268, 198, 437, 531], "278999481_169ae79ab8_b.jpg": [253, 135, 1015, 749], "5104021436_aa1368c46d_b.jpg": [197, 146, 539, 1010], "4661172818_e69682a8d9_b.jpg": [516, 326, 564, 487], "5830937201_d028056d03_b.jpg": [331, 329, 450, 757], "11709125105_882c822076_c.jpg": [400, 145, 508, 495], "4682189889_346a9a5e5d_b.jpg": [508, 130, 703, 441], "8697461715_12fa0164b3_c.jpg": [413, 209, 461, 267], "2037016346_1ceb5d2240_b.jpg": [227, 226, 294, 333], "17794523236_f25133dff0_c.jpg": [267, 207, 454, 592], "8455922606_b0a908b5ec_c.jpg": [290, 346, 526, 798], "5242605879_76852f3f16_b.jpg": [111, 9, 883, 674], "8606745821_bfc7d7d738_c.jpg": [271, 21, 515, 362], "8504226368_c55a266cda_c.jpg": [278, 14, 629, 797], "19087618072_06a31cbb14_c.jpg": [130, 208, 288, 354], "3432940096_3f2127f722_b.jpg": [746, 296, 952, 677], "154067078_3b0fd61945_o.jpg": [64, 11, 1091, 591], "4565128220_227ccc7b09_b.jpg": [340, 3, 1012, 654], "3565677857_bbf2ca268f_b.jpg": [8, 423, 602, 1019], "15058648637_2bf19790c6_c.jpg": [153, 36, 535, 280], "2494908972_d75c7db209_b.jpg": [24, 158, 684, 1020], "4536387873_378f2e2bf1_b.jpg": [219, 321, 466, 864], "6626869413_9124647cbd_b.jpg": [305, 476, 371, 658], "3197239548_c0087172ca_b.jpg": [26, 175, 487, 705], "3589708304_e31587797b_o.jpg": [296, 236, 451, 486], "8431423189_1a3ccf8fd5_c.jpg": [5, 196, 531, 792], "5782605225_ef43b80482_b.jpg": [380, 257, 497, 470], "2163214948_02e2f53933_b.jpg": [81, 425, 383, 1007], "20175952908_15be5b0c56_c.jpg": [148, 146, 627, 506], "9609906443_7a300a6d92_c.jpg": [354, 20, 799, 520], "3879937894_d128cc90e6_o.jpg": [153, 321, 550, 1016], "14518923604_15189e5355_c.jpg": [248, 238, 360, 364], "23664996672_aa0ce6541f_c.jpg": [358, 108, 538, 328], "2949538889_7d48c2b9c7_o.jpg": [51, 116, 783, 790], "12348984654_4491f5ece0_c.jpg": [230, 159, 517, 506], "19499853981_3d4180f13e_c.jpg": [208, 40, 760, 524], "16985182529_2ecc3b639a_c.jpg": [516, 211, 718, 526], "5587064356_3b9bb134e7_b.jpg": [432, 136, 677, 675], "2400838721_047e93c1da_b.jpg": [308, 106, 708, 669], "8239033953_2e835e0696_c.jpg": [232, 160, 424, 403], "230224963_5eccc7f3cf_o.jpg": [49, 21, 190, 226], "3890560449_39b9f11431_b.jpg": [472, 327, 534, 522], "840960137_550714d41c_o.jpg": [0, 4, 731, 579], "731583713_2fde35c209_b.jpg": [166, 138, 500, 942], "13903858187_0d5b4ac524_c.jpg": [160, 85, 271, 421], "8262039475_16e7b0e393_c.jpg": [536, 184, 648, 417], "2903041478_9ca96942e9_b.jpg": [231, 347, 679, 984], "3889047692_d41fb582bc_o.jpg": [234, 125, 454, 567], "2157516909_de12572d95_b.jpg": [319, 627, 370, 811], "8450809710_0d3ca4658b_c.jpg": [1, 370, 124, 599], "6632066133_9371b5dc05_b.jpg": [247, 15, 661, 542], "4458876892_8ce21a1418_o.jpg": [224, 213, 490, 871], "21213987492_abb1c407e5_c.jpg": [470, 81, 671, 684], "22625808588_910655610c_c.jpg": [241, 349, 401, 464], "7783121058_fdb9f2ea33_c.jpg": [290, 150, 386, 427], "2911844831_443a3627a9_b.jpg": [308, 188, 467, 356]}
--------------------------------------------------------------------------------
/human_bboxes/FLMS/human_bboxes.json:
--------------------------------------------------------------------------------
1 | {"520651056_Large.jpg": [453, 248, 675, 755], "119260988_Large.jpg": [527, 6, 603, 176], "101685710_Large.jpg": [400, 311, 487, 482], "1238479583_Large.jpg": [522, 319, 658, 614], "6851859244_Large.jpg": [439, 254, 618, 741], "887993249_Large.jpg": [491, 298, 658, 690], "138895540_Large.jpg": [475, 318, 575, 508], "87097024_Large.jpg": [429, 401, 644, 759], "144836967_Large.jpg": [468, 328, 644, 667], "226854451_Large.jpg": [694, 312, 799, 516], "1818097578_Large.jpg": [293, 473, 453, 767], "60132202_Large.jpg": [426, 352, 540, 659], "7004033151_Large.jpg": [432, 352, 593, 666], "795230518_Large.jpg": [442, 223, 704, 765], "6853050148_Large.jpg": [318, 491, 485, 803], "490466578_Large.jpg": [167, 417, 294, 684], "997626761_Large.jpg": [242, 606, 341, 899], "33006163_Large.jpg": [259, 107, 635, 723], "196137741_Large.jpg": [307, 419, 424, 649], "274700974_Large.jpg": [201, 301, 456, 970], "338774403_Large.jpg": [218, 449, 466, 983], "126085096_Large.jpg": [422, 186, 618, 652], "859542660_Large.jpg": [346, 552, 409, 735], "174430493_Large.jpg": [466, 378, 630, 755], "119442448_Large.jpg": [221, 349, 441, 766], "1049828137_Large.jpg": [494, 261, 572, 471], "108059817_Large.jpg": [155, 330, 596, 986], "379047973_Large.jpg": [56, 251, 273, 676], "1347846010_Large.jpg": [355, 127, 708, 681], "898619475_Large.jpg": [417, 332, 625, 678], "1173995169_Large.jpg": [115, 227, 769, 756], "735049260_Large.jpg": [251, 534, 397, 842], "1349095323_Large.jpg": [103, 270, 338, 722], "229122444_Large.jpg": [286, 446, 484, 965], "1351032584_Large.jpg": [479, 298, 560, 465], "824401800_Large.jpg": [530, 333, 622, 538], "1740154095_Large.jpg": [412, 498, 720, 765], "84933575_Large.jpg": [424, 340, 577, 542], "130057475_Large.jpg": [188, 312, 297, 629]}
--------------------------------------------------------------------------------
/human_bboxes/GAICD/human_bboxes.json:
--------------------------------------------------------------------------------
1 | {"235093.jpg": [367, 22, 935, 687], "308233.jpg": [113, 38, 342, 596], "302811.jpg": [416, 290, 503, 461], "222577.jpg": [446, 720, 552, 927], "251647.jpg": [280, 340, 353, 582], "239760.jpg": [133, 638, 248, 794], "252770.jpg": [471, 370, 561, 589], "236839.jpg": [404, 283, 491, 419], "236058.jpg": [455, 127, 977, 684], "278927.jpg": [516, 297, 552, 380], "308394.jpg": [474, 423, 696, 660], "312262.jpg": [473, 188, 568, 404], "490605.jpg": [372, 365, 513, 463], "222291.jpg": [563, 266, 595, 335], "327672.jpg": [60, 50, 675, 957], "279678.jpg": [412, 633, 506, 830], "322484.jpg": [176, 72, 559, 977], "290758.jpg": [118, 253, 347, 693], "223848.jpg": [339, 189, 588, 495], "252783.jpg": [354, 339, 433, 533], "497266.jpg": [362, 261, 434, 351], "318237.jpg": [408, 122, 629, 434], "316695.jpg": [332, 226, 673, 668], "340515.jpg": [465, 240, 550, 518], "323138.jpg": [438, 264, 670, 671], "394757.jpg": [479, 212, 799, 670], "253230.jpg": [225, 176, 472, 665], "220320.jpg": [456, 579, 794, 1013], "338230.jpg": [500, 130, 567, 221], "411610.jpg": [86, 282, 390, 417], "222876.jpg": [245, 303, 439, 669], "315248.jpg": [390, 54, 622, 271], "302166.jpg": [212, 260, 337, 524], "301252.jpg": [552, 83, 859, 655], "250903.jpg": [233, 406, 518, 959], "308728.jpg": [93, 260, 326, 641], "332764.jpg": [325, 542, 403, 635], "325204.jpg": [165, 293, 541, 921], "322858.jpg": [704, 199, 855, 534], "328200.jpg": [456, 298, 503, 405], "431149.jpg": [290, 247, 725, 671], "239121.jpg": [441, 301, 491, 414], "239085.jpg": [377, 250, 621, 611], "232700.jpg": [311, 23, 676, 681], "233725.jpg": [297, 146, 771, 671], "333393.jpg": [142, 434, 244, 584], "221320.jpg": [389, 279, 835, 679], "420325.jpg": [137, 294, 475, 665], "324259.jpg": [263, 238, 501, 682], "220865.jpg": [210, 399, 382, 570], "233858.jpg": [354, 253, 849, 544], "304558.jpg": [387, 162, 657, 696], "235992.jpg": [32, 44, 691, 871], "272046.jpg": [573, 242, 899, 623], "250900.jpg": [273, 203, 702, 775], "306149.jpg": [372, 123, 678, 356], "250455.jpg": [244, 122, 757, 691], "340700.jpg": [308, 227, 669, 492], "222610.jpg": [151, 219, 566, 963], "418314.jpg": [359, 222, 457, 486], "402584.jpg": [214, 91, 612, 876], "420036.jpg": [298, 124, 429, 403], "300434.jpg": [190, 83, 827, 609], "287292.jpg": [641, 234, 712, 350], "250666.jpg": [132, 309, 383, 757], "231352.jpg": [335, 735, 413, 896], "220207.jpg": [341, 153, 665, 554], "252970.jpg": [334, 162, 605, 664], "235792.jpg": [511, 383, 642, 606], "280832.jpg": [222, 363, 480, 762], "238669.jpg": [409, 298, 463, 402], "236546.jpg": [259, 527, 443, 914], "214216.jpg": [143, 225, 194, 361], "403759.jpg": [561, 435, 679, 603], "304073.jpg": [855, 88, 971, 356], "313343.jpg": [564, 544, 619, 663], "237845.jpg": [441, 351, 537, 650], "465230.jpg": [613, 392, 692, 505], "309802.jpg": [674, 343, 726, 448], "408500.jpg": [686, 198, 887, 639], "264748.jpg": [665, 367, 707, 465], "300747.jpg": [404, 155, 580, 565], "311046.jpg": [368, 115, 663, 447], "401408.jpg": [28, 294, 428, 1023], "255513.jpg": [318, 270, 357, 329], "315966.jpg": [276, 269, 390, 590], "310154.jpg": [335, 160, 580, 569], "236401.jpg": [580, 367, 705, 649], "324123.jpg": [103, 235, 534, 1012], "340362.jpg": [58, 450, 127, 637], "234280.jpg": [318, 130, 648, 509], "470220.jpg": [459, 300, 541, 467], "314884.jpg": [333, 78, 702, 581], "402802.jpg": [201, 276, 373, 607], "232942.jpg": [351, 207, 649, 557], "221336.jpg": [367, 87, 713, 620], "498390.jpg": [703, 300, 840, 607], "250688.jpg": [483, 241, 654, 692], "315782.jpg": [381, 188, 645, 511], "314112.jpg": [249, 351, 311, 485], "238319.jpg": [400, 48, 897, 671], "230363.jpg": [367, 262, 522, 610], "233513.jpg": [352, 370, 711, 704], "225742.jpg": [382, 192, 511, 370], "308853.jpg": [175, 106, 577, 524], "230083.jpg": [341, 186, 691, 681], "237943.jpg": [248, 75, 812, 650], "497828.jpg": [686, 385, 813, 620], "225483.jpg": [576, 289, 692, 476], "320875.jpg": [395, 231, 427, 288], "301780.jpg": [427, 285, 939, 677], "292117.jpg": [172, 202, 292, 464], "300725.jpg": [400, 152, 830, 641], "221085.jpg": [481, 192, 579, 437], "474276.jpg": [711, 201, 764, 330], "395279.jpg": [369, 213, 654, 399], "220243.jpg": [398, 135, 596, 378], "270591.jpg": [487, 304, 551, 405], "315480.jpg": [453, 253, 611, 435], "427629.jpg": [382, 155, 508, 414], "221342.jpg": [261, 166, 712, 677], "409113.jpg": [306, 184, 588, 441], "239054.jpg": [556, 329, 668, 622], "390936.jpg": [595, 321, 719, 641], "271562.jpg": [710, 296, 809, 506], "235683.jpg": [304, 28, 729, 588], "394992.jpg": [234, 117, 361, 402], "220746.jpg": [65, 392, 222, 574], "290810.jpg": [458, 299, 614, 582], "250211.jpg": [512, 175, 618, 303], "433761.jpg": [498, 735, 589, 936], "222638.jpg": [482, 330, 580, 579], "314264.jpg": [423, 226, 641, 663], "322605.jpg": [416, 195, 766, 614], "490585.jpg": [349, 60, 580, 679], "481177.jpg": [448, 286, 829, 618], "421576.jpg": [337, 106, 790, 610], "461460.jpg": [464, 381, 643, 642], "234659.jpg": [199, 154, 979, 681], "420218.jpg": [150, 256, 255, 587], "255991.jpg": [85, 257, 393, 756], "302089.jpg": [304, 252, 648, 623], "222437.jpg": [447, 227, 521, 417], "291358.jpg": [505, 94, 883, 473], "278891.jpg": [315, 453, 383, 649], "270801.jpg": [248, 65, 622, 575], "221627.jpg": [563, 473, 772, 747], "326310.jpg": [712, 384, 805, 646], "253510.jpg": [680, 403, 759, 591], "217386.jpg": [127, 231, 203, 400], "276367.jpg": [487, 251, 553, 322], "300948.jpg": [369, 163, 599, 826], "341252.jpg": [503, 163, 580, 297], "225125.jpg": [294, 443, 420, 553], "225566.jpg": [528, 250, 786, 669], "346630.jpg": [467, 421, 500, 462], "409964.jpg": [230, 122, 492, 456], "390822.jpg": [332, 342, 555, 580], "317475.jpg": [379, 198, 561, 718], "328459.jpg": [628, 280, 674, 372], "347246.jpg": [263, 329, 384, 681], "300663.jpg": [363, 129, 829, 681], "347746.jpg": [625, 351, 693, 505], "237344.jpg": [295, 481, 447, 753], "230448.jpg": [242, 516, 622, 1006], "341906.jpg": [455, 116, 685, 355], "300527.jpg": [602, 327, 714, 570], "250202.jpg": [282, 256, 758, 671], "343339.jpg": [592, 305, 666, 471], "323192.jpg": [537, 119, 663, 364], "279759.jpg": [382, 347, 462, 547], "239028.jpg": [528, 381, 656, 634], "282044.jpg": [316, 179, 639, 653], "287432.jpg": [522, 165, 769, 651], "490502.jpg": [575, 240, 736, 442], "280712.jpg": [1, 28, 486, 549], "310159.jpg": [336, 289, 415, 412], "221860.jpg": [500, 233, 583, 470], "323004.jpg": [655, 162, 910, 576], "231537.jpg": [321, 11, 847, 729], "236412.jpg": [472, 119, 985, 860], "490465.jpg": [415, 296, 470, 420], "251928.jpg": [45, 191, 660, 1002], "491747.jpg": [539, 90, 976, 462], "328506.jpg": [605, 235, 854, 617], "300282.jpg": [354, 234, 677, 778], "322699.jpg": [245, 217, 600, 652], "267475.jpg": [547, 208, 919, 600], "398019.jpg": [294, 325, 455, 818], "466370.jpg": [126, 459, 194, 634], "421089.jpg": [371, 133, 669, 508], "221365.jpg": [62, 277, 408, 621], "426180.jpg": [431, 222, 738, 681], "237874.jpg": [69, 67, 550, 1008], "281726.jpg": [424, 230, 551, 402], "239817.jpg": [281, 74, 704, 638], "312918.jpg": [421, 111, 610, 465], "250100.jpg": [175, 222, 456, 946], "304050.jpg": [733, 526, 775, 574], "328421.jpg": [362, 241, 536, 624], "492526.jpg": [238, 462, 320, 561], "301982.jpg": [281, 188, 579, 488], "253460.jpg": [257, 328, 569, 681], "317987.jpg": [369, 164, 748, 669], "310643.jpg": [731, 140, 799, 282], "472556.jpg": [236, 229, 347, 385], "300075.jpg": [524, 159, 712, 513], "322280.jpg": [275, 160, 443, 563], "322674.jpg": [358, 225, 486, 566], "222027.jpg": [300, 320, 402, 570], "496749.jpg": [667, 286, 789, 433], "233027.jpg": [151, 222, 475, 556], "239790.jpg": [416, 493, 644, 916], "312187.jpg": [410, 326, 489, 426], "446979.jpg": [281, 138, 615, 677], "220404.jpg": [546, 101, 717, 549], "300122.jpg": [385, 175, 823, 676], "295472.jpg": [740, 224, 819, 433], "326024.jpg": [430, 281, 488, 503], "252757.jpg": [426, 200, 636, 638], "308847.jpg": [398, 208, 877, 675], "461269.jpg": [516, 310, 559, 391], "221565.jpg": [207, 136, 378, 642], "236471.jpg": [212, 171, 792, 1018], "256545.jpg": [292, 237, 541, 1005], "408054.jpg": [355, 218, 592, 531], "326702.jpg": [174, 217, 685, 1022], "421091.jpg": [388, 230, 631, 507], "220657.jpg": [489, 253, 600, 398], "474645.jpg": [425, 649, 536, 852], "312927.jpg": [132, 230, 486, 951], "239043.jpg": [105, 611, 255, 942], "422300.jpg": [391, 282, 576, 656], "450416.jpg": [424, 127, 974, 678], "302589.jpg": [287, 209, 545, 595], "319035.jpg": [442, 310, 612, 453], "271043.jpg": [217, 394, 539, 930], "280629.jpg": [552, 272, 1000, 693], "322436.jpg": [444, 258, 489, 331], "222369.jpg": [176, 166, 426, 870], "442086.jpg": [242, 127, 656, 681], "460735.jpg": [657, 396, 730, 654], "297356.jpg": [638, 345, 728, 609], "225081.jpg": [434, 285, 518, 461], "297060.jpg": [728, 137, 802, 273], "274268.jpg": [647, 323, 671, 369], "430665.jpg": [519, 342, 552, 393], "253298.jpg": [296, 56, 822, 988], "422680.jpg": [418, 306, 640, 518], "250810.jpg": [352, 214, 548, 489], "301383.jpg": [402, 272, 677, 672], "238026.jpg": [311, 30, 639, 575], "397234.jpg": [364, 292, 441, 559], "392682.jpg": [429, 189, 557, 577], "304985.jpg": [436, 154, 700, 557], "220696.jpg": [333, 177, 823, 676], "300267.jpg": [547, 330, 629, 490], "238121.jpg": [523, 157, 919, 818], "280182.jpg": [270, 236, 788, 681], "398593.jpg": [426, 172, 581, 619], "404720.jpg": [263, 294, 370, 487], "301222.jpg": [512, 188, 568, 271], "220093.jpg": [303, 202, 802, 681], "279253.jpg": [693, 332, 843, 720], "232685.jpg": [441, 207, 711, 658], "398241.jpg": [347, 238, 471, 582], "321295.jpg": [157, 48, 728, 670], "235488.jpg": [635, 487, 821, 958], "301434.jpg": [343, 152, 802, 664], "476048.jpg": [302, 260, 581, 618], "414629.jpg": [488, 42, 730, 579], "425746.jpg": [94, 351, 361, 738], "334710.jpg": [677, 309, 759, 437], "293944.jpg": [683, 480, 798, 616], "239048.jpg": [466, 221, 800, 666], "235709.jpg": [333, 232, 747, 679], "260652.jpg": [304, 174, 406, 395], "421552.jpg": [416, 203, 501, 446], "404038.jpg": [170, 300, 456, 985], "222960.jpg": [259, 301, 519, 1005], "221273.jpg": [429, 327, 692, 704], "325978.jpg": [386, 676, 529, 910], "430791.jpg": [329, 259, 630, 632], "236564.jpg": [360, 254, 622, 675], "222971.jpg": [474, 158, 689, 641], "492802.jpg": [252, 211, 412, 624], "322539.jpg": [361, 288, 470, 471], "298669.jpg": [663, 336, 754, 551], "302503.jpg": [320, 377, 421, 594], "252728.jpg": [130, 362, 165, 424], "238037.jpg": [403, 295, 558, 565], "220921.jpg": [586, 256, 966, 719], "400379.jpg": [239, 253, 552, 448], "496658.jpg": [339, 230, 554, 675], "481039.jpg": [454, 162, 554, 435], "437031.jpg": [173, 164, 275, 427], "315080.jpg": [162, 434, 537, 963], "432318.jpg": [250, 277, 481, 673], "253638.jpg": [557, 140, 995, 766], "315184.jpg": [408, 371, 653, 651], "496800.jpg": [353, 129, 879, 670], "396840.jpg": [194, 227, 487, 601], "310400.jpg": [500, 458, 538, 526], "458525.jpg": [441, 243, 723, 673], "308640.jpg": [344, 138, 515, 351], "221431.jpg": [200, 475, 500, 1022], "450782.jpg": [557, 143, 729, 498], "251686.jpg": [487, 496, 867, 767], "461035.jpg": [189, 199, 501, 984], "297800.jpg": [471, 270, 671, 610], "349220.jpg": [117, 303, 386, 640], "223577.jpg": [753, 429, 810, 497], "214577.jpg": [323, 169, 684, 663], "311683.jpg": [155, 315, 322, 494], "254618.jpg": [20, 64, 609, 890], "223122.jpg": [348, 341, 391, 395], "230534.jpg": [405, 139, 539, 479], "479477.jpg": [499, 135, 598, 393], "269391.jpg": [344, 180, 573, 681], "231196.jpg": [532, 382, 593, 537], "287921.jpg": [200, 293, 298, 564], "236002.jpg": [356, 120, 661, 670], "230045.jpg": [1, 38, 579, 640], "306613.jpg": [268, 873, 320, 955], "321109.jpg": [325, 40, 535, 237], "397718.jpg": [382, 612, 446, 739], "304723.jpg": [621, 122, 940, 682], "238216.jpg": [446, 52, 644, 220], "433340.jpg": [295, 362, 450, 630], "327901.jpg": [443, 671, 486, 754], "323116.jpg": [31, 166, 794, 804], "232589.jpg": [702, 390, 801, 603], "275186.jpg": [209, 234, 700, 680], "304051.jpg": [368, 179, 625, 594], "297406.jpg": [142, 488, 252, 676], "286837.jpg": [382, 113, 537, 257], "422156.jpg": [655, 291, 752, 528], "394385.jpg": [343, 224, 598, 665], "300492.jpg": [384, 254, 461, 433]}
--------------------------------------------------------------------------------
/human_bboxes/GAICD/human_data_split.json:
--------------------------------------------------------------------------------
1 | {"train": ["235093.jpg", "308233.jpg", "302811.jpg", "222577.jpg", "251647.jpg", "239760.jpg", "252770.jpg", "236839.jpg", "236058.jpg", "312262.jpg", "222291.jpg", "327672.jpg", "279678.jpg", "322484.jpg", "290758.jpg", "223848.jpg", "252783.jpg", "497266.jpg", "318237.jpg", "316695.jpg", "340515.jpg", "394757.jpg", "253230.jpg", "220320.jpg", "338230.jpg", "222876.jpg", "301252.jpg", "250903.jpg", "308728.jpg", "332764.jpg", "325204.jpg", "322858.jpg", "328200.jpg", "239121.jpg", "239085.jpg", "233725.jpg", "221320.jpg", "420325.jpg", "324259.jpg", "233858.jpg", "304558.jpg", "235992.jpg", "272046.jpg", "250900.jpg", "250455.jpg", "340700.jpg", "222610.jpg", "418314.jpg", "402584.jpg", "420036.jpg", "300434.jpg", "287292.jpg", "250666.jpg", "231352.jpg", "252970.jpg", "235792.jpg", "280832.jpg", "238669.jpg", "214216.jpg", "403759.jpg", "304073.jpg", "313343.jpg", "465230.jpg", "309802.jpg", "408500.jpg", "264748.jpg", "300747.jpg", "311046.jpg", "401408.jpg", "255513.jpg", "310154.jpg", "340362.jpg", "234280.jpg", "470220.jpg", "314884.jpg", "402802.jpg", "232942.jpg", "221336.jpg", "498390.jpg", "250688.jpg", "315782.jpg", "314112.jpg", "238319.jpg", "230363.jpg", "308853.jpg", "230083.jpg", "237943.jpg", "497828.jpg", "320875.jpg", "301780.jpg", "292117.jpg", "300725.jpg", "221085.jpg", "474276.jpg", "395279.jpg", "220243.jpg", "270591.jpg", "315480.jpg", "427629.jpg", "221342.jpg", "409113.jpg", "239054.jpg", "390936.jpg", "271562.jpg", "235683.jpg", "394992.jpg", "220746.jpg", "250211.jpg", "433761.jpg", "222638.jpg", "314264.jpg", "322605.jpg", "490585.jpg", "481177.jpg", "421576.jpg", "461460.jpg", "234659.jpg", "420218.jpg", "255991.jpg", "302089.jpg", "222437.jpg", "291358.jpg", "278891.jpg", "270801.jpg", "221627.jpg", "326310.jpg", "253510.jpg", "276367.jpg", "300948.jpg", "341252.jpg", "225125.jpg", "346630.jpg", "390822.jpg", "317475.jpg", "328459.jpg", "347246.jpg", "300663.jpg", "347746.jpg", "237344.jpg", "230448.jpg", "300527.jpg", "250202.jpg", "343339.jpg", "323192.jpg", "279759.jpg", "282044.jpg", "287432.jpg", "490502.jpg", "280712.jpg", "310159.jpg", "221860.jpg", "323004.jpg", "231537.jpg", "236412.jpg", "491747.jpg", "328506.jpg", "300282.jpg", "267475.jpg", "398019.jpg", "466370.jpg", "421089.jpg", "221365.jpg", "426180.jpg", "237874.jpg", "239817.jpg", "312918.jpg", "250100.jpg", "304050.jpg", "328421.jpg", "492526.jpg", "301982.jpg", "253460.jpg", "317987.jpg", "472556.jpg", "322280.jpg", "222027.jpg", "496749.jpg", "233027.jpg", "239790.jpg", "312187.jpg", "446979.jpg", "220404.jpg", "300122.jpg", "295472.jpg", "326024.jpg", "252757.jpg", "308847.jpg", "461269.jpg", "236471.jpg", "256545.jpg", "326702.jpg", "421091.jpg", "220657.jpg", "474645.jpg", "312927.jpg", "422300.jpg", "450416.jpg", "302589.jpg", "319035.jpg", "271043.jpg", "280629.jpg", "322436.jpg", "222369.jpg", "442086.jpg", "460735.jpg", "297356.jpg", "297060.jpg", "274268.jpg", "253298.jpg", "250810.jpg", "397234.jpg", "392682.jpg", "220696.jpg", "300267.jpg", "280182.jpg", "398593.jpg", "404720.jpg", "301222.jpg", "220093.jpg", "279253.jpg", "232685.jpg", "398241.jpg", "321295.jpg", "235488.jpg", "301434.jpg", "476048.jpg", "414629.jpg", "425746.jpg", "334710.jpg", "293944.jpg", "239048.jpg", "235709.jpg", "260652.jpg", "421552.jpg", "404038.jpg", "222960.jpg", "325978.jpg", "430791.jpg", "236564.jpg", "492802.jpg", "322539.jpg", "298669.jpg", "302503.jpg", "252728.jpg", "238037.jpg", "400379.jpg", "496658.jpg", "481039.jpg", "437031.jpg", "315080.jpg", "432318.jpg", "253638.jpg", "315184.jpg", "496800.jpg", "396840.jpg", "310400.jpg", "458525.jpg", "308640.jpg", "221431.jpg", "450782.jpg", "251686.jpg", "461035.jpg", "297800.jpg", "349220.jpg", "223577.jpg", "214577.jpg", "311683.jpg", "254618.jpg", "230534.jpg", "479477.jpg", "269391.jpg", "231196.jpg", "287921.jpg", "236002.jpg", "306613.jpg", "321109.jpg", "397718.jpg", "304723.jpg", "238216.jpg", "433340.jpg", "327901.jpg", "323116.jpg", "232589.jpg", "275186.jpg", "304051.jpg", "286837.jpg", "422156.jpg", "394385.jpg", "300492.jpg"], "test": ["278927.jpg", "308394.jpg", "490605.jpg", "323138.jpg", "411610.jpg", "315248.jpg", "302166.jpg", "431149.jpg", "232700.jpg", "333393.jpg", "220865.jpg", "306149.jpg", "220207.jpg", "236546.jpg", "237845.jpg", "315966.jpg", "236401.jpg", "324123.jpg", "233513.jpg", "225742.jpg", "225483.jpg", "290810.jpg", "217386.jpg", "225566.jpg", "409964.jpg", "341906.jpg", "239028.jpg", "490465.jpg", "251928.jpg", "322699.jpg", "281726.jpg", "310643.jpg", "300075.jpg", "322674.jpg", "221565.jpg", "408054.jpg", "239043.jpg", "225081.jpg", "430665.jpg", "422680.jpg", "301383.jpg", "238026.jpg", "304985.jpg", "238121.jpg", "221273.jpg", "222971.jpg", "220921.jpg", "223122.jpg", "230045.jpg", "297406.jpg"]}
--------------------------------------------------------------------------------
/human_bboxes/GAICD/original_data_split.json:
--------------------------------------------------------------------------------
1 | {"train": ["235093.jpg", "410011.jpg", "308233.jpg", "302811.jpg", "221188.jpg", "238812.jpg", "456435.jpg", "292367.jpg", "347287.jpg", "222577.jpg", "300453.jpg", "236045.jpg", "222279.jpg", "423854.jpg", "239386.jpg", "324556.jpg", "230626.jpg", "494562.jpg", "310450.jpg", "293136.jpg", "251647.jpg", "241757.jpg", "239760.jpg", "451951.jpg", "286817.jpg", "251097.jpg", "230664.jpg", "276336.jpg", "319215.jpg", "252770.jpg", "310589.jpg", "221702.jpg", "300542.jpg", "290214.jpg", "220259.jpg", "236839.jpg", "420163.jpg", "300124.jpg", "230392.jpg", "287610.jpg", "330874.jpg", "232290.jpg", "293703.jpg", "236058.jpg", "221385.jpg", "293319.jpg", "326566.jpg", "288814.jpg", "250104.jpg", "300954.jpg", "241992.jpg", "253882.jpg", "228532.jpg", "219148.jpg", "297905.jpg", "251645.jpg", "309079.jpg", "281804.jpg", "217870.jpg", "233521.jpg", "421129.jpg", "230996.jpg", "271852.jpg", "300524.jpg", "300801.jpg", "257122.jpg", "405143.jpg", "252625.jpg", "476067.jpg", "275867.jpg", "312262.jpg", "272939.jpg", "222291.jpg", "237450.jpg", "291161.jpg", "222174.jpg", "327672.jpg", "279678.jpg", "293884.jpg", "220334.jpg", "342255.jpg", "313247.jpg", "273377.jpg", "289862.jpg", "260145.jpg", "322484.jpg", "300938.jpg", "499425.jpg", "221663.jpg", "242400.jpg", "391463.jpg", "222599.jpg", "250412.jpg", "255857.jpg", "285137.jpg", "290758.jpg", "279786.jpg", "310137.jpg", "252392.jpg", "395823.jpg", "223848.jpg", "221999.jpg", "252783.jpg", "497266.jpg", "428082.jpg", "318237.jpg", "295600.jpg", "316695.jpg", "340515.jpg", "231784.jpg", "236356.jpg", "224492.jpg", "238671.jpg", "394757.jpg", "319891.jpg", "327618.jpg", "253230.jpg", "301603.jpg", "301964.jpg", "281230.jpg", "220320.jpg", "338230.jpg", "324005.jpg", "222876.jpg", "270979.jpg", "300383.jpg", "301252.jpg", "321167.jpg", "244605.jpg", "331545.jpg", "250903.jpg", "398671.jpg", "220203.jpg", "282236.jpg", "279181.jpg", "225099.jpg", "308728.jpg", "332764.jpg", "440899.jpg", "491050.jpg", "254271.jpg", "325204.jpg", "260539.jpg", "291305.jpg", "331806.jpg", "421678.jpg", "244135.jpg", "222149.jpg", "291451.jpg", "330064.jpg", "329656.jpg", "349620.jpg", "253426.jpg", "322858.jpg", "291531.jpg", "427008.jpg", "225032.jpg", "235945.jpg", "328200.jpg", "211292.jpg", "231019.jpg", "237149.jpg", "300085.jpg", "411445.jpg", "314950.jpg", "304894.jpg", "309411.jpg", "250386.jpg", "239121.jpg", "331871.jpg", "239085.jpg", "238819.jpg", "323291.jpg", "261976.jpg", "250138.jpg", "450670.jpg", "284219.jpg", "292644.jpg", "233831.jpg", "233725.jpg", "422678.jpg", "232649.jpg", "221320.jpg", "420325.jpg", "250586.jpg", "324259.jpg", "302041.jpg", "329811.jpg", "297282.jpg", "442399.jpg", "219383.jpg", "280063.jpg", "252743.jpg", "295908.jpg", "252668.jpg", "279342.jpg", "419975.jpg", "295969.jpg", "233858.jpg", "267533.jpg", "275796.jpg", "280201.jpg", "300183.jpg", "349241.jpg", "304558.jpg", "235992.jpg", "291416.jpg", "246606.jpg", "236379.jpg", "239010.jpg", "272046.jpg", "250900.jpg", "332819.jpg", "313418.jpg", "221726.jpg", "340539.jpg", "390639.jpg", "250455.jpg", "340700.jpg", "453469.jpg", "280259.jpg", "466727.jpg", "440109.jpg", "222610.jpg", "418314.jpg", "267457.jpg", "211085.jpg", "321865.jpg", "230706.jpg", "402584.jpg", "245860.jpg", "233377.jpg", "420036.jpg", "340550.jpg", "290964.jpg", "291329.jpg", "300434.jpg", "287292.jpg", "428424.jpg", "419392.jpg", "287825.jpg", "248312.jpg", "325296.jpg", "346192.jpg", "250666.jpg", "394752.jpg", "314580.jpg", "235074.jpg", "303882.jpg", "302140.jpg", "420038.jpg", "220660.jpg", "414251.jpg", "306667.jpg", "407866.jpg", "231352.jpg", "221159.jpg", "455373.jpg", "304656.jpg", "240126.jpg", "224102.jpg", "252970.jpg", "333107.jpg", "261666.jpg", "292336.jpg", "235792.jpg", "271121.jpg", "259549.jpg", "223385.jpg", "494896.jpg", "324866.jpg", "430571.jpg", "470553.jpg", "323027.jpg", "261815.jpg", "342287.jpg", "238108.jpg", "306341.jpg", "280832.jpg", "236976.jpg", "238669.jpg", "302825.jpg", "224524.jpg", "349663.jpg", "253501.jpg", "214216.jpg", "403759.jpg", "237097.jpg", "304073.jpg", "216140.jpg", "313343.jpg", "420262.jpg", "317798.jpg", "271000.jpg", "465230.jpg", "320113.jpg", "235469.jpg", "309802.jpg", "408500.jpg", "322548.jpg", "272216.jpg", "474409.jpg", "262196.jpg", "264748.jpg", "342776.jpg", "286014.jpg", "318402.jpg", "314509.jpg", "239910.jpg", "323466.jpg", "236768.jpg", "302473.jpg", "327375.jpg", "396097.jpg", "300747.jpg", "311046.jpg", "277710.jpg", "401408.jpg", "347436.jpg", "255513.jpg", "285976.jpg", "310154.jpg", "309852.jpg", "280363.jpg", "322199.jpg", "238265.jpg", "260733.jpg", "230508.jpg", "422155.jpg", "307135.jpg", "340362.jpg", "320863.jpg", "234280.jpg", "276749.jpg", "234811.jpg", "241573.jpg", "271665.jpg", "424852.jpg", "470220.jpg", "398452.jpg", "303861.jpg", "287484.jpg", "283685.jpg", "314884.jpg", "244981.jpg", "322411.jpg", "444040.jpg", "402802.jpg", "214992.jpg", "232942.jpg", "302339.jpg", "221336.jpg", "498390.jpg", "250688.jpg", "315782.jpg", "329554.jpg", "314112.jpg", "270752.jpg", "242925.jpg", "252693.jpg", "254325.jpg", "396842.jpg", "451428.jpg", "238319.jpg", "220880.jpg", "458891.jpg", "325075.jpg", "222666.jpg", "230363.jpg", "238533.jpg", "321412.jpg", "323149.jpg", "308853.jpg", "303670.jpg", "230009.jpg", "260917.jpg", "230083.jpg", "237943.jpg", "497828.jpg", "320875.jpg", "278969.jpg", "301780.jpg", "236040.jpg", "214018.jpg", "292117.jpg", "300725.jpg", "309654.jpg", "302853.jpg", "343526.jpg", "233070.jpg", "221085.jpg", "474276.jpg", "490673.jpg", "395279.jpg", "400807.jpg", "220243.jpg", "270591.jpg", "280591.jpg", "261545.jpg", "315480.jpg", "441946.jpg", "427629.jpg", "213521.jpg", "239954.jpg", "291898.jpg", "247787.jpg", "270755.jpg", "221342.jpg", "238275.jpg", "409113.jpg", "239054.jpg", "305327.jpg", "390936.jpg", "273966.jpg", "494884.jpg", "286788.jpg", "219270.jpg", "213181.jpg", "442238.jpg", "271562.jpg", "235683.jpg", "394992.jpg", "260853.jpg", "260560.jpg", "266900.jpg", "292348.jpg", "398150.jpg", "250885.jpg", "397972.jpg", "220746.jpg", "222991.jpg", "250211.jpg", "433761.jpg", "222638.jpg", "254409.jpg", "314264.jpg", "322605.jpg", "340480.jpg", "213732.jpg", "331584.jpg", "490585.jpg", "300020.jpg", "241735.jpg", "255578.jpg", "293260.jpg", "347063.jpg", "280932.jpg", "481177.jpg", "421576.jpg", "483935.jpg", "238029.jpg", "294677.jpg", "224194.jpg", "461460.jpg", "315286.jpg", "410870.jpg", "234659.jpg", "445389.jpg", "300787.jpg", "254981.jpg", "420218.jpg", "255991.jpg", "408991.jpg", "302089.jpg", "292401.jpg", "222437.jpg", "230156.jpg", "261341.jpg", "244984.jpg", "280217.jpg", "278778.jpg", "344269.jpg", "291358.jpg", "278891.jpg", "270801.jpg", "338570.jpg", "304242.jpg", "247089.jpg", "221627.jpg", "265480.jpg", "326310.jpg", "241703.jpg", "253510.jpg", "239470.jpg", "485300.jpg", "452054.jpg", "301377.jpg", "276367.jpg", "300948.jpg", "341252.jpg", "236371.jpg", "225125.jpg", "253703.jpg", "325662.jpg", "346630.jpg", "260759.jpg", "239741.jpg", "297476.jpg", "298495.jpg", "300045.jpg", "423661.jpg", "390822.jpg", "395984.jpg", "281134.jpg", "301505.jpg", "279758.jpg", "225386.jpg", "328970.jpg", "317475.jpg", "328459.jpg", "300643.jpg", "347246.jpg", "310496.jpg", "456289.jpg", "403270.jpg", "300663.jpg", "408265.jpg", "347746.jpg", "237344.jpg", "230448.jpg", "219048.jpg", "311919.jpg", "235892.jpg", "330260.jpg", "300527.jpg", "220357.jpg", "250202.jpg", "343339.jpg", "292483.jpg", "300942.jpg", "323192.jpg", "254102.jpg", "416663.jpg", "279759.jpg", "293740.jpg", "271933.jpg", "282044.jpg", "238811.jpg", "302572.jpg", "326186.jpg", "300405.jpg", "398757.jpg", "238471.jpg", "220046.jpg", "405461.jpg", "492659.jpg", "222035.jpg", "231743.jpg", "292111.jpg", "241289.jpg", "222391.jpg", "230908.jpg", "287432.jpg", "343345.jpg", "323649.jpg", "490502.jpg", "302737.jpg", "213643.jpg", "280712.jpg", "484100.jpg", "310159.jpg", "263961.jpg", "221860.jpg", "400120.jpg", "323004.jpg", "231537.jpg", "238494.jpg", "236412.jpg", "395763.jpg", "292421.jpg", "321246.jpg", "491747.jpg", "326015.jpg", "328506.jpg", "262876.jpg", "247557.jpg", "300282.jpg", "328440.jpg", "246858.jpg", "232571.jpg", "403158.jpg", "267475.jpg", "323040.jpg", "312952.jpg", "291379.jpg", "443456.jpg", "298650.jpg", "398019.jpg", "274245.jpg", "297603.jpg", "406346.jpg", "279871.jpg", "318726.jpg", "466370.jpg", "421089.jpg", "239210.jpg", "420090.jpg", "221365.jpg", "317108.jpg", "220885.jpg", "424414.jpg", "263520.jpg", "496742.jpg", "243968.jpg", "302974.jpg", "308305.jpg", "343371.jpg", "274271.jpg", "268274.jpg", "218821.jpg", "459372.jpg", "261487.jpg", "426180.jpg", "249609.jpg", "237874.jpg", "327830.jpg", "239817.jpg", "312918.jpg", "304617.jpg", "224458.jpg", "250100.jpg", "304050.jpg", "318963.jpg", "253956.jpg", "328421.jpg", "330333.jpg", "307968.jpg", "244060.jpg", "230050.jpg", "492526.jpg", "300855.jpg", "326137.jpg", "301982.jpg", "414735.jpg", "211958.jpg", "253460.jpg", "242541.jpg", "317987.jpg", "300497.jpg", "472556.jpg", "294739.jpg", "221804.jpg", "280867.jpg", "245786.jpg", "322280.jpg", "250893.jpg", "300414.jpg", "329742.jpg", "270768.jpg", "432945.jpg", "407814.jpg", "290687.jpg", "216118.jpg", "240098.jpg", "240828.jpg", "222027.jpg", "496749.jpg", "427088.jpg", "233027.jpg", "301963.jpg", "274007.jpg", "239790.jpg", "247809.jpg", "439094.jpg", "312187.jpg", "286910.jpg", "446979.jpg", "220404.jpg", "304870.jpg", "237498.jpg", "300122.jpg", "243316.jpg", "293189.jpg", "330670.jpg", "295472.jpg", "326024.jpg", "236432.jpg", "244834.jpg", "252757.jpg", "308847.jpg", "292525.jpg", "252318.jpg", "247463.jpg", "461269.jpg", "224316.jpg", "267227.jpg", "300151.jpg", "309345.jpg", "415443.jpg", "243997.jpg", "236471.jpg", "256545.jpg", "301185.jpg", "291048.jpg", "300370.jpg", "410411.jpg", "293774.jpg", "255871.jpg", "224897.jpg", "326702.jpg", "289828.jpg", "423564.jpg", "320189.jpg", "323005.jpg", "290678.jpg", "278775.jpg", "399111.jpg", "421091.jpg", "245410.jpg", "349936.jpg", "265340.jpg", "332514.jpg", "220657.jpg", "220262.jpg", "474645.jpg", "340145.jpg", "312927.jpg", "308342.jpg", "245653.jpg", "293625.jpg", "422300.jpg", "450416.jpg", "302589.jpg", "344711.jpg", "224323.jpg", "319035.jpg", "238614.jpg", "230352.jpg", "271043.jpg", "280629.jpg", "252834.jpg", "218317.jpg", "322436.jpg", "424014.jpg", "301160.jpg", "308205.jpg", "247014.jpg", "399666.jpg", "221966.jpg", "263603.jpg", "222369.jpg", "218572.jpg", "252769.jpg", "442086.jpg", "460735.jpg", "297356.jpg", "296018.jpg", "287942.jpg", "297060.jpg", "274268.jpg", "252849.jpg", "317117.jpg", "309645.jpg", "253298.jpg", "393464.jpg", "320380.jpg", "279444.jpg", "476093.jpg", "250810.jpg", "410018.jpg", "255852.jpg", "397234.jpg", "392682.jpg", "220871.jpg", "220696.jpg", "279455.jpg", "398220.jpg", "457118.jpg", "220883.jpg", "300267.jpg", "497588.jpg", "255774.jpg", "280182.jpg", "398593.jpg", "404720.jpg", "422453.jpg", "400109.jpg", "301222.jpg", "262635.jpg", "274201.jpg", "306440.jpg", "220093.jpg", "279253.jpg", "232685.jpg", "398241.jpg", "290828.jpg", "321295.jpg", "329770.jpg", "271098.jpg", "280056.jpg", "239192.jpg", "235488.jpg", "464490.jpg", "274661.jpg", "420164.jpg", "220152.jpg", "456935.jpg", "289318.jpg", "442212.jpg", "347960.jpg", "318628.jpg", "236038.jpg", "323532.jpg", "396456.jpg", "232129.jpg", "220390.jpg", "220903.jpg", "301434.jpg", "260316.jpg", "308456.jpg", "470237.jpg", "482643.jpg", "324033.jpg", "476048.jpg", "235118.jpg", "414629.jpg", "398789.jpg", "440278.jpg", "425746.jpg", "334710.jpg", "217419.jpg", "490491.jpg", "278791.jpg", "293944.jpg", "239048.jpg", "253351.jpg", "236580.jpg", "307726.jpg", "246352.jpg", "238755.jpg", "326664.jpg", "235709.jpg", "415717.jpg", "267378.jpg", "472073.jpg", "260652.jpg", "421552.jpg", "489173.jpg", "296669.jpg", "304356.jpg", "262795.jpg", "219927.jpg", "236921.jpg", "291640.jpg", "404038.jpg", "230446.jpg", "222960.jpg", "390545.jpg", "296703.jpg", "277952.jpg", "230761.jpg", "325978.jpg", "301935.jpg", "430791.jpg", "296607.jpg", "244561.jpg", "276444.jpg", "328434.jpg", "243982.jpg", "342556.jpg", "253633.jpg", "251851.jpg", "236564.jpg", "260575.jpg", "212453.jpg", "395970.jpg", "238116.jpg", "492802.jpg", "224629.jpg", "309173.jpg", "461160.jpg", "348539.jpg", "222464.jpg", "323370.jpg", "252408.jpg", "322539.jpg", "298669.jpg", "421591.jpg", "302503.jpg", "252728.jpg", "322831.jpg", "238037.jpg", "477331.jpg", "400379.jpg", "496658.jpg", "481039.jpg", "316774.jpg", "221970.jpg", "493063.jpg", "243737.jpg", "237758.jpg", "291607.jpg", "437031.jpg", "442241.jpg", "220098.jpg", "420507.jpg", "241575.jpg", "233370.jpg", "309804.jpg", "300017.jpg", "315080.jpg", "274229.jpg", "395336.jpg", "421514.jpg", "220123.jpg", "285091.jpg", "324574.jpg", "398513.jpg", "257834.jpg", "303563.jpg", "300247.jpg", "432318.jpg", "323518.jpg", "253638.jpg", "321267.jpg", "223110.jpg", "300463.jpg", "315184.jpg", "231697.jpg", "496800.jpg", "396840.jpg", "310400.jpg", "430160.jpg", "458525.jpg", "252579.jpg", "325264.jpg", "301208.jpg", "233791.jpg", "235229.jpg", "250453.jpg", "214567.jpg", "314590.jpg", "275012.jpg", "308640.jpg", "275524.jpg", "331468.jpg", "221431.jpg", "296842.jpg", "450782.jpg", "251686.jpg", "295238.jpg", "252646.jpg", "252789.jpg", "461035.jpg", "297800.jpg", "349220.jpg", "307156.jpg", "223577.jpg", "331504.jpg", "214577.jpg", "311683.jpg", "230758.jpg", "274223.jpg", "291929.jpg", "309778.jpg", "463742.jpg", "254618.jpg", "306715.jpg", "450155.jpg", "230271.jpg", "230534.jpg", "479477.jpg", "495706.jpg", "446317.jpg", "491461.jpg", "230778.jpg", "237898.jpg", "238294.jpg", "416191.jpg", "221422.jpg", "242576.jpg", "326231.jpg", "234796.jpg", "345329.jpg", "269391.jpg", "231196.jpg", "240680.jpg", "315329.jpg", "287921.jpg", "309374.jpg", "236002.jpg", "300745.jpg", "272745.jpg", "411440.jpg", "315581.jpg", "303099.jpg", "300588.jpg", "306613.jpg", "321109.jpg", "349287.jpg", "291686.jpg", "236648.jpg", "268346.jpg", "397718.jpg", "304723.jpg", "238216.jpg", "433340.jpg", "327901.jpg", "323116.jpg", "341039.jpg", "264614.jpg", "456365.jpg", "232589.jpg", "256341.jpg", "275186.jpg", "267422.jpg", "304051.jpg", "254410.jpg", "329564.jpg", "274983.jpg", "236740.jpg", "439648.jpg", "250233.jpg", "412646.jpg", "251731.jpg", "320523.jpg", "323326.jpg", "248807.jpg", "313690.jpg", "215417.jpg", "296888.jpg", "230690.jpg", "422000.jpg", "286837.jpg", "422156.jpg", "236420.jpg", "395686.jpg", "286843.jpg", "232565.jpg", "296913.jpg", "411134.jpg", "394385.jpg", "433378.jpg", "278213.jpg", "252709.jpg", "300492.jpg"], "test": ["252647.jpg", "266473.jpg", "213822.jpg", "300829.jpg", "286151.jpg", "278927.jpg", "257830.jpg", "291083.jpg", "308394.jpg", "490605.jpg", "338509.jpg", "323138.jpg", "211144.jpg", "411610.jpg", "315248.jpg", "280236.jpg", "302166.jpg", "220989.jpg", "321720.jpg", "248065.jpg", "309007.jpg", "320949.jpg", "330360.jpg", "431149.jpg", "244696.jpg", "232700.jpg", "333393.jpg", "320452.jpg", "220865.jpg", "323737.jpg", "292284.jpg", "401076.jpg", "216186.jpg", "233918.jpg", "244757.jpg", "300090.jpg", "306149.jpg", "246204.jpg", "223832.jpg", "325119.jpg", "261821.jpg", "233733.jpg", "309683.jpg", "220207.jpg", "238842.jpg", "239194.jpg", "300380.jpg", "245398.jpg", "236546.jpg", "232826.jpg", "264035.jpg", "297575.jpg", "270415.jpg", "237845.jpg", "233781.jpg", "231228.jpg", "349906.jpg", "239473.jpg", "315966.jpg", "275811.jpg", "251109.jpg", "236401.jpg", "324123.jpg", "236941.jpg", "297012.jpg", "233513.jpg", "235016.jpg", "225742.jpg", "225483.jpg", "230379.jpg", "239027.jpg", "323970.jpg", "489880.jpg", "273741.jpg", "323166.jpg", "239879.jpg", "290810.jpg", "420461.jpg", "252867.jpg", "271557.jpg", "261259.jpg", "431796.jpg", "318317.jpg", "247949.jpg", "287172.jpg", "306179.jpg", "265813.jpg", "255381.jpg", "277497.jpg", "410268.jpg", "294786.jpg", "253929.jpg", "419677.jpg", "212000.jpg", "217386.jpg", "281659.jpg", "225566.jpg", "409964.jpg", "288019.jpg", "274076.jpg", "215679.jpg", "391156.jpg", "341906.jpg", "293838.jpg", "239028.jpg", "219695.jpg", "238207.jpg", "440804.jpg", "318224.jpg", "213811.jpg", "238113.jpg", "212042.jpg", "490465.jpg", "251928.jpg", "318513.jpg", "322699.jpg", "245110.jpg", "398292.jpg", "236626.jpg", "318776.jpg", "281726.jpg", "237593.jpg", "442517.jpg", "310643.jpg", "445735.jpg", "300075.jpg", "322674.jpg", "428940.jpg", "260410.jpg", "277423.jpg", "239267.jpg", "300077.jpg", "261589.jpg", "221565.jpg", "408054.jpg", "297007.jpg", "329827.jpg", "291616.jpg", "294782.jpg", "250229.jpg", "239043.jpg", "282090.jpg", "321833.jpg", "394605.jpg", "487943.jpg", "218744.jpg", "225081.jpg", "244689.jpg", "280233.jpg", "430665.jpg", "422680.jpg", "323903.jpg", "301383.jpg", "238026.jpg", "317002.jpg", "304985.jpg", "273046.jpg", "238121.jpg", "309571.jpg", "289802.jpg", "310696.jpg", "230861.jpg", "459353.jpg", "219853.jpg", "237036.jpg", "322353.jpg", "266718.jpg", "221549.jpg", "280069.jpg", "234500.jpg", "221273.jpg", "239374.jpg", "222317.jpg", "222971.jpg", "390852.jpg", "277993.jpg", "220921.jpg", "218904.jpg", "414429.jpg", "261267.jpg", "334977.jpg", "300159.jpg", "400943.jpg", "337270.jpg", "332450.jpg", "256204.jpg", "318811.jpg", "223122.jpg", "263897.jpg", "321062.jpg", "210333.jpg", "284487.jpg", "325608.jpg", "230045.jpg", "231629.jpg", "297406.jpg", "309808.jpg", "245111.jpg", "222707.jpg", "290673.jpg"]}
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | einops==0.3.0
2 | lmdb==1.2.1
3 | matplotlib==3.2.2
4 | numpy==1.19.1
5 | opencv_python==4.5.4.60
6 | Pillow==9.2.0
7 | scipy==1.5.2
8 | setuptools==52.0.0.post20210125
9 | tensorboardX==2.5.1
10 | torch==1.9.1+cu111
11 | torchvision==0.10.1+cu111
12 | tqdm==4.51.0
13 |
--------------------------------------------------------------------------------
/test.py:
--------------------------------------------------------------------------------
1 | import os
2 | import sys
3 | import numpy as np
4 | from tensorboardX import SummaryWriter
5 | import torch
6 | import time
7 | import datetime
8 | import csv
9 | from tqdm import tqdm
10 | import shutil
11 | import pickle
12 | from scipy.stats import spearmanr, pearsonr
13 | import torch.backends.cudnn as cudnn
14 | import math
15 | import torchvision.transforms as transforms
16 | from PIL import Image
17 |
18 | from cropping_dataset import FCDBDataset, FLMSDataset, GAICDataset, generate_target_size_crop_mask
19 | from config_GAICD import cfg
20 | from cropping_model import HumanCentricCroppingModel
21 |
22 | device = torch.device('cuda:{}'.format(cfg.gpu_id))
23 | torch.cuda.set_device(cfg.gpu_id)
24 | IMAGE_NET_MEAN = [0.485, 0.456, 0.406]
25 | IMAGE_NET_STD = [0.229, 0.224, 0.225]
26 |
27 | def compute_acc(gt_scores, pr_scores):
28 | assert (len(gt_scores) == len(pr_scores)), '{} vs. {}'.format(len(gt_scores), len(pr_scores))
29 | sample_cnt = 0
30 | acc4_5 = [0 for i in range(4)]
31 | acc4_10 = [0 for i in range(4)]
32 | for i in range(len(gt_scores)):
33 | gts, preds = gt_scores[i], pr_scores[i]
34 | id_gt = sorted(range(len(gts)), key=lambda j : gts[j], reverse=True)
35 | id_pr = sorted(range(len(preds)), key=lambda j : preds[j], reverse=True)
36 | for k in range(4):
37 | temp_acc4_5 = 0.
38 | temp_acc4_10 = 0.
39 | for j in range(k+1):
40 | if gts[id_pr[j]] >= gts[id_gt[4]]:
41 | temp_acc4_5 += 1.0
42 | if gts[id_pr[j]] >= gts[id_gt[9]]:
43 | temp_acc4_10 += 1.0
44 | acc4_5[k] += (temp_acc4_5 / (k+1.0))
45 | acc4_10[k] += ((temp_acc4_10) / (k+1.0))
46 | sample_cnt += 1
47 | acc4_5 = [i / sample_cnt for i in acc4_5]
48 | acc4_10 = [i / sample_cnt for i in acc4_10]
49 | # print('acc4_5', acc4_5)
50 | # print('acc4_10', acc4_10)
51 | avg_acc4_5 = sum(acc4_5) / len(acc4_5)
52 | avg_acc4_10 = sum(acc4_10) / len(acc4_10)
53 | return avg_acc4_5, avg_acc4_10
54 |
55 | def compute_iou_and_disp(gt_crop, pre_crop, im_w, im_h):
56 | ''''
57 | :param gt_crop: [[x1,y1,x2,y2]]
58 | :param pre_crop: [[x1,y1,x2,y2]]
59 | :return:
60 | '''
61 | gt_crop = gt_crop[gt_crop[:,0] >= 0]
62 | zero_t = torch.zeros(gt_crop.shape[0])
63 | over_x1 = torch.maximum(gt_crop[:,0], pre_crop[:,0])
64 | over_y1 = torch.maximum(gt_crop[:,1], pre_crop[:,1])
65 | over_x2 = torch.minimum(gt_crop[:,2], pre_crop[:,2])
66 | over_y2 = torch.minimum(gt_crop[:,3], pre_crop[:,3])
67 | over_w = torch.maximum(zero_t, over_x2 - over_x1)
68 | over_h = torch.maximum(zero_t, over_y2 - over_y1)
69 | inter = over_w * over_h
70 | area1 = (gt_crop[:,2] - gt_crop[:,0]) * (gt_crop[:,3] - gt_crop[:,1])
71 | area2 = (pre_crop[:,2] - pre_crop[:,0]) * (pre_crop[:,3] - pre_crop[:,1])
72 | union = area1 + area2 - inter
73 | iou = inter / union
74 | disp = (torch.abs(gt_crop[:, 0] - pre_crop[:, 0]) + torch.abs(gt_crop[:, 2] - pre_crop[:, 2])) / im_w + \
75 | (torch.abs(gt_crop[:, 1] - pre_crop[:, 1]) + torch.abs(gt_crop[:, 3] - pre_crop[:, 3])) / im_h
76 | iou_idx = torch.argmax(iou, dim=-1)
77 | dis_idx = torch.argmin(disp, dim=-1)
78 | index = dis_idx if (iou[iou_idx] == iou[dis_idx]) else iou_idx
79 | return iou[index].item(), disp[index].item()
80 |
81 |
82 | def evaluate_on_GAICD(model, only_human=True):
83 | model.eval()
84 | print('='*5, 'Evaluating on GAICD dataset', '='*5)
85 | srcc_list = []
86 | gt_scores = []
87 | pr_scores = []
88 | count = 0
89 | test_dataset = GAICDataset(only_human_images=only_human,
90 | split='test',
91 | keep_aspect_ratio=cfg.keep_aspect_ratio)
92 | test_loader = torch.utils.data.DataLoader(
93 | test_dataset, batch_size=1,
94 | shuffle=False, num_workers=cfg.num_workers,
95 | drop_last=False)
96 | device = next(model.parameters()).device
97 | with torch.no_grad():
98 | for batch_idx, batch_data in enumerate(tqdm(test_loader)):
99 | im = batch_data[0].to(device)
100 | crop = batch_data[1].to(device)
101 | humanbox = batch_data[2].to(device)
102 | heat_map = batch_data[3]
103 | crop_mask = batch_data[4].to(device)
104 | part_mask = batch_data[5].to(device)
105 | scores = batch_data[6].reshape(-1).numpy().tolist()
106 | width = batch_data[7]
107 | height = batch_data[8]
108 | count += im.shape[0]
109 |
110 | part_feat, heat_map, pre_scores = model(im, crop, humanbox, crop_mask, part_mask)
111 | pre_scores = pre_scores.cpu().detach().numpy().reshape(-1)
112 | srcc_list.append(spearmanr(scores, pre_scores)[0])
113 | gt_scores.append(scores)
114 | pr_scores.append(pre_scores)
115 |
116 | srcc = sum(srcc_list) / len(srcc_list)
117 | acc5, acc10 = compute_acc(gt_scores, pr_scores)
118 | print('Test on GAICD {} images, SRCC={:.3f}, acc5={:.3f}, acc10={:.3f}'.format(
119 | count, srcc, acc5, acc10
120 | ))
121 | return srcc, acc5, acc10
122 |
123 | def get_pdefined_anchor():
124 | # get predefined boxes(x1, y1, x2, y2)
125 | pdefined_anchors = np.array(pickle.load(open(cfg.predefined_pkl, 'rb'), encoding='iso-8859-1')).astype(np.float32)
126 | print('num of pre-defined anchors: ', pdefined_anchors.shape)
127 | return pdefined_anchors
128 |
129 | def get_pdefined_anchor_v1(im_w, im_h):
130 | bins = 12.0
131 | step_h = im_h / bins
132 | step_w = im_w / bins
133 | pdefined_anchors = []
134 | for x1 in range(0,4):
135 | for y1 in range(0,4):
136 | for x2 in range(8,12):
137 | for y2 in range(8,12):
138 | if (x2 - x1) * (y2 - y1) > 0.4999 * bins * bins and (y2 - y1) * step_w / (
139 | x2 - x1) / step_h > 0.5 and (y2 - y1) * step_w / (x2 - x1) / step_h < 2.0:
140 | x1 = float(step_h*(0.5+x1)) / im_w
141 | y1 = float(step_w*(0.5+y1)) / im_h
142 | x2 = float(step_h * (0.5 + x2)) / im_w
143 | y2 = float(step_w*(0.5+y2)) / im_h
144 | pdefined_anchors.append([x1, y1, x2, y2])
145 | pdefined_anchors = np.array(pdefined_anchors).reshape(-1,4)
146 | print('num of pre-defined anchors: ', pdefined_anchors.shape)
147 | return pdefined_anchors
148 |
149 | def evaluate_on_FCDB_and_FLMS(model, dataset='both', only_human=True):
150 | from config_CPC import cfg
151 | model.eval()
152 | device = next(model.parameters()).device
153 | pdefined_anchors = get_pdefined_anchor() # n,4, (x1,y1,x2,y2)
154 | accum_disp = 0
155 | accum_iou = 0
156 | crop_cnt = 0
157 | alpha = 0.75
158 | alpha_cnt = 0
159 | cnt = 0
160 |
161 | print('=' * 5, 'Evaluating on FCDB&FLMS', '=' * 5)
162 | with torch.no_grad():
163 | if dataset == 'FCDB':
164 | test_set = [FCDBDataset]
165 | elif dataset == 'FLMS':
166 | test_set = [FLMSDataset]
167 | else:
168 | test_set = [FCDBDataset,FLMSDataset]
169 | for dataset in test_set:
170 | test_dataset= dataset(only_human_images=only_human,
171 | keep_aspect_ratio=cfg.keep_aspect_ratio)
172 | test_loader = torch.utils.data.DataLoader(test_dataset, batch_size=1,
173 | shuffle=False, num_workers=cfg.num_workers,
174 | drop_last=False)
175 | for batch_idx, batch_data in enumerate(tqdm(test_loader)):
176 | im = batch_data[0].to(device)
177 | gt_crop = batch_data[1]
178 | part_mask = batch_data[2].to(device)
179 | width = batch_data[3].item()
180 | height = batch_data[4].item()
181 |
182 | crop = np.zeros((len(pdefined_anchors), 4), dtype=np.float32)
183 | crop[:, 0::2] = pdefined_anchors[:, 0::2] * im.shape[-1]
184 | crop[:, 1::2] = pdefined_anchors[:, 1::2] * im.shape[-2]
185 | crop_mask = generate_target_size_crop_mask(crop, im.shape[-1], im.shape[-2], 64, 64)
186 |
187 | crop = torch.from_numpy(crop).unsqueeze(0).to(device) # 1,n,4
188 | crop_mask = torch.from_numpy(crop_mask).unsqueeze(0).to(device)
189 | part_feat, heat_map, scores = model(im, crop, crop_mask, part_mask)
190 | # get best crop
191 | scores = scores.reshape(-1).cpu().detach().numpy()
192 | idx = np.argmax(scores)
193 | pred_x1 = int(pdefined_anchors[idx][0] * width)
194 | pred_y1 = int(pdefined_anchors[idx][1] * height)
195 | pred_x2 = int(pdefined_anchors[idx][2] * width)
196 | pred_y2 = int(pdefined_anchors[idx][3] * height)
197 | pred_crop = torch.tensor([[pred_x1, pred_y1, pred_x2, pred_y2]])
198 | gt_crop = gt_crop.reshape(-1,4)
199 |
200 | iou, disp = compute_iou_and_disp(gt_crop, pred_crop, width, height)
201 | if iou >= alpha:
202 | alpha_cnt += 1
203 | accum_iou += iou
204 | accum_disp += disp
205 | cnt += 1
206 |
207 | avg_iou = accum_iou / cnt
208 | avg_disp = accum_disp / (cnt * 4.0)
209 | avg_recall = float(alpha_cnt) / cnt
210 | print('Test on {} images, IoU={:.4f}, Disp={:.4f}, recall={:.4f}(iou>={:.2f})'.format(
211 | cnt, avg_iou, avg_disp, avg_recall, alpha
212 | ))
213 | return avg_iou, avg_disp
214 |
215 | def weight_translate():
216 | model = HumanCentricCroppingModel(loadweights=False, cfg=cfg)
217 | model_dir = './experiments/ablation_study/GAICD_PA_CP'
218 |
219 | src_weight_path = os.path.join(model_dir, 'checkpoints_origin')
220 | tar_weight_path = os.path.join(model_dir, 'checkpoints')
221 | for file in os.listdir(src_weight_path):
222 | if not file.endswith('.pth'):
223 | continue
224 | weight = os.path.join(src_weight_path, file)
225 | weight_dict = torch.load(weight)
226 | model_state_dict = model.state_dict()
227 | new_state_dict = dict()
228 | for name,params in weight_dict.items():
229 | if name in model_state_dict:
230 | new_state_dict[name] = params
231 | else:
232 | if 'p_conv' in name:
233 | name = name.replace('p_conv', 'group_conv')
234 | new_state_dict[name] = params
235 | else:
236 | print(name)
237 | try:
238 | model.load_state_dict(new_state_dict)
239 | torch.save(new_state_dict, os.path.join(tar_weight_path, file))
240 | print(f'trans {file} successfully...')
241 | except:
242 | print(f'trans {file} failed...')
243 | break
244 |
245 |
246 | if __name__ == '__main__':
247 | from config_GAICD import cfg
248 | cfg.use_partition_aware = True
249 | cfg.partition_aware_type = 9
250 | cfg.use_content_preserve = True
251 | cfg.content_preserve_type = 'gcn'
252 | cfg.only_content_preserve = False
253 |
254 | model = HumanCentricCroppingModel(loadweights=False, cfg=cfg)
255 | model = model.eval().to(device)
256 |
257 | evaluate_on_GAICD(model, only_human=False)
258 | # evaluate_on_GAICD(model, only_human=True)
259 | # evaluate_on_FCDB_and_FLMS(model, dataset='FCDB&FLMS', only_human=True)
260 | # evaluate_on_FCDB_and_FLMS(model, dataset='FCDB', only_human=False)
261 | # evaluate_on_FCDB_and_FLMS(model, dataset='FLMS', only_human=False)
--------------------------------------------------------------------------------
/train_on_CPC.py:
--------------------------------------------------------------------------------
1 | import os
2 | import sys
3 | import numpy as np
4 | from tensorboardX import SummaryWriter
5 | import torch
6 | import time
7 | import datetime
8 | import csv
9 | from tqdm import tqdm
10 | import shutil
11 | import pickle
12 | from scipy.stats import spearmanr
13 | import random
14 | from torch.autograd import Variable
15 | import torch.utils.data as data
16 | import torchvision.transforms as transforms
17 | from PIL import Image
18 | import math
19 |
20 | from cropping_model import HumanCentricCroppingModel
21 | from cropping_model import score_regression_loss, score_rank_loss
22 | from cropping_model import score_weighted_regression_loss, listwise_view_ranking_loss
23 | from cropping_dataset import CPCDataset
24 | from config_CPC import cfg
25 | from test import evaluate_on_FCDB_and_FLMS
26 |
27 | device = torch.device('cuda:{}'.format(cfg.gpu_id))
28 | torch.cuda.set_device(cfg.gpu_id)
29 | IMAGE_NET_MEAN = [0.485, 0.456, 0.406]
30 | IMAGE_NET_STD = [0.229, 0.224, 0.225]
31 | MOS_MEAN = 0.66
32 |
33 | SEED = 0
34 | torch.manual_seed(SEED)
35 | np.random.seed(SEED)
36 | random.seed(SEED)
37 |
38 | def create_dataloader():
39 | assert cfg.training_set == 'CPC', cfg.training_set
40 | dataset = CPCDataset(only_human_images=cfg.only_human,
41 | keep_aspect_ratio=cfg.keep_aspect_ratio)
42 | if cfg.keep_aspect_ratio:
43 | dataloader = torch.utils.data.DataLoader(dataset, batch_size=1,
44 | shuffle=True, num_workers=cfg.num_workers,
45 | drop_last=False, worker_init_fn=random.seed(SEED),
46 | pin_memory=True)
47 | else:
48 | dataloader = torch.utils.data.DataLoader(dataset, batch_size=cfg.batch_size,
49 | shuffle=True, num_workers=cfg.num_workers,
50 | drop_last=False, worker_init_fn=random.seed(SEED),
51 | pin_memory=True)
52 | print('training set has {} samples, {} batches'.format(len(dataset), len(dataloader)))
53 | return dataloader
54 |
55 | class Trainer:
56 | def __init__(self, model):
57 | self.model = model
58 | self.epoch = 0
59 | self.iters = 0
60 | self.max_epoch = cfg.max_epoch
61 | self.writer = SummaryWriter(log_dir=cfg.log_dir)
62 | self.optimizer, self.lr_scheduler = self.get_optimizer()
63 | self.train_loader = create_dataloader()
64 | self.eval_results = []
65 | self.best_results = {'human_iou': 0, 'human_disp': 1.,
66 | 'FCDB_iou': 0., 'FCDB_disp': 1.,
67 | 'FLMS_iou': 0., 'FLMS_disp': 1.}
68 | self.score_loss_type = cfg.loss_type if isinstance(cfg.loss_type, list) else [cfg.loss_type]
69 | self.l1_loss = torch.nn.L1Loss()
70 |
71 | def get_optimizer(self):
72 | optim = torch.optim.Adam(
73 | self.model.parameters(),
74 | lr=cfg.lr,
75 | weight_decay=cfg.weight_decay
76 | )
77 | if cfg.lr_scheduler == 'cosine':
78 | warm_up_epochs = 5
79 | warm_up_with_cosine_lr = lambda epoch: min(1.,(epoch+1) / warm_up_epochs) if epoch <= warm_up_epochs else 0.5 * (
80 | math.cos((epoch - warm_up_epochs) / (self.max_epoch - warm_up_epochs) * math.pi) + 1)
81 | lr_scheduler = torch.optim.lr_scheduler.LambdaLR(optim, lr_lambda=warm_up_with_cosine_lr)
82 | else:
83 | lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(
84 | optim, milestones=cfg.lr_decay_epoch, gamma=cfg.lr_decay
85 | )
86 | return optim, lr_scheduler
87 |
88 | def run(self):
89 | print(("======== Begin Training ========="))
90 | for epoch in range(self.max_epoch):
91 | self.epoch = epoch
92 | self.train()
93 | if epoch % cfg.eval_freq == 0 or (epoch == self.max_epoch-1):
94 | self.eval()
95 | self.record_eval_results()
96 | self.lr_scheduler.step()
97 |
98 | def visualize_partition_mask(self, im, pre_part, gt_part):
99 | im = im.detach().cpu()
100 | pre_part = torch.softmax(pre_part,dim=0).detach().cpu()
101 | # pre_part = pre_part.detach().cpu()
102 | gt_part = gt_part.detach().cpu()
103 | im = im * torch.tensor(IMAGE_NET_STD).view(3,1,1) + torch.tensor(IMAGE_NET_MEAN).view(3,1,1)
104 | trans_fn = transforms.ToPILImage()
105 | im = trans_fn(im).convert('RGB')
106 | width, height = im.size
107 |
108 | joint_pre = None
109 | joint_gt = None
110 | for i in range(3):
111 | h,w = pre_part.shape[1:]
112 | col_band= torch.ones(h,1).float()
113 | row_pre = torch.cat([pre_part[i*3], col_band, pre_part[i*3+1], col_band, pre_part[i*3+2]], dim=-1)
114 | row_gt = torch.cat([gt_part[i*3], col_band, gt_part[i*3+1], col_band, gt_part[i*3+2]], dim=-1)
115 | if joint_gt is None:
116 | joint_gt = row_gt
117 | joint_pre = row_pre
118 | else:
119 | row_band = torch.ones(1, row_pre.shape[-1]).float()
120 | joint_pre = torch.cat([joint_pre, row_band, row_pre], dim=0)
121 | joint_gt = torch.cat([joint_gt, row_band, row_gt], dim=0)
122 | pre_part = trans_fn(joint_pre).convert('RGB').resize((width,height))
123 | gt_part = trans_fn(joint_gt).convert('RGB').resize((width, height))
124 | ver_band = (np.ones((height,5,3)) * 255).astype(np.uint8)
125 | cat_im = np.concatenate([np.asarray(im), ver_band, np.asarray(gt_part), ver_band, np.asarray(pre_part)], axis=1)
126 | cat_im = Image.fromarray(cat_im)
127 | fig_dir = os.path.join(cfg.exp_path, 'visualization')
128 | os.makedirs(fig_dir,exist_ok=True)
129 | fig_file = os.path.join(fig_dir, str(self.iters) + '_part.jpg')
130 | cat_im.save(fig_file)
131 |
132 | def visualize_heat_map(self, im, pre_heat, gt_heat):
133 | im = im.detach().cpu()
134 | pre_heat = pre_heat.detach().cpu()
135 | gt_heat = gt_heat.detach().cpu()
136 | im = im * torch.tensor(IMAGE_NET_STD).view(3,1,1) + torch.tensor(IMAGE_NET_MEAN).view(3,1,1)
137 | trans_fn = transforms.ToPILImage()
138 | im = trans_fn(im).convert('RGB')
139 | width, height = im.size
140 | pre_heat = trans_fn(pre_heat).convert('RGB').resize((width,height))
141 | gt_heat = trans_fn(gt_heat).convert('RGB').resize((width, height))
142 | ver_band = (np.ones((height,5,3)) * 255).astype(np.uint8)
143 | cat_im = np.concatenate([np.asarray(im), ver_band, np.asarray(gt_heat), ver_band, np.asarray(pre_heat)], axis=1)
144 | cat_im = Image.fromarray(cat_im)
145 | fig_dir = os.path.join(cfg.exp_path, 'visualization')
146 | os.makedirs(fig_dir,exist_ok=True)
147 | fig_file = os.path.join(fig_dir, str(self.iters) + '_content.jpg')
148 | cat_im.save(fig_file)
149 |
150 | def train(self):
151 | self.model.train()
152 | start = time.time()
153 | batch_loss = 0
154 | batch_score_loss = 0.
155 | batch_content_loss = 0.
156 | total_batch = len(self.train_loader)
157 | human_cnt = 0.
158 |
159 | for batch_idx, batch_data in enumerate(self.train_loader):
160 | self.iters += 1
161 | im = batch_data[0].to(device)
162 | rois = batch_data[1].to(device)
163 | heat_map = batch_data[2].to(device)
164 | crop_mask = batch_data[3].to(device)
165 | part_mask = batch_data[4].to(device)
166 | score = batch_data[5].to(device)
167 | # width = batch_data[4].to(device)
168 | # height = batch_data[5].to(device)
169 | contain_human = (torch.count_nonzero(part_mask[0, 4]) > 0)
170 |
171 | # random_ID = list(range(0, rois.shape[1]))
172 | # random.shuffle(random_ID)
173 | # chosen_ID = random_ID[ : cfg.view_per_image]
174 | # rois = rois[:,chosen_ID]
175 | # crop_mask = crop_mask[:,chosen_ID]
176 | # score = score[:,chosen_ID]
177 |
178 | pre_patition, pred_heat_map, pred_score = self.model(im, rois, crop_mask, part_mask)
179 | score_loss = None
180 | for loss_type in self.score_loss_type:
181 | if loss_type == 'L1Loss':
182 | cur_loss = score_regression_loss(pred_score, score)
183 | elif loss_type == 'WeightedL1Loss':
184 | cur_loss = score_weighted_regression_loss(pred_score, score, MOS_MEAN)
185 | elif loss_type == 'RankLoss':
186 | cur_loss = score_rank_loss(pred_score, score)
187 | elif loss_type == 'LVRLoss':
188 | cur_loss = listwise_view_ranking_loss(pred_score, score)
189 | else:
190 | raise Exception('Undefined score loss type', loss_type)
191 | if score_loss:
192 | score_loss += cur_loss
193 | else:
194 | score_loss = cur_loss
195 | batch_score_loss += score_loss.item()
196 | loss = score_loss
197 |
198 | if cfg.use_content_preserve:
199 | content_loss = self.l1_loss(pred_heat_map.reshape(-1), heat_map.reshape(-1))
200 | loss += (content_loss * cfg.content_loss_weight)
201 | batch_content_loss += content_loss.item()
202 | if contain_human:
203 | human_cnt += 1
204 | if human_cnt % cfg.visualize_freq == 0:
205 | if cfg.use_partition_aware and cfg.visualize_partition_feature:
206 | self.visualize_partition_mask(im[0], pre_patition[0], part_mask[0])
207 | if cfg.use_content_preserve and cfg.visualize_heat_map:
208 | self.visualize_heat_map(im[0], pred_heat_map[0], heat_map[0])
209 | batch_loss += loss.item()
210 | self.optimizer.zero_grad()
211 | loss.backward()
212 | self.optimizer.step()
213 | # batch_loss += loss.item()
214 | # loss = loss / cfg.batch_size
215 | # loss.backward()
216 | # if (batch_idx+1) % cfg.batch_size == 0 or batch_idx >= total_batch-1:
217 | # self.optimizer.step()
218 | # self.optimizer.zero_grad()
219 | if batch_idx > 0 and batch_idx % cfg.display_freq == 0:
220 | avg_loss = batch_loss / (1 + batch_idx)
221 | cur_lr = self.optimizer.param_groups[0]['lr']
222 | avg_score_loss = batch_score_loss / (1 + batch_idx)
223 | self.writer.add_scalar('train/score_loss', avg_score_loss, self.iters)
224 | self.writer.add_scalar('train/total_loss', avg_loss, self.iters)
225 | self.writer.add_scalar('train/lr', cur_lr, self.iters)
226 |
227 | if cfg.use_content_preserve:
228 | avg_content_loss = batch_content_loss / (1 + batch_idx)
229 | self.writer.add_scalar('train/content_loss', avg_content_loss, self.iters)
230 | else:
231 | avg_content_loss = 0.
232 |
233 | time_per_batch = (time.time() - start) / (batch_idx + 1.)
234 | last_batches = (self.max_epoch - self.epoch - 1) * total_batch + (total_batch - batch_idx - 1)
235 | last_time = int(last_batches * time_per_batch)
236 | time_str = str(datetime.timedelta(seconds=last_time))
237 |
238 | print('=== epoch:{}/{}, step:{}/{} | Loss:{:.4f} | Score_Loss:{:.4f} | Content_Loss:{:.4f} | lr:{:.6f} | estimated last time:{} ==='.format(
239 | self.epoch, self.max_epoch, batch_idx, total_batch, avg_loss, avg_score_loss, avg_content_loss, cur_lr, time_str
240 | ))
241 |
242 | def eval(self):
243 | human_iou, human_disp = evaluate_on_FCDB_and_FLMS(self.model, dataset='both', only_human=True)
244 | FCDB_iou, FCDB_disp = evaluate_on_FCDB_and_FLMS(self.model, dataset='FCDB', only_human=False)
245 | FLMS_iou, FLMS_disp = evaluate_on_FCDB_and_FLMS(self.model, dataset='FLMS', only_human=False)
246 |
247 | self.eval_results.append([self.epoch, human_iou, human_disp, FCDB_iou, FCDB_disp, FLMS_iou, FLMS_disp])
248 | epoch_result = {'human_iou': human_iou, 'human_disp': human_disp,
249 | 'FCDB_iou': FCDB_iou, 'FCDB_disp': FCDB_disp,
250 | 'FLMS_iou': FLMS_iou, 'FLMS_disp': FLMS_disp}
251 | for m in self.best_results.keys():
252 | update = False
253 | if ('disp' not in m) and (epoch_result[m] > self.best_results[m]):
254 | update = True
255 | elif ('disp' in m) and (epoch_result[m] < self.best_results[m]):
256 | update = True
257 | if update:
258 | self.best_results[m] = epoch_result[m]
259 | checkpoint_path = os.path.join(cfg.checkpoint_dir, 'best-{}.pth'.format(m))
260 | torch.save(self.model.state_dict(), checkpoint_path)
261 | print('Update best {} model, best {}={:.4f}'.format(m, m, self.best_results[m]))
262 | if m in ['human_iou', 'human_disp']:
263 | self.writer.add_scalar('test/{}'.format(m), epoch_result[m], self.epoch)
264 | self.writer.add_scalar('test/best-{}'.format(m), self.best_results[m], self.epoch)
265 | if self.epoch > 0 and self.epoch % cfg.save_freq == 0:
266 | checkpoint_path = os.path.join(cfg.checkpoint_dir, 'epoch-{}.pth'.format(self.epoch))
267 | torch.save(self.model.state_dict(), checkpoint_path)
268 |
269 | def record_eval_results(self):
270 | csv_path = os.path.join(cfg.exp_path, '..', '{}.csv'.format(cfg.exp_name))
271 | header = ['epoch', 'human_iou', 'human_disp',
272 | 'FCDB_iou', 'FCDB_disp', 'FLMS_iou', 'FLMS_disp']
273 | rows = [header]
274 | for i in range(len(self.eval_results)):
275 | new_results = []
276 | for j in range(len(self.eval_results[i])):
277 | if header[j] == 'epoch':
278 | new_results.append(self.eval_results[i][j])
279 | else:
280 | new_results.append(round(self.eval_results[i][j], 4))
281 | self.eval_results[i] = new_results
282 | rows += self.eval_results
283 | metrics = [[] for i in header]
284 | for result in self.eval_results:
285 | for i, r in enumerate(result):
286 | metrics[i].append(r)
287 | for name, m in zip(header, metrics):
288 | if name == 'epoch':
289 | continue
290 | index = m.index(max(m))
291 | if 'disp' in name:
292 | index = m.index(min(m))
293 | title = 'best {}(epoch-{})'.format(name, index)
294 | row = [l[index] for l in metrics]
295 | row[0] = title
296 | rows.append(row)
297 | with open(csv_path, 'w') as f:
298 | cw = csv.writer(f)
299 | cw.writerows(rows)
300 | print('Save result to ', csv_path)
301 |
302 | if __name__ == '__main__':
303 | cfg.create_path()
304 | for file in os.listdir('./'):
305 | if file.endswith('.py'):
306 | shutil.copy(file, cfg.exp_path)
307 | print('backup', file)
308 | net = HumanCentricCroppingModel(loadweights=True, cfg=cfg)
309 | net = net.to(device)
310 | trainer = Trainer(net)
311 | trainer.run()
--------------------------------------------------------------------------------
/train_on_GAICD.py:
--------------------------------------------------------------------------------
1 | import os
2 | import sys
3 | import numpy as np
4 | from tensorboardX import SummaryWriter
5 | import torch
6 | import time
7 | import datetime
8 | import csv
9 | from tqdm import tqdm
10 | import shutil
11 | import pickle
12 | from scipy.stats import spearmanr
13 | import random
14 | from torch.autograd import Variable
15 | import torch.utils.data as data
16 | import torchvision.transforms as transforms
17 | from PIL import Image
18 | import math
19 |
20 | from cropping_model import HumanCentricCroppingModel
21 | from cropping_model import score_regression_loss, score_rank_loss, \
22 | score_weighted_regression_loss, listwise_view_ranking_loss
23 | from cropping_dataset import GAICDataset
24 | from config_GAICD import cfg
25 | from test import evaluate_on_GAICD
26 |
27 | device = torch.device('cuda:{}'.format(cfg.gpu_id))
28 | torch.cuda.set_device(cfg.gpu_id)
29 | IMAGE_NET_MEAN = [0.485, 0.456, 0.406]
30 | IMAGE_NET_STD = [0.229, 0.224, 0.225]
31 | MOS_MEAN = 2.95
32 | MOS_STD = 0.8
33 |
34 | SEED = 0
35 | torch.manual_seed(SEED)
36 | np.random.seed(SEED)
37 | random.seed(SEED)
38 |
39 | def create_dataloader():
40 | assert cfg.training_set == 'GAICD', cfg.training_set
41 | dataset = GAICDataset(only_human_images=cfg.only_human,
42 | keep_aspect_ratio=cfg.keep_aspect_ratio,
43 | split='train')
44 | dataloader = torch.utils.data.DataLoader(dataset, batch_size=1,
45 | shuffle=True, num_workers=cfg.num_workers,
46 | drop_last=False, worker_init_fn=random.seed(SEED),
47 | pin_memory=True)
48 | print('training set has {} samples, {} batches'.format(len(dataset), len(dataloader)))
49 | return dataloader
50 |
51 | class Trainer:
52 | def __init__(self, model):
53 | self.model = model
54 | self.epoch = 0
55 | self.iters = 0
56 | self.max_epoch = cfg.max_epoch
57 | self.writer = SummaryWriter(log_dir=cfg.log_dir)
58 | self.optimizer, self.lr_scheduler = self.get_optimizer()
59 | self.train_loader = create_dataloader()
60 | self.eval_results = []
61 | self.best_results = {'human_srcc': 0, 'human_acc5': 0., 'human_acc10': 0.,
62 | 'srcc': 0, 'acc5': 0., 'acc10': 0.}
63 | self.score_loss_type = cfg.loss_type if isinstance(cfg.loss_type, list) else [cfg.loss_type]
64 | self.l1_loss = torch.nn.L1Loss()
65 |
66 | def get_optimizer(self):
67 | optim = torch.optim.Adam(
68 | self.model.parameters(),
69 | lr=cfg.lr,
70 | weight_decay=cfg.weight_decay
71 | )
72 | if cfg.lr_scheduler == 'cosine':
73 | warm_up_epochs = 5
74 | warm_up_with_cosine_lr = lambda epoch: min(1.,(epoch+1) / warm_up_epochs) if epoch <= warm_up_epochs else 0.5 * (
75 | math.cos((epoch - warm_up_epochs) / (self.max_epoch - warm_up_epochs) * math.pi) + 1)
76 | lr_scheduler = torch.optim.lr_scheduler.LambdaLR(optim, lr_lambda=warm_up_with_cosine_lr)
77 | else:
78 | lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(
79 | optim, milestones=cfg.lr_decay_epoch, gamma=cfg.lr_decay
80 | )
81 | return optim, lr_scheduler
82 |
83 | def run(self):
84 | print(("======== Begin Training ========="))
85 | for epoch in range(self.max_epoch):
86 | self.epoch = epoch
87 | self.train()
88 | if epoch % cfg.eval_freq == 0 or (epoch == self.max_epoch-1):
89 | self.eval()
90 | self.record_eval_results()
91 | self.lr_scheduler.step()
92 |
93 | def visualize_partition_mask(self, im, pre_part, gt_part):
94 | im = im.detach().cpu()
95 | pre_part = torch.softmax(pre_part,dim=0).detach().cpu()
96 | # pre_part = pre_part.detach().cpu()
97 | gt_part = gt_part.detach().cpu()
98 | im = im * torch.tensor(IMAGE_NET_STD).view(3,1,1) + torch.tensor(IMAGE_NET_MEAN).view(3,1,1)
99 | trans_fn = transforms.ToPILImage()
100 | im = trans_fn(im).convert('RGB')
101 | width, height = im.size
102 |
103 | joint_pre = None
104 | joint_gt = None
105 | for i in range(3):
106 | h,w = pre_part.shape[1:]
107 | col_band= torch.ones(h,1).float()
108 | row_pre = torch.cat([pre_part[i*3], col_band, pre_part[i*3+1], col_band, pre_part[i*3+2]], dim=-1)
109 | row_gt = torch.cat([gt_part[i*3], col_band, gt_part[i*3+1], col_band, gt_part[i*3+2]], dim=-1)
110 | if joint_gt is None:
111 | joint_gt = row_gt
112 | joint_pre = row_pre
113 | else:
114 | row_band = torch.ones(1, row_pre.shape[-1]).float()
115 | joint_pre = torch.cat([joint_pre, row_band, row_pre], dim=0)
116 | joint_gt = torch.cat([joint_gt, row_band, row_gt], dim=0)
117 | pre_part = trans_fn(joint_pre).convert('RGB').resize((width,height))
118 | gt_part = trans_fn(joint_gt).convert('RGB').resize((width, height))
119 | ver_band = (np.ones((height,5,3)) * 255).astype(np.uint8)
120 | cat_im = np.concatenate([np.asarray(im), ver_band, np.asarray(gt_part), ver_band, np.asarray(pre_part)], axis=1)
121 | cat_im = Image.fromarray(cat_im)
122 | fig_dir = os.path.join(cfg.exp_path, 'visualization')
123 | os.makedirs(fig_dir,exist_ok=True)
124 | fig_file = os.path.join(fig_dir, str(self.iters) + '_part.jpg')
125 | cat_im.save(fig_file)
126 |
127 | def visualize_heat_map(self, im, pre_heat, gt_heat):
128 | im = im.detach().cpu()
129 | pre_heat = pre_heat.detach().cpu()
130 | gt_heat = gt_heat.detach().cpu()
131 | im = im * torch.tensor(IMAGE_NET_STD).view(3,1,1) + torch.tensor(IMAGE_NET_MEAN).view(3,1,1)
132 | trans_fn = transforms.ToPILImage()
133 | im = trans_fn(im).convert('RGB')
134 | width, height = im.size
135 | pre_heat = trans_fn(pre_heat).convert('RGB').resize((width,height))
136 | gt_heat = trans_fn(gt_heat).convert('RGB').resize((width, height))
137 | ver_band = (np.ones((height,5,3)) * 255).astype(np.uint8)
138 | cat_im = np.concatenate([np.asarray(im), ver_band, np.asarray(gt_heat), ver_band, np.asarray(pre_heat)], axis=1)
139 | cat_im = Image.fromarray(cat_im)
140 | fig_dir = os.path.join(cfg.exp_path, 'visualization')
141 | os.makedirs(fig_dir,exist_ok=True)
142 | fig_file = os.path.join(fig_dir, str(self.iters) + '_content.jpg')
143 | cat_im.save(fig_file)
144 |
145 | def train(self):
146 | self.model.train()
147 | start = time.time()
148 | batch_loss = 0
149 | batch_score_loss = 0.
150 | batch_content_loss = 0.
151 | total_batch = len(self.train_loader)
152 | human_cnt = 0.
153 |
154 | for batch_idx, batch_data in enumerate(self.train_loader):
155 | self.iters += 1
156 | im = batch_data[0].to(device)
157 | rois = batch_data[1].to(device)
158 | human_box = batch_data[2].to(device)
159 | heat_map = batch_data[3].to(device)
160 | crop_mask = batch_data[4].to(device)
161 | part_mask = batch_data[5].to(device)
162 | score = batch_data[6].to(device)
163 | # width = batch_data[7].to(device)
164 | # height = batch_data[8].to(device)
165 | contain_human = (torch.count_nonzero(part_mask[0, 4]) > 0)
166 |
167 | random_ID = list(range(0, rois.shape[1]))
168 | random.shuffle(random_ID)
169 | chosen_ID = random_ID[:64]
170 | rois = rois[:,chosen_ID]
171 | crop_mask = crop_mask[:,chosen_ID]
172 | score = score[:,chosen_ID]
173 |
174 | pre_patition, pred_heat_map, pred_score = self.model(im, rois, human_box, crop_mask, part_mask)
175 | score_loss = None
176 | for loss_type in self.score_loss_type:
177 | if loss_type == 'L1Loss':
178 | cur_loss = score_regression_loss(pred_score, score)
179 | elif loss_type == 'WeightedL1Loss':
180 | cur_loss = score_weighted_regression_loss(pred_score, score, MOS_MEAN)
181 | elif loss_type == 'RankLoss':
182 | cur_loss = score_rank_loss(pred_score, score)
183 | elif loss_type == 'LVRLoss':
184 | cur_loss = listwise_view_ranking_loss(pred_score, score)
185 | else:
186 | raise Exception('Undefined score loss type', loss_type)
187 | if score_loss:
188 | score_loss += cur_loss
189 | else:
190 | score_loss = cur_loss
191 | batch_score_loss += score_loss.item()
192 | loss = score_loss
193 |
194 | if cfg.use_content_preserve:
195 | content_loss = self.l1_loss(pred_heat_map.reshape(-1), heat_map.reshape(-1))
196 | loss += (content_loss * cfg.content_loss_weight)
197 | batch_content_loss += content_loss.item()
198 | if contain_human:
199 | human_cnt += 1
200 | if human_cnt % cfg.visualize_freq == 0:
201 | if cfg.use_partition_aware and cfg.visualize_partition_feature:
202 | self.visualize_partition_mask(im[0], pre_patition[0], part_mask[0])
203 | if cfg.use_content_preserve and cfg.visualize_heat_map:
204 | self.visualize_heat_map(im[0], pred_heat_map[0], heat_map[0])
205 | batch_loss += loss.item()
206 | loss = loss / cfg.batch_size
207 | loss.backward()
208 |
209 | if (batch_idx+1) % cfg.batch_size == 0 or batch_idx >= total_batch-1:
210 | self.optimizer.step()
211 | self.optimizer.zero_grad()
212 | if batch_idx > 0 and batch_idx % cfg.display_freq == 0:
213 | avg_loss = batch_loss / (1 + batch_idx)
214 | cur_lr = self.optimizer.param_groups[0]['lr']
215 | avg_score_loss = batch_score_loss / (1 + batch_idx)
216 | self.writer.add_scalar('train/score_loss', avg_score_loss, self.iters)
217 | self.writer.add_scalar('train/total_loss', avg_loss, self.iters)
218 | self.writer.add_scalar('train/lr', cur_lr, self.iters)
219 |
220 | if cfg.use_content_preserve:
221 | avg_content_loss = batch_content_loss / (1 + batch_idx)
222 | self.writer.add_scalar('train/content_loss', avg_content_loss, self.iters)
223 | else:
224 | avg_content_loss = 0.
225 |
226 | time_per_batch = (time.time() - start) / (batch_idx + 1.)
227 | last_batches = (self.max_epoch - self.epoch - 1) * total_batch + (total_batch - batch_idx - 1)
228 | last_time = int(last_batches * time_per_batch)
229 | time_str = str(datetime.timedelta(seconds=last_time))
230 |
231 | print('=== epoch:{}/{}, step:{}/{} | Loss:{:.4f} | Score_Loss:{:.4f} | Content_Loss:{:.4f} | lr:{:.6f} | estimated last time:{} ==='.format(
232 | self.epoch, self.max_epoch, batch_idx, total_batch, avg_loss, avg_score_loss, avg_content_loss, cur_lr, time_str
233 | ))
234 |
235 | def eval(self):
236 | self.model.eval()
237 | human_srcc, human_acc5, human_acc10 = evaluate_on_GAICD(self.model, only_human=True)
238 | srcc, acc5, acc10 = evaluate_on_GAICD(self.model, only_human=False)
239 | self.eval_results.append([self.epoch, human_srcc, human_acc5, human_acc10,
240 | srcc, acc5, acc10])
241 | epoch_result = {'human_srcc': human_srcc, 'human_acc5': human_acc5, 'human_acc10': human_acc10,
242 | 'srcc': srcc, 'acc5': acc5, 'acc10': acc10}
243 | for m in self.best_results.keys():
244 | update = False
245 | if (m != 'disp') and (epoch_result[m] > self.best_results[m]):
246 | update = True
247 | elif (m == 'disp') and (epoch_result[m] < self.best_results[m]):
248 | update = True
249 | if update:
250 | self.best_results[m] = epoch_result[m]
251 | checkpoint_path = os.path.join(cfg.checkpoint_dir, 'best-{}.pth'.format(m))
252 | torch.save(self.model.state_dict(), checkpoint_path)
253 | print('Update best {} model, best {}={:.4f}'.format(m, m, self.best_results[m]))
254 | if m in ['human_srcc', 'srcc']:
255 | self.writer.add_scalar('test/{}'.format(m), epoch_result[m], self.epoch)
256 | self.writer.add_scalar('test/best-{}'.format(m), self.best_results[m], self.epoch)
257 | if self.epoch % cfg.save_freq == 0:
258 | checkpoint_path = os.path.join(cfg.checkpoint_dir, 'epoch-{}.pth'.format(self.epoch))
259 | torch.save(self.model.state_dict(), checkpoint_path)
260 |
261 | def record_eval_results(self):
262 | csv_path = os.path.join(cfg.exp_path, '..', '{}.csv'.format(cfg.exp_name))
263 | header = ['epoch', 'human_srcc', 'human_acc5', 'human_acc10',
264 | 'srcc', 'acc5', 'acc10']
265 | rows = [header]
266 | for i in range(len(self.eval_results)):
267 | new_results = []
268 | for j in range(len(self.eval_results[i])):
269 | if 'srcc' in header[j]:
270 | new_results.append(round(self.eval_results[i][j], 3))
271 | elif 'acc' in header[j]:
272 | new_results.append(round(self.eval_results[i][j], 3))
273 | else:
274 | new_results.append(self.eval_results[i][j])
275 | self.eval_results[i] = new_results
276 | rows += self.eval_results
277 | metrics = [[] for i in header]
278 | for result in self.eval_results:
279 | for i, r in enumerate(result):
280 | metrics[i].append(r)
281 | for name, m in zip(header, metrics):
282 | if name == 'epoch':
283 | continue
284 | index = m.index(max(m))
285 | if name == 'disp':
286 | index = m.index(min(m))
287 | title = 'best {}(epoch-{})'.format(name, index)
288 | row = [l[index] for l in metrics]
289 | row[0] = title
290 | rows.append(row)
291 | with open(csv_path, 'w') as f:
292 | cw = csv.writer(f)
293 | cw.writerows(rows)
294 | print('Save result to ', csv_path)
295 |
296 | if __name__ == '__main__':
297 | cfg.create_path()
298 | for file in os.listdir('./'):
299 | if file.endswith('.py'):
300 | shutil.copy(file, cfg.exp_path)
301 | print('backup', file)
302 | net = HumanCentricCroppingModel(loadweights=True, cfg=cfg)
303 | net = net.to(device)
304 | trainer = Trainer(net)
305 | trainer.run()
--------------------------------------------------------------------------------