├── .gitignore
├── LICENSE
├── README.md
├── code
│   ├── data_utils.py
│   ├── eval_interpreter.py
│   ├── experiments
│   │   ├── bp4d_0.json
│   │   ├── disfa_0.json
│   │   ├── eval_b2d.json
│   │   ├── eval_d2b.json
│   │   ├── single_image_inference_bp4d.json
│   │   └── single_image_inference_disfa.json
│   ├── model_utils.py
│   ├── models
│   │   ├── encoders
│   │   │   ├── helpers.py
│   │   │   └── psp_encoders.py
│   │   ├── interpreter.py
│   │   └── stylegan2_pytorch
│   │       ├── op
│   │       │   ├── __init__.py
│   │       │   ├── fused_act.py
│   │       │   ├── fused_bias_act.cpp
│   │       │   ├── fused_bias_act_kernel.cu
│   │       │   ├── upfirdn2d.cpp
│   │       │   ├── upfirdn2d.py
│   │       │   └── upfirdn2d_kernel.cu
│   │       └── stylegan2_pytorch.py
│   ├── single_image_inference.py
│   └── train_interpreter.py
├── data
│   ├── BP4D
│   │   └── labels
│   │       ├── 0
│   │       │   ├── test.csv
│   │       │   ├── train.csv
│   │       │   ├── train_100.csv
│   │       │   ├── train_1000.csv
│   │       │   ├── train_10000.csv
│   │       │   └── train_5000.csv
│   │       ├── 1
│   │       │   ├── test.csv
│   │       │   ├── train.csv
│   │       │   ├── train_100.csv
│   │       │   ├── train_1000.csv
│   │       │   ├── train_10000.csv
│   │       │   └── train_5000.csv
│   │       └── 2
│   │           ├── test.csv
│   │           ├── train.csv
│   │           ├── train_100.csv
│   │           ├── train_1000.csv
│   │           ├── train_10000.csv
│   │           └── train_5000.csv
│   └── DISFA
│       └── labels
│           ├── 0
│           │   ├── test.csv
│           │   ├── train.csv
│           │   ├── train_100.csv
│           │   ├── train_1000.csv
│           │   ├── train_10000.csv
│           │   └── train_5000.csv
│           ├── 1
│           │   ├── test.csv
│           │   ├── train.csv
│           │   ├── train_100.csv
│           │   ├── train_1000.csv
│           │   ├── train_10000.csv
│           │   └── train_5000.csv
│           └── 2
│               ├── test.csv
│               ├── train.csv
│               ├── train_100.csv
│               ├── train_1000.csv
│               ├── train_10000.csv
│               └── train_5000.csv
├── pipeline.png
└── test_image.jpg
/.gitignore:
--------------------------------------------------------------------------------
1 | __pycache__
2 |
3 | # data
4 | data/BP4D/aligned_images
5 | data/BP4D/aligned_landmarks
6 | data/DISFA/aligned_images
7 | data/DISFA/aligned_landmarks
8 |
9 | # checkpoints
10 | code/checkpoints
11 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2023 Intelligent Human Perception Lab
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
31 |
32 | ## Introduction
33 |
34 | This is the official implementation of our WACV 2024 Algorithm Track paper: FG-Net: Facial Action Unit Detection with Generalizable Pyramidal Features.
35 |
36 | FG-Net extracts feature maps from a StyleGAN2 model pre-trained on a large and diverse face image dataset. These features are then used to detect AUs with a Pyramid CNN Interpreter, which makes the training efficient and captures essential local features. FG-Net achieves strong generalization for heatmap-based AU detection thanks to the generalizable and semantically rich features extracted from the pre-trained generative model. Extensive experiments evaluate within- and cross-corpus AU detection on the widely used DISFA and BP4D datasets. Compared with the state of the art, the proposed method achieves superior cross-domain performance while maintaining competitive within-domain performance. In addition, FG-Net is data-efficient and achieves competitive performance even when trained on 1,000 samples.
37 |
38 |
39 |
40 |
41 |
42 | ## Installation
43 | Clone repo:
44 | ```
45 | git clone https://github.com/ihp-lab/FG-Net.git
46 | cd FG-Net
47 | ```
48 |
49 | The code is tested with Python == 3.7, PyTorch == 1.10.1 and CUDA == 11.3 on an NVIDIA Quadro RTX 8000. We recommend using [anaconda](https://www.anaconda.com/) to manage dependencies.
50 |
51 | ```
52 | conda create -n fg-net python=3.7
53 | conda activate fg-net
54 | conda install pytorch==1.10.1 torchvision==0.11.2 torchaudio==0.10.1 cudatoolkit=11.3 -c pytorch -c conda-forge
55 | conda install cudatoolkit-dev=11.3
56 | pip install pandas
57 | pip install tqdm
58 | pip install -U scikit-learn
59 | pip install opencv-python
60 | pip install dlib
61 | pip install imutils
62 | ```
63 |
64 | ## Data Structure
65 | Download the [BP4D](https://www.cs.binghamton.edu/~lijun/Research/3DFE/3DFE_Analysis.html) and [DISFA](http://mohammadmahoor.com/disfa/) datasets from their official websites.
66 |
67 | We first pre-process the input images with dlib to obtain the facial landmarks. The detected landmarks are used to crop and align the faces with [FFHQ-alignment](https://github.com/happy-jihye/FFHQ-Alignment). Finally, we run dlib again on the aligned images to detect the facial landmarks used to generate the heatmaps (a minimal sketch of this step is given after the folder layout below).
68 |
69 | You should get a dataset folder like below:
70 |
71 | ```
72 | data
73 | ├── DISFA
74 | │   ├── labels
75 | │   │   ├── 0
76 | │   │   │   ├── train.csv
77 | │   │   │   └── test.csv
78 | │   │   ├── 1
79 | │   │   └── 2
80 | │   ├── aligned_images
81 | │   └── aligned_landmarks
82 | └── BP4D
83 | ```
84 |
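A minimal sketch of the landmark pre-processing step is below. The dataset root, dlib model location, and image extension are assumptions; the one fixed convention is that `au2heatmap` in `code/data_utils.py` looks for the cached landmarks under `aligned_landmarks/.../*.npy`, mirroring the `aligned_images` paths.

```
# Sketch: cache 68-point dlib landmarks for each aligned image as .npy files,
# following the aligned_images -> aligned_landmarks path convention used by data_utils.py.
import os
import glob

import cv2
import dlib
import numpy as np
from imutils import face_utils

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor('shape_predictor_68_face_landmarks.dat')  # assumed local path

for img_path in glob.glob('data/DISFA/aligned_images/**/*.jpg', recursive=True):  # assumed extension
    gray = cv2.cvtColor(cv2.imread(img_path), cv2.COLOR_BGR2GRAY)
    rects = detector(gray, 1)
    if len(rects) == 0:
        continue  # no face detected; handle these images separately
    shape = face_utils.shape_to_np(predictor(gray, rects[0]))  # [68, 2] (x, y) landmarks
    lmk_path = img_path.replace('aligned_images', 'aligned_landmarks')[:-4] + '.npy'
    os.makedirs(os.path.dirname(lmk_path), exist_ok=True)
    np.save(lmk_path, shape)
```
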
85 | ## Checkpoints
86 | StyleGAN2:
87 | To get the PyTorch checkpoint for StyleGAN2 (`stylegan2-ffhq-config-f.pt`), see the section [Convert weight from official checkpoints](https://github.com/rosinality/stylegan2-pytorch/tree/master).
88 |
89 | StyleGAN2 Encoder:
90 | Download the [pSp encoder](https://drive.google.com/file/d/1bMTNWkh5LArlaWSc_wa8VKyq2V42T2z0/view?pli=1) and rename the .pt file to `encoder.pt`.
91 |
92 | AU detection:
93 | [BP4D](https://drive.google.com/file/d/1mwOq1gceGTKvcxN0EJywY9VTuXvASgUZ/view?usp=sharing) and [DISFA](https://drive.google.com/file/d/1wTeqUNkhi-ONXXFJ4K6AozyOxyHTHog-/view?usp=sharing)
94 |
95 | Put all checkpoints under the folder `code/checkpoints` (a quick sanity check is sketched below).
96 |
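As a quick sanity check that everything is in place (file names taken from the experiment configs):

```
# Run from the code/ directory; paths come from the experiments/*.json configs.
import os

expected = [
    'checkpoints/stylegan2-ffhq-config-f.pt',  # StyleGAN2 generator
    'checkpoints/encoder.pt',                  # pSp encoder
    'checkpoints/bp4d_0/model.pth',            # AU detection model trained on BP4D
    'checkpoints/disfa_0/model.pth',           # AU detection model trained on DISFA
]
for path in expected:
    print(path, 'OK' if os.path.exists(path) else 'MISSING')
```
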
97 | ## Training and Within-domain Evaluation
98 | ```
99 | cd code
100 | CUDA_VISIBLE_DEVICES=0 python train_interpreter.py --exp experiments/bp4d_0.json
101 | CUDA_VISIBLE_DEVICES=1 python train_interpreter.py --exp experiments/disfa_0.json
102 | ```
103 |
104 | ## Cross-domain Evaluation
105 | ```
106 | cd code
107 | CUDA_VISIBLE_DEVICES=0 python eval_interpreter.py --exp experiments/eval_b2d.json
108 | CUDA_VISIBLE_DEVICES=1 python eval_interpreter.py --exp experiments/eval_d2b.json
109 | ```
110 |
111 | ## Single image inference
112 | ```
113 | cd code
114 | CUDA_VISIBLE_DEVICES=0 python single_image_inference.py --exp experiments/single_image_inference_bp4d.json
115 | CUDA_VISIBLE_DEVICES=1 python single_image_inference.py --exp experiments/single_image_inference_disfa.json
116 | ```
117 |
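The inference script follows the same pipeline as the evaluation script. Below is a condensed sketch of the flow, adapted from `eval_interpreter.py`; treat it as illustrative rather than a drop-in replacement. It assumes it is run from `code/`, that the checkpoints above are in place, and that the input is an aligned 256x256 face crop (`test_image.jpg` is used here as a placeholder path).

```
# Illustrative pipeline: image -> pSp encoder -> StyleGAN2 style codes ->
# pyramidal features -> AU heatmaps -> binary AU labels.
import json

import torch
import torchvision.transforms as T
from PIL import Image

from data_utils import heatmap2au
from model_utils import latent_to_image, prepare_model, load_psp_standalone
from models.interpreter import pyramid_interpreter

args = json.load(open('experiments/single_image_inference_bp4d.json'))

g_all, upsamplers = prepare_model(args)
pspencoder = load_psp_standalone(args['style_encoder_path'], 'cuda')
interpreter = pyramid_interpreter(32, args['dropout']).cuda()  # 32 output channels, as in eval_interpreter.py

checkpoint = torch.load(args['checkpoint_path'])
interpreter.load_state_dict(checkpoint['interpreter'])
g_all.load_state_dict(checkpoint['g_all'])
pspencoder.load_state_dict(checkpoint['pspencoder'])
interpreter.eval()
g_all.eval()
pspencoder.eval()

transform = T.Compose([T.ToTensor(), T.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5])])
image = transform(Image.open('../test_image.jpg')).unsqueeze(0).cuda()  # aligned 256x256 face assumed

with torch.no_grad():
    latent_codes = pspencoder(image)
    latent_codes = g_all.style(latent_codes.reshape(-1, latent_codes.shape[2])).reshape(latent_codes.shape)
    features = latent_to_image(g_all, upsamplers, latent_codes, upsample=False, args=args)
    heatmaps = torch.clamp(interpreter(features), min=-1., max=1.)
    print(heatmap2au(heatmaps))  # one 0/1 prediction per AU channel
```
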
118 | ## License
119 | Our code is distributed under the MIT License. See `LICENSE` file for more information.
120 |
121 | ## Citation
122 | ```
123 | @article{yin2023fg,
124 | title={FG-Net: Facial Action Unit Detection with Generalizable Pyramidal Features},
125 | author={Yin, Yufeng and Chang, Di and Song, Guoxian and Sang, Shen and Zhi, Tiancheng and Liu, Jing and Luo, Linjie and Soleymani, Mohammad},
126 | journal={arXiv preprint arXiv:2308.12380},
127 | year={2023}
128 | }
129 | ```
130 |
131 | ## Contact
132 | If you have any questions, please raise an issue or email Yufeng Yin (`yin@ict.usc.edu` or `yufengy@usc.edu`).
133 |
134 | ## Acknowledgments
135 | Our code builds on several awesome repositories. We appreciate the authors for making their code publicly available.
136 |
137 | - [DatasetGAN](https://github.com/nv-tlabs/datasetGAN_release/)
138 | - [StyleGAN2-pytorch](https://github.com/rosinality/stylegan2-pytorch/tree/master)
139 | - [pSp](https://github.com/eladrich/pixel2style2pixel)
140 |
--------------------------------------------------------------------------------
/code/data_utils.py:
--------------------------------------------------------------------------------
1 | import os
2 | import cv2
3 | import dlib
4 | import torch
5 | import numpy as np
6 |
7 | from PIL import Image
8 | from imutils import face_utils
9 | from torch.utils.data import Dataset
10 |
11 |
12 | def findlandmark(img_path):
13 | cascade = '../face_align/shape_predictor_68_face_landmarks.dat'
14 | detector = dlib.get_frontal_face_detector()
15 | predictor = dlib.shape_predictor(cascade)
16 |
17 | image = cv2.imread(img_path)
18 | gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
19 |
20 | rects = detector(gray, 1)
21 |
22 | for rect in rects:
23 | shape = predictor(gray, rect)
24 | shape = face_utils.shape_to_np(shape)
25 |
26 | return shape
27 |
28 |
29 | def add_noise(heatmap, channel, label, center_1, center_2, sigma, size, threshold=0.01):
30 | gauss_noise_1 = np.fromfunction(lambda y,x : ((x-center_1[0])**2 \
31 | + (y-center_1[1])**2) / -(2.0*sigma*sigma),
32 | (size, size), dtype=int)
33 | gauss_noise_1 = np.exp(gauss_noise_1)
34 | gauss_noise_1[gauss_noise_1 < threshold] = 0
35 | gauss_noise_1[gauss_noise_1 > 1] = 1
36 | gauss_noise_2 = np.fromfunction(lambda y,x : ((x-center_2[0])**2 \
37 | + (y-center_2[1])**2) / -(2.0*sigma*sigma),
38 | (size, size), dtype=int)
39 | if label[channel] == 1:
40 | heatmap[channel] += gauss_noise_1
41 | else:
42 | heatmap[channel] -= gauss_noise_1
43 | if center_2[0] == -1:
44 | return heatmap
45 |
46 | gauss_noise_2 = np.exp(gauss_noise_2)
47 | gauss_noise_2[gauss_noise_2 < threshold] = 0
48 | gauss_noise_2[gauss_noise_2 > 1] = 1
49 | if label[channel] == 1:
50 | heatmap[channel] += gauss_noise_2
51 | else:
52 | heatmap[channel] -= gauss_noise_2
53 | return heatmap
54 |
55 |
56 | def au2heatmap(img_path, label, size, sigma, args):
57 | lmk_path = img_path.replace('aligned_images', 'aligned_landmarks')[:-4]+'.npy'
58 | if os.path.exists(lmk_path):
59 | lmk = np.load(lmk_path)
60 | else:
61 | print(img_path)
62 | lmk = findlandmark(img_path)
63 |
64 | if size == 128:
65 | lmk = lmk/2 # [68, 2]
66 | elif size == 64:
67 | lmk = lmk/4 # [68, 2]
68 |
69 | heatmap = np.zeros((args['num_labels'], size, size))
70 | lmk_eye_left = lmk[36:42]
71 | lmk_eye_right = lmk[42:48]
72 | eye_left = np.mean(lmk_eye_left, axis=0)
73 | eye_right = np.mean(lmk_eye_right, axis=0)
74 | lmk_eyebrow_left = lmk[17:22]
75 | lmk_eyebrow_right = lmk[22:27]
76 | eyebrow_left = np.mean(lmk_eyebrow_left, axis=0)
77 | eyebrow_right = np.mean(lmk_eyebrow_right, axis=0)
78 | IOD = np.linalg.norm(lmk[42] - lmk[39])
79 |
80 | if args['dataset'] == 'BP4D':
81 | # au1 lmk 21, 22
82 | heatmap = add_noise(heatmap, 0, label, lmk[21], lmk[22], sigma, size)
83 |
84 | # au2 lmk 17, 26
85 | heatmap = add_noise(heatmap, 1, label, lmk[17], lmk[26], sigma, size)
86 |
87 | # au4 brow center
88 | heatmap = add_noise(heatmap, 2, label, eyebrow_left, eyebrow_right, sigma, size)
89 |
90 | # au6 1 scale below eye bottom
91 | heatmap = add_noise(heatmap, 3, label, [eye_left[0], eye_left[1]+IOD], [eye_right[0], eye_right[1]+IOD], sigma, size)
92 |
93 | # au7 lmk 38, 43
94 | heatmap = add_noise(heatmap, 4, label, lmk[38], lmk[43], sigma, size)
95 |
96 | # au10 lmk 50, 52
97 | heatmap = add_noise(heatmap, 5, label, lmk[50], lmk[52], sigma, size)
98 |
99 | # au12 lmk 48, 54
100 | heatmap = add_noise(heatmap, 6, label, lmk[48], lmk[54], sigma, size)
101 |
102 | # au14 lmk 48, 54
103 | heatmap = add_noise(heatmap, 7, label, lmk[48], lmk[54], sigma, size)
104 |
105 | # au15 lmk 48, 54
106 | heatmap = add_noise(heatmap, 8, label, lmk[48], lmk[54], sigma, size)
107 |
108 | # au17 0.5 scale below lmk 56, 58 / 0.5 scale below lip center
109 | heatmap = add_noise(heatmap, 9, label, [lmk[56,0], lmk[56,1]+0.5*IOD], [lmk[58,0], lmk[58,1]+0.5*IOD], sigma, size)
110 |
111 | # au23 lmk 51, 57 / lip center
112 | heatmap = add_noise(heatmap, 10, label, lmk[51], lmk[57], sigma, size)
113 |
114 | # au24 lmk 51, 57 / lip center
115 | heatmap = add_noise(heatmap, 11, label, lmk[51], lmk[57], sigma, size)
116 | elif args['dataset'] == 'DISFA':
117 | # au1 lmk 21, 22
118 | heatmap = add_noise(heatmap, 0, label, lmk[21], lmk[22], sigma, size)
119 |
120 | # au2 lmk 17, 26
121 | heatmap = add_noise(heatmap, 1, label, lmk[17], lmk[26], sigma, size)
122 |
123 | # au4 brow center
124 | heatmap = add_noise(heatmap, 2, label, eyebrow_left, eyebrow_right, sigma, size)
125 |
126 | # au6 1 scale below eye bottom
127 | heatmap = add_noise(heatmap, 3, label, [eye_left[0], eye_left[1]+IOD], [eye_right[0], eye_right[1]+IOD], sigma, size)
128 |
129 | # au9 0.5 scale below lmk 39, 42 / lmk 39, 42 / lmk 21, 22
130 | heatmap = add_noise(heatmap, 4, label, lmk[39], lmk[42], sigma, size)
131 |
132 | # au12 lmk 48, 54
133 | heatmap = add_noise(heatmap, 5, label, lmk[48], lmk[54], sigma, size)
134 |
135 | # au25 lmk 51, 57
136 | heatmap = add_noise(heatmap, 6, label, lmk[51], lmk[57], sigma, size)
137 |
138 | # au26 0.5 scale below lmk 56, 58 / lmk 56, 58
139 | heatmap = add_noise(heatmap, 7, label, lmk[56], lmk[58], sigma, size)
140 |
141 | heatmap = np.clip(heatmap, -1., 1.)
142 |
143 | return heatmap
144 |
145 | def heatmap2au(heatmap):
146 | avg = torch.mean(heatmap, dim=(2,3))
147 | label = (avg > 0).int()
148 |
149 | return label
150 |
151 |
152 | class MyDataset(Dataset):
153 | def __init__(self, file_list, transform, type, args):
154 | self.file_list = file_list
155 | self.transform = transform
156 | self.args = args
157 |
158 | self.data_root = args['image_path']
159 | self.images = self.file_list['image_path']
160 | self.type = type
161 | if args['dataset'] == 'BP4D':
162 | self.labels = [
163 | self.file_list['au1'],
164 | self.file_list['au2'],
165 | self.file_list['au4'],
166 | self.file_list['au6'],
167 | self.file_list['au7'],
168 | self.file_list['au10'],
169 | self.file_list['au12'],
170 | self.file_list['au14'],
171 | self.file_list['au15'],
172 | self.file_list['au17'],
173 | self.file_list['au23'],
174 | self.file_list['au24'],
175 | ]
176 | else:
177 | self.labels = [
178 | self.file_list['au1'],
179 | self.file_list['au2'],
180 | self.file_list['au4'],
181 | self.file_list['au6'],
182 | self.file_list['au9'],
183 | self.file_list['au12'],
184 | self.file_list['au25'],
185 | self.file_list['au26']
186 | ]
187 | self.num_labels = len(self.labels)
188 |
189 | def __getitem__(self, index):
190 | # load image
191 | image_path = os.path.join(self.data_root, self.images[index])
192 |
193 | image = Image.open(image_path)
194 | image = self.transform(image)
195 |
196 | # load label
197 | label = []
198 | for i in range(self.num_labels):
199 | label.append(int(self.labels[i][index]))
200 | label = torch.FloatTensor(label)
201 |
202 | if self.type == 'train':
203 | heatmap = au2heatmap(image_path, label, self.args['dim'], self.args['sigma'], self.args) # [x,128,128]
204 | heatmap = torch.from_numpy(heatmap)
205 |
206 | return image, label, heatmap
207 | else:
208 | return image, label
209 |
210 | def collate_fn(self, data):
211 | if self.type == 'train':
212 | images, labels, heatmaps = zip(*data)
213 |
214 | images = torch.stack(images).cuda()
215 | labels = torch.stack(labels).cuda()
216 | heatmaps = torch.stack(heatmaps).float().cuda()
217 |
218 | return images, labels, heatmaps
219 | else:
220 | images, labels = zip(*data)
221 |
222 | images = torch.stack(images).cuda()
223 | labels = torch.stack(labels).cuda()
224 |
225 | return images, labels
226 |
227 | def __len__(self):
228 | return len(self.file_list)
229 |
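A note on the encoding above: `add_noise` writes a Gaussian bump at each AU-related landmark, added when the AU is active and subtracted when it is inactive, and `heatmap2au` recovers the binary label from the sign of the spatial mean. A tiny self-contained illustration (center and sigma are arbitrary example values):

```
# Self-contained illustration of the signed-Gaussian heatmap encoding and label recovery.
import numpy as np
import torch

size, sigma, center = 128, 20.0, (40, 64)
bump = np.fromfunction(
    lambda y, x: np.exp(-((x - center[0]) ** 2 + (y - center[1]) ** 2) / (2.0 * sigma * sigma)),
    (size, size),
)
bump[bump < 0.01] = 0                              # same threshold as add_noise()

heatmap = np.stack([bump, -bump])                  # channel 0: AU active, channel 1: AU inactive
heatmap = torch.from_numpy(heatmap).unsqueeze(0)   # [1, 2, 128, 128], like the dataset heatmaps
labels = (heatmap.mean(dim=(2, 3)) > 0).int()      # what heatmap2au() computes
print(labels)                                      # tensor([[1, 0]], dtype=torch.int32)
```
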
--------------------------------------------------------------------------------
/code/eval_interpreter.py:
--------------------------------------------------------------------------------
1 | import os
2 | os.environ['CUDA_DEVICE_ORDER']='PCI_BUS_ID' # see issue #152
3 |
4 | import torch
5 | from torch.utils.data import DataLoader
6 | import torchvision.transforms as T
7 |
8 | import json
9 | import argparse
10 | import pandas as pd
11 |
12 | from tqdm import tqdm
13 | from sklearn.metrics import f1_score
14 |
15 | from data_utils import heatmap2au, MyDataset
16 | from model_utils import latent_to_image, prepare_model, load_psp_standalone
17 | from models.interpreter import pyramid_interpreter
18 |
19 |
20 | def prepare(args):
21 | g_all, upsamplers = prepare_model(args)
22 |
23 | pspencoder = load_psp_standalone(args['style_encoder_path'], 'cuda')
24 |
25 | transform = T.Compose([
26 | T.ToTensor(),
27 | T.Normalize(mean=[0.5,0.5,0.5], std=[0.5,0.5,0.5]),
28 | ])
29 |
30 | test_list = pd.read_csv(args['test_csv'])
31 |
32 | return test_list, g_all, upsamplers, pspencoder, transform
33 |
34 |
35 | def val(interpreter, val_loader, pspencoder, g_all, upsamplers, args):
36 | interpreter.eval()
37 | pspencoder.eval()
38 | g_all.eval()
39 |
40 | with torch.no_grad():
41 | pred_list = []
42 | gt_list = []
43 | f1_list = []
44 |
45 | for (images, labels) in tqdm(val_loader):
46 | images = images.cuda()
47 | # get latent code
48 | latent_codes = pspencoder(images)
49 | latent_codes = g_all.style(latent_codes.reshape(latent_codes.shape[0]*latent_codes.shape[1], latent_codes.shape[2])).reshape(latent_codes.shape)
50 | # get stylegan features
51 | features = latent_to_image(g_all, upsamplers, latent_codes, upsample=False, args=args)
52 |
53 | heatmaps_pred = interpreter(features)
54 | heatmaps_pred = torch.clamp(heatmaps_pred, min=-1., max=1.)
55 | labels_pred = heatmap2au(heatmaps_pred)
56 |
57 | if 'bp4d' in args['checkpoint_path'] and args['dataset'] == 'DISFA':
58 | indices_bp4d = torch.tensor([0, 1, 2, 3, 6]).cuda()
59 | indices_disfa = torch.tensor([0, 1, 2, 3, 5]).cuda()
60 |
61 | labels = torch.index_select(labels, 1, indices_disfa)
62 | labels_pred = torch.index_select(labels_pred, 1, indices_bp4d)
63 | elif 'disfa' in args['checkpoint_path'] and args['dataset'] == 'BP4D':
64 | indices_bp4d = torch.tensor([0, 1, 2, 3, 6]).cuda()
65 | indices_disfa = torch.tensor([0, 1, 2, 3, 5]).cuda()
66 |
67 | labels = torch.index_select(labels, 1, indices_bp4d)
68 | labels_pred = torch.index_select(labels_pred, 1, indices_disfa)
69 |
70 | pred_list.append(labels_pred.detach().cpu())
71 | gt_list.append(labels.detach().cpu())
72 |
73 | pred_list = torch.cat(pred_list, dim=0).numpy()
74 | gt_list = torch.cat(gt_list, dim=0).numpy()
75 |
76 | for i in range(gt_list.shape[1]):
77 | f1_list.append(100.0*f1_score(gt_list[:, i], pred_list[:, i]))
78 |
79 | return sum(f1_list)/len(f1_list), f1_list
80 |
81 |
82 | def main(args):
83 | print('Prepare model')
84 | val_list, g_all, upsamplers, pspencoder, transform = prepare(args)
85 |
86 | if 'bp4d' in args['checkpoint_path']:
87 | num_labels = 12
88 | else:
89 | num_labels = 8
90 | num_labels = 32 # bug in model checkpoint
91 |
92 | interpreter = pyramid_interpreter(num_labels, 0.1).cuda()
93 |
94 | checkpoint = torch.load(args['checkpoint_path'])
95 | interpreter.load_state_dict(checkpoint['interpreter'])
96 | g_all.load_state_dict(checkpoint['g_all'])
97 | pspencoder.load_state_dict(checkpoint['pspencoder'])
98 |
99 | print('Prepare data')
100 | val_data = MyDataset(val_list, transform, 'test', args)
101 | val_loader = DataLoader(dataset=val_data, batch_size=args['batch_size'],
102 | shuffle=False, collate_fn=val_data.collate_fn)
103 |
104 | print('Start evaluation')
105 | aus = [1,2,4,6,12]
106 | val_f1, val_f1_list = val(interpreter, val_loader, pspencoder, g_all, upsamplers, args)
107 | print('Val avg F1: {:.2f}'.format(val_f1))
108 | for i in range(len(aus)):
109 | print('AU {}: {:.2f}'.format(aus[i], val_f1_list[i]), end=' ')
110 | print('')
111 |
112 |
113 | if __name__ == '__main__':
114 | parser = argparse.ArgumentParser()
115 |
116 | parser.add_argument('--exp', type=str)
117 | args = parser.parse_args()
118 | opts = json.load(open(args.exp, 'r'))
119 | print('Opt', opts)
120 |
121 | main(opts)
122 |
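Note on the cross-domain index selection in `val()`: with the label ordering defined in `data_utils.MyDataset` (BP4D: AU 1, 2, 4, 6, 7, 10, 12, 14, 15, 17, 23, 24; DISFA: AU 1, 2, 4, 6, 9, 12, 25, 26), the five AUs shared by both datasets and their channel indices are:

```
# Derivation of the indices_bp4d / indices_disfa tensors used in val() above.
bp4d_aus = [1, 2, 4, 6, 7, 10, 12, 14, 15, 17, 23, 24]
disfa_aus = [1, 2, 4, 6, 9, 12, 25, 26]

shared = [au for au in bp4d_aus if au in disfa_aus]     # [1, 2, 4, 6, 12]
indices_bp4d = [bp4d_aus.index(au) for au in shared]    # [0, 1, 2, 3, 6]
indices_disfa = [disfa_aus.index(au) for au in shared]  # [0, 1, 2, 3, 5]
```
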
--------------------------------------------------------------------------------
/code/experiments/bp4d_0.json:
--------------------------------------------------------------------------------
1 | {
2 | "exp_dir": "checkpoints/bp4d_0",
3 | "batch_size": 16,
4 | "dataset": "BP4D",
5 | "dim": 128,
6 | "num_labels": 12,
7 | "num_epochs": 15,
8 | "learning_rate": 5e-5,
9 | "weight_decay": 5e-4,
10 | "sigma": 20.0,
11 | "image_path": "../data/",
12 | "train_csv": "../data/BP4D/labels/0/train.csv",
13 | "test_csv": "../data/BP4D/labels/0/test.csv",
14 | "stylegan_checkpoint": "checkpoints/stylegan2-ffhq-config-f.pt",
15 | "style_encoder_path": "checkpoints/encoder.pt",
16 | "interpreter": "pyramid",
17 | "interval": 500,
18 | "dropout": 0.1,
19 | "seed": 0
20 | }
21 |
--------------------------------------------------------------------------------
/code/experiments/disfa_0.json:
--------------------------------------------------------------------------------
1 | {
2 | "exp_dir": "checkpoints/disfa_0",
3 | "batch_size": 16,
4 | "dataset": "DISFA",
5 | "dim": 128,
6 | "num_labels": 8,
7 | "num_epochs": 15,
8 | "learning_rate": 5e-5,
9 | "weight_decay": 5e-4,
10 | "sigma": 20.0,
11 | "image_path": "../data/",
12 | "train_csv": "../data/DISFA/labels/0/train.csv",
13 | "test_csv": "../data/DISFA/labels/0/test.csv",
14 | "stylegan_checkpoint": "checkpoints/stylegan2-ffhq-config-f.pt",
15 | "style_encoder_path": "checkpoints/encoder.pt",
16 | "interpreter": "pyramid",
17 | "interval": 500,
18 | "dropout": 0.1,
19 | "seed": 0
20 | }
21 |
--------------------------------------------------------------------------------
/code/experiments/eval_b2d.json:
--------------------------------------------------------------------------------
1 | {
2 | "batch_size": 32,
3 | "dataset": "DISFA",
4 | "dim": 128,
5 | "num_labels": 8,
6 | "sigma": 20.0,
7 | "image_path": "../data/",
8 | "train_csv": "../data/DISFA/labels/0/train.csv",
9 | "test_csv": "../data/DISFA/labels/0/test.csv",
10 | "stylegan_checkpoint": "checkpoints/stylegan2-ffhq-config-f.pt",
11 | "style_encoder_path": "checkpoints/encoder.pt",
12 | "checkpoint_path": "checkpoints/bp4d_0/model.pth",
13 | "dropout": 0.1,
14 | "seed": 0
15 | }
16 |
--------------------------------------------------------------------------------
/code/experiments/eval_d2b.json:
--------------------------------------------------------------------------------
1 | {
2 | "batch_size": 32,
3 | "dataset": "BP4D",
4 | "dim": 128,
5 | "num_labels": 12,
6 | "sigma": 20.0,
7 | "image_path": "../data/",
8 | "train_csv": "../data/BP4D/labels/0/train.csv",
9 | "test_csv": "../data/BP4D/labels/0/test.csv",
10 | "stylegan_checkpoint": "checkpoints/stylegan2-ffhq-config-f.pt",
11 | "style_encoder_path": "checkpoints/encoder.pt",
12 | "checkpoint_path": "checkpoints/disfa_0/model.pth",
13 | "dropout": 0.1,
14 | "seed": 0
15 | }
16 |
--------------------------------------------------------------------------------
/code/experiments/single_image_inference_bp4d.json:
--------------------------------------------------------------------------------
1 | {
2 | "dataset": "BP4D",
3 | "dim": 128,
4 | "num_labels": 12,
5 | "stylegan_checkpoint": "checkpoints/stylegan2-ffhq-config-f.pt",
6 | "style_encoder_path": "checkpoints/encoder.pt",
7 | "checkpoint_path": "checkpoints/bp4d_0/model.pth",
8 | "dropout": 0.1,
9 | "seed": 0
10 | }
11 |
--------------------------------------------------------------------------------
/code/experiments/single_image_inference_disfa.json:
--------------------------------------------------------------------------------
1 | {
2 | "dataset": "DISFA",
3 | "dim": 128,
4 | "num_labels": 8,
5 | "stylegan_checkpoint": "checkpoints/stylegan2-ffhq-config-f.pt",
6 | "style_encoder_path": "checkpoints/encoder.pt",
7 | "checkpoint_path": "checkpoints/disfa_0/model.pth",
8 | "dropout": 0.1,
9 | "seed": 0
10 | }
11 |
--------------------------------------------------------------------------------
/code/model_utils.py:
--------------------------------------------------------------------------------
1 | import math
2 | import argparse
3 | from PIL import Image
4 | from math import cos, pi
5 |
6 | import torch
7 | import torch.nn as nn
8 |
9 | from models.encoders.psp_encoders import GradualStyleEncoder
10 | from models.stylegan2_pytorch.stylegan2_pytorch import Generator as Stylegan2Generator
11 |
12 |
13 | def load_psp_standalone(checkpoint_path, device='cuda'):
14 | ckpt = torch.load(checkpoint_path, map_location='cpu')
15 | opts = ckpt['opts']
16 | if 'output_size' not in opts:
17 | opts['output_size'] = 1024
18 | opts['n_styles'] = int(math.log(opts['output_size'], 2)) * 2 - 2
19 | opts = argparse.Namespace(**opts)
20 | psp = GradualStyleEncoder(50, 'ir_se', opts)
21 | psp_dict = {k.replace('encoder.', ''): v for k, v in ckpt['state_dict'].items() if k.startswith('encoder.')}
22 | psp.load_state_dict(psp_dict)
23 | psp = psp.to(device)
24 | latent_avg = ckpt['latent_avg'].to(device)
25 |
26 | def add_latent_avg(model, inputs, outputs):
27 | return outputs + latent_avg.repeat(outputs.shape[0], 1, 1)
28 |
29 | psp.register_forward_hook(add_latent_avg)
30 | return psp
31 |
32 |
33 | class Interpolate(nn.Module):
34 | def __init__(self, size, mode, align_corners=False):
35 | super(Interpolate, self).__init__()
36 | self.interp = nn.functional.interpolate
37 | self.size = size
38 | self.mode = mode
39 | self.align_corners = align_corners
40 |
41 | def forward(self, x):
42 | if self.align_corners:
43 | x = self.interp(x, size=self.size, mode=self.mode, align_corners=False)
44 | else:
45 | x = self.interp(x, size=self.size, mode=self.mode, align_corners=True)
46 | return x
47 |
48 |
49 | downsample = Interpolate(256, 'bilinear')
50 | def latent_to_image(g_all, upsamplers, style_latents, upsample=True, args=None):
51 | images_recon, affine_layers = g_all.g_synthesis(style_latents, noise=None)
52 | images_recon = downsample(images_recon)
53 |
54 | feature_layer_list = [1,3,5,7,9,11,13,15,17]
55 |
56 | affine_layers_upsamples = []
57 | if not upsample:
58 | for i in feature_layer_list:
59 | affine_layers_upsamples.append(affine_layers[i])
60 | else:
61 | for i in feature_layer_list:
62 | affine_layers_upsamples.append(upsamplers[i](affine_layers[i]))
63 | affine_layers_upsamples = torch.cat(affine_layers_upsamples, dim=1)
64 |
65 | return affine_layers_upsamples
66 |
67 |
68 | def prepare_model(args):
69 | res = 1024
70 | out_res = args['dim']
71 |
72 | g_all = Stylegan2Generator(res, 512, 8, channel_multiplier=2, randomize_noise=False).cuda()
73 | checkpoint = torch.load(args['stylegan_checkpoint'])
74 | g_all.load_state_dict(checkpoint['g_ema'], strict=True)
75 | g_all.make_mean_latent(5000)
76 |
77 | mode = 'bilinear'
78 |
79 | bi_upsamplers = [
80 | nn.Upsample(scale_factor=out_res / 4, mode=mode, align_corners=True),
81 | nn.Upsample(scale_factor=out_res / 4, mode=mode, align_corners=True),
82 | nn.Upsample(scale_factor=out_res / 8, mode=mode, align_corners=True),
83 | nn.Upsample(scale_factor=out_res / 8, mode=mode, align_corners=True),
84 | nn.Upsample(scale_factor=out_res / 16, mode=mode, align_corners=True),
85 | nn.Upsample(scale_factor=out_res / 16, mode=mode, align_corners=True),
86 | nn.Upsample(scale_factor=out_res / 32, mode=mode, align_corners=True),
87 | nn.Upsample(scale_factor=out_res / 32, mode=mode, align_corners=True),
88 | nn.Upsample(scale_factor=out_res / 64, mode=mode, align_corners=True),
89 | nn.Upsample(scale_factor=out_res / 64, mode=mode, align_corners=True),
90 | nn.Upsample(scale_factor=out_res / 128, mode=mode, align_corners=True),
91 | nn.Upsample(scale_factor=out_res / 128, mode=mode, align_corners=True),
92 | Interpolate(out_res, mode),
93 | Interpolate(out_res, mode),
94 | Interpolate(out_res, mode),
95 | Interpolate(out_res, mode),
96 | Interpolate(out_res, mode),
97 | Interpolate(out_res, mode)
98 | ]
99 |
100 | return g_all, bi_upsamplers
101 |
--------------------------------------------------------------------------------
/code/models/encoders/helpers.py:
--------------------------------------------------------------------------------
1 | from collections import namedtuple
2 | import torch
3 | from torch.nn import Conv2d, BatchNorm2d, PReLU, ReLU, Sigmoid, MaxPool2d, AdaptiveAvgPool2d, Sequential, Module
4 |
5 | """
6 | ArcFace implementation from [TreB1eN](https://github.com/TreB1eN/InsightFace_Pytorch)
7 | """
8 |
9 |
10 | class Flatten(Module):
11 | def forward(self, input):
12 | return input.view(input.size(0), -1)
13 |
14 |
15 | def l2_norm(input, axis=1):
16 | norm = torch.norm(input, 2, axis, True)
17 | output = torch.div(input, norm)
18 | return output
19 |
20 |
21 | class Bottleneck(namedtuple('Block', ['in_channel', 'depth', 'stride'])):
22 | """ A named tuple describing a ResNet block. """
23 |
24 |
25 | def get_block(in_channel, depth, num_units, stride=2):
26 | return [Bottleneck(in_channel, depth, stride)] + [Bottleneck(depth, depth, 1) for i in range(num_units - 1)]
27 |
28 |
29 | def get_blocks(num_layers):
30 | if num_layers == 50:
31 | blocks = [
32 | get_block(in_channel=64, depth=64, num_units=3),
33 | get_block(in_channel=64, depth=128, num_units=4),
34 | get_block(in_channel=128, depth=256, num_units=14),
35 | get_block(in_channel=256, depth=512, num_units=3)
36 | ]
37 | elif num_layers == 100:
38 | blocks = [
39 | get_block(in_channel=64, depth=64, num_units=3),
40 | get_block(in_channel=64, depth=128, num_units=13),
41 | get_block(in_channel=128, depth=256, num_units=30),
42 | get_block(in_channel=256, depth=512, num_units=3)
43 | ]
44 | elif num_layers == 152:
45 | blocks = [
46 | get_block(in_channel=64, depth=64, num_units=3),
47 | get_block(in_channel=64, depth=128, num_units=8),
48 | get_block(in_channel=128, depth=256, num_units=36),
49 | get_block(in_channel=256, depth=512, num_units=3)
50 | ]
51 | else:
52 | raise ValueError("Invalid number of layers: {}. Must be one of [50, 100, 152]".format(num_layers))
53 | return blocks
54 |
55 |
56 | class SEModule(Module):
57 | def __init__(self, channels, reduction):
58 | super(SEModule, self).__init__()
59 | self.avg_pool = AdaptiveAvgPool2d(1)
60 | self.fc1 = Conv2d(channels, channels // reduction, kernel_size=1, padding=0, bias=False)
61 | self.relu = ReLU(inplace=True)
62 | self.fc2 = Conv2d(channels // reduction, channels, kernel_size=1, padding=0, bias=False)
63 | self.sigmoid = Sigmoid()
64 |
65 | def forward(self, x):
66 | module_input = x
67 | x = self.avg_pool(x)
68 | x = self.fc1(x)
69 | x = self.relu(x)
70 | x = self.fc2(x)
71 | x = self.sigmoid(x)
72 | return module_input * x
73 |
74 |
75 | class bottleneck_IR(Module):
76 | def __init__(self, in_channel, depth, stride):
77 | super(bottleneck_IR, self).__init__()
78 | if in_channel == depth:
79 | self.shortcut_layer = MaxPool2d(1, stride)
80 | else:
81 | self.shortcut_layer = Sequential(
82 | Conv2d(in_channel, depth, (1, 1), stride, bias=False),
83 | BatchNorm2d(depth)
84 | )
85 | self.res_layer = Sequential(
86 | BatchNorm2d(in_channel),
87 | Conv2d(in_channel, depth, (3, 3), (1, 1), 1, bias=False), PReLU(depth),
88 | Conv2d(depth, depth, (3, 3), stride, 1, bias=False), BatchNorm2d(depth)
89 | )
90 |
91 | def forward(self, x):
92 | shortcut = self.shortcut_layer(x)
93 | res = self.res_layer(x)
94 | return res + shortcut
95 |
96 |
97 | class bottleneck_IR_SE(Module):
98 | def __init__(self, in_channel, depth, stride):
99 | super(bottleneck_IR_SE, self).__init__()
100 | if in_channel == depth:
101 | self.shortcut_layer = MaxPool2d(1, stride)
102 | else:
103 | self.shortcut_layer = Sequential(
104 | Conv2d(in_channel, depth, (1, 1), stride, bias=False),
105 | BatchNorm2d(depth)
106 | )
107 | self.res_layer = Sequential(
108 | BatchNorm2d(in_channel),
109 | Conv2d(in_channel, depth, (3, 3), (1, 1), 1, bias=False),
110 | PReLU(depth),
111 | Conv2d(depth, depth, (3, 3), stride, 1, bias=False),
112 | BatchNorm2d(depth),
113 | SEModule(depth, 16)
114 | )
115 |
116 | def forward(self, x):
117 | shortcut = self.shortcut_layer(x)
118 | res = self.res_layer(x)
119 | return res + shortcut
120 |
--------------------------------------------------------------------------------
/code/models/encoders/psp_encoders.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import torch
3 | import torch.nn.functional as F
4 | from torch import nn
5 | from torch.nn import Linear, Conv2d, BatchNorm2d, PReLU, Sequential, Module
6 |
7 | from models.encoders.helpers import get_blocks, Flatten, bottleneck_IR, bottleneck_IR_SE
8 | from models.stylegan2_pytorch.stylegan2_pytorch import EqualLinear
9 |
10 | class GradualStyleBlock(Module):
11 | def __init__(self, in_c, out_c, spatial):
12 | super(GradualStyleBlock, self).__init__()
13 | self.out_c = out_c
14 | self.spatial = spatial
15 | num_pools = int(np.log2(spatial))
16 | modules = []
17 | modules += [Conv2d(in_c, out_c, kernel_size=3, stride=2, padding=1),
18 | nn.LeakyReLU()]
19 | for i in range(num_pools - 1):
20 | modules += [
21 | Conv2d(out_c, out_c, kernel_size=3, stride=2, padding=1),
22 | nn.LeakyReLU()
23 | ]
24 | self.convs = nn.Sequential(*modules)
25 | self.linear = EqualLinear(out_c, out_c, lr_mul=1)
26 |
27 | def forward(self, x):
28 | x = self.convs(x)
29 | x = x.view(-1, self.out_c)
30 | x = self.linear(x)
31 | return x
32 |
33 |
34 | class GradualStyleEncoder(Module):
35 | def __init__(self, num_layers, mode='ir', opts=None):
36 | super(GradualStyleEncoder, self).__init__()
37 | assert num_layers in [50, 100, 152], 'num_layers should be 50,100, or 152'
38 | assert mode in ['ir', 'ir_se'], 'mode should be ir or ir_se'
39 | blocks = get_blocks(num_layers)
40 | if mode == 'ir':
41 | unit_module = bottleneck_IR
42 | elif mode == 'ir_se':
43 | unit_module = bottleneck_IR_SE
44 | self.input_layer = Sequential(Conv2d(opts.input_nc, 64, (3, 3), 1, 1, bias=False),
45 | BatchNorm2d(64),
46 | PReLU(64))
47 | modules = []
48 | for block in blocks:
49 | for bottleneck in block:
50 | modules.append(unit_module(bottleneck.in_channel,
51 | bottleneck.depth,
52 | bottleneck.stride))
53 | self.body = Sequential(*modules)
54 |
55 | self.styles = nn.ModuleList()
56 | self.style_count = opts.n_styles
57 | self.coarse_ind = 3
58 | self.middle_ind = 7
59 | for i in range(self.style_count):
60 | if i < self.coarse_ind:
61 | style = GradualStyleBlock(512, 512, 16)
62 | elif i < self.middle_ind:
63 | style = GradualStyleBlock(512, 512, 32)
64 | else:
65 | style = GradualStyleBlock(512, 512, 64)
66 | self.styles.append(style)
67 | self.latlayer1 = nn.Conv2d(256, 512, kernel_size=1, stride=1, padding=0)
68 | self.latlayer2 = nn.Conv2d(128, 512, kernel_size=1, stride=1, padding=0)
69 |
70 | def _upsample_add(self, x, y):
71 | '''Upsample and add two feature maps.
72 | Args:
73 | x: (Variable) top feature map to be upsampled.
74 | y: (Variable) lateral feature map.
75 | Returns:
76 | (Variable) added feature map.
77 | Note in PyTorch, when input size is odd, the upsampled feature map
78 | with `F.upsample(..., scale_factor=2, mode='nearest')`
79 | maybe not equal to the lateral feature map size.
80 | e.g.
81 | original input size: [N,_,15,15] ->
82 | conv2d feature map size: [N,_,8,8] ->
83 | upsampled feature map size: [N,_,16,16]
84 | So we choose bilinear upsample which supports arbitrary output sizes.
85 | '''
86 | _, _, H, W = y.size()
87 | return F.interpolate(x, size=(H, W), mode='bilinear', align_corners=True) + y
88 |
89 | def forward(self, x):
90 | x = self.input_layer(x)
91 |
92 | latents = []
93 | modulelist = list(self.body._modules.values())
94 | for i, l in enumerate(modulelist):
95 | x = l(x)
96 | if i == 6:
97 | c1 = x
98 | elif i == 20:
99 | c2 = x
100 | elif i == 23:
101 | c3 = x
102 |
103 | for j in range(self.coarse_ind):
104 | latents.append(self.styles[j](c3))
105 |
106 | p2 = self._upsample_add(c3, self.latlayer1(c2))
107 | for j in range(self.coarse_ind, self.middle_ind):
108 | latents.append(self.styles[j](p2))
109 |
110 | p1 = self._upsample_add(p2, self.latlayer2(c1))
111 | for j in range(self.middle_ind, self.style_count):
112 | latents.append(self.styles[j](p1))
113 |
114 | out = torch.stack(latents, dim=1)
115 | return out
116 |
117 |
118 | class BackboneEncoderUsingLastLayerIntoW(Module):
119 | def __init__(self, num_layers, mode='ir', opts=None):
120 | super(BackboneEncoderUsingLastLayerIntoW, self).__init__()
121 | print('Using BackboneEncoderUsingLastLayerIntoW')
122 | assert num_layers in [50, 100, 152], 'num_layers should be 50,100, or 152'
123 | assert mode in ['ir', 'ir_se'], 'mode should be ir or ir_se'
124 | blocks = get_blocks(num_layers)
125 | if mode == 'ir':
126 | unit_module = bottleneck_IR
127 | elif mode == 'ir_se':
128 | unit_module = bottleneck_IR_SE
129 | self.input_layer = Sequential(Conv2d(opts.input_nc, 64, (3, 3), 1, 1, bias=False),
130 | BatchNorm2d(64),
131 | PReLU(64))
132 | self.output_pool = torch.nn.AdaptiveAvgPool2d((1, 1))
133 | self.linear = EqualLinear(512, 512, lr_mul=1)
134 | modules = []
135 | for block in blocks:
136 | for bottleneck in block:
137 | modules.append(unit_module(bottleneck.in_channel,
138 | bottleneck.depth,
139 | bottleneck.stride))
140 | self.body = Sequential(*modules)
141 |
142 | def forward(self, x):
143 | x = self.input_layer(x)
144 | x = self.body(x)
145 | x = self.output_pool(x)
146 | x = x.view(-1, 512)
147 | x = self.linear(x)
148 | return x
149 |
150 |
151 | class BackboneEncoderUsingLastLayerIntoWPlus(Module):
152 | def __init__(self, num_layers, mode='ir', opts=None):
153 | super(BackboneEncoderUsingLastLayerIntoWPlus, self).__init__()
154 | print('Using BackboneEncoderUsingLastLayerIntoWPlus')
155 | assert num_layers in [50, 100, 152], 'num_layers should be 50,100, or 152'
156 | assert mode in ['ir', 'ir_se'], 'mode should be ir or ir_se'
157 | blocks = get_blocks(num_layers)
158 | if mode == 'ir':
159 | unit_module = bottleneck_IR
160 | elif mode == 'ir_se':
161 | unit_module = bottleneck_IR_SE
162 | self.n_styles = opts.n_styles
163 | self.input_layer = Sequential(Conv2d(opts.input_nc, 64, (3, 3), 1, 1, bias=False),
164 | BatchNorm2d(64),
165 | PReLU(64))
166 | self.output_layer_2 = Sequential(BatchNorm2d(512),
167 | torch.nn.AdaptiveAvgPool2d((7, 7)),
168 | Flatten(),
169 | Linear(512 * 7 * 7, 512))
170 | self.linear = EqualLinear(512, 512 * self.n_styles, lr_mul=1)
171 | modules = []
172 | for block in blocks:
173 | for bottleneck in block:
174 | modules.append(unit_module(bottleneck.in_channel,
175 | bottleneck.depth,
176 | bottleneck.stride))
177 | self.body = Sequential(*modules)
178 |
179 | def forward(self, x):
180 | x = self.input_layer(x)
181 | x = self.body(x)
182 | x = self.output_layer_2(x)
183 | x = self.linear(x)
184 | x = x.view(-1, self.n_styles, 512)
185 | return x
186 |
--------------------------------------------------------------------------------
/code/models/interpreter.py:
--------------------------------------------------------------------------------
1 | import torch.nn as nn
2 | import torch.nn.functional as F
3 |
4 |
5 | class Interpolate(nn.Module):
6 | def __init__(self, size, mode, align_corners=False):
7 | super(Interpolate, self).__init__()
8 | self.interp = nn.functional.interpolate
9 | self.size = size
10 | self.mode = mode
11 | self.align_corners = align_corners
12 |
13 | def forward(self, x):
14 | if self.align_corners:
15 | x = self.interp(x, size=self.size, mode=self.mode, align_corners=False)
16 | else:
17 | x = self.interp(x, size=self.size, mode=self.mode, align_corners=True)
18 | return x
19 |
20 |
21 | class pyramid_interpreter(nn.Module):
22 | def __init__(self, points, dropout):
23 | super(pyramid_interpreter, self).__init__()
24 | self.upsample = nn.Upsample(scale_factor=2, mode='bilinear')
25 | self.layers = []
26 | self.layer0 = nn.Sequential(
27 | nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True),
28 | nn.Conv2d(512, 512, kernel_size=3, padding=1),
29 | nn.ReLU(),
30 | nn.Dropout(p=dropout),
31 | nn.BatchNorm2d(num_features=512),
32 | )
33 | self.layer1 = nn.Sequential(
34 | nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True),
35 | nn.Conv2d(512, 512, kernel_size=5, padding=2),
36 | nn.ReLU(),
37 | nn.Dropout(p=dropout),
38 | nn.BatchNorm2d(num_features=512),
39 | )
40 | self.layer2 = nn.Sequential(
41 | nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True),
42 | nn.Conv2d(512, 512, kernel_size=5, padding=2),
43 | nn.ReLU(),
44 | nn.Dropout(p=dropout),
45 | nn.BatchNorm2d(num_features=512),
46 | )
47 | self.layer3 = nn.Sequential(
48 | nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True),
49 | nn.Conv2d(512, 512, kernel_size=5, padding=2),
50 | nn.ReLU(),
51 | nn.Dropout(p=dropout),
52 | nn.BatchNorm2d(num_features=512),
53 | )
54 | self.layer4 = nn.Sequential(
55 | nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True),
56 | nn.Conv2d(512, 256, kernel_size=5, padding=2),
57 | nn.ReLU(),
58 | nn.Dropout(p=dropout),
59 | nn.BatchNorm2d(num_features=256),
60 | )
61 | self.layer5 = nn.Sequential(
62 | nn.Conv2d(256, 128, kernel_size=5, padding=2),
63 | nn.ReLU(),
64 | nn.Dropout(p=dropout),
65 | nn.BatchNorm2d(num_features=128),
66 | )
67 | self.layer6 = nn.Sequential(
68 | nn.Conv2d(128, 64, kernel_size=5, padding=2),
69 | nn.ReLU(),
70 | nn.Dropout(p=dropout),
71 | nn.BatchNorm2d(num_features=64),
72 | )
73 | self.layer7 = nn.Sequential(
74 | nn.Conv2d(64, 32, kernel_size=5, padding=2),
75 | nn.ReLU(),
76 | nn.Dropout(p=dropout),
77 | nn.BatchNorm2d(num_features=32),
78 | )
79 | self.layer8 = nn.Sequential(
80 | nn.Conv2d(32, 32, kernel_size=5, padding=2),
81 | nn.ReLU(),
82 | nn.Dropout(p=dropout),
83 | nn.BatchNorm2d(num_features=32),
84 | nn.Conv2d(32, points, kernel_size=5, padding=2),
85 | )
86 | self.interpolate = Interpolate(128, 'bilinear')
87 |
88 | self.layers.append(self.layer0)
89 | self.layers.append(self.layer1)
90 | self.layers.append(self.layer2)
91 | self.layers.append(self.layer3)
92 | self.layers.append(self.layer4)
93 | self.layers.append(self.layer5)
94 | self.layers.append(self.layer6)
95 | self.layers.append(self.layer7)
96 | self.layers.append(self.layer8)
97 |
98 | def init_weights(self, init_type='normal', gain=0.02):
99 | def init_func(m):
100 | classname = m.__class__.__name__
101 | if hasattr(m, 'weight') and (classname.find('Conv') != -1 or classname.find('Linear') != -1):
102 | if init_type == 'normal':
103 | nn.init.normal_(m.weight.data, 0.0, gain)
104 | elif init_type == 'xavier':
105 | nn.init.xavier_normal_(m.weight.data, gain=gain)
106 | elif init_type == 'kaiming':
107 | nn.init.kaiming_normal_(m.weight.data, a=0, mode='fan_in')
108 | elif init_type == 'orthogonal':
109 | nn.init.orthogonal_(m.weight.data, gain=gain)
110 |
111 | if hasattr(m, 'bias') and m.bias is not None:
112 | nn.init.constant_(m.bias.data, 0.0)
113 |
114 | elif classname.find('BatchNorm2d') != -1:
115 | nn.init.normal_(m.weight.data, 1.0, gain)
116 | nn.init.constant_(m.bias.data, 0.0)
117 |
118 | self.apply(init_func)
119 |
120 | def forward(self, x):
121 | '''
122 | 512, 4, 4
123 | 512, 8, 8
124 | 512, 16, 16
125 | 512, 32, 32
126 | 512, 64, 64
127 | 256, 128, 128
128 | 128, 256, 256
129 | 64, 512, 512
130 | 32, 1024, 1024
131 | '''
132 | for i in range(len(x)):
133 | if i == 0:
134 | hs = x[i]
135 | else:
136 | if x[i].shape[2] > 128:
137 | x[i] = self.interpolate(x[i])
138 | hs = self.layers[i-1](hs) + x[i]
139 |
140 | hs = self.layers[-1](hs)
141 |
142 | return hs
143 |
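A quick shape smoke test for `pyramid_interpreter` (run from the `code/` directory); the nine input resolutions follow the docstring in `forward()`, while the batch size and number of output channels are arbitrary:

```
import torch
from models.interpreter import pyramid_interpreter

# One feature map per pyramid level, shaped as listed in the forward() docstring.
shapes = [(512, 4), (512, 8), (512, 16), (512, 32), (512, 64),
          (256, 128), (128, 256), (64, 512), (32, 1024)]
features = [torch.randn(1, c, s, s) for c, s in shapes]

model = pyramid_interpreter(points=12, dropout=0.1).eval()
with torch.no_grad():
    heatmaps = model(features)
print(heatmaps.shape)  # torch.Size([1, 12, 128, 128])
```
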
--------------------------------------------------------------------------------
/code/models/stylegan2_pytorch/op/__init__.py:
--------------------------------------------------------------------------------
1 | from .fused_act import FusedLeakyReLU, fused_leaky_relu
2 | from .upfirdn2d import upfirdn2d
3 |
--------------------------------------------------------------------------------
/code/models/stylegan2_pytorch/op/fused_act.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | import torch
4 | from torch import nn
5 | from torch.nn import functional as F
6 | from torch.autograd import Function
7 | from torch.utils.cpp_extension import load
8 |
9 |
10 | module_path = os.path.dirname(__file__)
11 | fused = load(
12 | "fused",
13 | sources=[
14 | os.path.join(module_path, "fused_bias_act.cpp"),
15 | os.path.join(module_path, "fused_bias_act_kernel.cu"),
16 | ],
17 | )
18 |
19 |
20 | class FusedLeakyReLUFunctionBackward(Function):
21 | @staticmethod
22 | def forward(ctx, grad_output, out, bias, negative_slope, scale):
23 | ctx.save_for_backward(out)
24 | ctx.negative_slope = negative_slope
25 | ctx.scale = scale
26 |
27 | empty = grad_output.new_empty(0)
28 |
29 | grad_input = fused.fused_bias_act(
30 | grad_output, empty, out, 3, 1, negative_slope, scale
31 | )
32 |
33 | dim = [0]
34 |
35 | if grad_input.ndim > 2:
36 | dim += list(range(2, grad_input.ndim))
37 |
38 | if bias:
39 | grad_bias = grad_input.sum(dim).detach()
40 |
41 | else:
42 | grad_bias = empty
43 |
44 | return grad_input, grad_bias
45 |
46 | @staticmethod
47 | def backward(ctx, gradgrad_input, gradgrad_bias):
48 | out, = ctx.saved_tensors
49 | gradgrad_out = fused.fused_bias_act(
50 | gradgrad_input, gradgrad_bias, out, 3, 1, ctx.negative_slope, ctx.scale
51 | )
52 |
53 | return gradgrad_out, None, None, None, None
54 |
55 |
56 | class FusedLeakyReLUFunction(Function):
57 | @staticmethod
58 | def forward(ctx, input, bias, negative_slope, scale):
59 | empty = input.new_empty(0)
60 |
61 | ctx.bias = bias is not None
62 |
63 | if bias is None:
64 | bias = empty
65 |
66 | out = fused.fused_bias_act(input, bias, empty, 3, 0, negative_slope, scale)
67 | ctx.save_for_backward(out)
68 | ctx.negative_slope = negative_slope
69 | ctx.scale = scale
70 |
71 | return out
72 |
73 | @staticmethod
74 | def backward(ctx, grad_output):
75 | out, = ctx.saved_tensors
76 |
77 | grad_input, grad_bias = FusedLeakyReLUFunctionBackward.apply(
78 | grad_output, out, ctx.bias, ctx.negative_slope, ctx.scale
79 | )
80 |
81 | if not ctx.bias:
82 | grad_bias = None
83 |
84 | return grad_input, grad_bias, None, None
85 |
86 |
87 | class FusedLeakyReLU(nn.Module):
88 | def __init__(self, channel, bias=True, negative_slope=0.2, scale=2 ** 0.5):
89 | super().__init__()
90 |
91 | if bias:
92 | self.bias = nn.Parameter(torch.zeros(channel))
93 |
94 | else:
95 | self.bias = None
96 |
97 | self.negative_slope = negative_slope
98 | self.scale = scale
99 |
100 | def forward(self, input):
101 | return fused_leaky_relu(input, self.bias, self.negative_slope, self.scale)
102 |
103 |
104 | def fused_leaky_relu(input, bias=None, negative_slope=0.2, scale=2 ** 0.5):
105 | if input.device.type == "cpu":
106 | if bias is not None:
107 | rest_dim = [1] * (input.ndim - bias.ndim - 1)
108 | return (
109 | F.leaky_relu(
110 | input + bias.view(1, bias.shape[0], *rest_dim), negative_slope=0.2
111 | )
112 | * scale
113 | )
114 |
115 | else:
116 | return F.leaky_relu(input, negative_slope=0.2) * scale
117 |
118 | else:
119 | return FusedLeakyReLUFunction.apply(input, bias, negative_slope, scale)
120 |
--------------------------------------------------------------------------------
/code/models/stylegan2_pytorch/op/fused_bias_act.cpp:
--------------------------------------------------------------------------------
1 | #include <torch/extension.h>
2 |
3 |
4 | torch::Tensor fused_bias_act_op(const torch::Tensor& input, const torch::Tensor& bias, const torch::Tensor& refer,
5 | int act, int grad, float alpha, float scale);
6 |
7 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
8 | #define CHECK_CONTIGUOUS(x) TORCH_CHECK(x.is_contiguous(), #x " must be contiguous")
9 | #define CHECK_INPUT(x) CHECK_CUDA(x); CHECK_CONTIGUOUS(x)
10 |
11 | torch::Tensor fused_bias_act(const torch::Tensor& input, const torch::Tensor& bias, const torch::Tensor& refer,
12 | int act, int grad, float alpha, float scale) {
13 | CHECK_CUDA(input);
14 | CHECK_CUDA(bias);
15 |
16 | return fused_bias_act_op(input, bias, refer, act, grad, alpha, scale);
17 | }
18 |
19 | PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
20 | m.def("fused_bias_act", &fused_bias_act, "fused bias act (CUDA)");
21 | }
--------------------------------------------------------------------------------
/code/models/stylegan2_pytorch/op/fused_bias_act_kernel.cu:
--------------------------------------------------------------------------------
1 | // Copyright (c) 2019, NVIDIA Corporation. All rights reserved.
2 | //
3 | // This work is made available under the Nvidia Source Code License-NC.
4 | // To view a copy of this license, visit
5 | // https://nvlabs.github.io/stylegan2/license.html
6 |
7 | #include <torch/types.h>
8 |
9 | #include <ATen/ATen.h>
10 | #include <ATen/AccumulateType.h>
11 | #include <ATen/cuda/CUDAContext.h>
12 | #include <ATen/cuda/CUDAApplyUtils.cuh>
13 |
14 | #include <cuda.h>
15 | #include <cuda_runtime.h>
16 |
17 |
18 | template <typename scalar_t>
19 | static __global__ void fused_bias_act_kernel(scalar_t* out, const scalar_t* p_x, const scalar_t* p_b, const scalar_t* p_ref,
20 | int act, int grad, scalar_t alpha, scalar_t scale, int loop_x, int size_x, int step_b, int size_b, int use_bias, int use_ref) {
21 | int xi = blockIdx.x * loop_x * blockDim.x + threadIdx.x;
22 |
23 | scalar_t zero = 0.0;
24 |
25 | for (int loop_idx = 0; loop_idx < loop_x && xi < size_x; loop_idx++, xi += blockDim.x) {
26 | scalar_t x = p_x[xi];
27 |
28 | if (use_bias) {
29 | x += p_b[(xi / step_b) % size_b];
30 | }
31 |
32 | scalar_t ref = use_ref ? p_ref[xi] : zero;
33 |
34 | scalar_t y;
35 |
36 | switch (act * 10 + grad) {
37 | default:
38 | case 10: y = x; break;
39 | case 11: y = x; break;
40 | case 12: y = 0.0; break;
41 |
42 | case 30: y = (x > 0.0) ? x : x * alpha; break;
43 | case 31: y = (ref > 0.0) ? x : x * alpha; break;
44 | case 32: y = 0.0; break;
45 | }
46 |
47 | out[xi] = y * scale;
48 | }
49 | }
50 |
51 |
52 | torch::Tensor fused_bias_act_op(const torch::Tensor& input, const torch::Tensor& bias, const torch::Tensor& refer,
53 | int act, int grad, float alpha, float scale) {
54 | int curDevice = -1;
55 | cudaGetDevice(&curDevice);
56 | cudaStream_t stream = at::cuda::getCurrentCUDAStream(curDevice);
57 |
58 | auto x = input.contiguous();
59 | auto b = bias.contiguous();
60 | auto ref = refer.contiguous();
61 |
62 | int use_bias = b.numel() ? 1 : 0;
63 | int use_ref = ref.numel() ? 1 : 0;
64 |
65 | int size_x = x.numel();
66 | int size_b = b.numel();
67 | int step_b = 1;
68 |
69 | for (int i = 1 + 1; i < x.dim(); i++) {
70 | step_b *= x.size(i);
71 | }
72 |
73 | int loop_x = 4;
74 | int block_size = 4 * 32;
75 | int grid_size = (size_x - 1) / (loop_x * block_size) + 1;
76 |
77 | auto y = torch::empty_like(x);
78 |
79 | AT_DISPATCH_FLOATING_TYPES_AND_HALF(x.scalar_type(), "fused_bias_act_kernel", [&] {
80 |     fused_bias_act_kernel<scalar_t><<<grid_size, block_size, 0, stream>>>(
81 |         y.data_ptr<scalar_t>(),
82 |         x.data_ptr<scalar_t>(),
83 |         b.data_ptr<scalar_t>(),
84 |         ref.data_ptr<scalar_t>(),
85 | act,
86 | grad,
87 | alpha,
88 | scale,
89 | loop_x,
90 | size_x,
91 | step_b,
92 | size_b,
93 | use_bias,
94 | use_ref
95 | );
96 | });
97 |
98 | return y;
99 | }
--------------------------------------------------------------------------------
/code/models/stylegan2_pytorch/op/upfirdn2d.cpp:
--------------------------------------------------------------------------------
1 | #include <torch/extension.h>
2 |
3 |
4 | torch::Tensor upfirdn2d_op(const torch::Tensor& input, const torch::Tensor& kernel,
5 | int up_x, int up_y, int down_x, int down_y,
6 | int pad_x0, int pad_x1, int pad_y0, int pad_y1);
7 |
8 | #define CHECK_CUDA(x) TORCH_CHECK(x.type().is_cuda(), #x " must be a CUDA tensor")
9 | #define CHECK_CONTIGUOUS(x) TORCH_CHECK(x.is_contiguous(), #x " must be contiguous")
10 | #define CHECK_INPUT(x) CHECK_CUDA(x); CHECK_CONTIGUOUS(x)
11 |
12 | torch::Tensor upfirdn2d(const torch::Tensor& input, const torch::Tensor& kernel,
13 | int up_x, int up_y, int down_x, int down_y,
14 | int pad_x0, int pad_x1, int pad_y0, int pad_y1) {
15 | CHECK_CUDA(input);
16 | CHECK_CUDA(kernel);
17 |
18 | return upfirdn2d_op(input, kernel, up_x, up_y, down_x, down_y, pad_x0, pad_x1, pad_y0, pad_y1);
19 | }
20 |
21 | PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
22 | m.def("upfirdn2d", &upfirdn2d, "upfirdn2d (CUDA)");
23 | }
--------------------------------------------------------------------------------
/code/models/stylegan2_pytorch/op/upfirdn2d.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | import torch
4 | from torch.nn import functional as F
5 | from torch.autograd import Function
6 | from torch.utils.cpp_extension import load
7 |
8 | module_path = os.path.dirname(__file__)
9 | upfirdn2d_op = load(
10 | "upfirdn2d",
11 | sources=[
12 | os.path.join(module_path, "upfirdn2d.cpp"),
13 | os.path.join(module_path, "upfirdn2d_kernel.cu"),
14 | ],
15 | )
16 |
17 |
18 | class UpFirDn2dBackward(Function):
19 | @staticmethod
20 | def forward(
21 | ctx, grad_output, kernel, grad_kernel, up, down, pad, g_pad, in_size, out_size
22 | ):
23 |
24 | up_x, up_y = up
25 | down_x, down_y = down
26 | g_pad_x0, g_pad_x1, g_pad_y0, g_pad_y1 = g_pad
27 |
28 | grad_output = grad_output.reshape(-1, out_size[0], out_size[1], 1)
29 |
30 | grad_input = upfirdn2d_op.upfirdn2d(
31 | grad_output,
32 | grad_kernel,
33 | down_x,
34 | down_y,
35 | up_x,
36 | up_y,
37 | g_pad_x0,
38 | g_pad_x1,
39 | g_pad_y0,
40 | g_pad_y1,
41 | )
42 | grad_input = grad_input.view(in_size[0], in_size[1], in_size[2], in_size[3])
43 |
44 | ctx.save_for_backward(kernel)
45 |
46 | pad_x0, pad_x1, pad_y0, pad_y1 = pad
47 |
48 | ctx.up_x = up_x
49 | ctx.up_y = up_y
50 | ctx.down_x = down_x
51 | ctx.down_y = down_y
52 | ctx.pad_x0 = pad_x0
53 | ctx.pad_x1 = pad_x1
54 | ctx.pad_y0 = pad_y0
55 | ctx.pad_y1 = pad_y1
56 | ctx.in_size = in_size
57 | ctx.out_size = out_size
58 |
59 | return grad_input
60 |
61 | @staticmethod
62 | def backward(ctx, gradgrad_input):
63 | kernel, = ctx.saved_tensors
64 |
65 | gradgrad_input = gradgrad_input.reshape(-1, ctx.in_size[2], ctx.in_size[3], 1)
66 |
67 | gradgrad_out = upfirdn2d_op.upfirdn2d(
68 | gradgrad_input,
69 | kernel,
70 | ctx.up_x,
71 | ctx.up_y,
72 | ctx.down_x,
73 | ctx.down_y,
74 | ctx.pad_x0,
75 | ctx.pad_x1,
76 | ctx.pad_y0,
77 | ctx.pad_y1,
78 | )
79 | # gradgrad_out = gradgrad_out.view(ctx.in_size[0], ctx.out_size[0], ctx.out_size[1], ctx.in_size[3])
80 | gradgrad_out = gradgrad_out.view(
81 | ctx.in_size[0], ctx.in_size[1], ctx.out_size[0], ctx.out_size[1]
82 | )
83 |
84 | return gradgrad_out, None, None, None, None, None, None, None, None
85 |
86 |
87 | class UpFirDn2d(Function):
88 | @staticmethod
89 | def forward(ctx, input, kernel, up, down, pad):
90 | up_x, up_y = up
91 | down_x, down_y = down
92 | pad_x0, pad_x1, pad_y0, pad_y1 = pad
93 |
94 | kernel_h, kernel_w = kernel.shape
95 | batch, channel, in_h, in_w = input.shape
96 | ctx.in_size = input.shape
97 |
98 | input = input.reshape(-1, in_h, in_w, 1)
99 |
100 | ctx.save_for_backward(kernel, torch.flip(kernel, [0, 1]))
101 |
102 | out_h = (in_h * up_y + pad_y0 + pad_y1 - kernel_h) // down_y + 1
103 | out_w = (in_w * up_x + pad_x0 + pad_x1 - kernel_w) // down_x + 1
104 | ctx.out_size = (out_h, out_w)
105 |
106 | ctx.up = (up_x, up_y)
107 | ctx.down = (down_x, down_y)
108 | ctx.pad = (pad_x0, pad_x1, pad_y0, pad_y1)
109 |
110 | g_pad_x0 = kernel_w - pad_x0 - 1
111 | g_pad_y0 = kernel_h - pad_y0 - 1
112 | g_pad_x1 = in_w * up_x - out_w * down_x + pad_x0 - up_x + 1
113 | g_pad_y1 = in_h * up_y - out_h * down_y + pad_y0 - up_y + 1
114 |
115 | ctx.g_pad = (g_pad_x0, g_pad_x1, g_pad_y0, g_pad_y1)
116 |
117 | out = upfirdn2d_op.upfirdn2d(
118 | input, kernel, up_x, up_y, down_x, down_y, pad_x0, pad_x1, pad_y0, pad_y1
119 | )
120 | # out = out.view(major, out_h, out_w, minor)
121 | out = out.view(-1, channel, out_h, out_w)
122 |
123 | return out
124 |
125 | @staticmethod
126 | def backward(ctx, grad_output):
127 | kernel, grad_kernel = ctx.saved_tensors
128 |
129 | grad_input = UpFirDn2dBackward.apply(
130 | grad_output,
131 | kernel,
132 | grad_kernel,
133 | ctx.up,
134 | ctx.down,
135 | ctx.pad,
136 | ctx.g_pad,
137 | ctx.in_size,
138 | ctx.out_size,
139 | )
140 |
141 | return grad_input, None, None, None, None
142 |
143 |
144 | def upfirdn2d(input, kernel, up=1, down=1, pad=(0, 0)):
145 | if input.device.type == "cpu":
146 | out = upfirdn2d_native(
147 | input, kernel, up, up, down, down, pad[0], pad[1], pad[0], pad[1]
148 | )
149 |
150 | else:
151 | out = UpFirDn2d.apply(
152 | input, kernel, (up, up), (down, down), (pad[0], pad[1], pad[0], pad[1])
153 | )
154 |
155 | return out
156 |
157 |
158 | def upfirdn2d_native(
159 | input, kernel, up_x, up_y, down_x, down_y, pad_x0, pad_x1, pad_y0, pad_y1
160 | ):
161 | _, channel, in_h, in_w = input.shape
162 | input = input.reshape(-1, in_h, in_w, 1)
163 |
164 | _, in_h, in_w, minor = input.shape
165 | kernel_h, kernel_w = kernel.shape
166 |
167 | out = input.view(-1, in_h, 1, in_w, 1, minor)
168 | out = F.pad(out, [0, 0, 0, up_x - 1, 0, 0, 0, up_y - 1])
169 | out = out.view(-1, in_h * up_y, in_w * up_x, minor)
170 |
171 | out = F.pad(
172 | out, [0, 0, max(pad_x0, 0), max(pad_x1, 0), max(pad_y0, 0), max(pad_y1, 0)]
173 | )
174 | out = out[
175 | :,
176 | max(-pad_y0, 0) : out.shape[1] - max(-pad_y1, 0),
177 | max(-pad_x0, 0) : out.shape[2] - max(-pad_x1, 0),
178 | :,
179 | ]
180 |
181 | out = out.permute(0, 3, 1, 2)
182 | out = out.reshape(
183 | [-1, 1, in_h * up_y + pad_y0 + pad_y1, in_w * up_x + pad_x0 + pad_x1]
184 | )
185 | w = torch.flip(kernel, [0, 1]).view(1, 1, kernel_h, kernel_w)
186 | out = F.conv2d(out, w)
187 | out = out.reshape(
188 | -1,
189 | minor,
190 | in_h * up_y + pad_y0 + pad_y1 - kernel_h + 1,
191 | in_w * up_x + pad_x0 + pad_x1 - kernel_w + 1,
192 | )
193 | out = out.permute(0, 2, 3, 1)
194 | out = out[:, ::down_y, ::down_x, :]
195 |
196 | out_h = (in_h * up_y + pad_y0 + pad_y1 - kernel_h) // down_y + 1
197 | out_w = (in_w * up_x + pad_x0 + pad_x1 - kernel_w) // down_x + 1
198 |
199 | return out.view(-1, channel, out_h, out_w)
200 |
--------------------------------------------------------------------------------
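
The `upfirdn2d` wrapper above dispatches to the compiled CUDA op for GPU tensors and to `upfirdn2d_native` for CPU tensors. Below is a minimal usage sketch (not part of the repo) of the upsample-and-blur pattern used by the `Upsample` module in `stylegan2_pytorch.py`; it assumes the `models.stylegan2_pytorch.op.upfirdn2d` module imports cleanly (in the stock rosinality layout that import JIT-compiles the CUDA extension).

```python
# Hedged sketch: 2x upsampling with a [1, 3, 3, 1] separable blur kernel,
# using the same padding arithmetic as the Upsample module. On CPU tensors
# upfirdn2d falls back to upfirdn2d_native, so the shapes can be checked
# without a GPU.
import torch
from models.stylegan2_pytorch.op.upfirdn2d import upfirdn2d

k = torch.tensor([1.0, 3.0, 3.0, 1.0])
k = k[None, :] * k[:, None]      # outer product -> 4x4 blur kernel
k = k / k.sum() * 4              # scale by factor**2, as Upsample does

x = torch.randn(1, 3, 8, 8)      # NCHW input
factor = 2
p = k.shape[0] - factor
pad = ((p + 1) // 2 + factor - 1, p // 2)

out = upfirdn2d(x, k, up=factor, down=1, pad=pad)
print(out.shape)                 # torch.Size([1, 3, 16, 16])
```
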
/code/models/stylegan2_pytorch/op/upfirdn2d_kernel.cu:
--------------------------------------------------------------------------------
1 | // Copyright (c) 2019, NVIDIA Corporation. All rights reserved.
2 | //
3 | // This work is made available under the Nvidia Source Code License-NC.
4 | // To view a copy of this license, visit
5 | // https://nvlabs.github.io/stylegan2/license.html
6 |
7 | #include <torch/types.h>
8 |
9 | #include <ATen/ATen.h>
10 | #include <ATen/AccumulateType.h>
11 | #include <ATen/cuda/CUDAContext.h>
12 | #include <ATen/cuda/CUDAApplyUtils.cuh>
13 |
14 | #include <cuda.h>
15 | #include <cuda_runtime.h>
16 |
17 | static __host__ __device__ __forceinline__ int floor_div(int a, int b) {
18 | int c = a / b;
19 |
20 | if (c * b > a) {
21 | c--;
22 | }
23 |
24 | return c;
25 | }
26 |
27 | struct UpFirDn2DKernelParams {
28 | int up_x;
29 | int up_y;
30 | int down_x;
31 | int down_y;
32 | int pad_x0;
33 | int pad_x1;
34 | int pad_y0;
35 | int pad_y1;
36 |
37 | int major_dim;
38 | int in_h;
39 | int in_w;
40 | int minor_dim;
41 | int kernel_h;
42 | int kernel_w;
43 | int out_h;
44 | int out_w;
45 | int loop_major;
46 | int loop_x;
47 | };
48 |
49 | template <typename scalar_t>
50 | __global__ void upfirdn2d_kernel_large(scalar_t *out, const scalar_t *input,
51 | const scalar_t *kernel,
52 | const UpFirDn2DKernelParams p) {
53 | int minor_idx = blockIdx.x * blockDim.x + threadIdx.x;
54 | int out_y = minor_idx / p.minor_dim;
55 | minor_idx -= out_y * p.minor_dim;
56 | int out_x_base = blockIdx.y * p.loop_x * blockDim.y + threadIdx.y;
57 | int major_idx_base = blockIdx.z * p.loop_major;
58 |
59 | if (out_x_base >= p.out_w || out_y >= p.out_h ||
60 | major_idx_base >= p.major_dim) {
61 | return;
62 | }
63 |
64 | int mid_y = out_y * p.down_y + p.up_y - 1 - p.pad_y0;
65 | int in_y = min(max(floor_div(mid_y, p.up_y), 0), p.in_h);
66 | int h = min(max(floor_div(mid_y + p.kernel_h, p.up_y), 0), p.in_h) - in_y;
67 | int kernel_y = mid_y + p.kernel_h - (in_y + 1) * p.up_y;
68 |
69 | for (int loop_major = 0, major_idx = major_idx_base;
70 | loop_major < p.loop_major && major_idx < p.major_dim;
71 | loop_major++, major_idx++) {
72 | for (int loop_x = 0, out_x = out_x_base;
73 | loop_x < p.loop_x && out_x < p.out_w; loop_x++, out_x += blockDim.y) {
74 | int mid_x = out_x * p.down_x + p.up_x - 1 - p.pad_x0;
75 | int in_x = min(max(floor_div(mid_x, p.up_x), 0), p.in_w);
76 | int w = min(max(floor_div(mid_x + p.kernel_w, p.up_x), 0), p.in_w) - in_x;
77 | int kernel_x = mid_x + p.kernel_w - (in_x + 1) * p.up_x;
78 |
79 | const scalar_t *x_p =
80 | &input[((major_idx * p.in_h + in_y) * p.in_w + in_x) * p.minor_dim +
81 | minor_idx];
82 | const scalar_t *k_p = &kernel[kernel_y * p.kernel_w + kernel_x];
83 | int x_px = p.minor_dim;
84 | int k_px = -p.up_x;
85 | int x_py = p.in_w * p.minor_dim;
86 | int k_py = -p.up_y * p.kernel_w;
87 |
88 | scalar_t v = 0.0f;
89 |
90 | for (int y = 0; y < h; y++) {
91 | for (int x = 0; x < w; x++) {
92 | v += static_cast<scalar_t>(*x_p) * static_cast<scalar_t>(*k_p);
93 | x_p += x_px;
94 | k_p += k_px;
95 | }
96 |
97 | x_p += x_py - w * x_px;
98 | k_p += k_py - w * k_px;
99 | }
100 |
101 | out[((major_idx * p.out_h + out_y) * p.out_w + out_x) * p.minor_dim +
102 | minor_idx] = v;
103 | }
104 | }
105 | }
106 |
107 | template <typename scalar_t, int up_x, int up_y, int down_x, int down_y,
108 |           int kernel_h, int kernel_w, int tile_out_h, int tile_out_w>
109 | __global__ void upfirdn2d_kernel(scalar_t *out, const scalar_t *input,
110 | const scalar_t *kernel,
111 | const UpFirDn2DKernelParams p) {
112 | const int tile_in_h = ((tile_out_h - 1) * down_y + kernel_h - 1) / up_y + 1;
113 | const int tile_in_w = ((tile_out_w - 1) * down_x + kernel_w - 1) / up_x + 1;
114 |
115 | __shared__ volatile float sk[kernel_h][kernel_w];
116 | __shared__ volatile float sx[tile_in_h][tile_in_w];
117 |
118 | int minor_idx = blockIdx.x;
119 | int tile_out_y = minor_idx / p.minor_dim;
120 | minor_idx -= tile_out_y * p.minor_dim;
121 | tile_out_y *= tile_out_h;
122 | int tile_out_x_base = blockIdx.y * p.loop_x * tile_out_w;
123 | int major_idx_base = blockIdx.z * p.loop_major;
124 |
125 | if (tile_out_x_base >= p.out_w | tile_out_y >= p.out_h |
126 | major_idx_base >= p.major_dim) {
127 | return;
128 | }
129 |
130 | for (int tap_idx = threadIdx.x; tap_idx < kernel_h * kernel_w;
131 | tap_idx += blockDim.x) {
132 | int ky = tap_idx / kernel_w;
133 | int kx = tap_idx - ky * kernel_w;
134 | scalar_t v = 0.0;
135 |
136 | if (kx < p.kernel_w & ky < p.kernel_h) {
137 | v = kernel[(p.kernel_h - 1 - ky) * p.kernel_w + (p.kernel_w - 1 - kx)];
138 | }
139 |
140 | sk[ky][kx] = v;
141 | }
142 |
143 | for (int loop_major = 0, major_idx = major_idx_base;
144 | loop_major < p.loop_major & major_idx < p.major_dim;
145 | loop_major++, major_idx++) {
146 | for (int loop_x = 0, tile_out_x = tile_out_x_base;
147 | loop_x < p.loop_x & tile_out_x < p.out_w;
148 | loop_x++, tile_out_x += tile_out_w) {
149 | int tile_mid_x = tile_out_x * down_x + up_x - 1 - p.pad_x0;
150 | int tile_mid_y = tile_out_y * down_y + up_y - 1 - p.pad_y0;
151 | int tile_in_x = floor_div(tile_mid_x, up_x);
152 | int tile_in_y = floor_div(tile_mid_y, up_y);
153 |
154 | __syncthreads();
155 |
156 | for (int in_idx = threadIdx.x; in_idx < tile_in_h * tile_in_w;
157 | in_idx += blockDim.x) {
158 | int rel_in_y = in_idx / tile_in_w;
159 | int rel_in_x = in_idx - rel_in_y * tile_in_w;
160 | int in_x = rel_in_x + tile_in_x;
161 | int in_y = rel_in_y + tile_in_y;
162 |
163 | scalar_t v = 0.0;
164 |
165 | if (in_x >= 0 & in_y >= 0 & in_x < p.in_w & in_y < p.in_h) {
166 | v = input[((major_idx * p.in_h + in_y) * p.in_w + in_x) *
167 | p.minor_dim +
168 | minor_idx];
169 | }
170 |
171 | sx[rel_in_y][rel_in_x] = v;
172 | }
173 |
174 | __syncthreads();
175 | for (int out_idx = threadIdx.x; out_idx < tile_out_h * tile_out_w;
176 | out_idx += blockDim.x) {
177 | int rel_out_y = out_idx / tile_out_w;
178 | int rel_out_x = out_idx - rel_out_y * tile_out_w;
179 | int out_x = rel_out_x + tile_out_x;
180 | int out_y = rel_out_y + tile_out_y;
181 |
182 | int mid_x = tile_mid_x + rel_out_x * down_x;
183 | int mid_y = tile_mid_y + rel_out_y * down_y;
184 | int in_x = floor_div(mid_x, up_x);
185 | int in_y = floor_div(mid_y, up_y);
186 | int rel_in_x = in_x - tile_in_x;
187 | int rel_in_y = in_y - tile_in_y;
188 | int kernel_x = (in_x + 1) * up_x - mid_x - 1;
189 | int kernel_y = (in_y + 1) * up_y - mid_y - 1;
190 |
191 | scalar_t v = 0.0;
192 |
193 | #pragma unroll
194 | for (int y = 0; y < kernel_h / up_y; y++)
195 | #pragma unroll
196 | for (int x = 0; x < kernel_w / up_x; x++)
197 | v += sx[rel_in_y + y][rel_in_x + x] *
198 | sk[kernel_y + y * up_y][kernel_x + x * up_x];
199 |
200 | if (out_x < p.out_w & out_y < p.out_h) {
201 | out[((major_idx * p.out_h + out_y) * p.out_w + out_x) * p.minor_dim +
202 | minor_idx] = v;
203 | }
204 | }
205 | }
206 | }
207 | }
208 |
209 | torch::Tensor upfirdn2d_op(const torch::Tensor &input,
210 | const torch::Tensor &kernel, int up_x, int up_y,
211 | int down_x, int down_y, int pad_x0, int pad_x1,
212 | int pad_y0, int pad_y1) {
213 | int curDevice = -1;
214 | cudaGetDevice(&curDevice);
215 | cudaStream_t stream = at::cuda::getCurrentCUDAStream(curDevice);
216 |
217 | UpFirDn2DKernelParams p;
218 |
219 | auto x = input.contiguous();
220 | auto k = kernel.contiguous();
221 |
222 | p.major_dim = x.size(0);
223 | p.in_h = x.size(1);
224 | p.in_w = x.size(2);
225 | p.minor_dim = x.size(3);
226 | p.kernel_h = k.size(0);
227 | p.kernel_w = k.size(1);
228 | p.up_x = up_x;
229 | p.up_y = up_y;
230 | p.down_x = down_x;
231 | p.down_y = down_y;
232 | p.pad_x0 = pad_x0;
233 | p.pad_x1 = pad_x1;
234 | p.pad_y0 = pad_y0;
235 | p.pad_y1 = pad_y1;
236 |
237 | p.out_h = (p.in_h * p.up_y + p.pad_y0 + p.pad_y1 - p.kernel_h + p.down_y) /
238 | p.down_y;
239 | p.out_w = (p.in_w * p.up_x + p.pad_x0 + p.pad_x1 - p.kernel_w + p.down_x) /
240 | p.down_x;
241 |
242 | auto out =
243 | at::empty({p.major_dim, p.out_h, p.out_w, p.minor_dim}, x.options());
244 |
245 | int mode = -1;
246 |
247 | int tile_out_h = -1;
248 | int tile_out_w = -1;
249 |
250 | if (p.up_x == 1 && p.up_y == 1 && p.down_x == 1 && p.down_y == 1 &&
251 | p.kernel_h <= 4 && p.kernel_w <= 4) {
252 | mode = 1;
253 | tile_out_h = 16;
254 | tile_out_w = 64;
255 | }
256 |
257 | if (p.up_x == 1 && p.up_y == 1 && p.down_x == 1 && p.down_y == 1 &&
258 | p.kernel_h <= 3 && p.kernel_w <= 3) {
259 | mode = 2;
260 | tile_out_h = 16;
261 | tile_out_w = 64;
262 | }
263 |
264 | if (p.up_x == 2 && p.up_y == 2 && p.down_x == 1 && p.down_y == 1 &&
265 | p.kernel_h <= 4 && p.kernel_w <= 4) {
266 | mode = 3;
267 | tile_out_h = 16;
268 | tile_out_w = 64;
269 | }
270 |
271 | if (p.up_x == 2 && p.up_y == 2 && p.down_x == 1 && p.down_y == 1 &&
272 | p.kernel_h <= 2 && p.kernel_w <= 2) {
273 | mode = 4;
274 | tile_out_h = 16;
275 | tile_out_w = 64;
276 | }
277 |
278 | if (p.up_x == 1 && p.up_y == 1 && p.down_x == 2 && p.down_y == 2 &&
279 | p.kernel_h <= 4 && p.kernel_w <= 4) {
280 | mode = 5;
281 | tile_out_h = 8;
282 | tile_out_w = 32;
283 | }
284 |
285 | if (p.up_x == 1 && p.up_y == 1 && p.down_x == 2 && p.down_y == 2 &&
286 | p.kernel_h <= 2 && p.kernel_w <= 2) {
287 | mode = 6;
288 | tile_out_h = 8;
289 | tile_out_w = 32;
290 | }
291 |
292 | dim3 block_size;
293 | dim3 grid_size;
294 |
295 | if (tile_out_h > 0 && tile_out_w > 0) {
296 | p.loop_major = (p.major_dim - 1) / 16384 + 1;
297 | p.loop_x = 1;
298 | block_size = dim3(32 * 8, 1, 1);
299 | grid_size = dim3(((p.out_h - 1) / tile_out_h + 1) * p.minor_dim,
300 | (p.out_w - 1) / (p.loop_x * tile_out_w) + 1,
301 | (p.major_dim - 1) / p.loop_major + 1);
302 | } else {
303 | p.loop_major = (p.major_dim - 1) / 16384 + 1;
304 | p.loop_x = 4;
305 | block_size = dim3(4, 32, 1);
306 | grid_size = dim3((p.out_h * p.minor_dim - 1) / block_size.x + 1,
307 | (p.out_w - 1) / (p.loop_x * block_size.y) + 1,
308 | (p.major_dim - 1) / p.loop_major + 1);
309 | }
310 |
311 | AT_DISPATCH_FLOATING_TYPES_AND_HALF(x.scalar_type(), "upfirdn2d_cuda", [&] {
312 | switch (mode) {
313 | case 1:
314 | upfirdn2d_kernel<scalar_t, 1, 1, 1, 1, 4, 4, 16, 64>
315 | <<<grid_size, block_size, 0, stream>>>(out.data_ptr<scalar_t>(),
316 | x.data_ptr<scalar_t>(),
317 | k.data_ptr<scalar_t>(), p);
318 |
319 | break;
320 |
321 | case 2:
322 | upfirdn2d_kernel<scalar_t, 1, 1, 1, 1, 3, 3, 16, 64>
323 | <<<grid_size, block_size, 0, stream>>>(out.data_ptr<scalar_t>(),
324 | x.data_ptr<scalar_t>(),
325 | k.data_ptr<scalar_t>(), p);
326 |
327 | break;
328 |
329 | case 3:
330 | upfirdn2d_kernel<scalar_t, 2, 2, 1, 1, 4, 4, 16, 64>
331 | <<<grid_size, block_size, 0, stream>>>(out.data_ptr<scalar_t>(),
332 | x.data_ptr<scalar_t>(),
333 | k.data_ptr<scalar_t>(), p);
334 |
335 | break;
336 |
337 | case 4:
338 | upfirdn2d_kernel<scalar_t, 2, 2, 1, 1, 2, 2, 16, 64>
339 | <<<grid_size, block_size, 0, stream>>>(out.data_ptr<scalar_t>(),
340 | x.data_ptr<scalar_t>(),
341 | k.data_ptr<scalar_t>(), p);
342 |
343 | break;
344 |
345 | case 5:
346 | upfirdn2d_kernel<scalar_t, 1, 1, 2, 2, 4, 4, 8, 32>
347 | <<<grid_size, block_size, 0, stream>>>(out.data_ptr<scalar_t>(),
348 | x.data_ptr<scalar_t>(),
349 | k.data_ptr<scalar_t>(), p);
350 |
351 | break;
352 |
353 | case 6:
354 | upfirdn2d_kernel<scalar_t, 1, 1, 2, 2, 4, 4, 8, 32>
355 | <<<grid_size, block_size, 0, stream>>>(out.data_ptr<scalar_t>(),
356 | x.data_ptr<scalar_t>(),
357 | k.data_ptr<scalar_t>(), p);
358 |
359 | break;
360 |
361 | default:
362 | upfirdn2d_kernel_large<scalar_t><<<grid_size, block_size, 0, stream>>>(
363 | out.data_ptr<scalar_t>(), x.data_ptr<scalar_t>(),
364 | k.data_ptr<scalar_t>(), p);
365 | }
366 | });
367 |
368 | return out;
369 | }
--------------------------------------------------------------------------------
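
The dispatcher in `upfirdn2d_op` picks one of six specialized tiled kernels when the up/down factors and kernel size match a common StyleGAN2 configuration, and otherwise falls back to `upfirdn2d_kernel_large`. The output-size formula it uses matches the Python fallback; a small illustrative check (plain Python, not part of the repo) on the 2x-upsample / 4-tap-blur case, which the dispatcher routes to mode 3:

```python
# Hedged sketch: the output-size arithmetic shared by the CUDA op and the
# Python fallback, evaluated for up=2, down=1, a 4x4 kernel, and pad=(2, 1)
# (the padding Upsample(factor=2) computes for a [1, 3, 3, 1] kernel).
def out_size(in_sz, up, down, kernel, pad0, pad1):
    return (in_sz * up + pad0 + pad1 - kernel + down) // down

print(out_size(8, up=2, down=1, kernel=4, pad0=2, pad1=1))   # 16
```
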
/code/models/stylegan2_pytorch/stylegan2_pytorch.py:
--------------------------------------------------------------------------------
1 | import math
2 | import random
3 | import torch
4 | from torch import nn
5 | from torch.nn import functional as F
6 |
7 | from models.stylegan2_pytorch.op import FusedLeakyReLU, fused_leaky_relu, upfirdn2d
8 |
9 | class PixelNorm(nn.Module):
10 | def __init__(self):
11 | super().__init__()
12 |
13 | def forward(self, input):
14 | return input * torch.rsqrt(torch.mean(input ** 2, dim=1, keepdim=True) + 1e-8)
15 |
16 |
17 | def make_kernel(k):
18 | k = torch.tensor(k, dtype=torch.float32)
19 |
20 | if k.ndim == 1:
21 | k = k[None, :] * k[:, None]
22 |
23 | k /= k.sum()
24 |
25 | return k
26 |
27 |
28 | class Upsample(nn.Module):
29 | def __init__(self, kernel, factor=2):
30 | super().__init__()
31 |
32 | self.factor = factor
33 | kernel = make_kernel(kernel) * (factor ** 2)
34 | self.register_buffer("kernel", kernel)
35 |
36 | p = kernel.shape[0] - factor
37 |
38 | pad0 = (p + 1) // 2 + factor - 1
39 | pad1 = p // 2
40 |
41 | self.pad = (pad0, pad1)
42 |
43 | def forward(self, input):
44 | out = upfirdn2d(input, self.kernel, up=self.factor, down=1, pad=self.pad)
45 |
46 | return out
47 |
48 |
49 | class Downsample(nn.Module):
50 | def __init__(self, kernel, factor=2):
51 | super().__init__()
52 |
53 | self.factor = factor
54 | kernel = make_kernel(kernel)
55 | self.register_buffer("kernel", kernel)
56 |
57 | p = kernel.shape[0] - factor
58 |
59 | pad0 = (p + 1) // 2
60 | pad1 = p // 2
61 |
62 | self.pad = (pad0, pad1)
63 |
64 | def forward(self, input):
65 | out = upfirdn2d(input, self.kernel, up=1, down=self.factor, pad=self.pad)
66 |
67 | return out
68 |
69 |
70 | class Blur(nn.Module):
71 | def __init__(self, kernel, pad, upsample_factor=1):
72 | super().__init__()
73 |
74 | kernel = make_kernel(kernel)
75 |
76 | if upsample_factor > 1:
77 | kernel = kernel * (upsample_factor ** 2)
78 |
79 | self.register_buffer("kernel", kernel)
80 |
81 | self.pad = pad
82 |
83 | def forward(self, input):
84 | out = upfirdn2d(input, self.kernel, pad=self.pad)
85 |
86 | return out
87 |
88 |
89 | class EqualConv2d(nn.Module):
90 | def __init__(
91 | self, in_channel, out_channel, kernel_size, stride=1, padding=0, bias=True
92 | ):
93 | super().__init__()
94 |
95 | self.weight = nn.Parameter(
96 | torch.randn(out_channel, in_channel, kernel_size, kernel_size)
97 | )
98 | self.scale = 1 / math.sqrt(in_channel * kernel_size ** 2)
99 |
100 | self.stride = stride
101 | self.padding = padding
102 |
103 | if bias:
104 | self.bias = nn.Parameter(torch.zeros(out_channel))
105 |
106 | else:
107 | self.bias = None
108 |
109 | def forward(self, input):
110 | out = F.conv2d(
111 | input,
112 | self.weight * self.scale,
113 | bias=self.bias,
114 | stride=self.stride,
115 | padding=self.padding,
116 | )
117 |
118 | return out
119 |
120 | def __repr__(self):
121 | return (
122 | f"{self.__class__.__name__}({self.weight.shape[1]}, {self.weight.shape[0]},"
123 | f" {self.weight.shape[2]}, stride={self.stride}, padding={self.padding})"
124 | )
125 |
126 |
127 | class EqualLinear(nn.Module):
128 | def __init__(
129 | self, in_dim, out_dim, bias=True, bias_init=0, lr_mul=1, activation=None
130 | ):
131 | super().__init__()
132 |
133 | self.weight = nn.Parameter(torch.randn(out_dim, in_dim).div_(lr_mul))
134 |
135 | if bias:
136 | self.bias = nn.Parameter(torch.zeros(out_dim).fill_(bias_init))
137 |
138 | else:
139 | self.bias = None
140 |
141 | self.activation = activation
142 |
143 | self.scale = (1 / math.sqrt(in_dim)) * lr_mul
144 | self.lr_mul = lr_mul
145 |
146 | def forward(self, input):
147 | if self.activation:
148 | out = F.linear(input, self.weight * self.scale)
149 | out = fused_leaky_relu(out, self.bias * self.lr_mul)
150 |
151 | else:
152 | out = F.linear(
153 | input, self.weight * self.scale, bias=self.bias * self.lr_mul
154 | )
155 |
156 | return out
157 |
158 | def __repr__(self):
159 | return (
160 | f"{self.__class__.__name__}({self.weight.shape[1]}, {self.weight.shape[0]})"
161 | )
162 |
163 |
164 | class ModulatedConv2d(nn.Module):
165 | def __init__(
166 | self,
167 | in_channel,
168 | out_channel,
169 | kernel_size,
170 | style_dim,
171 | demodulate=True,
172 | upsample=False,
173 | downsample=False,
174 | blur_kernel=[1, 3, 3, 1],
175 | return_style=False
176 | ):
177 | super().__init__()
178 |
179 | self.eps = 1e-8
180 | self.kernel_size = kernel_size
181 | self.in_channel = in_channel
182 | self.out_channel = out_channel
183 | self.upsample = upsample
184 | self.return_style = return_style
185 | self.downsample = downsample
186 | if upsample:
187 | factor = 2
188 | p = (len(blur_kernel) - factor) - (kernel_size - 1)
189 | pad0 = (p + 1) // 2 + factor - 1
190 | pad1 = p // 2 + 1
191 |
192 | self.blur = Blur(blur_kernel, pad=(pad0, pad1), upsample_factor=factor)
193 |
194 | if downsample:
195 | factor = 2
196 | p = (len(blur_kernel) - factor) + (kernel_size - 1)
197 | pad0 = (p + 1) // 2
198 | pad1 = p // 2
199 |
200 | self.blur = Blur(blur_kernel, pad=(pad0, pad1))
201 |
202 | fan_in = in_channel * kernel_size ** 2
203 | self.scale = 1 / math.sqrt(fan_in)
204 | self.padding = kernel_size // 2
205 |
206 | self.weight = nn.Parameter(
207 | torch.randn(1, out_channel, in_channel, kernel_size, kernel_size)
208 | )
209 |
210 | self.modulation = EqualLinear(style_dim, in_channel, bias_init=1)
211 |
212 | self.demodulate = demodulate
213 |
214 | def __repr__(self):
215 | return (
216 | f"{self.__class__.__name__}({self.in_channel}, {self.out_channel}, {self.kernel_size}, "
217 | f"upsample={self.upsample}, downsample={self.downsample})"
218 | )
219 |
220 | def forward(self, input, style, edit_vector=None):
221 | batch, in_channel, height, width = input.shape
222 |
223 | style = self.modulation(style).view(batch, 1, in_channel, 1, 1)
224 |
225 | if edit_vector is not None:
226 |
227 | style = style + edit_vector.view(batch, 1, in_channel, 1, 1)
228 | weight = self.scale * self.weight * style
229 |
230 | if self.demodulate:
231 | demod = torch.rsqrt(weight.pow(2).sum([2, 3, 4]) + 1e-8)
232 | weight = weight * demod.view(batch, self.out_channel, 1, 1, 1)
233 |
234 | weight = weight.view(
235 | batch * self.out_channel, in_channel, self.kernel_size, self.kernel_size
236 | )
237 |
238 | if self.upsample:
239 | input = input.view(1, batch * in_channel, height, width)
240 | weight = weight.view(
241 | batch, self.out_channel, in_channel, self.kernel_size, self.kernel_size
242 | )
243 | weight = weight.transpose(1, 2).reshape(
244 | batch * in_channel, self.out_channel, self.kernel_size, self.kernel_size
245 | )
246 | out = F.conv_transpose2d(input, weight, padding=0, stride=2, groups=batch)
247 | _, _, height, width = out.shape
248 | out = out.view(batch, self.out_channel, height, width)
249 | out = self.blur(out)
250 |
251 | elif self.downsample:
252 | input = self.blur(input)
253 | _, _, height, width = input.shape
254 | input = input.view(1, batch * in_channel, height, width)
255 | out = F.conv2d(input, weight, padding=0, stride=2, groups=batch)
256 | _, _, height, width = out.shape
257 | out = out.view(batch, self.out_channel, height, width)
258 |
259 | else:
260 | input = input.view(1, batch * in_channel, height, width)
261 | out = F.conv2d(input, weight, padding=self.padding, groups=batch)
262 | _, _, height, width = out.shape
263 | out = out.view(batch, self.out_channel, height, width)
264 |
265 | if self.return_style:
266 | return out, style
267 | else:
268 | return out
269 |
270 | class NoiseInjection(nn.Module):
271 | def __init__(self):
272 | super().__init__()
273 |
274 | self.weight = nn.Parameter(torch.zeros(1))
275 |
276 | def forward(self, image, noise=None):
277 | if noise is None:
278 | batch, _, height, width = image.shape
279 | noise = image.new_empty(batch, 1, height, width).normal_()
280 |
281 | return image + self.weight * noise
282 |
283 |
284 | class ConstantInput(nn.Module):
285 | def __init__(self, channel, size=4):
286 | super().__init__()
287 |
288 | self.input = nn.Parameter(torch.randn(1, channel, size, size))
289 |
290 | def forward(self, input):
291 | batch = input.shape[0]
292 | out = self.input.repeat(batch, 1, 1, 1)
293 |
294 | return out
295 |
296 |
297 | class StyledConv(nn.Module):
298 | def __init__(
299 | self,
300 | in_channel,
301 | out_channel,
302 | kernel_size,
303 | style_dim,
304 | upsample=False,
305 | blur_kernel=[1, 3, 3, 1],
306 | demodulate=True, return_style=False
307 | ):
308 | super().__init__()
309 |
310 | self.conv = ModulatedConv2d(
311 | in_channel,
312 | out_channel,
313 | kernel_size,
314 | style_dim,
315 | upsample=upsample,
316 | blur_kernel=blur_kernel,
317 | demodulate=demodulate,
318 | return_style=return_style
319 | )
320 | self.return_style = return_style
321 | self.noise = NoiseInjection()
322 | # self.bias = nn.Parameter(torch.zeros(1, out_channel, 1, 1))
323 | # self.activate = ScaledLeakyReLU(0.2)
324 | self.activate = FusedLeakyReLU(out_channel)
325 |
326 | def forward(self, input, style, noise=None, edit_vector=None):
327 | if self.return_style:
328 | out, style = self.conv(input, style, edit_vector=edit_vector)
329 | else:
330 | out = self.conv(input, style)
331 | out = self.noise(out, noise=noise)
332 | # out = out + self.bias
333 | out = self.activate(out)
334 |
335 | if self.return_style:
336 | return out, style
337 | else:
338 | return out
339 |
340 |
341 | class ToRGB(nn.Module):
342 | def __init__(self, in_channel, style_dim, upsample=True, blur_kernel=[1, 3, 3, 1]):
343 | super().__init__()
344 |
345 | if upsample:
346 | self.upsample = Upsample(blur_kernel)
347 |
348 | self.conv = ModulatedConv2d(in_channel, 3, 1, style_dim, demodulate=False)
349 | self.bias = nn.Parameter(torch.zeros(1, 3, 1, 1))
350 |
351 | def forward(self, input, style, skip=None):
352 | out = self.conv(input, style)
353 | out = out + self.bias
354 |
355 | if skip is not None:
356 | skip = self.upsample(skip)
357 |
358 | out = out + skip
359 |
360 | return out
361 |
362 |
363 | class Generator(nn.Module):
364 | def __init__(
365 | self,
366 | size,
367 | style_dim,
368 | n_mlp,
369 | channel_multiplier=2,
370 | blur_kernel=[1, 3, 3, 1],
371 | lr_mlp=0.01,
372 | randomize_noise=True, return_style=False
373 | ):
374 | super().__init__()
375 |
376 | self.size = size
377 |
378 | self.style_dim = style_dim
379 | self.return_style = return_style
380 | layers = [PixelNorm()]
381 |
382 | for i in range(n_mlp):
383 | layers.append(
384 | EqualLinear(
385 | style_dim, style_dim, lr_mul=lr_mlp, activation="fused_lrelu"
386 | )
387 | )
388 |
389 | self.style = nn.Sequential(*layers)
390 |
391 | self.channels = {
392 | 4: 512,
393 | 8: 512,
394 | 16: 512,
395 | 32: 512,
396 | 64: 256 * channel_multiplier,
397 | 128: 128 * channel_multiplier,
398 | 256: 64 * channel_multiplier,
399 | 512: 32 * channel_multiplier,
400 | 1024: 16 * channel_multiplier,
401 | }
402 |
403 | self.input = ConstantInput(self.channels[4])
404 |
405 | self.conv1 = StyledConv(
406 | self.channels[4], self.channels[4], 3, style_dim, blur_kernel=blur_kernel, return_style=return_style
407 | )
408 | self.to_rgb1 = ToRGB(self.channels[4], style_dim, upsample=False)
409 |
410 | self.log_size = int(math.log(size, 2))
411 | self.num_layers = (self.log_size - 2) * 2 + 1
412 |
413 | self.convs = nn.ModuleList()
414 | self.upsamples = nn.ModuleList()
415 | self.to_rgbs = nn.ModuleList()
416 | self.noises = nn.Module()
417 | self.randomize_noise = randomize_noise
418 | in_channel = self.channels[4]
419 |
420 | for layer_idx in range(self.num_layers):
421 | res = (layer_idx + 5) // 2
422 | shape = [1, 1, 2 ** res, 2 ** res]
423 | self.noises.register_buffer(f"noise_{layer_idx}", torch.randn(*shape))
424 |
425 | for i in range(3, self.log_size + 1):
426 | out_channel = self.channels[2 ** i]
427 |
428 | self.convs.append(
429 | StyledConv(
430 | in_channel,
431 | out_channel,
432 | 3,
433 | style_dim,
434 | upsample=True,
435 | blur_kernel=blur_kernel,
436 | return_style=return_style
437 | )
438 | )
439 |
440 | self.convs.append(
441 | StyledConv(
442 | out_channel, out_channel, 3, style_dim, blur_kernel=blur_kernel, return_style=return_style
443 | )
444 | )
445 |
446 | self.to_rgbs.append(ToRGB(out_channel, style_dim))
447 |
448 | in_channel = out_channel
449 |
450 | self.n_latent = self.log_size * 2 - 2
451 |
452 |
453 | def make_noise(self):
454 | device = self.input.input.device
455 |
456 | noises = [torch.randn(1, 1, 2 ** 2, 2 ** 2, device=device)]
457 |
458 | for i in range(3, self.log_size + 1):
459 | for _ in range(2):
460 | noises.append(torch.randn(1, 1, 2 ** i, 2 ** i, device=device))
461 |
462 | return noises
463 |
464 | def make_mean_latent(self, n_latent):
465 | latent_in = torch.randn(
466 | n_latent, self.style_dim, device=self.input.input.device
467 | )
468 | latent = self.style(latent_in).mean(0, keepdim=True)
469 | self.mean_latent = latent.cuda()
470 | return self.mean_latent
471 |
472 | def get_latent(self, input):
473 | style = self.style(input)
474 | return style
475 |
476 | def truncation(self, input):
477 | out = self.mean_latent.unsqueeze(1) + 0.7 * (input - self.mean_latent.unsqueeze(1))
478 |
479 | return out
480 |
481 | def g_mapping(self, input):
482 | style = self.style(input)
483 | style = style.unsqueeze(1).repeat(1, self.n_latent, 1)
484 |
485 | return style
486 |
487 | def g_synthesis(self, latent, noise=None):
488 | if noise is None:
489 | if self.randomize_noise:
490 | noise = [None] * self.num_layers
491 | else:
492 | noise = [getattr(self.noises, f"noise_{i}") for i in range(self.num_layers)]
493 |
494 | styles_feature = []
495 | out = self.input(latent)
496 | styles_feature.append(out)
497 | all_styles = []
498 |
499 | if self.return_style:
500 | out, style = self.conv1(out, latent[:, 0], noise=noise[0])
501 | all_styles.append(style)
502 | else:
503 | out = self.conv1(out, latent[:, 0], noise=noise[0])
504 | styles_feature.append(out)
505 |
506 | skip = self.to_rgb1(out, latent[:, 1])
507 |
508 | i = 1
509 | for conv1, conv2, noise1, noise2, to_rgb in zip(
510 | self.convs[::2], self.convs[1::2], noise[1::2], noise[2::2], self.to_rgbs
511 | ):
512 | if self.return_style:
513 | out, style = conv1(out, latent[:, i], noise=noise1)
514 | all_styles.append(style)
515 | else:
516 | out = conv1(out, latent[:, i], noise=noise1)
517 |
518 | styles_feature.append(out)
519 |
520 | if self.return_style:
521 | out, style = conv2(out, latent[:, i + 1], noise=noise2)
522 | all_styles.append(style)
523 |
524 | else:
525 | out = conv2(out, latent[:, i + 1], noise=noise2)
526 |
527 | styles_feature.append(out)
528 | skip = to_rgb(out, latent[:, i + 2], skip)
529 |
530 | i += 2
531 |
532 | image = skip
533 | if self.return_style:
534 | return image, styles_feature, all_styles
535 | else:
536 | return image, styles_feature
537 |
538 | def g_synthesis_editing_transformed_w(self, latent, transformed_w):
539 | if self.randomize_noise:
540 | noise = [None] * self.num_layers
541 | else:
542 | noise = [
543 | getattr(self.noises, f"noise_{i}") for i in range(self.num_layers)
544 | ]
545 | styles_feature = []
546 |
547 | out = self.input(latent)
548 | styles_feature.append(out)
549 | all_styles = []
550 |
551 | out, style = self.conv1(out, latent[:, 0], noise=noise[0], edit_vector=transformed_w[0])
552 | all_styles.append(style)
553 |
554 | styles_feature.append(out)
555 |
556 | skip = self.to_rgb1(out, latent[:, 1])
557 |
558 | i = 1
559 | for conv1, conv2, noise1, noise2, to_rgb in zip(
560 | self.convs[::2], self.convs[1::2], noise[1::2], noise[2::2], self.to_rgbs
561 | ):
562 | out, style = conv1(out, latent[:, i], noise=noise1, edit_vector=transformed_w[i])
563 | all_styles.append(style)
564 |
565 | styles_feature.append(out)
566 |
567 | out, style = conv2(out, latent[:, i + 1], noise=noise2, edit_vector=transformed_w[i + 1])
568 | all_styles.append(style)
569 |
570 | styles_feature.append(out)
571 |
572 | skip = to_rgb(out, latent[:, i + 2], skip)
573 |
574 | i += 2
575 |
576 | image = skip
577 | return image, styles_feature, all_styles
578 |
579 | def forward(
580 | self,
581 | styles,
582 | return_latents=False,
583 | inject_index=None,
584 | truncation=1,
585 | truncation_latent=None,
586 | input_is_latent=False,
587 | noise=None,
588 | randomize_noise=True,
589 | ):
590 | if not input_is_latent:
591 | styles = [self.style(s) for s in styles]
592 |
593 | if noise is None:
594 | if randomize_noise:
595 | noise = [None] * self.num_layers
596 | else:
597 | noise = [
598 | getattr(self.noises, f"noise_{i}") for i in range(self.num_layers)
599 | ]
600 |
601 | if truncation < 1:
602 | style_t = []
603 |
604 | for style in styles:
605 | style_t.append(
606 | truncation_latent + truncation * (style - truncation_latent)
607 | )
608 |
609 | styles = style_t
610 |
611 | if len(styles) < 2:
612 | inject_index = self.n_latent
613 |
614 | if styles[0].ndim < 3:
615 | latent = styles[0].unsqueeze(1).repeat(1, inject_index, 1)
616 |
617 | else:
618 | latent = styles[0]
619 |
620 | else:
621 | if inject_index is None:
622 | inject_index = random.randint(1, self.n_latent - 1)
623 |
624 | latent = styles[0].unsqueeze(1).repeat(1, inject_index, 1)
625 | latent2 = styles[1].unsqueeze(1).repeat(1, self.n_latent - inject_index, 1)
626 |
627 | latent = torch.cat([latent, latent2], 1)
628 |
629 | out = self.input(latent)
630 | out = self.conv1(out, latent[:, 0], noise=noise[0])
631 |
632 | skip = self.to_rgb1(out, latent[:, 1])
633 |
634 | i = 1
635 |
636 | for conv1, conv2, noise1, noise2, to_rgb in zip(
637 | self.convs[::2], self.convs[1::2], noise[1::2], noise[2::2], self.to_rgbs
638 | ):
639 | out = conv1(out, latent[:, i], noise=noise1)
640 | out = conv2(out, latent[:, i + 1], noise=noise2)
641 | skip = to_rgb(out, latent[:, i + 2], skip)
642 |
643 | i += 2
644 |
645 | image = skip
646 |
647 | if return_latents:
648 | return image, latent
649 |
650 | else:
651 | return image, None
652 |
653 |
654 | class ConvLayer(nn.Sequential):
655 | def __init__(
656 | self,
657 | in_channel,
658 | out_channel,
659 | kernel_size,
660 | downsample=False,
661 | blur_kernel=[1, 3, 3, 1],
662 | bias=True,
663 | activate=True,
664 | ):
665 | layers = []
666 |
667 | if downsample:
668 | factor = 2
669 | p = (len(blur_kernel) - factor) + (kernel_size - 1)
670 | pad0 = (p + 1) // 2
671 | pad1 = p // 2
672 |
673 | layers.append(Blur(blur_kernel, pad=(pad0, pad1)))
674 |
675 | stride = 2
676 | self.padding = 0
677 |
678 | else:
679 | stride = 1
680 | self.padding = kernel_size // 2
681 |
682 | layers.append(
683 | EqualConv2d(
684 | in_channel,
685 | out_channel,
686 | kernel_size,
687 | padding=self.padding,
688 | stride=stride,
689 | bias=bias and not activate,
690 | )
691 | )
692 |
693 | if activate:
694 | layers.append(FusedLeakyReLU(out_channel, bias=bias))
695 |
696 | super().__init__(*layers)
697 |
698 |
699 | class ResBlock(nn.Module):
700 | def __init__(self, in_channel, out_channel, blur_kernel=[1, 3, 3, 1]):
701 | super().__init__()
702 |
703 | self.conv1 = ConvLayer(in_channel, in_channel, 3)
704 | self.conv2 = ConvLayer(in_channel, out_channel, 3, downsample=True)
705 |
706 | self.skip = ConvLayer(
707 | in_channel, out_channel, 1, downsample=True, activate=False, bias=False
708 | )
709 |
710 | def forward(self, input):
711 | out = self.conv1(input)
712 | out = self.conv2(out)
713 |
714 | skip = self.skip(input)
715 | out = (out + skip) / math.sqrt(2)
716 |
717 | return out
718 |
719 |
720 | class Discriminator(nn.Module):
721 | def __init__(self, size, channel_multiplier=2, blur_kernel=[1, 3, 3, 1]):
722 | super().__init__()
723 |
724 | channels = {
725 | 4: 512,
726 | 8: 512,
727 | 16: 512,
728 | 32: 512,
729 | 64: 256 * channel_multiplier,
730 | 128: 128 * channel_multiplier,
731 | 256: 64 * channel_multiplier,
732 | 512: 32 * channel_multiplier,
733 | 1024: 16 * channel_multiplier,
734 | }
735 |
736 | convs = [ConvLayer(3, channels[size], 1)]
737 |
738 | log_size = int(math.log(size, 2))
739 |
740 | in_channel = channels[size]
741 |
742 | for i in range(log_size, 2, -1):
743 | out_channel = channels[2 ** (i - 1)]
744 |
745 | convs.append(ResBlock(in_channel, out_channel, blur_kernel))
746 |
747 | in_channel = out_channel
748 |
749 | self.convs = nn.Sequential(*convs)
750 |
751 | self.stddev_group = 4
752 | self.stddev_feat = 1
753 |
754 | self.final_conv = ConvLayer(in_channel + 1, channels[4], 3)
755 | self.final_linear = nn.Sequential(
756 | EqualLinear(channels[4] * 4 * 4, channels[4], activation="fused_lrelu"),
757 | EqualLinear(channels[4], 1),
758 | )
759 |
760 | def forward(self, input):
761 | out = self.convs(input)
762 |
763 | batch, channel, height, width = out.shape
764 | group = min(batch, self.stddev_group)
765 | stddev = out.view(
766 | group, -1, self.stddev_feat, channel // self.stddev_feat, height, width
767 | )
768 | stddev = torch.sqrt(stddev.var(0, unbiased=False) + 1e-8)
769 | stddev = stddev.mean([2, 3, 4], keepdims=True).squeeze(2)
770 | stddev = stddev.repeat(group, 1, height, width)
771 | out = torch.cat([out, stddev], 1)
772 |
773 | out = self.final_conv(out)
774 |
775 | out = out.view(batch, -1)
776 | out = self.final_linear(out)
777 |
778 | return out
779 |
--------------------------------------------------------------------------------
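
The `Generator` above returns not only an image but also the list of intermediate feature maps (`styles_feature`) from `g_synthesis`, which is what the FG-Net interpreter consumes. A hedged sketch of pulling those features out (sizes are illustrative, not the repo's training configuration; the fused/upfirdn2d CUDA ops must build, so run this on a GPU machine):

```python
# Hedged sketch: run the mapping network and synthesis network and collect
# the per-resolution StyleGAN2 features returned alongside the image.
import torch
from models.stylegan2_pytorch.stylegan2_pytorch import Generator

g = Generator(size=256, style_dim=512, n_mlp=8).cuda().eval()
z = torch.randn(2, 512, device="cuda")     # a batch of z latents
w_plus = g.g_mapping(z)                    # [2, n_latent, 512]; n_latent = 14 for 256px
with torch.no_grad():
    image, feats = g.g_synthesis(w_plus)

print(image.shape)                         # [2, 3, 256, 256]
print([f.shape[-1] for f in feats])        # feature resolutions 4, 4, 8, 8, ..., 256, 256
```
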
/code/single_image_inference.py:
--------------------------------------------------------------------------------
1 | import os
2 | os.environ['CUDA_DEVICE_ORDER']='PCI_BUS_ID' # see issue #152
3 |
4 | import torch
5 | from torch.utils.data import DataLoader
6 | import torchvision.transforms as T
7 |
8 | import json
9 | import argparse
10 | import pandas as pd
11 |
12 | from PIL import Image
13 | from tqdm import tqdm
14 | from sklearn.metrics import f1_score
15 |
16 | from data_utils import heatmap2au, MyDataset
17 | from model_utils import latent_to_image, prepare_model, load_psp_standalone
18 | from models.interpreter import pyramid_interpreter
19 |
20 |
21 | def prepare(args):
22 | g_all, upsamplers = prepare_model(args)
23 |
24 | pspencoder = load_psp_standalone(args['style_encoder_path'], 'cuda')
25 |
26 | transform = T.Compose([
27 | T.ToTensor(),
28 | T.Normalize(mean=[0.5,0.5,0.5], std=[0.5,0.5,0.5]),
29 | ])
30 |
31 | return g_all, upsamplers, pspencoder, transform
32 |
33 |
34 | def inference(interpreter, images, pspencoder, g_all, upsamplers, args):
35 | interpreter.eval()
36 | pspencoder.eval()
37 | g_all.eval()
38 |
39 | with torch.no_grad():
40 | pred_list = []
41 | gt_list = []
42 | f1_list = []
43 |
44 | images = images.cuda()
45 | # get latent code
46 | latent_codes = pspencoder(images)
47 | latent_codes = g_all.style(latent_codes.reshape(latent_codes.shape[0]*latent_codes.shape[1], latent_codes.shape[2])).reshape(latent_codes.shape)
48 | # get stylegan features
49 | features = latent_to_image(g_all, upsamplers, latent_codes, upsample=False, args=args)
50 |
51 | heatmaps_pred = interpreter(features)
52 | heatmaps_pred = torch.clamp(heatmaps_pred, min=-1., max=1.)
53 | labels_pred = heatmap2au(heatmaps_pred)
54 |
55 | if args['dataset'] == 'DISFA':
56 | labels_pred = labels_pred[:, :8].detach().cpu()
57 | elif args['dataset'] == 'BP4D':
58 | labels_pred = labels_pred[:, :12].detach().cpu()
59 |
60 | return labels_pred
61 |
62 |
63 | def main(args):
64 | print('Prepare model')
65 | g_all, upsamplers, pspencoder, transform = prepare(args)
66 |
67 | num_labels = 32  # the saved checkpoint was trained with 32 output channels; only the first len(aus) are used
68 | interpreter = pyramid_interpreter(num_labels, 0.1).cuda()
69 |
70 | checkpoint = torch.load(args['checkpoint_path'])
71 | interpreter.load_state_dict(checkpoint['interpreter'])
72 | g_all.load_state_dict(checkpoint['g_all'])
73 | pspencoder.load_state_dict(checkpoint['pspencoder'])
74 |
75 | print('Prepare data')
76 | image = Image.open('../test_image.jpg') # replace with your own image path
77 | image = transform(image)
78 | image = image.unsqueeze(0) # [1, 3, 256, 256]
79 |
80 | print('Start evaluation')
81 | if args['dataset'] == 'BP4D':
82 | aus = [1,2,4,6,7,10,12,14,15,17,23,24]
83 | elif args['dataset'] == 'DISFA':
84 | aus = [1,2,4,6,9,12,25,26]
85 | pred_aus = inference(interpreter, image, pspencoder, g_all, upsamplers, args)
86 | pred_aus = pred_aus.squeeze(0)
87 |
88 | for i in range(len(aus)):
89 | print('AU {}: {}'.format(aus[i], pred_aus[i]), end=' ')
90 | print('')
91 |
92 |
93 | if __name__ == '__main__':
94 | parser = argparse.ArgumentParser()
95 |
96 | parser.add_argument('--exp', type=str)
97 | args = parser.parse_args()
98 | opts = json.load(open(args.exp, 'r'))
99 | print('Opt', opts)
100 |
101 | main(opts)
102 |
--------------------------------------------------------------------------------
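
`single_image_inference.py` is driven entirely by the experiment JSON passed via `--exp` (e.g. `experiments/single_image_inference_bp4d.json`). A hedged sketch of the option keys this script itself reads; `prepare_model()` in `model_utils.py` (not shown here) may require additional keys:

```python
# Hedged sketch: option keys read directly by single_image_inference.py.
# Values are placeholders, not the repo's actual paths.
opts = {
    "dataset": "BP4D",               # "BP4D" (12 AUs) or "DISFA" (8 AUs)
    "style_encoder_path": "...",     # pSp encoder weights (placeholder)
    "checkpoint_path": "...",        # trained FG-Net checkpoint (placeholder)
}
# Typical invocation:
#   python single_image_inference.py --exp experiments/single_image_inference_bp4d.json
```
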
/code/train_interpreter.py:
--------------------------------------------------------------------------------
1 | import os
2 | os.environ['CUDA_DEVICE_ORDER']='PCI_BUS_ID' # see issue #152
3 |
4 | import torch
5 | import torch.nn as nn
6 | import torch.optim as optim
7 | import torchvision.transforms as T
8 | from torch.utils.data import DataLoader
9 |
10 | import json
11 | import argparse
12 | import pandas as pd
13 |
14 | from tqdm import tqdm
15 | from PIL import Image
16 | from sklearn.metrics import f1_score
17 |
18 | from data_utils import heatmap2au, MyDataset
19 | from model_utils import load_psp_standalone, latent_to_image, prepare_model
20 | from models.interpreter import pyramid_interpreter
21 |
22 |
23 | def prepare(args):
24 | pspencoder = load_psp_standalone(args['style_encoder_path'], 'cuda')
25 | g_all, upsamplers = prepare_model(args)
26 |
27 | transform = T.Compose([
28 | T.ToTensor(),
29 | T.Normalize(mean=[0.5,0.5,0.5], std=[0.5,0.5,0.5]),
30 | ])
31 |
32 | train_list = pd.read_csv(args['train_csv'])
33 | val_list = pd.read_csv(args['test_csv'])
34 | test_list = pd.read_csv(args['test_csv'])
35 |
36 | return train_list, val_list, test_list, g_all, upsamplers, pspencoder, transform
37 |
38 |
39 | def val(interpreter, val_loader, pspencoder, g_all, upsamplers, args):
40 | interpreter.eval()
41 | pspencoder.eval()
42 | g_all.eval()
43 |
44 | with torch.no_grad():
45 | pred_list = []
46 | gt_list = []
47 | f1_list = []
48 |
49 | for i, (images, labels) in enumerate(tqdm(val_loader)):
50 | # get latent code
51 | latent_codes = pspencoder(images)
52 | latent_codes = g_all.style(latent_codes.reshape(latent_codes.shape[0]*latent_codes.shape[1], latent_codes.shape[2])).reshape(latent_codes.shape)
53 | # get stylegan features
54 | features = latent_to_image(g_all, upsamplers, latent_codes, upsample=False, args=args)
55 |
56 | heatmaps_pred = interpreter(features)
57 | heatmaps_pred = torch.clamp(heatmaps_pred, min=-1., max=1.)
58 | labels_pred = heatmap2au(heatmaps_pred)
59 |
60 | pred_list.append(labels_pred.detach().cpu())
61 | gt_list.append(labels.detach().cpu())
62 |
63 | pred_list = torch.cat(pred_list, dim=0).numpy()
64 | gt_list = torch.cat(gt_list, dim=0).numpy()
65 |
66 | for i in range(args['num_labels']):
67 | f1_list.append(100.0*f1_score(gt_list[:, i], pred_list[:, i]))
68 |
69 | return sum(f1_list)/len(f1_list), f1_list
70 |
71 |
72 | def main(args):
73 | print('Prepare model')
74 | train_list, val_list, test_list, g_all, upsamplers, pspencoder, transform = prepare(args)
75 |
76 | print('Prepare data')
77 | train_data = MyDataset(train_list, transform, 'train', args)
78 | val_data = MyDataset(val_list, transform, 'val', args)
79 | val_loader = DataLoader(dataset=val_data, batch_size=args['batch_size'],
80 | shuffle=False, collate_fn=val_data.collate_fn)
81 | test_data = MyDataset(test_list, transform, 'val', args)
82 | test_loader = DataLoader(dataset=test_data, batch_size=2*args['batch_size'],
83 | shuffle=False, collate_fn=test_data.collate_fn)
84 |
85 | print('Start training')
86 | interpreter = pyramid_interpreter(32, args['dropout']).cuda()
87 | interpreter.init_weights()
88 | criterion = nn.MSELoss()
89 | optimizer = optim.AdamW(list(interpreter.parameters())
90 | +list(g_all.parameters())
91 | +list(pspencoder.parameters()),
92 | lr=args['learning_rate'], weight_decay=args['weight_decay'])
93 |
94 | total_loss, total_sample = 0., 0
95 | best_f1, best_f1_list = 0., []
96 |
97 | if args['dataset'] == 'BP4D':
98 | aus = [1,2,4,6,7,10,12,14,15,17,23,24]
99 | elif args['dataset'] == 'DISFA':
100 | aus = [1,2,4,6,9,12,25,26]
101 |
102 | for epoch in range(args['num_epochs']):
103 | interpreter.train()
104 | g_all.train()
105 | pspencoder.train()
106 | train_loader = DataLoader(dataset=train_data, batch_size=args['batch_size'], shuffle=True, collate_fn=train_data.collate_fn)
107 | args['interval'] = min(args['interval'], len(train_loader))
108 | for i, (images, labels, heatmaps) in enumerate(tqdm(train_loader, total=args['interval'])):
109 | if i >= args['interval']:
110 | break
111 | batch_size = images.shape[0]
112 | # get latent code
113 | latent_codes = pspencoder(images)
114 | latent_codes = g_all.style(latent_codes.reshape(latent_codes.shape[0]*latent_codes.shape[1], latent_codes.shape[2])).reshape(latent_codes.shape)
115 | # get stylegan features
116 | features = latent_to_image(g_all, upsamplers, latent_codes, upsample=False, args=args)
117 | heatmaps_pred = interpreter(features)
118 |
119 | loss = 0.
120 | for i in range(len(aus)):
121 | loss += criterion(heatmaps_pred[:,i,:,:], heatmaps[:,i,:,:])
122 | loss /= len(aus)
123 |
124 | optimizer.zero_grad()
125 | loss.backward()
126 | torch.nn.utils.clip_grad_norm_(list(interpreter.parameters())
127 | +list(g_all.parameters())
128 | +list(pspencoder.parameters()), 0.1)
129 | optimizer.step()
130 |
131 | total_loss += loss.item()*batch_size
132 | total_sample += batch_size
133 |
134 | avg_loss = total_loss / total_sample
135 | print('** Epoch {}/{} loss {:.6f} **'.format(epoch+1, args['num_epochs'], avg_loss))
136 |
137 | val_f1, val_f1_list = val(interpreter, val_loader, pspencoder, g_all, upsamplers, args)
138 | print('Val avg F1: {:.2f}'.format(val_f1))
139 | for i in range(args['num_labels']):
140 | print('AU {}: {:.2f}'.format(aus[i], val_f1_list[i]), end=' ')
141 | print('')
142 |
143 | if best_f1 < val_f1:
144 | best_f1 = val_f1
145 | best_f1_list = val_f1_list
146 | model_path = os.path.join(args['exp_dir'], 'model.pth')
147 | print('save to:', model_path)
148 | torch.save({'interpreter': interpreter.state_dict(),
149 | 'g_all': g_all.state_dict(),
150 | 'pspencoder': pspencoder.state_dict()}, model_path)
151 |
152 | checkpoint = torch.load(os.path.join(args['exp_dir'], 'model.pth'))
153 | interpreter.load_state_dict(checkpoint['interpreter'])
154 | g_all.load_state_dict(checkpoint['g_all'])
155 | pspencoder.load_state_dict(checkpoint['pspencoder'])
156 | best_f1, best_f1_list = val(interpreter, test_loader, pspencoder, g_all, upsamplers, args)
157 |
158 | print('Test avg F1: {:.2f}'.format(best_f1))
159 | for i in range(args['num_labels']):
160 | print('AU {}: {:.2f}'.format(aus[i], best_f1_list[i]), end=' ')
161 | print('')
162 |
163 |
164 | if __name__ == '__main__':
165 | parser = argparse.ArgumentParser()
166 |
167 | parser.add_argument('--exp', type=str)
168 | args = parser.parse_args()
169 | opts = json.load(open(args.exp, 'r'))
170 | print('Opt', opts)
171 |
172 | os.makedirs(opts['exp_dir'], exist_ok=True)
173 | os.system('cp %s %s' % (args.exp, opts['exp_dir']))
174 |
175 | torch.manual_seed(opts['seed'])
176 |
177 | main(opts)
178 |
--------------------------------------------------------------------------------
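
`train_interpreter.py` reads its splits from the label CSVs under `data/<dataset>/labels/<fold>/` (the `train_100.csv` files below are the 100-image subsets); each row holds an index, an image path, and one binary column per AU. A hedged sketch of how such a file is consumed, mirroring `prepare()` above:

```python
# Hedged sketch: load a label CSV the same way prepare() does and pull out
# the image paths and the 12 BP4D AU columns. The file path is illustrative.
import pandas as pd

train_list = pd.read_csv("data/BP4D/labels/0/train_100.csv")
au_cols = [c for c in train_list.columns if c.startswith("au")]
print(au_cols)                              # ['au1', 'au2', ..., 'au24']
print(train_list["image_path"].iloc[0])     # e.g. BP4D/aligned_images/F015/T6/01406.jpg
print(train_list[au_cols].values.shape)     # (100, 12) binary AU labels
```
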
/data/BP4D/labels/0/train_100.csv:
--------------------------------------------------------------------------------
1 | ,image_path,au1,au2,au4,au6,au7,au10,au12,au14,au15,au17,au23,au24
2 | 70764,BP4D/aligned_images/F015/T6/01406.jpg,0,0,0,0,0,0,0,0,0,0,0,0
3 | 69527,BP4D/aligned_images/F015/T4/01921.jpg,1,0,0,1,1,1,1,1,0,1,0,0
4 | 17114,BP4D/aligned_images/F020/T6/00983.jpg,0,0,0,1,1,1,0,0,0,0,0,0
5 | 71986,BP4D/aligned_images/F017/T1/00629.jpg,0,0,0,1,1,1,1,1,0,1,1,1
6 | 15522,BP4D/aligned_images/F020/T2/01339.jpg,1,0,1,0,1,0,0,0,0,1,0,0
7 | 50345,BP4D/aligned_images/F004/T1/01086.jpg,0,0,0,1,1,1,1,1,0,0,0,0
8 | 12471,BP4D/aligned_images/F013/T3/00145.jpg,0,0,0,0,0,0,0,0,0,0,0,0
9 | 78909,BP4D/aligned_images/F021/T1/00215.jpg,0,0,0,1,1,1,1,0,0,0,0,0
10 | 9894,BP4D/aligned_images/F011/T5/01304.jpg,0,0,0,1,1,1,1,0,0,0,0,0
11 | 91859,BP4D/aligned_images/M009/T7/01058.jpg,0,0,0,1,1,1,1,1,0,0,0,0
12 | 35505,BP4D/aligned_images/M011/T7/01881.jpg,0,1,1,0,0,0,0,1,0,0,0,1
13 | 78968,BP4D/aligned_images/F021/T1/00274.jpg,0,0,0,1,1,1,1,0,0,0,0,0
14 | 43143,BP4D/aligned_images/M017/T1/00430.jpg,0,0,0,0,0,1,1,0,0,0,0,0
15 | 8482,BP4D/aligned_images/F011/T2/01226.jpg,0,0,0,0,0,0,0,0,0,1,0,0
16 | 93904,BP4D/aligned_images/M015/T4/00882.jpg,0,0,0,1,0,1,1,0,0,0,0,0
17 | 85317,BP4D/aligned_images/M003/T8/00252.jpg,1,1,0,1,1,1,1,1,1,0,1,0
18 | 61355,BP4D/aligned_images/F012/T1/00565.jpg,0,0,0,1,1,1,1,0,0,0,0,0
19 | 43674,BP4D/aligned_images/M017/T2/01187.jpg,0,0,1,0,0,0,0,0,0,0,0,0
20 | 67612,BP4D/aligned_images/F014/T7/00900.jpg,0,0,0,0,0,1,1,1,0,0,0,0
21 | 91029,BP4D/aligned_images/M009/T5/00841.jpg,0,0,0,0,0,0,0,1,0,0,1,0
22 | 84481,BP4D/aligned_images/M003/T6/00148.jpg,0,0,0,1,1,1,1,0,1,0,0,0
23 | 33464,BP4D/aligned_images/M011/T3/00085.jpg,0,0,1,0,0,0,0,0,0,0,0,0
24 | 55644,BP4D/aligned_images/F006/T4/00939.jpg,1,1,0,1,0,1,1,0,0,0,0,0
25 | 2962,BP4D/aligned_images/F003/T7/00307.jpg,1,1,0,1,1,1,1,1,0,0,0,0
26 | 62780,BP4D/aligned_images/F012/T4/01069.jpg,0,1,0,1,1,1,1,0,0,0,0,0
27 | 18839,BP4D/aligned_images/F022/T1/01542.jpg,0,0,0,1,1,1,1,1,1,1,0,0
28 | 88468,BP4D/aligned_images/M006/T7/01090.jpg,0,0,0,0,0,0,0,0,0,1,1,0
29 | 25662,BP4D/aligned_images/M002/T8/00494.jpg,1,1,0,1,1,0,0,0,0,0,0,0
30 | 76460,BP4D/aligned_images/F019/T3/00267.jpg,0,0,1,1,1,1,1,0,0,0,0,0
31 | 35691,BP4D/aligned_images/M011/T8/00081.jpg,1,1,0,1,1,1,0,1,0,1,0,0
32 | 36510,BP4D/aligned_images/M013/T2/00145.jpg,0,0,0,0,0,0,0,1,0,1,0,1
33 | 88266,BP4D/aligned_images/M006/T7/00888.jpg,0,0,1,0,0,1,0,0,0,1,0,0
34 | 66727,BP4D/aligned_images/F014/T5/00858.jpg,1,0,0,0,1,0,0,1,0,1,1,0
35 | 46395,BP4D/aligned_images/M017/T8/00069.jpg,0,0,1,0,1,1,0,1,0,0,0,0
36 | 30379,BP4D/aligned_images/M010/T3/00293.jpg,0,0,0,0,0,1,1,1,0,1,0,0
37 | 57507,BP4D/aligned_images/F007/T1/00680.jpg,0,0,0,1,1,1,1,1,0,0,1,0
38 | 53598,BP4D/aligned_images/F004/T8/00090.jpg,1,0,0,0,0,1,1,0,0,1,0,0
39 | 36283,BP4D/aligned_images/M013/T1/01080.jpg,0,0,0,1,1,1,1,1,0,1,0,1
40 | 93592,BP4D/aligned_images/M015/T3/00171.jpg,0,0,1,0,1,1,0,1,0,1,0,0
41 | 45378,BP4D/aligned_images/M017/T6/01008.jpg,0,0,1,1,1,1,0,0,0,1,1,1
42 | 86317,BP4D/aligned_images/M006/T2/01574.jpg,0,0,0,0,0,0,0,0,0,0,0,0
43 | 95777,BP4D/aligned_images/M015/T8/00227.jpg,0,0,0,1,1,1,1,1,0,0,0,0
44 | 25596,BP4D/aligned_images/M002/T8/00428.jpg,0,0,0,1,1,0,0,1,0,0,0,0
45 | 85877,BP4D/aligned_images/M006/T2/01134.jpg,0,0,0,0,0,0,0,0,0,0,0,0
46 | 31573,BP4D/aligned_images/M010/T6/00756.jpg,0,0,0,0,0,0,0,0,0,0,0,0
47 | 36127,BP4D/aligned_images/M013/T1/00924.jpg,0,0,0,1,1,0,1,1,0,1,0,1
48 | 22058,BP4D/aligned_images/M002/T1/00243.jpg,0,0,0,0,0,0,0,1,0,0,1,1
49 | 53970,BP4D/aligned_images/F006/T1/00486.jpg,0,1,0,0,0,0,1,0,0,1,0,1
50 | 67766,BP4D/aligned_images/F014/T7/01054.jpg,1,0,0,0,0,0,1,1,0,0,1,0
51 | 3909,BP4D/aligned_images/F005/T1/01084.jpg,1,0,0,0,0,0,1,0,1,1,0,1
52 | 60632,BP4D/aligned_images/F007/T7/01094.jpg,0,0,0,1,1,1,1,1,0,0,0,0
53 | 41107,BP4D/aligned_images/M016/T4/00600.jpg,0,0,0,1,1,1,1,1,0,0,0,0
54 | 95832,BP4D/aligned_images/M015/T8/00282.jpg,0,0,1,1,1,1,0,1,0,0,0,0
55 | 82137,BP4D/aligned_images/M003/T1/00221.jpg,0,0,0,1,0,1,1,1,0,0,0,0
56 | 81098,BP4D/aligned_images/F021/T6/00210.jpg,1,0,0,1,1,1,1,1,0,0,0,0
57 | 28317,BP4D/aligned_images/M005/T6/00478.jpg,1,0,1,0,0,1,1,1,0,0,0,0
58 | 76878,BP4D/aligned_images/F019/T4/00417.jpg,0,0,0,1,1,1,1,0,0,1,0,1
59 | 86873,BP4D/aligned_images/M006/T4/00752.jpg,0,0,0,1,1,1,1,0,1,1,0,0
60 | 52258,BP4D/aligned_images/F004/T5/01014.jpg,1,1,0,1,1,1,1,1,0,0,0,0
61 | 32110,BP4D/aligned_images/M010/T7/00330.jpg,0,0,0,0,0,1,1,1,1,0,0,1
62 | 20786,BP4D/aligned_images/F022/T6/00037.jpg,0,0,0,0,0,0,0,0,0,1,0,0
63 | 10889,BP4D/aligned_images/F011/T7/01253.jpg,0,0,0,0,0,1,1,0,0,0,0,0
64 | 37672,BP4D/aligned_images/M013/T4/00734.jpg,0,0,0,0,1,1,0,0,0,0,1,0
65 | 58112,BP4D/aligned_images/F007/T2/00162.jpg,0,0,1,1,1,0,0,0,0,0,0,0
66 | 2245,BP4D/aligned_images/F003/T5/00860.jpg,0,0,0,1,0,0,0,1,0,0,0,0
67 | 10798,BP4D/aligned_images/F011/T7/01162.jpg,0,0,0,1,1,1,1,1,0,1,1,0
68 | 22969,BP4D/aligned_images/M002/T2/00449.jpg,0,0,0,0,0,0,0,1,0,0,0,0
69 | 64842,BP4D/aligned_images/F014/T1/00671.jpg,0,0,0,0,0,0,0,1,0,1,0,0
70 | 35168,BP4D/aligned_images/M011/T7/01544.jpg,0,0,0,0,0,0,0,0,0,0,0,0
71 | 49902,BP4D/aligned_images/M018/T7/00998.jpg,0,0,0,0,0,0,0,0,0,0,0,0
72 | 67876,BP4D/aligned_images/F014/T8/00045.jpg,0,0,1,0,1,0,1,0,1,1,1,1
73 | 62195,BP4D/aligned_images/F012/T3/00174.jpg,0,0,1,1,1,1,0,1,0,0,0,0
74 | 74153,BP4D/aligned_images/F017/T6/00976.jpg,0,0,0,0,0,0,0,0,0,0,1,1
75 | 92044,BP4D/aligned_images/M009/T7/01243.jpg,0,0,0,1,1,1,1,1,0,0,0,0
76 | 18549,BP4D/aligned_images/F022/T1/01252.jpg,0,0,0,1,1,0,0,1,1,1,1,0
77 | 67062,BP4D/aligned_images/F014/T6/00713.jpg,0,0,0,0,1,1,0,1,0,1,0,0
78 | 65087,BP4D/aligned_images/F014/T2/00583.jpg,0,0,0,0,0,0,0,0,0,0,0,0
79 | 56146,BP4D/aligned_images/F006/T5/00894.jpg,0,1,0,0,0,1,1,0,0,1,0,0
80 | 45961,BP4D/aligned_images/M017/T7/00934.jpg,0,0,0,0,1,1,1,1,0,0,0,0
81 | 50423,BP4D/aligned_images/F004/T1/01164.jpg,0,0,0,1,1,1,1,1,1,1,1,1
82 | 50522,BP4D/aligned_images/F004/T2/01147.jpg,1,0,1,0,0,0,0,0,0,0,0,0
83 | 10521,BP4D/aligned_images/F011/T7/00885.jpg,0,0,0,1,0,1,1,1,1,0,0,0
84 | 3311,BP4D/aligned_images/F003/T8/00547.jpg,1,1,0,1,1,1,1,0,0,0,0,0
85 | 57926,BP4D/aligned_images/F007/T1/01099.jpg,0,0,0,1,1,1,1,1,0,1,0,0
86 | 87175,BP4D/aligned_images/M006/T5/00805.jpg,0,0,0,0,0,1,0,0,0,1,0,0
87 | 28839,BP4D/aligned_images/M005/T7/01393.jpg,0,0,0,1,1,1,1,1,0,1,0,1
88 | 86486,BP4D/aligned_images/M006/T3/00147.jpg,0,0,0,0,0,0,1,0,0,0,0,0
89 | 8843,BP4D/aligned_images/F011/T3/00279.jpg,0,0,0,1,1,1,1,1,0,0,0,0
90 | 9636,BP4D/aligned_images/F011/T5/01046.jpg,0,0,0,1,1,1,1,0,0,0,0,0
91 | 40944,BP4D/aligned_images/M016/T4/00437.jpg,0,0,0,1,1,1,1,1,0,1,0,0
92 | 11144,BP4D/aligned_images/F011/T8/00217.jpg,0,0,0,1,1,1,1,1,0,1,1,0
93 | 26110,BP4D/aligned_images/M005/T1/00717.jpg,0,0,0,1,1,1,1,0,0,0,0,0
94 | 70740,BP4D/aligned_images/F015/T6/01382.jpg,0,0,0,0,0,0,0,0,0,0,0,0
95 | 64000,BP4D/aligned_images/F012/T7/01703.jpg,0,0,0,1,0,0,1,0,1,1,1,0
96 | 89747,BP4D/aligned_images/M009/T2/00885.jpg,0,0,1,0,0,0,0,0,0,0,0,0
97 | 30458,BP4D/aligned_images/M010/T4/00489.jpg,0,0,0,0,0,1,0,1,0,1,0,1
98 | 75429,BP4D/aligned_images/F019/T1/00421.jpg,0,0,0,1,1,1,1,1,0,0,0,0
99 | 20555,BP4D/aligned_images/F022/T5/00584.jpg,0,0,0,1,1,0,1,1,1,1,1,1
100 | 30744,BP4D/aligned_images/M010/T4/00775.jpg,0,0,0,0,0,1,1,1,0,1,0,0
101 | 67278,BP4D/aligned_images/F014/T6/00929.jpg,0,0,1,0,1,1,1,1,0,1,1,1
102 |
--------------------------------------------------------------------------------
/data/BP4D/labels/1/train_100.csv:
--------------------------------------------------------------------------------
1 | ,image_path,au1,au2,au4,au6,au7,au10,au12,au14,au15,au17,au23,au24
2 | 11491,BP4D/aligned_images/F012/T2/01230.jpg,0,0,0,0,1,1,0,0,0,1,0,0
3 | 13694,BP4D/aligned_images/F012/T6/00492.jpg,0,0,0,1,1,1,1,0,0,0,0,0
4 | 30831,BP4D/aligned_images/F021/T6/00100.jpg,1,0,1,1,1,1,1,1,0,0,0,0
5 | 93228,BP4D/aligned_images/M014/T1/00853.jpg,0,0,0,1,1,1,1,1,0,0,0,0
6 | 41709,BP4D/aligned_images/M009/T7/01065.jpg,0,0,0,1,1,1,1,1,0,0,0,0
7 | 89217,BP4D/aligned_images/M012/T1/00727.jpg,0,0,0,1,1,1,1,1,0,0,0,0
8 | 15677,BP4D/aligned_images/F014/T3/00262.jpg,1,0,0,0,0,0,0,0,0,0,0,0
9 | 64612,BP4D/aligned_images/F016/T2/00818.jpg,0,0,0,0,0,0,0,0,0,0,0,0
10 | 33004,BP4D/aligned_images/M003/T3/00014.jpg,0,0,0,0,0,0,0,0,0,0,0,0
11 | 39617,BP4D/aligned_images/M009/T2/00912.jpg,0,0,1,0,0,0,0,0,0,0,0,0
12 | 87593,BP4D/aligned_images/M008/T5/00907.jpg,0,0,0,0,0,0,0,0,0,0,0,0
13 | 85074,BP4D/aligned_images/M007/T7/00244.jpg,0,0,0,1,1,1,0,0,0,0,0,0
14 | 2314,BP4D/aligned_images/F004/T5/01227.jpg,1,0,0,1,1,1,1,1,0,0,0,0
15 | 72764,BP4D/aligned_images/F023/T5/00409.jpg,1,1,0,0,1,1,0,1,0,0,0,0
16 | 48938,BP4D/aligned_images/F001/T7/00714.jpg,1,0,0,0,1,1,1,1,0,0,0,0
17 | 37170,BP4D/aligned_images/M006/T5/00957.jpg,0,0,0,0,0,0,1,0,0,1,0,0
18 | 39254,BP4D/aligned_images/M009/T2/00549.jpg,0,0,0,0,0,0,0,0,0,0,0,0
19 | 27023,BP4D/aligned_images/F019/T5/00424.jpg,0,1,0,1,1,1,1,1,1,1,1,1
20 | 18528,BP4D/aligned_images/F015/T2/01205.jpg,0,0,0,0,0,0,0,0,0,0,0,0
21 | 69287,BP4D/aligned_images/F018/T5/00519.jpg,0,0,0,0,1,1,0,0,1,0,0,0
22 | 50211,BP4D/aligned_images/F002/T2/00076.jpg,0,0,0,0,0,0,0,0,0,1,1,1
23 | 73032,BP4D/aligned_images/F023/T5/00677.jpg,0,0,0,0,0,1,0,1,0,1,1,0
24 | 64896,BP4D/aligned_images/F016/T3/00091.jpg,0,0,0,0,0,0,0,0,0,0,0,0
25 | 87720,BP4D/aligned_images/M008/T5/01034.jpg,0,1,0,1,1,1,1,1,0,1,1,0
26 | 7913,BP4D/aligned_images/F007/T2/00120.jpg,0,0,1,1,1,0,0,0,0,0,0,0
27 | 33737,BP4D/aligned_images/M003/T5/00918.jpg,1,1,0,1,1,1,1,0,0,0,0,0
28 | 35815,BP4D/aligned_images/M006/T2/01229.jpg,0,0,0,0,1,0,0,0,0,0,0,0
29 | 76913,BP4D/aligned_images/M001/T5/00724.jpg,1,1,0,0,0,0,1,1,0,0,0,0
30 | 12942,BP4D/aligned_images/F012/T5/00292.jpg,0,0,0,0,0,1,1,1,1,0,0,0
31 | 82793,BP4D/aligned_images/M007/T2/01252.jpg,0,0,1,0,1,0,0,0,0,0,0,0
32 | 15682,BP4D/aligned_images/F014/T3/00267.jpg,1,0,0,0,0,0,0,0,0,0,0,0
33 | 59574,BP4D/aligned_images/F009/T6/00791.jpg,0,0,1,0,0,0,0,0,0,0,0,0
34 | 69337,BP4D/aligned_images/F018/T5/00569.jpg,1,1,0,1,1,1,1,1,1,0,0,0
35 | 67208,BP4D/aligned_images/F016/T8/00148.jpg,0,0,0,1,1,1,1,1,1,1,0,1
36 | 42499,BP4D/aligned_images/M015/T1/01059.jpg,0,1,0,0,0,1,1,0,0,0,0,0
37 | 45307,BP4D/aligned_images/M015/T7/00582.jpg,0,0,0,0,1,1,1,1,0,0,0,0
38 | 87810,BP4D/aligned_images/M008/T5/01124.jpg,1,1,1,1,1,1,1,0,0,1,0,0
39 | 25438,BP4D/aligned_images/F019/T1/00587.jpg,0,0,0,1,1,1,1,1,0,0,0,0
40 | 89481,BP4D/aligned_images/M012/T1/00991.jpg,0,0,0,0,1,1,1,0,0,0,0,0
41 | 12231,BP4D/aligned_images/F012/T3/00367.jpg,1,0,0,1,1,1,1,1,0,0,0,0
42 | 63475,BP4D/aligned_images/F010/T7/00422.jpg,0,1,0,0,0,1,0,0,0,0,0,0
43 | 35161,BP4D/aligned_images/M003/T8/00253.jpg,1,1,0,1,1,1,1,1,1,0,1,0
44 | 10200,BP4D/aligned_images/F007/T7/00819.jpg,1,0,0,1,0,1,0,0,0,0,0,0
45 | 22026,BP4D/aligned_images/F017/T2/01282.jpg,0,0,1,1,1,0,0,1,0,1,0,1
46 | 69333,BP4D/aligned_images/F018/T5/00565.jpg,1,1,0,1,1,1,1,1,1,0,0,0
47 | 28782,BP4D/aligned_images/F021/T1/00245.jpg,0,0,0,1,1,1,1,0,0,0,0,0
48 | 37658,BP4D/aligned_images/M006/T6/00792.jpg,0,0,1,1,1,1,0,1,0,1,1,1
49 | 78592,BP4D/aligned_images/M001/T8/00301.jpg,0,0,0,1,1,1,1,0,0,0,0,0
50 | 86629,BP4D/aligned_images/M008/T2/00540.jpg,0,0,0,0,0,0,0,1,0,1,1,0
51 | 80746,BP4D/aligned_images/M004/T5/00574.jpg,0,0,1,1,1,1,1,0,1,1,1,0
52 | 53061,BP4D/aligned_images/F002/T7/01092.jpg,0,0,0,0,1,0,1,0,0,1,0,0
53 | 41120,BP4D/aligned_images/M009/T6/00376.jpg,0,0,1,0,0,0,0,1,1,1,0,1
54 | 37441,BP4D/aligned_images/M006/T6/00575.jpg,0,0,1,1,1,1,0,0,0,1,1,1
55 | 59811,BP4D/aligned_images/F009/T6/01028.jpg,0,0,1,0,0,0,0,0,0,0,1,0
56 | 32134,BP4D/aligned_images/M003/T1/00383.jpg,0,0,0,0,0,1,0,1,1,0,0,0
57 | 95759,BP4D/aligned_images/M014/T7/00827.jpg,0,0,0,0,0,1,1,0,0,0,0,0
58 | 48417,BP4D/aligned_images/F001/T6/00304.jpg,0,0,0,1,1,1,1,1,0,1,1,1
59 | 35178,BP4D/aligned_images/M006/T1/00755.jpg,0,0,0,0,0,0,0,1,0,0,0,0
60 | 91984,BP4D/aligned_images/M012/T6/00492.jpg,0,0,0,0,0,0,0,0,0,0,0,0
61 | 95070,BP4D/aligned_images/M014/T5/01321.jpg,0,0,0,1,0,1,1,1,0,0,0,0
62 | 72200,BP4D/aligned_images/F023/T4/00709.jpg,0,0,0,1,1,1,1,1,1,1,1,0
63 | 64280,BP4D/aligned_images/F016/T1/00497.jpg,0,0,0,1,1,1,1,0,0,0,0,0
64 | 57801,BP4D/aligned_images/F009/T2/00129.jpg,0,0,1,0,0,0,0,0,0,0,0,0
65 | 2076,BP4D/aligned_images/F004/T5/00989.jpg,1,0,0,1,1,1,1,1,0,0,0,0
66 | 26458,BP4D/aligned_images/F019/T4/00154.jpg,0,0,0,1,1,1,1,0,1,1,1,0
67 | 46848,BP4D/aligned_images/F001/T3/00035.jpg,0,0,0,0,0,0,0,0,0,0,0,0
68 | 74818,BP4D/aligned_images/M001/T1/01001.jpg,0,0,0,1,1,1,1,0,0,0,0,0
69 | 56841,BP4D/aligned_images/F008/T8/00094.jpg,0,0,0,0,0,0,0,0,0,0,0,0
70 | 72900,BP4D/aligned_images/F023/T5/00545.jpg,1,1,0,1,1,1,1,1,1,1,1,1
71 | 35826,BP4D/aligned_images/M006/T2/01240.jpg,0,0,0,0,1,0,0,0,0,0,0,0
72 | 63777,BP4D/aligned_images/F010/T8/00163.jpg,0,0,0,1,1,1,1,0,1,1,1,0
73 | 57061,BP4D/aligned_images/F008/T8/00314.jpg,0,0,0,1,1,1,1,0,0,0,0,0
74 | 78298,BP4D/aligned_images/M001/T8/00007.jpg,0,1,0,0,0,0,0,0,0,0,0,0
75 | 43589,BP4D/aligned_images/M015/T3/00325.jpg,0,1,0,1,1,1,1,1,0,0,0,0
76 | 33087,BP4D/aligned_images/M003/T3/00097.jpg,0,0,0,0,0,0,0,0,0,0,0,0
77 | 63150,BP4D/aligned_images/F010/T6/00546.jpg,0,0,0,0,1,1,1,0,0,0,0,0
78 | 35889,BP4D/aligned_images/M006/T2/01303.jpg,0,0,0,0,0,0,0,0,0,0,0,0
79 | 68615,BP4D/aligned_images/F018/T4/00512.jpg,0,0,0,1,1,1,1,1,1,0,0,0
80 | 9461,BP4D/aligned_images/F007/T5/00390.jpg,1,0,1,1,1,1,1,1,1,0,1,0
81 | 69269,BP4D/aligned_images/F018/T5/00501.jpg,0,0,0,0,1,1,1,1,1,0,0,0
82 | 9090,BP4D/aligned_images/F007/T4/01539.jpg,1,1,1,1,1,1,1,0,0,0,1,0
83 | 14715,BP4D/aligned_images/F014/T1/00701.jpg,0,0,0,0,0,0,0,1,0,1,0,0
84 | 19495,BP4D/aligned_images/F015/T4/02046.jpg,0,0,0,0,0,1,0,1,0,0,0,0
85 | 51772,BP4D/aligned_images/F002/T5/00928.jpg,0,0,1,0,1,0,1,0,0,1,0,0
86 | 28781,BP4D/aligned_images/F021/T1/00244.jpg,0,0,0,1,1,1,1,0,0,0,0,0
87 | 79194,BP4D/aligned_images/M004/T2/00966.jpg,0,0,1,0,1,0,0,0,0,0,0,0
88 | 31035,BP4D/aligned_images/F021/T6/00304.jpg,1,0,1,1,1,1,1,1,1,1,1,0
89 | 89273,BP4D/aligned_images/M012/T1/00783.jpg,0,0,0,1,1,1,1,1,0,0,0,0
90 | 50385,BP4D/aligned_images/F002/T2/00250.jpg,0,0,0,0,0,0,0,0,0,1,1,1
91 | 19467,BP4D/aligned_images/F015/T4/02018.jpg,1,0,0,0,0,1,0,1,0,1,1,0
92 | 75987,BP4D/aligned_images/M001/T3/00217.jpg,0,0,0,0,0,0,0,0,0,0,0,0
93 | 63417,BP4D/aligned_images/F010/T7/00364.jpg,1,1,0,0,1,1,0,0,0,0,0,0
94 | 85614,BP4D/aligned_images/M007/T8/00293.jpg,1,0,1,0,1,1,0,0,0,0,0,0
95 | 60236,BP4D/aligned_images/F009/T7/01419.jpg,0,0,0,0,0,1,0,1,0,0,0,0
96 | 31697,BP4D/aligned_images/F021/T8/00134.jpg,1,0,1,0,0,1,0,1,1,0,1,0
97 | 11412,BP4D/aligned_images/F012/T2/01151.jpg,0,0,0,0,0,0,0,0,0,1,0,0
98 | 49178,BP4D/aligned_images/F001/T8/00454.jpg,0,0,0,0,0,1,1,1,0,1,0,1
99 | 62188,BP4D/aligned_images/F010/T5/00018.jpg,0,0,0,1,1,1,1,1,0,0,0,0
100 | 87726,BP4D/aligned_images/M008/T5/01040.jpg,0,1,0,1,1,1,1,1,0,0,1,0
101 | 35001,BP4D/aligned_images/M003/T8/00093.jpg,0,0,1,1,1,1,1,1,1,0,1,1
102 |
--------------------------------------------------------------------------------
/data/BP4D/labels/2/train_100.csv:
--------------------------------------------------------------------------------
1 | ,image_path,au1,au2,au4,au6,au7,au10,au12,au14,au15,au17,au23,au24
2 | 99340,BP4D/aligned_images/M018/T5/00515.jpg,0,0,1,1,1,1,1,1,0,1,0,0
3 | 24893,BP4D/aligned_images/F018/T8/00004.jpg,0,0,0,0,0,1,0,1,1,0,0,0
4 | 2891,BP4D/aligned_images/F001/T6/00465.jpg,0,0,0,1,0,1,1,1,0,0,1,0
5 | 7306,BP4D/aligned_images/F002/T7/01024.jpg,0,0,0,1,1,0,1,0,0,0,0,0
6 | 94253,BP4D/aligned_images/M017/T2/01187.jpg,0,0,1,0,0,0,0,0,0,0,0,0
7 | 39856,BP4D/aligned_images/M007/T8/00222.jpg,0,0,0,0,0,1,1,0,0,0,0,0
8 | 10575,BP4D/aligned_images/F008/T6/00599.jpg,0,0,0,1,1,1,1,1,1,0,0,0
9 | 95844,BP4D/aligned_images/M017/T5/01049.jpg,0,0,0,1,1,1,1,1,0,0,0,0
10 | 86691,BP4D/aligned_images/M013/T1/00909.jpg,0,0,0,1,1,1,1,1,0,1,1,0
11 | 68312,BP4D/aligned_images/F020/T7/01044.jpg,0,0,0,0,0,0,1,1,0,0,1,0
12 | 82700,BP4D/aligned_images/M010/T7/00341.jpg,0,0,0,1,0,1,1,0,0,0,0,0
13 | 81497,BP4D/aligned_images/M010/T5/00528.jpg,1,0,0,0,0,1,1,1,0,0,0,0
14 | 74714,BP4D/aligned_images/M002/T5/00842.jpg,0,0,0,0,1,1,0,1,0,1,1,0
15 | 46041,BP4D/aligned_images/M012/T6/00236.jpg,0,0,0,0,0,1,0,0,0,0,0,0
16 | 50081,BP4D/aligned_images/M014/T7/00836.jpg,0,0,0,0,0,1,1,0,0,0,0,0
17 | 49698,BP4D/aligned_images/M014/T6/00500.jpg,0,0,0,1,1,0,1,0,0,1,1,0
18 | 77756,BP4D/aligned_images/M005/T4/00389.jpg,1,1,0,0,1,1,1,0,0,1,0,0
19 | 90381,BP4D/aligned_images/M016/T1/00745.jpg,0,0,0,1,1,1,1,1,0,0,0,0
20 | 11380,BP4D/aligned_images/F008/T8/00320.jpg,0,0,0,1,1,1,1,0,0,0,0,0
21 | 29257,BP4D/aligned_images/M001/T1/01127.jpg,0,0,0,1,1,1,1,0,0,0,0,0
22 | 40890,BP4D/aligned_images/M008/T2/00488.jpg,0,0,0,0,0,0,0,0,0,0,0,0
23 | 76980,BP4D/aligned_images/M005/T2/01110.jpg,0,0,0,0,0,0,0,1,0,1,0,1
24 | 68709,BP4D/aligned_images/F020/T8/00040.jpg,0,0,1,1,1,1,1,1,1,1,0,1
25 | 3311,BP4D/aligned_images/F001/T7/00774.jpg,1,0,0,1,1,1,1,1,0,0,0,0
26 | 30378,BP4D/aligned_images/M001/T4/00571.jpg,0,0,0,0,1,1,1,0,0,0,0,0
27 | 96387,BP4D/aligned_images/M017/T6/01438.jpg,0,0,0,1,1,1,1,0,0,0,0,0
28 | 43089,BP4D/aligned_images/M008/T7/00634.jpg,0,0,0,0,1,1,1,0,0,0,0,0
29 | 93427,BP4D/aligned_images/M016/T8/00075.jpg,0,0,0,0,0,0,0,0,0,1,0,1
30 | 28103,BP4D/aligned_images/F023/T7/00150.jpg,0,0,0,0,0,0,0,0,0,0,0,0
31 | 100264,BP4D/aligned_images/M018/T7/00781.jpg,0,0,0,0,0,0,0,0,0,0,0,0
32 | 43093,BP4D/aligned_images/M008/T7/00638.jpg,0,0,0,0,1,1,1,0,0,0,1,0
33 | 21136,BP4D/aligned_images/F016/T7/00276.jpg,0,0,0,1,1,1,1,1,0,0,0,0
34 | 19214,BP4D/aligned_images/F016/T3/00096.jpg,0,0,0,0,0,0,0,0,0,0,0,0
35 | 9960,BP4D/aligned_images/F008/T5/00720.jpg,1,1,0,1,1,1,1,0,1,0,0,0
36 | 25053,BP4D/aligned_images/F018/T8/00164.jpg,0,0,0,1,1,1,1,1,1,1,0,0
37 | 19312,BP4D/aligned_images/F016/T3/00194.jpg,0,0,0,0,1,1,0,1,0,0,0,0
38 | 39515,BP4D/aligned_images/M007/T7/00372.jpg,0,0,0,0,1,1,0,0,0,1,0,0
39 | 21928,BP4D/aligned_images/F018/T1/00912.jpg,0,0,0,1,1,1,1,1,1,0,0,0
40 | 40397,BP4D/aligned_images/M008/T1/00425.jpg,0,0,0,1,1,1,1,0,0,0,0,0
41 | 81328,BP4D/aligned_images/M010/T4/00780.jpg,0,0,0,0,0,1,1,1,0,1,0,0
42 | 77346,BP4D/aligned_images/M005/T2/01476.jpg,0,0,1,1,1,1,0,1,0,1,0,1
43 | 83716,BP4D/aligned_images/M011/T2/00328.jpg,0,0,0,0,0,1,0,0,0,0,0,0
44 | 100624,BP4D/aligned_images/M018/T8/00090.jpg,0,0,1,1,1,1,0,1,0,0,0,0
45 | 20835,BP4D/aligned_images/F016/T6/01297.jpg,1,0,1,1,1,1,0,1,0,1,0,1
46 | 47469,BP4D/aligned_images/M014/T1/00781.jpg,0,0,0,1,1,1,1,1,0,1,0,0
47 | 35234,BP4D/aligned_images/M004/T5/00749.jpg,0,0,0,1,1,1,1,0,1,1,1,0
48 | 56483,BP4D/aligned_images/F005/T5/01212.jpg,1,0,1,0,0,0,0,0,0,0,0,0
49 | 7017,BP4D/aligned_images/F002/T7/00735.jpg,0,0,0,1,1,0,1,0,0,0,0,0
50 | 68035,BP4D/aligned_images/F020/T6/01325.jpg,0,0,0,1,1,1,0,0,0,1,0,0
51 | 79794,BP4D/aligned_images/M010/T1/00205.jpg,0,0,0,0,0,1,1,0,0,0,0,0
52 | 17733,BP4D/aligned_images/F010/T7/00367.jpg,1,1,0,0,1,1,0,0,0,0,0,0
53 | 100729,BP4D/aligned_images/M018/T8/00195.jpg,0,0,1,0,1,1,0,1,0,0,0,0
54 | 3243,BP4D/aligned_images/F001/T7/00706.jpg,1,0,0,0,1,1,1,1,0,0,0,0
55 | 19223,BP4D/aligned_images/F016/T3/00105.jpg,0,0,0,0,0,0,0,0,0,0,0,0
56 | 79234,BP4D/aligned_images/M005/T7/01209.jpg,1,0,0,1,1,0,1,0,0,1,0,0
57 | 62814,BP4D/aligned_images/F013/T2/00409.jpg,0,0,0,0,0,0,0,0,0,0,0,0
58 | 67148,BP4D/aligned_images/F020/T4/00460.jpg,0,0,0,1,1,1,1,1,1,1,0,0
59 | 59744,BP4D/aligned_images/F011/T4/00182.jpg,1,1,0,1,1,0,1,0,0,1,1,0
60 | 17680,BP4D/aligned_images/F010/T7/00314.jpg,1,1,0,1,1,1,1,0,0,0,0,0
61 | 25207,BP4D/aligned_images/F023/T1/00060.jpg,0,0,0,1,1,1,1,1,1,0,1,0
62 | 92152,BP4D/aligned_images/M016/T5/00984.jpg,0,0,0,1,1,1,1,0,0,0,0,0
63 | 22602,BP4D/aligned_images/F018/T2/00483.jpg,0,0,0,1,1,1,0,0,1,1,0,0
64 | 61912,BP4D/aligned_images/F011/T8/00406.jpg,0,0,1,0,0,1,1,1,0,1,0,0
65 | 37938,BP4D/aligned_images/M007/T4/00193.jpg,0,0,0,0,0,1,1,1,0,1,0,0
66 | 17161,BP4D/aligned_images/F010/T6/00244.jpg,0,0,0,0,1,1,0,0,0,0,0,0
67 | 7903,BP4D/aligned_images/F008/T1/00326.jpg,0,1,0,1,1,1,1,0,0,0,0,0
68 | 83872,BP4D/aligned_images/M011/T2/00484.jpg,0,0,0,0,0,1,0,1,1,1,1,1
69 | 24969,BP4D/aligned_images/F018/T8/00080.jpg,0,0,0,0,1,1,0,1,1,0,0,0
70 | 98916,BP4D/aligned_images/M018/T4/01429.jpg,0,0,0,1,0,1,1,0,0,0,0,0
71 | 69285,BP4D/aligned_images/F022/T1/01409.jpg,0,0,0,1,1,0,0,1,1,1,0,1
72 | 34378,BP4D/aligned_images/M004/T4/01088.jpg,0,0,0,0,1,1,1,0,0,1,0,0
73 | 56866,BP4D/aligned_images/F005/T6/00535.jpg,0,0,1,1,1,1,0,1,0,0,0,0
74 | 80975,BP4D/aligned_images/M010/T3/00310.jpg,0,0,0,0,0,1,1,1,0,1,1,0
75 | 72011,BP4D/aligned_images/F022/T7/00790.jpg,0,0,1,1,1,1,1,1,1,1,1,0
76 | 67997,BP4D/aligned_images/F020/T6/01287.jpg,1,1,0,1,1,1,0,0,0,1,0,0
77 | 8130,BP4D/aligned_images/F008/T1/00553.jpg,0,1,0,1,0,1,1,0,0,0,0,0
78 | 56402,BP4D/aligned_images/F005/T5/01131.jpg,1,0,1,0,0,0,0,0,0,0,0,0
79 | 83311,BP4D/aligned_images/M011/T1/01066.jpg,0,0,0,1,0,1,1,1,0,1,0,0
80 | 37094,BP4D/aligned_images/M007/T2/01240.jpg,0,0,1,0,1,0,0,0,0,0,0,0
81 | 20641,BP4D/aligned_images/F016/T6/01103.jpg,0,0,0,1,1,1,0,1,0,1,0,1
82 | 20968,BP4D/aligned_images/F016/T7/00108.jpg,0,0,0,0,0,1,1,0,0,0,0,0
83 | 75512,BP4D/aligned_images/M002/T7/00760.jpg,0,1,0,0,1,1,0,0,1,1,1,1
84 | 43418,BP4D/aligned_images/M008/T8/00043.jpg,0,0,1,1,1,1,0,0,0,0,0,0
85 | 86208,BP4D/aligned_images/M011/T8/00019.jpg,0,1,1,1,1,1,0,1,1,1,0,0
86 | 90249,BP4D/aligned_images/M016/T1/00613.jpg,0,0,1,0,0,0,1,0,0,0,0,0
87 | 16150,BP4D/aligned_images/F010/T4/00248.jpg,0,0,0,1,0,1,1,0,0,1,0,0
88 | 22524,BP4D/aligned_images/F018/T2/00405.jpg,0,0,0,1,1,1,0,0,1,1,0,0
89 | 13136,BP4D/aligned_images/F009/T4/00457.jpg,0,0,0,1,1,1,1,1,0,1,1,0
90 | 57569,BP4D/aligned_images/F005/T7/00663.jpg,1,1,0,0,0,0,0,0,0,0,0,0
91 | 63061,BP4D/aligned_images/F013/T3/00156.jpg,0,0,1,1,0,0,0,1,0,1,1,1
92 | 35870,BP4D/aligned_images/M004/T7/00454.jpg,0,0,0,0,1,1,1,0,1,0,0,0
93 | 54951,BP4D/aligned_images/F005/T2/01274.jpg,1,0,1,0,0,0,0,0,0,0,0,0
94 | 49139,BP4D/aligned_images/M014/T5/01077.jpg,0,0,0,0,0,1,1,1,0,0,0,0
95 | 18588,BP4D/aligned_images/F016/T1/00492.jpg,0,0,0,1,1,1,1,0,0,0,0,0
96 | 64554,BP4D/aligned_images/F013/T6/01379.jpg,0,0,0,0,0,0,0,1,1,1,1,1
97 | 55250,BP4D/aligned_images/F005/T2/01573.jpg,1,0,0,0,0,0,0,0,0,0,0,0
98 | 21193,BP4D/aligned_images/F016/T7/00333.jpg,0,0,0,0,1,1,1,0,0,1,0,0
99 | 13770,BP4D/aligned_images/F009/T5/00617.jpg,1,0,0,0,0,1,1,0,1,0,1,0
100 | 73998,BP4D/aligned_images/M002/T4/00718.jpg,0,0,0,1,1,1,1,0,1,0,0,0
101 | 32834,BP4D/aligned_images/M001/T8/00230.jpg,0,0,0,1,1,0,0,0,0,0,0,0
102 |
--------------------------------------------------------------------------------
/data/DISFA/labels/0/train_100.csv:
--------------------------------------------------------------------------------
1 | ,image_path,au1,au2,au4,au6,au9,au12,au25,au26
2 | 34040,DISFA/aligned_images/SN028/00126.jpg,0,0,0,1,0,1,1,0
3 | 83591,DISFA/aligned_images/SN029/01253.jpg,0,0,0,0,0,0,0,0
4 | 13426,DISFA/aligned_images/SN012/03737.jpg,0,0,0,0,0,0,0,0
5 | 23602,DISFA/aligned_images/SN018/04223.jpg,0,0,0,0,0,0,0,0
6 | 19322,DISFA/aligned_images/SN013/04788.jpg,0,0,0,1,0,1,1,0
7 | 28336,DISFA/aligned_images/SN021/04112.jpg,0,0,0,0,0,0,0,0
8 | 31833,DISFA/aligned_images/SN024/02764.jpg,0,0,0,0,0,0,0,0
9 | 16537,DISFA/aligned_images/SN013/02003.jpg,0,0,0,0,0,0,0,0
10 | 26008,DISFA/aligned_images/SN021/01784.jpg,0,0,1,0,0,0,0,0
11 | 74487,DISFA/aligned_images/SN023/01825.jpg,1,1,1,0,1,0,1,0
12 | 30843,DISFA/aligned_images/SN024/01774.jpg,0,0,0,0,0,0,1,0
13 | 14360,DISFA/aligned_images/SN012/04671.jpg,0,0,0,0,0,0,1,1
14 | 25016,DISFA/aligned_images/SN021/00792.jpg,0,0,0,0,0,0,0,0
15 | 7341,DISFA/aligned_images/SN011/02497.jpg,0,0,1,0,0,0,1,0
16 | 48105,DISFA/aligned_images/SN003/04501.jpg,0,0,0,0,0,0,1,0
17 | 76744,DISFA/aligned_images/SN023/04082.jpg,0,0,0,0,0,0,1,0
18 | 73577,DISFA/aligned_images/SN023/00903.jpg,0,0,0,0,0,0,0,0
19 | 82090,DISFA/aligned_images/SN025/04583.jpg,0,0,0,0,0,0,1,1
20 | 66876,DISFA/aligned_images/SN008/03892.jpg,0,0,0,0,0,0,0,0
21 | 28835,DISFA/aligned_images/SN021/04611.jpg,0,0,0,0,0,0,1,0
22 | 18219,DISFA/aligned_images/SN013/03685.jpg,0,0,0,0,0,0,0,0
23 | 74804,DISFA/aligned_images/SN023/02142.jpg,0,0,0,0,0,0,1,0
24 | 52232,DISFA/aligned_images/SN004/03783.jpg,0,0,0,0,0,0,0,0
25 | 77561,DISFA/aligned_images/SN025/00054.jpg,0,0,0,0,0,0,0,0
26 | 43775,DISFA/aligned_images/SN003/00171.jpg,0,0,0,1,0,1,1,1
27 | 81047,DISFA/aligned_images/SN025/03540.jpg,0,0,0,0,0,0,1,0
28 | 9190,DISFA/aligned_images/SN011/04346.jpg,0,0,1,0,0,0,0,0
29 | 1503,DISFA/aligned_images/SN006/01504.jpg,0,0,0,1,0,1,1,0
30 | 578,DISFA/aligned_images/SN006/00579.jpg,0,0,0,0,0,1,1,0
31 | 9768,DISFA/aligned_images/SN012/00079.jpg,0,0,0,0,0,0,0,0
32 | 84251,DISFA/aligned_images/SN029/01915.jpg,0,0,0,0,0,0,0,0
33 | 41967,DISFA/aligned_images/SN031/03208.jpg,0,0,0,0,0,0,0,0
34 | 68711,DISFA/aligned_images/SN017/00882.jpg,0,0,0,0,0,0,0,0
35 | 56307,DISFA/aligned_images/SN005/03013.jpg,0,0,0,0,0,0,0,0
36 | 15939,DISFA/aligned_images/SN013/01405.jpg,0,0,0,0,0,0,0,0
37 | 19341,DISFA/aligned_images/SN013/04807.jpg,0,0,0,1,0,1,1,0
38 | 29960,DISFA/aligned_images/SN024/00891.jpg,0,0,0,0,0,0,0,0
39 | 8448,DISFA/aligned_images/SN011/03604.jpg,0,0,1,0,0,0,0,0
40 | 42870,DISFA/aligned_images/SN031/04111.jpg,0,0,0,0,0,0,0,0
41 | 75002,DISFA/aligned_images/SN023/02340.jpg,0,0,0,0,0,0,1,0
42 | 15511,DISFA/aligned_images/SN013/00977.jpg,0,0,0,1,0,1,1,1
43 | 9255,DISFA/aligned_images/SN011/04411.jpg,0,1,1,0,0,0,0,0
44 | 82608,DISFA/aligned_images/SN029/00256.jpg,0,0,0,0,0,0,0,0
45 | 73796,DISFA/aligned_images/SN023/01134.jpg,0,0,0,0,0,0,1,0
46 | 29234,DISFA/aligned_images/SN024/00165.jpg,0,0,0,1,0,1,1,0
47 | 45709,DISFA/aligned_images/SN003/02105.jpg,0,0,1,1,1,0,1,1
48 | 41051,DISFA/aligned_images/SN031/02292.jpg,0,0,0,0,0,1,1,1
49 | 21464,DISFA/aligned_images/SN018/02085.jpg,0,0,1,0,0,0,0,0
50 | 29239,DISFA/aligned_images/SN024/00170.jpg,0,0,0,1,0,1,1,0
51 | 76820,DISFA/aligned_images/SN023/04158.jpg,0,0,0,0,0,0,1,0
52 | 50019,DISFA/aligned_images/SN004/01570.jpg,1,1,0,0,0,0,0,0
53 | 48982,DISFA/aligned_images/SN004/00533.jpg,0,0,0,0,0,0,0,0
54 | 61274,DISFA/aligned_images/SN007/03135.jpg,0,0,0,0,0,0,0,0
55 | 84284,DISFA/aligned_images/SN029/01948.jpg,0,0,0,0,0,0,0,0
56 | 60222,DISFA/aligned_images/SN007/02083.jpg,0,0,0,0,0,0,0,0
57 | 66784,DISFA/aligned_images/SN008/03800.jpg,0,0,0,0,0,0,0,0
58 | 40471,DISFA/aligned_images/SN031/01712.jpg,0,0,0,0,0,0,1,0
59 | 8296,DISFA/aligned_images/SN011/03452.jpg,0,0,1,0,0,0,0,0
60 | 44267,DISFA/aligned_images/SN003/00663.jpg,0,0,0,1,0,1,1,0
61 | 83374,DISFA/aligned_images/SN029/01036.jpg,0,0,1,0,0,0,0,0
62 | 66692,DISFA/aligned_images/SN008/03708.jpg,0,0,0,0,0,0,0,0
63 | 36823,DISFA/aligned_images/SN028/02909.jpg,0,0,0,0,0,0,0,0
64 | 24534,DISFA/aligned_images/SN021/00310.jpg,0,0,0,1,0,1,1,0
65 | 45360,DISFA/aligned_images/SN003/01756.jpg,1,0,1,1,1,0,0,0
66 | 82230,DISFA/aligned_images/SN025/04723.jpg,0,0,0,0,0,0,1,0
67 | 23811,DISFA/aligned_images/SN018/04432.jpg,1,1,0,0,0,0,0,0
68 | 60663,DISFA/aligned_images/SN007/02524.jpg,0,0,0,0,0,0,0,0
69 | 20042,DISFA/aligned_images/SN018/00663.jpg,0,0,0,0,0,1,1,0
70 | 44779,DISFA/aligned_images/SN003/01175.jpg,0,0,1,0,1,0,0,0
71 | 64009,DISFA/aligned_images/SN008/01025.jpg,0,0,0,0,0,0,0,0
72 | 62503,DISFA/aligned_images/SN007/04364.jpg,0,0,0,0,0,0,0,0
73 | 2534,DISFA/aligned_images/SN006/02535.jpg,0,0,0,0,0,0,0,0
74 | 40179,DISFA/aligned_images/SN031/01420.jpg,0,0,0,0,0,1,1,1
75 | 33464,DISFA/aligned_images/SN024/04395.jpg,0,0,0,0,0,0,0,0
76 | 28841,DISFA/aligned_images/SN021/04617.jpg,0,0,0,0,0,0,1,0
77 | 61799,DISFA/aligned_images/SN007/03660.jpg,0,0,0,0,0,0,0,0
78 | 56836,DISFA/aligned_images/SN005/03542.jpg,0,0,0,0,0,0,0,0
79 | 27280,DISFA/aligned_images/SN021/03056.jpg,0,0,0,0,0,0,0,0
80 | 69177,DISFA/aligned_images/SN017/01348.jpg,0,0,0,1,0,1,0,0
81 | 17678,DISFA/aligned_images/SN013/03144.jpg,0,0,0,1,1,1,1,0
82 | 64484,DISFA/aligned_images/SN008/01500.jpg,0,0,0,0,0,0,0,0
83 | 8537,DISFA/aligned_images/SN011/03693.jpg,0,0,1,0,0,0,0,0
84 | 14928,DISFA/aligned_images/SN013/00394.jpg,0,0,0,0,0,0,0,0
85 | 76991,DISFA/aligned_images/SN023/04329.jpg,0,0,1,0,0,0,1,0
86 | 19307,DISFA/aligned_images/SN013/04773.jpg,0,0,0,1,0,1,1,0
87 | 37170,DISFA/aligned_images/SN028/03256.jpg,0,0,0,0,0,0,0,0
88 | 32358,DISFA/aligned_images/SN024/03289.jpg,0,0,0,0,0,0,0,0
89 | 29246,DISFA/aligned_images/SN024/00177.jpg,0,0,0,0,0,1,1,0
90 | 29420,DISFA/aligned_images/SN024/00351.jpg,0,0,0,0,0,0,0,0
91 | 41656,DISFA/aligned_images/SN031/02897.jpg,0,0,0,0,0,0,0,0
92 | 6870,DISFA/aligned_images/SN011/02026.jpg,0,0,1,0,0,0,0,0
93 | 22109,DISFA/aligned_images/SN018/02730.jpg,0,0,0,0,0,0,0,0
94 | 24673,DISFA/aligned_images/SN021/00449.jpg,0,0,0,0,0,0,0,0
95 | 36905,DISFA/aligned_images/SN028/02991.jpg,0,0,0,0,0,0,0,1
96 | 16835,DISFA/aligned_images/SN013/02301.jpg,0,0,0,0,0,0,0,0
97 | 4376,DISFA/aligned_images/SN006/04377.jpg,0,0,0,0,0,0,0,0
98 | 64295,DISFA/aligned_images/SN008/01311.jpg,1,1,0,0,0,0,0,0
99 | 30263,DISFA/aligned_images/SN024/01194.jpg,0,0,0,0,0,0,0,0
100 | 86486,DISFA/aligned_images/SN029/04566.jpg,1,1,0,0,0,0,0,0
101 | 11557,DISFA/aligned_images/SN012/01868.jpg,1,0,1,0,1,0,0,0
102 |
--------------------------------------------------------------------------------
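Note on the label CSV layout (a minimal sketch, not part of the repository's code): every `train_*.csv` and `test.csv` above follows the same pattern — an unnamed first column holding the original frame index, an `image_path` column pointing into the aligned-image folders, and one binary column per action unit (12 AUs for BP4D, 8 for DISFA). Assuming pandas is available, one way such a file could be read is:

```python
# Hypothetical loader sketch; pandas usage is an assumption, not the repo's own code.
import pandas as pd

# Path taken from the file shown above; adjust to the split/fold you need.
labels = pd.read_csv("data/DISFA/labels/0/train_100.csv", index_col=0)

image_paths = labels["image_path"].tolist()                 # e.g. "DISFA/aligned_images/SN028/00126.jpg"
au_targets = labels.drop(columns=["image_path"]).values     # shape (num_samples, num_AUs), values in {0, 1}
```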
/data/DISFA/labels/0/train_1000.csv:
--------------------------------------------------------------------------------
1 | ,image_path,au1,au2,au4,au6,au9,au12,au25,au26
2 | 34040,DISFA/aligned_images/SN028/00126.jpg,0,0,0,1,0,1,1,0
3 | 83591,DISFA/aligned_images/SN029/01253.jpg,0,0,0,0,0,0,0,0
4 | 13426,DISFA/aligned_images/SN012/03737.jpg,0,0,0,0,0,0,0,0
5 | 23602,DISFA/aligned_images/SN018/04223.jpg,0,0,0,0,0,0,0,0
6 | 19322,DISFA/aligned_images/SN013/04788.jpg,0,0,0,1,0,1,1,0
7 | 28336,DISFA/aligned_images/SN021/04112.jpg,0,0,0,0,0,0,0,0
8 | 31833,DISFA/aligned_images/SN024/02764.jpg,0,0,0,0,0,0,0,0
9 | 16537,DISFA/aligned_images/SN013/02003.jpg,0,0,0,0,0,0,0,0
10 | 26008,DISFA/aligned_images/SN021/01784.jpg,0,0,1,0,0,0,0,0
11 | 74487,DISFA/aligned_images/SN023/01825.jpg,1,1,1,0,1,0,1,0
12 | 30843,DISFA/aligned_images/SN024/01774.jpg,0,0,0,0,0,0,1,0
13 | 14360,DISFA/aligned_images/SN012/04671.jpg,0,0,0,0,0,0,1,1
14 | 25016,DISFA/aligned_images/SN021/00792.jpg,0,0,0,0,0,0,0,0
15 | 7341,DISFA/aligned_images/SN011/02497.jpg,0,0,1,0,0,0,1,0
16 | 48105,DISFA/aligned_images/SN003/04501.jpg,0,0,0,0,0,0,1,0
17 | 76744,DISFA/aligned_images/SN023/04082.jpg,0,0,0,0,0,0,1,0
18 | 73577,DISFA/aligned_images/SN023/00903.jpg,0,0,0,0,0,0,0,0
19 | 82090,DISFA/aligned_images/SN025/04583.jpg,0,0,0,0,0,0,1,1
20 | 66876,DISFA/aligned_images/SN008/03892.jpg,0,0,0,0,0,0,0,0
21 | 28835,DISFA/aligned_images/SN021/04611.jpg,0,0,0,0,0,0,1,0
22 | 18219,DISFA/aligned_images/SN013/03685.jpg,0,0,0,0,0,0,0,0
23 | 74804,DISFA/aligned_images/SN023/02142.jpg,0,0,0,0,0,0,1,0
24 | 52232,DISFA/aligned_images/SN004/03783.jpg,0,0,0,0,0,0,0,0
25 | 77561,DISFA/aligned_images/SN025/00054.jpg,0,0,0,0,0,0,0,0
26 | 43775,DISFA/aligned_images/SN003/00171.jpg,0,0,0,1,0,1,1,1
27 | 81047,DISFA/aligned_images/SN025/03540.jpg,0,0,0,0,0,0,1,0
28 | 9190,DISFA/aligned_images/SN011/04346.jpg,0,0,1,0,0,0,0,0
29 | 1503,DISFA/aligned_images/SN006/01504.jpg,0,0,0,1,0,1,1,0
30 | 578,DISFA/aligned_images/SN006/00579.jpg,0,0,0,0,0,1,1,0
31 | 9768,DISFA/aligned_images/SN012/00079.jpg,0,0,0,0,0,0,0,0
32 | 84251,DISFA/aligned_images/SN029/01915.jpg,0,0,0,0,0,0,0,0
33 | 41967,DISFA/aligned_images/SN031/03208.jpg,0,0,0,0,0,0,0,0
34 | 68711,DISFA/aligned_images/SN017/00882.jpg,0,0,0,0,0,0,0,0
35 | 56307,DISFA/aligned_images/SN005/03013.jpg,0,0,0,0,0,0,0,0
36 | 15939,DISFA/aligned_images/SN013/01405.jpg,0,0,0,0,0,0,0,0
37 | 19341,DISFA/aligned_images/SN013/04807.jpg,0,0,0,1,0,1,1,0
38 | 29960,DISFA/aligned_images/SN024/00891.jpg,0,0,0,0,0,0,0,0
39 | 8448,DISFA/aligned_images/SN011/03604.jpg,0,0,1,0,0,0,0,0
40 | 42870,DISFA/aligned_images/SN031/04111.jpg,0,0,0,0,0,0,0,0
41 | 75002,DISFA/aligned_images/SN023/02340.jpg,0,0,0,0,0,0,1,0
42 | 15511,DISFA/aligned_images/SN013/00977.jpg,0,0,0,1,0,1,1,1
43 | 9255,DISFA/aligned_images/SN011/04411.jpg,0,1,1,0,0,0,0,0
44 | 82608,DISFA/aligned_images/SN029/00256.jpg,0,0,0,0,0,0,0,0
45 | 73796,DISFA/aligned_images/SN023/01134.jpg,0,0,0,0,0,0,1,0
46 | 29234,DISFA/aligned_images/SN024/00165.jpg,0,0,0,1,0,1,1,0
47 | 45709,DISFA/aligned_images/SN003/02105.jpg,0,0,1,1,1,0,1,1
48 | 41051,DISFA/aligned_images/SN031/02292.jpg,0,0,0,0,0,1,1,1
49 | 21464,DISFA/aligned_images/SN018/02085.jpg,0,0,1,0,0,0,0,0
50 | 29239,DISFA/aligned_images/SN024/00170.jpg,0,0,0,1,0,1,1,0
51 | 76820,DISFA/aligned_images/SN023/04158.jpg,0,0,0,0,0,0,1,0
52 | 50019,DISFA/aligned_images/SN004/01570.jpg,1,1,0,0,0,0,0,0
53 | 48982,DISFA/aligned_images/SN004/00533.jpg,0,0,0,0,0,0,0,0
54 | 61274,DISFA/aligned_images/SN007/03135.jpg,0,0,0,0,0,0,0,0
55 | 84284,DISFA/aligned_images/SN029/01948.jpg,0,0,0,0,0,0,0,0
56 | 60222,DISFA/aligned_images/SN007/02083.jpg,0,0,0,0,0,0,0,0
57 | 66784,DISFA/aligned_images/SN008/03800.jpg,0,0,0,0,0,0,0,0
58 | 40471,DISFA/aligned_images/SN031/01712.jpg,0,0,0,0,0,0,1,0
59 | 8296,DISFA/aligned_images/SN011/03452.jpg,0,0,1,0,0,0,0,0
60 | 44267,DISFA/aligned_images/SN003/00663.jpg,0,0,0,1,0,1,1,0
61 | 83374,DISFA/aligned_images/SN029/01036.jpg,0,0,1,0,0,0,0,0
62 | 66692,DISFA/aligned_images/SN008/03708.jpg,0,0,0,0,0,0,0,0
63 | 36823,DISFA/aligned_images/SN028/02909.jpg,0,0,0,0,0,0,0,0
64 | 24534,DISFA/aligned_images/SN021/00310.jpg,0,0,0,1,0,1,1,0
65 | 45360,DISFA/aligned_images/SN003/01756.jpg,1,0,1,1,1,0,0,0
66 | 82230,DISFA/aligned_images/SN025/04723.jpg,0,0,0,0,0,0,1,0
67 | 23811,DISFA/aligned_images/SN018/04432.jpg,1,1,0,0,0,0,0,0
68 | 60663,DISFA/aligned_images/SN007/02524.jpg,0,0,0,0,0,0,0,0
69 | 20042,DISFA/aligned_images/SN018/00663.jpg,0,0,0,0,0,1,1,0
70 | 44779,DISFA/aligned_images/SN003/01175.jpg,0,0,1,0,1,0,0,0
71 | 64009,DISFA/aligned_images/SN008/01025.jpg,0,0,0,0,0,0,0,0
72 | 62503,DISFA/aligned_images/SN007/04364.jpg,0,0,0,0,0,0,0,0
73 | 2534,DISFA/aligned_images/SN006/02535.jpg,0,0,0,0,0,0,0,0
74 | 40179,DISFA/aligned_images/SN031/01420.jpg,0,0,0,0,0,1,1,1
75 | 33464,DISFA/aligned_images/SN024/04395.jpg,0,0,0,0,0,0,0,0
76 | 28841,DISFA/aligned_images/SN021/04617.jpg,0,0,0,0,0,0,1,0
77 | 61799,DISFA/aligned_images/SN007/03660.jpg,0,0,0,0,0,0,0,0
78 | 56836,DISFA/aligned_images/SN005/03542.jpg,0,0,0,0,0,0,0,0
79 | 27280,DISFA/aligned_images/SN021/03056.jpg,0,0,0,0,0,0,0,0
80 | 69177,DISFA/aligned_images/SN017/01348.jpg,0,0,0,1,0,1,0,0
81 | 17678,DISFA/aligned_images/SN013/03144.jpg,0,0,0,1,1,1,1,0
82 | 64484,DISFA/aligned_images/SN008/01500.jpg,0,0,0,0,0,0,0,0
83 | 8537,DISFA/aligned_images/SN011/03693.jpg,0,0,1,0,0,0,0,0
84 | 14928,DISFA/aligned_images/SN013/00394.jpg,0,0,0,0,0,0,0,0
85 | 76991,DISFA/aligned_images/SN023/04329.jpg,0,0,1,0,0,0,1,0
86 | 19307,DISFA/aligned_images/SN013/04773.jpg,0,0,0,1,0,1,1,0
87 | 37170,DISFA/aligned_images/SN028/03256.jpg,0,0,0,0,0,0,0,0
88 | 32358,DISFA/aligned_images/SN024/03289.jpg,0,0,0,0,0,0,0,0
89 | 29246,DISFA/aligned_images/SN024/00177.jpg,0,0,0,0,0,1,1,0
90 | 29420,DISFA/aligned_images/SN024/00351.jpg,0,0,0,0,0,0,0,0
91 | 41656,DISFA/aligned_images/SN031/02897.jpg,0,0,0,0,0,0,0,0
92 | 6870,DISFA/aligned_images/SN011/02026.jpg,0,0,1,0,0,0,0,0
93 | 22109,DISFA/aligned_images/SN018/02730.jpg,0,0,0,0,0,0,0,0
94 | 24673,DISFA/aligned_images/SN021/00449.jpg,0,0,0,0,0,0,0,0
95 | 36905,DISFA/aligned_images/SN028/02991.jpg,0,0,0,0,0,0,0,1
96 | 16835,DISFA/aligned_images/SN013/02301.jpg,0,0,0,0,0,0,0,0
97 | 4376,DISFA/aligned_images/SN006/04377.jpg,0,0,0,0,0,0,0,0
98 | 64295,DISFA/aligned_images/SN008/01311.jpg,1,1,0,0,0,0,0,0
99 | 30263,DISFA/aligned_images/SN024/01194.jpg,0,0,0,0,0,0,0,0
100 | 86486,DISFA/aligned_images/SN029/04566.jpg,1,1,0,0,0,0,0,0
101 | 11557,DISFA/aligned_images/SN012/01868.jpg,1,0,1,0,1,0,0,0
102 | 11538,DISFA/aligned_images/SN012/01849.jpg,1,0,1,0,1,0,0,0
103 | 73157,DISFA/aligned_images/SN023/00483.jpg,0,0,0,0,0,1,1,0
104 | 52183,DISFA/aligned_images/SN004/03734.jpg,0,0,0,0,0,0,0,0
105 | 44402,DISFA/aligned_images/SN003/00798.jpg,0,0,0,1,0,1,1,0
106 | 6518,DISFA/aligned_images/SN011/01674.jpg,0,0,1,0,0,0,0,0
107 | 54939,DISFA/aligned_images/SN005/01645.jpg,0,0,0,0,0,0,0,0
108 | 64803,DISFA/aligned_images/SN008/01819.jpg,0,0,0,0,1,0,0,0
109 | 64510,DISFA/aligned_images/SN008/01526.jpg,0,0,0,0,0,0,0,0
110 | 66072,DISFA/aligned_images/SN008/03088.jpg,0,0,0,0,0,0,0,0
111 | 40185,DISFA/aligned_images/SN031/01426.jpg,0,0,0,0,0,1,1,1
112 | 75888,DISFA/aligned_images/SN023/03226.jpg,0,0,0,0,0,0,0,0
113 | 21949,DISFA/aligned_images/SN018/02570.jpg,0,0,0,0,0,0,0,0
114 | 20885,DISFA/aligned_images/SN018/01506.jpg,0,0,0,0,0,0,0,0
115 | 69001,DISFA/aligned_images/SN017/01172.jpg,0,0,0,1,0,1,1,0
116 | 42397,DISFA/aligned_images/SN031/03638.jpg,0,0,0,0,0,0,0,0
117 | 51974,DISFA/aligned_images/SN004/03525.jpg,0,0,0,0,0,0,0,0
118 | 80152,DISFA/aligned_images/SN025/02645.jpg,0,0,0,0,0,0,0,0
119 | 27790,DISFA/aligned_images/SN021/03566.jpg,0,0,0,0,0,0,0,0
120 | 48116,DISFA/aligned_images/SN003/04512.jpg,0,0,0,0,0,0,1,0
121 | 74860,DISFA/aligned_images/SN023/02198.jpg,0,0,1,0,0,0,1,0
122 | 51753,DISFA/aligned_images/SN004/03304.jpg,0,0,0,0,0,0,0,0
123 | 47417,DISFA/aligned_images/SN003/03813.jpg,0,0,0,0,0,0,0,0
124 | 38385,DISFA/aligned_images/SN028/04471.jpg,0,0,0,0,0,0,0,1
125 | 69730,DISFA/aligned_images/SN017/01901.jpg,0,0,0,0,0,0,0,0
126 | 73821,DISFA/aligned_images/SN023/01159.jpg,0,0,0,0,0,0,1,0
127 | 2708,DISFA/aligned_images/SN006/02709.jpg,0,0,0,0,0,0,0,0
128 | 26910,DISFA/aligned_images/SN021/02686.jpg,0,0,0,0,0,0,0,0
129 | 28519,DISFA/aligned_images/SN021/04295.jpg,0,0,1,0,0,0,1,0
130 | 59426,DISFA/aligned_images/SN007/01287.jpg,0,0,0,0,0,0,0,0
131 | 33264,DISFA/aligned_images/SN024/04195.jpg,0,0,0,0,0,0,0,0
132 | 5898,DISFA/aligned_images/SN011/01054.jpg,0,0,0,1,0,1,1,0
133 | 81513,DISFA/aligned_images/SN025/04006.jpg,0,0,0,0,0,0,1,0
134 | 78053,DISFA/aligned_images/SN025/00546.jpg,0,0,0,0,0,0,1,0
135 | 11453,DISFA/aligned_images/SN012/01764.jpg,1,0,1,0,1,0,0,0
136 | 27437,DISFA/aligned_images/SN021/03213.jpg,0,0,0,0,0,0,0,0
137 | 30869,DISFA/aligned_images/SN024/01800.jpg,0,0,0,0,0,0,1,0
138 | 36533,DISFA/aligned_images/SN028/02619.jpg,0,0,0,0,0,0,0,0
139 | 25975,DISFA/aligned_images/SN021/01751.jpg,0,0,1,0,0,0,0,0
140 | 23924,DISFA/aligned_images/SN018/04545.jpg,1,1,0,0,0,0,0,0
141 | 8572,DISFA/aligned_images/SN011/03728.jpg,0,0,1,0,0,0,0,0
142 | 5599,DISFA/aligned_images/SN011/00755.jpg,0,0,0,1,0,1,1,1
143 | 9370,DISFA/aligned_images/SN011/04526.jpg,0,0,1,0,0,0,0,0
144 | 20611,DISFA/aligned_images/SN018/01232.jpg,0,0,0,0,0,0,1,0
145 | 14974,DISFA/aligned_images/SN013/00440.jpg,0,0,0,0,0,0,0,0
146 | 55007,DISFA/aligned_images/SN005/01713.jpg,0,0,1,1,0,0,1,0
147 | 81733,DISFA/aligned_images/SN025/04226.jpg,0,0,0,0,0,0,1,0
148 | 44429,DISFA/aligned_images/SN003/00825.jpg,0,0,0,1,0,1,1,0
149 | 29314,DISFA/aligned_images/SN024/00245.jpg,0,0,0,0,0,1,1,0
150 | 56530,DISFA/aligned_images/SN005/03236.jpg,0,0,0,0,0,0,0,0
151 | 14320,DISFA/aligned_images/SN012/04631.jpg,0,0,0,0,0,0,1,1
152 | 866,DISFA/aligned_images/SN006/00867.jpg,0,0,0,0,0,1,1,0
153 | 49131,DISFA/aligned_images/SN004/00682.jpg,0,0,0,1,0,1,1,0
154 | 72548,DISFA/aligned_images/SN017/04719.jpg,0,0,1,0,1,0,1,0
155 | 6440,DISFA/aligned_images/SN011/01596.jpg,0,0,1,0,0,0,0,0
156 | 4918,DISFA/aligned_images/SN011/00074.jpg,0,0,0,0,0,0,0,0
157 | 31101,DISFA/aligned_images/SN024/02032.jpg,0,0,0,0,0,0,0,0
158 | 84713,DISFA/aligned_images/SN029/02377.jpg,0,1,1,0,0,0,0,0
159 | 43477,DISFA/aligned_images/SN031/04718.jpg,0,0,0,0,0,1,1,0
160 | 10472,DISFA/aligned_images/SN012/00783.jpg,0,0,0,0,0,1,1,0
161 | 30256,DISFA/aligned_images/SN024/01187.jpg,0,0,0,0,0,0,0,0
162 | 43178,DISFA/aligned_images/SN031/04419.jpg,0,0,0,0,0,0,0,0
163 | 76147,DISFA/aligned_images/SN023/03485.jpg,0,0,0,0,0,0,0,0
164 | 53772,DISFA/aligned_images/SN005/00478.jpg,0,0,0,0,0,0,0,0
165 | 9432,DISFA/aligned_images/SN011/04588.jpg,1,0,1,0,0,0,1,1
166 | 32831,DISFA/aligned_images/SN024/03762.jpg,0,0,0,0,0,0,0,0
167 | 74755,DISFA/aligned_images/SN023/02093.jpg,0,0,1,0,0,0,1,0
168 | 64317,DISFA/aligned_images/SN008/01333.jpg,1,1,0,0,0,0,0,0
169 | 58858,DISFA/aligned_images/SN007/00719.jpg,0,0,0,0,0,1,0,0
170 | 78758,DISFA/aligned_images/SN025/01251.jpg,0,0,0,0,0,0,0,0
171 | 8173,DISFA/aligned_images/SN011/03329.jpg,0,0,1,0,0,0,0,0
172 | 42768,DISFA/aligned_images/SN031/04009.jpg,0,0,0,0,0,0,0,0
173 | 76351,DISFA/aligned_images/SN023/03689.jpg,0,0,0,0,0,0,0,0
174 | 50837,DISFA/aligned_images/SN004/02388.jpg,0,0,1,0,0,0,0,0
175 | 12555,DISFA/aligned_images/SN012/02866.jpg,0,0,0,0,0,0,0,0
176 | 32430,DISFA/aligned_images/SN024/03361.jpg,0,0,0,0,0,0,0,0
177 | 80509,DISFA/aligned_images/SN025/03002.jpg,0,0,0,0,0,0,1,0
178 | 59189,DISFA/aligned_images/SN007/01050.jpg,0,0,0,1,0,1,0,0
179 | 44556,DISFA/aligned_images/SN003/00952.jpg,0,0,1,0,0,0,0,0
180 | 13007,DISFA/aligned_images/SN012/03318.jpg,0,0,0,0,0,0,1,0
181 | 10890,DISFA/aligned_images/SN012/01201.jpg,0,0,0,0,0,1,1,0
182 | 10369,DISFA/aligned_images/SN012/00680.jpg,0,0,0,0,0,0,1,0
183 | 26399,DISFA/aligned_images/SN021/02175.jpg,0,0,0,0,0,0,1,0
184 | 5340,DISFA/aligned_images/SN011/00496.jpg,0,0,0,0,0,0,0,0
185 | 78179,DISFA/aligned_images/SN025/00672.jpg,0,0,0,0,0,1,1,0
186 | 15437,DISFA/aligned_images/SN013/00903.jpg,0,0,0,0,0,0,0,0
187 | 39423,DISFA/aligned_images/SN031/00664.jpg,0,0,0,0,0,1,1,1
188 | 77705,DISFA/aligned_images/SN025/00198.jpg,0,0,0,0,0,1,1,0
189 | 57004,DISFA/aligned_images/SN005/03710.jpg,0,0,0,0,0,0,0,0
190 | 54665,DISFA/aligned_images/SN005/01371.jpg,0,0,0,0,0,0,0,0
191 | 4895,DISFA/aligned_images/SN011/00051.jpg,0,0,0,0,0,0,0,0
192 | 31681,DISFA/aligned_images/SN024/02612.jpg,0,0,0,0,0,0,0,0
193 | 4134,DISFA/aligned_images/SN006/04135.jpg,0,0,0,0,0,0,0,0
194 | 67403,DISFA/aligned_images/SN008/04419.jpg,0,0,1,0,0,0,0,0
195 | 83890,DISFA/aligned_images/SN029/01553.jpg,0,0,0,0,1,0,0,0
196 | 23989,DISFA/aligned_images/SN018/04610.jpg,1,1,0,0,0,0,0,0
197 | 27458,DISFA/aligned_images/SN021/03234.jpg,0,0,0,0,0,0,0,0
198 | 83434,DISFA/aligned_images/SN029/01096.jpg,1,0,0,0,0,1,1,0
199 | 5735,DISFA/aligned_images/SN011/00891.jpg,0,0,0,0,0,0,0,0
200 | 74624,DISFA/aligned_images/SN023/01962.jpg,0,0,0,0,0,0,1,0
201 | 51016,DISFA/aligned_images/SN004/02567.jpg,0,0,0,0,0,0,0,0
202 | 9087,DISFA/aligned_images/SN011/04243.jpg,0,0,1,0,0,0,0,0
203 | 15153,DISFA/aligned_images/SN013/00619.jpg,0,0,0,0,0,0,0,0
204 | 1796,DISFA/aligned_images/SN006/01797.jpg,0,0,0,0,0,0,0,0
205 | 24607,DISFA/aligned_images/SN021/00383.jpg,0,0,0,0,0,0,0,0
206 | 54363,DISFA/aligned_images/SN005/01069.jpg,0,0,0,0,0,0,0,1
207 | 74142,DISFA/aligned_images/SN023/01480.jpg,0,0,0,0,0,0,1,0
208 | 17226,DISFA/aligned_images/SN013/02692.jpg,0,0,1,0,0,0,0,0
209 | 86722,DISFA/aligned_images/SN029/04802.jpg,1,1,0,0,0,0,0,0
210 | 54443,DISFA/aligned_images/SN005/01149.jpg,1,1,0,0,0,0,1,0
211 | 64049,DISFA/aligned_images/SN008/01065.jpg,0,0,0,0,0,0,0,0
212 | 42320,DISFA/aligned_images/SN031/03561.jpg,0,0,0,0,0,0,0,0
213 | 23548,DISFA/aligned_images/SN018/04169.jpg,0,0,0,0,0,0,0,0
214 | 63314,DISFA/aligned_images/SN008/00330.jpg,0,0,0,1,0,0,0,0
215 | 52242,DISFA/aligned_images/SN004/03793.jpg,0,0,0,0,0,0,0,0
216 | 70869,DISFA/aligned_images/SN017/03040.jpg,0,0,0,0,0,0,0,0
217 | 46689,DISFA/aligned_images/SN003/03085.jpg,0,0,1,0,0,0,0,0
218 | 12009,DISFA/aligned_images/SN012/02320.jpg,0,0,1,0,0,0,1,0
219 | 53162,DISFA/aligned_images/SN004/04713.jpg,1,1,0,1,0,1,1,0
220 | 78683,DISFA/aligned_images/SN025/01176.jpg,0,0,0,0,0,0,1,0
221 | 8355,DISFA/aligned_images/SN011/03511.jpg,0,0,1,0,0,0,0,0
222 | 34855,DISFA/aligned_images/SN028/00941.jpg,0,0,0,0,0,0,0,0
223 | 2637,DISFA/aligned_images/SN006/02638.jpg,0,0,0,0,0,0,0,0
224 | 14956,DISFA/aligned_images/SN013/00422.jpg,0,0,0,0,0,0,0,0
225 | 41193,DISFA/aligned_images/SN031/02434.jpg,0,0,0,0,0,0,1,1
226 | 60940,DISFA/aligned_images/SN007/02801.jpg,0,0,0,0,0,0,0,0
227 | 7875,DISFA/aligned_images/SN011/03031.jpg,0,0,1,0,0,0,0,0
228 | 63819,DISFA/aligned_images/SN008/00835.jpg,0,0,0,0,0,0,0,0
229 | 35165,DISFA/aligned_images/SN028/01251.jpg,0,0,0,0,0,0,0,0
230 | 60388,DISFA/aligned_images/SN007/02249.jpg,0,0,0,0,0,0,0,0
231 | 5976,DISFA/aligned_images/SN011/01132.jpg,0,0,0,0,0,0,1,0
232 | 61912,DISFA/aligned_images/SN007/03773.jpg,0,0,0,0,0,0,0,0
233 | 38311,DISFA/aligned_images/SN028/04397.jpg,0,0,0,0,0,0,0,1
234 | 44740,DISFA/aligned_images/SN003/01136.jpg,0,0,1,0,1,0,0,0
235 | 48028,DISFA/aligned_images/SN003/04424.jpg,0,0,0,0,0,0,1,0
236 | 61857,DISFA/aligned_images/SN007/03718.jpg,0,0,0,0,0,0,0,0
237 | 42581,DISFA/aligned_images/SN031/03822.jpg,0,0,0,0,0,0,0,1
238 | 55506,DISFA/aligned_images/SN005/02212.jpg,0,0,0,0,0,0,0,0
239 | 58575,DISFA/aligned_images/SN007/00436.jpg,0,0,0,0,0,0,0,0
240 | 29338,DISFA/aligned_images/SN024/00269.jpg,0,0,0,0,0,1,1,0
241 | 70872,DISFA/aligned_images/SN017/03043.jpg,0,0,0,0,0,0,0,0
242 | 16718,DISFA/aligned_images/SN013/02184.jpg,0,0,0,0,0,0,0,0
243 | 65146,DISFA/aligned_images/SN008/02162.jpg,0,0,1,0,0,0,0,0
244 | 49030,DISFA/aligned_images/SN004/00581.jpg,0,0,0,0,0,1,1,0
245 | 73600,DISFA/aligned_images/SN023/00926.jpg,0,0,0,0,0,0,0,0
246 | 3336,DISFA/aligned_images/SN006/03337.jpg,0,0,0,0,0,0,0,0
247 | 4292,DISFA/aligned_images/SN006/04293.jpg,0,0,0,0,0,0,0,0
248 | 2002,DISFA/aligned_images/SN006/02003.jpg,0,0,0,0,0,0,1,0
249 | 67986,DISFA/aligned_images/SN017/00157.jpg,0,0,0,1,0,1,0,0
250 | 82122,DISFA/aligned_images/SN025/04615.jpg,0,0,0,0,0,1,1,1
251 | 69576,DISFA/aligned_images/SN017/01747.jpg,0,0,1,0,1,1,1,0
252 | 59957,DISFA/aligned_images/SN007/01818.jpg,0,0,0,0,1,0,0,0
253 | 63386,DISFA/aligned_images/SN008/00402.jpg,0,0,0,0,0,0,0,0
254 | 43356,DISFA/aligned_images/SN031/04597.jpg,0,0,0,0,0,0,0,0
255 | 72182,DISFA/aligned_images/SN017/04353.jpg,0,0,0,0,0,0,0,0
256 | 67578,DISFA/aligned_images/SN008/04594.jpg,0,0,0,1,0,0,1,0
257 | 8601,DISFA/aligned_images/SN011/03757.jpg,0,0,1,0,0,0,0,0
258 | 34116,DISFA/aligned_images/SN028/00202.jpg,0,0,0,1,0,1,1,0
259 | 34771,DISFA/aligned_images/SN028/00857.jpg,0,1,0,0,0,0,0,0
260 | 51521,DISFA/aligned_images/SN004/03072.jpg,0,0,1,0,0,0,0,0
261 | 69757,DISFA/aligned_images/SN017/01928.jpg,0,0,0,0,0,0,0,0
262 | 15827,DISFA/aligned_images/SN013/01293.jpg,0,0,0,0,0,0,0,0
263 | 15103,DISFA/aligned_images/SN013/00569.jpg,0,0,0,0,0,0,0,0
264 | 7715,DISFA/aligned_images/SN011/02871.jpg,0,0,1,0,0,0,0,0
265 | 35184,DISFA/aligned_images/SN028/01270.jpg,0,0,0,0,0,0,0,0
266 | 64104,DISFA/aligned_images/SN008/01120.jpg,0,0,0,0,0,0,0,0
267 | 64039,DISFA/aligned_images/SN008/01055.jpg,0,0,0,0,0,0,0,0
268 | 43476,DISFA/aligned_images/SN031/04717.jpg,0,0,0,0,0,1,1,0
269 | 18497,DISFA/aligned_images/SN013/03963.jpg,0,0,0,0,0,0,0,0
270 | 57585,DISFA/aligned_images/SN005/04291.jpg,0,0,0,0,0,0,0,0
271 | 67188,DISFA/aligned_images/SN008/04204.jpg,0,0,0,0,0,0,0,0
272 | 79799,DISFA/aligned_images/SN025/02292.jpg,0,0,0,0,0,1,1,0
273 | 71001,DISFA/aligned_images/SN017/03172.jpg,0,0,0,0,0,0,0,0
274 | 3152,DISFA/aligned_images/SN006/03153.jpg,0,0,0,0,0,0,0,0
275 | 57629,DISFA/aligned_images/SN005/04335.jpg,0,0,0,0,0,0,0,0
276 | 16600,DISFA/aligned_images/SN013/02066.jpg,0,0,0,0,0,0,0,0
277 | 20226,DISFA/aligned_images/SN018/00847.jpg,0,0,0,0,0,0,1,0
278 | 78214,DISFA/aligned_images/SN025/00707.jpg,0,0,0,0,0,1,1,0
279 | 60619,DISFA/aligned_images/SN007/02480.jpg,0,0,0,0,0,0,0,0
280 | 49817,DISFA/aligned_images/SN004/01368.jpg,1,1,0,0,0,0,0,0
281 | 43681,DISFA/aligned_images/SN003/00077.jpg,0,0,0,0,0,0,0,0
282 | 62560,DISFA/aligned_images/SN007/04421.jpg,0,0,0,0,0,0,0,0
283 | 68808,DISFA/aligned_images/SN017/00979.jpg,0,0,0,1,0,1,1,0
284 | 45258,DISFA/aligned_images/SN003/01654.jpg,0,0,0,0,0,0,0,0
285 | 85808,DISFA/aligned_images/SN029/03472.jpg,0,0,1,0,0,0,0,0
286 | 33210,DISFA/aligned_images/SN024/04141.jpg,0,0,0,0,0,0,0,0
287 | 20468,DISFA/aligned_images/SN018/01089.jpg,0,0,0,0,0,0,1,1
288 | 46521,DISFA/aligned_images/SN003/02917.jpg,0,0,1,0,0,0,0,0
289 | 31186,DISFA/aligned_images/SN024/02117.jpg,0,0,0,0,0,0,1,0
290 | 62990,DISFA/aligned_images/SN008/00006.jpg,0,0,0,0,0,0,0,0
291 | 11785,DISFA/aligned_images/SN012/02096.jpg,0,0,0,0,0,0,1,0
292 | 64919,DISFA/aligned_images/SN008/01935.jpg,0,0,0,0,0,0,0,0
293 | 8366,DISFA/aligned_images/SN011/03522.jpg,0,0,1,0,0,0,0,0
294 | 86020,DISFA/aligned_images/SN029/03684.jpg,0,0,1,0,0,0,0,0
295 | 23115,DISFA/aligned_images/SN018/03736.jpg,0,0,0,0,0,0,0,0
296 | 13140,DISFA/aligned_images/SN012/03451.jpg,0,0,0,0,0,0,1,0
297 | 62786,DISFA/aligned_images/SN007/04647.jpg,0,0,0,0,0,0,0,0
298 | 30770,DISFA/aligned_images/SN024/01701.jpg,0,0,1,0,1,0,1,0
299 | 18893,DISFA/aligned_images/SN013/04359.jpg,0,0,0,0,0,0,0,0
300 | 12113,DISFA/aligned_images/SN012/02424.jpg,0,0,0,0,0,0,0,0
301 | 26818,DISFA/aligned_images/SN021/02594.jpg,0,0,0,0,0,0,0,0
302 | 9856,DISFA/aligned_images/SN012/00167.jpg,0,0,0,0,0,1,1,0
303 | 38853,DISFA/aligned_images/SN031/00094.jpg,0,0,0,0,0,0,0,0
304 | 16337,DISFA/aligned_images/SN013/01803.jpg,0,0,1,0,0,0,1,0
305 | 37183,DISFA/aligned_images/SN028/03269.jpg,0,0,0,0,0,0,0,0
306 | 35191,DISFA/aligned_images/SN028/01277.jpg,0,0,0,0,0,0,0,0
307 | 86572,DISFA/aligned_images/SN029/04652.jpg,0,1,0,0,0,0,0,0
308 | 37976,DISFA/aligned_images/SN028/04062.jpg,0,0,0,0,0,0,0,0
309 | 64800,DISFA/aligned_images/SN008/01816.jpg,0,0,0,0,1,0,0,0
310 | 48666,DISFA/aligned_images/SN004/00217.jpg,0,0,0,0,0,0,0,0
311 | 70810,DISFA/aligned_images/SN017/02981.jpg,0,0,0,0,0,0,0,0
312 | 5386,DISFA/aligned_images/SN011/00542.jpg,0,0,0,0,0,1,0,0
313 | 86301,DISFA/aligned_images/SN029/03965.jpg,0,0,1,0,0,0,0,0
314 | 70341,DISFA/aligned_images/SN017/02512.jpg,0,0,0,0,0,0,0,0
315 | 85451,DISFA/aligned_images/SN029/03115.jpg,0,1,1,0,0,0,0,0
316 | 84874,DISFA/aligned_images/SN029/02538.jpg,0,0,0,0,0,0,0,0
317 | 61967,DISFA/aligned_images/SN007/03828.jpg,0,0,0,0,0,0,0,0
318 | 52488,DISFA/aligned_images/SN004/04039.jpg,0,0,1,0,0,0,0,0
319 | 65952,DISFA/aligned_images/SN008/02968.jpg,0,0,0,0,0,0,0,0
320 | 25935,DISFA/aligned_images/SN021/01711.jpg,0,0,1,0,0,0,1,0
321 | 47738,DISFA/aligned_images/SN003/04134.jpg,0,0,0,0,0,0,0,0
322 | 26254,DISFA/aligned_images/SN021/02030.jpg,0,0,0,0,0,0,0,0
323 | 27623,DISFA/aligned_images/SN021/03399.jpg,0,0,0,0,0,0,0,0
324 | 55900,DISFA/aligned_images/SN005/02606.jpg,0,0,0,0,0,0,0,0
325 | 77218,DISFA/aligned_images/SN023/04556.jpg,0,0,0,0,0,0,0,0
326 | 79564,DISFA/aligned_images/SN025/02057.jpg,0,0,0,1,0,0,1,0
327 | 61433,DISFA/aligned_images/SN007/03294.jpg,0,0,0,0,0,0,0,0
328 | 11340,DISFA/aligned_images/SN012/01651.jpg,0,0,0,0,0,0,0,0
329 | 41912,DISFA/aligned_images/SN031/03153.jpg,0,0,0,0,0,0,0,0
330 | 56215,DISFA/aligned_images/SN005/02921.jpg,0,0,0,0,0,0,0,0
331 | 8843,DISFA/aligned_images/SN011/03999.jpg,0,0,1,0,0,0,0,0
332 | 78037,DISFA/aligned_images/SN025/00530.jpg,0,0,0,0,0,0,1,0
333 | 2256,DISFA/aligned_images/SN006/02257.jpg,0,0,1,0,0,0,0,0
334 | 26461,DISFA/aligned_images/SN021/02237.jpg,0,0,0,0,0,0,0,0
335 | 3635,DISFA/aligned_images/SN006/03636.jpg,0,0,0,0,0,0,0,0
336 | 4315,DISFA/aligned_images/SN006/04316.jpg,0,0,0,0,0,0,0,0
337 | 58497,DISFA/aligned_images/SN007/00358.jpg,0,0,0,0,0,0,0,0
338 | 64563,DISFA/aligned_images/SN008/01579.jpg,0,0,0,0,0,0,0,0
339 | 22844,DISFA/aligned_images/SN018/03465.jpg,0,0,0,0,0,0,0,0
340 | 83479,DISFA/aligned_images/SN029/01141.jpg,0,0,0,0,0,1,1,0
341 | 28317,DISFA/aligned_images/SN021/04093.jpg,0,0,0,0,0,0,0,0
342 | 37397,DISFA/aligned_images/SN028/03483.jpg,0,0,0,0,0,0,0,0
343 | 80102,DISFA/aligned_images/SN025/02595.jpg,0,0,0,0,0,0,0,0
344 | 60205,DISFA/aligned_images/SN007/02066.jpg,0,0,0,0,0,0,0,0
345 | 55473,DISFA/aligned_images/SN005/02179.jpg,0,0,0,0,0,0,0,0
346 | 82157,DISFA/aligned_images/SN025/04650.jpg,0,0,0,0,0,0,1,1
347 | 415,DISFA/aligned_images/SN006/00416.jpg,0,0,0,0,0,0,1,0
348 | 45969,DISFA/aligned_images/SN003/02365.jpg,0,0,0,1,0,0,1,1
349 | 60455,DISFA/aligned_images/SN007/02316.jpg,0,0,0,0,0,0,0,0
350 | 4057,DISFA/aligned_images/SN006/04058.jpg,0,0,0,0,0,0,0,0
351 | 45735,DISFA/aligned_images/SN003/02131.jpg,0,0,1,1,1,0,1,1
352 | 42609,DISFA/aligned_images/SN031/03850.jpg,0,0,0,0,0,0,0,0
353 | 65838,DISFA/aligned_images/SN008/02854.jpg,0,0,0,0,0,0,0,0
354 | 7676,DISFA/aligned_images/SN011/02832.jpg,0,0,1,0,0,0,0,0
355 | 12159,DISFA/aligned_images/SN012/02470.jpg,0,0,0,0,0,0,0,0
356 | 6787,DISFA/aligned_images/SN011/01943.jpg,0,0,0,0,0,0,1,0
357 | 23084,DISFA/aligned_images/SN018/03705.jpg,0,0,0,0,0,0,0,0
358 | 19351,DISFA/aligned_images/SN013/04817.jpg,0,0,0,1,0,1,1,0
359 | 43776,DISFA/aligned_images/SN003/00172.jpg,0,0,0,1,0,1,1,1
360 | 42852,DISFA/aligned_images/SN031/04093.jpg,0,0,0,0,0,0,0,0
361 | 16130,DISFA/aligned_images/SN013/01596.jpg,0,0,1,0,0,0,0,0
362 | 41914,DISFA/aligned_images/SN031/03155.jpg,0,0,0,0,0,0,0,0
363 | 82189,DISFA/aligned_images/SN025/04682.jpg,0,0,0,0,0,0,1,1
364 | 58564,DISFA/aligned_images/SN007/00425.jpg,0,0,0,0,0,0,0,0
365 | 37393,DISFA/aligned_images/SN028/03479.jpg,0,0,0,0,0,0,0,0
366 | 68397,DISFA/aligned_images/SN017/00568.jpg,0,0,0,0,0,0,0,0
367 | 6474,DISFA/aligned_images/SN011/01630.jpg,0,0,1,0,0,0,0,0
368 | 49052,DISFA/aligned_images/SN004/00603.jpg,0,0,0,0,0,0,0,0
369 | 11005,DISFA/aligned_images/SN012/01316.jpg,0,0,1,0,0,0,0,0
370 | 28952,DISFA/aligned_images/SN021/04728.jpg,0,0,0,0,0,0,0,0
371 | 68326,DISFA/aligned_images/SN017/00497.jpg,0,0,0,0,0,0,0,0
372 | 78268,DISFA/aligned_images/SN025/00761.jpg,0,0,0,0,0,0,1,0
373 | 26791,DISFA/aligned_images/SN021/02567.jpg,1,0,1,0,0,0,0,0
374 | 22641,DISFA/aligned_images/SN018/03262.jpg,0,0,0,0,0,0,0,0
375 | 77517,DISFA/aligned_images/SN025/00010.jpg,0,0,0,0,0,0,0,1
376 | 4836,DISFA/aligned_images/SN006/04837.jpg,0,0,1,1,0,0,1,1
377 | 38968,DISFA/aligned_images/SN031/00209.jpg,0,0,0,1,0,1,1,0
378 | 35770,DISFA/aligned_images/SN028/01856.jpg,0,0,0,0,0,0,1,0
379 | 28207,DISFA/aligned_images/SN021/03983.jpg,0,0,0,0,0,0,0,0
380 | 13744,DISFA/aligned_images/SN012/04055.jpg,0,0,0,0,0,0,0,0
381 | 4043,DISFA/aligned_images/SN006/04044.jpg,0,0,0,0,0,0,0,0
382 | 55712,DISFA/aligned_images/SN005/02418.jpg,0,0,0,0,0,0,0,0
383 | 54946,DISFA/aligned_images/SN005/01652.jpg,0,0,0,0,0,0,0,0
384 | 50524,DISFA/aligned_images/SN004/02075.jpg,0,0,1,0,0,0,0,0
385 | 66657,DISFA/aligned_images/SN008/03673.jpg,0,0,0,0,0,0,0,0
386 | 49833,DISFA/aligned_images/SN004/01384.jpg,1,0,0,0,0,0,0,0
387 | 41773,DISFA/aligned_images/SN031/03014.jpg,0,0,0,0,0,0,0,0
388 | 7086,DISFA/aligned_images/SN011/02242.jpg,0,0,1,0,0,0,0,0
389 | 50026,DISFA/aligned_images/SN004/01577.jpg,1,1,0,0,0,0,0,0
390 | 17053,DISFA/aligned_images/SN013/02519.jpg,0,0,0,0,0,0,0,0
391 | 7068,DISFA/aligned_images/SN011/02224.jpg,0,0,1,0,0,0,0,0
392 | 51742,DISFA/aligned_images/SN004/03293.jpg,0,0,0,0,0,0,0,0
393 | 27766,DISFA/aligned_images/SN021/03542.jpg,0,0,0,0,0,0,0,0
394 | 62849,DISFA/aligned_images/SN007/04710.jpg,0,0,0,0,0,0,0,0
395 | 26569,DISFA/aligned_images/SN021/02345.jpg,0,0,0,0,0,0,0,0
396 | 10122,DISFA/aligned_images/SN012/00433.jpg,0,0,0,0,0,0,1,0
397 | 28136,DISFA/aligned_images/SN021/03912.jpg,0,0,0,0,0,0,0,0
398 | 14251,DISFA/aligned_images/SN012/04562.jpg,0,0,0,0,0,0,1,0
399 | 50089,DISFA/aligned_images/SN004/01640.jpg,1,1,0,0,0,0,0,0
400 | 55216,DISFA/aligned_images/SN005/01922.jpg,0,0,0,0,0,0,0,0
401 | 17753,DISFA/aligned_images/SN013/03219.jpg,0,0,0,1,1,1,1,0
402 | 5660,DISFA/aligned_images/SN011/00816.jpg,0,0,0,1,0,1,1,0
403 | 46753,DISFA/aligned_images/SN003/03149.jpg,0,0,1,0,0,0,0,0
404 | 50014,DISFA/aligned_images/SN004/01565.jpg,0,0,1,0,0,0,0,0
405 | 37223,DISFA/aligned_images/SN028/03309.jpg,0,0,0,0,0,0,0,0
406 | 44000,DISFA/aligned_images/SN003/00396.jpg,0,0,0,0,0,0,0,0
407 | 32006,DISFA/aligned_images/SN024/02937.jpg,0,0,0,0,0,0,0,0
408 | 75738,DISFA/aligned_images/SN023/03076.jpg,0,0,1,0,0,0,0,0
409 | 78177,DISFA/aligned_images/SN025/00670.jpg,0,0,0,0,0,1,1,0
410 | 50268,DISFA/aligned_images/SN004/01819.jpg,0,0,0,0,0,0,0,0
411 | 64030,DISFA/aligned_images/SN008/01046.jpg,0,0,0,0,0,0,0,0
412 | 31776,DISFA/aligned_images/SN024/02707.jpg,0,0,0,0,0,0,0,0
413 | 13302,DISFA/aligned_images/SN012/03613.jpg,0,0,0,0,0,0,0,0
414 | 44614,DISFA/aligned_images/SN003/01010.jpg,0,0,1,0,1,0,0,0
415 | 41985,DISFA/aligned_images/SN031/03226.jpg,0,0,0,0,0,0,0,0
416 | 22352,DISFA/aligned_images/SN018/02973.jpg,0,0,0,0,0,0,0,0
417 | 50436,DISFA/aligned_images/SN004/01987.jpg,1,1,0,0,0,0,0,0
418 | 8320,DISFA/aligned_images/SN011/03476.jpg,0,0,1,0,0,0,0,0
419 | 2252,DISFA/aligned_images/SN006/02253.jpg,0,0,1,0,0,0,0,0
420 | 31627,DISFA/aligned_images/SN024/02558.jpg,0,0,0,0,0,0,0,0
421 | 16656,DISFA/aligned_images/SN013/02122.jpg,0,0,0,0,0,0,0,0
422 | 63885,DISFA/aligned_images/SN008/00901.jpg,0,0,0,0,0,0,0,0
423 | 13910,DISFA/aligned_images/SN012/04221.jpg,0,0,0,0,0,0,0,0
424 | 75424,DISFA/aligned_images/SN023/02762.jpg,0,0,1,0,0,0,1,0
425 | 61675,DISFA/aligned_images/SN007/03536.jpg,0,0,0,0,0,0,0,0
426 | 24928,DISFA/aligned_images/SN021/00704.jpg,0,0,0,0,0,0,0,0
427 | 50957,DISFA/aligned_images/SN004/02508.jpg,0,0,0,0,0,0,0,0
428 | 49541,DISFA/aligned_images/SN004/01092.jpg,1,1,0,0,0,0,0,0
429 | 56178,DISFA/aligned_images/SN005/02884.jpg,0,0,0,0,0,0,0,0
430 | 10003,DISFA/aligned_images/SN012/00314.jpg,0,0,0,0,0,1,1,0
431 | 43826,DISFA/aligned_images/SN003/00222.jpg,0,0,0,1,0,1,1,0
432 | 13934,DISFA/aligned_images/SN012/04245.jpg,0,0,0,0,0,0,0,0
433 | 56727,DISFA/aligned_images/SN005/03433.jpg,0,0,0,0,0,0,0,0
434 | 51188,DISFA/aligned_images/SN004/02739.jpg,0,0,1,0,1,0,0,0
435 | 36465,DISFA/aligned_images/SN028/02551.jpg,0,0,0,0,0,0,0,0
436 | 31862,DISFA/aligned_images/SN024/02793.jpg,0,0,0,0,0,0,0,0
437 | 16923,DISFA/aligned_images/SN013/02389.jpg,0,0,0,0,0,0,0,0
438 | 34205,DISFA/aligned_images/SN028/00291.jpg,0,0,0,1,0,1,1,0
439 | 70047,DISFA/aligned_images/SN017/02218.jpg,0,0,0,0,0,1,1,0
440 | 41146,DISFA/aligned_images/SN031/02387.jpg,0,0,0,0,0,0,0,1
441 | 20014,DISFA/aligned_images/SN018/00635.jpg,0,0,0,0,0,1,1,0
442 | 82387,DISFA/aligned_images/SN029/00035.jpg,0,0,0,1,0,1,0,0
443 | 27326,DISFA/aligned_images/SN021/03102.jpg,0,0,0,0,0,0,0,0
444 | 37312,DISFA/aligned_images/SN028/03398.jpg,0,0,0,0,0,0,0,1
445 | 39038,DISFA/aligned_images/SN031/00279.jpg,0,0,0,1,0,1,1,0
446 | 29504,DISFA/aligned_images/SN024/00435.jpg,0,0,0,0,0,0,0,0
447 | 54226,DISFA/aligned_images/SN005/00932.jpg,0,0,0,0,0,0,0,0
448 | 35668,DISFA/aligned_images/SN028/01754.jpg,0,0,1,1,1,0,1,0
449 | 36392,DISFA/aligned_images/SN028/02478.jpg,0,0,0,0,0,0,1,1
450 | 84919,DISFA/aligned_images/SN029/02583.jpg,0,0,0,0,0,0,0,0
451 | 62018,DISFA/aligned_images/SN007/03879.jpg,0,0,0,0,0,0,0,0
452 | 181,DISFA/aligned_images/SN006/00182.jpg,0,0,0,0,0,1,1,0
453 | 26650,DISFA/aligned_images/SN021/02426.jpg,0,0,1,0,0,0,1,0
454 | 58298,DISFA/aligned_images/SN007/00159.jpg,0,0,0,0,0,0,0,0
455 | 12935,DISFA/aligned_images/SN012/03246.jpg,0,0,0,0,0,0,1,0
456 | 79755,DISFA/aligned_images/SN025/02248.jpg,0,0,0,1,0,1,1,0
457 | 60064,DISFA/aligned_images/SN007/01925.jpg,0,0,0,0,1,0,0,0
458 | 48137,DISFA/aligned_images/SN003/04533.jpg,0,0,1,1,1,0,1,0
459 | 65407,DISFA/aligned_images/SN008/02423.jpg,0,0,0,0,0,0,0,0
460 | 68724,DISFA/aligned_images/SN017/00895.jpg,0,0,0,0,0,0,0,0
461 | 16948,DISFA/aligned_images/SN013/02414.jpg,0,0,0,0,0,0,0,0
462 | 659,DISFA/aligned_images/SN006/00660.jpg,0,0,0,1,0,1,1,0
463 | 34451,DISFA/aligned_images/SN028/00537.jpg,0,0,0,0,0,0,0,0
464 | 29948,DISFA/aligned_images/SN024/00879.jpg,0,0,0,0,0,0,0,0
465 | 6562,DISFA/aligned_images/SN011/01718.jpg,0,0,1,0,0,0,0,0
466 | 59290,DISFA/aligned_images/SN007/01151.jpg,0,0,0,0,0,0,0,0
467 | 11239,DISFA/aligned_images/SN012/01550.jpg,0,0,0,0,0,0,1,0
468 | 62158,DISFA/aligned_images/SN007/04019.jpg,1,0,0,0,0,0,0,0
469 | 82693,DISFA/aligned_images/SN029/00341.jpg,0,0,0,0,0,0,0,0
470 | 44749,DISFA/aligned_images/SN003/01145.jpg,0,0,0,0,1,0,0,0
471 | 55578,DISFA/aligned_images/SN005/02284.jpg,0,0,0,0,0,0,0,0
472 | 71483,DISFA/aligned_images/SN017/03654.jpg,0,0,0,0,0,0,0,0
473 | 18616,DISFA/aligned_images/SN013/04082.jpg,0,0,0,0,0,0,0,0
474 | 82963,DISFA/aligned_images/SN029/00611.jpg,0,0,0,0,0,0,0,0
475 | 25657,DISFA/aligned_images/SN021/01433.jpg,0,0,0,0,0,0,0,0
476 | 79623,DISFA/aligned_images/SN025/02116.jpg,0,0,0,1,0,1,1,0
477 | 32258,DISFA/aligned_images/SN024/03189.jpg,0,0,0,0,0,0,0,0
478 | 50104,DISFA/aligned_images/SN004/01655.jpg,1,1,0,0,0,0,0,0
479 | 51092,DISFA/aligned_images/SN004/02643.jpg,0,0,0,0,0,0,0,0
480 | 33220,DISFA/aligned_images/SN024/04151.jpg,0,0,0,0,0,0,0,0
481 | 31346,DISFA/aligned_images/SN024/02277.jpg,0,0,0,0,0,0,0,0
482 | 52372,DISFA/aligned_images/SN004/03923.jpg,0,0,0,0,0,0,0,0
483 | 4570,DISFA/aligned_images/SN006/04571.jpg,0,0,0,1,0,0,1,0
484 | 23608,DISFA/aligned_images/SN018/04229.jpg,0,0,0,0,0,0,0,0
485 | 32069,DISFA/aligned_images/SN024/03000.jpg,0,0,0,0,0,0,0,0
486 | 14540,DISFA/aligned_images/SN013/00006.jpg,0,0,0,0,0,0,0,0
487 | 12324,DISFA/aligned_images/SN012/02635.jpg,0,0,0,0,0,0,0,0
488 | 45235,DISFA/aligned_images/SN003/01631.jpg,1,0,0,0,0,0,0,0
489 | 83631,DISFA/aligned_images/SN029/01293.jpg,0,0,0,1,0,0,0,0
490 | 10009,DISFA/aligned_images/SN012/00320.jpg,0,0,0,0,0,1,1,0
491 | 30226,DISFA/aligned_images/SN024/01157.jpg,0,0,0,0,0,0,0,0
492 | 37476,DISFA/aligned_images/SN028/03562.jpg,0,0,0,0,0,0,0,0
493 | 15182,DISFA/aligned_images/SN013/00648.jpg,0,0,0,0,0,0,0,0
494 | 44943,DISFA/aligned_images/SN003/01339.jpg,0,0,1,0,0,0,1,0
495 | 28490,DISFA/aligned_images/SN021/04266.jpg,0,0,1,0,0,0,1,0
496 | 25796,DISFA/aligned_images/SN021/01572.jpg,0,0,0,0,0,0,0,0
497 | 70530,DISFA/aligned_images/SN017/02701.jpg,0,0,0,0,0,0,0,0
498 | 19725,DISFA/aligned_images/SN018/00346.jpg,0,0,0,1,0,1,1,1
499 | 31705,DISFA/aligned_images/SN024/02636.jpg,0,0,0,0,0,0,0,0
500 | 51642,DISFA/aligned_images/SN004/03193.jpg,0,0,0,0,0,0,0,0
501 | 77485,DISFA/aligned_images/SN023/04823.jpg,0,0,1,0,0,0,0,0
502 | 59275,DISFA/aligned_images/SN007/01136.jpg,0,0,0,0,0,0,0,0
503 | 30606,DISFA/aligned_images/SN024/01537.jpg,0,0,0,0,0,0,0,0
504 | 6703,DISFA/aligned_images/SN011/01859.jpg,0,0,1,0,0,0,0,0
505 | 12032,DISFA/aligned_images/SN012/02343.jpg,0,0,1,0,0,0,1,0
506 | 52784,DISFA/aligned_images/SN004/04335.jpg,1,1,1,0,0,0,0,1
507 | 84540,DISFA/aligned_images/SN029/02204.jpg,0,1,1,0,0,0,0,1
508 | 3155,DISFA/aligned_images/SN006/03156.jpg,0,0,0,0,0,0,0,0
509 | 64612,DISFA/aligned_images/SN008/01628.jpg,0,0,0,0,0,0,0,0
510 | 38638,DISFA/aligned_images/SN028/04724.jpg,0,0,0,0,0,0,0,0
511 | 39379,DISFA/aligned_images/SN031/00620.jpg,0,0,0,0,0,0,0,0
512 | 54653,DISFA/aligned_images/SN005/01359.jpg,0,0,0,0,0,0,0,0
513 | 58750,DISFA/aligned_images/SN007/00611.jpg,0,0,0,0,0,0,0,0
514 | 13578,DISFA/aligned_images/SN012/03889.jpg,0,0,0,0,0,0,0,0
515 | 20808,DISFA/aligned_images/SN018/01429.jpg,0,0,0,0,0,0,1,0
516 | 60868,DISFA/aligned_images/SN007/02729.jpg,0,0,0,0,0,0,0,0
517 | 6100,DISFA/aligned_images/SN011/01256.jpg,0,0,0,0,0,0,0,0
518 | 40433,DISFA/aligned_images/SN031/01674.jpg,0,0,1,0,0,0,1,1
519 | 71397,DISFA/aligned_images/SN017/03568.jpg,0,0,0,0,0,0,0,0
520 | 20453,DISFA/aligned_images/SN018/01074.jpg,0,0,1,0,0,0,1,1
521 | 27728,DISFA/aligned_images/SN021/03504.jpg,0,0,0,0,0,0,0,0
522 | 29385,DISFA/aligned_images/SN024/00316.jpg,0,0,0,0,0,0,0,0
523 | 60955,DISFA/aligned_images/SN007/02816.jpg,0,0,0,0,0,0,0,0
524 | 28123,DISFA/aligned_images/SN021/03899.jpg,0,0,0,0,0,0,0,0
525 | 16535,DISFA/aligned_images/SN013/02001.jpg,0,0,0,0,0,0,0,0
526 | 42059,DISFA/aligned_images/SN031/03300.jpg,0,0,0,0,0,0,0,0
527 | 7748,DISFA/aligned_images/SN011/02904.jpg,0,0,1,0,0,0,0,0
528 | 54280,DISFA/aligned_images/SN005/00986.jpg,0,0,0,0,0,0,0,0
529 | 68290,DISFA/aligned_images/SN017/00461.jpg,0,0,0,0,0,0,0,0
530 | 39636,DISFA/aligned_images/SN031/00877.jpg,0,0,0,0,0,0,0,0
531 | 66512,DISFA/aligned_images/SN008/03528.jpg,0,0,0,0,0,0,0,0
532 | 79529,DISFA/aligned_images/SN025/02022.jpg,0,0,0,0,0,0,1,0
533 | 66149,DISFA/aligned_images/SN008/03165.jpg,0,0,0,0,0,0,0,0
534 | 44006,DISFA/aligned_images/SN003/00402.jpg,0,0,0,0,0,0,0,0
535 | 78058,DISFA/aligned_images/SN025/00551.jpg,0,0,0,0,0,0,1,0
536 | 22715,DISFA/aligned_images/SN018/03336.jpg,0,0,0,0,0,0,0,0
537 | 76710,DISFA/aligned_images/SN023/04048.jpg,0,0,0,0,0,0,0,0
538 | 6801,DISFA/aligned_images/SN011/01957.jpg,0,0,0,0,0,0,0,0
539 | 12646,DISFA/aligned_images/SN012/02957.jpg,0,0,0,0,0,0,0,0
540 | 1155,DISFA/aligned_images/SN006/01156.jpg,0,0,0,0,0,0,0,0
541 | 60004,DISFA/aligned_images/SN007/01865.jpg,0,0,0,0,1,0,0,0
542 | 51503,DISFA/aligned_images/SN004/03054.jpg,0,0,1,0,0,0,0,0
543 | 57074,DISFA/aligned_images/SN005/03780.jpg,0,0,0,0,0,0,0,0
544 | 46262,DISFA/aligned_images/SN003/02658.jpg,1,0,0,0,0,0,0,0
545 | 44126,DISFA/aligned_images/SN003/00522.jpg,0,0,0,1,0,1,1,0
546 | 35551,DISFA/aligned_images/SN028/01637.jpg,0,0,0,0,0,0,0,0
547 | 25438,DISFA/aligned_images/SN021/01214.jpg,0,0,0,0,0,0,0,0
548 | 65689,DISFA/aligned_images/SN008/02705.jpg,0,0,0,0,0,0,0,0
549 | 25936,DISFA/aligned_images/SN021/01712.jpg,0,0,1,0,0,0,1,0
550 | 23370,DISFA/aligned_images/SN018/03991.jpg,0,0,0,0,0,0,0,0
551 | 14477,DISFA/aligned_images/SN012/04788.jpg,0,0,0,0,0,1,1,0
552 | 75390,DISFA/aligned_images/SN023/02728.jpg,1,0,1,0,1,0,1,0
553 | 45912,DISFA/aligned_images/SN003/02308.jpg,0,0,0,0,0,0,1,1
554 | 7721,DISFA/aligned_images/SN011/02877.jpg,0,0,1,0,0,0,0,0
555 | 29269,DISFA/aligned_images/SN024/00200.jpg,0,0,0,0,0,0,1,0
556 | 22185,DISFA/aligned_images/SN018/02806.jpg,0,0,0,0,0,0,0,0
557 | 60621,DISFA/aligned_images/SN007/02482.jpg,0,0,0,0,0,0,0,0
558 | 68340,DISFA/aligned_images/SN017/00511.jpg,0,0,0,0,0,1,0,0
559 | 3340,DISFA/aligned_images/SN006/03341.jpg,0,0,0,0,0,0,0,0
560 | 50113,DISFA/aligned_images/SN004/01664.jpg,1,1,0,0,0,0,0,0
561 | 4407,DISFA/aligned_images/SN006/04408.jpg,0,0,0,0,0,0,0,0
562 | 34109,DISFA/aligned_images/SN028/00195.jpg,0,0,0,1,0,1,1,0
563 | 76616,DISFA/aligned_images/SN023/03954.jpg,0,0,0,0,0,0,0,0
564 | 2340,DISFA/aligned_images/SN006/02341.jpg,0,0,0,0,0,0,0,0
565 | 22163,DISFA/aligned_images/SN018/02784.jpg,0,0,0,0,0,0,0,0
566 | 11257,DISFA/aligned_images/SN012/01568.jpg,0,0,0,0,1,0,1,0
567 | 41448,DISFA/aligned_images/SN031/02689.jpg,1,1,0,0,0,0,0,0
568 | 22341,DISFA/aligned_images/SN018/02962.jpg,0,0,0,0,0,0,0,0
569 | 29058,DISFA/aligned_images/SN021/04834.jpg,0,0,0,0,0,0,1,0
570 | 63520,DISFA/aligned_images/SN008/00536.jpg,0,0,0,0,0,0,0,0
571 | 49051,DISFA/aligned_images/SN004/00602.jpg,0,0,0,0,0,0,0,0
572 | 25598,DISFA/aligned_images/SN021/01374.jpg,0,0,0,0,0,0,0,0
573 | 54308,DISFA/aligned_images/SN005/01014.jpg,1,0,0,0,0,0,0,1
574 | 24446,DISFA/aligned_images/SN021/00222.jpg,0,0,0,1,0,1,1,0
575 | 3295,DISFA/aligned_images/SN006/03296.jpg,0,0,0,0,0,0,0,0
576 | 25619,DISFA/aligned_images/SN021/01395.jpg,0,0,0,0,0,0,0,0
577 | 18482,DISFA/aligned_images/SN013/03948.jpg,0,0,0,0,0,0,0,0
578 | 6284,DISFA/aligned_images/SN011/01440.jpg,0,0,0,0,0,0,1,0
579 | 76743,DISFA/aligned_images/SN023/04081.jpg,0,0,0,0,0,0,1,0
580 | 77696,DISFA/aligned_images/SN025/00189.jpg,0,0,0,0,0,1,1,0
581 | 21357,DISFA/aligned_images/SN018/01978.jpg,0,0,0,0,0,0,0,0
582 | 40891,DISFA/aligned_images/SN031/02132.jpg,0,0,0,0,0,1,1,1
583 | 16781,DISFA/aligned_images/SN013/02247.jpg,0,0,0,0,0,0,0,0
584 | 33574,DISFA/aligned_images/SN024/04505.jpg,0,0,0,0,0,0,0,0
585 | 70046,DISFA/aligned_images/SN017/02217.jpg,0,0,0,0,0,1,1,0
586 | 73964,DISFA/aligned_images/SN023/01302.jpg,0,0,0,0,0,0,0,0
587 | 51091,DISFA/aligned_images/SN004/02642.jpg,0,0,0,0,0,0,0,0
588 | 66573,DISFA/aligned_images/SN008/03589.jpg,0,0,0,0,0,0,0,0
589 | 30081,DISFA/aligned_images/SN024/01012.jpg,0,0,0,0,0,0,0,0
590 | 57834,DISFA/aligned_images/SN005/04540.jpg,0,0,0,0,0,0,1,1
591 | 31854,DISFA/aligned_images/SN024/02785.jpg,0,0,0,0,0,0,0,0
592 | 62869,DISFA/aligned_images/SN007/04730.jpg,0,0,0,0,0,0,0,0
593 | 23593,DISFA/aligned_images/SN018/04214.jpg,0,0,0,0,0,0,0,0
594 | 57683,DISFA/aligned_images/SN005/04389.jpg,0,0,0,0,0,0,0,0
595 | 43156,DISFA/aligned_images/SN031/04397.jpg,0,0,0,0,0,0,0,0
596 | 22609,DISFA/aligned_images/SN018/03230.jpg,0,0,0,0,0,0,0,0
597 | 53730,DISFA/aligned_images/SN005/00436.jpg,0,0,0,0,0,0,0,0
598 | 21687,DISFA/aligned_images/SN018/02308.jpg,0,0,1,0,0,0,0,0
599 | 68982,DISFA/aligned_images/SN017/01153.jpg,0,0,0,1,0,1,1,0
600 | 36812,DISFA/aligned_images/SN028/02898.jpg,0,0,0,0,0,0,0,0
601 | 15463,DISFA/aligned_images/SN013/00929.jpg,0,0,0,0,0,0,0,0
602 | 73998,DISFA/aligned_images/SN023/01336.jpg,0,0,0,0,0,0,0,0
603 | 35681,DISFA/aligned_images/SN028/01767.jpg,0,0,1,1,1,0,1,0
604 | 84207,DISFA/aligned_images/SN029/01871.jpg,1,1,1,0,0,0,0,0
605 | 68106,DISFA/aligned_images/SN017/00277.jpg,0,0,0,1,0,1,1,0
606 | 43385,DISFA/aligned_images/SN031/04626.jpg,0,0,0,0,0,0,0,0
607 | 53677,DISFA/aligned_images/SN005/00383.jpg,0,0,0,0,0,0,1,0
608 | 76199,DISFA/aligned_images/SN023/03537.jpg,0,0,0,0,0,0,0,0
609 | 4123,DISFA/aligned_images/SN006/04124.jpg,0,0,0,0,0,0,0,0
610 | 66304,DISFA/aligned_images/SN008/03320.jpg,0,0,0,0,0,0,0,0
611 | 22952,DISFA/aligned_images/SN018/03573.jpg,0,0,0,0,0,0,0,0
612 | 2034,DISFA/aligned_images/SN006/02035.jpg,0,0,0,0,0,0,1,0
613 | 62442,DISFA/aligned_images/SN007/04303.jpg,0,0,0,0,0,0,0,0
614 | 78529,DISFA/aligned_images/SN025/01022.jpg,0,0,1,0,0,0,0,0
615 | 67995,DISFA/aligned_images/SN017/00166.jpg,0,0,0,1,0,1,0,0
616 | 67836,DISFA/aligned_images/SN017/00007.jpg,0,0,0,0,0,0,0,0
617 | 47900,DISFA/aligned_images/SN003/04296.jpg,0,0,0,0,0,0,1,0
618 | 31408,DISFA/aligned_images/SN024/02339.jpg,0,0,0,0,0,0,0,0
619 | 53132,DISFA/aligned_images/SN004/04683.jpg,1,1,0,1,0,1,1,0
620 | 55929,DISFA/aligned_images/SN005/02635.jpg,0,0,0,0,0,0,0,0
621 | 45618,DISFA/aligned_images/SN003/02014.jpg,0,0,1,0,1,0,1,1
622 | 46873,DISFA/aligned_images/SN003/03269.jpg,0,0,1,0,0,0,1,1
623 | 52480,DISFA/aligned_images/SN004/04031.jpg,0,0,1,0,0,0,0,0
624 | 73348,DISFA/aligned_images/SN023/00674.jpg,0,0,0,0,0,0,1,1
625 | 83749,DISFA/aligned_images/SN029/01411.jpg,0,0,0,0,0,0,0,0
626 | 85535,DISFA/aligned_images/SN029/03199.jpg,0,1,1,0,0,0,0,0
627 | 14740,DISFA/aligned_images/SN013/00206.jpg,0,0,0,1,0,1,1,0
628 | 62243,DISFA/aligned_images/SN007/04104.jpg,0,0,0,0,0,0,0,0
629 | 17498,DISFA/aligned_images/SN013/02964.jpg,0,0,0,0,0,0,0,0
630 | 31903,DISFA/aligned_images/SN024/02834.jpg,0,0,0,0,0,0,0,0
631 | 1481,DISFA/aligned_images/SN006/01482.jpg,0,0,0,1,0,1,1,0
632 | 69793,DISFA/aligned_images/SN017/01964.jpg,0,0,0,0,0,0,0,0
633 | 46313,DISFA/aligned_images/SN003/02709.jpg,1,0,1,0,0,0,0,0
634 | 29434,DISFA/aligned_images/SN024/00365.jpg,0,0,0,0,0,0,0,0
635 | 33781,DISFA/aligned_images/SN024/04712.jpg,1,1,0,0,0,0,0,0
636 | 2867,DISFA/aligned_images/SN006/02868.jpg,0,0,0,0,0,0,0,0
637 | 33840,DISFA/aligned_images/SN024/04771.jpg,1,1,0,0,0,0,0,0
638 | 33723,DISFA/aligned_images/SN024/04654.jpg,1,1,0,0,0,0,0,0
639 | 47360,DISFA/aligned_images/SN003/03756.jpg,0,0,0,0,0,0,0,0
640 | 33686,DISFA/aligned_images/SN024/04617.jpg,0,0,0,0,0,0,0,0
641 | 38146,DISFA/aligned_images/SN028/04232.jpg,0,0,0,0,0,0,0,0
642 | 5513,DISFA/aligned_images/SN011/00669.jpg,0,0,0,1,0,1,1,0
643 | 47168,DISFA/aligned_images/SN003/03564.jpg,0,0,0,0,0,0,0,0
644 | 30662,DISFA/aligned_images/SN024/01593.jpg,0,0,0,0,0,0,0,0
645 | 13428,DISFA/aligned_images/SN012/03739.jpg,0,0,0,0,0,0,0,0
646 | 17680,DISFA/aligned_images/SN013/03146.jpg,0,0,0,1,1,1,1,0
647 | 75783,DISFA/aligned_images/SN023/03121.jpg,0,0,1,0,0,0,0,0
648 | 7564,DISFA/aligned_images/SN011/02720.jpg,0,0,1,0,0,0,0,0
649 | 37284,DISFA/aligned_images/SN028/03370.jpg,0,0,0,0,0,0,0,0
650 | 17265,DISFA/aligned_images/SN013/02731.jpg,0,0,0,0,0,0,0,0
651 | 50284,DISFA/aligned_images/SN004/01835.jpg,0,0,0,0,0,0,0,0
652 | 38857,DISFA/aligned_images/SN031/00098.jpg,0,0,0,0,0,0,0,0
653 | 25115,DISFA/aligned_images/SN021/00891.jpg,0,0,1,0,0,0,0,0
654 | 15375,DISFA/aligned_images/SN013/00841.jpg,0,0,0,0,0,0,1,0
655 | 33259,DISFA/aligned_images/SN024/04190.jpg,0,0,0,0,0,0,0,0
656 | 50552,DISFA/aligned_images/SN004/02103.jpg,0,0,1,0,0,0,0,0
657 | 2895,DISFA/aligned_images/SN006/02896.jpg,0,0,0,0,0,0,0,0
658 | 31072,DISFA/aligned_images/SN024/02003.jpg,0,0,0,0,0,0,0,0
659 | 71359,DISFA/aligned_images/SN017/03530.jpg,0,0,0,0,0,0,0,0
660 | 14786,DISFA/aligned_images/SN013/00252.jpg,0,0,0,0,0,0,0,0
661 | 77024,DISFA/aligned_images/SN023/04362.jpg,0,0,1,0,0,0,1,0
662 | 39489,DISFA/aligned_images/SN031/00730.jpg,0,0,0,0,0,1,1,1
663 | 32067,DISFA/aligned_images/SN024/02998.jpg,0,0,0,0,0,0,0,0
664 | 50967,DISFA/aligned_images/SN004/02518.jpg,0,0,0,0,0,0,0,0
665 | 76764,DISFA/aligned_images/SN023/04102.jpg,0,0,0,0,0,0,1,0
666 | 44837,DISFA/aligned_images/SN003/01233.jpg,0,0,0,0,0,0,0,1
667 | 50205,DISFA/aligned_images/SN004/01756.jpg,0,0,0,0,0,1,1,0
668 | 22553,DISFA/aligned_images/SN018/03174.jpg,0,0,0,0,0,0,0,0
669 | 39473,DISFA/aligned_images/SN031/00714.jpg,0,0,0,0,0,1,1,1
670 | 74923,DISFA/aligned_images/SN023/02261.jpg,0,0,1,0,0,0,1,0
671 | 59845,DISFA/aligned_images/SN007/01706.jpg,0,0,0,0,1,0,0,0
672 | 61658,DISFA/aligned_images/SN007/03519.jpg,0,0,0,0,0,0,0,0
673 | 42269,DISFA/aligned_images/SN031/03510.jpg,0,0,0,0,0,0,0,0
674 | 44103,DISFA/aligned_images/SN003/00499.jpg,0,0,0,0,0,0,0,0
675 | 66571,DISFA/aligned_images/SN008/03587.jpg,0,0,0,0,0,0,0,0
676 | 51013,DISFA/aligned_images/SN004/02564.jpg,0,0,0,0,0,0,0,0
677 | 72563,DISFA/aligned_images/SN017/04734.jpg,0,0,1,0,1,0,1,1
678 | 57840,DISFA/aligned_images/SN005/04546.jpg,0,0,0,0,0,0,1,1
679 | 27140,DISFA/aligned_images/SN021/02916.jpg,0,0,0,0,0,0,0,0
680 | 40575,DISFA/aligned_images/SN031/01816.jpg,0,0,0,0,0,1,1,1
681 | 77605,DISFA/aligned_images/SN025/00098.jpg,0,0,0,0,0,0,0,0
682 | 46379,DISFA/aligned_images/SN003/02775.jpg,1,0,1,0,0,0,0,0
683 | 49658,DISFA/aligned_images/SN004/01209.jpg,1,1,0,0,0,0,0,0
684 | 658,DISFA/aligned_images/SN006/00659.jpg,0,0,0,1,0,1,1,0
685 | 61668,DISFA/aligned_images/SN007/03529.jpg,0,0,0,0,0,0,0,0
686 | 17415,DISFA/aligned_images/SN013/02881.jpg,0,0,0,0,0,0,0,0
687 | 48967,DISFA/aligned_images/SN004/00518.jpg,0,0,0,0,0,0,0,0
688 | 77446,DISFA/aligned_images/SN023/04784.jpg,0,0,1,0,0,0,0,1
689 | 65406,DISFA/aligned_images/SN008/02422.jpg,0,0,0,0,0,0,0,0
690 | 47272,DISFA/aligned_images/SN003/03668.jpg,0,0,0,0,0,0,0,0
691 | 81251,DISFA/aligned_images/SN025/03744.jpg,0,0,0,0,0,0,1,0
692 | 15029,DISFA/aligned_images/SN013/00495.jpg,0,0,0,0,0,0,0,0
693 | 3049,DISFA/aligned_images/SN006/03050.jpg,0,0,0,0,0,0,0,0
694 | 45212,DISFA/aligned_images/SN003/01608.jpg,0,0,0,0,0,0,0,0
695 | 44956,DISFA/aligned_images/SN003/01352.jpg,0,0,1,0,0,0,1,0
696 | 56941,DISFA/aligned_images/SN005/03647.jpg,0,0,0,0,0,0,0,0
697 | 57701,DISFA/aligned_images/SN005/04407.jpg,0,0,0,0,0,0,0,0
698 | 86652,DISFA/aligned_images/SN029/04732.jpg,1,1,0,0,0,0,0,0
699 | 46553,DISFA/aligned_images/SN003/02949.jpg,0,0,1,0,0,0,0,0
700 | 76686,DISFA/aligned_images/SN023/04024.jpg,0,0,0,0,0,0,0,0
701 | 31899,DISFA/aligned_images/SN024/02830.jpg,0,0,0,0,0,0,0,0
702 | 75718,DISFA/aligned_images/SN023/03056.jpg,0,1,1,0,0,0,0,0
703 | 39828,DISFA/aligned_images/SN031/01069.jpg,0,0,0,0,0,0,0,0
704 | 81837,DISFA/aligned_images/SN025/04330.jpg,0,0,0,0,0,0,1,0
705 | 879,DISFA/aligned_images/SN006/00880.jpg,0,0,0,0,0,1,1,0
706 | 55653,DISFA/aligned_images/SN005/02359.jpg,0,0,0,0,0,0,0,0
707 | 49505,DISFA/aligned_images/SN004/01056.jpg,1,1,0,0,0,0,0,0
708 | 64280,DISFA/aligned_images/SN008/01296.jpg,1,1,0,0,0,0,0,0
709 | 79121,DISFA/aligned_images/SN025/01614.jpg,0,0,0,0,0,1,1,0
710 | 63258,DISFA/aligned_images/SN008/00274.jpg,0,0,0,1,0,0,0,0
711 | 4640,DISFA/aligned_images/SN006/04641.jpg,0,0,0,1,0,0,1,0
712 | 30734,DISFA/aligned_images/SN024/01665.jpg,0,0,0,0,0,0,0,0
713 | 55191,DISFA/aligned_images/SN005/01897.jpg,0,0,0,0,0,0,0,0
714 | 25129,DISFA/aligned_images/SN021/00905.jpg,0,0,0,0,0,0,0,0
715 | 20196,DISFA/aligned_images/SN018/00817.jpg,0,0,0,0,0,1,1,1
716 | 52139,DISFA/aligned_images/SN004/03690.jpg,0,0,0,0,0,0,0,0
717 | 60768,DISFA/aligned_images/SN007/02629.jpg,0,0,0,0,0,0,0,0
718 | 22002,DISFA/aligned_images/SN018/02623.jpg,0,0,0,0,0,0,0,0
719 | 66830,DISFA/aligned_images/SN008/03846.jpg,0,0,0,0,0,0,0,0
720 | 19518,DISFA/aligned_images/SN018/00139.jpg,0,0,0,0,0,0,0,0
721 | 83105,DISFA/aligned_images/SN029/00764.jpg,0,0,0,1,0,1,0,0
722 | 8364,DISFA/aligned_images/SN011/03520.jpg,0,0,1,0,0,0,0,0
723 | 85103,DISFA/aligned_images/SN029/02767.jpg,1,1,1,0,0,0,0,0
724 | 30078,DISFA/aligned_images/SN024/01009.jpg,0,0,0,0,0,0,0,0
725 | 54880,DISFA/aligned_images/SN005/01586.jpg,0,0,0,0,0,0,0,0
726 | 22942,DISFA/aligned_images/SN018/03563.jpg,0,0,0,0,0,0,0,0
727 | 55773,DISFA/aligned_images/SN005/02479.jpg,0,0,0,0,0,0,0,0
728 | 8342,DISFA/aligned_images/SN011/03498.jpg,0,0,1,0,0,0,0,0
729 | 50133,DISFA/aligned_images/SN004/01684.jpg,0,0,1,1,1,1,1,0
730 | 20070,DISFA/aligned_images/SN018/00691.jpg,0,0,0,0,0,1,1,0
731 | 82184,DISFA/aligned_images/SN025/04677.jpg,0,0,0,0,0,0,1,1
732 | 58315,DISFA/aligned_images/SN007/00176.jpg,0,0,0,0,0,1,0,0
733 | 47764,DISFA/aligned_images/SN003/04160.jpg,0,0,0,0,0,0,0,0
734 | 80836,DISFA/aligned_images/SN025/03329.jpg,0,0,0,0,0,0,1,0
735 | 35463,DISFA/aligned_images/SN028/01549.jpg,0,0,0,0,0,0,1,0
736 | 32193,DISFA/aligned_images/SN024/03124.jpg,0,0,0,0,0,0,0,0
737 | 22212,DISFA/aligned_images/SN018/02833.jpg,0,0,0,0,0,0,0,0
738 | 67411,DISFA/aligned_images/SN008/04427.jpg,0,0,1,0,0,0,0,0
739 | 19364,DISFA/aligned_images/SN013/04830.jpg,0,0,0,1,0,1,1,0
740 | 73507,DISFA/aligned_images/SN023/00833.jpg,0,0,0,0,0,0,0,0
741 | 40271,DISFA/aligned_images/SN031/01512.jpg,0,0,0,0,0,1,1,1
742 | 72448,DISFA/aligned_images/SN017/04619.jpg,0,0,1,0,1,0,1,0
743 | 3398,DISFA/aligned_images/SN006/03399.jpg,0,0,0,0,0,0,0,0
744 | 70986,DISFA/aligned_images/SN017/03157.jpg,0,0,0,0,0,0,0,0
745 | 76530,DISFA/aligned_images/SN023/03868.jpg,0,0,0,0,0,0,0,0
746 | 75756,DISFA/aligned_images/SN023/03094.jpg,0,0,1,0,0,0,0,0
747 | 69364,DISFA/aligned_images/SN017/01535.jpg,0,0,0,0,0,1,0,0
748 | 50541,DISFA/aligned_images/SN004/02092.jpg,0,0,1,0,0,0,0,0
749 | 78925,DISFA/aligned_images/SN025/01418.jpg,0,0,0,0,0,1,1,0
750 | 18633,DISFA/aligned_images/SN013/04099.jpg,0,0,0,0,0,0,0,0
751 | 32323,DISFA/aligned_images/SN024/03254.jpg,0,0,0,0,0,0,0,0
752 | 31344,DISFA/aligned_images/SN024/02275.jpg,0,0,0,0,0,0,0,0
753 | 35129,DISFA/aligned_images/SN028/01215.jpg,0,0,0,0,0,0,0,0
754 | 71841,DISFA/aligned_images/SN017/04012.jpg,0,0,0,0,0,0,0,0
755 | 10977,DISFA/aligned_images/SN012/01288.jpg,0,0,0,0,0,0,1,0
756 | 54298,DISFA/aligned_images/SN005/01004.jpg,0,0,1,0,0,0,0,0
757 | 67522,DISFA/aligned_images/SN008/04538.jpg,1,1,0,0,0,0,0,1
758 | 58175,DISFA/aligned_images/SN007/00036.jpg,0,0,0,0,0,1,0,0
759 | 26446,DISFA/aligned_images/SN021/02222.jpg,0,0,1,0,0,0,0,0
760 | 6482,DISFA/aligned_images/SN011/01638.jpg,0,0,1,0,0,0,0,0
761 | 6607,DISFA/aligned_images/SN011/01763.jpg,0,0,1,0,0,0,0,0
762 | 45507,DISFA/aligned_images/SN003/01903.jpg,0,0,1,1,1,0,0,0
763 | 46193,DISFA/aligned_images/SN003/02589.jpg,1,0,0,0,0,0,0,0
764 | 36007,DISFA/aligned_images/SN028/02093.jpg,0,0,0,0,0,0,0,1
765 | 20844,DISFA/aligned_images/SN018/01465.jpg,0,0,0,0,0,0,0,0
766 | 84426,DISFA/aligned_images/SN029/02090.jpg,1,1,0,0,0,0,0,0
767 | 18220,DISFA/aligned_images/SN013/03686.jpg,0,0,0,0,0,0,0,0
768 | 84008,DISFA/aligned_images/SN029/01671.jpg,0,0,0,0,1,0,1,0
769 | 14177,DISFA/aligned_images/SN012/04488.jpg,0,0,0,0,0,0,0,0
770 | 76382,DISFA/aligned_images/SN023/03720.jpg,0,0,0,0,0,0,0,0
771 | 59775,DISFA/aligned_images/SN007/01636.jpg,0,0,0,0,0,0,0,0
772 | 55139,DISFA/aligned_images/SN005/01845.jpg,0,0,0,0,0,0,0,0
773 | 5442,DISFA/aligned_images/SN011/00598.jpg,0,0,0,0,0,0,0,0
774 | 48214,DISFA/aligned_images/SN003/04610.jpg,0,0,1,1,1,0,1,1
775 | 40222,DISFA/aligned_images/SN031/01463.jpg,0,0,0,0,0,1,1,1
776 | 54138,DISFA/aligned_images/SN005/00844.jpg,0,0,0,0,0,0,0,0
777 | 72313,DISFA/aligned_images/SN017/04484.jpg,0,0,0,0,0,0,0,0
778 | 84742,DISFA/aligned_images/SN029/02406.jpg,0,1,1,0,0,0,0,0
779 | 33205,DISFA/aligned_images/SN024/04136.jpg,0,0,0,0,0,0,0,0
780 | 24542,DISFA/aligned_images/SN021/00318.jpg,0,0,0,1,0,1,1,0
781 | 63482,DISFA/aligned_images/SN008/00498.jpg,0,0,0,0,0,0,0,0
782 | 35772,DISFA/aligned_images/SN028/01858.jpg,0,0,0,0,0,0,1,0
783 | 78583,DISFA/aligned_images/SN025/01076.jpg,0,0,0,0,0,0,0,0
784 | 23535,DISFA/aligned_images/SN018/04156.jpg,0,0,0,0,0,0,0,0
785 | 24836,DISFA/aligned_images/SN021/00612.jpg,0,0,0,0,0,0,0,0
786 | 76706,DISFA/aligned_images/SN023/04044.jpg,0,0,0,0,0,0,0,0
787 | 24793,DISFA/aligned_images/SN021/00569.jpg,0,0,0,0,0,1,0,0
788 | 11579,DISFA/aligned_images/SN012/01890.jpg,1,0,1,0,1,0,0,0
789 | 77786,DISFA/aligned_images/SN025/00279.jpg,0,0,0,0,0,1,1,0
790 | 84576,DISFA/aligned_images/SN029/02240.jpg,1,1,0,0,0,0,0,0
791 | 46347,DISFA/aligned_images/SN003/02743.jpg,1,0,1,0,0,0,0,0
792 | 46684,DISFA/aligned_images/SN003/03080.jpg,0,0,1,0,0,0,0,0
793 | 12869,DISFA/aligned_images/SN012/03180.jpg,0,0,0,0,0,0,1,0
794 | 19695,DISFA/aligned_images/SN018/00316.jpg,0,0,0,1,0,1,1,0
795 | 47000,DISFA/aligned_images/SN003/03396.jpg,0,0,0,0,0,0,0,0
796 | 85388,DISFA/aligned_images/SN029/03052.jpg,0,1,1,0,0,0,0,0
797 | 82824,DISFA/aligned_images/SN029/00472.jpg,0,0,0,0,0,1,0,0
798 | 65332,DISFA/aligned_images/SN008/02348.jpg,0,0,0,0,0,0,0,0
799 | 16888,DISFA/aligned_images/SN013/02354.jpg,0,0,0,0,0,0,0,0
800 | 62947,DISFA/aligned_images/SN007/04808.jpg,0,0,0,0,0,0,0,0
801 | 31999,DISFA/aligned_images/SN024/02930.jpg,0,0,0,0,0,0,0,0
802 | 6809,DISFA/aligned_images/SN011/01965.jpg,0,0,0,0,0,0,0,0
803 | 84364,DISFA/aligned_images/SN029/02028.jpg,0,0,0,0,0,0,0,0
804 | 35914,DISFA/aligned_images/SN028/02000.jpg,0,0,0,0,0,0,0,1
805 | 34865,DISFA/aligned_images/SN028/00951.jpg,0,0,0,0,0,0,0,0
806 | 17463,DISFA/aligned_images/SN013/02929.jpg,0,0,0,0,0,0,0,0
807 | 25137,DISFA/aligned_images/SN021/00913.jpg,0,0,0,0,0,0,0,0
808 | 20413,DISFA/aligned_images/SN018/01034.jpg,0,0,1,0,0,0,1,1
809 | 25821,DISFA/aligned_images/SN021/01597.jpg,0,0,0,0,0,0,0,0
810 | 20334,DISFA/aligned_images/SN018/00955.jpg,1,1,0,0,0,0,1,1
811 | 74718,DISFA/aligned_images/SN023/02056.jpg,0,0,1,0,1,0,1,0
812 | 63461,DISFA/aligned_images/SN008/00477.jpg,0,0,0,0,0,0,0,0
813 | 86325,DISFA/aligned_images/SN029/03989.jpg,0,0,1,0,0,0,0,0
814 | 66377,DISFA/aligned_images/SN008/03393.jpg,0,0,0,0,0,0,0,0
815 | 81134,DISFA/aligned_images/SN025/03627.jpg,0,0,0,0,0,0,1,0
816 | 20493,DISFA/aligned_images/SN018/01114.jpg,0,0,0,0,0,0,1,0
817 | 81216,DISFA/aligned_images/SN025/03709.jpg,0,0,0,0,0,0,1,0
818 | 73596,DISFA/aligned_images/SN023/00922.jpg,0,0,0,0,0,0,0,0
819 | 82893,DISFA/aligned_images/SN029/00541.jpg,0,0,0,0,0,1,0,0
820 | 81636,DISFA/aligned_images/SN025/04129.jpg,0,0,0,0,0,0,1,0
821 | 84912,DISFA/aligned_images/SN029/02576.jpg,0,0,0,0,0,0,0,0
822 | 71603,DISFA/aligned_images/SN017/03774.jpg,0,0,0,0,0,0,0,0
823 | 84539,DISFA/aligned_images/SN029/02203.jpg,1,1,1,0,0,0,0,1
824 | 54633,DISFA/aligned_images/SN005/01339.jpg,0,0,0,0,0,0,0,0
825 | 24779,DISFA/aligned_images/SN021/00555.jpg,0,0,0,0,0,1,0,0
826 | 25769,DISFA/aligned_images/SN021/01545.jpg,0,0,0,0,0,0,0,0
827 | 23779,DISFA/aligned_images/SN018/04400.jpg,1,1,0,0,0,0,0,0
828 | 65917,DISFA/aligned_images/SN008/02933.jpg,0,0,0,0,0,0,0,0
829 | 26009,DISFA/aligned_images/SN021/01785.jpg,0,0,1,0,0,0,0,0
830 | 33794,DISFA/aligned_images/SN024/04725.jpg,1,1,0,0,0,0,0,0
831 | 85677,DISFA/aligned_images/SN029/03341.jpg,1,0,1,0,0,0,0,0
832 | 48565,DISFA/aligned_images/SN004/00116.jpg,0,0,0,0,0,0,0,0
833 | 53249,DISFA/aligned_images/SN004/04800.jpg,1,1,1,0,1,1,1,0
834 | 18476,DISFA/aligned_images/SN013/03942.jpg,0,0,0,0,0,0,0,0
835 | 80428,DISFA/aligned_images/SN025/02921.jpg,0,0,0,0,0,0,1,0
836 | 34727,DISFA/aligned_images/SN028/00813.jpg,0,0,0,0,0,0,0,0
837 | 9311,DISFA/aligned_images/SN011/04467.jpg,0,0,1,0,0,0,0,0
838 | 44688,DISFA/aligned_images/SN003/01084.jpg,0,0,1,0,1,0,0,0
839 | 79447,DISFA/aligned_images/SN025/01940.jpg,0,0,0,0,0,0,1,0
840 | 64171,DISFA/aligned_images/SN008/01187.jpg,0,0,0,0,0,0,0,0
841 | 80674,DISFA/aligned_images/SN025/03167.jpg,0,0,0,0,0,0,1,0
842 | 84557,DISFA/aligned_images/SN029/02221.jpg,0,1,0,0,0,0,0,0
843 | 39990,DISFA/aligned_images/SN031/01231.jpg,0,0,0,0,0,0,1,1
844 | 11169,DISFA/aligned_images/SN012/01480.jpg,0,0,0,0,0,0,1,0
845 | 85841,DISFA/aligned_images/SN029/03505.jpg,0,0,1,0,0,0,0,0
846 | 40233,DISFA/aligned_images/SN031/01474.jpg,0,0,0,0,0,1,1,1
847 | 1466,DISFA/aligned_images/SN006/01467.jpg,0,0,0,1,0,1,1,0
848 | 18113,DISFA/aligned_images/SN013/03579.jpg,0,0,0,0,0,0,0,0
849 | 42055,DISFA/aligned_images/SN031/03296.jpg,0,0,0,0,0,0,0,0
850 | 24748,DISFA/aligned_images/SN021/00524.jpg,0,0,0,0,0,0,0,0
851 | 15388,DISFA/aligned_images/SN013/00854.jpg,0,0,0,0,0,0,0,0
852 | 21600,DISFA/aligned_images/SN018/02221.jpg,0,0,1,0,0,0,0,0
853 | 15306,DISFA/aligned_images/SN013/00772.jpg,0,0,0,0,0,1,1,0
854 | 83117,DISFA/aligned_images/SN029/00778.jpg,0,0,0,1,0,1,0,0
855 | 34848,DISFA/aligned_images/SN028/00934.jpg,0,0,0,0,0,0,0,0
856 | 77586,DISFA/aligned_images/SN025/00079.jpg,0,0,0,0,0,0,1,0
857 | 5937,DISFA/aligned_images/SN011/01093.jpg,0,0,0,1,0,1,1,0
858 | 47593,DISFA/aligned_images/SN003/03989.jpg,0,0,0,0,0,0,0,0
859 | 18919,DISFA/aligned_images/SN013/04385.jpg,0,0,0,0,0,0,0,0
860 | 75700,DISFA/aligned_images/SN023/03038.jpg,1,1,1,0,0,0,0,0
861 | 65070,DISFA/aligned_images/SN008/02086.jpg,0,0,0,0,0,0,0,0
862 | 31205,DISFA/aligned_images/SN024/02136.jpg,0,0,0,0,0,0,1,0
863 | 83850,DISFA/aligned_images/SN029/01512.jpg,0,0,0,0,0,0,0,0
864 | 60120,DISFA/aligned_images/SN007/01981.jpg,0,0,0,0,0,0,0,0
865 | 59014,DISFA/aligned_images/SN007/00875.jpg,0,0,0,0,0,0,1,0
866 | 74376,DISFA/aligned_images/SN023/01714.jpg,0,0,1,0,1,1,1,0
867 | 55288,DISFA/aligned_images/SN005/01994.jpg,0,0,0,0,0,0,1,0
868 | 16785,DISFA/aligned_images/SN013/02251.jpg,0,0,0,0,0,0,0,0
869 | 45297,DISFA/aligned_images/SN003/01693.jpg,1,0,1,1,1,0,0,0
870 | 30213,DISFA/aligned_images/SN024/01144.jpg,0,0,0,0,0,0,0,0
871 | 86717,DISFA/aligned_images/SN029/04797.jpg,1,1,0,0,0,0,0,0
872 | 33881,DISFA/aligned_images/SN024/04812.jpg,1,1,0,0,0,0,1,0
873 | 9205,DISFA/aligned_images/SN011/04361.jpg,0,0,1,0,0,0,0,0
874 | 26864,DISFA/aligned_images/SN021/02640.jpg,0,0,0,0,0,0,0,0
875 | 65576,DISFA/aligned_images/SN008/02592.jpg,0,0,0,0,0,0,0,0
876 | 85254,DISFA/aligned_images/SN029/02918.jpg,1,1,1,0,0,0,0,0
877 | 21869,DISFA/aligned_images/SN018/02490.jpg,0,0,1,0,0,0,0,0
878 | 15784,DISFA/aligned_images/SN013/01250.jpg,0,0,0,0,0,0,0,0
879 | 33535,DISFA/aligned_images/SN024/04466.jpg,0,0,0,0,0,0,0,0
880 | 80341,DISFA/aligned_images/SN025/02834.jpg,0,0,0,0,0,0,1,0
881 | 72777,DISFA/aligned_images/SN023/00103.jpg,0,0,0,0,0,0,1,1
882 | 19929,DISFA/aligned_images/SN018/00550.jpg,0,0,0,0,0,1,1,0
883 | 53906,DISFA/aligned_images/SN005/00612.jpg,0,0,0,0,0,0,0,0
884 | 892,DISFA/aligned_images/SN006/00893.jpg,0,0,0,0,0,1,1,0
885 | 86750,DISFA/aligned_images/SN029/04830.jpg,1,1,0,0,0,0,0,0
886 | 21876,DISFA/aligned_images/SN018/02497.jpg,0,0,1,0,0,0,1,0
887 | 50580,DISFA/aligned_images/SN004/02131.jpg,0,0,1,0,0,0,0,0
888 | 84027,DISFA/aligned_images/SN029/01690.jpg,0,0,1,0,1,0,1,0
889 | 77152,DISFA/aligned_images/SN023/04490.jpg,0,0,1,0,0,0,1,0
890 | 13296,DISFA/aligned_images/SN012/03607.jpg,0,0,0,0,0,0,0,0
891 | 83822,DISFA/aligned_images/SN029/01484.jpg,0,0,0,0,0,0,0,0
892 | 55636,DISFA/aligned_images/SN005/02342.jpg,0,0,0,0,0,0,0,0
893 | 67002,DISFA/aligned_images/SN008/04018.jpg,0,0,0,0,0,0,0,0
894 | 72722,DISFA/aligned_images/SN023/00048.jpg,0,0,0,0,0,0,0,0
895 | 31932,DISFA/aligned_images/SN024/02863.jpg,0,0,0,0,0,0,0,0
896 | 54467,DISFA/aligned_images/SN005/01173.jpg,0,0,0,0,0,0,0,0
897 | 48879,DISFA/aligned_images/SN004/00430.jpg,0,0,0,0,0,0,0,0
898 | 36865,DISFA/aligned_images/SN028/02951.jpg,0,0,0,0,0,0,0,0
899 | 7647,DISFA/aligned_images/SN011/02803.jpg,0,0,1,0,0,0,0,0
900 | 43092,DISFA/aligned_images/SN031/04333.jpg,0,0,0,0,0,0,0,0
901 | 26707,DISFA/aligned_images/SN021/02483.jpg,0,0,0,0,0,0,1,0
902 | 31263,DISFA/aligned_images/SN024/02194.jpg,0,0,0,0,0,0,1,0
903 | 54136,DISFA/aligned_images/SN005/00842.jpg,0,0,0,0,0,0,0,0
904 | 20271,DISFA/aligned_images/SN018/00892.jpg,0,0,0,0,0,0,0,0
905 | 49061,DISFA/aligned_images/SN004/00612.jpg,0,0,0,0,0,0,0,0
906 | 27883,DISFA/aligned_images/SN021/03659.jpg,0,0,0,0,0,0,0,0
907 | 8606,DISFA/aligned_images/SN011/03762.jpg,0,0,1,0,0,0,0,0
908 | 29706,DISFA/aligned_images/SN024/00637.jpg,1,1,0,0,0,0,0,0
909 | 35212,DISFA/aligned_images/SN028/01298.jpg,0,0,0,0,0,0,1,0
910 | 57279,DISFA/aligned_images/SN005/03985.jpg,0,0,0,0,0,0,0,0
911 | 72464,DISFA/aligned_images/SN017/04635.jpg,0,0,1,0,1,0,1,0
912 | 71138,DISFA/aligned_images/SN017/03309.jpg,0,0,0,0,0,0,0,0
913 | 39432,DISFA/aligned_images/SN031/00673.jpg,0,0,0,0,0,1,1,1
914 | 14628,DISFA/aligned_images/SN013/00094.jpg,0,0,0,0,0,0,0,0
915 | 85356,DISFA/aligned_images/SN029/03020.jpg,0,1,1,0,0,0,0,0
916 | 39309,DISFA/aligned_images/SN031/00550.jpg,0,0,0,0,0,0,0,0
917 | 72351,DISFA/aligned_images/SN017/04522.jpg,0,0,0,0,0,0,0,0
918 | 9281,DISFA/aligned_images/SN011/04437.jpg,0,0,1,0,0,0,0,0
919 | 61351,DISFA/aligned_images/SN007/03212.jpg,0,0,0,0,0,0,0,0
920 | 57822,DISFA/aligned_images/SN005/04528.jpg,0,0,0,0,0,0,0,0
921 | 31229,DISFA/aligned_images/SN024/02160.jpg,0,0,0,0,0,0,1,0
922 | 13770,DISFA/aligned_images/SN012/04081.jpg,0,0,0,0,0,0,0,0
923 | 43688,DISFA/aligned_images/SN003/00084.jpg,0,0,0,0,0,0,0,0
924 | 5074,DISFA/aligned_images/SN011/00230.jpg,0,0,0,0,0,0,1,0
925 | 28823,DISFA/aligned_images/SN021/04599.jpg,0,0,0,0,0,0,1,0
926 | 65762,DISFA/aligned_images/SN008/02778.jpg,0,0,0,0,0,0,0,0
927 | 5413,DISFA/aligned_images/SN011/00569.jpg,0,1,0,0,0,0,0,0
928 | 56782,DISFA/aligned_images/SN005/03488.jpg,0,0,0,0,0,0,0,0
929 | 86150,DISFA/aligned_images/SN029/03814.jpg,0,0,0,0,0,0,0,1
930 | 54345,DISFA/aligned_images/SN005/01051.jpg,0,0,0,0,0,0,1,1
931 | 83568,DISFA/aligned_images/SN029/01230.jpg,0,0,0,0,0,0,1,1
932 | 57675,DISFA/aligned_images/SN005/04381.jpg,0,0,0,0,0,0,0,0
933 | 57328,DISFA/aligned_images/SN005/04034.jpg,0,0,0,0,0,0,0,0
934 | 23092,DISFA/aligned_images/SN018/03713.jpg,0,0,0,0,0,0,0,0
935 | 41700,DISFA/aligned_images/SN031/02941.jpg,0,0,0,0,0,0,0,0
936 | 18620,DISFA/aligned_images/SN013/04086.jpg,0,0,0,0,0,0,0,0
937 | 7686,DISFA/aligned_images/SN011/02842.jpg,0,0,1,0,0,0,0,0
938 | 32584,DISFA/aligned_images/SN024/03515.jpg,0,0,0,0,0,0,0,0
939 | 43985,DISFA/aligned_images/SN003/00381.jpg,0,0,0,0,0,0,0,0
940 | 65786,DISFA/aligned_images/SN008/02802.jpg,0,0,0,0,0,0,0,0
941 | 55085,DISFA/aligned_images/SN005/01791.jpg,0,0,1,1,0,0,1,0
942 | 62435,DISFA/aligned_images/SN007/04296.jpg,0,0,0,0,0,0,0,0
943 | 58672,DISFA/aligned_images/SN007/00533.jpg,0,0,0,0,0,1,0,0
944 | 63802,DISFA/aligned_images/SN008/00818.jpg,0,0,0,0,0,0,0,0
945 | 1286,DISFA/aligned_images/SN006/01287.jpg,0,0,0,1,0,1,1,0
946 | 69905,DISFA/aligned_images/SN017/02076.jpg,0,0,0,0,0,0,0,0
947 | 44328,DISFA/aligned_images/SN003/00724.jpg,0,0,0,1,0,1,1,0
948 | 14501,DISFA/aligned_images/SN012/04812.jpg,0,0,0,0,0,1,1,0
949 | 6715,DISFA/aligned_images/SN011/01871.jpg,0,0,1,0,0,0,1,0
950 | 36323,DISFA/aligned_images/SN028/02409.jpg,0,0,0,0,0,0,1,1
951 | 13175,DISFA/aligned_images/SN012/03486.jpg,0,0,0,0,0,0,0,0
952 | 31191,DISFA/aligned_images/SN024/02122.jpg,0,0,0,0,0,0,1,0
953 | 48151,DISFA/aligned_images/SN003/04547.jpg,0,0,1,1,1,0,1,0
954 | 37611,DISFA/aligned_images/SN028/03697.jpg,0,0,0,0,0,0,0,0
955 | 42675,DISFA/aligned_images/SN031/03916.jpg,0,0,0,0,0,0,0,0
956 | 26464,DISFA/aligned_images/SN021/02240.jpg,0,0,0,0,0,0,0,0
957 | 39732,DISFA/aligned_images/SN031/00973.jpg,0,0,0,0,0,0,0,0
958 | 14654,DISFA/aligned_images/SN013/00120.jpg,0,0,0,0,0,0,0,0
959 | 439,DISFA/aligned_images/SN006/00440.jpg,0,0,0,0,0,0,1,0
960 | 24930,DISFA/aligned_images/SN021/00706.jpg,0,0,0,0,0,0,0,0
961 | 9830,DISFA/aligned_images/SN012/00141.jpg,0,0,0,0,0,1,1,0
962 | 81681,DISFA/aligned_images/SN025/04174.jpg,0,0,0,0,0,0,1,0
963 | 3905,DISFA/aligned_images/SN006/03906.jpg,0,0,0,0,0,0,0,0
964 | 31819,DISFA/aligned_images/SN024/02750.jpg,0,0,0,0,0,0,0,0
965 | 63,DISFA/aligned_images/SN006/00064.jpg,0,0,0,0,0,0,0,0
966 | 52403,DISFA/aligned_images/SN004/03954.jpg,0,0,1,0,0,0,0,0
967 | 9509,DISFA/aligned_images/SN011/04665.jpg,1,0,1,0,0,0,1,0
968 | 41680,DISFA/aligned_images/SN031/02921.jpg,0,0,0,0,0,0,0,0
969 | 31560,DISFA/aligned_images/SN024/02491.jpg,0,0,0,0,0,0,0,0
970 | 43362,DISFA/aligned_images/SN031/04603.jpg,0,0,0,0,0,0,0,0
971 | 49650,DISFA/aligned_images/SN004/01201.jpg,1,1,0,0,0,0,0,0
972 | 23297,DISFA/aligned_images/SN018/03918.jpg,0,0,0,0,0,0,0,0
973 | 80713,DISFA/aligned_images/SN025/03206.jpg,0,0,0,0,0,0,1,0
974 | 9441,DISFA/aligned_images/SN011/04597.jpg,1,0,1,0,0,0,1,1
975 | 8005,DISFA/aligned_images/SN011/03161.jpg,0,0,1,0,0,0,0,0
976 | 57385,DISFA/aligned_images/SN005/04091.jpg,0,0,0,0,0,0,0,0
977 | 60753,DISFA/aligned_images/SN007/02614.jpg,0,0,0,0,0,0,0,0
978 | 11392,DISFA/aligned_images/SN012/01703.jpg,1,0,1,0,1,0,1,0
979 | 84585,DISFA/aligned_images/SN029/02249.jpg,1,1,0,0,0,0,0,0
980 | 79774,DISFA/aligned_images/SN025/02267.jpg,0,0,0,0,0,1,1,0
981 | 75846,DISFA/aligned_images/SN023/03184.jpg,0,0,0,0,0,0,0,0
982 | 17145,DISFA/aligned_images/SN013/02611.jpg,0,0,0,0,0,0,0,0
983 | 27663,DISFA/aligned_images/SN021/03439.jpg,0,0,0,0,0,0,0,0
984 | 19530,DISFA/aligned_images/SN018/00151.jpg,0,0,0,0,0,1,0,0
985 | 85128,DISFA/aligned_images/SN029/02792.jpg,1,1,1,0,0,0,0,0
986 | 57343,DISFA/aligned_images/SN005/04049.jpg,0,0,0,0,0,0,0,0
987 | 41633,DISFA/aligned_images/SN031/02874.jpg,0,0,0,0,0,0,0,0
988 | 46476,DISFA/aligned_images/SN003/02872.jpg,0,0,1,0,0,0,0,0
989 | 8046,DISFA/aligned_images/SN011/03202.jpg,0,0,1,0,0,0,0,0
990 | 73823,DISFA/aligned_images/SN023/01161.jpg,0,0,0,0,0,0,1,0
991 | 31142,DISFA/aligned_images/SN024/02073.jpg,0,0,0,0,0,0,1,0
992 | 2611,DISFA/aligned_images/SN006/02612.jpg,0,0,0,0,0,0,0,0
993 | 43632,DISFA/aligned_images/SN003/00028.jpg,0,0,0,0,0,0,0,0
994 | 60559,DISFA/aligned_images/SN007/02420.jpg,0,0,0,0,0,0,0,0
995 | 6295,DISFA/aligned_images/SN011/01451.jpg,0,0,0,0,0,0,1,0
996 | 86726,DISFA/aligned_images/SN029/04806.jpg,1,1,0,0,0,0,0,0
997 | 17262,DISFA/aligned_images/SN013/02728.jpg,0,0,1,0,0,0,0,0
998 | 27182,DISFA/aligned_images/SN021/02958.jpg,0,0,0,0,0,0,0,0
999 | 15829,DISFA/aligned_images/SN013/01295.jpg,0,0,0,0,0,0,0,0
1000 | 55123,DISFA/aligned_images/SN005/01829.jpg,0,0,0,0,0,0,0,0
1001 | 64359,DISFA/aligned_images/SN008/01375.jpg,0,0,0,0,0,0,0,0
1002 |
--------------------------------------------------------------------------------
/data/DISFA/labels/1/train_100.csv:
--------------------------------------------------------------------------------
1 | ,image_path,au1,au2,au4,au6,au9,au12,au25,au26
2 | 49927,DISFA/aligned_images/SN002/01925.jpg,0,0,0,0,0,0,1,0
3 | 30213,DISFA/aligned_images/SN023/01156.jpg,0,0,0,0,0,0,1,0
4 | 85851,DISFA/aligned_images/SN032/03991.jpg,0,0,0,0,0,0,0,0
5 | 4025,DISFA/aligned_images/SN003/04026.jpg,0,0,0,0,0,0,0,1
6 | 26278,DISFA/aligned_images/SN017/02054.jpg,1,1,0,0,0,0,0,0
7 | 5399,DISFA/aligned_images/SN004/00555.jpg,0,0,0,0,0,0,1,0
8 | 85026,DISFA/aligned_images/SN032/03166.jpg,0,0,0,0,0,0,0,0
9 | 16731,DISFA/aligned_images/SN007/02197.jpg,0,0,0,0,0,0,0,0
10 | 48628,DISFA/aligned_images/SN002/00623.jpg,0,0,0,0,0,0,0,0
11 | 61855,DISFA/aligned_images/SN010/04219.jpg,0,0,0,0,0,0,0,0
12 | 81140,DISFA/aligned_images/SN030/04125.jpg,0,0,0,0,0,0,0,0
13 | 15750,DISFA/aligned_images/SN007/01216.jpg,0,0,0,0,0,0,0,0
14 | 67556,DISFA/aligned_images/SN026/00231.jpg,0,0,0,1,0,1,1,0
15 | 46279,DISFA/aligned_images/SN001/03119.jpg,0,0,0,0,0,0,0,0
16 | 28969,DISFA/aligned_images/SN017/04745.jpg,0,0,1,0,1,0,1,1
17 | 51337,DISFA/aligned_images/SN002/03335.jpg,0,0,0,0,0,0,0,0
18 | 37349,DISFA/aligned_images/SN025/03447.jpg,0,0,0,0,0,0,1,0
19 | 79959,DISFA/aligned_images/SN030/02944.jpg,0,0,0,0,0,0,0,0
20 | 72063,DISFA/aligned_images/SN026/04738.jpg,0,0,0,0,0,0,0,0
21 | 10429,DISFA/aligned_images/SN005/00740.jpg,0,0,0,0,0,0,0,0
22 | 8761,DISFA/aligned_images/SN004/03917.jpg,0,0,0,0,0,0,0,0
23 | 15606,DISFA/aligned_images/SN007/01072.jpg,0,0,0,1,0,1,0,0
24 | 47410,DISFA/aligned_images/SN001/04250.jpg,0,0,0,0,0,0,0,0
25 | 21950,DISFA/aligned_images/SN008/02571.jpg,0,0,0,0,0,0,0,0
26 | 33085,DISFA/aligned_images/SN023/04028.jpg,0,0,0,0,0,0,0,0
27 | 52485,DISFA/aligned_images/SN002/04483.jpg,0,0,0,0,0,0,0,0
28 | 55565,DISFA/aligned_images/SN009/02774.jpg,0,0,0,0,0,0,0,0
29 | 18604,DISFA/aligned_images/SN007/04070.jpg,0,0,0,0,0,0,0,0
30 | 40691,DISFA/aligned_images/SN029/01960.jpg,0,0,0,0,0,0,0,0
31 | 27202,DISFA/aligned_images/SN017/02978.jpg,0,0,0,0,0,0,0,0
32 | 28699,DISFA/aligned_images/SN017/04475.jpg,0,0,0,0,0,0,0,0
33 | 79301,DISFA/aligned_images/SN030/02286.jpg,0,0,0,0,0,0,0,0
34 | 73757,DISFA/aligned_images/SN027/01587.jpg,0,0,0,0,0,0,0,0
35 | 82441,DISFA/aligned_images/SN032/00581.jpg,0,0,0,0,0,0,1,0
36 | 46020,DISFA/aligned_images/SN001/02860.jpg,0,0,0,0,0,0,0,0
37 | 58827,DISFA/aligned_images/SN010/01191.jpg,0,0,0,0,0,0,1,0
38 | 76232,DISFA/aligned_images/SN027/04062.jpg,0,0,0,0,0,0,0,0
39 | 55319,DISFA/aligned_images/SN009/02528.jpg,0,0,0,0,0,0,0,0
40 | 34218,DISFA/aligned_images/SN025/00316.jpg,0,0,0,0,0,1,1,0
41 | 67169,DISFA/aligned_images/SN016/04689.jpg,1,1,0,0,0,0,0,0
42 | 34714,DISFA/aligned_images/SN025/00812.jpg,0,0,0,0,0,0,1,0
43 | 79485,DISFA/aligned_images/SN030/02470.jpg,0,0,0,0,0,0,0,0
44 | 415,DISFA/aligned_images/SN003/00416.jpg,0,0,0,0,0,0,0,0
45 | 54576,DISFA/aligned_images/SN009/01729.jpg,0,0,0,0,0,0,0,0
46 | 10260,DISFA/aligned_images/SN005/00571.jpg,0,0,0,0,0,1,1,0
47 | 69802,DISFA/aligned_images/SN026/02477.jpg,0,0,0,0,0,0,1,0
48 | 79545,DISFA/aligned_images/SN030/02530.jpg,0,0,0,0,0,0,0,0
49 | 14764,DISFA/aligned_images/SN007/00230.jpg,0,0,0,1,0,1,1,0
50 | 15209,DISFA/aligned_images/SN007/00675.jpg,0,0,0,0,0,0,0,0
51 | 13145,DISFA/aligned_images/SN005/03456.jpg,0,0,0,0,0,0,0,0
52 | 74434,DISFA/aligned_images/SN027/02264.jpg,0,0,0,0,0,0,0,0
53 | 17072,DISFA/aligned_images/SN007/02538.jpg,0,0,0,0,0,0,0,0
54 | 17426,DISFA/aligned_images/SN007/02892.jpg,0,0,1,0,0,0,0,0
55 | 50422,DISFA/aligned_images/SN002/02420.jpg,0,0,0,0,0,0,1,1
56 | 72029,DISFA/aligned_images/SN026/04704.jpg,0,0,0,0,0,0,0,0
57 | 59073,DISFA/aligned_images/SN010/01437.jpg,0,0,0,0,0,0,0,0
58 | 34448,DISFA/aligned_images/SN025/00546.jpg,0,0,0,0,0,0,1,0
59 | 19277,DISFA/aligned_images/SN007/04743.jpg,0,0,0,0,0,0,0,0
60 | 2955,DISFA/aligned_images/SN003/02956.jpg,0,0,1,0,0,0,0,0
61 | 27806,DISFA/aligned_images/SN017/03582.jpg,0,0,0,0,0,0,0,0
62 | 19289,DISFA/aligned_images/SN007/04755.jpg,0,0,0,0,0,0,0,0
63 | 31316,DISFA/aligned_images/SN023/02259.jpg,0,0,1,0,0,0,1,0
64 | 22989,DISFA/aligned_images/SN008/03610.jpg,0,0,0,0,0,0,0,0
65 | 61112,DISFA/aligned_images/SN010/03476.jpg,0,0,0,0,0,0,0,0
66 | 74483,DISFA/aligned_images/SN027/02313.jpg,0,0,0,0,0,0,0,0
67 | 72537,DISFA/aligned_images/SN027/00367.jpg,0,0,0,0,0,0,0,0
68 | 77532,DISFA/aligned_images/SN030/00517.jpg,0,0,0,0,0,0,0,0
69 | 20698,DISFA/aligned_images/SN008/01319.jpg,1,1,0,0,0,0,0,0
70 | 25682,DISFA/aligned_images/SN017/01458.jpg,0,0,0,1,0,1,0,0
71 | 73503,DISFA/aligned_images/SN027/01333.jpg,1,0,1,0,0,0,0,0
72 | 14894,DISFA/aligned_images/SN007/00360.jpg,0,0,0,0,0,0,0,0
73 | 71986,DISFA/aligned_images/SN026/04661.jpg,0,0,0,0,0,0,0,0
74 | 68256,DISFA/aligned_images/SN026/00931.jpg,0,0,0,0,0,0,0,0
75 | 36161,DISFA/aligned_images/SN025/02259.jpg,0,0,0,0,0,1,1,0
76 | 75330,DISFA/aligned_images/SN027/03160.jpg,0,0,0,0,0,0,0,0
77 | 14335,DISFA/aligned_images/SN005/04646.jpg,0,0,0,0,0,0,1,1
78 | 68622,DISFA/aligned_images/SN026/01297.jpg,0,0,0,0,0,0,0,0
79 | 75224,DISFA/aligned_images/SN027/03054.jpg,0,0,0,0,0,0,0,0
80 | 80842,DISFA/aligned_images/SN030/03827.jpg,0,0,0,0,0,0,0,0
81 | 33820,DISFA/aligned_images/SN023/04763.jpg,0,0,1,0,0,0,1,0
82 | 79891,DISFA/aligned_images/SN030/02876.jpg,0,0,0,0,0,0,0,0
83 | 32395,DISFA/aligned_images/SN023/03338.jpg,0,0,0,0,0,0,0,0
84 | 70609,DISFA/aligned_images/SN026/03284.jpg,0,0,0,0,0,0,0,0
85 | 43887,DISFA/aligned_images/SN001/00727.jpg,0,0,0,0,0,0,0,0
86 | 22401,DISFA/aligned_images/SN008/03022.jpg,0,0,0,0,0,0,0,0
87 | 79614,DISFA/aligned_images/SN030/02599.jpg,0,0,0,0,0,0,0,0
88 | 13583,DISFA/aligned_images/SN005/03894.jpg,0,0,0,0,0,0,0,0
89 | 53805,DISFA/aligned_images/SN009/00958.jpg,0,0,0,0,0,0,0,0
90 | 68273,DISFA/aligned_images/SN026/00948.jpg,0,0,0,0,0,0,0,0
91 | 81199,DISFA/aligned_images/SN030/04184.jpg,0,0,0,0,0,0,0,0
92 | 36586,DISFA/aligned_images/SN025/02684.jpg,0,0,0,0,0,0,1,0
93 | 25598,DISFA/aligned_images/SN017/01374.jpg,0,0,0,1,0,1,0,0
94 | 41362,DISFA/aligned_images/SN029/02631.jpg,0,0,0,0,0,0,0,0
95 | 5522,DISFA/aligned_images/SN004/00678.jpg,0,0,0,1,0,1,1,0
96 | 17177,DISFA/aligned_images/SN007/02643.jpg,1,1,0,0,0,0,0,0
97 | 59938,DISFA/aligned_images/SN010/02302.jpg,0,0,0,0,0,0,0,0
98 | 82686,DISFA/aligned_images/SN032/00826.jpg,0,0,0,0,0,1,1,1
99 | 43329,DISFA/aligned_images/SN001/00169.jpg,0,0,0,0,0,1,0,0
100 | 48345,DISFA/aligned_images/SN002/00340.jpg,0,0,0,0,0,0,0,0
101 | 62769,DISFA/aligned_images/SN016/00289.jpg,0,0,0,0,0,0,1,0
102 |
--------------------------------------------------------------------------------
/data/DISFA/labels/2/train_100.csv:
--------------------------------------------------------------------------------
1 | ,image_path,au1,au2,au4,au6,au9,au12,au25,au26
2 | 75161,DISFA/aligned_images/SN024/02547.jpg,0,0,0,0,0,0,0,1
3 | 33203,DISFA/aligned_images/SN027/04194.jpg,0,0,0,0,0,0,1,0
4 | 72581,DISFA/aligned_images/SN021/04812.jpg,0,0,0,0,0,0,1,0
5 | 33835,DISFA/aligned_images/SN027/04826.jpg,0,0,0,0,0,1,1,0
6 | 3786,DISFA/aligned_images/SN001/03787.jpg,0,0,0,0,0,0,0,0
7 | 22550,DISFA/aligned_images/SN016/03231.jpg,0,0,1,0,0,0,0,0
8 | 78486,DISFA/aligned_images/SN028/01027.jpg,0,0,0,0,0,0,1,0
9 | 12064,DISFA/aligned_images/SN009/02434.jpg,0,0,0,0,0,0,0,0
10 | 46266,DISFA/aligned_images/SN006/02722.jpg,0,0,0,0,0,0,0,0
11 | 82495,DISFA/aligned_images/SN031/00191.jpg,0,0,0,1,0,1,1,0
12 | 62471,DISFA/aligned_images/SN013/04392.jpg,0,0,0,0,0,0,0,0
13 | 55009,DISFA/aligned_images/SN012/01775.jpg,1,0,1,0,1,0,0,0
14 | 13692,DISFA/aligned_images/SN009/04062.jpg,0,0,0,0,0,0,0,0
15 | 34819,DISFA/aligned_images/SN030/00965.jpg,0,0,0,0,0,1,1,1
16 | 78267,DISFA/aligned_images/SN028/00808.jpg,0,0,0,0,0,0,0,0
17 | 64185,DISFA/aligned_images/SN018/01261.jpg,0,0,0,0,0,0,1,0
18 | 53274,DISFA/aligned_images/SN012/00040.jpg,0,0,0,0,0,0,1,1
19 | 70869,DISFA/aligned_images/SN021/03100.jpg,0,0,0,0,0,0,0,0
20 | 60465,DISFA/aligned_images/SN013/02386.jpg,0,0,0,0,0,0,0,0
21 | 10987,DISFA/aligned_images/SN009/01301.jpg,0,0,0,0,0,0,0,0
22 | 84882,DISFA/aligned_images/SN031/02578.jpg,0,0,0,0,0,0,0,0
23 | 30382,DISFA/aligned_images/SN027/01373.jpg,0,0,0,0,0,0,0,0
24 | 19396,DISFA/aligned_images/SN016/00077.jpg,0,0,0,0,0,0,0,0
25 | 69166,DISFA/aligned_images/SN021/01397.jpg,0,0,0,0,0,0,0,0
26 | 3039,DISFA/aligned_images/SN001/03040.jpg,0,0,0,0,0,0,0,0
27 | 72253,DISFA/aligned_images/SN021/04484.jpg,0,0,0,0,0,0,1,0
28 | 5081,DISFA/aligned_images/SN002/00237.jpg,0,0,0,0,0,1,1,1
29 | 61991,DISFA/aligned_images/SN013/03912.jpg,0,0,0,0,0,0,0,0
30 | 76186,DISFA/aligned_images/SN024/03572.jpg,0,0,0,0,0,0,0,0
31 | 4584,DISFA/aligned_images/SN001/04585.jpg,0,0,0,0,0,0,0,0
32 | 17463,DISFA/aligned_images/SN010/02988.jpg,0,0,0,0,0,0,0,0
33 | 77951,DISFA/aligned_images/SN028/00492.jpg,0,0,0,1,0,1,1,0
34 | 13045,DISFA/aligned_images/SN009/03415.jpg,0,0,0,0,0,0,0,0
35 | 76394,DISFA/aligned_images/SN024/03780.jpg,0,0,0,0,0,0,0,0
36 | 25143,DISFA/aligned_images/SN026/00979.jpg,0,0,0,0,0,1,1,1
37 | 58744,DISFA/aligned_images/SN013/00665.jpg,0,0,0,0,0,0,0,0
38 | 4770,DISFA/aligned_images/SN001/04771.jpg,0,0,0,0,0,0,0,0
39 | 76696,DISFA/aligned_images/SN024/04082.jpg,0,0,0,0,0,0,0,0
40 | 58564,DISFA/aligned_images/SN013/00485.jpg,0,0,0,0,0,0,0,0
41 | 33376,DISFA/aligned_images/SN027/04367.jpg,0,0,0,0,0,0,1,0
42 | 5892,DISFA/aligned_images/SN002/01051.jpg,0,0,0,0,0,0,1,1
43 | 8528,DISFA/aligned_images/SN002/03687.jpg,0,0,0,0,0,0,0,0
44 | 61669,DISFA/aligned_images/SN013/03590.jpg,0,0,0,0,0,0,0,0
45 | 61053,DISFA/aligned_images/SN013/02974.jpg,0,0,0,0,0,0,0,0
46 | 51877,DISFA/aligned_images/SN011/03488.jpg,0,0,1,0,0,0,0,0
47 | 37813,DISFA/aligned_images/SN030/03959.jpg,0,0,0,0,0,0,0,0
48 | 70919,DISFA/aligned_images/SN021/03150.jpg,0,0,0,0,0,0,0,0
49 | 39109,DISFA/aligned_images/SN032/00410.jpg,0,0,0,0,0,0,1,0
50 | 63849,DISFA/aligned_images/SN018/00925.jpg,0,0,0,0,0,0,1,1
51 | 33529,DISFA/aligned_images/SN027/04520.jpg,0,0,0,0,0,0,1,0
52 | 37141,DISFA/aligned_images/SN030/03287.jpg,0,0,0,0,0,0,0,0
53 | 21058,DISFA/aligned_images/SN016/01739.jpg,0,0,1,0,0,0,0,0
54 | 54420,DISFA/aligned_images/SN012/01186.jpg,0,0,0,0,0,1,1,0
55 | 16314,DISFA/aligned_images/SN010/01839.jpg,0,0,0,0,0,0,0,0
56 | 85719,DISFA/aligned_images/SN031/03415.jpg,0,0,0,0,0,0,0,0
57 | 29490,DISFA/aligned_images/SN027/00481.jpg,0,0,0,0,0,0,0,0
58 | 22962,DISFA/aligned_images/SN016/03643.jpg,0,0,1,0,0,0,0,0
59 | 78984,DISFA/aligned_images/SN028/01525.jpg,0,0,0,0,0,0,0,0
60 | 12477,DISFA/aligned_images/SN009/02847.jpg,0,0,0,0,0,0,0,0
61 | 29819,DISFA/aligned_images/SN027/00810.jpg,0,0,0,0,0,0,1,0
62 | 86642,DISFA/aligned_images/SN031/04338.jpg,0,0,0,0,0,0,0,0
63 | 42037,DISFA/aligned_images/SN032/03338.jpg,0,0,0,0,0,0,0,0
64 | 11694,DISFA/aligned_images/SN009/02064.jpg,0,0,0,0,0,1,1,0
65 | 10708,DISFA/aligned_images/SN009/01022.jpg,0,0,0,0,0,0,0,0
66 | 33698,DISFA/aligned_images/SN027/04689.jpg,0,0,0,0,0,0,1,0
67 | 50957,DISFA/aligned_images/SN011/02568.jpg,0,0,1,0,0,0,0,0
68 | 82536,DISFA/aligned_images/SN031/00232.jpg,0,0,0,1,0,1,1,0
69 | 61407,DISFA/aligned_images/SN013/03328.jpg,0,0,0,0,0,0,0,0
70 | 80941,DISFA/aligned_images/SN028/03482.jpg,0,0,0,0,0,0,0,0
71 | 45365,DISFA/aligned_images/SN006/01821.jpg,0,0,0,0,0,0,0,0
72 | 3870,DISFA/aligned_images/SN001/03871.jpg,0,0,0,0,0,0,0,0
73 | 82753,DISFA/aligned_images/SN031/00449.jpg,0,0,0,0,0,1,1,0
74 | 45836,DISFA/aligned_images/SN006/02292.jpg,0,0,1,0,0,0,0,0
75 | 8554,DISFA/aligned_images/SN002/03713.jpg,0,0,0,0,0,0,0,0
76 | 5326,DISFA/aligned_images/SN002/00482.jpg,0,0,0,0,0,0,0,0
77 | 1155,DISFA/aligned_images/SN001/01156.jpg,0,0,0,0,0,0,0,0
78 | 76987,DISFA/aligned_images/SN024/04373.jpg,0,0,0,0,0,0,0,0
79 | 82262,DISFA/aligned_images/SN028/04803.jpg,0,0,0,0,0,0,1,1
80 | 28857,DISFA/aligned_images/SN026/04693.jpg,0,0,0,0,0,0,0,0
81 | 49036,DISFA/aligned_images/SN011/00647.jpg,0,0,0,1,0,1,1,0
82 | 18532,DISFA/aligned_images/SN010/04057.jpg,0,0,0,0,0,0,0,0
83 | 46157,DISFA/aligned_images/SN006/02613.jpg,0,0,0,0,0,0,0,0
84 | 47205,DISFA/aligned_images/SN006/03661.jpg,0,0,0,0,0,0,0,0
85 | 28001,DISFA/aligned_images/SN026/03837.jpg,0,0,0,0,0,0,0,0
86 | 55245,DISFA/aligned_images/SN012/02011.jpg,0,0,1,0,0,0,1,0
87 | 32627,DISFA/aligned_images/SN027/03618.jpg,0,0,0,0,0,0,0,0
88 | 63602,DISFA/aligned_images/SN018/00678.jpg,0,0,0,0,0,1,1,0
89 | 22157,DISFA/aligned_images/SN016/02838.jpg,0,0,1,0,0,0,0,0
90 | 45469,DISFA/aligned_images/SN006/01925.jpg,0,0,0,0,0,0,0,0
91 | 63338,DISFA/aligned_images/SN018/00414.jpg,0,0,0,0,0,0,0,0
92 | 22476,DISFA/aligned_images/SN016/03157.jpg,0,0,1,0,0,0,0,0
93 | 42298,DISFA/aligned_images/SN032/03599.jpg,0,0,0,0,0,0,0,0
94 | 13856,DISFA/aligned_images/SN009/04226.jpg,0,0,0,0,0,0,0,0
95 | 69566,DISFA/aligned_images/SN021/01797.jpg,0,0,1,0,0,0,0,0
96 | 66418,DISFA/aligned_images/SN018/03494.jpg,0,0,0,0,0,0,0,0
97 | 76512,DISFA/aligned_images/SN024/03898.jpg,0,0,0,0,0,0,0,0
98 | 13922,DISFA/aligned_images/SN009/04292.jpg,0,0,0,0,0,0,0,0
99 | 19951,DISFA/aligned_images/SN016/00632.jpg,0,0,1,0,0,0,0,0
100 | 51447,DISFA/aligned_images/SN011/03058.jpg,0,0,1,0,0,0,0,0
101 | 15871,DISFA/aligned_images/SN010/01396.jpg,0,0,0,0,0,0,0,0
102 |
--------------------------------------------------------------------------------
/pipeline.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ihp-lab/FG-Net/e1e61c4e2576f192312fda6571b08547d9ab2c24/pipeline.png
--------------------------------------------------------------------------------
/test_image.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ihp-lab/FG-Net/e1e61c4e2576f192312fda6571b08547d9ab2c24/test_image.jpg
--------------------------------------------------------------------------------
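Note: the DISFA label CSVs listed above all share the header `,image_path,au1,au2,au4,au6,au9,au12,au25,au26` (an unnamed index column, a relative image path, and one binary column per action unit). The following is a minimal, hypothetical sketch of how such a file could be read; it assumes pandas, and `load_au_labels`, `AU_COLUMNS`, and `data_root` are illustrative names that are not part of this repository.

```python
import pandas as pd

# Illustrative column list matching the CSV header: ,image_path,au1,...,au26
AU_COLUMNS = ["au1", "au2", "au4", "au6", "au9", "au12", "au25", "au26"]

def load_au_labels(csv_path, data_root="data"):
    """Read one fold's label file; return image paths and an (N, 8) binary label matrix."""
    df = pd.read_csv(csv_path, index_col=0)  # the first, unnamed column is the original frame index
    image_paths = [f"{data_root}/{p}" for p in df["image_path"]]
    labels = df[AU_COLUMNS].to_numpy(dtype="float32")
    return image_paths, labels

# Example usage, relative to the repository root shown in this listing:
# paths, labels = load_au_labels("data/DISFA/labels/1/train_100.csv")
```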