├── README.md
├── annotation.py
├── annotations
│   └── emty
├── celebrityDB
│   └── images here
├── data
│   ├── label.pbtxt
│   └── ssd_mobilenet_v1_coco.config
├── generate_tfrecords.py
├── it5.jpg
├── lbpcascade_frontalface_improved.xml
├── split_data.ipynb
├── ssd_mobilenet_v1_coco_2018_01_28
│   └── ssd.md
└── xml_to_csv.py
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# Face Recognition with the TensorFlow Object Detection API
![Screenshot](it5.jpg)

### Note: The TensorFlow Object Detection API is an accurate machine learning API capable of localizing and identifying multiple objects in a single image. You can use it for many use cases, such as object detection, person recognition, text detection, etc.
### [This tutorial requires the TensorFlow Object Detection API to be installed](https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/installation.md)
Today we will see how TensorFlow can recognize people. In this post I'll outline the steps I took to go from a collection of celebrity images (crawled from the internet) to a trained face recognition model:

1. Data Set Download
2. Image Annotation
3. XML to CSV
4. TF-Record Creation
5. Label Map Preparation
6. Pipeline Configuration
7. Training
8. Exporting the Graph


## 1. Data Set Download

You can crawl celebrity pictures from Google Images if you don't have a ready-made data set. Organize the data set as below:

```

CelebrityDB/
  TomHanks/
    img001.jpg
    tomhanks.jpg
    ...
  WillSmith/
    willsmith1.jpg
    will-smith-pic.jpg
    ...
  ...

```
## 2. Image Annotation
You can annotate the images with an annotation tool like labelImg, but that takes a lot of time. That's why I created a script that generates the XML files (in PASCAL VOC format). I used OpenCV to detect faces, but you can swap in any other tool (I recommend dlib or a neural-network face detector, which are much more accurate than OpenCV; see the sketch after the example below).
#### Use the [Python annotation script to generate XML annotations](annotation.py)
The XML files should look like this:
```xml
<annotation verified="yes">
  <folder>celebrityDB</folder>
  <filename>1d9k49.jpeg</filename>
  <path>/media/emna/datapartition/tutos/celebrityDB/dewayneJohnson/1d9k49.jpeg</path>
  <source>
    <database>Emna Amor</database>
  </source>
  <size>
    <width>104</width>
    <height>142</height>
    <depth>3</depth>
  </size>
  <segmented>0</segmented>
  <object>
    <name>dewayneJohnson</name>
    <pose>Unspecified</pose>
    <truncated>0</truncated>
    <difficult>0</difficult>
    <bndbox>
      <xmin>2</xmin>
      <ymin>34</ymin>
      <xmax>133</xmax>
      <ymax>99</ymax>
    </bndbox>
  </object>
</annotation>
```
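If you want to try dlib instead of the OpenCV LBP cascade, a minimal sketch of the swap might look like the following. This is an illustration, not part of the repo: it assumes `dlib` is installed, and `detect_faces_dlib` is a hypothetical helper name. It returns boxes in the same `(x, y, w, h)` format that `detectMultiScale` produces, so the same XML-writing code can consume them:
```python
# Sketch only: dlib-based face detection as a drop-in for the OpenCV cascade.
# Assumes `pip install dlib`; `detect_faces_dlib` is a hypothetical helper name.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()

def detect_faces_dlib(image_path):
    """Return face boxes as (x, y, w, h) tuples, like cv2.detectMultiScale."""
    img = cv2.imread(image_path)
    if img is None:
        return []
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    rects = detector(gray, 1)  # upsample once to catch smaller faces
    return [(r.left(), r.top(), r.width(), r.height()) for r in rects]
```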
## 3. XML to CSV
After annotating the pictures, we need to generate a CSV file containing the details and class of every picture.
#### Use the [Python XML-to-CSV script to generate the CSV file](xml_to_csv.py)
The CSV file should look like this:


| filename | width | height | class | xmin | ymin | xmax | ymax |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1jth1461.jpeg | 76 | 105 | kimKardashian | 10 | 30 | 89 | 59 |
| wenn33850496.jpg | 470 | 654 | AndySerkis | 52 | 142 | 494 | 352 |

Then we need to split the data into train and test sets using [this Python notebook](split_data.ipynb).

### 4. TF-Record Creation
To train our model, we need to convert the data to TensorFlow's file format, TFRecord. Most batch operations aren't done directly on image files; instead the images (as numpy arrays) and the labels (as a list of strings) are packed into a single TFRecord file.
#### WHAT IS A TFRECORD?
```
“… TFRecord is an approach that converts whatever data you have into a supported format. This approach makes it easier to mix and match data sets and network architectures. The recommended format for TensorFlow is a TFRecords file containing tf.train.Example protocol buffers (which contain Features as a field).“
```
Use this [Python script](generate_tfrecords.py) to generate the TFRecord files (train.record and test.record):
```
Usage:
# Create train data:
python generate_tfrecords.py --csv_input=train.csv --output_path=data/train.record

# Create test data:
python generate_tfrecords.py --csv_input=test.csv --output_path=data/test.record
```
###### Note: do not forget to edit generate_tfrecords.py with your own labels.

### 5. Label Map Preparation
Use the same name-to-id mapping as in class_text_to_int inside the generate_tfrecords.py script; otherwise detections will be reported with the wrong names.
###### Note: label map ids must start at 1; the id 0 is reserved!
```
item {
  id: 1
  name: 'AndySerkis'
}

item {
  id: 2
  name: 'dewayneJohnson'
}

item {
  id: 3
  name: 'drake'
}

item {
  id: 4
  name: 'jayZ'
}

item {
  id: 5
  name: 'justinBieber'
}

item {
  id: 6
  name: 'kimKardashian'
}

item {
  id: 7
  name: 'tomHanks'
}

item {
  id: 8
  name: 'willSmith'
}

```
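Because a mismatch between label.pbtxt and class_text_to_int silently produces wrongly named detections, it is worth a quick sanity check. Here is a minimal sketch, assuming the Object Detection API's `object_detection` package is importable; the expected dictionary is copied from generate_tfrecords.py:
```python
# Sketch only: check that data/label.pbtxt agrees with class_text_to_int()
# in generate_tfrecords.py. Assumes the object_detection package is installed.
from object_detection.utils import label_map_util

# Copied from class_text_to_int() in generate_tfrecords.py
expected = {
    'AndySerkis': 1, 'dewayneJohnson': 2, 'drake': 3, 'jayZ': 4,
    'justinBieber': 5, 'kimKardashian': 6, 'tomHanks': 7, 'willSmith': 8,
}

label_map = label_map_util.get_label_map_dict('data/label.pbtxt')  # {name: id}
assert label_map == expected, 'label.pbtxt and generate_tfrecords.py disagree: %s' % label_map
print('Label map OK: %d classes' % len(label_map))
```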
### 6. Pipeline Configuration
We will use [ssd_mobilenet_v1_coco](http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v1_coco_2018_01_28.tar.gz) to train our face recognition model.
#### Do not forget to edit [the ssd_mobilenet_v1_coco.config file](data/ssd_mobilenet_v1_coco.config): set num_classes to the number of labels in your label map (8 here), point fine_tune_checkpoint at the model.ckpt extracted from ssd_mobilenet_v1_coco_2018_01_28, and set the train record path, the test record path and the label.pbtxt path.

### 7. Training

```
# From the tensorflow/models/research/ directory

PIPELINE_CONFIG_PATH={path to pipeline config file}/data/ssd_mobilenet_v1_coco.config
MODEL_DIR={path to model directory}/ssd_mobilenet_v1_coco_2018_01_28
NUM_TRAIN_STEPS=50000
SAMPLE_1_OF_N_EVAL_EXAMPLES=1
python object_detection/model_main.py \
    --pipeline_config_path=${PIPELINE_CONFIG_PATH} \
    --model_dir=${MODEL_DIR} \
    --num_train_steps=${NUM_TRAIN_STEPS} \
    --sample_1_of_n_eval_examples=$SAMPLE_1_OF_N_EVAL_EXAMPLES \
    --alsologtostderr

# To keep training indefinitely, drop the --num_train_steps flag.

```
### 8. Exporting the Graph
```

# From tensorflow/models/research/
INPUT_TYPE=image_tensor
PIPELINE_CONFIG_PATH={path to pipeline config file}/data/ssd_mobilenet_v1_coco.config
TRAINED_CKPT_PREFIX={path to model directory}/ssd_mobilenet_v1_coco_2018_01_28/model.ckpt-{checkpoint number}
EXPORT_DIR={path to folder that will be used for export}
python object_detection/export_inference_graph.py \
    --input_type=${INPUT_TYPE} \
    --pipeline_config_path=${PIPELINE_CONFIG_PATH} \
    --trained_checkpoint_prefix=${TRAINED_CKPT_PREFIX} \
    --output_directory=${EXPORT_DIR}
```
You can use the object detection notebook to make predictions (replace the model download section with your own model path), or use a minimal script like the sketch below.
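The following is a minimal inference sketch (TensorFlow 1.x), not part of the repo: it assumes the export step above produced `frozen_inference_graph.pb` inside your EXPORT_DIR, and the paths and the 0.5 score threshold are placeholders you should adapt:
```python
# Sketch only: run the exported frozen graph on one image (TF 1.x).
# GRAPH_PATH and IMAGE_PATH are placeholders; adjust them to your setup.
import cv2
import numpy as np
import tensorflow as tf

GRAPH_PATH = '{path to EXPORT_DIR}/frozen_inference_graph.pb'
IMAGE_PATH = 'it5.jpg'

# Load the frozen graph exported by export_inference_graph.py
graph = tf.Graph()
with graph.as_default():
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(GRAPH_PATH, 'rb') as f:
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

# The model expects a batch of RGB images
image = cv2.cvtColor(cv2.imread(IMAGE_PATH), cv2.COLOR_BGR2RGB)

with tf.Session(graph=graph) as sess:
    boxes, scores, classes, num = sess.run(
        ['detection_boxes:0', 'detection_scores:0',
         'detection_classes:0', 'num_detections:0'],
        feed_dict={'image_tensor:0': np.expand_dims(image, 0)})

# Boxes are [ymin, xmin, ymax, xmax], normalized to [0, 1]
for box, score, cls in zip(boxes[0], scores[0], classes[0]):
    if score > 0.5:
        print(int(cls), round(float(score), 3), box)
```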
--------------------------------------------------------------------------------
/annotation.py:
--------------------------------------------------------------------------------
import xml.etree.cElementTree as ET
import os
import cv2
import numpy as np
from PIL import Image

# The data set folder holds one sub-directory of images per celebrity
path, filename = os.path.split(os.getcwd() + '/celebrityDB')
face_cascade = cv2.CascadeClassifier('lbpcascade_frontalface_improved.xml')

for subdir, dirs, files in os.walk(os.path.join(os.getcwd(), 'celebrityDB')):
    for file in files:

        img_path = os.path.join(subdir, file)
        img_name = os.path.basename(img_path)

        if img_path.lower().endswith(('.png', '.jpg', '.jpeg')):

            img = cv2.imread(img_path)
            if img is None:
                continue
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            h, w, bpp = np.shape(img)
            imgp = Image.open(img_path)
            # detectMultiScale returns one (x, y, w, h) box per detected face
            faces = face_cascade.detectMultiScale(gray, 1.3, 5)

            # Only keep images where exactly one face was found
            if len(faces) == 1:
                try:

                    for bounding_boxes in faces:
                        x, y, bw, bh = [int(v) for v in bounding_boxes]
                        face = img[y:y + bh, x:x + bw]

                        subdir_path, subdir_name = os.path.split(subdir)

                        root = ET.Element("annotation", verified="yes")
                        ET.SubElement(root, "folder").text = filename

                        ET.SubElement(root, "filename").text = img_name
                        ET.SubElement(root, "path").text = img_path

                        source = ET.SubElement(root, "source")
                        ET.SubElement(source, "database").text = "Emna Amor"

                        size = ET.SubElement(root, "size")
                        ET.SubElement(size, "width").text = str(w)
                        ET.SubElement(size, "height").text = str(h)
                        ET.SubElement(size, "depth").text = str(bpp)

                        ET.SubElement(root, "segmented").text = "0"

                        # The class name is the name of the celebrity folder
                        obj = ET.SubElement(root, "object")
                        ET.SubElement(obj, "name").text = subdir_name
                        ET.SubElement(obj, "pose").text = "Unspecified"
                        ET.SubElement(obj, "truncated").text = "0"
                        ET.SubElement(obj, "difficult").text = "0"

                        # Convert (x, y, w, h) to PASCAL VOC corner coordinates
                        box = ET.SubElement(obj, "bndbox")
                        ET.SubElement(box, "xmin").text = str(x)
                        ET.SubElement(box, "ymin").text = str(y)
                        ET.SubElement(box, "xmax").text = str(x + bw)
                        ET.SubElement(box, "ymax").text = str(y + bh)

                        tree = ET.ElementTree(root)
                        tree.write(os.path.join(os.getcwd() + '/annotations',
                                                os.path.splitext(img_name)[0] + '.xml'))
                except RuntimeError:
                    # Keep a list of images that could not be annotated
                    with open("delete.txt", "a") as myfile:
                        myfile.write(img_path + "\n")

            else:
                # No face (or more than one face) detected: flag the image for removal
                with open("delete.txt", "a") as myfile:
                    myfile.write(img_path + "\n")
--------------------------------------------------------------------------------
/annotations/emty:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/EmnamoR/Face-recognition-Tensorflow-object-detection-api/bf0a8563252a30106ec763310bc1fd41bcae4167/annotations/emty
--------------------------------------------------------------------------------
/celebrityDB/images here:
--------------------------------------------------------------------------------
images in this folder

--------------------------------------------------------------------------------
/data/label.pbtxt:
--------------------------------------------------------------------------------
item {
  id: 1
  name: 'AndySerkis'
}

item {
  id: 2
  name: 'dewayneJohnson'
}

item {
  id: 3
  name: 'drake'
}

item {
  id: 4
  name: 'jayZ'
}

item {
  id: 5
  name: 'justinBieber'
}

item {
  id: 6
  name: 'kimKardashian'
}

item {
  id: 7
  name: 'tomHanks'
}

item {
  id: 8
  name: 'willSmith'
}
--------------------------------------------------------------------------------
/data/ssd_mobilenet_v1_coco.config:
--------------------------------------------------------------------------------
# SSD with Mobilenet v1 configuration for MSCOCO Dataset.
# Users should configure the fine_tune_checkpoint field in the train config as
# well as the label_map_path and input_path fields in the train_input_reader and
# eval_input_reader. Search for "PATH_TO_BE_CONFIGURED" to find the fields that
# should be configured.
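# Note: num_classes below must match the number of entries in the label map
# (8 for the celebrity label map in data/label.pbtxt). Also set
# fine_tune_checkpoint to the model.ckpt extracted from
# ssd_mobilenet_v1_coco_2018_01_28, and fill in the train/eval input_path and
# label_map_path fields before training.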
6 | 7 | model { 8 | ssd { 9 | num_classes: 9 10 | box_coder { 11 | faster_rcnn_box_coder { 12 | y_scale: 10.0 13 | x_scale: 10.0 14 | height_scale: 5.0 15 | width_scale: 5.0 16 | } 17 | } 18 | matcher { 19 | argmax_matcher { 20 | matched_threshold: 0.5 21 | unmatched_threshold: 0.5 22 | ignore_thresholds: false 23 | negatives_lower_than_unmatched: true 24 | force_match_for_each_row: true 25 | } 26 | } 27 | similarity_calculator { 28 | iou_similarity { 29 | } 30 | } 31 | anchor_generator { 32 | ssd_anchor_generator { 33 | num_layers: 6 34 | min_scale: 0.2 35 | max_scale: 0.95 36 | aspect_ratios: 1.0 37 | aspect_ratios: 2.0 38 | aspect_ratios: 0.5 39 | aspect_ratios: 3.0 40 | aspect_ratios: 0.3333 41 | } 42 | } 43 | image_resizer { 44 | fixed_shape_resizer { 45 | height: 300 46 | width: 300 47 | } 48 | } 49 | box_predictor { 50 | convolutional_box_predictor { 51 | min_depth: 0 52 | max_depth: 0 53 | num_layers_before_predictor: 0 54 | use_dropout: false 55 | dropout_keep_probability: 0.8 56 | kernel_size: 1 57 | box_code_size: 4 58 | apply_sigmoid_to_scores: false 59 | conv_hyperparams { 60 | activation: RELU_6, 61 | regularizer { 62 | l2_regularizer { 63 | weight: 0.00004 64 | } 65 | } 66 | initializer { 67 | truncated_normal_initializer { 68 | stddev: 0.03 69 | mean: 0.0 70 | } 71 | } 72 | batch_norm { 73 | train: true, 74 | scale: true, 75 | center: true, 76 | decay: 0.9997, 77 | epsilon: 0.001, 78 | } 79 | } 80 | } 81 | } 82 | feature_extractor { 83 | type: 'ssd_mobilenet_v1' 84 | min_depth: 16 85 | depth_multiplier: 1.0 86 | conv_hyperparams { 87 | activation: RELU_6, 88 | regularizer { 89 | l2_regularizer { 90 | weight: 0.00004 91 | } 92 | } 93 | initializer { 94 | truncated_normal_initializer { 95 | stddev: 0.03 96 | mean: 0.0 97 | } 98 | } 99 | batch_norm { 100 | train: true, 101 | scale: true, 102 | center: true, 103 | decay: 0.9997, 104 | epsilon: 0.001, 105 | } 106 | } 107 | } 108 | loss { 109 | classification_loss { 110 | weighted_sigmoid { 111 | } 112 | } 113 | localization_loss { 114 | weighted_smooth_l1 { 115 | } 116 | } 117 | hard_example_miner { 118 | num_hard_examples: 3000 119 | iou_threshold: 0.99 120 | loss_type: CLASSIFICATION 121 | max_negatives_per_positive: 3 122 | min_negatives_per_image: 0 123 | } 124 | classification_weight: 1.0 125 | localization_weight: 1.0 126 | } 127 | normalize_loss_by_num_matches: true 128 | post_processing { 129 | batch_non_max_suppression { 130 | score_threshold: 1e-8 131 | iou_threshold: 0.6 132 | max_detections_per_class: 100 133 | max_total_detections: 100 134 | } 135 | score_converter: SIGMOID 136 | } 137 | } 138 | } 139 | 140 | train_config: { 141 | batch_size: 24 142 | optimizer { 143 | rms_prop_optimizer: { 144 | learning_rate: { 145 | exponential_decay_learning_rate { 146 | initial_learning_rate: 0.004 147 | decay_steps: 800720 148 | decay_factor: 0.95 149 | } 150 | } 151 | momentum_optimizer_value: 0.9 152 | decay: 0.9 153 | epsilon: 1.0 154 | } 155 | } 156 | fine_tune_checkpoint: "PATH_TO_BE_CONFIGURED(the ssd folder we just downloaded)/model.ckpt" 157 | from_detection_checkpoint: true 158 | # Note: The below line limits the training process to 200K steps, which we 159 | # empirically found to be sufficient enough to train the pets dataset. This 160 | # effectively bypasses the learning rate schedule (the learning rate will 161 | # never decay). Remove the below line to train indefinitely. 
162 | num_steps: 200000 163 | data_augmentation_options { 164 | random_horizontal_flip { 165 | } 166 | } 167 | data_augmentation_options { 168 | ssd_random_crop { 169 | } 170 | } 171 | } 172 | 173 | train_input_reader: { 174 | tf_record_input_reader { 175 | input_path: "PATH_TO_BE_CONFIGURED/train.record" 176 | } 177 | label_map_path: "PATH_TO_BE_CONFIGURED/label.pbtxt" 178 | } 179 | 180 | eval_config: { 181 | num_examples: 8000 182 | # Note: The below line limits the evaluation process to 10 evaluations. 183 | # Remove the below line to evaluate indefinitely. 184 | max_evals: 10 185 | } 186 | 187 | eval_input_reader: { 188 | tf_record_input_reader { 189 | input_path: "PATH_TO_BE_CONFIGURED/test.record" 190 | } 191 | label_map_path: "PATH_TO_BE_CONFIGURED/label.pbtxt" 192 | shuffle: false 193 | num_readers: 1 194 | } 195 | -------------------------------------------------------------------------------- /generate_tfrecords.py: -------------------------------------------------------------------------------- 1 | """ 2 | Usage: 3 | # From tensorflow/models/ 4 | # Create train data: 5 | python generate_tfrecord.py --csv_input=train.csv --output_path=data/train.record 6 | 7 | # Create test data: 8 | python generate_tfrecord.py --csv_input=test.csv --output_path=data/test.record 9 | """ 10 | from __future__ import division 11 | from __future__ import print_function 12 | from __future__ import absolute_import 13 | 14 | import os 15 | import io 16 | import pandas as pd 17 | import tensorflow as tf 18 | 19 | from PIL import Image 20 | from models.research.object_detection.utils import dataset_util 21 | from collections import namedtuple, OrderedDict 22 | 23 | flags = tf.app.flags 24 | flags.DEFINE_string('csv_input', '', 'Path to the CSV input') 25 | flags.DEFINE_string('output_path', '', 'Path to output TFRecord') 26 | FLAGS = flags.FLAGS 27 | 28 | 29 | # TO-DO replace this with label map 30 | def class_text_to_int(row_label): 31 | if row_label == 'AndySerkis': 32 | return 1 33 | elif row_label == 'dewayneJohnson': 34 | return 2 35 | elif row_label == 'drake': 36 | return 3 37 | elif row_label == 'jayZ': 38 | return 4 39 | elif row_label == 'justinBieber': 40 | return 5 41 | elif row_label == 'kimKardashian': 42 | return 6 43 | elif row_label == 'tomHanks': 44 | return 7 45 | elif row_label == 'willSmith': 46 | return 8 47 | else: 48 | None 49 | 50 | 51 | def split(df, group): 52 | data = namedtuple('data', ['filename', 'object']) 53 | gb = df.groupby(group) 54 | return [data(filename, gb.get_group(x)) for filename, x in zip(gb.groups.keys(), gb.groups)] 55 | 56 | 57 | def create_tf_example(group, path): 58 | #print(os.path.join(path, '{}'.format(group.filename))) 59 | with tf.gfile.GFile(os.path.join(path, '{}'.format(group.filename)), 'rb') as fid: 60 | encoded_jpg = fid.read() 61 | encoded_jpg_io = io.BytesIO(encoded_jpg) 62 | image = Image.open(encoded_jpg_io) 63 | width, height = image.size 64 | 65 | filename = group.filename.encode('utf8') 66 | image_format = b'jpg' 67 | xmins = [] 68 | xmaxs = [] 69 | ymins = [] 70 | ymaxs = [] 71 | classes_text = [] 72 | classes = [] 73 | 74 | for index, row in group.object.iterrows(): 75 | xmins.append(row['xmin'] / width) 76 | xmaxs.append(row['xmax'] / width) 77 | ymins.append(row['ymin'] / height) 78 | ymaxs.append(row['ymax'] / height) 79 | classes_text.append(row['class'].encode('utf8')) 80 | classes.append(class_text_to_int(row['class'])) 81 | 82 | tf_example = tf.train.Example(features=tf.train.Features(feature={ 83 | 'image/height': 
dataset_util.int64_feature(height), 84 | 'image/width': dataset_util.int64_feature(width), 85 | 'image/filename': dataset_util.bytes_feature(filename), 86 | 'image/source_id': dataset_util.bytes_feature(filename), 87 | 'image/encoded': dataset_util.bytes_feature(encoded_jpg), 88 | 'image/format': dataset_util.bytes_feature(image_format), 89 | 'image/object/bbox/xmin': dataset_util.float_list_feature(xmins), 90 | 'image/object/bbox/xmax': dataset_util.float_list_feature(xmaxs), 91 | 'image/object/bbox/ymin': dataset_util.float_list_feature(ymins), 92 | 'image/object/bbox/ymax': dataset_util.float_list_feature(ymaxs), 93 | 'image/object/class/text': dataset_util.bytes_list_feature(classes_text), 94 | 'image/object/class/label': dataset_util.int64_list_feature(classes), 95 | })) 96 | return tf_example 97 | 98 | 99 | def main(_): 100 | writer = tf.python_io.TFRecordWriter(FLAGS.output_path) 101 | path = os.path.join(os.getcwd(), 'images') 102 | examples = pd.read_csv(FLAGS.csv_input) 103 | grouped = split(examples, 'filename') 104 | grouped2 = split(examples, 'class') 105 | for dirpath,dirs,filenames in os.walk(path): 106 | for i, val in enumerate(dirs): 107 | #print(dirs[i]) 108 | #print((i,val)) 109 | 110 | for d in dirs : 111 | path2=os.path.join(dirpath,d) 112 | for group in grouped: 113 | 114 | 115 | name=group[0].split('-') 116 | if name[0] == d : 117 | tf_example = create_tf_example(group, path2) 118 | t=path2+'/'+group[0] 119 | print(os.path.join(group,t)) 120 | writer.write(tf_example.SerializeToString()) 121 | 122 | writer.close() 123 | output_path = os.path.join(os.getcwd(), FLAGS.output_path) 124 | print('Successfully created the TFRecords: {}'.format(output_path)) 125 | 126 | if __name__ == '__main__': 127 | tf.app.run() 128 | -------------------------------------------------------------------------------- /it5.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/EmnamoR/Face-recognition-Tensorflow-object-detection-api/bf0a8563252a30106ec763310bc1fd41bcae4167/it5.jpg -------------------------------------------------------------------------------- /lbpcascade_frontalface_improved.xml: -------------------------------------------------------------------------------- 1 | 2 | 66 | 67 | 68 | 69 | BOOST 70 | LBP 71 | 45 72 | 45 73 | 74 | GAB 75 | 9.9500000476837158e-001 76 | 5.0000000000000000e-001 77 | 9.4999999999999996e-001 78 | 1 79 | 100 80 | 81 | 256 82 | 1 83 | 19 84 | 85 | 86 | <_> 87 | 6 88 | -4.1617846488952637e+000 89 | 90 | <_> 91 | 92 | 0 -1 26 -1 -1 -17409 -1 -1 -1 -1 -1 93 | 94 | -9.9726462364196777e-001 -3.8938775658607483e-001 95 | <_> 96 | 97 | 0 -1 18 -1 -1 -21569 -20545 -1 -1 -20545 -1 98 | 99 | -9.8648911714553833e-001 -2.5386649370193481e-001 100 | <_> 101 | 102 | 0 -1 30 -21569 -16449 1006578219 -20801 -16449 -1 -21585 -1 103 | 104 | -9.6436238288879395e-001 -1.4039695262908936e-001 105 | <_> 106 | 107 | 0 -1 54 -1 -1 -16402 -4370 -1 -1 -1053010 -4456466 108 | 109 | -8.4081345796585083e-001 3.8321062922477722e-001 110 | <_> 111 | 112 | 0 -1 29 -184747280 -705314819 1326353 1364574079 -131073 -5 113 | 2147481147 -1 114 | 115 | -8.1084597110748291e-001 4.3495711684226990e-001 116 | <_> 117 | 118 | 0 -1 89 -142618625 -4097 -37269 -20933 872350430 -268476417 119 | 1207894255 2139032115 120 | 121 | -7.3140043020248413e-001 4.3799084424972534e-001 122 | 123 | <_> 124 | 6 125 | -4.0652265548706055e+000 126 | 127 | <_> 128 | 129 | 0 -1 19 -1 -1 -17409 -1 -1 -1 -1 -1 130 | 131 | -9.9727255105972290e-001 
-7.2050148248672485e-001 132 | <_> 133 | 134 | 0 -1 38 -1 1073741823 -1 -1 -1 -1 -1 -1 135 | 136 | -9.8717331886291504e-001 -5.3031939268112183e-001 137 | <_> 138 | 139 | 0 -1 28 -16385 -1 -21569 -20545 -1 -1 -21569 -1 140 | 141 | -9.3442338705062866e-001 6.5213099122047424e-002 142 | <_> 143 | 144 | 0 -1 112 -2097153 -1 -1 -1 -1 -8193 -1 -35467 145 | 146 | -7.9567342996597290e-001 4.2883640527725220e-001 147 | <_> 148 | 149 | 0 -1 48 -134239573 -16465 58663467 -1079022929 -1073758273 150 | -81937 -8412501 -404766817 151 | 152 | -7.1264797449111938e-001 4.1050794720649719e-001 153 | <_> 154 | 155 | 0 -1 66 -17047555 -1099008003 2147479551 -1090584581 -69633 156 | -1342177281 -1090650121 -1472692240 157 | 158 | -7.6119172573089600e-001 4.2042696475982666e-001 159 | 160 | <_> 161 | 7 162 | -4.6904473304748535e+000 163 | 164 | <_> 165 | 166 | 0 -1 12 -1 -1 -17409 -1 -1 -1 -1 -1 167 | 168 | -9.9725550413131714e-001 -8.3142280578613281e-001 169 | <_> 170 | 171 | 0 -1 31 -1 -168429569 -1 -1 -1 -1 -1 -1 172 | 173 | -9.8183268308639526e-001 -3.6373397707939148e-001 174 | <_> 175 | 176 | 0 -1 38 -1 1073741759 -1 -1 -1 -1 -1 -1 177 | 178 | -9.1890293359756470e-001 7.8322596848011017e-002 179 | <_> 180 | 181 | 0 -1 27 -17409 -2097153 -134372726 -21873 -65 -536870913 182 | -161109 -4215889 183 | 184 | -8.0752444267272949e-001 1.9565649330615997e-001 185 | <_> 186 | 187 | 0 -1 46 -469779457 -286371842 -33619971 -212993 -1 -41943049 188 | -134217731 -1346863620 189 | 190 | -6.9232726097106934e-001 3.8141927123069763e-001 191 | <_> 192 | 193 | 0 -1 125 -1896950780 -1964839052 -9 707723004 -34078727 194 | -1074266122 -536872969 -262145 195 | 196 | -8.1760478019714355e-001 3.4172961115837097e-001 197 | <_> 198 | 199 | 0 -1 80 -402657501 654311423 -419533278 -452984853 200 | 1979676215 -1208090625 -167772569 -524289 201 | 202 | -6.3433408737182617e-001 4.3154156208038330e-001 203 | 204 | <_> 205 | 8 206 | -4.2590322494506836e+000 207 | 208 | <_> 209 | 210 | 0 -1 42 -1 -655361 -1 -1 -1 -1 -1 -1 211 | 212 | -9.9715477228164673e-001 -8.6178696155548096e-001 213 | <_> 214 | 215 | 0 -1 40 -1 -705300491 -1 -1 -1 -1 -1 -1 216 | 217 | -9.8356908559799194e-001 -5.7423096895217896e-001 218 | <_> 219 | 220 | 0 -1 43 -65 872413111 -2049 -1 -1 -1 -1 -1 221 | 222 | -9.2525935173034668e-001 -1.3835857808589935e-001 223 | <_> 224 | 225 | 0 -1 111 -1 -5242881 -1 -524289 -4194305 -1 -1 -43148 226 | 227 | -7.8076487779617310e-001 1.8362471461296082e-001 228 | <_> 229 | 230 | 0 -1 25 -145227841 868203194 -1627394049 935050171 231 | 2147483647 1006600191 -268439637 1002437615 232 | 233 | -7.2554033994674683e-001 3.3393219113349915e-001 234 | <_> 235 | 236 | 0 -1 116 -214961408 50592514 -2128 1072162674 -1077940293 237 | -1084489966 -134219854 -1074790401 238 | 239 | -6.1547595262527466e-001 3.9214438199996948e-001 240 | <_> 241 | 242 | 0 -1 3 -294987948 -1124421633 -73729 -268435841 -33654928 243 | 2122317823 -268599297 -33554945 244 | 245 | -6.4863425493240356e-001 3.8784855604171753e-001 246 | <_> 247 | 248 | 0 -1 22 -525585 -26738821 -17895690 1123482236 1996455758 249 | -8519849 -252182980 -461898753 250 | 251 | -5.5464369058609009e-001 4.4275921583175659e-001 252 | 253 | <_> 254 | 8 255 | -4.0009465217590332e+000 256 | 257 | <_> 258 | 259 | 0 -1 82 -1 -1 -1 -1 -33685505 -1 -1 -1 260 | 261 | -9.9707120656967163e-001 -8.9196771383285522e-001 262 | <_> 263 | 264 | 0 -1 84 -1 -1 -1 -1 2147446783 -1 -1 -1 265 | 266 | -9.8670446872711182e-001 -7.5064390897750854e-001 267 | <_> 268 | 269 | 0 -1 79 -1 -1 -262145 -1 -252379137 -1 
-1 -1 270 | 271 | -8.9446705579757690e-001 7.0268943905830383e-002 272 | <_> 273 | 274 | 0 -1 61 -1 -8201 -1 -2097153 -16777217 -513 -16777217 275 | -1162149889 276 | 277 | -7.2166109085083008e-001 2.9786801338195801e-001 278 | <_> 279 | 280 | 0 -1 30 -21569 -1069121 1006578211 -134238545 -16450 281 | -268599297 -21617 -14680097 282 | 283 | -6.2449234724044800e-001 3.8551881909370422e-001 284 | <_> 285 | 286 | 0 -1 75 -268701913 -1999962377 1995165474 -453316822 287 | 1744684853 -2063597697 -134226057 -50336769 288 | 289 | -5.5207914113998413e-001 4.2211884260177612e-001 290 | <_> 291 | 292 | 0 -1 21 -352321825 -526489 -420020626 -486605074 1155483470 293 | -110104705 -587840772 -25428801 294 | 295 | -5.3324747085571289e-001 4.4535955786705017e-001 296 | <_> 297 | 298 | 0 -1 103 70270772 2012790229 -16810020 -245764 -1208090635 299 | -753667 -1073741828 -1363662420 300 | 301 | -6.4402890205383301e-001 3.8995954394340515e-001 302 | 303 | <_> 304 | 8 305 | -4.6897511482238770e+000 306 | 307 | <_> 308 | 309 | 0 -1 97 -1 -1 -1 -1 -524289 -524289 -1 -1 310 | 311 | -9.9684870243072510e-001 -8.8232177495956421e-001 312 | <_> 313 | 314 | 0 -1 84 -1 -1 -1 -1 2147438591 -1 -1 -1 315 | 316 | -9.8677414655685425e-001 -7.8965580463409424e-001 317 | <_> 318 | 319 | 0 -1 113 -1 -1 -1 -1 -1048577 -262149 -1048577 -35339 320 | 321 | -9.2621946334838867e-001 -2.9984828829765320e-001 322 | <_> 323 | 324 | 0 -1 33 -2249 867434291 -32769 -33562753 -1 -1073758209 325 | -4165 -1 326 | 327 | -7.2429555654525757e-001 2.2348840534687042e-001 328 | <_> 329 | 330 | 0 -1 98 1659068671 -142606337 587132538 -67108993 577718271 331 | -294921 -134479873 -129 332 | 333 | -5.5495566129684448e-001 3.5419258475303650e-001 334 | <_> 335 | 336 | 0 -1 100 -268441813 788267007 -286265494 -486576145 -8920251 337 | 2138505075 -151652570 -2050 338 | 339 | -5.3362584114074707e-001 3.9479774236679077e-001 340 | <_> 341 | 342 | 0 -1 51 -1368387212 -537102978 -98305 -163843 1065109500 343 | -16777217 -67321939 -1141359619 344 | 345 | -5.6162708997726440e-001 3.8008108735084534e-001 346 | <_> 347 | 348 | 0 -1 127 -268435550 1781120906 -251658720 -143130698 349 | -1048605 -1887436825 1979700688 -1008730125 350 | 351 | -5.1167154312133789e-001 4.0678605437278748e-001 352 | 353 | <_> 354 | 10 355 | -4.2179841995239258e+000 356 | 357 | <_> 358 | 359 | 0 -1 97 -1 -1 -1 -1 -524289 -524289 -1 -1 360 | 361 | -9.9685418605804443e-001 -8.8037383556365967e-001 362 | <_> 363 | 364 | 0 -1 90 -1 -1 -1 -1 -8912897 -524297 -8912897 -1 365 | 366 | -9.7972750663757324e-001 -5.7626229524612427e-001 367 | <_> 368 | 369 | 0 -1 96 -1 -1 -1 -1 -1 -65 -1 -2249 370 | 371 | -9.0239793062210083e-001 -1.7454113066196442e-001 372 | <_> 373 | 374 | 0 -1 71 -1 -4097 -1 -513 -16777217 -268468483 -16797697 375 | -1430589697 376 | 377 | -7.4346423149108887e-001 9.4165161252021790e-002 378 | <_> 379 | 380 | 0 -1 37 1364588304 -581845274 -536936460 -3 -308936705 381 | -1074331649 -4196865 -134225953 382 | 383 | -6.8877440690994263e-001 2.7647304534912109e-001 384 | <_> 385 | 386 | 0 -1 117 -37765187 -540675 -3 -327753 -1082458115 -65537 387 | 1071611901 536827253 388 | 389 | -5.7555085420608521e-001 3.4339720010757446e-001 390 | <_> 391 | 392 | 0 -1 85 -269490650 -1561395522 -1343312090 -857083986 393 | -1073750223 -369098755 -50856110 -2065 394 | 395 | -5.4036927223205566e-001 4.0065473318099976e-001 396 | <_> 397 | 398 | 0 -1 4 -425668880 -34427164 1879048177 -269570140 790740912 399 | -196740 2138535839 -536918145 400 | 401 | -4.8439365625381470e-001 
4.4630467891693115e-001 402 | <_> 403 | 404 | 0 -1 92 74726960 -1246482434 -1 -246017 -1078607916 405 | -1073947163 -1644231687 -1359211496 406 | 407 | -5.6686979532241821e-001 3.6671569943428040e-001 408 | <_> 409 | 410 | 0 -1 11 -135274809 -1158173459 -353176850 540195262 411 | 2139086600 2071977814 -546898600 -96272673 412 | 413 | -5.1499199867248535e-001 4.0788397192955017e-001 414 | 415 | <_> 416 | 9 417 | -4.0345416069030762e+000 418 | 419 | <_> 420 | 421 | 0 -1 78 -1 -1 -1 -1 -8912897 -1 -8912897 -1 422 | 423 | -9.9573624134063721e-001 -8.5452395677566528e-001 424 | <_> 425 | 426 | 0 -1 93 -1 -1 -1 -1 -148635649 -524297 -8912897 -1 427 | 428 | -9.7307401895523071e-001 -5.2884924411773682e-001 429 | <_> 430 | 431 | 0 -1 77 -1 -8209 -1 -257 -772734977 -1 -201850881 -1 432 | 433 | -8.6225658655166626e-001 4.3712578713893890e-002 434 | <_> 435 | 436 | 0 -1 68 -570427393 -16649 -69633 -131073 -536944677 -1 -8737 437 | -1435828225 438 | 439 | -6.8078064918518066e-001 2.5120577216148376e-001 440 | <_> 441 | 442 | 0 -1 50 -1179697 -34082849 -3278356 -37429266 -1048578 443 | -555753474 -1015551096 -37489685 444 | 445 | -6.1699724197387695e-001 3.0963841080665588e-001 446 | <_> 447 | 448 | 0 -1 129 -1931606992 -17548804 -16842753 -1075021827 449 | 1073667572 -81921 -1611073620 -1415047752 450 | 451 | -6.0499197244644165e-001 3.0735063552856445e-001 452 | <_> 453 | 454 | 0 -1 136 -269754813 1761591286 -1073811523 2130378623 -17580 455 | -1082294665 -159514800 -1026883840 456 | 457 | -5.6772041320800781e-001 3.5023149847984314e-001 458 | <_> 459 | 460 | 0 -1 65 2016561683 1528827871 -10258447 960184191 125476830 461 | -8511618 -1078239365 187648611 462 | 463 | -5.5894804000854492e-001 3.4856522083282471e-001 464 | <_> 465 | 466 | 0 -1 13 -207423502 -333902 2013200231 -202348848 1042454451 467 | -16393 1073117139 2004162321 468 | 469 | -5.7197356224060059e-001 3.2818377017974854e-001 470 | 471 | <_> 472 | 9 473 | -3.4892759323120117e+000 474 | 475 | <_> 476 | 477 | 0 -1 78 -1 -1 -1 -1 -8912897 -1 -8912897 -1 478 | 479 | -9.8917990922927856e-001 -7.3812037706375122e-001 480 | <_> 481 | 482 | 0 -1 93 -1 -1 -1 -1 -148635649 -524297 -8912897 -1 483 | 484 | -9.3414896726608276e-001 -2.6945295929908752e-001 485 | <_> 486 | 487 | 0 -1 83 -1 -524289 -1 -1048577 1879011071 -32769 -524289 488 | -3178753 489 | 490 | -7.6891708374023438e-001 5.2568886429071426e-002 491 | <_> 492 | 493 | 0 -1 9 -352329729 -17891329 -16810117 -486871042 -688128841 494 | -1358954675 -16777218 -219217968 495 | 496 | -6.2337344884872437e-001 2.5143685936927795e-001 497 | <_> 498 | 499 | 0 -1 130 -2157 -1548812374 -1343233440 -418381854 -953155613 500 | -836960513 -713571200 -709888014 501 | 502 | -4.7277018427848816e-001 3.9616456627845764e-001 503 | <_> 504 | 505 | 0 -1 121 -1094717701 -67240065 -65857 -32899 -5783756 506 | -136446081 -134285352 -2003298884 507 | 508 | -5.1766264438629150e-001 3.5814732313156128e-001 509 | <_> 510 | 511 | 0 -1 23 -218830160 -119671186 5505075 1241491391 -1594469 512 | -2097185 2004828075 -67649541 513 | 514 | -6.5394639968872070e-001 3.0377501249313354e-001 515 | <_> 516 | 517 | 0 -1 115 -551814749 2099511088 -1090732551 -2045546512 518 | -1086341441 1059848178 800042912 252705994 519 | 520 | -5.2584588527679443e-001 3.3847147226333618e-001 521 | <_> 522 | 523 | 0 -1 99 -272651477 578776766 -285233490 -889225217 524 | 2147448656 377454463 2012701952 -68157761 525 | 526 | -6.1836904287338257e-001 2.8922611474990845e-001 527 | 528 | <_> 529 | 9 530 | -3.0220029354095459e+000 531 | 532 | <_> 533 | 
534 | 0 -1 36 -1 -570425345 -1 -570425345 -1 -50331649 -6291457 -1 535 | 536 | -9.7703826427459717e-001 -6.2527233362197876e-001 537 | <_> 538 | 539 | 0 -1 124 -1430602241 -33619969 -1 -3 -1074003969 -1073758209 540 | -1073741825 -1073768705 541 | 542 | -8.9538317918777466e-001 -3.1887885928153992e-001 543 | <_> 544 | 545 | 0 -1 88 -1 -268439625 -65601 -268439569 -393809 -270532609 546 | -42076889 -288361721 547 | 548 | -6.8733429908752441e-001 1.2978810071945190e-001 549 | <_> 550 | 551 | 0 -1 132 -755049252 2042563807 1795096575 465121071 552 | -1090585188 -20609 -1459691784 539672495 553 | 554 | -5.7038843631744385e-001 3.0220884084701538e-001 555 | <_> 556 | 557 | 0 -1 20 -94377762 -25702678 1694167798 -231224662 1079955016 558 | -346144140 2029995743 -536918961 559 | 560 | -5.3204691410064697e-001 3.4054222702980042e-001 561 | <_> 562 | 563 | 0 -1 47 2143026943 -285278225 -3 -612438281 -16403 -131074 564 | -1 -1430749256 565 | 566 | -4.6176829934120178e-001 4.1114711761474609e-001 567 | <_> 568 | 569 | 0 -1 74 203424336 -25378820 -35667973 1073360894 -1912815660 570 | -573444 -356583491 -1365235056 571 | 572 | -4.9911966919898987e-001 3.5335537791252136e-001 573 | <_> 574 | 575 | 0 -1 6 -1056773 -1508430 -558153 -102747408 2133997491 576 | -269043865 2004842231 -8947721 577 | 578 | -4.0219521522521973e-001 4.3947893381118774e-001 579 | <_> 580 | 581 | 0 -1 70 -880809694 -1070282769 -1363162108 -838881281 582 | -680395161 -2064124929 -34244753 1173880701 583 | 584 | -5.3891533613204956e-001 3.2062566280364990e-001 585 | 586 | <_> 587 | 8 588 | -2.5489892959594727e+000 589 | 590 | <_> 591 | 592 | 0 -1 39 -1 -572522497 -8519681 -570425345 -4195329 -50333249 593 | -1 -1 594 | 595 | -9.4647216796875000e-001 -3.3662387728691101e-001 596 | <_> 597 | 598 | 0 -1 124 -1430735362 -33619971 -8201 -3 -1677983745 599 | -1073762817 -1074003969 -1142979329 600 | 601 | -8.0300611257553101e-001 -3.8466516882181168e-002 602 | <_> 603 | 604 | 0 -1 91 -67113217 -524289 -671482265 -786461 1677132031 605 | -268473345 -68005889 -70291765 606 | 607 | -5.8367580175399780e-001 2.6507318019866943e-001 608 | <_> 609 | 610 | 0 -1 17 -277872641 -553910292 -268435458 -16843010 611 | 1542420439 -1342178311 -143132940 -2834 612 | 613 | -4.6897178888320923e-001 3.7864661216735840e-001 614 | <_> 615 | 616 | 0 -1 137 -1312789 -290527285 -286326862 -5505280 -1712335966 617 | -2045979188 1165423617 -709363723 618 | 619 | -4.6382644772529602e-001 3.6114525794982910e-001 620 | <_> 621 | 622 | 0 -1 106 1355856590 -109445156 -96665606 2066939898 623 | 1356084692 1549031917 -30146561 -16581701 624 | 625 | -6.3095021247863770e-001 2.9294869303703308e-001 626 | <_> 627 | 628 | 0 -1 104 -335555328 118529 1860167712 -810680357 -33558656 629 | -1368391795 -402663552 -1343225921 630 | 631 | -5.9658926725387573e-001 2.7228885889053345e-001 632 | <_> 633 | 634 | 0 -1 76 217581168 -538349634 1062631419 1039868926 635 | -1090707460 -2228359 -1078042693 -1147128518 636 | 637 | -4.5812287926673889e-001 3.7063929438591003e-001 638 | 639 | <_> 640 | 9 641 | -2.5802578926086426e+000 642 | 643 | <_> 644 | 645 | 0 -1 35 -513 -706873891 -270541825 1564475391 -120602625 646 | -118490145 -3162113 -1025 647 | 648 | -8.9068460464477539e-001 -1.6470588743686676e-001 649 | <_> 650 | 651 | 0 -1 41 -1025 872144563 -2105361 -1078076417 -1048577 652 | -1145061461 -87557413 -1375993973 653 | 654 | -7.1808964014053345e-001 2.2022204473614693e-002 655 | <_> 656 | 657 | 0 -1 95 -42467849 967946223 -811601986 1030598351 658 | -1212430676 270856533 -1392539508 
147705039 659 | 660 | -4.9424821138381958e-001 3.0048963427543640e-001 661 | <_> 662 | 663 | 0 -1 10 -218116370 -637284625 -87373174 -521998782 664 | -805355450 -615023745 -814267322 -12069282 665 | 666 | -5.5306458473205566e-001 2.9137542843818665e-001 667 | <_> 668 | 669 | 0 -1 105 -275849241 -527897 -11052049 -69756067 -15794193 670 | -1141376839 -564771 -287095455 671 | 672 | -4.6759819984436035e-001 3.6638516187667847e-001 673 | <_> 674 | 675 | 0 -1 24 -1900898096 -18985228 -44056577 -24675 -1074880639 676 | -283998 796335613 -1079041957 677 | 678 | -4.2737138271331787e-001 3.9243003726005554e-001 679 | <_> 680 | 681 | 0 -1 139 -555790844 410735094 -32106513 406822863 -897632192 682 | -912830145 -117771560 -1204027649 683 | 684 | -4.1896930336952209e-001 3.6744937300682068e-001 685 | <_> 686 | 687 | 0 -1 0 -1884822366 -1406613148 1135342180 -1979127580 688 | -68174862 246469804 1001386992 -708885872 689 | 690 | -5.7093089818954468e-001 2.9880744218826294e-001 691 | <_> 692 | 693 | 0 -1 45 -469053950 1439068142 2117758841 2004671078 694 | 207931006 1265321675 970353931 1541343047 695 | 696 | -6.0491901636123657e-001 2.4652053415775299e-001 697 | 698 | <_> 699 | 9 700 | -2.2425732612609863e+000 701 | 702 | <_> 703 | 704 | 0 -1 58 1481987157 282547485 -14952129 421131223 -391065352 705 | -24212488 -100094241 -1157907473 706 | 707 | -8.2822084426879883e-001 -2.1619293093681335e-001 708 | <_> 709 | 710 | 0 -1 126 -134217889 -543174305 -75497474 -16851650 -6685738 711 | -75834693 -2097200 -262146 712 | 713 | -5.4628932476043701e-001 2.7662658691406250e-001 714 | <_> 715 | 716 | 0 -1 133 -220728227 -604288517 -661662214 413104863 717 | -627323700 -251915415 -626200872 -1157958657 718 | 719 | -4.1643124818801880e-001 4.1700571775436401e-001 720 | <_> 721 | 722 | 0 -1 2 -186664033 -44236961 -1630262774 -65163606 -103237330 723 | -3083265 -1003729 2053105955 724 | 725 | -5.4847818613052368e-001 2.9710745811462402e-001 726 | <_> 727 | 728 | 0 -1 62 -256115886 -237611873 -620250696 387061799 729 | 1437882671 274878849 -8684449 1494294023 730 | 731 | -4.6202757954597473e-001 3.3915829658508301e-001 732 | <_> 733 | 734 | 0 -1 1 -309400577 -275864640 -1056864869 1737132756 735 | -272385089 1609671419 1740601343 1261376789 736 | 737 | -4.6158722043037415e-001 3.3939516544342041e-001 738 | <_> 739 | 740 | 0 -1 102 818197248 -196324552 286970589 -573270699 741 | -1174099579 -662077381 -1165157895 -1626859296 742 | 743 | -4.6193107962608337e-001 3.2456985116004944e-001 744 | <_> 745 | 746 | 0 -1 69 -1042550357 14675409 1367955200 -841482753 747 | 1642443255 8774277 1941304147 1099949563 748 | 749 | -4.9091196060180664e-001 3.3870378136634827e-001 750 | <_> 751 | 752 | 0 -1 72 -639654997 1375720439 -2129542805 1614801090 753 | -626787937 -5779294 1488699183 -525406458 754 | 755 | -4.9073097109794617e-001 3.0637946724891663e-001 756 | 757 | <_> 758 | 9 759 | -1.2258235216140747e+000 760 | 761 | <_> 762 | 763 | 0 -1 118 302046707 -16744240 1360106207 -543735387 764 | 1025700851 -1079408512 1796961263 -6334981 765 | 766 | -6.1358314752578735e-001 2.3539231717586517e-001 767 | <_> 768 | 769 | 0 -1 5 -144765953 -116448726 -653851877 1934829856 722021887 770 | 856564834 1933919231 -540838029 771 | 772 | -5.1209545135498047e-001 3.2506987452507019e-001 773 | <_> 774 | 775 | 0 -1 140 -170132825 -1438923874 1879300370 -1689337194 776 | -695606496 285911565 -1044188928 -154210028 777 | 778 | -5.1769560575485229e-001 3.2290914654731750e-001 779 | <_> 780 | 781 | 0 -1 131 -140776261 -355516414 822178224 -1039743806 
782 | -1012208926 134887424 1438876097 -908591660 783 | 784 | -5.0321841239929199e-001 3.0263835191726685e-001 785 | <_> 786 | 787 | 0 -1 64 -2137211696 -1634281249 1464325973 498569935 788 | -1580152080 -2001687927 721783561 265096035 789 | 790 | -4.6532225608825684e-001 3.4638473391532898e-001 791 | <_> 792 | 793 | 0 -1 101 -255073589 -211824417 -972195129 -1063415417 794 | 1937994261 1363165220 -754733105 1967602541 795 | 796 | -4.9611270427703857e-001 3.3260712027549744e-001 797 | <_> 798 | 799 | 0 -1 81 -548146862 -655567194 -2062466596 1164562721 800 | 416408236 -1591631712 -83637777 975344427 801 | 802 | -4.9862930178642273e-001 3.2003280520439148e-001 803 | <_> 804 | 805 | 0 -1 55 -731904652 2147179896 2147442687 2112830847 -65604 806 | -131073 -42139667 -1074907393 807 | 808 | -3.6636069416999817e-001 4.5651626586914063e-001 809 | <_> 810 | 811 | 0 -1 67 1885036886 571985932 -1784930633 724431327 812 | 1940422257 -1085746880 964888398 731867951 813 | 814 | -5.2619713544845581e-001 3.2635414600372314e-001 815 | 816 | <_> 817 | 9 818 | -1.3604533672332764e+000 819 | 820 | <_> 821 | 822 | 0 -1 8 -287609985 -965585953 -2146397793 -492129894 823 | -729029645 -544619901 -645693256 -6565484 824 | 825 | -4.5212322473526001e-001 3.8910505175590515e-001 826 | <_> 827 | 828 | 0 -1 122 -102903523 -145031013 536899675 688195859 829 | -645291520 -1165359094 -905565928 171608223 830 | 831 | -4.9594074487686157e-001 3.4109055995941162e-001 832 | <_> 833 | 834 | 0 -1 134 -790640459 487931983 1778450522 1036604041 835 | -904752984 -954040118 -2134707506 304866043 836 | 837 | -4.1148442029953003e-001 3.9666590094566345e-001 838 | <_> 839 | 840 | 0 -1 141 -303829117 1726939070 922189815 -827983123 841 | 1567883042 1324809852 292710260 -942678754 842 | 843 | -3.5154473781585693e-001 4.8011952638626099e-001 844 | <_> 845 | 846 | 0 -1 59 -161295376 -159215460 -1858041315 2140644499 847 | -2009065472 -133804007 -2003265301 1263206851 848 | 849 | -4.2808216810226440e-001 3.9841541647911072e-001 850 | <_> 851 | 852 | 0 -1 34 -264248081 -667846464 1342624856 1381160835 853 | -2104716852 1342865409 -266612310 -165954877 854 | 855 | -4.3293288350105286e-001 4.0339657664299011e-001 856 | <_> 857 | 858 | 0 -1 32 -1600388464 -40369901 285344639 1394344275 859 | -255680312 -100532214 -1031663944 -7471079 860 | 861 | -4.1385015845298767e-001 4.5087572932243347e-001 862 | <_> 863 | 864 | 0 -1 15 1368521651 280207469 35779199 -105983261 1208124819 865 | -565870452 -1144024288 -591535344 866 | 867 | -4.2956474423408508e-001 4.2176279425621033e-001 868 | <_> 869 | 870 | 0 -1 109 1623607527 -661513115 -1073217263 -2142994420 871 | -1339883309 -89816956 436308899 1426178059 872 | 873 | -4.7764992713928223e-001 3.7551075220108032e-001 874 | 875 | <_> 876 | 9 877 | -4.2518746852874756e-001 878 | 879 | <_> 880 | 881 | 0 -1 135 -116728032 -1154420809 -1350582273 746061691 882 | -1073758277 2138570623 2113797566 -138674182 883 | 884 | -1.7125381529331207e-001 6.5421247482299805e-001 885 | <_> 886 | 887 | 0 -1 63 -453112432 -1795354691 -1342242964 494112553 888 | 209458404 -2114697500 1316830362 259213855 889 | 890 | -3.9870172739028931e-001 4.5807033777236938e-001 891 | <_> 892 | 893 | 0 -1 52 -268172036 294715533 268575185 486785157 -1065303920 894 | -360185856 -2147476808 134777113 895 | 896 | -5.3581339120864868e-001 3.5815808176994324e-001 897 | <_> 898 | 899 | 0 -1 86 -301996882 -345718921 1877946252 -940720129 900 | -58737369 -721944585 -92954835 -530449 901 | 902 | -3.9938014745712280e-001 4.9603295326232910e-001 903 
| <_> 904 | 905 | 0 -1 14 -853281886 -756895766 2130706352 -9519120 906 | -1921059862 394133373 2138453959 -538200841 907 | 908 | -4.0230083465576172e-001 4.9537116289138794e-001 909 | <_> 910 | 911 | 0 -1 128 -2133448688 -641138493 1078022185 294060066 912 | -327122776 -2130640896 -2147466247 -1910634326 913 | 914 | -5.8290809392929077e-001 3.4102553129196167e-001 915 | <_> 916 | 917 | 0 -1 53 587265978 -2071658479 1108361221 -578448765 918 | -1811905899 -2008965119 33900729 762301595 919 | 920 | -4.5518967509269714e-001 4.7242793440818787e-001 921 | <_> 922 | 923 | 0 -1 138 -1022189373 -2139094976 16658 -1069445120 924 | -1073555454 -1073577856 1096068 -978351488 925 | 926 | -4.7530207037925720e-001 4.3885371088981628e-001 927 | <_> 928 | 929 | 0 -1 7 -395352441 -1073541103 -1056964605 1053186 269111298 930 | -2012184576 1611208714 -360415095 931 | 932 | -5.0448113679885864e-001 4.1588482260704041e-001 933 | 934 | <_> 935 | 7 936 | 2.7163455262780190e-002 937 | 938 | <_> 939 | 940 | 0 -1 49 783189748 -137429026 -257 709557994 2130460236 941 | -196611 -9580 585428708 942 | 943 | -2.0454545319080353e-001 7.9608374834060669e-001 944 | <_> 945 | 946 | 0 -1 108 1284360448 1057423155 1592696573 -852672655 947 | 1547382714 -1642594369 125705358 797134398 948 | 949 | -3.6474677920341492e-001 6.0925579071044922e-001 950 | <_> 951 | 952 | 0 -1 94 1347680270 -527720448 1091567712 1073745933 953 | -1073180671 0 285745154 -511192438 954 | 955 | -4.6406838297843933e-001 5.5626088380813599e-001 956 | <_> 957 | 958 | 0 -1 73 1705780944 -145486260 -115909 -281793505 -418072663 959 | -1681064068 1877454127 -1912330993 960 | 961 | -4.7043186426162720e-001 5.8430361747741699e-001 962 | <_> 963 | 964 | 0 -1 110 -2118142016 339509033 -285260567 1417764573 965 | 68144392 -468879483 -2033291636 231451911 966 | 967 | -4.8700931668281555e-001 5.4639810323715210e-001 968 | <_> 969 | 970 | 0 -1 119 -1888051818 489996135 -65539 849536890 2146716845 971 | -1107542088 -1275615746 -1119617586 972 | 973 | -4.3356490135192871e-001 6.5175366401672363e-001 974 | <_> 975 | 976 | 0 -1 44 -1879021438 336830528 1073766659 1477541961 8560696 977 | -1207369568 8462472 1493893448 978 | 979 | -5.4343086481094360e-001 5.2777874469757080e-001 980 | 981 | <_> 982 | 7 983 | 4.9174150824546814e-001 984 | 985 | <_> 986 | 987 | 0 -1 57 644098 15758324 1995964260 -463011882 893285175 988 | 83156983 2004317989 16021237 989 | 990 | -1.7073170840740204e-001 9.0782123804092407e-001 991 | <_> 992 | 993 | 0 -1 123 268632845 -2147450864 -2143240192 -2147401728 994 | 8523937 -1878523840 16777416 616824984 995 | 996 | -4.8744434118270874e-001 7.3311311006546021e-001 997 | <_> 998 | 999 | 0 -1 120 -2110735872 803880886 989739810 1673281312 91564930 1000 | -277454958 997709514 -581366443 1001 | 1002 | -4.0291741490364075e-001 8.2450771331787109e-001 1003 | <_> 1004 | 1005 | 0 -1 87 941753434 -1067128905 788512753 -1074450460 1006 | 779101657 -1346552460 938805167 -2050424642 1007 | 1008 | -3.6246949434280396e-001 8.7103593349456787e-001 1009 | <_> 1010 | 1011 | 0 -1 60 208 1645217920 130 538263552 33595552 -1475870592 1012 | 16783361 1375993867 1013 | 1014 | -6.1472141742706299e-001 5.9707164764404297e-001 1015 | <_> 1016 | 1017 | 0 -1 114 1860423179 1034692624 -285213187 -986681712 1018 | 1576755092 -1408205463 -127714 -1246035687 1019 | 1020 | -4.5621752738952637e-001 8.9482426643371582e-001 1021 | <_> 1022 | 1023 | 0 -1 107 33555004 -1861746688 1073807361 -754909184 1024 | 645922856 8388608 134250648 419635458 1025 | 1026 | 
-5.2466005086898804e-001 7.1834069490432739e-001 1027 | 1028 | <_> 1029 | 2 1030 | 1.9084988832473755e+000 1031 | 1032 | <_> 1033 | 1034 | 0 -1 16 536064 131072 -20971516 524288 576 1048577 0 40960 1035 | 1036 | -8.0000001192092896e-001 9.8018401861190796e-001 1037 | <_> 1038 | 1039 | 0 -1 56 67108864 0 4096 1074003968 8192 536870912 4 262144 1040 | 1041 | -9.6610915660858154e-001 9.2831486463546753e-001 1042 | 1043 | <_> 1044 | 1045 | 0 0 1 1 1046 | <_> 1047 | 1048 | 0 0 3 2 1049 | <_> 1050 | 1051 | 0 1 13 6 1052 | <_> 1053 | 1054 | 0 2 3 14 1055 | <_> 1056 | 1057 | 0 2 4 2 1058 | <_> 1059 | 1060 | 0 6 2 3 1061 | <_> 1062 | 1063 | 0 6 3 2 1064 | <_> 1065 | 1066 | 0 16 1 3 1067 | <_> 1068 | 1069 | 0 20 3 3 1070 | <_> 1071 | 1072 | 0 22 2 3 1073 | <_> 1074 | 1075 | 0 28 4 4 1076 | <_> 1077 | 1078 | 0 35 2 3 1079 | <_> 1080 | 1081 | 1 0 14 7 1082 | <_> 1083 | 1084 | 1 5 3 2 1085 | <_> 1086 | 1087 | 1 6 2 1 1088 | <_> 1089 | 1090 | 1 14 10 9 1091 | <_> 1092 | 1093 | 1 21 4 4 1094 | <_> 1095 | 1096 | 1 23 4 2 1097 | <_> 1098 | 1099 | 2 0 13 7 1100 | <_> 1101 | 1102 | 2 0 14 7 1103 | <_> 1104 | 1105 | 2 33 5 4 1106 | <_> 1107 | 1108 | 2 36 4 3 1109 | <_> 1110 | 1111 | 2 39 3 2 1112 | <_> 1113 | 1114 | 3 1 13 11 1115 | <_> 1116 | 1117 | 3 2 3 2 1118 | <_> 1119 | 1120 | 4 0 7 8 1121 | <_> 1122 | 1123 | 4 0 13 7 1124 | <_> 1125 | 1126 | 5 0 12 6 1127 | <_> 1128 | 1129 | 5 0 13 7 1130 | <_> 1131 | 1132 | 5 1 10 13 1133 | <_> 1134 | 1135 | 5 1 12 7 1136 | <_> 1137 | 1138 | 5 2 7 13 1139 | <_> 1140 | 1141 | 5 4 2 1 1142 | <_> 1143 | 1144 | 5 8 7 4 1145 | <_> 1146 | 1147 | 5 39 3 2 1148 | <_> 1149 | 1150 | 6 3 5 2 1151 | <_> 1152 | 1153 | 6 3 6 2 1154 | <_> 1155 | 1156 | 6 5 4 12 1157 | <_> 1158 | 1159 | 6 9 6 3 1160 | <_> 1161 | 1162 | 7 3 5 2 1163 | <_> 1164 | 1165 | 7 3 6 13 1166 | <_> 1167 | 1168 | 7 5 6 4 1169 | <_> 1170 | 1171 | 7 7 6 10 1172 | <_> 1173 | 1174 | 7 8 6 4 1175 | <_> 1176 | 1177 | 7 32 5 4 1178 | <_> 1179 | 1180 | 7 33 5 4 1181 | <_> 1182 | 1183 | 8 0 1 1 1184 | <_> 1185 | 1186 | 8 0 2 1 1187 | <_> 1188 | 1189 | 8 2 10 7 1190 | <_> 1191 | 1192 | 9 0 6 2 1193 | <_> 1194 | 1195 | 9 2 9 3 1196 | <_> 1197 | 1198 | 9 4 1 1 1199 | <_> 1200 | 1201 | 9 6 2 1 1202 | <_> 1203 | 1204 | 9 28 6 4 1205 | <_> 1206 | 1207 | 10 0 9 3 1208 | <_> 1209 | 1210 | 10 3 1 1 1211 | <_> 1212 | 1213 | 10 10 11 11 1214 | <_> 1215 | 1216 | 10 15 4 3 1217 | <_> 1218 | 1219 | 11 4 2 1 1220 | <_> 1221 | 1222 | 11 27 4 3 1223 | <_> 1224 | 1225 | 11 36 8 2 1226 | <_> 1227 | 1228 | 12 0 2 2 1229 | <_> 1230 | 1231 | 12 23 4 3 1232 | <_> 1233 | 1234 | 12 25 4 3 1235 | <_> 1236 | 1237 | 12 29 5 3 1238 | <_> 1239 | 1240 | 12 33 3 4 1241 | <_> 1242 | 1243 | 13 0 2 2 1244 | <_> 1245 | 1246 | 13 36 8 3 1247 | <_> 1248 | 1249 | 14 0 2 2 1250 | <_> 1251 | 1252 | 15 15 2 2 1253 | <_> 1254 | 1255 | 16 13 3 4 1256 | <_> 1257 | 1258 | 17 0 1 3 1259 | <_> 1260 | 1261 | 17 1 3 3 1262 | <_> 1263 | 1264 | 17 31 5 3 1265 | <_> 1266 | 1267 | 17 35 3 1 1268 | <_> 1269 | 1270 | 18 13 2 3 1271 | <_> 1272 | 1273 | 18 39 2 1 1274 | <_> 1275 | 1276 | 19 0 7 15 1277 | <_> 1278 | 1279 | 19 2 7 2 1280 | <_> 1281 | 1282 | 19 3 7 13 1283 | <_> 1284 | 1285 | 19 14 2 2 1286 | <_> 1287 | 1288 | 19 24 7 4 1289 | <_> 1290 | 1291 | 20 1 6 13 1292 | <_> 1293 | 1294 | 20 8 7 3 1295 | <_> 1296 | 1297 | 20 9 7 3 1298 | <_> 1299 | 1300 | 20 13 1 1 1301 | <_> 1302 | 1303 | 20 14 2 3 1304 | <_> 1305 | 1306 | 20 30 3 2 1307 | <_> 1308 | 1309 | 21 0 3 4 1310 | <_> 1311 | 1312 | 21 0 6 8 1313 | <_> 1314 | 1315 | 21 3 6 2 1316 | <_> 1317 | 1318 | 21 6 6 4 1319 | 
<_> 1320 | 1321 | 21 37 2 1 1322 | <_> 1323 | 1324 | 22 3 6 2 1325 | <_> 1326 | 1327 | 22 13 1 2 1328 | <_> 1329 | 1330 | 22 22 4 3 1331 | <_> 1332 | 1333 | 23 0 2 3 1334 | <_> 1335 | 1336 | 23 3 6 2 1337 | <_> 1338 | 1339 | 23 9 5 4 1340 | <_> 1341 | 1342 | 23 11 1 1 1343 | <_> 1344 | 1345 | 23 15 1 1 1346 | <_> 1347 | 1348 | 23 16 3 2 1349 | <_> 1350 | 1351 | 23 35 2 1 1352 | <_> 1353 | 1354 | 23 36 1 1 1355 | <_> 1356 | 1357 | 23 39 6 2 1358 | <_> 1359 | 1360 | 24 0 2 3 1361 | <_> 1362 | 1363 | 24 8 6 11 1364 | <_> 1365 | 1366 | 24 28 2 2 1367 | <_> 1368 | 1369 | 24 33 4 4 1370 | <_> 1371 | 1372 | 25 16 4 3 1373 | <_> 1374 | 1375 | 25 31 5 3 1376 | <_> 1377 | 1378 | 26 0 1 2 1379 | <_> 1380 | 1381 | 26 0 2 2 1382 | <_> 1383 | 1384 | 26 0 3 2 1385 | <_> 1386 | 1387 | 26 24 4 4 1388 | <_> 1389 | 1390 | 27 30 4 5 1391 | <_> 1392 | 1393 | 27 36 5 3 1394 | <_> 1395 | 1396 | 28 0 2 2 1397 | <_> 1398 | 1399 | 28 4 2 1 1400 | <_> 1401 | 1402 | 28 21 2 5 1403 | <_> 1404 | 1405 | 29 8 2 1 1406 | <_> 1407 | 1408 | 33 0 2 1 1409 | <_> 1410 | 1411 | 33 0 4 2 1412 | <_> 1413 | 1414 | 33 0 4 6 1415 | <_> 1416 | 1417 | 33 3 1 1 1418 | <_> 1419 | 1420 | 33 6 4 12 1421 | <_> 1422 | 1423 | 33 21 4 2 1424 | <_> 1425 | 1426 | 33 36 4 3 1427 | <_> 1428 | 1429 | 35 1 2 2 1430 | <_> 1431 | 1432 | 36 5 1 1 1433 | <_> 1434 | 1435 | 36 29 3 4 1436 | <_> 1437 | 1438 | 36 39 2 2 1439 | <_> 1440 | 1441 | 37 5 2 2 1442 | <_> 1443 | 1444 | 38 6 2 1 1445 | <_> 1446 | 1447 | 38 6 2 2 1448 | <_> 1449 | 1450 | 39 1 2 12 1451 | <_> 1452 | 1453 | 39 24 1 2 1454 | <_> 1455 | 1456 | 39 36 2 2 1457 | <_> 1458 | 1459 | 40 39 1 2 1460 | <_> 1461 | 1462 | 42 4 1 1 1463 | <_> 1464 | 1465 | 42 20 1 2 1466 | <_> 1467 | 1468 | 42 29 1 2 1469 | 1470 | -------------------------------------------------------------------------------- /split_data.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "import sys\n", 10 | "sys.path.append('/home/emna/text/lib/python3.5/site-packages')\n", 11 | "import numpy as np\n", 12 | "import pandas as pd\n", 13 | "np.random.seed(1)" 14 | ] 15 | }, 16 | { 17 | "cell_type": "code", 18 | "execution_count": 2, 19 | "metadata": {}, 20 | "outputs": [], 21 | "source": [ 22 | "full_labels = pd.read_csv('celebrities_labels.csv',index_col=0)" 23 | ] 24 | }, 25 | { 26 | "cell_type": "code", 27 | "execution_count": 3, 28 | "metadata": {}, 29 | "outputs": [ 30 | { 31 | "data": { 32 | "text/html": [ 33 | "
\n", 34 | "\n", 47 | "\n", 48 | " \n", 49 | " \n", 50 | " \n", 51 | " \n", 52 | " \n", 53 | " \n", 54 | " \n", 55 | " \n", 56 | " \n", 57 | " \n", 58 | " \n", 59 | " \n", 60 | " \n", 61 | " \n", 62 | " \n", 63 | " \n", 64 | " \n", 65 | " \n", 66 | " \n", 67 | " \n", 68 | " \n", 69 | " \n", 70 | " \n", 71 | " \n", 72 | " \n", 73 | " \n", 74 | " \n", 75 | " \n", 76 | " \n", 77 | " \n", 78 | " \n", 79 | " \n", 80 | " \n", 81 | " \n", 82 | " \n", 83 | " \n", 84 | " \n", 85 | " \n", 86 | " \n", 87 | " \n", 88 | " \n", 89 | " \n", 90 | " \n", 91 | " \n", 92 | " \n", 93 | " \n", 94 | " \n", 95 | " \n", 96 | " \n", 97 | " \n", 98 | " \n", 99 | " \n", 100 | " \n", 101 | " \n", 102 | " \n", 103 | " \n", 104 | " \n", 105 | " \n", 106 | " \n", 107 | " \n", 108 | " \n", 109 | " \n", 110 | " \n", 111 | " \n", 112 | " \n", 113 | " \n", 114 | " \n", 115 | " \n", 116 | " \n", 117 | " \n", 118 | " \n", 119 | " \n", 120 | " \n", 121 | " \n", 122 | "
widthheightclassxminyminxmaxymax
filename
wenn5937954.jpg603839AndySerkis79186642456
2s6584.jpeg125165willSmith124214199
1s15927.jpeg6880willSmith6167256
2s6372.jpeg85113willSmith143510065
1jth406.jpeg82115justinBieber11339764
\n", 123 | "
" 124 | ], 125 | "text/plain": [ 126 | " width height class xmin ymin xmax ymax\n", 127 | "filename \n", 128 | "wenn5937954.jpg 603 839 AndySerkis 79 186 642 456\n", 129 | "2s6584.jpeg 125 165 willSmith 12 42 141 99\n", 130 | "1s15927.jpeg 68 80 willSmith 6 16 72 56\n", 131 | "2s6372.jpeg 85 113 willSmith 14 35 100 65\n", 132 | "1jth406.jpeg 82 115 justinBieber 11 33 97 64" 133 | ] 134 | }, 135 | "execution_count": 3, 136 | "metadata": {}, 137 | "output_type": "execute_result" 138 | } 139 | ], 140 | "source": [ 141 | "full_labels.head()\n" 142 | ] 143 | }, 144 | { 145 | "cell_type": "code", 146 | "execution_count": 4, 147 | "metadata": {}, 148 | "outputs": [ 149 | { 150 | "data": { 151 | "text/plain": [ 152 | "" 153 | ] 154 | }, 155 | "execution_count": 4, 156 | "metadata": {}, 157 | "output_type": "execute_result" 158 | } 159 | ], 160 | "source": [ 161 | "grouped = full_labels.groupby('class')\n", 162 | "grouped" 163 | ] 164 | }, 165 | { 166 | "cell_type": "code", 167 | "execution_count": 5, 168 | "metadata": {}, 169 | "outputs": [ 170 | { 171 | "data": { 172 | "text/plain": [ 173 | "class\n", 174 | "AndySerkis 200\n", 175 | "dewayneJohnson 43\n", 176 | "drake 61\n", 177 | "jayZ 15\n", 178 | "justinBieber 103\n", 179 | "kimKardashian 128\n", 180 | "tomHanks 79\n", 181 | "willSmith 78\n", 182 | "dtype: int64" 183 | ] 184 | }, 185 | "execution_count": 5, 186 | "metadata": {}, 187 | "output_type": "execute_result" 188 | } 189 | ], 190 | "source": [ 191 | "full_labels.groupby(['class']).size()" 192 | ] 193 | }, 194 | { 195 | "cell_type": "code", 196 | "execution_count": 6, 197 | "metadata": {}, 198 | "outputs": [], 199 | "source": [ 200 | "df = full_labels.groupby(['class']).size().to_frame('size')" 201 | ] 202 | }, 203 | { 204 | "cell_type": "code", 205 | "execution_count": 7, 206 | "metadata": {}, 207 | "outputs": [ 208 | { 209 | "data": { 210 | "text/html": [ 211 | "
\n", 212 | "\n", 225 | "\n", 226 | " \n", 227 | " \n", 228 | " \n", 229 | " \n", 230 | " \n", 231 | " \n", 232 | " \n", 233 | " \n", 234 | " \n", 235 | " \n", 236 | " \n", 237 | " \n", 238 | " \n", 239 | " \n", 240 | " \n", 241 | " \n", 242 | " \n", 243 | " \n", 244 | " \n", 245 | " \n", 246 | " \n", 247 | " \n", 248 | " \n", 249 | " \n", 250 | " \n", 251 | " \n", 252 | " \n", 253 | " \n", 254 | " \n", 255 | " \n", 256 | " \n", 257 | " \n", 258 | " \n", 259 | " \n", 260 | " \n", 261 | " \n", 262 | " \n", 263 | " \n", 264 | " \n", 265 | " \n", 266 | " \n", 267 | " \n", 268 | " \n", 269 | " \n", 270 | "
size
class
AndySerkis200
dewayneJohnson43
drake61
jayZ15
justinBieber103
kimKardashian128
tomHanks79
willSmith78
\n", 271 | "
" 272 | ], 273 | "text/plain": [ 274 | " size\n", 275 | "class \n", 276 | "AndySerkis 200\n", 277 | "dewayneJohnson 43\n", 278 | "drake 61\n", 279 | "jayZ 15\n", 280 | "justinBieber 103\n", 281 | "kimKardashian 128\n", 282 | "tomHanks 79\n", 283 | "willSmith 78" 284 | ] 285 | }, 286 | "execution_count": 7, 287 | "metadata": {}, 288 | "output_type": "execute_result" 289 | } 290 | ], 291 | "source": [ 292 | "df" 293 | ] 294 | }, 295 | { 296 | "cell_type": "markdown", 297 | "metadata": {}, 298 | "source": [ 299 | "\n", 300 | "### split each file into a group in a list\n" 301 | ] 302 | }, 303 | { 304 | "cell_type": "code", 305 | "execution_count": 8, 306 | "metadata": {}, 307 | "outputs": [], 308 | "source": [ 309 | "grouped = full_labels.groupby('class')\n" 310 | ] 311 | }, 312 | { 313 | "cell_type": "code", 314 | "execution_count": 9, 315 | "metadata": {}, 316 | "outputs": [], 317 | "source": [ 318 | "grouped_list = [grouped.get_group(x) for x in grouped.groups]" 319 | ] 320 | }, 321 | { 322 | "cell_type": "code", 323 | "execution_count": 10, 324 | "metadata": {}, 325 | "outputs": [ 326 | { 327 | "data": { 328 | "text/plain": [ 329 | "8" 330 | ] 331 | }, 332 | "execution_count": 10, 333 | "metadata": {}, 334 | "output_type": "execute_result" 335 | } 336 | ], 337 | "source": [ 338 | "len(grouped_list)" 339 | ] 340 | }, 341 | { 342 | "cell_type": "code", 343 | "execution_count": 11, 344 | "metadata": {}, 345 | "outputs": [], 346 | "source": [ 347 | "from sklearn.model_selection import train_test_split" 348 | ] 349 | }, 350 | { 351 | "cell_type": "code", 352 | "execution_count": 12, 353 | "metadata": {}, 354 | "outputs": [], 355 | "source": [ 356 | "X_train, X_test, y_train, y_test = train_test_split(full_labels, full_labels['class'],test_size=0.2)" 357 | ] 358 | }, 359 | { 360 | "cell_type": "code", 361 | "execution_count": 13, 362 | "metadata": {}, 363 | "outputs": [ 364 | { 365 | "data": { 366 | "text/plain": [ 367 | "(707, 7)" 368 | ] 369 | }, 370 | "execution_count": 13, 371 | "metadata": {}, 372 | "output_type": "execute_result" 373 | } 374 | ], 375 | "source": [ 376 | "full_labels.shape" 377 | ] 378 | }, 379 | { 380 | "cell_type": "code", 381 | "execution_count": 14, 382 | "metadata": {}, 383 | "outputs": [ 384 | { 385 | "data": { 386 | "text/plain": [ 387 | "(565, 7)" 388 | ] 389 | }, 390 | "execution_count": 14, 391 | "metadata": {}, 392 | "output_type": "execute_result" 393 | } 394 | ], 395 | "source": [ 396 | "X_train.shape" 397 | ] 398 | }, 399 | { 400 | "cell_type": "code", 401 | "execution_count": 15, 402 | "metadata": {}, 403 | "outputs": [ 404 | { 405 | "data": { 406 | "text/plain": [ 407 | "class\n", 408 | "AndySerkis 159\n", 409 | "dewayneJohnson 34\n", 410 | "drake 49\n", 411 | "jayZ 14\n", 412 | "justinBieber 85\n", 413 | "kimKardashian 108\n", 414 | "tomHanks 59\n", 415 | "willSmith 57\n", 416 | "dtype: int64" 417 | ] 418 | }, 419 | "execution_count": 15, 420 | "metadata": {}, 421 | "output_type": "execute_result" 422 | } 423 | ], 424 | "source": [ 425 | "X_train.groupby(['class']).size()" 426 | ] 427 | }, 428 | { 429 | "cell_type": "code", 430 | "execution_count": 16, 431 | "metadata": {}, 432 | "outputs": [ 433 | { 434 | "data": { 435 | "text/plain": [ 436 | "(142, 7)" 437 | ] 438 | }, 439 | "execution_count": 16, 440 | "metadata": {}, 441 | "output_type": "execute_result" 442 | } 443 | ], 444 | "source": [ 445 | "X_test.shape" 446 | ] 447 | }, 448 | { 449 | "cell_type": "code", 450 | "execution_count": 17, 451 | "metadata": {}, 452 | "outputs": [ 453 | { 454 | "data": { 
455 | "text/plain": [ 456 | "class\n", 457 | "AndySerkis 41\n", 458 | "dewayneJohnson 9\n", 459 | "drake 12\n", 460 | "jayZ 1\n", 461 | "justinBieber 18\n", 462 | "kimKardashian 20\n", 463 | "tomHanks 20\n", 464 | "willSmith 21\n", 465 | "dtype: int64" 466 | ] 467 | }, 468 | "execution_count": 17, 469 | "metadata": {}, 470 | "output_type": "execute_result" 471 | } 472 | ], 473 | "source": [ 474 | "X_test.groupby(['class']).size()" 475 | ] 476 | }, 477 | { 478 | "cell_type": "code", 479 | "execution_count": 18, 480 | "metadata": {}, 481 | "outputs": [], 482 | "source": [ 483 | "X_train.to_csv('train.csv')\n", 484 | "X_test.to_csv('test.csv')" 485 | ] 486 | }, 487 | { 488 | "cell_type": "code", 489 | "execution_count": null, 490 | "metadata": {}, 491 | "outputs": [], 492 | "source": [] 493 | } 494 | ], 495 | "metadata": { 496 | "kernelspec": { 497 | "display_name": "Python 3", 498 | "language": "python", 499 | "name": "python3" 500 | }, 501 | "language_info": { 502 | "codemirror_mode": { 503 | "name": "ipython", 504 | "version": 3 505 | }, 506 | "file_extension": ".py", 507 | "mimetype": "text/x-python", 508 | "name": "python", 509 | "nbconvert_exporter": "python", 510 | "pygments_lexer": "ipython3", 511 | "version": "3.5.2" 512 | } 513 | }, 514 | "nbformat": 4, 515 | "nbformat_minor": 2 516 | } 517 | -------------------------------------------------------------------------------- /ssd_mobilenet_v1_coco_2018_01_28/ssd.md: -------------------------------------------------------------------------------- 1 | We will use [ssd_mobilenet_v1_coco](http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v1_coco_2018_01_28.tar.gz) to train our face recognition model. 2 | -------------------------------------------------------------------------------- /xml_to_csv.py: -------------------------------------------------------------------------------- 1 | import os 2 | import glob 3 | import pandas as pd 4 | import xml.etree.ElementTree as ET 5 | 6 | 7 | def xml_to_csv(path): 8 | xml_list = [] 9 | for xml_file in glob.glob(path + '/*.xml'): 10 | tree = ET.parse(xml_file) 11 | root = tree.getroot() 12 | for member in root.findall('object'): 13 | value = (root.find('filename').text, 14 | int(root.find('size')[0].text), 15 | int(root.find('size')[1].text), 16 | member[0].text, 17 | int(member[4][0].text), 18 | int(member[4][1].text), 19 | int(member[4][2].text), 20 | int(member[4][3].text) 21 | ) 22 | xml_list.append(value) 23 | column_name = ['filename', 'width', 'height', 'class', 'xmin', 'ymin', 'xmax', 'ymax'] 24 | xml_df = pd.DataFrame(xml_list, columns=column_name) 25 | return xml_df 26 | 27 | 28 | def main(): 29 | image_path = os.path.join(os.getcwd(), 'annotations') 30 | xml_df = xml_to_csv(image_path) 31 | xml_df.to_csv('celebrities_labels.csv', index=None) 32 | print('Successfully converted xml to csv.') 33 | 34 | 35 | main() 36 | --------------------------------------------------------------------------------