├── .gitignore ├── LICENSE ├── README.md ├── create_cityscapes_tf_record.py ├── create_pascal_tf_record.py ├── dataset ├── sample_images_list.txt ├── test.txt ├── train.txt └── val.txt ├── deeplab_model.py ├── evaluate.py ├── export_inference_graph.py ├── images ├── tensorboard_images.png └── tensorboard_miou.png ├── inference.py ├── requirements.txt ├── train.py └── utils ├── __init__.py ├── dataset_util.py └── preprocessing.py /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | env/ 12 | build/ 13 | develop-eggs/ 14 | dist/ 15 | downloads/ 16 | eggs/ 17 | .eggs/ 18 | lib/ 19 | lib64/ 20 | parts/ 21 | sdist/ 22 | var/ 23 | wheels/ 24 | *.egg-info/ 25 | .installed.cfg 26 | *.egg 27 | 28 | # PyInstaller 29 | # Usually these files are written by a python script from a template 30 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 31 | *.manifest 32 | *.spec 33 | 34 | # Installer logs 35 | pip-log.txt 36 | pip-delete-this-directory.txt 37 | 38 | # Unit test / coverage reports 39 | htmlcov/ 40 | .tox/ 41 | .coverage 42 | .coverage.* 43 | .cache 44 | nosetests.xml 45 | coverage.xml 46 | *.cover 47 | .hypothesis/ 48 | 49 | # Translations 50 | *.mo 51 | *.pot 52 | 53 | # Django stuff: 54 | *.log 55 | local_settings.py 56 | 57 | # Flask stuff: 58 | instance/ 59 | .webassets-cache 60 | 61 | # Scrapy stuff: 62 | .scrapy 63 | 64 | # Sphinx documentation 65 | docs/_build/ 66 | 67 | # PyBuilder 68 | target/ 69 | 70 | # Jupyter Notebook 71 | .ipynb_checkpoints 72 | 73 | # pyenv 74 | .python-version 75 | 76 | # celery beat schedule file 77 | celerybeat-schedule 78 | 79 | # SageMath parsed files 80 | *.sage.py 81 | 82 | # dotenv 83 | .env 84 | 85 | # virtualenv 86 | .venv 87 | venv/ 88 | ENV/ 89 | 90 | # Spyder project settings 91 | .spyderproject 92 | .spyproject 93 | 94 | # Rope project settings 95 | .ropeproject 96 | 97 | # mkdocs documentation 98 | /site 99 | 100 | # mypy 101 | .mypy_cache/ 102 | 103 | # PyCharm 104 | .idea 105 | 106 | # dataset 107 | dataset/VOCdevkit 108 | ini_checkpoints 109 | model 110 | *.record 111 | models 112 | dataset/inference_output/ 113 | .DS_Store 114 | dataset/export_output 115 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2018 Riei Ishizeki 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # DeepLab-v3-plus Semantic Segmentation in TensorFlow 2 | 3 | This repo attempts to reproduce [Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation (DeepLabv3+)](https://arxiv.org/abs/1802.02611) in 4 | TensorFlow for semantic image segmentation on the 5 | [PASCAL VOC dataset](http://host.robots.ox.ac.uk/pascal/VOC/) and [Cityscapes dataset](https://www.cityscapes-dataset.com/). 6 | The implementation is largely based on 7 | [my DeepLabv3 implementation](https://github.com/rishizek/tensorflow-deeplab-v3), 8 | which was originally based on 9 | [DrSleep's DeepLab v2 implementation](https://github.com/DrSleep/tensorflow-deeplab-resnet) 10 | and the [TensorFlow models ResNet implementation](https://github.com/tensorflow/models/tree/master/official/resnet). 11 | 12 | 13 | ## Setup 14 | ### Requirements: 15 | - tensorflow >=1.6 16 | - numpy 17 | - matplotlib 18 | - pillow 19 | - opencv-python 20 | 21 | You can install the requirements by running `pip install -r requirements.txt`. 22 | 23 | ## Dataset Preparation 24 | This project uses the [TFRecord format](https://www.tensorflow.org/api_guides/python/python_io#tfrecords_format_details) to consume data during training and evaluation. Creating TFRecords from the raw image files is straightforward and is covered below. 25 | 26 | ### Cityscapes 27 | 28 | 29 | *Note:* **This project includes scripts for creating TFRecords for Cityscapes and Pascal VOC**, but not for other datasets. 30 | ### Creating TFRecords for Cityscapes 31 | 32 | To download the Cityscapes dataset, you must first register on their [website](https://www.cityscapes-dataset.com/). After that, make sure to download both `leftImg8bit` and `gtFine`. You should end up with a folder with the following structure: 33 | 34 | ``` 35 | + cityscapes 36 | + leftImg8bit 37 | + gtFine 38 | ``` 39 | 40 | Next, to generate the training labels for the dataset, clone the cityscapesScripts project: 41 | 42 | ``` 43 | git clone https://github.com/mcordts/cityscapesScripts.git 44 | cd cityscapesScripts 45 | ``` 46 | 47 | Then, from the root of your Cityscapes dataset, run 48 | 49 | ``` 50 | # must have $CITYSCAPES_ROOT defined 51 | python cityscapesscripts/preparation/createTrainIdLabelImgs.py 52 | ``` 53 | Finally, you can run the conversion script `create_cityscapes_tf_record.py` provided in this repository. 54 | 55 | ### Pascal VOC 56 | - Download and extract the 57 | [PASCAL VOC training/validation data](http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar) 58 | (2GB tar file), specifying the location with `--data_dir`. 59 | - Download and extract the 60 | [augmented segmentation data](https://www.dropbox.com/s/oeu149j8qtbs1x0/SegmentationClassAug.zip?dl=0) 61 | (thanks to DrSleep), specifying the location with `--data_dir` and `--label_data_dir` 62 | (namely, `$data_dir/$label_data_dir`); the expected layout is sketched below.
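For reference, with the default flags in `create_pascal_tf_record.py` (`--data_dir ./dataset/VOCdevkit/VOC2012`, `--image_data_dir JPEGImages`, `--label_data_dir SegmentationClassAug`), the extracted data is expected to end up roughly in the layout below; the exact root location is up to you, so treat the paths as placeholders.

```
+ VOCdevkit
  + VOC2012                  <- pass this directory as --data_dir
    + JPEGImages             <- --image_data_dir (from the VOC tar file)
    + SegmentationClassAug   <- --label_data_dir (from the augmented labels zip)
```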
63 | 64 | ### Creating TFRecords for Pascal VOC 65 | Once you have the dataset available, you can create TFRecords for Pascal VOC by running the following: 66 | ```bash 67 | python create_pascal_tf_record.py --data_dir DATA_DIR \ 68 | --image_data_dir IMAGE_DATA_DIR \ 69 | --label_data_dir LABEL_DATA_DIR 70 | ``` 71 | 72 | ## Training 73 | For training, you need to download and extract the [pre-trained Resnet v2 101 model](http://download.tensorflow.org/models/resnet_v2_101_2017_04_14.tar.gz) from [slim](https://github.com/tensorflow/models/tree/master/research/slim), specifying its location with `--pre_trained_model`. You also need to convert the original data to the TensorFlow TFRecord format. Once you have followed all the steps in dataset preparation and created TFRecords for the training and validation data, you can start training the model as follows: 74 | ```bash 75 | python train.py --model_dir MODEL_DIR --pre_trained_model PRE_TRAINED_MODEL 76 | ``` 77 | Here, `--pre_trained_model` contains the pre-trained Resnet model, whereas 78 | `--model_dir` contains the trained DeepLabv3+ checkpoints. 79 | If `--model_dir` contains valid checkpoints, training resumes from the 80 | latest checkpoint in `--model_dir`. 81 | 82 | You can see the other options with the following command: 83 | ```bash 84 | python train.py --help 85 | ``` 86 | For inference, a trained model with `77.31%` mIoU on the Pascal VOC 2012 validation dataset 87 | is available 88 | [here](https://www.dropbox.com/s/1xrd4c5atyrkb6z/deeplabv3plus_ver1.tar.gz?dl=0). Download and extract it to 89 | `--model_dir`. 90 | 91 |
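If you want to sanity-check the downloaded ResNet checkpoint, or see which checkpoint training would resume from, a minimal TF 1.x sketch along these lines works; the two paths are placeholders (they are not fixed by this repo), so point them at your own locations.

```python
import tensorflow as tf

# Placeholder paths -- adjust to wherever you extracted/trained things.
PRE_TRAINED_MODEL = 'ini_checkpoints/resnet_v2_101/resnet_v2_101.ckpt'
MODEL_DIR = 'models/deeplabv3plus'

# List a few variables stored in the pre-trained ResNet v2 101 checkpoint.
for name, shape in tf.train.list_variables(PRE_TRAINED_MODEL)[:10]:
    print(name, shape)

# Show which checkpoint (if any) train.py would resume from in --model_dir.
print('latest checkpoint:', tf.train.latest_checkpoint(MODEL_DIR))
```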

92 | ![TensorBoard mIoU](images/tensorboard_miou.png) 93 |

94 | 95 | The training process can be visualized with TensorBoard as follows: 96 | ```bash 97 | tensorboard --logdir MODEL_DIR 98 | ``` 99 | 100 |

101 | ![TensorBoard images](images/tensorboard_images.png) 102 | 103 |
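The scalars that TensorBoard plots can also be read directly from the event files, e.g. to grab the logged mIoU without starting the UI. Below is a minimal TF 1.x sketch; the `MODEL_DIR` path and the `'miou'` tag name are assumptions, so print all tags first if the filter matches nothing.

```python
import glob
import tensorflow as tf

MODEL_DIR = 'models/deeplabv3plus'  # placeholder; use your --model_dir

# Training summaries sit directly under MODEL_DIR; evaluation summaries
# may be written to a subdirectory such as MODEL_DIR/eval.
for event_file in glob.glob(MODEL_DIR + '/events.out.tfevents.*'):
    for event in tf.train.summary_iterator(event_file):
        for value in event.summary.value:
            # 'miou' is an assumed tag name -- adjust to whatever is logged.
            if 'miou' in value.tag.lower():
                print(event.step, value.tag, value.simple_value)
```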

104 | 105 | ## Evaluation 106 | To evaluate how the model performs, one can use the following command: 107 | ```bash 108 | python evaluate.py --help 109 | ``` 110 | The current best model built by this implementation achieves `77.31%` mIoU on the Pascal VOC 2012 111 | validation dataset. 112 | 113 | | Network Backbone | train OS | eval OS | SC | mIoU paper | mIoU repo | 114 | |:----------------:|:--------:|:-------:|:---:|:-----------:|:----------:| 115 | | Resnet101 | 16 | 16 | | 78.85% | **77.31%** | 116 | 117 | Here, the above model was trained for about 9.5 hours (on a Tesla V100 with TensorFlow r1.6) with the following parameters: 118 | ```bash 119 | python train.py --train_epochs 43 --batch_size 15 --weight_decay 2e-4 --model_dir models/ba=15,wd=2e-4,max_iter=30k --max_iter 30000 120 | ``` 121 | 122 | ## Inference 123 | To apply semantic segmentation to your own images, one can use the following command: 124 | ```bash 125 | python inference.py --data_dir DATA_DIR --infer_data_list INFER_DATA_LIST --model_dir MODEL_DIR 126 | ``` 127 | The trained model is available [here](https://www.dropbox.com/s/1xrd4c5atyrkb6z/deeplabv3plus_ver1.tar.gz?dl=0). 128 | A detailed explanation of the output masks, such as the meaning of each color, can be found in 129 | [DrSleep's repo](https://github.com/DrSleep/tensorflow-deeplab-resnet). 130 | 131 | ## TODO: 132 | Pull requests are welcome. 133 | - [x] Implement Decoder 134 | - [x] Resnet as Network Backbone 135 | - [x] Training on cityscapes 136 | - [ ] Xception as Network Backbone 137 | - [ ] Implement depthwise separable convolutions 138 | - [ ] Make network more GPU memory efficient (i.e. support larger batch size) 139 | - [ ] Multi-GPU support 140 | - [ ] Channels first support (apparently a large performance boost on GPU) 141 | - [ ] Model pretrained on MS-COCO 142 | - [ ] Unit test 143 | 144 | ## Acknowledgment 145 | This repo borrows code heavily from 146 | - [DrSleep's DeepLab-ResNet (DeepLabv2)](https://github.com/DrSleep/tensorflow-deeplab-resnet) 147 | - [TensorFlow Official Models](https://github.com/tensorflow/models/tree/master/official) 148 | - [Tensorflow Object Detection API](https://github.com/tensorflow/models/tree/master/research/object_detection) 149 | - [TensorFlow-Slim](https://github.com/tensorflow/models/tree/master/research/slim) 150 | - [TensorFlow](https://github.com/tensorflow/tensorflow) 151 | -------------------------------------------------------------------------------- /create_cityscapes_tf_record.py: -------------------------------------------------------------------------------- 1 | r"""Build a TF Record for Cityscapes Semantic Segmentation dataset.""" 2 | from __future__ import absolute_import 3 | from __future__ import division 4 | from __future__ import print_function 5 | 6 | import hashlib 7 | import glob 8 | import io 9 | import json 10 | import os 11 | import numpy as np 12 | import PIL.Image 13 | 14 | import tensorflow as tf 15 | 16 | flags = tf.app.flags 17 | tf.flags.DEFINE_string('input_pattern', '', 18 | 'Pattern matching input images for Cityscapes.') 19 | tf.flags.DEFINE_string('annot_pattern', '', 20 | 'Pattern matching ground truth images for Cityscapes.') 21 | tf.flags.DEFINE_string('cityscapes_dir', '', 22 | 'Cityscapes dataset root folder.') 23 | tf.flags.DEFINE_string('split_type', '', 24 | 'Type of split: `train`, `test` or `val`.') 25 | tf.flags.DEFINE_string('output_dir', '', 'Output data directory.') 26 | 27 | FLAGS = flags.FLAGS 28 | 29 | 30 | tf.logging.set_verbosity(tf.logging.INFO) 31 | 32 | 33 | _DEFAULT_PATTEN = { 34 | 'input':
'*_leftImg8bit.png', 35 | 'annot': '*_gtFine_labelTrainIds.png' 36 | } 37 | 38 | _DEFAULT_DIR = { 39 | 'image': 'leftImg8bit', 40 | 'label': 'gtFine' 41 | } 42 | 43 | 44 | def _bytes_feature(values): 45 | return tf.train.Feature( 46 | bytes_list=tf.train.BytesList(value=[values])) 47 | 48 | 49 | def _int64_feature(values): 50 | if not isinstance(values, (tuple, list)): 51 | values = [values] 52 | return tf.train.Feature(int64_list=tf.train.Int64List(value=values)) 53 | 54 | 55 | def _open_file(full_path): 56 | with tf.gfile.GFile(full_path, 'rb') as fid: 57 | encoded_file = fid.read() 58 | encoded_file_io = io.BytesIO(encoded_file) 59 | image = PIL.Image.open(encoded_file_io) 60 | return image, encoded_file 61 | 62 | 63 | def create_tf_example(image_path, label_path, image_dir='', is_jpeg=False): 64 | file_format = 'jpeg' if is_jpeg else 'png' 65 | full_image_path = os.path.join(image_dir, image_path) 66 | full_label_path = os.path.join(image_dir, label_path) 67 | image, encoded_image = _open_file(full_image_path) 68 | label, encoded_label = _open_file(full_label_path) 69 | 70 | height = image.height 71 | width = image.width 72 | if height != label.height or width != label.width: 73 | raise ValueError('Input and annotated images must have same dims.' 74 | 'verify the matching pair for {}'.format(full_image_path)) 75 | 76 | feature_dict = { 77 | 'image/encoded': _bytes_feature(encoded_image), 78 | 'image/filename': _bytes_feature( 79 | full_image_path.encode('utf8')), 80 | 'image/format': _bytes_feature( 81 | file_format.encode('utf8')), 82 | 'image/height': _int64_feature(height), 83 | 'image/width': _int64_feature(width), 84 | 'image/channels': _int64_feature(3), 85 | 'label/encoded': _bytes_feature(encoded_label), 86 | 'label/format':_bytes_feature( 87 | 'png'.encode('utf8')), 88 | } 89 | 90 | example = tf.train.Example( 91 | features=tf.train.Features(feature=feature_dict)) 92 | return example 93 | 94 | 95 | def _create_tf_record(images, labels, output_path): 96 | writer = tf.python_io.TFRecordWriter(output_path) 97 | for idx, image in enumerate(images): 98 | if idx % 100 == 0: 99 | tf.logging.info('On image %d of %d', idx, len(images)) 100 | tf_example = create_tf_example( 101 | image, labels[idx], is_jpeg=False) 102 | writer.write(tf_example.SerializeToString()) 103 | writer.close() 104 | tf.logging.info('Finished writing!') 105 | 106 | 107 | def main(_): 108 | assert FLAGS.output_dir, '`output_dir` missing.' 109 | assert FLAGS.split_type, '`split_type` missing.' 110 | assert (FLAGS.cityscapes_dir) or \ 111 | (FLAGS.input_pattern and FLAGS.annot_pattern), \ 112 | 'Must specify either `cityscapes_dir` or ' \ 113 | '`input_pattern` and `annot_pattern`.' 
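  # Example invocation (the paths here are placeholders, not defaults of this script):
  #   python create_cityscapes_tf_record.py \
  #       --cityscapes_dir /path/to/cityscapes \
  #       --split_type train \
  #       --output_dir ./dataset
  # Alternatively, pass explicit --input_pattern and --annot_pattern globs instead
  # of --cityscapes_dir.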
114 | 115 | if not tf.gfile.IsDirectory(FLAGS.output_dir): 116 | tf.gfile.MakeDirs(FLAGS.output_dir) 117 | train_output_path = os.path.join(FLAGS.output_dir, 118 | '{}.record'.format(FLAGS.split_type)) 119 | 120 | if FLAGS.cityscapes_dir: 121 | search_image_files = os.path.join(FLAGS.cityscapes_dir, 122 | _DEFAULT_DIR['image'], FLAGS.split_type, '*', _DEFAULT_PATTEN['input']) 123 | search_annot_files = os.path.join(FLAGS.cityscapes_dir, 124 | _DEFAULT_DIR['label'], FLAGS.split_type, '*', _DEFAULT_PATTEN['annot']) 125 | image_filenames = glob.glob(search_image_files) 126 | annot_filenames = glob.glob(search_annot_files) 127 | else: 128 | image_filenames = glob.glob(FLAGS.input_pattern) 129 | annot_filenames = glob.glob(FLAGS.annot_pattern) 130 | if len(image_filenames) != len(annot_filenames): 131 | raise ValueError('Supplied patterns do not have image counts.') 132 | 133 | _create_tf_record( 134 | sorted(image_filenames), 135 | sorted(annot_filenames), 136 | output_path=train_output_path) 137 | 138 | 139 | if __name__ == '__main__': 140 | tf.app.run() 141 | -------------------------------------------------------------------------------- /create_pascal_tf_record.py: -------------------------------------------------------------------------------- 1 | """Converts PASCAL dataset to TFRecords file format.""" 2 | 3 | from __future__ import absolute_import 4 | from __future__ import division 5 | from __future__ import print_function 6 | 7 | import argparse 8 | import io 9 | import os 10 | import sys 11 | 12 | import PIL.Image 13 | import tensorflow as tf 14 | 15 | from utils import dataset_util 16 | 17 | parser = argparse.ArgumentParser() 18 | 19 | parser.add_argument('--data_dir', type=str, default='./dataset/VOCdevkit/VOC2012', 20 | help='Path to the directory containing the PASCAL VOC data.') 21 | 22 | parser.add_argument('--output_path', type=str, default='./dataset', 23 | help='Path to the directory to create TFRecords outputs.') 24 | 25 | parser.add_argument('--train_data_list', type=str, default='./dataset/train.txt', 26 | help='Path to the file listing the training data.') 27 | 28 | parser.add_argument('--valid_data_list', type=str, default='./dataset/val.txt', 29 | help='Path to the file listing the validation data.') 30 | 31 | parser.add_argument('--image_data_dir', type=str, default='JPEGImages', 32 | help='The directory containing the image data.') 33 | 34 | parser.add_argument('--label_data_dir', type=str, default='SegmentationClassAug', 35 | help='The directory containing the augmented label data.') 36 | 37 | 38 | def dict_to_tf_example(image_path, 39 | label_path): 40 | """Convert image and label to tf.Example proto. 41 | 42 | Args: 43 | image_path: Path to a single PASCAL image. 44 | label_path: Path to its corresponding label. 45 | 46 | Returns: 47 | example: The converted tf.Example. 48 | 49 | Raises: 50 | ValueError: if the image pointed to by image_path is not a valid JPEG or 51 | if the label pointed to by label_path is not a valid PNG or 52 | if the size of image does not match with that of label. 
53 | """ 54 | with tf.gfile.GFile(image_path, 'rb') as fid: 55 | encoded_jpg = fid.read() 56 | encoded_jpg_io = io.BytesIO(encoded_jpg) 57 | image = PIL.Image.open(encoded_jpg_io) 58 | if image.format != 'JPEG': 59 | raise ValueError('Image format not JPEG') 60 | 61 | with tf.gfile.GFile(label_path, 'rb') as fid: 62 | encoded_label = fid.read() 63 | encoded_label_io = io.BytesIO(encoded_label) 64 | label = PIL.Image.open(encoded_label_io) 65 | if label.format != 'PNG': 66 | raise ValueError('Label format not PNG') 67 | 68 | if image.size != label.size: 69 | raise ValueError('The size of image does not match with that of label.') 70 | 71 | width, height = image.size 72 | 73 | example = tf.train.Example(features=tf.train.Features(feature={ 74 | 'image/height': dataset_util.int64_feature(height), 75 | 'image/width': dataset_util.int64_feature(width), 76 | 'image/encoded': dataset_util.bytes_feature(encoded_jpg), 77 | 'image/format': dataset_util.bytes_feature('jpeg'.encode('utf8')), 78 | 'label/encoded': dataset_util.bytes_feature(encoded_label), 79 | 'label/format': dataset_util.bytes_feature('png'.encode('utf8')), 80 | })) 81 | return example 82 | 83 | 84 | def create_tf_record(output_filename, 85 | image_dir, 86 | label_dir, 87 | examples): 88 | """Creates a TFRecord file from examples. 89 | 90 | Args: 91 | output_filename: Path to where output file is saved. 92 | image_dir: Directory where image files are stored. 93 | label_dir: Directory where label files are stored. 94 | examples: Examples to parse and save to tf record. 95 | """ 96 | writer = tf.python_io.TFRecordWriter(output_filename) 97 | for idx, example in enumerate(examples): 98 | if idx % 500 == 0: 99 | tf.logging.info('On image %d of %d', idx, len(examples)) 100 | image_path = os.path.join(image_dir, example + '.jpg') 101 | label_path = os.path.join(label_dir, example + '.png') 102 | 103 | if not os.path.exists(image_path): 104 | tf.logging.warning('Could not find %s, ignoring example.', image_path) 105 | continue 106 | elif not os.path.exists(label_path): 107 | tf.logging.warning('Could not find %s, ignoring example.', label_path) 108 | continue 109 | 110 | try: 111 | tf_example = dict_to_tf_example(image_path, label_path) 112 | writer.write(tf_example.SerializeToString()) 113 | except ValueError: 114 | tf.logging.warning('Invalid example: %s, ignoring.', example) 115 | 116 | writer.close() 117 | 118 | 119 | def main(unused_argv): 120 | if not os.path.exists(FLAGS.output_path): 121 | os.makedirs(FLAGS.output_path) 122 | 123 | tf.logging.info("Reading from VOC dataset") 124 | image_dir = os.path.join(FLAGS.data_dir, FLAGS.image_data_dir) 125 | label_dir = os.path.join(FLAGS.data_dir, FLAGS.label_data_dir) 126 | 127 | if not os.path.isdir(label_dir): 128 | raise ValueError("Missing Augmentation label directory. 
" 129 | "You may download the augmented labels from the link (Thanks to DrSleep): " 130 | "https://www.dropbox.com/s/oeu149j8qtbs1x0/SegmentationClassAug.zip") 131 | train_examples = dataset_util.read_examples_list(FLAGS.train_data_list) 132 | val_examples = dataset_util.read_examples_list(FLAGS.valid_data_list) 133 | 134 | train_output_path = os.path.join(FLAGS.output_path, 'train.record') 135 | val_output_path = os.path.join(FLAGS.output_path, 'val.record') 136 | 137 | create_tf_record(train_output_path, image_dir, label_dir, train_examples) 138 | create_tf_record(val_output_path, image_dir, label_dir, val_examples) 139 | 140 | 141 | if __name__ == '__main__': 142 | tf.logging.set_verbosity(tf.logging.INFO) 143 | FLAGS, unparsed = parser.parse_known_args() 144 | tf.app.run(main=main, argv=[sys.argv[0]] + unparsed) 145 | -------------------------------------------------------------------------------- /dataset/sample_images_list.txt: -------------------------------------------------------------------------------- 1 | 2007_000033.jpg 2 | 2007_000042.jpg 3 | 2007_000061.jpg 4 | 2007_000123.jpg 5 | 2007_000129.jpg 6 | 2007_000175.jpg 7 | 2007_000187.jpg 8 | 2007_000323.jpg -------------------------------------------------------------------------------- /dataset/test.txt: -------------------------------------------------------------------------------- 1 | 2008_000006 2 | 2008_000011 3 | 2008_000012 4 | 2008_000018 5 | 2008_000024 6 | 2008_000030 7 | 2008_000031 8 | 2008_000046 9 | 2008_000047 10 | 2008_000048 11 | 2008_000057 12 | 2008_000058 13 | 2008_000068 14 | 2008_000072 15 | 2008_000079 16 | 2008_000081 17 | 2008_000083 18 | 2008_000088 19 | 2008_000094 20 | 2008_000101 21 | 2008_000104 22 | 2008_000106 23 | 2008_000108 24 | 2008_000110 25 | 2008_000111 26 | 2008_000126 27 | 2008_000127 28 | 2008_000129 29 | 2008_000130 30 | 2008_000135 31 | 2008_000150 32 | 2008_000152 33 | 2008_000156 34 | 2008_000159 35 | 2008_000160 36 | 2008_000161 37 | 2008_000166 38 | 2008_000167 39 | 2008_000168 40 | 2008_000169 41 | 2008_000171 42 | 2008_000175 43 | 2008_000178 44 | 2008_000186 45 | 2008_000198 46 | 2008_000206 47 | 2008_000208 48 | 2008_000209 49 | 2008_000211 50 | 2008_000220 51 | 2008_000224 52 | 2008_000230 53 | 2008_000240 54 | 2008_000248 55 | 2008_000249 56 | 2008_000250 57 | 2008_000256 58 | 2008_000279 59 | 2008_000282 60 | 2008_000285 61 | 2008_000286 62 | 2008_000296 63 | 2008_000300 64 | 2008_000322 65 | 2008_000324 66 | 2008_000337 67 | 2008_000366 68 | 2008_000369 69 | 2008_000377 70 | 2008_000384 71 | 2008_000390 72 | 2008_000404 73 | 2008_000411 74 | 2008_000434 75 | 2008_000440 76 | 2008_000460 77 | 2008_000467 78 | 2008_000478 79 | 2008_000485 80 | 2008_000487 81 | 2008_000490 82 | 2008_000503 83 | 2008_000504 84 | 2008_000507 85 | 2008_000513 86 | 2008_000523 87 | 2008_000529 88 | 2008_000556 89 | 2008_000565 90 | 2008_000580 91 | 2008_000590 92 | 2008_000596 93 | 2008_000597 94 | 2008_000600 95 | 2008_000603 96 | 2008_000604 97 | 2008_000612 98 | 2008_000617 99 | 2008_000621 100 | 2008_000627 101 | 2008_000633 102 | 2008_000643 103 | 2008_000644 104 | 2008_000649 105 | 2008_000651 106 | 2008_000664 107 | 2008_000665 108 | 2008_000680 109 | 2008_000681 110 | 2008_000684 111 | 2008_000685 112 | 2008_000688 113 | 2008_000693 114 | 2008_000698 115 | 2008_000707 116 | 2008_000709 117 | 2008_000712 118 | 2008_000747 119 | 2008_000751 120 | 2008_000754 121 | 2008_000762 122 | 2008_000767 123 | 2008_000768 124 | 2008_000773 125 | 2008_000774 126 | 2008_000779 127 | 2008_000797 128 | 
2008_000813 129 | 2008_000816 130 | 2008_000846 131 | 2008_000866 132 | 2008_000871 133 | 2008_000872 134 | 2008_000891 135 | 2008_000892 136 | 2008_000894 137 | 2008_000896 138 | 2008_000898 139 | 2008_000909 140 | 2008_000913 141 | 2008_000920 142 | 2008_000933 143 | 2008_000935 144 | 2008_000937 145 | 2008_000938 146 | 2008_000954 147 | 2008_000958 148 | 2008_000963 149 | 2008_000967 150 | 2008_000974 151 | 2008_000986 152 | 2008_000994 153 | 2008_000995 154 | 2008_001008 155 | 2008_001010 156 | 2008_001014 157 | 2008_001016 158 | 2008_001025 159 | 2008_001029 160 | 2008_001037 161 | 2008_001059 162 | 2008_001061 163 | 2008_001072 164 | 2008_001124 165 | 2008_001126 166 | 2008_001131 167 | 2008_001138 168 | 2008_001144 169 | 2008_001151 170 | 2008_001156 171 | 2008_001179 172 | 2008_001181 173 | 2008_001184 174 | 2008_001186 175 | 2008_001197 176 | 2008_001207 177 | 2008_001212 178 | 2008_001233 179 | 2008_001234 180 | 2008_001258 181 | 2008_001268 182 | 2008_001279 183 | 2008_001281 184 | 2008_001288 185 | 2008_001291 186 | 2008_001298 187 | 2008_001309 188 | 2008_001315 189 | 2008_001316 190 | 2008_001319 191 | 2008_001327 192 | 2008_001328 193 | 2008_001332 194 | 2008_001341 195 | 2008_001347 196 | 2008_001355 197 | 2008_001378 198 | 2008_001386 199 | 2008_001400 200 | 2008_001409 201 | 2008_001411 202 | 2008_001416 203 | 2008_001418 204 | 2008_001435 205 | 2008_001459 206 | 2008_001469 207 | 2008_001474 208 | 2008_001477 209 | 2008_001483 210 | 2008_001484 211 | 2008_001485 212 | 2008_001496 213 | 2008_001507 214 | 2008_001511 215 | 2008_001519 216 | 2008_001557 217 | 2008_001567 218 | 2008_001570 219 | 2008_001571 220 | 2008_001572 221 | 2008_001579 222 | 2008_001587 223 | 2008_001608 224 | 2008_001611 225 | 2008_001614 226 | 2008_001621 227 | 2008_001639 228 | 2008_001658 229 | 2008_001678 230 | 2008_001700 231 | 2008_001713 232 | 2008_001720 233 | 2008_001755 234 | 2008_001779 235 | 2008_001785 236 | 2008_001793 237 | 2008_001794 238 | 2008_001803 239 | 2008_001818 240 | 2008_001848 241 | 2008_001855 242 | 2008_001857 243 | 2008_001861 244 | 2008_001875 245 | 2008_001878 246 | 2008_001886 247 | 2008_001897 248 | 2008_001916 249 | 2008_001925 250 | 2008_001949 251 | 2008_001953 252 | 2008_001972 253 | 2008_001999 254 | 2008_002027 255 | 2008_002040 256 | 2008_002057 257 | 2008_002070 258 | 2008_002075 259 | 2008_002095 260 | 2008_002104 261 | 2008_002105 262 | 2008_002106 263 | 2008_002136 264 | 2008_002137 265 | 2008_002147 266 | 2008_002149 267 | 2008_002163 268 | 2008_002173 269 | 2008_002174 270 | 2008_002184 271 | 2008_002186 272 | 2008_002188 273 | 2008_002190 274 | 2008_002203 275 | 2008_002211 276 | 2008_002217 277 | 2008_002228 278 | 2008_002233 279 | 2008_002246 280 | 2008_002257 281 | 2008_002261 282 | 2008_002285 283 | 2008_002287 284 | 2008_002295 285 | 2008_002303 286 | 2008_002306 287 | 2008_002309 288 | 2008_002310 289 | 2008_002318 290 | 2008_002320 291 | 2008_002332 292 | 2008_002337 293 | 2008_002345 294 | 2008_002348 295 | 2008_002352 296 | 2008_002360 297 | 2008_002381 298 | 2008_002387 299 | 2008_002388 300 | 2008_002393 301 | 2008_002406 302 | 2008_002440 303 | 2008_002455 304 | 2008_002460 305 | 2008_002462 306 | 2008_002480 307 | 2008_002518 308 | 2008_002525 309 | 2008_002535 310 | 2008_002544 311 | 2008_002553 312 | 2008_002569 313 | 2008_002572 314 | 2008_002587 315 | 2008_002635 316 | 2008_002655 317 | 2008_002695 318 | 2008_002702 319 | 2008_002706 320 | 2008_002707 321 | 2008_002722 322 | 2008_002745 323 | 2008_002757 324 | 2008_002779 325 | 
2008_002805 326 | 2008_002871 327 | 2008_002895 328 | 2008_002905 329 | 2008_002923 330 | 2008_002927 331 | 2008_002939 332 | 2008_002941 333 | 2008_002962 334 | 2008_002975 335 | 2008_003000 336 | 2008_003031 337 | 2008_003038 338 | 2008_003042 339 | 2008_003069 340 | 2008_003070 341 | 2008_003115 342 | 2008_003116 343 | 2008_003130 344 | 2008_003137 345 | 2008_003138 346 | 2008_003139 347 | 2008_003165 348 | 2008_003171 349 | 2008_003176 350 | 2008_003192 351 | 2008_003194 352 | 2008_003195 353 | 2008_003198 354 | 2008_003227 355 | 2008_003247 356 | 2008_003262 357 | 2008_003298 358 | 2008_003299 359 | 2008_003307 360 | 2008_003337 361 | 2008_003353 362 | 2008_003355 363 | 2008_003363 364 | 2008_003383 365 | 2008_003389 366 | 2008_003392 367 | 2008_003399 368 | 2008_003436 369 | 2008_003457 370 | 2008_003465 371 | 2008_003481 372 | 2008_003539 373 | 2008_003548 374 | 2008_003550 375 | 2008_003567 376 | 2008_003568 377 | 2008_003606 378 | 2008_003615 379 | 2008_003654 380 | 2008_003670 381 | 2008_003700 382 | 2008_003705 383 | 2008_003727 384 | 2008_003731 385 | 2008_003734 386 | 2008_003760 387 | 2008_003804 388 | 2008_003807 389 | 2008_003810 390 | 2008_003822 391 | 2008_003833 392 | 2008_003877 393 | 2008_003879 394 | 2008_003895 395 | 2008_003901 396 | 2008_003903 397 | 2008_003911 398 | 2008_003919 399 | 2008_003927 400 | 2008_003937 401 | 2008_003946 402 | 2008_003950 403 | 2008_003955 404 | 2008_003981 405 | 2008_003991 406 | 2008_004009 407 | 2008_004039 408 | 2008_004052 409 | 2008_004063 410 | 2008_004070 411 | 2008_004078 412 | 2008_004104 413 | 2008_004139 414 | 2008_004177 415 | 2008_004181 416 | 2008_004200 417 | 2008_004219 418 | 2008_004236 419 | 2008_004250 420 | 2008_004266 421 | 2008_004299 422 | 2008_004320 423 | 2008_004334 424 | 2008_004343 425 | 2008_004349 426 | 2008_004366 427 | 2008_004386 428 | 2008_004401 429 | 2008_004423 430 | 2008_004448 431 | 2008_004481 432 | 2008_004516 433 | 2008_004536 434 | 2008_004582 435 | 2008_004609 436 | 2008_004638 437 | 2008_004642 438 | 2008_004644 439 | 2008_004669 440 | 2008_004673 441 | 2008_004691 442 | 2008_004693 443 | 2008_004709 444 | 2008_004715 445 | 2008_004757 446 | 2008_004775 447 | 2008_004782 448 | 2008_004785 449 | 2008_004798 450 | 2008_004848 451 | 2008_004861 452 | 2008_004870 453 | 2008_004877 454 | 2008_004884 455 | 2008_004891 456 | 2008_004901 457 | 2008_004919 458 | 2008_005058 459 | 2008_005069 460 | 2008_005086 461 | 2008_005087 462 | 2008_005112 463 | 2008_005113 464 | 2008_005118 465 | 2008_005128 466 | 2008_005129 467 | 2008_005153 468 | 2008_005161 469 | 2008_005162 470 | 2008_005165 471 | 2008_005187 472 | 2008_005227 473 | 2008_005308 474 | 2008_005318 475 | 2008_005320 476 | 2008_005351 477 | 2008_005372 478 | 2008_005383 479 | 2008_005391 480 | 2008_005407 481 | 2008_005420 482 | 2008_005440 483 | 2008_005487 484 | 2008_005493 485 | 2008_005520 486 | 2008_005551 487 | 2008_005556 488 | 2008_005576 489 | 2008_005578 490 | 2008_005594 491 | 2008_005619 492 | 2008_005629 493 | 2008_005644 494 | 2008_005645 495 | 2008_005651 496 | 2008_005661 497 | 2008_005662 498 | 2008_005667 499 | 2008_005694 500 | 2008_005697 501 | 2008_005709 502 | 2008_005710 503 | 2008_005733 504 | 2008_005749 505 | 2008_005753 506 | 2008_005771 507 | 2008_005781 508 | 2008_005793 509 | 2008_005802 510 | 2008_005833 511 | 2008_005844 512 | 2008_005908 513 | 2008_005931 514 | 2008_005952 515 | 2008_006016 516 | 2008_006030 517 | 2008_006033 518 | 2008_006054 519 | 2008_006073 520 | 2008_006091 521 | 2008_006142 522 | 
2008_006150 523 | 2008_006206 524 | 2008_006217 525 | 2008_006264 526 | 2008_006283 527 | 2008_006308 528 | 2008_006313 529 | 2008_006333 530 | 2008_006343 531 | 2008_006381 532 | 2008_006391 533 | 2008_006423 534 | 2008_006428 535 | 2008_006440 536 | 2008_006444 537 | 2008_006473 538 | 2008_006505 539 | 2008_006531 540 | 2008_006560 541 | 2008_006571 542 | 2008_006582 543 | 2008_006594 544 | 2008_006601 545 | 2008_006633 546 | 2008_006653 547 | 2008_006678 548 | 2008_006755 549 | 2008_006772 550 | 2008_006788 551 | 2008_006799 552 | 2008_006809 553 | 2008_006838 554 | 2008_006845 555 | 2008_006852 556 | 2008_006894 557 | 2008_006905 558 | 2008_006947 559 | 2008_006983 560 | 2008_007049 561 | 2008_007065 562 | 2008_007068 563 | 2008_007111 564 | 2008_007148 565 | 2008_007159 566 | 2008_007193 567 | 2008_007228 568 | 2008_007235 569 | 2008_007249 570 | 2008_007255 571 | 2008_007268 572 | 2008_007275 573 | 2008_007292 574 | 2008_007299 575 | 2008_007306 576 | 2008_007316 577 | 2008_007400 578 | 2008_007401 579 | 2008_007419 580 | 2008_007437 581 | 2008_007483 582 | 2008_007487 583 | 2008_007520 584 | 2008_007551 585 | 2008_007603 586 | 2008_007616 587 | 2008_007654 588 | 2008_007663 589 | 2008_007708 590 | 2008_007795 591 | 2008_007801 592 | 2008_007859 593 | 2008_007903 594 | 2008_007920 595 | 2008_007926 596 | 2008_008014 597 | 2008_008017 598 | 2008_008060 599 | 2008_008077 600 | 2008_008107 601 | 2008_008108 602 | 2008_008119 603 | 2008_008126 604 | 2008_008133 605 | 2008_008144 606 | 2008_008216 607 | 2008_008244 608 | 2008_008248 609 | 2008_008250 610 | 2008_008260 611 | 2008_008277 612 | 2008_008280 613 | 2008_008290 614 | 2008_008304 615 | 2008_008340 616 | 2008_008371 617 | 2008_008390 618 | 2008_008397 619 | 2008_008409 620 | 2008_008412 621 | 2008_008419 622 | 2008_008454 623 | 2008_008491 624 | 2008_008498 625 | 2008_008565 626 | 2008_008599 627 | 2008_008603 628 | 2008_008631 629 | 2008_008634 630 | 2008_008640 631 | 2008_008646 632 | 2008_008660 633 | 2008_008663 634 | 2008_008664 635 | 2008_008709 636 | 2008_008720 637 | 2008_008747 638 | 2008_008768 639 | 2009_000004 640 | 2009_000019 641 | 2009_000024 642 | 2009_000025 643 | 2009_000053 644 | 2009_000076 645 | 2009_000107 646 | 2009_000110 647 | 2009_000115 648 | 2009_000117 649 | 2009_000175 650 | 2009_000220 651 | 2009_000259 652 | 2009_000275 653 | 2009_000314 654 | 2009_000368 655 | 2009_000373 656 | 2009_000384 657 | 2009_000388 658 | 2009_000423 659 | 2009_000433 660 | 2009_000434 661 | 2009_000458 662 | 2009_000475 663 | 2009_000481 664 | 2009_000495 665 | 2009_000514 666 | 2009_000555 667 | 2009_000556 668 | 2009_000561 669 | 2009_000571 670 | 2009_000581 671 | 2009_000605 672 | 2009_000609 673 | 2009_000644 674 | 2009_000654 675 | 2009_000671 676 | 2009_000733 677 | 2009_000740 678 | 2009_000766 679 | 2009_000775 680 | 2009_000776 681 | 2009_000795 682 | 2009_000850 683 | 2009_000881 684 | 2009_000900 685 | 2009_000914 686 | 2009_000941 687 | 2009_000977 688 | 2009_000984 689 | 2009_000986 690 | 2009_001005 691 | 2009_001015 692 | 2009_001058 693 | 2009_001072 694 | 2009_001087 695 | 2009_001092 696 | 2009_001109 697 | 2009_001114 698 | 2009_001115 699 | 2009_001141 700 | 2009_001174 701 | 2009_001175 702 | 2009_001182 703 | 2009_001222 704 | 2009_001228 705 | 2009_001246 706 | 2009_001262 707 | 2009_001274 708 | 2009_001284 709 | 2009_001297 710 | 2009_001331 711 | 2009_001336 712 | 2009_001337 713 | 2009_001379 714 | 2009_001392 715 | 2009_001451 716 | 2009_001485 717 | 2009_001488 718 | 2009_001497 719 | 
2009_001504 720 | 2009_001506 721 | 2009_001573 722 | 2009_001576 723 | 2009_001603 724 | 2009_001613 725 | 2009_001652 726 | 2009_001661 727 | 2009_001668 728 | 2009_001680 729 | 2009_001688 730 | 2009_001697 731 | 2009_001729 732 | 2009_001771 733 | 2009_001785 734 | 2009_001793 735 | 2009_001814 736 | 2009_001866 737 | 2009_001872 738 | 2009_001880 739 | 2009_001883 740 | 2009_001891 741 | 2009_001913 742 | 2009_001938 743 | 2009_001946 744 | 2009_001953 745 | 2009_001969 746 | 2009_001978 747 | 2009_001995 748 | 2009_002007 749 | 2009_002036 750 | 2009_002041 751 | 2009_002049 752 | 2009_002051 753 | 2009_002062 754 | 2009_002063 755 | 2009_002067 756 | 2009_002085 757 | 2009_002092 758 | 2009_002114 759 | 2009_002115 760 | 2009_002142 761 | 2009_002148 762 | 2009_002157 763 | 2009_002181 764 | 2009_002220 765 | 2009_002284 766 | 2009_002287 767 | 2009_002300 768 | 2009_002310 769 | 2009_002315 770 | 2009_002334 771 | 2009_002337 772 | 2009_002354 773 | 2009_002357 774 | 2009_002411 775 | 2009_002426 776 | 2009_002458 777 | 2009_002459 778 | 2009_002461 779 | 2009_002466 780 | 2009_002481 781 | 2009_002483 782 | 2009_002503 783 | 2009_002581 784 | 2009_002583 785 | 2009_002589 786 | 2009_002600 787 | 2009_002601 788 | 2009_002602 789 | 2009_002641 790 | 2009_002646 791 | 2009_002656 792 | 2009_002666 793 | 2009_002720 794 | 2009_002767 795 | 2009_002768 796 | 2009_002794 797 | 2009_002821 798 | 2009_002825 799 | 2009_002839 800 | 2009_002840 801 | 2009_002859 802 | 2009_002860 803 | 2009_002881 804 | 2009_002889 805 | 2009_002892 806 | 2009_002895 807 | 2009_002896 808 | 2009_002900 809 | 2009_002924 810 | 2009_002966 811 | 2009_002973 812 | 2009_002981 813 | 2009_003004 814 | 2009_003021 815 | 2009_003028 816 | 2009_003037 817 | 2009_003038 818 | 2009_003055 819 | 2009_003085 820 | 2009_003100 821 | 2009_003106 822 | 2009_003117 823 | 2009_003139 824 | 2009_003170 825 | 2009_003179 826 | 2009_003184 827 | 2009_003186 828 | 2009_003190 829 | 2009_003221 830 | 2009_003236 831 | 2009_003242 832 | 2009_003244 833 | 2009_003260 834 | 2009_003264 835 | 2009_003274 836 | 2009_003283 837 | 2009_003296 838 | 2009_003332 839 | 2009_003341 840 | 2009_003354 841 | 2009_003370 842 | 2009_003371 843 | 2009_003374 844 | 2009_003391 845 | 2009_003393 846 | 2009_003404 847 | 2009_003405 848 | 2009_003414 849 | 2009_003428 850 | 2009_003470 851 | 2009_003474 852 | 2009_003532 853 | 2009_003536 854 | 2009_003578 855 | 2009_003580 856 | 2009_003620 857 | 2009_003621 858 | 2009_003680 859 | 2009_003699 860 | 2009_003727 861 | 2009_003737 862 | 2009_003780 863 | 2009_003811 864 | 2009_003824 865 | 2009_003831 866 | 2009_003844 867 | 2009_003850 868 | 2009_003851 869 | 2009_003864 870 | 2009_003868 871 | 2009_003869 872 | 2009_003893 873 | 2009_003909 874 | 2009_003924 875 | 2009_003925 876 | 2009_003960 877 | 2009_003979 878 | 2009_003990 879 | 2009_003997 880 | 2009_004006 881 | 2009_004010 882 | 2009_004066 883 | 2009_004077 884 | 2009_004081 885 | 2009_004097 886 | 2009_004098 887 | 2009_004136 888 | 2009_004216 889 | 2009_004220 890 | 2009_004266 891 | 2009_004269 892 | 2009_004286 893 | 2009_004296 894 | 2009_004321 895 | 2009_004342 896 | 2009_004343 897 | 2009_004344 898 | 2009_004385 899 | 2009_004408 900 | 2009_004420 901 | 2009_004441 902 | 2009_004447 903 | 2009_004461 904 | 2009_004467 905 | 2009_004485 906 | 2009_004488 907 | 2009_004516 908 | 2009_004521 909 | 2009_004544 910 | 2009_004596 911 | 2009_004613 912 | 2009_004615 913 | 2009_004618 914 | 2009_004621 915 | 2009_004646 916 | 
2009_004659 917 | 2009_004663 918 | 2009_004666 919 | 2009_004691 920 | 2009_004715 921 | 2009_004726 922 | 2009_004753 923 | 2009_004776 924 | 2009_004811 925 | 2009_004814 926 | 2009_004818 927 | 2009_004835 928 | 2009_004863 929 | 2009_004894 930 | 2009_004909 931 | 2009_004928 932 | 2009_004937 933 | 2009_004954 934 | 2009_004966 935 | 2009_004970 936 | 2009_004976 937 | 2009_005004 938 | 2009_005011 939 | 2009_005053 940 | 2009_005072 941 | 2009_005115 942 | 2009_005146 943 | 2009_005151 944 | 2009_005164 945 | 2009_005179 946 | 2009_005224 947 | 2009_005243 948 | 2009_005249 949 | 2009_005252 950 | 2009_005254 951 | 2009_005258 952 | 2009_005264 953 | 2009_005266 954 | 2009_005276 955 | 2009_005290 956 | 2009_005295 957 | 2010_000004 958 | 2010_000005 959 | 2010_000006 960 | 2010_000032 961 | 2010_000062 962 | 2010_000093 963 | 2010_000094 964 | 2010_000161 965 | 2010_000176 966 | 2010_000223 967 | 2010_000226 968 | 2010_000236 969 | 2010_000239 970 | 2010_000287 971 | 2010_000300 972 | 2010_000301 973 | 2010_000328 974 | 2010_000378 975 | 2010_000405 976 | 2010_000407 977 | 2010_000472 978 | 2010_000479 979 | 2010_000491 980 | 2010_000533 981 | 2010_000535 982 | 2010_000542 983 | 2010_000554 984 | 2010_000580 985 | 2010_000594 986 | 2010_000596 987 | 2010_000599 988 | 2010_000606 989 | 2010_000615 990 | 2010_000654 991 | 2010_000659 992 | 2010_000693 993 | 2010_000698 994 | 2010_000730 995 | 2010_000734 996 | 2010_000741 997 | 2010_000755 998 | 2010_000768 999 | 2010_000794 1000 | 2010_000813 1001 | 2010_000817 1002 | 2010_000834 1003 | 2010_000839 1004 | 2010_000848 1005 | 2010_000881 1006 | 2010_000888 1007 | 2010_000900 1008 | 2010_000903 1009 | 2010_000924 1010 | 2010_000946 1011 | 2010_000953 1012 | 2010_000957 1013 | 2010_000967 1014 | 2010_000992 1015 | 2010_000998 1016 | 2010_001053 1017 | 2010_001067 1018 | 2010_001114 1019 | 2010_001132 1020 | 2010_001138 1021 | 2010_001169 1022 | 2010_001171 1023 | 2010_001228 1024 | 2010_001260 1025 | 2010_001268 1026 | 2010_001280 1027 | 2010_001298 1028 | 2010_001302 1029 | 2010_001308 1030 | 2010_001324 1031 | 2010_001332 1032 | 2010_001335 1033 | 2010_001345 1034 | 2010_001346 1035 | 2010_001349 1036 | 2010_001373 1037 | 2010_001381 1038 | 2010_001392 1039 | 2010_001396 1040 | 2010_001420 1041 | 2010_001500 1042 | 2010_001506 1043 | 2010_001521 1044 | 2010_001532 1045 | 2010_001558 1046 | 2010_001598 1047 | 2010_001611 1048 | 2010_001631 1049 | 2010_001639 1050 | 2010_001651 1051 | 2010_001663 1052 | 2010_001664 1053 | 2010_001728 1054 | 2010_001778 1055 | 2010_001861 1056 | 2010_001874 1057 | 2010_001900 1058 | 2010_001905 1059 | 2010_001969 1060 | 2010_002008 1061 | 2010_002014 1062 | 2010_002049 1063 | 2010_002052 1064 | 2010_002091 1065 | 2010_002115 1066 | 2010_002119 1067 | 2010_002134 1068 | 2010_002156 1069 | 2010_002160 1070 | 2010_002186 1071 | 2010_002210 1072 | 2010_002241 1073 | 2010_002252 1074 | 2010_002258 1075 | 2010_002262 1076 | 2010_002273 1077 | 2010_002290 1078 | 2010_002292 1079 | 2010_002347 1080 | 2010_002358 1081 | 2010_002360 1082 | 2010_002367 1083 | 2010_002416 1084 | 2010_002451 1085 | 2010_002481 1086 | 2010_002490 1087 | 2010_002495 1088 | 2010_002588 1089 | 2010_002607 1090 | 2010_002609 1091 | 2010_002610 1092 | 2010_002641 1093 | 2010_002685 1094 | 2010_002699 1095 | 2010_002719 1096 | 2010_002735 1097 | 2010_002751 1098 | 2010_002804 1099 | 2010_002835 1100 | 2010_002852 1101 | 2010_002885 1102 | 2010_002889 1103 | 2010_002904 1104 | 2010_002908 1105 | 2010_002916 1106 | 2010_002974 1107 | 
2010_002977 1108 | 2010_003005 1109 | 2010_003021 1110 | 2010_003030 1111 | 2010_003038 1112 | 2010_003046 1113 | 2010_003052 1114 | 2010_003089 1115 | 2010_003110 1116 | 2010_003118 1117 | 2010_003171 1118 | 2010_003217 1119 | 2010_003221 1120 | 2010_003228 1121 | 2010_003243 1122 | 2010_003271 1123 | 2010_003295 1124 | 2010_003306 1125 | 2010_003324 1126 | 2010_003363 1127 | 2010_003382 1128 | 2010_003388 1129 | 2010_003389 1130 | 2010_003392 1131 | 2010_003430 1132 | 2010_003442 1133 | 2010_003459 1134 | 2010_003485 1135 | 2010_003486 1136 | 2010_003500 1137 | 2010_003523 1138 | 2010_003542 1139 | 2010_003552 1140 | 2010_003570 1141 | 2010_003572 1142 | 2010_003586 1143 | 2010_003615 1144 | 2010_003623 1145 | 2010_003657 1146 | 2010_003666 1147 | 2010_003705 1148 | 2010_003710 1149 | 2010_003720 1150 | 2010_003733 1151 | 2010_003750 1152 | 2010_003767 1153 | 2010_003802 1154 | 2010_003809 1155 | 2010_003830 1156 | 2010_003832 1157 | 2010_003836 1158 | 2010_003838 1159 | 2010_003850 1160 | 2010_003867 1161 | 2010_003882 1162 | 2010_003909 1163 | 2010_003922 1164 | 2010_003923 1165 | 2010_003978 1166 | 2010_003989 1167 | 2010_003990 1168 | 2010_004000 1169 | 2010_004003 1170 | 2010_004068 1171 | 2010_004076 1172 | 2010_004117 1173 | 2010_004136 1174 | 2010_004142 1175 | 2010_004195 1176 | 2010_004200 1177 | 2010_004202 1178 | 2010_004232 1179 | 2010_004261 1180 | 2010_004266 1181 | 2010_004273 1182 | 2010_004305 1183 | 2010_004403 1184 | 2010_004433 1185 | 2010_004434 1186 | 2010_004435 1187 | 2010_004438 1188 | 2010_004442 1189 | 2010_004473 1190 | 2010_004482 1191 | 2010_004487 1192 | 2010_004489 1193 | 2010_004512 1194 | 2010_004525 1195 | 2010_004527 1196 | 2010_004532 1197 | 2010_004566 1198 | 2010_004568 1199 | 2010_004579 1200 | 2010_004611 1201 | 2010_004641 1202 | 2010_004688 1203 | 2010_004699 1204 | 2010_004702 1205 | 2010_004716 1206 | 2010_004754 1207 | 2010_004767 1208 | 2010_004776 1209 | 2010_004811 1210 | 2010_004837 1211 | 2010_004839 1212 | 2010_004845 1213 | 2010_004860 1214 | 2010_004867 1215 | 2010_004881 1216 | 2010_004939 1217 | 2010_005001 1218 | 2010_005047 1219 | 2010_005051 1220 | 2010_005091 1221 | 2010_005095 1222 | 2010_005125 1223 | 2010_005140 1224 | 2010_005177 1225 | 2010_005178 1226 | 2010_005194 1227 | 2010_005197 1228 | 2010_005200 1229 | 2010_005205 1230 | 2010_005212 1231 | 2010_005248 1232 | 2010_005294 1233 | 2010_005298 1234 | 2010_005313 1235 | 2010_005324 1236 | 2010_005328 1237 | 2010_005329 1238 | 2010_005380 1239 | 2010_005404 1240 | 2010_005407 1241 | 2010_005411 1242 | 2010_005423 1243 | 2010_005499 1244 | 2010_005509 1245 | 2010_005510 1246 | 2010_005544 1247 | 2010_005549 1248 | 2010_005590 1249 | 2010_005639 1250 | 2010_005699 1251 | 2010_005704 1252 | 2010_005707 1253 | 2010_005711 1254 | 2010_005726 1255 | 2010_005741 1256 | 2010_005765 1257 | 2010_005790 1258 | 2010_005792 1259 | 2010_005797 1260 | 2010_005812 1261 | 2010_005850 1262 | 2010_005861 1263 | 2010_005869 1264 | 2010_005908 1265 | 2010_005915 1266 | 2010_005946 1267 | 2010_005965 1268 | 2010_006044 1269 | 2010_006047 1270 | 2010_006052 1271 | 2010_006081 1272 | 2011_000001 1273 | 2011_000013 1274 | 2011_000014 1275 | 2011_000020 1276 | 2011_000032 1277 | 2011_000042 1278 | 2011_000063 1279 | 2011_000115 1280 | 2011_000120 1281 | 2011_000240 1282 | 2011_000244 1283 | 2011_000254 1284 | 2011_000261 1285 | 2011_000262 1286 | 2011_000271 1287 | 2011_000274 1288 | 2011_000306 1289 | 2011_000311 1290 | 2011_000316 1291 | 2011_000328 1292 | 2011_000351 1293 | 2011_000352 1294 | 
2011_000406 1295 | 2011_000414 1296 | 2011_000448 1297 | 2011_000451 1298 | 2011_000470 1299 | 2011_000473 1300 | 2011_000515 1301 | 2011_000537 1302 | 2011_000576 1303 | 2011_000603 1304 | 2011_000616 1305 | 2011_000636 1306 | 2011_000639 1307 | 2011_000654 1308 | 2011_000660 1309 | 2011_000664 1310 | 2011_000667 1311 | 2011_000670 1312 | 2011_000676 1313 | 2011_000721 1314 | 2011_000723 1315 | 2011_000762 1316 | 2011_000766 1317 | 2011_000786 1318 | 2011_000802 1319 | 2011_000810 1320 | 2011_000821 1321 | 2011_000841 1322 | 2011_000844 1323 | 2011_000846 1324 | 2011_000869 1325 | 2011_000890 1326 | 2011_000915 1327 | 2011_000924 1328 | 2011_000937 1329 | 2011_000939 1330 | 2011_000952 1331 | 2011_000968 1332 | 2011_000974 1333 | 2011_001037 1334 | 2011_001072 1335 | 2011_001085 1336 | 2011_001089 1337 | 2011_001090 1338 | 2011_001099 1339 | 2011_001104 1340 | 2011_001112 1341 | 2011_001120 1342 | 2011_001132 1343 | 2011_001151 1344 | 2011_001194 1345 | 2011_001258 1346 | 2011_001274 1347 | 2011_001314 1348 | 2011_001317 1349 | 2011_001321 1350 | 2011_001379 1351 | 2011_001425 1352 | 2011_001431 1353 | 2011_001443 1354 | 2011_001446 1355 | 2011_001452 1356 | 2011_001454 1357 | 2011_001477 1358 | 2011_001509 1359 | 2011_001512 1360 | 2011_001515 1361 | 2011_001528 1362 | 2011_001554 1363 | 2011_001561 1364 | 2011_001580 1365 | 2011_001587 1366 | 2011_001623 1367 | 2011_001648 1368 | 2011_001651 1369 | 2011_001654 1370 | 2011_001684 1371 | 2011_001696 1372 | 2011_001697 1373 | 2011_001760 1374 | 2011_001761 1375 | 2011_001798 1376 | 2011_001807 1377 | 2011_001851 1378 | 2011_001852 1379 | 2011_001853 1380 | 2011_001888 1381 | 2011_001940 1382 | 2011_002014 1383 | 2011_002028 1384 | 2011_002056 1385 | 2011_002061 1386 | 2011_002068 1387 | 2011_002076 1388 | 2011_002090 1389 | 2011_002095 1390 | 2011_002104 1391 | 2011_002136 1392 | 2011_002138 1393 | 2011_002151 1394 | 2011_002153 1395 | 2011_002155 1396 | 2011_002197 1397 | 2011_002198 1398 | 2011_002243 1399 | 2011_002250 1400 | 2011_002257 1401 | 2011_002262 1402 | 2011_002264 1403 | 2011_002296 1404 | 2011_002314 1405 | 2011_002331 1406 | 2011_002333 1407 | 2011_002411 1408 | 2011_002417 1409 | 2011_002425 1410 | 2011_002437 1411 | 2011_002444 1412 | 2011_002445 1413 | 2011_002449 1414 | 2011_002468 1415 | 2011_002469 1416 | 2011_002473 1417 | 2011_002508 1418 | 2011_002523 1419 | 2011_002534 1420 | 2011_002557 1421 | 2011_002564 1422 | 2011_002572 1423 | 2011_002597 1424 | 2011_002622 1425 | 2011_002632 1426 | 2011_002635 1427 | 2011_002643 1428 | 2011_002653 1429 | 2011_002667 1430 | 2011_002681 1431 | 2011_002707 1432 | 2011_002736 1433 | 2011_002759 1434 | 2011_002783 1435 | 2011_002792 1436 | 2011_002799 1437 | 2011_002824 1438 | 2011_002835 1439 | 2011_002866 1440 | 2011_002876 1441 | 2011_002888 1442 | 2011_002894 1443 | 2011_002903 1444 | 2011_002905 1445 | 2011_002986 1446 | 2011_003045 1447 | 2011_003064 1448 | 2011_003070 1449 | 2011_003083 1450 | 2011_003093 1451 | 2011_003096 1452 | 2011_003102 1453 | 2011_003156 1454 | 2011_003170 1455 | 2011_003178 1456 | 2011_003231 1457 | -------------------------------------------------------------------------------- /dataset/val.txt: -------------------------------------------------------------------------------- 1 | 2007_000033 2 | 2007_000042 3 | 2007_000061 4 | 2007_000123 5 | 2007_000129 6 | 2007_000175 7 | 2007_000187 8 | 2007_000323 9 | 2007_000332 10 | 2007_000346 11 | 2007_000452 12 | 2007_000464 13 | 2007_000491 14 | 2007_000529 15 | 2007_000559 16 | 2007_000572 17 | 
2007_000629 18 | 2007_000636 19 | 2007_000661 20 | 2007_000663 21 | 2007_000676 22 | 2007_000727 23 | 2007_000762 24 | 2007_000783 25 | 2007_000799 26 | 2007_000804 27 | 2007_000830 28 | 2007_000837 29 | 2007_000847 30 | 2007_000862 31 | 2007_000925 32 | 2007_000999 33 | 2007_001154 34 | 2007_001175 35 | 2007_001239 36 | 2007_001284 37 | 2007_001288 38 | 2007_001289 39 | 2007_001299 40 | 2007_001311 41 | 2007_001321 42 | 2007_001377 43 | 2007_001408 44 | 2007_001423 45 | 2007_001430 46 | 2007_001457 47 | 2007_001458 48 | 2007_001526 49 | 2007_001568 50 | 2007_001585 51 | 2007_001586 52 | 2007_001587 53 | 2007_001594 54 | 2007_001630 55 | 2007_001677 56 | 2007_001678 57 | 2007_001717 58 | 2007_001733 59 | 2007_001761 60 | 2007_001763 61 | 2007_001774 62 | 2007_001884 63 | 2007_001955 64 | 2007_002046 65 | 2007_002094 66 | 2007_002119 67 | 2007_002132 68 | 2007_002260 69 | 2007_002266 70 | 2007_002268 71 | 2007_002284 72 | 2007_002376 73 | 2007_002378 74 | 2007_002387 75 | 2007_002400 76 | 2007_002412 77 | 2007_002426 78 | 2007_002427 79 | 2007_002445 80 | 2007_002470 81 | 2007_002539 82 | 2007_002565 83 | 2007_002597 84 | 2007_002618 85 | 2007_002619 86 | 2007_002624 87 | 2007_002643 88 | 2007_002648 89 | 2007_002719 90 | 2007_002728 91 | 2007_002823 92 | 2007_002824 93 | 2007_002852 94 | 2007_002903 95 | 2007_003011 96 | 2007_003020 97 | 2007_003022 98 | 2007_003051 99 | 2007_003088 100 | 2007_003101 101 | 2007_003106 102 | 2007_003110 103 | 2007_003131 104 | 2007_003134 105 | 2007_003137 106 | 2007_003143 107 | 2007_003169 108 | 2007_003188 109 | 2007_003194 110 | 2007_003195 111 | 2007_003201 112 | 2007_003349 113 | 2007_003367 114 | 2007_003373 115 | 2007_003499 116 | 2007_003503 117 | 2007_003506 118 | 2007_003530 119 | 2007_003571 120 | 2007_003587 121 | 2007_003611 122 | 2007_003621 123 | 2007_003682 124 | 2007_003711 125 | 2007_003714 126 | 2007_003742 127 | 2007_003786 128 | 2007_003841 129 | 2007_003848 130 | 2007_003861 131 | 2007_003872 132 | 2007_003917 133 | 2007_003957 134 | 2007_003991 135 | 2007_004033 136 | 2007_004052 137 | 2007_004112 138 | 2007_004121 139 | 2007_004143 140 | 2007_004189 141 | 2007_004190 142 | 2007_004193 143 | 2007_004241 144 | 2007_004275 145 | 2007_004281 146 | 2007_004380 147 | 2007_004392 148 | 2007_004405 149 | 2007_004468 150 | 2007_004483 151 | 2007_004510 152 | 2007_004538 153 | 2007_004558 154 | 2007_004644 155 | 2007_004649 156 | 2007_004712 157 | 2007_004722 158 | 2007_004856 159 | 2007_004866 160 | 2007_004902 161 | 2007_004969 162 | 2007_005058 163 | 2007_005074 164 | 2007_005107 165 | 2007_005114 166 | 2007_005149 167 | 2007_005173 168 | 2007_005281 169 | 2007_005294 170 | 2007_005296 171 | 2007_005304 172 | 2007_005331 173 | 2007_005354 174 | 2007_005358 175 | 2007_005428 176 | 2007_005460 177 | 2007_005469 178 | 2007_005509 179 | 2007_005547 180 | 2007_005600 181 | 2007_005608 182 | 2007_005626 183 | 2007_005689 184 | 2007_005696 185 | 2007_005705 186 | 2007_005759 187 | 2007_005803 188 | 2007_005813 189 | 2007_005828 190 | 2007_005844 191 | 2007_005845 192 | 2007_005857 193 | 2007_005911 194 | 2007_005915 195 | 2007_005978 196 | 2007_006028 197 | 2007_006035 198 | 2007_006046 199 | 2007_006076 200 | 2007_006086 201 | 2007_006117 202 | 2007_006171 203 | 2007_006241 204 | 2007_006260 205 | 2007_006277 206 | 2007_006348 207 | 2007_006364 208 | 2007_006373 209 | 2007_006444 210 | 2007_006449 211 | 2007_006549 212 | 2007_006553 213 | 2007_006560 214 | 2007_006647 215 | 2007_006678 216 | 2007_006680 217 | 2007_006698 218 | 2007_006761 219 | 
2007_006802 220 | 2007_006837 221 | 2007_006841 222 | 2007_006864 223 | 2007_006866 224 | 2007_006946 225 | 2007_007007 226 | 2007_007084 227 | 2007_007109 228 | 2007_007130 229 | 2007_007165 230 | 2007_007168 231 | 2007_007195 232 | 2007_007196 233 | 2007_007203 234 | 2007_007211 235 | 2007_007235 236 | 2007_007341 237 | 2007_007414 238 | 2007_007417 239 | 2007_007470 240 | 2007_007477 241 | 2007_007493 242 | 2007_007498 243 | 2007_007524 244 | 2007_007534 245 | 2007_007624 246 | 2007_007651 247 | 2007_007688 248 | 2007_007748 249 | 2007_007795 250 | 2007_007810 251 | 2007_007815 252 | 2007_007818 253 | 2007_007836 254 | 2007_007849 255 | 2007_007881 256 | 2007_007996 257 | 2007_008051 258 | 2007_008084 259 | 2007_008106 260 | 2007_008110 261 | 2007_008204 262 | 2007_008222 263 | 2007_008256 264 | 2007_008260 265 | 2007_008339 266 | 2007_008374 267 | 2007_008415 268 | 2007_008430 269 | 2007_008543 270 | 2007_008547 271 | 2007_008596 272 | 2007_008645 273 | 2007_008670 274 | 2007_008708 275 | 2007_008722 276 | 2007_008747 277 | 2007_008802 278 | 2007_008815 279 | 2007_008897 280 | 2007_008944 281 | 2007_008964 282 | 2007_008973 283 | 2007_008980 284 | 2007_009015 285 | 2007_009068 286 | 2007_009084 287 | 2007_009088 288 | 2007_009096 289 | 2007_009221 290 | 2007_009245 291 | 2007_009251 292 | 2007_009252 293 | 2007_009258 294 | 2007_009320 295 | 2007_009323 296 | 2007_009331 297 | 2007_009346 298 | 2007_009392 299 | 2007_009413 300 | 2007_009419 301 | 2007_009446 302 | 2007_009458 303 | 2007_009521 304 | 2007_009562 305 | 2007_009592 306 | 2007_009654 307 | 2007_009655 308 | 2007_009684 309 | 2007_009687 310 | 2007_009691 311 | 2007_009706 312 | 2007_009750 313 | 2007_009756 314 | 2007_009764 315 | 2007_009794 316 | 2007_009817 317 | 2007_009841 318 | 2007_009897 319 | 2007_009911 320 | 2007_009923 321 | 2007_009938 322 | 2008_000009 323 | 2008_000016 324 | 2008_000073 325 | 2008_000075 326 | 2008_000080 327 | 2008_000107 328 | 2008_000120 329 | 2008_000123 330 | 2008_000149 331 | 2008_000182 332 | 2008_000213 333 | 2008_000215 334 | 2008_000223 335 | 2008_000233 336 | 2008_000234 337 | 2008_000239 338 | 2008_000254 339 | 2008_000270 340 | 2008_000271 341 | 2008_000345 342 | 2008_000359 343 | 2008_000391 344 | 2008_000401 345 | 2008_000464 346 | 2008_000469 347 | 2008_000474 348 | 2008_000501 349 | 2008_000510 350 | 2008_000533 351 | 2008_000573 352 | 2008_000589 353 | 2008_000602 354 | 2008_000630 355 | 2008_000657 356 | 2008_000661 357 | 2008_000662 358 | 2008_000666 359 | 2008_000673 360 | 2008_000700 361 | 2008_000725 362 | 2008_000731 363 | 2008_000763 364 | 2008_000765 365 | 2008_000782 366 | 2008_000795 367 | 2008_000811 368 | 2008_000848 369 | 2008_000853 370 | 2008_000863 371 | 2008_000911 372 | 2008_000919 373 | 2008_000943 374 | 2008_000992 375 | 2008_001013 376 | 2008_001028 377 | 2008_001040 378 | 2008_001070 379 | 2008_001074 380 | 2008_001076 381 | 2008_001078 382 | 2008_001135 383 | 2008_001150 384 | 2008_001170 385 | 2008_001231 386 | 2008_001249 387 | 2008_001260 388 | 2008_001283 389 | 2008_001308 390 | 2008_001379 391 | 2008_001404 392 | 2008_001433 393 | 2008_001439 394 | 2008_001478 395 | 2008_001491 396 | 2008_001504 397 | 2008_001513 398 | 2008_001514 399 | 2008_001531 400 | 2008_001546 401 | 2008_001547 402 | 2008_001580 403 | 2008_001629 404 | 2008_001640 405 | 2008_001682 406 | 2008_001688 407 | 2008_001715 408 | 2008_001821 409 | 2008_001874 410 | 2008_001885 411 | 2008_001895 412 | 2008_001966 413 | 2008_001971 414 | 2008_001992 415 | 2008_002043 416 | 
2008_002152 417 | 2008_002205 418 | 2008_002212 419 | 2008_002239 420 | 2008_002240 421 | 2008_002241 422 | 2008_002269 423 | 2008_002273 424 | 2008_002358 425 | 2008_002379 426 | 2008_002383 427 | 2008_002429 428 | 2008_002464 429 | 2008_002467 430 | 2008_002492 431 | 2008_002495 432 | 2008_002504 433 | 2008_002521 434 | 2008_002536 435 | 2008_002588 436 | 2008_002623 437 | 2008_002680 438 | 2008_002681 439 | 2008_002775 440 | 2008_002778 441 | 2008_002835 442 | 2008_002859 443 | 2008_002864 444 | 2008_002900 445 | 2008_002904 446 | 2008_002929 447 | 2008_002936 448 | 2008_002942 449 | 2008_002958 450 | 2008_003003 451 | 2008_003026 452 | 2008_003034 453 | 2008_003076 454 | 2008_003105 455 | 2008_003108 456 | 2008_003110 457 | 2008_003135 458 | 2008_003141 459 | 2008_003155 460 | 2008_003210 461 | 2008_003238 462 | 2008_003270 463 | 2008_003330 464 | 2008_003333 465 | 2008_003369 466 | 2008_003379 467 | 2008_003451 468 | 2008_003461 469 | 2008_003477 470 | 2008_003492 471 | 2008_003499 472 | 2008_003511 473 | 2008_003546 474 | 2008_003576 475 | 2008_003577 476 | 2008_003676 477 | 2008_003709 478 | 2008_003733 479 | 2008_003777 480 | 2008_003782 481 | 2008_003821 482 | 2008_003846 483 | 2008_003856 484 | 2008_003858 485 | 2008_003874 486 | 2008_003876 487 | 2008_003885 488 | 2008_003886 489 | 2008_003926 490 | 2008_003976 491 | 2008_004069 492 | 2008_004101 493 | 2008_004140 494 | 2008_004172 495 | 2008_004175 496 | 2008_004212 497 | 2008_004279 498 | 2008_004339 499 | 2008_004345 500 | 2008_004363 501 | 2008_004367 502 | 2008_004396 503 | 2008_004399 504 | 2008_004453 505 | 2008_004477 506 | 2008_004552 507 | 2008_004562 508 | 2008_004575 509 | 2008_004610 510 | 2008_004612 511 | 2008_004621 512 | 2008_004624 513 | 2008_004654 514 | 2008_004659 515 | 2008_004687 516 | 2008_004701 517 | 2008_004704 518 | 2008_004705 519 | 2008_004754 520 | 2008_004758 521 | 2008_004854 522 | 2008_004910 523 | 2008_004995 524 | 2008_005049 525 | 2008_005089 526 | 2008_005097 527 | 2008_005105 528 | 2008_005145 529 | 2008_005197 530 | 2008_005217 531 | 2008_005242 532 | 2008_005245 533 | 2008_005254 534 | 2008_005262 535 | 2008_005338 536 | 2008_005398 537 | 2008_005399 538 | 2008_005422 539 | 2008_005439 540 | 2008_005445 541 | 2008_005525 542 | 2008_005544 543 | 2008_005628 544 | 2008_005633 545 | 2008_005637 546 | 2008_005642 547 | 2008_005676 548 | 2008_005680 549 | 2008_005691 550 | 2008_005727 551 | 2008_005738 552 | 2008_005812 553 | 2008_005904 554 | 2008_005915 555 | 2008_006008 556 | 2008_006036 557 | 2008_006055 558 | 2008_006063 559 | 2008_006108 560 | 2008_006130 561 | 2008_006143 562 | 2008_006159 563 | 2008_006216 564 | 2008_006219 565 | 2008_006229 566 | 2008_006254 567 | 2008_006275 568 | 2008_006325 569 | 2008_006327 570 | 2008_006341 571 | 2008_006408 572 | 2008_006480 573 | 2008_006523 574 | 2008_006526 575 | 2008_006528 576 | 2008_006553 577 | 2008_006554 578 | 2008_006703 579 | 2008_006722 580 | 2008_006752 581 | 2008_006784 582 | 2008_006835 583 | 2008_006874 584 | 2008_006981 585 | 2008_006986 586 | 2008_007025 587 | 2008_007031 588 | 2008_007048 589 | 2008_007120 590 | 2008_007123 591 | 2008_007143 592 | 2008_007194 593 | 2008_007219 594 | 2008_007273 595 | 2008_007350 596 | 2008_007378 597 | 2008_007392 598 | 2008_007402 599 | 2008_007497 600 | 2008_007498 601 | 2008_007507 602 | 2008_007513 603 | 2008_007527 604 | 2008_007548 605 | 2008_007596 606 | 2008_007677 607 | 2008_007737 608 | 2008_007797 609 | 2008_007804 610 | 2008_007811 611 | 2008_007814 612 | 2008_007828 613 | 
2008_007836 614 | 2008_007945 615 | 2008_007994 616 | 2008_008051 617 | 2008_008103 618 | 2008_008127 619 | 2008_008221 620 | 2008_008252 621 | 2008_008268 622 | 2008_008296 623 | 2008_008301 624 | 2008_008335 625 | 2008_008362 626 | 2008_008392 627 | 2008_008393 628 | 2008_008421 629 | 2008_008434 630 | 2008_008469 631 | 2008_008629 632 | 2008_008682 633 | 2008_008711 634 | 2008_008746 635 | 2009_000012 636 | 2009_000013 637 | 2009_000022 638 | 2009_000032 639 | 2009_000037 640 | 2009_000039 641 | 2009_000074 642 | 2009_000080 643 | 2009_000087 644 | 2009_000096 645 | 2009_000121 646 | 2009_000136 647 | 2009_000149 648 | 2009_000156 649 | 2009_000201 650 | 2009_000205 651 | 2009_000219 652 | 2009_000242 653 | 2009_000309 654 | 2009_000318 655 | 2009_000335 656 | 2009_000351 657 | 2009_000354 658 | 2009_000387 659 | 2009_000391 660 | 2009_000412 661 | 2009_000418 662 | 2009_000421 663 | 2009_000426 664 | 2009_000440 665 | 2009_000446 666 | 2009_000455 667 | 2009_000457 668 | 2009_000469 669 | 2009_000487 670 | 2009_000488 671 | 2009_000523 672 | 2009_000573 673 | 2009_000619 674 | 2009_000628 675 | 2009_000641 676 | 2009_000664 677 | 2009_000675 678 | 2009_000704 679 | 2009_000705 680 | 2009_000712 681 | 2009_000716 682 | 2009_000723 683 | 2009_000727 684 | 2009_000730 685 | 2009_000731 686 | 2009_000732 687 | 2009_000771 688 | 2009_000825 689 | 2009_000828 690 | 2009_000839 691 | 2009_000840 692 | 2009_000845 693 | 2009_000879 694 | 2009_000892 695 | 2009_000919 696 | 2009_000924 697 | 2009_000931 698 | 2009_000935 699 | 2009_000964 700 | 2009_000989 701 | 2009_000991 702 | 2009_000998 703 | 2009_001008 704 | 2009_001082 705 | 2009_001108 706 | 2009_001160 707 | 2009_001215 708 | 2009_001240 709 | 2009_001255 710 | 2009_001278 711 | 2009_001299 712 | 2009_001300 713 | 2009_001314 714 | 2009_001332 715 | 2009_001333 716 | 2009_001363 717 | 2009_001391 718 | 2009_001411 719 | 2009_001433 720 | 2009_001505 721 | 2009_001535 722 | 2009_001536 723 | 2009_001565 724 | 2009_001607 725 | 2009_001644 726 | 2009_001663 727 | 2009_001683 728 | 2009_001684 729 | 2009_001687 730 | 2009_001718 731 | 2009_001731 732 | 2009_001765 733 | 2009_001768 734 | 2009_001775 735 | 2009_001804 736 | 2009_001816 737 | 2009_001818 738 | 2009_001850 739 | 2009_001851 740 | 2009_001854 741 | 2009_001941 742 | 2009_001991 743 | 2009_002012 744 | 2009_002035 745 | 2009_002042 746 | 2009_002082 747 | 2009_002094 748 | 2009_002097 749 | 2009_002122 750 | 2009_002150 751 | 2009_002155 752 | 2009_002164 753 | 2009_002165 754 | 2009_002171 755 | 2009_002185 756 | 2009_002202 757 | 2009_002221 758 | 2009_002238 759 | 2009_002239 760 | 2009_002265 761 | 2009_002268 762 | 2009_002291 763 | 2009_002295 764 | 2009_002317 765 | 2009_002320 766 | 2009_002346 767 | 2009_002366 768 | 2009_002372 769 | 2009_002382 770 | 2009_002390 771 | 2009_002415 772 | 2009_002445 773 | 2009_002487 774 | 2009_002521 775 | 2009_002527 776 | 2009_002535 777 | 2009_002539 778 | 2009_002549 779 | 2009_002562 780 | 2009_002568 781 | 2009_002571 782 | 2009_002573 783 | 2009_002584 784 | 2009_002591 785 | 2009_002594 786 | 2009_002604 787 | 2009_002618 788 | 2009_002635 789 | 2009_002638 790 | 2009_002649 791 | 2009_002651 792 | 2009_002727 793 | 2009_002732 794 | 2009_002749 795 | 2009_002753 796 | 2009_002771 797 | 2009_002808 798 | 2009_002856 799 | 2009_002887 800 | 2009_002888 801 | 2009_002928 802 | 2009_002936 803 | 2009_002975 804 | 2009_002982 805 | 2009_002990 806 | 2009_003003 807 | 2009_003005 808 | 2009_003043 809 | 2009_003059 810 | 
2009_003063 811 | 2009_003065 812 | 2009_003071 813 | 2009_003080 814 | 2009_003105 815 | 2009_003123 816 | 2009_003193 817 | 2009_003196 818 | 2009_003217 819 | 2009_003224 820 | 2009_003241 821 | 2009_003269 822 | 2009_003273 823 | 2009_003299 824 | 2009_003304 825 | 2009_003311 826 | 2009_003323 827 | 2009_003343 828 | 2009_003378 829 | 2009_003387 830 | 2009_003406 831 | 2009_003433 832 | 2009_003450 833 | 2009_003466 834 | 2009_003481 835 | 2009_003494 836 | 2009_003498 837 | 2009_003504 838 | 2009_003507 839 | 2009_003517 840 | 2009_003523 841 | 2009_003542 842 | 2009_003549 843 | 2009_003551 844 | 2009_003564 845 | 2009_003569 846 | 2009_003576 847 | 2009_003589 848 | 2009_003607 849 | 2009_003640 850 | 2009_003666 851 | 2009_003696 852 | 2009_003703 853 | 2009_003707 854 | 2009_003756 855 | 2009_003771 856 | 2009_003773 857 | 2009_003804 858 | 2009_003806 859 | 2009_003810 860 | 2009_003849 861 | 2009_003857 862 | 2009_003858 863 | 2009_003895 864 | 2009_003903 865 | 2009_003904 866 | 2009_003928 867 | 2009_003938 868 | 2009_003971 869 | 2009_003991 870 | 2009_004021 871 | 2009_004033 872 | 2009_004043 873 | 2009_004070 874 | 2009_004072 875 | 2009_004084 876 | 2009_004099 877 | 2009_004125 878 | 2009_004140 879 | 2009_004217 880 | 2009_004221 881 | 2009_004247 882 | 2009_004248 883 | 2009_004255 884 | 2009_004298 885 | 2009_004324 886 | 2009_004455 887 | 2009_004494 888 | 2009_004497 889 | 2009_004504 890 | 2009_004507 891 | 2009_004509 892 | 2009_004540 893 | 2009_004568 894 | 2009_004579 895 | 2009_004581 896 | 2009_004590 897 | 2009_004592 898 | 2009_004594 899 | 2009_004635 900 | 2009_004653 901 | 2009_004687 902 | 2009_004721 903 | 2009_004730 904 | 2009_004732 905 | 2009_004738 906 | 2009_004748 907 | 2009_004789 908 | 2009_004799 909 | 2009_004801 910 | 2009_004848 911 | 2009_004859 912 | 2009_004867 913 | 2009_004882 914 | 2009_004886 915 | 2009_004895 916 | 2009_004942 917 | 2009_004969 918 | 2009_004987 919 | 2009_004993 920 | 2009_004994 921 | 2009_005038 922 | 2009_005078 923 | 2009_005087 924 | 2009_005089 925 | 2009_005137 926 | 2009_005148 927 | 2009_005156 928 | 2009_005158 929 | 2009_005189 930 | 2009_005190 931 | 2009_005217 932 | 2009_005219 933 | 2009_005220 934 | 2009_005231 935 | 2009_005260 936 | 2009_005262 937 | 2009_005302 938 | 2010_000003 939 | 2010_000038 940 | 2010_000065 941 | 2010_000083 942 | 2010_000084 943 | 2010_000087 944 | 2010_000110 945 | 2010_000159 946 | 2010_000160 947 | 2010_000163 948 | 2010_000174 949 | 2010_000216 950 | 2010_000238 951 | 2010_000241 952 | 2010_000256 953 | 2010_000272 954 | 2010_000284 955 | 2010_000309 956 | 2010_000318 957 | 2010_000330 958 | 2010_000335 959 | 2010_000342 960 | 2010_000372 961 | 2010_000422 962 | 2010_000426 963 | 2010_000427 964 | 2010_000502 965 | 2010_000530 966 | 2010_000552 967 | 2010_000559 968 | 2010_000572 969 | 2010_000573 970 | 2010_000622 971 | 2010_000628 972 | 2010_000639 973 | 2010_000666 974 | 2010_000679 975 | 2010_000682 976 | 2010_000683 977 | 2010_000724 978 | 2010_000738 979 | 2010_000764 980 | 2010_000788 981 | 2010_000814 982 | 2010_000836 983 | 2010_000874 984 | 2010_000904 985 | 2010_000906 986 | 2010_000907 987 | 2010_000918 988 | 2010_000929 989 | 2010_000941 990 | 2010_000952 991 | 2010_000961 992 | 2010_001000 993 | 2010_001010 994 | 2010_001011 995 | 2010_001016 996 | 2010_001017 997 | 2010_001024 998 | 2010_001036 999 | 2010_001061 1000 | 2010_001069 1001 | 2010_001070 1002 | 2010_001079 1003 | 2010_001104 1004 | 2010_001124 1005 | 2010_001149 1006 | 2010_001151 1007 | 
2010_001174 1008 | 2010_001206 1009 | 2010_001246 1010 | 2010_001251 1011 | 2010_001256 1012 | 2010_001264 1013 | 2010_001292 1014 | 2010_001313 1015 | 2010_001327 1016 | 2010_001331 1017 | 2010_001351 1018 | 2010_001367 1019 | 2010_001376 1020 | 2010_001403 1021 | 2010_001448 1022 | 2010_001451 1023 | 2010_001522 1024 | 2010_001534 1025 | 2010_001553 1026 | 2010_001557 1027 | 2010_001563 1028 | 2010_001577 1029 | 2010_001579 1030 | 2010_001646 1031 | 2010_001656 1032 | 2010_001692 1033 | 2010_001699 1034 | 2010_001734 1035 | 2010_001752 1036 | 2010_001767 1037 | 2010_001768 1038 | 2010_001773 1039 | 2010_001820 1040 | 2010_001830 1041 | 2010_001851 1042 | 2010_001908 1043 | 2010_001913 1044 | 2010_001951 1045 | 2010_001956 1046 | 2010_001962 1047 | 2010_001966 1048 | 2010_001995 1049 | 2010_002017 1050 | 2010_002025 1051 | 2010_002030 1052 | 2010_002106 1053 | 2010_002137 1054 | 2010_002142 1055 | 2010_002146 1056 | 2010_002147 1057 | 2010_002150 1058 | 2010_002161 1059 | 2010_002200 1060 | 2010_002228 1061 | 2010_002232 1062 | 2010_002251 1063 | 2010_002271 1064 | 2010_002305 1065 | 2010_002310 1066 | 2010_002336 1067 | 2010_002348 1068 | 2010_002361 1069 | 2010_002390 1070 | 2010_002396 1071 | 2010_002422 1072 | 2010_002450 1073 | 2010_002480 1074 | 2010_002512 1075 | 2010_002531 1076 | 2010_002536 1077 | 2010_002538 1078 | 2010_002546 1079 | 2010_002623 1080 | 2010_002682 1081 | 2010_002691 1082 | 2010_002693 1083 | 2010_002701 1084 | 2010_002763 1085 | 2010_002792 1086 | 2010_002868 1087 | 2010_002900 1088 | 2010_002902 1089 | 2010_002921 1090 | 2010_002929 1091 | 2010_002939 1092 | 2010_002988 1093 | 2010_003014 1094 | 2010_003060 1095 | 2010_003123 1096 | 2010_003127 1097 | 2010_003132 1098 | 2010_003168 1099 | 2010_003183 1100 | 2010_003187 1101 | 2010_003207 1102 | 2010_003231 1103 | 2010_003239 1104 | 2010_003275 1105 | 2010_003276 1106 | 2010_003293 1107 | 2010_003302 1108 | 2010_003325 1109 | 2010_003362 1110 | 2010_003365 1111 | 2010_003381 1112 | 2010_003402 1113 | 2010_003409 1114 | 2010_003418 1115 | 2010_003446 1116 | 2010_003453 1117 | 2010_003468 1118 | 2010_003473 1119 | 2010_003495 1120 | 2010_003506 1121 | 2010_003514 1122 | 2010_003531 1123 | 2010_003532 1124 | 2010_003541 1125 | 2010_003547 1126 | 2010_003597 1127 | 2010_003675 1128 | 2010_003708 1129 | 2010_003716 1130 | 2010_003746 1131 | 2010_003758 1132 | 2010_003764 1133 | 2010_003768 1134 | 2010_003771 1135 | 2010_003772 1136 | 2010_003781 1137 | 2010_003813 1138 | 2010_003820 1139 | 2010_003854 1140 | 2010_003912 1141 | 2010_003915 1142 | 2010_003947 1143 | 2010_003956 1144 | 2010_003971 1145 | 2010_004041 1146 | 2010_004042 1147 | 2010_004056 1148 | 2010_004063 1149 | 2010_004104 1150 | 2010_004120 1151 | 2010_004149 1152 | 2010_004165 1153 | 2010_004208 1154 | 2010_004219 1155 | 2010_004226 1156 | 2010_004314 1157 | 2010_004320 1158 | 2010_004322 1159 | 2010_004337 1160 | 2010_004348 1161 | 2010_004355 1162 | 2010_004369 1163 | 2010_004382 1164 | 2010_004419 1165 | 2010_004432 1166 | 2010_004472 1167 | 2010_004479 1168 | 2010_004519 1169 | 2010_004520 1170 | 2010_004529 1171 | 2010_004543 1172 | 2010_004550 1173 | 2010_004551 1174 | 2010_004556 1175 | 2010_004559 1176 | 2010_004628 1177 | 2010_004635 1178 | 2010_004662 1179 | 2010_004697 1180 | 2010_004757 1181 | 2010_004763 1182 | 2010_004772 1183 | 2010_004783 1184 | 2010_004789 1185 | 2010_004795 1186 | 2010_004815 1187 | 2010_004825 1188 | 2010_004828 1189 | 2010_004856 1190 | 2010_004857 1191 | 2010_004861 1192 | 2010_004941 1193 | 2010_004946 1194 | 
2010_004951 1195 | 2010_004980 1196 | 2010_004994 1197 | 2010_005013 1198 | 2010_005021 1199 | 2010_005046 1200 | 2010_005063 1201 | 2010_005108 1202 | 2010_005118 1203 | 2010_005159 1204 | 2010_005160 1205 | 2010_005166 1206 | 2010_005174 1207 | 2010_005180 1208 | 2010_005187 1209 | 2010_005206 1210 | 2010_005245 1211 | 2010_005252 1212 | 2010_005284 1213 | 2010_005305 1214 | 2010_005344 1215 | 2010_005353 1216 | 2010_005366 1217 | 2010_005401 1218 | 2010_005421 1219 | 2010_005428 1220 | 2010_005432 1221 | 2010_005433 1222 | 2010_005496 1223 | 2010_005501 1224 | 2010_005508 1225 | 2010_005531 1226 | 2010_005534 1227 | 2010_005575 1228 | 2010_005582 1229 | 2010_005606 1230 | 2010_005626 1231 | 2010_005644 1232 | 2010_005664 1233 | 2010_005705 1234 | 2010_005706 1235 | 2010_005709 1236 | 2010_005718 1237 | 2010_005719 1238 | 2010_005727 1239 | 2010_005762 1240 | 2010_005788 1241 | 2010_005860 1242 | 2010_005871 1243 | 2010_005877 1244 | 2010_005888 1245 | 2010_005899 1246 | 2010_005922 1247 | 2010_005991 1248 | 2010_005992 1249 | 2010_006026 1250 | 2010_006034 1251 | 2010_006054 1252 | 2010_006070 1253 | 2011_000045 1254 | 2011_000051 1255 | 2011_000054 1256 | 2011_000066 1257 | 2011_000070 1258 | 2011_000112 1259 | 2011_000173 1260 | 2011_000178 1261 | 2011_000185 1262 | 2011_000226 1263 | 2011_000234 1264 | 2011_000238 1265 | 2011_000239 1266 | 2011_000248 1267 | 2011_000283 1268 | 2011_000291 1269 | 2011_000310 1270 | 2011_000312 1271 | 2011_000338 1272 | 2011_000396 1273 | 2011_000412 1274 | 2011_000419 1275 | 2011_000435 1276 | 2011_000436 1277 | 2011_000438 1278 | 2011_000455 1279 | 2011_000456 1280 | 2011_000479 1281 | 2011_000481 1282 | 2011_000482 1283 | 2011_000503 1284 | 2011_000512 1285 | 2011_000521 1286 | 2011_000526 1287 | 2011_000536 1288 | 2011_000548 1289 | 2011_000566 1290 | 2011_000585 1291 | 2011_000598 1292 | 2011_000607 1293 | 2011_000618 1294 | 2011_000638 1295 | 2011_000658 1296 | 2011_000661 1297 | 2011_000669 1298 | 2011_000747 1299 | 2011_000780 1300 | 2011_000789 1301 | 2011_000807 1302 | 2011_000809 1303 | 2011_000813 1304 | 2011_000830 1305 | 2011_000843 1306 | 2011_000874 1307 | 2011_000888 1308 | 2011_000900 1309 | 2011_000912 1310 | 2011_000953 1311 | 2011_000969 1312 | 2011_001005 1313 | 2011_001014 1314 | 2011_001020 1315 | 2011_001047 1316 | 2011_001060 1317 | 2011_001064 1318 | 2011_001069 1319 | 2011_001071 1320 | 2011_001082 1321 | 2011_001110 1322 | 2011_001114 1323 | 2011_001159 1324 | 2011_001161 1325 | 2011_001190 1326 | 2011_001232 1327 | 2011_001263 1328 | 2011_001276 1329 | 2011_001281 1330 | 2011_001287 1331 | 2011_001292 1332 | 2011_001313 1333 | 2011_001341 1334 | 2011_001346 1335 | 2011_001350 1336 | 2011_001407 1337 | 2011_001416 1338 | 2011_001421 1339 | 2011_001434 1340 | 2011_001447 1341 | 2011_001489 1342 | 2011_001529 1343 | 2011_001530 1344 | 2011_001534 1345 | 2011_001546 1346 | 2011_001567 1347 | 2011_001589 1348 | 2011_001597 1349 | 2011_001601 1350 | 2011_001607 1351 | 2011_001613 1352 | 2011_001614 1353 | 2011_001619 1354 | 2011_001624 1355 | 2011_001642 1356 | 2011_001665 1357 | 2011_001669 1358 | 2011_001674 1359 | 2011_001708 1360 | 2011_001713 1361 | 2011_001714 1362 | 2011_001722 1363 | 2011_001726 1364 | 2011_001745 1365 | 2011_001748 1366 | 2011_001775 1367 | 2011_001782 1368 | 2011_001793 1369 | 2011_001794 1370 | 2011_001812 1371 | 2011_001862 1372 | 2011_001863 1373 | 2011_001868 1374 | 2011_001880 1375 | 2011_001910 1376 | 2011_001984 1377 | 2011_001988 1378 | 2011_002002 1379 | 2011_002040 1380 | 2011_002041 1381 | 
2011_002064 1382 | 2011_002075 1383 | 2011_002098 1384 | 2011_002110 1385 | 2011_002121 1386 | 2011_002124 1387 | 2011_002150 1388 | 2011_002156 1389 | 2011_002178 1390 | 2011_002200 1391 | 2011_002223 1392 | 2011_002244 1393 | 2011_002247 1394 | 2011_002279 1395 | 2011_002295 1396 | 2011_002298 1397 | 2011_002308 1398 | 2011_002317 1399 | 2011_002322 1400 | 2011_002327 1401 | 2011_002343 1402 | 2011_002358 1403 | 2011_002371 1404 | 2011_002379 1405 | 2011_002391 1406 | 2011_002498 1407 | 2011_002509 1408 | 2011_002515 1409 | 2011_002532 1410 | 2011_002535 1411 | 2011_002548 1412 | 2011_002575 1413 | 2011_002578 1414 | 2011_002589 1415 | 2011_002592 1416 | 2011_002623 1417 | 2011_002641 1418 | 2011_002644 1419 | 2011_002662 1420 | 2011_002675 1421 | 2011_002685 1422 | 2011_002713 1423 | 2011_002730 1424 | 2011_002754 1425 | 2011_002812 1426 | 2011_002863 1427 | 2011_002879 1428 | 2011_002885 1429 | 2011_002929 1430 | 2011_002951 1431 | 2011_002975 1432 | 2011_002993 1433 | 2011_002997 1434 | 2011_003003 1435 | 2011_003011 1436 | 2011_003019 1437 | 2011_003030 1438 | 2011_003055 1439 | 2011_003085 1440 | 2011_003103 1441 | 2011_003114 1442 | 2011_003145 1443 | 2011_003146 1444 | 2011_003182 1445 | 2011_003197 1446 | 2011_003205 1447 | 2011_003240 1448 | 2011_003256 1449 | 2011_003271 1450 | -------------------------------------------------------------------------------- /deeplab_model.py: -------------------------------------------------------------------------------- 1 | """DeepLab v3 models based on slim library.""" 2 | 3 | from __future__ import absolute_import 4 | from __future__ import division 5 | from __future__ import print_function 6 | 7 | import tensorflow as tf 8 | 9 | from tensorflow.contrib.slim.nets import resnet_v2 10 | from tensorflow.contrib import layers as layers_lib 11 | from tensorflow.contrib.framework.python.ops import arg_scope 12 | from tensorflow.contrib.layers.python.layers import layers 13 | 14 | from utils import preprocessing 15 | 16 | _BATCH_NORM_DECAY = 0.9997 17 | _WEIGHT_DECAY = 5e-4 18 | 19 | 20 | def atrous_spatial_pyramid_pooling(inputs, output_stride, batch_norm_decay, is_training, depth=256): 21 | """Atrous Spatial Pyramid Pooling. 22 | 23 | Args: 24 | inputs: A tensor of size [batch, height, width, channels]. 25 | output_stride: The ResNet unit's stride. Determines the rates for atrous convolution. 26 | the rates are (6, 12, 18) when the stride is 16, and doubled when 8. 27 | batch_norm_decay: The moving average decay when estimating layer activation 28 | statistics in batch normalization. 29 | is_training: A boolean denoting whether the input is for training. 30 | depth: The depth of the ResNet unit output. 31 | 32 | Returns: 33 | The atrous spatial pyramid pooling output. 34 | """ 35 | with tf.variable_scope("aspp"): 36 | if output_stride not in [8, 16]: 37 | raise ValueError('output_stride must be either 8 or 16.') 38 | 39 | atrous_rates = [6, 12, 18] 40 | if output_stride == 8: 41 | atrous_rates = [2*rate for rate in atrous_rates] 42 | 43 | with tf.contrib.slim.arg_scope(resnet_v2.resnet_arg_scope(batch_norm_decay=batch_norm_decay)): 44 | with arg_scope([layers.batch_norm], is_training=is_training): 45 | inputs_size = tf.shape(inputs)[1:3] 46 | # (a) one 1x1 convolution and three 3x3 convolutions with rates = (6, 12, 18) when output stride = 16. 47 | # the rates are doubled when output stride = 8. 
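# Illustrative aside (not used by the model): an atrous k x k filter with rate r
# covers an effective window of k + (k - 1) * (r - 1) pixels per side, so at
# output_stride = 16 the three 3x3 branches below see roughly 13x13, 25x25 and
# 37x37 regions of the feature map, and proportionally larger ones when the
# rates are doubled for output_stride = 8.
#
#   def effective_kernel_size(k, rate):
#       return k + (k - 1) * (rate - 1)
#
#   effective_kernel_size(3, 6)   # == 13
#   effective_kernel_size(3, 12)  # == 25
#   effective_kernel_size(3, 18)  # == 37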
48 | conv_1x1 = layers_lib.conv2d(inputs, depth, [1, 1], stride=1, scope="conv_1x1") 49 | conv_3x3_1 = layers_lib.conv2d(inputs, depth, [3, 3], stride=1, rate=atrous_rates[0], scope='conv_3x3_1') 50 | conv_3x3_2 = layers_lib.conv2d(inputs, depth, [3, 3], stride=1, rate=atrous_rates[1], scope='conv_3x3_2') 51 | conv_3x3_3 = layers_lib.conv2d(inputs, depth, [3, 3], stride=1, rate=atrous_rates[2], scope='conv_3x3_3') 52 | 53 | # (b) the image-level features 54 | with tf.variable_scope("image_level_features"): 55 | # global average pooling 56 | image_level_features = tf.reduce_mean(inputs, [1, 2], name='global_average_pooling', keepdims=True) 57 | # 1x1 convolution with 256 filters( and batch normalization) 58 | image_level_features = layers_lib.conv2d(image_level_features, depth, [1, 1], stride=1, scope='conv_1x1') 59 | # bilinearly upsample features 60 | image_level_features = tf.image.resize_bilinear(image_level_features, inputs_size, name='upsample') 61 | 62 | net = tf.concat([conv_1x1, conv_3x3_1, conv_3x3_2, conv_3x3_3, image_level_features], axis=3, name='concat') 63 | net = layers_lib.conv2d(net, depth, [1, 1], stride=1, scope='conv_1x1_concat') 64 | 65 | return net 66 | 67 | 68 | def deeplab_v3_plus_generator(num_classes, 69 | output_stride, 70 | base_architecture, 71 | pre_trained_model, 72 | batch_norm_decay, 73 | data_format='channels_last'): 74 | """Generator for DeepLab v3 plus models. 75 | 76 | Args: 77 | num_classes: The number of possible classes for image classification. 78 | output_stride: The ResNet unit's stride. Determines the rates for atrous convolution. 79 | the rates are (6, 12, 18) when the stride is 16, and doubled when 8. 80 | base_architecture: The architecture of base Resnet building block. 81 | pre_trained_model: The path to the directory that contains pre-trained models. 82 | batch_norm_decay: The moving average decay when estimating layer activation 83 | statistics in batch normalization. 84 | data_format: The input format ('channels_last', 'channels_first', or None). 85 | If set to None, the format is dependent on whether a GPU is available. 86 | Only 'channels_last' is supported currently. 87 | 88 | Returns: 89 | The model function that takes in `inputs` and `is_training` and 90 | returns the output tensor of the DeepLab v3 model. 91 | """ 92 | if data_format is None: 93 | # data_format = ( 94 | # 'channels_first' if tf.test.is_built_with_cuda() else 'channels_last') 95 | pass 96 | 97 | if batch_norm_decay is None: 98 | batch_norm_decay = _BATCH_NORM_DECAY 99 | 100 | if base_architecture not in ['resnet_v2_50', 'resnet_v2_101']: 101 | raise ValueError("'base_architrecture' must be either 'resnet_v2_50' or 'resnet_v2_101'.") 102 | 103 | if base_architecture == 'resnet_v2_50': 104 | base_model = resnet_v2.resnet_v2_50 105 | else: 106 | base_model = resnet_v2.resnet_v2_101 107 | 108 | def model(inputs, is_training): 109 | """Constructs the ResNet model given the inputs.""" 110 | if data_format == 'channels_first': 111 | # Convert the inputs from channels_last (NHWC) to channels_first (NCHW). 112 | # This provides a large performance boost on GPU. 
See 113 | # https://www.tensorflow.org/performance/performance_guide#data_formats 114 | inputs = tf.transpose(inputs, [0, 3, 1, 2]) 115 | 116 | # tf.logging.info('net shape: {}'.format(inputs.shape)) 117 | # encoder 118 | with tf.contrib.slim.arg_scope(resnet_v2.resnet_arg_scope(batch_norm_decay=batch_norm_decay)): 119 | logits, end_points = base_model(inputs, 120 | num_classes=None, 121 | is_training=is_training, 122 | global_pool=False, 123 | output_stride=output_stride) 124 | 125 | if is_training: 126 | exclude = [base_architecture + '/logits', 'global_step'] 127 | variables_to_restore = tf.contrib.slim.get_variables_to_restore(exclude=exclude) 128 | tf.train.init_from_checkpoint(pre_trained_model, 129 | {v.name.split(':')[0]: v for v in variables_to_restore}) 130 | 131 | inputs_size = tf.shape(inputs)[1:3] 132 | net = end_points[base_architecture + '/block4'] 133 | encoder_output = atrous_spatial_pyramid_pooling(net, output_stride, batch_norm_decay, is_training) 134 | 135 | with tf.variable_scope("decoder"): 136 | with tf.contrib.slim.arg_scope(resnet_v2.resnet_arg_scope(batch_norm_decay=batch_norm_decay)): 137 | with arg_scope([layers.batch_norm], is_training=is_training): 138 | with tf.variable_scope("low_level_features"): 139 | low_level_features = end_points[base_architecture + '/block1/unit_3/bottleneck_v2/conv1'] 140 | low_level_features = layers_lib.conv2d(low_level_features, 48, 141 | [1, 1], stride=1, scope='conv_1x1') 142 | low_level_features_size = tf.shape(low_level_features)[1:3] 143 | 144 | with tf.variable_scope("upsampling_logits"): 145 | net = tf.image.resize_bilinear(encoder_output, low_level_features_size, name='upsample_1') 146 | net = tf.concat([net, low_level_features], axis=3, name='concat') 147 | net = layers_lib.conv2d(net, 256, [3, 3], stride=1, scope='conv_3x3_1') 148 | net = layers_lib.conv2d(net, 256, [3, 3], stride=1, scope='conv_3x3_2') 149 | net = layers_lib.conv2d(net, num_classes, [1, 1], activation_fn=None, normalizer_fn=None, scope='conv_1x1') 150 | logits = tf.image.resize_bilinear(net, inputs_size, name='upsample_2') 151 | 152 | return logits 153 | 154 | return model 155 | 156 | 157 | def deeplabv3_plus_model_fn(features, labels, mode, params): 158 | """Model function for PASCAL VOC.""" 159 | if isinstance(features, dict): 160 | features = features['feature'] 161 | 162 | images = tf.cast( 163 | tf.map_fn(preprocessing.mean_image_addition, features), 164 | tf.uint8) 165 | 166 | network = deeplab_v3_plus_generator(params['num_classes'], 167 | params['output_stride'], 168 | params['base_architecture'], 169 | params['pre_trained_model'], 170 | params['batch_norm_decay']) 171 | 172 | logits = network(features, mode == tf.estimator.ModeKeys.TRAIN) 173 | 174 | pred_classes = tf.expand_dims(tf.argmax(logits, axis=3, output_type=tf.int32), axis=3) 175 | 176 | pred_decoded_labels = tf.py_func(preprocessing.decode_labels, 177 | [pred_classes, params['batch_size'], params['num_classes']], 178 | tf.uint8) 179 | 180 | predictions = { 181 | 'classes': pred_classes, 182 | 'probabilities': tf.nn.softmax(logits, name='softmax_tensor'), 183 | 'decoded_labels': pred_decoded_labels 184 | } 185 | 186 | if mode == tf.estimator.ModeKeys.PREDICT: 187 | # Delete 'decoded_labels' from predictions because custom functions produce error when used with saved_model 188 | predictions_without_decoded_labels = predictions.copy() 189 | del predictions_without_decoded_labels['decoded_labels'] 190 | 191 | return tf.estimator.EstimatorSpec( 192 | mode=mode, 193 | 
predictions=predictions, 194 | export_outputs={ 195 | 'preds': tf.estimator.export.PredictOutput( 196 | predictions_without_decoded_labels) 197 | }) 198 | 199 | gt_decoded_labels = tf.py_func(preprocessing.decode_labels, 200 | [labels, params['batch_size'], params['num_classes']], tf.uint8) 201 | 202 | labels = tf.squeeze(labels, axis=3) # reduce the channel dimension. 203 | 204 | logits_by_num_classes = tf.reshape(logits, [-1, params['num_classes']]) 205 | labels_flat = tf.reshape(labels, [-1, ]) 206 | 207 | valid_indices = tf.to_int32(labels_flat <= params['num_classes'] - 1) 208 | valid_logits = tf.dynamic_partition(logits_by_num_classes, valid_indices, num_partitions=2)[1] 209 | valid_labels = tf.dynamic_partition(labels_flat, valid_indices, num_partitions=2)[1] 210 | 211 | preds_flat = tf.reshape(pred_classes, [-1, ]) 212 | valid_preds = tf.dynamic_partition(preds_flat, valid_indices, num_partitions=2)[1] 213 | confusion_matrix = tf.confusion_matrix(valid_labels, valid_preds, num_classes=params['num_classes']) 214 | 215 | predictions['valid_preds'] = valid_preds 216 | predictions['valid_labels'] = valid_labels 217 | predictions['confusion_matrix'] = confusion_matrix 218 | 219 | cross_entropy = tf.losses.sparse_softmax_cross_entropy( 220 | logits=valid_logits, labels=valid_labels) 221 | 222 | # Create a tensor named cross_entropy for logging purposes. 223 | tf.identity(cross_entropy, name='cross_entropy') 224 | tf.summary.scalar('cross_entropy', cross_entropy) 225 | 226 | if not params['freeze_batch_norm']: 227 | train_var_list = [v for v in tf.trainable_variables()] 228 | else: 229 | train_var_list = [v for v in tf.trainable_variables() 230 | if 'beta' not in v.name and 'gamma' not in v.name] 231 | 232 | # Add weight decay to the loss. 233 | with tf.variable_scope("total_loss"): 234 | loss = cross_entropy + params.get('weight_decay', _WEIGHT_DECAY) * tf.add_n( 235 | [tf.nn.l2_loss(v) for v in train_var_list]) 236 | # loss = tf.losses.get_total_loss() # obtain the regularization losses as well 237 | 238 | if mode == tf.estimator.ModeKeys.TRAIN: 239 | tf.summary.image('images', 240 | tf.concat(axis=2, values=[images, gt_decoded_labels, pred_decoded_labels]), 241 | max_outputs=params['tensorboard_images_max_outputs']) # Concatenate row-wise. 242 | 243 | global_step = tf.train.get_or_create_global_step() 244 | 245 | if params['learning_rate_policy'] == 'piecewise': 246 | # Scale the learning rate linearly with the batch size. When the batch size 247 | # is 128, the learning rate should be 0.1. 248 | initial_learning_rate = 0.1 * params['batch_size'] / 128 249 | batches_per_epoch = params['num_train'] / params['batch_size'] 250 | # Multiply the learning rate by 0.1 at 100, 150, and 200 epochs. 
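# Worked example of the schedule below (illustrative only; assumes the PASCAL
# defaults used elsewhere in this repo, batch_size = 10 and num_train = 10582):
#   initial_learning_rate = 0.1 * 10 / 128             ~= 7.8e-3
#   batches_per_epoch     = 10582 / 10                  = 1058.2
#   boundaries (steps)    = [105820, 158730, 211640]    (epochs 100, 150, 200)
#   values                = [7.8e-3, 7.8e-4, 7.8e-5, 7.8e-6]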
251 | boundaries = [int(batches_per_epoch * epoch) for epoch in [100, 150, 200]] 252 | values = [initial_learning_rate * decay for decay in [1, 0.1, 0.01, 0.001]] 253 | learning_rate = tf.train.piecewise_constant( 254 | tf.cast(global_step, tf.int32), boundaries, values) 255 | elif params['learning_rate_policy'] == 'poly': 256 | learning_rate = tf.train.polynomial_decay( 257 | params['initial_learning_rate'], 258 | tf.cast(global_step, tf.int32) - params['initial_global_step'], 259 | params['max_iter'], params['end_learning_rate'], power=params['power']) 260 | else: 261 | raise ValueError('Learning rate policy must be "piecewise" or "poly"') 262 | 263 | # Create a tensor named learning_rate for logging purposes 264 | tf.identity(learning_rate, name='learning_rate') 265 | tf.summary.scalar('learning_rate', learning_rate) 266 | 267 | optimizer = tf.train.MomentumOptimizer( 268 | learning_rate=learning_rate, 269 | momentum=params['momentum']) 270 | 271 | # Batch norm requires update ops to be added as a dependency to the train_op 272 | update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS) 273 | with tf.control_dependencies(update_ops): 274 | train_op = optimizer.minimize(loss, global_step, var_list=train_var_list) 275 | else: 276 | train_op = None 277 | 278 | accuracy = tf.metrics.accuracy( 279 | valid_labels, valid_preds) 280 | mean_iou = tf.metrics.mean_iou(valid_labels, valid_preds, params['num_classes']) 281 | metrics = {'px_accuracy': accuracy, 'mean_iou': mean_iou} 282 | 283 | # Create a tensor named train_accuracy for logging purposes 284 | tf.identity(accuracy[1], name='train_px_accuracy') 285 | tf.summary.scalar('train_px_accuracy', accuracy[1]) 286 | 287 | def compute_mean_iou(total_cm, name='mean_iou'): 288 | """Compute the mean intersection-over-union via the confusion matrix.""" 289 | sum_over_row = tf.to_float(tf.reduce_sum(total_cm, 0)) 290 | sum_over_col = tf.to_float(tf.reduce_sum(total_cm, 1)) 291 | cm_diag = tf.to_float(tf.diag_part(total_cm)) 292 | denominator = sum_over_row + sum_over_col - cm_diag 293 | 294 | # The mean is only computed over classes that appear in the 295 | # label or prediction tensor. If the denominator is 0, we need to 296 | # ignore the class. 297 | num_valid_entries = tf.reduce_sum(tf.cast( 298 | tf.not_equal(denominator, 0), dtype=tf.float32)) 299 | 300 | # If the value of the denominator is 0, set it to 1 to avoid 301 | # zero division. 302 | denominator = tf.where( 303 | tf.greater(denominator, 0), 304 | denominator, 305 | tf.ones_like(denominator)) 306 | iou = tf.div(cm_diag, denominator) 307 | 308 | for i in range(params['num_classes']): 309 | tf.identity(iou[i], name='train_iou_class{}'.format(i)) 310 | tf.summary.scalar('train_iou_class{}'.format(i), iou[i]) 311 | 312 | # If the number of valid entries is 0 (no classes) we return 0. 
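# Minimal numpy sketch of the same arithmetic (illustrative only), using a
# hypothetical 3-class confusion matrix in which class 2 never occurs:
#
#   import numpy as np
#   cm = np.array([[5, 1, 0],
#                  [2, 7, 0],
#                  [0, 0, 0]])                    # rows = labels, cols = predictions
#   diag = np.diag(cm).astype(float)              # [5, 7, 0]
#   denom = cm.sum(0) + cm.sum(1) - diag          # [8, 10, 0]
#   iou = diag / np.where(denom > 0, denom, 1)    # [0.625, 0.7, 0.0]
#   mean_iou = iou.sum() / (denom > 0).sum()      # 1.325 / 2 = 0.6625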
313 | result = tf.where( 314 | tf.greater(num_valid_entries, 0), 315 | tf.reduce_sum(iou, name=name) / num_valid_entries, 316 | 0) 317 | return result 318 | 319 | train_mean_iou = compute_mean_iou(mean_iou[1]) 320 | 321 | tf.identity(train_mean_iou, name='train_mean_iou') 322 | tf.summary.scalar('train_mean_iou', train_mean_iou) 323 | 324 | return tf.estimator.EstimatorSpec( 325 | mode=mode, 326 | predictions=predictions, 327 | loss=loss, 328 | train_op=train_op, 329 | eval_metric_ops=metrics) 330 | -------------------------------------------------------------------------------- /evaluate.py: -------------------------------------------------------------------------------- 1 | """Evaluate a DeepLab v3 model.""" 2 | 3 | from __future__ import absolute_import 4 | from __future__ import division 5 | from __future__ import print_function 6 | 7 | import argparse 8 | import os 9 | import sys 10 | 11 | import tensorflow as tf 12 | 13 | import deeplab_model 14 | from utils import preprocessing 15 | from utils import dataset_util 16 | 17 | import numpy as np 18 | import timeit 19 | 20 | parser = argparse.ArgumentParser() 21 | 22 | parser.add_argument('--image_data_dir', type=str, default='dataset/VOCdevkit/VOC2012/JPEGImages', 23 | help='The directory containing the image data.') 24 | 25 | parser.add_argument('--label_data_dir', type=str, default='dataset/VOCdevkit/VOC2012/SegmentationClassAug', 26 | help='The directory containing the ground truth label data.') 27 | 28 | parser.add_argument('--evaluation_data_list', type=str, default='./dataset/val.txt', 29 | help='Path to the file listing the evaluation images.') 30 | 31 | parser.add_argument('--model_dir', type=str, default='./model', 32 | help="Base directory for the model. " 33 | "Make sure 'model_checkpoint_path' given in 'checkpoint' file matches " 34 | "with checkpoint name.") 35 | 36 | parser.add_argument('--base_architecture', type=str, default='resnet_v2_101', 37 | choices=['resnet_v2_50', 'resnet_v2_101'], 38 | help='The architecture of base Resnet building block.') 39 | 40 | parser.add_argument('--output_stride', type=int, default=16, 41 | choices=[8, 16], 42 | help='Output stride for DeepLab v3. Currently 8 or 16 is supported.') 43 | 44 | _NUM_CLASSES = 21 45 | 46 | 47 | def main(unused_argv): 48 | # Using the Winograd non-fused algorithms provides a small performance boost. 
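# Typical invocation (illustrative; the paths shown are just the argparse
# defaults defined above):
#
#   python evaluate.py \
#       --image_data_dir dataset/VOCdevkit/VOC2012/JPEGImages \
#       --label_data_dir dataset/VOCdevkit/VOC2012/SegmentationClassAug \
#       --evaluation_data_list ./dataset/val.txt \
#       --model_dir ./model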
49 | os.environ['TF_ENABLE_WINOGRAD_NONFUSED'] = '1' 50 | 51 | examples = dataset_util.read_examples_list(FLAGS.evaluation_data_list) 52 | image_files = [os.path.join(FLAGS.image_data_dir, filename) + '.jpg' for filename in examples] 53 | label_files = [os.path.join(FLAGS.label_data_dir, filename) + '.png' for filename in examples] 54 | 55 | features, labels = preprocessing.eval_input_fn(image_files, label_files) 56 | 57 | predictions = deeplab_model.deeplabv3_plus_model_fn( 58 | features, 59 | labels, 60 | tf.estimator.ModeKeys.EVAL, 61 | params={ 62 | 'output_stride': FLAGS.output_stride, 63 | 'batch_size': 1, # Batch size must be 1 because the images' size may differ 64 | 'base_architecture': FLAGS.base_architecture, 65 | 'pre_trained_model': None, 66 | 'batch_norm_decay': None, 67 | 'num_classes': _NUM_CLASSES, 68 | 'freeze_batch_norm': True 69 | }).predictions 70 | 71 | # Manually load the latest checkpoint 72 | saver = tf.train.Saver() 73 | with tf.Session() as sess: 74 | ckpt = tf.train.get_checkpoint_state(FLAGS.model_dir) 75 | saver.restore(sess, ckpt.model_checkpoint_path) 76 | 77 | # Loop through the batches and store predictions and labels 78 | step = 1 79 | sum_cm = np.zeros((_NUM_CLASSES, _NUM_CLASSES), dtype=np.int32) 80 | start = timeit.default_timer() 81 | while True: 82 | try: 83 | preds = sess.run(predictions) 84 | sum_cm += preds['confusion_matrix'] 85 | if not step % 100: 86 | stop = timeit.default_timer() 87 | tf.logging.info("current step = {} ({:.3f} sec)".format(step, stop-start)) 88 | start = timeit.default_timer() 89 | step += 1 90 | except tf.errors.OutOfRangeError: 91 | break 92 | 93 | def compute_mean_iou(total_cm): 94 | """Compute the mean intersection-over-union via the confusion matrix.""" 95 | sum_over_row = np.sum(total_cm, axis=0).astype(float) 96 | sum_over_col = np.sum(total_cm, axis=1).astype(float) 97 | cm_diag = np.diagonal(total_cm).astype(float) 98 | denominator = sum_over_row + sum_over_col - cm_diag 99 | 100 | # The mean is only computed over classes that appear in the 101 | # label or prediction tensor. If the denominator is 0, we need to 102 | # ignore the class. 103 | num_valid_entries = np.sum((denominator != 0).astype(float)) 104 | 105 | # If the value of the denominator is 0, set it to 1 to avoid 106 | # zero division. 107 | denominator = np.where( 108 | denominator > 0, 109 | denominator, 110 | np.ones_like(denominator)) 111 | 112 | ious = cm_diag / denominator 113 | 114 | print('Intersection over Union for each class:') 115 | for i, iou in enumerate(ious): 116 | print(' class {}: {:.4f}'.format(i, iou)) 117 | 118 | # If the number of valid entries is 0 (no classes) we return 0. 119 | m_iou = np.where( 120 | num_valid_entries > 0, 121 | np.sum(ious) / num_valid_entries, 122 | 0) 123 | m_iou = float(m_iou) 124 | print('mean Intersection over Union: {:.4f}'.format(float(m_iou))) 125 | 126 | def compute_accuracy(total_cm): 127 | """Compute the accuracy via the confusion matrix.""" 128 | denominator = total_cm.sum().astype(float) 129 | cm_diag_sum = np.diagonal(total_cm).sum().astype(float) 130 | 131 | # If the number of valid entries is 0 (no classes) we return 0. 
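# For the hypothetical confusion matrix used in the mean-IoU sketch earlier
# ([[5, 1, 0], [2, 7, 0], [0, 0, 0]]), the diagonal sums to 12 and all entries
# sum to 15, so the pixel accuracy would be 12 / 15 = 0.8.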
132 | accuracy = np.where( 133 | denominator > 0, 134 | cm_diag_sum / denominator, 135 | 0) 136 | accuracy = float(accuracy) 137 | print('Pixel Accuracy: {:.4f}'.format(float(accuracy))) 138 | 139 | compute_mean_iou(sum_cm) 140 | compute_accuracy(sum_cm) 141 | 142 | 143 | if __name__ == '__main__': 144 | tf.logging.set_verbosity(tf.logging.INFO) 145 | FLAGS, unparsed = parser.parse_known_args() 146 | tf.app.run(main=main, argv=[sys.argv[0]] + unparsed) 147 | -------------------------------------------------------------------------------- /export_inference_graph.py: -------------------------------------------------------------------------------- 1 | """Export inference graph.""" 2 | 3 | from __future__ import absolute_import 4 | from __future__ import division 5 | from __future__ import print_function 6 | 7 | import argparse 8 | import os 9 | import sys 10 | 11 | import tensorflow as tf 12 | 13 | import deeplab_model 14 | from utils import preprocessing 15 | 16 | 17 | parser = argparse.ArgumentParser() 18 | 19 | parser.add_argument('--model_dir', type=str, default='./model', 20 | help="Base directory for the model. " 21 | "Make sure 'model_checkpoint_path' given in 'checkpoint' file matches " 22 | "with checkpoint name.") 23 | 24 | parser.add_argument('--export_dir', type=str, default='dataset/export_output', 25 | help='The directory where the exported SavedModel will be stored.') 26 | 27 | parser.add_argument('--base_architecture', type=str, default='resnet_v2_101', 28 | choices=['resnet_v2_50', 'resnet_v2_101'], 29 | help='The architecture of base Resnet building block.') 30 | 31 | parser.add_argument('--output_stride', type=int, default=16, 32 | choices=[8, 16], 33 | help='Output stride for DeepLab v3. Currently 8 or 16 is supported.') 34 | 35 | 36 | _NUM_CLASSES = 21 37 | 38 | 39 | def main(unused_argv): 40 | # Using the Winograd non-fused algorithms provides a small performance boost. 
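# Rough sketch of how the SavedModel written by export_savedmodel() below could
# be consumed (illustrative only: the timestamped export path, the 'image' input
# key from serving_input_receiver_fn, the 'classes'/'probabilities' output keys,
# and the use of the default serving signature are assumptions, not guarantees):
#
#   import numpy as np
#   from tensorflow.contrib import predictor
#
#   predict_fn = predictor.from_saved_model('dataset/export_output/<timestamp>')
#   image = np.zeros((1, 513, 513, 3), dtype=np.float32)  # raw RGB values;
#   outputs = predict_fn({'image': image})                # mean subtraction is
#   print(outputs['classes'].shape)                       # applied in-graph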
41 | os.environ['TF_ENABLE_WINOGRAD_NONFUSED'] = '1' 42 | 43 | model = tf.estimator.Estimator( 44 | model_fn=deeplab_model.deeplabv3_plus_model_fn, 45 | model_dir=FLAGS.model_dir, 46 | params={ 47 | 'output_stride': FLAGS.output_stride, 48 | 'batch_size': 1, # Batch size must be 1 because the images' size may differ 49 | 'base_architecture': FLAGS.base_architecture, 50 | 'pre_trained_model': None, 51 | 'batch_norm_decay': None, 52 | 'num_classes': _NUM_CLASSES, 53 | }) 54 | 55 | # Export the model 56 | def serving_input_receiver_fn(): 57 | image = tf.placeholder(tf.float32, [None, None, None, 3], name='image_tensor') 58 | receiver_tensors = {'image': image} 59 | features = tf.map_fn(preprocessing.mean_image_subtraction, image) 60 | return tf.estimator.export.ServingInputReceiver( 61 | features=features, 62 | receiver_tensors=receiver_tensors) 63 | 64 | model.export_savedmodel(FLAGS.export_dir, serving_input_receiver_fn) 65 | 66 | 67 | if __name__ == '__main__': 68 | tf.logging.set_verbosity(tf.logging.INFO) 69 | FLAGS, unparsed = parser.parse_known_args() 70 | tf.app.run(main=main, argv=[sys.argv[0]] + unparsed) 71 | -------------------------------------------------------------------------------- /images/tensorboard_images.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rishizek/tensorflow-deeplab-v3-plus/f6b416be519bf481f20d2458614a4d33737da37e/images/tensorboard_images.png -------------------------------------------------------------------------------- /images/tensorboard_miou.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rishizek/tensorflow-deeplab-v3-plus/f6b416be519bf481f20d2458614a4d33737da37e/images/tensorboard_miou.png -------------------------------------------------------------------------------- /inference.py: -------------------------------------------------------------------------------- 1 | """Run inference a DeepLab v3 model using tf.estimator API.""" 2 | 3 | from __future__ import absolute_import 4 | from __future__ import division 5 | from __future__ import print_function 6 | 7 | import argparse 8 | import os 9 | import sys 10 | 11 | import tensorflow as tf 12 | 13 | import deeplab_model 14 | from utils import preprocessing 15 | from utils import dataset_util 16 | 17 | from PIL import Image 18 | import matplotlib.pyplot as plt 19 | 20 | from tensorflow.python import debug as tf_debug 21 | 22 | parser = argparse.ArgumentParser() 23 | 24 | parser.add_argument('--data_dir', type=str, default='dataset/VOCdevkit/VOC2012/JPEGImages', 25 | help='The directory containing the image data.') 26 | 27 | parser.add_argument('--output_dir', type=str, default='./dataset/inference_output', 28 | help='Path to the directory to generate the inference results') 29 | 30 | parser.add_argument('--infer_data_list', type=str, default='./dataset/sample_images_list.txt', 31 | help='Path to the file listing the inferring images.') 32 | 33 | parser.add_argument('--model_dir', type=str, default='./model', 34 | help="Base directory for the model. 
" 35 | "Make sure 'model_checkpoint_path' given in 'checkpoint' file matches " 36 | "with checkpoint name.") 37 | 38 | parser.add_argument('--base_architecture', type=str, default='resnet_v2_101', 39 | choices=['resnet_v2_50', 'resnet_v2_101'], 40 | help='The architecture of base Resnet building block.') 41 | 42 | parser.add_argument('--output_stride', type=int, default=16, 43 | choices=[8, 16], 44 | help='Output stride for DeepLab v3. Currently 8 or 16 is supported.') 45 | 46 | parser.add_argument('--debug', action='store_true', 47 | help='Whether to use debugger to track down bad values during training.') 48 | 49 | _NUM_CLASSES = 21 50 | 51 | 52 | def main(unused_argv): 53 | # Using the Winograd non-fused algorithms provides a small performance boost. 54 | os.environ['TF_ENABLE_WINOGRAD_NONFUSED'] = '1' 55 | 56 | pred_hooks = None 57 | if FLAGS.debug: 58 | debug_hook = tf_debug.LocalCLIDebugHook() 59 | pred_hooks = [debug_hook] 60 | 61 | model = tf.estimator.Estimator( 62 | model_fn=deeplab_model.deeplabv3_plus_model_fn, 63 | model_dir=FLAGS.model_dir, 64 | params={ 65 | 'output_stride': FLAGS.output_stride, 66 | 'batch_size': 1, # Batch size must be 1 because the images' size may differ 67 | 'base_architecture': FLAGS.base_architecture, 68 | 'pre_trained_model': None, 69 | 'batch_norm_decay': None, 70 | 'num_classes': _NUM_CLASSES, 71 | }) 72 | 73 | examples = dataset_util.read_examples_list(FLAGS.infer_data_list) 74 | image_files = [os.path.join(FLAGS.data_dir, filename) for filename in examples] 75 | 76 | predictions = model.predict( 77 | input_fn=lambda: preprocessing.eval_input_fn(image_files), 78 | hooks=pred_hooks) 79 | 80 | output_dir = FLAGS.output_dir 81 | if not os.path.exists(output_dir): 82 | os.makedirs(output_dir) 83 | 84 | for pred_dict, image_path in zip(predictions, image_files): 85 | image_basename = os.path.splitext(os.path.basename(image_path))[0] 86 | output_filename = image_basename + '_mask.png' 87 | path_to_output = os.path.join(output_dir, output_filename) 88 | 89 | print("generating:", path_to_output) 90 | mask = pred_dict['decoded_labels'] 91 | mask = Image.fromarray(mask) 92 | plt.axis('off') 93 | plt.imshow(mask) 94 | plt.savefig(path_to_output, bbox_inches='tight') 95 | 96 | 97 | if __name__ == '__main__': 98 | tf.logging.set_verbosity(tf.logging.INFO) 99 | FLAGS, unparsed = parser.parse_known_args() 100 | tf.app.run(main=main, argv=[sys.argv[0]] + unparsed) 101 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | numpy 2 | tensorflow-gpu==1.12.0 3 | matplotlib 4 | opencv-python 5 | pillow 6 | -------------------------------------------------------------------------------- /train.py: -------------------------------------------------------------------------------- 1 | """Train a DeepLab v3 plus model using tf.estimator API.""" 2 | 3 | from __future__ import absolute_import 4 | from __future__ import division 5 | from __future__ import print_function 6 | 7 | import argparse 8 | import os 9 | import sys 10 | 11 | import tensorflow as tf 12 | import deeplab_model 13 | from utils import preprocessing 14 | from tensorflow.python import debug as tf_debug 15 | 16 | import shutil 17 | 18 | parser = argparse.ArgumentParser() 19 | 20 | parser.add_argument('--model_dir', type=str, default='./model', 21 | help='Base directory for the model.') 22 | 23 | parser.add_argument('--clean_model_dir', action='store_true', 24 | help='Whether to 
clean up the model directory if present.') 25 | 26 | parser.add_argument('--train_epochs', type=int, default=26, 27 | help='Number of training epochs: ' 28 | 'For 30K iteration with batch size 6, train_epoch = 17.01 (= 30K * 6 / 10,582). ' 29 | 'For 30K iteration with batch size 8, train_epoch = 22.68 (= 30K * 8 / 10,582). ' 30 | 'For 30K iteration with batch size 10, train_epoch = 25.52 (= 30K * 10 / 10,582). ' 31 | 'For 30K iteration with batch size 11, train_epoch = 31.19 (= 30K * 11 / 10,582). ' 32 | 'For 30K iteration with batch size 15, train_epoch = 42.53 (= 30K * 15 / 10,582). ' 33 | 'For 30K iteration with batch size 16, train_epoch = 45.36 (= 30K * 16 / 10,582).') 34 | 35 | parser.add_argument('--epochs_per_eval', type=int, default=1, 36 | help='The number of training epochs to run between evaluations.') 37 | 38 | parser.add_argument('--tensorboard_images_max_outputs', type=int, default=6, 39 | help='Max number of batch elements to generate for Tensorboard.') 40 | 41 | parser.add_argument('--batch_size', type=int, default=10, 42 | help='Number of examples per batch.') 43 | 44 | parser.add_argument('--learning_rate_policy', type=str, default='poly', 45 | choices=['poly', 'piecewise'], 46 | help='Learning rate policy to optimize loss.') 47 | 48 | parser.add_argument('--max_iter', type=int, default=30000, 49 | help='Number of maximum iteration used for "poly" learning rate policy.') 50 | 51 | parser.add_argument('--data_dir', type=str, default='./dataset/', 52 | help='Path to the directory containing the PASCAL VOC data tf record.') 53 | 54 | parser.add_argument('--base_architecture', type=str, default='resnet_v2_101', 55 | choices=['resnet_v2_50', 'resnet_v2_101'], 56 | help='The architecture of base Resnet building block.') 57 | 58 | parser.add_argument('--pre_trained_model', type=str, default='./ini_checkpoints/resnet_v2_101/resnet_v2_101.ckpt', 59 | help='Path to the pre-trained model checkpoint.') 60 | 61 | parser.add_argument('--output_stride', type=int, default=16, 62 | choices=[8, 16], 63 | help='Output stride for DeepLab v3. Currently 8 or 16 is supported.') 64 | 65 | parser.add_argument('--freeze_batch_norm', action='store_true', 66 | help='Freeze batch normalization parameters during the training.') 67 | 68 | parser.add_argument('--initial_learning_rate', type=float, default=7e-3, 69 | help='Initial learning rate for the optimizer.') 70 | 71 | parser.add_argument('--end_learning_rate', type=float, default=1e-6, 72 | help='End learning rate for the optimizer.') 73 | 74 | parser.add_argument('--initial_global_step', type=int, default=0, 75 | help='Initial global step for controlling learning rate when fine-tuning model.') 76 | 77 | parser.add_argument('--weight_decay', type=float, default=2e-4, 78 | help='The weight decay to use for regularizing the model.') 79 | 80 | parser.add_argument('--debug', action='store_true', 81 | help='Whether to use debugger to track down bad values during training.') 82 | 83 | _NUM_CLASSES = 21 84 | _HEIGHT = 513 85 | _WIDTH = 513 86 | _DEPTH = 3 87 | _MIN_SCALE = 0.5 88 | _MAX_SCALE = 2.0 89 | _IGNORE_LABEL = 255 90 | 91 | _POWER = 0.9 92 | _MOMENTUM = 0.9 93 | 94 | _BATCH_NORM_DECAY = 0.9997 95 | 96 | _NUM_IMAGES = { 97 | 'train': 10582, 98 | 'validation': 1449, 99 | } 100 | 101 | 102 | def get_filenames(is_training, data_dir): 103 | """Return a list of filenames. 104 | 105 | Args: 106 | is_training: A boolean denoting whether the input is for training. 107 | data_dir: path to the the directory containing the input data. 
108 | 109 | Returns: 110 | A list of file names. 111 | """ 112 | if is_training: 113 | return [os.path.join(data_dir, 'train.record')] 114 | else: 115 | return [os.path.join(data_dir, 'val.record')] 116 | 117 | 118 | def parse_record(raw_record): 119 | """Parse PASCAL image and label from a tf record.""" 120 | keys_to_features = { 121 | 'image/height': 122 | tf.FixedLenFeature((), tf.int64), 123 | 'image/width': 124 | tf.FixedLenFeature((), tf.int64), 125 | 'image/encoded': 126 | tf.FixedLenFeature((), tf.string, default_value=''), 127 | 'image/format': 128 | tf.FixedLenFeature((), tf.string, default_value='jpeg'), 129 | 'label/encoded': 130 | tf.FixedLenFeature((), tf.string, default_value=''), 131 | 'label/format': 132 | tf.FixedLenFeature((), tf.string, default_value='png'), 133 | } 134 | 135 | parsed = tf.parse_single_example(raw_record, keys_to_features) 136 | 137 | # height = tf.cast(parsed['image/height'], tf.int32) 138 | # width = tf.cast(parsed['image/width'], tf.int32) 139 | 140 | image = tf.image.decode_image( 141 | tf.reshape(parsed['image/encoded'], shape=[]), _DEPTH) 142 | image = tf.to_float(tf.image.convert_image_dtype(image, dtype=tf.uint8)) 143 | image.set_shape([None, None, 3]) 144 | 145 | label = tf.image.decode_image( 146 | tf.reshape(parsed['label/encoded'], shape=[]), 1) 147 | label = tf.to_int32(tf.image.convert_image_dtype(label, dtype=tf.uint8)) 148 | label.set_shape([None, None, 1]) 149 | 150 | return image, label 151 | 152 | 153 | def preprocess_image(image, label, is_training): 154 | """Preprocess a single image of layout [height, width, depth].""" 155 | if is_training: 156 | # Randomly scale the image and label. 157 | image, label = preprocessing.random_rescale_image_and_label( 158 | image, label, _MIN_SCALE, _MAX_SCALE) 159 | 160 | # Randomly crop or pad a [_HEIGHT, _WIDTH] section of the image and label. 161 | image, label = preprocessing.random_crop_or_pad_image_and_label( 162 | image, label, _HEIGHT, _WIDTH, _IGNORE_LABEL) 163 | 164 | # Randomly flip the image and label horizontally. 165 | image, label = preprocessing.random_flip_left_right_image_and_label( 166 | image, label) 167 | 168 | image.set_shape([_HEIGHT, _WIDTH, 3]) 169 | label.set_shape([_HEIGHT, _WIDTH, 1]) 170 | 171 | image = preprocessing.mean_image_subtraction(image) 172 | 173 | return image, label 174 | 175 | 176 | def input_fn(is_training, data_dir, batch_size, num_epochs=1): 177 | """Input_fn using the tf.data input pipeline for CIFAR-10 dataset. 178 | 179 | Args: 180 | is_training: A boolean denoting whether the input is for training. 181 | data_dir: The directory containing the input data. 182 | batch_size: The number of samples per batch. 183 | num_epochs: The number of epochs to repeat the dataset. 184 | 185 | Returns: 186 | A tuple of images and labels. 187 | """ 188 | dataset = tf.data.Dataset.from_tensor_slices(get_filenames(is_training, data_dir)) 189 | dataset = dataset.flat_map(tf.data.TFRecordDataset) 190 | 191 | if is_training: 192 | # When choosing shuffle buffer sizes, larger sizes result in better 193 | # randomness, while smaller sizes have better performance. 194 | # is a relatively small dataset, we choose to shuffle the full epoch. 
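# "The full epoch" here means buffer_size = _NUM_IMAGES['train'] (10,582
# serialized records) in the call below, i.e. every training example is held in
# the shuffle buffer at once. On memory-constrained machines a smaller buffer is
# a reasonable, if slightly less random, alternative, e.g. (hypothetical):
#
#   dataset = dataset.shuffle(buffer_size=2048)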
195 | dataset = dataset.shuffle(buffer_size=_NUM_IMAGES['train']) 196 | 197 | dataset = dataset.map(parse_record) 198 | dataset = dataset.map( 199 | lambda image, label: preprocess_image(image, label, is_training)) 200 | dataset = dataset.prefetch(batch_size) 201 | 202 | # We call repeat after shuffling, rather than before, to prevent separate 203 | # epochs from blending together. 204 | dataset = dataset.repeat(num_epochs) 205 | dataset = dataset.batch(batch_size) 206 | 207 | iterator = dataset.make_one_shot_iterator() 208 | images, labels = iterator.get_next() 209 | 210 | return images, labels 211 | 212 | 213 | def main(unused_argv): 214 | # Using the Winograd non-fused algorithms provides a small performance boost. 215 | os.environ['TF_ENABLE_WINOGRAD_NONFUSED'] = '1' 216 | 217 | if FLAGS.clean_model_dir: 218 | shutil.rmtree(FLAGS.model_dir, ignore_errors=True) 219 | 220 | # Set up a RunConfig to only save checkpoints once per training cycle. 221 | run_config = tf.estimator.RunConfig().replace(save_checkpoints_secs=1e9) 222 | model = tf.estimator.Estimator( 223 | model_fn=deeplab_model.deeplabv3_plus_model_fn, 224 | model_dir=FLAGS.model_dir, 225 | config=run_config, 226 | params={ 227 | 'output_stride': FLAGS.output_stride, 228 | 'batch_size': FLAGS.batch_size, 229 | 'base_architecture': FLAGS.base_architecture, 230 | 'pre_trained_model': FLAGS.pre_trained_model, 231 | 'batch_norm_decay': _BATCH_NORM_DECAY, 232 | 'num_classes': _NUM_CLASSES, 233 | 'tensorboard_images_max_outputs': FLAGS.tensorboard_images_max_outputs, 234 | 'weight_decay': FLAGS.weight_decay, 235 | 'learning_rate_policy': FLAGS.learning_rate_policy, 236 | 'num_train': _NUM_IMAGES['train'], 237 | 'initial_learning_rate': FLAGS.initial_learning_rate, 238 | 'max_iter': FLAGS.max_iter, 239 | 'end_learning_rate': FLAGS.end_learning_rate, 240 | 'power': _POWER, 241 | 'momentum': _MOMENTUM, 242 | 'freeze_batch_norm': FLAGS.freeze_batch_norm, 243 | 'initial_global_step': FLAGS.initial_global_step 244 | }) 245 | 246 | for _ in range(FLAGS.train_epochs // FLAGS.epochs_per_eval): 247 | tensors_to_log = { 248 | 'learning_rate': 'learning_rate', 249 | 'cross_entropy': 'cross_entropy', 250 | 'train_px_accuracy': 'train_px_accuracy', 251 | 'train_mean_iou': 'train_mean_iou', 252 | } 253 | 254 | logging_hook = tf.train.LoggingTensorHook( 255 | tensors=tensors_to_log, every_n_iter=10) 256 | train_hooks = [logging_hook] 257 | eval_hooks = None 258 | 259 | if FLAGS.debug: 260 | debug_hook = tf_debug.LocalCLIDebugHook() 261 | train_hooks.append(debug_hook) 262 | eval_hooks = [debug_hook] 263 | 264 | tf.logging.info("Start training.") 265 | model.train( 266 | input_fn=lambda: input_fn(True, FLAGS.data_dir, FLAGS.batch_size, FLAGS.epochs_per_eval), 267 | hooks=train_hooks, 268 | # steps=1 # For debug 269 | ) 270 | 271 | tf.logging.info("Start evaluation.") 272 | # Evaluate the model and print results 273 | eval_results = model.evaluate( 274 | # Batch size must be 1 for testing because the images' size differs 275 | input_fn=lambda: input_fn(False, FLAGS.data_dir, 1), 276 | hooks=eval_hooks, 277 | # steps=1 # For debug 278 | ) 279 | print(eval_results) 280 | 281 | 282 | if __name__ == '__main__': 283 | tf.logging.set_verbosity(tf.logging.INFO) 284 | FLAGS, unparsed = parser.parse_known_args() 285 | tf.app.run(main=main, argv=[sys.argv[0]] + unparsed) 286 | -------------------------------------------------------------------------------- /utils/__init__.py: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/rishizek/tensorflow-deeplab-v3-plus/f6b416be519bf481f20d2458614a4d33737da37e/utils/__init__.py -------------------------------------------------------------------------------- /utils/dataset_util.py: -------------------------------------------------------------------------------- 1 | # Copyright 2017 The TensorFlow Authors. All Rights Reserved. 2 | # 3 | # Licensed under the Apache License, Version 2.0 (the "License"); 4 | # you may not use this file except in compliance with the License. 5 | # You may obtain a copy of the License at 6 | # 7 | # http://www.apache.org/licenses/LICENSE-2.0 8 | # 9 | # Unless required by applicable law or agreed to in writing, software 10 | # distributed under the License is distributed on an "AS IS" BASIS, 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | # See the License for the specific language governing permissions and 13 | # limitations under the License. 14 | # ============================================================================== 15 | 16 | """Utility functions for creating TFRecord data sets. 17 | source: https://github.com/tensorflow/models/blob/master/research/object_detection/utils/dataset_util.py 18 | """ 19 | 20 | import tensorflow as tf 21 | 22 | 23 | def int64_feature(value): 24 | return tf.train.Feature(int64_list=tf.train.Int64List(value=[value])) 25 | 26 | 27 | def int64_list_feature(value): 28 | return tf.train.Feature(int64_list=tf.train.Int64List(value=value)) 29 | 30 | 31 | def bytes_feature(value): 32 | return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value])) 33 | 34 | 35 | def bytes_list_feature(value): 36 | return tf.train.Feature(bytes_list=tf.train.BytesList(value=value)) 37 | 38 | 39 | def float_list_feature(value): 40 | return tf.train.Feature(float_list=tf.train.FloatList(value=value)) 41 | 42 | 43 | def read_examples_list(path): 44 | """Read list of training or validation examples. 45 | 46 | The file is assumed to contain a single example per line where the first 47 | token in the line is an identifier that allows us to find the image and 48 | annotation xml for that example. 49 | 50 | For example, the line: 51 | xyz 3 52 | would allow us to find files xyz.jpg and xyz.xml (the 3 would be ignored). 53 | 54 | Args: 55 | path: absolute path to examples list file. 56 | 57 | Returns: 58 | list of example identifiers (strings). 59 | """ 60 | with tf.gfile.GFile(path) as fid: 61 | lines = fid.readlines() 62 | return [line.strip().split(' ')[0] for line in lines] 63 | 64 | 65 | def recursive_parse_xml_to_dict(xml): 66 | """Recursively parses XML contents to python dict. 67 | 68 | We assume that `object` tags are the only ones that can appear 69 | multiple times at the same level of a tree. 70 | 71 | Args: 72 | xml: xml tree obtained by parsing XML file contents using lxml.etree 73 | 74 | Returns: 75 | Python dictionary holding XML contents. 76 | """ 77 | if not xml: 78 | return {xml.tag: xml.text} 79 | result = {} 80 | for child in xml: 81 | child_result = recursive_parse_xml_to_dict(child) 82 | if child.tag != 'object': 83 | result[child.tag] = child_result[child.tag] 84 | else: 85 | if child.tag not in result: 86 | result[child.tag] = [] 87 | result[child.tag].append(child_result[child.tag]) 88 | return {xml.tag: result} 89 | 90 | 91 | def make_initializable_iterator(dataset): 92 | """Creates an iterator, and initializes tables. 
93 | 94 | This is useful in cases where make_one_shot_iterator wouldn't work because 95 | the graph contains a hash table that needs to be initialized. 96 | 97 | Args: 98 | dataset: A `tf.data.Dataset` object. 99 | 100 | Returns: 101 | A `tf.data.Iterator`. 102 | """ 103 | iterator = dataset.make_initializable_iterator() 104 | tf.add_to_collection(tf.GraphKeys.TABLE_INITIALIZERS, iterator.initializer) 105 | return iterator 106 | 107 | 108 | def read_dataset( 109 | file_read_func, decode_func, input_files, config, num_workers=1, 110 | worker_index=0): 111 | """Reads a dataset, and handles repetition and shuffling. 112 | 113 | Args: 114 | file_read_func: Function to use in tf.data.Dataset.interleave, to read 115 | every individual file into a tf.data.Dataset. 116 | decode_func: Function to apply to all records. 117 | input_files: A list of file paths to read. 118 | config: A input_reader_builder.InputReader object. 119 | num_workers: Number of workers / shards. 120 | worker_index: Id for the current worker. 121 | 122 | Returns: 123 | A tf.data.Dataset based on config. 124 | """ 125 | # Shard, shuffle, and read files. 126 | filenames = tf.concat([tf.matching_files(pattern) for pattern in input_files], 127 | 0) 128 | dataset = tf.data.Dataset.from_tensor_slices(filenames) 129 | dataset = dataset.shard(num_workers, worker_index) 130 | dataset = dataset.repeat(config.num_epochs or None) 131 | if config.shuffle: 132 | dataset = dataset.shuffle(config.filenames_shuffle_buffer_size, 133 | reshuffle_each_iteration=True) 134 | 135 | # Read file records and shuffle them. 136 | # If cycle_length is larger than the number of files, more than one reader 137 | # will be assigned to the same file, leading to repetition. 138 | cycle_length = tf.cast( 139 | tf.minimum(config.num_readers, tf.size(filenames)), tf.int64) 140 | # TODO: find the optimal block_length. 141 | dataset = dataset.interleave( 142 | file_read_func, cycle_length=cycle_length, block_length=1) 143 | 144 | if config.shuffle: 145 | dataset = dataset.shuffle(config.shuffle_buffer_size, 146 | reshuffle_each_iteration=True) 147 | 148 | dataset = dataset.map(decode_func, num_parallel_calls=config.num_readers) 149 | return dataset.prefetch(config.prefetch_buffer_size) 150 | -------------------------------------------------------------------------------- /utils/preprocessing.py: -------------------------------------------------------------------------------- 1 | """Utility functions for preprocessing data sets.""" 2 | 3 | from PIL import Image 4 | import numpy as np 5 | import tensorflow as tf 6 | 7 | _R_MEAN = 123.68 8 | _G_MEAN = 116.78 9 | _B_MEAN = 103.94 10 | 11 | # colour map 12 | label_colours = [(0, 0, 0), # 0=background 13 | # 1=aeroplane, 2=bicycle, 3=bird, 4=boat, 5=bottle 14 | (128, 0, 0), (0, 128, 0), (128, 128, 0), (0, 0, 128), (128, 0, 128), 15 | # 6=bus, 7=car, 8=cat, 9=chair, 10=cow 16 | (0, 128, 128), (128, 128, 128), (64, 0, 0), (192, 0, 0), (64, 128, 0), 17 | # 11=dining table, 12=dog, 13=horse, 14=motorbike, 15=person 18 | (192, 128, 0), (64, 0, 128), (192, 0, 128), (64, 128, 128), (192, 128, 128), 19 | # 16=potted plant, 17=sheep, 18=sofa, 19=train, 20=tv/monitor 20 | (0, 64, 0), (128, 64, 0), (0, 192, 0), (128, 192, 0), (0, 64, 128)] 21 | 22 | 23 | def decode_labels(mask, num_images=1, num_classes=21): 24 | """Decode batch of segmentation masks. 25 | 26 | Args: 27 | mask: result of inference after taking argmax. 28 | num_images: number of images to decode from the batch. 
 29 |     num_classes: number of classes to predict (including background).
 30 | 
 31 |   Returns:
 32 |     A batch with num_images RGB images of the same size as the input.
 33 |   """
 34 |   n, h, w, c = mask.shape
 35 |   assert (n >= num_images), 'Batch size %d should be greater than or equal to the number of images to save %d.' \
 36 |       % (n, num_images)
 37 |   outputs = np.zeros((num_images, h, w, 3), dtype=np.uint8)
 38 |   for i in range(num_images):
 39 |     img = Image.new('RGB', (len(mask[i, 0]), len(mask[i])))
 40 |     pixels = img.load()
 41 |     for j_, j in enumerate(mask[i, :, :, 0]):
 42 |       for k_, k in enumerate(j):
 43 |         if k < num_classes:
 44 |           pixels[k_, j_] = label_colours[k]
 45 |     outputs[i] = np.array(img)
 46 |   return outputs
 47 | 
 48 | 
 49 | def mean_image_addition(image, means=(_R_MEAN, _G_MEAN, _B_MEAN)):
 50 |   """Adds the given means to each image channel.
 51 | 
 52 |   For example:
 53 |     means = [123.68, 116.779, 103.939]
 54 |     image = mean_image_addition(image, means)
 55 | 
 56 |   Note that the rank of `image` must be known.
 57 | 
 58 |   Args:
 59 |     image: a tensor of size [height, width, C].
 60 |     means: a C-vector of values to add to each channel.
 61 | 
 62 |   Returns:
 63 |     the image with the channel means added back.
 64 | 
 65 |   Raises:
 66 |     ValueError: If the rank of `image` is unknown, if `image` has a rank other
 67 |       than three or if the number of channels in `image` doesn't match the
 68 |       number of values in `means`.
 69 |   """
 70 |   if image.get_shape().ndims != 3:
 71 |     raise ValueError('Input must be of size [height, width, C>0]')
 72 |   num_channels = image.get_shape().as_list()[-1]
 73 |   if len(means) != num_channels:
 74 |     raise ValueError('len(means) must match the number of channels')
 75 | 
 76 |   channels = tf.split(axis=2, num_or_size_splits=num_channels, value=image)
 77 |   for i in range(num_channels):
 78 |     channels[i] += means[i]
 79 |   return tf.concat(axis=2, values=channels)
 80 | 
 81 | 
 82 | def mean_image_subtraction(image, means=(_R_MEAN, _G_MEAN, _B_MEAN)):
 83 |   """Subtracts the given means from each image channel.
 84 | 
 85 |   For example:
 86 |     means = [123.68, 116.779, 103.939]
 87 |     image = mean_image_subtraction(image, means)
 88 | 
 89 |   Note that the rank of `image` must be known.
 90 | 
 91 |   Args:
 92 |     image: a tensor of size [height, width, C].
 93 |     means: a C-vector of values to subtract from each channel.
 94 | 
 95 |   Returns:
 96 |     the centered image.
 97 | 
 98 |   Raises:
 99 |     ValueError: If the rank of `image` is unknown, if `image` has a rank other
100 |       than three or if the number of channels in `image` doesn't match the
101 |       number of values in `means`.
102 |   """
103 |   if image.get_shape().ndims != 3:
104 |     raise ValueError('Input must be of size [height, width, C>0]')
105 |   num_channels = image.get_shape().as_list()[-1]
106 |   if len(means) != num_channels:
107 |     raise ValueError('len(means) must match the number of channels')
108 | 
109 |   channels = tf.split(axis=2, num_or_size_splits=num_channels, value=image)
110 |   for i in range(num_channels):
111 |     channels[i] -= means[i]
112 |   return tf.concat(axis=2, values=channels)
113 | 
114 | 
115 | def random_rescale_image_and_label(image, label, min_scale, max_scale):
116 |   """Randomly rescales an image and label within a target scale range.
117 | 
118 |   The same random scale factor, drawn uniformly from [min_scale, max_scale], is applied to both the image and the label.
119 | 
120 |   Args:
121 |     image: 3-D Tensor of shape `[height, width, channels]`.
122 |     label: 3-D Tensor of shape `[height, width, 1]`.
123 |     min_scale: Min target scale.
124 |     max_scale: Max target scale.
125 | 
126 |   Returns:
127 |     Rescaled image and label.
128 |     If `image` was 3-D, a 3-D float Tensor of shape
129 |       `[new_height, new_width, channels]`.
130 |     If `label` was 3-D, a 3-D Tensor of shape
131 |       `[new_height, new_width, 1]`.
132 |   """
133 |   if min_scale <= 0:
134 |     raise ValueError('\'min_scale\' must be greater than 0.')
135 |   elif max_scale <= 0:
136 |     raise ValueError('\'max_scale\' must be greater than 0.')
137 |   elif min_scale >= max_scale:
138 |     raise ValueError('\'max_scale\' must be greater than \'min_scale\'.')
139 | 
140 |   shape = tf.shape(image)
141 |   height = tf.to_float(shape[0])
142 |   width = tf.to_float(shape[1])
143 |   scale = tf.random_uniform(
144 |       [], minval=min_scale, maxval=max_scale, dtype=tf.float32)
145 |   new_height = tf.to_int32(height * scale)
146 |   new_width = tf.to_int32(width * scale)
147 |   image = tf.image.resize_images(image, [new_height, new_width],
148 |                                  method=tf.image.ResizeMethod.BILINEAR)
149 |   # Since label classes are integers, nearest-neighbor interpolation needs to be used.
150 |   label = tf.image.resize_images(label, [new_height, new_width],
151 |                                  method=tf.image.ResizeMethod.NEAREST_NEIGHBOR)
152 | 
153 |   return image, label
154 | 
155 | 
156 | def random_crop_or_pad_image_and_label(image, label, crop_height, crop_width, ignore_label):
157 |   """Crops and/or pads an image and label to a target width and height.
158 | 
159 |   Resizes an image and label to a target width and height by randomly
160 |   cropping them or padding them with zeros.
161 | 
162 |   Args:
163 |     image: 3-D Tensor of shape `[height, width, channels]`.
164 |     label: 3-D Tensor of shape `[height, width, 1]`.
165 |     crop_height: The new height.
166 |     crop_width: The new width.
167 |     ignore_label: Label class to be ignored.
168 | 
169 |   Returns:
170 |     Cropped and/or padded image and label:
171 |     a 3-D float Tensor of shape `[crop_height, crop_width, channels]` and
172 |     a 3-D int32 Tensor of shape `[crop_height, crop_width, 1]`.
173 |   """
174 |   label = label - ignore_label  # Shift labels so that zero padding maps back to ignore_label.
175 |   label = tf.to_float(label)
176 |   image_height = tf.shape(image)[0]
177 |   image_width = tf.shape(image)[1]
178 |   image_and_label = tf.concat([image, label], axis=2)
179 |   image_and_label_pad = tf.image.pad_to_bounding_box(
180 |       image_and_label, 0, 0,
181 |       tf.maximum(crop_height, image_height),
182 |       tf.maximum(crop_width, image_width))
183 |   image_and_label_crop = tf.random_crop(
184 |       image_and_label_pad, [crop_height, crop_width, 4])
185 | 
186 |   image_crop = image_and_label_crop[:, :, :3]
187 |   label_crop = image_and_label_crop[:, :, 3:]
188 |   label_crop += ignore_label
189 |   label_crop = tf.to_int32(label_crop)
190 | 
191 |   return image_crop, label_crop
192 | 
193 | 
194 | def random_flip_left_right_image_and_label(image, label):
195 |   """Randomly flips an image and label horizontally (left to right).
196 | 
197 |   Args:
198 |     image: A 3-D tensor of shape `[height, width, channels]`.
199 |     label: A 3-D tensor of shape `[height, width, 1]`.
200 | 
201 |   Returns:
202 |     A 3-D tensor of the same type and shape as `image`.
203 |     A 3-D tensor of the same type and shape as `label`.
204 |   """
205 |   uniform_random = tf.random_uniform([], 0, 1.0)
206 |   mirror_cond = tf.less(uniform_random, .5)
207 |   image = tf.cond(mirror_cond, lambda: tf.reverse(image, [1]), lambda: image)
208 |   label = tf.cond(mirror_cond, lambda: tf.reverse(label, [1]), lambda: label)
209 | 
210 |   return image, label
211 | 
212 | 
213 | def eval_input_fn(image_filenames, label_filenames=None, batch_size=1):
214 |   """An input function for evaluation and inference.
215 | 
216 |   Args:
217 |     image_filenames: The file names of the images to run inference on.
218 |     label_filenames: The file names for the ground truth labels.
219 |     batch_size: The number of samples per batch. Must be 1 when the
220 |       images have different sizes.
221 | 
222 |   Returns:
223 |     A tuple of images and labels.
224 |   """
225 |   # Reads an image from a file and decodes it into a dense tensor.
226 |   def _parse_function(filename, is_label):
227 |     if not is_label:
228 |       image_filename, label_filename = filename, None
229 |     else:
230 |       image_filename, label_filename = filename
231 | 
232 |     image_string = tf.read_file(image_filename)
233 |     image = tf.image.decode_image(image_string)
234 |     image = tf.to_float(tf.image.convert_image_dtype(image, dtype=tf.uint8))
235 |     image.set_shape([None, None, 3])
236 | 
237 |     image = mean_image_subtraction(image)
238 | 
239 |     if not is_label:
240 |       return image
241 |     else:
242 |       label_string = tf.read_file(label_filename)
243 |       label = tf.image.decode_image(label_string)
244 |       label = tf.to_int32(tf.image.convert_image_dtype(label, dtype=tf.uint8))
245 |       label.set_shape([None, None, 1])
246 | 
247 |       return image, label
248 | 
249 |   if label_filenames is None:
250 |     input_filenames = image_filenames
251 |   else:
252 |     input_filenames = (image_filenames, label_filenames)
253 | 
254 |   dataset = tf.data.Dataset.from_tensor_slices(input_filenames)
255 |   if label_filenames is None:
256 |     dataset = dataset.map(lambda x: _parse_function(x, False))
257 |   else:
258 |     dataset = dataset.map(lambda x, y: _parse_function((x, y), True))
259 |   dataset = dataset.prefetch(batch_size)
260 |   dataset = dataset.batch(batch_size)
261 |   iterator = dataset.make_one_shot_iterator()
262 | 
263 |   if label_filenames is None:
264 |     images = iterator.get_next()
265 |     labels = None
266 |   else:
267 |     images, labels = iterator.get_next()
268 | 
269 |   return images, labels
270 | 
--------------------------------------------------------------------------------
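
The helpers in `utils/preprocessing.py` are meant to be chained inside a `tf.data` pipeline. The sketch below is not part of the repository; it shows one plausible way to compose them for training-time augmentation under TensorFlow 1.x. The crop size, scale range, and ignore-label value are illustrative assumptions, and the repository's own `preprocess_image` in `train.py` remains the authoritative version.

```python
# Usage sketch (not from this repository): chains the preprocessing helpers
# for training-time augmentation. The constants are assumptions chosen for
# illustration, not values taken from this codebase.
import tensorflow as tf

from utils import preprocessing

_CROP_HEIGHT, _CROP_WIDTH = 513, 513   # assumed crop size
_MIN_SCALE, _MAX_SCALE = 0.5, 2.0      # assumed random-scale range
_IGNORE_LABEL = 255                    # assumed "ignore" label value


def preprocess_for_training(image, label):
  """Randomly rescales, crops/pads, flips, and mean-subtracts one example."""
  image = tf.to_float(image)  # the helpers below expect a float image
  image, label = preprocessing.random_rescale_image_and_label(
      image, label, _MIN_SCALE, _MAX_SCALE)
  image, label = preprocessing.random_crop_or_pad_image_and_label(
      image, label, _CROP_HEIGHT, _CROP_WIDTH, _IGNORE_LABEL)
  image, label = preprocessing.random_flip_left_right_image_and_label(
      image, label)
  image = preprocessing.mean_image_subtraction(image)
  return image, label
```

At evaluation or inference time, `eval_input_fn` can instead feed variable-size images one at a time, and `decode_labels` converts the resulting argmax predictions into colour masks for visualization.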