├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── LICENSE
├── README.md
├── config_training.py
├── export_path.sh
├── labels
│   ├── annos.csv
│   ├── luna_candidate_all.csv
│   ├── lunaqualified.csv
│   ├── lunaqualified_all.csv
│   ├── shorter.csv
│   └── val9_sid.csv
├── preprocess
│   ├── make_validate_npy.py
│   └── prepare.py
├── train.sh
└── training
    ├── data.py
    ├── layers.py
    ├── main.py
    ├── res_classifier.py
    └── utils.py

/CODE_OF_CONDUCT.md:
--------------------------------------------------------------------------------
# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment
include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or
  advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
  address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project team at dypark@airi.kr. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq

--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
# Introduction

Thank you for considering contributing to the xai project.

### Why you should read these guidelines

Following this document helps developers and maintainers communicate. Even though this is a small project, we want to minimize communication costs.

### We need contributors

The xai project is an open source project, and we welcome contributors.
You can contribute via pull requests, bug reports, issues, refactoring, or any questions.

# Ground Rules
### Reporting issues
When you report an issue, please include details of your environment so the issue can be reproduced:
- data
- OS
- deep learning platform
- sequence of execution

### Community
If you have any questions, open an issue in the GitHub repository or email dypark@airi.kr.

--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
MIT License

Copyright (c) 2018 AIRI

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# lung_nodule_classifier
A lung nodule classifier for 9 LIDC attributes.

# Prerequisites
- Python 3.5.2
- PyTorch 0.2.0

# LIDC 9 attributes
- malignancy (range: 1~5)
- sphericity (range: 1~5)
- margin (range: 1~5)
- spiculation (range: 1~5)
- texture (range: 1~5)
- calcification (range: 1~6)
- internal structure (range: 1~4)
- lobulation (range: 1~5)
- subtlety (range: 1~5)

# Result
(See the evaluation sketch after the file listings below.)

| Attribute          | MAE under 0.1 | MAE under 0.2 | MAE under 0.3 | MAE under 0.4 | MAE under 0.5 | MAE under 0.6 | MAE under 0.7 | MAE under 0.8 | MAE under 0.9 | MAE under 1 |
|--------------------|---------------|---------------|---------------|---------------|---------------|---------------|---------------|---------------|---------------|-------------|
| malignancy         | 15.24%        | 24.76%        | 42.86%        | 60.00%        | 71.43%        | 80.95%        | 86.67%        | 92.38%        | 93.33%        | 94.29%      |
| sphericity         | 17.14%        | 29.52%        | 50.48%        | 60.00%        | 67.62%        | 72.38%        | 80.95%        | 84.76%        | 89.52%        | 92.38%      |
| margin             | 19.05%        | 38.10%        | 48.57%        | 60.95%        | 67.62%        | 77.14%        | 82.86%        | 85.71%        | 86.67%        | 89.52%      |
| spiculation        | 21.90%        | 37.14%        | 63.81%        | 72.38%        | 76.19%        | 81.90%        | 83.81%        | 87.62%        | 88.57%        | 91.43%      |
| texture            | 22.86%        | 59.05%        | 76.19%        | 80.95%        | 84.76%        | 84.76%        | 86.67%        | 89.52%        | 91.43%        | 92.38%      |
| calcification      | 89.52%        | 89.52%        | 92.38%        | 93.33%        | 93.33%        | 93.33%        | 97.14%        | 99.05%        | 99.05%        | 100.00%     |
| internal_structure | 100.00%       | 100.00%       | 100.00%       | 100.00%       | 100.00%       | 100.00%       | 100.00%       | 100.00%       | 100.00%       | 100.00%     |
| lobulation         | 19.05%        | 31.43%        | 52.38%        | 67.62%        | 75.24%        | 79.05%        | 85.71%        | 87.62%        | 91.43%        | 92.38%      |
| subtlety           | 15.24%        | 32.38%        | 50.48%        | 57.14%        | 67.62%        | 71.43%        | 80.95%        | 86.67%        | 86.67%        | 88.57%      |

# Data Download
- https://luna16.grand-challenge.org/download/
  - download the CT data and candidates_V2.csv
- https://wiki.cancerimagingarchive.net/display/Public/LIDC-IDRI
  - download the Radiologist Annotations/Segmentations (XML)

# Training
- set the LUNA and LIDC paths: edit config_training.py
- export the Python path: source export_path.sh
- make the train/val index npy files: python preprocess/make_validate_npy.py
- make the preprocessed data: python preprocess/prepare.py
- train: sh train.sh

# Reference Code
- https://github.com/lfz/DSB2017
- https://github.com/juliandewit/kaggle_ndsb2017

--------------------------------------------------------------------------------
/config_training.py:
--------------------------------------------------------------------------------
config = {'luna_raw': '/root/ssd_data/LUNA/',
          'luna_data': '/root/ssd_data/LUNA/allset/',
          'preprocess_result_path': '/root/ssd_data/luna_segment_attribute/',
          'luna_abbr': './labels/shorter.csv',
          'luna_label': './labels/lunaqualified_all.csv',
          'luna_candidate_label': './labels/luna_candidate_all.csv',
          'lidc_xml': './lidc_xml',
          'preprocessing_backend': 'python'
          }

--------------------------------------------------------------------------------
/export_path.sh:
--------------------------------------------------------------------------------
export PYTHONPATH=$PWD:$PYTHONPATH
echo $PYTHONPATH
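
The "MAE under t" columns in the Result table above appear to report, for each attribute, the percentage of validation nodules whose absolute prediction error stays under the threshold t. Below is a minimal sketch of that reading; `preds` and `targets` stand for arrays of predicted and annotated attribute scores, and the helper name `error_under_threshold` is hypothetical, not part of this repository's code.

```python
import numpy as np

def error_under_threshold(preds, targets,
                          thresholds=(0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0)):
    """Fraction of samples whose absolute prediction error is under each threshold."""
    abs_err = np.abs(np.asarray(preds, dtype=float) - np.asarray(targets, dtype=float))
    return {t: float((abs_err < t).mean()) for t in thresholds}

# Toy example with three malignancy scores on the 1~5 scale.
print(error_under_threshold([3.2, 4.9, 1.1], [3.0, 5.0, 2.0]))
```

Multiplying the returned fractions by 100 gives percentages on the same scale as the table.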
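The label files referenced by config_training.py live under ./labels/. Judging from the data below, shorter.csv (config['luna_abbr']) maps a zero-padded three-digit patient index to a full LIDC-IDRI series UID, and each row of lunaqualified.csv looks like patient index, world x, y, z and nodule diameter in mm, following the LUNA16 annotation layout. That column reading is an assumption, as are the helper names in this small sketch of how the two files could be joined:

```python
import csv

def load_abbr_to_uid(path='./labels/shorter.csv'):
    # Assumed layout: "000,1.3.6.1.4.1.14519..." -> {abbreviated id: series UID}
    with open(path) as f:
        return {row[0]: row[1] for row in csv.reader(f) if row}

def load_nodule_labels(path='./labels/lunaqualified.csv'):
    # Assumed columns: patient index, world x, world y, world z, diameter (mm).
    with open(path) as f:
        return [(row[0].zfill(3), [float(v) for v in row[1:4]], float(row[4]))
                for row in csv.reader(f) if row]

uid_map = load_abbr_to_uid()
for pid, xyz, diameter in load_nodule_labels()[:3]:
    print(uid_map[pid], xyz, diameter)
```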
-------------------------------------------------------------------------------- /labels/lunaqualified.csv: -------------------------------------------------------------------------------- 1 | 5,-24.014,192.1,-391.08,8.1433 2 | 5,2.4415,172.46,-405.49,18.545 3 | 5,90.932,149.03,-426.54,18.209 4 | 5,89.541,196.41,-515.07,16.381 5 | 7,81.51,54.957,-150.35,10.362 6 | 10,105.06,19.825,-91.247,21.09 7 | 12,-124.83,127.25,-473.06,10.466 8 | 14,-106.9,21.923,-126.92,9.7453 9 | 16,2.2638,33.526,-170.64,7.1685 10 | 17,-70.551,66.359,-160.94,6.6422 11 | 19,-96.44,9.7362,-175.04,6.7538 12 | 20,-57.087,74.259,1790.5,13.694 13 | 21,-98.136,-72.868,-221.82,18.128 14 | 22,-69.127,-80.567,-189.81,8.648 15 | 26,46.189,48.403,-108.58,13.596 16 | 28,-38.096,-106.71,-139.55,13.684 17 | 31,-80.952,10.343,-28.399,9.4544 18 | 34,47.671,37.643,-99.89,30.61 19 | 34,-60.764,-66.435,-100.21,16.883 20 | 35,42.296,56.152,-84.662,6.478 21 | 35,99.375,23.947,-138.57,9.1046 22 | 40,-127.53,-29.608,-253.7,8.2693 23 | 40,93.139,-5.6526,-213.91,6.2098 24 | 42,-34.497,-9.3119,-37.448,7.3606 25 | 43,86.403,50.875,-81.387,12.734 26 | 47,88.256,230.52,-623.3,9.0118 27 | 47,55.714,190.76,-579.65,10.889 28 | 49,-63.098,-56.555,-332.98,10.333 29 | 49,-164.35,-136.66,-301.58,9.2206 30 | 50,-68.621,-84.035,-160.52,6.3342 31 | 51,-117.72,160.49,-688.29,15.343 32 | 57,49.313,-63.215,-118.8,22.133 33 | 60,-94.292,60.226,-261.21,14.198 34 | 60,-54.618,59.352,-236.07,13.932 35 | 65,-129.05,-12.239,-79.498,21.828 36 | 68,-86.737,50.435,-156.61,9.2598 37 | 68,78.94,15.801,-95.356,14.953 38 | 68,-60.615,50.429,-95.987,6.7366 39 | 74,59.114,-158.58,-118.04,6.7203 40 | 75,-96.96,48.936,-68.038,16.978 41 | 77,97.49,28.55,-160.53,8.8428 42 | 78,33.833,88.412,-101.78,10.463 43 | 78,-104.49,78.052,-208.33,56.865 44 | 79,69.529,-24.415,-137.82,7.0358 45 | 80,-99.859,37.434,-170.46,12.319 46 | 80,-78.667,66.422,-168.87,13.025 47 | 81,44.141,100.38,-139.84,6.1334 48 | 90,-91.919,87.004,1453.8,14.44 49 | 90,-59.723,84.839,1368.4,9.7696 50 | 91,74.282,57.094,-122.28,6.7066 51 | 93,-33.326,50.441,-253.8,11.345 52 | 93,-57.186,78.725,-133.62,13.652 53 | 94,-61.475,-140.37,-59.317,9.3264 54 | 95,-119.43,-48.796,-223.6,12.035 55 | 97,-84.047,-42.015,-146.81,7.0537 56 | 97,-70.323,56.204,-82.479,7.8888 57 | 97,-86.854,15.042,-72.327,9.2583 58 | 98,-109.14,-124.56,-105.89,14.151 59 | 101,-103.72,153.39,-465.85,11.14 60 | 101,-80.446,183.94,-467.88,17.694 61 | 102,88.825,-213.34,168.26,17.746 62 | 102,57.179,-191.04,182.11,10.076 63 | 105,53.277,66.297,-254.54,17.374 64 | 107,-91.212,-159.22,-181.03,6.3107 65 | 107,-62.943,-91.857,-154.36,11.686 66 | 108,-108.13,-176.56,-192.96,8.6216 67 | 114,97.376,-102.6,-156.34,11.232 68 | 115,-108.03,59.544,-256.41,8.438 69 | 118,-101.01,24.07,-108.76,11.629 70 | 120,156.49,-149.4,-280.77,6.0496 71 | 121,67.927,-24.895,-78.858,16.595 72 | 122,37.069,-156.56,959.58,7.3034 73 | 126,-76.87,-149.53,-127.54,18.793 74 | 130,138.48,50.14,-311.36,16.185 75 | 138,91.419,-21.567,-257.8,24.773 76 | 139,84.438,41.64,-38.625,6.4562 77 | 142,-89.669,34.299,-128.56,23.484 78 | 144,58.513,75.794,-279.29,15.992 79 | 146,-71.614,20.407,-22.784,17.347 80 | 147,38.149,1.8027,-70.907,7.047 81 | 148,-114.09,-229.05,-171.18,6.3463 82 | 150,-50.505,124.85,-550.37,6.8671 83 | 154,112.5,-43.727,-199.01,12.817 84 | 154,104.39,-9.9725,-183.34,9.8437 85 | 154,55.391,11.937,-63.155,13.736 86 | 156,-83.567,69.861,-119.36,16.451 87 | 157,-32.449,49.382,-60.319,24.282 88 | 158,-57.439,131.23,-520.36,15.485 89 | 161,-118.04,-3.2187,-165.58,6.4683 90 | 
167,-87.631,-25.153,-119.22,11.476 91 | 168,-48.731,19.905,-97.42,7.7654 92 | 170,-58.645,29.438,-78.844,8.742 93 | 172,39.297,69.528,-210.55,9.76 94 | 176,-105.42,250.82,-624.21,9.3591 95 | 176,-53.409,239.45,-635.49,9.3763 96 | 176,57.832,268.89,-646.46,8.1524 97 | 176,36.176,271.41,-690.78,9.4211 98 | 178,116.23,-173.58,-162.72,7.3181 99 | 179,-25.829,-91.168,-200.19,9.5873 100 | 180,17.295,-80.61,-181.42,9.5622 101 | 183,-62.463,40.617,-418.83,21.256 102 | 187,-89.508,-219.18,-204.12,6.9086 103 | 188,100.62,64.757,-211.96,10.572 104 | 188,-87.674,-12.694,-120.93,20.484 105 | 190,38.161,-132.23,-198.8,8.8337 106 | 190,97.9,-130.77,-156.65,12.565 107 | 191,-27.958,53.79,-10.409,6.2506 108 | 194,-51.89,97.114,-115.61,16.333 109 | 197,-76.005,38.004,-242.56,6.4639 110 | 199,72.287,76.9,-190.46,21.454 111 | 199,23.491,-47.634,-172.28,8.7858 112 | 199,-47.403,-30.171,-169.62,12.823 113 | 206,-136.71,62.061,-99.277,18.385 114 | 210,-51.044,75.2,-101.55,14.837 115 | 210,-116.29,21.161,-124.62,10.888 116 | 210,-111.19,-1.2645,-138.7,17.397 117 | 210,73.775,37.278,-118.31,8.6483 118 | 212,76.494,88.158,-193.16,19.044 119 | 214,104.09,25.522,-140.17,16.043 120 | 216,-43.912,-43.242,-141.7,9.1472 121 | 216,-36.513,-65.215,-124.43,14.616 122 | 219,71.122,103.58,-148.45,7.046 123 | 219,49.946,39.217,-239.83,12.381 124 | 219,72.371,83.674,-244.38,6.3371 125 | 219,-35.369,83.725,-150.25,7.0258 126 | 219,-41.69,-7.1446,-158.72,7.6707 127 | 219,-33.076,103.02,-192.64,6.4507 128 | 219,-53.332,96.459,-226.26,10.985 129 | 223,-66.474,58.44,-109.36,25.873 130 | 231,56.208,86.343,-115.87,23.351 131 | 232,134.42,86.577,-762.59,7.4018 132 | 233,-78.51,-162.58,-64.409,6.9296 133 | 237,38.836,104.87,-72.041,6.4098 134 | 241,77.436,22.217,-194.86,8.8758 135 | 241,25.469,-15.969,-116.7,7.6153 136 | 242,-82.491,31.001,-213.68,14.157 137 | 242,-36.367,-41.895,-147.41,17.158 138 | 243,-46.594,64.339,-149.79,24.782 139 | 245,-103.49,-63.8,-222.22,7.3528 140 | 246,-43.907,-74.65,-213.26,11.048 141 | 249,-64.795,-35.557,-65.405,11.471 142 | 249,108.24,-23.006,-75.132,9.3466 143 | 249,-50.874,61.489,-138.68,7.8229 144 | 250,94.635,-17.372,-204.4,17.753 145 | 251,91.807,-228.15,-206.73,15.481 146 | 253,-50.869,86.607,-145.77,9.7727 147 | 256,-60.494,70.546,-58.675,8.7091 148 | 257,-55.188,174.39,-405.73,10.938 149 | 257,79.036,209.82,-465.06,6.3055 150 | 260,-95.298,189.23,-500.44,7.7948 151 | 263,-47.597,38.356,-259.52,6.705 152 | 272,-47.715,-32.475,-77.44,7.1416 153 | 274,-32.754,8.7772,-111.78,6.9752 154 | 275,56.24,-92.29,-277.1,8.983 155 | 278,-47.899,-10.225,-96.114,20.16 156 | 279,-106.86,9.0482,-122.35,23.688 157 | 283,92.366,18.315,-84.536,9.6421 158 | 283,-85.442,-50.321,-177.47,6.3226 159 | 286,59.257,-34.723,-134.84,13.136 160 | 286,78.296,-16.929,-163.95,19.348 161 | 286,64.43,70.386,-216.67,20.525 162 | 290,-87.98,-170.44,-67.398,8.2605 163 | 296,88.873,-97.927,-169.22,8.181 164 | 296,66.313,-135.7,-42.758,8.824 165 | 298,-109.18,70.159,-107.62,8.3918 166 | 304,-82.989,8.8409,-85.892,8.6046 167 | 305,-83.158,-21.679,-97.004,18.783 168 | 310,-130.89,-160.5,851.78,13.161 169 | 311,57.42,33.204,-61.85,13.82 170 | 312,99.829,-146.12,-169.53,18.825 171 | 317,-69.812,-69.682,-166.63,12.627 172 | 317,75.713,12.583,-135.3,9.9007 173 | 318,67.247,-122.55,-155.52,20.412 174 | 319,138.64,-118.77,-60.254,7.7485 175 | 322,98.19,37.786,-70.467,14.881 176 | 322,-75.666,-51.486,-93.677,10.201 177 | 326,33.618,-158.89,-129.44,16.023 178 | 328,52,155.92,-554.11,8.7932 179 | 328,47.055,170.39,-527.55,19.849 180 | 
328,-64.764,222.43,-520.04,10.073 181 | 329,68.293,71.327,-224.24,9.0022 182 | 332,54.578,137.28,-626.59,13.963 183 | 333,115.37,-179.87,-281.63,10.638 184 | 334,70.309,-48.697,-147.2,7.8624 185 | 334,114.99,36.943,-143.63,7.8417 186 | 335,-39.569,-71.945,-218.78,6.3647 187 | 335,-64.999,-71.028,-185.12,7.2885 188 | 338,96.276,-23.557,-107.37,7.4777 189 | 338,72.191,33.773,-115.45,6.5695 190 | 338,87.579,9.7838,-100.93,9.0882 191 | 340,-31.098,222.31,-512.32,6.7734 192 | 344,141.31,23.488,-242.68,6.376 193 | 353,123.02,85.595,-215.18,12.135 194 | 353,96.413,92.614,-224.12,7.8794 195 | 353,127.59,44.77,-188.87,6.4364 196 | 353,-7.5422,60.292,-240.95,6.0114 197 | 353,-33.967,114.13,-195.07,7.748 198 | 353,-49.514,-13.582,-136.21,8.2449 199 | 353,66.146,36.398,-72.803,9.6986 200 | 353,79.18,73.328,-167.73,6.1028 201 | 354,-102.38,-96.395,-138.86,25.415 202 | 355,118.32,20.247,-155.28,13.849 203 | 357,-84.394,246.93,-508.19,23.572 204 | 358,75.476,71.694,-97.732,7.39 205 | 360,-64.401,7.8415,-75.952,10.722 206 | 364,-55.528,245.44,-758.13,13.843 207 | 365,75.633,111.26,-182.14,6.2795 208 | 366,38.176,76.658,-217.88,13.205 209 | 367,28.822,-67.503,-119.49,14.744 210 | 367,86.343,-24.242,-116.26,6.5421 211 | 368,-88.879,252.97,-542.21,9.5847 212 | 369,-91.776,193.67,-636.67,15.673 213 | 373,-20.261,66.589,-206.24,7.79 214 | 374,-149.07,-156.37,-422.12,6.5873 215 | 375,-79.513,120.48,-68.384,6.3137 216 | 376,72.883,-1.2814,-56.52,17.526 217 | 377,-87.393,181.75,-508.56,13.181 218 | 378,103.49,19.024,-227.21,9.2864 219 | 378,-31.017,-25.196,-196.29,6.2309 220 | 380,-73.172,-55.926,-183.72,14.798 221 | 380,-73.664,-215.81,-183.91,6.4894 222 | 387,97.956,28.735,-133.41,12.167 223 | 387,95.351,50.215,-139.27,12.076 224 | 389,67.291,-2.5114,-97.674,7.9203 225 | 395,-64.823,-60.709,-192.06,7.9851 226 | 396,-103.07,-201.97,-199.26,7.8249 227 | 400,64.682,32.102,-236.2,11.603 228 | 405,-58.146,-55.41,-195.86,6.0064 229 | 409,-50.061,84.283,-182,7.016 230 | 409,-55.354,-21.739,-170.02,6.5678 231 | 411,-67.893,66.626,-201.73,6.2327 232 | 412,37.291,-153.93,-230.49,8.3617 233 | 414,31.352,49.994,-93.09,9.9532 234 | 415,-105.49,-195.79,-231.98,7.3403 235 | 416,-112.3,97.358,-324.51,10.431 236 | 417,36.101,56.747,-225.46,15.27 237 | 417,88.047,82.068,-235.18,9.5143 238 | 417,76.803,80.647,-206.02,8.2204 239 | 417,45.57,34.24,-140.18,12.997 240 | 417,62.562,73.24,-255.62,12.14 241 | 419,74.066,-121.25,-208.58,6.4294 242 | 422,-54.533,-122.34,-106.85,12.068 243 | 432,-101.13,-66.598,-149.26,6.6052 244 | 433,30.322,113.81,-401.13,12.657 245 | 434,93.408,32.615,-183.93,6.4051 246 | 438,67.02,2.846,-85.926,12.468 247 | 439,61.15,65.368,-282.88,8.8158 248 | 439,-82.51,75.767,-284.3,6.765 249 | 439,78.145,84.694,-181.04,8.2797 250 | 439,129.66,32.395,-174.18,11.37 251 | 440,-108.27,-128.03,-247.69,7.0706 252 | 441,145.05,33.853,-189.8,6.2422 253 | 441,123.67,65.766,-210.81,10.831 254 | 442,100,22.006,-147.08,8.8716 255 | 442,127.21,44.798,-164.79,7.7243 256 | 445,-82.014,-28.011,-153.91,10.062 257 | 446,-83.903,-43.197,-275.31,14.813 258 | 448,60.161,67.202,-97.951,19.576 259 | 454,78.919,28.07,-30.648,6.1459 260 | 456,-36.289,37.18,1549.3,12.316 261 | 459,94.799,-131.05,-139.71,6.0735 262 | 461,-115.35,-13.637,-174.06,12.986 263 | 463,-14.233,54.112,-168.7,18.021 264 | 463,-47.1,55.212,-132.72,13.806 265 | 464,-42.793,-131.46,-226.18,12.652 266 | 465,-92.274,49.159,-179.91,6.359 267 | 466,63.27,77.363,-225.93,13.448 268 | 468,-41.618,178.41,-381.25,19.31 269 | 470,-29.123,48.557,-85.505,10.727 270 | 
472,-56.693,-71.398,-160.15,6.0241 271 | 476,36.991,75.969,-211.92,7.9689 272 | 476,-55.693,73.998,-147.35,14.678 273 | 487,-85.254,-46.796,-180.66,6.8566 274 | 488,-71.455,-176.25,241.81,7.9778 275 | 488,-93.681,-176.03,213.21,23.934 276 | 493,-12.416,-184.51,867.18,12.17 277 | 495,-94.171,60.229,-185.15,7.6363 278 | 497,-49.728,-19.986,-151.93,14.314 279 | 497,-71.783,-64.17,-210.64,8.2222 280 | 501,123.64,-5.0811,-262.46,20.389 281 | 503,-36.644,-76.576,-155.61,15.283 282 | 503,107.3,-1.6683,-189.2,20.036 283 | 503,129.14,35.495,-229.66,22.135 284 | 504,-45.805,59.892,-200.99,8.7667 285 | 504,-88.564,20.209,-103.74,7.0767 286 | 506,96.78,1.5657,-113.1,22.039 287 | 507,-48.522,45.382,-90.224,22.881 288 | 510,-72.804,223.43,-567.17,6.0317 289 | 511,-29.741,-18.462,-134.25,17.989 290 | 512,-48.066,-258.05,-250.54,8.4139 291 | 512,-90.94,-230.22,-217.07,7.6638 292 | 513,33.018,94.395,-251.8,8.3283 293 | 517,-67.46,-247.74,-211.1,7.7442 294 | 517,-97.691,-134.07,-187.09,6.69 295 | 518,69.529,87.036,-149.19,10.825 296 | 524,51.489,9.855,-86.835,14.861 297 | 526,-73.966,-92.445,-238.65,7.186 298 | 526,-52.545,-195.13,-180.13,8.4366 299 | 526,-85.941,-169.27,-128.35,7.5956 300 | 526,-73.195,-102.81,-166.83,6.1726 301 | 526,106.69,-126.23,-246.31,22.78 302 | 527,-72.123,7.7971,-217.03,6.8443 303 | 529,110.14,-32.017,-241.1,6.0054 304 | 531,-88.491,4.6908,-184.93,18.51 305 | 533,101.06,-2.9096,-129.73,6.4178 306 | 533,111.8,34.379,-156.34,6.8988 307 | 533,54.867,83.421,-200.52,15.998 308 | 538,-66.079,27.442,-61.404,6.8621 309 | 539,-90.942,9.4226,-72.519,22.225 310 | 543,102.24,65.328,-240.7,18.21 311 | 543,22.536,34.721,-117.44,10.34 312 | 543,67.827,85.38,-109.75,32.27 313 | 545,-59.391,91.298,-217.32,18.999 314 | 545,-65.435,68.629,-282.43,13.553 315 | 545,80.326,58.999,-150.48,12.714 316 | 545,92.707,19.698,-194.27,8.8097 317 | 550,76.666,-163.87,-147.25,15.693 318 | 556,-130.2,27.57,-210.27,9.5032 319 | 557,120.78,26.8,-282.4,7.9035 320 | 558,60.464,6.6847,-166.26,6.2753 321 | 560,92.755,28.68,-68.422,14.011 322 | 565,33.105,-63.116,-108.73,10.759 323 | 566,-79.344,175,-675.27,9.5476 324 | 569,-57.726,73.779,-147.42,11.802 325 | 569,-72.581,3.1897,-124.79,7.7408 326 | 571,118.89,9.5434,-223.39,22.305 327 | 571,-32.576,-38.5,-195.9,9.6516 328 | 572,-55.107,-92.467,-105.13,6.8214 329 | 578,91.67,-140.49,1158,13.058 330 | 578,91.67,-140.49,1158,13.058 331 | 578,73.324,-154.08,1175,12.764 332 | 582,-35.865,50.09,-54.978,11.73 333 | 588,-61.935,226.79,-379.06,6.9729 334 | 588,22.141,196.1,-367.36,9.2539 335 | 595,101.25,15.809,-105.57,16.095 336 | 595,117.15,-1.52,-208.73,21.677 337 | 602,48.842,-79.993,-170.12,7.7471 338 | 603,-44.746,15.745,-199.21,11.066 339 | 605,108.35,17.664,-144.35,7.1993 340 | 605,-96.744,37.574,-138.16,9.0084 341 | 605,64.422,10.399,-97.913,6.1804 342 | 605,-61.414,86.922,-68.295,8.7486 343 | 606,-43.175,-70.665,-149.53,8.489 344 | 607,-46.842,58.349,-64.068,8.365 345 | 610,-89.42,230.47,-614.34,7.6 346 | 612,-6.6072,22.559,-143.79,9.4849 347 | 612,119.21,-11.433,-208.29,6.8534 348 | 614,-96.159,50.987,-126.45,10.321 349 | 614,-129.56,43.951,-164.99,10.418 350 | 614,-54.403,81.76,-286.54,15.711 351 | 615,68.056,43.703,-266.02,12.686 352 | 618,-104.13,5.5618,-43.863,13.14 353 | 627,-97.411,-44.656,-143.89,24.663 354 | 629,-87.927,-49.534,-117,9.4439 355 | 635,-20.724,-82.434,-163.52,9.9911 356 | 636,74.133,-26.753,-252.66,10.039 357 | 638,72.189,-14.469,-195.24,22.469 358 | 638,74.728,30.758,-101.31,18.868 359 | 642,97.837,55.509,-280.98,21.082 360 | 
645,-34.217,-258.21,-180.59,8.7141 361 | 645,-89.744,-177.92,-171.97,11.052 362 | 645,81.069,-250.94,-166.75,9.3302 363 | 646,-87.612,-12.625,-163.86,9.8217 364 | 652,92.332,0.47116,-196.17,11.408 365 | 652,76.373,-58.715,-188.32,6.7624 366 | 652,38.185,-70.857,-177.4,7.0132 367 | 652,73.403,-79.171,-160.47,7.3828 368 | 652,54.101,-74.662,-158.29,6.1625 369 | 655,29.809,60.468,-278.37,6.9554 370 | 655,36.08,-29.736,-89.481,23.043 371 | 659,-52.419,-24.794,-176.16,16.829 372 | 663,-37.951,-30.073,-111.61,7.7364 373 | 664,-36.556,94.954,-231.09,8.2083 374 | 670,-90.57,207.98,-397.63,12.571 375 | 670,-105.17,139.33,-448.43,7.1789 376 | 670,-113.01,127.66,-453.05,8.2446 377 | 677,87.929,-5.6235,-78.695,10.261 378 | 679,-78.231,21.392,-185.92,11.841 379 | 683,108.14,-9.0283,-127.66,9.5669 380 | 685,-59.558,9.1483,-95.453,16.635 381 | 686,-68.66,-222.49,721.58,6.4707 382 | 703,16.893,206.01,-53.009,11.476 383 | 703,-86.933,218.23,-221.67,9.6932 384 | 707,122.16,-197.25,-159.47,12.951 385 | 708,82.98,225.31,-584.62,17.153 386 | 709,-41.47,39.42,-171.62,7.1211 387 | 717,-97.07,-23.312,-166.01,10.803 388 | 717,-136.54,47.458,-191.64,7.8515 389 | 718,-106.91,-134.47,-187.21,7.601 390 | 719,-104.33,-130.46,-81.178,12.304 391 | 723,52.95,21.623,230.69,11.175 392 | 724,67.939,-29.646,-125.73,6.168 393 | 727,89.715,72.842,-235.99,11.232 394 | 735,-152.36,-22.286,-274.37,7.8858 395 | 736,-103.13,-5.7747,-206.36,27.075 396 | 736,32.961,35.982,-176.78,16.923 397 | 738,98.897,104.19,-208.78,8.0856 398 | 739,119.21,11.45,-165.04,26.837 399 | 739,-37.902,55.028,-80.003,17.719 400 | 740,48.683,-15.356,-80.569,22.782 401 | 741,-111.2,37.526,-67.136,15.393 402 | 744,60.775,74.124,-214.78,25.233 403 | 744,-85.421,25.262,-220.83,9.7566 404 | 744,-108.53,-38.825,-199.21,10.989 405 | 746,100.45,-88.728,-171.24,7.8058 406 | 748,123.6,-8.0373,-208.26,7.2838 407 | 752,-28.412,68.094,-217.51,7.2341 408 | 752,105.19,48.219,-129.56,6.1316 409 | 755,59.331,-24.992,-197.04,8.7358 410 | 757,-124.67,21.097,-169.6,6.7319 411 | 760,112.24,-132.08,-179.22,10.127 412 | 762,-31.908,27.181,-170.15,10.168 413 | 762,65.997,108.89,-168.72,7.9188 414 | 762,-126.46,46.472,-203.92,7.0235 415 | 765,-22.966,32.462,-207.39,18.409 416 | 766,-43.535,61.082,-176.67,12.554 417 | 767,-39.57,112.66,-173.71,20.884 418 | 767,-59.919,-50.478,-171.93,6.0099 419 | 767,-12.147,80.919,-147.91,6.872 420 | 767,79.176,56.236,-153.67,8.2223 421 | 770,-96.404,43.841,-155.37,11.487 422 | 772,-73.156,-61.941,-261.6,21.579 423 | 773,-107.93,46.064,-143.15,21.88 424 | 774,127.29,-151.03,-245.18,7.9259 425 | 777,-96.281,53.449,-196.18,17.724 426 | 778,-45.91,-9.4096,-306.95,20.116 427 | 784,-23.139,22.157,-179.84,21.151 428 | 786,-27.135,88.963,-193.55,6.9677 429 | 789,105.98,-105.19,-207.06,21.343 430 | 791,73.043,8.2188,-54.187,15.125 431 | 797,-93.46,60.144,1523.5,7.6745 432 | 804,114.04,-22.69,-161.17,20.911 433 | 807,59.497,-6.6966,-135.92,14.168 434 | 809,-65.365,197.65,-672.1,12.449 435 | 810,56.393,67.68,-64.674,19.654 436 | 812,71.956,105.22,-534.73,7.4785 437 | 812,35.819,171.34,-470.2,8.7279 438 | 813,-68.685,206.2,-685.87,17.156 439 | 815,67.698,-135.42,-167.49,6.6724 440 | 821,-71.666,-60.409,-135.73,13.051 441 | 821,92.054,-11.909,-96.799,12.973 442 | 822,-107.32,-96.37,-211.97,15.354 443 | 823,98.702,160.55,-418.74,13.343 444 | 826,-75.743,37.461,-158.69,17.232 445 | 830,-66.693,97.045,-267.31,17.336 446 | 835,-48.614,-4.9129,-81.285,8.3496 447 | 835,45.606,-33.229,-141.75,12.033 448 | 835,56.741,-38.923,-214.44,17.604 449 | 
835,42.189,76.001,-230.09,9.8834 450 | 835,23.623,78.542,-240.77,11.884 451 | 835,17.205,8.9386,-256.56,8.119 452 | 840,-47.537,-35.797,-137.33,13.642 453 | 842,49.353,119.84,-209.7,6.9435 454 | 843,-85.603,29.067,-107.85,7.2964 455 | 845,-56.878,169.46,960.35,9.4964 456 | 846,132.72,16.665,-130.31,10.438 457 | 846,83.223,36.045,-47.98,16.945 458 | 851,59.088,14.434,-83.776,17.411 459 | 854,-106.77,143.13,-711.11,15.131 460 | 857,109.12,48.59,-120.89,21.583 461 | 858,-59.562,-21.608,-184.87,6.0965 462 | 859,68.244,83.941,-296.89,7.0686 463 | 859,-113.69,61.068,-204.51,6.6779 464 | 859,30.716,-22.257,-151.4,14.221 465 | 859,16.203,-28.689,-142.81,8.8462 466 | 865,64.134,71.916,-126.18,8.5013 467 | 865,-74.892,64.177,-112.4,6.2675 468 | 868,-46.95,72.636,-95.645,27.442 469 | 869,40.315,-80.724,-149.99,8.6871 470 | 870,121.91,-67.416,-158.8,9.0616 471 | 871,-84.152,21.816,-184.79,18.98 472 | 871,-42.465,-36.855,-123.88,19.927 473 | 872,-48.062,-65.097,222.29,6.5296 474 | 873,87.077,222.25,-465.36,12.116 475 | 873,55.973,250.6,-517.77,6.3195 476 | 876,35.106,-25.062,-90.501,8.6206 477 | 876,56.351,52.952,-83.782,14.915 478 | 887,-21.958,33.486,-155.29,23.803 479 | -------------------------------------------------------------------------------- /labels/shorter.csv: -------------------------------------------------------------------------------- 1 | 000,1.3.6.1.4.1.14519.5.2.1.6279.6001.100225287222365663678666836860 2 | 001,1.3.6.1.4.1.14519.5.2.1.6279.6001.100332161840553388986847034053 3 | 002,1.3.6.1.4.1.14519.5.2.1.6279.6001.100398138793540579077826395208 4 | 003,1.3.6.1.4.1.14519.5.2.1.6279.6001.100530488926682752765845212286 5 | 004,1.3.6.1.4.1.14519.5.2.1.6279.6001.100620385482151095585000946543 6 | 005,1.3.6.1.4.1.14519.5.2.1.6279.6001.100621383016233746780170740405 7 | 006,1.3.6.1.4.1.14519.5.2.1.6279.6001.100684836163890911914061745866 8 | 007,1.3.6.1.4.1.14519.5.2.1.6279.6001.100953483028192176989979435275 9 | 008,1.3.6.1.4.1.14519.5.2.1.6279.6001.101228986346984399347858840086 10 | 009,1.3.6.1.4.1.14519.5.2.1.6279.6001.102133688497886810253331438797 11 | 010,1.3.6.1.4.1.14519.5.2.1.6279.6001.102681962408431413578140925249 12 | 011,1.3.6.1.4.1.14519.5.2.1.6279.6001.103115201714075993579787468219 13 | 012,1.3.6.1.4.1.14519.5.2.1.6279.6001.104562737760173137525888934217 14 | 013,1.3.6.1.4.1.14519.5.2.1.6279.6001.104780906131535625872840889059 15 | 014,1.3.6.1.4.1.14519.5.2.1.6279.6001.105495028985881418176186711228 16 | 015,1.3.6.1.4.1.14519.5.2.1.6279.6001.105756658031515062000744821260 17 | 016,1.3.6.1.4.1.14519.5.2.1.6279.6001.106164978370116976238911317774 18 | 017,1.3.6.1.4.1.14519.5.2.1.6279.6001.106379658920626694402549886949 19 | 018,1.3.6.1.4.1.14519.5.2.1.6279.6001.106419850406056634877579573537 20 | 019,1.3.6.1.4.1.14519.5.2.1.6279.6001.106630482085576298661469304872 21 | 020,1.3.6.1.4.1.14519.5.2.1.6279.6001.106719103982792863757268101375 22 | 021,1.3.6.1.4.1.14519.5.2.1.6279.6001.107109359065300889765026303943 23 | 022,1.3.6.1.4.1.14519.5.2.1.6279.6001.107351566259572521472765997306 24 | 023,1.3.6.1.4.1.14519.5.2.1.6279.6001.108193664222196923321844991231 25 | 024,1.3.6.1.4.1.14519.5.2.1.6279.6001.108197895896446896160048741492 26 | 025,1.3.6.1.4.1.14519.5.2.1.6279.6001.108231420525711026834210228428 27 | 026,1.3.6.1.4.1.14519.5.2.1.6279.6001.109002525524522225658609808059 28 | 027,1.3.6.1.4.1.14519.5.2.1.6279.6001.109882169963817627559804568094 29 | 028,1.3.6.1.4.1.14519.5.2.1.6279.6001.110678335949765929063942738609 30 | 
029,1.3.6.1.4.1.14519.5.2.1.6279.6001.111017101339429664883879536171 31 | 030,1.3.6.1.4.1.14519.5.2.1.6279.6001.111172165674661221381920536987 32 | 031,1.3.6.1.4.1.14519.5.2.1.6279.6001.111258527162678142285870245028 33 | 032,1.3.6.1.4.1.14519.5.2.1.6279.6001.111496024928645603833332252962 34 | 033,1.3.6.1.4.1.14519.5.2.1.6279.6001.111780708132595903430640048766 35 | 034,1.3.6.1.4.1.14519.5.2.1.6279.6001.112740418331256326754121315800 36 | 035,1.3.6.1.4.1.14519.5.2.1.6279.6001.112767175295249119452142211437 37 | 036,1.3.6.1.4.1.14519.5.2.1.6279.6001.113586291551175790743673929831 38 | 037,1.3.6.1.4.1.14519.5.2.1.6279.6001.113679818447732724990336702075 39 | 038,1.3.6.1.4.1.14519.5.2.1.6279.6001.113697708991260454310623082679 40 | 039,1.3.6.1.4.1.14519.5.2.1.6279.6001.114195693932194925962391697338 41 | 040,1.3.6.1.4.1.14519.5.2.1.6279.6001.114218724025049818743426522343 42 | 041,1.3.6.1.4.1.14519.5.2.1.6279.6001.114249388265341701207347458535 43 | 042,1.3.6.1.4.1.14519.5.2.1.6279.6001.114914167428485563471327801935 44 | 043,1.3.6.1.4.1.14519.5.2.1.6279.6001.115386642382564804180764325545 45 | 044,1.3.6.1.4.1.14519.5.2.1.6279.6001.116097642684124305074876564522 46 | 045,1.3.6.1.4.1.14519.5.2.1.6279.6001.116492508532884962903000261147 47 | 046,1.3.6.1.4.1.14519.5.2.1.6279.6001.116703382344406837243058680403 48 | 047,1.3.6.1.4.1.14519.5.2.1.6279.6001.117040183261056772902616195387 49 | 048,1.3.6.1.4.1.14519.5.2.1.6279.6001.117383608379722740629083782428 50 | 049,1.3.6.1.4.1.14519.5.2.1.6279.6001.118140393257625250121502185026 51 | 050,1.3.6.1.4.1.14519.5.2.1.6279.6001.119209873306155771318545953948 52 | 051,1.3.6.1.4.1.14519.5.2.1.6279.6001.119304665257760307862874140576 53 | 052,1.3.6.1.4.1.14519.5.2.1.6279.6001.119515474430718803379832249911 54 | 053,1.3.6.1.4.1.14519.5.2.1.6279.6001.119806527488108718706404165837 55 | 054,1.3.6.1.4.1.14519.5.2.1.6279.6001.120196332569034738680965284519 56 | 055,1.3.6.1.4.1.14519.5.2.1.6279.6001.120842785645314664964010792308 57 | 056,1.3.6.1.4.1.14519.5.2.1.6279.6001.121108220866971173712229588402 58 | 057,1.3.6.1.4.1.14519.5.2.1.6279.6001.121391737347333465796214915391 59 | 058,1.3.6.1.4.1.14519.5.2.1.6279.6001.121805476976020513950614465787 60 | 059,1.3.6.1.4.1.14519.5.2.1.6279.6001.121824995088859376862458155637 61 | 060,1.3.6.1.4.1.14519.5.2.1.6279.6001.121993590721161347818774929286 62 | 061,1.3.6.1.4.1.14519.5.2.1.6279.6001.122621219961396951727742490470 63 | 062,1.3.6.1.4.1.14519.5.2.1.6279.6001.122763913896761494371822656720 64 | 063,1.3.6.1.4.1.14519.5.2.1.6279.6001.122914038048856168343065566972 65 | 064,1.3.6.1.4.1.14519.5.2.1.6279.6001.123654356399290048011621921476 66 | 065,1.3.6.1.4.1.14519.5.2.1.6279.6001.123697637451437522065941162930 67 | 066,1.3.6.1.4.1.14519.5.2.1.6279.6001.124154461048929153767743874565 68 | 067,1.3.6.1.4.1.14519.5.2.1.6279.6001.124656777236468248920498636247 69 | 068,1.3.6.1.4.1.14519.5.2.1.6279.6001.124663713663969377020085460568 70 | 069,1.3.6.1.4.1.14519.5.2.1.6279.6001.124822907934319930841506266464 71 | 070,1.3.6.1.4.1.14519.5.2.1.6279.6001.125067060506283419853742462394 72 | 071,1.3.6.1.4.1.14519.5.2.1.6279.6001.125124219978170516876304987559 73 | 072,1.3.6.1.4.1.14519.5.2.1.6279.6001.125356649712550043958727288500 74 | 073,1.3.6.1.4.1.14519.5.2.1.6279.6001.126121460017257137098781143514 75 | 074,1.3.6.1.4.1.14519.5.2.1.6279.6001.126264578931778258890371755354 76 | 075,1.3.6.1.4.1.14519.5.2.1.6279.6001.126631670596873065041988320084 77 | 076,1.3.6.1.4.1.14519.5.2.1.6279.6001.126704785377921920210612476953 78 | 
077,1.3.6.1.4.1.14519.5.2.1.6279.6001.127965161564033605177803085629 79 | 078,1.3.6.1.4.1.14519.5.2.1.6279.6001.128023902651233986592378348912 80 | 079,1.3.6.1.4.1.14519.5.2.1.6279.6001.128059192202504367870633619224 81 | 080,1.3.6.1.4.1.14519.5.2.1.6279.6001.128881800399702510818644205032 82 | 081,1.3.6.1.4.1.14519.5.2.1.6279.6001.129007566048223160327836686225 83 | 082,1.3.6.1.4.1.14519.5.2.1.6279.6001.129055977637338639741695800950 84 | 083,1.3.6.1.4.1.14519.5.2.1.6279.6001.129567032250534530765928856531 85 | 084,1.3.6.1.4.1.14519.5.2.1.6279.6001.129650136453746261130135157590 86 | 085,1.3.6.1.4.1.14519.5.2.1.6279.6001.129982010889624423230394257528 87 | 086,1.3.6.1.4.1.14519.5.2.1.6279.6001.130036599816889919308975074972 88 | 087,1.3.6.1.4.1.14519.5.2.1.6279.6001.130438550890816550994739120843 89 | 088,1.3.6.1.4.1.14519.5.2.1.6279.6001.130765375502800983459674173881 90 | 089,1.3.6.1.4.1.14519.5.2.1.6279.6001.131150737314367975651717513386 91 | 090,1.3.6.1.4.1.14519.5.2.1.6279.6001.131939324905446238286154504249 92 | 091,1.3.6.1.4.1.14519.5.2.1.6279.6001.132817748896065918417924920957 93 | 092,1.3.6.1.4.1.14519.5.2.1.6279.6001.133132722052053001903031735878 94 | 093,1.3.6.1.4.1.14519.5.2.1.6279.6001.133378195429627807109985347209 95 | 094,1.3.6.1.4.1.14519.5.2.1.6279.6001.134370886216012873213579659366 96 | 095,1.3.6.1.4.1.14519.5.2.1.6279.6001.134519406153127654901640638633 97 | 096,1.3.6.1.4.1.14519.5.2.1.6279.6001.134638281277099121660656324702 98 | 097,1.3.6.1.4.1.14519.5.2.1.6279.6001.134996872583497382954024478441 99 | 098,1.3.6.1.4.1.14519.5.2.1.6279.6001.135657246677982059395844827629 100 | 099,1.3.6.1.4.1.14519.5.2.1.6279.6001.136830368929967292376608088362 101 | 100,1.3.6.1.4.1.14519.5.2.1.6279.6001.137375498893536422914241295628 102 | 101,1.3.6.1.4.1.14519.5.2.1.6279.6001.137763212752154081977261297097 103 | 102,1.3.6.1.4.1.14519.5.2.1.6279.6001.137773550852881583165286615668 104 | 103,1.3.6.1.4.1.14519.5.2.1.6279.6001.138080888843357047811238713686 105 | 104,1.3.6.1.4.1.14519.5.2.1.6279.6001.138674679709964033277400089532 106 | 105,1.3.6.1.4.1.14519.5.2.1.6279.6001.138813197521718693188313387015 107 | 106,1.3.6.1.4.1.14519.5.2.1.6279.6001.138894439026794145866157853158 108 | 107,1.3.6.1.4.1.14519.5.2.1.6279.6001.138904664700896606480369521124 109 | 108,1.3.6.1.4.1.14519.5.2.1.6279.6001.139258777898746693365877042411 110 | 109,1.3.6.1.4.1.14519.5.2.1.6279.6001.139444426690868429919252698606 111 | 110,1.3.6.1.4.1.14519.5.2.1.6279.6001.139577698050713461261415990027 112 | 111,1.3.6.1.4.1.14519.5.2.1.6279.6001.139595277234735528205899724196 113 | 112,1.3.6.1.4.1.14519.5.2.1.6279.6001.139713436241461669335487719526 114 | 113,1.3.6.1.4.1.14519.5.2.1.6279.6001.139889514693390832525232698200 115 | 114,1.3.6.1.4.1.14519.5.2.1.6279.6001.140239815496047437552471323962 116 | 115,1.3.6.1.4.1.14519.5.2.1.6279.6001.140253591510022414496468423138 117 | 116,1.3.6.1.4.1.14519.5.2.1.6279.6001.140527383975300992150799777603 118 | 117,1.3.6.1.4.1.14519.5.2.1.6279.6001.141069661700670042960678408762 119 | 118,1.3.6.1.4.1.14519.5.2.1.6279.6001.141149610914910880857802344415 120 | 119,1.3.6.1.4.1.14519.5.2.1.6279.6001.141345499716190654505508410197 121 | 120,1.3.6.1.4.1.14519.5.2.1.6279.6001.141430002307216644912805017227 122 | 121,1.3.6.1.4.1.14519.5.2.1.6279.6001.141511313712034597336182402384 123 | 122,1.3.6.1.4.1.14519.5.2.1.6279.6001.142154819868944114554521645782 124 | 123,1.3.6.1.4.1.14519.5.2.1.6279.6001.142485715518010940961688015191 125 | 
124,1.3.6.1.4.1.14519.5.2.1.6279.6001.143410010885830403003179808334 126 | 125,1.3.6.1.4.1.14519.5.2.1.6279.6001.143412474064515942785157561636 127 | 126,1.3.6.1.4.1.14519.5.2.1.6279.6001.143622857676008763729469324839 128 | 127,1.3.6.1.4.1.14519.5.2.1.6279.6001.143782059748737055784173697516 129 | 128,1.3.6.1.4.1.14519.5.2.1.6279.6001.143813757344903170810482790787 130 | 129,1.3.6.1.4.1.14519.5.2.1.6279.6001.144438612068946916340281098509 131 | 130,1.3.6.1.4.1.14519.5.2.1.6279.6001.144883090372691745980459537053 132 | 131,1.3.6.1.4.1.14519.5.2.1.6279.6001.144943344795414353192059796098 133 | 132,1.3.6.1.4.1.14519.5.2.1.6279.6001.145283812746259413053188838096 134 | 133,1.3.6.1.4.1.14519.5.2.1.6279.6001.145474881373882284343459153872 135 | 134,1.3.6.1.4.1.14519.5.2.1.6279.6001.145510611155363050427743946446 136 | 135,1.3.6.1.4.1.14519.5.2.1.6279.6001.145759169833745025756371695397 137 | 136,1.3.6.1.4.1.14519.5.2.1.6279.6001.146429221666426688999739595820 138 | 137,1.3.6.1.4.1.14519.5.2.1.6279.6001.146603910507557786636779705509 139 | 138,1.3.6.1.4.1.14519.5.2.1.6279.6001.146987333806092287055399155268 140 | 139,1.3.6.1.4.1.14519.5.2.1.6279.6001.147250707071097813243473865421 141 | 140,1.3.6.1.4.1.14519.5.2.1.6279.6001.147325126373007278009743173696 142 | 141,1.3.6.1.4.1.14519.5.2.1.6279.6001.148229375703208214308676934766 143 | 142,1.3.6.1.4.1.14519.5.2.1.6279.6001.148447286464082095534651426689 144 | 143,1.3.6.1.4.1.14519.5.2.1.6279.6001.148935306123327835217659769212 145 | 144,1.3.6.1.4.1.14519.5.2.1.6279.6001.149041668385192796520281592139 146 | 145,1.3.6.1.4.1.14519.5.2.1.6279.6001.149463915556499304732434215056 147 | 146,1.3.6.1.4.1.14519.5.2.1.6279.6001.149893110752986700464921264055 148 | 147,1.3.6.1.4.1.14519.5.2.1.6279.6001.150097650621090951325113116280 149 | 148,1.3.6.1.4.1.14519.5.2.1.6279.6001.150684298696437181894923266019 150 | 149,1.3.6.1.4.1.14519.5.2.1.6279.6001.151669338315069779994664893123 151 | 150,1.3.6.1.4.1.14519.5.2.1.6279.6001.151764021165118974848436095034 152 | 151,1.3.6.1.4.1.14519.5.2.1.6279.6001.152684536713461901635595118048 153 | 152,1.3.6.1.4.1.14519.5.2.1.6279.6001.152706273988004688708784163325 154 | 153,1.3.6.1.4.1.14519.5.2.1.6279.6001.153181766344026020914478182395 155 | 154,1.3.6.1.4.1.14519.5.2.1.6279.6001.153536305742006952753134773630 156 | 155,1.3.6.1.4.1.14519.5.2.1.6279.6001.153646219551578201092527860224 157 | 156,1.3.6.1.4.1.14519.5.2.1.6279.6001.153732973534937692357111055819 158 | 157,1.3.6.1.4.1.14519.5.2.1.6279.6001.153985109349433321657655488650 159 | 158,1.3.6.1.4.1.14519.5.2.1.6279.6001.154677396354641150280013275227 160 | 159,1.3.6.1.4.1.14519.5.2.1.6279.6001.154703816225841204080664115280 161 | 160,1.3.6.1.4.1.14519.5.2.1.6279.6001.154837327827713479309898027966 162 | 161,1.3.6.1.4.1.14519.5.2.1.6279.6001.156016499715048493339281864474 163 | 162,1.3.6.1.4.1.14519.5.2.1.6279.6001.156322145453198768801776721493 164 | 163,1.3.6.1.4.1.14519.5.2.1.6279.6001.156579001330474859527530187095 165 | 164,1.3.6.1.4.1.14519.5.2.1.6279.6001.156821379677057223126714881626 166 | 165,1.3.6.1.4.1.14519.5.2.1.6279.6001.159521777966998275980367008904 167 | 166,1.3.6.1.4.1.14519.5.2.1.6279.6001.159665703190517688573100822213 168 | 167,1.3.6.1.4.1.14519.5.2.1.6279.6001.159996104466052855396410079250 169 | 168,1.3.6.1.4.1.14519.5.2.1.6279.6001.160124400349792614505500125883 170 | 169,1.3.6.1.4.1.14519.5.2.1.6279.6001.160216916075817913953530562493 171 | 170,1.3.6.1.4.1.14519.5.2.1.6279.6001.160586340600816116143631200450 172 | 
171,1.3.6.1.4.1.14519.5.2.1.6279.6001.161002239822118346732951898613 173 | 172,1.3.6.1.4.1.14519.5.2.1.6279.6001.161067514225109999586362698069 174 | 173,1.3.6.1.4.1.14519.5.2.1.6279.6001.161073793312426102774780216551 175 | 174,1.3.6.1.4.1.14519.5.2.1.6279.6001.161633200801003804714818844696 176 | 175,1.3.6.1.4.1.14519.5.2.1.6279.6001.161821150841552408667852639317 177 | 176,1.3.6.1.4.1.14519.5.2.1.6279.6001.161855583909753609742728521805 178 | 177,1.3.6.1.4.1.14519.5.2.1.6279.6001.162207236104936931957809623059 179 | 178,1.3.6.1.4.1.14519.5.2.1.6279.6001.162351539386551708034407968929 180 | 179,1.3.6.1.4.1.14519.5.2.1.6279.6001.162718361851587451505896742103 181 | 180,1.3.6.1.4.1.14519.5.2.1.6279.6001.162845309248822193437735868939 182 | 181,1.3.6.1.4.1.14519.5.2.1.6279.6001.162901839201654862079549658100 183 | 182,1.3.6.1.4.1.14519.5.2.1.6279.6001.163217526257871051722166468085 184 | 183,1.3.6.1.4.1.14519.5.2.1.6279.6001.163901773171373940247829492387 185 | 184,1.3.6.1.4.1.14519.5.2.1.6279.6001.163931625580639955914619627409 186 | 185,1.3.6.1.4.1.14519.5.2.1.6279.6001.163994693532965040247348251579 187 | 186,1.3.6.1.4.1.14519.5.2.1.6279.6001.164790817284381538042494285101 188 | 187,1.3.6.1.4.1.14519.5.2.1.6279.6001.164988920331211858091402361989 189 | 188,1.3.6.1.4.1.14519.5.2.1.6279.6001.167237290696350215427953159586 190 | 189,1.3.6.1.4.1.14519.5.2.1.6279.6001.167500254299688235071950909530 191 | 190,1.3.6.1.4.1.14519.5.2.1.6279.6001.167661207884826429102690781600 192 | 191,1.3.6.1.4.1.14519.5.2.1.6279.6001.167919147233131417984739058859 193 | 192,1.3.6.1.4.1.14519.5.2.1.6279.6001.168037818448885856452592057286 194 | 193,1.3.6.1.4.1.14519.5.2.1.6279.6001.168605638657404145360275453085 195 | 194,1.3.6.1.4.1.14519.5.2.1.6279.6001.168737928729363683423228050295 196 | 195,1.3.6.1.4.1.14519.5.2.1.6279.6001.168833925301530155818375859047 197 | 196,1.3.6.1.4.1.14519.5.2.1.6279.6001.168985655485163461062675655739 198 | 197,1.3.6.1.4.1.14519.5.2.1.6279.6001.169128136262002764211589185953 199 | 198,1.3.6.1.4.1.14519.5.2.1.6279.6001.170052181746004939527661217512 200 | 199,1.3.6.1.4.1.14519.5.2.1.6279.6001.170706757615202213033480003264 201 | 200,1.3.6.1.4.1.14519.5.2.1.6279.6001.170825539570536865106681134236 202 | 201,1.3.6.1.4.1.14519.5.2.1.6279.6001.170921541362033046216100409521 203 | 202,1.3.6.1.4.1.14519.5.2.1.6279.6001.171177995014336749670107905732 204 | 203,1.3.6.1.4.1.14519.5.2.1.6279.6001.171667800241622018839592854574 205 | 204,1.3.6.1.4.1.14519.5.2.1.6279.6001.171682845383273105440297561095 206 | 205,1.3.6.1.4.1.14519.5.2.1.6279.6001.171919524048654494439256263785 207 | 206,1.3.6.1.4.1.14519.5.2.1.6279.6001.172243743899615313644757844726 208 | 207,1.3.6.1.4.1.14519.5.2.1.6279.6001.172573195301625265149778785969 209 | 208,1.3.6.1.4.1.14519.5.2.1.6279.6001.172845185165807139298420209778 210 | 209,1.3.6.1.4.1.14519.5.2.1.6279.6001.173101104804533997398137418032 211 | 210,1.3.6.1.4.1.14519.5.2.1.6279.6001.173106154739244262091404659845 212 | 211,1.3.6.1.4.1.14519.5.2.1.6279.6001.173556680294801532247454313511 213 | 212,1.3.6.1.4.1.14519.5.2.1.6279.6001.173931884906244951746140865701 214 | 213,1.3.6.1.4.1.14519.5.2.1.6279.6001.174168737938619557573021395302 215 | 214,1.3.6.1.4.1.14519.5.2.1.6279.6001.174449669706458092793093760291 216 | 215,1.3.6.1.4.1.14519.5.2.1.6279.6001.174692377730646477496286081479 217 | 216,1.3.6.1.4.1.14519.5.2.1.6279.6001.174907798609768549012640380786 218 | 217,1.3.6.1.4.1.14519.5.2.1.6279.6001.174935793360491516757154875981 219 | 
218,1.3.6.1.4.1.14519.5.2.1.6279.6001.175318131822744218104175746898 220 | 219,1.3.6.1.4.1.14519.5.2.1.6279.6001.176030616406569931557298712518 221 | 220,1.3.6.1.4.1.14519.5.2.1.6279.6001.176362912420491262783064585333 222 | 221,1.3.6.1.4.1.14519.5.2.1.6279.6001.176638348958425792989125209419 223 | 222,1.3.6.1.4.1.14519.5.2.1.6279.6001.176869045992276345870480098568 224 | 223,1.3.6.1.4.1.14519.5.2.1.6279.6001.177086402277715068525592995222 225 | 224,1.3.6.1.4.1.14519.5.2.1.6279.6001.177252583002664900748714851615 226 | 225,1.3.6.1.4.1.14519.5.2.1.6279.6001.177685820605315926524514718990 227 | 226,1.3.6.1.4.1.14519.5.2.1.6279.6001.177785764461425908755977367558 228 | 227,1.3.6.1.4.1.14519.5.2.1.6279.6001.177888806135892723698313903329 229 | 228,1.3.6.1.4.1.14519.5.2.1.6279.6001.177985905159808659201278495182 230 | 229,1.3.6.1.4.1.14519.5.2.1.6279.6001.178391668569567816549737454720 231 | 230,1.3.6.1.4.1.14519.5.2.1.6279.6001.178680586845223339579041794709 232 | 231,1.3.6.1.4.1.14519.5.2.1.6279.6001.179049373636438705059720603192 233 | 232,1.3.6.1.4.1.14519.5.2.1.6279.6001.179162671133894061547290922949 234 | 233,1.3.6.1.4.1.14519.5.2.1.6279.6001.179209990684978588019929720099 235 | 234,1.3.6.1.4.1.14519.5.2.1.6279.6001.179683407589764683292800449011 236 | 235,1.3.6.1.4.1.14519.5.2.1.6279.6001.179730018513720561213088132029 237 | 236,1.3.6.1.4.1.14519.5.2.1.6279.6001.179943248049071805421192715219 238 | 237,1.3.6.1.4.1.14519.5.2.1.6279.6001.182192086929819295877506541021 239 | 238,1.3.6.1.4.1.14519.5.2.1.6279.6001.182798854785392200340436516930 240 | 239,1.3.6.1.4.1.14519.5.2.1.6279.6001.183056151780567460322586876100 241 | 240,1.3.6.1.4.1.14519.5.2.1.6279.6001.183184435049555024219115904825 242 | 241,1.3.6.1.4.1.14519.5.2.1.6279.6001.183843376225716802567192412456 243 | 242,1.3.6.1.4.1.14519.5.2.1.6279.6001.183924380327950237519832859527 244 | 243,1.3.6.1.4.1.14519.5.2.1.6279.6001.183982839679953938397312236359 245 | 244,1.3.6.1.4.1.14519.5.2.1.6279.6001.184019785706727365023450012318 246 | 245,1.3.6.1.4.1.14519.5.2.1.6279.6001.184412674007117333405073397832 247 | 246,1.3.6.1.4.1.14519.5.2.1.6279.6001.185154482385982570363528682299 248 | 247,1.3.6.1.4.1.14519.5.2.1.6279.6001.185226274332527104841463955058 249 | 248,1.3.6.1.4.1.14519.5.2.1.6279.6001.186021279664749879526003668137 250 | 249,1.3.6.1.4.1.14519.5.2.1.6279.6001.187108608022306504546286626125 251 | 250,1.3.6.1.4.1.14519.5.2.1.6279.6001.187451715205085403623595258748 252 | 251,1.3.6.1.4.1.14519.5.2.1.6279.6001.187694838527128312070807533473 253 | 252,1.3.6.1.4.1.14519.5.2.1.6279.6001.187803155574314810830688534991 254 | 253,1.3.6.1.4.1.14519.5.2.1.6279.6001.187966156856911682643615997798 255 | 254,1.3.6.1.4.1.14519.5.2.1.6279.6001.188059920088313909273628445208 256 | 255,1.3.6.1.4.1.14519.5.2.1.6279.6001.188209889686363159853715266493 257 | 256,1.3.6.1.4.1.14519.5.2.1.6279.6001.188265424231150847356515802868 258 | 257,1.3.6.1.4.1.14519.5.2.1.6279.6001.188376349804761988217597754952 259 | 258,1.3.6.1.4.1.14519.5.2.1.6279.6001.188385286346390202873004762827 260 | 259,1.3.6.1.4.1.14519.5.2.1.6279.6001.188484197846284733942365679565 261 | 260,1.3.6.1.4.1.14519.5.2.1.6279.6001.188619674701053082195613114069 262 | 261,1.3.6.1.4.1.14519.5.2.1.6279.6001.189483585244687808087477024767 263 | 262,1.3.6.1.4.1.14519.5.2.1.6279.6001.190144948425835566841437565646 264 | 263,1.3.6.1.4.1.14519.5.2.1.6279.6001.190298296009658115773239776160 265 | 264,1.3.6.1.4.1.14519.5.2.1.6279.6001.190937805243443708408459490152 266 | 
265,1.3.6.1.4.1.14519.5.2.1.6279.6001.191266041369462391833537519639 267 | 266,1.3.6.1.4.1.14519.5.2.1.6279.6001.191301539558980174217770205256 268 | 267,1.3.6.1.4.1.14519.5.2.1.6279.6001.191617711875409989053242965150 269 | 268,1.3.6.1.4.1.14519.5.2.1.6279.6001.192256506776434538421891524301 270 | 269,1.3.6.1.4.1.14519.5.2.1.6279.6001.192419869605596446455526220766 271 | 270,1.3.6.1.4.1.14519.5.2.1.6279.6001.193408384740507320589857096592 272 | 271,1.3.6.1.4.1.14519.5.2.1.6279.6001.193721075067404532739943086458 273 | 272,1.3.6.1.4.1.14519.5.2.1.6279.6001.193808128386712859512130599234 274 | 273,1.3.6.1.4.1.14519.5.2.1.6279.6001.193964947698259739624715468431 275 | 274,1.3.6.1.4.1.14519.5.2.1.6279.6001.194246472548954252250399902051 276 | 275,1.3.6.1.4.1.14519.5.2.1.6279.6001.194440094986948071643661798326 277 | 276,1.3.6.1.4.1.14519.5.2.1.6279.6001.194465340552956447447896167830 278 | 277,1.3.6.1.4.1.14519.5.2.1.6279.6001.194488534645348916700259325236 279 | 278,1.3.6.1.4.1.14519.5.2.1.6279.6001.194632613233275988184244485809 280 | 279,1.3.6.1.4.1.14519.5.2.1.6279.6001.194766721609772924944646251928 281 | 280,1.3.6.1.4.1.14519.5.2.1.6279.6001.195557219224169985110295082004 282 | 281,1.3.6.1.4.1.14519.5.2.1.6279.6001.195913706607582347421429908613 283 | 282,1.3.6.1.4.1.14519.5.2.1.6279.6001.196251645377731223510086726530 284 | 283,1.3.6.1.4.1.14519.5.2.1.6279.6001.197063290812663596858124411210 285 | 284,1.3.6.1.4.1.14519.5.2.1.6279.6001.197987940182806628828566429132 286 | 285,1.3.6.1.4.1.14519.5.2.1.6279.6001.198016798894102791158686961192 287 | 286,1.3.6.1.4.1.14519.5.2.1.6279.6001.198698492013538481395497694975 288 | 287,1.3.6.1.4.1.14519.5.2.1.6279.6001.199069398344356765037879821616 289 | 288,1.3.6.1.4.1.14519.5.2.1.6279.6001.199171741859530285887752432478 290 | 289,1.3.6.1.4.1.14519.5.2.1.6279.6001.199220738144407033276946096708 291 | 290,1.3.6.1.4.1.14519.5.2.1.6279.6001.199261544234308780356714831537 292 | 291,1.3.6.1.4.1.14519.5.2.1.6279.6001.199282854229880908602362094937 293 | 292,1.3.6.1.4.1.14519.5.2.1.6279.6001.199670099218798685977406484591 294 | 293,1.3.6.1.4.1.14519.5.2.1.6279.6001.199975006921901879512837687266 295 | 294,1.3.6.1.4.1.14519.5.2.1.6279.6001.200513183558872708878454294671 296 | 295,1.3.6.1.4.1.14519.5.2.1.6279.6001.200558451375970945040979397866 297 | 296,1.3.6.1.4.1.14519.5.2.1.6279.6001.200725988589959521302320481687 298 | 297,1.3.6.1.4.1.14519.5.2.1.6279.6001.200837896655745926888305239398 299 | 298,1.3.6.1.4.1.14519.5.2.1.6279.6001.200841000324240313648595016964 300 | 299,1.3.6.1.4.1.14519.5.2.1.6279.6001.201890795870532056891161597218 301 | 300,1.3.6.1.4.1.14519.5.2.1.6279.6001.202187810895588720702176009630 302 | 301,1.3.6.1.4.1.14519.5.2.1.6279.6001.202283133206014258077705539227 303 | 302,1.3.6.1.4.1.14519.5.2.1.6279.6001.202464973819273687476049035824 304 | 303,1.3.6.1.4.1.14519.5.2.1.6279.6001.202476538079060560282495099956 305 | 304,1.3.6.1.4.1.14519.5.2.1.6279.6001.202643836890896697853521610450 306 | 305,1.3.6.1.4.1.14519.5.2.1.6279.6001.202811684116768680758082619196 307 | 306,1.3.6.1.4.1.14519.5.2.1.6279.6001.203179378754043776171267611064 308 | 307,1.3.6.1.4.1.14519.5.2.1.6279.6001.203425588524695836343069893813 309 | 308,1.3.6.1.4.1.14519.5.2.1.6279.6001.203741923654363010377298352671 310 | 309,1.3.6.1.4.1.14519.5.2.1.6279.6001.204287915902811325371247860532 311 | 310,1.3.6.1.4.1.14519.5.2.1.6279.6001.204303454658845815034433453512 312 | 311,1.3.6.1.4.1.14519.5.2.1.6279.6001.204566802718283633558802774757 313 | 
312,1.3.6.1.4.1.14519.5.2.1.6279.6001.204802250386343794613980417281 314 | 313,1.3.6.1.4.1.14519.5.2.1.6279.6001.205523326998654833765855998037 315 | 314,1.3.6.1.4.1.14519.5.2.1.6279.6001.205615524269596458818376243313 316 | 315,1.3.6.1.4.1.14519.5.2.1.6279.6001.205852555362702089950453265567 317 | 316,1.3.6.1.4.1.14519.5.2.1.6279.6001.205993750485568250373835565680 318 | 317,1.3.6.1.4.1.14519.5.2.1.6279.6001.206028343897359374907954580114 319 | 318,1.3.6.1.4.1.14519.5.2.1.6279.6001.206097113343059612247503064658 320 | 319,1.3.6.1.4.1.14519.5.2.1.6279.6001.206539885154775002929031534291 321 | 320,1.3.6.1.4.1.14519.5.2.1.6279.6001.207341668080525761926965850679 322 | 321,1.3.6.1.4.1.14519.5.2.1.6279.6001.208511362832825683639135205368 323 | 322,1.3.6.1.4.1.14519.5.2.1.6279.6001.208737629504245244513001631764 324 | 323,1.3.6.1.4.1.14519.5.2.1.6279.6001.209269973797560820442292189762 325 | 324,1.3.6.1.4.1.14519.5.2.1.6279.6001.210426531621179400035178209430 326 | 325,1.3.6.1.4.1.14519.5.2.1.6279.6001.210837812047373739447725050963 327 | 326,1.3.6.1.4.1.14519.5.2.1.6279.6001.211051626197585058967163339846 328 | 327,1.3.6.1.4.1.14519.5.2.1.6279.6001.211071908915618528829547301883 329 | 328,1.3.6.1.4.1.14519.5.2.1.6279.6001.211956804948320236390242845468 330 | 329,1.3.6.1.4.1.14519.5.2.1.6279.6001.212346425055214308006918165305 331 | 330,1.3.6.1.4.1.14519.5.2.1.6279.6001.212608679077007918190529579976 332 | 331,1.3.6.1.4.1.14519.5.2.1.6279.6001.213022585153512920098588556742 333 | 332,1.3.6.1.4.1.14519.5.2.1.6279.6001.213140617640021803112060161074 334 | 333,1.3.6.1.4.1.14519.5.2.1.6279.6001.213854687290736562463866711534 335 | 334,1.3.6.1.4.1.14519.5.2.1.6279.6001.214252223927572015414741039150 336 | 335,1.3.6.1.4.1.14519.5.2.1.6279.6001.214800939017429618305208626314 337 | 336,1.3.6.1.4.1.14519.5.2.1.6279.6001.215086589927307766627151367533 338 | 337,1.3.6.1.4.1.14519.5.2.1.6279.6001.215104063467523905369326175410 339 | 338,1.3.6.1.4.1.14519.5.2.1.6279.6001.215640837032688688030770057224 340 | 339,1.3.6.1.4.1.14519.5.2.1.6279.6001.215785045378334625097907422785 341 | 340,1.3.6.1.4.1.14519.5.2.1.6279.6001.216252660192313507027754194207 342 | 341,1.3.6.1.4.1.14519.5.2.1.6279.6001.216526102138308489357443843021 343 | 342,1.3.6.1.4.1.14519.5.2.1.6279.6001.216652640878960522552873394709 344 | 343,1.3.6.1.4.1.14519.5.2.1.6279.6001.216882370221919561230873289517 345 | 344,1.3.6.1.4.1.14519.5.2.1.6279.6001.217589936421986638139451480826 346 | 345,1.3.6.1.4.1.14519.5.2.1.6279.6001.217697417596902141600884006982 347 | 346,1.3.6.1.4.1.14519.5.2.1.6279.6001.217754016294471278921686508169 348 | 347,1.3.6.1.4.1.14519.5.2.1.6279.6001.217955041973656886482758642958 349 | 348,1.3.6.1.4.1.14519.5.2.1.6279.6001.218476624578721885561483687176 350 | 349,1.3.6.1.4.1.14519.5.2.1.6279.6001.219087313261026510628926082729 351 | 350,1.3.6.1.4.1.14519.5.2.1.6279.6001.219254430927834326484477690403 352 | 351,1.3.6.1.4.1.14519.5.2.1.6279.6001.219281726101239572270900838145 353 | 352,1.3.6.1.4.1.14519.5.2.1.6279.6001.219349715895470349269596532320 354 | 353,1.3.6.1.4.1.14519.5.2.1.6279.6001.219428004988664846407984058588 355 | 354,1.3.6.1.4.1.14519.5.2.1.6279.6001.219618492426142913407827034169 356 | 355,1.3.6.1.4.1.14519.5.2.1.6279.6001.219909753224298157409438012179 357 | 356,1.3.6.1.4.1.14519.5.2.1.6279.6001.220205300714852483483213840572 358 | 357,1.3.6.1.4.1.14519.5.2.1.6279.6001.220596530836092324070084384692 359 | 358,1.3.6.1.4.1.14519.5.2.1.6279.6001.221017801605543296514746423389 360 | 
359,1.3.6.1.4.1.14519.5.2.1.6279.6001.221945191226273284587353530424 361 | 360,1.3.6.1.4.1.14519.5.2.1.6279.6001.222052723822248889877676736332 362 | 361,1.3.6.1.4.1.14519.5.2.1.6279.6001.222087811960706096424718056430 363 | 362,1.3.6.1.4.1.14519.5.2.1.6279.6001.223098610241551815995595311693 364 | 363,1.3.6.1.4.1.14519.5.2.1.6279.6001.223650122819238796121876338881 365 | 364,1.3.6.1.4.1.14519.5.2.1.6279.6001.224465398054769500989828256685 366 | 365,1.3.6.1.4.1.14519.5.2.1.6279.6001.225154811831720426832024114593 367 | 366,1.3.6.1.4.1.14519.5.2.1.6279.6001.225227615446398900698431118292 368 | 367,1.3.6.1.4.1.14519.5.2.1.6279.6001.225515255547637437801620523312 369 | 368,1.3.6.1.4.1.14519.5.2.1.6279.6001.226152078193253087875725735761 370 | 369,1.3.6.1.4.1.14519.5.2.1.6279.6001.226383054119800793308721198594 371 | 370,1.3.6.1.4.1.14519.5.2.1.6279.6001.226456162308124493341905600418 372 | 371,1.3.6.1.4.1.14519.5.2.1.6279.6001.226564372605239604660221582288 373 | 372,1.3.6.1.4.1.14519.5.2.1.6279.6001.226889213794065160713547677129 374 | 373,1.3.6.1.4.1.14519.5.2.1.6279.6001.227707494413800460340110762069 375 | 374,1.3.6.1.4.1.14519.5.2.1.6279.6001.227796349777753378641347819780 376 | 375,1.3.6.1.4.1.14519.5.2.1.6279.6001.227885601428639043345478571594 377 | 376,1.3.6.1.4.1.14519.5.2.1.6279.6001.227962600322799211676960828223 378 | 377,1.3.6.1.4.1.14519.5.2.1.6279.6001.227968442353440630355230778531 379 | 378,1.3.6.1.4.1.14519.5.2.1.6279.6001.228511122591230092662900221600 380 | 379,1.3.6.1.4.1.14519.5.2.1.6279.6001.228934821089041845791238006047 381 | 380,1.3.6.1.4.1.14519.5.2.1.6279.6001.229096941293122177107846044795 382 | 381,1.3.6.1.4.1.14519.5.2.1.6279.6001.229171189693734694696158152904 383 | 382,1.3.6.1.4.1.14519.5.2.1.6279.6001.229664630348267553620068691756 384 | 383,1.3.6.1.4.1.14519.5.2.1.6279.6001.229860476925100292554329427970 385 | 384,1.3.6.1.4.1.14519.5.2.1.6279.6001.229960820686439513664996214638 386 | 385,1.3.6.1.4.1.14519.5.2.1.6279.6001.230078008964732806419498631442 387 | 386,1.3.6.1.4.1.14519.5.2.1.6279.6001.230416590143922549745658357505 388 | 387,1.3.6.1.4.1.14519.5.2.1.6279.6001.230491296081537726468075344411 389 | 388,1.3.6.1.4.1.14519.5.2.1.6279.6001.230675342744370103160629638194 390 | 389,1.3.6.1.4.1.14519.5.2.1.6279.6001.231002159523969307155990628066 391 | 390,1.3.6.1.4.1.14519.5.2.1.6279.6001.231645134739451754302647733304 392 | 391,1.3.6.1.4.1.14519.5.2.1.6279.6001.231834776365874788440767645596 393 | 392,1.3.6.1.4.1.14519.5.2.1.6279.6001.232011770495640253949434620907 394 | 393,1.3.6.1.4.1.14519.5.2.1.6279.6001.232058316950007760548968840196 395 | 394,1.3.6.1.4.1.14519.5.2.1.6279.6001.232071262560365924176679652948 396 | 395,1.3.6.1.4.1.14519.5.2.1.6279.6001.233001470265230594739708503198 397 | 396,1.3.6.1.4.1.14519.5.2.1.6279.6001.233433352108264931671753343044 398 | 397,1.3.6.1.4.1.14519.5.2.1.6279.6001.233652865358649579816568545171 399 | 398,1.3.6.1.4.1.14519.5.2.1.6279.6001.234400932423244218697302970157 400 | 399,1.3.6.1.4.1.14519.5.2.1.6279.6001.235217371152464582553341729176 401 | 400,1.3.6.1.4.1.14519.5.2.1.6279.6001.235364978775280910367690540811 402 | 401,1.3.6.1.4.1.14519.5.2.1.6279.6001.236698827306171960683086245994 403 | 402,1.3.6.1.4.1.14519.5.2.1.6279.6001.237215747217294006286437405216 404 | 403,1.3.6.1.4.1.14519.5.2.1.6279.6001.237428977311365557972720635401 405 | 404,1.3.6.1.4.1.14519.5.2.1.6279.6001.237915456403882324748189195892 406 | 405,1.3.6.1.4.1.14519.5.2.1.6279.6001.238019241099704094018548301753 407 | 
406,1.3.6.1.4.1.14519.5.2.1.6279.6001.238042459915048190592571019348 408 | 407,1.3.6.1.4.1.14519.5.2.1.6279.6001.238522526736091851696274044574 409 | 408,1.3.6.1.4.1.14519.5.2.1.6279.6001.238855414831158993232534884296 410 | 409,1.3.6.1.4.1.14519.5.2.1.6279.6001.239358021703233250639913775427 411 | 410,1.3.6.1.4.1.14519.5.2.1.6279.6001.240630002689062442926543993263 412 | 411,1.3.6.1.4.1.14519.5.2.1.6279.6001.240969450540588211676803094518 413 | 412,1.3.6.1.4.1.14519.5.2.1.6279.6001.241083615484551649610616348856 414 | 413,1.3.6.1.4.1.14519.5.2.1.6279.6001.241570579760883349458693655367 415 | 414,1.3.6.1.4.1.14519.5.2.1.6279.6001.241717018262666382493757419144 416 | 415,1.3.6.1.4.1.14519.5.2.1.6279.6001.242624386080831911167122628616 417 | 416,1.3.6.1.4.1.14519.5.2.1.6279.6001.242761658169703141430370511586 418 | 417,1.3.6.1.4.1.14519.5.2.1.6279.6001.243094273518213382155770295147 419 | 418,1.3.6.1.4.1.14519.5.2.1.6279.6001.244204120220889433826451158706 420 | 419,1.3.6.1.4.1.14519.5.2.1.6279.6001.244442540088515471945035689377 421 | 420,1.3.6.1.4.1.14519.5.2.1.6279.6001.244447966386688625240438849169 422 | 421,1.3.6.1.4.1.14519.5.2.1.6279.6001.244590453955380448651329424024 423 | 422,1.3.6.1.4.1.14519.5.2.1.6279.6001.244681063194071446501270815660 424 | 423,1.3.6.1.4.1.14519.5.2.1.6279.6001.245248446973732759194067808002 425 | 424,1.3.6.1.4.1.14519.5.2.1.6279.6001.245349763807614756148761326488 426 | 425,1.3.6.1.4.1.14519.5.2.1.6279.6001.245391706475696258069508046497 427 | 426,1.3.6.1.4.1.14519.5.2.1.6279.6001.245546033414728092794968890929 428 | 427,1.3.6.1.4.1.14519.5.2.1.6279.6001.246178337114401749164850220976 429 | 428,1.3.6.1.4.1.14519.5.2.1.6279.6001.246225645401227472829175288633 430 | 429,1.3.6.1.4.1.14519.5.2.1.6279.6001.246589849815292078281051154201 431 | 430,1.3.6.1.4.1.14519.5.2.1.6279.6001.246758220302211646532176593724 432 | 431,1.3.6.1.4.1.14519.5.2.1.6279.6001.247060297988514823071467295949 433 | 432,1.3.6.1.4.1.14519.5.2.1.6279.6001.247769845138587733933485039556 434 | 433,1.3.6.1.4.1.14519.5.2.1.6279.6001.247816269490470394602288565775 435 | 434,1.3.6.1.4.1.14519.5.2.1.6279.6001.248357157975955379661896491341 436 | 435,1.3.6.1.4.1.14519.5.2.1.6279.6001.248360766706804179966476685510 437 | 436,1.3.6.1.4.1.14519.5.2.1.6279.6001.248425363469507808613979846863 438 | 437,1.3.6.1.4.1.14519.5.2.1.6279.6001.249032660919473722154870746474 439 | 438,1.3.6.1.4.1.14519.5.2.1.6279.6001.249314567767437206995861966896 440 | 439,1.3.6.1.4.1.14519.5.2.1.6279.6001.249404938669582150398726875826 441 | 440,1.3.6.1.4.1.14519.5.2.1.6279.6001.249450003033735700817635168066 442 | 441,1.3.6.1.4.1.14519.5.2.1.6279.6001.249530219848512542668813996730 443 | 442,1.3.6.1.4.1.14519.5.2.1.6279.6001.250397690690072950000431855143 444 | 443,1.3.6.1.4.1.14519.5.2.1.6279.6001.250438451287314206124484591986 445 | 444,1.3.6.1.4.1.14519.5.2.1.6279.6001.250481236093201801255751845296 446 | 445,1.3.6.1.4.1.14519.5.2.1.6279.6001.250863365157630276148828903732 447 | 446,1.3.6.1.4.1.14519.5.2.1.6279.6001.251215764736737018371915284679 448 | 447,1.3.6.1.4.1.14519.5.2.1.6279.6001.252358625003143649770119512644 449 | 448,1.3.6.1.4.1.14519.5.2.1.6279.6001.252634638822000832774167856951 450 | 449,1.3.6.1.4.1.14519.5.2.1.6279.6001.252697338970999211181671881792 451 | 450,1.3.6.1.4.1.14519.5.2.1.6279.6001.252709517998555732486024866345 452 | 451,1.3.6.1.4.1.14519.5.2.1.6279.6001.252814707117018427472206147014 453 | 452,1.3.6.1.4.1.14519.5.2.1.6279.6001.253283426904813468115158375647 454 | 
453,1.3.6.1.4.1.14519.5.2.1.6279.6001.253317247142837717905329340520 455 | 454,1.3.6.1.4.1.14519.5.2.1.6279.6001.253322967203074795232627653819 456 | 455,1.3.6.1.4.1.14519.5.2.1.6279.6001.254138388912084634057282064266 457 | 456,1.3.6.1.4.1.14519.5.2.1.6279.6001.254254303842550572473665729969 458 | 457,1.3.6.1.4.1.14519.5.2.1.6279.6001.254473943359963613733707320244 459 | 458,1.3.6.1.4.1.14519.5.2.1.6279.6001.254929810944557499537650429296 460 | 459,1.3.6.1.4.1.14519.5.2.1.6279.6001.254957696184671649675053562027 461 | 460,1.3.6.1.4.1.14519.5.2.1.6279.6001.255409701134762680010928250229 462 | 461,1.3.6.1.4.1.14519.5.2.1.6279.6001.255999614855292116767517149228 463 | 462,1.3.6.1.4.1.14519.5.2.1.6279.6001.256542095129414948017808425649 464 | 463,1.3.6.1.4.1.14519.5.2.1.6279.6001.257383535269991165447822992959 465 | 464,1.3.6.1.4.1.14519.5.2.1.6279.6001.257515388956260258681136624817 466 | 465,1.3.6.1.4.1.14519.5.2.1.6279.6001.257840703452266097926250569223 467 | 466,1.3.6.1.4.1.14519.5.2.1.6279.6001.258220324170977900491673635112 468 | 467,1.3.6.1.4.1.14519.5.2.1.6279.6001.259018373683540453277752706262 469 | 468,1.3.6.1.4.1.14519.5.2.1.6279.6001.259123825760999546551970425757 470 | 469,1.3.6.1.4.1.14519.5.2.1.6279.6001.259124675432205040899951626253 471 | 470,1.3.6.1.4.1.14519.5.2.1.6279.6001.259227883564429312164962953756 472 | 471,1.3.6.1.4.1.14519.5.2.1.6279.6001.259453428008507791234730686014 473 | 472,1.3.6.1.4.1.14519.5.2.1.6279.6001.259543921154154401875872845498 474 | 473,1.3.6.1.4.1.14519.5.2.1.6279.6001.261678072503577216586082745513 475 | 474,1.3.6.1.4.1.14519.5.2.1.6279.6001.261700367741314729940340271960 476 | 475,1.3.6.1.4.1.14519.5.2.1.6279.6001.262736997975960398949912434623 477 | 476,1.3.6.1.4.1.14519.5.2.1.6279.6001.262873069163227096134627700599 478 | 477,1.3.6.1.4.1.14519.5.2.1.6279.6001.264090899378396711987322794314 479 | 478,1.3.6.1.4.1.14519.5.2.1.6279.6001.264251211689085893915477907261 480 | 479,1.3.6.1.4.1.14519.5.2.1.6279.6001.265133389948279331857097127422 481 | 480,1.3.6.1.4.1.14519.5.2.1.6279.6001.265453131727473342790950829556 482 | 481,1.3.6.1.4.1.14519.5.2.1.6279.6001.265570697208310960298668720669 483 | 482,1.3.6.1.4.1.14519.5.2.1.6279.6001.265775376735520890308424143898 484 | 483,1.3.6.1.4.1.14519.5.2.1.6279.6001.265780642925621389994857727416 485 | 484,1.3.6.1.4.1.14519.5.2.1.6279.6001.265960756233787099041040311282 486 | 485,1.3.6.1.4.1.14519.5.2.1.6279.6001.266009527139315622265711325223 487 | 486,1.3.6.1.4.1.14519.5.2.1.6279.6001.266581250778073944645044950856 488 | 487,1.3.6.1.4.1.14519.5.2.1.6279.6001.267519732763035023633235877753 489 | 488,1.3.6.1.4.1.14519.5.2.1.6279.6001.267957701183569638795986183786 490 | 489,1.3.6.1.4.1.14519.5.2.1.6279.6001.268030488196493755113553009785 491 | 490,1.3.6.1.4.1.14519.5.2.1.6279.6001.268589491017129166376960414534 492 | 491,1.3.6.1.4.1.14519.5.2.1.6279.6001.268838889380981659524993261082 493 | 492,1.3.6.1.4.1.14519.5.2.1.6279.6001.268992195564407418480563388746 494 | 493,1.3.6.1.4.1.14519.5.2.1.6279.6001.269075535958871753309238331179 495 | 494,1.3.6.1.4.1.14519.5.2.1.6279.6001.269689294231892620436462818860 496 | 495,1.3.6.1.4.1.14519.5.2.1.6279.6001.270152671889301412052226973069 497 | 496,1.3.6.1.4.1.14519.5.2.1.6279.6001.270215889102603268207599305185 498 | 497,1.3.6.1.4.1.14519.5.2.1.6279.6001.270390050141765094612147226290 499 | 498,1.3.6.1.4.1.14519.5.2.1.6279.6001.270788655216695628640355888562 500 | 499,1.3.6.1.4.1.14519.5.2.1.6279.6001.270951128717816232360812849541 501 | 
500,1.3.6.1.4.1.14519.5.2.1.6279.6001.271220641987745483198036913951 502 | 501,1.3.6.1.4.1.14519.5.2.1.6279.6001.271307051432838466826189754230 503 | 502,1.3.6.1.4.1.14519.5.2.1.6279.6001.272042302501586336192628818865 504 | 503,1.3.6.1.4.1.14519.5.2.1.6279.6001.272123398257168239653655006815 505 | 504,1.3.6.1.4.1.14519.5.2.1.6279.6001.272190966764020277652079081128 506 | 505,1.3.6.1.4.1.14519.5.2.1.6279.6001.272259794130271010519952623746 507 | 506,1.3.6.1.4.1.14519.5.2.1.6279.6001.272344603176687884771013620823 508 | 507,1.3.6.1.4.1.14519.5.2.1.6279.6001.272348349298439120568330857680 509 | 508,1.3.6.1.4.1.14519.5.2.1.6279.6001.272961322147784625028175033640 510 | 509,1.3.6.1.4.1.14519.5.2.1.6279.6001.273525289046256012743471155680 511 | 510,1.3.6.1.4.1.14519.5.2.1.6279.6001.274052674198758621258447180130 512 | 511,1.3.6.1.4.1.14519.5.2.1.6279.6001.275007193025729362844652516689 513 | 512,1.3.6.1.4.1.14519.5.2.1.6279.6001.275755514659958628040305922764 514 | 513,1.3.6.1.4.1.14519.5.2.1.6279.6001.275766318636944297772360944907 515 | 514,1.3.6.1.4.1.14519.5.2.1.6279.6001.275849601663847251574860892603 516 | 515,1.3.6.1.4.1.14519.5.2.1.6279.6001.275986221854423197884953496664 517 | 516,1.3.6.1.4.1.14519.5.2.1.6279.6001.276351267409869539593937734609 518 | 517,1.3.6.1.4.1.14519.5.2.1.6279.6001.276556509002726404418399209377 519 | 518,1.3.6.1.4.1.14519.5.2.1.6279.6001.276710697414087561012670296643 520 | 519,1.3.6.1.4.1.14519.5.2.1.6279.6001.277445975068759205899107114231 521 | 520,1.3.6.1.4.1.14519.5.2.1.6279.6001.277452631455527999380186898011 522 | 521,1.3.6.1.4.1.14519.5.2.1.6279.6001.277662902666135640561346462196 523 | 522,1.3.6.1.4.1.14519.5.2.1.6279.6001.278010349511857248000260557753 524 | 523,1.3.6.1.4.1.14519.5.2.1.6279.6001.278660284797073139172446973682 525 | 524,1.3.6.1.4.1.14519.5.2.1.6279.6001.279300249795483097365868125932 526 | 525,1.3.6.1.4.1.14519.5.2.1.6279.6001.279953669991076107785464313394 527 | 526,1.3.6.1.4.1.14519.5.2.1.6279.6001.280072876841890439628529365478 528 | 527,1.3.6.1.4.1.14519.5.2.1.6279.6001.280125803152924778388346920341 529 | 528,1.3.6.1.4.1.14519.5.2.1.6279.6001.280972147860943609388015648430 530 | 529,1.3.6.1.4.1.14519.5.2.1.6279.6001.281489753704424911132261151767 531 | 530,1.3.6.1.4.1.14519.5.2.1.6279.6001.281967919138248195763602360723 532 | 531,1.3.6.1.4.1.14519.5.2.1.6279.6001.282512043257574309474415322775 533 | 532,1.3.6.1.4.1.14519.5.2.1.6279.6001.282779922503707013097174625409 534 | 533,1.3.6.1.4.1.14519.5.2.1.6279.6001.283569726884265181140892667131 535 | 534,1.3.6.1.4.1.14519.5.2.1.6279.6001.283733738239331719775105586296 536 | 535,1.3.6.1.4.1.14519.5.2.1.6279.6001.283878926524838648426928238498 537 | 536,1.3.6.1.4.1.14519.5.2.1.6279.6001.285926554490515269336267972830 538 | 537,1.3.6.1.4.1.14519.5.2.1.6279.6001.286061375572911414226912429210 539 | 538,1.3.6.1.4.1.14519.5.2.1.6279.6001.286217539434358186648717203667 540 | 539,1.3.6.1.4.1.14519.5.2.1.6279.6001.286422846896797433168187085942 541 | 540,1.3.6.1.4.1.14519.5.2.1.6279.6001.286627485198831346082954437212 542 | 541,1.3.6.1.4.1.14519.5.2.1.6279.6001.286647622786041008124419915089 543 | 542,1.3.6.1.4.1.14519.5.2.1.6279.6001.287560874054243719452635194040 544 | 543,1.3.6.1.4.1.14519.5.2.1.6279.6001.287966244644280690737019247886 545 | 544,1.3.6.1.4.1.14519.5.2.1.6279.6001.288701997968615460794642979503 546 | 545,1.3.6.1.4.1.14519.5.2.1.6279.6001.290135156874098366424871975734 547 | 546,1.3.6.1.4.1.14519.5.2.1.6279.6001.290410217650314119074833254861 548 | 
547,1.3.6.1.4.1.14519.5.2.1.6279.6001.291156498203266896953765649282 549 | 548,1.3.6.1.4.1.14519.5.2.1.6279.6001.291539125579672469833850180824 550 | 549,1.3.6.1.4.1.14519.5.2.1.6279.6001.292049618819567427252971059233 551 | 550,1.3.6.1.4.1.14519.5.2.1.6279.6001.292057261351416339496913597985 552 | 551,1.3.6.1.4.1.14519.5.2.1.6279.6001.292194861362266467652267941663 553 | 552,1.3.6.1.4.1.14519.5.2.1.6279.6001.292576688635952269497781991202 554 | 553,1.3.6.1.4.1.14519.5.2.1.6279.6001.292994770358625142596171316474 555 | 554,1.3.6.1.4.1.14519.5.2.1.6279.6001.293593766328917170359373773080 556 | 555,1.3.6.1.4.1.14519.5.2.1.6279.6001.293757615532132808762625441831 557 | 556,1.3.6.1.4.1.14519.5.2.1.6279.6001.294120933998772507043263238704 558 | 557,1.3.6.1.4.1.14519.5.2.1.6279.6001.294188507421106424248264912111 559 | 558,1.3.6.1.4.1.14519.5.2.1.6279.6001.295298571102631191572192562523 560 | 559,1.3.6.1.4.1.14519.5.2.1.6279.6001.295420274214095686326263147663 561 | 560,1.3.6.1.4.1.14519.5.2.1.6279.6001.295462530340364058116953738925 562 | 561,1.3.6.1.4.1.14519.5.2.1.6279.6001.296066944953051278419805374238 563 | 562,1.3.6.1.4.1.14519.5.2.1.6279.6001.296738183013079390785739615169 564 | 563,1.3.6.1.4.1.14519.5.2.1.6279.6001.296863826932699509516219450076 565 | 564,1.3.6.1.4.1.14519.5.2.1.6279.6001.297251044869095073091780740645 566 | 565,1.3.6.1.4.1.14519.5.2.1.6279.6001.297433269262659217151107535012 567 | 566,1.3.6.1.4.1.14519.5.2.1.6279.6001.297964221542942838344351735414 568 | 567,1.3.6.1.4.1.14519.5.2.1.6279.6001.297988578825170426663869669862 569 | 568,1.3.6.1.4.1.14519.5.2.1.6279.6001.299476369290630280560355838785 570 | 569,1.3.6.1.4.1.14519.5.2.1.6279.6001.299767339686526858593516834230 571 | 570,1.3.6.1.4.1.14519.5.2.1.6279.6001.299806338046301317870803017534 572 | 571,1.3.6.1.4.1.14519.5.2.1.6279.6001.300136985030081433029390459071 573 | 572,1.3.6.1.4.1.14519.5.2.1.6279.6001.300146276266881736689307479986 574 | 573,1.3.6.1.4.1.14519.5.2.1.6279.6001.300246184547502297539521283806 575 | 574,1.3.6.1.4.1.14519.5.2.1.6279.6001.300270516469599170290456821227 576 | 575,1.3.6.1.4.1.14519.5.2.1.6279.6001.300271604576987336866436407488 577 | 576,1.3.6.1.4.1.14519.5.2.1.6279.6001.300392272203629213913702120739 578 | 577,1.3.6.1.4.1.14519.5.2.1.6279.6001.300693623747082239407271583452 579 | 578,1.3.6.1.4.1.14519.5.2.1.6279.6001.301462380687644451483231621986 580 | 579,1.3.6.1.4.1.14519.5.2.1.6279.6001.301582691063019848479942618641 581 | 580,1.3.6.1.4.1.14519.5.2.1.6279.6001.302134342469412607966016057827 582 | 581,1.3.6.1.4.1.14519.5.2.1.6279.6001.302403227435841351528721627052 583 | 582,1.3.6.1.4.1.14519.5.2.1.6279.6001.302557165094691896097534021075 584 | 583,1.3.6.1.4.1.14519.5.2.1.6279.6001.303066851236267189733420290986 585 | 584,1.3.6.1.4.1.14519.5.2.1.6279.6001.303421828981831854739626597495 586 | 585,1.3.6.1.4.1.14519.5.2.1.6279.6001.303865116731361029078599241306 587 | 586,1.3.6.1.4.1.14519.5.2.1.6279.6001.304676828064484590312919543151 588 | 587,1.3.6.1.4.1.14519.5.2.1.6279.6001.304700823314998198591652152637 589 | 588,1.3.6.1.4.1.14519.5.2.1.6279.6001.305858704835252413616501469037 590 | 589,1.3.6.1.4.1.14519.5.2.1.6279.6001.305887072264491016857673607285 591 | 590,1.3.6.1.4.1.14519.5.2.1.6279.6001.306112617218006614029386065035 592 | 591,1.3.6.1.4.1.14519.5.2.1.6279.6001.306140003699110313373771452136 593 | 592,1.3.6.1.4.1.14519.5.2.1.6279.6001.306520140119968755187868602181 594 | 593,1.3.6.1.4.1.14519.5.2.1.6279.6001.306558074682524259000586270818 595 | 
594,1.3.6.1.4.1.14519.5.2.1.6279.6001.306788423710427765311352901943 596 | 595,1.3.6.1.4.1.14519.5.2.1.6279.6001.306948744223170422945185006551 597 | 596,1.3.6.1.4.1.14519.5.2.1.6279.6001.307835307280028057486413359377 598 | 597,1.3.6.1.4.1.14519.5.2.1.6279.6001.307921770358136677021532761235 599 | 598,1.3.6.1.4.1.14519.5.2.1.6279.6001.307946352302138765071461362398 600 | 599,1.3.6.1.4.1.14519.5.2.1.6279.6001.308153138776443962077214577161 601 | 600,1.3.6.1.4.1.14519.5.2.1.6279.6001.308183340111270052562662456038 602 | 601,1.3.6.1.4.1.14519.5.2.1.6279.6001.308655308958459380153492314021 603 | 602,1.3.6.1.4.1.14519.5.2.1.6279.6001.309672797925724868457151381131 604 | 603,1.3.6.1.4.1.14519.5.2.1.6279.6001.309901913847714156367981722205 605 | 604,1.3.6.1.4.1.14519.5.2.1.6279.6001.309955814083231537823157605135 606 | 605,1.3.6.1.4.1.14519.5.2.1.6279.6001.309955999522338651429118207446 607 | 606,1.3.6.1.4.1.14519.5.2.1.6279.6001.310395752124284049604069960014 608 | 607,1.3.6.1.4.1.14519.5.2.1.6279.6001.310548927038333190233889983845 609 | 608,1.3.6.1.4.1.14519.5.2.1.6279.6001.310626494937915759224334597176 610 | 609,1.3.6.1.4.1.14519.5.2.1.6279.6001.311236942972970815890902714604 611 | 610,1.3.6.1.4.1.14519.5.2.1.6279.6001.311476128731958142981941696518 612 | 611,1.3.6.1.4.1.14519.5.2.1.6279.6001.311981398931043315779172047718 613 | 612,1.3.6.1.4.1.14519.5.2.1.6279.6001.312127933722985204808706697221 614 | 613,1.3.6.1.4.1.14519.5.2.1.6279.6001.312704771348460502013249647868 615 | 614,1.3.6.1.4.1.14519.5.2.1.6279.6001.313283554967554803238484128406 616 | 615,1.3.6.1.4.1.14519.5.2.1.6279.6001.313334055029671473836954456733 617 | 616,1.3.6.1.4.1.14519.5.2.1.6279.6001.313605260055394498989743099991 618 | 617,1.3.6.1.4.1.14519.5.2.1.6279.6001.313756547848086902190878548835 619 | 618,1.3.6.1.4.1.14519.5.2.1.6279.6001.313835996725364342034830119490 620 | 619,1.3.6.1.4.1.14519.5.2.1.6279.6001.314519596680450457855054746285 621 | 620,1.3.6.1.4.1.14519.5.2.1.6279.6001.314789075871001236641548593165 622 | 621,1.3.6.1.4.1.14519.5.2.1.6279.6001.314836406260772370397541392345 623 | 622,1.3.6.1.4.1.14519.5.2.1.6279.6001.315187221221054114974341475212 624 | 623,1.3.6.1.4.1.14519.5.2.1.6279.6001.315214756157389122376518747372 625 | 624,1.3.6.1.4.1.14519.5.2.1.6279.6001.315770913282450940389971401304 626 | 625,1.3.6.1.4.1.14519.5.2.1.6279.6001.315918264676377418120578391325 627 | 626,1.3.6.1.4.1.14519.5.2.1.6279.6001.316393351033132458296975008261 628 | 627,1.3.6.1.4.1.14519.5.2.1.6279.6001.316900421002460665752357657094 629 | 628,1.3.6.1.4.1.14519.5.2.1.6279.6001.316911475886263032009840828684 630 | 629,1.3.6.1.4.1.14519.5.2.1.6279.6001.317087518531899043292346860596 631 | 630,1.3.6.1.4.1.14519.5.2.1.6279.6001.317613170669207528926259976488 632 | 631,1.3.6.1.4.1.14519.5.2.1.6279.6001.319009811633846643966578282371 633 | 632,1.3.6.1.4.1.14519.5.2.1.6279.6001.319066480138812986026181758474 634 | 633,1.3.6.1.4.1.14519.5.2.1.6279.6001.320111824803959660037459294083 635 | 634,1.3.6.1.4.1.14519.5.2.1.6279.6001.320967206808467952819309001585 636 | 635,1.3.6.1.4.1.14519.5.2.1.6279.6001.321465552859463184018938648244 637 | 636,1.3.6.1.4.1.14519.5.2.1.6279.6001.321935195060268166151738328001 638 | 637,1.3.6.1.4.1.14519.5.2.1.6279.6001.323302986710576400812869264321 639 | 638,1.3.6.1.4.1.14519.5.2.1.6279.6001.323408652979949774528873200770 640 | 639,1.3.6.1.4.1.14519.5.2.1.6279.6001.323426705628838942177546503237 641 | 640,1.3.6.1.4.1.14519.5.2.1.6279.6001.323535944958374186208096541480 642 | 
641,1.3.6.1.4.1.14519.5.2.1.6279.6001.323541312620128092852212458228 643 | 642,1.3.6.1.4.1.14519.5.2.1.6279.6001.323753921818102744511069914832 644 | 643,1.3.6.1.4.1.14519.5.2.1.6279.6001.323859712968543712594665815359 645 | 644,1.3.6.1.4.1.14519.5.2.1.6279.6001.323899724653546164058849558431 646 | 645,1.3.6.1.4.1.14519.5.2.1.6279.6001.324290109423920971676288828329 647 | 646,1.3.6.1.4.1.14519.5.2.1.6279.6001.324567010179873305471925391582 648 | 647,1.3.6.1.4.1.14519.5.2.1.6279.6001.324649110927013926557500550446 649 | 648,1.3.6.1.4.1.14519.5.2.1.6279.6001.325164338773720548739146851679 650 | 649,1.3.6.1.4.1.14519.5.2.1.6279.6001.325580698241281352835338693869 651 | 650,1.3.6.1.4.1.14519.5.2.1.6279.6001.326057189095429101398977448288 652 | 651,1.3.6.1.4.1.14519.5.2.1.6279.6001.328695385904874796172316226975 653 | 652,1.3.6.1.4.1.14519.5.2.1.6279.6001.328789598898469177563438457842 654 | 653,1.3.6.1.4.1.14519.5.2.1.6279.6001.328944769569002417592093467626 655 | 654,1.3.6.1.4.1.14519.5.2.1.6279.6001.329326052298830421573852261436 656 | 655,1.3.6.1.4.1.14519.5.2.1.6279.6001.329404588567903628160652715124 657 | 656,1.3.6.1.4.1.14519.5.2.1.6279.6001.329624439086643515259182406526 658 | 657,1.3.6.1.4.1.14519.5.2.1.6279.6001.330043769832606379655473292782 659 | 658,1.3.6.1.4.1.14519.5.2.1.6279.6001.330425234131526435132846006585 660 | 659,1.3.6.1.4.1.14519.5.2.1.6279.6001.330544495001617450666819906758 661 | 660,1.3.6.1.4.1.14519.5.2.1.6279.6001.330643702676971528301859647742 662 | 661,1.3.6.1.4.1.14519.5.2.1.6279.6001.331211682377519763144559212009 663 | 662,1.3.6.1.4.1.14519.5.2.1.6279.6001.332453873575389860371315979768 664 | 663,1.3.6.1.4.1.14519.5.2.1.6279.6001.332829333783605240302521201463 665 | 664,1.3.6.1.4.1.14519.5.2.1.6279.6001.333145094436144085379032922488 666 | 665,1.3.6.1.4.1.14519.5.2.1.6279.6001.333319057944372470283038483725 667 | 666,1.3.6.1.4.1.14519.5.2.1.6279.6001.334022941831199910030220864961 668 | 667,1.3.6.1.4.1.14519.5.2.1.6279.6001.334105754605642100456249422350 669 | 668,1.3.6.1.4.1.14519.5.2.1.6279.6001.334166493392278943610545989413 670 | 669,1.3.6.1.4.1.14519.5.2.1.6279.6001.334184846571549530235084187602 671 | 670,1.3.6.1.4.1.14519.5.2.1.6279.6001.334517907433161353885866806005 672 | 671,1.3.6.1.4.1.14519.5.2.1.6279.6001.335866409407244673864352309754 673 | 672,1.3.6.1.4.1.14519.5.2.1.6279.6001.336102335330125765000317290445 674 | 673,1.3.6.1.4.1.14519.5.2.1.6279.6001.336198008634390022174744544656 675 | 674,1.3.6.1.4.1.14519.5.2.1.6279.6001.336225579776978874775723463327 676 | 675,1.3.6.1.4.1.14519.5.2.1.6279.6001.336894364358709782463716339027 677 | 676,1.3.6.1.4.1.14519.5.2.1.6279.6001.337005960787660957389988207064 678 | 677,1.3.6.1.4.1.14519.5.2.1.6279.6001.337845202462615014431060697507 679 | 678,1.3.6.1.4.1.14519.5.2.1.6279.6001.338104567770715523699587505022 680 | 679,1.3.6.1.4.1.14519.5.2.1.6279.6001.338114620394879648539943280992 681 | 680,1.3.6.1.4.1.14519.5.2.1.6279.6001.338447145504282422142824032832 682 | 681,1.3.6.1.4.1.14519.5.2.1.6279.6001.338875090785618956575597613546 683 | 682,1.3.6.1.4.1.14519.5.2.1.6279.6001.339039410276356623209709113755 684 | 683,1.3.6.1.4.1.14519.5.2.1.6279.6001.339142594937666268384335506819 685 | 684,1.3.6.1.4.1.14519.5.2.1.6279.6001.339484970190920330170416228517 686 | 685,1.3.6.1.4.1.14519.5.2.1.6279.6001.339546614783708685476232944897 687 | 686,1.3.6.1.4.1.14519.5.2.1.6279.6001.339882192295517122002429068974 688 | 687,1.3.6.1.4.1.14519.5.2.1.6279.6001.340012777775661021262977442176 689 | 
688,1.3.6.1.4.1.14519.5.2.1.6279.6001.340158437895922179455019686521 690 | 689,1.3.6.1.4.1.14519.5.2.1.6279.6001.341557859428950960906150406596 691 | 690,1.3.6.1.4.1.14519.5.2.1.6279.6001.346115813056769250958550383763 692 | 691,1.3.6.1.4.1.14519.5.2.1.6279.6001.362762275895885013176610377950 693 | 692,1.3.6.1.4.1.14519.5.2.1.6279.6001.367204840301639918160517361062 694 | 693,1.3.6.1.4.1.14519.5.2.1.6279.6001.373433682859788429397781158572 695 | 694,1.3.6.1.4.1.14519.5.2.1.6279.6001.385151742584074711135621089321 696 | 695,1.3.6.1.4.1.14519.5.2.1.6279.6001.387954549120924524005910602207 697 | 696,1.3.6.1.4.1.14519.5.2.1.6279.6001.390009458146468860187238398197 698 | 697,1.3.6.1.4.1.14519.5.2.1.6279.6001.390513733720659266816639651938 699 | 698,1.3.6.1.4.1.14519.5.2.1.6279.6001.392861216720727557882279374324 700 | 699,1.3.6.1.4.1.14519.5.2.1.6279.6001.394470743585708729682444806008 701 | 700,1.3.6.1.4.1.14519.5.2.1.6279.6001.395623571499047043765181005112 702 | 701,1.3.6.1.4.1.14519.5.2.1.6279.6001.397062004302272014259317520874 703 | 702,1.3.6.1.4.1.14519.5.2.1.6279.6001.397202838387416555106806022938 704 | 703,1.3.6.1.4.1.14519.5.2.1.6279.6001.397522780537301776672854630421 705 | 704,1.3.6.1.4.1.14519.5.2.1.6279.6001.398955972049286139436103068984 706 | 705,1.3.6.1.4.1.14519.5.2.1.6279.6001.401389720232123950202941034290 707 | 706,1.3.6.1.4.1.14519.5.2.1.6279.6001.404364125369979066736354549484 708 | 707,1.3.6.1.4.1.14519.5.2.1.6279.6001.404457313935200882843898832756 709 | 708,1.3.6.1.4.1.14519.5.2.1.6279.6001.404768898286087278137462774930 710 | 709,1.3.6.1.4.1.14519.5.2.1.6279.6001.413896555982844732694353377538 711 | 710,1.3.6.1.4.1.14519.5.2.1.6279.6001.414288023902112119945238126594 712 | 711,1.3.6.1.4.1.14519.5.2.1.6279.6001.416701701108520592702405866796 713 | 712,1.3.6.1.4.1.14519.5.2.1.6279.6001.417815314896088956784723476543 714 | 713,1.3.6.1.4.1.14519.5.2.1.6279.6001.419601611032172899567156073142 715 | 714,1.3.6.1.4.1.14519.5.2.1.6279.6001.428038562098395445838061018440 716 | 715,1.3.6.1.4.1.14519.5.2.1.6279.6001.430109407146633213496148200410 717 | 716,1.3.6.1.4.1.14519.5.2.1.6279.6001.436403998650924660479049012235 718 | 717,1.3.6.1.4.1.14519.5.2.1.6279.6001.438308540025607517017949816111 719 | 718,1.3.6.1.4.1.14519.5.2.1.6279.6001.439153572396640163898529626096 720 | 719,1.3.6.1.4.1.14519.5.2.1.6279.6001.440226700369921575481834344455 721 | 720,1.3.6.1.4.1.14519.5.2.1.6279.6001.443400977949406454649939526179 722 | 721,1.3.6.1.4.1.14519.5.2.1.6279.6001.447468612991222399440694673357 723 | 722,1.3.6.1.4.1.14519.5.2.1.6279.6001.449254134266555649028108149727 724 | 723,1.3.6.1.4.1.14519.5.2.1.6279.6001.450501966058662668272378865145 725 | 724,1.3.6.1.4.1.14519.5.2.1.6279.6001.454273545863197752384437758130 726 | 725,1.3.6.1.4.1.14519.5.2.1.6279.6001.458525794434429386945463560826 727 | 726,1.3.6.1.4.1.14519.5.2.1.6279.6001.461155505515403114280165935891 728 | 727,1.3.6.1.4.1.14519.5.2.1.6279.6001.463214953282361219537913355115 729 | 728,1.3.6.1.4.1.14519.5.2.1.6279.6001.463588161905537526756964393219 730 | 729,1.3.6.1.4.1.14519.5.2.1.6279.6001.465032801496479029639448332481 731 | 730,1.3.6.1.4.1.14519.5.2.1.6279.6001.466284753932369813717081722101 732 | 731,1.3.6.1.4.1.14519.5.2.1.6279.6001.470912100568074901744259213968 733 | 732,1.3.6.1.4.1.14519.5.2.1.6279.6001.472487466001405705666001578363 734 | 733,1.3.6.1.4.1.14519.5.2.1.6279.6001.475325201787910087416720919680 735 | 734,1.3.6.1.4.1.14519.5.2.1.6279.6001.478062284228419671253422844986 736 | 
735,1.3.6.1.4.1.14519.5.2.1.6279.6001.479402560265137632920333093071 737 | 736,1.3.6.1.4.1.14519.5.2.1.6279.6001.481278873893653517789960724156 738 | 737,1.3.6.1.4.1.14519.5.2.1.6279.6001.483655032093002252444764787700 739 | 738,1.3.6.1.4.1.14519.5.2.1.6279.6001.486999111981013268988489262668 740 | 739,1.3.6.1.4.1.14519.5.2.1.6279.6001.487268565754493433372433148666 741 | 740,1.3.6.1.4.1.14519.5.2.1.6279.6001.487745546557477250336016826588 742 | 741,1.3.6.1.4.1.14519.5.2.1.6279.6001.503980049263254396021509831276 743 | 742,1.3.6.1.4.1.14519.5.2.1.6279.6001.504324996863016748259361352296 744 | 743,1.3.6.1.4.1.14519.5.2.1.6279.6001.504845428620607044098514803031 745 | 744,1.3.6.1.4.1.14519.5.2.1.6279.6001.511347030803753100045216493273 746 | 745,1.3.6.1.4.1.14519.5.2.1.6279.6001.513023675145166449943177283490 747 | 746,1.3.6.1.4.1.14519.5.2.1.6279.6001.518487185634324801733841260431 748 | 747,1.3.6.1.4.1.14519.5.2.1.6279.6001.525937963993475482158828421281 749 | 748,1.3.6.1.4.1.14519.5.2.1.6279.6001.534006575256943390479252771547 750 | 749,1.3.6.1.4.1.14519.5.2.1.6279.6001.534083630500464995109143618896 751 | 750,1.3.6.1.4.1.14519.5.2.1.6279.6001.550599855064600241623943717588 752 | 751,1.3.6.1.4.1.14519.5.2.1.6279.6001.553241901808946577644850294647 753 | 752,1.3.6.1.4.1.14519.5.2.1.6279.6001.557875302364105947813979213632 754 | 753,1.3.6.1.4.1.14519.5.2.1.6279.6001.558286136379689377915919180358 755 | 754,1.3.6.1.4.1.14519.5.2.1.6279.6001.561423049201987049884663740668 756 | 755,1.3.6.1.4.1.14519.5.2.1.6279.6001.561458563853929400124470098603 757 | 756,1.3.6.1.4.1.14519.5.2.1.6279.6001.564534197011295112247542153557 758 | 757,1.3.6.1.4.1.14519.5.2.1.6279.6001.566816709786169715745131047975 759 | 758,1.3.6.1.4.1.14519.5.2.1.6279.6001.569096986145782511000054443951 760 | 759,1.3.6.1.4.1.14519.5.2.1.6279.6001.584871944187559733312703328980 761 | 760,1.3.6.1.4.1.14519.5.2.1.6279.6001.592821488053137951302246128864 762 | 761,1.3.6.1.4.1.14519.5.2.1.6279.6001.596908385953413160131451426904 763 | 762,1.3.6.1.4.1.14519.5.2.1.6279.6001.603126300703296693942875967838 764 | 763,1.3.6.1.4.1.14519.5.2.1.6279.6001.603166427542096384265514998412 765 | 764,1.3.6.1.4.1.14519.5.2.1.6279.6001.608029415915051219877530734559 766 | 765,1.3.6.1.4.1.14519.5.2.1.6279.6001.613212850444255764524630781782 767 | 766,1.3.6.1.4.1.14519.5.2.1.6279.6001.614147706162329660656328811671 768 | 767,1.3.6.1.4.1.14519.5.2.1.6279.6001.616033753016904899083676284739 769 | 768,1.3.6.1.4.1.14519.5.2.1.6279.6001.618434772073433276874225174904 770 | 769,1.3.6.1.4.1.14519.5.2.1.6279.6001.619372068417051974713149104919 771 | 770,1.3.6.1.4.1.14519.5.2.1.6279.6001.621916089407825046337959219998 772 | 771,1.3.6.1.4.1.14519.5.2.1.6279.6001.622243923620914676263059698181 773 | 772,1.3.6.1.4.1.14519.5.2.1.6279.6001.624425075947752229712087113746 774 | 773,1.3.6.1.4.1.14519.5.2.1.6279.6001.625270601160880745954773142570 775 | 774,1.3.6.1.4.1.14519.5.2.1.6279.6001.627998298349675613581885874395 776 | 775,1.3.6.1.4.1.14519.5.2.1.6279.6001.631047517458234322522264161877 777 | 776,1.3.6.1.4.1.14519.5.2.1.6279.6001.640729228179368154416184318668 778 | 777,1.3.6.1.4.1.14519.5.2.1.6279.6001.652347820272212119124022644822 779 | 778,1.3.6.1.4.1.14519.5.2.1.6279.6001.655242448149322898770987310561 780 | 779,1.3.6.1.4.1.14519.5.2.1.6279.6001.657775098760536289051744981056 781 | 780,1.3.6.1.4.1.14519.5.2.1.6279.6001.658611160253017715059194304729 782 | 781,1.3.6.1.4.1.14519.5.2.1.6279.6001.663019255629770796363333877035 783 | 
782,1.3.6.1.4.1.14519.5.2.1.6279.6001.664409965623578819357819577077 784 | 783,1.3.6.1.4.1.14519.5.2.1.6279.6001.664989286137882319237192185951 785 | 784,1.3.6.1.4.1.14519.5.2.1.6279.6001.669435869708883155232318480131 786 | 785,1.3.6.1.4.1.14519.5.2.1.6279.6001.669518152156802508672627785405 787 | 786,1.3.6.1.4.1.14519.5.2.1.6279.6001.670107649586205629860363487713 788 | 787,1.3.6.1.4.1.14519.5.2.1.6279.6001.671278273674156798801285503514 789 | 788,1.3.6.1.4.1.14519.5.2.1.6279.6001.674809958213117379592437424616 790 | 789,1.3.6.1.4.1.14519.5.2.1.6279.6001.675543413149938600000570588203 791 | 790,1.3.6.1.4.1.14519.5.2.1.6279.6001.686193079844756926365065559979 792 | 791,1.3.6.1.4.1.14519.5.2.1.6279.6001.690929968028676628605553365896 793 | 792,1.3.6.1.4.1.14519.5.2.1.6279.6001.692598144815688523679745963696 794 | 793,1.3.6.1.4.1.14519.5.2.1.6279.6001.693480911433291675609148051914 795 | 794,1.3.6.1.4.1.14519.5.2.1.6279.6001.701514276942509393419164159551 796 | 795,1.3.6.1.4.1.14519.5.2.1.6279.6001.707218743153927597786179232739 797 | 796,1.3.6.1.4.1.14519.5.2.1.6279.6001.710845873679853791427022019413 798 | 797,1.3.6.1.4.1.14519.5.2.1.6279.6001.712472578497712558367294720243 799 | 798,1.3.6.1.4.1.14519.5.2.1.6279.6001.716498695101447665580610403574 800 | 799,1.3.6.1.4.1.14519.5.2.1.6279.6001.724251104254976962355686318345 801 | 800,1.3.6.1.4.1.14519.5.2.1.6279.6001.724562063158320418413995627171 802 | 801,1.3.6.1.4.1.14519.5.2.1.6279.6001.725023183844147505748475581290 803 | 802,1.3.6.1.4.1.14519.5.2.1.6279.6001.725236073737175770730904408416 804 | 803,1.3.6.1.4.1.14519.5.2.1.6279.6001.733642690503782454656013446707 805 | 804,1.3.6.1.4.1.14519.5.2.1.6279.6001.741709061958490690246385302477 806 | 805,1.3.6.1.4.1.14519.5.2.1.6279.6001.743969234977916254223533321294 807 | 806,1.3.6.1.4.1.14519.5.2.1.6279.6001.745109871503276594185453478952 808 | 807,1.3.6.1.4.1.14519.5.2.1.6279.6001.747803439040091794717626507402 809 | 808,1.3.6.1.4.1.14519.5.2.1.6279.6001.749871569713868632259874663577 810 | 809,1.3.6.1.4.1.14519.5.2.1.6279.6001.750792629100457382099842515038 811 | 810,1.3.6.1.4.1.14519.5.2.1.6279.6001.752756872840730509471096155114 812 | 811,1.3.6.1.4.1.14519.5.2.1.6279.6001.756684168227383088294595834066 813 | 812,1.3.6.1.4.1.14519.5.2.1.6279.6001.765459236550358748053283544075 814 | 813,1.3.6.1.4.1.14519.5.2.1.6279.6001.765930210026773090100532964804 815 | 814,1.3.6.1.4.1.14519.5.2.1.6279.6001.766881513533845439335142582269 816 | 815,1.3.6.1.4.1.14519.5.2.1.6279.6001.768276876111112560631432843476 817 | 816,1.3.6.1.4.1.14519.5.2.1.6279.6001.771741891125176943862272696845 818 | 817,1.3.6.1.4.1.14519.5.2.1.6279.6001.771831598853841017505646275338 819 | 818,1.3.6.1.4.1.14519.5.2.1.6279.6001.774060103415303828812229821954 820 | 819,1.3.6.1.4.1.14519.5.2.1.6279.6001.776429308535398795601496131524 821 | 820,1.3.6.1.4.1.14519.5.2.1.6279.6001.776800177074349870648765614630 822 | 821,1.3.6.1.4.1.14519.5.2.1.6279.6001.779493719385047675154892222907 823 | 822,1.3.6.1.4.1.14519.5.2.1.6279.6001.780558315515979171413904604168 824 | 823,1.3.6.1.4.1.14519.5.2.1.6279.6001.792381786708289670758399079830 825 | 824,1.3.6.1.4.1.14519.5.2.1.6279.6001.797637294244261543517154417124 826 | 825,1.3.6.1.4.1.14519.5.2.1.6279.6001.799582546798528864710752164515 827 | 826,1.3.6.1.4.1.14519.5.2.1.6279.6001.801945620899034889998809817499 828 | 827,1.3.6.1.4.1.14519.5.2.1.6279.6001.802595762867498341201607992711 829 | 828,1.3.6.1.4.1.14519.5.2.1.6279.6001.803808126682275425758092691689 830 | 
829,1.3.6.1.4.1.14519.5.2.1.6279.6001.803987517543436570820681016103 831 | 830,1.3.6.1.4.1.14519.5.2.1.6279.6001.805925269324902055566754756843 832 | 831,1.3.6.1.4.1.14519.5.2.1.6279.6001.811825890493256320617655474043 833 | 832,1.3.6.1.4.1.14519.5.2.1.6279.6001.814122498113547115932318256859 834 | 833,1.3.6.1.4.1.14519.5.2.1.6279.6001.822128649427327893802314908658 835 | 834,1.3.6.1.4.1.14519.5.2.1.6279.6001.826812708000318290301835871780 836 | 835,1.3.6.1.4.1.14519.5.2.1.6279.6001.826829446346820089862659555750 837 | 836,1.3.6.1.4.1.14519.5.2.1.6279.6001.832260670372728970918746541371 838 | 837,1.3.6.1.4.1.14519.5.2.1.6279.6001.837810280808122125183730411210 839 | 838,1.3.6.1.4.1.14519.5.2.1.6279.6001.842317928015463083368074520378 840 | 839,1.3.6.1.4.1.14519.5.2.1.6279.6001.842980983137518332429408284002 841 | 840,1.3.6.1.4.1.14519.5.2.1.6279.6001.850739282072340578344345230132 842 | 841,1.3.6.1.4.1.14519.5.2.1.6279.6001.855232435861303786204450738044 843 | 842,1.3.6.1.4.1.14519.5.2.1.6279.6001.861997885565255340442123234170 844 | 843,1.3.6.1.4.1.14519.5.2.1.6279.6001.866845763956586959109892274084 845 | 844,1.3.6.1.4.1.14519.5.2.1.6279.6001.868211851413924881662621747734 846 | 845,1.3.6.1.4.1.14519.5.2.1.6279.6001.877026508860018521147620598474 847 | 846,1.3.6.1.4.1.14519.5.2.1.6279.6001.882070241245008756731854510592 848 | 847,1.3.6.1.4.1.14519.5.2.1.6279.6001.885168397833922082085837240429 849 | 848,1.3.6.1.4.1.14519.5.2.1.6279.6001.885292267869246639232975687131 850 | 849,1.3.6.1.4.1.14519.5.2.1.6279.6001.888291896309937415860209787179 851 | 850,1.3.6.1.4.1.14519.5.2.1.6279.6001.888615810685807330497715730842 852 | 851,1.3.6.1.4.1.14519.5.2.1.6279.6001.892375496445736188832556446335 853 | 852,1.3.6.1.4.1.14519.5.2.1.6279.6001.897161587681142256575045076919 854 | 853,1.3.6.1.4.1.14519.5.2.1.6279.6001.897279226481700053115245043064 855 | 854,1.3.6.1.4.1.14519.5.2.1.6279.6001.897684031374557757145405000951 856 | 855,1.3.6.1.4.1.14519.5.2.1.6279.6001.898642529028521482602829374444 857 | 856,1.3.6.1.4.1.14519.5.2.1.6279.6001.900182736599353600185270496549 858 | 857,1.3.6.1.4.1.14519.5.2.1.6279.6001.905371958588660410240398317235 859 | 858,1.3.6.1.4.1.14519.5.2.1.6279.6001.908250781706513856628130123235 860 | 859,1.3.6.1.4.1.14519.5.2.1.6279.6001.910435939545691201820711078950 861 | 860,1.3.6.1.4.1.14519.5.2.1.6279.6001.910607280658963002048724648683 862 | 861,1.3.6.1.4.1.14519.5.2.1.6279.6001.910757789941076242457816491305 863 | 862,1.3.6.1.4.1.14519.5.2.1.6279.6001.922852847124879997825997808179 864 | 863,1.3.6.1.4.1.14519.5.2.1.6279.6001.927394449308471452920270961822 865 | 864,1.3.6.1.4.1.14519.5.2.1.6279.6001.931383239747372227838946053237 866 | 865,1.3.6.1.4.1.14519.5.2.1.6279.6001.935683764293840351008008793409 867 | 866,1.3.6.1.4.1.14519.5.2.1.6279.6001.939152384493874708850321969356 868 | 867,1.3.6.1.4.1.14519.5.2.1.6279.6001.939216568327879462530496768794 869 | 868,1.3.6.1.4.1.14519.5.2.1.6279.6001.943403138251347598519939390311 870 | 869,1.3.6.1.4.1.14519.5.2.1.6279.6001.944888107209008719031293531091 871 | 870,1.3.6.1.4.1.14519.5.2.1.6279.6001.946129570505893110165820050204 872 | 871,1.3.6.1.4.1.14519.5.2.1.6279.6001.948414623428298219623354433437 873 | 872,1.3.6.1.4.1.14519.5.2.1.6279.6001.952265563663939823135367733681 874 | 873,1.3.6.1.4.1.14519.5.2.1.6279.6001.955688628308192728558382581802 875 | 874,1.3.6.1.4.1.14519.5.2.1.6279.6001.957384617596077920906744920611 876 | 875,1.3.6.1.4.1.14519.5.2.1.6279.6001.961063442349005937536597225349 877 | 
876,1.3.6.1.4.1.14519.5.2.1.6279.6001.964952370561266624992539111877 878 | 877,1.3.6.1.4.1.14519.5.2.1.6279.6001.965620538050807352935663552285 879 | 878,1.3.6.1.4.1.14519.5.2.1.6279.6001.969607480572818589276327766720 880 | 879,1.3.6.1.4.1.14519.5.2.1.6279.6001.970264865033574190975654369557 881 | 880,1.3.6.1.4.1.14519.5.2.1.6279.6001.970428941353693253759289796610 882 | 881,1.3.6.1.4.1.14519.5.2.1.6279.6001.975254950136384517744116790879 883 | 882,1.3.6.1.4.1.14519.5.2.1.6279.6001.975426625618184773401026809852 884 | 883,1.3.6.1.4.1.14519.5.2.1.6279.6001.979083010707182900091062408058 885 | 884,1.3.6.1.4.1.14519.5.2.1.6279.6001.980362852713685276785310240144 886 | 885,1.3.6.1.4.1.14519.5.2.1.6279.6001.986011151772797848993829243183 887 | 886,1.3.6.1.4.1.14519.5.2.1.6279.6001.994459772950022352718462251777 888 | 887,1.3.6.1.4.1.14519.5.2.1.6279.6001.997611074084993415992563148335 889 | -------------------------------------------------------------------------------- /labels/val9_sid.csv: -------------------------------------------------------------------------------- 1 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.222052723822248889877676736332' 2 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.334166493392278943610545989413' 3 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.212608679077007918190529579976' 4 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.229860476925100292554329427970' 5 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.227968442353440630355230778531' 6 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.230491296081537726468075344411' 7 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.330043769832606379655473292782' 8 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.138813197521718693188313387015' 9 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.927394449308471452920270961822' 10 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.109882169963817627559804568094' 11 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.311476128731958142981941696518' 12 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.150097650621090951325113116280' 13 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.193721075067404532739943086458' 14 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.306140003699110313373771452136' 15 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.229096941293122177107846044795' 16 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.265570697208310960298668720669' 17 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.436403998650924660479049012235' 18 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.237915456403882324748189195892' 19 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.121108220866971173712229588402' 20 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.170706757615202213033480003264' 21 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.328944769569002417592093467626' 22 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.771741891125176943862272696845' 23 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.188385286346390202873004762827' 24 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.220596530836092324070084384692' 25 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.340012777775661021262977442176' 26 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.179683407589764683292800449011' 27 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.188619674701053082195613114069' 28 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.121805476976020513950614465787' 29 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.174449669706458092793093760291' 30 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.182192086929819295877506541021' 31 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.472487466001405705666001578363' 32 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.603126300703296693942875967838' 33 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.309901913847714156367981722205' 34 | 
b'1.3.6.1.4.1.14519.5.2.1.6279.6001.229960820686439513664996214638' 35 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.124656777236468248920498636247' 36 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.194632613233275988184244485809' 37 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.268030488196493755113553009785' 38 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.145510611155363050427743946446' 39 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.139577698050713461261415990027' 40 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.337005960787660957389988207064' 41 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.196251645377731223510086726530' 42 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.167661207884826429102690781600' 43 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.134519406153127654901640638633' 44 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.138674679709964033277400089532' 45 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.392861216720727557882279374324' 46 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.733642690503782454656013446707' 47 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.187694838527128312070807533473' 48 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.217754016294471278921686508169' 49 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.237215747217294006286437405216' 50 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.300392272203629213913702120739' 51 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.153985109349433321657655488650' 52 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.176869045992276345870480098568' 53 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.199261544234308780356714831537' 54 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.440226700369921575481834344455' 55 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.765459236550358748053283544075' 56 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.254929810944557499537650429296' 57 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.112767175295249119452142211437' 58 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.291156498203266896953765649282' 59 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.312127933722985204808706697221' 60 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.259453428008507791234730686014' 61 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.300270516469599170290456821227' 62 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.250481236093201801255751845296' 63 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.141345499716190654505508410197' 64 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.102681962408431413578140925249' 65 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.292994770358625142596171316474' 66 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.558286136379689377915919180358' 67 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.765930210026773090100532964804' 68 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.946129570505893110165820050204' 69 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.148935306123327835217659769212' 70 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.330643702676971528301859647742' 71 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.300693623747082239407271583452' 72 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.272344603176687884771013620823' 73 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.124822907934319930841506266464' 74 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.387954549120924524005910602207' 75 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.292576688635952269497781991202' 76 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.114914167428485563471327801935' 77 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.195557219224169985110295082004' 78 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.265780642925621389994857727416' 79 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.712472578497712558367294720243' 80 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.215104063467523905369326175410' 81 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.747803439040091794717626507402' 82 | 
b'1.3.6.1.4.1.14519.5.2.1.6279.6001.286061375572911414226912429210' 83 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.193964947698259739624715468431' 84 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.340158437895922179455019686521' 85 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.173931884906244951746140865701' 86 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.129650136453746261130135157590' 87 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.215785045378334625097907422785' 88 | b'1.3.6.1.4.1.14519.5.2.1.6279.6001.855232435861303786204450738044' 89 | -------------------------------------------------------------------------------- /preprocess/make_validate_npy.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import glob 3 | import ntpath 4 | import pandas 5 | from config_training import config 6 | 7 | val_num = 9 8 | 9 | trainlist = [] 10 | vallist = [] 11 | 12 | subset_dir = config['luna_raw'] 13 | print (subset_dir) 14 | 15 | subject_no_dict = {} 16 | for subject_no in range(0, 10): 17 | src_dir = subset_dir + "subset" + str(subject_no) + "/" 18 | for src_path in glob.glob(src_dir + "*.mhd"): 19 | patient_id = ntpath.basename(src_path).replace(".mhd", "") 20 | subject_no_dict[patient_id] = subject_no 21 | 22 | abbrevs = np.array(pandas.read_csv(config['luna_abbr'],header=None)) 23 | namelist = list(abbrevs[:,1]) 24 | ids = abbrevs[:,0] 25 | 26 | print (len(subject_no_dict)) 27 | 28 | for key, value in subject_no_dict.items(): 29 | id = ids[namelist.index(key)] 30 | id = '0' * (3 - len(str(id))) + str(id) 31 | if value != val_num: 32 | trainlist.append(id) 33 | else: 34 | vallist.append(id) 35 | 36 | print (len(trainlist)) 37 | print (len(vallist)) 38 | print (trainlist) 39 | print (vallist) 40 | 41 | np.save('train_luna_' + str(val_num) + '.npy',np.array(trainlist)) 42 | np.save('val' + str(val_num) + '.npy',np.array(vallist)) -------------------------------------------------------------------------------- /preprocess/prepare.py: -------------------------------------------------------------------------------- 1 | import os 2 | import shutil 3 | import numpy as np 4 | from config_training import config 5 | 6 | 7 | #from scipy.io import loadmat 8 | import numpy as np 9 | #import h5py 10 | #import pandas 11 | #import scipy 12 | from scipy.ndimage.interpolation import zoom 13 | #from skimage import measure 14 | import SimpleITK as sitk 15 | from scipy.ndimage.morphology import binary_dilation,generate_binary_structure 16 | from skimage.morphology import convex_hull_image 17 | import pandas 18 | #from multiprocessing import Pool 19 | #from functools import partial 20 | import sys 21 | import math 22 | sys.path.append('../preprocessing') 23 | #from step1 import step1_python_luna 24 | #import warnings 25 | import glob 26 | from bs4 import BeautifulSoup 27 | 28 | def resample(imgs, spacing, new_spacing,order=2): 29 | if len(imgs.shape)==3: 30 | new_shape = np.round(imgs.shape * spacing / new_spacing) 31 | true_spacing = spacing * imgs.shape / new_shape 32 | resize_factor = new_shape / imgs.shape 33 | imgs = zoom(imgs, resize_factor, mode = 'nearest',order=order) 34 | return imgs, true_spacing 35 | elif len(imgs.shape)==4: 36 | n = imgs.shape[-1] 37 | newimg = [] 38 | for i in range(n): 39 | slice = imgs[:,:,:,i] 40 | newslice,true_spacing = resample(slice,spacing,new_spacing) 41 | newimg.append(newslice) 42 | newimg=np.transpose(np.array(newimg),[1,2,3,0]) 43 | return newimg,true_spacing 44 | else: 45 | raise ValueError('wrong shape') 46 | def worldToVoxelCoord(worldCoord, 
origin, spacing): 47 | 48 | stretchedVoxelCoord = np.absolute(worldCoord - origin) 49 | voxelCoord = stretchedVoxelCoord / spacing 50 | return voxelCoord 51 | 52 | def load_itk_image(filename): 53 | with open(filename) as f: 54 | contents = f.readlines() 55 | line = [k for k in contents if k.startswith('TransformMatrix')][0] 56 | transformM = np.array(line.split(' = ')[1].split(' ')).astype('float') 57 | transformM = np.round(transformM) 58 | if np.any( transformM!=np.array([1,0,0, 0, 1, 0, 0, 0, 1])): 59 | isflip = True 60 | else: 61 | isflip = False 62 | 63 | itkimage = sitk.ReadImage(filename) 64 | numpyImage = sitk.GetArrayFromImage(itkimage) 65 | 66 | numpyOrigin = np.array(list(reversed(itkimage.GetOrigin()))) 67 | numpySpacing = np.array(list(reversed(itkimage.GetSpacing()))) 68 | 69 | return numpyImage, numpyOrigin, numpySpacing, isflip 70 | 71 | def process_mask(mask): 72 | convex_mask = np.copy(mask) 73 | for i_layer in range(convex_mask.shape[0]): 74 | mask1 = np.ascontiguousarray(mask[i_layer]) 75 | if np.sum(mask1)>0: 76 | mask2 = convex_hull_image(mask1) 77 | if np.sum(mask2)>1.5*np.sum(mask1): 78 | mask2 = mask1 79 | else: 80 | mask2 = mask1 81 | convex_mask[i_layer] = mask2 82 | struct = generate_binary_structure(3,1) 83 | dilatedMask = binary_dilation(convex_mask,structure=struct,iterations=10) 84 | return dilatedMask 85 | 86 | 87 | def lumTrans(img): 88 | lungwin = np.array([-1200.,600.]) 89 | newimg = (img-lungwin[0])/(lungwin[1]-lungwin[0]) 90 | newimg[newimg<0]=0 91 | newimg[newimg>1]=1 92 | newimg = (newimg*255).astype('uint8') 93 | return newimg 94 | 95 | def savenpy_luna_attribute(xml_path, annos, filelist, luna_data, savepath, candidate_annos, abbrevs, readlist): 96 | islabel = True 97 | isClean = True 98 | isCandidate = True 99 | isAttribute = True 100 | resolution = np.array([1, 1, 1]) 101 | namelist = list(abbrevs[:, 1]) 102 | ids = abbrevs[:, 0] 103 | 104 | with open(xml_path, 'r') as xml_file: 105 | markup = xml_file.read() 106 | xml = BeautifulSoup(markup, features="xml") 107 | if xml.LidcReadMessage is None: 108 | return -1 109 | patient_id = xml.LidcReadMessage.ResponseHeader.SeriesInstanceUid.text 110 | 111 | if patient_id in namelist: 112 | name = ids[namelist.index(patient_id)] 113 | 114 | name = str(name) 115 | if len(name) < 3: 116 | for i in range(3 - len(name)): 117 | name = '0' + name 118 | print (name) 119 | 120 | if name in readlist: 121 | print ("overlap", name) 122 | return -1 123 | 124 | #print (id, patient_id) 125 | this_annos = np.copy(annos[annos[:, 0] == int(name)]) 126 | if isClean: 127 | 128 | sliceim, origin, spacing, isflip = load_itk_image(os.path.join(luna_data, name + '.mhd')) 129 | ori_sliceim_shape_yx = sliceim.shape[1:3] 130 | if isflip: 131 | sliceim = sliceim[:, ::-1, ::-1] 132 | print('flip!') 133 | sliceim = lumTrans(sliceim) 134 | 135 | sliceim1, _ = resample(sliceim, spacing, resolution, order=1) 136 | sliceim = sliceim1[np.newaxis, ...] 
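# Note (added by the editor, not part of the original source): the block above is the
# intensity/geometry normalization step of savenpy_luna_attribute(). lumTrans() windows
# raw HU values to the lung window [-1200, 600], clips outside that range, and rescales
# to uint8; resample() then brings the scan to the isotropic 1x1x1 mm `resolution` grid
# before the volume (with a leading channel axis) is saved as <name>_clean.npy.
# Illustrative quick check of the windowing (values computed from the code above,
# not taken from the repository's own docs):
#   >>> lumTrans(np.array([[-1200., -300., 600.]]))
#   array([[  0, 127, 255]], dtype=uint8)
# i.e. HU <= -1200 maps to 0, HU >= 600 maps to 255, and -300 HU lands near mid-gray.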
137 | np.save(os.path.join(savepath, name + '_clean.npy'), sliceim) 138 | 139 | #make attribute_annos 140 | # name,pos_x, pos_y, pos_z, malignacy, sphericiy, margin, spiculation, texture, calcification, internal_structure, lobulation, subtlety, hit_count 141 | if isAttribute: 142 | luna_annos = np.copy(this_annos) 143 | annos_shape = np.shape(luna_annos) 144 | attribute_annos = np.zeros((annos_shape[0], annos_shape[1] + 10)) 145 | 146 | for i in range(len(luna_annos)): 147 | luna_annos[i][1] = (luna_annos[i][1] - origin[2]) / spacing[2] 148 | luna_annos[i][2] = (luna_annos[i][2] - origin[1]) / spacing[1] 149 | luna_annos[i][3] = (luna_annos[i][3] - origin[0]) / spacing[0] 150 | 151 | if isflip: 152 | luna_annos[i][1] = -luna_annos[i][1] 153 | luna_annos[i][2] = -luna_annos[i][2] 154 | attribute_annos[i] = np.concatenate((luna_annos[i], np.array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))) 155 | 156 | reading_sessions = xml.LidcReadMessage.find_all("readingSession") 157 | for reading_session in reading_sessions: 158 | # print("Sesion") 159 | nodules = reading_session.find_all("unblindedReadNodule") 160 | for nodule in nodules: 161 | nodule_id = nodule.noduleID.text 162 | rois = nodule.find_all("roi") 163 | x_min = y_min = z_min = 999999 164 | x_max = y_max = z_max = -999999 165 | for roi in rois: 166 | z_pos = float(roi.imageZposition.text) 167 | z_min = min(z_min, z_pos) 168 | z_max = max(z_max, z_pos) 169 | edge_maps = roi.find_all("edgeMap") 170 | for edge_map in edge_maps: 171 | x = float(edge_map.xCoord.text) 172 | y = float(edge_map.yCoord.text) 173 | x_min = min(x_min, x) 174 | y_min = min(y_min, y) 175 | x_max = max(x_max, x) 176 | y_max = max(y_max, y) 177 | if x_max == x_min: 178 | continue 179 | if y_max == y_min: 180 | continue 181 | x_diameter = x_max - x_min 182 | x_center = x_min + x_diameter / 2 183 | y_diameter = y_max - y_min 184 | y_center = y_min + y_diameter / 2 185 | z_diameter = z_max - z_min 186 | z_center = z_min + z_diameter / 2 187 | z_center -= origin[0] 188 | z_center /= spacing[0] 189 | 190 | if nodule.characteristics is None: 191 | # print("!!!!Nodule:", nodule_id, " has no charecteristics") 192 | continue 193 | if nodule.characteristics.malignancy is None: 194 | # print("!!!!Nodule:", nodule_id, " has no malignacy") 195 | continue 196 | 197 | malignacy = int(nodule.characteristics.malignancy.text) 198 | sphericiy = int(nodule.characteristics.sphericity.text) 199 | margin = int(nodule.characteristics.margin.text) 200 | spiculation = int(nodule.characteristics.spiculation.text) 201 | texture = int(nodule.characteristics.texture.text) 202 | calcification = int(nodule.characteristics.calcification.text) 203 | internal_structure = int(nodule.characteristics.internalStructure.text) 204 | lobulation = int(nodule.characteristics.lobulation.text) 205 | subtlety = int(nodule.characteristics.subtlety.text) 206 | 207 | for annos in attribute_annos: 208 | dist = math.sqrt(math.pow(x_center - annos[1], 2) + math.pow(y_center - annos[2], 2) + math.pow( 209 | z_center - annos[3], 2)) 210 | if dist <= annos[4]: 211 | annos[5] += malignacy 212 | annos[6] += sphericiy 213 | annos[7] += margin 214 | annos[8] += spiculation 215 | annos[9] += texture 216 | annos[10] += calcification 217 | annos[11] += internal_structure 218 | annos[12] += lobulation 219 | annos[13] += subtlety 220 | annos[14] += 1 221 | 222 | for annos in attribute_annos: 223 | if (annos[14] > 0): 224 | annos[5] = annos[5] / annos[14] 225 | annos[6] = annos[6] / annos[14] 226 | annos[7] = annos[7] / annos[14] 227 | annos[8] 
= annos[8] / annos[14] 228 | annos[9] = annos[9] / annos[14] 229 | annos[10] = annos[10] / annos[14] 230 | annos[11] = annos[11] / annos[14] 231 | annos[12] = annos[12] / annos[14] 232 | annos[13] = annos[13] / annos[14] 233 | else: 234 | print ('no hit nodule', annos) 235 | 236 | if islabel: 237 | 238 | this_annos = np.copy(this_annos) 239 | label = [] 240 | if len(this_annos) > 0: 241 | 242 | for c in this_annos: 243 | pos = worldToVoxelCoord(c[1:4][::-1], origin=origin, spacing=spacing) 244 | if isflip: 245 | pos[1:] = ori_sliceim_shape_yx - pos[1:] 246 | label.append(np.concatenate([pos, [c[4] / spacing[1]]])) 247 | 248 | label = np.array(label) 249 | if len(label) == 0: 250 | label2 = np.array([[0, 0, 0, 0]]) 251 | else: 252 | label2 = np.copy(label).T 253 | label2[:3] = label2[:3] * np.expand_dims(spacing, 1) / np.expand_dims(resolution, 1) 254 | label2[3] = label2[3] * spacing[1] / resolution[1] 255 | label2 = label2[:4].T 256 | 257 | #set voxel z,y,x pos to attibute annos 258 | for i in range(len(attribute_annos)): 259 | attribute_annos[i][1] = label2[i][0] 260 | attribute_annos[i][2] = label2[i][1] 261 | attribute_annos[i][3] = label2[i][2] 262 | 263 | np.save(os.path.join(savepath, name + '_label.npy'), label2) 264 | np.save(os.path.join(savepath, name + '_attribute.npy'), attribute_annos) 265 | 266 | if isCandidate: 267 | img_shape = sliceim.shape[1:4] 268 | candidate_annos = np.copy(candidate_annos[candidate_annos[:, 0] == int(name)]) 269 | label = [] 270 | if len(candidate_annos) > 0: 271 | 272 | for c in candidate_annos: 273 | pos = worldToVoxelCoord(c[1:4][::-1], origin=origin, spacing=spacing) 274 | # print ("2 label", pos) 275 | if isflip: 276 | pos[1:] = ori_sliceim_shape_yx - pos[1:] 277 | # print ("flip label", pos) 278 | 279 | pos = pos * spacing / resolution 280 | 281 | transit_val = 6 282 | min_dist = 3 283 | min_check = ((pos - transit_val) > 0).all() 284 | max_check = (((pos + transit_val) - img_shape) < 0).all() 285 | # print (min_check, max_check) 286 | if (min_check and max_check): 287 | label.append(np.concatenate([pos, [c[4] / spacing[1]]])) 288 | 289 | label = np.array(label) 290 | print ("candidate len", len(candidate_annos), len(label)) 291 | if len(label) == 0: 292 | label2 = np.array([[0, 0, 0, 0]]) 293 | else: 294 | label2 = np.copy(label).T 295 | # print ("3 label", label2) 296 | # label2[:3] = label2[:3] * np.expand_dims(spacing, 1) / np.expand_dims(resolution, 1) 297 | label2[3] = label2[3] * spacing[1] / resolution[1] 298 | # label2[:3] = label2[:3] - np.expand_dims(extendbox[:, 0], 1) 299 | # print ("a label", label2) 300 | label2 = label2[:4].T 301 | np.save(os.path.join(savepath, name + '_candidate.npy'), label2) 302 | 303 | return name 304 | else: 305 | print ('not LUNA16 list', patient_id) 306 | return -1 307 | #name = filelist[id] 308 | print(id, name) 309 | 310 | return 1 311 | 312 | 313 | def prepare_luna(): 314 | luna_raw = config['luna_raw'] 315 | luna_abbr = config['luna_abbr'] 316 | luna_data = config['luna_data'] 317 | #luna_segment = config['luna_segment'] 318 | finished_flag = '.flag_prepareluna' 319 | 320 | if not os.path.exists(finished_flag): 321 | print('start changing luna name') 322 | subsetdirs = [os.path.join(luna_raw, f) for f in os.listdir(luna_raw) if 323 | f.startswith('subset') and os.path.isdir(os.path.join(luna_raw, f))] 324 | if not os.path.exists(luna_data): 325 | os.mkdir(luna_data) 326 | 327 | abbrevs = np.array(pandas.read_csv(config['luna_abbr'], header=None)) 328 | namelist = list(abbrevs[:, 1]) 329 | ids = 
abbrevs[:, 0] 330 | 331 | for d in subsetdirs: 332 | files = os.listdir(d) 333 | files.sort() 334 | 335 | for f in files: 336 | name = f[:-4] 337 | id = ids[namelist.index(name)] 338 | filename = '0' * (3 - len(str(id))) + str(id) 339 | shutil.move(os.path.join(d, f), os.path.join(luna_data, filename + f[-4:])) 340 | print(os.path.join(luna_data, str(id) + f[-4:])) 341 | 342 | files = [f for f in os.listdir(luna_data) if f.endswith('mhd')] 343 | for file in files: 344 | with open(os.path.join(luna_data, file), 'r') as f: 345 | content = f.readlines() 346 | id = file.split('.mhd')[0] 347 | filename = '0' * (3 - len(str(id))) + str(id) 348 | content[-1] = 'ElementDataFile = ' + filename + '.raw\n' 349 | print(content[-1]) 350 | with open(os.path.join(luna_data, file), 'w') as f: 351 | f.writelines(content) 352 | 353 | print('end changing luna name') 354 | f = open(finished_flag, "w+") 355 | 356 | def preprocess_luna(): 357 | savepath = config['preprocess_result_path'] 358 | luna_data = config['luna_data'] 359 | luna_label = config['luna_label'] 360 | luna_candidate_label = config['luna_candidate_label'] 361 | finished_flag = '.flag_preprocessluna' 362 | xml_path = config['lidc_xml'] 363 | 364 | abbrevs = np.array(pandas.read_csv(config['luna_abbr'], header=None)) 365 | 366 | print('starting preprocessing luna', os.path.exists(finished_flag)) 367 | if not os.path.exists(finished_flag): 368 | filelist = [f.split('.mhd')[0] for f in os.listdir(luna_data) if f.endswith('.mhd') ] 369 | annos = np.array(pandas.read_csv(luna_label)) 370 | candidate_annos = np.array(pandas.read_csv(luna_candidate_label)) 371 | 372 | if not os.path.exists(savepath): 373 | os.mkdir(savepath) 374 | 375 | readlist = [] 376 | file_no = 0 377 | for anno_dir in [d for d in glob.glob(xml_path + "/*") if os.path.isdir(d)]: 378 | xml_paths = glob.glob(anno_dir + "/*.xml") 379 | print(file_no, ": ", xml_path) 380 | for xml_path in xml_paths: 381 | err = savenpy_luna_attribute(xml_path=xml_path, annos=annos, filelist=filelist, luna_data=luna_data, 382 | savepath=savepath, candidate_annos=candidate_annos, abbrevs=abbrevs, readlist=readlist) 383 | if (err != -1): 384 | if err not in readlist: 385 | readlist.append(err) 386 | file_no += 1 387 | print('end preprocessing luna', file_no, len(filelist),) 388 | 389 | f= open(finished_flag,"w+") 390 | 391 | if __name__=='__main__': 392 | prepare_luna() 393 | preprocess_luna() 394 | 395 | -------------------------------------------------------------------------------- /train.sh: -------------------------------------------------------------------------------- 1 | python training/main.py --model res_classifier --gpu 3 --save-dir res18_classifier 2 | -------------------------------------------------------------------------------- /training/data.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import torch 3 | from torch.utils.data import Dataset 4 | import os 5 | import time 6 | import collections 7 | from scipy.ndimage.interpolation import rotate 8 | 9 | class DataBowl3Detector(Dataset): 10 | def __init__(self, data_dir, split_path, config, phase = 'train',split_comber=None): 11 | assert(phase == 'train' or phase == 'val' or phase == 'test') 12 | self.phase = phase 13 | self.isScale = config['aug_scale'] 14 | self.r_rand = config['r_rand_crop'] 15 | 16 | self.augtype = config['augtype'] 17 | self.pad_value = config['pad_value'] 18 | self.idcs = np.load(split_path) 19 | 20 | self.filenames = [os.path.join(data_dir, 
'%s_clean.npy' % idx) for idx in self.idcs] 21 | 22 | labels = [] 23 | candidates = [] 24 | # labels 25 | # name, pos_z, pos_y, pos_x, size, malignacy, sphericiy, margin, spiculation, texture, calcification, internal_structure, lobulation, subtlety, hit_count 26 | for idx in self.idcs: 27 | l = np.load(os.path.join(data_dir, '%s_attribute.npy' %idx)) 28 | if np.all(l==0): 29 | l=np.array([]) 30 | labels.append(l) 31 | 32 | if self.phase != 'test': 33 | self.bboxes = [] 34 | for i, l in enumerate(labels): 35 | if len(l) > 0 : 36 | for label in l: 37 | self.bboxes.append([np.concatenate([[i], label[1:]])]) 38 | 39 | self.bboxes = np.concatenate(self.bboxes,axis = 0) 40 | np.random.shuffle(self.bboxes) 41 | #self.crop = Crop(config) 42 | self.crop = padding_crop(config) 43 | 44 | def __getitem__(self, idx,split=None): 45 | #print ('get item', idx) 46 | t = time.time() 47 | 48 | bbox = self.bboxes[idx] 49 | filename = self.filenames[int(bbox[0])] 50 | imgs = np.load(filename) 51 | isScale = self.augtype['scale'] and (self.phase=='train') 52 | 53 | if self.phase=='train': 54 | sample, target = self.crop(imgs, bbox[1:], True) 55 | sample, target = augment(sample, target, 56 | ifflip=self.augtype['flip'], ifrotate=self.augtype['rotate'], 57 | ifswap=self.augtype['swap']) 58 | else: 59 | sample, target = self.crop(imgs, bbox[1:], False) 60 | 61 | label = np.array(bbox[5:14]) 62 | sample = (sample.astype(np.float32)-128)/128 63 | 64 | #if filename in self.kagglenames and self.phase=='train': 65 | # label[label==-1]=0 66 | return torch.from_numpy(sample), torch.from_numpy(label) 67 | 68 | def __len__(self): 69 | return len(self.bboxes) 70 | 71 | 72 | def augment(sample, target, ifflip = True, ifrotate=True, ifswap = True): 73 | # angle1 = np.random.rand()*180 74 | 75 | if ifrotate: 76 | validrot = False 77 | counter = 0 78 | while not validrot: 79 | newtarget = np.copy(target) 80 | angle1 = np.random.rand()*180 81 | size = np.array(sample.shape[2:4]).astype('float') 82 | rotmat = np.array([[np.cos(angle1/180*np.pi),-np.sin(angle1/180*np.pi)],[np.sin(angle1/180*np.pi),np.cos(angle1/180*np.pi)]]) 83 | newtarget[1:3] = np.dot(rotmat,target[1:3]-size/2)+size/2 84 | if np.all(newtarget[:3]>target[3]) and np.all(newtarget[:3]< np.array(sample.shape[1:4])-newtarget[3]): 85 | validrot = True 86 | target = newtarget 87 | sample = rotate(sample,angle1,axes=(2,3),reshape=False) 88 | else: 89 | counter += 1 90 | if counter ==3: 91 | break 92 | if ifswap: 93 | if sample.shape[1]==sample.shape[2] and sample.shape[1]==sample.shape[3]: 94 | axisorder = np.random.permutation(3) 95 | sample = np.transpose(sample,np.concatenate([[0],axisorder+1])) 96 | target[:3] = target[:3][axisorder] 97 | 98 | if ifflip: 99 | # flipid = np.array([np.random.randint(2),np.random.randint(2),np.random.randint(2)])*2-1 100 | flipid = np.array([1,np.random.randint(2),np.random.randint(2)])*2-1 101 | sample = np.ascontiguousarray(sample[:,::flipid[0],::flipid[1],::flipid[2]]) 102 | for ax in range(3): 103 | if flipid[ax]==-1: 104 | target[ax] = np.array(sample.shape[ax+1])-target[ax] 105 | 106 | return sample, target 107 | 108 | class Crop(object): 109 | def __init__(self, config): 110 | self.img_size = config['img_size'] 111 | self.bound_size = config['bound_size'] 112 | #self.stride = config['stride'] 113 | self.pad_value = config['pad_value'] 114 | 115 | def __call__(self, imgs, target): 116 | 117 | crop_size = self.img_size 118 | bound_size = self.bound_size 119 | 120 | target = np.copy(target) 121 | 122 | start = [] 123 | for i in 
range(3): 124 | s = np.floor(target[i])+ 1 - bound_size 125 | e = np.ceil (target[i])+ 1 + bound_size - crop_size[i] 126 | 127 | 128 | if s>e: 129 | start.append(np.random.randint(e,s))#! 130 | else: 131 | start.append(int(target[i])-crop_size[i]/2+np.random.randint(-bound_size/2,bound_size/2)) 132 | 133 | pad = [] 134 | pad.append([0,0]) 135 | for i in range(3): 136 | leftpad = max(0,-start[i]) 137 | rightpad = max(0,start[i]+crop_size[i]-imgs.shape[i+1]) 138 | pad.append([leftpad,rightpad]) 139 | crop = imgs[:, 140 | max(start[0],0):min(start[0] + crop_size[0],imgs.shape[1]), 141 | max(start[1],0):min(start[1] + crop_size[1],imgs.shape[2]), 142 | max(start[2],0):min(start[2] + crop_size[2],imgs.shape[3])] 143 | 144 | crop = np.pad(crop,pad,'constant',constant_values =self.pad_value) 145 | 146 | for i in range(3): 147 | target[i] = target[i] - start[i] 148 | 149 | return crop, target 150 | 151 | 152 | class padding_crop(object): 153 | def __init__(self, config): 154 | self.img_size = config['img_size'] 155 | self.bound_size = config['bound_size'] 156 | # self.stride = config['stride'] 157 | self.pad_value = config['pad_value'] 158 | 159 | def crop(self, imgs, target, train=True): 160 | crop_size = self.img_size 161 | bound_size = self.bound_size 162 | target = np.copy(target) 163 | 164 | start = [] 165 | for i in range(3): 166 | start.append(int(target[i]) - int(crop_size[i] / 2)) 167 | 168 | pad = [] 169 | pad.append([0, 0]) 170 | for i in range(3): 171 | leftpad = max(0, -start[i]) 172 | rightpad = max(0, start[i] + crop_size[i] - imgs.shape[i + 1]) 173 | pad.append([leftpad, rightpad]) 174 | crop = imgs[:, 175 | max(start[0], 0):min(start[0] + crop_size[0], imgs.shape[1]), 176 | max(start[1], 0):min(start[1] + crop_size[1], imgs.shape[2]), 177 | max(start[2], 0):min(start[2] + crop_size[2], imgs.shape[3])] 178 | 179 | crop = np.pad(crop, pad, 'constant', constant_values=self.pad_value) 180 | 181 | for i in range(3): 182 | target[i] = target[i] - start[i] 183 | 184 | return crop, target 185 | 186 | def __call__(self, imgs, target, train=True): 187 | 188 | crop_img, target = self.crop(imgs, target, train) 189 | imgs = np.squeeze(crop_img, axis=0) 190 | 191 | z = int(target[0]) 192 | y = int(target[1]) 193 | x = int(target[2]) 194 | #z = 24 195 | #y = 24 196 | #x = 24 197 | 198 | nodule_size = int(target[3]) 199 | margin = max(7, nodule_size * 0.4) 200 | radius = int((nodule_size + margin) / 2) 201 | 202 | s_z_pad = 0 203 | e_z_pad = 0 204 | s_y_pad = 0 205 | e_y_pad = 0 206 | s_x_pad = 0 207 | e_x_pad = 0 208 | 209 | s_z = max(0, z - radius) 210 | if (s_z == 0): 211 | s_z_pad = -(z - radius) 212 | 213 | e_z = min(np.shape(imgs)[0], z + radius) 214 | if (e_z == np.shape(imgs)[0]): 215 | e_z_pad = (z + radius) - np.shape(imgs)[0] 216 | 217 | s_y = max(0, y - radius) 218 | if (s_y == 0): 219 | s_y_pad = -(y - radius) 220 | 221 | e_y = min(np.shape(imgs)[1], y + radius) 222 | if (e_y == np.shape(imgs)[1]): 223 | e_y_pad = (y + radius) - np.shape(imgs)[1] 224 | 225 | s_x = max(0, x - radius) 226 | if (s_x == 0): 227 | s_x_pad = -(x - radius) 228 | 229 | e_x = min(np.shape(imgs)[2], x + radius) 230 | if (e_x == np.shape(imgs)[2]): 231 | e_x_pad = (x + radius) - np.shape(imgs)[2] 232 | 233 | # print (s_x, e_x, s_y, e_y, s_z, e_z) 234 | # print (np.shape(img_arr[s_z:e_z, s_y:e_y, s_x:e_x])) 235 | nodule_img = imgs[s_z:e_z, s_y:e_y, s_x:e_x] 236 | nodule_img = np.pad(nodule_img, [[s_z_pad, e_z_pad], [s_y_pad, e_y_pad], [s_x_pad, e_x_pad]], 'constant', 237 | constant_values=0) 238 | 239 | 
imgpad_size = [self.img_size[0] - np.shape(nodule_img)[0], 240 | self.img_size[1] - np.shape(nodule_img)[1], 241 | self.img_size[2] - np.shape(nodule_img)[2]] 242 | imgpad = [] 243 | imgpad_left = [int(imgpad_size[0] / 2), 244 | int(imgpad_size[1] / 2), 245 | int(imgpad_size[2] / 2)] 246 | imgpad_right = [int(imgpad_size[0] / 2), 247 | int(imgpad_size[1] / 2), 248 | int(imgpad_size[2] / 2)] 249 | 250 | for i in range(3): 251 | if (imgpad_size[i] % 2 != 0): 252 | 253 | rand = np.random.randint(2) 254 | if rand == 0: 255 | imgpad.append([imgpad_left[i], imgpad_right[i] + 1]) 256 | else: 257 | imgpad.append([imgpad_left[i] + 1, imgpad_right[i]]) 258 | else: 259 | imgpad.append([imgpad_left[i], imgpad_right[i]]) 260 | 261 | padding_crop = np.pad(nodule_img, imgpad, 'constant', constant_values=self.pad_value) 262 | 263 | padding_crop = np.expand_dims(padding_crop, axis=0) 264 | 265 | crop = np.concatenate((padding_crop, crop_img)) 266 | 267 | return crop, target 268 | 269 | def collate(batch): 270 | if torch.is_tensor(batch[0]): 271 | return [b.unsqueeze(0) for b in batch] 272 | elif isinstance(batch[0], np.ndarray): 273 | return batch 274 | elif isinstance(batch[0], int): 275 | return torch.LongTensor(batch) 276 | elif isinstance(batch[0], collections.Iterable): 277 | transposed = zip(*batch) 278 | return [collate(samples) for samples in transposed] 279 | 280 | -------------------------------------------------------------------------------- /training/layers.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | 3 | import torch 4 | from torch import nn 5 | import torch.nn.functional as F 6 | from torch.autograd import Variable 7 | import math 8 | 9 | class PostRes2d(nn.Module): 10 | def __init__(self, n_in, n_out, stride = 1): 11 | super(PostRes2d, self).__init__() 12 | self.conv1 = nn.Conv2d(n_in, n_out, kernel_size = 3, stride = stride, padding = 1) 13 | self.bn1 = nn.BatchNorm2d(n_out) 14 | self.relu = nn.ReLU(inplace = True) 15 | self.conv2 = nn.Conv2d(n_out, n_out, kernel_size = 3, padding = 1) 16 | self.bn2 = nn.BatchNorm2d(n_out) 17 | 18 | if stride != 1 or n_out != n_in: 19 | self.shortcut = nn.Sequential( 20 | nn.Conv2d(n_in, n_out, kernel_size = 1, stride = stride), 21 | nn.BatchNorm2d(n_out)) 22 | else: 23 | self.shortcut = None 24 | 25 | def forward(self, x): 26 | residual = x 27 | if self.shortcut is not None: 28 | residual = self.shortcut(x) 29 | out = self.conv1(x) 30 | out = self.bn1(out) 31 | out = self.relu(out) 32 | out = self.conv2(out) 33 | out = self.bn2(out) 34 | 35 | out += residual 36 | out = self.relu(out) 37 | return out 38 | 39 | class PostRes(nn.Module): 40 | def __init__(self, n_in, n_out, stride = 1): 41 | super(PostRes, self).__init__() 42 | self.conv1 = nn.Conv3d(n_in, n_out, kernel_size = 3, stride = stride, padding = 1) 43 | self.bn1 = nn.BatchNorm3d(n_out) 44 | self.relu = nn.ReLU(inplace = True) 45 | self.conv2 = nn.Conv3d(n_out, n_out, kernel_size = 3, padding = 1) 46 | self.bn2 = nn.BatchNorm3d(n_out) 47 | 48 | if stride != 1 or n_out != n_in: 49 | self.shortcut = nn.Sequential( 50 | nn.Conv3d(n_in, n_out, kernel_size = 1, stride = stride), 51 | nn.BatchNorm3d(n_out)) 52 | else: 53 | self.shortcut = None 54 | 55 | def forward(self, x): 56 | residual = x 57 | if self.shortcut is not None: 58 | residual = self.shortcut(x) 59 | out = self.conv1(x) 60 | out = self.bn1(out) 61 | out = self.relu(out) 62 | out = self.conv2(out) 63 | out = self.bn2(out) 64 | 65 | out += residual 66 | out = self.relu(out) 67 
| return out 68 | 69 | class AttributeOutput(nn.Module): 70 | def __init__(self): 71 | super(AttributeOutput, self).__init__() 72 | self.conv0 = nn.Sequential(nn.Conv3d(128, 64, kernel_size=2), 73 | nn.BatchNorm3d(64), 74 | nn.ReLU(), 75 | nn.Conv3d(64, 64, kernel_size=1), 76 | nn.ReLU() 77 | ) 78 | self.conv1 = nn.Sequential(nn.Conv3d(128, 64, kernel_size=2), 79 | nn.BatchNorm3d(64), 80 | nn.ReLU(), 81 | nn.Conv3d(64, 64, kernel_size=1), 82 | nn.ReLU(), 83 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2)) 84 | ) 85 | self.conv2 = nn.Sequential(nn.Conv3d(128, 64, kernel_size=2), 86 | nn.BatchNorm3d(64), 87 | nn.ReLU(), 88 | nn.Conv3d(64, 64, kernel_size=1), 89 | nn.ReLU(), 90 | nn.AvgPool3d((4, 4, 4), stride=(4, 4, 4)) 91 | ) 92 | self.fc = nn.Sequential(nn.Linear(1536, 256), 93 | nn.Linear(256, 128), 94 | nn.Linear(128, 1)) 95 | self.drop = nn.Dropout3d(p=0.5, inplace=False) 96 | 97 | def forward(self, x): 98 | x0_drop = self.drop(x[0]) 99 | x1_drop = self.drop(x[1]) 100 | x2_drop = self.drop(x[2]) 101 | conv0 = self.conv0(x0_drop) 102 | conv1 = self.conv1(x1_drop) 103 | conv2 = self.conv2(x2_drop) 104 | conv0 = conv0.view(conv0.size(0), -1) 105 | conv1 = conv1.view(conv1.size(0), -1) 106 | conv2 = conv2.view(conv2.size(0), -1) 107 | out = torch.cat((conv0, conv1, conv2), 1) 108 | out = self.fc(out) 109 | 110 | return out 111 | 112 | class Rec3(nn.Module): 113 | def __init__(self, n0, n1, n2, n3, p = 0.0, integrate = True): 114 | super(Rec3, self).__init__() 115 | 116 | self.block01 = nn.Sequential( 117 | nn.Conv3d(n0, n1, kernel_size = 3, stride = 2, padding = 1), 118 | nn.BatchNorm3d(n1), 119 | nn.ReLU(inplace = True), 120 | nn.Conv3d(n1, n1, kernel_size = 3, padding = 1), 121 | nn.BatchNorm3d(n1)) 122 | 123 | self.block11 = nn.Sequential( 124 | nn.Conv3d(n1, n1, kernel_size = 3, padding = 1), 125 | nn.BatchNorm3d(n1), 126 | nn.ReLU(inplace = True), 127 | nn.Conv3d(n1, n1, kernel_size = 3, padding = 1), 128 | nn.BatchNorm3d(n1)) 129 | 130 | self.block21 = nn.Sequential( 131 | nn.ConvTranspose3d(n2, n1, kernel_size = 2, stride = 2), 132 | nn.BatchNorm3d(n1), 133 | nn.ReLU(inplace = True), 134 | nn.Conv3d(n1, n1, kernel_size = 3, padding = 1), 135 | nn.BatchNorm3d(n1)) 136 | 137 | self.block12 = nn.Sequential( 138 | nn.Conv3d(n1, n2, kernel_size = 3, stride = 2, padding = 1), 139 | nn.BatchNorm3d(n2), 140 | nn.ReLU(inplace = True), 141 | nn.Conv3d(n2, n2, kernel_size = 3, padding = 1), 142 | nn.BatchNorm3d(n2)) 143 | 144 | self.block22 = nn.Sequential( 145 | nn.Conv3d(n2, n2, kernel_size = 3, padding = 1), 146 | nn.BatchNorm3d(n2), 147 | nn.ReLU(inplace = True), 148 | nn.Conv3d(n2, n2, kernel_size = 3, padding = 1), 149 | nn.BatchNorm3d(n2)) 150 | 151 | self.block32 = nn.Sequential( 152 | nn.ConvTranspose3d(n3, n2, kernel_size = 2, stride = 2), 153 | nn.BatchNorm3d(n2), 154 | nn.ReLU(inplace = True), 155 | nn.Conv3d(n2, n2, kernel_size = 3, padding = 1), 156 | nn.BatchNorm3d(n2)) 157 | 158 | self.block23 = nn.Sequential( 159 | nn.Conv3d(n2, n3, kernel_size = 3, stride = 2, padding = 1), 160 | nn.BatchNorm3d(n3), 161 | nn.ReLU(inplace = True), 162 | nn.Conv3d(n3, n3, kernel_size = 3, padding = 1), 163 | nn.BatchNorm3d(n3)) 164 | 165 | self.block33 = nn.Sequential( 166 | nn.Conv3d(n3, n3, kernel_size = 3, padding = 1), 167 | nn.BatchNorm3d(n3), 168 | nn.ReLU(inplace = True), 169 | nn.Conv3d(n3, n3, kernel_size = 3, padding = 1), 170 | nn.BatchNorm3d(n3)) 171 | 172 | self.relu = nn.ReLU(inplace = True) 173 | self.p = p 174 | self.integrate = integrate 175 | 176 | def forward(self, x0, x1, x2, x3): 
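        # Note (added comment): each of the 8 cross-scale paths below is gated by an
        # independent Bernoulli(1 - p) coefficient during training; outside training
        # (or when p == 0) every path is used and scaled by (1 - p), analogous to
        # dropout's expected-value rescaling.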
177 | if self.p > 0 and self.training: 178 | coef = torch.bernoulli((1.0 - self.p) * torch.ones(8)) 179 | out1 = coef[0] * self.block01(x0) + coef[1] * self.block11(x1) + coef[2] * self.block21(x2) 180 | out2 = coef[3] * self.block12(x1) + coef[4] * self.block22(x2) + coef[5] * self.block32(x3) 181 | out3 = coef[6] * self.block23(x2) + coef[7] * self.block33(x3) 182 | else: 183 | out1 = (1 - self.p) * (self.block01(x0) + self.block11(x1) + self.block21(x2)) 184 | out2 = (1 - self.p) * (self.block12(x1) + self.block22(x2) + self.block32(x3)) 185 | out3 = (1 - self.p) * (self.block23(x2) + self.block33(x3)) 186 | 187 | if self.integrate: 188 | out1 += x1 189 | out2 += x2 190 | out3 += x3 191 | 192 | return x0, self.relu(out1), self.relu(out2), self.relu(out3) 193 | 194 | def hard_mining(neg_output, neg_labels, num_hard): 195 | _, idcs = torch.topk(neg_output, min(num_hard, len(neg_output))) 196 | neg_output = torch.index_select(neg_output, 0, idcs) 197 | neg_labels = torch.index_select(neg_labels, 0, idcs) 198 | return neg_output, neg_labels 199 | 200 | class MAE_Loss(nn.Module): 201 | def __init__(self): 202 | super(MAE_Loss, self).__init__() 203 | self.l1_loss = nn.L1Loss() 204 | 205 | def forward(self, output, labels, train=True): 206 | if not (train): 207 | #calcification max range 5, internal structure max range 4, other max range 5 208 | for i in range(9): 209 | if (i == 5): 210 | if (output[:, i].data[0] > 6): 211 | output[:, i].data[0] = 6.0 212 | elif (i == 6): 213 | if (output[:, i].data[0] > 4): 214 | output[:, i].data[0] = 4.0 215 | else: 216 | if (output[:, i].data[0] > 5): 217 | output[:, i].data[0] = 5.0 218 | 219 | if (output[:, i].data[0] < 1): 220 | output[:, i].data[0] = 1.0 221 | 222 | criteria = self.l1_loss 223 | losses = [ 224 | criteria(output[:,0], labels[:,0]), 225 | criteria(output[:,1], labels[:,1]), 226 | criteria(output[:,2], labels[:,2]), 227 | criteria(output[:,3], labels[:,3]), 228 | criteria(output[:,4], labels[:,4]), 229 | criteria(output[:,5], labels[:,5]), 230 | criteria(output[:,6], labels[:,6]), 231 | criteria(output[:,7], labels[:,7]), 232 | criteria(output[:,8], labels[:,8]), 233 | ] 234 | 235 | total_loss = 0 236 | for loss in losses: 237 | total_loss += loss 238 | return total_loss, losses 239 | 240 | class MAE_Loss_one(nn.Module): 241 | def __init__(self): 242 | super(MAE_Loss_one, self).__init__() 243 | self.l1_loss = nn.L1Loss() 244 | 245 | def forward(self, output, labels, train=True): 246 | if not (train): 247 | #calcification max range 5, internal structure max range 4, other max range 5 248 | if (output[:, 0].data[0] > 5): 249 | output[:, 0].data[0] = 5.0 250 | 251 | if (output[:, 0].data[0] < 1): 252 | output[:, 0].data[0] = 1.0 253 | 254 | criteria = self.l1_loss 255 | 256 | total_loss = criteria(output[:,0], labels[:,0]) 257 | 258 | return total_loss 259 | 260 | class GetPBB(object): 261 | def __init__(self, config): 262 | self.stride = config['stride'] 263 | self.anchors = np.asarray(config['anchors']) 264 | 265 | def __call__(self, output,thresh = -3, ismask=False): 266 | stride = self.stride 267 | anchors = self.anchors 268 | output = np.copy(output) 269 | offset = (float(stride) - 1) / 2 270 | output_size = output.shape 271 | oz = np.arange(offset, offset + stride * (output_size[0] - 1) + 1, stride) 272 | oh = np.arange(offset, offset + stride * (output_size[1] - 1) + 1, stride) 273 | ow = np.arange(offset, offset + stride * (output_size[2] - 1) + 1, stride) 274 | 275 | output[:, :, :, :, 1] = oz.reshape((-1, 1, 1, 1)) + output[:, :, 
:, :, 1] * anchors.reshape((1, 1, 1, -1)) 276 | output[:, :, :, :, 2] = oh.reshape((1, -1, 1, 1)) + output[:, :, :, :, 2] * anchors.reshape((1, 1, 1, -1)) 277 | output[:, :, :, :, 3] = ow.reshape((1, 1, -1, 1)) + output[:, :, :, :, 3] * anchors.reshape((1, 1, 1, -1)) 278 | output[:, :, :, :, 4] = np.exp(output[:, :, :, :, 4]) * anchors.reshape((1, 1, 1, -1)) 279 | mask = output[..., 0] > thresh 280 | xx,yy,zz,aa = np.where(mask) 281 | 282 | output = output[xx,yy,zz,aa] 283 | #bboxes = nms(output, 0.4) 284 | if ismask: 285 | return output, [xx,yy,zz,aa] 286 | else: 287 | return output 288 | 289 | #output = output[output[:, 0] >= self.conf_th] 290 | #bboxes = nms(output, self.nms_th) 291 | def nms(output, nms_th): 292 | if len(output) == 0: 293 | return output 294 | 295 | output = output[np.argsort(-output[:, 0])] 296 | bboxes = [output[0]] 297 | 298 | for i in np.arange(1, len(output)): 299 | bbox = output[i] 300 | flag = 1 301 | for j in range(len(bboxes)): 302 | if iou(bbox[1:5], bboxes[j][1:5]) >= nms_th: 303 | flag = -1 304 | break 305 | if flag == 1: 306 | bboxes.append(bbox) 307 | 308 | bboxes = np.asarray(bboxes, np.float32) 309 | return bboxes 310 | 311 | def iou(box0, box1): 312 | 313 | r0 = box0[3] / 2 314 | s0 = box0[:3] - r0 315 | e0 = box0[:3] + r0 316 | 317 | r1 = box1[3] / 2 318 | s1 = box1[:3] - r1 319 | e1 = box1[:3] + r1 320 | 321 | overlap = [] 322 | for i in range(len(s0)): 323 | overlap.append(max(0, min(e0[i], e1[i]) - max(s0[i], s1[i]))) 324 | 325 | intersection = overlap[0] * overlap[1] * overlap[2] 326 | union = box0[3] * box0[3] * box0[3] + box1[3] * box1[3] * box1[3] - intersection 327 | return intersection / union 328 | 329 | def acc(pbb, lbb, conf_th, nms_th, detect_th): 330 | if (len(pbb) > 0): 331 | pbb = pbb[pbb[:, 0] >= conf_th] 332 | pbb = nms(pbb, nms_th) 333 | 334 | tp = [] 335 | fp = [] 336 | fn = [] 337 | l_flag = np.zeros((len(lbb),), np.int32) 338 | for p in pbb: 339 | flag = 0 340 | bestscore = 0 341 | for i, l in enumerate(lbb): 342 | score = iou(p[1:5], l) 343 | if score>bestscore: 344 | bestscore = score 345 | besti = i 346 | if bestscore > detect_th: 347 | flag = 1 348 | if l_flag[besti] == 0: 349 | l_flag[besti] = 1 350 | tp.append(np.concatenate([p,[bestscore]],0)) 351 | else: 352 | fp.append(np.concatenate([p,[bestscore]],0)) 353 | if flag == 0: 354 | fp.append(np.concatenate([p,[bestscore]],0)) 355 | for i,l in enumerate(lbb): 356 | if l_flag[i]==0: 357 | score = [] 358 | for p in pbb: 359 | score.append(iou(p[1:5],l)) 360 | if len(score)!=0: 361 | bestscore = np.max(score) 362 | else: 363 | bestscore = 0 364 | if bestscore0: 391 | fn = np.concatenate([fn,tp[fn_i,:5]]) 392 | else: 393 | fn = fn 394 | if len(tp_in_topk)>0: 395 | tp = tp[tp_in_topk] 396 | else: 397 | tp = [] 398 | if len(fp_in_topk)>0: 399 | fp = newallp[fp_in_topk] 400 | else: 401 | fp = [] 402 | return tp, fp , fn 403 | -------------------------------------------------------------------------------- /training/main.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | import os 3 | import time 4 | import numpy as np 5 | import data 6 | from importlib import import_module 7 | import shutil 8 | from utils import * 9 | import sys 10 | sys.path.append('../') 11 | 12 | import torch 13 | from torch.nn import DataParallel 14 | from torch.backends import cudnn 15 | from torch.utils.data import DataLoader 16 | from torch import optim 17 | from torch.autograd import Variable 18 | from config_training import config as config_training 
19 | from torch import nn 20 | import math 21 | 22 | parser = argparse.ArgumentParser(description='PyTorch DataBowl3 Detector') 23 | parser.add_argument('--model', '-m', metavar='MODEL', default='base', 24 | help='model') 25 | parser.add_argument('-j', '--workers', default=12, type=int, metavar='N', 26 | help='number of data loading workers (default: 32)') 27 | parser.add_argument('--epochs', default=100, type=int, metavar='N', 28 | help='number of total epochs to run') 29 | parser.add_argument('--start-epoch', default=0, type=int, metavar='N', 30 | help='manual epoch number (useful on restarts)') 31 | parser.add_argument('-b', '--batch-size', default=48, type=int, 32 | metavar='N', help='mini-batch size (default: 16)') 33 | parser.add_argument('--lr', '--learning-rate', default=0.01, type=float, 34 | metavar='LR', help='initial learning rate') 35 | parser.add_argument('--momentum', default=0.9, type=float, metavar='M', 36 | help='momentum') 37 | parser.add_argument('--weight-decay', '--wd', default=1e-4, type=float, 38 | metavar='W', help='weight decay (default: 1e-4)') 39 | parser.add_argument('--save-freq', default='10', type=int, metavar='S', 40 | help='save frequency') 41 | parser.add_argument('--resume', default='', type=str, metavar='PATH', 42 | help='path to latest checkpoint (default: none)') 43 | parser.add_argument('--save-dir', default='', type=str, metavar='SAVE', 44 | help='directory to save checkpoint (default: none)') 45 | parser.add_argument('--test', default=0, type=int, metavar='TEST', 46 | help='1 do test evaluation, 0 not') 47 | parser.add_argument('--split', default=8, type=int, metavar='SPLIT', 48 | help='In the test phase, split the image to 8 parts') 49 | parser.add_argument('--gpu', default='all', type=str, metavar='N', 50 | help='use gpu') 51 | parser.add_argument('--n_test', default=8, type=int, metavar='N', 52 | help='number of gpu for test') 53 | 54 | def sigmoid(x): 55 | return 1 / (1 + math.exp(-x)) 56 | 57 | def main(): 58 | global args 59 | args = parser.parse_args() 60 | 61 | 62 | torch.manual_seed(0) 63 | torch.cuda.set_device(0) 64 | 65 | model = import_module(args.model) 66 | config, net, loss = model.get_model() 67 | start_epoch = args.start_epoch 68 | save_dir = args.save_dir 69 | 70 | if args.resume: 71 | checkpoint = torch.load(args.resume) 72 | #if start_epoch == 0: 73 | # start_epoch = checkpoint['epoch'] + 1 74 | #if not save_dir: 75 | # save_dir = checkpoint['save_dir'] 76 | #else: 77 | save_dir = os.path.join('results',save_dir) 78 | net.load_state_dict(checkpoint['state_dict']) 79 | else: 80 | if start_epoch == 0: 81 | start_epoch = 1 82 | if not save_dir: 83 | exp_id = time.strftime('%Y%m%d-%H%M%S', time.localtime()) 84 | save_dir = os.path.join('results', args.model + '-' + exp_id) 85 | else: 86 | save_dir = os.path.join('results',save_dir) 87 | 88 | if not os.path.exists(save_dir): 89 | os.makedirs(save_dir) 90 | logfile = os.path.join(save_dir,'log') 91 | if args.test!=1: 92 | sys.stdout = Logger(logfile) 93 | pyfiles = [f for f in os.listdir('./') if f.endswith('.py')] 94 | for f in pyfiles: 95 | shutil.copy(f,os.path.join(save_dir,f)) 96 | n_gpu = setgpu(args.gpu) 97 | args.n_gpu = n_gpu 98 | print ("arg", args.gpu) 99 | print ("num_gpu",n_gpu) 100 | net = net.cuda() 101 | loss = loss.cuda() 102 | cudnn.benchmark = True 103 | net = DataParallel(net) 104 | datadir = config_training['preprocess_result_path'] 105 | 106 | print ("datadir", datadir) 107 | print ("pad_val", config['pad_value']) 108 | print ("aug type", config['augtype']) 109 
| 110 | dataset = data.DataBowl3Detector( 111 | datadir, 112 | 'train_luna_9.npy', 113 | config, 114 | phase = 'train') 115 | print ("len train_dataset", dataset.__len__()) 116 | train_loader = DataLoader( 117 | dataset, 118 | batch_size = args.batch_size, 119 | shuffle = True, 120 | num_workers = args.workers, 121 | pin_memory=True) 122 | 123 | dataset = data.DataBowl3Detector( 124 | datadir, 125 | 'val9.npy', 126 | config, 127 | phase = 'val') 128 | print ("len val_dataset", dataset.__len__()) 129 | 130 | val_loader = DataLoader( 131 | dataset, 132 | batch_size = 1, 133 | shuffle = False, 134 | num_workers = args.workers, 135 | pin_memory=True) 136 | 137 | optimizer = torch.optim.SGD( 138 | net.parameters(), 139 | args.lr, 140 | momentum = 0.9, 141 | weight_decay = args.weight_decay) 142 | 143 | def get_lr(epoch): 144 | if epoch <= args.epochs * 0.5: 145 | lr = args.lr 146 | elif epoch <= args.epochs * 0.8: 147 | lr = 0.1 * args.lr 148 | else: 149 | lr = 0.01 * args.lr 150 | return lr 151 | 152 | best_val_loss = 100 153 | best_mal_loss = 100 154 | for epoch in range(start_epoch, args.epochs + 1): 155 | print ("epoch", epoch) 156 | train(train_loader, net, loss, epoch, optimizer, get_lr, args.save_freq, save_dir) 157 | best_val_loss, best_mal_loss = validate(val_loader, net, loss, best_val_loss, best_mal_loss, epoch, save_dir) 158 | 159 | def train(data_loader, net, loss, epoch, optimizer, get_lr, save_freq, save_dir): 160 | start_time = time.time() 161 | 162 | net.train() 163 | lr = get_lr(epoch) 164 | for param_group in optimizer.param_groups: 165 | param_group['lr'] = lr 166 | 167 | loss_res = [] 168 | for i, (data, target) in enumerate(data_loader): 169 | data = Variable(data.cuda(async = True)) 170 | target = Variable(target.cuda(async = True)) 171 | output = net(data) 172 | 173 | loss_output, attribute = loss(output, target.float()) 174 | optimizer.zero_grad() 175 | loss_output.backward() 176 | optimizer.step() 177 | loss_res.append(loss_output.data[0]) 178 | 179 | if epoch % args.save_freq == 0: 180 | state_dict = net.module.state_dict() 181 | for key in state_dict.keys(): 182 | state_dict[key] = state_dict[key].cpu() 183 | 184 | torch.save({ 185 | 'epoch': epoch, 186 | 'save_dir': save_dir, 187 | 'state_dict': state_dict, 188 | 'args': args}, 189 | os.path.join(save_dir, '%03d.ckpt' % epoch)) 190 | 191 | end_time = time.time() 192 | 193 | loss_res = np.asarray(loss_res, np.float32) 194 | print('Epoch %03d (lr %.5f)' % (epoch, lr)) 195 | print('Train loss %2.4f, time %2.4f' % (np.mean(loss_res), end_time - start_time)) 196 | print 197 | 198 | def validate(data_loader, net, loss, best_val_loss, best_mal_loss, epoch, save_dir): 199 | start_time = time.time() 200 | 201 | net.eval() 202 | 203 | loss_res = [] 204 | less_list = np.zeros((9, 10)) 205 | malignacy_loss = [] 206 | sphericiy_loss = [] 207 | margin_loss = [] 208 | spiculation_loss = [] 209 | texture_loss = [] 210 | calcification_loss = [] 211 | internal_structure_loss = [] 212 | lobulation_loss = [] 213 | subtlety_loss = [] 214 | for i, (data, target) in enumerate(data_loader): 215 | data = Variable(data.cuda(async = True), volatile = True) 216 | target = Variable(target.cuda(async = True), volatile = True) 217 | 218 | output = net(data) 219 | loss_output, attribute = loss(output, target.float(), train=False) 220 | 221 | loss_res.append(loss_output.data[0]) 222 | 223 | for i in range(len(less_list)): 224 | for j in range(10): 225 | if attribute[i].data[0] < 0.1*(j+1): 226 | less_list[i][j] = less_list[i][j] + 1 227 | 228 | 229 
| malignacy_loss.append(attribute[0].data[0]) 230 | sphericiy_loss.append(attribute[1].data[0]) 231 | margin_loss.append(attribute[2].data[0]) 232 | spiculation_loss.append(attribute[3].data[0]) 233 | texture_loss.append(attribute[4].data[0]) 234 | calcification_loss.append(attribute[5].data[0]) 235 | internal_structure_loss.append(attribute[6].data[0]) 236 | lobulation_loss.append(attribute[7].data[0]) 237 | subtlety_loss.append(attribute[8].data[0]) 238 | 239 | end_time = time.time() 240 | 241 | loss_res = np.asarray(loss_res, np.float32) 242 | val_loss = np.mean(loss_res) 243 | 244 | if ( val_loss < best_val_loss): 245 | state_dict = net.module.state_dict() 246 | for key in state_dict.keys(): 247 | state_dict[key] = state_dict[key].cpu() 248 | 249 | torch.save({ 250 | 'epoch': epoch, 251 | 'save_dir': save_dir, 252 | 'state_dict': state_dict, 253 | 'args': args}, 254 | os.path.join(save_dir, 'best_all.ckpt')) 255 | 256 | best_val_loss = val_loss 257 | 258 | 259 | for i in range(len(less_list)): 260 | temp_list = np.zeros((10)) 261 | for j in range(10): 262 | temp_list[j] = float(less_list[i][j]) / len(loss_res) 263 | print(i, 'th', 'acc less ', temp_list) 264 | 265 | if ( np.mean(malignacy_loss) < best_mal_loss): 266 | state_dict = net.module.state_dict() 267 | for key in state_dict.keys(): 268 | state_dict[key] = state_dict[key].cpu() 269 | 270 | torch.save({ 271 | 'epoch': epoch, 272 | 'save_dir': save_dir, 273 | 'state_dict': state_dict, 274 | 'args': args}, 275 | os.path.join(save_dir, 'best_mal.ckpt')) 276 | 277 | best_mal_loss = np.mean(malignacy_loss) 278 | 279 | i=0 280 | temp_list = np.zeros((10)) 281 | for j in range(10): 282 | temp_list[j] = float(less_list[i][j]) / len(loss_res) 283 | print(i, 'th', 'acc less ', temp_list) 284 | 285 | print('val loss %2.4f, time %2.4f' % (np.mean(loss_res), end_time - start_time)) 286 | print('malignacy_loss %2.4f, sphericiy_loss %2.4f, margin_loss %2.4f' % (np.mean(malignacy_loss), np.mean(sphericiy_loss), np.mean(margin_loss))) 287 | print('spiculation_loss %2.4f, texture_loss %2.4f, calcification_loss %2.4f' % (np.mean(spiculation_loss), np.mean(texture_loss), np.mean(calcification_loss))) 288 | print('internal_structure_loss %2.4f, lobulation_loss %2.4f, subtlety_loss %2.4f' % (np.mean(internal_structure_loss), np.mean(lobulation_loss), np.mean(subtlety_loss))) 289 | 290 | print ('best_val_loss ', best_val_loss) 291 | 292 | return best_val_loss, best_mal_loss 293 | 294 | if __name__ == '__main__': 295 | main() 296 | 297 | -------------------------------------------------------------------------------- /training/res_classifier.py: -------------------------------------------------------------------------------- 1 | import torch 2 | from torch import nn 3 | 4 | from layers import * 5 | 6 | config = {} 7 | #config['anchors'] = [ 5.0, 10.0, 30.0, 60.0] 8 | config['chanel'] = 1 9 | config['img_size'] = [48, 48, 48] 10 | 11 | config['num_neg'] = 800 12 | config['bound_size'] = 3 13 | config['reso'] = 1 14 | config['aug_scale'] = True 15 | config['r_rand_crop'] = 0.3 16 | config['pad_value'] = 0 17 | #config['pad_value'] = 0 18 | config['augtype'] = {'flip':True,'swap':False,'scale':False,'rotate':False} 19 | 20 | class Net(nn.Module): 21 | def __init__(self): 22 | super(Net, self).__init__() 23 | # The first few layers consumes the most memory, so use simple convolution to save memory. 24 | # Call these layers preBlock, i.e., before the residual blocks of later layers. 
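        # Note (added comment): the input is a 2-channel 48x48x48 crop (the nodule-centred
        # crop stacked with the original crop by padding_crop in training/data.py); the
        # network regresses the nine LIDC attribute scores that forward() concatenates.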
25 | self.preBlock = nn.Sequential( 26 | nn.Conv3d(2, 48, kernel_size = 3, padding = 1), 27 | nn.BatchNorm3d(48), 28 | nn.ReLU(inplace = True), 29 | nn.Conv3d(48, 48, kernel_size = 3, padding = 1), 30 | nn.BatchNorm3d(48), 31 | nn.ReLU(inplace = True)) 32 | 33 | # 3 poolings, each pooling downsamples the feature map by a factor 2. 34 | # 3 groups of blocks. The first block of each group has one pooling. 35 | num_blocks_forw = [3,3,4] 36 | self.featureNum_forw = [48,64,128,256] 37 | for i in range(len(num_blocks_forw)): 38 | blocks = [] 39 | for j in range(num_blocks_forw[i]): 40 | if j == 0: 41 | blocks.append(PostRes(self.featureNum_forw[i], self.featureNum_forw[i+1])) 42 | else: 43 | blocks.append(PostRes(self.featureNum_forw[i+1], self.featureNum_forw[i+1])) 44 | setattr(self, 'forw' + str(i + 1), nn.Sequential(*blocks)) 45 | 46 | self.maxpool1 = nn.MaxPool3d(kernel_size=2,stride=2,return_indices =True) 47 | self.maxpool2 = nn.MaxPool3d(kernel_size=2,stride=2,return_indices =True) 48 | self.maxpool3 = nn.MaxPool3d(kernel_size=2,stride=2,return_indices =True) 49 | self.maxpool4 = nn.MaxPool3d(kernel_size=2, stride=2, return_indices=True) 50 | 51 | self.drop = nn.Dropout3d(p = 0.5, inplace = False) 52 | self.malignacy_conv = nn.Sequential(nn.Conv3d(256, 128, kernel_size = 2), 53 | nn.BatchNorm3d(128), 54 | nn.ReLU(), 55 | nn.Conv3d(128, 128, kernel_size=1), 56 | nn.ReLU(), 57 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2))) 58 | self.malignacy_fc = nn.Sequential(nn.Linear(128, 1)) 59 | 60 | self.sphericiy_conv = nn.Sequential(nn.Conv3d(256, 128, kernel_size = 2), 61 | nn.BatchNorm3d(128), 62 | nn.ReLU(), 63 | nn.Conv3d(128, 128, kernel_size=1), 64 | nn.ReLU(), 65 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2))) 66 | self.sphericiy_fc = nn.Sequential(nn.Linear(128, 1)) 67 | 68 | self.margin_conv = nn.Sequential(nn.Conv3d(256, 128, kernel_size = 2), 69 | nn.BatchNorm3d(128), 70 | nn.ReLU(), 71 | nn.Conv3d(128, 128, kernel_size=1), 72 | nn.ReLU(), 73 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2))) 74 | self.margin_fc = nn.Sequential(nn.Linear(128, 1)) 75 | 76 | self.spiculation_conv = nn.Sequential(nn.Conv3d(256, 128, kernel_size = 2), 77 | nn.BatchNorm3d(128), 78 | nn.ReLU(), 79 | nn.Conv3d(128, 128, kernel_size=1), 80 | nn.ReLU(), 81 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2))) 82 | self.spiculation_fc = nn.Sequential(nn.Linear(128, 1)) 83 | 84 | self.texture_conv = nn.Sequential(nn.Conv3d(256, 128, kernel_size = 2), 85 | nn.BatchNorm3d(128), 86 | nn.ReLU(), 87 | nn.Conv3d(128, 128, kernel_size=1), 88 | nn.ReLU(), 89 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2))) 90 | self.texture_fc = nn.Sequential(nn.Linear(128, 1)) 91 | 92 | self.calcification_conv = nn.Sequential(nn.Conv3d(256, 128, kernel_size = 2), 93 | nn.BatchNorm3d(128), 94 | nn.ReLU(), 95 | nn.Conv3d(128, 128, kernel_size=1), 96 | nn.ReLU(), 97 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2))) 98 | self.calcification_fc = nn.Sequential(nn.Linear(128, 1)) 99 | 100 | self.internal_structure_conv = nn.Sequential(nn.Conv3d(256, 128, kernel_size = 2), 101 | nn.BatchNorm3d(128), 102 | nn.ReLU(), 103 | nn.Conv3d(128, 128, kernel_size=1), 104 | nn.ReLU(), 105 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2))) 106 | self.internal_structure_fc = nn.Sequential(nn.Linear(128, 1)) 107 | 108 | self.lobulation_conv = nn.Sequential(nn.Conv3d(256, 128, kernel_size = 2), 109 | nn.BatchNorm3d(128), 110 | nn.ReLU(), 111 | nn.Conv3d(128, 128, kernel_size=1), 112 | nn.ReLU(), 113 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2))) 114 | self.lobulation_fc = 
nn.Sequential(nn.Linear(128, 1)) 115 | 116 | self.subtlety_conv = nn.Sequential(nn.Conv3d(256, 128, kernel_size = 2), 117 | nn.BatchNorm3d(128), 118 | nn.ReLU(), 119 | nn.Conv3d(128, 128, kernel_size=1), 120 | nn.ReLU(), 121 | nn.AvgPool3d((2, 2, 2), stride=(2, 2, 2))) 122 | self.subtlety_fc = nn.Sequential(nn.Linear(128, 1)) 123 | 124 | 125 | def forward(self, x, dropout=False): 126 | out = self.preBlock(x)#16 127 | out_pool,indices0 = self.maxpool1(out) 128 | if dropout: 129 | out_pool = self.drop(out_pool) 130 | out1 = self.forw1(out_pool)#32 131 | out1_pool,indices1 = self.maxpool2(out1) 132 | if dropout: 133 | out1_pool = self.drop(out1_pool) 134 | out2 = self.forw2(out1_pool)#64 135 | out2_pool,indices2 = self.maxpool3(out2) 136 | if dropout: 137 | out2_pool = self.drop(out2_pool) 138 | out3 = self.forw3(out2_pool)#96 139 | out3_pool, indices3 = self.maxpool4(out3) 140 | if dropout: 141 | out3_pool = self.drop(out3_pool) 142 | 143 | malignacy = self.malignacy_conv(out3_pool) 144 | malignacy = self.drop(malignacy) 145 | malignacy = malignacy.view(malignacy.size(0), -1) 146 | malignacy = self.malignacy_fc(malignacy) 147 | 148 | sphericiy = self.sphericiy_conv(out3_pool) 149 | sphericiy = self.drop(sphericiy) 150 | sphericiy = sphericiy.view(sphericiy.size(0), -1) 151 | sphericiy = self.sphericiy_fc(sphericiy) 152 | 153 | margin = self.margin_conv(out3_pool) 154 | margin = self.drop(margin) 155 | margin = margin.view(margin.size(0), -1) 156 | margin = self.margin_fc(margin) 157 | 158 | spiculation = self.spiculation_conv(out3_pool) 159 | spiculation = self.drop(spiculation) 160 | spiculation = spiculation.view(spiculation.size(0), -1) 161 | spiculation = self.spiculation_fc(spiculation) 162 | 163 | texture = self.texture_conv(out3_pool) 164 | texture = self.drop(texture) 165 | texture = texture.view(texture.size(0), -1) 166 | texture = self.texture_fc(texture) 167 | 168 | calcification = self.calcification_conv(out3_pool) 169 | calcification = self.drop(calcification) 170 | calcification = calcification.view(calcification.size(0), -1) 171 | calcification = self.calcification_fc(calcification) 172 | 173 | internal_structure = self.internal_structure_conv(out3_pool) 174 | internal_structure = self.drop(internal_structure) 175 | internal_structure = internal_structure.view(internal_structure.size(0), -1) 176 | internal_structure = self.internal_structure_fc(internal_structure) 177 | 178 | lobulation = self.lobulation_conv(out3_pool) 179 | lobulation = self.drop(lobulation) 180 | lobulation = lobulation.view(lobulation.size(0), -1) 181 | lobulation = self.lobulation_fc(lobulation) 182 | 183 | subtlety = self.subtlety_conv(out3_pool) 184 | subtlety = self.drop(subtlety) 185 | subtlety = subtlety.view(subtlety.size(0), -1) 186 | subtlety = self.subtlety_fc(subtlety) 187 | 188 | out_all = torch.cat((malignacy, sphericiy, margin, spiculation, 189 | texture, calcification, internal_structure, lobulation, subtlety), 1) 190 | 191 | return out_all 192 | 193 | 194 | def get_model(): 195 | net = Net() 196 | loss = MAE_Loss() 197 | return config, net, loss 198 | -------------------------------------------------------------------------------- /training/utils.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import os 3 | import numpy as np 4 | import torch 5 | def getFreeId(): 6 | import pynvml 7 | 8 | pynvml.nvmlInit() 9 | def getFreeRatio(id): 10 | handle = pynvml.nvmlDeviceGetHandleByIndex(id) 11 | use = 
pynvml.nvmlDeviceGetUtilizationRates(handle) 12 | ratio = 0.5*(float(use.gpu+float(use.memory))) 13 | return ratio 14 | 15 | deviceCount = pynvml.nvmlDeviceGetCount() 16 | available = [] 17 | for i in range(deviceCount): 18 | if getFreeRatio(i)<70: 19 | available.append(i) 20 | gpus = '' 21 | for g in available: 22 | gpus = gpus+str(g)+',' 23 | gpus = gpus[:-1] 24 | return gpus 25 | 26 | def setgpu(gpuinput): 27 | freeids = getFreeId() 28 | if gpuinput=='all': 29 | gpus = freeids 30 | else: 31 | gpus = gpuinput 32 | if any([g not in freeids for g in gpus.split(',')]): 33 | raise ValueError('gpu'+g+'is being used') 34 | print('using gpu '+gpus) 35 | os.environ['CUDA_VISIBLE_DEVICES']=gpus 36 | return len(gpus.split(',')) 37 | 38 | class Logger(object): 39 | def __init__(self,logfile): 40 | self.terminal = sys.stdout 41 | self.log = open(logfile, "a") 42 | 43 | def write(self, message): 44 | self.terminal.write(message) 45 | self.log.write(message) 46 | 47 | def flush(self): 48 | #this flush method is needed for python 3 compatibility. 49 | #this handles the flush command by doing nothing. 50 | #you might want to specify some extra behavior here. 51 | pass 52 | 53 | 54 | def split4(data, max_stride, margin): 55 | splits = [] 56 | data = torch.Tensor.numpy(data) 57 | _,c, z, h, w = data.shape 58 | 59 | w_width = np.ceil(float(w / 2 + margin)/max_stride).astype('int')*max_stride 60 | h_width = np.ceil(float(h / 2 + margin)/max_stride).astype('int')*max_stride 61 | pad = int(np.ceil(float(z)/max_stride)*max_stride)-z 62 | leftpad = pad/2 63 | pad = [[0,0],[0,0],[leftpad,pad-leftpad],[0,0],[0,0]] 64 | data = np.pad(data,pad,'constant',constant_values=-1) 65 | data = torch.from_numpy(data) 66 | splits.append(data[:, :, :, :h_width, :w_width]) 67 | splits.append(data[:, :, :, :h_width, -w_width:]) 68 | splits.append(data[:, :, :, -h_width:, :w_width]) 69 | splits.append(data[:, :, :, -h_width:, -w_width:]) 70 | 71 | return torch.cat(splits, 0) 72 | 73 | def combine4(output, h, w): 74 | splits = [] 75 | for i in range(len(output)): 76 | splits.append(output[i]) 77 | 78 | output = np.zeros(( 79 | splits[0].shape[0], 80 | h, 81 | w, 82 | splits[0].shape[3], 83 | splits[0].shape[4]), np.float32) 84 | 85 | h0 = output.shape[1] / 2 86 | h1 = output.shape[1] - h0 87 | w0 = output.shape[2] / 2 88 | w1 = output.shape[2] - w0 89 | 90 | splits[0] = splits[0][:, :h0, :w0, :, :] 91 | output[:, :h0, :w0, :, :] = splits[0] 92 | 93 | splits[1] = splits[1][:, :h0, -w1:, :, :] 94 | output[:, :h0, -w1:, :, :] = splits[1] 95 | 96 | splits[2] = splits[2][:, -h1:, :w0, :, :] 97 | output[:, -h1:, :w0, :, :] = splits[2] 98 | 99 | splits[3] = splits[3][:, -h1:, -w1:, :, :] 100 | output[:, -h1:, -w1:, :, :] = splits[3] 101 | 102 | return output 103 | 104 | def split8(data, max_stride, margin): 105 | splits = [] 106 | if isinstance(data, np.ndarray): 107 | c, z, h, w = data.shape 108 | else: 109 | _,c, z, h, w = data.size() 110 | 111 | z_width = np.ceil(float(z / 2 + margin)/max_stride).astype('int')*max_stride 112 | w_width = np.ceil(float(w / 2 + margin)/max_stride).astype('int')*max_stride 113 | h_width = np.ceil(float(h / 2 + margin)/max_stride).astype('int')*max_stride 114 | for zz in [[0,z_width],[-z_width,None]]: 115 | for hh in [[0,h_width],[-h_width,None]]: 116 | for ww in [[0,w_width],[-w_width,None]]: 117 | if isinstance(data, np.ndarray): 118 | splits.append(data[np.newaxis, :, zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1]]) 119 | else: 120 | splits.append(data[:, :, zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1]]) 121 | 122 | 
123 | if isinstance(data, np.ndarray): 124 | return np.concatenate(splits, 0) 125 | else: 126 | return torch.cat(splits, 0) 127 | 128 | 129 | 130 | def combine8(output, z, h, w): 131 | splits = [] 132 | for i in range(len(output)): 133 | splits.append(output[i]) 134 | 135 | output = np.zeros(( 136 | z, 137 | h, 138 | w, 139 | splits[0].shape[3], 140 | splits[0].shape[4]), np.float32) 141 | 142 | 143 | z_width = z / 2 144 | h_width = h / 2 145 | w_width = w / 2 146 | i = 0 147 | for zz in [[0,z_width],[z_width-z,None]]: 148 | for hh in [[0,h_width],[h_width-h,None]]: 149 | for ww in [[0,w_width],[w_width-w,None]]: 150 | output[zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1], :, :] = splits[i][zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1], :, :] 151 | i = i+1 152 | 153 | return output 154 | 155 | 156 | def split16(data, max_stride, margin): 157 | splits = [] 158 | _,c, z, h, w = data.size() 159 | 160 | z_width = np.ceil(float(z / 4 + margin)/max_stride).astype('int')*max_stride 161 | z_pos = [z*3/8-z_width/2, 162 | z*5/8-z_width/2] 163 | h_width = np.ceil(float(h / 2 + margin)/max_stride).astype('int')*max_stride 164 | w_width = np.ceil(float(w / 2 + margin)/max_stride).astype('int')*max_stride 165 | for zz in [[0,z_width],[z_pos[0],z_pos[0]+z_width],[z_pos[1],z_pos[1]+z_width],[-z_width,None]]: 166 | for hh in [[0,h_width],[-h_width,None]]: 167 | for ww in [[0,w_width],[-w_width,None]]: 168 | splits.append(data[:, :, zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1]]) 169 | 170 | return torch.cat(splits, 0) 171 | 172 | def combine16(output, z, h, w): 173 | splits = [] 174 | for i in range(len(output)): 175 | splits.append(output[i]) 176 | 177 | output = np.zeros(( 178 | z, 179 | h, 180 | w, 181 | splits[0].shape[3], 182 | splits[0].shape[4]), np.float32) 183 | 184 | 185 | z_width = z / 4 186 | h_width = h / 2 187 | w_width = w / 2 188 | splitzstart = splits[0].shape[0]/2-z_width/2 189 | z_pos = [z*3/8-z_width/2, 190 | z*5/8-z_width/2] 191 | i = 0 192 | for zz,zz2 in zip([[0,z_width],[z_width,z_width*2],[z_width*2,z_width*3],[z_width*3-z,None]], 193 | [[0,z_width],[splitzstart,z_width+splitzstart],[splitzstart,z_width+splitzstart],[z_width*3-z,None]]): 194 | for hh in [[0,h_width],[h_width-h,None]]: 195 | for ww in [[0,w_width],[w_width-w,None]]: 196 | output[zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1], :, :] = splits[i][zz2[0]:zz2[1], hh[0]:hh[1], ww[0]:ww[1], :, :] 197 | i = i+1 198 | 199 | return output 200 | 201 | def split32(data, max_stride, margin): 202 | splits = [] 203 | _,c, z, h, w = data.size() 204 | 205 | z_width = np.ceil(float(z / 2 + margin)/max_stride).astype('int')*max_stride 206 | w_width = np.ceil(float(w / 4 + margin)/max_stride).astype('int')*max_stride 207 | h_width = np.ceil(float(h / 4 + margin)/max_stride).astype('int')*max_stride 208 | 209 | w_pos = [w*3/8-w_width/2, 210 | w*5/8-w_width/2] 211 | h_pos = [h*3/8-h_width/2, 212 | h*5/8-h_width/2] 213 | 214 | for zz in [[0,z_width],[-z_width,None]]: 215 | for hh in [[0,h_width],[h_pos[0],h_pos[0]+h_width],[h_pos[1],h_pos[1]+h_width],[-h_width,None]]: 216 | for ww in [[0,w_width],[w_pos[0],w_pos[0]+w_width],[w_pos[1],w_pos[1]+w_width],[-w_width,None]]: 217 | splits.append(data[:, :, zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1]]) 218 | 219 | return torch.cat(splits, 0) 220 | 221 | def combine32(splits, z, h, w): 222 | 223 | output = np.zeros(( 224 | z, 225 | h, 226 | w, 227 | splits[0].shape[3], 228 | splits[0].shape[4]), np.float32) 229 | 230 | 231 | z_width = int(np.ceil(float(z) / 2)) 232 | h_width = int(np.ceil(float(h) / 4)) 233 | w_width = 
int(np.ceil(float(w) / 4)) 234 | splithstart = splits[0].shape[1]/2-h_width/2 235 | splitwstart = splits[0].shape[2]/2-w_width/2 236 | 237 | i = 0 238 | for zz in [[0,z_width],[z_width-z,None]]: 239 | 240 | for hh,hh2 in zip([[0,h_width],[h_width,h_width*2],[h_width*2,h_width*3],[h_width*3-h,None]], 241 | [[0,h_width],[splithstart,h_width+splithstart],[splithstart,h_width+splithstart],[h_width*3-h,None]]): 242 | 243 | for ww,ww2 in zip([[0,w_width],[w_width,w_width*2],[w_width*2,w_width*3],[w_width*3-w,None]], 244 | [[0,w_width],[splitwstart,w_width+splitwstart],[splitwstart,w_width+splitwstart],[w_width*3-w,None]]): 245 | 246 | output[zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1], :, :] = splits[i][zz[0]:zz[1], hh2[0]:hh2[1], ww2[0]:ww2[1], :, :] 247 | i = i+1 248 | 249 | return output 250 | 251 | 252 | 253 | def split64(data, max_stride, margin): 254 | splits = [] 255 | _,c, z, h, w = data.size() 256 | 257 | z_width = np.ceil(float(z / 4 + margin)/max_stride).astype('int')*max_stride 258 | w_width = np.ceil(float(w / 4 + margin)/max_stride).astype('int')*max_stride 259 | h_width = np.ceil(float(h / 4 + margin)/max_stride).astype('int')*max_stride 260 | 261 | z_pos = [z*3/8-z_width/2, 262 | z*5/8-z_width/2] 263 | w_pos = [w*3/8-w_width/2, 264 | w*5/8-w_width/2] 265 | h_pos = [h*3/8-h_width/2, 266 | h*5/8-h_width/2] 267 | 268 | for zz in [[0,z_width],[z_pos[0],z_pos[0]+z_width],[z_pos[1],z_pos[1]+z_width],[-z_width,None]]: 269 | for hh in [[0,h_width],[h_pos[0],h_pos[0]+h_width],[h_pos[1],h_pos[1]+h_width],[-h_width,None]]: 270 | for ww in [[0,w_width],[w_pos[0],w_pos[0]+w_width],[w_pos[1],w_pos[1]+w_width],[-w_width,None]]: 271 | splits.append(data[:, :, zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1]]) 272 | 273 | return torch.cat(splits, 0) 274 | 275 | def combine64(output, z, h, w): 276 | splits = [] 277 | for i in range(len(output)): 278 | splits.append(output[i]) 279 | 280 | output = np.zeros(( 281 | z, 282 | h, 283 | w, 284 | splits[0].shape[3], 285 | splits[0].shape[4]), np.float32) 286 | 287 | 288 | z_width = int(np.ceil(float(z) / 4)) 289 | h_width = int(np.ceil(float(h) / 4)) 290 | w_width = int(np.ceil(float(w) / 4)) 291 | splitzstart = splits[0].shape[0]/2-z_width/2 292 | splithstart = splits[0].shape[1]/2-h_width/2 293 | splitwstart = splits[0].shape[2]/2-w_width/2 294 | 295 | i = 0 296 | for zz,zz2 in zip([[0,z_width],[z_width,z_width*2],[z_width*2,z_width*3],[z_width*3-z,None]], 297 | [[0,z_width],[splitzstart,z_width+splitzstart],[splitzstart,z_width+splitzstart],[z_width*3-z,None]]): 298 | 299 | for hh,hh2 in zip([[0,h_width],[h_width,h_width*2],[h_width*2,h_width*3],[h_width*3-h,None]], 300 | [[0,h_width],[splithstart,h_width+splithstart],[splithstart,h_width+splithstart],[h_width*3-h,None]]): 301 | 302 | for ww,ww2 in zip([[0,w_width],[w_width,w_width*2],[w_width*2,w_width*3],[w_width*3-w,None]], 303 | [[0,w_width],[splitwstart,w_width+splitwstart],[splitwstart,w_width+splitwstart],[w_width*3-w,None]]): 304 | 305 | output[zz[0]:zz[1], hh[0]:hh[1], ww[0]:ww[1], :, :] = splits[i][zz2[0]:zz2[1], hh2[0]:hh2[1], ww2[0]:ww2[1], :, :] 306 | i = i+1 307 | 308 | return output 309 | --------------------------------------------------------------------------------
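Appendix: a minimal, hypothetical sketch (not part of the repository) for sanity-checking the preprocessing output before training. It assumes prepare.py has already written <id>_attribute.npy files into the preprocess_result_path configured in config_training.py; the path and the patient id '001' below are placeholders, and the column order follows the comment in training/data.py.

    import os
    import numpy as np

    preprocess_result_path = '/path/to/preprocess_result'   # placeholder; see config_training.py
    patient_id = '001'                                       # example id produced by prepare_luna()

    # columns (per training/data.py): name, pos_z, pos_y, pos_x, size, malignacy, sphericiy,
    # margin, spiculation, texture, calcification, internal_structure, lobulation, subtlety, hit_count
    attrs = np.load(os.path.join(preprocess_result_path,
                                 '%s_attribute.npy' % patient_id), allow_pickle=True)
    for row in attrs:
        z, y, x, size, malignancy = [float(v) for v in row[1:6]]
        print('nodule at voxel (z, y, x) = (%.1f, %.1f, %.1f), size %.1f, malignancy %.1f'
              % (z, y, x, size, malignancy))

If the printed voxel coordinates and sizes look plausible for the resampled volumes, the same files can then be consumed by training/data.py via train.sh.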