├── .gitignore
├── LICENSE
├── README.md
├── checkpoints
│   └── model.ckpt
├── dataloader
│   ├── datalist
│   │   ├── dtu
│   │   │   ├── test.txt
│   │   │   ├── train.txt
│   │   │   └── val.txt
│   │   └── tanks
│   │       ├── logs
│   │       │   ├── Family.log
│   │       │   ├── Francis.log
│   │       │   ├── Horse.log
│   │       │   ├── Lighthouse.log
│   │       │   ├── M60.log
│   │       │   ├── Panther.log
│   │       │   ├── Playground.log
│   │       │   └── Train.log
│   │       └── test.txt
│   └── mvs_dataset.py
├── depthfusion.py
├── networks
│   ├── submodules.py
│   └── ucsnet.py
├── requirements.txt
├── results
│   └── dtu.png
├── scripts
│   ├── fuse_dtu.sh
│   ├── fuse_tanks.sh
│   ├── test_on_dtu.sh
│   ├── test_on_tanks.sh
│   └── train.sh
├── test.py
├── train.py
└── utils
    ├── collect_pointclouds.py
    └── utils.py
/.gitignore:
--------------------------------------------------------------------------------
1 | .idea
2 | .DS_*
3 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2019 成硕
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # UCSNet
2 | ### Deep Stereo using Adaptive Thin Volume Representation with Uncertainty Awareness, CVPR 2020. (Oral Presentation)
3 | ## Introduction
4 | [UCSNet](https://arxiv.org/abs/1911.12012) is a learning-based framework for multi-view stereo (MVS). If you find this project useful for your research, please cite:
5 |
6 |
15 |
16 |
17 | ```
18 | @inproceedings{cheng2020deep,
19 | title={Deep stereo using adaptive thin volume representation with uncertainty awareness},
20 | author={Cheng, Shuo and Xu, Zexiang and Zhu, Shilin and Li, Zhuwen and Li, Li Erran and Ramamoorthi, Ravi and Su, Hao},
21 | booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
22 | pages={2524--2534},
23 | year={2020}
24 | }
25 | ```
26 |
27 | Reconstruction results on the DTU dataset:
28 |
29 | 
30 |
31 | ## How to Use
32 |
33 | ### Environment
34 | * python 3.6 (Anaconda)
35 | * ``pip install -r requirements.txt``
36 |
37 | ### Reproducing Results
38 |
39 | #### Compute Depth:
40 | * Download the pre-processed test sets: [Tanks and Temples](https://drive.google.com/file/d/1-6v88UcdKBSb8_c1FLH5kpwWQkHuQqaW/view?usp=sharing) and [DTU](https://drive.google.com/file/d/1SpyJSnj16XFhKXPHu8VcYwyQMCGc75bZ/view?usp=sharing). Each dataset should be organized as follows (a small layout check is sketched after the tree):
41 |
42 | ```
43 | root_directory
44 | ├── scan1 (scene_name1)
45 | │   ├── images
46 | │   │   ├── 00000000.jpg
47 | │   │   ├── 00000001.jpg
48 | │   │   └── ...
49 | │   ├── cams
50 | │   │   ├── 00000000_cam.txt
51 | │   │   ├── 00000001_cam.txt
52 | │   │   └── ...
53 | │   └── pair.txt
54 | └── scan2 (scene_name2) ...
55 | ```
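
As a quick sanity check for this layout, here is a small illustrative snippet (our own sketch, not part of the repository) that verifies a scene folder before testing:

```python
import os

def check_scene(root, scene):
    """Illustrative check that a scene folder matches the expected layout."""
    scene_dir = os.path.join(root, scene)
    assert os.path.isdir(os.path.join(scene_dir, 'images')), 'missing images/'
    assert os.path.isdir(os.path.join(scene_dir, 'cams')), 'missing cams/'
    assert os.path.isfile(os.path.join(scene_dir, 'pair.txt')), 'missing pair.txt'

# e.g. check_scene('/path/to/root_directory', 'scan1')
```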
56 |
57 | * In ``scripts/test_on_dtu.sh`` or ``scripts/test_on_tanks.sh``, set ``root_path`` to the dataset root directory and ``save_path`` to the directory where you want the outputs saved
58 | * Test on a GPU by running ``bash scripts/test_on_dtu.sh`` or ``bash scripts/test_on_tanks.sh``
59 | * To test your own data, organize the dataset in the same way and generate the data list for the scenes you want to test. View selection is crucial for multi-view stereo, so for each scene you also need to provide the view selection in ``pair.txt`` (a parsing sketch follows the format block below):
60 |
61 | ```
62 | TOTAL_IMAGE_NUM
63 | IMAGE_ID0 # index of reference image 0
64 | 10 ID0 SCORE0 ID1 SCORE1 ... # 10 best source images for reference image 0
65 | IMAGE_ID1 # index of reference image 1
66 | 10 ID0 SCORE0 ID1 SCORE1 ... # 10 best source images for reference image 1
67 | ...
68 | ```
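
To make the format concrete, here is a minimal parsing sketch; the function name and the returned structure are our own choices, not part of the UCSNet codebase:

```python
def read_pair_file(path):
    """Parse pair.txt into {ref_id: [(src_id, score), ...]} (illustrative sketch)."""
    with open(path) as f:
        tokens = f.read().split()
    idx = 0
    num_images = int(tokens[idx]); idx += 1
    pairs = {}
    for _ in range(num_images):
        ref_id = int(tokens[idx]); idx += 1
        num_src = int(tokens[idx]); idx += 1   # typically 10 source views
        views = [(int(tokens[idx + 2 * i]), float(tokens[idx + 2 * i + 1]))
                 for i in range(num_src)]
        idx += 2 * num_src
        pairs[ref_id] = views
    return pairs
```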
69 | #### Depth Fusion:
70 | * Download the modified fork of [fusibile](https://github.com/kysucix/fusibile): `git clone https://github.com/YoYo000/fusibile`
71 | * Build it with `cmake .` followed by `make`
72 | * In ``scripts/fuse_dtu.sh`` or ``scripts/fuse_tanks.sh``, set ``exe_path`` to the path of the fusibile executable, ``root_path`` to the directory that contains the test results, and ``target_path`` to where you want to save the point clouds.
73 | * Fuse by running ``bash scripts/fuse_dtu.sh`` or ``bash scripts/fuse_tanks.sh``
74 |
75 |
76 | Note: For DTU, the fusion is performed on an NVIDIA GTX 1080Ti. For Tanks and Temples, the fusion is performed on an NVIDIA P6000; since fusibile needs to read in all the depth maps at once, you may need a GPU with around 20 GB of memory.
77 | You can decrease the depth resolution in the preceding depth-computation step, or try [our implementation](https://github.com/touristCheng/DepthFusion) for depth fusion.
78 |
79 | #### DTU Evaluation:
80 |
81 | * Download the official evaluation tool from the [DTU benchmark](http://roboimagedata.compute.dtu.dk/?page_id=36)
82 | * Put the ground-truth point clouds and the predicted point clouds in the ``MVS Data/Points`` folder
83 | * In ``GetUsedSets.m``, set ``UsedSets`` to ``[1 4 9 10 11 12 13 15 23 24 29 32 33 34 48 49 62 75 77 110 114 118]``, as these are the test objects used in the literature; then compute the scores using ``BaseEvalMain_web.m`` and ``ComputeStat_web.m``
84 | * The accuracy of each object is stored in ``BaseStat.MeanData`` and the completeness in ``BaseStat.MeanStl``; use the averages over all objects as the final accuracy and completeness (see the example after the table below)
85 | * We also provide our pre-computed [point clouds](https://drive.google.com/file/d/18bk-153cdPs5ehi_JjOHx9h1N9zhrPkW/view?usp=sharing) for your convenience; the evaluation results are:
86 |
87 | | Accuracy | Completeness | Overall |
88 | |------------|---------------|---------|
89 | | 0.3388 | 0.3456 | 0.3422 |
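
Here, the overall score is the mean of accuracy and completeness, and the per-object numbers are aggregated the same way; a one-line check against the table:

```python
accuracy, completeness = 0.3388, 0.3456
overall = (accuracy + completeness) / 2   # 0.3422, matching the table
```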
90 |
91 |
92 |
93 | ### Training
94 | * Install NVIDIA [apex](https://github.com/NVIDIA/apex) to use Synchronized Batch Normalization (a usage sketch follows these steps)
95 | * Download the pre-processed DTU [training data](https://drive.google.com/file/d/1ssnznSXyTCDgdXLr4497dQsSSREC8vVj/view?usp=sharing) from MVSNet, and download our rendered full-resolution [ground truth](https://drive.google.com/file/d/1KYP9XfEjzyzkKMxC-nyTvAdsNNU64UQM/view?usp=sharing). Place the ground truth in the root directory; the training set needs to be organized as:
96 |
97 | ```
98 | root_directory
99 | ├──Cameras
100 | ├──Rectified
101 | ├──Depths_4
102 | └──Depths
103 | ```
104 | * In ``scripts/train.sh``, set ``root_path`` to the root directory and ``num_gpus`` to the number of GPUs on the machine (we use eight 1080Ti GPUs in our experiments).
105 | * Training: ``bash scripts/train.sh``
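
For reference, a minimal sketch of the standard way apex's synchronized batch normalization is enabled in a PyTorch training script; ``build_model`` is a hypothetical placeholder here, and ``train.py`` may wire this up differently:

```python
import torch
from apex.parallel import convert_syncbn_model, DistributedDataParallel

# Assumes torch.distributed.init_process_group(...) has been called,
# with one training process per GPU.
model = build_model()                 # hypothetical model constructor
model = convert_syncbn_model(model)   # replace BatchNorm layers with SyncBN
model = DistributedDataParallel(model.cuda())
```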
106 |
107 |
108 | ## Acknowledgements
109 | [UCSNet](https://arxiv.org/abs/1911.12012) uses [MVSNet](https://arxiv.org/abs/1804.02505) as its backbone. Thanks to Yao Yao for open-sourcing his excellent work, and to Xiaoyang Guo for his PyTorch implementation [MVSNet_pytorch](https://github.com/xy-guo/MVSNet_pytorch).
110 |
--------------------------------------------------------------------------------
/checkpoints/model.ckpt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/touristCheng/UCSNet/a1361810f47b420941a1e7b32e24f37c305f0953/checkpoints/model.ckpt
--------------------------------------------------------------------------------
/dataloader/datalist/dtu/test.txt:
--------------------------------------------------------------------------------
1 | scan1
2 | scan4
3 | scan9
4 | scan10
5 | scan11
6 | scan12
7 | scan13
8 | scan15
9 | scan23
10 | scan24
11 | scan29
12 | scan32
13 | scan33
14 | scan34
15 | scan48
16 | scan49
17 | scan62
18 | scan75
19 | scan77
20 | scan110
21 | scan114
22 | scan118
--------------------------------------------------------------------------------
/dataloader/datalist/dtu/train.txt:
--------------------------------------------------------------------------------
1 | scan2
2 | scan6
3 | scan7
4 | scan8
5 | scan14
6 | scan16
7 | scan18
8 | scan19
9 | scan20
10 | scan22
11 | scan30
12 | scan31
13 | scan36
14 | scan39
15 | scan41
16 | scan42
17 | scan44
18 | scan45
19 | scan46
20 | scan47
21 | scan50
22 | scan51
23 | scan52
24 | scan53
25 | scan55
26 | scan57
27 | scan58
28 | scan60
29 | scan61
30 | scan63
31 | scan64
32 | scan65
33 | scan68
34 | scan69
35 | scan70
36 | scan71
37 | scan72
38 | scan74
39 | scan76
40 | scan83
41 | scan84
42 | scan85
43 | scan87
44 | scan88
45 | scan89
46 | scan90
47 | scan91
48 | scan92
49 | scan93
50 | scan94
51 | scan95
52 | scan96
53 | scan97
54 | scan98
55 | scan99
56 | scan100
57 | scan101
58 | scan102
59 | scan103
60 | scan104
61 | scan105
62 | scan107
63 | scan108
64 | scan109
65 | scan111
66 | scan112
67 | scan113
68 | scan115
69 | scan116
70 | scan119
71 | scan120
72 | scan121
73 | scan122
74 | scan123
75 | scan124
76 | scan125
77 | scan126
78 | scan127
79 | scan128
--------------------------------------------------------------------------------
/dataloader/datalist/dtu/val.txt:
--------------------------------------------------------------------------------
1 | scan3
2 | scan5
3 | scan17
4 | scan21
5 | scan28
6 | scan35
7 | scan37
8 | scan38
9 | scan40
10 | scan43
11 | scan56
12 | scan59
13 | scan66
14 | scan67
15 | scan82
16 | scan86
17 | scan106
18 | scan117
--------------------------------------------------------------------------------
/dataloader/datalist/tanks/logs/Family.log:
--------------------------------------------------------------------------------
1 | 0 0 0
2 | 0.999914 6.96597e-06 0.0130972 0.0297143
3 | -0.000314413 0.999724 0.0234724 0.114098
4 | -0.0130934 -0.0234745 0.999639 -0.573847
5 | -0 -0 0 1
6 | 1 1 0
7 | 0.999994 0.00073458 0.00344365 0.0454525
8 | -0.000829393 0.999618 0.0276124 0.0466132
9 | -0.00342205 -0.0276151 0.999613 -0.542059
10 | -0 -0 0 1
11 | 2 2 0
12 | 0.999817 0.00184825 -0.0190252 0.0565189
13 | -0.00102378 0.999064 0.0432548 0.0226157
14 | 0.0190873 -0.0432274 0.998883 -0.547897
15 | -0 -0 0 1
16 | 3 3 0
17 | 0.99851 0.00615892 -0.0542213 0.0592199
18 | -0.00323696 0.998546 0.0538133 0.0206174
19 | 0.0544739 -0.0535576 0.997078 -0.544579
20 | -0 -0 0 1
21 | 4 4 0
22 | 0.999054 0.00806457 -0.0427258 0.0663682
23 | -0.00572169 0.998488 0.0546765 0.0215152
24 | 0.0431021 -0.0543804 0.99759 -0.543858
25 | -0 -0 0 1
26 | 5 5 0
27 | 0.987934 0.0174106 -0.153894 0.0855601
28 | -0.00451777 0.996478 0.083733 0.00583505
29 | 0.15481 -0.0820275 0.984533 -0.531965
30 | -0 -0 0 1
31 | 6 6 0
32 | 0.972213 0.0335106 -0.231689 0.103133
33 | -0.00748165 0.993644 0.112322 0.00617607
34 | 0.23398 -0.107468 0.966284 -0.512829
35 | -0 -0 0 1
36 | 7 7 0
37 | 0.972177 0.0413708 -0.230564 0.127597
38 | -0.0132987 0.992441 0.122003 0.005449
39 | 0.233868 -0.115542 0.965379 -0.485124
40 | -0 -0 0 1
41 | 8 8 0
42 | 0.96336 0.0523021 -0.263062 0.171705
43 | -0.0153997 0.989971 0.140431 0.00685975
44 | 0.267769 -0.131234 0.954504 -0.486677
45 | -0 -0 0 1
46 | 9 9 0
47 | 0.935261 0.0748351 -0.345958 0.205926
48 | -0.0160381 0.98535 0.169787 0.0125106
49 | 0.353595 -0.153246 0.92276 -0.478084
50 | -0 -0 0 1
51 | 10 10 0
52 | 0.907896 0.0959119 -0.408075 0.228115
53 | -0.0196454 0.982139 0.187129 0.0115549
54 | 0.418735 -0.161877 0.893564 -0.448821
55 | -0 -0 0 1
56 | 11 11 0
57 | 0.864118 0.116186 -0.489695 0.277274
58 | -0.020336 0.980254 0.196692 0.0128925
59 | 0.502879 -0.160006 0.849418 -0.436417
60 | -0 -0 0 1
61 | 12 12 0
62 | 0.819289 0.138052 -0.556514 0.309309
63 | -0.0241633 0.978033 0.207044 0.0152671
64 | 0.572872 -0.156182 0.804627 -0.410445
65 | -0 -0 0 1
66 | 13 13 0
67 | 0.78466 0.152437 -0.600892 0.332562
68 | -0.0272055 0.97683 0.212281 0.01474
69 | 0.619329 -0.150221 0.770627 -0.37791
70 | -0 -0 0 1
71 | 14 14 0
72 | 0.693182 0.172752 -0.699754 0.379661
73 | -0.019113 0.974916 0.22175 0.00965293
74 | 0.720509 -0.140339 0.679096 -0.347091
75 | -0 -0 0 1
76 | 15 15 0
77 | 0.642926 0.184902 -0.743275 0.412192
78 | -0.0234519 0.974721 0.222192 0.00522956
79 | 0.765569 -0.125422 0.63101 -0.3026
80 | -0 -0 0 1
81 | 16 16 0
82 | 0.599101 0.184808 -0.779053 0.458166
83 | -0.0219553 0.976423 0.214745 0.00325108
84 | 0.800372 -0.11155 0.589034 -0.262541
85 | -0 -0 0 1
86 | 17 17 0
87 | 0.493026 0.193253 -0.84828 0.503564
88 | -0.0134534 0.976595 0.214666 0.00301187
89 | 0.869911 -0.0944239 0.484087 -0.220543
90 | -0 -0 0 1
91 | 18 18 0
92 | 0.362116 0.196206 -0.911249 0.549551
93 | -0.00447567 0.977951 0.208789 0.00203334
94 | 0.932122 -0.0715274 0.35501 -0.164529
95 | -0 -0 0 1
96 | 19 19 0
97 | 0.235109 0.196449 -0.951909 0.581388
98 | -0.0020643 0.979461 0.201625 0.00358686
99 | 0.971967 -0.0454389 0.230686 -0.105406
100 | -0 -0 0 1
101 | 20 20 0
102 | 0.153786 0.184995 -0.970632 0.587738
103 | -0.00205638 0.982375 0.186908 0.00372407
104 | 0.988102 -0.0267478 0.151456 -0.0484664
105 | -0 -0 0 1
106 | 21 21 0
107 | 0.0430398 0.174811 -0.983661 0.605077
108 | 0.00171092 0.984559 0.175046 0.00449917
109 | 0.999072 -0.0092169 0.0420761 0.00851314
110 | -0 -0 0 1
111 | 22 22 0
112 | -0.0357061 0.16535 -0.985588 0.588685
113 | 0.00344953 0.986232 0.165333 0.00538191
114 | 0.999356 0.00250358 -0.0357849 0.0657637
115 | -0 -0 0 1
116 | 23 23 0
117 | -0.156668 0.154931 -0.975424 0.599491
118 | 0.00900499 0.987803 0.155451 0.00727325
119 | 0.98761 0.0155705 -0.156152 0.133321
120 | -0 -0 0 1
121 | 24 24 0
122 | -0.256104 0.147591 -0.955315 0.591478
123 | 0.00974105 0.988619 0.150125 0.0124647
124 | 0.9666 0.0291419 -0.254627 0.192733
125 | -0 -0 0 1
126 | 25 25 0
127 | -0.345373 0.139359 -0.928061 0.566522
128 | 0.0120903 0.989492 0.144084 0.0139908
129 | 0.938388 0.0385424 -0.343429 0.260484
130 | -0 -0 0 1
131 | 26 26 0
132 | -0.441664 0.131534 -0.887486 0.545151
133 | 0.0111386 0.989922 0.141173 0.0190685
134 | 0.897111 0.0524655 -0.438678 0.319565
135 | -0 -0 0 1
136 | 27 27 0
137 | -0.464943 0.123619 -0.876667 0.514401
138 | 0.00771227 0.990732 0.135613 0.0181351
139 | 0.885307 0.0562914 -0.461588 0.367407
140 | -0 -0 0 1
141 | 28 28 0
142 | -0.535776 0.115598 -0.83641 0.491997
143 | 0.0108968 0.991448 0.130045 0.0200495
144 | 0.84429 0.0605608 -0.532454 0.436535
145 | -0 -0 0 1
146 | 29 29 0
147 | -0.575657 0.11082 -0.810147 0.452982
148 | 0.00926686 0.991594 0.129055 0.0221048
149 | 0.817639 0.0667841 -0.571845 0.483323
150 | -0 -0 0 1
151 | 30 30 0
152 | -0.64492 0.10405 -0.757134 0.42065
153 | 0.00938653 0.991692 0.128289 0.0249256
154 | 0.764193 0.0756293 -0.640539 0.531402
155 | -0 -0 0 1
156 | 31 31 0
157 | -0.752352 0.0970081 -0.65158 0.378495
158 | 0.00950895 0.990594 0.136501 0.0299292
159 | 0.658693 0.0965012 -0.746198 0.563862
160 | -0 -0 0 1
161 | 32 32 0
162 | -0.814061 0.0840564 -0.574664 0.322845
163 | 0.00246134 0.989962 0.141316 0.0293939
164 | 0.580774 0.113625 -0.806096 0.588326
165 | -0 -0 0 1
166 | 33 33 0
167 | -0.881416 0.0675669 -0.467483 0.269682
168 | -0.0016676 0.989265 0.146126 0.0316337
169 | 0.472337 0.129577 -0.871841 0.626678
170 | -0 -0 0 1
171 | 34 34 0
172 | -0.916354 0.0507233 -0.397142 0.209836
173 | -0.0104154 0.988586 0.150295 0.0308044
174 | 0.400233 0.14186 -0.905367 0.636481
175 | -0 -0 0 1
176 | 35 35 0
177 | -0.933539 0.0399966 -0.356236 0.14496
178 | -0.0144229 0.988761 0.14881 0.0299171
179 | 0.358184 0.144058 -0.92247 0.658973
180 | -0 -0 0 1
181 | 36 36 0
182 | -0.958555 0.026446 -0.283676 0.0929634
183 | -0.0172498 0.988469 0.150439 0.0339616
184 | 0.284383 0.149098 -0.947046 0.668648
185 | -0 -0 0 1
186 | 37 37 0
187 | -0.979991 0.01143 -0.198712 0.0448141
188 | -0.0196719 0.9879 0.153841 0.0323462
189 | 0.198066 0.154672 -0.967908 0.667116
190 | -0 -0 0 1
191 | 38 38 0
192 | -0.998601 -0.0138655 -0.0510225 -0.0163671
193 | -0.0218158 0.987081 0.158732 0.0312569
194 | 0.0481624 0.159624 -0.986002 0.674226
195 | -0 -0 0 1
196 | 39 39 0
197 | -0.998677 -0.0356676 0.0370445 -0.068935
198 | -0.0292626 0.986525 0.160971 0.0282673
199 | -0.0422868 0.159674 -0.986264 0.650587
200 | -0 -0 0 1
201 | 40 40 0
202 | -0.991224 -0.0533435 0.120956 -0.131886
203 | -0.0335447 0.986518 0.160175 0.0256987
204 | -0.12787 0.154712 -0.97965 0.630536
205 | -0 -0 0 1
206 | 41 41 0
207 | -0.976927 -0.0712624 0.201334 -0.177172
208 | -0.0388684 0.986272 0.160492 0.0241323
209 | -0.210007 0.148964 -0.966285 0.600845
210 | -0 -0 0 1
211 | 42 42 0
212 | -0.970044 -0.0794891 0.229554 -0.214048
213 | -0.0440051 0.986815 0.155754 0.0239364
214 | -0.238908 0.140987 -0.960753 0.558923
215 | -0 -0 0 1
216 | 43 43 0
217 | -0.940335 -0.09178 0.327637 -0.259246
218 | -0.0408591 0.986424 0.159056 0.0180233
219 | -0.337787 0.136179 -0.931319 0.510862
220 | -0 -0 0 1
221 | 44 44 0
222 | -0.903917 -0.106407 0.414261 -0.291073
223 | -0.0433721 0.98637 0.158721 0.0173309
224 | -0.425504 0.125503 -0.896212 0.473449
225 | -0 -0 0 1
226 | 45 45 0
227 | -0.8486 -0.1242 0.514249 -0.346993
228 | -0.0450816 0.985493 0.163621 0.0149353
229 | -0.527111 0.115666 -0.841888 0.444543
230 | -0 -0 0 1
231 | 46 46 0
232 | -0.801704 -0.139567 0.581198 -0.383429
233 | -0.0494557 0.984512 0.168199 0.0138135
234 | -0.595671 0.106102 -0.79619 0.405353
235 | -0 -0 0 1
236 | 47 47 0
237 | -0.749764 -0.153364 0.643687 -0.42831
238 | -0.0527912 0.983534 0.172844 0.00691511
239 | -0.659596 0.095611 -0.745514 0.360192
240 | -0 -0 0 1
241 | 48 48 0
242 | -0.708481 -0.158444 0.687713 -0.456566
243 | -0.0555132 0.983964 0.169508 -0.00131238
244 | -0.703543 0.0819161 -0.705916 0.308082
245 | -0 -0 0 1
246 | 49 49 0
247 | -0.598623 -0.167064 0.783415 -0.493431
248 | -0.0503528 0.983923 0.171347 -0.00144639
249 | -0.799447 0.0631253 -0.597411 0.258808
250 | -0 -0 0 1
251 | 50 50 0
252 | -0.488502 -0.17455 0.854926 -0.504741
253 | -0.0537023 0.983944 0.170207 -0.00354963
254 | -0.870909 0.0372348 -0.490032 0.206026
255 | -0 -0 0 1
256 | 51 51 0
257 | -0.349225 -0.182543 0.919086 -0.530394
258 | -0.0538048 0.983129 0.174819 -0.00487158
259 | -0.935493 0.0115999 -0.353155 0.147359
260 | -0 -0 0 1
261 | 52 52 0
262 | -0.265031 -0.184191 0.946484 -0.53958
263 | -0.0593222 0.982841 0.174655 -0.00518474
264 | -0.962413 -0.00985862 -0.27141 0.0836334
265 | -0 -0 0 1
266 | 53 53 0
267 | -0.148903 -0.186318 0.97114 -0.570295
268 | -0.0571807 0.982068 0.179647 -0.00910858
269 | -0.987197 -0.0287805 -0.156887 0.0205745
270 | -0 -0 0 1
271 | 54 54 0
272 | -0.0832297 -0.181729 0.97982 -0.554604
273 | -0.059159 0.982399 0.177182 -0.0127675
274 | -0.994773 -0.0432184 -0.0925156 -0.0432164
275 | -0 -0 0 1
276 | 55 55 0
277 | -0.00432292 -0.179557 0.983738 -0.544293
278 | -0.0568258 0.982202 0.179026 -0.0135261
279 | -0.998375 -0.0551278 -0.0144494 -0.117141
280 | -0 -0 0 1
281 | 56 56 0
282 | 0.101596 -0.174866 0.979337 -0.53888
283 | -0.0521776 0.982139 0.180779 -0.0160711
284 | -0.993456 -0.0694659 0.0906573 -0.176608
285 | -0 -0 0 1
286 | 57 57 0
287 | 0.254237 -0.164121 0.953115 -0.590849
288 | -0.0490361 0.982041 0.182182 -0.0163064
289 | -0.965898 -0.0930544 0.241624 -0.197099
290 | -0 -0 0 1
291 | 58 58 0
292 | 0.292945 -0.156703 0.943201 -0.626467
293 | -0.054752 0.98211 0.180173 -0.0159245
294 | -0.95456 -0.104423 0.279124 -0.234758
295 | -0 -0 0 1
296 | 59 59 0
297 | 0.300865 -0.154115 0.941132 -0.635999
298 | -0.0573256 0.982149 0.179158 -0.0139061
299 | -0.951942 -0.107853 0.28666 -0.277162
300 | -0 -0 0 1
301 | 60 60 0
302 | 0.194148 -0.16309 0.96732 -0.622227
303 | -0.064948 0.981782 0.178564 -0.00765345
304 | -0.97882 -0.0974934 0.180019 -0.282266
305 | -0 -0 0 1
306 | 61 61 0
307 | 0.103083 -0.185189 0.977281 -0.582666
308 | -0.0570913 0.979794 0.191687 -0.00737261
309 | -0.993033 -0.0755539 0.090427 -0.301006
310 | -0 -0 0 1
311 | 62 62 0
312 | 0.0670877 -0.189979 0.979493 -0.537185
313 | -0.0468365 0.980023 0.19329 -0.00880144
314 | -0.996647 -0.0588434 0.0568496 -0.296081
315 | -0 -0 0 1
316 | 63 63 0
317 | 0.154154 -0.192504 0.969112 -0.488955
318 | -0.028045 0.979589 0.199046 -0.0116269
319 | -0.987649 -0.0578624 0.145609 -0.307361
320 | -0 -0 0 1
321 | 64 64 0
322 | 0.373477 -0.179218 0.910162 -0.438992
323 | -0.0462802 0.976338 0.211239 -0.0123629
324 | -0.926484 -0.121015 0.356346 -0.328489
325 | -0 -0 0 1
326 | 65 65 0
327 | 0.490523 -0.162332 0.856175 -0.409471
328 | -0.0514375 0.975389 0.214405 -0.015345
329 | -0.869909 -0.14921 0.470101 -0.372568
330 | -0 -0 0 1
331 | 66 66 0
332 | 0.610027 -0.142504 0.779461 -0.372378
333 | -0.0522632 0.974317 0.219031 -0.0158724
334 | -0.790655 -0.174352 0.586912 -0.423435
335 | -0 -0 0 1
336 | 67 67 0
337 | 0.662487 -0.12674 0.738274 -0.327215
338 | -0.0586866 0.973771 0.21983 -0.0172458
339 | -0.746771 -0.188961 0.637673 -0.450778
340 | -0 -0 0 1
341 | 68 68 0
342 | 0.734213 -0.110526 0.669862 -0.287379
343 | -0.0546257 0.973843 0.220556 -0.0171869
344 | -0.676718 -0.198527 0.708971 -0.515142
345 | -0 -0 0 1
346 | 69 69 0
347 | 0.784332 -0.0977651 0.612589 -0.246823
348 | -0.0539266 0.973019 0.224333 -0.0173394
349 | -0.617993 -0.208986 0.757898 -0.567953
350 | -0 -0 0 1
351 | 70 70 0
352 | 0.866768 -0.0783466 0.492518 -0.241237
353 | -0.0445314 0.971479 0.232906 -0.0260773
354 | -0.496719 -0.223808 0.838559 -0.619507
355 | -0 -0 0 1
356 | 71 71 0
357 | 0.897469 -0.0576853 0.43729 -0.259109
358 | -0.0492396 0.972111 0.229293 -0.0338957
359 | -0.438321 -0.227315 0.869599 -0.631604
360 | -0 -0 0 1
361 | 72 72 0
362 | 0.950609 -0.0261237 0.309289 -0.235658
363 | -0.0377737 0.979309 0.198814 -0.0376228
364 | -0.308084 -0.200678 0.929953 -0.634463
365 | -0 -0 0 1
366 | 73 73 0
367 | 0.978729 -0.00448881 0.205109 -0.196253
368 | -0.0345384 0.981886 0.186297 -0.0427111
369 | -0.20223 -0.189418 0.960845 -0.589647
370 | -0 -0 0 1
371 | 74 74 0
372 | 0.975333 0.0030234 0.220717 -0.135706
373 | -0.0363429 0.98846 0.147057 -0.0485553
374 | -0.217725 -0.151451 0.964188 -0.533527
375 | -0 -0 0 1
376 | 75 75 0
377 | 0.976715 0.00426256 0.2145 -0.0927815
378 | -0.0258896 0.994838 0.0981174 -0.0447152
379 | -0.212975 -0.101386 0.971783 -0.483426
380 | -0 -0 0 1
381 | 76 76 0
382 | 0.988483 -0.00316178 0.151299 -0.0346959
383 | -0.00948763 0.99652 0.0828102 -0.0390764
384 | -0.151035 -0.083292 0.985013 -0.461979
385 | -0 -0 0 1
386 | 77 77 0
387 | 0.999979 -0.00524783 0.00391092 0.0329406
388 | 0.00493983 0.997172 0.0749874 -0.032644
389 | -0.00429338 -0.0749665 0.997177 -0.452897
390 | -0 -0 0 1
391 | 78 78 0
392 | 0.98397 -0.00529841 -0.178256 0.111785
393 | 0.0181529 0.997342 0.0705592 -0.0307768
394 | 0.177409 -0.072664 0.981451 -0.432739
395 | -0 -0 0 1
396 | 79 79 0
397 | 0.938647 -0.00132351 -0.344878 0.167169
398 | 0.027019 0.997201 0.0697101 -0.0252786
399 | 0.34382 -0.0747514 0.936055 -0.402515
400 | -0 -0 0 1
401 | 80 80 0
402 | 0.890631 -0.00133729 -0.454724 0.223881
403 | 0.0346473 0.997288 0.0649281 -0.0243749
404 | 0.453404 -0.073582 0.888263 -0.358855
405 | -0 -0 0 1
406 | 81 81 0
407 | 0.806894 0.000924523 -0.590695 0.261639
408 | 0.0421693 0.997357 0.0591646 -0.015155
409 | 0.589189 -0.0726488 0.804723 -0.318696
410 | -0 -0 0 1
411 | 82 82 0
412 | 0.688401 0.00438603 -0.725317 0.271368
413 | 0.0484104 0.997474 0.0519783 -0.0112016
414 | 0.723713 -0.0708948 0.68645 -0.278863
415 | -0 -0 0 1
416 | 83 83 0
417 | 0.583834 0.0331182 -0.811197 0.302179
418 | 0.0059808 0.998965 0.0450886 -0.00893461
419 | 0.811851 -0.0311758 0.583032 -0.254483
420 | -0 -0 0 1
421 | 84 84 0
422 | 0.541058 0.0309662 -0.840415 0.324447
423 | 0.00824311 0.999079 0.0421192 -0.0040326
424 | 0.840945 -0.0297166 0.540304 -0.216565
425 | -0 -0 0 1
426 | 85 85 0
427 | 0.503113 0.0295982 -0.863713 0.354606
428 | 0.00817978 0.999205 0.039006 -0.00229145
429 | 0.864182 -0.0266894 0.502472 -0.164286
430 | -0 -0 0 1
431 | 86 86 0
432 | 0.407393 0.0305004 -0.912743 0.379778
433 | 0.0114968 0.999192 0.0385207 -0.000526797
434 | 0.91318 -0.0261867 0.406713 -0.117736
435 | -0 -0 0 1
436 | 87 87 0
437 | 0.317686 0.0319286 -0.947658 0.365254
438 | 0.0141229 0.999163 0.0383984 -0.00106844
439 | 0.948091 -0.0255823 0.31697 -0.0675036
440 | -0 -0 0 1
441 | 88 88 0
442 | 0.160113 0.03134 -0.986601 0.368647
443 | 0.0162206 0.999277 0.034375 0.000377915
444 | 0.986965 -0.0215071 0.159489 -0.0165193
445 | -0 -0 0 1
446 | 89 89 0
447 | 0.0308629 0.0354099 -0.998896 0.35673
448 | 0.0145263 0.999251 0.0358713 0.00420728
449 | 0.999418 -0.0156173 0.0303254 0.0266604
450 | -0 -0 0 1
451 | 90 90 0
452 | -0.0841862 0.0354138 -0.995821 0.359261
453 | 0.0128252 0.999324 0.0344542 0.00958368
454 | 0.996368 -0.00987103 -0.0845835 0.0759798
455 | -0 -0 0 1
456 | 91 91 0
457 | -0.195315 0.0287438 -0.980319 0.401907
458 | 0.00954579 0.999579 0.0274066 0.015771
459 | 0.980694 -0.004005 -0.195507 0.116242
460 | -0 -0 0 1
461 | 92 92 0
462 | -0.275049 0.0288139 -0.960998 0.455225
463 | 0.00685633 0.999584 0.0280085 0.0186893
464 | 0.961406 0.00111481 -0.275133 0.168267
465 | -0 -0 0 1
466 | 93 93 0
467 | -0.33178 0.0546815 -0.941771 0.516489
468 | 0.00415244 0.998394 0.0565063 0.023954
469 | 0.943348 0.014837 -0.331474 0.204934
470 | -0 -0 0 1
471 | 94 94 0
472 | -0.348776 0.0885444 -0.933014 0.569871
473 | 0.00310173 0.995631 0.0933274 0.0202385
474 | 0.937201 0.0296564 -0.347527 0.261792
475 | -0 -0 0 1
476 | 95 95 0
477 | -0.370178 0.100079 -0.923554 0.622301
478 | -0.000639077 0.994152 0.107985 0.0281804
479 | 0.928961 0.0405641 -0.367949 0.282388
480 | -0 -0 0 1
481 | 96 96 0
482 | -0.347128 0.148584 -0.925973 0.617837
483 | -0.000448963 0.987343 0.1586 0.0321321
484 | 0.937818 0.05547 -0.342667 0.321841
485 | -0 -0 0 1
486 | 97 97 0
487 | -0.394702 0.165051 -0.903863 0.567766
488 | -0.00249537 0.983537 0.180689 0.030952
489 | 0.918806 0.0735739 -0.387792 0.343341
490 | -0 -0 0 1
491 | 98 98 0
492 | -0.44473 0.163504 -0.880615 0.503086
493 | -0.00886723 0.982345 0.18687 0.0246608
494 | 0.895621 0.0909154 -0.435428 0.34265
495 | -0 -0 0 1
496 | 99 99 0
497 | -0.535873 0.149637 -0.830932 0.431845
498 | -0.012735 0.982624 0.185167 0.0195334
499 | 0.844202 0.109808 -0.524657 0.347246
500 | -0 -0 0 1
501 | 100 100 0
502 | -0.605403 0.129698 -0.785281 0.368657
503 | -0.0251244 0.983028 0.181728 0.0110968
504 | 0.795523 0.129748 -0.591869 0.327242
505 | -0 -0 0 1
506 | 101 101 0
507 | -0.665633 0.101088 -0.739401 0.308855
508 | -0.0318454 0.986033 0.163475 -0.00508713
509 | 0.7456 0.132361 -0.653117 0.345174
510 | -0 -0 0 1
511 | 102 102 0
512 | -0.729166 0.0660954 -0.681138 0.245685
513 | -0.0384281 0.9898 0.137185 -0.00738133
514 | 0.683257 0.126205 -0.719188 0.36172
515 | -0 -0 0 1
516 | 103 103 0
517 | -0.795857 0.0293219 -0.604775 0.188497
518 | -0.0411639 0.993896 0.102358 -0.00650684
519 | 0.604084 0.106357 -0.789791 0.378133
520 | -0 -0 0 1
521 | 104 104 0
522 | -0.901225 -0.0120028 -0.433185 0.121623
523 | -0.0449246 0.996818 0.0658439 -0.000920891
524 | 0.431016 0.0788008 -0.898897 0.382388
525 | -0 -0 0 1
526 | 105 105 0
527 | -0.962004 -0.0323082 -0.271119 0.0478844
528 | -0.0502705 0.996958 0.0595697 -0.00537124
529 | 0.268369 0.0709355 -0.960701 0.380155
530 | -0 -0 0 1
531 | 106 106 0
532 | -0.993119 -0.0466861 -0.107405 -0.0185201
533 | -0.052825 0.997086 0.0550381 -0.0040807
534 | 0.104523 0.060333 -0.992691 0.386608
535 | -0 -0 0 1
536 | 107 107 0
537 | -0.998471 -0.0548308 0.00704779 -0.0816128
538 | -0.0544199 0.997307 0.0491659 -0.00846789
539 | -0.00972462 0.0487072 -0.998766 0.36171
540 | -0 -0 0 1
541 | 108 108 0
542 | -0.991554 -0.0567941 0.116596 -0.147809
543 | -0.0521685 0.99774 0.04235 -0.0130009
544 | -0.118737 0.0359097 -0.992276 0.340824
545 | -0 -0 0 1
546 | 109 109 0
547 | -0.963057 -0.0602854 0.262465 -0.191638
548 | -0.0516736 0.997879 0.0395975 -0.015748
549 | -0.264295 0.0245722 -0.964129 0.30521
550 | -0 -0 0 1
551 | 110 110 0
552 | -0.912156 -0.060107 0.405412 -0.242264
553 | -0.0515508 0.998158 0.0320018 -0.0184003
554 | -0.406588 0.00829139 -0.913574 0.280671
555 | -0 -0 0 1
556 | 111 111 0
557 | -0.834045 -0.058595 0.548575 -0.279401
558 | -0.0524011 0.998262 0.0269575 -0.0157163
559 | -0.549202 -0.00626219 -0.835666 0.247726
560 | -0 -0 0 1
561 | 112 112 0
562 | -0.746935 -0.0511709 0.662926 -0.309283
563 | -0.0524538 0.998462 0.0179697 -0.0143244
564 | -0.662825 -0.0213508 -0.74847 0.212817
565 | -0 -0 0 1
566 | 113 113 0
567 | -0.634209 -0.0384214 0.772206 -0.348074
568 | -0.0496588 0.998727 0.00890743 -0.0122161
569 | -0.771565 -0.0326976 -0.63531 0.203315
570 | -0 -0 0 1
571 | 114 114 0
572 | -0.613537 -0.0295903 0.789111 -0.379356
573 | -0.0480077 0.998847 0.000128853 -8.85837e-05
574 | -0.788205 -0.0378044 -0.61425 0.228658
575 | -0 -0 0 1
576 | 115 115 0
577 | -0.644251 -0.0226368 0.764479 -0.429542
578 | -0.0396631 0.999206 -0.00383815 0.00876347
579 | -0.763785 -0.0327944 -0.644638 0.226211
580 | -0 -0 0 1
581 | 116 116 0
582 | -0.675327 -0.0222119 0.737184 -0.457203
583 | -0.0320919 0.999485 0.000716072 0.0168224
584 | -0.73682 -0.0231741 -0.675692 0.240197
585 | -0 -0 0 1
586 | 117 117 0
587 | -0.695387 -0.0486381 0.716988 -0.507555
588 | -0.0268934 0.998769 0.0416701 0.0206916
589 | -0.718132 0.00969456 -0.695839 0.23965
590 | -0 -0 0 1
591 | 118 118 0
592 | -0.651933 -0.102825 0.751273 -0.544945
593 | -0.030329 0.993506 0.109661 0.0250205
594 | -0.75767 0.0487062 -0.650818 0.268833
595 | -0 -0 0 1
596 | 119 119 0
597 | -0.654525 -0.162257 0.738424 -0.571439
598 | -0.0392047 0.982669 0.181176 0.0334175
599 | -0.755023 0.0896342 -0.649543 0.291802
600 | -0 -0 0 1
601 | 120 120 0
602 | -0.654902 -0.194122 0.730357 -0.596605
603 | -0.0481124 0.975195 0.216056 0.0175102
604 | -0.754181 0.106356 -0.647996 0.294909
605 | -0 -0 0 1
606 | 121 121 0
607 | -0.672557 -0.195848 0.71366 -0.600447
608 | -0.0575041 0.975261 0.213446 -0.00616033
609 | -0.737808 0.102516 -0.667181 0.300745
610 | -0 -0 0 1
611 | 122 122 0
612 | -0.698388 -0.177974 0.693238 -0.596645
613 | -0.0634161 0.980168 0.187749 -0.0246475
614 | -0.712904 0.0871594 -0.695824 0.297946
615 | -0 -0 0 1
616 | 123 123 0
617 | -0.695643 -0.141407 0.704333 -0.571275
618 | -0.0617387 0.988576 0.137496 -0.0197177
619 | -0.71573 0.0521638 -0.696426 0.270146
620 | -0 -0 0 1
621 | 124 124 0
622 | -0.624099 -0.128485 0.770709 -0.551402
623 | -0.0586891 0.991309 0.117736 -0.0154718
624 | -0.779138 0.028247 -0.626216 0.21653
625 | -0 -0 0 1
626 | 125 125 0
627 | -0.534516 -0.12456 0.835929 -0.509238
628 | -0.0646828 0.992208 0.106487 -0.0143688
629 | -0.842679 0.00284877 -0.538408 0.160243
630 | -0 -0 0 1
631 | 126 126 0
632 | -0.443917 -0.124197 0.887419 -0.490306
633 | -0.0694681 0.992138 0.104102 -0.013743
634 | -0.893371 -0.0154345 -0.449055 0.091086
635 | -0 -0 0 1
636 | 127 127 0
637 | -0.335472 -0.122463 0.934057 -0.452254
638 | -0.075811 0.991808 0.102807 -0.0157562
639 | -0.938995 -0.0363231 -0.342008 0.0318695
640 | -0 -0 0 1
641 | 128 128 0
642 | -0.190614 -0.117568 0.974599 -0.443339
643 | -0.0791182 0.991413 0.104122 -0.0171657
644 | -0.978472 -0.0572614 -0.198279 -0.0337307
645 | -0 -0 0 1
646 | 129 129 0
647 | -0.0495327 -0.114053 0.992239 -0.400748
648 | -0.0837374 0.990435 0.109665 -0.0210336
649 | -0.995256 -0.0776555 -0.0586094 -0.0793901
650 | -0 -0 0 1
651 | 130 130 0
652 | 0.112793 -0.101906 0.988379 -0.357611
653 | -0.0835145 0.990235 0.111628 -0.0286304
654 | -0.990103 -0.0951348 0.103181 -0.130804
655 | -0 -0 0 1
656 | 131 131 0
657 | 0.330452 -0.0851921 0.93997 -0.310483
658 | -0.0809177 0.989694 0.118146 -0.0342644
659 | -0.940348 -0.115102 0.320153 -0.163209
660 | -0 -0 0 1
661 | 132 132 0
662 | 0.492057 -0.0351996 0.869851 -0.261969
663 | -0.0778148 0.993404 0.0842176 -0.042981
664 | -0.867078 -0.109127 0.486072 -0.198415
665 | -0 -0 0 1
666 | 133 133 0
667 | 0.658491 -0.0107569 0.752511 -0.208474
668 | -0.0724493 0.994348 0.0776113 -0.041021
669 | -0.749093 -0.105625 0.65399 -0.232691
670 | -0 -0 0 1
671 | 134 134 0
672 | 0.760802 0.00239978 0.64898 -0.151804
673 | -0.0652007 0.995216 0.0727549 -0.0405488
674 | -0.645701 -0.097666 0.757319 -0.25048
675 | -0 -0 0 1
676 | 135 135 0
677 | 0.89511 0.0151186 0.445589 -0.100542
678 | -0.0508304 0.996369 0.0683032 -0.0371443
679 | -0.442938 -0.0837883 0.892628 -0.286928
680 | -0 -0 0 1
681 | 136 136 0
682 | 0.95372 0.0196573 0.300053 -0.0488513
683 | -0.0394095 0.997425 0.0599192 -0.0352066
684 | -0.298103 -0.0689711 0.952039 -0.292864
685 | -0 -0 0 1
686 | 137 137 0
687 | 0.995264 0.0191888 0.0952993 0.0216773
688 | -0.0243167 0.998302 0.0529412 -0.0314365
689 | -0.0941216 -0.0550078 0.99404 -0.303335
690 | -0 -0 0 1
691 | 138 138 0
692 | 0.998377 0.0172698 -0.0542653 0.0745075
693 | -0.0147745 0.99883 0.046052 -0.0279398
694 | 0.0549971 -0.0451755 0.997464 -0.287043
695 | -0 -0 0 1
696 | 139 139 0
697 | 0.968151 0.0103704 -0.250153 0.135518
698 | -0.00123037 0.999327 0.0366664 -0.027058
699 | 0.250365 -0.0351908 0.967512 -0.281028
700 | -0 -0 0 1
701 | 140 140 0
702 | 0.888133 0.00606977 -0.459546 0.170381
703 | 0.00877091 0.999507 0.0301526 -0.0210421
704 | 0.459503 -0.0308101 0.887642 -0.249644
705 | -0 -0 0 1
706 | 141 141 0
707 | 0.798775 0.000486114 -0.60163 0.208275
708 | 0.0164801 0.999607 0.022688 -0.0256519
709 | 0.601405 -0.0280375 0.798452 -0.214936
710 | -0 -0 0 1
711 | 142 142 0
712 | 0.708453 0.000291603 -0.705757 0.242178
713 | 0.0225103 0.999482 0.0230093 -0.0246125
714 | 0.705398 -0.0321878 0.70808 -0.184985
715 | -0 -0 0 1
716 | 143 143 0
717 | 0.566891 0.000687775 -0.823792 0.258711
718 | 0.0264967 0.999467 0.0190681 -0.0226867
719 | 0.823366 -0.0326373 0.566571 -0.134291
720 | -0 -0 0 1
721 | 144 144 0
722 | 0.369868 -0.00100563 -0.929084 0.291142
723 | 0.0283861 0.999545 0.0102186 -0.0181585
724 | 0.928651 -0.0301526 0.369728 -0.0825559
725 | -0 -0 0 1
726 | 145 145 0
727 | 0.245314 -0.00308832 -0.969439 0.28095
728 | 0.0297544 0.999548 0.00434503 -0.0138651
729 | 0.968987 -0.0299109 0.245295 -0.0360677
730 | -0 -0 0 1
731 | 146 146 0
732 | 0.151077 -0.00740054 -0.988494 0.265361
733 | 0.0288086 0.99958 -0.00308057 0.00614944
734 | 0.988102 -0.0280117 0.151227 0.00761854
735 | -0 -0 0 1
736 | 147 147 0
737 | 0.160657 -0.0153276 -0.986891 0.26233
738 | 0.0294398 0.999509 -0.010731 0.0233297
739 | 0.986571 -0.0273299 0.161029 0.0158146
740 | -0 -0 0 1
741 | 148 148 0
742 | 0.132414 -0.0374414 -0.990487 0.274651
743 | 0.0299112 0.998982 -0.0337638 0.0264828
744 | 0.990743 -0.0251558 0.133399 0.00834242
745 | -0 -0 0 1
746 | 149 149 0
747 | -0.0107134 -0.0782171 -0.996879 0.275545
748 | 0.0251847 0.996599 -0.0784658 0.0290319
749 | 0.999625 -0.0259467 -0.0087071 3.21164e-05
750 | -0 -0 0 1
751 | 150 150 0
752 | -0.116103 -0.0841527 -0.989666 0.27694
753 | 0.0251532 0.995836 -0.0876282 0.0285862
754 | 0.992919 -0.0350672 -0.113503 1.206e-05
755 | -0 -0 0 1
756 | 151 151 0
757 | -0.105165 -0.0903112 -0.990346 0.276802
758 | 0.0336308 0.994975 -0.0943046 0.0287369
759 | 0.993886 -0.0432236 -0.101599 0.00214624
760 | -0 -0 0 1
761 |
--------------------------------------------------------------------------------
/dataloader/datalist/tanks/logs/Francis.log:
--------------------------------------------------------------------------------
1 | 0 0 0
2 | 0.191178 -0.0561232 0.97995 -0.943293
3 | 0.0133187 -0.998124 -0.0597624 0.00807854
4 | 0.981465 0.0244769 -0.190072 0.135622
5 | -0 -0 0 1
6 | 1 1 0
7 | 0.149047 -0.0526273 0.987429 -0.944199
8 | 0.00992047 -0.998453 -0.0547123 0.00923427
9 | 0.98878 0.0179505 -0.148294 0.14194
10 | -0 -0 0 1
11 | 2 2 0
12 | 0.0906118 -0.0455649 0.994843 -0.940913
13 | 0.00757324 -0.998892 -0.0464401 0.00846398
14 | 0.995857 0.0117422 -0.0901664 0.144116
15 | -0 -0 0 1
16 | 3 3 0
17 | 0.0408542 -0.0410856 0.99832 -0.93848
18 | 0.00394314 -0.99914 -0.0412807 0.0099872
19 | 0.999157 0.00562301 -0.0406571 0.14672
20 | -0 -0 0 1
21 | 4 4 0
22 | -0.0127557 -0.0338025 0.999347 -0.938356
23 | 0.00122605 -0.999428 -0.0337896 0.00679604
24 | 0.999918 0.00079424 0.0127898 0.137318
25 | -0 -0 0 1
26 | 5 5 0
27 | -0.0741007 -0.0286721 0.996839 -0.915337
28 | -0.00188123 -0.999581 -0.0288908 0.0124374
29 | 0.997249 -0.00401612 0.0740157 0.124355
30 | -0 -0 0 1
31 | 6 6 0
32 | -0.12799 0.00534905 0.991761 -0.896237
33 | -0.00221533 -0.999985 0.00510751 0.0145139
34 | 0.991773 -0.00154337 0.128 0.103721
35 | -0 -0 0 1
36 | 7 7 0
37 | -0.180362 0.0518434 0.982233 -0.876587
38 | -0.00438836 -0.998642 0.0519036 0.0140702
39 | 0.98359 0.00505106 0.180345 0.0710926
40 | -0 -0 0 1
41 | 8 8 0
42 | -0.229118 0.0631899 0.971345 -0.853675
43 | -0.0088712 -0.997985 0.0628304 0.016258
44 | 0.973358 0.0057786 0.229217 0.0404156
45 | -0 -0 0 1
46 | 9 9 0
47 | -0.269577 0.0715093 0.96032 -0.851708
48 | -0.0132457 -0.99742 0.0705537 0.0170033
49 | 0.962888 0.0062995 0.269829 -0.00589197
50 | -0 -0 0 1
51 | 10 10 0
52 | -0.278511 0.0707681 0.957822 -0.835001
53 | -0.0168455 -0.997488 0.0688006 0.0155951
54 | 0.960285 0.00302675 0.279004 -0.0479874
55 | -0 -0 0 1
56 | 11 11 0
57 | -0.327317 0.0762421 0.941834 -0.821293
58 | -0.020809 -0.997079 0.0734825 0.0140159
59 | 0.944686 0.00445342 0.327947 -0.0920915
60 | -0 -0 0 1
61 | 12 12 0
62 | -0.379453 0.0750719 0.92216 -0.788978
63 | -0.0241132 -0.997167 0.0712559 0.0125189
64 | 0.924897 0.00480204 0.380188 -0.134089
65 | -0 -0 0 1
66 | 13 13 0
67 | -0.430163 0.0815242 0.899062 -0.768314
68 | -0.0269141 -0.996629 0.0774941 0.0140308
69 | 0.90235 0.00913767 0.430908 -0.181706
70 | -0 -0 0 1
71 | 14 14 0
72 | -0.475683 0.131051 0.8698 -0.752466
73 | -0.0297286 -0.990669 0.133004 0.023507
74 | 0.879114 0.03741 0.47514 -0.21235
75 | -0 -0 0 1
76 | 15 15 0
77 | -0.520631 0.235889 0.820549 -0.743136
78 | -0.0318565 -0.965773 0.257425 0.0304671
79 | 0.853188 0.107883 0.510326 -0.238669
80 | -0 -0 0 1
81 | 16 16 0
82 | -0.562658 0.276912 0.778933 -0.739149
83 | -0.0374749 -0.949806 0.310588 0.0321755
84 | 0.82584 0.145564 0.544793 -0.279417
85 | -0 -0 0 1
86 | 17 17 0
87 | -0.601064 0.30617 0.738229 -0.724569
88 | -0.044404 -0.935076 0.351655 0.0318729
89 | 0.797966 0.178587 0.575636 -0.313335
90 | -0 -0 0 1
91 | 18 18 0
92 | -0.622127 0.316928 0.715901 -0.710896
93 | -0.05121 -0.928918 0.366728 0.0245754
94 | 0.78124 0.19149 0.594135 -0.356588
95 | -0 -0 0 1
96 | 19 19 0
97 | -0.643358 0.313899 0.698254 -0.686377
98 | -0.0618298 -0.930401 0.361291 0.0179621
99 | 0.763065 0.189266 0.617989 -0.39602
100 | -0 -0 0 1
101 | 20 20 0
102 | -0.668561 0.298708 0.681028 -0.659081
103 | -0.0741453 -0.937994 0.338629 0.00964436
104 | 0.739952 0.175899 0.649254 -0.435859
105 | -0 -0 0 1
106 | 21 21 0
107 | -0.702535 0.269105 0.658808 -0.622436
108 | -0.0853739 -0.950931 0.29739 0.00617094
109 | 0.70651 0.152682 0.691037 -0.471634
110 | -0 -0 0 1
111 | 22 22 0
112 | -0.735307 0.240758 0.633529 -0.602375
113 | -0.09271 -0.96172 0.257875 -2.35923e-05
114 | 0.671363 0.130883 0.72948 -0.505426
115 | -0 -0 0 1
116 | 23 23 0
117 | -0.768401 0.198052 0.608552 -0.568868
118 | -0.0932654 -0.975412 0.199682 -0.00264379
119 | 0.633136 0.0966789 0.767979 -0.543517
120 | -0 -0 0 1
121 | 24 24 0
122 | -0.797583 0.180395 0.575603 -0.526836
123 | -0.0855744 -0.978422 0.188064 -0.00528746
124 | 0.597108 0.100739 0.79581 -0.581031
125 | -0 -0 0 1
126 | 25 25 0
127 | -0.825955 0.126319 0.549401 -0.489172
128 | -0.0792253 -0.99091 0.108725 -0.0105464
129 | 0.558141 0.0462759 0.828455 -0.614201
130 | -0 -0 0 1
131 | 26 26 0
132 | -0.853529 0.0829808 0.514396 -0.449466
133 | -0.0676357 -0.996529 0.0485303 -0.00742861
134 | 0.516637 0.00663047 0.856179 -0.643268
135 | -0 -0 0 1
136 | 27 27 0
137 | -0.869614 0.0621654 0.489802 -0.404434
138 | -0.054991 -0.998064 0.0290404 0.0216228
139 | 0.49066 -0.00168077 0.87135 -0.6561
140 | -0 -0 0 1
141 | 28 28 0
142 | -0.881851 0.036884 0.470084 -0.35256
143 | -0.0419967 -0.999118 -0.000390031 0.0270352
144 | 0.469655 -0.0200859 0.882622 -0.676858
145 | -0 -0 0 1
146 | 29 29 0
147 | -0.890615 0.011231 0.45462 -0.305864
148 | -0.0317532 -0.998791 -0.0375312 0.0386737
149 | 0.453648 -0.0478615 0.889895 -0.686711
150 | -0 -0 0 1
151 | 30 30 0
152 | -0.909088 0.00267123 0.416595 -0.250424
153 | -0.0201917 -0.999087 -0.0376559 0.039832
154 | 0.416114 -0.0426443 0.908312 -0.716191
155 | -0 -0 0 1
156 | 31 31 0
157 | -0.928378 -0.00161541 0.371634 -0.204234
158 | -0.0121265 -0.999326 -0.0346372 0.0449551
159 | 0.37144 -0.036663 0.927733 -0.737935
160 | -0 -0 0 1
161 | 32 32 0
162 | -0.945852 0.0086625 0.324483 -0.161994
163 | -0.00399626 -0.999879 0.0150442 0.0536122
164 | 0.324574 0.0129328 0.945772 -0.759747
165 | -0 -0 0 1
166 | 33 33 0
167 | -0.961151 0.0241985 0.274961 -0.110321
168 | 0.00151011 -0.995674 0.0929049 0.055992
169 | 0.27602 0.0897109 0.956956 -0.763373
170 | -0 -0 0 1
171 | 34 34 0
172 | -0.974442 0.0262437 0.2231 -0.0630098
173 | 0.00447321 -0.990689 0.136075 0.0607826
174 | 0.224594 0.133595 0.965251 -0.780226
175 | -0 -0 0 1
176 | 35 35 0
177 | -0.985381 0.0221099 0.168924 -0.00981449
178 | 0.00404448 -0.988227 0.152938 0.0614959
179 | 0.170316 0.151386 0.973691 -0.774799
180 | -0 -0 0 1
181 | 36 36 0
182 | -0.992782 0.0174412 0.118659 0.0383475
183 | 0.00214824 -0.986625 0.162994 0.0604445
184 | 0.119915 0.162072 0.979466 -0.789822
185 | -0 -0 0 1
186 | 37 37 0
187 | -0.997837 0.0124357 0.0645548 0.0914093
188 | -0.00120935 -0.985252 0.171103 0.0562101
189 | 0.0657306 0.170655 0.983136 -0.799369
190 | -0 -0 0 1
191 | 38 38 0
192 | -0.999969 0.00547312 0.00563471 0.142159
193 | -0.00440454 -0.984609 0.174717 0.0404514
194 | 0.00650423 0.174686 0.984603 -0.819422
195 | -0 -0 0 1
196 | 39 39 0
197 | -0.998555 -0.00159224 -0.0537083 0.199665
198 | -0.00771476 -0.984956 0.172634 0.0278651
199 | -0.0531752 0.172799 0.983521 -0.834474
200 | -0 -0 0 1
201 | 40 40 0
202 | -0.994015 -0.00886813 -0.108881 0.259056
203 | -0.00954752 -0.985833 0.167456 0.0120382
204 | -0.108824 0.167494 0.979849 -0.842212
205 | -0 -0 0 1
206 | 41 41 0
207 | -0.986982 -0.0148917 -0.160141 0.314469
208 | -0.0108554 -0.987265 0.158711 0.00167613
209 | -0.160465 0.158383 0.974251 -0.835359
210 | -0 -0 0 1
211 | 42 42 0
212 | -0.977821 -0.00988501 -0.209211 0.372428
213 | -0.01241 -0.994396 0.104987 -0.000702171
214 | -0.209076 0.105255 0.972218 -0.827836
215 | -0 -0 0 1
216 | 43 43 0
217 | -0.966091 0.00987371 -0.258011 0.429359
218 | -0.0146635 -0.999754 0.0166466 0.00998298
219 | -0.257783 0.0198654 0.965999 -0.823972
220 | -0 -0 0 1
221 | 44 44 0
222 | -0.951909 0.0166086 -0.30593 0.488011
223 | -0.0119105 -0.999781 -0.0172173 0.0208751
224 | -0.306149 -0.0127455 0.951898 -0.818722
225 | -0 -0 0 1
226 | 45 45 0
227 | -0.93436 0.00744593 -0.356252 0.547082
228 | -0.00595452 -0.999968 -0.00528285 0.0337707
229 | -0.35628 -0.00281478 0.934375 -0.813268
230 | -0 -0 0 1
231 | 46 46 0
232 | -0.914755 -0.00995615 -0.403885 0.609272
233 | -0.000271148 -0.999681 0.0252572 0.0358758
234 | -0.404008 0.0232137 0.914461 -0.808482
235 | -0 -0 0 1
236 | 47 47 0
237 | -0.891555 -0.0493169 -0.45022 0.672292
238 | 0.00612992 -0.995277 0.0968832 0.037991
239 | -0.452872 0.0836169 0.887646 -0.803465
240 | -0 -0 0 1
241 | 48 48 0
242 | -0.862772 -0.0812406 -0.499024 0.729252
243 | 0.0102232 -0.989607 0.143432 0.0409367
244 | -0.50549 0.118648 0.854636 -0.775045
245 | -0 -0 0 1
246 | 49 49 0
247 | -0.830591 -0.0935893 -0.548963 0.779629
248 | 0.0098747 -0.988097 0.153514 0.0392285
249 | -0.556796 0.122086 0.821628 -0.752254
250 | -0 -0 0 1
251 | 50 50 0
252 | -0.786895 -0.106556 -0.607817 0.819092
253 | 0.00739515 -0.986536 0.163375 0.0342222
254 | -0.617042 0.124064 0.777089 -0.706634
255 | -0 -0 0 1
256 | 51 51 0
257 | -0.741913 -0.116178 -0.660354 0.857285
258 | 0.00505018 -0.985814 0.167764 0.0337076
259 | -0.670477 0.121131 0.731975 -0.663902
260 | -0 -0 0 1
261 | 52 52 0
262 | -0.690172 -0.120777 -0.713495 0.888125
263 | 0.00226889 -0.98633 0.164766 0.0348546
264 | -0.723642 0.112098 0.681011 -0.606457
265 | -0 -0 0 1
266 | 53 53 0
267 | -0.638952 -0.123638 -0.759246 0.931896
268 | 4.08156e-05 -0.987004 0.160693 0.0286458
269 | -0.769247 0.102644 0.630653 -0.552544
270 | -0 -0 0 1
271 | 54 54 0
272 | -0.591481 -0.122528 -0.796955 0.970329
273 | -0.0019234 -0.988169 0.153354 0.0195295
274 | -0.806316 0.092239 0.584248 -0.502421
275 | -0 -0 0 1
276 | 55 55 0
277 | -0.546962 -0.0905647 -0.832245 1.00946
278 | -0.00367844 -0.993862 0.110569 0.00954444
279 | -0.83715 0.0635386 0.543271 -0.445149
280 | -0 -0 0 1
281 | 56 56 0
282 | -0.496417 -0.0492804 -0.866684 1.04712
283 | -0.00548541 -0.998189 0.0598998 0.00693642
284 | -0.868067 0.0344894 0.495248 -0.388295
285 | -0 -0 0 1
286 | 57 57 0
287 | -0.444086 -0.0369355 -0.895222 1.08594
288 | -0.00386784 -0.999062 0.0431384 0.00572749
289 | -0.895976 0.0226197 0.443527 -0.319584
290 | -0 -0 0 1
291 | 58 58 0
292 | -0.390455 -0.0333077 -0.920019 1.12414
293 | -0.00103058 -0.999329 0.0366163 0.00570318
294 | -0.920621 0.0152452 0.390159 -0.248746
295 | -0 -0 0 1
296 | 59 59 0
297 | -0.337272 -0.0280555 -0.940989 1.15313
298 | 0.0016514 -0.999572 0.0292103 0.00883949
299 | -0.941406 0.00829787 0.337174 -0.176379
300 | -0 -0 0 1
301 | 60 60 0
302 | -0.280369 -0.0241403 -0.959589 1.18663
303 | 0.00456108 -0.999706 0.0238169 0.00305918
304 | -0.959881 0.00230077 0.280397 -0.10719
305 | -0 -0 0 1
306 | 61 61 0
307 | -0.209918 -0.00347925 -0.977713 1.20698
308 | 0.00716887 -0.999972 0.00201929 0.00534963
309 | -0.977693 -0.00658521 0.209937 -0.0299376
310 | -0 -0 0 1
311 | 62 62 0
312 | -0.118066 -0.00163304 -0.993004 1.22423
313 | 0.00959433 -0.999954 0.000503729 0.00476275
314 | -0.992959 -0.00946773 0.118076 0.0504353
315 | -0 -0 0 1
316 | 63 63 0
317 | -0.0521229 0.000375594 -0.998641 1.24025
318 | 0.0123472 -0.999923 -0.00102052 0.0110495
319 | -0.998564 -0.0123836 0.0521143 0.130098
320 | -0 -0 0 1
321 | 64 64 0
322 | 0.00439078 0.00199301 -0.999988 1.23758
323 | 0.013956 -0.999901 -0.00193156 0.0231144
324 | -0.999893 -0.0139473 -0.00441816 0.195353
325 | -0 -0 0 1
326 | 65 65 0
327 | 0.0606092 -0.0431476 -0.997229 1.24368
328 | 0.0165271 -0.998885 0.0442237 0.0180421
329 | -0.998025 -0.0191617 -0.0598285 0.26169
330 | -0 -0 0 1
331 | 66 66 0
332 | 0.114528 -0.139384 -0.983593 1.23595
333 | 0.0198034 -0.989591 0.142539 0.0116618
334 | -0.993223 -0.0358032 -0.110575 0.324853
335 | -0 -0 0 1
336 | 67 67 0
337 | 0.165042 -0.157672 -0.973602 1.23216
338 | 0.0182279 -0.986483 0.162848 0.00380221
339 | -0.986118 -0.0446235 -0.159937 0.386766
340 | -0 -0 0 1
341 | 68 68 0
342 | 0.217059 -0.153404 -0.964029 1.21964
343 | 0.0138892 -0.986989 0.160185 0.00058729
344 | -0.97606 -0.0481591 -0.212104 0.44364
345 | -0 -0 0 1
346 | 69 69 0
347 | 0.273996 -0.13192 -0.95264 1.2047
348 | 0.00804659 -0.990198 0.139435 0.000404743
349 | -0.961697 -0.0458703 -0.270249 0.50176
350 | -0 -0 0 1
351 | 70 70 0
352 | 0.352161 -0.107657 -0.929727 1.18964
353 | -0.000198945 -0.993371 0.114951 -0.00259242
354 | -0.935939 -0.0402964 -0.349848 0.559476
355 | -0 -0 0 1
356 | 71 71 0
357 | 0.477466 -0.0837853 -0.874646 1.16886
358 | -0.00980226 -0.995889 0.0900485 -0.00467969
359 | -0.878596 -0.0344216 -0.476324 0.615349
360 | -0 -0 0 1
361 | 72 72 0
362 | 0.602859 -0.0754874 -0.794269 1.15003
363 | -0.0170104 -0.996504 0.0817967 -0.00995901
364 | -0.797666 -0.0358011 -0.602035 0.673814
365 | -0 -0 0 1
366 | 73 73 0
367 | 0.699804 -0.0615853 -0.711675 1.1199
368 | -0.0203082 -0.997589 0.0663577 -0.013428
369 | -0.714046 -0.0319846 -0.699368 0.723573
370 | -0 -0 0 1
371 | 74 74 0
372 | 0.749924 -0.035281 -0.660583 1.08778
373 | -0.0203941 -0.999335 0.0302211 -0.0157662
374 | -0.66121 -0.00919147 -0.750144 0.785462
375 | -0 -0 0 1
376 | 75 75 0
377 | 0.782672 -0.0102693 -0.622349 1.04848
378 | -0.0185598 -0.999804 -0.00684324 -0.0167516
379 | -0.622157 0.0169067 -0.78271 0.837468
380 | -0 -0 0 1
381 | 76 76 0
382 | 0.812515 -0.00101189 -0.582939 1.01723
383 | -0.0156853 -0.999674 -0.0201272 -0.0124004
384 | -0.582729 0.0254972 -0.812266 0.887961
385 | -0 -0 0 1
386 | 77 77 0
387 | 0.838055 -0.00265162 -0.545579 0.997259
388 | -0.0141157 -0.999759 -0.0168239 -0.00855619
389 | -0.545403 0.0218006 -0.83789 0.919517
390 | -0 -0 0 1
391 | 78 78 0
392 | 0.864171 -0.0223559 -0.502702 0.973703
393 | -0.0105149 -0.999597 0.026378 -0.0114587
394 | -0.503089 -0.0175092 -0.864057 0.945696
395 | -0 -0 0 1
396 | 79 79 0
397 | 0.888061 -0.0426972 -0.457738 0.943241
398 | -0.00817181 -0.996987 0.0771435 -0.0168973
399 | -0.459652 -0.0647676 -0.885734 0.976918
400 | -0 -0 0 1
401 | 80 80 0
402 | 0.911032 -0.0605293 -0.407869 0.903461
403 | -0.00623989 -0.991077 0.133142 -0.0207898
404 | -0.412289 -0.118751 -0.903281 1.0096
405 | -0 -0 0 1
406 | 81 81 0
407 | 0.9312 -0.0633774 -0.358958 0.850727
408 | -0.00703144 -0.987709 0.156148 -0.00643376
409 | -0.364442 -0.142881 -0.920199 1.0443
410 | -0 -0 0 1
411 | 82 82 0
412 | 0.946872 -0.0581857 -0.316303 0.79278
413 | -0.0099764 -0.988339 0.151945 0.0129083
414 | -0.321455 -0.140717 -0.936411 1.06515
415 | -0 -0 0 1
416 | 83 83 0
417 | 0.962174 -0.0552417 -0.266775 0.742935
418 | -0.01552 -0.98875 0.148767 0.0147386
419 | -0.271992 -0.139 -0.952208 1.09581
420 | -0 -0 0 1
421 | 84 84 0
422 | 0.974917 -0.0507512 -0.216705 0.686965
423 | -0.0202922 -0.989868 0.140531 0.0149814
424 | -0.221641 -0.132609 -0.966069 1.11697
425 | -0 -0 0 1
426 | 85 85 0
427 | 0.986631 -0.0462245 -0.15628 0.625289
428 | -0.0253967 -0.990827 0.132732 0.0169773
429 | -0.160982 -0.126988 -0.978754 1.14478
430 | -0 -0 0 1
431 | 86 86 0
432 | 0.994396 -0.0413755 -0.0972891 0.557349
433 | -0.0289945 -0.991683 0.125393 0.0121533
434 | -0.101668 -0.12187 -0.987325 1.15124
435 | -0 -0 0 1
436 | 87 87 0
437 | 0.998357 -0.0380999 -0.0428072 0.494354
438 | -0.0324662 -0.99158 0.12536 0.0112342
439 | -0.047223 -0.123764 -0.991187 1.15666
440 | -0 -0 0 1
441 | 88 88 0
442 | 0.999382 -0.0341525 0.00827299 0.425785
443 | -0.0349141 -0.991708 0.123678 0.00562682
444 | 0.00398046 -0.123891 -0.992288 1.17133
445 | -0 -0 0 1
446 | 89 89 0
447 | 0.998177 -0.0292928 0.0527639 0.355174
448 | -0.0355851 -0.991822 0.122565 -0.00471376
449 | 0.0487421 -0.124219 -0.991057 1.18816
450 | -0 -0 0 1
451 | 90 90 0
452 | 0.998328 -0.0274685 0.0508615 0.289043
453 | -0.0334158 -0.992206 0.120042 -0.0114604
454 | 0.0471678 -0.12154 -0.991465 1.19457
455 | -0 -0 0 1
456 | 91 91 0
457 | 0.996043 -0.0284271 0.084207 0.224418
458 | -0.0368025 -0.994343 0.0996418 -0.0140143
459 | 0.0808981 -0.102346 -0.991454 1.18827
460 | -0 -0 0 1
461 | 92 92 0
462 | 0.989421 -0.0252712 0.142857 0.158913
463 | -0.0395414 -0.994405 0.0979524 -0.0152679
464 | 0.139583 -0.102565 -0.984884 1.18444
465 | -0 -0 0 1
466 | 93 93 0
467 | 0.979383 -0.0217039 0.200842 0.0934835
468 | -0.0421441 -0.994287 0.0980641 -0.0175489
469 | 0.197566 -0.104507 -0.974703 1.17438
470 | -0 -0 0 1
471 | 94 94 0
472 | 0.965433 -0.0175242 0.260062 0.0332161
473 | -0.04374 -0.994481 0.0953642 -0.0189089
474 | 0.256955 -0.103443 -0.960871 1.16685
475 | -0 -0 0 1
476 | 95 95 0
477 | 0.944205 -0.0141292 0.329055 -0.0272243
478 | -0.0446033 -0.995361 0.0852471 -0.0180987
479 | 0.326324 -0.0951677 -0.940455 1.14299
480 | -0 -0 0 1
481 | 96 96 0
482 | 0.919079 -0.0109313 0.393921 -0.0966937
483 | -0.0442567 -0.996154 0.0756146 -0.00385216
484 | 0.39158 -0.0869295 -0.916029 1.11986
485 | -0 -0 0 1
486 | 97 97 0
487 | 0.892366 -0.00732579 0.451253 -0.164228
488 | -0.0429964 -0.9967 0.0688459 0.0122862
489 | 0.44926 -0.080838 -0.889736 1.08377
490 | -0 -0 0 1
491 | 98 98 0
492 | 0.867099 -0.00776889 0.498076 -0.224349
493 | -0.0424501 -0.997394 0.0583442 0.0279992
494 | 0.496324 -0.0717335 -0.865169 1.0589
495 | -0 -0 0 1
496 | 99 99 0
497 | 0.83797 -0.0109412 0.545607 -0.278232
498 | -0.0422138 -0.998103 0.0448187 0.0385524
499 | 0.544082 -0.0605889 -0.836842 1.02824
500 | -0 -0 0 1
501 | 100 100 0
502 | 0.803592 -0.00414244 0.595167 -0.331108
503 | -0.0414401 -0.997938 0.0490065 0.040191
504 | 0.593737 -0.064045 -0.802107 1.0035
505 | -0 -0 0 1
506 | 101 101 0
507 | 0.767099 0.00833128 0.641474 -0.381857
508 | -0.0415136 -0.997175 0.0625946 0.0384232
509 | 0.640184 -0.0746462 -0.764587 0.974428
510 | -0 -0 0 1
511 | 102 102 0
512 | 0.727782 0.0444582 0.684366 -0.431107
513 | -0.0395164 -0.99352 0.106565 0.0312474
514 | 0.684669 -0.1046 -0.72131 0.93695
515 | -0 -0 0 1
516 | 103 103 0
517 | 0.685857 0.128129 0.716368 -0.484195
518 | -0.0378451 -0.976767 0.210938 0.023498
519 | 0.726751 -0.171784 -0.665073 0.898635
520 | -0 -0 0 1
521 | 104 104 0
522 | 0.64215 0.184821 0.743965 -0.534368
523 | -0.039668 -0.961189 0.273024 0.0171567
524 | 0.765552 -0.204834 -0.609896 0.853534
525 | -0 -0 0 1
526 | 105 105 0
527 | 0.594207 0.200924 0.778812 -0.58411
528 | -0.0466691 -0.958051 0.282772 0.0143701
529 | 0.802957 -0.204371 -0.559903 0.817022
530 | -0 -0 0 1
531 | 106 106 0
532 | 0.540383 0.204179 0.81627 -0.62365
533 | -0.0549092 -0.959486 0.276353 0.0149934
534 | 0.839626 -0.194157 -0.507279 0.767308
535 | -0 -0 0 1
536 | 107 107 0
537 | 0.485396 0.205947 0.849692 -0.6716
538 | -0.0618459 -0.961338 0.268338 0.0146787
539 | 0.872104 -0.1828 -0.453892 0.716899
540 | -0 -0 0 1
541 | 108 108 0
542 | 0.441335 0.161083 0.882766 -0.710156
543 | -0.0684513 -0.974846 0.212107 0.0173034
544 | 0.894728 -0.154037 -0.419208 0.667172
545 | -0 -0 0 1
546 | 109 109 0
547 | 0.391766 0.145886 0.908425 -0.753986
548 | -0.0732193 -0.979274 0.188841 0.0139447
549 | 0.917147 -0.140496 -0.372964 0.612714
550 | -0 -0 0 1
551 | 110 110 0
552 | 0.399766 0.141721 0.905595 -0.790008
553 | -0.0691246 -0.980501 0.183958 0.0146811
554 | 0.914007 -0.136139 -0.382174 0.549686
555 | -0 -0 0 1
556 | 111 111 0
557 | 0.428022 0.130646 0.894276 -0.824756
558 | -0.0684105 -0.981974 0.1762 0.01749
559 | 0.901176 -0.136595 -0.411369 0.488365
560 | -0 -0 0 1
561 | 112 112 0
562 | 0.453852 0.0824547 0.887254 -0.850888
563 | -0.0712153 -0.989168 0.128354 0.0174968
564 | 0.888227 -0.12144 -0.443064 0.42838
565 | -0 -0 0 1
566 | 113 113 0
567 | 0.40171 0.0348042 0.915105 -0.862162
568 | -0.0814308 -0.993962 0.0735496 0.0238467
569 | 0.912139 -0.104063 -0.396451 0.375454
570 | -0 -0 0 1
571 | 114 114 0
572 | 0.310477 0.0424912 0.949631 -0.86803
573 | -0.0856594 -0.993685 0.0724684 0.0237015
574 | 0.946714 -0.103845 -0.304877 0.345193
575 | -0 -0 0 1
576 | 115 115 0
577 | 0.0350809 0.0554594 0.997844 -0.892778
578 | -0.0906181 -0.99417 0.058441 0.0322259
579 | 0.995268 -0.092473 -0.0298508 0.327932
580 | -0 -0 0 1
581 | 116 116 0
582 | -0.120639 0.0544499 0.991202 -0.910976
583 | -0.078843 -0.995866 0.0451101 0.0473571
584 | 0.98956 -0.0727073 0.124434 0.297206
585 | -0 -0 0 1
586 | 117 117 0
587 | -0.184796 0.0511349 0.981446 -0.930647
588 | -0.0628638 -0.997215 0.0401199 0.0533248
589 | 0.980764 -0.0542834 0.187496 0.249659
590 | -0 -0 0 1
591 | 118 118 0
592 | -0.275302 0.0493424 0.960091 -0.943503
593 | -0.0511069 -0.998021 0.0366371 0.043924
594 | 0.959999 -0.038981 0.277278 0.183833
595 | -0 -0 0 1
596 | 119 119 0
597 | -0.327658 0.041837 0.94387 -0.954633
598 | -0.0410963 -0.998705 0.0300013 0.0364221
599 | 0.943902 -0.0289594 0.328953 0.117725
600 | -0 -0 0 1
601 | 120 120 0
602 | -0.348918 0.032371 0.936594 -0.940401
603 | -0.0330093 -0.999208 0.0222378 0.0328316
604 | 0.936572 -0.0231571 0.34971 0.0616939
605 | -0 -0 0 1
606 | 121 121 0
607 | -0.283528 0.025167 0.958634 -0.933158
608 | -0.0241443 -0.999526 0.0190995 0.0261574
609 | 0.95866 -0.0177302 0.284001 -0.00793485
610 | -0 -0 0 1
611 | 122 122 0
612 | -0.255144 0.0240202 0.966605 -0.944612
613 | -0.0216582 -0.999583 0.0191229 0.0175957
614 | 0.96666 -0.0160558 0.255558 -0.0728434
615 | -0 -0 0 1
616 | 123 123 0
617 | -0.276508 0.0270732 0.96063 -0.959872
618 | -0.020043 -0.999548 0.0224009 0.0112251
619 | 0.960803 -0.0130599 0.276926 -0.12586
620 | -0 -0 0 1
621 | 124 124 0
622 | -0.322395 0.0320569 0.946062 -0.949662
623 | -0.0183776 -0.99945 0.0276033 0.00897421
624 | 0.946427 -0.00848719 0.322807 -0.177363
625 | -0 -0 0 1
626 | 125 125 0
627 | -0.378874 0.0367424 0.924718 -0.923118
628 | -0.0149132 -0.999324 0.0335965 0.00305845
629 | 0.925328 -0.00106164 0.379166 -0.238887
630 | -0 -0 0 1
631 | 126 126 0
632 | -0.438295 0.0444114 0.897733 -0.893917
633 | -0.0114769 -0.998974 0.0438165 0.000803722
634 | 0.898758 0.00890137 0.438355 -0.306518
635 | -0 -0 0 1
636 | 127 127 0
637 | -0.505588 0.0541912 0.861071 -0.860971
638 | -0.0101339 -0.99833 0.0568792 -0.00210933
639 | 0.862715 0.0200315 0.505293 -0.377019
640 | -0 -0 0 1
641 | 128 128 0
642 | -0.566169 0.139629 0.812377 -0.827561
643 | -0.00805971 -0.986439 0.163929 0.0114771
644 | 0.82425 0.0862641 0.559617 -0.435878
645 | -0 -0 0 1
646 | 129 129 0
647 | -0.635405 0.258968 0.727459 -0.793122
648 | -0.00981277 -0.944717 0.327739 0.018747
649 | 0.772117 0.201108 0.602819 -0.486051
650 | -0 -0 0 1
651 | 130 130 0
652 | -0.689106 0.314054 0.653072 -0.771202
653 | -0.0149581 -0.907183 0.42047 0.0272539
654 | 0.724506 0.27998 0.629843 -0.540526
655 | -0 -0 0 1
656 | 131 131 0
657 | -0.732797 0.356083 0.579839 -0.732357
658 | -0.0228823 -0.864558 0.502012 0.0345685
659 | 0.680062 0.354605 0.641694 -0.585915
660 | -0 -0 0 1
661 | 132 132 0
662 | -0.768816 0.388604 0.507847 -0.702135
663 | -0.0351153 -0.818627 0.573252 0.0329951
664 | 0.638505 0.422892 0.643019 -0.641451
665 | -0 -0 0 1
666 | 133 133 0
667 | -0.798723 0.399706 0.449751 -0.668863
668 | -0.0499652 -0.788948 0.612425 0.0265392
669 | 0.599621 0.466686 0.650122 -0.699359
670 | -0 -0 0 1
671 | 134 134 0
672 | -0.824134 0.38976 0.410962 -0.63638
673 | -0.0667087 -0.78732 0.612925 0.0191093
674 | 0.562453 0.477718 0.674858 -0.754697
675 | -0 -0 0 1
676 | 135 135 0
677 | -0.848438 0.376573 0.371949 -0.593035
678 | -0.0848483 -0.790402 0.606684 0.0131852
679 | 0.52245 0.483174 0.702559 -0.806433
680 | -0 -0 0 1
681 | 136 136 0
682 | -0.877887 0.339134 0.338086 -0.529431
683 | -0.103061 -0.823272 0.558213 0.0170881
684 | 0.467645 0.455204 0.757692 -0.838826
685 | -0 -0 0 1
686 | 137 137 0
687 | -0.902651 0.302276 0.30635 -0.462294
688 | -0.105593 -0.845616 0.523243 0.0148791
689 | 0.417218 0.439957 0.795215 -0.869296
690 | -0 -0 0 1
691 | 138 138 0
692 | -0.924863 0.264489 0.273266 -0.392382
693 | -0.0998974 -0.862279 0.496483 0.0126698
694 | 0.366946 0.43188 0.823911 -0.888316
695 | -0 -0 0 1
696 | 139 139 0
697 | -0.945232 0.222095 0.239188 -0.330628
698 | -0.0885037 -0.87975 0.467127 0.00901911
699 | 0.314172 0.420374 0.851224 -0.92039
700 | -0 -0 0 1
701 | 140 140 0
702 | -0.966104 0.173796 0.190888 -0.26256
703 | -0.0745454 -0.89575 0.438263 0.00559754
704 | 0.247156 0.409178 0.878343 -0.955775
705 | -0 -0 0 1
706 | 141 141 0
707 | -0.98057 0.124851 0.151312 -0.195966
708 | -0.0542067 -0.913741 0.402665 -0.00183788
709 | 0.188533 0.386639 0.902754 -0.994686
710 | -0 -0 0 1
711 | 142 142 0
712 | -0.990201 0.0842095 0.111406 -0.129096
713 | -0.0363674 -0.92571 0.376483 -0.0132498
714 | 0.134833 0.368742 0.919701 -1.03545
715 | -0 -0 0 1
716 | 143 143 0
717 | -0.996156 0.0519468 0.0705387 -0.0652216
718 | -0.0233363 -0.933476 0.357881 -0.0191859
719 | 0.084437 0.354859 0.931099 -1.05291
720 | -0 -0 0 1
721 | 144 144 0
722 | -0.999257 0.024959 0.0293721 0.00104253
723 | -0.0130138 -0.935752 0.352419 -0.0268804
724 | 0.036281 0.351775 0.935381 -1.06731
725 | -0 -0 0 1
726 | 145 145 0
727 | -0.999063 -0.00499873 -0.0429898 0.0737583
728 | -0.0121792 -0.920694 0.390094 -0.0159793
729 | -0.0415304 0.390253 0.919771 -1.07593
730 | -0 -0 0 1
731 | 146 146 0
732 | -0.991237 -0.0513162 -0.121718 0.14735
733 | -0.0108378 -0.886755 0.462113 0.0128887
734 | -0.131648 0.459383 0.878428 -1.08223
735 | -0 -0 0 1
736 | 147 147 0
737 | -0.979293 -0.0917452 -0.180464 0.217199
738 | -0.00678749 -0.876036 0.482197 0.0332945
739 | -0.202332 0.473437 0.857274 -1.11471
740 | -0 -0 0 1
741 | 148 148 0
742 | -0.962879 -0.12293 -0.240318 0.286557
743 | -0.00816355 -0.876615 0.481123 0.027244
744 | -0.269811 0.465225 0.84307 -1.14459
745 | -0 -0 0 1
746 | 149 149 0
747 | -0.940114 -0.151252 -0.305465 0.362904
748 | -0.00908179 -0.884725 0.466025 0.0266107
749 | -0.34074 0.44089 0.830369 -1.15518
750 | -0 -0 0 1
751 | 150 150 0
752 | -0.918287 -0.177652 -0.353821 0.443665
753 | -0.00560658 -0.887753 0.460287 0.0241039
754 | -0.395876 0.424659 0.814216 -1.13968
755 | -0 -0 0 1
756 | 151 151 0
757 | -0.895802 -0.205062 -0.394319 0.5216
758 | -0.00131227 -0.885978 0.463725 0.0271379
759 | -0.44445 0.415923 0.793392 -1.10088
760 | -0 -0 0 1
761 | 152 152 0
762 | -0.872663 -0.228461 -0.431585 0.603201
763 | -0.000414918 -0.883462 0.468502 0.0299703
764 | -0.488323 0.409023 0.77087 -1.07441
765 | -0 -0 0 1
766 | 153 153 0
767 | -0.851048 -0.247975 -0.462846 0.684172
768 | -0.000403316 -0.881154 0.47283 0.0344998
769 | -0.525088 0.402588 0.749804 -1.04281
770 | -0 -0 0 1
771 | 154 154 0
772 | -0.823648 -0.266322 -0.500676 0.761986
773 | -0.00249571 -0.881158 0.472815 0.0344301
774 | -0.567096 0.390683 0.7251 -1.00364
775 | -0 -0 0 1
776 | 155 155 0
777 | -0.780432 -0.270116 -0.563883 0.840424
778 | -0.0121624 -0.895135 0.445629 0.0281265
779 | -0.625123 0.354641 0.695307 -0.974086
780 | -0 -0 0 1
781 | 156 156 0
782 | -0.717578 -0.284564 -0.635692 0.90319
783 | -0.0188686 -0.904447 0.426169 0.0298864
784 | -0.696222 0.317804 0.643642 -0.939415
785 | -0 -0 0 1
786 | 157 157 0
787 | -0.660443 -0.305387 -0.685969 0.962433
788 | -0.0166778 -0.907367 0.420009 0.0299591
789 | -0.750691 0.288832 0.594171 -0.901135
790 | -0 -0 0 1
791 | 158 158 0
792 | -0.605424 -0.326288 -0.725946 1.02075
793 | -0.0119281 -0.908282 0.418189 0.0351336
794 | -0.795814 0.261841 0.546003 -0.849859
795 | -0 -0 0 1
796 | 159 159 0
797 | -0.55731 -0.332276 -0.76092 1.07812
798 | -0.00400248 -0.915348 0.402643 0.0337205
799 | -0.830295 0.227442 0.508802 -0.792189
800 | -0 -0 0 1
801 | 160 160 0
802 | -0.513249 -0.341791 -0.787245 1.13766
803 | 0.00188269 -0.917725 0.397213 0.0371002
804 | -0.858238 0.202387 0.471665 -0.734682
805 | -0 -0 0 1
806 | 161 161 0
807 | -0.472938 -0.353921 -0.806889 1.19404
808 | 0.00610205 -0.917072 0.398674 0.0355498
809 | -0.881074 0.183624 0.435878 -0.671038
810 | -0 -0 0 1
811 | 162 162 0
812 | -0.433625 -0.370813 -0.82126 1.24585
813 | 0.00628199 -0.912625 0.408749 0.0299191
814 | -0.901072 0.172085 0.398066 -0.619935
815 | -0 -0 0 1
816 | 163 163 0
817 | -0.41996 -0.384833 -0.821911 1.30047
818 | 0.0126437 -0.908037 0.418699 0.0283432
819 | -0.907455 0.165445 0.386205 -0.565293
820 | -0 -0 0 1
821 | 164 164 0
822 | -0.471784 -0.380146 -0.795556 1.33693
823 | 0.0304352 -0.908767 0.416193 0.0303815
824 | -0.881189 0.17214 0.440311 -0.496561
825 | -0 -0 0 1
826 | 165 165 0
827 | -0.523993 -0.360171 -0.771821 1.36866
828 | 0.0112571 -0.909038 0.416561 0.0284847
829 | -0.851648 0.209587 0.480385 -0.422456
830 | -0 -0 0 1
831 | 166 166 0
832 | -0.535826 -0.35792 -0.764712 1.40733
833 | 0.00690981 -0.907532 0.419925 0.0219567
834 | -0.8443 0.219723 0.488753 -0.340851
835 | -0 -0 0 1
836 | 167 167 0
837 | -0.515962 -0.360006 -0.777289 1.43685
838 | -0.00580637 -0.90591 0.423431 5.90674e-05
839 | -0.856592 0.222988 0.465325 -0.278051
840 | -0 -0 0 1
841 | 168 168 0
842 | -0.473275 -0.372931 -0.798081 1.45762
843 | -0.0138127 -0.902715 0.430016 -0.013759
844 | -0.880807 0.21454 0.422081 -0.22288
845 | -0 -0 0 1
846 | 169 169 0
847 | -0.422059 -0.348513 -0.836902 1.46075
848 | -0.0170883 -0.919931 0.391707 -0.0222408
849 | -0.906407 0.179625 0.38231 -0.158982
850 | -0 -0 0 1
851 | 170 170 0
852 | -0.338431 -0.310079 -0.888434 1.45473
853 | -0.0238213 -0.941021 0.337507 -0.024359
854 | -0.94069 0.135387 0.311084 -0.095404
855 | -0 -0 0 1
856 | 171 171 0
857 | -0.196344 -0.318074 -0.927512 1.4413
858 | -0.03335 -0.94321 0.330517 -0.0163835
859 | -0.979968 0.0958275 0.174586 -0.0335003
860 | -0 -0 0 1
861 | 172 172 0
862 | -0.113045 -0.42998 -0.895733 1.44473
863 | -0.0151376 -0.900662 0.434256 -0.0140824
864 | -0.993475 0.0626499 0.0953068 0.0346016
865 | -0 -0 0 1
866 | 173 173 0
867 | -0.0497801 -0.541519 -0.839213 1.45041
868 | -0.0017132 -0.840207 0.542262 -0.0108339
869 | -0.998759 0.0284316 0.0408979 0.107118
870 | -0 -0 0 1
871 | 174 174 0
872 | 0.00964921 -0.586096 -0.810184 1.45333
873 | 0.00358174 -0.810197 0.586147 -0.0138378
874 | -0.999947 -0.00855773 -0.00571851 0.178702
875 | -0 -0 0 1
876 | 175 175 0
877 | -0.0147081 -0.589462 -0.807662 1.44137
878 | 0.003848 -0.807777 0.589476 -0.0112111
879 | -0.999884 0.0055622 0.0141491 0.239766
880 | -0 -0 0 1
881 | 176 176 0
882 | -0.0284361 -0.574932 -0.817707 1.43265
883 | -0.00686922 -0.817906 0.575311 -0.0145722
884 | -0.999572 0.0219766 0.0193087 0.30527
885 | -0 -0 0 1
886 | 177 177 0
887 | -0.0581113 -0.552211 -0.831677 1.42221
888 | -0.0148301 -0.832515 0.553804 -0.0126611
889 | -0.9982 0.0445161 0.0401891 0.369234
890 | -0 -0 0 1
891 | 178 178 0
892 | -0.067784 -0.512165 -0.856208 1.41156
893 | -0.0161349 -0.857507 0.514219 -0.0154437
894 | -0.99757 0.0486707 0.0498615 0.436593
895 | -0 -0 0 1
896 | 179 179 0
897 | -0.0283679 -0.486088 -0.873449 1.41262
898 | -0.0304778 -0.872974 0.486814 -0.0219568
899 | -0.999133 0.0404307 0.00994954 0.504122
900 | -0 -0 0 1
901 | 180 180 0
902 | 0.0708281 -0.475599 -0.876806 1.39674
903 | -0.0297708 -0.87963 0.474725 -0.0215512
904 | -0.997044 -0.00752066 -0.0764615 0.573902
905 | -0 -0 0 1
906 | 181 181 0
907 | 0.264104 -0.500356 -0.824556 1.37129
908 | -0.0178961 -0.857306 0.514496 -0.0143097
909 | -0.964328 -0.121124 -0.235372 0.639701
910 | -0 -0 0 1
911 | 182 182 0
912 | 0.422877 -0.530946 -0.734351 1.35383
913 | -0.0245955 -0.816801 0.576395 -0.0129335
914 | -0.905853 -0.225683 -0.358465 0.705908
915 | -0 -0 0 1
916 | 183 183 0
917 | 0.487722 -0.520142 -0.701127 1.32742
918 | -0.0300404 -0.812649 0.581979 -0.0149387
919 | -0.872482 -0.262782 -0.411972 0.759243
920 | -0 -0 0 1
921 | 184 184 0
922 | 0.547189 -0.481935 -0.684341 1.32727
923 | -0.0218237 -0.82554 0.563922 -0.0269433
924 | -0.836724 -0.293637 -0.462244 0.814594
925 | -0 -0 0 1
926 | 185 185 0
927 | 0.603824 -0.425038 -0.674344 1.32808
928 | -0.0201351 -0.853841 0.520145 -0.0333196
929 | -0.796864 -0.300498 -0.524127 0.874127
930 | -0 -0 0 1
931 | 186 186 0
932 | 0.692213 -0.360226 -0.625363 1.32212
933 | -0.0317397 -0.880878 0.472278 -0.0407519
934 | -0.720995 -0.307068 -0.621189 0.928614
935 | -0 -0 0 1
936 | 187 187 0
937 | 0.789287 -0.29945 -0.536056 1.28924
938 | -0.037181 -0.894727 0.445064 -0.0387786
939 | -0.612898 -0.331352 -0.71733 0.984763
940 | -0 -0 0 1
941 | 188 188 0
942 | 0.845025 -0.244435 -0.475588 1.25638
943 | -0.0250121 -0.906499 0.421467 -0.042008
944 | -0.534141 -0.344254 -0.772128 1.03945
945 | -0 -0 0 1
946 | 189 189 0
947 | 0.876642 -0.202931 -0.436254 1.2045
948 | -0.0068055 -0.911843 0.410483 -0.0380368
949 | -0.481095 -0.356878 -0.800741 1.05921
950 | -0 -0 0 1
951 | 190 190 0
952 | 0.889654 -0.181673 -0.418939 1.16799
953 | 0.00477306 -0.9137 0.406361 -0.0313274
954 | -0.45661 -0.36352 -0.81201 1.06303
955 | -0 -0 0 1
956 | 191 191 0
957 | 0.870567 -0.20122 -0.449024 1.13505
958 | -0.0053799 -0.916398 0.400233 -0.0314726
959 | -0.49202 -0.346014 -0.798868 1.05982
960 | -0 -0 0 1
961 | 192 192 0
962 | 0.838521 -0.231131 -0.493417 1.11506
963 | -0.0178488 -0.916736 0.399094 -0.0206389
964 | -0.544576 -0.325842 -0.772828 1.01263
965 | -0 -0 0 1
966 | 193 193 0
967 | 0.827629 -0.264301 -0.495152 1.08837
968 | -0.00602172 -0.88632 0.463033 -0.0128855
969 | -0.561244 -0.380238 -0.735136 0.975234
970 | -0 -0 0 1
971 | 194 194 0
972 | 0.813254 -0.317369 -0.487744 1.06475
973 | 0.00371124 -0.835334 0.54973 -0.00556105
974 | -0.581897 -0.448881 -0.678161 0.942716
975 | -0 -0 0 1
976 | 195 195 0
977 | 0.781322 -0.361638 -0.508678 1.03183
978 | 0.012461 -0.805821 0.592028 -0.000211094
979 | -0.624004 -0.468903 -0.6251 0.899089
980 | -0 -0 0 1
981 | 196 196 0
982 | 0.761784 -0.402502 -0.507619 1.0007
983 | 0.00527235 -0.77969 0.626144 0.00676174
984 | -0.64781 -0.479663 -0.591833 0.850335
985 | -0 -0 0 1
986 | 197 197 0
987 | 0.744576 -0.444273 -0.498225 0.960252
988 | -0.00264438 -0.74832 0.663333 0.0140231
989 | -0.667532 -0.492585 -0.558356 0.804536
990 | -0 -0 0 1
991 | 198 198 0
992 | 0.737131 -0.477786 -0.477869 0.924725
993 | -0.013952 -0.717779 0.696132 0.02178
994 | -0.675606 -0.506473 -0.535763 0.767664
995 | -0 -0 0 1
996 | 199 199 0
997 | 0.747409 -0.483253 -0.455901 0.880362
998 | -0.0294666 -0.70966 0.703928 0.0198614
999 | -0.66371 -0.512688 -0.544646 0.783861
1000 | -0 -0 0 1
1001 | 200 200 0
1002 | 0.770769 -0.471281 -0.428729 0.831821
1003 | -0.0427639 -0.709674 0.703231 0.0164781
1004 | -0.635677 -0.523695 -0.567149 0.804925
1005 | -0 -0 0 1
1006 | 201 201 0
1007 | 0.801163 -0.449491 -0.395088 0.792978
1008 | -0.0508253 -0.708912 0.703464 0.015172
1009 | -0.596283 -0.543509 -0.5908 0.834763
1010 | -0 -0 0 1
1011 | 202 202 0
1012 | 0.848944 -0.405438 -0.338989 0.78739
1013 | -0.0609637 -0.712287 0.699236 0.00890429
1014 | -0.524954 -0.572946 -0.629409 0.876129
1015 | -0 -0 0 1
1016 | 203 203 0
1017 | 0.909096 -0.315243 -0.272334 0.774296
1018 | -0.0470069 -0.727179 0.684836 -0.000463806
1019 | -0.413926 -0.60978 -0.675894 0.932842
1020 | -0 -0 0 1
1021 | 204 204 0
1022 | 0.941453 -0.253198 -0.222612 0.763754
1023 | -0.0506584 -0.759033 0.649079 -0.00961452
1024 | -0.333315 -0.5998 -0.727421 0.992548
1025 | -0 -0 0 1
1026 | 205 205 0
1027 | 0.966121 -0.192704 -0.171685 0.736041
1028 | -0.0511806 -0.79505 0.604381 -0.0151156
1029 | -0.252965 -0.575118 -0.777977 1.04578
1030 | -0 -0 0 1
1031 | 206 206 0
1032 | 0.981054 -0.138801 -0.135158 0.709523
1033 | -0.0412557 -0.831316 0.554267 -0.0192595
1034 | -0.189292 -0.53819 -0.821292 1.10055
1035 | -0 -0 0 1
1036 | 207 207 0
1037 | 0.990537 -0.0960559 -0.098032 0.688519
1038 | -0.0332812 -0.861056 0.507419 -0.024738
1039 | -0.133152 -0.499355 -0.856105 1.15644
1040 | -0 -0 0 1
1041 | 208 208 0
1042 | 0.99582 -0.0583868 -0.070235 0.658258
1043 | -0.0222613 -0.900956 0.43334 -0.0306109
1044 | -0.0885799 -0.429965 -0.89849 1.21513
1045 | -0 -0 0 1
1046 | 209 209 0
1047 | 0.99859 -0.0325006 -0.0419652 0.643318
1048 | -0.0166975 -0.942829 0.332859 -0.0356972
1049 | -0.0503841 -0.331689 -0.942042 1.27991
1050 | -0 -0 0 1
1051 | 210 210 0
1052 | 0.999525 -0.0150773 -0.0268808 0.639133
1053 | -0.00833264 -0.971889 0.235291 -0.0175595
1054 | -0.0296728 -0.234955 -0.971553 1.33934
1055 | -0 -0 0 1
1056 | 211 211 0
1057 | 0.999996 0.000879476 0.00274552 0.631878
1058 | 0.000255254 -0.975602 0.219545 -0.00375967
1059 | 0.00287162 -0.219543 -0.975598 1.39112
1060 | -0 -0 0 1
1061 | 212 212 0
1062 | 0.999044 0.0154823 0.0408852 0.602806
1063 | 0.00817673 -0.984863 0.173143 -0.00950962
1064 | 0.042947 -0.172644 -0.984048 1.43972
1065 | -0 -0 0 1
1066 | 213 213 0
1067 | 0.995696 0.0262 0.0889001 0.557104
1068 | 0.0183015 -0.995906 0.088527 -0.0124151
1069 | 0.0908555 -0.086519 -0.992099 1.47179
1070 | -0 -0 0 1
1071 | 214 214 0
1072 | 0.991144 0.0236644 0.130664 0.517888
1073 | 0.0289541 -0.99883 -0.0387333 -0.0193053
1074 | 0.129594 0.0421736 -0.99067 1.49644
1075 | -0 -0 0 1
1076 | 215 215 0
1077 | 0.984036 0.0333999 0.174808 0.445597
1078 | 0.0452309 -0.996915 -0.0641391 -0.0203518
1079 | 0.172126 0.0710219 -0.982511 1.48279
1080 | -0 -0 0 1
1081 | 216 216 0
1082 | 0.974211 0.0479819 0.220479 0.366793
1083 | 0.0623181 -0.996339 -0.0585304 -0.0210373
1084 | 0.216863 0.0707607 -0.973634 1.45856
1085 | -0 -0 0 1
1086 | 217 217 0
1087 | 0.961448 0.0625295 0.267784 0.289372
1088 | 0.0785561 -0.995678 -0.0495486 -0.0190553
1089 | 0.263528 0.0686744 -0.962204 1.44004
1090 | -0 -0 0 1
1091 | 218 218 0
1092 | 0.948473 0.0743588 0.308009 0.217428
1093 | 0.0908351 -0.995083 -0.039484 -0.0112976
1094 | 0.303558 0.0654275 -0.950564 1.42367
1095 | -0 -0 0 1
1096 | 219 219 0
1097 | 0.948908 0.0818094 0.304763 0.143577
1098 | 0.0974335 -0.994577 -0.0363878 -0.0130482
1099 | 0.300134 0.0642229 -0.951733 1.40232
1100 | -0 -0 0 1
1101 | 220 220 0
1102 | 0.946797 0.0808174 0.31152 0.0785981
1103 | 0.0965486 -0.994699 -0.0353842 -0.0213527
1104 | 0.307009 0.0635784 -0.949581 1.38421
1105 | -0 -0 0 1
1106 | 221 221 0
1107 | 0.937856 0.0951283 0.333731 0.0282065
1108 | 0.0846529 -0.995356 0.0458282 -0.0171424
1109 | 0.336541 -0.0147289 -0.941554 1.35983
1110 | -0 -0 0 1
1111 | 222 222 0
1112 | 0.925237 0.122576 0.359043 0.00772055
1113 | 0.0594736 -0.98153 0.18183 -0.00522308
1114 | 0.3747 -0.146882 -0.915437 1.31942
1115 | -0 -0 0 1
1116 | 223 223 0
1117 | 0.910459 0.177038 0.373793 -0.0166112
1118 | 0.0351587 -0.933616 0.356546 0.013328
1119 | 0.412101 -0.311478 -0.856244 1.27712
1120 | -0 -0 0 1
1121 | 224 224 0
1122 | 0.892629 0.250686 0.37466 -0.0403788
1123 | 0.0130303 -0.845116 0.534425 0.0253875
1124 | 0.450604 -0.472161 -0.757641 1.24405
1125 | -0 -0 0 1
1126 | 225 225 0
1127 | 0.874117 0.29938 0.382479 -0.0785439
1128 | 0.000566017 -0.788084 0.615568 0.0301679
1129 | 0.485715 -0.537862 -0.689047 1.19972
1130 | -0 -0 0 1
1131 | 226 226 0
1132 | 0.851108 0.332511 0.406266 -0.115817
1133 | -0.0124097 -0.760895 0.648757 0.033101
1134 | 0.524844 -0.557204 -0.643477 1.15237
1135 | -0 -0 0 1
1136 | 227 227 0
1137 | 0.82296 0.371069 0.430167 -0.166907
1138 | -0.02763 -0.730165 0.682712 0.030798
1139 | 0.567426 -0.573731 -0.590644 1.14379
1140 | -0 -0 0 1
1141 | 228 228 0
1142 | 0.790616 0.397001 0.466172 -0.226492
1143 | -0.0382029 -0.727865 0.684656 0.0253854
1144 | 0.611119 -0.559109 -0.560295 1.12937
1145 | -0 -0 0 1
1146 | 229 229 0
1147 | 0.737831 0.42666 0.523036 -0.265034
1148 | -0.0567571 -0.732924 0.677939 0.0252728
1149 | 0.672595 -0.52989 -0.516558 1.08621
1150 | -0 -0 0 1
1151 | 230 230 0
1152 | 0.692175 0.454551 0.560605 -0.3051
1153 | -0.0613534 -0.736881 0.673232 0.0263279
1154 | 0.719117 -0.50039 -0.482162 1.03982
1155 | -0 -0 0 1
1156 | 231 231 0
1157 | 0.654359 0.456387 0.60293 -0.342021
1158 | -0.0656041 -0.760063 0.646529 0.0282515
1159 | 0.753332 -0.462617 -0.467414 0.999711
1160 | -0 -0 0 1
1161 | 232 232 0
1162 | 0.621042 0.397664 0.675404 -0.385533
1163 | -0.071379 -0.829452 0.553998 0.0475301
1164 | 0.78052 -0.392266 -0.48674 0.967359
1165 | -0 -0 0 1
1166 | 233 233 0
1167 | 0.591737 0.352329 0.72506 -0.421808
1168 | -0.0758373 -0.871113 0.485192 0.0538234
1169 | 0.802556 -0.342093 -0.48875 0.921093
1170 | -0 -0 0 1
1171 | 234 234 0
1172 | 0.55607 0.34368 0.75675 -0.478238
1173 | -0.0782733 -0.8848 0.45935 0.0512632
1174 | 0.827442 -0.314664 -0.465109 0.8797
1175 | -0 -0 0 1
1176 | 235 235 0
1177 | 0.507999 0.349003 0.787486 -0.543784
1178 | -0.0788486 -0.891558 0.445991 0.0492788
1179 | 0.857741 -0.288655 -0.425392 0.847501
1180 | -0 -0 0 1
1181 | 236 236 0
1182 | 0.45904 0.32343 0.827451 -0.587545
1183 | -0.0776535 -0.913207 0.400029 0.0535004
1184 | 0.885016 -0.247884 -0.394083 0.79737
1185 | -0 -0 0 1
1186 | 237 237 0
1187 | 0.398052 0.296795 0.868025 -0.633399
1188 | -0.0745884 -0.932614 0.353084 0.0655964
1189 | 0.914325 -0.20529 -0.349091 0.745563
1190 | -0 -0 0 1
1191 | 238 238 0
1192 | 0.358613 0.284382 0.889114 -0.659639
1193 | -0.063524 -0.942823 0.327183 0.0716066
1194 | 0.931322 -0.173812 -0.320044 0.68482
1195 | -0 -0 0 1
1196 | 239 239 0
1197 | 0.34488 0.267016 0.899867 -0.700522
1198 | -0.0490107 -0.952255 0.301344 0.0744502
1199 | 0.937366 -0.148031 -0.315327 0.62187
1200 | -0 -0 0 1
1201 | 240 240 0
1202 | 0.340874 0.268059 0.901082 -0.740375
1203 | -0.0378711 -0.953794 0.298066 0.0662616
1204 | 0.939346 -0.135728 -0.314972 0.569136
1205 | -0 -0 0 1
1206 | 241 241 0
1207 | 0.310068 0.267065 0.912433 -0.779814
1208 | -0.0358839 -0.955763 0.291941 0.0538657
1209 | 0.950037 -0.123263 -0.286768 0.523022
1210 | -0 -0 0 1
1211 | 242 242 0
1212 | 0.273221 0.250857 0.928666 -0.833212
1213 | -0.0323635 -0.962455 0.269506 0.0310876
1214 | 0.961407 -0.10369 -0.254844 0.49975
1215 | -0 -0 0 1
1216 | 243 243 0
1217 | 0.232672 0.193721 0.953067 -0.899527
1218 | -0.0303952 -0.978034 0.206216 0.0187138
1219 | 0.97208 -0.0769494 -0.221673 0.501681
1220 | -0 -0 0 1
1221 | 244 244 0
1222 | 0.189111 0.0584237 0.980216 -0.948541
1223 | -0.026887 -0.997546 0.0646438 0.0247439
1224 | 0.981588 -0.0385799 -0.187076 0.500789
1225 | -0 -0 0 1
1226 | 245 245 0
1227 | 0.149548 -0.038469 0.988006 -0.996628
1228 | -0.0206302 -0.999147 -0.0357802 0.0464223
1229 | 0.988539 -0.0150319 -0.150214 0.506496
1230 | -0 -0 0 1
1231 | 246 246 0
1232 | 0.109823 -0.0631013 0.991946 -1.05162
1233 | -0.00937557 -0.998004 -0.0624487 0.0510183
1234 | 0.993907 -0.00244175 -0.110196 0.527482
1235 | -0 -0 0 1
1236 | 247 247 0
1237 | 0.0721324 -0.064007 0.995339 -1.10214
1238 | 0.00609441 -0.997892 -0.0646128 0.0477031
1239 | 0.997376 0.0107267 -0.0715903 0.526667
1240 | -0 -0 0 1
1241 | 248 248 0
1242 | 0.0348307 -0.0518743 0.998046 -1.15694
1243 | 0.0219572 -0.998371 -0.0526575 0.0455528
1244 | 0.999152 0.0237484 -0.033635 0.521725
1245 | -0 -0 0 1
1246 | 249 249 0
1247 | -0.00087525 0.0259796 0.999662 -1.20847
1248 | 0.0395666 -0.998879 0.0259939 0.0328415
1249 | 0.999217 0.039576 -0.000153655 0.488142
1250 | -0 -0 0 1
1251 | 250 250 0
1252 | -0.0428759 0.159694 0.986235 -1.25398
1253 | 0.0564382 -0.985179 0.161977 0.0255155
1254 | 0.997485 0.0626062 0.0332276 0.45165
1255 | -0 -0 0 1
1256 | 251 251 0
1257 | -0.0882702 0.224081 0.970565 -1.29698
1258 | 0.0646493 -0.971025 0.230067 0.0145522
1259 | 0.993996 0.0830544 0.0712259 0.412247
1260 | -0 -0 0 1
1261 | 252 252 0
1262 | -0.132514 0.278975 0.951111 -1.32369
1263 | 0.0677213 -0.954783 0.289487 0.0126898
1264 | 0.988865 0.102772 0.10763 0.362439
1265 | -0 -0 0 1
1266 | 253 253 0
1267 | -0.173427 0.338221 0.924948 -1.36421
1268 | 0.0633974 -0.933398 0.353198 0.0122069
1269 | 0.982804 0.119893 0.140434 0.325536
1270 | -0 -0 0 1
1271 | 254 254 0
1272 | -0.215107 0.391322 0.89476 -1.38979
1273 | 0.0556163 -0.909813 0.411276 0.0108379
1274 | 0.975005 0.138232 0.173944 0.274786
1275 | -0 -0 0 1
1276 | 255 255 0
1277 | -0.258955 0.427648 0.86606 -1.43767
1278 | 0.0411087 -0.890953 0.452231 0.00614699
1279 | 0.965014 0.15271 0.213137 0.243446
1280 | -0 -0 0 1
1281 | 256 256 0
1282 | -0.304533 0.423052 0.853397 -1.47367
1283 | 0.0254049 -0.892027 0.451267 0.00185467
1284 | 0.952163 0.159106 0.260904 0.203788
1285 | -0 -0 0 1
1286 | 257 257 0
1287 | -0.342253 0.409888 0.845491 -1.48433
1288 | 0.0182096 -0.896771 0.44212 -0.000511412
1289 | 0.939431 0.166713 0.299459 0.156218
1290 | -0 -0 0 1
1291 | 258 258 0
1292 | -0.364535 0.387261 0.846843 -1.47529
1293 | 0.0149411 -0.906871 0.421143 -0.00165634
1294 | 0.93107 0.166174 0.3248 0.101218
1295 | -0 -0 0 1
1296 | 259 259 0
1297 | -0.333369 0.380105 0.862778 -1.45745
1298 | 0.0219428 -0.91175 0.410158 -0.004109
1299 | 0.942541 0.155666 0.295608 0.0265435
1300 | -0 -0 0 1
1301 | 260 260 0
1302 | -0.280775 0.356109 0.891264 -1.44181
1303 | 0.021998 -0.925987 0.376913 -0.00190389
1304 | 0.959522 0.125434 0.25216 -0.0467792
1305 | -0 -0 0 1
1306 | 261 261 0
1307 | -0.227506 0.355321 0.906635 -1.44754
1308 | 0.013149 -0.929845 0.367717 -0.00741619
1309 | 0.973688 0.0955792 0.206873 -0.129788
1310 | -0 -0 0 1
1311 | 262 262 0
1312 | -0.192873 0.35956 0.912971 -1.44206
1313 | -0.00245465 -0.930615 0.36599 -0.0125629
1314 | 0.981221 0.0683487 0.180373 -0.217279
1315 | -0 -0 0 1
1316 | 263 263 0
1317 | -0.20724 0.356092 0.91118 -1.42542
1318 | -0.0175384 -0.932604 0.360476 -0.00895727
1319 | 0.978133 0.0587244 0.199518 -0.292307
1320 | -0 -0 0 1
1321 | 264 264 0
1322 | -0.209814 0.349947 0.91297 -1.39131
1323 | -0.0231072 -0.935269 0.353184 0.00976658
1324 | 0.977468 0.0530069 0.204319 -0.369657
1325 | -0 -0 0 1
1326 | 265 265 0
1327 | -0.288866 0.347769 0.891972 -1.36128
1328 | -0.0315054 -0.934638 0.354201 0.0149727
1329 | 0.956851 0.0742147 0.280942 -0.452167
1330 | -0 -0 0 1
1331 | 266 266 0
1332 | -0.412201 0.330676 0.848966 -1.33224
1333 | -0.0147409 -0.93411 0.356682 0.00929703
1334 | 0.910974 0.13451 0.389915 -0.530995
1335 | -0 -0 0 1
1336 | 267 267 0
1337 | -0.490883 0.315975 0.811908 -1.30009
1338 | -0.0118241 -0.934245 0.356436 0.00410789
1339 | 0.871145 0.165368 0.462341 -0.609709
1340 | -0 -0 0 1
1341 | 268 268 0
1342 | -0.5773 0.29997 0.759436 -1.25881
1343 | -0.0159886 -0.934049 0.356786 0.00107158
1344 | 0.816376 0.19383 0.544022 -0.67973
1345 | -0 -0 0 1
1346 | 269 269 0
1347 | -0.659634 0.280369 0.697335 -1.21147
1348 | -0.0161597 -0.932893 0.359791 -0.000498097
1349 | 0.751413 0.226062 0.619899 -0.746852
1350 | -0 -0 0 1
1351 | 270 270 0
1352 | -0.733507 0.269208 0.624095 -1.15875
1353 | -0.0137943 -0.923924 0.382328 -0.00501337
1354 | 0.679542 0.271832 0.681418 -0.810896
1355 | -0 -0 0 1
1356 | 271 271 0
1357 | -0.787739 0.268996 0.554174 -1.10741
1358 | -0.00687892 -0.903405 0.428734 -0.0107761
1359 | 0.615971 0.333918 0.713497 -0.879441
1360 | -0 -0 0 1
1361 | 272 272 0
1362 | -0.820367 0.2732 0.502355 -1.06036
1363 | 0.00111194 -0.877728 0.479157 0.00203535
1364 | 0.571837 0.393643 0.719755 -0.947488
1365 | -0 -0 0 1
1366 | 273 273 0
1367 | -0.84508 0.252883 0.471052 -1.01107
1368 | 0.00224706 -0.879376 0.476122 0.00216885
1369 | 0.534635 0.40342 0.742575 -1.01928
1370 | -0 -0 0 1
1371 | 274 274 0
1372 | -0.851767 0.23133 0.470085 -0.969498
1373 | 0.00792783 -0.89145 0.45305 -0.00725494
1374 | 0.523861 0.38962 0.757473 -1.08728
1375 | -0 -0 0 1
1376 | 275 275 0
1377 | -0.852036 0.234234 0.468155 -0.919488
1378 | -0.0202578 -0.908391 0.417631 -0.0186962
1379 | 0.523091 0.346353 0.778727 -1.15474
1380 | -0 -0 0 1
1381 | 276 276 0
1382 | -0.871213 0.217539 0.440074 -0.877937
1383 | -0.0269014 -0.916262 0.399674 -0.0299593
1384 | 0.490168 0.336363 0.804112 -1.22139
1385 | -0 -0 0 1
1386 | 277 277 0
1387 | -0.890796 0.201692 0.407188 -0.823204
1388 | -0.0289729 -0.919482 0.392064 -0.0411009
1389 | 0.453478 0.337451 0.824915 -1.2777
1390 | -0 -0 0 1
1391 | 278 278 0
1392 | -0.910441 0.184804 0.370061 -0.767125
1393 | -0.029801 -0.921627 0.386932 -0.0505822
1394 | 0.412565 0.34125 0.844594 -1.34116
1395 | -0 -0 0 1
1396 | 279 279 0
1397 | -0.929336 0.164447 0.330593 -0.708956
1398 | -0.0282839 -0.92442 0.380325 -0.0557223
1399 | 0.36815 0.344099 0.863749 -1.37274
1400 | -0 -0 0 1
1401 | 280 280 0
1402 | -0.946601 0.143777 0.288574 -0.652475
1403 | -0.0258538 -0.926027 0.376571 -0.0668604
1404 | 0.32137 0.349001 0.880295 -1.4051
1405 | -0 -0 0 1
1406 | 281 281 0
1407 | -0.959983 0.123675 0.251271 -0.599574
1408 | -0.0219817 -0.927716 0.372639 -0.0860744
1409 | 0.279194 0.352203 0.89331 -1.43654
1410 | -0 -0 0 1
1411 | 282 282 0
1412 | -0.968958 0.10463 0.223993 -0.568215
1413 | -0.0201662 -0.936459 0.350196 -0.103896
1414 | 0.246401 0.334809 0.9095 -1.48171
1415 | -0 -0 0 1
1416 | 283 283 0
1417 | -0.968555 0.0938092 0.230436 -0.545326
1418 | -0.0120561 -0.942802 0.333135 -0.109342
1419 | 0.248507 0.319882 0.914287 -1.53102
1420 | -0 -0 0 1
1421 | 284 284 0
1422 | -0.963291 0.094611 0.251237 -0.54502
1423 | -0.00889846 -0.94658 0.322346 -0.120409
1424 | 0.268313 0.308277 0.912674 -1.58389
1425 | -0 -0 0 1
1426 | 285 285 0
1427 | -0.955053 0.0886041 0.282884 -0.545353
1428 | -0.00834534 -0.961943 0.273122 -0.129471
1429 | 0.296318 0.258485 0.919446 -1.64824
1430 | -0 -0 0 1
1431 | 286 286 0
1432 | -0.949773 0.0759323 0.303587 -0.542622
1433 | -0.0099934 -0.97698 0.213095 -0.13836
1434 | 0.312779 0.199358 0.928669 -1.70483
1435 | -0 -0 0 1
1436 | 287 287 0
1437 | -0.96029 0.0636066 0.271657 -0.535526
1438 | -0.0143264 -0.983623 0.179665 -0.149818
1439 | 0.278636 0.168639 0.945475 -1.76037
1440 | -0 -0 0 1
1441 | 288 288 0
1442 | -0.969277 0.0484024 0.241164 -0.533089
1443 | -0.00846102 -0.986428 0.163973 -0.143293
1444 | 0.245827 0.156895 0.956532 -1.81833
1445 | -0 -0 0 1
1446 | 289 289 0
1447 | -0.978272 0.0368965 0.204017 -0.534208
1448 | -0.00136476 -0.985162 0.171623 -0.128789
1449 | 0.207322 0.167615 0.963806 -1.87129
1450 | -0 -0 0 1
1451 | 290 290 0
1452 | -0.983591 0.0239793 0.17881 -0.529421
1453 | 0.00953651 -0.982831 0.18426 -0.131538
1454 | 0.180159 0.182942 0.966476 -1.91571
1455 | -0 -0 0 1
1456 | 291 291 0
1457 | -0.990742 0.010867 0.135325 -0.538412
1458 | 0.0153337 -0.981456 0.191075 -0.134494
1459 | 0.134892 0.191381 0.972202 -1.93085
1460 | -0 -0 0 1
1461 | 292 292 0
1462 | -0.995577 -0.00600877 0.0937582 -0.541944
1463 | 0.0244206 -0.980201 0.196493 -0.12626
1464 | 0.0907213 0.197913 0.976012 -1.93481
1465 | -0 -0 0 1
1466 | 293 293 0
1467 | -0.995435 -0.0124062 0.0946372 -0.539813
1468 | 0.04069 -0.952062 0.303187 -0.129763
1469 | 0.0863391 0.305653 0.94822 -1.93808
1470 | -0 -0 0 1
1471 | 294 294 0
1472 | -0.974495 0.0413968 0.220558 -0.537166
1473 | 0.0492045 -0.919506 0.389985 -0.136952
1474 | 0.218948 0.390891 0.894017 -1.94342
1475 | -0 -0 0 1
1476 | 295 295 0
1477 | -0.91691 0.127342 0.378232 -0.533845
1478 | 0.0329242 -0.920362 0.38968 -0.143143
1479 | 0.397732 0.369755 0.839697 -1.94234
1480 | -0 -0 0 1
1481 | 296 296 0
1482 | -0.886964 0.158553 0.433769 -0.536475
1483 | 0.0260146 -0.920579 0.389688 -0.150593
1484 | 0.461105 0.356924 0.812396 -1.93622
1485 | -0 -0 0 1
1486 | 297 297 0
1487 | -0.908951 0.14233 0.391854 -0.53926
1488 | 0.0222034 -0.922058 0.386415 -0.160865
1489 | 0.416311 0.359932 0.834945 -1.93759
1490 | -0 -0 0 1
1491 | 298 298 0
1492 | -0.927146 0.11424 0.356861 -0.541813
1493 | 0.0150429 -0.940274 0.340087 -0.165157
1494 | 0.374399 0.320679 0.870052 -1.93534
1495 | -0 -0 0 1
1496 | 299 299 0
1497 | -0.941502 0.0781221 0.327828 -0.543696
1498 | 0.0105196 -0.965474 0.260286 -0.165841
1499 | 0.336843 0.248509 0.908174 -1.93116
1500 | -0 -0 0 1
1501 | 300 300 0
1502 | -0.950853 0.0608904 0.303598 -0.54242
1503 | 0.00764763 -0.975557 0.219612 -0.164951
1504 | 0.30955 0.21114 0.927145 -1.93178
1505 | -0 -0 0 1
1506 | 301 301 0
1507 | -0.947695 0.0538137 0.314607 -0.541903
1508 | 0.0137453 -0.977889 0.208673 -0.164456
1509 | 0.318881 0.202083 0.926001 -1.93492
1510 | -0 -0 0 1
1511 |
--------------------------------------------------------------------------------
/dataloader/datalist/tanks/logs/Horse.log:
--------------------------------------------------------------------------------
1 | 0 0 0
2 | 0.333487 0.0582181 -0.940956 0.246049
3 | -0.0576322 -0.994966 -0.0819853 -0.00593322
4 | -0.940992 0.0815704 -0.328452 0.0535024
5 | -0 -0 0 1
6 | 1 1 0
7 | 0.245471 0.0784569 -0.966224 0.24422
8 | -0.0567507 -0.993847 -0.0951175 -0.00553358
9 | -0.967741 0.0781824 -0.239508 0.0451454
10 | -0 -0 0 1
11 | 2 2 0
12 | 0.181022 0.089019 -0.979442 0.238977
13 | -0.0545094 -0.993456 -0.100367 -0.00333249
14 | -0.981967 0.0715575 -0.174985 0.0303806
15 | -0 -0 0 1
16 | 3 3 0
17 | 0.0803272 0.095014 -0.99223 0.244623
18 | -0.0539963 -0.99357 -0.0995137 -0.00339126
19 | -0.995305 0.0615704 -0.0746803 0.00455779
20 | -0 -0 0 1
21 | 4 4 0
22 | 0.0268262 0.0964456 -0.994977 0.238126
23 | -0.0496303 -0.993979 -0.097687 -0.00272161
24 | -0.998407 0.0520015 -0.0218781 -0.0129911
25 | -0 -0 0 1
26 | 5 5 0
27 | 0.0313773 0.0908639 -0.995369 0.248045
28 | -0.0478239 -0.994582 -0.0922996 -0.00339736
29 | -0.998363 0.0504985 -0.0268619 -0.0343998
30 | -0 -0 0 1
31 | 6 6 0
32 | 0.0339195 0.0909057 -0.995282 0.245212
33 | -0.0406015 -0.994907 -0.0922552 -0.00456413
34 | -0.9986 0.0435391 -0.0300559 -0.035359
35 | -0 -0 0 1
36 | 7 7 0
37 | 0.0541951 0.0883495 -0.994614 0.24396
38 | 0.00521888 -0.996089 -0.0881962 -0.00458477
39 | -0.998517 -0.000410973 -0.0544442 -0.0356513
40 | -0 -0 0 1
41 | 8 8 0
42 | 0.00160017 0.092262 -0.995733 0.246206
43 | 0.00933493 -0.995693 -0.0922432 -0.0030711
44 | -0.999955 -0.0091475 -0.00245453 -0.0452027
45 | -0 -0 0 1
46 | 9 9 0
47 | -0.191518 0.100364 -0.976344 0.250039
48 | 0.00823035 -0.994559 -0.103851 0.00245893
49 | -0.981455 -0.027925 0.189649 -0.0649545
50 | -0 -0 0 1
51 | 10 10 0
52 | -0.254294 0.0351352 -0.966488 0.249677
53 | 0.0096996 -0.999197 -0.0388764 0.0109627
54 | -0.967078 -0.0192606 0.253749 -0.0955581
55 | -0 -0 0 1
56 | 11 11 0
57 | -0.306477 -0.0235415 -0.951587 0.237457
58 | 0.00240052 -0.99971 0.0239589 0.0123132
59 | -0.951875 0.00505856 0.306445 -0.12886
60 | -0 -0 0 1
61 | 12 12 0
62 | -0.426398 -0.032709 -0.903944 0.226657
63 | -0.00781535 -0.999175 0.0398415 0.0106603
64 | -0.904502 0.024053 0.425791 -0.163565
65 | -0 -0 0 1
66 | 13 13 0
67 | -0.539164 -0.0295002 -0.841684 0.201342
68 | -0.0198172 -0.998665 0.0476967 0.0111544
69 | -0.841968 0.0423962 0.53786 -0.194878
70 | -0 -0 0 1
71 | 14 14 0
72 | -0.687231 -0.0149038 -0.726286 0.16897
73 | -0.0282519 -0.998485 0.0472222 0.0103959
74 | -0.725889 0.0529715 0.685769 -0.213991
75 | -0 -0 0 1
76 | 15 15 0
77 | -0.813985 0.00129749 -0.580885 0.135916
78 | -0.0368413 -0.9981 0.0493957 0.0089236
79 | -0.579717 0.0616078 0.812485 -0.236059
80 | -0 -0 0 1
81 | 16 16 0
82 | -0.882517 0.0160713 -0.470005 0.102817
83 | -0.0459638 -0.997579 0.0521941 0.00921122
84 | -0.468028 0.0676654 0.881119 -0.248775
85 | -0 -0 0 1
86 | 17 17 0
87 | -0.9374 0.0306253 -0.346904 0.0719323
88 | -0.0519501 -0.997277 0.0523376 0.00860293
89 | -0.344357 0.067083 0.936439 -0.259363
90 | -0 -0 0 1
91 | 18 18 0
92 | -0.981758 0.0426464 -0.185291 0.0305246
93 | -0.0537147 -0.997033 0.055129 0.00670076
94 | -0.182391 0.0640762 0.981136 -0.271075
95 | -0 -0 0 1
96 | 19 19 0
97 | -0.996572 0.0538889 -0.0627672 -0.0110904
98 | -0.0572727 -0.996928 0.053419 0.00592666
99 | -0.0596957 0.0568308 0.996598 -0.271277
100 | -0 -0 0 1
101 | 20 20 0
102 | -0.994751 0.0611132 0.0820749 -0.0405925
103 | -0.0575234 -0.997311 0.0454144 0.00423025
104 | 0.0846296 0.0404547 0.995591 -0.263326
105 | -0 -0 0 1
106 | 21 21 0
107 | -0.969858 0.0626008 0.235492 -0.0724779
108 | -0.0547098 -0.997705 0.0399012 0.0030372
109 | 0.237449 0.0258148 0.971057 -0.256907
110 | -0 -0 0 1
111 | 22 22 0
112 | -0.901621 0.0602879 0.428306 -0.0907017
113 | -0.0514441 -0.998156 0.0322053 0.00306936
114 | 0.429458 0.00700317 0.90306 -0.231912
115 | -0 -0 0 1
116 | 23 23 0
117 | -0.789255 0.0554234 0.611559 -0.114588
118 | -0.0490426 -0.998426 0.0271912 0.00138223
119 | 0.612104 -0.0085316 0.790732 -0.214537
120 | -0 -0 0 1
121 | 24 24 0
122 | -0.695018 0.0500982 0.717245 -0.129214
123 | -0.047456 -0.998591 0.0237643 0.00122257
124 | 0.717425 -0.017521 0.696415 -0.196706
125 | -0 -0 0 1
126 | 25 25 0
127 | -0.667328 0.0185277 0.744533 -0.144933
128 | -0.00475385 -0.999776 0.0206186 0.000359229
129 | 0.744748 0.01022 0.667267 -0.182261
130 | -0 -0 0 1
131 | 26 26 0
132 | -0.650901 -0.002328 0.759159 -0.15609
133 | 0.0373207 -0.998884 0.0289355 0.00282908
134 | 0.758244 0.0471665 0.650262 -0.162911
135 | -0 -0 0 1
136 | 27 27 0
137 | -0.590626 0.00246049 0.806942 -0.177294
138 | 0.0504092 -0.99793 0.0399389 0.00348779
139 | 0.80537 0.0642662 0.589279 -0.145579
140 | -0 -0 0 1
141 | 28 28 0
142 | -0.544385 0.0168622 0.838666 -0.18983
143 | 0.0512379 -0.997263 0.0533099 0.00353671
144 | 0.837269 0.0719926 0.542031 -0.125406
145 | -0 -0 0 1
146 | 29 29 0
147 | -0.437671 0.0356899 0.898427 -0.210035
148 | 0.0508625 -0.996629 0.0643687 0.00315775
149 | 0.897696 0.0738685 0.43438 -0.102081
150 | -0 -0 0 1
151 | 30 30 0
152 | -0.357857 0.0574888 0.932005 -0.233405
153 | 0.047937 -0.995656 0.0798211 0.00214814
154 | 0.932545 0.0732421 0.353547 -0.0722871
155 | -0 -0 0 1
156 | 31 31 0
157 | -0.287477 0.0704126 0.955196 -0.240755
158 | 0.044769 -0.995216 0.0868364 0.00153982
159 | 0.956741 0.0677266 0.282949 -0.0413004
160 | -0 -0 0 1
161 | 32 32 0
162 | -0.104381 0.0878527 0.99065 -0.252835
163 | 0.050263 -0.994352 0.093477 2.78921e-05
164 | 0.993266 0.0595502 0.0993754 -0.0137171
165 | -0 -0 0 1
166 | 33 33 0
167 | 0.00627881 0.100603 0.994907 -0.260436
168 | 0.0479128 -0.993814 0.10019 0.000941306
169 | 0.998832 0.0470397 -0.0110601 0.019065
170 | -0 -0 0 1
171 | 34 34 0
172 | 0.156817 0.108555 0.981644 -0.255122
173 | 0.0497545 -0.993547 0.101924 8.97758e-05
174 | 0.986374 0.0328578 -0.161207 0.0476408
175 | -0 -0 0 1
176 | 35 35 0
177 | 0.281645 0.113047 0.952836 -0.25086
178 | 0.0495047 -0.993425 0.10323 0.000376428
179 | 0.958241 0.0180957 -0.28539 0.0752936
180 | -0 -0 0 1
181 | 36 36 0
182 | 0.36061 0.111129 0.926073 -0.249631
183 | 0.0442329 -0.993797 0.102031 0.000110991
184 | 0.931667 0.00416945 -0.363288 0.107374
185 | -0 -0 0 1
186 | 37 37 0
187 | 0.418124 0.111641 0.901504 -0.244148
188 | 0.0432152 -0.99374 0.10302 0.00206757
189 | 0.907361 -0.00411637 -0.420331 0.142282
190 | -0 -0 0 1
191 | 38 38 0
192 | 0.53047 0.110735 0.84044 -0.229057
193 | 0.0461111 -0.993733 0.101828 0.00169594
194 | 0.846449 -0.015263 -0.532251 0.165218
195 | -0 -0 0 1
196 | 39 39 0
197 | 0.65559 0.108374 0.7473 -0.216398
198 | 0.0520623 -0.99378 0.0984455 0.00142033
199 | 0.753321 -0.0256337 -0.657154 0.188082
200 | -0 -0 0 1
201 | 40 40 0
202 | 0.750299 0.0998333 0.653517 -0.1899
203 | 0.0534258 -0.994455 0.0905783 0.00141635
204 | 0.658936 -0.0330462 -0.751472 0.204054
205 | -0 -0 0 1
206 | 41 41 0
207 | 0.837155 0.0936018 0.538897 -0.159495
208 | 0.0571756 -0.994826 0.0839729 0.000701085
209 | 0.543969 -0.0394866 -0.838176 0.21805
210 | -0 -0 0 1
211 | 42 42 0
212 | 0.910677 0.0882955 0.403574 -0.128109
213 | 0.0612222 -0.994951 0.0795295 0.000949073
214 | 0.408558 -0.047718 -0.911484 0.21926
215 | -0 -0 0 1
216 | 43 43 0
217 | 0.960235 0.0820739 0.266858 -0.0941022
218 | 0.0639015 -0.995051 0.0760975 0.000562457
219 | 0.271783 -0.0560189 -0.960727 0.225299
220 | -0 -0 0 1
221 | 44 44 0
222 | 0.984005 0.0768324 0.160723 -0.0645263
223 | 0.065369 -0.995005 0.0754423 0.00130347
224 | 0.165716 -0.0637293 -0.984112 0.226691
225 | -0 -0 0 1
226 | 45 45 0
227 | 0.997372 0.0703797 -0.0172079 -0.0393522
228 | 0.0714507 -0.994799 0.0725985 0.00143202
229 | -0.012009 -0.0736373 -0.997213 0.222328
230 | -0 -0 0 1
231 | 46 46 0
232 | 0.950888 0.0538403 -0.304816 -0.0158285
233 | 0.0801603 -0.993995 0.0744927 0.000188352
234 | -0.298975 -0.0952684 -0.949494 0.220302
235 | -0 -0 0 1
236 | 47 47 0
237 | 0.902412 0.0331274 -0.429599 0.000659708
238 | 0.0793871 -0.992754 0.090206 -0.00152349
239 | -0.423498 -0.115508 -0.898503 0.232764
240 | -0 -0 0 1
241 | 48 48 0
242 | 0.880096 -0.0155142 -0.474542 0.02182
243 | 0.0318747 -0.995281 0.0916544 -0.00226141
244 | -0.473724 -0.0957906 -0.875448 0.232022
245 | -0 -0 0 1
246 | 49 49 0
247 | 0.855164 -0.0253585 -0.517737 0.0383066
248 | 0.0294046 -0.994821 0.0972945 -0.00229659
249 | -0.517523 -0.0984266 -0.84999 0.247883
250 | -0 -0 0 1
251 | 50 50 0
252 | 0.871769 -0.0292611 -0.489042 0.064258
253 | 0.0239314 -0.99448 0.102164 -0.0021084
254 | -0.489332 -0.100767 -0.866257 0.243157
255 | -0 -0 0 1
256 | 51 51 0
257 | 0.849851 -0.0344343 -0.525897 0.0871271
258 | 0.0231073 -0.994469 0.102456 -0.00310863
259 | -0.526517 -0.0992248 -0.844355 0.247217
260 | -0 -0 0 1
261 | 52 52 0
262 | 0.822208 -0.0375514 -0.567947 0.118223
263 | 0.0198283 -0.995326 0.0945139 -0.000985058
264 | -0.568842 -0.0889715 -0.81762 0.239609
265 | -0 -0 0 1
266 | 53 53 0
267 | 0.773876 -0.0318347 -0.632537 0.150065
268 | 0.0208433 -0.996915 0.0756739 -0.00145917
269 | -0.632995 -0.0717464 -0.770824 0.237312
270 | -0 -0 0 1
271 | 54 54 0
272 | 0.734297 0.00494075 -0.678811 0.172767
273 | 0.0221377 -0.999616 0.0166715 -0.00388953
274 | -0.678468 -0.0272691 -0.734124 0.220335
275 | -0 -0 0 1
276 | 55 55 0
277 | 0.680918 0.0345317 -0.731545 0.202982
278 | 0.0254863 -0.9994 -0.0234531 -0.00303322
279 | -0.731916 -0.00267474 -0.681389 0.205415
280 | -0 -0 0 1
281 | 56 56 0
282 | 0.614298 0.0466839 -0.787692 0.23217
283 | 0.0374036 -0.998849 -0.0300285 -0.000377699
284 | -0.788187 -0.0110161 -0.615337 0.186286
285 | -0 -0 0 1
286 | 57 57 0
287 | 0.541559 0.0497956 -0.839187 0.263345
288 | 0.0450526 -0.998529 -0.0301764 -0.0016863
289 | -0.839455 -0.0214652 -0.543006 0.16244
290 | -0 -0 0 1
291 | 58 58 0
292 | 0.468643 0.0485295 -0.882053 0.289095
293 | 0.0512892 -0.9983 -0.0276749 0.000690224
294 | -0.881897 -0.0322702 -0.470336 0.130507
295 | -0 -0 0 1
296 | 59 59 0
297 | 0.376438 0.0510427 -0.925034 0.303695
298 | 0.0581426 -0.997814 -0.0313978 0.00121456
299 | -0.924615 -0.0419646 -0.378583 0.100666
300 | -0 -0 0 1
301 | 60 60 0
302 | 0.237526 0.0522648 -0.969974 0.314596
303 | 0.0601391 -0.997427 -0.0390173 0.00028985
304 | -0.969518 -0.0490657 -0.240058 0.063856
305 | -0 -0 0 1
306 | 61 61 0
307 | 0.124643 0.0515514 -0.990862 0.311147
308 | 0.0636965 -0.997005 -0.0438585 0.000964225
309 | -0.990155 -0.0576478 -0.127553 0.0291394
310 | -0 -0 0 1
311 | 62 62 0
312 | 0.00563151 0.055338 -0.998452 0.309841
313 | 0.0640828 -0.996435 -0.0548648 0.0022955
314 | -0.997929 -0.0636746 -0.00915764 -0.0107888
315 | -0 -0 0 1
316 | 63 63 0
317 | -0.0261116 0.0574581 -0.998006 0.320991
318 | 0.0669482 -0.996005 -0.0590945 0.00199665
319 | -0.997415 -0.0683578 0.0221605 -0.0483434
320 | -0 -0 0 1
321 | 64 64 0
322 | -0.0102977 0.056091 -0.998373 0.332777
323 | 0.0672776 -0.996124 -0.0566586 0.0030559
324 | -0.997681 -0.0677515 0.00648408 -0.0885359
325 | -0 -0 0 1
326 | 65 65 0
327 | -0.101311 0.0559605 -0.99328 0.347484
328 | 0.066875 -0.995775 -0.0629221 0.00404563
329 | -0.992605 -0.0728003 0.0971408 -0.130797
330 | -0 -0 0 1
331 | 66 66 0
332 | -0.268661 0.0581252 -0.961479 0.351525
333 | 0.0659587 -0.994725 -0.0785655 0.00477127
334 | -0.960974 -0.0845254 0.26341 -0.176141
335 | -0 -0 0 1
336 | 67 67 0
337 | -0.416281 0.0547188 -0.907588 0.334496
338 | 0.0670815 -0.993619 -0.0906737 0.00790373
339 | -0.906758 -0.0986281 0.409954 -0.217047
340 | -0 -0 0 1
341 | 68 68 0
342 | -0.523985 0.0494012 -0.850294 0.310999
343 | 0.0684493 -0.992645 -0.0998529 0.00784707
344 | -0.848973 -0.110523 0.516749 -0.247245
345 | -0 -0 0 1
346 | 69 69 0
347 | -0.639159 0.0378985 -0.76814 0.285624
348 | 0.0665426 -0.992314 -0.104328 0.007347
349 | -0.76619 -0.117796 0.631725 -0.289343
350 | -0 -0 0 1
351 | 70 70 0
352 | -0.715775 0.0281961 -0.697762 0.248187
353 | 0.0650971 -0.99214 -0.106869 0.00703895
354 | -0.69529 -0.121917 0.708313 -0.329405
355 | -0 -0 0 1
356 | 71 71 0
357 | -0.787744 0.0165984 -0.615779 0.207661
358 | 0.0645276 -0.991914 -0.109285 0.00615035
359 | -0.612613 -0.125823 0.780303 -0.349034
360 | -0 -0 0 1
361 | 72 72 0
362 | -0.853294 0.00244405 -0.521424 0.172499
363 | 0.06143 -0.992554 -0.105181 0.00404469
364 | -0.517798 -0.121781 0.846791 -0.376524
365 | -0 -0 0 1
366 | 73 73 0
367 | -0.91338 -0.0111396 -0.406955 0.135092
368 | 0.0587069 -0.992781 -0.104588 0.00530962
369 | -0.402852 -0.11942 0.907441 -0.403999
370 | -0 -0 0 1
371 | 74 74 0
372 | -0.952178 -0.0229581 -0.304681 0.0961013
373 | 0.0567742 -0.993101 -0.102597 0.00265794
374 | -0.300223 -0.114989 0.946913 -0.410506
375 | -0 -0 0 1
376 | 75 75 0
377 | -0.980783 -0.0345267 -0.192025 0.0587294
378 | 0.0551428 -0.993144 -0.103076 0.000989848
379 | -0.187149 -0.111684 0.975962 -0.425785
380 | -0 -0 0 1
381 | 76 76 0
382 | -0.994391 -0.0417245 -0.0971908 0.0140603
383 | 0.0516794 -0.993413 -0.102272 0.00122264
384 | -0.0922833 -0.106721 0.989997 -0.440116
385 | -0 -0 0 1
386 | 77 77 0
387 | -0.998806 -0.048325 -0.00710303 -0.0383012
388 | 0.0488003 -0.993466 -0.103166 7.63441e-05
389 | -0.00207111 -0.10339 0.994639 -0.440919
390 | -0 -0 0 1
391 | 78 78 0
392 | -0.991106 -0.0602391 0.118658 -0.0823408
393 | 0.047478 -0.993062 -0.107582 -0.00159121
394 | 0.124315 -0.100992 0.98709 -0.429395
395 | -0 -0 0 1
396 | 79 79 0
397 | -0.961826 -0.0729069 0.26377 -0.128608
398 | 0.0438352 -0.992458 -0.114476 -0.0034804
399 | 0.270127 -0.0985432 0.957769 -0.412807
400 | -0 -0 0 1
401 | 80 80 0
402 | -0.904192 -0.0892039 0.417706 -0.169836
403 | 0.0437838 -0.992154 -0.117104 -0.00308917
404 | 0.424875 -0.0875956 0.901004 -0.392371
405 | -0 -0 0 1
406 | 81 81 0
407 | -0.867175 -0.100906 0.487674 -0.203937
408 | 0.0477099 -0.991586 -0.120336 -0.00433841
409 | 0.495714 -0.0810851 0.864692 -0.364851
410 | -0 -0 0 1
411 | 82 82 0
412 | -0.824247 -0.108497 0.555739 -0.241615
413 | 0.0487538 -0.991424 -0.121246 -0.00610201
414 | 0.564128 -0.0728424 0.822468 -0.33787
415 | -0 -0 0 1
416 | 83 83 0
417 | -0.781883 -0.113865 0.612938 -0.282942
418 | 0.0468994 -0.991136 -0.124296 -0.00643987
419 | 0.621658 -0.0684386 0.780293 -0.30627
420 | -0 -0 0 1
421 | 84 84 0
422 | -0.692893 -0.127246 0.709723 -0.309185
423 | 0.0474428 -0.990218 -0.131218 -0.00636809
424 | 0.719477 -0.0572489 0.692152 -0.269886
425 | -0 -0 0 1
426 | 85 85 0
427 | -0.595073 -0.132623 0.792654 -0.331789
428 | 0.0489568 -0.99044 -0.128962 -0.00440845
429 | 0.802179 -0.0379359 0.595877 -0.22928
430 | -0 -0 0 1
431 | 86 86 0
432 | -0.519467 -0.124102 0.845431 -0.357426
433 | 0.0502984 -0.992123 -0.114729 -0.00197347
434 | 0.853009 -0.0170743 0.521617 -0.186043
435 | -0 -0 0 1
436 | 87 87 0
437 | -0.418598 -0.112019 0.901237 -0.37666
438 | 0.0499503 -0.993702 -0.100311 -0.00352438
439 | 0.906797 0.00302717 0.421557 -0.145073
440 | -0 -0 0 1
441 | 88 88 0
442 | -0.329418 -0.0988512 0.938995 -0.394087
443 | 0.0482891 -0.994967 -0.0878028 -0.00583412
444 | 0.942949 0.0164194 0.332533 -0.0944231
445 | -0 -0 0 1
446 | 89 89 0
447 | -0.204821 -0.088878 0.974756 -0.403455
448 | 0.0452016 -0.995665 -0.0812865 -0.00610201
449 | 0.977755 0.0274113 0.20795 -0.0481476
450 | -0 -0 0 1
451 | 90 90 0
452 | -0.0456893 -0.0827093 0.995526 -0.400516
453 | 0.0429112 -0.995809 -0.0807634 -0.00725087
454 | 0.998034 0.0390292 0.049047 -0.000188418
455 | -0 -0 0 1
456 | 91 91 0
457 | 0.094073 -0.0758653 0.992671 -0.396405
458 | 0.0412135 -0.995941 -0.0800209 -0.00700952
459 | 0.994712 0.0484392 -0.0905645 0.0506169
460 | -0 -0 0 1
461 | 92 92 0
462 | 0.190123 -0.0689606 0.979335 -0.392824
463 | 0.0405471 -0.996127 -0.0780146 -0.00656462
464 | 0.980923 0.0545416 -0.18659 0.0981344
465 | -0 -0 0 1
466 | 93 93 0
467 | 0.297739 -0.0645622 0.952462 -0.377093
468 | 0.039783 -0.996005 -0.0799499 -0.00586466
469 | 0.953818 0.061696 -0.293981 0.138658
470 | -0 -0 0 1
471 | 94 94 0
472 | 0.410771 -0.0574074 0.90993 -0.358355
473 | 0.0398196 -0.995934 -0.0808092 -0.00625058
474 | 0.910869 0.0694271 -0.406814 0.176874
475 | -0 -0 0 1
476 | 95 95 0
477 | 0.511163 -0.052047 0.857907 -0.341939
478 | 0.0400914 -0.995634 -0.0842901 -0.00698362
479 | 0.858548 0.0774806 -0.506845 0.211753
480 | -0 -0 0 1
481 | 96 96 0
482 | 0.610898 -0.0430421 0.790538 -0.318636
483 | 0.042258 -0.995325 -0.0868474 -0.00526224
484 | 0.79058 0.0864615 -0.606224 0.235694
485 | -0 -0 0 1
486 | 97 97 0
487 | 0.670686 -0.030485 0.741114 -0.287938
488 | 0.0436637 -0.9958 -0.0804755 -0.00441478
489 | 0.740455 0.0863337 -0.666538 0.266588
490 | -0 -0 0 1
491 | 98 98 0
492 | 0.752584 -0.0128977 0.65837 -0.257706
493 | 0.0431481 -0.996694 -0.0688482 -0.00466541
494 | 0.657081 0.0802214 -0.749539 0.296484
495 | -0 -0 0 1
496 | 99 99 0
497 | 0.829111 0.00280661 0.559077 -0.21635
498 | 0.0444607 -0.997151 -0.0609294 -0.00541482
499 | 0.557313 0.0753742 -0.826874 0.314971
500 | -0 -0 0 1
501 | 100 100 0
502 | 0.902959 0.0171403 0.429385 -0.175302
503 | 0.0452642 -0.997439 -0.0553705 -0.00515917
504 | 0.427337 0.0694331 -0.901422 0.330459
505 | -0 -0 0 1
506 | 101 101 0
507 | 0.941403 0.0280231 0.336118 -0.142102
508 | 0.0460615 -0.997888 -0.0458128 -0.00416546
509 | 0.334125 0.0586104 -0.940705 0.348754
510 | -0 -0 0 1
511 | 102 102 0
512 | 0.959561 0.0305508 0.279839 -0.0996191
513 | 0.0435941 -0.998228 -0.0405035 -0.00572982
514 | 0.278106 0.0510649 -0.959192 0.374411
515 | -0 -0 0 1
516 | 103 103 0
517 | 0.982139 0.0337604 0.185106 -0.0556054
518 | 0.0425934 -0.998125 -0.0439501 -0.00856917
519 | 0.183275 0.0510494 -0.981735 0.397988
520 | -0 -0 0 1
521 | 104 104 0
522 | 0.998025 0.0403074 0.048191 -0.00983432
523 | 0.042724 -0.997824 -0.0502148 -0.00746105
524 | 0.0460621 0.0521745 -0.997575 0.40318
525 | -0 -0 0 1
526 | 105 105 0
527 | 0.995743 0.0506376 -0.0770136 0.0281916
528 | 0.0461595 -0.997199 -0.0588569 -0.00696977
529 | -0.0797782 0.0550514 -0.995291 0.397807
530 | -0 -0 0 1
531 | 106 106 0
532 | 0.977834 0.0606784 -0.200395 0.0710101
533 | 0.0485214 -0.996703 -0.0650334 -0.00889019
534 | -0.203681 0.0538684 -0.977554 0.390657
535 | -0 -0 0 1
536 | 107 107 0
537 | 0.940498 0.0718671 -0.332111 0.111511
538 | 0.0505866 -0.9961 -0.0722955 -0.00610201
539 | -0.336012 0.0511934 -0.940465 0.375182
540 | -0 -0 0 1
541 | 108 108 0
542 | 0.86731 0.0909705 -0.489384 0.130196
543 | 0.0563273 -0.99478 -0.0850913 -0.00690136
544 | -0.49457 0.0462349 -0.867907 0.341703
545 | -0 -0 0 1
546 | 109 109 0
547 | 0.812571 0.0987952 -0.574428 0.146562
548 | 0.059203 -0.994423 -0.0872827 -0.00339681
549 | -0.579847 0.0369156 -0.813888 0.300207
550 | -0 -0 0 1
551 | 110 110 0
552 | 0.781618 0.0617364 -0.620695 0.155112
553 | 0.0581098 -0.997969 -0.0260858 -0.00015073
554 | -0.621045 -0.0156793 -0.783618 0.261266
555 | -0 -0 0 1
556 | 111 111 0
557 | 0.721966 0.0174914 -0.691708 0.15567
558 | 0.0537071 -0.998081 0.0308177 0.000433289
559 | -0.689841 -0.0593989 -0.72152 0.219228
560 | -0 -0 0 1
561 | 112 112 0
562 | 0.649872 -0.00867586 -0.759994 0.158534
563 | 0.0438363 -0.997842 0.0488755 0.000823163
564 | -0.758778 -0.0650781 -0.64809 0.174088
565 | -0 -0 0 1
566 | 113 113 0
567 | 0.505419 -0.0427763 -0.861813 0.144776
568 | 0.0387521 -0.996637 0.0721949 0.00292699
569 | -0.862003 -0.0698858 -0.502062 0.139914
570 | -0 -0 0 1
571 | 114 114 0
572 | 0.424691 -0.0805644 -0.901747 0.140652
573 | 0.0260078 -0.994536 0.101103 0.00504487
574 | -0.904965 -0.0663901 -0.420275 0.105426
575 | -0 -0 0 1
576 | 115 115 0
577 | 0.311151 -0.171916 -0.934682 0.152644
578 | 0.0145785 -0.982523 0.185569 0.00577006
579 | -0.950249 -0.0713662 -0.303207 0.0669794
580 | -0 -0 0 1
581 | 116 116 0
582 | 0.172968 -0.222836 -0.959389 0.153186
583 | 0.000651563 -0.974044 0.226358 0.00571229
584 | -0.984927 -0.0397776 -0.168333 0.0306292
585 | -0 -0 0 1
586 | 117 117 0
587 | 0.0233141 -0.24719 -0.968687 0.160716
588 | -0.0128442 -0.968944 0.246946 0.00655222
589 | -0.999646 0.00668472 -0.025765 -0.0108867
590 | -0 -0 0 1
591 | 118 118 0
592 | -0.221615 -0.246117 -0.943564 0.160965
593 | -0.0141531 -0.966711 0.255479 0.00798743
594 | -0.975031 0.0699725 0.210755 -0.0532132
595 | -0 -0 0 1
596 | 119 119 0
597 | -0.472193 -0.216677 -0.85445 0.145852
598 | -0.0200816 -0.966423 0.25617 0.0075382
599 | -0.881266 0.13812 0.451987 -0.088921
600 | -0 -0 0 1
601 | 120 120 0
602 | -0.673578 -0.164051 -0.720681 0.133612
603 | -0.0319074 -0.967694 0.250101 0.00620109
604 | -0.738427 0.191458 0.646582 -0.123465
605 | -0 -0 0 1
606 | 121 121 0
607 | -0.82166 -0.107588 -0.559731 0.107615
608 | -0.0357544 -0.97036 0.239003 0.00576408
609 | -0.568855 0.216392 0.79346 -0.159078
610 | -0 -0 0 1
611 | 122 122 0
612 | -0.918393 -0.0487338 -0.392657 0.0752606
613 | -0.0419682 -0.974789 0.219144 0.0037056
614 | -0.393437 0.21774 0.893195 -0.17516
615 | -0 -0 0 1
616 | 123 123 0
617 | -0.974903 0.00582579 -0.222555 0.0417523
618 | -0.0427882 -0.985924 0.161626 0.00142785
619 | -0.218481 0.167092 0.961429 -0.19196
620 | -0 -0 0 1
621 | 124 124 0
622 | -0.999041 0.0348547 0.0265096 0.00415963
623 | -0.0309447 -0.990253 0.135799 0.00178836
624 | 0.0309845 0.134848 0.990382 -0.197235
625 | -0 -0 0 1
626 | 125 125 0
627 | -0.967979 0.0547157 0.244994 -0.0328859
628 | -0.027178 -0.993062 0.114404 0.00206134
629 | 0.249554 0.104082 0.962751 -0.187978
630 | -0 -0 0 1
631 | 126 126 0
632 | -0.890067 0.0691525 0.450555 -0.0702377
633 | -0.0215362 -0.993701 0.109972 0.0033977
634 | 0.455321 0.088179 0.88595 -0.174136
635 | -0 -0 0 1
636 | 127 127 0
637 | -0.773524 0.084571 0.628099 -0.107894
638 | -0.0148495 -0.993203 0.115443 0.003635
639 | 0.633593 0.0799711 0.769523 -0.150317
640 | -0 -0 0 1
641 | 128 128 0
642 | -0.663841 0.0986758 0.741336 -0.129437
643 | -0.017504 -0.993036 0.116504 0.00179245
644 | 0.747669 0.0643639 0.660945 -0.113211
645 | -0 -0 0 1
646 | 129 129 0
647 | -0.56091 0.132626 0.817184 -0.149713
648 | -0.0196228 -0.988937 0.147032 0.00241859
649 | 0.827644 0.0664363 0.557307 -0.082791
650 | -0 -0 0 1
651 | 130 130 0
652 | -0.517822 0.259439 0.8152 -0.172727
653 | -0.0276278 -0.957481 0.287171 0.00290406
654 | 0.855042 0.126181 0.502972 -0.0569228
655 | -0 -0 0 1
656 | 131 131 0
657 | -0.343232 0.295231 0.891645 -0.184734
658 | -0.019051 -0.951308 0.307653 0.00293244
659 | 0.939057 0.0886097 0.332144 -0.0182126
660 | -0 -0 0 1
661 | 132 132 0
662 | -0.0581552 0.318631 0.946093 -0.19007
663 | -0.0228267 -0.947874 0.317828 0.00206679
664 | 0.998047 -0.0031129 0.0623971 0.0197775
665 | -0 -0 0 1
666 | 133 133 0
667 | 0.157648 0.313903 0.936276 -0.186436
668 | -0.0438036 -0.944976 0.324195 0.00322471
669 | 0.986523 -0.0921209 -0.135223 0.0572255
670 | -0 -0 0 1
671 | 134 134 0
672 | 0.412194 0.277318 0.867865 -0.17446
673 | -0.0478366 -0.94465 0.324574 0.00278584
674 | 0.909839 -0.175303 -0.376114 0.0875242
675 | -0 -0 0 1
676 | 135 135 0
677 | 0.649486 0.208857 0.731127 -0.153715
678 | -0.053206 -0.946696 0.317703 0.0018716
679 | 0.75851 -0.245244 -0.603753 0.112344
680 | -0 -0 0 1
681 | 136 136 0
682 | 0.757934 0.152976 0.63414 -0.130612
683 | -0.0649064 -0.949605 0.306655 0.00282096
684 | 0.649094 -0.273584 -0.709809 0.122801
685 | -0 -0 0 1
686 | 137 137 0
687 | 0.884296 0.0956506 0.457026 -0.0986983
688 | -0.052078 -0.952482 0.30011 0.00215922
689 | 0.464014 -0.289187 -0.837294 0.137699
690 | -0 -0 0 1
691 | 138 138 0
692 | 0.97035 0.0362543 0.238968 -0.0643784
693 | -0.0377179 -0.953861 0.297869 0.00266038
694 | 0.238741 -0.298051 -0.924212 0.15658
695 | -0 -0 0 1
696 | 139 139 0
697 | 0.995618 -0.0126278 0.0926606 -0.0260925
698 | -0.0391771 -0.956023 0.290662 0.001617
699 | 0.0849153 -0.293019 -0.952328 0.155762
700 | -0 -0 0 1
701 | 140 140 0
702 | 0.993249 -0.0594611 -0.0996079 0.00690957
703 | -0.0282968 -0.956894 0.289055 0.00126527
704 | -0.112502 -0.284285 -0.952116 0.153023
705 | -0 -0 0 1
706 | 141 141 0
707 | 0.966615 -0.100705 -0.235614 0.043069
708 | -0.0235199 -0.950519 0.309776 -0.000855226
709 | -0.255152 -0.293892 -0.921154 0.150897
710 | -0 -0 0 1
711 | 142 142 0
712 | 0.923363 -0.15726 -0.350244 0.0669639
713 | -0.0248243 -0.934808 0.354284 -0.00540877
714 | -0.383125 -0.318438 -0.867071 0.139692
715 | -0 -0 0 1
716 | 143 143 0
717 | 0.844237 -0.201373 -0.496701 0.0811733
718 | -0.0133626 -0.934354 0.356095 -0.00880657
719 | -0.535803 -0.293991 -0.791508 0.138569
720 | -0 -0 0 1
721 | 144 144 0
722 | 0.844696 -0.198539 -0.497062 0.090818
723 | -0.0402739 -0.949604 0.310854 -0.00913856
724 | -0.533729 -0.242559 -0.810123 0.150243
725 | -0 -0 0 1
726 | 145 145 0
727 | 0.778302 -0.192914 -0.59752 0.11226
728 | -0.0203656 -0.958887 0.283057 -0.00748482
729 | -0.62756 -0.208135 -0.750233 0.129334
730 | -0 -0 0 1
731 | 146 146 0
732 | 0.579085 -0.22275 -0.784247 0.134457
733 | 0.00743895 -0.960467 0.278294 -0.0060797
734 | -0.815233 -0.16699 -0.554535 0.105483
735 | -0 -0 0 1
736 | 147 147 0
737 | 0.470801 -0.2421 -0.848371 0.144907
738 | -0.0084403 -0.962803 0.270072 -0.00669605
739 | -0.882199 -0.11999 -0.455333 0.0799833
740 | -0 -0 0 1
741 | 148 148 0
742 | 0.386645 -0.24608 -0.888791 0.152515
743 | -0.00983408 -0.964788 0.262844 -0.00921821
744 | -0.922176 -0.0928867 -0.37545 0.0614973
745 | -0 -0 0 1
746 | 149 149 0
747 | 0.397201 -0.203689 -0.894842 0.1528
748 | -0.0123081 -0.976153 0.216735 -0.0111538
749 | -0.917649 -0.0750734 -0.390236 0.0545013
750 | -0 -0 0 1
751 | 150 150 0
752 | 0.428337 -0.171887 -0.88712 0.152381
753 | -0.00444691 -0.982131 0.188149 -0.0112657
754 | -0.903608 -0.0766461 -0.421447 0.0517873
755 | -0 -0 0 1
756 |
--------------------------------------------------------------------------------
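These trajectory logs follow the Tanks and Temples `.log` camera-pose convention: each camera occupies five lines, a metadata triplet (`image_id image_id 0`) followed by the four rows of a 4×4 camera-to-world matrix. As a minimal sketch of how such a file can be read, the hypothetical helper below (`read_trajectory_log`, not part of this repository, which has its own data loading in `dataloader/`) parses one log into a list of poses:

```python
import numpy as np

def read_trajectory_log(path):
    """Parse a Tanks and Temples style .log trajectory file (sketch).

    Each camera occupies 5 lines: a metadata line of three integers
    (typically `i i 0`), followed by four rows of a 4x4
    camera-to-world matrix.
    """
    with open(path) as f:
        lines = [ln.strip() for ln in f if ln.strip()]
    poses = []
    for i in range(0, len(lines), 5):  # one 5-line block per camera
        meta = tuple(int(x) for x in lines[i].split())
        mat = np.array([[float(v) for v in lines[i + j].split()]
                        for j in range(1, 5)], dtype=np.float64)
        poses.append((meta, mat))
    return poses
```

For instance, `Horse.log` above contains 151 such blocks (camera indices 0-150), so this sketch would return 151 `(meta, matrix)` pairs.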
/dataloader/datalist/tanks/logs/Train.log:
--------------------------------------------------------------------------------
1 | 0 0 0
2 | 0.999633 -0.0232816 0.0138163 -0.3686
3 | 0.0247054 0.993187 -0.113881 0.0526984
4 | -0.0110708 0.114181 0.993398 -2.17113
5 | -0 -0 0 1
6 | 1 1 0
7 | 0.999125 -0.0278002 -0.0312556 -0.365783
8 | 0.0240267 0.993049 -0.115221 0.0532314
9 | 0.0342415 0.114369 0.992848 -2.17141
10 | -0 -0 0 1
11 | 2 2 0
12 | 0.99637 -0.0327259 -0.0785823 -0.347926
13 | 0.023335 0.992789 -0.117579 0.0541618
14 | 0.0818635 0.115318 0.98995 -2.15608
15 | -0 -0 0 1
16 | 3 3 0
17 | 0.962313 -0.0505502 -0.267206 -0.299838
18 | 0.0179444 0.992234 -0.123087 0.054664
19 | 0.271353 0.113653 0.955746 -2.09594
20 | -0 -0 0 1
21 | 4 4 0
22 | 0.915591 -0.0820105 -0.39366 -0.267271
23 | 0.0190146 0.986716 -0.161336 0.0509778
24 | 0.401662 0.140233 0.904988 -2.03395
25 | -0 -0 0 1
26 | 5 5 0
27 | 0.868474 -0.104405 -0.484615 -0.270567
28 | 0.0242184 0.985339 -0.168878 0.0513353
29 | 0.495142 0.13493 0.858271 -2.01954
30 | -0 -0 0 1
31 | 6 6 0
32 | 0.843727 -0.120981 -0.522961 -0.272361
33 | 0.0316935 0.983798 -0.176457 0.0515965
34 | 0.535835 0.132307 0.833892 -2.01904
35 | -0 -0 0 1
36 | 7 7 0
37 | 0.870797 -0.125219 -0.47543 -0.265995
38 | 0.041522 0.982298 -0.182667 0.0577559
39 | 0.489887 0.139325 0.860581 -2.01795
40 | -0 -0 0 1
41 | 8 8 0
42 | 0.851516 -0.124003 -0.509455 -0.197314
43 | 0.0368139 0.983373 -0.177824 0.0610336
44 | 0.523035 0.132665 0.841923 -2.01084
45 | -0 -0 0 1
46 | 9 9 0
47 | 0.827772 -0.130194 -0.54575 -0.125025
48 | 0.040779 0.984093 -0.172913 0.0629862
49 | 0.55958 0.120878 0.819914 -1.97706
50 | -0 -0 0 1
51 | 10 10 0
52 | 0.697012 -0.132964 -0.704624 -0.0402968
53 | 0.0346257 0.987752 -0.15214 0.0595448
54 | 0.716223 0.0816452 0.693079 -1.91045
55 | -0 -0 0 1
56 | 11 11 0
57 | 0.743886 -0.052968 -0.666205 0.00447499
58 | 0.0526691 0.9984 -0.0205695 0.0371029
59 | 0.666228 -0.0197871 0.745485 -1.8188
60 | -0 -0 0 1
61 | 12 12 0
62 | 0.791559 -0.022798 -0.610667 0.0608883
63 | 0.0469886 0.998616 0.0236262 0.0264992
64 | 0.609283 -0.0473959 0.791535 -1.73417
65 | -0 -0 0 1
66 | 13 13 0
67 | 0.838991 -0.0365605 -0.542916 0.127882
68 | 0.0346421 0.999305 -0.0137602 0.0209172
69 | 0.543042 -0.0072631 0.839674 -1.64584
70 | -0 -0 0 1
71 | 14 14 0
72 | 0.876483 -0.0904827 -0.472854 0.228786
73 | 0.0259488 0.989631 -0.141272 0.0123494
74 | 0.480733 0.111552 0.869742 -1.56193
75 | -0 -0 0 1
76 | 15 15 0
77 | 0.895563 -0.0949518 -0.434685 0.30936
78 | 0.024436 0.985985 -0.165032 -0.0360468
79 | 0.444263 0.137175 0.885332 -1.46371
80 | -0 -0 0 1
81 | 16 16 0
82 | 0.879119 -0.0964651 -0.466737 0.373238
83 | 0.0202174 0.985969 -0.165699 -0.0486654
84 | 0.476173 0.136233 0.868735 -1.35331
85 | -0 -0 0 1
86 | 17 17 0
87 | 0.880497 -0.101805 -0.462992 0.45285
88 | 0.0257775 0.985505 -0.167676 -0.0122771
89 | 0.473351 0.135703 0.870358 -1.28848
90 | -0 -0 0 1
91 | 18 18 0
92 | 0.906946 -0.105443 -0.407836 0.519636
93 | 0.0327293 0.982877 -0.181332 0.0190077
94 | 0.419973 0.15111 0.894868 -1.20495
95 | -0 -0 0 1
96 | 19 19 0
97 | 0.925121 -0.102897 -0.365462 0.580361
98 | 0.0360951 0.982051 -0.18513 0.049242
99 | 0.377952 0.158077 0.91223 -1.11066
100 | -0 -0 0 1
101 | 20 20 0
102 | 0.947689 -0.082956 -0.308225 0.693707
103 | 0.0421475 0.989704 -0.13678 0.0724739
104 | 0.316399 0.116634 0.941429 -1.05164
105 | -0 -0 0 1
106 | 21 21 0
107 | 0.969222 -0.0222276 -0.245181 0.764337
108 | 0.0406766 0.996686 0.0704406 0.0886617
109 | 0.242803 -0.0782458 0.966915 -0.966066
110 | -0 -0 0 1
111 | 22 22 0
112 | 0.984967 -0.00774423 -0.172572 0.841963
113 | 0.0269152 0.993674 0.109029 0.0780227
114 | 0.170635 -0.112035 0.978944 -0.829183
115 | -0 -0 0 1
116 | 23 23 0
117 | 0.995343 -0.0018189 -0.0963782 0.955902
118 | 0.0122257 0.99413 0.107499 0.0553188
119 | 0.095617 -0.108176 0.989523 -0.74845
120 | -0 -0 0 1
121 | 24 24 0
122 | 0.995678 0.00839386 -0.0924906 1.04903
123 | -0.000558536 0.99643 0.0844169 0.0095993
124 | 0.092869 -0.0840004 0.992129 -0.660253
125 | -0 -0 0 1
126 | 25 25 0
127 | 0.981324 0.00715416 -0.19223 1.11366
128 | -0.0159001 0.998905 -0.0439932 -0.00389854
129 | 0.191705 0.0462281 0.980363 -0.567527
130 | -0 -0 0 1
131 | 26 26 0
132 | 0.872783 -0.0124582 -0.487949 1.15731
133 | -0.033965 0.995701 -0.0861746 0.0170945
134 | 0.486925 0.0917849 0.868608 -0.541029
135 | -0 -0 0 1
136 | 27 27 0
137 | 0.688488 -0.0394825 -0.724173 1.15975
138 | -0.0261613 0.996515 -0.079203 0.0202659
139 | 0.724776 0.0734756 0.685056 -0.54358
140 | -0 -0 0 1
141 | 28 28 0
142 | 0.437621 -0.0681977 -0.896569 1.14928
143 | -0.00755804 0.996805 -0.0795114 0.0217207
144 | 0.899128 0.0415722 0.435708 -0.564394
145 | -0 -0 0 1
146 | 29 29 0
147 | 0.173158 -0.0844967 -0.981263 1.13512
148 | 0.0120678 0.99642 -0.0836724 0.0224057
149 | 0.98482 0.00264687 0.173558 -0.57268
150 | -0 -0 0 1
151 | 30 30 0
152 | 0.0491008 -0.0974671 -0.994027 1.14447
153 | 0.0340167 0.994813 -0.0958639 0.0421396
154 | 0.998214 -0.0291065 0.0521616 -0.554974
155 | -0 -0 0 1
156 | 31 31 0
157 | -0.0372913 -0.0996114 -0.994327 1.14005
158 | 0.0412405 0.994018 -0.101127 0.0531711
159 | 0.998453 -0.0447777 -0.0329602 -0.494412
160 | -0 -0 0 1
161 | 32 32 0
162 | -0.242193 -0.0956832 -0.965499 1.20193
163 | 0.0397663 0.99331 -0.108415 0.0467412
164 | 0.969413 -0.0646515 -0.236768 -0.433989
165 | -0 -0 0 1
166 | 33 33 0
167 | -0.208484 -0.0978577 -0.973118 1.23969
168 | 0.0506198 0.992569 -0.110659 0.0473462
169 | 0.976715 -0.0723296 -0.201981 -0.307674
170 | -0 -0 0 1
171 | 34 34 0
172 | -0.254497 -0.0914047 -0.962744 1.32972
173 | 0.0463717 0.993225 -0.106557 0.0328173
174 | 0.965961 -0.0717625 -0.248534 -0.220478
175 | -0 -0 0 1
176 | 35 35 0
177 | -0.30531 -0.0887613 -0.948107 1.42099
178 | 0.0414706 0.993462 -0.106362 0.0319056
179 | 0.95135 -0.0717919 -0.299633 -0.122326
180 | -0 -0 0 1
181 | 36 36 0
182 | -0.235167 -0.0896713 -0.96781 1.47446
183 | 0.0434084 0.993773 -0.102625 0.0384823
184 | 0.970985 -0.0661449 -0.22981 -0.0512808
185 | -0 -0 0 1
186 | 37 37 0
187 | -0.186604 -0.0804697 -0.979134 1.52661
188 | 0.042533 0.995044 -0.0898832 0.0267952
189 | 0.981514 -0.0584181 -0.182256 0.0719078
190 | -0 -0 0 1
191 | 38 38 0
192 | -0.236795 -0.0731822 -0.968799 1.59463
193 | 0.0363906 0.995791 -0.0841158 0.029266
194 | 0.970878 -0.0551734 -0.233136 0.174554
195 | -0 -0 0 1
196 | 39 39 0
197 | -0.272914 -0.0725253 -0.959301 1.71889
198 | 0.0281315 0.996126 -0.0833127 0.0334936
199 | 0.961627 -0.0497238 -0.269817 0.210039
200 | -0 -0 0 1
201 | 40 40 0
202 | -0.254008 -0.0475211 -0.966034 1.83054
203 | 0.0268849 0.998059 -0.0561656 0.0374671
204 | 0.966828 -0.0402382 -0.252238 0.275284
205 | -0 -0 0 1
206 | 41 41 0
207 | -0.185346 -0.017375 -0.98252 1.90176
208 | 0.029285 0.999302 -0.0231962 0.0227946
209 | 0.982237 -0.0330724 -0.184708 0.365681
210 | -0 -0 0 1
211 | 42 42 0
212 | -0.123654 -0.0117967 -0.992255 1.97555
213 | 0.0261508 0.999543 -0.0151422 0.016197
214 | 0.991981 -0.0278206 -0.123289 0.439601
215 | -0 -0 0 1
216 | 43 43 0
217 | -0.0318772 -0.0112961 -0.999428 1.98992
218 | 0.0221198 0.999683 -0.0120045 0.00657606
219 | 0.999247 -0.0224898 -0.0316172 0.450948
220 | -0 -0 0 1
221 | 44 44 0
222 | 0.118193 -0.0174902 -0.992837 1.99009
223 | 0.0227339 0.99963 -0.0149035 0.00441607
224 | 0.99273 -0.0208095 0.118547 0.464346
225 | -0 -0 0 1
226 | 45 45 0
227 | 0.211775 -0.0236087 -0.977033 1.98143
228 | 0.0220314 0.999569 -0.0193779 -0.0452274
229 | 0.97707 -0.0174216 0.212204 0.471023
230 | -0 -0 0 1
231 | 46 46 0
232 | 0.281752 -0.0447899 -0.958441 1.97409
233 | 0.0208876 0.998959 -0.0405431 -0.0757049
234 | 0.95926 -0.00859639 0.282395 0.478032
235 | -0 -0 0 1
236 | 47 47 0
237 | 0.33257 -0.0607904 -0.941117 1.94603
238 | 0.0199678 0.998151 -0.0574183 -0.0619431
239 | 0.942867 0.000303566 0.333168 0.49588
240 | -0 -0 0 1
241 | 48 48 0
242 | 0.24179 -0.0521229 -0.968928 1.81741
243 | 0.0138091 0.99864 -0.0502753 -0.0258342
244 | 0.97023 -0.00122397 0.242181 0.543541
245 | -0 -0 0 1
246 | 49 49 0
247 | 0.0643863 -0.0459604 -0.996866 1.70192
248 | 0.00802663 0.99893 -0.0455371 0.000174929
249 | 0.997893 -0.00506951 0.0646864 0.619638
250 | -0 -0 0 1
251 | 50 50 0
252 | -0.159203 -0.0429124 -0.986313 1.6054
253 | 0.00686895 0.998983 -0.0445724 0.0259828
254 | 0.987222 -0.013871 -0.158746 0.728788
255 | -0 -0 0 1
256 | 51 51 0
257 | -0.36571 -0.0590434 -0.928854 1.60099
258 | 0.00679799 0.99779 -0.0661019 0.0486865
259 | 0.930704 -0.0304885 -0.3645 0.828061
260 | -0 -0 0 1
261 | 52 52 0
262 | -0.582378 -0.0571276 -0.810908 1.61291
263 | 0.00604419 0.997196 -0.0745921 0.0437002
264 | 0.812896 -0.0483421 -0.5804 0.941569
265 | -0 -0 0 1
266 | 53 53 0
267 | -0.681195 -0.0243783 -0.731696 1.6729
268 | 0.0125315 0.998911 -0.0449479 0.0482432
269 | 0.731995 -0.0397876 -0.680147 1.04225
270 | -0 -0 0 1
271 | 54 54 0
272 | -0.651585 -0.0120742 -0.75848 1.71654
273 | 0.021604 0.999172 -0.0344651 0.0466511
274 | 0.758268 -0.0388432 -0.650785 1.15547
275 | -0 -0 0 1
276 | 55 55 0
277 | -0.614671 -0.00751562 -0.788748 1.74022
278 | 0.0266504 0.999186 -0.0302894 0.0463387
279 | 0.788333 -0.0396385 -0.61397 1.30427
280 | -0 -0 0 1
281 | 56 56 0
282 | -0.616448 -0.00183783 -0.787393 1.76934
283 | 0.0263565 0.999389 -0.0229671 0.0111288
284 | 0.786954 -0.034911 -0.616023 1.45234
285 | -0 -0 0 1
286 | 57 57 0
287 | -0.633162 0.00317049 -0.774013 1.78157
288 | 0.0247787 0.999562 -0.0161752 -0.0234706
289 | 0.773623 -0.0294205 -0.632963 1.58984
290 | -0 -0 0 1
291 | 58 58 0
292 | -0.644971 0.00975194 -0.764145 1.77443
293 | 0.0230336 0.999712 -0.00668313 -0.026513
294 | 0.76386 -0.0219114 -0.64501 1.74466
295 | -0 -0 0 1
296 | 59 59 0
297 | -0.628905 0.0111872 -0.777401 1.84266
298 | 0.0184074 0.99983 -0.000503185 -0.0236905
299 | 0.777264 -0.0146264 -0.629005 1.86935
300 | -0 -0 0 1
301 | 60 60 0
302 | -0.574934 0.0151402 -0.81806 1.88685
303 | 0.0173898 0.999829 0.0062827 -0.0218278
304 | 0.818015 -0.0106137 -0.575099 1.92104
305 | -0 -0 0 1
306 | 61 61 0
307 | -0.542167 -0.00971241 -0.840215 1.81891
308 | 0.0146278 0.999673 -0.0209946 0.0168062
309 | 0.840143 -0.0236731 -0.541848 1.81658
310 | -0 -0 0 1
311 | 62 62 0
312 | -0.52684 -0.0368845 -0.849164 1.7534
313 | 0.013415 0.998573 -0.0516972 0.0541002
314 | 0.849859 -0.0386277 -0.525593 1.66769
315 | -0 -0 0 1
316 | 63 63 0
317 | -0.488528 -0.0436522 -0.871455 1.65136
318 | 0.0199413 0.997928 -0.0611662 0.0689963
319 | 0.87232 -0.0472593 -0.486646 1.54673
320 | -0 -0 0 1
321 | 64 64 0
322 | -0.476147 -0.0558521 -0.87759 1.57395
323 | 0.0217352 0.996929 -0.0752398 0.0632674
324 | 0.879097 -0.0548998 -0.473471 1.42497
325 | -0 -0 0 1
326 | 65 65 0
327 | -0.419003 -0.0676868 -0.905458 1.48308
328 | 0.0289892 0.995712 -0.0878484 0.0761312
329 | 0.907522 -0.0630573 -0.415244 1.30526
330 | -0 -0 0 1
331 | 66 66 0
332 | -0.350719 -0.075021 -0.933471 1.40375
333 | 0.0346497 0.995064 -0.0929895 0.0857747
334 | 0.935839 -0.0649578 -0.346389 1.18367
335 | -0 -0 0 1
336 | 67 67 0
337 | -0.202471 -0.00463388 -0.979277 1.30847
338 | 0.0436276 0.998953 -0.0137472 0.107331
339 | 0.978316 -0.045507 -0.202057 1.12087
340 | -0 -0 0 1
341 | 68 68 0
342 | -0.0369606 0.154981 -0.987226 1.23798
343 | 0.0432012 0.987225 0.153363 0.10704
344 | 0.998382 -0.0369809 -0.0431838 1.09133
345 | -0 -0 0 1
346 | 69 69 0
347 | 0.053846 0.115009 -0.991904 1.17308
348 | 0.0374281 0.992415 0.1171 0.0677365
349 | 0.997848 -0.0434304 0.049133 1.07558
350 | -0 -0 0 1
351 | 70 70 0
352 | -0.0419536 0.0386751 -0.998371 1.13784
353 | 0.0345389 0.998709 0.0372368 0.0286325
354 | 0.998522 -0.0329204 -0.0432353 1.11424
355 | -0 -0 0 1
356 | 71 71 0
357 | -0.284401 -0.102575 -0.953202 1.13018
358 | 0.0136772 0.993724 -0.111017 0.0309096
359 | 0.958608 -0.0446104 -0.281214 1.14552
360 | -0 -0 0 1
361 | 72 72 0
362 | -0.495933 -0.162945 -0.852936 1.14865
363 | -0.00206578 0.982455 -0.186487 0.0477755
364 | 0.868358 -0.0907231 -0.487568 1.15598
365 | -0 -0 0 1
366 | 73 73 0
367 | -0.542232 -0.142034 -0.828137 1.14283
368 | 0.0216477 0.98292 -0.182755 0.0574755
369 | 0.83995 -0.117023 -0.529896 1.16461
370 | -0 -0 0 1
371 | 74 74 0
372 | -0.586924 -0.0989481 -0.803573 1.10242
373 | 0.0394457 0.987831 -0.150448 0.0580641
374 | 0.808681 -0.119999 -0.575878 1.15251
375 | -0 -0 0 1
376 | 75 75 0
377 | -0.70448 -0.0621812 -0.706994 1.0399
378 | 0.045307 0.990183 -0.132234 0.0415118
379 | 0.708276 -0.125188 -0.694747 1.14987
380 | -0 -0 0 1
381 | 76 76 0
382 | -0.769549 -0.0110613 -0.638491 1.01661
383 | 0.0540279 0.995137 -0.0823576 0.0343478
384 | 0.636298 -0.0978746 -0.76521 1.14348
385 | -0 -0 0 1
386 | 77 77 0
387 | -0.78797 -0.00106285 -0.615713 0.976908
388 | 0.0574198 0.995514 -0.0752025 -0.000835338
389 | 0.613031 -0.0946114 -0.784374 1.12668
390 | -0 -0 0 1
391 | 78 78 0
392 | -0.896018 0.0114263 -0.44387 0.938823
393 | 0.0480182 0.9963 -0.0712849 -0.0612467
394 | 0.441413 -0.0851864 -0.893251 1.13287
395 | -0 -0 0 1
396 | 79 79 0
397 | -0.915698 0.0135704 -0.401638 0.929842
398 | 0.0468041 0.99623 -0.0730488 -0.0910788
399 | 0.399132 -0.0856889 -0.91288 1.13788
400 | -0 -0 0 1
401 | 80 80 0
402 | -0.882682 0.00856392 -0.469892 0.900332
403 | 0.0502095 0.99583 -0.0761683 -0.126592
404 | 0.46728 -0.0908255 -0.879432 1.15801
405 | -0 -0 0 1
406 | 81 81 0
407 | -0.869991 0.0032858 -0.493057 0.891583
408 | 0.0475553 0.995875 -0.0772739 -0.16476
409 | 0.490769 -0.090675 -0.866559 1.16894
410 | -0 -0 0 1
411 | 82 82 0
412 | -0.85447 -0.0028089 -0.519493 0.854853
413 | 0.0461203 0.995627 -0.0812429 -0.199125
414 | 0.517449 -0.0933788 -0.850604 1.19795
415 | -0 -0 0 1
416 | 83 83 0
417 | -0.832375 -0.0127787 -0.554065 0.825517
418 | 0.045345 0.994812 -0.0910659 -0.215612
419 | 0.552355 -0.100925 -0.827477 1.21412
420 | -0 -0 0 1
421 | 84 84 0
422 | -0.807366 -0.0195992 -0.589725 0.792181
423 | 0.047081 0.994122 -0.0974955 -0.267306
424 | 0.58817 -0.106479 -0.801697 1.23979
425 | -0 -0 0 1
426 | 85 85 0
427 | -0.821852 -0.010515 -0.569604 0.783023
428 | 0.0440802 0.995659 -0.0819811 -0.224695
429 | 0.567993 -0.0924846 -0.817821 1.27149
430 | -0 -0 0 1
431 | 86 86 0
432 | -0.878628 0.142715 -0.45568 0.792986
433 | 0.0453753 0.974927 0.217848 -0.193252
434 | 0.475345 0.170731 -0.863075 1.26931
435 | -0 -0 0 1
436 | 87 87 0
437 | -0.900014 0.217662 -0.377623 0.779171
438 | 0.0263339 0.891954 0.451359 -0.184134
439 | 0.435066 0.396285 -0.808502 1.27268
440 | -0 -0 0 1
441 | 88 88 0
442 | -0.795999 0.302571 -0.524249 0.743902
443 | -0.00744973 0.861137 0.508318 -0.17798
444 | 0.605252 0.408526 -0.68321 1.28438
445 | -0 -0 0 1
446 | 89 89 0
447 | -0.686297 0.418456 -0.594887 0.712174
448 | -0.0299144 0.800983 0.597939 -0.179551
449 | 0.726706 0.428159 -0.537194 1.30906
450 | -0 -0 0 1
451 | 90 90 0
452 | -0.602174 0.449255 -0.659967 0.65235
453 | -0.0484164 0.804577 0.591871 -0.19836
454 | 0.796896 0.388362 -0.462744 1.33377
455 | -0 -0 0 1
456 | 91 91 0
457 | -0.656861 0.274281 -0.702356 0.623154
458 | -0.0691975 0.905633 0.418379 -0.212618
459 | 0.75083 0.323418 -0.575895 1.36959
460 | -0 -0 0 1
461 | 92 92 0
462 | -0.740977 0.0672344 -0.668156 0.614335
463 | -0.0668322 0.982652 0.172997 -0.221965
464 | 0.668196 0.172841 -0.723629 1.37291
465 | -0 -0 0 1
466 | 93 93 0
467 | -0.752599 -0.0759759 -0.654082 0.612852
468 | -0.051255 0.997067 -0.0568409 -0.23538
469 | 0.656482 -0.00925341 -0.754285 1.38053
470 | -0 -0 0 1
471 | 94 94 0
472 | -0.782537 -0.159958 -0.601705 0.612625
473 | -0.0169726 0.971555 -0.236206 -0.254843
474 | 0.622372 -0.174627 -0.762993 1.37889
475 | -0 -0 0 1
476 | 95 95 0
477 | -0.773999 -0.157451 -0.613299 0.574858
478 | 0.0301742 0.958317 -0.284108 -0.244609
479 | 0.632468 -0.238405 -0.736985 1.36625
480 | -0 -0 0 1
481 | 96 96 0
482 | -0.79333 -0.131471 -0.594426 0.547073
483 | 0.0577046 0.955768 -0.288404 -0.206656
484 | 0.606051 -0.263101 -0.750653 1.40185
485 | -0 -0 0 1
486 | 97 97 0
487 | -0.75996 -0.122788 -0.638266 0.529796
488 | 0.0854458 0.954598 -0.28538 -0.19032
489 | 0.644329 -0.271415 -0.714965 1.40159
490 | -0 -0 0 1
491 | 98 98 0
492 | -0.776981 -0.111802 -0.619517 0.507971
493 | 0.0875503 0.955349 -0.282212 -0.185606
494 | 0.623407 -0.273512 -0.732499 1.41957
495 | -0 -0 0 1
496 | 99 99 0
497 | -0.84273 -0.0786394 -0.532562 0.480639
498 | 0.0939653 0.952599 -0.289354 -0.172198
499 | 0.530073 -0.29389 -0.795394 1.44138
500 | -0 -0 0 1
501 | 100 100 0
502 | -0.805556 -0.0474171 -0.59062 0.46322
503 | 0.125517 0.960515 -0.248309 -0.161372
504 | 0.579073 -0.27416 -0.767796 1.4453
505 | -0 -0 0 1
506 | 101 101 0
507 | -0.773342 -0.0524406 -0.631817 0.452159
508 | 0.12968 0.962419 -0.238608 -0.11826
509 | 0.620585 -0.266459 -0.737478 1.45141
510 | -0 -0 0 1
511 | 102 102 0
512 | -0.814516 -0.0501599 -0.577969 0.4453
513 | 0.109844 0.9649 -0.23854 -0.0933238
514 | 0.569647 -0.257781 -0.780417 1.45984
515 | -0 -0 0 1
516 | 103 103 0
517 | -0.91636 -0.0107754 -0.400211 0.428258
518 | 0.0871487 0.970298 -0.225668 -0.0706486
519 | 0.390755 -0.241671 -0.888203 1.46495
520 | -0 -0 0 1
521 | 104 104 0
522 | -0.899292 -0.00638476 -0.437302 0.423427
523 | 0.0923426 0.974579 -0.204127 -0.0675672
524 | 0.427489 -0.223952 -0.875842 1.45488
525 | -0 -0 0 1
526 | 105 105 0
527 | -0.894988 -0.0173813 -0.445751 0.41839
528 | 0.0793261 0.977114 -0.197374 -0.00902215
529 | 0.43898 -0.212007 -0.873127 1.46921
530 | -0 -0 0 1
531 | 106 106 0
532 | -0.932238 -0.0161825 -0.361485 0.417467
533 | 0.0596939 0.978434 -0.197747 0.0318228
534 | 0.356889 -0.205925 -0.911167 1.49943
535 | -0 -0 0 1
536 | 107 107 0
537 | -0.948705 -0.00804836 -0.316061 0.401114
538 | 0.0565127 0.97926 -0.194568 0.0526504
539 | 0.311072 -0.202449 -0.928574 1.55993
540 | -0 -0 0 1
541 | 108 108 0
542 | -0.970502 0.0220904 -0.240079 0.348796
543 | 0.0534553 0.990725 -0.12493 0.0582575
544 | 0.235093 -0.134078 -0.962681 1.65097
545 | -0 -0 0 1
546 | 109 109 0
547 | -0.976875 0.0399156 -0.210055 0.286604
548 | 0.0554932 0.996086 -0.0687941 0.0620433
549 | 0.206486 -0.0788598 -0.975266 1.72199
550 | -0 -0 0 1
551 | 110 110 0
552 | -0.993974 0.0411759 -0.101586 0.258954
553 | 0.0448687 0.998403 -0.0343376 0.0689804
554 | 0.10001 -0.0386887 -0.994234 1.84851
555 | -0 -0 0 1
556 | 111 111 0
557 | -0.994475 0.0425269 -0.0959694 0.221456
558 | 0.0373332 0.997773 0.055281 0.0807525
559 | 0.0981066 0.0513928 -0.993848 1.95555
560 | -0 -0 0 1
561 | 112 112 0
562 | -0.996755 0.0364478 -0.071775 0.152442
563 | 0.0288636 0.994158 0.104004 0.0715619
564 | 0.0751464 0.101595 -0.991984 2.03785
565 | -0 -0 0 1
566 | 113 113 0
567 | -0.989197 0.000687975 0.14659 0.113282
568 | 0.0184843 0.992593 0.120074 0.053129
569 | -0.145422 0.121487 -0.981883 2.08865
570 | -0 -0 0 1
571 | 114 114 0
572 | -0.987642 -0.0187048 0.155608 0.0187187
573 | -0.00107412 0.993637 0.112623 -0.000216482
574 | -0.156724 0.111064 -0.981378 2.11869
575 | -0 -0 0 1
576 | 115 115 0
577 | -0.98276 -0.0108452 0.184566 -0.0308363
578 | -0.0144249 0.999733 -0.0180634 -0.0137794
579 | -0.184321 -0.0204143 -0.982654 2.15065
580 | -0 -0 0 1
581 | 116 116 0
582 | -0.985835 -0.00167325 0.167711 -0.0508789
583 | -0.0148471 0.996895 -0.0773281 -0.00614615
584 | -0.167061 -0.0787227 -0.982799 2.15547
585 | -0 -0 0 1
586 | 117 117 0
587 | -0.98965 0.00513947 0.143409 -0.132186
588 | -0.0070492 0.996411 -0.0843547 0.0327007
589 | -0.143328 -0.0844926 -0.986062 2.13928
590 | -0 -0 0 1
591 | 118 118 0
592 | -0.989151 0.0138546 0.146248 -0.226092
593 | 0.00265382 0.997066 -0.0765066 0.035189
594 | -0.146879 -0.0752884 -0.986285 2.06268
595 | -0 -0 0 1
596 | 119 119 0
597 | -0.9674 0.0230309 0.252205 -0.328348
598 | 0.00690713 0.997885 -0.064631 0.0372617
599 | -0.25316 -0.060782 -0.965513 1.98195
600 | -0 -0 0 1
601 | 120 120 0
602 | -0.925983 0.0257829 0.376683 -0.395445
603 | 0.00999013 0.99899 -0.0438197 0.0409198
604 | -0.377432 -0.0368132 -0.925305 1.89286
605 | -0 -0 0 1
606 | 121 121 0
607 | -0.88525 0.0284358 0.464246 -0.417736
608 | 0.0133213 0.99927 -0.0358051 0.0302164
609 | -0.464925 -0.0255121 -0.884982 1.76907
610 | -0 -0 0 1
611 | 122 122 0
612 | -0.854283 0.0346194 0.518655 -0.488729
613 | 0.0165154 0.999084 -0.0394846 0.0288938
614 | -0.519546 -0.0251652 -0.854072 1.64502
615 | -0 -0 0 1
616 | 123 123 0
617 | -0.823196 0.0395771 0.566377 -0.537409
618 | 0.0173561 0.998855 -0.0445717 0.0287593
619 | -0.567492 -0.0268612 -0.82294 1.55274
620 | -0 -0 0 1
621 | 124 124 0
622 | -0.849306 0.0428573 0.526159 -0.586959
623 | 0.0186446 0.998512 -0.0512365 0.0343478
624 | -0.527572 -0.0337054 -0.848841 1.49765
625 | -0 -0 0 1
626 | 125 125 0
627 | -0.856059 0.0360283 0.515622 -0.64898
628 | 0.0182065 0.999051 -0.0395798 0.0541286
629 | -0.516558 -0.024495 -0.855902 1.56007
630 | -0 -0 0 1
631 | 126 126 0
632 | -0.735996 0.0255678 0.676503 -0.765152
633 | 0.012507 0.99963 -0.0241732 0.0735016
634 | -0.67687 -0.00933036 -0.736043 1.64986
635 | -0 -0 0 1
636 | 127 127 0
637 | -0.552482 0.0153357 0.833384 -0.861057
638 | 0.00682321 0.99988 -0.0138761 0.0752855
639 | -0.833497 -0.00197994 -0.552521 1.7092
640 | -0 -0 0 1
641 | 128 128 0
642 | -0.335925 0.0160754 0.941752 -0.957634
643 | 0.00657963 0.99987 -0.0147204 0.059231
644 | -0.941866 0.00125142 -0.335987 1.69023
645 | -0 -0 0 1
646 | 129 129 0
647 | -0.322216 0.021562 0.94642 -1.07449
648 | 0.0130861 0.999747 -0.0183216 0.0713831
649 | -0.946576 0.00648142 -0.322417 1.63267
650 | -0 -0 0 1
651 | 130 130 0
652 | -0.348186 0.0215501 0.937178 -1.16432
653 | 0.0177066 0.999709 -0.0164095 0.0632024
654 | -0.937258 0.0108807 -0.348466 1.52771
655 | -0 -0 0 1
656 | 131 131 0
657 | -0.405456 0.0252339 0.913766 -1.26355
658 | 0.0203025 0.999621 -0.0185962 0.0595877
659 | -0.913889 0.0110118 -0.405815 1.3916
660 | -0 -0 0 1
661 | 132 132 0
662 | -0.460981 0.0361836 0.886672 -1.36285
663 | 0.0229214 0.999321 -0.0288638 0.0603999
664 | -0.887114 0.00701809 -0.461497 1.23912
665 | -0 -0 0 1
666 | 133 133 0
667 | -0.515561 0.0464022 0.855596 -1.49942
668 | 0.0237774 0.998923 -0.0398477 0.0605876
669 | -0.856523 -0.000200081 -0.516109 1.10814
670 | -0 -0 0 1
671 | 134 134 0
672 | -0.561754 0.0520303 0.825666 -1.63478
673 | 0.025321 0.998634 -0.0457026 0.0551542
674 | -0.826917 -0.00476694 -0.562304 0.985991
675 | -0 -0 0 1
676 | 135 135 0
677 | -0.595076 0.054363 0.801829 -1.7558
678 | 0.0259471 0.998489 -0.0484397 0.0609658
679 | -0.80325 -0.00802018 -0.595588 0.852149
680 | -0 -0 0 1
681 | 136 136 0
682 | -0.551798 0.0465746 0.832676 -1.85847
683 | 0.0216065 0.998903 -0.041554 0.0605577
684 | -0.833698 -0.00493821 -0.552199 0.718436
685 | -0 -0 0 1
686 | 137 137 0
687 | -0.492264 0.0417058 0.869446 -1.96391
688 | 0.0187923 0.999128 -0.0372865 0.0568667
689 | -0.870243 -0.00201587 -0.492618 0.579632
690 | -0 -0 0 1
691 | 138 138 0
692 | -0.46897 0.0426956 0.882181 -2.07193
693 | 0.0192556 0.999088 -0.0381172 0.0587134
694 | -0.883004 -0.000888916 -0.469365 0.437267
695 | -0 -0 0 1
696 | 139 139 0
697 | -0.492171 0.0192681 0.870285 -2.15516
698 | 0.021246 0.999723 -0.0101186 0.0391689
699 | -0.870239 0.0135099 -0.492444 0.282877
700 | -0 -0 0 1
701 | 140 140 0
702 | -0.537053 0.000407698 0.843549 -2.24097
703 | 0.0218845 0.99967 0.0134498 0.0343478
704 | -0.843265 0.0256838 -0.536885 0.148968
705 | -0 -0 0 1
706 | 141 141 0
707 | -0.510014 -0.00288401 0.860161 -2.32971
708 | 0.0175419 0.999752 0.0137531 0.0225832
709 | -0.859987 0.0221031 -0.509837 0.0125297
710 | -0 -0 0 1
711 | 142 142 0
712 | -0.461823 -0.00989638 0.886917 -2.40437
713 | 0.0125933 0.999764 0.017713 0.0257997
714 | -0.886883 0.0193495 -0.461589 -0.140768
715 | -0 -0 0 1
716 | 143 143 0
717 | -0.329952 -0.0152285 0.943875 -2.45098
718 | 0.00636996 0.999811 0.0183577 0.0267815
719 | -0.943976 0.0120696 -0.329792 -0.289911
720 | -0 -0 0 1
721 | 144 144 0
722 | -0.169074 -0.0254681 0.985274 -2.48327
723 | 0.00144105 0.999659 0.0260872 0.0365492
724 | -0.985602 0.00583052 -0.16898 -0.445743
725 | -0 -0 0 1
726 | 145 145 0
727 | 0.0243041 -0.0349643 0.999093 -2.45019
728 | -0.00138989 0.999386 0.0350084 0.0468218
729 | -0.999704 -0.00223948 0.0242406 -0.595113
730 | -0 -0 0 1
731 | 146 146 0
732 | 0.181224 -0.0350012 0.982819 -2.369
733 | -0.000990072 0.999359 0.0357728 0.0356448
734 | -0.983441 -0.00745594 0.181073 -0.751812
735 | -0 -0 0 1
736 | 147 147 0
737 | 0.270011 -0.030109 0.962386 -2.28655
738 | -0.000768905 0.999504 0.0314859 0.0364923
739 | -0.962857 -0.00924153 0.269854 -0.887554
740 | -0 -0 0 1
741 | 148 148 0
742 | 0.402892 -0.0277614 0.914827 -2.2145
743 | -0.000286547 0.999536 0.0304582 0.0374353
744 | -0.915248 -0.0125335 0.402697 -0.992202
745 | -0 -0 0 1
746 | 149 149 0
747 | 0.459362 -0.0324057 0.887658 -2.14014
748 | 0.000827085 0.999349 0.0360552 0.0347836
749 | -0.888249 -0.0158282 0.45909 -1.07111
750 | -0 -0 0 1
751 | 150 150 0
752 | 0.604647 -0.0307654 0.795899 -2.03731
753 | 0.00543639 0.99939 0.0345013 0.00602985
754 | -0.796475 -0.0165343 0.604446 -1.17035
755 | -0 -0 0 1
756 | 151 151 0
757 | 0.664756 -0.0218705 0.74674 -1.9279
758 | 0.00723683 0.999713 0.0228373 0.0154192
759 | -0.747025 -0.0097772 0.664724 -1.24707
760 | -0 -0 0 1
761 | 152 152 0
762 | 0.688688 -0.0117455 0.724963 -1.84598
763 | 0.00945035 0.999929 0.00722292 0.0148311
764 | -0.724996 0.00187681 0.68875 -1.38816
765 | -0 -0 0 1
766 | 153 153 0
767 | 0.715788 -0.00915734 0.698257 -1.82968
768 | 0.0112666 0.999935 0.00156421 0.0083899
769 | -0.698226 0.00674736 0.715845 -1.55656
770 | -0 -0 0 1
771 | 154 154 0
772 | 0.751244 -0.0136928 0.659883 -1.87641
773 | 0.0131755 0.999897 0.00574866 -0.00311042
774 | -0.659893 0.0043756 0.751347 -1.71726
775 | -0 -0 0 1
776 | 155 155 0
777 | 0.768802 -0.0128973 0.639357 -1.91592
778 | 0.0121125 0.999911 0.00560579 -0.014888
779 | -0.639372 0.00343445 0.76889 -1.84535
780 | -0 -0 0 1
781 | 156 156 0
782 | 0.865804 -0.0188059 0.50003 -1.91273
783 | 0.0149901 0.99982 0.0116472 -0.0405304
784 | -0.500159 -0.00258871 0.86593 -2.0131
785 | -0 -0 0 1
786 | 157 157 0
787 | 0.899296 -0.0133045 0.437138 -1.90089
788 | 0.0171012 0.999842 -0.00475064 -0.0503844
789 | -0.437006 0.0117478 0.899382 -2.18536
790 | -0 -0 0 1
791 | 158 158 0
792 | 0.937175 -0.0100732 0.348715 -1.91763
793 | 0.017283 0.999696 -0.0175702 -0.0361501
794 | -0.348432 0.0224932 0.937064 -2.36189
795 | -0 -0 0 1
796 | 159 159 0
797 | 0.970644 -0.0112429 0.240259 -1.87968
798 | 0.0176988 0.999537 -0.0247294 -0.0302356
799 | -0.239869 0.0282557 0.970394 -2.51896
800 | -0 -0 0 1
801 | 160 160 0
802 | 0.968196 -0.0183693 0.249519 -1.7617
803 | 0.0274573 0.999078 -0.0329901 -0.0398184
804 | -0.248683 0.038792 0.967808 -2.62669
805 | -0 -0 0 1
806 | 161 161 0
807 | 0.946343 -0.0183151 0.322645 -1.637
808 | 0.0339631 0.9985 -0.0429362 -0.0423368
809 | -0.321375 0.0515905 0.945546 -2.68018
810 | -0 -0 0 1
811 | 162 162 0
812 | 0.937186 -0.017216 0.348406 -1.49892
813 | 0.0360137 0.998219 -0.0475485 -0.0286375
814 | -0.346967 0.0571091 0.936137 -2.73866
815 | -0 -0 0 1
816 | 163 163 0
817 | 0.972469 -0.022066 0.231986 -1.3547
818 | 0.033411 0.998424 -0.0450888 -0.00445069
819 | -0.230625 0.0515983 0.971674 -2.67768
820 | -0 -0 0 1
821 | 164 164 0
822 | 0.992416 -0.0301697 0.119169 -1.19404
823 | 0.0348041 0.998709 -0.0370014 -0.00739048
824 | -0.117898 0.0408683 0.992184 -2.57002
825 | -0 -0 0 1
826 | 165 165 0
827 | 0.987165 -0.0330802 0.156241 -1.04726
828 | 0.0391365 0.998591 -0.035846 -0.00718917
829 | -0.154835 0.0415006 0.987068 -2.44599
830 | -0 -0 0 1
831 | 166 166 0
832 | 0.989198 -0.0325258 0.142928 -0.898864
833 | 0.0386593 0.998438 -0.0403474 -0.000562265
834 | -0.141392 0.0454371 0.98891 -2.34373
835 | -0 -0 0 1
836 | 167 167 0
837 | 0.997569 -0.0329792 0.061381 -0.765259
838 | 0.0355242 0.998534 -0.0408437 -0.000636129
839 | -0.059944 0.0429249 0.997278 -2.24505
840 | -0 -0 0 1
841 | 168 168 0
842 | 0.998845 -0.0353579 -0.0325279 -0.657798
843 | 0.0341936 0.998778 -0.0356823 0.00675672
844 | 0.0337498 0.0345288 0.998834 -2.09728
845 | -0 -0 0 1
846 | 169 169 0
847 | 0.955395 -0.0327012 -0.293515 -0.582344
848 | 0.0273581 0.999377 -0.022292 0.0182423
849 | 0.294061 0.0132676 0.955695 -1.97161
850 | -0 -0 0 1
851 | 170 170 0
852 | 0.897482 -0.0179799 -0.440685 -0.497543
853 | 0.0286779 0.999433 0.0176276 0.0457167
854 | 0.440118 -0.0284584 0.897489 -1.86247
855 | -0 -0 0 1
856 | 171 171 0
857 | 0.814239 0.0771709 -0.575378 -0.398219
858 | 0.0351162 0.982763 0.181504 0.0568634
859 | 0.579467 -0.167993 0.797494 -1.78063
860 | -0 -0 0 1
861 | 172 172 0
862 | 0.821443 0.098241 -0.561765 -0.286716
863 | 0.0220564 0.978841 0.203431 0.0167165
864 | 0.569864 -0.179497 0.801895 -1.67505
865 | -0 -0 0 1
866 | 173 173 0
867 | 0.831551 0.034686 -0.554365 -0.175448
868 | 0.00905393 0.997069 0.0759666 -0.00767929
869 | 0.555375 -0.0681892 0.8288 -1.58214
870 | -0 -0 0 1
871 | 174 174 0
872 | 0.819782 -0.093765 -0.564947 -0.0915265
873 | 0.000769091 0.986684 -0.162645 -0.0201555
874 | 0.572675 0.132899 0.808938 -1.47626
875 | -0 -0 0 1
876 | 175 175 0
877 | 0.841446 -0.156675 -0.517129 0.00132468
878 | 0.0158707 0.963794 -0.266177 -0.0178098
879 | 0.540109 0.215766 0.813466 -1.35599
880 | -0 -0 0 1
881 | 176 176 0
882 | 0.872257 -0.166162 -0.459955 0.105492
883 | 0.0373009 0.960375 -0.276205 -0.0106844
884 | 0.487624 0.223765 0.843891 -1.25098
885 | -0 -0 0 1
886 | 177 177 0
887 | 0.891541 -0.169862 -0.419882 0.188008
888 | 0.0545493 0.960536 -0.272756 0.00473549
889 | 0.449643 0.220269 0.865623 -1.12911
890 | -0 -0 0 1
891 | 178 178 0
892 | 0.871739 -0.143447 -0.468502 0.276447
893 | 0.0619694 0.980784 -0.184992 0.0259625
894 | 0.486035 0.132232 0.863877 -1.02112
895 | -0 -0 0 1
896 | 179 179 0
897 | 0.793305 -0.0626367 -0.605594 0.3318
898 | 0.0672406 0.997622 -0.0151016 0.05136
899 | 0.6051 -0.0287403 0.795631 -0.878139
900 | -0 -0 0 1
901 | 180 180 0
902 | 0.72484 0.0392918 -0.687796 0.418494
903 | 0.0712852 0.988735 0.131608 0.0682817
904 | 0.685219 -0.144425 0.713874 -0.768978
905 | -0 -0 0 1
906 | 181 181 0
907 | 0.594943 0.139452 -0.791578 0.484694
908 | 0.0666078 0.972893 0.221456 0.0597262
909 | 0.801003 -0.184479 0.569527 -0.663485
910 | -0 -0 0 1
911 | 182 182 0
912 | 0.303994 0.208151 -0.929656 0.518774
913 | 0.0496087 0.971056 0.233642 0.0387798
914 | 0.951381 -0.117145 0.284869 -0.571561
915 | -0 -0 0 1
916 | 183 183 0
917 | 0.00227422 0.0933874 -0.995627 0.524108
918 | 0.00868429 0.99559 0.0934038 0.0283738
919 | 0.99996 -0.00885873 0.00145319 -0.512555
920 | -0 -0 0 1
921 | 184 184 0
922 | -0.184912 -0.13943 -0.972814 0.573556
923 | -0.00542279 0.990014 -0.140865 0.0254121
924 | 0.98274 -0.0207722 -0.183821 -0.459503
925 | -0 -0 0 1
926 | 185 185 0
927 | -0.243623 -0.18305 -0.952439 0.615612
928 | 0.0122502 0.981369 -0.191744 0.00546257
929 | 0.969793 -0.0583808 -0.236842 -0.373143
930 | -0 -0 0 1
931 | 186 186 0
932 | -0.204043 -0.201514 -0.957997 0.656087
933 | 0.0278193 0.976996 -0.211436 0.028252
934 | 0.978566 -0.0697928 -0.193744 -0.284877
935 | -0 -0 0 1
936 | 187 187 0
937 | -0.259171 -0.14106 -0.955475 0.687472
938 | 0.0390594 0.986937 -0.1563 0.0475007
939 | 0.965041 -0.0778287 -0.250276 -0.178919
940 | -0 -0 0 1
941 | 188 188 0
942 | -0.323293 0.0445995 -0.945247 0.741513
943 | 0.0532075 0.998165 0.0288983 0.0645402
944 | 0.944802 -0.0409516 -0.325073 -0.0634842
945 | -0 -0 0 1
946 | 189 189 0
947 | -0.28207 0.103683 -0.953775 0.798435
948 | 0.0599695 0.994105 0.0903317 0.0589328
949 | 0.957518 -0.0317175 -0.286624 0.0579204
950 | -0 -0 0 1
951 | 190 190 0
952 | -0.223844 0.129719 -0.965954 0.841675
953 | 0.0515243 0.991292 0.121182 0.0562224
954 | 0.973262 -0.0226443 -0.228579 0.185689
955 | -0 -0 0 1
956 | 191 191 0
957 | -0.1612 0.127183 -0.978693 0.923421
958 | 0.0421476 0.991644 0.121924 0.053142
959 | 0.986021 -0.0215954 -0.165213 0.316045
960 | -0 -0 0 1
961 | 192 192 0
962 | -0.0920782 0.113433 -0.98927 1.02263
963 | 0.0341624 0.993265 0.110711 0.043114
964 | 0.995166 -0.0236018 -0.0953332 0.446221
965 | -0 -0 0 1
966 | 193 193 0
967 | -0.0223008 0.0551414 -0.998229 1.11997
968 | 0.0286825 0.998102 0.0544936 0.0435416
969 | 0.99934 -0.0274165 -0.0238401 0.573031
970 | -0 -0 0 1
971 | 194 194 0
972 | 0.0394983 -0.0141958 -0.999119 1.16991
973 | 0.0254452 0.999589 -0.0131965 0.0593623
974 | 0.998896 -0.0249015 0.0398433 0.696441
975 | -0 -0 0 1
976 | 195 195 0
977 | 0.000813523 -0.113525 -0.993535 1.24647
978 | 0.0232494 0.993269 -0.113476 0.050798
979 | 0.999729 -0.0230068 0.00344744 0.778322
980 | -0 -0 0 1
981 | 196 196 0
982 | -0.171655 -0.291361 -0.941086 1.31649
983 | 0.0119723 0.954578 -0.297721 0.0404445
984 | 0.985084 -0.0623723 -0.16037 0.853472
985 | -0 -0 0 1
986 | 197 197 0
987 | -0.216312 -0.341681 -0.914584 1.3318
988 | 0.0345189 0.9335 -0.356912 0.0612265
989 | 0.975714 -0.108775 -0.190132 0.939534
990 | -0 -0 0 1
991 | 198 198 0
992 | -0.336344 -0.292449 -0.89518 1.26856
993 | 0.0428052 0.94483 -0.324752 0.0725807
994 | 0.940766 -0.147547 -0.30527 0.986707
995 | -0 -0 0 1
996 | 199 199 0
997 | -0.569467 -0.0598952 -0.819829 1.20543
998 | 0.056063 0.99219 -0.11143 0.0658361
999 | 0.8201 -0.109418 -0.561662 1.05782
1000 | -0 -0 0 1
1001 | 200 200 0
1002 | -0.533287 0.0142984 -0.845814 1.12767
1003 | 0.0664868 0.997473 -0.0250578 0.0618875
1004 | 0.843318 -0.0695984 -0.532889 1.09951
1005 | -0 -0 0 1
1006 | 201 201 0
1007 | -0.573364 0.125963 -0.80956 1.06906
1008 | 0.0590584 0.991894 0.112506 0.0796165
1009 | 0.817169 0.0166957 -0.576156 1.09444
1010 | -0 -0 0 1
1011 | 202 202 0
1012 | -0.65429 0.125225 -0.745804 1.05249
1013 | 0.0424828 0.990724 0.129078 0.0673899
1014 | 0.75505 0.0527709 -0.65354 1.10711
1015 | -0 -0 0 1
1016 | 203 203 0
1017 | -0.811488 0.115671 -0.572807 1.01984
1018 | 0.0393868 0.988811 0.143879 0.0510155
1019 | 0.58304 0.0941952 -0.806964 1.13691
1020 | -0 -0 0 1
1021 | 204 204 0
1022 | -0.890245 0.0868913 -0.447117 0.979527
1023 | 0.0160741 0.987017 0.159809 0.0137725
1024 | 0.455198 0.135082 -0.880084 1.12189
1025 | -0 -0 0 1
1026 | 205 205 0
1027 | -0.91407 0.0623769 -0.400732 0.932029
1028 | -0.0032273 0.986951 0.160988 -0.0792203
1029 | 0.405545 0.148447 -0.901941 1.11599
1030 | -0 -0 0 1
1031 | 206 206 0
1032 | -0.889116 0.0514512 -0.454781 0.896722
1033 | -0.0175256 0.989105 0.146165 -0.0955073
1034 | 0.457347 0.137927 -0.878527 1.12713
1035 | -0 -0 0 1
1036 | 207 207 0
1037 | -0.885755 0.0534896 -0.461062 0.883914
1038 | -0.0140073 0.989805 0.141741 -0.161561
1039 | 0.463943 0.132006 -0.875975 1.14965
1040 | -0 -0 0 1
1041 | 208 208 0
1042 | -0.889055 0.0486449 -0.455209 0.864529
1043 | -0.0155001 0.99057 0.136128 -0.211026
1044 | 0.457538 0.128081 -0.879917 1.19374
1045 | -0 -0 0 1
1046 | 209 209 0
1047 | -0.882596 0.049525 -0.467515 0.813448
1048 | -0.00961579 0.992326 0.123273 -0.253684
1049 | 0.470033 0.113295 -0.875347 1.23579
1050 | -0 -0 0 1
1051 | 210 210 0
1052 | -0.91843 0.0311022 -0.394359 0.789101
1053 | -0.00666188 0.995547 0.0940316 -0.297619
1054 | 0.395527 0.0889886 -0.914133 1.27153
1055 | -0 -0 0 1
1056 | 211 211 0
1057 | -0.926257 0.0117826 -0.376708 0.743153
1058 | -0.00921479 0.998504 0.0538885 -0.337723
1059 | 0.376779 0.0533859 -0.924763 1.27911
1060 | -0 -0 0 1
1061 | 212 212 0
1062 | -0.975703 0.00305495 -0.219076 0.692066
1063 | -0.00285815 0.99964 0.0266691 -0.335349
1064 | 0.219078 0.0266473 -0.975343 1.3144
1065 | -0 -0 0 1
1066 | 213 213 0
1067 | -0.967506 0.00873609 -0.252698 0.645635
1068 | 0.00489986 0.999863 0.0158064 -0.350849
1069 | 0.252801 0.0140546 -0.967416 1.34646
1070 | -0 -0 0 1
1071 | 214 214 0
1072 | -0.956068 0.0157708 -0.292721 0.632695
1073 | 0.0129416 0.999849 0.0115994 -0.309345
1074 | 0.29286 0.00730155 -0.956128 1.37064
1075 | -0 -0 0 1
1076 | 215 215 0
1077 | -0.956552 0.0237573 -0.290592 0.601631
1078 | 0.020863 0.999697 0.0130546 -0.259668
1079 | 0.290814 0.00642477 -0.956758 1.37076
1080 | -0 -0 0 1
1081 | 216 216 0
1082 | -0.981085 0.0277464 -0.191578 0.577576
1083 | 0.025524 0.999575 0.0140592 -0.216613
1084 | 0.191887 0.00890346 -0.981377 1.41012
1085 | -0 -0 0 1
1086 | 217 217 0
1087 | -0.984567 0.0285385 -0.172664 0.563388
1088 | 0.0272645 0.999581 0.00974602 -0.189107
1089 | 0.17287 0.004888 -0.984933 1.42512
1090 | -0 -0 0 1
1091 | 218 218 0
1092 | -0.972278 0.0320025 -0.231626 0.533392
1093 | 0.032002 0.999481 0.00376052 -0.163907
1094 | 0.231626 -0.0037562 -0.972798 1.43702
1095 | -0 -0 0 1
1096 | 219 219 0
1097 | -0.955128 0.0364776 -0.293938 0.469259
1098 | 0.0364484 0.99932 0.00557911 -0.139403
1099 | 0.293941 -0.00538479 -0.955808 1.436
1100 | -0 -0 0 1
1101 | 220 220 0
1102 | -0.930408 0.0427975 -0.364019 0.442404
1103 | 0.0416916 0.999071 0.0108992 -0.0914612
1104 | 0.364148 -0.0050358 -0.931328 1.45246
1105 | -0 -0 0 1
1106 | 221 221 0
1107 | -0.902365 0.0480943 -0.428281 0.433516
1108 | 0.0446421 0.998839 0.0181073 -0.0316921
1109 | 0.428655 -0.00277993 -0.903464 1.48354
1110 | -0 -0 0 1
1111 | 222 222 0
1112 | -0.89984 0.0507775 -0.433254 0.403773
1113 | 0.0459532 0.99871 0.0216072 -0.00980495
1114 | 0.433792 -0.000466397 -0.901013 1.49889
1115 | -0 -0 0 1
1116 | 223 223 0
1117 | -0.989712 0.0436915 -0.136237 0.299925
1118 | 0.0424743 0.999028 0.0118299 0.0390397
1119 | 0.136622 0.00592159 -0.990606 1.55628
1120 | -0 -0 0 1
1121 | 224 224 0
1122 | -0.981198 0.0273279 0.191061 0.210655
1123 | 0.0310229 0.999385 0.0163744 0.088648
1124 | -0.190496 0.0219938 -0.981441 1.59058
1125 | -0 -0 0 1
1126 | 225 225 0
1127 | -0.931171 -0.0243756 0.363767 0.141861
1128 | 0.0259256 0.990809 0.132757 0.120685
1129 | -0.36366 0.133051 -0.921981 1.63936
1130 | -0 -0 0 1
1131 | 226 226 0
1132 | -0.888968 -0.0932072 0.448383 0.0802484
1133 | 0.0274197 0.966481 0.255269 0.0825339
1134 | -0.457147 0.23922 -0.856616 1.63519
1135 | -0 -0 0 1
1136 | 227 227 0
1137 | -0.897164 -0.0709521 0.435961 0.00877627
1138 | 0.00542116 0.985171 0.171492 0.0523039
1139 | -0.441664 0.15622 -0.883475 1.59087
1140 | -0 -0 0 1
1141 | 228 228 0
1142 | -0.920869 0.0254522 0.389041 -0.0750516
1143 | -0.000729418 0.997753 -0.0670024 0.0686162
1144 | -0.389872 -0.0619842 -0.918781 1.53789
1145 | -0 -0 0 1
1146 | 229 229 0
1147 | -0.942542 0.0940571 0.320573 -0.141236
1148 | 0.00991552 0.967004 -0.254568 0.0535068
1149 | -0.333939 -0.236763 -0.912375 1.45656
1150 | -0 -0 0 1
1151 | 230 230 0
1152 | -0.960652 0.117172 0.251831 -0.246803
1153 | 0.0339497 0.9494 -0.312229 0.0519612
1154 | -0.275672 -0.291394 -0.916021 1.35607
1155 | -0 -0 0 1
1156 | 231 231 0
1157 | -0.947451 0.127375 0.293448 -0.352153
1158 | 0.0476087 0.963239 -0.264392 0.0447648
1159 | -0.316337 -0.236528 -0.918687 1.27848
1160 | -0 -0 0 1
1161 | 232 232 0
1162 | -0.849918 0.0996258 0.51741 -0.436443
1163 | 0.0518492 0.99301 -0.106032 0.0490551
1164 | -0.524357 -0.0632909 -0.849143 1.18738
1165 | -0 -0 0 1
1166 | 233 233 0
1167 | -0.851484 0.10255 0.514256 -0.508543
1168 | 0.0731102 0.994329 -0.0772307 0.0378645
1169 | -0.519259 -0.0281634 -0.854153 1.06461
1170 | -0 -0 0 1
1171 | 234 234 0
1172 | -0.875894 0.100522 0.471916 -0.565329
1173 | 0.0789129 0.994733 -0.0654214 0.0334068
1174 | -0.476006 -0.020062 -0.879213 0.934473
1175 | -0 -0 0 1
1176 | 235 235 0
1177 | -0.905974 0.0926503 0.413071 -0.642205
1178 | 0.0764549 0.995521 -0.0556061 0.0383274
1179 | -0.416373 -0.0187963 -0.908999 0.80844
1180 | -0 -0 0 1
1181 | 236 236 0
1182 | -0.929035 0.0834526 0.360457 -0.720689
1183 | 0.0714356 0.996358 -0.046559 0.0369699
1184 | -0.36303 -0.0175055 -0.931613 0.689641
1185 | -0 -0 0 1
1186 | 237 237 0
1187 | -0.94946 0.0739677 0.305048 -0.807114
1188 | 0.0654754 0.997131 -0.0379915 0.0285016
1189 | -0.306983 -0.0160983 -0.951579 0.56501
1190 | -0 -0 0 1
1191 | 238 238 0
1192 | -0.966639 0.065664 0.247583 -0.892023
1193 | 0.0589631 0.997668 -0.0343916 0.0306911
1194 | -0.249264 -0.018646 -0.968256 0.443177
1195 | -0 -0 0 1
1196 | 239 239 0
1197 | -0.966739 0.0586794 0.248942 -0.956712
1198 | 0.0506661 0.997974 -0.0384812 0.0281723
1199 | -0.250695 -0.0245883 -0.967754 0.302871
1200 | -0 -0 0 1
1201 | 240 240 0
1202 | -0.931889 0.0506265 0.359194 -1.01847
1203 | 0.0384633 0.998421 -0.0409335 0.0263363
1204 | -0.3607 -0.0243296 -0.932365 0.198633
1205 | -0 -0 0 1
1206 | 241 241 0
1207 | -0.81898 0.0449272 0.572061 -1.06483
1208 | 0.0248726 0.998773 -0.0428308 0.053526
1209 | -0.573283 -0.0208489 -0.819092 0.145493
1210 | -0 -0 0 1
1211 | 242 242 0
1212 | -0.545275 -0.022641 0.837952 -1.03962
1213 | 0.0109886 0.999356 0.0341526 0.0889837
1214 | -0.838185 0.0278305 -0.544675 0.17433
1215 | -0 -0 0 1
1216 | 243 243 0
1217 | -0.0757316 -0.106718 0.991401 -1.04699
1218 | 0.0141384 0.994041 0.108082 0.0941254
1219 | -0.997028 0.0222021 -0.0737715 0.20848
1220 | -0 -0 0 1
1221 | 244 244 0
1222 | 0.206159 -0.141984 0.968163 -1.04488
1223 | 0.00714286 0.989609 0.143608 0.0891512
1224 | -0.978492 -0.0226906 0.205031 0.180166
1225 | -0 -0 0 1
1226 | 245 245 0
1227 | 0.216052 -0.0204518 0.976168 -1.05371
1228 | -0.00458389 0.999748 0.0219604 0.0631715
1229 | -0.976371 -0.00921923 0.215904 0.10221
1230 | -0 -0 0 1
1231 | 246 246 0
1232 | 0.272122 0.125524 0.95404 -1.07263
1233 | -0.011194 0.991801 -0.127299 0.0658675
1234 | -0.962198 0.0239615 0.271296 0.00604324
1235 | -0 -0 0 1
1236 | 247 247 0
1237 | 0.380217 0.128902 0.915871 -1.04457
1238 | -0.00406444 0.990464 -0.137713 0.0752788
1239 | -0.924888 0.0486385 0.377115 -0.079502
1240 | -0 -0 0 1
1241 | 248 248 0
1242 | 0.431156 0.119035 0.894391 -1.03322
1243 | 0.01433 0.990231 -0.138699 0.0847884
1244 | -0.902164 0.0726173 0.425238 -0.190848
1245 | -0 -0 0 1
1246 | 249 249 0
1247 | 0.486294 0.10675 0.86725 -1.02153
1248 | 0.0283198 0.990063 -0.137746 0.0836847
1249 | -0.873336 0.0915455 0.478438 -0.312239
1250 | -0 -0 0 1
1251 | 250 250 0
1252 | 0.581211 0.0829905 0.80951 -1.01129
1253 | 0.0323529 0.991643 -0.124891 0.0745738
1254 | -0.813109 0.0987783 0.573669 -0.38064
1255 | -0 -0 0 1
1256 | 251 251 0
1257 | 0.626845 0.0671796 0.776242 -1.06905
1258 | 0.0425381 0.991839 -0.12019 0.0697121
1259 | -0.777982 0.10836 0.618872 -0.46041
1260 | -0 -0 0 1
1261 | 252 252 0
1262 | 0.565412 0.0560908 0.822899 -1.11695
1263 | 0.0514775 0.99334 -0.103079 0.0639613
1264 | -0.823201 0.100643 0.558759 -0.528949
1265 | -0 -0 0 1
1266 | 253 253 0
1267 | 0.516024 0.0556809 0.854763 -1.1779
1268 | 0.0520683 0.994001 -0.096185 0.0602044
1269 | -0.85499 0.0941398 0.510029 -0.582466
1270 | -0 -0 0 1
1271 | 254 254 0
1272 | 0.490298 0.0541857 0.869869 -1.2333
1273 | 0.0481943 0.994853 -0.0891357 0.0515047
1274 | -0.870221 0.0856258 0.485163 -0.603026
1275 | -0 -0 0 1
1276 | 255 255 0
1277 | 0.485795 0.0321823 0.87348 -1.28976
1278 | 0.0472851 0.996891 -0.0630273 0.0352259
1279 | -0.872793 0.0719209 0.482762 -0.653225
1280 | -0 -0 0 1
1281 | 256 256 0
1282 | 0.560554 0.000768552 0.828118 -1.28702
1283 | 0.0408404 0.998757 -0.0285718 -0.0338705
1284 | -0.82711 0.0498367 0.559826 -0.702847
1285 | -0 -0 0 1
1286 | 257 257 0
1287 | 0.678921 -0.0109384 0.73413 -1.21352
1288 | 0.0358694 0.999189 -0.0182842 -0.065632
1289 | -0.733335 0.0387463 0.678763 -0.740479
1290 | -0 -0 0 1
1291 | 258 258 0
1292 | 0.694831 -0.00645815 0.719144 -1.18682
1293 | 0.0310111 0.999299 -0.0209886 -0.045945
1294 | -0.718504 0.036885 0.694544 -0.78659
1295 | -0 -0 0 1
1296 | 259 259 0
1297 | 0.661817 -0.00310285 0.749659 -1.20656
1298 | 0.0277137 0.999409 -0.0203298 -0.00248284
1299 | -0.749153 0.0342305 0.661512 -0.859404
1300 | -0 -0 0 1
1301 | 260 260 0
1302 | 0.591394 -0.00361708 0.806375 -1.24936
1303 | 0.0293981 0.999422 -0.0170775 0.0427053
1304 | -0.805847 0.0338055 0.591159 -0.940835
1305 | -0 -0 0 1
1306 | 261 261 0
1307 | 0.524946 -0.0226406 0.850835 -1.32684
1308 | 0.0268639 0.999589 0.0100245 0.0454883
1309 | -0.850712 0.0175944 0.525338 -1.02199
1310 | -0 -0 0 1
1311 | 262 262 0
1312 | 0.471509 -0.0879195 0.877468 -1.40557
1313 | 0.0237315 0.995922 0.0870361 0.0633838
1314 | -0.881542 -0.0202147 0.471673 -1.10442
1315 | -0 -0 0 1
1316 | 263 263 0
1317 | 0.409037 -0.192209 0.892045 -1.43429
1318 | 0.0160996 0.978932 0.203549 0.0692954
1319 | -0.912376 -0.0688973 0.403514 -1.08644
1320 | -0 -0 0 1
1321 | 264 264 0
1322 | 0.16527 -0.215502 0.962416 -1.39573
1323 | -0.00643276 0.975579 0.219554 0.068642
1324 | -0.986227 -0.0424767 0.159848 -1.02267
1325 | -0 -0 0 1
1326 | 265 265 0
1327 | 0.0129025 -0.0420417 0.999033 -1.37839
1328 | -0.0022113 0.999112 0.0420736 0.0293147
1329 | -0.999914 -0.00275202 0.0127981 -1.00014
1330 | -0 -0 0 1
1331 | 266 266 0
1332 | -0.0855129 0.0518348 0.994988 -1.39137
1333 | 0.0102156 0.998639 -0.051147 -0.00244455
1334 | -0.996285 0.00579064 -0.085926 -1.04926
1335 | -0 -0 0 1
1336 | 267 267 0
1337 | -0.0557775 0.220956 0.973687 -1.39051
1338 | 0.00281539 0.975237 -0.221146 -0.028823
1339 | -0.998439 -0.00959367 -0.0550184 -1.05127
1340 | -0 -0 0 1
1341 | 268 268 0
1342 | 0.225585 0.292997 0.92912 -1.40564
1343 | -0.00610795 0.95411 -0.299394 -0.0330941
1344 | -0.974204 0.0618637 0.217022 -1.13348
1345 | -0 -0 0 1
1346 | 269 269 0
1347 | 0.451619 0.259523 0.853632 -1.42602
1348 | 0.0170192 0.954081 -0.299065 -0.0124998
1349 | -0.892049 0.149592 0.426464 -1.2351
1350 | -0 -0 0 1
1351 | 270 270 0
1352 | 0.534558 0.186314 0.824339 -1.45288
1353 | 0.0450174 0.967735 -0.247916 0.00557297
1354 | -0.843932 0.169635 0.508923 -1.30633
1355 | -0 -0 0 1
1356 | 271 271 0
1357 | 0.684433 0.120046 0.719125 -1.4484
1358 | 0.0508119 0.976099 -0.211305 0.0105199
1359 | -0.727303 0.181164 0.661974 -1.42734
1360 | -0 -0 0 1
1361 | 272 272 0
1362 | 0.788271 0.0107892 0.615233 -1.43362
1363 | 0.0625215 0.993267 -0.0975247 0.0332949
1364 | -0.612143 0.115341 0.782289 -1.52453
1365 | -0 -0 0 1
1366 | 273 273 0
1367 | 0.879121 -0.0731634 0.470949 -1.39263
1368 | 0.0674528 0.9973 0.0290194 0.0336566
1369 | -0.4718 0.00625524 0.881683 -1.64314
1370 | -0 -0 0 1
1371 | 274 274 0
1372 | 0.922964 -0.112508 0.368074 -1.34011
1373 | 0.0599251 0.986666 0.151326 0.0346216
1374 | -0.380192 -0.117612 0.917399 -1.75004
1375 | -0 -0 0 1
1376 | 275 275 0
1377 | 0.945659 -0.0997101 0.309493 -1.24884
1378 | 0.0421132 0.981363 0.187491 0.0133057
1379 | -0.32242 -0.164269 0.932235 -1.85005
1380 | -0 -0 0 1
1381 | 276 276 0
1382 | 0.952192 -0.0740811 0.296382 -1.14395
1383 | 0.0191668 0.982729 0.184057 -0.0138983
1384 | -0.304898 -0.169577 0.937166 -1.85154
1385 | -0 -0 0 1
1386 | 277 277 0
1387 | 0.976814 -0.0111487 0.2138 -1.03866
1388 | 0.00420228 0.999449 0.0329175 -0.0699878
1389 | -0.214049 -0.0312558 0.976323 -1.84841
1390 | -0 -0 0 1
1391 | 278 278 0
1392 | 0.99921 0.0258983 0.0301528 -0.938155
1393 | -0.0211531 0.988725 -0.148243 -0.0843899
1394 | -0.0336521 0.147488 0.988491 -1.81725
1395 | -0 -0 0 1
1396 | 279 279 0
1397 | 0.998981 -0.00863183 -0.0442977 -0.837251
1398 | -0.00167489 0.973772 -0.22752 -0.0906046
1399 | 0.0450998 0.227363 0.972765 -1.76802
1400 | -0 -0 0 1
1401 | 280 280 0
1402 | 0.999716 -0.0222741 0.00842429 -0.748421
1403 | 0.0236945 0.965767 -0.258328 -0.0882995
1404 | -0.00238189 0.258454 0.966021 -1.71486
1405 | -0 -0 0 1
1406 | 281 281 0
1407 | 0.99645 -0.0205864 0.0816307 -0.676479
1408 | 0.0414309 0.964008 -0.262627 -0.0444318
1409 | -0.073286 0.265077 0.961438 -1.71041
1410 | -0 -0 0 1
1411 | 282 282 0
1412 | 0.999412 -0.0341696 0.00286222 -0.58081
1413 | 0.0337331 0.964797 -0.260822 0.00281658
1414 | 0.00615072 0.260765 0.965383 -1.78848
1415 | -0 -0 0 1
1416 | 283 283 0
1417 | 0.981467 -0.0758679 -0.175972 -0.452485
1418 | 0.0398253 0.978995 -0.199958 0.0277672
1419 | 0.187446 0.189244 0.963873 -1.83278
1420 | -0 -0 0 1
1421 | 284 284 0
1422 | 0.941353 -0.0850585 -0.326527 -0.341121
1423 | 0.0595297 0.994392 -0.0874138 0.0457903
1424 | 0.332131 0.0628492 0.941137 -1.87592
1425 | -0 -0 0 1
1426 | 285 285 0
1427 | 0.926289 -0.0761625 -0.369037 -0.227173
1428 | 0.0697598 0.997092 -0.0306832 0.0540367
1429 | 0.370301 0.00267752 0.928908 -1.93913
1430 | -0 -0 0 1
1431 | 286 286 0
1432 | 0.897798 -0.0661135 -0.435417 -0.14579
1433 | 0.0621251 0.997794 -0.023407 0.0735809
1434 | 0.436004 -0.00603562 0.899924 -1.98243
1435 | -0 -0 0 1
1436 | 287 287 0
1437 | 0.867364 -0.050272 -0.495129 -0.10782
1438 | 0.0497226 0.998661 -0.0142935 0.0671998
1439 | 0.495185 -0.0122214 0.868702 -1.99849
1440 | -0 -0 0 1
1441 | 288 288 0
1442 | 0.842096 -0.0484652 -0.537146 -0.0928497
1443 | 0.0394181 0.998821 -0.0283241 0.0499989
1444 | 0.537885 0.00267836 0.843014 -1.99987
1445 | -0 -0 0 1
1446 | 289 289 0
1447 | 0.880893 -0.0706861 -0.468008 -0.0853173
1448 | 0.0354844 0.995866 -0.0836224 0.048002
1449 | 0.471984 0.0570554 0.879759 -2.00119
1450 | -0 -0 0 1
1451 | 290 290 0
1452 | 0.940245 -0.068394 -0.333559 -0.0832527
1453 | 0.0373347 0.994421 -0.098659 0.0497564
1454 | 0.338446 0.0803103 0.937552 -2.00599
1455 | -0 -0 0 1
1456 | 291 291 0
1457 | 0.95971 -0.0751895 -0.270745 -0.0789363
1458 | 0.0357128 0.988359 -0.147889 0.0504851
1459 | 0.278713 0.132261 0.951223 -2.01057
1460 | -0 -0 0 1
1461 | 292 292 0
1462 | 0.962936 -0.074663 -0.25919 -0.119419
1463 | 0.0344853 0.987118 -0.156233 0.0512516
1464 | 0.267516 0.141504 0.953106 -2.0332
1465 | -0 -0 0 1
1466 | 293 293 0
1467 | 0.98293 -0.0630221 -0.172846 -0.241745
1468 | 0.0390512 0.98956 -0.138733 0.0529808
1469 | 0.179785 0.129615 0.975129 -2.13164
1470 | -0 -0 0 1
1471 | 294 294 0
1472 | 0.992315 -0.048361 -0.113892 -0.34153
1473 | 0.0397877 0.996284 -0.076382 0.0517629
1474 | 0.117163 0.0712636 0.990553 -2.22131
1475 | -0 -0 0 1
1476 | 295 295 0
1477 | 0.992652 -0.0409192 -0.113874 -0.358829
1478 | 0.0399959 0.999146 -0.0103822 0.0396408
1479 | 0.114202 0.00575142 0.993441 -2.30874
1480 | -0 -0 0 1
1481 | 296 296 0
1482 | 0.967552 -0.0462198 -0.248407 -0.369018
1483 | 0.031495 0.997521 -0.0629295 0.0291705
1484 | 0.2507 0.053064 0.966609 -2.3515
1485 | -0 -0 0 1
1486 | 297 297 0
1487 | 0.949118 -0.0814574 -0.304203 -0.365029
1488 | 0.0212955 0.980358 -0.196071 0.0410547
1489 | 0.3142 0.179617 0.93221 -2.37868
1490 | -0 -0 0 1
1491 | 298 298 0
1492 | 0.94423 -0.0940247 -0.315578 -0.371787
1493 | 0.0305002 0.97922 -0.200495 0.0482809
1494 | 0.327871 0.179688 0.927477 -2.37229
1495 | -0 -0 0 1
1496 | 299 299 0
1497 | 0.929896 -0.104079 -0.352789 -0.374905
1498 | 0.0372072 0.980828 -0.191288 0.0533307
1499 | 0.365935 0.164752 0.915941 -2.37071
1500 | -0 -0 0 1
1501 | 300 300 0
1502 | 0.944447 -0.0926965 -0.31532 -0.374779
1503 | 0.0481573 0.988078 -0.14623 0.0581205
1504 | 0.325116 0.122922 0.937651 -2.3693
1505 | -0 -0 0 1
1506 |
--------------------------------------------------------------------------------
/dataloader/datalist/tanks/test.txt:
--------------------------------------------------------------------------------
1 | Family
2 | Francis
3 | Horse
4 | Lighthouse
5 | M60
6 | Panther
7 | Playground
8 | Train
9 |
--------------------------------------------------------------------------------
/dataloader/mvs_dataset.py:
--------------------------------------------------------------------------------
1 | import torch
2 | from torch.utils.data import Dataset
3 |
4 | from PIL import Image
5 | from utils.utils import read_pfm
6 |
7 | import numpy as np
8 | import cv2
9 | import glob
10 | import os, sys
11 | import re
12 |
13 |
14 | def scale_inputs(img, intrinsics, max_w, max_h, base=32):
15 | h, w = img.shape[:2]
16 | if h > max_h or w > max_w:
17 | scale = 1.0 * max_h / h
18 | if scale * w > max_w:
19 | scale = 1.0 * max_w / w
20 | new_w, new_h = scale * w // base * base, scale * h // base * base
21 | else:
22 | new_w, new_h = 1.0 * w // base * base, 1.0 * h // base * base
23 |
24 | scale_w = 1.0 * new_w / w
25 | scale_h = 1.0 * new_h / h
26 | intrinsics[0, :] *= scale_w
27 | intrinsics[1, :] *= scale_h
28 | img = cv2.resize(img, (int(new_w), int(new_h)))
29 | return img, intrinsics
30 |
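# Annotation (not part of the original file): scale_inputs() shrinks images
# larger than max_h x max_w and snaps both sides down to multiples of `base`
# (32), so the 1/4- and 1/2-resolution feature maps derived later keep integer
# sizes; the intrinsics are rescaled in place by the same width/height factors.
# A minimal sketch, assuming a hypothetical 1200x1600 input:
#
#   img = np.zeros((1200, 1600, 3), np.uint8)
#   K = np.array([[1446., 0., 800.], [0., 1446., 600.], [0., 0., 1.]], np.float32)
#   img, K = scale_inputs(img, K, max_w=1152, max_h=864)
#   assert img.shape[:2] == (864, 1152)  # both sides divisible by 32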
31 | class MVSTrainSet(Dataset):
32 | def __init__(self, root_dir, data_list, lightings=range(7), num_views=4):
33 | super(MVSTrainSet, self).__init__()
34 |
35 | self.root_dir = root_dir
36 | scene_names = open(data_list, 'r').readlines()
37 | self.scene_names = list(map(lambda x: x.strip(), scene_names))
38 | self.lightings = lightings
39 | self.num_views = num_views
40 | self.generate_pairs()
41 |
42 |     def generate_pairs(self):
43 | data_pairs = []
44 | pair_list = open('{}/Cameras/pair.txt'.format(self.root_dir), 'r').readlines()
45 |
46 | pair_list = list(map(lambda x: x.strip(), pair_list))
47 | cnt = int(pair_list[0])
48 | for i in range(cnt):
49 | ref_id = int(pair_list[i*2+1])
50 | candidates = pair_list[i*2+2].split()
51 |
52 | nei_id = [int(candidates[2*j+1]) for j in range(self.num_views)]
53 | for scene_name in self.scene_names:
54 | for light in self.lightings:
55 | data_pairs.append({'scene_name': scene_name,
56 | 'frame_idx': [ref_id, ]+nei_id,
57 | 'light': light
58 | })
59 | self.data_pairs = data_pairs
60 |
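# Annotation (layout of Cameras/pair.txt, inferred from the indexing above):
#   line 0     : number of reference views
#   line 2*i+1 : id of the i-th reference view
#   line 2*i+2 : "M id_0 score_0 id_1 score_1 ..." source views ranked by score
# generate_pairs() keeps the ids of the top `num_views` ranked sources and
# pairs each (reference, sources) tuple with every scene and lighting index.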
61 | def parse_cameras(self, path):
62 | cam_txt = open(path).readlines()
63 | f = lambda xs: list(map(lambda x: list(map(float, x.strip().split())), xs))
64 |
65 | extr_mat = f(cam_txt[1:5])
66 | intr_mat = f(cam_txt[7:10])
67 |
68 | extr_mat = np.array(extr_mat, np.float32)
69 | intr_mat = np.array(intr_mat, np.float32)
70 |
71 | min_dep, delta = list(map(float, cam_txt[11].strip().split()))
72 | max_dep = 1.06 * 191.5 * delta + min_dep
73 |
74 | intr_mat[:2] *= 4.
75 | # note the loaded camera model is for 1/4 original resolution
76 |
77 | return extr_mat, intr_mat, min_dep, max_dep
78 |
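# Annotation (assumed cam.txt layout, read off the slicing above; the header
# words are an assumption carried over from the standard MVSNet camera format):
#   line 0     : 'extrinsic'
#   lines 1-4  : 4x4 world-to-camera extrinsic matrix
#   line 6     : 'intrinsic'
#   lines 7-9  : 3x3 intrinsic matrix
#   line 11    : 'depth_min depth_interval'
# The far bound spans about 192 interval steps past depth_min (1.06 * 191.5
# * delta, i.e. roughly a 6% safety margin), and the intrinsics are multiplied
# by 4 because the shipped cameras are parameterized for quarter resolution.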
79 | def load_depths(self, path):
80 | depth_s3 = np.array(read_pfm(path)[0], np.float32)
81 | h, w = depth_s3.shape
82 | depth_s2 = cv2.resize(depth_s3, (w//2, h//2), interpolation=cv2.INTER_NEAREST)
83 | depth_s1 = cv2.resize(depth_s3, (w//4, h//4), interpolation=cv2.INTER_NEAREST)
84 | return {'stage1': depth_s1, 'stage2': depth_s2, 'stage3': depth_s3}
85 |
86 | def make_masks(self, depths:dict, min_d, max_d):
87 | masks = {}
88 | for k, v in depths.items():
89 | m = np.ones(v.shape, np.uint8)
90 | m[v>max_d] = 0
91 |             m[v<min_d] = 0
--------------------------------------------------------------------------------
/depthfusion.py:
--------------------------------------------------------------------------------
109 |     mask_image = np.squeeze(np.where(depth_image > 0, 1, 0))
110 | mask_image = np.reshape(mask_image, (image_shape[0], image_shape[1], 1))
111 | mask_image = np.tile(mask_image, [1, 1, 3])
112 | mask_image = np.float32(mask_image)
113 |
114 | normal_image = np.multiply(normal_image, mask_image)
115 | normal_image = np.float32(normal_image)
116 |
117 | write_gipuma_dmb(out_normal_path, normal_image)
118 | return
119 |
120 | def ucsnet_to_gipuma(dense_folder, gipuma_point_folder):
121 |
122 | image_folder = os.path.join(dense_folder, 'rgb')
123 | cam_folder = os.path.join(dense_folder, 'cam')
124 | depth_folder = os.path.join(dense_folder, 'depth')
125 |
126 | gipuma_cam_folder = os.path.join(gipuma_point_folder, 'cams')
127 | gipuma_image_folder = os.path.join(gipuma_point_folder, 'images')
128 |
129 |
130 | if not os.path.isdir(gipuma_point_folder):
131 | os.mkdir(gipuma_point_folder)
132 | if not os.path.isdir(gipuma_cam_folder):
133 | os.mkdir(gipuma_cam_folder)
134 | if not os.path.isdir(gipuma_image_folder):
135 | os.mkdir(gipuma_image_folder)
136 |
137 | # convert cameras
138 | image_names = os.listdir(image_folder)
139 | for image_name in image_names:
140 | image_prefix = os.path.splitext(image_name)[0]
141 | in_cam_file = os.path.join(cam_folder, 'cam_'+image_prefix+'.txt')
142 | out_cam_file = os.path.join(gipuma_cam_folder, image_name+'.P')
143 | ucsnet_to_gipuma_cam(in_cam_file, out_cam_file)
144 |
145 | # copy images to gipuma image folder
146 | image_names = os.listdir(image_folder)
147 | for image_name in image_names:
148 | in_image_file = os.path.join(image_folder, image_name)
149 | out_image_file = os.path.join(gipuma_image_folder, image_name)
150 | shutil.copyfile(in_image_file, out_image_file)
151 |
152 | # convert depth maps and fake normal maps
153 | gipuma_prefix = '2333__'
154 | for image_name in image_names:
155 | image_prefix = os.path.splitext(image_name)[0]
156 | sub_depth_folder = os.path.join(gipuma_point_folder, gipuma_prefix+image_prefix)
157 | if not os.path.isdir(sub_depth_folder):
158 | os.mkdir(sub_depth_folder)
159 |
160 | in_depth_pfm = os.path.join(depth_folder, image_prefix+'_prob_filtered.pfm')
161 | #
162 | out_depth_dmb = os.path.join(sub_depth_folder, 'disp.dmb')
163 | fake_normal_dmb = os.path.join(sub_depth_folder, 'normals.dmb')
164 | ucsnet_to_gipuma_dmb(in_depth_pfm, out_depth_dmb)
165 | fake_gipuma_normal(out_depth_dmb, fake_normal_dmb)
166 |
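# Annotation (layout produced under gipuma_point_folder, read off the code):
#   cams/<image_name>.P           projection matrix from ucsnet_to_gipuma_cam()
#   images/<image_name>           verbatim copy of the input RGB image
#   2333__<prefix>/disp.dmb       probability-filtered depth map in .dmb format
#   2333__<prefix>/normals.dmb    constant "fake" normals so fusibile can run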
167 | def probability_filter(dense_folder, prob_threshold, s=3):
168 |     '''
169 |     Filter the stage-3 depth maps: zero out depths whose confidence is
170 |     below prob_threshold and save them as '<prefix>_prob_filtered.pfm'.
171 |     :param dense_folder: folder containing 'rgb', 'depth' and 'confidence'
172 |     :param prob_threshold: minimum confidence for a depth value to be kept
173 |     '''
174 | image_folder = os.path.join(dense_folder, 'rgb')
175 | depth_folder = os.path.join(dense_folder, 'depth')
176 | conf_folder = os.path.join(dense_folder, 'confidence')
177 |
178 |     # filter each view's depth map
179 | image_names = os.listdir(image_folder)
180 | for image_name in image_names:
181 | image_prefix = os.path.splitext(image_name)[0]
182 |
183 | init_depth_map_path = os.path.join(depth_folder, 'dep_'+image_prefix+'_3.pfm')
184 | prob_map_path = os.path.join(conf_folder, 'conf_'+image_prefix+'_1.pfm')
185 | out_depth_map_path = os.path.join(depth_folder, image_prefix+'_prob_filtered.pfm')
186 |
187 | depth_map, _ = read_pfm(init_depth_map_path)
188 | prob_map, _ = read_pfm(prob_map_path)
189 | h, w = depth_map.shape
190 |
191 | prob_map = cv2.resize(prob_map, (w, h), interpolation=cv2.INTER_LINEAR)
192 |
193 | depth_map[prob_map < prob_threshold] = 0
194 | write_pfm(out_depth_map_path, depth_map)
195 |
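# Annotation (file naming used above): for each view <prefix> this reads
# 'depth/dep_<prefix>_3.pfm' (stage-3 depth) and 'confidence/conf_<prefix>_1.pfm',
# resizes the confidence map to the depth resolution, zeroes low-confidence
# depths, and writes 'depth/<prefix>_prob_filtered.pfm' for ucsnet_to_gipuma().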
196 | def depth_map_fusion(point_folder, fusibile_exe_path, disp_thresh, num_consistent):
197 |
198 | cam_folder = os.path.join(point_folder, 'cams')
199 | image_folder = os.path.join(point_folder, 'images')
200 | depth_min = 0.001
201 | depth_max = 100000
202 | normal_thresh = 360
203 |
204 | cmd = fusibile_exe_path
205 | cmd = cmd + ' -input_folder ' + point_folder + '/'
206 | cmd = cmd + ' -p_folder ' + cam_folder + '/'
207 | cmd = cmd + ' -images_folder ' + image_folder + '/'
208 | cmd = cmd + ' --depth_min=' + str(depth_min)
209 | cmd = cmd + ' --depth_max=' + str(depth_max)
210 | cmd = cmd + ' --normal_thresh=' + str(normal_thresh)
211 | cmd = cmd + ' --disp_thresh=' + str(disp_thresh)
212 | cmd = cmd + ' --num_consistent=' + str(num_consistent)
213 |     print(cmd)
214 | os.system(cmd)
215 |
216 | return
217 |
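# Annotation: the command assembled above expands to a fusibile call of the
# form (paths hypothetical):
#   <fusibile_exe> -input_folder <points>/ -p_folder <points>/cams/ \
#       -images_folder <points>/images/ --depth_min=0.001 --depth_max=100000 \
#       --normal_thresh=360 --disp_thresh=<disp_thresh> --num_consistent=<n>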
218 | if __name__ == '__main__':
219 | parser = argparse.ArgumentParser()
220 |     parser.add_argument('--dense_folder', type=str, default='')
221 |     parser.add_argument('--fusibile_exe_path', type=str, default='')
222 |     parser.add_argument('--prob_threshold', type=float, default=0.8)
223 |     parser.add_argument('--disp_threshold', type=float, default=0.25)
224 |     parser.add_argument('--num_consistent', type=int, default=3)
225 | args = parser.parse_args()
226 |
227 | dense_folder = args.dense_folder
228 | fusibile_exe_path = args.fusibile_exe_path
229 | prob_threshold = args.prob_threshold
230 | disp_threshold = args.disp_threshold
231 | num_consistent = args.num_consistent
232 |
233 | point_folder = os.path.join(dense_folder, 'points_ucsnet')
234 | if not os.path.isdir(point_folder):
235 | os.mkdir(point_folder)
236 |
237 | # probability filter
238 |     print('filter depth map with probability map')
239 | probability_filter(dense_folder, prob_threshold)
240 |
241 | # convert to gipuma format
242 |     print('Convert ucsnet output to gipuma input')
243 | ucsnet_to_gipuma(dense_folder, point_folder)
244 |
245 | # depth map fusion with gipuma
246 |     print('Run depth map fusion & filter')
247 | depth_map_fusion(point_folder, fusibile_exe_path, disp_threshold, num_consistent)
248 |
--------------------------------------------------------------------------------
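`depth_map_fusion` above builds the fusibile command by string concatenation and runs it through `os.system`. As a design note, the same call is less fragile as an argument list; a minimal sketch, assuming fusibile accepts exactly the flags used above:

```python
import subprocess

def run_fusibile(point_folder, fusibile_exe_path, disp_thresh, num_consistent):
    # mirrors the command assembled in depth_map_fusion(), but as an argument
    # list, so paths with spaces survive and a non-zero exit raises an error
    cmd = [fusibile_exe_path,
           '-input_folder', point_folder + '/',
           '-p_folder', point_folder + '/cams/',
           '-images_folder', point_folder + '/images/',
           '--depth_min=0.001',
           '--depth_max=100000',
           '--normal_thresh=360',
           '--disp_thresh={}'.format(disp_thresh),
           '--num_consistent={}'.format(num_consistent)]
    print(' '.join(cmd))
    subprocess.run(cmd, check=True)
```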
/networks/submodules.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torch.nn.functional as F
4 |
5 | eps = 1e-12
6 |
7 | def homo_warping(src_fea, src_proj, ref_proj, depth_values):
8 | # src_fea: [B, C, H, W]
9 | # src_proj: [B, 4, 4]
10 | # ref_proj: [B, 4, 4]
11 |     # depth_values: [B, Ndepth] or [B, Ndepth, H, W]
12 | # out: [B, C, Ndepth, H, W]
13 | batch, channels = src_fea.shape[0], src_fea.shape[1]
14 | num_depth = depth_values.shape[1]
15 | height, width = src_fea.shape[2], src_fea.shape[3]
16 |
17 | with torch.no_grad():
18 | proj = torch.matmul(src_proj, torch.inverse(ref_proj))
19 | rot = proj[:, :3, :3] # [B,3,3]
20 | trans = proj[:, :3, 3:4] # [B,3,1]
21 |
22 | y, x = torch.meshgrid([torch.arange(0, height, dtype=torch.float32, device=src_fea.device),
23 | torch.arange(0, width, dtype=torch.float32, device=src_fea.device)])
24 | y, x = y.contiguous(), x.contiguous()
25 | y, x = y.view(height * width), x.view(height * width)
26 | xyz = torch.stack((x, y, torch.ones_like(x))) # [3, H*W]
27 | xyz = torch.unsqueeze(xyz, 0).repeat(batch, 1, 1) # [B, 3, H*W]
28 | rot_xyz = torch.matmul(rot, xyz) # [B, 3, H*W]
29 | rot_depth_xyz = rot_xyz.unsqueeze(2).repeat(1, 1, num_depth, 1) * depth_values.view(batch, 1, num_depth,
30 | -1) # [B, 3, Ndepth, H*W]
31 | proj_xyz = rot_depth_xyz + trans.view(batch, 3, 1, 1) # [B, 3, Ndepth, H*W]
32 | proj_xy = proj_xyz[:, :2, :, :] / proj_xyz[:, 2:3, :, :] # [B, 2, Ndepth, H*W]
33 | proj_x_normalized = proj_xy[:, 0, :, :] / ((width - 1) / 2) - 1
34 | proj_y_normalized = proj_xy[:, 1, :, :] / ((height - 1) / 2) - 1
35 | proj_xy = torch.stack((proj_x_normalized, proj_y_normalized), dim=3) # [B, Ndepth, H*W, 2]
36 | grid = proj_xy
37 |
38 | warped_src_fea = F.grid_sample(src_fea, grid.view(batch, num_depth * height, width, 2), mode='bilinear',
39 | padding_mode='zeros')
40 | warped_src_fea = warped_src_fea.view(batch, channels, num_depth, height, width)
41 |
42 | return warped_src_fea
43 |
44 | def uncertainty_aware_samples(cur_depth, exp_var, ndepth, device, dtype, shape):
45 | if cur_depth.dim() == 2:
46 |         # first stage: cur_depth is the (B, D) global depth range tensor
47 | cur_depth_min = cur_depth[:, 0] # (B,)
48 | cur_depth_max = cur_depth[:, -1]
49 | new_interval = (cur_depth_max - cur_depth_min) / (ndepth - 1) # (B, )
50 | depth_range_samples = cur_depth_min.unsqueeze(1) + (torch.arange(0, ndepth, device=device, dtype=dtype,
51 | requires_grad=False).reshape(1, -1) * new_interval.unsqueeze(1)) # (B, D)
52 | depth_range_samples = depth_range_samples.unsqueeze(-1).unsqueeze(-1).repeat(1, 1, shape[1], shape[2]) # (B, D, H, W)
53 | else:
54 | low_bound = -torch.min(cur_depth, exp_var)
55 | high_bound = exp_var
56 |
57 | # assert exp_var.min() >= 0, exp_var.min()
58 | assert ndepth > 1
59 |
60 | step = (high_bound - low_bound) / (float(ndepth) - 1)
61 | new_samps = []
62 | for i in range(int(ndepth)):
63 | new_samps.append(cur_depth + low_bound + step * i + eps)
64 |
65 | depth_range_samples = torch.cat(new_samps, 1)
66 | # assert depth_range_samples.min() >= 0, depth_range_samples.min()
67 | return depth_range_samples
68 |
69 | def depth_regression(p, depth_values):
70 | if depth_values.dim() <= 2:
71 | # print("regression dim <= 2")
72 | depth_values = depth_values.view(*depth_values.shape, 1, 1)
73 | depth = torch.sum(p * depth_values, 1)
74 | return depth
75 |
76 | class Conv2dUnit(nn.Module):
77 | """Applies a 2D convolution (optionally with batch normalization and relu activation)
78 | over an input signal composed of several input planes.
79 |
80 | Attributes:
81 | conv (nn.Module): convolution module
82 | bn (nn.Module): batch normalization module
83 |         relu (bool): whether to apply a ReLU activation
84 |
85 |     Notes:
86 |         Default momentum for batch normalization is 0.1.
87 |
88 | """
89 |
90 | def __init__(self, in_channels, out_channels, kernel_size, stride=1,
91 | relu=True, bn=True, bn_momentum=0.1, **kwargs):
92 | super(Conv2dUnit, self).__init__()
93 |
94 | self.conv = nn.Conv2d(in_channels, out_channels, kernel_size, stride=stride,
95 | bias=(not bn), **kwargs)
96 | self.kernel_size = kernel_size
97 | self.stride = stride
98 | self.bn = nn.BatchNorm2d(out_channels, momentum=bn_momentum) if bn else None
99 | self.relu = relu
100 |
101 | def forward(self, x):
102 | x = self.conv(x)
103 | if self.bn is not None:
104 | x = self.bn(x)
105 | if self.relu:
106 | x = F.relu(x, inplace=True)
107 | return x
108 |
109 | class Deconv2dUnit(nn.Module):
110 | """Applies a 2D deconvolution (optionally with batch normalization and relu activation)
111 | over an input signal composed of several input planes.
112 |
113 | Attributes:
114 | conv (nn.Module): convolution module
115 | bn (nn.Module): batch normalization module
116 |         relu (bool): whether to apply a ReLU activation
117 |
118 |     Notes:
119 |         Default momentum for batch normalization is 0.1.
120 |
121 | """
122 |
123 | def __init__(self, in_channels, out_channels, kernel_size, stride=1,
124 | relu=True, bn=True, bn_momentum=0.1, **kwargs):
125 | super(Deconv2dUnit, self).__init__()
126 | self.out_channels = out_channels
127 | assert stride in [1, 2]
128 | self.stride = stride
129 |
130 | self.conv = nn.ConvTranspose2d(in_channels, out_channels, kernel_size, stride=stride,
131 | bias=(not bn), **kwargs)
132 | self.bn = nn.BatchNorm2d(out_channels, momentum=bn_momentum) if bn else None
133 | self.relu = relu
134 |
135 | def forward(self, x):
136 | y = self.conv(x)
137 | if self.stride == 2:
138 | h, w = list(x.size())[2:]
139 | y = y[:, :, :2 * h, :2 * w].contiguous()
140 |         if self.bn is not None:
141 |             y = self.bn(y)
142 |         if self.relu:
143 |             y = F.relu(y, inplace=True)
144 |         return y
145 |
146 | class Conv3dUnit(nn.Module):
147 | """Applies a 3D convolution (optionally with batch normalization and relu activation)
148 | over an input signal composed of several input planes.
149 |
150 | Attributes:
151 | conv (nn.Module): convolution module
152 | bn (nn.Module): batch normalization module
153 |         relu (bool): whether to apply a ReLU activation
154 |
155 |     Notes:
156 |         Default momentum for batch normalization is 0.1.
157 |
158 | """
159 |
160 | def __init__(self, in_channels, out_channels, kernel_size=3, stride=1,
161 | relu=True, bn=True, bn_momentum=0.1, init_method="xavier", **kwargs):
162 | super(Conv3dUnit, self).__init__()
163 | self.out_channels = out_channels
164 | self.kernel_size = kernel_size
165 | assert stride in [1, 2]
166 | self.stride = stride
167 |
168 | self.conv = nn.Conv3d(in_channels, out_channels, kernel_size, stride=stride,
169 | bias=(not bn), **kwargs)
170 | self.bn = nn.BatchNorm3d(out_channels, momentum=bn_momentum) if bn else None
171 | self.relu = relu
172 |
173 | def forward(self, x):
174 | x = self.conv(x)
175 | if self.bn is not None:
176 | x = self.bn(x)
177 | if self.relu:
178 | x = F.relu(x, inplace=True)
179 | return x
180 |
181 | class Deconv3dUnit(nn.Module):
182 | """Applies a 3D deconvolution (optionally with batch normalization and relu activation)
183 | over an input signal composed of several input planes.
184 |
185 | Attributes:
186 | conv (nn.Module): convolution module
187 | bn (nn.Module): batch normalization module
188 |         relu (bool): whether to apply a ReLU activation
189 |
190 |     Notes:
191 |         Default momentum for batch normalization is 0.1.
192 |
193 | """
194 |
195 | def __init__(self, in_channels, out_channels, kernel_size=3, stride=1,
196 | relu=True, bn=True, bn_momentum=0.1, init_method="xavier", **kwargs):
197 | super(Deconv3dUnit, self).__init__()
198 | self.out_channels = out_channels
199 | assert stride in [1, 2]
200 | self.stride = stride
201 |
202 | self.conv = nn.ConvTranspose3d(in_channels, out_channels, kernel_size, stride=stride,
203 | bias=(not bn), **kwargs)
204 | self.bn = nn.BatchNorm3d(out_channels, momentum=bn_momentum) if bn else None
205 | self.relu = relu
206 |
207 | def forward(self, x):
208 | y = self.conv(x)
209 |         if self.bn is not None:
210 |             y = self.bn(y)
211 |         if self.relu:
212 |             y = F.relu(y, inplace=True)
213 |         return y
214 |
215 | class Deconv2dBlock(nn.Module):
216 | def __init__(self, in_channels, out_channels, kernel_size, relu=True, bn=True,
217 | bn_momentum=0.1):
218 | super(Deconv2dBlock, self).__init__()
219 |
220 | self.deconv = Deconv2dUnit(in_channels, out_channels, kernel_size, stride=2, padding=1, output_padding=1,
221 |                                    bn=bn, relu=relu, bn_momentum=bn_momentum)
222 |
223 | self.conv = Conv2dUnit(2 * out_channels, out_channels, kernel_size, stride=1, padding=1,
224 | bn=bn, relu=relu, bn_momentum=bn_momentum)
225 |
226 | def forward(self, x_pre, x):
227 | x = self.deconv(x)
228 | x = torch.cat((x, x_pre), dim=1)
229 | x = self.conv(x)
230 | return x
231 |
232 | class FeatExtNet(nn.Module):
233 | def __init__(self, base_channels, num_stage=3,):
234 | super(FeatExtNet, self).__init__()
235 |
236 | self.base_channels = base_channels
237 | self.num_stage = num_stage
238 |
239 | self.conv0 = nn.Sequential(
240 | Conv2dUnit(3, base_channels, 3, 1, padding=1),
241 | Conv2dUnit(base_channels, base_channels, 3, 1, padding=1),
242 | )
243 |
244 | self.conv1 = nn.Sequential(
245 | Conv2dUnit(base_channels, base_channels * 2, 5, stride=2, padding=2),
246 | Conv2dUnit(base_channels * 2, base_channels * 2, 3, 1, padding=1),
247 | Conv2dUnit(base_channels * 2, base_channels * 2, 3, 1, padding=1),
248 | )
249 |
250 | self.conv2 = nn.Sequential(
251 | Conv2dUnit(base_channels * 2, base_channels * 4, 5, stride=2, padding=2),
252 | Conv2dUnit(base_channels * 4, base_channels * 4, 3, 1, padding=1),
253 | Conv2dUnit(base_channels * 4, base_channels * 4, 3, 1, padding=1),
254 | )
255 |
256 | self.out1 = nn.Conv2d(base_channels * 4, base_channels * 4, 1, bias=False)
257 | self.out_channels = [4 * base_channels]
258 |
259 | if num_stage == 3:
260 | self.deconv1 = Deconv2dBlock(base_channels * 4, base_channels * 2, 3)
261 | self.deconv2 = Deconv2dBlock(base_channels * 2, base_channels, 3)
262 |
263 | self.out2 = nn.Conv2d(base_channels * 2, base_channels * 2, 1, bias=False)
264 | self.out3 = nn.Conv2d(base_channels, base_channels, 1, bias=False)
265 | self.out_channels.append(2 * base_channels)
266 | self.out_channels.append(base_channels)
267 |
268 | elif num_stage == 2:
269 | self.deconv1 = Deconv2dBlock(base_channels * 4, base_channels * 2, 3)
270 |
271 | self.out2 = nn.Conv2d(base_channels * 2, base_channels * 2, 1, bias=False)
272 | self.out_channels.append(2 * base_channels)
273 |
274 | def forward(self, x):
275 | conv0 = self.conv0(x)
276 | conv1 = self.conv1(conv0)
277 | conv2 = self.conv2(conv1)
278 | intra_feat = conv2
279 | outputs = {}
280 | out = self.out1(intra_feat)
281 |
282 | outputs["stage1"] = out
283 | if self.num_stage == 3:
284 | intra_feat = self.deconv1(conv1, intra_feat)
285 | out = self.out2(intra_feat)
286 | outputs["stage2"] = out
287 |
288 | intra_feat = self.deconv2(conv0, intra_feat)
289 | out = self.out3(intra_feat)
290 | outputs["stage3"] = out
291 |
292 | elif self.num_stage == 2:
293 | intra_feat = self.deconv1(conv1, intra_feat)
294 | out = self.out2(intra_feat)
295 | outputs["stage2"] = out
296 |
297 | return outputs
298 |
299 | class CostRegNet(nn.Module):
300 | def __init__(self, in_channels, base_channels):
301 | super(CostRegNet, self).__init__()
302 | self.conv0 = Conv3dUnit(in_channels, base_channels, padding=1)
303 |
304 | self.conv1 = Conv3dUnit(base_channels, base_channels * 2, stride=2, padding=1)
305 | self.conv2 = Conv3dUnit(base_channels * 2, base_channels * 2, padding=1)
306 |
307 | self.conv3 = Conv3dUnit(base_channels * 2, base_channels * 4, stride=2, padding=1)
308 | self.conv4 = Conv3dUnit(base_channels * 4, base_channels * 4, padding=1)
309 |
310 | self.conv5 = Conv3dUnit(base_channels * 4, base_channels * 8, stride=2, padding=1)
311 | self.conv6 = Conv3dUnit(base_channels * 8, base_channels * 8, padding=1)
312 |
313 | self.deconv7 = Deconv3dUnit(base_channels * 8, base_channels * 4, stride=2, padding=1, output_padding=1)
314 |
315 | self.deconv8 = Deconv3dUnit(base_channels * 4, base_channels * 2, stride=2, padding=1, output_padding=1)
316 |
317 | self.deconv9 = Deconv3dUnit(base_channels * 2, base_channels * 1, stride=2, padding=1, output_padding=1)
318 |
319 | self.prob = nn.Conv3d(base_channels, 1, 3, stride=1, padding=1, bias=False)
320 |
321 | def forward(self, x):
322 | conv0 = self.conv0(x)
323 | conv2 = self.conv2(self.conv1(conv0))
324 | conv4 = self.conv4(self.conv3(conv2))
325 | x = self.conv6(self.conv5(conv4))
326 | x = conv4 + self.deconv7(x)
327 | x = conv2 + self.deconv8(x)
328 | x = conv0 + self.deconv9(x)
329 | x = self.prob(x)
330 | return x
331 |
332 |
--------------------------------------------------------------------------------
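A quick shape check for the two core helpers above. The sketch uses identity projection matrices and an illustrative DTU-style depth range, so it only validates tensor plumbing, not geometry; all values are hypothetical:

```python
import torch
from networks.submodules import homo_warping, uncertainty_aware_samples

B, C, H, W, D = 1, 8, 16, 20, 4
# first stage: a (B, 2) tensor whose first/last columns give the range endpoints
endpoints = torch.tensor([[425.0, 935.0]])
samples = uncertainty_aware_samples(endpoints, None, D, device='cpu',
                                    dtype=torch.float32, shape=[B, H, W])
print(samples.shape)  # (B, D, H, W)

# identity src/ref projections make the warp a no-op resampling
src_fea = torch.rand(B, C, H, W)
eye = torch.eye(4).unsqueeze(0)
warped = homo_warping(src_fea, eye, eye, samples)
print(warped.shape)   # (B, C, D, H, W)
```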
/networks/ucsnet.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torch.nn.functional as F
4 | from .submodules import *
5 |
6 |
7 | def compute_depth(feats, proj_mats, depth_samps, cost_reg, lamb, is_training=False):
8 |     '''
9 |     Build a variance-based cost volume, regularize it, and regress depth.
10 |     :param feats: [(B, C, H, W), ] * num_views
11 |     :param proj_mats: (B, num_views, 2, 4, 4) extrinsics and intrinsics per view
12 |     :param depth_samps: (B, num_depth, H, W) per-pixel depth hypotheses
13 |     :param cost_reg: cost regularization network for this stage
14 |     :param lamb: interval coefficient scaling the std-dev based uncertainty
15 |     :return: dict with 'depth', 'confidence' and 'variance'
16 |     '''
17 |
18 | proj_mats = torch.unbind(proj_mats, 1)
19 | num_views = len(feats)
20 | num_depth = depth_samps.shape[1]
21 |
22 | assert len(proj_mats) == num_views, "Different number of images and projection matrices"
23 |
24 | ref_feat, src_feats = feats[0], feats[1:]
25 | ref_proj, src_projs = proj_mats[0], proj_mats[1:]
26 |
27 | ref_volume = ref_feat.unsqueeze(2).repeat(1, 1, num_depth, 1, 1)
28 | volume_sum = ref_volume
29 | volume_sq_sum = ref_volume ** 2
30 | del ref_volume
31 |
32 | #todo optimize impl
33 | for src_fea, src_proj in zip(src_feats, src_projs):
34 | src_proj_new = src_proj[:, 0].clone()
35 | src_proj_new[:, :3, :4] = torch.matmul(src_proj[:, 1, :3, :3], src_proj[:, 0, :3, :4])
36 |
37 | ref_proj_new = ref_proj[:, 0].clone()
38 | ref_proj_new[:, :3, :4] = torch.matmul(ref_proj[:, 1, :3, :3], ref_proj[:, 0, :3, :4])
39 | warped_volume = homo_warping(src_fea, src_proj_new, ref_proj_new, depth_samps)
40 |
41 | if is_training:
42 | volume_sum = volume_sum + warped_volume
43 | volume_sq_sum = volume_sq_sum + warped_volume ** 2
44 | else:
45 | volume_sum += warped_volume
46 | volume_sq_sum += warped_volume.pow_(2) #in_place method
47 | del warped_volume
48 | volume_variance = volume_sq_sum.div_(num_views).sub_(volume_sum.div_(num_views).pow_(2))
49 |
50 | prob_volume_pre = cost_reg(volume_variance).squeeze(1)
51 | prob_volume = F.softmax(prob_volume_pre, dim=1)
52 | depth = depth_regression(prob_volume, depth_values=depth_samps)
53 |
54 | with torch.no_grad():
55 | prob_volume_sum4 = 4 * F.avg_pool3d(F.pad(prob_volume.unsqueeze(1), pad=(0, 0, 0, 0, 1, 2)), (4, 1, 1),
56 | stride=1, padding=0).squeeze(1)
57 | depth_index = depth_regression(prob_volume, depth_values=torch.arange(num_depth, device=prob_volume.device,
58 | dtype=torch.float)).long()
59 | depth_index = depth_index.clamp(min=0, max=num_depth - 1)
60 | prob_conf = torch.gather(prob_volume_sum4, 1, depth_index.unsqueeze(1)).squeeze(1)
61 |
62 | samp_variance = (depth_samps - depth.unsqueeze(1)) ** 2
63 | exp_variance = lamb * torch.sum(samp_variance * prob_volume, dim=1, keepdim=False) ** 0.5
64 |
65 | return {"depth": depth, "confidence": prob_conf, 'variance': exp_variance}
66 |
67 | class UCSNet(nn.Module):
68 | def __init__(self, lamb=1.5, stage_configs=[64, 32, 8], grad_method="detach", base_chs=[8, 8, 8], feat_ext_ch=8):
69 | super(UCSNet, self).__init__()
70 |
71 | self.stage_configs = stage_configs
72 | self.grad_method = grad_method
73 | self.base_chs = base_chs
74 | self.lamb = lamb
75 | self.num_stage = len(stage_configs)
76 | self.ds_ratio = {"stage1": 4.0,
77 | "stage2": 2.0,
78 | "stage3": 1.0
79 | }
80 |
81 | self.feature_extraction = FeatExtNet(base_channels=feat_ext_ch, num_stage=self.num_stage,)
82 |
83 | self.cost_regularization = nn.ModuleList([CostRegNet(in_channels=self.feature_extraction.out_channels[i],
84 | base_channels=self.base_chs[i]) for i in range(self.num_stage)])
85 |
86 | def forward(self, imgs, proj_matrices, depth_values):
87 | features = []
88 | for nview_idx in range(imgs.shape[1]):
89 | img = imgs[:, nview_idx]
90 | features.append(self.feature_extraction(img))
91 |
92 | outputs = {}
93 | depth, cur_depth, exp_var = None, None, None
94 | for stage_idx in range(self.num_stage):
95 | features_stage = [feat["stage{}".format(stage_idx + 1)] for feat in features]
96 | proj_matrices_stage = proj_matrices["stage{}".format(stage_idx + 1)]
97 | stage_scale = self.ds_ratio["stage{}".format(stage_idx + 1)]
98 | cur_h = img.shape[2] // int(stage_scale)
99 | cur_w = img.shape[3] // int(stage_scale)
100 |
101 | if depth is not None:
102 | if self.grad_method == "detach":
103 | cur_depth = depth.detach()
104 | exp_var = exp_var.detach()
105 | else:
106 | cur_depth = depth
107 |
108 | cur_depth = F.interpolate(cur_depth.unsqueeze(1),
109 | [cur_h, cur_w], mode='bilinear')
110 | exp_var = F.interpolate(exp_var.unsqueeze(1), [cur_h, cur_w], mode='bilinear')
111 |
112 | else:
113 | cur_depth = depth_values
114 |
115 | depth_range_samples = uncertainty_aware_samples(cur_depth=cur_depth,
116 | exp_var=exp_var,
117 | ndepth=self.stage_configs[stage_idx],
118 | dtype=img[0].dtype,
119 | device=img[0].device,
120 | shape=[img.shape[0], cur_h, cur_w])
121 |
122 | outputs_stage = compute_depth(features_stage, proj_matrices_stage,
123 | depth_samps=depth_range_samples,
124 | cost_reg=self.cost_regularization[stage_idx],
125 | lamb=self.lamb,
126 | is_training=self.training)
127 |
128 | depth = outputs_stage['depth']
129 | exp_var = outputs_stage['variance']
130 |
131 | outputs["stage{}".format(stage_idx + 1)] = outputs_stage
132 |
133 | return outputs
134 |
135 |
--------------------------------------------------------------------------------
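A minimal CPU smoke test for the three-stage network, assuming the repository root is on `PYTHONPATH`. The identity extrinsics, toy intrinsics, and the 425–935 depth range are illustrative placeholders, chosen only so every stage's cost volume has valid sizes (H and W must be divisible by 32 here):

```python
import torch
from networks.ucsnet import UCSNet

B, V, H, W = 1, 3, 64, 64

def make_projs(scale):
    # per-stage (B, V, 2, 4, 4): slot 0 is the extrinsic, slot 1 the intrinsic
    proj = torch.zeros(B, V, 2, 4, 4)
    proj[:, :, 0] = torch.eye(4)
    intr = torch.eye(3)
    intr[0, 0] = intr[1, 1] = 100.0 / scale
    intr[0, 2], intr[1, 2] = W / (2.0 * scale), H / (2.0 * scale)
    proj[:, :, 1, :3, :3] = intr
    proj[:, :, 1, 3, 3] = 1.0
    return proj

model = UCSNet(stage_configs=[64, 32, 8]).eval()
imgs = torch.rand(B, V, 3, H, W)
proj_matrices = {'stage1': make_projs(4), 'stage2': make_projs(2), 'stage3': make_projs(1)}
depth_values = torch.linspace(425.0, 935.0, 192).unsqueeze(0)  # (B, D)

with torch.no_grad():
    outputs = model(imgs, proj_matrices, depth_values)
for stage, out in outputs.items():
    print(stage, out['depth'].shape, out['confidence'].shape)
```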
/requirements.txt:
--------------------------------------------------------------------------------
1 | torch==1.2
2 | torchvision==0.2.0
3 | numpy
4 | pillow
5 | tensorboardX
6 | opencv-python
7 |
--------------------------------------------------------------------------------
/results/dtu.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/touristCheng/UCSNet/a1361810f47b420941a1e7b32e24f37c305f0953/results/dtu.png
--------------------------------------------------------------------------------
/scripts/fuse_dtu.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | exe_path="/home/shuocheng/fusibile/fusibile"
4 | root_path="/new1/shuocheng/dtu_results"
5 | target_path="/new1/shuocheng/dtu_points"
6 |
7 |
8 |
9 | declare -a arr=(1 4 9 10 11 12 13 15 23 24 29 32 33 34 48 49 62 75 77 110 114 118)
10 |
11 | for i in ${arr[@]}; do
12 | scene_path="$root_path/scan$i"
13 | CUDA_VISIBLE_DEVICES=0 python depthfusion.py --dense_folder $scene_path --fusibile_exe_path $exe_path --prob_threshold 0.6 --disp_threshold 0.25 --num_consistent 3
14 | done
15 |
16 | python utils/collect_pointclouds.py --root_dir $root_path --target_dir $target_path --dataset "dtu"
--------------------------------------------------------------------------------
/scripts/fuse_tanks.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | exe_path="/home/shuocheng/fusibile/fusibile"
4 | #root_path="/new1/shuocheng/tanks_results"
5 | #target_path="/new1/shuocheng/tanks_points"
6 |
7 | root_path="/cephfs/shuocheng/tanks_results"
8 | target_path="/cephfs/shuocheng/tanks_points"
9 |
10 |
11 |
12 |
13 | scene_path="$root_path/Family"
14 | disp=0.25
15 | num_const=4
16 | prob=0.6
17 | CUDA_VISIBLE_DEVICES=0 python depthfusion.py --dense_folder $scene_path --fusibile_exe_path $exe_path --prob_threshold $prob --disp_threshold $disp --num_consistent $num_const
18 |
19 |
20 | scene_path="$root_path/Horse"
21 | disp=0.25
22 | num_const=4
23 | prob=0.6
24 | CUDA_VISIBLE_DEVICES=0 python depthfusion.py --dense_folder $scene_path --fusibile_exe_path $exe_path --prob_threshold $prob --disp_threshold $disp --num_consistent $num_const
25 |
26 |
27 | scene_path="$root_path/Francis"
28 | disp=0.2
29 | num_const=7
30 | prob=0.6
31 | CUDA_VISIBLE_DEVICES=0 python depthfusion.py --dense_folder $scene_path --fusibile_exe_path $exe_path --prob_threshold $prob --disp_threshold $disp --num_consistent $num_const
32 |
33 |
34 | scene_path="$root_path/Lighthouse"
35 | disp=0.3
36 | num_const=5
37 | prob=0.6
38 | CUDA_VISIBLE_DEVICES=0 python depthfusion.py --dense_folder $scene_path --fusibile_exe_path $exe_path --prob_threshold $prob --disp_threshold $disp --num_consistent $num_const
39 |
40 | scene_path="$root_path/M60"
41 | disp=0.25
42 | num_const=4
43 | prob=0.6
44 | CUDA_VISIBLE_DEVICES=0 python depthfusion.py --dense_folder $scene_path --fusibile_exe_path $exe_path --prob_threshold $prob --disp_threshold $disp --num_consistent $num_const
45 |
46 |
47 | scene_path="$root_path/Panther"
48 | disp=0.2
49 | num_const=4
50 | prob=0.6
51 | CUDA_VISIBLE_DEVICES=0 python depthfusion.py --dense_folder $scene_path --fusibile_exe_path $exe_path --prob_threshold $prob --disp_threshold $disp --num_consistent $num_const
52 |
53 |
54 | scene_path="$root_path/Playground"
55 | disp=0.25
56 | num_const=5
57 | prob=0.6
58 | CUDA_VISIBLE_DEVICES=0 python depthfusion.py --dense_folder $scene_path --fusibile_exe_path $exe_path --prob_threshold $prob --disp_threshold $disp --num_consistent $num_const
59 |
60 |
61 | scene_path="$root_path/Train"
62 | disp=0.25
63 | num_const=5
64 | prob=0.6
65 | CUDA_VISIBLE_DEVICES=0 python depthfusion.py --dense_folder $scene_path --fusibile_exe_path $exe_path --prob_threshold $prob --disp_threshold $disp --num_consistent $num_const
66 |
67 |
68 |
69 | python utils/collect_pointclouds.py --root_dir $root_path --target_dir $target_path --dataset "tanks"
--------------------------------------------------------------------------------
/scripts/test_on_dtu.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 |
4 | save_path="/new1/shuocheng/dtu_results"
5 | test_list="./dataloader/datalist/dtu/test.txt"
6 | root_path="/new1/shuocheng/dtu/mvs_eval/dtu"
7 |
8 |
9 |
10 | CUDA_VISIBLE_DEVICES=0 python test.py --root_path $root_path --test_list $test_list --save_path $save_path --max_h 1200 --max_w 1600
11 |
--------------------------------------------------------------------------------
/scripts/test_on_tanks.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 |
4 | #save_path="/new1/shuocheng/tanks_results"
5 | #test_list="./dataloader/datalist/tanks/test.txt"
6 | #root_path="/new1/shuocheng/tankandtemples"
7 |
8 | save_path="/cephfs/shuocheng/tanks_results"
9 | test_list="./dataloader/datalist/tanks/test.txt"
10 | root_path="/cephfs/shuocheng/tankandtemples/intermediate"
11 |
12 | CUDA_VISIBLE_DEVICES=0 python test.py --root_path $root_path --test_list $test_list --save_path $save_path --max_h 1080 --max_w 1920
13 |
--------------------------------------------------------------------------------
/scripts/train.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | #root_path="/cephfs/shuocheng/mvs_training/dtu"
4 |
5 | root_path="/new1/shuocheng/dtu/mvs_training/dtu"
6 |
7 | save_path="./training_$(date +"%F-%T")"
8 | num_gpus=$1
9 | batch=2
10 |
11 | mkdir -p $save_path
12 |
13 | python -m torch.distributed.launch --nproc_per_node=$num_gpus train.py --root_path=$root_path --save_path $save_path \
14 | --batch_size $batch --epochs 60 --lr 0.0016 --lr_idx "20,30,40,50:0.625" --loss_weights "0.5,1.0,2.0" \
15 | --net_configs "64,32,8" --num_views 2 --lamb 1.5 --sync_bn | tee -a $save_path/log.txt
16 |
--------------------------------------------------------------------------------
/test.py:
--------------------------------------------------------------------------------
1 |
2 | import torch
3 | import torch.nn as nn
4 | import torch.nn.parallel
5 | from torch.utils.data import DataLoader
6 | import torch.backends.cudnn as cudnn
7 |
8 | from dataloader.mvs_dataset import MVSTestSet
9 | from networks.ucsnet import UCSNet
10 | from utils.utils import dict2cuda, dict2numpy, mkdir_p, save_cameras, write_pfm
11 |
12 | import numpy as np
13 | import argparse, os, time, gc, cv2
14 | from PIL import Image
15 | import os.path as osp
16 | from collections import *
17 | import sys
18 |
19 | cudnn.benchmark = True
20 |
21 | parser = argparse.ArgumentParser(description='Test UCSNet.')
22 |
23 | parser.add_argument('--root_path', type=str, help='path to root directory.')
24 | parser.add_argument('--test_list', type=str, help='testing scene list.')
25 | parser.add_argument('--save_path', type=str, help='path to save depth maps.')
26 |
27 | #test parameters
28 | parser.add_argument('--max_h', type=int, help='image height', default=1080)
29 | parser.add_argument('--max_w', type=int, help='image width', default=1920)
30 | parser.add_argument('--num_views', type=int, help='num of candidate views', default=3)
31 | parser.add_argument('--lamb', type=float, help='the interval coefficient.', default=1.5)
32 | parser.add_argument('--net_configs', type=str, help='number of samples for each stage.', default='64,32,8')
33 | parser.add_argument('--ckpt', type=str, help='the path for pre-trained model.', default='./checkpoints/model.ckpt')
34 |
35 | args = parser.parse_args()
36 |
37 |
38 | def main(args):
39 | # dataset, dataloader
40 | testset = MVSTestSet(root_dir=args.root_path, data_list=args.test_list,
41 | max_h=args.max_h, max_w=args.max_w, num_views=args.num_views)
42 | test_loader = DataLoader(testset, 1, shuffle=False, num_workers=4, drop_last=False)
43 |
44 | # build model
45 | model = UCSNet(stage_configs=list(map(int, args.net_configs.split(","))),
46 | lamb=args.lamb)
47 |
48 | # load checkpoint file specified by args.loadckpt
49 | print("Loading model {} ...".format(args.ckpt))
50 | state_dict = torch.load(args.ckpt, map_location=torch.device("cpu"))
51 | model.load_state_dict(state_dict['model'], strict=True)
52 | print('Success!')
53 |
54 | model = nn.DataParallel(model)
55 | model.cuda()
56 | model.eval()
57 |
58 | tim_cnt = 0
59 |
60 | for batch_idx, sample in enumerate(test_loader):
61 | scene_name = sample["scene_name"][0]
62 | frame_idx = sample["frame_idx"][0][0]
63 | scene_path = osp.join(args.save_path, scene_name)
64 |
65 | print('Process data ...')
66 | sample_cuda = dict2cuda(sample)
67 |
68 | print('Testing {} frame {} ...'.format(scene_name, frame_idx))
69 | start_time = time.time()
70 | outputs = model(sample_cuda["imgs"], sample_cuda["proj_matrices"], sample_cuda["depth_values"])
71 | end_time = time.time()
72 |
73 | outputs = dict2numpy(outputs)
74 | del sample_cuda
75 |
76 | tim_cnt += (end_time - start_time)
77 |
78 | print('Finished {}/{}, time: {:.2f}s ({:.2f}s/frame).'.format(batch_idx+1, len(test_loader), end_time-start_time,
79 | tim_cnt / (batch_idx + 1.)))
80 |
81 | rgb_path = osp.join(scene_path, 'rgb')
82 | mkdir_p(rgb_path)
83 | depth_path = osp.join(scene_path, 'depth')
84 | mkdir_p(depth_path)
85 | cam_path = osp.join(scene_path, 'cam')
86 | mkdir_p(cam_path)
87 | conf_path = osp.join(scene_path, 'confidence')
88 | mkdir_p(conf_path)
89 |
90 |
91 | ref_img = sample["imgs"][0, 0].numpy().transpose(1, 2, 0) * 255
92 | ref_img = np.clip(ref_img, 0, 255).astype(np.uint8)
93 | Image.fromarray(ref_img).save(rgb_path+'/{:08d}.png'.format(frame_idx))
94 |
95 | cam = sample["proj_matrices"]["stage3"][0, 0].numpy()
96 | save_cameras(cam, cam_path+'/cam_{:08d}.txt'.format(frame_idx))
97 |
98 | for stage_id in range(3):
99 | cur_res = outputs["stage{}".format(stage_id+1)]
100 | cur_dep = cur_res["depth"][0]
101 | cur_conf = cur_res["confidence"][0]
102 |
103 | write_pfm(depth_path+"/dep_{:08d}_{}.pfm".format(frame_idx, stage_id+1), cur_dep)
104 | write_pfm(conf_path+'/conf_{:08d}_{}.pfm'.format(frame_idx, stage_id+1), cur_conf)
105 |
106 | print('Saved results for {}/{} (resolution: {})'.format(scene_name, frame_idx, cur_dep.shape))
107 |
108 | torch.cuda.empty_cache()
109 | gc.collect()
110 |
111 | if __name__ == '__main__':
112 | with torch.no_grad():
113 | main(args)
--------------------------------------------------------------------------------
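The maps written above can be inspected with the repo's own PFM reader; a minimal sketch, with a hypothetical scene/frame following the naming pattern in the save loop:

```python
import numpy as np
from utils.utils import read_pfm

# stage-3 (full resolution) depth and confidence for a hypothetical frame
depth, _ = read_pfm('dtu_results/scan1/depth/dep_00000000_3.pfm')
conf, _ = read_pfm('dtu_results/scan1/confidence/conf_00000000_3.pfm')

print(depth.shape, conf.shape)
print('pixels above 0.8 confidence: {:.1%}'.format((conf > 0.8).mean()))
```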
/train.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torch.backends.cudnn as cudnn
4 | import torch.optim as optim
5 | import torch.distributed as dist
6 | from torch.utils.data import DataLoader
7 | import torch.nn.functional as F
8 |
9 | from tensorboardX import SummaryWriter
10 | from dataloader.mvs_dataset import MVSTrainSet, MVSTestSet
11 | from networks.ucsnet import UCSNet
12 | from utils.utils import *
13 |
14 | import argparse, os, sys, time, gc, datetime
15 | import os.path as osp
16 |
17 | cudnn.benchmark = True
18 | num_gpus = int(os.environ["WORLD_SIZE"]) if "WORLD_SIZE" in os.environ else 1
19 | is_distributed = num_gpus > 1
20 |
21 |
22 | parser = argparse.ArgumentParser(description='Deep stereo using adaptive cost volume.')
23 | parser.add_argument('--root_path', type=str, help='path to root directory.')
24 | parser.add_argument('--train_list', type=str, help='train scene list.', default='./dataloader/datalist/dtu/train.txt')
25 | parser.add_argument('--val_list', type=str, help='val scene list.', default='./dataloader/datalist/dtu/val.txt')
26 | parser.add_argument('--save_path', type=str, help='path to save checkpoints.')
27 |
28 | parser.add_argument('--epochs', type=int, default=60)
29 | parser.add_argument('--lr', type=float, default=0.0016)
30 | parser.add_argument('--lr_idx', type=str, default="10,12,14:0.5")
31 | parser.add_argument('--loss_weights', type=str, default="0.5,1.0,2.0")
32 | parser.add_argument('--wd', type=float, default=0.0, help='weight decay')
33 | parser.add_argument('--batch_size', type=int, default=1)
34 |
35 | parser.add_argument('--num_views', type=int, help='num of candidate views', default=2)
36 | parser.add_argument('--lamb', type=float, help='the interval coefficient.', default=1.5)
37 | parser.add_argument('--net_configs', type=str, help='number of samples for each stage.', default='64,32,8')
38 |
39 | parser.add_argument('--log_freq', type=int, default=50, help='print and summary frequency')
40 | parser.add_argument('--save_freq', type=int, default=1, help='save checkpoint frequency.')
41 | parser.add_argument('--eval_freq', type=int, default=1, help='evaluate frequency.')
42 |
43 | parser.add_argument('--sync_bn', action='store_true',help='Sync BN.')
44 | parser.add_argument('--opt_level', type=str, default="O0")
45 | parser.add_argument('--seed', type=int, default=0)
46 | parser.add_argument("--local_rank", type=int, default=0)
47 |
48 |
49 | args = parser.parse_args()
50 |
51 | if args.sync_bn:
52 | import apex
53 | import apex.amp as amp
54 |
55 | on_main = True
56 |
57 | torch.manual_seed(args.seed)
58 | torch.cuda.manual_seed(args.seed)
59 |
60 | def print_func(data: dict, prefix: str = ''):
61 | for k, v in data.items():
62 | if isinstance(v, dict):
63 | print_func(v, prefix + '.' + k)
64 | elif isinstance(v, list):
65 | print(prefix+'.'+k, v)
66 | else:
67 | print(prefix+'.'+k, v.shape)
68 |
69 | def main(args, model:nn.Module, optimizer, train_loader, val_loader):
70 | milestones = list(map(lambda x: int(x) * len(train_loader), args.lr_idx.split(':')[0].split(',')))
71 | gamma = float(args.lr_idx.split(':')[1])
72 | scheduler = get_step_schedule_with_warmup(optimizer=optimizer, milestones=milestones, gamma=gamma)
73 |
74 | loss_weights = list(map(float, args.loss_weights.split(',')))
75 |
76 | for ep in range(args.epochs):
77 | model.train()
78 | for batch_idx, sample in enumerate(train_loader):
79 |
80 | tic = time.time()
81 | sample_cuda = dict2cuda(sample)
82 |
83 | # print_func(sample_cuda)
84 |
85 | optimizer.zero_grad()
86 | outputs = model(sample_cuda["imgs"], sample_cuda["proj_matrices"], sample_cuda["depth_values"])
87 |
88 | # print_func(outputs)
89 |
90 | loss = multi_stage_loss(outputs, sample_cuda["depth_labels"], sample_cuda["masks"], loss_weights)
91 | if is_distributed and args.sync_bn:
92 | with amp.scale_loss(loss, optimizer) as scaled_loss:
93 | scaled_loss.backward()
94 | else:
95 | loss.backward()
96 |
97 | optimizer.step()
98 | scheduler.step()
99 |
100 | log_index = (len(train_loader)+len(val_loader)) * ep + batch_idx
101 | if log_index % args.log_freq == 0:
102 |
103 | image_summary, scalar_summary = collect_summary(sample_cuda, outputs)
104 | if on_main:
105 | add_summary(image_summary, 'image', logger, index=log_index, flag='train')
106 | add_summary(scalar_summary, 'scalar', logger, index=log_index, flag='train')
107 | print("Epoch {}/{}, Iter {}/{}, lr {:.6f}, train loss {:.2f}, eval 4mm ({:.2f}, {:.2f}), time = {:.2f}".format(
108 | ep+1, args.epochs, batch_idx+1, len(train_loader),
109 | optimizer.param_groups[0]["lr"], loss,
110 | scalar_summary["4mm_abs"], scalar_summary["4mm_acc"],
111 | time.time() - tic))
112 |
113 | del scalar_summary, image_summary
114 |
115 | gc.collect()
116 | if on_main and (ep + 1) % args.save_freq == 0:
117 | torch.save({"epoch": ep+1,
118 | "model": model.module.state_dict(),
119 | "optimizer": optimizer.state_dict()},
120 | "{}/model_{:06d}.ckpt".format(args.save_path, ep+1))
121 |
122 | if (ep + 1) % args.eval_freq == 0 or (ep+1) == args.epochs:
123 | with torch.no_grad():
124 | test(args, model, val_loader, ep)
125 |
126 | def test(args, model, test_loader, epoch):
127 | model.eval()
128 | avg_scalars = DictAverageMeter()
129 | for batch_idx, sample in enumerate(test_loader):
130 | sample_cuda = dict2cuda(sample)
131 | outputs = model(sample_cuda["imgs"], sample_cuda["proj_matrices"], sample_cuda["depth_values"])
132 |
133 | image_summary, scalar_summary = collect_summary(sample_cuda, outputs)
134 | avg_scalars.update(scalar_summary)
135 |
136 | log_index = len(train_loader) * (epoch + 1) + len(val_loader) * epoch + batch_idx
137 | if log_index % args.log_freq == 0 and on_main:
138 | add_summary(image_summary, 'image', logger, index=log_index, flag='val')
139 | add_summary(scalar_summary, 'scalar', logger, index=log_index, flag='val')
140 |
141 | del scalar_summary, image_summary
142 |
143 | if on_main:
144 | print("Epoch {}/{}: {}".format(epoch + 1, args.epochs, avg_scalars.mean()))
145 | add_summary(avg_scalars.mean(), 'scalar', logger, index=epoch + 1, flag='brief')
146 |
147 | gc.collect()
148 |
149 | def collect_summary(inputs, outputs):
150 | depth = outputs["stage3"]["depth"]
151 | label = inputs["depth_labels"]["stage3"]
152 | mask = inputs["masks"]["stage3"].bool()
153 |
154 | err_map = torch.abs(label - depth) * mask.float()
155 | rgb = inputs["imgs"][:, 0]
156 |
157 | image_summary = {"depth": depth,
158 | "label": label,
159 | "mask": mask,
160 | "error": err_map,
161 | "ref_view": rgb
162 | }
163 |
164 | scalar_summary = {}
165 | for thresh in [2, 3, 4, 20]:
166 | abs_err, acc = evaluate(depth, mask, label, thresh)
167 | scalar_summary["{}mm_abs".format(thresh)] = abs_err
168 | scalar_summary["{}mm_acc".format(thresh)] = acc
169 | scalar_summary = reduce_tensors(scalar_summary)
170 | return dict2numpy(image_summary), dict2float(scalar_summary)
171 |
172 | def distribute_model(args):
173 | def sync():
174 | if not dist.is_available():
175 | return
176 | if not dist.is_initialized():
177 | return
178 | if dist.get_world_size() == 1:
179 | return
180 | dist.barrier()
181 |
182 | if is_distributed:
183 | torch.cuda.set_device(args.local_rank)
184 | torch.distributed.init_process_group(
185 | backend="nccl", init_method="env://"
186 | )
187 | sync()
188 |
189 | model: torch.nn.Module = UCSNet(stage_configs=list(map(int, args.net_configs.split(","))),
190 | lamb=args.lamb)
191 | model.to(torch.device("cuda"))
192 |
193 | optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=args.lr, betas=(0.9, 0.999),
194 | weight_decay=args.wd)
195 |
196 | train_set = MVSTrainSet(root_dir=args.root_path, data_list=args.train_list, num_views=args.num_views)
197 |
198 | val_set = MVSTrainSet(root_dir=args.root_path, data_list=args.val_list, num_views=args.num_views)
199 |
200 | if is_distributed:
201 | if args.sync_bn:
202 | model = apex.parallel.convert_syncbn_model(model)
203 | model, optimizer = amp.initialize(model, optimizer, opt_level=args.opt_level, )
204 | print('Convert BN to Sync_BN successful.')
205 |
206 | model = torch.nn.parallel.DistributedDataParallel(
207 | model, device_ids=[args.local_rank], output_device=args.local_rank,)
208 |
209 | train_sampler = torch.utils.data.DistributedSampler(train_set, num_replicas=dist.get_world_size(),
210 | rank=dist.get_rank())
211 | val_sampler = torch.utils.data.DistributedSampler(val_set, num_replicas=dist.get_world_size(),
212 | rank=dist.get_rank())
213 | else:
214 | model = nn.DataParallel(model)
215 | train_sampler, val_sampler = None, None
216 |
217 | train_loader = DataLoader(train_set, args.batch_size, sampler=train_sampler, num_workers=1,
218 | drop_last=True, shuffle=not is_distributed)
219 | val_loader = DataLoader(val_set, args.batch_size, sampler=val_sampler, num_workers=1,
220 | drop_last=False, shuffle=False)
221 |
222 | return model, optimizer, train_loader, val_loader
223 |
224 | def multi_stage_loss(outputs, labels, masks, weights):
225 | tot_loss = 0.
226 | for stage_id in range(3):
227 | depth_i = outputs["stage{}".format(stage_id+1)]["depth"]
228 | label_i = labels["stage{}".format(stage_id+1)]
229 | mask_i = masks["stage{}".format(stage_id+1)].bool()
230 | depth_loss = F.smooth_l1_loss(depth_i[mask_i], label_i[mask_i], reduction='mean')
231 | tot_loss += depth_loss * weights[stage_id]
232 | return tot_loss
233 |
234 | if __name__ == '__main__':
235 |
236 | model, optimizer, train_loader, val_loader = distribute_model(args)
237 |
238 | on_main = (not is_distributed) or (dist.get_rank() == 0)
239 |
240 | if on_main:
241 | mkdir_p(args.save_path)
242 | logger = SummaryWriter(args.save_path)
243 | print(args)
244 |
245 | main(args=args, model=model, optimizer=optimizer, train_loader=train_loader, val_loader=val_loader)
--------------------------------------------------------------------------------
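The `--lr_idx` string compactly encodes the step schedule: milestone epochs before the colon, decay factor after. A minimal sketch of the parsing in `main()` above, with a hypothetical `steps_per_epoch` standing in for `len(train_loader)`:

```python
import torch
from utils.utils import get_step_schedule_with_warmup

lr_idx = "20,30,40,50:0.625"     # as passed by scripts/train.sh
steps_per_epoch = 1000           # hypothetical len(train_loader)

milestones = [int(e) * steps_per_epoch for e in lr_idx.split(':')[0].split(',')]
gamma = float(lr_idx.split(':')[1])

opt = torch.optim.Adam([torch.nn.Parameter(torch.zeros(1))], lr=0.0016)
scheduler = get_step_schedule_with_warmup(opt, milestones=milestones, gamma=gamma)
# the lr warms up from lr/3 over the first 500 iterations, then is multiplied
# by 0.625 each time a milestone iteration is passed
print(milestones, gamma)
```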
/utils/collect_pointclouds.py:
--------------------------------------------------------------------------------
1 | import os, sys
2 | import argparse
3 | import glob
4 | import errno
5 | import os.path as osp
6 | import shutil
7 |
8 |
9 | parser = argparse.ArgumentParser()
10 |
11 | parser.add_argument('--root_dir', help='path to prediction', type=str,)
12 | parser.add_argument('--target_dir', type=str)
13 | parser.add_argument('--dataset', type=str, )
14 |
15 | args = parser.parse_args()
16 |
17 | def mkdir_p(path):
18 | try:
19 | os.makedirs(path)
20 | except OSError as exc: # Python >2.5
21 | if exc.errno == errno.EEXIST and os.path.isdir(path):
22 | pass
23 | else:
24 | raise
25 |
26 | def collect_dtu(args):
27 | mkdir_p(args.target_dir)
28 | all_scenes = sorted(glob.glob(args.root_dir+'/*'))
29 | all_scenes = list(filter(os.path.isdir, all_scenes))
30 | for scene in all_scenes:
31 | scene_id = int(scene.strip().split('/')[-1][len('scan'):])
32 | all_plys = sorted(glob.glob('{}/points_ucsnet/consistencyCheck*'.format(scene)))
33 | print('Found points: ', all_plys)
34 |
35 | shutil.copyfile(all_plys[-1]+'/final3d_model.ply', '{}/ucsnet{:03d}_l3.ply'.format(args.target_dir, scene_id))
36 |
37 | def collect_tanks(args):
38 | mkdir_p(args.target_dir)
39 | all_scenes = sorted(glob.glob(args.root_dir + '/*'))
40 | all_scenes = list(filter(os.path.isdir, all_scenes))
41 | for scene in all_scenes:
42 | all_plys = sorted(glob.glob('{}/points_ucsnet/consistencyCheck*'.format(scene)))
43 | print('Found points: ', all_plys)
44 | scene_name = scene.strip().split('/')[-1]
45 | shutil.copyfile(all_plys[-1]+'/final3d_model.ply', '{}/{}.ply'.format(args.target_dir, scene_name))
46 | shutil.copyfile('./dataloader/datalist/tanks/logs/{}.log'.format(scene_name),
47 | '{}/{}.log'.format(args.target_dir, scene_name))
48 |
49 | if __name__ == '__main__':
50 | if args.dataset == 'dtu':
51 | collect_dtu(args)
52 | elif args.dataset == 'tanks':
53 | collect_tanks(args)
54 | else:
55 | print('Unknown dataset.')
56 |
--------------------------------------------------------------------------------
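For reference, `collect_dtu()` assumes each scene folder already contains fusibile's output and keeps only the newest consistency-checked model; a sketch of the expected layout, with hypothetical names:

```python
import glob

# fusibile writes one consistencyCheck-<timestamp> folder per run:
#   <root_dir>/scan1/points_ucsnet/consistencyCheck-.../final3d_model.ply
plys = sorted(glob.glob('dtu_results/scan1/points_ucsnet/consistencyCheck*'))
print(plys[-1] + '/final3d_model.ply')  # copied out as ucsnet001_l3.ply
```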
/utils/utils.py:
--------------------------------------------------------------------------------
1 | import torch
2 | from torch.optim.lr_scheduler import LambdaLR, _LRScheduler
3 | import torchvision.utils as vutils
4 | import torch.distributed as dist
5 |
6 |
7 | import errno
8 | import os
9 | import re
10 | import sys
11 | import numpy as np
12 | from bisect import bisect_right
13 |
14 | num_gpus = int(os.environ["WORLD_SIZE"]) if "WORLD_SIZE" in os.environ else 1
15 | is_distributed = num_gpus > 1
16 |
17 | def mkdir_p(path):
18 | try:
19 | os.makedirs(path)
20 | except OSError as exc: # Python ≥ 2.5
21 | if exc.errno == errno.EEXIST and os.path.isdir(path):
22 | pass
23 | else:
24 | raise
25 |
26 | def dict2cuda(data: dict):
27 | new_dic = {}
28 | for k, v in data.items():
29 | if isinstance(v, dict):
30 | v = dict2cuda(v)
31 | elif isinstance(v, torch.Tensor):
32 | v = v.cuda()
33 | new_dic[k] = v
34 | return new_dic
35 |
36 | def dict2numpy(data: dict):
37 | new_dic = {}
38 | for k, v in data.items():
39 | if isinstance(v, dict):
40 | v = dict2numpy(v)
41 | elif isinstance(v, torch.Tensor):
42 | v = v.detach().cpu().numpy().copy()
43 | new_dic[k] = v
44 | return new_dic
45 |
46 | def dict2float(data: dict):
47 | new_dic = {}
48 | for k, v in data.items():
49 | if isinstance(v, dict):
50 | v = dict2float(v)
51 | elif isinstance(v, torch.Tensor):
52 | v = v.detach().cpu().item()
53 | new_dic[k] = v
54 | return new_dic
55 |
56 | def metric_with_thresh(depth, label, mask, thresh):
57 | err = torch.abs(depth - label)
58 | valid = err <= thresh
59 | mean_abs = torch.mean(err[valid])
60 | acc = valid.sum(dtype=torch.float) / mask.sum(dtype=torch.float)
61 | return mean_abs, acc
62 |
63 | def evaluate(depth, mask, label, thresh):
64 | batch_abs_err = []
65 | batch_acc = []
66 | for d, m, l in zip(depth, mask, label):
67 | abs_err, acc = metric_with_thresh(d, l, m, thresh)
68 | batch_abs_err.append(abs_err)
69 | batch_acc.append(acc)
70 |
71 | tot_abs = torch.stack(batch_abs_err)
72 | tot_acc = torch.stack(batch_acc)
73 | return tot_abs.mean(), tot_acc.mean()
74 |
75 | def save_cameras(cam, path):
76 | cam_txt = open(path, 'w+')
77 |
78 | cam_txt.write('extrinsic\n')
79 | for i in range(4):
80 | for j in range(4):
81 | cam_txt.write(str(cam[0, i, j]) + ' ')
82 | cam_txt.write('\n')
83 | cam_txt.write('\n')
84 |
85 | cam_txt.write('intrinsic\n')
86 | for i in range(3):
87 | for j in range(3):
88 | cam_txt.write(str(cam[1, i, j]) + ' ')
89 | cam_txt.write('\n')
90 | cam_txt.close()
91 |
92 | def read_pfm(filename):
93 | file = open(filename, 'rb')
94 | color = None
95 | width = None
96 | height = None
97 | scale = None
98 | endian = None
99 |
100 | header = file.readline().decode('utf-8').rstrip()
101 | if header == 'PF':
102 | color = True
103 | elif header == 'Pf':
104 | color = False
105 | else:
106 | raise Exception('Not a PFM file.')
107 |
108 | dim_match = re.match(r'^(\d+)\s(\d+)\s$', file.readline().decode('utf-8'))
109 | if dim_match:
110 | width, height = map(int, dim_match.groups())
111 | else:
112 | raise Exception('Malformed PFM header.')
113 |
114 | scale = float(file.readline().rstrip())
115 | if scale < 0: # little-endian
116 | endian = '<'
117 | scale = -scale
118 | else:
119 | endian = '>' # big-endian
120 |
121 | data = np.fromfile(file, endian + 'f')
122 | shape = (height, width, 3) if color else (height, width)
123 |
124 | data = np.reshape(data, shape)
125 | data = np.flipud(data)
126 | file.close()
127 | return data, scale
128 |
129 | def write_pfm(file, image, scale=1):
130 | file = open(file, 'wb')
131 | color = None
132 | if image.dtype.name != 'float32':
133 | raise Exception('Image dtype must be float32.')
134 |
135 | image = np.flipud(image)
136 |
137 | if len(image.shape) == 3 and image.shape[2] == 3: # color image
138 | color = True
139 | elif len(image.shape) == 2 or len(image.shape) == 3 and image.shape[2] == 1: # greyscale
140 | color = False
141 | else:
142 | raise Exception('Image must have H x W x 3, H x W x 1 or H x W dimensions.')
143 |
144 | file.write('PF\n'.encode() if color else 'Pf\n'.encode())
145 | file.write('%d %d\n'.encode() % (image.shape[1], image.shape[0]))
146 |
147 | endian = image.dtype.byteorder
148 |
149 | if endian == '<' or endian == '=' and sys.byteorder == 'little':
150 | scale = -scale
151 |
152 | file.write('%f\n'.encode() % scale)
153 |
154 |     image_string = image.tobytes()  # tostring() is a deprecated alias
155 | file.write(image_string)
156 | file.close()
157 |
158 | def get_linear_schedule_with_warmup(optimizer, num_warmup_steps, num_training_steps, last_epoch=-1):
159 | """ Create a schedule with a learning rate that decreases linearly after
160 | linearly increasing during a warmup period.
161 | """
162 | def lr_lambda(current_step):
163 | if current_step < num_warmup_steps:
164 | return float(current_step) / float(max(1, num_warmup_steps))
165 | return max(
166 | 0.0, float(num_training_steps - current_step) / float(max(1, num_training_steps - num_warmup_steps))
167 | )
168 |
169 | return LambdaLR(optimizer, lr_lambda, last_epoch)
170 |
171 | def get_step_schedule_with_warmup(optimizer, milestones, gamma=0.1, warmup_factor=1.0/3, warmup_iters=500, last_epoch=-1,):
172 | def lr_lambda(current_step):
173 | if current_step < warmup_iters:
174 | alpha = float(current_step) / warmup_iters
175 | current_factor = warmup_factor * (1. - alpha) + alpha
176 | else:
177 | current_factor = 1.
178 |
179 | return max(0.0, current_factor * (gamma ** bisect_right(milestones, current_step)))
180 |
181 | return LambdaLR(optimizer, lr_lambda, last_epoch)
182 |
183 | def add_summary(data_dict: dict, dtype: str, logger, index: int, flag: str):
184 | def preprocess(name, img):
185 | if not (len(img.shape) == 3 or len(img.shape) == 4):
186 | raise NotImplementedError("invalid img shape {}:{} in save_images".format(name, img.shape))
187 | if len(img.shape) == 3:
188 | img = img[:, np.newaxis, :, :]
189 |         if img.dtype == bool:
190 | img = img.astype(np.float32)
191 | img = torch.from_numpy(img[:1])
192 | if 'depth' in name or 'label' in name:
193 | return vutils.make_grid(img, padding=0, nrow=1, normalize=True, scale_each=True, range=(450, 850))
194 | elif 'mask' in name:
195 | return vutils.make_grid(img, padding=0, nrow=1, normalize=True, scale_each=True, range=(0, 1))
196 | elif 'error' in name:
197 | return vutils.make_grid(img, padding=0, nrow=1, normalize=True, scale_each=True, range=(0, 4))
198 | return vutils.make_grid(img, padding=0, nrow=1, normalize=True, scale_each=True,)
199 |
200 | on_main = (not is_distributed) or (dist.get_rank() == 0)
201 | if not on_main:
202 | return
203 |
204 | if dtype == 'image':
205 | for k, v in data_dict.items():
206 | logger.add_image('{}/{}'.format(flag, k), preprocess(k, v), index)
207 |
208 | elif dtype == 'scalar':
209 | for k, v in data_dict.items():
210 | logger.add_scalar('{}/{}'.format(flag, k), v, index)
211 | else:
212 | raise NotImplementedError
213 |
214 | class DictAverageMeter(object):
215 | def __init__(self):
216 | self.data = {}
217 | self.count = 0
218 |
219 | def update(self, new_input: dict):
220 | self.count += 1
221 | for k, v in new_input.items():
222 | assert isinstance(v, float), type(v)
223 | self.data[k] = self.data.get(k, 0) + v
224 |
225 | def mean(self):
226 | return {k: v / self.count for k, v in self.data.items()}
227 |
228 | def reduce_tensors(datas: dict):
229 | if not is_distributed:
230 | return datas
231 | world_size = dist.get_world_size()
232 | with torch.no_grad():
233 | keys = list(datas.keys())
234 | vals = []
235 | for k in keys:
236 | vals.append(datas[k])
237 | vals = torch.stack(vals, dim=0)
238 |         dist.reduce(vals, op=dist.ReduceOp.SUM, dst=0)
239 | if dist.get_rank() == 0:
240 | vals /= float(world_size)
241 | reduced_datas = {k: v for k, v in zip(keys, vals)}
242 | return reduced_datas
243 |
244 |
--------------------------------------------------------------------------------
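A round-trip sanity check for the PFM helpers above; the `/tmp` path is arbitrary. Both functions flip the image vertically (PFM stores rows bottom-up), so the data comes back unchanged:

```python
import numpy as np
from utils.utils import write_pfm, read_pfm

depth = np.random.rand(120, 160).astype(np.float32)
write_pfm('/tmp/depth.pfm', depth)
restored, scale = read_pfm('/tmp/depth.pfm')

assert restored.shape == (120, 160)
assert np.array_equal(depth, restored)
print(scale)  # 1.0; the sign stored in the file only encodes byte order
```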