├── ImageCaptureGuide.odt
├── ImageCaptureGuide.pdf
├── README.md
├── data
│   └── 0
│       ├── 0.gz
│       ├── 1.gz
│       ├── 10.gz
│       ├── 11.gz
│       ├── 12.gz
│       ├── 13.gz
│       ├── 14.gz
│       ├── 15.gz
│       ├── 16.gz
│       ├── 17.gz
│       ├── 18.gz
│       ├── 19.gz
│       ├── 2.gz
│       ├── 20.gz
│       ├── 21.gz
│       ├── 22.gz
│       ├── 23.gz
│       ├── 24.gz
│       ├── 25.gz
│       ├── 26.gz
│       ├── 27.gz
│       ├── 28.gz
│       ├── 29.gz
│       ├── 3.gz
│       ├── 30.gz
│       ├── 31.gz
│       ├── 32.gz
│       ├── 33.gz
│       ├── 34.gz
│       ├── 35.gz
│       ├── 36.gz
│       ├── 37.gz
│       ├── 38.gz
│       ├── 39.gz
│       ├── 4.gz
│       ├── 40.gz
│       ├── 41.gz
│       ├── 42.gz
│       ├── 43.gz
│       ├── 44.gz
│       ├── 45.gz
│       ├── 46.gz
│       ├── 47.gz
│       ├── 48.gz
│       ├── 49.gz
│       ├── 5.gz
│       ├── 50.gz
│       ├── 51.gz
│       ├── 52.gz
│       ├── 53.gz
│       ├── 54.gz
│       ├── 55.gz
│       ├── 56.gz
│       ├── 57.gz
│       ├── 58.gz
│       ├── 59.gz
│       ├── 6.gz
│       ├── 60.gz
│       ├── 61.gz
│       ├── 62.gz
│       ├── 63.gz
│       ├── 7.gz
│       ├── 8.gz
│       └── 9.gz
├── demo
│   ├── 1.png
│   ├── 1.png.txt
│   ├── 1_m.png
│   ├── 2.png
│   ├── 2_m.png
│   ├── interactive_crop.py
│   ├── r0.png
│   ├── r1.png
│   ├── r2.png
│   ├── r3.png
│   ├── runr2n2_128.py
│   ├── runsingleimage.py
│   ├── show3d.py
│   ├── src_3.png
│   ├── src_3.png.crop.png
│   ├── src_3.png.crop_m.png
│   ├── src_3.png.rect.txt
│   ├── src_3.png.xyz
│   ├── src_3_m.png
│   └── view.py
├── depthestimate
│   ├── BatchFetcher.py
│   ├── render_balls_so.cpp
│   ├── show3d.py
│   ├── show3d_balls.py
│   ├── tf_nndistance.cpp
│   ├── tf_nndistance.py
│   ├── tf_nndistance_g.cu
│   ├── train_nn.py
│   └── visualizeptexample.v.py
├── makefile
├── position.svg
├── show3d.py
└── traindataviewer.py
--------------------------------------------------------------------------------
/ImageCaptureGuide.odt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/ImageCaptureGuide.odt
--------------------------------------------------------------------------------
/ImageCaptureGuide.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/ImageCaptureGuide.pdf
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
## Using the code

Training scripts and a couple of trained demo networks are included. More demos and the complete set of data are on the way.

**The KMZ file of the ShapeNet shapes used in this experiment has been released! See the end of this page.**
**The full training batches used in this experiment have been released! See the end of this page.**

Make sure you have python-numpy, python-opencv, tensorflow, tflearn, CUDA, etc. installed.
Some paths are configured in the makefile; adjust them to match your setup.

### Running the demo

If you just want to try the demo, cd into the demo directory and run
```
$ python runsingleimage.py 1.png 1_m.png twobranch_v1.pkl
$ python view.py 1.png.txt
```
The .pkl files can be found on Google Drive:
- version 1 (used in the paper): https://drive.google.com/file/d/0B0gQFbJEIJ4kT3lCUy1UQVlZQnc/view?usp=sharing
- version 2 (improved): https://drive.google.com/file/d/0B0gQFbJEIJ4kUTc2cGlDeDh6VGs/view?usp=sharing

The first script runs the network on the image 1.png with segmentation mask 1_m.png, using the neural network weights twobranch_v1.pkl. The other set of weights, twobranch_v2.pkl, seems more robust. The input images must be of size 256x192. The second script visualizes the predicted point cloud; move your mouse over the window to rotate it.
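If your own capture is not already 256x192, a minimal preprocessing sketch (the file names are placeholders; only standard OpenCV calls are used):
```python
import cv2

# Hypothetical input files; substitute your own capture and its segmentation mask.
img = cv2.imread('capture.png')        # color image (BGR)
mask = cv2.imread('capture_m.png', 0)  # single-channel mask

# The demo scripts expect 256x192 inputs (cv2.resize takes width x height).
img = cv2.resize(img, (256, 192))
mask = cv2.resize(mask, (256, 192), interpolation=cv2.INTER_NEAREST)  # keep mask values crisp

cv2.imwrite('my.png', img)
cv2.imwrite('my_m.png', mask)
```
Then run the demo on the resized pair, e.g. `$ python runsingleimage.py my.png my_m.png twobranch_v1.pkl`.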
**If you want to try the networks on your own captured images, see ImageCaptureGuide.pdf first.**

We have also included a trained network corresponding to the R2N2 paper's setting. You can download runr2n2_128_v1.pkl from
https://drive.google.com/file/d/0B0gQFbJEIJ4kQVdpeVBNb2RJTlk/view?usp=sharing
and run
```
$ python runr2n2_128.py r1.png runr2n2_128_v1.pkl
$ python view.py r1.png.txt
```

### Training

If you are interested in training a network, here are the instructions.

Compile the CUDA code:
```
$ make
```

Usage of the training script:

* Predict on the validation set
```
$ python train_nn.py [data=<data dir>] [dump=<dump dir>] [num=<number of batches>] predict
example: $ python train_nn.py data=data dump=dump num=3 predict
```

* Visualize dumped predictions (press space to view the next one)
```
$ python visualizeptexample.v.py <dump dir>/train_nn.v.pkl
example: $ python visualizeptexample.v.py dump/train_nn.v.pkl
```

* Train
```
$ python train_nn.py [data=<data dir>] [dump=<dump dir>] train
example: $ python train_nn.py data=data dump=dump train
```

## Format of training data

A few minibatches of processed training data are in the data/ folder.

Despite the extension, the .gz files here are not actually gzipped files (sorry).
```
python traindataviewer.py data/0/0.gz
```
This shows a batch of training data. The loadBinFile function returns a tuple containing the color image, the depth image, the ground-truth point cloud, and the model key names (see the sketch below).

The complete set of training data is available below. Download them all into the data/ folder.
https://www.dropbox.com/sh/68kfpqut2y75etz/AABtIn2LUMALTnULSTUr5ZlUa?dl=0
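As a rough illustration of how a batch might be consumed, a minimal loading sketch. This assumes loadBinFile is importable from traindataviewer.py; the module path and array shapes are illustrative, not pinned down by this README:
```python
# A minimal sketch, assuming loadBinFile lives in traindataviewer.py and
# returns the tuple described above; names and shapes are illustrative.
from traindataviewer import loadBinFile

color, depth, ptcloud, keynames = loadBinFile('data/0/0.gz')
print(color.shape, depth.shape, ptcloud.shape)  # batch arrays
print(keynames[:3])                             # ShapeNet model keys
```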
Below is more data that might be useful. Note: you must use https.

https://shapenet.cs.stanford.edu/media/sampledata_220k.tar
--------------------------------------------------------------------------------
/data/0/0.gz ... /data/0/63.gz:
--------------------------------------------------------------------------------
The 64 sample training batches listed in the tree above are each available at a raw URL of the form (substitute <N> = 0..63):
https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/data/0/<N>.gz
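To fetch all 64 batches programmatically, a small Python 3 sketch using only the standard library (the URL pattern is the one listed above):
```python
import os
import urllib.request

BASE = ('https://raw.githubusercontent.com/fanhqme/PointSetGeneration/'
        'c5e80647207476038023e6cada9b69f2b1693a7d/data/0/')

os.makedirs('data/0', exist_ok=True)
for n in range(64):
    name = '%d.gz' % n
    urllib.request.urlretrieve(BASE + name, os.path.join('data/0', name))
    print('fetched', name)
```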
--------------------------------------------------------------------------------
/demo/1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/1.png
--------------------------------------------------------------------------------
/demo/1.png.txt:
--------------------------------------------------------------------------------
Predicted point cloud for demo/1.png, in the format view.py reads: 1024 points, one "x y z" triple per line. The first rows are:
0.906966 -0.118016 -0.65902
0.709956 -0.124343 -0.64281
0.607596 -0.326132 -0.248087
(1021 further rows omitted)
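For a quick look at this file without the viewer, it loads directly with numpy (np.loadtxt is standard; the shape follows the 1024x3 listing above):
```python
import numpy as np

pts = np.loadtxt('demo/1.png.txt')  # one "x y z" triple per line
print(pts.shape)                    # (1024, 3)
print(pts.min(0), pts.max(0))       # bounding box of the predicted cloud
```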
0.833256 -0.487481 0.614272 997 | 0.86632 -0.449647 0.5456 998 | 0.833066 -0.381198 0.685075 999 | 0.861714 -0.367564 0.535977 1000 | 0.843933 -0.290393 0.708051 1001 | 0.863265 -0.272257 0.558132 1002 | 0.850916 -0.194078 0.75682 1003 | 0.863894 -0.206435 0.611787 1004 | 0.850459 -0.131403 0.746939 1005 | 0.869492 -0.132981 0.586807 1006 | 0.855955 -0.0249683 0.726403 1007 | 0.866393 -0.0426686 0.574309 1008 | 0.857518 0.0389543 0.700216 1009 | 0.871333 0.0281035 0.557481 1010 | 0.862278 0.117911 0.665398 1011 | 0.886992 0.097905 0.572744 1012 | 0.886386 0.165119 0.647753 1013 | 0.849449 0.120569 0.502209 1014 | 0.860722 0.19656 0.554627 1015 | 0.870979 0.160251 0.427428 1016 | 0.886781 0.246033 0.479334 1017 | 0.87594 0.18497 0.28289 1018 | 0.896276 0.283632 0.316861 1019 | 0.880335 0.232751 0.255743 1020 | 0.89253 0.375787 0.147859 1021 | 0.888928 0.313183 0.0892862 1022 | 0.895151 0.482812 0.0105019 1023 | 0.879765 0.408424 -0.129818 1024 | 0.865365 0.355578 -0.0685897 1025 | -------------------------------------------------------------------------------- /demo/1_m.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/1_m.png -------------------------------------------------------------------------------- /demo/2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/2.png -------------------------------------------------------------------------------- /demo/2_m.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/2_m.png -------------------------------------------------------------------------------- /demo/interactive_crop.py: -------------------------------------------------------------------------------- 1 | import cv2 2 | import numpy as np 3 | runsingleimage=None 4 | import sys 5 | import show3d 6 | 7 | overlay=np.zeros((192,256,3),dtype='uint8') 8 | h,w=overlay.shape[:2] 9 | focus=400 10 | cx,cy=overlay.shape[0]/2,overlay.shape[1]/2 11 | beta=20.0/180.0*np.pi 12 | viewmat=np.array([[ 13 | np.cos(beta),0,-np.sin(beta)],[ 14 | 0,1,0],[ 15 | np.sin(beta),0,np.cos(beta)]],dtype='float32') 16 | for t in np.linspace(0,2*np.pi,1000): 17 | xyz=np.array([0,np.cos(t)/1.8,np.sin(t)/1.8]) 18 | xyz=viewmat.T.dot(xyz)+[0.14,0,-2] 19 | x=int(cx+xyz[0]/-xyz[2]*focus) 20 | y=int(cy+xyz[1]/-xyz[2]*focus) 21 | cv2.circle(overlay,(y,x),1,cv2.cv.CV_RGB(255,0,0)) 22 | 23 | xyz=np.array([-np.sin(t/2)/1.8,np.cos(t/2)/1.8,0]) 24 | xyz=viewmat.T.dot(xyz)+[0.14,0,-2] 25 | x=int(cx+xyz[0]/-xyz[2]*focus) 26 | y=int(cy+xyz[1]/-xyz[2]*focus) 27 | cv2.circle(overlay,(y,x),1,cv2.cv.CV_RGB(255,0,0)) 28 | 29 | for k in [-1,-0.5,0,0.5,1]: 30 | xyz=np.array([0,(t-np.pi)/np.pi/1.8,k/1.8]) 31 | xyz=viewmat.T.dot(xyz)+[0.14,0,-2] 32 | x=int(cx+xyz[0]/-xyz[2]*focus) 33 | y=int(cy+xyz[1]/-xyz[2]*focus) 34 | cv2.circle(overlay,(y,x),0,cv2.cv.CV_RGB(0,0,255)) 35 | 36 | for k in [-1,-0.5,0,0.5,1]: 37 | xyz=np.array([0,k/1.8,(t-np.pi)/np.pi/1.8]) 38 | xyz=viewmat.T.dot(xyz)+[0.14,0,-2] 39 | x=int(cx+xyz[0]/-xyz[2]*focus) 40 | y=int(cy+xyz[1]/-xyz[2]*focus) 41 | cv2.circle(overlay,(y,x),0,cv2.cv.CV_RGB(0,0,255)) 42 | 43 | xyz=np.array([-0.5/1.8,1/1.8,(t-np.pi)/np.pi/1.8]) 44 | xyz=viewmat.T.dot(xyz)+[0.14,0,-2] 45 | 
x=int(cx+xyz[0]/-xyz[2]*focus) 46 | y=int(cy+xyz[1]/-xyz[2]*focus) 47 | cv2.circle(overlay,(y,x),0,cv2.cv.CV_RGB(0,255,0)) 48 | 49 | xyz=np.array([-0.5/1.8,-1/1.8,(t-np.pi)/np.pi/1.8]) 50 | xyz=viewmat.T.dot(xyz)+[0.14,0,-2] 51 | x=int(cx+xyz[0]/-xyz[2]*focus) 52 | y=int(cy+xyz[1]/-xyz[2]*focus) 53 | cv2.circle(overlay,(y,x),0,cv2.cv.CV_RGB(0,255,0)) 54 | 55 | xyz=np.array([-0.5/1.8,(t-np.pi)/np.pi/1.8,1/1.8]) 56 | xyz=viewmat.T.dot(xyz)+[0.14,0,-2] 57 | x=int(cx+xyz[0]/-xyz[2]*focus) 58 | y=int(cy+xyz[1]/-xyz[2]*focus) 59 | cv2.circle(overlay,(y,x),0,cv2.cv.CV_RGB(0,255,0)) 60 | 61 | xyz=np.array([-0.5/1.8,(t-np.pi)/np.pi/1.8,-1/1.8]) 62 | xyz=viewmat.T.dot(xyz)+[0.14,0,-2] 63 | x=int(cx+xyz[0]/-xyz[2]*focus) 64 | y=int(cy+xyz[1]/-xyz[2]*focus) 65 | cv2.circle(overlay,(y,x),0,cv2.cv.CV_RGB(0,255,0)) 66 | overlay_mask=overlay.sum(axis=-1,keepdims=True)!=0 67 | 68 | img_in=cv2.imread(sys.argv[1]) 69 | img_mask=cv2.imread(sys.argv[2]) 70 | modelname=sys.argv[3] 71 | assert img_in.shape[:2]==img_mask.shape[:2] 72 | h,w=img_in.shape[:2] 73 | h2,w2=384,512 74 | if h*w2>h2*w: 75 | h2=h2 76 | w2=w*h2/h 77 | else: 78 | w2=w2 79 | h2=h*w2/w 80 | h,w=h2,w2 81 | img_in=cv2.resize(img_in,(w,h)) 82 | img_mask=cv2.resize(img_mask,(w,h)) 83 | 84 | xy0=h/8,w/8 85 | xy1=h-h/8,w-w/8 86 | 87 | mousexy=[(0,0)] 88 | cur_drag=-1 89 | start_drag_xy=(0,0) 90 | def mouseCallback(tp,mousey,mousex,*args): 91 | global xy0,xy1,cur_drag,start_drag_xy 92 | if tp==0: 93 | if cur_drag==0: 94 | dy=mousey-xy1[1] 95 | nx=int(dy*3/4)+xy1[0] 96 | xy0=(nx,mousey) 97 | elif cur_drag==1: 98 | dy=mousey-xy0[1] 99 | nx=int(dy*3/4)+xy0[0] 100 | xy1=(nx,mousey) 101 | elif cur_drag==2: 102 | nx=xy0[0]+mousex-start_drag_xy[0] 103 | ny=xy0[1]+mousey-start_drag_xy[1] 104 | xy0=(nx,ny) 105 | nx=xy1[0]+mousex-start_drag_xy[0] 106 | ny=xy1[1]+mousey-start_drag_xy[1] 107 | xy1=(nx,ny) 108 | start_drag_xy=(mousex,mousey) 109 | elif tp==1: 110 | dist1=(xy0[0]-mousex)**2+(xy0[1]-mousey)**2 111 | dist2=(xy1[0]-mousex)**2+(xy1[1]-mousey)**2 112 | if min(dist1,dist2)>100: 113 | if mousex>=xy0[0] and mousex<=xy1[0] and mousey>=xy0[1] and mousey<=xy1[1]: 114 | start_drag_xy=(mousex,mousey) 115 | cur_drag=2 116 | else: 117 | if dist10.5 145 | show_cropped=((cropped-(cropped/2*cropped_mask[:,:,None]))*(~overlay_mask))|(overlay*(overlay_mask)) 146 | cv2.imshow('image',show) 147 | cv2.imshow('cropped',show_cropped) 148 | if xyzs is not None: 149 | cmd=show3d.showpoints(xyzs,waittime=10)%256 150 | else: 151 | cmd=cv2.waitKey(10)%256 152 | if cmd==ord('q'): 153 | break 154 | elif cmd==ord('l'): 155 | rects=np.loadtxt('%s.rect.txt'%sys.argv[1]) 156 | x0=int(np.round(rects[0,0])) 157 | y0=int(np.round(rects[0,1])) 158 | x1=int(np.round(rects[1,0])) 159 | y1=int(np.round(rects[1,1])) 160 | xy0=(x0,y0) 161 | xy1=(x1,y1) 162 | elif cmd==ord(' '): 163 | if runsingleimage is None: 164 | import runsingleimage 165 | model=runsingleimage.loadModel(modelname) 166 | cv2.imwrite('%s.crop.png'%sys.argv[1],cropped) 167 | cv2.imwrite('%s.crop_m.png'%sys.argv[1],np.uint8(cropped_mask)*255) 168 | np.savetxt('%s.rect.txt'%sys.argv[1],[[x0,y0],[x1,y1]]) 169 | xyzs=runsingleimage.run_image(model,cropped,cropped_mask) 170 | np.savetxt('%s.xyz'%sys.argv[1],xyzs) 171 | -------------------------------------------------------------------------------- /demo/r0.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/r0.png 
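A note on interactive_crop.py above, since the README does not cover it: it takes the same arguments as runsingleimage.py (a source photo, its segmentation mask, and a weights file, read from sys.argv[1..3]). Drag the rectangle's corners to frame the object, or drag its interior to move it; press space to crop, run the network, and write `<image>.crop.png`, `<image>.crop_m.png`, `<image>.rect.txt`, and `<image>.xyz`; press `l` to reload a previously saved rectangle; press `q` to quit. A minimal session, assuming the version-2 weights have been downloaded into the demo directory as twobranch_v2.pkl:

```
$ cd demo
$ python interactive_crop.py src_3.png src_3_m.png twobranch_v2.pkl
```

The saved `.xyz` file is plain whitespace-separated `x y z` floats, one of the network's 1024 predicted points per row, so it can be reloaded and inspected with the viewer bundled in this directory. A short sketch:

```python
import numpy as np
import show3d  # demo/show3d.py, shown later in this listing

xyz = np.loadtxt('src_3.png.xyz')  # (1024, 3): one predicted point per row
show3d.showpoints(xyz)             # move the mouse to rotate; press q to quit
```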
-------------------------------------------------------------------------------- /demo/r1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/r1.png -------------------------------------------------------------------------------- /demo/r2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/r2.png -------------------------------------------------------------------------------- /demo/r3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/r3.png -------------------------------------------------------------------------------- /demo/runr2n2_128.py: -------------------------------------------------------------------------------- 1 | import cv2 2 | import time 3 | import numpy as np 4 | import cPickle as pickle 5 | import tensorflow as tf 6 | import tflearn 7 | import sys 8 | 9 | BATCH_SIZE=1 10 | HEIGHT=128 11 | WIDTH=128 12 | 13 | def loadModel(weightsfile): 14 | with tf.device('/cpu'): 15 | img_inp=tf.placeholder(tf.float32,shape=(BATCH_SIZE,HEIGHT,WIDTH,3),name='img_inp') 16 | x=img_inp 17 | #128 128 18 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 19 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 20 | x1=x 21 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 22 | #64 64 23 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 24 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 25 | x2=x 26 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 27 | #32 32 28 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 29 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 30 | x3=x 31 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 32 | #16 16 33 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 34 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 35 | x4=x 36 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 37 | #8 8 38 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 39 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 40 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 41 | x5=x 42 | x=tflearn.layers.conv.conv_2d(x,512,(5,5),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 43 | x_additional=tflearn.layers.core.fully_connected(x,2048,activation='relu',weight_decay=1e-3,regularizer='L2') 44 | x_additional=tflearn.layers.core.fully_connected(x_additional,1024,activation='relu',weight_decay=1e-3,regularizer='L2') 45 | 
x_additional=tflearn.layers.core.fully_connected(x_additional,256*3,activation='linear',weight_decay=1e-3,regularizer='L2') 46 | x_additional=tf.reshape(x_additional,(BATCH_SIZE,256,3)) 47 | #x=tflearn.layers.core.fully_connected(x,3072,activation='relu',weight_decay=1e-3,regularizer='L2') 48 | #x=tf.reshape(x,(BATCH_SIZE,3,4,256)) 49 | x=tflearn.layers.conv.conv_2d_transpose(x,256,[5,5],[8,8],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 50 | x5=tflearn.layers.conv.conv_2d(x5,256,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 51 | x=tf.nn.relu(tf.add(x,x5)) 52 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 53 | x=tflearn.layers.conv.conv_2d_transpose(x,128,[5,5],[16,16],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 54 | x4=tflearn.layers.conv.conv_2d(x4,128,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 55 | x=tf.nn.relu(tf.add(x,x4)) 56 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 57 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 58 | x=tflearn.layers.conv.conv_2d(x,3*3,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 59 | x=tf.reshape(x,(BATCH_SIZE,16*16*3,3)) 60 | x=tf.concat([x_additional,x],1) 61 | x=tf.reshape(x,(BATCH_SIZE,1024,3)) 62 | sess=tf.Session('') 63 | sess.run(tf.global_variables_initializer()) 64 | loaddict={} 65 | fin=open(weightsfile,'rb') 66 | while True: 67 | try: 68 | v,p=pickle.load(fin) 69 | except EOFError: 70 | break 71 | loaddict[v]=p 72 | fin.close() 73 | for t in tf.trainable_variables(): 74 | if t.name not in loaddict: 75 | print 'missing',t.name 76 | else: 77 | sess.run(t.assign(loaddict[t.name])) 78 | del loaddict[t.name] 79 | for k in loaddict.iteritems(): 80 | if k[0]!='Variable:0': 81 | print 'unused',k 82 | return (sess,img_inp,x) 83 | 84 | def run_image(model,img_in): 85 | (sess,img_inp,x)=model 86 | assert img_in.shape==(HEIGHT,WIDTH,3) 87 | img_in=np.float32(img_in)/255.0 88 | (ret,),=sess.run([x],feed_dict={img_inp:img_in[None,:,:,:]}) 89 | return ret 90 | 91 | if __name__=='__main__': 92 | model=loadModel(sys.argv[2]) 93 | img_in=cv2.imread(sys.argv[1]) 94 | fout=open(sys.argv[1]+'.txt','w') 95 | ret=run_image(model,img_in) 96 | for x,y,z in ret: 97 | print >>fout,x,y,z 98 | -------------------------------------------------------------------------------- /demo/runsingleimage.py: -------------------------------------------------------------------------------- 1 | import cv2 2 | import time 3 | import numpy as np 4 | import cPickle as pickle 5 | import tensorflow as tf 6 | import tflearn 7 | import sys 8 | 9 | BATCH_SIZE=1 10 | HEIGHT=192 11 | WIDTH=256 12 | 13 | def loadModel(weightsfile): 14 | with tf.device('/cpu'): 15 | img_inp=tf.placeholder(tf.float32,shape=(BATCH_SIZE,HEIGHT,WIDTH,4),name='img_inp') 16 | x=img_inp 17 | #192 256 18 | x=tflearn.layers.conv.conv_2d(x,16,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 19 | x=tflearn.layers.conv.conv_2d(x,16,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 20 | x0=x 21 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 22 | #96 128 23 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 24 | 
x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 25 | x1=x 26 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 27 | #48 64 28 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 29 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 30 | x2=x 31 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 32 | #24 32 33 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 34 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 35 | x3=x 36 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 37 | #12 16 38 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 39 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 40 | x4=x 41 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 42 | #6 8 43 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 44 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 45 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 46 | x5=x 47 | x=tflearn.layers.conv.conv_2d(x,512,(5,5),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 48 | x_additional=tflearn.layers.core.fully_connected(x,2048,activation='relu',weight_decay=1e-3,regularizer='L2') 49 | x_additional=tflearn.layers.core.fully_connected(x_additional,1024,activation='relu',weight_decay=1e-3,regularizer='L2') 50 | x_additional=tflearn.layers.core.fully_connected(x_additional,256*3,activation='linear',weight_decay=1e-3,regularizer='L2') 51 | x_additional=tf.reshape(x_additional,(BATCH_SIZE,256,3)) 52 | x=tflearn.layers.conv.conv_2d_transpose(x,256,[5,5],[6,8],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 53 | x5=tflearn.layers.conv.conv_2d(x5,256,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 54 | x=tf.nn.relu(tf.add(x,x5)) 55 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 56 | x=tflearn.layers.conv.conv_2d_transpose(x,128,[5,5],[12,16],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 57 | x4=tflearn.layers.conv.conv_2d(x4,128,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 58 | x=tf.nn.relu(tf.add(x,x4)) 59 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 60 | x=tflearn.layers.conv.conv_2d_transpose(x,64,[5,5],[24,32],strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 61 | x3=tflearn.layers.conv.conv_2d(x3,64,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 62 | x=tf.nn.relu(tf.add(x,x3)) 63 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 64 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 65 | 
x=tflearn.layers.conv.conv_2d(x,3,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 66 | x=tf.reshape(x,(BATCH_SIZE,32*24,3)) 67 | x=tf.concat([x_additional,x],axis=1) 68 | x=tf.reshape(x,(BATCH_SIZE,1024,3)) 69 | sess=tf.Session('') 70 | sess.run(tf.global_variables_initializer()) 71 | loaddict={} 72 | fin=open(weightsfile,'rb') 73 | while True: 74 | try: 75 | v,p=pickle.load(fin) 76 | except EOFError: 77 | break 78 | loaddict[v]=p 79 | fin.close() 80 | for t in tf.trainable_variables(): 81 | if t.name not in loaddict: 82 | print 'missing',t.name 83 | else: 84 | sess.run(t.assign(loaddict[t.name])) 85 | del loaddict[t.name] 86 | for k in loaddict.iteritems(): 87 | if k[0]!='Variable:0': 88 | print 'unused',k 89 | return (sess,img_inp,x) 90 | 91 | def run_image(model,img_in,img_mask): 92 | (sess,img_inp,x)=model 93 | img_in=img_in*(1-img_mask[:,:,None])+191*img_mask[:,:,None] 94 | img_packed=np.dstack([img_in.astype('float32')/255,img_mask[:,:,None]]) 95 | assert img_packed.shape==(HEIGHT,WIDTH,4) 96 | 97 | (ret,),=sess.run([x],feed_dict={img_inp:img_packed[None,:,:,:]}) 98 | return ret 99 | 100 | if __name__=='__main__': 101 | model=loadModel(sys.argv[3]) 102 | img_in=cv2.imread(sys.argv[1]) 103 | img_mask=cv2.imread(sys.argv[2],0)!=0 104 | fout=open(sys.argv[1]+'.txt','w') 105 | ret=run_image(model,img_in,img_mask) 106 | for x,y,z in ret: 107 | print >>fout,x,y,z 108 | -------------------------------------------------------------------------------- /demo/show3d.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import cv2 3 | import sys 4 | showsz=800 5 | mousex,mousey=0.5,0.5 6 | zoom=1.0 7 | changed=True 8 | def onmouse(*args): 9 | global mousex,mousey,changed 10 | y=args[1] 11 | x=args[2] 12 | mousex=x/float(showsz) 13 | mousey=y/float(showsz) 14 | changed=True 15 | cv2.namedWindow('show3d') 16 | cv2.moveWindow('show3d',0,0) 17 | cv2.setMouseCallback('show3d',onmouse) 18 | def showpoints(xyz,c0=None,c1=None,c2=None,waittime=0,showrot=False,magnifyBlue=0,freezerot=False,background=(0,0,0),normalizecolor=True): 19 | global showsz,mousex,mousey,zoom,changed 20 | xyz=xyz-xyz.mean(axis=0) 21 | radius=((xyz**2).sum(axis=-1)**0.5).max() 22 | xyz/=(radius*2.2)/showsz 23 | if c0 is None: 24 | c0=np.zeros((len(xyz),),dtype='float32')+255 25 | if c1 is None: 26 | c1=c0 27 | if c2 is None: 28 | c2=c0 29 | if normalizecolor: 30 | c0/=(c0.max()+1e-14)/255.0 31 | c1/=(c1.max()+1e-14)/255.0 32 | c2/=(c2.max()+1e-14)/255.0 33 | 34 | show=np.zeros((showsz,showsz,3),dtype='uint8') 35 | def render(): 36 | rotmat=np.eye(3) 37 | if not freezerot: 38 | xangle=(mousey-0.5)*np.pi*1.2 39 | else: 40 | xangle=0 41 | rotmat=rotmat.dot(np.array([ 42 | [1.0,0.0,0.0], 43 | [0.0,np.cos(xangle),-np.sin(xangle)], 44 | [0.0,np.sin(xangle),np.cos(xangle)], 45 | ])) 46 | if not freezerot: 47 | yangle=(mousex-0.5)*np.pi*1.2 48 | else: 49 | yangle=0 50 | rotmat=rotmat.dot(np.array([ 51 | [np.cos(yangle),0.0,-np.sin(yangle)], 52 | [0.0,1.0,0.0], 53 | [np.sin(yangle),0.0,np.cos(yangle)], 54 | ])) 55 | rotmat*=zoom 56 | nxyz=xyz.dot(rotmat) 57 | nz=nxyz[:,2].argsort() 58 | nxyz=nxyz[nz] 59 | nxyz=(nxyz[:,:2]+[showsz/2,showsz/2]).astype('int32') 60 | p=nxyz[:,0]*showsz+nxyz[:,1] 61 | show[:]=background 62 | m=(nxyz[:,0]>=0)*(nxyz[:,0]<showsz)*(nxyz[:,1]>=0)*(nxyz[:,1]<showsz) 63 | show.reshape((showsz*showsz,3))[p[m],1]=c0[nz][m] 64 | show.reshape((showsz*showsz,3))[p[m],2]=c1[nz][m] 65 | show.reshape((showsz*showsz,3))[p[m],0]=c2[nz][m] 66 | if magnifyBlue>0: 67 | show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],1,axis=0)) 68 | if magnifyBlue>=2: 69 | show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],-1,axis=0)) 70 |
show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],1,axis=1)) 71 | if magnifyBlue>=2: 72 | show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],-1,axis=1)) 73 | if showrot: 74 | cv2.putText(show,'xangle %d'%(int(xangle/np.pi*180)),(30,showsz-30),0,0.5,cv2.cv.CV_RGB(255,0,0)) 75 | cv2.putText(show,'yangle %d'%(int(yangle/np.pi*180)),(30,showsz-50),0,0.5,cv2.cv.CV_RGB(255,0,0)) 76 | cv2.putText(show,'zoom %d%%'%(int(zoom*100)),(30,showsz-70),0,0.5,cv2.cv.CV_RGB(255,0,0)) 77 | changed=True 78 | while True: 79 | if changed: 80 | render() 81 | changed=False 82 | cv2.imshow('show3d',show) 83 | if waittime==0: 84 | cmd=cv2.waitKey(10)%256 85 | else: 86 | cmd=cv2.waitKey(waittime)%256 87 | if cmd==ord('q'): 88 | break 89 | elif cmd==ord('Q'): 90 | sys.exit(0) 91 | if cmd==ord('n'): 92 | zoom*=1.1 93 | changed=True 94 | elif cmd==ord('m'): 95 | zoom/=1.1 96 | changed=True 97 | elif cmd==ord('r'): 98 | zoom=1.0 99 | changed=True 100 | elif cmd==ord('s'): 101 | cv2.imwrite('show3d.png',show) 102 | if waittime!=0: 103 | break 104 | return cmd 105 | -------------------------------------------------------------------------------- /demo/src_3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/src_3.png -------------------------------------------------------------------------------- /demo/src_3.png.crop.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/src_3.png.crop.png -------------------------------------------------------------------------------- /demo/src_3.png.crop_m.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/src_3.png.crop_m.png -------------------------------------------------------------------------------- /demo/src_3.png.rect.txt: -------------------------------------------------------------------------------- 1 | 9.500000000000000000e+01 1.030000000000000000e+02 2 | 3.280000000000000000e+02 4.130000000000000000e+02 3 | -------------------------------------------------------------------------------- /demo/src_3.png.xyz: -------------------------------------------------------------------------------- 1 | 7.705926299095153809e-01 -1.146750450134277344e-01 -3.767032623291015625e-01 2 | 5.562801361083984375e-01 -1.212157234549522400e-01 -3.694468140602111816e-01 3 | 5.320999622344970703e-01 -2.981717884540557861e-01 -2.240327000617980957e-01 4 | 2.418586164712905884e-01 2.653748393058776855e-01 -2.474299520254135132e-01 5 | 4.061003923416137695e-01 -1.970603913068771362e-01 3.136818408966064453e-01 6 | 7.486844062805175781e-01 2.747643589973449707e-01 -2.529205679893493652e-01 7 | 3.353889882564544678e-01 -1.519514992833137512e-02 3.584670722484588623e-01 8 | 2.656065821647644043e-01 3.618135452270507812e-01 9.089146554470062256e-02 9 | 8.144126534461975098e-01 2.431047558784484863e-01 -2.454209774732589722e-01 10 | 2.181874215602874756e-01 3.511094450950622559e-01 -1.014925166964530945e-01 11 | 2.494989931583404541e-01 2.160856127738952637e-01 3.305924236774444580e-01 12 | 3.366644382476806641e-01 1.479753702878952026e-01 3.443672060966491699e-01 13 | 2.914511859416961670e-01 1.950120627880096436e-01 3.655069470405578613e-01 14 | 5.436064004898071289e-01 
3.714980930089950562e-02 4.202842712402343750e-01 15 | 2.092808485031127930e-01 2.571685910224914551e-01 3.147209882736206055e-01 16 | 8.220213651657104492e-01 -2.069231271743774414e-01 -3.113873898983001709e-01 17 | 4.240033626556396484e-01 -1.221607029438018799e-01 -3.813233673572540283e-01 18 | 7.150502204895019531e-01 -2.518184483051300049e-01 -2.889570891857147217e-01 19 | 1.378212273120880127e-01 3.472861349582672119e-01 -2.166412025690078735e-02 20 | 2.149495184421539307e-01 2.451208531856536865e-01 -3.067460656166076660e-01 21 | 5.169282555580139160e-01 -1.898092627525329590e-01 3.074489831924438477e-01 22 | 6.295615434646606445e-01 3.141358494758605957e-01 -1.772823184728622437e-01 23 | 6.576899290084838867e-01 1.889052093029022217e-01 -3.117966651916503906e-01 24 | 3.888659179210662842e-01 6.504556536674499512e-02 -3.808341324329376221e-01 25 | 2.683319151401519775e-01 3.431324288249015808e-02 3.863714933395385742e-01 26 | 6.272878050804138184e-01 9.393028914928436279e-03 3.804520368576049805e-01 27 | 1.784453094005584717e-01 3.713415861129760742e-01 -3.603206947445869446e-02 28 | 6.082012057304382324e-01 -2.187290489673614502e-01 2.665959000587463379e-01 29 | 1.128521710634231567e-01 3.313661515712738037e-01 -1.427271813154220581e-01 30 | 4.820144772529602051e-01 -1.539824604988098145e-01 -3.587653040885925293e-01 31 | 1.608115285634994507e-01 1.527476757764816284e-01 -3.815534114837646484e-01 32 | 3.340698778629302979e-01 3.866538405418395996e-02 -3.702365458011627197e-01 33 | 5.315281152725219727e-01 1.881164312362670898e-01 -3.201015293598175049e-01 34 | 5.041375160217285156e-01 -1.505932956933975220e-02 -3.777618408203125000e-01 35 | 7.207415699958801270e-01 8.033823221921920776e-02 -3.841775357723236084e-01 36 | 4.898673295974731445e-01 2.986828088760375977e-01 -1.872919499874114990e-01 37 | 7.950515747070312500e-01 -2.345340996980667114e-01 -2.706584036350250244e-01 38 | 3.933067917823791504e-01 1.309704780578613281e-01 -3.508762121200561523e-01 39 | 5.363295078277587891e-01 1.033615991473197937e-01 3.700568974018096924e-01 40 | 2.976536750793457031e-01 2.293733209371566772e-01 -2.915981411933898926e-01 41 | 1.932522058486938477e-01 2.438957244157791138e-01 2.863798141479492188e-01 42 | 2.868653535842895508e-01 2.598408460617065430e-01 2.769750654697418213e-01 43 | 3.282673358917236328e-01 -2.251402735710144043e-01 2.954606413841247559e-01 44 | 2.752423584461212158e-01 -9.050513058900833130e-02 -4.080760776996612549e-01 45 | 1.309191286563873291e-01 6.171436980366706848e-02 -3.304080367088317871e-01 46 | 3.043651580810546875e-01 -1.754481494426727295e-01 -3.546653091907501221e-01 47 | 1.286292225122451782e-01 -2.163901329040527344e-01 3.177973628044128418e-01 48 | 6.564464569091796875e-01 1.831406950950622559e-01 3.743536174297332764e-01 49 | 1.457999795675277710e-01 2.119274586439132690e-01 -3.061694800853729248e-01 50 | 7.050445079803466797e-01 -6.368923932313919067e-02 3.652115762233734131e-01 51 | 8.409731388092041016e-01 -2.480416893959045410e-01 -2.269707173109054565e-01 52 | 4.317613244056701660e-01 2.495521306991577148e-01 -2.710983455181121826e-01 53 | 2.826033234596252441e-01 3.173267543315887451e-01 1.942638009786605835e-01 54 | 2.824973464012145996e-01 -4.466225951910018921e-02 4.243206977844238281e-01 55 | 3.028146624565124512e-01 -1.148161292076110840e-01 3.556703031063079834e-01 56 | 6.902369856834411621e-01 -1.497818678617477417e-01 3.205738961696624756e-01 57 | 8.573985695838928223e-01 -7.956265658140182495e-02 -3.686289787292480469e-01 58 | 
3.678670525550842285e-01 3.221522569656372070e-01 1.819022297859191895e-01 59 | 1.210997551679611206e-01 1.753932684659957886e-01 -3.403524458408355713e-01 60 | 7.649956941604614258e-01 1.138893961906433105e-01 -3.481847047805786133e-01 61 | 2.925502657890319824e-01 2.309242635965347290e-02 -3.953397870063781738e-01 62 | 3.629691600799560547e-01 2.439160048961639404e-01 -2.887226045131683350e-01 63 | 1.358288377523422241e-01 -2.898198366165161133e-01 2.170063108205795288e-01 64 | 2.233422696590423584e-01 -3.071841001510620117e-01 2.055003792047500610e-01 65 | 1.648660302162170410e-01 -2.245137095451354980e-01 -3.198354840278625488e-01 66 | 5.733913183212280273e-01 -5.857632681727409363e-02 -3.889948427677154541e-01 67 | 1.416138410568237305e-01 3.383083939552307129e-01 1.519013345241546631e-01 68 | 3.740040063858032227e-01 3.361190482974052429e-02 3.994985222816467285e-01 69 | 3.863651156425476074e-01 2.601618170738220215e-01 2.922971844673156738e-01 70 | 2.187293469905853271e-01 8.434005081653594971e-02 4.137287437915802002e-01 71 | 4.961263835430145264e-01 2.751903533935546875e-01 2.514103949069976807e-01 72 | 6.541758775711059570e-01 1.116375476121902466e-01 -3.481611013412475586e-01 73 | 1.685232818126678467e-01 9.992828220129013062e-02 3.889634609222412109e-01 74 | 5.472611188888549805e-01 -9.961419552564620972e-02 3.602122664451599121e-01 75 | 8.211843967437744141e-01 1.793246716260910034e-01 -3.599840104579925537e-01 76 | 3.969377875328063965e-01 -1.896826922893524170e-02 -3.754572272300720215e-01 77 | 5.038072466850280762e-01 1.612651720643043518e-02 3.666536211967468262e-01 78 | 3.493393659591674805e-01 -1.276005506515502930e-01 -3.639337718486785889e-01 79 | 3.564705550670623779e-01 -7.030356675386428833e-02 3.822591304779052734e-01 80 | 2.713019847869873047e-01 3.584863841533660889e-01 -5.714968591928482056e-02 81 | 2.840348482131958008e-01 -2.239733785390853882e-01 -3.102399408817291260e-01 82 | 1.489798128604888916e-01 -1.360507458448410034e-01 3.459404408931732178e-01 83 | 8.711200952529907227e-01 1.179288923740386963e-01 -3.296959400177001953e-01 84 | 1.339323222637176514e-01 4.032979160547256470e-02 3.956090211868286133e-01 85 | 4.287827014923095703e-01 -1.964634805917739868e-01 -3.389676213264465332e-01 86 | 2.542583346366882324e-01 -8.408253081142902374e-03 -3.803386390209197998e-01 87 | 5.699475407600402832e-01 -1.837004125118255615e-01 -3.512721657752990723e-01 88 | 6.858030557632446289e-01 -1.025931388139724731e-01 -3.761575818061828613e-01 89 | 1.627897024154663086e-01 3.397272229194641113e-01 -1.558921039104461670e-01 90 | 8.227867484092712402e-01 -1.014477293938398361e-02 -3.581556975841522217e-01 91 | 1.151465401053428650e-01 -4.190011322498321533e-02 3.885966241359710693e-01 92 | 6.773176193237304688e-01 -2.835121452808380127e-01 -2.235905379056930542e-01 93 | 4.140653014183044434e-01 1.617291718721389771e-01 3.974392414093017578e-01 94 | 1.399283111095428467e-01 6.710889935493469238e-02 3.473164141178131104e-01 95 | 6.240779161453247070e-01 1.090542599558830261e-01 3.530281484127044678e-01 96 | 7.062150835990905762e-01 8.506269007921218872e-02 3.685637414455413818e-01 97 | 2.463641166687011719e-01 -2.651540040969848633e-01 -2.915841937065124512e-01 98 | 6.176531314849853516e-01 1.179689243435859680e-01 -4.079605937004089355e-01 99 | 4.822879135608673096e-01 -2.528754472732543945e-01 -2.881846427917480469e-01 100 | 3.729798793792724609e-01 -7.130751013755798340e-02 -3.825619518756866455e-01 101 | 1.411867141723632812e-01 1.506595909595489502e-01 
3.206208050251007080e-01 102 | 1.837417036294937134e-01 3.626599907875061035e-01 8.281032741069793701e-02 103 | 6.529207229614257812e-01 -2.325833030045032501e-02 -3.791441917419433594e-01 104 | 4.936356544494628906e-01 -8.837587386369705200e-02 3.966190516948699951e-01 105 | 4.306542873382568359e-01 1.596540771424770355e-02 -3.968495726585388184e-01 106 | 6.084516048431396484e-01 2.224996685981750488e-01 -3.073175549507141113e-01 107 | 1.213368177413940430e-01 3.398918211460113525e-01 9.462702274322509766e-02 108 | 2.416471689939498901e-01 3.248145580291748047e-01 -1.815816462039947510e-01 109 | 3.850282132625579834e-01 9.300028532743453979e-02 3.689247667789459229e-01 110 | 4.631093144416809082e-01 9.048161655664443970e-02 -3.797538876533508301e-01 111 | 1.244608312845230103e-01 3.575341105461120605e-01 3.857764229178428650e-02 112 | 4.647703766822814941e-01 -6.825302541255950928e-02 -3.852329254150390625e-01 113 | 4.593727588653564453e-01 -1.453298628330230713e-01 3.382258415222167969e-01 114 | 4.278001189231872559e-01 2.917236089706420898e-01 2.233502715826034546e-01 115 | 7.232968807220458984e-01 -1.527774035930633545e-01 -3.503187894821166992e-01 116 | 1.187248080968856812e-01 1.150705013424158096e-02 -3.711213171482086182e-01 117 | 2.462246417999267578e-01 -1.628848761320114136e-01 3.566002249717712402e-01 118 | 5.829972028732299805e-01 7.914572954177856445e-03 -3.703404664993286133e-01 119 | 7.150468826293945312e-01 1.526413112878799438e-01 -3.392485678195953369e-01 120 | 1.966511309146881104e-01 3.022801280021667480e-01 -2.236572802066802979e-01 121 | 4.605308771133422852e-01 -2.399379014968872070e-01 2.692742943763732910e-01 122 | 3.351870179176330566e-01 -2.674106359481811523e-01 2.412204295396804810e-01 123 | 4.952849447727203369e-01 -1.185224354267120361e-01 -4.010999500751495361e-01 124 | 4.136235117912292480e-01 3.424460291862487793e-01 -1.095898151397705078e-01 125 | 6.022516489028930664e-01 1.570783704519271851e-01 -3.243980705738067627e-01 126 | 3.602800965309143066e-01 3.586621880531311035e-01 -5.320477485656738281e-02 127 | 4.720673263072967529e-01 1.230899393558502197e-01 -3.376909792423248291e-01 128 | 4.714152216911315918e-01 1.995466351509094238e-01 3.400657474994659424e-01 129 | 5.399237275123596191e-01 -2.428116202354431152e-01 -3.060319721698760986e-01 130 | 1.197926476597785950e-01 1.348896771669387817e-01 -3.192522227764129639e-01 131 | 5.420909523963928223e-01 2.382534444332122803e-01 -2.661869823932647705e-01 132 | 3.641397953033447266e-01 -1.744139492511749268e-01 -3.714274168014526367e-01 133 | 1.460055857896804810e-01 3.088219165802001953e-01 -1.949191093444824219e-01 134 | 5.255779623985290527e-01 3.179879784584045410e-01 1.770293712615966797e-01 135 | 2.030551284551620483e-01 -4.354900866746902466e-02 -4.117467999458312988e-01 136 | 3.606460988521575928e-01 -1.385498344898223877e-01 3.692905306816101074e-01 137 | 4.352413415908813477e-01 3.421209454536437988e-01 1.263881027698516846e-01 138 | 1.535896509885787964e-01 3.106475770473480225e-01 2.134804427623748779e-01 139 | 1.600665599107742310e-01 -4.393070563673973083e-02 -3.875620067119598389e-01 140 | 4.793885648250579834e-01 2.681903839111328125e-01 -2.487217634916305542e-01 141 | 7.121145129203796387e-01 -7.027380168437957764e-02 -4.337366223335266113e-01 142 | 1.461880058050155640e-01 2.657249271869659424e-01 -2.857758700847625732e-01 143 | 1.544559299945831299e-01 -1.330906748771667480e-01 3.872537910938262939e-01 144 | 2.304584681987762451e-01 1.555246859788894653e-01 -3.234785795211791992e-01 145 
| 4.504301846027374268e-01 6.382942944765090942e-02 3.835517168045043945e-01 146 | 3.022644817829132080e-01 -1.729137450456619263e-01 3.123476207256317139e-01 147 | 2.541521191596984863e-01 9.252720326185226440e-02 -3.621792495250701904e-01 148 | 2.217245399951934814e-01 -9.294619411230087280e-02 3.946228623390197754e-01 149 | 1.217171102762222290e-01 2.749453485012054443e-01 2.383698225021362305e-01 150 | 2.330081909894943237e-01 3.710658848285675049e-01 1.425025798380374908e-02 151 | 7.597330808639526367e-01 -6.045643985271453857e-02 -3.660657703876495361e-01 152 | 6.722259521484375000e-01 2.468386888504028320e-01 -2.740604281425476074e-01 153 | 3.476486206054687500e-01 1.650825440883636475e-01 -3.202390670776367188e-01 154 | 5.091602206230163574e-01 1.847266778349876404e-02 -4.217930734157562256e-01 155 | 3.306161463260650635e-01 -3.195853531360626221e-02 -4.098563194274902344e-01 156 | 2.678832113742828369e-01 1.286802887916564941e-01 3.661068975925445557e-01 157 | 3.215838670730590820e-01 3.402626514434814453e-01 1.390081048011779785e-01 158 | 1.155452355742454529e-01 2.738668918609619141e-01 -2.404388785362243652e-01 159 | 8.590327501296997070e-01 -1.649713218212127686e-01 -3.523962497711181641e-01 160 | 2.194930613040924072e-01 1.698912084102630615e-01 3.677099943161010742e-01 161 | 1.448232233524322510e-01 -1.010343208909034729e-01 -3.798206150531768799e-01 162 | 8.499252796173095703e-01 3.904517740011215210e-02 -4.243245124816894531e-01 163 | 7.235859632492065430e-01 2.157934606075286865e-01 -2.884799838066101074e-01 164 | 3.836352825164794922e-01 1.951463669538497925e-01 3.373952805995941162e-01 165 | 3.044406473636627197e-01 2.967647314071655273e-01 -2.004311382770538330e-01 166 | 5.552313327789306641e-01 3.003069162368774414e-01 -2.117542475461959839e-01 167 | 6.199748516082763672e-01 -8.640737831592559814e-02 -3.641122579574584961e-01 168 | 2.687269449234008789e-01 3.281421363353729248e-01 -1.447060406208038330e-01 169 | 6.205809116363525391e-01 2.702515721321105957e-01 -2.444070577621459961e-01 170 | 2.191077172756195068e-01 -9.272672981023788452e-02 -3.854133188724517822e-01 171 | 6.202413439750671387e-01 2.651161849498748779e-01 2.647429108619689941e-01 172 | 1.584216654300689697e-01 2.338143736124038696e-01 3.246772289276123047e-01 173 | 3.153225183486938477e-01 9.651164710521697998e-02 3.934436440467834473e-01 174 | 3.423835635185241699e-01 3.536463081836700439e-01 5.837447196245193481e-02 175 | 6.191207766532897949e-01 5.764074623584747314e-02 -3.809842467308044434e-01 176 | 3.358862102031707764e-01 2.891623973846435547e-01 2.384984940290451050e-01 177 | 6.810749173164367676e-01 2.893354594707489014e-01 -2.088682651519775391e-01 178 | 6.956853866577148438e-01 1.678022183477878571e-02 -3.676601052284240723e-01 179 | 2.127347588539123535e-01 -3.048884570598602295e-01 -2.398660033941268921e-01 180 | 4.212883710861206055e-01 -8.178474009037017822e-02 3.453911840915679932e-01 181 | 7.467024922370910645e-01 3.143213391304016113e-01 -1.699282526969909668e-01 182 | 1.522493362426757812e-01 3.536935448646545410e-01 -9.172190725803375244e-02 183 | 3.234657645225524902e-01 3.424029052257537842e-01 -1.239131838083267212e-01 184 | 7.813875675201416016e-01 1.914428174495697021e-01 -3.085030317306518555e-01 185 | 1.808654069900512695e-01 -4.799156635999679565e-02 3.678755760192871094e-01 186 | 4.017655253410339355e-01 -2.503978908061981201e-01 -2.975029349327087402e-01 187 | 8.698890209197998047e-01 2.128464728593826294e-01 -2.804791033267974854e-01 188 | 2.012467980384826660e-01 
1.002654507756233215e-01 -3.802563548088073730e-01 189 | 4.237178266048431396e-01 -1.621858403086662292e-02 3.885680139064788818e-01 190 | 2.928330302238464355e-01 1.427642554044723511e-01 -3.615003824234008789e-01 191 | 3.234296739101409912e-01 1.060604974627494812e-01 -3.906720876693725586e-01 192 | 4.782593846321105957e-01 2.095991820096969604e-01 -3.006124496459960938e-01 193 | 6.354279518127441406e-01 -1.573133766651153564e-01 -3.731481134891510010e-01 194 | 4.328106343746185303e-01 1.776905059814453125e-01 -3.585928976535797119e-01 195 | 1.483676284551620483e-01 1.785362809896469116e-01 3.713724017143249512e-01 196 | 4.691488146781921387e-01 1.252301782369613647e-01 3.429818153381347656e-01 197 | 6.648914813995361328e-01 -2.005436718463897705e-01 -3.308480978012084961e-01 198 | 4.230238199234008789e-01 -2.857699394226074219e-01 -2.347917407751083374e-01 199 | 1.339489966630935669e-01 -2.937968969345092773e-01 -1.242954730987548828e-01 200 | 8.116637468338012695e-01 7.926568388938903809e-02 -3.601549863815307617e-01 201 | 7.220122814178466797e-01 2.020340710878372192e-01 3.194888830184936523e-01 202 | 3.639465868473052979e-01 3.172283172607421875e-01 -1.784331500530242920e-01 203 | 5.836064815521240234e-01 2.014105767011642456e-01 3.245353698730468750e-01 204 | 1.961987465620040894e-01 1.295228749513626099e-01 3.533636927604675293e-01 205 | 7.529631853103637695e-01 -2.055707573890686035e-01 -3.340223133563995361e-01 206 | 5.301479101181030273e-01 2.380781769752502441e-01 3.158683776855468750e-01 207 | 1.928285956382751465e-01 1.972386240959167480e-01 -3.332825899124145508e-01 208 | 4.363066852092742920e-01 2.283169478178024292e-01 2.909407615661621094e-01 209 | 2.321944832801818848e-01 -4.309381917119026184e-02 3.830273747444152832e-01 210 | 8.144912123680114746e-01 -1.341196298599243164e-01 -3.420256972312927246e-01 211 | 4.512342810630798340e-01 3.331987261772155762e-01 -1.593857109546661377e-01 212 | 4.999360442161560059e-01 3.507626056671142578e-01 -8.871947973966598511e-02 213 | 4.105191826820373535e-01 2.854452729225158691e-01 -2.200508415699005127e-01 214 | 5.249826908111572266e-01 1.613923162221908569e-01 3.566271066665649414e-01 215 | 1.530536413192749023e-01 2.864663302898406982e-01 2.688374221324920654e-01 216 | 2.435720115900039673e-01 -1.889840662479400635e-01 -3.675448000431060791e-01 217 | 1.263950765132904053e-01 2.137791067361831665e-01 -2.687318623065948486e-01 218 | 5.526820421218872070e-01 1.279724389314651489e-01 -3.534289002418518066e-01 219 | 2.618957161903381348e-01 1.986897587776184082e-01 -3.502616882324218750e-01 220 | 6.783820390701293945e-01 3.456427156925201416e-01 -1.154312938451766968e-01 221 | 2.109176516532897949e-01 5.464596301317214966e-02 -4.154730737209320068e-01 222 | 2.373152077198028564e-01 2.990410327911376953e-01 2.556762993335723877e-01 223 | 1.265939474105834961e-01 1.078022345900535583e-01 -3.693514466285705566e-01 224 | 2.438249289989471436e-01 -2.293980419635772705e-01 2.917152345180511475e-01 225 | 1.832219511270523071e-01 -1.276282826438546181e-03 4.123396575450897217e-01 226 | 5.852688550949096680e-01 -3.684821724891662598e-02 3.578302860260009766e-01 227 | 3.391652107238769531e-01 -2.232906520366668701e-01 -3.230921030044555664e-01 228 | 1.585738807916641235e-01 2.288912981748580933e-02 -4.059509932994842529e-01 229 | 5.939701795578002930e-01 -1.713022142648696899e-01 3.345609605312347412e-01 230 | 4.940721988677978516e-01 -2.013041228055953979e-01 -3.426719605922698975e-01 231 | 5.660185217857360840e-01 3.368895947933197021e-01 
-1.408058702945709229e-01 232 | 7.604880332946777344e-01 2.724863588809967041e-03 -3.722755312919616699e-01 233 | 1.467536687850952148e-01 -2.708900272846221924e-01 -2.782883048057556152e-01 234 | 1.613494753837585449e-01 6.146432831883430481e-02 -3.766616582870483398e-01 235 | 2.772063612937927246e-01 -9.509335458278656006e-02 -3.679552972316741943e-01 236 | 5.365747809410095215e-01 6.795690208673477173e-02 -3.705240488052368164e-01 237 | 1.711698770523071289e-01 -2.670844793319702148e-01 2.780916988849639893e-01 238 | 2.265510261058807373e-01 3.515241742134094238e-01 1.405401527881622314e-01 239 | 1.349474936723709106e-01 -1.465604901313781738e-01 -2.738518714904785156e-01 240 | 1.225642859935760498e-01 3.063913285732269287e-01 -6.804824620485305786e-02 241 | 3.974741101264953613e-01 2.033689767122268677e-01 -3.112744688987731934e-01 242 | 1.872568130493164062e-01 -1.959496289491653442e-01 3.340162336826324463e-01 243 | 3.254360258579254150e-01 -2.685531079769134521e-01 -2.616096138954162598e-01 244 | 8.238362073898315430e-01 3.109607696533203125e-01 -1.518586724996566772e-01 245 | 1.475738286972045898e-01 -1.820254027843475342e-01 -3.565183281898498535e-01 246 | 3.311561644077301025e-01 2.282937765121459961e-01 3.115609884262084961e-01 247 | 1.215573400259017944e-01 2.045575082302093506e-01 3.030666112899780273e-01 248 | 6.125872135162353516e-01 -2.264412343502044678e-01 -3.155186772346496582e-01 249 | 1.375759541988372803e-01 -3.388127684593200684e-02 -3.293606340885162354e-01 250 | 1.219640225172042847e-01 -1.330269873142242432e-01 -3.424401283264160156e-01 251 | 1.882543414831161499e-01 -1.441334635019302368e-01 -3.960108160972595215e-01 252 | 6.057394742965698242e-01 -2.652103900909423828e-01 -2.725022733211517334e-01 253 | 2.067829966545104980e-01 -2.250116169452667236e-01 -3.400144279003143311e-01 254 | 2.069714367389678955e-01 3.313360214233398438e-01 1.990315318107604980e-01 255 | 3.136795759201049805e-01 2.765176594257354736e-01 -2.489568293094635010e-01 256 | 1.288634538650512695e-01 -7.907451689243316650e-02 3.396677672863006592e-01 257 | 5.870261192321777344e-01 -3.065111041069030762e-01 -1.865074038505554199e-02 258 | 7.104701399803161621e-01 -3.099812567234039307e-01 1.063335221260786057e-02 259 | 6.589885950088500977e-01 -3.042175769805908203e-01 1.071733683347702026e-01 260 | 5.883206725120544434e-01 -2.843776345252990723e-01 1.341823190450668335e-01 261 | 4.260388016700744629e-01 -3.263129889965057373e-01 2.228891476988792419e-02 262 | 4.017775654792785645e-01 -3.096783757209777832e-01 1.328234970569610596e-01 263 | 3.036157786846160889e-01 -3.371527194976806641e-01 -1.768677309155464172e-02 264 | 3.038485944271087646e-01 -3.259164690971374512e-01 1.424968987703323364e-01 265 | 1.627638638019561768e-01 -3.761115372180938721e-01 -1.569752581417560577e-03 266 | 2.129251956939697266e-01 -3.448823094367980957e-01 1.709728837013244629e-01 267 | 1.619277745485305786e-01 -3.581978380680084229e-01 1.164428424090147018e-02 268 | 2.146890163421630859e-01 -3.208925426006317139e-01 2.378596365451812744e-01 269 | 2.159285545349121094e-01 -3.148442804813385010e-01 5.775283090770244598e-03 270 | 2.345138341188430786e-01 -2.561639845371246338e-01 3.365793526172637939e-01 271 | 2.644638419151306152e-01 -2.178266197443008423e-01 -1.262019574642181396e-01 272 | 2.675763964653015137e-01 4.253546148538589478e-02 3.528955876827239990e-01 273 | 2.495574951171875000e-01 1.677969992160797119e-01 -1.347315162420272827e-01 274 | 2.415793687105178833e-01 2.635939419269561768e-01 
2.576273381710052490e-01 275 | 2.929051518440246582e-01 2.384439706802368164e-01 -2.642680406570434570e-01 276 | 3.422034978866577148e-01 2.555538713932037354e-01 1.673049479722976685e-01 277 | 3.581928014755249023e-01 2.834301292896270752e-01 -7.759348303079605103e-02 278 | 3.974027037620544434e-01 2.904525697231292725e-01 1.461512297391891479e-01 279 | 4.445295333862304688e-01 2.242757976055145264e-01 -1.483562588691711426e-01 280 | 5.079360604286193848e-01 2.103472352027893066e-01 1.390603184700012207e-01 281 | 4.998089671134948730e-01 2.476524710655212402e-01 -4.892597720026969910e-02 282 | 5.864915847778320312e-01 2.422505915164947510e-01 1.128237247467041016e-01 283 | 6.442742943763732910e-01 2.195063233375549316e-01 -1.243334114551544189e-01 284 | 7.034811377525329590e-01 2.171640396118164062e-01 9.081200510263442993e-02 285 | 6.117165088653564453e-01 2.659941017627716064e-01 -3.001759573817253113e-02 286 | 7.524155974388122559e-01 2.457240223884582520e-01 8.943600952625274658e-02 287 | 7.751027941703796387e-01 2.724086940288543701e-01 -2.669089660048484802e-02 288 | 6.972598433494567871e-01 2.515257894992828369e-01 -6.385734677314758301e-02 289 | 5.489131212234497070e-01 -3.425690531730651855e-01 -1.136846188455820084e-02 290 | 6.116610169410705566e-01 -3.287175893783569336e-01 4.346203804016113281e-02 291 | 6.455554366111755371e-01 -3.063198626041412354e-01 -5.885123088955879211e-02 292 | 5.061725974082946777e-01 -3.221710026264190674e-01 1.279405783861875534e-02 293 | 4.628726840019226074e-01 -2.909434139728546143e-01 -8.741429448127746582e-02 294 | 2.502472698688507080e-01 -3.458096981048583984e-01 5.983317643404006958e-02 295 | 1.923073828220367432e-01 -3.441230356693267822e-01 -1.577230170369148254e-02 296 | 1.827364861965179443e-01 -3.651925623416900635e-01 3.592635318636894226e-02 297 | 1.270102411508560181e-01 -3.444291055202484131e-01 -3.271302580833435059e-02 298 | 1.067702323198318481e-01 -3.661663234233856201e-01 3.524677082896232605e-02 299 | 1.091686040163040161e-01 -3.361750245094299316e-01 -4.324610903859138489e-02 300 | 1.068224385380744934e-01 -3.423548638820648193e-01 3.304879739880561829e-02 301 | 1.413657069206237793e-01 -3.237423598766326904e-01 -8.670733869075775146e-02 302 | 1.885515749454498291e-01 -3.263814151287078857e-01 7.424919307231903076e-02 303 | 2.328571528196334839e-01 -3.374068439006805420e-01 -8.430291712284088135e-02 304 | 2.794865369796752930e-01 -3.240743577480316162e-01 1.093370020389556885e-01 305 | 3.252292871475219727e-01 -3.152033984661102295e-01 -1.510252803564071655e-01 306 | 3.374906182289123535e-01 -2.385938018560409546e-01 2.289526164531707764e-01 307 | 3.944402337074279785e-01 -3.247720301151275635e-01 -8.556336164474487305e-02 308 | 4.509431719779968262e-01 -2.663586735725402832e-01 1.583440005779266357e-01 309 | 4.625803828239440918e-01 -3.112175762653350830e-01 -1.767466664314270020e-01 310 | 4.828013181686401367e-01 -2.215069234371185303e-01 2.385824918746948242e-01 311 | 5.281835198402404785e-01 -3.230877816677093506e-01 5.424711853265762329e-03 312 | 5.940511822700500488e-01 -2.270993739366531372e-01 1.732209622859954834e-01 313 | 6.653634905815124512e-01 -2.936058938503265381e-01 -1.141245067119598389e-01 314 | 7.180732488632202148e-01 -1.937102675437927246e-01 2.392395287752151489e-01 315 | 7.613562941551208496e-01 -2.830920815467834473e-01 9.543035924434661865e-02 316 | 8.108318448066711426e-01 -1.676810532808303833e-01 1.992912143468856812e-01 317 | 8.904978632926940918e-01 -2.583211362361907959e-01 
-3.768843412399291992e-02 318 | 8.712719678878784180e-01 -6.680362671613693237e-02 2.847369015216827393e-01 319 | 9.117830395698547363e-01 -1.419956088066101074e-01 1.456836909055709839e-01 320 | 8.239653706550598145e-01 5.977044254541397095e-02 2.644605636596679688e-01 321 | 4.711192846298217773e-01 -3.428127765655517578e-01 5.178474262356758118e-02 322 | 5.329884886741638184e-01 -3.387692868709564209e-01 7.240140438079833984e-02 323 | 4.582235217094421387e-01 -3.282020986080169678e-01 -2.265802398324012756e-02 324 | 4.638685584068298340e-01 -3.104678094387054443e-01 1.184445172548294067e-01 325 | 3.467629551887512207e-01 -3.035320043563842773e-01 -8.183431625366210938e-02 326 | 3.606443405151367188e-01 -3.173504769802093506e-01 6.945243477821350098e-02 327 | 2.278503030538558960e-01 -3.135642409324645996e-01 -8.707185089588165283e-02 328 | 2.509502172470092773e-01 -3.333913981914520264e-01 7.538387179374694824e-02 329 | 1.439512819051742554e-01 -3.386295139789581299e-01 -1.056775823235511780e-01 330 | 2.070772051811218262e-01 -3.418378531932830811e-01 8.585754036903381348e-02 331 | 1.523206681013107300e-01 -3.217562437057495117e-01 -1.410056054592132568e-01 332 | 2.474277913570404053e-01 -2.993625402450561523e-01 1.615621000528335571e-01 333 | 1.830532848834991455e-01 -2.774512469768524170e-01 -2.552492022514343262e-01 334 | 2.808741331100463867e-01 -2.884114086627960205e-01 2.132054120302200317e-01 335 | 2.008676379919052124e-01 -8.765018731355667114e-02 -3.476310372352600098e-01 336 | 2.597755193710327148e-01 1.965925246477127075e-01 3.404984474182128906e-01 337 | 1.943729817867279053e-01 1.674606800079345703e-01 -3.051182627677917480e-01 338 | 2.293226718902587891e-01 3.233703076839447021e-01 5.432518944144248962e-02 339 | 1.937404125928878784e-01 2.332981228828430176e-01 -2.396350055932998657e-01 340 | 2.216679304838180542e-01 3.437899053096771240e-01 1.254063248634338379e-01 341 | 2.379001677036285400e-01 2.812468111515045166e-01 -1.363129168748855591e-01 342 | 3.364045023918151855e-01 2.999465167522430420e-01 2.516761049628257751e-02 343 | 3.530920445919036865e-01 2.381044924259185791e-01 -1.566105782985687256e-01 344 | 4.154704213142395020e-01 2.893326878547668457e-01 6.086428463459014893e-02 345 | 3.970087766647338867e-01 2.739211916923522949e-01 -6.109280511736869812e-02 346 | 4.982565045356750488e-01 2.827596366405487061e-01 4.213006049394607544e-02 347 | 5.101578235626220703e-01 2.290229052305221558e-01 -9.164176881313323975e-02 348 | 5.830904841423034668e-01 2.632229030132293701e-01 8.639197796583175659e-02 349 | 5.587353110313415527e-01 2.844349741935729980e-01 -3.230749443173408508e-02 350 | 6.952983736991882324e-01 2.747285962104797363e-01 1.440446358174085617e-02 351 | 6.030161976814270020e-01 2.768974602222442627e-01 -1.099577695131301880e-01 352 | 7.150692343711853027e-01 2.526558935642242432e-01 -1.172464862465858459e-01 353 | 3.840991258621215820e-01 -3.480804264545440674e-01 -1.182882394641637802e-02 354 | 3.638728857040405273e-01 -3.451878130435943604e-01 9.531708061695098877e-02 355 | 3.404591083526611328e-01 -3.438294529914855957e-01 -3.537509962916374207e-02 356 | 3.000702261924743652e-01 -3.460887372493743896e-01 2.745504304766654968e-02 357 | 2.366604804992675781e-01 -3.442286550998687744e-01 -7.861657440662384033e-02 358 | 1.804336011409759521e-01 -3.621229231357574463e-01 8.994290232658386230e-02 359 | 1.599026769399642944e-01 -3.432327210903167725e-01 -8.953891694545745850e-02 360 | 1.281714886426925659e-01 -3.691480457782745361e-01 8.692710846662521362e-02 
361 | 1.403197348117828369e-01 -3.175276815891265869e-01 -1.637112051248550415e-01 362 | 1.098335236310958862e-01 -3.606230318546295166e-01 1.284127831459045410e-01 363 | 9.989374130964279175e-02 -2.952738702297210693e-01 -1.795721054077148438e-01 364 | 1.080365851521492004e-01 -3.235208392143249512e-01 1.379677951335906982e-01 365 | 1.015883311629295349e-01 -2.722707986831665039e-01 -2.518034577369689941e-01 366 | 1.548620909452438354e-01 -2.948818504810333252e-01 2.198386043310165405e-01 367 | 1.667576283216476440e-01 -2.548432648181915283e-01 -2.794291377067565918e-01 368 | 2.481586337089538574e-01 -2.621965706348419189e-01 2.219340503215789795e-01 369 | 2.412436455488204956e-01 -2.391223013401031494e-01 -2.306722402572631836e-01 370 | 2.573164701461791992e-01 -1.484297215938568115e-01 3.025496602058410645e-01 371 | 2.763662934303283691e-01 -2.913908660411834717e-01 -1.604400873184204102e-01 372 | 3.715569078922271729e-01 -1.668738722801208496e-01 2.891961634159088135e-01 373 | 4.331616759300231934e-01 -2.751070559024810791e-01 -2.031281590461730957e-01 374 | 4.234377741813659668e-01 -1.104926168918609619e-01 2.616300284862518311e-01 375 | 4.608174562454223633e-01 -3.158953487873077393e-01 -1.298105567693710327e-01 376 | 5.277716517448425293e-01 -1.115065589547157288e-01 2.659195661544799805e-01 377 | 5.852107405662536621e-01 -2.505995929241180420e-01 -1.951492428779602051e-01 378 | 6.557826399803161621e-01 -1.034284755587577820e-01 2.274284362792968750e-01 379 | 6.911842226982116699e-01 -2.952949106693267822e-01 -1.375382486730813980e-02 380 | 7.905885577201843262e-01 -1.120209991931915283e-01 2.357827126979827881e-01 381 | 8.008006811141967773e-01 -2.306812107563018799e-01 -8.997534960508346558e-02 382 | 7.647358775138854980e-01 5.530048161745071411e-02 3.135595321655273438e-01 383 | 9.100858569145202637e-01 -8.582661300897598267e-02 -3.337421454489231110e-03 384 | 9.087048172950744629e-01 7.088643312454223633e-02 2.641575336456298828e-01 385 | 2.475120723247528076e-01 -3.601472675800323486e-01 4.880369827151298523e-02 386 | 2.986174225807189941e-01 -3.572633564472198486e-01 1.098885387182235718e-01 387 | 2.205286026000976562e-01 -3.574404120445251465e-01 -2.248669043183326721e-04 388 | 2.007586807012557983e-01 -3.517437875270843506e-01 1.515545248985290527e-01 389 | 1.682094633579254150e-01 -3.497604131698608398e-01 -4.788015037775039673e-02 390 | 1.453440785408020020e-01 -3.500467240810394287e-01 1.627845317125320435e-01 391 | 1.305864155292510986e-01 -3.269705474376678467e-01 -1.090615838766098022e-01 392 | 1.180361509323120117e-01 -3.267085850238800049e-01 2.039803564548492432e-01 393 | 1.050344258546829224e-01 -2.955084443092346191e-01 -1.320621818304061890e-01 394 | 1.185453012585639954e-01 -2.990240156650543213e-01 2.098459750413894653e-01 395 | 1.290167570114135742e-01 -2.735262215137481689e-01 -2.001987397670745850e-01 396 | 1.485157012939453125e-01 -2.431323975324630737e-01 2.298854887485504150e-01 397 | 1.641216576099395752e-01 -1.864743232727050781e-01 -2.985822558403015137e-01 398 | 2.233402132987976074e-01 -1.086352542042732239e-01 1.939011663198471069e-01 399 | 2.307570576667785645e-01 4.576732590794563293e-02 -3.262151479721069336e-01 400 | 2.388190627098083496e-01 2.162701338529586792e-01 1.905899643898010254e-01 401 | 1.929587274789810181e-01 2.432815730571746826e-01 -2.311726659536361694e-01 402 | 1.663455367088317871e-01 3.214017450809478760e-01 1.165836080908775330e-01 403 | 1.392973810434341431e-01 2.681834995746612549e-01 -2.062396407127380371e-01 404 | 
[... the remaining rows of this 1024-point cloud (plain-text "x y z" float triples, 1024 rows in total) are omitted ...]
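These point files are plain text with one "x y z" float triple per line, so they load directly with numpy. Below is a minimal sketch of inspecting and viewing such a file (assumptions: it is run from the repository root so the bundled show3d.py module is importable, and the demo file demo/src_3.png.xyz is used as input):
```
import numpy as np
import show3d  # the mouse-rotatable point viewer bundled in this repo

pts = np.loadtxt('demo/src_3.png.xyz')  # parses the "x y z" rows -> shape (1024, 3)
print(pts.shape, pts.min(axis=0), pts.max(axis=0))  # quick sanity check of the cloud
show3d.showpoints(pts)  # opens the viewer; move the mouse over the window to rotate
```
This is the same pattern demo/view.py (below) uses for the predicted point clouds.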
--------------------------------------------------------------------------------
/demo/src_3_m.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/fanhqme/PointSetGeneration/c5e80647207476038023e6cada9b69f2b1693a7d/demo/src_3_m.png
--------------------------------------------------------------------------------
/demo/view.py:
--------------------------------------------------------------------------------
1 | import show3d
2 | import numpy as np
3 | import sys
4 | a=np.loadtxt(sys.argv[1])
5 | show3d.showpoints(a)
6 |
--------------------------------------------------------------------------------
/depthestimate/BatchFetcher.py:
--------------------------------------------------------------------------------
1 | import sys
2 | import numpy as np
3 | import cv2
4 | import random
5 | import math
6 | import os
7 | import time
8 | import zlib
9 | import socket
10 | import threading
11 | import Queue
12 | import sys
13 | import cPickle as pickle
14 | import show3d
15 |
16 | FETCH_BATCH_SIZE=32
17 | BATCH_SIZE=32
18 | HEIGHT=192
19 | WIDTH=256
20 | POINTCLOUDSIZE=16384
21 | OUTPUTPOINTS=1024
22 | REEBSIZE=1024
23 |
24 | class BatchFetcher(threading.Thread):
25 |     def __init__(self, dataname):
26 |         super(BatchFetcher,self).__init__()
27 |         self.queue=Queue.Queue(64)
28 |         self.stopped=False
29 |         self.datadir = dataname
30 |         self.bno=0
31 |     def work(self,bno):
32 |         path = os.path.join(self.datadir,'%d/%d.gz'%(bno//1000,bno))
33 |         if not os.path.exists(path):
34 |             self.stopped=True
35 |             print "error! data file does not exist: %s"%path
36 |             print "please KILL THIS PROGRAM, otherwise its behavior is undefined"
37 |             assert False,"data file does not exist: %s"%path
38 |         binfile=zlib.decompress(open(path,'r').read())
39 |         p=0
40 |         color=np.fromstring(binfile[p:p+FETCH_BATCH_SIZE*HEIGHT*WIDTH*3],dtype='uint8').reshape((FETCH_BATCH_SIZE,HEIGHT,WIDTH,3))
41 |         p+=FETCH_BATCH_SIZE*HEIGHT*WIDTH*3
42 |         depth=np.fromstring(binfile[p:p+FETCH_BATCH_SIZE*HEIGHT*WIDTH*2],dtype='uint16').reshape((FETCH_BATCH_SIZE,HEIGHT,WIDTH))
43 |         p+=FETCH_BATCH_SIZE*HEIGHT*WIDTH*2
44 |         rotmat=np.fromstring(binfile[p:p+FETCH_BATCH_SIZE*3*3*4],dtype='float32').reshape((FETCH_BATCH_SIZE,3,3))
45 |         p+=FETCH_BATCH_SIZE*3*3*4
46 |         ptcloud=np.fromstring(binfile[p:p+FETCH_BATCH_SIZE*POINTCLOUDSIZE*3],dtype='uint8').reshape((FETCH_BATCH_SIZE,POINTCLOUDSIZE,3))
47 |         ptcloud=ptcloud.astype('float32')/255
48 |         beta=math.pi/180*20
49 |         viewmat=np.array([[
50 |             np.cos(beta),0,-np.sin(beta)],[
51 |             0,1,0],[
52 |             np.sin(beta),0,np.cos(beta)]],dtype='float32')
53 |         rotmat=rotmat.dot(np.linalg.inv(viewmat))
54 |         for i in xrange(FETCH_BATCH_SIZE):
55 |             ptcloud[i]=((ptcloud[i]-[0.7,0.5,0.5])/0.4).dot(rotmat[i])+[1,0,0]
56 |         p+=FETCH_BATCH_SIZE*POINTCLOUDSIZE*3
57 |         reeb=np.fromstring(binfile[p:p+FETCH_BATCH_SIZE*REEBSIZE*2*4],dtype='uint16').reshape((FETCH_BATCH_SIZE,REEBSIZE,4))
58 |         p+=FETCH_BATCH_SIZE*REEBSIZE*2*4
59 |         keynames=binfile[p:].split('\n')
60 |         reeb=reeb.astype('float32')/65535
61 |         for i in xrange(FETCH_BATCH_SIZE):
62 |             reeb[i,:,:3]=((reeb[i,:,:3]-[0.7,0.5,0.5])/0.4).dot(rotmat[i])+[1,0,0]
63 |         data=np.zeros((FETCH_BATCH_SIZE,HEIGHT,WIDTH,4),dtype='float32')
64 |         data[:,:,:,:3]=color*(1/255.0)
65 |         data[:,:,:,3]=depth==0
66 |         validating=np.array([i[0]=='f' for i in keynames],dtype='float32')
67 |         return (data,ptcloud,validating)
68 |     def run(self):
69 |         while self.bno<300000 and not self.stopped:
70 |             self.queue.put(self.work(self.bno%300000))
71 |             self.bno+=1
72 |     def fetch(self):
73 |         if self.stopped:
74 |             return None
75 |         return self.queue.get()
76 |     def shutdown(self):
77 |         self.stopped=True
78 |         while not self.queue.empty():
79 |             self.queue.get()
80 |
81 | if __name__=='__main__':
82 |     dataname = "YTTRBtraindump_220k"
83 |     fetchworker = BatchFetcher(dataname)
84 |     fetchworker.bno=0
85 |     fetchworker.start()
86 |     for cnt in xrange(100):
87 |         data,ptcloud,validating = fetchworker.fetch()
88 |         validating = validating[0]!=0
89 |         assert len(data)==FETCH_BATCH_SIZE
90 |         for i in range(len(data)):
91 |             cv2.imshow('data',data[i])
92 |             while True:
93 |                 cmd=show3d.showpoints(ptcloud[i])
94 |                 if cmd==ord(' '):
95 |                     break
96 |                 elif cmd==ord('q'):
97 |                     break
98 |             if cmd==ord('q'):
99 |                 break
100 |
101 |
102 |
--------------------------------------------------------------------------------
/depthestimate/render_balls_so.cpp:
--------------------------------------------------------------------------------
1 | #include <cstdio>
2 | #include <vector>
3 | #include <algorithm>
4 | #include <math.h>
5 | using namespace std;
6 |
7 | struct PointInfo{
8 |     int x,y,z;
9 |     float r,g,b;
10 | };
11 |
12 | extern "C"{
13 |
14 | void render_ball(int h,int w,unsigned char * show,int n,int * xyzs,float * c0,float * c1,float * c2,int r){
15 |     r=max(r,1);
16 |     vector<int> depth(h*w,-2100000000);
17 |     vector<PointInfo> pattern;
18 |     for (int dx=-r;dx<=r;dx++)
19 |         for (int dy=-r;dy<=r;dy++)
20 |             if (dx*dx+dy*dy
[... the rest of render_balls_so.cpp was garbled during extraction and is omitted here ...]
--------------------------------------------------------------------------------
/depthestimate/show3d.py:
--------------------------------------------------------------------------------
[... lines 1-65 were garbled during extraction and are omitted here ...]
66 |         if magnifyBlue>0:
67 |             show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],1,axis=0))
68 |             if magnifyBlue>=2:
69 |                 show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],-1,axis=0))
70 |             show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],1,axis=1))
71 |             if magnifyBlue>=2:
72 |                 show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],-1,axis=1))
73 |         if showrot:
74 |             cv2.putText(show,'xangle %d'%(int(xangle/np.pi*180)),(30,showsz-30),0,0.5,cv2.cv.CV_RGB(255,0,0))
75 |             cv2.putText(show,'yangle %d'%(int(yangle/np.pi*180)),(30,showsz-50),0,0.5,cv2.cv.CV_RGB(255,0,0))
76 |             cv2.putText(show,'zoom %d%%'%(int(zoom*100)),(30,showsz-70),0,0.5,cv2.cv.CV_RGB(255,0,0))
77 |     changed=True
78 |     while True:
79 |         if changed:
80 |             render()
81 |             changed=False
82 |         cv2.imshow('show3d',show)
83 |         if waittime==0:
84 |             cmd=cv2.waitKey(10)%256
85 |         else:
86 |             cmd=cv2.waitKey(waittime)%256
87 |         if cmd==ord('q'):
88 |             break
89 |         elif cmd==ord('Q'):
90 |             sys.exit(0)
91 |         if cmd==ord('n'):
92 |             zoom*=1.1
93 |             changed=True
94 |         elif cmd==ord('m'):
95 |             zoom/=1.1
96 |             changed=True
97 |         elif cmd==ord('r'):
98 |             zoom=1.0
99 |             changed=True
100 |         elif cmd==ord('s'):
101 |             cv2.imwrite('show3d.png',show)
102 |         if waittime!=0:
103 |             break
104 |     return cmd
105 |
--------------------------------------------------------------------------------
/depthestimate/show3d_balls.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import ctypes as ct
3 | import cv2
4 | import sys
5 | showsz=800
6 | mousex,mousey=0.5,0.5
7 | zoom=1.0
8 | changed=True
9 | def onmouse(*args):
10 |     global mousex,mousey,changed
11 |     y=args[1]
12 |     x=args[2]
13 |     mousex=x/float(showsz)
14 |     mousey=y/float(showsz)
15 |     changed=True
16 | cv2.namedWindow('show3d')
17 | cv2.moveWindow('show3d',0,0)
18 | cv2.setMouseCallback('show3d',onmouse)
19 |
20 | dll=np.ctypeslib.load_library('render_balls_so','.')
21 |
22 | def showpoints(xyz,c0=None,c1=None,c2=None,waittime=0,showrot=False,magnifyBlue=0,freezerot=False,background=(0,0,0),normalizecolor=True,ballradius=10):
23 |     global showsz,mousex,mousey,zoom,changed
24 |     xyz=xyz-xyz.mean(axis=0)
25 |     radius=((xyz**2).sum(axis=-1)**0.5).max()
26 |     xyz/=(radius*2.2)/showsz
27 |     if c0 is None:
28 |         c0=np.zeros((len(xyz),),dtype='float32')+255
29 |     if c1 is None:
30 |         c1=c0
31 |     if c2 is None:
32 |         c2=c0
33 |     if normalizecolor:
34 |         c0/=(c0.max()+1e-14)/255.0
35 |         c1/=(c1.max()+1e-14)/255.0
36 |         c2/=(c2.max()+1e-14)/255.0
37 |     c0=np.require(c0,'float32','C')
38 |     c1=np.require(c1,'float32','C')
39 |     c2=np.require(c2,'float32','C')
40 |
41 |     show=np.zeros((showsz,showsz,3),dtype='uint8')
42 |     def render():
43 |         rotmat=np.eye(3)
44 |         if not freezerot:
45 |             xangle=(mousey-0.5)*np.pi*1.2
46 |         else:
47 |             xangle=0
48 |         rotmat=rotmat.dot(np.array([
49 |             [1.0,0.0,0.0],
50 |             [0.0,np.cos(xangle),-np.sin(xangle)],
51 |             [0.0,np.sin(xangle),np.cos(xangle)],
52 |             ]))
53 |         if not freezerot:
54 |             yangle=(mousex-0.5)*np.pi*1.2
55 |         else:
56 |             yangle=0
57 |         rotmat=rotmat.dot(np.array([
58 |             [np.cos(yangle),0.0,-np.sin(yangle)],
59 |             [0.0,1.0,0.0],
60 |             [np.sin(yangle),0.0,np.cos(yangle)],
61 |             ]))
62 |         rotmat*=zoom
63 |         nxyz=xyz.dot(rotmat)+[showsz/2,showsz/2,0]
64 |
65 |         ixyz=nxyz.astype('int32')
66 |         show[:]=background
67 |         dll.render_ball(
68 |             ct.c_int(show.shape[0]),
69 |             ct.c_int(show.shape[1]),
70 |             show.ctypes.data_as(ct.c_void_p),
71 |             ct.c_int(ixyz.shape[0]),
72 |             ixyz.ctypes.data_as(ct.c_void_p),
73 |             c0.ctypes.data_as(ct.c_void_p),
74 |             c1.ctypes.data_as(ct.c_void_p),
75 |             c2.ctypes.data_as(ct.c_void_p),
76 |             ct.c_int(ballradius)
77 |         )
78 |
79 |         if magnifyBlue>0:
80 |             show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],1,axis=0))
81 |             if magnifyBlue>=2:
82 |                 show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],-1,axis=0))
83 |             show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],1,axis=1))
84 |             if magnifyBlue>=2:
85 |                 show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],-1,axis=1))
86 |         if showrot:
87 |             cv2.putText(show,'xangle %d'%(int(xangle/np.pi*180)),(30,showsz-30),0,0.5,cv2.cv.CV_RGB(255,0,0))
88 |             cv2.putText(show,'yangle %d'%(int(yangle/np.pi*180)),(30,showsz-50),0,0.5,cv2.cv.CV_RGB(255,0,0))
89 |             cv2.putText(show,'zoom %d%%'%(int(zoom*100)),(30,showsz-70),0,0.5,cv2.cv.CV_RGB(255,0,0))
90 |     changed=True
91 |     while True:
92 |         if changed:
93 |             render()
94 |             changed=False
95 |         cv2.imshow('show3d',show)
96 |         if waittime==0:
97 |             cmd=cv2.waitKey(10)%256
98 |         else:
99 |             cmd=cv2.waitKey(waittime)%256
100 |         if cmd==ord('q'):
101 |             break
102 |         elif cmd==ord('Q'):
103 |             sys.exit(0)
104 |         if cmd==ord('n'):
105 |             zoom*=1.1
106 |             changed=True
107 |         elif cmd==ord('m'):
108 |             zoom/=1.1
109 |             changed=True
110 |         elif cmd==ord('r'):
111 |             zoom=1.0
112 |             changed=True
113 |         elif cmd==ord('s'):
114 |             cv2.imwrite('show3d.png',show)
115 |         if waittime!=0:
116 |             break
117 |     return cmd
118 | if __name__=='__main__':
119 |     np.random.seed(100)
120 |     showpoints(np.random.randn(1024,3))
121 |
--------------------------------------------------------------------------------
/depthestimate/tf_nndistance.cpp:
--------------------------------------------------------------------------------
1 | #include "tensorflow/core/framework/op.h"
2 | #include "tensorflow/core/framework/op_kernel.h"
3 | REGISTER_OP("NnDistance")
4 |     .Input("xyz1: float32")
5 |     .Input("xyz2: float32")
6 |     .Output("dist1: float32")
7 |     .Output("idx1: int32")
8 |     .Output("dist2: float32")
9 |     .Output("idx2: int32");
10 | REGISTER_OP("NnDistanceGrad")
11 |     .Input("xyz1: float32")
12 |     .Input("xyz2: float32")
13 |     .Input("grad_dist1: float32")
14 |     .Input("idx1: int32")
15 |     .Input("grad_dist2: float32")
16 |     .Input("idx2: int32")
17 |     .Output("grad_xyz1: float32")
18 |     .Output("grad_xyz2: float32");
19 | using namespace tensorflow;
20 |
21 | static void nnsearch(int b,int n,int m,const float * xyz1,const float * xyz2,float * dist,int * idx){
22 |     for (int i=0;i<b;i++){
23 |         for (int j=0;j<n;j++){
24 |             float x1=xyz1[(i*n+j)*3+0];
25 |             float y1=xyz1[(i*n+j)*3+1];
26 |             float z1=xyz1[(i*n+j)*3+2];
27 |             double best=0;
28 |             int besti=0;
29 |             for (int k=0;k<m;k++){
30 |                 float x2=xyz2[(i*m+k)*3+0]-x1;
31 |                 float y2=xyz2[(i*m+k)*3+1]-y1;
32 |                 float z2=xyz2[(i*m+k)*3+2]-z1;
33 |                 double d=x2*x2+y2*y2+z2*z2;
34 |                 if (k==0 || d<best){
35 |                     best=d;
36 |                     besti=k;
37 |                 }
38 |             }
39 |             dist[i*n+j]=best;
40 |             idx[i*n+j]=besti;
41 |         }
42 |     }
43 | }
44 |
45 | class NnDistanceOp : public OpKernel{
46 |     public:
47 |         explicit NnDistanceOp(OpKernelConstruction* context):OpKernel(context){}
48 |         void Compute(OpKernelContext * context)override{
49 |             const Tensor& xyz1_tensor=context->input(0);
50 |             const Tensor& xyz2_tensor=context->input(1);
51 |             OP_REQUIRES(context,xyz1_tensor.dims()==3,errors::InvalidArgument("NnDistance requires xyz1 be of shape (batch,#points,3)"));
52 |             OP_REQUIRES(context,xyz1_tensor.shape().dim_size(2)==3,errors::InvalidArgument("NnDistance only accepts 3d point set xyz1"));
53 |             int b=xyz1_tensor.shape().dim_size(0);
54 |             int n=xyz1_tensor.shape().dim_size(1);
55 |             OP_REQUIRES(context,xyz2_tensor.dims()==3,errors::InvalidArgument("NnDistance requires xyz2 be of shape (batch,#points,3)"));
56 |             OP_REQUIRES(context,xyz2_tensor.shape().dim_size(2)==3,errors::InvalidArgument("NnDistance only accepts 3d point set xyz2"));
57 |             int m=xyz2_tensor.shape().dim_size(1);
58 |             OP_REQUIRES(context,xyz2_tensor.shape().dim_size(0)==b,errors::InvalidArgument("NnDistance expects xyz1 and xyz2 have same batch size"));
59 |             auto xyz1_flat=xyz1_tensor.flat<float>();
60 |             const float * xyz1=&xyz1_flat(0);
61 |             auto xyz2_flat=xyz2_tensor.flat<float>();
62 |             const float * xyz2=&xyz2_flat(0);
63 |             Tensor * dist1_tensor=NULL;
64 |             Tensor * idx1_tensor=NULL;
65 |             Tensor * dist2_tensor=NULL;
66 |             Tensor * idx2_tensor=NULL;
67 |             OP_REQUIRES_OK(context,context->allocate_output(0,TensorShape{b,n},&dist1_tensor));
OP_REQUIRES_OK(context,context->allocate_output(1,TensorShape{b,n},&idx1_tensor)); 69 | auto dist1_flat=dist1_tensor->flat<float>(); 70 | auto idx1_flat=idx1_tensor->flat<int>(); 71 | OP_REQUIRES_OK(context,context->allocate_output(2,TensorShape{b,m},&dist2_tensor)); 72 | OP_REQUIRES_OK(context,context->allocate_output(3,TensorShape{b,m},&idx2_tensor)); 73 | auto dist2_flat=dist2_tensor->flat<float>(); 74 | auto idx2_flat=idx2_tensor->flat<int>(); 75 | float * dist1=&(dist1_flat(0)); 76 | int * idx1=&(idx1_flat(0)); 77 | float * dist2=&(dist2_flat(0)); 78 | int * idx2=&(idx2_flat(0)); 79 | nnsearch(b,n,m,xyz1,xyz2,dist1,idx1); 80 | nnsearch(b,m,n,xyz2,xyz1,dist2,idx2); 81 | } 82 | }; 83 | REGISTER_KERNEL_BUILDER(Name("NnDistance").Device(DEVICE_CPU), NnDistanceOp); 84 | class NnDistanceGradOp : public OpKernel{ 85 | public: 86 | explicit NnDistanceGradOp(OpKernelConstruction* context):OpKernel(context){} 87 | void Compute(OpKernelContext * context)override{ 88 | const Tensor& xyz1_tensor=context->input(0); 89 | const Tensor& xyz2_tensor=context->input(1); 90 | const Tensor& grad_dist1_tensor=context->input(2); 91 | const Tensor& idx1_tensor=context->input(3); 92 | const Tensor& grad_dist2_tensor=context->input(4); 93 | const Tensor& idx2_tensor=context->input(5); 94 | OP_REQUIRES(context,xyz1_tensor.dims()==3,errors::InvalidArgument("NnDistanceGrad requires xyz1 be of shape (batch,#points,3)")); 95 | OP_REQUIRES(context,xyz1_tensor.shape().dim_size(2)==3,errors::InvalidArgument("NnDistanceGrad only accepts 3d point set xyz1")); 96 | int b=xyz1_tensor.shape().dim_size(0); 97 | int n=xyz1_tensor.shape().dim_size(1); 98 | OP_REQUIRES(context,xyz2_tensor.dims()==3,errors::InvalidArgument("NnDistanceGrad requires xyz2 be of shape (batch,#points,3)")); 99 | OP_REQUIRES(context,xyz2_tensor.shape().dim_size(2)==3,errors::InvalidArgument("NnDistanceGrad only accepts 3d point set xyz2")); 100 | int m=xyz2_tensor.shape().dim_size(1); 101 | OP_REQUIRES(context,xyz2_tensor.shape().dim_size(0)==b,errors::InvalidArgument("NnDistanceGrad expects xyz1 and xyz2 have same batch size")); 102 | OP_REQUIRES(context,grad_dist1_tensor.shape()==(TensorShape{b,n}),errors::InvalidArgument("NnDistanceGrad requires grad_dist1 be of shape(batch,#points)")); 103 | OP_REQUIRES(context,idx1_tensor.shape()==(TensorShape{b,n}),errors::InvalidArgument("NnDistanceGrad requires idx1 be of shape(batch,#points)")); 104 | OP_REQUIRES(context,grad_dist2_tensor.shape()==(TensorShape{b,m}),errors::InvalidArgument("NnDistanceGrad requires grad_dist2 be of shape(batch,#points)")); 105 | OP_REQUIRES(context,idx2_tensor.shape()==(TensorShape{b,m}),errors::InvalidArgument("NnDistanceGrad requires idx2 be of shape(batch,#points)")); 106 | auto xyz1_flat=xyz1_tensor.flat<float>(); 107 | const float * xyz1=&xyz1_flat(0); 108 | auto xyz2_flat=xyz2_tensor.flat<float>(); 109 | const float * xyz2=&xyz2_flat(0); 110 | auto idx1_flat=idx1_tensor.flat<int>(); 111 | const int * idx1=&idx1_flat(0); 112 | auto idx2_flat=idx2_tensor.flat<int>(); 113 | const int * idx2=&idx2_flat(0); 114 | auto grad_dist1_flat=grad_dist1_tensor.flat<float>(); 115 | const float * grad_dist1=&grad_dist1_flat(0); 116 | auto grad_dist2_flat=grad_dist2_tensor.flat<float>(); 117 | const float * grad_dist2=&grad_dist2_flat(0); 118 | Tensor * grad_xyz1_tensor=NULL; 119 | OP_REQUIRES_OK(context,context->allocate_output(0,TensorShape{b,n,3},&grad_xyz1_tensor)); 120 | Tensor * grad_xyz2_tensor=NULL; 121 | OP_REQUIRES_OK(context,context->allocate_output(1,TensorShape{b,m,3},&grad_xyz2_tensor)); 122 | auto
grad_xyz1_flat=grad_xyz1_tensor->flat<float>(); 123 | float * grad_xyz1=&grad_xyz1_flat(0); 124 | auto grad_xyz2_flat=grad_xyz2_tensor->flat<float>(); 125 | float * grad_xyz2=&grad_xyz2_flat(0); 126 | for (int i=0;i<b;i++){ 127 | for (int j=0;j<n*3;j++) 128 | grad_xyz1[i*n*3+j]=0; 129 | for (int j=0;j<m*3;j++) 130 | grad_xyz2[i*m*3+j]=0; 131 | for (int j=0;j<n;j++){ 132 | float x1=xyz1[(i*n+j)*3+0]; 133 | float y1=xyz1[(i*n+j)*3+1]; 134 | float z1=xyz1[(i*n+j)*3+2]; 135 | int j2=idx1[i*n+j]; 136 | float x2=xyz2[(i*m+j2)*3+0]; 137 | float y2=xyz2[(i*m+j2)*3+1]; 138 | float z2=xyz2[(i*m+j2)*3+2]; 139 | float g=grad_dist1[i*n+j]*2; 140 | grad_xyz1[(i*n+j)*3+0]+=g*(x1-x2); 141 | grad_xyz1[(i*n+j)*3+1]+=g*(y1-y2); 142 | grad_xyz1[(i*n+j)*3+2]+=g*(z1-z2); 143 | grad_xyz2[(i*m+j2)*3+0]-=(g*(x1-x2)); 144 | grad_xyz2[(i*m+j2)*3+1]-=(g*(y1-y2)); 145 | grad_xyz2[(i*m+j2)*3+2]-=(g*(z1-z2)); 146 | } 147 | for (int j=0;j<m;j++){ 148 | float x1=xyz2[(i*m+j)*3+0]; 149 | float y1=xyz2[(i*m+j)*3+1]; 150 | float z1=xyz2[(i*m+j)*3+2]; 151 | int j2=idx2[i*m+j]; 152 | float x2=xyz1[(i*n+j2)*3+0]; 153 | float y2=xyz1[(i*n+j2)*3+1]; 154 | float z2=xyz1[(i*n+j2)*3+2]; 155 | float g=grad_dist2[i*m+j]*2; 156 | grad_xyz2[(i*m+j)*3+0]+=g*(x1-x2); 157 | grad_xyz2[(i*m+j)*3+1]+=g*(y1-y2); 158 | grad_xyz2[(i*m+j)*3+2]+=g*(z1-z2); 159 | grad_xyz1[(i*n+j2)*3+0]-=(g*(x1-x2)); 160 | grad_xyz1[(i*n+j2)*3+1]-=(g*(y1-y2)); 161 | grad_xyz1[(i*n+j2)*3+2]-=(g*(z1-z2)); 162 | } 163 | } 164 | } 165 | }; 166 | REGISTER_KERNEL_BUILDER(Name("NnDistanceGrad").Device(DEVICE_CPU), NnDistanceGradOp); 167 | 168 | void NmDistanceKernelLauncher(int b,int n,const float * xyz1,int m,const float * xyz2,float * dist1,int * idx1,float * dist2,int * idx2); 169 | class NnDistanceGpuOp : public OpKernel{ 170 | public: 171 | explicit NnDistanceGpuOp(OpKernelConstruction* context):OpKernel(context){} 172 | void Compute(OpKernelContext * context)override{ 173 | const Tensor& xyz1_tensor=context->input(0); 174 | const Tensor& xyz2_tensor=context->input(1); 175 | OP_REQUIRES(context,xyz1_tensor.dims()==3,errors::InvalidArgument("NnDistance requires xyz1 be of shape (batch,#points,3)")); 176 | OP_REQUIRES(context,xyz1_tensor.shape().dim_size(2)==3,errors::InvalidArgument("NnDistance only accepts 3d point set xyz1")); 177 | int b=xyz1_tensor.shape().dim_size(0); 178 | int n=xyz1_tensor.shape().dim_size(1); 179 | OP_REQUIRES(context,xyz2_tensor.dims()==3,errors::InvalidArgument("NnDistance requires xyz2 be of shape (batch,#points,3)")); 180 | OP_REQUIRES(context,xyz2_tensor.shape().dim_size(2)==3,errors::InvalidArgument("NnDistance only accepts 3d point set xyz2")); 181 | int m=xyz2_tensor.shape().dim_size(1); 182 | OP_REQUIRES(context,xyz2_tensor.shape().dim_size(0)==b,errors::InvalidArgument("NnDistance expects xyz1 and xyz2 have same batch size")); 183 | auto xyz1_flat=xyz1_tensor.flat<float>(); 184 | const float * xyz1=&xyz1_flat(0); 185 | auto xyz2_flat=xyz2_tensor.flat<float>(); 186 | const float * xyz2=&xyz2_flat(0); 187 | Tensor * dist1_tensor=NULL; 188 | Tensor * idx1_tensor=NULL; 189 | Tensor * dist2_tensor=NULL; 190 | Tensor * idx2_tensor=NULL; 191 | OP_REQUIRES_OK(context,context->allocate_output(0,TensorShape{b,n},&dist1_tensor)); 192 | OP_REQUIRES_OK(context,context->allocate_output(1,TensorShape{b,n},&idx1_tensor)); 193 | auto dist1_flat=dist1_tensor->flat<float>(); 194 | auto idx1_flat=idx1_tensor->flat<int>(); 195 | OP_REQUIRES_OK(context,context->allocate_output(2,TensorShape{b,m},&dist2_tensor)); 196 | OP_REQUIRES_OK(context,context->allocate_output(3,TensorShape{b,m},&idx2_tensor)); 197 | auto dist2_flat=dist2_tensor->flat<float>(); 198 | auto idx2_flat=idx2_tensor->flat<int>(); 199 | float * dist1=&(dist1_flat(0)); 200 | int * idx1=&(idx1_flat(0)); 201 | float * dist2=&(dist2_flat(0)); 202 | int * idx2=&(idx2_flat(0)); 203 | NmDistanceKernelLauncher(b,n,xyz1,m,xyz2,dist1,idx1,dist2,idx2); 204 | } 205 | }; 206 | REGISTER_KERNEL_BUILDER(Name("NnDistance").Device(DEVICE_GPU), NnDistanceGpuOp); 207 | 208 | void NmDistanceGradKernelLauncher(int b,int n,const float * xyz1,int m,const float * xyz2,const float * grad_dist1,const int * idx1,const float * grad_dist2,const int * idx2,float * grad_xyz1,float * grad_xyz2); 209 | class NnDistanceGradGpuOp : public OpKernel{ 210 | public: 211 | explicit NnDistanceGradGpuOp(OpKernelConstruction* context):OpKernel(context){} 212 | void Compute(OpKernelContext * context)override{ 213 | const Tensor& xyz1_tensor=context->input(0); 214 | const Tensor& xyz2_tensor=context->input(1); 215 | const Tensor& grad_dist1_tensor=context->input(2); 216 | const Tensor& idx1_tensor=context->input(3); 217 | const Tensor& grad_dist2_tensor=context->input(4); 218 | const Tensor& idx2_tensor=context->input(5); 219 | OP_REQUIRES(context,xyz1_tensor.dims()==3,errors::InvalidArgument("NnDistanceGrad requires xyz1 be of shape (batch,#points,3)")); 220 | OP_REQUIRES(context,xyz1_tensor.shape().dim_size(2)==3,errors::InvalidArgument("NnDistanceGrad only accepts 3d point set xyz1")); 221 | int b=xyz1_tensor.shape().dim_size(0); 222 | int n=xyz1_tensor.shape().dim_size(1); 223 | OP_REQUIRES(context,xyz2_tensor.dims()==3,errors::InvalidArgument("NnDistanceGrad requires xyz2 be of shape (batch,#points,3)")); 224 | 
OP_REQUIRES(context,xyz2_tensor.shape().dim_size(2)==3,errors::InvalidArgument("NnDistanceGrad only accepts 3d point set xyz2")); 225 | int m=xyz2_tensor.shape().dim_size(1); 226 | OP_REQUIRES(context,xyz2_tensor.shape().dim_size(0)==b,errors::InvalidArgument("NnDistanceGrad expects xyz1 and xyz2 have same batch size")); 227 | OP_REQUIRES(context,grad_dist1_tensor.shape()==(TensorShape{b,n}),errors::InvalidArgument("NnDistanceGrad requires grad_dist1 be of shape(batch,#points)")); 228 | OP_REQUIRES(context,idx1_tensor.shape()==(TensorShape{b,n}),errors::InvalidArgument("NnDistanceGrad requires idx1 be of shape(batch,#points)")); 229 | OP_REQUIRES(context,grad_dist2_tensor.shape()==(TensorShape{b,m}),errors::InvalidArgument("NnDistanceGrad requires grad_dist2 be of shape(batch,#points)")); 230 | OP_REQUIRES(context,idx2_tensor.shape()==(TensorShape{b,m}),errors::InvalidArgument("NnDistanceGrad requires idx2 be of shape(batch,#points)")); 231 | auto xyz1_flat=xyz1_tensor.flat<float>(); 232 | const float * xyz1=&xyz1_flat(0); 233 | auto xyz2_flat=xyz2_tensor.flat<float>(); 234 | const float * xyz2=&xyz2_flat(0); 235 | auto idx1_flat=idx1_tensor.flat<int>(); 236 | const int * idx1=&idx1_flat(0); 237 | auto idx2_flat=idx2_tensor.flat<int>(); 238 | const int * idx2=&idx2_flat(0); 239 | auto grad_dist1_flat=grad_dist1_tensor.flat<float>(); 240 | const float * grad_dist1=&grad_dist1_flat(0); 241 | auto grad_dist2_flat=grad_dist2_tensor.flat<float>(); 242 | const float * grad_dist2=&grad_dist2_flat(0); 243 | Tensor * grad_xyz1_tensor=NULL; 244 | OP_REQUIRES_OK(context,context->allocate_output(0,TensorShape{b,n,3},&grad_xyz1_tensor)); 245 | Tensor * grad_xyz2_tensor=NULL; 246 | OP_REQUIRES_OK(context,context->allocate_output(1,TensorShape{b,m,3},&grad_xyz2_tensor)); 247 | auto grad_xyz1_flat=grad_xyz1_tensor->flat<float>(); 248 | float * grad_xyz1=&grad_xyz1_flat(0); 249 | auto grad_xyz2_flat=grad_xyz2_tensor->flat<float>(); 250 | float * grad_xyz2=&grad_xyz2_flat(0); 251 | NmDistanceGradKernelLauncher(b,n,xyz1,m,xyz2,grad_dist1,idx1,grad_dist2,idx2,grad_xyz1,grad_xyz2); 252 | } 253 | }; 254 | REGISTER_KERNEL_BUILDER(Name("NnDistanceGrad").Device(DEVICE_GPU), NnDistanceGradGpuOp); 255 | -------------------------------------------------------------------------------- /depthestimate/tf_nndistance.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | from tensorflow.python.framework import ops 3 | nn_distance_module=tf.load_op_library('./tf_nndistance_so.so') 4 | 5 | def nn_distance(xyz1,xyz2): 6 | ''' 7 | Computes the squared distance to the nearest neighbor for each point in a pair of point clouds 8 | input: xyz1: (batch_size,#points_1,3) the first point cloud 9 | input: xyz2: (batch_size,#points_2,3) the second point cloud 10 | output: dist1: (batch_size,#points_1) squared distance from first to second 11 | output: idx1: (batch_size,#points_1) index of the nearest neighbor in the second cloud 12 | output: dist2: (batch_size,#points_2) squared distance from second to first 13 | output: idx2: (batch_size,#points_2) index of the nearest neighbor in the first cloud 14 | ''' 15 | return nn_distance_module.nn_distance(xyz1,xyz2) 16 | #@tf.RegisterShape('NnDistance') 17 | #def _nn_distance_shape(op): 18 | #shape1=op.inputs[0].get_shape().with_rank(3) 19 | #shape2=op.inputs[1].get_shape().with_rank(3) 20 | #return [tf.TensorShape([shape1.dims[0],shape1.dims[1]]),tf.TensorShape([shape1.dims[0],shape1.dims[1]]), 21 | #tf.TensorShape([shape2.dims[0],shape2.dims[1]]),tf.TensorShape([shape2.dims[0],shape2.dims[1]])] 22 | @ops.RegisterGradient('NnDistance') 23 | def 
_nn_distance_grad(op,grad_dist1,grad_idx1,grad_dist2,grad_idx2): 24 | xyz1=op.inputs[0] 25 | xyz2=op.inputs[1] 26 | idx1=op.outputs[1] 27 | idx2=op.outputs[3] 28 | return nn_distance_module.nn_distance_grad(xyz1,xyz2,grad_dist1,idx1,grad_dist2,idx2) 29 | 30 | 31 | if __name__=='__main__': 32 | import numpy as np 33 | import random 34 | import time 35 | from tensorflow.python.ops.gradient_checker import compute_gradient 36 | random.seed(100) 37 | np.random.seed(100) 38 | with tf.Session('') as sess: 39 | xyz1=np.random.randn(32,16384,3).astype('float32') 40 | xyz2=np.random.randn(32,1024,3).astype('float32') 41 | #with tf.device('/gpu:0'): 42 | if True: 43 | inp1=tf.Variable(xyz1) 44 | inp2=tf.constant(xyz2) 45 | reta,retb,retc,retd=nn_distance(inp1,inp2) 46 | loss=tf.reduce_sum(reta)+tf.reduce_sum(retc) 47 | train=tf.train.GradientDescentOptimizer(learning_rate=0.05).minimize(loss) 48 | sess.run(tf.initialize_all_variables()) 49 | t0=time.time() 50 | t1=t0 51 | best=1e100 52 | for i in xrange(100): 53 | trainloss,_=sess.run([loss,train]) 54 | newt=time.time() 55 | best=min(best,newt-t1) 56 | print i,trainloss,(newt-t0)/(i+1),best 57 | t1=newt 58 | #print sess.run([inp1,retb,inp2,retd]) 59 | #grads=compute_gradient([inp1,inp2],[(16,32,3),(16,32,3)],loss,(1,),[xyz1,xyz2]) 60 | #for i,j in grads: 61 | #print i.shape,j.shape,np.mean(np.abs(i-j)),np.mean(np.abs(i)),np.mean(np.abs(j)) 62 | #for i in xrange(10): 63 | #t0=time.time() 64 | #a,b,c,d=sess.run([reta,retb,retc,retd],feed_dict={inp1:xyz1,inp2:xyz2}) 65 | #print 'time',time.time()-t0 66 | #print a.shape,b.shape,c.shape,d.shape 67 | #print a.dtype,b.dtype,c.dtype,d.dtype 68 | #samples=np.array(random.sample(range(xyz2.shape[1]),100),dtype='int32') 69 | #dist1=((xyz1[:,samples,None,:]-xyz2[:,None,:,:])**2).sum(axis=-1).min(axis=-1) 70 | #idx1=((xyz1[:,samples,None,:]-xyz2[:,None,:,:])**2).sum(axis=-1).argmin(axis=-1) 71 | #print np.abs(dist1-a[:,samples]).max() 72 | #print np.abs(idx1-b[:,samples]).max() 73 | #dist2=((xyz2[:,samples,None,:]-xyz1[:,None,:,:])**2).sum(axis=-1).min(axis=-1) 74 | #idx2=((xyz2[:,samples,None,:]-xyz1[:,None,:,:])**2).sum(axis=-1).argmin(axis=-1) 75 | #print np.abs(dist2-c[:,samples]).max() 76 | #print np.abs(idx2-d[:,samples]).max() 77 | 78 | -------------------------------------------------------------------------------- /depthestimate/tf_nndistance_g.cu: -------------------------------------------------------------------------------- 1 | #if GOOGLE_CUDA 2 | #define EIGEN_USE_GPU 3 | #include "third_party/eigen3/unsupported/Eigen/CXX11/Tensor" 4 | 5 | __global__ void NmDistanceKernel(int b,int n,const float * xyz,int m,const float * xyz2,float * result,int * result_i){ 6 | const int batch=512; 7 | __shared__ float buf[batch*3]; 8 | for (int i=blockIdx.x;ibest){ 120 | result[(i*n+j)]=best; 121 | result_i[(i*n+j)]=best_i; 122 | } 123 | } 124 | __syncthreads(); 125 | } 126 | } 127 | } 128 | void NmDistanceKernelLauncher(int b,int n,const float * xyz,int m,const float * xyz2,float * result,int * result_i,float * result2,int * result2_i){ 129 | NmDistanceKernel<<>>(b,n,xyz,m,xyz2,result,result_i); 130 | NmDistanceKernel<<>>(b,m,xyz2,n,xyz,result2,result2_i); 131 | } 132 | __global__ void NmDistanceGradKernel(int b,int n,const float * xyz1,int m,const float * xyz2,const float * grad_dist1,const int * idx1,float * grad_xyz1,float * grad_xyz2){ 133 | for (int i=blockIdx.x;i>>(b,n,xyz1,m,xyz2,grad_dist1,idx1,grad_xyz1,grad_xyz2); 156 | NmDistanceGradKernel<<>>(b,m,xyz2,n,xyz1,grad_dist2,idx2,grad_xyz2,grad_xyz1); 157 | 
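// the gradient kernel runs once per matching direction with the clouds swapped, so grad_xyz1 and grad_xyz2 both accumulate contributions from dist1 and dist2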
} 158 | 159 | #endif 160 | -------------------------------------------------------------------------------- /depthestimate/train_nn.py: -------------------------------------------------------------------------------- 1 | import sys 2 | import tensorflow as tf 3 | import numpy as np 4 | import cv2 5 | import tflearn 6 | import random 7 | import math 8 | import os 9 | os.system("chmod +w /unsullied/sharefs/wangmengdi/wangmengdi") 10 | import time 11 | import zlib 12 | import socket 13 | import threading 14 | import Queue 15 | import sys 16 | import tf_nndistance 17 | import cPickle as pickle 18 | 19 | from BatchFetcher import * 20 | 21 | 22 | lastbatch=None 23 | lastconsumed=FETCH_BATCH_SIZE 24 | 25 | def fetch_batch(): 26 | global lastbatch,lastconsumed 27 | if lastbatch is None or lastconsumed+BATCH_SIZE>FETCH_BATCH_SIZE: 28 | lastbatch=fetchworker.fetch() 29 | lastconsumed=0 30 | ret=[i[lastconsumed:lastconsumed+BATCH_SIZE] for i in lastbatch] 31 | lastconsumed+=BATCH_SIZE 32 | return ret 33 | def stop_fetcher(): 34 | fetchworker.shutdown() 35 | 36 | def build_graph(resourceid): 37 | with tf.device('/gpu:%d'%resourceid): 38 | tflearn.init_graph(seed=1029,num_cores=2,gpu_memory_fraction=0.9,soft_placement=True) 39 | img_inp=tf.placeholder(tf.float32,shape=(BATCH_SIZE,HEIGHT,WIDTH,4),name='img_inp') 40 | pt_gt=tf.placeholder(tf.float32,shape=(BATCH_SIZE,POINTCLOUDSIZE,3),name='pt_gt') 41 | 42 | x=img_inp 43 | #192 256 44 | x=tflearn.layers.conv.conv_2d(x,16,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 45 | x=tflearn.layers.conv.conv_2d(x,16,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 46 | x0=x 47 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 48 | #96 128 49 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 50 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 51 | x1=x 52 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 53 | #48 64 54 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 55 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 56 | x2=x 57 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 58 | #24 32 59 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 60 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 61 | x3=x 62 | x=tflearn.layers.conv.conv_2d(x,256,(5,5),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 63 | #12 16 64 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 65 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 66 | x4=x 67 | x=tflearn.layers.conv.conv_2d(x,512,(5,5),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 68 | #6 8 69 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 70 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 71 | x=tflearn.layers.conv.conv_2d(x,512,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 72 | x5=x 73 | 
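        # first encoder bottom: the next strided conv reaches the 3x4 bottleneck, which also feeds the fully connected point branch (x_additional)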
x=tflearn.layers.conv.conv_2d(x,512,(5,5),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 74 | #3 4 75 | x_additional=tflearn.layers.core.fully_connected(x,2048,activation='relu',weight_decay=1e-3,regularizer='L2') 76 | x=tflearn.layers.conv.conv_2d_transpose(x,256,[5,5],[6,8],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 77 | #6 8 78 | x5=tflearn.layers.conv.conv_2d(x5,256,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 79 | x=tf.nn.relu(tf.add(x,x5)) 80 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 81 | x5=x 82 | x=tflearn.layers.conv.conv_2d_transpose(x,128,[5,5],[12,16],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 83 | #12 16 84 | x4=tflearn.layers.conv.conv_2d(x4,128,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 85 | x=tf.nn.relu(tf.add(x,x4)) 86 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 87 | x4=x 88 | x=tflearn.layers.conv.conv_2d_transpose(x,64,[5,5],[24,32],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 89 | #24 32 90 | x3=tflearn.layers.conv.conv_2d(x3,64,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 91 | x=tf.nn.relu(tf.add(x,x3)) 92 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 93 | x3=x 94 | x=tflearn.layers.conv.conv_2d_transpose(x,32,[5,5],[48,64],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 95 | #48 64 96 | x2=tflearn.layers.conv.conv_2d(x2,32,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 97 | x=tf.nn.relu(tf.add(x,x2)) 98 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 99 | x2=x 100 | x=tflearn.layers.conv.conv_2d_transpose(x,16,[5,5],[96,128],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 101 | #96 128 102 | x1=tflearn.layers.conv.conv_2d(x1,16,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 103 | x=tf.nn.relu(tf.add(x,x1)) 104 | x=tflearn.layers.conv.conv_2d(x,16,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 105 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 106 | #48 64 107 | x2=tflearn.layers.conv.conv_2d(x2,32,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 108 | x=tf.nn.relu(tf.add(x,x2)) 109 | x=tflearn.layers.conv.conv_2d(x,32,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 110 | x2=x 111 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 112 | #24 32 113 | x3=tflearn.layers.conv.conv_2d(x3,64,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 114 | x=tf.nn.relu(tf.add(x,x3)) 115 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 116 | x3=x 117 | x=tflearn.layers.conv.conv_2d(x,128,(5,5),strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 118 | #12 16 119 | x4=tflearn.layers.conv.conv_2d(x4,128,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 120 | x=tf.nn.relu(tf.add(x,x4)) 121 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 122 | x4=x 123 | 
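        # second encoder pass: downsample again toward the shared bottleneck before the deconv and fully connected branches emit the final points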
x=tflearn.layers.conv.conv_2d(x,256,(5,5),strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 124 | #6 8 125 | x5=tflearn.layers.conv.conv_2d(x5,256,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 126 | x=tf.nn.relu(tf.add(x,x5)) 127 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 128 | x5=x 129 | x=tflearn.layers.conv.conv_2d(x,512,(5,5),strides=2,activation='relu',weight_decay=1e-5,regularizer='L2') 130 | #3 4 131 | x_additional=tflearn.layers.core.fully_connected(x_additional,2048,activation='linear',weight_decay=1e-4,regularizer='L2') 132 | x_additional=tf.nn.relu(tf.add(x_additional,tflearn.layers.core.fully_connected(x,2048,activation='linear',weight_decay=1e-3,regularizer='L2'))) 133 | x=tflearn.layers.conv.conv_2d_transpose(x,256,[5,5],[6,8],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 134 | #6 8 135 | x5=tflearn.layers.conv.conv_2d(x5,256,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 136 | x=tf.nn.relu(tf.add(x,x5)) 137 | x=tflearn.layers.conv.conv_2d(x,256,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 138 | x5=x 139 | x=tflearn.layers.conv.conv_2d_transpose(x,128,[5,5],[12,16],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 140 | #12 16 141 | x4=tflearn.layers.conv.conv_2d(x4,128,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 142 | x=tf.nn.relu(tf.add(x,x4)) 143 | x=tflearn.layers.conv.conv_2d(x,128,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 144 | x4=x 145 | x=tflearn.layers.conv.conv_2d_transpose(x,64,[5,5],[24,32],strides=2,activation='linear',weight_decay=1e-5,regularizer='L2') 146 | #24 32 147 | x3=tflearn.layers.conv.conv_2d(x3,64,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 148 | x=tf.nn.relu(tf.add(x,x3)) 149 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 150 | x=tflearn.layers.conv.conv_2d(x,64,(3,3),strides=1,activation='relu',weight_decay=1e-5,regularizer='L2') 151 | 152 | x_additional=tflearn.layers.core.fully_connected(x_additional,1024,activation='relu',weight_decay=1e-3,regularizer='L2') 153 | x_additional=tflearn.layers.core.fully_connected(x_additional,256*3,activation='linear',weight_decay=1e-3,regularizer='L2') 154 | x_additional=tf.reshape(x_additional,(BATCH_SIZE,256,3)) 155 | x=tflearn.layers.conv.conv_2d(x,3,(3,3),strides=1,activation='linear',weight_decay=1e-5,regularizer='L2') 156 | x=tf.reshape(x,(BATCH_SIZE,32*24,3)) 157 | x=tf.concat([x_additional,x],1) 158 | x=tf.reshape(x,(BATCH_SIZE,OUTPUTPOINTS,3)) 159 | 160 | dists_forward,_,dists_backward,_=tf_nndistance.nn_distance(pt_gt,x) 161 | mindist=dists_forward 162 | dist0=mindist[0,:] 163 | dists_forward=tf.reduce_mean(dists_forward) 164 | dists_backward=tf.reduce_mean(dists_backward) 165 | loss_nodecay=(dists_forward+dists_backward/2.0)*10000 166 | loss=loss_nodecay+tf.add_n(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES))*0.1 167 | batchno = tf.Variable(0, dtype=tf.int32) 168 | optimizer = tf.train.AdamOptimizer(3e-5*BATCH_SIZE/FETCH_BATCH_SIZE).minimize(loss,global_step=batchno) 169 | batchnoinc=batchno.assign(batchno+1) 170 | return img_inp,x,pt_gt,loss,optimizer,batchno,batchnoinc,mindist,loss_nodecay,dists_forward,dists_backward,dist0 171 | 172 | def main(resourceid,keyname): 173 | if not os.path.exists(dumpdir): 174 | os.system("mkdir -p %s"%dumpdir) 175 | 
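    # build the graph once, then alternate between training steps and validation passes according to the flag delivered with each fetched batch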
img_inp,x,pt_gt,loss,optimizer,batchno,batchnoinc,mindist,loss_nodecay,dists_forward,dists_backward,dist0=build_graph(resourceid) 176 | config=tf.ConfigProto() 177 | config.gpu_options.allow_growth=True 178 | config.allow_soft_placement=True 179 | saver=tf.train.Saver() 180 | with tf.Session(config=config) as sess,\ 181 | open('%s/%s.log'%(dumpdir,keyname),'a') as fout: 182 | sess.run(tf.global_variables_initializer()) 183 | trainloss_accs=[0,0,0] 184 | trainloss_acc0=1e-9 185 | validloss_accs=[0,0,0] 186 | validloss_acc0=1e-9 187 | lastsave=time.time() 188 | bno=sess.run(batchno) 189 | fetchworker.bno=bno//(FETCH_BATCH_SIZE/BATCH_SIZE) 190 | fetchworker.start() 191 | while bno<300000: 192 | t0=time.time() 193 | data,ptcloud,validating=fetch_batch() 194 | t1=time.time() 195 | validating=validating[0]!=0 196 | if not validating: 197 | _,pred,total_loss,trainloss,trainloss1,trainloss2,distmap_0=sess.run([optimizer,x,loss,loss_nodecay,dists_forward,dists_backward,dist0], 198 | feed_dict={img_inp:data,pt_gt:ptcloud}) 199 | trainloss_accs[0]=trainloss_accs[0]*0.99+trainloss 200 | trainloss_accs[1]=trainloss_accs[1]*0.99+trainloss1 201 | trainloss_accs[2]=trainloss_accs[2]*0.99+trainloss2 202 | trainloss_acc0=trainloss_acc0*0.99+1 203 | else: 204 | _,pred,total_loss,validloss,validloss1,validloss2,distmap_0=sess.run([batchnoinc,x,loss,loss_nodecay,dists_forward,dists_backward,dist0], 205 | feed_dict={img_inp:data,pt_gt:ptcloud}) 206 | validloss_accs[0]=validloss_accs[0]*0.997+validloss 207 | validloss_accs[1]=validloss_accs[1]*0.997+validloss1 208 | validloss_accs[2]=validloss_accs[2]*0.997+validloss2 209 | validloss_acc0=validloss_acc0*0.997+1 210 | t2=time.time() 211 | down=2 212 | 213 | bno=sess.run(batchno) 214 | if not validating: 215 | showloss=trainloss 216 | showloss1=trainloss1 217 | showloss2=trainloss2 218 | else: 219 | showloss=validloss 220 | showloss1=validloss1 221 | showloss2=validloss2 222 | print >>fout,bno,trainloss_accs[0]/trainloss_acc0,trainloss_accs[1]/trainloss_acc0,trainloss_accs[2]/trainloss_acc0,showloss,showloss1,showloss2,validloss_accs[0]/validloss_acc0,validloss_accs[1]/validloss_acc0,validloss_accs[2]/validloss_acc0,total_loss-showloss 223 | if bno%128==0: 224 | fout.flush() 225 | if time.time()-lastsave>900: 226 | saver.save(sess,'%s/'%dumpdir+keyname+".ckpt") 227 | lastsave=time.time() 228 | print bno,'t',trainloss_accs[0]/trainloss_acc0,trainloss_accs[1]/trainloss_acc0,trainloss_accs[2]/trainloss_acc0,'v',validloss_accs[0]/validloss_acc0,validloss_accs[1]/validloss_acc0,validloss_accs[2]/validloss_acc0,total_loss-showloss,t1-t0,t2-t1,time.time()-t0,fetchworker.queue.qsize() 229 | saver.save(sess,'%s/'%dumpdir+keyname+".ckpt") 230 | 231 | def dumppredictions(resourceid,keyname,valnum): 232 | img_inp,x,pt_gt,loss,optimizer,batchno,batchnoinc,mindist,loss_nodecay,dists_forward,dists_backward,dist0=build_graph(resourceid) 233 | config=tf.ConfigProto() 234 | config.gpu_options.allow_growth=True 235 | config.allow_soft_placement=True 236 | saver=tf.train.Saver() 237 | fout = open("%s/%s.v.pkl"%(dumpdir,keyname),'wb') 238 | with tf.Session(config=config) as sess: 239 | #sess.run(tf.initialize_all_variables()) 240 | sess.run(tf.global_variables_initializer()) 241 | saver.restore(sess,"%s/%s.ckpt"%(dumpdir,keyname)) 242 | fetchworker.bno=0 243 | fetchworker.start() 244 | cnt=0 245 | for i in xrange(0,300000): 246 | t0=time.time() 247 | data,ptcloud,validating=fetch_batch() 248 | validating=validating[0]!=0 249 | if not validating: 250 | continue 251 | cnt+=1 252 | 
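            # forward the validation batch and pickle its inputs, ground truth, prediction and per-point distance map for visualizeptexample.v.py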
pred,distmap=sess.run([x,mindist],feed_dict={img_inp:data,pt_gt:ptcloud}) 253 | pickle.dump((i,data,ptcloud,pred,distmap),fout,protocol=-1) 254 | print i,'time',time.time()-t0,cnt 255 | if cnt>=valnum: 256 | break 257 | fout.close() 258 | 259 | if __name__=='__main__': 260 | resourceid = 0 261 | datadir,dumpdir,cmd,valnum="data","dump","predict",3 262 | for pt in sys.argv[1:]: 263 | if pt[:5]=="data=": 264 | datadir = pt[5:] 265 | elif pt[:5]=="dump=": 266 | dumpdir = pt[5:] 267 | elif pt[:4]=="num=": 268 | valnum = int(pt[4:]) 269 | else: 270 | cmd = pt 271 | if datadir[-1]=='/': 272 | datadir = datadir[:-1] 273 | if dumpdir[-1]=='/': 274 | dumpdir = dumpdir[:-1] 275 | assert os.path.exists(datadir),"data dir not exists" 276 | os.system("mkdir -p %s"%dumpdir) 277 | fetchworker=BatchFetcher(datadir) 278 | print "datadir=%s dumpdir=%s num=%d cmd=%s started"%(datadir,dumpdir,valnum,cmd) 279 | 280 | keyname=os.path.basename(__file__).rstrip('.py') 281 | try: 282 | if cmd=="train": 283 | main(resourceid,keyname) 284 | elif cmd=="predict": 285 | dumppredictions(resourceid,keyname,valnum) 286 | else: 287 | assert False,"format wrong" 288 | finally: 289 | stop_fetcher() 290 | -------------------------------------------------------------------------------- /depthestimate/visualizeptexample.v.py: -------------------------------------------------------------------------------- 1 | import show3d_balls as show3d 2 | #import show3d 3 | import cv2 4 | import numpy as np 5 | import ctypes as ct 6 | import cv2 7 | import cPickle as pickle 8 | import matplotlib.pyplot as plt 9 | import sys 10 | 11 | 12 | colorflag=1 13 | showz=0 14 | def heatmap(x,normalize=True): 15 | if normalize: 16 | x=(x-x.min())/(x.max()-x.min()+1e-9) 17 | r=(x*3-1).clip(0,1) 18 | g=(1.5-np.abs(x-0.5)*3).clip(0,1) 19 | b=(2-x*3).clip(0,1) 20 | return cv2.merge([b,g,r]) 21 | 22 | 23 | 24 | dists=[] 25 | 26 | if __name__ == "__main__": 27 | fin=open(sys.argv[1]) 28 | while True: 29 | try: 30 | bno,data,ptcloud,pred,distmap=pickle.load(fin) 31 | print bno 32 | except EOFError: 33 | break 34 | 35 | for i in xrange(len(data)): 36 | def updatecolor(): 37 | global c0,c1,c2,showpoints 38 | if colorflag: 39 | showpoints=np.vstack([ptcloud[i][::1],pred[i]]) 40 | value=distmap[i]/distmap[i].max() 41 | rgb=np.zeros((len(value),3),dtype='float32') 42 | rgb[:,2]=(value*2).clip(0,1)*255 43 | rgb[:,1]=(2-value*2).clip(0,1)*255 44 | c0=np.hstack([rgb[:,1][::1],np.zeros(len(pred[i]))]) 45 | c1=np.hstack([rgb[:,2][::1],np.zeros(len(pred[i]))]) 46 | c2=np.hstack([rgb[:,0][::1],np.ones(len(pred[i]))]) 47 | else: 48 | showpoints=np.vstack([pred[i]]) 49 | if showz: 50 | value=(np.linspace(0,1,len(pred[i]))<0.25)*1.0 51 | rgb=np.zeros((len(value),3),dtype='float32') 52 | rgb[:,2]=(value*3-1).clip(0,1)*255 53 | rgb[:,1]=(1.5-np.abs(value-0.5)*3).clip(0,1)*255 54 | rgb[:,0]=(2-value*3).clip(0,1)*255 55 | c0=np.hstack([rgb[:,1][::1]]) 56 | c1=np.hstack([rgb[:,2][::1],]) 57 | c2=np.hstack([rgb[:,0][::1]]) 58 | else: 59 | c0=None 60 | c1=None 61 | c2=None 62 | updatecolor() 63 | 64 | def big(x): 65 | return cv2.resize(x,(0,0),fx=4,fy=4,interpolation=0) 66 | rsz=int(pred[i].shape[0]**0.5+0.5) 67 | cv2.imshow('x',big(heatmap(pred[i].reshape((rsz,rsz,3))[:,:,0]))) 68 | cv2.imshow('y',big(heatmap(pred[i].reshape((rsz,rsz,3))[:,:,1]))) 69 | cv2.imshow('z',big(heatmap(pred[i].reshape((rsz,rsz,3))[:,:,2]))) 70 | cv2.imshow('data',data[i]) 71 | 72 | while True: 73 | cmd=show3d.showpoints(showpoints,c0=c0,c1=c1,c2=c2,waittime=100,magnifyBlue=(0 if colorflag==1 else 
0),background=((128,128,128) if colorflag==1 else (0,0,0)),ballradius=(2 if colorflag==1 else 12))%256 74 | if cmd==ord('c'): 75 | colorflag=1-colorflag 76 | updatecolor() 77 | if cmd==ord('z'): 78 | showz=1-showz 79 | updatecolor() 80 | if cmd==ord(' '): 81 | break 82 | if cmd==ord('q'): 83 | break 84 | if cmd==ord('q'): 85 | break 86 | 87 | -------------------------------------------------------------------------------- /makefile: -------------------------------------------------------------------------------- 1 | nvcc = /usr/local/cuda-8.0/bin/nvcc 2 | cudalib = /usr/local/cuda-8.0/lib64/ 3 | tensorflow = /usr/local/lib/python2.7/dist-packages/tensorflow/include 4 | 5 | all: depthestimate/tf_nndistance_so.so depthestimate/render_balls_so.so 6 | .PHONY : all 7 | 8 | depthestimate/tf_nndistance_so.so: depthestimate/tf_nndistance_g.cu.o depthestimate/tf_nndistance.cpp 9 | g++ -std=c++11 depthestimate/tf_nndistance.cpp depthestimate/tf_nndistance_g.cu.o -o depthestimate/tf_nndistance_so.so -shared -fPIC -I $(tensorflow) -lcudart -L $(cudalib) -O2 -D_GLIBCXX_USE_CXX11_ABI=0 10 | 11 | depthestimate/tf_nndistance_g.cu.o: depthestimate/tf_nndistance_g.cu 12 | $(nvcc) -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++11 -c -o depthestimate/tf_nndistance_g.cu.o depthestimate/tf_nndistance_g.cu -I $(tensorflow) -DGOOGLE_CUDA=1 -x cu -Xcompiler -fPIC -O2 13 | 14 | depthestimate/render_balls_so.so: depthestimate/render_balls_so.cpp 15 | g++ -std=c++11 depthestimate/render_balls_so.cpp -o depthestimate/render_balls_so.so -shared -fPIC -O2 -D_GLIBCXX_USE_CXX11_ABI=0 16 | 17 | 18 | -------------------------------------------------------------------------------- /position.svg: -------------------------------------------------------------------------------- [SVG figure: capture-geometry diagram; the only recoverable text labels are "r", "2.5r", "20" and "o"] -------------------------------------------------------------------------------- /show3d.py: -------------------------------------------------------------------------------- 1 | ''' 2 | 3 | The default behavior is to visualize the points as white dots 4 | >>>show3d.showpoints(np.random.rand(10000,3)) 5 | 6 | Control: 7 | key q: quit 8 | key Q: sys.exit(0) 9 | key n: zoom in 10 | key m: zoom out 11 | key s: save screenshot to 'show3d.png' 12 | Mouse: rotate 13 | 14 | You can also play a video by specifying waittime 15 | >>>[show3d.showpoints(np.random.rand(10000,3),waittime=10) for i in xrange(10000)] 16 | 17 | Color can also be useful 18 | >>>green=np.linspace(0,1,10000) 19 | >>>red=np.linspace(1,0,10000) 20 | >>>blue=np.linspace(1,0,10000)**2 21 | >>>show3d.showpoints(np.random.rand(10000,3),green,red,blue) 22 | 23 | Additional Parameters 24 | --------------------- 25 | normalizecolor: 26 | if True (default), scale the maximum of each color channel to 255. 27 | magnifyBlue: 28 | if set to 1 or 2, dilate the blue dots to make them more visible 29 | background: 30 | the background color. 
Defaults to black (0,0,0) 31 | freezerot: 32 | disable rotation 33 | 34 | ''' 35 | 36 | 37 | import numpy as np 38 | import cv2 39 | import sys 40 | showsz=800 41 | mousex,mousey=0.5,0.5 42 | zoom=1.0 43 | changed=True 44 | def onmouse(*args): 45 | global mousex,mousey,changed 46 | y=args[1] 47 | x=args[2] 48 | mousex=x/float(showsz) 49 | mousey=y/float(showsz) 50 | changed=True 51 | cv2.namedWindow('show3d') 52 | cv2.moveWindow('show3d',0,0) 53 | cv2.setMouseCallback('show3d',onmouse) 54 | def showpoints(xyz,c0=None,c1=None,c2=None,waittime=0,showrot=False,magnifyBlue=0,freezerot=False,background=(0,0,0),normalizecolor=True): 55 | global showsz,mousex,mousey,zoom,changed 56 | if len(xyz.shape)!=2 or xyz.shape[1]!=3: 57 | raise Exception('showpoints expects (n,3) shape for xyz') 58 | if c0 is not None and c0.shape!=xyz.shape[:1]: 59 | raise Exception('showpoints expects (n,) shape for c0') 60 | if c1 is not None and c1.shape!=xyz.shape[:1]: 61 | raise Exception('showpoints expects (n,) shape for c1') 62 | if c2 is not None and c2.shape!=xyz.shape[:1]: 63 | raise Exception('showpoints expects (n,) shape for c2') 64 | xyz=xyz-xyz.mean(axis=0) 65 | radius=((xyz**2).sum(axis=-1)**0.5).max() 66 | xyz/=(radius*2.2)/showsz 67 | if c0 is None: 68 | c0=np.zeros((len(xyz),),dtype='float32')+255 69 | if c1 is None: 70 | c1=c0 71 | if c2 is None: 72 | c2=c0 73 | if normalizecolor: 74 | c0=c0/((c0.max()+1e-14)/255.0) 75 | c1=c1/((c1.max()+1e-14)/255.0) 76 | c2=c2/((c2.max()+1e-14)/255.0) 77 | 78 | show=np.zeros((showsz,showsz,3),dtype='uint8') 79 | def render(): 80 | rotmat=np.eye(3) 81 | if not freezerot: 82 | xangle=(mousey-0.5)*np.pi*1.2 83 | else: 84 | xangle=0 85 | rotmat=rotmat.dot(np.array([ 86 | [1.0,0.0,0.0], 87 | [0.0,np.cos(xangle),-np.sin(xangle)], 88 | [0.0,np.sin(xangle),np.cos(xangle)], 89 | ])) 90 | if not freezerot: 91 | yangle=(mousex-0.5)*np.pi*1.2 92 | else: 93 | yangle=0 94 | rotmat=rotmat.dot(np.array([ 95 | [np.cos(yangle),0.0,-np.sin(yangle)], 96 | [0.0,1.0,0.0], 97 | [np.sin(yangle),0.0,np.cos(yangle)], 98 | ])) 99 | rotmat*=zoom 100 | nxyz=xyz.dot(rotmat) 101 | nz=nxyz[:,2].argsort() 102 | nxyz=nxyz[nz] 103 | nxyz=(nxyz[:,:2]+[showsz/2,showsz/2]).astype('int32') 104 | p=nxyz[:,0]*showsz+nxyz[:,1] 105 | show[:]=background 106 | m=(nxyz[:,0]>=0)*(nxyz[:,0]<showsz)*(nxyz[:,1]>=0)*(nxyz[:,1]<showsz) 107 | show.reshape((showsz*showsz,3))[p[m],1]=c0[nz][m] 108 | show.reshape((showsz*showsz,3))[p[m],2]=c1[nz][m] 109 | show.reshape((showsz*showsz,3))[p[m],0]=c2[nz][m] 110 | if magnifyBlue>0: 111 | show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],1,axis=0)) 112 | if magnifyBlue>=2: 113 | show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],-1,axis=0)) 114 | show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],1,axis=1)) 115 | if magnifyBlue>=2: 116 | show[:,:,0]=np.maximum(show[:,:,0],np.roll(show[:,:,0],-1,axis=1)) 117 | if showrot: 118 | cv2.putText(show,'xangle %d'%(int(xangle/np.pi*180)),(30,showsz-30),0,0.5,cv2.cv.CV_RGB(255,0,0)) 119 | cv2.putText(show,'yangle %d'%(int(yangle/np.pi*180)),(30,showsz-50),0,0.5,cv2.cv.CV_RGB(255,0,0)) 120 | cv2.putText(show,'zoom %d%%'%(int(zoom*100)),(30,showsz-70),0,0.5,cv2.cv.CV_RGB(255,0,0)) 121 | changed=True 122 | while True: 123 | if changed: 124 | render() 125 | changed=False 126 | cv2.imshow('show3d',show) 127 | if waittime==0: 128 | cmd=cv2.waitKey(10)%256 129 | else: 130 | cmd=cv2.waitKey(waittime)%256 131 | if cmd==ord('q'): 132 | break 133 | elif cmd==ord('Q'): 134 | sys.exit(0) 135 | if cmd==ord('n'): 136 | zoom*=1.1 137 | changed=True 138 | elif cmd==ord('m'): 139 | zoom/=1.1 140 | changed=True 141 | elif cmd==ord('r'): 142 | zoom=1.0 143 | changed=True 144 | elif cmd==ord('s'): 145 | cv2.imwrite('show3d.png',show) 146 | if waittime!=0: 
147 | break 148 | return cmd 149 | if __name__=='__main__': 150 | showpoints(np.random.rand(10000,3)) 151 | green=np.linspace(0,1,10000) 152 | red=np.linspace(1,0,10000)**0.5 153 | blue=np.linspace(1,0,10000) 154 | showpoints(np.random.rand(10000,3),green,red,blue,magnifyBlue=True) 155 | -------------------------------------------------------------------------------- /traindataviewer.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import cv2 3 | import zlib 4 | import math 5 | 6 | BATCH_SIZE=32 7 | HEIGHT=192 8 | WIDTH=256 9 | POINTCLOUDSIZE=16384 10 | OUTPUTPOINTS=1024 11 | REEBSIZE=1024 12 | 13 | def loadBinFile(path): 14 | binfile=zlib.decompress(open(path,'rb').read()) 15 | p=0 16 | color=np.fromstring(binfile[p:p+BATCH_SIZE*HEIGHT*WIDTH*3],dtype='uint8').reshape((BATCH_SIZE,HEIGHT,WIDTH,3)) 17 | p+=BATCH_SIZE*HEIGHT*WIDTH*3 18 | depth=np.fromstring(binfile[p:p+BATCH_SIZE*HEIGHT*WIDTH*2],dtype='uint16').reshape((BATCH_SIZE,HEIGHT,WIDTH)) 19 | p+=BATCH_SIZE*HEIGHT*WIDTH*2 20 | rotmat=np.fromstring(binfile[p:p+BATCH_SIZE*3*3*4],dtype='float32').reshape((BATCH_SIZE,3,3)) 21 | p+=BATCH_SIZE*3*3*4 22 | ptcloud=np.fromstring(binfile[p:p+BATCH_SIZE*POINTCLOUDSIZE*3],dtype='uint8').reshape((BATCH_SIZE,POINTCLOUDSIZE,3)) 23 | ptcloud=ptcloud.astype('float32')/255 24 | beta=math.pi/180*20 25 | viewmat=np.array([[ 26 | np.cos(beta),0,-np.sin(beta)],[ 27 | 0,1,0],[ 28 | np.sin(beta),0,np.cos(beta)]],dtype='float32') 29 | rotmat=rotmat.dot(np.linalg.inv(viewmat)) 30 | for i in xrange(BATCH_SIZE): 31 | ptcloud[i]=((ptcloud[i]-[0.7,0.5,0.5])/0.4).dot(rotmat[i])+[1,0,0] 32 | p+=BATCH_SIZE*POINTCLOUDSIZE*3 33 | some_other_thing=np.fromstring(binfile[p:p+BATCH_SIZE*REEBSIZE*2*4],dtype='uint16').reshape((BATCH_SIZE,REEBSIZE,4)) 34 | p+=BATCH_SIZE*REEBSIZE*2*4 35 | keynames=binfile[p:].split('\n') 36 | data=np.zeros((BATCH_SIZE,HEIGHT,WIDTH,4),dtype='float32') 37 | data[:,:,:,:3]=color*(1/255.0) 38 | data[:,:,:,3]=depth==0 39 | validating=np.array([i[0]=='f' for i in keynames],dtype='float32') 40 | return color,depth,ptcloud,keynames 41 | 42 | if __name__=='__main__': 43 | def plotimggrid(imgs,bgvalue=0,hpadding=0,vpadding=0): 44 | if len(imgs)==0: 45 | return np.zeros((1,1,3),dtype='uint8')^bgvalue 46 | ih=max([i.shape[0] for i in imgs]) 47 | iw=max([i.shape[1] for i in imgs]) 48 | w=min(len(imgs),max(1,int(np.ceil((len(imgs)*ih*iw)**0.5/iw)))) 49 | h=((len(imgs)+(w-1))//w) 50 | output=np.zeros((h*ih+(h-1)*vpadding,w*iw+(w-1)*hpadding,3),dtype='uint8')^bgvalue 51 | for i in xrange(len(imgs)): 52 | x0=(i//w)*(ih+vpadding) 53 | y0=(i%w)*(iw+hpadding) 54 | output[x0:x0+imgs[i].shape[0],y0:y0+imgs[i].shape[1]]=imgs[i] 55 | return output 56 | import sys 57 | import show3d 58 | if len(sys.argv)<2: 59 | print 'python traindataviewer.py data/0/0.gz' 60 | sys.exit(0) 61 | ifname=sys.argv[1] 62 | color,depth,ptcloud,keynames=loadBinFile(ifname) 63 | cv2.imshow('color',cv2.resize(plotimggrid(color),(0,0),fx=0.5,fy=0.5)) 64 | cv2.imshow('depth',cv2.resize(plotimggrid(np.uint8(depth>>8)[:,:,:,None]+[0,0,0]),(0,0),fx=0.5,fy=0.5)) 65 | print 'press q to navigate next, Q to quit' 66 | for i in xrange(len(ptcloud)): 67 | print i,keynames[i] 68 | show3d.showpoints(ptcloud[i]) 69 | --------------------------------------------------------------------------------
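A note on checking the custom NnDistance op: the commented-out block at the bottom of tf_nndistance.py compares the op's outputs against a brute-force NumPy computation on sampled points. The sketch below packages the same check as a standalone reference. It is a hypothetical helper written for this guide (not a file in the repository), and it materializes the full pairwise distance matrix, so it is only practical for small clouds.
```
# nn_distance_check.py -- hypothetical NumPy reference for the NnDistance op.
import numpy as np

def nn_distance_np(xyz1, xyz2):
    # xyz1: (b,n,3), xyz2: (b,m,3). Builds the full (b,n,m) matrix of squared
    # distances, then reduces it in both directions, mirroring dist1/idx1/dist2/idx2.
    d = ((xyz1[:, :, None, :] - xyz2[:, None, :, :]) ** 2).sum(axis=-1)
    return d.min(axis=2), d.argmin(axis=2), d.min(axis=1), d.argmin(axis=1)

if __name__ == '__main__':
    np.random.seed(100)
    a = np.random.randn(2, 256, 3).astype('float32')
    b = np.random.randn(2, 64, 3).astype('float32')
    dist1, idx1, dist2, idx2 = nn_distance_np(a, b)
    # train_nn.py combines the two directions into its Chamfer-style loss:
    # (mean(dist1) + mean(dist2)/2) * 10000, before adding weight decay.
    print((dist1.mean() + dist2.mean() / 2.0) * 10000)
```
This reduction is exactly what the loss in train_nn.py optimizes: predicted points are pulled toward their nearest ground-truth points and vice versa, which is why both dists_forward and dists_backward appear in loss_nodecay.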