├── README.md
├── configs
│   ├── breakfast_room.txt
│   ├── complete_kitchen.txt
│   ├── green_room.txt
│   ├── grey_white_room.txt
│   ├── icl_living_room.txt
│   ├── kitchen.txt
│   ├── morning_apartment.txt
│   ├── scene0002_00.txt
│   ├── scene0005_00.txt
│   ├── scene0012_00.txt
│   ├── scene0050_00.txt
│   ├── staircase.txt
│   └── whiteroom.txt
├── dataloader_util.py
├── deformation_field.py
├── docs
│   ├── index.html
│   └── static
│       ├── css
│       │   ├── bulma-carousel.min.css
│       │   ├── bulma-slider.min.css
│       │   ├── bulma.css.map.txt
│       │   ├── bulma.min.css
│       │   ├── comp-slider.css
│       │   ├── fontawesome.all.min.css
│       │   ├── index.css
│       │   └── juxtapose.css
│       ├── images
│       │   ├── scene0002_bf.png
│       │   ├── scene0002_ours.png
│       │   ├── scene0050_bf.png
│       │   ├── scene0050_ours.png
│       │   └── teaser.jpg
│       ├── js
│       │   ├── bulma-carousel.js
│       │   ├── bulma-carousel.min.js
│       │   ├── bulma-slider.js
│       │   ├── bulma-slider.min.js
│       │   ├── fontawesome.all.min.js
│       │   ├── index.js
│       │   └── juxtapose.min.js
│       ├── pdf
│       │   └── neural_rgbd_surface_reconstruction.pdf
│       └── videos
│           └── teaser.mp4
├── environment.yml
├── external
│   └── NumpyMarchingCubes
│       ├── marching_cubes
│       │   ├── __init__.py
│       │   └── src
│       │       ├── _mcubes.cpp
│       │       ├── _mcubes.pyx
│       │       ├── marching_cubes.cpp
│       │       ├── marching_cubes.h
│       │       ├── pyarray_symbol.h
│       │       ├── pyarraymodule.h
│       │       ├── pywrapper.cpp
│       │       ├── pywrapper.h
│       │       ├── sparsegrid3.h
│       │       └── tables.h
│       └── setup.py
├── extract_mesh.py
├── extract_optimized_poses.py
├── frame_features.py
├── load_dataset.py
├── load_network_model.py
├── load_scannet.py
├── losses.py
├── nerf_helpers.py
├── optimize.py
├── parser_util.py
├── pose_array.py
└── scene_bounds.py

/README.md:
--------------------------------------------------------------------------------

# Neural RGB-D Surface Reconstruction

### [Paper](https://dazinovic.github.io/neural-rgbd-surface-reconstruction/static/pdf/neural_rgbd_surface_reconstruction.pdf) | [Project Page](https://dazinovic.github.io/neural-rgbd-surface-reconstruction/) | [Video](https://youtu.be/iWuSowPsC3g)

> Neural RGB-D Surface Reconstruction
> [Dejan Azinović](http://niessnerlab.org/members/dejan_azinovic/profile.html), [Ricardo Martin-Brualla](https://ricardomartinbrualla.com/), [Dan B Goldman](https://www.danbgoldman.com/home/), [Matthias Nießner](https://www.niessnerlab.org/members/matthias_niessner/profile.html), [Justus Thies](https://justusthies.github.io/)
> CVPR 2022

This repository contains the code for the paper Neural RGB-D Surface Reconstruction, a novel approach for 3D reconstruction that combines implicit surface representations with neural radiance fields.

## Installation

You can create a conda environment called `neural_rgbd` using:

```
conda env create -f environment.yml
conda activate neural_rgbd
```

Make sure to clone the external Marching Cubes dependency and install it in the same environment:

```
cd external/NumpyMarchingCubes
python setup.py install
```

You can run an optimization using:

```
python optimize.py --config configs/<config_name>.txt
```

## Data

The data needs to be in the following format:

```
# args.datadir in the config file
├── depth               # raw (real data) or ground truth (synthetic data) depth images (optional)
│   ├── depth0.png
│   ├── depth1.png
│   ├── depth2.png
│   ...
├── depth_filtered      # filtered depth images
│   ├── depth0.png
│   ├── depth1.png
│   ├── depth2.png
│   ...
├── depth_with_noise    # depth images with synthetic noise and artifacts (optional)
│   ├── depth0.png
│   ├── depth1.png
│   ├── depth2.png
│   ...
├── images              # RGB images
│   ├── img0.png
│   ├── img1.png
│   ├── img2.png
│   ...
├── focal.txt           # focal length
├── poses.txt           # ground truth poses (optional)
└── trainval_poses.txt  # camera poses used for optimization
```

The dataloader is hard-coded to load depth maps from the `depth_filtered` folder. These depth maps have been generated from the raw ones (or `depth_with_noise` in the case of synthetic data) using the same bilateral filter that was used by BundleFusion. The method also works with the raw depth maps, but the results are slightly degraded.

The file `focal.txt` contains a single floating point value representing the focal length of the camera in pixels.

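For illustration, the single value in `focal.txt` can be expanded into a full 3×3 intrinsics matrix. This is a sketch, not code from the repository: it assumes square pixels, zero skew, and a principal point at the image center.

```python
import numpy as np

def load_intrinsics(focal_path, H, W):
    """Build a pinhole intrinsics matrix from the focal length in focal.txt.

    Assumptions (not guaranteed by the repo): square pixels, zero skew,
    and a principal point at the image center.
    """
    with open(focal_path) as f:
        focal = float(f.readline())
    return np.array([
        [focal, 0.0,   0.5 * W],
        [0.0,   focal, 0.5 * H],
        [0.0,   0.0,   1.0],
    ])
```

The dataloader itself only needs the scalar focal length (see `load_focal_length` in `dataloader_util.py`); the matrix form is mainly convenient when exporting data to other tools.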
The files `poses.txt` and `trainval_poses.txt` contain the camera matrices in a 4N × 4 format, where N is the number of cameras in the trajectory. Like the NeRF paper, we use the OpenGL convention for the camera's coordinate system. If you run this code on ScanNet data, make sure to transform the poses to the OpenGL system, since ScanNet uses a different convention.

You can also write your own dataloader. You can use the existing `load_scannet.py` as a template and update `load_dataset.py`.

### Dataset

The dataset used in the paper is available via the following link: neural_rgbd_data.zip (7.25 GB). The ICL data is not included here, but can be downloaded from the original authors' [webpage](https://www.doc.ic.ac.uk/~ahanda/VaFRIC/iclnuim.html).

The scene files have been provided by various artists for free on BlendSwap. Please refer to the table below for license information and links to the .blend files.

| License | Scene name |
| ------- | ---------- |
| CC-BY | [Breakfast room](https://blendswap.com/blend/13363) |
| CC-0 | [Complete kitchen](https://blendswap.com/blend/11801) |
| CC-BY | [Green room](https://blendswap.com/blend/8381) |
| CC-BY | [Grey-white room](https://blendswap.com/blend/13552) |
| CC-BY | [Kitchen](https://blendswap.com/blend/5156) |
| CC-0 | [Morning apartment](https://blendswap.com/blend/10350) |
| CC-BY | [Staircase](https://blendswap.com/blend/14449) |
| CC-BY | [Thin geometry](https://blendswap.com/blend/8381) |
| CC-BY | [Whiteroom](https://blendswap.com/blend/5014) |

We also provide culled ground truth meshes and our method's meshes for evaluation purposes: meshes.zip (514 MB).

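As a sketch of the pose format described in the Data section: each pose file stacks one 4×4 camera-to-world matrix per frame into a 4N × 4 text array. The reader below is a simplified stand-in for `load_poses` in `dataloader_util.py` (which additionally handles invalid poses marked with `nan`), and the axis flip shown for ScanNet-style (OpenCV-convention) poses is an assumption of this sketch, not guaranteed to match the repository's preprocessing.

```python
import numpy as np

def read_pose_file(path):
    """Read a 4N x 4 text file into an (N, 4, 4) array of c2w matrices."""
    return np.loadtxt(path).reshape(-1, 4, 4)

def opencv_to_opengl(c2w):
    """Convert camera-to-world poses from an OpenCV-style convention
    (x right, y down, z forward) to OpenGL (x right, y up, z backward)
    by negating the second and third rotation columns."""
    out = c2w.copy()
    out[:, :3, 1] *= -1.0
    out[:, :3, 2] *= -1.0
    return out
```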
## Citation

If you use this code in your research, please consider citing:

```
@InProceedings{Azinovic_2022_CVPR,
    author    = {Azinovi\'c, Dejan and Martin-Brualla, Ricardo and Goldman, Dan B and Nie{\ss}ner, Matthias and Thies, Justus},
    title     = {Neural RGB-D Surface Reconstruction},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {6290-6301}
}
```

## Further information

The code is largely based on the original NeRF code by Mildenhall et al.
https://github.com/bmild/nerf

The Marching Cubes implementation was adapted from the SPSG code by Dai et al.
https://github.com/angeladai/spsg
--------------------------------------------------------------------------------
/configs/breakfast_room.txt:
--------------------------------------------------------------------------------
expname = breakfast_room
basedir = ./logs
datadir = ./data/breakfast_room
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 1024
N_samples = 320
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 0
optimize_poses = True
use_deformation_field = False
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [0.0, -1.42, 0.0]
sc_factor = 0.4
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/complete_kitchen.txt:
--------------------------------------------------------------------------------
expname = complete_kitchen
basedir = ./logs
datadir = ./data/complete_kitchen
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 768
N_samples = 640
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 0
optimize_poses = True
use_deformation_field = False
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [1.5, -2.25, 0.0]
sc_factor = 0.20
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/green_room.txt:
--------------------------------------------------------------------------------
expname = green_room
basedir = ./logs
datadir = ./data/green_room
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 900
N_samples = 512
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 0
optimize_poses = True
use_deformation_field = False
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [-1, -0.38, 0.3]
sc_factor = 0.25
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/grey_white_room.txt:
--------------------------------------------------------------------------------
expname = grey_white_room
basedir = ./logs
datadir = ./data/grey_white_room
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 900
N_samples = 512
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 0
optimize_poses = True
use_deformation_field = False
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [-0.12, -1.94, -0.69]
sc_factor = 0.25
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/icl_living_room.txt:
--------------------------------------------------------------------------------
expname = icl_living_room
basedir = ./logs
datadir = ./data/icl_living_room
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 1024
N_samples = 320
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 0
optimize_poses = True
use_deformation_field = False
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [-1.3, -1.0, 0.0]
sc_factor = 0.4
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/kitchen.txt:
--------------------------------------------------------------------------------
expname = kitchen
basedir = ./logs
datadir = ./data/kitchen
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 900
N_samples = 512
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 0
optimize_poses = True
use_deformation_field = False
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [0.3, -3.42, -0.12]
sc_factor = 0.25
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/morning_apartment.txt:
--------------------------------------------------------------------------------
expname = morning_apartment
basedir = ./logs
datadir = ./data/morning_apartment
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 1024
N_samples = 256
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 0
optimize_poses = True
use_deformation_field = False
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [-0.22, -0.43, 0.0]
sc_factor = 0.5
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/scene0002_00.txt:
--------------------------------------------------------------------------------
expname = scene0002
basedir = ./logs
datadir = ./data/scannet/scene0002_00
dataset_type = scannet
trainskip = 4

N_iters = 400000
N_rand = 1024
N_samples = 256
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 2
optimize_poses = True
use_deformation_field = True
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [-3.15, 0, 2.86]
sc_factor = 0.5
crop = 6
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/scene0005_00.txt:
--------------------------------------------------------------------------------
expname = scene0005
basedir = ./logs
datadir = ./data/scannet/scene0005_00
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 1024
N_samples = 256
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 2
optimize_poses = True
use_deformation_field = True
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [-2.66, 0, 2.93]
sc_factor = 0.5
crop = 6
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/scene0012_00.txt:
--------------------------------------------------------------------------------
expname = scene0012
basedir = ./logs
datadir = ./data/scannet/scene0012_00
dataset_type = scannet
trainskip = 4

N_iters = 400000
N_rand = 1024
N_samples = 320
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 2
optimize_poses = True
use_deformation_field = True
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [-2.73, 0, 2.65]
sc_factor = 0.4
crop = 6
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/scene0050_00.txt:
--------------------------------------------------------------------------------
expname = scene0050
basedir = ./logs
datadir = ./data/scannet/scene0050_00
dataset_type = scannet
trainskip = 4

N_iters = 400000
N_rand = 1024
N_samples = 256
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 2
optimize_poses = True
use_deformation_field = True
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [-4.44, 0, 2.31]
sc_factor = 0.5
crop = 6
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/staircase.txt:
--------------------------------------------------------------------------------
expname = staircase
basedir = ./logs
datadir = ./data/staircase
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 900
N_samples = 512
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 0
optimize_poses = True
use_deformation_field = False
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [0.0, -2.42, 0.0]
sc_factor = 0.25
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
--------------------------------------------------------------------------------
/configs/whiteroom.txt:
--------------------------------------------------------------------------------
expname = whiteroom
basedir = ./logs
datadir = ./data/whiteroom
dataset_type = scannet
trainskip = 1

N_iters = 400000
N_rand = 900
N_samples = 512
N_importance = 16
chunk = 4096 # 1024 * 16
frame_features = 0
optimize_poses = True
use_deformation_field = False
share_coarse_fine = True
multires = 8

rgb_weight = 0.1
depth_weight = 0.0
fs_weight = 10.0
trunc_weight = 6000.0
trunc = 0.05

rgb_loss_type = l2
sdf_loss_type = l2

mode = sdf
use_viewdirs = True
raw_noise_std = 0.0

translation = [0.3, -3.42, -0.12]
sc_factor = 0.25
near = 0.0
far = 2.0

factor = 1
render_factor = 1

i_img = 5000
i_mesh = 200000
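The config files above share a flat `key = value` layout with `#` comments. The repository parses them through `parser_util.py`; the standalone reader below is only an illustrative sketch of the format, not the project's actual parser.

```python
def parse_config(text):
    """Parse flat `key = value` config text into a dict of raw strings.

    Inline `#` comments and blank lines are ignored. Values are left as
    strings; casting (int, float, bool, list) is up to the caller.
    """
    cfg = {}
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()  # drop inline comments
        if not line:
            continue
        key, _, value = line.partition('=')
        cfg[key.strip()] = value.strip()
    return cfg
```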
--------------------------------------------------------------------------------
/dataloader_util.py:
--------------------------------------------------------------------------------
import numpy as np
import re
import cv2


def load_poses(posefile):
    file = open(posefile, "r")
    lines = file.readlines()
    file.close()
    poses = []
    valid = []
    lines_per_matrix = 4
    for i in range(0, len(lines), lines_per_matrix):
        if 'nan' in lines[i]:
            valid.append(False)
            poses.append(np.eye(4, 4, dtype=np.float32).tolist())
        else:
            valid.append(True)
            pose_floats = [[float(x) for x in line.split()] for line in lines[i:i+lines_per_matrix]]
            poses.append(pose_floats)

    return poses, valid


def load_focal_length(filepath):
    file = open(filepath, "r")
    return float(file.readline())


def alphanum_key(s):
    """ Turn a string into a list of string and number chunks.
        "z23a" -> ["z", 23, "a"]
    """
    return [int(x) if x.isdigit() else x for x in re.split('([0-9]+)', s)]


def resize_images(images, H, W, interpolation=cv2.INTER_LINEAR):
    resized = np.zeros((images.shape[0], H, W, images.shape[3]), dtype=images.dtype)
    for i, img in enumerate(images):
        r = cv2.resize(img, (W, H), interpolation=interpolation)
        if images.shape[3] == 1:
            r = r[..., np.newaxis]
        resized[i] = r
    return resized
--------------------------------------------------------------------------------
/deformation_field.py:
--------------------------------------------------------------------------------
import tensorflow as tf


class DeformationField(tf.Module):
    """
    Image-plane deformation field.

    This is an MLP with D layers and W weights per layer. Skip connections
    are defined by a list passed to the constructor.

    The input to the MLP is a 2D image-plane coordinate.
    The output is a 2D image-plane vector used to shift a camera
    ray's endpoint from the input coordinates to some other coordinates.
    """

    def __init__(self, D=6, W=128, input_ch=2, output_ch=2, skips=[3]):
        super(DeformationField, self).__init__()

        relu = tf.keras.layers.ReLU()
        dense = lambda W, act=relu: tf.keras.layers.Dense(W, activation=act)

        input_ch = int(input_ch)

        inputs_pts = tf.keras.Input(shape=input_ch)
        inputs_pts.set_shape([None, input_ch])

        outputs = inputs_pts
        for i in range(D):
            outputs = dense(W)(outputs)
            if i in skips:
                outputs = tf.concat([inputs_pts, outputs], -1)

        outputs = dense(output_ch, act=None)(outputs)

        self.model = tf.keras.Model(inputs=inputs_pts, outputs=outputs)

    def __call__(self, pts):
        return self.model(pts)

    def get_weights(self):
        return self.model.get_weights()

    def set_weights(self, weights):
        self.model.set_weights(weights)
--------------------------------------------------------------------------------
/docs/index.html:
--------------------------------------------------------------------------------
1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | Neural RGB-D Surface Reconstruction 9 | 10 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 69 | 70 | 71 |
72 |
73 |
74 |
75 |
76 |

Neural RGB-D Surface Reconstruction

77 |
78 | 79 | Dejan Azinović1 80 | 81 | 82 | Ricardo Martin-Brualla2 83 | 84 | 85 | Dan B Goldman2 86 | 87 | 88 | Matthias Nießner1 89 | 90 | 91 | Justus Thies1, 3 92 | 93 |
94 | 95 |
96 | 1Technical University of Munich 97 | 2Google Research 98 | 3Max Planck Institute for Intelligent Systems 99 |
100 | 101 |
102 | 143 | 144 |
145 |
146 |
147 |
148 |
149 |
150 | 151 |
152 |
153 |
154 | 158 |

159 | Our method obtains a high-quality 3D reconstruction from an RGB-D input sequence by training a multi-layer perceptron. 160 |

161 |
162 |
163 |
164 | 165 |
166 |
167 | 168 | 169 |
170 |
171 |

Abstract

172 |
173 |

174 | In this work, we explore how to leverage the success of implicit novel view synthesis methods for surface reconstruction. 175 | Methods which learn a neural radiance field have shown amazing image synthesis results, but the underlying geometry representation is only a coarse approximation of the real geometry. 176 | We demonstrate how depth measurements can be incorporated into the radiance field formulation to produce more detailed and complete reconstruction results than using methods based on either color or depth data alone. 177 | In contrast to a density field as the underlying geometry representation, we propose to learn a deep neural network which stores a truncated signed distance field. 178 | Using this representation, we show that one can still leverage differentiable volume rendering to estimate color values of the observed images during training to compute a reconstruction loss. 179 | This is beneficial for learning the signed distance field in regions with missing depth measurements. 180 | Furthermore, we correct for misalignment errors of the camera, improving the overall reconstruction quality. 181 | In several experiments, we showcase our method and compare to existing works on classical RGB-D fusion and learned representations. 

183 |
184 |
185 |
186 |
187 | 188 |
189 | 190 |
191 | 192 |
193 |
194 |

Video

195 |
196 | 198 |
199 |
200 |
201 | 202 |
203 | 204 |
205 |
206 |
207 |
208 |

Results

209 |
210 |

211 | We test our method on the ScanNet dataset, which provides RGB-D sequences of room-scale scenes. 212 | We compare our method to the original ScanNet BundleFusion reconstructions, which often suffer from severe camera pose misalignment. 213 | Our approach jointly optimizes for the scene representation network as well as the camera poses, leading to substantially reduced misalignment artifacts in the reconstructed geometry. 

215 |
216 |
217 |
218 |
219 | 220 | 221 |
222 |
223 | 224 | 225 |
226 | 227 |
228 |
229 |
230 |
231 |
232 |
233 | 234 | 235 |
236 |
237 |

BibTeX

238 |
@InProceedings{Azinovic_2022_CVPR,
239 |     author    = {Azinovi\'c, Dejan and Martin-Brualla, Ricardo and Goldman, Dan B and Nie{\ss}ner, Matthias and Thies, Justus},
240 |     title     = {Neural RGB-D Surface Reconstruction},
241 |     booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
242 |     month     = {June},
243 |     year      = {2022},
244 |     pages     = {6290-6301}
245 | }
246 |
247 |
248 | 249 | 250 | 278 | 279 | 280 | -------------------------------------------------------------------------------- /docs/static/css/bulma-carousel.min.css: -------------------------------------------------------------------------------- 1 | @-webkit-keyframes spinAround{from{-webkit-transform:rotate(0);transform:rotate(0)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes spinAround{from{-webkit-transform:rotate(0);transform:rotate(0)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.slider{position:relative;width:100%}.slider-container{display:flex;flex-wrap:nowrap;flex-direction:row;overflow:hidden;-webkit-transform:translate3d(0,0,0);transform:translate3d(0,0,0);min-height:100%}.slider-container.is-vertical{flex-direction:column}.slider-container .slider-item{flex:none}.slider-container .slider-item .image.is-covered img{-o-object-fit:cover;object-fit:cover;-o-object-position:center center;object-position:center center;height:100%;width:100%}.slider-container .slider-item .video-container{height:0;padding-bottom:0;padding-top:56.25%;margin:0;position:relative}.slider-container .slider-item .video-container.is-1by1,.slider-container .slider-item .video-container.is-square{padding-top:100%}.slider-container .slider-item .video-container.is-4by3{padding-top:75%}.slider-container .slider-item .video-container.is-21by9{padding-top:42.857143%}.slider-container .slider-item .video-container embed,.slider-container .slider-item .video-container iframe,.slider-container .slider-item .video-container object{position:absolute;top:0;left:0;width:100%!important;height:100%!important}.slider-navigation-next,.slider-navigation-previous{display:flex;justify-content:center;align-items:center;position:absolute;width:42px;height:42px;background:#fff center center no-repeat;background-size:20px 20px;border:1px solid #fff;border-radius:25091983px;box-shadow:0 2px 5px #3232321a;top:50%;margin-top:-20px;left:0;cursor:pointer;transition:opacity 
.3s,-webkit-transform .3s;transition:transform .3s,opacity .3s;transition:transform .3s,opacity .3s,-webkit-transform .3s}.slider-navigation-next:hover,.slider-navigation-previous:hover{-webkit-transform:scale(1.2);transform:scale(1.2)}.slider-navigation-next.is-hidden,.slider-navigation-previous.is-hidden{display:none;opacity:0}.slider-navigation-next svg,.slider-navigation-previous svg{width:25%}.slider-navigation-next{left:auto;right:0;background:#fff center center no-repeat;background-size:20px 20px}.slider-pagination{display:none;justify-content:center;align-items:center;position:absolute;bottom:0;left:0;right:0;padding:.5rem 1rem;text-align:center}.slider-pagination .slider-page{background:#fff;width:10px;height:10px;border-radius:25091983px;display:inline-block;margin:0 3px;box-shadow:0 2px 5px #3232321a;transition:-webkit-transform .3s;transition:transform .3s;transition:transform .3s,-webkit-transform .3s;cursor:pointer}.slider-pagination .slider-page.is-active,.slider-pagination .slider-page:hover{-webkit-transform:scale(1.4);transform:scale(1.4)}@media screen and (min-width:800px){.slider-pagination{display:flex}}.hero.has-carousel{position:relative}.hero.has-carousel+.hero-body,.hero.has-carousel+.hero-footer,.hero.has-carousel+.hero-head{z-index:10;overflow:hidden}.hero.has-carousel .hero-carousel{position:absolute;top:0;left:0;bottom:0;right:0;height:auto;border:none;margin:auto;padding:0;z-index:0}.hero.has-carousel .hero-carousel .slider{width:100%;max-width:100%;overflow:hidden;height:100%!important;max-height:100%;z-index:0}.hero.has-carousel .hero-carousel .slider .has-background{max-height:100%}.hero.has-carousel .hero-carousel .slider .has-background .is-background{-o-object-fit:cover;object-fit:cover;-o-object-position:center center;object-position:center center;height:100%;width:100%}.hero.has-carousel .hero-body{margin:0 3rem;z-index:10} -------------------------------------------------------------------------------- 
/docs/static/css/bulma-slider.min.css: -------------------------------------------------------------------------------- 1 | @-webkit-keyframes spinAround{from{-webkit-transform:rotate(0);transform:rotate(0)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes spinAround{from{-webkit-transform:rotate(0);transform:rotate(0)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}input[type=range].slider{-webkit-appearance:none;-moz-appearance:none;appearance:none;margin:1rem 0;background:0 0;touch-action:none}input[type=range].slider.is-fullwidth{display:block;width:100%}input[type=range].slider:focus{outline:0}input[type=range].slider:not([orient=vertical])::-webkit-slider-runnable-track{width:100%}input[type=range].slider:not([orient=vertical])::-moz-range-track{width:100%}input[type=range].slider:not([orient=vertical])::-ms-track{width:100%}input[type=range].slider:not([orient=vertical]).has-output+output,input[type=range].slider:not([orient=vertical]).has-output-tooltip+output{width:3rem;background:#4a4a4a;border-radius:4px;padding:.4rem .8rem;font-size:.75rem;line-height:.75rem;text-align:center;text-overflow:ellipsis;white-space:nowrap;color:#fff;overflow:hidden;pointer-events:none;z-index:200}input[type=range].slider:not([orient=vertical]).has-output-tooltip:disabled+output,input[type=range].slider:not([orient=vertical]).has-output:disabled+output{opacity:.5}input[type=range].slider:not([orient=vertical]).has-output{display:inline-block;vertical-align:middle;width:calc(100% - 
(4.2rem))}input[type=range].slider:not([orient=vertical]).has-output+output{display:inline-block;margin-left:.75rem;vertical-align:middle}input[type=range].slider:not([orient=vertical]).has-output-tooltip{display:block}input[type=range].slider:not([orient=vertical]).has-output-tooltip+output{position:absolute;left:0;top:-.1rem}input[type=range].slider[orient=vertical]{-webkit-appearance:slider-vertical;-moz-appearance:slider-vertical;appearance:slider-vertical;-webkit-writing-mode:bt-lr;-ms-writing-mode:bt-lr;writing-mode:bt-lr}input[type=range].slider[orient=vertical]::-webkit-slider-runnable-track{height:100%}input[type=range].slider[orient=vertical]::-moz-range-track{height:100%}input[type=range].slider[orient=vertical]::-ms-track{height:100%}input[type=range].slider::-webkit-slider-runnable-track{cursor:pointer;animate:.2s;box-shadow:0 0 0 #7a7a7a;background:#dbdbdb;border-radius:4px;border:0 solid #7a7a7a}input[type=range].slider::-moz-range-track{cursor:pointer;animate:.2s;box-shadow:0 0 0 #7a7a7a;background:#dbdbdb;border-radius:4px;border:0 solid #7a7a7a}input[type=range].slider::-ms-track{cursor:pointer;animate:.2s;box-shadow:0 0 0 #7a7a7a;background:#dbdbdb;border-radius:4px;border:0 solid #7a7a7a}input[type=range].slider::-ms-fill-lower{background:#dbdbdb;border-radius:4px}input[type=range].slider::-ms-fill-upper{background:#dbdbdb;border-radius:4px}input[type=range].slider::-webkit-slider-thumb{box-shadow:none;border:1px solid #b5b5b5;border-radius:4px;background:#fff;cursor:pointer}input[type=range].slider::-moz-range-thumb{box-shadow:none;border:1px solid #b5b5b5;border-radius:4px;background:#fff;cursor:pointer}input[type=range].slider::-ms-thumb{box-shadow:none;border:1px solid 
#b5b5b5;border-radius:4px;background:#fff;cursor:pointer}input[type=range].slider::-webkit-slider-thumb{-webkit-appearance:none;appearance:none}input[type=range].slider.is-circle::-webkit-slider-thumb{border-radius:290486px}input[type=range].slider.is-circle::-moz-range-thumb{border-radius:290486px}input[type=range].slider.is-circle::-ms-thumb{border-radius:290486px}input[type=range].slider:active::-webkit-slider-thumb{-webkit-transform:scale(1.25);transform:scale(1.25)}input[type=range].slider:active::-moz-range-thumb{transform:scale(1.25)}input[type=range].slider:active::-ms-thumb{transform:scale(1.25)}input[type=range].slider:disabled{opacity:.5;cursor:not-allowed}input[type=range].slider:disabled::-webkit-slider-thumb{cursor:not-allowed;-webkit-transform:scale(1);transform:scale(1)}input[type=range].slider:disabled::-moz-range-thumb{cursor:not-allowed;transform:scale(1)}input[type=range].slider:disabled::-ms-thumb{cursor:not-allowed;transform:scale(1)}input[type=range].slider:not([orient=vertical]){min-height:calc((1rem + 2px) * 
1.25)}input[type=range].slider:not([orient=vertical])::-webkit-slider-runnable-track{height:.5rem}input[type=range].slider:not([orient=vertical])::-moz-range-track{height:.5rem}input[type=range].slider:not([orient=vertical])::-ms-track{height:.5rem}input[type=range].slider[orient=vertical]::-webkit-slider-runnable-track{width:.5rem}input[type=range].slider[orient=vertical]::-moz-range-track{width:.5rem}input[type=range].slider[orient=vertical]::-ms-track{width:.5rem}input[type=range].slider::-webkit-slider-thumb{height:1rem;width:1rem}input[type=range].slider::-moz-range-thumb{height:1rem;width:1rem}input[type=range].slider::-ms-thumb{height:1rem;width:1rem}input[type=range].slider::-ms-thumb{margin-top:0}input[type=range].slider::-webkit-slider-thumb{margin-top:-.25rem}input[type=range].slider[orient=vertical]::-webkit-slider-thumb{margin-top:auto;margin-left:-.25rem}input[type=range].slider.is-small:not([orient=vertical]){min-height:calc((.75rem + 2px) * 1.25)}input[type=range].slider.is-small:not([orient=vertical])::-webkit-slider-runnable-track{height:.375rem}input[type=range].slider.is-small:not([orient=vertical])::-moz-range-track{height:.375rem}input[type=range].slider.is-small:not([orient=vertical])::-ms-track{height:.375rem}input[type=range].slider.is-small[orient=vertical]::-webkit-slider-runnable-track{width:.375rem}input[type=range].slider.is-small[orient=vertical]::-moz-range-track{width:.375rem}input[type=range].slider.is-small[orient=vertical]::-ms-track{width:.375rem}input[type=range].slider.is-small::-webkit-slider-thumb{height:.75rem;width:.75rem}input[type=range].slider.is-small::-moz-range-thumb{height:.75rem;width:.75rem}input[type=range].slider.is-small::-ms-thumb{height:.75rem;width:.75rem}input[type=range].slider.is-small::-ms-thumb{margin-top:0}input[type=range].slider.is-small::-webkit-slider-thumb{margin-top:-.1875rem}input[type=range].slider.is-small[orient=vertical]::-webkit-slider-thumb{margin-top:auto;margin-left:-.1875rem}input[type=r
ange].slider.is-medium:not([orient=vertical]){min-height:calc((1.25rem + 2px) * 1.25)}input[type=range].slider.is-medium:not([orient=vertical])::-webkit-slider-runnable-track{height:.625rem}input[type=range].slider.is-medium:not([orient=vertical])::-moz-range-track{height:.625rem}input[type=range].slider.is-medium:not([orient=vertical])::-ms-track{height:.625rem}input[type=range].slider.is-medium[orient=vertical]::-webkit-slider-runnable-track{width:.625rem}input[type=range].slider.is-medium[orient=vertical]::-moz-range-track{width:.625rem}input[type=range].slider.is-medium[orient=vertical]::-ms-track{width:.625rem}input[type=range].slider.is-medium::-webkit-slider-thumb{height:1.25rem;width:1.25rem}input[type=range].slider.is-medium::-moz-range-thumb{height:1.25rem;width:1.25rem}input[type=range].slider.is-medium::-ms-thumb{height:1.25rem;width:1.25rem}input[type=range].slider.is-medium::-ms-thumb{margin-top:0}input[type=range].slider.is-medium::-webkit-slider-thumb{margin-top:-.3125rem}input[type=range].slider.is-medium[orient=vertical]::-webkit-slider-thumb{margin-top:auto;margin-left:-.3125rem}input[type=range].slider.is-large:not([orient=vertical]){min-height:calc((1.5rem + 2px) * 
1.25)}input[type=range].slider.is-large:not([orient=vertical])::-webkit-slider-runnable-track{height:.75rem}input[type=range].slider.is-large:not([orient=vertical])::-moz-range-track{height:.75rem}input[type=range].slider.is-large:not([orient=vertical])::-ms-track{height:.75rem}input[type=range].slider.is-large[orient=vertical]::-webkit-slider-runnable-track{width:.75rem}input[type=range].slider.is-large[orient=vertical]::-moz-range-track{width:.75rem}input[type=range].slider.is-large[orient=vertical]::-ms-track{width:.75rem}input[type=range].slider.is-large::-webkit-slider-thumb{height:1.5rem;width:1.5rem}input[type=range].slider.is-large::-moz-range-thumb{height:1.5rem;width:1.5rem}input[type=range].slider.is-large::-ms-thumb{height:1.5rem;width:1.5rem}input[type=range].slider.is-large::-ms-thumb{margin-top:0}input[type=range].slider.is-large::-webkit-slider-thumb{margin-top:-.375rem}input[type=range].slider.is-large[orient=vertical]::-webkit-slider-thumb{margin-top:auto;margin-left:-.375rem}input[type=range].slider.is-white::-moz-range-track{background:#fff!important}input[type=range].slider.is-white::-webkit-slider-runnable-track{background:#fff!important}input[type=range].slider.is-white::-ms-track{background:#fff!important}input[type=range].slider.is-white::-ms-fill-lower{background:#fff}input[type=range].slider.is-white::-ms-fill-upper{background:#fff}input[type=range].slider.is-white .has-output-tooltip+output,input[type=range].slider.is-white.has-output+output{background-color:#fff;color:#0a0a0a}input[type=range].slider.is-black::-moz-range-track{background:#0a0a0a!important}input[type=range].slider.is-black::-webkit-slider-runnable-track{background:#0a0a0a!important}input[type=range].slider.is-black::-ms-track{background:#0a0a0a!important}input[type=range].slider.is-black::-ms-fill-lower{background:#0a0a0a}input[type=range].slider.is-black::-ms-fill-upper{background:#0a0a0a}input[type=range].slider.is-black 
.has-output-tooltip+output,input[type=range].slider.is-black.has-output+output{background-color:#0a0a0a;color:#fff}input[type=range].slider.is-light::-moz-range-track{background:#f5f5f5!important}input[type=range].slider.is-light::-webkit-slider-runnable-track{background:#f5f5f5!important}input[type=range].slider.is-light::-ms-track{background:#f5f5f5!important}input[type=range].slider.is-light::-ms-fill-lower{background:#f5f5f5}input[type=range].slider.is-light::-ms-fill-upper{background:#f5f5f5}input[type=range].slider.is-light .has-output-tooltip+output,input[type=range].slider.is-light.has-output+output{background-color:#f5f5f5;color:#363636}input[type=range].slider.is-dark::-moz-range-track{background:#363636!important}input[type=range].slider.is-dark::-webkit-slider-runnable-track{background:#363636!important}input[type=range].slider.is-dark::-ms-track{background:#363636!important}input[type=range].slider.is-dark::-ms-fill-lower{background:#363636}input[type=range].slider.is-dark::-ms-fill-upper{background:#363636}input[type=range].slider.is-dark .has-output-tooltip+output,input[type=range].slider.is-dark.has-output+output{background-color:#363636;color:#f5f5f5}input[type=range].slider.is-primary::-moz-range-track{background:#00d1b2!important}input[type=range].slider.is-primary::-webkit-slider-runnable-track{background:#00d1b2!important}input[type=range].slider.is-primary::-ms-track{background:#00d1b2!important}input[type=range].slider.is-primary::-ms-fill-lower{background:#00d1b2}input[type=range].slider.is-primary::-ms-fill-upper{background:#00d1b2}input[type=range].slider.is-primary 
.has-output-tooltip+output,input[type=range].slider.is-primary.has-output+output{background-color:#00d1b2;color:#fff}input[type=range].slider.is-link::-moz-range-track{background:#3273dc!important}input[type=range].slider.is-link::-webkit-slider-runnable-track{background:#3273dc!important}input[type=range].slider.is-link::-ms-track{background:#3273dc!important}input[type=range].slider.is-link::-ms-fill-lower{background:#3273dc}input[type=range].slider.is-link::-ms-fill-upper{background:#3273dc}input[type=range].slider.is-link .has-output-tooltip+output,input[type=range].slider.is-link.has-output+output{background-color:#3273dc;color:#fff}input[type=range].slider.is-info::-moz-range-track{background:#209cee!important}input[type=range].slider.is-info::-webkit-slider-runnable-track{background:#209cee!important}input[type=range].slider.is-info::-ms-track{background:#209cee!important}input[type=range].slider.is-info::-ms-fill-lower{background:#209cee}input[type=range].slider.is-info::-ms-fill-upper{background:#209cee}input[type=range].slider.is-info .has-output-tooltip+output,input[type=range].slider.is-info.has-output+output{background-color:#209cee;color:#fff}input[type=range].slider.is-success::-moz-range-track{background:#23d160!important}input[type=range].slider.is-success::-webkit-slider-runnable-track{background:#23d160!important}input[type=range].slider.is-success::-ms-track{background:#23d160!important}input[type=range].slider.is-success::-ms-fill-lower{background:#23d160}input[type=range].slider.is-success::-ms-fill-upper{background:#23d160}input[type=range].slider.is-success 
.has-output-tooltip+output,input[type=range].slider.is-success.has-output+output{background-color:#23d160;color:#fff}input[type=range].slider.is-warning::-moz-range-track{background:#ffdd57!important}input[type=range].slider.is-warning::-webkit-slider-runnable-track{background:#ffdd57!important}input[type=range].slider.is-warning::-ms-track{background:#ffdd57!important}input[type=range].slider.is-warning::-ms-fill-lower{background:#ffdd57}input[type=range].slider.is-warning::-ms-fill-upper{background:#ffdd57}input[type=range].slider.is-warning .has-output-tooltip+output,input[type=range].slider.is-warning.has-output+output{background-color:#ffdd57;color:rgba(0,0,0,.7)}input[type=range].slider.is-danger::-moz-range-track{background:#ff3860!important}input[type=range].slider.is-danger::-webkit-slider-runnable-track{background:#ff3860!important}input[type=range].slider.is-danger::-ms-track{background:#ff3860!important}input[type=range].slider.is-danger::-ms-fill-lower{background:#ff3860}input[type=range].slider.is-danger::-ms-fill-upper{background:#ff3860}input[type=range].slider.is-danger .has-output-tooltip+output,input[type=range].slider.is-danger.has-output+output{background-color:#ff3860;color:#fff} -------------------------------------------------------------------------------- /docs/static/css/comp-slider.css: -------------------------------------------------------------------------------- 1 | .comparison { 2 | position: relative; 3 | width: 1280px; 4 | height: 720px; 5 | border: 2px solid rgb(255, 255, 255); 6 | } 7 | 8 | .comparison .img { 9 | position: absolute; 10 | top: 0; 11 | left: 0; 12 | width: 100%; 13 | height: 100%; 14 | background-size: 1280px 100%; 15 | } 16 | 17 | .comparison .background-img { 18 | background-image: url("../images/scene0050_bf.png"); 19 | } 20 | 21 | .comparison .foreground-img { 22 | background-image: url("../images/scene0050_ours.png"); 23 | width: 50%; 24 | } 25 | 26 | .comparison .slider-line { 27 | position: absolute; 28 | 
-webkit-appearance: none; 29 | appearance: none; 30 | width: 100%; 31 | height: 100%; 32 | background: rgba(242, 242, 242, 0.1); 33 | outline: none; 34 | margin: 0; 35 | transition: all 0.2s; 36 | display: flex; 37 | justify-content: center; 38 | align-items: center; 39 | } 40 | 41 | .comparison .slider-line:hover { 42 | background: rgba(242, 242, 242, 0.0); 43 | } 44 | 45 | .comparison .slider-line::-webkit-slider-thumb { 46 | -webkit-appearance: none; 47 | appearance: none; 48 | width: 5px; 49 | height: 720px; 50 | background: white; 51 | cursor: pointer; 52 | } 53 | 54 | .comparison .slider-line::-moz-range-thumb { 55 | width: 6px; 56 | height: 720px; 57 | background: white; 58 | cursor: pointer; 59 | } 60 | 61 | .comparison .slider-button { 62 | pointer-events: none; 63 | position: absolute; 64 | width: 30px; 65 | height: 30px; 66 | border-radius: 50%; 67 | background-color: white; 68 | left: calc(50% - 15px); 69 | top: calc(50% - 15px); 70 | display: flex; 71 | justify-content: center; 72 | align-items: center; 73 | } 74 | 75 | .comparison .slider-button:after { 76 | content: ""; 77 | padding: 3px; 78 | display: inline-block; 79 | border: solid #5D5D5D; 80 | border-width: 0 2px 2px 0; 81 | transform: rotate(-45deg); 82 | } 83 | 84 | .comparison .slider-button:before { 85 | content: ""; 86 | padding: 3px; 87 | display: inline-block; 88 | border: solid #5D5D5D; 89 | border-width: 0 2px 2px 0; 90 | transform: rotate(135deg); 91 | } -------------------------------------------------------------------------------- /docs/static/css/index.css: -------------------------------------------------------------------------------- 1 | body { 2 | font-family: 'Noto Sans', sans-serif; 3 | } 4 | 5 | 6 | .footer .icon-link { 7 | font-size: 25px; 8 | color: #000; 9 | } 10 | 11 | .link-block a { 12 | margin-top: 5px; 13 | margin-bottom: 5px; 14 | } 15 | 16 | 17 | .teaser .hero-body { 18 | padding-top: 0; 19 | padding-bottom: 3rem; 20 | } 21 | 22 | .teaser { 23 | font-family: 
'Google Sans', sans-serif; 24 | } 25 | 26 | 27 | .publication-title { 28 | } 29 | 30 | .publication-banner { 31 | max-height: parent; 32 | 33 | } 34 | 35 | .publication-banner video { 36 | position: relative; 37 | left: auto; 38 | top: auto; 39 | transform: none; 40 | object-fit: cover; 41 | } 42 | 43 | .publication-header .hero-body { 44 | } 45 | 46 | .publication-title { 47 | font-family: 'Google Sans', sans-serif; 48 | } 49 | 50 | .publication-authors { 51 | font-family: 'Google Sans', sans-serif; 52 | } 53 | 54 | .publication-venue { 55 | color: #555; 56 | width: fit-content; 57 | font-weight: bold; 58 | } 59 | 60 | .publication-awards { 61 | color: #ff3860; 62 | width: fit-content; 63 | font-weight: bolder; 64 | } 65 | 66 | .publication-authors { 67 | } 68 | 69 | .publication-authors a { 70 | color: hsl(204, 86%, 53%) !important; 71 | } 72 | 73 | .publication-authors a:hover { 74 | text-decoration: underline; 75 | } 76 | 77 | .author-block { 78 | display: inline-block; 79 | margin: 6px; 80 | } 81 | 82 | .publication-banner img { 83 | } 84 | 85 | .publication-authors { 86 | /*color: #4286f4;*/ 87 | } 88 | 89 | .publication-video { 90 | position: relative; 91 | width: 100%; 92 | height: 0; 93 | padding-bottom: 56.25%; 94 | 95 | overflow: hidden; 96 | border-radius: 10px !important; 97 | } 98 | 99 | .publication-video iframe { 100 | position: absolute; 101 | top: 0; 102 | left: 0; 103 | width: 100%; 104 | height: 100%; 105 | } 106 | 107 | .publication-body img { 108 | } 109 | 110 | .results-carousel { 111 | overflow: hidden; 112 | } 113 | 114 | .results-carousel .item { 115 | margin: 5px; 116 | overflow: hidden; 117 | border: 1px solid #bbb; 118 | border-radius: 10px; 119 | padding: 0; 120 | font-size: 0; 121 | } 122 | 123 | .results-carousel video { 124 | margin: 0; 125 | } 126 | 127 | 128 | .interpolation-panel { 129 | background: #f5f5f5; 130 | border-radius: 10px; 131 | } 132 | 133 | .interpolation-panel .interpolation-image { 134 | width: 100%; 135 | 
border-radius: 5px; 136 | } 137 | 138 | .interpolation-video-column { 139 | } 140 | 141 | .interpolation-panel .slider { 142 | margin: 0 !important; 143 | } 144 | 145 | .interpolation-panel .slider { 146 | margin: 0 !important; 147 | } 148 | 149 | #interpolation-image-wrapper { 150 | width: 100%; 151 | } 152 | #interpolation-image-wrapper img { 153 | border-radius: 5px; 154 | } 155 | -------------------------------------------------------------------------------- /docs/static/css/juxtapose.css: -------------------------------------------------------------------------------- 1 | /* juxtapose - v1.2.2 - 2020-09-03 2 | * Copyright (c) 2020 Alex Duner and Northwestern University Knight Lab 3 | */ 4 | div.juxtapose { 5 | width: 100%; 6 | font-family: Helvetica, Arial, sans-serif; 7 | } 8 | 9 | div.jx-slider { 10 | width: 100%; 11 | height: 100%; 12 | position: relative; 13 | overflow: hidden; 14 | cursor: pointer; 15 | color: #f3f3f3; 16 | } 17 | 18 | 19 | div.jx-handle { 20 | position: absolute; 21 | height: 100%; 22 | width: 40px; 23 | cursor: col-resize; 24 | z-index: 15; 25 | margin-left: -20px; 26 | } 27 | 28 | .vertical div.jx-handle { 29 | height: 40px; 30 | width: 100%; 31 | cursor: row-resize; 32 | margin-top: -20px; 33 | margin-left: 0; 34 | } 35 | 36 | div.jx-control { 37 | height: 100%; 38 | margin-right: auto; 39 | margin-left: auto; 40 | width: 3px; 41 | background-color: currentColor; 42 | } 43 | 44 | .vertical div.jx-control { 45 | height: 3px; 46 | width: 100%; 47 | background-color: currentColor; 48 | position: relative; 49 | top: 50%; 50 | transform: translateY(-50%); 51 | } 52 | 53 | div.jx-controller { 54 | position: absolute; 55 | margin: auto; 56 | top: 0; 57 | bottom: 0; 58 | height: 60px; 59 | width: 9px; 60 | margin-left: -3px; 61 | background-color: currentColor; 62 | } 63 | 64 | .vertical div.jx-controller { 65 | height: 9px; 66 | width: 100px; 67 | margin-left: auto; 68 | margin-right: auto; 69 | top: -3px; 70 | position: relative; 71 | } 72 
| 73 | div.jx-arrow { 74 | position: absolute; 75 | margin: auto; 76 | top: 0; 77 | bottom: 0; 78 | width: 0; 79 | height: 0; 80 | transition: all .2s ease; 81 | } 82 | 83 | .vertical div.jx-arrow { 84 | position: absolute; 85 | margin: 0 auto; 86 | left: 0; 87 | right: 0; 88 | width: 0; 89 | height: 0; 90 | transition: all .2s ease; 91 | } 92 | 93 | 94 | div.jx-arrow.jx-left { 95 | left: 2px; 96 | border-style: solid; 97 | border-width: 8px 8px 8px 0; 98 | border-color: transparent currentColor transparent transparent; 99 | } 100 | 101 | div.jx-arrow.jx-right { 102 | right: 2px; 103 | border-style: solid; 104 | border-width: 8px 0 8px 8px; 105 | border-color: transparent transparent transparent currentColor; 106 | } 107 | 108 | .vertical div.jx-arrow.jx-left { 109 | left: 0px; 110 | top: 2px; 111 | border-style: solid; 112 | border-width: 0px 8px 8px 8px; 113 | border-color: transparent transparent currentColor transparent; 114 | } 115 | 116 | .vertical div.jx-arrow.jx-right { 117 | right: 0px; 118 | top: auto; 119 | bottom: 2px; 120 | border-style: solid; 121 | border-width: 8px 8px 0 8px; 122 | border-color: currentColor transparent transparent transparent; 123 | } 124 | 125 | div.jx-handle:hover div.jx-arrow.jx-left, 126 | div.jx-handle:active div.jx-arrow.jx-left { 127 | left: -1px; 128 | } 129 | 130 | div.jx-handle:hover div.jx-arrow.jx-right, 131 | div.jx-handle:active div.jx-arrow.jx-right { 132 | right: -1px; 133 | } 134 | 135 | .vertical div.jx-handle:hover div.jx-arrow.jx-left, 136 | .vertical div.jx-handle:active div.jx-arrow.jx-left { 137 | left: 0px; 138 | top: 0px; 139 | } 140 | 141 | .vertical div.jx-handle:hover div.jx-arrow.jx-right, 142 | .vertical div.jx-handle:active div.jx-arrow.jx-right { 143 | right: 0px; 144 | bottom: 0px; 145 | } 146 | 147 | 148 | div.jx-image { 149 | position: absolute; 150 | height: 100%; 151 | display: inline-block; 152 | top: 0; 153 | overflow: hidden; 154 | -webkit-backface-visibility: hidden; 155 | } 156 | 157 | 
.vertical div.jx-image { 158 | width: 100%; 159 | left: 0; 160 | top: auto; 161 | } 162 | 163 | div.jx-image img { 164 | height: 100%; 165 | width: auto; 166 | z-index: 5; 167 | position: absolute; 168 | margin-bottom: 0; 169 | 170 | max-height: none; 171 | max-width: none; 172 | max-height: initial; 173 | max-width: initial; 174 | } 175 | 176 | .vertical div.jx-image img { 177 | height: auto; 178 | width: 100%; 179 | } 180 | 181 | div.jx-image.jx-left { 182 | left: 0; 183 | background-position: left; 184 | } 185 | 186 | div.jx-image.jx-left img { 187 | left: 0; 188 | } 189 | 190 | div.jx-image.jx-right { 191 | right: 0; 192 | background-position: right; 193 | } 194 | 195 | div.jx-image.jx-right img { 196 | right: 0; 197 | bottom: 0; 198 | } 199 | 200 | 201 | .vertical div.jx-image.jx-left { 202 | top: 0; 203 | background-position: top; 204 | } 205 | 206 | .vertical div.jx-image.jx-left img { 207 | top: 0; 208 | } 209 | 210 | .vertical div.jx-image.jx-right { 211 | bottom: 0; 212 | background-position: bottom; 213 | } 214 | 215 | .vertical div.jx-image.jx-right img { 216 | bottom: 0; 217 | } 218 | 219 | 220 | div.jx-image div.jx-label { 221 | font-size: 1em; 222 | padding: .25em .75em; 223 | position: relative; 224 | display: inline-block; 225 | top: 0; 226 | background-color: #000; /* IE 8 */ 227 | background-color: rgba(0,0,0,.0); 228 | color: white; 229 | z-index: 10; 230 | white-space: nowrap; 231 | line-height: 18px; 232 | vertical-align: middle; 233 | } 234 | 235 | div.jx-image.jx-left div.jx-label { 236 | float: left; 237 | left: 0; 238 | } 239 | 240 | div.jx-image.jx-right div.jx-label { 241 | float: right; 242 | right: 0; 243 | } 244 | 245 | .vertical div.jx-image div.jx-label { 246 | display: table; 247 | position: absolute; 248 | } 249 | 250 | .vertical div.jx-image.jx-right div.jx-label { 251 | left: 0; 252 | bottom: 0; 253 | top: auto; 254 | } 255 | 256 | div.jx-credit { 257 | line-height: 1.1; 258 | font-size: 0.75em; 259 | } 260 | 261 | div.jx-credit 
em { 262 | font-weight: bold; 263 | font-style: normal; 264 | } 265 | 266 | 267 | /* Animation */ 268 | 269 | div.jx-image.transition { 270 | transition: width .5s ease; 271 | } 272 | 273 | div.jx-handle.transition { 274 | transition: left .5s ease; 275 | } 276 | 277 | .vertical div.jx-image.transition { 278 | transition: height .5s ease; 279 | } 280 | 281 | .vertical div.jx-handle.transition { 282 | transition: top .5s ease; 283 | } 284 | 285 | /* Knight Lab Credit */ 286 | a.jx-knightlab { 287 | background-color: #000; /* IE 8 */ 288 | background-color: rgba(0,0,0,.25); 289 | bottom: 0; 290 | display: none; 291 | height: 14px; 292 | line-height: 14px; 293 | padding: 1px 4px 1px 5px; 294 | position: absolute; 295 | right: 0; 296 | text-decoration: none; 297 | z-index: 10; 298 | } 299 | 300 | a.jx-knightlab div.knightlab-logo { 301 | display: inline-block; 302 | vertical-align: middle; 303 | height: 8px; 304 | width: 8px; 305 | background-color: #c34528; 306 | transform: rotate(45deg); 307 | -ms-transform: rotate(45deg); 308 | -webkit-transform: rotate(45deg); 309 | top: -1.25px; 310 | position: relative; 311 | cursor: pointer; 312 | } 313 | 314 | a.jx-knightlab:hover { 315 | background-color: #000; /* IE 8 */ 316 | background-color: rgba(0,0,0,.35); 317 | } 318 | a.jx-knightlab:hover div.knightlab-logo { 319 | background-color: #ce4d28; 320 | } 321 | 322 | a.jx-knightlab span.juxtapose-name { 323 | display: table-cell; 324 | margin: 0; 325 | padding: 0; 326 | font-family: Helvetica, Arial, sans-serif; 327 | font-weight: 300; 328 | color: white; 329 | font-size: 10px; 330 | padding-left: 0.375em; 331 | vertical-align: middle; 332 | line-height: normal; 333 | text-shadow: none; 334 | } 335 | 336 | /* keyboard accessibility */ 337 | div.jx-controller:focus, 338 | div.jx-image.jx-left div.jx-label:focus, 339 | div.jx-image.jx-right div.jx-label:focus, 340 | a.jx-knightlab:focus { 341 | background: #eae34a; 342 | color: #000; 343 | } 344 | a.jx-knightlab:focus 
span.juxtapose-name{ 345 | color: #000; 346 | border: none; 347 | } 348 | -------------------------------------------------------------------------------- /docs/static/images/scene0002_bf.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dazinovic/neural-rgbd-surface-reconstruction/d81f106471475ba720c961ff274937568833c1ad/docs/static/images/scene0002_bf.png -------------------------------------------------------------------------------- /docs/static/images/scene0002_ours.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dazinovic/neural-rgbd-surface-reconstruction/d81f106471475ba720c961ff274937568833c1ad/docs/static/images/scene0002_ours.png -------------------------------------------------------------------------------- /docs/static/images/scene0050_bf.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dazinovic/neural-rgbd-surface-reconstruction/d81f106471475ba720c961ff274937568833c1ad/docs/static/images/scene0050_bf.png -------------------------------------------------------------------------------- /docs/static/images/scene0050_ours.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dazinovic/neural-rgbd-surface-reconstruction/d81f106471475ba720c961ff274937568833c1ad/docs/static/images/scene0050_ours.png -------------------------------------------------------------------------------- /docs/static/images/teaser.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/dazinovic/neural-rgbd-surface-reconstruction/d81f106471475ba720c961ff274937568833c1ad/docs/static/images/teaser.jpg -------------------------------------------------------------------------------- /docs/static/js/bulma-slider.js: 
-------------------------------------------------------------------------------- 1 | (function webpackUniversalModuleDefinition(root, factory) { 2 | if(typeof exports === 'object' && typeof module === 'object') 3 | module.exports = factory(); 4 | else if(typeof define === 'function' && define.amd) 5 | define([], factory); 6 | else if(typeof exports === 'object') 7 | exports["bulmaSlider"] = factory(); 8 | else 9 | root["bulmaSlider"] = factory(); 10 | })(typeof self !== 'undefined' ? self : this, function() { 11 | return /******/ (function(modules) { // webpackBootstrap 12 | /******/ // The module cache 13 | /******/ var installedModules = {}; 14 | /******/ 15 | /******/ // The require function 16 | /******/ function __webpack_require__(moduleId) { 17 | /******/ 18 | /******/ // Check if module is in cache 19 | /******/ if(installedModules[moduleId]) { 20 | /******/ return installedModules[moduleId].exports; 21 | /******/ } 22 | /******/ // Create a new module (and put it into the cache) 23 | /******/ var module = installedModules[moduleId] = { 24 | /******/ i: moduleId, 25 | /******/ l: false, 26 | /******/ exports: {} 27 | /******/ }; 28 | /******/ 29 | /******/ // Execute the module function 30 | /******/ modules[moduleId].call(module.exports, module, module.exports, __webpack_require__); 31 | /******/ 32 | /******/ // Flag the module as loaded 33 | /******/ module.l = true; 34 | /******/ 35 | /******/ // Return the exports of the module 36 | /******/ return module.exports; 37 | /******/ } 38 | /******/ 39 | /******/ 40 | /******/ // expose the modules object (__webpack_modules__) 41 | /******/ __webpack_require__.m = modules; 42 | /******/ 43 | /******/ // expose the module cache 44 | /******/ __webpack_require__.c = installedModules; 45 | /******/ 46 | /******/ // define getter function for harmony exports 47 | /******/ __webpack_require__.d = function(exports, name, getter) { 48 | /******/ if(!__webpack_require__.o(exports, name)) { 49 | /******/ 
Object.defineProperty(exports, name, { 50 | /******/ configurable: false, 51 | /******/ enumerable: true, 52 | /******/ get: getter 53 | /******/ }); 54 | /******/ } 55 | /******/ }; 56 | /******/ 57 | /******/ // getDefaultExport function for compatibility with non-harmony modules 58 | /******/ __webpack_require__.n = function(module) { 59 | /******/ var getter = module && module.__esModule ? 60 | /******/ function getDefault() { return module['default']; } : 61 | /******/ function getModuleExports() { return module; }; 62 | /******/ __webpack_require__.d(getter, 'a', getter); 63 | /******/ return getter; 64 | /******/ }; 65 | /******/ 66 | /******/ // Object.prototype.hasOwnProperty.call 67 | /******/ __webpack_require__.o = function(object, property) { return Object.prototype.hasOwnProperty.call(object, property); }; 68 | /******/ 69 | /******/ // __webpack_public_path__ 70 | /******/ __webpack_require__.p = ""; 71 | /******/ 72 | /******/ // Load entry module and return exports 73 | /******/ return __webpack_require__(__webpack_require__.s = 0); 74 | /******/ }) 75 | /************************************************************************/ 76 | /******/ ([ 77 | /* 0 */ 78 | /***/ (function(module, __webpack_exports__, __webpack_require__) { 79 | 80 | "use strict"; 81 | Object.defineProperty(__webpack_exports__, "__esModule", { value: true }); 82 | /* harmony export (binding) */ __webpack_require__.d(__webpack_exports__, "isString", function() { return isString; }); 83 | /* harmony import */ var __WEBPACK_IMPORTED_MODULE_0__events__ = __webpack_require__(1); 84 | var _extends = Object.assign || function (target) { for (var i = 1; i < arguments.length; i++) { var source = arguments[i]; for (var key in source) { if (Object.prototype.hasOwnProperty.call(source, key)) { target[key] = source[key]; } } } return target; }; 85 | 86 | var _createClass = function () { function defineProperties(target, props) { for (var i = 0; i < props.length; i++) { var descriptor = 
props[i]; descriptor.enumerable = descriptor.enumerable || false; descriptor.configurable = true; if ("value" in descriptor) descriptor.writable = true; Object.defineProperty(target, descriptor.key, descriptor); } } return function (Constructor, protoProps, staticProps) { if (protoProps) defineProperties(Constructor.prototype, protoProps); if (staticProps) defineProperties(Constructor, staticProps); return Constructor; }; }(); 87 | 88 | var _typeof = typeof Symbol === "function" && typeof Symbol.iterator === "symbol" ? function (obj) { return typeof obj; } : function (obj) { return obj && typeof Symbol === "function" && obj.constructor === Symbol && obj !== Symbol.prototype ? "symbol" : typeof obj; }; 89 | 90 | function _classCallCheck(instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError("Cannot call a class as a function"); } } 91 | 92 | function _possibleConstructorReturn(self, call) { if (!self) { throw new ReferenceError("this hasn't been initialised - super() hasn't been called"); } return call && (typeof call === "object" || typeof call === "function") ? call : self; } 93 | 94 | function _inherits(subClass, superClass) { if (typeof superClass !== "function" && superClass !== null) { throw new TypeError("Super expression must either be null or a function, not " + typeof superClass); } subClass.prototype = Object.create(superClass && superClass.prototype, { constructor: { value: subClass, enumerable: false, writable: true, configurable: true } }); if (superClass) Object.setPrototypeOf ? Object.setPrototypeOf(subClass, superClass) : subClass.__proto__ = superClass; } 95 | 96 | 97 | 98 | var isString = function isString(unknown) { 99 | return typeof unknown === 'string' || !!unknown && (typeof unknown === 'undefined' ? 
'undefined' : _typeof(unknown)) === 'object' && Object.prototype.toString.call(unknown) === '[object String]'; 100 | }; 101 | 102 | var bulmaSlider = function (_EventEmitter) { 103 | _inherits(bulmaSlider, _EventEmitter); 104 | 105 | function bulmaSlider(selector) { 106 | var options = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : {}; 107 | 108 | _classCallCheck(this, bulmaSlider); 109 | 110 | var _this = _possibleConstructorReturn(this, (bulmaSlider.__proto__ || Object.getPrototypeOf(bulmaSlider)).call(this)); 111 | 112 | _this.element = typeof selector === 'string' ? document.querySelector(selector) : selector; 113 | // An invalid selector or non-DOM node has been provided. 114 | if (!_this.element) { 115 | throw new Error('An invalid selector or non-DOM node has been provided.'); 116 | } 117 | 118 | _this._clickEvents = ['click']; 119 | /// Set default options and merge with instance defined 120 | _this.options = _extends({}, options); 121 | 122 | _this.onSliderInput = _this.onSliderInput.bind(_this); 123 | 124 | _this.init(); 125 | return _this; 126 | } 127 | 128 | /** 129 | * Initiate all DOM element containing selector 130 | * @method 131 | * @return {Array} Array of all slider instances 132 | */ 133 | 134 | 135 | _createClass(bulmaSlider, [{ 136 | key: 'init', 137 | 138 | 139 | /** 140 | * Initiate plugin 141 | * @method init 142 | * @return {void} 143 | */ 144 | value: function init() { 145 | this._id = 'bulmaSlider' + new Date().getTime() + Math.floor(Math.random() * Math.floor(9999)); 146 | this.output = this._findOutputForSlider(); 147 | 148 | this._bindEvents(); 149 | 150 | if (this.output) { 151 | if (this.element.classList.contains('has-output-tooltip')) { 152 | // Get new output position 153 | var newPosition = this._getSliderOutputPosition(); 154 | 155 | // Set output position 156 | this.output.style['left'] = newPosition.position; 157 | } 158 | } 159 | 160 | this.emit('bulmaslider:ready', this.element.value); 161 | } 162 | }, 
{ 163 | key: '_findOutputForSlider', 164 | value: function _findOutputForSlider() { 165 | var _this2 = this; 166 | 167 | var result = null; 168 | var outputs = document.getElementsByTagName('output') || []; 169 | 170 | Array.from(outputs).forEach(function (output) { 171 | if (output.htmlFor == _this2.element.getAttribute('id')) { 172 | result = output; 173 | return true; 174 | } 175 | }); 176 | return result; 177 | } 178 | }, { 179 | key: '_getSliderOutputPosition', 180 | value: function _getSliderOutputPosition() { 181 | // Update output position 182 | var newPlace, minValue; 183 | 184 | var style = window.getComputedStyle(this.element, null); 185 | // Measure width of range input 186 | var sliderWidth = parseInt(style.getPropertyValue('width'), 10); 187 | 188 | // Figure out placement percentage between left and right of input 189 | if (!this.element.getAttribute('min')) { 190 | minValue = 0; 191 | } else { 192 | minValue = this.element.getAttribute('min'); 193 | } 194 | var newPoint = (this.element.value - minValue) / (this.element.getAttribute('max') - minValue); 195 | 196 | // Prevent bubble from going beyond left or right (unsupported browsers) 197 | if (newPoint < 0) { 198 | newPlace = 0; 199 | } else if (newPoint > 1) { 200 | newPlace = sliderWidth; 201 | } else { 202 | newPlace = sliderWidth * newPoint; 203 | } 204 | 205 | return { 206 | 'position': newPlace + 'px' 207 | }; 208 | } 209 | 210 | /** 211 | * Bind all events 212 | * @method _bindEvents 213 | * @return {void} 214 | */ 215 | 216 | }, { 217 | key: '_bindEvents', 218 | value: function _bindEvents() { 219 | if (this.output) { 220 | // Add event listener to update output when slider value change 221 | this.element.addEventListener('input', this.onSliderInput, false); 222 | } 223 | } 224 | }, { 225 | key: 'onSliderInput', 226 | value: function onSliderInput(e) { 227 | e.preventDefault(); 228 | 229 | if (this.element.classList.contains('has-output-tooltip')) { 230 | // Get new output position 231 | 
var newPosition = this._getSliderOutputPosition(); 232 | 233 | // Set output position 234 | this.output.style['left'] = newPosition.position; 235 | } 236 | 237 | // Check for prefix and postfix 238 | var prefix = this.output.hasAttribute('data-prefix') ? this.output.getAttribute('data-prefix') : ''; 239 | var postfix = this.output.hasAttribute('data-postfix') ? this.output.getAttribute('data-postfix') : ''; 240 | 241 | // Update output with slider value 242 | this.output.value = prefix + this.element.value + postfix; 243 | 244 | this.emit('bulmaslider:ready', this.element.value); 245 | } 246 | }], [{ 247 | key: 'attach', 248 | value: function attach() { 249 | var _this3 = this; 250 | 251 | var selector = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : 'input[type="range"].slider'; 252 | var options = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : {}; 253 | 254 | var instances = new Array(); 255 | 256 | var elements = isString(selector) ? document.querySelectorAll(selector) : Array.isArray(selector) ? 
selector : [selector]; 257 | elements.forEach(function (element) { 258 | if (typeof element[_this3.constructor.name] === 'undefined') { 259 | var instance = new bulmaSlider(element, options); 260 | element[_this3.constructor.name] = instance; 261 | instances.push(instance); 262 | } else { 263 | instances.push(element[_this3.constructor.name]); 264 | } 265 | }); 266 | 267 | return instances; 268 | } 269 | }]); 270 | 271 | return bulmaSlider; 272 | }(__WEBPACK_IMPORTED_MODULE_0__events__["a" /* default */]); 273 | 274 | /* harmony default export */ __webpack_exports__["default"] = (bulmaSlider); 275 | 276 | /***/ }), 277 | /* 1 */ 278 | /***/ (function(module, __webpack_exports__, __webpack_require__) { 279 | 280 | "use strict"; 281 | var _createClass = function () { function defineProperties(target, props) { for (var i = 0; i < props.length; i++) { var descriptor = props[i]; descriptor.enumerable = descriptor.enumerable || false; descriptor.configurable = true; if ("value" in descriptor) descriptor.writable = true; Object.defineProperty(target, descriptor.key, descriptor); } } return function (Constructor, protoProps, staticProps) { if (protoProps) defineProperties(Constructor.prototype, protoProps); if (staticProps) defineProperties(Constructor, staticProps); return Constructor; }; }(); 282 | 283 | function _classCallCheck(instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError("Cannot call a class as a function"); } } 284 | 285 | var EventEmitter = function () { 286 | function EventEmitter() { 287 | var listeners = arguments.length > 0 && arguments[0] !== undefined ? 
arguments[0] : []; 288 | 289 | _classCallCheck(this, EventEmitter); 290 | 291 | this._listeners = new Map(listeners); 292 | this._middlewares = new Map(); 293 | } 294 | 295 | _createClass(EventEmitter, [{ 296 | key: "listenerCount", 297 | value: function listenerCount(eventName) { 298 | if (!this._listeners.has(eventName)) { 299 | return 0; 300 | } 301 | 302 | var eventListeners = this._listeners.get(eventName); 303 | return eventListeners.length; 304 | } 305 | }, { 306 | key: "removeListeners", 307 | value: function removeListeners() { 308 | var _this = this; 309 | 310 | var eventName = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : null; 311 | var middleware = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : false; 312 | 313 | if (eventName !== null) { 314 | if (Array.isArray(eventName)) { 315 | eventName.forEach(function (e) { 316 | return _this.removeListeners(e, middleware); 317 | }); 318 | } else { 319 | this._listeners.delete(eventName); 320 | 321 | if (middleware) { 322 | this.removeMiddleware(eventName); 323 | } 324 | } 325 | } else { 326 | this._listeners = new Map(); 327 | } 328 | } 329 | }, { 330 | key: "middleware", 331 | value: function middleware(eventName, fn) { 332 | var _this2 = this; 333 | 334 | if (Array.isArray(eventName)) { 335 | eventName.forEach(function (e) { 336 | return _this2.middleware(e, fn); 337 | }); 338 | } else { 339 | if (!Array.isArray(this._middlewares.get(eventName))) { 340 | this._middlewares.set(eventName, []); 341 | } 342 | 343 | this._middlewares.get(eventName).push(fn); 344 | } 345 | } 346 | }, { 347 | key: "removeMiddleware", 348 | value: function removeMiddleware() { 349 | var _this3 = this; 350 | 351 | var eventName = arguments.length > 0 && arguments[0] !== undefined ?
arguments[0] : null; 352 | 353 | if (eventName !== null) { 354 | if (Array.isArray(eventName)) { 355 | eventName.forEach(function (e) { 356 | return _this3.removeMiddleware(e); 357 | }); 358 | } else { 359 | this._middlewares.delete(eventName); 360 | } 361 | } else { 362 | this._middlewares = new Map(); 363 | } 364 | } 365 | }, { 366 | key: "on", 367 | value: function on(name, callback) { 368 | var _this4 = this; 369 | 370 | var once = arguments.length > 2 && arguments[2] !== undefined ? arguments[2] : false; 371 | 372 | if (Array.isArray(name)) { 373 | name.forEach(function (e) { 374 | return _this4.on(e, callback); 375 | }); 376 | } else { 377 | name = name.toString(); 378 | var split = name.split(/,|, | /); 379 | 380 | if (split.length > 1) { 381 | split.forEach(function (e) { 382 | return _this4.on(e, callback); 383 | }); 384 | } else { 385 | if (!Array.isArray(this._listeners.get(name))) { 386 | this._listeners.set(name, []); 387 | } 388 | 389 | this._listeners.get(name).push({ once: once, callback: callback }); 390 | } 391 | } 392 | } 393 | }, { 394 | key: "once", 395 | value: function once(name, callback) { 396 | this.on(name, callback, true); 397 | } 398 | }, { 399 | key: "emit", 400 | value: function emit(name, data) { 401 | var _this5 = this; 402 | 403 | var silent = arguments.length > 2 && arguments[2] !== undefined ?
arguments[2] : false; 404 | 405 | name = name.toString(); 406 | var listeners = this._listeners.get(name); 407 | var middlewares = null; 408 | var doneCount = 0; 409 | var execute = silent; 410 | 411 | if (Array.isArray(listeners)) { 412 | listeners.forEach(function (listener, index) { 413 | // Start Middleware checks unless we're doing a silent emit 414 | if (!silent) { 415 | middlewares = _this5._middlewares.get(name); 416 | // Check and execute Middleware 417 | if (Array.isArray(middlewares)) { 418 | middlewares.forEach(function (middleware) { 419 | middleware(data, function () { 420 | var newData = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : null; 421 | 422 | if (newData !== null) { 423 | data = newData; 424 | } 425 | doneCount++; 426 | }, name); 427 | }); 428 | 429 | if (doneCount >= middlewares.length) { 430 | execute = true; 431 | } 432 | } else { 433 | execute = true; 434 | } 435 | } 436 | 437 | // If Middleware checks have been passed, execute 438 | if (execute) { 439 | if (listener.once) { 440 | listeners[index] = null; 441 | } 442 | listener.callback(data); 443 | } 444 | }); 445 | 446 | // Dirty way of removing used Events 447 | while (listeners.indexOf(null) !== -1) { 448 | listeners.splice(listeners.indexOf(null), 1); 449 | } 450 | } 451 | } 452 | }]); 453 | 454 | return EventEmitter; 455 | }(); 456 | 457 | /* harmony default export */ __webpack_exports__["a"] = (EventEmitter); 458 | 459 | /***/ }) 460 | /******/ ])["default"]; 461 | }); -------------------------------------------------------------------------------- /docs/static/js/bulma-slider.min.js: -------------------------------------------------------------------------------- 1 | !function(t,e){"object"==typeof exports&&"object"==typeof module?module.exports=e():"function"==typeof define&&define.amd?define([],e):"object"==typeof exports?exports.bulmaSlider=e():t.bulmaSlider=e()}("undefined"!=typeof self?self:this,function(){return function(n){var r={};function 
i(t){if(r[t])return r[t].exports;var e=r[t]={i:t,l:!1,exports:{}};return n[t].call(e.exports,e,e.exports,i),e.l=!0,e.exports}return i.m=n,i.c=r,i.d=function(t,e,n){i.o(t,e)||Object.defineProperty(t,e,{configurable:!1,enumerable:!0,get:n})},i.n=function(t){var e=t&&t.__esModule?function(){return t.default}:function(){return t};return i.d(e,"a",e),e},i.o=function(t,e){return Object.prototype.hasOwnProperty.call(t,e)},i.p="",i(i.s=0)}([function(t,e,n){"use strict";Object.defineProperty(e,"__esModule",{value:!0}),n.d(e,"isString",function(){return l});var r=n(1),i=Object.assign||function(t){for(var e=1;e=l.length&&(s=!0)):s=!0),s&&(t.once&&(u[e]=null),t.callback(r))});-1!==u.indexOf(null);)u.splice(u.indexOf(null),1)}}]),e}();e.a=i}]).default}); -------------------------------------------------------------------------------- /docs/static/js/index.js: -------------------------------------------------------------------------------- 1 | window.HELP_IMPROVE_VIDEOJS = false; 2 | 3 | var INTERP_BASE = "https://homes.cs.washington.edu/~kpar/nerfies/interpolation/stacked"; 4 | var NUM_INTERP_FRAMES = 240; 5 | 6 | var interp_images = []; 7 | function preloadInterpolationImages() { 8 | for (var i = 0; i < NUM_INTERP_FRAMES; i++) { 9 | var path = INTERP_BASE + '/' + String(i).padStart(6, '0') + '.jpg'; 10 | interp_images[i] = new Image(); 11 | interp_images[i].src = path; 12 | } 13 | } 14 | 15 | function setInterpolationImage(i) { 16 | var image = interp_images[i]; 17 | image.ondragstart = function() { return false; }; 18 | image.oncontextmenu = function() { return false; }; 19 | $('#interpolation-image-wrapper').empty().append(image); 20 | } 21 | 22 | 23 | $(document).ready(function() { 24 | // Check for click events on the navbar burger icon 25 | $(".navbar-burger").click(function() { 26 | // Toggle the "is-active" class on both the "navbar-burger" and the "navbar-menu" 27 | $(".navbar-burger").toggleClass("is-active"); 28 | $(".navbar-menu").toggleClass("is-active"); 29 | 30 | 
}); 31 | 32 | var options = { 33 | slidesToScroll: 1, 34 | slidesToShow: 3, 35 | loop: true, 36 | infinite: true, 37 | autoplay: false, 38 | autoplaySpeed: 3000, 39 | } 40 | 41 | // Initialize all div with carousel class 42 | var carousels = bulmaCarousel.attach('.carousel', options); 43 | 44 | // Loop on each carousel initialized 45 | for(var i = 0; i < carousels.length; i++) { 46 | // Add listener to event 47 | carousels[i].on('before:show', state => { 48 | console.log(state); 49 | }); 50 | } 51 | 52 | // Access to bulmaCarousel instance of an element 53 | var element = document.querySelector('#my-element'); 54 | if (element && element.bulmaCarousel) { 55 | // bulmaCarousel instance is available as element.bulmaCarousel 56 | element.bulmaCarousel.on('before-show', function(state) { 57 | console.log(state); 58 | }); 59 | } 60 | 61 | /*var player = document.getElementById('interpolation-video'); 62 | player.addEventListener('loadedmetadata', function() { 63 | $('#interpolation-slider').on('input', function(event) { 64 | console.log(this.value, player.duration); 65 | player.currentTime = player.duration / 100 * this.value; 66 | }) 67 | }, false);*/ 68 | preloadInterpolationImages(); 69 | 70 | $('#interpolation-slider').on('input', function(event) { 71 | setInterpolationImage(this.value); 72 | }); 73 | setInterpolationImage(0); 74 | $('#interpolation-slider').prop('max', NUM_INTERP_FRAMES - 1); 75 | 76 | bulmaSlider.attach(); 77 | 78 | $("#s50slider").on("input change", (e)=>{ 79 | const sliderPos = e.target.value * 100 / e.target.max; 80 | // Update the width of the foreground image 81 | $('.foreground-img').css('width', `${sliderPos}%`) 82 | // Update the position of the slider button 83 | $('.slider-button').css('left', `calc(${sliderPos}% - 15px)`) 84 | }); 85 | 86 | }) -------------------------------------------------------------------------------- /docs/static/js/juxtapose.min.js: -------------------------------------------------------------------------------- 
1 | /* juxtapose - v1.2.2 - 2020-09-03 2 | * Copyright (c) 2020 Alex Duner and Northwestern University Knight Lab 3 | */ 4 | (function(document,window){var juxtapose={sliders:[],OPTIMIZATION_ACCEPTED:1,OPTIMIZATION_WAS_CONSTRAINED:2};var flickr_key="d90fc2d1f4acc584e08b8eaea5bf4d6c";var FLICKR_SIZE_PREFERENCES=["Large","Medium"];function Graphic(properties,slider){var self=this;this.image=new Image;this.loaded=false;this.image.onload=function(){self.loaded=true;slider._onLoaded()};this.image.src=properties.src;this.image.alt=properties.alt||"";this.label=properties.label||false;this.credit=properties.credit||false}function FlickrGraphic(properties,slider){var self=this;this.image=new Image;this.loaded=false;this.image.onload=function(){self.loaded=true;slider._onLoaded()};this.flickrID=this.getFlickrID(properties.src);this.callFlickrAPI(this.flickrID,self);this.label=properties.label||false;this.credit=properties.credit||false}FlickrGraphic.prototype={getFlickrID:function(url){if(url.match(/flic.kr\/.+/i)){var encoded=url.split("/").slice(-1)[0];return base58Decode(encoded)}var idx=url.indexOf("flickr.com/photos/");var pos=idx+"flickr.com/photos/".length;var photo_info=url.substr(pos);if(photo_info.indexOf("/")==-1)return null;if(photo_info.indexOf("/")===0)photo_info=photo_info.substr(1);id=photo_info.split("/")[1];return id},callFlickrAPI:function(id,self){var url="https://api.flickr.com/services/rest/?method=flickr.photos.getSizes"+"&api_key="+flickr_key+"&photo_id="+id+"&format=json&nojsoncallback=1";var request=new XMLHttpRequest;request.open("GET",url,true);request.onload=function(){if(request.status>=200&&request.status<400){data=JSON.parse(request.responseText);var flickr_url=self.bestFlickrUrl(data.sizes.size);self.setFlickrImage(flickr_url)}else{console.error("There was an error getting the picture from Flickr")}};request.onerror=function(){console.error("There was an error getting the picture from 
Flickr")};request.send()},setFlickrImage:function(src){this.image.src=src},bestFlickrUrl:function(ary){var dict={};for(var i=0;i0&&leftPercentNum<100){removeClass(this.handle,"transition");removeClass(this.rightImage,"transition");removeClass(this.leftImage,"transition");if(this.options.animate&&animate){addClass(this.handle,"transition");addClass(this.leftImage,"transition");addClass(this.rightImage,"transition")}if(this.options.mode==="vertical"){this.handle.style.top=leftPercent;this.leftImage.style.height=leftPercent;this.rightImage.style.height=rightPercent}else{this.handle.style.left=leftPercent;this.leftImage.style.width=leftPercent;this.rightImage.style.width=rightPercent}this.sliderPosition=leftPercent}},getPosition:function(){return this.sliderPosition},displayLabel:function(element,labelText){label=document.createElement("div");label.className="jx-label";label.setAttribute("tabindex",0);setText(label,labelText);element.appendChild(label)},displayCredits:function(){credit=document.createElement("div");credit.className="jx-credit";text="Photo Credits:";if(this.imgBefore.credit){text+=" Before "+this.imgBefore.credit}if(this.imgAfter.credit){text+=" After "+this.imgAfter.credit}credit.innerHTML=text;this.wrapper.appendChild(credit)},setStartingPosition:function(s){this.options.startingPosition=s},checkImages:function(){if(getImageDimensions(this.imgBefore.image).aspect()==getImageDimensions(this.imgAfter.image).aspect()){return true}else{return false}},calculateDims:function(width,height){var ratio=getImageDimensions(this.imgBefore.image).aspect();if(width){height=width/ratio}else if(height){width=height*ratio}return{width:width,height:height,ratio:ratio}},responsivizeIframe:function(dims){if(dims.height=1){this.wrapper.style.paddingTop=parseInt((window.innerHeight-dims.height)/2)+"px"}}else 
if(dims.height>window.innerHeight){dims=this.calculateDims(0,window.innerHeight);this.wrapper.style.paddingLeft=parseInt((window.innerWidth-dims.width)/2)+"px"}if(this.options.showCredits){dims.height-=13}return dims},setWrapperDimensions:function(){var wrapperWidth=getComputedWidthAndHeight(this.wrapper).width;var wrapperHeight=getComputedWidthAndHeight(this.wrapper).height;var dims=this.calculateDims(wrapperWidth,wrapperHeight);if(window.location!==window.parent.location&&!this.options.makeResponsive){dims=this.responsivizeIframe(dims)}this.wrapper.style.height=parseInt(dims.height)+"px";this.wrapper.style.width=parseInt(dims.width)+"px"},optimizeWrapper:function(maxWidth){var result=juxtapose.OPTIMIZATION_ACCEPTED;if(this.imgBefore.image.naturalWidth>=maxWidth&&this.imgAfter.image.naturalWidth>=maxWidth){this.wrapper.style.width=maxWidth+"px";result=juxtapose.OPTIMIZATION_WAS_CONSTRAINED}else if(this.imgAfter.image.naturalWidth 2 | #include 3 | #include 4 | #include 5 | #include 6 | 7 | #include "tables.h" 8 | #include "sparsegrid3.h" 9 | #include "marching_cubes.h" 10 | 11 | #define VOXELSIZE 1.0f 12 | 13 | struct vec3f { 14 | vec3f() { 15 | x = 0.0f; 16 | y = 0.0f; 17 | z = 0.0f; 18 | } 19 | vec3f(float x_, float y_, float z_) { 20 | x = x_; 21 | y = y_; 22 | z = z_; 23 | } 24 | inline vec3f operator+(const vec3f& other) const { 25 | return vec3f(x+other.x, y+other.y, z+other.z); 26 | } 27 | inline vec3f operator-(const vec3f& other) const { 28 | return vec3f(x-other.x, y-other.y, z-other.z); 29 | } 30 | inline vec3f operator*(float val) const { 31 | return vec3f(x*val, y*val, z*val); 32 | } 33 | inline void operator+=(const vec3f& other) { 34 | x += other.x; 35 | y += other.y; 36 | z += other.z; 37 | } 38 | static float distSq(const vec3f& v0, const vec3f& v1) { 39 | return ((v0.x-v1.x)*(v0.x-v1.x) + (v0.y-v1.y)*(v0.y-v1.y) + (v0.z-v1.z)*(v0.z-v1.z)); 40 | } 41 | float x; 42 | float y; 43 | float z; 44 | }; 45 | inline vec3f operator*(float s, const vec3f& v) 
{ 46 | return v * s; 47 | } 48 | struct vec3uc { 49 | vec3uc() { 50 | x = 0; 51 | y = 0; 52 | z = 0; 53 | } 54 | vec3uc(unsigned char x_, unsigned char y_, unsigned char z_) { 55 | x = x_; 56 | y = y_; 57 | z = z_; 58 | } 59 | unsigned char x; 60 | unsigned char y; 61 | unsigned char z; 62 | }; 63 | 64 | struct Triangle { 65 | vec3f v0; 66 | vec3f v1; 67 | vec3f v2; 68 | }; 69 | 70 | void get_voxel( 71 | const vec3f& pos, 72 | const npy_accessor& tsdf_accessor, 73 | float truncation, 74 | float& d, 75 | int& w) { 76 | int x = (int)round(pos.x); 77 | int y = (int)round(pos.y); 78 | int z = (int)round(pos.z); 79 | if (z >= 0 && z < tsdf_accessor.size()[2] && 80 | y >= 0 && y < tsdf_accessor.size()[1] && 81 | x >= 0 && x < tsdf_accessor.size()[0]) { 82 | d = tsdf_accessor(x, y, z); 83 | if (d != -std::numeric_limits<float>::infinity() && fabs(d) < truncation) w = 1; 84 | else w = 0; 85 | } 86 | else { 87 | d = -std::numeric_limits<float>::infinity(); 88 | w = 0; 89 | } 90 | } 91 | 92 | bool trilerp( 93 | const vec3f& pos, 94 | float& dist, 95 | const npy_accessor& tsdf_accessor, 96 | float truncation) { 97 | const float oSet = VOXELSIZE; 98 | const vec3f posDual = pos - vec3f(oSet / 2.0f, oSet / 2.0f, oSet / 2.0f); 99 | vec3f weight = vec3f(pos.x - (int)pos.x, pos.y - (int)pos.y, pos.z - (int)pos.z); 100 | 101 | dist = 0.0f; 102 | float d; int w; 103 | get_voxel(posDual + vec3f(0.0f, 0.0f, 0.0f), tsdf_accessor, truncation, d, w); if (w == 0) return false; dist += (1.0f - weight.x)*(1.0f - weight.y)*(1.0f - weight.z)*d; 104 | get_voxel(posDual + vec3f(oSet, 0.0f, 0.0f), tsdf_accessor, truncation, d, w); if (w == 0) return false; dist += weight.x *(1.0f - weight.y)*(1.0f - weight.z)*d; 105 | get_voxel(posDual + vec3f(0.0f, oSet, 0.0f), tsdf_accessor, truncation, d, w); if (w == 0) return false; dist += (1.0f - weight.x)* weight.y *(1.0f - weight.z)*d; 106 | get_voxel(posDual + vec3f(0.0f, 0.0f, oSet), tsdf_accessor, truncation, d, w); if (w == 0) return false; dist += (1.0f -
weight.x)*(1.0f - weight.y)* weight.z *d; 107 | get_voxel(posDual + vec3f(oSet, oSet, 0.0f), tsdf_accessor, truncation, d, w); if (w == 0) return false; dist += weight.x * weight.y *(1.0f - weight.z)*d; 108 | get_voxel(posDual + vec3f(0.0f, oSet, oSet), tsdf_accessor, truncation, d, w); if (w == 0) return false; dist += (1.0f - weight.x)* weight.y * weight.z *d; 109 | get_voxel(posDual + vec3f(oSet, 0.0f, oSet), tsdf_accessor, truncation, d, w); if (w == 0) return false; dist += weight.x *(1.0f - weight.y)* weight.z *d; 110 | get_voxel(posDual + vec3f(oSet, oSet, oSet), tsdf_accessor, truncation, d, w); if (w == 0) return false; dist += weight.x * weight.y * weight.z *d; 111 | 112 | return true; 113 | } 114 | 115 | vec3f vertexInterp(float isolevel, const vec3f& p1, const vec3f& p2, float d1, float d2) 116 | { 117 | vec3f r1 = p1; 118 | vec3f r2 = p2; 119 | //printf("[interp] r1 = (%f, %f, %f), r2 = (%f, %f, %f) d1 = %f, d2 = %f, iso = %f\n", r1.x, r1.y, r1.z, r2.x, r2.y, r2.z, d1, d2, isolevel); 120 | //printf("%d, %d, %d || %f, %f, %f -> %f, %f, %f\n", fabs(isolevel - d1) < 0.00001f, fabs(isolevel - d2) < 0.00001f, fabs(d1 - d2) < 0.00001f, isolevel - d1, isolevel - d2, d1-d2, fabs(isolevel - d1), fabs(isolevel - d2), fabs(d1-d2)); 121 | 122 | if (fabs(isolevel - d1) < 0.00001f) return r1; 123 | if (fabs(isolevel - d2) < 0.00001f) return r2; 124 | if (fabs(d1 - d2) < 0.00001f) return r1; 125 | 126 | float mu = (isolevel - d1) / (d2 - d1); 127 | 128 | vec3f res; 129 | res.x = p1.x + mu * (p2.x - p1.x); // Positions 130 | res.y = p1.y + mu * (p2.y - p1.y); 131 | res.z = p1.z + mu * (p2.z - p1.z); 132 | 133 | //printf("[interp] mu = %f, res = (%f, %f, %f) r1 = (%f, %f, %f), r2 = (%f, %f, %f)\n", mu, res.x, res.y, res.z, r1.x, r1.y, r1.z, r2.x, r2.y, r2.z); 134 | 135 | return res; 136 | } 137 | 138 | void extract_isosurface_at_position( 139 | const vec3f& pos, 140 | const npy_accessor& tsdf_accessor, 141 | float truncation, 142 | float isolevel, 143 | float thresh, 
144 | std::vector<Triangle>& results) { 145 | const float voxelsize = VOXELSIZE; 146 | const float P = voxelsize / 2.0f; 147 | const float M = -P; 148 | 149 | //const bool debugprint = (pos.z == 33 && pos.y == 56 && pos.x == 2) || (pos.z == 2 && pos.y == 56 && pos.x == 33); 150 | 151 | vec3f p000 = pos + vec3f(M, M, M); float dist000; bool valid000 = trilerp(p000, dist000, tsdf_accessor, truncation); 152 | vec3f p100 = pos + vec3f(P, M, M); float dist100; bool valid100 = trilerp(p100, dist100, tsdf_accessor, truncation); 153 | vec3f p010 = pos + vec3f(M, P, M); float dist010; bool valid010 = trilerp(p010, dist010, tsdf_accessor, truncation); 154 | vec3f p001 = pos + vec3f(M, M, P); float dist001; bool valid001 = trilerp(p001, dist001, tsdf_accessor, truncation); 155 | vec3f p110 = pos + vec3f(P, P, M); float dist110; bool valid110 = trilerp(p110, dist110, tsdf_accessor, truncation); 156 | vec3f p011 = pos + vec3f(M, P, P); float dist011; bool valid011 = trilerp(p011, dist011, tsdf_accessor, truncation); 157 | vec3f p101 = pos + vec3f(P, M, P); float dist101; bool valid101 = trilerp(p101, dist101, tsdf_accessor, truncation); 158 | vec3f p111 = pos + vec3f(P, P, P); float dist111; bool valid111 = trilerp(p111, dist111, tsdf_accessor, truncation); 159 | //if (debugprint) { 160 | // printf("[extract_isosurface_at_position] pos: %f, %f, %f\n", pos.x, pos.y, pos.z); 161 | // printf("\tp000 (%f, %f, %f) -> dist %f, color %d, %d, %d | valid %d\n", p000.x, p000.y, p000.z, dist000, (int)color000.x, (int)color000.y, (int)color000.z, (int)valid000); 162 | // printf("\tp100 (%f, %f, %f) -> dist %f, color %d, %d, %d | valid %d\n", p100.x, p100.y, p100.z, dist100, (int)color100.x, (int)color100.y, (int)color100.z, (int)valid100); 163 | // printf("\tp010 (%f, %f, %f) -> dist %f, color %d, %d, %d | valid %d\n", p010.x, p010.y, p010.z, dist010, (int)color010.x, (int)color010.y, (int)color010.z, (int)valid010); 164 | // printf("\tp001 (%f, %f, %f) -> dist %f, color %d, %d, %d | valid %d\n",
p001.x, p001.y, p001.z, dist001, (int)color001.x, (int)color001.y, (int)color001.z, (int)valid001); 165 | // printf("\tp110 (%f, %f, %f) -> dist %f, color %d, %d, %d | valid %d\n", p110.x, p110.y, p110.z, dist110, (int)color110.x, (int)color110.y, (int)color110.z, (int)valid110); 166 | // printf("\tp011 (%f, %f, %f) -> dist %f, color %d, %d, %d | valid %d\n", p011.x, p011.y, p011.z, dist011, (int)color011.x, (int)color011.y, (int)color011.z, (int)valid011); 167 | // printf("\tp101 (%f, %f, %f) -> dist %f, color %d, %d, %d | valid %d\n", p101.x, p101.y, p101.z, dist101, (int)color101.x, (int)color101.y, (int)color101.z, (int)valid101); 168 | // printf("\tp111 (%f, %f, %f) -> dist %f, color %d, %d, %d | valid %d\n", p111.x, p111.y, p111.z, dist111, (int)color111.x, (int)color111.y, (int)color111.z, (int)valid111); 169 | //} 170 | if (!valid000 || !valid100 || !valid010 || !valid001 || !valid110 || !valid011 || !valid101 || !valid111) return; 171 | 172 | uint cubeindex = 0; 173 | if (dist010 < isolevel) cubeindex += 1; 174 | if (dist110 < isolevel) cubeindex += 2; 175 | if (dist100 < isolevel) cubeindex += 4; 176 | if (dist000 < isolevel) cubeindex += 8; 177 | if (dist011 < isolevel) cubeindex += 16; 178 | if (dist111 < isolevel) cubeindex += 32; 179 | if (dist101 < isolevel) cubeindex += 64; 180 | if (dist001 < isolevel) cubeindex += 128; 181 | const float thres = thresh; 182 | float distArray[] = { dist000, dist100, dist010, dist001, dist110, dist011, dist101, dist111 }; 183 | //if (debugprint) { 184 | // printf("dists (%f, %f, %f, %f, %f, %f, %f, %f)\n", dist000, dist100, dist010, dist001, dist110, dist011, dist101, dist111); 185 | // printf("cubeindex %d\n", cubeindex); 186 | //} 187 | for (uint k = 0; k < 8; k++) { 188 | for (uint l = 0; l < 8; l++) { 189 | if (distArray[k] * distArray[l] < 0.0f) { 190 | if (fabs(distArray[k]) + fabs(distArray[l]) > thres) return; 191 | } 192 | else { 193 | if (fabs(distArray[k] - distArray[l]) > thres) return; 194 | } 195 | } 
196 | } 197 | if (fabs(dist000) > thresh) return; 198 | if (fabs(dist100) > thresh) return; 199 | if (fabs(dist010) > thresh) return; 200 | if (fabs(dist001) > thresh) return; 201 | if (fabs(dist110) > thresh) return; 202 | if (fabs(dist011) > thresh) return; 203 | if (fabs(dist101) > thresh) return; 204 | if (fabs(dist111) > thresh) return; 205 | 206 | if (edgeTable[cubeindex] == 0 || edgeTable[cubeindex] == 255) return; // added by me edgeTable[cubeindex] == 255 207 | 208 | vec3uc c; 209 | { 210 | float d; int w; 211 | get_voxel(pos, tsdf_accessor, truncation, d, w); 212 | } 213 | 214 | vec3f vertlist[12]; 215 | if (edgeTable[cubeindex] & 1) vertlist[0] = vertexInterp(isolevel, p010, p110, dist010, dist110); 216 | if (edgeTable[cubeindex] & 2) vertlist[1] = vertexInterp(isolevel, p110, p100, dist110, dist100); 217 | if (edgeTable[cubeindex] & 4) vertlist[2] = vertexInterp(isolevel, p100, p000, dist100, dist000); 218 | if (edgeTable[cubeindex] & 8) vertlist[3] = vertexInterp(isolevel, p000, p010, dist000, dist010); 219 | if (edgeTable[cubeindex] & 16) vertlist[4] = vertexInterp(isolevel, p011, p111, dist011, dist111); 220 | if (edgeTable[cubeindex] & 32) vertlist[5] = vertexInterp(isolevel, p111, p101, dist111, dist101); 221 | if (edgeTable[cubeindex] & 64) vertlist[6] = vertexInterp(isolevel, p101, p001, dist101, dist001); 222 | if (edgeTable[cubeindex] & 128) vertlist[7] = vertexInterp(isolevel, p001, p011, dist001, dist011); 223 | if (edgeTable[cubeindex] & 256) vertlist[8] = vertexInterp(isolevel, p010, p011, dist010, dist011); 224 | if (edgeTable[cubeindex] & 512) vertlist[9] = vertexInterp(isolevel, p110, p111, dist110, dist111); 225 | if (edgeTable[cubeindex] & 1024) vertlist[10] = vertexInterp(isolevel, p100, p101, dist100, dist101); 226 | if (edgeTable[cubeindex] & 2048) vertlist[11] = vertexInterp(isolevel, p000, p001, dist000, dist001); 227 | 228 | for (int i = 0; triTable[cubeindex][i] != -1; i += 3) 229 | { 230 | Triangle t; 231 | t.v0 = 
vertlist[triTable[cubeindex][i + 0]]; 232 | t.v1 = vertlist[triTable[cubeindex][i + 1]]; 233 | t.v2 = vertlist[triTable[cubeindex][i + 2]]; 234 | 235 | //printf("triangle at (%f, %f, %f): (%f, %f, %f) (%f, %f, %f) (%f, %f, %f)\n", pos.x, pos.y, pos.z, t.v0.x, t.v0.y, t.v0.z, t.v1.x, t.v1.y, t.v1.z, t.v2.x, t.v2.y, t.v2.z); 236 | //printf("vertlist idxs: %d, %d, %d (%d, %d, %d)\n", triTable[cubeindex][i + 0], triTable[cubeindex][i + 1], triTable[cubeindex][i + 2], edgeTable[cubeindex] & 1, edgeTable[cubeindex] & 256, edgeTable[cubeindex] & 8); 237 | //getchar(); 238 | results.push_back(t); 239 | } 240 | } 241 | 242 | 243 | // ----- MESH CLEANUP FUNCTIONS 244 | unsigned int remove_duplicate_faces(std::vector<vec3i>& faces) 245 | { 246 | struct vecHash { 247 | size_t operator()(const std::vector<unsigned int>& v) const { 248 | //TODO larger prime number (64 bit) to match size_t 249 | const size_t p[] = {73856093, 19349669, 83492791}; 250 | size_t res = 0; 251 | for (unsigned int i : v) { 252 | res = res ^ (size_t)i * p[i%3]; 253 | } 254 | return res; 255 | //const size_t res = ((size_t)v.x * p0)^((size_t)v.y * p1)^((size_t)v.z * p2); 256 | } 257 | }; 258 | 259 | size_t numFaces = faces.size(); 260 | std::vector<vec3i> new_faces; new_faces.reserve(numFaces); 261 | 262 | std::unordered_set<std::vector<unsigned int>, vecHash> _set; 263 | for (size_t i = 0; i < numFaces; i++) { 264 | std::vector<unsigned int> face = {(unsigned int)faces[i].x, (unsigned int)faces[i].y, (unsigned int)faces[i].z}; 265 | std::sort(face.begin(), face.end()); 266 | if (_set.find(face) == _set.end()) { 267 | //not found yet 268 | _set.insert(face); 269 | new_faces.push_back(faces[i]); //inserted the unsorted one 270 | } 271 | } 272 | if (faces.size() != new_faces.size()) { 273 | faces = new_faces; 274 | } 275 | //printf("Removed %d-%d=%d duplicate faces of %d\n", (int)numFaces, (int)new_faces.size(), (int)numFaces-(int)new_faces.size(), (int)numFaces); 276 | 277 | return (unsigned int)new_faces.size(); 278 | } 279 | unsigned int
remove_degenerate_faces(std::vector<vec3i>& faces) 280 | { 281 | std::vector<vec3i> new_faces; 282 | 283 | for (size_t i = 0; i < faces.size(); i++) { 284 | std::unordered_set<unsigned int> _set(3); 285 | bool foundDuplicate = false; 286 | if (_set.find(faces[i].x) != _set.end()) { foundDuplicate = true; } 287 | else { _set.insert(faces[i].x); } 288 | if (!foundDuplicate && _set.find(faces[i].y) != _set.end()) { foundDuplicate = true; } 289 | else { _set.insert(faces[i].y); } 290 | if (!foundDuplicate && _set.find(faces[i].z) != _set.end()) { foundDuplicate = true; } 291 | else { _set.insert(faces[i].z); } 292 | if (!foundDuplicate) { 293 | new_faces.push_back(faces[i]); 294 | } 295 | } 296 | if (faces.size() != new_faces.size()) { 297 | faces = new_faces; 298 | } 299 | 300 | return (unsigned int)faces.size(); 301 | } 302 | unsigned int hasNearestNeighbor( const vec3i& coord, SparseGrid3<std::vector<std::pair<vec3f, unsigned int> > > &neighborQuery, const vec3f& v, float thresh ) 303 | { 304 | float threshSq = thresh*thresh; 305 | for (int i = -1; i <= 1; i++) { 306 | for (int j = -1; j <= 1; j++) { 307 | for (int k = -1; k <= 1; k++) { 308 | vec3i c = coord + vec3i(i,j,k); 309 | if (neighborQuery.exists(c)) { 310 | for (const std::pair<vec3f, unsigned int>& n : neighborQuery[c]) { 311 | if (vec3f::distSq(v,n.first) < threshSq) { 312 | return n.second; 313 | } 314 | } 315 | } 316 | } 317 | } 318 | } 319 | return (unsigned int)-1; 320 | } 321 | unsigned int hasNearestNeighborApprox(const vec3i& coord, SparseGrid3<unsigned int> &neighborQuery) { 322 | for (int i = -1; i <= 1; i++) { 323 | for (int j = -1; j <= 1; j++) { 324 | for (int k = -1; k <= 1; k++) { 325 | vec3i c = coord + vec3i(i,j,k); 326 | if (neighborQuery.exists(c)) { 327 | return neighborQuery[c]; 328 | } 329 | } 330 | } 331 | } 332 | return (unsigned int)-1; 333 | } 334 | int sgn(float val) { 335 | return (0.0f < val) - (val < 0.0f); 336 | } 337 | std::pair<std::vector<vec3f>, std::vector<vec3i>> merge_close_vertices(const std::vector<Triangle>& meshTris, float thresh, bool approx) 338 | { 339 | // assumes voxelsize = 1 340 | assert(thresh > 
0); 341 | unsigned int numV = (unsigned int)meshTris.size() * 3; 342 | std::vector<vec3f> vertices(numV); 343 | std::vector<vec3i> faces(meshTris.size()); 344 | for (int i = 0; i < (int)meshTris.size(); i++) { 345 | vertices[3*i+0].x = meshTris[i].v0.x; 346 | vertices[3*i+0].y = meshTris[i].v0.y; 347 | vertices[3*i+0].z = meshTris[i].v0.z; 348 | 349 | vertices[3*i+1].x = meshTris[i].v1.x; 350 | vertices[3*i+1].y = meshTris[i].v1.y; 351 | vertices[3*i+1].z = meshTris[i].v1.z; 352 | 353 | vertices[3*i+2].x = meshTris[i].v2.x; 354 | vertices[3*i+2].y = meshTris[i].v2.y; 355 | vertices[3*i+2].z = meshTris[i].v2.z; 356 | 357 | faces[i].x = 3*i+0; 358 | faces[i].y = 3*i+1; 359 | faces[i].z = 3*i+2; 360 | } 361 | 362 | std::vector<unsigned int> vertexLookUp; vertexLookUp.resize(numV); 363 | std::vector<vec3f> new_verts; new_verts.reserve(numV); 364 | 365 | unsigned int cnt = 0; 366 | if (approx) { 367 | SparseGrid3<unsigned int> neighborQuery(0.6f, numV*2); 368 | for (unsigned int v = 0; v < numV; v++) { 369 | 370 | const vec3f& vert = vertices[v]; 371 | vec3i coord = vec3i(vert.x/thresh + 0.5f*sgn(vert.x), vert.y/thresh + 0.5f*sgn(vert.y), vert.z/thresh + 0.5f*sgn(vert.z)); 372 | unsigned int nn = hasNearestNeighborApprox(coord, neighborQuery); 373 | 374 | if (nn == (unsigned int)-1) { 375 | neighborQuery[coord] = cnt; 376 | new_verts.push_back(vert); 377 | vertexLookUp[v] = cnt; 378 | cnt++; 379 | } else { 380 | vertexLookUp[v] = nn; 381 | } 382 | } 383 | } else { 384 | SparseGrid3<std::vector<std::pair<vec3f, unsigned int> > > neighborQuery(0.6f, numV*2); 385 | for (unsigned int v = 0; v < numV; v++) { 386 | 387 | const vec3f& vert = vertices[v]; 388 | vec3i coord = vec3i(vert.x/thresh + 0.5f*sgn(vert.x), vert.y/thresh + 0.5f*sgn(vert.y), vert.z/thresh + 0.5f*sgn(vert.z)); 389 | unsigned int nn = hasNearestNeighbor(coord, neighborQuery, vert, thresh); 390 | 391 | if (nn == (unsigned int)-1) { 392 | neighborQuery[coord].push_back(std::make_pair(vert,cnt)); 393 | new_verts.push_back(vert); 394 | vertexLookUp[v] = cnt; 395 | cnt++; 396 | } else { 397 | 
vertexLookUp[v] = nn; 398 | } 399 | } 400 | } 401 | // Update faces 402 | for (int i = 0; i < (int)faces.size(); i++) { 403 | faces[i].x = vertexLookUp[faces[i].x]; 404 | faces[i].y = vertexLookUp[faces[i].y]; 405 | faces[i].z = vertexLookUp[faces[i].z]; 406 | } 407 | 408 | if (vertices.size() != new_verts.size()) { 409 | vertices = new_verts; 410 | } 411 | 412 | remove_degenerate_faces(faces); 413 | //printf("Merged %d-%d=%d of %d vertices\n", numV, cnt, numV-cnt, numV); 414 | return std::make_pair(vertices, faces); 415 | } 416 | // ----- MESH CLEANUP FUNCTIONS 417 | 418 | void run_marching_cubes_internal( 419 | const npy_accessor& tsdf_accessor, 420 | float isovalue, 421 | float truncation, 422 | float thresh, 423 | std::vector<Triangle>& results) { 424 | results.clear(); 425 | 426 | for (int i = 0; i < (int)tsdf_accessor.size()[0]; i++) { 427 | for (int j = 0; j < (int)tsdf_accessor.size()[1]; j++) { 428 | for (int k = 0; k < (int)tsdf_accessor.size()[2]; k++) { 429 | extract_isosurface_at_position(vec3f(i, j, k), tsdf_accessor, truncation, isovalue, thresh, results); 430 | } // k 431 | } // j 432 | } // i 433 | //printf("#results = %d\n", (int)results.size()); 434 | } 435 | 436 | void marching_cubes(const npy_accessor& tsdf_accessor, double isovalue, double truncation, 437 | std::vector<double>& vertices, std::vector<unsigned long>& polygons) { 438 | 439 | std::vector<Triangle> results; 440 | float thresh = 10.0f; 441 | run_marching_cubes_internal(tsdf_accessor, isovalue, truncation, thresh, results); 442 | 443 | // cleanup 444 | auto cleaned = merge_close_vertices(results, 0.00001f, true); 445 | remove_duplicate_faces(cleaned.second); 446 | 447 | vertices.resize(3 * cleaned.first.size()); 448 | polygons.resize(3 * cleaned.second.size()); 449 | 450 | for (int i = 0; i < (int)cleaned.first.size(); i++) { 451 | vertices[3 * i + 0] = cleaned.first[i].x; 452 | vertices[3 * i + 1] = cleaned.first[i].y; 453 | vertices[3 * i + 2] = cleaned.first[i].z; 454 | } 455 | 456 | for (int i = 0; i < 
(int)cleaned.second.size(); i++) { 457 | polygons[3 * i + 0] = cleaned.second[i].x; 458 | polygons[3 * i + 1] = cleaned.second[i].y; 459 | polygons[3 * i + 2] = cleaned.second[i].z; 460 | } 461 | 462 | } 463 | -------------------------------------------------------------------------------- /external/NumpyMarchingCubes/marching_cubes/src/marching_cubes.h: -------------------------------------------------------------------------------- 1 | #ifndef _MARCHING_CUBES_H 2 | #define _MARCHING_CUBES_H 3 | 4 | #include "pyarraymodule.h" 5 | #include <vector> 6 | #include <array> 7 | 8 | struct npy_accessor { 9 | npy_accessor(PyArrayObject* arr, const std::array<long, 3> size) : m_arr(arr), m_size(size) {} 10 | const std::array<long, 3>& size() const { 11 | return m_size; 12 | } 13 | double operator()(long x, long y, long z) const { 14 | const npy_intp c[3] = {x, y, z}; 15 | return PyArray_SafeGet<double>(m_arr, c); 16 | } 17 | 18 | PyArrayObject* m_arr; 19 | const std::array<long, 3> m_size; 20 | }; 21 | 22 | void marching_cubes(const npy_accessor& tsdf_accessor, double isovalue, double truncation, 23 | std::vector<double>& vertices, std::vector<unsigned long>& polygons); 24 | 25 | #endif // _MARCHING_CUBES_H -------------------------------------------------------------------------------- /external/NumpyMarchingCubes/marching_cubes/src/pyarray_symbol.h: -------------------------------------------------------------------------------- 1 | 2 | #define PY_ARRAY_UNIQUE_SYMBOL mcubes_PyArray_API 3 | -------------------------------------------------------------------------------- /external/NumpyMarchingCubes/marching_cubes/src/pyarraymodule.h: -------------------------------------------------------------------------------- 1 | 2 | #ifndef _EXTMODULE_H 3 | #define _EXTMODULE_H 4 | 5 | #include <Python.h> 6 | #include <stdexcept> 7 | 8 | // #define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION 9 | #define PY_ARRAY_UNIQUE_SYMBOL mcubes_PyArray_API 10 | #define NO_IMPORT_ARRAY 11 | #include "numpy/arrayobject.h" 12 | 13 | #include <complex> 14 | 15 | template <class T> 16 | struct numpy_typemap; 17 | 18 | 
#define define_numpy_type(ctype, dtype) \ 19 | template<> \ 20 | struct numpy_typemap<ctype> \ 21 | {static const int type = dtype;}; 22 | 23 | define_numpy_type(bool, NPY_BOOL); 24 | define_numpy_type(char, NPY_BYTE); 25 | define_numpy_type(short, NPY_SHORT); 26 | define_numpy_type(int, NPY_INT); 27 | define_numpy_type(long, NPY_LONG); 28 | define_numpy_type(long long, NPY_LONGLONG); 29 | define_numpy_type(unsigned char, NPY_UBYTE); 30 | define_numpy_type(unsigned short, NPY_USHORT); 31 | define_numpy_type(unsigned int, NPY_UINT); 32 | define_numpy_type(unsigned long, NPY_ULONG); 33 | define_numpy_type(unsigned long long, NPY_ULONGLONG); 34 | define_numpy_type(float, NPY_FLOAT); 35 | define_numpy_type(double, NPY_DOUBLE); 36 | define_numpy_type(long double, NPY_LONGDOUBLE); 37 | define_numpy_type(std::complex<float>, NPY_CFLOAT); 38 | define_numpy_type(std::complex<double>, NPY_CDOUBLE); 39 | define_numpy_type(std::complex<long double>, NPY_CLONGDOUBLE); 40 | 41 | template <class T> 42 | T PyArray_SafeGet(const PyArrayObject* aobj, const npy_intp* indaux) 43 | { 44 | // HORROR. 
45 | npy_intp* ind = const_cast<npy_intp*>(indaux); 46 | void* ptr = PyArray_GetPtr(const_cast<PyArrayObject*>(aobj), ind); 47 | switch(PyArray_TYPE(aobj)) 48 | { 49 | case NPY_BOOL: 50 | return static_cast<T>(*reinterpret_cast<bool*>(ptr)); 51 | case NPY_BYTE: 52 | return static_cast<T>(*reinterpret_cast<char*>(ptr)); 53 | case NPY_SHORT: 54 | return static_cast<T>(*reinterpret_cast<short*>(ptr)); 55 | case NPY_INT: 56 | return static_cast<T>(*reinterpret_cast<int*>(ptr)); 57 | case NPY_LONG: 58 | return static_cast<T>(*reinterpret_cast<long*>(ptr)); 59 | case NPY_LONGLONG: 60 | return static_cast<T>(*reinterpret_cast<long long*>(ptr)); 61 | case NPY_UBYTE: 62 | return static_cast<T>(*reinterpret_cast<unsigned char*>(ptr)); 63 | case NPY_USHORT: 64 | return static_cast<T>(*reinterpret_cast<unsigned short*>(ptr)); 65 | case NPY_UINT: 66 | return static_cast<T>(*reinterpret_cast<unsigned int*>(ptr)); 67 | case NPY_ULONG: 68 | return static_cast<T>(*reinterpret_cast<unsigned long*>(ptr)); 69 | case NPY_ULONGLONG: 70 | return static_cast<T>(*reinterpret_cast<unsigned long long*>(ptr)); 71 | case NPY_FLOAT: 72 | return static_cast<T>(*reinterpret_cast<float*>(ptr)); 73 | case NPY_DOUBLE: 74 | return static_cast<T>(*reinterpret_cast<double*>(ptr)); 75 | case NPY_LONGDOUBLE: 76 | return static_cast<T>(*reinterpret_cast<long double*>(ptr)); 77 | default: 78 | throw std::runtime_error("data type not supported"); 79 | } 80 | } 81 | 82 | template <class T> 83 | void PyArray_SafeSet(PyArrayObject* aobj, const npy_intp* indaux, const T& value) 84 | { 85 | // HORROR. 
86 | npy_intp* ind = const_cast<npy_intp*>(indaux); 87 | void* ptr = PyArray_GetPtr(aobj, ind); 88 | switch(PyArray_TYPE(aobj)) 89 | { 90 | case NPY_BOOL: 91 | *reinterpret_cast<bool*>(ptr) = static_cast<bool>(value); 92 | break; 93 | case NPY_BYTE: 94 | *reinterpret_cast<char*>(ptr) = static_cast<char>(value); 95 | break; 96 | case NPY_SHORT: 97 | *reinterpret_cast<short*>(ptr) = static_cast<short>(value); 98 | break; 99 | case NPY_INT: 100 | *reinterpret_cast<int*>(ptr) = static_cast<int>(value); 101 | break; 102 | case NPY_LONG: 103 | *reinterpret_cast<long*>(ptr) = static_cast<long>(value); 104 | break; 105 | case NPY_LONGLONG: 106 | *reinterpret_cast<long long*>(ptr) = static_cast<long long>(value); 107 | break; 108 | case NPY_UBYTE: 109 | *reinterpret_cast<unsigned char*>(ptr) = static_cast<unsigned char>(value); 110 | break; 111 | case NPY_USHORT: 112 | *reinterpret_cast<unsigned short*>(ptr) = static_cast<unsigned short>(value); 113 | break; 114 | case NPY_UINT: 115 | *reinterpret_cast<unsigned int*>(ptr) = static_cast<unsigned int>(value); 116 | break; 117 | case NPY_ULONG: 118 | *reinterpret_cast<unsigned long*>(ptr) = static_cast<unsigned long>(value); 119 | break; 120 | case NPY_ULONGLONG: 121 | *reinterpret_cast<unsigned long long*>(ptr) = static_cast<unsigned long long>(value); 122 | break; 123 | case NPY_FLOAT: 124 | *reinterpret_cast<float*>(ptr) = static_cast<float>(value); 125 | break; 126 | case NPY_DOUBLE: 127 | *reinterpret_cast<double*>(ptr) = static_cast<double>(value); 128 | break; 129 | case NPY_LONGDOUBLE: 130 | *reinterpret_cast<long double*>(ptr) = static_cast<long double>(value); 131 | break; 132 | default: 133 | throw std::runtime_error("data type not supported"); 134 | } 135 | } 136 | 137 | #endif 138 | -------------------------------------------------------------------------------- /external/NumpyMarchingCubes/marching_cubes/src/pywrapper.cpp: -------------------------------------------------------------------------------- 1 | 2 | #include "pywrapper.h" 3 | 4 | #include "marching_cubes.h" 5 | 6 | #include <stdexcept> 7 | #include <array> 8 | 9 | 10 | PyObject* marching_cubes(PyArrayObject* arr, double isovalue, double truncation) 11 | { 12 | if(PyArray_NDIM(arr) != 3) 13 | throw std::runtime_error("Only three-dimensional arrays are supported."); 14 | 15 | // Prepare 
data. 16 | npy_intp* shape = PyArray_DIMS(arr); 17 | std::array<long, 3> lower{0, 0, 0}; 18 | std::array<long, 3> upper{shape[0]-1, shape[1]-1, shape[2]-1}; 19 | long numx = upper[0] - lower[0] + 1; 20 | long numy = upper[1] - lower[1] + 1; 21 | long numz = upper[2] - lower[2] + 1; 22 | std::vector<double> vertices; 23 | std::vector<unsigned long> polygons; 24 | 25 | // auto pyarray_to_cfunc = [&](long x, long y, long z) -> double { 26 | // const npy_intp c[3] = {x, y, z}; 27 | // return PyArray_SafeGet<double>(arr, c); 28 | // }; 29 | 30 | npy_accessor tsdf_accessor(arr, {numx, numy, numz}); 31 | 32 | // Marching cubes. 33 | marching_cubes(tsdf_accessor, isovalue, truncation, vertices, polygons); 34 | 35 | // Copy the result to two Python ndarrays. 36 | npy_intp size_vertices = vertices.size(); 37 | npy_intp size_polygons = polygons.size(); 38 | PyArrayObject* verticesarr = reinterpret_cast<PyArrayObject*>(PyArray_SimpleNew(1, &size_vertices, PyArray_DOUBLE)); 39 | PyArrayObject* polygonsarr = reinterpret_cast<PyArrayObject*>(PyArray_SimpleNew(1, &size_polygons, PyArray_ULONG)); 40 | 41 | std::vector<double>::const_iterator it = vertices.begin(); 42 | for(int i=0; it!=vertices.end(); ++i, ++it) 43 | *reinterpret_cast<double*>(PyArray_GETPTR1(verticesarr, i)) = *it; 44 | std::vector<unsigned long>::const_iterator it2 = polygons.begin(); 45 | for(int i=0; it2!=polygons.end(); ++i, ++it2) 46 | *reinterpret_cast<unsigned long*>(PyArray_GETPTR1(polygonsarr, i)) = *it2; 47 | 48 | PyObject* res = Py_BuildValue("(O,O)", verticesarr, polygonsarr); 49 | Py_XDECREF(verticesarr); 50 | Py_XDECREF(polygonsarr); 51 | 52 | return res; 53 | } 54 | 55 | -------------------------------------------------------------------------------- /external/NumpyMarchingCubes/marching_cubes/src/pywrapper.h: -------------------------------------------------------------------------------- 1 | 2 | #ifndef _PYWRAPPER_H 3 | #define _PYWRAPPER_H 4 | 5 | #include <Python.h> 6 | #include "pyarraymodule.h" 7 | 8 | #include <vector> 9 | 10 | PyObject* marching_cubes(PyArrayObject* arr, double isovalue, double truncation); 11 | 12 | #endif // 
_PYWRAPPER_H 13 | -------------------------------------------------------------------------------- /external/NumpyMarchingCubes/marching_cubes/src/sparsegrid3.h: -------------------------------------------------------------------------------- 1 | 2 | #include <unordered_map> 3 | #include <functional> 4 | #include <cstddef> 5 | 6 | struct vec3i { 7 | vec3i() { 8 | x = 0; 9 | y = 0; 10 | z = 0; 11 | } 12 | vec3i(int x_, int y_, int z_) { 13 | x = x_; 14 | y = y_; 15 | z = z_; 16 | } 17 | inline vec3i operator+(const vec3i& other) const { 18 | return vec3i(x+other.x, y+other.y, z+other.z); 19 | } 20 | inline vec3i operator-(const vec3i& other) const { 21 | return vec3i(x-other.x, y-other.y, z-other.z); 22 | } 23 | inline bool operator==(const vec3i& other) const { 24 | if ((x == other.x) && (y == other.y) && (z == other.z)) 25 | return true; 26 | return false; 27 | } 28 | int x; 29 | int y; 30 | int z; 31 | }; 32 | 33 | namespace std { 34 | 35 | template <> 36 | struct hash<vec3i> : public std::unary_function<vec3i, size_t> { 37 | size_t operator()(const vec3i& v) const { 38 | //TODO larger prime number (64 bit) to match size_t 39 | const size_t p0 = 73856093; 40 | const size_t p1 = 19349669; 41 | const size_t p2 = 83492791; 42 | const size_t res = ((size_t)v.x * p0)^((size_t)v.y * p1)^((size_t)v.z * p2); 43 | return res; 44 | } 45 | }; 46 | 47 | } 48 | 49 | template <class T> 50 | class SparseGrid3 { 51 | public: 52 | typedef typename std::unordered_map<vec3i, T, std::hash<vec3i>>::iterator iterator; 53 | typedef typename std::unordered_map<vec3i, T, std::hash<vec3i>>::const_iterator const_iterator; 54 | iterator begin() {return m_Data.begin();} 55 | iterator end() {return m_Data.end();} 56 | const_iterator begin() const {return m_Data.begin();} 57 | const_iterator end() const {return m_Data.end();} 58 | 59 | SparseGrid3(float maxLoadFactor = 0.6, size_t reserveBuckets = 64) { 60 | m_Data.reserve(reserveBuckets); 61 | m_Data.max_load_factor(maxLoadFactor); 62 | } 63 | 64 | size_t size() const { 65 | return m_Data.size(); 66 | } 67 | 68 | void clear() { 69 | m_Data.clear(); 70 | } 71 | 72 | 
bool exists(const vec3i& i) const { 73 | return (m_Data.find(i) != m_Data.end()); 74 | } 75 | 76 | bool exists(int x, int y, int z) const { 77 | return exists(vec3i(x, y, z)); 78 | } 79 | 80 | const T& operator()(const vec3i& i) const { 81 | return m_Data.find(i)->second; 82 | } 83 | 84 | //! if the element does not exist, it will be created with its default constructor 85 | T& operator()(const vec3i& i) { 86 | return m_Data[i]; 87 | } 88 | 89 | const T& operator()(int x, int y, int z) const { 90 | return (*this)(vec3i(x,y,z)); 91 | } 92 | T& operator()(int x, int y, int z) { 93 | return (*this)(vec3i(x,y,z)); 94 | } 95 | 96 | const T& operator[](const vec3i& i) const { 97 | return (*this)(i); 98 | } 99 | T& operator[](const vec3i& i) { 100 | return (*this)(i); 101 | } 102 | 103 | protected: 104 | std::unordered_map<vec3i, T, std::hash<vec3i>> m_Data; 105 | }; 106 | 107 | -------------------------------------------------------------------------------- /external/NumpyMarchingCubes/marching_cubes/src/tables.h: -------------------------------------------------------------------------------- 1 | 2 | ///////////////////////////////////////////////////// 3 | // tables 4 | ///////////////////////////////////////////////////// 5 | 6 | // Polygonising a scalar field 7 | // Also known as: "3D Contouring", "Marching Cubes", "Surface Reconstruction" 8 | // Written by Paul Bourke 9 | // May 1994 10 | // http://paulbourke.net/geometry/polygonise/ 11 | 12 | 13 | const static int edgeTable[256] = { 14 | 0x0, 0x109, 0x203, 0x30a, 0x406, 0x50f, 0x605, 0x70c, 15 | 0x80c, 0x905, 0xa0f, 0xb06, 0xc0a, 0xd03, 0xe09, 0xf00, 16 | 0x190, 0x99, 0x393, 0x29a, 0x596, 0x49f, 0x795, 0x69c, 17 | 0x99c, 0x895, 0xb9f, 0xa96, 0xd9a, 0xc93, 0xf99, 0xe90, 18 | 0x230, 0x339, 0x33, 0x13a, 0x636, 0x73f, 0x435, 0x53c, 19 | 0xa3c, 0xb35, 0x83f, 0x936, 0xe3a, 0xf33, 0xc39, 0xd30, 20 | 0x3a0, 0x2a9, 0x1a3, 0xaa, 0x7a6, 0x6af, 0x5a5, 0x4ac, 21 | 0xbac, 0xaa5, 0x9af, 0x8a6, 0xfaa, 0xea3, 0xda9, 0xca0, 22 | 0x460, 0x569, 0x663, 0x76a, 
0x66, 0x16f, 0x265, 0x36c, 23 | 0xc6c, 0xd65, 0xe6f, 0xf66, 0x86a, 0x963, 0xa69, 0xb60, 24 | 0x5f0, 0x4f9, 0x7f3, 0x6fa, 0x1f6, 0xff, 0x3f5, 0x2fc, 25 | 0xdfc, 0xcf5, 0xfff, 0xef6, 0x9fa, 0x8f3, 0xbf9, 0xaf0, 26 | 0x650, 0x759, 0x453, 0x55a, 0x256, 0x35f, 0x55, 0x15c, 27 | 0xe5c, 0xf55, 0xc5f, 0xd56, 0xa5a, 0xb53, 0x859, 0x950, 28 | 0x7c0, 0x6c9, 0x5c3, 0x4ca, 0x3c6, 0x2cf, 0x1c5, 0xcc, 29 | 0xfcc, 0xec5, 0xdcf, 0xcc6, 0xbca, 0xac3, 0x9c9, 0x8c0, 30 | 0x8c0, 0x9c9, 0xac3, 0xbca, 0xcc6, 0xdcf, 0xec5, 0xfcc, 31 | 0xcc, 0x1c5, 0x2cf, 0x3c6, 0x4ca, 0x5c3, 0x6c9, 0x7c0, 32 | 0x950, 0x859, 0xb53, 0xa5a, 0xd56, 0xc5f, 0xf55, 0xe5c, 33 | 0x15c, 0x55, 0x35f, 0x256, 0x55a, 0x453, 0x759, 0x650, 34 | 0xaf0, 0xbf9, 0x8f3, 0x9fa, 0xef6, 0xfff, 0xcf5, 0xdfc, 35 | 0x2fc, 0x3f5, 0xff, 0x1f6, 0x6fa, 0x7f3, 0x4f9, 0x5f0, 36 | 0xb60, 0xa69, 0x963, 0x86a, 0xf66, 0xe6f, 0xd65, 0xc6c, 37 | 0x36c, 0x265, 0x16f, 0x66, 0x76a, 0x663, 0x569, 0x460, 38 | 0xca0, 0xda9, 0xea3, 0xfaa, 0x8a6, 0x9af, 0xaa5, 0xbac, 39 | 0x4ac, 0x5a5, 0x6af, 0x7a6, 0xaa, 0x1a3, 0x2a9, 0x3a0, 40 | 0xd30, 0xc39, 0xf33, 0xe3a, 0x936, 0x83f, 0xb35, 0xa3c, 41 | 0x53c, 0x435, 0x73f, 0x636, 0x13a, 0x33, 0x339, 0x230, 42 | 0xe90, 0xf99, 0xc93, 0xd9a, 0xa96, 0xb9f, 0x895, 0x99c, 43 | 0x69c, 0x795, 0x49f, 0x596, 0x29a, 0x393, 0x99, 0x190, 44 | 0xf00, 0xe09, 0xd03, 0xc0a, 0xb06, 0xa0f, 0x905, 0x80c, 45 | 0x70c, 0x605, 0x50f, 0x406, 0x30a, 0x203, 0x109, 0x0 }; 46 | 47 | 48 | const static int triTable[256][16] = 49 | { { -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 50 | { 0, 8, 3, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 51 | { 0, 1, 9, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 52 | { 1, 8, 3, 9, 8, 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 53 | { 1, 2, 10, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 54 | { 0, 8, 3, 1, 2, 10, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 55 | { 9, 2, 10, 0, 2, 9, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 56 | { 2, 8, 3, 2, 10, 8, 10, 9, 8, 
-1, -1, -1, -1, -1, -1, -1 }, 57 | { 3, 11, 2, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 58 | { 0, 11, 2, 8, 11, 0, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 59 | { 1, 9, 0, 2, 3, 11, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 60 | { 1, 11, 2, 1, 9, 11, 9, 8, 11, -1, -1, -1, -1, -1, -1, -1 }, 61 | { 3, 10, 1, 11, 10, 3, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 62 | { 0, 10, 1, 0, 8, 10, 8, 11, 10, -1, -1, -1, -1, -1, -1, -1 }, 63 | { 3, 9, 0, 3, 11, 9, 11, 10, 9, -1, -1, -1, -1, -1, -1, -1 }, 64 | { 9, 8, 10, 10, 8, 11, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 65 | { 4, 7, 8, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 66 | { 4, 3, 0, 7, 3, 4, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 67 | { 0, 1, 9, 8, 4, 7, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 68 | { 4, 1, 9, 4, 7, 1, 7, 3, 1, -1, -1, -1, -1, -1, -1, -1 }, 69 | { 1, 2, 10, 8, 4, 7, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 70 | { 3, 4, 7, 3, 0, 4, 1, 2, 10, -1, -1, -1, -1, -1, -1, -1 }, 71 | { 9, 2, 10, 9, 0, 2, 8, 4, 7, -1, -1, -1, -1, -1, -1, -1 }, 72 | { 2, 10, 9, 2, 9, 7, 2, 7, 3, 7, 9, 4, -1, -1, -1, -1 }, 73 | { 8, 4, 7, 3, 11, 2, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 74 | { 11, 4, 7, 11, 2, 4, 2, 0, 4, -1, -1, -1, -1, -1, -1, -1 }, 75 | { 9, 0, 1, 8, 4, 7, 2, 3, 11, -1, -1, -1, -1, -1, -1, -1 }, 76 | { 4, 7, 11, 9, 4, 11, 9, 11, 2, 9, 2, 1, -1, -1, -1, -1 }, 77 | { 3, 10, 1, 3, 11, 10, 7, 8, 4, -1, -1, -1, -1, -1, -1, -1 }, 78 | { 1, 11, 10, 1, 4, 11, 1, 0, 4, 7, 11, 4, -1, -1, -1, -1 }, 79 | { 4, 7, 8, 9, 0, 11, 9, 11, 10, 11, 0, 3, -1, -1, -1, -1 }, 80 | { 4, 7, 11, 4, 11, 9, 9, 11, 10, -1, -1, -1, -1, -1, -1, -1 }, 81 | { 9, 5, 4, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 82 | { 9, 5, 4, 0, 8, 3, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 83 | { 0, 5, 4, 1, 5, 0, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 84 | { 8, 5, 4, 8, 3, 5, 3, 1, 5, -1, -1, -1, -1, -1, -1, -1 }, 85 | { 1, 2, 10, 9, 5, 4, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 86 | { 3, 0, 8, 1, 2, 10, 
4, 9, 5, -1, -1, -1, -1, -1, -1, -1 }, 87 | { 5, 2, 10, 5, 4, 2, 4, 0, 2, -1, -1, -1, -1, -1, -1, -1 }, 88 | { 2, 10, 5, 3, 2, 5, 3, 5, 4, 3, 4, 8, -1, -1, -1, -1 }, 89 | { 9, 5, 4, 2, 3, 11, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 90 | { 0, 11, 2, 0, 8, 11, 4, 9, 5, -1, -1, -1, -1, -1, -1, -1 }, 91 | { 0, 5, 4, 0, 1, 5, 2, 3, 11, -1, -1, -1, -1, -1, -1, -1 }, 92 | { 2, 1, 5, 2, 5, 8, 2, 8, 11, 4, 8, 5, -1, -1, -1, -1 }, 93 | { 10, 3, 11, 10, 1, 3, 9, 5, 4, -1, -1, -1, -1, -1, -1, -1 }, 94 | { 4, 9, 5, 0, 8, 1, 8, 10, 1, 8, 11, 10, -1, -1, -1, -1 }, 95 | { 5, 4, 0, 5, 0, 11, 5, 11, 10, 11, 0, 3, -1, -1, -1, -1 }, 96 | { 5, 4, 8, 5, 8, 10, 10, 8, 11, -1, -1, -1, -1, -1, -1, -1 }, 97 | { 9, 7, 8, 5, 7, 9, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 98 | { 9, 3, 0, 9, 5, 3, 5, 7, 3, -1, -1, -1, -1, -1, -1, -1 }, 99 | { 0, 7, 8, 0, 1, 7, 1, 5, 7, -1, -1, -1, -1, -1, -1, -1 }, 100 | { 1, 5, 3, 3, 5, 7, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 101 | { 9, 7, 8, 9, 5, 7, 10, 1, 2, -1, -1, -1, -1, -1, -1, -1 }, 102 | { 10, 1, 2, 9, 5, 0, 5, 3, 0, 5, 7, 3, -1, -1, -1, -1 }, 103 | { 8, 0, 2, 8, 2, 5, 8, 5, 7, 10, 5, 2, -1, -1, -1, -1 }, 104 | { 2, 10, 5, 2, 5, 3, 3, 5, 7, -1, -1, -1, -1, -1, -1, -1 }, 105 | { 7, 9, 5, 7, 8, 9, 3, 11, 2, -1, -1, -1, -1, -1, -1, -1 }, 106 | { 9, 5, 7, 9, 7, 2, 9, 2, 0, 2, 7, 11, -1, -1, -1, -1 }, 107 | { 2, 3, 11, 0, 1, 8, 1, 7, 8, 1, 5, 7, -1, -1, -1, -1 }, 108 | { 11, 2, 1, 11, 1, 7, 7, 1, 5, -1, -1, -1, -1, -1, -1, -1 }, 109 | { 9, 5, 8, 8, 5, 7, 10, 1, 3, 10, 3, 11, -1, -1, -1, -1 }, 110 | { 5, 7, 0, 5, 0, 9, 7, 11, 0, 1, 0, 10, 11, 10, 0, -1 }, 111 | { 11, 10, 0, 11, 0, 3, 10, 5, 0, 8, 0, 7, 5, 7, 0, -1 }, 112 | { 11, 10, 5, 7, 11, 5, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 113 | { 10, 6, 5, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 114 | { 0, 8, 3, 5, 10, 6, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 115 | { 9, 0, 1, 5, 10, 6, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 116 | { 1, 8, 3, 1, 9, 8, 5, 10, 6, -1, -1, -1, -1, -1, 
-1, -1 }, 117 | { 1, 6, 5, 2, 6, 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 118 | { 1, 6, 5, 1, 2, 6, 3, 0, 8, -1, -1, -1, -1, -1, -1, -1 }, 119 | { 9, 6, 5, 9, 0, 6, 0, 2, 6, -1, -1, -1, -1, -1, -1, -1 }, 120 | { 5, 9, 8, 5, 8, 2, 5, 2, 6, 3, 2, 8, -1, -1, -1, -1 }, 121 | { 2, 3, 11, 10, 6, 5, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 122 | { 11, 0, 8, 11, 2, 0, 10, 6, 5, -1, -1, -1, -1, -1, -1, -1 }, 123 | { 0, 1, 9, 2, 3, 11, 5, 10, 6, -1, -1, -1, -1, -1, -1, -1 }, 124 | { 5, 10, 6, 1, 9, 2, 9, 11, 2, 9, 8, 11, -1, -1, -1, -1 }, 125 | { 6, 3, 11, 6, 5, 3, 5, 1, 3, -1, -1, -1, -1, -1, -1, -1 }, 126 | { 0, 8, 11, 0, 11, 5, 0, 5, 1, 5, 11, 6, -1, -1, -1, -1 }, 127 | { 3, 11, 6, 0, 3, 6, 0, 6, 5, 0, 5, 9, -1, -1, -1, -1 }, 128 | { 6, 5, 9, 6, 9, 11, 11, 9, 8, -1, -1, -1, -1, -1, -1, -1 }, 129 | { 5, 10, 6, 4, 7, 8, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 130 | { 4, 3, 0, 4, 7, 3, 6, 5, 10, -1, -1, -1, -1, -1, -1, -1 }, 131 | { 1, 9, 0, 5, 10, 6, 8, 4, 7, -1, -1, -1, -1, -1, -1, -1 }, 132 | { 10, 6, 5, 1, 9, 7, 1, 7, 3, 7, 9, 4, -1, -1, -1, -1 }, 133 | { 6, 1, 2, 6, 5, 1, 4, 7, 8, -1, -1, -1, -1, -1, -1, -1 }, 134 | { 1, 2, 5, 5, 2, 6, 3, 0, 4, 3, 4, 7, -1, -1, -1, -1 }, 135 | { 8, 4, 7, 9, 0, 5, 0, 6, 5, 0, 2, 6, -1, -1, -1, -1 }, 136 | { 7, 3, 9, 7, 9, 4, 3, 2, 9, 5, 9, 6, 2, 6, 9, -1 }, 137 | { 3, 11, 2, 7, 8, 4, 10, 6, 5, -1, -1, -1, -1, -1, -1, -1 }, 138 | { 5, 10, 6, 4, 7, 2, 4, 2, 0, 2, 7, 11, -1, -1, -1, -1 }, 139 | { 0, 1, 9, 4, 7, 8, 2, 3, 11, 5, 10, 6, -1, -1, -1, -1 }, 140 | { 9, 2, 1, 9, 11, 2, 9, 4, 11, 7, 11, 4, 5, 10, 6, -1 }, 141 | { 8, 4, 7, 3, 11, 5, 3, 5, 1, 5, 11, 6, -1, -1, -1, -1 }, 142 | { 5, 1, 11, 5, 11, 6, 1, 0, 11, 7, 11, 4, 0, 4, 11, -1 }, 143 | { 0, 5, 9, 0, 6, 5, 0, 3, 6, 11, 6, 3, 8, 4, 7, -1 }, 144 | { 6, 5, 9, 6, 9, 11, 4, 7, 9, 7, 11, 9, -1, -1, -1, -1 }, 145 | { 10, 4, 9, 6, 4, 10, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 146 | { 4, 10, 6, 4, 9, 10, 0, 8, 3, -1, -1, -1, -1, -1, -1, -1 }, 147 | { 10, 0, 1, 10, 6, 0, 6, 4, 0, 
-1, -1, -1, -1, -1, -1, -1 }, 148 | { 8, 3, 1, 8, 1, 6, 8, 6, 4, 6, 1, 10, -1, -1, -1, -1 }, 149 | { 1, 4, 9, 1, 2, 4, 2, 6, 4, -1, -1, -1, -1, -1, -1, -1 }, 150 | { 3, 0, 8, 1, 2, 9, 2, 4, 9, 2, 6, 4, -1, -1, -1, -1 }, 151 | { 0, 2, 4, 4, 2, 6, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 152 | { 8, 3, 2, 8, 2, 4, 4, 2, 6, -1, -1, -1, -1, -1, -1, -1 }, 153 | { 10, 4, 9, 10, 6, 4, 11, 2, 3, -1, -1, -1, -1, -1, -1, -1 }, 154 | { 0, 8, 2, 2, 8, 11, 4, 9, 10, 4, 10, 6, -1, -1, -1, -1 }, 155 | { 3, 11, 2, 0, 1, 6, 0, 6, 4, 6, 1, 10, -1, -1, -1, -1 }, 156 | { 6, 4, 1, 6, 1, 10, 4, 8, 1, 2, 1, 11, 8, 11, 1, -1 }, 157 | { 9, 6, 4, 9, 3, 6, 9, 1, 3, 11, 6, 3, -1, -1, -1, -1 }, 158 | { 8, 11, 1, 8, 1, 0, 11, 6, 1, 9, 1, 4, 6, 4, 1, -1 }, 159 | { 3, 11, 6, 3, 6, 0, 0, 6, 4, -1, -1, -1, -1, -1, -1, -1 }, 160 | { 6, 4, 8, 11, 6, 8, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 161 | { 7, 10, 6, 7, 8, 10, 8, 9, 10, -1, -1, -1, -1, -1, -1, -1 }, 162 | { 0, 7, 3, 0, 10, 7, 0, 9, 10, 6, 7, 10, -1, -1, -1, -1 }, 163 | { 10, 6, 7, 1, 10, 7, 1, 7, 8, 1, 8, 0, -1, -1, -1, -1 }, 164 | { 10, 6, 7, 10, 7, 1, 1, 7, 3, -1, -1, -1, -1, -1, -1, -1 }, 165 | { 1, 2, 6, 1, 6, 8, 1, 8, 9, 8, 6, 7, -1, -1, -1, -1 }, 166 | { 2, 6, 9, 2, 9, 1, 6, 7, 9, 0, 9, 3, 7, 3, 9, -1 }, 167 | { 7, 8, 0, 7, 0, 6, 6, 0, 2, -1, -1, -1, -1, -1, -1, -1 }, 168 | { 7, 3, 2, 6, 7, 2, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 169 | { 2, 3, 11, 10, 6, 8, 10, 8, 9, 8, 6, 7, -1, -1, -1, -1 }, 170 | { 2, 0, 7, 2, 7, 11, 0, 9, 7, 6, 7, 10, 9, 10, 7, -1 }, 171 | { 1, 8, 0, 1, 7, 8, 1, 10, 7, 6, 7, 10, 2, 3, 11, -1 }, 172 | { 11, 2, 1, 11, 1, 7, 10, 6, 1, 6, 7, 1, -1, -1, -1, -1 }, 173 | { 8, 9, 6, 8, 6, 7, 9, 1, 6, 11, 6, 3, 1, 3, 6, -1 }, 174 | { 0, 9, 1, 11, 6, 7, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 175 | { 7, 8, 0, 7, 0, 6, 3, 11, 0, 11, 6, 0, -1, -1, -1, -1 }, 176 | { 7, 11, 6, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 177 | { 7, 6, 11, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 178 | { 3, 0, 8, 
11, 7, 6, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 179 | { 0, 1, 9, 11, 7, 6, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 180 | { 8, 1, 9, 8, 3, 1, 11, 7, 6, -1, -1, -1, -1, -1, -1, -1 }, 181 | { 10, 1, 2, 6, 11, 7, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 182 | { 1, 2, 10, 3, 0, 8, 6, 11, 7, -1, -1, -1, -1, -1, -1, -1 }, 183 | { 2, 9, 0, 2, 10, 9, 6, 11, 7, -1, -1, -1, -1, -1, -1, -1 }, 184 | { 6, 11, 7, 2, 10, 3, 10, 8, 3, 10, 9, 8, -1, -1, -1, -1 }, 185 | { 7, 2, 3, 6, 2, 7, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 186 | { 7, 0, 8, 7, 6, 0, 6, 2, 0, -1, -1, -1, -1, -1, -1, -1 }, 187 | { 2, 7, 6, 2, 3, 7, 0, 1, 9, -1, -1, -1, -1, -1, -1, -1 }, 188 | { 1, 6, 2, 1, 8, 6, 1, 9, 8, 8, 7, 6, -1, -1, -1, -1 }, 189 | { 10, 7, 6, 10, 1, 7, 1, 3, 7, -1, -1, -1, -1, -1, -1, -1 }, 190 | { 10, 7, 6, 1, 7, 10, 1, 8, 7, 1, 0, 8, -1, -1, -1, -1 }, 191 | { 0, 3, 7, 0, 7, 10, 0, 10, 9, 6, 10, 7, -1, -1, -1, -1 }, 192 | { 7, 6, 10, 7, 10, 8, 8, 10, 9, -1, -1, -1, -1, -1, -1, -1 }, 193 | { 6, 8, 4, 11, 8, 6, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 194 | { 3, 6, 11, 3, 0, 6, 0, 4, 6, -1, -1, -1, -1, -1, -1, -1 }, 195 | { 8, 6, 11, 8, 4, 6, 9, 0, 1, -1, -1, -1, -1, -1, -1, -1 }, 196 | { 9, 4, 6, 9, 6, 3, 9, 3, 1, 11, 3, 6, -1, -1, -1, -1 }, 197 | { 6, 8, 4, 6, 11, 8, 2, 10, 1, -1, -1, -1, -1, -1, -1, -1 }, 198 | { 1, 2, 10, 3, 0, 11, 0, 6, 11, 0, 4, 6, -1, -1, -1, -1 }, 199 | { 4, 11, 8, 4, 6, 11, 0, 2, 9, 2, 10, 9, -1, -1, -1, -1 }, 200 | { 10, 9, 3, 10, 3, 2, 9, 4, 3, 11, 3, 6, 4, 6, 3, -1 }, 201 | { 8, 2, 3, 8, 4, 2, 4, 6, 2, -1, -1, -1, -1, -1, -1, -1 }, 202 | { 0, 4, 2, 4, 6, 2, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 203 | { 1, 9, 0, 2, 3, 4, 2, 4, 6, 4, 3, 8, -1, -1, -1, -1 }, 204 | { 1, 9, 4, 1, 4, 2, 2, 4, 6, -1, -1, -1, -1, -1, -1, -1 }, 205 | { 8, 1, 3, 8, 6, 1, 8, 4, 6, 6, 10, 1, -1, -1, -1, -1 }, 206 | { 10, 1, 0, 10, 0, 6, 6, 0, 4, -1, -1, -1, -1, -1, -1, -1 }, 207 | { 4, 6, 3, 4, 3, 8, 6, 10, 3, 0, 3, 9, 10, 9, 3, -1 }, 208 | { 10, 9, 4, 6, 10, 4, -1, -1, -1, -1, 
-1, -1, -1, -1, -1, -1 }, 209 | { 4, 9, 5, 7, 6, 11, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 210 | { 0, 8, 3, 4, 9, 5, 11, 7, 6, -1, -1, -1, -1, -1, -1, -1 }, 211 | { 5, 0, 1, 5, 4, 0, 7, 6, 11, -1, -1, -1, -1, -1, -1, -1 }, 212 | { 11, 7, 6, 8, 3, 4, 3, 5, 4, 3, 1, 5, -1, -1, -1, -1 }, 213 | { 9, 5, 4, 10, 1, 2, 7, 6, 11, -1, -1, -1, -1, -1, -1, -1 }, 214 | { 6, 11, 7, 1, 2, 10, 0, 8, 3, 4, 9, 5, -1, -1, -1, -1 }, 215 | { 7, 6, 11, 5, 4, 10, 4, 2, 10, 4, 0, 2, -1, -1, -1, -1 }, 216 | { 3, 4, 8, 3, 5, 4, 3, 2, 5, 10, 5, 2, 11, 7, 6, -1 }, 217 | { 7, 2, 3, 7, 6, 2, 5, 4, 9, -1, -1, -1, -1, -1, -1, -1 }, 218 | { 9, 5, 4, 0, 8, 6, 0, 6, 2, 6, 8, 7, -1, -1, -1, -1 }, 219 | { 3, 6, 2, 3, 7, 6, 1, 5, 0, 5, 4, 0, -1, -1, -1, -1 }, 220 | { 6, 2, 8, 6, 8, 7, 2, 1, 8, 4, 8, 5, 1, 5, 8, -1 }, 221 | { 9, 5, 4, 10, 1, 6, 1, 7, 6, 1, 3, 7, -1, -1, -1, -1 }, 222 | { 1, 6, 10, 1, 7, 6, 1, 0, 7, 8, 7, 0, 9, 5, 4, -1 }, 223 | { 4, 0, 10, 4, 10, 5, 0, 3, 10, 6, 10, 7, 3, 7, 10, -1 }, 224 | { 7, 6, 10, 7, 10, 8, 5, 4, 10, 4, 8, 10, -1, -1, -1, -1 }, 225 | { 6, 9, 5, 6, 11, 9, 11, 8, 9, -1, -1, -1, -1, -1, -1, -1 }, 226 | { 3, 6, 11, 0, 6, 3, 0, 5, 6, 0, 9, 5, -1, -1, -1, -1 }, 227 | { 0, 11, 8, 0, 5, 11, 0, 1, 5, 5, 6, 11, -1, -1, -1, -1 }, 228 | { 6, 11, 3, 6, 3, 5, 5, 3, 1, -1, -1, -1, -1, -1, -1, -1 }, 229 | { 1, 2, 10, 9, 5, 11, 9, 11, 8, 11, 5, 6, -1, -1, -1, -1 }, 230 | { 0, 11, 3, 0, 6, 11, 0, 9, 6, 5, 6, 9, 1, 2, 10, -1 }, 231 | { 11, 8, 5, 11, 5, 6, 8, 0, 5, 10, 5, 2, 0, 2, 5, -1 }, 232 | { 6, 11, 3, 6, 3, 5, 2, 10, 3, 10, 5, 3, -1, -1, -1, -1 }, 233 | { 5, 8, 9, 5, 2, 8, 5, 6, 2, 3, 8, 2, -1, -1, -1, -1 }, 234 | { 9, 5, 6, 9, 6, 0, 0, 6, 2, -1, -1, -1, -1, -1, -1, -1 }, 235 | { 1, 5, 8, 1, 8, 0, 5, 6, 8, 3, 8, 2, 6, 2, 8, -1 }, 236 | { 1, 5, 6, 2, 1, 6, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 237 | { 1, 3, 6, 1, 6, 10, 3, 8, 6, 5, 6, 9, 8, 9, 6, -1 }, 238 | { 10, 1, 0, 10, 0, 6, 9, 5, 0, 5, 6, 0, -1, -1, -1, -1 }, 239 | { 0, 3, 8, 5, 6, 10, -1, -1, -1, -1, -1, -1, -1, 
-1, -1, -1 }, 240 | { 10, 5, 6, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 241 | { 11, 5, 10, 7, 5, 11, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 242 | { 11, 5, 10, 11, 7, 5, 8, 3, 0, -1, -1, -1, -1, -1, -1, -1 }, 243 | { 5, 11, 7, 5, 10, 11, 1, 9, 0, -1, -1, -1, -1, -1, -1, -1 }, 244 | { 10, 7, 5, 10, 11, 7, 9, 8, 1, 8, 3, 1, -1, -1, -1, -1 }, 245 | { 11, 1, 2, 11, 7, 1, 7, 5, 1, -1, -1, -1, -1, -1, -1, -1 }, 246 | { 0, 8, 3, 1, 2, 7, 1, 7, 5, 7, 2, 11, -1, -1, -1, -1 }, 247 | { 9, 7, 5, 9, 2, 7, 9, 0, 2, 2, 11, 7, -1, -1, -1, -1 }, 248 | { 7, 5, 2, 7, 2, 11, 5, 9, 2, 3, 2, 8, 9, 8, 2, -1 }, 249 | { 2, 5, 10, 2, 3, 5, 3, 7, 5, -1, -1, -1, -1, -1, -1, -1 }, 250 | { 8, 2, 0, 8, 5, 2, 8, 7, 5, 10, 2, 5, -1, -1, -1, -1 }, 251 | { 9, 0, 1, 5, 10, 3, 5, 3, 7, 3, 10, 2, -1, -1, -1, -1 }, 252 | { 9, 8, 2, 9, 2, 1, 8, 7, 2, 10, 2, 5, 7, 5, 2, -1 }, 253 | { 1, 3, 5, 3, 7, 5, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 254 | { 0, 8, 7, 0, 7, 1, 1, 7, 5, -1, -1, -1, -1, -1, -1, -1 }, 255 | { 9, 0, 3, 9, 3, 5, 5, 3, 7, -1, -1, -1, -1, -1, -1, -1 }, 256 | { 9, 8, 7, 5, 9, 7, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 257 | { 5, 8, 4, 5, 10, 8, 10, 11, 8, -1, -1, -1, -1, -1, -1, -1 }, 258 | { 5, 0, 4, 5, 11, 0, 5, 10, 11, 11, 3, 0, -1, -1, -1, -1 }, 259 | { 0, 1, 9, 8, 4, 10, 8, 10, 11, 10, 4, 5, -1, -1, -1, -1 }, 260 | { 10, 11, 4, 10, 4, 5, 11, 3, 4, 9, 4, 1, 3, 1, 4, -1 }, 261 | { 2, 5, 1, 2, 8, 5, 2, 11, 8, 4, 5, 8, -1, -1, -1, -1 }, 262 | { 0, 4, 11, 0, 11, 3, 4, 5, 11, 2, 11, 1, 5, 1, 11, -1 }, 263 | { 0, 2, 5, 0, 5, 9, 2, 11, 5, 4, 5, 8, 11, 8, 5, -1 }, 264 | { 9, 4, 5, 2, 11, 3, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 265 | { 2, 5, 10, 3, 5, 2, 3, 4, 5, 3, 8, 4, -1, -1, -1, -1 }, 266 | { 5, 10, 2, 5, 2, 4, 4, 2, 0, -1, -1, -1, -1, -1, -1, -1 }, 267 | { 3, 10, 2, 3, 5, 10, 3, 8, 5, 4, 5, 8, 0, 1, 9, -1 }, 268 | { 5, 10, 2, 5, 2, 4, 1, 9, 2, 9, 4, 2, -1, -1, -1, -1 }, 269 | { 8, 4, 5, 8, 5, 3, 3, 5, 1, -1, -1, -1, -1, -1, -1, -1 }, 270 | { 0, 4, 5, 1, 0, 5, -1, 
-1, -1, -1, -1, -1, -1, -1, -1, -1 }, 271 | { 8, 4, 5, 8, 5, 3, 9, 0, 5, 0, 3, 5, -1, -1, -1, -1 }, 272 | { 9, 4, 5, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 273 | { 4, 11, 7, 4, 9, 11, 9, 10, 11, -1, -1, -1, -1, -1, -1, -1 }, 274 | { 0, 8, 3, 4, 9, 7, 9, 11, 7, 9, 10, 11, -1, -1, -1, -1 }, 275 | { 1, 10, 11, 1, 11, 4, 1, 4, 0, 7, 4, 11, -1, -1, -1, -1 }, 276 | { 3, 1, 4, 3, 4, 8, 1, 10, 4, 7, 4, 11, 10, 11, 4, -1 }, 277 | { 4, 11, 7, 9, 11, 4, 9, 2, 11, 9, 1, 2, -1, -1, -1, -1 }, 278 | { 9, 7, 4, 9, 11, 7, 9, 1, 11, 2, 11, 1, 0, 8, 3, -1 }, 279 | { 11, 7, 4, 11, 4, 2, 2, 4, 0, -1, -1, -1, -1, -1, -1, -1 }, 280 | { 11, 7, 4, 11, 4, 2, 8, 3, 4, 3, 2, 4, -1, -1, -1, -1 }, 281 | { 2, 9, 10, 2, 7, 9, 2, 3, 7, 7, 4, 9, -1, -1, -1, -1 }, 282 | { 9, 10, 7, 9, 7, 4, 10, 2, 7, 8, 7, 0, 2, 0, 7, -1 }, 283 | { 3, 7, 10, 3, 10, 2, 7, 4, 10, 1, 10, 0, 4, 0, 10, -1 }, 284 | { 1, 10, 2, 8, 7, 4, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 285 | { 4, 9, 1, 4, 1, 7, 7, 1, 3, -1, -1, -1, -1, -1, -1, -1 }, 286 | { 4, 9, 1, 4, 1, 7, 0, 8, 1, 8, 7, 1, -1, -1, -1, -1 }, 287 | { 4, 0, 3, 7, 4, 3, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 288 | { 4, 8, 7, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 289 | { 9, 10, 8, 10, 11, 8, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 290 | { 3, 0, 9, 3, 9, 11, 11, 9, 10, -1, -1, -1, -1, -1, -1, -1 }, 291 | { 0, 1, 10, 0, 10, 8, 8, 10, 11, -1, -1, -1, -1, -1, -1, -1 }, 292 | { 3, 1, 10, 11, 3, 10, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 293 | { 1, 2, 11, 1, 11, 9, 9, 11, 8, -1, -1, -1, -1, -1, -1, -1 }, 294 | { 3, 0, 9, 3, 9, 11, 1, 2, 9, 2, 11, 9, -1, -1, -1, -1 }, 295 | { 0, 2, 11, 8, 0, 11, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 296 | { 3, 2, 11, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 297 | { 2, 3, 8, 2, 8, 10, 10, 8, 9, -1, -1, -1, -1, -1, -1, -1 }, 298 | { 9, 10, 2, 0, 9, 2, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 299 | { 2, 3, 8, 2, 8, 10, 0, 1, 8, 1, 10, 8, -1, -1, -1, -1 }, 300 | { 1, 10, 2, -1, -1, -1, 
-1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 301 | { 1, 3, 8, 9, 1, 8, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 302 | { 0, 9, 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 303 | { 0, 3, 8, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 }, 304 | { -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1 } }; 305 | -------------------------------------------------------------------------------- /external/NumpyMarchingCubes/setup.py: -------------------------------------------------------------------------------- 1 | # -*- encoding: utf-8 -*- 2 | 3 | from setuptools import setup 4 | 5 | from setuptools.extension import Extension 6 | 7 | 8 | class lazy_cythonize(list): 9 | """ 10 | Lazy evaluate extension definition, to allow correct requirements install. 11 | """ 12 | 13 | def __init__(self, callback): 14 | super(lazy_cythonize, self).__init__() 15 | self._list, self.callback = None, callback 16 | 17 | def c_list(self): 18 | if self._list is None: 19 | self._list = self.callback() 20 | 21 | return self._list 22 | 23 | def __iter__(self): 24 | for e in self.c_list(): 25 | yield e 26 | 27 | def __getitem__(self, ii): 28 | return self.c_list()[ii] 29 | 30 | def __len__(self): 31 | return len(self.c_list()) 32 | 33 | 34 | def extensions(): 35 | 36 | from Cython.Build import cythonize 37 | import numpy 38 | 39 | numpy_include_dir = numpy.get_include() 40 | 41 | marching_cubes_module = Extension( 42 | "marching_cubes._mcubes", 43 | [ 44 | "marching_cubes/src/_mcubes.pyx", 45 | "marching_cubes/src/pywrapper.cpp", 46 | "marching_cubes/src/marching_cubes.cpp" 47 | ], 48 | language="c++", 49 | extra_compile_args=['-std=c++11', '-Wall'], 50 | include_dirs=[numpy_include_dir], 51 | depends=[ 52 | "marching_cubes/src/marching_cubes.h", 53 | "marching_cubes/src/pyarray_symbol.h", 54 | "marching_cubes/src/pyarraymodule.h", 55 | "marching_cubes/src/pywrapper.h" 56 | ], 57 | ) 58 | 59 | return cythonize([marching_cubes_module]) 60 | 61 | setup( 62 | 
name="NumpyMarchingCubes", 63 | version="0.0.1", 64 | description="Marching cubes for Python", 65 | author="Dejan Azinovic, Angela Dai, Justus Thies (PyMCubes: Pablo Márquez Neila)", 66 | url="", 67 | license="BSD 3-clause", 68 | long_description=""" 69 | Marching cubes for Python 70 | """, 71 | classifiers=[ 72 | "Development Status :: 5 - Production/Stable", 73 | "Environment :: Console", 74 | "Intended Audience :: Developers", 75 | "Intended Audience :: Science/Research", 76 | "License :: OSI Approved :: BSD License", 77 | "Natural Language :: English", 78 | "Operating System :: OS Independent", 79 | "Programming Language :: C++", 80 | "Programming Language :: Python", 81 | "Topic :: Multimedia :: Graphics :: 3D Modeling", 82 | "Topic :: Scientific/Engineering :: Image Recognition", 83 | ], 84 | packages=["marching_cubes"], 85 | ext_modules=lazy_cythonize(extensions), 86 | requires=['numpy', 'Cython', 'PyCollada'], 87 | setup_requires=['numpy', 'Cython'] 88 | ) 89 | -------------------------------------------------------------------------------- /extract_mesh.py: -------------------------------------------------------------------------------- 1 | import load_network_model 2 | import os 3 | import scene_bounds 4 | 5 | import tensorflow as tf 6 | import numpy as np 7 | 8 | import marching_cubes as mcubes 9 | import trimesh 10 | 11 | 12 | def get_batch_query_fn(query_fn, feature_array, network_fn): 13 | 14 | fn = lambda f, i0, i1: query_fn(f[i0:i1, None, :], viewdirs=tf.zeros_like(f[i0:i1]), 15 | feature_array=feature_array, 16 | pose_array=None, 17 | frame_ids=tf.zeros_like(f[i0:i1, 0], dtype=tf.int32), 18 | deformation_field=None, 19 | c2w_array=None, 20 | network_fn=network_fn) 21 | 22 | return fn 23 | 24 | 25 | def extract_mesh(query_fn, feature_array, network_fn, args, voxel_size=0.01, isolevel=0.0, scene_name='', mesh_savepath=''): 26 | 27 | # Query network on dense 3d grid of points 28 | voxel_size *= args.sc_factor # in "network space" 29 | 30 | tx, ty, tz 
= scene_bounds.get_scene_bounds(scene_name, voxel_size, True) 31 | 32 | query_pts = np.stack(np.meshgrid(tx, ty, tz, indexing='ij'), -1).astype(np.float32) 33 | print(query_pts.shape) 34 | sh = query_pts.shape 35 | flat = query_pts.reshape([-1, 3]) 36 | 37 | fn = get_batch_query_fn(query_fn, feature_array, network_fn) 38 | 39 | chunk = 1024 * 64 40 | raw = np.concatenate([fn(flat, i, i + chunk)[0].numpy() for i in range(0, flat.shape[0], chunk)], 0) 41 | raw = np.reshape(raw, list(sh[:-1]) + [-1]) 42 | sigma = raw[..., -1] 43 | 44 | print('Running Marching Cubes') 45 | vertices, triangles = mcubes.marching_cubes(sigma, isolevel, truncation=3.0) 46 | print('done', vertices.shape, triangles.shape) 47 | 48 | # Normalize vertex positions 49 | vertices[:, :3] /= np.array([[tx.shape[0] - 1, ty.shape[0] - 1, tz.shape[0] - 1]]) 50 | 51 | # Rescale and translate 52 | scale = np.array([tx[-1] - tx[0], ty[-1] - ty[0], tz[-1] - tz[0]]) 53 | offset = np.array([tx[0], ty[0], tz[0]]) 54 | vertices[:, :3] = scale[np.newaxis, :] * vertices[:, :3] + offset 55 | 56 | # Transform to metric units 57 | vertices[:, :3] = vertices[:, :3] / args.sc_factor - args.translation 58 | 59 | # Create mesh 60 | mesh = trimesh.Trimesh(vertices, triangles, process=False) 61 | 62 | # Transform the mesh to Scannet's coordinate system 63 | gl_to_scannet = np.array([[1, 0, 0, 0], 64 | [0, 0, -1, 0], 65 | [0, 1, 0, 0], 66 | [0, 0, 0, 1]]).astype(np.float32).reshape([4, 4]) 67 | 68 | mesh.apply_transform(gl_to_scannet) 69 | 70 | if mesh_savepath == '': 71 | mesh_savepath = os.path.join(args.basedir, args.expname, f"mesh_vs{voxel_size / args.sc_factor}.ply") 72 | mesh.export(mesh_savepath) 73 | 74 | print('Mesh saved') 75 | 76 | 77 | if __name__ == '__main__': 78 | # Checkpoint path information 79 | experiments = [ 80 | { 81 | 'basedir': './logs', 82 | 'expname': 'whiteroom' 83 | }, 84 | ] 85 | 86 | iter = 400000 87 | 88 | for e in experiments: 89 | basedir, expname = e.values() 90 | print(basedir, expname)
91 | 92 | # Create nerf model 93 | args, render_kwargs_test, query_fn, feature_array, network_fn = load_network_model.load_network_model_from_disk(expname, iter, basedir) 94 | 95 | args.basedir = basedir 96 | args.expname = expname 97 | mesh_savepath = os.path.join(basedir, expname, f"mesh_color_vs0.01_{iter:06}.ply") 98 | 99 | extract_mesh(query_fn, feature_array, network_fn, args, voxel_size=0.01, scene_name='whiteroom', mesh_savepath=mesh_savepath) 100 | -------------------------------------------------------------------------------- /extract_optimized_poses.py: -------------------------------------------------------------------------------- 1 | import os 2 | import numpy as np 3 | import optimize 4 | from dataloader_util import load_poses 5 | from pose_array import PoseArray 6 | 7 | 8 | def get_pose_array(expname, iter, basedir='./logs'): 9 | 10 | config = os.path.join(basedir, expname, 'config.txt') 11 | print('Args:') 12 | print(open(config, 'r').read()) 13 | 14 | parser = optimize.config_parser() 15 | args = parser.parse_args('--config {} '.format(config)) 16 | 17 | # Load poses 18 | tmp, valid = load_poses(os.path.join(args.datadir, 'trainval_poses.txt')) 19 | poses = [] 20 | for i in range(len(tmp)): 21 | if valid[i]: 22 | poses.append(tmp[i]) 23 | 24 | poses = np.array(poses).astype(np.float32) 25 | poses = poses[::args.trainskip] 26 | poses[:, :3, 3] += args.translation 27 | poses[:, :3, 3] *= args.sc_factor 28 | args.num_training_frames = len(poses) 29 | 30 | # Create pose array 31 | pose_array = PoseArray(args.num_training_frames) 32 | pose_array_path = os.path.join(basedir, expname, f'pose_array_{iter:06}.npy') 33 | print('Reloading pose array from', pose_array_path) 34 | pose_array.set_weights(np.load(pose_array_path, allow_pickle=True)) 35 | 36 | return poses, pose_array, args 37 | 38 | 39 | def extract_poses(poses, pose_array, args): 40 | 41 | os.makedirs(os.path.join(basedir, expname, 'poses'), exist_ok=True) 42 | 43 | original_poses = 
poses.copy() 44 | original_poses[:, :3, 3] /= args.sc_factor 45 | original_poses[:, :3, 3] -= args.translation 46 | original_poses = np.reshape(original_poses, [-1, 4]) 47 | np.savetxt(os.path.join(basedir, expname, 'poses', 'poses.txt'), original_poses, fmt="%.6f") 48 | 49 | # Apply pose transformation 50 | pose_delta = [] 51 | optimized_poses = [] 52 | for idx in range(poses.shape[0]): 53 | R = pose_array.get_rotation_matrices(np.array([idx, 1])).numpy()[0, :, :] 54 | t = pose_array.get_translations(np.array([idx, 1])).numpy()[0, :, np.newaxis] 55 | 56 | T = np.concatenate([R, t], -1) 57 | T = np.concatenate([T, np.array([[0, 0, 0, 1]])], 0) 58 | pose_delta.append(T) 59 | 60 | poses[idx] = T @ poses[idx] 61 | optimized_poses.append(poses[idx]) 62 | 63 | pose_delta = np.array(pose_delta).astype(np.float32) 64 | pose_delta = np.reshape(pose_delta, [-1, 4]) 65 | np.savetxt(os.path.join(basedir, expname, 'poses', 'pose_delta.txt'), pose_delta, fmt="%.6f") 66 | 67 | optimized_poses = np.array(optimized_poses).astype(np.float32) 68 | optimized_poses[:, :3, 3] /= args.sc_factor 69 | optimized_poses[:, :3, 3] -= args.translation 70 | optimized_poses = np.reshape(optimized_poses, [-1, 4]) 71 | np.savetxt(os.path.join(basedir, expname, 'poses', 'optimized_poses.txt'), optimized_poses, fmt="%.6f") 72 | 73 | 74 | if __name__ == '__main__': 75 | # Checkpoint path information 76 | experiments = [ 77 | { 78 | 'basedir': './logs', 79 | 'expname': 'whiteroom' 80 | }, 81 | ] 82 | 83 | iter = 400000 84 | 85 | for e in experiments: 86 | basedir, expname = e.values() 87 | print(basedir, expname) 88 | 89 | poses, pose_array, args = get_pose_array(expname, iter, basedir) 90 | extract_poses(poses, pose_array, args) 91 | -------------------------------------------------------------------------------- /frame_features.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | 3 | 4 | class FeatureArray(tf.Module): 5 | """ 6 | 
Per-frame corrective latent code. 7 | """ 8 | 9 | def __init__(self, num_frames, num_channels): 10 | super(FeatureArray, self).__init__() 11 | 12 | self.num_frames = num_frames 13 | self.num_channels = num_channels 14 | 15 | self.data = tf.Variable( 16 | tf.random.normal([num_frames, num_channels], dtype=tf.float32) 17 | ) 18 | 19 | def __call__(self, ids): 20 | ids = tf.where(ids < self.num_frames, ids, tf.zeros_like(ids)) 21 | return tf.gather(self.data, ids) 22 | 23 | def get_weights(self): 24 | return self.data.numpy() 25 | 26 | def set_weights(self, weights): 27 | self.data.assign(weights) 28 | -------------------------------------------------------------------------------- /load_dataset.py: -------------------------------------------------------------------------------- 1 | from load_scannet import load_scannet_data 2 | 3 | 4 | def load_dataset(args): 5 | 6 | if args.dataset_type == "scannet": 7 | images, depth_images, poses, hwf, frame_indices = load_scannet_data(basedir=args.datadir, 8 | trainskip=args.trainskip, 9 | downsample_factor=args.factor, 10 | translation=args.translation, 11 | sc_factor=args.sc_factor, 12 | crop=args.crop) 13 | 14 | print('Loaded scannet', images.shape, hwf, args.datadir) 15 | 16 | # Calls to other dataloaders go here 17 | # elif args.dataset_type == "": 18 | 19 | else: 20 | print('Unknown dataset type', args.dataset_type, 'exiting') 21 | return 22 | 23 | return images, depth_images, poses, hwf, frame_indices 24 | -------------------------------------------------------------------------------- /load_network_model.py: -------------------------------------------------------------------------------- 1 | import os 2 | from load_scannet import get_num_training_frames 3 | import optimize 4 | 5 | 6 | def load_network_model_from_disk(expname, iter, basedir='./logs'): 7 | 8 | config = os.path.join(basedir, expname, 'config.txt') 9 | print('Args:') 10 | print(open(config, 'r').read()) 11 | 12 | parser = optimize.config_parser() 13 | ft_str 
= '' 14 | if iter is not None: 15 | ft_str = '--ft_path {}'.format(os.path.join(basedir, expname, f'model_{iter:06}.npy')) 16 | args = parser.parse_args('--config {} '.format(config) + ft_str) 17 | 18 | args.num_training_frames = get_num_training_frames(args.datadir, trainskip=args.trainskip) 19 | print(args.num_training_frames) 20 | 21 | # Create nerf model 22 | _, render_kwargs_test, _, _, models = optimize.create_nerf(args) 23 | 24 | query_fn = render_kwargs_test['network_query_fn'] 25 | 26 | network_fn = render_kwargs_test['network_fn'] 27 | if args.N_importance > 0 and not args.share_coarse_fine: 28 | network_fn = render_kwargs_test['network_fine'] 29 | 30 | feature_array = None 31 | if 'feature_array' in models: 32 | feature_array = models['feature_array'] 33 | 34 | return args, render_kwargs_test, query_fn, feature_array, network_fn 35 | -------------------------------------------------------------------------------- /load_scannet.py: -------------------------------------------------------------------------------- 1 | import os 2 | import imageio 3 | from dataloader_util import * 4 | 5 | 6 | def get_training_poses(basedir, translation=0.0, sc_factor=1.0, trainskip=1): 7 | all_poses, valid = load_poses(os.path.join(basedir, 'trainval_poses.txt')) 8 | 9 | train_frames = [] 10 | for idx in range(0, len(all_poses), trainskip): 11 | if valid[idx]: 12 | train_frames.append(idx) 13 | 14 | all_poses = np.array(all_poses).astype(np.float32) 15 | training_poses = all_poses[train_frames] 16 | 17 | training_poses[:, :3, 3] += translation 18 | training_poses[:, :3, 3] *= sc_factor 19 | 20 | return training_poses 21 | 22 | 23 | def get_num_training_frames(basedir, trainskip): 24 | poses = get_training_poses(basedir, trainskip=trainskip) 25 | 26 | return poses.shape[0] 27 | 28 | 29 | def get_intrinsics(basedir, crop): 30 | depth = imageio.imread(os.path.join(basedir, 'depth_filtered', 'depth0.png')) 31 | H, W = depth.shape[:2] 32 | H = H - 2 * crop  # the loader crops `crop` pixels from each image edge 33 | W = W - 2 * crop 34
| focal = load_focal_length(os.path.join(basedir, 'focal.txt')) 35 | 36 | return H, W, focal 37 | 38 | 39 | def load_scannet_data(basedir, trainskip, downsample_factor=1, translation=0.0, sc_factor=1., crop=0): 40 | 41 | # Get image filenames, poses and intrinsics 42 | img_files = [f for f in sorted(os.listdir(os.path.join(basedir, 'images')), key=alphanum_key) if f.endswith('png')] 43 | depth_files = [f for f in sorted(os.listdir(os.path.join(basedir, 'depth_filtered')), key=alphanum_key) if f.endswith('png')] 44 | all_poses, valid_poses = load_poses(os.path.join(basedir, 'trainval_poses.txt')) 45 | 46 | # Train, val and test split 47 | num_frames = len(img_files) 48 | train_frame_ids = list(range(0, num_frames, trainskip)) 49 | 50 | # Lists for the data to load into 51 | images = [] 52 | depth_maps = [] 53 | poses = [] 54 | frame_indices = [] 55 | 56 | # Read images and depth maps for which valid poses exist 57 | for i in train_frame_ids: 58 | if valid_poses[i]: 59 | img = imageio.imread(os.path.join(basedir, 'images', img_files[i])) 60 | depth = imageio.imread(os.path.join(basedir, 'depth_filtered', depth_files[i])) 61 | 62 | images.append(img) 63 | depth_maps.append(depth) 64 | poses.append(all_poses[i]) 65 | frame_indices.append(i) 66 | 67 | # Map images to [0, 1] range 68 | images = (np.array(images) / 255.).astype(np.float32) 69 | 70 | # Convert depth to meters, then to "network units" 71 | depth_shift = 1000.0 72 | depth_maps = (np.array(depth_maps) / depth_shift).astype(np.float32) 73 | depth_maps *= sc_factor 74 | depth_maps = depth_maps[..., np.newaxis] 75 | 76 | poses = np.array(poses).astype(np.float32) 77 | poses[:, :3, 3] += translation 78 | poses[:, :3, 3] *= sc_factor 79 | 80 | # Intrinsics 81 | H, W = depth_maps[0].shape[:2] 82 | focal = load_focal_length(os.path.join(basedir, 'focal.txt')) 83 | 84 | # Resize color frames to match depth 85 | images = resize_images(images, H, W) 86 | 87 | # Crop the undistortion artifacts 88 | if crop > 0: 89 | 
images = images[:, crop:-crop, crop:-crop, :] 90 | depth_maps = depth_maps[:, crop:-crop, crop:-crop, :] 91 | H, W = depth_maps[0].shape[:2] 92 | 93 | if downsample_factor > 1: 94 | H = H//downsample_factor 95 | W = W//downsample_factor 96 | focal = focal/downsample_factor 97 | images = resize_images(images, H, W) 98 | depth_maps = resize_images(depth_maps, H, W, interpolation=cv2.INTER_NEAREST) 99 | 100 | return images, depth_maps, poses, [H, W, focal], frame_indices 101 | -------------------------------------------------------------------------------- /losses.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | from nerf_helpers import img2mae 3 | from nerf_helpers import img2mse 4 | 5 | 6 | def compute_loss(prediction, target, loss_type='l2'): 7 | if loss_type == 'l2': 8 | return img2mse(prediction, target) 9 | elif loss_type == 'l1': 10 | return img2mae(prediction, target) 11 | 12 | raise Exception('Unsupported loss type') 13 | 14 | 15 | def get_masks(z_vals, target_d, truncation): 16 | 17 | front_mask = tf.where(z_vals < (target_d - truncation), tf.ones_like(z_vals), tf.zeros_like(z_vals)) 18 | back_mask = tf.where(z_vals > (target_d + truncation), tf.ones_like(z_vals), tf.zeros_like(z_vals)) 19 | depth_mask = tf.where(target_d > 0.0, tf.ones_like(target_d), tf.zeros_like(target_d)) 20 | sdf_mask = (1.0 - front_mask) * (1.0 - back_mask) * depth_mask 21 | 22 | num_fs_samples = tf.math.count_nonzero(front_mask, dtype=tf.float32) 23 | num_sdf_samples = tf.math.count_nonzero(sdf_mask, dtype=tf.float32) 24 | num_samples = num_sdf_samples + num_fs_samples 25 | fs_weight = 1.0 - num_fs_samples / num_samples 26 | sdf_weight = 1.0 - num_sdf_samples / num_samples 27 | 28 | return front_mask, sdf_mask, fs_weight, sdf_weight 29 | 30 | 31 | def get_sdf_loss(z_vals, target_d, predicted_sdf, truncation, loss_type): 32 | 33 | front_mask, sdf_mask, fs_weight, sdf_weight = get_masks(z_vals, target_d, truncation) 34 
| 35 | fs_loss = compute_loss(predicted_sdf * front_mask, tf.ones_like(predicted_sdf) * front_mask, loss_type) * fs_weight 36 | sdf_loss = compute_loss((z_vals + predicted_sdf * truncation) * sdf_mask, target_d * sdf_mask, loss_type) * sdf_weight 37 | 38 | return fs_loss, sdf_loss 39 | 40 | 41 | def get_depth_loss(predicted_depth, target_d, loss_type='l2'): 42 | depth_mask = tf.where(target_d > 0, tf.ones_like(target_d), tf.zeros_like(target_d)) 43 | eps = 1e-4 44 | num_pixel = tf.size(depth_mask, out_type=tf.float32) 45 | num_valid = tf.math.count_nonzero(depth_mask, dtype=tf.float32) + eps 46 | depth_valid_weight = num_pixel / num_valid 47 | 48 | return compute_loss(predicted_depth[..., tf.newaxis] * depth_mask, target_d * depth_mask, loss_type) * depth_valid_weight 49 | 50 | -------------------------------------------------------------------------------- /nerf_helpers.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | import numpy as np 3 | 4 | 5 | # Misc utils 6 | 7 | img2mse = lambda x, y: tf.reduce_mean(tf.square(x - y)) 8 | img2mae = lambda x, y: tf.reduce_mean(tf.abs(x - y)) 9 | mse2psnr = lambda x: -10.*tf.math.log(x)/tf.math.log(10.) 
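The `mse2psnr` helper above converts a mean squared error into PSNR in decibels, i.e. psnr = -10 · log10(mse). A quick numpy sanity check of the same formula (an illustrative aside, not part of the repository; variable names are mine):

```python
import numpy as np

# Mirrors the tf lambda above: mse2psnr = lambda x: -10. * tf.math.log(x) / tf.math.log(10.)
# For images scaled to [0, 1], an MSE of 0.01 corresponds to 20 dB PSNR.
mse = 0.01
psnr = -10.0 * np.log10(mse)  # 20.0
```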
10 | to8b = lambda x: (255*np.clip(x, 0, 1)).astype(np.uint8) 11 | to_depth16 = lambda x: (1000 * x).astype(np.uint16) 12 | 13 | # Positional encoding 14 | 15 | class Embedder: 16 | 17 | def __init__(self, **kwargs): 18 | self.kwargs = kwargs 19 | self.create_embedding_fn() 20 | 21 | def create_embedding_fn(self): 22 | 23 | embed_fns = [] 24 | d = self.kwargs['input_dims'] 25 | out_dim = 0 26 | if self.kwargs['include_input']: 27 | embed_fns.append(lambda x: x) 28 | out_dim += d 29 | 30 | max_freq = self.kwargs['max_freq_log2'] 31 | N_freqs = self.kwargs['num_freqs'] 32 | 33 | if self.kwargs['log_sampling']: 34 | freq_bands = 2.**tf.linspace(0., max_freq, N_freqs) 35 | else: 36 | freq_bands = tf.linspace(2.**0., 2.**max_freq, N_freqs) 37 | 38 | if self.kwargs['gaussian']: 39 | B = tf.random.normal([3, 256], 0.0, 10.0, seed=0) 40 | 41 | for freq in freq_bands: 42 | for p_fn in self.kwargs['periodic_fns']: 43 | if self.kwargs['gaussian']: 44 | embed_fns.append(lambda x, p_fn=p_fn, freq=freq, B=B: p_fn(tf.matmul(x, B) * freq)) 45 | else: 46 | embed_fns.append(lambda x, p_fn=p_fn, freq=freq: p_fn(x * freq)) 47 | out_dim += d 48 | 49 | self.embed_fns = embed_fns 50 | self.out_dim = out_dim 51 | 52 | if self.kwargs['gaussian']: 53 | self.out_dim = B.shape[1] * 2 54 | 55 | def embed(self, inputs): 56 | return tf.concat([fn(inputs) for fn in self.embed_fns], -1) 57 | 58 | 59 | def get_embedder_obj(multires): 60 | embed_kwargs = { 61 | 'include_input': True, 62 | 'input_dims': 3, 63 | 'max_freq_log2': multires-1, 64 | 'num_freqs': multires, 65 | 'log_sampling': True, 66 | 'periodic_fns': [tf.math.sin, tf.math.cos], 67 | 'gaussian': False, 68 | } 69 | 70 | return Embedder(**embed_kwargs) 71 | 72 | 73 | def get_embedder(multires, i=0): 74 | 75 | if i == -1: 76 | return tf.identity, 3 77 | 78 | embedder_obj = get_embedder_obj(multires) 79 | embed = lambda x, eo=embedder_obj: eo.embed(x) 80 | return embed, embedder_obj.out_dim 81 | 82 | 83 | # Model architecture 84 | 85 | def 
init_nerf_model(D=8, W=256, input_ch=3, input_ch_views=3, output_ch=4, skips=[4], use_viewdirs=False): 86 | 87 | relu = tf.keras.layers.ReLU() 88 | dense = lambda W, act=relu: tf.keras.layers.Dense(W, activation=act) 89 | 90 | print('MODEL', input_ch, input_ch_views, type(input_ch), type(input_ch_views), use_viewdirs) 91 | input_ch = int(input_ch) 92 | input_ch_views = int(input_ch_views) 93 | 94 | inputs = tf.keras.Input(shape=(input_ch + input_ch_views)) 95 | inputs_pts, inputs_views = tf.split(inputs, [input_ch, input_ch_views], -1) 96 | inputs_pts.set_shape([None, input_ch]) 97 | inputs_views.set_shape([None, input_ch_views]) 98 | 99 | print(inputs.shape, inputs_pts.shape, inputs_views.shape) 100 | outputs = inputs_pts 101 | for i in range(D): 102 | outputs = dense(W)(outputs) 103 | if i in skips: 104 | outputs = tf.concat([inputs_pts, outputs], -1) 105 | 106 | if use_viewdirs: 107 | alpha_out = dense(1, act=None)(outputs) 108 | bottleneck = dense(256, act=None)(outputs) 109 | inputs_viewdirs = tf.concat([bottleneck, inputs_views], -1) # concat viewdirs 110 | outputs = inputs_viewdirs 111 | for i in range(1): 112 | outputs = dense(W//2)(outputs) 113 | outputs = dense(3, act=None)(outputs) 114 | outputs = tf.concat([outputs, alpha_out], -1) 115 | else: 116 | outputs = dense(output_ch, act=None)(outputs) 117 | 118 | model = tf.keras.Model(inputs=inputs, outputs=outputs) 119 | return model 120 | 121 | 122 | # Ray helpers 123 | 124 | def get_rays(H, W, focal, c2w): 125 | """Get ray origins, directions from a pinhole camera.""" 126 | i, j = tf.meshgrid(tf.range(W, dtype=tf.float32), 127 | tf.range(H, dtype=tf.float32), indexing='xy') 128 | dirs = tf.stack([(i + 0.5 - W*.5)/focal, -(j + 0.5 - H*.5)/focal, -tf.ones_like(i)], -1) 129 | rays_d = tf.reduce_sum(dirs[..., np.newaxis, :] * c2w[:3, :3], -1) 130 | rays_o = tf.broadcast_to(c2w[:3, -1], tf.shape(rays_d)) 131 | return rays_o, rays_d 132 | 133 | 134 | def get_rays_np(H, W, focal, c2w): 135 | """Get ray origins, 
directions from a pinhole camera.""" 136 | i, j = np.meshgrid(np.arange(W, dtype=np.float32), 137 | np.arange(H, dtype=np.float32), indexing='xy') 138 | dirs = np.stack([(i + 0.5 - W*.5)/focal, -(j + 0.5 - H*.5)/focal, -np.ones_like(i)], -1) 139 | rays_d = np.sum(dirs[..., np.newaxis, :] * c2w[:3, :3], -1) 140 | rays_o = np.broadcast_to(c2w[:3, -1], np.shape(rays_d)) 141 | return rays_o, rays_d 142 | 143 | 144 | def get_camera_rays_np(H, W, focal): 145 | """Get ray origins, directions from a pinhole camera.""" 146 | i, j = np.meshgrid(np.arange(W, dtype=np.float32), 147 | np.arange(H, dtype=np.float32), indexing='xy') 148 | dirs = np.stack([(i + 0.5 - W*.5)/focal, -(j + 0.5 - H*.5)/focal, -np.ones_like(i)], -1) 149 | rays_d = dirs 150 | return rays_d 151 | 152 | 153 | def get_rays_np_random(H, W, focal, c2w): 154 | """Get ray origins, directions from a pinhole camera.""" 155 | i, j = np.meshgrid(np.arange(W, dtype=np.float32), 156 | np.arange(H, dtype=np.float32), indexing='xy') 157 | 158 | i_rand = np.random.rand(*i.shape) 159 | j_rand = np.random.rand(*j.shape) 160 | 161 | dirs = np.stack([(i + i_rand - W*.5)/focal, -(j + j_rand - H*.5)/focal, -np.ones_like(i)], -1) 162 | rays_d = np.sum(dirs[..., np.newaxis, :] * c2w[:3, :3], -1) 163 | rays_o = np.broadcast_to(c2w[:3, -1], np.shape(rays_d)) 164 | return rays_o, rays_d 165 | 166 | 167 | def ndc_rays(H, W, focal, near, rays_o, rays_d): 168 | """Normalized device coordinate rays. 169 | Space such that the canvas is a cube with sides [-1, 1] in each axis. 170 | Args: 171 | H: int. Height in pixels. 172 | W: int. Width in pixels. 173 | focal: float. Focal length of pinhole camera. 174 | near: float or array of shape[batch_size]. Near depth bound for the scene. 175 | rays_o: array of shape [batch_size, 3]. Camera origin. 176 | rays_d: array of shape [batch_size, 3]. Ray direction. 177 | Returns: 178 | rays_o: array of shape [batch_size, 3]. Camera origin in NDC. 179 | rays_d: array of shape [batch_size, 3]. 
Ray direction in NDC. 180 | """ 181 | # Shift ray origins to near plane 182 | t = -(near + rays_o[..., 2]) / rays_d[..., 2] 183 | rays_o = rays_o + t[..., None] * rays_d 184 | 185 | # Projection 186 | o0 = -1./(W/(2.*focal)) * rays_o[..., 0] / rays_o[..., 2] 187 | o1 = -1./(H/(2.*focal)) * rays_o[..., 1] / rays_o[..., 2] 188 | o2 = 1. + 2. * near / rays_o[..., 2] 189 | 190 | d0 = -1./(W/(2.*focal)) * \ 191 | (rays_d[..., 0]/rays_d[..., 2] - rays_o[..., 0]/rays_o[..., 2]) 192 | d1 = -1./(H/(2.*focal)) * \ 193 | (rays_d[..., 1]/rays_d[..., 2] - rays_o[..., 1]/rays_o[..., 2]) 194 | d2 = -2. * near / rays_o[..., 2] 195 | 196 | rays_o = tf.stack([o0, o1, o2], -1) 197 | rays_d = tf.stack([d0, d1, d2], -1) 198 | 199 | return rays_o, rays_d 200 | 201 | 202 | # Hierarchical sampling helper 203 | 204 | def sample_pdf(bins, weights, N_samples, det=False): 205 | 206 | # Get pdf 207 | weights += 1e-5 # prevent nans 208 | pdf = weights / tf.reduce_sum(weights, -1, keepdims=True) 209 | cdf = tf.cumsum(pdf, -1) 210 | cdf = tf.concat([tf.zeros_like(cdf[..., :1]), cdf], -1) 211 | 212 | # Take uniform samples 213 | if det: 214 | u = tf.linspace(0., 1., N_samples) 215 | u = tf.broadcast_to(u, list(cdf.shape[:-1]) + [N_samples]) 216 | else: 217 | u = tf.random.uniform(list(cdf.shape[:-1]) + [N_samples]) 218 | 219 | # Invert CDF 220 | inds = tf.searchsorted(cdf, u, side='right') 221 | below = tf.maximum(0, inds-1) 222 | above = tf.minimum(cdf.shape[-1]-1, inds) 223 | inds_g = tf.stack([below, above], -1) 224 | cdf_g = tf.gather(cdf, inds_g, axis=-1, batch_dims=len(inds_g.shape)-2) 225 | bins_g = tf.gather(bins, inds_g, axis=-1, batch_dims=len(inds_g.shape)-2) 226 | 227 | denom = (cdf_g[..., 1] - cdf_g[..., 0]) 228 | denom = tf.where(denom < 1e-5, tf.ones_like(denom), denom) 229 | t = (u-cdf_g[..., 0]) / denom 230 | samples = bins_g[..., 0] + t * (bins_g[..., 1] - bins_g[..., 0]) 231 | 232 | return samples 233 | 
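`sample_pdf` above performs hierarchical sampling by inverting the piecewise-linear CDF built from the coarse weights. A minimal single-ray numpy sketch of the same inverse-CDF idea (illustrative only; `sample_pdf_np` is my name, not part of the repository, and it uses deterministic samples where the original also supports random ones):

```python
import numpy as np

def sample_pdf_np(bins, weights, n_samples):
    """Draw n_samples bin positions proportional to `weights` (one ray)."""
    # Normalize weights into a pdf, then build the cdf with a leading zero,
    # so len(cdf) == len(bins) == len(weights) + 1.
    weights = weights + 1e-5  # prevent division by zero
    pdf = weights / weights.sum()
    cdf = np.concatenate([[0.0], np.cumsum(pdf)])

    # Deterministic, evenly spaced samples in [0, 1).
    u = np.linspace(0.0, 1.0, n_samples, endpoint=False)

    # Invert the cdf: locate the segment containing each u and
    # linearly interpolate between that segment's bin edges.
    inds = np.searchsorted(cdf, u, side='right')
    below = np.maximum(inds - 1, 0)
    above = np.minimum(inds, len(cdf) - 1)
    denom = cdf[above] - cdf[below]
    denom = np.where(denom < 1e-5, 1.0, denom)
    t = (u - cdf[below]) / denom
    return bins[below] + t * (bins[above] - bins[below])
```

With uniform weights this degenerates to evenly spaced samples over the bins; peaked weights concentrate samples near the corresponding bin edges, which is what lets the fine network focus on the surface region.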
-------------------------------------------------------------------------------- /parser_util.py: -------------------------------------------------------------------------------- 1 | import configargparse 2 | 3 | 4 | def get_parser(): 5 | parser = configargparse.ArgumentParser() 6 | parser.add_argument('--config', is_config_file=True, 7 | help='config file path') 8 | parser.add_argument("--expname", type=str, help='experiment name') 9 | parser.add_argument("--basedir", type=str, default='./logs/', 10 | help='where to store ckpts and logs') 11 | parser.add_argument("--datadir", type=str, 12 | default='./data/scannet/scene0050_00', help='input data directory') 13 | 14 | # training options 15 | parser.add_argument("--netdepth", type=int, default=8, 16 | help='layers in network') 17 | parser.add_argument("--netwidth", type=int, default=256, 18 | help='channels per layer') 19 | parser.add_argument("--netdepth_fine", type=int, 20 | default=8, help='layers in fine network') 21 | parser.add_argument("--netwidth_fine", type=int, default=256, 22 | help='channels per layer in fine network') 23 | parser.add_argument("--N_rand", type=int, default=32 * 32 * 4, 24 | help='batch size (number of random rays per gradient step)') 25 | parser.add_argument("--N_iters", type=int, default=1000000, 26 | help='number of iterations for which to train the network') 27 | parser.add_argument("--lrate", type=float, 28 | default=5e-4, help='learning rate') 29 | parser.add_argument("--lrate_decay", type=int, default=250, 30 | help='exponential learning rate decay (in 1000s)') 31 | parser.add_argument("--chunk", type=int, default=1024 * 32, 32 | help='number of rays processed in parallel, decrease if running out of memory') 33 | parser.add_argument("--netchunk", type=int, default=1024 * 64, 34 | help='number of pts sent through network in parallel, decrease if running out of memory') 35 | parser.add_argument("--no_reload", action='store_true', 36 | help='do not reload weights from saved ckpt') 37 
| parser.add_argument("--ft_path", type=str, default=None, 38 | help='specific weights npy file to reload for coarse network') 39 | parser.add_argument("--rgb_weight", type=float, 40 | default=1.0, help='weight of the img loss') 41 | parser.add_argument("--depth_weight", type=float, 42 | default=1.0, help='weight of the depth loss') 43 | parser.add_argument("--fs_weight", type=float, 44 | default=1.0, help='weight of the free-space loss') 45 | parser.add_argument("--trunc_weight", type=float, 46 | default=1.0, help='weight of the truncation loss') 47 | parser.add_argument("--share_coarse_fine", action='store_true', 48 | help='use the same network for both coarse and fine samples') 49 | parser.add_argument("--rgb_loss_type", type=str, default='l2', 50 | help='which RGB loss to use - l1/l2 are currently supported') 51 | parser.add_argument("--sdf_loss_type", type=str, default='l2', 52 | help='which SDF loss to use - l1/l2 are currently supported') 53 | parser.add_argument("--frame_features", type=int, default=0, 54 | help='number of channels of the learnable per-frame features') 55 | parser.add_argument("--optimize_poses", action='store_true', 56 | help='optimize a pose refinement for the initial poses') 57 | parser.add_argument("--use_deformation_field", action='store_true', 58 | help='use a deformation field to account for inaccuracies in intrinsic parameters') 59 | 60 | # rendering options 61 | parser.add_argument("--N_samples", type=int, default=64, 62 | help='number of coarse samples per ray') 63 | parser.add_argument("--N_importance", type=int, default=0, 64 | help='number of additional fine samples per ray') 65 | parser.add_argument("--perturb", type=float, default=1., 66 | help='set to 0. for no jitter, 1. 
for jitter') 67 | parser.add_argument("--use_viewdirs", action='store_true', 68 | help='use full 5D input instead of 3D') 69 | parser.add_argument("--i_embed", type=int, default=0, 70 | help='set 0 for default positional encoding, -1 for none') 71 | parser.add_argument("--multires", type=int, default=10, 72 | help='log2 of max freq for positional encoding (3D location)') 73 | parser.add_argument("--multires_views", type=int, default=4, 74 | help='log2 of max freq for positional encoding (2D direction)') 75 | parser.add_argument("--raw_noise_std", type=float, default=0., 76 | help='std dev of noise added to regularize sigma_a output, 1e0 recommended') 77 | parser.add_argument("--mode", type=str, default='density', 78 | help='whether the network predicts density or SDF values') 79 | parser.add_argument("--trunc", type=float, default=0.05, 80 | help='length of the truncation region in meters') 81 | parser.add_argument("--render_factor", type=int, default=0, 82 | help='downsampling factor to speed up rendering, set 4 or 8 for fast preview') 83 | 84 | # dataset options 85 | parser.add_argument("--dataset_type", type=str, default='scannet', 86 | help='options: llff / blender / deepvoxels / synthetic / scannet') 87 | parser.add_argument("--trainskip", type=int, default=1, 88 | help='will load 1/N images from the training set, useful for large datasets like deepvoxels') 89 | parser.add_argument("--factor", type=int, default=1, 90 | help='downsample factor for depth images') 91 | parser.add_argument("--sc_factor", type=float, default=1.0, 92 | help='factor by which to scale the camera translation and the depth maps') 93 | parser.add_argument("--translation", action="append", default=None, required=False, type=float, 94 | help='translation vector for the camera poses') 95 | parser.add_argument("--crop", type=int, default=0, 96 | help='number of pixels by which to crop the image edges (e.g. 
due to undistortion artifacts)') 97 | parser.add_argument("--near", type=float, default=0.0, help='distance to the near plane') 98 | parser.add_argument("--far", type=float, default=1.0, help='distance to the far plane') 99 | 100 | # logging/saving options 101 | parser.add_argument("--i_print", type=int, default=100, 102 | help='frequency of console printout and metric logging') 103 | parser.add_argument("--i_img", type=int, default=500, 104 | help='frequency of tensorboard image logging') 105 | parser.add_argument("--i_weights", type=int, default=10000, 106 | help='frequency of weight ckpt saving') 107 | parser.add_argument("--i_mesh", type=int, default=200000, 108 | help='frequency of mesh extraction') 109 | 110 | return parser -------------------------------------------------------------------------------- /pose_array.py: -------------------------------------------------------------------------------- 1 | import tensorflow as tf 2 | 3 | 4 | class PoseArray(tf.Module): 5 | """ 6 | Per-frame camera pose correction. 7 | 8 | The pose correction contains 6 parameters for each pose (3 for rotation, 3 for translation). 9 | The rotation parameters define Euler angles which can be converted into a rotation matrix. 
10 | """ 11 | 12 | def __init__(self, num_frames): 13 | super(PoseArray, self).__init__() 14 | 15 | self.num_frames = num_frames 16 | self.num_params = 6 17 | 18 | self.data = tf.Variable( 19 | tf.zeros([self.num_frames, self.num_params], dtype=tf.float32) 20 | ) 21 | 22 | def __call__(self, ids): 23 | return tf.gather(self.data, ids) 24 | 25 | def get_weights(self): 26 | return self.data.numpy() 27 | 28 | def set_weights(self, weights): 29 | self.data.assign(weights) 30 | 31 | def get_translations(self, ids): 32 | return tf.gather(self.data[:, 3:6], ids) 33 | 34 | def get_rotations(self, ids): 35 | return tf.gather(self.data[:, 0:3], ids) 36 | 37 | def get_rotation_matrices(self, ids): 38 | rotations = self.get_rotations(ids) # [N_frames, 3] 39 | 40 | cos_alpha = tf.math.cos(rotations[:, 0]) 41 | cos_beta = tf.math.cos(rotations[:, 1]) 42 | cos_gamma = tf.math.cos(rotations[:, 2]) 43 | sin_alpha = tf.math.sin(rotations[:, 0]) 44 | sin_beta = tf.math.sin(rotations[:, 1]) 45 | sin_gamma = tf.math.sin(rotations[:, 2]) 46 | 47 | col1 = tf.stack([cos_alpha * cos_beta, 48 | sin_alpha * cos_beta, 49 | -sin_beta], -1) 50 | col2 = tf.stack([cos_alpha * sin_beta * sin_gamma - sin_alpha * cos_gamma, 51 | sin_alpha * sin_beta * sin_gamma + cos_alpha * cos_gamma, 52 | cos_beta * sin_gamma], -1) 53 | col3 = tf.stack([cos_alpha * sin_beta * cos_gamma + sin_alpha * sin_gamma, 54 | sin_alpha * sin_beta * cos_gamma - cos_alpha * sin_gamma, 55 | cos_beta * cos_gamma], -1) 56 | 57 | return tf.stack([col1, col2, col3], -1) 58 | 59 | def transform_points(self, points, ids): 60 | R = self.get_rotation_matrices(ids) 61 | t = self.get_translations(ids) 62 | 63 | return tf.reduce_sum(points[..., None, :] * R, -1) + t 64 | -------------------------------------------------------------------------------- /scene_bounds.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | 3 | 4 | def get_scene_bounds(scene_name, voxel_size, 
interp=False): 5 | 6 | if scene_name == 'scene0000': 7 | x_min, x_max = -1.2, 1.2 8 | y_min, y_max = -0.1, 1.1 9 | z_min, z_max = -1.2, 1.2 10 | 11 | elif scene_name == 'scene0002': 12 | x_min, x_max = -1.2, 1.2 13 | y_min, y_max = -0.1, 1.5 14 | z_min, z_max = -1.2, 1.2 15 | 16 | elif scene_name == 'scene0005': 17 | x_min, x_max = -1.38, 1.42 18 | y_min, y_max = -0.1, 0.9 19 | z_min, z_max = -1.22, 1.58 20 | 21 | elif scene_name == 'scene0012': 22 | x_min, x_max = -1.2, 1.2 23 | y_min, y_max = -0.1, 1.1 24 | z_min, z_max = -1.2, 1.2 25 | 26 | elif scene_name == 'scene0050': 27 | x_min, x_max = -1.0, 1.0 28 | y_min, y_max = 0.0, 1.6 29 | z_min, z_max = -1.0, 1.0 30 | 31 | elif scene_name == 'scene0054': 32 | x_min, x_max = -1.4, 1.4 33 | y_min, y_max = -0.3, 1.4 34 | z_min, z_max = -1.4, 1.4 35 | 36 | elif scene_name == 'whiteroom': 37 | x_min, x_max = -1.2, 1.0 38 | y_min, y_max = -1.3, 0.9 39 | z_min, z_max = -0.8, 0.8 40 | 41 | elif scene_name == 'kitchen': 42 | x_min, x_max = -1.0, 1.4 43 | y_min, y_max = -1.4, 1.0 44 | z_min, z_max = -0.8, 1.0 45 | 46 | elif scene_name == 'breakfast': 47 | x_min, x_max = -1.0, 1.0 48 | y_min, y_max = -0.9, 0.9 49 | z_min, z_max = -1.0, 1.1 50 | 51 | elif scene_name == 'staircase': 52 | x_min, x_max = -1.2, 1.1 53 | y_min, y_max = -1.1, 1.2 54 | z_min, z_max = -0.8, 1.2 55 | 56 | elif scene_name == 'icl_living_room': 57 | x_min, x_max = -1.1, 1.1 58 | y_min, y_max = -1.1, 1.1 59 | z_min, z_max = -0.6, 0.5 60 | 61 | elif scene_name == 'complete_kitchen': 62 | x_min, x_max = -1.2, 1.2 63 | y_min, y_max = -0.9, 0.9 64 | z_min, z_max = -0.6, 0.6 65 | 66 | elif scene_name == 'green_room': 67 | x_min, x_max = -0.85, 0.65 68 | y_min, y_max = -1.1, 1.1 69 | z_min, z_max = -0.8, 0.6 70 | 71 | elif scene_name == 'grey_white_room': 72 | x_min, x_max = -0.62, 0.62 73 | y_min, y_max = -0.83, 0.83 74 | z_min, z_max = -0.56, 0.6 75 | 76 | elif scene_name == 'morning_apartment': 77 | x_min, x_max = -0.86, 0.86 78 | y_min, y_max = -1.0, 0.94 79 
| z_min, z_max = -0.75, 0.75 80 | 81 | elif scene_name == 'thin_objects': 82 | x_min, x_max = -0.25, 1.45 83 | y_min, y_max = 0.1, 1.7 84 | z_min, z_max = -1.25, 0.0 85 | 86 | else: 87 | x_min, x_max = -1.0, 1.0 88 | y_min, y_max = -1.0, 1.0 89 | z_min, z_max = -1.0, 1.0 90 | 91 | if interp: 92 | x_min = x_min - 0.5 * voxel_size 93 | y_min = y_min - 0.5 * voxel_size 94 | z_min = z_min - 0.5 * voxel_size 95 | 96 | x_max = x_max + 0.5 * voxel_size 97 | y_max = y_max + 0.5 * voxel_size 98 | z_max = z_max + 0.5 * voxel_size 99 | 100 | Nx = round((x_max - x_min) / voxel_size + 0.0005) 101 | Ny = round((y_max - y_min) / voxel_size + 0.0005) 102 | Nz = round((z_max - z_min) / voxel_size + 0.0005) 103 | 104 | tx = np.linspace(x_min, x_max, Nx + 1) 105 | ty = np.linspace(y_min, y_max, Ny + 1) 106 | tz = np.linspace(z_min, z_max, Nz + 1) 107 | 108 | return tx, ty, tz 109 | --------------------------------------------------------------------------------
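`get_scene_bounds` returns per-axis sample positions (`tx`, `ty`, `tz`) that downstream code can expand into a dense grid of xyz query points, e.g. for evaluating the SDF network before marching cubes. Below is a minimal NumPy sketch of that expansion, assuming the default `[-1, 1]^3` fallback bounds and a hypothetical 0.1 m voxel size; `make_query_grid` is illustrative and not part of the repository:

```python
import numpy as np


def make_query_grid(tx, ty, tz):
    """Stack per-axis sample positions (as produced by get_scene_bounds)
    into an [Nx+1, Ny+1, Nz+1, 3] array of xyz query points."""
    gx, gy, gz = np.meshgrid(tx, ty, tz, indexing='ij')
    return np.stack([gx, gy, gz], axis=-1)


# Default fallback branch of get_scene_bounds: [-1, 1] on every axis.
voxel_size = 0.1
x_min, x_max = -1.0, 1.0
# Same rounding trick as in scene_bounds.py to avoid float truncation.
Nx = round((x_max - x_min) / voxel_size + 0.0005)
tx = np.linspace(x_min, x_max, Nx + 1)

grid = make_query_grid(tx, tx, tx)
print(grid.shape)  # (21, 21, 21, 3)
```

With `indexing='ij'` the grid axes follow the `(x, y, z)` order of the input linspaces, so `grid[i, j, k]` is the point `(tx[i], ty[j], tz[k])` and the flattened array can be fed through the network in chunks.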