├── README.md ├── src ├── README.md ├── agent.py ├── dataset.py ├── dataset_fusion.py ├── evaluation.py ├── experiments │ └── exp_infer │ │ └── infer.py ├── hyperparameters.py ├── models.py ├── objects.py ├── proposal.py ├── search.py ├── setup.py ├── train.py ├── train_preprocess.py └── utils │ ├── combination_utils.py │ ├── edge_utils.py │ ├── face_utils.py │ ├── file_utils.py │ ├── solid_utils.py │ ├── space_splitter.py │ ├── vector_utils.py │ ├── vertex_utils.py │ └── vis_utils.py └── teaser-image.png /README.md: -------------------------------------------------------------------------------- 1 | # Inferring CAD Modeling Sequences using Zone Graphs 2 | 3 | ## Release Notes 4 | The official implementation of [Zone Graph](http://pwz.mit.edu/projects/ZG/), published at CVPR 2021. 5 | drawing 6 | 7 | In computer-aided design (CAD), the ability to "reverse engineer" the modeling steps used to create 3D shapes is a long-sought-after goal. This process can be decomposed into two sub-problems: converting an input mesh or point cloud into a boundary representation (or B-rep), and then inferring modeling operations which construct this B-rep. In this paper, we present a new system for solving the second sub-problem. Central to our approach is a new geometric representation: the zone graph. Zones are the set of solid regions formed by extending all B-Rep faces and partitioning space with them; a zone graph has these zones as its nodes, with edges denoting geometric adjacencies between them. Zone graphs allow us to tractably work with industry-standard CAD operations, unlike prior work using CSG with parametric primitives. We focus on CAD programs consisting of sketch + extrude + Boolean operations, which are common in CAD practice. We phrase our problem as search in the space of such extrusions permitted by the zone graph, and we train a graph neural network to score potential extrusions in order to accelerate the search. We show that our approach outperforms an existing CSG inference baseline in terms of geometric reconstruction accuracy and reconstruction time, while also creating more plausible modeling sequences. 8 | 9 | 10 | ## Running Zone Graph 11 | 12 | More code is comming ! 13 | 14 | See [this link](./src) for instructions of running the code. 15 | 16 | ## Citation 17 | Please cite the following paper if you find the code useful for your research/projects. 18 | 19 | ``` 20 | @inproceedings{, 21 | author = "Xianghao Xu and Wenzhe Peng and Chin-Yi Cheng 22 | and Karl D. D. Willis and Daniel Ritchie", 23 | title = "Inferring CAD Modeling Sequences Using Zone Graphs", 24 | booktitle = "CVPR", 25 | year = "2021" 26 | } 27 | ``` 28 | -------------------------------------------------------------------------------- /src/README.md: -------------------------------------------------------------------------------- 1 | # ZoneGraphs 2 | 3 | ### Required Packages 4 | - [Freecad](https://www.freecadweb.org/) 5 | - [Pytorch](https://pytorch.org/) 6 | - [dgl](https://www.dgl.ai/) 7 | - [mayavi](https://docs.enthought.com/mayavi/mayavi/) (for visualization only) 8 | - [Trimesh] 9 | - [networkx] 10 | 11 | 12 | ### FreeCAD path setup 13 | 14 | Please download and compile FreeCAD and put absolute path to your FreeCAD lib in setup.py 15 | 16 | ### Dataset 17 | 18 | We use the [Fusion360GalleryDataset](https://github.com/AutodeskAILab/Fusion360GalleryDataset) for training. Sepecifically, we use the reconstruction subset of the Fusion dataset. 
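Before running the preprocessing and training steps below, make sure the FreeCAD Python bindings are importable: the code imports `setup` before `FreeCAD`, so `setup.py` only needs to put the absolute path of your FreeCAD `lib` directory on `sys.path`. A minimal sketch, where the path is a placeholder you must replace with your own build location:

```
# setup.py -- minimal sketch; replace the placeholder with your FreeCAD lib path
import sys

FREECAD_LIB = "/absolute/path/to/FreeCAD/lib"  # placeholder, machine-specific
if FREECAD_LIB not in sys.path:
    sys.path.append(FREECAD_LIB)

import FreeCAD  # fails here if the path above is wrong
import Part
```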
The download links for the data we use are: 19 | - [Main reconstruction subset](https://fusion-360-gallery-dataset.s3-us-west-2.amazonaws.com/reconstruction/r1.0.0/r1.0.0.zip) 20 | - [GT extrusion set (extrude tools )](https://fusion-360-gallery-dataset.s3-us-west-2.amazonaws.com/reconstruction/r1.0.0/r1.0.0_extrude_tools.zip) 21 | 22 | ### Preprocess Fusion360 Raw data to reconstruction sequence data (each step contains current shape, target shape and extrusion shape) 23 | 24 | Downloaded the above fusion and extrude_tool 25 | ``` 26 | python dataset_fusion.py 27 | 28 | --fusion_path "path to your fusion reconstruction data folder" 29 | 30 | --extrusion_path "path to your fusion GT extrusion tool data folder" 31 | 32 | --output_path "path to your output processed fusion data folder" 33 | ``` 34 | ### Generating training data 35 | ``` 36 | python train_preprocess.py 37 | 38 | --data_path "path to your processed fusion data folder" 39 | 40 | --output_path "path to your output processed data for training" 41 | ``` 42 | 43 | ### Testing/Infering reconstruction sequences 44 | ``` 45 | cd experiments/exp_infer 46 | 47 | python infer.py 48 | 49 | --option "the option for ranking the proposed extrusions: random/heur/agent" 50 | 51 | --data_path "path to your processed fusion data folder" 52 | 53 | --max_time "time limit for the search to terminate" 54 | 55 | --max_step "maximum sequence length" 56 | ``` 57 | 58 | 59 | 60 | 61 | -------------------------------------------------------------------------------- /src/agent.py: -------------------------------------------------------------------------------- 1 | import sys 2 | sys.path.append('..') 3 | import os 4 | import numpy as np 5 | import torch 6 | import torch.nn as nn 7 | import torch.nn.functional as F 8 | from torch.optim import Adam, SGD 9 | from objects import * 10 | from models import * 11 | import hyperparameters as hp 12 | 13 | def to_numpy(item): 14 | if item.is_cuda: 15 | return item.cpu().detach().numpy() 16 | else: 17 | return item.detach().numpy() 18 | 19 | def to_tensor(item): 20 | if torch.cuda.is_available(): 21 | return torch.tensor(item, device=torch.device('cuda:0'), dtype=torch.float) 22 | else: 23 | return torch.tensor(item, dtype=torch.float) 24 | 25 | class Agent(): 26 | def __init__(self, folder=None): 27 | self.criterion_agent = nn.NLLLoss() 28 | self.folder = folder 29 | 30 | self.zone_encoder = ZoneEncoder(zone_sample_num, hp.gnn_node_feat_dim) 31 | self.zone_encoder.cuda() 32 | 33 | self.decision_maker = GraphNetSoftMax(hp.gnn_node_feat_dim, 2) 34 | self.decision_maker.cuda() 35 | 36 | self.optim_extrusion = Adam(list(self.zone_encoder.parameters()) + list(self.decision_maker.parameters()), lr=hp.learning_rate_optim_extrusion) 37 | 38 | def encode_zone_graph(self, zone_graph): 39 | g = dgl.DGLGraph() 40 | g.add_nodes(len(zone_graph.zone_graph.nodes)) 41 | 42 | node_shape_positions = nx.get_node_attributes(zone_graph.zone_graph, 'shape_positions') 43 | node_shape_normals = nx.get_node_attributes(zone_graph.zone_graph, 'shape_normals') 44 | node_cur_state_features = nx.get_node_attributes(zone_graph.zone_graph, 'in_current') 45 | node_tgt_state_features = nx.get_node_attributes(zone_graph.zone_graph, 'in_target') 46 | node_extru_state_features = nx.get_node_attributes(zone_graph.zone_graph, 'in_extrusion') 47 | node_bool_features = nx.get_node_attributes(zone_graph.zone_graph, 'bool') 48 | 49 | features = [] 50 | target_labels = [] 51 | current_labels = [] 52 | extrusion_labels = [] 53 | bool_labels = [] 54 | 
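        # Node features for the GNN: every zone contributes `zone_sample_num` surface
        # samples, and each sample point carries its xyz position (3) and normal (3)
        # plus four per-zone scalar labels (in_current, in_target, in_extrusion, bool)
        # broadcast over all points -- a 10-channel point cloud per zone, matching the
        # Conv1d(10, 64, 1) input layer of ZoneEncoder.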
shape_positions = [] 55 | shape_normals = [] 56 | 57 | point_num = node_shape_positions[0].shape[0] 58 | 59 | for i in range(len(zone_graph.zone_graph.nodes)): 60 | 61 | cur_label = to_tensor(node_cur_state_features[i]) 62 | current_labels.append(cur_label) 63 | 64 | target_label = to_tensor(node_tgt_state_features[i]) 65 | target_labels.append(target_label) 66 | 67 | extru_label = to_tensor(node_extru_state_features[i]) 68 | extrusion_labels.append(extru_label) 69 | 70 | bool_label = to_tensor(node_bool_features[i]) 71 | bool_labels.append(bool_label) 72 | 73 | shape_position = to_tensor(node_shape_positions[i]) 74 | shape_positions.append(shape_position) 75 | 76 | shape_normal = to_tensor(node_shape_normals[i]) 77 | shape_normals.append(shape_normal) 78 | 79 | 80 | current_labels = torch.stack(current_labels) 81 | current_labels = torch.unsqueeze(current_labels, dim=1) 82 | current_labels = torch.repeat_interleave(current_labels, point_num, dim=1) 83 | current_labels = torch.unsqueeze(current_labels, dim=2) 84 | 85 | target_labels = torch.stack(target_labels) 86 | target_labels = torch.unsqueeze(target_labels, dim=1) 87 | target_labels = torch.repeat_interleave(target_labels, point_num, dim=1) 88 | target_labels = torch.unsqueeze(target_labels, dim=2) 89 | 90 | extrusion_labels = torch.stack(extrusion_labels) 91 | extrusion_labels = torch.unsqueeze(extrusion_labels, dim=1) 92 | extrusion_labels = torch.repeat_interleave(extrusion_labels, point_num, dim=1) 93 | extrusion_labels = torch.unsqueeze(extrusion_labels, dim=2) 94 | 95 | bool_labels = torch.stack(bool_labels) 96 | bool_labels = torch.unsqueeze(bool_labels, dim=1) 97 | bool_labels = torch.repeat_interleave(bool_labels, point_num, dim=1) 98 | bool_labels = torch.unsqueeze(bool_labels, dim=2) 99 | 100 | shape_positions = torch.stack(shape_positions) 101 | shape_normals = torch.stack(shape_normals) 102 | 103 | features = torch.cat((shape_positions, shape_normals, current_labels, target_labels, extrusion_labels, bool_labels), dim=2) 104 | features = torch.transpose(features, 2, 1) 105 | encoded_features = self.zone_encoder(features) 106 | g.ndata['h'] = encoded_features 107 | 108 | src = [] 109 | dst = [] 110 | for e in zone_graph.zone_graph.edges: 111 | src.append(e[0]) 112 | dst.append(e[1]) 113 | src = tuple(src) 114 | dst = tuple(dst) 115 | 116 | g.add_edges(src, dst) 117 | g.add_edges(dst, src) 118 | 119 | return g 120 | 121 | def make_decision(self, g_encs): 122 | bactched_g = dgl.batch(g_encs) 123 | #bactched_g = torch.stack(g_encs) 124 | prob = self.decision_maker(bactched_g) 125 | prob = torch.exp(prob) 126 | return prob 127 | 128 | def update_by_extrusion(self, labels, gs): 129 | print('update agent weights') 130 | self.optim_extrusion.zero_grad() 131 | 132 | g_encs = [] 133 | for i in range(len(gs)): 134 | g_enc = self.encode_zone_graph(gs[i]) 135 | g_encs.append(g_enc) 136 | 137 | labels = torch.stack(labels) 138 | labels = labels.long() 139 | 140 | prob = self.make_decision(g_encs) 141 | gathered_prob = [] 142 | for i in range(len(prob)): 143 | gathered_prob.append(prob[i][labels[i]]) 144 | prob = torch.stack(gathered_prob) 145 | 146 | gamma = 0.5 147 | loss = torch.mean(-torch.pow((1 - prob), gamma) * torch.log(prob)) 148 | loss.backward(retain_graph=True) 149 | self.optim_extrusion.step() 150 | 151 | return loss.item() 152 | 153 | def save_weights(self): 154 | torch.save(self.zone_encoder.state_dict(), os.path.join(self.folder,"zone_encoder.pkl")) 155 | torch.save(self.decision_maker.state_dict(), 
os.path.join(self.folder,"decision_maker.pkl")) 156 | 157 | def save_best_weights(self): 158 | torch.save(self.zone_encoder.state_dict(), os.path.join(self.folder,"best_zone_encoder.pkl")) 159 | torch.save(self.decision_maker.state_dict(), os.path.join(self.folder,"best_decision_maker.pkl")) 160 | 161 | def load_weights(self): 162 | state_dict = torch.load(os.path.join(self.folder,"zone_encoder.pkl")) 163 | self.zone_encoder.load_state_dict(state_dict) 164 | state_dict = torch.load(os.path.join(self.folder,"decision_maker.pkl")) 165 | self.decision_maker.load_state_dict(state_dict) 166 | 167 | def load_best_weights(self): 168 | state_dict = torch.load(os.path.join(self.folder,"zone_encoder.pkl")) 169 | self.zone_encoder.load_state_dict(state_dict) 170 | state_dict = torch.load(os.path.join(self.folder,"decision_maker.pkl")) 171 | self.decision_maker.load_state_dict(state_dict) 172 | 173 | 174 | 175 | -------------------------------------------------------------------------------- /src/dataset.py: -------------------------------------------------------------------------------- 1 | import sys 2 | #sys.path.append('..') 3 | 4 | import os 5 | from pathlib import Path 6 | from objects import * 7 | from proposal import * 8 | import json 9 | import glob 10 | import random 11 | import math 12 | import joblib 13 | 14 | class Data(): 15 | def __init__(self): 16 | self.current_shape = Part.Shape() 17 | self.target_shape = Part.Shape() 18 | self.extrusion_shape = Part.Shape() 19 | self.sketch_shape = Part.Shape() 20 | self.bool_type = 0 21 | 22 | class DataManager: 23 | def __init__(self): 24 | self.dummy = None 25 | 26 | def save_raw_step(self, data, data_path): 27 | pass 28 | 29 | def load_raw_step(self, data_path): 30 | 31 | data = Data() 32 | 33 | data.target_shape.read(os.path.join(data_path, "target_shape.stp")) 34 | 35 | try: 36 | data.current_shape.read(os.path.join(data_path, "current_shape.stp")) 37 | #data.current_shape.scale(100, scale_center) 38 | except: 39 | data.current_shape = None 40 | 41 | try: 42 | data.extrusion_shape.read(os.path.join(data_path, "extrusion.stp")) 43 | #data.extrusion_shape.scale(100, scale_center) 44 | except: 45 | data.extrusion_shape = None 46 | 47 | #data.target_shape.scale(100, scale_center) 48 | 49 | if os.path.isfile(os.path.join(data_path, 'bool_type.txt')): 50 | ret = read_file_to_string(os.path.join(data_path, 'bool_type.txt')) 51 | if ret == 'addition': 52 | data.bool_type = 0 53 | else: 54 | data.bool_type = 1 55 | else: 56 | data.bool_type = None 57 | 58 | return data 59 | 60 | def check_extrusion_outside(self, extrusion, zone_graph): 61 | common = extrusion.cad_shape.common(zone_graph.bbox) 62 | outside = extrusion.cad_shape.cut(zone_graph.bbox) 63 | #print('outside vol', outside.Volume) 64 | if outside.Volume > 0.1 * extrusion.cad_shape.Volume: 65 | return False 66 | return True 67 | 68 | def check_extrusion_volume(self, extrusion, zone_graph): 69 | gt_vol = 0 70 | for solid in extrusion.cad_shape.Solids: 71 | gt_vol += solid.Volume 72 | 73 | extrusion_vol = 0 74 | for i in extrusion.zone_indices: 75 | if solid_contain_zone(extrusion.cad_shape, zone_graph.zones[i]): 76 | extrusion_vol += zone_graph.zones[i].cad_shape.Volume 77 | 78 | #print('gt_vol', gt_vol) 79 | #print('extrusion_vol', extrusion_vol) 80 | if abs(gt_vol - extrusion_vol) > 0.01 * gt_vol: 81 | print('extrusion vol does not match') 82 | #display_object(extrusion, zone_graph.bbox, show=True) 83 | #display_zone_graph(zone_graph) 84 | inside_zones = [] 85 | for i in 
extrusion.zone_indices: 86 | inside_zones.append(zone_graph.zones[i]) 87 | 88 | #display_objects(inside_zones, zone_graph.bbox, show=True) 89 | #print('extrusion vol does not match') 90 | #exit() 91 | return False 92 | return True 93 | 94 | def load_raw_sequence(self, seq_path, min_length=0, max_length=999999): 95 | 96 | print('seq_path', seq_path) 97 | 98 | path = Path(seq_path) 99 | step_names = list(path.glob('*')) 100 | sequence_length = len(step_names) 101 | start_index = 0 102 | max_step = 0 103 | end_index = sequence_length 104 | 105 | print('step names', step_names) 106 | 107 | for filename in step_names: 108 | if 'DS' in str(filename): 109 | os.system("rm -f " + os.path.join(fusion_data_folder, seq_id, str(filename))) 110 | 111 | for step_name in step_names: 112 | if 'Unsupported' in str(step_name): 113 | return [], 'unsupported_operation' 114 | step = int(str(step_name).split('/')[-1]) 115 | if step >= max_step: 116 | max_step = step 117 | 118 | if len(step_names) != max_step + 1: 119 | print('length error') 120 | return [], 'wrong_load_length' 121 | 122 | if sequence_length < min_length or sequence_length > max_length: 123 | print('length out of bound') 124 | return [], 'wrong_load_length' 125 | 126 | #-------------------------------------------------------------------------------------------------------------------------------- 127 | 128 | zone_graph = ZoneGraph() 129 | 130 | start_data_path = os.path.join(str(path), str(start_index)) 131 | start_data = self.load_raw_step(start_data_path) 132 | 133 | zone_graph.current_shape = start_data.current_shape 134 | zone_graph.target_shape = start_data.target_shape 135 | 136 | 137 | ret, error_type = zone_graph.build() 138 | if not ret: 139 | print('zone graph build error') 140 | return [], error_type 141 | 142 | sequence = [] 143 | step_indices = list(np.arange(start_index, end_index, 1)) 144 | #print('step_indices', step_indices) 145 | for step_index in step_indices: 146 | data_path = os.path.join(str(path), str(step_index)) 147 | data = self.load_raw_step(data_path) 148 | 149 | if data.extrusion_shape: 150 | extrusion = Extrusion(data.extrusion_shape) 151 | extrusion.bool_type = data.bool_type 152 | for i, zone in enumerate(zone_graph.zones): 153 | if solid_contain_zone(extrusion.cad_shape, zone): 154 | extrusion.zone_indices.append(i) 155 | 156 | ret = self.check_extrusion_outside(extrusion, zone_graph) 157 | if not ret: 158 | return [], 'extrusion_outside' 159 | 160 | ret = self.check_extrusion_volume(extrusion, zone_graph) 161 | if not ret: 162 | return [], 'extrusion_mismatch' 163 | sequence.append((copy.deepcopy(zone_graph), copy.deepcopy(extrusion))) 164 | next_zone_graph = zone_graph.update_to_next_zone_graph(extrusion) 165 | zone_graph = next_zone_graph 166 | else: 167 | sequence.append((copy.deepcopy(zone_graph), None)) 168 | 169 | return sequence, None 170 | 171 | def load_processed_step(self, path): 172 | graph = joblib.load(str(path) + '_g.joblib') 173 | extrusion = joblib.load(str(path) + '_e.joblib') 174 | return (graph, extrusion) 175 | 176 | def load_processed_sequence(self, seq_path, min_length=0, max_length=99999): 177 | path = Path(seq_path) 178 | sequence = [] 179 | sequence_length = len(list(path.glob('*_g*'))) 180 | if sequence_length < min_length or sequence_length > max_length: 181 | print('length out of bound') 182 | return [] 183 | 184 | step_indices = list(np.arange(0, sequence_length, 1)) 185 | for step_index in step_indices: 186 | step_path = os.path.join(seq_path, str(step_index)) 187 | step = 
self.load_processed_step(step_path) 188 | sequence.append(step) 189 | 190 | return sequence 191 | 192 | def render_sequence(self, sequence, path): 193 | if not os.path.exists(path): 194 | os.makedirs(path) 195 | for i, step in enumerate(sequence): 196 | step_path = os.path.join(path, 'step_' + str(i)) 197 | zone_graph = step[0] 198 | current_zone_shapes = [zone_graph.zones[i].cad_shape for i in zone_graph.get_current_zone_indices()] 199 | current_shape = merge_solids(current_zone_shapes) 200 | display_object(current_shape, bound_obj=zone_graph.bbox, color=(0.8, 0.8, 0.8), file=step_path + '_shape.png') 201 | 202 | extrusion = step[1] 203 | extrusion_zone_shapes = [zone_graph.zones[i].cad_shape for i in extrusion.zone_indices] 204 | extrusion_shape = merge_solids(extrusion_zone_shapes) 205 | if extrusion.bool_type == 0: 206 | display_object(extrusion_shape, bound_obj=zone_graph.bbox, color=(0.0, 1.0, 0), file=step_path + '_extrusion.png') 207 | else: 208 | display_object(extrusion_shape, bound_obj=zone_graph.bbox, color=(1.0, 0.0, 0), file=step_path + '_extrusion.png') 209 | 210 | def save_sequence(self, sequence, path): 211 | if not os.path.exists(path): 212 | os.makedirs(path) 213 | for i, step in enumerate(sequence): 214 | step_path = os.path.join(path, 'step_' + str(i)) 215 | zone_graph = step[0] 216 | current_zone_shapes = [zone_graph.zones[i].cad_shape for i in zone_graph.get_current_zone_indices()] 217 | current_shape = merge_solids(current_zone_shapes) 218 | # display_object(current_shape, bound_obj=zone_graph.bbox, color=(0.8, 0.8, 0.8), file=step_path + '_shape.png') 219 | 220 | extrusion = step[1] 221 | extrusion_zone_shapes = [zone_graph.zones[i].cad_shape for i in extrusion.zone_indices] 222 | extrusion_shape = merge_solids(extrusion_zone_shapes) 223 | 224 | if not current_shape is None: 225 | current_shape.exportStep(step_path + '_shape.stp') 226 | if not extrusion_shape is None: 227 | extrusion_shape.exportStep(step_path + '_extrusion.stp') 228 | write_val_to_file(extrusion.bool_type, step_path + '_bool_type.txt') 229 | 230 | # if extrusion.bool_type == 0: 231 | # display_object(extrusion_shape, bound_obj=zone_graph.bbox, color=(0.0, 1.0, 0), file=step_path + '_extrusion.png') 232 | # else: 233 | # display_object(extrusion_shape, bound_obj=zone_graph.bbox, color=(1.0, 0.0, 0), file=step_path + '_extrusion.png') 234 | 235 | def simulate_sequence(self, seq): 236 | zone_graph = copy.deepcopy(seq[0][0]) 237 | for i, step in enumerate(seq): 238 | gt_extrusion = step[1] 239 | next_extrusion = None 240 | proposal_extrusions = get_proposals(zone_graph) 241 | for extrusion in proposal_extrusions: 242 | if gt_extrusion.bool_type == extrusion.bool_type and gt_extrusion.hash() == extrusion.hash(): 243 | print('extrusion found') 244 | next_extrusion = extrusion 245 | break 246 | 247 | if next_extrusion is None: 248 | print('extrusion not found') 249 | return False 250 | 251 | #display_extrusion(next_extrusion, zone_graph, show=True) 252 | next_zone_graph = zone_graph.update_to_next_zone_graph(next_extrusion) 253 | zone_graph = next_zone_graph 254 | 255 | #display_object(zone_graph.target_shape, show=True) 256 | #display_zone_graph(zone_graph, show=True) 257 | is_done = zone_graph.is_done() 258 | 259 | return is_done 260 | 261 | -------------------------------------------------------------------------------- /src/dataset_fusion.py: -------------------------------------------------------------------------------- 1 | 2 | 3 | import sys 4 | sys.path.append('..') 5 | 6 | import os 7 | from 
pathlib import Path 8 | import json 9 | import glob 10 | import random 11 | import math 12 | import joblib 13 | 14 | from objects import * 15 | from dataset import * 16 | 17 | class FusionDataManager: 18 | 19 | def __iter__(self): 20 | return self 21 | 22 | def __init__(self, data_path, shuffle = True, start_index = 0, augment = False, reposition = False): 23 | self.num = start_index 24 | self.augment = augment 25 | self.reposition = reposition 26 | 27 | print('data_path', data_path) 28 | 29 | file_list = glob.glob(data_path + "/*.step") 30 | print(f'[DATA LOADER] Found {len(file_list)} models') 31 | file_list.sort() 32 | 33 | found_sequences = [] 34 | current_sequence = [] 35 | 36 | for i, f_dir in enumerate(file_list): 37 | if i < 10e10: 38 | f_name = f_dir.split('/')[-1] 39 | segments = f_name.split("_") 40 | 41 | # found target model 42 | if len(segments) == 3: 43 | # current_sequence.append(f_dir) 44 | if len(current_sequence) >= 1 : 45 | found_sequences.append(current_sequence) 46 | 47 | current_sequence = [] 48 | 49 | # found sequence model of a target model 50 | else: 51 | current_sequence.append(f_dir) 52 | if len(current_sequence) >= 1: 53 | found_sequences.append(current_sequence) 54 | 55 | 56 | 57 | print('found_sequences', found_sequences) 58 | self.pairs = [] 59 | # find modeling pairs 60 | for seq in (found_sequences): 61 | target = seq[-1] 62 | 63 | # (current, next, possible target, final target, index, temp_target_count) 64 | 65 | if self.augment: 66 | # augment data 67 | for j,temp_target in enumerate(seq[1:]): 68 | self.pairs.append((None, seq[0], temp_target, target, f'0_{str(j)}', len(seq[1:]) )) 69 | 70 | for i in range(len(seq)-1): 71 | for j,temp_target in enumerate(seq[i+1:]): 72 | self.pairs.append((seq[i], seq[i+1], temp_target, target, f'{str(i+1)}_{str(j)}', len(seq[i+1:]))) 73 | else: 74 | # un-augmented data 75 | self.pairs.append((None, seq[0], target, target, "0", 1 )) 76 | for i in range(len(seq)-1): 77 | self.pairs.append((seq[i], seq[i+1], target, target, f'{str(i+1)}', 1)) 78 | 79 | 80 | if shuffle: 81 | random.shuffle(self.pairs) 82 | 83 | print(f'[DATA LOADER] Found {len(self.pairs)} extrusion pairs') 84 | self.pair_count = len(self.pairs) 85 | 86 | print('self.pairs', self.pair_count) 87 | 88 | 89 | def get_data_by_pair(self, pair): 90 | current_dir = pair[0] 91 | next_dir = pair[1] 92 | temp_target_dir = pair[2] 93 | final_target_dir= pair[3] 94 | index = pair[4] 95 | 96 | count = pair[5] 97 | 98 | segs = next_dir.split('/')[-1].split('_') 99 | file_name = '_'.join([segs[0],segs[1],segs[2]]) 100 | 101 | data = Data() 102 | data.file_name = file_name 103 | data.index = index 104 | 105 | data.count = count 106 | data.target_shape.read(temp_target_dir) 107 | 108 | # calculate normalize scale factor 109 | final_cad_shape = Part.Shape() 110 | final_cad_shape.read(final_target_dir) 111 | bbox = final_cad_shape.BoundBox 112 | bbox_data = str(bbox).split('BoundBox (')[1].split(')')[0].split(',') 113 | bbox_data = [float(item) for item in bbox_data] 114 | w = bbox_data[3]-bbox_data[0] 115 | d = bbox_data[4]-bbox_data[1] 116 | h = bbox_data[5]-bbox_data[2] 117 | x = (bbox_data[3]+bbox_data[0])/2 118 | y = (bbox_data[4]+bbox_data[1])/2 119 | z = (bbox_data[5]+bbox_data[2])/2 120 | 121 | diagnal_d = math.sqrt(w*w + d*d + h*h) 122 | scale_factor = 1 / diagnal_d 123 | move_vector = Base.Vector(-x, -y, -z) 124 | 125 | if current_dir is None: 126 | data.current_shape = None 127 | 128 | next_cad_shape = Part.Shape() 129 | next_cad_shape.read(next_dir) 130 | 
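            # First step of a modeling sequence: there is no current shape yet, so the
            # extrusion shape is left undefined and the first operation is treated as
            # an addition (bool_type 0).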
data.extrusion_shape = None 131 | data.bool_type = 0 132 | 133 | else: 134 | # NOTE: currently extrusion is the subtraction of target and current 135 | data.current_shape.read(current_dir) 136 | 137 | next_cad_shape = Part.Shape() 138 | next_cad_shape.read(next_dir) 139 | 140 | data.extrusion_shape = None 141 | if su.true_Volume(next_cad_shape) > su.true_Volume(data.current_shape): 142 | data.bool_type = 0 143 | else: 144 | data.bool_type = 1 145 | 146 | data.target_shape.scale(scale_factor, Base.Vector(0, 0, 0)) 147 | if data.current_shape: 148 | data.current_shape.scale(scale_factor, Base.Vector(0, 0, 0)) 149 | 150 | # normalize model location 151 | if self.reposition: 152 | data.current_shape.translate(move_vector) 153 | 154 | return data 155 | 156 | def __next__(self): 157 | if self.num < len(self.pairs): 158 | num = self.num 159 | picked = self.pairs[num] 160 | self.num += 1 161 | 162 | res = self.get_data_by_pair(picked) 163 | if res: 164 | return num, res 165 | else: 166 | # NOTE: when one of the model pairs is not valid, return None 167 | 168 | return num, None 169 | else: 170 | raise StopIteration 171 | 172 | def preprocess_fusion_data(fusion_path, extrusion_path, processed_fusion_path, reposition): 173 | 174 | # process current shapes 175 | dm = FusionDataManager(fusion_path, shuffle = False, start_index=0, augment=False, reposition=reposition) 176 | if not os.path.exists(processed_fusion_path): 177 | os.makedirs(processed_fusion_path) 178 | 179 | for i, data in dm: 180 | if data: 181 | print(f"------------Preparing steps {i}/{dm.pair_count}------------") 182 | 183 | file_path = os.path.join(processed_fusion_path, data.file_name) 184 | try: 185 | os.mkdir(file_path) 186 | except OSError as error: 187 | pass 188 | 189 | step_path = os.path.join(file_path, str(data.index)) 190 | 191 | try: 192 | os.mkdir(step_path) 193 | except OSError as error: 194 | pass 195 | 196 | if data.current_shape and not data.current_shape.isNull(): 197 | data.current_shape.exportStep(f"{str(step_path)}/current_shape.stp") 198 | 199 | data.target_shape.exportStep(f"{str(step_path)}/target_shape.stp") 200 | 201 | if data.bool_type == 0: 202 | f = open(f"{str(step_path)}/bool_type.txt","w+") 203 | f.write('addition') 204 | f.close() 205 | else: 206 | f = open(f"{str(step_path)}/bool_type.txt","w+") 207 | f.write('subtraction') 208 | f.close() 209 | 210 | file_list = glob.glob(os.path.join(processed_fusion_path, "*")) 211 | 212 | # process extrusions 213 | for i,file in enumerate(file_list): 214 | model_id = file.split(processed_fusion_path)[1] 215 | 216 | model_dir = f"{fusion_path}{model_id}.step" 217 | 218 | final_cad_shape = Part.Shape() 219 | final_cad_shape.read(model_dir) 220 | bbox = final_cad_shape.BoundBox 221 | bbox_data = str(bbox).split('BoundBox (')[1].split(')')[0].split(',') 222 | bbox_data = [float(item) for item in bbox_data] 223 | w = bbox_data[3]-bbox_data[0] 224 | d = bbox_data[4]-bbox_data[1] 225 | h = bbox_data[5]-bbox_data[2] 226 | x = (bbox_data[3]+bbox_data[0])/2 227 | y = (bbox_data[4]+bbox_data[1])/2 228 | z = (bbox_data[5]+bbox_data[2])/2 229 | 230 | diagnal_d = math.sqrt(w*w + d*d + h*h) 231 | scale_factor = 1 / diagnal_d 232 | move_vector = Base.Vector(-x, -y, -z) 233 | 234 | exts = glob.glob(extrusion_path + f"{model_id}*.step") 235 | exts.sort() 236 | 237 | print('Processing Extrusion: ', i, model_dir) 238 | 239 | for j,ext in enumerate(exts): 240 | ext_shape = Part.Shape() 241 | ext_shape.read(ext) 242 | 243 | # normalize model location 244 | if (reposition): 245 | 
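                # Normalize the ground-truth extrusion tool with the same move_vector and
                # scale_factor computed from its final target model, so it stays aligned
                # with the processed current/target shapes written above.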
ext_shape.translate(move_vector) 246 | 247 | # normalize model scale 248 | ext_shape.scale(scale_factor, Base.Vector(0, 0, 0)) 249 | 250 | out_dir = f"{file}/{j}/extrusion.stp" 251 | try: 252 | ext_shape.exportStep(out_dir) 253 | except: 254 | pass 255 | 256 | if __name__ == "__main__": 257 | parser = argparse.ArgumentParser() 258 | parser.add_argument('--fusion_path', default='../data/fusion/reconstruction', type=str) 259 | parser.add_argument('--extrusion_path', default='../data/extrude', type=str) 260 | parser.add_argument('--output_path', default='../data/fusion_processed', type=str) 261 | parser.add_argument('--reposition', default=False, type=bool) 262 | 263 | args = parser.parse_args() 264 | 265 | preprocess_fusion_data(args.fusion_path, args.extrusion_path, args.output_path, args.reposition) 266 | -------------------------------------------------------------------------------- /src/evaluation.py: -------------------------------------------------------------------------------- 1 | import sys 2 | sys.path.append('..') 3 | 4 | from objects import * 5 | import random 6 | import torch 7 | 8 | def sort_extrusions_by_random(extrusions): 9 | random.shuffle(extrusions) 10 | return extrusions 11 | 12 | def sort_extrusions_by_heur(extrusions, zone_graph): 13 | for extrusion in extrusions: 14 | extrusion.score = get_extrusion_heur_score(extrusion, zone_graph) 15 | 16 | sorted_extrusions = sorted(extrusions, key=lambda v: (v.score[0], v.score[1]), reverse=True) 17 | return sorted_extrusions 18 | 19 | def sort_extrusions_by_agent(extrusions, zone_graph, agent): 20 | with torch.no_grad(): 21 | start_time = time.time() 22 | if len(extrusions) == 1: 23 | return extrusions 24 | 25 | g_encs = [] 26 | for extrusion in extrusions: 27 | zone_graph.encode_with_extrusion(extrusion) 28 | g_enc = agent.encode_zone_graph(zone_graph) 29 | g_encs.append(g_enc) 30 | 31 | scores = agent.make_decision(g_encs) 32 | 33 | for i, extrusion in enumerate(extrusions): 34 | extrusion.score = scores[i][1].item() 35 | 36 | sorted_extrusions = sorted(extrusions, key=lambda v: v.score, reverse=True) 37 | return sorted_extrusions 38 | -------------------------------------------------------------------------------- /src/experiments/exp_infer/infer.py: -------------------------------------------------------------------------------- 1 | import sys 2 | from os.path import dirname, realpath 3 | sys.path.append('../..') 4 | 5 | import numpy as np 6 | import os 7 | import argparse 8 | import numpy as np 9 | 10 | from dataset import * 11 | from search import * 12 | from utils.file_utils import * 13 | import time 14 | from agent import Agent 15 | import shutil 16 | import multiprocessing 17 | 18 | # from utils.vis_utils import * 19 | 20 | def infer(seq_id, sort_option, data_path, max_time, max_step): 21 | 22 | if sort_option == 'random': 23 | folder = str(sort_option) 24 | agent = None 25 | if sort_option == 'heur': 26 | folder = str(sort_option) 27 | agent = None 28 | if sort_option == 'agent': 29 | folder = str(sort_option) 30 | agent = Agent('../../train_output') 31 | agent.load_weights() 32 | 33 | if not os.path.exists(folder): 34 | os.makedirs(folder) 35 | 36 | print('infer----------------------------------------', seq_id) 37 | 38 | data_mgr = DataManager() 39 | start_time = time.time() 40 | sequence_length = len(list(Path(os.path.join(data_path, seq_id)).glob('*'))) 41 | gt_seq, error_type = data_mgr.load_raw_sequence(os.path.join(data_path, seq_id), 0, sequence_length) 42 | 43 | if len(gt_seq) == 0: 44 | return 45 | 46 | 
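    # Search starts from the zone graph of the first ground-truth step and explores
    # extrusion sequences depth-first: expand_width caps how many ranked proposals are
    # expanded at each node, and the search stops once the target is reconstructed or
    # max_time / max_step is exceeded.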
start_zone_graph = gt_seq[0][0] 47 | expand_width = 15 48 | 49 | probablistical = False 50 | use_concurrent = False 51 | 52 | best_sol=SearchSolution() 53 | dfs_best_recon(start_zone_graph, max_step, max_time, expand_width, sort_option, best_sol, start_time, os.path.join(folder, seq_id), agent) 54 | 55 | def infer_all(sort_option, data_path, max_time, max_step): 56 | 57 | all_ids = os.listdir(data_path) 58 | 59 | for seq_id in all_ids: 60 | 61 | worker_process = multiprocessing.Process(target=infer, name="infer", args=(seq_id, sort_option, data_path, max_time, max_step, )) 62 | worker_process.start() 63 | worker_process.join(max_time+100) 64 | 65 | if worker_process.is_alive(): 66 | print ("process_single_data is running... let's kill it...") 67 | worker_process.terminate() 68 | worker_process.join() 69 | 70 | 71 | processed_data_folder = "../processed_files/processed_data/" 72 | if __name__ == "__main__": 73 | parser = argparse.ArgumentParser() 74 | parser.add_argument('--option', default='heur', type=str, help='infer option') 75 | parser.add_argument('--data_path', default='../../../data/fusion_processed', type=str) 76 | parser.add_argument('--max_time', default=300, type=float) 77 | parser.add_argument('--max_step', default=15, type=int) 78 | args = parser.parse_args() 79 | infer_all(args.option, args.data_path, args.max_time, args.max_step) 80 | 81 | 82 | 83 | 84 | -------------------------------------------------------------------------------- /src/hyperparameters.py: -------------------------------------------------------------------------------- 1 | 2 | import sys 3 | sys.path.append('..') 4 | from dataset import * 5 | 6 | train_epoch_num = 9 7 | batch_size = 32 8 | learning_rate_optim_extrusion = 0.0001 9 | 10 | zone_shape_dim = 128 11 | zone_feat_dim = 128 12 | gnn_node_feat_dim = 128 13 | 14 | 15 | -------------------------------------------------------------------------------- /src/models.py: -------------------------------------------------------------------------------- 1 | import sys 2 | sys.path.append('..') 3 | 4 | import sys 5 | import numpy as np 6 | import torch as torch 7 | import torch.nn as nn 8 | import torch.nn.functional as F 9 | import dgl 10 | from torch.autograd import Variable 11 | 12 | class GraphNetSoftMax(nn.Module): 13 | def __init__(self, in_dim, out_dim): 14 | super(GraphNetSoftMax, self).__init__() 15 | 16 | self.node_feat_size = in_dim 17 | 18 | self.mp1 = MPLayer(self.node_feat_size, self.node_feat_size) 19 | self.post_mp1 = nn.Sequential( 20 | nn.BatchNorm1d(self.node_feat_size), 21 | nn.LeakyReLU(negative_slope=0.2)) 22 | 23 | self.mp2 = MPLayer(self.node_feat_size, self.node_feat_size) 24 | self.post_mp2 = nn.Sequential( 25 | nn.BatchNorm1d(self.node_feat_size), 26 | nn.LeakyReLU(negative_slope=0.2)) 27 | 28 | self.mp3 = MPLayer(self.node_feat_size, self.node_feat_size) 29 | self.post_mp3 = nn.Sequential( 30 | nn.BatchNorm1d(self.node_feat_size), 31 | nn.LeakyReLU(negative_slope=0.2)) 32 | 33 | self.mp4 = MPLayer(self.node_feat_size, self.node_feat_size) 34 | self.post_mp4 = nn.Sequential( 35 | nn.BatchNorm1d(self.node_feat_size), 36 | nn.LeakyReLU(negative_slope=0.2)) 37 | 38 | self.fc1 = nn.Sequential(nn.Linear(self.node_feat_size, 128), 39 | nn.BatchNorm1d(128), 40 | nn.LeakyReLU(negative_slope=0.2)) 41 | 42 | self.fc2 = nn.Sequential(nn.Linear(128, 128), 43 | nn.BatchNorm1d(128), 44 | nn.LeakyReLU(negative_slope=0.2)) 45 | 46 | self.fc_final = nn.Sequential(nn.Linear(128, out_dim), 47 | nn.LogSoftmax(dim=1)) 48 | 49 | 50 | def forward(self, g): 
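        # Forward pass over a (batched) zone graph: per-node features produced by
        # ZoneEncoder arrive in g.ndata['h']; three rounds of message passing mix
        # information between adjacent zones, a max-over-nodes readout pools the graph
        # into a single vector, and the MLP head ends in LogSoftmax, giving the
        # log-probabilities used to rank a candidate extrusion.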
51 | h = g.ndata.pop('h') 52 | 53 | h = self.mp1(g, h) 54 | h = self.post_mp1(h) 55 | 56 | h = self.mp2(g, h) 57 | h = self.post_mp2(h) 58 | 59 | h = self.mp3(g, h) 60 | 61 | h = self.readout(g, h) 62 | 63 | h = self.fc1(h) 64 | h = self.fc2(h) 65 | h = self.fc_final(h) 66 | 67 | return h 68 | 69 | def readout(self, g, node_feats): 70 | g.ndata['h'] = node_feats 71 | out = dgl.max_nodes(g, 'h') 72 | return out 73 | 74 | 75 | class MPLayer(nn.Module): 76 | def __init__(self, in_dim, out_dim): 77 | super(MPLayer, self).__init__() 78 | 79 | self.fc1 = nn.Sequential(nn.Linear(in_dim, in_dim), 80 | nn.BatchNorm1d(in_dim), 81 | nn.LeakyReLU(negative_slope=0.2) 82 | ) 83 | 84 | self.fc2 = nn.Sequential(nn.Linear(in_dim, in_dim), 85 | nn.BatchNorm1d(in_dim), 86 | nn.LeakyReLU(negative_slope=0.2) 87 | ) 88 | 89 | self.linear = nn.Linear(in_dim, out_dim) 90 | 91 | 92 | def forward(self, g, node_feats): 93 | g.ndata['h'] = node_feats 94 | g.send(g.edges(), self.message) 95 | g.recv(g.nodes(), self.reduce) 96 | h = g.ndata.pop('h') 97 | h = self.linear(h) 98 | return h 99 | 100 | def message(self, edges): 101 | h = edges.src['h'] 102 | h = self.fc1(h) 103 | h = self.fc2(h) 104 | return {'msg': h} 105 | 106 | def reduce(self, nodes): 107 | return {'h': nodes.mailbox['msg'].mean(1)} 108 | 109 | class ZoneEncoder(nn.Module): 110 | def __init__(self, point_num, out_dim): 111 | super(ZoneEncoder, self).__init__() 112 | self.point_num = point_num 113 | 114 | self.conv1 = nn.Sequential(nn.Conv1d(10, 64, 1), 115 | nn.BatchNorm1d(64), 116 | nn.LeakyReLU(negative_slope=0.2)) 117 | 118 | self.conv2 = nn.Sequential(nn.Conv1d(64, 128, 1), 119 | nn.BatchNorm1d(128), 120 | nn.LeakyReLU(negative_slope=0.2)) 121 | 122 | self.conv3 = nn.Sequential(nn.Conv1d(128, 128, 1), 123 | ) 124 | 125 | self.fc1 = nn.Sequential(nn.Linear(128, 128), 126 | nn.BatchNorm1d(128), 127 | nn.LeakyReLU(negative_slope=0.2)) 128 | 129 | self.fc2 = nn.Sequential(nn.Linear(128, 128), 130 | nn.BatchNorm1d(128), 131 | nn.LeakyReLU(negative_slope=0.2)) 132 | 133 | self.fc_final = nn.Sequential(nn.Linear(128, out_dim), 134 | nn.BatchNorm1d(out_dim), 135 | nn.LeakyReLU(negative_slope=0.2) 136 | ) 137 | 138 | def forward(self, x): 139 | 140 | x = self.conv1(x) 141 | x = self.conv2(x) 142 | x = self.conv3(x) 143 | 144 | global_x = torch.max(x, 2, keepdim=True)[0] 145 | x = torch.flatten(global_x, 1, -1) 146 | 147 | x = self.fc1(x) 148 | x = self.fc_final(x) 149 | return x 150 | -------------------------------------------------------------------------------- /src/objects.py: -------------------------------------------------------------------------------- 1 | import sys 2 | from setup import * 3 | import FreeCAD 4 | import Part 5 | from FreeCAD import Base 6 | import numpy as np 7 | import json 8 | import copy 9 | from collections import defaultdict 10 | import networkx as nx 11 | 12 | from utils.edge_utils import * 13 | from utils.face_utils import * 14 | from utils.solid_utils import * 15 | from utils.vector_utils import * 16 | from utils.vertex_utils import * 17 | from utils.space_splitter import * 18 | from utils.combination_utils import * 19 | from utils.file_utils import * 20 | 21 | import time 22 | 23 | zone_sample_num = 500 24 | eps = 10e-6 25 | vol_eps = 10e-7 26 | 27 | class Extrusion: 28 | def __init__(self, cad_shape=None): 29 | self.cad_shape = cad_shape 30 | self.zone_indices = [] 31 | self.bool_type = None 32 | self.score = 0 33 | 34 | def copy(self): 35 | new = Extrusion() 36 | new.cad_shape = self.cad_shape.copy() 37 | new.zone_indices 
= copy.deepcopy(self.zone_indices) 38 | new.bool_type = copy.deepcopy(self.bool_type) 39 | return new 40 | 41 | def hash(self): 42 | return tuple(sorted(self.zone_indices)) 43 | 44 | def get_extrusion_heur_score(extrusion, zone_graph): 45 | zone_to_current_label = copy.deepcopy(zone_graph.zone_to_current_label) 46 | zone_to_target_label = zone_graph.zone_to_target_label 47 | 48 | if extrusion.bool_type == 0: 49 | for i in extrusion.zone_indices: 50 | zone_to_current_label[i] = True 51 | else: 52 | for i in extrusion.zone_indices: 53 | zone_to_current_label[i] = False 54 | 55 | zone_count = len(zone_graph.zones) 56 | dismatch_count = 0 57 | for z_i in range(0, zone_count): 58 | if zone_to_current_label[z_i] and not zone_to_target_label[z_i]: 59 | dismatch_count += 1 60 | if not zone_to_current_label[z_i] and zone_to_target_label[z_i]: 61 | dismatch_count += 1 62 | heur_score1 = (zone_count - dismatch_count) / zone_count 63 | 64 | I = 0 65 | U = 0 66 | for z_i in range(0, zone_count): 67 | if zone_to_current_label[z_i] and zone_to_target_label[z_i]: 68 | U += zone_graph.zones[z_i].cad_shape.Volume 69 | I += zone_graph.zones[z_i].cad_shape.Volume 70 | if zone_to_current_label[z_i] and not zone_to_target_label[z_i]: 71 | U += zone_graph.zones[z_i].cad_shape.Volume 72 | if not zone_to_current_label[z_i] and zone_to_target_label[z_i]: 73 | U += zone_graph.zones[z_i].cad_shape.Volume 74 | heur_score2 = I/U 75 | 76 | heur_score2 = 0 77 | return heur_score1, heur_score2 78 | 79 | class Zone: 80 | def __init__(self, cad_shape = None): 81 | self.cad_shape = cad_shape 82 | self.sample_positions = None 83 | self.sample_normals = None 84 | self.score = 0 85 | self.inside_points = [] 86 | 87 | def cal_inside_points(self): 88 | inside_points = [] 89 | 90 | tmp = [su.point_inside_solid(self.cad_shape.Solids[0])] 91 | for v in tmp: 92 | if v: 93 | inside_points.append(np.array([v[0], v[1], v[2]])) 94 | else: 95 | inside_points.append(None) 96 | 97 | return inside_points 98 | 99 | class ZoneGraph: 100 | def __init__(self, is_forward=True): 101 | self.current_shape = None 102 | self.target_shape = None 103 | self.bbox = None 104 | 105 | #zones 106 | self.zones = [] 107 | self.zone_graph = nx.Graph() 108 | self.zone_to_faces = defaultdict(list) 109 | self.zone_to_current_label = {} 110 | self.zone_to_target_label = {} 111 | 112 | #faces 113 | self.faces = [] 114 | self.exterior_faces = set() 115 | self.face_to_zones = defaultdict(list) 116 | self.face_to_current_label = {} 117 | self.face_to_target_label = {} 118 | self.face_to_extrusion_zones = {} 119 | 120 | #planes 121 | self.planes = [] 122 | self.plane_to_faces = defaultdict(list) 123 | self.plane_to_face_graph = {} 124 | 125 | self.plane_to_pos_zones = defaultdict(set) 126 | self.plane_to_neg_zones = defaultdict(set) 127 | 128 | def build(self, use_face_loop = True): 129 | if len(self.zones) > 0: 130 | return False, None 131 | 132 | sp = SpaceSplitter(self.target_shape, use_face_loop = use_face_loop) 133 | gen_cylinder = sp.isGenCylinder 134 | 135 | 136 | zone_solids = sp.get_zone_solids() 137 | self.bbox = sp.bbox_used 138 | 139 | print('number of zones', len(zone_solids)) 140 | if len(zone_solids) <= 1 and not gen_cylinder: 141 | return False, 'single_zone' 142 | if len(zone_solids) <= 1 and gen_cylinder: 143 | return False, 'gen_cylinder_single_zone' 144 | 145 | self.planes = sp.get_proposal_planes() 146 | face_to_index = {} 147 | for s_i, solid in enumerate(zone_solids): 148 | for face in solid.Faces: 149 | key = hash_face(face) 150 | if key not in 
face_to_index: 151 | face_to_index[hash_face(face)] = len(self.faces) 152 | self.faces.append(face) 153 | 154 | for f_i, face in enumerate(self.faces): 155 | for box_face in sp.bbox_geo.Faces: 156 | if face_share_extension_plane(face, box_face): 157 | self.exterior_faces.add(f_i) 158 | break 159 | 160 | target_volume = 0 161 | for z_i, zone_solid in enumerate(zone_solids): 162 | zone = Zone() 163 | zone.cad_shape = zone_solid 164 | self.zones.append(zone) 165 | for face in zone_solid.Faces: 166 | f_i = face_to_index[hash_face(face)] 167 | self.zone_to_faces[z_i].append(f_i) 168 | self.face_to_zones[f_i].append(z_i) 169 | 170 | zone.sample_positions, zone.sample_normals = get_samples_on_solid_surface(zone.cad_shape, zone_sample_num) 171 | zone.inside_points = zone.cal_inside_points() 172 | 173 | if solid_contain_zone(self.current_shape, zone): 174 | self.zone_to_current_label[z_i] = True 175 | else: 176 | self.zone_to_current_label[z_i] = False 177 | 178 | if solid_contain_zone(self.target_shape, zone): 179 | self.zone_to_target_label[z_i] = True 180 | target_volume += zone.cad_shape.Volume 181 | else: 182 | self.zone_to_target_label[z_i] = False 183 | 184 | if abs(target_volume - self.target_shape.Volume) > 0.1 * self.target_shape.Volume: 185 | return False, 'target_mismatch' 186 | 187 | for p_i, plane in enumerate(self.planes): 188 | for f_i, face in enumerate(self.faces): 189 | if face_share_extension_plane(plane, face): 190 | self.plane_to_faces[p_i].append(f_i) 191 | 192 | start = time.time() 193 | 194 | self.zone_graph = nx.Graph() 195 | for z_i, zone in enumerate(self.zones): 196 | self.zone_graph.add_node(z_i) 197 | 198 | visited_faces = set() 199 | for f_i, face_i in enumerate(self.faces): 200 | if len(self.face_to_zones[f_i]) == 2: 201 | z_i = self.face_to_zones[f_i][0] 202 | z_j = self.face_to_zones[f_i][1] 203 | self.zone_graph.add_edge(z_i, z_j) 204 | visited_faces.add(f_i) 205 | 206 | for f_i in range(0, len(self.faces)): 207 | for f_j in range(f_i, len(self.faces)): 208 | if f_i not in visited_faces and f_j not in visited_faces: 209 | condition = face_touch_condition(self.faces[f_i], self.faces[f_j]) 210 | if condition > 1: 211 | z_i = self.face_to_zones[f_i][0] 212 | z_j = self.face_to_zones[f_j][0] 213 | self.zone_graph.add_edge(z_i, z_j) 214 | # face i contain face j 215 | if condition == 2: 216 | visited_faces.add(f_j) 217 | # face j contain face i 218 | if condition == 3: 219 | visited_faces.add(f_i) 220 | 221 | print('time building zone graph connections', time.time() - start) 222 | 223 | # build face graph connectivity on each plane 224 | start = time.time() 225 | for plane_i, plane in enumerate(self.planes): 226 | face_graph = nx.Graph() 227 | visited_edges = set() 228 | edges = [] 229 | edge_to_index = {} 230 | edge_to_faces = defaultdict(list) 231 | for f_i in self.plane_to_faces[plane_i]: 232 | for edge in self.faces[f_i].Edges: 233 | edge_key = hash_edge(edge) 234 | if edge_key not in edge_to_index: 235 | edge_index = len(edges) 236 | edge_to_index[edge_key] = edge_index 237 | edges.append(edge) 238 | else: 239 | edge_index = edge_to_index[edge_key] 240 | edge_to_faces[edge_index].append(f_i) 241 | 242 | for e_i in range(len(edges)): 243 | if len(edge_to_faces[e_i]) == 2: 244 | f_i = edge_to_faces[e_i][0] 245 | f_j = edge_to_faces[e_i][1] 246 | face_graph.add_edge(f_i, f_j) 247 | visited_edges.add(e_i) 248 | 249 | for e_i in range(0, len(edges)): 250 | for e_j in range(e_i, len(edges)): 251 | if e_i not in visited_edges and e_j not in visited_edges: 252 | 
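                        # edge_touch_condition classifies how two boundary edges overlap:
                        # a value > 1 means they touch, 2 means edge i contains edge j and
                        # 3 means edge j contains edge i. In the containment cases the two
                        # faces are linked in the per-plane face graph only if the shared
                        # portion covers at least half of the containing edge.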
condition = edge_touch_condition(edges[e_i], edges[e_j]) 253 | if condition > 1: 254 | f_i = edge_to_faces[e_i][0] 255 | f_j = edge_to_faces[e_j][0] 256 | 257 | # edge i contain edge j 258 | if condition == 2: 259 | common = edges[e_i].common(edges[e_j]) 260 | print('common.Length / edges[e_i].Length', common.Length / edges[e_i].Length) 261 | if common.Length / edges[e_i].Length >= 0.5: 262 | face_graph.add_edge(f_i, f_j) 263 | visited_edges.add(e_j) 264 | # edge j contain edge i 265 | if condition == 3: 266 | common = edges[e_i].common(edges[e_j]) 267 | print('common.Length / edges[e_j].Length', common.Length / edges[e_j].Length) 268 | if common.Length / edges[e_j].Length >= 0.5: 269 | face_graph.add_edge(f_i, f_j) 270 | visited_edges.add(e_i) 271 | 272 | self.plane_to_face_graph[plane_i] = face_graph 273 | 274 | for plane_i, plane in enumerate(self.planes): 275 | plane_normal = plane.normalAt(0.5,0.5) 276 | plane_center = plane.CenterOfMass 277 | for z_i, z in enumerate(self.zones): 278 | zone_dir = z.cad_shape.CenterOfMass - plane_center 279 | if np.dot(zone_dir, plane_normal) >= 0: 280 | self.plane_to_pos_zones[plane_i].add(z_i) 281 | else: 282 | self.plane_to_neg_zones[plane_i].add(z_i) 283 | 284 | self.assign_face_labels() 285 | print('zone graph building complete') 286 | return True, None 287 | 288 | def assign_face_labels(self): 289 | for plane_i, plane in enumerate(self.planes): 290 | plane_normal = plane.normalAt(0.5, 0.5) 291 | for sign in [1, -1]: 292 | plane_direction = plane_normal * sign 293 | face_indices = self.plane_to_faces[plane_i] 294 | for face_i in face_indices: 295 | direction_key = hash_vector(plane_direction) 296 | #direction_key = tuple([plane_direction[0], plane_direction[1], plane_direction[2]]) 297 | #backward_direction_key = tuple([-plane_direction[0], -plane_direction[1], -plane_direction[2]]) 298 | self.face_to_current_label[(face_i, direction_key)] = False 299 | #self.face_to_current_label[(face_i, backward_direction_key)] = False 300 | self.face_to_target_label[(face_i, direction_key)] = False 301 | #self.face_to_target_label[(face_i, backward_direction_key)] = False 302 | for zone_i in self.face_to_zones[face_i]: 303 | zone_dir = self.zones[zone_i].cad_shape.CenterOfMass - self.faces[face_i].CenterOfMass 304 | zone_dir = zone_dir / np.linalg.norm(zone_dir) 305 | if np.dot(plane_direction, zone_dir) > 0: 306 | if self.zone_to_current_label[zone_i]: 307 | self.face_to_current_label[(face_i, direction_key)] = True 308 | else: 309 | self.face_to_current_label[(face_i, direction_key)] = False 310 | 311 | if self.zone_to_target_label[zone_i]: 312 | self.face_to_target_label[(face_i, direction_key)] = True 313 | else: 314 | self.face_to_target_label[(face_i, direction_key)] = False 315 | 316 | #print('pos_count', pos_count) 317 | #print('neg_count', neg_count) 318 | 319 | def encode_with_extrusion(self, extrusion): 320 | 321 | node_features = {} 322 | for zone_i, zone in enumerate(self.zones): 323 | node_features[zone_i] = {} 324 | shape_positions = zone.sample_positions 325 | shape_normals = zone.sample_normals 326 | 327 | node_features[zone_i]['shape_positions'] = shape_positions 328 | node_features[zone_i]['shape_normals'] = shape_normals 329 | 330 | if self.zone_to_current_label[zone_i]: 331 | node_features[zone_i]['in_current'] = 1 332 | else: 333 | node_features[zone_i]['in_current'] = 0 334 | 335 | if self.zone_to_target_label[zone_i]: 336 | node_features[zone_i]['in_target'] = 1 337 | else: 338 | node_features[zone_i]['in_target'] = 0 339 | 340 | if 
zone_i in extrusion.zone_indices: 341 | node_features[zone_i]['in_extrusion'] = 1 342 | else: 343 | node_features[zone_i]['in_extrusion'] = 0 344 | 345 | node_features[zone_i]['bool'] = extrusion.bool_type 346 | 347 | nx.set_node_attributes(self.zone_graph, node_features) 348 | 349 | def update_to_next_zone_graph(self, extrusion): 350 | next_zone_graph = ZoneGraph() 351 | next_zone_graph.bbox = self.bbox 352 | next_zone_graph.zones = self.zones 353 | next_zone_graph.faces = self.faces 354 | next_zone_graph.exterior_faces = self.exterior_faces 355 | next_zone_graph.planes = self.planes 356 | next_zone_graph.zone_to_faces = self.zone_to_faces 357 | next_zone_graph.face_to_zones = self.face_to_zones 358 | 359 | next_zone_graph.plane_to_faces = self.plane_to_faces 360 | next_zone_graph.plane_to_face_graph = self.plane_to_face_graph 361 | next_zone_graph.face_to_extrusion_zones = self.face_to_extrusion_zones 362 | 363 | #next_zone_graph.current_shape = self.current_shape 364 | next_zone_graph.target_shape = self.target_shape 365 | next_zone_graph.plane_to_pos_zones = self.plane_to_pos_zones 366 | next_zone_graph.plane_to_neg_zones = self.plane_to_neg_zones 367 | 368 | 369 | next_zone_graph.zone_graph = copy.deepcopy(self.zone_graph) 370 | next_zone_graph.zone_to_current_label = copy.deepcopy(self.zone_to_current_label) 371 | next_zone_graph.zone_to_target_label = copy.deepcopy(self.zone_to_target_label) 372 | 373 | if extrusion.bool_type == 0: 374 | for i in extrusion.zone_indices: 375 | next_zone_graph.zone_to_current_label[i] = True 376 | else: 377 | for i in extrusion.zone_indices: 378 | next_zone_graph.zone_to_current_label[i] = False 379 | 380 | #start = time.time() 381 | next_zone_graph.assign_face_labels() 382 | #print('time of face label assignment', time.time() - start) 383 | #next_zone_graph.encode() 384 | 385 | return next_zone_graph 386 | 387 | def get_current_zone_indices(self): 388 | indices = [] 389 | for zone_i, zone in enumerate(self.zones): 390 | if self.zone_to_current_label[zone_i]: 391 | indices.append(zone_i) 392 | return indices 393 | 394 | def get_target_zone_indices(self): 395 | indices = [] 396 | for zone_i, zone in enumerate(self.zones): 397 | if self.zone_to_target_label[zone_i]: 398 | indices.append(zone_i) 399 | return indices 400 | 401 | def is_done(self): 402 | error_volume = 0 403 | error_count = 0 404 | threshold_vol = self.target_shape.Volume * 0.01 405 | #print('threshold_vol', threshold_vol) 406 | for z_i, zone in enumerate(self.zones): 407 | if self.zone_to_current_label[z_i] and not self.zone_to_target_label[z_i]: 408 | error_volume += zone.cad_shape.Volume 409 | error_count += 1 410 | if not self.zone_to_current_label[z_i] and self.zone_to_target_label[z_i]: 411 | error_volume += zone.cad_shape.Volume 412 | error_count += 1 413 | 414 | if error_volume > threshold_vol: 415 | #print('error_volume', error_volume, "/",threshold_vol) 416 | return False 417 | return True 418 | 419 | def get_IOU_score(self): 420 | I = 0 421 | U = 0 422 | for z_i, zone in enumerate(self.zones): 423 | if self.zone_to_current_label[z_i] and self.zone_to_target_label[z_i]: 424 | U += zone.cad_shape.Volume 425 | I += zone.cad_shape.Volume 426 | if self.zone_to_current_label[z_i] and not self.zone_to_target_label[z_i]: 427 | U += zone.cad_shape.Volume 428 | if not self.zone_to_current_label[z_i] and self.zone_to_target_label[z_i]: 429 | U += zone.cad_shape.Volume 430 | 431 | return I/U -------------------------------------------------------------------------------- /src/proposal.py: 
-------------------------------------------------------------------------------- 1 | 2 | import sys 3 | sys.path.append('..') 4 | 5 | import numpy as np 6 | from setup import * 7 | from collections import defaultdict 8 | import os 9 | import itertools 10 | import copy 11 | import time 12 | import networkx as nx 13 | import random 14 | from objects import * 15 | import itertools 16 | 17 | import networkx as nx 18 | 19 | 20 | eps = 10e-6 21 | def get_proposals(zone_graph, level=1): 22 | """ 23 | given zone graph, find valid possible extrusions 24 | each extrusion is represented as a list of zones (index) 25 | """ 26 | #print('start proposal') 27 | start = time.time() 28 | extrusions = [] 29 | PG = ProposalGenerator(zone_graph) 30 | combs = PG.get_extrusion_combs(level) 31 | #print('time of finding proposals', time.time() - start) 32 | 33 | for comb in combs: 34 | 35 | all_out_current = True 36 | all_in_current = True 37 | for zone_i in comb: 38 | if PG.zone_graph.zone_to_current_label[zone_i]: 39 | all_out_current = False 40 | else: 41 | all_in_current = False 42 | 43 | #print('all_in_current', all_in_current) 44 | #print('all_out_current', all_out_current) 45 | 46 | bool_types = [] 47 | if all_in_current: 48 | bool_types = [1] 49 | else: 50 | if all_out_current: 51 | bool_types = [0] 52 | else: 53 | bool_types = [0, 1] 54 | 55 | for bool_type in bool_types: 56 | extrusion = Extrusion() 57 | extrusion.zone_indices = comb 58 | extrusion.bool_type = bool_type 59 | extrusions.append(extrusion) 60 | 61 | #print('number of extrusions', len(extrusions)) 62 | return extrusions 63 | 64 | 65 | def connected_component_subgraphs(G): 66 | for c in nx.connected_components(G): 67 | yield G.subgraph(c) 68 | 69 | def get_all_graph_groups(G): 70 | graphs = list(connected_component_subgraphs(G)) 71 | return [list(g.nodes) for g in graphs] 72 | 73 | def get_fixed_size_group_combs(groups, size): 74 | if size > len(groups): 75 | return [] 76 | groups = itertools.combinations(groups, size) 77 | ret_groups = [] 78 | for gs in groups: 79 | temp_gs = [] 80 | for g in gs: 81 | temp_gs += g 82 | ret_groups.append(temp_gs) 83 | return ret_groups 84 | 85 | 86 | class ProposalGenerator(): 87 | def __init__(self, zone_graph): 88 | self.zone_graph = zone_graph 89 | self.signed_plane_to_face_groups = defaultdict(list) 90 | self.extrusion_vols = {} 91 | 92 | def get_extrusion_combs(self, level=1): 93 | """ 94 | get all possible extrusion volumes 95 | """ 96 | eps = 10e-6 97 | 98 | extrusion_combs = [] 99 | visited_extrusions = set() 100 | 101 | plane_pairs = get_fixed_size_combinations(range(len(self.zone_graph.planes)), 2) 102 | validate_pairs = [] 103 | for pair in plane_pairs: 104 | plane1 = self.zone_graph.planes[pair[0]] 105 | plane2 = self.zone_graph.planes[pair[1]] 106 | if face_parallel(plane1, plane2): 107 | validate_pairs.append((pair[0], pair[1])) 108 | validate_pairs.append((pair[1], pair[0])) 109 | 110 | #print('number of validate pairs', len(validate_pairs)) 111 | start_time = time.time() 112 | for pair in validate_pairs: 113 | start_plane_i = pair[0] 114 | end_plane_i = pair[1] 115 | 116 | start_plane = self.zone_graph.planes[pair[0]] 117 | start_normal = start_plane.normalAt(0.5,0.5) 118 | 119 | end_plane = self.zone_graph.planes[pair[1]] 120 | end_normal = end_plane.normalAt(0.5,0.5) 121 | 122 | p1 = start_plane.CenterOfMass 123 | p2 = end_plane.CenterOfMass 124 | d = distance_of_planes(p1, p2, start_normal) 125 | if abs(d) < eps: 126 | continue 127 | 128 | extrusion_vector = start_normal * d 129 | 130 | if 
d < 0: 131 | signed_start_normal = -1 * start_normal 132 | else: 133 | signed_start_normal = start_normal 134 | 135 | loop_to_start = end_plane.CenterOfMass - start_plane.CenterOfMass 136 | loop_to_end = start_plane.CenterOfMass - end_plane.CenterOfMass 137 | 138 | start_set = None 139 | end_set = None 140 | if np.dot(loop_to_start, start_normal) >= 0 and np.dot(loop_to_end, end_normal) >= 0 : 141 | start_set = self.zone_graph.plane_to_pos_zones[pair[0]] 142 | end_set = self.zone_graph.plane_to_pos_zones[pair[1]] 143 | if np.dot(loop_to_start, start_normal) >= 0 and np.dot(loop_to_end, end_normal) < 0 : 144 | start_set = self.zone_graph.plane_to_pos_zones[pair[0]] 145 | end_set = self.zone_graph.plane_to_neg_zones[pair[1]] 146 | if np.dot(loop_to_start, start_normal) < 0 and np.dot(loop_to_end, end_normal) >= 0 : 147 | start_set = self.zone_graph.plane_to_neg_zones[pair[0]] 148 | end_set = self.zone_graph.plane_to_pos_zones[pair[1]] 149 | if np.dot(loop_to_start, start_normal) < 0 and np.dot(loop_to_end, end_normal) < 0 : 150 | start_set = self.zone_graph.plane_to_neg_zones[pair[0]] 151 | end_set = self.zone_graph.plane_to_neg_zones[pair[1]] 152 | 153 | middle_zones = list(start_set.intersection(end_set)) 154 | 155 | signed_plane_key = (pair[0], hash_vector(signed_start_normal)) 156 | if signed_plane_key not in self.signed_plane_to_face_groups: 157 | 158 | face_indices = self.zone_graph.plane_to_faces[pair[0]] 159 | face_graph = self.zone_graph.plane_to_face_graph[pair[0]] 160 | 161 | forward_direction_key = hash_vector(signed_start_normal) 162 | backward_direction_key = hash_vector(-signed_start_normal) 163 | 164 | valid_start_face_indices = [] 165 | 166 | #print('face_indices', face_indices) 167 | 168 | for face_i in face_indices: 169 | current_label_forward = self.zone_graph.face_to_current_label[(face_i, forward_direction_key)] 170 | current_label_backward = self.zone_graph.face_to_current_label[(face_i, backward_direction_key)] 171 | if current_label_forward and not current_label_backward: 172 | valid_start_face_indices.append(face_i) 173 | continue 174 | if not current_label_forward and current_label_backward: 175 | valid_start_face_indices.append(face_i) 176 | continue 177 | 178 | for exterior_face in self.zone_graph.exterior_faces: 179 | exterior_normal = self.zone_graph.faces[exterior_face].normalAt(0.5, 0.5) 180 | if 1 - abs(np.dot(self.zone_graph.faces[face_i].normalAt(0.5, 0.5), exterior_normal)) < 10e-6: 181 | valid_start_face_indices.append(face_i) 182 | break 183 | 184 | #print('len(valid_start_face_indices)', len(valid_start_face_indices)) 185 | 186 | if len(valid_start_face_indices) > 0: 187 | target_only_faces = [] 188 | current_only_faces = [] 189 | idle_faces = [] 190 | for face_i in valid_start_face_indices: 191 | current_label = self.zone_graph.face_to_current_label[(face_i, forward_direction_key)] 192 | target_label = self.zone_graph.face_to_target_label[(face_i, forward_direction_key)] 193 | 194 | if not target_label and current_label: 195 | current_only_faces.append(face_i) 196 | 197 | if target_label and not current_label: 198 | target_only_faces.append(face_i) 199 | 200 | if not target_label and not current_label: 201 | idle_faces.append(face_i) 202 | 203 | #print('target_only_faces', len(target_only_faces)) 204 | #print('current_only_faces', len(current_only_faces)) 205 | #print('idle_faces', len(idle_faces)) 206 | 207 | final_current_groups = [] 208 | if len(current_only_faces) > 0: 209 | if len(current_only_faces) == 1: 210 | final_current_groups = 
[current_only_faces] 211 | else: 212 | current_sub_g = face_graph.subgraph(current_only_faces) 213 | current_groups = get_all_graph_groups(current_sub_g) 214 | 215 | for level_i in range(level): 216 | final_current_groups += get_fixed_size_group_combs(current_groups, level_i+1) 217 | final_current_groups += get_fixed_size_group_combs(current_groups, len(current_groups)-level_i) 218 | 219 | final_target_groups = [] 220 | if len(target_only_faces) > 0: 221 | if len(target_only_faces) == 1: 222 | final_target_groups = [target_only_faces] 223 | else: 224 | target_sub_g = face_graph.subgraph(target_only_faces) 225 | target_groups = get_all_graph_groups(target_sub_g) 226 | 227 | for level_i in range(level): 228 | final_target_groups += get_fixed_size_group_combs(target_groups, level_i+1) 229 | final_target_groups += get_fixed_size_group_combs(target_groups, len(target_groups)-level_i) 230 | 231 | final_target_idle_groups = [] 232 | if len(idle_faces + target_only_faces) > 0: 233 | if len(idle_faces + target_only_faces) == 1: 234 | final_target_idle_groups = [idle_faces + target_only_faces] 235 | else: 236 | target_idle_sub_g = face_graph.subgraph(idle_faces + target_only_faces) 237 | target_idle_groups = get_all_graph_groups(target_idle_sub_g) 238 | for level_i in range(level): 239 | final_target_idle_groups += get_fixed_size_group_combs(target_idle_groups, level_i+1) 240 | final_target_idle_groups += get_fixed_size_group_combs(target_idle_groups, len(target_idle_groups)-level_i) 241 | 242 | face_groups = final_current_groups + final_target_groups + final_target_idle_groups 243 | 244 | visited_face_groups = set() 245 | for fg in face_groups: 246 | if len(fg) > 0: 247 | visited_face_groups.add(tuple(sorted(fg))) 248 | 249 | face_groups = list(visited_face_groups) 250 | else: 251 | face_groups = [] 252 | 253 | self.signed_plane_to_face_groups[signed_plane_key] = face_groups 254 | else: 255 | face_groups = self.signed_plane_to_face_groups[signed_plane_key] 256 | 257 | #print('time spent on face_groups computing', time.time()-start_time) 258 | 259 | #print('number of face groups', len(face_groups)) 260 | start_time = time.time() 261 | for face_group in face_groups: 262 | zone_indices = [] 263 | for f_i in face_group: 264 | face_key = (f_i, start_plane_i, end_plane_i) 265 | if face_key in self.zone_graph.face_to_extrusion_zones: 266 | face_zone_indices = self.zone_graph.face_to_extrusion_zones[face_key] 267 | else: 268 | solid = self.zone_graph.faces[f_i].copy().extrude(extrusion_vector) 269 | face_zone_indices = self.get_inside_zones(self.zone_graph, start_plane_i, end_plane_i, [f_i], middle_zones) 270 | # face_zone_indices = self.get_inside_zones_freecad(self.zone_graph, solid) 271 | 272 | 273 | running_vol = 0 274 | for z_i in face_zone_indices: 275 | zone = self.zone_graph.zones[z_i] 276 | running_vol += abs(zone.cad_shape.Volume) 277 | 278 | if abs(running_vol - solid.Volume) > 0.0001 * solid.Volume: 279 | face_zone_indices = [] 280 | 281 | # ensures the result is generalized cylinder 282 | # face_zone_indices = self.judge_gen_cyn_zones(face_zone_indices) 283 | 284 | self.zone_graph.face_to_extrusion_zones[face_key] = face_zone_indices 285 | zone_indices += face_zone_indices 286 | 287 | if len(zone_indices) == 0: 288 | continue 289 | 290 | shape_key = tuple(sorted(zone_indices)) 291 | if shape_key not in visited_extrusions: 292 | visited_extrusions.add(shape_key) 293 | extrusion_combs.append(zone_indices) 294 | #print('time spent on iterating face groups', time.time()-start_time) 295 | 296 | 
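        # Each entry in extrusion_combs is a list of zone indices whose zones together
        # form one candidate extrusion volume; the visited_extrusions set above drops
        # duplicate zone sets reached via different face groups or plane pairs.
        # Illustrative usage (the returned value shown is only an example):
        #   PG = ProposalGenerator(zone_graph)
        #   combs = PG.get_extrusion_combs(level=1)   # e.g. [[3, 7, 8], [12], ...]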
return extrusion_combs 297 | 298 | def get_inside_zones(self, zone_graph, start_plane_i, end_plane_i, face_group, zone_candidates): 299 | start_plane = self.zone_graph.planes[start_plane_i] 300 | start_normal = start_plane.normalAt(0.5,0.5) 301 | 302 | zones_in_extrusion = set() 303 | open_faces = list(face_group) 304 | closed_faces = [] 305 | #print('open_faces', open_faces) 306 | while len(open_faces) > 0: 307 | f_to_explore = open_faces[0] 308 | open_faces.remove(f_to_explore) 309 | zones = zone_graph.face_to_zones[f_to_explore] 310 | #print('zones', zones) 311 | new_zones = set() 312 | for z_i in zones: 313 | if z_i in zone_candidates: 314 | new_zones.add(z_i) 315 | 316 | #print('new zones', new_zones) 317 | 318 | zones_in_extrusion = zones_in_extrusion.union(new_zones) 319 | #print('zones_in_extrusion', zones_in_extrusion) 320 | #exit() 321 | 322 | new_faces = [] 323 | for z_i in new_zones: 324 | faces = zone_graph.zone_to_faces[z_i] 325 | for f_i in faces: 326 | if f_i not in open_faces and f_i not in closed_faces: 327 | new_faces.append(f_i) 328 | 329 | for new_f_i in new_faces: 330 | if abs(np.dot(zone_graph.faces[new_f_i].normalAt(0.5, 0.5), start_normal)) > 10e-3: 331 | open_faces.append(new_f_i) 332 | 333 | closed_faces.append(f_to_explore) 334 | 335 | return list(zones_in_extrusion) -------------------------------------------------------------------------------- /src/search.py: -------------------------------------------------------------------------------- 1 | 2 | import sys 3 | sys.path.append('..') 4 | 5 | import numpy as np 6 | import os 7 | import argparse 8 | import numpy as np 9 | 10 | from dataset import * 11 | from objects import * 12 | from evaluation import * 13 | from agent import * 14 | from proposal import * 15 | from queue import Queue, LifoQueue, PriorityQueue 16 | 17 | import shutil 18 | import time 19 | import random 20 | import copy 21 | 22 | # import psutil 23 | 24 | class SearchSolution(): 25 | def __init__(self): 26 | self.ious = [] 27 | self.times = [] 28 | 29 | self.best_seq = None 30 | self.best_time = None 31 | self.best_score = 0 32 | 33 | def dfs_best_recon(zone_graph, max_step, max_time, expand_width, sort_option, best_sol, start_time, folder, agent=None, cur_step=0, cur_seq=[], visited_graphs=set(), visited_extrusions=set(), data_mgr=DataManager()): 34 | 35 | if zone_graph.is_done(): 36 | return True 37 | 38 | if cur_step == max_step : 39 | return False 40 | 41 | if (time.time() - start_time) > max_time: 42 | return True 43 | 44 | next_extrusions = get_proposals(zone_graph) 45 | if len(next_extrusions) == 0: 46 | return False 47 | 48 | if sort_option == 'agent': 49 | next_extrusions = sort_extrusions_by_agent(next_extrusions, zone_graph, agent)[0: min(len(next_extrusions), expand_width)] 50 | else: 51 | if sort_option == 'heur': 52 | next_extrusions = sort_extrusions_by_heur(next_extrusions, zone_graph)[0: min(len(next_extrusions), expand_width)] 53 | else: 54 | if sort_option == 'random': 55 | next_extrusions = sort_extrusions_by_random(next_extrusions)[0: min(len(next_extrusions), expand_width)] 56 | else: 57 | print('invalid sort option') 58 | return True 59 | 60 | for next_extrusion in next_extrusions: 61 | next_zone_graph = zone_graph.update_to_next_zone_graph(next_extrusion) 62 | graph_key = tuple(sorted(next_zone_graph.get_current_zone_indices())) 63 | if graph_key in visited_graphs: 64 | continue 65 | else: 66 | extrusion_key = tuple(sorted(next_extrusion.zone_indices)) 67 | 68 | overshadow = False 69 | for visited_key in 
visited_extrusions: 70 | #if set(visited_key).issubset(set(extrusion_key)) or set(visited_key) == set(extrusion_key): 71 | if set(visited_key) == set(extrusion_key): 72 | overshadow = True 73 | break 74 | 75 | # print('next_extrusion', next_extrusion.zone_indices) 76 | if overshadow: 77 | continue 78 | else: 79 | 80 | visited_graphs.add(graph_key) 81 | 82 | visited_extrusions.add(extrusion_key) 83 | cur_seq.append((copy.deepcopy(next_zone_graph), copy.deepcopy(next_extrusion))) 84 | cur_score = next_zone_graph.get_IOU_score() 85 | 86 | if cur_score > best_sol.best_score: 87 | print('update best', len(cur_seq), cur_score, best_sol.best_score) 88 | best_sol.best_score = cur_score 89 | best_sol.best_seq = copy.deepcopy(cur_seq) 90 | best_sol.best_time = time.time() - start_time 91 | 92 | if not os.path.exists(folder): 93 | os.makedirs(folder) 94 | write_val_to_file(best_sol.best_time, os.path.join(folder, 'time.txt')) 95 | write_val_to_file(best_sol.best_score, os.path.join(folder, 'IOU.txt')) 96 | data_mgr.save_sequence(best_sol.best_seq, folder) 97 | 98 | if cur_step % 2 == 0: 99 | expand_width -= 1 100 | to_exit = dfs_best_recon(next_zone_graph, max_step, max_time, max(1, expand_width), sort_option, best_sol, start_time, folder, agent, cur_step+1, cur_seq, visited_graphs, visited_extrusions, data_mgr) 101 | cur_seq.pop(-1) 102 | 103 | visited_extrusions.remove(extrusion_key) 104 | visited_graphs.remove(graph_key) 105 | 106 | if to_exit: 107 | return True 108 | 109 | return False 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | -------------------------------------------------------------------------------- /src/setup.py: -------------------------------------------------------------------------------- 1 | """ 2 | Setup functions for the running environment 3 | 4 | """ 5 | 6 | FREECAD_LIB_PATH = "/home/xxh/Projects/freecad-build/lib" 7 | 8 | import sys 9 | sys.path.append(FREECAD_LIB_PATH) 10 | 11 | -------------------------------------------------------------------------------- /src/train.py: -------------------------------------------------------------------------------- 1 | 2 | import sys 3 | sys.path.append('..') 4 | 5 | import os 6 | import argparse 7 | import numpy as np 8 | from dataset import * 9 | from objects import * 10 | from dgl.data.utils import save_graphs 11 | from dgl.data.utils import load_graphs 12 | from evaluation import * 13 | from agent import * 14 | from train_preprocess import * 15 | import hyperparameters as hp 16 | import copy 17 | import joblib 18 | 19 | def train_batch(gs, ls, agent): 20 | loss = agent.update_by_extrusion(ls, gs) 21 | return loss 22 | 23 | def train(data_path, folder): 24 | 25 | if os.path.exists(folder) is True: 26 | shutil.rmtree(folder) 27 | if not os.path.exists(folder): 28 | os.makedirs(folder) 29 | 30 | agent = Agent(folder) 31 | 32 | train_loss_list = [] 33 | validation_loss_list = [] 34 | min_validation_loss = np.inf 35 | 36 | gs = [] 37 | ls = [] 38 | 39 | for epoch_index in range(hp.train_epoch_num): 40 | total_loss = 0 41 | print('epoch', epoch_index , '--------------------------------------------') 42 | train_ids = read_file_to_list('train_ids.txt') 43 | for seq_index, seq_id in enumerate(train_ids): 44 | print('seq index', seq_index, 'seq id', seq_id) 45 | try: 46 | gt_seq = data_mgr.load_processed_sequence(os.path.join(data_path, seq_id, 'gt')) 47 | except: 48 | continue 49 | 50 | for step_index, gt_step in enumerate(gt_seq): 51 | try: 52 | pos_g = joblib.load(os.path.join(data_path, seq_id, 'train', str(step_index) + '_' 
+ str(1) + '_g.joblib')) 53 | pos_e = joblib.load(os.path.join(data_path, seq_id, 'train', str(step_index) + '_' + str(1) + '_e.joblib')) 54 | neg_g = joblib.load(os.path.join(data_path, seq_id, 'train', str(step_index) + '_' + str(0) + '_g.joblib')) 55 | neg_e = joblib.load(os.path.join(data_path, seq_id, 'train', str(step_index) + '_' + str(0) + '_e.joblib')) 56 | except: 57 | break 58 | 59 | if pos_g and pos_e and neg_g and neg_e: 60 | pos_g.encode_with_extrusion(pos_e) 61 | gs.append(pos_g) 62 | ls.append(to_tensor([1])) 63 | 64 | neg_g.encode_with_extrusion(neg_e) 65 | gs.append(neg_g) 66 | ls.append(to_tensor([0])) 67 | 68 | if len(gs) >= hp.batch_size: 69 | loss = train_batch(gs, ls, agent) 70 | gs = [] 71 | ls = [] 72 | train_loss_list.append(loss) 73 | write_list_to_file(os.path.join(folder, 'trainloss.txt'), train_loss_list) 74 | agent.save_weights() 75 | 76 | # validate after each training epoch 77 | agent.save_weights() 78 | validation_loss = validate(data_path, folder) 79 | validation_loss_list.append(validation_loss) 80 | write_list_to_file(os.path.join(folder, 'validationloss.txt'), validation_loss_list) 81 | if validation_loss <= min_validation_loss: 82 | min_validation_loss = validation_loss 83 | agent.save_best_weights() 84 | 85 | def validate(data_path, folder): 86 | 87 | print('validation----------------------------------------') 88 | 89 | agent = Agent(folder) 90 | agent.load_weights() 91 | 92 | data_mgr = DataManager() 93 | 94 | if os.path.isfile(os.path.join('gt_step_to_extrusions.joblib')): 95 | step_to_extrusions = joblib.load(os.path.join('gt_step_to_extrusions.joblib')) 96 | else: 97 | step_to_extrusions = defaultdict(list) 98 | 99 | total_rank_sum = 0 100 | 101 | validate_ids = read_file_to_list('validate_ids.txt') 102 | for validation_index, seq_id in enumerate(validate_ids): 103 | print('validation_index', validation_index, 'seq_id', seq_id) 104 | 105 | try: 106 | gt_seq = data_mgr.load_processed_sequence(os.path.join(data_path, seq_id, 'gt')) 107 | except: 108 | continue 109 | 110 | for step_index, gt_step in enumerate(gt_seq): 111 | try: 112 | pos_g = joblib.load(os.path.join(data_path, seq_id, 'train', str(step_index) + '_' + str(1) + '_g.joblib')) 113 | pos_e = joblib.load(os.path.join(data_path, seq_id, 'train', str(step_index) + '_' + str(1) + '_e.joblib')) 114 | neg_g = joblib.load(os.path.join(data_path, seq_id, 'train', str(step_index) + '_' + str(0) + '_g.joblib')) 115 | neg_e = joblib.load(os.path.join(data_path, seq_id, 'train', str(step_index) + '_' + str(0) + '_e.joblib')) 116 | except: 117 | break 118 | 119 | if pos_g and pos_e and neg_g and neg_e: 120 | gt_zone_graph = gt_step[0] 121 | gt_extrusion = gt_step[1] 122 | extrusions = get_proposals(gt_zone_graph) 123 | agent_ranked_extrusions = sort_extrusions_by_agent(extrusions, gt_zone_graph, agent) 124 | for i, extrusion in enumerate(agent_ranked_extrusions): 125 | if gt_extrusion.hash() == extrusion.hash(): 126 | total_rank_sum += i 127 | break 128 | 129 | return total_rank_sum 130 | 131 | if __name__ == "__main__": 132 | parser = argparse.ArgumentParser(description='ZoneGraph') 133 | parser.add_argument('--data_path', default='processed_data', type=str) 134 | parser.add_argument('--output_path', default='train_output', type=str) 135 | args = parser.parse_args() 136 | 137 | train(args.data_path, args.output_path) 138 | -------------------------------------------------------------------------------- /src/train_preprocess.py: 
-------------------------------------------------------------------------------- 1 | import sys 2 | sys.path.append('..') 3 | 4 | import numpy as np 5 | import os 6 | 7 | import argparse 8 | import numpy as np 9 | from dataset import * 10 | from objects import * 11 | from proposal import * 12 | import multiprocessing 13 | import shutil 14 | import matplotlib.pyplot as plt 15 | import copy 16 | import joblib 17 | 18 | def hit_target_in_path(zone_graph, depth, max_depth): 19 | if zone_graph.is_done(): 20 | return True 21 | 22 | if depth == max_depth: 23 | return False 24 | 25 | next_extrusions = get_proposals(zone_graph) 26 | random.shuffle(next_extrusions) 27 | next_zone_graph = zone_graph.update_to_next_zone_graph(next_extrusions[0]) 28 | ret = hit_target_in_path(next_zone_graph, depth+1, max_depth) 29 | 30 | return ret 31 | 32 | def generate_neg_steps(pos_steps): 33 | neg_steps = [] 34 | for i in range(len(pos_steps)): 35 | time_limit = 100 36 | start_time = time.time() 37 | target_depth = len(pos_steps) - i 38 | sample_number = 10 * target_depth 39 | base_zone_graph = pos_steps[i][0] 40 | extrusions = get_proposals(base_zone_graph) 41 | random.shuffle(extrusions) 42 | 43 | min_hit_ratio = 0.999 44 | best_neg_extrusion = None 45 | 46 | print('candidate neg extrusions:', len(extrusions)) 47 | 48 | for extrusion in extrusions: 49 | #display_extrusion(extrusion, base_zone_graph, show=True) 50 | next_zone_graph = base_zone_graph.update_to_next_zone_graph(extrusion) 51 | hit_count = 0 52 | for sample_index in range(0, sample_number): 53 | ret = hit_target_in_path(next_zone_graph, 0, target_depth) 54 | if ret: 55 | hit_count += 1 56 | hit_ratio = hit_count/sample_number 57 | print('hit ratio', hit_ratio) 58 | if hit_ratio <= min_hit_ratio and hit_ratio <= 0.2 and extrusion.hash() != pos_steps[i][1].hash(): 59 | min_hit_ratio = hit_ratio 60 | best_neg_extrusion = extrusion 61 | if abs(hit_ratio) < 0.0000001: 62 | break 63 | cur_time = time.time() 64 | elapsed_time = cur_time - start_time 65 | print('elapsed_time', elapsed_time) 66 | if elapsed_time > time_limit: 67 | break 68 | if best_neg_extrusion: 69 | print('neg extrusion found ') 70 | neg_steps.append((copy.deepcopy(base_zone_graph), copy.deepcopy(best_neg_extrusion))) 71 | 72 | return neg_steps 73 | 74 | def process_single_data(seq_id, raw_data_path, processed_data_path): 75 | data_mgr = DataManager() 76 | 77 | print('precessing episode:', seq_id) 78 | 79 | sequence_length = len(list(Path(os.path.join(raw_data_path, seq_id)).glob('*'))) 80 | print('sequence_length', sequence_length) 81 | 82 | gt_seq = [] 83 | #try: 84 | gt_seq, error_type = data_mgr.load_raw_sequence(os.path.join(raw_data_path, seq_id), 0, sequence_length) 85 | if len(gt_seq) == 0: 86 | return 87 | #except: 88 | #return 89 | 90 | print('start simulation--------------------------------------------------------------------') 91 | ret = data_mgr.simulate_sequence(gt_seq) 92 | print('simulation done------------------------------------------------------------------------------------', ret) 93 | if not ret: 94 | return 95 | 96 | seq_gt_folder = os.path.join(processed_data_path, seq_id, 'gt') 97 | if not os.path.exists(seq_gt_folder): 98 | os.makedirs(seq_gt_folder) 99 | 100 | for i in range(len(gt_seq)): 101 | gt_step = gt_seq[i] 102 | joblib.dump(gt_step[0], os.path.join(seq_gt_folder, str(i) + '_g.joblib')) 103 | joblib.dump(gt_step[1], os.path.join(seq_gt_folder, str(i) + '_e.joblib')) 104 | 105 | seq_train_folder = os.path.join(processed_data_path, seq_id, 'train') 106 | if 
not os.path.exists(seq_train_folder): 107 | os.makedirs(seq_train_folder) 108 | 109 | step_index = 0 110 | k = 100 111 | start_index = 0 112 | while start_index < sequence_length: 113 | end_index = min(start_index + k, sequence_length) 114 | print('start_index', start_index, 'end_index', end_index) 115 | 116 | if sequence_length <= k: 117 | pos_seq = gt_seq 118 | else: 119 | pos_seq = [] 120 | try: 121 | pos_seq, error_type = data_mgr.load_raw_sequence(os.path.join(raw_data_path, seq_id), start_index, end_index) 122 | if len(pos_seq) == 0: 123 | break 124 | except: 125 | break 126 | 127 | start_index += k 128 | neg_steps = generate_neg_steps(pos_seq) 129 | 130 | for i in range(len(neg_steps)): 131 | pos_step = pos_seq[i] 132 | #display_zone_graph(pos_step[0], file=os.path.join(seq_train_folder, 'step' + str(step_index) + '_' + str('pos') + '_canvas.png'), show=False) 133 | #display_extrusion(pos_step[1], pos_step[0], file=os.path.join(seq_train_folder, 'step' + str(step_index) + '_' + str('pos') + '_extrusion.png'), show=False) 134 | joblib.dump(pos_step[0], os.path.join(seq_train_folder, str(step_index) + '_' + str(1) + '_g.joblib')) 135 | joblib.dump(pos_step[1], os.path.join(seq_train_folder, str(step_index) + '_' + str(1) + '_e.joblib')) 136 | 137 | neg_step = neg_steps[i] 138 | #display_zone_graph(neg_step[0], file=os.path.join(seq_train_folder, 'step' + str(step_index) + '_' + str('neg') + 'canvas.png'), show=False) 139 | #display_extrusion(neg_step[1], neg_step[0], file=os.path.join(seq_train_folder, 'step' + str(step_index) + '_' + str('neg') + 'extrusion.png'), show=False) 140 | joblib.dump(neg_step[0], os.path.join(seq_train_folder, str(step_index) + '_' + str(0) + '_g.joblib')) 141 | joblib.dump(neg_step[1], os.path.join(seq_train_folder, str(step_index) + '_' + str(0) + '_e.joblib')) 142 | 143 | step_index += 1 144 | 145 | print('single data processing complete !') 146 | 147 | def process(raw_data_path, processed_data_path): 148 | 149 | if not os.path.exists(processed_data_path): 150 | os.makedirs(processed_data_path) 151 | 152 | seq_ids = os.listdir(raw_data_path) 153 | for seq_id in seq_ids: 154 | sequence_length = len(list(Path(os.path.join(raw_data_path, seq_id)).glob('*'))) 155 | print('sequence_length', sequence_length) 156 | 157 | worker_process = multiprocessing.Process(target=process_single_data, name="process_single_data", args=(seq_id, raw_data_path, processed_data_path)) 158 | worker_process.start() 159 | worker_process.join(200 + sequence_length * 100) 160 | 161 | if worker_process.is_alive(): 162 | print ("process_single_data is running... 
let's kill it...") 163 | worker_process.terminate() 164 | worker_process.join() 165 | 166 | print('all data processing complete !') 167 | 168 | def split_data_for_training(dataset_path): 169 | train_ids = [] 170 | validate_ids = [] 171 | test_ids = [] 172 | all_ids = os.listdir(dataset_path) 173 | random.shuffle(all_ids) 174 | length = len(all_ids) 175 | train_ids = all_ids[0: int(0.85 * length)] 176 | validate_ids = all_ids[int(0.85 * length) + 1: int(0.9 * length)] 177 | test_ids = all_ids[int(0.9 * length) + 1: int(1.0 * length)] 178 | 179 | write_list_to_file('train_ids.txt', train_ids) 180 | write_list_to_file('test_ids.txt', test_ids) 181 | write_list_to_file('validate_ids.txt', validate_ids) 182 | 183 | if __name__ == "__main__": 184 | 185 | parser = argparse.ArgumentParser(description='train_preprocess') 186 | parser.add_argument('--data_path', default='../data/fusion_processed', type=str) 187 | parser.add_argument('--output_path', default='processed_data', type=str) 188 | args = parser.parse_args() 189 | 190 | split_data_for_training(args.data_path) 191 | process(args.data_path, args.output_path) 192 | 193 | 194 | -------------------------------------------------------------------------------- /src/utils/combination_utils.py: -------------------------------------------------------------------------------- 1 | import itertools 2 | 3 | def get_fixed_size_combinations(item_list, count): 4 | return itertools.combinations(item_list, count) 5 | 6 | def get_combinations(item_list): 7 | length = len(item_list) 8 | combs = [] 9 | 10 | for i in range(length): 11 | count = i+1 12 | combs.extend(list(itertools.combinations(item_list, count))) 13 | 14 | combs = [list(c) for c in combs] 15 | return combs 16 | 17 | def get_permutations(value): 18 | numbers = range(value) 19 | permutations = list(itertools.permutations(numbers)) 20 | print('permutations', permutations) 21 | return permutations 22 | 23 | def get_combination_with_contain_list(item_list, contain_list): 24 | """ 25 | find all combinations that guarantees each resulting item 26 | contains at least one item in the contain_list 27 | """ 28 | 29 | non_contain_list = [item for item in item_list if item not in contain_list] 30 | 31 | contain_combs = get_combinations(contain_list) 32 | non_contain_combs = get_combinations(non_contain_list) 33 | 34 | result = [] 35 | for c1 in contain_combs: 36 | for c2 in non_contain_combs: 37 | result.append(c1 + c2) 38 | 39 | result = contain_combs + result 40 | 41 | print('result', result) 42 | return result 43 | 44 | def is_subset_of_item(combination_list, combination): 45 | for comb in combination_list: 46 | if set(comb).issubset(set(combination)): 47 | return True 48 | return False 49 | 50 | def is_subset(list1, list2): 51 | return set(list1).issubset(set(list2)) 52 | 53 | def has_intersection(list1, list2): 54 | return set(list1).intersection(set(list2)) 55 | 56 | def list_has_item(list, item, equality_function): 57 | for d in list: 58 | if equality_function(d, item): 59 | return True 60 | return False 61 | 62 | def list_has_list(list, itemlist): 63 | for d in list: 64 | if set(d) == set(itemlist): 65 | return True 66 | return False 67 | 68 | def list_has_superlist(list, itemlist): 69 | for d in list: 70 | if set(d).issuperset(set(itemlist)): 71 | return d 72 | return False 73 | 74 | def list_add_deduplicate(list, item, equality_function): 75 | if not list_has_item(list, item, equality_function): 76 | list.append(item) 77 | 78 | def list_extend_deduplicate(list1, list2, equality_function): 79 | for 
item in list2: 80 | if not list_has_item(list1, item, equality_function): 81 | list1.append(item) 82 | 83 | def list_equal_list(list1, list2): 84 | return set(list1) == set(list2) 85 | 86 | def hash_list(list): 87 | return hash(str((set(list)))) 88 | 89 | def list_remove_item(list, item, equality_function, multiple = False): 90 | for d in list: 91 | if equality_function(d, item): 92 | list.remove(d) 93 | if multiple: 94 | break 95 | 96 | def list_difference(list1, list2): 97 | diff = 0 98 | for item in list1: 99 | if item not in list2: 100 | diff += 1 101 | 102 | for item in list2: 103 | if item not in list1: 104 | diff += 1 105 | 106 | return diff 107 | 108 | def items_all(list, equality_function): 109 | for item in list: 110 | if not equality_function(item): 111 | return False 112 | return True -------------------------------------------------------------------------------- /src/utils/edge_utils.py: -------------------------------------------------------------------------------- 1 | """ 2 | Utility function for edges in 3D 3 | """ 4 | 5 | import sys 6 | sys.path.append('..') 7 | 8 | from setup import * 9 | import FreeCAD 10 | import Part 11 | from FreeCAD import Base 12 | import random 13 | import copy 14 | import math 15 | 16 | from utils.vertex_utils import * 17 | 18 | default_eps = 10e-5 19 | 20 | def hash_edge(edge, reverse = False): 21 | parametric_points = edge_sample_points(edge, amount = 50, random_sample = False) 22 | point_keys = [hash_point(p) for p in parametric_points] 23 | sorted_point_keys = sorted(point_keys) 24 | edge_key = tuple(sorted_point_keys) 25 | edge_key = hash(edge_key) 26 | return edge_key 27 | 28 | def create_segment_edge(pt1, pt2): 29 | return Part.makeLine(pt1, pt2) 30 | 31 | def edge_sample_points(edge, amount = 1, random_sample = True): 32 | edge_length = edge.Length 33 | if random_sample: 34 | params = [random.random() * edge_length for i in range(amount)] 35 | else: 36 | #params = [i/(amount-1) * edge_length for i in range(amount-1)] + [ edge_length ] 37 | 38 | lengths = [i / (amount-1) * edge_length for i in range(amount)] 39 | params = [edge.getParameterByLength(l) for l in lengths] 40 | #print('params', params) 41 | return [edge.valueAt(param) for param in params] 42 | 43 | def edge_touch_condition(edge1, edge2, eps = default_eps): 44 | common_e1e2 = edge1.common(edge2) 45 | if common_e1e2.Length < eps: 46 | # not touching 47 | return 0 48 | else: 49 | 50 | common_length = common_e1e2.Length 51 | length1 = edge1.Length 52 | length2 = edge2.Length 53 | 54 | # equal 55 | if abs(common_length - length1) < eps \ 56 | and abs(common_length - length2) < eps: 57 | return 1 58 | 59 | # face1 contain face2 60 | if abs(common_length - length1) >= eps \ 61 | and abs(common_length - length2) < eps: 62 | return 2 63 | 64 | # face2 contain face1 65 | if abs(common_length - length1) < eps \ 66 | and abs(common_length - length2) >= eps: 67 | return 3 68 | 69 | # overlap but no contain relationship 70 | else: 71 | return 4 72 | 73 | def edge_is_straight(edge, eps = default_eps): 74 | p1 = edge.Vertexes[0].Point 75 | p2 = edge.Vertexes[-1].Point 76 | 77 | return abs(copy.deepcopy(p1).sub(p2).Length - edge.Length) < eps * edge.Length 78 | 79 | def edge_direction(edge, common = False): 80 | p1 = edge.Vertexes[0].Point 81 | p2 = edge.Vertexes[-1].Point 82 | 83 | v = (copy.deepcopy(p2).sub(p1)) 84 | 85 | if v.Length > 0: 86 | v.normalize() 87 | 88 | if common: 89 | return (v.x, v.y, v.z) 90 | 91 | else: 92 | return v 93 | 94 | def edge_parallel(edge1, edge2, eps = 
default_eps): 95 | 96 | if len(edge1.Vertexes) < 2 or len(edge2.Vertexes) < 2: 97 | return False 98 | 99 | if not edge_is_straight(edge1) or not edge_is_straight(edge2): 100 | return False 101 | 102 | v1 = edge_direction(edge1) 103 | v2 = edge_direction(edge2) 104 | 105 | ang = v1.getAngle(v2) 106 | 107 | if ang < eps or abs(ang - math.pi) < eps: 108 | return True 109 | else: 110 | return False 111 | -------------------------------------------------------------------------------- /src/utils/face_utils.py: -------------------------------------------------------------------------------- 1 | """ 2 | Utility function for faces in 3D 3 | """ 4 | 5 | import sys 6 | sys.path.append('..') 7 | 8 | from setup import * 9 | import FreeCAD 10 | import Part 11 | from FreeCAD import Base 12 | 13 | import math 14 | import random as random 15 | import numpy as np 16 | 17 | import copy 18 | from collections import defaultdict 19 | 20 | from utils.vector_utils import * 21 | from utils.vertex_utils import * 22 | from utils.edge_utils import * 23 | from utils.combination_utils import * 24 | 25 | default_eps = 10e-5 26 | 27 | def face_is_planar(face, eps = default_eps): 28 | 29 | c1,c2 = face.curvatureAt(0.5,0.5) 30 | 31 | if abs(c1) + abs(c2) < default_eps: 32 | return abs(face.Volume) < eps * face.Area 33 | else: 34 | return False 35 | 36 | def hash_face(face): 37 | edge_keys = [] 38 | for edge in face.Edges: 39 | edge_key = hash_edge(edge) 40 | edge_keys.append(edge_key) 41 | sorted_edge_keys = sorted(edge_keys) 42 | face_key = tuple(edge_key for edge_key in sorted_edge_keys) 43 | face_key = hash(face_key) 44 | return face_key 45 | 46 | def face_equal(face1, face2, eps = default_eps): 47 | """ 48 | face1 and face2 geometrically equal 49 | """ 50 | 51 | # shortcut 52 | if abs(face1.Area - face2.Area) > eps: 53 | return False 54 | 55 | common_f1f2 = face1.common(face2) 56 | 57 | if abs(common_f1f2.Area - face1.Area) < eps \ 58 | and abs(common_f1f2.Area - face2.Area) < eps: 59 | return True 60 | else: 61 | return False 62 | 63 | def pts_on_face(face, point_list, eps = default_eps): 64 | """ 65 | all points in list are on face 66 | """ 67 | for pt in point_list: 68 | if not face.isInside(pt, eps, True): 69 | return False 70 | return True 71 | 72 | def face_touch_condition(face1, face2, eps = default_eps): 73 | common_f1f2 = face1.common(face2) 74 | if common_f1f2.Area < eps: 75 | # not touching 76 | return 0 77 | else: 78 | 79 | common_area = common_f1f2.Area 80 | area1 = face1.Area 81 | area2 = face2.Area 82 | 83 | # equal 84 | if abs(common_area - area1) < eps \ 85 | and abs(common_area - area2) < eps: 86 | return 1 87 | 88 | # face1 contain face2 89 | if abs(common_area - area1) >= eps \ 90 | and abs(common_area - area2) < eps: 91 | return 2 92 | 93 | # face2 contain face1 94 | if abs(common_area - area1) < eps \ 95 | and abs(common_area - area2) >= eps: 96 | return 3 97 | 98 | # overlap but no contain relationship 99 | else: 100 | return 4 101 | 102 | def face_parallel(face1, face2, eps = default_eps): 103 | """ 104 | face1 and face2 are planar faces 105 | """ 106 | if not face_is_planar(face1) or not face_is_planar(face2): 107 | return False 108 | 109 | dir1 = face1.normalAt(0.5,0.5) 110 | dir2 = face2.normalAt(0.5,0.5) 111 | ang = dir1.getAngle(dir2) 112 | 113 | if abs(ang) eps: 129 | return False 130 | else: 131 | return True 132 | 133 | def face_share_extension_plane(face1, face2, eps = 10e-5): 134 | 135 | if not face_parallel(face1, face2, eps): 136 | return False 137 | 138 | p1 = face1.CenterOfMass 
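    # Two parallel planar faces share an extension plane exactly when the signed
    # distance between their supporting planes (distance_of_planes below, measured
    # along face1's normal) is within eps of zero.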
139 | p2 = face2.CenterOfMass 140 | 141 | n1 = face1.normalAt(0.5,0.5) 142 | 143 | if abs(distance_of_planes(p1,p2,n1)) < eps: 144 | return True 145 | 146 | return False 147 | 148 | def distance_of_planes(point1, point2, normal): 149 | """ 150 | find the distance between two parallel planes, signed by normal direction. 151 | normal is the normal vector of plane1. 152 | point1 and point2 are points on plane1 and plane2 153 | """ 154 | A = normal.x 155 | B = normal.y 156 | C = normal.z 157 | x1 = point1.x 158 | y1 = point1.y 159 | z1 = point1.z 160 | x2 = point2.x 161 | y2 = point2.y 162 | z2 = point2.z 163 | 164 | D1 = - A * x1 - B * y1 - C * z1 165 | D2 = - A * x2 - B * y2 - C * z2 166 | 167 | d = abs(D2 - D1)/math.sqrt(A * A + B * B + C * C) 168 | 169 | # apply sign based on normal direction 170 | dir = point2.sub(point1) 171 | ang = abs(normal.getAngle(dir)) 172 | if ang > math.pi/2: 173 | d = -d 174 | 175 | return d 176 | 177 | def selective_extend_face(face_id, face_loops, all_faces, scale = 300): 178 | """ 179 | if face not in face_loops, extend it using extend_face function. 180 | otherwise, only extend it along loop direction 181 | """ 182 | 183 | face = all_faces[face_id].copy() 184 | 185 | if not face_is_planar(face): 186 | return extend_face(face, scale) 187 | 188 | 189 | inloop = False 190 | scaled_dirs = [] 191 | 192 | for lst, direction in face_loops: 193 | if face_id in lst: 194 | hashed_dir = vector_dir_hashed(direction) 195 | if hashed_dir in scaled_dirs: 196 | continue 197 | else: 198 | inloop = True 199 | scaled_dirs.append(hashed_dir) 200 | face = extend_face_along_direction(face, direction, scale) 201 | 202 | if inloop: 203 | return face 204 | else: 205 | return extend_face(face, scale) 206 | 207 | 208 | def extend_face(face_, scale_ = 300): 209 | """ 210 | extend face by scale 211 | will scale up using only the outer wire of the face. 
212 | """ 213 | face = face_.copy() 214 | scale = face.Length * scale_ 215 | 216 | # is not flat face 217 | if not face_is_planar(face): 218 | 219 | sph_data = is_spherical_face(face) 220 | # return full sphere 221 | if sph_data: 222 | c,r = sph_data 223 | # print(c,r) 224 | return Part.makeSphere(abs(r), c) 225 | 226 | cyl_data = is_cylindrical_face(face) 227 | 228 | # return extended cylinder 229 | if cyl_data: 230 | d,f = cyl_data 231 | extrude_v = copy.copy(d).multiply(scale) 232 | translate_v = copy.copy(d).multiply(scale/2 * -1) 233 | 234 | f.translate(translate_v) 235 | 236 | c1 = f.extrude(extrude_v) 237 | 238 | combined = c1 239 | faces = combined.Faces 240 | # areas = [f.Area for f in faces] 241 | # index = areas.index(max(areas)) 242 | 243 | return faces[0] 244 | # return original face 245 | face.scale(1 + 10e-4, face.CenterOfMass) 246 | 247 | return face 248 | else: 249 | circle = Part.makeCircle(scale, face.CenterOfMass, face.normalAt(0.5, 0.5)) 250 | 251 | wire = Part.Wire(circle) 252 | f = Part.Face(wire) 253 | return f 254 | 255 | def is_cylindrical_face(face, sample_density = 4): 256 | """ 257 | the given face is cylindrical 258 | if is cylindrical face, return its extend direction and one extrude end face 259 | else return False 260 | """ 261 | 262 | c1_ = None 263 | c2_ = None 264 | 265 | if face_is_planar(face): 266 | return False 267 | 268 | for u in range(sample_density): 269 | for v in range(sample_density): 270 | u = u/(sample_density-1) 271 | v = v/(sample_density-1) 272 | c1,c2 = face.curvatureAt(u,v) 273 | 274 | if c1_ is None: 275 | c1_ = c1 276 | if c2_ is None: 277 | c2_ = c2 278 | 279 | if abs(c1_ - c1) > default_eps\ 280 | or abs(c2_ - c2) > default_eps\ 281 | or abs(c1 * c2) > default_eps\ 282 | or (abs(c1) < default_eps and abs(c2) < default_eps): 283 | return False 284 | 285 | tan = face.tangentAt(0.5,0.5) 286 | 287 | if abs(c1) < default_eps: 288 | extrude_dir = tan[0] 289 | else: 290 | extrude_dir = tan[1] 291 | 292 | 293 | for edge in face.Edges: 294 | 295 | cutting_plane = Part.makeCircle(face.Length * 100, edge.CenterOfMass, extrude_dir) 296 | cutting_plane = Part.Wire(cutting_plane) 297 | cutting_plane = Part.Face(cutting_plane) 298 | 299 | cutting_intersection = face.section(cutting_plane) 300 | 301 | cutting_edge = cutting_intersection.Edges[0] 302 | 303 | if cutting_edge.Length > face.Length * default_eps: 304 | if not edge_is_straight(cutting_edge): 305 | center = cutting_edge.centerOfCurvatureAt(0.5 * cutting_edge.Length) 306 | r = 1/ (max(abs(c1_), abs(c2_))) 307 | # r = r * (1 + 10e-4) 308 | 309 | return extrude_dir,Part.makeCircle(r, center, extrude_dir) 310 | 311 | else: 312 | return extrude_dir,cutting_edge 313 | 314 | def is_spherical_face(face, sample_density = 4): 315 | """ 316 | the given face is spherical or part of spherical face 317 | if is spherical face, return its center location and radius 318 | else return False 319 | """ 320 | 321 | c1_ = None 322 | c2_ = None 323 | 324 | for u in range(sample_density): 325 | for v in range(sample_density): 326 | u = u/(sample_density-1) 327 | v = v/(sample_density-1) 328 | c1,c2 = face.curvatureAt(u,v) 329 | 330 | if c1_ is None: 331 | c1_ = c1 332 | if c2_ is None: 333 | c2_ = c2 334 | 335 | if abs(c1_ - c1) > default_eps\ 336 | or abs(c2_ - c2) > default_eps\ 337 | or abs(c1 - c2) > default_eps\ 338 | or abs(c1) < default_eps: 339 | return False 340 | 341 | r = 1/c1 342 | 343 | normal = face.normalAt(0.5,0.5) 344 | direction = copy.copy(normal).multiply(r) 345 | pt = face.valueAt(0.5,0.5) 
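    # The sphere center lies one signed radius along the surface normal from the
    # sampled point, i.e. center = pt + r * normal, with r = 1/c1 from above.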
346 | center = copy.copy(pt).add(direction) 347 | 348 | return center,r 349 | 350 | 351 | def extend_face_along_direction(face, extrude_dir, scale_=300): 352 | 353 | scale = scale_ * face.Length 354 | # only apply directional extend for planar faces 355 | if not face_is_planar(face): 356 | return extend_face(face, scale_) 357 | 358 | cutting_plane = Part.makeCircle(face.Length * 10, face.CenterOfMass, extrude_dir) 359 | cutting_plane = Part.Wire(cutting_plane) 360 | cutting_plane = Part.Face(cutting_plane) 361 | 362 | projected_vs = [v.Point.projectToPlane(face.CenterOfMass, extrude_dir) for v in face.Vertexes] 363 | 364 | v1 = projected_vs[0] 365 | 366 | 367 | dists = [copy.copy(v).sub(v1).Length for v in projected_vs] 368 | v2 = None 369 | 370 | for i,dist in enumerate(dists): 371 | if dist > 10e-6: 372 | v2 = projected_vs[i] 373 | break 374 | direction = copy.copy(v2).sub(v1) 375 | direction.normalize() 376 | 377 | dists_signed = [] 378 | for i,d in enumerate(dists): 379 | if d < 10e-6: 380 | dists_signed.append(0) 381 | else: 382 | dists_signed.append(d * copy.copy(projected_vs[i]).sub(v1).normalize().dot(direction)) 383 | 384 | min_pos = dists_signed.index(min(dists_signed)) 385 | max_pos = dists_signed.index(max(dists_signed)) 386 | 387 | end1 = projected_vs[min_pos] 388 | end2 = projected_vs[max_pos] 389 | 390 | projected_face = create_segment_edge(end1, end2) 391 | 392 | extrude_v = copy.copy(extrude_dir).multiply(scale) 393 | translate_v = copy.copy(extrude_dir).multiply(scale/2 * -1) 394 | 395 | projected_face.translate(translate_v) 396 | extended = projected_face.extrude(extrude_v) 397 | 398 | out = extended.Faces[0] 399 | out.scale(1 + 10e-2, out.CenterOfMass) 400 | 401 | return out 402 | 403 | def face_get_parallel_edge_pairs(face): 404 | 405 | edge_pairs = get_fixed_size_combinations(range(len(face.Edges)), 2) 406 | 407 | validate_pairs = [] 408 | 409 | for pair in edge_pairs: 410 | edge1 = face.Edges[pair[0]] 411 | edge2 = face.Edges[pair[1]] 412 | 413 | if edge_parallel(edge1, edge2): 414 | validate_pairs.append((edge1, edge2)) 415 | # validate_pairs.append((pair[1], pair[0])) 416 | 417 | return validate_pairs 418 | 419 | 420 | def get_neighboring_face_by_edge(face_id, edge_index, edge_to_faces): 421 | 422 | faces_on_edge = edge_to_faces[edge_index] 423 | 424 | if face_id in faces_on_edge: 425 | for f_id in faces_on_edge: 426 | if f_id != face_id: 427 | return f_id 428 | return None 429 | 430 | def get_parallel_edges(face, edge): 431 | found = [] 432 | key = hash_edge(edge) 433 | for e in face.Edges: 434 | temp_key = hash_edge(e) 435 | if key != temp_key: 436 | if edge_parallel(edge, e): 437 | found.append(e) 438 | 439 | return found 440 | 441 | def face_is_frame(face, threshold_ratio = 0.8, threshold_count = 4): 442 | """ 443 | the input face: 444 | - is planar 445 | - has and only has 1 inner loop 446 | - inner loop length/outer loop length is larger than threshold 447 | """ 448 | 449 | if not face_is_planar(face): 450 | return False 451 | 452 | wires = face.Wires 453 | 454 | if len(wires[0].Edges) > threshold_count: 455 | return True 456 | 457 | # curves = [] 458 | for edge in face.Edges: 459 | if not edge_is_straight(edge): 460 | # curves.append(edge) 461 | return True 462 | 463 | # if len(curves) == 1 or len(curves) == 3 or len(curves) == 4: 464 | # return True 465 | 466 | # if len(curves) == 2: 467 | # if abs(curves[0].Length - curves[1].Length) > default_eps * curves[1].Length: 468 | # return True 469 | 470 | # if 471 | 472 | if len(wires) > 1: 473 | outer = wires[0] 
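        # wires[0] is the outer boundary and wires[1] the inner loop (hole); the face
        # counts as a frame when the inner loop is nearly as long as the outer one,
        # i.e. inner.Length / outer.Length > threshold_ratio.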
474 | inner = wires[1] 475 | 476 | if inner.Length / outer.Length > threshold_ratio: 477 | return True 478 | else: 479 | return False 480 | -------------------------------------------------------------------------------- /src/utils/file_utils.py: -------------------------------------------------------------------------------- 1 | 2 | import sys 3 | sys.path.append('..') 4 | 5 | from setup import * 6 | 7 | import FreeCAD 8 | import Part 9 | from FreeCAD import Base 10 | import argparse 11 | import matplotlib.pyplot as plt 12 | 13 | def write_val_to_file(val, filename): 14 | with open(filename, 'w') as f: 15 | f.write(str(val) + '\n') 16 | 17 | def read_file_to_float_value(filename): 18 | val = None 19 | with open(filename, 'r') as f: 20 | lines = f.readlines() 21 | for line in lines: 22 | val = float(line) 23 | return val 24 | 25 | def read_file_to_string(filename): 26 | with open(filename, 'r') as f: 27 | lines = f.readlines() 28 | for line in lines: 29 | val = str(line) 30 | return val 31 | 32 | def write_list_to_file(filename, items): 33 | with open(filename, 'w') as f: 34 | for item in items: 35 | f.write(str(item) + '\n') 36 | 37 | def append_item_to_file(filename, item): 38 | with open(filename, 'a') as f: 39 | f.write(str(item) + '\n') 40 | 41 | def read_file_to_list(filename): 42 | items = [] 43 | with open(filename, 'r') as f: 44 | lines = f.readlines() 45 | for line in lines: 46 | items.append(line.split('\n')[0]) 47 | return items 48 | 49 | 50 | 51 | 52 | 53 | -------------------------------------------------------------------------------- /src/utils/solid_utils.py: -------------------------------------------------------------------------------- 1 | """ 2 | Utility function for solid in 3D 3 | """ 4 | 5 | import sys 6 | sys.path.append('..') 7 | 8 | from setup import * 9 | 10 | import FreeCAD 11 | import Part 12 | from FreeCAD import Base 13 | import trimesh as tm 14 | import numpy as np 15 | import math 16 | import math 17 | import copy 18 | import random 19 | from collections import defaultdict 20 | 21 | from utils.face_utils import * 22 | from utils.edge_utils import * 23 | from utils.vertex_utils import * 24 | from utils.combination_utils import * 25 | from utils.vector_utils import * 26 | 27 | import time 28 | 29 | default_eps = 10e-5 30 | 31 | def hash_solid(solid): 32 | face_keys = [] 33 | for face in solid.Faces: 34 | face_key = hash_face(face) 35 | face_keys.append(face_key) 36 | sorted_face_keys = sorted(face_keys) 37 | solid_key = tuple(face_key for face_key in sorted_face_keys) 38 | #solid_key = tuple([solid_key, solid.Volume]) 39 | solid_key = hash(solid_key) 40 | return solid_key 41 | 42 | def solid_contain_zone(solid, zone, eps = default_eps): 43 | eps = 10e-6 44 | if solid is None: 45 | return False 46 | 47 | #return su.solid1_contain_solid2(solid, zone.cad_shape) 48 | 49 | for s in solid.Solids: 50 | for p in zone.inside_points: 51 | if p is None: 52 | return False 53 | if s.isInside(Base.Vector(p[0], p[1], p[2]), 10e-8, True): 54 | return True 55 | 56 | return False 57 | 58 | 59 | def get_gen_cylinder_side_face_loops(all_faces, force_gen_cylinder = False): 60 | """ 61 | find face loops in cad so that each face loop is a part of a 62 | generalized cylinder side faces 63 | 64 | the selected faces for a loops should be: 65 | - form a loop in face graph 66 | - their intersection lines are parallel to one another 67 | """ 68 | 69 | edges = [] 70 | edge_to_index = {} 71 | edge_to_faces = defaultdict(list) 72 | for f_i,face in enumerate(all_faces): 73 | for edge in 
face.Edges: 74 | edge_key = hash_edge(edge) 75 | if edge_key not in edge_to_index: 76 | edge_index = len(edges) 77 | edge_to_index[edge_key] = edge_index 78 | edges.append(edge) 79 | else: 80 | edge_index = edge_to_index[edge_key] 81 | edge_to_faces[edge_index].append(f_i) 82 | 83 | """ 84 | - 1. for each face 85 | create a visited face list, store the current face 86 | - 2. find parallel edge pairs of the face 87 | - 3. for each parallel edge pair, find neighboring faces, and record visited faces 88 | - go back to 2 for each found face 89 | 90 | - store found loop (new face found is visited face) 91 | """ 92 | 93 | gen_face_loops = [] 94 | directions = [] 95 | 96 | for face_i, face in enumerate(all_faces): 97 | 98 | parallel_edge_pairs = face_get_parallel_edge_pairs(face) 99 | #print('parallel_edge_pairs', parallel_edge_pairs) 100 | for pair in parallel_edge_pairs: 101 | starting_edge = pair[0] 102 | direction = edge_direction(starting_edge) 103 | loop = get_next_face(all_faces, face_i, starting_edge, edge_to_index, edge_to_faces,force_gen_cylinder) 104 | #print('loop', loop) 105 | if loop and not list_has_list(gen_face_loops, loop): 106 | gen_face_loops.append(loop) 107 | directions.append(direction) 108 | 109 | output = [(loop, vector) for (loop, vector) in zip(gen_face_loops,directions)] 110 | return output 111 | 112 | 113 | # helper function 114 | def get_next_face(all_faces, current_face_id, starting_edge, edge_to_index, edge_to_faces, force_gen_cylinder = False, face_loop = []): 115 | if len(face_loop) == 0: 116 | face_loop = [current_face_id] 117 | 118 | starting_edge_index = edge_to_index[hash_edge(starting_edge)] 119 | neighbor_face_id = get_neighboring_face_by_edge(current_face_id, starting_edge_index, edge_to_faces) 120 | 121 | if neighbor_face_id is None: 122 | return None 123 | 124 | neighbor_face = all_faces[neighbor_face_id] 125 | 126 | is_frame = face_is_frame(neighbor_face) 127 | 128 | if is_frame: 129 | return None 130 | 131 | if neighbor_face_id in face_loop: 132 | return face_loop 133 | else: 134 | face_loop.append(neighbor_face_id) 135 | 136 | next_edges = get_parallel_edges(neighbor_face, starting_edge) 137 | 138 | if len(next_edges) == 0: 139 | return None 140 | else: 141 | if force_gen_cylinder: 142 | if abs(next_edges[0].Length - starting_edge.Length) > 10e-6 * starting_edge.Length: 143 | return None 144 | 145 | return get_next_face(all_faces, neighbor_face_id, next_edges[0], edge_to_index, edge_to_faces, force_gen_cylinder, face_loop) 146 | 147 | def solid_contain(solid1, solid2, eps = default_eps, count_equal = True): 148 | """ 149 | solid1 contains solid2 150 | if count_equal is True, then count equal as contain case 151 | """ 152 | 153 | if solid2.isNull(): 154 | return False 155 | 156 | common_z1z2 = solid1.common(solid2) 157 | 158 | if common_z1z2.isNull(): 159 | return False 160 | 161 | if abs(common_z1z2.Volume - solid2.Volume) < eps * solid2.Volume: 162 | return True 163 | # if solid_equal(common_z1z2, solid2, eps): 164 | # if count_equal: 165 | # return True 166 | # else: 167 | # return not solid_equal(solid1, solid2, eps) 168 | else: 169 | return False 170 | 171 | 172 | def solid_contain_point(solid, pt, eps = default_eps, count_face = False): 173 | if pt is None: 174 | return False 175 | 176 | v_total = true_Volume(solid) 177 | for s in solid.Solids: 178 | if s.Volume / v_total > eps: 179 | if s.isInside(pt, eps, count_face): 180 | return True 181 | 182 | return False 183 | 184 | def point_inside_solid(solid, eps = default_eps): 185 | center = 
solid.CenterOfMass 186 | # if solid_contain_point(solid, center, 10e-8, True): 187 | # return center 188 | 189 | edge_lengths = [e.Length for e in solid.Edges] 190 | eps_t = 10e-5# * min(edge_lengths) 191 | 192 | # print(solid.Length * 100, center, Base.Vector(0,0,1)) 193 | 194 | 195 | for i,face in enumerate(solid.Faces): 196 | direction = face.tangentAt(0.5,0.5)[0] 197 | cutting_plane = Part.makeCircle(face.Length * 100, center, direction) 198 | cutting_plane = Part.Wire(cutting_plane) 199 | cutting_plane = Part.Face(cutting_plane) 200 | normal = face.normalAt(0.5,0.5) 201 | 202 | cutting_intersection = face.section(cutting_plane) 203 | # cutting_intersection.exportStep(f"../../../debug{i}.stp") 204 | 205 | for edge in cutting_intersection.Edges: 206 | edge_c = edge.CenterOfMass 207 | # edge_c = edge.valueAt(0.5) 208 | shifted = copy.copy(edge_c).add(copy.copy(normal).multiply(eps)) 209 | 210 | line = Part.makeLine(edge_c,shifted) 211 | 212 | # projected point on the face 213 | near_info = face.distToShape(line) 214 | 215 | for pair in near_info[1]: 216 | for edge_c_p in pair: 217 | face_value = near_info[2][0][2] 218 | 219 | if type(face_value) is tuple: 220 | normal_temp= face.normalAt(face_value[0],face_value[1]) 221 | else: 222 | normal_temp = normal 223 | 224 | shifted = copy.copy(edge_c_p).sub(copy.copy(normal_temp).multiply(eps_t)) 225 | # Part.makeSphere(0.001, shifted).exportStep(f"../../../debug{random.random()}.stp") 226 | 227 | if solid_contain_point(solid, shifted, 10e-8, False): 228 | return shifted 229 | # print('edges', len(cutting_intersection.Edges)) 230 | 231 | # for edge in cutting_intersection.Edges: 232 | # print('edge center',edge.CenterOfMass) 233 | # # cutting_edge = cutting_intersection.Edges[0] 234 | return None 235 | 236 | def get_samples_on_solid_surface(solid, amount = 1e2): 237 | """ 238 | apply uniform sampling on the surface of the solid 239 | if exclude_hole, 240 | """ 241 | 242 | mesh = solid.tessellate(default_eps * solid.Length * 100) 243 | mesh_tri = tm.Trimesh(vertices=mesh[0], faces=mesh[1]) 244 | pos_info, tris = tm.sample.sample_surface(mesh_tri, amount) 245 | 246 | positions = [] 247 | normals = [] 248 | for i, p in enumerate(pos_info): 249 | positions.append(p) 250 | t = tris[i] 251 | nor = get_point_normal(p, t, mesh_tri) 252 | normals.append(nor) 253 | 254 | return np.array(positions), np.array(normals) 255 | 256 | def solid_is_generalized_cylinder(solid): 257 | """ 258 | check if there are a pair of parallel faces, then all other faces are perpendicular to this pair of face 259 | """ 260 | for i,face1 in enumerate(solid.Faces): 261 | for j,face2 in enumerate(solid.Faces): 262 | if i != j and face_parallel(face1, face2): 263 | 264 | if abs(face1.Area - face2.Area) < default_eps * solid.Area: 265 | # if target_geo: 266 | # # any side is on target surface 267 | # f1 = face_on_solid(target_geo,face1) 268 | # f2 = face_on_solid(target_geo,face2) 269 | 270 | # if not f1 and not f2: 271 | # break 272 | 273 | all_perpendicular = True 274 | for k,face_other in enumerate(solid.Faces): 275 | if k != i and k != j: 276 | if face_perpendicular(face_other, face1) is False: 277 | all_perpendicular = False 278 | break 279 | if all_perpendicular: 280 | return face1.normalAt(0.5,0.5) 281 | 282 | return False 283 | 284 | def merge_solids(solids): 285 | 286 | if len(solids) == 0: 287 | return None 288 | 289 | merged_solid = solids[0] 290 | for i in range(1, len(solids)): 291 | merged_solid = merged_solid.fuse(solids[i]) 292 | #merged_solid = 
merged_solid.removeSplitter() 293 | try: 294 | merged_solid = merged_solid.removeSplitter() 295 | except: 296 | return merged_solid 297 | #merged_solid = defeature_solid(merged_solid) 298 | 299 | #merged_solid = Part.Solid(merged_solid) 300 | return merged_solid 301 | 302 | def get_loop_solid(splitting_shapes): 303 | compound, map = splitting_shapes[0].generalFuse(splitting_shapes[1:], 0.0) 304 | if len(compound.Solids) != 2: 305 | return None 306 | 307 | s1 = compound.Solids[0] 308 | s2 = compound.Solids[1] 309 | 310 | if s1.Volume > s2.Volume: 311 | return s2 312 | else: 313 | return s1 314 | 315 | for s in compound.Solids: 316 | print('number of faces', len(s.Faces)) 317 | for f in s.Faces: 318 | print('number of wires', len(f.Wires)) 319 | 320 | def true_Volume(cad): 321 | 322 | total_vol = 0 323 | 324 | # print('cad', cad.isValid()) 325 | 326 | for solid in cad.Solids: 327 | # print('solid.Volume', solid.Volume) 328 | total_vol += abs(solid.Volume) 329 | return max(total_vol,cad.Volume) 330 | 331 | def get_bbox(cad): 332 | 333 | bbox = cad.BoundBox 334 | 335 | bbox_data = str(bbox).split('BoundBox (')[1].split(')')[0].split(',') 336 | bbox_data = [float(item) for item in bbox_data] 337 | 338 | bbox_geo = Part.makeBox(bbox_data[3]-bbox_data[0], bbox_data[4]-bbox_data[1], \ 339 | bbox_data[5]-bbox_data[2], Base.Vector(bbox_data[0], bbox_data[1], bbox_data[2])) 340 | return bbox_geo -------------------------------------------------------------------------------- /src/utils/space_splitter.py: -------------------------------------------------------------------------------- 1 | """ 2 | class used to partition the 3D space by the extended faces of a 3 | given geometry. 4 | 5 | """ 6 | 7 | 8 | # path to your FreeCAD.so or FreeCAD.dll file 9 | # make sure to run the python compatible with FreeCAD 10 | 11 | import sys 12 | sys.path.append('..') 13 | 14 | from setup import * 15 | 16 | 17 | import FreeCAD 18 | import Part 19 | from FreeCAD import Base 20 | 21 | import utils.face_utils as fu 22 | import utils.solid_utils as su 23 | import utils.combination_utils as cu 24 | # from utils.vis_utils import * 25 | 26 | dafault_eps = 5 * 10e-4 27 | 28 | partition_eps = 10e-9 29 | 30 | class SpaceSplitter: 31 | def __init__(self, geometry, extend_cull = None, use_face_loop = True): 32 | """ 33 | class to split space (geometry bbox) by extended faces of the geometry 34 | extend_cull is a list to specify faces to extend/not to extend 35 | """ 36 | 37 | #self.geometry = geometry.removeSplitter() 38 | self.geometry = geometry 39 | self.isGenCylinder = su.solid_is_generalized_cylinder(self.geometry) 40 | 41 | self.bbox = self.geometry.BoundBox 42 | self.bbox_used = None 43 | 44 | # bbox_data = str(self.bbox).split('BoundBox (')[1].split(')')[0].split(',') 45 | # bbox_data = [float(item) for item in bbox_data] 46 | 47 | # self.bbox_geo = Part.makeBox(bbox_data[3]-bbox_data[0], bbox_data[4]-bbox_data[1], \ 48 | # bbox_data[5]-bbox_data[2], Base.Vector(bbox_data[0], bbox_data[1], bbox_data[2])) 49 | 50 | self.bbox_geo = su.get_bbox(geometry) 51 | 52 | # self.bbox_geo.exportStep(f"../../../bbox.stp") 53 | 54 | # resolve tangent touching spliting bug 55 | 56 | self.faces = self.geometry.Faces 57 | 58 | # remove irregular small faces 59 | self.faces = [f for f in self.faces if f.Area > 10e-8 * self.geometry.Area] 60 | 61 | self.zones = None 62 | 63 | use_face_loop = True 64 | if use_face_loop: 65 | self.face_loops = su.get_gen_cylinder_side_face_loops(self.faces) 66 | else: 67 | self.face_loops = [] 68 | 
print('len(self.face_loops)', len(self.face_loops)) 69 | 70 | #print('self.face_loops', self.face_loops, len(self.face_loops)) 71 | 72 | if not extend_cull or len(extend_cull) == 0: 73 | self.splitting_faces = [fu.selective_extend_face(i, self.face_loops, self.faces) for i in range(len(self.faces))] 74 | else: 75 | # only extend faces that are true in extend_cull 76 | self.splitting_faces = [fu.selective_extend_face(i, self.face_loops, self.faces) if (i < len(extend_cull) and \ 77 | extend_cull[i]) or i >= len(extend_cull) else f for i,f in enumerate(self.faces)] 78 | 79 | 80 | # print('self.splitting_faces', len(self.splitting_faces)) 81 | self.selected_list = [] 82 | 83 | for i,face in enumerate(self.splitting_faces): 84 | # planar extends 85 | # if len(face.Edges) == 1 and fu.face_is_planar(face): 86 | if len(face.Edges) == 1 or fu.face_is_planar(face): 87 | 88 | if cu.list_has_item(self.bbox_geo.Faces[:], face, fu.face_share_extension_plane): 89 | # self.splitting_faces.remove(face) 90 | # face.exportStep(f"{'../../../'}/face_{i}.stp") 91 | 92 | # print('removed', i) 93 | pass 94 | else: 95 | self.selected_list.append(face) 96 | else: 97 | self.selected_list.append(face) 98 | 99 | # filter co-planar faces 100 | self.selected_final = [] 101 | planar_ext_faces = [] 102 | non_exts = [] 103 | 104 | for face in self.selected_list: 105 | # planar extensions 106 | if len(face.Edges) == 1 and face.Area > 10e3: 107 | # print('area',face.Area, len(face.Edges)) 108 | planar_ext_faces.append(face) 109 | else: 110 | non_exts.append(face) 111 | 112 | 113 | # only keep one if two faces are co-planar 114 | for face in planar_ext_faces: 115 | if not cu.list_has_item(self.selected_final, face, fu.face_share_extension_plane): 116 | self.selected_final.append(face) 117 | 118 | # if non-extended is coplanar to planar one, then remove it 119 | for face in non_exts: 120 | if not fu.face_is_planar(face): 121 | self.selected_final.append(face) 122 | else: 123 | if not cu.list_has_item(planar_ext_faces, face, fu.face_share_extension_plane): 124 | self.selected_final.append(face) 125 | 126 | 127 | 128 | # print('self.splitting_faces', len(self.splitting_faces)) 129 | # self.faces[4].exportStep(f"{'../../../'}/face_4_nonextend.stp") 130 | # self.splitting_faces[4].exportStep(f"{'../../../'}/face_4_extend.stp") 131 | 132 | self.list_of_shapes_list = [] 133 | 134 | scales = [1 - 1e-5] 135 | self.bboxs = [] 136 | for scale in scales: 137 | bbox_temp = self.bbox_geo.copy() 138 | bbox_temp.scale(scale, bbox_temp.CenterOfMass) 139 | item = [bbox_temp] + self.selected_final 140 | self.bboxs.append(bbox_temp) 141 | self.list_of_shapes_list.append(item) 142 | 143 | # for f in self.splitting_faces: 144 | # print('f area',f.Area) 145 | 146 | # faces = Part.makeCompound(self.splitting_faces) 147 | # faces.exportStep(f"../../../res_faces.stp") 148 | 149 | # print('self.splitting_faces', len(self.splitting_faces)) 150 | 151 | def get_zone_solids(self, export_path=None): 152 | 153 | 154 | for i,list_of_shapes in enumerate(self.list_of_shapes_list): 155 | 156 | if len(self.geometry.Faces) == 6 and abs(self.geometry.Volume - self.bbox_geo.Volume) < 10e-6: 157 | self.zones = self.geometry.Solids 158 | self.bbox_used = self.bboxs[i] 159 | self.list_of_shapes = list_of_shapes 160 | break 161 | else: 162 | try: 163 | # pieces receives a compound of shapes; map receives a list of lists of shapes, defining list_of_shapes <--> pieces correspondence 164 | pieces, map = list_of_shapes[0].generalFuse(list_of_shapes[1:], partition_eps) 165 
| 166 | if (len(pieces.Solids) > 1) or i == len(self.list_of_shapes_list) - 1: 167 | self.zones = pieces.Solids 168 | self.bbox_used = self.bboxs[i] 169 | self.list_of_shapes = list_of_shapes 170 | break 171 | except: 172 | self.zones = self.geometry.Solids 173 | self.bbox_used = self.bboxs[i] 174 | self.list_of_shapes = list_of_shapes 175 | break 176 | 177 | if self.isGenCylinder: 178 | if len(self.zones) == 1: 179 | self.zones = self.geometry.Solids 180 | self.bbox_used = self.bbox_geo 181 | self.list_of_shapes = [self.bbox_geo.copy()] 182 | 183 | 184 | if export_path: 185 | self.export_zones(export_path) 186 | return self.zones 187 | 188 | def get_proposal_planes(self): 189 | 190 | planes = [] 191 | for f in self.faces: 192 | if fu.face_is_planar(f) and not cu.list_has_item(planes + self.bbox_geo.Faces, f, fu.face_share_extension_plane): 193 | planes.append(f.copy()) 194 | 195 | return planes + self.bbox_used.Faces 196 | 197 | def export_zones(self, path): 198 | if self.zones is None: 199 | self.get_zone_solids(None) 200 | 201 | solids = Part.makeCompound(self.zones) 202 | 203 | # export bbox of the geometry 204 | self.bbox_geo.exportStep(f"{path}/res_bbox.stp") 205 | 206 | # export partitioned space using the geometry 207 | solids.exportStep(f"{path}/res_p.stp") 208 | 209 | faces = Part.makeCompound(self.list_of_shapes) 210 | faces.exportStep(f"{path}/res_faces.stp") 211 | 212 | 213 | 214 | 215 | 216 | 217 | -------------------------------------------------------------------------------- /src/utils/vector_utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import copy 3 | import math 4 | 5 | def parallel(v1, v2, default_eps = 10e-6): 6 | # vectors are parallel only if the angle between them is ~0 or ~pi 7 | ang = v1.getAngle(v2) 8 | 9 | if abs(ang) > default_eps and abs(ang - math.pi) > default_eps: 10 | return False 11 | return True 12 | 13 | def rotation_matrix_from_vectors(vec1, vec2): 14 | """ Find the rotation matrix that aligns vec1 to vec2 15 | :param vec1: A 3d "source" vector 16 | :param vec2: A 3d "destination" vector 17 | :return mat: A flattened 4x4 homogeneous transform matrix whose rotation block, when applied to vec1, aligns it with vec2.
18 | """ 19 | 20 | a, b = (vec1 / np.linalg.norm(vec1)).reshape(3), (vec2 / np.linalg.norm(vec2)).reshape(3) 21 | v = np.cross(a, b) 22 | c = np.dot(a, b) 23 | s = np.linalg.norm(v) 24 | kmat = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]]) 25 | rotation_matrix = np.eye(3) + kmat + kmat.dot(kmat) * ((1 - c) / (s ** 2)) 26 | 27 | 28 | rotation_matrix = np.column_stack((rotation_matrix, np.array([0, 0, 0]) )) 29 | rotation_matrix = np.vstack ((rotation_matrix, np.array([0, 0, 0, 1]) )) 30 | 31 | return rotation_matrix.flatten() 32 | 33 | def vector_dir_hashed(vector): 34 | x = hash(round(vector[0], 3)) 35 | y = hash(round(vector[1], 3)) 36 | z = hash(round(vector[2], 3)) 37 | x_ = hash(round(-vector[0], 3)) 38 | y_ = hash(round(-vector[1], 3)) 39 | z_ = hash(round(-vector[2], 3)) 40 | if 100*x + 10*y + z > 100*x_ + 10*y_ + z_ : 41 | vector_key = hash(tuple([x, y, z, x_, y_, z_])) 42 | return vector_key 43 | else: 44 | vector_key = hash(tuple([x_, y_, z_, x, y, z])) 45 | return vector_key 46 | 47 | def hash_vector(vector): 48 | x = hash(round(vector[0], 3)) 49 | y = hash(round(vector[1], 3)) 50 | z = hash(round(vector[2], 3)) 51 | vector_key = hash(tuple([x, y, z])) 52 | return vector_key 53 | 54 | def vector_rev_equal(vector1, vector2): 55 | equal = abs(vector1[0] - vector2[0]) + abs(vector1[1] - vector2[1]) + abs(vector1[2] - vector2[2]) < 10e-5 56 | rev = abs(vector1[0] + vector2[0]) + abs(vector1[1] + vector2[1]) + abs(vector1[2] + vector2[2]) < 10e-5 57 | return equal or rev -------------------------------------------------------------------------------- /src/utils/vertex_utils.py: -------------------------------------------------------------------------------- 1 | 2 | 3 | import sys 4 | sys.path.append('..') 5 | 6 | from setup import * 7 | 8 | import FreeCAD 9 | import Part 10 | from FreeCAD import Base 11 | import math 12 | import numpy as np 13 | 14 | def hash_vertex(vertex, eps = 3): 15 | x = hash(round(vertex.Point[0], eps)) 16 | y = hash(round(vertex.Point[1], eps)) 17 | z = hash(round(vertex.Point[2], eps)) 18 | vertex_key = hash(tuple([x, y, z])) 19 | return vertex_key 20 | 21 | def hash_point(point, eps = 3): 22 | x = hash(round(point[0], eps)) 23 | y = hash(round(point[1], eps)) 24 | z = hash(round(point[2], eps)) 25 | point_key = hash(tuple([x, y, z])) 26 | return point_key 27 | 28 | def get_point_normal(point, face_index, mesh): 29 | 30 | f = mesh.faces[face_index] 31 | 32 | normal_0 = mesh.vertex_normals[f[0]] 33 | normal_1 = mesh.vertex_normals[f[1]] 34 | normal_2 = mesh.vertex_normals[f[2]] 35 | 36 | p0 = mesh.vertices[f[0]] 37 | p1 = mesh.vertices[f[1]] 38 | p2 = mesh.vertices[f[2]] 39 | 40 | area_2 = get_area(point, p0, p1) 41 | area_0 = get_area(point, p1, p2) 42 | area_1 = get_area(point, p2, p0) 43 | 44 | weight_0 = area_0 / (area_0 + area_1 + area_2) 45 | weight_1 = area_1 / (area_0 + area_1 + area_2) 46 | weight_2 = area_2 / (area_0 + area_1 + area_2) 47 | 48 | point_normal = normal_0 * weight_0 + normal_1 * weight_1 + normal_2 * weight_2 49 | 50 | return normalize(point_normal) 51 | 52 | def get_area(p0, p1, p2): 53 | 54 | a = cal_vec_length(p0 - p1) 55 | b = cal_vec_length(p1 - p2) 56 | c = cal_vec_length(p2 - p0) 57 | p = (a + b + c) / 2 58 | 59 | tmp = p * (p - a) * (p - b) * (p - c) 60 | 61 | tmp = max(0, tmp) 62 | 63 | area = math.sqrt(tmp) 64 | return area 65 | 66 | def cal_vec_length(vec): 67 | 68 | sum = 0 69 | for v in vec: 70 | sum += v * v 71 | 72 | return math.sqrt(sum) 73 | 74 | def normalize(vec): 75 | return vec / 
np.linalg.norm(vec) -------------------------------------------------------------------------------- /src/utils/vis_utils.py: -------------------------------------------------------------------------------- 1 | from mayavi.mlab import * 2 | from mayavi import mlab 3 | import random 4 | import numpy as np 5 | import os 6 | import utils.edge_utils as eu 7 | import utils.solid_utils as su 8 | import utils.face_utils as fu 9 | 10 | import networkx as nx 11 | 12 | cyan = (0.0, 1.0, 1.0) 13 | yellow = (1.0, 1.0, 0) 14 | red = (1, 0, 0) 15 | green = (0, 1, 0) 16 | 17 | def display_cad(cad_shape, color = (1,1,0), opacity=0.5, show_lines = True): 18 | 19 | default_eps = 10e-5 20 | triangles = cad_shape.tessellate(default_eps * cad_shape.Length * 10) 21 | vertex_positions = triangles[0] 22 | face_indices = triangles[1] 23 | 24 | if len(face_indices) == 0: 25 | return 26 | 27 | x = [] 28 | y = [] 29 | z = [] 30 | for v in vertex_positions: 31 | x.append(v[0]) 32 | y.append(v[1]) 33 | z.append(v[2]) 34 | 35 | tris = [] 36 | for f in face_indices: 37 | tri = [] 38 | tri.append(f[0]) 39 | tri.append(f[1]) 40 | tri.append(f[2]) 41 | tris.append(tri) 42 | 43 | mlab.triangular_mesh(x, y, z, tris, color = color, opacity=opacity) 44 | 45 | if show_lines: 46 | for edge in cad_shape.Edges: 47 | points = eu.edge_sample_points(edge, amount = 2000, random_sample = False) 48 | x = [] 49 | y = [] 50 | z = [] 51 | for p in points: 52 | x.append(p[0]) 53 | y.append(p[1]) 54 | z.append(p[2]) 55 | mlab.points3d(x, y, z, color = (0, 0, 0), scale_factor=0.0025) 56 | 57 | 58 | def display_object(obj, bound_obj=None, file=None, show=False, export_file=None, color=(1.0, 0.1, 0.1), opacity=0.9): 59 | 60 | if obj is None: 61 | return 62 | 63 | mlab.figure(bgcolor=(1,1,1), size=(800, 800)) 64 | if bound_obj: 65 | display_cad(bound_obj) 66 | 67 | if obj: 68 | if hasattr(obj, 'cad_shape'): 69 | display_cad(obj.cad_shape, color=color, opacity=opacity) 70 | else: 71 | display_cad(obj, color=color, opacity=opacity) 72 | 73 | if export_file: 74 | if hasattr(obj, 'cad_shape'): 75 | obj.cad_shape.exportStep(export_file) 76 | else: 77 | obj.exportStep(export_file) 78 | 79 | if file: 80 | mlab.savefig(filename=file) 81 | 82 | mlab.close() 83 | 84 | if show: 85 | mlab.show() 86 | 87 | def display_objects(objs, bound_obj=None, file=None, show=False, color=(1.0, 0.1, 0.1), opacity=1.0): 88 | 89 | mlab.figure(bgcolor=(1,1,1), size=(800, 800)) 90 | 91 | if bound_obj: 92 | display_cad(bound_obj) 93 | 94 | for obj in objs: 95 | if obj: 96 | if hasattr(obj, 'cad_shape'): 97 | display_cad(obj.cad_shape, color=color, opacity=0.9) 98 | else: 99 | for solid in obj.Solids: 100 | display_cad(solid, color=color, opacity=0.9) 101 | 102 | if file: 103 | mlab.savefig(filename=file) 104 | mlab.close() 105 | 106 | if show: 107 | mlab.show() 108 | 109 | def display_step(extrusion_shape, ext_type, original_shape, bound_obj=None, file=None, show=False): 110 | 111 | mlab.figure(bgcolor=(1,1,1), size=(800, 800)) 112 | 113 | if bound_obj: 114 | display_cad(bound_obj) 115 | 116 | if extrusion_shape and not extrusion_shape.isNull(): 117 | if ext_type == "add": 118 | display_cad(extrusion_shape, color=(0.0, 1, 0.0), opacity=0.5) 119 | elif ext_type == "remove": 120 | display_cad(extrusion_shape, color=(1, 0.0, 0.0), opacity=0.5) 121 | else: 122 | display_cad(extrusion_shape, color=(0.0, 0.0, 1), opacity=0.5) 123 | 124 | if original_shape and not original_shape.isNull(): 125 | # for solid in original_shape.Solids: 126 | display_cad(original_shape, color=(0.95,
0.95, 0.95), opacity=0.2) 127 | 128 | if file: 129 | mlab.savefig(filename=file) 130 | mlab.close() 131 | 132 | if show: 133 | mlab.show() 134 | 135 | 136 | 137 | 138 | 139 | 140 | 141 | 142 | 143 | 144 | 145 | -------------------------------------------------------------------------------- /teaser-image.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/brownvc/zone-graphs/bd9d8d120b60846b50a442043f735b152df742e8/teaser-image.png --------------------------------------------------------------------------------
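For a quick, self-contained sanity check of the geometry helpers, the `rotation_matrix_from_vectors` function in `src/utils/vector_utils.py` can be exercised on its own, since that module only depends on `numpy`. The sketch below is illustrative only: it assumes it is run from the `src` directory so that the `utils` package is importable, and it avoids (anti-)parallel input vectors, for which the construction divides by zero.

```
# Illustrative check of utils/vector_utils.rotation_matrix_from_vectors (run from src/).
import numpy as np

from utils.vector_utils import rotation_matrix_from_vectors

v_from = np.array([1.0, 0.0, 0.0])  # source direction (unit length)
v_to = np.array([0.0, 0.0, 1.0])    # destination direction; must not be (anti-)parallel to v_from

# The helper returns a flattened 4x4 homogeneous matrix (row-major).
mat = np.asarray(rotation_matrix_from_vectors(v_from, v_to)).reshape(4, 4)

# Applying the 3x3 rotation block to the source direction should reproduce the destination.
assert np.allclose(mat[:3, :3] @ v_from, v_to, atol=1e-8)
print(mat)
```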