├── PointNet.pdf
├── README.md
├── ShapeNet
│   └── process_data.py
├── model.ipynb
├── report.pdf
└── results
    ├── CAR1.png
    ├── LAMPE.png
    ├── car2.png
    ├── chair.png
    ├── plane.png
    └── table1.png

/PointNet.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amenimtibaa/3D-Segmentation/f3fdecacfd53461df91de6e68f06f8e2aab3aefc/PointNet.pdf
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# 3D Part Segmentation

Keras implementation of PointNet for part segmentation on the ShapeNet dataset.

## Getting Started

These instructions will get you a copy of the project up and running for development and testing purposes.

### Download data

First, download the data from [here](https://shapenet.cs.stanford.edu/iccv17/): Training Point Clouds, Training Label, Validation Point Clouds and Validation Label.

Then unzip the archives and put them in a directory called "data" under the ShapeNet directory.

### Prepare train & validation dataset

ShapeNet contains 16 shape categories. Each category is annotated with 2 to 6 parts, and there are 50 different parts annotated in total.
If you open the train_labels data directory you will find 16 sub-directories, each containing ".seg" files; each line of these files contains the part label of the corresponding point (a number between 1 and 6). What we want is to associate a unique number (between 1 and 50) with each part before creating the dataset.
This is done in "process_data.py": after executing it you should have four ".npy" arrays (train data/labels and validation data/labels).

### Running the tests

Now you only need to upload the data to your Drive and run the Colab notebook, or save it as a Jupyter notebook and execute it locally :)
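If you want to sanity-check the processed arrays before training, here is a minimal loading sketch; it only assumes that the four `.npy` files written by `ShapeNet/process_data.py` are in the working directory:

```python
import numpy as np

# Arrays written by ShapeNet/process_data.py
train_data = np.load("train_data.npy")      # (n_clouds, n_points, 3) xyz coordinates
train_labels = np.load("train_labels.npy")  # (n_clouds, n_points) part labels in [1, 50]
val_data = np.load("val_data.npy")
val_labels = np.load("val_label.npy")

print("train:", train_data.shape, train_labels.shape)
print("val:  ", val_data.shape, val_labels.shape)
print("distinct part labels:", len(np.unique(train_labels)))
```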
## Some results

Good results:

![](results/plane.png?raw=true)

![](results/chair.png?raw=true)

![](results/table1.png?raw=true)

Medium/bad results:

![](results/LAMPE.png)

![](results/CAR1.png)

## Acknowledgments

* The original TensorFlow code is [here](https://github.com/charlesq34/pointnet)

* This project is part of the [3D course](http://caor-mines-paristech.fr/fr/cours-npm3d/) of the [IASD master](https://www.lamsade.dauphine.fr/wp/iasd/en/).

--------------------------------------------------------------------------------
/ShapeNet/process_data.py:
--------------------------------------------------------------------------------
"""
Process the ShapeNet dataset into train and validation sets.
The label of each point varies between 1 and 50 (50 is the total number of part
sub-categories over the 16 shape categories).
"""
import numpy as np
import os

np.set_printoptions(suppress=True)


def compute_offset(category):
    """
    Return the total number of parts belonging to the categories that precede
    `category` in the `labels` list. Adding this offset to a per-category part
    label (1 to 6) yields a globally unique label between 1 and 50.
    """
    # get the index of the category
    index = labels.index(category)
    # get all previous categories
    previous = labels[:index]
    # total number of parts over all the previous categories
    total = 0
    for cat in previous:
        total += Nbr_part_per_category[cat]
    return total

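# Illustrative check (relies on `labels` and `Nbr_part_per_category` defined below):
# compute_offset("02691156") returns 0, since airplanes come first in `labels`, while
# compute_offset("02773838") returns 4, because the only preceding category (airplanes)
# has 4 annotated parts; a bag point whose .seg label is 2 therefore receives the
# unique label 4 + 2 = 6.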

def select_points(points, num_points):
    """
    Randomly sample a fixed number of points from a point cloud.

    :param points: point cloud, array of shape (n, d)
    :param num_points: number of points to keep
    :return: array of shape (num_points, d); sampled without replacement when
             the cloud is large enough, with replacement otherwise
    """
    # sample point indices (with replacement only if the cloud is too small)
    if points.shape[0] > num_points:
        index = np.random.choice(points.shape[0], num_points, replace=False)
    else:
        index = np.random.choice(points.shape[0], num_points, replace=True)
    selected_points = np.take(points, index, axis=0)
    return selected_points


def file_to_numpy(path_data, path_label, nb_pts_in_cloud=1024):
    """
    Read every .pts/.seg file pair found under path_data/path_label and return
    an array of shape (n_clouds, nb_pts_in_cloud, 4): xyz coordinates plus the
    globally unique part label.
    """
    oslistval = os.listdir(path_data)
    list_classes = []

    # parse categories
    for classe in oslistval:

        path_data_classe = path_data + "/" + classe
        path_label_classe = path_label + "/" + classe
        # oslistfile contains the (.pts) files of this category
        oslistfile = os.listdir(path_data_classe)
        list_array_classe = []
        # parse data files
        for file in oslistfile:
            # read the point cloud (one "x y z" triplet per line)
            path_file = path_data_classe + "/" + file
            file_array = np.fromfile(path_file, dtype=float, count=-1, sep=' ')
            file_array = file_array.reshape((-1, 3))

            # append a fourth column that will hold the part label of each point
            part_label = np.zeros((file_array.shape[0], 1))
            file_array = np.hstack((file_array, part_label))

            # fill the label column from the matching .seg file
            path_seg_file = path_label_classe + "/" + file.split("pts")[0] + "seg"
            # compute the offset to add for this category
            category_offset = compute_offset(classe)

            with open(path_seg_file, 'r') as f:
                for i, line in enumerate(f):
                    # adding the offset turns the per-category label into a unique label
                    file_array[i, 3] = int(line) + category_offset

            # shuffle the array and keep nb_pts_in_cloud points
            file_array = select_points(file_array, nb_pts_in_cloud)

            list_array_classe.append(np.reshape(file_array, (1, nb_pts_in_cloud, -1)))

        # stack the data of all files of this category in one array
        list_array_classe = np.vstack(list_array_classe)
        list_classes.append(list_array_classe)

    # stack the data of all categories in one array
    array_classes = np.vstack(list_classes)

    return array_classes


# labels (ids) of the sixteen classes (categories)
labels = ["02691156", "02773838", "02954340", "02958343", "03001627", "03261776", "03467517", "03624134",
          "03636649", "03642806", "03790512", "03797390", "03948459", "04099429", "04225987", "04379243"]

# number of annotated parts per category (they sum to 50)
Nbr_part_per_category = {'04225987': 3, '02958343': 4, '03790512': 6, '03797390': 2,
                         '03624134': 2, '03636649': 4, '02691156': 4, '02954340': 2, '03948459': 3,
                         '03642806': 2, '04379243': 3, '03001627': 4, '03261776': 3, '04099429': 3,
                         '02773838': 2, '03467517': 3}

"""
category_to_label = {'Cap': '02954340', 'Rocket': '04099429', 'Lamp': '03636649', 'Motorbike': '03790512',
                     'Car': '02958343', 'Airplane': '02691156', 'Skateboard': '04225987', 'Mug': '03797390',
                     'Laptop': '03642806', 'Bag': '02773838', 'Guitar': '03467517', 'Earphone': '03261776',
                     'Pistol': '03948459', 'Knife': '03624134', 'Table': '04379243', 'Chair': '03001627'}
"""

nb_pts_per_cloud = 2048

path_val_data = "data/val_data"
path_val_label = "data/val_label"
path_train_data = "data/points"
path_train_label = "data/points_label"

print(' ***************************** ')
# build and save the training set
array_classes = file_to_numpy(path_train_data, path_train_label, nb_pts_per_cloud)
np.save("train_data.npy", array_classes[:, :, 0:3])
np.save("train_labels.npy", array_classes[:, :, 3])
print("train data : ", array_classes[:, :, 0:3].shape)


# build and save the validation set
array_classes = file_to_numpy(path_val_data, path_val_label, nb_pts_per_cloud)
np.save("val_data.npy", array_classes[:, :, 0:3])
np.save("val_label.npy", array_classes[:, :, 3])
print("val data : ", array_classes[:, :, 0:3].shape)

--------------------------------------------------------------------------------
/report.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amenimtibaa/3D-Segmentation/f3fdecacfd53461df91de6e68f06f8e2aab3aefc/report.pdf
--------------------------------------------------------------------------------
/results/CAR1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amenimtibaa/3D-Segmentation/f3fdecacfd53461df91de6e68f06f8e2aab3aefc/results/CAR1.png
--------------------------------------------------------------------------------
/results/LAMPE.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amenimtibaa/3D-Segmentation/f3fdecacfd53461df91de6e68f06f8e2aab3aefc/results/LAMPE.png
--------------------------------------------------------------------------------
/results/car2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amenimtibaa/3D-Segmentation/f3fdecacfd53461df91de6e68f06f8e2aab3aefc/results/car2.png
--------------------------------------------------------------------------------
/results/chair.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amenimtibaa/3D-Segmentation/f3fdecacfd53461df91de6e68f06f8e2aab3aefc/results/chair.png
--------------------------------------------------------------------------------
/results/plane.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amenimtibaa/3D-Segmentation/f3fdecacfd53461df91de6e68f06f8e2aab3aefc/results/plane.png
--------------------------------------------------------------------------------
/results/table1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/amenimtibaa/3D-Segmentation/f3fdecacfd53461df91de6e68f06f8e2aab3aefc/results/table1.png
--------------------------------------------------------------------------------