├── .gitignore
├── .gitmodules
├── README.md
└── Spatial_Layout
    └── main.py

--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
Spatial_Layout/main.cpp

.idea/

.vscode/
--------------------------------------------------------------------------------
/.gitmodules:
--------------------------------------------------------------------------------
[submodule "VINS-Fusion"]
	path = VINS-Fusion
	url = https://github.com/HKUST-Aerial-Robotics/VINS-Fusion.git
[submodule "VisualDet3D"]
	path = VisualDet3D
	url = https://github.com/Owen-Liuyuxuan/visualDet3D.git
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# SS-LCD

The core code for the paper *Loop Closure Detection Based on Object-level Spatial Layout and Semantic Consistency*, submitted to TIM 2023.

## Setup

1.1 Ubuntu and ROS

Ubuntu 18.04 with ROS Melodic.

1.2 [VisualDet3D](https://github.com/Owen-Liuyuxuan/visualDet3D)

- environment setup

```bash
pip3 install -r requirement.txt
```

- configuration and path setup

Modify the paths and parameters in the **visualDet3D/config*.py** files.

- training with stereo images as input

```bash
./launchers/det_precompute.sh config/$CONFIG_FILE.py train
./launchers/train.sh config/$CONFIG_FILE.py 0 $experiment_name
```

- testing with stereo images as input

```bash
./launchers/det_precompute.sh config/$CONFIG_FILE.py test
./launchers/eval.sh config/$CONFIG_FILE.py 0 visualDet3D/workdirs/Stereo3D/checkpoint/Stereo3D_latest.pth test
```

- evaluation results

The object detection result files should be stored in **sequences/xx** so that VINS-Fusion can read them.
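As a rough sketch of what those result files contain: visualDet3D writes its detections in the standard KITTI label format, one object per line. The snippet below parses a single line into the fields a downstream consumer typically needs; the example values and the subset of fields kept here are illustrative assumptions, not part of this repository.

```python
def parse_kitti_line(line):
    """Parse one KITTI-format label line into commonly used fields."""
    fields = line.split()
    return {
        "type": fields[0],                                  # object class, e.g. "Car"
        "bbox2d": [float(v) for v in fields[4:8]],          # left, top, right, bottom (px)
        "dims_hwl": [float(v) for v in fields[8:11]],       # height, width, length (m)
        "location_xyz": [float(v) for v in fields[11:14]],  # 3D center in camera coords (m)
        "rotation_y": float(fields[14]),                    # yaw around the camera Y axis
        "score": float(fields[15]) if len(fields) > 15 else None,
    }

# illustrative detection line
example = "Car 0.00 0 -1.57 614.24 181.78 727.31 284.77 1.57 1.73 4.15 1.00 1.75 13.22 -1.62 0.92"
det = parse_kitti_line(example)
```
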

1.3 [VINS-Fusion](https://github.com/HKUST-Aerial-Robotics/VINS-Fusion)

- environment setup

Ceres Solver: follow the [Ceres Installation](http://ceres-solver.org/installation.html) instructions.

- an example

```bash
roslaunch vins vins_rviz.launch
rosrun vins kitti_odom_test ~/catkin_ws/src/VINS-Fusion/config/kitti_odom/kitti_config00-02.yaml YOUR_DATASET_FOLDER/sequences/00/
```

1.4 Spatial layout difference computing

With the object detection results and the pose estimation results, you can run your own loop detection and store its results as two files, e.g. **graph1.txt** and **graph2.txt**.

The executable program outputs the matched nodes with the minimum distance between graph1 and graph2.

## Notes

We are currently busy addressing review comments and revising the paper. The whole system for our paper will be released on this page soon.

## Related Papers

If you find this repository and its code useful, please cite the following papers.

```
@ARTICLE{9327478,
  author={Y. {Liu} and Y. {Yuan} and M.
{Liu}},
  journal={IEEE Robotics and Automation Letters},
  title={Ground-aware Monocular 3D Object Detection for Autonomous Driving},
  year={2021},
  doi={10.1109/LRA.2021.3052442}}

@article{qin2019fusion,
  author={Tong Qin and Jie Pan and Shaozu Cao and Shaojie Shen},
  title={A General Optimization-based Framework for Local Odometry Estimation with Multiple Sensors},
  journal={ArXiv},
  year={2019},
  volume={abs/1901.03638}}
```
--------------------------------------------------------------------------------
/Spatial_Layout/main.py:
--------------------------------------------------------------------------------
import numpy as np

tau = 0.2  # threshold for matching
K = 2  # number of neighbors


# Read a txt file whose lines have the format "node1 node2 distance"
# and return the edges as an np.array.
def read_txt(filename):
    with open(filename) as f:
        data = f.readlines()
    data = [x.split() for x in data if x.strip()]
    # np.float was removed in recent NumPy; use the builtin float instead
    return np.array(data, dtype=float)


# Compute the L1-normalized distances from node i to its (at most K)
# nearest neighbors.
def cal_neighbors(data, i):
    edges_i = np.sort(data[data[:, 0] == i][:, 2])
    if edges_i.size == 0:
        return edges_i
    nei_i = edges_i / np.linalg.norm(edges_i, ord=1)
    return nei_i[:K]


# Match the nodes of the two graphs.
def matching(data1, data2):
    detections = []
    objects = []
    graph1 = np.unique(data1[:, 0:2])
    graph2 = np.unique(data2[:, 0:2])
    for ini in graph1:
        nei_i = cal_neighbors(data1, ini)
        if nei_i.size == 0:
            continue
        for inj in graph2:
            nei_j = cal_neighbors(data2, inj)
            # descriptors of different lengths are not comparable
            if nei_j.size != nei_i.size:
                continue
            d_f = np.linalg.norm(nei_i - nei_j, ord=1)
            if d_f < tau:
                detections.append(ini)
                objects.append(inj)
    return detections, objects


# read the two graph files
data1 = read_txt('graph1.txt')  # detections graph
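# Both graph files are plain-text edge lists with one edge per line in the
# format "node1 node2 distance" (whitespace-separated). The values below are
# purely illustrative:
#   1 2 0.35
#   1 3 0.80
#   2 3 0.52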
data2 = read_txt('graph2.txt')  # objects graph
detections, objects = matching(data1, data2)
assert len(detections) == len(objects)
# print the matched pairs; np.int was removed in recent NumPy, use builtin int
for det, obj in zip(detections, objects):
    print("detection {} in local map matches with object {} in global map".format(int(det), int(obj)))
--------------------------------------------------------------------------------