├── LICENSE ├── README.md ├── images ├── Example_vis.jpg ├── Example_vis2.png └── ROS_demo.gif ├── lidar_msgs ├── CMakeLists.txt ├── include │ └── lidar_msgs │ │ ├── point_types.h │ │ └── vehicle_data.h ├── msg │ ├── future_masses.msg │ ├── masses.msg │ ├── point.msg │ ├── pointCloud.msg │ ├── prediction.msg │ └── vehicle_state.msg ├── package.xml └── src │ └── vehicle_data.cpp ├── lidar_pkg ├── CMakeLists.txt ├── cmake │ └── Modules │ │ └── FindTensorFlow.cmake ├── launch │ ├── Lidar_stack_with_tensorflow.launch │ ├── Lidar_stack_with_torch.launch │ └── launch_all.launch ├── nodelet_plugins.xml ├── package.xml ├── rviz │ ├── tensorflow_viz.rviz │ └── torch_viz.rviz └── src │ ├── aggregate_points.cpp │ ├── aggregate_points.h │ ├── frame_broadcaster.cpp │ ├── inference_tf.cpp │ ├── inference_tf.h │ ├── inference_torch.cpp │ ├── inference_torch.h │ ├── inference_torch_test.cpp │ ├── mrf_ground_seg.cpp │ ├── mrf_ground_seg.h │ ├── occupancy_grid_generation.cpp │ ├── occupancy_grid_generation.h │ ├── utils.cpp │ ├── utils.h │ ├── visualize.cpp │ └── visualize.h ├── models ├── prednet.pt ├── prednet_with_saa.pt └── prednet_with_taa.pt └── pcl_tf_frame ├── CMakeLists.txt ├── package.xml └── src └── frame_broadcaster.cpp /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2021 Stanford Intelligent Systems Laboratory 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # ROS Occupancy Grid Prediction 2 | 3 | This package contains a ROS C++ Occupancy Grid Prediction framework that includes point cloud preprocessing, ground segmentation, occupancy grid generation, and occupancy grid prediction. The pipeline follows the approach defined by Itkina et al. [1]. The package is compatible with models trained in Tensorflow and PyTorch, provided as protocol buffers (.pb) and TorchScript (.pt) files, respectively. It contains PredNet [1], PredNet with TAAConvLSTM, and PredNet with SAAConvLSTM [2] models trained on the KITTI dataset [3]. Point clouds can be provided in the form of a rosbag or directly from the robot's Lidar sensors. This package does not contain the Tensorflow C++ API, the LibTorch (PyTorch C++) API, or the rosbags. The package is fully compatible with the [Ford AV Dataset](https://avdata.ford.com/) [4]. 
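The models are loaded at start-up by the inference nodelets (`inference_torch.cpp` for TorchScript, `inference_tf.cpp` for Tensorflow graphs). For reference, the snippet below is a minimal, self-contained sketch (not part of the package) of how a TorchScript model such as `models/prednet.pt` can be loaded and queried with the LibTorch C++ API. It mirrors the calls made in `lidar_pkg/src/inference_torch.cpp` (`torch::jit::load`, moving the module to CUDA, and running `forward` on a `{1, 20, 2, 128, 128}` occupancy-grid tensor); the file path and the standalone `main()` are illustrative only.

```cpp
// Minimal sketch (not part of the package): load a TorchScript model with the
// LibTorch C++ API and run a single forward pass, mirroring inference_torch.cpp.
// The model path below is illustrative; point it at one of the files in models/.
#include <torch/script.h>
#include <torch/torch.h>

#include <iostream>
#include <string>
#include <vector>

int main()
{
  const std::string model_path = "models/prednet.pt";

  try {
    // Deserialize the TorchScript module and move it to the GPU if one is available.
    torch::jit::script::Module module = torch::jit::load(model_path);
    const torch::Device device = torch::cuda::is_available()
                                     ? torch::Device(torch::kCUDA)
                                     : torch::Device(torch::kCPU);
    module.to(device);
    module.eval();

    // The PredNet models operate on a {batch, time, channels, height, width}
    // tensor: 20 time steps, 2 channels (occupied and free masses), and a
    // 128 x 128 grid, matching the input shape used in inference_torch.cpp.
    torch::NoGradGuard no_grad;
    std::vector<torch::jit::IValue> inputs;
    inputs.push_back(torch::zeros({1, 20, 2, 128, 128}).to(device));

    const torch::Tensor output = module.forward(inputs).toTensor();
    std::cout << "Prediction tensor sizes: " << output.sizes() << std::endl;
  } catch (const c10::Error& e) {
    std::cerr << "Failed to load or run the model: " << e.what() << std::endl;
    return 1;
  }
  return 0;
}
```

Such a standalone program can be built against LibTorch the same way `lidar_pkg` is, i.e. with `find_package(Torch REQUIRED)` and linking `${TORCH_LIBRARIES}` in CMake.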
4 | 5 | ## ROS Lidar point cloud compatibility 6 | 7 | Frame and topic names are compatible with the Ford AV Dataset. To ensure compatibility with other rosbags, the raw Lidar point cloud topics need to be renamed in aggregate_points.cpp to match the topics and the number of Lidar sensors in the rosbag. The message type used to represent the Lidar point cloud, as well as all other messages, is defined in lidar_msgs. 8 | 9 | ## How to run it? 10 | 11 | 1. Install the Ford AV Dataset ROS package and all of its dependencies from the [Ford AV Dataset](https://avdata.ford.com/) website [4] in your catkin workspace. 12 | 13 | 2. Replace multi_lidar_convert.launch in ford_demo/launch with the provided file. 14 | 15 | 3. Provide the path to the prediction model in inference_torch.cpp (or inference_tf.cpp). Example models are provided in the models directory. If a Tensorflow model is used, also define the names of the input and output layers in the lines that follow. 16 | 17 | 4. In the same terminal, launch the nodes. Pick the right launch file depending on whether you are doing inference with Tensorflow or LibTorch. 18 | 19 | ```bash 20 | roslaunch lidar_pkg launch_all.launch 21 | ``` 22 | 5. In a different terminal, play a rosbag from the Ford AV Dataset. Ensure that the naming convention of the topics is correct. 23 | 24 | ```bash 25 | rosbag play --clock path/to/the/rosbag 26 | ``` 27 | 6. In the terminal running the rosbag play command, press SPACE to pause/resume the rosbag. To see the predictions, pause playback (press SPACE). Predictions will appear to the right of the occupancy grid. 28 | 29 | ## Example of the rviz visualization 30 | 31 | ![](images/ROS_demo.gif) 32 | 33 | ## Experiments 34 | 35 | Tested on: 36 | - Ubuntu 18.04 37 | - ROS Melodic 38 | - Tensorflow r2.3 for CUDA 11.0 (compiled from source and linked accordingly in the CMake files; follow the [official Tensorflow instructions](https://www.tensorflow.org/install/source)) 39 | - LibTorch 1.8.0 for CUDA 11.0 (binary downloaded from the [PyTorch website](https://pytorch.org/cppdocs/installing.html)) 40 | - CUDA 11.0 41 | - Nvidia RTX 2080Ti 42 | - Intel i9-9900KF 43 | 44 | Performance: 45 | 46 | | Models | Publish rate using Tensorflow (Hz) | Publish rate using LibTorch (Hz) | 47 | | ------------- |:-------------:| -----:| 48 | | PredNet | 9.7 ± 0.52 | 13.5 ± 0.34 | 49 | | PredNet with TAAConvLSTM | - | 6.5 ± 0.49 | 50 | | PredNet with SAAConvLSTM | - | 5.1 ± 0.45 | 51 | 52 | ## Acknowledgements 53 | The authors would like to acknowledge that this project was made possible by funding from the Ford-Stanford Alliance. The authors thank the Stanford Dynamic Design Lab for providing the Lidar processing stack. 54 | 55 | Maintained by [Bernard Lange](https://web.stanford.edu/~blange/) 56 | 57 | ## References 58 | 59 | [1] Itkina, M., Driggs-Campbell, K. and Kochenderfer, M.J., 2019. Dynamic Environment Prediction in Urban Scenes using Recurrent Representation Learning. In IEEE Intelligent Transportation Systems Conference (ITSC), pp. 2052-2059. 60 | 61 | [2] Lange, B., Itkina, M. and Kochenderfer, M.J., 2020. Attention Augmented ConvLSTM for Environment Prediction. arXiv preprint arXiv:2010.09662. 62 | 63 | [3] Geiger, A., Lenz, P., Stiller, C. and Urtasun, R., 2013. Vision meets robotics: The KITTI dataset. The International Journal of Robotics Research, 32(11), pp. 1231-1237. 64 | 65 | [4] Agarwal, S., Vora, A., Pandey, G., Williams, W., Kourous, H. and McBride, J., 2020. Ford Multi-AV Seasonal Dataset. arXiv preprint arXiv:2003.07969. 
66 | 67 | 68 | 69 | -------------------------------------------------------------------------------- /images/Example_vis.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sisl/ROSOccupancyGridPrediction/7bff581b378a7478ecbd7491d89070d1f86259c9/images/Example_vis.jpg -------------------------------------------------------------------------------- /images/Example_vis2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sisl/ROSOccupancyGridPrediction/7bff581b378a7478ecbd7491d89070d1f86259c9/images/Example_vis2.png -------------------------------------------------------------------------------- /images/ROS_demo.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sisl/ROSOccupancyGridPrediction/7bff581b378a7478ecbd7491d89070d1f86259c9/images/ROS_demo.gif -------------------------------------------------------------------------------- /lidar_msgs/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | cmake_minimum_required(VERSION 2.8.3) 2 | project(lidar_msgs) 3 | 4 | ## Compile as C++11, supported in ROS Kinetic and newer 5 | # add_compile_options(-std=c++11) 6 | 7 | ## Find catkin macros and libraries 8 | ## if COMPONENTS list like find_package(catkin REQUIRED COMPONENTS xyz) 9 | ## is used, also find other catkin packages 10 | find_package(catkin REQUIRED COMPONENTS 11 | pcl_ros 12 | std_msgs 13 | geometry_msgs 14 | sensor_msgs 15 | nav_msgs 16 | roscpp 17 | rospy 18 | message_generation 19 | roslib 20 | pcl_conversions 21 | ) 22 | 23 | ## System dependencies are found with CMake's conventions 24 | # find_package(Boost REQUIRED COMPONENTS system) 25 | #find_package(PkgConfig) 26 | find_package(PCL REQUIRED) 27 | 28 | 29 | ## Uncomment this if the package has a setup.py. This macro ensures 30 | ## modules and global scripts declared therein get installed 31 | ## See http://ros.org/doc/api/catkin/html/user_guide/setup_dot_py.html 32 | # catkin_python_setup() 33 | 34 | ################################################ 35 | ## Declare ROS messages, services and actions ## 36 | ################################################ 37 | 38 | ## To declare and build messages, services or actions from within this 39 | ## package, follow these steps: 40 | ## * Let MSG_DEP_SET be the set of packages whose message types you use in 41 | ## your messages/services/actions (e.g. std_msgs, actionlib_msgs, ...). 42 | ## * In the file package.xml: 43 | ## * add a build_depend tag for "message_generation" 44 | ## * add a build_depend and a run_depend tag for each package in MSG_DEP_SET 45 | ## * If MSG_DEP_SET isn't empty the following dependency has been pulled in 46 | ## but can be declared for certainty nonetheless: 47 | ## * add a run_depend tag for "message_runtime" 48 | ## * In this file (CMakeLists.txt): 49 | ## * add "message_generation" and every package in MSG_DEP_SET to 50 | ## find_package(catkin REQUIRED COMPONENTS ...) 51 | ## * add "message_runtime" and every package in MSG_DEP_SET to 52 | ## catkin_package(CATKIN_DEPENDS ...) 53 | ## * uncomment the add_*_files sections below as needed 54 | ## and list every .msg/.srv/.action file to be processed 55 | ## * uncomment the generate_messages entry below 56 | ## * add every package in MSG_DEP_SET to generate_messages(DEPENDENCIES ...) 
57 | 58 | ## Generate messages in the 'msg' folder 59 | add_message_files( 60 | FILES 61 | pointCloud.msg 62 | point.msg 63 | masses.msg 64 | prediction.msg 65 | future_masses.msg 66 | vehicle_state.msg 67 | ) 68 | 69 | add_library(${PROJECT_NAME} 70 | src/vehicle_data.cpp 71 | ) 72 | 73 | ## Generate services in the 'srv' folder 74 | # add_service_files( 75 | # FILES 76 | # Service1.srv 77 | # Service2.srv 78 | # ) 79 | 80 | ## Generate actions in the 'action' folder 81 | # add_action_files( 82 | # FILES 83 | # Action1.action 84 | # Action2.action 85 | # ) 86 | 87 | ## Generate added messages and services with any dependencies listed here 88 | generate_messages( 89 | DEPENDENCIES 90 | std_msgs 91 | nav_msgs 92 | ) 93 | 94 | ################################################ 95 | ## Declare ROS dynamic reconfigure parameters ## 96 | ################################################ 97 | 98 | ## To declare and build dynamic reconfigure parameters within this 99 | ## package, follow these steps: 100 | ## * In the file package.xml: 101 | ## * add a build_depend and a run_depend tag for "dynamic_reconfigure" 102 | ## * In this file (CMakeLists.txt): 103 | ## * add "dynamic_reconfigure" to 104 | ## find_package(catkin REQUIRED COMPONENTS ...) 105 | ## * uncomment the "generate_dynamic_reconfigure_options" section below 106 | ## and list every .cfg file to be processed 107 | 108 | ## Generate dynamic reconfigure parameters in the 'cfg' folder 109 | # generate_dynamic_reconfigure_options( 110 | # cfg/DynReconf1.cfg 111 | # cfg/DynReconf2.cfg 112 | # ) 113 | 114 | ################################### 115 | ## catkin specific configuration ## 116 | ################################### 117 | ## The catkin_package macro generates cmake config files for your package 118 | ## Declare things to be passed to dependent projects 119 | ## INCLUDE_DIRS: uncomment this if your package contains header files 120 | ## LIBRARIES: libraries you create in this project that dependent projects also need 121 | ## CATKIN_DEPENDS: catkin_packages dependent projects also need 122 | ## DEPENDS: system dependencies of this project that dependent projects also need 123 | catkin_package( 124 | INCLUDE_DIRS include 125 | LIBRARIES lidar_msgs 126 | CATKIN_DEPENDS 127 | std_msgs 128 | geometry_msgs 129 | sensor_msgs 130 | roscpp 131 | rospy 132 | message_runtime 133 | #DEPENDS system_lib 134 | ) 135 | 136 | ########### 137 | ## Build ## 138 | ########### 139 | 140 | ## Specify additional locations of header files 141 | ## Your package locations should be listed before other locations 142 | include_directories( 143 | include 144 | ${PCL_INCLUDE_DIRS} 145 | ${catkin_INCLUDE_DIRS} 146 | ) 147 | link_directories(${PCL_LIBRARY_DIRS}) 148 | add_definitions(${PCL_DEFINITIONS}) 149 | 150 | ## Declare a C++ library 151 | # add_library(${PROJECT_NAME} 152 | # src/${PROJECT_NAME}/lidar_msgs.cpp 153 | # ) 154 | 155 | ## Add cmake target dependencies of the library 156 | ## as an example, code may need to be generated before libraries 157 | ## either from message generation or dynamic reconfigure 158 | add_dependencies(${PROJECT_NAME} ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS}) 159 | 160 | ## Declare a C++ executable 161 | ## With catkin_make all packages are built within a single CMake context 162 | ## The recommended prefix ensures that target names across packages don't collide 163 | # add_executable(${PROJECT_NAME}_node src/lidar_msgs_node.cpp) 164 | #add_executable(aggregate_points src/aggregate_points.cpp) 165 | 166 | 
## Rename C++ executable without prefix 167 | ## The above recommended prefix causes long target names, the following renames the 168 | ## target back to the shorter version for ease of user use 169 | ## e.g. "rosrun someones_pkg node" instead of "rosrun someones_pkg someones_pkg_node" 170 | # set_target_properties(${PROJECT_NAME}_node PROPERTIES OUTPUT_NAME node PREFIX "") 171 | 172 | ## Add cmake target dependencies of the executable 173 | ## same as for the library above 174 | # add_dependencies(${PROJECT_NAME}_node ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS}) 175 | 176 | ## Specify libraries to link a library or executable target against 177 | # target_link_libraries(${PROJECT_NAME}_node 178 | # ${catkin_LIBRARIES} 179 | # ) 180 | #target_link_libraries(aggregate_points ${catkin_LIBRARIES} ${PCL_LIBRARIES}) 181 | 182 | ############# 183 | ## Install ## 184 | ############# 185 | 186 | # all install targets should use catkin DESTINATION variables 187 | # See http://ros.org/doc/api/catkin/html/adv_user_guide/variables.html 188 | 189 | ## Mark executable scripts (Python etc.) for installation 190 | ## in contrast to setup.py, you can choose the destination 191 | # install(PROGRAMS 192 | # scripts/my_python_script 193 | # DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION} 194 | # ) 195 | 196 | ## Mark executables and/or libraries for installation 197 | # install(TARGETS ${PROJECT_NAME} ${PROJECT_NAME}_node 198 | # ARCHIVE DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION} 199 | # LIBRARY DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION} 200 | # RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION} 201 | # ) 202 | 203 | ## Mark cpp header files for installation 204 | # install(DIRECTORY include/${PROJECT_NAME}/ 205 | # DESTINATION ${CATKIN_PACKAGE_INCLUDE_DESTINATION} 206 | # FILES_MATCHING PATTERN "*.h" 207 | # PATTERN ".svn" EXCLUDE 208 | # ) 209 | 210 | ## Mark other files for installation (e.g. launch and bag files, etc.) 
211 | # install(FILES 212 | # # myfile1 213 | # # myfile2 214 | # DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION} 215 | # ) 216 | 217 | ############# 218 | ## Testing ## 219 | ############# 220 | 221 | ## Add gtest based cpp test target and link libraries 222 | # catkin_add_gtest(${PROJECT_NAME}-test test/test_lidar_msgs.cpp) 223 | # if(TARGET ${PROJECT_NAME}-test) 224 | # target_link_libraries(${PROJECT_NAME}-test ${PROJECT_NAME}) 225 | # endif() 226 | 227 | ## Add folders to be run by python nosetests 228 | # catkin_add_nosetests(test) 229 | -------------------------------------------------------------------------------- /lidar_msgs/include/lidar_msgs/point_types.h: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | 4 | namespace lidar_msgs 5 | { 6 | struct PointXYZIDBS 7 | { 8 | PCL_ADD_POINT4D; 9 | // float intensity; 10 | // uint16_t device_id; 11 | // uint16_t beam_id; 12 | // float score; 13 | EIGEN_MAKE_ALIGNED_OPERATOR_NEW 14 | } EIGEN_ALIGN16; 15 | }; 16 | 17 | POINT_CLOUD_REGISTER_POINT_STRUCT(lidar_msgs::PointXYZIDBS, 18 | (float, x, x) 19 | (float, y, y) 20 | (float, z, z)) 21 | 22 | // (float, intensity, intensity) 23 | // (uint16_t, device_id, device_id) 24 | // (uint16_t, beam_id, beam_id) 25 | // (float, score, score) 26 | 27 | typedef lidar_msgs::PointXYZIDBS DDLPointType; 28 | typedef pcl::PointCloud DDLPointCloud; 29 | -------------------------------------------------------------------------------- /lidar_msgs/include/lidar_msgs/vehicle_data.h: -------------------------------------------------------------------------------- 1 | // #pragma once 2 | 3 | #include 4 | #include 5 | 6 | namespace lidar_msgs 7 | { 8 | 9 | class VehicleData{ 10 | 11 | private: 12 | lidar_msgs::vehicle_state ros_vehicleState; 13 | double psi_rad; 14 | double N_m; 15 | double E_m; 16 | public: 17 | 18 | /* Constructors */ 19 | VehicleData(); 20 | VehicleData(const lidar_msgs::vehicle_state& vehicleState); 21 | 22 | /* Setters & Getters */ 23 | void setState(const lidar_msgs::vehicle_state& vehicleState); 24 | 25 | void getPose(double& E_m, double& N_m, double& psi_rad) const; 26 | void getPoseWrtE(double& E_m, double& N_m, double& psi_rad) const; 27 | 28 | /* Miscellaneous */ 29 | }; 30 | 31 | } 32 | -------------------------------------------------------------------------------- /lidar_msgs/msg/future_masses.msg: -------------------------------------------------------------------------------- 1 | #It contains masses for DST occupancy grid calculation and prediction. 2 | #Occ - occupied masses. 3 | #Free - free masses. 4 | #Stored in a row-major order, starting with (0,0). Values in the range [0,1]. 5 | 6 | masses[] predictions 7 | 8 | -------------------------------------------------------------------------------- /lidar_msgs/msg/masses.msg: -------------------------------------------------------------------------------- 1 | #It contains masses for DST occupancy grid calculation and prediction. 2 | #Occ - occupied masses. 3 | #Free - free masses. 4 | #Stored in a row-major order, starting with (0,0). Values in the range [0,1]. 
5 | 6 | Header header # ROS defined header containing timestamp and sequence id 7 | 8 | float32[] occ 9 | float32[] free 10 | int32 width 11 | int32 height 12 | -------------------------------------------------------------------------------- /lidar_msgs/msg/point.msg: -------------------------------------------------------------------------------- 1 | # -------------- /point.msg ---------------- 2 | # Custom ROS message for points 3 | 4 | Header header # ROS defined header containing timestamp and sequence id 5 | 6 | float64 x 7 | float64 y 8 | float64 z 9 | float64 intensity 10 | -------------------------------------------------------------------------------- /lidar_msgs/msg/pointCloud.msg: -------------------------------------------------------------------------------- 1 | # -------------- /pointCloud.msg ---------------- 2 | # Custom ROS message for point clouds 3 | 4 | Header header # ROS defined header containing timestamp and sequence id 5 | 6 | # information about points 7 | 8 | int64 num_points # Number of points defined in points array 9 | lidar_msgs/point[] points # Array of points 10 | 11 | # information about lasers 12 | 13 | float64[] range_m 14 | float64[] theta 15 | float64[] phi 16 | int64[] firing 17 | int64[] sorted 18 | 19 | # information about lidars 20 | 21 | int64 num_lidars 22 | int64[] lidar_id 23 | -------------------------------------------------------------------------------- /lidar_msgs/msg/prediction.msg: -------------------------------------------------------------------------------- 1 | nav_msgs/OccupancyGrid[] prediction 2 | 3 | -------------------------------------------------------------------------------- /lidar_msgs/msg/vehicle_state.msg: -------------------------------------------------------------------------------- 1 | # Custom ROS message for vehicle state information 2 | 3 | Header header # Ros defined header containing timestamp and sequence id 4 | float64 psi_rad # Heading [rad] 5 | float64 N_m # N position [m] 6 | float64 E_m # E position [m] 7 | -------------------------------------------------------------------------------- /lidar_msgs/package.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | lidar_msgs 4 | 0.0.0 5 | The lidar_msgs package 6 | 7 | 8 | 9 | 10 | Gene Lewis 11 | 12 | 13 | 14 | 15 | 16 | TODO 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | Gene Lewis 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | message_generation 42 | std_msgs 43 | geometry_msgs 44 | sensor_msgs 45 | roscpp 46 | rospy 47 | pcl_ros 48 | 49 | 50 | message_generation 51 | std_msgs 52 | geometry_msgs 53 | sensor_msgs 54 | 55 | 56 | 57 | catkin 58 | 59 | 60 | message_runtime 61 | std_msgs 62 | geometry_msgs 63 | sensor_msgs 64 | roscpp 65 | rospy 66 | pcl_ros 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | -------------------------------------------------------------------------------- /lidar_msgs/src/vehicle_data.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | 3 | 4 | namespace lidar_msgs 5 | { 6 | 7 | 8 | VehicleData::VehicleData(){} 9 | 10 | VehicleData::VehicleData(const lidar_msgs::vehicle_state& vehicleState){ 11 | setState(vehicleState); 12 | } 13 | 14 | void VehicleData::setState(const lidar_msgs::vehicle_state& vehicleState){ 15 | this->psi_rad = vehicleState.psi_rad; 16 | this->N_m = vehicleState.N_m; 17 | this->E_m = vehicleState.E_m; 18 | } 19 | 20 | void VehicleData::getPose(double& E_m, double& N_m, 
double& psi_rad) const { 21 | E_m = this->E_m; 22 | N_m = this->N_m; 23 | psi_rad = this->psi_rad; 24 | } 25 | 26 | void VehicleData::getPoseWrtE(double& E_m, double& N_m, double& psi_rad) const { 27 | E_m = this->E_m; 28 | N_m = this->N_m; 29 | psi_rad = this->psi_rad + M_PI_2; 30 | } 31 | 32 | } 33 | -------------------------------------------------------------------------------- /lidar_pkg/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | cmake_minimum_required(VERSION 2.8.3) 2 | set(CMAKE_CXX_STANDARD 14) 3 | project(lidar_pkg) 4 | 5 | 6 | list(APPEND CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake/Modules) 7 | list(APPEND CMAKE_PREFIX_PATH "/usr/local/libtorch") 8 | 9 | find_package(catkin REQUIRED COMPONENTS 10 | lidar_msgs 11 | nodelet 12 | roscpp 13 | lidar_msgs 14 | rospy 15 | std_msgs 16 | geometry_msgs 17 | sensor_msgs 18 | nav_msgs 19 | roslib 20 | pcl_conversions 21 | pcl_ros 22 | message_filters 23 | tf2 24 | tf2_ros 25 | tf 26 | ) 27 | 28 | ## Setup include directories 29 | include_directories(${catkin_INCLUDE_DIRS}) 30 | 31 | 32 | catkin_package( 33 | # INCLUDE_DIRS include 34 | # LIBRARIES nodelet_plugins 35 | # CATKIN_DEPENDS 36 | # lidar_msgs 37 | # roscpp 38 | # rospy 39 | # std_msgs 40 | # DEPENDS system_lib 41 | ) 42 | 43 | ## Create the nodelet tutorial library 44 | add_library(nodelet_plugins 45 | src/aggregate_points.cpp 46 | src/aggregate_points.h 47 | src/mrf_ground_seg.cpp 48 | src/occupancy_grid_generation.cpp 49 | src/occupancy_grid_generation.h 50 | src/inference_tf.cpp 51 | src/inference_tf.h 52 | src/inference_torch.cpp 53 | src/inference_torch.h 54 | src/utils.cpp 55 | src/utils.h 56 | src/visualize.cpp 57 | src/visualize.h 58 | ) 59 | 60 | find_package(Torch REQUIRED) 61 | set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${TORCH_CXX_FLAGS}") 62 | set_property(TARGET nodelet_plugins PROPERTY CXX_STANDARD 14) 63 | 64 | target_link_libraries(nodelet_plugins "${TORCH_LIBRARIES}") 65 | target_link_libraries(nodelet_plugins tensorflow_cc ${catkin_LIBRARIES} ${PROJECT_LIBRARIES} "${TORCH_LIBRARIES}") 66 | 67 | 68 | if(catkin_EXPORTED_LIBRARIES) 69 | add_dependencies(nodelet_plugins ${catkin_EXPORTED_LIBRARIES}) 70 | endif() 71 | 72 | ## Mark the nodelet library for installations 73 | install(TARGETS nodelet_plugins 74 | ARCHIVE DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION} 75 | LIBRARY DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION} 76 | RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}) 77 | 78 | install(FILES nodelet_plugins.xml mynodelet.launch 79 | DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION}) 80 | 81 | # add modules 82 | list(APPEND CMAKE_MODULE_PATH "${PROJECT_SOURCE_DIR}/cmake/Modules") 83 | 84 | # find packages 85 | find_package(TensorFlow REQUIRED) 86 | find_package(Protobuf REQUIRED) 87 | 88 | set(PROJECT_INCLUDE_DIRS ${TensorFlow_INCLUDE_DIRS} ${PROTOBUF_INCLUDE_DIRS}) 89 | set(PROJECT_LIBRARIES ${TensorFlow_LIBRARIES} ${PROTOBUF_LIBRARIES}) 90 | 91 | include_directories(${PROJECT_INCLUDE_DIRS}) 92 | target_link_libraries(nodelet_plugins tensorflow_cc ${PROJECT_LIBRARIES} ${catkin_LIBRARIES}) 93 | -------------------------------------------------------------------------------- /lidar_pkg/cmake/Modules/FindTensorFlow.cmake: -------------------------------------------------------------------------------- 1 | # Locates the tensorFlow library and include directories. 
2 | 3 | include(FindPackageHandleStandardArgs) 4 | unset(TENSORFLOW_FOUND) 5 | 6 | find_path(TensorFlow_INCLUDE_DIR 7 | NAMES 8 | tensorflow/core 9 | tensorflow/cc 10 | third_party 11 | HINTS 12 | /usr/local/include/google/tensorflow 13 | /usr/include/google/tensorflow) 14 | 15 | 16 | find_library(TensorFlow_LIBRARY NAMES tensorflow_all 17 | HINTS 18 | /usr/lib 19 | /usr/local/lib) 20 | 21 | find_library(TensorFlow_LIBRARY NAMES tensorflow_cc 22 | HINTS 23 | /usr/lib 24 | /usr/local/lib) 25 | 26 | find_library(TensorFlow_LIBRARY NAMES tensorflow_framework 27 | HINTS 28 | /usr/lib 29 | /usr/local/lib) 30 | 31 | # set TensorFlow_FOUND 32 | find_package_handle_standard_args(TensorFlow DEFAULT_MSG TensorFlow_INCLUDE_DIR TensorFlow_LIBRARY) 33 | 34 | # set external variables for usage in CMakeLists.txt 35 | if(TENSORFLOW_FOUND) 36 | set(TensorFlow_LIBRARIES ${TensorFlow_LIBRARY}) 37 | set(TensorFlow_INCLUDE_DIRS ${TensorFlow_INCLUDE_DIR}) 38 | endif() 39 | 40 | # hide locals from GUI 41 | mark_as_advanced(TensorFlow_INCLUDE_DIR TensorFlow_LIBRARY) 42 | 43 | #set(CXX_FLAGS “-Wall -undefined dynamic_lookup”) 44 | #set(CMAKE_CXX_FLAGS “${CXX_FLAGS}”) 45 | -------------------------------------------------------------------------------- /lidar_pkg/launch/Lidar_stack_with_tensorflow.launch: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | --> 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | -------------------------------------------------------------------------------- /lidar_pkg/launch/Lidar_stack_with_torch.launch: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | --> 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | -------------------------------------------------------------------------------- /lidar_pkg/launch/launch_all.launch: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | -------------------------------------------------------------------------------- /lidar_pkg/nodelet_plugins.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | Write things here. 5 | 6 | 7 | 8 | 9 | 10 | Write things here. 11 | 12 | 13 | 14 | 15 | 16 | Write things here. 17 | 18 | 19 | 20 | 21 | 22 | Write things here. 23 | 24 | 25 | 26 | 27 | 28 | Write things here. 29 | 30 | 31 | 32 | 33 | 34 | Write things here. 35 | 36 | 37 | 38 | 39 | 40 | Write things here. 
41 | 42 | 43 | 44 | 45 | -------------------------------------------------------------------------------- /lidar_pkg/package.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | lidar_pkg 4 | 0.0.0 5 | The lidar_pkg package 6 | 7 | 8 | 9 | 10 | ford 11 | 12 | 13 | 14 | 15 | 16 | TODO 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | 42 | 43 | 44 | 45 | 46 | 47 | 48 | 49 | 50 | 51 | catkin 52 | lidar_filter_points 53 | lidar_msgs 54 | nodelet 55 | roscpp 56 | lidar_filter_points 57 | lidar_msgs 58 | nodelet 59 | roscpp 60 | lidar_filter_points 61 | lidar_msgs 62 | nodelet 63 | roscpp 64 | 65 | 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | 76 | -------------------------------------------------------------------------------- /lidar_pkg/rviz/tensorflow_viz.rviz: -------------------------------------------------------------------------------- 1 | Panels: 2 | - Class: rviz/Displays 3 | Help Height: 78 4 | Name: Displays 5 | Property Tree Widget: 6 | Expanded: 7 | - /Global Options1 8 | - /Status1 9 | - /PointCloud21 10 | - /Map1 11 | - /Map1/Status1 12 | - /Map1/Position1 13 | - /Map1/Orientation1 14 | - /Axes1 15 | - /Map2 16 | Splitter Ratio: 0.5 17 | Tree Height: 721 18 | - Class: rviz/Selection 19 | Name: Selection 20 | - Class: rviz/Tool Properties 21 | Expanded: 22 | - /2D Pose Estimate1 23 | - /2D Nav Goal1 24 | - /Publish Point1 25 | Name: Tool Properties 26 | Splitter Ratio: 0.5886790156364441 27 | - Class: rviz/Views 28 | Expanded: 29 | - /Current View1 30 | Name: Views 31 | Splitter Ratio: 0.5 32 | - Class: rviz/Time 33 | Experimental: false 34 | Name: Time 35 | SyncMode: 0 36 | SyncSource: PointCloud2 37 | Preferences: 38 | PromptSaveOnExit: true 39 | Toolbars: 40 | toolButtonStyle: 2 41 | Visualization Manager: 42 | Class: "" 43 | Displays: 44 | - Alpha: 0.5 45 | Cell Size: 1 46 | Class: rviz/Grid 47 | Color: 160; 160; 164 48 | Enabled: true 49 | Line Style: 50 | Line Width: 0.029999999329447746 51 | Value: Lines 52 | Name: Grid 53 | Normal Cell Count: 0 54 | Offset: 55 | X: 0 56 | Y: 0 57 | Z: 0 58 | Plane: XY 59 | Plane Cell Count: 10 60 | Reference Frame: 61 | Value: true 62 | - Alpha: 1 63 | Autocompute Intensity Bounds: true 64 | Autocompute Value Bounds: 65 | Max Value: 10 66 | Min Value: -10 67 | Value: true 68 | Axis: Z 69 | Channel Name: intensity 70 | Class: rviz/PointCloud2 71 | Color: 255; 255; 255 72 | Color Transformer: Intensity 73 | Decay Time: 0 74 | Enabled: true 75 | Invert Rainbow: false 76 | Max Color: 255; 255; 255 77 | Max Intensity: 251 78 | Min Color: 0; 0; 0 79 | Min Intensity: 11 80 | Name: PointCloud2 81 | Position Transformer: XYZ 82 | Queue Size: 10 83 | Selectable: true 84 | Size (Pixels): 3 85 | Size (m): 0.05999999865889549 86 | Style: Flat Squares 87 | Topic: /OccupancyGridGeneration/mrf_filtered_points 88 | Unreliable: false 89 | Use Fixed Frame: true 90 | Use rainbow: true 91 | Value: true 92 | - Alpha: 0.699999988079071 93 | Class: rviz/Map 94 | Color Scheme: map 95 | Draw Behind: false 96 | Enabled: true 97 | Name: Map 98 | Topic: /InferenceTF/occupancy 99 | Unreliable: false 100 | Use Timestamp: false 101 | Value: true 102 | - Class: rviz/Axes 103 | Enabled: true 104 | Length: 1.5 105 | Name: Axes 106 | Radius: 0.5 107 | Reference Frame: vehicle_cg_cartesian 108 | Value: true 109 | - Alpha: 0.699999988079071 110 | Class: rviz/Map 111 | Color Scheme: map 112 | Draw Behind: false 113 | Enabled: true 114 | Name: Map 115 | Topic: 
/Visualize/visualization 116 | Unreliable: false 117 | Use Timestamp: false 118 | Value: true 119 | Enabled: true 120 | Global Options: 121 | Background Color: 48; 48; 48 122 | Default Light: true 123 | Fixed Frame: vehicle_cg_cartesian 124 | Frame Rate: 30 125 | Name: root 126 | Tools: 127 | - Class: rviz/Interact 128 | Hide Inactive Objects: true 129 | - Class: rviz/MoveCamera 130 | - Class: rviz/Select 131 | - Class: rviz/FocusCamera 132 | - Class: rviz/Measure 133 | - Class: rviz/SetInitialPose 134 | Theta std deviation: 0.2617993950843811 135 | Topic: /initialpose 136 | X std deviation: 0.5 137 | Y std deviation: 0.5 138 | - Class: rviz/SetGoal 139 | Topic: /move_base_simple/goal 140 | - Class: rviz/PublishPoint 141 | Single click: true 142 | Topic: /clicked_point 143 | Value: true 144 | Views: 145 | Current: 146 | Class: rviz/Orbit 147 | Distance: 175.34107971191406 148 | Enable Stereo Rendering: 149 | Stereo Eye Separation: 0.05999999865889549 150 | Stereo Focal Distance: 1 151 | Swap Stereo Eyes: false 152 | Value: false 153 | Focal Point: 154 | X: 15.894294738769531 155 | Y: -8.109152793884277 156 | Z: 11.52355670928955 157 | Focal Shape Fixed Size: true 158 | Focal Shape Size: 0.05000000074505806 159 | Invert Z Axis: false 160 | Name: Current View 161 | Near Clip Distance: 0.009999999776482582 162 | Pitch: 0.7497983574867249 163 | Target Frame: 164 | Value: Orbit (rviz) 165 | Yaw: 4.698549270629883 166 | Saved: ~ 167 | Window Geometry: 168 | Displays: 169 | collapsed: false 170 | Height: 1018 171 | Hide Left Dock: false 172 | Hide Right Dock: true 173 | QMainWindow State: 000000ff00000000fd0000000400000000000001560000035cfc0200000008fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb000000100044006900730070006c006100790073010000003d0000035c000000c900fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261000000010000010f00000259fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a00560069006500770073000000003d00000259000000a400fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000003dc0000003efc0100000002fb0000000800540069006d00650100000000000003dc000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000002800000035c00000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000 174 | Selection: 175 | collapsed: false 176 | Time: 177 | collapsed: false 178 | Tool Properties: 179 | collapsed: false 180 | Views: 181 | collapsed: true 182 | Width: 988 183 | X: 932 184 | Y: 27 185 | -------------------------------------------------------------------------------- /lidar_pkg/rviz/torch_viz.rviz: -------------------------------------------------------------------------------- 1 | Panels: 2 | - Class: rviz/Displays 3 | Help Height: 78 4 | Name: Displays 5 | Property Tree 
Widget: 6 | Expanded: 7 | - /Global Options1 8 | - /Status1 9 | - /PointCloud21 10 | - /Map1 11 | - /Map1/Position1 12 | - /Map1/Orientation1 13 | - /Axes1 14 | - /Map2 15 | Splitter Ratio: 0.5 16 | Tree Height: 721 17 | - Class: rviz/Selection 18 | Name: Selection 19 | - Class: rviz/Tool Properties 20 | Expanded: 21 | - /2D Pose Estimate1 22 | - /2D Nav Goal1 23 | - /Publish Point1 24 | Name: Tool Properties 25 | Splitter Ratio: 0.5886790156364441 26 | - Class: rviz/Views 27 | Expanded: 28 | - /Current View1 29 | Name: Views 30 | Splitter Ratio: 0.5 31 | - Class: rviz/Time 32 | Experimental: false 33 | Name: Time 34 | SyncMode: 0 35 | SyncSource: "" 36 | Preferences: 37 | PromptSaveOnExit: true 38 | Toolbars: 39 | toolButtonStyle: 2 40 | Visualization Manager: 41 | Class: "" 42 | Displays: 43 | - Alpha: 0.5 44 | Cell Size: 1 45 | Class: rviz/Grid 46 | Color: 160; 160; 164 47 | Enabled: true 48 | Line Style: 49 | Line Width: 0.029999999329447746 50 | Value: Lines 51 | Name: Grid 52 | Normal Cell Count: 0 53 | Offset: 54 | X: 0 55 | Y: 0 56 | Z: 0 57 | Plane: XY 58 | Plane Cell Count: 10 59 | Reference Frame: 60 | Value: true 61 | - Alpha: 1 62 | Autocompute Intensity Bounds: true 63 | Autocompute Value Bounds: 64 | Max Value: 10 65 | Min Value: -10 66 | Value: true 67 | Axis: Z 68 | Channel Name: intensity 69 | Class: rviz/PointCloud2 70 | Color: 255; 255; 255 71 | Color Transformer: Intensity 72 | Decay Time: 0 73 | Enabled: false 74 | Invert Rainbow: false 75 | Max Color: 255; 255; 255 76 | Max Intensity: 220 77 | Min Color: 0; 0; 0 78 | Min Intensity: 11 79 | Name: PointCloud2 80 | Position Transformer: XYZ 81 | Queue Size: 10 82 | Selectable: true 83 | Size (Pixels): 3 84 | Size (m): 0.05999999865889549 85 | Style: Flat Squares 86 | Topic: /MrfGroundSeg/agg_points 87 | Unreliable: false 88 | Use Fixed Frame: true 89 | Use rainbow: true 90 | Value: false 91 | - Alpha: 0.699999988079071 92 | Class: rviz/Map 93 | Color Scheme: map 94 | Draw Behind: false 95 | Enabled: true 96 | Name: Map 97 | Topic: /InferenceTorch/occupancy 98 | Unreliable: false 99 | Use Timestamp: false 100 | Value: true 101 | - Class: rviz/Axes 102 | Enabled: true 103 | Length: 1.5 104 | Name: Axes 105 | Radius: 0.5 106 | Reference Frame: global_pcl_frame 107 | Value: true 108 | - Alpha: 0.699999988079071 109 | Class: rviz/Map 110 | Color Scheme: map 111 | Draw Behind: false 112 | Enabled: true 113 | Name: Map 114 | Topic: /Visualize/visualization 115 | Unreliable: false 116 | Use Timestamp: false 117 | Value: true 118 | Enabled: true 119 | Global Options: 120 | Background Color: 48; 48; 48 121 | Default Light: true 122 | Fixed Frame: global_pcl_frame 123 | Frame Rate: 30 124 | Name: root 125 | Tools: 126 | - Class: rviz/Interact 127 | Hide Inactive Objects: true 128 | - Class: rviz/MoveCamera 129 | - Class: rviz/Select 130 | - Class: rviz/FocusCamera 131 | - Class: rviz/Measure 132 | - Class: rviz/SetInitialPose 133 | Theta std deviation: 0.2617993950843811 134 | Topic: /initialpose 135 | X std deviation: 0.5 136 | Y std deviation: 0.5 137 | - Class: rviz/SetGoal 138 | Topic: /move_base_simple/goal 139 | - Class: rviz/PublishPoint 140 | Single click: true 141 | Topic: /clicked_point 142 | Value: true 143 | Views: 144 | Current: 145 | Class: rviz/Orbit 146 | Distance: 123.50437927246094 147 | Enable Stereo Rendering: 148 | Stereo Eye Separation: 0.05999999865889549 149 | Stereo Focal Distance: 1 150 | Swap Stereo Eyes: false 151 | Value: false 152 | Focal Point: 153 | X: 22.355911254882812 154 | Y: -1.618168830871582 
155 | Z: -9.910804748535156 156 | Focal Shape Fixed Size: true 157 | Focal Shape Size: 0.05000000074505806 158 | Invert Z Axis: false 159 | Name: Current View 160 | Near Clip Distance: 0.009999999776482582 161 | Pitch: 1.5697963237762451 162 | Target Frame: 163 | Value: Orbit (rviz) 164 | Yaw: 3.1517174243927 165 | Saved: ~ 166 | Window Geometry: 167 | Displays: 168 | collapsed: true 169 | Height: 1018 170 | Hide Left Dock: true 171 | Hide Right Dock: true 172 | QMainWindow State: 000000ff00000000fd0000000400000000000001560000035cfc0200000008fb0000001200530065006c0065006300740069006f006e00000001e10000009b0000005c00fffffffb0000001e0054006f006f006c002000500072006f007000650072007400690065007302000001ed000001df00000185000000a3fb000000120056006900650077007300200054006f006f02000001df000002110000018500000122fb000000200054006f006f006c002000500072006f0070006500720074006900650073003203000002880000011d000002210000017afb000000100044006900730070006c006100790073000000003d0000035c000000c900fffffffb0000002000730065006c0065006300740069006f006e00200062007500660066006500720200000138000000aa0000023a00000294fb00000014005700690064006500530074006500720065006f02000000e6000000d2000003ee0000030bfb0000000c004b0069006e0065006300740200000186000001060000030c00000261000000010000010f00000259fc0200000003fb0000001e0054006f006f006c002000500072006f00700065007200740069006500730100000041000000780000000000000000fb0000000a00560069006500770073000000003d00000259000000a400fffffffb0000001200530065006c0065006300740069006f006e010000025a000000b200000000000000000000000200000490000000a9fc0100000001fb0000000a00560069006500770073030000004e00000080000002e10000019700000003000002eb0000003efc0100000002fb0000000800540069006d00650100000000000002eb000002eb00fffffffb0000000800540069006d00650100000000000004500000000000000000000002eb0000035c00000004000000040000000800000008fc0000000100000002000000010000000a0054006f006f006c00730100000000ffffffff0000000000000000 173 | Selection: 174 | collapsed: false 175 | Time: 176 | collapsed: false 177 | Tool Properties: 178 | collapsed: false 179 | Views: 180 | collapsed: true 181 | Width: 747 182 | X: 64 183 | Y: 32 184 | -------------------------------------------------------------------------------- /lidar_pkg/src/aggregate_points.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include "aggregate_points.h" 3 | 4 | PLUGINLIB_EXPORT_CLASS(lidar_pkg::AggregatePoints, nodelet::Nodelet) 5 | 6 | namespace lidar_pkg 7 | { 8 | void AggregatePoints::onInit() 9 | { 10 | ROS_INFO("Initialized AggregatePoints Nodelet"); 11 | ros::NodeHandle& private_nh = getPrivateNodeHandle(); 12 | pub = private_nh.advertise>("agg_points",1); //pcl::PointCloud sensor_msgs::PointCloud2 13 | //filter.setVehicle(vehicle_msg); //??????????????????? 
14 | 15 | yellow_sub = private_nh.subscribe("velo_yellow_points", 1, &AggregatePoints::yellow_cb, this); 16 | red_sub = private_nh.subscribe("velo_red_points", 1, &AggregatePoints::red_cb, this); 17 | blue_sub = private_nh.subscribe("velo_blue_points", 1, &AggregatePoints::blue_cb, this); 18 | green_sub = private_nh.subscribe("velo_green_points", 1, &AggregatePoints::green_cb, this); 19 | timer = private_nh.createTimer(ros::Duration(0.1),&AggregatePoints::timerCallback, this); 20 | } 21 | 22 | void AggregatePoints::timerCallback(const ros::TimerEvent&) 23 | { 24 | pub.publish(agg_cloud); 25 | } 26 | 27 | void AggregatePoints::aggregate_cloud() 28 | { 29 | agg_cloud = yellow_cloud; 30 | this->agg_cloud += this->red_cloud; 31 | this->agg_cloud += this->green_cloud; 32 | this->agg_cloud += this->blue_cloud; 33 | } 34 | 35 | void AggregatePoints::yellow_cb(const sensor_msgs::PointCloud2::ConstPtr &pcMsg) 36 | { 37 | pcl::PCLPointCloud2 pcl_pc2; 38 | pcl_conversions::toPCL(*pcMsg,pcl_pc2); 39 | pcl::PointCloud::Ptr temp_cloud(new pcl::PointCloud); 40 | pcl::fromPCLPointCloud2(pcl_pc2,*temp_cloud); 41 | pcl_ros::transformPointCloud("body", ros::Time::now(), *temp_cloud, "map",yellow_cloud,tfListener); 42 | aggregate_cloud(); 43 | } 44 | 45 | void AggregatePoints::red_cb(const sensor_msgs::PointCloud2::ConstPtr &pcMsg) 46 | { 47 | pcl::PCLPointCloud2 pcl_pc2; 48 | pcl_conversions::toPCL(*pcMsg,pcl_pc2); 49 | pcl::PointCloud::Ptr temp_cloud(new pcl::PointCloud); 50 | pcl::fromPCLPointCloud2(pcl_pc2,*temp_cloud); 51 | pcl_ros::transformPointCloud("body", ros::Time::now(), *temp_cloud, "map",red_cloud,tfListener); 52 | aggregate_cloud(); 53 | } 54 | 55 | void AggregatePoints::blue_cb(const sensor_msgs::PointCloud2::ConstPtr &pcMsg) 56 | { 57 | pcl::PCLPointCloud2 pcl_pc2; 58 | pcl_conversions::toPCL(*pcMsg,pcl_pc2); 59 | pcl::PointCloud::Ptr temp_cloud(new pcl::PointCloud); 60 | pcl::fromPCLPointCloud2(pcl_pc2,*temp_cloud); 61 | pcl_ros::transformPointCloud("body", ros::Time::now(), *temp_cloud, "map",blue_cloud,tfListener); 62 | aggregate_cloud(); 63 | } 64 | 65 | void AggregatePoints::green_cb(const sensor_msgs::PointCloud2::ConstPtr &pcMsg) 66 | { 67 | pcl::PCLPointCloud2 pcl_pc2; 68 | pcl_conversions::toPCL(*pcMsg,pcl_pc2); 69 | pcl::PointCloud::Ptr temp_cloud(new pcl::PointCloud); 70 | pcl::fromPCLPointCloud2(pcl_pc2,*temp_cloud); 71 | pcl_ros::transformPointCloud("body", ros::Time::now(), *temp_cloud, "map",green_cloud,tfListener); 72 | aggregate_cloud(); 73 | } 74 | 75 | // void CloudFilter::filter(sensor_msgs::PointCloud2 &cloud) 76 | // { 77 | // ROS_INFO("Filtering..."); 78 | // this->filt_cloud.header = cloud.header; 79 | // this->filt_cloud.clear(); 80 | // 81 | // bool include; 82 | // for(DDLPointCloud::iterator it = cloud.begin(); it != cloud.end(); it++){ 83 | // bool include = true; 84 | // //if(this->device_id >= 0) include = include && isDeviceAndBeam(*it); 85 | // //if(this->min_intensity > 0) include = include && isAboveIntensity(*it); 86 | // //if(this->max_height > 0) include = include && isBelowHeight(*it); 87 | // // include = include && isAboveHeight(*it); 88 | // // include = include && isLocallyBelowHeight(*it); 89 | // // include = include && isBelowMaxDistance(*it); 90 | // if(include) this->filt_cloud.push_back(*it); 91 | // } 92 | // cloud = this->filt_cloud; 93 | // } 94 | 95 | // bool CloudFilter::isDeviceAndBeam(sensor_msgs::PointCloud2 pt){ 96 | // if(pt.device_id == this->device_id){ 97 | // if(this->beam_id < 0 || pt.beam_id == this->beam_id){ 98 | // return 
true; 99 | // } 100 | // } 101 | // return false; 102 | // } 103 | 104 | // bool CloudFilter::isAboveIntensity(sensor_msgs::PointCloud2 pt){ 105 | // if(pt.intensity > this->min_intensity){ 106 | // return true; 107 | // } 108 | // return false; 109 | // } 110 | 111 | // bool CloudFilter::isAboveDistance(sensor_msgs::PointCloud2 pt){ 112 | // double x_dist = pt.x - E_m; 113 | // double y_dist = pt.y - N_m; 114 | // if(x_dist*x_dist + y_dist*y_dist > this->min_distance*this->min_distance){ 115 | // return true; 116 | // } 117 | // return false; 118 | // } 119 | // 120 | // bool CloudFilter::isBelowHeight(sensor_msgs::PointCloud2 pt){ 121 | // if(pt.z < this->max_height) return true; 122 | // return false; 123 | // } 124 | // 125 | // bool CloudFilter::isAboveHeight(sensor_msgs::PointCloud2 pt){ 126 | // if(pt.z > this->min_height) return true; 127 | // return false; 128 | // } 129 | // 130 | // bool CloudFilter::isLocallyBelowHeight(sensor_msgs::PointCloud2 pt){ 131 | // if(!isAboveDistance(pt) && pt.z >= this->max_local_height){ 132 | // return false; 133 | // sensor_msgs::PointCloud22 134 | // return true; 135 | // } 136 | // 137 | // bool CloudFilter::isBelowMaxDistance(sensor_msgs::PointCloud2 pt){ 138 | // double x_dist = pt.x - E_m; 139 | // double y_dist = pt.y - N_m; 140 | // if(x_dist*x_dist + y_dist*y_dist < this->max_distance*this->max_distance){ 141 | // return true; 142 | // } 143 | // return false; 144 | // } 145 | 146 | // void CloudFilter::setVehicle(lidar_msgs::VehicleData vehicle){ 147 | // this->vehicle = vehicle; 148 | // double psi_rad; 149 | // this->vehicle.getPose(this->E_m, this->N_m, psi_rad); 150 | // // ROS_INFO("GETTING POSE in CloudFilter::setVehicle"); 151 | // // std::cout << "E_m and N_m: " << E_m << " " << N_m << std::endl; 152 | // } 153 | } 154 | -------------------------------------------------------------------------------- /lidar_pkg/src/aggregate_points.h: -------------------------------------------------------------------------------- 1 | #include "ros/ros.h" 2 | #include 3 | #include 4 | #include 5 | #include 6 | #include 7 | #include 8 | 9 | #include 10 | #include 11 | #include 12 | #include 13 | #include 14 | 15 | #include 16 | #include 17 | #include 18 | #include 19 | #include 20 | 21 | namespace lidar_pkg 22 | { 23 | class CloudFilter 24 | { 25 | public: 26 | CloudFilter(){}; 27 | CloudFilter(int min_intensity, double min_distance, double max_distance, double min_height, double max_height, double max_local_height){ 28 | // this->device_id = device_id; 29 | // this->beam_id = beam_id; 30 | this->min_intensity = min_intensity; 31 | this->min_distance = min_distance; 32 | this->max_distance = max_distance; 33 | this->min_height = min_height; 34 | this->max_height = max_height; 35 | this->max_local_height = max_local_height; 36 | } 37 | 38 | void filter(sensor_msgs::PointCloud2 &cloud); 39 | void setVehicle(lidar_msgs::VehicleData vehicle); 40 | 41 | private: 42 | // int device_id; 43 | // int beam_id; 44 | int min_intensity; 45 | double min_distance; 46 | double max_distance; 47 | double min_height; 48 | double max_height; 49 | double max_local_height; 50 | pcl::PointCloud filt_cloud; 51 | 52 | lidar_msgs::VehicleData vehicle; 53 | double E_m = 0.0; 54 | double N_m = 0.0; 55 | 56 | bool isDeviceAndBeam(pcl::PointCloud pt); 57 | bool isAboveIntensity(pcl::PointCloud pt); 58 | bool isAboveDistance(pcl::PointCloud pt); 59 | bool isBelowHeight(pcl::PointCloud pt); 60 | bool isAboveHeight(pcl::PointCloud pt); 61 | bool 
isLocallyBelowHeight(pcl::PointCloud pt); 62 | bool isBelowMaxDistance(pcl::PointCloud pt); 63 | }; 64 | 65 | class AggregatePoints : public nodelet::Nodelet 66 | { 67 | public: 68 | AggregatePoints(){} 69 | private: 70 | ros::Publisher pub; 71 | pcl::PointCloud agg_cloud; 72 | pcl::PointCloud yellow_cloud; 73 | pcl::PointCloud red_cloud; 74 | pcl::PointCloud blue_cloud; 75 | pcl::PointCloud green_cloud; 76 | pcl::PointCloud velo_cloud; 77 | lidar_msgs::VehicleData vehicle_msg; 78 | CloudFilter filter; 79 | tf::TransformListener tfListener; 80 | 81 | ros::Subscriber yellow_sub; 82 | ros::Subscriber red_sub; 83 | ros::Subscriber blue_sub; 84 | ros::Subscriber green_sub; 85 | ros::Subscriber velo_sub; 86 | ros::Subscriber veh_sub; 87 | ros::Timer timer; 88 | tf::TransformListener listener; 89 | 90 | virtual void onInit(); 91 | void aggregate_cloud(); 92 | void yellow_cb(const sensor_msgs::PointCloud2::ConstPtr &pcMsg); 93 | void red_cb(const sensor_msgs::PointCloud2::ConstPtr &pcMsg); 94 | void blue_cb(const sensor_msgs::PointCloud2::ConstPtr &pcMsg); 95 | void green_cb(const sensor_msgs::PointCloud2::ConstPtr &pcMsg); 96 | void velo_cb(const sensor_msgs::PointCloud2::ConstPtr &pcMsg); 97 | void vehicleCallback(const lidar_msgs::vehicle_state& vehicleState); 98 | void timerCallback(const ros::TimerEvent&); 99 | }; 100 | } 101 | -------------------------------------------------------------------------------- /lidar_pkg/src/frame_broadcaster.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | 4 | int main(int argc, char** argv){ 5 | ros::init(argc, argv, "my_tf_broadcaster"); 6 | ros::NodeHandle node; 7 | 8 | ros::Rate rate(10.0); 9 | double change = 0; 10 | while (node.ok()){ 11 | tf::TransformBroadcaster br; 12 | tf::Transform transform; 13 | transform.setRotation( tf::Quaternion(0, 0, 0.7071068, 0.7071068) ); 14 | 15 | br.sendTransform(transform, ros::Time::now(), "body", "global_pcl_frame"); 16 | rate.sleep(); 17 | } 18 | return 0; 19 | }; 20 | -------------------------------------------------------------------------------- /lidar_pkg/src/inference_tf.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include "inference_tf.h" 5 | #include "utils.h" 6 | 7 | #include 8 | #include 9 | #include 10 | #include 11 | #include 12 | #include 13 | 14 | #include "tensorflow/core/public/session.h" 15 | #include "tensorflow/cc/ops/const_op.h" 16 | #include "tensorflow/cc/ops/array_ops.h" 17 | #include "tensorflow/cc/ops/standard_ops.h" 18 | #include "tensorflow/core/framework/graph.pb.h" 19 | #include "tensorflow/core/graph/default_device.h" 20 | #include "tensorflow/core/graph/graph_def_builder.h" 21 | #include "tensorflow/core/lib/core/threadpool.h" 22 | #include "tensorflow/core/lib/io/path.h" 23 | #include "tensorflow/core/lib/strings/stringprintf.h" 24 | #include "tensorflow/core/platform/init_main.h" 25 | #include "tensorflow/core/public/session.h" 26 | #include "tensorflow/core/util/command_line_flags.h" 27 | #include "tensorflow/cc/framework/ops.h" 28 | #include 29 | #include 30 | #include 31 | 32 | PLUGINLIB_EXPORT_CLASS(lidar_pkg::InferenceTF, nodelet::Nodelet) 33 | 34 | namespace lidar_pkg 35 | { 36 | void InferenceTF::onInit() 37 | { 38 | ROS_INFO("Initialized InferenceTF."); 39 | tensorflow::Tensor input(tensorflow::DT_FLOAT, tensorflow::TensorShape({1,20,128,128,2})); 40 | ros::NodeHandle& private_nh = getMTPrivateNodeHandle(); 41 | // Set dirs 
variables 42 | std::string graphPath = "/home/bernard/catkin_ws/src/ROS_OGP/ROSOccupancyGridPrediction/models/prednet_1.pb"; 43 | std::string GRAPH = "prednet_1.pb"; 44 | 45 | // Set input & output nodes names 46 | inputLayer = "input_1_1:0"; 47 | outputLayer = "pred_net_1_1/transpose_1:0"; 48 | 49 | // Load and initialize the model from .pb file 50 | ROS_INFO("Getting the graph..."); 51 | // std::string graphPath = tensorflow::io::JoinPath(ROOTDIR, GRAPH); 52 | ROS_INFO("Loading the graph..."); 53 | tensorflow::Status loadGraphStatus = loadGraph(graphPath, &session); 54 | if (!loadGraphStatus.ok()) { 55 | ROS_INFO("Not loaded"); 56 | //return 0; 57 | } 58 | else 59 | { 60 | ROS_INFO("Loaded"); 61 | } 62 | 63 | pred_pub = private_nh.advertise("prediction", 1); 64 | pred_all_pub = private_nh.advertise("prediction_all", 1); 65 | occupancy_grid_sub = private_nh.subscribe("occupancy", 1, &InferenceTF::occupancy_grid_callback, this); 66 | masses_sub = private_nh.subscribe("masses", 1, &InferenceTF::masses_callback, this); 67 | timer = private_nh.createTimer(ros::Duration(0.001), &InferenceTF::timerCallback, this); 68 | } 69 | 70 | void InferenceTF::timerCallback(const ros::TimerEvent&) 71 | { 72 | if (data_acquired==true) 73 | { 74 | ROS_INFO("Inside the timer"); 75 | 76 | //DEFINE APPROPRAITE GLOBAL VARIABLES 77 | 78 | tensorflow::Tensor input(tensorflow::DT_FLOAT, tensorflow::TensorShape({1,20,128,128,2})); 79 | std::vector output; 80 | 81 | // Create/Fill in a tensor. 82 | 83 | createTensorFromFrames(input); 84 | std::cout << "BEFORE THE SESSION RUNNING" << std::endl; 85 | output.clear(); 86 | tensorflow::Status runStatus = session->Run({{inputLayer, input}}, {outputLayer}, {}, &output); 87 | std::cout << "AFTER THE SESSION RUNNING" << std::endl; 88 | 89 | if (!runStatus.ok()) { 90 | std::cout << "Running model failed: " << runStatus; 91 | //return 0; 92 | } 93 | else { 94 | ROS_INFO("Prediction made"); 95 | } 96 | createPredictionAndPublish(output); 97 | //saveTensor(output); 98 | } 99 | else { 100 | ROS_INFO("Clock is paused not realy but yeah"); 101 | } 102 | } 103 | 104 | void InferenceTF::createPredictionAndPublish(std::vector& output) 105 | { 106 | ROS_INFO("Publishing"); 107 | int grid_size = 128; 108 | prediction_msg.prediction.clear(); 109 | auto output_values = output[0].tensor(); 110 | int probability; 111 | for (unsigned int t = 0; t < 20; t++){ 112 | occupancy_msg.data.clear(); 113 | occupancy_msg.info.resolution = 1./3.; 114 | occupancy_msg.info.width = grid_size; 115 | occupancy_msg.info.height = grid_size; 116 | for (unsigned int i = 0; i < grid_size; i++){ 117 | for (unsigned int j = 0; j < grid_size; j++){ 118 | probability = (int)(100*(0.5*output_values(0,t,i,j,0)+0.5*(1.0 - output_values(0,t,i,j,1)))); 119 | occupancy_msg.data.push_back(probability); 120 | } 121 | } 122 | prediction_msg.prediction.push_back(occupancy_msg); 123 | } 124 | pred_all_pub.publish(prediction_msg); 125 | pred_pub.publish(occupancy_msg); 126 | } 127 | 128 | void InferenceTF::saveTensor(std::vector& output) 129 | { 130 | auto output_values = output[0].tensor(); 131 | for(unsigned int t=0;t<20;t++){ 132 | std::vector occ; 133 | std::vector free; 134 | 135 | for(unsigned int x=0;x<128;x++){ 136 | for(unsigned int y=0;y<128;y++){ 137 | occ.push_back(output_values(0,t,x,y,0)); 138 | free.push_back(output_values(0,t,x,y,1)); 139 | } 140 | } 141 | std::ofstream outfile_occ("/home/ford/catkin_ws/src/lidar/lidar_pkg/src/data/prediction_occ"+std::to_string(t)+".txt"); 142 | for (const auto &e : occ) 
outfile_occ << e << "\n"; 143 | 144 | std::ofstream outfile_free("/home/ford/catkin_ws/src/lidar/lidar_pkg/src/data/prediction_free"+std::to_string(t)+".txt"); 145 | for (const auto &e : free) outfile_free << e << "\n"; 146 | 147 | } 148 | } 149 | 150 | void InferenceTF::createTensorFromFrames(tensorflow::Tensor& input) 151 | { 152 | using namespace::tensorflow::ops; 153 | tensorflow::Status status; 154 | auto root = tensorflow::Scope::NewRootScope(); 155 | std::unique_ptr session(tensorflow::NewSession({})); 156 | tensorflow::GraphDef graph; 157 | bool past = true; 158 | auto input_map = input.tensor(); 159 | lidar_msgs::masses data; 160 | ROS_INFO("STARTED_SAVING"); 161 | for(unsigned int t = 0; t<20;t++){ 162 | if (t<5){ 163 | data = history_m[t]; 164 | //saveFrame(data,t); 165 | } 166 | else { 167 | past = false; 168 | } 169 | int count = 0; 170 | for(unsigned int x = 0; x<128;x++){ 171 | for(unsigned int y = 0; y<128;y++){ 172 | if(past==true){ 173 | input_map(0,t,x,y,0) = (float)data.occ[count]; //check the ordering + add depth. 174 | input_map(0,t,x,y,1) = (float)data.free[count]; 175 | } 176 | else{ 177 | //input_map(0,t,x,y,0) = 0.0; 178 | //input_map(0,t,x,y,1) = 0.0; 179 | } 180 | count++; 181 | } 182 | } 183 | } 184 | //history_m.clear(); 185 | //history_n.clear(); 186 | } 187 | 188 | void InferenceTF::saveFrame(lidar_msgs::masses& data, int t) 189 | { 190 | std::vector occ; 191 | occ = data.occ; 192 | std::vector free; 193 | free = data.free; 194 | std::ofstream outfile_occ("/home/ford/catkin_ws/src/lidar/lidar_pkg/src/data/past_occ"+std::to_string(t)+".txt"); 195 | for (const auto &e : occ) outfile_occ << e << "\n"; 196 | 197 | std::ofstream outfile_free("/home/ford/catkin_ws/src/lidar/lidar_pkg/src/data/past_free"+std::to_string(t)+".txt"); 198 | for (const auto &e : free) outfile_free << e << "\n"; 199 | 200 | } 201 | 202 | void InferenceTF::occupancy_grid_callback(const nav_msgs::OccupancyGrid::ConstPtr& occupancy) 203 | { 204 | ROS_INFO("Data collected"); 205 | time++; 206 | nav_msgs::OccupancyGrid zeross; 207 | //zeross.data = occupancy.data; 208 | if (time <= 5){ 209 | history_n.push_back(*occupancy); 210 | } 211 | else { 212 | data_acquired = true; 213 | for (unsigned int i=0; i<=3; i++){ 214 | history_n[i] = history_n[i+1]; 215 | } 216 | history_n[4]=(*occupancy); 217 | } 218 | ROS_INFO("Done with Data collected"); 219 | } 220 | 221 | void InferenceTF::masses_callback(const lidar_msgs::masses::ConstPtr& masses_data) 222 | { 223 | ROS_INFO("Data collected Masses"); 224 | //time++; 225 | nav_msgs::OccupancyGrid zeross; 226 | //zeross.data = occupancy.data; 227 | if (time <= 5){ 228 | history_m.push_back(*masses_data); 229 | } 230 | else { 231 | data_acquired = true; 232 | for (unsigned int i=0; i<=3; i++){ 233 | history_m[i] = history_m[i+1]; 234 | } 235 | history_m[4] = (*masses_data); 236 | } 237 | ROS_INFO("Done with Data collected"); 238 | } 239 | } 240 | -------------------------------------------------------------------------------- /lidar_pkg/src/inference_tf.h: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include 5 | #include 6 | #include 7 | 8 | #include "tensorflow/core/public/session.h" 9 | #include "tensorflow/cc/ops/const_op.h" 10 | #include "tensorflow/cc/ops/array_ops.h" 11 | #include "tensorflow/cc/ops/standard_ops.h" 12 | #include "tensorflow/core/framework/graph.pb.h" 13 | #include "tensorflow/core/graph/default_device.h" 14 | #include 
"tensorflow/core/graph/graph_def_builder.h" 15 | #include "tensorflow/core/lib/core/threadpool.h" 16 | #include "tensorflow/core/lib/io/path.h" 17 | #include "tensorflow/core/lib/strings/stringprintf.h" 18 | #include "tensorflow/core/platform/init_main.h" 19 | #include "tensorflow/core/public/session.h" 20 | #include "tensorflow/core/util/command_line_flags.h" 21 | #include "tensorflow/cc/framework/ops.h" 22 | 23 | #include 24 | #include 25 | #include 26 | 27 | typedef int DST_mass[128][128][3]; 28 | 29 | namespace lidar_pkg { 30 | class InferenceTF : public nodelet::Nodelet 31 | { 32 | public: 33 | InferenceTF(){}; 34 | private: 35 | ros::Publisher pred_pub; 36 | ros::Publisher pred_all_pub; 37 | ros::Subscriber occupancy_grid_sub; 38 | ros::Subscriber masses_sub; 39 | ros::Timer timer; 40 | std::vector> history; 41 | std::vector history_n; 42 | std::vector history_m; 43 | std::unique_ptr session; 44 | std::string inputLayer; 45 | std::string outputLayer; 46 | nav_msgs::OccupancyGrid occupancy_msg; 47 | lidar_msgs::masses masses_msg; 48 | lidar_msgs::prediction prediction_msg; 49 | int time=0; 50 | bool data_acquired = false; 51 | virtual void onInit(); 52 | void timerCallback(const ros::TimerEvent&); 53 | void createTensorFromFrames(tensorflow::Tensor& input); 54 | void occupancy_grid_callback(const nav_msgs::OccupancyGrid::ConstPtr& occupancy); 55 | void createPredictionAndPublish(std::vector& output); 56 | void masses_callback(const lidar_msgs::masses::ConstPtr& masses_data); 57 | void publishPrediction(); 58 | void saveFrame(lidar_msgs::masses& data, int t); 59 | void saveTensor(std::vector& output); 60 | }; 61 | } 62 | -------------------------------------------------------------------------------- /lidar_pkg/src/inference_torch.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include "inference_torch.h" 5 | 6 | #include 7 | #include 8 | #include 9 | #include 10 | #include 11 | #include 12 | #include 13 | #include // One-stop header. 14 | 15 | #include 16 | #include 17 | #include 18 | 19 | #include 20 | #include 21 | 22 | PLUGINLIB_EXPORT_CLASS(lidar_pkg::InferenceTorch, nodelet::Nodelet) 23 | 24 | using namespace std::chrono; 25 | 26 | namespace lidar_pkg 27 | { 28 | void InferenceTorch::onInit() 29 | { 30 | ROS_INFO("Initialized PyTorch Inference."); 31 | 32 | if (torch::cuda::is_available()) { 33 | ROS_INFO("CUDA available! 
Inference on GPU."); 34 | //device = torch::Device(torch::kCUDA); 35 | } else { 36 | ROS_INFO("Inference on CPU."); 37 | // device = torch::Device(torch::kCPU); 38 | } 39 | 40 | ros::NodeHandle& private_nh = getMTPrivateNodeHandle(); 41 | 42 | // Path to the model 43 | std::string jitPath = "/home/bernard/catkin_ws/src/ROS_OGP/ROSOccupancyGridPrediction/models/prednet_with_taa.pt"; 44 | 45 | // Loading the model 46 | ROS_INFO("Loading the module..."); 47 | try { 48 | module = std::make_shared (torch::jit::load(jitPath)); 49 | module->to(torch::kCUDA); 50 | module->eval(); 51 | torch::NoGradGuard no_grad; 52 | // torch::Tensor input_data = torch::ones({1, 20, 2, 128, 128}); 53 | // std::vector input; 54 | // input.push_back(input_data.to(torch::kCUDA)); 55 | // at::Tensor output = module->forward(input).toTensor(); 56 | 57 | // std::shared_ptr module = torch::jit::load(jitPath); 58 | // module->to(torch::kCUDA); 59 | } 60 | catch (const c10::Error& e) { 61 | std::cerr << "error loading the model\n"; 62 | // Something for errors 63 | } 64 | 65 | ROS_INFO("Loaded!"); 66 | 67 | pred_pub = private_nh.advertise("prediction", 1); 68 | pred_all_pub = private_nh.advertise("prediction_all", 1); 69 | //occupancy_grid_sub = private_nh.subscribe("occupancy", 1, &InferenceTorch::occupancy_grid_callback, this); 70 | masses_sub = private_nh.subscribe("masses", 1, &InferenceTorch::masses_callback, this); 71 | timer = private_nh.createTimer(ros::Duration(0.001), &InferenceTorch::timerCallback, this); //Reduced time but it didn't change much 72 | } 73 | 74 | void InferenceTorch::timerCallback(const ros::TimerEvent&) 75 | { 76 | auto start = high_resolution_clock::now(); 77 | std::vector input; 78 | if (data_acquired==true) 79 | { 80 | //ROS_INFO("Processing masses"); 81 | createTensorFromFrames(input); 82 | } 83 | else 84 | { 85 | // Warm-up for LibTorch. 
First inference is slow 86 | ROS_INFO("Not enough data..."); 87 | torch::Tensor input_data = torch::ones({1, 20, 2, 128, 128}); 88 | input.push_back(input_data.to(torch::kCUDA)); 89 | } 90 | auto stop1 = high_resolution_clock::now(); 91 | //ROS_INFO("Making predictions"); 92 | torch::Tensor output = Infer(input); 93 | auto stop2 = high_resolution_clock::now(); 94 | //ROS_INFO("Publishing the prediction"); 95 | Publish(output); 96 | auto stop3 = high_resolution_clock::now(); 97 | auto duration1 = duration_cast(stop1 - start); 98 | auto duration2 = duration_cast(stop2 - stop1); 99 | auto duration3 = duration_cast(stop3 - stop2); 100 | //std::cout << "Times: " << duration1.count() << " " << duration2.count() << " " << duration3.count() << std::endl; 101 | } 102 | 103 | torch::Tensor InferenceTorch::Infer(const std::vector& input) 104 | { 105 | torch::NoGradGuard no_grad; 106 | torch::Tensor output = module->forward(input).toTensor(); 107 | //std::cout << "Device type: " << (output.device().type()) << std::endl; 108 | //std::cout << "Required grad: " << (output.requires_grad()) << std::endl; 109 | return output; 110 | } 111 | 112 | void InferenceTorch::Publish(torch::Tensor& out) 113 | { 114 | prediction_msg.prediction.clear(); 115 | torch::Device device(torch::kCPU); 116 | torch::Tensor new_out = out.to(device); 117 | 118 | int grid_size = 128; 119 | auto output_values = new_out.accessor(); 120 | int probability; 121 | 122 | for (unsigned int t = 0; t < 20; t++){ 123 | occupancy_msg.data.clear(); 124 | occupancy_msg.info.resolution = 1./3.; 125 | occupancy_msg.info.width = grid_size; 126 | occupancy_msg.info.height = grid_size; 127 | for (unsigned int i = 0; i < grid_size; i++){ 128 | for (unsigned int j = 0; j < grid_size; j++){ 129 | //Modify this accordingly 130 | probability = (int)(100*(0.5*output_values[0][t][0][i][j]+0.5*(1.0 - output_values[0][t][1][i][j]))); 131 | occupancy_msg.data.push_back(probability); 132 | } 133 | } 134 | prediction_msg.prediction.push_back(occupancy_msg); 135 | } 136 | pred_all_pub.publish(prediction_msg); 137 | pred_pub.publish(occupancy_msg); 138 | } 139 | 140 | void InferenceTorch::createTensorFromFrames(std::vector& input) 141 | { 142 | 143 | bool past = true; 144 | torch::Tensor input_data = torch::empty({1, 20, 2, 128, 128}); 145 | auto input_map = input_data.accessor(); 146 | lidar_msgs::masses data; 147 | for(unsigned int t = 0; t<20;t++){ 148 | if (t<5){ 149 | data = history_m[t]; 150 | } 151 | else { 152 | past = false; 153 | } 154 | int count = 0; 155 | 156 | for(unsigned int x = 0; x<128;x++){ 157 | for(unsigned int y = 0; y<128;y++){ 158 | if(past==true){ 159 | input_map[0][t][0][x][y] = (float)data.occ[count]; 160 | input_map[0][t][1][x][y] = (float)data.free[count]; 161 | } 162 | count++; 163 | } 164 | } 165 | } 166 | input.clear(); 167 | input.push_back(input_data.to(torch::kCUDA)); 168 | } 169 | 170 | void InferenceTorch::occupancy_grid_callback(const nav_msgs::OccupancyGrid::ConstPtr& occupancy) 171 | { 172 | // ROS_INFO("Data collected"); 173 | time++; 174 | nav_msgs::OccupancyGrid zeross; 175 | if (time <= 5){ 176 | history_n.push_back(*occupancy); 177 | } 178 | else { 179 | data_acquired = true; 180 | for (unsigned int i=0; i<=3; i++){ 181 | history_n[i] = history_n[i+1]; 182 | } 183 | history_n[4]=(*occupancy); 184 | } 185 | // ROS_INFO("Done with Data collected"); 186 | } 187 | 188 | void InferenceTorch::masses_callback(const lidar_msgs::masses::ConstPtr& masses_data) 189 | { 190 | time++; 191 | nav_msgs::OccupancyGrid zeross; 192 | 
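// history_m acts as a 5-frame sliding window over the most recent mass grids: the first
// five messages fill it, after which the oldest frame is shifted out and the newest one is
// stored at index 4; data_acquired then switches the timer callback from the warm-up dummy
// input to inference on these real frames.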
//zeross.data = occupancy.data; 193 | if (time <= 5){ 194 | history_m.push_back(*masses_data); 195 | } 196 | else { 197 | data_acquired = true; 198 | for (unsigned int i=0; i<=3; i++){ 199 | history_m[i] = history_m[i+1]; 200 | } 201 | history_m[4] = (*masses_data); 202 | } 203 | } 204 | } 205 | -------------------------------------------------------------------------------- /lidar_pkg/src/inference_torch.h: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include 5 | #include 6 | #include 7 | #include 8 | #include // One-stop header. 9 | 10 | #include 11 | #include 12 | #include 13 | 14 | typedef int DST_mass[128][128][3]; 15 | 16 | namespace lidar_pkg { 17 | class InferenceTorch : public nodelet::Nodelet 18 | { 19 | public: 20 | InferenceTorch(){}; 21 | private: 22 | ros::Publisher pred_pub; 23 | ros::Publisher pred_all_pub; 24 | ros::Subscriber occupancy_grid_sub; 25 | ros::Subscriber masses_sub; 26 | ros::Timer timer; 27 | std::vector> history; 28 | std::vector history_n; 29 | std::vector history_m; 30 | nav_msgs::OccupancyGrid occupancy_msg; 31 | lidar_msgs::masses masses_msg; 32 | lidar_msgs::prediction prediction_msg; 33 | int time=0; 34 | bool data_acquired = false; 35 | 36 | //torch::jit::script::Module module; 37 | std::shared_ptr module; 38 | //torch::Device device; 39 | 40 | virtual void onInit(); 41 | void timerCallback(const ros::TimerEvent&); 42 | void createTensorFromFrames(std::vector& input); 43 | void occupancy_grid_callback(const nav_msgs::OccupancyGrid::ConstPtr& occupancy); 44 | // void createPredictionAndPublish(std::vector& input); 45 | void Publish(torch::Tensor& new_out); 46 | torch::Tensor Infer(const std::vector& input); 47 | void masses_callback(const lidar_msgs::masses::ConstPtr& masses_data); 48 | void publishPrediction(); 49 | }; 50 | } 51 | -------------------------------------------------------------------------------- /lidar_pkg/src/inference_torch_test.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include "inference_torch.h" 5 | #include // One-stop header. 
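// Note: this test variant appears unfinished; it mixes TensorFlow tensor code with the
// Torch nodelet, uses inconsistent class names (Inference vs. InferenceTorch), and does not
// compile as written. inference_torch.cpp above contains the working LibTorch implementation.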
6 | 7 | #include 8 | #include 9 | #include 10 | #include 11 | #include 12 | #include 13 | #include 14 | 15 | #include 16 | #include 17 | #include 18 | 19 | PLUGINLIB_EXPORT_CLASS(lidar_pkg::InferenceTorch, nodelet::Nodelet) 20 | 21 | namespace lidar_pkg 22 | { 23 | void InferenceTorch::onInit() 24 | { 25 | ROS_INFO("Initialized PyTorch Inference."); 26 | ros::NodeHandle& private_nh = getMTPrivateNodeHandle(); 27 | 28 | // Path to the model 29 | std::string jitPath = "/home/bernard/catkin_ws/src/ROS_OGP/RealTimeEnvironmentPrediction/models/taa.pt"; 30 | 31 | // Loading the model 32 | ROS_INFO("Loading the module..."); 33 | torch::jit::script::Module module; 34 | try { 35 | module = torch::jit::load(jitPath); 36 | } 37 | catch (const c10::Error& e) { 38 | std::cerr << "error loading the model\n"; 39 | return -1; 40 | } 41 | 42 | ROS_INFO("Loaded!") 43 | 44 | pred_pub = private_nh.advertise("prediction", 1); 45 | pred_all_pub = private_nh.advertise("prediction_all", 1); 46 | occupancy_grid_sub = private_nh.subscribe("occupancy", 1, &Inference::occupancy_grid_callback, this); 47 | masses_sub = private_nh.subscribe("masses", 1, &Inference::masses_callback, this); 48 | timer = private_nh.createTimer(ros::Duration(0.02), &Inference::timerCallback, this); 49 | } 50 | 51 | void Inference::timerCallback(const ros::TimerEvent&) 52 | { 53 | if (data_acquired==true) 54 | { 55 | ROS_INFO("Inside the timer"); 56 | 57 | //DEFINE APPROPRAITE GLOBAL VARIABLES 58 | 59 | tensorflow::Tensor input(tensorflow::DT_FLOAT, tensorflow::TensorShape({1,20,128,128,2})); 60 | std::vector output; 61 | 62 | // Create/Fill in a tensor. 63 | 64 | input = createTensorFromFrames(); 65 | 66 | // Make a prediction 67 | 68 | createPredictionAndPublish(input); 69 | //saveTensor(output); 70 | } 71 | else { 72 | ROS_INFO("Clock is paused..."); 73 | } 74 | } 75 | 76 | void Inference::createPredictionAndPublish(input) 77 | { 78 | ROS_INFO("Publishing"); 79 | prediction_msg.prediction.clear(); 80 | at::Tensor output = module.forward(inputs).toTensor(); 81 | int grid_size = 128; 82 | auto output_values = output[0].tensor(); 83 | int probability; 84 | 85 | for (unsigned int t = 0; t < 20; t++){ 86 | occupancy_msg.data.clear(); 87 | occupancy_msg.info.resolution = 1./3.; 88 | occupancy_msg.info.width = grid_size; 89 | occupancy_msg.info.height = grid_size; 90 | for (unsigned int i = 0; i < grid_size; i++){ 91 | for (unsigned int j = 0; j < grid_size; j++){ 92 | //Modify this accordingly 93 | probability = (int)(100*(0.5*output_values(0,t,i,j,0)+0.5*(1.0 - output_values(0,t,i,j,1)))); 94 | occupancy_msg.data.push_back(probability); 95 | } 96 | } 97 | prediction_msg.prediction.push_back(occupancy_msg); 98 | } 99 | pred_all_pub.publish(prediction_msg); 100 | pred_pub.publish(occupancy_msg); 101 | } 102 | 103 | void Inference::saveTensor(std::vector& output) 104 | { 105 | // auto output_values = output[0].tensor(); 106 | // for(unsigned int t=0;t<20;t++){ 107 | // std::vector occ; 108 | // std::vector free; 109 | // 110 | // for(unsigned int x=0;x<128;x++){ 111 | // for(unsigned int y=0;y<128;y++){ 112 | // occ.push_back(output_values(0,t,x,y,0)); 113 | // free.push_back(output_values(0,t,x,y,1)); 114 | // } 115 | // } 116 | // std::ofstream outfile_occ("/home/ford/catkin_ws/src/lidar/lidar_pkg/src/data/prediction_occ"+std::to_string(t)+".txt"); 117 | // for (const auto &e : occ) outfile_occ << e << "\n"; 118 | // 119 | // std::ofstream 
outfile_free("/home/ford/catkin_ws/src/lidar/lidar_pkg/src/data/prediction_free"+std::to_string(t)+".txt"); 120 | // for (const auto &e : free) outfile_free << e << "\n"; 121 | // 122 | // } 123 | } 124 | 125 | void Inference::createTensorFromFrames() 126 | { 127 | std::vector input; 128 | 129 | 130 | // MODIFY 131 | using namespace::tensorflow::ops; 132 | bool past = true; 133 | auto input_map = input.tensor(); 134 | lidar_msgs::masses data; 135 | 136 | for(unsigned int t = 0; t<20;t++){ 137 | if (t<5){ 138 | data = history_m[t]; 139 | } 140 | else { 141 | past = false; 142 | } 143 | int count = 0; 144 | for(unsigned int x = 0; x<128;x++){ 145 | for(unsigned int y = 0; y<128;y++){ 146 | if(past==true){ 147 | input_map(0,t,x,y,0) = (float)data.occ[count]; //check the ordering + add depth. 148 | input_map(0,t,x,y,1) = (float)data.free[count]; 149 | } 150 | count++; 151 | } 152 | } 153 | } 154 | return input 155 | } 156 | 157 | void Inference::saveFrame(lidar_msgs::masses& data, int t) 158 | { 159 | // std::vector occ; 160 | // occ = data.occ; 161 | // std::vector free; 162 | // free = data.free; 163 | // std::ofstream outfile_occ("/home/ford/catkin_ws/src/lidar/lidar_pkg/src/data/past_occ"+std::to_string(t)+".txt"); 164 | // for (const auto &e : occ) outfile_occ << e << "\n"; 165 | // 166 | // std::ofstream outfile_free("/home/ford/catkin_ws/src/lidar/lidar_pkg/src/data/past_free"+std::to_string(t)+".txt"); 167 | // for (const auto &e : free) outfile_free << e << "\n"; 168 | 169 | } 170 | 171 | void Inference::occupancy_grid_callback(const nav_msgs::OccupancyGrid::ConstPtr& occupancy) 172 | { 173 | ROS_INFO("Data collected"); 174 | time++; 175 | nav_msgs::OccupancyGrid zeross; 176 | //zeross.data = occupancy.data; 177 | if (time <= 5){ 178 | history_n.push_back(*occupancy); 179 | } 180 | else { 181 | data_acquired = true; 182 | for (unsigned int i=0; i<=3; i++){ 183 | history_n[i] = history_n[i+1]; 184 | } 185 | history_n[4]=(*occupancy); 186 | } 187 | ROS_INFO("Done with Data collected"); 188 | } 189 | 190 | void Inference::masses_callback(const lidar_msgs::masses::ConstPtr& masses_data) 191 | { 192 | ROS_INFO("Data collected Masses"); 193 | //time++; 194 | nav_msgs::OccupancyGrid zeross; 195 | //zeross.data = occupancy.data; 196 | if (time <= 5){ 197 | history_m.push_back(*masses_data); 198 | } 199 | else { 200 | data_acquired = true; 201 | for (unsigned int i=0; i<=3; i++){ 202 | history_m[i] = history_m[i+1]; 203 | } 204 | history_m[4] = (*masses_data); 205 | } 206 | ROS_INFO("Done with Data collected"); 207 | } 208 | } 209 | -------------------------------------------------------------------------------- /lidar_pkg/src/mrf_ground_seg.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include "mrf_ground_seg.h" 3 | 4 | PLUGINLIB_EXPORT_CLASS(lidar_pkg::MrfGroundSeg, nodelet::Nodelet) 5 | 6 | namespace lidar_pkg { 7 | 8 | void MrfGroundSeg::onInit() 9 | { 10 | #if __cplusplus==201103L 11 | std::cout << "C++11" << std::endl; 12 | #else 13 | std::cout << "C++" << std::endl; 14 | #endif 15 | 16 | ros::NodeHandle& private_nh = getMTPrivateNodeHandle(); 17 | 18 | pub = private_nh.advertise("mrf_filtered_points",1); 19 | sub = private_nh.subscribe("agg_points", 1, &MrfGroundSeg::GroundSegMRFCallback, this); 20 | } 21 | 22 | void MrfGroundSeg::GroundSegMRFCallback(const DDLPointCloud::ConstPtr& originalPC) { 23 | ROS_INFO("Points for the segmentation received."); 24 | int start_s = clock(); 25 | 26 | /* Parameters that can be specified via 
class later. */ 27 | 28 | // Specify the vertical offset distance of the LiDAR from the ground (specified by system - e.g. KITTI). 29 | float lidarZ = 0.0; 30 | 31 | // Maximum keep LiDAR square distance. 32 | float maxKeep = 80.0; 33 | 34 | float s = 0.55; // 0.6; // 0.09 35 | float res = 0.4; 36 | 37 | /* Build the grid. */ 38 | 39 | // Initialize the filtered point cloud to be published. 40 | DDLPointCloud filteredPC; 41 | 42 | // Set the message time stamp. 43 | filteredPC.header.stamp = originalPC->header.stamp; 44 | // std::cout << "time stamp" << filteredPC.header.stamp << endl; 45 | 46 | // Dereference the point cloud. 47 | const DDLPointCloud &cloud(*originalPC); 48 | 49 | // cout << cloud.header.frame_id << endl; 50 | 51 | // Get number of points in point cloud. 52 | int origPCSize = cloud.width; 53 | 54 | // Get the maximum x, y distance. 55 | float maxXYCoord = maxKeep; 56 | 57 | //max( max(-1 * minPoint.x, maxPoint.x), max(-1 * minPoint.y, maxPoint.y)); 58 | 59 | size_t gridSize = int(2 * ceil((maxXYCoord)/ res) + 1); 60 | 61 | vector grid[gridSize][gridSize]; 62 | 63 | // Get center coordinates of the grid. 64 | int centerX = int(ceil(maxXYCoord / res)); 65 | int centerY = int(ceil(maxXYCoord / res)); 66 | 67 | // Populate the grid with indices of points that fit into a particular cell. 68 | for (int i = 0; i < origPCSize; i++) { 69 | DDLPointType point = cloud.points[i]; 70 | 71 | // Ensure within specified area, and not above a filtered height 72 | if ((abs(point.x) <= maxXYCoord) && (abs(point.y) <= maxXYCoord) && (point.z <= 3.5)) { 73 | grid[int(centerX + round(point.x / res))][int(centerY + round(point.y / res))].push_back(i); 74 | // cout << "Z value: " << point.z << endl; 75 | } 76 | 77 | } 78 | 79 | /* Perform MRF segmentation. */ 80 | 81 | // Initialize the hG array. 82 | float hG[gridSize][gridSize]; 83 | 84 | // Initialize the grid segmentation (ground cells have a value of 1). 85 | int gridSeg[gridSize][gridSize]; 86 | fill( gridSeg[0], gridSeg[0] + gridSize * gridSize, 0 ); 87 | 88 | // Initialize the center coordinate of the 2D grid to ground according to the height of the 89 | // LiDAR position on the vehicle. 90 | hG[centerX][centerY] = -1 * lidarZ; 91 | gridSeg[centerX][centerY] = 1; 92 | 93 | // Allocate space for two elements in the vector. 94 | vector outerIndex; 95 | outerIndex.resize(2); 96 | 97 | // Move radially outwards and perform the MRF segmentation. 98 | for (int i = 1; i < int(ceil(maxXYCoord / res)) + 1; i++) { 99 | 100 | // Generate the indices at the ith circle level from the center of the grid. 101 | outerIndex[0] = -1 * i; 102 | outerIndex[1] = i; 103 | 104 | /* vector outerIndex; 105 | outerIndex.push_back(-1 * i); 106 | outerIndex.push_back(i); */ 107 | 108 | for (int index : outerIndex) { 109 | for (int k = -1 * i; k < (i + 1); k++) { 110 | 111 | // Index is the outer index (perimeter of circle) and k is possible inner indices 112 | // in between the edges. 113 | Indices currentCircle; 114 | currentCircle.insert(pair (centerX + index, centerY + k)); 115 | 116 | // Add the mirror image of cells to the circle if not on the top or bottom row of the circle. 117 | if ( !((k == -1 * i) || (k == i)) ) { 118 | currentCircle.insert(pair (centerX + k, centerY + index)); 119 | } 120 | 121 | // Go through the one or two stored cells right now. 122 | for (pair const& indexPair : currentCircle) { 123 | 124 | int x = indexPair.first; 125 | int y = indexPair.second; 126 | 127 | // Compute the minimum and maximum z coordinates of each grid cell. 
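// Ground criterion used below: a cell is labeled ground when both its internal height spread
// (H - h) and its rise above the highest neighboring ground estimate (H - hHatG) stay below
// the threshold s; otherwise its points are treated as non-ground and added to the output cloud.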
128 | 129 | // Initialize H to a very small value. 130 | float H = - numeric_limits::infinity(); 131 | // Initialize h to a very large value. 132 | float h = numeric_limits::infinity(); 133 | 134 | if (!grid[x][y].empty()) { 135 | 136 | const vector pcIndices = grid[x][y]; 137 | 138 | for (int j = 0; j < pcIndices.size(); j++) { 139 | H = max(cloud.points[pcIndices[j]].z, H); 140 | h = min(cloud.points[pcIndices[j]].z, h); 141 | } 142 | 143 | } 144 | 145 | // Pay attention to what happens when there are no points in a grid cell? Will it work? 146 | 147 | // Compute hHatG: find max hG of neighbors. 148 | float hHatG = - numeric_limits::infinity(); 149 | 150 | // Get the inner circle neighbors of the current cell. 151 | 152 | // The inner circle is one level down from the current circle. 153 | int innerCircleIndex = i - 1; 154 | 155 | // Center the grid at (0, 0). 156 | int xRelativeIndex = x - centerX; 157 | int yRelativeIndex = y - centerY; 158 | 159 | // Loop through possible neighbor indices. 160 | for (int m = -1; m < 2; m++) { 161 | for (int n = -1; n < 2; n++) { 162 | 163 | int xRelativeNew = abs(xRelativeIndex + m); 164 | int yRelativeNew = abs(yRelativeIndex + n); 165 | 166 | // Ensure index is actually on the inner circle. 167 | if (((xRelativeNew == innerCircleIndex) && (yRelativeNew <= innerCircleIndex)) || ((yRelativeNew == innerCircleIndex) && (xRelativeNew <= innerCircleIndex))) { 168 | 169 | // Compute the new hHatG. 170 | float hGTemp = hG[x + m][y + n]; 171 | hHatG = max(hGTemp, hHatG); 172 | } 173 | } 174 | } 175 | 176 | // Update hG of current cell. 177 | if ((H != - numeric_limits::infinity()) && (h != numeric_limits::infinity()) && 178 | ((H - h) < s) && ((H - hHatG) < s)) { 179 | gridSeg[x][y] = 1; 180 | hG[x][y] = H; 181 | 182 | } else { 183 | hG[x][y] = hHatG; 184 | 185 | // Add the cell's LiDAR points to the segmented (not ground) point cloud. 186 | if (!grid[x][y].empty()) { 187 | 188 | const vector pcIndices = grid[x][y]; 189 | 190 | for (int j = 0; j < grid[x][y].size(); j++) { 191 | filteredPC.points.push_back(cloud.points[pcIndices[j]]); 192 | } 193 | 194 | } 195 | } 196 | 197 | } 198 | } 199 | } 200 | } 201 | 202 | int stop_s=clock(); 203 | // cout << "Width of point cloud before: " << cloud.width << endl; 204 | // cout << "Width of point cloud now: " << filteredPC.points.size() << endl; 205 | // cout << "Elapsed time outside for-loop: " << (stop_s - start_s)/double(CLOCKS_PER_SEC)*1000 << endl << endl; 206 | 207 | /* Set the frame_id to "vehicle_ground_cartesian". */ 208 | 209 | /* Publish to ROS node. 
*/ 210 | filteredPC.header.frame_id = cloud.header.frame_id; 211 | pub.publish(filteredPC); 212 | 213 | } 214 | } 215 | -------------------------------------------------------------------------------- /lidar_pkg/src/mrf_ground_seg.h: -------------------------------------------------------------------------------- 1 | #include "ros/ros.h" 2 | #include 3 | #include 4 | #include 5 | #include 6 | #include 7 | #include 8 | 9 | using namespace std; 10 | 11 | typedef lidar_msgs::PointXYZIDBS DDLPointType; 12 | typedef pcl::PointCloud DDLPointCloud; 13 | typedef set > Indices; 14 | 15 | using namespace std; 16 | 17 | namespace lidar_pkg { 18 | 19 | class MrfGroundSeg : public nodelet::Nodelet 20 | { 21 | public: 22 | MrfGroundSeg(){}; 23 | private: 24 | 25 | ros::Publisher pub; 26 | ros::Subscriber sub; 27 | 28 | virtual void onInit(); 29 | void GroundSegMRFCallback(const DDLPointCloud::ConstPtr& originalPC); 30 | }; 31 | } 32 | -------------------------------------------------------------------------------- /lidar_pkg/src/occupancy_grid_generation.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include "occupancy_grid_generation.h" //"occupancy_grid_generation.h" 3 | #include 4 | #include 5 | #include 6 | #include 7 | #include 8 | #include 9 | #include 10 | 11 | PLUGINLIB_EXPORT_CLASS(lidar_pkg::OccupancyGridGeneration, nodelet::Nodelet) 12 | 13 | namespace lidar_pkg 14 | { 15 | 16 | void OccupancyGridGeneration::onInit() 17 | { 18 | ROS_INFO("Occupancy Grid nodelet has been initialized."); 19 | 20 | //Setting up the nodelet. 21 | ros::NodeHandle& private_nh = getMTPrivateNodeHandle(); 22 | 23 | //Advertising the topics published by this node. 24 | occ_pub = private_nh.advertise("occupancy", 1000); 25 | masses_pub = private_nh.advertise("masses", 1000); 26 | 27 | //Subscrbing to the required information. 28 | 29 | // Filtered occupancy grid. 30 | sub_2 = private_nh.subscribe ("mrf_filtered_points", 1, &OccupancyGridGeneration::cloud_cb, this); 31 | 32 | // Ros time. Subscribes to vehilce location every 0.01s. 33 | timer = private_nh.createTimer(ros::Duration(0.01), &OccupancyGridGeneration::timer_cb, this); 34 | } 35 | 36 | void OccupancyGridGeneration::timer_cb(const ros::TimerEvent&) 37 | { 38 | /*Make sure the frames are correct here. 39 | /world for old rosbags and /map for new ones. /map should be a global cs and /vehicle_cg_cartesian a local vehicle cg. 40 | We are looking for the translation of the vehicle wrt the global cs. 41 | */ 42 | 43 | // tf::TransformListener listener; 44 | // tf::StampedTransform transform; 45 | // 46 | // while(true){ 47 | // try{ 48 | // listener.lookupTransform("/map", "/vehicle_ground_cartesian", ros::Time(0), transform); 49 | // break; 50 | // } 51 | // catch (tf::TransformException ex){ 52 | // ros::Duration(0.01).sleep(); 53 | // continue; 54 | // } 55 | // } 56 | // prev_vehicle_x = vehicle_x; 57 | // prev_vehicle_y = vehicle_y; 58 | // vehicle_x = transform.getOrigin().x(); 59 | // vehicle_y = transform.getOrigin().y(); 60 | 61 | // Bernard: I don't need frames ????? Are you sure? 62 | change_x = 0.0; // vehicle_x - prev_vehicle_x; 63 | change_y = 0.0; // vehicle_y - prev_vehicle_y; 64 | 65 | } 66 | 67 | 68 | void OccupancyGridGeneration::cloud_cb (const sensor_msgs::PointCloud2ConstPtr& cloud_msg) 69 | { 70 | /* 71 | PC processing. 
72 | */ 73 | 74 | double centerpoint_x = 64; 75 | double centerpoint_y = 64; 76 | double xstart = -1; 77 | double ystart = -1; 78 | double xend = -1; 79 | double yend = -1; 80 | 81 | // Converts the PCL ros message using pcl_conversions. 82 | pcl::PointCloud cloud; 83 | pcl::fromROSMsg(*cloud_msg, cloud); 84 | 85 | // 1. Convert new measurement into a DST grid. 86 | create_DST_grid(cloud); 87 | 88 | x_new_low = change_x - 64 * res; 89 | x_new_high = change_x + 64 * res; 90 | y_new_low = change_y - 64 * res; 91 | y_new_high = change_y + 64 * res; 92 | 93 | if(initialization_phase == false) { 94 | if ((x_new_low >= x_old_low) && (x_old_high >= x_new_low)) { 95 | xstart = x_new_low; 96 | xend = x_old_high; 97 | } 98 | 99 | if ((y_new_low >= y_old_low) && (y_old_high >= y_new_low)) { 100 | ystart = y_new_low; 101 | yend = y_old_high; 102 | } 103 | 104 | if ((x_new_low < x_old_low) && (x_new_high >= x_old_low)) { 105 | xstart = x_old_low; 106 | xend = x_new_high; 107 | } 108 | 109 | if ((y_new_low < y_old_low) && (y_new_high >= y_old_low)) { 110 | ystart = y_old_low; 111 | yend = y_new_high; 112 | } 113 | 114 | if ((xstart != -1) && (ystart != -1)) { 115 | 116 | int indx_nl = find_nearest(grid_size, xstart, x_new_low, x_new_high, res); 117 | int indx_nh = find_nearest(grid_size, xend, x_new_low, x_new_high, res); 118 | int indy_nl = find_nearest(grid_size, ystart, y_new_low, y_new_high, res); 119 | int indy_nh = find_nearest(grid_size, yend, y_new_low, y_new_high, res); 120 | 121 | int indx_ol = find_nearest(grid_size, xstart, x_old_low, x_old_high, res); 122 | int indx_oh = find_nearest(grid_size, xend, x_old_low, x_old_high, res); 123 | int indy_ol = find_nearest(grid_size, ystart, y_old_low, y_old_high, res); 124 | int indy_oh = find_nearest(grid_size, yend, y_old_low, y_old_high, res); 125 | 126 | for (unsigned int i=0; i < indx_oh - indx_ol + 1; i++) { 127 | for (unsigned int j=0; j < indy_oh - indy_ol + 1; j++) { 128 | prev_free[indx_nl + i][indy_nl + j] = up_free[indx_ol + i][indy_ol + j]; 129 | prev_occ[indx_nl + i][indy_nl + j] = up_occ[indx_ol + i][indy_ol + j]; 130 | } 131 | } 132 | } 133 | } 134 | mass_update(); 135 | get_mass(); 136 | plotting(); 137 | clear(); 138 | 139 | initialization_phase = false; 140 | x_old_low = x_new_low; 141 | x_old_high = x_new_high; 142 | y_old_low = y_new_low; 143 | y_old_high = y_new_high; 144 | } 145 | 146 | void OccupancyGridGeneration::create_DST_grid(pcl::PointCloud& cloud) 147 | { 148 | /* 149 | Generate the occupancy grid. 150 | Works pretty well. 151 | Further improvments/simplifications can be done, but it is not a priority now. 152 | */ 153 | 154 | // ROS_INFO("Inside create DST grid"); 155 | //1.Projects the pcl points onto the 2D-plane/occupancy grid. Performs ray tracing to identify visible spaces. 156 | add_points_to_the_DST(cloud); 157 | //2.Performs ray tracing to identify visible spaces to angles around a vehicle which are not represented in the PC or out of range. 158 | add_free_spaces_to_the_DST(); 159 | //3.Add an ego vehicle to our grid. 160 | add_ego_vehicle_to_the_DST(); 161 | } 162 | 163 | void OccupancyGridGeneration::add_points_to_the_DST(pcl::PointCloud& cloud) 164 | { 165 | for (size_t i = 0; i < cloud.points.size(); i++) 166 | { 167 | int x = (int)(cloud.points[i].x/res); 168 | int y = (int)(cloud.points[i].y/res); 169 | double z = cloud.points[i].z; 170 | 171 | //FIX: Quick hack for Ford dataset. 172 | if ((-1)*z>0.5){ 173 | continue; 174 | } 175 | 176 | // Checks if the pcl point is within our grid range. 
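// Each in-range point first records its bearing in the angles[] table and is then ray-traced
// from the origin to mark free space: the eight slope/sign branches below map every octant onto
// the two Bresenham-style routines (y-increment for |slope| <= 1, x-increment otherwise), with
// flip_x/flip_y restoring the original quadrant when cells are written into the grid.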
177 | // TODO: Replace 64 with some grid size parameter set once. 178 | if (x >= -64 && y>= -64 && x <= 63 && y <= 63) { 179 | int angle; 180 | 181 | // Angles vector contains which angles from 0 deg to 360 deg have been represented. 182 | // It is used to identify free spaces for angles not covered by PC or out of range points. 183 | if (cloud.points[i].y > 0 && cloud.points[i].x < 0){ 184 | angle = 180 - (int)(atan(std::abs(cloud.points[i].y)/std::abs(cloud.points[i].x))*180.0/3.14159265); 185 | } 186 | else if (cloud.points[i].y < 0 && cloud.points[i].x < 0){ 187 | angle = 180 + (int)(atan(std::abs(cloud.points[i].y)/std::abs(cloud.points[i].x))*180.0/3.14159265); 188 | } 189 | else if (cloud.points[i].y < 0 && cloud.points[i].x > 0){ 190 | angle = 360 - (int)(atan(std::abs(cloud.points[i].y)/std::abs(cloud.points[i].x))*180.0/3.14159265); 191 | } 192 | else{ 193 | angle = (int)(atan(std::abs(cloud.points[i].y)/std::abs(cloud.points[i].x))*180.0/3.14159265); 194 | } 195 | 196 | angles[angle] = true; 197 | double slope = (double)(y)/(x); 198 | 199 | // We perform ray/point tracing from the origin to the point to identify the free space using Bresenham's line algorthim. 200 | // It is a simple algorithm but does a job pretty well. 201 | if (slope > 0 && slope <= 1 && x>0){ 202 | ray_tracing_approximation_y_increment(0,0,x,y,1,1,false); 203 | } 204 | else if (slope > 1 && x>0){ 205 | ray_tracing_approximation_x_increment(0,0,x,y,1,1,false); 206 | } 207 | else if (slope < 0 && slope >= -1 && x>0){ 208 | ray_tracing_approximation_y_increment(0,0,x,(-1)*y,1,-1,false); 209 | } 210 | else if (slope < -1 && x>0){ 211 | ray_tracing_approximation_x_increment(0,0,x,(-1)*y,1,-1,false); 212 | } 213 | else if (slope > 1 && x<0){ 214 | ray_tracing_approximation_x_increment(0,0,(-1)*x,(-1)*y,-1,-1,false); 215 | } 216 | else if (slope > 0 && slope <= 1 && x<0){ 217 | ray_tracing_approximation_y_increment(0,0,(-1)*x,(-1)*y,-1,-1,false); 218 | } 219 | else if (slope < 0 && slope >= -1 && x<0){ 220 | ray_tracing_approximation_y_increment(0,0,(-1)*x,y,-1,1,false); 221 | } 222 | else if (slope < -1 && x<0){ 223 | ray_tracing_approximation_x_increment(0,0,(-1)*x,y,-1,1,false); 224 | } 225 | } 226 | } 227 | } 228 | 229 | void OccupancyGridGeneration::add_free_spaces_to_the_DST() 230 | { 231 | double i = 0.0; 232 | float ang = 0.0f; 233 | 234 | /* 235 | TODO: It is a naive approach to fill the free spaces and it should be changed. 236 | However, it is not the bottleneck right now. 237 | I'll comment it/clean it tmr. 
238 | */ 239 | for (unsigned int i = 0; i<3600;i++){ 240 | ang = (i*0.1f); 241 | //std::cout << ang << std::endl; 242 | if (angles[(int)(ang)] == false){ 243 | int x,y; 244 | if (ang>0.0f && ang<=45.0f){ 245 | x = 64; 246 | y = (int)(tan(ang*PI/180.0f)*x); 247 | } 248 | else if (ang>45.0f && ang<90.0f){ 249 | y = 64; 250 | x = (int)(y/tan(ang*PI/180.0f)); 251 | } 252 | else if (ang>90.0f && ang<=135.0f){ 253 | y = 64; 254 | x = (int)(y/tan((ang-180.0f)*PI/180.0f)); 255 | } 256 | else if (ang>135.0f && ang<180.0f){ 257 | x = -64; 258 | y = (int)(tan((ang-180.0)*PI/180.0f)*x); 259 | } 260 | else if (ang>180.0f && ang<=225.0f){ 261 | x = -64; 262 | y = (int)(tan((ang-180.0f)*PI/180.0f)*x); 263 | } 264 | else if (ang>225.0f && ang<270.0f){ 265 | y = -64; 266 | x = (int)(y/tan((ang-180.0f)*PI/180.0f)); 267 | } 268 | else if (ang>270.0f && ang<= 315.0f){ 269 | y = -64; 270 | x = (int)(y/tan((ang-360.0f)*PI/180.0f)); 271 | } 272 | else if (ang>315.0f && ang<360.0f){ 273 | x = 64; 274 | y = (int)(tan((ang-360.0f)*PI/180.0f)*x); 275 | } 276 | else if (ang == 0.0f || ang == 360.0f){ 277 | ray_tracing_horizontal(64); 278 | continue; 279 | } 280 | else if (ang == 90.0f){ 281 | ray_tracing_vertical(64); 282 | continue; 283 | } 284 | else if (ang == 180.0f){ 285 | ray_tracing_horizontal_n(-64); 286 | continue; 287 | } 288 | else if (ang == 270.0f){ 289 | ray_tracing_vertical_n(-64); 290 | continue; 291 | } 292 | 293 | if (x >= -64 && y>= -64 && x <= 64 && y <= 64) { 294 | 295 | double slope = (double)(y)/(x); 296 | if (slope > 0 && slope <= 1 && x>0){ 297 | ray_tracing_approximation_y_increment(0,0,x,y,1,1,true); 298 | } 299 | else if (slope > 1 && x>0){ 300 | ray_tracing_approximation_x_increment(0,0,x,y,1,1,true); 301 | } 302 | else if (slope < 0 && slope >= -1 && x>0){ 303 | ray_tracing_approximation_y_increment(0,0,x,(-1)*y,1,-1,true); 304 | } 305 | else if (slope < -1 && x>0){ 306 | ray_tracing_approximation_x_increment(0,0,x,(-1)*y,1,-1,true); 307 | } 308 | else if (slope > 1 && x<0){ 309 | ray_tracing_approximation_x_increment(0,0,(-1)*x,(-1)*y,-1,-1,true); 310 | } 311 | else if (slope > 0 && slope <= 1 && x<0){ 312 | ray_tracing_approximation_y_increment(0,0,(-1)*x,(-1)*y,-1,-1,true); 313 | } 314 | else if (slope < 0 && slope >= -1 && x<0){ 315 | ray_tracing_approximation_y_increment(0,0,(-1)*x,y,-1,1,true); 316 | } 317 | else if (slope < -1 && x<0){ 318 | ray_tracing_approximation_x_increment(0,0,(-1)*x,y,-1,1,true); 319 | } 320 | } 321 | } 322 | angles[(int)(ang)] = false; 323 | } 324 | } 325 | 326 | void OccupancyGridGeneration::add_ego_vehicle_to_the_DST() 327 | { 328 | // Vehicle shape. 329 | for(unsigned int j = 60;j<68;j++){ 330 | for(unsigned int i = 62;i<67;i++){ 331 | meas_occ[j][i] = 1.0; 332 | meas_free[j][i] = 0.0; 333 | } 334 | } 335 | } 336 | 337 | void OccupancyGridGeneration::plotting() 338 | { 339 | occupancy_msg.data.clear(); 340 | masses_msg.occ.clear(); 341 | masses_msg.free.clear(); 342 | occupancy_msg.header.frame_id = "/body"; //TODO: Make sure the frame is the correct one. 
343 | occupancy_msg.info.resolution = res; 344 | occupancy_msg.info.width = grid_size; 345 | occupancy_msg.info.height = grid_size; 346 | occupancy_msg.info.origin.position.z = 0.2; 347 | occupancy_msg.info.origin.position.x = -64.0*(1./3.); 348 | occupancy_msg.info.origin.position.y = -64.0*(1./3.); 349 | masses_msg.width = grid_size; 350 | masses_msg.height = grid_size; 351 | 352 | for (unsigned int i = 0; i < grid_size; i++){ 353 | for (unsigned int j = 0; j < grid_size; j++){ 354 | occupancy_msg.data.push_back(prob_O_plot[j][i]); 355 | masses_msg.occ.push_back(up_occ[j][i]); 356 | masses_msg.free.push_back(up_free[j][i]); 357 | } 358 | } 359 | occ_pub.publish(occupancy_msg); 360 | masses_pub.publish(masses_msg); 361 | } 362 | 363 | void OccupancyGridGeneration::mass_update() 364 | { 365 | for (unsigned int i = 0; i < grid_size; i++){ 366 | for (unsigned int j = 0; j < grid_size; j++){ 367 | up_occ_pred[i][j] = std::min(alpha * prev_occ[i][j], 1.0 - prev_free[i][j]); 368 | up_free_pred[i][j] = std::min(alpha * prev_free[i][j], 1.0 - prev_occ[i][j]); 369 | } 370 | } 371 | //Combine measurement nad prediction to form posterior occupied and free masses. 372 | update_of(); 373 | } 374 | 375 | void OccupancyGridGeneration::update_of() 376 | { 377 | for (unsigned int i = 0; i < grid_size; i++){ 378 | for (unsigned int j = 0; j < grid_size; j++){ 379 | double unknown_pred = 1.0 - up_free_pred[i][j] - up_occ_pred[i][j]; 380 | double meas_cell_unknown = 1.0 - meas_free[i][j] - meas_occ[i][j]; 381 | double k_value = up_free_pred[i][j] * meas_occ[i][j] + up_occ_pred[i][j] * meas_free[i][j]; 382 | up_occ[i][j] = (up_occ_pred[i][j] * meas_cell_unknown + unknown_pred * meas_occ[i][j] + up_occ_pred[i][j] * meas_occ[i][j])/ (1.0 - k_value); 383 | up_free[i][j] = (up_free_pred[i][j] * meas_cell_unknown + unknown_pred * meas_free[i][j] + up_free_pred[i][j] * meas_free[i][j])/ (1.0 - k_value); 384 | } 385 | } 386 | } 387 | 388 | void OccupancyGridGeneration::get_mass() 389 | { 390 | for (unsigned int i = 0; i < grid_size; i++){ 391 | for (unsigned int j = 0; j < grid_size; j++){ 392 | prob_O[i][j] = (0.5 * up_occ[i][j] + 0.5*(1.0 - up_free[i][j])); 393 | prob_O_plot[i][j] = 100*(0.5 * up_occ[i][j] + 0.5*(1.0 - up_free[i][j])); //meas_occ[i][j] * 100; 394 | } 395 | } 396 | } 397 | 398 | int OccupancyGridGeneration::find_nearest(int n, double v, double v0, double vn, double res) 399 | { 400 | int idx = std::floor(n*(v-v0+res/2.)/(vn-v0+res)); 401 | return idx; 402 | } 403 | 404 | 405 | void OccupancyGridGeneration::ray_tracing_approximation_y_increment(int x1, int y1, int x2, int y2, int flip_x, int flip_y, bool inclusive) 406 | { 407 | int slope = 2 * (y2 - y1); 408 | int slope_error = slope - (x2 - x1); 409 | int x_sample, y_sample; 410 | for (int x = x1, y = y1; x < x2; x++){ 411 | if (meas_occ[flip_x*x+64][flip_y*y+64] == meas_mass) { 412 | break; 413 | } 414 | 415 | meas_free[flip_x*x+64][flip_y*y+64] = meas_mass; 416 | 417 | slope_error += slope; 418 | if (slope_error >= 0) { 419 | y += 1; 420 | slope_error -= 2 * (x2 - x1); 421 | } 422 | } 423 | 424 | if (inclusive==false) { 425 | meas_occ[flip_x*x2+64][flip_y*y2+64] = meas_mass; 426 | meas_free[flip_x*x2+64][flip_y*y2+64] = 0.0; 427 | } 428 | } 429 | 430 | void OccupancyGridGeneration::ray_tracing_approximation_x_increment(int x1, int y1, int x2, int y2, int flip_x, int flip_y, bool inclusive) 431 | { 432 | int slope = 2 * (x2 - x1); 433 | int slope_error = slope - (y2 - y1); 434 | int x_sample, y_sample; 435 | for (int x = x1, y = y1; y < y2; y++){ 
436 | 437 | if (meas_occ[flip_x*x+64][flip_y*y+64] == meas_mass) { 438 | break; 439 | } 440 | 441 | meas_free[flip_x*x+64][flip_y*y+64] = meas_mass; 442 | 443 | slope_error += slope; 444 | if (slope_error >= 0) { 445 | x += 1; 446 | slope_error -= 2 * (y2 - y1); 447 | } 448 | } 449 | 450 | if (inclusive==false) { 451 | meas_occ[flip_x*x2+64][flip_y*y2+64] = meas_mass; 452 | meas_free[flip_x*x2+64][flip_y*y2+64] = 0.0; 453 | } 454 | } 455 | 456 | void OccupancyGridGeneration::ray_tracing_horizontal(int x2) 457 | { 458 | int x1 = 0; 459 | int y1 = 0; 460 | x2 = x2 - 1; 461 | 462 | for (int x = x1; x <= x2; x++){ 463 | 464 | if (meas_occ[x+64][64] == meas_mass) { 465 | break; 466 | } 467 | 468 | meas_free[x+64][64] = meas_mass; 469 | } 470 | 471 | meas_free[x2+64][64] = 0.0; 472 | } 473 | 474 | void OccupancyGridGeneration::ray_tracing_horizontal_n(int x1) 475 | { 476 | int x2 = 0; 477 | int y2 = 0; 478 | x1 = x1+1; 479 | 480 | for (int x = x1; x <= x2; x++){ 481 | 482 | if (meas_occ[x+64][64] == meas_mass) { 483 | break; 484 | } 485 | 486 | meas_free[x+64][64] = meas_mass; 487 | } 488 | 489 | meas_free[x2+64][64] = 0.0; 490 | } 491 | 492 | void OccupancyGridGeneration::ray_tracing_vertical(int y2) 493 | { 494 | int x1 = 0; 495 | int y1 = 0; 496 | y2 = y2-1; 497 | 498 | for (int y = y1; y <= y2; y++){ 499 | 500 | if (meas_occ[64][y+64] == meas_mass) { 501 | break; 502 | } 503 | meas_free[64][y+64] = meas_mass; 504 | } 505 | } 506 | 507 | void OccupancyGridGeneration::ray_tracing_vertical_n(int y1) 508 | { 509 | int x1 = 0; 510 | int y2 = 0; 511 | y1 = y1+1; 512 | 513 | for (int y = y1; y <= y2; y++){ 514 | 515 | if (meas_occ[64][y+64] == meas_mass) { 516 | break; 517 | } 518 | meas_free[64][y+64] = meas_mass; 519 | } 520 | 521 | meas_free[64][y2+64] = 0.0; 522 | } 523 | 524 | void OccupancyGridGeneration::clear() 525 | { 526 | for (unsigned int i =0; i 2 | #include 3 | #include 4 | #include 5 | #include 6 | #include 7 | #include 8 | #include 9 | #include 10 | 11 | typedef lidar_msgs::PointXYZIDBS DDLPointType; 12 | 13 | #define PI 3.14159 14 | 15 | /* 16 | TODO:BERNARD 17 | Write a documentation/Comment. 18 | */ 19 | 20 | 21 | namespace lidar_pkg 22 | { 23 | class OccupancyGridGeneration : public nodelet::Nodelet 24 | { 25 | public: 26 | OccupancyGridGeneration(){}; 27 | private: 28 | 29 | bool initialization_phase = true; 30 | double vehicle_x; 31 | double vehicle_y; 32 | double prev_vehicle_x; 33 | double prev_vehicle_y; 34 | double change_x; 35 | double change_y; 36 | double x_new_low; 37 | double x_new_high; 38 | double y_new_low; 39 | double y_new_high; 40 | double x_old_low; 41 | double x_old_high; 42 | double y_old_low; 43 | double y_old_high; 44 | 45 | // There are only two events: 0 = Occupied and 1 = Free. 46 | const static int event_num = 2; 47 | 48 | // Grid size. 49 | const static int grid_size = 128; 50 | 51 | // Resolution. 52 | constexpr static double res = 1./3.; 53 | 54 | // Measurement mass. 55 | constexpr static double meas_mass = 0.95; 56 | 57 | // Occupancy measurement. 58 | constexpr static double alpha = 0.9; 59 | 60 | // Place holders vehicle positions. 
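// These placeholders appear unused while the tf lookup in timer_cb stays commented out:
// change_x and change_y are then fixed to zero, so the grid is built ego-centric with no
// compensation for vehicle motion between frames.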
61 | constexpr static double vehicle_pos_x = 0; 62 | constexpr static double vehicle_pos_y = 0; 63 | constexpr static double prev_vehicle_pos_x = 0; 64 | constexpr static double prev_vehicle_pos_y = 0; 65 | 66 | ros::Subscriber sub_2; 67 | ros::Timer timer; 68 | ros::Publisher occ_pub; 69 | ros::Publisher masses_pub; 70 | nav_msgs::OccupancyGrid occupancy_msg; 71 | lidar_msgs::masses masses_msg; 72 | 73 | // Array with the DST data. 74 | double meas_grids[event_num][grid_size][grid_size]; 75 | 76 | // "Freeness" measurement. 77 | double meas_free[grid_size][grid_size] = {{0}}; 78 | 79 | // Occupancy measurement. 80 | double meas_occ[grid_size][grid_size] = {{0}}; 81 | double prev_free[grid_size][grid_size] = {{0}}; 82 | double prev_occ[grid_size][grid_size] = {{0}}; 83 | double up_free_pred[grid_size][grid_size]; 84 | double up_occ_pred[grid_size][grid_size]; 85 | double up_free[grid_size][grid_size]; 86 | double up_occ[grid_size][grid_size]; 87 | double prob_O[grid_size][grid_size]; 88 | double prob_O_plot[grid_size][grid_size]; 89 | bool angles[360]; 90 | bool first; 91 | 92 | virtual void onInit(); 93 | void timer_cb(const ros::TimerEvent&); 94 | 95 | void transform_listener(); 96 | void cloud_cb(const sensor_msgs::PointCloud2ConstPtr& cloud_msg); 97 | void create_DST_grid(pcl::PointCloud& cloud); 98 | void ray_tracing_approximation_x_increment(int x1, int y1, int x2, int y2, int flip_x, int flip_y, bool inclusive); 99 | void ray_tracing_approximation_y_increment(int x1, int y1, int x2, int y2, int flip_x, int flip_y, bool inclusive); 100 | int find_nearest(int n, double v, double v0, double vn, double res); 101 | void mass_update(); 102 | void update_of(); 103 | void get_mass(); 104 | void plotting(); 105 | void clear(); 106 | void fill(std::vector flip); 107 | void ray_tracing_horizontal(int y); 108 | void ray_tracing_horizontal_n(int y); 109 | void ray_tracing_vertical(int x); 110 | void ray_tracing_vertical_n(int x); 111 | void add_points_to_the_DST(pcl::PointCloud& cloud); 112 | void add_free_spaces_to_the_DST(); 113 | void add_ego_vehicle_to_the_DST(); 114 | }; 115 | } 116 | -------------------------------------------------------------------------------- /lidar_pkg/src/utils.cpp: -------------------------------------------------------------------------------- 1 | #include "utils.h" 2 | 3 | #include 4 | #include 5 | #include 6 | #include 7 | #include 8 | #include 9 | 10 | #include "tensorflow/cc/ops/const_op.h" 11 | #include "tensorflow/cc/ops/image_ops.h" 12 | #include "tensorflow/cc/ops/standard_ops.h" 13 | #include "tensorflow/core/framework/graph.pb.h" 14 | #include "tensorflow/core/framework/tensor.h" 15 | #include "tensorflow/core/graph/default_device.h" 16 | #include "tensorflow/core/graph/graph_def_builder.h" 17 | #include "tensorflow/core/lib/core/errors.h" 18 | #include "tensorflow/core/lib/core/stringpiece.h" 19 | #include "tensorflow/core/lib/core/threadpool.h" 20 | #include "tensorflow/core/lib/io/path.h" 21 | #include "tensorflow/core/lib/strings/stringprintf.h" 22 | #include "tensorflow/core/platform/env.h" 23 | #include "tensorflow/core/platform/init_main.h" 24 | #include "tensorflow/core/platform/logging.h" 25 | #include "tensorflow/core/platform/types.h" 26 | #include "tensorflow/core/public/session.h" 27 | #include "tensorflow/core/util/command_line_flags.h" 28 | 29 | // #include 30 | #include 31 | #include 32 | 33 | using namespace std; 34 | using namespace cv; 35 | 36 | using tensorflow::Flag; 37 | using tensorflow::Tensor; 38 | using tensorflow::Status; 39 | 
using tensorflow::string; 40 | using tensorflow::int32; 41 | 42 | /** Read a model graph definition (xxx.pb) from disk, and creates a session object you can use to run it. 43 | */ 44 | Status loadGraph(const string &graph_file_name, 45 | unique_ptr *session) { 46 | tensorflow::GraphDef graph_def; 47 | Status load_graph_status = 48 | ReadBinaryProto(tensorflow::Env::Default(), graph_file_name, &graph_def); 49 | if (!load_graph_status.ok()) { 50 | std::cout << graph_file_name << std::endl; 51 | std::cout << "NOT LOADING" << std::endl; 52 | std::cout << tensorflow::errors::NotFound("Failed to load compute graph at '", 53 | graph_file_name, "'") <reset(tensorflow::NewSession(tensorflow::SessionOptions())); 58 | Status session_create_status = (*session)->Create(graph_def); 59 | if (!session_create_status.ok()) { 60 | return session_create_status; 61 | } 62 | return Status::OK(); 63 | } 64 | 65 | /** Read a labels map file (xxx.pbtxt) from disk to translate class numbers into human-readable labels. 66 | */ 67 | // Status readLabelsMapFile(const string &fileName, map &labelsMap) { 68 | // 69 | // // Read file into a string 70 | // ifstream t(fileName); 71 | // if (t.bad()) 72 | // return tensorflow::errors::NotFound("Failed to load labels map at '", fileName, "'"); 73 | // stringstream buffer; 74 | // buffer << t.rdbuf(); 75 | // string fileString = buffer.str(); 76 | // 77 | // // Search entry patterns of type 'item { ... }' and parse each of them 78 | // smatch matcherEntry; 79 | // smatch matcherId; 80 | // smatch matcherName; 81 | // const regex reEntry("item \\{([\\S\\s]*?)\\}"); 82 | // const regex reId("[0-9]+"); 83 | // const regex reName("\'.+\'"); 84 | // string entry; 85 | // 86 | // auto stringBegin = sregex_iterator(fileString.begin(), fileString.end(), reEntry); 87 | // auto stringEnd = sregex_iterator(); 88 | // 89 | // int id; 90 | // string name; 91 | // for (sregex_iterator i = stringBegin; i != stringEnd; i++) { 92 | // matcherEntry = *i; 93 | // entry = matcherEntry.str(); 94 | // regex_search(entry, matcherId, reId); 95 | // if (!matcherId.empty()) 96 | // id = stoi(matcherId[0].str()); 97 | // else 98 | // continue; 99 | // regex_search(entry, matcherName, reName); 100 | // if (!matcherName.empty()) 101 | // name = matcherName[0].str().substr(1, matcherName[0].str().length() - 2); 102 | // else 103 | // continue; 104 | // labelsMap.insert(pair(id, name)); 105 | // } 106 | // return Status::OK(); 107 | // } 108 | // 109 | // /** Convert Mat image into tensor of shape (1, height, width, d) where last three dims are equal to the original dims. 110 | // */ 111 | // Status readTensorFromMat(const Mat &mat, Tensor &outTensor) { 112 | // 113 | // auto root = tensorflow::Scope::NewRootScope(); 114 | // using namespace ::tensorflow::ops; 115 | // 116 | // // Trick from https://github.com/tensorflow/tensorflow/issues/8033 117 | // float *p = outTensor.flat().data(); 118 | // Mat fakeMat(mat.rows, mat.cols, CV_32FC3, p); 119 | // mat.convertTo(fakeMat, CV_32FC3); 120 | // 121 | // auto input_tensor = Placeholder(root.WithOpName("input"), tensorflow::DT_FLOAT); 122 | // vector> inputs = {{"input", outTensor}}; 123 | // auto uint8Caster = Cast(root.WithOpName("uint8_Cast"), outTensor, tensorflow::DT_UINT8); 124 | // 125 | // // This runs the GraphDef network definition that we've just constructed, and 126 | // // returns the results in the output outTensor. 
127 | // tensorflow::GraphDef graph; 128 | // TF_RETURN_IF_ERROR(root.ToGraphDef(&graph)); 129 | // 130 | // vector outTensors; 131 | // unique_ptr session(tensorflow::NewSession(tensorflow::SessionOptions())); 132 | // 133 | // TF_RETURN_IF_ERROR(session->Create(graph)); 134 | // TF_RETURN_IF_ERROR(session->Run({inputs}, {"uint8_Cast"}, {}, &outTensors)); 135 | // 136 | // outTensor = outTensors.at(0); 137 | // return Status::OK(); 138 | // } 139 | // 140 | // /** Draw bounding box and add caption to the image. 141 | // * Boolean flag _scaled_ shows if the passed coordinates are in relative units (true by default in tensorflow detection) 142 | // */ 143 | // void drawBoundingBoxOnImage(Mat &image, double yMin, double xMin, double yMax, double xMax, double score, string label, bool scaled=true) { 144 | // cv::Point tl, br; 145 | // if (scaled) { 146 | // tl = cv::Point((int) (xMin * image.cols), (int) (yMin * image.rows)); 147 | // br = cv::Point((int) (xMax * image.cols), (int) (yMax * image.rows)); 148 | // } else { 149 | // tl = cv::Point((int) xMin, (int) yMin); 150 | // br = cv::Point((int) xMax, (int) yMax); 151 | // } 152 | // cv::rectangle(image, tl, br, cv::Scalar(0, 255, 255), 1); 153 | // 154 | // // Ceiling the score down to 3 decimals (weird!) 155 | // float scoreRounded = floorf(score * 1000) / 1000; 156 | // string scoreString = to_string(scoreRounded).substr(0, 5); 157 | // string caption = label + " (" + scoreString + ")"; 158 | // 159 | // // Adding caption of type "LABEL (X.XXX)" to the top-left corner of the bounding box 160 | // int fontCoeff = 12; 161 | // cv::Point brRect = cv::Point(tl.x + caption.length() * fontCoeff / 1.6, tl.y + fontCoeff); 162 | // cv::rectangle(image, tl, brRect, cv::Scalar(0, 255, 255), -1); 163 | // cv::Point textCorner = cv::Point(tl.x, tl.y + fontCoeff * 0.9); 164 | // cv::putText(image, caption, textCorner, FONT_HERSHEY_SIMPLEX, 0.4, cv::Scalar(255, 0, 0)); 165 | // } 166 | // 167 | // /** Draw bounding boxes and add captions to the image. 168 | // * Box is drawn only if corresponding score is higher than the _threshold_. 169 | // */ 170 | // void drawBoundingBoxesOnImage(Mat &image, 171 | // tensorflow::TTypes::Flat &scores, 172 | // tensorflow::TTypes::Flat &classes, 173 | // tensorflow::TTypes::Tensor &boxes, 174 | // map &labelsMap, 175 | // vector &idxs) { 176 | // for (int j = 0; j < idxs.size(); j++) 177 | // drawBoundingBoxOnImage(image, 178 | // boxes(0,idxs.at(j),0), boxes(0,idxs.at(j),1), 179 | // boxes(0,idxs.at(j),2), boxes(0,idxs.at(j),3), 180 | // scores(idxs.at(j)), labelsMap[classes(idxs.at(j))]); 181 | // } 182 | // 183 | // /** Calculate intersection-over-union (IOU) for two given bbox Rects. 184 | // */ 185 | // double IOU(Rect2f box1, Rect2f box2) { 186 | // 187 | // float xA = max(box1.tl().x, box2.tl().x); 188 | // float yA = max(box1.tl().y, box2.tl().y); 189 | // float xB = min(box1.br().x, box2.br().x); 190 | // float yB = min(box1.br().y, box2.br().y); 191 | // 192 | // float intersectArea = abs((xB - xA) * (yB - yA)); 193 | // float unionArea = abs(box1.area()) + abs(box2.area()) - intersectArea; 194 | // 195 | // return 1. * intersectArea / unionArea; 196 | // } 197 | // 198 | // /** Return idxs of good boxes (ones with highest confidence score (>= thresholdScore) 199 | // * and IOU <= thresholdIOU with others). 
200 | // */ 201 | // vector filterBoxes(tensorflow::TTypes::Flat &scores, 202 | // tensorflow::TTypes::Tensor &boxes, 203 | // double thresholdIOU, double thresholdScore) { 204 | // 205 | // vector sortIdxs(scores.size()); 206 | // iota(sortIdxs.begin(), sortIdxs.end(), 0); 207 | // 208 | // // Create set of "bad" idxs 209 | // set badIdxs = set(); 210 | // size_t i = 0; 211 | // while (i < sortIdxs.size()) { 212 | // if (scores(sortIdxs.at(i)) < thresholdScore) 213 | // badIdxs.insert(sortIdxs[i]); 214 | // if (badIdxs.find(sortIdxs.at(i)) != badIdxs.end()) { 215 | // i++; 216 | // continue; 217 | // } 218 | // 219 | // Rect2f box1 = Rect2f(Point2f(boxes(0, sortIdxs.at(i), 1), boxes(0, sortIdxs.at(i), 0)), 220 | // Point2f(boxes(0, sortIdxs.at(i), 3), boxes(0, sortIdxs.at(i), 2))); 221 | // for (size_t j = i + 1; j < sortIdxs.size(); j++) { 222 | // if (scores(sortIdxs.at(j)) < thresholdScore) { 223 | // badIdxs.insert(sortIdxs[j]); 224 | // continue; 225 | // } 226 | // Rect2f box2 = Rect2f(Point2f(boxes(0, sortIdxs.at(j), 1), boxes(0, sortIdxs.at(j), 0)), 227 | // Point2f(boxes(0, sortIdxs.at(j), 3), boxes(0, sortIdxs.at(j), 2))); 228 | // if (IOU(box1, box2) > thresholdIOU) 229 | // badIdxs.insert(sortIdxs[j]); 230 | // } 231 | // i++; 232 | // } 233 | // 234 | // // Prepare "good" idxs for return 235 | // vector goodIdxs = vector(); 236 | // for (auto it = sortIdxs.begin(); it != sortIdxs.end(); it++) 237 | // if (badIdxs.find(sortIdxs.at(*it)) == badIdxs.end()) 238 | // goodIdxs.push_back(*it); 239 | // 240 | // return goodIdxs; 241 | // } 242 | -------------------------------------------------------------------------------- /lidar_pkg/src/utils.h: -------------------------------------------------------------------------------- 1 | #ifndef TF_DETECTOR_EXAMPLE_UTILS_H 2 | #define TF_DETECTOR_EXAMPLE_UTILS_H 3 | 4 | #endif //TF_DETECTOR_EXAMPLE_UTILS_H 5 | 6 | #include 7 | #include "tensorflow/core/framework/tensor.h" 8 | #include "tensorflow/core/public/session.h" 9 | #include 10 | 11 | 12 | using tensorflow::Tensor; 13 | using tensorflow::Status; 14 | using tensorflow::string; 15 | 16 | 17 | Status readLabelsMapFile(const string &fileName, std::map &labelsMap); 18 | 19 | Status loadGraph(const string &graph_file_name, 20 | std::unique_ptr *session); 21 | 22 | Status readTensorFromMat(const cv::Mat &mat, Tensor &outTensor); 23 | 24 | void drawBoundingBoxOnImage(cv::Mat &image, double xMin, double yMin, double xMax, double yMax, double score, std::string label, bool scaled); 25 | 26 | void drawBoundingBoxesOnImage(cv::Mat &image, 27 | tensorflow::TTypes::Flat &scores, 28 | tensorflow::TTypes::Flat &classes, 29 | tensorflow::TTypes::Tensor &boxes, 30 | std::map &labelsMap, 31 | std::vector &idxs); 32 | 33 | double IOU(cv::Rect box1, cv::Rect box2); 34 | 35 | std::vector filterBoxes(tensorflow::TTypes::Flat &scores, 36 | tensorflow::TTypes::Tensor &boxes, 37 | double thresholdIOU, double thresholdScore); 38 | -------------------------------------------------------------------------------- /lidar_pkg/src/visualize.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include 5 | #include "visualize.h" 6 | 7 | PLUGINLIB_EXPORT_CLASS(lidar_pkg::Visualize, nodelet::Nodelet) 8 | 9 | namespace lidar_pkg 10 | { 11 | void Visualize::onInit() 12 | { 13 | ROS_INFO("Initialized visualization."); 14 | ros::NodeHandle& private_nh = getMTPrivateNodeHandle(); 15 | visualization_pub = private_nh.advertise("visualization", 1); 16 
| prediction_sub = private_nh.subscribe("prediction_all", 1, &Visualize::prediction_callback, this); 17 | timer = private_nh.createTimer(ros::Duration(0.1), &Visualize::timerCallback, this); //What time? Should it matter? 18 | } 19 | 20 | void Visualize::prediction_callback(const lidar_msgs::prediction::ConstPtr& msg) 21 | { 22 | newest_prediction = *msg; 23 | data_acquired = true; 24 | lasttime = ros::Time::now(); 25 | count++; 26 | } 27 | 28 | void Visualize::timerCallback(const ros::TimerEvent&) 29 | { 30 | ROS_INFO("VISUAL"); 31 | int count1 = count; 32 | ros::Time currenttime = ros::Time::now(); 33 | std::this_thread::sleep_for(std::chrono::milliseconds(1000)); 34 | ros::Time currenttime2 = ros::Time::now(); 35 | int count2 = count; 36 | while(true){ 37 | if(data_acquired==true) 38 | { 39 | ROS_INFO("PRED"); 40 | ros::Duration d(0.2); 41 | double gap = d.toSec(); 42 | int grid_size = 128; 43 | occupancy_msg.header.frame_id = "/body"; 44 | occupancy_msg.info.resolution = 1./3.; 45 | occupancy_msg.info.width = grid_size; 46 | occupancy_msg.info.height = grid_size; 47 | occupancy_msg.info.origin.position.x = 64.0*(1./3.); 48 | occupancy_msg.info.origin.position.y = -64.0*(1./3.); 49 | 50 | for (unsigned int t=4;t<20;t++){ 51 | occupancy_msg.data.clear(); 52 | if(currenttime == currenttime2){ //count1 == count2 53 | //ROS_INFO("PAUSE"); 54 | 55 | int count_grid = 0; 56 | for (unsigned int i = 0; i<128;i++){ 57 | for (unsigned int j = 0;j<128;j++){ 58 | occupancy_msg.data.push_back(newest_prediction.prediction[t].data[count_grid]); 59 | count_grid++; 60 | } 61 | } 62 | std::this_thread::sleep_for(std::chrono::milliseconds(100)); 63 | 64 | } 65 | else { 66 | //ROS_INFO("NO PAUSE"); 67 | for (unsigned int i = 0; i<128*128; i++){ 68 | occupancy_msg.data.push_back(0.0); 69 | } 70 | } 71 | visualization_pub.publish(occupancy_msg); 72 | } 73 | } 74 | break; 75 | } 76 | } 77 | } 78 | -------------------------------------------------------------------------------- /lidar_pkg/src/visualize.h: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include 5 | #include 6 | #include 7 | 8 | namespace lidar_pkg { 9 | 10 | class Visualize : public nodelet::Nodelet 11 | { 12 | public: 13 | Visualize(){}; 14 | private: 15 | ros::Publisher visualization_pub; 16 | ros::Subscriber prediction_sub; 17 | ros::Timer timer; 18 | nav_msgs::OccupancyGrid occupancy_msg; 19 | lidar_msgs::prediction newest_prediction; 20 | ros::Time lasttime; 21 | int count = 0; 22 | bool data_acquired = false; 23 | virtual void onInit(); 24 | void timerCallback(const ros::TimerEvent&); 25 | void prediction_callback(const lidar_msgs::prediction::ConstPtr& msg); 26 | void saveFrame(std::vector data, int t); 27 | }; 28 | } 29 | -------------------------------------------------------------------------------- /models/prednet.pt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sisl/ROSOccupancyGridPrediction/7bff581b378a7478ecbd7491d89070d1f86259c9/models/prednet.pt -------------------------------------------------------------------------------- /models/prednet_with_saa.pt: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sisl/ROSOccupancyGridPrediction/7bff581b378a7478ecbd7491d89070d1f86259c9/models/prednet_with_saa.pt -------------------------------------------------------------------------------- /models/prednet_with_taa.pt: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/sisl/ROSOccupancyGridPrediction/7bff581b378a7478ecbd7491d89070d1f86259c9/models/prednet_with_taa.pt -------------------------------------------------------------------------------- /pcl_tf_frame/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | project(pcl_tf_frame) 2 | find_package(catkin REQUIRED COMPONENTS roscpp tf2 tf2_ros) 3 | catkin_package() 4 | include_directories(${catkin_INCLUDE_DIRS}) 5 | add_executable(frame_broadcaster src/frame_broadcaster.cpp) 6 | target_link_libraries(frame_broadcaster ${catkin_LIBRARIES}) 7 | -------------------------------------------------------------------------------- /pcl_tf_frame/package.xml: -------------------------------------------------------------------------------- 1 | <?xml version="1.0"?> 2 | <package> 3 | <name>pcl_tf_frame</name> 4 | <version>0.0.0</version> 5 | <description>The pcl_tf_frame package</description> 6 | <maintainer>bernard</maintainer> 7 | <license>TODO</license> 8 | <buildtool_depend>catkin</buildtool_depend> 9 | </package> 10 | -------------------------------------------------------------------------------- /pcl_tf_frame/src/frame_broadcaster.cpp: -------------------------------------------------------------------------------- 1 | #include <ros/ros.h> 2 | #include <tf2_ros/transform_broadcaster.h> 3 | #include <tf2/LinearMath/Quaternion.h> 4 | 5 | int main(int argc, char** argv){ 6 | ros::init(argc, argv, "my_tf2_broadcaster"); 7 | ros::NodeHandle node; 8 | 9 | tf2_ros::TransformBroadcaster tfb; 10 | geometry_msgs::TransformStamped transformStamped; 11 | 12 | 13 | transformStamped.header.frame_id = "body"; 14 | transformStamped.child_frame_id = "global_pcl_frame"; 15 | transformStamped.transform.translation.x = 0.0; 16 | transformStamped.transform.translation.y = 0.0; 17 | transformStamped.transform.translation.z = 0.0; 18 | double pi = 3.14159; 19 | tf2::Quaternion q; 20 | q.setRPY(1*pi, 0, 0); // Roll-Pitch-Yaw: rotation about x->y->z in that order 21 | transformStamped.transform.rotation.x = q.x(); 22 | transformStamped.transform.rotation.y = q.y(); 23 | transformStamped.transform.rotation.z = q.z(); 24 | transformStamped.transform.rotation.w = q.w(); 25 | 26 | ros::Rate rate(10.0); 27 | while (node.ok()){ 28 | transformStamped.header.stamp = ros::Time::now(); 29 | tfb.sendTransform(transformStamped); 30 | rate.sleep(); 31 | // printf("sending\n"); 32 | } 33 | 34 | } 35 | --------------------------------------------------------------------------------