├── LICENSE
├── README.md
├── config
│   ├── config.json
│   └── optimizer.json
├── docs
│   └── images
│       ├── calibrated.png
│       └── uncalibrated.png
├── environment.txt
├── extract.py
├── optimize.py
├── requirements.txt
└── utilities.py
/LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2022 K. Ta 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # L2E: Lasers to Events for 6-DoF Extrinsic Calibration of Lidars and Event Cameras 2 | 3 | [[Paper]](https://ieeexplore.ieee.org/abstract/document/10161220) [[Arxiv]](https://arxiv.org/abs/2207.01009) 4 | 5 | This repository contains the official code for the ICRA 2023 paper [L2E: Lasers to Events for 6-DoF Extrinsic Calibration of Lidars and Event Cameras](https://ieeexplore.ieee.org/abstract/document/10161220), a calibration process for event cameras and lidar via mutual information maximization. 6 | 7 | **Uncalibrated Result** 8 | 9 | ![Uncalibrated frame camera image.](docs/images/uncalibrated.png) 10 | 11 | **Calibrated Result** 12 | 13 | ![Calibrated frame camera image.](docs/images/calibrated.png) 14 | 15 | ## Abstract 16 | 17 | As neuromorphic technology is maturing, its application to robotics and autonomous vehicle systems has become an area of active research. In particular, event cameras have emerged as a compelling alternative to frame-based cameras in low-power and latency-demanding applications. To enable event cameras to operate alongside staple sensors like lidar in perception tasks, we propose a direct, temporally-decoupled extrinsic calibration method between event cameras and lidars. The high dynamic range, high temporal resolution, and low-latency operation of event cameras are exploited to directly register lidar laser returns, allowing information-based correlation methods to optimize for the 6-DoF extrinsic calibration between the two sensors. This paper presents L2E, the first direct calibration method between event cameras and lidars, removing dependencies on frame-based camera intermediaries and/or highly-accurate hand measurements. 18 | 19 | ## Usage 20 | 21 | ### Installing Dependencies 22 | 23 | We provide two files for installing dependencies. 
To run just the optimization process, you can install the requirements via `pip`. This was tested with Python 3.8.10. 24 | 25 | ```bash 26 | python3 -m venv venv 27 | source venv/bin/activate && pip install -r requirements.txt 28 | ``` 29 | 30 | For the extraction script, the command-line tool has ROS dependencies, which are typically added to the default Python path when you install ROS. These scripts were run with a ROS Noetic installation. We also provide a Conda environment that can run the extraction script. This was tested with Python 3.7.12. 31 | 32 | ```bash 33 | conda create --name l2e --file environment.txt 34 | ``` 35 | 36 | ### ROS Bag Extraction 37 | 38 | [[Sample ROS Bag]](https://drive.google.com/file/d/1yOeh6oB30gYrakfRZU4Pxx3r9lyid7Uh/view?usp=sharing) 39 | 40 | Our sample dataset was built with the following available sensors: 41 | 42 | - **RS-LiDAR-M1** `/rslidar_points` An automotive-grade MEMS LiDAR with a 120° HFoV and a 25° VFoV 43 | - **Prophesee Gen4.1** `/prophesee/camera/cd_events_buffer` A state-of-the-art event-based camera 44 | 45 | The above sensors were run via a ROS interface and their data collected into rosbags under the associated rostopics. You can run the extraction script with the following command; the flags select which sensors to extract. 46 | 47 | ```bash 48 | python extract.py {path/to/bag_folder/} {path/to/output_folder} --event --lidar 49 | ``` 50 | 51 | The event image reconstruction is performed by accumulating events regardless of polarity to create an accumulated event map. Most triggered events in a static scene are induced by the lidar lasers, as the event camera is sensitive to the 905 nm wavelength of light. This script separates the sensor data into the folders `event` and `lidar`; files extracted from the same bag share the same stem name. 53 | 53 | You can use `--help` to print the docstring information for the extraction script; press `q` to exit the pager. 
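The polarity-agnostic accumulation described above can be sketched with plain NumPy. The helper below is only illustrative (the actual logic lives in `extract.py`), and the sample events are made up:

```python
import numpy as np

def accumulate_events(events, height=720, width=1280, clip_count=127):
    """Accumulate events into a single intensity image, ignoring polarity."""
    image = np.zeros((height, width), dtype=np.uint16)
    for x, y, _polarity in events:
        image[y, x] += 1  # every event counts, whatever its sign
    # clip so hot pixels do not wash out the 8-bit output
    return np.clip(image, 0, clip_count).astype(np.uint8)

# three events on one pixel, one event on another, as (x, y, polarity)
frame = accumulate_events([(10, 5, 1), (10, 5, -1), (10, 5, 1), (20, 7, -1)])
```

Pixels repeatedly hit by the lidar's 905 nm returns accumulate high counts, so a static scene yields an image of the laser scan pattern.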
54 | 55 | ### Run Optimization 56 | 57 | [[Sample Data]](https://drive.google.com/file/d/1QuxNBCBNYBv8LCTTewQFRE_vFg0PBBeS/view?usp=sharing) 58 | 59 | Alternatively, download the data with: 60 | 61 | ```bash 62 | pip install gdown 63 | gdown https://drive.google.com/uc?id=1QuxNBCBNYBv8LCTTewQFRE_vFg0PBBeS 64 | unzip sample_data.zip 65 | ``` 66 | 67 | If you have already-extracted images and point cloud scans, you can also arrange them into the same format as the sample data. 68 | 69 | The mutual information optimization can then be run with the following command: 70 | 71 | ```bash 72 | python optimize.py {path/to/images/} {path/to/lidar_scans/} {path/to/output_folder/} 73 | ``` 74 | 75 | Additional arguments / parameters include: 76 | 77 | - `--bounded` flag for optimizing with bounds on seed values _(default True)_ 78 | - `--disp_bounds` max displacement difference from seed values in m _(default ±0.2 m from seed)_ 79 | - `--deg_bounds` max rotation difference from seed values in degrees _(default ±5 degrees from seed)_ 80 | - `--optimizer` optimizer to use in _{'nelder-mead', 'Powell', 'L-BFGS-B', 'BFGS', 'CG', 'SLSQP'}_ 81 | - `--image_blur` Gaussian blurring std on the image to smooth optimization _(default 5 px)_ 82 | - `--key` key for the intrinsic calibration in `config.json` _(default cam)_ 83 | - `--use_axis_representation` flag to use a rotation vector over Euler angles _(default True)_ 84 | - `--calibrate_rotation_only` flag to only calibrate rotation (useful if CAD translations are trusted) _(default False)_ 85 | 86 | You can use `--help` to print the docstring information for the optimization script; press `q` to exit the pager. 
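The objective maximized here is the mutual information between projected lidar intensities and event-image intensities. As a rough illustration only (the repository's actual implementation lives in `utilities.get_mutual_information`), a simple histogram-based MI estimate between two intensity arrays looks like:

```python
import numpy as np

def mutual_information(a, b, bins=64):
    """Histogram-based mutual information estimate in nats."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()            # joint distribution
    px = pxy.sum(axis=1, keepdims=True)  # marginal of a
    py = pxy.sum(axis=0, keepdims=True)  # marginal of b
    nz = pxy > 0                         # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.integers(0, 255, 10_000)
aligned = mutual_information(x, x + rng.integers(0, 16, 10_000))
shuffled = mutual_information(x, rng.permutation(x))
```

A well-aligned extrinsic projects lidar returns onto the bright laser traces in the event image, raising this score, so the optimizer searches the 6-DoF pose for the maximum.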
87 | 88 | ## Citation 89 | 90 | ```bibtex 91 | @inproceedings{kevinta2023:l2e, 92 | title = {L2E: Lasers to Events for 6-DoF Extrinsic Calibration of Lidars and Event Cameras}, 93 | author = {Ta, Kevin and Bruggemann, David and Brödermann, Tim and Sakaridis, Christos and Van Gool, Luc}, 94 | booktitle = {International Conference on Robotics and Automation (ICRA)}, 95 | year = {2023} 96 | } 97 | ``` 98 | -------------------------------------------------------------------------------- /config/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "cam": { 3 | "K": [ 4 | [1043.9777149158745, 0, 620.3542209837874], 5 | [0, 1044.3901630718665, 343.75690903851563], 6 | [0, 0, 1] 7 | ], 8 | "D": [ 9 | -0.4558224833720169, 0.29944271675699663, 6.867280678643097e-5, 10 | -0.00012528851559686296, -0.139114502340535 11 | ], 12 | "x": 0.188, 13 | "y": 0, 14 | "z": -0.05, 15 | "rz": 90, 16 | "ry": -90, 17 | "rx": 0, 18 | "Rodrigues": [1.20919958, -1.20919958, 1.20919958] 19 | } 20 | } 21 | -------------------------------------------------------------------------------- /config/optimizer.json: -------------------------------------------------------------------------------- 1 | { 2 | "nelder-mead": { 3 | "jac": null, 4 | "options": { 5 | "xatol": 1e-8, 6 | "fatol": 1e-8, 7 | "disp": true 8 | } 9 | }, 10 | "Powell": { 11 | "jac": null, 12 | "options": { 13 | "xtol": 1e-8, 14 | "ftol": 1e-8, 15 | "disp": true 16 | } 17 | }, 18 | "L-BFGS-B": { 19 | "jac": "3-point", 20 | "options": { 21 | "maxcor": 10, 22 | "eps": 1e-4, 23 | "disp": true 24 | } 25 | }, 26 | "SLSQP": { 27 | "jac": "3-point", 28 | "options": { 29 | "ftol": 1e-8, 30 | "eps": 1e-3, 31 | "disp": true 32 | } 33 | }, 34 | "BFGS": { 35 | "jac": "3-point", 36 | "options": { 37 | "eps": 1e-4, 38 | "disp": true 39 | } 40 | }, 41 | "CG": { 42 | "jac": "3-point", 43 | "options": { 44 | "eps": 1e-4, 45 | "disp": true 46 | } 47 | } 48 | } 49 | 
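The `rz`/`ry`/`rx` Euler angles and the `Rodrigues` rotation vector in `config.json` encode the same seed rotation. Assuming an intrinsic Z-Y-X composition (which reproduces the stored vector), the equivalence can be checked with SciPy:

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Euler seed from config.json: rz = 90, ry = -90, rx = 0 (degrees)
rot = Rotation.from_euler("ZYX", [90, -90, 0], degrees=True)

# matches the stored "Rodrigues" entry [1.20919958, -1.20919958, 1.20919958]
rodrigues = rot.as_rotvec()
```

This is a 120° rotation about the axis (1, -1, 1)/√3, i.e. the standard lidar-frame-to-camera-frame axis swap.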
-------------------------------------------------------------------------------- /docs/images/calibrated.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kev-in-ta/l2e/5e632c5f8c6d4620331b7343e4d22e91ac01761a/docs/images/calibrated.png -------------------------------------------------------------------------------- /docs/images/uncalibrated.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kev-in-ta/l2e/5e632c5f8c6d4620331b7343e4d22e91ac01761a/docs/images/uncalibrated.png -------------------------------------------------------------------------------- /environment.txt: -------------------------------------------------------------------------------- 1 | # This file may be used to create an environment using: 2 | # $ conda create --name --file 3 | # platform: linux-64 4 | @EXPLICIT 5 | https://conda.anaconda.org/conda-forge/linux-64/_libgcc_mutex-0.1-conda_forge.tar.bz2 6 | https://conda.anaconda.org/conda-forge/linux-64/ca-certificates-2022.6.15-ha878542_0.tar.bz2 7 | https://conda.anaconda.org/conda-forge/linux-64/ld_impl_linux-64-2.36.1-hea4e1c9_2.tar.bz2 8 | https://conda.anaconda.org/conda-forge/linux-64/libgfortran5-12.1.0-hdcd56e2_16.tar.bz2 9 | https://conda.anaconda.org/conda-forge/linux-64/libstdcxx-ng-12.1.0-ha89aaad_16.tar.bz2 10 | https://conda.anaconda.org/conda-forge/noarch/ros-conda-mutex-1.0-melodic.tar.bz2 11 | https://conda.anaconda.org/conda-forge/linux-64/libgfortran-ng-12.1.0-h69a702a_16.tar.bz2 12 | https://conda.anaconda.org/conda-forge/linux-64/libgomp-12.1.0-h8d9b700_16.tar.bz2 13 | https://conda.anaconda.org/conda-forge/noarch/ros-conda-base-0.0.2-hcb32578_2.tar.bz2 14 | https://conda.anaconda.org/conda-forge/linux-64/_openmp_mutex-4.5-2_gnu.tar.bz2 15 | https://conda.anaconda.org/conda-forge/linux-64/libgcc-ng-12.1.0-h8d9b700_16.tar.bz2 16 | 
https://conda.anaconda.org/conda-forge/linux-64/bzip2-1.0.8-h7f98852_4.tar.bz2 17 | https://conda.anaconda.org/conda-forge/linux-64/c-ares-1.18.1-h7f98852_0.tar.bz2 18 | https://conda.anaconda.org/conda-forge/linux-64/console_bridge-0.4.4-hc9558a2_0.tar.bz2 19 | https://conda.anaconda.org/conda-forge/linux-64/expat-2.4.8-h27087fc_0.tar.bz2 20 | https://conda.anaconda.org/conda-forge/linux-64/giflib-5.2.1-h36c2ea0_2.tar.bz2 21 | https://conda.anaconda.org/conda-forge/linux-64/gmp-6.2.1-h58526e2_0.tar.bz2 22 | https://conda.anaconda.org/conda-forge/linux-64/graphite2-1.3.13-h58526e2_1001.tar.bz2 23 | https://conda.anaconda.org/conda-forge/linux-64/gtest-1.11.0-h924138e_0.tar.bz2 24 | https://conda.anaconda.org/conda-forge/linux-64/icu-68.2-h9c3ff4c_0.tar.bz2 25 | https://conda.anaconda.org/conda-forge/linux-64/jpeg-9e-h166bdaf_2.tar.bz2 26 | https://conda.anaconda.org/conda-forge/linux-64/keyutils-1.6.1-h166bdaf_0.tar.bz2 27 | https://conda.anaconda.org/conda-forge/linux-64/lame-3.100-h7f98852_1001.tar.bz2 28 | https://conda.anaconda.org/conda-forge/linux-64/lerc-4.0.0-h27087fc_0.tar.bz2 29 | https://conda.anaconda.org/conda-forge/linux-64/libapr-1.7.0-h7f98852_5.tar.bz2 30 | https://conda.anaconda.org/conda-forge/linux-64/libapriconv-1.2.2-h7f98852_5.tar.bz2 31 | https://conda.anaconda.org/conda-forge/linux-64/libdeflate-1.13-h166bdaf_0.tar.bz2 32 | https://conda.anaconda.org/conda-forge/linux-64/libev-4.33-h516909a_1.tar.bz2 33 | https://conda.anaconda.org/conda-forge/linux-64/libffi-3.4.2-h7f98852_5.tar.bz2 34 | https://conda.anaconda.org/conda-forge/linux-64/libiconv-1.16-h516909a_0.tar.bz2 35 | https://conda.anaconda.org/conda-forge/linux-64/libnsl-2.0.0-h7f98852_0.tar.bz2 36 | https://conda.anaconda.org/conda-forge/linux-64/libopenblas-0.3.21-pthreads_h78a6416_1.tar.bz2 37 | https://conda.anaconda.org/conda-forge/linux-64/libuuid-2.32.1-h7f98852_1000.tar.bz2 38 | https://conda.anaconda.org/conda-forge/linux-64/libuv-1.44.2-h166bdaf_0.tar.bz2 39 | 
https://conda.anaconda.org/conda-forge/linux-64/libwebp-base-1.2.4-h166bdaf_0.tar.bz2 40 | https://conda.anaconda.org/conda-forge/linux-64/libzlib-1.2.12-h166bdaf_2.tar.bz2 41 | https://conda.anaconda.org/conda-forge/linux-64/lz4-c-1.9.3-h9c3ff4c_1.tar.bz2 42 | https://conda.anaconda.org/conda-forge/linux-64/ncurses-6.3-h27087fc_1.tar.bz2 43 | https://conda.anaconda.org/conda-forge/linux-64/nettle-3.6-he412f7d_0.tar.bz2 44 | https://conda.anaconda.org/conda-forge/linux-64/nspr-4.32-h9c3ff4c_1.tar.bz2 45 | https://conda.anaconda.org/conda-forge/linux-64/openssl-1.1.1q-h166bdaf_0.tar.bz2 46 | https://conda.anaconda.org/conda-forge/linux-64/pcre-8.45-h9c3ff4c_0.tar.bz2 47 | https://conda.anaconda.org/conda-forge/linux-64/pixman-0.40.0-h36c2ea0_0.tar.bz2 48 | https://conda.anaconda.org/conda-forge/linux-64/pkg-config-0.29.2-h36c2ea0_1008.tar.bz2 49 | https://conda.anaconda.org/conda-forge/linux-64/pthread-stubs-0.4-h36c2ea0_1001.tar.bz2 50 | https://conda.anaconda.org/conda-forge/linux-64/rhash-1.4.3-h166bdaf_0.tar.bz2 51 | https://conda.anaconda.org/conda-forge/linux-64/tinyxml2-9.0.0-h9c3ff4c_2.tar.bz2 52 | https://conda.anaconda.org/conda-forge/linux-64/x264-1!152.20180806-h14c3975_0.tar.bz2 53 | https://conda.anaconda.org/conda-forge/linux-64/xorg-kbproto-1.0.7-h7f98852_1002.tar.bz2 54 | https://conda.anaconda.org/conda-forge/linux-64/xorg-libice-1.0.10-h7f98852_0.tar.bz2 55 | https://conda.anaconda.org/conda-forge/linux-64/xorg-libxau-1.0.9-h7f98852_0.tar.bz2 56 | https://conda.anaconda.org/conda-forge/linux-64/xorg-libxdmcp-1.1.3-h7f98852_0.tar.bz2 57 | https://conda.anaconda.org/conda-forge/linux-64/xorg-renderproto-0.11.1-h7f98852_1002.tar.bz2 58 | https://conda.anaconda.org/conda-forge/linux-64/xorg-xextproto-7.3.0-h7f98852_1002.tar.bz2 59 | https://conda.anaconda.org/conda-forge/linux-64/xorg-xproto-7.0.31-h7f98852_1007.tar.bz2 60 | https://conda.anaconda.org/conda-forge/linux-64/xz-5.2.6-h166bdaf_0.tar.bz2 61 | 
https://conda.anaconda.org/conda-forge/linux-64/yaml-0.2.5-h7f98852_2.tar.bz2 62 | https://conda.anaconda.org/conda-forge/linux-64/gettext-0.19.8.1-h73d1719_1008.tar.bz2 63 | https://conda.anaconda.org/conda-forge/linux-64/gmock-1.11.0-h924138e_0.tar.bz2 64 | https://conda.anaconda.org/conda-forge/linux-64/gnutls-3.6.13-h85f3911_1.tar.bz2 65 | https://conda.anaconda.org/conda-forge/linux-64/jasper-1.900.1-h07fcdf6_1006.tar.bz2 66 | https://conda.anaconda.org/conda-forge/linux-64/libaprutil-1.6.1-h975c496_5.tar.bz2 67 | https://conda.anaconda.org/conda-forge/linux-64/libblas-3.9.0-16_linux64_openblas.tar.bz2 68 | https://conda.anaconda.org/conda-forge/linux-64/libedit-3.1.20191231-he28a2e2_2.tar.bz2 69 | https://conda.anaconda.org/conda-forge/linux-64/libevent-2.1.10-h9b69904_4.tar.bz2 70 | https://conda.anaconda.org/conda-forge/linux-64/libllvm11-11.1.0-hf817b99_3.tar.bz2 71 | https://conda.anaconda.org/conda-forge/linux-64/libnghttp2-1.47.0-hdcd2b5c_1.tar.bz2 72 | https://conda.anaconda.org/conda-forge/linux-64/libpng-1.6.37-h753d276_4.tar.bz2 73 | https://conda.anaconda.org/conda-forge/linux-64/libsqlite-3.39.2-h753d276_1.tar.bz2 74 | https://conda.anaconda.org/conda-forge/linux-64/libssh2-1.10.0-haa6b8db_3.tar.bz2 75 | https://conda.anaconda.org/conda-forge/linux-64/libxcb-1.13-h7f98852_1004.tar.bz2 76 | https://conda.anaconda.org/conda-forge/linux-64/mysql-common-8.0.30-haf5c9bc_0.tar.bz2 77 | https://conda.anaconda.org/conda-forge/linux-64/readline-8.1.2-h0f457ee_0.tar.bz2 78 | https://conda.anaconda.org/conda-forge/linux-64/tk-8.6.12-h27826a3_0.tar.bz2 79 | https://conda.anaconda.org/conda-forge/linux-64/xorg-libsm-1.2.3-hd9c2040_1000.tar.bz2 80 | https://conda.anaconda.org/conda-forge/linux-64/zlib-1.2.12-h166bdaf_2.tar.bz2 81 | https://conda.anaconda.org/conda-forge/linux-64/zstd-1.5.2-h6239696_4.tar.bz2 82 | https://conda.anaconda.org/conda-forge/linux-64/apr-1.7.0-h7f98852_5.tar.bz2 83 | 
https://conda.anaconda.org/conda-forge/linux-64/boost-cpp-1.72.0-h312852a_5.tar.bz2 84 | https://conda.anaconda.org/conda-forge/linux-64/freetype-2.12.1-hca18f0e_0.tar.bz2 85 | https://conda.anaconda.org/conda-forge/linux-64/krb5-1.19.3-h3790be6_0.tar.bz2 86 | https://conda.anaconda.org/conda-forge/linux-64/libcblas-3.9.0-16_linux64_openblas.tar.bz2 87 | https://conda.anaconda.org/conda-forge/linux-64/libclang-11.1.0-default_ha53f305_1.tar.bz2 88 | https://conda.anaconda.org/conda-forge/linux-64/libglib-2.72.1-h2d90d5f_0.tar.bz2 89 | https://conda.anaconda.org/conda-forge/linux-64/libgpg-error-1.45-hc0c96e0_0.tar.bz2 90 | https://conda.anaconda.org/conda-forge/linux-64/liblapack-3.9.0-16_linux64_openblas.tar.bz2 91 | https://conda.anaconda.org/conda-forge/linux-64/libtiff-4.4.0-h0e0dad5_3.tar.bz2 92 | https://conda.anaconda.org/conda-forge/linux-64/libxml2-2.9.12-h72842e0_0.tar.bz2 93 | https://conda.anaconda.org/conda-forge/linux-64/mysql-libs-8.0.30-h28c427c_0.tar.bz2 94 | https://conda.anaconda.org/conda-forge/linux-64/openh264-2.1.1-h780b84a_0.tar.bz2 95 | https://conda.anaconda.org/conda-forge/linux-64/sqlite-3.39.2-h4ff8645_1.tar.bz2 96 | https://conda.anaconda.org/conda-forge/linux-64/unixodbc-2.3.10-h583eb01_0.tar.bz2 97 | https://conda.anaconda.org/conda-forge/linux-64/xorg-libx11-1.7.2-h7f98852_0.tar.bz2 98 | https://conda.anaconda.org/conda-forge/linux-64/dbus-1.13.6-h5008d03_3.tar.bz2 99 | https://conda.anaconda.org/conda-forge/linux-64/ffmpeg-4.2.3-h167e202_0.tar.bz2 100 | https://conda.anaconda.org/conda-forge/linux-64/fontconfig-2.14.0-h8e229c2_0.tar.bz2 101 | https://conda.anaconda.org/conda-forge/linux-64/gstreamer-1.18.5-h9f60fe5_3.tar.bz2 102 | https://conda.anaconda.org/conda-forge/linux-64/libassuan-2.5.5-h9c3ff4c_0.tar.bz2 103 | https://conda.anaconda.org/conda-forge/linux-64/libcurl-7.83.1-h7bff187_0.tar.bz2 104 | https://conda.anaconda.org/conda-forge/linux-64/liblapacke-3.9.0-16_linux64_openblas.tar.bz2 105 | 
https://conda.anaconda.org/conda-forge/linux-64/libpq-13.5-hd57d9b9_1.tar.bz2 106 | https://conda.anaconda.org/conda-forge/linux-64/libwebp-1.2.4-h522a892_0.tar.bz2 107 | https://conda.anaconda.org/conda-forge/linux-64/libxkbcommon-1.0.3-he3ba5ed_0.tar.bz2 108 | https://conda.anaconda.org/conda-forge/linux-64/log4cxx-0.11.0-h291d653_3.tar.bz2 109 | https://conda.anaconda.org/conda-forge/linux-64/nss-3.78-h2350873_0.tar.bz2 110 | https://conda.anaconda.org/conda-forge/linux-64/poco-1.12.2-h3790be6_0.tar.bz2 111 | https://conda.anaconda.org/conda-forge/linux-64/python-3.7.12-hb7a2778_100_cpython.tar.bz2 112 | https://conda.anaconda.org/conda-forge/linux-64/xorg-libxext-1.3.4-h7f98852_1.tar.bz2 113 | https://conda.anaconda.org/conda-forge/linux-64/xorg-libxrender-0.9.10-h7f98852_1003.tar.bz2 114 | https://conda.anaconda.org/conda-forge/linux-64/cairo-1.16.0-h6cf1ce9_1008.tar.bz2 115 | https://conda.anaconda.org/conda-forge/linux-64/cmake-3.24.1-h5432695_0.tar.bz2 116 | https://conda.anaconda.org/conda-forge/noarch/colorama-0.4.5-pyhd8ed1ab_0.tar.bz2 117 | https://conda.anaconda.org/conda-forge/noarch/distro-1.6.0-pyhd8ed1ab_0.tar.bz2 118 | https://conda.anaconda.org/conda-forge/noarch/empy-3.3.4-pyh9f0ad1d_1.tar.bz2 119 | https://conda.anaconda.org/conda-forge/linux-64/gpgme-1.13.1-he1b5a44_1.tar.bz2 120 | https://conda.anaconda.org/conda-forge/linux-64/gst-plugins-base-1.18.4-h29181c9_0.tar.bz2 121 | https://conda.anaconda.org/conda-forge/linux-64/hdf5-1.10.6-nompi_h6a2412b_1114.tar.bz2 122 | https://conda.anaconda.org/conda-forge/noarch/nose-1.3.7-py_1006.tar.bz2 123 | https://conda.anaconda.org/conda-forge/noarch/pyparsing-3.0.9-pyhd8ed1ab_0.tar.bz2 124 | https://conda.anaconda.org/conda-forge/noarch/python-gnupg-0.4.9-pyhd8ed1ab_0.tar.bz2 125 | https://conda.anaconda.org/conda-forge/linux-64/python_abi-3.7-2_cp37m.tar.bz2 126 | https://conda.anaconda.org/conda-forge/linux-64/ros-class-loader-0.4.1-hd02d5f2_2.tar.bz2 127 | 
https://conda.anaconda.org/conda-forge/linux-64/ros-cpp-common-0.6.12-py37hd02d5f2_3.tar.bz2 128 | https://conda.anaconda.org/conda-forge/noarch/six-1.16.0-pyh6c4a22f_0.tar.bz2 129 | https://conda.anaconda.org/conda-forge/noarch/termcolor-1.1.0-pyhd8ed1ab_3.tar.bz2 130 | https://conda.anaconda.org/conda-forge/noarch/wheel-0.37.1-pyhd8ed1ab_0.tar.bz2 131 | https://conda.anaconda.org/conda-forge/linux-64/docutils-0.19-py37h89c1867_0.tar.bz2 132 | https://conda.anaconda.org/conda-forge/noarch/fire-0.4.0-pyh44b312d_0.tar.bz2 133 | https://conda.anaconda.org/conda-forge/linux-64/harfbuzz-2.9.1-h83ec7ef_1.tar.bz2 134 | https://conda.anaconda.org/conda-forge/linux-64/netifaces-0.11.0-py37h540881e_0.tar.bz2 135 | https://conda.anaconda.org/conda-forge/linux-64/numpy-1.21.6-py37h976b520_0.tar.bz2 136 | https://conda.anaconda.org/conda-forge/linux-64/pycrypto-2.6.1-py37h5e8e339_1006.tar.bz2 137 | https://conda.anaconda.org/conda-forge/noarch/python-dateutil-2.8.2-pyhd8ed1ab_0.tar.bz2 138 | https://conda.anaconda.org/conda-forge/linux-64/pyyaml-6.0-py37h540881e_4.tar.bz2 139 | https://conda.anaconda.org/conda-forge/linux-64/qt-5.12.9-hda022c4_4.tar.bz2 140 | https://conda.anaconda.org/conda-forge/linux-64/ros-catkin-0.7.17-py37h3340039_5.tar.bz2 141 | https://conda.anaconda.org/conda-forge/linux-64/ros-environment-1.2.1-py37h3340039_2.tar.bz2 142 | https://conda.anaconda.org/conda-forge/linux-64/ros-roslz4-1.14.10.1-py37hab448d8_2.tar.bz2 143 | https://conda.anaconda.org/conda-forge/linux-64/ros-rostime-0.6.12-heaf0793_3.tar.bz2 144 | https://conda.anaconda.org/conda-forge/linux-64/setuptools-65.1.0-py37h89c1867_0.tar.bz2 145 | https://conda.anaconda.org/conda-forge/noarch/tqdm-4.64.0-pyhd8ed1ab_0.tar.bz2 146 | https://conda.anaconda.org/conda-forge/linux-64/boost-1.72.0-py37h48f8a5e_1.tar.bz2 147 | https://conda.anaconda.org/conda-forge/noarch/catkin_pkg-0.5.2-pyhd8ed1ab_0.tar.bz2 148 | https://conda.anaconda.org/conda-forge/linux-64/libopencv-4.2.0-py37_7.tar.bz2 149 | 
https://conda.anaconda.org/conda-forge/noarch/pip-22.2.2-pyhd8ed1ab_0.tar.bz2 150 | https://conda.anaconda.org/conda-forge/linux-64/ros-genmsg-0.5.12-py37h3340039_1.tar.bz2 151 | https://conda.anaconda.org/conda-forge/linux-64/ros-roscpp-traits-0.6.12-he1b5a44_0.tar.bz2 152 | https://conda.anaconda.org/conda-forge/linux-64/ros-xmlrpcpp-1.14.3-he1b5a44_0.tar.bz2 153 | https://conda.anaconda.org/conda-forge/linux-64/scipy-1.7.3-py37hf2a6cf1_0.tar.bz2 154 | https://conda.anaconda.org/conda-forge/linux-64/py-opencv-4.2.0-py37h43977f1_7.tar.bz2 155 | https://conda.anaconda.org/conda-forge/linux-64/ros-gencpp-0.6.2-py37h3340039_1.tar.bz2 156 | https://conda.anaconda.org/conda-forge/linux-64/ros-geneus-2.2.6-py37h3340039_1.tar.bz2 157 | https://conda.anaconda.org/conda-forge/linux-64/ros-genlisp-0.4.16-py37h3340039_1.tar.bz2 158 | https://conda.anaconda.org/conda-forge/linux-64/ros-gennodejs-2.0.1-py37h3340039_1.tar.bz2 159 | https://conda.anaconda.org/conda-forge/linux-64/ros-genpy-0.6.8-py37h3340039_1.tar.bz2 160 | https://conda.anaconda.org/conda-forge/linux-64/ros-roscpp-serialization-0.6.12-he1b5a44_0.tar.bz2 161 | https://conda.anaconda.org/conda-forge/noarch/rospkg-1.4.0-pyhd8ed1ab_0.tar.bz2 162 | https://conda.anaconda.org/conda-forge/linux-64/opencv-4.2.0-py37_7.tar.bz2 163 | https://conda.anaconda.org/conda-forge/linux-64/ros-message-generation-0.4.0-h3340039_1.tar.bz2 164 | https://conda.anaconda.org/conda-forge/linux-64/ros-message-runtime-0.4.12-he1b5a44_0.tar.bz2 165 | https://conda.anaconda.org/conda-forge/linux-64/ros-rosgraph-1.14.3-py37h3340039_1.tar.bz2 166 | https://conda.anaconda.org/conda-forge/linux-64/rosdistro-0.8.3-py37h89c1867_4.tar.bz2 167 | https://conda.anaconda.org/conda-forge/linux-64/ros-rosbuild-1.14.6-he1b5a44_0.tar.bz2 168 | https://conda.anaconda.org/conda-forge/linux-64/ros-std-msgs-0.5.12-py37h3340039_1.tar.bz2 169 | https://conda.anaconda.org/conda-forge/linux-64/ros-std-srvs-1.11.2-py37h3340039_1.tar.bz2 170 | 
https://conda.anaconda.org/conda-forge/noarch/rosdep-0.22.1-pyhd8ed1ab_0.tar.bz2 171 | https://conda.anaconda.org/conda-forge/linux-64/ros-geometry-msgs-1.12.7-py37h3340039_1.tar.bz2 172 | https://conda.anaconda.org/conda-forge/linux-64/ros-rosconsole-1.13.10-hfdf2c5b_2.tar.bz2 173 | https://conda.anaconda.org/conda-forge/linux-64/ros-rosgraph-msgs-1.11.2-py37h3340039_1.tar.bz2 174 | https://conda.anaconda.org/conda-forge/linux-64/ros-rospack-2.5.3-py37hd02d5f2_1.tar.bz2 175 | https://conda.anaconda.org/conda-forge/linux-64/ros-cv-bridge-1.13.0-py37h8454771_2.tar.bz2 176 | https://conda.anaconda.org/conda-forge/linux-64/ros-roscpp-1.14.3-py37heaf0793_4.tar.bz2 177 | https://conda.anaconda.org/conda-forge/linux-64/ros-roslib-1.14.6-py37heaf0793_4.tar.bz2 178 | https://conda.anaconda.org/conda-forge/linux-64/ros-sensor-msgs-1.12.7-py37h3340039_1.tar.bz2 179 | https://conda.anaconda.org/conda-forge/linux-64/ros-pluginlib-1.12.1-hd02d5f2_1.tar.bz2 180 | https://conda.anaconda.org/conda-forge/linux-64/ros-rospy-1.14.3-py37h3340039_1.tar.bz2 181 | https://conda.anaconda.org/conda-forge/linux-64/ros-topic-tools-1.14.3-py37h3340039_1.tar.bz2 182 | https://conda.anaconda.org/conda-forge/linux-64/ros-rosbag-storage-1.14.3-he9e1de5_1.tar.bz2 183 | https://conda.anaconda.org/conda-forge/linux-64/ros-rosbag-1.14.3-py37hd02d5f2_1.tar.bz2 184 | -------------------------------------------------------------------------------- /extract.py: -------------------------------------------------------------------------------- 1 | """Command line tool for extracting a rosbag.""" 2 | import pathlib 3 | import time 4 | from pathlib import Path 5 | from typing import List 6 | 7 | import cv2 8 | import fire 9 | import numpy as np 10 | import rosbag 11 | import sensor_msgs.point_cloud2 as pc2 12 | import tqdm 13 | 14 | 15 | def extract_bag( 16 | bag_dir: str, 17 | output_dir: str, 18 | event: bool = False, 19 | lidar: bool = False, 20 | ) -> None: 21 | """Extract rosbags to single image frames 
and point clouds. 22 | 23 | Args: 24 | bag_dir: path to directory containing rosbags for extraction 25 | output_dir: path to directory where rosbags are extracted to 26 | event: set if event image frame should be extracted 27 | lidar: set if lidar point cloud should be extracted 28 | """ 29 | 30 | bag_path = Path(bag_dir) 31 | output_path = Path(output_dir) 32 | 33 | if not bag_path.exists(): 34 | raise FileNotFoundError("Bag directory does not exist!") 35 | # create output folders, tolerating pre-existing directories 36 | output_path.mkdir(exist_ok=True) 37 | (output_path / "event").mkdir(exist_ok=True) 38 | (output_path / "lidar").mkdir(exist_ok=True) 39 | 40 | bag_files: List[Path] = [] 41 | bag_files += bag_path.glob("*.bag") 42 | 43 | start = time.time() 44 | 45 | for bag_file in bag_files: 46 | bag = rosbag.Bag(bag_file) 47 | print(output_path / bag_file.stem) 48 | if event: 49 | extract_event(bag, output_path / "event" / f"{bag_file.stem}.png") 50 | if lidar: 51 | extract_lidar(bag, output_path / "lidar" / f"{bag_file.stem}.txt") 52 | 53 | bag.close() 54 | 55 | end = time.time() 56 | 57 | print(f"{len(bag_files)} scenes completed in {end-start:.3f}s") 58 | 59 | 60 | def extract_event( 61 | bag: rosbag.bag.Bag, 62 | output_name: pathlib.Path, 63 | event_limit: int = 10000, 64 | clip_count: int = 127, 65 | ) -> None: 66 | """Extract single event image from rosbag. 
67 | 68 | Args: 69 | bag: rosbag with topics to parse 70 | output_name: file name to save image as 71 | event_limit: number of events at which to stop accumulating 72 | clip_count: maximum intensity value to clip results to 73 | """ 74 | 75 | msg = None 76 | 77 | # generate empty array 78 | image = np.zeros((720, 1280), dtype=np.uint16) 79 | 80 | total = int( 81 | min( 82 | int(bag.get_message_count("/prophesee/camera/cd_events_buffer")), 83 | event_limit, 84 | ) 85 | ) 86 | pbar = tqdm.tqdm(total=total) 87 | 88 | # accumulate events up to the limit 89 | event_batch = 0 90 | for _, msg, _ in bag.read_messages(topics=["/prophesee/camera/cd_events_buffer"]): 91 | for event in msg.events: 92 | image[event.y, event.x] += 1 93 | event_batch += 1 94 | pbar.update(1) 95 | if event_batch >= total: 96 | break 97 | 98 | if msg: 99 | # clip accumulated events to prevent scaling issues 100 | image[image > clip_count] = clip_count 101 | cv2.imwrite(str(output_name), image.astype(np.uint8)) 102 | 103 | 104 | def extract_lidar(bag: rosbag.bag.Bag, output_name: pathlib.Path) -> None: 105 | """Extract single LiDAR scan from rosbag and filter non-returns. 
106 | 107 | Args: 108 | bag: rosbag with topics to parse 109 | output_name: file name to save point cloud as 110 | """ 111 | 112 | msg = None 113 | 114 | # read first point cloud 115 | for _, msg, _ in bag.read_messages(topics=["/rslidar_points"]): 116 | break 117 | 118 | if msg: 119 | # generate point cloud data 120 | sample_pc = pc2.read_points_list(msg) 121 | 122 | # filter out non-returns 123 | points = np.array(sample_pc) 124 | filter_nan = ~(points[:, 3] == 0) 125 | points_clean = points[filter_nan, :] 126 | 127 | np.savetxt(str(output_name), points_clean, delimiter=",") 128 | 129 | 130 | if __name__ == "__main__": 131 | fire.Fire(extract_bag) 132 | -------------------------------------------------------------------------------- /optimize.py: -------------------------------------------------------------------------------- 1 | """Command line tool for optimising mutual information between image & point cloud""" 2 | import datetime 3 | import json 4 | import pathlib 5 | import time 6 | from pathlib import Path 7 | from typing import Any, Dict, Tuple 8 | 9 | import cv2 10 | import fire 11 | import numpy as np 12 | from scipy.optimize import Bounds, minimize 13 | 14 | from utilities import PairedScene, PointCloud, get_mutual_information 15 | 16 | 17 | def calibrate( 18 | image_dir: str, 19 | lidar_dir: str, 20 | output_dir: str, 21 | bounded: bool = True, 22 | disp_bounds: float = 0.2, 23 | deg_bounds: float = 5.0, 24 | optimizer: str = "SLSQP", 25 | image_blur: float = 5, 26 | key: str = "cam", 27 | use_axis_representation: bool = True, 28 | calibrate_rotation_only: bool = False, 29 | ) -> None: 30 | """Extrinsically calibrate camera to LiDAR via mutual information maximization. 
31 | 32 | Args: 33 | image_dir: path to camera folder 34 | lidar_dir: path to lidar folder 35 | output_dir: path to where the results are saved 36 | bounded: flag for optimizing with bounds on seed values 37 | disp_bounds: max displacement difference from seed values in m 38 | deg_bounds: max rotation difference from seed values in degrees 39 | optimizer: optimizer to use in {'nelder-mead', 'Powell', 'L-BFGS-B', 'BFGS', 'CG', 'SLSQP'} 40 | image_blur: gaussian blurring std on image to smooth optimization 41 | key: key for the intrinsic calibration in config.json 42 | use_axis_representation: flag to use rotation vector over Euler angles 43 | calibrate_rotation_only: flag to calibrate rotation only 44 | """ 45 | # convert paths 46 | image_path = Path(image_dir) 47 | lidar_path = Path(lidar_dir) 48 | output_path = Path(output_dir) 49 | config_path = Path("config") 50 | 51 | # check paths 52 | if not image_path.exists(): 53 | raise FileNotFoundError("Image directory does not exist!") 54 | if not lidar_path.exists(): 55 | raise FileNotFoundError("LiDAR directory does not exist!") 56 | if not output_path.exists(): 57 | output_path.mkdir() 58 | 59 | # load configuration 60 | camera_matrix, distortion_params, tf_l2c = load_intrinsics( 61 | config_path / "config.json", key, use_axis_representation 62 | ) 63 | with open(config_path / "optimizer.json", "r", encoding="utf-8") as file_io: 64 | optim = json.load(file_io) 65 | 66 | # set optimization targets 67 | if calibrate_rotation_only: 68 | target = tf_l2c[3:] 69 | optimization_mask = np.arange(3, 6) 70 | else: 71 | target = tf_l2c 72 | optimization_mask = np.arange(0, 6) 73 | 74 | # add bounds to the optimization 75 | if bounded and optimizer in ["nelder-mead", "Powell", "L-BFGS-B", "SLSQP"]: 76 | bound_error = np.array([disp_bounds] * 3 + [np.deg2rad(deg_bounds)] * 3)[optimization_mask] 77 | bounds = Bounds(target - bound_error, target + bound_error) 78 | else: 79 | bound_error = None 80 | bounds = None  # minimize() accepts None when no bounds apply 81 | # load scene data 82 | data = 
load_data(image_path, lidar_path, camera_matrix, distortion_params, image_blur) 83 | 84 | print(f"Optimizing for {len(data)} scenes.") 85 | 86 | initial_mi = get_mutual_information(target, optimization_mask, tf_l2c, camera_matrix, data, use_axis_representation) 87 | 88 | # run optimizer 89 | start = time.time() 90 | res = minimize( 91 | get_mutual_information, 92 | target, 93 | (optimization_mask, tf_l2c, camera_matrix, data, use_axis_representation), 94 | method=optimizer, 95 | jac=optim[optimizer]["jac"], 96 | bounds=bounds, 97 | options=optim[optimizer]["options"], 98 | ) 99 | end = time.time() 100 | 101 | output = tf_l2c.copy()  # copy so the recorded seed stays intact 102 | output[optimization_mask] = res.x 103 | t_x, t_y, t_z, r_1, r_2, r_3 = output.tolist() 104 | 105 | # print optimizer results 106 | print(f"Time Elapsed: {end - start:.2f}s") 107 | print(f"Initial function value: {initial_mi:10.6f}") 108 | print(f"Optimal function value: {res.fun: 10.6f}") 109 | print(f"X Translation: {t_x:8.4f} m") 110 | print(f"Y Translation: {t_y:8.4f} m") 111 | print(f"Z Translation: {t_z:8.4f} m") 112 | if use_axis_representation: 113 | print(f"X Rotation: {r_1:11.4f} rad") 114 | print(f"Y Rotation: {r_2:11.4f} rad") 115 | print(f"Z Rotation: {r_3:11.4f} rad") 116 | else: 117 | print(f"Z Rotation: {np.rad2deg(r_1):11.4f} degrees") 118 | print(f"Y Rotation: {np.rad2deg(r_2):11.4f} degrees") 119 | print(f"X Rotation: {np.rad2deg(r_3):11.4f} degrees") 120 | 121 | # save results 122 | res_dict: Dict[str, Any] = {} 123 | res_dict["method"] = optimizer 124 | res_dict["jac"] = optim[optimizer]["jac"] 125 | res_dict["options"] = optim[optimizer]["options"] 126 | res_dict["fun"] = res.fun 127 | res_dict["smoothing"] = image_blur 128 | res_dict["x"] = res.x.tolist() 129 | res_dict["seed"] = tf_l2c.tolist() 130 | res_dict["bounded"] = bound_error.tolist() if bound_error is not None else None 131 | res_dict["time"] = end - start 132 | res_dict["scenes"] = list(data.keys()) 133 | 134 | if use_axis_representation: 135 | res_dict["representation"] = "axis-angle" 136 | else: 
137 | res_dict["representation"] = "euler" 138 | 139 | output_file = output_path / f'{datetime.datetime.now().strftime("%Y%m%d-%H%M%S")}_{key}.json' 140 | 141 | with open( 142 | output_file, 143 | "w", 144 | encoding="utf-8", 145 | ) as file_io: 146 | json.dump(res_dict, file_io, indent=4) 147 | 148 | 149 | def load_data( 150 | image_folder: pathlib.Path, 151 | lidar_folder: pathlib.Path, 152 | camera_matrix: np.ndarray, 153 | distortion_params: np.ndarray, 154 | image_blur: float, 155 | ) -> Dict[str, PairedScene]: 156 | """Load scene data into a dictionary for easy access. 157 | 158 | Args: 159 | image_folder: path to images 160 | lidar_folder: path to LiDAR scans 161 | camera_matrix: (3,3) matrix containing the projection 162 | distortion_params: (5,) array containing distortion parameters 163 | image_blur: gaussian blurring std on image to smooth optimization 164 | 165 | 166 | Returns: 167 | data: dictionary of categorized scenes with rectified images and scans. 
168 | """ 169 | data: Dict[str, PairedScene] = {} 170 | 171 | images = image_folder.glob("*.png") 172 | image_stems = [image.stem for image in images] 173 | scans = lidar_folder.glob("*.txt") 174 | scan_stems = [scan.stem for scan in scans] 175 | 176 | matched_scenes = sorted([stem for stem in image_stems if stem in scan_stems]) 177 | 178 | # add images and point cloud to scene dictionary 179 | for scene in matched_scenes: 180 | image_path = image_folder / f"{scene}.png" 181 | point_cloud = lidar_folder / f"{scene}.txt" 182 | 183 | # load image with any bit depth 184 | image = cv2.imread(str(image_path), cv2.IMREAD_ANYDEPTH) 185 | 186 | # undistort image 187 | rect_image = cv2.undistort(image, camera_matrix, distortion_params, None, camera_matrix) 188 | 189 | # slightly blur images for optimization 190 | rect_image = cv2.GaussianBlur(rect_image, (0, 0), image_blur) 191 | 192 | h, w = rect_image.shape 193 | 194 | # load point cloud information 195 | pc = np.loadtxt(str(point_cloud), skiprows=0, delimiter=",") 196 | pc_xyz = pc[:, 0:3] 197 | pc_int = pc[:, 3] 198 | 199 | # save into dictionary 200 | data[scene] = PairedScene(rect_image, PointCloud(pc_xyz, pc_int), 0, w - 1, 0, h - 1) 201 | 202 | return data 203 | 204 | 205 | def load_intrinsics( 206 | intrinsic_file: pathlib.Path, 207 | frame: str = "cam", 208 | use_axis_representation: bool = False, 209 | ) -> Tuple[np.ndarray, np.ndarray, np.ndarray]: 210 | """Load intrinsic calibration values and seed extrinsics. 
211 | 212 | Args: 213 | intrinsic_file: path to configuration json 214 | frame: dictionary key to calibration intrinsics 215 | use_axis_representation: flag to use angle-axis representation 216 | 217 | Returns: 218 | camera_matrix: (3,3) matrix containing the projection 219 | distortion_params: (5,) array containing distortion parameters 220 | tf_vector: (6,) translations and rotation parameters as vector 221 | """ 222 | 223 | # load intrinsic parameters and seed extrinsics 224 | with open(intrinsic_file, "r", encoding="utf-8") as file_io: 225 | config = json.load(file_io) 226 | 227 | camera_matrix = np.array(config[frame]["K"]) 228 | distortion_params = np.array(config[frame]["D"]) 229 | 230 | if use_axis_representation: 231 | tf_vector = np.array( 232 | [ 233 | config[frame]["x"], 234 | config[frame]["y"], 235 | config[frame]["z"], 236 | ] 237 | + config[frame]["Rodrigues"] 238 | ) 239 | else: 240 | tf_vector = np.array( 241 | [ 242 | config[frame]["x"], 243 | config[frame]["y"], 244 | config[frame]["z"], 245 | np.deg2rad(config[frame]["rz"]), 246 | np.deg2rad(config[frame]["ry"]), 247 | np.deg2rad(config[frame]["rx"]), 248 | ] 249 | ) 250 | 251 | return camera_matrix, distortion_params, tf_vector 252 | 253 | 254 | if __name__ == "__main__": 255 | fire.Fire(calibrate) 256 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | fire==0.4.0 2 | numpy==1.22.3 3 | opencv-python==4.5.5.64 4 | scipy==1.8.0 5 | six==1.16.0 6 | termcolor==1.1.0 7 | tqdm==4.64.0 8 | -------------------------------------------------------------------------------- /utilities.py: -------------------------------------------------------------------------------- 1 | """Module containing utilities for geometric and statistical functions.""" 2 | from dataclasses import dataclass 3 | from typing import Dict, Tuple 4 | 5 | import cv2 6 | import numpy as np 7 | import 
tqdm 8 | from scipy.ndimage import gaussian_filter, gaussian_filter1d 9 | from scipy.stats import gaussian_kde 10 | 11 | 12 | @dataclass 13 | class PointCloud: 14 | """Dataclass containing point cloud information.""" 15 | 16 | xyz: np.ndarray 17 | intensity: np.ndarray 18 | 19 | 20 | @dataclass 21 | class PairedScene: 22 | """Dataclass containing paired scene information.""" 23 | 24 | image: np.ndarray 25 | point_cloud: PointCloud 26 | x_min: int 27 | x_max: int 28 | y_min: int 29 | y_max: int 30 | 31 | 32 | def get_mutual_information( 33 | target: np.ndarray, 34 | optimization_mask: np.ndarray, 35 | tf_l2c: np.ndarray, 36 | camera_matrix: np.ndarray, 37 | data: Dict[str, PairedScene], 38 | axis: bool = True, 39 | ) -> float: 40 | """Get MI score given images and point clouds. 41 | 42 | Args: 43 | target: the parameters for optimization 44 | optimization_mask: the associated indices for the optimization targets 45 | tf_l2c: (6,) original translations and rotation angles (ZYX / Rodrigues) 46 | camera_matrix: (3,3) matrix containing the projection 47 | data: dict of paired scene data with image and point cloud info 48 | axis: flag to use axis-angle representation 49 | 50 | Returns: 51 | MI: mutual information criteria 52 | """ 53 | 54 | tf_l2c = tf_l2c.copy()  # copy so the caller's seed extrinsics are not mutated 55 | tf_l2c[optimization_mask] = target 56 | # initialize accumulated histogram arrays 57 | total_x = np.zeros(256) 58 | total_y = np.zeros(256) 59 | total_xy = np.zeros((256, 256)) 60 | total_n = 0 61 | 62 | # get homogeneous transformation matrix 63 | if axis: 64 | tf_l2c_matrix = convert_axis_vector_to_tf(*tf_l2c) 65 | else: 66 | tf_l2c_matrix = convert_euler_vector_to_tf(*tf_l2c) 67 | 68 | pbar = tqdm.tqdm(total=len(data)) 69 | 70 | # loop through each scene in the dictionary 71 | for scene in data: 72 | # project points into image 73 | projected_points, tf_pc_int = project_points_to_image( 74 | data[scene].point_cloud.xyz, data[scene].point_cloud.intensity, camera_matrix, tf_l2c_matrix 75 | ) 76 | 77 | # get scene histogram info 78
 | h_x, h_y, h_xy, num_valid = calc_joint_hist( 79 | data[scene].image, 80 | projected_points[0:2, :], 81 | tf_pc_int, 82 | data[scene].x_min, 83 | data[scene].x_max, 84 | data[scene].y_min, 85 | data[scene].y_max, 86 | ) 87 | 88 | # add to totals 89 | total_x = total_x + h_x 90 | total_y = total_y + h_y 91 | total_xy = total_xy + h_xy 92 | total_n += num_valid 93 | 94 | pbar.update(1) 95 | pbar.close() 96 | if total_n > 0: 97 | # attempt to smooth histogram with KDE 98 | p_x, p_y, p_xy = get_kde_convolve(total_x, total_y, total_xy) 99 | score = -calculate_mutual_information(p_x, p_y, p_xy) # return the negative for minimisation 100 | else: 101 | score = 0 102 | 103 | print(f"MI Score: {score:.6f}") 104 | 105 | return score 106 | 107 | 108 | def convert_euler_to_matrix(alpha: np.float64 = 0, beta: np.float64 = 0, gamma: np.float64 = 0) -> np.ndarray: 109 | """Convert zyx Euler angles to 3x3 rotation matrix. 110 | 111 | Args: 112 | alpha: rotation about z axis in rad 113 | beta: rotation about y axis in rad 114 | gamma: rotation about x axis in rad 115 | 116 | Returns: 117 | 3x3 rotation matrix. 118 | """ 119 | 120 | rotation_matrix = np.array( 121 | [ 122 | [ 123 | np.cos(alpha) * np.cos(beta), 124 | np.cos(alpha) * np.sin(beta) * np.sin(gamma) - np.sin(alpha) * np.cos(gamma), 125 | np.cos(alpha) * np.sin(beta) * np.cos(gamma) + np.sin(alpha) * np.sin(gamma), 126 | ], 127 | [ 128 | np.sin(alpha) * np.cos(beta), 129 | np.sin(alpha) * np.sin(beta) * np.sin(gamma) + np.cos(alpha) * np.cos(gamma), 130 | np.sin(alpha) * np.sin(beta) * np.cos(gamma) - np.cos(alpha) * np.sin(gamma), 131 | ], 132 | [-np.sin(beta), np.cos(beta) * np.sin(gamma), np.cos(beta) * np.cos(gamma)], 133 | ] 134 | ) 135 | 136 | return rotation_matrix 137 | 138 | 139 | def convert_matrix_to_euler(rotation_matrix: np.ndarray) -> np.ndarray: 140 | """Convert 3x3 rotation matrix to ZYX Euler angles. 141 | 142 | Args: 143 | rotation_matrix: (3,3) rotation matrix. 
144 | 145 | Returns: 146 | (3,) ZYX rotation angles 147 | """ 148 | 149 | if np.abs(rotation_matrix[0, 0]) < 1e-8 and np.abs(rotation_matrix[1, 0]) < 1e-8: 150 | alpha = 0 151 | beta = np.pi / 2 if rotation_matrix[2, 0] < 0 else -np.pi / 2  # gimbal lock: cos(beta) == 0 152 | gamma = np.arctan2(np.sign(-rotation_matrix[2, 0]) * rotation_matrix[0, 1], rotation_matrix[1, 1]) 153 | else: 154 | alpha = np.arctan2(rotation_matrix[1, 0], rotation_matrix[0, 0]) 155 | beta = np.arctan2( 156 | -rotation_matrix[2, 0], 157 | np.sqrt(rotation_matrix[0, 0] ** 2 + rotation_matrix[1, 0] ** 2), 158 | ) 159 | gamma = np.arctan2(rotation_matrix[2, 1], rotation_matrix[2, 2]) 160 | 161 | return np.array([alpha, beta, gamma]) 162 | 163 | 164 | def convert_euler_vector_to_tf( 165 | t_x: np.float64 = 0, 166 | t_y: np.float64 = 0, 167 | t_z: np.float64 = 0, 168 | alpha: np.float64 = 0, 169 | beta: np.float64 = 0, 170 | gamma: np.float64 = 0, 171 | ) -> np.ndarray: 172 | """Convert xyz translation and zyx Euler angles to 4x4 homogeneous matrix. 173 | 174 | Args: 175 | t_x: translation along x axis in m 176 | t_y: translation along y axis in m 177 | t_z: translation along z axis in m 178 | alpha: rotation about z axis in rad 179 | beta: rotation about y axis in rad 180 | gamma: rotation about x axis in rad 181 | 182 | Returns: 183 | 4x4 homogeneous matrix. 184 | """ 185 | 186 | tf_matrix = np.zeros((4, 4)) 187 | tf_matrix[0:3, 0:3] = convert_euler_to_matrix(alpha, beta, gamma) 188 | tf_matrix[0, 3] = t_x 189 | tf_matrix[1, 3] = t_y 190 | tf_matrix[2, 3] = t_z 191 | tf_matrix[3, 3] = 1 192 | 193 | return tf_matrix 194 | 195 | 196 | def convert_axis_vector_to_tf( 197 | t_x: np.float64 = 0, 198 | t_y: np.float64 = 0, 199 | t_z: np.float64 = 0, 200 | r_x: np.float64 = 0, 201 | r_y: np.float64 = 0, 202 | r_z: np.float64 = 0, 203 | ) -> np.ndarray: 204 | """Convert xyz translation and axis-angle vector to 4x4 homogeneous matrix. 
205 | 206 | Args: 207 | t_x: translation along x axis in m 208 | t_y: translation along y axis in m 209 | t_z: translation along z axis in m 210 | r_x: axis-angle x-component 211 | r_y: axis-angle y-component 212 | r_z: axis-angle z-component 213 | 214 | Returns: 215 | 4x4 homogeneous matrix. 216 | """ 217 | 218 | tf_matrix = np.zeros((4, 4)) 219 | tf_matrix[0:3, 0:3] = cv2.Rodrigues(np.array([r_x, r_y, r_z]))[0] 220 | tf_matrix[0, 3] = t_x 221 | tf_matrix[1, 3] = t_y 222 | tf_matrix[2, 3] = t_z 223 | tf_matrix[3, 3] = 1 224 | 225 | return tf_matrix 226 | 227 | 228 | def convert_tf_to_param( 229 | tf_matrix: np.ndarray, 230 | ) -> np.ndarray: 231 | """Convert 4x4 homogeneous matrix to xyz translation and zyx Euler angles. 232 | 233 | Args: 234 | tf_matrix: (4,4) homogeneous matrix. 235 | 236 | Returns: 237 | array consisting of: 238 | translation along x axis in m 239 | translation along y axis in m 240 | translation along z axis in m 241 | rotation about z axis in rad 242 | rotation about y axis in rad 243 | rotation about x axis in rad 244 | """ 245 | 246 | t_vec = tf_matrix[0:3, 3] 247 | r_vec = convert_matrix_to_euler(tf_matrix[0:3, 0:3]) 248 | 249 | p_vec = np.hstack((t_vec, r_vec)) 250 | 251 | return p_vec 252 | 253 | 254 | def project_points_to_image( 255 | pc_xyz: np.ndarray, 256 | pc_int: np.ndarray, 257 | camera_matrix: np.ndarray, 258 | tf_matrix: np.ndarray, 259 | ) -> Tuple[np.ndarray, np.ndarray]: 260 | """Project 3D points into 2D image space. 
261 | 262 | Args: 263 | pc_xyz: (n,3) array containing xyz points 264 | pc_int: (n,) array containing point intensities 265 | camera_matrix: (3,3) matrix containing the projection 266 | tf_matrix: (4,4) matrix containing the transformation 267 | 268 | Returns: 269 | (3,m), (m,): projected homogeneous pixel coordinates and intensities of forward-facing points 270 | """ 271 | 272 | # augment point cloud with a column of ones for matrix multiplication 273 | pc_extended = np.hstack((pc_xyz, np.ones((pc_xyz.shape[0], 1)))) 274 | 275 | # transform point cloud 276 | tf_pc = tf_matrix @ pc_extended.T 277 | 278 | # remove all points behind the projection 279 | forward_mask = tf_pc[2, :] > 0 280 | tf_pc_filt = tf_pc[:, forward_mask] 281 | tf_pc_int = pc_int[forward_mask] 282 | 283 | # scale by z-depth for projective geometry 284 | tf_pc_xy = tf_pc_filt[0:3, :] / np.repeat(tf_pc_filt[2, :].reshape(1, -1), 3, axis=0) 285 | 286 | # project and return 287 | return camera_matrix @ tf_pc_xy, tf_pc_int 288 | 289 | 290 | def points2imageCV(pc_xyz: np.ndarray, pc_int: np.ndarray, K: np.ndarray, D: np.ndarray, H: np.ndarray) -> Tuple[np.ndarray, np.ndarray]: 291 | """Project 3D points into 2D image space using OpenCV's distortion model. 
292 | 293 | Args: 294 | pc_xyz: (n,3) array containing xyz points 295 | pc_int: (n,) array containing point intensities 296 | K: (3,3) matrix containing the projection 297 | D: (5,) array containing distortion parameters 298 | H: (4,4) matrix containing the transformation 299 | 300 | Returns: 301 | (2,m), (m,): projected pixel coordinates and intensities of forward-facing points 302 | """ 303 | 304 | # augment point cloud with a column of ones for matrix multiplication 305 | pc_extended = np.hstack((pc_xyz, np.ones((pc_xyz.shape[0], 1)))) 306 | 307 | # transform point cloud 308 | tf_pc = H @ pc_extended.T 309 | 310 | # remove all points behind the projection 311 | forward_mask = tf_pc[2, :] > 0 312 | tf_pc_filt = tf_pc[:, forward_mask] 313 | tf_pc_int = pc_int[forward_mask] 314 | 315 | # project through OpenCV, applying the distortion model 316 | tf_pc_xy, _ = cv2.projectPoints(tf_pc_filt[0:3, :], np.eye(3), np.zeros((3, 1)), K, D) 317 | 318 | # return as (2,m) with filtered intensities 319 | return tf_pc_xy.reshape(-1, 2).T, tf_pc_int 320 | 321 | 322 | def calc_joint_hist( 323 | image: np.ndarray, 324 | pc_xy: np.ndarray, 325 | pc_int: np.ndarray, 326 | x_min: np.int64, 327 | x_max: np.int64, 328 | y_min: np.int64, 329 | y_max: np.int64, 330 | ) -> Tuple[np.ndarray, np.ndarray, np.ndarray, int]: 331 | """Generate joint histogram. 
332 | 333 | Args: 334 | image: (h,w) grayscale image 335 | pc_xy: (2,m) point cloud positions in the rectified frame 336 | pc_int: (m,) LiDAR intensity values 337 | x_min: the left bound of the ROI 338 | x_max: the right bound of the ROI 339 | y_min: the top bound of the ROI (image coordinates) 340 | y_max: the bottom bound of the ROI (image coordinates) 341 | 342 | Returns: 343 | hist_x: (256,) count of LiDAR intensities for each value 344 | hist_y: (256,) count of grayscale intensities for each point 345 | hist_xy: (256,256,) count in joint histogram 346 | num_valid: total number of valid points 347 | """ 348 | 349 | roi_mask = (pc_xy[1, :] < y_max) & (pc_xy[0, :] < x_max) & (pc_xy[1, :] > y_min) & (pc_xy[0, :] > x_min) 350 | 351 | num_valid = roi_mask.sum() 352 | 353 | im_coords = np.round(pc_xy[:, roi_mask]).astype(np.uint16) 354 | im_ints = image[im_coords[1, :], im_coords[0, :]] 355 | 356 | hist_x, _ = np.histogram(pc_int[roi_mask].astype(int), bins=np.linspace(0, 256, 257)) 357 | hist_y, _ = np.histogram(im_ints, bins=np.linspace(0, 256, 257)) 358 | hist_xy, _, _ = np.histogram2d(pc_int[roi_mask].astype(int), im_ints, bins=np.linspace(0, 256, 257)) 359 | 360 | return hist_x, hist_y, hist_xy, num_valid 361 | 362 | 363 | def calculate_mutual_information( 364 | c_x: np.ndarray, c_y: np.ndarray, c_xy: np.ndarray, num_points: np.int64 = 1 365 | ) -> np.float64: 366 | """Calculate mutual information criteria. 
367 | 368 | Args: 369 | c_x: (256,) count of LiDAR intensities 370 | c_y: (256,) count of grayscale intensities 371 | c_xy: (256,256,) count of joint intensities 372 | num_points: number of points (keep the default of 1 when passing normalized PDFs) 373 | 374 | Returns: 375 | MI: mutual information criteria 376 | """ 377 | 378 | if num_points == 0: 379 | return 0 380 | 381 | else: 382 | p_x = c_x / num_points 383 | p_y = c_y / num_points 384 | p_xy = c_xy / num_points 385 | 386 | entropy_x = np.sum(-p_x[p_x > 0] * np.log(p_x[p_x > 0])) 387 | entropy_y = np.sum(-p_y[p_y > 0] * np.log(p_y[p_y > 0])) 388 | entropy_xy = np.sum(-p_xy[p_xy > 0] * np.log(p_xy[p_xy > 0])) 389 | 390 | return entropy_x + entropy_y - entropy_xy 391 | 392 | 393 | def get_kde_true( 394 | hist_x: np.ndarray, 395 | hist_y: np.ndarray, 396 | hist_xy: np.ndarray, 397 | ) -> Tuple[np.ndarray, np.ndarray, np.ndarray]: 398 | """Get the exact Gaussian kernel density estimate. 399 | 400 | Args: 401 | hist_x: (256,) histogram of LiDAR intensities 402 | hist_y: (256,) histogram of grayscale intensities 403 | hist_xy: (256,256,) histogram of joint intensities 404 | 405 | Returns: 406 | p_kde_x: (256,) PDF for LiDAR intensities 407 | p_kde_y: (256,) PDF for image intensities 408 | p_kde_xy: (256,256,) joint PDF for joint intensities 409 | """ 410 | 411 | # get reconstructed point observations 412 | total_x_data = [] 413 | for i, num in enumerate(hist_x): 414 | total_x_data += [i] * int(num) 415 | 416 | total_y_data = [] 417 | for i, num in enumerate(hist_y): 418 | total_y_data += [i] * int(num) 419 | 420 | total_xy_data = [] 421 | for i, y_data in enumerate(hist_xy): 422 | for j, num in enumerate(y_data): 423 | total_xy_data += [(i, j)] * int(num) 424 | total_xy_data = np.array(total_xy_data).T 425 | 426 | # initialize KDE 427 | gkde_x = gaussian_kde(total_x_data, bw_method="silverman") 428 | gkde_y = gaussian_kde(total_y_data, bw_method="silverman") 429 | gkde_xy = gaussian_kde(total_xy_data, bw_method="silverman") 430 | 431 | # get evaluation points and 
retrieve probabilities 432 | x_pts = np.linspace(0, 255, 256) 433 | p_x = gkde_x.evaluate(x_pts) 434 | 435 | y_pts = np.linspace(0, 255, 256) 436 | p_y = gkde_y.evaluate(y_pts) 437 | 438 | x_grid, y_grid = np.mgrid[0:255:256j, 0:255:256j]  # complex step: 256 evaluation points per axis 439 | positions = np.vstack([x_grid.ravel(), y_grid.ravel()]) 440 | total_kde_xy = gkde_xy.evaluate(positions) 441 | p_xy = total_kde_xy.T.reshape(x_grid.shape) 442 | 443 | return p_x, p_y, p_xy 444 | 445 | 446 | def get_kde_convolve( 447 | hist_x: np.ndarray, 448 | hist_y: np.ndarray, 449 | hist_xy: np.ndarray, 450 | use_silverman_1d: bool = True, 451 | ) -> Tuple[np.ndarray, np.ndarray, np.ndarray]: 452 | """Get Gaussian kernel density estimate through convolution. 453 | 454 | Args: 455 | hist_x: (256,) histogram of LiDAR intensities 456 | hist_y: (256,) histogram of grayscale intensities 457 | hist_xy: (256,256,) histogram of joint intensities 458 | 459 | Returns: 460 | p_kde_x: (256,) PDF for LiDAR intensities 461 | p_kde_y: (256,) PDF for image intensities 462 | p_kde_xy: (256,256,) joint PDF for joint intensities 463 | """ 464 | 465 | total_n = np.sum(hist_x) 466 | 467 | mu_x = np.sum(hist_x * np.linspace(0, 255, 256)) / total_n 468 | mu_y = np.sum(hist_y * np.linspace(0, 255, 256)) / total_n 469 | 470 | sigma_x = np.sum(hist_x * (np.linspace(0, 255, 256) - mu_x) ** 2) / total_n 471 | sigma_y = np.sum(hist_y * (np.linspace(0, 255, 256) - mu_y) ** 2) / total_n 472 | 473 | if use_silverman_1d: 474 | factor = (total_n * (1 + 2) / 4) ** (-1 / (1 + 4)) 475 | else: 476 | factor = (total_n * (2 + 2) / 4) ** (-1 / (2 + 4)) 477 | 478 | bw_x = factor * np.sqrt(sigma_x) 479 | bw_y = factor * np.sqrt(sigma_y) 480 | bw_xy = [factor * np.sqrt(sigma_x), factor * np.sqrt(sigma_y)] 481 | try: 482 | p_x = gaussian_filter1d(hist_x / total_n, bw_x, mode="constant", cval=0.0) 483 | p_y = gaussian_filter1d(hist_y / total_n, bw_y, mode="constant", cval=0.0) 484 | 485 | p_xy = gaussian_filter(hist_xy / total_n, bw_xy, mode="constant", cval=0.0) 486 | except 
ValueError as err: 487 | print(err) 488 | p_x = np.zeros(256) 489 | p_y = np.zeros(256) 490 | p_xy = np.zeros((256, 256)) 491 | 492 | return p_x, p_y, p_xy 493 | --------------------------------------------------------------------------------
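For reference, the MI criterion computed by `calculate_mutual_information` above is the classic entropy identity MI(X;Y) = H(X) + H(Y) − H(X,Y) over the joint intensity histogram. A minimal, self-contained sketch of that computation (the helper name `mutual_information` and the toy histogram are illustrative, not part of the repo):

```python
import numpy as np


def mutual_information(c_x: np.ndarray, c_y: np.ndarray, c_xy: np.ndarray, n: int) -> float:
    """MI from marginal and joint intensity counts: H(X) + H(Y) - H(X, Y)."""
    if n == 0:
        return 0.0
    p_x, p_y, p_xy = c_x / n, c_y / n, c_xy / n
    h_x = -np.sum(p_x[p_x > 0] * np.log(p_x[p_x > 0]))
    h_y = -np.sum(p_y[p_y > 0] * np.log(p_y[p_y > 0]))
    h_xy = -np.sum(p_xy[p_xy > 0] * np.log(p_xy[p_xy > 0]))
    return float(h_x + h_y - h_xy)


# toy joint histogram: 100 points, LiDAR and image intensities perfectly correlated
c_xy = np.eye(4) * 25        # all mass on the diagonal
c_x = c_xy.sum(axis=1)       # marginal over LiDAR intensity bins
c_y = c_xy.sum(axis=0)       # marginal over image intensity bins
mi = mutual_information(c_x, c_y, c_xy, 100)  # equals H(X) = log(4) here
```

For perfectly correlated intensities MI collapses to the marginal entropy, which is why the optimizer in `optimize.py` minimizes the negative of this score: better extrinsics align laser returns with their image intensities and push the joint histogram toward such a diagonal.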