├── .gitignore ├── LICENSE-Apache ├── LICENSE-BSD ├── README.md ├── cv_bridge ├── CHANGELOG.rst ├── CMakeLists.txt ├── README.md ├── cmake │ └── cv_bridge-extras.cmake.in ├── doc │ ├── conf.py │ ├── index.rst │ └── mainpage.dox ├── include │ └── cv_bridge │ │ ├── cv_bridge.hpp │ │ ├── cv_mat_sensor_msgs_image_type_adapter.hpp │ │ ├── rgb_colors.hpp │ │ └── visibility_control.h ├── package.xml ├── python │ └── cv_bridge │ │ ├── __init__.py │ │ └── core.py ├── src │ ├── CMakeLists.txt │ ├── cv_bridge.cpp │ ├── cv_mat_sensor_msgs_image_type_adapter.cpp │ ├── module.cpp │ ├── module.hpp │ ├── module_opencv4.cpp │ ├── pycompat.hpp │ └── rgb_colors.cpp └── test │ ├── CMakeLists.txt │ ├── conversions.py │ ├── enumerants.py │ ├── python_bindings.py │ ├── test_compression.cpp │ ├── test_dynamic_scaling.cpp │ ├── test_endian.cpp │ ├── test_rgb_colors.cpp │ ├── utest.cpp │ └── utest2.cpp ├── image_geometry ├── CHANGELOG.rst ├── CMakeLists.txt ├── doc │ ├── conf.py │ ├── index.rst │ ├── mainpage.dox │ └── python_api.rst ├── image_geometry │ ├── __init__.py │ └── cameramodels.py ├── include │ └── image_geometry │ │ ├── pinhole_camera_model.hpp │ │ ├── stereo_camera_model.hpp │ │ └── visibility_control.hpp ├── package.xml ├── src │ ├── pinhole_camera_model.cpp │ └── stereo_camera_model.cpp └── test │ ├── CMakeLists.txt │ ├── directed.py │ ├── utest.cpp │ └── utest_equi.cpp ├── opencv_tests ├── CHANGELOG.rst ├── launch │ └── view_img.py ├── mainpage.dox ├── opencv_tests │ ├── __init__.py │ ├── broadcast.py │ ├── rosfacedetect.py │ └── source.py ├── package.xml ├── resource │ └── opencv_tests ├── setup.cfg └── setup.py ├── pytest.ini └── vision_opencv ├── CHANGELOG.rst ├── CMakeLists.txt └── package.xml /.gitignore: -------------------------------------------------------------------------------- 1 | build*/ 2 | ._* 3 | *.pyc 4 | .* 5 | *~ 6 | image_geometry/doc/doctrees/ 7 | image_geometry/doc/html/ -------------------------------------------------------------------------------- /LICENSE-Apache: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 
29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. 
If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. 
Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 
202 | -------------------------------------------------------------------------------- /LICENSE-BSD: -------------------------------------------------------------------------------- 1 | BSD 3-Clause License 2 | 3 | Copyright (c) 2008, Willow Garage, Inc. 4 | All rights reserved. 5 | 6 | Redistribution and use in source and binary forms, with or without 7 | modification, are permitted provided that the following conditions are met: 8 | 9 | * Redistributions of source code must retain the above copyright notice, this 10 | list of conditions and the following disclaimer. 11 | 12 | * Redistributions in binary form must reproduce the above copyright notice, 13 | this list of conditions and the following disclaimer in the documentation 14 | and/or other materials provided with the distribution. 15 | 16 | * Neither the name of the copyright holder nor the names of its 17 | contributors may be used to endorse or promote products derived from 18 | this software without specific prior written permission. 19 | 20 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 21 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 22 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 23 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 24 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 25 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 26 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 27 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 28 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 29 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 30 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | vision_opencv 2 | ============= 3 | ros2 vision_opencv contains packages that interface ROS 2 with [OpenCV](http://opencv.org/), a library designed for computational efficiency with a strong focus on real-time computer vision applications. This repository contains: 4 | * `cv_bridge`: Bridge between ROS 2 image messages and the OpenCV image representation 5 | * `image_geometry`: Collection of methods for dealing with image and pixel geometry 6 | * `opencv_tests`: Integration tests that exercise these packages together with OpenCV 7 | * `vision_opencv`: Meta-package to install both `cv_bridge` and `image_geometry` 8 | 9 | To use ROS 2 with OpenCV, please see the details within the [cv_bridge](https://github.com/ros-perception/vision_opencv/tree/ros2/cv_bridge) package.
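As a quick illustration of what these packages provide, the sketch below (not part of this repository) uses `image_geometry`'s `PinholeCameraModel` to relate pixels and 3D points; it assumes `camera_info` is a populated `sensor_msgs/msg/CameraInfo` message obtained elsewhere:

```python
# Illustrative sketch only: relate pixels and 3D points with image_geometry.
# `camera_info` is assumed to be a populated sensor_msgs/msg/CameraInfo message.
from image_geometry import PinholeCameraModel

model = PinholeCameraModel()
model.fromCameraInfo(camera_info)                # configure the model from calibration data
u, v = model.project3dToPixel((1.0, 0.0, 5.0))   # 3D point in the camera optical frame -> pixel
ray = model.projectPixelTo3dRay((u, v))          # pixel -> ray back into the scene
```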
10 | -------------------------------------------------------------------------------- /cv_bridge/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | cmake_minimum_required(VERSION 3.14) 2 | project(cv_bridge) 3 | 4 | find_package(ament_cmake_ros REQUIRED) 5 | 6 | # Default to C++17 7 | if(NOT CMAKE_CXX_STANDARD) 8 | set(CMAKE_CXX_STANDARD 17) 9 | endif() 10 | 11 | if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang") 12 | add_compile_options(-Wall -Wextra) 13 | endif() 14 | 15 | option(CV_BRIDGE_DISABLE_PYTHON "Disable building Python bindings" OFF) 16 | 17 | if(ANDROID) 18 | set(CV_BRIDGE_DISABLE_PYTHON ON) 19 | endif() 20 | 21 | if(CV_BRIDGE_DISABLE_PYTHON) 22 | find_package(Boost REQUIRED) 23 | set(boost_python_target "") 24 | else() 25 | find_package(Python3 REQUIRED COMPONENTS Development NumPy) 26 | find_package(Boost QUIET) 27 | if(Boost_VERSION_STRING VERSION_LESS "1.67") 28 | # This is a bit of a hack to suppress a warning 29 | # No header defined for python3; skipping header check 30 | # Which should only affect Boost versions < 1.67 31 | # Resolution for newer versions: 32 | # https://gitlab.kitware.com/cmake/cmake/issues/16391 33 | set(_Boost_PYTHON3_HEADERS "boost/python.hpp") 34 | find_package(Boost REQUIRED COMPONENTS python3) 35 | set(boost_python_target "Boost::python3") 36 | else() 37 | find_package(Boost REQUIRED COMPONENTS python${Python3_VERSION_MAJOR}${Python3_VERSION_MINOR}) 38 | set(boost_python_target "Boost::python${Python3_VERSION_MAJOR}${Python3_VERSION_MINOR}") 39 | endif() 40 | endif() 41 | 42 | find_package(rclcpp REQUIRED) 43 | find_package(rcpputils REQUIRED) 44 | find_package(sensor_msgs REQUIRED) 45 | 46 | find_package(OpenCV 4 QUIET 47 | COMPONENTS 48 | opencv_core 49 | opencv_imgproc 50 | opencv_imgcodecs 51 | CONFIG 52 | ) 53 | if(NOT OpenCV_FOUND) 54 | find_package(OpenCV 3 REQUIRED 55 | COMPONENTS 56 | opencv_core 57 | opencv_imgproc 58 | opencv_imgcodecs 59 | CONFIG 60 | ) 61 | endif() 62 | 63 | if(NOT CV_BRIDGE_DISABLE_PYTHON) 64 | ament_python_install_package(${PROJECT_NAME} 65 | PACKAGE_DIR python/${PROJECT_NAME} 66 | ) 67 | endif() 68 | 69 | add_subdirectory(src) 70 | 71 | # cv_bridge_lib_dir is passed as APPEND_LIBRARY_DIRS for each ament_add_gtest call so 72 | # the project library that they link against is on the library path. 73 | # This is especially important on Windows. 74 | # This is overwritten each loop, but which one it points to doesn't really matter. 75 | set(cv_bridge_lib_dir "$") 76 | 77 | if(BUILD_TESTING) 78 | add_subdirectory(test) 79 | endif() 80 | 81 | ament_export_dependencies( 82 | OpenCV 83 | sensor_msgs 84 | rclcpp 85 | ) 86 | 87 | ament_export_targets(export_${PROJECT_NAME}) 88 | 89 | # install the include folder 90 | install(DIRECTORY include/ DESTINATION include/${PROJECT_NAME}) 91 | 92 | install(TARGETS ${PROJECT_NAME} 93 | RUNTIME DESTINATION bin 94 | ARCHIVE DESTINATION lib 95 | LIBRARY DESTINATION lib 96 | ) 97 | 98 | ament_package( 99 | CONFIG_EXTRAS "cmake/cv_bridge-extras.cmake.in" 100 | ) 101 | -------------------------------------------------------------------------------- /cv_bridge/README.md: -------------------------------------------------------------------------------- 1 | cv_bridge 2 | ========== 3 | 4 | # Introduction 5 | 6 | cv_bridge converts between ROS 2 image messages and OpenCV image representation for perception applications. 
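As a quick taste before the details below, here is a minimal sketch of the Python API (not taken from this repository; the node and topic names are made up for illustration) showing a subscriber callback that converts an incoming message to an OpenCV array and publishes it back out:

```python
# Minimal illustrative sketch: ROS 2 image message <-> OpenCV array via CvBridge.
# Node and topic names ('image_echo', 'image_in', 'image_out') are hypothetical.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError


class ImageEcho(Node):
    def __init__(self):
        super().__init__('image_echo')
        self.bridge = CvBridge()
        self.pub = self.create_publisher(Image, 'image_out', 10)
        self.sub = self.create_subscription(Image, 'image_in', self.on_image, 10)

    def on_image(self, msg):
        try:
            frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        except CvBridgeError as err:
            self.get_logger().warn(str(err))
            return
        # ... process `frame` with OpenCV here ...
        self.pub.publish(self.bridge.cv2_to_imgmsg(frame, encoding='bgr8'))


def main():
    rclpy.init()
    rclpy.spin(ImageEcho())
```

The C++ API offers the analogous `cv_bridge::toCvCopy()` / `toCvShare()` functions documented in `cv_bridge.hpp`.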
The diagram below illustrates how cv_bridge sits between ROS 2 image messages and OpenCV: 7 | 8 | ![cv_bridge overview](http://wiki.ros.org/cv_bridge?action=AttachFile&do=get&target=cvbridge.png) 9 | 10 | This ros2 branch was initially derived from a port of the [ros kinetic branch](https://github.com/ros-perception/vision_opencv/tree/kinetic/cv_bridge) 11 | 12 | # Installation 13 | 14 | These instructions assume that the `ROS 2 core` has already been installed; please refer to [ROS 2 installation](https://docs.ros.org/en/rolling/Installation.html) to get started. 15 | 16 | ## Install dependencies 17 | OpenCV 3 or newer is required; please refer to the official installation guide in the [OpenCV Tutorials](http://docs.opencv.org/master/d9/df8/tutorial_root.html) 18 | Since ROS 2 uses Python 3, please make sure that python3-numpy is installed, or install it like this: 19 | 20 | ```bash 21 | 22 | sudo apt install python3-numpy 23 | 24 | ``` 25 | 26 | The cv_bridge Python backend still depends on Boost.Python (`1.58.0 or higher`); on Ubuntu, install it as follows: 27 | 28 | ```bash 29 | 30 | sudo apt install libboost-python-dev 31 | 32 | ``` 33 | 34 | ## Build and Test 35 | 36 | ### Fetch the latest code and build 37 | ```bash 38 | 39 | cd /src 40 | git clone https://github.com/ros-perception/vision_opencv.git -b ros2 41 | cd .. 42 | colcon build --symlink-install 43 | 44 | ``` 45 | 46 | ### Run the tests 47 | Python tests have a dependency on OpenCV Python support. To install it: 48 | ```bash 49 | 50 | sudo apt install python3-opencv 51 | 52 | ``` 53 | Next, prepare the runtime environment and run the tests: 54 | ```bash 55 | 56 | source /install/local_setup.bash 57 | colcon test 58 | 59 | ``` 60 | 61 | # Known issues 62 | * `boost endian`: replace the Boost endian APIs with standard C++11 (or newer) equivalents 63 | * Not tested on Windows or macOS, so there may be issues building or running on those platforms 64 | -------------------------------------------------------------------------------- /cv_bridge/cmake/cv_bridge-extras.cmake.in: -------------------------------------------------------------------------------- 1 | set(OpenCV_VERSION @OpenCV_VERSION@) 2 | set(OpenCV_VERSION_MAJOR @OpenCV_VERSION_MAJOR@) 3 | set(OpenCV_VERSION_MINOR @OpenCV_VERSION_MINOR@) 4 | set(OpenCV_VERSION_PATCH @OpenCV_VERSION_PATCH@) 5 | set(OpenCV_SHARED @OpenCV_SHARED@) 6 | set(OpenCV_CONFIG_PATH @OpenCV_CONFIG_PATH@) 7 | set(OpenCV_INSTALL_PATH @OpenCV_INSTALL_PATH@) 8 | set(OpenCV_LIB_COMPONENTS @OpenCV_LIB_COMPONENTS@) 9 | set(OpenCV_USE_MANGLED_PATHS @OpenCV_USE_MANGLED_PATHS@) 10 | set(OpenCV_MODULES_SUFFIX @OpenCV_MODULES_SUFFIX@) 11 | -------------------------------------------------------------------------------- /cv_bridge/doc/conf.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # 3 | # cv_bridge documentation build configuration file, created by 4 | # sphinx-quickstart on Mon Jun 1 14:21:53 2009. 5 | # 6 | # This file is execfile()d with the current directory set to its containing dir. 7 | # 8 | # Note that not all possible configuration values are present in this 9 | # autogenerated file. 10 | # 11 | # All configuration values have a default; values that are commented out 12 | # serve to show the default. 13 | 14 | # import sys 15 | # import os 16 | 17 | # If extensions (or modules to document with autodoc) are in another directory, 18 | # add these directories to sys.path here. If the directory is relative to the 19 | # documentation root, use os.path.abspath to make it absolute, like shown here.
20 | # sys.path.append(os.path.abspath('.')) 21 | 22 | # -- General configuration ----------------------------------------------------- 23 | 24 | # Add any Sphinx extension module names here, as strings. They can be extensions 25 | # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. 26 | extensions = ['sphinx.ext.autodoc', 'sphinx.ext.doctest', 27 | 'sphinx.ext.intersphinx', 'sphinx.ext.pngmath'] 28 | 29 | # Add any paths that contain templates here, relative to this directory. 30 | templates_path = ['_templates'] 31 | 32 | # The suffix of source filenames. 33 | source_suffix = '.rst' 34 | 35 | # The encoding of source files. 36 | # source_encoding = 'utf-8' 37 | 38 | # The master toctree document. 39 | master_doc = 'index' 40 | 41 | # General information about the project. 42 | project = u'cv_bridge' 43 | # copyright = u'2009, Willow Garage, Inc.' 44 | 45 | # The version info for the project you're documenting, acts as replacement for 46 | # |version| and |release|, also used in various other places throughout the 47 | # built documents. 48 | # 49 | # The short X.Y version. 50 | version = '0.1' 51 | # The full version, including alpha/beta/rc tags. 52 | release = '0.1.0' 53 | 54 | # The language for content autogenerated by Sphinx. Refer to documentation 55 | # for a list of supported languages. 56 | # language = None 57 | 58 | # There are two options for replacing |today|: either, you set today to some 59 | # non-false value, then it is used: 60 | # today = '' 61 | # Else, today_fmt is used as the format for a strftime call. 62 | # today_fmt = '%B %d, %Y' 63 | 64 | # List of documents that shouldn't be included in the build. 65 | # unused_docs = [] 66 | 67 | # List of directories, relative to source directory, that shouldn't be searched 68 | # for source files. 69 | exclude_trees = ['_build'] 70 | 71 | # The reST default role (used for this markup: `text`) to use for all documents. 72 | # default_role = None 73 | 74 | # If true, '()' will be appended to :func: etc. cross-reference text. 75 | # add_function_parentheses = True 76 | 77 | # If true, the current module name will be prepended to all description 78 | # unit titles (such as .. function::). 79 | # add_module_names = True 80 | 81 | # If true, sectionauthor and moduleauthor directives will be shown in the 82 | # output. They are ignored by default. 83 | # show_authors = False 84 | 85 | # The name of the Pygments (syntax highlighting) style to use. 86 | pygments_style = 'sphinx' 87 | 88 | # A list of ignored prefixes for module index sorting. 89 | # modindex_common_prefix = [] 90 | 91 | 92 | # -- Options for HTML output --------------------------------------------------- 93 | 94 | # The theme to use for HTML and HTML Help pages. Major themes that come with 95 | # Sphinx are currently 'default' and 'sphinxdoc'. 96 | html_theme = 'default' 97 | 98 | # Theme options are theme-specific and customize the look and feel of a theme 99 | # further. For a list of options available for each theme, see the 100 | # documentation. 101 | # html_theme_options = {} 102 | 103 | # Add any paths that contain custom themes here, relative to this directory. 104 | # html_theme_path = [] 105 | 106 | # The name for this set of Sphinx documents. If None, it defaults to 107 | # " v documentation". 108 | # html_title = None 109 | 110 | # A shorter title for the navigation bar. Default is the same as html_title. 
111 | # html_short_title = None 112 | 113 | # The name of an image file (relative to this directory) to place at the top 114 | # of the sidebar. 115 | # html_logo = None 116 | 117 | # The name of an image file (within the static path) to use as favicon of the 118 | # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 119 | # pixels large. 120 | # html_favicon = None 121 | 122 | # Add any paths that contain custom static files (such as style sheets) here, 123 | # relative to this directory. They are copied after the builtin static files, 124 | # so a file named "default.css" will overwrite the builtin "default.css". 125 | # html_static_path = ['_static'] 126 | 127 | # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, 128 | # using the given strftime format. 129 | # html_last_updated_fmt = '%b %d, %Y' 130 | 131 | # If true, SmartyPants will be used to convert quotes and dashes to 132 | # typographically correct entities. 133 | # html_use_smartypants = True 134 | 135 | # Custom sidebar templates, maps document names to template names. 136 | # html_sidebars = {} 137 | 138 | # Additional templates that should be rendered to pages, maps page names to 139 | # template names. 140 | # html_additional_pages = {} 141 | 142 | # If false, no module index is generated. 143 | # html_use_modindex = True 144 | 145 | # If false, no index is generated. 146 | # html_use_index = True 147 | 148 | # If true, the index is split into individual pages for each letter. 149 | # html_split_index = False 150 | 151 | # If true, links to the reST sources are added to the pages. 152 | # html_show_sourcelink = True 153 | 154 | # If true, an OpenSearch description file will be output, and all pages will 155 | # contain a tag referring to it. The value of this option must be the 156 | # base URL from which the finished HTML is served. 157 | # html_use_opensearch = '' 158 | 159 | # If nonempty, this is the file name suffix for HTML files (e.g. ".xhtml"). 160 | # html_file_suffix = '' 161 | 162 | # Output file base name for HTML help builder. 163 | htmlhelp_basename = 'cv_bridgedoc' 164 | 165 | 166 | # -- Options for LaTeX output -------------------------------------------------- 167 | 168 | # The paper size ('letter' or 'a4'). 169 | # latex_paper_size = 'letter' 170 | 171 | # The font size ('10pt', '11pt' or '12pt'). 172 | # latex_font_size = '10pt' 173 | 174 | # Grouping the document tree into LaTeX files. List of tuples 175 | # (source start file, target name, title, author, documentclass [howto/manual]). 176 | latex_documents = [('index', 'cv_bridge.tex', 177 | u'stereo\\_utils Documentation', u'James Bowman', 'manual'), ] 178 | 179 | # The name of an image file (relative to this directory) to place at the top of 180 | # the title page. 181 | # latex_logo = None 182 | 183 | # For "manual" documents, if this is true, then toplevel headings are parts, 184 | # not chapters. 185 | # latex_use_parts = False 186 | 187 | # Additional stuff for the LaTeX preamble. 188 | # latex_preamble = '' 189 | 190 | # Documents to append as an appendix to all manuals. 191 | # latex_appendices = [] 192 | 193 | # If false, no module index is generated. 194 | # latex_use_modindex = True 195 | 196 | 197 | # Example configuration for intersphinx: refer to the Python standard library. 
198 | intersphinx_mapping = { 199 | 'http://docs.python.org/': None, 200 | 'http://docs.scipy.org/doc/numpy': None, 201 | } 202 | -------------------------------------------------------------------------------- /cv_bridge/doc/index.rst: -------------------------------------------------------------------------------- 1 | cv_bridge 2 | ========= 3 | 4 | ``cv_bridge`` contains a single class :class:`CvBridge` that converts ROS Image messages to 5 | OpenCV images. 6 | 7 | .. module:: cv_bridge 8 | 9 | .. autoclass:: cv_bridge.CvBridge 10 | :members: 11 | 12 | .. autoclass:: cv_bridge.CvBridgeError 13 | 14 | Indices and tables 15 | ================== 16 | 17 | * :ref:`genindex` 18 | * :ref:`search` 19 | -------------------------------------------------------------------------------- /cv_bridge/doc/mainpage.dox: -------------------------------------------------------------------------------- 1 | /** 2 | \mainpage 3 | \htmlinclude manifest.html 4 | 5 | \b cv_bridge contains classes for easily converting between ROS 6 | sensor_msgs/Image messages and OpenCV images. 7 | 8 | \section codeapi Code API 9 | 10 | - cv_bridge::CvImage 11 | - toCvCopy() 12 | - toCvShare() 13 | 14 | */ 15 | -------------------------------------------------------------------------------- /cv_bridge/include/cv_bridge/cv_bridge.hpp: -------------------------------------------------------------------------------- 1 | /********************************************************************* 2 | * Software License Agreement (BSD License) 3 | * 4 | * Copyright (c) 2011, Willow Garage, Inc, 5 | * Copyright (c) 2015, Tal Regev. 6 | * Copyright (c) 2018 Intel Corporation. 7 | * All rights reserved. 8 | * 9 | * Redistribution and use in source and binary forms, with or without 10 | * modification, are permitted provided that the following conditions 11 | * are met: 12 | * 13 | * * Redistributions of source code must retain the above copyright 14 | * notice, this list of conditions and the following disclaimer. 15 | * * Redistributions in binary form must reproduce the above 16 | * copyright notice, this list of conditions and the following 17 | * disclaimer in the documentation and/or other materials provided 18 | * with the distribution. 19 | * * Neither the name of the Willow Garage nor the names of its 20 | * contributors may be used to endorse or promote products derived 21 | * from this software without specific prior written permission. 22 | * 23 | * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS 24 | * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT 25 | * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS 26 | * FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE 27 | * COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, 28 | * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, 29 | * BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; 30 | * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 31 | * CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT 32 | * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN 33 | * ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE 34 | * POSSIBILITY OF SUCH DAMAGE. 
35 | *********************************************************************/ 36 | 37 | #ifndef CV_BRIDGE__CV_BRIDGE_HPP_ 38 | #define CV_BRIDGE__CV_BRIDGE_HPP_ 39 | 40 | #include 41 | #include 42 | #include 43 | #include 44 | #include 45 | #include 46 | #include 47 | 48 | #include 49 | #include 50 | #include 51 | #include 52 | 53 | namespace cv_bridge 54 | { 55 | 56 | class CV_BRIDGE_EXPORT Exception : public std::runtime_error 57 | { 58 | public: 59 | explicit Exception(const std::string & description) 60 | : std::runtime_error(description) {} 61 | }; 62 | 63 | class CvImage; 64 | 65 | typedef std::shared_ptr CvImagePtr; 66 | typedef std::shared_ptr CvImageConstPtr; 67 | 68 | // From: http://docs.opencv.org/modules/highgui/doc/reading_and_writing_images_and_video.html#Mat 69 | // imread(const string& filename, int flags) 70 | typedef enum 71 | { 72 | BMP, DIB, 73 | JPG, JPEG, JPE, 74 | JP2, 75 | PNG, 76 | PBM, PGM, PPM, 77 | SR, RAS, 78 | TIFF, TIF, 79 | } Format; 80 | 81 | /** 82 | * \brief Image message class that is interoperable with sensor_msgs/Image but uses a 83 | * more convenient cv::Mat representation for the image data. 84 | */ 85 | class CV_BRIDGE_EXPORT CvImage 86 | { 87 | public: 88 | std_msgs::msg::Header header; // !< ROS header 89 | std::string encoding; // !< Image encoding ("mono8", "bgr8", etc.) 90 | cv::Mat image; // !< Image data for use with OpenCV 91 | 92 | /** 93 | * \brief Empty constructor. 94 | */ 95 | CvImage() {} 96 | 97 | /** 98 | * \brief Constructor. 99 | */ 100 | CvImage( 101 | const std_msgs::msg::Header & header, const std::string & encoding, 102 | const cv::Mat & image = cv::Mat()) 103 | : header(header), encoding(encoding), image(image) 104 | { 105 | } 106 | 107 | /** 108 | * \brief Convert this message to a ROS sensor_msgs::msg::Image message. 109 | * 110 | * The returned sensor_msgs::msg::Image message contains a copy of the image data. 111 | */ 112 | sensor_msgs::msg::Image::SharedPtr toImageMsg() const; 113 | 114 | /** 115 | * dst_format is compress the image to desire format. 116 | * Default value is empty string that will convert to jpg format. 117 | * can be: jpg, jp2, bmp, png, tif at the moment 118 | * support this format from opencv: 119 | * http://docs.opencv.org/modules/highgui/doc/reading_and_writing_images_and_video.html#Mat imread(const string& filename, int flags) 120 | */ 121 | sensor_msgs::msg::CompressedImage::SharedPtr toCompressedImageMsg( 122 | const Format dst_format = 123 | JPG) const; 124 | 125 | /** 126 | * \brief Copy the message data to a ROS sensor_msgs::msg::Image message. 127 | * 128 | * This overload is intended mainly for aggregate messages such as stereo_msgs::DisparityImage, 129 | * which contains a sensor_msgs::msg::Image as a data member. 130 | */ 131 | void toImageMsg(sensor_msgs::msg::Image & ros_image) const; 132 | 133 | /** 134 | * dst_format is compress the image to desire format. 135 | * Default value is empty string that will convert to jpg format. 
136 | * can be: jpg, jp2, bmp, png, tif at the moment 137 | * support this format from opencv: 138 | * http://docs.opencv.org/modules/highgui/doc/reading_and_writing_images_and_video.html#Mat imread(const string& filename, int flags) 139 | */ 140 | void toCompressedImageMsg( 141 | sensor_msgs::msg::CompressedImage & ros_image, 142 | const Format dst_format = JPG) const; 143 | 144 | 145 | typedef std::shared_ptr Ptr; 146 | typedef std::shared_ptr ConstPtr; 147 | 148 | protected: 149 | std::shared_ptr tracked_object_; // for sharing ownership 150 | 151 | /// @cond DOXYGEN_IGNORE 152 | friend 153 | CV_BRIDGE_EXPORT CvImageConstPtr toCvShare( 154 | const sensor_msgs::msg::Image & source, 155 | const std::shared_ptr & tracked_object, 156 | const std::string & encoding); 157 | /// @endcond 158 | }; 159 | 160 | 161 | /** 162 | * \brief Convert a sensor_msgs::msg::Image message to an OpenCV-compatible CvImage, copying the 163 | * image data. 164 | * 165 | * \param source A shared_ptr to a sensor_msgs::msg::Image message 166 | * \param encoding The desired encoding of the image data, one of the following strings: 167 | * - \c "mono8" 168 | * - \c "bgr8" 169 | * - \c "bgra8" 170 | * - \c "rgb8" 171 | * - \c "rgba8" 172 | * - \c "mono16" 173 | * 174 | * If \a encoding is the empty string (the default), the returned CvImage has the same encoding 175 | * as \a source. 176 | */ 177 | CV_BRIDGE_EXPORT CvImagePtr toCvCopy( 178 | const sensor_msgs::msg::Image::ConstSharedPtr & source, 179 | const std::string & encoding = std::string()); 180 | 181 | CV_BRIDGE_EXPORT CvImagePtr toCvCopy( 182 | const sensor_msgs::msg::CompressedImage::ConstSharedPtr & source, 183 | const std::string & encoding = std::string()); 184 | 185 | /** 186 | * \brief Convert a sensor_msgs::msg::Image message to an OpenCV-compatible CvImage, copying the 187 | * image data. 188 | * 189 | * \param source A sensor_msgs::msg::Image message 190 | * \param encoding The desired encoding of the image data, one of the following strings: 191 | * - \c "mono8" 192 | * - \c "bgr8" 193 | * - \c "bgra8" 194 | * - \c "rgb8" 195 | * - \c "rgba8" 196 | * - \c "mono16" 197 | * 198 | * If \a encoding is the empty string (the default), the returned CvImage has the same encoding 199 | * as \a source. 200 | * If the source is 8bit and the encoding 16 or vice-versa, a scaling is applied (65535/255 and 201 | * 255/65535 respectively). Otherwise, no scaling is applied and the rules from the convertTo OpenCV 202 | * function are applied (capping): http://docs.opencv.org/modules/core/doc/basic_structures.html#mat-convertto 203 | */ 204 | CV_BRIDGE_EXPORT CvImagePtr toCvCopy( 205 | const sensor_msgs::msg::Image & source, 206 | const std::string & encoding = std::string()); 207 | 208 | CV_BRIDGE_EXPORT CvImagePtr toCvCopy( 209 | const sensor_msgs::msg::CompressedImage & source, 210 | const std::string & encoding = std::string()); 211 | 212 | /** 213 | * \brief Convert an immutable sensor_msgs::msg::Image message to an OpenCV-compatible CvImage, sharing 214 | * the image data if possible. 215 | * 216 | * If the source encoding and desired encoding are the same, the returned CvImage will share 217 | * the image data with \a source without copying it. The returned CvImage cannot be modified, as that 218 | * could modify the \a source data. 
219 | * 220 | * \param source A shared_ptr to a sensor_msgs::msg::Image message 221 | * \param encoding The desired encoding of the image data, one of the following strings: 222 | * - \c "mono8" 223 | * - \c "bgr8" 224 | * - \c "bgra8" 225 | * - \c "rgb8" 226 | * - \c "rgba8" 227 | * - \c "mono16" 228 | * 229 | * If \a encoding is the empty string (the default), the returned CvImage has the same encoding 230 | * as \a source. 231 | */ 232 | CV_BRIDGE_EXPORT CvImageConstPtr toCvShare( 233 | const sensor_msgs::msg::Image::ConstSharedPtr & source, 234 | const std::string & encoding = std::string()); 235 | 236 | /** 237 | * \brief Convert an immutable sensor_msgs::msg::Image message to an OpenCV-compatible CvImage, sharing 238 | * the image data if possible. 239 | * 240 | * If the source encoding and desired encoding are the same, the returned CvImage will share 241 | * the image data with \a source without copying it. The returned CvImage cannot be modified, as that 242 | * could modify the \a source data. 243 | * 244 | * This overload is useful when you have a shared_ptr to a message that contains a 245 | * sensor_msgs::msg::Image, and wish to share ownership with the containing message. 246 | * 247 | * \param source The sensor_msgs::msg::Image message 248 | * \param tracked_object A shared_ptr to an object owning the sensor_msgs::msg::Image 249 | * \param encoding The desired encoding of the image data, one of the following strings: 250 | * - \c "mono8" 251 | * - \c "bgr8" 252 | * - \c "bgra8" 253 | * - \c "rgb8" 254 | * - \c "rgba8" 255 | * - \c "mono16" 256 | * 257 | * If \a encoding is the empty string (the default), the returned CvImage has the same encoding 258 | * as \a source. 259 | */ 260 | CV_BRIDGE_EXPORT CvImageConstPtr toCvShare( 261 | const sensor_msgs::msg::Image & source, 262 | const std::shared_ptr & tracked_object, 263 | const std::string & encoding = std::string()); 264 | 265 | /** 266 | * \brief Convert a CvImage to another encoding using the same rules as toCvCopy 267 | */ 268 | CV_BRIDGE_EXPORT CvImagePtr cvtColor( 269 | const CvImageConstPtr & source, 270 | const std::string & encoding); 271 | 272 | struct CvtColorForDisplayOptions 273 | { 274 | CvtColorForDisplayOptions() 275 | : do_dynamic_scaling(false), 276 | min_image_value(0.0), 277 | max_image_value(0.0), 278 | colormap(-1), 279 | bg_label(-1) {} 280 | bool do_dynamic_scaling; 281 | double min_image_value; 282 | double max_image_value; 283 | int colormap; 284 | int bg_label; 285 | }; 286 | 287 | 288 | /** 289 | * \brief Converts an immutable sensor_msgs::msg::Image message to another CvImage for display purposes, 290 | * using practical conversion rules if needed. 291 | * 292 | * Data will be shared between input and output if possible. 293 | * 294 | * Recall: sensor_msgs::msg::image_encodings::isColor and isMono tell whether an image contains R,G,B,A, mono 295 | * (or any combination/subset) with 8 or 16 bit depth. 296 | * 297 | * The following rules apply: 298 | * - if the output encoding is empty, the fact that the input image is mono or multiple-channel is 299 | * preserved in the ouput image. The bit depth will be 8. it tries to convert to BGR no matter what 300 | * encoding image is passed. 301 | * - if the output encoding is not empty, it must have sensor_msgs::msg::image_encodings::isColor and 302 | * isMono return true. It must also be 8 bit in depth 303 | * - if the input encoding is an OpenCV format (e.g. 
8UC1), and if we have 1,3 or 4 channels, it is 304 | * respectively converted to mono, BGR or BGRA. 305 | * - if the input encoding is 32SC1, this estimate that image as label image and will convert it as 306 | * bgr image with different colors for each label. 307 | * 308 | * \param source A shared_ptr to a sensor_msgs::msg::Image message 309 | * \param encoding Either an encoding string that returns true in sensor_msgs::msg::image_encodings::isColor 310 | * isMono or the empty string as explained above. 311 | * \param options (cv_bridge::CvtColorForDisplayOptions) Options to convert the source image with. 312 | * - do_dynamic_scaling If true, the image is dynamically scaled between its minimum and maximum value 313 | * before being converted to its final encoding. 314 | * - min_image_value Independently from do_dynamic_scaling, if min_image_value and max_image_value are 315 | * different, the image is scaled between these two values before being converted to its final encoding. 316 | * - max_image_value Maximum image value 317 | * - colormap Colormap which the source image converted with. 318 | */ 319 | CV_BRIDGE_EXPORT CvImageConstPtr cvtColorForDisplay( 320 | const CvImageConstPtr & source, 321 | const std::string & encoding = std::string(), 322 | const CvtColorForDisplayOptions options = CvtColorForDisplayOptions()); 323 | 324 | /** 325 | * \brief Get the OpenCV type enum corresponding to the encoding. 326 | * 327 | * For example, "bgr8" -> CV_8UC3, "32FC1" -> CV_32FC1, and "32FC10" -> CV_32FC10. 328 | */ 329 | CV_BRIDGE_EXPORT int getCvType(const std::string & encoding); 330 | 331 | } // namespace cv_bridge 332 | 333 | #endif // CV_BRIDGE__CV_BRIDGE_HPP_ 334 | -------------------------------------------------------------------------------- /cv_bridge/include/cv_bridge/cv_mat_sensor_msgs_image_type_adapter.hpp: -------------------------------------------------------------------------------- 1 | // Copyright 2021 Open Source Robotics Foundation, Inc. 2 | // 3 | // Licensed under the Apache License, Version 2.0 (the "License"); 4 | // you may not use this file except in compliance with the License. 5 | // You may obtain a copy of the License at 6 | // 7 | // http://www.apache.org/licenses/LICENSE-2.0 8 | // 9 | // Unless required by applicable law or agreed to in writing, software 10 | // distributed under the License is distributed on an "AS IS" BASIS, 11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | // See the License for the specific language governing permissions and 13 | // limitations under the License. 14 | 15 | #ifndef CV_BRIDGE__CV_MAT_SENSOR_MSGS_IMAGE_TYPE_ADAPTER_HPP_ 16 | #define CV_BRIDGE__CV_MAT_SENSOR_MSGS_IMAGE_TYPE_ADAPTER_HPP_ 17 | 18 | #include 19 | #include 20 | #include 21 | 22 | #include "opencv2/core/mat.hpp" 23 | 24 | #include "rclcpp/type_adapter.hpp" 25 | #include "sensor_msgs/msg/image.hpp" 26 | 27 | #include "cv_bridge/visibility_control.h" 28 | 29 | #include 30 | 31 | namespace cv_bridge 32 | { 33 | namespace detail 34 | { 35 | // TODO(audrow): Replace with std::endian when C++ 20 is available 36 | // https://en.cppreference.com/w/cpp/types/endian 37 | enum class endian 38 | { 39 | #ifdef _WIN32 40 | little = 0, 41 | big = 1, 42 | native = little 43 | #else 44 | little = __ORDER_LITTLE_ENDIAN__, 45 | big = __ORDER_BIG_ENDIAN__, 46 | native = __BYTE_ORDER__ 47 | #endif 48 | }; 49 | 50 | } // namespace detail 51 | 52 | 53 | /// A potentially owning, potentially non-owning, container of a cv::Mat and ROS header. 
54 | /** 55 | * The two main use cases for this are publishing user controlled data, and 56 | * recieving data from the middleware that may have been a ROS message 57 | * originally or may have been an cv::Mat originally. 58 | * 59 | * In the first case, publishing user owned data, the user will want to provide 60 | * their own cv::Mat. 61 | * The cv::Mat may own the data or it may not, so in the latter case, it is up 62 | * to the user to ensure the data the cv::Mat points to remains valid as long 63 | * as the middleware needs it. 64 | * 65 | * In the second case, receiving data from the middleware, the middleware will 66 | * either give a new ROSCvMatContainer which owns a sensor_msgs::msg::Image or 67 | * it will give a ROSCvMatContainer that was previously published by the user 68 | * (in the case of intra-process communication). 69 | * If the container owns the sensor_msgs::msg::Image, then the cv::Mat will just 70 | * reference data field of this message, so the container needs to be kept. 71 | * If the container was published by the user it may or may not own the data 72 | * and the cv::Mat it contains may or may not own the data. 73 | * 74 | * For these reasons, it is advisable to use cv::Mat::clone() if you intend to 75 | * copy the cv::Mat and let this container go. 76 | * 77 | * For more details about the ownership behavior of cv::Mat see documentation 78 | * for these methods of cv::Mat: 79 | * 80 | * - template cv::Mat::Mat(const std::vector<_Tp> &, bool) 81 | * - Mat & cv::Mat::operator=(const Mat &) 82 | * - void cv::Mat::addref() 83 | * - void cv::Mat::release() 84 | * 85 | */ 86 | class ROSCvMatContainer 87 | { 88 | static constexpr bool is_bigendian_system = detail::endian::native == detail::endian::big; 89 | 90 | public: 91 | using SensorMsgsImageStorageType = std::variant< 92 | std::nullptr_t, 93 | std::unique_ptr, 94 | std::shared_ptr 95 | >; 96 | 97 | CV_BRIDGE_PUBLIC 98 | ROSCvMatContainer() = default; 99 | 100 | CV_BRIDGE_PUBLIC 101 | explicit ROSCvMatContainer(const ROSCvMatContainer & other) 102 | : header_(other.header_), frame_(other.frame_.clone()), is_bigendian_(other.is_bigendian_) 103 | { 104 | if (std::holds_alternative>(other.storage_)) { 105 | storage_ = std::get>(other.storage_); 106 | } else if (std::holds_alternative>(other.storage_)) { 107 | storage_ = std::make_unique( 108 | *std::get>(other.storage_)); 109 | } 110 | } 111 | 112 | CV_BRIDGE_PUBLIC 113 | ROSCvMatContainer & operator=(const ROSCvMatContainer & other) 114 | { 115 | if (this != &other) { 116 | header_ = other.header_; 117 | frame_ = other.frame_.clone(); 118 | is_bigendian_ = other.is_bigendian_; 119 | if (std::holds_alternative>(other.storage_)) { 120 | storage_ = std::get>(other.storage_); 121 | } else if (std::holds_alternative>(other.storage_)) { 122 | storage_ = std::make_unique( 123 | *std::get>(other.storage_)); 124 | } else if (std::holds_alternative(other.storage_)) { 125 | storage_ = nullptr; 126 | } 127 | } 128 | return *this; 129 | } 130 | 131 | /// Store an owning pointer to a sensor_msg::msg::Image, and create a cv::Mat that references it. 132 | CV_BRIDGE_PUBLIC 133 | explicit ROSCvMatContainer(std::unique_ptr unique_sensor_msgs_image); 134 | 135 | /// Store an owning pointer to a sensor_msg::msg::Image, and create a cv::Mat that references it. 136 | CV_BRIDGE_PUBLIC 137 | explicit ROSCvMatContainer(std::shared_ptr shared_sensor_msgs_image); 138 | 139 | /// Shallow copy the given cv::Mat into this class, but do not own the data directly. 
140 | CV_BRIDGE_PUBLIC 141 | ROSCvMatContainer( 142 | const cv::Mat & mat_frame, 143 | const std_msgs::msg::Header & header, 144 | bool is_bigendian = is_bigendian_system, 145 | std::optional encoding_override = std::nullopt); 146 | 147 | /// Move the given cv::Mat into this class. 148 | CV_BRIDGE_PUBLIC 149 | ROSCvMatContainer( 150 | cv::Mat && mat_frame, 151 | const std_msgs::msg::Header & header, 152 | bool is_bigendian = is_bigendian_system, 153 | std::optional encoding_override = std::nullopt); 154 | 155 | /// Copy the sensor_msgs::msg::Image into this contain and create a cv::Mat that references it. 156 | CV_BRIDGE_PUBLIC 157 | explicit ROSCvMatContainer(const sensor_msgs::msg::Image & sensor_msgs_image); 158 | 159 | /// Return true if this class owns the data the cv_mat references. 160 | /** 161 | * Note that this does not check if the cv::Mat owns its own data, only if 162 | * this class owns a sensor_msgs::msg::Image that the cv::Mat references. 163 | */ 164 | CV_BRIDGE_PUBLIC 165 | bool 166 | is_owning() const; 167 | 168 | /// Const access the cv::Mat in this class. 169 | CV_BRIDGE_PUBLIC 170 | const cv::Mat & 171 | cv_mat() const; 172 | 173 | /// Get a shallow copy of the cv::Mat that is in this class. 174 | /** 175 | * Note that if you want to let this container go out of scope you should 176 | * make a deep copy with cv::Mat::clone() beforehand. 177 | */ 178 | CV_BRIDGE_PUBLIC 179 | cv::Mat 180 | cv_mat(); 181 | 182 | /// Const access the ROS Header. 183 | CV_BRIDGE_PUBLIC 184 | const std_msgs::msg::Header & 185 | header() const; 186 | 187 | /// Access the ROS Header. 188 | CV_BRIDGE_PUBLIC 189 | std_msgs::msg::Header & 190 | header(); 191 | 192 | /// Get shared const pointer to the sensor_msgs::msg::Image if available, otherwise nullptr. 193 | CV_BRIDGE_PUBLIC 194 | std::shared_ptr 195 | get_sensor_msgs_msg_image_pointer() const; 196 | 197 | /// Get copy as a unique pointer to the sensor_msgs::msg::Image. 198 | CV_BRIDGE_PUBLIC 199 | std::unique_ptr 200 | get_sensor_msgs_msg_image_pointer_copy() const; 201 | 202 | /// Get a copy of the image as a sensor_msgs::msg::Image. 203 | CV_BRIDGE_PUBLIC 204 | sensor_msgs::msg::Image 205 | get_sensor_msgs_msg_image_copy() const; 206 | 207 | /// Get a copy of the image as a sensor_msgs::msg::Image. 208 | CV_BRIDGE_PUBLIC 209 | void 210 | get_sensor_msgs_msg_image_copy(sensor_msgs::msg::Image & sensor_msgs_image) const; 211 | 212 | /// Return true if the data is stored in big endian, otherwise return false. 213 | CV_BRIDGE_PUBLIC 214 | bool 215 | is_bigendian() const; 216 | 217 | /// Return the encoding override if provided. 
218 | CV_BRIDGE_PUBLIC 219 | std::optional 220 | encoding_override() const; 221 | 222 | private: 223 | std_msgs::msg::Header header_; 224 | cv::Mat frame_; 225 | SensorMsgsImageStorageType storage_; 226 | bool is_bigendian_; 227 | std::optional encoding_override_; 228 | }; 229 | 230 | } // namespace cv_bridge 231 | 232 | static void fill_sensor_msgs_image_from_cvmat( 233 | const cv::Mat& cv_mat, 234 | sensor_msgs::msg::Image& image, 235 | const std_msgs::msg::Header& header, 236 | const std::optional& encoding_override = std::nullopt) 237 | { 238 | image.height = cv_mat.rows; 239 | image.width = cv_mat.cols; 240 | 241 | if (encoding_override.has_value() && !encoding_override.value().empty()) { 242 | image.encoding = encoding_override.value(); 243 | } else { 244 | switch (cv_mat.type()) { 245 | case CV_8UC1: 246 | image.encoding = "mono8"; 247 | break; 248 | case CV_8UC3: 249 | image.encoding = "bgr8"; 250 | break; 251 | case CV_16SC1: 252 | image.encoding = "mono16"; 253 | break; 254 | case CV_8UC4: 255 | image.encoding = "rgba8"; 256 | break; 257 | default: 258 | throw std::runtime_error("unsupported encoding type"); 259 | } 260 | } 261 | 262 | image.step = static_cast(cv_mat.step); 263 | const size_t size = cv_mat.step * cv_mat.rows; 264 | image.data.resize(size); 265 | memcpy(&image.data[0], cv_mat.data, size); 266 | image.header = header; 267 | } 268 | 269 | template<> 270 | struct rclcpp::TypeAdapter 271 | { 272 | using is_specialized = std::true_type; 273 | using custom_type = cv_bridge::ROSCvMatContainer; 274 | using ros_message_type = sensor_msgs::msg::Image; 275 | 276 | static void convert_to_ros_message( 277 | const custom_type& source, 278 | ros_message_type& destination) 279 | { 280 | fill_sensor_msgs_image_from_cvmat( 281 | source.cv_mat(), destination, source.header(), source.encoding_override()); 282 | } 283 | 284 | static void convert_to_custom( 285 | const ros_message_type& source, 286 | custom_type& destination) 287 | { 288 | destination = cv_bridge::ROSCvMatContainer(source); 289 | } 290 | }; 291 | 292 | #endif // CV_BRIDGE__CV_MAT_SENSOR_MSGS_IMAGE_TYPE_ADAPTER_HPP_ 293 | -------------------------------------------------------------------------------- /cv_bridge/include/cv_bridge/rgb_colors.hpp: -------------------------------------------------------------------------------- 1 | /********************************************************************* 2 | * Original color definition is at scikit-image distributed with 3 | * following license disclaimer: 4 | * 5 | * Copyright (C) 2011, the scikit-image team 6 | * Copyright (c) 2018 Intel Corporation. 7 | * All rights reserved. 8 | * 9 | * Redistribution and use in source and binary forms, with or without 10 | * modification, are permitted provided that the following conditions are 11 | * met: 12 | * 13 | * 1. Redistributions of source code must retain the above copyright 14 | * notice, this list of conditions and the following disclaimer. 15 | * 2. Redistributions in binary form must reproduce the above copyright 16 | * notice, this list of conditions and the following disclaimer in 17 | * the documentation and/or other materials provided with the 18 | * distribution. 19 | * 3. Neither the name of skimage nor the names of its contributors may be 20 | * used to endorse or promote products derived from this software without 21 | * specific prior written permission. 
22 | * 23 | * THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR 24 | * IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED 25 | * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 26 | * DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, 27 | * INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES 28 | * (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 29 | * SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) 30 | * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, 31 | * STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING 32 | * IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE 33 | * POSSIBILITY OF SUCH DAMAGE. 34 | *********************************************************************/ 35 | 36 | #ifndef CV_BRIDGE__RGB_COLORS_HPP_ 37 | #define CV_BRIDGE__RGB_COLORS_HPP_ 38 | 39 | #include 40 | #include 41 | 42 | 43 | namespace cv_bridge 44 | { 45 | 46 | namespace rgb_colors 47 | { 48 | 49 | /** 50 | * @brief 51 | * 146 rgb colors 52 | */ 53 | enum Colors 54 | { 55 | ALICEBLUE, 56 | ANTIQUEWHITE, 57 | AQUA, 58 | AQUAMARINE, 59 | AZURE, 60 | BEIGE, 61 | BISQUE, 62 | BLACK, 63 | BLANCHEDALMOND, 64 | BLUE, 65 | BLUEVIOLET, 66 | BROWN, 67 | BURLYWOOD, 68 | CADETBLUE, 69 | CHARTREUSE, 70 | CHOCOLATE, 71 | CORAL, 72 | CORNFLOWERBLUE, 73 | CORNSILK, 74 | CRIMSON, 75 | CYAN, 76 | DARKBLUE, 77 | DARKCYAN, 78 | DARKGOLDENROD, 79 | DARKGRAY, 80 | DARKGREEN, 81 | DARKGREY, 82 | DARKKHAKI, 83 | DARKMAGENTA, 84 | DARKOLIVEGREEN, 85 | DARKORANGE, 86 | DARKORCHID, 87 | DARKRED, 88 | DARKSALMON, 89 | DARKSEAGREEN, 90 | DARKSLATEBLUE, 91 | DARKSLATEGRAY, 92 | DARKSLATEGREY, 93 | DARKTURQUOISE, 94 | DARKVIOLET, 95 | DEEPPINK, 96 | DEEPSKYBLUE, 97 | DIMGRAY, 98 | DIMGREY, 99 | DODGERBLUE, 100 | FIREBRICK, 101 | FLORALWHITE, 102 | FORESTGREEN, 103 | FUCHSIA, 104 | GAINSBORO, 105 | GHOSTWHITE, 106 | GOLD, 107 | GOLDENROD, 108 | GRAY, 109 | GREEN, 110 | GREENYELLOW, 111 | GREY, 112 | HONEYDEW, 113 | HOTPINK, 114 | INDIANRED, 115 | INDIGO, 116 | IVORY, 117 | KHAKI, 118 | LAVENDER, 119 | LAVENDERBLUSH, 120 | LAWNGREEN, 121 | LEMONCHIFFON, 122 | LIGHTBLUE, 123 | LIGHTCORAL, 124 | LIGHTCYAN, 125 | LIGHTGOLDENRODYELLOW, 126 | LIGHTGRAY, 127 | LIGHTGREEN, 128 | LIGHTGREY, 129 | LIGHTPINK, 130 | LIGHTSALMON, 131 | LIGHTSEAGREEN, 132 | LIGHTSKYBLUE, 133 | LIGHTSLATEGRAY, 134 | LIGHTSLATEGREY, 135 | LIGHTSTEELBLUE, 136 | LIGHTYELLOW, 137 | LIME, 138 | LIMEGREEN, 139 | LINEN, 140 | MAGENTA, 141 | MAROON, 142 | MEDIUMAQUAMARINE, 143 | MEDIUMBLUE, 144 | MEDIUMORCHID, 145 | MEDIUMPURPLE, 146 | MEDIUMSEAGREEN, 147 | MEDIUMSLATEBLUE, 148 | MEDIUMSPRINGGREEN, 149 | MEDIUMTURQUOISE, 150 | MEDIUMVIOLETRED, 151 | MIDNIGHTBLUE, 152 | MINTCREAM, 153 | MISTYROSE, 154 | MOCCASIN, 155 | NAVAJOWHITE, 156 | NAVY, 157 | OLDLACE, 158 | OLIVE, 159 | OLIVEDRAB, 160 | ORANGE, 161 | ORANGERED, 162 | ORCHID, 163 | PALEGOLDENROD, 164 | PALEGREEN, 165 | PALEVIOLETRED, 166 | PAPAYAWHIP, 167 | PEACHPUFF, 168 | PERU, 169 | PINK, 170 | PLUM, 171 | POWDERBLUE, 172 | PURPLE, 173 | RED, 174 | ROSYBROWN, 175 | ROYALBLUE, 176 | SADDLEBROWN, 177 | SALMON, 178 | SANDYBROWN, 179 | SEAGREEN, 180 | SEASHELL, 181 | SIENNA, 182 | SILVER, 183 | SKYBLUE, 184 | SLATEBLUE, 185 | SLATEGRAY, 186 | SLATEGREY, 187 | SNOW, 188 | SPRINGGREEN, 189 | STEELBLUE, 190 | TAN, 191 | TEAL, 192 | THISTLE, 193 | TOMATO, 194 | TURQUOISE, 195 | VIOLET, 196 | WHEAT, 197 | WHITE, 198 | WHITESMOKE, 199 | YELLOW, 200 | 
YELLOWGREEN, 201 | }; 202 | 203 | /** 204 | * @brief 205 | * get rgb color with enum. 206 | */ 207 | CV_BRIDGE_EXPORT cv::Vec3d getRGBColor(const int color); 208 | 209 | } // namespace rgb_colors 210 | 211 | } // namespace cv_bridge 212 | 213 | #endif // CV_BRIDGE__RGB_COLORS_HPP_ 214 | -------------------------------------------------------------------------------- /cv_bridge/include/cv_bridge/visibility_control.h: -------------------------------------------------------------------------------- 1 | // Copyright 2016 Open Source Robotics Foundation, Inc. 2 | // 3 | // Licensed under the Apache License, Version 2.0 (the "License"); 4 | // you may not use this file except in compliance with the License. 5 | // You may obtain a copy of the License at 6 | // 7 | // http://www.apache.org/licenses/LICENSE-2.0 8 | // 9 | // Unless required by applicable law or agreed to in writing, software 10 | // distributed under the License is distributed on an "AS IS" BASIS, 11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | // See the License for the specific language governing permissions and 13 | // limitations under the License. 14 | 15 | #ifndef CV_BRIDGE__VISIBILITY_CONTROL_H_ 16 | #define CV_BRIDGE__VISIBILITY_CONTROL_H_ 17 | 18 | #ifdef __cplusplus 19 | extern "C" 20 | { 21 | #endif 22 | 23 | // This logic was borrowed (then namespaced) from the examples on the gcc wiki: 24 | // https://gcc.gnu.org/wiki/Visibility 25 | 26 | #if defined _WIN32 || defined __CYGWIN__ 27 | #ifdef __GNUC__ 28 | #define CV_BRIDGE_EXPORT __attribute__ ((dllexport)) 29 | #define CV_BRIDGE_IMPORT __attribute__ ((dllimport)) 30 | #else 31 | #define CV_BRIDGE_EXPORT __declspec(dllexport) 32 | #define CV_BRIDGE_IMPORT __declspec(dllimport) 33 | #endif 34 | #ifdef CV_BRIDGE_BUILDING_DLL 35 | #define CV_BRIDGE_PUBLIC CV_BRIDGE_EXPORT 36 | #else 37 | #define CV_BRIDGE_PUBLIC CV_BRIDGE_IMPORT 38 | #endif 39 | #define CV_BRIDGE_PUBLIC_TYPE CV_BRIDGE_PUBLIC 40 | #define CV_BRIDGE_LOCAL 41 | #else 42 | #define CV_BRIDGE_EXPORT __attribute__ ((visibility("default"))) 43 | #define CV_BRIDGE_IMPORT 44 | #if __GNUC__ >= 4 45 | #define CV_BRIDGE_PUBLIC __attribute__ ((visibility("default"))) 46 | #define CV_BRIDGE_LOCAL __attribute__ ((visibility("hidden"))) 47 | #else 48 | #define CV_BRIDGE_PUBLIC 49 | #define CV_BRIDGE_LOCAL 50 | #endif 51 | #define CV_BRIDGE_PUBLIC_TYPE 52 | #endif 53 | 54 | #ifdef __cplusplus 55 | } 56 | #endif 57 | 58 | #endif // CV_BRIDGE__VISIBILITY_CONTROL_H_ 59 | -------------------------------------------------------------------------------- /cv_bridge/package.xml: -------------------------------------------------------------------------------- 1 | 2 | cv_bridge 3 | 4.1.0 4 | 5 | This contains CvBridge, which converts between ROS2 6 | Image messages and OpenCV images. 
7 | 8 | Kenji Brameld 9 | Apache License 2.0 10 | BSD 11 | http://www.ros.org/wiki/cv_bridge 12 | https://github.com/ros-perception/vision_opencv/tree/ros2 13 | https://github.com/ros-perception/vision_opencv/issues 14 | 15 | Patrick Mihelich 16 | James Bowman 17 | Ethan Gao 18 | 19 | ament_cmake_ros 20 | python_cmake_module 21 | 22 | libboost-dev 23 | libboost-python-dev 24 | 25 | libopencv-dev 26 | python3-numpy 27 | rclcpp 28 | rcpputils 29 | sensor_msgs 30 | python3-opencv 31 | 32 | ament_index_python 33 | libboost-python 34 | 35 | ament_cmake_gtest 36 | ament_cmake_pytest 37 | ament_lint_auto 38 | ament_lint_common 39 | 40 | 41 | ament_cmake 42 | 43 | 44 | 45 | -------------------------------------------------------------------------------- /cv_bridge/python/cv_bridge/__init__.py: -------------------------------------------------------------------------------- 1 | from .core import CvBridge, CvBridgeError 2 | 3 | # python bindings 4 | # This try is just to satisfy doc jobs that are built differently. 5 | try: 6 | from cv_bridge.boost.cv_bridge_boost import cvtColorForDisplay, getCvType 7 | except ImportError: 8 | pass 9 | -------------------------------------------------------------------------------- /cv_bridge/python/cv_bridge/core.py: -------------------------------------------------------------------------------- 1 | # Software License Agreement (BSD License) 2 | # 3 | # Copyright (c) 2011, Willow Garage, Inc. 4 | # Copyright (c) 2016, Tal Regev. 5 | # Copyright (c) 2018 Intel Corporation. 6 | # All rights reserved. 7 | # 8 | # Redistribution and use in source and binary forms, with or without 9 | # modification, are permitted provided that the following conditions 10 | # are met: 11 | # 12 | # * Redistributions of source code must retain the above copyright 13 | # notice, this list of conditions and the following disclaimer. 14 | # * Redistributions in binary form must reproduce the above 15 | # copyright notice, this list of conditions and the following 16 | # disclaimer in the documentation and/or other materials provided 17 | # with the distribution. 18 | # * Neither the name of Willow Garage, Inc. nor the names of its 19 | # contributors may be used to endorse or promote products derived 20 | # from this software without specific prior written permission. 21 | # 22 | # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS 23 | # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT 24 | # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS 25 | # FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE 26 | # COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, 27 | # INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, 28 | # BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; 29 | # LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 30 | # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT 31 | # LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN 32 | # ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE 33 | # POSSIBILITY OF SUCH DAMAGE. 
34 | #################################################################### 35 | 36 | import sys 37 | 38 | import sensor_msgs.msg 39 | 40 | 41 | class CvBridgeError(TypeError): 42 | """This is the error raised by :class:`cv_bridge.CvBridge` methods when they fail.""" 43 | 44 | pass 45 | 46 | 47 | class CvBridge(object): 48 | """ 49 | The CvBridge is an object that converts between OpenCV Images and ROS Image messages. 50 | 51 | .. doctest:: 52 | :options: -ELLIPSIS, +NORMALIZE_WHITESPACE 53 | 54 | >>> import cv2 55 | >>> import numpy as np 56 | >>> from cv_bridge import CvBridge 57 | >>> br = CvBridge() 58 | >>> dtype, n_channels = br.encoding_as_cvtype2('8UC3') 59 | >>> im = np.ndarray(shape=(480, 640, n_channels), dtype=dtype) 60 | >>> msg = br.cv2_to_imgmsg(im) # Convert the image to a message 61 | >>> im2 = br.imgmsg_to_cv2(msg) # Convert the message to a new image 62 | >>> # Convert the image to a compress message 63 | >>> cmprsmsg = br.cv2_to_compressed_imgmsg(im) 64 | >>> # Convert the compress message to a new image 65 | >>> im22 = br.compressed_imgmsg_to_cv2(msg) 66 | >>> cv2.imwrite("this_was_a_message_briefly.png", im2) 67 | 68 | """ 69 | 70 | def __init__(self): 71 | import cv2 72 | self.cvtype_to_name = {} 73 | self.cvdepth_to_numpy_depth = {cv2.CV_8U: 'uint8', cv2.CV_8S: 'int8', 74 | cv2.CV_16U: 'uint16', cv2.CV_16S: 'int16', 75 | cv2.CV_32S: 'int32', cv2.CV_32F: 'float32', 76 | cv2.CV_64F: 'float64'} 77 | 78 | for t in ['8U', '8S', '16U', '16S', '32S', '32F', '64F']: 79 | for c in [1, 2, 3, 4]: 80 | nm = '%sC%d' % (t, c) 81 | self.cvtype_to_name[getattr(cv2, 'CV_%s' % nm)] = nm 82 | 83 | self.numpy_type_to_cvtype = {'uint8': '8U', 'int8': '8S', 'uint16': '16U', 84 | 'int16': '16S', 'int32': '32S', 'float32': '32F', 85 | 'float64': '64F'} 86 | self.numpy_type_to_cvtype.update(dict((v, k) for (k, v) in self.numpy_type_to_cvtype.items())) 87 | 88 | def dtype_with_channels_to_cvtype2(self, dtype, n_channels): 89 | return '%sC%d' % (self.numpy_type_to_cvtype[dtype.name], n_channels) 90 | 91 | def cvtype2_to_dtype_with_channels(self, cvtype): 92 | from cv_bridge.boost.cv_bridge_boost import CV_MAT_CNWrap, CV_MAT_DEPTHWrap 93 | return self.cvdepth_to_numpy_depth[CV_MAT_DEPTHWrap(cvtype)], CV_MAT_CNWrap(cvtype) 94 | 95 | def encoding_to_cvtype2(self, encoding): 96 | from cv_bridge.boost.cv_bridge_boost import getCvType 97 | 98 | try: 99 | return getCvType(encoding) 100 | except RuntimeError as e: 101 | raise CvBridgeError(e) 102 | 103 | def encoding_to_dtype_with_channels(self, encoding): 104 | return self.cvtype2_to_dtype_with_channels(self.encoding_to_cvtype2(encoding)) 105 | 106 | def compressed_imgmsg_to_cv2(self, cmprs_img_msg, desired_encoding='passthrough'): 107 | """ 108 | Convert a sensor_msgs::CompressedImage message to an OpenCV :cpp:type:`cv::Mat`. 109 | 110 | :param cmprs_img_msg: A :cpp:type:`sensor_msgs::CompressedImage` message 111 | :param desired_encoding: The encoding of the image data, one of the following strings: 112 | 113 | * ``"passthrough"`` 114 | * one of the standard strings in sensor_msgs/image_encodings.h 115 | 116 | :rtype: :cpp:type:`cv::Mat` 117 | :raises CvBridgeError: when conversion is not possible. 118 | 119 | If desired_encoding is ``"passthrough"``, then the returned image has the same format 120 | as img_msg. Otherwise desired_encoding must be one of the standard image encodings 121 | 122 | This function returns an OpenCV :cpp:type:`cv::Mat` message on success, 123 | or raises :exc:`cv_bridge.CvBridgeError` on failure. 
124 | 125 | If the image only has one channel, the shape has size 2 (width and height) 126 | """ 127 | import cv2 128 | import numpy as np 129 | 130 | str_msg = cmprs_img_msg.data 131 | buf = np.ndarray(shape=(1, len(str_msg)), 132 | dtype=np.uint8, buffer=cmprs_img_msg.data) 133 | im = cv2.imdecode(buf, cv2.IMREAD_UNCHANGED) 134 | 135 | if desired_encoding == 'passthrough': 136 | return im 137 | 138 | from cv_bridge.boost.cv_bridge_boost import cvtColor2 139 | 140 | try: 141 | res = cvtColor2(im, 'bgr8', desired_encoding) 142 | except RuntimeError as e: 143 | raise CvBridgeError(e) 144 | 145 | return res 146 | 147 | def imgmsg_to_cv2(self, img_msg, desired_encoding='passthrough'): 148 | """ 149 | Convert a sensor_msgs::Image message to an OpenCV :cpp:type:`cv::Mat`. 150 | 151 | :param img_msg: A :cpp:type:`sensor_msgs::Image` message 152 | :param desired_encoding: The encoding of the image data, one of the following strings: 153 | 154 | * ``"passthrough"`` 155 | * one of the standard strings in sensor_msgs/image_encodings.h 156 | 157 | :rtype: :cpp:type:`cv::Mat` 158 | :raises CvBridgeError: when conversion is not possible. 159 | 160 | If desired_encoding is ``"passthrough"``, then the returned image has the same format 161 | as img_msg. Otherwise desired_encoding must be one of the standard image encodings 162 | 163 | This function returns an OpenCV :cpp:type:`cv::Mat` message on success, 164 | or raises :exc:`cv_bridge.CvBridgeError` on failure. 165 | 166 | If the image only has one channel, the shape has size 2 (width and height) 167 | """ 168 | import numpy as np 169 | dtype, n_channels = self.encoding_to_dtype_with_channels(img_msg.encoding) 170 | dtype = np.dtype(dtype) 171 | dtype = dtype.newbyteorder('>' if img_msg.is_bigendian else '<') 172 | 173 | img_buf = np.asarray(img_msg.data, dtype=dtype) if isinstance(img_msg.data, list) else img_msg.data 174 | 175 | if n_channels == 1: 176 | im = np.ndarray(shape=(img_msg.height, int(img_msg.step/dtype.itemsize)), 177 | dtype=dtype, buffer=img_buf) 178 | im = np.ascontiguousarray(im[:img_msg.height, :img_msg.width]) 179 | else: 180 | im = np.ndarray(shape=(img_msg.height, int(img_msg.step/dtype.itemsize/n_channels), n_channels), 181 | dtype=dtype, buffer=img_buf) 182 | im = np.ascontiguousarray(im[:img_msg.height, :img_msg.width, :]) 183 | 184 | # If the byte order is different between the message and the system. 185 | if img_msg.is_bigendian == (sys.byteorder == 'little'): 186 | im = im.byteswap().newbyteorder() 187 | 188 | if desired_encoding == 'passthrough': 189 | return im 190 | 191 | from cv_bridge.boost.cv_bridge_boost import cvtColor2 192 | 193 | try: 194 | res = cvtColor2(im, img_msg.encoding, desired_encoding) 195 | except RuntimeError as e: 196 | raise CvBridgeError(e) 197 | 198 | return res 199 | 200 | def cv2_to_compressed_imgmsg(self, cvim, dst_format='jpg'): 201 | """ 202 | Convert an OpenCV :cpp:type:`cv::Mat` type to a ROS sensor_msgs::CompressedImage message. 
203 | 204 | :param cvim: An OpenCV :cpp:type:`cv::Mat` 205 | :param dst_format: The format of the image data, one of the following strings: 206 | 207 | http://docs.opencv.org/2.4/modules/highgui/doc/reading_and_writing_images_and_video.html 208 | http://docs.opencv.org/2.4/modules/highgui/doc/reading_and_writing_images_and_video.html#Mat 209 | * imread(const string& filename, int flags) 210 | * bmp, dib 211 | * jpeg, jpg, jpe 212 | * jp2 213 | * png 214 | * pbm, pgm, ppm 215 | * sr, ras 216 | * tiff, tif 217 | 218 | :rtype: A sensor_msgs.msg.CompressedImage message 219 | :raises CvBridgeError: when the ``cvim`` has a type that is incompatible with ``format`` 220 | 221 | 222 | This function returns a sensor_msgs::Image message on success, 223 | or raises :exc:`cv_bridge.CvBridgeError` on failure. 224 | """ 225 | import cv2 226 | import numpy as np 227 | if not isinstance(cvim, (np.ndarray, np.generic)): 228 | raise TypeError('Your input type is not a numpy array') 229 | cmprs_img_msg = sensor_msgs.msg.CompressedImage() 230 | cmprs_img_msg.format = dst_format 231 | ext_format = '.' + dst_format 232 | try: 233 | cmprs_img_msg.data.frombytes(np.array(cv2.imencode(ext_format, cvim)[1]).tobytes()) 234 | except RuntimeError as e: 235 | raise CvBridgeError(e) 236 | 237 | return cmprs_img_msg 238 | 239 | def cv2_to_imgmsg(self, cvim, encoding='passthrough', header = None): 240 | """ 241 | Convert an OpenCV :cpp:type:`cv::Mat` type to a ROS sensor_msgs::Image message. 242 | 243 | :param cvim: An OpenCV :cpp:type:`cv::Mat` 244 | :param encoding: The encoding of the image data, one of the following strings: 245 | 246 | * ``"passthrough"`` 247 | * one of the standard strings in sensor_msgs/image_encodings.h 248 | :param header: A std_msgs.msg.Header message 249 | 250 | :rtype: A sensor_msgs.msg.Image message 251 | :raises CvBridgeError: when the ``cvim`` has a type that is incompatible with ``encoding`` 252 | 253 | If encoding is ``"passthrough"``, then the message has the same encoding as the image's 254 | OpenCV type. Otherwise desired_encoding must be one of the standard image encodings 255 | 256 | This function returns a sensor_msgs::Image message on success, 257 | or raises :exc:`cv_bridge.CvBridgeError` on failure. 
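A minimal usage sketch of cv2_to_imgmsg and imgmsg_to_cv2 as documented above (illustrative only, not part of the repository; the frame_id is a hypothetical value):

import numpy as np
from std_msgs.msg import Header
from cv_bridge import CvBridge

bridge = CvBridge()
bgr = np.zeros((480, 640, 3), dtype=np.uint8)
header = Header(frame_id='camera_optical_frame')  # hypothetical frame id
# The explicit encoding must be compatible with the array's 8UC3 type.
msg = bridge.cv2_to_imgmsg(bgr, encoding='bgr8', header=header)
# Requesting 'mono8' converts via cvtColor2; a single-channel result has a 2-D shape.
gray = bridge.imgmsg_to_cv2(msg, desired_encoding='mono8')
assert msg.encoding == 'bgr8' and gray.ndim == 2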
258 | """ 259 | import numpy as np 260 | if not isinstance(cvim, (np.ndarray, np.generic)): 261 | raise TypeError('Your input type is not a numpy array') 262 | img_msg = sensor_msgs.msg.Image() 263 | img_msg.height = cvim.shape[0] 264 | img_msg.width = cvim.shape[1] 265 | if header is not None: 266 | img_msg.header = header 267 | if len(cvim.shape) < 3: 268 | cv_type = self.dtype_with_channels_to_cvtype2(cvim.dtype, 1) 269 | else: 270 | cv_type = self.dtype_with_channels_to_cvtype2(cvim.dtype, cvim.shape[2]) 271 | if encoding == 'passthrough': 272 | img_msg.encoding = cv_type 273 | else: 274 | img_msg.encoding = encoding 275 | # Verify that the supplied encoding is compatible with the type of the OpenCV image 276 | if self.cvtype_to_name[self.encoding_to_cvtype2(encoding)] != cv_type: 277 | raise CvBridgeError('encoding specified as %s, but image has incompatible type %s' 278 | % (encoding, cv_type)) 279 | if cvim.dtype.byteorder == '>': 280 | img_msg.is_bigendian = True 281 | img_msg.data.frombytes(cvim.tobytes()) 282 | img_msg.step = len(img_msg.data) // img_msg.height 283 | 284 | return img_msg 285 | -------------------------------------------------------------------------------- /cv_bridge/src/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | # add library 2 | add_library(${PROJECT_NAME} 3 | cv_bridge.cpp 4 | cv_mat_sensor_msgs_image_type_adapter.cpp 5 | rgb_colors.cpp 6 | ) 7 | include(GenerateExportHeader) 8 | generate_export_header(${PROJECT_NAME} EXPORT_FILE_NAME ${PROJECT_NAME}/${PROJECT_NAME}_export.h) 9 | target_include_directories(${PROJECT_NAME} PUBLIC 10 | "$" 11 | "$" 12 | "$") 13 | target_link_libraries(${PROJECT_NAME} PUBLIC 14 | ${sensor_msgs_TARGETS} 15 | opencv_core 16 | opencv_imgproc 17 | opencv_imgcodecs 18 | rclcpp::rclcpp 19 | ) 20 | target_link_libraries(${PROJECT_NAME} PRIVATE 21 | Boost::headers 22 | rcpputils::rcpputils) 23 | 24 | install(TARGETS ${PROJECT_NAME} EXPORT export_${PROJECT_NAME} 25 | ARCHIVE DESTINATION lib 26 | LIBRARY DESTINATION lib 27 | RUNTIME DESTINATION bin 28 | ) 29 | 30 | install(FILES 31 | ${CMAKE_CURRENT_BINARY_DIR}/${PROJECT_NAME}/${PROJECT_NAME}_export.h 32 | DESTINATION include/${PROJECT_NAME}/${PROJECT_NAME}) 33 | 34 | if(NOT CV_BRIDGE_DISABLE_PYTHON) 35 | Python3_add_library(${PROJECT_NAME}_boost MODULE module.cpp module_opencv4.cpp) 36 | target_link_libraries(${PROJECT_NAME}_boost PRIVATE 37 | ${PROJECT_NAME} 38 | ${boost_python_target} 39 | Python3::NumPy) 40 | target_compile_definitions(${PROJECT_NAME}_boost PRIVATE PYTHON3) 41 | 42 | if(OpenCV_VERSION_MAJOR VERSION_EQUAL 4) 43 | target_compile_definitions(${PROJECT_NAME}_boost PRIVATE OPENCV_VERSION_4) 44 | endif() 45 | 46 | set_target_properties(${PROJECT_NAME}_boost PROPERTIES 47 | LIBRARY_OUTPUT_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}/boost/ 48 | PREFIX "" 49 | ) 50 | install(TARGETS ${PROJECT_NAME}_boost DESTINATION ${PYTHON_INSTALL_DIR}/${PROJECT_NAME}/boost/) 51 | endif() 52 | -------------------------------------------------------------------------------- /cv_bridge/src/cv_mat_sensor_msgs_image_type_adapter.cpp: -------------------------------------------------------------------------------- 1 | // Copyright 2021 Open Source Robotics Foundation, Inc. 2 | // 3 | // Licensed under the Apache License, Version 2.0 (the "License"); 4 | // you may not use this file except in compliance with the License. 
5 | // You may obtain a copy of the License at 6 | // 7 | // http://www.apache.org/licenses/LICENSE-2.0 8 | // 9 | // Unless required by applicable law or agreed to in writing, software 10 | // distributed under the License is distributed on an "AS IS" BASIS, 11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | // See the License for the specific language governing permissions and 13 | // limitations under the License. 14 | 15 | #include 16 | #include 17 | #include 18 | #include 19 | #include 20 | 21 | #include "opencv2/core/mat.hpp" 22 | 23 | #include "sensor_msgs/msg/image.hpp" 24 | #include "std_msgs/msg/header.hpp" 25 | 26 | #include "cv_bridge/cv_mat_sensor_msgs_image_type_adapter.hpp" 27 | 28 | namespace cv_bridge 29 | { 30 | 31 | namespace 32 | { 33 | int 34 | encoding2mat_type(const std::string & encoding) 35 | { 36 | if (encoding == "mono8") { 37 | return CV_8UC1; 38 | } else if (encoding == "bgr8") { 39 | return CV_8UC3; 40 | } else if (encoding == "mono16") { 41 | return CV_16SC1; 42 | } else if (encoding == "rgba8") { 43 | return CV_8UC4; 44 | } else if (encoding == "bgra8") { 45 | return CV_8UC4; 46 | } else if (encoding == "32FC1") { 47 | return CV_32FC1; 48 | } else if (encoding == "rgb8") { 49 | return CV_8UC3; 50 | } else if (encoding == "yuv422") { 51 | return CV_8UC2; 52 | } else { 53 | throw std::runtime_error("Unsupported encoding type"); 54 | } 55 | } 56 | 57 | template 58 | struct NotNull 59 | { 60 | NotNull(const T * pointer_in, const char * msg) 61 | : pointer(pointer_in) 62 | { 63 | if (pointer == nullptr) { 64 | throw std::invalid_argument(msg); 65 | } 66 | } 67 | 68 | const T * pointer; 69 | }; 70 | 71 | } // namespace 72 | 73 | ROSCvMatContainer::ROSCvMatContainer( 74 | std::unique_ptr unique_sensor_msgs_image) 75 | : header_(NotNull( 76 | unique_sensor_msgs_image.get(), 77 | "unique_sensor_msgs_image cannot be nullptr" 78 | ).pointer->header), 79 | frame_( 80 | unique_sensor_msgs_image->height, 81 | unique_sensor_msgs_image->width, 82 | encoding2mat_type(unique_sensor_msgs_image->encoding), 83 | unique_sensor_msgs_image->data.data(), 84 | unique_sensor_msgs_image->step), 85 | storage_(std::move(unique_sensor_msgs_image)) 86 | {} 87 | 88 | ROSCvMatContainer::ROSCvMatContainer( 89 | std::shared_ptr shared_sensor_msgs_image) 90 | : header_(shared_sensor_msgs_image->header), 91 | frame_( 92 | shared_sensor_msgs_image->height, 93 | shared_sensor_msgs_image->width, 94 | encoding2mat_type(shared_sensor_msgs_image->encoding), 95 | shared_sensor_msgs_image->data.data(), 96 | shared_sensor_msgs_image->step), 97 | storage_(shared_sensor_msgs_image) 98 | {} 99 | 100 | ROSCvMatContainer::ROSCvMatContainer( 101 | const cv::Mat & mat_frame, 102 | const std_msgs::msg::Header & header, 103 | bool is_bigendian, 104 | std::optional encoding_override) 105 | : header_(header), 106 | frame_(mat_frame), 107 | storage_(nullptr), 108 | is_bigendian_(is_bigendian), 109 | encoding_override_(encoding_override) 110 | {} 111 | 112 | ROSCvMatContainer::ROSCvMatContainer( 113 | cv::Mat && mat_frame, 114 | const std_msgs::msg::Header & header, 115 | bool is_bigendian, 116 | std::optional encoding_override) 117 | : header_(header), 118 | frame_(std::forward(mat_frame)), 119 | storage_(nullptr), 120 | is_bigendian_(is_bigendian), 121 | encoding_override_(encoding_override) 122 | {} 123 | 124 | ROSCvMatContainer::ROSCvMatContainer( 125 | const sensor_msgs::msg::Image & sensor_msgs_image) 126 | : ROSCvMatContainer(std::make_unique(sensor_msgs_image)) 127 | {} 128 
| 129 | bool 130 | ROSCvMatContainer::is_owning() const 131 | { 132 | return std::holds_alternative(storage_); 133 | } 134 | 135 | const cv::Mat & 136 | ROSCvMatContainer::cv_mat() const 137 | { 138 | return frame_; 139 | } 140 | 141 | cv::Mat 142 | ROSCvMatContainer::cv_mat() 143 | { 144 | return frame_; 145 | } 146 | 147 | const std_msgs::msg::Header & 148 | ROSCvMatContainer::header() const 149 | { 150 | return header_; 151 | } 152 | 153 | std_msgs::msg::Header & 154 | ROSCvMatContainer::header() 155 | { 156 | return header_; 157 | } 158 | 159 | std::shared_ptr 160 | ROSCvMatContainer::get_sensor_msgs_msg_image_pointer() const 161 | { 162 | if (!std::holds_alternative>(storage_)) { 163 | return nullptr; 164 | } 165 | return std::get>(storage_); 166 | } 167 | 168 | std::unique_ptr 169 | ROSCvMatContainer::get_sensor_msgs_msg_image_pointer_copy() const 170 | { 171 | auto unique_image = std::make_unique(); 172 | this->get_sensor_msgs_msg_image_copy(*unique_image); 173 | return unique_image; 174 | } 175 | 176 | void ROSCvMatContainer::get_sensor_msgs_msg_image_copy( 177 | sensor_msgs::msg::Image& sensor_msgs_image) const 178 | { 179 | fill_sensor_msgs_image_from_cvmat(frame_, sensor_msgs_image, header_, encoding_override_); 180 | } 181 | 182 | bool 183 | ROSCvMatContainer::is_bigendian() const 184 | { 185 | return is_bigendian_; 186 | } 187 | 188 | std::optional 189 | ROSCvMatContainer::encoding_override() const 190 | { 191 | return encoding_override_; 192 | } 193 | 194 | } // namespace cv_bridge 195 | -------------------------------------------------------------------------------- /cv_bridge/src/module.cpp: -------------------------------------------------------------------------------- 1 | /********************************************************************* 2 | * Software License Agreement (BSD License) 3 | * 4 | * Copyright (c) 2012, Willow Garage, Inc. 5 | * Copyright (c) 2018 Intel Corporation. 6 | * All rights reserved. 7 | * 8 | * Redistribution and use in source and binary forms, with or without 9 | * modification, are permitted provided that the following conditions 10 | * are met: 11 | * 12 | * * Redistributions of source code must retain the above copyright 13 | * notice, this list of conditions and the following disclaimer. 14 | * * Redistributions in binary form must reproduce the above 15 | * copyright notice, this list of conditions and the following 16 | * disclaimer in the documentation and/or other materials provided 17 | * with the distribution. 18 | * * Neither the name of the Willow Garage nor the names of its 19 | * contributors may be used to endorse or promote products derived 20 | * from this software without specific prior written permission. 21 | * 22 | * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS 23 | * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT 24 | * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS 25 | * FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE 26 | * COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, 27 | * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, 28 | * BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; 29 | * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 30 | * CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT 31 | * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN 32 | * ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE 33 | * POSSIBILITY OF SUCH DAMAGE. 34 | *********************************************************************/ 35 | 36 | #include "module.hpp" 37 | #include 38 | 39 | PyObject * mod_opencv; 40 | 41 | bp::object 42 | cvtColor2Wrap(bp::object obj_in, const std::string & encoding_in, const std::string & encoding_out) 43 | { 44 | // Convert the Python input to an image 45 | cv::Mat mat_in; 46 | convert_to_CvMat2(obj_in.ptr(), mat_in); 47 | 48 | // Call cv_bridge for color conversion 49 | cv_bridge::CvImagePtr cv_image(new cv_bridge::CvImage( 50 | std_msgs::msg::Header(), encoding_in, mat_in)); 51 | 52 | cv::Mat mat = cv_bridge::cvtColor(cv_image, encoding_out)->image; 53 | 54 | return bp::object(boost::python::handle<>(pyopencv_from(mat))); 55 | } 56 | 57 | bp::object 58 | cvtColorForDisplayWrap( 59 | bp::object obj_in, 60 | const std::string & encoding_in, 61 | const std::string & encoding_out, 62 | bool do_dynamic_scaling = false, 63 | double min_image_value = 0.0, 64 | double max_image_value = 0.0, 65 | int colormap = -1) 66 | { 67 | // Convert the Python input to an image 68 | cv::Mat mat_in; 69 | convert_to_CvMat2(obj_in.ptr(), mat_in); 70 | 71 | cv_bridge::CvImagePtr cv_image(new cv_bridge::CvImage( 72 | std_msgs::msg::Header(), encoding_in, mat_in)); 73 | 74 | cv_bridge::CvtColorForDisplayOptions options; 75 | options.do_dynamic_scaling = do_dynamic_scaling; 76 | options.min_image_value = min_image_value; 77 | options.max_image_value = max_image_value; 78 | options.colormap = colormap; 79 | cv::Mat mat = cv_bridge::cvtColorForDisplay(/*source=*/ cv_image, 80 | /*encoding_out=*/ encoding_out, 81 | /*options=*/ options)->image; 82 | 83 | return bp::object(boost::python::handle<>(pyopencv_from(mat))); 84 | } 85 | 86 | BOOST_PYTHON_FUNCTION_OVERLOADS(cvtColorForDisplayWrap_overloads, cvtColorForDisplayWrap, 3, 7) 87 | 88 | int CV_MAT_CNWrap(int i) 89 | { 90 | return CV_MAT_CN(i); 91 | } 92 | 93 | int CV_MAT_DEPTHWrap(int i) 94 | { 95 | return CV_MAT_DEPTH(i); 96 | } 97 | 98 | BOOST_PYTHON_MODULE(cv_bridge_boost) 99 | { 100 | do_numpy_import(); 101 | mod_opencv = PyImport_ImportModule("cv2"); 102 | 103 | // Wrap the function to get encodings as OpenCV types 104 | boost::python::def("getCvType", cv_bridge::getCvType); 105 | boost::python::def("cvtColor2", cvtColor2Wrap); 106 | boost::python::def("CV_MAT_CNWrap", CV_MAT_CNWrap); 107 | boost::python::def("CV_MAT_DEPTHWrap", CV_MAT_DEPTHWrap); 108 | boost::python::def("cvtColorForDisplay", cvtColorForDisplayWrap, 109 | cvtColorForDisplayWrap_overloads( 110 | boost::python::args("source", "encoding_in", "encoding_out", "do_dynamic_scaling", 111 | "min_image_value", "max_image_value", "colormap"), 112 | "Convert image to display with specified encodings.\n\n" 113 | "Args:\n" 114 | " - source (numpy.ndarray): input image\n" 115 | " - encoding_in (str): input image encoding\n" 116 | " - encoding_out (str): encoding to which the image conveted\n" 117 | " - do_dynamic_scaling (bool): flag to do dynamic scaling with min/max 
value\n" 118 | " - min_image_value (float): minimum pixel value for dynamic scaling\n" 119 | " - max_image_value (float): maximum pixel value for dynamic scaling\n" 120 | " - colormap (int): colormap to use when converting to color image\n" 121 | )); 122 | } 123 | -------------------------------------------------------------------------------- /cv_bridge/src/module.hpp: -------------------------------------------------------------------------------- 1 | // Copyright 2014 Open Source Robotics Foundation, Inc. 2 | // 3 | // Licensed under the Apache License, Version 2.0 (the "License"); 4 | // you may not use this file except in compliance with the License. 5 | // You may obtain a copy of the License at 6 | // 7 | // http://www.apache.org/licenses/LICENSE-2.0 8 | // 9 | // Unless required by applicable law or agreed to in writing, software 10 | // distributed under the License is distributed on an "AS IS" BASIS, 11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | // See the License for the specific language governing permissions and 13 | // limitations under the License. 14 | // 15 | 16 | #ifndef CV_BRIDGE_MODULE_HPP_ 17 | #define CV_BRIDGE_MODULE_HPP_ 18 | 19 | #include 20 | 21 | // Have to define macros to silence warnings about deprecated headers being used by 22 | // boost/python.hpp in some versions of boost. 23 | // See: https://github.com/ros-perception/vision_opencv/issues/449 24 | #include 25 | #if (BOOST_VERSION / 100 >= 1073 && BOOST_VERSION / 100 <= 1076) // Boost 1.73 - 1.76 26 | #define BOOST_BIND_GLOBAL_PLACEHOLDERS 27 | #endif 28 | #if (BOOST_VERSION / 100 == 1074) // Boost 1.74 29 | #define BOOST_ALLOW_DEPRECATED_HEADERS 30 | #endif 31 | #include 32 | 33 | #include 34 | #include 35 | 36 | #define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION 37 | #include 38 | 39 | #include 40 | 41 | namespace bp = boost::python; 42 | 43 | int convert_to_CvMat2(const PyObject * o, cv::Mat & m); 44 | 45 | PyObject * pyopencv_from(const cv::Mat & m); 46 | 47 | static void * do_numpy_import() 48 | { 49 | import_array(); 50 | return nullptr; 51 | } 52 | 53 | #endif // CV_BRIDGE_MODULE_HPP_ 54 | -------------------------------------------------------------------------------- /cv_bridge/src/module_opencv4.cpp: -------------------------------------------------------------------------------- 1 | /* Taken from opencv/modules/python/src2/cv2.cpp */ 2 | 3 | #include "module.hpp" 4 | 5 | #include "opencv2/core/types_c.h" 6 | 7 | #include "opencv2/opencv_modules.hpp" 8 | 9 | #include "pycompat.hpp" 10 | 11 | static PyObject * opencv_error = 0; 12 | 13 | static int failmsg(const char * fmt, ...) 
14 | { 15 | char str[1000]; 16 | 17 | va_list ap; 18 | va_start(ap, fmt); 19 | vsnprintf(str, sizeof(str), fmt, ap); 20 | va_end(ap); 21 | 22 | PyErr_SetString(PyExc_TypeError, str); 23 | return 0; 24 | } 25 | 26 | struct ArgInfo 27 | { 28 | const char * name; 29 | bool outputarg; 30 | // more fields may be added if necessary 31 | 32 | ArgInfo(const char * name_, bool outputarg_) 33 | : name(name_), 34 | outputarg(outputarg_) {} 35 | 36 | // to match with older pyopencv_to function signature 37 | operator const char *() const {return name;} 38 | }; 39 | 40 | class PyAllowThreads 41 | { 42 | public: 43 | PyAllowThreads() 44 | : _state(PyEval_SaveThread()) {} 45 | ~PyAllowThreads() 46 | { 47 | PyEval_RestoreThread(_state); 48 | } 49 | 50 | private: 51 | PyThreadState * _state; 52 | }; 53 | 54 | class PyEnsureGIL 55 | { 56 | public: 57 | PyEnsureGIL() 58 | : _state(PyGILState_Ensure()) {} 59 | ~PyEnsureGIL() 60 | { 61 | PyGILState_Release(_state); 62 | } 63 | 64 | private: 65 | PyGILState_STATE _state; 66 | }; 67 | 68 | #define ERRWRAP2(expr) \ 69 | try \ 70 | { \ 71 | PyAllowThreads allowThreads; \ 72 | expr; \ 73 | } \ 74 | catch (const cv::Exception & e) \ 75 | { \ 76 | PyErr_SetString(opencv_error, e.what()); \ 77 | return 0; \ 78 | } 79 | 80 | using namespace cv; 81 | 82 | 83 | [[gnu::unused]] static PyObject * failmsgp(const char * fmt, ...) 84 | { 85 | char str[1000]; 86 | 87 | va_list ap; 88 | va_start(ap, fmt); 89 | vsnprintf(str, sizeof(str), fmt, ap); 90 | va_end(ap); 91 | 92 | PyErr_SetString(PyExc_TypeError, str); 93 | return 0; 94 | } 95 | 96 | class NumpyAllocator : public MatAllocator 97 | { 98 | public: 99 | NumpyAllocator() {stdAllocator = Mat::getStdAllocator();} 100 | ~NumpyAllocator() {} 101 | 102 | // To compile openCV3 with OpenCV4 APIs. 103 | #ifndef OPENCV_VERSION_4 104 | #define AccessFlag int 105 | #endif 106 | 107 | UMatData * allocate(PyObject * o, int dims, const int * sizes, int type, size_t * step) const 108 | { 109 | UMatData * u = new UMatData(this); 110 | u->data = u->origdata = 111 | reinterpret_cast(PyArray_DATA(reinterpret_cast(o))); 112 | npy_intp * _strides = PyArray_STRIDES(reinterpret_cast(o)); 113 | for (int i = 0; i < dims - 1; i++) { 114 | step[i] = (size_t)_strides[i]; 115 | } 116 | step[dims - 1] = CV_ELEM_SIZE(type); 117 | u->size = sizes[0] * step[0]; 118 | u->userdata = o; 119 | return u; 120 | } 121 | 122 | UMatData * allocate( 123 | int dims0, const int * sizes, int type, void * data, size_t * step, AccessFlag flags, 124 | UMatUsageFlags usageFlags) const 125 | { 126 | if (data != 0) { 127 | CV_Error(Error::StsAssert, "The data should normally be NULL!"); 128 | // probably this is safe to do in such extreme case 129 | return stdAllocator->allocate(dims0, sizes, type, data, step, flags, usageFlags); 130 | } 131 | PyEnsureGIL gil; 132 | 133 | int depth = CV_MAT_DEPTH(type); 134 | int cn = CV_MAT_CN(type); 135 | const int f = static_cast(sizeof(size_t) / 8); 136 | int typenum = depth == CV_8U ? NPY_UBYTE : depth == CV_8S ? NPY_BYTE : 137 | depth == CV_16U ? NPY_USHORT : depth == CV_16S ? NPY_SHORT : 138 | depth == CV_32S ? NPY_INT : depth == CV_32F ? NPY_FLOAT : 139 | depth == CV_64F ? 
NPY_DOUBLE : f * NPY_ULONGLONG + (f ^ 1) * NPY_UINT; 140 | int i, dims = dims0; 141 | cv::AutoBuffer _sizes(dims + 1); 142 | for (i = 0; i < dims; i++) { 143 | _sizes[i] = sizes[i]; 144 | } 145 | if (cn > 1) { 146 | _sizes[dims++] = cn; 147 | } 148 | PyObject * o = PyArray_SimpleNew(dims, _sizes, typenum); 149 | if (!o) { 150 | CV_Error_(Error::StsError, 151 | ("The numpy array of typenum=%d, ndims=%d can not be created", typenum, dims)); 152 | } 153 | return allocate(o, dims0, sizes, type, step); 154 | } 155 | 156 | bool allocate(UMatData * u, AccessFlag accessFlags, UMatUsageFlags usageFlags) const 157 | { 158 | return stdAllocator->allocate(u, accessFlags, usageFlags); 159 | } 160 | 161 | void deallocate(UMatData * u) const 162 | { 163 | if (u) { 164 | PyEnsureGIL gil; 165 | PyObject * o = reinterpret_cast(u->userdata); 166 | Py_XDECREF(o); 167 | delete u; 168 | } 169 | } 170 | 171 | const MatAllocator * stdAllocator; 172 | }; 173 | 174 | NumpyAllocator g_numpyAllocator; 175 | 176 | 177 | template 178 | static 179 | bool pyopencv_to(PyObject * obj, T & p, const char * name = ""); 180 | 181 | template 182 | static 183 | PyObject * pyopencv_from(const T & src); 184 | 185 | enum { ARG_NONE = 0, ARG_MAT = 1, ARG_SCALAR = 2 }; 186 | 187 | // special case, when the convertor needs full ArgInfo structure 188 | static bool pyopencv_to(PyObject * o, Mat & m, const ArgInfo info) 189 | { 190 | // to avoid PyArray_Check() to crash even with valid array 191 | do_numpy_import(); 192 | 193 | 194 | bool allowND = true; 195 | if (!o || o == Py_None) { 196 | if (!m.data) { 197 | m.allocator = &g_numpyAllocator; 198 | } 199 | return true; 200 | } 201 | 202 | if (PyInt_Check(o) ) { 203 | double v[] = {static_cast(PyInt_AsLong(reinterpret_cast(o))), 0., 0., 0.}; 204 | m = Mat(4, 1, CV_64F, v).clone(); 205 | return true; 206 | } 207 | if (PyFloat_Check(o) ) { 208 | double v[] = {PyFloat_AsDouble(reinterpret_cast(o)), 0., 0., 0.}; 209 | m = Mat(4, 1, CV_64F, v).clone(); 210 | return true; 211 | } 212 | if (PyTuple_Check(o) ) { 213 | int i, sz = static_cast(PyTuple_Size(reinterpret_cast(o))); 214 | m = Mat(sz, 1, CV_64F); 215 | for (i = 0; i < sz; i++) { 216 | PyObject * oi = PyTuple_GET_ITEM(o, i); 217 | if (PyInt_Check(oi) ) { 218 | m.at(i) = static_cast(PyInt_AsLong(oi)); 219 | } else if (PyFloat_Check(oi) ) { 220 | m.at(i) = static_cast(PyFloat_AsDouble(oi)); 221 | } else { 222 | failmsg("%s is not a numerical tuple", info.name); 223 | m.release(); 224 | return false; 225 | } 226 | } 227 | return true; 228 | } 229 | 230 | if (!PyArray_Check(o) ) { 231 | failmsg("%s is not a numpy array, neither a scalar", info.name); 232 | return false; 233 | } 234 | 235 | PyArrayObject * oarr = reinterpret_cast(o); 236 | 237 | bool needcopy = false, needcast = false; 238 | int typenum = PyArray_TYPE(oarr), new_typenum = typenum; 239 | int type = typenum == NPY_UBYTE ? CV_8U : 240 | typenum == NPY_BYTE ? CV_8S : 241 | typenum == NPY_USHORT ? CV_16U : 242 | typenum == NPY_SHORT ? CV_16S : 243 | typenum == NPY_INT ? CV_32S : 244 | typenum == NPY_INT32 ? CV_32S : 245 | typenum == NPY_FLOAT ? CV_32F : 246 | typenum == NPY_DOUBLE ? 
CV_64F : -1; 247 | 248 | if (type < 0) { 249 | if (typenum == NPY_INT64 || typenum == NPY_UINT64 || type == NPY_LONG) { 250 | needcopy = needcast = true; 251 | new_typenum = NPY_INT; 252 | type = CV_32S; 253 | } else { 254 | failmsg("%s data type = %d is not supported", info.name, typenum); 255 | return false; 256 | } 257 | } 258 | 259 | #ifndef CV_MAX_DIM 260 | const int CV_MAX_DIM = 32; 261 | #endif 262 | 263 | int ndims = PyArray_NDIM(oarr); 264 | if (ndims >= CV_MAX_DIM) { 265 | failmsg("%s dimensionality (=%d) is too high", info.name, ndims); 266 | return false; 267 | } 268 | 269 | int size[CV_MAX_DIM + 1]; 270 | size_t step[CV_MAX_DIM + 1]; 271 | size_t elemsize = CV_ELEM_SIZE1(type); 272 | const npy_intp * _sizes = PyArray_DIMS(oarr); 273 | const npy_intp * _strides = PyArray_STRIDES(oarr); 274 | bool ismultichannel = ndims == 3 && _sizes[2] <= CV_CN_MAX; 275 | 276 | for (int i = ndims - 1; i >= 0 && !needcopy; i--) { 277 | // these checks handle cases of 278 | // a) multi-dimensional (ndims > 2) arrays, as well as simpler 1- and 2-dimensional cases 279 | // b) transposed arrays, where _strides[] elements go in non-descending order 280 | // c) flipped arrays, where some of _strides[] elements are negative 281 | if ( (i == ndims - 1 && (size_t)_strides[i] != elemsize) || 282 | (i < ndims - 1 && _strides[i] < _strides[i + 1]) ) 283 | { 284 | needcopy = true; 285 | } 286 | } 287 | 288 | if (ismultichannel && _strides[1] != (npy_intp)elemsize * _sizes[2]) { 289 | needcopy = true; 290 | } 291 | 292 | if (needcopy) { 293 | if (info.outputarg) { 294 | failmsg( 295 | "Layout of the output array %s is incompatible with \ 296 | cv::Mat (step[ndims-1] != elemsize or step[1] != elemsize*nchannels)", 297 | info.name); 298 | return false; 299 | } 300 | 301 | if (needcast) { 302 | o = PyArray_Cast(oarr, new_typenum); 303 | oarr = reinterpret_cast(o); 304 | } else { 305 | oarr = PyArray_GETCONTIGUOUS(oarr); 306 | o = reinterpret_cast(oarr); 307 | } 308 | 309 | _strides = PyArray_STRIDES(oarr); 310 | } 311 | 312 | for (int i = 0; i < ndims; i++) { 313 | size[i] = static_cast(_sizes[i]); 314 | step[i] = (size_t)_strides[i]; 315 | } 316 | 317 | // handle degenerate case 318 | if (ndims == 0) { 319 | size[ndims] = 1; 320 | step[ndims] = elemsize; 321 | ndims++; 322 | } 323 | 324 | if (ismultichannel) { 325 | ndims--; 326 | type |= CV_MAKETYPE(0, size[2]); 327 | } 328 | 329 | if (ndims > 2 && !allowND) { 330 | failmsg("%s has more than 2 dimensions", info.name); 331 | return false; 332 | } 333 | 334 | m = Mat(ndims, size, type, PyArray_DATA(oarr), step); 335 | m.u = g_numpyAllocator.allocate(o, ndims, size, type, step); 336 | m.addref(); 337 | 338 | if (!needcopy) { 339 | Py_INCREF(o); 340 | } 341 | m.allocator = &g_numpyAllocator; 342 | 343 | return true; 344 | } 345 | 346 | template<> 347 | bool pyopencv_to(PyObject * o, Mat & m, const char * name) 348 | { 349 | return pyopencv_to(o, m, ArgInfo(name, 0)); 350 | } 351 | 352 | PyObject * pyopencv_from(const Mat & m) 353 | { 354 | if (!m.data) { 355 | Py_RETURN_NONE; 356 | } 357 | Mat temp, * p = const_cast(&m); 358 | if (!p->u || p->allocator != &g_numpyAllocator) { 359 | temp.allocator = &g_numpyAllocator; 360 | ERRWRAP2(m.copyTo(temp)); 361 | p = &temp; 362 | } 363 | PyObject * o = reinterpret_cast(p->u->userdata); 364 | Py_INCREF(o); 365 | return o; 366 | } 367 | 368 | int convert_to_CvMat2(const PyObject * o, cv::Mat & m) 369 | { 370 | pyopencv_to(const_cast(o), m, "unknown"); 371 | return 0; 372 | } 373 | 
-------------------------------------------------------------------------------- /cv_bridge/src/pycompat.hpp: -------------------------------------------------------------------------------- 1 | /*M/////////////////////////////////////////////////////////////////////////////////////// 2 | // 3 | // IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING. 4 | // 5 | // By downloading, copying, installing or using the software you agree to this license. 6 | // If you do not agree to this license, do not download, install, 7 | // copy or use the software. 8 | // 9 | // 10 | // License Agreement 11 | // For Open Source Computer Vision Library 12 | // 13 | // Copyright (C) 2000-2008, Intel Corporation, all rights reserved. 14 | // Copyright (C) 2009-2011, Willow Garage Inc., all rights reserved. 15 | // Copyright (c) 2018 Intel Corporation. 16 | // Third party copyrights are property of their respective owners. 17 | // 18 | // Redistribution and use in source and binary forms, with or without modification, 19 | // are permitted provided that the following conditions are met: 20 | // 21 | // * Redistribution's of source code must retain the above copyright notice, 22 | // this list of conditions and the following disclaimer. 23 | // 24 | // * Redistribution's in binary form must reproduce the above copyright notice, 25 | // this list of conditions and the following disclaimer in the documentation 26 | // and/or other materials provided with the distribution. 27 | // 28 | // * The name of the copyright holders may not be used to endorse or promote products 29 | // derived from this software without specific prior written permission. 30 | // 31 | // This software is provided by the copyright holders and contributors "as is" and 32 | // any express or implied warranties, including, but not limited to, the implied 33 | // warranties of merchantability and fitness for a particular purpose are disclaimed. 34 | // In no event shall the Intel Corporation or contributors be liable for any direct, 35 | // indirect, incidental, special, exemplary, or consequential damages 36 | // (including, but not limited to, procurement of substitute goods or services; 37 | // loss of use, data, or profits; or business interruption) however caused 38 | // and on any theory of liability, whether in contract, strict liability, 39 | // or tort (including negligence or otherwise) arising in any way out of 40 | // the use of this software, even if advised of the possibility of such damage. 41 | // 42 | //M*/ 43 | 44 | // Defines for Python 2/3 compatibility. 45 | #ifndef PYCOMPAT_HPP_ 46 | #define PYCOMPAT_HPP_ 47 | 48 | #if PY_MAJOR_VERSION >= 3 49 | // Python3 treats all ints as longs, PyInt_X functions have been removed. 50 | #define PyInt_Check PyLong_Check 51 | #define PyInt_CheckExact PyLong_CheckExact 52 | #define PyInt_AsLong PyLong_AsLong 53 | #define PyInt_AS_LONG PyLong_AS_LONG 54 | #define PyInt_FromLong PyLong_FromLong 55 | #define PyNumber_Int PyNumber_Long 56 | 57 | // Python3 strings are unicode, these defines mimic the Python2 functionality. 
58 | #define PyString_Check PyUnicode_Check 59 | #define PyString_FromString PyUnicode_FromString 60 | #define PyString_FromStringAndSize PyUnicode_FromStringAndSize 61 | #define PyString_Size PyUnicode_GET_SIZE 62 | 63 | // PyUnicode_AsUTF8 isn't available until Python 3.3 64 | #if (PY_VERSION_HEX < 0x03030000) 65 | #define PyString_AsString _PyUnicode_AsString 66 | #else 67 | #define PyString_AsString PyUnicode_AsUTF8 68 | #endif 69 | #endif 70 | 71 | #endif // PYCOMPAT_HPP_ 72 | -------------------------------------------------------------------------------- /cv_bridge/src/rgb_colors.cpp: -------------------------------------------------------------------------------- 1 | /********************************************************************* 2 | * Original color definition is at scikit-image distributed with 3 | * following license disclaimer: 4 | * 5 | * Copyright (C) 2011, the scikit-image team 6 | * Copyright (c) 2018 Intel Corporation. 7 | * All rights reserved. 8 | * 9 | * Redistribution and use in source and binary forms, with or without 10 | * modification, are permitted provided that the following conditions are 11 | * met: 12 | * 13 | * 1. Redistributions of source code must retain the above copyright 14 | * notice, this list of conditions and the following disclaimer. 15 | * 2. Redistributions in binary form must reproduce the above copyright 16 | * notice, this list of conditions and the following disclaimer in 17 | * the documentation and/or other materials provided with the 18 | * distribution. 19 | * 3. Neither the name of skimage nor the names of its contributors may be 20 | * used to endorse or promote products derived from this software without 21 | * specific prior written permission. 22 | * 23 | * THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR 24 | * IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED 25 | * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 26 | * DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, 27 | * INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES 28 | * (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 29 | * SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) 30 | * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, 31 | * STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING 32 | * IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE 33 | * POSSIBILITY OF SUCH DAMAGE. 
34 | *********************************************************************/ 35 | 36 | #include "cv_bridge/rgb_colors.hpp" 37 | #include 38 | 39 | 40 | namespace cv_bridge 41 | { 42 | 43 | namespace rgb_colors 44 | { 45 | 46 | cv::Vec3d getRGBColor(const int color) 47 | { 48 | cv::Vec3d c; 49 | switch (color % 146) { 50 | case ALICEBLUE: c = cv::Vec3d(0.941, 0.973, 1); break; 51 | case ANTIQUEWHITE: c = cv::Vec3d(0.98, 0.922, 0.843); break; 52 | case AQUA: c = cv::Vec3d(0, 1, 1); break; 53 | case AQUAMARINE: c = cv::Vec3d(0.498, 1, 0.831); break; 54 | case AZURE: c = cv::Vec3d(0.941, 1, 1); break; 55 | case BEIGE: c = cv::Vec3d(0.961, 0.961, 0.863); break; 56 | case BISQUE: c = cv::Vec3d(1, 0.894, 0.769); break; 57 | case BLACK: c = cv::Vec3d(0, 0, 0); break; 58 | case BLANCHEDALMOND: c = cv::Vec3d(1, 0.922, 0.804); break; 59 | case BLUE: c = cv::Vec3d(0, 0, 1); break; 60 | case BLUEVIOLET: c = cv::Vec3d(0.541, 0.169, 0.886); break; 61 | case BROWN: c = cv::Vec3d(0.647, 0.165, 0.165); break; 62 | case BURLYWOOD: c = cv::Vec3d(0.871, 0.722, 0.529); break; 63 | case CADETBLUE: c = cv::Vec3d(0.373, 0.62, 0.627); break; 64 | case CHARTREUSE: c = cv::Vec3d(0.498, 1, 0); break; 65 | case CHOCOLATE: c = cv::Vec3d(0.824, 0.412, 0.118); break; 66 | case CORAL: c = cv::Vec3d(1, 0.498, 0.314); break; 67 | case CORNFLOWERBLUE: c = cv::Vec3d(0.392, 0.584, 0.929); break; 68 | case CORNSILK: c = cv::Vec3d(1, 0.973, 0.863); break; 69 | case CRIMSON: c = cv::Vec3d(0.863, 0.0784, 0.235); break; 70 | case CYAN: c = cv::Vec3d(0, 1, 1); break; 71 | case DARKBLUE: c = cv::Vec3d(0, 0, 0.545); break; 72 | case DARKCYAN: c = cv::Vec3d(0, 0.545, 0.545); break; 73 | case DARKGOLDENROD: c = cv::Vec3d(0.722, 0.525, 0.0431); break; 74 | case DARKGRAY: c = cv::Vec3d(0.663, 0.663, 0.663); break; 75 | case DARKGREEN: c = cv::Vec3d(0, 0.392, 0); break; 76 | case DARKGREY: c = cv::Vec3d(0.663, 0.663, 0.663); break; 77 | case DARKKHAKI: c = cv::Vec3d(0.741, 0.718, 0.42); break; 78 | case DARKMAGENTA: c = cv::Vec3d(0.545, 0, 0.545); break; 79 | case DARKOLIVEGREEN: c = cv::Vec3d(0.333, 0.42, 0.184); break; 80 | case DARKORANGE: c = cv::Vec3d(1, 0.549, 0); break; 81 | case DARKORCHID: c = cv::Vec3d(0.6, 0.196, 0.8); break; 82 | case DARKRED: c = cv::Vec3d(0.545, 0, 0); break; 83 | case DARKSALMON: c = cv::Vec3d(0.914, 0.588, 0.478); break; 84 | case DARKSEAGREEN: c = cv::Vec3d(0.561, 0.737, 0.561); break; 85 | case DARKSLATEBLUE: c = cv::Vec3d(0.282, 0.239, 0.545); break; 86 | case DARKSLATEGRAY: c = cv::Vec3d(0.184, 0.31, 0.31); break; 87 | case DARKSLATEGREY: c = cv::Vec3d(0.184, 0.31, 0.31); break; 88 | case DARKTURQUOISE: c = cv::Vec3d(0, 0.808, 0.82); break; 89 | case DARKVIOLET: c = cv::Vec3d(0.58, 0, 0.827); break; 90 | case DEEPPINK: c = cv::Vec3d(1, 0.0784, 0.576); break; 91 | case DEEPSKYBLUE: c = cv::Vec3d(0, 0.749, 1); break; 92 | case DIMGRAY: c = cv::Vec3d(0.412, 0.412, 0.412); break; 93 | case DIMGREY: c = cv::Vec3d(0.412, 0.412, 0.412); break; 94 | case DODGERBLUE: c = cv::Vec3d(0.118, 0.565, 1); break; 95 | case FIREBRICK: c = cv::Vec3d(0.698, 0.133, 0.133); break; 96 | case FLORALWHITE: c = cv::Vec3d(1, 0.98, 0.941); break; 97 | case FORESTGREEN: c = cv::Vec3d(0.133, 0.545, 0.133); break; 98 | case FUCHSIA: c = cv::Vec3d(1, 0, 1); break; 99 | case GAINSBORO: c = cv::Vec3d(0.863, 0.863, 0.863); break; 100 | case GHOSTWHITE: c = cv::Vec3d(0.973, 0.973, 1); break; 101 | case GOLD: c = cv::Vec3d(1, 0.843, 0); break; 102 | case GOLDENROD: c = cv::Vec3d(0.855, 0.647, 0.125); break; 103 | case GRAY: c = 
cv::Vec3d(0.502, 0.502, 0.502); break; 104 | case GREEN: c = cv::Vec3d(0, 0.502, 0); break; 105 | case GREENYELLOW: c = cv::Vec3d(0.678, 1, 0.184); break; 106 | case GREY: c = cv::Vec3d(0.502, 0.502, 0.502); break; 107 | case HONEYDEW: c = cv::Vec3d(0.941, 1, 0.941); break; 108 | case HOTPINK: c = cv::Vec3d(1, 0.412, 0.706); break; 109 | case INDIANRED: c = cv::Vec3d(0.804, 0.361, 0.361); break; 110 | case INDIGO: c = cv::Vec3d(0.294, 0, 0.51); break; 111 | case IVORY: c = cv::Vec3d(1, 1, 0.941); break; 112 | case KHAKI: c = cv::Vec3d(0.941, 0.902, 0.549); break; 113 | case LAVENDER: c = cv::Vec3d(0.902, 0.902, 0.98); break; 114 | case LAVENDERBLUSH: c = cv::Vec3d(1, 0.941, 0.961); break; 115 | case LAWNGREEN: c = cv::Vec3d(0.486, 0.988, 0); break; 116 | case LEMONCHIFFON: c = cv::Vec3d(1, 0.98, 0.804); break; 117 | case LIGHTBLUE: c = cv::Vec3d(0.678, 0.847, 0.902); break; 118 | case LIGHTCORAL: c = cv::Vec3d(0.941, 0.502, 0.502); break; 119 | case LIGHTCYAN: c = cv::Vec3d(0.878, 1, 1); break; 120 | case LIGHTGOLDENRODYELLOW: c = cv::Vec3d(0.98, 0.98, 0.824); break; 121 | case LIGHTGRAY: c = cv::Vec3d(0.827, 0.827, 0.827); break; 122 | case LIGHTGREEN: c = cv::Vec3d(0.565, 0.933, 0.565); break; 123 | case LIGHTGREY: c = cv::Vec3d(0.827, 0.827, 0.827); break; 124 | case LIGHTPINK: c = cv::Vec3d(1, 0.714, 0.757); break; 125 | case LIGHTSALMON: c = cv::Vec3d(1, 0.627, 0.478); break; 126 | case LIGHTSEAGREEN: c = cv::Vec3d(0.125, 0.698, 0.667); break; 127 | case LIGHTSKYBLUE: c = cv::Vec3d(0.529, 0.808, 0.98); break; 128 | case LIGHTSLATEGRAY: c = cv::Vec3d(0.467, 0.533, 0.6); break; 129 | case LIGHTSLATEGREY: c = cv::Vec3d(0.467, 0.533, 0.6); break; 130 | case LIGHTSTEELBLUE: c = cv::Vec3d(0.69, 0.769, 0.871); break; 131 | case LIGHTYELLOW: c = cv::Vec3d(1, 1, 0.878); break; 132 | case LIME: c = cv::Vec3d(0, 1, 0); break; 133 | case LIMEGREEN: c = cv::Vec3d(0.196, 0.804, 0.196); break; 134 | case LINEN: c = cv::Vec3d(0.98, 0.941, 0.902); break; 135 | case MAGENTA: c = cv::Vec3d(1, 0, 1); break; 136 | case MAROON: c = cv::Vec3d(0.502, 0, 0); break; 137 | case MEDIUMAQUAMARINE: c = cv::Vec3d(0.4, 0.804, 0.667); break; 138 | case MEDIUMBLUE: c = cv::Vec3d(0, 0, 0.804); break; 139 | case MEDIUMORCHID: c = cv::Vec3d(0.729, 0.333, 0.827); break; 140 | case MEDIUMPURPLE: c = cv::Vec3d(0.576, 0.439, 0.859); break; 141 | case MEDIUMSEAGREEN: c = cv::Vec3d(0.235, 0.702, 0.443); break; 142 | case MEDIUMSLATEBLUE: c = cv::Vec3d(0.482, 0.408, 0.933); break; 143 | case MEDIUMSPRINGGREEN: c = cv::Vec3d(0, 0.98, 0.604); break; 144 | case MEDIUMTURQUOISE: c = cv::Vec3d(0.282, 0.82, 0.8); break; 145 | case MEDIUMVIOLETRED: c = cv::Vec3d(0.78, 0.0824, 0.522); break; 146 | case MIDNIGHTBLUE: c = cv::Vec3d(0.098, 0.098, 0.439); break; 147 | case MINTCREAM: c = cv::Vec3d(0.961, 1, 0.98); break; 148 | case MISTYROSE: c = cv::Vec3d(1, 0.894, 0.882); break; 149 | case MOCCASIN: c = cv::Vec3d(1, 0.894, 0.71); break; 150 | case NAVAJOWHITE: c = cv::Vec3d(1, 0.871, 0.678); break; 151 | case NAVY: c = cv::Vec3d(0, 0, 0.502); break; 152 | case OLDLACE: c = cv::Vec3d(0.992, 0.961, 0.902); break; 153 | case OLIVE: c = cv::Vec3d(0.502, 0.502, 0); break; 154 | case OLIVEDRAB: c = cv::Vec3d(0.42, 0.557, 0.137); break; 155 | case ORANGE: c = cv::Vec3d(1, 0.647, 0); break; 156 | case ORANGERED: c = cv::Vec3d(1, 0.271, 0); break; 157 | case ORCHID: c = cv::Vec3d(0.855, 0.439, 0.839); break; 158 | case PALEGOLDENROD: c = cv::Vec3d(0.933, 0.91, 0.667); break; 159 | case PALEGREEN: c = cv::Vec3d(0.596, 0.984, 0.596); break; 160 | 
case PALEVIOLETRED: c = cv::Vec3d(0.686, 0.933, 0.933); break; 161 | case PAPAYAWHIP: c = cv::Vec3d(1, 0.937, 0.835); break; 162 | case PEACHPUFF: c = cv::Vec3d(1, 0.855, 0.725); break; 163 | case PERU: c = cv::Vec3d(0.804, 0.522, 0.247); break; 164 | case PINK: c = cv::Vec3d(1, 0.753, 0.796); break; 165 | case PLUM: c = cv::Vec3d(0.867, 0.627, 0.867); break; 166 | case POWDERBLUE: c = cv::Vec3d(0.69, 0.878, 0.902); break; 167 | case PURPLE: c = cv::Vec3d(0.502, 0, 0.502); break; 168 | case RED: c = cv::Vec3d(1, 0, 0); break; 169 | case ROSYBROWN: c = cv::Vec3d(0.737, 0.561, 0.561); break; 170 | case ROYALBLUE: c = cv::Vec3d(0.255, 0.412, 0.882); break; 171 | case SADDLEBROWN: c = cv::Vec3d(0.545, 0.271, 0.0745); break; 172 | case SALMON: c = cv::Vec3d(0.98, 0.502, 0.447); break; 173 | case SANDYBROWN: c = cv::Vec3d(0.98, 0.643, 0.376); break; 174 | case SEAGREEN: c = cv::Vec3d(0.18, 0.545, 0.341); break; 175 | case SEASHELL: c = cv::Vec3d(1, 0.961, 0.933); break; 176 | case SIENNA: c = cv::Vec3d(0.627, 0.322, 0.176); break; 177 | case SILVER: c = cv::Vec3d(0.753, 0.753, 0.753); break; 178 | case SKYBLUE: c = cv::Vec3d(0.529, 0.808, 0.922); break; 179 | case SLATEBLUE: c = cv::Vec3d(0.416, 0.353, 0.804); break; 180 | case SLATEGRAY: c = cv::Vec3d(0.439, 0.502, 0.565); break; 181 | case SLATEGREY: c = cv::Vec3d(0.439, 0.502, 0.565); break; 182 | case SNOW: c = cv::Vec3d(1, 0.98, 0.98); break; 183 | case SPRINGGREEN: c = cv::Vec3d(0, 1, 0.498); break; 184 | case STEELBLUE: c = cv::Vec3d(0.275, 0.51, 0.706); break; 185 | case TAN: c = cv::Vec3d(0.824, 0.706, 0.549); break; 186 | case TEAL: c = cv::Vec3d(0, 0.502, 0.502); break; 187 | case THISTLE: c = cv::Vec3d(0.847, 0.749, 0.847); break; 188 | case TOMATO: c = cv::Vec3d(1, 0.388, 0.278); break; 189 | case TURQUOISE: c = cv::Vec3d(0.251, 0.878, 0.816); break; 190 | case VIOLET: c = cv::Vec3d(0.933, 0.51, 0.933); break; 191 | case WHEAT: c = cv::Vec3d(0.961, 0.871, 0.702); break; 192 | case WHITE: c = cv::Vec3d(1, 1, 1); break; 193 | case WHITESMOKE: c = cv::Vec3d(0.961, 0.961, 0.961); break; 194 | case YELLOW: c = cv::Vec3d(1, 1, 0); break; 195 | case YELLOWGREEN: c = cv::Vec3d(0.604, 0.804, 0.196); break; 196 | } // switch 197 | return c; 198 | } 199 | 200 | } // namespace rgb_colors 201 | 202 | } // namespace cv_bridge 203 | -------------------------------------------------------------------------------- /cv_bridge/test/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | # Add all the unit tests for cv_bridge 2 | find_package(ament_lint_auto REQUIRED) 3 | ament_lint_auto_find_test_dependencies() 4 | 5 | # skip the conversions.py test when necessary, because its 6 | # test_encode_decode_cv2_compressed and test_encode_decode_cv2 7 | # spends more than 5 minutes generally 8 | if(${SKIP_PYCONVERSION_TEST}) 9 | set(SKIP_TEST "SKIP_TEST") 10 | else() 11 | set(SKIP_TEST "") 12 | endif() 13 | 14 | # enable cv_bridge C++ tests 15 | find_package(ament_cmake_gtest REQUIRED) 16 | ament_add_gtest(${PROJECT_NAME}-utest 17 | test_endian.cpp 18 | test_compression.cpp 19 | utest.cpp utest2.cpp 20 | test_rgb_colors.cpp 21 | test_dynamic_scaling.cpp 22 | APPEND_LIBRARY_DIRS "${cv_bridge_lib_dir}") 23 | target_link_libraries(${PROJECT_NAME}-utest 24 | ${PROJECT_NAME} 25 | Boost::headers 26 | opencv_core 27 | opencv_imgcodecs 28 | ${sensor_msgs_TARGETS}) 29 | 30 | # enable cv_bridge python tests 31 | find_package(ament_cmake_pytest REQUIRED) 32 | ament_add_pytest_test(enumerants.py "enumerants.py") 33 | 
ament_add_pytest_test(conversions.py "conversions.py" TIMEOUT 600 ${SKIP_TEST}) 34 | ament_add_pytest_test(python_bindings.py "python_bindings.py") 35 | -------------------------------------------------------------------------------- /cv_bridge/test/conversions.py: -------------------------------------------------------------------------------- 1 | from cv_bridge import CvBridge, CvBridgeError 2 | import numpy as np 3 | import unittest 4 | 5 | class TestConversions(unittest.TestCase): 6 | 7 | def test_mono16_cv2(self): 8 | br = CvBridge() 9 | im = np.uint8(np.random.randint(0, 255, size=(480, 640, 3))) 10 | self.assertRaises(CvBridgeError, lambda: br.imgmsg_to_cv2(br.cv2_to_imgmsg(im), 'mono16')) 11 | br.imgmsg_to_cv2(br.cv2_to_imgmsg(im, 'rgb8'), 'mono16') 12 | 13 | def test_encode_decode_cv2(self): 14 | import cv2 15 | fmts = [cv2.CV_8U, cv2.CV_8S, cv2.CV_16U, cv2.CV_16S, cv2.CV_32S, cv2.CV_32F, cv2.CV_64F] 16 | 17 | cvb_en = CvBridge() 18 | cvb_de = CvBridge() 19 | 20 | for w in range(100, 800, 100): 21 | for h in range(100, 800, 100): 22 | for f in fmts: 23 | for channels in ([], 1, 2, 3, 4, 5): 24 | if channels == []: 25 | original = np.uint8(np.random.randint(0, 255, size=(h, w))) 26 | else: 27 | original = np.uint8(np.random.randint(0, 255, size=(h, w, channels))) 28 | rosmsg = cvb_en.cv2_to_imgmsg(original) 29 | newimg = cvb_de.imgmsg_to_cv2(rosmsg) 30 | 31 | self.assertTrue(original.dtype == newimg.dtype) 32 | if channels == 1: 33 | # in that case, a gray image has a shape of size 2 34 | self.assertTrue(original.shape[:2] == newimg.shape[:2]) 35 | else: 36 | self.assertTrue(original.shape == newimg.shape) 37 | self.assertTrue(len(original.tostring()) == len(newimg.tostring())) 38 | 39 | # From: 40 | # http://docs.opencv.org/2.4/modules/highgui/doc/reading_and_writing_images_and_video.html#Mat 41 | # imread(const string& filename, int flags) 42 | def test_encode_decode_cv2_compressed(self): 43 | # FIXME: remove jp2(a.k.a JPEG2000) as its JASPER codec is disabled within Ubuntu opencv library 44 | # due to security issues, but it works once you rebuild your opencv library with JASPER enabled 45 | formats = ['jpg', 'jpeg', 'jpe', 'png', 'bmp', 'dib', 46 | 'sr', 'ras', 'tif', 'tiff'] # this formats rviz is not support 47 | 48 | cvb_en = CvBridge() 49 | cvb_de = CvBridge() 50 | 51 | for w in range(100, 800, 100): 52 | for h in range(100, 800, 100): 53 | for f in formats: 54 | for channels in ([], 1, 3): 55 | if channels == []: 56 | original = np.uint8(np.random.randint(0, 255, size=(h, w))) 57 | else: 58 | original = np.uint8(np.random.randint(0, 255, size=(h, w, channels))) 59 | 60 | compress_rosmsg = cvb_en.cv2_to_compressed_imgmsg(original, f) 61 | newimg = cvb_de.compressed_imgmsg_to_cv2(compress_rosmsg) 62 | self.assertTrue(original.dtype == newimg.dtype) 63 | if channels == 1: 64 | # in that case, a gray image has a shape of size 2 65 | self.assertTrue(original.shape[:2] == newimg.shape[:2]) 66 | else: 67 | self.assertTrue(original.shape == newimg.shape) 68 | self.assertTrue(len(original.tostring()) == len(newimg.tostring())) 69 | 70 | def test_endianness(self): 71 | br = CvBridge() 72 | dtype = np.dtype('int32') 73 | # Set to big endian. 
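        # numpy's newbyteorder('>') returns a copy of the dtype with big-endian byte
        # order; cv2_to_imgmsg is expected to mirror that in the message's is_bigendian flag.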
74 | dtype = dtype.newbyteorder('>') 75 | img = np.random.randint(0, 255, size=(30, 40)) 76 | msg = br.cv2_to_imgmsg(img.astype(dtype)) 77 | self.assertTrue(msg.is_bigendian) 78 | self.assertTrue((br.imgmsg_to_cv2(msg) == img).all()) 79 | 80 | 81 | if __name__ == '__main__': 82 | 83 | suite = unittest.TestSuite() 84 | suite.addTest(TestConversions('test_mono16_cv2')) 85 | suite.addTest(TestConversions('test_encode_decode_cv2')) 86 | suite.addTest(TestConversions('test_encode_decode_cv2_compressed')) 87 | suite.addTest(TestConversions('test_endianness')) 88 | unittest.TextTestRunner(verbosity=2).run(suite) 89 | -------------------------------------------------------------------------------- /cv_bridge/test/enumerants.py: -------------------------------------------------------------------------------- 1 | import unittest 2 | 3 | import cv2 4 | from cv_bridge import CvBridge, CvBridgeError, getCvType 5 | import sensor_msgs.msg 6 | 7 | 8 | class TestEnumerants(unittest.TestCase): 9 | 10 | def test_enumerants_cv2(self): 11 | img_msg = sensor_msgs.msg.Image() 12 | img_msg.width = 640 13 | img_msg.height = 480 14 | img_msg.encoding = 'rgba8' 15 | img_msg.step = 640 * 4 16 | img_msg.data = ((640 * 480) * '1234').encode() 17 | 18 | bridge_ = CvBridge() 19 | cvim = bridge_.imgmsg_to_cv2(img_msg, 'rgb8') 20 | 21 | import sys 22 | self.assertTrue(sys.getrefcount(cvim) == 2) 23 | 24 | # A 3 channel image cannot be sent as an rgba8 25 | self.assertRaises(CvBridgeError, lambda: bridge_.cv2_to_imgmsg(cvim, 'rgba8')) 26 | 27 | # but it can be sent as rgb8 and bgr8 28 | bridge_.cv2_to_imgmsg(cvim, 'rgb8') 29 | bridge_.cv2_to_imgmsg(cvim, 'bgr8') 30 | 31 | self.assertFalse(getCvType('32FC4') == cv2.CV_8UC4) 32 | self.assertTrue(getCvType('8UC1') == cv2.CV_8UC1) 33 | self.assertTrue(getCvType('8U') == cv2.CV_8UC1) 34 | 35 | def test_numpy_types(self): 36 | bridge_ = CvBridge() 37 | self.assertRaises(TypeError, lambda: bridge_.cv2_to_imgmsg(1, 'rgba8')) 38 | if hasattr(cv2, 'cv'): 39 | self.assertRaises(TypeError, lambda: bridge_.cv2_to_imgmsg(cv2.cv(), 'rgba8')) 40 | 41 | 42 | if __name__ == '__main__': 43 | 44 | suite = unittest.TestSuite() 45 | suite.addTest(TestEnumerants('test_enumerants_cv2')) 46 | suite.addTest(TestEnumerants('test_numpy_types')) 47 | unittest.TextTestRunner(verbosity=2).run(suite) 48 | -------------------------------------------------------------------------------- /cv_bridge/test/python_bindings.py: -------------------------------------------------------------------------------- 1 | import cv_bridge 2 | import numpy as np 3 | 4 | 5 | def test_cvtColorForDisplay(): 6 | # convert label image to display 7 | label = np.zeros((480, 640), dtype=np.int32) 8 | height, width = label.shape[:2] 9 | label_value = 0 10 | grid_num_y, grid_num_x = 3, 4 11 | for grid_row in range(grid_num_y): 12 | grid_size_y = height // grid_num_y 13 | min_y = grid_size_y * grid_row 14 | max_y = min_y + grid_size_y 15 | for grid_col in range(grid_num_x): 16 | grid_size_x = width // grid_num_x 17 | min_x = grid_size_x * grid_col 18 | max_x = min_x + grid_size_x 19 | label[min_y:max_y, min_x:max_x] = label_value 20 | label_value += 1 21 | label_viz = cv_bridge.cvtColorForDisplay(label, '32SC1', 'bgr8') 22 | assert label_viz.dtype == np.uint8 23 | assert label_viz.min() == 0 24 | assert label_viz.max() == 255 25 | 26 | # Check that mono8 conversion returns the right shape. 
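    # mono8 is a single-channel 8-bit encoding, so the round trip below should
    # come back as a 2-D (H, W) array rather than (H, W, 1).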
27 | bridge = cv_bridge.CvBridge() 28 | mono = np.random.random((100, 100)) * 255 29 | mono = mono.astype(np.uint8) 30 | 31 | input_msg = bridge.cv2_to_imgmsg(mono, encoding='mono8') 32 | output = bridge.imgmsg_to_cv2(input_msg, desired_encoding='mono8') 33 | assert output.shape == (100, 100) 34 | -------------------------------------------------------------------------------- /cv_bridge/test/test_compression.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | 5 | #include 6 | 7 | TEST(CvBridgeTest, compression) 8 | { 9 | cv::RNG rng(0); 10 | std_msgs::msg::Header header; 11 | 12 | // Test 3 channel images. 13 | for (int i = 0; i < 2; ++i) { 14 | const std::string format = (i == 0) ? "bgr8" : "rgb8"; 15 | cv::Mat_ in(10, 10); 16 | rng.fill(in, cv::RNG::UNIFORM, 0, 256); 17 | 18 | sensor_msgs::msg::CompressedImage::SharedPtr msg = 19 | cv_bridge::CvImage(header, format, in).toCompressedImageMsg(cv_bridge::PNG); 20 | const cv_bridge::CvImageConstPtr out = cv_bridge::toCvCopy(msg, format); 21 | 22 | EXPECT_EQ(out->image.channels(), 3); 23 | EXPECT_EQ(cv::norm(out->image, in), 0); 24 | } 25 | 26 | // Test 4 channel images. 27 | for (int i = 0; i < 2; ++i) { 28 | const std::string format = (i == 0) ? "bgra8" : "rgba8"; 29 | cv::Mat_ in(10, 10); 30 | rng.fill(in, cv::RNG::UNIFORM, 0, 256); 31 | 32 | sensor_msgs::msg::CompressedImage::SharedPtr msg = 33 | cv_bridge::CvImage(header, format, in).toCompressedImageMsg(cv_bridge::PNG); 34 | const cv_bridge::CvImageConstPtr out = cv_bridge::toCvCopy(msg, format); 35 | EXPECT_EQ(out->image.channels(), 4); 36 | EXPECT_EQ(cv::norm(out->image, in), 0); 37 | } 38 | } 39 | -------------------------------------------------------------------------------- /cv_bridge/test/test_dynamic_scaling.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include 5 | #include 6 | 7 | TEST(TestDynamicScaling, ignoreInfAndNanValues) 8 | { 9 | float inf = std::numeric_limits::infinity(); 10 | float nan = std::numeric_limits::quiet_NaN(); 11 | std::vector data{50, 100, 150, -inf, inf, nan}; 12 | std::string encoding = sensor_msgs::image_encodings::TYPE_32FC1; 13 | 14 | sensor_msgs::msg::Image msg; 15 | msg.height = 1; 16 | msg.width = data.size(); 17 | msg.encoding = encoding; 18 | msg.step = data.size() * 4; 19 | for (auto d : data) { 20 | uint8_t * p = reinterpret_cast(&d); 21 | for (std::size_t i = 0; i != sizeof(float); ++i) { 22 | msg.data.push_back(p[i]); 23 | } 24 | } 25 | 26 | cv_bridge::CvtColorForDisplayOptions options; 27 | options.do_dynamic_scaling = true; 28 | 29 | cv_bridge::CvImageConstPtr img = cv_bridge::toCvCopy(msg); 30 | auto converted = cv_bridge::cvtColorForDisplay(img, "", options); 31 | 32 | // Check that the scaling works for non-inf and non-nan values. 
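  // With dynamic scaling the finite inputs 50/100/150 are stretched across the output
  // range (0/128/255 in each bgr8 channel), while the -inf, inf and NaN pixels come out as 0.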
33 | std::vector expected = {0, 0, 0, 128, 128, 128, 255, 255, 255, 0, 0, 0, 0, 0, 0, 0, 0, 0}; 34 | for (unsigned i = 0; i < expected.size(); ++i) 35 | { 36 | EXPECT_EQ(converted->image.at(i), expected.at(i)); 37 | } 38 | } 39 | -------------------------------------------------------------------------------- /cv_bridge/test/test_endian.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include 5 | 6 | TEST(CvBridgeTest, endianness) 7 | { 8 | using namespace boost::endian; 9 | 10 | // Create an image of the type opposite to the platform 11 | sensor_msgs::msg::Image msg; 12 | msg.height = 1; 13 | msg.width = 1; 14 | msg.encoding = "32SC2"; 15 | msg.step = 8; 16 | 17 | msg.data.resize(msg.step); 18 | int32_t * data = reinterpret_cast(&msg.data[0]); 19 | 20 | // Write 1 and 2 in order, but with an endianness opposite to the platform 21 | if (order::native == order::little) { 22 | msg.is_bigendian = true; 23 | *(data++) = native_to_big(static_cast(1)); 24 | *data = native_to_big(static_cast(2)); 25 | } else { 26 | msg.is_bigendian = false; 27 | *(data++) = native_to_little(static_cast(1)); 28 | *data = native_to_little(static_cast(2)); 29 | } 30 | 31 | // Make sure the values are still the same 32 | cv_bridge::CvImageConstPtr img = 33 | cv_bridge::toCvShare(std::make_shared(msg)); 34 | EXPECT_EQ(img->image.at(0, 0)[0], 1); 35 | EXPECT_EQ(img->image.at(0, 0)[1], 2); 36 | // Make sure we cannot share data 37 | EXPECT_NE(img->image.data, &msg.data[0]); 38 | } 39 | -------------------------------------------------------------------------------- /cv_bridge/test/test_rgb_colors.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | 4 | #include "cv_bridge/rgb_colors.hpp" 5 | 6 | TEST(RGBColors, testGetRGBColor) 7 | { 8 | cv::Vec3d color; 9 | // red 10 | color = cv_bridge::rgb_colors::getRGBColor(cv_bridge::rgb_colors::RED); 11 | EXPECT_EQ(1, color[0]); 12 | EXPECT_EQ(0, color[1]); 13 | EXPECT_EQ(0, color[2]); 14 | // gray 15 | color = cv_bridge::rgb_colors::getRGBColor(cv_bridge::rgb_colors::GRAY); 16 | EXPECT_EQ(0.502, color[0]); 17 | EXPECT_EQ(0.502, color[1]); 18 | EXPECT_EQ(0.502, color[2]); 19 | } 20 | -------------------------------------------------------------------------------- /cv_bridge/test/utest.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | 4 | #include "cv_bridge/cv_bridge.hpp" 5 | 6 | // Tests conversion of non-continuous cv::Mat. 
#5206 7 | TEST(CvBridgeTest, NonContinuous) 8 | { 9 | cv::Mat full = cv::Mat::eye(8, 8, CV_16U); 10 | cv::Mat partial = full.colRange(2, 5); 11 | 12 | cv_bridge::CvImage cvi; 13 | cvi.encoding = sensor_msgs::image_encodings::MONO16; 14 | cvi.image = partial; 15 | 16 | sensor_msgs::msg::Image::SharedPtr msg = cvi.toImageMsg(); 17 | EXPECT_EQ(static_cast(msg->height), 8); 18 | EXPECT_EQ(static_cast(msg->width), 3); 19 | EXPECT_EQ(msg->encoding, cvi.encoding); 20 | EXPECT_EQ(static_cast(msg->step), 6); 21 | } 22 | 23 | TEST(CvBridgeTest, ChannelOrder) 24 | { 25 | cv::Mat_ mat(200, 200); 26 | mat.setTo(cv::Scalar(1000, 0, 0, 0)); 27 | sensor_msgs::msg::Image::SharedPtr image(new sensor_msgs::msg::Image()); 28 | 29 | image = cv_bridge::CvImage(image->header, sensor_msgs::image_encodings::MONO16, mat).toImageMsg(); 30 | 31 | cv_bridge::CvImageConstPtr cv_ptr = cv_bridge::toCvShare(image); 32 | 33 | cv_bridge::CvImagePtr res = cv_bridge::cvtColor(cv_ptr, sensor_msgs::image_encodings::BGR8); 34 | EXPECT_EQ(res->encoding, sensor_msgs::image_encodings::BGR8); 35 | EXPECT_EQ(res->image.type(), cv_bridge::getCvType(res->encoding)); 36 | EXPECT_EQ(res->image.channels(), sensor_msgs::image_encodings::numChannels(res->encoding)); 37 | EXPECT_EQ(res->image.depth(), CV_8U); 38 | 39 | // The matrix should be the following 40 | cv::Mat_ gt(200, 200); 41 | gt.setTo(cv::Scalar(1, 1, 1) * 1000. * 255. / 65535.); 42 | 43 | ASSERT_EQ(res->image.type(), gt.type()); 44 | EXPECT_EQ(cv::norm(res->image, gt, cv::NORM_INF), 0); 45 | } 46 | 47 | TEST(CvBridgeTest, initialization) 48 | { 49 | sensor_msgs::msg::Image image; 50 | cv_bridge::CvImagePtr cv_ptr; 51 | 52 | image.encoding = "bgr8"; 53 | image.height = 200; 54 | image.width = 200; 55 | 56 | try { 57 | cv_ptr = cv_bridge::toCvCopy(image, "mono8"); 58 | // Before the fix, it would never get here, as it would segfault 59 | EXPECT_EQ(1, 0); 60 | } catch (cv_bridge::Exception & e) { 61 | EXPECT_EQ(1, 1); 62 | } 63 | 64 | // Check some normal images with different ratios 65 | for (int height = 100; height <= 300; ++height) { 66 | image.encoding = sensor_msgs::image_encodings::RGB8; 67 | image.step = image.width * 3; 68 | image.data.resize(image.height * image.step); 69 | cv_ptr = cv_bridge::toCvCopy(image, "mono8"); 70 | } 71 | } 72 | 73 | TEST(CvBridgeTest, imageMessageStep) 74 | { 75 | // Test 1: image step is padded 76 | sensor_msgs::msg::Image image; 77 | cv_bridge::CvImagePtr cv_ptr; 78 | 79 | image.encoding = "mono8"; 80 | image.height = 220; 81 | image.width = 200; 82 | image.is_bigendian = false; 83 | image.step = 208; 84 | 85 | image.data.resize(image.height * image.step); 86 | 87 | ASSERT_NO_THROW(cv_ptr = cv_bridge::toCvCopy(image, "mono8")); 88 | ASSERT_EQ(220, cv_ptr->image.rows); 89 | ASSERT_EQ(200, cv_ptr->image.cols); 90 | // OpenCV copyTo argument removes the stride 91 | ASSERT_EQ(200, static_cast(cv_ptr->image.step[0])); 92 | 93 | // Test 2: image step is invalid 94 | image.step = 199; 95 | 96 | ASSERT_THROW(cv_ptr = cv_bridge::toCvCopy(image, "mono8"), cv_bridge::Exception); 97 | 98 | // Test 3: image step == image.width * element size * number of channels 99 | image.step = 200; 100 | image.data.resize(image.height * image.step); 101 | 102 | ASSERT_NO_THROW(cv_ptr = cv_bridge::toCvCopy(image, "mono8")); 103 | ASSERT_EQ(220, cv_ptr->image.rows); 104 | ASSERT_EQ(200, cv_ptr->image.cols); 105 | ASSERT_EQ(200, static_cast(cv_ptr->image.step[0])); 106 | } 107 | 108 | TEST(CvBridgeTest, imageMessageConversion) 109 | { 110 | sensor_msgs::msg::Image imgmsg; 
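  // For each encoding below, step is width * (bit depth / 8) * channels, and
  // toCvCopy is expected to carry that step through to cv::Mat::step[0].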
111 | cv_bridge::CvImagePtr cv_ptr; 112 | imgmsg.height = 220; 113 | imgmsg.width = 200; 114 | imgmsg.is_bigendian = false; 115 | 116 | // image with data type float32 and 1 channels 117 | imgmsg.encoding = "32FC1"; 118 | imgmsg.step = imgmsg.width * 32 / 8 * 1; 119 | imgmsg.data.resize(imgmsg.height * imgmsg.step); 120 | ASSERT_NO_THROW(cv_ptr = cv_bridge::toCvCopy(imgmsg, imgmsg.encoding)); 121 | ASSERT_EQ(static_cast(imgmsg.height), cv_ptr->image.rows); 122 | ASSERT_EQ(static_cast(imgmsg.width), cv_ptr->image.cols); 123 | ASSERT_EQ(1, cv_ptr->image.channels()); 124 | ASSERT_EQ(imgmsg.step, cv_ptr->image.step[0]); 125 | 126 | // image with data type float32 and 10 channels 127 | imgmsg.encoding = "32FC10"; 128 | imgmsg.step = imgmsg.width * 32 / 8 * 10; 129 | imgmsg.data.resize(imgmsg.height * imgmsg.step); 130 | ASSERT_NO_THROW(cv_ptr = cv_bridge::toCvCopy(imgmsg, imgmsg.encoding)); 131 | ASSERT_EQ(static_cast(imgmsg.height), cv_ptr->image.rows); 132 | ASSERT_EQ(static_cast(imgmsg.width), cv_ptr->image.cols); 133 | ASSERT_EQ(10, cv_ptr->image.channels()); 134 | ASSERT_EQ(imgmsg.step, cv_ptr->image.step[0]); 135 | } 136 | 137 | int main(int argc, char ** argv) 138 | { 139 | testing::InitGoogleTest(&argc, argv); 140 | return RUN_ALL_TESTS(); 141 | } 142 | -------------------------------------------------------------------------------- /cv_bridge/test/utest2.cpp: -------------------------------------------------------------------------------- 1 | /********************************************************************* 2 | * Software License Agreement (BSD License) 3 | * 4 | * Copyright (c) 2009, Willow Garage, Inc. 5 | * Copyright (c) 2018 Intel Corporation. 6 | * All rights reserved. 7 | * 8 | * Redistribution and use in source and binary forms, with or without 9 | * modification, are permitted provided that the following conditions 10 | * are met: 11 | * 12 | * * Redistributions of source code must retain the above copyright 13 | * notice, this list of conditions and the following disclaimer. 14 | * * Redistributions in binary form must reproduce the above 15 | * copyright notice, this list of conditions and the following 16 | * disclaimer in the documentation and/or other materials provided 17 | * with the distribution. 18 | * * Neither the name of the Willow Garage nor the names of its 19 | * contributors may be used to endorse or promote products derived 20 | * from this software without specific prior written permission. 21 | * 22 | * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS 23 | * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT 24 | * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS 25 | * FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE 26 | * COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, 27 | * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, 28 | * BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; 29 | * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 30 | * CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT 31 | * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN 32 | * ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE 33 | * POSSIBILITY OF SUCH DAMAGE. 
34 | *********************************************************************/ 35 | 36 | #include 37 | #include 38 | #include 39 | 40 | #include 41 | #include 42 | 43 | #include "cv_bridge/cv_bridge.hpp" 44 | #include "opencv2/core/core.hpp" 45 | 46 | using namespace sensor_msgs::image_encodings; 47 | 48 | bool isUnsigned(const std::string & encoding) 49 | { 50 | return encoding == RGB8 || encoding == RGBA8 || encoding == RGB16 || encoding == RGBA16 || 51 | encoding == BGR8 || encoding == BGRA8 || encoding == BGR16 || encoding == BGRA16 || 52 | encoding == MONO8 || encoding == MONO16 || 53 | encoding == MONO8 || encoding == MONO16 || encoding == TYPE_8UC1 || 54 | encoding == TYPE_8UC2 || encoding == TYPE_8UC3 || encoding == TYPE_8UC4 || 55 | encoding == TYPE_16UC1 || encoding == TYPE_16UC2 || encoding == TYPE_16UC3 || 56 | encoding == TYPE_16UC4; 57 | // BAYER_RGGB8, BAYER_BGGR8, BAYER_GBRG8, BAYER_GRBG8, BAYER_RGGB16, 58 | // BAYER_BGGR16, BAYER_GBRG16, BAYER_GRBG16, YUV422 59 | } 60 | std::vector 61 | getEncodings() 62 | { 63 | // TODO(N/A) for Groovy, the following types should be uncommented 64 | std::string encodings[] = {RGB8, RGBA8, RGB16, RGBA16, BGR8, BGRA8, BGR16, BGRA16, MONO8, MONO16, 65 | TYPE_8UC1, /*TYPE_8UC2,*/ TYPE_8UC3, TYPE_8UC4, 66 | TYPE_8SC1, /*TYPE_8SC2,*/ TYPE_8SC3, TYPE_8SC4, 67 | TYPE_16UC1, /*TYPE_16UC2,*/ TYPE_16UC3, TYPE_16UC4, 68 | TYPE_16SC1, /*TYPE_16SC2,*/ TYPE_16SC3, TYPE_16SC4, 69 | TYPE_32SC1, /*TYPE_32SC2,*/ TYPE_32SC3, TYPE_32SC4, 70 | TYPE_32FC1, /*TYPE_32FC2,*/ TYPE_32FC3, TYPE_32FC4, 71 | TYPE_64FC1, /*TYPE_64FC2,*/ TYPE_64FC3, TYPE_64FC4, 72 | // BAYER_RGGB8, BAYER_BGGR8, BAYER_GBRG8, BAYER_GRBG8, 73 | // BAYER_RGGB16, BAYER_BGGR16, BAYER_GBRG16, BAYER_GRBG16, 74 | YUV422, YUV422_YUY2}; 75 | return std::vector(encodings, encodings + 48 - 8 - 7); 76 | } 77 | 78 | TEST(OpencvTests, testCase_encode_decode) 79 | { 80 | std::vector encodings = getEncodings(); 81 | for (size_t i = 0; i < encodings.size(); ++i) { 82 | std::string src_encoding = encodings[i]; 83 | bool is_src_color_format = isColor(src_encoding) || isMono(src_encoding) || 84 | (src_encoding == sensor_msgs::image_encodings::YUV422) || 85 | (src_encoding == sensor_msgs::image_encodings::YUV422_YUY2); 86 | cv::Mat image_original(cv::Size(400, 400), cv_bridge::getCvType(src_encoding)); 87 | cv::RNG r(77); 88 | r.fill(image_original, cv::RNG::UNIFORM, 0, 127); 89 | 90 | sensor_msgs::msg::Image image_message; 91 | cv_bridge::CvImage image_bridge(std_msgs::msg::Header(), src_encoding, image_original); 92 | 93 | // Convert to a sensor_msgs::Image 94 | sensor_msgs::msg::Image::SharedPtr image_msg = image_bridge.toImageMsg(); 95 | 96 | for (size_t j = 0; j < encodings.size(); ++j) { 97 | std::string dst_encoding = encodings[j]; 98 | bool is_dst_color_format = isColor(dst_encoding) || isMono(dst_encoding) || 99 | (dst_encoding == sensor_msgs::image_encodings::YUV422) || 100 | (dst_encoding == sensor_msgs::image_encodings::YUV422_YUY2); 101 | bool is_num_channels_the_same = (numChannels(src_encoding) == numChannels(dst_encoding)); 102 | 103 | cv_bridge::CvImageConstPtr cv_image; 104 | cv::Mat image_back; 105 | // If the first type does not contain any color information 106 | if (!is_src_color_format) { 107 | // Converting from a non color type to a color type does no make sense 108 | if (is_dst_color_format) { 109 | EXPECT_THROW(cv_bridge::toCvShare(image_msg, dst_encoding), cv_bridge::Exception); 110 | continue; 111 | } 112 | // We can only convert non-color types with the same number of channels 113 | 
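      // (e.g. TYPE_16UC1 -> TYPE_32FC1 is fine, but TYPE_16UC1 -> TYPE_32FC3 is expected to throw)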
if (!is_num_channels_the_same) { 114 | EXPECT_THROW(cv_bridge::toCvShare(image_msg, dst_encoding), cv_bridge::Exception); 115 | continue; 116 | } 117 | cv_image = cv_bridge::toCvShare(image_msg, dst_encoding); 118 | } else { 119 | // If we are converting to a non-color, you cannot convert to a different number of channels 120 | if (!is_dst_color_format) { 121 | if (!is_num_channels_the_same) { 122 | EXPECT_THROW(cv_bridge::toCvShare(image_msg, dst_encoding), cv_bridge::Exception); 123 | continue; 124 | } 125 | cv_image = cv_bridge::toCvShare(image_msg, dst_encoding); 126 | // We cannot convert from non-color to color 127 | EXPECT_THROW((void)cvtColor(cv_image, src_encoding)->image, cv_bridge::Exception); 128 | continue; 129 | } 130 | // We do not support conversion to YUV422 for now, except from YUV422 131 | if (((dst_encoding == YUV422) && (src_encoding != YUV422)) || 132 | ((dst_encoding == YUV422_YUY2) && (src_encoding != YUV422_YUY2))) { 133 | EXPECT_THROW(cv_bridge::toCvShare(image_msg, dst_encoding), cv_bridge::Exception); 134 | continue; 135 | } 136 | 137 | cv_image = cv_bridge::toCvShare(image_msg, dst_encoding); 138 | 139 | // We do not support conversion to YUV422 for now, except from YUV422 140 | if (((src_encoding == YUV422) && (dst_encoding != YUV422)) || 141 | ((src_encoding == YUV422_YUY2) && (dst_encoding != YUV422_YUY2))){ 142 | EXPECT_THROW((void)cvtColor(cv_image, src_encoding)->image, cv_bridge::Exception); 143 | continue; 144 | } 145 | } 146 | // And convert back to a cv::Mat 147 | image_back = cvtColor(cv_image, src_encoding)->image; 148 | 149 | // If the number of channels,s different some information 150 | // got lost at some point, so no possible test 151 | if (!is_num_channels_the_same) { 152 | continue; 153 | } 154 | if (bitDepth(src_encoding) >= 32) { 155 | // In the case where the input has floats, we will lose precision but no more than 1 156 | EXPECT_LT(cv::norm(image_original, image_back, cv::NORM_INF), 157 | 1) << "problem converting from " << src_encoding << " to " << dst_encoding << 158 | " and back."; 159 | } else if ((bitDepth(src_encoding) == 16) && (bitDepth(dst_encoding) == 8)) { 160 | // In the case where the input has floats, we 161 | // will lose precision but no more than 1 * max(127) 162 | EXPECT_LT(cv::norm(image_original, image_back, cv::NORM_INF), 163 | 128) << "problem converting from " << src_encoding << " to " << dst_encoding << 164 | " and back."; 165 | } else { 166 | EXPECT_EQ(cv::norm(image_original, image_back, cv::NORM_INF), 167 | 0) << "problem converting from " << src_encoding << " to " << dst_encoding << 168 | " and back."; 169 | } 170 | } 171 | } 172 | } 173 | -------------------------------------------------------------------------------- /image_geometry/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | cmake_minimum_required(VERSION 3.5) 2 | project(image_geometry) 3 | 4 | find_package(ament_cmake_python REQUIRED) 5 | find_package(ament_cmake_ros REQUIRED) 6 | 7 | ament_python_install_package(${PROJECT_NAME}) 8 | 9 | # Default to C++17 10 | if(NOT CMAKE_CXX_STANDARD) 11 | set(CMAKE_CXX_STANDARD 17) 12 | endif() 13 | 14 | if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang") 15 | add_compile_options(-Wall -Wextra) 16 | endif() 17 | 18 | find_package(OpenCV REQUIRED COMPONENTS calib3d core highgui imgproc) 19 | find_package(sensor_msgs REQUIRED) 20 | 21 | add_library(${PROJECT_NAME} 22 | src/pinhole_camera_model.cpp 23 | src/stereo_camera_model.cpp 24 | ) 25 | 
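# Downstream usage sketch (assuming the exported target keeps the conventional
# image_geometry::image_geometry name):
#   find_package(image_geometry REQUIRED)
#   target_link_libraries(my_target PUBLIC image_geometry::image_geometry)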
target_include_directories(${PROJECT_NAME} PUBLIC 26 | "$" 27 | "$") 28 | target_link_libraries(${PROJECT_NAME} PUBLIC 29 | opencv_calib3d 30 | opencv_core 31 | opencv_highgui 32 | opencv_imgproc 33 | ${sensor_msgs_TARGETS}) 34 | 35 | install(DIRECTORY include/ DESTINATION include/${PROJECT_NAME}) 36 | 37 | # Causes the visibility macros to use dllexport rather than dllimport, 38 | # which is appropriate when building the dll but not consuming it. 39 | target_compile_definitions(${PROJECT_NAME} PRIVATE "IMAGE_GEOMETRY_BUILDING_DLL") 40 | 41 | install(TARGETS ${PROJECT_NAME} EXPORT export_${PROJECT_NAME} 42 | RUNTIME DESTINATION bin 43 | ARCHIVE DESTINATION lib 44 | LIBRARY DESTINATION lib 45 | ) 46 | 47 | if(BUILD_TESTING) 48 | add_subdirectory(test) 49 | endif() 50 | 51 | ament_export_targets(export_${PROJECT_NAME}) 52 | ament_export_dependencies(OpenCV) 53 | ament_export_dependencies(sensor_msgs) 54 | 55 | ament_package() 56 | -------------------------------------------------------------------------------- /image_geometry/doc/conf.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # 3 | # image_geometry documentation build configuration file, created by 4 | # sphinx-quickstart on Mon Jun 1 14:21:53 2009. 5 | # 6 | # This file is execfile()d with the current directory set to its containing dir. 7 | # 8 | # Note that not all possible configuration values are present in this 9 | # autogenerated file. 10 | # 11 | # All configuration values have a default; values that are commented out 12 | # serve to show the default. 13 | 14 | import sys, os 15 | 16 | # If extensions (or modules to document with autodoc) are in another directory, 17 | # add these directories to sys.path here. If the directory is relative to the 18 | # documentation root, use os.path.abspath to make it absolute, like shown here. 19 | #sys.path.append(os.path.abspath('.')) 20 | 21 | # -- General configuration ----------------------------------------------------- 22 | 23 | # Add any Sphinx extension module names here, as strings. They can be extensions 24 | # coming with Sphinx (named 'sphinx.ext.*') or your custom ones. 25 | extensions = ['sphinx.ext.autodoc', 'sphinx.ext.doctest', 'sphinx.ext.intersphinx', 'sphinx.ext.imgmath'] 26 | 27 | # Add any paths that contain templates here, relative to this directory. 28 | templates_path = ['_templates'] 29 | 30 | # The suffix of source filenames. 31 | source_suffix = '.rst' 32 | 33 | # The encoding of source files. 34 | #source_encoding = 'utf-8' 35 | 36 | # The master toctree document. 37 | master_doc = 'index' 38 | 39 | # General information about the project. 40 | project = u'image_geometry' 41 | copyright = u'2009, Willow Garage, Inc.' 42 | 43 | # The version info for the project you're documenting, acts as replacement for 44 | # |version| and |release|, also used in various other places throughout the 45 | # built documents. 46 | # 47 | # The short X.Y version. 48 | version = '4.1' 49 | # The full version, including alpha/beta/rc tags. 50 | release = '4.1.0' 51 | 52 | # The language for content autogenerated by Sphinx. Refer to documentation 53 | # for a list of supported languages. 54 | #language = None 55 | 56 | # There are two options for replacing |today|: either, you set today to some 57 | # non-false value, then it is used: 58 | #today = '' 59 | # Else, today_fmt is used as the format for a strftime call. 60 | #today_fmt = '%B %d, %Y' 61 | 62 | # List of documents that shouldn't be included in the build. 
63 | #unused_docs = [] 64 | 65 | # List of directories, relative to source directory, that shouldn't be searched 66 | # for source files. 67 | exclude_trees = ['_build'] 68 | 69 | # The reST default role (used for this markup: `text`) to use for all documents. 70 | #default_role = None 71 | 72 | # If true, '()' will be appended to :func: etc. cross-reference text. 73 | #add_function_parentheses = True 74 | 75 | # If true, the current module name will be prepended to all description 76 | # unit titles (such as .. function::). 77 | #add_module_names = True 78 | 79 | # If true, sectionauthor and moduleauthor directives will be shown in the 80 | # output. They are ignored by default. 81 | #show_authors = False 82 | 83 | # The name of the Pygments (syntax highlighting) style to use. 84 | pygments_style = 'sphinx' 85 | 86 | # A list of ignored prefixes for module index sorting. 87 | #modindex_common_prefix = [] 88 | 89 | 90 | # -- Options for HTML output --------------------------------------------------- 91 | 92 | # The theme to use for HTML and HTML Help pages. Major themes that come with 93 | # Sphinx are currently 'default' and 'sphinxdoc'. 94 | html_theme = 'default' 95 | 96 | # Theme options are theme-specific and customize the look and feel of a theme 97 | # further. For a list of options available for each theme, see the 98 | # documentation. 99 | #html_theme_options = {} 100 | 101 | # Add any paths that contain custom themes here, relative to this directory. 102 | #html_theme_path = [] 103 | 104 | # The name for this set of Sphinx documents. If None, it defaults to 105 | # " v documentation". 106 | #html_title = None 107 | 108 | # A shorter title for the navigation bar. Default is the same as html_title. 109 | #html_short_title = None 110 | 111 | # The name of an image file (relative to this directory) to place at the top 112 | # of the sidebar. 113 | #html_logo = None 114 | 115 | # The name of an image file (within the static path) to use as favicon of the 116 | # docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32 117 | # pixels large. 118 | #html_favicon = None 119 | 120 | # Add any paths that contain custom static files (such as style sheets) here, 121 | # relative to this directory. They are copied after the builtin static files, 122 | # so a file named "default.css" will overwrite the builtin "default.css". 123 | #html_static_path = ['_static'] 124 | 125 | # If not '', a 'Last updated on:' timestamp is inserted at every page bottom, 126 | # using the given strftime format. 127 | #html_last_updated_fmt = '%b %d, %Y' 128 | 129 | # If true, SmartyPants will be used to convert quotes and dashes to 130 | # typographically correct entities. 131 | #html_use_smartypants = True 132 | 133 | # Custom sidebar templates, maps document names to template names. 134 | #html_sidebars = {} 135 | 136 | # Additional templates that should be rendered to pages, maps page names to 137 | # template names. 138 | #html_additional_pages = {} 139 | 140 | # If false, no module index is generated. 141 | #html_use_modindex = True 142 | 143 | # If false, no index is generated. 144 | #html_use_index = True 145 | 146 | # If true, the index is split into individual pages for each letter. 147 | #html_split_index = False 148 | 149 | # If true, links to the reST sources are added to the pages. 150 | #html_show_sourcelink = True 151 | 152 | # If true, an OpenSearch description file will be output, and all pages will 153 | # contain a tag referring to it. 
The value of this option must be the 154 | # base URL from which the finished HTML is served. 155 | #html_use_opensearch = '' 156 | 157 | # If nonempty, this is the file name suffix for HTML files (e.g. ".xhtml"). 158 | #html_file_suffix = '' 159 | 160 | # Output file base name for HTML help builder. 161 | htmlhelp_basename = 'image_geometrydoc' 162 | 163 | 164 | # -- Options for LaTeX output -------------------------------------------------- 165 | 166 | # The paper size ('letter' or 'a4'). 167 | #latex_paper_size = 'letter' 168 | 169 | # The font size ('10pt', '11pt' or '12pt'). 170 | #latex_font_size = '10pt' 171 | 172 | # Grouping the document tree into LaTeX files. List of tuples 173 | # (source start file, target name, title, author, documentclass [howto/manual]). 174 | latex_documents = [ 175 | ('index', 'image_geometry.tex', u'stereo\\_utils Documentation', 176 | u'James Bowman', 'manual'), 177 | ] 178 | 179 | # The name of an image file (relative to this directory) to place at the top of 180 | # the title page. 181 | #latex_logo = None 182 | 183 | # For "manual" documents, if this is true, then toplevel headings are parts, 184 | # not chapters. 185 | #latex_use_parts = False 186 | 187 | # Additional stuff for the LaTeX preamble. 188 | #latex_preamble = '' 189 | 190 | # Documents to append as an appendix to all manuals. 191 | #latex_appendices = [] 192 | 193 | # If false, no module index is generated. 194 | #latex_use_modindex = True 195 | 196 | # Example configuration for intersphinx: refer to the Python standard library. 197 | intersphinx_mapping = { 198 | 'http://docs.python.org/': None, 199 | 'http://docs.scipy.org/doc/numpy' : None, 200 | 'http://docs.ros.org/api/tf/html/python/' : None, 201 | } 202 | -------------------------------------------------------------------------------- /image_geometry/doc/index.rst: -------------------------------------------------------------------------------- 1 | image_geometry 2 | ============== 3 | 4 | image_geometry simplifies interpreting images geometrically using the 5 | parameters from sensor_msgs/CameraInfo. 6 | 7 | .. toctree:: 8 | :maxdepth: 2 9 | :caption: Contents: 10 | 11 | Python API Docs 12 | C++ API Docs -------------------------------------------------------------------------------- /image_geometry/doc/mainpage.dox: -------------------------------------------------------------------------------- 1 | /** 2 | \mainpage 3 | \htmlinclude manifest.html 4 | 5 | \b image_geometry contains camera model classes that simplify interpreting 6 | images geometrically using the calibration parameters from 7 | sensor_msgs/CameraInfo messages. They may be efficiently updated in your 8 | image callback: 9 | 10 | \code 11 | image_geometry::PinholeCameraModel model_; 12 | 13 | void imageCb(const sensor_msgs::ImageConstPtr& raw_image, 14 | const sensor_msgs::CameraInfoConstPtr& cam_info) 15 | { 16 | // Update the camera model (usually a no-op) 17 | model_.fromCameraInfo(cam_info); 18 | 19 | // Do processing... 20 | } 21 | \endcode 22 | 23 | \section codeapi Code API 24 | 25 | \b image_geometry contains two classes: 26 | - image_geometry::PinholeCameraModel - models a pinhole camera with distortion. 27 | - image_geometry::StereoCameraModel - models a stereo pair of pinhole cameras. 
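For example, once the model has been updated from a CameraInfo message, a rectified
pixel can be turned into a viewing ray (a minimal sketch following the callback above):

\code
cv::Point2d uv_rect(320.0, 240.0);
cv::Point3d ray = model_.projectPixelTo3dRay(uv_rect);  // 3D ray through the pixel
\endcode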
28 | 29 | */ 30 | -------------------------------------------------------------------------------- /image_geometry/doc/python_api.rst: -------------------------------------------------------------------------------- 1 | image_geometry 2 | ============== 3 | 4 | image_geometry simplifies interpreting images geometrically using the 5 | parameters from sensor_msgs/CameraInfo. 6 | 7 | .. module:: image_geometry 8 | 9 | .. autoclass:: image_geometry.PinholeCameraModel 10 | :members: 11 | :member-order: bysource 12 | 13 | .. autoclass:: image_geometry.StereoCameraModel 14 | :members: 15 | :member-order: bysource -------------------------------------------------------------------------------- /image_geometry/image_geometry/__init__.py: -------------------------------------------------------------------------------- 1 | from __future__ import absolute_import 2 | from .cameramodels import PinholeCameraModel, StereoCameraModel 3 | -------------------------------------------------------------------------------- /image_geometry/include/image_geometry/stereo_camera_model.hpp: -------------------------------------------------------------------------------- 1 | #ifndef IMAGE_GEOMETRY__STEREO_CAMERA_MODEL_HPP_ 2 | #define IMAGE_GEOMETRY__STEREO_CAMERA_MODEL_HPP_ 3 | 4 | #include "image_geometry/pinhole_camera_model.hpp" 5 | #include "image_geometry/visibility_control.hpp" 6 | 7 | namespace image_geometry { 8 | 9 | /** 10 | * \brief Simplifies interpreting stereo image pairs geometrically using the 11 | * parameters from the left and right sensor_msgs/CameraInfo. 12 | */ 13 | class StereoCameraModel 14 | { 15 | public: 16 | IMAGE_GEOMETRY_PUBLIC 17 | StereoCameraModel(); 18 | 19 | IMAGE_GEOMETRY_PUBLIC 20 | StereoCameraModel(const StereoCameraModel& other); 21 | 22 | IMAGE_GEOMETRY_PUBLIC 23 | StereoCameraModel& operator=(const StereoCameraModel& other); 24 | 25 | /** 26 | * \brief Set the camera parameters from the sensor_msgs/CameraInfo messages. 27 | */ 28 | IMAGE_GEOMETRY_PUBLIC 29 | bool fromCameraInfo(const sensor_msgs::msg::CameraInfo& left, 30 | const sensor_msgs::msg::CameraInfo& right); 31 | 32 | /** 33 | * \brief Set the camera parameters from the sensor_msgs/CameraInfo messages. 34 | */ 35 | IMAGE_GEOMETRY_PUBLIC 36 | bool fromCameraInfo(const sensor_msgs::msg::CameraInfo::ConstSharedPtr& left, 37 | const sensor_msgs::msg::CameraInfo::ConstSharedPtr& right); 38 | 39 | /** 40 | * \brief Get the left monocular camera model. 41 | */ 42 | IMAGE_GEOMETRY_PUBLIC 43 | const PinholeCameraModel& left() const; 44 | 45 | /** 46 | * \brief Get the right monocular camera model. 47 | */ 48 | IMAGE_GEOMETRY_PUBLIC 49 | const PinholeCameraModel& right() const; 50 | 51 | /** 52 | * \brief Get the name of the camera coordinate frame in tf. 53 | * 54 | * For stereo cameras, both the left and right CameraInfo should be in the left 55 | * optical frame. 56 | */ 57 | IMAGE_GEOMETRY_PUBLIC 58 | std::string tfFrame() const; 59 | 60 | /** 61 | * \brief Project a rectified pixel with disparity to a 3d point. 62 | */ 63 | IMAGE_GEOMETRY_PUBLIC 64 | void projectDisparityTo3d(const cv::Point2d& left_uv_rect, float disparity, cv::Point3d& xyz) const; 65 | 66 | /** 67 | * \brief Project a disparity image to a 3d point cloud. 68 | * 69 | * If handleMissingValues = true, all points with minimal disparity (outliers) have 70 | * Z set to MISSING_Z (currently 10000.0). 
71 | */ 72 | IMAGE_GEOMETRY_PUBLIC 73 | void projectDisparityImageTo3d(const cv::Mat& disparity, cv::Mat& point_cloud, 74 | bool handleMissingValues = false) const; 75 | IMAGE_GEOMETRY_PUBLIC 76 | static const double MISSING_Z; 77 | 78 | /** 79 | * \brief Returns the disparity reprojection matrix. 80 | */ 81 | IMAGE_GEOMETRY_PUBLIC 82 | const cv::Matx44d& reprojectionMatrix() const; 83 | 84 | /** 85 | * \brief Returns the horizontal baseline in world coordinates. 86 | */ 87 | IMAGE_GEOMETRY_PUBLIC 88 | double baseline() const; 89 | 90 | /** 91 | * \brief Returns the depth at which a point is observed with a given disparity. 92 | * 93 | * This is the inverse of getDisparity(). 94 | */ 95 | IMAGE_GEOMETRY_PUBLIC 96 | double getZ(double disparity) const; 97 | 98 | /** 99 | * \brief Returns the disparity observed for a point at depth Z. 100 | * 101 | * This is the inverse of getZ(). 102 | */ 103 | IMAGE_GEOMETRY_PUBLIC 104 | double getDisparity(double Z) const; 105 | 106 | /** 107 | * \brief Returns true if the camera has been initialized 108 | */ 109 | IMAGE_GEOMETRY_PUBLIC 110 | bool initialized() const { return left_.initialized() && right_.initialized(); } 111 | protected: 112 | PinholeCameraModel left_, right_; 113 | cv::Matx44d Q_; 114 | 115 | IMAGE_GEOMETRY_PUBLIC 116 | void updateQ(); 117 | }; 118 | 119 | 120 | /* Trivial inline functions */ 121 | IMAGE_GEOMETRY_PUBLIC 122 | inline const PinholeCameraModel& StereoCameraModel::left() const { return left_; } 123 | IMAGE_GEOMETRY_PUBLIC 124 | inline const PinholeCameraModel& StereoCameraModel::right() const { return right_; } 125 | 126 | IMAGE_GEOMETRY_PUBLIC 127 | inline std::string StereoCameraModel::tfFrame() const { return left_.tfFrame(); } 128 | 129 | IMAGE_GEOMETRY_PUBLIC 130 | inline const cv::Matx44d& StereoCameraModel::reprojectionMatrix() const { return Q_; } 131 | 132 | IMAGE_GEOMETRY_PUBLIC 133 | inline double StereoCameraModel::baseline() const 134 | { 135 | /// @todo Currently assuming horizontal baseline 136 | return -right_.Tx() / right_.fx(); 137 | } 138 | 139 | IMAGE_GEOMETRY_PUBLIC 140 | inline double StereoCameraModel::getZ(double disparity) const 141 | { 142 | assert( initialized() ); 143 | return -right_.Tx() / (disparity - (left().cx() - right().cx())); 144 | } 145 | 146 | IMAGE_GEOMETRY_PUBLIC 147 | inline double StereoCameraModel::getDisparity(double Z) const 148 | { 149 | assert( initialized() ); 150 | return -right_.Tx() / Z + (left().cx() - right().cx()); ; 151 | } 152 | 153 | } // namespace image_geometry 154 | 155 | #endif // IMAGE_GEOMETRY__STEREO_CAMERA_MODEL_HPP_ 156 | -------------------------------------------------------------------------------- /image_geometry/include/image_geometry/visibility_control.hpp: -------------------------------------------------------------------------------- 1 | // Copyright 2015 Open Source Robotics Foundation, Inc. 2 | // 3 | // Licensed under the Apache License, Version 2.0 (the "License"); 4 | // you may not use this file except in compliance with the License. 5 | // You may obtain a copy of the License at 6 | // 7 | // http://www.apache.org/licenses/LICENSE-2.0 8 | // 9 | // Unless required by applicable law or agreed to in writing, software 10 | // distributed under the License is distributed on an "AS IS" BASIS, 11 | // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | // See the License for the specific language governing permissions and 13 | // limitations under the License. 
14 | 15 | #ifndef IMAGE_GEOMETRY__VISIBILITY_CONTROL_H_ 16 | #define IMAGE_GEOMETRY__VISIBILITY_CONTROL_H_ 17 | 18 | #if __cplusplus 19 | extern "C" 20 | { 21 | #endif 22 | 23 | // This logic was borrowed (then namespaced) from the examples on the gcc wiki: 24 | // https://gcc.gnu.org/wiki/Visibility 25 | 26 | #if defined _WIN32 || defined __CYGWIN__ 27 | #ifdef __GNUC__ 28 | #define IMAGE_GEOMETRY_EXPORT __attribute__ ((dllexport)) 29 | #define IMAGE_GEOMETRY_IMPORT __attribute__ ((dllimport)) 30 | #else 31 | #define IMAGE_GEOMETRY_EXPORT __declspec(dllexport) 32 | #define IMAGE_GEOMETRY_IMPORT __declspec(dllimport) 33 | #endif 34 | #ifdef IMAGE_GEOMETRY_BUILDING_DLL 35 | #define IMAGE_GEOMETRY_PUBLIC IMAGE_GEOMETRY_EXPORT 36 | #else 37 | #define IMAGE_GEOMETRY_PUBLIC IMAGE_GEOMETRY_IMPORT 38 | #endif 39 | #define IMAGE_GEOMETRY_PUBLIC_TYPE IMAGE_GEOMETRY_PUBLIC 40 | #define IMAGE_GEOMETRY_LOCAL 41 | #else 42 | #define IMAGE_GEOMETRY_EXPORT __attribute__ ((visibility("default"))) 43 | #define IMAGE_GEOMETRY_IMPORT 44 | #if __GNUC__ >= 4 45 | #define IMAGE_GEOMETRY_PUBLIC __attribute__ ((visibility("default"))) 46 | #define IMAGE_GEOMETRY_LOCAL __attribute__ ((visibility("hidden"))) 47 | #else 48 | #define IMAGE_GEOMETRY_PUBLIC 49 | #define IMAGE_GEOMETRY_LOCAL 50 | #endif 51 | #define IMAGE_GEOMETRY_PUBLIC_TYPE 52 | #endif 53 | 54 | #if __cplusplus 55 | } 56 | #endif 57 | 58 | #endif // IMAGE_GEOMETRY__VISIBILITY_CONTROL_H_ 59 | -------------------------------------------------------------------------------- /image_geometry/package.xml: -------------------------------------------------------------------------------- 1 | 2 | image_geometry 3 | 4.1.0 4 | 5 | `image_geometry` contains C++ and Python libraries for interpreting images 6 | geometrically. It interfaces the calibration parameters in sensor_msgs/CameraInfo 7 | messages with OpenCV functions such as image rectification, much as cv_bridge 8 | interfaces ROS sensor_msgs/Image with OpenCV data types. 
9 | 10 | Kenji Brameld 11 | Apache License 2.0 12 | BSD 13 | http://www.ros.org/wiki/image_geometry 14 | 15 | Patrick Mihelich 16 | Vincent Rabaud 17 | Mikael Arguedas 18 | 19 | ament_cmake_python 20 | ament_cmake_ros 21 | 22 | libopencv-dev 23 | sensor_msgs 24 | python3-deprecated 25 | 26 | ament_cmake_gtest 27 | ament_cmake_pytest 28 | 29 | 30 | ament_cmake 31 | 32 | 33 | 34 | -------------------------------------------------------------------------------- /image_geometry/src/stereo_camera_model.cpp: -------------------------------------------------------------------------------- 1 | #include "image_geometry/stereo_camera_model.hpp" 2 | 3 | namespace image_geometry { 4 | 5 | StereoCameraModel::StereoCameraModel() 6 | : Q_(0.0) 7 | { 8 | Q_(0,0) = Q_(1,1) = 1.0; 9 | } 10 | 11 | StereoCameraModel::StereoCameraModel(const StereoCameraModel& other) 12 | : left_(other.left_), right_(other.right_), 13 | Q_(0.0) 14 | { 15 | Q_(0,0) = Q_(1,1) = 1.0; 16 | if (other.initialized()) { 17 | updateQ(); 18 | } 19 | } 20 | 21 | StereoCameraModel& StereoCameraModel::operator=(const StereoCameraModel& other) 22 | { 23 | if (other.initialized()) { 24 | this->fromCameraInfo(other.left_.cameraInfo(), other.right_.cameraInfo()); 25 | } 26 | return *this; 27 | } 28 | 29 | bool StereoCameraModel::fromCameraInfo(const sensor_msgs::msg::CameraInfo& left, 30 | const sensor_msgs::msg::CameraInfo& right) 31 | { 32 | bool changed_left = left_.fromCameraInfo(left); 33 | bool changed_right = right_.fromCameraInfo(right); 34 | bool changed = changed_left || changed_right; 35 | 36 | // Note: don't require identical time stamps to allow imperfectly synced stereo. 37 | assert( left_.tfFrame() == right_.tfFrame() ); 38 | assert( left_.fx() == right_.fx() ); 39 | assert( left_.fy() == right_.fy() ); 40 | assert( left_.cy() == right_.cy() ); 41 | // cx may differ for verged cameras 42 | 43 | if (changed) { 44 | updateQ(); 45 | } 46 | 47 | return changed; 48 | } 49 | 50 | bool StereoCameraModel::fromCameraInfo(const sensor_msgs::msg::CameraInfo::ConstSharedPtr& left, 51 | const sensor_msgs::msg::CameraInfo::ConstSharedPtr& right) 52 | { 53 | return fromCameraInfo(*left, *right); 54 | } 55 | 56 | void StereoCameraModel::updateQ() 57 | { 58 | // Update variable fields of reprojection matrix 59 | /* 60 | From Springer Handbook of Robotics, p. 524: 61 | 62 | [ Fx 0 Cx 0 ] 63 | P = [ 0 Fy Cy 0 ] 64 | [ 0 0 1 0 ] 65 | 66 | [ Fx 0 Cx' FxTx ] 67 | P' = [ 0 Fy Cy 0 ] 68 | [ 0 0 1 0 ] 69 | where primed parameters are from the left projection matrix, unprimed from the right. 70 | 71 | [u v 1]^T = P * [x y z 1]^T 72 | [u-d v 1]^T = P' * [x y z 1]^T 73 | 74 | Combining the two equations above results in the following equation 75 | 76 | [u v u-d 1]^T = [ Fx 0 Cx 0 ] * [ x y z 1]^T 77 | [ 0 Fy Cy 0 ] 78 | [ Fx 0 Cx' FxTx ] 79 | [ 0 0 1 0 ] 80 | 81 | Subtracting the 3rd from from the first and inverting the expression 82 | results in the following equation. 83 | 84 | [x y z 1]^T = Q * [u v d 1]^T 85 | 86 | Where Q is defined as 87 | 88 | Q = [ FyTx 0 0 -FyCxTx ] 89 | [ 0 FxTx 0 -FxCyTx ] 90 | [ 0 0 0 FxFyTx ] 91 | [ 0 0 -Fy Fy(Cx-Cx') ] 92 | 93 | Using the assumption Fx = Fy Q can be simplified to the following. But for 94 | compatibility with stereo cameras with different focal lengths we will use 95 | the full Q matrix. 
96 | 97 | [ 1 0 0 -Cx ] 98 | Q = [ 0 1 0 -Cy ] 99 | [ 0 0 0 Fx ] 100 | [ 0 0 -1/Tx (Cx-Cx')/Tx ] 101 | 102 | Disparity = x_left - x_right 103 | 104 | For compatibility with stereo cameras with different focal lengths we will use 105 | the full Q matrix. 106 | 107 | */ 108 | double Tx = -baseline(); // The baseline member negates our Tx. Undo this negation 109 | Q_(0,0) = left_.fy() * Tx; 110 | Q_(0,3) = -left_.fy() * left_.cx() * Tx; 111 | Q_(1,1) = left_.fx() * Tx; 112 | Q_(1,3) = -left_.fx() * left_.cy() * Tx; 113 | Q_(2,3) = left_.fx() * left_.fy() * Tx; 114 | Q_(3,2) = -left_.fy(); 115 | Q_(3,3) = left_.fy() * (left_.cx() - right_.cx()); // zero when disparities are pre-adjusted 116 | } 117 | 118 | void StereoCameraModel::projectDisparityTo3d(const cv::Point2d& left_uv_rect, float disparity, 119 | cv::Point3d& xyz) const 120 | { 121 | assert( initialized() ); 122 | 123 | // Do the math inline: 124 | // [X Y Z W]^T = Q * [u v d 1]^T 125 | // Point = (X/W, Y/W, Z/W) 126 | // cv::perspectiveTransform could be used but with more overhead. 127 | double u = left_uv_rect.x, v = left_uv_rect.y; 128 | cv::Point3d XYZ( (Q_(0,0) * u) + Q_(0,3), (Q_(1,1) * v) + Q_(1,3), Q_(2,3)); 129 | double W = Q_(3,2)*disparity + Q_(3,3); 130 | xyz = XYZ * (1.0/W); 131 | } 132 | 133 | // MISSING_Z is defined as 10000 in 134 | // https://docs.opencv.org/3.4/d9/d0c/group__calib3d.html#ga1bc1152bd57d63bc524204f21fde6e02 135 | // Having it as a public member makes this information available for users of cv_bridge. 136 | const double StereoCameraModel::MISSING_Z = 10000.; 137 | 138 | void StereoCameraModel::projectDisparityImageTo3d(const cv::Mat& disparity, cv::Mat& point_cloud, 139 | bool handleMissingValues) const 140 | { 141 | assert( initialized() ); 142 | 143 | cv::reprojectImageTo3D(disparity, point_cloud, Q_, handleMissingValues); 144 | } 145 | 146 | } // namespace image_geometry 147 | -------------------------------------------------------------------------------- /image_geometry/test/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | find_package(ament_cmake_gtest REQUIRED) 2 | find_package(ament_cmake_pytest REQUIRED) 3 | 4 | ament_add_pytest_test(directed.py "directed.py") 5 | 6 | ament_add_gtest(${PROJECT_NAME}-utest utest.cpp) 7 | target_link_libraries(${PROJECT_NAME}-utest ${PROJECT_NAME}) 8 | 9 | ament_add_gtest(${PROJECT_NAME}-utest-equi utest_equi.cpp) 10 | target_link_libraries(${PROJECT_NAME}-utest-equi ${PROJECT_NAME}) 11 | -------------------------------------------------------------------------------- /image_geometry/test/utest_equi.cpp: -------------------------------------------------------------------------------- 1 | #include "image_geometry/pinhole_camera_model.hpp" 2 | #include 3 | #include 4 | 5 | /// @todo Tests with simple values (R = identity, D = 0, P = K or simple scaling) 6 | /// @todo Test projection functions for right stereo values, P(:,3) != 0 7 | /// @todo Tests for [un]rectifyImage 8 | /// @todo Tests using ROI, needs support from PinholeCameraModel 9 | /// @todo Tests for StereoCameraModel 10 | 11 | class EquidistantTest : public testing::Test 12 | { 13 | protected: 14 | virtual void SetUp() 15 | { 16 | /// @todo Just load these from file 17 | // These parameters are taken from a real camera calibration 18 | double D[] = {-0.08857683871674071, 0.0708113094372378, -0.09127623055964429, 0.04006922269778478}; 19 | double K[] = {403.603063319358, 0.0, 306.15842863283063, 20 | 0.0, 403.7028851121003, 261.09715697592696, 21 | 0.0, 
0.0, 1.0}; 22 | double R[] = {0.999963944103842, -0.008484152966323483, 0.00036005656766869323, 23 | 0.008484153516269438, 0.9999640089218772, 0.0, 24 | -0.0003600436088446379, 3.0547751946422504e-06, 0.999999935179632}; 25 | double P[] = {347.2569964503485, 0.0, 350.5, 0.0, 26 | 0.0, 347.2569964503485, 256.0, 0.0, 27 | 0.0, 0.0, 1.0, 0.0}; 28 | 29 | cam_info_.header.frame_id = "tf_frame"; 30 | cam_info_.height = 512; 31 | cam_info_.width = 640; 32 | // No ROI 33 | cam_info_.d.resize(4); 34 | std::copy(D, D+4, cam_info_.d.begin()); 35 | std::copy(K, K+9, cam_info_.k.begin()); 36 | std::copy(R, R+9, cam_info_.r.begin()); 37 | std::copy(P, P+12, cam_info_.p.begin()); 38 | cam_info_.distortion_model = sensor_msgs::distortion_models::EQUIDISTANT; 39 | 40 | model_.fromCameraInfo(cam_info_); 41 | } 42 | 43 | sensor_msgs::msg::CameraInfo cam_info_; 44 | image_geometry::PinholeCameraModel model_; 45 | }; 46 | 47 | TEST_F(EquidistantTest, accessorsCorrect) 48 | { 49 | EXPECT_STREQ("tf_frame", model_.tfFrame().c_str()); 50 | EXPECT_EQ(cam_info_.p[0], model_.fx()); 51 | EXPECT_EQ(cam_info_.p[5], model_.fy()); 52 | EXPECT_EQ(cam_info_.p[2], model_.cx()); 53 | EXPECT_EQ(cam_info_.p[6], model_.cy()); 54 | } 55 | 56 | TEST_F(EquidistantTest, projectPoint) 57 | { 58 | // Spot test an arbitrary point. 59 | { 60 | cv::Point2d uv(100, 100); 61 | cv::Point3d xyz = model_.projectPixelTo3dRay(uv); 62 | EXPECT_NEAR(-0.72136775518018115, xyz.x, 1e-8); 63 | EXPECT_NEAR(-0.449235009214005, xyz.y, 1e-8); 64 | EXPECT_DOUBLE_EQ(1.0, xyz.z); 65 | } 66 | 67 | // Principal point should project straight out. 68 | { 69 | cv::Point2d uv(model_.cx(), model_.cy()); 70 | cv::Point3d xyz = model_.projectPixelTo3dRay(uv); 71 | EXPECT_DOUBLE_EQ(0.0, xyz.x); 72 | EXPECT_DOUBLE_EQ(0.0, xyz.y); 73 | EXPECT_DOUBLE_EQ(1.0, xyz.z); 74 | } 75 | 76 | // Check projecting to 3d and back over entire image is accurate. 77 | const size_t step = 10; 78 | for (size_t row = 0; row <= cam_info_.height; row += step) { 79 | for (size_t col = 0; col <= cam_info_.width; col += step) { 80 | cv::Point2d uv(row, col), uv_back; 81 | cv::Point3d xyz = model_.projectPixelTo3dRay(uv); 82 | uv_back = model_.project3dToPixel(xyz); 83 | // Measured max error at 1.13687e-13 84 | EXPECT_NEAR(uv.x, uv_back.x, 1.14e-13) << "at (" << row << ", " << col << ")"; 85 | EXPECT_NEAR(uv.y, uv_back.y, 1.14e-13) << "at (" << row << ", " << col << ")"; 86 | } 87 | } 88 | } 89 | 90 | TEST_F(EquidistantTest, rectifyPoint) 91 | { 92 | // Spot test an arbitrary point. 93 | { 94 | cv::Point2d uv_raw(100, 100), uv_rect; 95 | uv_rect = model_.rectifyPoint(uv_raw); 96 | EXPECT_DOUBLE_EQ(135.45747375488281, uv_rect.x); 97 | EXPECT_DOUBLE_EQ(84.945091247558594, uv_rect.y); 98 | } 99 | 100 | /// @todo Need R = identity for the principal point tests. 101 | #if 0 102 | // Test rectifyPoint takes (c'x, c'y) [from K] -> (cx, cy) [from P]. 103 | double cxp = model_.intrinsicMatrix()(0,2), cyp = model_.intrinsicMatrix()(1,2); 104 | { 105 | cv::Point2d uv_raw(cxp, cyp), uv_rect; 106 | model_.rectifyPoint(uv_raw, uv_rect); 107 | EXPECT_NEAR(uv_rect.x, model_.cx(), 1e-4); 108 | EXPECT_NEAR(uv_rect.y, model_.cy(), 1e-4); 109 | } 110 | 111 | // Test unrectifyPoint takes (cx, cy) [from P] -> (c'x, c'y) [from K]. 112 | { 113 | cv::Point2d uv_rect(model_.cx(), model_.cy()), uv_raw; 114 | model_.unrectifyPoint(uv_rect, uv_raw); 115 | EXPECT_NEAR(uv_raw.x, cxp, 1e-4); 116 | EXPECT_NEAR(uv_raw.y, cyp, 1e-4); 117 | } 118 | #endif 119 | 120 | // Check rectifying then unrectifying is accurate. 
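  // rectifyPoint followed by unrectifyPoint should return to the original pixel
  // to within ~0.01 px everywhere on the image for this equidistant model.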
121 | const size_t step = 5; 122 | for (size_t row = 0; row <= cam_info_.height; row += step) { 123 | for (size_t col = 0; col <= cam_info_.width; col += step) { 124 | cv::Point2d uv_raw(row, col), uv_rect, uv_unrect; 125 | uv_rect = model_.rectifyPoint(uv_raw); 126 | uv_unrect = model_.unrectifyPoint(uv_rect); 127 | EXPECT_NEAR(uv_raw.x, uv_unrect.x, 0.01); 128 | EXPECT_NEAR(uv_raw.y, uv_unrect.y, 0.01); 129 | } 130 | } 131 | } 132 | 133 | TEST_F(EquidistantTest, getDeltas) 134 | { 135 | double u = 100.0, v = 200.0, du = 17.0, dv = 23.0, Z = 2.0; 136 | cv::Point2d uv0(u, v), uv1(u + du, v + dv); 137 | cv::Point3d xyz0, xyz1; 138 | xyz0 = model_.projectPixelTo3dRay(uv0); 139 | xyz0 *= (Z / xyz0.z); 140 | xyz1 = model_.projectPixelTo3dRay(uv1); 141 | xyz1 *= (Z / xyz1.z); 142 | 143 | EXPECT_NEAR(model_.getDeltaU(xyz1.x - xyz0.x, Z), du, 1e-4); 144 | EXPECT_NEAR(model_.getDeltaV(xyz1.y - xyz0.y, Z), dv, 1e-4); 145 | EXPECT_NEAR(model_.getDeltaX(du, Z), xyz1.x - xyz0.x, 1e-4); 146 | EXPECT_NEAR(model_.getDeltaY(dv, Z), xyz1.y - xyz0.y, 1e-4); 147 | } 148 | 149 | TEST_F(EquidistantTest, initialization) 150 | { 151 | 152 | sensor_msgs::msg::CameraInfo info; 153 | image_geometry::PinholeCameraModel camera; 154 | 155 | camera.fromCameraInfo(info); 156 | 157 | EXPECT_EQ(camera.initialized(), 1); 158 | EXPECT_EQ(camera.projectionMatrix().rows, 3); 159 | EXPECT_EQ(camera.projectionMatrix().cols, 4); 160 | } 161 | 162 | TEST_F(EquidistantTest, rectifyIfCalibrated) 163 | { 164 | /// @todo use forward distortion for a better test 165 | // Ideally this test would have two images stored on disk 166 | // one which is distorted and the other which is rectified, 167 | // and then rectification would take place here and the output 168 | // image compared to the one on disk (which would mean if 169 | // the distortion coefficients above can't change once paired with 170 | // an image). 171 | 172 | // Later could incorporate distort code 173 | // (https://github.com/lucasw/vimjay/blob/master/src/standalone/distort_image.cpp) 174 | // to take any image distort it, then undistort with rectifyImage, 175 | // and given the distortion coefficients are consistent the input image 176 | // and final output image should be mostly the same (though some 177 | // interpolation error 178 | // creeps in), except for outside a masked region where information was lost. 179 | // The masked region can be generated with a pure white image that 180 | // goes through the same process (if it comes out completely black 181 | // then the distortion parameters are problematic). 182 | 183 | // For now generate an image and pass the test simply if 184 | // the rectified image does not match the distorted image. 185 | // Then zero out the first distortion coefficient and run 186 | // the test again. 187 | // Then zero out all the distortion coefficients and test 188 | // that the output image is the same as the input. 
189 | cv::Mat distorted_image(cv::Size(cam_info_.width, cam_info_.height), CV_8UC3, cv::Scalar(0, 0, 0));
190 | 
191 | // draw a grid
192 | const cv::Scalar color = cv::Scalar(255, 255, 255);
193 | // draw the lines thick so the proportion of error due to
194 | // interpolation is reduced
195 | const int thickness = 7;
196 | const int type = 8;
197 | for (size_t y = 0; y <= cam_info_.height; y += cam_info_.height/10)
198 | {
199 | cv::line(distorted_image,
200 | cv::Point(0, y), cv::Point(cam_info_.width, y),
201 | color, thickness, type);
202 | }
203 | for (size_t x = 0; x <= cam_info_.width; x += cam_info_.width/10)
204 | {
205 | // draw the lines thick so the proportion of interpolation error is reduced
206 | cv::line(distorted_image,
207 | cv::Point(x, 0), cv::Point(x, cam_info_.height),
208 | color, thickness, type);
209 | }
210 | 
211 | cv::Mat rectified_image;
212 | // This threshold is a heuristic and could probably be larger,
213 | // since a completely different image would differ on the order of
214 | // width * height * 255, roughly 8.4e7 for this image size
215 | const double diff_threshold = 10000.0;
216 | double error;
217 | 
218 | // Test that rectified image is sufficiently different
219 | // using default distortion
220 | model_.rectifyImage(distorted_image, rectified_image);
221 | error = cv::norm(distorted_image, rectified_image, cv::NORM_L1);
222 | // using the heuristic threshold chosen above
223 | EXPECT_GT(error, diff_threshold);
224 | 
225 | // Test that rectified image is sufficiently different
226 | // using default distortion but with the first element zeroed
227 | // out.
228 | sensor_msgs::msg::CameraInfo cam_info_2 = cam_info_;
229 | cam_info_2.d[0] = 0.0;
230 | model_.fromCameraInfo(cam_info_2);
231 | model_.rectifyImage(distorted_image, rectified_image);
232 | error = cv::norm(distorted_image, rectified_image, cv::NORM_L1);
233 | EXPECT_GT(error, diff_threshold);
234 | 
235 | // Test that rectified image is the same using zero distortion
236 | cam_info_2.d.assign(cam_info_2.d.size(), 0);
237 | model_.fromCameraInfo(cam_info_2);
238 | model_.rectifyImage(distorted_image, rectified_image);
239 | error = cv::norm(distorted_image, rectified_image, cv::NORM_L1);
240 | EXPECT_EQ(error, 0);
241 | 
242 | // Test that rectified image is the same using empty distortion
243 | cam_info_2.d.clear();
244 | model_.fromCameraInfo(cam_info_2);
245 | model_.rectifyImage(distorted_image, rectified_image);
246 | error = cv::norm(distorted_image, rectified_image, cv::NORM_L1);
247 | EXPECT_EQ(error, 0);
248 | 
249 | // restore original distortion
250 | model_.fromCameraInfo(cam_info_);
251 | }
252 | 
253 | int main(int argc, char** argv)
254 | {
255 | testing::InitGoogleTest(&argc, argv);
256 | return RUN_ALL_TESTS();
257 | }
258 | 
--------------------------------------------------------------------------------
/opencv_tests/CHANGELOG.rst:
--------------------------------------------------------------------------------
1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2 | Changelog for package opencv_tests
3 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
4 | 
5 | 4.1.0 (2024-04-19)
6 | ------------------
7 | 
8 | 4.0.0 (2024-04-13)
9 | ------------------
10 | 
11 | 3.4.0 (2022-10-03)
12 | ------------------
13 | 
14 | 3.3.1 (2022-09-21)
15 | ------------------
16 | 
17 | 3.3.0 (2022-09-14)
18 | ------------------
19 | * Reorganize author tag (`#460 `_)
20 | * Update maintainers (`#451 `_)
21 | * Contributors: Kenji Brameld
22 | 
23 | 3.0.3 (2022-04-01)
24 | ------------------
25 | 
26 | 3.0.2 (2022-01-27)
27 | ------------------
28 | * Minor
cleanups to the ROS 2 branch (`#418 `_) 29 | * Contributors: Chris Lalancette 30 | 31 | 3.0.1 (2022-01-25) 32 | ------------------ 33 | 34 | 3.0.0 (2022-01-19) 35 | ------------------ 36 | * Rename opencv_tests install folder to deconflict (`#357 `_) 37 | * delete __init_\_ for launch directory (`#352 `_) 38 | * Contributors: Dirk Thomas, Michael Carroll 39 | 40 | 2.2.1 (2020-07-16) 41 | ------------------ 42 | 43 | 2.2.0 (2020-05-27) 44 | ------------------ 45 | 46 | 2.1.3 (2019-10-23) 47 | ------------------ 48 | 49 | 2.1.2 (2019-05-30) 50 | ------------------ 51 | 52 | 2.1.1 (2019-04-26) 53 | ------------------ 54 | 55 | 2.1.0 (2018-11-26) 56 | ------------------ 57 | 58 | 2.0.5 (2018-08-17) 59 | ------------------ 60 | 61 | 2.0.4 (2018-08-14) 62 | ------------------ 63 | 64 | 2.0.3 (2018-08-07) 65 | ------------------ 66 | * set zip_safe to avoid warning during installation 67 | * migrate launch to launch.legacy 68 | * fix exception of running launch after sourcing opencv_tests 69 | * Contributors: Ethan Gao 70 | 71 | 2.0.2 (2018-06-29) 72 | ------------------ 73 | 74 | 2.0.1 (2018-06-28) 75 | ------------------ 76 | 77 | 2.0.0 (2018-04-25) 78 | ------------------- 79 | * port opencv_tests to ros2 80 | 81 | 1.12.7 (2017-11-12) 82 | ------------------- 83 | 84 | 1.12.6 (2017-11-11) 85 | ------------------- 86 | 87 | 1.12.5 (2017-11-05) 88 | ------------------- 89 | 90 | 1.12.4 (2017-01-29) 91 | ------------------- 92 | 93 | 1.12.3 (2016-12-04) 94 | ------------------- 95 | 96 | 1.12.2 (2016-09-24) 97 | ------------------- 98 | 99 | 1.12.1 (2016-07-11) 100 | ------------------- 101 | * Support compressed Images messages in python for indigo 102 | - Add cv2_to_comprssed_imgmsg: Convert from cv2 image to compressed image ros msg. 103 | - Add comprssed_imgmsg_to_cv2: Convert the compress message to a new image. 104 | - Add compressed image tests. 105 | - Add time to msgs (compressed and regular). 106 | add enumerants test for compressed image. 107 | merge the compressed tests with the regular ones. 108 | better comment explanation. I will squash this commit. 109 | Fix indentation 110 | fix typo mistage: from .imgmsg_to_compressed_cv2 to .compressed_imgmsg_to_cv2. 111 | remove cv2.CV_8UC1 112 | remove rospy and time depndency. 113 | change from IMREAD_COLOR to IMREAD_ANYCOLOR. 114 | - make indentaion of 4. 115 | - remove space trailer. 116 | - remove space from empty lines. 117 | - another set of for loops, it will make things easier to track. In that new set, just have the number of channels in ([],1,3,4) (ignore two for jpg). from: https://github.com/ros-perception/vision_opencv/pull/132#discussion_r66721943 118 | - keep the OpenCV error message. from: https://github.com/ros-perception/vision_opencv/pull/132#discussion_r66721013 119 | add debug print for test. 120 | add case for 4 channels in test. 121 | remove 4 channels case from compressed test. 122 | add debug print for test. 123 | change typo of format. 124 | fix typo in format. change from dip to dib. 125 | change to IMREAD_ANYCOLOR as python code. (as it should). 126 | rename TIFF to tiff 127 | Sperate the tests one for regular images and one for compressed. 
128 | update comment 129 | * Contributors: talregev 130 | 131 | 1.12.0 (2016-03-18) 132 | ------------------- 133 | 134 | 1.11.12 (2016-03-10) 135 | -------------------- 136 | 137 | 1.11.11 (2016-01-31) 138 | -------------------- 139 | * fix a few warnings in doc jobs 140 | * Contributors: Vincent Rabaud 141 | 142 | 1.11.10 (2016-01-16) 143 | -------------------- 144 | 145 | 1.11.9 (2015-11-29) 146 | ------------------- 147 | 148 | 1.11.8 (2015-07-15) 149 | ------------------- 150 | * simplify dependencies 151 | * Contributors: Vincent Rabaud 152 | 153 | 1.11.7 (2014-12-14) 154 | ------------------- 155 | 156 | 1.11.6 (2014-11-16) 157 | ------------------- 158 | 159 | 1.11.5 (2014-09-21) 160 | ------------------- 161 | 162 | 1.11.4 (2014-07-27) 163 | ------------------- 164 | 165 | 1.11.3 (2014-06-08) 166 | ------------------- 167 | * remove file whose functinality is now in cv_bridge 168 | * remove references to cv (use cv2) 169 | * Correct dependency from non-existent package to cv_bridge 170 | * Contributors: Isaac Isao Saito, Vincent Rabaud 171 | 172 | 1.11.2 (2014-04-28) 173 | ------------------- 174 | 175 | 1.11.1 (2014-04-16) 176 | ------------------- 177 | 178 | 1.11.0 (2014-02-15) 179 | ------------------- 180 | 181 | 1.10.15 (2014-02-07) 182 | -------------------- 183 | 184 | 1.10.14 (2013-11-23 16:17) 185 | -------------------------- 186 | * Contributors: Vincent Rabaud 187 | 188 | 1.10.13 (2013-11-23 09:19) 189 | -------------------------- 190 | * Contributors: Vincent Rabaud 191 | 192 | 1.10.12 (2013-11-22) 193 | -------------------- 194 | * Contributors: Vincent Rabaud 195 | 196 | 1.10.11 (2013-10-23) 197 | -------------------- 198 | * Contributors: Vincent Rabaud 199 | 200 | 1.10.10 (2013-10-19) 201 | -------------------- 202 | * Contributors: Vincent Rabaud 203 | 204 | 1.10.9 (2013-10-07) 205 | ------------------- 206 | * Contributors: Vincent Rabaud 207 | 208 | 1.10.8 (2013-09-09) 209 | ------------------- 210 | * update email address 211 | * Contributors: Vincent Rabaud 212 | 213 | 1.10.7 (2013-07-17) 214 | ------------------- 215 | 216 | 1.10.6 (2013-03-01) 217 | ------------------- 218 | 219 | 1.10.5 (2013-02-11) 220 | ------------------- 221 | 222 | 1.10.4 (2013-02-02) 223 | ------------------- 224 | 225 | 1.10.3 (2013-01-17) 226 | ------------------- 227 | 228 | 1.10.2 (2013-01-13) 229 | ------------------- 230 | 231 | 1.10.1 (2013-01-10) 232 | ------------------- 233 | * fixes `#5 `_ by removing the logic from Python and using wrapped C++ and adding a test for it 234 | * Contributors: Vincent Rabaud 235 | 236 | 1.10.0 (2013-01-03) 237 | ------------------- 238 | 239 | 1.9.15 (2013-01-02) 240 | ------------------- 241 | 242 | 1.9.14 (2012-12-30) 243 | ------------------- 244 | 245 | 1.9.13 (2012-12-15) 246 | ------------------- 247 | 248 | 1.9.12 (2012-12-14) 249 | ------------------- 250 | * Removed brief tag 251 | Conflicts: 252 | opencv_tests/package.xml 253 | * buildtool_depend catkin fix 254 | * Contributors: William Woodall 255 | 256 | 1.9.11 (2012-12-10) 257 | ------------------- 258 | 259 | 1.9.10 (2012-10-04) 260 | ------------------- 261 | 262 | 1.9.9 (2012-10-01) 263 | ------------------ 264 | 265 | 1.9.8 (2012-09-30) 266 | ------------------ 267 | 268 | 1.9.7 (2012-09-28 21:07) 269 | ------------------------ 270 | * add missing stuff 271 | * make sure we find catkin 272 | * Contributors: Vincent Rabaud 273 | 274 | 1.9.6 (2012-09-28 15:17) 275 | ------------------------ 276 | * move the test to where it belongs 277 | * fix the tests and the API to not 
handle conversion from CV_TYPE to Color type (does not make sense) 278 | * make all the tests pass 279 | * comply to the new Catkin API 280 | * backport the C++ test from Fuerte 281 | * Contributors: Vincent Rabaud 282 | 283 | 1.9.5 (2012-09-15) 284 | ------------------ 285 | * remove dependencies to the opencv2 ROS package 286 | * Contributors: Vincent Rabaud 287 | 288 | 1.9.4 (2012-09-13) 289 | ------------------ 290 | 291 | 1.9.3 (2012-09-12) 292 | ------------------ 293 | * update to nosetests 294 | * Contributors: Vincent Rabaud 295 | 296 | 1.9.2 (2012-09-07) 297 | ------------------ 298 | * be more compliant to the latest catkin 299 | * added catkin_project() to cv_bridge, image_geometry, and opencv_tests 300 | * Contributors: Jonathan Binney, Vincent Rabaud 301 | 302 | 1.9.1 (2012-08-28 22:06) 303 | ------------------------ 304 | * remove a deprecated header 305 | * Contributors: Vincent Rabaud 306 | 307 | 1.9.0 (2012-08-28 14:29) 308 | ------------------------ 309 | * cleanup by Jon Binney 310 | * catkinized opencv_tests by Jon Binney 311 | * remove the version check, let's trust OpenCV :) 312 | * revert the removal of opencv2 313 | * finally get rid of opencv2 as it is a system dependency now 314 | * bump REQUIRED version of OpenCV to 2.3.2, which is what's in ros-fuerte-opencv 315 | * switch rosdep name to opencv2, to refer to ros-fuerte-opencv2 316 | * Fixing link lines for gtest against opencv. 317 | * Adding opencv2 to all manifests, so that client packages may 318 | not break when using them. 319 | * baking in opencv debs and attempting a pre-release 320 | * Another hack for prerelease to quiet test failures. 321 | * Dissable a dubious opencv test. Temporary HACK. 322 | * Changing to expect for more verbose failure. 323 | * Minor change to test. 324 | * Making this depend on libopencv-2.3-dev debian available in ros-shadow. 
325 | * mono16 -> bgr conversion tested and fixed in C 326 | * Added Ubuntu platform tags to manifest 327 | * Tuned for parc loop 328 | * Demo of ROS node face detecton 329 | * mono16 support, ticket `#2890 `_ 330 | * Remove use of deprecated rosbuild macros 331 | * cv_bridge split from opencv2 332 | * Name changes for opencv -> vision_opencv 333 | * Validation for image message encoding 334 | * utest changed to reflect rosimgtocv change to imgmsgtocv 335 | * Add opencvpython as empty package 336 | * New methods for cv image conversion 337 | * Disabling tests on OSX, `#2769 `_ 338 | * New Python CvBridge, rewrote C CvBridge, regression test for C and Python CvBridge 339 | * Fix underscore problem, test 8UC3->BGR8, fix 8UC3->BGR8 340 | * New image format 341 | * Image message and CvBridge change 342 | * Rename rows,cols to height,width in Image message 343 | * New node bbc for image testing 344 | * Make executable 345 | * Pong demo 346 | * Missing utest.cpp 347 | * New sensor_msgs::Image message 348 | * Contributors: Vincent Rabaud, ethanrublee, gerkey, jamesb, jamesbowman, pantofaru, vrabaud, wheeler 349 | -------------------------------------------------------------------------------- /opencv_tests/launch/view_img.py: -------------------------------------------------------------------------------- 1 | from launch.legacy.exit_handler import default_exit_handler, restart_exit_handler 2 | from ros2run.api import get_executable_path 3 | 4 | 5 | def launch(launch_descriptor, argv): 6 | ld = launch_descriptor 7 | package = 'image_tools' 8 | ld.add_process( 9 | cmd=[get_executable_path(package_name=package, executable_name='showimage'), 10 | '-t', '/opencv_tests/images'], 11 | name='showimage', 12 | exit_handler=restart_exit_handler, 13 | ) 14 | package = 'opencv_tests' 15 | ld.add_process( 16 | cmd=[get_executable_path(package_name=package, executable_name='source.py')], 17 | name='source.py', 18 | exit_handler=restart_exit_handler, 19 | ) 20 | 21 | return ld 22 | -------------------------------------------------------------------------------- /opencv_tests/mainpage.dox: -------------------------------------------------------------------------------- 1 | /** 2 | \mainpage 3 | \htmlinclude manifest.html 4 | 5 | \b opencv_tests is ... 6 | 7 | 14 | 15 | 16 | \section codeapi Code API 17 | 18 | 28 | 29 | \section rosapi ROS API 30 | 31 | 42 | 43 | 89 | 90 | 91 | 118 | 119 | */ -------------------------------------------------------------------------------- /opencv_tests/opencv_tests/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ros-perception/vision_opencv/27de9ecf9862e6fba509b7e49e3c2511c7d11627/opencv_tests/opencv_tests/__init__.py -------------------------------------------------------------------------------- /opencv_tests/opencv_tests/broadcast.py: -------------------------------------------------------------------------------- 1 | # Software License Agreement (BSD License) 2 | # 3 | # Copyright (c) 2008, Willow Garage, Inc. 4 | # Copyright (c) 2016, Tal Regev. 5 | # Copyright (c) 2018 Intel Corporation. 6 | # All rights reserved. 7 | # 8 | # Redistribution and use in source and binary forms, with or without 9 | # modification, are permitted provided that the following conditions 10 | # are met: 11 | # 12 | # * Redistributions of source code must retain the above copyright 13 | # notice, this list of conditions and the following disclaimer. 
14 | # * Redistributions in binary form must reproduce the above 15 | # copyright notice, this list of conditions and the following 16 | # disclaimer in the documentation and/or other materials provided 17 | # with the distribution. 18 | # * Neither the name of the Willow Garage nor the names of its 19 | # contributors may be used to endorse or promote products derived 20 | # from this software without specific prior written permission. 21 | # 22 | # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS 23 | # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT 24 | # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS 25 | # FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE 26 | # COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, 27 | # INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, 28 | # BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; 29 | # LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 30 | # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT 31 | # LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN 32 | # ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE 33 | # POSSIBILITY OF SUCH DAMAGE. 34 | ##################################################################### 35 | 36 | import sys 37 | import time 38 | import math 39 | import rclpy 40 | import cv2 41 | 42 | import sensor_msgs.msg 43 | from cv_bridge import CvBridge 44 | 45 | 46 | # Send each image by iterate it from given array of files names to a given topic, 47 | # as a regular and compressed ROS Images msgs. 48 | def main(args=None): 49 | if args is None: 50 | args = sys.argv 51 | rclpy.init(args=args) 52 | node = rclpy.create_node("Source") 53 | node_logger = node.get_logger() 54 | 55 | topic = args[1] 56 | filenames = args[2:] 57 | pub_img = node.create_publisher(sensor_msgs.msg.Image, topic) 58 | pub_img_compressed = node.create_publisher(sensor_msgs.msg.CompressedImage, topic + "/compressed") 59 | 60 | time.sleep(1.0) 61 | cvb = CvBridge() 62 | 63 | while rclpy.ok(): 64 | try: 65 | cvim = cv2.imread(filenames[0]) 66 | pub_img.publish(cvb.cv2_to_imgmsg(cvim)) 67 | pub_img_compressed.publish(cvb.cv2_to_compressed_imgmsg(cvim)) 68 | filenames = filenames[1:] + [filenames[0]] 69 | time.sleep(1) 70 | except KeyboardInterrupt: 71 | node_logger.info("shutting down: keyboard interrupt") 72 | break 73 | 74 | node_logger.info("test_completed") 75 | node.destroy_node() 76 | rclpy.shutdown() 77 | 78 | if __name__ == '__main__': 79 | main() 80 | -------------------------------------------------------------------------------- /opencv_tests/opencv_tests/rosfacedetect.py: -------------------------------------------------------------------------------- 1 | """ 2 | This program is demonstration for face and object detection using haar-like features. 3 | The program finds faces in a camera image or video stream and displays a red box around them. 4 | 5 | Original C implementation by: ? 6 | Python implementation by: Roman Stanchak, James Bowman 7 | Updated: Copyright (c) 2016, Tal Regev. 
8 | """ 9 | 10 | import sys 11 | import os 12 | from optparse import OptionParser 13 | 14 | import rclpy 15 | import sensor_msgs.msg 16 | from cv_bridge import CvBridge 17 | import cv2 18 | import numpy 19 | 20 | # Parameters for haar detection 21 | # From the API: 22 | # The default parameters (scale_factor=2, min_neighbors=3, flags=0) are tuned 23 | # for accurate yet slow object detection. For a faster operation on real video 24 | # images the settings are: 25 | # scale_factor=1.2, min_neighbors=2, flags=CV_HAAR_DO_CANNY_PRUNING, 26 | # min_size= 2 | 3 | 4 | opencv_tests 5 | 4.1.0 6 | 7 | OpenCV tests for the Python and C++ implementations of CvBridge with Image message in ROS2. 8 | 9 | Kenji Brameld 10 | BSD 11 | http://wiki.ros.org/opencv_tests 12 | 13 | James Bowman 14 | Ethan Gao 15 | 16 | rclpy 17 | 18 | launch 19 | rclpy 20 | sensor_msgs 21 | cv_bridge 22 | 23 | 24 | ament_python 25 | 26 | 27 | -------------------------------------------------------------------------------- /opencv_tests/resource/opencv_tests: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ros-perception/vision_opencv/27de9ecf9862e6fba509b7e49e3c2511c7d11627/opencv_tests/resource/opencv_tests -------------------------------------------------------------------------------- /opencv_tests/setup.cfg: -------------------------------------------------------------------------------- 1 | [develop] 2 | script_dir=$base/lib/opencv_tests 3 | [install] 4 | install_scripts=$base/lib/opencv_tests 5 | -------------------------------------------------------------------------------- /opencv_tests/setup.py: -------------------------------------------------------------------------------- 1 | from setuptools import find_packages 2 | from setuptools import setup 3 | 4 | package_name = 'opencv_tests' 5 | 6 | setup( 7 | name=package_name, 8 | version='4.1.0', 9 | packages=find_packages(exclude=['launch']), 10 | data_files=[ 11 | ('share/ament_index/resource_index/packages', 12 | ['resource/' + package_name]), 13 | ('share/' + package_name, ['package.xml']), 14 | ], 15 | install_requires=['setuptools'], 16 | zip_safe=True, 17 | author='Ethan Gao', 18 | author_email='ethan.gao@linux.intel.com', 19 | maintainer='Kenji Brameld', 20 | maintainer_email='kenjibrameld@gmail.com', 21 | keywords=['ROS'], 22 | classifiers=[ 23 | 'Intended Audience :: Developers', 24 | 'License :: OSI Approved :: Apache Software License', 25 | 'Programming Language :: Python', 26 | 'Topic :: Software Development', 27 | ], 28 | description=( 29 | 'opencv tests using cv_bridge and ros2 node implementation' 30 | ), 31 | license='Apache License, Version 2.0', 32 | tests_require=['pytest'], 33 | entry_points={ 34 | 'console_scripts': [ 35 | 'source = opencv_tests.source:main', 36 | 'broadcast = opencv_tests.broadcast:main', 37 | 'rosfacedetect = opencv_tests.rosfacedetect:main', 38 | ], 39 | }, 40 | ) 41 | -------------------------------------------------------------------------------- /pytest.ini: -------------------------------------------------------------------------------- 1 | [pytest] 2 | junit_family=xunit2 3 | -------------------------------------------------------------------------------- /vision_opencv/CHANGELOG.rst: -------------------------------------------------------------------------------- 1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 2 | Changelog for package vision_opencv 3 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 4 | 5 | 4.1.0 (2024-04-19) 6 | ------------------ 7 | 8 | 4.0.0 (2024-04-13) 9 | 
------------------ 10 | 11 | 3.4.0 (2022-10-03) 12 | ------------------ 13 | 14 | 3.3.1 (2022-09-21) 15 | ------------------ 16 | 17 | 3.3.0 (2022-09-14) 18 | ------------------ 19 | * Add apache license and bsd license, because both are used. (`#479 `_) 20 | * Reorganize author tag (`#460 `_) 21 | * Update maintainers (`#451 `_) 22 | * Contributors: Kenji Brameld 23 | 24 | 3.0.3 (2022-04-01) 25 | ------------------ 26 | 27 | 3.0.2 (2022-01-27) 28 | ------------------ 29 | * Minor cleanups to the ROS 2 branch (`#418 `_) 30 | * Contributors: Chris Lalancette 31 | 32 | 3.0.1 (2022-01-25) 33 | ------------------ 34 | 35 | 3.0.0 (2022-01-19) 36 | ------------------ 37 | 38 | 2.2.1 (2020-07-16) 39 | ------------------ 40 | 41 | 2.2.0 (2020-05-27) 42 | ------------------ 43 | 44 | 2.1.3 (2019-10-23) 45 | ------------------ 46 | 47 | 2.1.2 (2019-05-30) 48 | ------------------ 49 | 50 | 2.1.1 (2019-04-26) 51 | ------------------ 52 | 53 | 2.1.0 (2018-11-26) 54 | ------------------ 55 | 56 | 2.0.5 (2018-08-17) 57 | ------------------ 58 | 59 | 2.0.4 (2018-08-14) 60 | ------------------ 61 | 62 | 2.0.3 (2018-08-07) 63 | ------------------ 64 | 65 | 2.0.2 (2018-06-29) 66 | ------------------ 67 | 68 | 2.0.1 (2018-06-28) 69 | ------------------ 70 | 71 | 2.0.0 (2018-05-31) 72 | ------------------- 73 | 74 | 1.12.7 (2017-11-12) 75 | ------------------- 76 | 77 | 1.12.6 (2017-11-11) 78 | ------------------- 79 | 80 | 1.12.5 (2017-11-05) 81 | ------------------- 82 | 83 | 1.12.4 (2017-01-29) 84 | ------------------- 85 | 86 | 1.12.3 (2016-12-04) 87 | ------------------- 88 | 89 | 1.12.2 (2016-09-24) 90 | ------------------- 91 | 92 | 1.12.1 (2016-07-11) 93 | ------------------- 94 | 95 | 1.12.0 (2016-03-18) 96 | ------------------- 97 | * remove opencv_apps from vision_opencv 98 | * Contributors: Vincent Rabaud 99 | 100 | 1.11.12 (2016-03-10) 101 | -------------------- 102 | 103 | 1.11.11 (2016-01-31) 104 | -------------------- 105 | 106 | 1.11.10 (2016-01-16) 107 | -------------------- 108 | 109 | 1.11.9 (2015-11-29) 110 | ------------------- 111 | * Add opencv_apps to vision_opencv dependency 112 | * Contributors: Ryohei Ueda 113 | 114 | 1.11.8 (2015-07-15) 115 | ------------------- 116 | 117 | 1.11.7 (2014-12-14) 118 | ------------------- 119 | 120 | 1.11.6 (2014-11-16) 121 | ------------------- 122 | 123 | 1.11.5 (2014-09-21) 124 | ------------------- 125 | 126 | 1.11.4 (2014-07-27) 127 | ------------------- 128 | 129 | 1.11.3 (2014-06-08) 130 | ------------------- 131 | 132 | 1.11.2 (2014-04-28) 133 | ------------------- 134 | 135 | 1.11.1 (2014-04-16) 136 | ------------------- 137 | 138 | 1.11.0 (2014-02-15) 139 | ------------------- 140 | 141 | 1.10.15 (2014-02-07) 142 | -------------------- 143 | 144 | 1.10.14 (2013-11-23 16:17) 145 | -------------------------- 146 | * Contributors: Vincent Rabaud 147 | 148 | 1.10.13 (2013-11-23 09:19) 149 | -------------------------- 150 | * Contributors: Vincent Rabaud 151 | 152 | 1.10.12 (2013-11-22) 153 | -------------------- 154 | * Contributors: Vincent Rabaud 155 | 156 | 1.10.11 (2013-10-23) 157 | -------------------- 158 | * Contributors: Vincent Rabaud 159 | 160 | 1.10.10 (2013-10-19) 161 | -------------------- 162 | * Contributors: Vincent Rabaud 163 | 164 | 1.10.9 (2013-10-07) 165 | ------------------- 166 | * Contributors: Vincent Rabaud 167 | 168 | 1.10.8 (2013-09-09) 169 | ------------------- 170 | * update email address 171 | * Contributors: Vincent Rabaud 172 | 173 | 1.10.7 (2013-07-17) 174 | ------------------- 175 | * update to REP 0127 176 | * 
Contributors: Vincent Rabaud 177 | 178 | 1.10.6 (2013-03-01) 179 | ------------------- 180 | 181 | 1.10.5 (2013-02-11) 182 | ------------------- 183 | 184 | 1.10.4 (2013-02-02) 185 | ------------------- 186 | 187 | 1.10.3 (2013-01-17) 188 | ------------------- 189 | 190 | 1.10.2 (2013-01-13) 191 | ------------------- 192 | 193 | 1.10.1 (2013-01-10) 194 | ------------------- 195 | 196 | 1.10.0 (2013-01-03) 197 | ------------------- 198 | 199 | 1.9.15 (2013-01-02) 200 | ------------------- 201 | 202 | 1.9.14 (2012-12-30) 203 | ------------------- 204 | 205 | 1.9.13 (2012-12-15) 206 | ------------------- 207 | 208 | 1.9.12 (2012-12-14) 209 | ------------------- 210 | 211 | 1.9.11 (2012-12-10) 212 | ------------------- 213 | 214 | 1.9.10 (2012-10-04) 215 | ------------------- 216 | * the CMake file is useless 217 | * add the missing CMake file 218 | * re-add the meta-package 219 | * Contributors: Vincent Rabaud 220 | 221 | 1.9.9 (2012-10-01) 222 | ------------------ 223 | 224 | 1.9.8 (2012-09-30) 225 | ------------------ 226 | 227 | 1.9.7 (2012-09-28 21:07) 228 | ------------------------ 229 | 230 | 1.9.6 (2012-09-28 15:17) 231 | ------------------------ 232 | 233 | 1.9.5 (2012-09-15) 234 | ------------------ 235 | 236 | 1.9.4 (2012-09-13) 237 | ------------------ 238 | 239 | 1.9.3 (2012-09-12) 240 | ------------------ 241 | 242 | 1.9.2 (2012-09-07) 243 | ------------------ 244 | 245 | 1.9.1 (2012-08-28 22:06) 246 | ------------------------ 247 | 248 | 1.9.0 (2012-08-28 14:29) 249 | ------------------------ 250 | -------------------------------------------------------------------------------- /vision_opencv/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | cmake_minimum_required(VERSION 3.5) 2 | 3 | project(vision_opencv) 4 | 5 | find_package(ament_cmake REQUIRED) 6 | 7 | ament_package() 8 | -------------------------------------------------------------------------------- /vision_opencv/package.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | vision_opencv 5 | 4.1.0 6 | Packages for interfacing ROS2 with OpenCV, a library of programming functions for real time computer vision. 7 | Kenji Brameld 8 | Apache License 2.0 9 | BSD 10 | 11 | http://www.ros.org/wiki/vision_opencv 12 | https://github.com/ros-perception/vision_opencv/issues 13 | https://github.com/ros-perception/vision_opencv 14 | 15 | Patrick Mihelich 16 | James Bowman 17 | Vincent Rabaud 18 | Ethan Gao 19 | 20 | ament_cmake 21 | 22 | cv_bridge 23 | image_geometry 24 | 25 | 26 | ament_cmake 27 | 28 | 29 | 30 | --------------------------------------------------------------------------------