├── .gitignore ├── LICENSE ├── README.md ├── __init__.py ├── camera_calibration.py └── examples ├── example_images ├── asymmetric_grid │ ├── Image__2018-02-12__15-11-38.png │ ├── Image__2018-02-12__15-13-40.png │ ├── Image__2018-02-12__15-14-01.png │ ├── Image__2018-02-12__15-14-55.png │ ├── Image__2018-02-12__15-15-21.png │ ├── Image__2018-02-12__15-15-55.png │ ├── Image__2018-02-12__15-16-06.png │ ├── Image__2018-02-12__15-16-18.png │ ├── Image__2018-02-12__15-16-39.png │ └── Image__2018-02-12__15-17-08.png ├── chessboard │ ├── left01.jpg │ ├── left02.jpg │ ├── left03.jpg │ ├── left04.jpg │ ├── left05.jpg │ ├── left06.jpg │ ├── left07.jpg │ ├── left08.jpg │ ├── left09.jpg │ ├── left11.jpg │ ├── left12.jpg │ ├── left13.jpg │ └── left14.jpg └── symmetric_grid │ ├── Image__2018-02-14__10-12-45.png │ ├── Image__2018-02-14__10-13-32.png │ ├── Image__2018-02-14__10-13-57.png │ ├── Image__2018-02-14__10-14-10.png │ ├── Image__2018-02-14__10-14-24.png │ ├── Image__2018-02-14__10-14-42.png │ ├── Image__2018-02-14__10-15-01.png │ ├── Image__2018-02-14__10-15-22.png │ ├── Image__2018-02-14__10-15-40.png │ ├── Image__2018-02-14__10-16-00.png │ ├── Image__2018-02-14__10-16-32.png │ ├── Image__2018-02-14__10-17-16.png │ ├── Image__2018-02-14__10-17-32.png │ ├── Image__2018-02-14__10-17-53.png │ ├── Image__2018-02-14__10-18-04.png │ ├── Image__2018-02-14__10-18-16.png │ ├── Image__2018-02-14__10-18-29.png │ ├── Image__2018-02-14__10-18-40.png │ ├── Image__2018-02-14__10-19-03.png │ ├── Image__2018-02-14__10-19-14.png │ ├── Image__2018-02-14__10-19-33.png │ ├── Image__2018-02-14__10-19-50.png │ ├── Image__2018-02-14__10-20-22.png │ ├── Image__2018-02-14__10-20-58.png │ └── Image__2018-02-14__10-21-12.png └── example_notebooks ├── asymmetric_grid_calibration.ipynb ├── chessboard_calibration.ipynb └── symmertric_grid_calibration.ipynb /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | env/ 12 | build/ 13 | develop-eggs/ 14 | dist/ 15 | downloads/ 16 | eggs/ 17 | .eggs/ 18 | lib/ 19 | lib64/ 20 | parts/ 21 | sdist/ 22 | var/ 23 | wheels/ 24 | *.egg-info/ 25 | .installed.cfg 26 | *.egg 27 | 28 | # PyInstaller 29 | # Usually these files are written by a python script from a template 30 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 
31 | *.manifest 32 | *.spec 33 | 34 | # Installer logs 35 | pip-log.txt 36 | pip-delete-this-directory.txt 37 | 38 | # Unit test / coverage reports 39 | htmlcov/ 40 | .tox/ 41 | .coverage 42 | .coverage.* 43 | .cache 44 | nosetests.xml 45 | coverage.xml 46 | *.cover 47 | .hypothesis/ 48 | 49 | # Translations 50 | *.mo 51 | *.pot 52 | 53 | # Django stuff: 54 | *.log 55 | local_settings.py 56 | 57 | # Flask stuff: 58 | instance/ 59 | .webassets-cache 60 | 61 | # Scrapy stuff: 62 | .scrapy 63 | 64 | # Sphinx documentation 65 | docs/_build/ 66 | 67 | # PyBuilder 68 | target/ 69 | 70 | # Jupyter Notebook 71 | .ipynb_checkpoints 72 | 73 | # pyenv 74 | .python-version 75 | 76 | # celery beat schedule file 77 | celerybeat-schedule 78 | 79 | # SageMath parsed files 80 | *.sage.py 81 | 82 | # dotenv 83 | .env 84 | 85 | # virtualenv 86 | .venv 87 | venv/ 88 | ENV/ 89 | 90 | # Spyder project settings 91 | .spyderproject 92 | .spyproject 93 | 94 | # Rope project settings 95 | .ropeproject 96 | 97 | # mkdocs documentation 98 | /site 99 | 100 | # mypy 101 | .mypy_cache/ 102 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 
47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. 
You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # camera_calibration_API 2 | 3 | **A repository containing the camera calibration API** 4 | 5 | ### Repository Overview: 6 | 7 | [camera_calibration.py](./camera_calibration.py):contains an API which tries to minic the MATLAB's camera calibration app functionality. This API is a thin wrapper around the opencv's camera calibration functionalities. 
8 | 
9 | [examples](./examples): A directory containing various examples 
10 | 
11 | 
12 | ### Camera_Calibration_API: 
13 | 
14 | #### Introduction: 
15 | The Camera Calibration API is a wrapper around opencv's camera calibration functionalities. It tries to mimic the MATLAB camera calibration app's functionality in `python`. The API supports all 3 calibration patterns supported by opencv, namely: **Chessboards**, **Asymmetric circular grids** and **Symmetric circular grids.** The API runs on 4 threads by default for speed-up. The speed-up may not be significant for **chessboard** calibration, because in most cases the bottleneck is a single hard-to-detect chessboard image that is still processed on a single core. 
16 | 
17 | #### Dependencies: 
18 | * `works in both python-3 and python-2` 
19 | * `opencv (Tested in version 3.3.0)` 
20 | * `numpy` 
21 | * `matplotlib` 
22 | * `pickle` 
23 | * `argparse` 
24 | * `glob` 
26 | * `multiprocessing` 
27 | * `os` 
28 | * `pandas` 
29 | 
30 | #### Example: 
31 | Examples of using the Camera_Calibration_API() for calibration with chessboards, symmetric circular grids and asymmetric circular grids can be found in the [example_notebooks](./examples/example_notebooks) folder 
32 | 
33 | #### Features: 
34 | * Supports all 3 calibration patterns supported by opencv: **Chessboards**, **Asymmetric circular grids** and **Symmetric circular grids.** 
35 | * Additionally, a **custom** calibration pattern can also be implemented. (See the next section for how to calibrate using a custom pattern.) 
36 | * Visualizes the **Reprojection error plot** 
37 | * Ability to **Recalibrate** the camera by neglecting the images with very high reprojection errors. 
38 | * **Camera centric and Pattern centric** views can be visualized using the `visualize_calibration_boards` method after calibration. 
39 | * `Blob detection parameters` for detecting asymmetric and symmetric circular grids can be accessed and modified via the **Camera_Calibration_API object** prior to calling the `calibrate_camera` method 
40 | * Also has `terminal` support with **minimal control** over the variables. Use it as an importable module for better control over the variables 
41 | * Can also be easily extended to support other, unimplemented calibration patterns 
42 | 
43 | #### Using a custom calibration board with the Camera_Calibration_API 
44 | 
45 | So you want to extend the API for a custom calibration pattern? Well... OK! Just follow the steps below 
46 | 
47 | * The `calibrate_camera` method accepts two additional arguments called `custom_world_points_function` and `custom_image_points_function`. 
48 | * You must implement these two custom functions and pass them as arguments to the `calibrate_camera` method 
49 | 
50 | 
51 | 
52 | ##### custom_world_points_function(pattern_rows,pattern_columns): 
53 | 
54 | * This function is responsible for calculating the 3-D world points of the given custom calibration pattern. 
55 | * Should take in two keyword arguments in the following order: Number of rows in pattern (int), Number of columns in pattern (int) 
56 | * Must return a single numpy array of shape (M,3) and type np.float32 or np.float64, with M being the number of control points of the custom calibration pattern. The last column of the array (the z axis) should be all zeros. 
57 | * The distance_in_world_units is not multiplied in this case. Hence, account for that inside the function before returning.
58 | * The world points must be ordered row by row, left to right within each row 
59 | 
60 | ##### custom_image_points_function(img,pattern_rows,pattern_columns): 
61 | 
62 | * This function is responsible for finding the 2-D image points in a custom calibration image. 
63 | * Should take in 3 keyword arguments in the following order: image (numpy array), Number of rows in pattern (int), Number of columns in pattern (int) 
64 | * Must return 2 variables: return_value, image_points 
65 | * The first one is a boolean representing whether all the control points in the calibration image were found 
66 | * The second one is a numpy array of shape (N,2) and type np.float32 containing the pixel coordinates (image points) of the control points, where N is the number of control points. 
67 | * This function should return True only if all the control points are detected (M = N) 
68 | * If all the control points are not detected, fill the 2-D numpy array entirely with zeros and return False for the boolean. 
69 | * The custom image points must be ordered row by row, left to right within each row 
70 | 
71 | **NOTE: The 'custom' pattern is not supported when the API is accessed from the terminal.** A minimal, hypothetical sketch of both callback functions is shown below, after the asymmetric-grid notes. 
72 | 
73 | 
74 | 
75 | #### Supported Calibration patterns (rows x columns) by default: 
76 | 
77 | ##### Chessboard or Checkerboard pattern (6 x 9): 
78 | ![chessboard](https://raw.githubusercontent.com/LongerVision/OpenCV_Examples/master/markers/pattern_chessboard.png) 
79 | 
80 | ##### Asymmetrical circular grid/pattern (4 x 11): 
81 | ![Asymmetric circular grid](https://raw.githubusercontent.com/LongerVision/OpenCV_Examples/master/markers/pattern_acircles.png) 
82 | 
83 | #### NOTE for calibrating using an Asymmetric circular grid: 
84 | * The code assumes that each asymmetric circle is placed at half the `distance_in_world_units` in both (x,y) from its neighbours. 
85 | 
86 | * The `distance_in_world_units` is specified as the distance between 2 adjacent circle centers at the **same y coordinate** 
87 | 
88 | * The above is a **4 x 11 (r x c)** asymmetrical circular grid. 
89 | 
90 | * If you are using the same orientation as the above, then this orientation is termed **double_count_in_column**, which is set to `True` by default. 
91 | 
92 | * If you are using an orientation rotated 90 degrees from the above, i.e. **11 x 4 (r x c)**, then the `double count` is along the **rows**. In this case, set `object.double_count_in_column = False` prior to calling the `object.calibrate_camera` method.
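
For reference, here is a minimal sketch of the two custom-pattern callbacks described above and how they would be passed to `calibrate_camera`. Everything specific in it is an illustrative assumption rather than part of this repository: the 5 x 7 grid size, the 20 mm spacing, the reuse of OpenCV's chessboard finder as a stand-in detector, and the `images_path_list` glob.

```python
import glob
import cv2
import numpy as np

# assumes the repository root is on PYTHONPATH
from camera_calibration import Camera_Calibration_API

ROWS, COLS, SPACING_MM = 5, 7, 20.0  # hypothetical pattern geometry

def custom_world_points_function(pattern_rows, pattern_columns):
    # Return an (M, 3) float32 array, ordered row by row, left to right,
    # with the z column all zeros. distance_in_world_units is NOT applied
    # by the API for custom patterns, so the physical spacing is applied here.
    x, y = np.meshgrid(range(pattern_columns), range(pattern_rows))
    pts = np.stack([x.ravel(), y.ravel(), np.zeros(x.size)], axis=1)
    return (pts * SPACING_MM).astype(np.float32)

def custom_image_points_function(img, pattern_rows, pattern_columns):
    # img is the grayscale calibration image loaded by the API.
    # OpenCV's chessboard detector stands in here for a real custom detector.
    n = pattern_rows * pattern_columns
    found, corners = cv2.findChessboardCorners(img, (pattern_columns, pattern_rows))
    if not found or corners is None or len(corners) != n:
        # Every control point must be found; otherwise return zeros and False.
        return False, np.zeros((n, 2), dtype=np.float32)
    return True, corners.reshape(n, 2).astype(np.float32)

images_path_list = glob.glob("./my_custom_pattern_images/*.png")  # hypothetical path

calib = Camera_Calibration_API(pattern_type="custom",
                               pattern_rows=ROWS,
                               pattern_columns=COLS)
results = calib.calibrate_camera(images_path_list,
                                 custom_world_points_function=custom_world_points_function,
                                 custom_image_points_function=custom_image_points_function)
```

The ordering of the two returned arrays must match (row by row, left to right), because the i-th world point is paired with the i-th image point when `cv2.calibrateCamera` is called internally.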
93 | 94 | ##### Symmetric circular grid/pattern (7 x 6): 95 | ![Symmetrical circular pattern](http://answers.opencv.org/upfiles/13785495544653926.jpg) 96 | -------------------------------------------------------------------------------- /__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jankozik/camera_calibration_PythonAPI/e8b52b8ab3fcf1c79e6ac1758a94fd937492a3ec/__init__.py -------------------------------------------------------------------------------- /camera_calibration.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # -*- coding: utf-8 -*- 3 | """ 4 | Created on Tue Feb 13 11:03:39 2018 5 | 6 | @author: abhijit 7 | """ 8 | 9 | # Python 2/3 compatibility 10 | from __future__ import print_function 11 | # this is important else throws error 12 | from mpl_toolkits.mplot3d import Axes3D 13 | 14 | from matplotlib import cm 15 | from numpy import linspace 16 | 17 | import numpy as np 18 | import cv2 19 | import matplotlib.pyplot as plt 20 | # local modules 21 | 22 | # built-in modules 23 | import os 24 | from multiprocessing.dummy import Pool as ThreadPool 25 | import argparse 26 | from argparse import RawTextHelpFormatter 27 | import glob 28 | import pickle 29 | import pandas as pd 30 | 31 | 32 | 33 | class Camera_Calibration_API: 34 | """ A complete API to calibrate camera with chessboard or symmetric_circles or asymmetric_circles. 35 | also runs on multi-threads 36 | 37 | Constructor keyword arguments: 38 | pattern_type --str: One of ['chessboard','symmetric_circles,'asymmetric_circles','custom'] (No default) 39 | pattern_rows --int: Number of pattern points along row (No default) 40 | pattern_columns --int: Number of pattern points along column (No default) 41 | distance_in_world_units --float: The distance between pattern points in any world unit. (Default 1.0) 42 | figsize: To set the figure size of the matplotlib.pyplot (Default (8,8)) 43 | debug_dir --str: Optional path to a directory to save the images (Default None) 44 | The images include : 45 | 1.Points visulized on the calibration board 46 | 2.Reprojection error plot 47 | 3.Pattern centric and camera centric views of the calibration board 48 | term_criteria: The termination criteria for the subpixel refinement (Default: (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_COUNT, 30, 0.001)) 49 | 50 | """ 51 | 52 | def __init__(self, 53 | pattern_type, 54 | pattern_rows, 55 | pattern_columns, 56 | distance_in_world_units = 1.0, 57 | figsize = (8,8), 58 | debug_dir = None, 59 | term_criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_COUNT, 30, 0.001) 60 | ): 61 | 62 | pattern_types = ["chessboard","symmetric_circles","asymmetric_circles","custom"] 63 | 64 | assert pattern_type in pattern_types, "pattern type must be one of {}".format(pattern_types) 65 | 66 | self.pattern_type = pattern_type 67 | self.pattern_rows = pattern_rows 68 | self.pattern_columns = pattern_columns 69 | self.distance_in_world_units = distance_in_world_units 70 | self.figsize = figsize 71 | self.debug_dir = debug_dir 72 | self.term_criteria = term_criteria 73 | self.subpixel_refinement = True #turn on or off subpixel refinement 74 | # on for chessboard 75 | # off for circular objects 76 | # set accordingly for custom pattern 77 | # NOTE: turining on subpixel refinement for circles gives a very high 78 | # reprojection error. 
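        # Hypothetical usage note (illustrative only, not executed here): for a
        # "custom" pattern this flag stays True, so a blob-like custom pattern
        # should switch refinement off after construction, e.g.
        #     calib = Camera_Calibration_API("custom", 4, 11)
        #     calib.subpixel_refinement = False
        # while corner-like custom patterns can keep the default.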
79 | if self.pattern_type in ["asymmetric_circles","symmetric_circles"]: 80 | self.subpixel_refinement = False 81 | self.use_clustering = True 82 | # Setup Default SimpleBlobDetector parameters. 83 | self.blobParams = cv2.SimpleBlobDetector_Params() 84 | # Change thresholds 85 | self.blobParams.minThreshold = 8 86 | self.blobParams.maxThreshold = 255 87 | # Filter by Area. 88 | self.blobParams.filterByArea = True 89 | self.blobParams.minArea = 50 # minArea may be adjusted to suit for your experiment 90 | self.blobParams.maxArea = 10e5 # maxArea may be adjusted to suit for your experiment 91 | # Filter by Circularity 92 | self.blobParams.filterByCircularity = True 93 | self.blobParams.minCircularity = 0.8 94 | # Filter by Convexity 95 | self.blobParams.filterByConvexity = True 96 | self.blobParams.minConvexity = 0.87 97 | # Filter by Inertia 98 | self.blobParams.filterByInertia = True 99 | self.blobParams.minInertiaRatio = 0.01 100 | if self.pattern_type == "asymmetric_circles": 101 | self.double_count_in_column = True # count the double circles in asymmetrical circular grid along the column 102 | 103 | if self.debug_dir and not os.path.isdir(self.debug_dir): 104 | os.mkdir(self.debug_dir) 105 | 106 | print("The Camera Calibration API is initialized and ready for calibration...") 107 | 108 | @staticmethod 109 | def _splitfn(fn): 110 | path, fn = os.path.split(fn) 111 | name, ext = os.path.splitext(fn) 112 | return path, name, ext 113 | 114 | 115 | def _symmetric_world_points(self): 116 | x,y = np.meshgrid(range(self.pattern_columns),range(self.pattern_rows)) 117 | prod = self.pattern_rows * self.pattern_columns 118 | pattern_points=np.hstack((x.reshape(prod,1),y.reshape(prod,1),np.zeros((prod,1)))).astype(np.float32) 119 | return(pattern_points) 120 | 121 | def _asymmetric_world_points(self): 122 | pattern_points = [] 123 | if self.double_count_in_column: 124 | for i in range(self.pattern_rows): 125 | for j in range(self.pattern_columns): 126 | x = j/2 127 | if j%2 == 0: 128 | y = i 129 | else: 130 | y = i + 0.5 131 | pattern_points.append((x,y)) 132 | else: 133 | for i in range(self.pattern_rows): 134 | for j in range(self.pattern_columns): 135 | y = i/2 136 | if i%2 == 0: 137 | x = j 138 | else: 139 | x = j + 0.5 140 | 141 | pattern_points.append((x,y)) 142 | 143 | pattern_points = np.hstack((pattern_points,np.zeros((self.pattern_rows*self.pattern_columns,1)))).astype(np.float32) 144 | return(pattern_points) 145 | 146 | def _chessboard_image_points(self,img): 147 | found, corners = cv2.findChessboardCorners(img,(self.pattern_columns,self.pattern_rows)) 148 | return(found,corners) 149 | 150 | def _circulargrid_image_points(self,img,flags,blobDetector): 151 | found, corners = cv2.findCirclesGrid(img,(self.pattern_columns,self.pattern_rows), 152 | flags=flags, 153 | blobDetector=blobDetector 154 | ) 155 | 156 | return(found,corners) 157 | 158 | def _calc_reprojection_error(self,figure_size=(8,8),save_dir=None): 159 | """ 160 | Util function to Plot reprojection error 161 | """ 162 | reprojection_error = [] 163 | for i in range(len(self.calibration_df)): 164 | imgpoints2, _ = cv2.projectPoints(self.calibration_df.obj_points[i], self.calibration_df.rvecs[i], self.calibration_df.tvecs[i], self.camera_matrix, self.dist_coefs) 165 | temp_error = cv2.norm(self.calibration_df.img_points[i],imgpoints2, cv2.NORM_L2)/len(imgpoints2) 166 | reprojection_error.append(temp_error) 167 | self.calibration_df['reprojection_error'] = pd.Series(reprojection_error) 168 | avg_error = 
np.sum(np.array(reprojection_error))/len(self.calibration_df.obj_points) 169 | x = [os.path.basename(p) for p in self.calibration_df.image_names] 170 | y_mean = [avg_error]*len(self.calibration_df.image_names) 171 | fig,ax = plt.subplots() 172 | fig.set_figwidth(figure_size[0]) 173 | fig.set_figheight(figure_size[1]) 174 | # Plot the data 175 | ax.scatter(x,reprojection_error,label='Reprojection error', marker='o') #plot before 176 | # Plot the average line 177 | ax.plot(x,y_mean, label='Mean Reprojection error', linestyle='--') 178 | # Make a legend 179 | ax.legend(loc='upper right') 180 | for tick in ax.get_xticklabels(): 181 | tick.set_rotation(90) 182 | # name x and y axis 183 | ax.set_title("Reprojection_error plot") 184 | ax.set_xlabel("Image_names") 185 | ax.set_ylabel("Reprojection error in pixels") 186 | 187 | if save_dir: 188 | plt.savefig(os.path.join(save_dir,"reprojection_error.png")) 189 | 190 | plt.show() 191 | print("The Mean Reprojection Error in pixels is: {}".format(avg_error)) 192 | 193 | 194 | def calibrate_camera(self, 195 | images_path_list, 196 | threads = 4, 197 | custom_world_points_function=None, 198 | custom_image_points_function=None, 199 | ): 200 | 201 | """ User facing method to calibrate the camera 202 | 203 | Keyword arguments 204 | 205 | images_path_list: A list containing full paths to calibration images (No default) 206 | threads --int: Number of threads to run the calibration (Default 4) 207 | custom_world_points_function --function: Must be given if pattern_type="custom", else leave at default (Default None) 208 | custom_image_points_function --function: Must be given if the patter_type="custom", else leave at default (Default None) 209 | 210 | A Note on custom_world_points_function() and custom_image_points_function() 211 | 212 | * custom_world_points_function(pattern_rows,pattern_columns): 213 | 214 | 1) This function is responsible for calculating the 3-D world points of the given custom calibration pattern. 215 | 2) Should take in two keyword arguments in the following order: Number of rows in pattern(int), Number of columns in pattern(int) 216 | 3) Must return only a single numpy array of shape (M,3) and type np.float32 or np.float64 with M being the number of control points 217 | of the custom calibration pattern. The last column of the array (z axis) should be an array of 0 218 | 4) The distance_in_world_units is not multiplied in this case. Hence, account for that inside the function before returning 219 | 5) The world points must be ordered in this specific order : row by row, left to right in every row 220 | 221 | * custom_image_points_function(img,pattern_rows,pattern_columns): 222 | 223 | 1) This function is responsible for finding the 2-D image points from the custom calibration image. 224 | 2) Should take in 3 keyword arguments in the following order: image(numpy array),Number of rows in pattern(int), Number of columns in pattern(int) 225 | 3) This must return 2 variables: return_value, image_points 226 | 4) The first one is a boolean Representing whether all the control points in the calibration images are found 227 | 5) The second one is a numpy array of shape (N,2) of type np.float32 containing the pixel coordinates or the image points of the control points. 228 | where N is the number of control points. 229 | 6) This function should return True only if all the control points are detected (M = N) 230 | 7) If all the control points are not detected, fillup the 2-D numpy array with 0s entirely and return with bool == False. 
231 | 232 | 233 | OUTPUT 234 | Prints: 235 | The calibration log 236 | plots the reprojection error plot 237 | 238 | Returns: 239 | A dictionary with the follwing keys: 240 | return_value of cv2.calibrate_camera --key:'rms' 241 | camera intrinsic matrix --key: 'intrinsic_matrix' 242 | distortion coeffs --key: 'distortion_coefficients' 243 | 244 | Saves: 245 | Optionally saves the following images if debug directory is specified in the constructor 246 | 1.Points visulized on the calibration board 247 | 2.Reprojection error plot 248 | 249 | """ 250 | 251 | if self.pattern_type == "custom": 252 | assert custom_world_points_function is not None, "Must implement a custom_world_points_function for 'custom' pattern " 253 | assert custom_image_points_function is not None, "Must implement a custom_image_points_function for 'custom' pattern" 254 | 255 | # initialize place holders 256 | img_points = [] 257 | obj_points = [] 258 | working_images = [] 259 | images_path_list.sort() 260 | print("There are {} {} images given for calibration".format(len(images_path_list),self.pattern_type)) 261 | 262 | if self.pattern_type == "chessboard": 263 | pattern_points = self._symmetric_world_points() * self.distance_in_world_units 264 | 265 | elif self.pattern_type == "symmetric_circles": 266 | pattern_points = self._symmetric_world_points() * self.distance_in_world_units 267 | blobDetector = cv2.SimpleBlobDetector_create(self.blobParams) 268 | flags = cv2.CALIB_CB_SYMMETRIC_GRID 269 | if self.use_clustering: 270 | flags = cv2.CALIB_CB_SYMMETRIC_GRID + cv2.CALIB_CB_CLUSTERING 271 | 272 | elif self.pattern_type == "asymmetric_circles": 273 | pattern_points = self._asymmetric_world_points() * self.distance_in_world_units 274 | blobDetector = cv2.SimpleBlobDetector_create(self.blobParams) 275 | flags = cv2.CALIB_CB_ASYMMETRIC_GRID 276 | if self.use_clustering: 277 | flags = cv2.CALIB_CB_ASYMMETRIC_GRID + cv2.CALIB_CB_CLUSTERING 278 | 279 | elif self.pattern_type == "custom": 280 | pattern_points = custom_world_points_function(self.pattern_rows,self.pattern_columns) 281 | 282 | h, w = cv2.imread(images_path_list[0], 0).shape[:2] 283 | 284 | def process_single_image(img_path): 285 | print("Processing {}".format(img_path)) 286 | img = cv2.imread(img_path,0) # gray scale 287 | if img is None: 288 | print("Failed to load {}".format(img_path)) 289 | return None 290 | 291 | assert w == img.shape[1] and h == img.shape[0],"All the images must have same shape" 292 | 293 | if self.pattern_type == "chessboard": 294 | found,corners = self._chessboard_image_points(img) 295 | elif self.pattern_type == "asymmetric_circles" or self.pattern_type == "symmetric_circles": 296 | found,corners = self._circulargrid_image_points(img,flags,blobDetector) 297 | 298 | elif self.pattern_type == "custom": 299 | found,corners = custom_image_points_function(img,self.pattern_rows,self.pattern_columns) 300 | assert corners[0] == pattern_points[0], "custom_image_points_function should return a numpy array of length matching the number of control points in the image" 301 | 302 | if found: 303 | #self.working_images.append(img_path) 304 | if self.subpixel_refinement: 305 | corners2 = cv2.cornerSubPix(img, corners, (11, 11), (-1, -1), self.term_criteria) 306 | else: 307 | corners2 = corners.copy() 308 | 309 | if self.debug_dir: 310 | vis = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR) 311 | cv2.drawChessboardCorners(vis, (self.pattern_columns,self.pattern_rows), corners2, found) 312 | path, name, ext = self._splitfn(img_path) 313 | outfile = 
os.path.join(self.debug_dir, name + '_pts_vis.png') 314 | cv2.imwrite(outfile, vis) 315 | 316 | else: 317 | print("Calibration board NOT FOUND") 318 | return(None) 319 | print("Calibration board FOUND") 320 | return(img_path,corners2,pattern_points) 321 | 322 | threads_num = int(threads) 323 | if threads_num <= 1: 324 | calibrationBoards = [process_single_image(img_path) for img_path in images_path_list] 325 | else: 326 | print("Running with %d threads..." % threads_num) 327 | pool = ThreadPool(threads_num) 328 | calibrationBoards = pool.map(process_single_image, images_path_list) 329 | 330 | calibrationBoards = [x for x in calibrationBoards if x is not None] 331 | for (img_path,corners, pattern_points) in calibrationBoards: 332 | working_images.append(img_path) 333 | img_points.append(corners) 334 | obj_points.append(pattern_points) 335 | 336 | # combine it to a dataframe 337 | self.calibration_df = pd.DataFrame({"image_names":working_images, 338 | "img_points":img_points, 339 | "obj_points":obj_points, 340 | }) 341 | self.calibration_df.sort_values("image_names") 342 | self.calibration_df = self.calibration_df.reset_index(drop=True) 343 | 344 | # calibrate the camera 345 | self.rms, self.camera_matrix, self.dist_coefs, rvecs, tvecs = cv2.calibrateCamera(self.calibration_df.obj_points, self.calibration_df.img_points, (w, h), None, None) 346 | 347 | self.calibration_df['rvecs'] = pd.Series(rvecs) 348 | self.calibration_df['tvecs'] = pd.Series(tvecs) 349 | 350 | print("\nRMS:", self.rms) 351 | print("camera matrix:\n", self.camera_matrix) 352 | print("distortion coefficients: ", self.dist_coefs.ravel()) 353 | # plot the reprojection error graph 354 | self._calc_reprojection_error(figure_size=self.figsize,save_dir=self.debug_dir) 355 | 356 | result_dictionary = { 357 | "rms":self.rms, 358 | "intrinsic_matrix":self.camera_matrix, 359 | "distortion_coefficients":self.dist_coefs, 360 | } 361 | 362 | return(result_dictionary) 363 | 364 | def visualize_calibration_boards(self, 365 | cam_width = 20.0, 366 | cam_height = 10.0, 367 | scale_focal = 40): 368 | """ 369 | User facing method to visualize the calibration board orientations in 3-D 370 | Plots both the pattern centric and the camera centric views 371 | 372 | Keyword Arguments: 373 | cam_width --float: width of cam in visualization (Default 20.0) 374 | cam_height --float: height of cam in visualization (Default 10.0) 375 | scale_focal --int: Focal length is scaled accordingly (Default 40) 376 | 377 | Output: 378 | Plots the camera centric and pattern centric views of the chessboard in 3-D using matplotlib 379 | Optionally saves these views in the debug directory if the constructor is initialized with 380 | debug directory 381 | 382 | TIP: change the values of cam_width, cam_height for better visualizations 383 | """ 384 | 385 | # Plot the camera centric view 386 | visualize_views(camera_matrix=self.camera_matrix, 387 | rvecs = self.calibration_df.rvecs, 388 | tvecs = self.calibration_df.tvecs, 389 | board_width=self.pattern_columns, 390 | board_height=self.pattern_rows, 391 | square_size=self.distance_in_world_units, 392 | cam_width = cam_width, 393 | cam_height = cam_height, 394 | scale_focal = scale_focal, 395 | patternCentric = False, 396 | figsize = self.figsize, 397 | save_dir = self.debug_dir 398 | ) 399 | # Plot the pattern centric view 400 | visualize_views(camera_matrix=self.camera_matrix, 401 | rvecs = self.calibration_df.rvecs, 402 | tvecs = self.calibration_df.tvecs, 403 | board_width=self.pattern_columns, 404 | 
board_height=self.pattern_rows, 405 | square_size=self.distance_in_world_units, 406 | cam_width = cam_width, 407 | cam_height = cam_height, 408 | scale_focal = scale_focal, 409 | patternCentric = True, 410 | figsize = self.figsize, 411 | save_dir = self.debug_dir 412 | ) 413 | 414 | ####################################################################################################################### 415 | 416 | ## 3-D plotting the pattern centric and camera centric views 417 | 418 | def _inverse_homogeneoux_matrix(M): 419 | # util_function 420 | R = M[0:3, 0:3] 421 | T = M[0:3, 3] 422 | M_inv = np.identity(4) 423 | M_inv[0:3, 0:3] = R.T 424 | M_inv[0:3, 3] = -(R.T).dot(T) 425 | 426 | return M_inv 427 | 428 | def _transform_to_matplotlib_frame(cMo, X, inverse=False): 429 | # util function 430 | M = np.identity(4) 431 | M[1,1] = 0 432 | M[1,2] = 1 433 | M[2,1] = -1 434 | M[2,2] = 0 435 | 436 | if inverse: 437 | return M.dot(_inverse_homogeneoux_matrix(cMo).dot(X)) 438 | else: 439 | return M.dot(cMo.dot(X)) 440 | 441 | def _create_camera_model(camera_matrix, width, height, scale_focal, draw_frame_axis=False): 442 | # util function 443 | fx = camera_matrix[0,0] 444 | fy = camera_matrix[1,1] 445 | focal = 2 / (fx + fy) 446 | f_scale = scale_focal * focal 447 | 448 | # draw image plane 449 | X_img_plane = np.ones((4,5)) 450 | X_img_plane[0:3,0] = [-width, height, f_scale] 451 | X_img_plane[0:3,1] = [width, height, f_scale] 452 | X_img_plane[0:3,2] = [width, -height, f_scale] 453 | X_img_plane[0:3,3] = [-width, -height, f_scale] 454 | X_img_plane[0:3,4] = [-width, height, f_scale] 455 | 456 | # draw triangle above the image plane 457 | X_triangle = np.ones((4,3)) 458 | X_triangle[0:3,0] = [-width, -height, f_scale] 459 | X_triangle[0:3,1] = [0, -2*height, f_scale] 460 | X_triangle[0:3,2] = [width, -height, f_scale] 461 | 462 | # draw camera 463 | X_center1 = np.ones((4,2)) 464 | X_center1[0:3,0] = [0, 0, 0] 465 | X_center1[0:3,1] = [-width, height, f_scale] 466 | 467 | X_center2 = np.ones((4,2)) 468 | X_center2[0:3,0] = [0, 0, 0] 469 | X_center2[0:3,1] = [width, height, f_scale] 470 | 471 | X_center3 = np.ones((4,2)) 472 | X_center3[0:3,0] = [0, 0, 0] 473 | X_center3[0:3,1] = [width, -height, f_scale] 474 | 475 | X_center4 = np.ones((4,2)) 476 | X_center4[0:3,0] = [0, 0, 0] 477 | X_center4[0:3,1] = [-width, -height, f_scale] 478 | 479 | # draw camera frame axis 480 | X_frame1 = np.ones((4,2)) 481 | X_frame1[0:3,0] = [0, 0, 0] 482 | X_frame1[0:3,1] = [f_scale/2, 0, 0] 483 | 484 | X_frame2 = np.ones((4,2)) 485 | X_frame2[0:3,0] = [0, 0, 0] 486 | X_frame2[0:3,1] = [0, f_scale/2, 0] 487 | 488 | X_frame3 = np.ones((4,2)) 489 | X_frame3[0:3,0] = [0, 0, 0] 490 | X_frame3[0:3,1] = [0, 0, f_scale/2] 491 | 492 | if draw_frame_axis: 493 | return [X_img_plane, X_triangle, X_center1, X_center2, X_center3, X_center4, X_frame1, X_frame2, X_frame3] 494 | else: 495 | return [X_img_plane, X_triangle, X_center1, X_center2, X_center3, X_center4] 496 | 497 | def _create_board_model(extrinsics, board_width, board_height, square_size, draw_frame_axis=False): 498 | # util function 499 | width = board_width*square_size 500 | height = board_height*square_size 501 | 502 | # draw calibration board 503 | X_board = np.ones((4,5)) 504 | #X_board_cam = np.ones((extrinsics.shape[0],4,5)) 505 | X_board[0:3,0] = [0,0,0] 506 | X_board[0:3,1] = [width,0,0] 507 | X_board[0:3,2] = [width,height,0] 508 | X_board[0:3,3] = [0,height,0] 509 | X_board[0:3,4] = [0,0,0] 510 | 511 | # draw board frame axis 512 | X_frame1 = np.ones((4,2)) 513 
| X_frame1[0:3,0] = [0, 0, 0] 514 | X_frame1[0:3,1] = [height/2, 0, 0] 515 | 516 | X_frame2 = np.ones((4,2)) 517 | X_frame2[0:3,0] = [0, 0, 0] 518 | X_frame2[0:3,1] = [0, height/2, 0] 519 | 520 | X_frame3 = np.ones((4,2)) 521 | X_frame3[0:3,0] = [0, 0, 0] 522 | X_frame3[0:3,1] = [0, 0, height/2] 523 | 524 | if draw_frame_axis: 525 | return [X_board, X_frame1, X_frame2, X_frame3] 526 | else: 527 | return [X_board] 528 | 529 | def _draw_camera_boards(ax, camera_matrix, cam_width, cam_height, scale_focal, 530 | extrinsics, board_width, board_height, square_size, 531 | patternCentric): 532 | # util function 533 | min_values = np.zeros((3,1)) 534 | min_values = np.inf 535 | max_values = np.zeros((3,1)) 536 | max_values = -np.inf 537 | 538 | if patternCentric: 539 | X_moving = _create_camera_model(camera_matrix, cam_width, cam_height, scale_focal) 540 | X_static = _create_board_model(extrinsics, board_width, board_height, square_size) 541 | else: 542 | X_static = _create_camera_model(camera_matrix, cam_width, cam_height, scale_focal, True) 543 | X_moving = _create_board_model(extrinsics, board_width, board_height, square_size) 544 | 545 | cm_subsection = linspace(0.0, 1.0, extrinsics.shape[0]) 546 | colors = [ cm.jet(x) for x in cm_subsection ] 547 | 548 | for i in range(len(X_static)): 549 | X = np.zeros(X_static[i].shape) 550 | for j in range(X_static[i].shape[1]): 551 | X[:,j] = _transform_to_matplotlib_frame(np.eye(4), X_static[i][:,j]) 552 | ax.plot3D(X[0,:], X[1,:], X[2,:], color='r') 553 | min_values = np.minimum(min_values, X[0:3,:].min(1)) 554 | max_values = np.maximum(max_values, X[0:3,:].max(1)) 555 | 556 | for idx in range(extrinsics.shape[0]): 557 | R, _ = cv2.Rodrigues(extrinsics[idx,0:3]) 558 | cMo = np.eye(4,4) 559 | cMo[0:3,0:3] = R 560 | cMo[0:3,3] = extrinsics[idx,3:6] 561 | for i in range(len(X_moving)): 562 | X = np.zeros(X_moving[i].shape) 563 | for j in range(X_moving[i].shape[1]): 564 | X[0:4,j] = _transform_to_matplotlib_frame(cMo, X_moving[i][0:4,j], patternCentric) 565 | ax.plot3D(X[0,:], X[1,:], X[2,:], color=colors[idx]) 566 | min_values = np.minimum(min_values, X[0:3,:].min(1)) 567 | max_values = np.maximum(max_values, X[0:3,:].max(1)) 568 | 569 | return min_values, max_values 570 | 571 | def visualize_views(camera_matrix, 572 | rvecs, 573 | tvecs, 574 | board_width, 575 | board_height, 576 | square_size, 577 | cam_width = 64/2, 578 | cam_height = 48/2, 579 | scale_focal = 40, 580 | patternCentric = False, 581 | figsize = (8,8), 582 | save_dir = None 583 | ): 584 | """ 585 | Visualizes the pattern centric or the camera centric views of chess board 586 | using the above util functions 587 | 588 | Keyword Arguments 589 | 590 | camera_matrix --numpy.array: intrinsic camera matrix (No default) 591 | rvecs : --list of rvecs from cv2.calibrateCamera() 592 | tvecs : --list of tvecs from cv2.calibrateCamera() 593 | 594 | board_width --int: the chessboard width (no default) 595 | board_height --int: the chessboard height (no default) 596 | square_size --int: the square size of each chessboard square in mm 597 | cam_width --float: Width/2 of the displayed camera (Default 64/2) 598 | it is recommended to leave this argument to default 599 | cam_height --float: Height/2 of the displayed camera (Default (48/2)) 600 | it is recommended to leave this argument to default 601 | scale_focal --int: Value to scale the focal length (Default 40) 602 | it is recommended to leave this argument to default 603 | 604 | pattern_centric --bool: Whether to visualize the pattern centric or the 
605 | camera centric (Default False) 606 | fig_size --tuple: The size of figure to display (Default (8,8)) 607 | it is recommended to leave this argument to default 608 | 609 | save_dir --str: optional path to a saving directory to save the 610 | generated plot (Default None) 611 | 612 | Does not return anything 613 | """ 614 | i = 0 615 | extrinsics = np.zeros((len(rvecs),6)) 616 | for rot,trans in zip(rvecs,tvecs): 617 | extrinsics[i]=np.append(rot.flatten(),trans.flatten()) 618 | i+=1 619 | #The extrinsics matrix is of shape (N,6) (No default) 620 | #Where N is the number of board patterns 621 | #the first 3 columns are rotational vectors 622 | #the last 3 columns are translational vectors 623 | 624 | fig = plt.figure(figsize=figsize) 625 | ax = fig.gca(projection='3d') 626 | ax.set_aspect("equal") 627 | 628 | min_values, max_values = _draw_camera_boards(ax, camera_matrix, cam_width, cam_height, 629 | scale_focal, extrinsics, board_width, 630 | board_height, square_size, patternCentric) 631 | 632 | X_min = min_values[0] 633 | X_max = max_values[0] 634 | Y_min = min_values[1] 635 | Y_max = max_values[1] 636 | Z_min = min_values[2] 637 | Z_max = max_values[2] 638 | max_range = np.array([X_max-X_min, Y_max-Y_min, Z_max-Z_min]).max() / 2.0 639 | 640 | mid_x = (X_max+X_min) * 0.5 641 | mid_y = (Y_max+Y_min) * 0.5 642 | mid_z = (Z_max+Z_min) * 0.5 643 | ax.set_xlim(mid_x - max_range, mid_x + max_range) 644 | ax.set_ylim(mid_y - max_range, mid_y + max_range) 645 | ax.set_zlim(mid_z - max_range, mid_z + max_range) 646 | 647 | ax.set_xlabel('x') 648 | ax.set_ylabel('z') 649 | ax.set_zlabel('-y') 650 | if patternCentric: 651 | ax.set_title('Pattern Centric View') 652 | if save_dir: 653 | plt.savefig(os.path.join(save_dir,"pattern_centric_view.png")) 654 | else: 655 | ax.set_title('Camera Centric View') 656 | if save_dir: 657 | plt.savefig(os.path.join(save_dir,"camera_centric_view.png")) 658 | plt.show() 659 | ################################################################################################################# 660 | 661 | if __name__ == "__main__": 662 | ## Cannot be used for custom calibration pattern 663 | parser = argparse.ArgumentParser(description="Camera_Calibration_API. 
Saves the calibration results in a pickle file \n NOTE: USE THE API AS IMPORTABLE MODULE FOR ADDED CONTROL",formatter_class=RawTextHelpFormatter) 664 | parser.add_argument("--images_dir",help="Path to the directory containing calibration images (no / in end)",type=str,metavar='', default=None) 665 | parser.add_argument("-pt","--pattern_type",help="The pattern type for calibration",type=str,metavar='',default=None) 666 | parser.add_argument("-pr","--pattern_rows",help="num of rows in pattern",type=int,metavar='',default = 0) 667 | parser.add_argument("-pc","--pattern_columns",help="num of columns in pattern",type=int,metavar='',default = 0) 668 | parser.add_argument("-d","--distance",help="The distance between points in world units",type=float,metavar='',default = 1.0) 669 | parser.add_argument("--debug",help="path to directory for saving images",type=str,metavar='',default=None) 670 | parser.add_argument("-cw","--cam_width",help="width of cam for visualization",type=float,metavar='',default=1) 671 | parser.add_argument("-ch","--cam_height",help="height of cam for visualization",type=float,metavar='',default=0.5) 672 | parser.add_argument("--save",help="path to save the results as a pickle file",type=str,metavar='',default="./results.pickle") 673 | 674 | args=parser.parse_args() 675 | 676 | pattern_types = ["chessboard","symmetric_circles","asymmetric_circles"] 677 | 678 | if args.images_dir == None or args.pattern_type == None or args.pattern_rows == 0 or args.pattern_columns == 0: 679 | raise ValueError("Give values for the first 4 arguments") 680 | assert args.pattern_type in pattern_types, "The --pattern_type must be one of {}. 'custom' pattern is not supported in terminal mode".format(pattern_types) 681 | 682 | file_types = ("*.jpg","*.jpeg","*.JPEG","*.png","*.PNG","*.bmp","*.BMP") 683 | images_path_list = [] 684 | for file_type in file_types: 685 | images_path_list.extend(glob.glob(os.path.join(args.images_dir,file_type))) 686 | 687 | calibration_object = Camera_Calibration_API(pattern_type = args.pattern_type, 688 | pattern_rows = args.pattern_rows, 689 | pattern_columns = args.pattern_columns, 690 | distance_in_world_units = args.distance, 691 | debug_dir = args.debug 692 | ) 693 | 694 | results = calibration_object.calibrate_camera(images_path_list) 695 | with open(args.save,"wb") as f: 696 | pickle.dump(results,f) 697 | 698 | calibration_object.visualize_calibration_boards(cam_width=args.cam_width, 699 | cam_height=args.cam_height) 700 | 701 | 702 | 703 | 704 | 705 | 706 | -------------------------------------------------------------------------------- /examples/example_images/asymmetric_grid/Image__2018-02-12__15-11-38.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jankozik/camera_calibration_PythonAPI/e8b52b8ab3fcf1c79e6ac1758a94fd937492a3ec/examples/example_images/asymmetric_grid/Image__2018-02-12__15-11-38.png -------------------------------------------------------------------------------- /examples/example_images/asymmetric_grid/Image__2018-02-12__15-13-40.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jankozik/camera_calibration_PythonAPI/e8b52b8ab3fcf1c79e6ac1758a94fd937492a3ec/examples/example_images/asymmetric_grid/Image__2018-02-12__15-13-40.png -------------------------------------------------------------------------------- /examples/example_images/asymmetric_grid/Image__2018-02-12__15-14-01.png: 
--------------------------------------------------------------------------------
/examples/example_images/ (binary image files):
--------------------------------------------------------------------------------
The calibration photos under examples/example_images/ are binary PNG/JPG files, so this dump records only a raw-content URL for each of them. Every entry follows the same pattern, pinned to commit e8b52b8ab3fcf1c79e6ac1758a94fd937492a3ec:

https://raw.githubusercontent.com/jankozik/camera_calibration_PythonAPI/e8b52b8ab3fcf1c79e6ac1758a94fd937492a3ec/examples/example_images/<subfolder>/<filename>

The entries covered here (continuing from the asymmetric_grid images listed above) are:

asymmetric_grid/  Image__2018-02-12__15-14-01.png through Image__2018-02-12__15-17-08.png (8 PNG captures of the asymmetric calibration grid)
chessboard/       left01.jpg through left14.jpg, with left10.jpg absent (13 JPG captures of a chessboard target)
symmetric_grid/   Image__2018-02-14__10-12-45.png through Image__2018-02-14__10-21-12.png (25 PNG captures of the symmetric calibration grid)
--------------------------------------------------------------------------------
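
These image sets exist to drive the example notebooks; the repository's own entry point is camera_calibration.py, whose API is not reproduced here. Purely as an illustration of what the chessboard set supports, below is a minimal calibration sketch using plain OpenCV rather than this package's API. The 9x6 inner-corner count and unit square size are assumptions for the sake of the example, not values taken from the repository.

# Minimal OpenCV calibration sketch over the chessboard example images.
# NOTE: this is not this repository's camera_calibration API; the 9x6
# inner-corner count and square size of 1.0 are assumed for illustration.
import glob
import cv2
import numpy as np

pattern_size = (9, 6)   # assumed inner corners per row/column
square_size = 1.0       # assumed square edge length (arbitrary units)

# Planar object points for one view: (0,0,0), (1,0,0), ..., scaled by square_size.
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
objp *= square_size

obj_points, img_points = [], []
image_size = None

for path in sorted(glob.glob("examples/example_images/chessboard/left*.jpg")):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue
    image_size = gray.shape[::-1]  # (width, height)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        continue
    # Refine corner locations to sub-pixel accuracy before calibrating.
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", rms)
print("Camera matrix:\n", camera_matrix)

The asymmetric_grid and symmetric_grid sets would be handled analogously, except that the per-view points come from cv2.findCirclesGrid with the cv2.CALIB_CB_ASYMMETRIC_GRID or cv2.CALIB_CB_SYMMETRIC_GRID flag instead of cv2.findChessboardCorners; the actual parameters used by this repository are documented in its example notebooks.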