├── .idea ├── .name ├── .gitignore ├── vcs.xml ├── misc.xml ├── inspectionProfiles │ └── profiles_settings.xml ├── modules.xml └── ABB-Robot-Machine-Vision.iml ├── docs ├── build │ ├── html │ │ ├── _sources │ │ │ ├── testfile.rst.txt │ │ │ ├── help.rst.txt │ │ │ ├── opencvtorapid.rst.txt │ │ │ ├── open_cv_torapid.rst.txt │ │ │ ├── robotware.rst.txt │ │ │ ├── index.rst.txt │ │ │ ├── installation.rst.txt │ │ │ ├── camera.rst.txt │ │ │ ├── license.rst.txt │ │ │ ├── puck.rst.txt │ │ │ ├── opencv_to_rapid.rst.txt │ │ │ ├── imagefunctions.rst.txt │ │ │ └── robotwebservices.rst.txt │ │ ├── _static │ │ │ ├── custom.css │ │ │ ├── file.png │ │ │ ├── plus.png │ │ │ ├── minus.png │ │ │ ├── documentation_options.js │ │ │ ├── pygments.css │ │ │ ├── doctools.js │ │ │ └── underscore.js │ │ ├── objects.inv │ │ ├── .buildinfo │ │ ├── help.html │ │ ├── search.html │ │ ├── robotware.html │ │ ├── license.html │ │ ├── index.html │ │ ├── installation.html │ │ ├── camera.html │ │ ├── searchindex.js │ │ ├── puck.html │ │ ├── genindex.html │ │ └── opencv_to_rapid.html │ └── doctrees │ │ ├── help.doctree │ │ ├── puck.doctree │ │ ├── camera.doctree │ │ ├── index.doctree │ │ ├── license.doctree │ │ ├── environment.pickle │ │ ├── imagefunctions.doctree │ │ ├── installation.doctree │ │ ├── opencv_to_rapid.doctree │ │ └── robotwebservices.doctree ├── source │ ├── help.rst │ ├── index.rst │ ├── installation.rst │ ├── license.rst │ ├── puck.rst │ ├── conf.py │ ├── opencv_to_rapid.rst │ ├── imagefunctions.rst │ └── robotwebservices.rst ├── Makefile └── make.bat ├── .DS_Store ├── image_tools ├── cam_adjustments.ini ├── __pycache__ │ ├── Camera.cpython-37.pyc │ ├── ImageFunctions.cpython-37.pyc │ └── camera_correction.cpython-37.pyc ├── Camera.py ├── camera_correction.py └── ImageFunctions.py ├── __pycache__ ├── Puck.cpython-37.pyc ├── main.cpython-37.pyc └── OpenCV_to_RAPID.cpython-37.pyc ├── RobotWebServices ├── __pycache__ │ └── RWS.cpython-37.pyc ├── API_Benchmark │ ├── __pycache__ │ │ ├── 
com_testing.cpython-37.pyc │ │ └── com_testing2.cpython-37.pyc │ ├── com_testing2.py │ └── com_testing.py ├── Stop_Norbert.py └── Start_Norbert.py ├── Procedures.txt ├── PythonCom_lab.txt ├── main_lab.py ├── PythonCom.txt ├── Puck.py ├── cam_adjust_lab.py └── OpenCV_to_RAPID.py /.idea/.name: -------------------------------------------------------------------------------- 1 | ABB-Robot-Machine-Vision -------------------------------------------------------------------------------- /docs/build/html/_sources/testfile.rst.txt: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /.idea/.gitignore: -------------------------------------------------------------------------------- 1 | # Default ignored files 2 | /workspace.xml -------------------------------------------------------------------------------- /docs/build/html/_static/custom.css: -------------------------------------------------------------------------------- 1 | /* This file intentionally left blank. */ 2 | -------------------------------------------------------------------------------- /docs/source/help.rst: -------------------------------------------------------------------------------- 1 | Need Help? 2 | ========== 3 | You're on your own now, buddy. 4 | -------------------------------------------------------------------------------- /.DS_Store: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/.DS_Store -------------------------------------------------------------------------------- /docs/build/html/_sources/help.rst.txt: -------------------------------------------------------------------------------- 1 | Need Help? 2 | ========== 3 | You're on your own now, buddy. 
4 | -------------------------------------------------------------------------------- /docs/build/html/objects.inv: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/html/objects.inv -------------------------------------------------------------------------------- /image_tools/cam_adjustments.ini: -------------------------------------------------------------------------------- 1 | [EXPOSURE] 2 | exposure = 26 3 | 4 | [SLOPE] 5 | slopex = 0.0230 6 | slopey = 0.0000 7 | 8 | -------------------------------------------------------------------------------- /__pycache__/Puck.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/__pycache__/Puck.cpython-37.pyc -------------------------------------------------------------------------------- /__pycache__/main.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/__pycache__/main.cpython-37.pyc -------------------------------------------------------------------------------- /docs/build/doctrees/help.doctree: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/help.doctree -------------------------------------------------------------------------------- /docs/build/doctrees/puck.doctree: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/puck.doctree -------------------------------------------------------------------------------- /docs/build/html/_static/file.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/html/_static/file.png -------------------------------------------------------------------------------- /docs/build/html/_static/plus.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/html/_static/plus.png -------------------------------------------------------------------------------- /docs/build/doctrees/camera.doctree: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/camera.doctree -------------------------------------------------------------------------------- /docs/build/doctrees/index.doctree: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/index.doctree -------------------------------------------------------------------------------- /docs/build/html/_static/minus.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/html/_static/minus.png -------------------------------------------------------------------------------- /docs/build/doctrees/license.doctree: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/license.doctree -------------------------------------------------------------------------------- /docs/build/doctrees/environment.pickle: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/environment.pickle -------------------------------------------------------------------------------- /__pycache__/OpenCV_to_RAPID.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/__pycache__/OpenCV_to_RAPID.cpython-37.pyc -------------------------------------------------------------------------------- /docs/build/doctrees/imagefunctions.doctree: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/imagefunctions.doctree -------------------------------------------------------------------------------- /docs/build/doctrees/installation.doctree: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/installation.doctree -------------------------------------------------------------------------------- /docs/build/doctrees/opencv_to_rapid.doctree: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/opencv_to_rapid.doctree -------------------------------------------------------------------------------- /docs/build/doctrees/robotwebservices.doctree: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/docs/build/doctrees/robotwebservices.doctree -------------------------------------------------------------------------------- /RobotWebServices/__pycache__/RWS.cpython-37.pyc: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/RobotWebServices/__pycache__/RWS.cpython-37.pyc -------------------------------------------------------------------------------- /image_tools/__pycache__/Camera.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/image_tools/__pycache__/Camera.cpython-37.pyc -------------------------------------------------------------------------------- /image_tools/__pycache__/ImageFunctions.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/image_tools/__pycache__/ImageFunctions.cpython-37.pyc -------------------------------------------------------------------------------- /image_tools/__pycache__/camera_correction.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/image_tools/__pycache__/camera_correction.cpython-37.pyc -------------------------------------------------------------------------------- /.idea/vcs.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | -------------------------------------------------------------------------------- /RobotWebServices/API_Benchmark/__pycache__/com_testing.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/RobotWebServices/API_Benchmark/__pycache__/com_testing.cpython-37.pyc -------------------------------------------------------------------------------- /RobotWebServices/API_Benchmark/__pycache__/com_testing2.cpython-37.pyc: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/mhiversflaten/ABB-Robot-Machine-Vision/HEAD/RobotWebServices/API_Benchmark/__pycache__/com_testing2.cpython-37.pyc -------------------------------------------------------------------------------- /.idea/misc.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | -------------------------------------------------------------------------------- /.idea/inspectionProfiles/profiles_settings.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 6 | -------------------------------------------------------------------------------- /docs/build/html/.buildinfo: -------------------------------------------------------------------------------- 1 | # Sphinx build info version 1 2 | # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. 3 | config: edb953329c4d8de3d9e5f3c318ba3efc 4 | tags: 645f666f9bcd5a90fca523b33c5a78b7 5 | -------------------------------------------------------------------------------- /.idea/modules.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | -------------------------------------------------------------------------------- /.idea/ABB-Robot-Machine-Vision.iml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | -------------------------------------------------------------------------------- /docs/build/html/_static/documentation_options.js: -------------------------------------------------------------------------------- 1 | var DOCUMENTATION_OPTIONS = { 2 | URL_ROOT: document.getElementById("documentation_options").getAttribute('data-url_root'), 3 | VERSION: '0.0.1', 4 | LANGUAGE: 'None', 5 | COLLAPSE_INDEX: false, 6 | FILE_SUFFIX: '.html', 7 | HAS_SOURCE: true, 8 | SOURCELINK_SUFFIX: '.txt', 9 | NAVIGATION_WITH_KEYS: false 10 | }; 
-------------------------------------------------------------------------------- /docs/build/html/_sources/opencvtorapid.rst.txt: -------------------------------------------------------------------------------- 1 | OpenCV_to_RAPID.py 2 | ================== 3 | 4 | .. py:function:: create_robtarget(gripper_height, gripper_rot, cam_pos, image, puck, cam_comp=False) 5 | 6 | Completes a series of transformations to create a robtarget for the puck's position from an image. 7 | These transformations are based on the physical setup at the University of Stavanger and on a specific RobotStudio setup 8 | (with regard to coordinate systems). This Pack_ file includes that setup. 9 | 10 | .. _Pack: http://www.ux.uis.no/~karlsk/ELE610/UiS_E458_nov18.rspag -------------------------------------------------------------------------------- /docs/build/html/_sources/open_cv_torapid.rst.txt: -------------------------------------------------------------------------------- 1 | OpenCV_to_RAPID.py 2 | ================== 3 | 4 | .. py:function:: create_robtarget(gripper_height, gripper_rot, cam_pos, image, puck, cam_comp=False) 5 | 6 | Completes a series of transformations to create a robtarget for the puck's position from an image. 7 | These transformations are based on the physical setup at the University of Stavanger and on a specific RobotStudio setup 8 | (with regard to coordinate systems). This Pack_ file includes that setup.
9 | 10 | .. _Pack: http://www.ux.uis.no/~karlsk/ELE610/UiS_E458_nov18.rspag -------------------------------------------------------------------------------- /RobotWebServices/Stop_Norbert.py: -------------------------------------------------------------------------------- 1 | from requests.auth import HTTPDigestAuth 2 | from RobotWebServices import RWS 3 | 4 | """Script to easily turn off the motors on Norbert""" 5 | 6 | norbert = RWS.RWS() 7 | 8 | norbert_url = 'http://152.94.0.38' 9 | digest_auth = HTTPDigestAuth('Default User', 'robotics') 10 | 11 | payload = {'ctrl-state': 'motoroff'} 12 | resp = norbert.session.post(norbert_url + "/rw/panel/ctrlstate?action=setctrlstate", auth=digest_auth, data=payload) 13 | 14 | if resp.status_code == 204: 15 | print("Robot motors turned off") 16 | else: 17 | print("Could not turn off motors") -------------------------------------------------------------------------------- /RobotWebServices/Start_Norbert.py: -------------------------------------------------------------------------------- 1 | from requests.auth import HTTPDigestAuth 2 | from RobotWebServices import RWS 3 | 4 | """Script to easily turn on the motors on Norbert""" 5 | 6 | norbert = RWS.RWS() 7 | 8 | norbert_url = 'http://152.94.0.38' 9 | digest_auth = HTTPDigestAuth('Default User', 'robotics') 10 | 11 | payload = {'ctrl-state': 'motoron'} 12 | resp = norbert.session.post(norbert_url + "/rw/panel/ctrlstate?action=setctrlstate", auth=digest_auth, data=payload) 13 | 14 | if resp.status_code == 204: 15 | print("Robot motors turned on") 16 | else: 17 | print("Could not turn on motors. The controller might be in manual mode") -------------------------------------------------------------------------------- /docs/build/html/_sources/robotware.rst.txt: -------------------------------------------------------------------------------- 1 | RobotWare 2 | ========= 3 | 4 | .. py:function:: set_zonedata(self, var, zonedata) 5 | 6 | Set the value for a zonedata variable in RAPID.
7 | 8 | :param str var: Name of the variable as declared in RAPID 9 | :param int zonedata: Desired zonedata value 10 | 11 | .. py:function:: set_speeddata(self, var, speeddata) 12 | 13 | Set the value for a speeddata variable in RAPID. 14 | 15 | :param str var: Name of the variable as declared in RAPID 16 | :param int speeddata: Desired speeddata value 17 | 18 | .. py:function:: set_speed_ratio(self, speed_ratio) 19 | 20 | Set the speed ratio of the robot. 21 | 22 | :param int speed_ratio: Desired speed ratio in percent [1-100] 23 | 24 | -------------------------------------------------------------------------------- /docs/Makefile: -------------------------------------------------------------------------------- 1 | # Minimal makefile for Sphinx documentation 2 | # 3 | 4 | # You can set these variables from the command line, and also 5 | # from the environment for the first two. 6 | SPHINXOPTS ?= 7 | SPHINXBUILD ?= sphinx-build 8 | SOURCEDIR = source 9 | BUILDDIR = build 10 | 11 | # Put it first so that "make" without argument is like "make help". 12 | help: 13 | @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) 14 | 15 | .PHONY: help Makefile 16 | 17 | # Catch-all target: route all unknown targets to Sphinx using the new 18 | # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS). 19 | %: Makefile 20 | @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O) 21 | -------------------------------------------------------------------------------- /docs/source/index.rst: -------------------------------------------------------------------------------- 1 | .. test-project documentation master file, created by 2 | sphinx-quickstart on Thu Apr 2 11:32:07 2020. 3 | You can adapt this file completely to your liking, but it should at least 4 | contain the root `toctree` directive. 5 | 6 | ABB Robot with Machine Vision 7 | ============================= 8 | 9 | This software is mainly directed toward engineering 10 | students and staff at the University of Stavanger.
11 | It was first created as part of a Bachelor's thesis. 12 | 13 | User Guide 14 | ^^^^^^^^^^ 15 | 16 | .. toctree:: 17 | :maxdepth: 2 18 | 19 | license 20 | installation 21 | robotwebservices 22 | imagefunctions 23 | opencv_to_rapid 24 | puck 25 | help 26 | 27 | 28 | Indices 29 | ======= 30 | 31 | * :ref:`genindex` 32 | * :ref:`search` 33 | -------------------------------------------------------------------------------- /docs/source/installation.rst: -------------------------------------------------------------------------------- 1 | Installation 2 | ============ 3 | 4 | This documentation covers the source code used in a demonstration program at the University of Stavanger. 5 | This includes a pip-installable package, rwsuis_, which is covered in :ref:`RobotWebServices`. 6 | 7 | .. _rwsuis: https://pypi.org/project/rwsuis/ 8 | 9 | Here are the different ways of acquiring the software. 10 | 11 | $ pip install 12 | ^^^^^^^^^^^^^ 13 | 14 | | To install the rwsuis_ package, simply run this command in your terminal of choice: 15 | 16 | :: 17 | 18 | pip install rwsuis 19 | 20 | Get the source code 21 | ^^^^^^^^^^^^^^^^^^^ 22 | 23 | | All code from the docs is on GitHub, and the public repository can be cloned: 24 | 25 | :: 26 | 27 | git clone https://github.com/prinsWindy/ABB-Robot-Machine-Vision.git 28 | -------------------------------------------------------------------------------- /docs/build/html/_sources/index.rst.txt: -------------------------------------------------------------------------------- 1 | .. test-project documentation master file, created by 2 | sphinx-quickstart on Thu Apr 2 11:32:07 2020. 3 | You can adapt this file completely to your liking, but it should at least 4 | contain the root `toctree` directive. 5 | 6 | ABB Robot with Machine Vision 7 | ============================= 8 | 9 | This software is mainly directed toward engineering 10 | students and staff at the University of Stavanger. 11 | It was first created as part of a Bachelor's thesis.
12 | 13 | User Guide 14 | ^^^^^^^^^^ 15 | 16 | .. toctree:: 17 | :maxdepth: 2 18 | 19 | license 20 | installation 21 | robotwebservices 22 | imagefunctions 23 | opencv_to_rapid 24 | puck 25 | help 26 | 27 | 28 | Indices 29 | ======= 30 | 31 | * :ref:`genindex` 32 | * :ref:`search` 33 | -------------------------------------------------------------------------------- /docs/build/html/_sources/installation.rst.txt: -------------------------------------------------------------------------------- 1 | Installation 2 | ============ 3 | 4 | This documentation covers the source code used in a demonstration program at the University of Stavanger. 5 | This includes a pip installable package, rwsuis_, which is covered in :ref:`RobotWebServices`. 6 | 7 | .. _rwsuis: https://pypi.org/project/rwsuis/ 8 | 9 | Here are the different ways of acquiring the software. 10 | 11 | $ pip install 12 | ^^^^^^^^^^^^^ 13 | 14 | | To install the rwsuis_ package, simply run this command in your terminal of choice: 15 | 16 | :: 17 | 18 | pip install rwsuis 19 | 20 | Get the source code 21 | ^^^^^^^^^^^^^^^^^^^ 22 | 23 | | All code from the docs is on GitHub, and the public repository can be cloned: 24 | 25 | :: 26 | 27 | git clone https://github.com/prinsWindy/ABB-Robot-Machine-Vision.git 28 | -------------------------------------------------------------------------------- /docs/make.bat: -------------------------------------------------------------------------------- 1 | @ECHO OFF 2 | 3 | pushd %~dp0 4 | 5 | REM Command file for Sphinx documentation 6 | 7 | if "%SPHINXBUILD%" == "" ( 8 | set SPHINXBUILD=sphinx-build 9 | ) 10 | set SOURCEDIR=source 11 | set BUILDDIR=build 12 | 13 | if "%1" == "" goto help 14 | 15 | %SPHINXBUILD% >NUL 2>NUL 16 | if errorlevel 9009 ( 17 | echo. 18 | echo.The 'sphinx-build' command was not found. Make sure you have Sphinx 19 | echo.installed, then set the SPHINXBUILD environment variable to point 20 | echo.to the full path of the 'sphinx-build' executable. 
Alternatively you 21 | echo.may add the Sphinx directory to PATH. 22 | echo. 23 | echo.If you don't have Sphinx installed, grab it from 24 | echo.http://sphinx-doc.org/ 25 | exit /b 1 26 | ) 27 | 28 | %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% 29 | goto end 30 | 31 | :help 32 | %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% 33 | 34 | :end 35 | popd 36 | -------------------------------------------------------------------------------- /docs/build/html/_sources/camera.rst.txt: -------------------------------------------------------------------------------- 1 | Camera Class 2 | ============ 3 | 4 | .. py:class:: Camera(cam_ID=0) 5 | 6 | The Camera class contains methods written specifically for use at the University of Stavanger. 7 | These functions have only been tested on an IDS UI-1007XS-C camera, and might not work 8 | as intended on other models. 9 | 10 | Quickstart 11 | ^^^^^^^^^^ 12 | 13 | .. code-block:: python 14 | 15 | import cv2 16 | from image_tools.Camera import Camera  # assumes the repository root is on sys.path 17 | 18 | # Initialization 19 | cam = Camera() 20 | cam.init() 21 | cam.set_parameters() 22 | cam.allocate_memory() 23 | cam.capture_video() 24 | 25 | # Show video until 'q' is pressed 26 | while True: 27 | img = cam.get_image() 28 | cv2.imshow("Quickstart", img) 29 | if cv2.waitKey(1) & 0xFF == ord('q'): 30 | cv2.destroyAllWindows() 31 | break 32 | 33 | cam.exit_camera() 34 | 35 | The parameters set are not currently configurable through method inputs. 36 | They are specifically set for laboratory work at the University of Stavanger. 37 | -------------------------------------------------------------------------------- /docs/source/license.rst: -------------------------------------------------------------------------------- 1 | License 2 | ======= 3 | 4 | Copyright 2020 Markus H.
Iversflaten & Ole Christian Handegård 5 | 6 | Permission is hereby granted, free of charge, 7 | to any person obtaining a copy of this software 8 | and associated documentation files (the "Software"), 9 | to deal in the Software without restriction, including 10 | without limitation the rights to use, copy, modify, merge, 11 | publish, distribute, sublicense, and/or sell copies of the 12 | Software, and to permit persons to whom the Software is 13 | furnished to do so, subject to the following conditions: 14 | 15 | The above copyright notice and this permission notice shall be included in 16 | all copies or substantial portions of the Software. 17 | 18 | 19 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 20 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 22 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 24 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 25 | THE SOFTWARE. 26 | -------------------------------------------------------------------------------- /docs/build/html/_sources/license.rst.txt: -------------------------------------------------------------------------------- 1 | License 2 | ======= 3 | 4 | Copyright 2020 Markus H. 
Iversflaten & Ole Christian Handegård 5 | 6 | Permission is hereby granted, free of charge, 7 | to any person obtaining a copy of this software 8 | and associated documentation files (the "Software"), 9 | to deal in the Software without restriction, including 10 | without limitation the rights to use, copy, modify, merge, 11 | publish, distribute, sublicense, and/or sell copies of the 12 | Software, and to permit persons to whom the Software is 13 | furnished to do so, subject to the following conditions: 14 | 15 | The above copyright notice and this permission notice shall be included in 16 | all copies or substantial portions of the Software. 17 | 18 | 19 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 20 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 22 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 24 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 25 | THE SOFTWARE. 26 | -------------------------------------------------------------------------------- /docs/source/puck.rst: -------------------------------------------------------------------------------- 1 | .. _Puck: 2 | 3 | Puck Class 4 | ========== 5 | 6 | .. py:class:: Puck(number, position, angle, height=30) 7 | 8 | .. py:method:: set_position(position) 9 | 10 | Updates :py:class:`Puck` position. 11 | 12 | :param float[] position: New :py:class:`Puck` position [x,y] 13 | 14 | .. py:method:: set_angle(angle) 15 | 16 | Updates :py:class:`Puck` angle. 17 | 18 | :param float angle: New :py:class:`Puck` angle 19 | 20 | .. py:method:: set_height(height) 21 | 22 | Updates :py:class:`Puck` height. 23 | 24 | :param float height: New :py:class:`Puck` height 25 | 26 | ..
py:method:: get_xyz() 27 | 28 | Gets true position of :py:class:`Puck` ([x,y,z] on the work object). 29 | 30 | :returns: :py:class:`Puck` position [x,y,z] 31 | 32 | .. py:method:: check_collision(puck_list) 33 | 34 | Finds an angle of rotation for the gripper which avoids 35 | collisions between the gripper and other pucks 36 | when sliding in to pick up a puck. 37 | 38 | :param Puck[] puck_list: List of all :py:class:`Puck` objects 39 | 40 | :return: Rotation which yields no collision 41 | :return: Whether the puck should be gripped forward or backward 42 | -------------------------------------------------------------------------------- /docs/build/html/_sources/puck.rst.txt: -------------------------------------------------------------------------------- 1 | .. _Puck: 2 | 3 | Puck Class 4 | ========== 5 | 6 | .. py:class:: Puck(number, position, angle, height=30) 7 | 8 | .. py:method:: set_position(position) 9 | 10 | Updates :py:class:`Puck` position. 11 | 12 | :param float[] position: New :py:class:`Puck` position [x,y] 13 | 14 | .. py:method:: set_angle(angle) 15 | 16 | Updates :py:class:`Puck` angle. 17 | 18 | :param float angle: New :py:class:`Puck` angle 19 | 20 | .. py:method:: set_height(height) 21 | 22 | Updates :py:class:`Puck` height. 23 | 24 | :param float height: New :py:class:`Puck` height 25 | 26 | .. py:method:: get_xyz() 27 | 28 | Gets true position of :py:class:`Puck` ([x,y,z] on the work object). 29 | 30 | :returns: :py:class:`Puck` position [x,y,z] 31 | 32 | .. py:method:: check_collision(puck_list) 33 | 34 | Finds an angle of rotation for the gripper which avoids 35 | collisions between the gripper and other pucks 36 | when sliding in to pick up a puck.
37 | 38 | :param Puck[] puck_list: List of all :py:class:`Puck` objects 39 | 40 | :return: Rotation which yields no collision 41 | :return: Whether the puck should be gripped forward or backward 42 | -------------------------------------------------------------------------------- /RobotWebServices/API_Benchmark/com_testing2.py: -------------------------------------------------------------------------------- 1 | from locust import HttpLocust, TaskSequence, task, between, seq_task, constant 2 | import random 3 | from requests.auth import HTTPDigestAuth 4 | 5 | 6 | class UserBehavior(TaskSequence): 7 | """Used for testing communication between Python and RAPID (RobotWare) through the use of RobotWebServices 8 | and REST API""" 9 | 10 | def on_start(self): 11 | pass 12 | 13 | def on_stop(self): 14 | pass 15 | 16 | @seq_task(1) 17 | def motorson(self): 18 | """Turns the motors on""" 19 | payload = {'ctrl-state': 'motoron'} 20 | self.client.post("/rw/panel/ctrlstate?action=setctrlstate", 21 | auth=HTTPDigestAuth(username='Default User', password='robotics'), data=payload) 22 | 23 | @seq_task(2) 24 | def motorsoff(self): 25 | """Turns the motors off""" 26 | payload = {'ctrl-state': 'motoroff'} 27 | self.client.post("/rw/panel/ctrlstate?action=setctrlstate", 28 | auth=HTTPDigestAuth(username='Default User', password='robotics'), data=payload) 29 | 30 | 31 | class WebsiteUser(HttpLocust): 32 | task_set = UserBehavior 33 | 34 | # Wait time between requests 35 | wait_time = constant(2) 36 | 37 | 38 | """To run the test, type the following in console: locust -f RobotWebServices/API_Benchmark/com_testing2.py 39 | Open localhost:8089 in a browser to reach the Locust web page, where you can set the number of users, spawn rate and target address""" -------------------------------------------------------------------------------- /Procedures.txt: -------------------------------------------------------------------------------- 1 | MODULE Procedures 2 | 3 | TASK PERS wobjdata
wobjTableN:=[FALSE,TRUE,"",[[150,-500,8],[0.707106781,0,0,-0.707106781]],[[0,0,0],[1,0,0,0]]]; 4 | PERS tooldata tGripper:=[TRUE,[[0,0,114.25],[0,0,0,1]],[1,[-0.095984607,0.082520613,38.69176324],[1,0,0,0],0,0,0]]; 5 | 6 | 7 | PROC wait_for_python() 8 | ! wait for python to finish processing image 9 | WHILE NOT image_processed DO 10 | ENDWHILE 11 | image_processed:=FALSE; 12 | ENDPROC 13 | 14 | 15 | ! Pick up puck 16 | PROC getPuckSmoothly(robtarget puck_position) 17 | 18 | MoveL Offs(puck_position, -gripper_camera_offset{1}, -gripper_camera_offset{2}, gripHeight),vSpeed,z50,tGripper\WObj:=wobjTableN; 19 | MoveL Offs(puck_position, 0, 0, gripHeight),v400,fine,tGripper\WObj:=wobjTableN; 20 | closeGripper(TRUE); 21 | MoveL Offs(puck_position, 0, 0, gripHeight + 30),vSpeed,z5,tGripper\WObj:=wobjTableN; 22 | 23 | ENDPROC 24 | 25 | 26 | ! Place puck 27 | PROC putPuckSmoothly(robtarget toTarget, num angle) 28 | 29 | toTarget.rot := OrientZYX(-angle, 0, 180); 30 | MoveJ Offs(toTarget, 0, 0, gripHeight + 30),vSpeed,z10,tGripper\WObj:=wobjTableN; 31 | MoveJ Offs(toTarget, 0, 0, gripHeight + 10),vSpeed,z10,tGripper\WObj:=wobjTableN; 32 | MoveL Offs(toTarget, 0, 0, gripHeight),v200,fine,tGripper\WObj:=wobjTableN; 33 | closeGripper(FALSE); 34 | MoveJ Offs(toTarget, 0, 0, safeHeight),vSpeed,z5,tGripper\WObj:=wobjTableN; 35 | 36 | ENDPROC 37 | 38 | 39 | PROC closeGripper(bool state) 40 | WaitTime 0.1; 41 | IF state=TRUE THEN 42 | setDO PneumaticGripper, 1; 43 | ELSEIF state=FALSE THEN 44 | setDO PneumaticGripper, 0; 45 | ENDIF 46 | WaitTime 0.2; 47 | ENDPROC 48 | 49 | ENDMODULE -------------------------------------------------------------------------------- /docs/source/conf.py: -------------------------------------------------------------------------------- 1 | # Configuration file for the Sphinx documentation builder. 2 | # 3 | # This file only contains a selection of the most common options. 
For a full 4 | # list see the documentation: 5 | # https://www.sphinx-doc.org/en/master/usage/configuration.html 6 | 7 | # -- Path setup -------------------------------------------------------------- 8 | 9 | # If extensions (or modules to document with autodoc) are in another directory, 10 | # add these directories to sys.path here. If the directory is relative to the 11 | # documentation root, use os.path.abspath to make it absolute, like shown here. 12 | # 13 | # import os 14 | # import sys 15 | # sys.path.insert(0, os.path.abspath('.')) 16 | 17 | 18 | # -- Project information ----------------------------------------------------- 19 | 20 | project = 'ABB-Robot-Machine-Vision' 21 | copyright = '2020, Markus Iversflaten & Ole Christian Handegård' 22 | author = 'Markus Iversflaten & Ole Christian Handegård' 23 | 24 | # The full version, including alpha/beta/rc tags 25 | release = '0.0.1' 26 | 27 | 28 | # -- General configuration --------------------------------------------------- 29 | 30 | # Add any Sphinx extension module names here, as strings. They can be 31 | # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom 32 | # ones. 33 | extensions = [ 34 | ] 35 | 36 | # Our project uses 'index.rst', not 'contents.rst' which is default 37 | master_doc = 'index' 38 | 39 | # Add any paths that contain templates here, relative to this directory. 40 | templates_path = ['_templates'] 41 | 42 | # List of patterns, relative to source directory, that match files and 43 | # directories to ignore when looking for source files. 44 | # This pattern also affects html_static_path and html_extra_path. 45 | exclude_patterns = [] 46 | 47 | 48 | # -- Options for HTML output ------------------------------------------------- 49 | 50 | # The theme to use for HTML and HTML Help pages. See the documentation for 51 | # a list of builtin themes. 
52 | # 53 | html_theme = 'alabaster' 54 | 55 | # Add any paths that contain custom static files (such as style sheets) here, 56 | # relative to this directory. They are copied after the builtin static files, 57 | # so a file named "default.css" will overwrite the builtin "default.css". 58 | html_static_path = ['_static'] -------------------------------------------------------------------------------- /RobotWebServices/API_Benchmark/com_testing.py: -------------------------------------------------------------------------------- 1 | from locust import HttpLocust, TaskSet, task, between 2 | import random 3 | from requests.auth import HTTPDigestAuth 4 | 5 | 6 | class UserBehavior(TaskSet): 7 | """Used for testing communication between Python and RAPID (RobotWare) through the use of RobotWebServices 8 | and REST API""" 9 | 10 | def on_start(self): 11 | pass 12 | 13 | def on_stop(self): 14 | pass 15 | 16 | @task(1) 17 | def executionstate(self): 18 | """Get the state of the controller""" 19 | 20 | self.client.get('/rw/rapid/execution', auth=HTTPDigestAuth(username='Default User', password='robotics')) 21 | 22 | @task(1) 23 | def setrapidvariable(self): 24 | """Sets the *locust* variable to a random value""" 25 | 26 | # TODO: test with both a set variable and also with a random variable, see any difference in timings? 
27 | value = random.randint(0,100) 28 | locustvar = "locustvar" 29 | payload = {'value': value} 30 | self.client.post('/rw/rapid/symbol/data/RAPID/T_ROB1/' + locustvar + '?action=set', 31 | auth=HTTPDigestAuth(username='Default User', password='robotics'), data=payload) 32 | 33 | @task(1) 34 | def getrapidvariable(self): 35 | """Reads the *locustvar* variable""" 36 | 37 | locustvar = "locustvar" 38 | # A GET request needs no payload 39 | self.client.get('/rw/rapid/symbol/data/RAPID/T_ROB1/' + locustvar + ';value', 40 | auth=HTTPDigestAuth(username='Default User', password='robotics')) 41 | 42 | @task(1) 43 | def getgripperpos(self): 44 | """Reads the current gripper position""" 45 | 46 | self.client.get('/rw/motionsystem/mechunits/ROB_1/robtarget/?tool=tGripper&wobj=wobjTableN&coordinate=Wobj', 47 | auth=HTTPDigestAuth(username='Default User', password='robotics')) 48 | 49 | @task(1) 50 | def setrobtarget(self): 51 | """Sets the *locustrobtarg* variable to a specific value""" 52 | 53 | locustrobtarg = "locustrobtarg" 54 | payload = {'value': "[[0,0,100],[0,1,0,0],[-1,0,0,0],[9E+9,9E+9,9E+9,9E+9,9E+9,9E+9]]"} 55 | self.client.post('/rw/rapid/symbol/data/RAPID/T_ROB1/' + locustrobtarg + '?action=set', 56 | auth=HTTPDigestAuth(username='Default User', password='robotics'), data=payload) 57 | 58 | class WebsiteUser(HttpLocust): 59 | task_set = UserBehavior 60 | 61 | # Wait time between each request 62 | wait_time = between(0.01, 0.015) 63 | 64 | 65 | """To run the test, type the following in console: locust -f RobotWebServices/API_Benchmark/com_testing.py 66 | Open localhost:8089 to get to the http page to spawn users and decide spawn rate and address""" 67 | -------------------------------------------------------------------------------- /PythonCom_lab.txt: -------------------------------------------------------------------------------- 1 | MODULE PythonCom 2 | 3 | VAR num WPW:=0; ! What Python Wants 4 | VAR num WRD:=0; !
What RAPID Does 5 | VAR bool ready_flag:= FALSE; ! RAPID's ready flag 6 | VAR bool image_processed:= FALSE; ! Python's ready flag 7 | CONST num gripHeight:= 10; 8 | CONST num safeHeight:= 60; ! Close-up image height 9 | 10 | ! Where the selected puck is located 11 | VAR robtarget puck_target := [[0,0,200],[0,1,0,0],[-1,0,0,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]]; 12 | 13 | VAR robtarget put_puck_target; ! Target used to place puck 14 | VAR num puck_angle; ! Angle puck should be rotated (optional) 15 | 16 | CONST robtarget middleOfTable:=[[0,0,0],[0,1,0,0],[-1,0,0,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]]; 17 | 18 | ! Overview image position 19 | VAR robtarget overview:=[[0,0,500],[0,1,0,0],[-1,0,0,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]]; 20 | 21 | TASK PERS wobjdata wobjTableN:=[FALSE,TRUE,"",[[150,-500,8],[0.707106781,0,0,-0.707106781]],[[0,0,0],[1,0,0,0]]]; 22 | PERS tooldata tGripper:=[TRUE,[[0,0,114.25],[0,0,0,1]],[1,[-0.095984607,0.082520613,38.69176324],[1,0,0,0],0,0,0]]; 23 | 24 | PROC main() 25 | 26 | closeGripper(FALSE); 27 | MoveJ overview,v100,z10,tGripper\WObj:=wobjTableN; 28 | 29 | ready_flag:=TRUE; 30 | 31 | WHILE TRUE DO 32 | 33 | IF WPW <> 0 THEN 34 | WRD:=WPW; 35 | WPW:=0; 36 | 37 | TEST WRD 38 | 39 | CASE 1: 40 | !!!!!!!!!!!!!!!!!!!!!!!! 41 | ! Move a chosen puck ! 42 | !!!!!!!!!!!!!!!!!!!!!!!! 43 | !---insert code here---! 44 | 45 | CASE 2: 46 | !!!!!!!!!!!!!!!!!!!!!!!! 47 | ! Stack pucks ! 48 | !!!!!!!!!!!!!!!!!!!!!!!! 49 | !---insert code here---! 50 | 51 | ENDTEST 52 | ENDIF 53 | WRD:=0; 54 | ENDWHILE 55 | ENDPROC 56 | 57 | 58 | !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! 59 | ! Gather/Pick up puck function is needed ! 60 | !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! 61 | !------------insert code here------------! 62 | 63 | !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! 64 | ! Place puck function is needed ! 65 | !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! 66 | !--------insert code here-------! 67 | 68 | ! 
wait for python to finish processing image 69 | PROC wait_for_python() 70 | WHILE NOT image_processed DO 71 | ENDWHILE 72 | image_processed:=FALSE; 73 | ENDPROC 74 | 75 | ! Function to open / close the gripper 76 | PROC closeGripper(bool state) 77 | WaitTime 0.1; 78 | IF state=TRUE THEN 79 | setDO PneumaticGripper, 1; 80 | ELSEIF state=FALSE THEN 81 | setDO PneumaticGripper, 0; 82 | ENDIF 83 | WaitTime 0.2; 84 | ENDPROC 85 | 86 | ENDMODULE -------------------------------------------------------------------------------- /docs/source/opencv_to_rapid.rst: -------------------------------------------------------------------------------- 1 | OpenCV_to_RAPID.py 2 | ================== 3 | 4 | .. py:function:: create_robtarget(gripper_height, gripper_rot, cam_pos, image, puck, cam_comp=False) 5 | 6 | Complete a series of transformations to finally create a robtarget of the puck's position from an image. 7 | These transformations are based on the setup at the University of Stavanger, and with a specific RobotStudio setup 8 | (in regards to coordinate systems). This `Pack and Go`_ file includes this setup. 9 | 10 | :param float gripper_height: Height of gripper above work object 11 | :param tuple gripper_rot: Orientation of gripper, must be a Quaternion (tuple of length 4) 12 | :param float[] cam_pos: Camera position [x,y] 13 | :param ndarray image: An image 14 | :param Puck puck: A :py:class:`Puck` object 15 | :param bool cam_comp: Running camera_adjustment? 16 | 17 | :return: A :py:class:`Puck` object with robtarget as position 18 | 19 | .. _Pack and Go: http://www.ux.uis.no/~karlsk/ELE610/UiS_E458_nov18.rspag 20 | 21 | .. py:function:: transform_position(gripper_rot, puck) 22 | 23 | Transform coordinate system given by image in OpenCV to coordinate system of work object in RAPID. 24 | Swap x & y coordinates and rotate by the same amount that the camera has been rotated. 
25 | 26 | :param tuple gripper_rot: Orientation of gripper, must be a Quaternion (tuple of length 4) 27 | :param Puck puck: A :py:class:`Puck` object 28 | 29 | .. py:function:: pixel_to_mm(gripper_height, puck, image) 30 | 31 | Converts coordinates in image from pixels to millimeters. 32 | This depends on the camera's working distance. 33 | 34 | :param float gripper_height: Height of gripper above work object 35 | :param Puck puck: A :py:class:`Puck` object 36 | :param ndarray image: An image 37 | 38 | .. py:function:: overshoot_comp(gripper_height, puck) 39 | 40 | Compensate for the overshoot phenomenon which occurs when trying to pinpoint 41 | the location of a 3D object in a 2D image. 42 | 43 | :param float gripper_height: Height of gripper above work object 44 | :param Puck puck: A :py:class:`Puck` object 45 | 46 | .. py:function:: camera_compensation(gripper_height, gripper_rot, puck) 47 | 48 | Compensate for an angled camera view. Different cameras will be 49 | angled differently both internally and externally when mounted to a surface. 50 | The slope values must first be calculated by running *camera_adjustment*. 51 | Works with any camera orientation. 52 | 53 | :param float gripper_height: Height of gripper above work object 54 | :param tuple gripper_rot: Orientation of gripper, must be a Quaternion (tuple of length 4) 55 | :param Puck puck: A :py:class:`Puck` object 56 | 57 | .. py:function:: get_camera_position(trans, rot) 58 | 59 | Uses the offset between the gripper and camera to find the camera's position. 
60 | `Robot Web Services`_ 61 | 62 | :param float[] trans: Position of gripper 63 | :param tuple rot: Orientation of gripper, must be a Quaternion (tuple of length 4) 64 | 65 | :returns: Position of camera -------------------------------------------------------------------------------- /docs/build/html/_sources/opencv_to_rapid.rst.txt: -------------------------------------------------------------------------------- 1 | OpenCV_to_RAPID.py 2 | ================== 3 | 4 | .. py:function:: create_robtarget(gripper_height, gripper_rot, cam_pos, image, puck, cam_comp=False) 5 | 6 | Complete a series of transformations to finally create a robtarget of the puck's position from an image. 7 | These transformations are based on the setup at the University of Stavanger, and with a specific RobotStudio setup 8 | (in regards to coordinate systems). This `Pack and Go`_ file includes this setup. 9 | 10 | :param float gripper_height: Height of gripper above work object 11 | :param tuple gripper_rot: Orientation of gripper, must be a Quaternion (tuple of length 4) 12 | :param float[] cam_pos: Camera position [x,y] 13 | :param ndarray image: An image 14 | :param Puck puck: A :py:class:`Puck` object 15 | :param bool cam_comp: Running camera_adjustment? 16 | 17 | :return: A :py:class:`Puck` object with robtarget as position 18 | 19 | .. _Pack and Go: http://www.ux.uis.no/~karlsk/ELE610/UiS_E458_nov18.rspag 20 | 21 | .. py:function:: transform_position(gripper_rot, puck) 22 | 23 | Transform coordinate system given by image in OpenCV to coordinate system of work object in RAPID. 24 | Swap x & y coordinates and rotate by the same amount that the camera has been rotated. 25 | 26 | :param tuple gripper_rot: Orientation of gripper, must be a Quaternion (tuple of length 4) 27 | :param Puck puck: A :py:class:`Puck` object 28 | 29 | .. py:function:: pixel_to_mm(gripper_height, puck, image) 30 | 31 | Converts coordinates in image from pixels to millimeters. 
32 | This depends on the camera's working distance. 33 | 34 | :param float gripper_height: Height of gripper above work object 35 | :param Puck puck: A :py:class:`Puck` object 36 | :param ndarray image: An image 37 | 38 | .. py:function:: overshoot_comp(gripper_height, puck) 39 | 40 | Compensate for the overshoot phenomenon which occurs when trying to pinpoint 41 | the location of a 3D object in a 2D image. 42 | 43 | :param float gripper_height: Height of gripper above work object 44 | :param Puck puck: A :py:class:`Puck` object 45 | 46 | .. py:function:: camera_compensation(gripper_height, gripper_rot, puck) 47 | 48 | Compensate for an angled camera view. Different cameras will be 49 | angled differently both internally and externally when mounted to a surface. 50 | The slope values must first be calculated by running *camera_adjustment*. 51 | Works with any camera orientation. 52 | 53 | :param float gripper_height: Height of gripper above work object 54 | :param tuple gripper_rot: Orientation of gripper, must be a Quaternion (tuple of length 4) 55 | :param Puck puck: A :py:class:`Puck` object 56 | 57 | .. py:function:: get_camera_position(trans, rot) 58 | 59 | Uses the offset between the gripper and camera to find the camera's position. 60 | `Robot Web Services`_ 61 | 62 | :param float[] trans: Position of gripper 63 | :param tuple rot: Orientation of gripper, must be a Quaternion (tuple of length 4) 64 | 65 | :returns: Position of camera -------------------------------------------------------------------------------- /docs/build/html/help.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | Need Help? — ABB-Robot-Machine-Vision 0.0.1 documentation 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 |
28 |
29 |
30 | 31 | 32 |
33 | 34 |
35 |

Need Help?

36 |

You’re on your own now, buddy.

37 |
38 | 39 | 40 |
41 | 42 |
43 |
44 | 95 |
96 |
97 | 108 | 109 | 110 | 111 | 112 | 113 | -------------------------------------------------------------------------------- /docs/build/html/search.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | Search — ABB-Robot-Machine-Vision 0.0.1 documentation 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | 31 |
32 |
33 |
34 | 35 | 36 |
37 | 38 |

Search

39 |
40 | 41 |

42 | Please activate JavaScript to enable the search 43 | functionality. 44 |

45 |
46 |

47 | From here you can search these documents. Enter your search 48 | words into the box below and click "search". Note that the search 49 | function will automatically search for all of the words. Pages 50 | containing fewer words won't appear in the result list. 51 |

52 |
53 | 54 | 55 | 56 |
57 | 58 |
59 | 60 |
61 | 62 |
63 | 64 |
65 |
66 | 105 |
106 |
107 | 115 | 116 | 117 | 118 | 119 | 120 | -------------------------------------------------------------------------------- /main_lab.py: -------------------------------------------------------------------------------- 1 | # place needed imports 2 | 3 | ######################################################### 4 | # First, use OpenCV to initialize the camera # 5 | # or use the Camera class given if you # 6 | # prefer to use uEye API # 7 | ######################################################### 8 | # ----------------insert code here--------------------- # 9 | 10 | ######################################################### 11 | # Second, initialize robot communication, # 12 | # and execute RAPID program # 13 | # (Create RWS object) # 14 | ######################################################### 15 | # ----------------insert code here--------------------- # 16 | 17 | while True: # Run script indefinitely 18 | print(""" 19 | Choose what to do: 20 | 1: Pick and place a single puck 21 | 0: Exit 22 | """) 23 | # Program may be extended 24 | 25 | userinput = int(input('\nWhat should RAPID do?: ')) 26 | 27 | if userinput == 1: 28 | """ 29 | Pick up and place a chosen puck to a chosen location. 30 | Captures an image and finds all pucks in the work area. 31 | The user is prompted to select puck and location. 32 | Uses collision avoidance when picking up puck. 
33 | """ 34 | 35 | print("Pick and place a single puck") 36 | 37 | ######################################################### 38 | # Start wanted case in RAPID and wait for RAPID # 39 | ######################################################### 40 | # ----------------insert code here--------------------- # 41 | 42 | ######################################################### 43 | # Capture image, process it and scan it for QR codes # 44 | # Do this several times if no QR code is found # 45 | ######################################################### 46 | # ----------------insert code here--------------------- # 47 | 48 | ######################################################### 49 | # Create a robtarget from the QR codes' position # 50 | ######################################################### 51 | # ----------------insert code here--------------------- # 52 | 53 | ######################################################### 54 | # Send the robtarget to RAPID # 55 | ######################################################### 56 | # ----------------insert code here--------------------- # 57 | 58 | ######################################################### 59 | # Tell RAPID to move to a close-up image position # 60 | # (Update flag variable) # 61 | ######################################################### 62 | # ----------------insert code here--------------------- # 63 | 64 | ######################################################### 65 | # Wait for robot to reach close-up position # 66 | # Capture image, process it and scan it for QR codes # 67 | # Do this several times if no QR code is found # 68 | ######################################################### 69 | # ----------------insert code here--------------------- # 70 | 71 | ######################################################### 72 | # Create a (more accurate) robtarget # 73 | # from the QR codes' position # 74 | ######################################################### 75 | # ----------------insert code 
here--------------------- # 76 | 77 | ######################################################### 78 | # Send the robtarget to RAPID # 79 | ######################################################### 80 | # ----------------insert code here--------------------- # 81 | 82 | ######################################################### 83 | # Tell RAPID to pick up and place puck # 84 | # (Update flag variable) # 85 | ######################################################### 86 | # ----------------insert code here--------------------- # 87 | 88 | elif userinput == 0: 89 | print("Exiting Python program and turning off robot motors") 90 | ######################################################### 91 | # Stop RAPID execution and switch off robot motors # 92 | # (Use RWS functions) # 93 | ######################################################### 94 | # ----------------insert code here--------------------- # 95 | break -------------------------------------------------------------------------------- /docs/build/html/robotware.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | RobotWare — test-project 0.0.1 documentation 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 |
27 |
28 |
29 | 30 | 31 |
32 | 33 |
34 |

RobotWare

35 |
36 |
37 | set_zonedata(self, var, zonedata)
38 |

Set the value for a zonedata variable in RAPID. 39 | var: name of variable as declared in RAPID. 40 | zonedata: desired zonedata value [int].

41 |
42 | 43 |
44 |
45 | set_speeddata(self, var, speeddata)
46 |

Set the value for a speeddata variable in RAPID. 47 | var: name of variable as declared in RAPID. 48 | speeddata: desired speeddata value [int].

49 |
50 | 51 |
52 |
53 | set_speed_ratio(self, speed_ratio)
54 |

Set the speed ratio of the robot. 55 | speed_ratio: desired speed ratio in percent [1-100].

56 |
57 | 58 |
59 | 60 | 61 |
62 | 63 |
64 |
65 | 110 |
111 |
112 | 123 | 124 | 125 | 126 | 127 | 128 | -------------------------------------------------------------------------------- /docs/build/html/_static/pygments.css: -------------------------------------------------------------------------------- 1 | .highlight .hll { background-color: #ffffcc } 2 | .highlight { background: #f8f8f8; } 3 | .highlight .c { color: #8f5902; font-style: italic } /* Comment */ 4 | .highlight .err { color: #a40000; border: 1px solid #ef2929 } /* Error */ 5 | .highlight .g { color: #000000 } /* Generic */ 6 | .highlight .k { color: #004461; font-weight: bold } /* Keyword */ 7 | .highlight .l { color: #000000 } /* Literal */ 8 | .highlight .n { color: #000000 } /* Name */ 9 | .highlight .o { color: #582800 } /* Operator */ 10 | .highlight .x { color: #000000 } /* Other */ 11 | .highlight .p { color: #000000; font-weight: bold } /* Punctuation */ 12 | .highlight .ch { color: #8f5902; font-style: italic } /* Comment.Hashbang */ 13 | .highlight .cm { color: #8f5902; font-style: italic } /* Comment.Multiline */ 14 | .highlight .cp { color: #8f5902 } /* Comment.Preproc */ 15 | .highlight .cpf { color: #8f5902; font-style: italic } /* Comment.PreprocFile */ 16 | .highlight .c1 { color: #8f5902; font-style: italic } /* Comment.Single */ 17 | .highlight .cs { color: #8f5902; font-style: italic } /* Comment.Special */ 18 | .highlight .gd { color: #a40000 } /* Generic.Deleted */ 19 | .highlight .ge { color: #000000; font-style: italic } /* Generic.Emph */ 20 | .highlight .gr { color: #ef2929 } /* Generic.Error */ 21 | .highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ 22 | .highlight .gi { color: #00A000 } /* Generic.Inserted */ 23 | .highlight .go { color: #888888 } /* Generic.Output */ 24 | .highlight .gp { color: #745334 } /* Generic.Prompt */ 25 | .highlight .gs { color: #000000; font-weight: bold } /* Generic.Strong */ 26 | .highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ 27 | .highlight .gt { 
color: #a40000; font-weight: bold } /* Generic.Traceback */ 28 | .highlight .kc { color: #004461; font-weight: bold } /* Keyword.Constant */ 29 | .highlight .kd { color: #004461; font-weight: bold } /* Keyword.Declaration */ 30 | .highlight .kn { color: #004461; font-weight: bold } /* Keyword.Namespace */ 31 | .highlight .kp { color: #004461; font-weight: bold } /* Keyword.Pseudo */ 32 | .highlight .kr { color: #004461; font-weight: bold } /* Keyword.Reserved */ 33 | .highlight .kt { color: #004461; font-weight: bold } /* Keyword.Type */ 34 | .highlight .ld { color: #000000 } /* Literal.Date */ 35 | .highlight .m { color: #990000 } /* Literal.Number */ 36 | .highlight .s { color: #4e9a06 } /* Literal.String */ 37 | .highlight .na { color: #c4a000 } /* Name.Attribute */ 38 | .highlight .nb { color: #004461 } /* Name.Builtin */ 39 | .highlight .nc { color: #000000 } /* Name.Class */ 40 | .highlight .no { color: #000000 } /* Name.Constant */ 41 | .highlight .nd { color: #888888 } /* Name.Decorator */ 42 | .highlight .ni { color: #ce5c00 } /* Name.Entity */ 43 | .highlight .ne { color: #cc0000; font-weight: bold } /* Name.Exception */ 44 | .highlight .nf { color: #000000 } /* Name.Function */ 45 | .highlight .nl { color: #f57900 } /* Name.Label */ 46 | .highlight .nn { color: #000000 } /* Name.Namespace */ 47 | .highlight .nx { color: #000000 } /* Name.Other */ 48 | .highlight .py { color: #000000 } /* Name.Property */ 49 | .highlight .nt { color: #004461; font-weight: bold } /* Name.Tag */ 50 | .highlight .nv { color: #000000 } /* Name.Variable */ 51 | .highlight .ow { color: #004461; font-weight: bold } /* Operator.Word */ 52 | .highlight .w { color: #f8f8f8; text-decoration: underline } /* Text.Whitespace */ 53 | .highlight .mb { color: #990000 } /* Literal.Number.Bin */ 54 | .highlight .mf { color: #990000 } /* Literal.Number.Float */ 55 | .highlight .mh { color: #990000 } /* Literal.Number.Hex */ 56 | .highlight .mi { color: #990000 } /* Literal.Number.Integer */ 
57 | .highlight .mo { color: #990000 } /* Literal.Number.Oct */ 58 | .highlight .sa { color: #4e9a06 } /* Literal.String.Affix */ 59 | .highlight .sb { color: #4e9a06 } /* Literal.String.Backtick */ 60 | .highlight .sc { color: #4e9a06 } /* Literal.String.Char */ 61 | .highlight .dl { color: #4e9a06 } /* Literal.String.Delimiter */ 62 | .highlight .sd { color: #8f5902; font-style: italic } /* Literal.String.Doc */ 63 | .highlight .s2 { color: #4e9a06 } /* Literal.String.Double */ 64 | .highlight .se { color: #4e9a06 } /* Literal.String.Escape */ 65 | .highlight .sh { color: #4e9a06 } /* Literal.String.Heredoc */ 66 | .highlight .si { color: #4e9a06 } /* Literal.String.Interpol */ 67 | .highlight .sx { color: #4e9a06 } /* Literal.String.Other */ 68 | .highlight .sr { color: #4e9a06 } /* Literal.String.Regex */ 69 | .highlight .s1 { color: #4e9a06 } /* Literal.String.Single */ 70 | .highlight .ss { color: #4e9a06 } /* Literal.String.Symbol */ 71 | .highlight .bp { color: #3465a4 } /* Name.Builtin.Pseudo */ 72 | .highlight .fm { color: #000000 } /* Name.Function.Magic */ 73 | .highlight .vc { color: #000000 } /* Name.Variable.Class */ 74 | .highlight .vg { color: #000000 } /* Name.Variable.Global */ 75 | .highlight .vi { color: #000000 } /* Name.Variable.Instance */ 76 | .highlight .vm { color: #000000 } /* Name.Variable.Magic */ 77 | .highlight .il { color: #990000 } /* Literal.Number.Integer.Long */ -------------------------------------------------------------------------------- /docs/source/imagefunctions.rst: -------------------------------------------------------------------------------- 1 | Image Tools 2 | =========== 3 | 4 | The image tools included in the ABB-Robot-Machine-Vision package are mainly 5 | directed toward `uEye cameras`_. 6 | These tools utilize IDS's Python bindings for the uEye API: PyuEye_. 7 | 8 | .. _PyuEye: https://pypi.org/project/pyueye/ 9 | .. _uEye cameras: https://en.ids-imaging.com/ 10 | 11 | Camera Class 12 | ^^^^^^^^^^^^ 13 | 14 | .. 
py:class:: Camera(cam_ID=0) 15 | 16 | The Camera class contains methods specifically meant for the University of Stavanger. 17 | These functions have only been tested on an IDS UI-1007XS-C camera, and might not work 18 | as intended on other models. 19 | 20 | Quickstart 21 | ********** 22 | 23 | .. code-block:: python 24 | 25 | # Initialization 26 | cam = Camera() 27 | cam.init() 28 | cam.set_parameters() 29 | cam.allocate_memory() 30 | cam.capture_video() 31 | 32 | while True:  # Show video 33 | img = cam.get_image() 34 | cv2.imshow("Quickstart", img) 35 | if cv2.waitKey(1) & 0xFF == ord('q'): 36 | cv2.destroyAllWindows() 37 | break 38 | 39 | cam.exit_camera() 40 | 41 | The parameters set are not currently configurable through method inputs. 42 | They are specifically set for laboratory work at the University of Stavanger. 43 | 44 | 45 | Capturing Images and Video 46 | ^^^^^^^^^^^^^^^^^^^^^^^^^^ 47 | 48 | .. py:function:: capture_image(cam, gripper_height) 49 | 50 | Manually adjusts the camera focus through :py:func:`calculate_focus`. 51 | Captures a single image through PyuEye functions. 52 | Focus is manually adjusted depending on the working distance. 53 | 54 | :param Camera cam: A :py:class:`Camera` object 55 | :param int gripper_height: The height of the gripper above the work object 56 | 57 | :return: An image 58 | 59 | .. py:function:: showVideo(cam) 60 | 61 | Continuously displays the robot's view in an OpenCV imshow window. 62 | The video uses :py:func:`QR_Scanner_visualized` to visualize the scanned QR codes. 63 | :py:func:`showVideo` works well in threads: 64 | 65 | .. code-block:: python 66 | 67 | # Show video feed in separate thread 68 | cam_thread = threading.Thread(target=ImageFunctions.showVideo, 69 | args=(cam,), daemon=True) 70 | cam_thread.start() 71 | 72 | Single images can still be retrieved through 73 | :py:func:`capture_image` while the thread is active. 74 | 75 | :param Camera cam: A :py:class:`Camera` object 76 | 77 | ..
py:function:: calculate_focus(cam, working_distance) 78 | 79 | Calculates the correct focus value for a specific IDS XS 80 | camera with some working distance. 81 | The focus value is found by comparing the working distance to a characteristic. 82 | This characteristic is *unique* to the XS camera with serial code 4102885308. 83 | This function should be accurate for working distances up to 620mm. 84 | 85 | :param Camera cam: A :py:class:`Camera` object 86 | :param int working_distance: Distance from camera lens to subject in millimeters 87 | 88 | .. py:function:: findPucks(cam, robot, robtarget_pucks, cam_comp=False, number_of_images=1) 89 | 90 | Finds all pucks in the images taken and puts them in a list. 91 | The positions of all pucks are then converted to robtargets using 92 | :py:func:`create_robtarget`. 93 | 94 | If pucks that were previously found are found once again, they will not be re-added 95 | to the puck list, so that only new pucks are transformed into robtargets. 96 | 97 | :param Camera cam: A :py:class:`Camera` object 98 | :param RWS robot: An :py:class:`RWS` object 99 | :param Puck[] robtarget_pucks: All pucks found previously 100 | :param bool cam_comp: True if camera adjustment should be run, 101 | False if it has already been run 102 | :param int number_of_images: How many images should be taken in 103 | the attempt to find all pucks 104 | 105 | :return: A list with all found pucks, without duplicates 106 | 107 | 108 | Scanning QR Codes 109 | ^^^^^^^^^^^^^^^^^ 110 | 111 | .. py:function:: QR_Scanner(img) 112 | 113 | Filters and normalizes the input image. The processed image is decoded using pyzbar_. 114 | For every QR code detected, a :py:class:`Puck` object is created. 115 | 116 | :param ndarray img: An image 117 | 118 | :return: A list of :py:class:`Puck` objects 119 | 120 | The QR scanner function uses software from ZBar_ through pyzbar_. 
The image passed to the 121 | function is first filtered and transformed into a normalized grayscale image. The grayscale 122 | image is decoded by ZBar to extract information from QR codes in the image. 123 | 124 | .. py:function:: QR_Scanner_visualized(img) 125 | 126 | Filters and normalizes the input image. The processed image is decoded using pyzbar_. 127 | QR codes found in the image are marked with red squares for display. 128 | 129 | :param ndarray img: An image 130 | 131 | :return: Image for display 132 | 133 | .. _ZBar: http://zbar.sourceforge.net/ 134 | .. _pyzbar: https://pypi.org/project/pyzbar/ 135 | -------------------------------------------------------------------------------- /docs/build/html/_sources/imagefunctions.rst.txt: -------------------------------------------------------------------------------- 1 | Image Tools 2 | =========== 3 | 4 | The image tools included in the ABB-Robot-Machine-Vision package are mainly 5 | directed toward `uEye cameras`_. 6 | These tools utilize IDS's Python bindings for the uEye API: PyuEye_. 7 | 8 | .. _PyuEye: https://pypi.org/project/pyueye/ 9 | .. _uEye cameras: https://en.ids-imaging.com/ 10 | 11 | Camera Class 12 | ^^^^^^^^^^^^ 13 | 14 | .. py:class:: Camera(cam_ID=0) 15 | 16 | The Camera class contains methods specifically meant for the University of Stavanger. 17 | These functions have only been tested on a IDS UI-1007XS-C camera, and might not work 18 | as intended on other models. 19 | 20 | Quickstart 21 | ********** 22 | 23 | .. code-block:: python 24 | 25 | # Initialization 26 | cam = Camera() 27 | cam.init() 28 | cam.set_parameters() 29 | cam.allocate_memory() 30 | cam.capture_video() 31 | 32 | # Show video 33 | img = cam.get_image() 34 | cv2.imshow("Quickstart", img) 35 | if cv2.waitKey(1) & 0xFF == ord('q'): 36 | cv2.destroyAllWindows() 37 | break 38 | 39 | cam.exit_camera() 40 | 41 | The parameters set are not currently configurable through method inputs. 
42 | They are specifically set for laboratory work at the University of Stavanger. 43 | 44 | 45 | Capturing Images and Video 46 | ^^^^^^^^^^^^^^^^^^^^^^^^^^ 47 | 48 | .. py:function:: capture_image(cam, gripper_height) 49 | 50 | Manually adjusts the camera focus through :py:func:`calculate_focus`. 51 | Captures a single image through PyuEye functions. 52 | Focus is manually adjusted depending on the working distance. 53 | 54 | :param Camera cam: A :py:class:`Camera` object 55 | :param int gripper_height: The height of the gripper above the work object 56 | 57 | :return: An image 58 | 59 | .. py:function:: showVideo(cam) 60 | 61 | Continuously displays the robot's view in an OpenCV imshow window. 62 | The video uses :py:func:`QR_Scanner_visualized` to visualize the scanned QR codes. 63 | :py:func:`showVideo` works well in threads: 64 | 65 | .. code-block:: python 66 | 67 | # Show video feed in separate thread 68 | cam_thread = threading.Thread(target=ImageFunctions.showVideo, 69 | args=(cam,), daemon=True) 70 | cam_thread.start() 71 | 72 | Single images can still be retrieved through 73 | :py:func:`capture_image` while the thread is active. 74 | 75 | :param Camera cam: A :py:class:`Camera` object 76 | 77 | .. py:function:: calculate_focus(cam, working_distance) 78 | 79 | Calculates the correct focus value for a specific IDS XS 80 | camera with some working distance. 81 | The focus value is found by comparing the working distance to a characteristic. 82 | This characteristic is *unique* to the XS camera with serial code 4102885308. 83 | This function should be accurate for working distances up to 620mm. 84 | 85 | :param Camera cam: A :py:class:`Camera` object 86 | :param int working_distance: Distance from camera lens to subject in millimeters 87 | 88 | .. py:function:: findPucks(cam, robot, robtarget_pucks, cam_comp=False, number_of_images=1) 89 | 90 | Finds all pucks in the images taken and puts them in a list. 
91 | The positions of all pucks are then converted to robtargets using 92 | :py:func:`create_robtarget`. 93 | 94 | If pucks that were previously found are found once again, they will not be re-added 95 | to the puck list, so that only new pucks are transformed into robtargets. 96 | 97 | :param Camera cam: A :py:class:`Camera` object 98 | :param RWS robot: An :py:class:`RWS` object 99 | :param Puck[] robtarget_pucks: All pucks found previously 100 | :param bool cam_comp: True if camera adjustment should be run, 101 | False if it has already been run 102 | :param int number_of_images: How many images should be taken in 103 | the attempt to find all pucks 104 | 105 | :return: A list with all found pucks, without duplicates 106 | 107 | 108 | Scanning QR Codes 109 | ^^^^^^^^^^^^^^^^^ 110 | 111 | .. py:function:: QR_Scanner(img) 112 | 113 | Filters and normalizes the input image. The processed image is decoded using pyzbar_. 114 | For every QR code detected, a :py:class:`Puck` object is created. 115 | 116 | :param ndarray img: An image 117 | 118 | :return: A list of :py:class:`Puck` objects 119 | 120 | The QR scanner function uses software from ZBar_ through pyzbar_. The image passed to the 121 | function is first filtered and transformed into a normalized grayscale image. The grayscale 122 | image is decoded by ZBar to extract information from QR codes in the image. 123 | 124 | .. py:function:: QR_Scanner_visualized(img) 125 | 126 | Filters and normalizes the input image. The processed image is decoded using pyzbar_. 127 | QR codes found in the image are marked with red squares for display. 128 | 129 | :param ndarray img: An image 130 | 131 | :return: Image for display 132 | 133 | .. _ZBar: http://zbar.sourceforge.net/ 134 | .. 
_pyzbar: https://pypi.org/project/pyzbar/ 135 | -------------------------------------------------------------------------------- /image_tools/Camera.py: -------------------------------------------------------------------------------- 1 | from pyueye import ueye 2 | import numpy as np 3 | import configparser 4 | 5 | repeatability_test = False 6 | number_of_loops = 0 7 | 8 | 9 | class Camera: 10 | """The Camera class contains methods specifically meant for the University of Stavanger. 11 | These functions have only been tested on a IDS UI-1007XS-C camera, and might not work 12 | as intended on other models. 13 | """ 14 | 15 | def __init__(self, cam_ID = 0): 16 | # Several parameters are unused, but may be needed in the future 17 | self.hCam = ueye.HIDS(cam_ID) 18 | self.sInfo = ueye.SENSORINFO() 19 | self.cInfo = ueye.CAMINFO() 20 | self.pcImageMemory = ueye.c_mem_p() 21 | self.MemID = ueye.int() 22 | self.rectAOI = ueye.IS_RECT() 23 | self.pitch = ueye.INT() 24 | self.nBitsPerPixel = ueye.INT(32) # 32 bits for color camera 25 | self.channels = 3 # 3: channels for color mode(RGB); take 1 channel for monochrome 26 | self.m_nColorMode = ueye.IS_CM_BGRA8_PACKED # RGB32 27 | self.bytes_per_pixel = int(self.nBitsPerPixel / 8) 28 | 29 | def init(self): 30 | # Starts the driver and establishes the connection to the camera 31 | nRet = ueye.is_InitCamera(self.hCam, None) 32 | if nRet != ueye.IS_SUCCESS: 33 | print("is_InitCamera ERROR") 34 | 35 | nRet = ueye.is_ResetToDefault(self.hCam) 36 | if nRet != ueye.IS_SUCCESS: 37 | print("is_ResetToDefault ERROR") 38 | 39 | def set_parameters(self, disable_exposure=True): 40 | # Change image format 41 | # formatID = ueye.UINT(5) # Change image format to 2048x1536 42 | formatID = ueye.UINT(8) # Change image format to 1280x960 43 | nRet = ueye.is_ImageFormat(self.hCam, ueye.IMGFRMT_CMD_SET_FORMAT, formatID, ueye.sizeof(formatID)) 44 | 45 | if disable_exposure: 46 | # Disable auto exposure 47 | dblEnable = ueye.DOUBLE(0) 48 | dblDummy = 
ueye.DOUBLE(0) 49 | ueye.is_SetAutoParameter(self.hCam, ueye.IS_SET_ENABLE_AUTO_SENSOR_GAIN_SHUTTER, dblEnable, dblDummy) 50 | 51 | # Set new exposure value from .ini-file 52 | config = configparser.ConfigParser() 53 | config.read('image_tools/cam_adjustments.ini') 54 | exposure = float(config['EXPOSURE']['exposure']) 55 | newExposure = ueye.DOUBLE(exposure) 56 | ret = ueye.is_Exposure(self.hCam, ueye.IS_EXPOSURE_CMD_SET_EXPOSURE, newExposure, ueye.sizeof(newExposure)) 57 | 58 | # Disable autofocus 59 | ueye.is_Focus(self.hCam, ueye.FOC_CMD_SET_DISABLE_AUTOFOCUS, None, 0) 60 | 61 | def allocate_memory(self): 62 | """Allocates an image memory for an image having its dimensions 63 | defined by width and height and its color depth defined by nBitsPerPixel. 64 | """ 65 | nRet = ueye.is_AOI(self.hCam, ueye.IS_AOI_IMAGE_GET_AOI, self.rectAOI, ueye.sizeof(self.rectAOI)) 66 | if nRet != ueye.IS_SUCCESS: 67 | print("is_AOI ERROR") 68 | 69 | nRet = ueye.is_AllocImageMem(self.hCam, self.rectAOI.s32Width, self.rectAOI.s32Height, 70 | self.nBitsPerPixel, self.pcImageMemory, self.MemID) 71 | if nRet != ueye.IS_SUCCESS: 72 | print("is_AllocImageMem ERROR") 73 | else: 74 | # Makes the specified image memory the active memory 75 | nRet = ueye.is_SetImageMem(self.hCam, self.pcImageMemory, self.MemID) 76 | if nRet != ueye.IS_SUCCESS: 77 | print("is_SetImageMem ERROR") 78 | else: 79 | # Set the desired color mode 80 | nRet = ueye.is_SetColorMode(self.hCam, self.m_nColorMode) 81 | 82 | def capture_video(self): 83 | # Activates the camera's live video mode (free run mode) 84 | nRet = ueye.is_CaptureVideo(self.hCam, ueye.IS_DONT_WAIT) 85 | if nRet != ueye.IS_SUCCESS: 86 | print("is_CaptureVideo ERROR") 87 | 88 | # Enables the queue mode for existing image memory sequences 89 | nRet = ueye.is_InquireImageMem(self.hCam, self.pcImageMemory, self.MemID, self.rectAOI.s32Width, 90 | self.rectAOI.s32Height, self.nBitsPerPixel, self.pitch) 91 | if nRet != ueye.IS_SUCCESS: 92 | 
print("is_InquireImageMem ERROR") 93 | 94 | def get_image(self): 95 | # Extract data from our image memory... 96 | array = ueye.get_data(self.pcImageMemory, self.rectAOI.s32Width, self.rectAOI.s32Height, 97 | self.nBitsPerPixel, self.pitch, copy=False) 98 | 99 | # ...and reshape it into a numpy array 100 | frame = np.reshape(array, (self.rectAOI.s32Height.value, self.rectAOI.s32Width.value, self.bytes_per_pixel)) 101 | 102 | return frame 103 | 104 | def exit_camera(self): 105 | ueye.is_FreeImageMem(self.hCam, self.pcImageMemory, self.MemID) 106 | 107 | # Disables the hCam camera handle and releases the data structures and memory areas taken up by the uEye camera 108 | ueye.is_ExitCamera(self.hCam) 109 | -------------------------------------------------------------------------------- /docs/build/html/license.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | License — ABB-Robot-Machine-Vision 0.0.1 documentation 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 |
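The buffer-to-frame conversion in `Camera.get_image` above boils down to a NumPy reshape of the flat byte buffer delivered by the driver. A standalone sketch with hypothetical sensor dimensions (no camera or pyueye required):

```python
import numpy as np

# Stand-in for the flat byte buffer returned by ueye.get_data:
# a hypothetical 4x3 sensor readout at 4 bytes per pixel (BGRA).
width, height, bytes_per_pixel = 4, 3, 4
flat = np.arange(width * height * bytes_per_pixel, dtype=np.uint8)

# Same reshape as get_image: rows x columns x bytes-per-pixel
frame = np.reshape(flat, (height, width, bytes_per_pixel))

print(frame.shape)  # -> (3, 4, 4)
print(frame[0, 0])  # -> [0 1 2 3], the first pixel's B, G, R, A bytes
```

Note that `copy=False` in `get_image` means the frame is a view into the driver's image memory, which is why the memory must stay allocated until `exit_camera` is called.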
29 |
30 |
31 | 32 | 33 |
34 | 35 |
36 |

License

37 |

Copyright 2020 Markus H. Iversflaten & Ole Christian Handegård

38 |

Permission is hereby granted, free of charge, 39 | to any person obtaining a copy of this software 40 | and associated documentation files (the “Software”), 41 | to deal in the Software without restriction, including 42 | without limitation the rights to use, copy, modify, merge, 43 | publish, distribute, sublicense, and/or sell copies of the 44 | Software, and to permit persons to whom the Software is 45 | furnished to do so, subject to the following conditions:

46 |

The above copyright notice and this permission notice shall be included in 47 | all copies or substantial portions of the Software.

48 |

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 49 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 50 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 51 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 52 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 53 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN 54 | THE SOFTWARE.

55 |
56 | 57 | 58 |
59 | 60 |
61 |
62 | 113 |
114 |
115 | 126 | 127 | 128 | 129 | 130 | 131 | -------------------------------------------------------------------------------- /PythonCom.txt: -------------------------------------------------------------------------------- 1 | MODULE PythonCom 2 | 3 | VAR num WPW:=0; ! What Python Wants 4 | VAR num WRD:=0; ! What RAPID Does 5 | VAR bool ready_flag:= FALSE; ! RAPID's ready flag 6 | VAR bool image_processed:= FALSE; ! Python's ready flag 7 | CONST num gripHeight:= 10; 8 | CONST num safeHeight:= 60; ! Close-up image height 9 | VAR num gripper_camera_offset{2}; ! Offsets for close-up image 10 | VAR num gripper_angle:= 0; ! Gripper orientation for close-up image and picking 11 | VAR speeddata vSpeed:= v1000; ! Speeddata generally used throughout the module 12 | VAR zonedata zZone:= fine; ! Zonedata generally used throughout the module 13 | 14 | VAR num length; ! Used to determine number of loops in stack pucks (WPW=2) 15 | 16 | ! Where the selected puck is located 17 | VAR robtarget puck_target := [[0,0,200],[0,1,0,0],[-1,0,0,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]]; 18 | 19 | VAR robtarget put_puck_target; ! Target used to place puck 20 | VAR num puck_angle; ! Angle puck should be rotated 21 | 22 | CONST robtarget middleOfTable:=[[0,0,0],[0,1,0,0],[-1,0,0,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]]; 23 | 24 | ! Overview image position 25 | VAR robtarget overview:=[[0,0,500],[0,1,0,0],[-1,0,0,0],[9E+09,9E+09,9E+09,9E+09,9E+09,9E+09]]; 26 | 27 | PROC main() 28 | AccSet 100, 50 \FinePointRamp:=30; 29 | closeGripper(FALSE); 30 | MoveJ overview,vSpeed,z50,tGripper\WObj:=wobjTableN; 31 | 32 | ready_flag:=TRUE; 33 | 34 | WHILE TRUE DO 35 | 36 | IF WPW <> 0 THEN 37 | WRD:=WPW; 38 | WPW:=0; 39 | 40 | TEST WRD 41 | CASE 1: 42 | ! 
Move a chosen puck 43 | MoveJ overview,vSpeed,fine,tGripper\WObj:=wobjTableN; 44 | ready_flag:=TRUE; 45 | 46 | wait_for_python; 47 | 48 | puck_target.rot := OrientZYX(gripper_angle, 0, 180); 49 | MoveJ Offs(puck_target, -gripper_camera_offset{1}, -gripper_camera_offset{2}, safeHeight),vSpeed,fine,tGripper\WObj:=wobjTableN; 50 | ready_flag:=TRUE; 51 | 52 | wait_for_python; 53 | puck_target.rot := OrientZYX(gripper_angle, 0, 180); 54 | MoveJ Offs(puck_target, -gripper_camera_offset{1}, -gripper_camera_offset{2}, safeHeight),vSpeed,z10,tGripper\WObj:=wobjTableN; 55 | getPuckSmoothly puck_target; 56 | 57 | putPuckSmoothly put_puck_target, puck_angle; 58 | 59 | CASE 2: 60 | ! Stack pucks 61 | MoveJ overview,vSpeed,fine,tGripper\WObj:=wobjTableN; 62 | ready_flag:=TRUE; 63 | 64 | wait_for_python; 65 | 66 | FOR i FROM 0 TO length-1 DO 67 | 68 | wait_for_python; 69 | ! Move to close-up image position. gripper_camera_offset ensures that the camera is perfectly above the subject. 70 | puck_target.rot := OrientZYX(gripper_angle, 0, 180); 71 | MoveJ Offs(puck_target, -gripper_camera_offset{1}, -gripper_camera_offset{2}, safeHeight),vSpeed,fine,tGripper\WObj:=wobjTableN; 72 | ready_flag:=TRUE; 73 | 74 | wait_for_python; 75 | 76 | puck_target.rot := OrientZYX(gripper_angle, 0, 180); 77 | getPuckSmoothly puck_target; 78 | 79 | MoveL Offs(puck_target, 0, 0, i*30 + gripHeight + 30),vSpeed,z5,tGripper\WObj:=wobjTableN; 80 | 81 | putPuckSmoothly Offs(middleOfTable, 0, 150, i*30), puck_angle; 82 | 83 | ready_flag:=TRUE; 84 | 85 | ENDFOR 86 | 87 | CASE 3: 88 | ! Repeatability test 89 | MoveJ overview,vSpeed,fine,tGripper\WObj:=wobjTableN; 90 | ready_flag:=TRUE; 91 | 92 | wait_for_python; 93 | 94 | ! Move to close-up image position. gripper_camera_offset ensures that the camera is perfectly above the subject. 
95 | puck_target.rot := OrientZYX(140, 0, 180); 96 | MoveJ Offs(puck_target, -gripper_camera_offset{1}, -gripper_camera_offset{2}, safeHeight),vSpeed,fine,tGripper\WObj:=wobjTableN; 97 | ready_flag:=TRUE; 98 | 99 | wait_for_python; 100 | 101 | getPuckSmoothly puck_target; 102 | 103 | putPuckSmoothly put_puck_target, puck_angle; 104 | 105 | CASE 5: 106 | ! Camera adjustment 107 | MoveL overview,vSpeed,fine,tGripper\WObj:=wobjTableN; 108 | ready_flag:=TRUE; 109 | 110 | wait_for_python; 111 | 112 | MoveL Offs(puck_target, -55, 0, safeHeight),vSpeed,fine,tGripper\WObj:=wobjTableN; 113 | ready_flag:=TRUE; 114 | 115 | wait_for_python; 116 | 117 | MoveL Offs(puck_target, -55, 0, 500),vSpeed,fine,tGripper\WObj:=wobjTableN; 118 | ready_flag:=TRUE; 119 | 120 | ENDTEST 121 | ENDIF 122 | WRD:=0; 123 | 124 | 125 | ENDWHILE 126 | 127 | ENDPROC 128 | 129 | ENDMODULE -------------------------------------------------------------------------------- /Puck.py: -------------------------------------------------------------------------------- 1 | import math 2 | import sys 3 | 4 | 5 | class Puck: 6 | 7 | """ 8 | Puck class 9 | 10 | contains: 11 | puck number 12 | puck position (x,y) 13 | puck angle 14 | puck height 15 | """ 16 | 17 | def __init__(self, number, position, angle, height=30): 18 | self.set_number(number) 19 | self.set_position(position) 20 | self.set_angle(angle) 21 | self.set_height(height) 22 | 23 | def __eq__(self, other): 24 | if not isinstance(other, Puck): 25 | # don't attempt to compare against unrelated types 26 | return NotImplemented 27 | 28 | return self.number == other.number 29 | 30 | def __str__(self): 31 | return 'Puck #' + str(self.number) 32 | 33 | def set_position(self, position): 34 | try: 35 | position = list(position) 36 | if len(position) == 2: 37 | self.position = position 38 | else: 39 | raise TypeError 40 | except TypeError: 41 | print("Position has to be a list of [x, y]") 42 | 43 | def set_angle(self, angle): 44 | try: 45 | angle = float(angle) 46 
| self.angle = angle 47 | if self.angle > 180: 48 | self.angle -= 360 49 | elif self.angle < -180: 50 | self.angle += 360 51 | except (TypeError, ValueError): 52 | print("Puck angle has to be a number") 53 | 54 | def set_height(self, height): 55 | try: 56 | height = int(height) 57 | self.height = height 58 | except (TypeError, ValueError): 59 | print("Puck height has to be an integer") 60 | 61 | def set_number(self, number): 62 | try: 63 | number = int(number) 64 | self.number = number 65 | except (TypeError, ValueError): 66 | print("Puck number has to be an integer") 67 | 68 | def get_xyz(self): 69 | return self.position + [self.height - 30] 70 | 71 | def check_collision(self, puck_list): 72 | """ 73 | To pick up pucks, the gripper slides in towards them. 74 | This path must be clear of any other pucks, so that no collisions occur. 75 | 76 | Depending on the positions of all other pucks, 77 | this path is rotated around the puck until a clear path is found. 78 | """ 79 | 80 | collision_list = [True] # Assume there is collision before otherwise is proven 81 | rotation = 0 # Current rotation in degrees 82 | tries = 0 # Number of tries to avoid collision 83 | correct_rotation = 0 # Rotation which will avoid collision 84 | forward_grip = True # Slide in forward or backward toward puck 85 | 86 | # Collision area/path (rectangle): 87 | x1 = - 95 88 | x2 = 30 89 | y1 = - 70 # -67.5 90 | y2 = 70 # 67.5 91 | 92 | while True in collision_list: # While there is still at least one collision 93 | 94 | collision_list.clear() 95 | 96 | for other_puck in puck_list: 97 | if self.number != other_puck.number: 98 | puck_pos = self.position 99 | 100 | # Rotate every puck around current puck (self) 101 | # Instead of rotating the path in relation to the puck, 102 | # the other pucks are rotated around the puck until they no longer appear inside the path. 103 | # This means the rotation has to be equal and opposite.
104 | rotated_position = rotate(other_puck.position, puck_pos, - rotation) 105 | 106 | # Check if any pucks block the path 107 | collision = (x1 + puck_pos[0] < rotated_position[0] < x2 + puck_pos[0]) \ 108 | and (y1 + puck_pos[1] < rotated_position[1] < y2 + puck_pos[1]) 109 | 110 | # Gather all results in list 111 | collision_list.append(collision) 112 | 113 | # Maximum rotation is 180 degrees (+/-) 114 | if rotation > 180: 115 | correct_rotation = rotation - 360 116 | else: 117 | correct_rotation = rotation 118 | 119 | # If the gripper must rotate more than 90 degrees (+/-), 120 | # then it should instead slide in toward the puck backward. 121 | # This is done by doing a "180" with the gripper. 122 | if correct_rotation > 90: 123 | correct_rotation -= 180 124 | forward_grip = False # Backward grip 125 | elif correct_rotation < -90: 126 | correct_rotation += 180 127 | forward_grip = False # Backward grip 128 | 129 | rotation += 1 130 | tries += 1 131 | 132 | if tries > 360: # Stop trying if every angle gives collision 133 | sys.exit(0) # TODO: add better handling here 134 | return correct_rotation, forward_grip # Return the value that gave no collision 135 | 136 | 137 | def rotate(point, about_point, angle): 138 | """ 139 | Rotate a point counterclockwise by a given angle around a given origin. 140 | 141 | The angle should be given in degrees. 
142 | """ 143 | angle = math.radians(angle) 144 | 145 | ox, oy = about_point 146 | px, py = point 147 | 148 | qx = ox + math.cos(angle) * (px - ox) - math.sin(angle) * (py - oy) 149 | qy = oy + math.sin(angle) * (px - ox) + math.cos(angle) * (py - oy) 150 | rotated_point = (qx, qy) 151 | 152 | return rotated_point 153 | -------------------------------------------------------------------------------- /docs/build/html/index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | ABB Robot with Machine Vision — ABB-Robot-Machine-Vision 0.0.1 documentation 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 |
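The `rotate` helper above implements the standard 2-D rotation of a point about an arbitrary origin, and is what `check_collision` uses to sweep the gripper's approach path around a puck. A quick sanity check (the function is duplicated here so the snippet is self-contained):

```python
import math

def rotate(point, about_point, angle):
    # Same math as Puck.py's rotate(): counterclockwise rotation
    # of `point` about `about_point` by `angle` degrees.
    angle = math.radians(angle)
    ox, oy = about_point
    px, py = point
    qx = ox + math.cos(angle) * (px - ox) - math.sin(angle) * (py - oy)
    qy = oy + math.sin(angle) * (px - ox) + math.cos(angle) * (py - oy)
    return (qx, qy)

# Rotating (1, 0) by 90 degrees counterclockwise about the origin
# should land (up to floating-point noise) on (0, 1).
qx, qy = rotate((1.0, 0.0), (0.0, 0.0), 90)
print(round(qx, 6), round(qy, 6))  # -> 0.0 1.0
```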
28 |
29 |
30 | 31 | 32 |
33 | 34 |
35 |

ABB Robot with Machine Vision

36 |

This software is mainly directed toward engineering 37 | students and staff at the University of Stavanger. 38 | It was first created as part of a Bachelor's thesis.

39 |
40 |

User Guide

41 |
42 | 63 |
64 |
65 |
66 |
67 |

Indices

68 | 72 |
73 | 74 | 75 |
76 | 77 |
78 |
79 | 129 |
130 |
131 | 142 | 143 | 144 | 145 | 146 | 147 | -------------------------------------------------------------------------------- /image_tools/camera_correction.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import configparser 3 | from pyueye import ueye 4 | from image_tools import ImageFunctions 5 | import os 6 | import sys 7 | 8 | 9 | def camera_adjustment(cam, robot): 10 | """Calculates the slope which represents how much the lens of the camera is angled. 11 | This is done by comparing two images taken from different heights. 12 | """ 13 | print("---Running camera_adjustment---") 14 | 15 | abspath = os.path.abspath("image_tools/cam_adjustments.ini") 16 | 17 | adjustment_file = open('camera_adjustment_XS.txt', 'w') 18 | 19 | i = 0 20 | while robot.is_running() and i < 25: # Compare images 25 times 21 | i += 1 22 | robot.set_rapid_variable("WPW", 5) # Start camera adjustment procedure in RAPID 23 | 24 | robot.wait_for_rapid() 25 | 26 | robtarget_pucks = [] 27 | 28 | # First, find the puck that will be used for comparison 29 | while not robtarget_pucks: 30 | ImageFunctions.findPucks(cam, robot, robtarget_pucks, cam_comp=True) 31 | print("xyz:", robtarget_pucks[0].get_xyz()) 32 | 33 | robot.set_robtarget_translation("puck_target", robtarget_pucks[0].get_xyz()) 34 | robot.set_rapid_variable("image_processed", "TRUE") 35 | 36 | robtarget_pucks.clear() 37 | 38 | robot.wait_for_rapid() 39 | 40 | # Close-up image of puck (working distance = 100mm) 41 | while not robtarget_pucks: 42 | ImageFunctions.findPucks(cam, robot, robtarget_pucks, cam_comp=True) 43 | 44 | robot.set_rapid_variable("image_processed", "TRUE") 45 | 46 | pos_low = robtarget_pucks[0].get_xyz() 47 | print(f'Low robtarget: ({pos_low[0]:.1f},{pos_low[1]:.1f})') 48 | 49 | robot.wait_for_rapid() 50 | 51 | robtarget_pucks.clear() 52 | 53 | # Image of puck from a higher position (working distance = 540) 54 | while not robtarget_pucks: 55 | 
ImageFunctions.findPucks(cam, robot, robtarget_pucks, cam_comp=True) 56 | 57 | pos_high = robtarget_pucks[0].get_xyz() 58 | print(f'High robtarget: ({pos_high[0]:5.1f},{pos_high[1]:5.1f})') 59 | 60 | delta_h = 540 - 100 # Heights are set in RAPID script 61 | delta_x = pos_high[0] - pos_low[0] 62 | delta_y = pos_high[1] - pos_low[1] 63 | print(f'Delta: ({delta_x:5.1f}, {delta_y:5.1f})') 64 | 65 | slope_x = delta_x / delta_h 66 | slope_y = delta_y / delta_h 67 | 68 | # Write all slope values to .txt-file 69 | if robot.is_running(): 70 | adjustment_file.write(f'{slope_x:.4f},{slope_y:.4f}\n') 71 | 72 | adjustment_file.close() 73 | 74 | # Get all slope values and find the average 75 | contents = np.genfromtxt(r'camera_adjustment_XS.txt', delimiter=',') 76 | os.remove('camera_adjustment_XS.txt') 77 | 78 | sum_slope_x = 0 79 | sum_slope_y = 0 80 | for content in contents: 81 | sum_slope_x += content[0] 82 | sum_slope_y += content[1] 83 | 84 | # TODO: Find out why absolute value was used previously: 85 | """ 86 | sum_slope_x += abs(content[0]) 87 | sum_slope_y += abs(content[1])""" 88 | 89 | average_slope_x = sum_slope_x / len(contents) 90 | average_slope_y = sum_slope_y / len(contents) 91 | 92 | configfile_name = abspath 93 | 94 | Config = configparser.ConfigParser() 95 | Config.read(configfile_name) 96 | 97 | cfgfile = open(configfile_name, 'w') 98 | 99 | # Add content to the file 100 | Config.set('SLOPE', 'slopex', f'{average_slope_x:.4f}') 101 | Config.set('SLOPE', 'slopey', f'{average_slope_y:.4f}') 102 | Config.write(cfgfile) 103 | 104 | cfgfile.close() 105 | 106 | 107 | def find_correct_exposure(cam, robot): 108 | """Auto exposure is not used, as it may fail to identify what the subject is. 109 | Instead, a manual exposure value is calculated by capturing images with all possible 110 | exposure values and selecting the value that identifies QR codes with most success. 
111 | """ 112 | 113 | # If the repeatability test fails, this will exit the program 114 | if not robot.is_running(): 115 | sys.exit(0) 116 | 117 | print("---Running find_correct_exposure---") 118 | 119 | # TODO: Use several pucks to determine the best exposure times. 120 | # Input amount of pucks and grade exposure values based on how many of the pucks are found each time 121 | 122 | abspath = os.path.abspath("image_tools/cam_adjustments.ini") 123 | 124 | value_pairs = [] 125 | puck_list = [] 126 | # Exposure range (in ms) 127 | exposure_low = 2 128 | exposure_high = 66 129 | 130 | increment = 2 131 | # Loop from lowest possible exposure to highest possible exposure, incremented by 2 (ms) 132 | for exposure in range(exposure_low, exposure_high, increment): 133 | # Set new exposure 134 | newExposure = ueye.DOUBLE(exposure) 135 | ret = ueye.is_Exposure(cam.hCam, ueye.IS_EXPOSURE_CMD_SET_EXPOSURE, newExposure, ueye.sizeof(newExposure)) 136 | 137 | img = ImageFunctions.capture_image(cam=cam, gripper_height=500) 138 | puck_list = ImageFunctions.QR_Scanner(img) 139 | print("Found puck:", puck_list) 140 | 141 | # Checking exposure 142 | d = ueye.DOUBLE() 143 | retVal = ueye.is_Exposure(cam.hCam, ueye.IS_EXPOSURE_CMD_GET_EXPOSURE, d, 8) 144 | if retVal == ueye.IS_SUCCESS: 145 | print('Currently set exposure time %8.3f ms' % d) 146 | 147 | # Position returns as None if no QR-code is found 148 | if puck_list: 149 | value_pairs.append((exposure, len(puck_list))) 150 | 151 | weighted_sum = 0 152 | pucks_found = 0 153 | for value_pair in value_pairs: 154 | pucks_found += value_pair[1] 155 | weighted_sum += value_pair[0] * value_pair[1] 156 | 157 | exposure = str(int(weighted_sum / pucks_found)) 158 | 159 | print(value_pairs[0]) 160 | 161 | configfile_name = abspath 162 | 163 | config = configparser.ConfigParser() 164 | config.read(configfile_name) 165 | cfgfile = open(configfile_name, 'w') 166 | 167 | # Add content to the file 168 | config.set('EXPOSURE', 'exposure', exposure) 
169 | config.write(cfgfile) 170 | 171 | cfgfile.close() 172 | 173 | # Set the correct exposure time 174 | cam.set_parameters() 175 | -------------------------------------------------------------------------------- /docs/build/html/installation.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | Installation — ABB-Robot-Machine-Vision 0.0.1 documentation 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 |
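The exposure value written to the `.ini` file by `find_correct_exposure` above is a detection-count-weighted average of the tested exposure times. With hypothetical `(exposure_ms, pucks_found)` pairs, the selection rule looks like this:

```python
# Hypothetical (exposure_ms, pucks_found) pairs, as collected in the
# calibration loop above; real values come from QR_Scanner results.
value_pairs = [(10, 1), (20, 3), (30, 2)]

weighted_sum = sum(exposure * count for exposure, count in value_pairs)
pucks_found = sum(count for _, count in value_pairs)

# Same selection rule as find_correct_exposure(): exposures that
# detected more pucks pull the chosen value toward themselves.
best_exposure = int(weighted_sum / pucks_found)
print(best_exposure)  # -> 21
```

Note that if no exposure ever detects a puck, `pucks_found` is zero and the division fails, so the calibration assumes at least one successful scan.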
29 |
30 |
31 | 32 | 33 |
34 | 35 |
36 |

Installation

37 |

This documentation covers the source code used in a demonstration program at the University of Stavanger. 38 | This includes a pip-installable package, rwsuis, which is covered in Robot Web Services.

39 |

Here are the different ways of acquiring the software.

40 |
41 |

$ pip install

42 |
43 |
To install the rwsuis package, simply run this command in your terminal of choice:
44 |
45 |
pip install rwsuis
 46 | 
47 |
48 |
49 |
50 |

Get the source code

51 |
52 |
All code from the docs is on GitHub, and the public repository can be cloned:
53 |
54 |
git clone https://github.com/prinsWindy/ABB-Robot-Machine-Vision.git
 55 | 
56 |
57 |
58 |
59 | 60 | 61 |
62 | 63 |
64 |
65 | 120 |
121 |
122 | 133 | 134 | 135 | 136 | 137 | 138 | -------------------------------------------------------------------------------- /cam_adjust_lab.py: -------------------------------------------------------------------------------- 1 | import os 2 | import numpy as np 3 | 4 | robtarget_pucks = [] 5 | 6 | ######################################################### 7 | # First, use OpenCV to initialize the camera # 8 | # or use the Camera class given if you # 9 | # prefer to use uEye API # 10 | ######################################################### 11 | # ----------------insert code here--------------------- # 12 | 13 | ######################################################### 14 | # Second, initialize robot communication, # 15 | # and execute RAPID program # 16 | # (Create RWS object) # 17 | ######################################################### 18 | # ----------------insert code here--------------------- # 19 | 20 | 21 | """This will (when completed) calculate the slope which represents how much the lens of the camera is angled. 22 | This should be done by comparing two images taken from different heights. 23 | """ 24 | print("---Running camera_adjustment---") 25 | 26 | # Creating a text file for calculations done throughout this program. 
27 | # Will be closed and disappear after program is finished 28 | adjustment_file = open('camera_adjustment_XS.txt', 'w') 29 | 30 | ######################################################### 31 | # Create a loop where you compare a certain # 32 | # amount of images and check if robot is running # 33 | ######################################################### 34 | # ----------------insert code here--------------------- # 35 | 36 | i = 0 37 | while i < 5: # Compare images 5 times 38 | i += 1 39 | 40 | ######################################################### 41 | # Start the camera adjustment CASE in RAPID # 42 | # remember to wait for RAPID after entering the case # 43 | ######################################################### 44 | # ----------------insert code here--------------------- # 45 | 46 | ######################################################### 47 | # Capture image, process it and scan it for QR codes # 48 | # Do this several times if no QR code is found # 49 | ######################################################### 50 | # ----------------insert code here--------------------- # 51 | 52 | ######################################################### 53 | # Create a robtarget from the QR codes' position # 54 | ######################################################### 55 | # ----------------insert code here--------------------- # 56 | 57 | ######################################################### 58 | # Send the robtarget to RAPID # 59 | ######################################################### 60 | # ----------------insert code here--------------------- # 61 | 62 | ######################################################### 63 | # Tell RAPID to move to a close-up image position # 64 | # (Update flag variable) # 65 | ######################################################### 66 | # ----------------insert code here--------------------- # 67 | 68 | ######################################################### 69 | # Wait for robot to reach close-up position # 70 | # 
Capture image, process it and scan it for QR codes # 71 | # Do this several times if no QR code is found # 72 | ######################################################### 73 | # ----------------insert code here--------------------- # 74 | 75 | ######################################################### 76 | # Create a robtarget # 77 | # from the QR codes' position # 78 | ######################################################### 79 | # ----------------insert code here--------------------- # 80 | 81 | ######################################################### 82 | # Save the translation of the "low" robtarget # 83 | # (will be used later to calculate slope) # 84 | ######################################################### 85 | # ----------------insert code here--------------------- # 86 | 87 | ######################################################### 88 | # Tell RAPID to go straight up to new height # 89 | # (Update flag variable) # 90 | ######################################################### 91 | # ----------------insert code here--------------------- # 92 | 93 | ######################################################### 94 | # Again, capture an image and calculate the robtarget # 95 | # (as done in previous steps) # 96 | ######################################################### 97 | # ----------------insert code here--------------------- # 98 | 99 | ######################################################### 100 | # Save the translation of the "high" robtarget puck # 101 | # (will be used later to calculate slope) # 102 | ######################################################### 103 | # ----------------insert code here--------------------- # 104 | 105 | ########################################################### 106 | # Calculate the delta height, as well as the # 107 | # delta in both x and y position # 108 | # (Compare the "high" robtarget with the "low" robtarget) # 109 | ########################################################### 110 | # -----------------insert code 
here---------------------- # 111 | 112 | ######################################################### 113 | # Find the slope in both x and y direction # 114 | ######################################################### 115 | # ----------------insert code here--------------------- # 116 | slope_x = None 117 | slope_y = None 118 | 119 | # Write all slope values to .txt-file 120 | adjustment_file.write(f'{slope_x:.4f},{slope_y:.4f}\n') 121 | 122 | adjustment_file.close() 123 | 124 | contents = np.genfromtxt(r'camera_adjustment_XS.txt', delimiter=',') 125 | os.remove('camera_adjustment_XS.txt') 126 | 127 | sum_slope_x = 0 128 | sum_slope_y = 0 129 | for content in contents: 130 | sum_slope_x += content[0] 131 | sum_slope_y += content[1] 132 | 133 | average_slope_x = sum_slope_x / len(contents) 134 | average_slope_y = sum_slope_y / len(contents) 135 | """These slopes are what you will need to compensate for 136 | the camera position error for more accurate picking.""" 137 | -------------------------------------------------------------------------------- /docs/source/robotwebservices.rst: -------------------------------------------------------------------------------- 1 | .. _RobotWebServices: 2 | 3 | Robot Web Services 4 | ================== 5 | 6 | This section covers the communication between Python and *RobotWare*. The package rwsuis_ 7 | is pip-installable and includes all functions provided in this section. 8 | 9 | .. _rwsuis: https://pypi.org/project/rwsuis/ 10 | 11 | .. _RWS: 12 | 13 | RWS Class 14 | ^^^^^^^^^ 15 | 16 | Take full control of ABB robots through HTTP requests, made easy with the RWS class. 17 | Robot operating mode should be automatic. 18 | 19 | :: 20 | 21 | >>> robot = RWS.RWS(base_url='robot_IP', username='user', password='pass') 22 | >>> robot.request_mastership() 23 | >>> robot.motors_on() 24 | Robot motors turned on 25 | >>> robot.start_RAPID() 26 | RAPID execution started from main 27 | 28 | .. 
py:class:: RWS(base_url, username, password) 29 | 30 | .. py:method:: motors_on(self) 31 | 32 | Sends a request to turn the robot's motors on. Mastership is required. 33 | Prints a message to the console stating whether or not the motors were in fact turned on. 34 | 35 | .. py:method:: motors_off(self) 36 | 37 | Sends a request to turn the robot's motors off. Mastership is required. 38 | Prints a message to the console stating whether or not the motors were in fact turned off. 39 | 40 | .. py:method:: request_mastership(self) 41 | 42 | Requests mastership over controller in automatic mode. 43 | For mastership in manual mode, see :py:func:`request_rmmp`. 44 | 45 | .. py:method:: release_mastership(self) 46 | 47 | Releases mastership over controller. 48 | 49 | .. py:method:: request_rmmp(self) 50 | 51 | Requests RMMP (Request Manual Mode Privileges). 52 | The request needs to be accepted within 10 seconds on controller. 53 | For mastership in automatic mode, see :py:func:`request_mastership`. 54 | 55 | .. py:method:: cancel_rmmp(self) 56 | 57 | Cancels held or requested RMMP. 58 | 59 | .. py:method:: reset_pp(self) 60 | 61 | Resets RAPID program pointer to main procedure. 62 | Prints a message to the console stating whether or not the request was successful. 63 | 64 | .. py:method:: start_RAPID(self) 65 | 66 | Resets RAPID program pointer to main procedure, and starts RAPID execution. 67 | Prints a message to the console stating whether or not the request was successful. 68 | 69 | .. py:method:: stop_RAPID(self) 70 | 71 | Stops RAPID execution. 72 | Prints a message to the console stating whether or not the request was successful. 73 | 74 | .. py:method:: get_rapid_variable(self, var) 75 | 76 | Get the raw value of any variable in RAPID. 77 | 78 | :return: A number if RAPID variable is 'num' 79 | :return: A string if RAPID variable is not 'num' 80 | 81 | .. py:method:: set_rapid_variable(self, var, value) 82 | 83 | Sets the value of any variable in RAPID. 
84 | Unless the variable is 'num', value has to be a string. 85 | 86 | :param str var: Name of variable as declared in RAPID 87 | :param value: Desired variable value 88 | :type value: int, float or str 89 | 90 | .. py:method:: set_robtarget_translation(self, var, trans) 91 | 92 | Sets only the translational data of a robtarget variable in RAPID. 93 | 94 | :param str var: Name of robtarget variable as declared in RAPID 95 | :param int[] trans: Translational data [x,y,z] 96 | 97 | .. py:method:: set_robtarget_rotation_z_degrees(self, var, rotation_z_degrees) 98 | 99 | Updates the orientation of a robtarget variable 100 | in RAPID by rotation about the z-axis in degrees. 101 | 0 degrees gives the Quaternion [0,1,0,0]. 102 | 103 | :param str var: Name of robtarget variable as declared in RAPID 104 | :param int rotation_z_degrees: Rotation in degrees 105 | 106 | .. py:method:: set_robtarget_rotation_quaternion(self, var, rotation_quaternion) 107 | 108 | Updates the orientation of a robtarget variable in RAPID by a Quaternion. 109 | 110 | :param str var: Name of robtarget variable as declared in RAPID 111 | :param tuple rotation_quaternion: Desired robtarget orientation. Must be a Quaternion (tuple of length 4) 112 | 113 | .. py:method:: get_robtarget_variables(self, var) 114 | 115 | Gets translational and rotational data of a robtarget variable in RAPID. 116 | 117 | :param str var: Name of robtarget variable as declared in RAPID 118 | 119 | :return: Translational data of robtarget [x,y,z] 120 | :return: Rotational data of robtarget (Quaternion: [w,x,y,z]). 121 | 122 | .. py:method:: get_gripper_position(self) 123 | 124 | Gets translational and rotational data of the UiS tool 'tGripper' 125 | with respect to the work object 'wobjTableN'. 126 | 127 | :return: Translational data of gripper [x,y,z] 128 | :return: Rotational data of gripper (Quaternion: [w,x,y,z]) 129 | 130 | .. 
py:method:: get_gripper_height(self) 131 | 132 | Uses :py:func:`get_gripper_position` to get the height of the UiS tool 133 | 'tGripper' above the work object 'wobjTableN'. 134 | 135 | .. py:method:: set_rapid_array(self, var, value) 136 | 137 | Sets the values of a num array variable in RAPID. 138 | The length of the num array must match the length of the array from Python. 139 | 140 | :param str var: Name of variable as declared in RAPID. 141 | :param int[] value: Array to be sent to RAPID. 142 | 143 | .. py:method:: wait_for_rapid(self, var='ready_flag') 144 | 145 | Polls a boolean variable in RAPID every 0.1 seconds. 146 | When the variable is TRUE, Python resets it and continues. 147 | 148 | :param str var: Name of boolean variable as declared in RAPID. 149 | 150 | .. py:method:: set_zonedata(self, var, zonedata) 151 | 152 | Set the value for a zonedata variable in RAPID. Mastership is required. 153 | 154 | :param str var: Name of variable as declared in RAPID. 155 | :param int zonedata: Desired zonedata value. 156 | 157 | .. py:method:: set_speeddata(self, var, speeddata) 158 | 159 | Set the value for a speeddata variable in RAPID. Mastership is required. 160 | 161 | :param str var: Name of variable as declared in RAPID. 162 | :param int speeddata: Desired speeddata value. 163 | 164 | .. py:method:: set_speed_ratio(self, speed_ratio) 165 | 166 | Set the speed ratio of the robot. Mastership is required. 167 | 168 | :param int speed_ratio: Desired speed ratio in percent [1-100]. 169 | 170 | .. py:method:: is_running(self) 171 | 172 | Uses :py:func:`get_execution_state` to check if RAPID execution is running or stopped. 173 | Returns True if running and False if stopped. 174 | 175 | .. py:method:: get_execution_state(self) 176 | 177 | Polls the RAPID execution state. 
178 | 179 | :return: 'running' or 'stopped' 180 | 181 | 182 | 183 | 184 | 185 | 186 | -------------------------------------------------------------------------------- /docs/build/html/_sources/robotwebservices.rst.txt: -------------------------------------------------------------------------------- /docs/build/html/camera.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | Camera Class — ABB-Robot-Machine-Vision 0.0.1 documentation 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 |
29 |
30 |
31 | 32 | 33 |
34 | 35 |
36 |

Camera Class

37 |
38 |
39 | class Camera(cam_ID=0)
40 |

The Camera class contains methods specifically meant for the University of Stavanger. 41 | These functions have only been tested on an IDS UI-1007XS-C camera, and might not work 42 | as intended on other models.

43 |
44 | 45 |
46 |

Quickstart

47 |
# Initialization
 48 | cam = Camera()
 49 | cam.init()
 50 | cam.set_parameters()
 51 | cam.allocate_memory()
 52 | cam.capture_video()
 53 | 
 54 | # Show video
 55 | while True:
 56 |     img = cam.get_image()
 57 |     cv2.imshow("Quickstart", img)
 58 |     if cv2.waitKey(1) & 0xFF == ord('q'):
 59 |         break
 60 | cv2.destroyAllWindows()
 61 | cam.exit_camera()
 62 | 
63 |
64 |
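Focus is not handled by the Quickstart above. In this project it is set manually per capture: image_tools/ImageFunctions.calculate_focus picks a manual-focus value from a step table keyed on working distance. A self-contained sketch of that lookup (threshold/value pairs copied from that function; the helper name `focus_value_for` is ours):

```python
# Focus step table for the IDS XS camera, taken from
# image_tools/ImageFunctions.calculate_focus (working distance in mm).
FOCUS_STEPS = [
    (357.5, 204), (237, 192), (169, 180), (131.5, 168),
    (101.5, 156), (86.5, 144), (72, 128), (42.5, 112),
]

def focus_value_for(working_distance):
    """Return the manual-focus value for a working distance in mm.

    Distances below the smallest threshold fall back to 204,
    mirroring the default branch in calculate_focus.
    """
    for threshold, value in FOCUS_STEPS:
        if working_distance >= threshold:
            return value
    return 204  # too close to subject; default value

print(focus_value_for(400))  # 204
print(focus_value_for(150))  # 168
```

The step table was measured for one specific camera unit, so on other cameras the values should be re-measured rather than reused.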

The parameters being set are not currently configurable through method inputs. 65 | They are specifically set for laboratory work at the University of Stavanger.

66 |
67 |
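These fixed camera parameters matter downstream: OpenCV_to_RAPID.pixel_to_mm converts pixel coordinates to millimeters with a pinhole approximation (sensor width / FOV width = focal length / working distance) using the XS camera's lens and sensor constants. A runnable sketch of that conversion; the 1280 px image width is an assumed value for illustration:

```python
# Constants used by OpenCV_to_RAPID.pixel_to_mm for the XS camera.
FOCAL_LENGTH_MM = 3.7     # lens focal length (+/- 5 percent)
SENSOR_WIDTH_MM = 3.6288  # sensor width

def mm_per_pixel(gripper_height, resolution_width=1280):
    """Millimeters on the table per image pixel.

    The camera sits 70 mm above the gripper, so the working distance
    is gripper_height + 70. resolution_width is an assumed image width.
    """
    working_distance = gripper_height + 70
    fov_width = (working_distance / FOCAL_LENGTH_MM) * SENSOR_WIDTH_MM
    return fov_width / resolution_width

print(round(mm_per_pixel(300), 4))  # 0.2835
```

Multiplying a puck's pixel offset from the image center by this factor gives its metric offset from the optical axis.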
68 | 69 | 70 |
71 | 72 |
73 |
74 | 129 |
130 |
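The orientation data referred to in this documentation (quaternions such as [0,1,0,0] for a 0 degree tool rotation) is produced by small conversion helpers in OpenCV_to_RAPID.py. A self-contained sketch of both directions, mirroring z_degrees_to_quaternion (which uses a fixed roll of pi) and quaternion_to_radians:

```python
import math

def z_degrees_to_quaternion(rotation_z_degrees):
    """Rotation about z in degrees -> quaternion [w, x, y, z], roll fixed at pi."""
    roll, pitch, yaw = math.pi, 0.0, math.radians(rotation_z_degrees)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return [cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy]

def quaternion_to_radians(quaternion):
    """Inverse direction: extract the z rotation (in radians) from a quaternion."""
    w, x, y, z = quaternion
    t1 = 2.0 * (w * z + x * y)
    t2 = 1.0 - 2.0 * (y * y + z * z)
    return math.atan2(t1, t2)

# 0 degrees corresponds (up to rounding) to the quaternion [0, 1, 0, 0]
print([round(v, 6) for v in z_degrees_to_quaternion(0)])  # [0.0, 1.0, 0.0, 0.0]
```

The round trip is consistent: quaternion_to_radians(z_degrees_to_quaternion(d)) returns math.radians(d) to floating point precision.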
131 | 142 | 143 | 144 | 145 | 146 | 147 | -------------------------------------------------------------------------------- /docs/build/html/searchindex.js: -------------------------------------------------------------------------------- /OpenCV_to_RAPID.py: 
-------------------------------------------------------------------------------- 1 | import math 2 | import configparser 3 | from image_tools import ImageFunctions 4 | 5 | 6 | def pixel_to_mm(gripper_height, puck, image): 7 | """Converts coordinates in image from pixels to millimeters. 8 | This depends on the camera's working distance. 9 | """ 10 | 11 | # As a good approximation we can say that: sensor width / FOV width = focal length / working distance 12 | # parameters from the XS camera 13 | focal_length = 3.7 # mm (+/- 5 percent) 14 | sensor_width = 3.6288 15 | # sensor_height = 2.7216 (not used here) 16 | resolution_width = image.shape[1] 17 | 18 | working_distance = gripper_height + 70 19 | 20 | fov_width = (working_distance / focal_length) * sensor_width 21 | 22 | pixel_to_mm = fov_width / resolution_width # mm_width / px_width 23 | 24 | # Convert all positions from pixels to millimeters: 25 | puck.set_position(position=[x * pixel_to_mm for x in puck.position]) 26 | 27 | 28 | def transform_position(gripper_rot, puck): 29 | """Transform coordinate system given by image in OpenCV to coordinate system of work object in RAPID. 30 | Swap x & y coordinates and rotate by the same amount that the camera has been rotated. 31 | """ 32 | 33 | # Perform transformations to match RAPID: x -> y, y -> x, x -> -x, y -> -y 34 | puck.set_position(position=[-puck.position[1], -puck.position[0]]) 35 | 36 | # Convert from quaternion to Euler angle (we only need z-axis) 37 | rotation_z_radians = quaternion_to_radians(gripper_rot) 38 | rotation_z_degrees = math.degrees(rotation_z_radians) 39 | # TODO: Check if rotation is positive or negative for a given orientation 40 | 41 | # TODO: Rotate all points in dict, not list: 42 | """Rotate all points found by the QR scanner. 
43 | Also, adjust the angle of all pucks by using the orientation of the gripper:""" 44 | 45 | puck.set_position(position= 46 | [puck.position[0] * math.cos(rotation_z_radians) - puck.position[1] * math.sin( 47 | rotation_z_radians), 48 | puck.position[0] * math.sin(rotation_z_radians) + puck.position[1] * math.cos( 49 | rotation_z_radians)]) 50 | 51 | # The angle found by the QR scanner needs to take gripper rotation into consideration 52 | puck.set_angle(angle=puck.angle + rotation_z_degrees) 53 | 54 | 55 | def get_camera_position(trans, rot): 56 | """Uses the offset between the gripper and camera to find the camera's position. 57 | """ 58 | 59 | offset_x, offset_y = gripper_camera_offset(rot=rot) 60 | 61 | camera_position = [trans[0] + offset_x, trans[1] + offset_y] # Gripper position + offset from gripper 62 | return camera_position 63 | 64 | 65 | def gripper_camera_offset(rot): 66 | """Finds the offset between the camera and the gripper by using the gripper's orientation. 67 | """ 68 | 69 | r = 55 # Distance between gripper and camera 70 | 71 | # Check if input is a quaternion 72 | if isinstance(rot, (list, tuple)) and len(rot) == 4: 73 | rotation_z_radians = quaternion_to_radians(rot) 74 | else: 75 | # If input is not a quaternion, it should be an angle in radians (int or float) 76 | rotation_z_radians = rot 77 | 78 | 79 | offset_x = r * math.cos(rotation_z_radians) 80 | offset_y = r * math.sin(rotation_z_radians) 81 | 82 | return offset_x, offset_y 83 | 84 | 85 | def create_robtarget(gripper_height, gripper_rot, cam_pos, image, puck, cam_comp=False): 86 | """Complete a series of transformations to finally 87 | create a robtarget of the puck's position from an image. 
88 | """ 89 | 90 | # Transform position depending on how the gripper is rotated 91 | transform_position(gripper_rot=gripper_rot, puck=puck) 92 | 93 | # Converts puck position from pixels to millimeters 94 | pixel_to_mm(gripper_height=gripper_height, puck=puck, image=image) 95 | 96 | # Compensate for overshoot in 2D image 97 | overshoot_comp(gripper_height=gripper_height, puck=puck) 98 | 99 | # Compensate for possibly angled camera 100 | if not cam_comp: 101 | camera_compensation(gripper_height=gripper_height, gripper_rot=gripper_rot, puck=puck) 102 | 103 | # Add the offset from camera to gripper 104 | puck.set_position(position=[puck.position[0] + cam_pos[0], puck.position[1] + cam_pos[1]]) 105 | 106 | return puck 107 | 108 | 109 | def quaternion_to_radians(quaternion): 110 | """Convert a Quaternion to a rotation about the z-axis in degrees. 111 | """ 112 | w, x, y, z = quaternion 113 | t1 = +2.0 * (w * z + x * y) 114 | t2 = +1.0 - 2.0 * (y * y + z * z) 115 | rotation_z = math.atan2(t1, t2) 116 | 117 | return rotation_z 118 | 119 | 120 | def z_degrees_to_quaternion(rotation_z_degrees): 121 | """Convert a rotation about the z-axis in degrees to Quaternion. 
122 | """ 123 | roll = math.pi 124 | pitch = 0 125 | yaw = math.radians(rotation_z_degrees) 126 | 127 | qw = math.cos(roll / 2) * math.cos(pitch / 2) * math.cos(yaw / 2) + math.sin(roll / 2) * math.sin( 128 | pitch / 2) * math.sin(yaw / 2) 129 | qx = math.sin(roll / 2) * math.cos(pitch / 2) * math.cos(yaw / 2) - math.cos(roll / 2) * math.sin( 130 | pitch / 2) * math.sin(yaw / 2) 131 | qy = math.cos(roll / 2) * math.sin(pitch / 2) * math.cos(yaw / 2) + math.sin(roll / 2) * math.cos( 132 | pitch / 2) * math.sin(yaw / 2) 133 | qz = math.cos(roll / 2) * math.cos(pitch / 2) * math.sin(yaw / 2) - math.sin(roll / 2) * math.sin( 134 | pitch / 2) * math.cos(yaw / 2) 135 | 136 | return [qw, qx, qy, qz] 137 | 138 | 139 | def overshoot_comp(gripper_height, puck): 140 | """Compensate for the overshoot phenomenon which occurs when trying to pinpoint 141 | the location of a 3D object in a 2D image. 142 | """ 143 | compensation = [x * puck.height / (gripper_height + 70) for x in puck.position] 144 | 145 | # Subtract compensation values from puck position 146 | puck.set_position(position=list(map(lambda x, y: x - y, puck.position, compensation))) 147 | 148 | 149 | def camera_compensation(gripper_height, gripper_rot, puck): 150 | """Compensate for an angled camera view. Different cameras will be 151 | angled differently both internally and externally when mounted to a surface. 152 | The slope values must first be calculated by running camera_adjustment.py. 153 | Works with any camera orientation. 
154 | """ 155 | rotation_z_radians = quaternion_to_radians(gripper_rot) 156 | camera_height = gripper_height + 70 157 | working_distance = camera_height - puck.height 158 | 159 | config = configparser.ConfigParser() 160 | config.read('image_tools/cam_adjustments.ini') 161 | 162 | slope_x = float(config['SLOPE']['slopex']) 163 | slope_y = float(config['SLOPE']['slopey']) 164 | comp_x = slope_x * working_distance 165 | comp_y = slope_y * working_distance 166 | 167 | # Decompose comp_x and comp_y if camera was rotated 168 | puck.set_position(position= 169 | [puck.position[0] - comp_x * math.cos(rotation_z_radians) + comp_y * math.sin(rotation_z_radians), 170 | puck.position[1] - comp_y * math.cos(rotation_z_radians) - comp_x * math.sin( 171 | rotation_z_radians)]) 172 | -------------------------------------------------------------------------------- /image_tools/ImageFunctions.py: -------------------------------------------------------------------------------- 1 | import math 2 | 3 | import cv2 4 | import numpy as np 5 | from pyzbar.pyzbar import decode 6 | from pyzbar.wrapper import ZBarSymbol 7 | 8 | import Puck 9 | from image_tools import Camera 10 | from pyueye import ueye 11 | import time 12 | import OpenCV_to_RAPID 13 | 14 | 15 | def capture_image(cam, gripper_height): 16 | """Captures a single image through PyuEye functions. 17 | Focus is manually adjusted depending on the working distance. 18 | """ 19 | camera_height = gripper_height + 70 # Camera is placed 70mm above gripper 20 | working_distance = camera_height - 30 # -30 because one puck 21 | 22 | calculate_focus(cam, working_distance) 23 | 24 | # Trigger autofocus once (use instead of calculate_focus if needed): 25 | # nRet = ueye.is_Focus(cam.hCam, ueye.FOC_CMD_SET_ENABLE_AUTOFOCUS_ONCE, None, 0) 26 | 27 | array = cam.get_image() 28 | 29 | return array 30 | 31 | 32 | def calculate_focus(cam, working_distance): 33 | """This characteristic belongs to the IDS XS camera with serial code 4102885308. 
34 | As stated by IDS themselves the characteristic is not robust and could vary between 35 | different cameras. 36 | The characteristic was made based on images up to a working distance of 620mm. 37 | """ 38 | # TODO: Make characteristics for all IDS XS cameras at UiS. 39 | # By checking the serial code through ueye.SENSORINFO, one would know which specific camera is in use 40 | 41 | if working_distance >= 357.5: 42 | focus_value = 204 43 | elif 237 <= working_distance < 357.5: 44 | focus_value = 192 45 | elif 169 <= working_distance < 237: 46 | focus_value = 180 47 | elif 131.5 <= working_distance < 169: 48 | focus_value = 168 49 | elif 101.5 <= working_distance < 131.5: 50 | focus_value = 156 51 | elif 86.5 <= working_distance < 101.5: 52 | focus_value = 144 53 | elif 72 <= working_distance < 86.5: 54 | focus_value = 128 55 | elif 42.5 <= working_distance < 72: 56 | focus_value = 112 57 | else: 58 | print("Too close to subject. Focus value not found. Default value: 204") 59 | focus_value = 204 60 | 61 | # Set the correct focus value 62 | focus_UINT = ueye.UINT(focus_value) 63 | ueye.is_Focus(cam.hCam, ueye.FOC_CMD_SET_MANUAL_FOCUS, focus_UINT, ueye.sizeof(focus_UINT)) 64 | time.sleep(0.3) 65 | 66 | 67 | def approximate_stack(puck, gripper_height): 68 | """Aims to approximate the amount of pucks in a stack 69 | based on the calculated QR-width of the visible puck""" 70 | 71 | if puck.qr_width >= 400: 72 | approx_working_distance = 100 73 | elif 300 <= puck.qr_width < 400: 74 | approx_working_distance = 130 75 | elif 250 <= puck.qr_width < 300: 76 | approx_working_distance = 160 77 | elif 210 <= puck.qr_width < 250: 78 | approx_working_distance = 190 79 | elif 180 <= puck.qr_width < 210: 80 | approx_working_distance = 220 81 | elif 160 <= puck.qr_width < 180: 82 | approx_working_distance = 250 83 | elif 140 <= puck.qr_width < 160: 84 | approx_working_distance = 280 85 | elif 130 <= puck.qr_width < 140: 86 | approx_working_distance = 310 87 | else: 88 | print("Too 
far away, moving closer") 89 | approx_working_distance = 340 90 | 91 | camera_height = gripper_height + 70 92 | puck.set_height(camera_height - approx_working_distance) 93 | 94 | return approx_working_distance 95 | 96 | 97 | def findPucks(cam, robot, robtarget_pucks, cam_comp=False, number_of_images=1, pucks_in_height=False): 98 | """Finds all pucks in the frame of the camera by capturing an image and scanning the image for QR codes. 99 | After the codes have been pinpointed, a series of transformations happen to finally create robtargets 100 | which can be sent to RobotWare. 101 | """ 102 | 103 | trans, gripper_rot = robot.get_gripper_position() 104 | gripper_height = robot.get_gripper_height() 105 | 106 | cam_pos = OpenCV_to_RAPID.get_camera_position(trans=trans, rot=gripper_rot) 107 | 108 | for _ in range(number_of_images): 109 | image = capture_image(cam=cam, gripper_height=gripper_height) 110 | 111 | # Scan the image and return all QR code positions 112 | temp_puck_list = QR_Scanner(image) 113 | 114 | # Check if the QR codes that were found have already been located previously. 115 | # If so, remove them from the temporary list. 116 | for puck in robtarget_pucks: 117 | if any(puck == x for x in temp_puck_list): 118 | temp_puck_list.remove(puck) 119 | 120 | # Create robtargets for every new puck 121 | for puck in temp_puck_list: 122 | puck = OpenCV_to_RAPID.create_robtarget(gripper_height=gripper_height, gripper_rot=gripper_rot, 123 | cam_pos=cam_pos, image=image, puck=puck, cam_comp=cam_comp) 124 | robtarget_pucks.append(puck) 125 | 126 | return robtarget_pucks 127 | 128 | 129 | def showVideo(cam): 130 | """Continuously displays the robot's view in an OpenCV imshow window. 
131 | """ 132 | while True: 133 | array = cam.get_image() 134 | array = QR_Scanner_visualized(array) 135 | 136 | if Camera.repeatability_test: 137 | font = cv2.FONT_HERSHEY_SIMPLEX 138 | bottomLeftCornerOfText = (10, 50) 139 | fontScale = 1 140 | fontColor = (255, 255, 255) 141 | lineType = 2 142 | 143 | cv2.putText(array, 'Number of loops: ' + str(Camera.number_of_loops), 144 | bottomLeftCornerOfText, 145 | font, 146 | fontScale, 147 | fontColor, 148 | lineType) 149 | 150 | cv2.imshow("Continuous video display", array) 151 | if cv2.waitKey(1) & 0xFF == ord('q'): 152 | break 153 | cv2.destroyAllWindows() 154 | 155 | 156 | def QR_Scanner(img): 157 | """Scan QR codes from image. Returns a list of Puck objects holding position, orientation and QR width""" 158 | 159 | # TODO: Make an adaptive bilateralFilter that depends on camera_height 160 | grayscale = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) # Make grayscale image for filtering and thresholding 161 | blur = cv2.bilateralFilter(src=grayscale, d=3, sigmaColor=75, sigmaSpace=75) 162 | normalized_img = cv2.normalize(blur, blur, alpha=0, beta=255, norm_type=cv2.NORM_MINMAX, dtype=-1) 163 | data = decode(normalized_img, symbols=[ZBarSymbol.QRCODE]) 164 | puck_list = [] 165 | 166 | sorted_data = sorted(data, key=lambda x: x[0]) # Sort the QR codes by decoded data, ascending 167 | 168 | for QR_Code in sorted_data: # Go through all QR codes 169 | 170 | points = QR_Code.polygon # Polygon corners of the QR code; two are used for orientation 171 | x1 = points[0][0] 172 | y1 = points[0][1] 173 | x2 = points[3][0] 174 | y2 = points[3][1] 175 | 176 | angle = np.rad2deg(np.arctan2(-(y2 - y1), x2 - x1)) # Calculate the orientation of each QR code 177 | 178 | x = [p[0] for p in points] 179 | y = [p[1] for p in points] 180 | position = [sum(x) / len(points), sum(y) / len(points)] # Calculate center of each QR code 181 | 182 | height, width, channels = img.shape 183 | position = [position[0] - width/2, position[1] - height/2] # Make center of image (0,0) 184 | 185 | puck = str(QR_Code.data, 
"utf-8") # Convert QR code data to string 186 | puck_number = int(''.join(filter(str.isdigit, puck))) # Find only puck number from QR code string 187 | 188 | # Calculate distance between (x1,y1) and (x2,y2) to understand height of pucks: 189 | qr_width = math.hypot((x2 - x1), (y2 - y1)) 190 | # print("width", qr_width) 191 | #width_list.append((puck_number, qr_width)) 192 | 193 | # Make the puck object and add it to the puck list 194 | puck = Puck.Puck(puck_number, position, angle, qr_width) 195 | puck_list.append(puck) 196 | 197 | return puck_list 198 | 199 | 200 | def QR_Scanner_visualized(img): 201 | """Scan QR codes from image. Returns normalized image with marked QR-code""" 202 | 203 | grayscale = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY) # Make grayscale image for filtering and thresholding 204 | blur = cv2.bilateralFilter(src=grayscale, d=3, sigmaColor=75, sigmaSpace=75) 205 | normalized_img = cv2.normalize(blur, blur, alpha=0, beta=255, norm_type=cv2.NORM_MINMAX, dtype=-1) 206 | data = decode(normalized_img, symbols=[ZBarSymbol.QRCODE]) 207 | 208 | normalized_img = cv2.cvtColor(normalized_img, cv2.COLOR_GRAY2BGR) 209 | 210 | sorted_data = sorted(data, key=lambda x: x[0]) # Sort the QR codes in ascending order 211 | 212 | for QR_Code in sorted_data: # Go through all QR codes 213 | polygon = np.int32([QR_Code.polygon]) # Convert from int64 to int32, polylines only accepts int32 214 | cv2.polylines(normalized_img, polygon, True, color=(0, 0, 255), thickness=10) # Draw lines around QR-codes 215 | 216 | return normalized_img 217 | -------------------------------------------------------------------------------- /docs/build/html/puck.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | Puck Class — ABB-Robot-Machine-Vision 0.0.1 documentation 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 |
29 |
30 |
31 | 32 | 33 |
34 | 35 |
36 |

Puck Class

37 |
38 |
39 | class Puck(number, position, angle, height=30)
40 |
41 |
42 | set_position(position)
43 |

Updates Puck position.

44 |
45 |
Parameters
46 |

position (float[]) – New Puck position [x,y]

47 |
48 |
49 |
50 | 51 |
52 |
53 | set_angle(angle)
54 |

Updates Puck angle.

55 |
56 |
Parameters
57 |

angle (float) – New Puck angle

58 |
59 |
60 |
61 | 62 |
63 |
64 | set_height(height)
65 |

Updates Puck height.

66 |
67 |
Parameters
68 |

height (float) – New Puck height

69 |
70 |
71 |
72 | 73 |
74 |
75 | get_xyz()
76 |

Gets true position of Puck ([x,y,z] on the work object).

77 |
78 |
Returns
79 |

Puck position [x,y,z]

80 |
81 |
82 |
83 | 84 |
85 |
86 | check_collision(puck_list)
87 |

Finds an angle of rotation for the gripper which avoids 88 | collisions between the gripper and other pucks 89 | when sliding in to pick up a puck.

90 |
91 |
Parameters
92 |

puck_list (Puck[]) – List of all Puck objects

93 |
94 |
Returns
95 |

Rotation which yields no collision

96 |
97 |
Returns
98 |

If puck should be gripped forward or backward

99 |
100 |
101 |
102 | 103 |
104 | 105 |
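A minimal usage sketch of the Puck interface documented above. The class body here is a hypothetical stand-in mirroring only the documented signatures; the real implementation lives in Puck.py and also tracks QR width and the collision logic of `check_collision`:

```python
# Hypothetical stand-in mirroring the documented Puck interface (not the real Puck.py).
class Puck:
    def __init__(self, number, position, angle, height=30):
        self.number = number
        self.position = position  # [x, y] on the work object, in mm
        self.angle = angle        # orientation in degrees
        self.height = height      # mm above the work object

    def set_position(self, position):
        """Updates Puck position [x, y]."""
        self.position = position

    def set_angle(self, angle):
        """Updates Puck angle."""
        self.angle = angle

    def set_height(self, height):
        """Updates Puck height."""
        self.height = height

    def get_xyz(self):
        """True position of the Puck ([x, y, z] on the work object)."""
        return [self.position[0], self.position[1], self.height]


puck = Puck(number=1, position=[100.0, 50.0], angle=45.0)
puck.set_height(60)
print(puck.get_xyz())  # [100.0, 50.0, 60]
```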
106 | 107 | 108 |
109 | 110 |
111 |
112 | 163 |
164 |
165 | 176 | 177 | 178 | 179 | 180 | 181 | -------------------------------------------------------------------------------- /docs/build/html/_static/doctools.js: -------------------------------------------------------------------------------- 1 | /* 2 | * doctools.js 3 | * ~~~~~~~~~~~ 4 | * 5 | * Sphinx JavaScript utilities for all documentation. 6 | * 7 | * :copyright: Copyright 2007-2019 by the Sphinx team, see AUTHORS. 8 | * :license: BSD, see LICENSE for details. 9 | * 10 | */ 11 | 12 | /** 13 | * select a different prefix for underscore 14 | */ 15 | $u = _.noConflict(); 16 | 17 | /** 18 | * make the code below compatible with browsers without 19 | * an installed firebug like debugger 20 | if (!window.console || !console.firebug) { 21 | var names = ["log", "debug", "info", "warn", "error", "assert", "dir", 22 | "dirxml", "group", "groupEnd", "time", "timeEnd", "count", "trace", 23 | "profile", "profileEnd"]; 24 | window.console = {}; 25 | for (var i = 0; i < names.length; ++i) 26 | window.console[names[i]] = function() {}; 27 | } 28 | */ 29 | 30 | /** 31 | * small helper function to urldecode strings 32 | */ 33 | jQuery.urldecode = function(x) { 34 | return decodeURIComponent(x).replace(/\+/g, ' '); 35 | }; 36 | 37 | /** 38 | * small helper function to urlencode strings 39 | */ 40 | jQuery.urlencode = encodeURIComponent; 41 | 42 | /** 43 | * This function returns the parsed url parameters of the 44 | * current request. Multiple values per key are supported, 45 | * it will always return arrays of strings for the value parts. 
46 | */ 47 | jQuery.getQueryParameters = function(s) { 48 | if (typeof s === 'undefined') 49 | s = document.location.search; 50 | var parts = s.substr(s.indexOf('?') + 1).split('&'); 51 | var result = {}; 52 | for (var i = 0; i < parts.length; i++) { 53 | var tmp = parts[i].split('=', 2); 54 | var key = jQuery.urldecode(tmp[0]); 55 | var value = jQuery.urldecode(tmp[1]); 56 | if (key in result) 57 | result[key].push(value); 58 | else 59 | result[key] = [value]; 60 | } 61 | return result; 62 | }; 63 | 64 | /** 65 | * highlight a given string on a jquery object by wrapping it in 66 | * span elements with the given class name. 67 | */ 68 | jQuery.fn.highlightText = function(text, className) { 69 | function highlight(node, addItems) { 70 | if (node.nodeType === 3) { 71 | var val = node.nodeValue; 72 | var pos = val.toLowerCase().indexOf(text); 73 | if (pos >= 0 && 74 | !jQuery(node.parentNode).hasClass(className) && 75 | !jQuery(node.parentNode).hasClass("nohighlight")) { 76 | var span; 77 | var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg"); 78 | if (isInSVG) { 79 | span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); 80 | } else { 81 | span = document.createElement("span"); 82 | span.className = className; 83 | } 84 | span.appendChild(document.createTextNode(val.substr(pos, text.length))); 85 | node.parentNode.insertBefore(span, node.parentNode.insertBefore( 86 | document.createTextNode(val.substr(pos + text.length)), 87 | node.nextSibling)); 88 | node.nodeValue = val.substr(0, pos); 89 | if (isInSVG) { 90 | var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect"); 91 | var bbox = node.parentElement.getBBox(); 92 | rect.x.baseVal.value = bbox.x; 93 | rect.y.baseVal.value = bbox.y; 94 | rect.width.baseVal.value = bbox.width; 95 | rect.height.baseVal.value = bbox.height; 96 | rect.setAttribute('class', className); 97 | addItems.push({ 98 | "parent": node.parentNode, 99 | "target": rect}); 100 | } 101 | } 102 | } 
103 | else if (!jQuery(node).is("button, select, textarea")) { 104 | jQuery.each(node.childNodes, function() { 105 | highlight(this, addItems); 106 | }); 107 | } 108 | } 109 | var addItems = []; 110 | var result = this.each(function() { 111 | highlight(this, addItems); 112 | }); 113 | for (var i = 0; i < addItems.length; ++i) { 114 | jQuery(addItems[i].parent).before(addItems[i].target); 115 | } 116 | return result; 117 | }; 118 | 119 | /* 120 | * backward compatibility for jQuery.browser 121 | * This will be supported until firefox bug is fixed. 122 | */ 123 | if (!jQuery.browser) { 124 | jQuery.uaMatch = function(ua) { 125 | ua = ua.toLowerCase(); 126 | 127 | var match = /(chrome)[ \/]([\w.]+)/.exec(ua) || 128 | /(webkit)[ \/]([\w.]+)/.exec(ua) || 129 | /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) || 130 | /(msie) ([\w.]+)/.exec(ua) || 131 | ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) || 132 | []; 133 | 134 | return { 135 | browser: match[ 1 ] || "", 136 | version: match[ 2 ] || "0" 137 | }; 138 | }; 139 | jQuery.browser = {}; 140 | jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true; 141 | } 142 | 143 | /** 144 | * Small JavaScript module for the documentation. 145 | */ 146 | var Documentation = { 147 | 148 | init : function() { 149 | this.fixFirefoxAnchorBug(); 150 | this.highlightSearchWords(); 151 | this.initIndexTable(); 152 | if (DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) { 153 | this.initOnKeyListeners(); 154 | } 155 | }, 156 | 157 | /** 158 | * i18n support 159 | */ 160 | TRANSLATIONS : {}, 161 | PLURAL_EXPR : function(n) { return n === 1 ? 
0 : 1; }, 162 | LOCALE : 'unknown', 163 | 164 | // gettext and ngettext don't access this so that the functions 165 | // can safely bound to a different name (_ = Documentation.gettext) 166 | gettext : function(string) { 167 | var translated = Documentation.TRANSLATIONS[string]; 168 | if (typeof translated === 'undefined') 169 | return string; 170 | return (typeof translated === 'string') ? translated : translated[0]; 171 | }, 172 | 173 | ngettext : function(singular, plural, n) { 174 | var translated = Documentation.TRANSLATIONS[singular]; 175 | if (typeof translated === 'undefined') 176 | return (n == 1) ? singular : plural; 177 | return translated[Documentation.PLURALEXPR(n)]; 178 | }, 179 | 180 | addTranslations : function(catalog) { 181 | for (var key in catalog.messages) 182 | this.TRANSLATIONS[key] = catalog.messages[key]; 183 | this.PLURAL_EXPR = new Function('n', 'return +(' + catalog.plural_expr + ')'); 184 | this.LOCALE = catalog.locale; 185 | }, 186 | 187 | /** 188 | * add context elements like header anchor links 189 | */ 190 | addContextElements : function() { 191 | $('div[id] > :header:first').each(function() { 192 | $('\u00B6'). 193 | attr('href', '#' + this.id). 194 | attr('title', _('Permalink to this headline')). 195 | appendTo(this); 196 | }); 197 | $('dt[id]').each(function() { 198 | $('\u00B6'). 199 | attr('href', '#' + this.id). 200 | attr('title', _('Permalink to this definition')). 
201 | appendTo(this); 202 | }); 203 | }, 204 | 205 | /** 206 | * workaround a firefox stupidity 207 | * see: https://bugzilla.mozilla.org/show_bug.cgi?id=645075 208 | */ 209 | fixFirefoxAnchorBug : function() { 210 | if (document.location.hash && $.browser.mozilla) 211 | window.setTimeout(function() { 212 | document.location.href += ''; 213 | }, 10); 214 | }, 215 | 216 | /** 217 | * highlight the search words provided in the url in the text 218 | */ 219 | highlightSearchWords : function() { 220 | var params = $.getQueryParameters(); 221 | var terms = (params.highlight) ? params.highlight[0].split(/\s+/) : []; 222 | if (terms.length) { 223 | var body = $('div.body'); 224 | if (!body.length) { 225 | body = $('body'); 226 | } 227 | window.setTimeout(function() { 228 | $.each(terms, function() { 229 | body.highlightText(this.toLowerCase(), 'highlighted'); 230 | }); 231 | }, 10); 232 | $('') 234 | .appendTo($('#searchbox')); 235 | } 236 | }, 237 | 238 | /** 239 | * init the domain index toggle buttons 240 | */ 241 | initIndexTable : function() { 242 | var togglers = $('img.toggler').click(function() { 243 | var src = $(this).attr('src'); 244 | var idnum = $(this).attr('id').substr(7); 245 | $('tr.cg-' + idnum).toggle(); 246 | if (src.substr(-9) === 'minus.png') 247 | $(this).attr('src', src.substr(0, src.length-9) + 'plus.png'); 248 | else 249 | $(this).attr('src', src.substr(0, src.length-8) + 'minus.png'); 250 | }).css('display', ''); 251 | if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) { 252 | togglers.click(); 253 | } 254 | }, 255 | 256 | /** 257 | * helper function to hide the search marks again 258 | */ 259 | hideSearchWords : function() { 260 | $('#searchbox .highlight-link').fadeOut(300); 261 | $('span.highlighted').removeClass('highlighted'); 262 | }, 263 | 264 | /** 265 | * make the url absolute 266 | */ 267 | makeURL : function(relativeURL) { 268 | return DOCUMENTATION_OPTIONS.URL_ROOT + '/' + relativeURL; 269 | }, 270 | 271 | /** 272 | * get the current relative 
url 273 | */ 274 | getCurrentURL : function() { 275 | var path = document.location.pathname; 276 | var parts = path.split(/\//); 277 | $.each(DOCUMENTATION_OPTIONS.URL_ROOT.split(/\//), function() { 278 | if (this === '..') 279 | parts.pop(); 280 | }); 281 | var url = parts.join('/'); 282 | return path.substring(url.lastIndexOf('/') + 1, path.length - 1); 283 | }, 284 | 285 | initOnKeyListeners: function() { 286 | $(document).keyup(function(event) { 287 | var activeElementType = document.activeElement.tagName; 288 | // don't navigate when in search box or textarea 289 | if (activeElementType !== 'TEXTAREA' && activeElementType !== 'INPUT' && activeElementType !== 'SELECT') { 290 | switch (event.keyCode) { 291 | case 37: // left 292 | var prevHref = $('link[rel="prev"]').prop('href'); 293 | if (prevHref) { 294 | window.location.href = prevHref; 295 | return false; 296 | } 297 | case 39: // right 298 | var nextHref = $('link[rel="next"]').prop('href'); 299 | if (nextHref) { 300 | window.location.href = nextHref; 301 | return false; 302 | } 303 | } 304 | } 305 | }); 306 | } 307 | }; 308 | 309 | // quick alias for translations 310 | _ = Documentation.gettext; 311 | 312 | $(document).ready(function() { 313 | Documentation.init(); 314 | }); 315 | -------------------------------------------------------------------------------- /docs/build/html/genindex.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | Index — ABB-Robot-Machine-Vision 0.0.1 documentation 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 |
28 |
29 |
30 | 31 | 32 |
33 | 34 | 35 |

Index

36 | 37 |
38 | C 39 | | F 40 | | G 41 | | I 42 | | M 43 | | O 44 | | P 45 | | Q 46 | | R 47 | | S 48 | | T 49 | | W 50 | 51 |
52 |

C

53 | 54 | 62 | 72 |
73 | 74 |

F

75 | 76 | 80 |
81 | 82 |

G

83 | 84 | 92 | 102 |
103 | 104 |

I

105 | 106 | 110 |
111 | 112 |

M

113 | 114 | 118 | 122 |
123 | 124 |

O

125 | 126 | 130 |
131 | 132 |

P

133 | 134 | 138 | 142 |
143 | 144 |

Q

145 | 146 | 150 | 154 |
155 | 156 |

R

157 | 158 | 164 | 172 |
173 | 174 |

S

175 | 176 | 192 | 208 |
209 | 210 |

T

211 | 212 | 216 |
217 | 218 |

W

219 | 220 | 224 |
225 | 226 | 227 | 228 |
229 | 230 |
231 |
232 | 281 |
282 |
283 | 291 | 292 | 293 | 294 | 295 | 296 | -------------------------------------------------------------------------------- /docs/build/html/_static/underscore.js: -------------------------------------------------------------------------------- 1 | // Underscore.js 1.3.1 2 | // (c) 2009-2012 Jeremy Ashkenas, DocumentCloud Inc. 3 | // Underscore is freely distributable under the MIT license. 4 | // Portions of Underscore are inspired or borrowed from Prototype, 5 | // Oliver Steele's Functional, and John Resig's Micro-Templating. 6 | // For all details and documentation: 7 | // http://documentcloud.github.com/underscore 8 | (function(){function q(a,c,d){if(a===c)return a!==0||1/a==1/c;if(a==null||c==null)return a===c;if(a._chain)a=a._wrapped;if(c._chain)c=c._wrapped;if(a.isEqual&&b.isFunction(a.isEqual))return a.isEqual(c);if(c.isEqual&&b.isFunction(c.isEqual))return c.isEqual(a);var e=l.call(a);if(e!=l.call(c))return false;switch(e){case "[object String]":return a==String(c);case "[object Number]":return a!=+a?c!=+c:a==0?1/a==1/c:a==+c;case "[object Date]":case "[object Boolean]":return+a==+c;case "[object RegExp]":return a.source== 9 | c.source&&a.global==c.global&&a.multiline==c.multiline&&a.ignoreCase==c.ignoreCase}if(typeof a!="object"||typeof c!="object")return false;for(var f=d.length;f--;)if(d[f]==a)return true;d.push(a);var f=0,g=true;if(e=="[object Array]"){if(f=a.length,g=f==c.length)for(;f--;)if(!(g=f in a==f in c&&q(a[f],c[f],d)))break}else{if("constructor"in a!="constructor"in c||a.constructor!=c.constructor)return false;for(var h in a)if(b.has(a,h)&&(f++,!(g=b.has(c,h)&&q(a[h],c[h],d))))break;if(g){for(h in c)if(b.has(c, 10 | h)&&!f--)break;g=!f}}d.pop();return g}var r=this,G=r._,n={},k=Array.prototype,o=Object.prototype,i=k.slice,H=k.unshift,l=o.toString,I=o.hasOwnProperty,w=k.forEach,x=k.map,y=k.reduce,z=k.reduceRight,A=k.filter,B=k.every,C=k.some,p=k.indexOf,D=k.lastIndexOf,o=Array.isArray,J=Object.keys,s=Function.prototype.bind,b=function(a){return 
new m(a)};if(typeof exports!=="undefined"){if(typeof module!=="undefined"&&module.exports)exports=module.exports=b;exports._=b}else r._=b;b.VERSION="1.3.1";var j=b.each= 11 | b.forEach=function(a,c,d){if(a!=null)if(w&&a.forEach===w)a.forEach(c,d);else if(a.length===+a.length)for(var e=0,f=a.length;e2;a== 12 | null&&(a=[]);if(y&&a.reduce===y)return e&&(c=b.bind(c,e)),f?a.reduce(c,d):a.reduce(c);j(a,function(a,b,i){f?d=c.call(e,d,a,b,i):(d=a,f=true)});if(!f)throw new TypeError("Reduce of empty array with no initial value");return d};b.reduceRight=b.foldr=function(a,c,d,e){var f=arguments.length>2;a==null&&(a=[]);if(z&&a.reduceRight===z)return e&&(c=b.bind(c,e)),f?a.reduceRight(c,d):a.reduceRight(c);var g=b.toArray(a).reverse();e&&!f&&(c=b.bind(c,e));return f?b.reduce(g,c,d,e):b.reduce(g,c)};b.find=b.detect= 13 | function(a,c,b){var e;E(a,function(a,g,h){if(c.call(b,a,g,h))return e=a,true});return e};b.filter=b.select=function(a,c,b){var e=[];if(a==null)return e;if(A&&a.filter===A)return a.filter(c,b);j(a,function(a,g,h){c.call(b,a,g,h)&&(e[e.length]=a)});return e};b.reject=function(a,c,b){var e=[];if(a==null)return e;j(a,function(a,g,h){c.call(b,a,g,h)||(e[e.length]=a)});return e};b.every=b.all=function(a,c,b){var e=true;if(a==null)return e;if(B&&a.every===B)return a.every(c,b);j(a,function(a,g,h){if(!(e= 14 | e&&c.call(b,a,g,h)))return n});return e};var E=b.some=b.any=function(a,c,d){c||(c=b.identity);var e=false;if(a==null)return e;if(C&&a.some===C)return a.some(c,d);j(a,function(a,b,h){if(e||(e=c.call(d,a,b,h)))return n});return!!e};b.include=b.contains=function(a,c){var b=false;if(a==null)return b;return p&&a.indexOf===p?a.indexOf(c)!=-1:b=E(a,function(a){return a===c})};b.invoke=function(a,c){var d=i.call(arguments,2);return b.map(a,function(a){return(b.isFunction(c)?c||a:a[c]).apply(a,d)})};b.pluck= 15 | function(a,c){return b.map(a,function(a){return a[c]})};b.max=function(a,c,d){if(!c&&b.isArray(a))return 
Math.max.apply(Math,a);if(!c&&b.isEmpty(a))return-Infinity;var e={computed:-Infinity};j(a,function(a,b,h){b=c?c.call(d,a,b,h):a;b>=e.computed&&(e={value:a,computed:b})});return e.value};b.min=function(a,c,d){if(!c&&b.isArray(a))return Math.min.apply(Math,a);if(!c&&b.isEmpty(a))return Infinity;var e={computed:Infinity};j(a,function(a,b,h){b=c?c.call(d,a,b,h):a;bd?1:0}),"value")};b.groupBy=function(a,c){var d={},e=b.isFunction(c)?c:function(a){return a[c]};j(a,function(a,b){var c=e(a,b);(d[c]||(d[c]=[])).push(a)});return d};b.sortedIndex=function(a, 17 | c,d){d||(d=b.identity);for(var e=0,f=a.length;e>1;d(a[g])=0})})};b.difference=function(a){var c=b.flatten(i.call(arguments,1));return b.filter(a,function(a){return!b.include(c,a)})};b.zip=function(){for(var a=i.call(arguments),c=b.max(b.pluck(a,"length")),d=Array(c),e=0;e=0;d--)b=[a[d].apply(this,b)];return b[0]}}; 24 | b.after=function(a,b){return a<=0?b():function(){if(--a<1)return b.apply(this,arguments)}};b.keys=J||function(a){if(a!==Object(a))throw new TypeError("Invalid object");var c=[],d;for(d in a)b.has(a,d)&&(c[c.length]=d);return c};b.values=function(a){return b.map(a,b.identity)};b.functions=b.methods=function(a){var c=[],d;for(d in a)b.isFunction(a[d])&&c.push(d);return c.sort()};b.extend=function(a){j(i.call(arguments,1),function(b){for(var d in b)a[d]=b[d]});return a};b.defaults=function(a){j(i.call(arguments, 25 | 1),function(b){for(var d in b)a[d]==null&&(a[d]=b[d])});return a};b.clone=function(a){return!b.isObject(a)?a:b.isArray(a)?a.slice():b.extend({},a)};b.tap=function(a,b){b(a);return a};b.isEqual=function(a,b){return q(a,b,[])};b.isEmpty=function(a){if(b.isArray(a)||b.isString(a))return a.length===0;for(var c in a)if(b.has(a,c))return false;return true};b.isElement=function(a){return!!(a&&a.nodeType==1)};b.isArray=o||function(a){return l.call(a)=="[object Array]"};b.isObject=function(a){return a===Object(a)}; 26 | b.isArguments=function(a){return l.call(a)=="[object 
Arguments]"};if(!b.isArguments(arguments))b.isArguments=function(a){return!(!a||!b.has(a,"callee"))};b.isFunction=function(a){return l.call(a)=="[object Function]"};b.isString=function(a){return l.call(a)=="[object String]"};b.isNumber=function(a){return l.call(a)=="[object Number]"};b.isNaN=function(a){return a!==a};b.isBoolean=function(a){return a===true||a===false||l.call(a)=="[object Boolean]"};b.isDate=function(a){return l.call(a)=="[object Date]"}; 27 | b.isRegExp=function(a){return l.call(a)=="[object RegExp]"};b.isNull=function(a){return a===null};b.isUndefined=function(a){return a===void 0};b.has=function(a,b){return I.call(a,b)};b.noConflict=function(){r._=G;return this};b.identity=function(a){return a};b.times=function(a,b,d){for(var e=0;e/g,">").replace(/"/g,""").replace(/'/g,"'").replace(/\//g,"/")};b.mixin=function(a){j(b.functions(a), 28 | function(c){K(c,b[c]=a[c])})};var L=0;b.uniqueId=function(a){var b=L++;return a?a+b:b};b.templateSettings={evaluate:/<%([\s\S]+?)%>/g,interpolate:/<%=([\s\S]+?)%>/g,escape:/<%-([\s\S]+?)%>/g};var t=/.^/,u=function(a){return a.replace(/\\\\/g,"\\").replace(/\\'/g,"'")};b.template=function(a,c){var d=b.templateSettings,d="var __p=[],print=function(){__p.push.apply(__p,arguments);};with(obj||{}){__p.push('"+a.replace(/\\/g,"\\\\").replace(/'/g,"\\'").replace(d.escape||t,function(a,b){return"',_.escape("+ 29 | u(b)+"),'"}).replace(d.interpolate||t,function(a,b){return"',"+u(b)+",'"}).replace(d.evaluate||t,function(a,b){return"');"+u(b).replace(/[\r\n\t]/g," ")+";__p.push('"}).replace(/\r/g,"\\r").replace(/\n/g,"\\n").replace(/\t/g,"\\t")+"');}return __p.join('');",e=new Function("obj","_",d);return c?e(c,b):function(a){return e.call(this,a,b)}};b.chain=function(a){return b(a).chain()};var m=function(a){this._wrapped=a};b.prototype=m.prototype;var v=function(a,c){return c?b(a).chain():a},K=function(a,c){m.prototype[a]= 30 | function(){var a=i.call(arguments);H.call(a,this._wrapped);return 
v(c.apply(b,a),this._chain)}};b.mixin(b);j("pop,push,reverse,shift,sort,splice,unshift".split(","),function(a){var b=k[a];m.prototype[a]=function(){var d=this._wrapped;b.apply(d,arguments);var e=d.length;(a=="shift"||a=="splice")&&e===0&&delete d[0];return v(d,this._chain)}});j(["concat","join","slice"],function(a){var b=k[a];m.prototype[a]=function(){return v(b.apply(this._wrapped,arguments),this._chain)}});m.prototype.chain=function(){this._chain= 31 | true;return this};m.prototype.value=function(){return this._wrapped}}).call(this); 32 | -------------------------------------------------------------------------------- /docs/build/html/opencv_to_rapid.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | OpenCV_to_RAPID.py — ABB-Robot-Machine-Vision 0.0.1 documentation 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 |
29 |
30 |
31 | 32 | 33 |
34 | 35 |
36 |

OpenCV_to_RAPID.py

37 |
38 |
39 | create_robtarget(gripper_height, gripper_rot, cam_pos, image, puck, cam_comp=False)
40 |

Complete a series of transformations to finally create a robtarget of the puck’s position from an image. 41 | These transformations are based on the setup at the University of Stavanger, and with a specific RobotStudio setup 42 | (in regards to coordinate systems). This Pack and Go file includes this setup.

43 |
44 |
Parameters
45 |
    46 |
  • gripper_height (float) – Height of gripper above work object

  • 47 |
  • gripper_rot (tuple) – Orientation of gripper, must be a Quaternion (tuple of length 4)

  • 48 |
  • cam_pos (float[]) – Camera position [x,y]

  • 49 |
  • image (ndarray) – An image

  • 50 |
  • puck (Puck) – A Puck object

  • 51 |
  • cam_comp (bool) – Running camera_adjustment?

  • 52 |
53 |
54 |
Returns
55 |

A Puck object with robtarget as position

56 |
57 |
58 |
59 | 60 |
61 |
62 | transform_position(gripper_rot, puck)
63 |

Transform coordinate system given by image in OpenCV to coordinate system of work object in RAPID. 64 | Swap x & y coordinates and rotate by the same amount that the camera has been rotated.

65 |
66 |
Parameters
67 |
    68 |
  • gripper_rot (tuple) – Orientation of gripper, must be a Quaternion (tuple of length 4)

  • 69 |
  • puck (Puck) – A Puck object

  • 70 |
71 |
72 |
73 |
74 | 75 |
76 |
77 | pixel_to_mm(gripper_height, puck, image)
78 |

Converts coordinates in image from pixels to millimeters. 79 | This depends on the camera’s working distance.

80 |
81 |
Parameters
82 |
    83 |
  • gripper_height (float) – Height of gripper above work object

  • 84 |
  • puck (Puck) – A Puck object

  • 85 |
  • image (ndarray) – An image

  • 86 |
87 |
88 |
89 |
90 | 91 |
92 |
93 | overshoot_comp(gripper_height, puck)
94 |

Compensate for the overshoot phenomenon which occurs when trying to pinpoint 95 | the location of a 3D object in a 2D image.

96 |
97 |
Parameters
98 |
    99 |
  • gripper_height (float) – Height of gripper above work object

  • 100 |
  • puck (Puck) – A Puck object

  • 101 |
102 |
103 |
104 |
105 | 106 |
107 |
108 | camera_compensation(gripper_height, gripper_rot, puck)
109 |

Compensate for an angled camera view. Different cameras will be 110 | angled differently both internally and externally when mounted to a surface. 111 | The slope values must first be calculated by running camera_adjustment. 112 | Works with any camera orientation.

113 |
114 |
Parameters
115 |
    116 |
  • gripper_height (float) – Height of gripper above work object

  • 117 |
  • gripper_rot (tuple) – Orientation of gripper, must be a Quaternion (tuple of length 4)

  • 118 |
  • puck (Puck) – A Puck object

  • 119 |
120 |
121 |
122 |
123 | 124 |
125 |
126 | get_camera_position(trans, rot)
127 |

Uses the offset between the gripper and camera to find the camera’s position. 128 |

129 |
130 |
Parameters
131 |
    132 |
  • trans (float[]) – Position of gripper

  • 133 |
  • rot (tuple) – Orientation of gripper, must be a Quaternion (tuple of length 4)

  • 134 |
135 |
136 |
Returns
137 |

Position of camera

138 |
139 |
140 |
141 | 142 |
143 | 144 | 145 |
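The quaternion arithmetic behind get_camera_position described above can be sketched as follows: rotate a fixed gripper-to-camera offset by the gripper's orientation quaternion, then add it to the gripper position. The offset value and the (w, x, y, z) component ordering are illustrative assumptions; the real offset is calibrated for the setup at UiS.

```python
CAMERA_OFFSET = [50.0, 0.0, 0.0]  # hypothetical gripper->camera offset in mm

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    dot_uv = x * v[0] + y * v[1] + z * v[2]
    dot_uu = x * x + y * y + z * z
    cross = (y * v[2] - z * v[1],
             z * v[0] - x * v[2],
             x * v[1] - y * v[0])
    # Rodrigues-style expansion of q * v * q^-1
    return [2 * dot_uv * u + (w * w - dot_uu) * vi + 2 * w * c
            for u, vi, c in zip((x, y, z), v, cross)]

def get_camera_position(trans, rot):
    """Gripper position plus the offset rotated into the base frame."""
    return [t + o for t, o in zip(trans, quat_rotate(rot, CAMERA_OFFSET))]

# With the identity orientation the camera sits directly at the offset:
print(get_camera_position([300.0, 0.0, 200.0], (1, 0, 0, 0)))  # [350.0, 0.0, 200.0]
```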
146 | 147 |
148 |
149 | 200 |
201 |
202 | 213 | 214 | 215 | 216 | 217 | 218 | --------------------------------------------------------------------------------
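As a closing illustration, the corner-point geometry used by QR_Scanner in ImageFunctions.py (orientation from two polygon corners, center from the corner mean, origin shifted to the image center) can be reproduced without pyzbar. The corner indices 0 and 3 follow the polygon ordering the original code assumes:

```python
import math

def qr_pose(points, image_width, image_height):
    """Orientation, centered position and QR width from four polygon corners."""
    (x1, y1), (x2, y2) = points[0], points[3]
    # Image y grows downward, hence the sign flip on dy (as in QR_Scanner)
    angle = math.degrees(math.atan2(-(y2 - y1), x2 - x1))
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    position = [cx - image_width / 2, cy - image_height / 2]  # image center is (0, 0)
    qr_width = math.hypot(x2 - x1, y2 - y1)
    return position, angle, qr_width

# An axis-aligned 10 px code in the top-left corner of a 100x100 image:
print(qr_pose([(0, 0), (0, 10), (10, 10), (10, 0)], 100, 100))
# ([-45.0, -45.0], 0.0, 10.0)
```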