├── .gitattributes ├── .github └── workflows │ ├── build.yml │ └── deploy.yml ├── .gitignore ├── .pre-commit-config.yaml ├── LICENSE ├── Makefile ├── README.md ├── _quarto.yml ├── data ├── crowd.jpg └── sharpening.mp4 ├── index.qmd ├── nbs ├── aliased_and_not_aliased_patch_extraction.ipynb ├── canny.ipynb ├── color_conversions.ipynb ├── color_raw_to_rgb.ipynb ├── color_yuv420_to_rgb.ipynb ├── connected_components.ipynb ├── data_augmentation_2d.ipynb ├── data_augmentation_kornia_lightning.ipynb ├── data_augmentation_mosiac.ipynb ├── data_augmentation_segmentation.ipynb ├── data_augmentation_sequential.ipynb ├── data_patch_sequential.ipynb ├── descriptors_matching.ipynb ├── extract_combine_patches.ipynb ├── face_detection.ipynb ├── filtering_edges.ipynb ├── filtering_operators.ipynb ├── fit_line.ipynb ├── fit_plane.ipynb ├── gaussian_blur.ipynb ├── geometric_transforms.ipynb ├── geometry_generate_patch.ipynb ├── hello_world_tutorial.ipynb ├── homography.ipynb ├── image_enhancement.ipynb ├── image_histogram.ipynb ├── image_matching.ipynb ├── image_matching_adalam.ipynb ├── image_matching_disk.ipynb ├── image_matching_lightglue.ipynb ├── image_points_transforms.ipynb ├── image_registration.ipynb ├── image_stitching.ipynb ├── line_detection_and_matching_sold2.ipynb ├── morphology_101.ipynb ├── resize_antialias.ipynb ├── rotate_affine.ipynb ├── total_variation_denoising.ipynb ├── unsharp_mask.ipynb ├── visual_prompter.ipynb ├── warp_perspective.ipynb └── zca_whitening.ipynb ├── requirements-dev.txt ├── requirements.txt ├── scripts └── training │ ├── image_classifier │ ├── README.rst │ ├── config.yaml │ ├── main.py │ └── requirements.txt │ ├── object_detection │ ├── README.rst │ ├── config.yaml │ ├── main.py │ └── requirements.txt │ └── semantic_segmentation │ ├── README.rst │ ├── config.yaml │ ├── main.py │ └── requirements.txt └── tutorials ├── _static ├── css │ ├── styles_dark.scss │ └── styles_light.scss └── img │ ├── kornia_logo.svg │ ├── kornia_logo_favicon.png │ 
├── kornia_logo_mini.png │ ├── kornia_logo_only_dark.svg │ └── kornia_logo_only_light.svg ├── assets ├── aliased_and_not_aliased_patch_extraction.png ├── canny.png ├── color_conversions.png ├── color_raw_to_rgb.png ├── color_yuv420_to_rgb.png ├── connected_components.png ├── data_augmentation_2d.png ├── data_augmentation_mosiac.png ├── data_augmentation_segmentation.png ├── data_augmentation_sequential.png ├── data_patch_sequential.png ├── descriptor_matching.png ├── extract_combine_patches.png ├── face_detection.png ├── filtering_edges.png ├── filtering_operators.png ├── fit_line.png ├── fit_plane.png ├── gaussian_blur.png ├── geometric_transforms.png ├── geometry_generate_patch.png ├── hello_world_tutorial.png ├── homography.png ├── image_enhancement.png ├── image_histogram.png ├── image_matching.png ├── image_matching_LightGlue.png ├── image_matching_adalam.png ├── image_matching_disk.png ├── image_points_transforms.png ├── image_registration.png ├── image_stitching.png ├── kornia.png ├── line_detection_and_matching_sold2.png ├── morphology_101.png ├── resize_antialias.png ├── rotate_affine.png ├── total_variation_denoising.png ├── unsharp_mask.png ├── visual_prompter.png ├── warp_perspective.png └── zca_whitening.png └── training ├── image_classification.rst ├── object_detection.rst └── semantic_segmentation.rst /.gitattributes: -------------------------------------------------------------------------------- 1 | tutorials/assets/* binary 2 | tutorial/_static/img/* binary 3 | -------------------------------------------------------------------------------- /.github/workflows/build.yml: -------------------------------------------------------------------------------- 1 | name: Tutorials 2 | 3 | on: 4 | workflow_dispatch: 5 | push: 6 | branches: [master, test-me-*] 7 | pull_request: 8 | schedule: 9 | - cron: "0 4 * * *" 10 | 11 | concurrency: 12 | group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }} 13 | cancel-in-progress: true 14 | 
15 | jobs: 16 | test: 17 | strategy: 18 | matrix: 19 | kornia-ref: ['main'] # TODO: coverage check through multiple versions: 'v0.7.1', 'v0.7.0' 20 | 21 | uses: kornia/workflows/.github/workflows/tutorials.yml@v1.10.1 22 | with: 23 | ref: ${{ matrix.kornia-ref }} 24 | 25 | build: 26 | needs: test 27 | name: Build tutorials 28 | runs-on: ${{ matrix.os }}-latest 29 | 30 | strategy: 31 | matrix: 32 | os: ['ubuntu'] 33 | kornia-ref: ['main'] 34 | 35 | steps: 36 | - uses: kornia/workflows/.github/actions/env@v1.10.1 37 | with: 38 | ref: ${{ matrix.kornia-ref }} 39 | extra-deps: torchvision==0.18.1+cpu 40 | 41 | - uses: actions/checkout@v4 42 | with: 43 | repository: 'kornia/tutorials' 44 | path: 'tutorials-repo/' 45 | 46 | - name: Install dependencies 47 | working-directory: ./tutorials-repo/ 48 | shell: bash -l {0} 49 | run: make setup 50 | 51 | - uses: quarto-dev/quarto-actions/setup@v2 52 | 53 | - name: Check deps 54 | working-directory: ./tutorials-repo/ 55 | shell: bash -l {0} 56 | run: make check-deps 57 | 58 | - name: Render Quarto Project 59 | uses: quarto-dev/quarto-actions/render@v2 60 | with: 61 | to: html 62 | path: ./tutorials-repo/ 63 | -------------------------------------------------------------------------------- /.github/workflows/deploy.yml: -------------------------------------------------------------------------------- 1 | name: Generate docs and deploy to GitHub Pages 2 | on: 3 | workflow_dispatch: 4 | push: 5 | branches: [master] 6 | 7 | jobs: 8 | deploy: 9 | name: Deploy tutorials - ${{ matrix.os }}, ${{ matrix.kornia-ref }} 10 | runs-on: ${{ matrix.os }}-latest 11 | 12 | strategy: 13 | matrix: 14 | os: ['ubuntu'] 15 | kornia-ref: ['main'] 16 | 17 | steps: 18 | - uses: kornia/workflows/.github/actions/env@v1.10.1 19 | with: 20 | ref: ${{ matrix.kornia-ref }} 21 | 22 | - uses: actions/checkout@v4 23 | with: 24 | repository: 'kornia/tutorials' 25 | path: 'tutorials-repo/' 26 | 27 | - name: Install dependencies
28 | working-directory: ./tutorials-repo/ 29 | shell: bash -l {0} 30 | run: make setup 31 | 32 | - uses: quarto-dev/quarto-actions/setup@v2 33 | 34 | - name: Render Quarto Project 35 | uses: quarto-dev/quarto-actions/render@v2 36 | with: 37 | to: html 38 | path: ./tutorials-repo/ 39 | 40 | - name: Deploy to GitHub Pages 41 | uses: peaceiris/actions-gh-pages@v3 42 | with: 43 | github_token: ${{ secrets.GITHUB_TOKEN }} 44 | force_orphan: true 45 | publish_dir: ./tutorials-repo/_site 46 | user_name: github-actions[bot] 47 | user_email: 41898282+github-actions[bot]@users.noreply.github.com 48 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | *.ipynb_checkpoints/* 2 | .pyc 3 | 4 | # Temp files created when building the pages 5 | _nbs/ 6 | _build/ 7 | /.quarto/ 8 | _site/ 9 | 10 | # Allows just notebooks inside nbs/ 11 | !nbs/**.ipynb 12 | nbs/*.png* 13 | nbs/*.jpg* 14 | nbs/*.jpeg* 15 | nbs/*.mp* 16 | nbs/*.tif* 17 | -------------------------------------------------------------------------------- /.pre-commit-config.yaml: -------------------------------------------------------------------------------- 1 | repos: 2 | - repo: https://github.com/pre-commit/pre-commit-hooks 3 | rev: v5.0.0 4 | hooks: 5 | - id: check-merge-conflict 6 | - id: check-case-conflict 7 | - id: check-yaml 8 | - id: detect-private-key 9 | - id: debug-statements 10 | - id: requirements-txt-fixer 11 | - id: trailing-whitespace 12 | - id: end-of-file-fixer 13 | 14 | - repo: https://github.com/nbQA-dev/nbQA 15 | rev: 1.9.1 16 | hooks: 17 | - id: nbqa-pyupgrade 18 | args: [--py38-plus] 19 | - id: nbqa-isort 20 | args: ['--profile=black'] 21 | - id: nbqa-black 22 | args: ['--nbqa-dont-skip-bad-cells', '--line-length=128'] 23 | -------------------------------------------------------------------------------- /LICENSE: 
-------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 
39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. 
Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. 
In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. 
We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright [yyyy] [name of copyright owner] 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /Makefile: -------------------------------------------------------------------------------- 1 | SOURCEDIR = nbs 2 | DSTDIR = _nbs 3 | .PHONY: purge generate execute sphinx build setup check-deps 4 | 5 | purge: 6 | @echo "\nRemoving all files that are not notebooks on '$(SOURCEDIR)'...\n" 7 | find $(SOURCEDIR) -type f ! -name "*.ipynb" -delete 8 | 9 | generate: 10 | @echo "\n\nGenerating all tutorials notebooks to '$(SOURCEDIR)'...\n" 11 | jupyter nbconvert $(SOURCEDIR)/**.ipynb --to notebook --output-dir $(DSTDIR) 12 | 13 | execute: 14 | @echo "\n\nExecuting all tutorials notebooks under '$(DSTDIR)'...\n" 15 | jupyter nbconvert $(DSTDIR)/**.ipynb --execute --inplace 16 | 17 | build: purge generate render 18 | 19 | setup-quarto: 20 | curl -LO https://www.quarto.org/download/latest/quarto-linux-amd64.deb 21 | sudo dpkg -i quarto-linux-amd64.deb 22 | rm quarto-linux-amd64.deb 23 | 24 | setup: 25 | pip install -r requirements.txt && pip install -r requirements-dev.txt; 26 | 27 | preview: 28 | quarto preview . 29 | 30 | render: 31 | quarto render . 
32 | 33 | check-deps: 34 | @which python 35 | @python --version 36 | @python -c "import kornia;print('Kornia version: ', kornia.__version__)" 37 | @python -c "import torch;print('Pytorch version: ', torch.__version__)" 38 | @python -c "import torchvision;print('Pytorch vision version: ', torchvision.__version__)" 39 | @python -c "import cv2;print('OpenCV version: ', cv2.__version__)" 40 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Kornia tutorials 2 | 3 | The kornia tutorials provide basic to advanced tutorials for the [kornia library](https://github.com/kornia/kornia). 4 | 5 | These tutorials are made by the community for the community. They show how to use the Kornia API, and also how these 6 | computer vision algorithms can be used in a variety of scenarios. 7 | 8 | The kornia tutorials use the [quarto](https://quarto.org/) framework to generate the website from jupyter notebooks. 9 | 10 | ## Guide 11 | 12 | 13 | ### Step-by-step environment setup 14 | 1. Install [quarto](https://quarto.org/) following the official docs: https://quarto.org/docs/get-started/ 15 | 1. Create a virtual environment 16 | ```console 17 | # with virtual env 18 | $ virtualenv venv -p python3.10 19 | # with conda 20 | $ conda create -p venv python=3.10 21 | ``` 22 | 1. Install the dependencies 23 | ```console 24 | $ pip install -r requirements.txt 25 | $ pip install -r requirements-dev.txt 26 | ``` 27 | 1. Run quarto in preview mode, which automatically reloads the browser when input files or document resources change. 28 | ```console 29 | # Using our makefile command 30 | $ make preview 31 | # Using the quarto cli directly 32 | $ quarto preview . 33 | ``` 34 | 35 | For Linux users, you can use `make setup-quarto`, which downloads and installs the quarto binary for Linux, 36 | and `make setup`, which pip installs the requirements.
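After installing the dependencies, the same sanity check that `make check-deps` performs can be approximated from plain Python. A minimal sketch (the `check_deps` helper below is hypothetical, not part of this repo), assuming only the standard library:

```python
import importlib.util
import sys


def check_deps(names=("kornia", "torch", "torchvision", "cv2")):
    # Mirror the spirit of `make check-deps`: report which of the tutorial
    # dependencies are importable in the current interpreter, without failing hard.
    return {name: importlib.util.find_spec(name) is not None for name in names}


if __name__ == "__main__":
    print("Python:", sys.version.split()[0])
    for name, ok in check_deps().items():
        print(f"{name}: {'found' if ok else 'MISSING - run `make setup`'}")
```

Unlike the Makefile target, this only checks importability rather than printing versions, so it works even in a partially set-up environment.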
37 | 38 | ### How to add a new tutorial 39 | 40 | The kornia tutorials are jupyter notebooks (you can find them in the [./nbs/](./nbs/) directory). Each notebook corresponds to a 41 | "blogpost" page with the same content. The notebook is compiled into a webpage by quarto, which means you can write a 42 | tutorial as a jupyter notebook with the usual pattern of Python and Markdown cells. This enables some special features; 43 | to use them, we recommend checking the [quarto docs](https://quarto.org/docs/). 44 | 45 | Aside from the content of the tutorial, the first cell of the notebook should be a markdown cell containing a header. This header 46 | allows us to set the thumbnail, author, categories, etc. It should look like: 47 | 48 | ```txt 49 | --- 50 | title: "" 51 | description: "" 52 | author: 53 | - "" 54 | date: "" 55 | categories: 56 | - "" 57 | - "" 58 | - "" 59 | - "" 60 | - "" 61 | image: "../tutorials/assets/.png" 62 | --- 63 | 64 | Open in google colab 65 | ``` 66 | 67 | For this header you should generate an image to be used as the thumbnail, and save it into the [tutorials/assets/](./tutorials/assets/) 68 | directory. Also update the colab link, so users can open your tutorial directly in colab. You can add 69 | `N` categories; look at the categories already available in the README, and if you want to use a different one, please update the 70 | README too. 71 | 72 | 73 | ### Tutorials categories 74 | If you add a new category in the tutorials frontmatter, update this list too.
75 | 76 | By Levels 77 | - Basic 78 | - Intermediate 79 | - Advanced 80 | 81 | By module 82 | - kornia.augmentation 83 | - Kornia.feature 84 | - kornia.contrib 85 | - kornia.filters 86 | - kornia.color 87 | - kornia.io 88 | - kornia.geometry 89 | - kornia.enhance 90 | 91 | By generic type/category 92 | - Data augmentation 93 | - Segmentation 94 | - Edge Detection 95 | - Labeling 96 | - Denoising 97 | - Color spaces 98 | - Local features 99 | - Filters 100 | - Blur 101 | - Line 102 | - Plane 103 | - Keypoints 104 | - Homography 105 | - Image matching 106 | - Image Registration 107 | - Warp image 108 | - Augmentation container 109 | - Augmentation Sequential 110 | - Line detection 111 | - Line matching 112 | - Rescale 113 | - Affine 114 | - 2D 115 | - Unsupervised 116 | - Self-supervised 117 | 118 | By specific names of models / API 119 | - SOLD2 120 | - KeyNet 121 | - Adalam 122 | - HardNet 123 | - DISK 124 | - Patches 125 | - LAF 126 | - LoFTR 127 | -------------------------------------------------------------------------------- /_quarto.yml: -------------------------------------------------------------------------------- 1 | project: 2 | output-dir: _site 3 | type: website 4 | 5 | website: 6 | title: "Kornia" 7 | description: "" 8 | page-navigation: true 9 | favicon: "tutorials/_static/img/kornia_logo_favicon.png" 10 | site-url: https://kornia.github.io/tutorials/ 11 | repo-url: https://github.com/kornia/tutorials/ 12 | repo-branch: master 13 | repo-actions: [edit, source, issue] 14 | google-analytics: '' 15 | open-graph: true 16 | 17 | twitter-card: 18 | site: "@sitehandle" 19 | 20 | search: 21 | location: navbar 22 | type: overlay 23 | 24 | navbar: 25 | collapse: false 26 | pinned: true 27 | search: true 28 | logo: "tutorials/_static/img/kornia_logo_mini.png" 29 | right: 30 | - icon: github 31 | href: https://github.com/kornia/ 32 | - icon: twitter 33 | href: https://twitter.com/kornia_foss 34 | - icon: linkedin 35 | href: 
https://www.linkedin.com/company/kornia/ 36 | 37 | left: 38 | - text: "Tutorials" 39 | file: index.qmd 40 | - text: "Docs" 41 | href: https://kornia.readthedocs.io 42 | - text: "About" 43 | href: https://kornia.readthedocs.io/en/latest/get-started/introduction.html 44 | - text: "Projects" 45 | menu: 46 | - text: Kornia 47 | href: "https://github.com/kornia/kornia" 48 | - text: Limbus 49 | href: "https://github.com/kornia/limbus" 50 | - text: Kornia-rs 51 | href: "https://github.com/kornia/kornia-rs" 52 | 53 | 54 | page-footer: 55 | center: 56 | - text: Copyright © 2023 Kornia. All Rights Reserved. 57 | right: 58 | - icon: question-circle-fill 59 | href: https://github.com/kornia/kornia/discussions 60 | - icon: lightbulb-fill 61 | href: https://github.com/kornia/kornia/ 62 | - icon: twitter 63 | href: https://twitter.com/kornia_foss 64 | - icon: linkedin 65 | href: https://www.linkedin.com/company/kornia/ 66 | 67 | format: 68 | html: 69 | link-external-newwindow: false 70 | link-external-icon: false 71 | number-sections: false 72 | fig-responsive: true 73 | toc: true 74 | toc-depth: 5 75 | mainfont: "Open Sans, sans-serif" 76 | fontsize: 18px 77 | grid: 78 | body-width: 1350px 79 | margin-width: 380px 80 | theme: 81 | light: 82 | - sandstone 83 | - tutorials/_static/css/styles_light.scss 84 | dark: 85 | - sandstone 86 | - tutorials/_static/css/styles_dark.scss 87 | 88 | # Code 89 | code-copy: hover 90 | highlight-style: monokai 91 | code-link: true 92 | code-overflow: scroll 93 | -------------------------------------------------------------------------------- /data/crowd.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/data/crowd.jpg -------------------------------------------------------------------------------- /data/sharpening.mp4: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/data/sharpening.mp4 -------------------------------------------------------------------------------- /index.qmd: -------------------------------------------------------------------------------- 1 | --- 2 | title: "Tutorials" 3 | page-layout: full 4 | title-block-banner: true 5 | listing: 6 | type: default 7 | categories: true 8 | sort-ui: true 9 | filter-ui: true 10 | fields: [image, date, title, author, categories, description, file-modified] 11 | image-placeholder: "tutorials/assets/kornia.png" 12 | contents: 13 | - nbs/ 14 | - tutorials/ 15 | sort: 16 | - "date desc" 17 | - "title desc" 18 | --- 19 | -------------------------------------------------------------------------------- /nbs/data_augmentation_kornia_lightning.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "---\n", 9 | "title: \"Kornia and PyTorch Lightning GPU data augmentation\"\n", 10 | "description: \"In this tutorial we show how one can combine both Kornia and PyTorch Lightning to perform data augmentation to train a model using CPUs and GPUs in batch mode without additional effort.\"\n", 11 | "author:\n", 12 | " - \"Edgar Riba\"\n", 13 | "date: 03-18-2021\n", 14 | "categories:\n", 15 | " - Basic\n", 16 | " - Data augmentation\n", 17 | " - Pytorch lightning\n", 18 | " - kornia.augmentation\n", 19 | "image: \"../tutorials/assets/kornia.png\"\n", 20 | "---\n", 21 | "\n", 22 | "\"Open" 23 | ] 24 | }, 25 | { 26 | "cell_type": "markdown", 27 | "metadata": {}, 28 | "source": [ 29 | "## Install Kornia and PyTorch Lightning\n", 30 | "We first install Kornia and PyTorch Lightning" 31 | ] 32 | }, 33 | { 34 | "cell_type": "code", 35 | "execution_count": null, 36 | "metadata": { 37 | "tags": [ 38 | "skip-execution" 39 | ] 40 | }, 41 | "outputs": [], 42 | 
"source": [ 43 | "%%capture\n", 44 | "!pip install kornia\n", 45 | "!pip install kornia-rs\n", 46 | "!pip install pytorch_lightning torchmetrics" 47 | ] 48 | }, 49 | { 50 | "cell_type": "markdown", 51 | "metadata": {}, 52 | "source": [ 53 | "Import the needed libraries" 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": null, 59 | "metadata": {}, 60 | "outputs": [], 61 | "source": [ 62 | "import os\n", 63 | "\n", 64 | "import kornia as K\n", 65 | "import numpy as np\n", 66 | "import pytorch_lightning as pl\n", 67 | "import torch\n", 68 | "import torch.nn as nn\n", 69 | "import torchmetrics\n", 70 | "from PIL import Image\n", 71 | "from torch.nn import functional as F\n", 72 | "from torch.utils.data import DataLoader\n", 73 | "from torchvision.datasets import CIFAR10" 74 | ] 75 | }, 76 | { 77 | "cell_type": "markdown", 78 | "metadata": {}, 79 | "source": [ 80 | "## Define Data Augmentations module" 81 | ] 82 | }, 83 | { 84 | "cell_type": "code", 85 | "execution_count": null, 86 | "metadata": {}, 87 | "outputs": [], 88 | "source": [ 89 | "class DataAugmentation(nn.Module):\n", 90 | " \"\"\"Module to perform data augmentation using Kornia on torch tensors.\"\"\"\n", 91 | "\n", 92 | " def __init__(self, apply_color_jitter: bool = False) -> None:\n", 93 | " super().__init__()\n", 94 | " self._apply_color_jitter = apply_color_jitter\n", 95 | "\n", 96 | " self._max_val: float = 255.0\n", 97 | "\n", 98 | " self.transforms = nn.Sequential(K.enhance.Normalize(0.0, self._max_val), K.augmentation.RandomHorizontalFlip(p=0.5))\n", 99 | "\n", 100 | " self.jitter = K.augmentation.ColorJitter(0.5, 0.5, 0.5, 0.5)\n", 101 | "\n", 102 | " @torch.no_grad() # disable gradients for effiency\n", 103 | " def forward(self, x: torch.Tensor) -> torch.Tensor:\n", 104 | " x_out = self.transforms(x) # BxCxHxW\n", 105 | " if self._apply_color_jitter:\n", 106 | " x_out = self.jitter(x_out)\n", 107 | " return x_out" 108 | ] 109 | }, 110 | { 111 | "cell_type": "markdown", 112 | 
"metadata": {}, 113 | "source": [ 114 | "## Define a Pre-processing model" 115 | ] 116 | }, 117 | { 118 | "cell_type": "code", 119 | "execution_count": null, 120 | "metadata": {}, 121 | "outputs": [], 122 | "source": [ 123 | "class PreProcess(nn.Module):\n", 124 | " \"\"\"Module to perform pre-process using Kornia on torch tensors.\"\"\"\n", 125 | "\n", 126 | " def __init__(self) -> None:\n", 127 | " super().__init__()\n", 128 | "\n", 129 | " @torch.no_grad() # disable gradients for effiency\n", 130 | " def forward(self, x: Image) -> torch.Tensor:\n", 131 | " x_tmp: np.ndarray = np.array(x) # HxWxC\n", 132 | " x_out: torch.Tensor = K.image_to_tensor(x_tmp, keepdim=True) # CxHxW\n", 133 | " return x_out.float()" 134 | ] 135 | }, 136 | { 137 | "cell_type": "markdown", 138 | "metadata": {}, 139 | "source": [ 140 | "## Define PyTorch Lightning model" 141 | ] 142 | }, 143 | { 144 | "cell_type": "code", 145 | "execution_count": null, 146 | "metadata": {}, 147 | "outputs": [], 148 | "source": [ 149 | "class CoolSystem(pl.LightningModule):\n", 150 | " def __init__(self):\n", 151 | " super().__init__()\n", 152 | " # not the best model...\n", 153 | " self.l1 = torch.nn.Linear(3 * 32 * 32, 10)\n", 154 | "\n", 155 | " self.preprocess = PreProcess()\n", 156 | "\n", 157 | " self.transform = DataAugmentation()\n", 158 | "\n", 159 | " self.accuracy = torchmetrics.Accuracy(task=\"multiclass\", num_classes=10)\n", 160 | "\n", 161 | " def forward(self, x):\n", 162 | " return torch.relu(self.l1(x.view(x.size(0), -1)))\n", 163 | "\n", 164 | " def training_step(self, batch, batch_idx):\n", 165 | " # REQUIRED\n", 166 | " x, y = batch\n", 167 | " x_aug = self.transform(x) # => we perform GPU/Batched data augmentation\n", 168 | " logits = self.forward(x_aug)\n", 169 | " loss = F.cross_entropy(logits, y)\n", 170 | " self.log(\"train_acc_step\", self.accuracy(logits.argmax(1), y))\n", 171 | " self.log(\"train_loss\", loss)\n", 172 | " return loss\n", 173 | "\n", 174 | " def 
validation_step(self, batch, batch_idx):\n", 175 | "        # OPTIONAL\n", 176 | "        x, y = batch\n", 177 | "        logits = self.forward(x)\n", 178 | "        self.log(\"val_acc_step\", self.accuracy(logits.argmax(1), y))\n", 179 | "        return F.cross_entropy(logits, y)\n", 180 | "\n", 181 | "    def test_step(self, batch, batch_idx):\n", 182 | "        # OPTIONAL\n", 183 | "        x, y = batch\n", 184 | "        logits = self.forward(x)\n", 185 | "        acc = self.accuracy(logits.argmax(1), y)\n", 186 | "        self.log(\"test_acc_step\", acc)\n", 187 | "        return acc\n", 188 | "\n", 189 | "    def configure_optimizers(self):\n", 190 | "        # REQUIRED\n", 191 | "        # can return multiple optimizers and learning-rate schedulers\n", 192 | "        # (LBFGS is automatically supported; no need for a closure function)\n", 193 | "        return torch.optim.Adam(self.parameters(), lr=0.0004)\n", 194 | "\n", 195 | "    def prepare_data(self):\n", 196 | "        CIFAR10(os.getcwd(), train=True, download=True, transform=self.preprocess)\n", 197 | "        CIFAR10(os.getcwd(), train=False, download=True, transform=self.preprocess)\n", 198 | "\n", 199 | "    def train_dataloader(self):\n", 200 | "        # REQUIRED\n", 201 | "        dataset = CIFAR10(os.getcwd(), train=True, download=False, transform=self.preprocess)\n", 202 | "        loader = DataLoader(dataset, batch_size=32, num_workers=1)\n", 203 | "        return loader\n", 204 | "\n", 205 | "    def val_dataloader(self):\n", 206 | "        dataset = CIFAR10(os.getcwd(), train=True, download=False, transform=self.preprocess)\n", 207 | "        loader = DataLoader(dataset, batch_size=32, num_workers=1)\n", 208 | "        return loader\n", 209 | "\n", 210 | "    def test_dataloader(self):\n", 211 | "        dataset = CIFAR10(os.getcwd(), train=False, download=False, transform=self.preprocess)\n", 212 | "        loader = DataLoader(dataset, batch_size=16, num_workers=1)\n", 213 | "        return loader" 214 | ] 215 | }, 216 | { 217 | "cell_type": "markdown", 218 | "metadata": {}, 219 | "source": [ 220 | "## Run training" 221 | ] 222 | }, 223 | { 224 | "cell_type": "code", 225 | "execution_count": 
null, 226 | "metadata": {}, 227 | "outputs": [ 228 | { 229 | "name": "stderr", 230 | "output_type": "stream", 231 | "text": [ 232 | "GPU available: True (cuda), used: False\n", 233 | "TPU available: False, using: 0 TPU cores\n", 234 | "IPU available: False, using: 0 IPUs\n", 235 | "HPU available: False, using: 0 HPUs\n" 236 | ] 237 | }, 238 | { 239 | "name": "stdout", 240 | "output_type": "stream", 241 | "text": [] 242 | }, 243 | { 244 | "name": "stderr", 245 | "output_type": "stream", 246 | "text": [ 247 | "\n", 248 | " | Name | Type | Params\n", 249 | "--------------------------------------------------\n", 250 | "0 | l1 | Linear | 30.7 K\n", 251 | "1 | preprocess | PreProcess | 0 \n", 252 | "2 | transform | DataAugmentation | 0 \n", 253 | "3 | accuracy | MulticlassAccuracy | 0 \n", 254 | "--------------------------------------------------\n", 255 | "30.7 K Trainable params\n", 256 | "0 Non-trainable params\n", 257 | "30.7 K Total params\n", 258 | "0.123 Total estimated model params size (MB)\n", 259 | "`Trainer.fit` stopped: `max_epochs=1` reached.\n" 260 | ] 261 | } 262 | ], 263 | "source": [ 264 | "from pytorch_lightning import Trainer\n", 265 | "\n", 266 | "# init model\n", 267 | "model = CoolSystem()\n", 268 | "\n", 269 | "# Initialize a trainer\n", 270 | "accelerator = \"cpu\" # can be 'gpu'\n", 271 | "\n", 272 | "trainer = Trainer(accelerator=accelerator, max_epochs=1, enable_progress_bar=False)\n", 273 | "\n", 274 | "# Train the model ⚡\n", 275 | "trainer.fit(model)" 276 | ] 277 | }, 278 | { 279 | "cell_type": "markdown", 280 | "metadata": {}, 281 | "source": [ 282 | "# Test the model" 283 | ] 284 | }, 285 | { 286 | "cell_type": "code", 287 | "execution_count": null, 288 | "metadata": {}, 289 | "outputs": [ 290 | { 291 | "name": "stdout", 292 | "output_type": "stream", 293 | "text": [] 294 | }, 295 | { 296 | "name": "stderr", 297 | "output_type": "stream", 298 | "text": [] 299 | }, 300 | { 301 | "name": "stdout", 302 | "output_type": "stream", 303 | 
"text": [ 304 | "────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────\n", 305 | " Test metric DataLoader 0\n", 306 | "────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────\n", 307 | " test_acc_step 0.10000000149011612\n", 308 | "────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────\n" 309 | ] 310 | }, 311 | { 312 | "data": { 313 | "text/plain": [ 314 | "[{'test_acc_step': 0.10000000149011612}]" 315 | ] 316 | }, 317 | "execution_count": null, 318 | "metadata": {}, 319 | "output_type": "execute_result" 320 | } 321 | ], 322 | "source": [ 323 | "trainer.test(model)" 324 | ] 325 | }, 326 | { 327 | "cell_type": "markdown", 328 | "metadata": {}, 329 | "source": [ 330 | "## Visualize" 331 | ] 332 | }, 333 | { 334 | "cell_type": "code", 335 | "execution_count": null, 336 | "metadata": {}, 337 | "outputs": [], 338 | "source": [ 339 | "# # Start tensorboard.\n", 340 | "# %load_ext tensorboard\n", 341 | "# %tensorboard --logdir lightning_logs/" 342 | ] 343 | } 344 | ], 345 | "metadata": { 346 | "kernelspec": { 347 | "display_name": "python3", 348 | "language": "python", 349 | "name": "python3" 350 | } 351 | }, 352 | "nbformat": 4, 353 | "nbformat_minor": 1 354 | } 355 | -------------------------------------------------------------------------------- /nbs/fit_line.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "---\n", 9 | "title: \"Fit line tutorial\"\n", 10 | "description: \"This tutorial use shows how to generate a line based on points. 
Using the `ParametrizedLine` and `fit_line` from `kornia.gemetry.line`\"\n", 11 | "author:\n", 12 | " - \"Edgar Riba\"\n", 13 | "date: 07-15-2022\n", 14 | "categories:\n", 15 | " - Basic\n", 16 | " - Line\n", 17 | " - kornia.geometry\n", 18 | "image: \"../tutorials/assets/fit_line.png\"\n", 19 | "---\n", 20 | "\n", 21 | "\"Open" 22 | ] 23 | }, 24 | { 25 | "cell_type": "code", 26 | "execution_count": null, 27 | "metadata": {}, 28 | "outputs": [], 29 | "source": [ 30 | "import matplotlib.pyplot as plt\n", 31 | "import torch\n", 32 | "from kornia.core import concatenate, stack\n", 33 | "from kornia.geometry.line import ParametrizedLine, fit_line" 34 | ] 35 | }, 36 | { 37 | "cell_type": "code", 38 | "execution_count": null, 39 | "metadata": {}, 40 | "outputs": [], 41 | "source": [ 42 | "std = 1.2 # standard deviation for the points\n", 43 | "num_points = 50 # total number of points" 44 | ] 45 | }, 46 | { 47 | "cell_type": "code", 48 | "execution_count": null, 49 | "metadata": {}, 50 | "outputs": [ 51 | { 52 | "name": "stdout", 53 | "output_type": "stream", 54 | "text": [ 55 | "Origin: Parameter containing:\n", 56 | "tensor([0., 0.], requires_grad=True)\n", 57 | "Direction: Parameter containing:\n", 58 | "tensor([0.7071, 0.7071], requires_grad=True)\n" 59 | ] 60 | } 61 | ], 62 | "source": [ 63 | "# create a baseline\n", 64 | "p0 = torch.tensor([0.0, 0.0])\n", 65 | "p1 = torch.tensor([1.0, 1.0])\n", 66 | "\n", 67 | "l1 = ParametrizedLine.through(p0, p1)\n", 68 | "print(l1)" 69 | ] 70 | }, 71 | { 72 | "cell_type": "code", 73 | "execution_count": null, 74 | "metadata": {}, 75 | "outputs": [], 76 | "source": [ 77 | "# sample some points and weights\n", 78 | "pts, w = [], []\n", 79 | "for t in torch.linspace(-10, 10, num_points):\n", 80 | " p2 = l1.point_at(t)\n", 81 | " p2_noise = torch.rand_like(p2) * std\n", 82 | " p2 += p2_noise\n", 83 | " pts.append(p2)\n", 84 | " w.append(1 - p2_noise.mean())\n", 85 | "pts = stack(pts)\n", 86 | "w = stack(w)" 87 | ] 88 | }, 89 | { 90 | 
"cell_type": "code", 91 | "execution_count": null, 92 | "metadata": {}, 93 | "outputs": [ 94 | { 95 | "name": "stdout", 96 | "output_type": "stream", 97 | "text": [ 98 | "Origin: Parameter containing:\n", 99 | "tensor([[0.5933, 0.5888]], requires_grad=True)\n", 100 | "Direction: Parameter containing:\n", 101 | "tensor([[-0.7146, -0.6995]], requires_grad=True)\n" 102 | ] 103 | } 104 | ], 105 | "source": [ 106 | "# fit the the line\n", 107 | "l2 = fit_line(pts[None, ...], w[None, ...])\n", 108 | "print(l2)\n", 109 | "\n", 110 | "# project some points along the estimated line\n", 111 | "p3 = l2.point_at(-10)\n", 112 | "p4 = l2.point_at(10)" 113 | ] 114 | }, 115 | { 116 | "cell_type": "code", 117 | "execution_count": null, 118 | "metadata": {}, 119 | "outputs": [ 120 | { 121 | "data": { 122 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAiIAAAGdCAYAAAAvwBgXAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAABNTElEQVR4nO3deViU5f4G8HtmWEVAVEBhRhE1d2VRKZfCsrQ0F8QlsZPWsURMyMo0s/K4oGmFW2jLUX+Ju6ilWZmp6VFzQU0z3Bcc941BkQFm3t8fr4OAA8wM884McH+uay7PvDwz7zOXxnzPs9yPTBAEAURERER2ILd3B4iIiKjqYiFCREREdsNChIiIiOyGhQgRERHZDQsRIiIishsWIkRERGQ3LESIiIjIbliIEBERkd042bsDpdHr9bh8+TI8PT0hk8ns3R0iIiIygSAIyMrKQkBAAOTy0sc8HLoQuXz5MlQqlb27QURERBbIyMiAUqkstY1DFyKenp4AxA/i5eVl594QERGRKTQaDVQqVcH3eGkcuhAxTMd4eXmxECEiIqpgTFlWIeliVZ1Oh4kTJ6JBgwZwd3dHw4YNMXnyZPCcPSIiIgIkHhGZMWMGkpOTsWTJErRo0QIHDhzAsGHD4O3tjdGjR0t5ayIiIqoAJC1Edu/ejd69e6NHjx4AgKCgICxfvhz79u2T8rZERERUQUg6NdOhQwds3boVJ0+eBAAcOXIEu3btwosvvmi0vVarhUajKfIgIiKiykvSEZFx48ZBo9GgadOmUCgU0Ol0mDp1KmJiYoy2T0xMxKRJk6TsEhERETkQSUdEVq1ahZSUFCxbtgxpaWlYsmQJZs2ahSVLlhhtP378eGRmZhY8MjIypOweERER2ZlMkHALi0qlwrhx4xAXF1dwbcqUKVi6dCnS09PLfL1Go4G3tzcyMzO5fZeIiKiCMOf7W9IRkezs7MeiXRUKBfR6vZS3JSIiogpC0jUiL7/8MqZOnYp69eqhRYsWOHToEL744gu8/vrrUt6WiIiIyqLTATt3AleuAHXrAp07AwqFzbsh6dRMVlYWJk6ciHXr1uH69esICAjAK6+8go8//hguLi5lvp5TM0RERBJITQXi44FLlx5dUyqB2bOBqKhyv70539+SFiLlxUKEiIjIylJTgehooPjXvyGOfc2achcjDrNGhI
iIiByITieOhBgbgzBcS0gQ29kICxEiIqIKTJuvw3e7ziFfZ8JGkJ07i07HFCcIQEaG2M5GHPr0XSIioirHjEWkl+5kI27ZIRzJuIvrmhyMf6lZ6e995YppfTC1nRWwECEiInIUZiwi3X7iOhJWHsbd7DzUqOaMJxvWKvv969Y1rR+mtrMCTs0QERE5AsMi0uJTJ2q1eD01FQCg0wv4YstJDFu8H3ez89Ba6Y2Nb3dClyZ+Zd+jc2exsDEsTC1OJgNUKrGdjbAQISIisjcTF5HeunMPQz//GXO2noIgAEPaq7B6xFNQ+lQz7T4KhTi6AjxejBieJyXZNE+EhQgREZG9mbCI9KDOAz0nrsPOW3q45+Yg6cdZmBLfA64/bDDvXlFR4hbdwMCi15VKq2zdNRfXiBAREdlbKYtDBQCLw1/G1C5vIF/hhOBbGViwPhFP3LwojmJER5tfQERFAb17O0SyKgsRIiIieythceg9F3d80P1tbGr2NACgxz9/YMbPc1E994HYQBDEYiQhQSwszCkkFAogMrJ8/bYCFiJERET2ZlhEqlYXrAk5WbseRvQZj7O1VHDS5WPCtu8w9OCPeGyZaeHsDwcoLMzFNSJERET2VmwR6frmkej96hc4W0uFupobWLlsHIYZK0IKs2H2hzVxRISIiMgRREVBu2oNJi/bi6VPPAMA6HwuDUl7FqPW5bNlv96G2R/WxBERIiIiB3DpTjYGXPXD0ieegQzAaH8tFo/ohFqn/3G47A9r4ogIERGRnW1LF1NSMx+IKalJA0MQWTigbPZscXeMTFY0a8RO2R/WxBERIiIiO9HpBXz+6wkMW7wfmQ/y0OZhSmpk8ZRUB8v+sCaOiBAREdnBrXtaxK84jF2nbwIA/vVUfUzo0QyuTiWMbDhQ9oc1sRAhIiKysYMXbiMu5RCuanLg7qzA9H6t0DsksOwXOkj2hzWxECEiIrIRQRDw3/+dR+JP/yBfL6ChrwcWDAlHY39Pe3fNbliIEBER2UBWTh7GrT2KTUfFvI+eretier/WqO5atb+Kq/anJyIisoETV7MQu/Qgzt68D2eFDB/1aI5/PVUfspK25FYhLESIiIgktO7QJXyYegwP8nQI8HbDvJgwhNXzsXe3HAYLESIiouJ0unLvTsnJ0+E/G49j2Z8XAQCdG9fG7EGhqOnhIkWPKywWIkRERIWlpgLx8cClS4+uKZViqJiJeR0Zt7MxMiUNR9WZkMmA0c82xujnGkMh51RMcSxEiIiIDFJTxQTTwumlgHgqbnS0SeFhv6dfwzsrjyDzQR58qjkjaVAonnnCV8JOV2xMViUiIgLE6Zj4+MeLEODRtYQEsZ2xl+sFzPrlBF5ffACZD/IQoqqBjaM7swgpA0dEiIiIAHFNSOHpmOIEAcjIENsVCxW7eU+L0csPYfeZWwCA156qjwk9msPFif9/vywsRIiIiABxYaoF7Q6cv424ZWm4ptGimosC0/u1Rq82ARJ0sHJiIUJERASIu2PMaCcIAr7bdQ7TN6cjXy+gkV91LBgShkZ+VTcl1RIsRIiIiABxi65SKS5MNbZORCYTf965M7Jy8jB2zV/YfOwqAKBXmwAkRrWCRxVPSbUEJ6+IiIgAMSdk9mzxfxdPPDU8T0pC+o376DXvf9h87CqcFTL8p3cLzB4UwiLEQixEiIiIDKKixC26gcVOwlUqgTVrsLZ+e/SZ/z+cu3kfAd5uWPXWU/jXU0GMai8HmSAYG39yDBqNBt7e3sjMzISXl5e9u0NERFVFsWTVnCc7YNJPJ7B8n5iS+swTvkgaGAIfpqQaZc73N8eRiIiIilMoCrboZtzORuw3f+KYWgOZDEh47gm8/WwjyJmSahUsRIiIiEqw9Z9reGflYWhy8uFTzRmzB4XiaQaUWRULESIiomLydXp8seUkvtp+BgAQWq8G5g8OQ0ANdzv3rPJhIUJERFTIjSwxJXXPWTEldWiHIHz4UjOmpEqEhQgREdFD+8/fRl
xKGq5naeHxMCX1ZaakSkry8k6tVmPIkCGoVasW3N3d0apVKxw4cEDq2xIREZlGp4OwbRu+mb0WgxbuwfUsLRr7VceGUZ1YhNiApIXInTt30LFjRzg7O2Pz5s04fvw4Pv/8c/j4+Eh5WyIiItOkpkLTuCli52zB1Ctu0AlA73P7sF51C438qtu7d1WCpFMzM2bMgEqlwqJFiwquNWjQQMpbEhERmSY1Ff/EvofY3uNxvmYAXPLzMHHr1xhy5GfIVgNQrBEDzkhSko6I/PDDD2jbti369+8PPz8/hIaG4ptvvimxvVarhUajKfIgIiKyOp0Oa5KWo++QmThfMwCBmdexOmUsXj28GTJDzmdCghhsRpKStBA5e/YskpOT0bhxY/zyyy+IjY3F6NGjsWTJEqPtExMT4e3tXfBQqVRSdo+IiKqgnDwdxiX/hvc6DEWOsxsizxzAxsXxaHP11KNGggBkZIjpqiQpSSPeXVxc0LZtW+zevbvg2ujRo7F//37s2bPnsfZarRZarbbguUajgUqlYsQ7ERFZxcVb2YhNOYi/L2sgE/QYszMFcXtWQY4SvgqXLQNeecW2nawEHCbivW7dumjevHmRa82aNcPatWuNtnd1dYWrq6uUXSIioirqt+PXMGaVmJJa0xmY8/1EdLpwpPQX1a1rm85VYZIWIh07dsSJEyeKXDt58iTq168v5W2JiIgK5Ov0+HzLSSQ/TEkNq1cD8weFoO7iW4BMJk7DFCeTiSfudu5c9Hqxw/DQubN4Lg1ZTNI1Iu+88w727t2LadOm4fTp01i2bBm+/vprxMXFSXlbIiIiAMD1rBwM+e7PgiJkWMcgrHjzKdSt6QHMni02khU7vM7wPCmpaJGRmgoEBQFdugCDB4t/BgWJ18likq4RAYCNGzdi/PjxOHXqFBo0aIAxY8Zg+PDhJr3WnDkmIiKiwvadu41Ryx6lpH4W3QY9WhebaklNBeLjgUuXHl1TqcQipPDW3dRUIDr68dETQ9Gyhlt9CzPn+1vyQqQ8WIgQEZG5BEHANzvPYsbPJ6DTC3jCvzqSh4SjoW8JAWVlTbfodOLIR+FipTDDNM65c5ymechhFqsSERHZkiYnD++tOoJfj18DAPQNDcTUvi1RzaWUrzuFAoiMLPnnO3eWXIQARbf6lvY+ZBQLESIiqhSOX9YgNuUgLtzKhotCjo9fbo6YiHqQFV8DYq4rV6zbjopgIUJERBXeqgMZmLj+GLT5egTWcEfykDC0Vtaw/A0LT9dcu2baa7jV1yIsRIiIqMLKydPhkw1/Y+WBDABAlya++HJgCGpUc7H8TY0tYFUoSo57L2mrL5mEhQgREVnGzpkaF27dR+zSNBy/ooFcBox5/gmMjGwEubwcUzEl7Y4prQgBHt/qSyZjIUJEROYzNmqgVIrZHDbYxvrr31fx7uojyMrJRy0PF8x5JRQdG9Uu35vqdOJnKm0zafGREaXy8a2+ZBYWIkREZJ6SRg3UavF6eTI1yhhlydfpMfPXE1i44ywAILy+D+YPDkMdbzdLP80jZe2OMfTvyy8Bf38mq1oJCxEiIjJdaaMGgiBOVSQkAL17m/8FXcYoy/WsHLy97BD+PHcbAPBGpwYY92JTOCusFBJu6q4Xf38ehGdFLESIiMh0UmVqlDHKsve/a/G22hM3srSo7uqEz6Jb46VWVt6lYuquF+6OsSoWIkREZDopMjVKGWURBAELI/ph5nEFdHItmvh7InlIGIJLSkktj86dxREYtdq8g/CoXCQ99I6IiCoZKUYNShhlyXT1wJt9J2B65DDo5ApEBThhXVwHaYoQQJxKMvcgPCo3FiJERGQ6w6hBSWmlMpl4aJw5owZGRk/+9muAXq99iS1PPAWX/DxM+3kuPq99u/SodmuIihIX2wYGFr2uVPJgO4lwaoaIiExnGDWIjhaLjsJTGKWNGpS2G6bY6MnK1s9j4vOxyHVygfLuVSSvT0Sra2eAgHHSfa7CoqLExbZ2zE
ipSnj6LhERmc/YDheVynimRlmZIw9Pt825dgMTu47A6tbPAwCePb0PX2z6AjW093m6bQVjzvc3CxEiIrKMKcmqJe2GMYyePJzuOL8sFbFb1PjHPxhyvQ7v7lyK2L1rUBCSymmRCsWc729OzRARkWUUitK36JqYOfJL4yfx3gkPZPkHo/YDDeasn44OF/8S2ylLGGWhSoOFCBERSaOMzJF8yPBZw674OuUQAKBtfR/MGxiJOn3rcm1GFcJChIiIpFFKlsh1Dx+M6v0B9qlaAgD+3akBPjCkpJoThGYKOx/OR6VjIUJERNIoIUtkj6oV3u41Fjer+6C6NhuzInzQvWdzafpg58P5qGzMESEiImkUyxzRQ4bkiH6IGTQFN6v7oOn1c/jx1+no3r+LNPc3LJQtPj1kOJwvNVWa+5JZWIgQEZE0CiWVZrpVx5tREzAjchj0cgX6Hd2KdUvfR4PJE6SZJilroSwgHs6n01n/3mQWTs0QEZF0oqJwbMlaxO7NRIanL1zyczHpt4UYdPs4ZCtSpJsekepwPrI6FiJERCQJQRCwcn8GPj7hhlxPF6jcZUhW5qJlz3HSLxiV4nA+kgQLESKiqkrC3SQPcnWYuOEY1hwURyW6NvPD5/1D4F3N2SrvXyYpDucjSbAQISKqiiTcTXLu5n3ELj2I9KtZkMuA97o1wYinG0IuL+GgPCkYFsqq1cbXichk4s/NOZyPJMHFqkREVY2Eu0l+PnYFvebuQvrVLNSu7oKl/47AyMhGti1CgCILZR87Kbi0w/nI5liIEBFVJRLtJsnT6TF103GMWJqGLG0+2gfVxKbRndGhYe3y99lSUVHiGTWBgUWvK5U8u8aBcGqGiKgqkWA3yTVNDkYtS8P+83cAAG8+HYz3uzURU1LtLSoK6N2byaoOjIUIEVFVYuXdJLvP3MTo5Ydw814uPF2dMLN/G3RvWaccHZRAWYfzkV2xECEiqkjKu9PFSrtJ9HoByTvO4PNfT0AvAE3reCJ5SDga1PYwvS9EYCFCRFRxWGOnixV2k2Rm52HMqsPYmn4dANA/XInJfVrCzZnTHWQ+B5jAIyKiMllrp0s5d5McU2ei57yd2Jp+HS5Ocszo1woz+7dhEUIWYyFCROTorL3TxYLdJIIgYNmfFxGVvBsZtx+gXs1qSI3tgIHt6pn3WYiK4dQMEZGjk+LcFDN2kzzI1WHC+qNITVMDALo288fnA9rA291GKalUqbEQISJydFKdm2LCbpKzN+5hZEpaQUrq2O5N8dbTwZAVn9YhshALESIiR2enc1M2H72C99f8hXvafNSu7op5g0PxZHAtq96DiIUIEZGjs/G5KXk6PaZvTsd3u84BANo3qIl5r4TCz8vNKu9PVJjNFqtOnz4dMpkMCQkJtrolEVHlYMNzU65m5uCVr/cWFCFvPR2MZf+OYBFCkrFJIbJ//34sXLgQrVu3tsXtiIgqHxucm7L79E30nLsTBy7cgaebExa+Go7xLzWDkyNEtVOlJfnUzL179xATE4NvvvkGU6ZMkfp2RESVl0TnphRPSW1W1wsLhoShfi2mpJL0JC9E4uLi0KNHD3Tt2rXMQkSr1UKr1RY812g0UnePiMgxmBrdbuVzU+5m52LMqiP4/WFK6oC2SvynN1NSyXYkLURWrFiBtLQ07N+/36T2iYmJmDRpkpRdIiJyPNaIbi9NCUXOX5fuYmRKGi7deQBXJzkm926JAe1U5b8fkRlkgmBsCXb5ZWRkoG3bttiyZUvB2pDIyEiEhIQgKSnJ6GuMjYioVCpkZmbCy8tLim4SEdmXIbq9+K9iwyLU8q7/MFLkCEollk2Yi0kZrsjV6VG/VjV8FROGFgHelt+HqBCNRgNvb2+Tvr8lK0TWr1+Pvn37QlFoaFGn00Emk0Eul0Or1Rb5mTHmfBAiogpHpwOCgkpOTTVsyz13zrJ1IEaKnGxnV3z0QhxSWz4LAHihuT9m9mdKKlmXOd/fkk3NPPfcczh69G
iRa8OGDUPTpk3xwQcflFmEEBFVelJEtxsYOZ/mTM1AjOwzHid8g6DQ6zD2yAa8OWUhZE6MlCL7kexfn6enJ1q2bFnkmoeHB2rVqvXYdSKiKkmq6HbgsSLnpyYdMfbFeNxzrQbfe7cxb8MMRFz6G9j1qlUXvxKZi2UwEZG9SBnd/rB4yZMrkBg5DP9t1wcAEHHxKOb+MAN+9+8WaUdkLzYtRLZv327L2xEROTYpo9vr1sUVz1oY1esDHFQ2BwCM2Lsa7/3xPZwEfZF2RPbEEREiInsxRLdHR4tFR+FipJzR7bvqNkP863Nxy80Lnjn38MWmL/D86X1F39+K59MQWYq5vURE9mTl6Ha9XsDcrafw6uIDuOXmhebXzmDj/73zeBECWO18GqLy4IgIEZG9WSm6/W52Lt5ZeRjbTtwAAAxqp8KnghZuvymAO4UaKpViEWKNsDSicmIhQkTkCMoZ3f7XpbuIXZoG9V0xJXVKn5bo31YFoDXQ1/rn0xBZCwsRIqIKTBAELP3zIib/eBy5Oj2CalXDVzHhaB5QKETKyufTEFkTCxEiogoqOzcfH6YexfrDlwEA3VqIKalebkxJpYqDhQgRUQV05sY9xC49iJPX7kEhl2Fc96b4d+cGkBkWohJVECxEiIgqmI1/XcYHa/7C/Vwd/DxdMW9wGNo3qGnvbhFZhIUIEVEFkZuvx7Sf/sHi3ecBAE8G18ScV0Lh5+lm344RlQMLESKiCuBK5gPEpaQh7eJdAMDIyIYY8/wTcFIwDooqNhYiREQObuepG4hfcRi37+fCy80JXwwIQdfm/vbuFpFVsBAhInJQer2AedtO48vfTkIQgJaBXvhqcDjq1apm764RWQ0LESIiS+l0kgWF3bmfi4SVh7HjpJiS+kp7FT55qSnc9u5mMBlVKixEiIgskZoKxMcDly49uqZUiofYlTM6/XDGXcSliCmpbs5yTOnTCtEX9gGNe0hyPyJ7kgmCsbOnHYNGo4G3tzcyMzPh5eVV9guIiGwhNVU8Mbf4r09DhocFh9VBp4Pwxx/4/tgtTL7sjjwBaFDbA1/FhKHZnt+sfz8iCZnz/c3l1kRE5tDpxJEQY/8fznAtIUFsZ6rUVNxv1ATxX/6Ej9ViEdL9Yho21LuNZn4e1r8fkQNhIUJEZI6dO4tOjxQnCEBGhtjOFKmpOP1mAno/9y5+aB4JhV6Hj37/FskrPoHXoGhg6lTr3o/IwXCNCBGROa5csV47nQ4/fPE9xv3rC2S7uMM/6xbmbZiBdurj4s9lMnENiDX7ReRgWIgQEZmjbl2rtMvN12Pq179hSad/AwA6nD+C2T/OhG/23UeNBAG4fdu6/SJyMCxEiIjM0bmzuFtFrTa+bkMmE3/euXOJb6G+K6akHs7IBwDE7V6JMbtSoBD0xl9QsyZw547F9yNyZFwjQkRkDoXi0XRJ8ZNuDc+TkkrM99hx8gZ6ztmJwxl34eUEfLdmEt7f+X3JRQggLla18H5Ejo6FCBGRuaKixC2zgYFFryuVJW6l1ekFfLnlJIYu2oc72XloGeiFTfFP4zntlccLDAOZDFCpgAkTzL4fUUXBHBEiIkuZmKx6+34u4lccws5TNwEAgyPq4eOezeHmrHiUSQIUnXoxlhEiYZIrkTWZ8/3NQoSISEKHLt5BXEoaLmfmwM1Zjml9WyEqTFm0kbGUVpVKnHLhaAdVQOZ8f3OxKhGRBARBwJLd5zH1p3+QpxPQoLYHkoeEoWkdI7+Uo6KA3r0tH+3gSAlVYCxEiIis7L42H+NSj+LHI5cBAC+1qoMZ/VrD08255BcpFEBkpPk3k/DMGyJbYCFCRGRFp65lYcTSgzhz4z6c5DKMf6kZXu8YBFlJC1LLo6Qzb9Rq8ToXslIFwDUiRFSxOdC0xIbDaoxPPYrsXB38vVwxf3AY2gbVlOZmOh0QFFRy/LshX+TcOU7TkM1xjQgRVQ0OMi2hzddhysZ/8P3eCwCAjo1qYfagUNSu7irdTc
0588aSKR8iG2EhQkQVk4NMS1y6k424ZYdwJOMuAODtZxshoesTUMglmIopzJpn3hDZEQsRIqp4dDpxJMTYzLIgiNMSCQniThQJpyW2n7iOhJWHcTc7D97uzkgaGIIuTf0ku18RVjrzhsjemKxKRBWPOdMSEtDpBXyx5SSGLd6Pu9l5aK30xsa3O9muCAEenXlTViorz6AhB8dChIgqHjtOS9y6p8XQRfswZ+spCAIQE1EPq0c8BVXNala/V6nKeeYNkaNgIUJEFY+dpiUOXriDnnN3Yeepm3B3VuDLgW0wtW8ruDrZ6cvegjNviBwNt+8SUcVj2LqqVhtfJ2LlrauCIGDx7vOYuukf5OsFBPt6IDkmHE3qeJb7va3CgbYwEwHcvktElZ1hWiI6Wiw6jB0WZ6VpiXvafHyw9i9s+kuc5unRqi5mRLdGdddCvz7tXQhYmspK5AA4NUNEFZMNpiVOXstCr3m7sOmvK3CSy/Bxz+aYNzi0aBGSmiqOznTpAgweLP4ZFCReJ6IySTo1k5iYiNTUVKSnp8Pd3R0dOnTAjBkz0KRJE5Nez6kZIiqTRKMR6w+JKakP8nSo4+WG+TFhCK/vU7RRSVkmhlEZrtOgKsqc729JC5Hu3btj0KBBaNeuHfLz8/Hhhx/i2LFjOH78ODw8PMp8PQsRIrI1bb4Okzcex9K9FwEAnRrVxuxBIahVPCWVEetEJXKYQqS4GzduwM/PDzt27MDTTz9dZnsWIkRkS5fuZCMuJQ1HLmUCAEY/2wjxJaWkbt8uTsOUZds2rt+gKsdhF6tmZor/cdesafwQKK1WC61WW/Bco9HYpF9ERNvSxZTUzAd5qFHNGV8ODEGXJqUElDFincgqbLZYVa/XIyEhAR07dkTLli2NtklMTIS3t3fBQ6VS2ap7RFRF6fQCPv/1BIYt3o/MB3loY0hJLa0IARixTmQlNpuaiY2NxebNm7Fr1y4olUqjbYyNiKhUKk7NEJEkbt3TIn7FYew6fRMA8OqT9fFRz2amBZTZOMuEqCJxuKmZUaNGYePGjfjjjz9KLEIAwNXVFa6uEh6bTUT00MELtxGXcghXNTlwd1Zger9W6B0SWPYLDWyYZUJUmUk6NSMIAkaNGoV169bh999/R4MGDaS8HRFRmQRBwHe7zmHgwr24qslBsK8HNozqaF4RYsCIdaJyk3REJC4uDsuWLcOGDRvg6emJq1evAgC8vb3h7u4u5a2JiB6TlZOHcWuPYtNRcQFpz9Z1Mb1fsZRUc0VFAb17M2KdyEKSrhGRlXA89aJFizB06NAyX8/tu0RkLSeuZiF26UGcvXkfzgoZJrzUDK91CCrx9xQRWc5h1og48Hl6RFSFrDt0CR+mHsODPB3qeospqWH1fMp+IRFJjofeEVGllZOnw382HseyP8WU1M6NayNpoJGUVCKyGxYiRFQpZdzOxsiUNBxVZ0ImA0Y/2xijn2tsPCWViOyGhQgRVTq/p1/DOyuPFKSkJg0MQWRZAWVEZBcsRIio0tDpBXy55STmbTsNAGijqoGvYsIQWIO79IgcFQsRIqoYdLpSt8jevKfF6OWHsPvMLQDAv56qjwk9TExJJSK7YSFCRI4vNRUYPVqMUzcIDATmzAGionDg/G3EpaThWpYW1eQCElu6oXfPZszyIKoAWIgQkWNLTQX69Xv8uloNoV8/fDd/PaZnKJAvyNDo5kUkr09E41sZwAdKMYKd6aZEDs1mp+8SEZlNpwPefNPoj7Jc3DGyz3hMueiEfEGGl4/vwIb/GyMWIYA4ehIdLRYyROSwOCJCRI5r+3bg1q3HLqfXro/Yvh/iXM1AOOvy8NHv3+FfaRtRZGOuIIiHzyUkiBHsnKYhckgsRIjIcW3f/tiltS2exYRuI5Hj7IYAzXXMXz8doVdOGn+9IAAZGeIi18hISbtKRJZhIUJEFUKOwhmTur6J5SEvAgCePnsQSRs/R80HmrJffOWKxL0jIktxjQgROa6HoxgZ3v
6IHjITy0NehEzQI2FXChatmWRaEQKI232JyCFxRISIHFdkJLaGPIt3nnkTGrfq8MnORNLGz/HMubRHbeRycQrG2CGbMhmgVIqZI0TkkDgiQkQOKV+nx2dbTuGNbmOgcauOkMvp2LQ4vmgRAgDvviv+KSt2hozheVISF6oSOTAWIkTkcG5kafHqd/vw1fYzAICh/vlYtWMuArJuPmqkVAJr1wKffQasWSMGnBWmVIrXmSNC5NBkgmBsPNMxaDQaeHt7IzMzE15eXvbuDhFZqox49sL2P0xJvZ6lRTUXBab3a41ebQLKfg8z7kFE0jLn+5trRIhIWqmpQHw8cOnSo2vKx1NPBUHAtzvPYfrP6dDpBTTyq44FQ8LQyM9TbKBQlL4Ft6yfE5FDYiFCRNJJTRXTTYsPvBpSTx9OnWhy8jB29V/4+e+rAIDeIQGY1rcVPFz5K4qosuN/5UQkDZ1OHAkxNvtbKPX0n4hnEbvsEM7fyoazQoaPezbHkCfrQ1Z88SkRVUosRIhIGjt3Fp2OKU4QsMa7MT6a/z/k6IHAGu6YHxOGEFUNm3WRiOyPhQgRSaOUNNMchTM+fX4EVrTpBuiBZ57wRdLAEPh4uNiwg0TkCFiIEJE0Skgzvejtj9g+4/F3nUaQCXq809gVo4a2g1zOqRiiqoiFCBFJo3NncXeMWl2wTuS3hu0xpqcYUFYzOxOz9y5G52mbARYhRFUWA82ISBoKhbhFF0C+XIEZT7+Gf0d/DI1bdYSq07FxSQI6jxvBrA+iKo4jIkQknagoXF+2BqN/Po+9dZoAAIYd2IDxp7fAZdECpp4SEQsRoirHhgmk+87dxqgzXrhepwk8FMAMZTZ6dh8MdE7mSAgRAWAhQlS1mJhyWl6CIOCbnWcx4+cT0OkFPOFfHV/FhKORX3Wr3YOIKgcWIkRVhYkpp+WlycnDe6uO4Nfj1wAAfUICMC2qFaq5FPp1w3NhiOghHnpHVBXodEBQUMkBYzKZODJy7ly5CoLjlzWITTmIC7ey4aKQ4+OXmyMmol7RlFQbjcoQkf2Y8/3NXTNEVYEJKafIyBDbWWjVgQz0/ep/uHArG4E13LF6xFOPR7UbRmWK98UwKpOaavH9iahiYiFCVBWUknJqUbtCcvJ0+GDNXxi75i9o8/WIbOKLjW93QpviUe1lnT0DAAkJYjsiqjK4RoSoKigh5dTidg9duHUfsUvTcPyKBjIZMKbrE4jr0sh4Sqo5ozKRkWb1g4gqLhYiRFWBkZTTIgxrRDp3Nvktf/37Kt5dfQRZOfmo5eGC2YNC0alx7ZJfIOGoDBFVXJyaIaoKCqWcQlZstMLwPCnJpIWq+To9Ejf/gze/P4isnHyE1/fBxtGdSi9CAMlGZYioYmMhQlRVREWJW3QDA4teVypN3rp7PSsHMd/+iYU7zgIAXu/YACvefBJ1vd3Lvr9hVKZ4IWQgkwEqlVmjMkRU8XFqhqgqiYoCeve2KMNj79lbeHv5IdzI0qK6qxM+i26Nl1qZMXphGJWJjhaLjsJTRGaOyhBR5cFChKiqUSjMWgwqCAIW/nEWM38RU1Kb+HsieUgYgn0tSEk1jMoYyxFJSmKOCFEVZJOpmfnz5yMoKAhubm6IiIjAvn37bHFbIiqnzAd5ePP7g5i+OR06vYCo0ECsi+tgWRFiEBUFnD8PbNsGLFsm/nnuHIsQoipK8hGRlStXYsyYMViwYAEiIiKQlJSEbt264cSJE/Dz85P69kRkrofx63+fvYaRlzxxIVuAi0KOT3u1wCvtVUUDyixl5qgMEVVeko+IfPHFFxg+fDiGDRuG5s2bY8GCBahWrRr++9//Sn1rospNpwO2bweWLxf/tEYQWGoqEBSElfHT0Pe4Cy5kC1Deu4m1TXMwuHhUOxGRFUg6IpKbm4uDBw9i/PjxBdfkcjm6du2KPXv2PNZeq9VCq9UWPNdoNFJ2j6jikuK8ltRU5AwajIldR2B16+cBAM+e3ocvfvoSNb66B3
hY51A8IqLCJB0RuXnzJnQ6Hfz9/Ytc9/f3x9WrVx9rn5iYCG9v74KHSqWSsntEFZMU57XodDg/YQr6xszE6tbPQ67X4f0dS/Dt2smo8SBLbMP4dSKSgEPliIwfPx6ZmZkFj4yMDHt3icixSHReyy9rtuHl7uPwj38wat+/g6UrJyJu72rIITx673IeikdEZIykUzO1a9eGQqHAtWvXily/du0a6tSp81h7V1dXuLq6StkloorNyue15Ov0+OyXE/j6kBZw9UDbS39j3obPUOfeLeMvYPw6EVmZpCMiLi4uCA8Px9atWwuu6fV6bN26FU899ZSUtyaqnKx4Xst1TQ4Gf/Mnvv5DTEn99751WL78w5KLEIDx60RkdZJv3x0zZgxee+01tG3bFu3bt0dSUhLu37+PYcOGSX1rosrHSue17DkjpqTevCempM7q1xLdU0YAgt74Cyw4FI+IyBSSFyIDBw7EjRs38PHHH+Pq1asICQnBzz///NgCViIyQTlP0dXrDSmp6dALQNM6nkgeEo4GtT0Yv05EdiETBGO/zRyDRqOBt7c3MjMz4eXlZe/uEDkGw66Z4v/pGgqGEg6wy8zOw7urj+C3f8Q1W/3ClJjSpyXcXQoVF8a2BatUjF8nIrOY8/3Ns2aIKqKaNYFbtx6/9vXXRguGY+pMxKYcRMbtB3BxkmNSrxYY1M5ISmo5DsUjIrIECxGiiqSk0RDg8cIE4oF1K/dn4OMf/kZuvh6qmu5IjglHy0Dvku/B+HUisiFOzRBVFDodEBRU8vZdw/qQc+cAhQIPcnWYuOEY1hwU23dt5ofP+4fAu5qz7fpMRFUSp2aIKiMzMkTOtWyH2KUHkX41C3IZ8F63JhjxdEPI5TwrhogcCwsRoorCxAyRn/+5gfe37UKWNh+1q7tgziuh6NCwtsSdIyKyDAsRooqijGyQPLkCnz3zGr65UA1APtoH1cTcwaHw93KzTf+IiCzAQoSooiglQ+Ra9ZoY1esD7Fe1AAC8+XQw3u/WBM4KhzpOiojoMfwtRVRRKBRi6BjwKDMEwO56rdBj6GzsV7WAp0LAgiHh+PClZixCiKhC4G8qoookKkoMLAsMhB4yzH+yP4YMnIKbHj5o6q7HD+90QfeWjx8oSUTkqDg1Q1TRREUh84WXMGbhdmy9oQMA9A8LxOS+reDmzOAxIqpYWIgQVTCPUlJ1cHGSY3LvFhjYrp69u0VEZBEWIkQVhCAIWL4vA5/+KKak1qtZDV/FhJWekkpE5OBYiBBVAA9ydZiw/ihS09QAgK7N/PH5gDbwdi8lJVWn45kxROTwWIgQObizN+5hZEpaQUrq2O5N8dbTwY8fWFeYsVN0lUpx1w1P0SUiB8JChMiBbT56Be+v+Qv3tPmoXd0V8waH4sngWqW/qKSD8dRq8fqaNSxGiMhh8NA7IgeUp9Nj+uZ0fLfrHACgfYOamPdKKPzKSkk182A8IiIp8NA7ogrsamYORi1Lw4ELdwAAbz1MSXUyJaDMjIPxEBlpnQ4TEZUDCxEiB7L79E2MXnEIN+/lwtPNCbP6t0G3FmYElJl4MJ7J7YiIJMZChMgB6PUCknecwee/noBeAJrV9cKCIWGoX8vDvDcq42A8s9sREUmMhQiRnd3NzsWYVUfwe/p1AMCAtkr8p3dLy1JSSzkYD8CjNSKdO5ez10RE1sFChKi8ypHX8deluxiZkoZLdx7A1UmOyb1bYkA7leV9MRyMFx0tFh2FixHDdt+kJC5UJSKHwUPviMojNVXcpdKlCzB4sPhnUJB4vRSCICDlzwuITt6DS3ceoH6takgd2aF8RYhBoYPxilAquXWXiBwOt+8SWaqkvA7DyEMJX/rZufn4aN0xpB4SU1JfaO6Pmf3LSEm1BJNVichOzPn+ZiFCZAkL8zrO3LiHkUvTcOJaFhRyGcZ2a4I3y0pJJSKqYJgjQiQ1C/I6fjp6BWMfpqT6erpi3iuhiCgrJZWIqJJjIUJkCTPyOvJ0ei
T+lI7//k9MSY1oUBNzB4fCz7OMlFQioiqAhQiRJUzM4bji449RX+/FwYcpqSOeaYj3XnjCtJRUIqIqgIUIkSVMyOvYFd4V8ft1uHX/DjzdnPDFgBA839zf9n0lInJgLESITFV8F8qXXwIDBjyW16GXyTH/qf74ovMQCPdz0byuF5ItSUklIqoCWIgQmSI1FYiPL7pAVakE3nsPWL684PodN0+8038Ctge0BAAMaqfCpz2awm3vbm6jJSIygoUIUVlKygtRq4FZs4CVKwFfXxw5cw0jMzyhzhHg6iTHlD4t0f/ifqBxj8cLmNmzGSxGRAQmqxKVTqcTR0KMrQN5eE14911871If/c96QZ0jIKhWNawb2VEsQqKjH9/mq1aL18tIXyUiqgpYiBCVpoy8kGwnF7zTZgAm/nAcuTo9urXwxw9vd0Jzf48yCxgkJIiFDhFRFcapGaLSlJIXcrqmErF9x+NU7fpQQMC4l5rj350biCmp27ebHXhGRFQVsRAhKk0JeSEbm3bCB91H475rNfhl3cK85wLR/ungRw3MCDwjIqrKWIgQlaZYXkiu3AnTuryOxW17AQCevPgX5hxIgd+cI0VfZ2LgmcntiIgqKRYiRKVRKMQdLtHRuOzli7heH+BQYFMAwMg9qzBmVwqcVq96fDuuCYFnUCrFdkREVZhki1XPnz+PN954Aw0aNIC7uzsaNmyITz75BLm5uVLdkkgaUVHY+d1a9Bw2B4cCm8Ir5x6+XfMfjD23TSxCjG3DNRQwgFh0FGZ4npTEPBEiqvIkGxFJT0+HXq/HwoUL0ahRIxw7dgzDhw/H/fv3MWvWLKluS2RVer2Aub+fRtJJFwhuLmjpJcdXTQXU6zOj7GCyqChgzRrjQWhJScwRISICIBMEY+PG0pg5cyaSk5Nx9uxZk9prNBp4e3sjMzMTXl5eEveOqKjb93ORsPIw/jh5AwDwSnsVPnm5BdyczRzFKB4Nz2RVIqrkzPn+tukakczMTNSsWbPEn2u1Wmi12oLnGo3GFt0ieszhjLsYufQgLmfmwM1Zjil9WiE6XGnZmykU3KJLRFQCmwWanT59GnPnzsVbb71VYpvExER4e3sXPFQqla26RwQAEAQB/7fnPPov2I3LmTloUNsD60Z2tLwIISKiUpldiIwbNw4ymazUR3p6epHXqNVqdO/eHf3798fw4cNLfO/x48cjMzOz4JGRkWH+JyKy0H1tPuJXHMbHG/5Gnk5A9xZ1sGFURzSry2lBIiKpmL1G5MaNG7h161apbYKDg+Hi4gIAuHz5MiIjI/Hkk09i8eLFkMtNr324RoRs5fT1LIxYmobT1+9BIZdh/ItN8UanhympRERkFknXiPj6+sLX19ektmq1Gl26dEF4eDgWLVpkVhFCZCs/HLmMcWv/QnauDv5erpg3OAztgkpey0RERNYj2WJVtVqNyMhI1K9fH7NmzcKNGzcKflanTh2pbktkstx8PaZuOo4ley4AADo0rIXZg0Lh6+lq554REVUdkhUiW7ZswenTp3H69GkolUUX+tlwxzCRUeq7DxCXkobDGXcBAHFdGmLM802gkHMqhojIlmyaI2IurhEhKew4eQMJKw7hTnYevNyc8OXAEDzXzL/sFzIPhIjIJA6bI0JUbuUoBnR6AXO2nsKc309BEICWgV5IjgmHqma1sl+cmmo8IXX2bCakEhGVAwsRqjjKUQzcvp+L+BWHsPPUTQDA4Ih6+Lhnc9NSUlNTgejoxw+vU6vF62vWsBghIrIQp2aoYiipGDBsry2lGDh08Q7iUtIKUlKn9W2FqDATA8p0OiAoqGjxU/z+SiVw7hynaYiIHjLn+5v7acnx6XTiSIixmtlwLSFBbFfkRwIW/+8cBizcU5CSuj6uo+lFCCBOA5VUhBjun5EhtiMiIrNxaoYcnznFwMMzXe5r8zEu9Sh+PHIZAPBSqzqY0a81PN2czbv3lSvWbUdEREWwECHHZ2YxcOpaFkYsPYgzN+
7DSS7D+Jea4fWOQZalpNata912RERUBAsRcnxmFAMbDqsxPvVoQUrq/MFhaFuelNTOncU1IGq18akhwxqRzp0tvwcRURXGNSLk+AzFQEkjGjIZtPUbYOKdWohfcRjZuTp0bFQLm0Z3Ll8RAogLUGfPLrhP8fsCAJKSuFCViMhCLETI8ZVRDFzy8sWAN2bj+z8vAgDefrYR/u/1CNSubqWo9qgocVdOYGDR60olt+4SEZUTt+9SxWEkR2R7+25I6DoKd3UyeLs7I2lgCLo09ZPm/kxWJSIyiTnf3yxEqGJ5WAzoLl/B7Pu1MPdsHgQBaK30xvzBYaalpBIRkaSYI0KVl0KBW22fwtAHwZhzRixCYiLqYfWIp1iEEBFVQNw1QxXKwQt3MGpZGq5k5sDdWYFpUS3RN9SMgDIiInIoLETI9ixYayEIAhbvPo+pm/5Bvl5AsK8HkmPC0aSOp406TUREUmAhQrZlwcF197T5+GDtX9j0lxhY1qNVXcyIbo3qrvznS0RU0fE3OZXM2rtELDjF9uTDlNSzD1NSP3ypGYZZmpJKREQOh7tmyDgLRi5KZcEptusPiSmpD/J0qOPlhvkxYQiv72P+vYmIyKa4a4bKxzByUbxoMIxcpKaa/55mHFynzdfho/VHkbDyMB7k6dCpUW1sGt2JRQgRUSXEqRkqSqcTR0KMDZQJgjhykZAA9O5t3jSNiQfXXbpwFXHH9uDIpUwAwOhnGyG+6xNQyE2YimHgGBFRhcNChIoyY+QCkZGmv68JB9dtC26LhNOeyMzLRI1qzvhyYAi6NDExJdXaU0lERGQTnJqhokwcuTC5nUEpB9fpZHJ83nkIhvX/FJl5QBulNza+3cm8IsTaU0lERGQTLESoKBNGLsxqZ1DCwXW33L3w2oBJmNthEADg1SfrY9WIp6D0MTEltaypJECcStLpzOsvERHZBAsRKqqUkQsA4nWVSmxnrmKn2B4MbIoew+ZgV1Ao3OUCZg8KweQ+LeHqZMa6DnOmkoiIyOFwjQgVZRi5iI4Wi47CIw2G4iQpyfJFoFFREHr1wn+XbkNiuhb5kCG4tgcWvBqOJ/wtSEmVaiqJiIhsgiMi9LhiIxcFlEqjoWPmyMrJw6iVf2Fyei7yIUPP1nXxw9udLCtCAOmmkoiIyCYYaEYls/J22BNXsxC79CDO3rwPZ4UME15qhtc6lDMl1RCUplYbXydiJCiNiIikZc73N6dmqGQKhXlbdEux7tAlfJh6DA/ydKjrLaakhtWzUkDZ8OHAJ588ft0aU0lERCQpFiIkqZw8Hf6z8TiW/XkRANC5cW0kDQxBrequ5X9zY9khhSmVYhHCHBEiIofFQoQkk3E7GyNT0nBUnQmZDBj9bGOMfq6xaSmpZSnpAD2DSZOACRM4EkJE5OC4WJUk8Xv6NfScuwtH1WJK6qKh7fDO8yZGtZeltOwQQJyS+fbb8t+HiIgkxxERsiqdXsCXW05i3rbTAIA2qhr4KiYMgTXcrXcTqWLoiYjI5liIkNXcvKfF6OWHsPvMLQDAv56qjwk9mpkXUGYKZocQEVUaLETINGVs5T1w/jbilqXhmkaLai4KJEa1Qu+QwFLesByYHUJEVGmwEKmqzMkIKeVkW6FvX3y36xymb05Hvl5AI7/qSI4JQ2NLA8pMYYihLys7xJIYeiIisikWIlVRKYXFY1tdS9qdolYja/CrGDtpOTbfEQuYl9sEYHpUK3i4SvzPSuoYeiIishnumqlqDIVF8cWearV4PTX10bVSdqek16qHXv/6EpvvKOCskGFSrxaYMyhE+iLEQMIYeiIish1GvFclhjj0knacFI9D374d6NLlsWZrWzyLCd1GIsfZDQGa65j/Qj2E9nlO0q6XyMox9EREVH7mfH/bZEREq9UiJCQEMpkMhw8ftsUtyRhztr0Cj+06yVE4Y3y3OLzbcwxynN3w9NmD2Lg4AaEPrkvY6TIYYuhfeUX8k0UIEVGFYpNx9LFjxyIgIA
BHjhyxxe2oJOZuey206yTD2x+xfcbjWJ1GkAl6xP9vOd7evRIKQc/dKUREZDHJC5HNmzfj119/xdq1a7F582apb0elMXfb68PdKVvdAvBOjzHQuFWHT3YmkjZ+jmfOpYlTOSoVd6cQEZHFJC1Erl27huHDh2P9+vWoVq1ame21Wi20Wm3Bc41GI2X3qp6ytr0C4tTGjRsAgHzI8MV78/DVFfGfScjldHy1fjoCsm5ydwoREVmFZGtEBEHA0KFDMWLECLRt29ak1yQmJsLb27vgoVKppOpe1WTY9loanQ4YOBA3Vqbi1e/2FRQhQ9N/x6qUcWIRAnB3ChERWYXZhci4ceMgk8lKfaSnp2Pu3LnIysrC+PHjTX7v8ePHIzMzs+CRkZFhbveoLFFRwKpVpY5i7A9sjh7/e4A9Z2+hmosCc14JxadrP4PL1i3AsmXAtm3izhoWIUREVE5mb9+9ceMGbt26VWqb4OBgDBgwAD/++CNkskenrep0OigUCsTExGDJkiVl3ovbdyVSwrZcAcC37fpieuRQ6OQKNPKQYcFbndHIT8KUVCIiqnTM+f42e42Ir68vfH19y2w3Z84cTJkypeD55cuX0a1bN6xcuRIRERHm3pasycjuGY1LNYx9KR4/N+kIAOj993ZMGxgGDxYhREQkIckWq9arV6/I8+rVqwMAGjZsCKVSKdVtyRTFds/84xuE2D4f4nzNADjr8vDx1m8w5NBPkL27reT3YJAYERFZAc+aqYoK7Z5Z06ILPnpBTEkNzLyO+RumI+TqqdK35ZpzVg0REVEpbFaIBAUFwYHT5KsWhQI5X87Gp99tx4o23QAAz5w9gKQfP4eP9p7YpqRtuaUcgofoaO6kISIis/DQuyro4q1s9LvsixVtukEm6DFm51IsWj0JPjlZpW/LLeUQvIJrCQliOyIiIhNwaqaK+e34NYxZdRianHzU9HDB7AGt0flFT+BKr7LXephzVk1kpCT9JyKiyoWFSBWRr9Pj8y0nkbz9DAAgtF4NzB8choAa7kATf9PexNyzaoiIiMrAQqQKuJ6Vg9HLD2Hv2dsAgGEdgzD+xWZwcTJzZs7cs2qIiIjKwEKkktt37jZGLUvD9SwtPFwUmBHdGj1bB1j2ZmWdVSOTiT/nIXhERGQiLlatpARBwNd/nMEr3+zF9SwtnvCvjg2jOllehABFz6oplJhb5DkPwSMiIjOwEKmENDl5eOv7g5j2Uzp0egF9QgKwPq4jGvlVL/+bR0WJu2oCA4te5yF4RERkAU7NVDLHL2sQm3IQF25lw0Uhx8cvN0dMRL0iZ/6UW1QU0Ls3k1WJiKjcWIhUIqsOZGDi+mPQ5usRWMMdX8WEoY2qhjQ3Uyi4RZeIiMqNhUglkJOnwycb/sbKAxkAgMgmvvhyQAh8PFzs3DMiIqLSsRCp4C7cuo/YpWk4fkUDmQwY0/UJxHVpBLncilMxREREEmEhUoH9+vdVvLv6CLJy8lHLwwWzB4WiU+Pa9u4WERGRyViIVED5Oj1m/noCC3ecBQCE1/fBvMGhqOvtbueeERERmYeFSAVzPSsHby87hD/PiSmpr3dsgPEvNYWzgjuxiYio4mEhUoHsPXsLby8/hBsPU1I/i26DHq0Zp05ERBUXC5EKQBAELPzjLGb+cgI6vYAn/KsjeUg4GvpaIaCMiIjIjliI2IpOZ1EAWOaDPLy3+gi2HL8GAIgKDcSUvi1RzYV/dUREVPHx28wWUlOB+Hjg0qVH15RK8dyWUiLR/76ciZEpaQUpqZ/0ao7B7a2ckkpERGRHLESklpoKREc/flqtWi1eL+F8lpX7L2Lihr+R+zAlNXlIGFora9imz0RERDYiEwRj57k7Bo1GA29vb2RmZsLLy8ve3TGfTgcEBRUdCSlMJhNHRs6dK5imycnTYeL6Y1h9UHzNs0398MWANqhRjSmpRERUMZjz/c0RESnt3FlyEQKIoyQZGWK7yEicv3kfsSlp+OeKBnIZ8O4LTR
D7TEOmpBIRUaXFQkRKV66Y3O6Xv6/ivVVHkKUVU1LnvBKKjo2YkkpERJUbCxEp1S074yNfJsdnmbXw9fcHAQBt6/tg3uAw1PF2k65fFu7gISIisjYWIlLq3FlcA6JWP75YFcD16jUxqv9E7DufBwD4d6cG+OBFiVNSLdzBQ0REJAXmgktJoRC/4AFxYWohe+q1wkuvzcY+v8ao7uqE5JgwfNSzufRFSHT04+tWDDt4UlOluzcREZERLESkotMB27cDWi3w6adAQAAAQA8ZkiP6IWbgVNys7oOmdTzxw6iOeLGVxFHtOp04EmJsk5ThWkKC2I6IiMhGODUjhRKmPzI/nYJ3nVvgN40zACAqLBBT+7SCu4sN1meYuYOHiIjIFliIWFsJAWbH8lwRe6UOMmo4w8VJjkm9WmBQO5XtUlLN2MFDRERkKyxEymLODhMj0x8CgJWtX8DHz49ArpMLlPduInnsy2hVr6Zt+m9gwg4es9oRERFZAdeIlCY1VUxG7dIFGDxY/DMoqORFncWmPx44ueL9l+Ix7sXRyHVywXOn92HTt3FodfYvm3S/CMMOnpJGYGQyQKUS2xEREdkIC5GSlLTD5NKlkneYFJrWOOcTgL6vzsKaVs9Drtfh/R1L8M3ayfDW3rfP9EcpO3gKniclMU+EiIhsioWIMaXtMAHE68Z2mDyc1vj5iafQ67Uvke7XALXv38HSlR8hbu9qyCEUaWdzUVHiIXuBgUWvK5UlHr5HREQkJR56Z8z27eI0TFm2bSuywyQvNw+fRb+Pb5o/DwBol/E35v0wA/73bosNjBxyZxdMViUiIgnx0LvyUqvNbndNk4NRy9Kw/2ERMnzfOozdsRjO+oejJo40/aFQcIsuERE5BE7NGHPjhlntdp+5iR5zdmL/+TvwdHXCgkZ5mHDql0dFCMDpDyIiIiM4ImKMr69JzfS1fZG87TQ+//UE9ALQtI4nkoeEo0FtD2DYy5z+ICIiKgMLEWOKL+Y0ItPVA2Ou1cLWYycAANHhSkzu3fJRSiqnP4iIiMok6dTMpk2bEBERAXd3d/j4+KBPnz5S3s56DJkbJTjm3xA9/z0fW2/o4OIkx/SoVpgZ3do2Ue1ERESViGQjImvXrsXw4cMxbdo0PPvss8jPz8exY8ekup11GTI3oqPF5w83FgkAlod0x6fPvYlcJxeoarojOSYcLQO97ddXIiKiCkyS7bv5+fkICgrCpEmT8MYbb1j8PnbbvmtQ6PC6B06umPDCSKS2eg4A0LWZPz7v3wbe1Zxt3y8iIiIHZvftu2lpaVCr1ZDL5QgNDcXVq1cREhKCmTNnomXLliW+TqvVQqvVFjzXaDRSdM90UVFA7944+/MOjNx/D+k5CshlwPvdmuKtp4Mhl9vowDoiIqJKSpI1ImfPngUAfPrpp/joo4+wceNG+Pj4IDIyErdv3y7xdYmJifD29i54qFQqKbpnls3Hr6PXvjyk5yhQu7orUv79JGIjG7IIISIisgKzCpFx48ZBJpOV+khPT4derwcATJgwAf369UN4eDgWLVoEmUyG1atXl/j+48ePR2ZmZsEjIyOjfJ+uHPJ0ekzeeByxKWm4p81H+6Ca+Gl0JzzVsJbd+kRERFTZmDU18+6772Lo0KGltgkODsaVh4e6NW/evOC6q6srgoODcfHixRJf6+rqCldXV3O6JImrmWJK6oELdwAAbz0djPe7NYGTwkjdxrh0IiIii5lViPj6+sLXhLCv8PBwuLq64sSJE+jUqRMAIC8vD+fPn0f9+vUt66mN7D59E6NXHMLNe7nwdHXCrAFt0K1FHeONCy1mLaBUijtumKBKRERUJkkWq3p5eWHEiBH45JNPoFKpUL9+fcycORMA0L9/fyluWW56vYDkHWcKUlKb1fVCckwYgmp7GH9Baqq4vbf4piO1WrzOOHciIqIySZYjMnPmTDg5OeHVV1/FgwcPEBERgd9//x0+Pj5S3dJid7Nz8c7Kw9h2Qjw7pn
+4EpP7tISbcwlTLDqdOBJibOezIIgH3CUkAL17c5qGiIioFJLkiFiLLXJE/rp0F7FL06C++wCuTnJM7t0SA9qVsVtn+3agS5ey33zbNsa8ExFRlWP3HJGKQBAEpPx5Ef/58ThydXrUq1kNyUPC0CLAhJTUh4txrdaOiIioiqqShUh2bj4mrDuGdYfUAIDnm/tjVv828HY3MSW1bl3rtiMiIqqiqmQhsuzPi1h3SA2FXIb3uzXBW08HQyYzI6DMcCieWm18nYhMJv68c2frdZqIiKgSqpKFyNAOQThyKRMxEfXwZLAFAWWFD8WTyYoWI4aCJimJC1WJiIjKIEnEu6NzUsgx95VQy4oQg6gocYtuYGDR60olt+4SERGZqEqOiFjNw0PxmKxKRERkGRYi5aVQcIsuERGRhark1AwRERE5BhYiREREZDcsRIiIiMhuWIgQERGR3bAQISIiIrthIUJERER2w0KEiIiI7IaFCBEREdkNCxEiIiKyGxYiREREZDcsRIiIiMhuHPqsGUEQAAAajcbOPSEiIiJTGb63Dd/jpXHoQiQrKwsAoFKp7NwTIiIiMldWVha8vb1LbSMTTClX7ESv1+Py5cvw9PSETCazd3ceo9FooFKpkJGRAS8vL3t3xyb4mfmZKyt+Zn7mysoen1kQBGRlZSEgIAByeemrQBx6REQul0OpVNq7G2Xy8vKqMv+gDfiZqwZ+5qqBn7lqsPVnLmskxICLVYmIiMhuWIgQERGR3bAQKQdXV1d88skncHV1tXdXbIafuWrgZ64a+JmrBkf/zA69WJWIiIgqN46IEBERkd2wECEiIiK7YSFCREREdsNChIiIiOyGhYgVbdq0CREREXB3d4ePjw/69Olj7y7ZhFarRUhICGQyGQ4fPmzv7kjm/PnzeOONN9CgQQO4u7ujYcOG+OSTT5Cbm2vvrlnV/PnzERQUBDc3N0RERGDfvn327pKkEhMT0a5dO3h6esLPzw99+vTBiRMn7N0tm5k+fTpkMhkSEhLs3RXJqdVqDBkyBLVq1YK7uztatWqFAwcO2LtbktHpdJg4cWKR31mTJ0826fwXW3LoZNWKZO3atRg+fDimTZuGZ599Fvn5+Th27Ji9u2UTY8eORUBAAI4cOWLvrkgqPT0der0eCxcuRKNGjXDs2DEMHz4c9+/fx6xZs+zdPatYuXIlxowZgwULFiAiIgJJSUno1q0bTpw4AT8/P3t3TxI7duxAXFwc2rVrh/z8fHz44Yd44YUXcPz4cXh4eNi7e5Lav38/Fi5ciNatW9u7K5K7c+cOOnbsiC5dumDz5s3w9fXFqVOn4OPjY++uSWbGjBlITk7GkiVL0KJFCxw4cADDhg2Dt7c3Ro8ebe/uPSJQueXl5QmBgYHCt99+a++u2NxPP/0kNG3aVPj7778FAMKhQ4fs3SWb+uyzz4QGDRrYuxtW0759eyEuLq7guU6nEwICAoTExEQ79sq2rl+/LgAQduzYYe+uSCorK0to3LixsGXLFuGZZ54R4uPj7d0lSX3wwQdCp06d7N0Nm+rRo4fw+uuvF7kWFRUlxMTE2KlHxnFqxgrS0tKgVqshl8sRGhqKunXr4sUXX6z0IyLXrl3D8OHD8f3336NatWr27o5dZGZmombNmvbuhlXk5ubi4MGD6Nq1a8E1uVyOrl27Ys+ePXbsmW1lZmYCQKX5ey1JXFwcevToUeTvuzL74Ycf0LZtW/Tv3x9+fn4IDQ3FN998Y+9uSapDhw7YunUrTp48CQA4cuQIdu3ahRdffNHOPSuKhYgVnD17FgDw6aef4qOPPsLGjRvh4+ODyMhI3L592869k4YgCBg6dChGjBiBtm3b2rs7dnH69GnMnTsXb731lr27YhU3b96ETqeDv79/kev+/v64evWqnXplW3q9HgkJCejYsSNatmxp7+5IZsWKFUhLS0NiYqK9u2IzZ8+eRXJyMho3boxffvkFsbGxGD16NJYsWWLvrklm3LhxGD
RoEJo2bQpnZ2eEhoYiISEBMTEx9u5aESxESjFu3DjIZLJSH4Z1AwAwYcIE9OvXD+Hh4Vi0aBFkMhlWr15t509hHlM/89y5c5GVlYXx48fbu8vlZupnLkytVqN79+7o378/hg8fbqeek7XFxcXh2LFjWLFihb27IpmMjAzEx8cjJSUFbm5u9u6Ozej1eoSFhWHatGkIDQ3Fm2++ieHDh2PBggX27ppkVq1ahZSUFCxbtgxpaWlYsmQJZs2a5XDFFxerluLdd9/F0KFDS20THByMK1euAACaN29ecN3V1RXBwcG4ePGilF20OlM/8++//449e/Y8dnZB27ZtERMT43D/0Etj6mc2uHz5Mrp06YIOHTrg66+/lrh3tlO7dm0oFApcu3atyPVr166hTp06duqV7YwaNQobN27EH3/8AaVSae/uSObgwYO4fv06wsLCCq7pdDr88ccfmDdvHrRaLRQKhR17KI26desW+R0NAM2aNcPatWvt1CPpvf/++wWjIgDQqlUrXLhwAYmJiXjttdfs3LtHWIiUwtfXF76+vmW2Cw8Ph6urK06cOIFOnToBAPLy8nD+/HnUr19f6m5alamfec6cOZgyZUrB88uXL6Nbt25YuXIlIiIipOyi1Zn6mQFxJKRLly4Fo15yeeUZVHRxcUF4eDi2bt1asPVcr9dj69atGDVqlH07JyFBEPD2229j3bp12L59Oxo0aGDvLknqueeew9GjR4tcGzZsGJo2bYoPPvigUhYhANCxY8fHtmWfPHmywv2ONkd2dvZjv6MUCkXBKL7DsPdq2coiPj5eCAwMFH755RchPT1deOONNwQ/Pz/h9u3b9u6aTZw7d67S75q5dOmS0KhRI+G5554TLl26JFy5cqXgUVmsWLFCcHV1FRYvXiwcP35cePPNN4UaNWoIV69etXfXJBMbGyt4e3sL27dvL/J3mp2dbe+u2UxV2DWzb98+wcnJSZg6dapw6tQpISUlRahWrZqwdOlSe3dNMq+99poQGBgobNy4UTh37pyQmpoq1K5dWxg7dqy9u1YECxEryc3NFd59913Bz89P8PT0FLp27SocO3bM3t2ymapQiCxatEgAYPRRmcydO1eoV6+e4OLiIrRv317Yu3evvbskqZL+ThctWmTvrtlMVShEBEEQfvzxR6Fly5aCq6ur0LRpU+Hrr7+2d5ckpdFohPj4eKFevXqCm5ubEBwcLEyYMEHQarX27loRMkFwsIg1IiIiqjIqzwQ3ERERVTgsRIiIiMhuWIgQERGR3bAQISIiIrthIUJERER2w0KEiIiI7IaFCBEREdkNCxEiIiKyGxYiREREZDcsRIiIiMhuWIgQERGR3bAQISIiIrv5f6RCTqmT59b9AAAAAElFTkSuQmCC", 123 | "text/plain": [ 124 | "
" 125 | ] 126 | }, 127 | "metadata": {}, 128 | "output_type": "display_data" 129 | } 130 | ], 131 | "source": [ 132 | "X = concatenate((p3, p4), dim=0).detach().numpy()\n", 133 | "X_pts = pts.detach().numpy()\n", 134 | "\n", 135 | "plt.plot(X_pts[..., :, 0], X_pts[:, 1], \"ro\")\n", 136 | "plt.plot(X[:, 0], X[:, 1])\n", 137 | "plt.show()" 138 | ] 139 | } 140 | ], 141 | "metadata": { 142 | "kernelspec": { 143 | "display_name": "python3", 144 | "language": "python", 145 | "name": "python3" 146 | } 147 | }, 148 | "nbformat": 4, 149 | "nbformat_minor": 2 150 | } 151 | -------------------------------------------------------------------------------- /requirements-dev.txt: -------------------------------------------------------------------------------- 1 | jupyter 2 | nbconvert 3 | pre-commit 4 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | 2 | imageio 3 | IProgress 4 | ipywidgets 5 | jupyterlab_widgets 6 | kornia 7 | kornia-rs 8 | kornia_moons>=0.2.9 9 | matplotlib 10 | numpy<=2.0.0 11 | opencv-python 12 | pandas 13 | plotly 14 | py7zr 15 | pytorch_lightning 16 | rawpy 17 | tensorboard 18 | torch 19 | torchmetrics 20 | torchvision 21 | tqdm 22 | -------------------------------------------------------------------------------- /scripts/training/image_classifier/README.rst: -------------------------------------------------------------------------------- 1 | Image Classifier Example 2 | ======================== 3 | 4 | This is a toy example implementing an image classification application. 5 | 6 | 1. Install dependencies 7 | 8 | .. code-block:: python 9 | 10 | pip install -r requirements.tx 11 | 12 | 2. Execute the script: The entry point to this example is the file 13 | 14 | .. code-block:: python 15 | 16 | python main.py 17 | 18 | 3. Modify the hyper-parameters in `config.yml` and execute 19 | 20 | .. 
code-block:: bash 21 | 22 | python main.py num_epochs=50 23 | 24 | 4. Sweep hyper-parameters 25 | 26 | .. code-block:: bash 27 | 28 | python main.py --multirun num_epochs=1 lr=1e-3,1e-4 29 | 30 | Explore Hydra to get more out of the config files: https://hydra.cc/ 31 | -------------------------------------------------------------------------------- /scripts/training/image_classifier/config.yaml: -------------------------------------------------------------------------------- 1 | 2 | data_path: ./data 3 | batch_size: 64 4 | num_epochs: 250 5 | lr: 1e-3 6 | output_path: ./output 7 | -------------------------------------------------------------------------------- /scripts/training/image_classifier/main.py: -------------------------------------------------------------------------------- 1 | import hydra 2 | import torch 3 | import torch.nn as nn 4 | import torchvision 5 | import torchvision.transforms as T 6 | from hydra.core.config_store import ConfigStore 7 | from hydra.utils import to_absolute_path 8 | 9 | import kornia as K 10 | from kornia.x import Configuration, ImageClassifierTrainer, ModelCheckpoint 11 | 12 | cs = ConfigStore.instance() 13 | # Register the Configuration class under the name 'config'.
14 | cs.store(name="config", node=Configuration) 15 | 16 | 17 | @hydra.main(config_path=".", config_name="config.yaml") 18 | def my_app(config: Configuration) -> None: 19 | # create the dataset 20 | train_dataset = torchvision.datasets.CIFAR10( 21 | root=to_absolute_path(config.data_path), train=True, download=True, transform=T.ToTensor() 22 | ) 23 | 24 | valid_dataset = torchvision.datasets.CIFAR10( 25 | root=to_absolute_path(config.data_path), train=False, download=True, transform=T.ToTensor() 26 | ) 27 | 28 | # create the dataloaders 29 | train_dataloader = torch.utils.data.DataLoader( 30 | train_dataset, batch_size=config.batch_size, shuffle=True, num_workers=8, pin_memory=True 31 | ) 32 | 33 | valid_daloader = torch.utils.data.DataLoader( 34 | valid_dataset, batch_size=config.batch_size, shuffle=True, num_workers=8, pin_memory=True 35 | ) 36 | 37 | # create the model 38 | model = nn.Sequential( 39 | K.contrib.VisionTransformer(image_size=32, patch_size=16, embed_dim=128, num_heads=3), 40 | K.contrib.ClassificationHead(embed_size=128, num_classes=10), 41 | ) 42 | 43 | # create the loss function 44 | criterion = nn.CrossEntropyLoss() 45 | 46 | # instantiate the optimizer and scheduler 47 | optimizer = torch.optim.AdamW(model.parameters(), lr=config.lr) 48 | scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, config.num_epochs * len(train_dataloader)) 49 | 50 | # define some augmentations 51 | _augmentations = nn.Sequential( 52 | K.augmentation.RandomHorizontalFlip(p=0.75), 53 | K.augmentation.RandomVerticalFlip(p=0.75), 54 | K.augmentation.RandomAffine(degrees=10.0), 55 | K.augmentation.PatchSequential( 56 | K.augmentation.ColorJiggle(0.1, 0.1, 0.1, 0.1, p=0.8), 57 | grid_size=(2, 2), # cifar-10 is 32x32 and vit is patch 16 58 | patchwise_apply=False, 59 | ), 60 | ) 61 | 62 | def augmentations(self, sample: dict) -> dict: 63 | out = _augmentations(sample["input"]) 64 | return {"input": out, "target": sample["target"]} 65 | 66 | model_checkpoint = 
ModelCheckpoint(filepath="./outputs", monitor="top5") 67 | 68 | trainer = ImageClassifierTrainer( 69 | model, 70 | train_dataloader, 71 | valid_daloader, 72 | criterion, 73 | optimizer, 74 | scheduler, 75 | config, 76 | callbacks={"augmentations": augmentations, "on_checkpoint": model_checkpoint}, 77 | ) 78 | trainer.fit() 79 | 80 | 81 | if __name__ == "__main__": 82 | my_app() 83 | -------------------------------------------------------------------------------- /scripts/training/image_classifier/requirements.txt: -------------------------------------------------------------------------------- 1 | hydra-core 2 | kornia[x] 3 | torchvision 4 | -------------------------------------------------------------------------------- /scripts/training/object_detection/README.rst: -------------------------------------------------------------------------------- 1 | Object Detection Example 2 | ======================== 3 | 4 | This is a toy example implementing an object detection application. 5 | 6 | 1. Install dependencies 7 | 8 | .. code-block:: python 9 | 10 | pip install -r requirements.tx 11 | 12 | 2. Execute the script: The entry point to this example is the file 13 | 14 | .. code-block:: python 15 | 16 | python main.py 17 | 18 | 3. Modify the hyper-parameters in `config.yml` and execute 19 | 20 | .. code-block:: python 21 | 22 | python main.py num_epochs=50 23 | 24 | 4. Sweep hyper-parameters 25 | 26 | .. 
code-block:: bash 27 | 28 | python main.py --multirun num_epochs=1 lr=1e-3,1e-4 29 | 30 | Explore Hydra to get more out of the config files: https://hydra.cc/ 31 | -------------------------------------------------------------------------------- /scripts/training/object_detection/config.yaml: -------------------------------------------------------------------------------- 1 | data_path: ./data 2 | batch_size: 2 3 | num_epochs: 2 4 | lr: 1e-3 5 | output_path: ./output 6 | -------------------------------------------------------------------------------- /scripts/training/object_detection/main.py: -------------------------------------------------------------------------------- 1 | import hydra 2 | import torch 3 | import torchvision 4 | import torchvision.transforms as T 5 | from hydra.core.config_store import ConfigStore 6 | from hydra.utils import to_absolute_path 7 | 8 | import kornia as K 9 | from kornia.x import Configuration, ModelCheckpoint, ObjectDetectionTrainer 10 | 11 | cs = ConfigStore.instance() 12 | # Register the Configuration class under the name 'config'. 13 | cs.store(name="config", node=Configuration) 14 | 15 | 16 | @hydra.main(config_path=".", config_name="config.yaml") 17 | def my_app(config: Configuration) -> None: 18 | 19 | def collate_fn(data): 20 | # To map from [{A, B}, ...] => [{A}, ...], [{B}, ...] 21 | return [d[0] for d in data], [d[1] for d in data] 22 | 23 | # create the dataset 24 | train_dataset = torchvision.datasets.WIDERFace( 25 | root=to_absolute_path(config.data_path), transform=T.ToTensor(), split='train', download=False 26 | ) 27 | 28 | valid_dataset = torchvision.datasets.WIDERFace( 29 | root=to_absolute_path(config.data_path), transform=T.ToTensor(), split='val', download=False 30 | ) 31 | 32 | # create the dataloaders 33 | train_dataloader = torch.utils.data.DataLoader( 34 | train_dataset, batch_size=config.batch_size, shuffle=True, num_workers=0, pin_memory=True, collate_fn=collate_fn 35 | ) 36 | 37 | valid_dataloader = torch.utils.data.DataLoader( 38 | valid_dataset, batch_size=config.batch_size, shuffle=False, num_workers=0, pin_memory=True, collate_fn=collate_fn 39 | ) 40 | 41 | # create the model 42 | model = torchvision.models.detection.retinanet_resnet50_fpn(pretrained=True) 43 | 44 | # the detection model computes its own loss, so no external criterion is needed 45 | criterion = None 46 | 47 | # instantiate the optimizer and scheduler 48 | optimizer = torch.optim.AdamW(model.parameters(), lr=config.lr) 49 | scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, config.num_epochs * len(train_dataloader)) 50 | 51 | # define some augmentations 52 | _augmentations = K.augmentation.AugmentationSequential( 53 | K.augmentation.RandomHorizontalFlip(p=0.75), 54 | K.augmentation.RandomVerticalFlip(p=0.75), 55 | K.augmentation.RandomAffine( 56 | degrees=0.0, translate=(0.1, 0.1), scale=(0.9, 1.1) 57 | ), # NOTE: XYXY bbox format cannot handle rotated boxes 58 | data_keys=['input', 'bbox_xyxy'], 59 | ) 60 | 61 | def bbox_xywh_to_xyxy(boxes: torch.Tensor): 62 | boxes[..., 2] = boxes[..., 0] + boxes[..., 2] # x + w 63 | boxes[..., 3] = boxes[..., 1] + boxes[..., 3] # y + h 64 | return boxes 65 | 66 | def preprocess(self, x: dict) -> dict: 67 | x['target'] = { 68 | "boxes": [bbox_xywh_to_xyxy(a['bbox'].float()) for a in x['target']], 69 | # labels are set to 1 for all faces 70 | "labels": [torch.tensor([1] * len(a['bbox'])) for a in x['target']], 71 | } 72 | return x 73 | 74 | def augmentations(self, sample: dict) -> dict: 75 | xs, ys, ys2 = [], [], [] 76 | for inp, trg, lab in zip(sample["input"], sample["target"]["boxes"], sample["target"]["labels"]): 77 | x, y = _augmentations(inp[None], trg[None]) 78 | xs.append(x[0]) 79 | ys.append(y[0]) 80 | ys2.append(lab) 81 | return {"input": xs, "target": {"boxes": ys, "labels": ys2}} 82 | 83 | def on_before_model(self, sample: dict) -> dict: 84 | return { 85 | "input": sample["input"], 86 | "target": [ 87 | {"boxes": v, "labels": l} for v, l in zip(sample["target"]["boxes"], sample["target"]["labels"]) 88 | ], 89 | } 90 | 91 | model_checkpoint = ModelCheckpoint(filepath="./outputs", monitor="map") 92 | 93 | trainer = ObjectDetectionTrainer( 94 | model, 95 | train_dataloader, 96 | valid_dataloader, 97 | criterion, 98 | optimizer, 99 | scheduler, 100 | config, 101 | num_classes=81, 102 | loss_computed_by_model=True, 103 | callbacks={ 104 | "preprocess": preprocess, 105 | "augmentations": augmentations, 106 | "on_before_model": on_before_model, 107 | "on_checkpoint": model_checkpoint, 108 | }, 109 | ) 110 | trainer.fit() 111 | trainer.evaluate() 112 | 113 | 114 | if __name__ == "__main__": 115 | my_app() 116 | -------------------------------------------------------------------------------- /scripts/training/object_detection/requirements.txt: -------------------------------------------------------------------------------- 1 | hydra-core 2 | kornia[x] 3 | torchvision 4 | -------------------------------------------------------------------------------- /scripts/training/semantic_segmentation/README.rst: -------------------------------------------------------------------------------- 1 | Semantic Segmentation Example 2 | ============================= 3 | 4 | This is a toy example implementing a semantic segmentation application. 5 | 6 | 1. Install dependencies 7 | 8 | ..
code-block:: bash 9 | 10 | pip install -r requirements.txt 11 | 12 | 2. Execute the script: the entry point to this example is the file ``main.py`` 13 | 14 | .. code-block:: bash 15 | 16 | python main.py 17 | 18 | 3. Modify the hyper-parameters in ``config.yaml`` and execute 19 | 20 | .. code-block:: bash 21 | 22 | python main.py num_epochs=50 23 | 24 | 4. Sweep hyper-parameters 25 | 26 | .. code-block:: bash 27 | 28 | python main.py --multirun num_epochs=1 lr=1e-3,1e-4 29 | 30 | Explore Hydra to get more out of the config files: https://hydra.cc/ 31 | -------------------------------------------------------------------------------- /scripts/training/semantic_segmentation/config.yaml: -------------------------------------------------------------------------------- 1 | 2 | data_path: ./data 3 | batch_size: 4 4 | num_epochs: 2 5 | lr: 1e-3 6 | output_path: ./output 7 | image_size: [224, 224] 8 | -------------------------------------------------------------------------------- /scripts/training/semantic_segmentation/main.py: -------------------------------------------------------------------------------- 1 | import hydra 2 | import numpy as np 3 | import torch 4 | import torch.nn as nn 5 | import torchvision 6 | from hydra.core.config_store import ConfigStore 7 | from hydra.utils import to_absolute_path 8 | 9 | import kornia as K 10 | from kornia.x import Configuration, Lambda, SemanticSegmentationTrainer 11 | 12 | cs = ConfigStore.instance() 13 | # Register the Configuration class under the name 'config'.
14 | cs.store(name="config", node=Configuration) 15 | 16 | 17 | class Transform(nn.Module): 18 | def __init__(self, image_size): 19 | super().__init__() 20 | self.resize = K.geometry.Resize(image_size, interpolation='nearest') 21 | 22 | @torch.no_grad() 23 | def forward(self, x, y): 24 | x = K.utils.image_to_tensor(np.array(x)) 25 | x, y = x.float() / 255.0, torch.from_numpy(y) 26 | return self.resize(x), self.resize(y) 27 | 28 | 29 | @hydra.main(config_path=".", config_name="config.yaml") 30 | def my_app(config: Configuration) -> None: 31 | # make image size homogeneous 32 | transform = Transform(tuple(config.image_size)) 33 | 34 | # create the dataset 35 | train_dataset = torchvision.datasets.SBDataset( 36 | root=to_absolute_path(config.data_path), image_set='train', download=False, transforms=transform 37 | ) 38 | 39 | valid_dataset = torchvision.datasets.SBDataset( 40 | root=to_absolute_path(config.data_path), image_set='val', download=False, transforms=transform 41 | ) 42 | 43 | # create the dataloaders 44 | train_dataloader = torch.utils.data.DataLoader( 45 | train_dataset, batch_size=config.batch_size, shuffle=True, num_workers=8, pin_memory=True 46 | ) 47 | 48 | valid_daloader = torch.utils.data.DataLoader( 49 | valid_dataset, batch_size=config.batch_size, shuffle=True, num_workers=8, pin_memory=True 50 | ) 51 | 52 | # create the loss function 53 | criterion = nn.CrossEntropyLoss() 54 | 55 | # create the model 56 | model = nn.Sequential(torchvision.models.segmentation.fcn_resnet50(pretrained=False), Lambda(lambda x: x['out'])) 57 | 58 | # instantiate the optimizer and scheduler 59 | optimizer = torch.optim.AdamW(model.parameters(), lr=config.lr) 60 | scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, config.num_epochs * len(train_dataloader)) 61 | 62 | # define some augmentations 63 | _augmentations = K.augmentation.AugmentationSequential( 64 | K.augmentation.RandomHorizontalFlip(p=0.75), 65 | K.augmentation.RandomVerticalFlip(p=0.75), 66 | 
K.augmentation.RandomAffine(degrees=10.0), 67 | data_keys=['input', 'mask'], 68 | ) 69 | 70 | def preprocess(self, sample: dict) -> dict: 71 | target = sample["target"].argmax(1).unsqueeze(1).float() 72 | return {"input": sample["input"], "target": target} 73 | 74 | def augmentations(self, sample: dict) -> dict: 75 | x, y = _augmentations(sample["input"], sample["target"]) 76 | # NOTE: use matplotlib to visualise before/after samples 77 | return {"input": x, "target": y} 78 | 79 | def on_before_model(self, sample: dict) -> dict: 80 | target = sample["target"].squeeze(1).long() 81 | return {"input": sample["input"], "target": target} 82 | 83 | trainer = SemanticSegmentationTrainer( 84 | model, 85 | train_dataloader, 86 | valid_daloader, 87 | criterion, 88 | optimizer, 89 | scheduler, 90 | config, 91 | callbacks={"preprocess": preprocess, "augmentations": augmentations, "on_before_model": on_before_model}, 92 | ) 93 | trainer.fit() 94 | 95 | 96 | if __name__ == "__main__": 97 | my_app() 98 | -------------------------------------------------------------------------------- /scripts/training/semantic_segmentation/requirements.txt: -------------------------------------------------------------------------------- 1 | hydra-core 2 | kornia[x] 3 | torchvision 4 | -------------------------------------------------------------------------------- /tutorials/_static/css/styles_dark.scss: -------------------------------------------------------------------------------- 1 | /*-- scss:defaults --*/ 2 | $body-bg: #131416; 3 | $body-color: rgb(255, 255, 255, 0.8); 4 | $primary: #131416; 5 | $navbar-bg: #1A1C1E; 6 | $navbar-fg: rgb(255, 255, 255, 0.8); 7 | $sidebar-bg: #131416; 8 | $sidebar-fg: rgb(255, 255, 255, 0.8); 9 | $link-color: #2B8CEE; 10 | 11 | // Code blocks 12 | // $code-block-bg-alpha: -.8; 13 | $code-block-bg: #272822; 14 | 15 | .list div.quarto-post .listing-categories .listing-category { 16 | color: rgb(255, 255, 255, 0.8); 17 | } 18 | 
-------------------------------------------------------------------------------- /tutorials/_static/css/styles_light.scss: -------------------------------------------------------------------------------- 1 | /*-- scss:defaults --*/ 2 | $body-bg: white; 3 | $body-color: black; 4 | $primary: #3980F5; 5 | $navbar-bg: #3980F5; 6 | $navbar-fg: white; 7 | $sidebar-bg: #3980F5; 8 | $sidebar-fg: white; 9 | $link-color: #2B8CEE; 10 | 11 | 12 | // Code blocks 13 | $code-block-bg-alpha: -.8; 14 | -------------------------------------------------------------------------------- /tutorials/_static/img/kornia_logo.svg: -------------------------------------------------------------------------------- 1 | 2 | 3 | 7 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | 45 | 51 | 54 | 57 | 60 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | -------------------------------------------------------------------------------- /tutorials/_static/img/kornia_logo_favicon.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/_static/img/kornia_logo_favicon.png -------------------------------------------------------------------------------- /tutorials/_static/img/kornia_logo_mini.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/_static/img/kornia_logo_mini.png -------------------------------------------------------------------------------- /tutorials/_static/img/kornia_logo_only_dark.svg: 
-------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /tutorials/_static/img/kornia_logo_only_light.svg: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /tutorials/assets/aliased_and_not_aliased_patch_extraction.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/aliased_and_not_aliased_patch_extraction.png -------------------------------------------------------------------------------- /tutorials/assets/canny.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/canny.png -------------------------------------------------------------------------------- /tutorials/assets/color_conversions.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/color_conversions.png -------------------------------------------------------------------------------- /tutorials/assets/color_raw_to_rgb.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/color_raw_to_rgb.png -------------------------------------------------------------------------------- /tutorials/assets/color_yuv420_to_rgb.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/color_yuv420_to_rgb.png -------------------------------------------------------------------------------- /tutorials/assets/connected_components.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/connected_components.png -------------------------------------------------------------------------------- /tutorials/assets/data_augmentation_2d.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/data_augmentation_2d.png -------------------------------------------------------------------------------- /tutorials/assets/data_augmentation_mosiac.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/data_augmentation_mosiac.png -------------------------------------------------------------------------------- /tutorials/assets/data_augmentation_segmentation.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/data_augmentation_segmentation.png -------------------------------------------------------------------------------- /tutorials/assets/data_augmentation_sequential.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/data_augmentation_sequential.png -------------------------------------------------------------------------------- 
/tutorials/assets/data_patch_sequential.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/data_patch_sequential.png -------------------------------------------------------------------------------- /tutorials/assets/descriptor_matching.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/descriptor_matching.png -------------------------------------------------------------------------------- /tutorials/assets/extract_combine_patches.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/extract_combine_patches.png -------------------------------------------------------------------------------- /tutorials/assets/face_detection.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/face_detection.png -------------------------------------------------------------------------------- /tutorials/assets/filtering_edges.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/filtering_edges.png -------------------------------------------------------------------------------- /tutorials/assets/filtering_operators.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/filtering_operators.png 
-------------------------------------------------------------------------------- /tutorials/assets/fit_line.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/fit_line.png -------------------------------------------------------------------------------- /tutorials/assets/fit_plane.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/fit_plane.png -------------------------------------------------------------------------------- /tutorials/assets/gaussian_blur.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/gaussian_blur.png -------------------------------------------------------------------------------- /tutorials/assets/geometric_transforms.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/geometric_transforms.png -------------------------------------------------------------------------------- /tutorials/assets/geometry_generate_patch.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/geometry_generate_patch.png -------------------------------------------------------------------------------- /tutorials/assets/hello_world_tutorial.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/hello_world_tutorial.png 
--------------------------------------------------------------------------------
/tutorials/assets/homography.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/homography.png
--------------------------------------------------------------------------------
/tutorials/assets/image_enhancement.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/image_enhancement.png
--------------------------------------------------------------------------------
/tutorials/assets/image_histogram.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/image_histogram.png
--------------------------------------------------------------------------------
/tutorials/assets/image_matching.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/image_matching.png
--------------------------------------------------------------------------------
/tutorials/assets/image_matching_LightGlue.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/image_matching_LightGlue.png
--------------------------------------------------------------------------------
/tutorials/assets/image_matching_adalam.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/image_matching_adalam.png
--------------------------------------------------------------------------------
/tutorials/assets/image_matching_disk.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/image_matching_disk.png
--------------------------------------------------------------------------------
/tutorials/assets/image_points_transforms.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/image_points_transforms.png
--------------------------------------------------------------------------------
/tutorials/assets/image_registration.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/image_registration.png
--------------------------------------------------------------------------------
/tutorials/assets/image_stitching.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/image_stitching.png
--------------------------------------------------------------------------------
/tutorials/assets/kornia.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/kornia.png
--------------------------------------------------------------------------------
/tutorials/assets/line_detection_and_matching_sold2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/line_detection_and_matching_sold2.png
--------------------------------------------------------------------------------
/tutorials/assets/morphology_101.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/morphology_101.png
--------------------------------------------------------------------------------
/tutorials/assets/resize_antialias.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/resize_antialias.png
--------------------------------------------------------------------------------
/tutorials/assets/rotate_affine.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/rotate_affine.png
--------------------------------------------------------------------------------
/tutorials/assets/total_variation_denoising.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/total_variation_denoising.png
--------------------------------------------------------------------------------
/tutorials/assets/unsharp_mask.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/unsharp_mask.png
--------------------------------------------------------------------------------
/tutorials/assets/visual_prompter.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/visual_prompter.png
--------------------------------------------------------------------------------
/tutorials/assets/warp_perspective.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/warp_perspective.png
--------------------------------------------------------------------------------
/tutorials/assets/zca_whitening.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/kornia/tutorials/3aee7018aed5a11416a38b8aa07f631b38d84dcd/tutorials/assets/zca_whitening.png
--------------------------------------------------------------------------------
/tutorials/training/image_classification.rst:
--------------------------------------------------------------------------------
1 | Image Classification
2 | ====================
3 | 
4 | .. image:: https://production-media.paperswithcode.com/thumbnails/task/task-0000000951-52325f45_O0tAMly.jpg
5 |    :align: right
6 |    :width: 20%
7 | 
8 | Image Classification is a fundamental task that attempts to comprehend an entire image as a whole.
9 | The goal is to classify the image by assigning it to a specific label. Typically, Image Classification refers to images
10 | in which only one object appears and is analyzed. In contrast, object detection involves both classification and
11 | localization tasks, and is used to analyze more realistic cases in which multiple objects may exist in an image.
12 | 
13 | Learn more: `https://paperswithcode.com/task/image-classification <https://paperswithcode.com/task/image-classification>`_
14 | 
15 | Inference
16 | ---------
17 | 
18 | Kornia provides a couple of backbones based on `transformers `_
19 | to perform image classification. Check out the following APIs, `VisionTransformer `_ and
20 | `ClassificationHead `_, and
21 | combine them as follows to customize your own classifier:
22 | 
23 | .. code:: python
24 | 
25 |    import torch
26 |    import torch.nn as nn
27 |    import kornia.contrib as K
28 | 
29 |    classifier = nn.Sequential(
30 |        K.VisionTransformer(image_size=224, patch_size=16),
31 |        K.ClassificationHead(num_classes=1000),
32 |    )
33 | 
34 |    img = torch.rand(1, 3, 224, 224)
35 |    out = classifier(img)  # BxN
36 |    scores = out.argmax(-1)  # B
37 | 
38 | .. tip::
39 |    Read more about our `Vision Transformer (ViT) `_
40 | 
41 | Finetuning
42 | ----------
43 | 
44 | In order to customize your model with your own data you can use our `Training API `_
45 | to perform the `fine-tuning `_ of your model.
46 | 
47 | We provide `ImageClassifierTrainer `_ with a default training structure to train basic
48 | image classification problems. However, one can leverage this API using the models provided by Kornia or
49 | use existing libraries from the PyTorch ecosystem such as `torchvision `_
50 | or `timm `_.
51 | 
52 | Create the dataloaders:
53 | 
54 | .. literalinclude:: ../../scripts/training/image_classifier/main.py
55 |    :language: python
56 |    :lines: 20-36
57 | 
58 | Define your model, losses, optimizers and schedulers:
59 | 
60 | .. literalinclude:: ../../scripts/training/image_classifier/main.py
61 |    :language: python
62 |    :lines: 37-48
63 | 
64 | Define your augmentations:
65 | 
66 | .. literalinclude:: ../../scripts/training/image_classifier/main.py
67 |    :language: python
68 |    :lines: 50-65
69 | 
70 | Finally, instantiate the `ImageClassifierTrainer `_ and execute your training pipeline.
71 | 
72 | .. literalinclude:: ../../scripts/training/image_classifier/main.py
73 |    :language: python
74 |    :lines: 66-78
75 | 
76 | .. seealso::
77 |    Play with the full example `here `_
78 | 
--------------------------------------------------------------------------------
/tutorials/training/object_detection.rst:
--------------------------------------------------------------------------------
1 | Object detection
2 | ================
3 | 
4 | .. image:: https://production-media.paperswithcode.com/thumbnails/task/task-0000000004-7757802e.jpg
5 |    :align: right
6 |    :width: 40%
7 | 
8 | Object detection consists of detecting objects belonging to a certain category in an image,
9 | determining their absolute location and assigning each detected instance a predefined category.
10 | In the last few years, several models based on deep learning have emerged, with the state-of-the-art
11 | models based on two stages. First, regions with high recall values are proposed, so that all
12 | objects in the image are covered by the proposed regions. The second stage consists of classification models,
13 | usually CNNs, used to determine the category of each proposed region (instance).
14 | 
15 | Learn more: `https://paperswithcode.com/task/object-detection <https://paperswithcode.com/task/object-detection>`_
16 | 
17 | Finetuning
18 | ----------
19 | 
20 | In order to customize your model with your own data you can use our :ref:`training_api` to perform the
21 | `fine-tuning `_ of your model.
22 | 
23 | We provide `ObjectDetectionTrainer `_
24 | with a default training structure to train object detection problems. However, one can leverage this
25 | API using the models provided by Kornia or use existing libraries from the PyTorch ecosystem such
26 | as `torchvision `_.
27 | 
28 | Create the dataloaders and transforms:
29 | 
30 | .. literalinclude:: ../../scripts/training/object_detection/main.py
31 |    :language: python
32 |    :lines: 17-39
33 | 
34 | Define your model, losses, optimizers and schedulers:
35 | 
36 | .. literalinclude:: ../../scripts/training/object_detection/main.py
37 |    :language: python
38 |    :lines: 40-50
39 | 
40 | Create your preprocessing and augmentations pipeline:
41 | 
42 | .. literalinclude:: ../../scripts/training/object_detection/main.py
43 |    :language: python
44 |    :lines: 50-90
45 | 
46 | Finally, instantiate the `ObjectDetectionTrainer `_
47 | and execute your training pipeline.
48 | 
49 | .. literalinclude:: ../../scripts/training/object_detection/main.py
50 |    :language: python
51 |    :lines: 90-111
52 | 
53 | .. seealso::
54 |    Play with the full example `here `_
55 | 
--------------------------------------------------------------------------------
/tutorials/training/semantic_segmentation.rst:
--------------------------------------------------------------------------------
1 | Semantic segmentation
2 | =====================
3 | 
4 | .. image:: https://production-media.paperswithcode.com/thumbnails/task/task-0000000885-bec5f079_K84qLCL.jpg
5 |    :align: right
6 |    :width: 40%
7 | 
8 | Semantic segmentation, or image segmentation, is the task of clustering parts of an image together which belong to the
9 | same object class. It is a form of pixel-level prediction because each pixel in an image is classified according to a
10 | category. Some example benchmarks for this task are Cityscapes, PASCAL VOC and ADE20K. Models are usually evaluated with
11 | the Mean Intersection-Over-Union (Mean IoU) and Pixel Accuracy metrics.
12 | 
13 | Learn more: `https://paperswithcode.com/task/semantic-segmentation <https://paperswithcode.com/task/semantic-segmentation>`_
14 | 
15 | Finetuning
16 | ----------
17 | 
18 | In order to customize your model with your own data you can use our `Training API `_
19 | to perform the `fine-tuning `_ of your model.
20 | 
21 | We provide `SemanticSegmentationTrainer `_ with a default training structure to train semantic
22 | segmentation problems. However, one can leverage this API using the models provided by Kornia or
23 | use existing libraries from the PyTorch ecosystem such as `torchvision `_.
24 | 
25 | Create the dataloaders and transforms:
26 | 
27 | .. literalinclude:: ../../scripts/training/semantic_segmentation/main.py
28 |    :language: python
29 |    :lines: 17-50
30 | 
31 | Define your model, losses, optimizers and schedulers:
32 | 
33 | .. literalinclude:: ../../scripts/training/semantic_segmentation/main.py
34 |    :language: python
35 |    :lines: 52-60
36 | 
37 | Create your preprocessing and augmentations pipeline:
38 | 
39 | .. literalinclude:: ../../scripts/training/semantic_segmentation/main.py
40 |    :language: python
41 |    :lines: 62-81
42 | 
43 | Finally, instantiate the `SemanticSegmentationTrainer `_ and execute your training pipeline.
44 | 
45 | .. literalinclude:: ../../scripts/training/semantic_segmentation/main.py
46 |    :language: python
47 |    :lines: 83-91
48 | 
49 | .. seealso::
50 |    Play with the full example `here `_
51 | 
--------------------------------------------------------------------------------
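The three training guides above all follow the same pipeline: create the dataloaders, define a model, loss, optimizer and scheduler, then hand everything to a trainer that runs the fit loop. As a framework-free sketch of how those pieces interlock (a toy illustration only, assuming nothing about the real `kornia.x` trainer API; every class and function name here is hypothetical), fitting a one-parameter linear model with SGD and a step learning-rate schedule looks like this:

```python
import random


def make_batches(n=64, batch_size=8, seed=0):
    """Dataloader stand-in: batches of (x, y) pairs for the toy task y = 3x."""
    rng = random.Random(seed)
    samples = [(x, 3.0 * x) for x in (rng.uniform(-1.0, 1.0) for _ in range(n))]
    return [samples[i:i + batch_size] for i in range(0, n, batch_size)]


class LinearModel:
    """Model: y_hat = w * x, with a single learnable parameter w."""

    def __init__(self):
        self.w = 0.0

    def __call__(self, x):
        return self.w * x


class SGD:
    """Optimizer: plain gradient descent on the model's parameter."""

    def __init__(self, model, lr):
        self.model, self.lr = model, lr

    def step(self, grad):
        self.model.w -= self.lr * grad


class StepLR:
    """Scheduler: halve the learning rate every `step_size` epochs."""

    def __init__(self, optimizer, step_size):
        self.optimizer, self.step_size, self.epoch = optimizer, step_size, 0

    def step(self):
        self.epoch += 1
        if self.epoch % self.step_size == 0:
            self.optimizer.lr *= 0.5


def fit(model, batches, optimizer, scheduler, epochs=20):
    """Trainer role: loop over epochs and batches, update, then schedule."""
    for _ in range(epochs):
        for batch in batches:
            # MSE gradient: d/dw mean((w*x - y)^2) = mean(2 * (w*x - y) * x)
            grad = sum(2.0 * (model(x) - y) * x for x, y in batch) / len(batch)
            optimizer.step(grad)
        scheduler.step()
    return model


model = LinearModel()
opt = SGD(model, lr=0.5)
sched = StepLR(opt, step_size=5)
fit(model, make_batches(), opt, sched, epochs=20)
print(model.w)  # w approaches the true slope 3.0
```

The `literalinclude` snippets referenced above play exactly these roles in the real scripts, with PyTorch modules, losses and `torch.optim` objects substituted for the toy classes, and the kornia trainer substituted for `fit`.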