├── .flake8 ├── .github ├── ISSUE_TEMPLATE │ ├── bug_report.md │ └── feature_request.md └── workflows │ └── lint.yml ├── .gitignore ├── .pre-commit-config.yaml ├── LICENSE ├── README.md ├── environment.yml ├── notebooks ├── Add Field Maps App ID to ArcGIS Enterprise.ipynb ├── Add GPS Metadata Fields.ipynb ├── Add GPS Metadata Fields_FS.ipynb ├── Add Offset Metadata Fields.ipynb ├── Bulk Update Maps for Use in Collector or Field Maps.ipynb ├── Configure Search.ipynb ├── Create Offline Areas from Feature Layer.ipynb ├── Field Apps Deployment Using Python.ipynb ├── Find features with missing attachments.ipynb ├── Generate PDF Report │ ├── Generate PDF Report.ipynb │ ├── example-report.pdf │ └── report-template.html ├── Location Sharing Status.ipynb ├── Manage Map Areas with Group and Index.ipynb ├── Offline Checks.ipynb ├── Proxy Esri Basemaps for ArcGIS Enterprise Offline Map Areas.ipynb └── Watermark photo attachments with Exif data.ipynb ├── readmes ├── copy_form_between_maps.md └── download_attachments.md └── scripts ├── copy_form_between_maps.py └── download_attachments.py /.flake8: -------------------------------------------------------------------------------- 1 | [flake8] 2 | max-line-length = 160 3 | max-complexity = 18 4 | ignore = E722, W503, W504, F811 -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/bug_report.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Bug report 3 | about: Create a report to help us improve 4 | title: '' 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Describe the bug** 11 | A clear and concise description of what the bug is. 12 | 13 | **To Reproduce** 14 | Steps to reproduce the behavior: 15 | 1. Go to '...' 16 | 2. Click on '....' 17 | 3. Scroll down to '....' 18 | 4. See error 19 | 20 | **Expected behavior** 21 | A clear and concise description of what you expected to happen. 
22 | 23 | **Screenshots** 24 | If applicable, add screenshots to help explain your problem. 25 | 26 | **Desktop (please complete the following information):** 27 | - OS: [e.g. iOS] 28 | - Browser [e.g. chrome, safari] 29 | - Version [e.g. 22] 30 | 31 | **Smartphone (please complete the following information):** 32 | - Device: [e.g. iPhone6] 33 | - OS: [e.g. iOS8.1] 34 | - Browser [e.g. stock browser, safari] 35 | - Version [e.g. 22] 36 | 37 | **Additional context** 38 | Add any other context about the problem here. 39 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/feature_request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Feature request 3 | about: Suggest an idea for this project 4 | title: '' 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Is your feature request related to a problem? Please describe.** 11 | A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] 12 | 13 | **Describe the solution you'd like** 14 | A clear and concise description of what you want to happen. 15 | 16 | **Describe alternatives you've considered** 17 | A clear and concise description of any alternative solutions or features you've considered. 18 | 19 | **Additional context** 20 | Add any other context or screenshots about the feature request here. 
21 | -------------------------------------------------------------------------------- /.github/workflows/lint.yml: -------------------------------------------------------------------------------- 1 | name: Lint 2 | on: [pull_request] 3 | 4 | jobs: 5 | lint: 6 | runs-on: ubuntu-latest 7 | steps: 8 | - uses: actions/checkout@v2 9 | - name: Set up Python 3.11 10 | uses: actions/setup-python@v1 11 | with: 12 | python-version: 3.11 13 | - name: Setup flake8 annotations 14 | uses: rbialon/flake8-annotations@v1 15 | - name: Lint with flake8 16 | run: | 17 | pip install flake8 18 | flake8 . --count --show-source --statistics 19 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | **/.DS_Store 2 | -------------------------------------------------------------------------------- /.pre-commit-config.yaml: -------------------------------------------------------------------------------- 1 | repos: 2 | - repo: https://gitlab.com/pycqa/flake8 3 | rev: 3.8.3 4 | hooks: 5 | - id: flake8 -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. 
For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 
47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. 
Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. 
In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "{}" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. 
We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright {yyyy} {name of copyright owner} 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Field Maps Scripts 2 | 3 | A set of Python scripts and notebooks to help configure maps and manage data for ArcGIS Field Maps. 
4 | 5 | ## Features 6 | 7 | ### Scripts 8 | 9 | | Functionality | Script 10 | |----------------------------------------------------------------------|----------------------------------------------------------------------------------------| 11 | | [Copy Form Between Maps](readmes/copy_form_between_maps.md) | [copy_form_between_maps.py](scripts/copy_form_between_maps.py) | 12 | | [Download Attachments from Feature Layer](readmes/download_attachments.md) | [download_attachments.py](scripts/download_attachments.py) 13 | 14 | ### Notebooks 15 | 16 | - [Add Field Maps App ID to ArcGIS Enterprise](notebooks/Add%20Field%20Maps%20App%20ID%20to%20ArcGIS%20Enterprise.ipynb) 17 | - [Bulk Update Maps for Use in Collector or Field Maps](notebooks/Bulk%20Update%20Maps%20for%20Use%20in%20Collector%20or%20Field%20Maps.ipynb) 18 | - [Field Apps Deployment Using Python](notebooks/Field%20Apps%20Deployment%20Using%20Python.ipynb) 19 | - [Generate PDF Report](notebooks/Generate%20PDF%20Report/Generate%20PDF%20Report.ipynb) 20 | - [Add GPS Metadata Fields (Pro)](notebooks/Add%20GPS%20Metadata%20Fields.ipynb) 21 | - [Add Offset Metadata Fields (Pro)](notebooks/Add%20Offset%20Metadata%20Fields.ipynb) 22 | - [Add GPS Metadata Fields](notebooks/Add%20GPS%20Metadata%20Fields_FS.ipynb) 23 | - [Configure Search](notebooks/Configure%20Search.ipynb) 24 | - [Location Sharing Status](notebooks/Location%20Sharing%20Status.ipynb) 25 | - [Create Offline Areas from Feature Layer](notebooks/Create%20Offline%20Areas%20from%20Feature%20Layer.ipynb) 26 | - [Manage Map Areas with Group and Index](notebooks/Manage%20Map%20Areas%20with%20Group%20and%20Index.ipynb) 27 | - [Proxy Esri Basemaps for ArcGIS Enterprise Offline Map Areas](notebooks/Proxy%20Esri%20Basemaps%20for%20ArcGIS%20Enterprise%20Offline%20Map%20Areas.ipynb) 28 | - [Watermark photo attachments with Exif data](notebooks/Watermark%20photo%20attachments%20with%20Exif%20data.ipynb) 29 | - [Offline Checks](notebooks/Offline%20Checks.ipynb) 30 | 31 
| ### Requirements 32 | 33 | - ArcGIS API for Python 2.3.1 34 | - Python 3.9.x to 3.11.x is required to use the ArcGIS API for Python 2.3.1. 35 | - ArcGIS Field Maps (web and mobile applications) 36 | - ArcGIS Pro 2.9+ (`Add GPS Metadata Fields (Pro)` and `Add Offset Metadata Fields (Pro)` notebooks only) 37 | 38 | ### Instructions 39 | 40 | These scripts and notebooks target version 2.3.1 of the ArcGIS API for Python. We recommend setting up your 41 | local environment via Anaconda. 42 | 43 | 1. [Install Anaconda](https://developers.arcgis.com/python/guide/install-and-set-up/) 44 | 2. Run `conda env create --file environment.yml` to create the virtual environment with the correct dependencies 45 | 3. Run `conda activate field-maps-scripts` to activate the environment 46 | 4. (Optional - dev only) Configure pre-commit to run flake8 linting on pushes 47 | - `pre-commit install --hook-type pre-push` 48 | 49 | ## Resources 50 | 51 | - [ArcGIS Field Maps](https://www.esri.com/arcgis-blog/products/apps/field-mobility/introducing-arcgis-field-maps/) 52 | - [ArcGIS API for Python](https://developers.arcgis.com/python) 53 | 54 | ## Issues 55 | 56 | Although we do our best to ensure these scripts and notebooks work as expected, they are provided as is and there is no official support. 57 | 58 | If you find a bug, please let us know by submitting an issue. 59 | 60 | ## Contributing 61 | 62 | Esri welcomes contributions from anyone and everyone. 63 | Please see our [guidelines for contributing](https://github.com/esri/contributing). 64 | 65 | ## Licensing 66 | 67 | Copyright 2022 Esri 68 | 69 | Licensed under the Apache License, Version 2.0 (the "License"); 70 | you may not use this file except in compliance with the License. 
71 | You may obtain a copy of the License at 72 | 73 | http://www.apache.org/licenses/LICENSE-2.0 74 | 75 | Unless required by applicable law or agreed to in writing, software 76 | distributed under the License is distributed on an "AS IS" BASIS, 77 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 78 | See the License for the specific language governing permissions and 79 | limitations under the License. 80 | 81 | A copy of the license is available in the repository's 82 | [LICENSE](LICENSE) file. 83 | -------------------------------------------------------------------------------- /environment.yml: -------------------------------------------------------------------------------- 1 | name: field-maps-scripts 2 | channels: 3 | - Esri 4 | - Esri/label/prerelease 5 | - conda-forge 6 | dependencies: 7 | - arcgis=2.3.* 8 | - cairo>=1.16.0 9 | - gdk-pixbuf>=2.42.2 10 | - jinja2>=2.11.2 11 | - pango>=1.42.4 12 | - pip>=19.1.1 13 | - python>=3.9.0 14 | - flake8>=3.8.3 15 | - pre-commit>=2.7.1 16 | - qrcode>=6.1 17 | - WeasyPrint>=52.2 18 | - pip: 19 | - pyproj==3.7.0 20 | - cartopy>=0.18.0 21 | - pillow==9.4.0 22 | -------------------------------------------------------------------------------- /notebooks/Add Field Maps App ID to ArcGIS Enterprise.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## This notebook will configure your ArcGIS Enterprise portal to allow connections from the ArcGIS Field Maps mobile app (iOS, Android)\n", 8 | "\n", 9 | "### Notes\n", 10 | "- If you are running ArcGIS Enterprise 10.8.1 or higher, this step is not needed\n", 11 | "- ArcGIS Field Maps will not connect to ArcGIS Enterprise portals older than 10.6.1\n", 12 | "\n", 13 | "### Steps\n", 14 | "1. Update parameters for your portal\n", 15 | " 1. Built-in or SAML authentication\n", 16 | " - portalURL\n", 17 | " - adminUser\n", 18 | " 2. 
PKI\n", 19 | " - portalURL\n", 20 | " - certificateFile\n", 21 | "2. Ensure connection method is uncommented based on your authentication\n", 22 | "3. Run notebook\n", 23 | "4. Verify connection by authenticating in ArcGIS Field Maps mobile app" 24 | ] 25 | }, 26 | { 27 | "cell_type": "markdown", 28 | "metadata": {}, 29 | "source": [ 30 | "### Configure parameters" 31 | ] 32 | }, 33 | { 34 | "cell_type": "code", 35 | "execution_count": null, 36 | "metadata": {}, 37 | "outputs": [], 38 | "source": [ 39 | "from arcgis.gis import GIS\n", 40 | "from getpass import getpass\n", 41 | "# Update variables for your portal\n", 42 | "portal_url = \"https:///portal\"\n", 43 | "admin_user = \"adminUser\"\n", 44 | "certificate_file = r\"C:\\Data\\Certificates\\PKI Certs\\creator.pfx\"\n", 45 | "\n", 46 | "# Uncomment appropriate method below\n", 47 | "\n", 48 | "# Portal - built-in or SAML\n", 49 | "target_org = GIS(portal_url, admin_user, verify_cert=False)\n", 50 | "\n", 51 | "# Portal - PKI\n", 52 | "# target_org = GIS(portal_url, cert_file=certificate_file, password=getpass())" 53 | ] 54 | }, 55 | { 56 | "cell_type": "markdown", 57 | "metadata": {}, 58 | "source": [ 59 | "### Update portal" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "metadata": {}, 66 | "outputs": [], 67 | "source": [ 68 | "app_properties = {'type': 'Application','title': 'Field Maps','tags': 'fieldmaps'}\n", 69 | "\n", 70 | "# Add new application item\n", 71 | "fm_application_item = target_org.content.add(item_properties=app_properties)\n", 72 | "\n", 73 | "# Add delete protection to new item\n", 74 | "fm_application_item.protect(True)\n", 75 | "\n", 76 | "# Register new application\n", 77 | "fm_application_item.register(\"multiple\", [\"urn:ietf:wg:oauth:2.0:oob\",\"arcgis-fieldmaps://auth/\", \"arcgis-fieldmaps-beta://auth/\"])\n", 78 | "\n", 79 | "# Obtain app info\n", 80 | "app_info_dict = fm_application_item.app_info\n", 81 | "\n", 82 | "# Get current client ID\n", 83 | "client_id = 
app_info_dict['client_id']\n", 84 | "\n", 85 | "# Update to fieldmaps client ID\n", 86 | "target_org.admin.security.oauth.update(client_id, \"fieldmaps\")" 87 | ] 88 | } 89 | ], 90 | "metadata": { 91 | "kernelspec": { 92 | "display_name": "Python 3", 93 | "language": "python", 94 | "name": "python3" 95 | }, 96 | "language_info": { 97 | "codemirror_mode": { 98 | "name": "ipython", 99 | "version": 3 100 | }, 101 | "file_extension": ".py", 102 | "mimetype": "text/x-python", 103 | "name": "python", 104 | "nbconvert_exporter": "python", 105 | "pygments_lexer": "ipython3", 106 | "version": "3.8.1" 107 | } 108 | }, 109 | "nbformat": 4, 110 | "nbformat_minor": 2 111 | } 112 | -------------------------------------------------------------------------------- /notebooks/Add GPS Metadata Fields.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "### Add GPS Metadata Fields\n", 8 | "\n", 9 | "This notebook is designed for use in ArcGIS Pro 2.9+. See [this topic](https://pro.arcgis.com/en/pro-app/2.8/arcpy/get-started/pro-notebooks.htm) for information on how to use Jupyter notebooks in ArcGIS Pro.\n", 10 | "\n", 11 | "- Use this script to update an existing line or polygon feature class to support GPS metadata\n", 12 | "- This adds a series of fields, necessary coded value domains, and enables attachments\n", 13 | "\n", 14 | "### Steps\n", 15 | "1. Update the `input_feature_class` variable\n", 16 | "2. 
Run all cells" 17 | ] 18 | }, 19 | { 20 | "cell_type": "code", 21 | "execution_count": 70, 22 | "metadata": {}, 23 | "outputs": [], 24 | "source": [ 25 | "# update to location of line or polygon feature class\n", 26 | "input_feature_class = r\"C:\\Users\\doug-m\\Documents\\ArcGIS\\Projects\\MyProject\\MyGDB.gdb\\service_lines\"" 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": 71, 32 | "metadata": {}, 33 | "outputs": [], 34 | "source": [ 35 | "import arcpy\n", 36 | "import os\n", 37 | "\n", 38 | "def get_geodatabase_path(input_feature_class):\n", 39 | " \"\"\"\n", 40 | " Gets the parent geodatabase of the layer\n", 41 | " :param input_feature_class: (string) The feature layer to get the parent database of\n", 42 | " :return: (string) The path to the geodatabase\n", 43 | " \"\"\"\n", 44 | " workspace = os.path.dirname(arcpy.Describe(input_feature_class).catalogPath)\n", 45 | " if os.path.splitext(workspace)[1] in ('.gdb', '.mdb', '.sde'):\n", 46 | " return workspace\n", 47 | " else:\n", 48 | " return os.path.dirname(workspace)" 49 | ] 50 | }, 51 | { 52 | "cell_type": "code", 53 | "execution_count": 72, 54 | "metadata": {}, 55 | "outputs": [], 56 | "source": [ 57 | "def check_and_create_domains(geodatabase):\n", 58 | " \"\"\"\n", 59 | " Checks if the Fix Type domain already exists.\n", 60 | " If the domains do not exist, they are created\n", 61 | " :param geodatabase: (string) the path to the geodatabase to check\n", 62 | " :return:\n", 63 | " \"\"\"\n", 64 | " domains = arcpy.da.ListDomains(geodatabase)\n", 65 | " domain_names = [domain.name.lower() for domain in domains]\n", 66 | " if 'esri_fix_type_domain' in domain_names:\n", 67 | " for domain in domains:\n", 68 | " if domain.name.lower() == 'esri_fix_type_domain':\n", 69 | " # check if cvs 0,1,2,4,5 are in the codedValues\n", 70 | " values = [cv for cv in domain.codedValues]\n", 71 | " if not {0, 1, 2, 4, 5}.issubset(values):\n", 72 | " return 
\"ESRI_FIX_TYPE_DOMAIN is missing a coded value pair.\"\n", 73 | " else:\n", 74 | " # Add the domain and values\n", 75 | " \n", 76 | " arcpy.CreateDomain_management(in_workspace=geodatabase,\n", 77 | " domain_name=\"ESRI_FIX_TYPE_DOMAIN\",\n", 78 | " domain_description=\"Fix Type\",\n", 79 | " field_type=\"SHORT\",\n", 80 | " domain_type=\"CODED\",\n", 81 | " split_policy=\"DEFAULT\",\n", 82 | " merge_policy=\"DEFAULT\")\n", 83 | "\n", 84 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 85 | " domain_name=\"ESRI_FIX_TYPE_DOMAIN\",\n", 86 | " code=\"0\",\n", 87 | " code_description=\"Fix not valid\")\n", 88 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 89 | " domain_name=\"ESRI_FIX_TYPE_DOMAIN\",\n", 90 | " code=\"1\",\n", 91 | " code_description=\"GPS\")\n", 92 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 93 | " domain_name=\"ESRI_FIX_TYPE_DOMAIN\",\n", 94 | " code=\"2\",\n", 95 | " code_description=\"Differential GPS\")\n", 96 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 97 | " domain_name=\"ESRI_FIX_TYPE_DOMAIN\",\n", 98 | " code=\"4\",\n", 99 | " code_description=\"RTK Fixed\")\n", 100 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 101 | " domain_name=\"ESRI_FIX_TYPE_DOMAIN\",\n", 102 | " code=\"5\",\n", 103 | " code_description=\"RTK Float\")" 104 | ] 105 | }, 106 | { 107 | "cell_type": "code", 108 | "execution_count": 73, 109 | "metadata": {}, 110 | "outputs": [], 111 | "source": [ 112 | "def add_gnss_fields(feature_class):\n", 113 | " \"\"\"\n", 114 | " This adds specific fields required for GPS units to\n", 115 | " auto-populate in the Field Maps application\n", 116 | " This will report errors if:\n", 117 | " 1) Any of the fields already exist\n", 118 | " 2) The input layer is not a line or polygon layer\n", 119 | " 3) The layer is not found\n", 120 | " 4) The layer is a shapefile\n", 121 | " \"\"\"\n", 122 | "\n", 123 | " if 
not arcpy.Exists(feature_class):\n", 124 | " return \"Feature layer: {} not found!\".format(feature_class)\n", 125 | " if arcpy.Describe(feature_class).shapeType != \"Polyline\" and arcpy.Describe(feature_class).shapeType != \"Polygon\":\n", 126 | " return \"Feature layer: {} is not a line or polygon layer. Use AddGPSMetadataFields_management instead.\".format(feature_class)\n", 127 | " if arcpy.Describe(feature_class).dataType == \"ShapeFile\":\n", 128 | " return \"ShapeFiles are not supported.\"\n", 129 | "\n", 130 | " # Check the domains to see if they exist and are valid\n", 131 | " # will update if necessary\n", 132 | " geodatabase = get_geodatabase_path(feature_class)\n", 133 | " check_and_create_domains(geodatabase) \n", 134 | "\n", 135 | " # Enable Attachments\n", 136 | " arcpy.EnableAttachments_management(feature_class)\n", 137 | "\n", 138 | " # Add GNSS metadata fields\n", 139 | " existing_fields = [field.name.lower() for field in arcpy.ListFields(feature_class)]\n", 140 | "\n", 141 | " if 'esrignss_avg_h_rms' not in existing_fields:\n", 142 | " arcpy.AddField_management(feature_class,\n", 143 | " 'esrignss_avg_h_rms',\n", 144 | " field_type=\"DOUBLE\",\n", 145 | " field_alias='Average horizontal accuracy (m)',\n", 146 | " field_is_nullable=\"NULLABLE\"\n", 147 | " )\n", 148 | "\n", 149 | " if 'esrignss_avg_v_rms' not in existing_fields:\n", 150 | " arcpy.AddField_management(feature_class,\n", 151 | " 'esrignss_avg_v_rms',\n", 152 | " field_type=\"DOUBLE\",\n", 153 | " field_alias='Average vertical accuracy (m)',\n", 154 | " field_is_nullable=\"NULLABLE\"\n", 155 | " )\n", 156 | "\n", 157 | " if 'esrignss_worst_h_rms' not in existing_fields:\n", 158 | " arcpy.AddField_management(feature_class,\n", 159 | " 'esrignss_worst_h_rms',\n", 160 | " field_type=\"DOUBLE\",\n", 161 | " field_alias='Worst horizontal accuracy (m)',\n", 162 | " field_is_nullable=\"NULLABLE\"\n", 163 | " )\n", 164 | "\n", 165 | " if 'esrignss_worst_v_rms' not in existing_fields:\n", 166 
| " arcpy.AddField_management(feature_class,\n", 167 | " 'esrignss_worst_v_rms',\n", 168 | " field_type=\"DOUBLE\",\n", 169 | " field_alias='Worst vertical accuracy (m)',\n", 170 | " field_is_nullable=\"NULLABLE\"\n", 171 | " )\n", 172 | "\n", 173 | " if 'esrignss_worst_fixtype' not in existing_fields:\n", 174 | " arcpy.AddField_management(feature_class,\n", 175 | " 'esrignss_worst_fixtype',\n", 176 | " field_type=\"SHORT\",\n", 177 | " field_alias='Worst Fix Type',\n", 178 | " field_is_nullable=\"NULLABLE\",\n", 179 | " field_domain=\"ESRI_FIX_TYPE_DOMAIN\"\n", 180 | " )\n", 181 | "\n", 182 | " if 'esrignss_manual_locations' not in existing_fields:\n", 183 | " arcpy.AddField_management(feature_class,\n", 184 | " 'esrignss_manual_locations',\n", 185 | " field_type=\"LONG\",\n", 186 | " field_alias='Number of manual locations',\n", 187 | " field_is_nullable=\"NULLABLE\"\n", 188 | " )\n", 189 | "\n", 190 | " # Update GNSS metadata fields with Domains\n", 191 | " domain_fields = [field for field in arcpy.ListFields(feature_class) if field.name == 'esrignss_worst_fixtype']\n", 192 | "\n", 193 | " for field in domain_fields:\n", 194 | " if field.name == 'esrignss_worst_fixtype' and not field.domain:\n", 195 | " arcpy.AssignDomainToField_management(feature_class, field, 'ESRI_FIX_TYPE_DOMAIN')\n", 196 | "\n", 197 | " return \"Successfully added offset metadata fields and domains.\"" 198 | ] 199 | }, 200 | { 201 | "cell_type": "code", 202 | "execution_count": null, 203 | "metadata": {}, 204 | "outputs": [], 205 | "source": [ 206 | "add_gnss_fields(input_feature_class)" 207 | ] 208 | } 209 | ], 210 | "metadata": { 211 | "interpreter": { 212 | "hash": "2f7d5e7f44707441fdf59f9ea9e91257aa419b0f47527e9654c7357783801722" 213 | }, 214 | "kernelspec": { 215 | "display_name": "ArcGISPro", 216 | "language": "Python", 217 | "name": "python3" 218 | }, 219 | "language_info": { 220 | "file_extension": ".py", 221 | "name": "python", 222 | "version": "3.7.13" 223 | } 224 | }, 225 | 
"nbformat": 4, 226 | "nbformat_minor": 2 227 | } 228 | -------------------------------------------------------------------------------- /notebooks/Add Offset Metadata Fields.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "### Add Offset Metadata fields\n", 8 | "\n", 9 | "This notebook is designed for use in ArcGIS Pro 2.9+. See [this topic](https://pro.arcgis.com/en/pro-app/2.8/arcpy/get-started/pro-notebooks.htm) for information on how to use Jupyter notebooks in ArcGIS Pro.\n", 10 | "\n", 11 | "- Prior to running this script you should ensure that the [Add GPS Metadata Fields](https://pro.arcgis.com/en/pro-app/latest/tool-reference/data-management/add-gps-metadata-fields.htm) Geoprocessing tool has been run on the `input_feature_class`. \n", 12 | "- Use this script to update an existing point feature class to support offset workflows\n", 13 | "- This adds a series of fields and necessary coded value domains\n", 14 | "\n", 15 | "### Steps\n", 16 | "1. Update the `input_feature_class` variable\n", 17 | "2. 
Run all cells" 18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": 70, 23 | "metadata": {}, 24 | "outputs": [], 25 | "source": [ 26 | "# update to location of point feature class\n", 27 | "input_feature_class = r\"C:\\Users\\doug-m\\Documents\\ArcGIS\\Projects\\MyProject\\MyGDB.gdb\\water_valves\"" 28 | ] 29 | }, 30 | { 31 | "cell_type": "code", 32 | "execution_count": 71, 33 | "metadata": {}, 34 | "outputs": [], 35 | "source": [ 36 | "import arcpy\n", 37 | "import os\n", 38 | "\n", 39 | "def get_geodatabase_path(input_feature_class):\n", 40 | "    \"\"\"\n", 41 | "    Gets the parent geodatabase of the layer\n", 42 | "    :param input_feature_class: (string) The feature layer to get the parent database of\n", 43 | "    :return: (string) The path to the geodatabase\n", 44 | "    \"\"\"\n", 45 | "    workspace = os.path.dirname(arcpy.Describe(input_feature_class).catalogPath)\n", 46 | "    if os.path.splitext(workspace)[1] in ('.gdb', '.mdb', '.sde'):\n", 47 | "        return workspace\n", 48 | "    else:\n", 49 | "        return os.path.dirname(workspace)" 50 | ] 51 | }, 52 | { 53 | "cell_type": "code", 54 | "execution_count": 72, 55 | "metadata": {}, 56 | "outputs": [], 57 | "source": [ 58 | "def check_and_create_domains(geodatabase):\n", 59 | "    \"\"\"\n", 60 | "    Checks if the Fix Type domain already exists.\n", 61 | "    If the domains do not exist, they are created\n", 62 | "    :param geodatabase: (string) the path to the geodatabase to check\n", 63 | "    :return:\n", 64 | "    \"\"\"\n", 65 | "    domains = arcpy.da.ListDomains(geodatabase)\n", 66 | "    domain_names = [domain.name.lower() for domain in domains]\n", 67 | "    if 'esri_fix_type_domain' in domain_names:\n", 68 | "        for domain in domains:\n", 69 | "            if domain.name.lower() == 'esri_fix_type_domain':\n", 70 | "                # check if coded values 0,1,2,4,5 are in the codedValues\n", 71 | "                values = [cv for cv in domain.codedValues]\n", 72 | "                if not {0, 1, 2, 4, 5}.issubset(values):\n", 73 | "                    return 
\"esri_fix_type_domain is missing a coded value pair.\"\n", 74 | " else:\n", 75 | " # Add the domain and values\n", 76 | " \n", 77 | " arcpy.CreateDomain_management(in_workspace=geodatabase,\n", 78 | " domain_name=\"esri_fix_type_domain\",\n", 79 | " domain_description=\"Fix Type\",\n", 80 | " field_type=\"SHORT\",\n", 81 | " domain_type=\"CODED\",\n", 82 | " split_policy=\"DEFAULT\",\n", 83 | " merge_policy=\"DEFAULT\")\n", 84 | "\n", 85 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 86 | " domain_name=\"esri_fix_type_domain\",\n", 87 | " code=\"0\",\n", 88 | " code_description=\"Fix not valid\")\n", 89 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 90 | " domain_name=\"esri_fix_type_domain\",\n", 91 | " code=\"1\",\n", 92 | " code_description=\"GPS\")\n", 93 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 94 | " domain_name=\"esri_fix_type_domain\",\n", 95 | " code=\"2\",\n", 96 | " code_description=\"Differential GPS\")\n", 97 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 98 | " domain_name=\"esri_fix_type_domain\",\n", 99 | " code=\"4\",\n", 100 | " code_description=\"RTK Fixed\")\n", 101 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 102 | " domain_name=\"esri_fix_type_domain\",\n", 103 | " code=\"5\",\n", 104 | " code_description=\"RTK Float\")\n", 105 | " \n", 106 | " if 'esri_offset_method_domain' in domain_names:\n", 107 | " for domain in domains:\n", 108 | " if domain.name.lower() == 'esri_offset_method_domain':\n", 109 | " # check if offset methods are in the codedValues\n", 110 | " values = [cv for cv in domain.codedValues]\n", 111 | " if not {'azimuth-azimuth', \n", 112 | " 'distance-azimuth', \n", 113 | " 'distance-distance', \n", 114 | " 'distance-angle', \n", 115 | " 'azimuth-azimuth-backsight', \n", 116 | " 'distance-azimuth-backsight',\n", 117 | " 'distance-distance-backsight',\n", 118 | " 
'user-defined'}.issubset(values):\n", 119 | " return \"esri_offset_method_domain is missing a coded value pair.\"\n", 120 | " else:\n", 121 | " # Add the domain and values\n", 122 | " \n", 123 | " arcpy.CreateDomain_management(in_workspace=geodatabase,\n", 124 | " domain_name=\"esri_offset_method_domain\",\n", 125 | " domain_description=\"Offset Method Domain\",\n", 126 | " field_type=\"TEXT\",\n", 127 | " domain_type=\"CODED\",\n", 128 | " split_policy=\"DEFAULT\",\n", 129 | " merge_policy=\"DEFAULT\")\n", 130 | "\n", 131 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 132 | " domain_name=\"esri_offset_method_domain\",\n", 133 | " code=\"azimuth-azimuth\",\n", 134 | " code_description=\"Azimuth-Azimuth\")\n", 135 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 136 | " domain_name=\"esri_offset_method_domain\",\n", 137 | " code=\"distance-angle\",\n", 138 | " code_description=\"Distance-Angle\")\n", 139 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 140 | " domain_name=\"esri_offset_method_domain\",\n", 141 | " code=\"distance-azimuth\",\n", 142 | " code_description=\"Distance-Azimuth\")\n", 143 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 144 | " domain_name=\"esri_offset_method_domain\",\n", 145 | " code=\"distance-distance\",\n", 146 | " code_description=\"Distance-Distance\")\n", 147 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 148 | " domain_name=\"esri_offset_method_domain\",\n", 149 | " code=\"azimuth-azimuth-backsight\",\n", 150 | " code_description=\"Azimuth-Azimuth-Backsight\")\n", 151 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 152 | " domain_name=\"esri_offset_method_domain\",\n", 153 | " code=\"distance-azimuth-backsight\",\n", 154 | " code_description=\"Distance-Azimuth-Backsight\")\n", 155 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 156 | " 
domain_name=\"esri_offset_method_domain\",\n", 157 | " code=\"distance-distance-backsight\",\n", 158 | " code_description=\"Distance-Distance-Backsight\")\n", 159 | " arcpy.AddCodedValueToDomain_management(in_workspace=geodatabase,\n", 160 | " domain_name=\"esri_offset_method_domain\",\n", 161 | " code=\"user-defined\",\n", 162 | " code_description=\"User-defined\") " 163 | ] 164 | }, 165 | { 166 | "cell_type": "code", 167 | "execution_count": null, 168 | "metadata": {}, 169 | "outputs": [], 170 | "source": [ 171 | "def add_offset_fields(feature_class):\n", 172 | " \"\"\"\n", 173 | " This adds specific fields required for laser offset workflows\n", 174 | " This will report errors if:\n", 175 | " 1) Any of the fields already exist\n", 176 | " 2) The input layer is not a point layer\n", 177 | " 3) The layer is not found\n", 178 | " 4) The layer is a shapefile\n", 179 | " \"\"\"\n", 180 | "\n", 181 | " if not arcpy.Exists(feature_class):\n", 182 | " return \"Feature layer: {} not found!\".format(feature_class)\n", 183 | " if arcpy.Describe(feature_class).shapeType != \"Point\":\n", 184 | " return \"Feature layer: {} is not a point layer.\".format(feature_class)\n", 185 | " if arcpy.Describe(feature_class).dataType == \"ShapeFile\":\n", 186 | " return \"Shapefiles are not supported.\"\n", 187 | "\n", 188 | " # Check the domains to see if they exist and are valid\n", 189 | " # will update if necessary\n", 190 | " geodatabase = get_geodatabase_path(feature_class)\n", 191 | " check_and_create_domains(geodatabase) \n", 192 | "\n", 193 | " # Add GNSS metadata fields\n", 194 | " existing_fields = [field.name.lower() for field in arcpy.ListFields(feature_class)]\n", 195 | "\n", 196 | " if 'esrisnsr_offset_method' not in existing_fields:\n", 197 | " arcpy.AddField_management(feature_class,\n", 198 | " 'esrisnsr_offset_method',\n", 199 | " field_type=\"TEXT\",\n", 200 | " field_alias='Offset method',\n", 201 | " field_is_nullable=\"NULLABLE\",\n", 202 | " 
field_domain=\"esri_offset_method_domain\"\n", 203 | " )\n", 204 | "\n", 205 | " if 'esrisnsr_offset_device' not in existing_fields:\n", 206 | " arcpy.AddField_management(feature_class,\n", 207 | " 'esrisnsr_offset_device',\n", 208 | " field_type=\"TEXT\",\n", 209 | " field_alias='Rangefinder name',\n", 210 | " field_is_nullable=\"NULLABLE\"\n", 211 | " )\n", 212 | "\n", 213 | " if 'esrignss_antenna_height' not in existing_fields:\n", 214 | " arcpy.AddField_management(feature_class,\n", 215 | " 'esrignss_antenna_height',\n", 216 | " field_type=\"DOUBLE\",\n", 217 | " field_alias='GNSS Antenna height (m)',\n", 218 | " field_is_nullable=\"NULLABLE\"\n", 219 | " )\n", 220 | "\n", 221 | " if 'esrisnsr_offset_laser_height' not in existing_fields:\n", 222 | " arcpy.AddField_management(feature_class,\n", 223 | " 'esrisnsr_offset_laser_height',\n", 224 | " field_type=\"DOUBLE\",\n", 225 | " field_alias='Rangefinder height (m)',\n", 226 | " field_is_nullable=\"NULLABLE\"\n", 227 | " )\n", 228 | "\n", 229 | " if 'esrisnsr_offset_laser_target_ht' not in existing_fields:\n", 230 | " arcpy.AddField_management(feature_class,\n", 231 | " 'esrisnsr_offset_laser_target_ht',\n", 232 | " field_type=\"DOUBLE\",\n", 233 | " field_alias='Laser target height (m)',\n", 234 | " field_is_nullable=\"NULLABLE\"\n", 235 | " )\n", 236 | "\n", 237 | " if 'esrisnsr_offset_mag_var' not in existing_fields:\n", 238 | " arcpy.AddField_management(feature_class,\n", 239 | " 'esrisnsr_offset_mag_var',\n", 240 | " field_type=\"DOUBLE\",\n", 241 | " field_alias='Rangefinder Magnetic variation (°)',\n", 242 | " field_is_nullable=\"NULLABLE\"\n", 243 | " )\n", 244 | " \n", 245 | " if 'esrignss_cp1_latitude' not in existing_fields:\n", 246 | " arcpy.AddField_management(feature_class,\n", 247 | " 'esrignss_cp1_latitude',\n", 248 | " field_type=\"DOUBLE\",\n", 249 | " field_alias='CP1 Latitude',\n", 250 | " field_is_nullable=\"NULLABLE\"\n", 251 | " )\n", 252 | "\n", 253 | " if 'esrignss_cp1_longitude' not in 
existing_fields:\n", 254 | " arcpy.AddField_management(feature_class,\n", 255 | " 'esrignss_cp1_longitude',\n", 256 | " field_type=\"DOUBLE\",\n", 257 | " field_alias='CP1 Longitude',\n", 258 | " field_is_nullable=\"NULLABLE\"\n", 259 | " ) \n", 260 | "\n", 261 | " if 'esrignss_cp1_altitude' not in existing_fields:\n", 262 | " arcpy.AddField_management(feature_class,\n", 263 | " 'esrignss_cp1_altitude',\n", 264 | " field_type=\"DOUBLE\",\n", 265 | " field_alias='CP1 Ellipsoidal height (m)',\n", 266 | " field_is_nullable=\"NULLABLE\"\n", 267 | " )\n", 268 | "\n", 269 | " if 'esrignss_cp1_h_rms' not in existing_fields:\n", 270 | " arcpy.AddField_management(feature_class,\n", 271 | " 'esrignss_cp1_h_rms',\n", 272 | " field_type=\"DOUBLE\",\n", 273 | " field_alias='CP1 Horizontal accuracy (m)',\n", 274 | " field_is_nullable=\"NULLABLE\"\n", 275 | " ) \n", 276 | "\n", 277 | " if 'esrignss_cp1_v_rms' not in existing_fields:\n", 278 | " arcpy.AddField_management(feature_class,\n", 279 | " 'esrignss_cp1_v_rms',\n", 280 | " field_type=\"DOUBLE\",\n", 281 | " field_alias='CP1 Vertical accuracy (m)',\n", 282 | " field_is_nullable=\"NULLABLE\"\n", 283 | " )\n", 284 | "\n", 285 | " if 'esrignss_cp1_fixdatetime' not in existing_fields:\n", 286 | " arcpy.AddField_management(feature_class,\n", 287 | " 'esrignss_cp1_fixdatetime',\n", 288 | " field_type=\"DATE\",\n", 289 | " field_alias='CP1 Fix time',\n", 290 | " field_is_nullable=\"NULLABLE\"\n", 291 | " ) \n", 292 | "\n", 293 | " if 'esrignss_cp1_fixtype' not in existing_fields:\n", 294 | " arcpy.AddField_management(feature_class,\n", 295 | " 'esrignss_cp1_fixtype',\n", 296 | " field_type=\"SHORT\",\n", 297 | " field_alias='CP1 Fix type',\n", 298 | " field_is_nullable=\"NULLABLE\",\n", 299 | " field_domain=\"esri_fix_type_domain\"\n", 300 | " ) \n", 301 | "\n", 302 | " if 'esrignss_cp1_corr_age' not in existing_fields:\n", 303 | " arcpy.AddField_management(feature_class,\n", 304 | " 'esrignss_cp1_corr_age',\n", 305 | " 
field_type=\"DOUBLE\",\n", 306 | " field_alias='CP1 Correction age',\n", 307 | " field_is_nullable=\"NULLABLE\"\n", 308 | " )\n", 309 | "\n", 310 | " if 'esrignss_cp1_stationid' not in existing_fields:\n", 311 | " arcpy.AddField_management(feature_class,\n", 312 | " 'esrignss_cp1_stationid',\n", 313 | " field_type=\"SHORT\",\n", 314 | " field_alias='CP1 Station ID',\n", 315 | " field_is_nullable=\"NULLABLE\"\n", 316 | " ) \n", 317 | "\n", 318 | " if 'esrignss_cp1_numsats' not in existing_fields:\n", 319 | " arcpy.AddField_management(feature_class,\n", 320 | " 'esrignss_cp1_numsats',\n", 321 | " field_type=\"DOUBLE\",\n", 322 | " field_alias='CP1 Number of satellites',\n", 323 | " field_is_nullable=\"NULLABLE\"\n", 324 | " )\n", 325 | "\n", 326 | " if 'esrignss_cp1_pdop' not in existing_fields:\n", 327 | " arcpy.AddField_management(feature_class,\n", 328 | " 'esrignss_cp1_pdop',\n", 329 | " field_type=\"DOUBLE\",\n", 330 | " field_alias='CP1 PDOP',\n", 331 | " field_is_nullable=\"NULLABLE\"\n", 332 | " ) \n", 333 | "\n", 334 | " if 'esrignss_cp1_hdop' not in existing_fields:\n", 335 | " arcpy.AddField_management(feature_class,\n", 336 | " 'esrignss_cp1_hdop',\n", 337 | " field_type=\"DOUBLE\",\n", 338 | " field_alias='CP1 HDOP',\n", 339 | " field_is_nullable=\"NULLABLE\"\n", 340 | " ) \n", 341 | " \n", 342 | " if 'esrignss_cp1_vdop' not in existing_fields:\n", 343 | " arcpy.AddField_management(feature_class,\n", 344 | " 'esrignss_cp1_vdop',\n", 345 | " field_type=\"DOUBLE\",\n", 346 | " field_alias='CP1 VDOP',\n", 347 | " field_is_nullable=\"NULLABLE\"\n", 348 | " ) \n", 349 | "\n", 350 | " if 'esrignss_cp1_avg_h_rms' not in existing_fields:\n", 351 | " arcpy.AddField_management(feature_class,\n", 352 | " 'esrignss_cp1_avg_h_rms',\n", 353 | " field_type=\"DOUBLE\",\n", 354 | " field_alias='CP1 Average horizontal accuracy (m)',\n", 355 | " field_is_nullable=\"NULLABLE\"\n", 356 | " ) \n", 357 | "\n", 358 | " if 'esrignss_cp1_avg_v_rms' not in existing_fields:\n", 359 
| " arcpy.AddField_management(feature_class,\n", 360 | " 'esrignss_cp1_avg_v_rms',\n", 361 | " field_type=\"DOUBLE\",\n", 362 | " field_alias='CP1 Average vertical accuracy (m)',\n", 363 | " field_is_nullable=\"NULLABLE\"\n", 364 | " ) \n", 365 | "\n", 366 | " if 'esrignss_cp1_avg_positions' not in existing_fields:\n", 367 | " arcpy.AddField_management(feature_class,\n", 368 | " 'esrignss_cp1_avg_positions',\n", 369 | " field_type=\"SHORT\",\n", 370 | " field_alias='CP1 Averaged positions',\n", 371 | " field_is_nullable=\"NULLABLE\"\n", 372 | " )\n", 373 | "\n", 374 | " if 'esrignss_cp1_h_stddev' not in existing_fields:\n", 375 | " arcpy.AddField_management(feature_class,\n", 376 | " 'esrignss_cp1_h_stddev',\n", 377 | " field_type=\"DOUBLE\",\n", 378 | " field_alias='CP1 Standard deviation (m)',\n", 379 | " field_is_nullable=\"NULLABLE\"\n", 380 | " ) \n", 381 | "\n", 382 | " if 'esrisnsr_cp1_slope_dist' not in existing_fields:\n", 383 | " arcpy.AddField_management(feature_class,\n", 384 | " 'esrisnsr_cp1_slope_dist',\n", 385 | " field_type=\"DOUBLE\",\n", 386 | " field_alias='CP1 Laser slope distance (m)',\n", 387 | " field_is_nullable=\"NULLABLE\"\n", 388 | " ) \n", 389 | " \n", 390 | " if 'esrisnsr_cp1_inclination' not in existing_fields:\n", 391 | " arcpy.AddField_management(feature_class,\n", 392 | " 'esrisnsr_cp1_inclination',\n", 393 | " field_type=\"DOUBLE\",\n", 394 | " field_alias='CP1 Laser inclination (°)',\n", 395 | " field_is_nullable=\"NULLABLE\"\n", 396 | " ) \n", 397 | " \n", 398 | " if 'esrisnsr_cp1_azimuth' not in existing_fields:\n", 399 | " arcpy.AddField_management(feature_class,\n", 400 | " 'esrisnsr_cp1_azimuth',\n", 401 | " field_type=\"DOUBLE\",\n", 402 | " field_alias='CP1 Laser azimuth (°)',\n", 403 | " field_is_nullable=\"NULLABLE\"\n", 404 | " ) \n", 405 | "\n", 406 | " if 'esrignss_cp2_latitude' not in existing_fields:\n", 407 | " arcpy.AddField_management(feature_class,\n", 408 | " 'esrignss_cp2_latitude',\n", 409 | " 
field_type=\"DOUBLE\",\n", 410 | " field_alias='CP2 Latitude',\n", 411 | " field_is_nullable=\"NULLABLE\"\n", 412 | " )\n", 413 | "\n", 414 | " if 'esrignss_cp2_longitude' not in existing_fields:\n", 415 | " arcpy.AddField_management(feature_class,\n", 416 | " 'esrignss_cp2_longitude',\n", 417 | " field_type=\"DOUBLE\",\n", 418 | " field_alias='CP2 Longitude',\n", 419 | " field_is_nullable=\"NULLABLE\"\n", 420 | " ) \n", 421 | "\n", 422 | " if 'esrignss_cp2_altitude' not in existing_fields:\n", 423 | " arcpy.AddField_management(feature_class,\n", 424 | " 'esrignss_cp2_altitude',\n", 425 | " field_type=\"DOUBLE\",\n", 426 | " field_alias='CP2 Ellipsoidal height (m)',\n", 427 | " field_is_nullable=\"NULLABLE\"\n", 428 | " )\n", 429 | "\n", 430 | " if 'esrignss_cp2_h_rms' not in existing_fields:\n", 431 | " arcpy.AddField_management(feature_class,\n", 432 | " 'esrignss_cp2_h_rms',\n", 433 | " field_type=\"DOUBLE\",\n", 434 | " field_alias='CP2 Horizontal accuracy (m)',\n", 435 | " field_is_nullable=\"NULLABLE\"\n", 436 | " ) \n", 437 | "\n", 438 | " if 'esrignss_cp2_v_rms' not in existing_fields:\n", 439 | " arcpy.AddField_management(feature_class,\n", 440 | " 'esrignss_cp2_v_rms',\n", 441 | " field_type=\"DOUBLE\",\n", 442 | " field_alias='CP2 Vertical accuracy (m)',\n", 443 | " field_is_nullable=\"NULLABLE\"\n", 444 | " )\n", 445 | "\n", 446 | " if 'esrignss_cp2_fixdatetime' not in existing_fields:\n", 447 | " arcpy.AddField_management(feature_class,\n", 448 | " 'esrignss_cp2_fixdatetime',\n", 449 | " field_type=\"DATE\",\n", 450 | " field_alias='CP2 Fix time',\n", 451 | " field_is_nullable=\"NULLABLE\"\n", 452 | " ) \n", 453 | "\n", 454 | " if 'esrignss_cp2_fixtype' not in existing_fields:\n", 455 | " arcpy.AddField_management(feature_class,\n", 456 | " 'esrignss_cp2_fixtype',\n", 457 | " field_type=\"SHORT\",\n", 458 | " field_alias='CP2 Fix type',\n", 459 | " field_is_nullable=\"NULLABLE\",\n", 460 | " field_domain=\"esri_fix_type_domain\"\n", 461 | " ) \n", 462 
| "\n", 463 | " if 'esrignss_cp2_corr_age' not in existing_fields:\n", 464 | " arcpy.AddField_management(feature_class,\n", 465 | " 'esrignss_cp2_corr_age',\n", 466 | " field_type=\"DOUBLE\",\n", 467 | " field_alias='CP2 Correction age',\n", 468 | " field_is_nullable=\"NULLABLE\"\n", 469 | " )\n", 470 | "\n", 471 | " if 'esrignss_cp2_stationid' not in existing_fields:\n", 472 | " arcpy.AddField_management(feature_class,\n", 473 | " 'esrignss_cp2_stationid',\n", 474 | " field_type=\"SHORT\",\n", 475 | " field_alias='CP2 Station ID',\n", 476 | " field_is_nullable=\"NULLABLE\"\n", 477 | " ) \n", 478 | "\n", 479 | " if 'esrignss_cp2_numsats' not in existing_fields:\n", 480 | " arcpy.AddField_management(feature_class,\n", 481 | " 'esrignss_cp2_numsats',\n", 482 | " field_type=\"DOUBLE\",\n", 483 | " field_alias='CP2 Number of satellites',\n", 484 | " field_is_nullable=\"NULLABLE\"\n", 485 | " )\n", 486 | "\n", 487 | " if 'esrignss_cp2_pdop' not in existing_fields:\n", 488 | " arcpy.AddField_management(feature_class,\n", 489 | " 'esrignss_cp2_pdop',\n", 490 | " field_type=\"DOUBLE\",\n", 491 | " field_alias='CP2 PDOP',\n", 492 | " field_is_nullable=\"NULLABLE\"\n", 493 | " ) \n", 494 | "\n", 495 | " if 'esrignss_cp2_hdop' not in existing_fields:\n", 496 | " arcpy.AddField_management(feature_class,\n", 497 | " 'esrignss_cp2_hdop',\n", 498 | " field_type=\"DOUBLE\",\n", 499 | " field_alias='CP2 HDOP',\n", 500 | " field_is_nullable=\"NULLABLE\"\n", 501 | " ) \n", 502 | " \n", 503 | " if 'esrignss_cp2_vdop' not in existing_fields:\n", 504 | " arcpy.AddField_management(feature_class,\n", 505 | " 'esrignss_cp2_vdop',\n", 506 | " field_type=\"DOUBLE\",\n", 507 | " field_alias='CP2 VDOP',\n", 508 | " field_is_nullable=\"NULLABLE\"\n", 509 | " ) \n", 510 | "\n", 511 | " if 'esrignss_cp2_avg_h_rms' not in existing_fields:\n", 512 | " arcpy.AddField_management(feature_class,\n", 513 | " 'esrignss_cp2_avg_h_rms',\n", 514 | " field_type=\"DOUBLE\",\n", 515 | " field_alias='CP2 
Average horizontal accuracy (m)',\n", 516 | " field_is_nullable=\"NULLABLE\"\n", 517 | " ) \n", 518 | "\n", 519 | " if 'esrignss_cp2_avg_v_rms' not in existing_fields:\n", 520 | " arcpy.AddField_management(feature_class,\n", 521 | " 'esrignss_cp2_avg_v_rms',\n", 522 | " field_type=\"DOUBLE\",\n", 523 | " field_alias='CP2 Average vertical accuracy (m)',\n", 524 | " field_is_nullable=\"NULLABLE\"\n", 525 | " ) \n", 526 | "\n", 527 | " if 'esrignss_cp2_avg_positions' not in existing_fields:\n", 528 | " arcpy.AddField_management(feature_class,\n", 529 | " 'esrignss_cp2_avg_positions',\n", 530 | " field_type=\"SHORT\",\n", 531 | " field_alias='CP2 Averaged positions',\n", 532 | " field_is_nullable=\"NULLABLE\"\n", 533 | " )\n", 534 | "\n", 535 | " if 'esrignss_cp2_h_stddev' not in existing_fields:\n", 536 | " arcpy.AddField_management(feature_class,\n", 537 | " 'esrignss_cp2_h_stddev',\n", 538 | " field_type=\"DOUBLE\",\n", 539 | " field_alias='CP2 Standard deviation (m)',\n", 540 | " field_is_nullable=\"NULLABLE\"\n", 541 | " ) \n", 542 | "\n", 543 | " if 'esrisnsr_cp2_slope_dist' not in existing_fields:\n", 544 | " arcpy.AddField_management(feature_class,\n", 545 | " 'esrisnsr_cp2_slope_dist',\n", 546 | " field_type=\"DOUBLE\",\n", 547 | " field_alias='CP2 Laser slope distance (m)',\n", 548 | " field_is_nullable=\"NULLABLE\"\n", 549 | " ) \n", 550 | " \n", 551 | " if 'esrisnsr_cp2_inclination' not in existing_fields:\n", 552 | " arcpy.AddField_management(feature_class,\n", 553 | " 'esrisnsr_cp2_inclination',\n", 554 | " field_type=\"DOUBLE\",\n", 555 | " field_alias='CP2 Laser inclination (°)',\n", 556 | " field_is_nullable=\"NULLABLE\"\n", 557 | " ) \n", 558 | " \n", 559 | " if 'esrisnsr_cp2_azimuth' not in existing_fields:\n", 560 | " arcpy.AddField_management(feature_class,\n", 561 | " 'esrisnsr_cp2_azimuth',\n", 562 | " field_type=\"DOUBLE\",\n", 563 | " field_alias='CP2 Laser azimuth (°)',\n", 564 | " field_is_nullable=\"NULLABLE\"\n", 565 | " )\n", 566 | "\n", 
567 | " # Update GNSS metadata fields with Domains\n", 568 | "\n", 569 | " domain_fields = [field for field in arcpy.ListFields(feature_class) if field.name == 'esrisnsr_offset_method']\n", 570 | "\n", 571 | " for field in domain_fields:\n", 572 | " if field.name == 'esrisnsr_offset_method' and not field.domain:\n", 573 | " arcpy.AssignDomainToField_management(feature_class, field, 'esri_offset_method_domain')\n", 574 | "\n", 575 | " domain_fields = [field for field in arcpy.ListFields(feature_class) if field.name == 'esrignss_cp1_fixtype']\n", 576 | "\n", 577 | " for field in domain_fields:\n", 578 | " if field.name == 'esrignss_cp1_fixtype' and not field.domain:\n", 579 | " arcpy.AssignDomainToField_management(feature_class, field, 'esri_fix_type_domain')\n", 580 | "\n", 581 | " domain_fields = [field for field in arcpy.ListFields(feature_class) if field.name == 'esrignss_cp2_fixtype']\n", 582 | "\n", 583 | " for field in domain_fields:\n", 584 | " if field.name == 'esrignss_cp2_fixtype' and not field.domain:\n", 585 | " arcpy.AssignDomainToField_management(feature_class, field, 'esri_fix_type_domain')\n", 586 | "\n", 587 | " return \"Successfully added offset metadata fields and domains.\"" 588 | ] 589 | }, 590 | { 591 | "cell_type": "code", 592 | "execution_count": null, 593 | "metadata": {}, 594 | "outputs": [], 595 | "source": [ 596 | "add_offset_fields(input_feature_class)" 597 | ] 598 | } 599 | ], 600 | "metadata": { 601 | "interpreter": { 602 | "hash": "2f7d5e7f44707441fdf59f9ea9e91257aa419b0f47527e9654c7357783801722" 603 | }, 604 | "kernelspec": { 605 | "display_name": "ArcGISPro", 606 | "language": "Python", 607 | "name": "python3" 608 | }, 609 | "language_info": { 610 | "file_extension": ".py", 611 | "name": "python", 612 | "version": "3.7.13" 613 | } 614 | }, 615 | "nbformat": 4, 616 | "nbformat_minor": 2 617 | } 618 | -------------------------------------------------------------------------------- /notebooks/Bulk Update Maps for Use in Collector or 
Field Maps.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Bulk update maps for Use in Collector / Field Maps" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "This notebook allows the user to define a search string for webmaps and set the ability to \"Use in Field Maps\" and/or \"Use in Collector\". It will first search for maps based on a string you define, confirm these are the maps you'd like to update, and then update each for Collector / Field Maps use based on parameters you define. Additionally, this notebook can be imported in ArcGIS Online and used as a hosted notebook." 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": null, 20 | "metadata": {}, 21 | "outputs": [], 22 | "source": [ 23 | "from arcgis.gis import GIS\n", 24 | "from arcgis.mapping import WebMap\n", 25 | "gis = GIS(\"home\")\n", 26 | "# gis = GIS(\"https://arcgis.com\",\"\", verify_cert=False) " 27 | ] 28 | }, 29 | { 30 | "cell_type": "markdown", 31 | "metadata": {}, 32 | "source": [ 33 | "### Please set the parameters below. Example parameters have been set for this notebook. Note that we will only search for maps we own in the example queries\n", 34 | "\n", 35 | "In this example query, we'll disable in Field Maps all webmaps that are currently not enabled for Collector. 
This will prevent field workers from accidentally using the wrong map!\n", 36 | "\n", 37 | "For more on writing this search query, please see: https://doc.arcgis.com/en/arcgis-online/reference/advanced-search.htm" 38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "execution_count": null, 43 | "metadata": {}, 44 | "outputs": [], 45 | "source": [ 46 | "# this takes \"enabled\", \"disabled\", or \"no_change\"\n", 47 | "field_maps_usage = \"disabled\"\n", 48 | "collector_usage = \"no_change\"\n", 49 | "webmap_search_query = f\"Hydrant NOT typekeywords: 'Collector' OR typekeywords: 'CollectorDisabled' AND owner:{gis.users.me.username}\"\n", 50 | "\n", 51 | "# Example query which would only update maps currently enabled for Collector\n", 52 | "# webmap_search_query = f\"title:Hydrant AND typekeywords:'Collector' AND owner:{gis.users.me.username}\"\n", 53 | "\n", 54 | "# Example query which would update maps currently enabled for Field Maps\n", 55 | "# webmap_search_query = f\"title:Hydrant AND NOT typekeywords:'FieldMapsDisabled' AND owner:{gis.users.me.username}\"\n", 56 | "\n", 57 | "valid_entries = [\"enabled\", \"disabled\", \"no_change\"]\n", 58 | "if field_maps_usage not in valid_entries or collector_usage not in valid_entries:\n", 59 | "    raise Exception(\"Check your usage parameters are set correctly\")" 60 | ] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "metadata": {}, 65 | "source": [ 66 | "### Now, let's search for the webmaps that we're going to update" 67 | ] 68 | }, 69 | { 70 | "cell_type": "code", 71 | "execution_count": null, 72 | "metadata": {}, 73 | "outputs": [], 74 | "source": [ 75 | "webmaps = gis.content.search(webmap_search_query, item_type='Web Map', max_items=1000, outside_org=False)\n", 76 | "print(\"\\n\")\n", 77 | "if len(webmaps) > 0:\n", 78 | "    yn = (input(str(len(webmaps)) + ' web maps found. Are you sure you want to work on all these web maps? 
(y/N): ') or 'n').lower().strip()\n", 79 | "    sure = yn in ['yes', 'y', 'ye']\n", 80 | "else:\n", 81 | "    raise Exception(\"No items found with query:\", webmap_search_query)" 82 | ] 83 | }, 84 | { 85 | "cell_type": "markdown", 86 | "metadata": {}, 87 | "source": [ 88 | "### If you want to continue, we'll now update the keywords needed to use Collector and Field Maps\n", 89 | "\n", 90 | "Note here that we update the map twice - once by calling `update()` on the item and once by calling `update()` on the WebMap. The second update call exists to scrub out keywords. So it's possible that your keywords are unchanged despite the modified date of the webmap changing." 91 | ] 92 | }, 93 | { 94 | "cell_type": "code", 95 | "execution_count": null, 96 | "metadata": {}, 97 | "outputs": [], 98 | "source": [ 99 | "if not sure:\n", 100 | "    raise Exception(\"Script terminated. Please check your search string again\")\n", 101 | "else:\n", 102 | "    for wm_item in webmaps:\n", 103 | "        print(\"Checking whether to update webmap for Field Maps and Collector: \" + wm_item.title)\n", 104 | "        current_keywords = set(wm_item.typeKeywords)\n", 105 | "        if field_maps_usage == \"enabled\":\n", 106 | "            current_keywords.discard(\"FieldMapsDisabled\")\n", 107 | "        elif field_maps_usage == \"disabled\":\n", 108 | "            current_keywords.add(\"FieldMapsDisabled\")\n", 109 | "\n", 110 | "        if collector_usage == \"enabled\":\n", 111 | "            current_keywords.discard(\"CollectorDisabled\")\n", 112 | "        elif collector_usage == \"disabled\":\n", 113 | "            current_keywords.discard(\"Collector\")\n", 114 | "            current_keywords.add(\"CollectorDisabled\")\n", 115 | "\n", 116 | "        if sorted(list(current_keywords)) != sorted(wm_item.typeKeywords):\n", 117 | "            print(\"Keywords changed for \" + wm_item.title + \". 
Updating webmap\")\n", 118 | "            try:\n", 119 | "                wm_item.update(item_properties={\"typeKeywords\": list(current_keywords)})\n", 120 | "            except Exception as e:\n", 121 | "                print(e)\n", 122 | "            wm = WebMap(wm_item)\n", 123 | "            # remove bad Collector typekeywords\n", 124 | "            wm.update()\n", 125 | "        else:\n", 126 | "            print(\"Skipped updating webmap for \" + wm_item.title)\n", 127 | "        print(\"\\n\")\n", 128 | "    \n", 129 | "print(\"Done!\")" 130 | ] 131 | } 132 | ], 133 | "metadata": { 134 | "kernelspec": { 135 | "display_name": "Python 3", 136 | "language": "python", 137 | "name": "python3" 138 | }, 139 | "language_info": { 140 | "codemirror_mode": { 141 | "name": "ipython", 142 | "version": 3 143 | }, 144 | "file_extension": ".py", 145 | "mimetype": "text/x-python", 146 | "name": "python", 147 | "nbconvert_exporter": "python", 148 | "pygments_lexer": "ipython3", 149 | "version": "3.8.2" 150 | } 151 | }, 152 | "nbformat": 4, 153 | "nbformat_minor": 4 154 | } -------------------------------------------------------------------------------- /notebooks/Configure Search.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "### Search for assets and observations in ArcGIS Field Maps\n", 8 | "\n", 9 | "This notebook demonstrates how you can configure search in the layers of a web map. 
The search is then honored in apps, including ArcGIS Field Maps.\n" 10 | ] 11 | }, 12 | { 13 | "cell_type": "markdown", 14 | "metadata": {}, 15 | "source": [ 16 | "#### Connect to your ArcGIS Online or ArcGIS Enterprise organization" 17 | ] 18 | }, 19 | { 20 | "cell_type": "code", 21 | "execution_count": null, 22 | "metadata": { 23 | "scrolled": true 24 | }, 25 | "outputs": [], 26 | "source": [ 27 | "import json\n", 28 | "\n", 29 | "from arcgis.gis import GIS\n", 30 | "from arcgis.mapping import WebMap\n", 31 | "\n", 32 | "print('Enter your ArcGIS account user name: ')\n", 33 | "username = input()\n", 34 | "gis = GIS('https://www.arcgis.com', username)\n", 35 | "print('Connected to {}'.format(gis.properties.portalHostname))" 36 | ] 37 | }, 38 | { 39 | "cell_type": "markdown", 40 | "metadata": {}, 41 | "source": [ 42 | "#### Get the web map\n", 43 | "\n", 44 | "Search by ID for the web map where you want to configure layer search. " 45 | ] 46 | }, 47 | { 48 | "cell_type": "code", 49 | "execution_count": null, 50 | "metadata": {}, 51 | "outputs": [], 52 | "source": [ 53 | "\n", 54 | "# update below with web map id\n", 55 | "webmap_item_no_search = gis.content.get('')" 56 | ] 57 | }, 58 | { 59 | "cell_type": "markdown", 60 | "metadata": {}, 61 | "source": [ 62 | "#### Add search to a feature layer\n", 63 | "\n", 64 | "If the map already has feature search configured, add another search to it. If not, insert the first feature search." 
65 | ] 66 | }, 67 | { 68 | "cell_type": "code", 69 | "execution_count": null, 70 | "metadata": {}, 71 | "outputs": [], 72 | "source": [ 73 | "def add_layer_search(webmap_item, layer_url, field_name, exact_match, esri_field_type):\n", 74 | " json_data = webmap_item.get_data()\n", 75 | " \n", 76 | " # insert placeholder into JSON if not present\n", 77 | " if 'applicationProperties' not in json_data:\n", 78 | " placeholder_app_properties = {'viewing': {'search': {'enabled': True,'disablePlaceFinder': False,'hintText': 'Place or Address','layers': [],'tables': []}}}\n", 79 | " json_data['applicationProperties'] = placeholder_app_properties\n", 80 | " \n", 81 | " app_properties_viewing = json_data['applicationProperties']['viewing']\n", 82 | "\n", 83 | " # Get the ID of the layer to search, based on the URL you know it has\n", 84 | " webmap = WebMap(webmap_item)\n", 85 | " layers = webmap.layers\n", 86 | " layer_id = None\n", 87 | " for layer in layers:\n", 88 | " if layer.url == layer_url:\n", 89 | " layer_id = layer.id\n", 90 | " \n", 91 | " if layer_id is None:\n", 92 | " print('Layer URL not found: ' + layer_url, end='\\n\\n')\n", 93 | " return\n", 94 | " \n", 95 | " if ('search' in app_properties_viewing):\n", 96 | " \n", 97 | " search_layers_property = app_properties_viewing['search']['layers']\n", 98 | " new_layer = {'id': layer_id, 'field': {'name': field_name, 'exactMatch': exact_match, 'type': esri_field_type}}\n", 99 | " search_layers_property.append(new_layer)\n", 100 | " webmap_item.update(data = json.dumps(json_data))\n", 101 | " print(\"Added another search in '\" + webmap_item.title + \"'\", end='\\n\\n')\n", 102 | " " 103 | ] 104 | }, 105 | { 106 | "cell_type": "markdown", 107 | "metadata": {}, 108 | "source": [ 109 | "#### Disable place search\n", 110 | "\n", 111 | "If you don't want mobile workers searching for generic places and addresses, disable place search." 
112 | ] 113 | }, 114 | { 115 | "cell_type": "code", 116 | "execution_count": null, 117 | "metadata": {}, 118 | "outputs": [], 119 | "source": [ 120 | "def disable_place_search(webmap_item):\n", 121 | "\n", 122 | " json_data = webmap_item.get_data()\n", 123 | "\n", 124 | " # insert placeholder into JSON if not present\n", 125 | " if 'applicationProperties' not in json_data:\n", 126 | " placeholder_app_properties = {'viewing': {'search': {'enabled': True,'disablePlaceFinder': False,'hintText': 'Place or Address','layers': [],'tables': []}}}\n", 127 | " json_data['applicationProperties'] = placeholder_app_properties\n", 128 | "\n", 129 | " app_properties_viewing = json_data['applicationProperties']['viewing']\n", 130 | "\n", 131 | " if ('search' in app_properties_viewing):\n", 132 | " \n", 133 | " layer_search_property = app_properties_viewing['search']\n", 134 | " disable_place_finder = {'disablePlaceFinder': True}\n", 135 | " layer_search_property.update(disable_place_finder)\n", 136 | " webmap_item.update(data = json.dumps(json_data))\n", 137 | " \n", 138 | " print(\"Disabled place finder for '\" + webmap_item.title + \"'\")" 139 | ] 140 | }, 141 | { 142 | "cell_type": "markdown", 143 | "metadata": {}, 144 | "source": [ 145 | "#### Provide hint text\n", 146 | "\n", 147 | "Check if the map has layer search configured, and if so, provide hint text." 
148 | ] 149 | }, 150 | { 151 | "cell_type": "code", 152 | "execution_count": null, 153 | "metadata": {}, 154 | "outputs": [], 155 | "source": [ 156 | "def update_hint_text(webmap_item,new_hint_text):\n", 157 | "\n", 158 | " json_data = webmap_item.get_data()\n", 159 | "\n", 160 | " # insert placeholder into JSON if not present\n", 161 | " if 'applicationProperties' not in json_data:\n", 162 | " placeholder_app_properties = {'viewing': {'search': {'enabled': True,'disablePlaceFinder': False,'hintText': 'Place or Address','layers': [],'tables': []}}}\n", 163 | " json_data['applicationProperties'] = placeholder_app_properties\n", 164 | " \n", 165 | " app_properties_viewing = json_data['applicationProperties']['viewing']\n", 166 | "\n", 167 | " if ('search' in app_properties_viewing):\n", 168 | " \n", 169 | " layer_search_property = app_properties_viewing['search']\n", 170 | " hint_text_dict = {'hintText': new_hint_text}\n", 171 | " layer_search_property.update(hint_text_dict)\n", 172 | " webmap_item.update(data = json.dumps(json_data))\n", 173 | "\n", 174 | " print(\"Updated hint text for '\" + webmap_item.title + \"'\") " 175 | ] 176 | }, 177 | { 178 | "cell_type": "markdown", 179 | "metadata": {}, 180 | "source": [ 181 | "#### Configure layer and process updates" 182 | ] 183 | }, 184 | { 185 | "cell_type": "code", 186 | "execution_count": null, 187 | "metadata": {}, 188 | "outputs": [], 189 | "source": [ 190 | "webmap_item = webmap_item_no_search\n", 191 | "\n", 192 | "# update below with our layer url and field details\n", 193 | "layer_url = 'https://services.arcgis.com/N4jtru9dctSQR53c/arcgis/rest/services/Hydrant_maintenance/FeatureServer/0'\n", 194 | "\n", 195 | "add_layer_search(webmap_item, layer_url, 'Facility Identifier', True, 'esriFieldTypeInteger') #ID\n", 196 | "add_layer_search(webmap_item, layer_url, 'Creator', False, 'esriFieldTypeString') #Details\n", 197 | "\n", 198 | "update_hint_text(webmap_item,\"Search for hydrant...\")\n", 199 | 
"disable_place_search(webmap_item)" 200 | ] 201 | } 202 | ], 203 | "metadata": { 204 | "kernelspec": { 205 | "display_name": "Python 3.7.13 ('Arcgis')", 206 | "language": "python", 207 | "name": "python3" 208 | }, 209 | "language_info": { 210 | "codemirror_mode": { 211 | "name": "ipython", 212 | "version": 3 213 | }, 214 | "file_extension": ".py", 215 | "mimetype": "text/x-python", 216 | "name": "python", 217 | "nbconvert_exporter": "python", 218 | "pygments_lexer": "ipython3", 219 | "version": "3.7.13" 220 | }, 221 | "vscode": { 222 | "interpreter": { 223 | "hash": "2f7d5e7f44707441fdf59f9ea9e91257aa419b0f47527e9654c7357783801722" 224 | } 225 | } 226 | }, 227 | "nbformat": 4, 228 | "nbformat_minor": 2 229 | } 230 | -------------------------------------------------------------------------------- /notebooks/Create Offline Areas from Feature Layer.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "attachments": {}, 5 | "cell_type": "markdown", 6 | "metadata": {}, 7 | "source": [ 8 | "## Create Offline Areas from Feature Layer\n", 9 | "\n", 10 | "This notebook demonstrates how to create map areas for a webmap based on a polygon feature layer. The resulting map areas can be downloaded and used in ArcGIS Field Maps.\n", 11 | "\n", 12 | "To use this notebook, you need to update variables at the top of the script, including:\n", 13 | "1. `offline_map_item_id` to the item id of the webmap you would like to create areas for\n", 14 | "2. `feature_layer_item_id` to the item id of the feature layer to use as input areas, `feature_layer_id` to specify the specific layer and `area_name_attribute` for the attribute of the layer to use to name new map areas\n", 15 | "3. `output_title_template`, `output_snippet_template`, `output_tags` and `output_folder` to specify properties for the output map areas \n", 16 | "\n", 17 | "This notebook assumes that:\n", 18 | "1. You are the owner of the webmap\n", 19 | "2. 
Your webmap has a raster tile basemap, like Imagery\n", 20 | "3. You are using Enterprise 10.7+ or ArcGIS Online with ArcGIS API for Python v2.3.x\n", 21 | "\n", 22 | "For more information on working with map areas see:\n", 23 | "- [Managing offline map areas](https://developers.arcgis.com/python/guide/managing-offline-map-areas/)\n", 24 | "- [OfflineMapAreaManager API reference](https://developers.arcgis.com/python/api-reference/arcgis.mapping.toc.html#offlinemapareamanager)\n" 25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": null, 30 | "metadata": { 31 | "trusted": true 32 | }, 33 | "outputs": [], 34 | "source": [ 35 | "from arcgis.gis import GIS\n", 36 | "from arcgis.mapping import WebMap\n", 37 | "\n", 38 | "#The item id of the webmap that you want to create offline areas for\n", 39 | "offline_map_item_id = '07a9a65edb07470e9781a5a493f1d92c'\n", 40 | "#The item id of the feature layer to use for the areas you want to create\n", 41 | "#If the feature layer has more than 16 features, only the first 16 features will be queried\n", 42 | "feature_layer_item_id = '12e3f9dadda048e993d504362cf815b4'\n", 43 | "#The id of the layer to use\n", 44 | "feature_layer_id = 0\n", 45 | "#Field name of the attribute to use to name the areas that are created\n", 46 | "area_name_attribute = 'Sextant'\n", 47 | "\n", 48 | "#Properties for the output areas\n", 49 | "output_title_template = '{} Area'\n", 50 | "output_snippet_template = 'A map that contains parks and trails in the {} sextant of Portland, OR, USA.'\n", 51 | "output_tags='test'\n", 52 | "output_folder = 'pdx'\n", 53 | "#Min scale for map areas - World scale\n", 54 | "map_area_min_scale = 147914382\n", 55 | "#Max scale for map areas - Neighborhood scale\n", 56 | "map_area_max_scale = 20000\n", 57 | "\n", 58 | "#Using built-in user\n", 59 | "#For information on different authentication schemes see\n", 60 | "#https://developers.arcgis.com/python/guide/working-with-different-authentication-schemes/\n", 
61 | "#and for protecting your credentials see\n", 62 | "#https://developers.arcgis.com/python/guide/working-with-different-authentication-schemes/#protecting-your-credentials\n", 63 | "gis = GIS('home')\n", 64 | "\n", 65 | "offline_map_item = gis.content.get(offline_map_item_id)\n", 66 | "offline_map = WebMap(offline_map_item)\n", 67 | "\n", 68 | "offline_areas_item = gis.content.get(feature_layer_item_id)\n", 69 | "offline_areas = offline_areas_item.layers[feature_layer_id].query(result_record_count=16, return_all_records=False)\n", 70 | "\n", 71 | "#If recreating areas for the same map, could cleanup and remove existing preplanned areas\n", 72 | "#for ids in offline_map.offline_areas.list():\n", 73 | "# ids.delete()\n", 74 | " \n", 75 | "for offline_area in offline_areas.features:\n", 76 | " area_name = offline_area.attributes[area_name_attribute]\n", 77 | "\n", 78 | " print('Creating offline map area for ' + area_name)\n", 79 | " \n", 80 | " item_prop = {'title': output_title_template.format(area_name),\n", 81 | " 'snippet': output_snippet_template.format(area_name),\n", 82 | " 'tags': [output_tags]}\n", 83 | "\n", 84 | " try: \n", 85 | " map_area = offline_map.offline_areas.create(area=offline_area.geometry,\n", 86 | " item_properties=item_prop,\n", 87 | " folder=output_folder,\n", 88 | " min_scale=map_area_min_scale,\n", 89 | " max_scale=map_area_max_scale)\n", 90 | "\n", 91 | " except:\n", 92 | " print('Failed creating map area for ' + area_name)\n" 93 | ] 94 | } 95 | ], 96 | "metadata": { 97 | "esriNotebookRuntime": { 98 | "notebookRuntimeName": "ArcGIS Notebook Python 3 Standard", 99 | "notebookRuntimeVersion": "7.0" 100 | }, 101 | "kernelspec": { 102 | "display_name": "Python 3 (ipykernel)", 103 | "language": "python", 104 | "name": "python3" 105 | }, 106 | "language_info": { 107 | "codemirror_mode": { 108 | "name": "ipython", 109 | "version": 3 110 | }, 111 | "file_extension": ".py", 112 | "mimetype": "text/x-python", 113 | "name": "python", 114 | 
"nbconvert_exporter": "python", 115 | "pygments_lexer": "ipython3", 116 | "version": "3.9.11" 117 | } 118 | }, 119 | "nbformat": 4, 120 | "nbformat_minor": 2 121 | } 122 | -------------------------------------------------------------------------------- /notebooks/Find features with missing attachments.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Find features with missing attachments\n", 8 | "\n", 9 | "This example shows how to find all features in a feature layer that do not have an attached image. Features that are missing an image are added to a view layer so that crews can go out and re-inspect them." 10 | ] 11 | }, 12 | { 13 | "cell_type": "markdown", 14 | "metadata": {}, 15 | "source": [ 16 | "#### Import the ArcGIS API for Python" 17 | ] 18 | }, 19 | { 20 | "cell_type": "code", 21 | "execution_count": 1, 22 | "metadata": {}, 23 | "outputs": [], 24 | "source": [ 25 | "import arcgis" 26 | ] 27 | }, 28 | { 29 | "cell_type": "markdown", 30 | "metadata": {}, 31 | "source": [ 32 | "#### Connect to ArcGIS Online and get the damage assessment layer and its view layer" 33 | ] 34 | }, 35 | { 36 | "cell_type": "code", 37 | "execution_count": 2, 38 | "metadata": {}, 39 | "outputs": [ 40 | { 41 | "name": "stdout", 42 | "output_type": "stream", 43 | "text": [ 44 | "Enter password: ········\n" 45 | ] 46 | } 47 | ], 48 | "source": [ 49 | "gis = arcgis.gis.GIS(\"https://arcgis.com\", \"aaron_nitro\")\n", 50 | "damage_assessments_layer = gis.content.get('d46499afaa9142c0bd261478d329c027').layers[0]\n", 51 | "damage_assessments_view_layer = gis.content.get('d85d43a523f940079982a076c944405f').layers[2]" 52 | ] 53 | }, 54 | { 55 | "cell_type": "markdown", 56 | "metadata": {}, 57 | "source": [ 58 | "#### Get all damage assessments performed in the last 10 days" 59 | ] 60 | }, 61 | { 62 | "cell_type": "code", 63 | "execution_count": 3, 64 | 
"metadata": {}, 65 | "outputs": [], 66 | "source": [ 67 | "where = f\"{damage_assessments_layer.properties.editFieldsInfo.creationDateField} >= CURRENT_TIMESTAMP - INTERVAL '10' DAY\"\n", 68 | "damage_assessments = damage_assessments_layer.query(where=where)" 69 | ] 70 | }, 71 | { 72 | "cell_type": "markdown", 73 | "metadata": {}, 74 | "source": [ 75 | "#### Create a set of all the globalids" 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": 4, 81 | "metadata": {}, 82 | "outputs": [], 83 | "source": [ 84 | "global_ids = {da.attributes[damage_assessments_layer.properties.globalIdField] for da in damage_assessments}" 85 | ] 86 | }, 87 | { 88 | "cell_type": "markdown", 89 | "metadata": {}, 90 | "source": [ 91 | "#### Query all of the attachments collected within the last 10 days and create a set of their parent feature globalids" 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": 5, 97 | "metadata": {}, 98 | "outputs": [], 99 | "source": [ 100 | "attachments = damage_assessments_layer.attachments.search(where=where, attachment_types=['image/jpeg', 'image/jpg', 'image/png'])\n", 101 | "parent_global_ids = {x['PARENTGLOBALID'] for x in attachments}" 102 | ] 103 | }, 104 | { 105 | "cell_type": "markdown", 106 | "metadata": {}, 107 | "source": [ 108 | "#### Find the globalids that do not overlap, which indicates a feature is missing an attachment" 109 | ] 110 | }, 111 | { 112 | "cell_type": "code", 113 | "execution_count": 6, 114 | "metadata": {}, 115 | "outputs": [], 116 | "source": [ 117 | "global_ids_without_an_image = global_ids.difference(parent_global_ids)" 118 | ] 119 | }, 120 | { 121 | "cell_type": "markdown", 122 | "metadata": {}, 123 | "source": [ 124 | "#### Create a SQL where clause using the globalids" 125 | ] 126 | }, 127 | { 128 | "cell_type": "code", 129 | "execution_count": 8, 130 | "metadata": {}, 131 | "outputs": [ 132 | { 133 | "data": { 134 | "text/plain": [ 135 | "\"GlobalID in 
('c4c03f20-fb04-47b3-88f6-80b6ad1d339b')\"" 136 | ] 137 | }, 138 | "execution_count": 8, 139 | "metadata": {}, 140 | "output_type": "execute_result" 141 | } 142 | ], 143 | "source": [ 144 | "where = f\"\"\"{damage_assessments_layer.properties.globalIdField} in ('{\"','\".join(global_ids_without_an_image)}')\"\"\"\n", 145 | "where" 146 | ] 147 | }, 148 | { 149 | "cell_type": "markdown", 150 | "metadata": {}, 151 | "source": [ 152 | "#### Update the view definition query of the view layer to only show features that are missing attachments" 153 | ] 154 | }, 155 | { 156 | "cell_type": "code", 157 | "execution_count": 9, 158 | "metadata": {}, 159 | "outputs": [ 160 | { 161 | "data": { 162 | "text/plain": [ 163 | "{'success': True}" 164 | ] 165 | }, 166 | "execution_count": 9, 167 | "metadata": {}, 168 | "output_type": "execute_result" 169 | } 170 | ], 171 | "source": [ 172 | "damage_assessments_view_layer.manager.update_definition({\n", 173 | " \"viewDefinitionQuery\": where\n", 174 | "})" 175 | ] 176 | } 177 | ], 178 | "metadata": { 179 | "kernelspec": { 180 | "display_name": "Python 3", 181 | "language": "python", 182 | "name": "python3" 183 | }, 184 | "language_info": { 185 | "codemirror_mode": { 186 | "name": "ipython", 187 | "version": 3 188 | }, 189 | "file_extension": ".py", 190 | "mimetype": "text/x-python", 191 | "name": "python", 192 | "nbconvert_exporter": "python", 193 | "pygments_lexer": "ipython3", 194 | "version": "3.8.6" 195 | } 196 | }, 197 | "nbformat": 4, 198 | "nbformat_minor": 4 199 | } 200 | -------------------------------------------------------------------------------- /notebooks/Generate PDF Report/Generate PDF Report.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "collapsed": true 7 | }, 8 | "source": [ 9 | "# Generate PDF Report\n", 10 | "\n", 11 | "This example shows how to generate an HTML and PDF report for a specific feature 
layer. It demonstrates how to create a report that includes summary statistics, photos, maps, and attribute information. In this example we are going to work with a tree inventory layer that was deployed using this [ArcGIS Solution template](https://www.arcgis.com/apps/solutions/index.html?gallery=true&searchTerm=Tree%20Management&solution=vrrh5yvnj1b8ba2tkfab6kf36c2t18np). \n", 12 | "\n", 13 | "This notebook was originally created as part of DevSummit 2021." 14 | ] 15 | }, 16 | { 17 | "cell_type": "markdown", 18 | "metadata": {}, 19 | "source": [ 20 | "### Connect to our GIS\n", 21 | "This notebook uses the ArcGIS API for Python to interact with feature services. So let's import that library and then use it to connect to our GIS so we can fetch the layer containing the trees." 22 | ] 23 | }, 24 | { 25 | "cell_type": "code", 26 | "execution_count": 1, 27 | "metadata": {}, 28 | "outputs": [], 29 | "source": [ 30 | "import arcgis" 31 | ] 32 | }, 33 | { 34 | "cell_type": "markdown", 35 | "metadata": {}, 36 | "source": [ 37 | "In this example we are going to work with the Trees hosted feature layer that was deployed using this ArcGIS Solution template. Search for this layer in contents and copy and paste the item id (found in the url) below where you see `` " 38 | ] 39 | }, 40 | { 41 | "cell_type": "code", 42 | "execution_count": null, 43 | "metadata": {}, 44 | "outputs": [], 45 | "source": [ 46 | "gis = arcgis.gis.GIS(\"https://www.arcgis.com\", \"\")\n", 47 | "item = gis.content.get(\"\")\n", 48 | "trees_layer = item.layers[0]" 49 | ] 50 | }, 51 | { 52 | "cell_type": "markdown", 53 | "metadata": {}, 54 | "source": [ 55 | "### Generate summary statistics\n", 56 | "The report should contain some summary statistics about each type of tree. 
We are going to make a query to fetch, for each type of tree, the:\n", 57 | "- count\n", 58 | "- average height\n", 59 | "- average diameter" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": 10, 65 | "metadata": {}, 66 | "outputs": [], 67 | "source": [ 68 | "statistics = trees_layer.query(\n", 69 | " group_by_fields_for_statistics=\"COMMONNAME\",\n", 70 | " out_statistics=[\n", 71 | " {\n", 72 | " \"statisticType\": \"count\",\n", 73 | " \"onStatisticField\": \"COMMONNAME\",\n", 74 | " \"outStatisticFieldName\": \"count\",\n", 75 | " },\n", 76 | " {\n", 77 | " \"statisticType\": \"avg\",\n", 78 | " \"onStatisticField\": \"HEIGHT\",\n", 79 | " \"outStatisticFieldName\": \"avg_height\",\n", 80 | " },\n", 81 | " {\n", 82 | " \"statisticType\": \"avg\",\n", 83 | " \"onStatisticField\": \"DIAMETER\",\n", 84 | " \"outStatisticFieldName\": \"avg_diameter\",\n", 85 | " },\n", 86 | " ],\n", 87 | ").features" 88 | ] 89 | }, 90 | { 91 | "cell_type": "markdown", 92 | "metadata": {}, 93 | "source": [ 94 | "### Extract location information from image\n", 95 | "The following two functions detail how the EXIF data is parsed to find the latitude, longitude, and date of the photo. The coordinates are stored as Degrees Minutes Seconds in the EXIF data so an additional helper method is used to convert the coordinates to decimal degrees." 
96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": 11, 101 | "metadata": {}, 102 | "outputs": [], 103 | "source": [ 104 | "from PIL import Image, ImageOps, ImageDraw, ImageFont, ExifTags\n", 105 | "import base64\n", 106 | "import io\n", 107 | "import json\n", 108 | "import datetime\n", 109 | "from PIL.ExifTags import GPSTAGS, TAGS\n", 110 | "%matplotlib agg\n", 111 | "import cartopy.crs as ccrs\n", 112 | "import matplotlib.pyplot as plt\n", 113 | "from cartopy.io.img_tiles import GoogleTiles\n", 114 | "from datetime import datetime as dt" 115 | ] 116 | }, 117 | { 118 | "cell_type": "code", 119 | "execution_count": 12, 120 | "metadata": {}, 121 | "outputs": [], 122 | "source": [ 123 | "def get_location_data(exif_data):\n", 124 | " gps_info = exif_data.get_ifd(ExifTags.IFD.GPSInfo)\n", 125 | " time = dt.now()\n", 126 | " try:\n", 127 | "\n", 128 | " lat_degrees = gps_info[ExifTags.GPS.GPSLatitude][0]\n", 129 | " lat_minutes = gps_info[ExifTags.GPS.GPSLatitude][1]\n", 130 | " lat_seconds = gps_info[ExifTags.GPS.GPSLatitude][2]\n", 131 | " lon_degrees = gps_info[ExifTags.GPS.GPSLongitude][0]\n", 132 | " lon_minutes = gps_info[ExifTags.GPS.GPSLongitude][1]\n", 133 | " lon_seconds = gps_info[ExifTags.GPS.GPSLongitude][2]\n", 134 | "\n", 135 | " lat = float(lat_degrees + lat_minutes / 60 + lat_seconds / 3600)\n", 136 | " lon = float(lon_degrees + lon_minutes / 60 + lon_seconds / 3600)\n", 137 | "\n", 138 | " if gps_info[ExifTags.GPS.GPSLatitudeRef] == \"S\":\n", 139 | " lat = -lat\n", 140 | " if gps_info[ExifTags.GPS.GPSLongitudeRef] == \"W\":\n", 141 | " lon = -lon\n", 142 | " if ExifTags.GPS.GPSTimeStamp in gps_info:\n", 143 | " time = gps_info[ExifTags.GPS.GPSTimeStamp]\n", 144 | "\n", 145 | " except Exception as e:\n", 146 | " lat = 0\n", 147 | " lon = 0\n", 148 | " time = dt.now()\n", 149 | "\n", 150 | " return (lat, lon, time)" 151 | ] 152 | }, 153 | { 154 | "cell_type": "markdown", 155 | "metadata": {}, 156 | "source": [ 157 | "### Process 
Attached Images\n", 158 | "The report is going to contain a list of all trees and any photos that were captured using ArcGIS Field Maps. Once an image has been downloaded, it's going to be processed using the following function. This function will read any EXIF data and orient the image accordingly so it appears correctly in the report. Then it is going to further examine the EXIF data to extract the location and time when the photo was taken. It is going to overlay this information in the image and then return a base64 encoded version of the image that will be embedded in the report.\n", 159 | "\n", 160 | "To accomplish this, it uses the [Pillow](https://pillow.readthedocs.io/en/stable/) library which is a very popular image processing library for Python." 161 | ] 162 | }, 163 | { 164 | "cell_type": "code", 165 | "execution_count": 13, 166 | "metadata": {}, 167 | "outputs": [], 168 | "source": [ 169 | "def process_image(input_file):\n", 170 | " image = Image.open(input_file)\n", 171 | " image = ImageOps.exif_transpose(image)\n", 172 | " lat, long, time = get_location_data(image.getexif())\n", 173 | " image_editable = ImageDraw.Draw(image)\n", 174 | " font = ImageFont.truetype(\"/Library/Fonts/Arial Unicode.ttf\", 24)\n", 175 | " image_editable.text(\n", 176 | " (1, 1),\n", 177 | " f'{lat}, {long} at {time.strftime(\"%Y-%m-%d %H:%M:%S\")}',\n", 178 | " (255, 255, 0),\n", 179 | " font=font,\n", 180 | " )\n", 181 | " io_bytes = io.BytesIO()\n", 182 | " image.save(io_bytes, format=\"JPEG\")\n", 183 | " io_bytes.seek(0)\n", 184 | " return base64.b64encode(io_bytes.read()).decode()" 185 | ] 186 | }, 187 | { 188 | "cell_type": "markdown", 189 | "metadata": {}, 190 | "source": [ 191 | "### Generate a map image\n", 192 | "The report is also going to contain a small map for each tree to show where it is located. 
The [cartopy](https://scitools.org.uk/cartopy/docs/latest/) library is used in conjunction with [matplotlib](https://matplotlib.org/) to generate a single image that plots the location on a map using the Esri World Street Map Tile service. The function returns a base64 encoded image that will be embedded into the report." 193 | ] 194 | }, 195 | { 196 | "cell_type": "code", 197 | "execution_count": 14, 198 | "metadata": {}, 199 | "outputs": [], 200 | "source": [ 201 | "def create_map(feature):\n", 202 | " fig = plt.figure(figsize=(5, 5))\n", 203 | " url = \"https://server.arcgisonline.com/ArcGIS/rest/services/World_Street_Map/MapServer/tile/{z}/{y}/{x}.jpg\"\n", 204 | " image = GoogleTiles(url=url)\n", 205 | " ax = fig.add_subplot(1, 1, 1, projection=image.crs)\n", 206 | " ax.add_image(image, 15)\n", 207 | " xmin = feature[\"geometry\"][\"x\"] - 500\n", 208 | " xmax = feature[\"geometry\"][\"x\"] + 500\n", 209 | " ymin = feature[\"geometry\"][\"y\"] - 500\n", 210 | " ymax = feature[\"geometry\"][\"y\"] + 500\n", 211 | " ax.set_extent([xmin, xmax, ymin, ymax], crs=image.crs)\n", 212 | " ax.scatter(\n", 213 | " feature[\"geometry\"][\"x\"],\n", 214 | " feature[\"geometry\"][\"y\"],\n", 215 | " color=\"green\",\n", 216 | " linewidth=2,\n", 217 | " marker=\"o\",\n", 218 | " )\n", 219 | " io_bytes = io.BytesIO()\n", 220 | " plt.savefig(io_bytes, format=\"jpeg\", bbox_inches=\"tight\")\n", 221 | " io_bytes.seek(0)\n", 222 | " return base64.b64encode(io_bytes.read()).decode()" 223 | ] 224 | }, 225 | { 226 | "cell_type": "markdown", 227 | "metadata": {}, 228 | "source": [ 229 | "### Build a data model for each feature\n", 230 | "Now that we have our helper methods all in place, let's query all of the trees. For each tree, we are going to generate a map image to embed into the report. Then we are going to fetch all of the jpeg images associated with the tree. For each of those images, we are going to process it to extract and overlay the location. 
Each feature will now have a dictionary representing its geometry, a dictionary of its attributes, a list of base64 encoded attachments, and a single base64 encoded image of its location on a map." 231 | ] 232 | }, 233 | { 234 | "cell_type": "code", 235 | "execution_count": 15, 236 | "metadata": {}, 237 | "outputs": [], 238 | "source": [ 239 | "trees = [f.as_dict for f in trees_layer.query(\"1=1\").features]\n", 240 | "for feature in trees:\n", 241 | " feature[\"map\"] = create_map(feature)\n", 242 | " attachments = trees_layer.attachments.search(\n", 243 | " attachment_types=[\"image/jpeg\", \"image/jpg\"],\n", 244 | " object_ids=[feature[\"attributes\"][\"OBJECTID\"]],\n", 245 | " )\n", 246 | " feature[\"attachments\"] = []\n", 247 | " for a in attachments:\n", 248 | " f = trees_layer.attachments.download(\n", 249 | " oid=feature[\"attributes\"][\"OBJECTID\"], attachment_id=a[\"ID\"]\n", 250 | " )[0]\n", 251 | " image = process_image(f)\n", 252 | " feature[\"attachments\"].append(image)" 253 | ] 254 | }, 255 | { 256 | "cell_type": "markdown", 257 | "metadata": {}, 258 | "source": [ 259 | "### Generate the report\n", 260 | "In order to build the report, we are using a templating engine called [jinja2](https://jinja.palletsprojects.com/en/2.11.x/). This templating engine is very popular with various Python web frameworks like Flask. An html template file has been created and we are going to use jinja to substitute in our statistics and list of trees. This will result in a complete html string that could be served as a static webpage if desired. 
" 261 | ] 262 | }, 263 | { 264 | "cell_type": "code", 265 | "execution_count": 16, 266 | "metadata": {}, 267 | "outputs": [], 268 | "source": [ 269 | "from jinja2 import Environment, FileSystemLoader\n", 270 | "\n", 271 | "env = Environment(loader=FileSystemLoader(\".\"))\n", 272 | "template = env.get_template(\"report-template.html\")\n", 273 | "template_variables = {\n", 274 | " \"title\": item.title,\n", 275 | " \"statistics\": [s.attributes for s in statistics],\n", 276 | " \"features\": trees,\n", 277 | "}\n", 278 | "generated_html = template.render(template_variables)" 279 | ] 280 | }, 281 | { 282 | "cell_type": "markdown", 283 | "metadata": {}, 284 | "source": [ 285 | "### Export to PDF\n", 286 | "Finally we are going to convert the html to a PDF file using [weasyprint](https://weasyprint.org/). After this completes you should be able to open the generated PDF and view your report." 287 | ] 288 | }, 289 | { 290 | "cell_type": "code", 291 | "execution_count": 17, 292 | "metadata": {}, 293 | "outputs": [], 294 | "source": [ 295 | "from weasyprint import HTML" 296 | ] 297 | }, 298 | { 299 | "cell_type": "code", 300 | "execution_count": 18, 301 | "metadata": {}, 302 | "outputs": [], 303 | "source": [ 304 | "HTML(string=generated_html).write_pdf(\"report.pdf\")" 305 | ] 306 | }, 307 | { 308 | "cell_type": "markdown", 309 | "metadata": {}, 310 | "source": [ 311 | "### Summary\n", 312 | "This notebook demonstrated how to build a custom PDF report by combining the ArcGIS API for Python and a handful of other open-source tools. By using HTML as a conversion layer, it's very easy to tweak the content and style of the report." 
313 | ] 314 | } 315 | ], 316 | "metadata": { 317 | "kernelspec": { 318 | "display_name": "field-maps-scripts-99999", 319 | "language": "python", 320 | "name": "python3" 321 | }, 322 | "language_info": { 323 | "codemirror_mode": { 324 | "name": "ipython", 325 | "version": 3 326 | }, 327 | "file_extension": ".py", 328 | "mimetype": "text/x-python", 329 | "name": "python", 330 | "nbconvert_exporter": "python", 331 | "pygments_lexer": "ipython3", 332 | "version": "3.11.11" 333 | } 334 | }, 335 | "nbformat": 4, 336 | "nbformat_minor": 1 337 | } 338 | -------------------------------------------------------------------------------- /notebooks/Generate PDF Report/example-report.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Esri/field-maps-scripts/4a5b8293fe92e364f869f5c228efe3d4fcee9388/notebooks/Generate PDF Report/example-report.pdf -------------------------------------------------------------------------------- /notebooks/Generate PDF Report/report-template.html: 1 | 2 | 3 | 4 | 5 | 6 | {{ title }} 7 | 14 | 15 | 16 | 

{{ title }}

17 | 18 |
19 |

Summary Statistics

20 | 21 | 22 | 23 | 24 | {% for stat in statistics %} 25 | 26 | 27 | 28 | 29 | 30 | 31 | {% endfor %} 32 |
Tree NameCountAvg DiameterAvg Height
{{ stat.COMMONNAME }}{{ stat.count }}{{ stat.avg_diameter }}{{ stat.avg_height }}
33 |
34 | 35 |

36 | 37 |
38 |

Individual Trees

39 | 40 | {% for feature in features %} 41 |
42 |
43 | 44 | 45 | 46 | 47 | 48 | 49 |
AttributeValue
Tree ID{{ feature["attributes"]["assetid"] }}
Name{{ feature["attributes"]["commonname"] }}
Diameter{{ feature["attributes"]["diameter"] }}
Height{{ feature["attributes"]["height"] }}
50 |
51 |
52 | 53 |
54 |
55 |
56 |
Photos
57 | {% for attachment in feature["attachments"] %} 58 | 59 | {% endfor %} 60 |
61 | 62 |

63 | {% endfor %} 64 | 65 |
66 | 67 | 68 | -------------------------------------------------------------------------------- /notebooks/Location Sharing Status.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Location Sharing Status\n", 8 | "\n", 9 | "A guide showing how to view details of the location sharing capability in your organization\n", 10 | "\n", 11 | "This guide assumes that:\n", 12 | "1. You are an administrator for your organization\n", 13 | "2. You are using Enterprise 10.7+ or ArcGIS Online" 14 | ] 15 | }, 16 | { 17 | "cell_type": "code", 18 | "execution_count": null, 19 | "metadata": {}, 20 | "outputs": [], 21 | "source": [ 22 | "from arcgis.gis import GIS\n", 23 | "\n", 24 | "#If a site has an invalid SSL certificate or is being accessed via the IP or hostname instead of the name on the certificate, set verify_cert to False in the call below. This will ensure that all SSL certificate issues are ignored\n", 25 | "gis = GIS(\"home\")\n", 26 | "lt = gis.admin.location_tracking" 27 | ] 28 | }, 29 | { 30 | "cell_type": "markdown", 31 | "metadata": {}, 32 | "source": [ 33 | "### The Location Sharing Service" 34 | ] 35 | }, 36 | { 37 | "cell_type": "code", 38 | "execution_count": null, 39 | "metadata": {}, 40 | "outputs": [], 41 | "source": [ 42 | "display(lt.item)" 43 | ] 44 | }, 45 | { 46 | "cell_type": "markdown", 47 | "metadata": {}, 48 | "source": [ 49 | "### Check Current Status of Location Sharing\n", 50 | "Let's quickly check to see the status of location sharing for our organization" 51 | ] 52 | }, 53 | { 54 | "cell_type": "code", 55 | "execution_count": null, 56 | "metadata": {}, 57 | "outputs": [], 58 | "source": [ 59 | "print(f\"Status: {lt.status}\")\n", 60 | "print(f\"Retention period: {lt.retention_period} {lt.retention_period_units} ({'enabled' if lt.retention_period_enabled else 'disabled'})\")" 61 | ] 62 | }, 63 | { 64 | 
"cell_type": "markdown", 65 | "metadata": {}, 66 | "source": [ 67 | "### Check Licenses\n", 68 | "Let's see how many licenses have been assigned.\n", 69 | "For previous versions of licenses you can check for 'Tracker for ArcGIS'." 70 | ] 71 | }, 72 | { 73 | "cell_type": "code", 74 | "execution_count": null, 75 | "metadata": {}, 76 | "outputs": [], 77 | "source": [ 78 | "#gis.admin.license.get('Tracker for ArcGIS').report\n", 79 | "gis.admin.license.get('ArcGIS Location Sharing').report" 80 | ] 81 | }, 82 | { 83 | "cell_type": "markdown", 84 | "metadata": {}, 85 | "source": [ 86 | "### Check Active Users\n", 87 | "Let's see how many people have actually recorded tracks" 88 | ] 89 | }, 90 | { 91 | "cell_type": "code", 92 | "execution_count": null, 93 | "metadata": {}, 94 | "outputs": [], 95 | "source": [ 96 | "users = lt.tracks_layer.query(group_by_fields_for_statistics=\"created_user\", \n", 97 | " out_statistics=[{\"statisticType\": \"count\", \"onStatisticField\": \"objectid\", \"outStatisticFieldName\": \"count\"}],\n", 98 | " as_df=True)\n", 99 | "print(f\"Users with tracks: {len(users)}\")" 100 | ] 101 | }, 102 | { 103 | "cell_type": "markdown", 104 | "metadata": {}, 105 | "source": [ 106 | "Let's see who the top 5 users are (based on how many tracks points they have uploaded)" 107 | ] 108 | }, 109 | { 110 | "cell_type": "code", 111 | "execution_count": null, 112 | "metadata": {}, 113 | "outputs": [], 114 | "source": [ 115 | "users.sort_values(by=['count'], ascending=False).head(5)" 116 | ] 117 | }, 118 | { 119 | "cell_type": "markdown", 120 | "metadata": {}, 121 | "source": [ 122 | "Let's see who the top 5 users were during the last week" 123 | ] 124 | }, 125 | { 126 | "cell_type": "code", 127 | "execution_count": null, 128 | "metadata": {}, 129 | "outputs": [], 130 | "source": [ 131 | "from datetime import datetime, timedelta\n", 132 | "d = datetime.utcnow()-timedelta(days=7)\n", 133 | "users = lt.tracks_layer.query(where=f\"location_timestamp >= timestamp 
'{d.strftime('%Y-%m-%d %H:%M:%S')}'\",\n", 134 | " group_by_fields_for_statistics=\"created_user\", \n", 135 | " out_statistics=[{\"statisticType\": \"count\", \"onStatisticField\": \"objectid\", \"outStatisticFieldName\": \"count\"}],\n", 136 | " as_df=True)\n", 137 | "users.sort_values(by=['count'], ascending=False).head(5)" 138 | ] 139 | }, 140 | { 141 | "cell_type": "markdown", 142 | "metadata": {}, 143 | "source": [ 144 | "### Check total number of tracks\n", 145 | "Let's see how many track points are available" 146 | ] 147 | }, 148 | { 149 | "cell_type": "code", 150 | "execution_count": null, 151 | "metadata": {}, 152 | "outputs": [], 153 | "source": [ 154 | "count = lt.tracks_layer.query(return_count_only=True)\n", 155 | "print(f\"Total Tracks: {count:,}\")" 156 | ] 157 | }, 158 | { 159 | "cell_type": "markdown", 160 | "metadata": {}, 161 | "source": [ 162 | "### Check number of track views\n", 163 | "Let's see how many track views are in my organization" 164 | ] 165 | }, 166 | { 167 | "cell_type": "code", 168 | "execution_count": null, 169 | "metadata": {}, 170 | "outputs": [], 171 | "source": [ 172 | "items = gis.content.search(\"typekeywords:'Location Tracking View'\", max_items=10)\n", 173 | "print(f\"Track Views: {len(items) if len(items) < 10 else '10 or more'}\")" 174 | ] 175 | } 176 | ], 177 | "metadata": { 178 | "kernelspec": { 179 | "display_name": "Python 3 (ipykernel)", 180 | "language": "python", 181 | "name": "python3" 182 | }, 183 | "language_info": { 184 | "codemirror_mode": { 185 | "name": "ipython", 186 | "version": 3 187 | }, 188 | "file_extension": ".py", 189 | "mimetype": "text/x-python", 190 | "name": "python", 191 | "nbconvert_exporter": "python", 192 | "pygments_lexer": "ipython3", 193 | "version": "3.9.11" 194 | } 195 | }, 196 | "nbformat": 4, 197 | "nbformat_minor": 2 198 | } 199 | -------------------------------------------------------------------------------- /notebooks/Manage Map Areas with Group and Index.ipynb: 
-------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "### Introduction\n", 8 | "This notebook is designed to help ArcGIS Online / ArcGIS Enterprise administrators manage map areas. Depending on your needs, the manual map area tools in Field Maps Designer and item details may be adequate. If you're managing multiple map areas across multiple maps or need to accommodate frequent changes, a scripted approach may be preferred. \n", 9 | "\n", 10 | "When should map areas be recreated?\n", 11 | "\n", 12 | ">Re-creating the offline map area differs from updating the offline map area. When you use the Recreate action, it deletes all packages associated with the map area and re-creates them based on the offline map area's settings.\n", 13 | ">\n", 14 | ">The primary reason to re-create a map area is to pick up schema changes that have occurred after you created the offline map area. For example, if you add or delete a field or change an attribute value list or range (domains), you must re-create the offline map area to pick up those changes.\n", 15 | ">\n", 16 | ">Source: https://doc.arcgis.com/en/arcgis-online/manage-data/take-maps-offline.htm\n", 17 | "\n", 18 | "This process assumes that you are the owner of each web map being processed. \n", 19 | "\n", 20 | "Environment dependencies: arcgis 2.3.1\n", 21 | "\n", 22 | "\n", 23 | "---\n", 24 | "\n", 25 | "### Setup \n", 26 | "\n", 27 | "The script utilizes three items that will ***need to be created*** and configured in your portal. \n", 28 | "\n", 29 | "#### Hosted Feature Layer \n", 30 | "##### `Manage Map Areas`\n", 31 | "- Setup Notes: Download the zipped file geodatabase and add it as an item to your portal. 
The hosted feature layer and table below are both included.\n", 32 | "\n", 33 | "    - File hosted on ArcGIS Online: https://www.arcgis.com/home/item.html?id=2ab1be89349e4e61be674fc3e3cbfcdf\n", 34 | "\n", 35 | "    ##### `Manage Map Area Index`\n", 36 | "    - Purpose: The polygon hosted feature layer stores map area boundaries and processing details. You can specify whether a map area applies to all web maps in the group, or to a specific web map item id.\n", 37 | "    - Configuration: Add your own map area polygons and configure details: \n", 38 | "        - `shape`: Required - this will be the boundary shape for your offline area - map layers will be clipped to this polygon\n", 39 | "        - `webmap_id`: Required - Specify which map the offline area should be included in, supports \"ALL\" web maps or a single web map item id.\n", 40 | "        - `status`: Required - Set to \"Active\" to process the map area; any other value causes it to be skipped.\n", 41 | "        - `min_scale`/`max_scale`: Required - these fields are used to identify what levels of detail (LODs) from the basemaps should be included. The min_scale value is always larger than the max_scale. If you're generating large file sizes for your offline packages, look to refine the max_scale value.\n", 42 | "        - `refresh_schedule`: Required - Specifies when the server will sync any changes to the map area. Supports: Never, Daily, Weekly, Monthly. 
[More info](https://developers.arcgis.com/python/latest/api-reference/arcgis.map.toc.html#offlinemapareamanager)\n", 43 | "        - `refresh_day_of_week`/`refresh_minute`/`refresh_hour`: Optional - Based on the choice of `refresh_schedule` [More info](https://developers.arcgis.com/python/latest/api-reference/arcgis.map.toc.html#offlinemapareamanager)\n", 44 | "        - `description`/`title`/`tags`/`snippet`: Optional - for item properties\n", 45 | "        - `folder`: Optional - Specify a folder name if you want the offline map area item and the packages to be created inside a folder.\n", 46 | "        - More details on fields can be found here: https://developers.arcgis.com/python/latest/api-reference/arcgis.map.toc.html#offlinemapareamanager\n", 47 | "\n", 48 | "    ##### `Manage Map Area Processing Log`\n", 49 | "    - Purpose: Log table where processing details are stored; this table is used for reporting.\n", 50 | "    - Configuration: None needed, the processing script is designed to add rows to this table, but can be configured to truncate for each run as well.\n", 51 | "    \n", 52 | "#### Group:\n", 53 | "##### `Manage Map Areas`\n", 54 | "- Purpose: Group should include web maps that will have their map areas processed. The owner and members of this group will receive an email summary of processing results.\n", 55 | "- Configuration: Any web map that you own and would like to process map areas for should be added to this group. Any users that you would like to receive the processing notification email should be included in this group. 
The email notification is optional; for Enterprise, [additional configuration](https://enterprise.arcgis.com/en/portal/latest/administer/windows/configure-security.htm#ESRI_SECTION1_BE59D8C8264E40A8BE5F92E3D1D60762) may be needed.\n", 56 | "\n", 57 | "#### After items are created\n", 58 | "Copy/paste the item ids for the `Manage Map Areas` HFL and `Manage Map Areas` group into the `Get Manage Map Areas HFL and Group` section below.\n", 59 | "\n", 60 | "--- \n", 61 | "\n", 62 | "#### Authentication / Security \n", 63 | "\n", 64 | "The notebook below uses the built-in user (`gis = GIS(\"home\")`), which is the account you're logged onto your portal with. \n", 65 | "\n", 66 | "- For information on different authentication schemes see:\n", 67 | "    - https://developers.arcgis.com/python/guide/working-with-different-authentication-schemes/\n", 68 | "- For protecting your credentials see:\n", 69 | "    - https://developers.arcgis.com/python/guide/working-with-different-authentication-schemes/#protecting-your-credentials\n", 70 | "\n", 71 | "---\n", 72 | "\n", 73 | "#### Processing outline: \n", 74 | "\n", 75 | "The script will create map areas for all web maps in the group `Manage Map Areas` that have an attribute status = 'Active'. Web maps should include layers and a basemap. The basemap can include a reference layer. Proxied basemaps (stored credentials) are supported. You are required to be the owner of each web map.\n", 76 | "\n", 77 | "For each web map in the group:\n", 78 | "\n", 79 | "1. All existing map areas for the web map will be deleted if the index defines at least one map area with status = 'Active' for it.\n", 80 | "2. 
Map areas will be created as defined in the hosted feature layer: `Manage Map Area Index`.\n", 81 | "    - Known limitations:\n", 82 | "        - no more than 16 map areas can be defined for the web map\n", 83 | "        - a single tile/vector package or mobile database cannot exceed 4 GB\n", 84 | "    - Process results will be added to a process log table: `Manage Map Area Processing Log`.\n", 85 | "    - Reference: https://developers.arcgis.com/python/latest/api-reference/arcgis.map.toc.html#offlinemapareamanager\n", 86 | "3. A summary report is emailed to members of the `Manage Map Areas` group.\n", 87 | "    - Reference: https://community.esri.com/t5/arcgis-notebooks-documents/send-e-mail-notifications-with-arcgis-notebooks/ta-p/1329699\n", 88 | "\n", 89 | "---\n", 90 | "\n", 91 | "#### Notes on map area download / syncing:\n", 92 | "- When a map area is created, a copy of the data is packaged and made available for mobile users to download. This serves as the initial dataset.\n", 93 | "- The frequency at which the initial dataset refreshes can be defined during the creation of the map area. Please ensure this is set according to the needs of the field operations.\n", 94 | "- When field workers download a map area, they store that copy locally on their devices. Any subsequent edits made to the layer's data will be synced back to the portal.\n", 95 | "- In some circumstances field workers will need to remove and redownload map areas to ensure they have the latest changes. 
\n", 96 | " - Map areas recreated to include changes to schema changes, field domain changes, new layers.\n", 97 | " - Change to web map (form, popup, symbology) \n" 98 | ] 99 | }, 100 | { 101 | "cell_type": "markdown", 102 | "metadata": {}, 103 | "source": [ 104 | "### Import ArcGIS Libraries and Connect to GIS" 105 | ] 106 | }, 107 | { 108 | "cell_type": "code", 109 | "execution_count": null, 110 | "metadata": {}, 111 | "outputs": [], 112 | "source": [ 113 | "from arcgis.gis import GIS\n", 114 | "from arcgis.mapping import WebMap\n", 115 | "from arcgis.mapping import VectorTileLayer\n", 116 | "from arcgis.mapping import MapImageLayer\n", 117 | "import datetime, time\n", 118 | "#from arcgis import env # << uncomment both lines for additional logging detail \n", 119 | "#env.verbose = True\n", 120 | "\n", 121 | "gis = GIS(\"home\") # see Authentication / Security section above for more detail" 122 | ] 123 | }, 124 | { 125 | "cell_type": "markdown", 126 | "metadata": {}, 127 | "source": [ 128 | "### Get Manage Map Areas HFL and Group \n", 129 | "\n", 130 | "See `Setup` notes above and copy/paste in item ids were shown below" 131 | ] 132 | }, 133 | { 134 | "cell_type": "code", 135 | "execution_count": null, 136 | "metadata": {}, 137 | "outputs": [], 138 | "source": [ 139 | "# ACTION REQUIRED: see SETUP notes above\n", 140 | "MAP_AREAS_HFL_ITEM_ID = \"eb4c7060f24647ba9c2ffd58e783afc2\" # <<< ACTION REQUIRED: see SETUP notes above to create hosted feature layer on your portal and update item id \n", 141 | "MAP_AREAS_GROUP_ITEM_ID = \"246139cb037b4f52a35f217f2a43a996\" # <<< ACTION REQUIRED: see SETUP notes above to create group on your portal and update item id \n", 142 | "\n", 143 | "# get feature layer that contains all the map area polygons\n", 144 | "map_areas_hfl = gis.content.get(MAP_AREAS_HFL_ITEM_ID) \n", 145 | "if not map_areas_hfl:\n", 146 | " print(\"'Manage Map Areas' layer/table could not be found, please check MAP_AREAS_HFL_ITEM_ID...\")\n", 147 | " \n", 
148 | "# get group that has all the shared map area webmaps\n", 149 | "manage_map_areas_group = gis.groups.get(MAP_AREAS_GROUP_ITEM_ID)\n", 150 | "if not manage_map_areas_group:\n", 151 | " print(\"'Manage Map Areas' group could not be found, please check MAP_AREAS_GROUP_ITEM_ID...\") " 152 | ] 153 | }, 154 | { 155 | "cell_type": "markdown", 156 | "metadata": {}, 157 | "source": [ 158 | "### Helper functions " 159 | ] 160 | }, 161 | { 162 | "cell_type": "code", 163 | "execution_count": null, 164 | "metadata": {}, 165 | "outputs": [], 166 | "source": [ 167 | "# send report using group notify\n", 168 | "def email_group(group_id,subject,message):\n", 169 | " \n", 170 | " try:\n", 171 | " # get group based on passed in id\n", 172 | " notify_group = gis.groups.get(group_id)\n", 173 | "\n", 174 | " # get group owner and add to list\n", 175 | " notification_list = [notify_group.owner]\n", 176 | " \n", 177 | " # get group members and add them to notification list\n", 178 | " for member in notify_group.get_members()[\"users\"]:\n", 179 | " notification_list.append(member)\n", 180 | "\n", 181 | " # use group .notify method to send email - message supports html. Additional setup may be required for Enterprise. 
See setup section above\n", 182 | "        notify_result = notify_group.notify(users=notification_list, subject=subject, message=message,method=\"email\")\n", 183 | "        \n", 184 | "        return notify_result[\"success\"]\n", 185 | "    \n", 186 | "    except:\n", 187 | "        return False" 188 | ] 189 | }, 190 | { 191 | "cell_type": "code", 192 | "execution_count": null, 193 | "metadata": {}, 194 | "outputs": [], 195 | "source": [ 196 | "def create_email_report(map_areas_hfl_item_id, processing_start_time):\n", 197 | "    \n", 198 | "    try:\n", 199 | "        # create html table that includes the query results from Manage Map Area Processing table\n", 200 | "        email_subject = \"\"\n", 201 | "        email_body = \"\"\"\n", 202 | "        <table border='1'>\n", 203 | "        <tr>\n", 204 | "        <th>web map</th><th>map area name</th>\n", 205 | "        <th>processing status</th><th>processing seconds</th>\n", 206 | "        <th>total storage mb</th>\n", 207 | "        </tr>\n", 208 | "        \"\"\"\n", 209 | "        out_fields = \"web_map_name, map_area_name,process_status,process_date,process_seconds,total_storage_mb\"\n", 210 | "\n", 211 | "        # get logging table\n", 212 | "        map_areas_hfl = gis.content.get(map_areas_hfl_item_id)\n", 213 | "        logging_table = map_areas_hfl.tables[0]\n", 214 | "        \n", 215 | "        # email subject - success count \n", 216 | "        subject_success_count_where_clause = f\"process_status = 'Success' and process_date > '{processing_start_time}'\"\n", 217 | "        success_results = logging_table.query( where = subject_success_count_where_clause, return_count_only=True )\n", 218 | "\n", 219 | "        # email subject - failed count\n", 220 | "        subject_fail_count_where_clause = f\"process_status = 'Failed' and process_date > '{processing_start_time}'\"\n", 221 | "        fail_results = logging_table.query(where = subject_fail_count_where_clause,return_count_only=True) \n", 222 | "\n", 223 | "        # create email subject based on success / fail results\n", 224 | "        if success_results > 0 and fail_results == 0:\n", 225 | "            email_subject = f\"{success_results} map areas created\"\n", 226 | "        elif success_results > 0 and fail_results > 0:\n", 227 | "            email_subject = f\"{success_results} map areas created / {fail_results} map areas failed\"\n", 228 | "        elif success_results == 0 and fail_results > 0:\n", 229 | "            email_subject = f\"{fail_results} map areas failed\"\n", 230 | "        else:\n", 231 | "            email_subject = \"Error processing email subject\"\n", 232 | "        \n", 233 | "        # email body - assemble html table rows \n", 234 | "        where_clause = f\"process_date > '{processing_start_time}'\" \n", 235 | "        results = logging_table.query(where = where_clause,out_fields = out_fields,order_by_fields=\"process_date asc\")\n", 236 | "        for row in results.features: \n", 237 | "            # style the row red if there is a failure\n", 238 | "            row_style = \"style='color:#FF0000'\" if row.attributes[\"process_status\"] == \"Failed\" else \"\"\n", 239 | "            email_body += f\"\"\"\n", 240 | "        <tr {row_style}>\n", 241 | "        <td>{row.attributes[\"web_map_name\"]}</td><td>{row.attributes[\"map_area_name\"]}</td>\n", 242 | "        <td>{row.attributes[\"process_status\"]}</td><td>{row.attributes[\"process_seconds\"]}</td>\n", 243 | "        <td>{row.attributes[\"total_storage_mb\"]}</td>\n", 244 | "        </tr>\n", 245 | "        \"\"\" \n", 246 | "        email_body += \"</table>\"\n", 247 | "        \n", 248 | "        return email_subject, email_body\n", 249 | "    \n", 250 | "    except Exception as e:\n", 251 | "        print(f\"an error occurred: {e}\")\n", 252 | "        return \"error creating report\",\"report details not generated\"" 253 | ] 254 | }, 255 | { 256 | "cell_type": "code", 257 | "execution_count": null, 258 | "metadata": {}, 259 | "outputs": [], 260 | "source": [ 261 | "def create_map_area_tile_services(baseMapLayers):\n", 262 | "    # compile all the base map URLs - your mileage may vary \n", 263 | "    map_area_tile_services = []\n", 264 | "    for basemap in baseMapLayers:\n", 265 | "        basemap_url = None\n", 266 | "        if \"url\" in basemap:\n", 267 | "            # add url\n", 268 | "            basemap_url = basemap[\"url\"]\n", 269 | "\n", 270 | "        elif \"itemId\" in basemap:\n", 271 | "            # get items url\n", 272 | "            basemap_item = gis.content.get(basemap[\"itemId\"])\n", 273 | "            basemap_url = basemap_item[\"url\"]\n", 274 | "\n", 275 | "        elif \"styleUrl\" in basemap:\n", 276 | "            # get the source url from the vector tile style\n", 277 | "            vtl = VectorTileLayer(basemap[\"styleUrl\"],gis)\n", 278 | "            basemap_url = VectorTileLayer(vtl.properties.sources.esri.url,gis).url\n", 279 | "\n", 280 | "        if basemap_url is not None:\n", 281 | "            # using the attribute values for min_scale and max_scale - loop through service supported levels and include (map_area is the loop variable from the processing section below)\n", 282 | "            \n", 283 | "            try:\n", 284 | "                basemap_levels_array = []\n", 285 | "                map_service = MapImageLayer(basemap_url,gis)\n", 286 | "                for lod in map_service.properties.tileInfo[\"lods\"]:\n", 287 | "                    # min_scale and max_scale are required\n", 288 | "                    if map_area.get_value(\"min_scale\") >= lod[\"scale\"] >= map_area.get_value(\"max_scale\"):\n", 289 | "                        # include this level in basemap_levels\n", 290 | "                        basemap_levels_array.append(str(lod[\"level\"]))\n", 291 | "                        \n", 292 | "            except Exception as e: \n", 293 | "                print(\"  failed - processing basemap levels\")\n", 294 | "                print(e)\n", 295 | "                \n", 296 | "            basemap_levels = \",\".join(basemap_levels_array)\n", 297 | "\n", 298 | "            
# set the web map tile service parameters: url and levels\n", 299 | "            map_area_tile_services.append({\"url\":basemap_url,\"levels\":basemap_levels}) \n", 300 | "            \n", 301 | "    return map_area_tile_services\n", 302 | "\n" 303 | ] 304 | }, 305 | { 306 | "cell_type": "markdown", 307 | "metadata": {}, 308 | "source": [ 309 | "### Processing each web map in the group, as defined by the index" 310 | ] 311 | }, 312 | { 313 | "cell_type": "code", 314 | "execution_count": null, 315 | "metadata": {}, 316 | "outputs": [], 317 | "source": [ 318 | "print(\"-- begin processing map areas --\")\n", 319 | "process_begin_time = datetime.datetime.now() \n", 320 | "\n", 321 | "# optional - truncate the logs - this can be removed based on retention preferences\n", 322 | "#map_areas_hfl.tables[0].manager.truncate() \n", 323 | "\n", 324 | "# get all group shared items\n", 325 | "for item in manage_map_areas_group.content():\n", 326 | "    # if the shared item is not a web map skip it \n", 327 | "    if( item[\"type\"] != \"Web Map\" ):\n", 328 | "        continue # to next item in for loop \n", 329 | "\n", 330 | "    print(\"--------------\")\n", 331 | "    print(f\"processing web map: {item['title']}\") \n", 332 | "\n", 333 | "    # Cast item to WebMap\n", 334 | "    web_map = WebMap(item)\n", 335 | "\n", 336 | "    # create map area as defined in map_areas_hfl\n", 337 | "    where_clause = f\"webmap_id IN ('{item.id}', 'ALL') and status = 'Active'\"\n", 338 | "    map_area_layer = map_areas_hfl.layers[0]\n", 339 | "    map_areas_feature_set = map_area_layer.query(where=where_clause) \n", 340 | "    print(f\"  map area count: {str(len(map_areas_feature_set.features))}\")\n", 341 | "\n", 342 | "    # check to ensure there are no more than 16 map areas, if there are skip to next map.\n", 343 | "    if len(map_areas_feature_set.features) > 16:\n", 344 | "        print(\"  skipping, more than 16 map areas defined\")\n", 345 | "        continue # to next item in for loop \n", 346 | "\n", 347 | "    if len(map_areas_feature_set.features) > 0:\n", 348 | "        # delete 
any existing map areas for map \n", 349 | " for ids in web_map.offline_areas.list():\n", 350 | " print(f\" - deleting map area: {ids['title']}\")\n", 351 | " ids.delete()\n", 352 | "\n", 353 | " # loop through all the map areas \n", 354 | " for map_area in map_areas_feature_set.features:\n", 355 | " map_area_start_time = time.time()\n", 356 | " \n", 357 | " # create dictionary for log attributes\n", 358 | " new_processing_log_row = {}\n", 359 | " new_processing_log_row[\"process_date\"] = datetime.datetime.now()\n", 360 | " new_processing_log_row[\"map_area_name\"] = map_area.get_value(\"title\")\n", 361 | " new_processing_log_row[\"web_map_name\"] = item[\"title\"]\n", 362 | "\n", 363 | "\n", 364 | " try: \n", 365 | " # create item properties\n", 366 | " item_prop = {\"title\": map_area.get_value(\"title\"),\n", 367 | " \"snippet\": map_area.get_value(\"snippet\"),\n", 368 | " \"tags\": [map_area.get_value(\"tags\")]\n", 369 | " }\n", 370 | "\n", 371 | " # call function that processes base map to create tile_services array\n", 372 | " map_area_tile_services = create_map_area_tile_services(web_map.basemap[\"baseMapLayers\"])\n", 373 | " \n", 374 | " print(f\" requesting map area packaging - {map_area.get_value('title')}\")\n", 375 | " map_area_result = web_map.offline_areas.create (area=map_area.geometry,\n", 376 | " item_properties=item_prop,\n", 377 | " refresh_schedule=map_area.get_value(\"refresh_schedule\"),\n", 378 | " refresh_rates = {\n", 379 | " \"hour\": map_area.get_value(\"refresh_hour\"),\n", 380 | " \"minute\" : map_area.get_value(\"refresh_minute\"),\n", 381 | " \"day_of_week\" : map_area.get_value(\"refresh_day_of_week\"),\n", 382 | " \"nthday\" : map_area.get_value(\"refresh_day_of_month\")\n", 383 | " },\n", 384 | " folder=map_area.get_value(\"folder\"),\n", 385 | " future=False, # Optional boolean. If True, a future object will be returned and the process will not wait for the task to complete. 
The default is False, which means wait for results.\n", 386 | " tile_services = map_area_tile_services\n", 387 | " )\n", 388 | "\n", 389 | " # loop through related items to get the package size in bytes and add it to running total map_area_storage_bytes\n", 390 | " map_area_storage_bytes = 0\n", 391 | " for map_area_pkg in map_area_result.related_items(\"Area2Package\", \"forward\"):\n", 392 | " if map_area_pkg.size:\n", 393 | " map_area_storage_bytes += map_area_pkg.size\n", 394 | " \n", 395 | " # log the map_area_storage_bytes \n", 396 | " total_storage_mb = round(map_area_storage_bytes / (1000000), 2)\n", 397 | " new_processing_log_row[\"total_storage_mb\"] = total_storage_mb\n", 398 | " # log the process_seconds\n", 399 | " map_area_end_time = time.time()\n", 400 | " new_processing_log_row[\"process_seconds\"] = int(map_area_end_time - map_area_start_time)\n", 401 | " new_processing_log_row[\"process_status\"] = \"Success\"\n", 402 | " print(f\" + map area packaging complete\")\n", 403 | " \n", 404 | " except Exception as e:\n", 405 | " # add row to log\n", 406 | " new_processing_log_row[\"process_status\"] = \"Failed\"\n", 407 | " print(f\" failed processing map areas {map_area.get_value('title')}\")\n", 408 | " print(e) \n", 409 | "\n", 410 | " # add the row to logging table successes and failures will be logged\n", 411 | " map_areas_hfl.tables[0].edit_features(adds = [{\"attributes\": new_processing_log_row}])\n", 412 | "\n", 413 | "print(\"-- processing map area complete --\") \n", 414 | "\n", 415 | "# create email \n", 416 | "email_subject, email_body = create_email_report(map_areas_hfl.id,process_begin_time)\n", 417 | "\n", 418 | "# email members of Manage Map Areas \n", 419 | "if email_group(manage_map_areas_group.id,email_subject,email_body):\n", 420 | " print(\"-- report emailed to group members --\")\n", 421 | "else:\n", 422 | " print(\"-- report NOT sent to group members --\")" 423 | ] 424 | } 425 | ], 426 | "metadata": { 427 | "esriNotebookRuntime": 
{ 428 | "notebookRuntimeName": "ArcGIS Notebook Python 3 Standard", 429 | "notebookRuntimeVersion": "9.0" 430 | }, 431 | "kernelspec": { 432 | "display_name": "ArcGISPro", 433 | "language": "Python", 434 | "name": "python3" 435 | }, 436 | "language_info": { 437 | "file_extension": ".py", 438 | "name": "python", 439 | "version": "3" 440 | } 441 | }, 442 | "nbformat": 4, 443 | "nbformat_minor": 4 444 | } 445 | -------------------------------------------------------------------------------- /notebooks/Offline Checks.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "### Introduction\n", 8 | "\n", 9 | "This notebook is designed to help you determine if your map can be taken offline. It performs comprehensive checks on your layers, basemaps, and map properties to ensure that offline capabilities can be enabled. By using this notebook, you can identify specific issues and understand the logic behind each check. \n", 10 | "\n", 11 | "Enterprise users will find this notebook helpful since we introduced new offline checks for ArcGIS Online users in July 2024. Since Enterprise environments don't automatically receive AGOL updates, this notebook enables Enterprise users to implement equivalent validation flags, ensuring consistency with AGOL standards.\n", 12 | "\n", 13 | "This process assumes that you have publishing and editing privileges for the map and that you are using either Enterprise 10.7+ or ArcGIS Online. 
This notebook uses ArcGIS API for Python version 2.3.x.\n", 14 | "\n", 15 | "For further information, please refer to the official documentation: https://doc.arcgis.com/en/field-maps/latest/prepare-maps/configure-the-map.htm#ESRI_SECTION2_9DB2938BE8A749E393BBE43A3219E369\n" 16 | ] 17 | }, 18 | { 19 | "cell_type": "markdown", 20 | "metadata": {}, 21 | "source": [ 22 | "#### Import arcgis libraries" 23 | ] 24 | }, 25 | { 26 | "cell_type": "code", 27 | "execution_count": null, 28 | "metadata": {}, 29 | "outputs": [], 30 | "source": [ 31 | "from arcgis import GIS\n", 32 | "from arcgis.mapping import WebMap\n", 33 | "from arcgis.features import FeatureLayerCollection\n", 34 | "from arcgis.features import FeatureLayer\n", 35 | "from arcgis.mapping import MapServiceLayer\n", 36 | "from arcgis.mapping import VectorTileLayer\n", 37 | "from typing import List" 38 | ] 39 | }, 40 | { 41 | "cell_type": "markdown", 42 | "metadata": {}, 43 | "source": [ 44 | "#### Access your GIS Account - ArcGIS Enterprise\n", 45 | "Add in your enterprise portal below. \n", 46 | "Example: \"https://mydomain.esri.com/portal\"" 47 | ] 48 | }, 49 | { 50 | "cell_type": "code", 51 | "execution_count": null, 52 | "metadata": {}, 53 | "outputs": [], 54 | "source": [ 55 | "ENTERPRISE_URL = \"https://mydomain.esri.com/portal\"" 56 | ] 57 | }, 58 | { 59 | "cell_type": "markdown", 60 | "metadata": {}, 61 | "source": [ 62 | "Add in your username below. Example: \"org_admin\" " 63 | ] 64 | }, 65 | { 66 | "cell_type": "code", 67 | "execution_count": null, 68 | "metadata": {}, 69 | "outputs": [], 70 | "source": [ 71 | "USERNAME = \"org_admin\"" 72 | ] 73 | }, 74 | { 75 | "cell_type": "markdown", 76 | "metadata": {}, 77 | "source": [ 78 | "After you run the cell below, you will be prompted to add in your password, and then you should be connected to your GIS account." 
79 | ] 80 | }, 81 | { 82 | "cell_type": "code", 83 | "execution_count": null, 84 | "metadata": {}, 85 | "outputs": [], 86 | "source": [ 87 | "gis = GIS(ENTERPRISE_URL, USERNAME)\n", 88 | "print(\"Connected to {}\".format(gis.properties.portalHostname))" 89 | ] 90 | }, 91 | { 92 | "cell_type": "markdown", 93 | "metadata": {}, 94 | "source": [ 95 | "#### Add in your Map\n", 96 | "In the `MAP_ITEM_ID_TO_CHECK` variable below there is an example map id. Please delete this and add in the id of the map you would like to check for offline compatibility." 97 | ] 98 | }, 99 | { 100 | "cell_type": "code", 101 | "execution_count": null, 102 | "metadata": {}, 103 | "outputs": [], 104 | "source": [ 105 | "MAP_ITEM_ID_TO_CHECK = \"2352347dwgf3e3289e21yddhwu29e138\"\n", 106 | "offline_map = gis.content.get(MAP_ITEM_ID_TO_CHECK)\n", 107 | "if offline_map is None:\n", 108 | "    raise ValueError(f\"Map with item ID {MAP_ITEM_ID_TO_CHECK} could not be found.\")" 109 | ] 110 | }, 111 | { 112 | "cell_type": "markdown", 113 | "metadata": {}, 114 | "source": [ 115 | "#### Checking Basemaps for Offline Use\n", 116 | "In this step, we will perform essential checks to ensure that Esri basemaps are suitable for offline use. We will evaluate five key aspects. Before this step, make sure all Esri basemaps are proxied when used with Enterprise. To proxy a basemap, you must store credentials with the service item before you publish to your portal. 
You will be prompted to add in your credentials before you publish.\n", 117 | "\n", 118 | "See the link below for more info: \n", 119 | "\n", 120 | "https://doc.arcgis.com/en/arcgis-online/reference/arcgis-server-services.htm#ESRI_SECTION1_FEB0DF92DA064B6A970DFB59A18AA4C2 \n", 121 | "\n", 122 | "See the notebook Proxy Esri Basemaps for ArcGIS Enterprise below to automate this process:\n", 123 | "\n", 124 | "https://github.com/Esri/field-maps-scripts/blob/main/notebooks/Proxy%20Esri%20Basemaps%20for%20ArcGIS%20Enterprise%20Offline%20Map%20Areas.ipynb\n", 125 | "\n", 126 | "\n", 127 | "\n", 128 | "1. Exportable Map Service Layer: Verify if the map service layer is exportable.\n", 129 | "2. Pre-approved Tile Layers: Check if the tile layers are from a pre-approved list that can be exported.\n", 130 | "3. Export Tiles Enabled: Ensure that vector tile or tile map service layers have export tiles enabled, as this is required for offline use.\n", 131 | "4. Multi-Sourced Basemaps: Ensure that your basemap does not have more than one source.\n", 132 | "5. Deprecated Tile Basemap URLs: Ensure you are not using basemaps with the base URL: `https://tiledbasemaps.arcgis.com/arcgis/rest/services`\n", 133 | "\n", 134 | "By performing these checks, we can ensure that the basemaps are properly configured and optimized for offline use." 
135 | ] 136 | }, 137 | { 138 | "cell_type": "code", 139 | "execution_count": null, 140 | "metadata": {}, 141 | "outputs": [], 142 | "source": [ 143 | "def is_export_enabled_map_service_layer(basemaplayer) -> bool:\n", 144 | " msl = MapServiceLayer(basemaplayer[\"url\"], gis=gis)\n", 145 | " return msl.properties.exportTilesAllowed\n", 146 | "\n", 147 | "\n", 148 | "def is_exportable_agol_tile_layer(basemaplayer) -> bool:\n", 149 | " lowercase_layer_url = basemaplayer[\"url\"].lower()\n", 150 | " HOST_LIST = [\"server.arcgisonline.com\", \"services.arcgisonline.com\"]\n", 151 | " has_allowed_host_name = any(\n", 152 | " host_str in lowercase_layer_url for host_str in HOST_LIST\n", 153 | " )\n", 154 | " SERVICE_LIST = [\n", 155 | " \"natgeo_world_map\",\n", 156 | " \"ocean_basemap\",\n", 157 | " \"usa_topo_maps\",\n", 158 | " \"world_imagery\",\n", 159 | " \"world_street_map\",\n", 160 | " \"world_terrain_base\",\n", 161 | " \"world_topo_map\",\n", 162 | " \"world_hillshade\",\n", 163 | " \"canvas/world_light_gray_base\",\n", 164 | " \"canvas/world_light_gray_reference\",\n", 165 | " \"canvas/world_dark_gray_base\",\n", 166 | " \"canvas/world_dark_gray_reference\",\n", 167 | " \"ocean/world_ocean_base\",\n", 168 | " \"ocean/world_ocean_reference\",\n", 169 | " \"reference/world_boundaries_and_places\",\n", 170 | " \"reference/world_reference_overlay\",\n", 171 | " \"reference/world_transportation\",\n", 172 | " ]\n", 173 | "\n", 174 | " has_exportable_agol_servicename = any(\n", 175 | " service_name in lowercase_layer_url for service_name in SERVICE_LIST\n", 176 | " )\n", 177 | "\n", 178 | " return has_allowed_host_name and has_exportable_agol_servicename\n", 179 | "\n", 180 | "\n", 181 | "def is_export_enabled_vector_tile_layer(basemaplayer) -> bool:\n", 182 | " if \"itemId\" in basemaplayer:\n", 183 | " vl_item = basemaplayer[\"itemId\"]\n", 184 | " vtl = VectorTileLayer.fromitem(gis.content.get(vl_item))\n", 185 | " return vtl.properties.exportTilesAllowed\n", 186 
| " elif \"styleUrl\" in basemaplayer:\n", 187 | " vtl = VectorTileLayer(basemaplayer[\"styleUrl\"])\n", 188 | " vtl_source = VectorTileLayer(vtl.properties.sources.esri.url, gis=gis)\n", 189 | " return vtl_source.properties.exportTilesAllowed\n", 190 | " return False\n", 191 | "\n", 192 | "\n", 193 | "def is_export_enabled(layer) -> bool:\n", 194 | " layer_type = layer[\"layerType\"]\n", 195 | " if layer_type == \"ArcGISTiledMapServiceLayer\":\n", 196 | " return is_export_enabled_map_service_layer(\n", 197 | " layer\n", 198 | " ) or is_exportable_agol_tile_layer(layer)\n", 199 | " elif layer_type == \"VectorTileLayer\":\n", 200 | " return is_export_enabled_vector_tile_layer(layer)\n", 201 | " return False" 202 | ] 203 | }, 204 | { 205 | "cell_type": "code", 206 | "execution_count": null, 207 | "metadata": {}, 208 | "outputs": [], 209 | "source": [ 210 | "def is_map_multisource(basemaplayer) -> bool:\n", 211 | " if \"styleUrl\" in basemaplayer:\n", 212 | " vtl = VectorTileLayer(basemaplayer[\"styleUrl\"])\n", 213 | " return len(vtl.properties.sources) > 1\n", 214 | " return False" 215 | ] 216 | }, 217 | { 218 | "cell_type": "code", 219 | "execution_count": null, 220 | "metadata": {}, 221 | "outputs": [], 222 | "source": [ 223 | "def is_deprecated_basemap_urls(basemap) -> bool:\n", 224 | " DEPRECATED_URLS = [\n", 225 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/NatGeo_World_Map/MapServer\",\n", 226 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/USA_Topo_Maps/MapServer\",\n", 227 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/Reference/World_Boundaries_and_Places/MapServer\",\n", 228 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/Canvas/World_Dark_Gray_Base/MapServer\",\n", 229 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/Canvas/World_Dark_Gray_Reference/MapServer\",\n", 230 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/Canvas/World_Light_Gray_Base/MapServer\",\n", 231 | " 
\"https://tiledbasemaps.arcgis.com/arcgis/rest/services/Canvas/World_Light_Gray_Reference/MapServer\",\n", 232 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/Ocean/World_Ocean_Reference/MapServer\",\n", 233 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/Reference/World_Reference_Overlay/MapServer\",\n", 234 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/World_Street_Map/MapServer\",\n", 235 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/World_Topo_Map/MapServer\",\n", 236 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/Reference/World_Transportation/MapServer\",\n", 237 | " \"https://tiledbasemaps.arcgis.com/arcgis/rest/services/World_Terrain_Base/MapServer\",\n", 238 | " ]\n", 239 | "\n", 240 | " if basemap[\"layerType\"] == \"ArcGISTiledMapServiceLayer\":\n", 241 | " return basemap.get(\"url\") in DEPRECATED_URLS\n", 242 | " return False" 243 | ] 244 | }, 245 | { 246 | "cell_type": "code", 247 | "execution_count": null, 248 | "metadata": {}, 249 | "outputs": [], 250 | "source": [ 251 | "def check_basemap(basemaps) -> str:\n", 252 | " basemap_errors = []\n", 253 | " for basemap in basemaps[\"baseMapLayers\"]:\n", 254 | " basemap_title = basemap[\"title\"]\n", 255 | " if not is_export_enabled(basemap):\n", 256 | " basemap_errors.append(f\"{basemap_title} export tiles not enabled\")\n", 257 | " else:\n", 258 | " if is_map_multisource(basemap):\n", 259 | " basemap_errors.append(f\"{basemap_title} is multisource\")\n", 260 | " if is_deprecated_basemap_urls(basemap):\n", 261 | " basemap_errors.append(f\"{basemap_title} is deprecated\")\n", 262 | " return basemap_errors" 263 | ] 264 | }, 265 | { 266 | "cell_type": "markdown", 267 | "metadata": {}, 268 | "source": [ 269 | "#### Layer Checks for Offline Use\n", 270 | "##### In this section, we perform 8 essential checks to ensure that layers are suitable for offline use:\n", 271 | "\n", 272 | "1. 
Layer Type Support: Verify if the layer type is supported for offline use.\n", 273 | "2. Sync Enabled: Ensure that sync is enabled, as it is required for offline use.\n", 274 | "3. Supported Indices: Check if the layer's indices are supported, ensuring they do not contain + or -.\n", 275 | "4. Join View Check: Determine if the layer is a join view, as join view layers are not suitable for offline use.\n", 276 | "5. Global ID Presence: Verify if the layer has a global ID, as its absence can cause issues.\n", 277 | "6. Relationship Keywords: Check if relationship keywords are missing, which are necessary for certain functionalities.\n", 278 | "7. Subtype Fields: Ensure that subtype fields are present, as their absence can affect layer behavior.\n", 279 | "8. True Curve Updates: Confirm that True Curve Updates are set to true, as required.\n", 280 | "\n", 281 | "By performing these checks, we can ensure that the layers are properly configured and optimized for offline use." 282 | ] 283 | }, 284 | { 285 | "cell_type": "code", 286 | "execution_count": null, 287 | "metadata": {}, 288 | "outputs": [], 289 | "source": [ 290 | "def is_supported_layer_type(layer) -> bool:\n", 291 | " SUPPORTED_LAYER_TYPE = [\n", 292 | " \"ArcGISFeatureLayer\",\n", 293 | " \"GroupLayer\",\n", 294 | " \"ArcGISImageServiceLayer\",\n", 295 | " \"ArcGISTiledImageServiceLayer\",\n", 296 | " \"ArcGISTiledMapServiceLayer\",\n", 297 | " \"SubtypeGroupLayer\",\n", 298 | " \"VectorTileLayer\",\n", 299 | " ]\n", 300 | " if \"featureCollectionType\" in layer and (\n", 301 | " layer[\"featureCollectionType\"] == \"route\"\n", 302 | " or layer[\"featureCollectionType\"] == \"notes\"\n", 303 | " ):\n", 304 | " return False\n", 305 | "\n", 306 | " layer_type = layer[\"layerType\"] if \"layerType\" in layer else layer.layerType\n", 307 | " return layer_type in SUPPORTED_LAYER_TYPE" 308 | ] 309 | }, 310 | { 311 | "cell_type": "code", 312 | "execution_count": null, 313 | "metadata": {}, 314 | "outputs": [], 315 
| "source": [ 316 | "def is_sync_enabled(layer) -> bool:\n", 317 | " capabilities = layer.properties.capabilities.split(\",\")\n", 318 | " if \"Sync\" in capabilities:\n", 319 | " return True\n", 320 | " else:\n", 321 | " return False" 322 | ] 323 | }, 324 | { 325 | "cell_type": "code", 326 | "execution_count": null, 327 | "metadata": {}, 328 | "outputs": [], 329 | "source": [ 330 | "def is_supported_feature_layer_indices(layer) -> bool:\n", 331 | " for index in layer.properties.indexes:\n", 332 | " if \"+\" in index.name or \"-\" in index.name:\n", 333 | " return False\n", 334 | " else:\n", 335 | " return True" 336 | ] 337 | }, 338 | { 339 | "cell_type": "code", 340 | "execution_count": null, 341 | "metadata": {}, 342 | "outputs": [], 343 | "source": [ 344 | "def is_join_view(layer) -> bool:\n", 345 | " layer_properties = layer.properties\n", 346 | " if \"isView\" in layer_properties and layer_properties[\"isView\"]:\n", 347 | " if (\n", 348 | " \"isMultiServicesView\" in layer_properties\n", 349 | " and layer_properties[\"isMultiServicesView\"]\n", 350 | " ):\n", 351 | " return True\n", 352 | " return False" 353 | ] 354 | }, 355 | { 356 | "cell_type": "code", 357 | "execution_count": null, 358 | "metadata": {}, 359 | "outputs": [], 360 | "source": [ 361 | "def is_global_id_missing(layer) -> bool:\n", 362 | " layer_properties = layer.properties\n", 363 | " if \"globalIdField\" in layer_properties:\n", 364 | " return layer_properties[\"globalIdField\"] == \"\"\n", 365 | " return False" 366 | ] 367 | }, 368 | { 369 | "cell_type": "code", 370 | "execution_count": null, 371 | "metadata": {}, 372 | "outputs": [], 373 | "source": [ 374 | "def is_relationship_missing(layer) -> bool:\n", 375 | " layer_properties = layer.properties\n", 376 | " layer_fields = layer.properties.fields\n", 377 | " if (\n", 378 | " \"relationships\" in layer_properties\n", 379 | " and len(layer_properties[\"relationships\"]) > 0\n", 380 | " ):\n", 381 | " relationship_key = 
layer_properties[\"relationships\"][0][\"keyField\"].lower()\n", 382 | " filtered_relationship_fields = [\n", 383 | " field for field in layer_fields if relationship_key in field.name.lower()\n", 384 | " ]\n", 385 | " return len(filtered_relationship_fields) == 0\n", 386 | " return False" 387 | ] 388 | }, 389 | { 390 | "cell_type": "code", 391 | "execution_count": null, 392 | "metadata": {}, 393 | "outputs": [], 394 | "source": [ 395 | "def is_subtype_missing(layer) -> bool:\n", 396 | " layer_properties = layer.properties\n", 397 | " layer_fields = layer.properties.fields\n", 398 | " if \"subtypes\" in layer_properties and layer_properties[\"subtypes\"] is not None:\n", 399 | " if (\n", 400 | " len(layer_properties[\"subtypes\"]) > 0\n", 401 | " or len(layer_properties[\"subtypeField\"]) > 0\n", 402 | " ):\n", 403 | " subtype_field = layer_properties[\"subtypeField\"].lower()\n", 404 | " filtered_subtype_fields = [\n", 405 | " field for field in layer_fields if subtype_field in field.name.lower()\n", 406 | " ]\n", 407 | " return len(filtered_subtype_fields) == 0\n", 408 | " return False" 409 | ] 410 | }, 411 | { 412 | "cell_type": "code", 413 | "execution_count": null, 414 | "metadata": {}, 415 | "outputs": [], 416 | "source": [ 417 | "def is_true_curve_ready(layer) -> bool:\n", 418 | " layer_properties = layer.properties\n", 419 | " if (\n", 420 | " \"allowTrueCurvesUpdates\" in layer_properties\n", 421 | " and layer_properties[\"allowTrueCurvesUpdates\"] == True\n", 422 | " ):\n", 423 | " if (\n", 424 | " \"onlyAllowTrueCurveUpdatesByTrueCurveClients\" in layer_properties\n", 425 | " and layer_properties[\"onlyAllowTrueCurveUpdatesByTrueCurveClients\"] == True\n", 426 | " ):\n", 427 | " return False\n", 428 | " if (\n", 429 | " \"allowTrueCurvesUpdates\" in layer_properties\n", 430 | " and layer_properties[\"allowTrueCurvesUpdates\"] == False\n", 431 | " ):\n", 432 | " if (\n", 433 | " \"onlyAllowTrueCurveUpdatesByTrueCurveClients\" in layer_properties\n", 434 | 
" and layer_properties[\"onlyAllowTrueCurveUpdatesByTrueCurveClients\"] == False\n", 435 | " ):\n", 436 | " return False\n", 437 | " else:\n", 438 | " return True" 439 | ] 440 | }, 441 | { 442 | "cell_type": "markdown", 443 | "metadata": {}, 444 | "source": [ 445 | "#### Field Checks for Offline Use\n", 446 | "In this section, we perform essential checks to ensure that fields within layers are suitable for offline use:\n", 447 | "\n", 448 | "1. Field Type Support: Verify if the field type is supported. The following field types are not supported:\n", 449 | "\n", 450 | "- `esriFieldTypeBigInteger`\n", 451 | "\n", 452 | "- `esriFieldTypeDateOnly`\n", 453 | "\n", 454 | "- `esriFieldTypeTimeOnly`\n", 455 | "\n", 456 | "- `esriFieldTypeTimestampOffset` \n", 457 | "\n", 458 | "\n", 459 | "\n", 460 | " If any field has one of these types, it is not supported for offline use.\n", 461 | "\n", 462 | "2. Field Name Length: Ensure that the field name does not exceed 31 characters. Field names longer than 31 characters are not supported.\n", 463 | "3. Forbidden SQL Keywords: Check if the field name is a forbidden SQL keyword. Field names that are SQL keywords are not supported.\n", 464 | "\n", 465 | "By performing these checks, we can ensure that the fields within layers are properly configured and optimized for offline use." 
466 | ] 467 | }, 468 | { 469 | "cell_type": "code", 470 | "execution_count": null, 471 | "metadata": {}, 472 | "outputs": [], 473 | "source": [ 474 | "def is_supported_feature_layer_field(fields) -> bool:\n", 475 | " FORBIDDEN_SQL_KEYWORDS = [\n", 476 | " \"add\",\n", 477 | " \"all\",\n", 478 | " \"alter\",\n", 479 | " \"and\",\n", 480 | " \"as\",\n", 481 | " \"autoincrement\",\n", 482 | " \"between\",\n", 483 | " \"case\",\n", 484 | " \"cast\",\n", 485 | " \"check\",\n", 486 | " \"collate\",\n", 487 | " \"commit\",\n", 488 | " \"constraint\",\n", 489 | " \"create\",\n", 490 | " \"default\",\n", 491 | " \"deferrable\",\n", 492 | " \"delete\",\n", 493 | " \"distinct\",\n", 494 | " \"drop\",\n", 495 | " \"else\",\n", 496 | " \"escape\",\n", 497 | " \"except\",\n", 498 | " \"exists\",\n", 499 | " \"foreign\",\n", 500 | " \"from\",\n", 501 | " \"group\",\n", 502 | " \"having\",\n", 503 | " \"in\",\n", 504 | " \"index\",\n", 505 | " \"insert\",\n", 506 | " \"intersect\",\n", 507 | " \"into\",\n", 508 | " \"is\",\n", 509 | " \"isnull\",\n", 510 | " \"join\",\n", 511 | " \"limit\",\n", 512 | " \"not\",\n", 513 | " \"nothing\",\n", 514 | " \"notnull\",\n", 515 | " \"null\",\n", 516 | " \"on\",\n", 517 | " \"or\",\n", 518 | " \"order\",\n", 519 | " \"primary\",\n", 520 | " \"raise\",\n", 521 | " \"references\",\n", 522 | " \"returning\",\n", 523 | " \"select\",\n", 524 | " \"set\",\n", 525 | " \"table\",\n", 526 | " \"then\",\n", 527 | " \"to\",\n", 528 | " \"transaction\",\n", 529 | " \"union\",\n", 530 | " \"unique\",\n", 531 | " \"update\",\n", 532 | " \"using\",\n", 533 | " \"values\",\n", 534 | " \"when\",\n", 535 | " \"where\",\n", 536 | " ]\n", 537 | "\n", 538 | " UNSUPPORTED_FIELD_TYPES = [\n", 539 | " \"esriFieldTypeBigInteger\",\n", 540 | " \"esriFieldTypeDateOnly\",\n", 541 | " \"esriFieldTypeTimeOnly\",\n", 542 | " \"esriFieldTypeTimestampOffset\",\n", 543 | " ]\n", 544 | " for field in fields:\n", 545 | " if field[\"type\"] in UNSUPPORTED_FIELD_TYPES:\n", 546 | 
" return False\n", 547 | "\n", 548 | " if len(field[\"name\"]) > 31:\n", 549 | " return False\n", 550 | "\n", 551 | " if field[\"name\"] in FORBIDDEN_SQL_KEYWORDS:\n", 552 | " return False\n", 553 | " return True" 554 | ] 555 | }, 556 | { 557 | "cell_type": "markdown", 558 | "metadata": {}, 559 | "source": [ 560 | "#### Checking All Layers and Tables in Your Map\n", 561 | "\n", 562 | "##### In this step, we will call each of the layer and field check methods defined earlier to evaluate the suitability of the layers for offline use. \n", 563 | "\n", 564 | "The results of these checks will be recorded, and a list will be created to identify any errors.\n", 565 | "\n", 566 | "1. Invoke Layer and Field Checks: We systematically call each of the predefined layer and field check methods for every layer in the map.\n", 567 | "\n", 568 | "2. Record Results: The results of these checks are recorded, capturing both the boolean outcome and any associated messages.\n", 569 | "\n", 570 | "\n", 571 | "By following this process, we can efficiently identify and document any issues that need to be addressed to ensure the layers are suitable for offline use" 572 | ] 573 | }, 574 | { 575 | "cell_type": "code", 576 | "execution_count": null, 577 | "metadata": {}, 578 | "outputs": [], 579 | "source": [ 580 | "def check_layers(layers) -> List[str]:\n", 581 | " layer_errors = []\n", 582 | " SUPPORTED_TILE_LAYER_TYPES = [\n", 583 | " \"ArcGISImageServiceLayer\",\n", 584 | " \"ArcGISTiledImageServiceLayer\",\n", 585 | " \"ArcGISTiledMapServiceLayer\",\n", 586 | " \"VectorTileLayer\",\n", 587 | " ]\n", 588 | " for op_layer in layers:\n", 589 | " if not is_supported_layer_type(op_layer):\n", 590 | " layer_errors.append(f\"Layer type not supported: {op_layer.title}\")\n", 591 | " continue\n", 592 | "\n", 593 | " elif op_layer[\"layerType\"] == \"ArcGISFeatureLayer\":\n", 594 | " layer_url = op_layer[\"url\"] if \"url\" in op_layer else op_layer.url\n", 595 | " layer_title = 
op_layer[\"title\"] if \"title\" in op_layer else op_layer.title\n", 596 | " feature_layer = FeatureLayer(layer_url)\n", 597 | " if not is_sync_enabled(feature_layer):\n", 598 | " layer_errors.append(f\"{layer_title} is sync not enabled\")\n", 599 | " if not is_supported_feature_layer_field(feature_layer.properties.fields):\n", 600 | " layer_errors.append(f\"{layer_title} has field name errors\")\n", 601 | " if not is_supported_feature_layer_indices(feature_layer):\n", 602 | " layer_errors.append(f\"{layer_title} has unsupported layer indicies\")\n", 603 | " if is_join_view(feature_layer):\n", 604 | " layer_errors.append(f\"{layer_title} is a join view\")\n", 605 | " if is_global_id_missing(feature_layer):\n", 606 | " layer_errors.append(f\"{layer_title} has globalid missing\")\n", 607 | " if is_relationship_missing(feature_layer):\n", 608 | " layer_errors.append(\n", 609 | " f\"{layer_title} has missing relationship field names\"\n", 610 | " )\n", 611 | " if is_subtype_missing(feature_layer):\n", 612 | " layer_errors.append(f\"{layer_title} has missing subtype field\")\n", 613 | " if not is_true_curve_ready(feature_layer):\n", 614 | " layer_errors.append(f\"{layer_title} is not true curve enabled\")\n", 615 | " # add more tests here\n", 616 | " elif op_layer[\"layerType\"] == \"GroupLayer\":\n", 617 | " layer_errors.extend(check_layers(op_layer[\"layers\"]))\n", 618 | " elif op_layer[\"layerType\"] in SUPPORTED_TILE_LAYER_TYPES:\n", 619 | " layer_title = op_layer[\"title\"] if \"title\" in op_layer else op_layer.title\n", 620 | " if not is_export_enabled(op_layer):\n", 621 | " layer_errors.append(f\"{layer_title} export tiles not enabled\")\n", 622 | " else:\n", 623 | " if is_map_multisource(op_layer):\n", 624 | " layer_errors.append(f\"{layer_title} is multisource\")\n", 625 | " if is_deprecated_basemap_urls(op_layer):\n", 626 | " layer_errors.append(f\"{layer_title} is deprecated\")\n", 627 | "\n", 628 | " layer_errors = list(layer_errors)\n", 629 | "\n", 630 | 
" return layer_errors" 631 | ] 632 | }, 633 | { 634 | "cell_type": "code", 635 | "execution_count": null, 636 | "metadata": {}, 637 | "outputs": [], 638 | "source": [ 639 | "def check_tables(tables) -> List[str]:\n", 640 | " table_errors = []\n", 641 | " for op_table in tables:\n", 642 | " if not is_supported_layer_type(op_table):\n", 643 | " table_errors.append(f\"Layer type not supported: {op_table.title}\")\n", 644 | " continue\n", 645 | "\n", 646 | " elif op_table[\"layerType\"] == \"ArcGISFeatureLayer\":\n", 647 | " table_url = op_table[\"url\"] if \"url\" in op_table else op_table.url\n", 648 | " table_title = op_table[\"title\"] if \"title\" in op_table else op_table.title\n", 649 | " feature_layer = FeatureLayer(table_url)\n", 650 | " if not is_sync_enabled(feature_layer):\n", 651 | " table_errors.append(f\"{table_title} is sync not enabled\")\n", 652 | " if not is_supported_feature_layer_indices(feature_layer):\n", 653 | " table_errors.append(f\"{table_title} has unsupported layer indicies\")\n", 654 | " if not is_supported_feature_layer_field(feature_layer.properties.fields):\n", 655 | " table_errors.append(f\"{table_title} has field name errors\")\n", 656 | " if is_join_view(feature_layer):\n", 657 | " table_errors.append(f\"{table_title} is a join view\")\n", 658 | " if is_global_id_missing(feature_layer):\n", 659 | " table_errors.append(f\"{table_title} has globalid missing\")\n", 660 | " if is_relationship_missing(feature_layer):\n", 661 | " table_errors.append(\n", 662 | " f\"{table_title} has missing relationship field names\"\n", 663 | " )\n", 664 | " if not is_true_curve_ready(feature_layer):\n", 665 | " table_errors.append(f\"{table_title} is not true curve enabled\")\n", 666 | " # add more tests here\n", 667 | "\n", 668 | " table_errors = list(table_errors)\n", 669 | "\n", 670 | " return table_errors" 671 | ] 672 | }, 673 | { 674 | "cell_type": "markdown", 675 | "metadata": {}, 676 | "source": [ 677 | "#### Map Checks\n", 678 | "\n", 679 | "The 
Map Checks section verifies that general map properties allow the map to go offline.\n", 680 | "These checks include whether the map has the Offline Disabled type keyword or any duplicate layers." 681 | ] 682 | }, 683 | { 684 | "cell_type": "code", 685 | "execution_count": null, 686 | "metadata": {}, 687 | "outputs": [], 688 | "source": [ 689 | "def check_offline_typekeyword(map) -> bool:\n", 690 | " if \"OfflineDisabled\" in map.typeKeywords:\n", 691 | " return False\n", 692 | " if \"Offline\" in map.typeKeywords:\n", 693 | " return True\n", 694 | " return False" 695 | ] 696 | }, 697 | { 698 | "cell_type": "code", 699 | "execution_count": null, 700 | "metadata": {}, 701 | "outputs": [], 702 | "source": [ 703 | "def map_has_duplicates(map) -> bool:\n", 704 | " webmap_object = WebMap(map)\n", 705 | " item_urls = set()\n", 706 | " for layer in webmap_object.layers + webmap_object.tables:\n", 707 | " if is_supported_layer_type(layer):\n", 708 | " if layer[\"layerType\"] in [\"ArcGISFeatureLayer\", \"featureCollectionType\"]:\n", 709 | " item_urls.add(layer[\"url\"] if \"url\" in layer else layer.url)\n", 710 | " elif layer[\"layerType\"] == \"GroupLayer\":\n", 711 | " for sub_layer in layer[\"layers\"]:\n", 712 | " item_urls.add(\n", 713 | " sub_layer[\"url\"] if \"url\" in sub_layer else sub_layer.url\n", 714 | " )\n", 715 | " return len(item_urls) != len(webmap_object.layers + webmap_object.tables)" 716 | ] 717 | }, 718 | { 719 | "cell_type": "code", 720 | "execution_count": null, 721 | "metadata": {}, 722 | "outputs": [], 723 | "source": [ 724 | "def check_map(webmap) -> List[str]:\n", 725 | " map_errors = []\n", 726 | " webmap_title = webmap.title\n", 727 | " if not check_offline_typekeyword(webmap):\n", 728 | " map_errors.append(f\"{webmap_title} is offline disabled\")\n", 729 | " if map_has_duplicates(webmap):\n", 730 | " map_errors.append(f\"{webmap_title} has duplicate layers\")\n", 731 | " return map_errors" 732 | ] 733 | }, 734 | { 735 | 
"cell_type": "markdown", 736 | "metadata": {}, 737 | "source": [ 738 | "#### Check WebMap Offline Compatability general method\n", 739 | "This method is designed to return to you what parts of your map has any errors in it -- layers, tables, basemap or the map itself" 740 | ] 741 | }, 742 | { 743 | "cell_type": "code", 744 | "execution_count": null, 745 | "metadata": {}, 746 | "outputs": [], 747 | "source": [ 748 | "def check_webmap_offline_compatability(webmap):\n", 749 | " webmap_object = WebMap(webmap)\n", 750 | " offline_compatibility = {\n", 751 | " \"map\": [],\n", 752 | " \"layers\": [],\n", 753 | " \"tables\": [],\n", 754 | " \"basemap\": [],\n", 755 | " \"has_errors\": False,\n", 756 | " }\n", 757 | "\n", 758 | " offline_compatibility[\"map\"] = check_map(webmap)\n", 759 | " offline_compatibility[\"layers\"] = check_layers(webmap_object.layers)\n", 760 | " offline_compatibility[\"tables\"] = check_tables(webmap_object.tables)\n", 761 | " offline_compatibility[\"basemap\"] = check_basemap(webmap_object.basemap)\n", 762 | "\n", 763 | " if (\n", 764 | " offline_compatibility[\"map\"]\n", 765 | " or offline_compatibility[\"layers\"]\n", 766 | " or offline_compatibility[\"tables\"]\n", 767 | " or offline_compatibility[\"basemap\"]\n", 768 | " ):\n", 769 | " offline_compatibility[\"has_errors\"] = True\n", 770 | "\n", 771 | " return offline_compatibility" 772 | ] 773 | }, 774 | { 775 | "cell_type": "markdown", 776 | "metadata": {}, 777 | "source": [ 778 | "#### Lets see the results\n", 779 | "In section we call and use everything we have defined above. The layer or basemap that is preventing you from going offline will be printed below along with the appropriate error." 
780 | ] 781 | }, 782 | { 783 | "cell_type": "code", 784 | "execution_count": null, 785 | "metadata": {}, 786 | "outputs": [], 787 | "source": [ 788 | "offline_compatability = check_webmap_offline_compatability(offline_map)\n", 789 | "if offline_compatability[\"has_errors\"]:\n", 790 | " for key, values in offline_compatability.items():\n", 791 | " if key != \"has_errors\":\n", 792 | " print(f\"{key}:\")\n", 793 | " for value in values:\n", 794 | " print(f\" - {value}\")\n", 795 | "else:\n", 796 | " print(\"Your map is ready for offline!\")" 797 | ] 798 | } 799 | ], 800 | "metadata": { 801 | "kernelspec": { 802 | "display_name": "field-maps-scripts", 803 | "language": "python", 804 | "name": "python3" 805 | }, 806 | "language_info": { 807 | "codemirror_mode": { 808 | "name": "ipython", 809 | "version": 3 810 | }, 811 | "file_extension": ".py", 812 | "mimetype": "text/x-python", 813 | "name": "python", 814 | "nbconvert_exporter": "python", 815 | "pygments_lexer": "ipython3", 816 | "version": "3.11.12" 817 | } 818 | }, 819 | "nbformat": 4, 820 | "nbformat_minor": 2 821 | } 822 | -------------------------------------------------------------------------------- /notebooks/Proxy Esri Basemaps for ArcGIS Enterprise Offline Map Areas.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "#### Introduction\n", 8 | "\n", 9 | "This notebook is intended for ArcGIS Enterprise administrators who are looking to create basemaps compatible with offline map areas in their portal. Esri offers a variety of tile and vector basemaps that can be exported for this purpose. To effectively use these basemaps with offline map area-compatible applications, such as Field Maps, it is essential that each basemap includes embedded credentials for ArcGIS Online. These credentials are necessary to facilitate the creation of offline map areas. 
Both vector and tile services are supported in this process.\n", 10 | "\n", 11 | "##### Requirements:\n", 12 | "\n", 13 | "- ArcGIS account credentials (ideally a viewer account)\n", 14 | "- An ArcGIS Online basemap service that supports export (see list below)\n", 15 | "- ArcGIS Enterprise account that has permissions to publish items\n", 16 | "- ArcGIS Enterprise 10.9.1 and above are supported\n", 17 | "- Tested with Python 3.9.2 and 3.11.11 / ArcGIS 2.3.1 (see [README](https://github.com/Esri/field-maps-scripts/blob/main/README.md#instructions) for environment setup)\n" 18 | ] 19 | }, 20 | { 21 | "cell_type": "markdown", 22 | "metadata": {}, 23 | "source": [ 24 | "#### Import libraries\n" 25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": 1, 30 | "metadata": {}, 31 | "outputs": [], 32 | "source": [ 33 | "import ipywidgets as widgets\n", 34 | "from arcgis.gis import GIS, Item" 35 | ] 36 | }, 37 | { 38 | "cell_type": "markdown", 39 | "metadata": {}, 40 | "source": [ 41 | "#### Choose what basemaps you want to include. 
Add or comment out the maps you want to include in processing.\n", 42 | "\n", 43 | "_Note: in order for basemaps to support offline areas they will need to have exporting enabled._\n", 44 | "\n", 45 | "Esri Basemaps for export:\n", 46 | "These maps are designed to support exporting small volumes of basemap tiles for offline use - not intended to be used to display live map tiles\n", 47 | "\n", 48 | "- Vector Basemaps: https://www.arcgis.com/home/group.html?sortField=modified&sortOrder=asc&id=c61ab1493fff4b84b53705184876c9b0#content\n", 49 | "- Tiled Basemaps: https://www.arcgis.com/home/group.html?sortField=title&sortOrder=asc&id=3a890be7a4b046c7840dc4a0446c5b31#content\n" 50 | ] 51 | }, 52 | { 53 | "cell_type": "code", 54 | "execution_count": null, 55 | "metadata": {}, 56 | "outputs": [], 57 | "source": [ 58 | "# Folder on enterprise in which items will be created\n", 59 | "BASEMAP_FOLDER = \"Esri Offline Basemaps\"\n", 60 | "\n", 61 | "# Specify the item IDs of the vector basemap you want to clone\n", 62 | "# These maps below are designed to support exporting small volumes of basemap tiles for offline use\n", 63 | "# Not intended to be used to display live map tiles\n", 64 | "AGOL_ITEM_LIST_ID = [\n", 65 | " # tiled basemaps - deprecated basemaps not included\n", 66 | " \"9e42e0d4acde413a9a9eb5f05fe0a2e6\", # World Hillshade (Dark) (for Export)\n", 67 | " \"babedc22ebd64a428b77f7119c2591c3\", # World Hillshade (for Export)\n", 68 | " \"226d23f076da478bba4589e7eae95952\", # World Imagery (for Export)\n", 69 | " \"5d85d897aee241f884158aa514954443\", # World Ocean Base (for Export)\n", 70 | " # vector basemaps\n", 71 | " \"38649a45a3544c0e809d00ea86be78e6\", # World Navigation Map (Dark - for Export)\n", 72 | " \"4faaa4931a3541e5b7461c732a7bda1c\", # World Ocean Reference (for Export)\n", 73 | " \"7f5fe58ee3c046da8d83980b7262b7f6\", # World Street Map (for Export)\n", 74 | " \"758db17cc1ee4181a049d1fa5d0c6bf0\", # World Street Map (with Relief - for Export)\n", 75 | 
" \"4b5200491af84f898e1e6aa494ed79e2\", # World Navigation Map (for Export)\n", 76 | " \"e7817b59ec4b40e4a1e55241b8695534\", # Hybrid Reference Layer (for Export)\n", 77 | " \"df541726b3df4c0caf99255bb1be4c86\", # World Topographic Map (for Export)\n", 78 | " \"f37f80e4e53c4c25bbec2eab05371f62\", # World Terrain with Labels (for Export)\n", 79 | " \"1f472880dccc4e99a47e4745908165b2\", # World Terrain Reference (for Export)\n", 80 | " \"f29b05507f594d00a916b33ebb8404f3\", # World Terrain Base (for Export)\n", 81 | " \"8e848d9302e84dcba0de7aff8ac429dc\", # World Street Map (Night - for Export)\n", 82 | " # additional basemaps\n", 83 | " # \"16024e0cee3949fa89a17f483b7189a9\", # National Geographic Style (for Export)\n", 84 | " # \"a2824f0bd9724eb9882eef0d059581d1\", # Light Gray Canvas (for Export)\n", 85 | " # \"d23123ae88a14088843bc552ad3e9868\", # Light Gray Canvas Base (for Export)\n", 86 | " # \"8e7636df0c7a4aafa4fe678081dd0d56\", # Light Gray Canvas Reference (for Export)\n", 87 | " # \"aa80feb1dfb946a1b0859df4acca14b1\", # Dark Gray Canvas Reference (for Export)\n", 88 | " # \"3175cf5b9e9f47f284c7b7d3c8d5b387\", # Dark Gray Canvas Base (for Export)\n", 89 | " # \"e945c4f09c4345ffb9ae6761cedf5c72\", # Dark Gray Canvas (for Export)\n", 90 | " # US Edition vector basemaps\n", 91 | " # \"2c24dacdf40c4f5ebfd1ebc5bf2f4578\", # Dark Gray Canvas Reference (US Edition)\n", 92 | " # \"3afd423cbacf4d61a38f068264bd2d88\", # Dark Gray Base (US Edition)\n", 93 | " # \"5447e9aef0684ec391ae9381725f7370\", # Hybrid Reference Layer (US Edition)\n", 94 | " # \"12124841a631489fb71aa1c0c8952c49\", # Light Gray Base (US Edition)\n", 95 | " # \"b14e8a2500864810a4f7ea0d2049ecd9\", # Light Gray Reference (US Edition)\n", 96 | " # \"cdd1ca79ffc74237bd6f76e5d9803e2e\", # National Geographic Style (US Edition)\n", 97 | " # \"44b280223e7c4633b91b32e36dafe02b\", # World Navigation Dark Map (Places) (US Edition)\n", 98 | " # \"657a5dffea1d45939d999ef91cc5dfc4\", # World Navigation Map 
(Dark) (US Edition)\n", 99 | " # \"461dbee807e340e9a0bc56ef7770dd90\", # World Navigation Map (Places) (US Edition)\n", 100 | " # \"33136781867443bdac972cc916da8c59\", # World Navigation Map (US Edition)\n", 101 | " # \"6061e78281f94bb6a671d11253d41f6e\", # World Ocean Reference (US Edition)\n", 102 | " # \"6ee2da1b8303471c9e3340d1277926bc\", # World Street Map (US Edition)\n", 103 | " # \"8c633e6eab00489ea0de342ca1292988\", # World Street Map (Night) (US Edition)\n", 104 | " # \"a1428704b268492e84747ea1a69e4315\", # World Terrain Reference (US Edition)\n", 105 | " # \"27e89eb03c1e4341a1d75e597f0291e6\", # World Topographic Map (US Edition)\n", 106 | "]" 107 | ] 108 | }, 109 | { 110 | "cell_type": "markdown", 111 | "metadata": {}, 112 | "source": [ 113 | "#### Functions\n" 114 | ] 115 | }, 116 | { 117 | "cell_type": "code", 118 | "execution_count": 3, 119 | "metadata": {}, 120 | "outputs": [], 121 | "source": [ 122 | "# Clone the item from ArcGIS Online to ArcGIS Enterprise\n", 123 | "def clone_basemap(\n", 124 | " basemap_folder, gis_ent_conn, agol_item_to_clone\n", 125 | ") -> Item:\n", 126 | " cloned_item = None\n", 127 | " try:\n", 128 | " # Clone the item from agol\n", 129 | " cloned_item_list = gis_ent_conn.content.clone_items(\n", 130 | " items=[agol_item_to_clone], folder=basemap_folder\n", 131 | " )\n", 132 | " # Check to see if item was cloned and assign it to return variable\n", 133 | " if len(cloned_item_list) == 1:\n", 134 | " cloned_item = cloned_item_list[0]\n", 135 | "\n", 136 | " except Exception as e:\n", 137 | " print(f\" An error occurred cloning item: {e}\")\n", 138 | "\n", 139 | " return cloned_item" 140 | ] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "execution_count": 4, 145 | "metadata": {}, 146 | "outputs": [], 147 | "source": [ 148 | "# Update the item to use stored credentials\n", 149 | "def update_service_creds(\n", 150 | " ent_item_to_update, agol_username, agol_password, enterprise_token\n", 151 | ") -> bool:\n", 152 | " 
update_results = False\n", 153 | " try:\n", 154 | " update_results = ent_item_to_update.update(\n", 155 | " item_properties={\n", 156 | " \"url\": ent_item_to_update.url,\n", 157 | " \"serviceUsername\": agol_username,\n", 158 | " \"servicePassword\": agol_password,\n", 159 | " \"token\": enterprise_token,\n", 160 | " }\n", 161 | " )\n", 162 | " except Exception as e:\n", 163 | " print(f\" An error occurred while updating item properties for the item: {e}\")\n", 164 | "\n", 165 | " return update_results" 166 | ] 167 | }, 168 | { 169 | "cell_type": "code", 170 | "execution_count": 5, 171 | "metadata": {}, 172 | "outputs": [], 173 | "source": [ 174 | "# Update the style JSON to point to the proxied source\n", 175 | "# We need to check that the new enterprise url we are changing to\n", 176 | "# matches the agol url - we just need to check the \"end\" of the map service path\n", 177 | "def update_style_file(enterprise_item) -> bool:\n", 178 | " update_results = False\n", 179 | " try:\n", 180 | "\n", 181 | " # Get the name of the map service to use for checking\n", 182 | " search_string = \"/rest/services/\"\n", 183 | " enterprise_map_search_index = enterprise_item.url.find(search_string) + len(\n", 184 | " search_string\n", 185 | " )\n", 186 | " enterprise_map_service_ends_with = enterprise_item.url[\n", 187 | " enterprise_map_search_index:\n", 188 | " ]\n", 189 | "\n", 190 | " # Get the styles/root.json\n", 191 | " styles_json = enterprise_item.resources.get(file=\"styles/root.json\")\n", 192 | "\n", 193 | " # Update the values that reference arcgis online - repoint them to enterprise\n", 194 | " styles_json[\"sprite\"] = \"../sprites/sprite\"\n", 195 | " styles_json[\"glyphs\"] = (\n", 196 | " enterprise_item.url + \"/resources/fonts/{fontstack}/{range}.pbf\"\n", 197 | " )\n", 198 | " for url_source in styles_json[\"sources\"].items():\n", 199 | " if url_source[1][\"type\"] == \"vector\":\n", 200 | " if 
url_source[1][\"url\"].endswith(enterprise_map_service_ends_with):\n", 201 | " url_source[1][\"url\"] = enterprise_item.url\n", 202 | " else:\n", 203 | " print(\n", 204 | " \" ERROR: vector basemap contains invalid map url, this is not currently supported with this script\"\n", 205 | " )\n", 206 | " print(f\" {url_source[1]['url']}\")\n", 207 | "\n", 208 | " # Update file with edits and get return success status\n", 209 | " update_results = enterprise_item.resources.update(\n", 210 | " folder_name=\"styles\", file_name=\"root.json\", text=styles_json\n", 211 | " )[\"success\"]\n", 212 | "\n", 213 | " except Exception as e:\n", 214 | " print(f\" An error occurred while updating style file of the item: {e}\")\n", 215 | "\n", 216 | " return update_results" 217 | ] 218 | }, 219 | { 220 | "cell_type": "markdown", 221 | "metadata": {}, 222 | "source": [ 223 | "#### Provide ArcGIS Credentials and Connect to Enterprise GIS using form\n", 224 | "\n", 225 | "This form is optional - there are a variety of secure methods for providing credentials - please see links below:\n", 226 | "\n", 227 | "- For information on different authentication schemes see:\n", 228 | " - https://developers.arcgis.com/python/guide/working-with-different-authentication-schemes/\n", 229 | "- For protecting your credentials see:\n", 230 | " - https://developers.arcgis.com/python/guide/working-with-different-authentication-schemes/#protecting-your-credentials\n" 231 | ] 232 | }, 233 | { 234 | "cell_type": "code", 235 | "execution_count": 6, 236 | "metadata": {}, 237 | "outputs": [], 238 | "source": [ 239 | "# Created a class to be able to access form input from outside the on click event - perhaps a better way of doing this :)\n", 240 | "class UserInputForm:\n", 241 | " def __init__(self):\n", 242 | "\n", 243 | " # Create form input elements\n", 244 | " self.form_agol_username = widgets.Text()\n", 245 | " self.form_agol_pw = widgets.Password()\n", 246 | " self.form_portal_url = 
widgets.Text(placeholder=\"https://exampleportal/portal\")\n", 247 | " self.form_portal_username = widgets.Text()\n", 248 | " self.form_portal_pw = widgets.Password()\n", 249 | " self.form_connect_button = widgets.Button(description=\"Submit GIS Details\")\n", 250 | "\n", 251 | " self.form_connect_button.on_click(self.submit_gis_details_clicked)\n", 252 | "\n", 253 | " self.form_items = [\n", 254 | " widgets.Box(\n", 255 | " [\n", 256 | " widgets.Label(value=\"ArcGIS Online Username:\"),\n", 257 | " self.form_agol_username,\n", 258 | " ]\n", 259 | " ),\n", 260 | " widgets.Box(\n", 261 | " [widgets.Label(value=\"ArcGIS Online Password:\"), self.form_agol_pw]\n", 262 | " ),\n", 263 | " widgets.Box([widgets.Label(value=\"Portal URL:\"), self.form_portal_url]),\n", 264 | " widgets.Box(\n", 265 | " [widgets.Label(value=\"Portal Username:\"), self.form_portal_username]\n", 266 | " ),\n", 267 | " widgets.Box([widgets.Label(value=\"Portal Password:\"), self.form_portal_pw]),\n", 268 | " self.form_connect_button,\n", 269 | " ]\n", 270 | "\n", 271 | " self.form = widgets.Box(\n", 272 | " self.form_items,\n", 273 | " layout=widgets.Layout(\n", 274 | " display=\"flex\",\n", 275 | " flex_flow=\"column\",\n", 276 | " border=\"solid 2px\",\n", 277 | " align_items=\"flex-start\",\n", 278 | " ),\n", 279 | " )\n", 280 | "\n", 281 | " self.agol_username_value = None\n", 282 | " self.agol_pw_value = None\n", 283 | " self.portal_username_value = None\n", 284 | " self.portal_pw_value = None\n", 285 | " self.portal_url_value = None\n", 286 | "\n", 287 | " display(self.form)\n", 288 | "\n", 289 | " def submit_gis_details_clicked(self, b):\n", 290 | " self.agol_username_value = self.form_agol_username.value\n", 291 | " self.agol_pw_value = self.form_agol_pw.value\n", 292 | " self.portal_username_value = self.form_portal_username.value\n", 293 | " self.portal_pw_value = self.form_portal_pw.value\n", 294 | " self.portal_url_value = self.form_portal_url.value\n", 295 | "\n", 296 | " def 
get_agol_username_value(self):\n", 297 | " return self.agol_username_value\n", 298 | "\n", 299 | " def get_agol_pw_value(self):\n", 300 | " return self.agol_pw_value\n", 301 | "\n", 302 | " def get_form_portal_username(self):\n", 303 | " return self.portal_username_value\n", 304 | "\n", 305 | " def get_portal_pw_value(self):\n", 306 | " return self.portal_pw_value\n", 307 | "\n", 308 | " def get_portal_url_value(self):\n", 309 | " return self.portal_url_value" 310 | ] 311 | }, 312 | { 313 | "cell_type": "code", 314 | "execution_count": null, 315 | "metadata": {}, 316 | "outputs": [], 317 | "source": [ 318 | "# Render the form\n", 319 | "form = UserInputForm()\n", 320 | "\n", 321 | "# the form must be filled out and submitted before the rest of the script can run" 322 | ] 323 | }, 324 | { 325 | "cell_type": "markdown", 326 | "metadata": {}, 327 | "source": [ 328 | "#### Connect to ArcGIS Enterprise and ArcGIS Online\n" 329 | ] 330 | }, 331 | { 332 | "cell_type": "code", 333 | "execution_count": null, 334 | "metadata": {}, 335 | "outputs": [], 336 | "source": [ 337 | "agol_proxy_username = form.get_agol_username_value()\n", 338 | "agol_proxy_password = form.get_agol_pw_value()\n", 339 | "portal_username = form.get_form_portal_username()\n", 340 | "portal_password = form.get_portal_pw_value()\n", 341 | "portal_url = form.get_portal_url_value()\n", 342 | "\n", 343 | "# Connect to arcgis online\n", 344 | "gis_agol = GIS() # item is hosted publicly\n", 345 | "gis_ent = GIS(\n", 346 | " portal_url, portal_username, portal_password\n", 347 | ") # Enterprise 10.9.1 and above supported - Note: you may need to add \",verify_cert=False\" if you see certificate errors.\n", 348 | "\n", 349 | "# Get a token from ArcGIS Enterprise - this will be used when we update the item to provide credentials\n", 350 | "ent_token = gis_ent._con.token\n", 351 | "\n", 352 | "if not ent_token:\n", 353 | " print(\"Could not connect to enterprise, please review connection details...\")" 354 | ] 355 | }, 
356 | { 357 | "cell_type": "markdown", 358 | "metadata": {}, 359 | "source": [ 360 | "#### Create Folder for Basemaps" 361 | ] 362 | }, 363 | { 364 | "cell_type": "code", 365 | "execution_count": 10, 366 | "metadata": {}, 367 | "outputs": [], 368 | "source": [ 369 | "# Create folder if it doesn't exist\n", 370 | "folder_titles = [folder[\"title\"] for folder in gis_ent.users.me.folders]\n", 371 | "if BASEMAP_FOLDER not in folder_titles:\n", 372 | " new_folder = gis_ent.content.folders.create(\n", 373 | " BASEMAP_FOLDER, owner=gis_ent.users.me.username\n", 374 | " )\n", 375 | " if not new_folder:\n", 376 | " print(f\"Folder '{BASEMAP_FOLDER}' failed to create. Please resolve the issue before proceeding\")" 377 | ] 378 | }, 379 | { 380 | "cell_type": "markdown", 381 | "metadata": {}, 382 | "source": [ 383 | "#### Process Basemap List" 384 | ] 385 | }, 386 | { 387 | "cell_type": "code", 388 | "execution_count": null, 389 | "metadata": {}, 390 | "outputs": [], 391 | "source": [ 392 | "# Loop through the items in the list, checking that each step of the process was completed\n", 393 | "# If issues are encountered, error messages are printed and the item is skipped.\n", 394 | "for agol_item_id in AGOL_ITEM_LIST_ID:\n", 395 | "\n", 396 | " print(\"---------------------------\")\n", 397 | " # --Get agol item--\n", 398 | " # Check to make sure we have an item\n", 399 | " agol_item = gis_agol.content.get(agol_item_id)\n", 400 | " if not agol_item:\n", 401 | " print(f\" ERROR: could not find item: {agol_item_id} - skipping\")\n", 402 | " continue # To next item in for loop\n", 403 | "\n", 404 | " print(f\"processing: {agol_item.type} - {agol_item.title}\")\n", 405 | "\n", 406 | " # ---Clone item from arcgis online to enterprise---\n", 407 | " # Search for item in enterprise\n", 408 | " # Note the \"typekeywords:source-\" format is a reliable way to find the clone of an item\n", 409 | " ent_item_list = gis_ent.content.search(\n", 410 | " query=\"typekeywords:source-\" + 
agol_item_id, max_items=1\n", 411 | " )\n", 412 | "\n", 413 | " # If item exists skip it - delete any item you want to reclone\n", 414 | " if len(ent_item_list) == 1:\n", 415 | " print(\n", 416 | " \" ERROR: item already exists - permanently delete item in portal manually first if you want to reclone\"\n", 417 | " )\n", 418 | " continue # To next item in for loop\n", 419 | "\n", 420 | " ent_item = clone_basemap(BASEMAP_FOLDER, gis_ent, agol_item)\n", 421 | "\n", 422 | " # If item was cloned successfully, update the new item to use embedded credentials to agol\n", 423 | " if not ent_item:\n", 424 | " print(\" ERROR: item was not cloned\")\n", 425 | " continue # To next item in for loop\n", 426 | "\n", 427 | " print(\" cloned item from ArcGIS Online to ArcGIS Enterprise.\")\n", 428 | "\n", 429 | " # ---Update service credentials---\n", 430 | " update_service_creds_result = update_service_creds(\n", 431 | " ent_item, agol_proxy_username, agol_proxy_password, ent_token\n", 432 | " )\n", 433 | "\n", 434 | " # Check if update to service credentials was successful - if it wasn't add message and continue to next item\n", 435 | " if not update_service_creds_result:\n", 436 | " print(\" ERROR: service credentials could not be updated\")\n", 437 | " continue # To next item in for loop\n", 438 | "\n", 439 | " print(\" updated service credentials\")\n", 440 | "\n", 441 | " # ---Update style file for vector tile service---\n", 442 | " # We only need to update style file for VectorTile services\n", 443 | " if ent_item.type == \"Vector Tile Service\":\n", 444 | " update_style_file_results = update_style_file(ent_item)\n", 445 | "\n", 446 | " # Check if update to style file was successful\n", 447 | " if update_style_file_results:\n", 448 | " print(\" updated vector basemap style file\")\n", 449 | " else:\n", 450 | " print(\" ERROR: updating style file\")\n", 451 | "\n", 452 | " print(\" item ready for use\")\n", 453 | "\n", 454 | "print(\"--processing complete--\")" 455 | ] 456 | } 
457 | ], 458 | "metadata": { 459 | "esriNotebookRuntime": { 460 | "notebookRuntimeName": "ArcGIS Notebook Python 3 Standard", 461 | "notebookRuntimeVersion": "11.0" 462 | }, 463 | "kernelspec": { 464 | "display_name": "field-maps-scripts", 465 | "language": "python", 466 | "name": "python3" 467 | }, 468 | "language_info": { 469 | "codemirror_mode": { 470 | "name": "ipython", 471 | "version": 3 472 | }, 473 | "file_extension": ".py", 474 | "mimetype": "text/x-python", 475 | "name": "python", 476 | "nbconvert_exporter": "python", 477 | "pygments_lexer": "ipython3", 478 | "version": "3.11.11" 479 | } 480 | }, 481 | "nbformat": 4, 482 | "nbformat_minor": 4 483 | } 484 | -------------------------------------------------------------------------------- /notebooks/Watermark photo attachments with Exif data.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## Watermark photo attachments with Exif data\n", 8 | "\n", 9 | "This notebook demonstrates how to watermark photo attachments with latitude and longitude from Exif GPS info using the Pillow Library.\n", 10 | "\n", 11 | "To use this notebook, you need to update variables at the top of the script, including:\n", 12 | "\n", 13 | "1. feature_layer_item_id to the item id of the feature layer to watermark attachments for and feature_layer_id to specify the specific layer\n", 14 | "2. 
attachments_query to filter which features' attachments to watermark\n", 15 | "\n", 16 | "This notebook assumes that:\n", 17 | "\n", 18 | "- You are the owner of the feature layer\n", 19 | "- You are using Enterprise 10.7+ or ArcGIS Online\n", 20 | "\n", 21 | "For more information on working with attachments see:\n", 22 | "\n", 23 | "- ArcGIS Python API Layer Attachments - https://developers.arcgis.com/python/guide/using-attachments-with-feature-layers/\n", 24 | "- Pillow Library - https://pillow.readthedocs.io/en/stable/index.html" 25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": null, 30 | "metadata": {}, 31 | "outputs": [], 32 | "source": [ 33 | "import os\n", 34 | "import shutil\n", 35 | "import tempfile\n", 36 | "from arcgis.gis import GIS\n", 37 | "from datetime import datetime as dt\n", 38 | "from PIL import ExifTags\n", 39 | "from PIL import Image\n", 40 | "from PIL import ImageDraw\n", 41 | "from PIL import ImageOps\n", 42 | "\n", 43 | "\n", 44 | "# Using built-in user\n", 45 | "# For information on different authentication schemes see\n", 46 | "# https://developers.arcgis.com/python/guide/working-with-different-authentication-schemes/\n", 47 | "# and for protecting your credentials see\n", 48 | "# https://developers.arcgis.com/python/guide/working-with-different-authentication-schemes/#protecting-your-credentials\n", 49 | "gis = GIS(\"home\")\n", 50 | "\n", 51 | "# The item id of the feature layer to process attachments for\n", 52 | "feature_layer_item_id = \"7a28f3f872bd4b55bfd7965bf73a46b6\"\n", 53 | "# The id of the layer to use\n", 54 | "feature_layer_id = 0\n", 55 | "# Where clause to filter features and attachments\n", 56 | "# See FeatureLayer query() for more options - https://developers.arcgis.com/python/api-reference/arcgis.features.toc.html#featurelayer\n", 57 | "attachments_query = \"field_to_filter=2\"\n", 58 | "\n", 59 | "\n", 60 | "# Converts latitude and longitude from Exif GPS info in degrees, minutes and seconds to 
decimal degrees\n", 61 | "def convert_dms_to_dd(gps_info) -> tuple[float, float]:\n", 62 | "\n", 63 | " if not gps_info: # get_ifd() returns an empty dict when no GPS info is present\n", 64 | " return (0, 0)\n", 65 | "\n", 66 | " lat_degrees = gps_info[ExifTags.GPS.GPSLatitude][0]\n", 67 | " lat_minutes = gps_info[ExifTags.GPS.GPSLatitude][1]\n", 68 | " lat_seconds = gps_info[ExifTags.GPS.GPSLatitude][2]\n", 69 | " lon_degrees = gps_info[ExifTags.GPS.GPSLongitude][0]\n", 70 | " lon_minutes = gps_info[ExifTags.GPS.GPSLongitude][1]\n", 71 | " lon_seconds = gps_info[ExifTags.GPS.GPSLongitude][2]\n", 72 | "\n", 73 | " lat = float(lat_degrees + lat_minutes / 60 + lat_seconds / 3600)\n", 74 | " lon = float(lon_degrees + lon_minutes / 60 + lon_seconds / 3600)\n", 75 | "\n", 76 | " if gps_info[ExifTags.GPS.GPSLatitudeRef] == \"S\":\n", 77 | " lat = -lat\n", 78 | " if gps_info[ExifTags.GPS.GPSLongitudeRef] == \"W\":\n", 79 | " lon = -lon\n", 80 | "\n", 81 | " return (lat, lon)\n", 82 | "\n", 83 | "\n", 84 | "# Watermarks an image with decimal degree latitude and longitude from Exif data\n", 85 | "def add_dd_watermark(image_path):\n", 86 | "\n", 87 | " image = Image.open(image_path)\n", 88 | " image = ImageOps.exif_transpose(image)\n", 89 | "\n", 90 | " # Get the image's Exif data, convert the lat and long to decimal degrees and format the string for the watermark\n", 91 | " exif = image.getexif()\n", 92 | " gps_ifd = exif.get_ifd(ExifTags.IFD.GPSInfo)\n", 93 | " (lat, lon) = convert_dms_to_dd(gps_ifd)\n", 94 | " watermark_text = f\"Latitude: {lat:.4f} Longitude: {lon:.4f}\"\n", 95 | "\n", 96 | " draw = ImageDraw.Draw(image)\n", 97 | "\n", 98 | " # Draw the watermark in the top left corner of the image in white\n", 99 | " # For more drawing options see: https://pillow.readthedocs.io/en/stable/reference/ImageDraw.html#PIL.ImageDraw.ImageDraw.text\n", 100 | " # See https://pillow.readthedocs.io/en/stable/reference/ImageFont.html for information on how to change the font, including font size\n", 101 | " text_xy_position = (10, 10)\n", 102 | 
" rgb_fill_color = (255, 255, 255)\n", 103 | " draw.text(text_xy_position, watermark_text, fill=rgb_fill_color)\n", 104 | " image.save(image_path)\n", 105 | " # Optional call to show the photo with the watermark\n", 106 | " image.show()\n", 107 | "\n", 108 | "\n", 109 | "# Create temporary output directories\n", 110 | "temporary_attachment_output = tempfile.TemporaryDirectory()\n", 111 | "temporary_zip_output = tempfile.TemporaryDirectory()\n", 112 | "temporary_zip_output_name = os.path.join(\n", 113 | " temporary_zip_output.name,\n", 114 | " \"zipped_photos_{}\".format(dt.now().strftime(\"%Y-%m-%d %H-%M-%S\")),\n", 115 | ")\n", 116 | "\n", 117 | "# Query for features that match the where clause and download their attachments to a temporary output directory\n", 118 | "layer_item = gis.content.get(feature_layer_item_id)\n", 119 | "layer = layer_item.layers[feature_layer_id]\n", 120 | "oid_list = layer.query(where=attachments_query, return_ids_only=True)[\"objectIds\"]\n", 121 | "attachments = layer.attachments.download(\n", 122 | " oid=oid_list, save_path=temporary_attachment_output.name\n", 123 | ")\n", 124 | "\n", 125 | "# For each of the attachments watermark the image\n", 126 | "for attachment in attachments:\n", 127 | " add_dd_watermark(attachment)\n", 128 | "\n", 129 | "# Create a zip archive with the watermarked images\n", 130 | "output_zip = shutil.make_archive(\n", 131 | " temporary_zip_output_name, \"zip\", temporary_attachment_output.name\n", 132 | ")\n", 133 | "\n", 134 | "# Create a new Image Collection item with the archived watermarked images\n", 135 | "item_props = {\n", 136 | " \"type\": \"Image Collection\",\n", 137 | " \"typeKeywords\": [\"Image\", \"Image Collection\", \"Photo locations\"],\n", 138 | " \"description\": \"Images with watermarks\",\n", 139 | " \"title\": \"Images with watermarks\",\n", 140 | " \"tags\": [],\n", 141 | " \"snippet\": \"Images with watermarks\",\n", 142 | "}\n", 143 | "new_item = 
gis.content.add(item_properties=item_props, data=output_zip)\n", 144 | "\n", 145 | "# Clean up temporary files\n", 146 | "os.remove(output_zip)\n", 147 | "temporary_attachment_output.cleanup()\n", 148 | "temporary_zip_output.cleanup()\n", 149 | "\n", 150 | "display(new_item)\n" 151 | ] 152 | } 153 | ], 154 | "metadata": { 155 | "esriNotebookRuntime": { 156 | "notebookRuntimeName": "ArcGIS Notebook Python 3 Standard", 157 | "notebookRuntimeVersion": "9.0" 158 | }, 159 | "kernelspec": { 160 | "display_name": "Python 3 (ipykernel)", 161 | "language": "python", 162 | "name": "python3" 163 | }, 164 | "language_info": { 165 | "codemirror_mode": { 166 | "name": "ipython", 167 | "version": 3 168 | }, 169 | "file_extension": ".py", 170 | "mimetype": "text/x-python", 171 | "name": "python", 172 | "nbconvert_exporter": "python", 173 | "pygments_lexer": "ipython3", 174 | "version": "3.9.18" 175 | } 176 | }, 177 | "nbformat": 4, 178 | "nbformat_minor": 2 179 | } 180 | -------------------------------------------------------------------------------- /readmes/copy_form_between_maps.md: -------------------------------------------------------------------------------- 1 | ## Copy Form Between Maps 2 | 3 | This script allows a user to copy a smart form between different maps to avoid needing to create forms multiple times. 4 | You can provide the item id (found in the URL when viewing that webmap in the Portal or Field Maps) of both the origin 5 | map where the form has been created and the destination map where you would like the form copied to. 6 | 7 | The script will then search for the same layer or table being present in both maps (as you must use the same feature service in 8 | both maps for this script to work) and, if a matching layer is found, copy the formInfo object from the source map's 9 | operational layer/table to the destination map's operational layer/table. You should then be able to use your smart form in the 10 | destination map. 
11 | 12 | By default, this script will copy forms for any matching layers found (so for example, if the same two layers are in the 13 | source and destination maps, both layers will have their forms copied over from the source to the destination map). Note that 14 | any existing form for those matching layers on your destination map will only get overwritten if you pass --overwrite. If you'd only like to copy the form 15 | for one layer, simply provide the name of that layer as a parameter to the script. 16 | 17 | **This script requires the logged in user to have editing privileges on both maps** 18 | 19 | Supports Python 3.6+ 20 | 21 | ---- 22 | 23 | In addition to the authentication arguments, the script specific arguments are as follows: 24 | 25 | - -source-map-id - The item ID of the map where your form(s) live that you'd like to copy to the same layer in a different map 26 | - -dest-map-id - The item ID of the map where you want your form to be copied to 27 | - -layer-name - If you'd like to restrict copying over forms to only certain layers in each map, provide the title of the layer whose form you'd like to copy here (Optional) 28 | - --overwrite - If you'd like to overwrite existing forms in the destination map using this script, please pass this parameter 29 | - -log-file - The log file to use for logging messages (Optional) 30 | 31 | ```bash 32 | python copy_form_between_maps.py -org https://arcgis.com -u username -p password -source-map-id d2295f09f97945c9b447417ba27bcb38 -dest-map-id 4392801d9c8341ce9038614ff240e877 -layer-name "Inspections" --overwrite 33 | ``` 34 | -------------------------------------------------------------------------------- /readmes/download_attachments.md: -------------------------------------------------------------------------------- 1 | ## Download Attachments from Feature Layer 2 | 3 | A user may want to download attachments that have been collected by their mobile workforce in ArcGIS Field Maps or ArcGIS Workforce. 
4 | 5 | This script allows users to download attachments (either all or a subset) from a particular feature layer or item to a folder on their computer. 6 | 7 | You can either specify an item id or feature layer url to use for this particular script. 8 | 9 | Supports Python 3.6+ 10 | 11 | ---- 12 | 13 | In addition to the authentication arguments, the script specific arguments are as follows: 14 | 15 | - -item-id (Optional) - The item ID with the layer or table whose attachments you'd like to download. If an item has multiple layers / tables, it will attempt to download attachments for all layers / tables 16 | - -layer-url (Optional) - The feature layer url whose attachments you'd like to download 17 | - -out-folder (Optional) - The path on your computer where you'd like to direct the attachments. If not passed, will download to a folder called "attachments" in the present working directory 18 | - -where (Optional) - The where clause that defines which features' attachments you'd like to download. If not passed, will download all attachments. 
19 | - -log-file - The log file to use for logging messages (Optional) 20 | 21 | Example 1: 22 | ```bash 23 | python download_attachments.py -org https://arcgis.com -u username -p password -item-id d2295f09f97945c9b447417ba27bcb38 -where "status = 'Completed'" -out-folder "/Users/johndoe/Documents/my-attachments" 24 | ``` 25 | 26 | Example 2: 27 | ```bash 28 | python download_attachments.py -org https://arcgis.com -u username -p password -layer-url https://services.arcgis.com/US6xjA1Nd8bW1aoA/arcgis/rest/services/workforce_e6c540083fec4b46b193bb040a556d41/FeatureServer/0 29 | ``` 30 | 31 | -------------------------------------------------------------------------------- /scripts/copy_form_between_maps.py: -------------------------------------------------------------------------------- 1 | # -*- coding: UTF-8 -*- 2 | """ 3 | Copyright 2020 Esri 4 | 5 | Licensed under the Apache License, Version 2.0 (the "License"); 6 | 7 | you may not use this file except in compliance with the License. 8 | 9 | You may obtain a copy of the License at 10 | 11 | http://www.apache.org/licenses/LICENSE-2.0 12 | 13 | Unless required by applicable law or agreed to in writing, software 14 | 15 | distributed under the License is distributed on an "AS IS" BASIS, 16 | 17 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
18 | 19 | See the License for the specific language governing permissions and 20 | 21 | limitations under the License.​ 22 | 23 | This sample copies a form from one map to another map if a matching layer is found 24 | """ 25 | import argparse 26 | import logging 27 | import logging.handlers 28 | import traceback 29 | import sys 30 | from arcgis.gis import GIS 31 | 32 | 33 | def initialize_logging(log_file=None): 34 | """ 35 | Setup logging 36 | :param log_file: (string) The file to log to 37 | :return: (Logger) a logging instance 38 | """ 39 | # initialize logging 40 | formatter = logging.Formatter( 41 | "[%(asctime)s] [%(filename)30s:%(lineno)4s - %(funcName)30s()][%(threadName)5s] [%(name)10.10s] [%(levelname)8s] %(message)s") 42 | # Grab the root logger 43 | logger = logging.getLogger() 44 | # Set the root logger logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL) 45 | logger.setLevel(logging.DEBUG) 46 | # Create a handler to print to the console 47 | sh = logging.StreamHandler(sys.stdout) 48 | sh.setFormatter(formatter) 49 | sh.setLevel(logging.INFO) 50 | # Create a handler to log to the specified file 51 | if log_file: 52 | rh = logging.handlers.RotatingFileHandler(log_file, mode='a', maxBytes=10485760) 53 | rh.setFormatter(formatter) 54 | rh.setLevel(logging.DEBUG) 55 | logger.addHandler(rh) 56 | # Add the handlers to the root logger 57 | logger.addHandler(sh) 58 | return logger 59 | 60 | 61 | def replace_form(source_objects, dest_objects, overwrite=False): 62 | modified = False 63 | for s_layer in source_objects: 64 | if "formInfo" in s_layer: 65 | # find matching layer based on url 66 | for d_layer in dest_objects: 67 | if d_layer.get("url", None) == s_layer.get("url", None): 68 | # Check it's the correct layer or that no layer name was passed 69 | if args.layer_name is None or args.layer_name == d_layer.get("title", None): 70 | # Check overwrite was provided or there's no existing form 71 | if overwrite or not d_layer.get("formInfo", None): 72 | 
d_layer["formInfo"] = s_layer["formInfo"] 73 | modified = True 74 | else: 75 | sys.exit("Existing form is detected. Pass --overwrite to the script to overwrite the form") 76 | return modified 77 | 78 | 79 | def main(arguments): 80 | # initialize logger 81 | logger = initialize_logging(arguments.log_file) 82 | 83 | # Create the GIS 84 | logger.info("Authenticating...") 85 | 86 | # First step is to get authenticate and get a valid token 87 | gis = GIS(arguments.org_url, 88 | username=arguments.username, 89 | password=arguments.password, 90 | verify_cert=not arguments.skip_ssl_verification) 91 | 92 | # Get the maps 93 | source_item = gis.content.get(arguments.source) 94 | dest_item = gis.content.get(arguments.dest) 95 | if source_item is None or dest_item is None: 96 | raise ValueError("Your source or destination map was not found! Please check your item ids again") 97 | if source_item.type.lower() != "web map" or dest_item.type.lower() != "web map": 98 | raise ValueError("One of the items you passed is not a webmap. Please check again.") 99 | 100 | # Iterate through operational layers 101 | logger.info("Iterating through layers") 102 | source_layers = source_item.get_data().get("operationalLayers", []) 103 | source_tables = source_item.get_data().get("tables", []) 104 | dest_data = dest_item.get_data() 105 | dest_layers = dest_data.get("operationalLayers", []) 106 | dest_tables = dest_data.get("tables", []) 107 | 108 | # Swizzle in new form 109 | forms_modified = replace_form(source_layers, dest_layers, args.overwrite) 110 | tables_modified = replace_form(source_tables, dest_tables, args.overwrite) 111 | 112 | # Saving form 113 | if forms_modified or tables_modified: 114 | logger.info("Saving form(s) to destination map") 115 | dest_item.update(data=dest_data) 116 | else: 117 | logger.info("No matching layers were found! 
Your destination map was not modified") 118 | logger.info("Completed!") 119 | 120 | 121 | if __name__ == "__main__": 122 | # Get all of the commandline arguments 123 | parser = argparse.ArgumentParser("Copy the same form between the same layer in two different maps") 124 | parser.add_argument('-u', dest='username', help="The username to authenticate with", required=True) 125 | parser.add_argument('-p', dest='password', help="The password to authenticate with", required=True) 126 | parser.add_argument('-org', dest='org_url', help="The url of the org/portal to use", required=True) 127 | # Parameters for workforce 128 | parser.add_argument('-source-map-id', dest='source', help="The item id of the map you want to copy the form(s) from", 129 | required=True) 130 | parser.add_argument('-dest-map-id', dest='dest', help="The item id of the map you want to copy the form(s) to", required=True) 131 | parser.add_argument('-layer-name', dest='layer_name', help="An optional specific layer you want to copy. 
If this is not provided, " 132 | "all matching layers will have their forms copied", required=False) 133 | parser.add_argument('--overwrite', dest='overwrite', action='store_true', help="Provide this parameter if you would like to overwrite an existing form") 134 | parser.add_argument('--skip-ssl-verification', 135 | dest='skip_ssl_verification', 136 | action='store_true', 137 | help="Do not verify the SSL certificate of the server") 138 | parser.add_argument('-log-file', dest='log_file', help="The log file to write to") 139 | args = parser.parse_args() 140 | try: 141 | main(args) 142 | except Exception as e: 143 | logging.getLogger().critical("Exception detected, script exiting") 144 | logging.getLogger().critical(e) 145 | logging.getLogger().critical(traceback.format_exc().replace("\n", " | ")) 146 | -------------------------------------------------------------------------------- /scripts/download_attachments.py: -------------------------------------------------------------------------------- 1 | # -*- coding: UTF-8 -*- 2 | """ 3 | Copyright 2020 Esri 4 | 5 | Licensed under the Apache License, Version 2.0 (the "License"); 6 | 7 | you may not use this file except in compliance with the License. 8 | 9 | You may obtain a copy of the License at 10 | 11 | http://www.apache.org/licenses/LICENSE-2.0 12 | 13 | Unless required by applicable law or agreed to in writing, software 14 | 15 | distributed under the License is distributed on an "AS IS" BASIS, 16 | 17 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
18 | 19 | See the License for the specific language governing permissions and 20 | 21 | limitations under the License. 22 | 23 | This sample downloads attachments to a local machine from a feature layer or table based on user input 24 | """ 25 | import argparse 26 | import logging 27 | import logging.handlers 28 | import traceback 29 | import sys 30 | from arcgis.gis import GIS 31 | from arcgis.features import FeatureLayer 32 | 33 | 34 | def initialize_logging(log_file=None): 35 | """ 36 | Set up logging 37 | :param log_file: (string) The file to log to 38 | :return: (Logger) a logging instance 39 | """ 40 | # initialize logging 41 | formatter = logging.Formatter( 42 | "[%(asctime)s] [%(filename)30s:%(lineno)4s - %(funcName)30s()][%(threadName)5s] [%(name)10.10s] [%(levelname)8s] %(message)s") 43 | # Grab the root logger 44 | logger = logging.getLogger() 45 | # Set the root logger logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL) 46 | logger.setLevel(logging.DEBUG) 47 | # Create a handler to print to the console 48 | sh = logging.StreamHandler(sys.stdout) 49 | sh.setFormatter(formatter) 50 | sh.setLevel(logging.INFO) 51 | # Create a handler to log to the specified file 52 | if log_file: 53 | rh = logging.handlers.RotatingFileHandler(log_file, mode='a', maxBytes=10485760) 54 | rh.setFormatter(formatter) 55 | rh.setLevel(logging.DEBUG) 56 | logger.addHandler(rh) 57 | # Add the handlers to the root logger 58 | logger.addHandler(sh) 59 | return logger 60 | 61 | 62 | def main(arguments): 63 | # initialize logger 64 | logger = initialize_logging(arguments.log_file) 65 | 66 | # Create the GIS 67 | logger.info("Authenticating...") 68 | 69 | # First step is to authenticate and get a valid token 70 | gis = GIS(arguments.org_url, 71 | username=arguments.username, 72 | password=arguments.password, 73 | verify_cert=not arguments.skip_ssl_verification) 74 | 75 | if arguments.item_id is None and arguments.layer_url is None: 76 | raise Exception("Must pass either item id or layer 
url") 77 | if arguments.item_id: 78 | item = gis.content.get(arguments.item_id) 79 | if item: 80 | layers = item.layers + item.tables 81 | else: 82 | raise Exception("Bad item id, please check again") 83 | else: 84 | layers = [FeatureLayer(url=arguments.layer_url, gis=gis)] 85 | 86 | logger.info("Downloading attachments") 87 | 88 | for layer in layers: 89 | try: 90 | found_attach = layer.attachments.search(arguments.where) 91 | for attach in found_attach: 92 | try: 93 | layer.attachments.download(attachment_id=attach['ID'], oid=attach['PARENTOBJECTID'], save_path=arguments.out_folder) 94 | except Exception: 95 | try: 96 | logger.warning(f"Failed to download attachment {attach['NAME']} from object id {attach['PARENTOBJECTID']}") 97 | except Exception as e: 98 | logger.warning(e) 99 | except Exception: 100 | logger.warning(f"Failed to search attachments for layer {layer.url}; skipping it") 101 | logger.info("Completed successfully!") 102 | 103 | 104 | if __name__ == "__main__": 105 | # Get all of the command-line arguments 106 | parser = argparse.ArgumentParser(description="Download attachments from an ArcGIS hosted feature layer") 107 | parser.add_argument('-u', dest='username', help="The username to authenticate with", required=True) 108 | parser.add_argument('-p', dest='password', help="The password to authenticate with", required=True) 109 | parser.add_argument('-org', dest='org_url', help="The URL of the org/portal to use", required=True) 110 | # Parameters for the layer or table whose attachments will be downloaded 111 | parser.add_argument('-item-id', dest='item_id', help="The item ID of the layer or table whose attachments you'd like to download") 112 | parser.add_argument('-layer-url', dest='layer_url', help="The URL of the feature layer whose attachments you'd like to download") 113 | parser.add_argument('-out-folder', dest='out_folder', help="The path on your computer where you'd like to save the attachments. 
If not passed, " 114 | "will download to ./attachments in the current working directory", required=False, default="./attachments") 115 | parser.add_argument('-where', dest='where', help="An optional where clause used to filter attachments. If not passed, all attachments will be downloaded.", default='1=1') 116 | parser.add_argument('--skip-ssl-verification', 117 | dest='skip_ssl_verification', 118 | action='store_true', 119 | help="Do not verify the SSL certificate of the server") 120 | parser.add_argument('-log-file', dest='log_file', help="The log file to write to") 121 | args = parser.parse_args() 122 | try: 123 | main(args) 124 | except Exception as e: 125 | logging.getLogger().critical("Exception detected, script exiting") 126 | logging.getLogger().critical(e) 127 | logging.getLogger().critical(traceback.format_exc().replace("\n", " | ")) 128 | --------------------------------------------------------------------------------
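Both scripts share the same `initialize_logging` layout: a console handler at INFO on stdout plus an optional `RotatingFileHandler` at DEBUG, both attached to the root logger. That split can be exercised with only the standard library; a minimal sketch (the simplified format string, demo messages, and temp-file path are illustrative, not part of the scripts):

```python
import logging
import logging.handlers
import os
import sys
import tempfile


def initialize_logging(log_file=None):
    """Same handler layout as the scripts above: INFO to stdout, DEBUG to a rotating file."""
    formatter = logging.Formatter("[%(asctime)s] [%(levelname)8s] %(message)s")
    logger = logging.getLogger()
    logger.setLevel(logging.DEBUG)
    # Console handler only passes INFO and above
    sh = logging.StreamHandler(sys.stdout)
    sh.setFormatter(formatter)
    sh.setLevel(logging.INFO)
    if log_file:
        # File handler keeps full DEBUG detail; rotate at ~10 MB so long runs stay bounded
        rh = logging.handlers.RotatingFileHandler(log_file, mode='a', maxBytes=10485760)
        rh.setFormatter(formatter)
        rh.setLevel(logging.DEBUG)
        logger.addHandler(rh)
    logger.addHandler(sh)
    return logger


log_path = os.path.join(tempfile.mkdtemp(), "demo.log")
logger = initialize_logging(log_path)
logger.debug("debug detail lands in the file only")
logger.info("progress message lands in both the console and the file")
for handler in logger.handlers:
    handler.flush()
with open(log_path) as f:
    contents = f.read()
```

The point of the split is that a scheduled run keeps a verbose on-disk audit trail while the console shows only progress; because the handlers sit on the root logger, any library that logs through `logging` is captured in the file too.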