├── notebooks
│   ├── courses
│   │   ├── environmental-remote-sensing
│   │   │   ├── src
│   │   │   │   └── envrs
│   │   │   │       ├── __init__.py
│   │   │   │       ├── download_path.py
│   │   │   │       ├── ssm_cmap.py
│   │   │   │       ├── rio_tools.py
│   │   │   │       ├── corr_plots.py
│   │   │   │       └── hls_tools.py
│   │   │   ├── images
│   │   │   │   ├── data-hsaf-website.png
│   │   │   │   ├── home-hsaf-website-r.png
│   │   │   │   ├── home-wekeo-website.png
│   │   │   │   ├── login-wekeo-website.png
│   │   │   │   ├── login-hsaf-website-r.png
│   │   │   │   ├── portal-wekeo-website.png
│   │   │   │   ├── register-hsaf-website.png
│   │   │   │   ├── dataviewer-wekeo-website.png
│   │   │   │   ├── swi-search-wekeo-website.png
│   │   │   │   └── api-swi-search-wekeo-website.png
│   │   │   ├── pyproject.toml
│   │   │   └── unit_01
│   │   │       ├── 02_homework_exercise.ipynb
│   │   │       ├── 03_homework_exercise.ipynb
│   │   │       └── 05_supplement_drought.ipynb
│   │   ├── microwave-remote-sensing
│   │   │   ├── images
│   │   │   │   ├── code_cell.png
│   │   │   │   ├── ridgecrest.gif
│   │   │   │   ├── stop_server.png
│   │   │   │   ├── markdown_cell.png
│   │   │   │   ├── new_notebooks.png
│   │   │   │   ├── select_kernel.png
│   │   │   │   ├── select_server.png
│   │   │   │   ├── startingpage.png
│   │   │   │   ├── jupiterhub_icon.png
│   │   │   │   ├── quit_processes.png
│   │   │   │   ├── speckle_effect.png
│   │   │   │   ├── start_terminal.png
│   │   │   │   ├── backscattering_coefficients.png
│   │   │   │   ├── side_looking_image_distortions.png
│   │   │   │   └── tuw-geo-logo.svg
│   │   │   ├── assets
│   │   │   │   └── images
│   │   │   │       ├── code_cell.png
│   │   │   │       ├── stop_server.png
│   │   │   │       ├── markdown_cell.png
│   │   │   │       ├── new_notebooks.png
│   │   │   │       ├── quit_processes.png
│   │   │   │       ├── ridgecrest_gif.gif
│   │   │   │       ├── select_kernel.png
│   │   │   │       ├── select_server.png
│   │   │   │       ├── speckle_effect.png
│   │   │   │       ├── start_terminal.png
│   │   │   │       ├── startingpage.png
│   │   │   │       ├── jupiterhub_icon.png
│   │   │   │       ├── backscattering_coefficients.png
│   │   │   │       ├── side_looking_image_distortions.png
│   │   │   │       └── tuw-geo-logo.svg
│   │   │   ├── _quarto.yml
│   │   │   ├── pyproject.toml
│   │   │   ├── unit_02
│   │   │   │   ├── 05_in_class_exercise.ipynb
│   │   │   │   └── 06_in_class_exercise.ipynb
│   │   │   └── unit_01
│   │   │       ├── 03_in_class_exercise.ipynb
│   │   │       └── 02_in_class_exercise.ipynb
│   │   ├── environmental-remote-sensing.ipynb
│   │   └── microwave-remote-sensing.ipynb
│   ├── images
│   │   ├── ridgecrest.gif
│   │   ├── icons
│   │   │   └── favicon.ico
│   │   ├── speckle_effect.png
│   │   ├── tuw-geo_eodc_logo_vertical.png
│   │   ├── side_looking_image_distortions.png
│   │   ├── logos
│   │   │   ├── tuw-geo_eodc_logo_horizontal.png
│   │   │   ├── pythia_logo-white-notext.svg
│   │   │   └── pythia_logo-white-rtext.svg
│   │   ├── cmaps
│   │   │   └── 06_color_mapping.json
│   │   └── ProjectPythia_Logo_Final-01-Blue.svg
│   ├── references.ipynb
│   ├── tutorials
│   │   └── prereqs-tutorials.ipynb
│   ├── how-to-cite.md
│   └── templates
│       └── prereqs-templates.ipynb
├── .isort.cfg
├── .github
│   ├── workflows
│   │   ├── trigger-delete-preview.yaml
│   │   ├── trigger-book-build.yaml
│   │   ├── publish-book.yaml
│   │   ├── nightly-build.yaml
│   │   ├── replace-links.yaml
│   │   ├── trigger-replace-links.yaml
│   │   └── trigger-preview.yaml
│   └── dependabot.yml
├── _gallery_info.yml
├── .pre-commit-config.yaml
├── environment.yml
├── CITATION.cff
├── .gitignore
├── myst.yml
├── _config.yml
├── README.md
└── LICENSE
/notebooks/courses/environmental-remote-sensing/src/envrs/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/.isort.cfg:
--------------------------------------------------------------------------------
1 | [settings]
2 | known_third_party = IPython,envrs,matplotlib,numpy,pandas,rasterio,shapely,statsmodels,xarray
3 |
--------------------------------------------------------------------------------
/notebooks/images/ridgecrest.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/images/ridgecrest.gif
--------------------------------------------------------------------------------
/notebooks/images/icons/favicon.ico:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/images/icons/favicon.ico
--------------------------------------------------------------------------------
/notebooks/images/speckle_effect.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/images/speckle_effect.png
--------------------------------------------------------------------------------
/notebooks/images/tuw-geo_eodc_logo_vertical.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/images/tuw-geo_eodc_logo_vertical.png
--------------------------------------------------------------------------------
/notebooks/images/side_looking_image_distortions.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/images/side_looking_image_distortions.png
--------------------------------------------------------------------------------
/notebooks/images/logos/tuw-geo_eodc_logo_horizontal.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/images/logos/tuw-geo_eodc_logo_horizontal.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/code_cell.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/code_cell.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/ridgecrest.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/ridgecrest.gif
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/stop_server.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/stop_server.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/markdown_cell.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/markdown_cell.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/new_notebooks.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/new_notebooks.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/select_kernel.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/select_kernel.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/select_server.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/select_server.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/startingpage.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/startingpage.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/code_cell.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/code_cell.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/jupiterhub_icon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/jupiterhub_icon.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/quit_processes.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/quit_processes.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/speckle_effect.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/speckle_effect.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/start_terminal.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/start_terminal.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/stop_server.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/stop_server.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/data-hsaf-website.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/data-hsaf-website.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/markdown_cell.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/markdown_cell.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/new_notebooks.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/new_notebooks.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/quit_processes.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/quit_processes.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/ridgecrest_gif.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/ridgecrest_gif.gif
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/select_kernel.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/select_kernel.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/select_server.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/select_server.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/speckle_effect.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/speckle_effect.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/start_terminal.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/start_terminal.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/startingpage.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/startingpage.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/home-hsaf-website-r.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/home-hsaf-website-r.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/home-wekeo-website.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/home-wekeo-website.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/login-wekeo-website.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/login-wekeo-website.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/jupiterhub_icon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/jupiterhub_icon.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/login-hsaf-website-r.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/login-hsaf-website-r.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/portal-wekeo-website.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/portal-wekeo-website.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/register-hsaf-website.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/register-hsaf-website.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/dataviewer-wekeo-website.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/dataviewer-wekeo-website.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/swi-search-wekeo-website.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/swi-search-wekeo-website.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/backscattering_coefficients.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/backscattering_coefficients.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/side_looking_image_distortions.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/images/side_looking_image_distortions.png
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/images/api-swi-search-wekeo-website.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/environmental-remote-sensing/images/api-swi-search-wekeo-website.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/backscattering_coefficients.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/backscattering_coefficients.png
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/side_looking_image_distortions.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ProjectPythia/eo-datascience-cookbook/HEAD/notebooks/courses/microwave-remote-sensing/assets/images/side_looking_image_distortions.png
--------------------------------------------------------------------------------
/.github/workflows/trigger-delete-preview.yaml:
--------------------------------------------------------------------------------
1 | name: trigger-delete-preview
2 |
3 | on:
4 | pull_request_target:
5 | types: closed
6 |
7 | jobs:
8 | delete:
9 | uses: ProjectPythia/cookbook-actions/.github/workflows/delete-preview.yaml@main
10 |
--------------------------------------------------------------------------------
/.github/dependabot.yml:
--------------------------------------------------------------------------------
1 | version: 2
2 | updates:
3 | # - package-ecosystem: pip
4 | # directory: "/"
5 | # schedule:
6 | # interval: daily
7 | - package-ecosystem: "github-actions"
8 | directory: "/"
9 | schedule:
10 | # Check for updates once a week
11 | interval: "weekly"
12 |
--------------------------------------------------------------------------------
/_gallery_info.yml:
--------------------------------------------------------------------------------
1 | thumbnail: notebooks/images/logos/tuw-geo_eodc_logo_horizontal.png
2 | tags:
3 | domains:
4 | - remote-sensing
5 | - microwave-remote-sensing
6 | - earth-observation
7 | - sentinel-1
8 | - stac
9 | packages:
10 | - xarray
11 | - holoviews
12 | - pystac-client
13 | - odc-stac
14 | - rioxarray
15 |
--------------------------------------------------------------------------------
/.github/workflows/trigger-book-build.yaml:
--------------------------------------------------------------------------------
1 | name: trigger-book-build
2 | on:
3 | pull_request:
4 |
5 | jobs:
6 | build:
7 | uses: ProjectPythia/cookbook-actions/.github/workflows/build-book.yaml@main
8 | with:
9 | artifact_name: book-zip-${{ github.event.number }}
10 | base_url: "/${{ github.event.repository.name }}/_preview/${{ github.event.number }}"
11 | # Other input options are possible, see ProjectPythia/cookbook-actions/.github/workflows/build-book.yaml
12 |
--------------------------------------------------------------------------------
/.github/workflows/publish-book.yaml:
--------------------------------------------------------------------------------
1 | name: publish-book
2 |
3 | on:
4 | # Trigger the workflow on push to main branch
5 | push:
6 | branches:
7 | - main
8 | workflow_dispatch:
9 |
10 | jobs:
11 | build:
12 | uses: ProjectPythia/cookbook-actions/.github/workflows/build-book.yaml@main
13 | with:
14 | build_command: "myst build --execute --html"
15 |
16 | deploy:
17 | needs: build
18 | uses: ProjectPythia/cookbook-actions/.github/workflows/deploy-book.yaml@main
19 |
--------------------------------------------------------------------------------
/notebooks/references.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# References\n",
8 | "```{bibliography}\n",
9 | ":style: plain\n",
10 | "```\n"
11 | ]
12 | }
13 | ],
14 | "metadata": {
15 | "kernelspec": {
16 | "display_name": "Python 3 (ipykernel)",
17 | "language": "python",
18 | "name": "python3",
19 | "path": "/usr/share/jupyter/kernels/python3"
20 | }
21 | },
22 | "nbformat": 4,
23 | "nbformat_minor": 4
24 | }
25 |
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/src/envrs/download_path.py:
--------------------------------------------------------------------------------
1 | ROOT = "https://git.geo.tuwien.ac.at/api/v4/projects/1266/repository/files/"
2 |
3 |
4 | def make_url(
5 | file: str,
6 | *,
7 | lfs: bool = True,
8 | is_zip: bool = False,
9 | cache: bool = False,
10 | verbose: bool = True,
11 | branch: str = "main",
12 | ) -> str:
13 | """Generate a download URL for a file in the repository."""
14 | url = f"{ROOT}{file}/raw?ref={branch}&lfs={str(lfs).lower()}"
15 | if verbose:
16 | print(url)
17 | if is_zip:
18 | url = f"zip::{url}"
19 | if cache:
20 | url = f"simplecache::{url}"
21 | return url
22 |
--------------------------------------------------------------------------------
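The `make_url` helper above chains optional `zip::` and `simplecache::` prefixes onto a GitLab raw-file URL, following fsspec's URL-chaining convention. A minimal, self-contained sketch of how the flags compose (reproducing the function from the file above; the `example.zip` file name below is hypothetical):

```python
ROOT = "https://git.geo.tuwien.ac.at/api/v4/projects/1266/repository/files/"


def make_url(
    file: str,
    *,
    lfs: bool = True,
    is_zip: bool = False,
    cache: bool = False,
    verbose: bool = True,
    branch: str = "main",
) -> str:
    """Generate a download URL for a file in the repository."""
    url = f"{ROOT}{file}/raw?ref={branch}&lfs={str(lfs).lower()}"
    if verbose:
        print(url)
    if is_zip:
        # fsspec-style chaining: open the remote file as a zip archive
        url = f"zip::{url}"
    if cache:
        # cache the download locally on first access
        url = f"simplecache::{url}"
    return url


# Combining both flags prepends simplecache:: outermost, then zip::
url = make_url("example.zip", is_zip=True, cache=True, verbose=False)
print(url)
# simplecache::zip::https://git.geo.tuwien.ac.at/api/v4/projects/1266/repository/files/example.zip/raw?ref=main&lfs=true
```

The resulting string can be handed to an fsspec-aware reader (e.g. `fsspec.open` or `xarray`/`pandas` URL arguments) that understands chained protocols.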
/notebooks/tutorials/prereqs-tutorials.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Tutorials\n",
9 | "\n",
10 | "\n",
11 | "This section of the Cookbook covers a wide range of topics. The tutorials\n",
12 | "showcase the creation and usage of data products developed by TU Wien and EODC.\n"
13 | ]
14 | }
15 | ],
16 | "metadata": {
17 | "kernelspec": {
18 | "display_name": "Python 3 (ipykernel)",
19 | "language": "python",
20 | "name": "python3",
21 | "path": "/usr/share/jupyter/kernels/python3"
22 | }
23 | },
24 | "nbformat": 4,
25 | "nbformat_minor": 5
26 | }
27 |
--------------------------------------------------------------------------------
/.github/workflows/nightly-build.yaml:
--------------------------------------------------------------------------------
1 | name: nightly-build
2 |
3 | on:
4 | workflow_dispatch:
5 | schedule:
6 | - cron: "0 0 * * *" # daily at 00:00 UTC
7 |
8 | jobs:
9 | build:
10 | if: ${{ github.repository_owner == 'ProjectPythia' }}
11 | uses: ProjectPythia/cookbook-actions/.github/workflows/build-book.yaml@main
12 | with:
13 | build_command: "myst build --execute --html"
14 |
15 | deploy:
16 | needs: build
17 | uses: ProjectPythia/cookbook-actions/.github/workflows/deploy-book.yaml@main
18 | # We don't have a link-checker with MyST right now
19 | # link-check:
20 | # if: ${{ github.repository_owner == 'ProjectPythia' }}
21 | # uses: ProjectPythia/cookbook-actions/.github/workflows/link-checker.yaml@main
22 |
--------------------------------------------------------------------------------
/notebooks/how-to-cite.md:
--------------------------------------------------------------------------------
1 | # How to Cite This Cookbook
2 |
3 | The material in this Project Pythia Cookbook is licensed for free and open consumption and reuse. All code is served under [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0), while all non-code content is licensed under [Creative Commons BY 4.0 (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/). Effectively, this means you are free to share and adapt this material so long as you give appropriate credit to the Cookbook authors and the Project Pythia community.
4 |
5 | The source code for the book is [released on GitHub](https://github.com/TUW-GEO/eo-datascience-cookbook) and archived on Zenodo. This DOI will always resolve to the latest release of the book source:
6 |
7 | [![DOI](https://zenodo.org/badge/830421828.svg)](https://zenodo.org/badge/latestdoi/830421828)
8 |
--------------------------------------------------------------------------------
/.github/workflows/replace-links.yaml:
--------------------------------------------------------------------------------
1 | name: replace-links
2 |
3 | on:
4 | workflow_dispatch:
5 |
6 | jobs:
7 | build:
8 | runs-on: ubuntu-latest
9 | permissions:
10 | contents: write
11 |
12 | steps:
13 | - uses: actions/checkout@v6
14 | - name: Find and Replace Repository Name
15 | uses: jacobtomlinson/gha-find-replace@v3
16 | with:
17 | find: "ProjectPythia/cookbook-template"
18 | replace: "${{ github.repository_owner }}/${{ github.event.repository.name }}"
19 | regex: false
20 | exclude: ".github/workflows/replace-links.yaml"
21 |
22 | - name: Find and Replace Repository ID
23 | uses: jacobtomlinson/gha-find-replace@v3
24 | with:
25 | find: "475509405"
26 | replace: "${{ github.repository_id }}"
27 | regex: false
28 | exclude: ".github/workflows/replace-links.yaml"
29 |
30 | - name: Push changes
31 | uses: stefanzweifel/git-auto-commit-action@v7
32 |
--------------------------------------------------------------------------------
/.github/workflows/trigger-replace-links.yaml:
--------------------------------------------------------------------------------
1 | name: trigger-replace-links
2 |
3 | on:
4 | workflow_dispatch:
5 |
6 | jobs:
7 | build:
8 | runs-on: ubuntu-latest
9 | permissions:
10 | contents: write
11 |
12 | steps:
13 | - uses: actions/checkout@v6
14 | - name: Find and Replace Repository Name
15 | uses: jacobtomlinson/gha-find-replace@v3
16 | with:
17 | find: "ProjectPythia/cookbook-template"
18 | replace: "${{ github.repository_owner }}/${{ github.event.repository.name }}"
19 | regex: false
20 | exclude: ".github/workflows/trigger-replace-links.yaml"
21 |
22 | - name: Find and Replace Repository ID
23 | uses: jacobtomlinson/gha-find-replace@v3
24 | with:
25 | find: "475509405"
26 | replace: "${{ github.repository_id }}"
27 | regex: false
28 | exclude: ".github/workflows/trigger-replace-links.yaml"
29 |
30 | - name: Push changes
31 | uses: stefanzweifel/git-auto-commit-action@v7
32 |
--------------------------------------------------------------------------------
/.github/workflows/trigger-preview.yaml:
--------------------------------------------------------------------------------
1 | name: trigger-preview
2 | on:
3 | workflow_run:
4 | workflows:
5 | - trigger-book-build
6 | types:
7 | - requested
8 | - completed
9 |
10 | jobs:
11 | find-pull-request:
12 | uses: ProjectPythia/cookbook-actions/.github/workflows/find-pull-request.yaml@main
13 | deploy-preview:
14 | needs: find-pull-request
15 | if: github.event.workflow_run.conclusion == 'success'
16 | uses: ProjectPythia/cookbook-actions/.github/workflows/deploy-book.yaml@main
17 | with:
18 | artifact_name: book-zip-${{ needs.find-pull-request.outputs.number }}
19 | destination_dir: _preview/${{ needs.find-pull-request.outputs.number }} # deploy to subdirectory labeled with PR number
20 | is_preview: "true"
21 |
22 | preview-comment:
23 | needs: find-pull-request
24 | uses: ProjectPythia/cookbook-actions/.github/workflows/preview-comment.yaml@main
25 | with:
26 | pull_request_number: ${{ needs.find-pull-request.outputs.number }}
27 | sha: ${{ needs.find-pull-request.outputs.sha }}
28 |
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/_quarto.yml:
--------------------------------------------------------------------------------
1 | project:
2 | type: book
3 |
4 | book:
5 | title: "JupyterHub Guide"
6 | chapters:
7 | - index.qmd
8 | - part: "Unit 1"
9 | chapters:
10 | - text: EO Data Discovery
11 | file: unit_01/01_in_class_exercise.ipynb
12 | - text: Unit Conversions
13 | file: unit_01/02_in_class_exercise.ipynb
14 | - text: Backscattering Coefficients
15 | file: unit_01/03_in_class_exercise.ipynb
16 | - part: "Unit 2"
17 | chapters:
18 | - text: Datacubes
19 | file: unit_02/04_in_class_exercise.ipynb
20 | - text: Wavelength and Polarization
21 | file: unit_02/05_in_class_exercise.ipynb
22 | - text: Backscatter Variability
23 | file: unit_02/06_in_class_exercise.ipynb
24 | - part: "Unit 3"
25 | chapters:
26 | - text: Speckle Statistics
27 | file: unit_03/07_in_class_exercise.ipynb
28 | - text: Interferograms
29 | file: unit_03/08_in_class_exercise.ipynb
30 | - text: Phase Unwrapping
31 | file: unit_03/09_in_class_exercise.ipynb
32 | navbar:
33 | logo: assets/images/tuw-geo-logo.svg
34 | sidebar:
35 | logo: assets/images/tuw-geo-logo.svg
36 |
37 | format:
38 | html:
39 | theme: cosmo
40 | title-block-banner: "#006699"
41 | title-block-banner-color: white
42 |
43 | execute:
44 | freeze: true
45 |
--------------------------------------------------------------------------------
/.pre-commit-config.yaml:
--------------------------------------------------------------------------------
1 | repos:
2 | - repo: https://github.com/pre-commit/pre-commit-hooks
3 | rev: v6.0.0
4 | hooks:
5 | - id: trailing-whitespace
6 | - id: end-of-file-fixer
7 | - id: check-docstring-first
8 | - id: check-json
9 | - id: check-yaml
10 | - id: double-quote-string-fixer
11 |
12 | - repo: https://github.com/psf/black
13 | rev: 25.9.0
14 | hooks:
15 | - id: black
16 |
17 | - repo: https://github.com/keewis/blackdoc
18 | rev: v0.4.3
19 | hooks:
20 | - id: blackdoc
21 |
22 | - repo: https://github.com/PyCQA/flake8
23 | rev: 7.3.0
24 | hooks:
25 | - id: flake8
26 |
27 | - repo: https://github.com/asottile/seed-isort-config
28 | rev: v2.2.0
29 | hooks:
30 | - id: seed-isort-config
31 |
32 | - repo: https://github.com/PyCQA/isort
33 | rev: 6.0.1
34 | hooks:
35 | - id: isort
36 |
37 | - repo: https://github.com/pre-commit/mirrors-prettier
38 | rev: v4.0.0-alpha.8
39 | hooks:
40 | - id: prettier
41 | additional_dependencies: [prettier@v2.7.1]
42 |
43 | - repo: https://github.com/nbQA-dev/nbQA
44 | rev: 1.9.1
45 | hooks:
46 | - id: nbqa-black
47 | additional_dependencies: [black]
48 | - id: nbqa-pyupgrade
49 | additional_dependencies: [pyupgrade]
50 | exclude: foundations/quickstart.ipynb
51 | - id: nbqa-isort
52 | additional_dependencies: [isort]
53 |
--------------------------------------------------------------------------------
/notebooks/templates/prereqs-templates.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Templates\n",
9 | "\n",
10 | "\n",
11 | "This section of the Cookbook covers a wide range of topics. The intent is to\n",
12 | "provide template workflows that students can use as a primer for independent \n",
13 | "research projects.\n",
14 | "\n",
15 | "| Concepts | Importance | Notes |\n",
16 | "|---|---|---|\n",
17 | "| [Intro to xarray](https://foundations.projectpythia.org/core/xarray/xarray-intro.html) | Necessary | |\n",
18 | "| [Dask Arrays](https://foundations.projectpythia.org/core/xarray/dask-arrays-xarray.html)| Necessary| |\n",
19 | "| [Documentation scikit-learn](https://scikit-learn.org/stable/)|Necessary|Machine Learning in Python|\n",
20 | "| [Documentation Matplotlib](https://matplotlib.org/stable/users/explain/quick_start.html)|Helpful|Plotting in Python|\n",
21 | "| [Documentation odc-stac](https://odc-stac.readthedocs.io/en/latest/)|Helpful|Data access|\n",
22 | "\n",
23 | "- **Time to learn**: 10 min\n"
24 | ]
25 | }
26 | ],
27 | "metadata": {
28 | "kernelspec": {
29 | "display_name": "Python 3 (ipykernel)",
30 | "language": "python",
31 | "name": "python3",
32 | "path": "/usr/share/jupyter/kernels/python3"
33 | }
34 | },
35 | "nbformat": 4,
36 | "nbformat_minor": 5
37 | }
38 |
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Environmental Remote Sensing\n",
9 | "\n",
10 | "\n",
11 | "These course materials were developed within the framework of the DrySat project: *Enhancing Drought Early Warning in Mozambique through Satellite Soil Moisture Data to support food security in the context of climate change*.\n",
12 | "\n",
13 | "## Prerequisites\n",
14 | "\n",
15 | "| Concepts | Importance | Notes |\n",
16 | "|---|---|---|\n",
17 | "| [Intro to Earth Observation Data Science](https://projectpythia.org/eo-datascience-cookbook/README.html) | Helpful | |\n",
18 | "| [Documentation hvPlot](https://hvplot.holoviz.org/)|Helpful|Interactive plotting|\n",
19 | "| [Documentation pandas](https://pandas.pydata.org/)|Helpful|Tabular data wrangling|\n",
20 | "\n",
21 | "- **Time to learn**: 90 min\n",
22 | "\n",
23 | ":::{note}\n",
24 | "These notebooks contain interactive elements, which can only be viewed in \n",
25 | "full on Binder by clicking the Binder badge or 🚀 button.\n",
26 | ":::\n"
27 | ]
28 | }
29 | ],
30 | "metadata": {
31 | "kernelspec": {
32 | "display_name": "Python 3 (ipykernel)",
33 | "language": "python",
34 | "name": "python3",
35 | "path": "/usr/share/jupyter/kernels/python3"
36 | }
37 | },
38 | "nbformat": 4,
39 | "nbformat_minor": 5
40 | }
41 |
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/src/envrs/ssm_cmap.py:
--------------------------------------------------------------------------------
1 | """Surface Soil Moisture Color Map.
2 |
3 | Returns
4 | -------
5 | Surface Soil Moisture Color Map : matplotlib.colors.LinearSegmentedColormap
6 |
7 | """
8 |
9 | import matplotlib as mpl
10 | import pandas as pd
11 | from envrs.download_path import make_url
12 |
13 |
14 | def load_cmap() -> mpl.colors.LinearSegmentedColormap:
15 | """Surface Soil Moisture Color Map.
16 |
17 | Loading Surface Soil Moisture Color Map based on the TU Wien standard.
18 |
19 | Parameters
20 | ----------
21 | None
22 |
23 | Returns
24 | -------
25 | Surface Soil Moisture Color Map : matplotlib.colors.LinearSegmentedColormap
26 |
27 | """
28 |
29 | def to_hex_str(x: pd.Series) -> str:
30 | """RGB Hex String.
31 |
32 | Convert RGB values to hex string
33 |
34 | Parameters
35 | ----------
36 | x : pd.Series
37 | Row of RGB values
38 |
39 | Returns
40 | -------
41 | Hex string : str
42 |
43 | """
44 | return f"#{int(x.R):02x}{int(x.G):02x}{int(x.B):02x}"
45 |
46 | path = r"colour-tables%2Fssm-continuous.ct"
47 | color_df = pd.read_fwf(
48 | make_url(path, lfs="false", verbose=False), names=["R", "G", "B"], nrows=200
49 | )
50 | brn_yl_bu_colors = color_df.apply(to_hex_str, axis=1).to_list()
51 | return mpl.colors.LinearSegmentedColormap.from_list("", brn_yl_bu_colors)
52 |
53 |
54 | SSM_CMAP = load_cmap()
55 |
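A minimal, self-contained sketch of the row-to-hex conversion that `load_cmap()` applies to each row of the colour table. The `color_df` below is a hypothetical stand-in for the remote `.ct` file, so no network access is needed:

```python
import pandas as pd

# Hypothetical stand-in for the remote colour table read by load_cmap()
color_df = pd.DataFrame({"R": [115, 255, 0], "G": [77, 255, 68], "B": [38, 0, 115]})


def to_hex_str(x: pd.Series) -> str:
    """Convert one RGB row (0-255 per channel) to a hex color string."""
    return f"#{int(x.R):02x}{int(x.G):02x}{int(x.B):02x}"


# Apply row-wise, exactly as in ssm_cmap.py
hex_colors = color_df.apply(to_hex_str, axis=1).to_list()
print(hex_colors)  # ['#734d26', '#ffff00', '#004473']
```

The resulting hex list is what `mpl.colors.LinearSegmentedColormap.from_list` then turns into a continuous colormap.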
--------------------------------------------------------------------------------
/environment.yml:
--------------------------------------------------------------------------------
1 | name: eo-datascience-cookbook
2 | channels:
3 | - conda-forge
4 | dependencies:
5 | - aiohttp
6 | - black
7 | - bokeh
8 | - cartopy
9 | - cmcrameri
10 | - dask
11 | - datashader
12 | - earthaccess
13 | - ffmpeg
14 | - fiona
15 | - flake8-nb
16 | - folium
17 | - fsspec
18 | - geopandas
19 | - geoviews
20 | - graphviz
21 | - hda
22 | - holoviews
23 | - huggingface_hub
24 | - hvplot
25 | - importlib-metadata==4.13.0
26 | - intake
27 | - intake-xarray
28 | - ipykernel
29 | - isort
30 | - jupyterlab-myst
31 | - jupyter-cache
32 | - jupyter-server-proxy
33 | - jupyter==1.1.1
34 | - jupyter_bokeh
35 | - jupyterlab==4.2.5
36 | - jupyterlab_server==2.27.3
37 | - jupyterlab_widgets==3.0.13
38 | - lxml
39 | - mamba
40 | - matplotlib
41 | - mystmd
42 | - nbformat
43 | - nbqa
44 | - nbstripout
45 | - nodejs
46 | - numpy
47 | - odc-stac
48 | - openpyxl
49 | - openssl
50 | - pandas
51 | - pip
52 | - pre-commit
53 | - pyogrio
54 | - pyproj
55 | - pystac-client
56 | - pytest
57 | - python==3.13
58 | - rasterio
59 | - requests
60 | - rioxarray
61 | - scikit-image
62 | - scikit-learn
63 | - scipy
64 | - seaborn
65 | - snaphu
66 | - stackstac
67 | - statsmodels
68 | - xarray
69 | - zarr==2.18.4
70 | - pip:
71 | - ascat
72 | - notebooks/courses/environmental-remote-sensing
73 | - pdbufr
74 | - pre-commit
75 | - pynetcf
76 | - pyswi
77 | - python-dotenv
78 | - python-gitlab
79 | - pyviz_comms==3.0.4
80 |
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/pyproject.toml:
--------------------------------------------------------------------------------
1 | [build-system]
2 | requires = ["uv_build >= 0.7.19, <0.9.0"]
3 | build-backend = "uv_build"
4 |
5 | [tool.ruff]
6 | exclude = [
7 | ".eggs",
8 | ".git",
9 | ".git-rewrite",
10 | ".ipynb_checkpoints",
11 | ".mypy_cache",
12 | ".pyenv",
13 | ".pytest_cache",
14 | ".pytype",
15 | ".ruff_cache",
16 | ".venv",
17 | ".vscode",
18 | "__pypackages__",
19 | "_build",
20 | "build",
21 | "dist",
22 | "site-packages",
23 | "venv",
24 | "*_homework_exercise.ipynb"
25 | ]
26 | line-length = 88
27 | indent-width = 4
28 | target-version = "py313"
29 | # unsafe-fixes = true
30 |
31 | [tool.ruff.lint]
32 | select = ["ALL"] # Subsets could also be used if that is too much
33 | ignore = [
34 | "ANN",
35 | # "ANN401", # any-type (in func args)
36 | "D203", # incorrect-blank-line-before-class
37 | "D213", # multi-line-summary-second-line (opposite to D212)
38 | "PGH", # pygrep hooks are ignored
39 | "TD002", # missing-todo-author
40 | "TD003", # missing-todo-link
41 | "T201", # print (should not be used)
42 | "COM812", # missing-trailing-comma
43 | "EXE002", # shebang-missing-executable-file
44 | "PLR0913", # too-many-arguments
45 | "D103", # undocumented-public-function
46 |
47 | ]
48 | fixable = ["ALL"]
49 | unfixable = []
50 |
51 | [tool.ruff.format]
52 | quote-style = "double"
53 | indent-style = "space"
54 | skip-magic-trailing-comma = false
55 | line-ending = "auto"
56 | docstring-code-format = false # maybe true? Should replace blackdoc
57 | docstring-code-line-length = "dynamic"
58 |
59 | [tool.ruff.lint.isort]
60 | known-third-party = []
61 |
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Microwave Remote Sensing\n",
9 | "\n",
10 | "\n",
11 | "This course at TU Wien teaches students to read, visualize, and analyze \n",
12 | "Synthetic Aperture Radar (SAR) data. Interpretation is grounded in a physical \n",
13 | "understanding of sensing principles and of the interaction of microwaves with \n",
14 | "natural objects.\n",
15 | "\n",
16 | "| Concepts | Importance | Notes |\n",
17 | "|---|---|---|\n",
18 | "| [Intro to xarray](https://foundations.projectpythia.org/core/xarray/xarray-intro.html) | Necessary | |\n",
19 | "| [Dask Arrays](https://foundations.projectpythia.org/core/xarray/dask-arrays-xarray.html)| Necessary| |\n",
20 | "| [Intake](https://projectpythia.org/intake-cookbook/README.html)|Helpful| |\n",
21 | "| [Matplotlib](https://foundations.projectpythia.org/core/matplotlib.html)|Helpful|Plotting in Python|\n",
22 | "| [Documentation hvPlot](https://hvplot.holoviz.org/)|Helpful|Interactive plotting|\n",
23 | "| [Documentation odc-stac](https://odc-stac.readthedocs.io/en/latest/)|Helpful|Data access|\n",
24 | "\n",
25 | "- **Time to learn**: 90 min\n",
26 | "\n",
27 | ":::{note}\n",
28 | "These notebooks contain interactive elements, which can only be viewed in \n",
29 | "full on Binder by clicking the Binder badge or 🚀 button.\n",
30 | ":::\n"
31 | ]
32 | }
33 | ],
34 | "metadata": {
35 | "kernelspec": {
36 | "display_name": "Python 3 (ipykernel)",
37 | "language": "python",
38 | "name": "python3",
39 | "path": "/usr/share/jupyter/kernels/python3"
40 | }
41 | },
42 | "nbformat": 4,
43 | "nbformat_minor": 5
44 | }
45 |
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/pyproject.toml:
--------------------------------------------------------------------------------
1 | [project]
2 | name = "envrs"
3 | version = "0.1.0"
4 | requires-python = ">=3.13"
5 |
6 | [build-system]
7 | requires = ["uv_build >= 0.7.19, <0.9.0"]
8 | build-backend = "uv_build"
9 |
10 | [tool.ruff]
11 | exclude = [
12 | ".eggs",
13 | ".git",
14 | ".git-rewrite",
15 | ".ipynb_checkpoints",
16 | ".mypy_cache",
17 | ".pyenv",
18 | ".pytest_cache",
19 | ".pytype",
20 | ".ruff_cache",
21 | ".venv",
22 | ".vscode",
23 | "__pypackages__",
24 | "_build",
25 | "build",
26 | "dist",
27 | "site-packages",
28 | "venv",
29 | "*_homework_exercise.ipynb"
30 | ]
31 | line-length = 88
32 | indent-width = 4
33 | target-version = "py313"
34 | # unsafe-fixes = true
35 |
36 | [tool.ruff.lint]
37 | select = ["ALL"] # Subsets could also be used if that is too much
38 | ignore = [
39 | "ANN",
40 | # "ANN401", # any-type (in func args)
41 | "D203", # incorrect-blank-line-before-class
42 | "D213", # multi-line-summary-second-line (opposite to D212)
43 | "PGH", # pygrep hooks are ignored
44 | "TD002", # missing-todo-author
45 | "TD003", # missing-todo-link
46 | "T201", # print (should not be used)
47 | "COM812", # missing-trailing-comma
48 | "EXE002", # shebang-missing-executable-file
49 | "PLR0913", # too-many-arguments
50 | "D103", # undocumented-public-function
51 |
52 | ]
53 | fixable = ["ALL"]
54 | unfixable = []
55 |
56 | [tool.ruff.format]
57 | quote-style = "double"
58 | indent-style = "space"
59 | skip-magic-trailing-comma = false
60 | line-ending = "auto"
61 | docstring-code-format = false # maybe true? Should replace blackdoc
62 | docstring-code-line-length = "dynamic"
63 |
64 | [tool.ruff.lint.isort]
65 | known-third-party = ["IPython", "download_path", "matplotlib", "numpy", "pandas", "statsmodels"]
66 |
--------------------------------------------------------------------------------
/CITATION.cff:
--------------------------------------------------------------------------------
1 | cff-version: 1.2.0
2 | message: "If you use this cookbook, please cite it as below."
3 | authors:
4 | # add additional entries for each author -- see https://github.com/citation-file-format/citation-file-format/blob/main/schema-guide.md
5 | - family-names: Wagner
6 | given-names: Wolfgang
7 | orcid: https://orcid.org/0000-0001-7704-6857
8 | website: https://www.tuwien.at/mg/dekanat/mitarbeiter-innen
9 | affiliation: Technische Universität Wien, Vienna, Austria, EODC Earth Observation Data Centre for Water Resources Monitoring, Austria
10 | - family-names: Schobben
11 | given-names: Martin
12 | orcid: https://orcid.org/0000-0001-8560-0037
13 | website: https://github.com/martinschobben
14 | affiliation: Technische Universität Wien, Vienna, Austria
15 | - family-names: Pikall
16 | given-names: Nikolas
17 | website: https://github.com/npikall
18 | affiliation: Technische Universität Wien, Vienna, Austria
19 | - family-names: Wagner
20 | given-names: Joseph
21 | affiliation: Technische Universität Wien, Vienna, Austria
22 | - family-names: Festa
23 | given-names: Davide
24 | affiliation: Technische Universität Wien, Vienna, Austria
25 | - family-names: Reuß
26 | given-names: Felix David
27 | affiliation: Technische Universität Wien, Vienna, Austria
28 | - family-names: Jovic
29 | given-names: Luka
30 | affiliation: Technische Universität Wien, Vienna, Austria
31 | - name: "Earth Observation Data Science contributors" # use the 'name' field to acknowledge organizations
32 | website: "https://github.com/TUW-GEO/eo-datascience-cookbook/graphs/contributors"
33 | title: "Earth Observation Data Science Cookbook"
34 | abstract: "Earth Observation Data Science Cookbook provides training material \
35 | centered around Earth Observation data while honoring the Pangeo Philosophy. \
36 | The examples used in the notebooks represent some of the main research lines \
37 | of the Remote Sensing Unit at the Department of Geodesy and Geoinformation \
38 | at the TU Wien (Austria). In addition, the content familiarizes the reader \
39 | with the data available at the EODC (Earth Observation Data Centre For Water \
40 |   Resources Monitoring) as well as the computational resources to process \
41 | large amounts of data."
42 |
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | # Default JupyterBook build output dir
2 | _build/
3 |
4 | # Byte-compiled / optimized / DLL files
5 | __pycache__/
6 | *.py[cod]
7 | *$py.class
8 |
9 | # C extensions
10 | *.so
11 |
12 | # Distribution / packaging
13 | .Python
14 | build/
15 | notebooks/_build/
16 | develop-eggs/
17 | dist/
18 | downloads/
19 | eggs/
20 | .eggs/
21 | lib/
22 | lib64/
23 | parts/
24 | sdist/
25 | var/
26 | wheels/
27 | pip-wheel-metadata/
28 | share/python-wheels/
29 | *.egg-info/
30 | .installed.cfg
31 | *.egg
32 | MANIFEST
33 |
34 | # PyInstaller
35 | # Usually these files are written by a python script from a template
36 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
37 | *.manifest
38 | *.spec
39 |
40 | # Installer logs
41 | pip-log.txt
42 | pip-delete-this-directory.txt
43 |
44 | # Unit test / coverage reports
45 | htmlcov/
46 | .tox/
47 | .nox/
48 | .coverage
49 | .coverage.*
50 | .cache
51 | nosetests.xml
52 | coverage.xml
53 | *.cover
54 | *.py,cover
55 | .hypothesis/
56 | .pytest_cache/
57 |
58 | # Translations
59 | *.mo
60 | *.pot
61 |
62 | # Django stuff:
63 | *.log
64 | local_settings.py
65 | db.sqlite3
66 | db.sqlite3-journal
67 |
68 | # Flask stuff:
69 | instance/
70 | .webassets-cache
71 |
72 | # Scrapy stuff:
73 | .scrapy
74 |
75 | # Sphinx documentation
76 | docs/_build/
77 |
78 | # PyBuilder
79 | target/
80 |
81 | # Jupyter Notebook
82 | .ipynb_checkpoints
83 |
84 | # IPython
85 | profile_default/
86 | ipython_config.py
87 |
88 | # pyenv
89 | .python-version
90 |
91 | # pipenv
92 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
93 | # However, in case of collaboration, if having platform-specific dependencies or dependencies
94 | # having no cross-platform support, pipenv may install dependencies that don't work, or not
95 | # install all needed dependencies.
96 | #Pipfile.lock
97 |
98 | # PEP 582; used by e.g. github.com/David-OConnor/pyflow
99 | __pypackages__/
100 |
101 | # Celery stuff
102 | celerybeat-schedule
103 | celerybeat.pid
104 |
105 | # SageMath parsed files
106 | *.sage.py
107 |
108 | # Environments
109 | .env
110 | .venv
111 | env/
112 | venv/
113 | ENV/
114 | env.bak/
115 | venv.bak/
116 |
117 | # Spyder project settings
118 | .spyderproject
119 | .spyproject
120 |
121 | # Rope project settings
122 | .ropeproject
123 |
124 | # mkdocs documentation
125 | /site
126 |
127 | # mypy
128 | .mypy_cache/
129 | .dmypy.json
130 | dmypy.json
131 |
132 | # Pyre type checker
133 | .pyre/
134 |
135 | # Ephemeral .nfs files
136 | .nfs*
137 |
138 | # Data
139 | data/
140 | **/data/
141 |
--------------------------------------------------------------------------------
/myst.yml:
--------------------------------------------------------------------------------
1 | version: 1
2 | extends:
3 | - https://raw.githubusercontent.com/projectpythia/pythia-config/main/pythia.yml
4 | project:
5 | title: Earth Observation Data Science Cookbook
6 | authors:
7 | - name: Wolfgang Wagner
8 | - name: Martin Schobben
9 | - name: Nikolas Pikall
10 | - name: Joseph Wagner
11 | - name: Davide Festa
12 | - name: Felix David Reuß
13 | - name: Luka Jović
14 | copyright: "2024"
15 | toc:
16 | - file: README.md
17 | - title: Preamble
18 | children:
19 | - file: notebooks/how-to-cite.md
20 | - title: Courses
21 | children:
22 | - file: notebooks/courses/microwave-remote-sensing.ipynb
23 | children:
24 | - file: notebooks/courses/microwave-remote-sensing/unit_01/01_in_class_exercise.ipynb
25 | - file: notebooks/courses/microwave-remote-sensing/unit_01/02_in_class_exercise.ipynb
26 | - file: notebooks/courses/microwave-remote-sensing/unit_01/03_in_class_exercise.ipynb
27 | - file: notebooks/courses/microwave-remote-sensing/unit_02/04_in_class_exercise.ipynb
28 | - file: notebooks/courses/microwave-remote-sensing/unit_02/05_in_class_exercise.ipynb
29 | - file: notebooks/courses/microwave-remote-sensing/unit_02/06_in_class_exercise.ipynb
30 | - file: notebooks/courses/microwave-remote-sensing/unit_03/07_in_class_exercise.ipynb
31 | - file: notebooks/courses/microwave-remote-sensing/unit_03/08_in_class_exercise.ipynb
32 | - file: notebooks/courses/microwave-remote-sensing/unit_03/09_in_class_exercise.ipynb
33 | - file: notebooks/courses/environmental-remote-sensing.ipynb
34 | children:
35 | - file: notebooks/courses/environmental-remote-sensing/unit_01/01_handout_drought.ipynb
36 | - file: notebooks/courses/environmental-remote-sensing/unit_01/02_handout_drought.ipynb
37 | - file: notebooks/courses/environmental-remote-sensing/unit_01/03_handout_drought.ipynb
38 | - file: notebooks/courses/environmental-remote-sensing/unit_01/04_handout_drought.ipynb
39 | - title: Templates
40 | children:
41 | - file: notebooks/templates/prereqs-templates.ipynb
42 | children:
43 | - file: notebooks/templates/classification.ipynb
44 | - title: Tutorials
45 | children:
46 | - file: notebooks/tutorials/prereqs-tutorials.ipynb
47 | children:
48 | - file: notebooks/tutorials/floodmapping.ipynb
49 | - title: References
50 | children:
51 | - file: notebooks/references.ipynb
52 | jupyter:
53 | binder:
54 | repo: projectpythia/eo-datascience-cookbook
55 | site:
56 | options:
57 | logo: notebooks/images/logos/pythia_logo-white-rtext.svg
58 | folders: true
59 |
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/src/envrs/rio_tools.py:
--------------------------------------------------------------------------------
1 | # ruff: noqa: ANN, D100
2 | import pandas as pd
3 | import rasterio as rio
4 | from rasterio import windows
5 | from shapely import box
6 |
7 |
8 | def clipped_read(entries, location):
9 | if isinstance(entries, pd.Series):
10 | entries = entries.to_dict()
11 | elif isinstance(entries, dict):
12 | pass
13 | else:
14 | err = f"entries must be a pandas series or a dictionary, not {type(entries)}"
15 | raise TypeError(err)
16 |
17 | # Retrieve the content of the rasters
18 | array_pairs, profile_pairs, tag_pairs = {}, {}, {}
19 | for raster_name, raster_path in entries.items():
20 | with rio.open(raster_path) as raster:
21 | # Set the profile
22 | profile = raster.profile
23 |
24 | # Set the shared extent
25 | naive_bounds = box(*location.to_crs(profile["crs"]).total_bounds)
26 | raster_bounds = box(*raster.bounds)
27 | shared_bounds = raster_bounds.intersection(naive_bounds).bounds
28 |
29 | # Set the clipping window, and update the profile
30 | window = windows.from_bounds(*shared_bounds, profile["transform"])
31 | window = window.round()
32 | profile["transform"] = windows.transform(window, profile["transform"])
33 | profile["height"] = window.height
34 | profile["width"] = window.width
35 |
36 | # Do a PARTIAL reading, set profiles and tags
37 | array_pairs[raster_name] = raster.read(1, window=window)
38 |
39 | profile_pairs[raster_name] = pd.Series(profile)
40 | tag_pairs[raster_name] = pd.Series(raster.tags())
41 |
42 | # Concatenate the profiles and the tags
43 | profile_frame = pd.concat(profile_pairs, axis=1).T
44 | tag_frame = pd.concat(tag_pairs, axis=1).T
45 | return array_pairs, profile_frame, tag_frame
46 |
47 |
48 | def write_raster(array_pairs, profile_frame, tag_frame, raster_path):
49 | # Check for duplicate entries on the profiles
50 | nunique_profiles = profile_frame.nunique(axis=0)
51 | several_profiles = nunique_profiles > 1
52 | if several_profiles.any():
53 | offending_entries = nunique_profiles[several_profiles].index.tolist()
54 | err = f"{offending_entries} have several possible values"
55 | raise RuntimeError(err)
56 |
57 | # make the output profile and update
58 | out_profile = profile_frame.drop_duplicates().iloc[0].to_dict()
59 | out_profile["count"] = len(array_pairs)
60 |
61 | # keep only tags where the count is larger than one
62 | nunique_tags = tag_frame.nunique(axis=0)
63 | relevant_tags = nunique_tags[nunique_tags == 1].index.tolist()
64 | tags = tag_frame[relevant_tags].drop_duplicates().iloc[0].to_dict()
65 |
66 | # Write the output file
67 | with rio.open(raster_path, "w", **out_profile) as out_raster:
68 | # update the scales and offsets
69 | if "add_offset" in tag_frame.columns:
70 | out_raster.offsets = tag_frame["add_offset"].astype(float)
71 |
72 | if "scale_factor" in tag_frame.columns:
73 | out_raster.scales = tag_frame["scale_factor"].astype(float)
74 |
75 | # Update the tags
76 | out_raster.update_tags(**tags)
77 |
78 | # Iteratively write each band and its description
79 | for band_idx, (band_name, band_data) in enumerate(array_pairs.items(), 1):
80 | out_raster.write(band_data, band_idx)
81 | out_raster.set_band_description(band_idx, band_name)
82 |
--------------------------------------------------------------------------------
/_config.yml:
--------------------------------------------------------------------------------
1 | # Book settings
2 | # Learn more at https://jupyterbook.org/customize/config.html
3 |
4 | title: Earth Observation Data Science Cookbook
5 | author: the Project Pythia Community
6 | logo: notebooks/images/logos/pythia_logo-white-rtext.svg
7 | copyright: "2025"
8 |
9 | bibtex_bibfiles:
10 | - notebooks/references.bib
11 |
12 | execute:
13 | # To execute notebooks via a Binder instead, replace 'cache' with 'binder'
14 | execute_notebooks: cache
15 | timeout: 1200
16 | allow_errors: False # cells with expected failures must set the `raises-exception` cell tag
17 | exclude_patterns:
18 | - "team_work/*"
19 | - "*_homework_exercise*"
20 | - "*_supplement_*"
21 |
22 | # Add a few extensions to help with parsing content
23 | parse:
24 | myst_enable_extensions: # default extensions to enable in the myst parser. See https://myst-parser.readthedocs.io/en/latest/using/syntax-optional.html
25 | - amsmath
26 | - colon_fence
27 | - deflist
28 | - dollarmath
29 | - html_admonition
30 | - html_image
31 | - replacements
32 | - smartquotes
33 | - substitution
34 |
35 | sphinx:
36 | config:
37 | linkcheck_ignore: [
38 | "https://doi.org/*",
39 | "https://zenodo.org/badge/*",
40 | "https://services.eodc.eu/browser/*",
41 | "https://binder.eo-datascience-cookbook.org/*",
42 | "https://www.mdpi.com/*",
43 |       ] # skip the link checker for these URLs: DOI links are immutable, the rest are unreliable for automated checks
44 | nb_execution_raise_on_error: true # raise exception in build if there are notebook errors (this flag is ignored if building on binder)
45 | html_favicon: notebooks/images/icons/favicon.ico
46 | html_last_updated_fmt: "%-d %B %Y"
47 | html_theme: sphinx_pythia_theme
48 | html_permalinks_icon: ''
49 | html_theme_options:
50 | home_page_in_toc: true
51 | repository_url: https://github.com/ProjectPythia/eo-datascience-cookbook # Online location of your book
52 | repository_branch: main # Which branch of the repository should be used when creating links (optional)
53 | use_issues_button: true
54 | use_repository_button: true
55 | use_edit_page_button: true
56 | use_fullscreen_button: true
57 | analytics:
58 | google_analytics_id: G-T52X8HNYE8
59 | github_url: https://github.com/ProjectPythia
60 | icon_links:
61 | - name: YouTube
62 | url: https://www.youtube.com/channel/UCoZPBqJal5uKpO8ZiwzavCw
63 | icon: fab fa-youtube-square
64 | type: fontawesome
65 | launch_buttons:
66 | binderhub_url: https://binder.projectpythia.org
67 | notebook_interface: jupyterlab
68 | logo:
69 | link: https://projectpythia.org
70 | navbar_start:
71 | - navbar-logo
72 | navbar_end:
73 | - navbar-icon-links
74 | navbar_links:
75 | - name: Home
76 | url: https://projectpythia.org
77 | - name: Foundations
78 | url: https://foundations.projectpythia.org
79 | - name: Cookbooks
80 | url: https://cookbooks.projectpythia.org
81 | - name: Resources
82 | url: https://projectpythia.org/resource-gallery.html
83 | - name: Community
84 | url: https://projectpythia.org/index.html#join-us
85 | footer_logos:
86 | NCAR: notebooks/images/logos/NSF-NCAR_Lockup-UCAR-Dark_102523.svg
87 | Unidata: notebooks/images/logos/Unidata_logo_horizontal_1200x300.svg
88 | UAlbany: notebooks/images/logos/UAlbany-A2-logo-purple-gold.svg
89 | footer_start:
90 | - footer-logos
91 | - footer-info
92 | - footer-extra
93 |
--------------------------------------------------------------------------------
/notebooks/images/cmaps/06_color_mapping.json:
--------------------------------------------------------------------------------
1 | {
2 | "land_cover": [
3 | { "value": 1, "color": "#e6004d", "label": "Continuous urban fabric" },
4 | { "value": 2, "color": "#ff0000", "label": "Discontinuous urban fabric" },
5 | {
6 | "value": 3,
7 | "color": "#cc4df2",
8 | "label": "Industrial or commercial units"
9 | },
10 | {
11 | "value": 4,
12 | "color": "#cc0000",
13 | "label": "Road and rail networks and associated land"
14 | },
15 | { "value": 5, "color": "#e6cccc", "label": "Port areas" },
16 | { "value": 6, "color": "#e6cce6", "label": "Airports" },
17 | { "value": 7, "color": "#a600cc", "label": "Mineral extraction sites" },
18 | { "value": 8, "color": "#a64d00", "label": "Dump sites" },
19 | { "value": 9, "color": "#ff4dff", "label": "Construction sites" },
20 | { "value": 10, "color": "#ffa6ff", "label": "Green urban areas" },
21 | {
22 | "value": 11,
23 | "color": "#ffe6ff",
24 | "label": "Sport and leisure facilities"
25 | },
26 | { "value": 12, "color": "#ffffa8", "label": "Non-irrigated arable land" },
27 | { "value": 13, "color": "#ffff00", "label": "Permanently irrigated land" },
28 | { "value": 14, "color": "#e6e600", "label": "Rice fields" },
29 | { "value": 15, "color": "#e68000", "label": "Vineyards" },
30 | {
31 | "value": 16,
32 | "color": "#f2a64d",
33 | "label": "Fruit trees and berry plantations"
34 | },
35 | { "value": 17, "color": "#e6a600", "label": "Olive groves" },
36 | { "value": 18, "color": "#e6e64d", "label": "Pastures" },
37 | {
38 | "value": 19,
39 | "color": "#ffe6a6",
40 | "label": "Annual crops associated with permanent crops"
41 | },
42 | {
43 | "value": 20,
44 | "color": "#ffe64d",
45 | "label": "Complex cultivation patterns"
46 | },
47 | {
48 | "value": 21,
49 | "color": "#e6cc4d",
50 | "label": "Agricultural land with natural vegetation"
51 | },
52 | { "value": 22, "color": "#f2cca6", "label": "Agro-forestry areas" },
53 | { "value": 23, "color": "#80ff00", "label": "Broad-leaved forest" },
54 | { "value": 24, "color": "#00a600", "label": "Coniferous forest" },
55 | { "value": 25, "color": "#4dff00", "label": "Mixed forest" },
56 | { "value": 26, "color": "#ccf24d", "label": "Natural grasslands" },
57 | { "value": 27, "color": "#a6ff80", "label": "Moors and heathland" },
58 | { "value": 28, "color": "#a6e64d", "label": "Sclerophyllous vegetation" },
59 | { "value": 29, "color": "#a6f200", "label": "Transitional woodland-shrub" },
60 | { "value": 30, "color": "#e6e6e6", "label": "Beaches - dunes - sands" },
61 | { "value": 31, "color": "#cccccc", "label": "Bare rocks" },
62 | { "value": 32, "color": "#ccffcc", "label": "Sparsely vegetated areas" },
63 | { "value": 33, "color": "#000000", "label": "Burnt areas" },
64 | { "value": 34, "color": "#a6e6cc", "label": "Glaciers and perpetual snow" },
65 | { "value": 35, "color": "#a6a6ff", "label": "Inland marshes" },
66 | { "value": 36, "color": "#4d4dff", "label": "Peat bogs" },
67 | { "value": 37, "color": "#ccccff", "label": "Salt marshes" },
68 | { "value": 38, "color": "#e6e6ff", "label": "Salines" },
69 | { "value": 39, "color": "#a6a6e6", "label": "Intertidal flats" },
70 | { "value": 40, "color": "#00ccf2", "label": "Water courses" },
71 | { "value": 41, "color": "#80f2e6", "label": "Water bodies" },
72 | { "value": 42, "color": "#00ffa6", "label": "Coastal lagoons" },
73 | { "value": 43, "color": "#a6ffe6", "label": "Estuaries" },
74 | { "value": 44, "color": "#e6f2ff", "label": "Sea and ocean" },
75 | { "value": 48, "color": "#ffffff", "label": "NODATA" }
76 | ]
77 | }
78 |
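Entries of this shape can be turned into lookup tables for rendering a classified land-cover raster. A small sketch using an inline subset of the JSON above (only three of the 45 classes, for brevity):

```python
import json

# Hypothetical inline subset of 06_color_mapping.json
raw = """
{"land_cover": [
  {"value": 23, "color": "#80ff00", "label": "Broad-leaved forest"},
  {"value": 24, "color": "#00a600", "label": "Coniferous forest"},
  {"value": 44, "color": "#e6f2ff", "label": "Sea and ocean"}
]}
"""
entries = json.loads(raw)["land_cover"]

# Lookup tables mapping class values to display colors and legend labels
value_to_color = {e["value"]: e["color"] for e in entries}
value_to_label = {e["value"]: e["label"] for e in entries}
print(value_to_color[24], value_to_label[24])  # -> #00a600 Coniferous forest
```

The same dictionaries could feed, e.g., a matplotlib `ListedColormap` plus `BoundaryNorm` for a discrete legend.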
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/src/envrs/corr_plots.py:
--------------------------------------------------------------------------------
1 | """Conda path solver for ffmpeg and functions to plot correlations.
2 |
3 | Returns
4 | -------
5 | Conda path : str
6 | Correlation animation plot: IPython.display
7 | R-squared plot: matplotlib.pyplot
8 |
9 | """
10 |
11 | import json
12 | import os
13 | import shutil
14 | import subprocess
15 | from pathlib import Path
16 |
17 | import matplotlib.pyplot as plt
18 | import numpy as np
19 | import pandas as pd
20 | import statsmodels.tsa.stattools as smt
21 | from IPython.display import HTML
22 | from matplotlib.animation import FuncAnimation
23 |
24 | CONDA_PATH = Path("envs", "environmental-remote-sensing")
25 |
26 |
27 | def get_base(solver: str):
28 | conda_prefix = os.environ.get("CONDA_PREFIX")
29 | if conda_prefix:
30 | return conda_prefix
31 |
32 | conda_exe = shutil.which(solver)
33 | if conda_exe:
34 | try:
35 | result = subprocess.run(
36 | [conda_exe, "info", "--json"],
37 | check=False,
38 | capture_output=True,
39 | text=True,
40 | )
41 | info = json.loads(result.stdout)
42 | envs = [s for s in info.get("envs", []) if "environmental-remote-sensing" in s]
43 | return next(iter(envs), None)
44 | except (subprocess.CalledProcessError, json.JSONDecodeError, KeyError):
45 | pass
46 |
47 | return None
48 |
49 |
50 | def get_conda_env_path():
51 | conda_base = get_base("conda")
52 | if conda_base:
53 | return conda_base
54 |
55 | micromamba_base = get_base("micromamba")
56 | if micromamba_base:
57 | return micromamba_base
58 |
59 | print("Neither Conda nor Micromamba is installed or detected.")
60 | return None
61 |
62 |
63 | conda_env_path = get_conda_env_path()
64 | if conda_env_path:
65 |     ffmpeg_path = Path(conda_env_path) / "bin" / "ffmpeg"
66 |     print(f"Resolved ffmpeg path: {ffmpeg_path}")
67 |     plt.rcParams["animation.ffmpeg_path"] = str(ffmpeg_path)
68 |
69 | def plot_predicted_values(df, variables, suffix=None, **kwargs):
70 | fig, axes = plt.subplots(1, len(variables), **kwargs)
71 | fig.suptitle("R-squared Plot", fontsize=14)
72 | if suffix is None:
73 | suffix = [""] * len(variables)
74 | for i, key in enumerate(variables):
75 | _plot_predicted_values(axes[i], df, key, variables[key], suffix[i])
76 | plt.close()
77 | return fig
78 |
79 |
80 | def _plot_predicted_values(ax, df, variable, res, suffix):
81 | pred_ols = res.get_prediction()
82 | iv_l = pred_ols.summary_frame()["obs_ci_lower"]
83 | iv_u = pred_ols.summary_frame()["obs_ci_upper"]
84 | fitted = res.fittedvalues
85 | x = df[variable].to_numpy()
86 | y = df["ndvi"].to_numpy()
87 | ax.set_title(f"{variable} {suffix}")
88 | ax.plot(x, y, "o", label="data", alpha=0.5)
89 | ax.plot(x, fitted, label="OLS")
90 | ax.plot(
91 | x,
92 | iv_u,
93 | )
94 | ax.plot(
95 | x,
96 | iv_l,
97 | )
98 | ax.set_xlim(0, 1)
99 | ax.set_ylim(0, 1)
100 | ax.set_xlabel("actual")
101 | ax.set_ylabel("predicted")
102 | ax.legend(loc="best")
103 |
104 |
105 | def plot_step_corr(df, var1, var2="copy", length=72):
106 | p = _plot_step_corr(df, var1, var2, length)
107 | plt.close()
108 | return p
109 |
110 |
111 | def _plot_step_corr(df, var1, var2="copy", length=72):
112 | def step_corr(x):
113 | # clear frame
114 | fig.clear()
115 | # original and shifted time series
116 | ax1 = plt.subplot(1, 2, 1)
117 | if var2 == "copy":
118 | y = df[var1]
119 | y.plot(y=var1, ax=ax1)
120 | y.shift(x).plot(y=var1, c="orange", ax=ax1)
121 | res = pd.Series(
122 | smt.acf(y.values, nlags=length), index=df.index[: length + 1]
123 | )
124 | plt.title(f"{var1} and copy at lag={x}")
125 | else:
126 | y1 = df[var1]
127 | y2 = df[var2]
128 | y1.plot(y=var1, ax=ax1)
129 | y2.shift(x).plot(y=var2, c="orange", ax=ax1)
130 | res = pd.Series(
131 | smt.ccf(y1.values, y2.values, nlags=length), index=df.index[:length]
132 | )
133 | plt.title(f"{var1} and {var2} at lag={x}")
134 |
135 | ax1.set_ylabel("")
136 | plt.legend([var1, var2])
137 |
138 | # correlation of time series at step #
139 | ax2 = plt.subplot(1, 2, 2)
140 | res.iloc[:x].plot(ax=ax2)
141 | ax2.set_ylabel("")
142 | plt.title("Correlation result")
143 |
144 | fig = plt.figure(figsize=(12, 5))
145 | frames = np.arange(1, length, 1)
146 | anim = FuncAnimation(fig, step_corr, frames, interval=500)
147 | return HTML(anim.to_html5_video())
148 |
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/src/envrs/hls_tools.py:
--------------------------------------------------------------------------------
1 | # ruff: noqa: ANN, D100, PD010
2 |
3 | import re
4 | from datetime import datetime as dt
5 | from pathlib import Path
6 |
7 | import numpy as np
8 | import pandas as pd
9 | import xarray as xr
10 |
11 |
12 | def extract_extra_attrs(granule):
13 | att_pairs = {}
14 | for att in dict(granule.items())["umm"]["AdditionalAttributes"]:
15 | att_name, att_values = att["Name"], att["Values"]
16 | if len(att_values) == 1:
17 | att_pairs[att_name] = att_values[0]
18 |
19 | return att_pairs
20 |
21 |
22 | def tabulate_hls_uris(iterable):
23 | uri_pile, info_pile, name_pile = [], [], []
24 | for product_uri in iterable:
25 | # Set the uri
26 | uri_pile.append(product_uri)
27 |
28 | # process the stem
29 | if isinstance(product_uri, Path):
30 | stem = product_uri.stem
31 | elif isinstance(product_uri, str):
32 | stem = product_uri.split("/")[-1].replace(".tif", "")
33 | else:
34 | err = (
"the contents of iterable should be 'str' or Path, "
36 | f"not {type(product_uri)}"
37 | )
38 | raise TypeError(err)
39 | stem_info = stem.split(".")
40 |
41 | # Alter the stem to avoid issues induced by the dot within the version
42 | stem_info.insert(3, stem_info.pop(1))
43 | version = f"{stem_info[4]}.{stem_info[5]}"
44 | stem_info[4] = version
45 | stem_info.pop(5)
46 |
47 | # Append the info and the new name
48 | info_pile.append(stem_info)
49 | name_pile.append("_".join(stem_info[:-1]))
50 |
51 | # Place the information on a dataframe
52 | columns = ["product", "tile", "time", "sensor", "version", "suffix"]
53 | uri_frame = pd.DataFrame(info_pile, index=uri_pile, columns=columns).reset_index(
54 | names="uri"
55 | )
56 |
57 | # Add the name, clean up the suffix, sort, and return
58 | uri_frame["stem"] = name_pile
59 | uri_frame["suffix"] = uri_frame["suffix"].replace("B8A", "B08A")
60 | return uri_frame.sort_values(["stem", "suffix"], ascending=True)
61 |
62 |
63 | def harmonize_hls_frame(uri_frame):
64 | # Set the column renames
65 | common_bands = {
66 | "B01": "CoastalAerosol",
67 | "B02": "Blue",
68 | "B03": "Green",
69 | "B04": "Red",
70 | }
71 | landsat_bands = {**common_bands, "B05": "NIRnarrow", "B06": "SWIR1", "B07": "SWIR2"}
72 | sentinel_bands = {
73 | **common_bands,
74 | "B08A": "NIRnarrow",
75 | "B11": "SWIR1",
76 | "B12": "SWIR2",
77 | }
78 |
79 | # Separate by sensor, drop non-shared bands, rename
80 | landsat_frame = (
81 | uri_frame[uri_frame["sensor"] == "L30"]
82 | .pivot(index="stem", columns="suffix", values="uri")
83 | .drop(columns=["B09", "B10", "B11"])
84 | .rename(columns=landsat_bands)
85 | )
86 | sentinel_frame = (
87 | uri_frame[uri_frame["sensor"] == "S30"]
88 | .pivot(index="stem", columns="suffix", values="uri")
89 | .drop(columns=["B05", "B06", "B07", "B08", "B09", "B10"])
90 | .rename(columns=sentinel_bands)
91 | )
92 |
93 | # Concatenate and return
94 | return pd.concat([landsat_frame, sentinel_frame], axis=0).sort_index()
95 |
96 |
97 | def att2time(att):
98 | if (";" in att) or ("+" in att):
99 | split_time = re.split(r"\s?[\+\;]\s", att)
100 | start, end = [dt.fromisoformat(s[:-1]) for s in split_time]
101 | time = start + (end - start) / 2
102 | else:
103 | time = dt.fromisoformat(att[:-1])
104 | return time
105 |
106 |
107 | def preprocess_fmask(fmask):
108 | debanded = fmask["band_data"]
109 | bits = xr.apply_ufunc(
110 | np.unpackbits,
111 | debanded,
112 | input_core_dims=[["band", "y", "x"]],
113 | output_core_dims=[["flag", "y", "x"]],
114 | exclude_dims={"band"},
115 | keep_attrs=True,
116 | kwargs={"axis": 0},
117 | dask="allowed",
118 | )
119 |
120 | # Convert the flags to bool, set the names
121 | flags = bits.sel(flag=slice(2, 9)).astype(bool)
122 | flags["flag"] = [
123 | "water",
124 | "snow or ice",
125 | "cloud shadow",
126 | "adjacent to cloud",
127 | "cloud",
128 | "cirrus cloud",
129 | ]
130 | flags.name = "masks"
131 |
132 | # Convert the aerosol data, set the name
133 | aerosol_parts = bits.sel(flag=slice(0, 2))
134 | aerosol = aerosol_parts.sel(flag=1) + 10 * aerosol_parts.sel(flag=0)
135 | aerosol.name = "aerosol"
136 |
137 | # set the time
138 | time = att2time(debanded.attrs["SENSING_TIME"])
139 | return xr.merge([flags, aerosol]).expand_dims({"time": [time]}, axis=0)
140 |
141 |
142 | def preprocess_bands(bands):
143 | # Set the band names
144 | renames = {}
145 | for var_name in bands:
146 | renames[var_name] = bands[var_name].attrs["long_name"]
147 |
148 | # Set the time
149 | time = att2time(bands.attrs["SENSING_TIME"])
150 | return bands.rename_vars(renames).expand_dims({"time": [time]})
151 |
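`att2time` collapses an HLS `SENSING_TIME` attribute, which may contain either one timestamp or two timestamps joined by `;` or `+`, into a single datetime (the midpoint when two are given). A self-contained sketch mirroring that parsing (the helper name and sample strings are illustrative):

```python
import re
from datetime import datetime as dt


def sensing_time_to_datetime(att: str) -> dt:
    """Return the (midpoint) datetime encoded in a SENSING_TIME string."""
    if (";" in att) or ("+" in att):
        # Two timestamps: split on the separator, strip trailing "Z", take midpoint
        start, end = [dt.fromisoformat(s[:-1]) for s in re.split(r"\s?[\+\;]\s", att)]
        return start + (end - start) / 2
    # Single timestamp: strip trailing "Z" and parse directly
    return dt.fromisoformat(att[:-1])


single = sensing_time_to_datetime("2021-06-01T10:00:00.000Z")
paired = sensing_time_to_datetime("2021-06-01T10:00:00.000Z; 2021-06-01T10:10:00.000Z")
print(single, paired)
```

Granules stitched from two consecutive acquisitions carry both sensing times, which is why the midpoint is used as the granule's representative `time` coordinate.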
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 |
3 | # Earth Observation Data Science Cookbook
4 |
5 | [](https://github.com/ProjectPythia/eo-datascience-cookbook/actions/workflows/nightly-build.yaml)
6 | [](https://binder.projectpythia.org/v2/gh/ProjectPythia/eo-datascience-cookbook/main?labpath=notebooks)
7 | [](https://zenodo.org/badge/latestdoi/830421828)
8 |
9 | This Project Pythia Cookbook covers a range of Earth observation examples employing
10 | the Pangeo philosophy. The examples represent the main research lines and BSc/MSc
11 | courses at the Department of Geodesy and Geoinformation at the TU Wien (Austria).
12 | The department has strong ties with the EODC (Earth Observation Data Centre For
13 | Water Resources Monitoring), which hosts, among other datasets, analysis-ready
14 | Sentinel-1 (imaging radar mission) data and has the computational resources to
15 | process large data volumes.
16 |
17 | ## Motivation
18 |
19 | The motivation behind this book is to provide examples of Pangeo-based workflows
20 | applied to realistic examples in Earth observation data science. Creating an
21 | effective learning environment for Earth observation students is a challenging
22 | task due to the rapidly growing volume of remotely sensed, climate, and other
23 | Earth observation data, along with the evolving demands from the tech industry.
24 | Today's Earth observation students are increasingly becoming a blend of traditional
25 | Earth system scientists and "big data scientists", with expertise spanning computer
26 | architectures, programming paradigms, statistics, and machine learning for
27 | predictive modeling. As a result, it is essential to equip educators with the
28 | proper tools for instruction, including training materials, access to data, and
29 | the necessary skills to support scalable and reproducible research.
30 |
31 | ## Authors
32 |
33 | [Wolfgang Wagner](https://github.com/wagner-wolfgang), [Martin Schobben](https://github.com/martinschobben),
34 | [Nikolas Pikall](https://github.com/npikall), [Joseph Wagner](https://github.com/wagnerjoseph), [Davide Festa](https://github.com/maybedave),
35 | [Felix David Reuß](https://github.com/FelixReuss), [Luka Jovic](https://github.com/lukojovic)
36 |
37 | ### Contributors
38 |
39 |
40 |
41 |
42 |
43 | ## Structure
44 |
45 | This book comprises examples of data science concerning Earth Observation (EO) data,
46 | including course material on remote sensing and data products produced by the TU
47 | Wien. It also serves to showcase the data and services offered by the EODC, including
48 | a [STAC](https://docs.eodc.eu/services/stac.html) catalogue and a
49 | [Dask Gateway](https://docs.eodc.eu/services/dask.html) for distributed data processing.
50 |
51 | ### Courses
52 |
53 | This section offers an overview of notebooks, which are used in **courses** from
54 | the Department of Geodesy and Geoinformation at TU Wien.
55 |
56 | ### Templates
57 |
58 | This section provides a collection of general **examples** of Earth observation
59 | related tasks and workflows that are not directly tied to a specific course
60 | or product.
61 |
62 | ### Tutorials
63 |
64 | In this section you will find a collection of lessons, which explain certain
65 | **products** or methods that have been developed at the Department of Geodesy and
66 | Geoinformation at TU Wien.
67 |
68 | ## Running the Notebooks
69 |
70 | You can either run the notebook using [Binder](https://binder.projectpythia.org/v2/gh/ProjectPythia/eo-datascience-cookbook/main?labpath=notebooks)
71 | or on your local machine.
72 |
73 | ### Running on Binder
74 |
75 | The simplest way to interact with a Jupyter Notebook is through
76 | [Binder](https://binder.projectpythia.org/v2/gh/ProjectPythia/eo-datascience-cookbook/main?labpath=notebooks), which enables the execution of a
77 | [Jupyter Book](https://jupyterbook.org) in the cloud. The details of how this works are not
78 | important for now. All you need to know is how to launch a Pythia
79 | Cookbook chapter via Binder. Simply navigate to the top right
80 | corner of the book chapter you are viewing, click the rocket ship
81 | icon (see figure below), and select “launch Binder”. After a
82 | moment you should be presented with a notebook that you can
83 | interact with; that is, you'll be able to execute and even change
84 | the example programs. You'll see that the code cells
85 | have no output at first, until you execute them by pressing
86 | {kbd}`Shift`\+{kbd}`Enter`. Complete details on how to interact with
87 | a live Jupyter notebook are described in [Getting Started with
88 | Jupyter](https://foundations.projectpythia.org/foundations/getting-started-jupyter.html).
89 |
90 | ### Running on Your Own Machine
91 |
92 | If you are interested in running this material locally on your computer, you will
93 | need to follow this workflow:
94 |
95 | 1. Clone the `https://github.com/ProjectPythia/eo-datascience-cookbook` repository:
96 |
97 | ```bash
98 | git clone https://github.com/ProjectPythia/eo-datascience-cookbook
99 | ```
100 |
101 | 1. Move into the `eo-datascience-cookbook` directory
102 | ```bash
103 | cd eo-datascience-cookbook
104 | ```
105 | 1. Create and activate your conda environment from the `environment.yml` file
106 | ```bash
107 | conda env create -f environment.yml
108 | conda activate eo-datascience-cookbook
109 | ```
110 | 1. Move into the `notebooks` directory and start up Jupyterlab
111 | ```bash
112 | cd notebooks/
113 | jupyter lab
114 | ```
115 |
--------------------------------------------------------------------------------
/notebooks/images/ProjectPythia_Logo_Final-01-Blue.svg:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/notebooks/images/logos/pythia_logo-white-notext.svg:
--------------------------------------------------------------------------------
1 |
2 |
129 |
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/unit_01/02_homework_exercise.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Exercise 2\n",
9 | "**Evaluate the Scale of Measurement on Soil Moisture**\n",
10 | "\n",
11 | "## Overview\n",
12 | "\n",
13 | "In this exercise you will do your own evaluation of H SAF ASCAT surface soil moisture (SSM) 6.25 km. However, for your own analysis you will use modelled soil moisture estimates from [ECMWF](https://www.ecmwf.int/) instead of the in situ stations. The particular dataset used here is [ERA5-Land daily](https://cds.climate.copernicus.eu/datasets/derived-era5-land-daily-statistics?tab=overview). We have extracted for you the volume of water in soil layer 1 (0 - 7cm, the surface is at 0cm). The soil's water content is derived by using a combination of modeling and data assimilation techniques. Here's a simplified explanation of how it works: \n",
14 | "\n",
15 | "- **Modelling**: ERA5-Land uses a sophisticated land surface model to simulate various processes that affect soil moisture. This model takes into account factors like rainfall, evaporation, runoff, and infiltration to estimate how much water is present in different layers of the soil.\n",
16 | "\n",
17 | "- **Data Assimilation**: To improve the accuracy of these estimates, ERA5-Land incorporates observational atmospheric variables, such as air temperature and air humidity.\n",
18 | "\n",
19 | "- **Soil Layers**: The model divides the soil into multiple layers, each with its own characteristics and moisture content. By considering the water movement between these layers, ERA5-Land can provide detailed information about soil moisture at different depths. \n",
20 | "\n",
21 | "In essence, ERA5-Land combines advanced modeling techniques with real-world observations to derive accurate and detailed estimates of water content in soil layers. This information is crucial for applications like weather forecasting, agriculture, and water resource management. This dataset has a resolution of 9 km, which is much coarser than the point-wise in situ stations, and comes in volumetric units [m$^3$ / m$^3$].\n",
22 | "\n",
23 | "## Imports"
24 | ]
25 | },
26 | {
27 | "cell_type": "code",
28 | "execution_count": null,
29 | "id": "1",
30 | "metadata": {},
31 | "outputs": [],
32 | "source": [
33 | "import hvplot.pandas # noqa\n",
34 | "import pandas as pd\n",
35 | "from envrs.download_path import make_url"
36 | ]
37 | },
38 | {
39 | "cell_type": "markdown",
40 | "id": "2",
41 | "metadata": {},
42 | "source": [
43 | "## Loading Soil Moisture Data\n",
44 | "\n",
45 | "As before, we load the data as a `pandas.DataFrame`: first the ERA5-Land soil moisture, then the H SAF ASCAT SSM."
46 | ]
47 | },
48 | {
49 | "cell_type": "code",
50 | "execution_count": null,
51 | "id": "3",
52 | "metadata": {},
53 | "outputs": [],
54 | "source": [
55 | "url = make_url(\"era5_ssm_timeseries.csv\")\n",
56 | "df_era5 = pd.read_csv(\n",
57 | " url,\n",
58 | " index_col=\"time\",\n",
59 | " parse_dates=True,\n",
60 | ")\n",
61 | "\n",
62 | "url = make_url(\"ascat-6_25_ssm_timeseries.csv\")\n",
63 | "df_ascat = pd.read_csv(\n",
64 | " url,\n",
65 | " index_col=\"time\",\n",
66 | " parse_dates=True,\n",
67 | ")"
68 | ]
69 | },
70 | {
71 | "cell_type": "markdown",
72 | "id": "4",
73 | "metadata": {},
74 | "source": [
75 | "Now you will perform the same type of analyses as in notebook 2. Perform the analysis by adhering to the following steps and filling in the blanks `...`.\n",
76 | "\n",
77 | "1. **Unit Conversions**\n",
78 | "\n",
79 | "- Calculate porosity with `calc_porosity` from bulk and particle densities `density_df` using pandas `transform`."
80 | ]
81 | },
82 | {
83 | "cell_type": "code",
84 | "execution_count": null,
85 | "id": "5",
86 | "metadata": {},
87 | "outputs": [],
88 | "source": [
89 | "density_df = pd.DataFrame(\n",
90 | " {\n",
91 | " \"name\": [\"Buzi\", \"Chokwé\", \"Mabalane\", \"Mabote\", \"Muanza\"],\n",
92 | " \"bulk_density\": [1.25, 1.4, 1.4, 1.35, 1.25],\n",
93 | " }\n",
94 | ").set_index(\"name\")\n",
95 | "\n",
96 | "\n",
97 | "def calc_porosity(x):\n",
98 | " return 1 - x / 2.65\n",
99 | "\n",
100 | "\n",
101 | "porosity_df = ... # noqa ADD YOUR CODE\n",
102 | "porosity_df"
103 | ]
104 | },
105 | {
106 | "cell_type": "markdown",
107 | "id": "6",
108 | "metadata": {},
109 | "source": [
110 | "- Add the porosity (`porosity_df`) to the ASCAT `DataFrame` as a new column with pandas `merge`."
111 | ]
112 | },
113 | {
114 | "cell_type": "code",
115 | "execution_count": null,
116 | "id": "7",
117 | "metadata": {},
118 | "outputs": [],
119 | "source": [
120 | "df_ascat_porosity = ... # noqa ADD YOUR CODE\n",
121 | "df_ascat_porosity.head()"
122 | ]
123 | },
124 | {
125 | "cell_type": "markdown",
126 | "id": "8",
127 | "metadata": {},
128 | "source": [
129 | "- Convert SSM in degrees of saturation to volumetric units with `deg2vol` and pandas `apply` on `df_ascat_porosity`."
130 | ]
131 | },
132 | {
133 | "cell_type": "code",
134 | "execution_count": null,
135 | "id": "9",
136 | "metadata": {},
137 | "outputs": [],
138 | "source": [
139 | "def deg2vol(df: pd.DataFrame) -> pd.Series:\n",
140 | " \"\"\"Degree of Saturation to Volumetric Units.\n",
141 | "\n",
142 | " Parameters\n",
143 | " ----------\n",
144 | " df: Pandas.DataFrame\n",
145 | " Degree of Saturation\n",
146 | "\n",
147 | " Returns\n",
148 | " -------\n",
149 | " Pandas.Series: Volumetric Units\n",
150 | "\n",
151 | " \"\"\"\n",
152 | " return df[\"porosity\"] * df[\"surface_soil_moisture\"] / 100\n",
153 | "\n",
154 | "\n",
155 | "df_ascat_vol = df_ascat.copy()\n",
156 | "df_ascat_vol[\"unit\"] = \"m³/m³\"\n",
157 | "df_ascat_vol[\"surface_soil_moisture\"] = ... # noqa ADD YOUR CODE\n",
158 | "df_ascat_vol.head()"
159 | ]
160 | },
161 | {
162 | "cell_type": "markdown",
163 | "id": "10",
164 | "metadata": {},
165 | "source": [
166 | "## Correlations\n",
167 | "\n",
168 | "- Concatenate the `df_ascat_vol` and `df_era5` datasets."
169 | ]
170 | },
171 | {
172 | "cell_type": "code",
173 | "execution_count": null,
174 | "id": "11",
175 | "metadata": {},
176 | "outputs": [],
177 | "source": [
178 | "df_combined = ... # noqa ADD YOUR CODE\n",
179 | "df_combined.head()"
180 | ]
181 | },
182 | {
183 | "cell_type": "code",
184 | "execution_count": null,
185 | "id": "12",
186 | "metadata": {},
187 | "outputs": [],
188 | "source": [
189 | "df_combined.hvplot.scatter(\n",
190 | " x=\"time\",\n",
191 | " y=\"surface_soil_moisture\",\n",
192 | " by=\"type\",\n",
193 | " groupby=\"name\",\n",
194 | " frame_width=800,\n",
195 | " padding=(0.01, 0.1),\n",
196 | " alpha=0.5,\n",
197 | ")"
198 | ]
199 | },
200 | {
201 | "cell_type": "markdown",
202 | "id": "13",
203 | "metadata": {},
204 | "source": [
205 | "- Resample the `df_ascat_vol` and `df_era5` datasets to daily values and merge them."
206 | ]
207 | },
208 | {
209 | "cell_type": "code",
210 | "execution_count": null,
211 | "id": "14",
212 | "metadata": {},
213 | "outputs": [],
214 | "source": [
215 | "df_insitu_daily = (\n",
216 | " df_era5.groupby(\"name\")[\"surface_soil_moisture\"]\n",
217 | " ... # noqa ADD YOUR CODE\n",
218 | " .median()\n",
219 | " .to_frame(\"era5\")\n",
220 | ")\n",
221 | "\n",
222 | "df_ascat_vol_daily = (\n",
223 | " ... # noqa ADD YOUR CODE\n",
224 | ")\n",
225 | "\n",
226 | "df_resampled = df_ascat_vol_daily.join(df_insitu_daily).dropna()\n",
227 | "df_resampled.head()"
228 | ]
229 | },
230 | {
231 | "cell_type": "markdown",
232 | "id": "15",
233 | "metadata": {},
234 | "source": [
235 | "- Calculate Pearson's R$^2$ with pandas `groupby` on the locations and `corr`."
236 | ]
237 | },
238 | {
239 | "cell_type": "code",
240 | "execution_count": null,
241 | "id": "16",
242 | "metadata": {},
243 | "outputs": [],
244 | "source": [
245 | "... # ADD YOUR CODE"
246 | ]
247 | },
248 | {
249 | "cell_type": "markdown",
250 | "id": "17",
251 | "metadata": {},
252 | "source": [
253 | "2. **Calculate the root mean squared error**\n",
254 | "\n",
255 | "   - Calculate the RMSE with pandas `groupby` on the locations and a user-defined function `rmse`."
256 | ]
257 | },
258 | {
259 | "cell_type": "code",
260 | "execution_count": null,
261 | "id": "18",
262 | "metadata": {},
263 | "outputs": [],
264 | "source": [
265 | "def rmse(df):\n",
266 | " return ... # ADD YOUR CODE\n",
267 | "\n",
268 | "\n",
269 | "df_resampled.groupby(\"name\").apply(rmse)"
270 | ]
271 | }
272 | ],
273 | "metadata": {
274 | "kernelspec": {
275 | "display_name": "Python 3 (ipykernel)",
276 | "language": "python",
277 | "name": "python3"
278 | },
279 | "language_info": {
280 | "codemirror_mode": {
281 | "name": "ipython",
282 | "version": 3
283 | },
284 | "file_extension": ".py",
285 | "mimetype": "text/x-python",
286 | "name": "python",
287 | "nbconvert_exporter": "python",
288 | "pygments_lexer": "ipython3",
289 | "version": "3.13.5"
290 | }
291 | },
292 | "nbformat": 4,
293 | "nbformat_minor": 5
294 | }
295 |
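The exercise above leans on pandas' split-apply-combine pattern: `groupby` combined with `transform` (which returns a result aligned with the original rows) or an aggregation (which collapses each group to one value). A toy illustration on invented numbers, not the exercise solution:

```python
import pandas as pd

# Made-up data: two groups with two observations each
df = pd.DataFrame(
    {
        "name": ["a", "a", "b", "b"],
        "value": [1.0, 3.0, 10.0, 30.0],
    }
)

# transform broadcasts the per-group statistic back onto every row ...
df["group_mean"] = df.groupby("name")["value"].transform("mean")

# ... while an aggregation yields one value per group
group_max = df.groupby("name")["value"].max()

print(df)
print(group_max)
```

This is why the porosity conversion uses `transform` (the result must line up row-by-row with the time series), while the R² and RMSE computations aggregate per location.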
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/unit_01/03_homework_exercise.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Exercise 3\n",
9 | "**Drought Indicators based on Soil Moisture Data**\n",
10 | "\n",
11 | "## Overview\n",
12 | "\n",
13 | "In this exercise you will calculate two alternatives to Z-score anomaly detection: the Soil Moisture Anomaly Percentage Index (SMAPI) and the modified Z score. A comprehensive overview of drought indicators can be found in Vreugdenhil et al. (2022)[^1].\n",
14 | "\n",
15 | "\n",
16 | "\n",
17 | "$$ \n",
18 | "\\begin{aligned}\n",
19 | "\\text{ Z score: } &\\quad z_{k,i} = \\frac{\\text{SM}_{k,i} - \\bar{\\text{SM}}_i}{s^{\\bar{\\text{SM}}_i}} \\\\\n",
20 | "\\text{ SMAPI: } &\\quad I_{k, i} = \\frac{\\text{SM}_{k, i} - \\bar{\\text{SM}}_i}{ \\bar{\\text{SM}}_i} \\times 100 \\\\\n",
21 | "\\text{ Modified Z score: } &\\quad z_{k,i}^* = 0.6745 \\frac{\\text{SM}_{k,i} - \\tilde{\\text{SM}}_i}{\\text{MAD}_i} &\\quad \\text{MAD} = \\text{median} \\left( \\left| x_i - \\tilde{x} \\right| \\right)\n",
22 | "\\end{aligned}\n",
23 | "$$\n",
24 | "\n",
25 | "where:\n",
26 | "\n",
27 | "$$\n",
28 | "\\begin{align}\n",
29 | "\\text{SM}_{k,i} &\\quad \\text{: soil moisture for specific period (e.g., month) (} \\; i \\; \\text{) and year (} \\; k \\; \\text{)}\\\\\n",
30 | "\\bar{\\text{SM}}_i &\\quad \\text{: long-term mean soil moisture for specific period (e.g., month) (} \\; i \\; \\text{)} \\\\\n",
31 | "\\tilde{\\text{SM}}_i &\\quad \\text{: long-term median soil moisture for specific period (e.g., month) (} \\; i \\; \\text{)} \\\\\n",
32 | "s^{\\bar{\\text{SM}}_i} &\\quad \\text{: long-term standard deviation soil moisture for specific period (e.g., month) (} \\; i \\; \\text{)} \\\\\n",
33 | "\\text{MAD}_i &\\quad \\text{: long-term median absolute deviation soil moisture for specific period (e.g., month) (} \\; i \\; \\text{)}\n",
34 | "\\end{align}\n",
35 | "$$\n",
36 | "\n",
37 | "[^1]: Vreugdenhil, Mariette, Isabella Greimeister-Pfeil, Wolfgang Preimesberger, et al. 2022. “Microwave Remote Sensing for Agricultural Drought Monitoring: Recent Developments and Challenges.” Frontiers in Water 4 (November). [doi: frwa.2022.1045451](https://doi.org/10.3389/frwa.2022.1045451).\n",
38 | "\n",
39 | "## Imports"
40 | ]
41 | },
42 | {
43 | "cell_type": "code",
44 | "execution_count": null,
45 | "id": "1",
46 | "metadata": {},
47 | "outputs": [],
48 | "source": [
49 | "import hvplot.pandas # noqa\n",
50 | "import pandas as pd\n",
51 | "from envrs.download_path import make_url"
52 | ]
53 | },
54 | {
55 | "cell_type": "markdown",
56 | "id": "2",
57 | "metadata": {},
58 | "source": [
59 | "## Loading Soil Moisture Data\n",
60 | "\n",
61 | "We will use the ASCAT Surface Soil Moisture (SSM) time series for Buzi, Chokwé, Mabalane, Mabote, and Muanza. As before, we load the data as a `pandas.DataFrame`."
62 | ]
63 | },
64 | {
65 | "cell_type": "code",
66 | "execution_count": null,
67 | "id": "3",
68 | "metadata": {},
69 | "outputs": [],
70 | "source": [
71 | "url = make_url(\"ascat-6_25_ssm_timeseries.csv\")\n",
72 | "ts = pd.read_csv(\n",
73 | " url,\n",
74 | " index_col=\"time\",\n",
75 | " parse_dates=True,\n",
76 | ")"
77 | ]
78 | },
79 | {
80 | "cell_type": "markdown",
81 | "id": "4",
82 | "metadata": {},
83 | "source": [
84 | "## Calculation of Drought Indicators\n",
85 | "\n",
86 | "Calculating the drought indicators involves three steps:\n",
87 | "\n",
88 | "1. Translate the equation into `pandas` math operations\n",
89 | "2. Aggregate the time series\n",
90 | "3. Apply the function to a time series grouped by location (`\"name\"`) and month (`ts_monthly.index.month`)."
91 | ]
92 | },
93 | {
94 | "cell_type": "code",
95 | "execution_count": null,
96 | "id": "5",
97 | "metadata": {},
98 | "outputs": [],
99 | "source": [
100 | "def zscore(x: pd.Series) -> pd.Series:\n",
101 | " \"\"\"Z Score.\n",
102 | "\n",
103 | " Parameters\n",
104 | " ----------\n",
105 | " x : pd.Series\n",
106 | " Monthly aggregated surface soil moisture values\n",
107 | "\n",
108 | "\n",
109 | " Returns\n",
110 | " -------\n",
111 | " Z scores : pd.Series\n",
112 | "\n",
113 | " \"\"\"\n",
114 | " return (x - x.mean()) / x.std()"
115 | ]
116 | },
117 | {
118 | "cell_type": "code",
119 | "execution_count": null,
120 | "id": "6",
121 | "metadata": {},
122 | "outputs": [],
123 | "source": [
124 | "ts_monthly = (\n",
125 | " ts.groupby([pd.Grouper(freq=\"ME\"), \"name\"])\n",
126 | " .surface_soil_moisture.mean()\n",
127 | " .reset_index(level=\"name\")\n",
128 | ")"
129 | ]
130 | },
131 | {
132 | "cell_type": "code",
133 | "execution_count": null,
134 | "id": "7",
135 | "metadata": {},
136 | "outputs": [],
137 | "source": [
138 | "ts_monthly[\"zscore\"] = ts_monthly.groupby(\n",
139 | " [ts_monthly.index.month, \"name\"]\n",
140 | ").surface_soil_moisture.transform(zscore)\n",
141 | "ts_monthly.hvplot.line(\n",
142 | " x=\"time\",\n",
143 | " y=\"zscore\",\n",
144 | " by=\"name\",\n",
145 | " frame_width=800,\n",
146 | " padding=(0.01, 0.1),\n",
147 | ")"
148 | ]
149 | },
150 | {
151 | "cell_type": "markdown",
152 | "id": "8",
153 | "metadata": {},
154 | "source": [
155 | "Now it is your turn to do the same for \"SMAPI\" and the \"Modified Z score\" by filling in the blanks `...`."
156 | ]
157 | },
158 | {
159 | "cell_type": "markdown",
160 | "id": "9",
161 | "metadata": {},
162 | "source": [
163 | "## Soil Moisture Anomaly Percentage Index (SMAPI)"
164 | ]
165 | },
166 | {
167 | "cell_type": "code",
168 | "execution_count": null,
169 | "id": "10",
170 | "metadata": {},
171 | "outputs": [],
172 | "source": [
173 | "def smapi(x: pd.Series) -> pd.Series: # noqa: D103\n",
174 | " return # noqa: ADD YOUR CODE HERE"
175 | ]
176 | },
177 | {
178 | "cell_type": "code",
179 | "execution_count": null,
180 | "id": "11",
181 | "metadata": {},
182 | "outputs": [],
183 | "source": [
184 | "# noqa: ADD YOUR CODE HERE"
185 | ]
186 | },
187 | {
188 | "cell_type": "code",
189 | "execution_count": null,
190 | "id": "12",
191 | "metadata": {},
192 | "outputs": [],
193 | "source": [
194 | "ts_monthly[\"smapi\"] = ts_monthly.groupby(\n",
195 | " [ts_monthly.index.month, \"name\"]\n",
196 | ").surface_soil_moisture.transform(smapi)\n",
197 | "ts_monthly.hvplot.line(\n",
198 | " x=\"time\",\n",
199 | " y=\"smapi\",\n",
200 | " by=\"name\",\n",
201 | " frame_width=800,\n",
202 | " padding=(0.01, 0.1),\n",
203 | ")"
204 | ]
205 | },
206 | {
207 | "cell_type": "markdown",
208 | "id": "13",
209 | "metadata": {},
210 | "source": [
211 | "## Modified Z Score"
212 | ]
213 | },
214 | {
215 | "cell_type": "code",
216 | "execution_count": null,
217 | "id": "14",
218 | "metadata": {},
219 | "outputs": [],
220 | "source": [
221 | "def modified_zscore(x: pd.Series) -> pd.Series: # noqa: D103\n",
222 | " return # noqa: ADD YOUR CODE HERE"
223 | ]
224 | },
225 | {
226 | "cell_type": "code",
227 | "execution_count": null,
228 | "id": "15",
229 | "metadata": {},
230 | "outputs": [],
231 | "source": [
232 | "# noqa: ADD YOUR CODE HERE"
233 | ]
234 | },
235 | {
236 | "cell_type": "code",
237 | "execution_count": null,
238 | "id": "16",
239 | "metadata": {},
240 | "outputs": [],
241 | "source": [
242 | "ts_monthly[\"modified_zscore\"] = ts_monthly.groupby(\n",
243 | " [ts_monthly.index.month, \"name\"]\n",
244 | ").surface_soil_moisture.transform(modified_zscore)\n",
245 | "ts_monthly.hvplot.line(\n",
246 | " x=\"time\",\n",
247 | " y=\"modified_zscore\",\n",
248 | " by=\"name\",\n",
249 | " frame_width=800,\n",
250 | " padding=(0.01, 0.1),\n",
251 | ")"
252 | ]
253 | },
254 | {
255 | "cell_type": "markdown",
256 | "id": "17",
257 | "metadata": {},
258 | "source": [
259 | "## Compare the results"
260 | ]
261 | },
262 | {
263 | "cell_type": "code",
264 | "execution_count": null,
265 | "id": "18",
266 | "metadata": {},
267 | "outputs": [],
268 | "source": [
269 | "(\n",
270 | " ts_monthly.hvplot.line(\n",
271 | " x=\"time\",\n",
272 | " y=[\"zscore\", \"modified_zscore\"],\n",
273 | " groupby=\"name\",\n",
274 | " frame_width=800,\n",
275 | " padding=(0.01, 0.1),\n",
276 | " )\n",
277 | " + ts_monthly.hvplot.line(\n",
278 | " x=\"time\",\n",
279 | " y=[\"smapi\"],\n",
280 | " groupby=\"name\",\n",
281 | " frame_width=800,\n",
282 | " padding=(0.01, 0.1),\n",
283 | " )\n",
284 | ").cols(1)"
285 | ]
286 | }
287 | ],
288 | "metadata": {
289 | "kernelspec": {
290 | "display_name": "Python 3 (ipykernel)",
291 | "language": "python",
292 | "name": "python3"
293 | },
294 | "language_info": {
295 | "codemirror_mode": {
296 | "name": "ipython",
297 | "version": 3
298 | },
299 | "file_extension": ".py",
300 | "mimetype": "text/x-python",
301 | "name": "python",
302 | "nbconvert_exporter": "python",
303 | "pygments_lexer": "ipython3",
304 | "version": "3.13.5"
305 | }
306 | },
307 | "nbformat": 4,
308 | "nbformat_minor": 5
309 | }
310 |
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/unit_02/05_in_class_exercise.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Wavelength and Polarization\n",
9 | "\n",
10 | "In this notebook, we aim to demonstrate how C-band (4–8 GHz, wavelengths of approximately 3.75–7.5 cm) and L-band (1–2 GHz, wavelengths of approximately 15–30 cm) radio frequencies differ for different land covers and times of the year. In addition, we'll look at co- and cross-polarized backscattering:\n",
11 | "\n",
12 | "+ Sentinel-1 (C-band)\n",
13 | " + VV\n",
14 | " + VH\n",
15 | "+ Alos-2 (L-band)\n",
16 | " + HH\n",
17 | " + HV"
18 | ]
19 | },
20 | {
21 | "cell_type": "code",
22 | "execution_count": null,
23 | "id": "1",
24 | "metadata": {},
25 | "outputs": [],
26 | "source": [
27 | "import holoviews as hv\n",
28 | "import hvplot.xarray # noqa: F401\n",
29 | "import intake\n",
30 | "import matplotlib.pyplot as plt\n",
31 | "import numpy as np"
32 | ]
33 | },
34 | {
35 | "cell_type": "markdown",
36 | "id": "2",
37 | "metadata": {},
38 | "source": [
39 | "## Data Loading\n",
40 | "\n",
41 | "We load the data again with the help of `intake`.\n"
42 | ]
43 | },
44 | {
45 | "cell_type": "code",
46 | "execution_count": null,
47 | "id": "3",
48 | "metadata": {},
49 | "outputs": [],
50 | "source": [
51 | "uri = \"https://git.geo.tuwien.ac.at/public_projects/microwave-remote-sensing/-/raw/main/microwave-remote-sensing.yml\"\n",
52 | "cat = intake.open_catalog(uri)\n",
53 | "fused_ds = cat.fused.read()\n",
54 | "fused_ds"
55 | ]
56 | },
57 | {
58 | "cell_type": "markdown",
59 | "id": "4",
60 | "metadata": {},
61 | "source": [
62 | "The loaded data contains the Leaf Area Index (LAI), which is used as an estimate of the foliage cover of forest canopies. A high LAI is therefore interpreted as forested area, whereas low values indicate less vegetated areas (shrubs, grassland, and crops).\n",
63 | "\n",
64 | "First we'll have a look at the mean and standard deviation of LAI over all timeslices. This can be achieved by using the `mean` and `std` methods of the `xarray` object and by supplying a dimension over which these aggregating operations will be applied. We use the dimension \"time\", thereby flattening the cube to a 2-D array with dimensions x and y.\n"
65 | ]
66 | },
67 | {
68 | "cell_type": "code",
69 | "execution_count": null,
70 | "id": "5",
71 | "metadata": {},
72 | "outputs": [],
73 | "source": [
74 | "fig, ax = plt.subplots(1, 2, figsize=(15, 6))\n",
75 | "\n",
76 | "LAI_dc = fused_ds.LAI\n",
77 | "LAI_mean = LAI_dc.mean(\"time\")\n",
78 | "LAI_std = LAI_dc.std(\"time\")\n",
79 | "\n",
80 | "LAI_mean.plot(ax=ax[0], vmin=0, vmax=6).axes.set_aspect(\"equal\")\n",
81 | "LAI_std.plot(ax=ax[1], vmin=0, vmax=3).axes.set_aspect(\"equal\")\n",
82 | "plt.tight_layout()"
83 | ]
84 | },
85 | {
86 | "cell_type": "markdown",
87 | "id": "6",
88 | "metadata": {},
89 | "source": [
90 | "*Figure 1: Map of mean LAI (left) and the associated standard deviation (right) for each pixel over time around Lake Garda.*\n",
91 | "\n",
92 | "It appears that the northern parts of our study area contain larger and more variable amounts of green elements per unit area. This might indicate a more complete foliage cover and thus forest.\n",
93 | "\n",
94 | "## Timeseries\n",
95 | "\n",
96 | "Now that we have detected possible forested areas, let's delve a bit deeper into the data. Remember that we deal with a spatiotemporal datacube. This gives us the possibility to study changes for each time increment. Hence we can show what happens to LAI in areas with generally low values as well as high values. We achieve this by filtering the datacube with the `where` method for areas with low and high mean LAI values. In turn, we aggregate the remaining datacube over the spatial dimensions (\"x\" and \"y\") to get a mean value for each time increment.\n"
97 | ]
98 | },
99 | {
100 | "cell_type": "code",
101 | "execution_count": null,
102 | "id": "7",
103 | "metadata": {},
104 | "outputs": [],
105 | "source": [
106 | "fig, ax = plt.subplots(1, 2, figsize=(15, 4))\n",
107 | "\n",
108 | "LAI_low = LAI_dc.where(LAI_mean < 4)\n",
109 | "LAI_high = LAI_dc.where(LAI_mean > 4)\n",
110 | "\n",
111 | "LAI_low.mean([\"x\", \"y\"]).plot.scatter(x=\"time\", ax=ax[0], ylim=(0, 6))\n",
112 | "LAI_high.mean([\"x\", \"y\"]).plot.scatter(x=\"time\", ax=ax[1], ylim=(0, 6))\n",
113 | "ax[0].set_title(\"Low Mean LAI ($\\\\bar{LAI} < 4$)\")\n",
114 | "ax[1].set_title(\"High Mean LAI ($\\\\bar{LAI} > 4$)\")\n",
115 | "plt.tight_layout()"
116 | ]
117 | },
118 | {
119 | "cell_type": "markdown",
120 | "id": "8",
121 | "metadata": {},
122 | "source": [
123 | "*Figure 2: Timeseries of mean LAI per timeslice for areas with low (left) and high (right) mean LAI in Figure 1.*\n",
124 | "\n",
125 | "Now we can see that areas with high mean LAI values (Figure 1) drop off to values as low as those of areas with low mean LAI during the autumn months (Figure 2, right panel). Hence we might deduce that we are dealing with deciduous forest that becomes less green during autumn, as can be expected for the study area.\n",
126 | "\n",
127 | "Remember that longer wavelengths like L-band are more likely to penetrate a forest canopy and interact more readily with larger objects such as tree trunks and the forest floor. In turn, C-band microwaves are more likely to interact with sparse and shrub vegetation. The polarization of the emitted and received microwaves, on the other hand, depends on the type of backscattering: co-polarized returns (HH and VV) occur more frequently with direct backscatter or double-bounce scattering, whereas volume scattering arises when the radar signal undergoes multiple reflections within 3-dimensional matter. As the orientation of the main scatterers is random, the polarization of the backscattered signal is also randomized, so volume scattering can increase the cross-polarized intensity.\n",
128 | "\n",
129 | "Let's put this to the test by checking the microwave backscatter signatures over forested and sparsely vegetated areas as well as water bodies (Lake Garda). Let's first look at the different sensor readings for the beginning of summer and autumn.\n"
130 | ]
131 | },
132 | {
133 | "cell_type": "code",
134 | "execution_count": null,
135 | "id": "9",
136 | "metadata": {},
137 | "outputs": [],
138 | "source": [
139 | "hv.output(widget_location=\"bottom\")\n",
140 | "\n",
141 | "t1 = (\n",
142 | " fused_ds.gam0.isel(time=2)\n",
143 | " .hvplot.image(\n",
144 | " robust=True, data_aspect=1, cmap=\"Greys_r\", rasterize=True, clim=(-25, 0)\n",
145 | " )\n",
146 | " .opts(frame_height=400, aspect=\"equal\")\n",
147 | ")\n",
148 | "\n",
149 | "t2 = (\n",
150 | " fused_ds.gam0.isel(time=-1)\n",
151 | " .hvplot.image(\n",
152 | " robust=True, data_aspect=1, cmap=\"Greys_r\", rasterize=True, clim=(-25, 0)\n",
153 | " )\n",
154 | " .opts(frame_height=400, aspect=\"equal\")\n",
155 | ")\n",
156 | "\n",
157 | "t1 + t2"
158 | ]
159 | },
160 | {
161 | "cell_type": "markdown",
162 | "id": "10",
163 | "metadata": {},
164 | "source": [
165 | "*Figure 3: Maps of Sentinel-1 and Alos-2 $\\gamma^0_T \\,[dB]$ for the beginning of summer (left) and autumn (right).*\n",
166 | "\n",
167 | "The most notable difference is the lower energy received for cross-polarized than for co-polarized microwaves, for both Sentinel-1 and Alos-2. These differences are independent of the time of year. However, one can also note small changes in the received energy for the same satellite depending on the time of year. To get a better feel for these changes over time, we generate the following interactive plot, in which you can select areas of a certain mean LAI (by clicking on the map) and see the associated timeseries of $\\gamma^0_T$ for each of the sensors.\n"
168 | ]
169 | },
170 | {
171 | "cell_type": "code",
172 | "execution_count": null,
173 | "id": "11",
174 | "metadata": {},
175 | "outputs": [],
176 | "source": [
177 | "LAI_image = LAI_mean.hvplot.image(rasterize=True, cmap=\"viridis\", clim=(0, 6)).opts(\n",
178 | " title=\"Mean LAI (Selectable)\", frame_height=400, aspect=\"equal\"\n",
179 | ")\n",
180 | "\n",
181 | "\n",
182 | "def get_timeseries(x, y):\n",
183 | " \"\"\"Callback Function Holoviews\n",
184 | "\n",
185 | " Parameters\n",
186 | " ----------\n",
187 | " x: float\n",
188 | " numeric value for x selected on LAI map\n",
189 | " y: float\n",
190 | " numeric value for y selected on LAI map\n",
191 | "\n",
192 | " \"\"\"\n",
193 | " lai_value = LAI_mean.sel(x=x, y=y, method=\"nearest\").values\n",
194 | "\n",
195 | " if np.isnan(lai_value):\n",
196 | " select = fused_ds.where(LAI_mean.isnull())\n",
197 | " label = \"Water\"\n",
198 | " else:\n",
199 | " mask = np.isclose(LAI_mean, lai_value, atol=0.05)\n",
200 | " select = fused_ds.where(mask)\n",
201 | " label = \"Mean LAI: \" + str(np.round(lai_value, 1))\n",
202 | "\n",
203 | " time_series = (\n",
204 | " select.gam0.to_dataset(\"sensor\")\n",
205 | " .median([\"x\", \"y\"], skipna=True)\n",
206 | " .hvplot.scatter(ylim=(-30, 5))\n",
207 | " .opts(title=label, frame_height=400)\n",
208 | " )\n",
209 | "\n",
210 | " return time_series\n",
211 | "\n",
212 | "\n",
213 | "point_stream = hv.streams.SingleTap(source=LAI_image)\n",
214 | "time_series = hv.DynamicMap(get_timeseries, streams=[point_stream])\n",
215 | "LAI_image + time_series"
216 | ]
217 | },
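{
"cell_type": "markdown",
"id": "11a",
"metadata": {},
"source": [
"When reading these plots, recall the standard conversion between backscatter in decibels and linear intensity (a general property of the dB scale, not specific to this dataset):\n",
"\n",
"$$\\gamma^0_{linear} = 10^{\\gamma^0_{dB}/10},$$\n",
"\n",
"so 10 dB corresponds to a 10-fold and 20 dB to a 100-fold increase in intensity relative to 0 dB."
]
},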
218 | {
219 | "cell_type": "markdown",
220 | "id": "12",
221 | "metadata": {},
222 | "source": [
223 | "*Figure 4: (Left) Map of mean LAI around Lake Garda. The pixel values can be seen by hovering your mouse over the pixels; clicking on a pixel generates the timeseries for the associated mean LAI on the right-hand side. (Right) Timeseries of Sentinel-1 and Alos-2 $\\gamma^0_T [dB]$.*\n",
224 | "\n",
225 | "Can you see some patterns when analyzing the different wavelengths and polarizations?\n",
226 | "\n",
227 | "Remember again that we deal with a logarithmic scale. A measurement of 10 dB is 10 times brighter than the intensity measured at 0 dB, and 100 times brighter at 20 dB. The most notable difference is that the offset between cross- and co-polarized signals becomes larger at low LAI and smaller at higher LAI. This might indicate the effect of volume scattering in forested areas, where co- and cross-polarized backscattering values become more equal. You will study the differences between cross- and co-polarized backscattering in more detail in the homework exercise."
228 | ]
229 | }
230 | ],
231 | "metadata": {
232 | "kernelspec": {
233 | "display_name": "Python 3 (ipykernel)",
234 | "language": "python",
235 | "name": "python3"
236 | },
237 | "language_info": {
238 | "codemirror_mode": {
239 | "name": "ipython",
240 | "version": 3
241 | },
242 | "file_extension": ".py",
243 | "mimetype": "text/x-python",
244 | "name": "python",
245 | "nbconvert_exporter": "python",
246 | "pygments_lexer": "ipython3",
247 | "version": "3.11.13"
248 | }
249 | },
250 | "nbformat": 4,
251 | "nbformat_minor": 5
252 | }
253 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | Apache License
2 | Version 2.0, January 2004
3 | http://www.apache.org/licenses/
4 |
5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6 |
7 | 1. Definitions.
8 |
9 | "License" shall mean the terms and conditions for use, reproduction,
10 | and distribution as defined by Sections 1 through 9 of this document.
11 |
12 | "Licensor" shall mean the copyright owner or entity authorized by
13 | the copyright owner that is granting the License.
14 |
15 | "Legal Entity" shall mean the union of the acting entity and all
16 | other entities that control, are controlled by, or are under common
17 | control with that entity. For the purposes of this definition,
18 | "control" means (i) the power, direct or indirect, to cause the
19 | direction or management of such entity, whether by contract or
20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the
21 | outstanding shares, or (iii) beneficial ownership of such entity.
22 |
23 | "You" (or "Your") shall mean an individual or Legal Entity
24 | exercising permissions granted by this License.
25 |
26 | "Source" form shall mean the preferred form for making modifications,
27 | including but not limited to software source code, documentation
28 | source, and configuration files.
29 |
30 | "Object" form shall mean any form resulting from mechanical
31 | transformation or translation of a Source form, including but
32 | not limited to compiled object code, generated documentation,
33 | and conversions to other media types.
34 |
35 | "Work" shall mean the work of authorship, whether in Source or
36 | Object form, made available under the License, as indicated by a
37 | copyright notice that is included in or attached to the work
38 | (an example is provided in the Appendix below).
39 |
40 | "Derivative Works" shall mean any work, whether in Source or Object
41 | form, that is based on (or derived from) the Work and for which the
42 | editorial revisions, annotations, elaborations, or other modifications
43 | represent, as a whole, an original work of authorship. For the purposes
44 | of this License, Derivative Works shall not include works that remain
45 | separable from, or merely link (or bind by name) to the interfaces of,
46 | the Work and Derivative Works thereof.
47 |
48 | "Contribution" shall mean any work of authorship, including
49 | the original version of the Work and any modifications or additions
50 | to that Work or Derivative Works thereof, that is intentionally
51 | submitted to Licensor for inclusion in the Work by the copyright owner
52 | or by an individual or Legal Entity authorized to submit on behalf of
53 | the copyright owner. For the purposes of this definition, "submitted"
54 | means any form of electronic, verbal, or written communication sent
55 | to the Licensor or its representatives, including but not limited to
56 | communication on electronic mailing lists, source code control systems,
57 | and issue tracking systems that are managed by, or on behalf of, the
58 | Licensor for the purpose of discussing and improving the Work, but
59 | excluding communication that is conspicuously marked or otherwise
60 | designated in writing by the copyright owner as "Not a Contribution."
61 |
62 | "Contributor" shall mean Licensor and any individual or Legal Entity
63 | on behalf of whom a Contribution has been received by Licensor and
64 | subsequently incorporated within the Work.
65 |
66 | 2. Grant of Copyright License. Subject to the terms and conditions of
67 | this License, each Contributor hereby grants to You a perpetual,
68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
69 | copyright license to reproduce, prepare Derivative Works of,
70 | publicly display, publicly perform, sublicense, and distribute the
71 | Work and such Derivative Works in Source or Object form.
72 |
73 | 3. Grant of Patent License. Subject to the terms and conditions of
74 | this License, each Contributor hereby grants to You a perpetual,
75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
76 | (except as stated in this section) patent license to make, have made,
77 | use, offer to sell, sell, import, and otherwise transfer the Work,
78 | where such license applies only to those patent claims licensable
79 | by such Contributor that are necessarily infringed by their
80 | Contribution(s) alone or by combination of their Contribution(s)
81 | with the Work to which such Contribution(s) was submitted. If You
82 | institute patent litigation against any entity (including a
83 | cross-claim or counterclaim in a lawsuit) alleging that the Work
84 | or a Contribution incorporated within the Work constitutes direct
85 | or contributory patent infringement, then any patent licenses
86 | granted to You under this License for that Work shall terminate
87 | as of the date such litigation is filed.
88 |
89 | 4. Redistribution. You may reproduce and distribute copies of the
90 | Work or Derivative Works thereof in any medium, with or without
91 | modifications, and in Source or Object form, provided that You
92 | meet the following conditions:
93 |
94 | (a) You must give any other recipients of the Work or
95 | Derivative Works a copy of this License; and
96 |
97 | (b) You must cause any modified files to carry prominent notices
98 | stating that You changed the files; and
99 |
100 | (c) You must retain, in the Source form of any Derivative Works
101 | that You distribute, all copyright, patent, trademark, and
102 | attribution notices from the Source form of the Work,
103 | excluding those notices that do not pertain to any part of
104 | the Derivative Works; and
105 |
106 | (d) If the Work includes a "NOTICE" text file as part of its
107 | distribution, then any Derivative Works that You distribute must
108 | include a readable copy of the attribution notices contained
109 | within such NOTICE file, excluding those notices that do not
110 | pertain to any part of the Derivative Works, in at least one
111 | of the following places: within a NOTICE text file distributed
112 | as part of the Derivative Works; within the Source form or
113 | documentation, if provided along with the Derivative Works; or,
114 | within a display generated by the Derivative Works, if and
115 | wherever such third-party notices normally appear. The contents
116 | of the NOTICE file are for informational purposes only and
117 | do not modify the License. You may add Your own attribution
118 | notices within Derivative Works that You distribute, alongside
119 | or as an addendum to the NOTICE text from the Work, provided
120 | that such additional attribution notices cannot be construed
121 | as modifying the License.
122 |
123 | You may add Your own copyright statement to Your modifications and
124 | may provide additional or different license terms and conditions
125 | for use, reproduction, or distribution of Your modifications, or
126 | for any such Derivative Works as a whole, provided Your use,
127 | reproduction, and distribution of the Work otherwise complies with
128 | the conditions stated in this License.
129 |
130 | 5. Submission of Contributions. Unless You explicitly state otherwise,
131 | any Contribution intentionally submitted for inclusion in the Work
132 | by You to the Licensor shall be under the terms and conditions of
133 | this License, without any additional terms or conditions.
134 | Notwithstanding the above, nothing herein shall supersede or modify
135 | the terms of any separate license agreement you may have executed
136 | with Licensor regarding such Contributions.
137 |
138 | 6. Trademarks. This License does not grant permission to use the trade
139 | names, trademarks, service marks, or product names of the Licensor,
140 | except as required for reasonable and customary use in describing the
141 | origin of the Work and reproducing the content of the NOTICE file.
142 |
143 | 7. Disclaimer of Warranty. Unless required by applicable law or
144 | agreed to in writing, Licensor provides the Work (and each
145 | Contributor provides its Contributions) on an "AS IS" BASIS,
146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
147 | implied, including, without limitation, any warranties or conditions
148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
149 | PARTICULAR PURPOSE. You are solely responsible for determining the
150 | appropriateness of using or redistributing the Work and assume any
151 | risks associated with Your exercise of permissions under this License.
152 |
153 | 8. Limitation of Liability. In no event and under no legal theory,
154 | whether in tort (including negligence), contract, or otherwise,
155 | unless required by applicable law (such as deliberate and grossly
156 | negligent acts) or agreed to in writing, shall any Contributor be
157 | liable to You for damages, including any direct, indirect, special,
158 | incidental, or consequential damages of any character arising as a
159 | result of this License or out of the use or inability to use the
160 | Work (including but not limited to damages for loss of goodwill,
161 | work stoppage, computer failure or malfunction, or any and all
162 | other commercial damages or losses), even if such Contributor
163 | has been advised of the possibility of such damages.
164 |
165 | 9. Accepting Warranty or Additional Liability. While redistributing
166 | the Work or Derivative Works thereof, You may choose to offer,
167 | and charge a fee for, acceptance of support, warranty, indemnity,
168 | or other liability obligations and/or rights consistent with this
169 | License. However, in accepting such obligations, You may act only
170 | on Your own behalf and on Your sole responsibility, not on behalf
171 | of any other Contributor, and only if You agree to indemnify,
172 | defend, and hold each Contributor harmless for any liability
173 | incurred by, or claims asserted against, such Contributor by reason
174 | of your accepting any such warranty or additional liability.
175 |
176 | END OF TERMS AND CONDITIONS
177 |
178 | APPENDIX: How to apply the Apache License to your work.
179 |
180 | To apply the Apache License to your work, attach the following
181 | boilerplate notice, with the fields enclosed by brackets "[]"
182 | replaced with your own identifying information. (Don't include
183 | the brackets!) The text should be enclosed in the appropriate
184 | comment syntax for the file format. We also recommend that a
185 | file or class name and description of purpose be included on the
186 | same "printed page" as the copyright notice for easier
187 | identification within third-party archives.
188 |
189 | Copyright [yyyy] [name of copyright owner]
190 |
191 | Licensed under the Apache License, Version 2.0 (the "License");
192 | you may not use this file except in compliance with the License.
193 | You may obtain a copy of the License at
194 |
195 | http://www.apache.org/licenses/LICENSE-2.0
196 |
197 | Unless required by applicable law or agreed to in writing, software
198 | distributed under the License is distributed on an "AS IS" BASIS,
199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
200 | See the License for the specific language governing permissions and
201 | limitations under the License.
202 |
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/images/tuw-geo-logo.svg:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
130 |
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/assets/images/tuw-geo-logo.svg:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
130 |
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/unit_02/06_in_class_exercise.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Dielectric Properties\n",
9 | "\n",
10 | "In this notebook, we will investigate the varying backscatter values associated with different land surfaces like water bodies, forests, grasslands and urban areas. We will use backscatter data from the Sentinel-1 satellite and the CORINE Land Cover dataset to classify these surfaces, enabling us to analyze how different land cover types influence backscatter responses."
11 | ]
12 | },
13 | {
14 | "cell_type": "code",
15 | "execution_count": null,
16 | "id": "1",
17 | "metadata": {},
18 | "outputs": [],
19 | "source": [
20 | "import json\n",
21 | "\n",
22 | "import holoviews as hv\n",
23 | "import intake\n",
24 | "import matplotlib.patches as mpatches\n",
25 | "import matplotlib.pyplot as plt\n",
26 | "import numpy as np\n",
27 | "import rioxarray # noqa: F401\n",
28 | "import xarray as xr\n",
29 | "from holoviews.streams import RangeXY\n",
30 | "from matplotlib.colors import BoundaryNorm, ListedColormap\n",
31 | "\n",
32 | "hv.extension(\"bokeh\")"
33 | ]
34 | },
35 | {
36 | "cell_type": "markdown",
37 | "id": "2",
38 | "metadata": {},
39 | "source": [
40 | "## Load Sentinel-1 Data\n",
41 | "\n",
42 | "For our analysis we are using sigma naught backscattering data from Sentinel-1. The images we are analyzing cover the region south of Vienna and west of Lake Neusiedl. We load the data and apply again a preprocessing function, in which we extract the scaling factor and the date the image was taken from the metadata. We will focus our attention on a smaller area containing a part of Lake Neusiedl and its surrounding land. The obtained `xarray` dataset is then converted to an array, because we only have one variable, the VV backscatter values.\n"
43 | ]
44 | },
45 | {
46 | "cell_type": "code",
47 | "execution_count": null,
48 | "id": "3",
49 | "metadata": {},
50 | "outputs": [],
51 | "source": [
52 | "uri = \"https://git.geo.tuwien.ac.at/public_projects/microwave-remote-sensing/-/raw/main/microwave-remote-sensing.yml\"\n",
53 | "cat = intake.open_catalog(uri)\n",
54 | "sig0_da = cat.neusiedler.read().sig0.compute()"
55 | ]
56 | },
57 | {
58 | "cell_type": "markdown",
59 | "id": "4",
60 | "metadata": {},
61 | "source": [
62 | "Let's have a look at the data by plotting the first timeslice.\n"
63 | ]
64 | },
65 | {
66 | "cell_type": "code",
67 | "execution_count": null,
68 | "id": "5",
69 | "metadata": {},
70 | "outputs": [],
71 | "source": [
72 | "sig0_da.isel(time=0).plot(robust=True, cmap=\"Greys_r\").axes.set_aspect(\"equal\")"
73 | ]
74 | },
75 | {
76 | "cell_type": "markdown",
77 | "id": "6",
78 | "metadata": {},
79 | "source": [
80 | "## Load CORINE Landcover Data\n",
81 | "\n",
82 | "We will load the CORINE Land Cover, which is a pan-European land cover and land use inventory with 44 thematic classes. The resolution of this classification is 100 m by 100 m, and the dataset was created in 2018\n",
83 | "([CORINE Land Cover](https://land.copernicus.eu/en/products/corine-land-cover)).\n"
84 | ]
85 | },
86 | {
87 | "cell_type": "code",
88 | "execution_count": null,
89 | "id": "7",
90 | "metadata": {},
91 | "outputs": [],
92 | "source": [
93 | "cor_da = cat.corine.read().land_cover.compute()"
94 | ]
95 | },
96 | {
97 | "cell_type": "markdown",
98 | "id": "8",
99 | "metadata": {},
100 | "source": [
101 | "### Colormapping and Encoding\n",
102 | "\n",
103 | "For the different land cover types we use the official color encoding which can be found in [CORINE Land Cover](https://collections.sentinel-hub.com/corine-land-cover/readme.html).\n"
104 | ]
105 | },
106 | {
107 | "cell_type": "code",
108 | "execution_count": null,
109 | "id": "9",
110 | "metadata": {},
111 | "outputs": [],
112 | "source": [
113 | "# Load encoding\n",
114 | "with cat.corine_cmap.read()[0] as f:\n",
115 | " color_mapping_data = json.load(f)\n",
116 | "\n",
117 | "# Get mapping\n",
118 | "color_mapping = {item[\"value\"]: item for item in color_mapping_data[\"land_cover\"]}\n",
119 | "\n",
120 | "# Create cmap and norm for plotting\n",
121 | "colors = [info[\"color\"] for info in color_mapping.values()]\n",
122 | "categories = [info[\"value\"] for info in color_mapping.values()]\n",
123 | "cmap = ListedColormap(colors)\n",
124 | "norm = BoundaryNorm(categories + [max(categories) + 1], len(categories))"
125 | ]
126 | },
127 | {
128 | "cell_type": "markdown",
129 | "id": "10",
130 | "metadata": {},
131 | "source": [
132 | "Now we can plot the CORINE Land Cover dataset.\n"
133 | ]
134 | },
135 | {
136 | "cell_type": "code",
137 | "execution_count": null,
138 | "id": "11",
139 | "metadata": {},
140 | "outputs": [],
141 | "source": [
142 | "# Get landcover codes present in the image\n",
143 | "present_landcover_codes = np.unique(cor_da.values[~np.isnan(cor_da.values)].astype(int))\n",
144 | "\n",
145 | "# Get colors + text for legend\n",
146 | "handles = [\n",
147 | " mpatches.Patch(color=info[\"color\"], label=(f\"{info['value']} - \" + (info[\"label\"])))\n",
148 | " for info in color_mapping.values()\n",
149 | " if info[\"value\"] in present_landcover_codes\n",
150 | "]\n",
151 | "\n",
152 | "# Create the plot\n",
153 | "cor_da.plot(figsize=(10, 10), cmap=cmap, norm=norm, add_colorbar=False).axes.set_aspect(\n",
154 | " \"equal\"\n",
155 | ")\n",
156 | "\n",
157 | "plt.legend(\n",
158 | " handles=handles,\n",
159 | " bbox_to_anchor=(1.01, 1),\n",
160 | " loc=\"upper left\",\n",
161 | " borderaxespad=0,\n",
162 | " fontsize=7,\n",
163 | ")\n",
164 | "plt.title(\"CORINE Land Cover (EPSG:27704)\")"
165 | ]
166 | },
167 | {
168 | "cell_type": "markdown",
169 | "id": "12",
170 | "metadata": {},
171 | "source": [
172 | "Now we are ready to merge the backscatter data (`sig0_da`) with the land cover dataset (`cor_da`) to have one dataset combining all data.\n"
173 | ]
174 | },
175 | {
176 | "cell_type": "code",
177 | "execution_count": null,
178 | "id": "13",
179 | "metadata": {},
180 | "outputs": [],
181 | "source": [
182 | "var_ds = xr.merge([sig0_da, cor_da]).drop_vars(\"band\")\n",
183 | "var_ds"
184 | ]
185 | },
186 | {
187 | "cell_type": "markdown",
188 | "id": "14",
189 | "metadata": {},
190 | "source": [
191 | "## Backscatter Variability\n",
192 | "\n",
193 | "With this combined dataset we can study backscatter variability in relation to natural media. For example we can look at the backscatter variability for water by clipping the dataset to only contain the land cover class water, like so:\n"
194 | ]
195 | },
196 | {
197 | "cell_type": "code",
198 | "execution_count": null,
199 | "id": "15",
200 | "metadata": {},
201 | "outputs": [],
202 | "source": [
203 | "# 41 = encoded value for water bodies\n",
204 | "waterbodies_mask = var_ds.land_cover == 41\n",
205 | "waterbodies_mask.plot().axes.set_aspect(\"equal\")"
206 | ]
207 | },
208 | {
209 | "cell_type": "markdown",
210 | "id": "16",
211 | "metadata": {},
212 | "source": [
213 | "This gives us backscatter values over water only.\n"
214 | ]
215 | },
216 | {
217 | "cell_type": "code",
218 | "execution_count": null,
219 | "id": "17",
220 | "metadata": {},
221 | "outputs": [],
222 | "source": [
223 | "waterbodies_sig0 = var_ds.sig0.isel(time=0).where(waterbodies_mask)\n",
224 | "waterbodies_sig0.plot(robust=True, cmap=\"Greys_r\").axes.set_aspect(\"equal\")"
225 | ]
226 | },
227 | {
228 | "cell_type": "markdown",
229 | "id": "18",
230 | "metadata": {},
231 | "source": [
232 | "To get an idea of the variability we can create a histogram. Radar backscatter from water bodies fluctuates with surface roughness, which changes with wind conditions, creating spatial and temporal variations in signal intensity.\n"
233 | ]
234 | },
235 | {
236 | "cell_type": "code",
237 | "execution_count": null,
238 | "id": "19",
239 | "metadata": {},
240 | "outputs": [],
241 | "source": [
242 | "waterbodies_sig0.plot.hist(bins=50, edgecolor=\"black\")"
243 | ]
244 | },
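{
 "cell_type": "markdown",
 "id": "19a",
 "metadata": {},
 "source": [
  "To put numbers on this spread, we can summarize the masked backscatter values (a small addition to the exercise, reusing the `waterbodies_sig0` slice created above):\n"
 ]
},
{
 "cell_type": "code",
 "execution_count": null,
 "id": "19b",
 "metadata": {},
 "outputs": [],
 "source": [
  "# mean and standard deviation of water-body backscatter in dB (NaNs are skipped)\n",
  "print(f\"mean: {waterbodies_sig0.mean().item():.2f} dB\")\n",
  "print(f\"std: {waterbodies_sig0.std().item():.2f} dB\")"
 ]
},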
245 | {
246 | "cell_type": "markdown",
247 | "id": "20",
248 | "metadata": {},
249 | "source": [
250 | "## Variability over Time\n",
251 | "\n",
252 | "Next we will look at how the variability of backscatter values changes over time for each of the CORINE Land Cover types, using the following interactive plot. We can spot that backscatter in agricultural fields varies with seasonal cycles such as planting, growing, and harvesting, each of which changes the vegetation structure. Changes in backscatter are also strongly related to soil moisture content from irrigation or rainfall. Ultimately, the phenological stages of crops and canopy moisture dynamics affect the backscatter signal.\n"
253 | ]
254 | },
255 | {
256 | "cell_type": "code",
257 | "execution_count": null,
258 | "id": "21",
259 | "metadata": {},
260 | "outputs": [],
261 | "source": [
262 | "robust_min = var_ds.sig0.quantile(0.02).item()\n",
263 | "robust_max = var_ds.sig0.quantile(0.98).item()\n",
264 | "\n",
265 | "bin_edges = [\n",
266 | " i + j * 0.5\n",
267 | " for i in range(int(robust_min) - 2, int(robust_max) + 2)\n",
268 | " for j in range(2)\n",
269 | "]\n",
270 | "\n",
271 | "land_cover = {\"\\xa0\\xa0\\xa0 Complete Land Cover\": 1}\n",
272 | "land_cover.update(\n",
273 | " {\n",
274 | " f\"{int(value): 02} {color_mapping[value]['label']}\": int(value)\n",
275 | " for value in present_landcover_codes\n",
276 | " }\n",
277 | ")\n",
278 | "time = var_ds.sig0[\"time\"].values\n",
279 | "\n",
280 | "rangexy = RangeXY()\n",
281 | "\n",
282 | "\n",
283 | "def load_image(time, land_cover, x_range, y_range):\n",
284 | " \"\"\"Callback Function Landcover.\n",
285 | "\n",
286 | " Parameters\n",
287 | " ----------\n",
288 | " time : pandas.Timestamp\n",
289 | " time slice\n",
290 | " land_cover : str\n",
291 | " land cover label, e.g. '41 Water bodies'\n",
292 | " x_range: array_like\n",
293 | " longitude range\n",
294 | " y_range: array_like\n",
295 | " latitude range\n",
296 | "\n",
297 | " Returns\n",
298 | " -------\n",
299 | " holoviews.Image\n",
300 | "\n",
301 | " \"\"\"\n",
302 | " if land_cover == \"\\xa0\\xa0\\xa0 Complete Land Cover\":\n",
303 | " sig0_selected_ds = var_ds.sig0.sel(time=time)\n",
304 | "\n",
305 | " else:\n",
306 | " land_cover_value = int(land_cover.split()[0])\n",
307 | " mask_ds = var_ds.land_cover == land_cover_value\n",
308 | " sig0_selected_ds = var_ds.sig0.sel(time=time).where(mask_ds)\n",
309 | "\n",
310 | " hv_ds = hv.Dataset(sig0_selected_ds)\n",
311 | " img = hv_ds.to(hv.Image, [\"x\", \"y\"])\n",
312 | "\n",
313 | " if x_range and y_range:\n",
314 | " img = img.select(x=x_range, y=y_range)\n",
315 | "\n",
316 | " return hv.Image(img)\n",
317 | "\n",
318 | "\n",
319 | "dmap = (\n",
320 | " hv.DynamicMap(load_image, kdims=[\"Time\", \"Landcover\"], streams=[rangexy])\n",
321 | " .redim.values(Time=time, Landcover=land_cover)\n",
322 | " .hist(normed=True, bins=bin_edges)\n",
323 | ")\n",
324 | "\n",
325 | "image_opts = hv.opts.Image(\n",
326 | " cmap=\"Greys_r\",\n",
327 | " colorbar=True,\n",
328 | " tools=[\"hover\"],\n",
329 | " clim=(robust_min, robust_max),\n",
330 | " aspect=\"equal\",\n",
331 | " framewise=False,\n",
332 | " frame_height=500,\n",
333 | " frame_width=500,\n",
334 | ")\n",
335 | "\n",
336 | "hist_opts = hv.opts.Histogram(width=350, height=555)\n",
337 | "\n",
338 | "dmap.opts(image_opts, hist_opts)"
339 | ]
340 | }
341 | ],
342 | "metadata": {
343 | "kernelspec": {
344 | "display_name": "Python 3 (ipykernel)",
345 | "language": "python",
346 | "name": "python3"
347 | },
348 | "language_info": {
349 | "codemirror_mode": {
350 | "name": "ipython",
351 | "version": 3
352 | },
353 | "file_extension": ".py",
354 | "mimetype": "text/x-python",
355 | "name": "python",
356 | "nbconvert_exporter": "python",
357 | "pygments_lexer": "ipython3",
358 | "version": "3.11.13"
359 | }
360 | },
361 | "nbformat": 4,
362 | "nbformat_minor": 5
363 | }
364 |
--------------------------------------------------------------------------------
/notebooks/courses/microwave-remote-sensing/unit_01/03_in_class_exercise.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Backscattering Coefficients\n",
9 | "\n",
10 | "In this notebook, we will introduce some of the steps involved in processing Sentinel-1 Level-1 Ground Range Detected (`GRD`) data to $\\sigma^0$ (`sig0`) and $\\gamma^0$ (`gmr`). Moreover, the notebook illustrates the importance and impact of geometric and radiometric terrain correction. As the processing of SAR data is a very time- and hardware-intensive task, we won't perform the actual processing in this notebook. Instead, data at different processing steps is illustrated to highlight the impact of each step.\n"
11 | ]
12 | },
13 | {
14 | "cell_type": "code",
15 | "execution_count": null,
16 | "id": "1",
17 | "metadata": {},
18 | "outputs": [],
19 | "source": [
20 | "import hvplot.xarray # noqa: F401\n",
21 | "import intake\n",
22 | "import matplotlib.pyplot as plt\n",
23 | "import numpy as np\n",
24 | "import rioxarray # noqa: F401\n",
25 | "import xarray as xr"
26 | ]
27 | },
28 | {
29 | "cell_type": "markdown",
30 | "id": "2",
31 | "metadata": {},
32 | "source": [
33 | "## Loading Backscatter Data\n",
34 | "\n",
35 | "We first load our data from the following [intake](https://intake.readthedocs.io/en/latest/) catalog. Intake is somewhat similar to STAC in that it makes it easy to discover and load data. More importantly, this package allows us to hide some of the complexities involved with getting the data in the right format, which are not of concern in this notebook.\n"
36 | ]
37 | },
38 | {
39 | "cell_type": "code",
40 | "execution_count": null,
41 | "id": "3",
42 | "metadata": {},
43 | "outputs": [],
44 | "source": [
45 | "uri = \"https://git.geo.tuwien.ac.at/public_projects/microwave-remote-sensing/-/raw/main/microwave-remote-sensing.yml\"\n",
46 | "cat = intake.open_catalog(uri)\n",
47 | "gtc_dc = cat[\"gtc\"].read().compute()\n",
48 | "gtc_dc"
49 | ]
50 | },
51 | {
52 | "cell_type": "code",
53 | "execution_count": null,
54 | "id": "4",
55 | "metadata": {},
56 | "outputs": [],
57 | "source": [
58 | "gtc_dc.hvplot.image(\n",
59 | " x=\"x\",\n",
60 | " y=\"y\",\n",
61 | " robust=True,\n",
62 | " data_aspect=1,\n",
63 | " cmap=\"Greys_r\",\n",
64 | " groupby=\"band\",\n",
65 | " rasterize=True,\n",
66 | ").opts(frame_height=600, framewise=False, aspect=\"equal\")"
67 | ]
68 | },
69 | {
70 | "cell_type": "markdown",
71 | "id": "5",
72 | "metadata": {},
73 | "source": [
74 | "*Figure 2: The ground range detected values and geometrically terrain corrected values can be selected on the right-hand side of the graphic.*\n",
75 | "\n",
76 | "The geometrically terrain corrected values in the `gtc_dc` object (Figure 2) can be computed to a good approximation, as we have sufficiently detailed information on the topography in this area. This corrects at least one distortion that typically occurs in mountainous regions: \"foreshortening\".\n",
77 | "\n",
78 | "\n",
79 | "\n",
80 | "*Figure 3: Side Looking radar distortions (script Chapter 4).*\n",
81 | "\n",
82 | "Foreshortening can be spotted by eye, as it often has a radiometric consequence: unusually bright areas fringing mountain ridges, a phenomenon called \"highlighting\". This geometric artifact occurs because slopes facing the radar system are compressed in the image, which results in a higher density of scatterers per unit length. Now let's zoom in on an example from the same datacube and display the original and corrected values side-by-side.\n"
83 | ]
84 | },
85 | {
86 | "cell_type": "code",
87 | "execution_count": null,
88 | "id": "6",
89 | "metadata": {},
90 | "outputs": [],
91 | "source": [
92 | "for_dc = gtc_dc.sel(x=slice(9.651, 9.706), y=slice(47.134, 47.079)).band_data\n",
93 | "\n",
94 | "fig, ax = plt.subplots(1, 2, figsize=(20, 8))\n",
95 | "\n",
96 | "bbox = dict(boxstyle=\"round\", fc=\"0.8\")\n",
97 | "\n",
98 | "\n",
99 | "ax[1].annotate(\n",
100 | " \"foreshortening/layover\",\n",
101 | " xy=(9.674, 47.092),\n",
102 | " xytext=(0.574, 0.192),\n",
103 | " textcoords=\"subfigure fraction\",\n",
104 | " bbox=bbox,\n",
105 | " arrowprops=dict(facecolor=\"red\", shrink=0.05),\n",
106 | ")\n",
107 | "ax[1].annotate(\n",
108 | " \"radar shadows\",\n",
109 | " xy=(9.68, 47.119),\n",
110 | " xytext=(0.6, 0.625),\n",
111 | " textcoords=\"subfigure fraction\",\n",
112 | " bbox=bbox,\n",
113 | " arrowprops=dict(facecolor=\"red\", shrink=0.05),\n",
114 | ")\n",
115 | "\n",
116 | "ax[0].axes.set_aspect(\"equal\")\n",
117 | "ax[1].axes.set_aspect(\"equal\")\n",
118 | "\n",
119 | "for_dc.sel(band=\"grd\").plot(ax=ax[0], robust=True, cmap=\"Greys_r\")\n",
120 | "for_dc.sel(band=\"sig0_gtc\").plot(ax=ax[1], robust=True, cmap=\"Greys_r\")"
121 | ]
122 | },
123 | {
124 | "cell_type": "markdown",
125 | "id": "7",
126 | "metadata": {},
127 | "source": [
128 | "*Figure 4: Close-up inspection of geometric distortions in side-looking radar*\n",
129 | "\n",
130 | "As we can see, not all geometric distortions can be corrected by the algorithm. Some pixels at the mountain ridges appear stretched, as not enough valid measurements are available in these areas. Moreover, we can see dark areas indicating radar shadows. These are image areas that could not be captured by the radar sensor; their values lie close to the sensor's noise floor (the minimum detectable signal strength, roughly -28 dB). It is important to note that radar shadows are not the same for every image, as they depend on the acquisition geometry, in particular the incidence angle and the flight direction of the satellite.\n",
131 | "\n",
132 | "## Backscattering Coefficients\n",
133 | "\n",
134 | "In this section, we will look at two backscattering coefficients in more detail, $\\sigma^0_E$ and $\\gamma^0_E$, both of which are geometrically terrain corrected. The difference lies in the plane of the reference area: the ground area tangent to an ellipsoidal Earth model for $\\sigma^0_E$, and the plane perpendicular to the line of sight for $\\gamma^0_E$ (Figure 5). For this, we load a new datacube which includes $\\sigma^0_E$ and the incidence angle for each pixel. We visualize the cube with the same method as before.\n"
135 | ]
136 | },
137 | {
138 | "cell_type": "code",
139 | "execution_count": null,
140 | "id": "8",
141 | "metadata": {},
142 | "outputs": [],
143 | "source": [
144 | "coef_dc = cat.coef.read().compute()\n",
145 | "coef_dc.hvplot.image(\n",
146 | " x=\"x\",\n",
147 | " y=\"y\",\n",
148 | " robust=True,\n",
149 | " data_aspect=1,\n",
150 | " cmap=\"Greys_r\",\n",
151 | " groupby=\"band\",\n",
152 | " rasterize=True,\n",
153 | ").opts(frame_height=600, framewise=False, aspect=\"equal\")"
154 | ]
155 | },
156 | {
157 | "cell_type": "markdown",
158 | "id": "9",
159 | "metadata": {},
160 | "source": [
161 | "*Figure 5: The $\\sigma^0_E$ and the incidence angle can be selected on the right-hand side of the graphic.*\n",
162 | "\n",
163 | "In Figure 5 we can see the incidence angle image of our scene. It depicts the variation from near to far range, but not the actual terrain, as it refers to the ellipsoid. The faint terrain patterns that are visible originate from the geometric terrain correction. We will now use this information to convert $\\sigma^0_E$ to $\\gamma^0_E$ with the following equation (equation 6.20 in the script):\n",
164 | "\n",
165 | "$$ \\gamma^0_E = \\sigma^0_E / \\cos(\\theta_i) $$\n",
166 | "\n",
167 | "We can perform this transformation with basic `numpy` operations on the `xarray` datacube.\n"
168 | ]
169 | },
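{
 "cell_type": "markdown",
 "id": "9a",
 "metadata": {},
 "source": [
  "As a scalar illustration (with made-up example values, not taken from the data): a $\\sigma^0_E$ of $-10$ dB observed at an incidence angle of 40° converts as follows.\n"
 ]
},
{
 "cell_type": "code",
 "execution_count": null,
 "id": "9b",
 "metadata": {},
 "outputs": [],
 "source": [
  "# dB -> linear -> gamma naught -> dB, for a single example value\n",
  "sig0_db_value = -10.0  # example sigma naught in dB\n",
  "sig0_lin_value = 10 ** (sig0_db_value / 10)  # 0.1 in linear units\n",
  "gam0_lin_value = sig0_lin_value / np.cos(np.radians(40))\n",
  "gam0_db_value = 10 * np.log10(gam0_lin_value)\n",
  "print(f\"{gam0_db_value:.2f} dB\")  # about -8.84 dB"
 ]
},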
170 | {
171 | "cell_type": "code",
172 | "execution_count": null,
173 | "id": "10",
174 | "metadata": {},
175 | "outputs": [],
176 | "source": [
177 | "# convert sigma naught from dB to linear scale\n",
178 | "sig0_lin = 10 ** (coef_dc.sel(band=\"sig0_gtc\") / 10)\n",
179 | "# conversion to gamma naught (equation 6.20)\n",
180 | "gam0_lin = sig0_lin / np.cos(np.radians(coef_dc.sel(band=\"incidence_angle\")))\n",
181 | "# convert back to dB scale (base-10 logarithm, not natural log)\n",
182 | "gam0_db = 10 * np.log10(gam0_lin)\n",
184 | "# add to existing cube\n",
185 | "coef_dc = xr.concat(\n",
186 | " [coef_dc.sel(band=\"sig0_gtc\"), gam0_db.expand_dims(band=[\"gam0_gtc\"])], dim=\"band\"\n",
187 | ")\n",
188 | "\n",
189 | "coef_dc.hvplot.image(\n",
190 | " x=\"x\",\n",
191 | " y=\"y\",\n",
192 | " robust=False,\n",
193 | " data_aspect=1,\n",
194 | " cmap=\"Greys_r\",\n",
195 | " groupby=\"band\",\n",
196 | " rasterize=True,\n",
197 | ").opts(frame_height=600, framewise=False, aspect=\"equal\")"
198 | ]
199 | },
200 | {
201 | "cell_type": "markdown",
202 | "id": "11",
203 | "metadata": {},
204 | "source": [
205 | "*Figure 6: $\\sigma^0_E$, and $\\gamma^0_E$ can be selected on the right-hand side of the graphic.*\n",
206 | "\n",
207 | "Comparing $\\sigma^0_E$ and $\\gamma^0_E$ in the figure, we can see that both look identical except for the range. This is because the only difference between $\\sigma^0_E$ and $\\gamma^0_E$ is the change of the reference area. While $\\sigma^0_E$ is defined to be ground range, $\\gamma^0_E$ is defined to be in the plane perpendicular to the line of sight from the sensor. This way, $\\gamma^0_E$ mitigates the impact of the incidence angle. However, $\\gamma^0_E$ is still based on the ellipsoid and does not account for the impact of the terrain on the radiometry.\n",
208 | "\n",
209 | "# Radiometric Terrain Correction\n",
210 | "\n",
211 | "So far, we corrected geometric distortions and compared the impact of the choice of the reference area. However, we still haven't corrected the backscatter intensity of pixels which are distorted by the terrain. In this last step, we will show that we can also correct radiometric artifacts to a certain degree. For this, we will load radiometrically terrain corrected (`rtc`) $\\gamma^0_T$ and plot it along the other coefficients.\n"
212 | ]
213 | },
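{
 "cell_type": "markdown",
 "id": "11a",
 "metadata": {},
 "source": [
  "Before loading the radiometrically terrain corrected data, we can numerically check the claim that the two coefficients differ only by the incidence-angle term (a quick sanity check added to the exercise): in dB, $\\gamma^0_E - \\sigma^0_E$ should equal $-10\\log_{10}(\\cos\\theta_i)$, which is positive everywhere.\n"
 ]
},
{
 "cell_type": "code",
 "execution_count": null,
 "id": "11b",
 "metadata": {},
 "outputs": [],
 "source": [
  "# per-pixel difference between gamma naught and sigma naught in dB\n",
  "diff_db = coef_dc.sel(band=\"gam0_gtc\") - coef_dc.sel(band=\"sig0_gtc\")\n",
  "diff_db.plot.hist(bins=50, edgecolor=\"black\")"
 ]
},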
214 | {
215 | "cell_type": "code",
216 | "execution_count": null,
217 | "id": "12",
218 | "metadata": {},
219 | "outputs": [],
220 | "source": [
221 | "rtc_dc = cat.rtc.read().compute()\n",
222 | "\n",
223 | "# add to existing cube\n",
224 | "rtc_dc = xr.concat([coef_dc, rtc_dc], dim=\"band\")\n",
225 | "\n",
226 | "rtc_dc.hvplot.image(\n",
227 | " x=\"x\",\n",
228 | " y=\"y\",\n",
229 | " robust=True,\n",
230 | " data_aspect=1,\n",
231 | " cmap=\"Greys_r\",\n",
232 | " groupby=\"band\",\n",
233 | " rasterize=True,\n",
234 | ").opts(frame_height=600, framewise=False, aspect=\"equal\")"
235 | ]
236 | },
237 | {
238 | "cell_type": "markdown",
239 | "id": "13",
240 | "metadata": {},
241 | "source": [
242 | "*Figure 7: $\\sigma^0_E$, $\\gamma^0_E$, and $\\gamma^0_T$ can be selected on the right-hand side of the graphic.*\n",
243 | "\n",
244 | "When comparing $\\gamma^0_E$ and $\\gamma^0_T$ in the plot, we can clearly see the impact of the radiometric correction in the mountainous areas. This correction is necessary because, for slopes facing towards the sensor, a larger ground area contributes to the backscatter value of a slant range resolution cell than for slopes facing away. This results in significant brightness changes, where foreshortened areas appear brighter and lengthened areas darker. $\\gamma^0_T$ adjusts the backscatter to represent what it would be if the terrain were flat, thus reducing these effects. This significantly reduces the impact of the terrain on the backscatter values, allowing for more accurate comparisons across different terrain types and locations. The correction is done by using a DEM to determine the locally illuminated area at each radar position. The approach illustrated above is also referred to as terrain flattening, because in the resulting image mountains appear flat. As $\\gamma^0_T$ is corrected for both geometric and radiometric distortions, it is also referred to as Normalized Radar Backscatter (NRB) and is the current standard for Analysis Ready Data (ARD) backscatter."
245 | ]
246 | }
247 | ],
248 | "metadata": {
249 | "kernelspec": {
250 | "display_name": "Python 3 (ipykernel)",
251 | "language": "python",
252 | "name": "python3"
253 | },
254 | "language_info": {
255 | "codemirror_mode": {
256 | "name": "ipython",
257 | "version": 3
258 | },
259 | "file_extension": ".py",
260 | "mimetype": "text/x-python",
261 | "name": "python",
262 | "nbconvert_exporter": "python",
263 | "pygments_lexer": "ipython3",
264 | "version": "3.11.13"
265 | }
266 | },
267 | "nbformat": 4,
268 | "nbformat_minor": 5
269 | }
270 |
--------------------------------------------------------------------------------
/notebooks/images/logos/pythia_logo-white-rtext.svg:
--------------------------------------------------------------------------------
1 |
2 |
226 |
--------------------------------------------------------------------------------
/notebooks/courses/environmental-remote-sensing/unit_01/05_supplement_drought.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0",
6 | "metadata": {},
7 | "source": [
8 | "# Access to Near Real Time Soil Moisture Data\n",
9 | "**Downloading, Reading and Working with H SAF Surface Soil Moisture 6.25 km** \n",
10 | "\n",
11 | "![](https://www-cdn.eumetsat.int/files/2024-03/Filipo.png)\n",
12 | "\n",
13 | "## Overview\n",
14 | "\n",
15 | "This notebook demonstrates how to access the H SAF soil moisture products in near real-time, as utilized in previous notebooks.\n",
16 | "\n",
17 | "## Imports"
18 | ]
19 | },
20 | {
21 | "cell_type": "code",
22 | "execution_count": null,
23 | "id": "1",
24 | "metadata": {},
25 | "outputs": [],
26 | "source": [
27 | "from datetime import datetime, timedelta\n",
28 | "from pathlib import Path\n",
29 | "\n",
30 | "import cartopy.crs as ccrs\n",
31 | "import hvplot.pandas # noqa\n",
32 | "from ascat.download.interface import hsaf_download\n",
33 | "from ascat.swath import SwathGridFiles\n",
34 | "from dotenv import dotenv_values\n",
35 | "from envrs.ssm_cmap import SSM_CMAP"
36 | ]
37 | },
38 | {
39 | "cell_type": "markdown",
40 | "id": "2",
41 | "metadata": {},
42 | "source": [
43 | "## H SAF\n",
44 | "\n",
45 | "H SAF, the Satellite Application Facility on Support to Operational Hydrology and Water Management, is part of EUMETSAT, the 'European Organisation for the Exploitation of Meteorological Satellites' headquartered in Darmstadt, Germany. EUMETSAT consists of 30 European Member states and operates both geostationary satellites (Meteosat) and polar-orbiting satellites (Metop), the latter of which carry the ASCAT sensors used for soil moisture retrieval. These missions are crucial for weather forecasting and significantly contribute to environmental and climate change monitoring. H SAF aims to disseminate satellite-derived products useful for operational hydrology, including precipitation, snow cover, and soil moisture. They ensure that these products are accurate through rigorous validation and are delivered to users in a timely manner. All these products are freely available.\n",
46 | "\n",
47 | "## Registration\n",
48 | "\n",
49 | "H SAF has an [FTP server](https://en.wikipedia.org/wiki/FTP_server) from which users can download near real-time data products. Before you can start downloading, you need to register as an H SAF user. This is straightforward if you follow these steps:\n",
50 | "\n",
51 | "1. Go to [H SAF homepage](https://hsaf.meteoam.it/).\n",
52 | "\n",
53 | "