├── MANIFEST.in
├── PhoREAL_User_Manual_v3.30.pdf
├── README.md
├── ResearchLicense_SourceCode_PhoREALv1.0.doc
├── images
│   ├── PhoREAL_screenshot_v3.28.png
│   ├── PhoREAL_screenshot_v3.30.png
│   └── readme.txt
├── notebooks
│   ├── .ipynb_checkpoints
│   │   ├── Quick Start Demo-checkpoint.ipynb
│   │   └── Quick_Start_Demo-checkpoint.ipynb
│   ├── Quick_Start_Demo.ipynb
│   └── processing_example.ipynb
├── phoreal
│   ├── CalVal.py
│   ├── Viewer_Online_blank.html
│   ├── Viewer_blank.html
│   ├── __init__.py
│   ├── __pycache__
│   │   ├── CalVal.cpython-311.pyc
│   │   ├── __init__.cpython-311.pyc
│   │   ├── __init__.cpython-38.pyc
│   │   ├── __init__.cpython-39.pyc
│   │   ├── ace.cpython-311.pyc
│   │   ├── ace.cpython-36.pyc
│   │   ├── binner.cpython-311.pyc
│   │   ├── getAtlMeasuredSwath_auto.cpython-36.pyc
│   │   ├── getAtlMeasuredSwath_auto.cpython-38.pyc
│   │   ├── getAtlTruthSwath_auto.cpython-36.pyc
│   │   ├── getMeasurementError.cpython-311.pyc
│   │   ├── getMeasurementError.cpython-36.pyc
│   │   ├── getMeasurementError_auto.cpython-36.pyc
│   │   ├── gui_addins.cpython-311.pyc
│   │   ├── gui_addins.cpython-36.pyc
│   │   ├── gui_addins.cpython-38.pyc
│   │   ├── icesatBin.cpython-38.pyc
│   │   ├── icesatBinner.cpython-36.pyc
│   │   ├── icesatBinner.cpython-38.pyc
│   │   ├── icesatCalVal.cpython-36.pyc
│   │   ├── icesatIO.cpython-36.pyc
│   │   ├── icesatIO.cpython-38.pyc
│   │   ├── icesatPlot.cpython-36.pyc
│   │   ├── icesatReader.cpython-36.pyc
│   │   ├── icesatReader.cpython-38.pyc
│   │   ├── icesatReader.cpython-39.pyc
│   │   ├── icesatReference.cpython-36.pyc
│   │   ├── icesatUtils.cpython-36.pyc
│   │   ├── icesatUtils.cpython-38.pyc
│   │   ├── icesatUtils.cpython-39.pyc
│   │   ├── io.cpython-311.pyc
│   │   ├── plot.cpython-311.pyc
│   │   ├── reader.cpython-311.pyc
│   │   ├── reference.cpython-311.pyc
│   │   └── utils.cpython-311.pyc
│   ├── ace.py
│   ├── acer.c
│   ├── acer.cpython-36m-x86_64-linux-gnu.so
│   ├── atl_processor.py
│   ├── atl_to_las.py
│   ├── binner.py
│   ├── closest.c
│   ├── closest.exe
│   ├── closest_x64.dll
│   ├── getMeasurementError.py
│   ├── gui_addins.py
│   ├── io.py
│   ├── laszip.exe
│   ├── pho_image.ico
│   ├── phorealc.so
│   ├── plot.py
│   ├── reader.py
│   ├── reference.py
│   └── utils.py
├── requirements.txt
└── setup.py
/MANIFEST.in:
--------------------------------------------------------------------------------
1 | include phoreal/phorealc.so
2 | include phoreal/closest_x64.dll
3 |
--------------------------------------------------------------------------------
/PhoREAL_User_Manual_v3.30.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/PhoREAL_User_Manual_v3.30.pdf
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # PhoREAL
2 | Tools to make ICESat-2 ATL03 and ATL08 analysis easier
3 |
4 | PhoREAL (Photon Research and Engineering Analysis Library) is a geospatial analysis toolbox that allows users to read, process, analyze, and output ICESat-2 ATL03 and ATL08 data in the form of figures and .las, .csv, and .kml files.
5 |
6 | ## PhoREAL v3.30
7 |
8 | The new features available in PhoREAL v3.30 include:
9 | * Reduced time taken to export an ATL03 CSV file
10 | * Replaced Pyproj library with Fiona to perform SRS transformations and improve compilation
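The SRS (spatial reference system) transformation mentioned in the release note above can be sketched with Fiona's `fiona.transform.transform` helper. This is an illustration of the technique, not PhoREAL's actual code; the point and the EPSG codes (WGS84 geographic in, UTM zone 14N out) are arbitrary choices for the example:

```python
from fiona.transform import transform

# Reproject a single WGS84 lon/lat point (roughly Austin, TX)
# to UTM zone 14N. transform() takes source CRS, destination CRS,
# then parallel lists of x and y coordinates.
xs, ys = transform('EPSG:4326', 'EPSG:32614', [-97.74], [30.27])
print(xs[0], ys[0])  # easting/northing in metres
```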
11 |
12 |
13 | Note: On Windows systems, the PhoREAL toolbox can be run as a Graphical User Interface (GUI) executable (.exe). On Linux systems, it can be run as a Python GUI or as a series of Python function commands via the Linux command line.
15 |
16 |
17 | Download PhoREAL_v3.30_init.exe (for Windows)
18 | Update (Feb 02, 2022): PhoREAL v3.30 released
19 | https://utexas.box.com/v/DownloadPhoREALGUI
20 |
21 | ![PhoREAL v3.30 GUI screenshot](images/PhoREAL_screenshot_v3.30.png)
22 |
23 | ## How to use the Python library
24 | Build and install the PhoREAL Python library:
25 |
26 | ```
27 | $ python setup.py build
28 | $ python setup.py install
29 | ```
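Once installed, data read by PhoREAL (e.g. via `phoreal.reader.get_atl03_struct`) is exposed as a Pandas DataFrame on the returned object's `df` attribute, so standard Pandas operations apply. A minimal sketch of that style of workflow, using a small synthetic DataFrame in place of a real ATL03 granule (the values are made up; the column names mirror fields in the ATL03 `heights` group):

```python
import pandas as pd

# Synthetic stand-in for atl03.df; real columns are read from the
# ATL03 'heights' group (e.g. photon height h_ph).
df = pd.DataFrame({
    'h_ph': [120.5, 121.1, 119.8, 122.4],    # photon heights (m)
    'lat_ph': [30.01, 30.02, 30.03, 30.04],  # photon latitudes (deg)
})

# Standard Pandas filtering, exactly as one would apply to atl03.df
high_photons = df[df['h_ph'] > 120.0]
print(len(high_photons))  # 3 photons above 120 m
```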
30 |
31 |
32 | Copyright 2020 University of Texas at Austin
33 |
34 | This package is free software; the copyright holder gives unlimited
35 | permission to copy and/or distribute, with or without modification, as
36 | long as this notice is preserved.
37 |
--------------------------------------------------------------------------------
/ResearchLicense_SourceCode_PhoREALv1.0.doc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/ResearchLicense_SourceCode_PhoREALv1.0.doc
--------------------------------------------------------------------------------
/images/PhoREAL_screenshot_v3.28.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/images/PhoREAL_screenshot_v3.28.png
--------------------------------------------------------------------------------
/images/PhoREAL_screenshot_v3.30.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/images/PhoREAL_screenshot_v3.30.png
--------------------------------------------------------------------------------
/images/readme.txt:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/notebooks/.ipynb_checkpoints/Quick Start Demo-checkpoint.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "0fd6567c",
6 | "metadata": {},
7 | "source": [
8 | "## Quick Start PhoREAL ##\n",
9 | "Small snippet on how to get started with PhoREAL. PhoREAL will read a specific ground track and pull out all fields within the 'heights' subgroup, the subgroup that holds the photon-rate data in the ATL03 h5 file. Data within this subgroup will be loaded into a Pandas DataFrame that can easily be operated on with standard Pandas operations.\n",
10 | "\n",
11 | "Start by importing `phoreal`"
12 | ]
13 | },
14 | {
15 | "cell_type": "code",
16 | "execution_count": 1,
17 | "id": "a5affcea",
18 | "metadata": {},
19 | "outputs": [],
20 | "source": [
21 | "import phoreal as pr"
22 | ]
23 | },
24 | {
25 | "cell_type": "markdown",
26 | "id": "63157591",
27 | "metadata": {},
28 | "source": [
29 | "Within `phoreal` use the submodule `reader` to read the ATL03 file as a PhoREAL object."
30 | ]
31 | },
32 | {
33 | "cell_type": "code",
34 | "execution_count": 2,
35 | "id": "93ec2972",
36 | "metadata": {},
37 | "outputs": [],
38 | "source": [
39 | "atl03_file = '/mnt/bigtex/vol1/Data/ICESat-2/REL005/ATL03/PRF/ATL03_20200208094726_06670606_005_01.h5'\n",
40 | "gt = 'gt1r'\n",
41 | "atl08_file = '/mnt/bigtex/vol1/Data/ICESat-2/REL005/ATL08/PRF/ATL08/ATL08_20200208094726_06670606_005_01.h5'\n",
42 | "\n",
43 | "atl03 = pr.icesatReader.get_atl03_struct(atl03_file, gt, atl08_file)"
44 | ]
45 | },
46 | {
47 | "cell_type": "markdown",
48 | "id": "e9259115",
49 | "metadata": {},
50 | "source": [
51 | "Here we can verify the Pandas DataFrame that represents the `heights` subgroup. We can examine the columns using a standard Pandas call."
52 | ]
53 | },
54 | {
55 | "cell_type": "code",
56 | "execution_count": 3,
57 | "id": "941d82c5",
58 | "metadata": {},
59 | "outputs": [
60 | {
61 | "ename": "AttributeError",
62 | "evalue": "'DataFrame' object has no attribute 'cols'",
63 | "output_type": "error",
64 | "traceback": [
65 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
66 | "\u001b[0;31mAttributeError\u001b[0m Traceback (most recent call last)",
67 | "Input \u001b[0;32mIn [3]\u001b[0m, in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[0;32m----> 1\u001b[0m \u001b[38;5;28mprint\u001b[39m(\u001b[43matl03\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mdf\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mcols\u001b[49m)\n",
68 | "File \u001b[0;32m~/miniconda3/envs/test38_4/lib/python3.8/site-packages/pandas-1.5.1-py3.8-linux-x86_64.egg/pandas/core/generic.py:5902\u001b[0m, in \u001b[0;36mNDFrame.__getattr__\u001b[0;34m(self, name)\u001b[0m\n\u001b[1;32m 5895\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m (\n\u001b[1;32m 5896\u001b[0m name \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_internal_names_set\n\u001b[1;32m 5897\u001b[0m \u001b[38;5;129;01mand\u001b[39;00m name \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_metadata\n\u001b[1;32m 5898\u001b[0m \u001b[38;5;129;01mand\u001b[39;00m name \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_accessors\n\u001b[1;32m 5899\u001b[0m \u001b[38;5;129;01mand\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_info_axis\u001b[38;5;241m.\u001b[39m_can_hold_identifiers_and_holds_name(name)\n\u001b[1;32m 5900\u001b[0m ):\n\u001b[1;32m 5901\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m[name]\n\u001b[0;32m-> 5902\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mobject\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[38;5;21;43m__getattribute__\u001b[39;49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mname\u001b[49m\u001b[43m)\u001b[49m\n",
69 | "\u001b[0;31mAttributeError\u001b[0m: 'DataFrame' object has no attribute 'cols'"
70 | ]
71 | }
72 | ],
73 | "source": [
74 | "print(atl03.df.columns)"
75 | ]
76 | },
77 | {
78 | "cell_type": "markdown",
79 | "id": "7695fc23",
80 | "metadata": {},
81 | "source": []
82 | }
83 | ],
84 | "metadata": {
85 | "kernelspec": {
86 | "display_name": "Python 3 (ipykernel)",
87 | "language": "python",
88 | "name": "python3"
89 | },
90 | "language_info": {
91 | "codemirror_mode": {
92 | "name": "ipython",
93 | "version": 3
94 | },
95 | "file_extension": ".py",
96 | "mimetype": "text/x-python",
97 | "name": "python",
98 | "nbconvert_exporter": "python",
99 | "pygments_lexer": "ipython3",
100 | "version": "3.8.0"
101 | }
102 | },
103 | "nbformat": 4,
104 | "nbformat_minor": 5
105 | }
106 |
--------------------------------------------------------------------------------
/notebooks/.ipynb_checkpoints/Quick_Start_Demo-checkpoint.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "b9aaca9e",
6 | "metadata": {},
7 | "source": [
8 | "## Quick Start PhoREAL ##\n",
9 | "\n",
10 | "*Tiny* notebook on how to get started with PhoREAL and how to read in ATL03 and ATL08 `.h5` files. This demo reads an ATL03 file, together with its associated ATL08 file, for a specific ground track. PhoREAL reads the ATL03 `heights` subgroup, which holds the photon-rate data. Once read, the data are available as a Pandas DataFrame and can be manipulated with standard Pandas operations.\n",
11 | "\n",
12 | "Start by importing `phoreal`"
13 | ]
14 | },
15 | {
16 | "cell_type": "code",
17 | "execution_count": null,
18 | "id": "d3a7e008",
19 | "metadata": {},
20 | "outputs": [],
21 | "source": [
22 | "import phoreal as pr"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "id": "f227d36a",
28 | "metadata": {},
29 | "source": [
30 | "Within `phoreal`, use the submodule `reader` to read the ATL03 file as a PhoREAL object. The ATL08 file can also be passed in to load the ATL08 photon-rate canopy and ground classifications and normalized heights."
31 | ]
32 | },
33 | {
34 | "cell_type": "code",
35 | "execution_count": null,
36 | "id": "d558afca",
37 | "metadata": {},
38 | "outputs": [],
39 | "source": [
40 | "atl03_file = '~/foo/ATL03_20200208094726_06670606_005_01.h5'\n",
41 | "atl08_file = '~/foo/ATL08_20200208094726_06670606_005_01.h5'"
42 | ]
43 | },
44 | {
45 | "cell_type": "markdown",
46 | "id": "bbd87f10",
47 | "metadata": {},
48 | "source": [
49 | "Also define which ground track to read in; the options are: `gt1l`, `gt1r`, `gt2l`, `gt2r`, `gt3l`, `gt3r`."
50 | ]
51 | },
52 | {
53 | "cell_type": "code",
54 | "execution_count": null,
55 | "id": "9bcdc4c1",
56 | "metadata": {},
57 | "outputs": [],
58 | "source": [
59 | "gt = 'gt1r' "
60 | ]
61 | },
62 | {
63 | "cell_type": "markdown",
64 | "id": "cf69972b",
65 | "metadata": {},
66 | "source": [
67 | "And now read the file:"
68 | ]
69 | },
70 | {
71 | "cell_type": "code",
72 | "execution_count": null,
73 | "id": "8b522cd9",
74 | "metadata": {},
75 | "outputs": [],
76 | "source": [
77 | "atl03 = pr.reader.get_atl03_struct(atl03_file, gt, atl08_file)"
78 | ]
79 | },
80 | {
81 | "cell_type": "markdown",
82 | "id": "150678f7",
83 | "metadata": {},
84 | "source": [
85 | "Here we can verify the Pandas Dataframe that represents the `heights` subgroup. We can examine the columns using a standard Pandas call:"
86 | ]
87 | },
88 | {
89 | "cell_type": "code",
90 | "execution_count": null,
91 | "id": "896a2454",
92 | "metadata": {},
93 | "outputs": [],
94 | "source": [
95 | "print(atl03.df.columns)"
96 | ]
97 | },
98 | {
99 | "cell_type": "markdown",
100 | "id": "91f73dcf",
101 | "metadata": {},
102 | "source": []
103 | }
104 | ],
105 | "metadata": {
106 | "kernelspec": {
107 | "display_name": "Python 3 (ipykernel)",
108 | "language": "python",
109 | "name": "python3"
110 | },
111 | "language_info": {
112 | "codemirror_mode": {
113 | "name": "ipython",
114 | "version": 3
115 | },
116 | "file_extension": ".py",
117 | "mimetype": "text/x-python",
118 | "name": "python",
119 | "nbconvert_exporter": "python",
120 | "pygments_lexer": "ipython3",
121 | "version": "3.8.0"
122 | }
123 | },
124 | "nbformat": 4,
125 | "nbformat_minor": 5
126 | }
127 |
--------------------------------------------------------------------------------
/notebooks/Quick_Start_Demo.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "b9aaca9e",
6 | "metadata": {},
7 | "source": [
8 | "## Quick Start PhoREAL ##\n",
9 | "\n",
10 | "*Tiny* notebook on how to get started with PhoREAL and how to read in ATL03 and ATL08 `.h5` files. This demo reads an ATL03 file, together with its associated ATL08 file, for a specific ground track. PhoREAL reads the ATL03 `heights` subgroup, which holds the photon-rate data. Once read, the data are available as a Pandas DataFrame and can be manipulated with standard Pandas operations.\n",
11 | "\n",
12 | "Start by importing `phoreal`"
13 | ]
14 | },
15 | {
16 | "cell_type": "code",
17 | "execution_count": null,
18 | "id": "d3a7e008",
19 | "metadata": {},
20 | "outputs": [],
21 | "source": [
22 | "import phoreal as pr"
23 | ]
24 | },
25 | {
26 | "cell_type": "markdown",
27 | "id": "f227d36a",
28 | "metadata": {},
29 | "source": [
30 | "Within `phoreal`, use the submodule `reader` to read the ATL03 file as a PhoREAL object. The ATL08 file can also be passed in to load the ATL08 photon-rate canopy and ground classifications and normalized heights."
31 | ]
32 | },
33 | {
34 | "cell_type": "code",
35 | "execution_count": null,
36 | "id": "d558afca",
37 | "metadata": {},
38 | "outputs": [],
39 | "source": [
40 | "atl03_file = '~/foo/ATL03_20200208094726_06670606_005_01.h5'\n",
41 | "atl08_file = '~/foo/ATL08_20200208094726_06670606_005_01.h5'"
42 | ]
43 | },
44 | {
45 | "cell_type": "markdown",
46 | "id": "bbd87f10",
47 | "metadata": {},
48 | "source": [
49 | "Also define which ground track to read in; the options are: `gt1l`, `gt1r`, `gt2l`, `gt2r`, `gt3l`, `gt3r`."
50 | ]
51 | },
52 | {
53 | "cell_type": "code",
54 | "execution_count": null,
55 | "id": "9bcdc4c1",
56 | "metadata": {},
57 | "outputs": [],
58 | "source": [
59 | "gt = 'gt1r' "
60 | ]
61 | },
62 | {
63 | "cell_type": "markdown",
64 | "id": "cf69972b",
65 | "metadata": {},
66 | "source": [
67 | "And now read the file:"
68 | ]
69 | },
70 | {
71 | "cell_type": "code",
72 | "execution_count": null,
73 | "id": "8b522cd9",
74 | "metadata": {},
75 | "outputs": [],
76 | "source": [
77 | "atl03 = pr.reader.get_atl03_struct(atl03_file, gt, atl08_file)"
78 | ]
79 | },
80 | {
81 | "cell_type": "markdown",
82 | "id": "150678f7",
83 | "metadata": {},
84 | "source": [
85 | "Here we can verify the Pandas Dataframe that represents the `heights` subgroup. We can examine the columns using a standard Pandas call:"
86 | ]
87 | },
88 | {
89 | "cell_type": "code",
90 | "execution_count": null,
91 | "id": "896a2454",
92 | "metadata": {},
93 | "outputs": [],
94 | "source": [
95 | "print(atl03.df.columns)"
96 | ]
97 | },
98 | {
99 | "cell_type": "markdown",
100 | "id": "91f73dcf",
101 | "metadata": {},
102 | "source": []
103 | }
104 | ],
105 | "metadata": {
106 | "kernelspec": {
107 | "display_name": "Python 3 (ipykernel)",
108 | "language": "python",
109 | "name": "python3"
110 | },
111 | "language_info": {
112 | "codemirror_mode": {
113 | "name": "ipython",
114 | "version": 3
115 | },
116 | "file_extension": ".py",
117 | "mimetype": "text/x-python",
118 | "name": "python",
119 | "nbconvert_exporter": "python",
120 | "pygments_lexer": "ipython3",
121 | "version": "3.8.0"
122 | }
123 | },
124 | "nbformat": 4,
125 | "nbformat_minor": 5
126 | }
127 |
--------------------------------------------------------------------------------
/phoreal/CalVal.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | """
4 | Created on Wed Oct 30 15:46:19 2019
5 |
6 | @author: eguenther
7 | """
15 |
16 | import os
17 | import numpy as np
18 | import time as runTime
19 | import sys
20 | import warnings
21 | #from icesatIO import *
22 | #from icesatUtils import *
23 |
24 | from phoreal.getMeasurementError import *
25 | from scipy import spatial
26 | import matplotlib.pyplot as plt
27 | import copy
29 |
30 |
31 | #Include/exclude filter
32 | def includeFilter(sortedMeasured, superTruth):
33 | # xtest2 = superTruth.alongTrack
34 | difarr = superTruth.alongTrack[1:len(superTruth.alongTrack)] - superTruth.alongTrack[0:(len(superTruth.alongTrack) - 1)]
35 | difarr = np.append(difarr,0)
36 | indices = np.arange(len(difarr))
37 | selected = [(i,j) for (i,j) in zip(difarr,indices) if i >= 0.5]
38 | selectlist = list(zip(*selected))
39 | if len(selectlist) > 0:
40 | selectlist = selectlist[1]
41 | startindex = np.array([0])
42 | selectlist = np.array(selectlist)
43 | selectlist = selectlist + 1
44 | startindex2 = np.append(startindex,selectlist)
45 | endindex = np.array(selectlist)
46 | endindex = endindex - 1
47 | endindex = np.append(endindex,(len(superTruth.alongTrack) - 1))
48 | setindex = np.column_stack([startindex2,endindex])
49 | delindex = np.array([-1])
50 | for i in range(0,setindex.shape[0]):
51 | indexdif= setindex[i][1] - setindex[i][0]
52 | if (indexdif > 1000):
53 | if np.sum(delindex) == -1:
54 | delindex = i
55 | else:
56 | delindex = np.append(delindex,i)
57 |
58 |
59 | includeindex = setindex[delindex]
60 |
61 |
62 |
63 | #Create a zero-filled array of the same size as the ATL03 data
64 | #Iterate through the start and end index for the truth data
65 | #At each iteration, wherever ATL03 is between the start and stop...
66 | #...make array of 1's where it is between the values
67 | #...add that array to a global array filter
68 | #...use array to create a filter
69 | yfilter = np.zeros([len(sortedMeasured.alongTrack),1])
70 | if includeindex.ndim == 1:
71 | start = superTruth.alongTrack[includeindex[0]]
72 | stop = superTruth.alongTrack[includeindex[1]]
73 | yfilter[(sortedMeasured.alongTrack > start) \
74 | & (sortedMeasured.alongTrack < stop)] = 1
75 | else:
76 | for i in range(0,len(includeindex)):
77 | start = superTruth.alongTrack[includeindex[i][0]]
78 | stop = superTruth.alongTrack[includeindex[i][1]]
79 | yfilter[(sortedMeasured.alongTrack > start) \
80 | & (sortedMeasured.alongTrack < stop)] = 1
81 | else:
82 | yfilter = np.zeros([len(sortedMeasured.alongTrack),1])
83 | start = np.min(superTruth.alongTrack)
84 | stop = np.max(superTruth.alongTrack)
85 | yfilter[(sortedMeasured.alongTrack > start) \
86 | & (sortedMeasured.alongTrack < stop)] = 1
87 |
88 |
89 | return yfilter
90 |
91 | # Perfect Classifier
92 | def perfect_classifier(atl03, truth_swath,ground = [2],canopy = [4],
93 | unclassed = [1, 6, 7, 18], keepsize = True):
94 | # Find max/min along track
95 | print('Run Perfect Classifier')
96 | maxy = np.max(truth_swath.alongtrack)
97 | miny = np.min(truth_swath.alongtrack)
98 |
99 | measy = np.array(atl03.df.alongtrack)
100 | measz = np.array(atl03.df.h_ph)
101 | measc = np.array(atl03.df.classification)
102 | # measz = measz - 1
103 |
104 | # Reclass everything to generic classes
105 | measc[measc == -1] = 0
106 | measc[measc == 3] = 2
107 |
108 | # Calculate filter offsets for later
109 | if keepsize == True:
110 | if np.min(measy) < miny:
111 | minmeasindex = next(x for x, val in enumerate(measy) if val > miny)
112 | else:
113 | minmeasindex = 0
114 | if np.max(measy) > maxy:
115 | maxmeasindex = next(x for x, val in enumerate(measy) if val > maxy)
116 | else:
117 | maxmeasindex = 0
118 |
119 | # Apply filter based on min/max of truth data
120 | measfilt = np.where((measy <= maxy) & (measy >= miny))
121 |
122 | measyfilt = measy[measfilt]
123 | measzfilt = measz[measfilt]
124 | meascfilt = measc[measfilt]
125 | classfilt = [np.isin(np.array(truth_swath.classification),ground) | np.isin(np.array(truth_swath.classification),canopy)]
126 | ty = np.array(truth_swath.alongtrack)[classfilt[0]]
127 | tz = np.array(truth_swath.z)[classfilt[0]]
128 | tc = np.array(truth_swath.classification)[classfilt[0]]
129 | #Create KDtree and then query the tree
130 | pts = np.vstack((measyfilt,measzfilt)).T
131 | print(' Building KDtree')
132 | tree = spatial.KDTree(list(zip(ty, tz)))
133 | print(' Querying KDtree')
134 | dist, index = tree.query(pts)
135 |
136 | # Pre-populate all photon classed array with zeroes
137 | measpc = np.zeros(len(index))
138 | measpc = measpc - 1
139 |
140 | # Populate all photon classed array from ATL08 classifications
141 | measpc = tc[index]
142 |
143 | measpc[dist > 1.5] = 0
144 |
145 | # Reclass everything to generic classes
146 | measpc[np.isin(measpc,unclassed)] = 0
147 | measpc[np.isin(measpc,ground)] = 1
148 | measpc[np.isin(measpc,canopy)] = 2
149 | measpc[measpc > 2] = 0
150 |
152 |
153 | # If keepsize is True, append 0's to measpc
154 | if keepsize == True:
155 | if ((minmeasindex == 0) & (maxmeasindex == 0)):
156 | measpc = measpc
157 | elif ((minmeasindex > 0) & (maxmeasindex == 0)):
158 | frontzeros = np.zeros(minmeasindex)
159 | measpc = np.append(frontzeros,measpc)
160 | elif ((minmeasindex == 0) & (maxmeasindex > 0)):
161 | backzeros = np.zeros(len(measc) - maxmeasindex)
162 | # measpc = np.append(frontzeros,measpc)
163 | measpc = np.append(measpc,backzeros)
164 | elif minmeasindex > maxmeasindex:
165 | frontzeros = np.zeros(maxmeasindex)
166 | backzeros = np.zeros(len(measc) - minmeasindex)
167 | measpc = np.append(frontzeros,measpc)
168 | measpc = np.append(measpc,backzeros)
169 | elif minmeasindex <= maxmeasindex:
170 | frontzeros = np.zeros(minmeasindex)
171 | backzeros = np.zeros(len(measc) - maxmeasindex)
172 | measpc = np.append(frontzeros,measpc)
173 | measpc = np.append(measpc,backzeros)
174 | measpc = measpc.astype('int')
175 | measoc = measc
176 | else:
177 | measpc = measpc.astype('int')
178 | measoc = meascfilt
179 |
180 | return measpc, measoc
181 |
182 |
183 | if __name__ == "__main__":
184 | pass
185 |
186 |
--------------------------------------------------------------------------------
/phoreal/Viewer_Online_blank.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
24 |
25 |
26 |
281 |
282 |
283 |
284 |
285 |
286 | PhoShow Sonoma 2018-10-30 GT1R
287 |
288 |
289 |
290 |
291 |
292 |
293 |
294 |
295 |
296 |
297 |
298 |
302 |
303 |
304 |
305 |
306 | Switching basemaps
307 |
308 |
309 |
310 |
313 |
316 |
317 |
318 |
319 |
322 |
323 |
336 |
337 |
338 |
339 | ESRI Topographic
340 | ESRI Streets
341 | National Geographic
342 | ESRI Oceans
343 | ESRI Gray
344 | ESRI Dark Gray
345 | ESRI Imagery
346 | ESRI Imagery (Clarity)
347 | ESRI Imagery (Firefly)
348 | ESRI Shaded Relief
349 | ESRI Physical
350 | Stamen Terrain
351 | Geoportail France Imagery
352 | NASA/USGS Landsat 8
353 | NASA MODIS Terra
354 |
355 |
356 |
357 |
358 |
359 |
470 |
471 |
--------------------------------------------------------------------------------
/phoreal/Viewer_blank.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
24 |
25 |
26 |
209 |
--------------------------------------------------------------------------------
/phoreal/__init__.py:
--------------------------------------------------------------------------------
1 | #__init__.py
2 | from . import gui_addins
3 | from . import utils
4 | from . import io
5 | from . import binner
6 | from . import reader
7 | from . import reference
8 |
--------------------------------------------------------------------------------
/phoreal/__pycache__/CalVal.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/CalVal.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/__init__.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/__init__.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/__init__.cpython-38.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/__init__.cpython-38.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/__init__.cpython-39.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/__init__.cpython-39.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/ace.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/ace.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/ace.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/ace.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/binner.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/binner.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/getAtlMeasuredSwath_auto.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/getAtlMeasuredSwath_auto.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/getAtlMeasuredSwath_auto.cpython-38.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/getAtlMeasuredSwath_auto.cpython-38.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/getAtlTruthSwath_auto.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/getAtlTruthSwath_auto.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/getMeasurementError.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/getMeasurementError.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/getMeasurementError.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/getMeasurementError.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/getMeasurementError_auto.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/getMeasurementError_auto.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/gui_addins.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/gui_addins.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/gui_addins.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/gui_addins.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/gui_addins.cpython-38.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/gui_addins.cpython-38.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatBin.cpython-38.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatBin.cpython-38.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatBinner.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatBinner.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatBinner.cpython-38.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatBinner.cpython-38.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatCalVal.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatCalVal.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatIO.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatIO.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatIO.cpython-38.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatIO.cpython-38.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatPlot.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatPlot.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatReader.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatReader.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatReader.cpython-38.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatReader.cpython-38.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatReader.cpython-39.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatReader.cpython-39.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatReference.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatReference.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatUtils.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatUtils.cpython-36.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatUtils.cpython-38.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatUtils.cpython-38.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/icesatUtils.cpython-39.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/icesatUtils.cpython-39.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/io.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/io.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/plot.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/plot.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/reader.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/reader.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/reference.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/reference.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/__pycache__/utils.cpython-311.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/__pycache__/utils.cpython-311.pyc
--------------------------------------------------------------------------------
/phoreal/ace.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | """
3 | Created on Thu Aug 4 09:22:31 2022
4 |
5 | @author: ERICG
6 | """
7 |
8 | import argparse
9 | import os
10 | import laspy
11 | from laspy.file import File
12 | import numpy as np
13 | from scipy import interpolate
14 |
15 | def interpolate_dem(dem, nan_value=-999):
16 | dem[dem == nan_value] = np.nan
17 | # If the dem is only 1 pixel wide in one direction it will cause a problem
18 | # with the interpolation, so check for this and pad it
19 | invalid_x_size = False
20 | invalid_y_size = False
21 | if dem.shape[0] == 1: # pragma: no cover
22 | dem = np.concatenate((dem, dem, dem), axis=0)
23 | invalid_x_size = True
24 | if dem.shape[1] == 1: # pragma: no cover
25 | dem = np.concatenate((dem, dem, dem), axis=1)
26 | invalid_y_size = True
27 | if invalid_x_size and invalid_y_size: # pragma: no cover
28 | raise RuntimeError('Found single pixel dem, cannot process')
29 | # Derive x and y indexes from mesh-grid
30 | x_indx, y_indx = np.meshgrid(np.arange(0, dem.shape[1]),
31 | np.arange(0, dem.shape[0]))
32 |
33 | # Mask all invalid values
34 | zs_masked = np.ma.masked_invalid(dem)
35 |
36 | # Retrieve the valid, non-Nan, defined values
37 | valid_xs = x_indx[~zs_masked.mask]
38 | valid_ys = y_indx[~zs_masked.mask]
39 | valid_zs = zs_masked[~zs_masked.mask]
40 |
41 | # Interpolate missing values on the DEM
42 | dem = interpolate.griddata((valid_xs, valid_ys), valid_zs.ravel(),
43 | (x_indx, y_indx), method='linear')
44 |
45 |
46 | # Remove padding if it was added
47 | if invalid_x_size: # pragma: no cover
48 | dem = dem[1, :]
49 | dem = np.expand_dims(dem, axis=0)
50 |     if invalid_y_size: # pragma: no cover
51 | dem = dem[:, 1]
52 | dem = np.expand_dims(dem, axis=1)
53 |
54 | return dem
55 |
56 | # Run ACE
57 | def ace(x, y, z, c, interpolate = False):
58 | # Define static variables
59 | ground = 2 # ASPRS ground classification code
60 | unclassed = 1 # ASPRS unclassed classification code
61 | x_res = 1 # Res of 'grid' in 'x'
62 | y_res = 1 # Res of 'grid' in 'y'
63 | ground_buffer = 0.5 # Default buffer of ground
64 |
65 | # Grid las files
66 | x = np.floor((x - np.min(x))/x_res).astype(int)
67 | y = np.floor((y - np.min(y))/y_res).astype(int)
68 | # dem = np.zeros([np.max(x[c == ground]) + 1, np.max(y[c == ground]) + 1])
69 | dem = np.zeros([np.max(x) + 1, np.max(y) + 1])
70 | dem[x[c == ground],y[c == ground]] = z[c == ground]
71 |
72 | # Interpolate DEM
73 | if interpolate:
74 | dem = interpolate_dem(dem, nan_value=0)
75 | dem[np.isnan(dem)] = 0
76 |
77 | #Normalize las.z to DEM
78 | z = z - dem[x,y]
79 |
80 |     # Where c is unclassified and height is within +/-ground_buffer of the DEM
81 | c[np.where((c == unclassed) & (z > -ground_buffer) &\
82 | (z < ground_buffer))] = ground
83 |
84 | return c
85 |
86 | # Read las/laz, run ACE, write out new las/laz
87 | def convert_las_ace(in_file, out_file, interpolate = False):
88 | # Read the las file
89 | las = File(in_file, mode = "r")
90 |
91 | # Run ACE
92 |     c = ace(las.x, las.y, las.z, las.classification, interpolate = interpolate)
93 |
94 | # Write and close out in/out las files
95 | las_out = laspy.file.File(out_file, mode="w", header=las.header)
96 | las_out.set_points(las.get_points())
97 | las.close()
98 | las_out.classification = c
99 | las_out.close()
100 |
101 | def main(in_file, out_file, interpolate):
102 | # Check if in_file is file
103 | if os.path.isfile(in_file):
104 | # Check if out_file is a .las or .laz
105 |         if out_file[-4:] == '.las' or out_file[-4:] == '.laz':
106 | convert_las_ace(in_file, out_file, interpolate)
107 | # Check if out_file is actually a directory
108 | elif os.path.isdir(out_file):
109 | out_file = os.path.join(out_file,\
110 | os.path.basename(in_file)[:-4] + '_ace.las')
111 | convert_las_ace(in_file, out_file, interpolate)
112 | # Check if in_file and out_file are actually directories
113 |     elif os.path.isdir(in_file) and os.path.isdir(out_file):
114 |         out_dir = out_file
115 |         for f in os.listdir(in_file):
116 |             _, ext = os.path.splitext(f)
117 |             if ext == '.las' or ext == '.laz':
118 |                 # Keep the output directory fixed between files
119 |                 out_file = os.path.join(out_dir,
120 |                     os.path.splitext(f)[0] + '_ace' + ext)
121 |                 # Join the input directory so the path resolves
122 |                 convert_las_ace(os.path.join(in_file, f),
123 |                     out_file, interpolate)
124 |
125 | if __name__ == '__main__':
126 | """ Command line entry point """
127 | parser = argparse.ArgumentParser()
128 |
129 | # Required positional argument
130 | parser.add_argument("in_file",
131 | help="Input .las/.laz or directory of .las/.laz files")
132 |
133 | parser.add_argument("out_file",
134 | help="Output .las/.laz or directory of .las/.laz files")
135 |
136 | parser.add_argument('-interpolate', '--interpolate_dem', action='store_true',
137 | help='Interpolate missing pixels in DEM')
138 |
139 |
140 | args = parser.parse_args()
141 |
142 | args.in_file = os.path.normpath(args.in_file)
143 | args.out_file = os.path.normpath(args.out_file)
144 |
145 | main(args.in_file,
146 | args.out_file,
147 |          args.interpolate_dem)
--------------------------------------------------------------------------------
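The ACE routine above grids ground-classified returns into a 1 m DEM, normalizes every point's height to that surface, and promotes unclassified points within the ground buffer to ground. A minimal, self-contained numpy sketch of that core step (it mirrors the logic of `ace()` but is not the phoreal API; the arrays and the `ace_sketch` name are illustrative):

```python
import numpy as np

GROUND, UNCLASSED = 2, 1          # ASPRS class codes used by ace()
GROUND_BUFFER = 0.5               # metres, same default as ace()

def ace_sketch(x, y, z, c):
    # Grid coordinates at 1 m resolution, as in ace()
    xi = np.floor(x - x.min()).astype(int)
    yi = np.floor(y - y.min()).astype(int)
    # Rasterize ground returns into a DEM (last point per cell wins)
    dem = np.zeros((xi.max() + 1, yi.max() + 1))
    dem[xi[c == GROUND], yi[c == GROUND]] = z[c == GROUND]
    # Height above the gridded ground surface
    dz = z - dem[xi, yi]
    # Promote unclassified points near the ground surface to ground
    c = c.copy()
    c[(c == UNCLASSED) & (np.abs(dz) < GROUND_BUFFER)] = GROUND
    return c

x = np.array([0.2, 0.4, 1.3, 1.6])
y = np.array([0.1, 0.3, 0.2, 0.4])
z = np.array([10.0, 10.3, 12.0, 19.0])
c = np.array([GROUND, UNCLASSED, UNCLASSED, UNCLASSED])
print(ace_sketch(x, y, z, c))
```

The second point sits 0.3 m above the gridded ground cell, so it is reclassified; the remaining unclassified points are too far from the surface and keep their class.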
/phoreal/acer.cpython-36m-x86_64-linux-gnu.so:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/acer.cpython-36m-x86_64-linux-gnu.so
--------------------------------------------------------------------------------
/phoreal/atl_processor.py:
--------------------------------------------------------------------------------
1 | # Import packages
2 | import os
3 | import scipy.io as sio
4 | import pandas as pd
5 | import argparse
6 | import logging
7 | import re
8 |
9 | print('Packages imported successfully')
10 |
11 | # Read specific packages from PhoREAL
12 | from phoreal.reader import get_atl03_struct
13 | from phoreal.reader import get_atl08_struct
14 | from phoreal.binner import rebin_atl08
15 |
16 | print('PhoREAL packages imported successfully')
17 |
18 | print('import successful')
19 |
20 | '''
21 | (1) Aggregates all atl03 and atl08 data files from their respective directories
22 | under /mnt/walker/Data/ICESat-2/REL005/.../PRF.
23 | (2) Matches atl03 and atl08 alongtrack information for a specified region.
24 | (3) Re-bins matched/mapped atl08 data from 100m to 30m.
25 |
26 | Key Variables
27 | ----------
28 | See argparse section.
29 |
30 | Outputs
31 | -------
32 | csv file
33 | Contains processed data ready for comparison to ASL
34 | mat file (optional)
35 | Contains processed data ready for comparison to ASL
36 |
37 | See Also
38 | --------
39 | prf_processing_example.ipynb
40 |
41 | '''
42 |
43 | def parse_arguments() -> dict:
44 | '''Command line entry point'''
45 | parser = argparse.ArgumentParser()
46 | '''Required arguments:'''
47 | parser.add_argument('-atl03', '--atl03_dir', required=True, type=str
48 | , help='Input ATL03 directory')
49 |
50 | parser.add_argument('-atl08', '--atl08_dir', required=True, type=str
51 | , help='Input ATL08 directory')
52 |
53 | parser.add_argument('-out', '--output_dir', required=True, type=str
54 | , help='CSV output file /directory/')
55 |
56 | parser.add_argument('-gt', '--gtXX', required=True, type=str
57 | , help='Alongtrack (gt1l, gt1r, gt2l, gt2r, gt3l, gt3r)')
58 |
59 | parser.add_argument('-ec', '--epsg_code', required=True, type=str
60 | , help='UTM Zone')
61 | '''Optional arguments:'''
62 | parser.add_argument('-r', '--res', nargs='?', const=1, type=int
63 | , default=30, help='Alongtrack resolution (m)')
64 |
65 | parser.add_argument('-lat', '--latitude', nargs='+', type=float
66 | , help='Min/max list of latitude for data trimming')
67 |
68 | parser.add_argument('-lon', '--longitude', nargs='+', type=float
69 | , help='Min/max list of longitude for data trimming')
70 |
71 | parser.add_argument('-mat', '--mat_out', action='store_true'
72 | , help='Output to *mat (override *csv default)')
73 |
74 | parser.add_argument('-log', '--log_fn', type=str
75 | , help='Logging output /directory/filename')
76 | args = parser.parse_args()
77 | # Using dict for readability and to work with logging more easily.
78 | arg_dict = { 'in_atl03' : args.atl03_dir
79 | , 'in_atl08' : args.atl08_dir
80 | , 'in_gt' : args.gtXX
81 | , 'in_epsg' : args.epsg_code
82 | , 'in_res' : args.res
83 | , 'in_lat' : args.latitude
84 | , 'in_lon' : args.longitude
85 | , 'out_dir' : args.output_dir
86 | , 'out_mat' : args.mat_out
87 | , 'out_log' : [args.log_fn] }
88 | return arg_dict
89 |
90 | def _initialize_logging(logging_fn):
91 | """
92 | Initialize logging object.
93 |
94 | Parameters:
95 | logging_fn (str): log /dir/fn from argparse command line input
96 |
97 | Returns:
98 | logger (logging obj): initialized logging object
99 |
100 | """
101 | # Log filename: argparse /dir/fn
102 | # Format: asc time, log msg
103 | # Date/Time Format: month/day/year HH:MM:SS
104 | # Log file mode: write
105 | logging.basicConfig(filename=logging_fn
106 | , format='%(asctime)s %(message)s'
107 | , datefmt='%m/%d/%Y %H:%M:%S'
108 | , filemode='w')
109 | logger = logging.getLogger()
110 | # Setting threshold tentatively at WARNING
111 | logger.setLevel(logging.WARNING)
112 | return logger
113 |
114 | def collect_atl_files(arg_dict) -> dict:
115 | """
116 | Aggregate atl03 and atl08 files.
117 |
118 | Parameters:
119 | arg_dict (dict): dictionary of argparse command line inputs
120 |
121 | Returns:
122 | atl_data_dict (dict): dictionary of string pairs representing atl03, atl08 files
123 | in /dir/fn format
124 | """
125 | atl_data_dict = {}
126 | dataset_count = 0
127 | for data_file in os.listdir(arg_dict['in_atl03']):
128 | # Ensure that file is of ATL03 h5 type.
129 | if data_file[:5] == 'ATL03':
130 | atl03_file = os.path.join(arg_dict['in_atl03'], data_file)
131 | else:
132 | continue
133 | # atl08 file names are generated by modifying atl03 file names (identical minus 'atl03')
134 | atl08_file = os.path.join(arg_dict['in_atl08'], 'ATL08' + data_file[5:])
135 | # Ensure that a matching atl08 file exists, ensure that both the atl03 and atl08
136 | # files contain data. Log and continue if not.
137 | try:
138 | if os.path.getsize(atl03_file) and os.path.getsize(atl08_file):
139 | atl_data_dict['dataset' + str(dataset_count)] = {'atl03_file': atl03_file
140 | , 'atl08_file': atl08_file}
141 | dataset_count += 1
142 | else:
143 | if arg_dict['out_log']:
144 |                     arg_dict['out_log'][1].error('The following atl file is empty: '
145 |                         + (atl03_file if not os.path.getsize(atl03_file) else atl08_file))
146 | else:
147 | print('The following file is empty: '
148 | , atl03_file if not os.path.getsize(atl03_file) else atl08_file )
149 | except OSError:
150 | if arg_dict['out_log']:
151 |                 arg_dict['out_log'][1].error('Missing the following atl file: '
152 |                     + (atl03_file if not os.path.isfile(atl03_file) else atl08_file))
153 | else:
154 | print('Missing the following atl file: '
155 | , atl03_file if not os.path.isfile(atl03_file) else atl08_file)
156 | return atl_data_dict
157 |
158 | def trim_and_bin(arg_dict, dataset_dict) -> dict:
159 | """
160 | Create atl03/atl08 objects (structs). Trim to regional
161 | lat/lon (optional). Re-bin atl03 to 30m.
162 |
163 | Parameters:
164 |         arg_dict (dict): dictionary of argparse command line inputs
165 |         dataset_dict (dict): atl03/atl08 file pair entry from atl_data_dict
166 | Returns:
167 | dataset_dict (subset of atl_data_dict) (dict): dictionary of string
168 | pairs representing atl03, atl08 files in /dir/fn format;
169 | atl03 obj; atl08 obj; atl08 rebinned obj
170 | """
171 | res_field = 'alongtrack'
172 | # Create atl03 object and trim lat/lon to desired region (optional).
173 | # Create atl08 object and trim lat/lon to desired region (optional).
174 | # Log and retry if trimming fails
175 | dataset_dict['atl03'] = get_atl03_struct( dataset_dict['atl03_file']
176 | , arg_dict['in_gt']
177 | , dataset_dict['atl08_file']
178 | , epsg = arg_dict['in_epsg'] )
179 | try:
180 | if arg_dict['in_lat']:
181 | dataset_dict['atl03'].trim_by_lat( min(arg_dict['in_lat'])
182 | , max(arg_dict['in_lat']) )
183 | if arg_dict['in_lon']:
184 | dataset_dict['atl03'].trim_by_lon( min(arg_dict['in_lon'])
185 | , max(arg_dict['in_lon']) )
186 | dataset_dict['atl08'] = get_atl08_struct( dataset_dict['atl08_file']
187 | , arg_dict['in_gt']
188 | , dataset_dict['atl03'] )
189 | if arg_dict['in_lat']:
190 | dataset_dict['atl08'].trim_by_lat( min(arg_dict['in_lat'])
191 | , max(arg_dict['in_lat']) )
192 | if arg_dict['in_lon']:
193 | dataset_dict['atl08'].trim_by_lon( min(arg_dict['in_lon'])
194 | , max(arg_dict['in_lon']) )
195 | except ValueError:
196 | if arg_dict['out_log']:
197 | arg_dict['out_log'][1].error('An invalid value was encountered. \
198 | Making another attempt without lat/lon trimming.')
199 | else:
200 | print('An invalid value was encountered. \
201 | Making another attempt without lat/lon trimming.')
202 | dataset_dict['atl03'] = get_atl03_struct( dataset_dict['atl03_file']
203 | , arg_dict['in_gt']
204 | , dataset_dict['atl08_file']
205 | , epsg = arg_dict['in_epsg'] )
206 | dataset_dict['atl08'] = get_atl08_struct( dataset_dict['atl08_file']
207 | , arg_dict['in_gt']
208 | , dataset_dict['atl03'] )
209 | # Re-bin atl03 object to 30m from 100m.
210 | dataset_dict['atl08_bin'] = rebin_atl08( dataset_dict['atl03']
211 | , dataset_dict['atl08']
212 | , arg_dict['in_gt']
213 | , arg_dict['in_res'], res_field )
214 | return dataset_dict
215 |
216 | def generate_output(arg_dict, dataset_dict) -> None:
217 | """
218 | Generate re-binned atl08 output file in *csv.
219 | Optionally, can generate *mat files as well as *csv.
220 |
221 | Parameters:
222 | arg_dict (dict): Dictionary of argparse options
223 | dataset_dict (dict): Dictionary of entire dataset
224 | generated from each atl03 file, atl08 file set
225 | Returns:
226 | None
227 | """
228 | csv_fn = arg_dict['out_dir'] + '/atl08_bin_' + re.sub(r'^.*?ATL08_', ''
229 | , dataset_dict['atl08_file'])
230 |     csv_fn = os.path.splitext(csv_fn)[0] + '.csv'
231 | dataset_dict['atl08_bin'].to_csv(csv_fn)
232 | if arg_dict['out_mat'] is True:
233 |         mat_fn = os.path.splitext(csv_fn)[0] + '.mat'
234 | df = pd.read_csv(csv_fn)
235 | dictionary = {}
236 | dictionary['struct'] = df.to_dict('list')
237 | sio.savemat(mat_fn, dictionary)
238 |
239 | def main(arg_dict):
240 | if arg_dict['out_log'][0]:
241 | logger = _initialize_logging(arg_dict['out_log'][0])
242 | arg_dict['out_log'].append(logger)
243 | atl_data_dict = collect_atl_files(arg_dict)
244 | # Loop through atl03/atl08 file set dictionary.
245 | for dataset_dict in atl_data_dict.values():
246 | dataset_dict = trim_and_bin(arg_dict, dataset_dict)
247 | generate_output(arg_dict, dataset_dict)
248 |
249 | if __name__ == '__main__':
250 | arg_dict = parse_arguments()
251 | main(arg_dict)
252 |
--------------------------------------------------------------------------------
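The module docstring above says matched ATL08 data is re-binned from 100 m to 30 m alongtrack. The real work happens in `phoreal.binner.rebin_atl08`; this numpy-only sketch just illustrates the underlying idea (assign each record an alongtrack bin index at the target resolution and aggregate per bin). The `bin_mean` helper and the sample arrays are hypothetical, not part of PhoREAL:

```python
import numpy as np

def bin_mean(alongtrack, values, res=30):
    # Bin index at the target alongtrack resolution (metres)
    idx = np.floor(np.asarray(alongtrack) / res).astype(int)
    out = {}
    for i, v in zip(idx, values):
        out.setdefault(int(i), []).append(v)
    # Mean value per occupied bin
    return {i: float(np.mean(v)) for i, v in sorted(out.items())}

at = np.array([5.0, 25.0, 35.0, 95.0])   # alongtrack distance (m)
h = np.array([10.0, 12.0, 20.0, 30.0])   # some per-record quantity
print(bin_mean(at, h))
```

The first two records fall in bin 0 (0-30 m) and are averaged; empty bins are simply absent from the result, which is also how sparse coverage shows up in rebinned ATL08 output.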
/phoreal/atl_to_las.py:
--------------------------------------------------------------------------------
1 | import os
2 | import argparse as ap
3 | import numpy as np
4 |
5 | from phoreal.reader import write_atl03_las
6 | from phoreal.reader import get_atl03_struct
7 | from phoreal.reader import get_atl08_struct
8 |
9 |
10 | def parse_arguments() -> dict:
11 | '''Command line entry point'''
12 | parser = ap.ArgumentParser()
13 | '''Required arguments:'''
14 | parser.add_argument('-atl03', '--atl03_dir'
15 | , required=True
16 | , type=str
17 | , help='Input ATL03 directory')
18 |
19 | parser.add_argument('-atl08', '--atl08_dir'
20 | , required=True
21 | , type=str
22 | , help='Input ATL08 directory')
23 |
24 | parser.add_argument('-out', '--output_dir'
25 | , required=True
26 | , type=str
27 | , help='las output file /directory/')
28 |
29 | parser.add_argument('-gt', '--gtXX'
30 | , required=True
31 | , type=str
32 | , help='Alongtrack (gt1l, gt1r, gt2l, gt2r, gt3l, gt3r)')
33 |
34 | '''Optional arguments:'''
35 | parser.add_argument('-r', '--res'
36 | , nargs='?'
37 | , const=1
38 | , type=int
39 | , default=30
40 | , help='Alongtrack resolution (m)')
41 |
42 | parser.add_argument('-ec', '--epsg_code'
43 | , type=str
44 | , default=None
45 | , help='UTM Zone')
46 |
47 | args = parser.parse_args()
48 | # Using dict for readability and to work with logging more easily.
49 | arg_dict = { 'in_atl03' : args.atl03_dir
50 | , 'in_atl08' : args.atl08_dir
51 | , 'in_gt' : args.gtXX
52 | , 'in_epsg' : args.epsg_code
53 | , 'in_res' : args.res
54 | , 'out_dir' : args.output_dir }
55 | return arg_dict
56 |
57 |
58 | def collect_processed_data(arg_dict) -> list:
59 | """
60 | Aggregate processed atl03 and atl08 files and create objects.
61 |
62 | Parameters:
63 | arg_dict (dict): dictionary of argparse command line inputs
64 |
65 | Returns:
66 | atl_data (list): list of atl03 objects
67 | """
68 | atl_data = []
69 | dataset_count = 0
70 | for data_file in os.listdir(arg_dict['in_atl03']):
71 | if data_file[:15] == 'processed_ATL03':
72 | atl03_file = os.path.join(arg_dict['in_atl03'], data_file)
73 | else:
74 | continue
75 | # atl08 file names are generated by modifying processed
76 | # atl03 file names (identical minus 'atl03')
77 | atl08_file = os.path.join(arg_dict['in_atl08'], 'processed_ATL08' + data_file[15:])
78 | # Ensure that a matching atl08 file exists, ensure that both the atl03 and atl08
79 | # files contain data. Log and continue if not.
80 | try:
81 | if os.path.getsize(atl03_file) and os.path.getsize(atl08_file):
82 | atl03 = get_atl03_struct(atl03_file
83 | , arg_dict['in_gt']
84 | , atl08_file
85 | , epsg = arg_dict['in_epsg'])
86 | atl_data.append(atl03)
87 | else:
88 | print('The following file is empty: '
89 | , atl03_file if not os.path.getsize(atl03_file) else atl08_file )
90 | except OSError:
91 | print('Missing the following atl file: '
92 | , atl03_file if not os.path.isfile(atl03_file) else atl08_file)
93 | return atl_data
94 |
95 | def main(arg_dict):
96 | atl_data = collect_processed_data(arg_dict)
97 | for atl03_object in atl_data:
98 | write_atl03_las(atl03_object, arg_dict['out_dir'])
99 |
100 | if __name__ == '__main__':
101 | arg_dict = parse_arguments()
102 | main(arg_dict)
--------------------------------------------------------------------------------
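Both `collect_atl_files()` and `collect_processed_data()` pair files by name: an ATL08 granule shares everything after the product tag with its ATL03 counterpart, so the companion filename is derived by swapping the prefix (here the 15-character `processed_ATL03` variant used above). The granule name below is hypothetical:

```python
# Derive the companion processed ATL08 filename from a processed ATL03 name.
# len('processed_ATL03') == 15, so the slice keeps the shared suffix intact.
atl03 = 'processed_ATL03_20181016000000_02660103_005_01.h5'  # hypothetical
atl08 = 'processed_ATL08' + atl03[15:]
print(atl08)
```

If the derived ATL08 file is missing or empty, both collectors log (or print) the problem and skip the pair rather than failing the whole run.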
/phoreal/closest.c:
--------------------------------------------------------------------------------
1 | #include <stddef.h>
2 |
3 | void cfun(const double *A, size_t sizeA,const double *B, double *C, size_t sizeB)
4 | {
5 | size_t i, j, k, m, n;
6 | m = sizeA;
7 | n = sizeB;
8 | k = 0;
9 | // Assign 1st index to all values B that are less than 1st A value
10 | while( k < n && B[k] < *A ) {
11 | C[k++] = 0;
12 | }
13 | // Step through until B is between two A values, then test for result
14 | i = 0;
15 |     for( j=k; j<n; j++ ) {
16 |         while( i+1 < m && B[j] >= A[i+1] ) i++;
17 | if( i+1 == m ) break;
18 | if( B[j] - A[i] < A[i+1] - B[j] ) {
19 | C[j] = i + 0;
20 | } else {
21 | C[j] = i + 1;
22 | }
23 | }
24 | // Assign last index to all values B that are more than last A value
25 | while( j < n ) {
26 | C[j++] = m;
27 | }
28 | }
29 |
--------------------------------------------------------------------------------
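`closest.c` (compiled as `closest.exe` / `closest_x64.dll` / `phorealc.so`) does a single linear pass that, for each value in a sorted array `B`, records the index of the nearest value in a sorted array `A`. A pure-Python sketch of the same scan follows; note the C source writes `m` for values past the end of `A`, which looks like a deliberate sentinel (or an off-by-one), while this sketch clamps to the last valid index `m - 1`. The `closest_indices` name is illustrative, not a PhoREAL API:

```python
def closest_indices(a, b):
    # a and b must be sorted ascending, as the C version assumes
    idx, i, m = [], 0, len(a)
    for v in b:
        if v < a[0]:
            idx.append(0)              # below range: first index
            continue
        # Advance until v lies between a[i] and a[i+1]
        while i + 1 < m and v >= a[i + 1]:
            i += 1
        if i + 1 == m:
            idx.append(m - 1)          # past range (C source writes m here)
        elif v - a[i] < a[i + 1] - v:
            idx.append(i)              # closer to the left neighbour
        else:
            idx.append(i + 1)          # closer to (or tied with) the right
    return idx

print(closest_indices([0.0, 1.0, 2.0], [-0.5, 0.4, 0.6, 1.9, 5.0]))
```

Because both arrays are sorted, the inner `while` never rewinds, so the whole pairing costs O(len(a) + len(b)) rather than O(len(a) * len(b)).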
/phoreal/closest.exe:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/closest.exe
--------------------------------------------------------------------------------
/phoreal/closest_x64.dll:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/closest_x64.dll
--------------------------------------------------------------------------------
/phoreal/gui_addins.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | """
3 | Simple script to load non-Python files to help bundle EXE
4 |
5 | Copyright 2019 Applied Research Laboratories, University of Texas at Austin
6 |
7 | This package is free software; the copyright holder gives unlimited
8 | permission to copy and/or distribute, with or without modification, as
9 | long as this notice is preserved.
10 |
11 | Authors:
12 | Mike Alonzo
13 | Eric Guenther
14 |
15 | Date: September 20, 2019
16 | """
17 |
18 | # Import modules
19 | import os
20 |
21 | root = os.path.dirname(__file__)
22 |
23 | superFilterFile_windows = os.path.join(root,'closest_x64.dll')
24 |
25 | superFilterFile_linux = os.path.join(root,'phorealc.so')
26 |
27 | viewerBlank_html = os.path.join(root,'Viewer_blank.html')
28 |
29 | viewerBlankOnline_html = os.path.join(root,'Viewer_Online_blank.html')
--------------------------------------------------------------------------------
/phoreal/laszip.exe:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/laszip.exe
--------------------------------------------------------------------------------
/phoreal/pho_image.ico:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/pho_image.ico
--------------------------------------------------------------------------------
/phoreal/phorealc.so:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/icesat-2UT/PhoREAL/76e3227cc19887015f26064b965f562614e8fede/phoreal/phorealc.so
--------------------------------------------------------------------------------
/phoreal/plot.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | """
3 | This script provides basic plotting functionality for PhoREAL
4 |
5 | Copyright 2019 Applied Research Laboratories, University of Texas at Austin
6 |
7 | This package is free software; the copyright holder gives unlimited
8 | permission to copy and/or distribute, with or without modification, as
9 | long as this notice is preserved.
10 |
11 | Authors:
12 | Mike Alonzo
13 | Eric Guenther
14 |
15 | Date: September 20, 2019
16 | """
17 |
18 | # Import modules
19 | import matplotlib.pyplot as plt
20 | import os
21 | import numpy as np
22 | import pickle as pkl
23 | from tkinter import messagebox
24 |
25 | ## Import modules for lasso tool
26 | #from icesatIO import writeArrayToCSV
27 | #from matplotlib.widgets import LassoSelector
28 | #from matplotlib.path import Path
29 | #import glob
30 | #from matplotlib.backend_tools import ToolToggleBase
31 | #plt.rcParams['toolbar'] = 'toolmanager'
32 |
33 |
34 |
35 | BROWN = np.array([89,60,31]) / 255 # for ground b/c matplotlib brown is terrible
36 |
37 |
38 | # Function to make contour plot
39 | def plotContour(resultsCrossTrackShift, resultsAlongTrackShift, resultsMAE,
40 | atlMeasuredData, atlCorrections, rasterResolution, showPlots,
41 | outFilePath, counter):
42 |
43 | # Make contour plot
44 | plt.ioff()
45 | plt.figure()
46 | plt.contour(resultsCrossTrackShift, resultsAlongTrackShift, resultsMAE, colors = 'black')
47 | plt.contourf(resultsCrossTrackShift, resultsAlongTrackShift, resultsMAE, cmap = 'viridis')
48 | legendStr = 'Min MAE = ' + '{:0.3f}'.format(atlCorrections.mae[0]) + ' m'
49 | plt.plot(atlCorrections.crossTrack, atlCorrections.alongTrack, 'ro', markersize = 7, markerfacecolor = 'r', label = legendStr)
50 | plt.axis('equal')
51 | plt.axis('tight')
52 |     plt.grid(visible = True, which = 'major', axis = 'both')
53 | plt.xlabel('Cross-Track Offset (m)')
54 | plt.ylabel('Along-Track Offset (m)')
55 | titleStr = atlMeasuredData.atl03FileName + ' (' + atlMeasuredData.gtNum + '): ' + atlMeasuredData.trackDirection + '\n' + \
56 | 'Observed Data Correction (' + str(rasterResolution) + ' m Raster)\n' + \
57 | 'Easting, Northing, Vertical: ' + '{:0.1f}'.format(atlCorrections.easting[0]) + ', ' + '{:0.1f}'.format(atlCorrections.northing[0]) + ', ' + '{:0.1f}'.format(atlCorrections.z[0]) + ' m \n' + \
58 | 'Cross-Track, Along-Track: ' + '{:0.0f}'.format(atlCorrections.crossTrack[0]) + ', ' + '{:0.0f}'.format(atlCorrections.alongTrack[0]) + ' m'
59 | plt.title(titleStr, fontsize = 10, fontweight = 'bold')
60 | plt.tight_layout()
61 | plt.legend(loc = 'upper left')
62 |     cbar = plt.colorbar()
63 | cbar.set_label('Vertical Mean Absolute Error (m)')
64 |
65 | # Save plot
66 | if(not os.path.exists(os.path.normpath(outFilePath))):
67 | os.mkdir(os.path.normpath(outFilePath))
68 | # EndIf
69 | plotNum = counter + 1
70 | outNameBase = atlMeasuredData.atl03FileName + '_' + atlMeasuredData.gtNum + '_fig_Contour' + '{:0.0f}'.format(plotNum) + '_' + str(rasterResolution) + 'm'
71 | outPath = os.path.normpath(outFilePath + '/' + outNameBase + '.png')
72 | plt.savefig(outPath, bbox_inches='tight')
73 |
74 | # Show plot
75 | if(showPlots):
76 | plt.show()
77 | # EndIf
78 |
79 | # Close figure
80 | plt.close()
81 |
82 | # endDef
83 |
84 | # Function to make Z vs Y plot
85 | def plotZY(measRasterYCommonFinal, measRasterZCommonFinal, truthRasterYCommonFinal,
86 | truthRasterZCommonFinal, zErrorCommonFinal, segmentErrorY, segmentErrorZ,
87 | atlMeasuredData, atlCorrections, rasterResolution,
88 | useMeasSigConf, filterData, showPlots, outFilePath, counter):
89 |
90 | # Define plot colors
91 | myBlue = (0/255, 114/255, 178/255)
92 | myOrange = (213/255, 94/255, 0/255)
93 | myYellow = (230/255, 159/255, 0/255)
94 |
95 | # Open 2 subplots
96 | plt.ioff()
97 | f, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
98 |
99 | # Make Z vs Y subplot 1
100 | if(useMeasSigConf):
101 | titleLabel = 'Using Signal Confidence = %s' %filterData
102 | else:
103 | titleLabel = 'Using Ground Value = %s' % filterData
104 | # endIf
105 | ax1.plot(measRasterYCommonFinal/1000, measRasterZCommonFinal, 'o', color = myOrange, ms=0.7, label = 'Shifted ICESat-2 Mean Raster', zorder=1)
106 | ax1.plot(truthRasterYCommonFinal/1000, truthRasterZCommonFinal, 'o', color = myBlue, ms=0.7, label = 'Reference Mean Raster', zorder=0)
107 | ax1.axis('tight')
108 |     ax1.grid(visible = True, which = 'major', axis = 'both')
109 | ax1.set_ylabel('Z (m)')
110 | ax1.legend(loc = 'upper left')
111 | titleStr = '\n' + atlMeasuredData.atl03FileName + ' (' + atlMeasuredData.gtNum + '): ' + atlMeasuredData.trackDirection + '\n' + \
112 | 'Shifted ICESat-2 and Reference Data (' + str(rasterResolution) + ' m Raster)\n' + \
113 | titleLabel
114 | ax1.set_title(titleStr, fontsize = 10, fontweight = 'bold')
115 |
116 | # Make Z Error vs Y subplot 2
117 | ax2.plot(truthRasterYCommonFinal/1000, zErrorCommonFinal, 'o', color = myBlue, ms=0.7, label = 'Z Error')
118 | ax2.plot(segmentErrorY/1000, segmentErrorZ, color = myYellow, label = '100 m Segment Mean Error')
119 | ax2.axis('tight')
120 | ax2.grid(True, which = 'major', axis = 'both')
121 | ax2.set_ylabel('Z Error (m)')
122 | ax2.set_xlabel('UTM Northing (km)')
123 | ax2.legend(loc = 'upper left')
124 | titleStr = 'MAE = ' + '{:0.2f}'.format(atlCorrections.mae[0]) + ' m, RMSE = ' + '{:0.2f}'.format(atlCorrections.rmse[0]) + ' m, Mean Error = ' + '{:0.2f}'.format(atlCorrections.me[0]) + ' m\n'
125 | ax2.set_title(titleStr)
126 | f.tight_layout(pad=1.0)
127 |
128 | # Save plot
129 | if(not os.path.exists(os.path.normpath(outFilePath))):
130 | os.makedirs(os.path.normpath(outFilePath), exist_ok=True)
131 | # EndIf
132 | plotNum = counter + 1
133 | outNameBase = atlMeasuredData.atl03FileName + '_' + atlMeasuredData.gtNum + '_fig_ZY' + '{:0.0f}'.format(plotNum) + '_' + str(rasterResolution) + 'm'
134 | outPath = os.path.normpath(outFilePath + '/' + outNameBase + '.png')
135 | plt.savefig(outPath, bbox_inches='tight')
136 | outPathPickle = os.path.normpath(outFilePath + '/' + outNameBase + '.pkl')
137 | with open(outPathPickle, 'wb') as pklOut: pkl.dump(f, pklOut) # close file handle promptly
138 |
139 | # Show plot
140 | if(showPlots):
141 | plt.show()
142 | # EndIf
143 |
144 | # Close figure
145 | plt.close()
146 |
147 | # endDef
148 |
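The ZY/ZT subplot titles above report MAE, RMSE, and Mean Error pulled from `atlCorrections`. A minimal sketch of how those three statistics follow from a vertical-error array (NumPy only; the function name here is illustrative, not part of PhoREAL):

```python
import numpy as np

def error_stats(zError):
    """Summarize a vertical error array the way the plot titles do.

    Returns (mae, rmse, me): mean absolute error, root-mean-square
    error, and mean (signed) error, in the units of zError.
    """
    zError = np.asarray(zError, dtype=float)
    mae = np.mean(np.abs(zError))       # magnitude of error
    rmse = np.sqrt(np.mean(zError**2))  # penalizes large outliers
    me = np.mean(zError)                # signed bias
    return mae, rmse, me

# Symmetric +/-1 m errors: MAE = RMSE = 1 m, Mean Error = 0 m
mae, rmse, me = error_stats([1.0, -1.0, 1.0, -1.0])
```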
149 | # Function to make Z vs T plot
150 | def plotZT(measRasterTCommonFinal, measRasterZCommonFinal, zErrorCommonFinal,
151 | atlMeasuredData, atlCorrections, rasterResolution,
152 | useMeasSigConf, filterData, showPlots, outFilePath, counter):
153 |
154 | # Define plot colors
155 | myBlue = (0/255, 114/255, 178/255)
156 | myOrange = (213/255, 94/255, 0/255)
157 |
158 | # Open 2 subplots
159 | plt.ioff()
160 | f, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
161 |
162 | # Make Z vs T subplot 1
163 | if(useMeasSigConf):
164 | titleLabel = 'Using Signal Confidence = %s' % filterData
165 | else:
166 | titleLabel = 'Using Ground Value = %s' % filterData
167 | # endIf
168 | ax1.plot(measRasterTCommonFinal, measRasterZCommonFinal, 'o', color = myOrange, ms=0.7)
169 | ax1.axis('tight')
170 | ax1.grid(True, which = 'major', axis = 'both')
171 | ax1.set_ylabel('Z (m)')
172 | titleStr = '\n' + atlMeasuredData.atl03FileName + ' (' + atlMeasuredData.gtNum + '): ' + atlMeasuredData.trackDirection + '\n' + \
173 | 'ICESat-2 Time Data (' + str(rasterResolution) + ' m Raster)\n' + \
174 | titleLabel
175 | ax1.set_title(titleStr, fontsize = 10, fontweight = 'bold')
176 |
177 | # Make Z Error vs T subplot 2
178 | ax2.plot(measRasterTCommonFinal, zErrorCommonFinal, 'o', color = myBlue, ms=0.7)
179 | ax2.axis('tight')
180 | ax2.grid(True, which = 'major', axis = 'both')
181 | ax2.set_ylabel('Z Error (m)')
182 | ax2.set_xlabel('Time (sec)')
183 | titleStr = 'MAE = ' + '{:0.2f}'.format(atlCorrections.mae[0]) + ' m, RMSE = ' + '{:0.2f}'.format(atlCorrections.rmse[0]) + ' m, Mean Error = ' + '{:0.2f}'.format(atlCorrections.me[0]) + ' m\n'
184 | ax2.set_title(titleStr)
185 | f.tight_layout(pad=1.0)
186 |
187 | # Save plot
188 | if(not os.path.exists(os.path.normpath(outFilePath))):
189 | os.makedirs(os.path.normpath(outFilePath), exist_ok=True)
190 | # EndIf
191 | plotNum = counter + 1
192 | outNameBase = atlMeasuredData.atl03FileName + '_' + atlMeasuredData.gtNum + '_fig_ZT' + '{:0.0f}'.format(plotNum) + '_' + str(rasterResolution) + 'm'
193 | outPath = os.path.normpath(outFilePath + '/' + outNameBase + '.png')
194 | plt.savefig(outPath, bbox_inches='tight')
195 |
196 | # Show plot
197 | if(showPlots):
198 | plt.show()
199 | # EndIf
200 |
201 | # Close figure
202 | plt.close()
203 |
204 | # endDef
205 |
206 | ## Lasso Selector Tool
207 | #class SelectFromCollection(object):
208 | #
209 | # def __init__(self, ax, collection, alpha_other=0.3):
210 | #
211 | #
212 | ## xys = []
213 | ## Npts = []
214 | ## fc = []
215 | ## ec = []
216 | ## ind = []
217 | ##
218 | ## for i in range(0,len(collection)):
219 | ##
220 | ## xysSingle = collection[i].get_offsets().data
221 | ## NptsSingle = len(xysSingle)
222 | ## fcSingle = np.tile(collection[i].get_facecolors(),(NptsSingle,1))
223 | ## ecSingle = np.tile(collection[i].get_edgecolors(),(NptsSingle,1))
224 | ## indSingle = []
225 | ##
226 | ## xys.append(xysSingle)
227 | ## Npts.append(NptsSingle)
228 | ## fc.append(fcSingle)
229 | ## ec.append(ecSingle)
230 | ## ind.append(indSingle)
231 | ##
232 | ## # endFor
233 | ##
234 | ## self.xys = xys
235 | ## self.Npts = Npts
236 | ## self.fc = fc
237 | ## self.ec = ec
238 | ## self.ind = ind
239 | #
240 | # self.canvas = ax.figure.canvas
241 | # self.collection = collection
242 | # self.ax = ax
243 | # self.alpha_other = alpha_other
244 | # self.xys = collection.get_offsets().data
245 | # self.Npts = len(self.xys)
246 | # self.baseColor = collection.get_facecolors()
247 | # self.fc = []
248 | # self.fc = np.tile(collection.get_facecolors(),(self.Npts,1))
249 | # self.ind = []
250 | # self.lasso = LassoSelector(ax, onselect=self.onselect)
251 | # self.active = True
252 | # # endDef
253 | #
254 | # def onselect(self, verts):
255 | #
256 | # # Get vertices of lasso
257 | # path = Path(verts)
258 | #
259 | ## for i in range(0,len(self.collection)):
260 | ## self.fc[i] = self.ec[i]
261 | ## self.ind[i] = np.nonzero(path.contains_points(self.xys[i]))[0]
262 | ## self.fc[i][self.ind[i],0] = 1
263 | ## self.fc[i][self.ind[i],1] = 0
264 | ## self.fc[i][self.ind[i],2] = 0
265 | ## self.collection[i].set_edgecolors(self.fc[i][:,:])
266 | ## # endFor
267 | #
268 | # self.ind = []
269 | # self.ind = np.nonzero(path.contains_points(self.xys))[0]
270 | # fc = self.fc
271 | # fc[:,:] = self.baseColor
272 | # fc[self.ind,0] = 1
273 | # fc[self.ind,1] = 0
274 | # fc[self.ind,2] = 0
275 | # self.collection.set_edgecolors(fc)
276 | # self.canvas.blit(self.ax.bbox)
277 | # self.active = True
278 | # # endDef
279 | #
280 | # def disconnect(self):
281 | # self.lasso.disconnect_events()
282 | #
283 | ## for i in range(0,len(self.collection)):
284 | ## self.collection[i].set_edgecolors(self.ec[i][:,:])
285 | ## # endFor
286 | #
287 | # self.ind = []
288 | # self.collection.set_edgecolors(self.baseColor)
289 | # self.canvas.blit(self.ax.bbox)
290 | # self.active = False
291 | # # endDef
292 | #
293 | # def revert(self):
294 | #
295 | ## for i in range(0,len(self.collection)):
296 | ## self.collection[i].set_edgecolors(self.ec[i][:,:])
297 | ## # endFor
298 | #
299 | # self.ind = []
300 | # self.collection.set_edgecolors(self.baseColor)
301 | # self.canvas.blit(self.ax.bbox)
302 | # self.active = True
303 | # # endDef
304 | #
305 | ## endClass
306 |
307 | #def getPlotPts(self, fig, ax, pts, outPath, origTitle, atl03Data, onState):
308 | #
309 | # # Call point selector
310 | # selector = SelectFromCollection(ax, pts)
311 | #
312 | # if(onState):
313 | #
314 | # # Set directions for user in figure title
315 | # ax.set_title('Left-Click to Select Points, "Enter" to Accept, "Esc" to Exit')
316 | #
317 | # # Keyboard Button callback
318 | # def accept(event):
319 | #
320 | # # # Get selected points
321 | # # selected_time = []
322 | # # selected_lat = []
323 | # # selected_lon = []
324 | # # selected_easting = []
325 | # # selected_northing = []
326 | # # selected_crossTrack = []
327 | # # selected_alongTrack = []
328 | # # selected_z = []
329 | # # selected_classification = []
330 | # # selected_signalConf = []
331 | # #
332 | # # for i in range(0,len(selector.ind)):
333 | # #
334 | # # selected_time = np.append(selected_time,atl03Data.time[selector.ind[i]])
335 | # # selected_lat = np.append(selected_lat,atl03Data.lat[selector.ind[i]])
336 | # # selected_lon = np.append(selected_lon,atl03Data.lon[selector.ind[i]])
337 | # # selected_easting = np.append(selected_easting,atl03Data.easting[selector.ind[i]])
338 | # # selected_northing = np.append(selected_northing,atl03Data.northing[selector.ind[i]])
339 | # # selected_crossTrack = np.append(selected_crossTrack,atl03Data.crossTrack[selector.ind[i]])
340 | # # selected_alongTrack = np.append(selected_alongTrack,atl03Data.alongTrack[selector.ind[i]])
341 | # # selected_z = np.append(selected_z,atl03Data.z[selector.ind[i]])
342 | # # selected_classification = np.append(selected_classification,atl03Data.classification[selector.ind[i]])
343 | # # selected_signalConf = np.append(selected_signalConf,atl03Data.signalConf[selector.ind[i]])
344 | # #
345 | # # # endFor
346 | #
347 | # # 'Enter' button callback
348 | # if(event.key == 'enter'):
349 | #
350 | # # Get selected points
351 | # selected_time = atl03Data.time[selector.ind]
352 | # selected_lat = atl03Data.lat[selector.ind]
353 | # selected_lon = atl03Data.lon[selector.ind]
354 | # selected_easting = atl03Data.easting[selector.ind]
355 | # selected_northing = atl03Data.northing[selector.ind]
356 | # selected_crossTrack = atl03Data.crossTrack[selector.ind]
357 | # selected_alongTrack = atl03Data.alongTrack[selector.ind]
358 | # selected_z = atl03Data.z[selector.ind]
359 | # selected_classification = atl03Data.classification[selector.ind]
360 | # selected_signalConf = atl03Data.signalConf[selector.ind]
361 | #
362 | # # Get total number of selected points
363 | # numPts = len(selected_time)
364 | #
365 | # if(numPts>0):
366 | #
367 | # # Update directions in title
368 | # ax.set_title('%s Points Saved\nLeft-Click to Select Again, "Enter" to Accept, "Esc" to Exit' % numPts)
369 | #
370 | # # Get output file name
371 | # outName = 'selectedPoints*.csv'
372 | #
373 | # # Get .csv headers
374 | # if(atl03Data.zone=='3413' or atl03Data.zone=='3976'):
375 | #
376 | # namelist = ['Time (sec)', 'Latitude (deg)', 'Longitude (deg)', \
377 | # 'Polar Stereo X (m)', 'Polar Stereo Y (m)', \
378 | # 'Cross-Track (m)', 'Along-Track (m)', \
379 | # 'Height (m)', \
380 | # 'Classification', 'Signal Confidence']
381 | # else:
382 | #
383 | # namelist = ['Time (sec)', 'Latitude (deg)', 'Longitude (deg)', \
384 | # 'Easting (m)', 'Northing (m)', \
385 | # 'Cross-Track (m)', 'Along-Track (m)', \
386 | # 'Height (m)', \
387 | # 'Classification', 'Signal Confidence']
388 | # # endIf
389 | #
390 | # # Get .csv data
391 | # datalist = [selected_time, selected_lat, selected_lon, \
392 | # selected_easting, selected_northing, \
393 | # selected_crossTrack, selected_alongTrack,\
394 | # selected_z, \
395 | # selected_classification, selected_signalConf]
396 | #
397 | # # Get output file name (append _N.txt for file number)
398 | # outFilePath = os.path.normpath(outPath + '/' + outName)
399 | # matchingOutFiles = glob.glob(outFilePath)
400 | # if(matchingOutFiles):
401 | # matchingNums = np.array([])
402 | # for numFile in range(0,len(matchingOutFiles)):
403 | # matchingFile = os.path.basename(matchingOutFiles[numFile])
404 | # matchingNum = matchingFile.split('_')[2]
405 | # matchingNums = np.append(matchingNums,matchingNum)
406 | # # endFor
407 | # matchingNumMax = max(matchingNums)
408 | # matchingNumNew = str(int(matchingNumMax) + 1)
409 | # outFile = os.path.normpath(outPath + '/selectedPoints_file_' + matchingNumNew + '_pts_' + str(numPts) + '.csv')
410 | # else:
411 | # outFile = os.path.normpath(outPath + '/selectedPoints_file_1_pts_' + str(numPts) + '.csv')
412 | # # endIf
413 | #
414 | # # Write output data to .csv file
415 | # writeArrayToCSV(outFile, namelist, datalist)
416 | #
417 | # # Revert to original color scheme
418 | # selector.revert()
419 | #
420 | # else:
421 | #
422 | # # Update directions in title
423 | # ax.set_title('No Points Saved\nLeft-Click to Select Again, "Enter" to Accept, "Esc" to Exit')
424 | #
425 | # # endIf
426 | # # endIf
427 | #
428 | # # 'Escape' button callback
429 | # if(event.key == 'escape'):
430 | # selector.disconnect()
431 | # ax.set_title(origTitle)
432 | #
433 | # self.disable()
434 | #
435 | # # endIf
436 | #
437 | # # Update figure
438 | # fig.canvas.draw_idle()
439 | #
440 | # # endDef
441 | #
442 | # # Mouse-Button Release Callback
443 | # def release(event):
444 | #
445 | # if(selector.active):
446 | #
447 | # # # Get number of selected points
448 | # # numPts = 0
449 | # # for i in range(0,len(selector.ind)):
450 | # # numPts += len(selector.ind[i])
451 | # # # endFor
452 | #
453 | # # Get number of selected points
454 | # numPts = len(selector.ind)
455 | #
456 | # # Update figure/title
457 | # ax.set_title('%s Points Selected\nLeft-Click to Select Again, "Enter" to Accept, "Esc" to Exit' % numPts)
458 | # fig.canvas.draw_idle()
459 | #
460 | # # endIf
461 | # # endDef
462 | #
463 | #
464 | # # Create callback when keyboard buttons are pressed
465 | # fig.canvas.mpl_connect('key_press_event', accept)
466 | #
467 | # # Create callback when mouse-button is released
468 | # fig.canvas.mpl_connect('button_release_event', release)
469 | #
470 | # else:
471 | #
472 | # selector.disconnect()
473 | # ax.set_title(origTitle)
474 | # fig.canvas.draw_idle()
475 | #
476 | # # endIf
477 | ## endDef
478 |
479 |
480 | #class getPoints(ToolToggleBase):
481 | #
482 | # default_toggled = False
483 | #
484 | # def __init__(self, *args, figNum, ax, pts, outPath, origTitle, atl03Data, **kwargs):
485 | #
486 | # # Initialize base variables
487 | # self.figNum = figNum
488 | # self.ax = ax
489 | # self.pts = pts
490 | # self.outPath = outPath
491 | # self.origTitle = origTitle
492 | # self.atl03Data = atl03Data
493 | #
494 | # # Call super-initialized variables
495 | # super().__init__(*args, **kwargs)
496 | #
497 | #
498 | # def enable(self, *args, **kwargs):
499 | #
500 | # # Get inputs for getPlotPts function
501 | # figNum = self.figNum
502 | # ax = self.ax
503 | # pts = self.pts
504 | # outPath = self.outPath
505 | # origTitle = self.origTitle
506 | # atl03Data = self.atl03Data
507 | # onState = True
508 | #
509 | # # Call getPlotPts function
510 | # getPlotPts(self, figNum, ax, pts, outPath, origTitle, atl03Data, onState)
511 | #
512 | #
513 | # def disable(self, *args, **kwargs):
514 | #
515 | # # Get inputs for getPlotPts function
516 | # figNum = self.figNum
517 | # ax = self.ax
518 | # pts = self.pts
519 | # outPath = self.outPath
520 | # origTitle = self.origTitle
521 | # atl03Data = self.atl03Data
522 | # onState = False
523 | #
524 | # # Call getPlotPts function
525 | # getPlotPts(self, figNum, ax, pts, outPath, origTitle, atl03Data, onState)
526 | #
527 | ## endClass
528 |
529 |
530 | # Function to plot any x,y data from getAtlMeasuredSwath_auto.py
531 | def getPlot(xData, yData, xLabel, yLabel, title, outPath, origTitle,
532 | filterType = [], filterData = [], filterNum = []):
533 |
534 | # Open figure window
535 | plt.ion()
536 | fig1 = plt.figure()
537 |
538 | if(np.size(filterData) > 0):
539 |
540 | # Loop through filter numbers
541 | for i in range(0,np.size(filterNum)):
542 |
543 | # Filter data
544 | matchVal = filterNum[i]
545 | matchingInds = filterData == matchVal
546 | if(np.any(matchingInds)): # any points match this filter value
547 |
548 | xDataPlot = xData[matchingInds]
549 | yDataPlot = yData[matchingInds]
550 |
551 | if('class' in filterType.lower()):
552 | if(matchVal < 1):
553 | matchStr = 'ATL03 Unclassified'
554 | myColor = [194/255, 197/255, 204/255] # Gray
555 | elif(matchVal==1):
556 | matchStr = 'ATL03 Ground'
557 | myColor = [210/255, 184/255, 38/255] # Brown
558 | elif(matchVal==2):
559 | matchStr = 'ATL03 Canopy'
560 | myColor = [69/255, 129/255, 26/255] # Dark Green
561 | elif(matchVal==3):
562 | matchStr = 'ATL03 Top of Canopy'
563 | myColor = [133/255, 243/255, 52/255] # Light Green
564 | # endIf
565 | else:
566 | if(matchVal==0):
567 | myColor = [194/255, 197/255, 204/255] # Gray
568 | elif(matchVal==1):
569 | myColor = [0, 0, 0] # Black
570 | elif(matchVal==2):
571 | myColor = [69/255, 129/255, 26/255] # Green
572 | elif(matchVal==3):
573 | myColor = [1, 0, 0] # Red
574 | elif(matchVal==4):
575 | myColor = [0, 0, 1] # Blue
576 | # endIf
577 |
578 | if('yapc' in filterType.lower()):
579 | matchStr = 'YAPC Sig Conf = ' + str(matchVal)
580 | else:
581 | matchStr = 'ATL03 Sig Conf = ' + str(matchVal)
582 | # endIf
583 |
584 | # endIf
585 |
586 | # Plot filtered data
587 | # plt.scatter(xDataPlot, yDataPlot, color=myColor, label=matchStr, s=0.7)
588 | plt.plot(xDataPlot, yDataPlot, '.', color=myColor, label=matchStr, markersize=2)
589 |
590 | # endIf
591 |
592 | # endFor
593 |
594 | plt.legend(loc = 'upper left')
595 |
596 | else:
597 |
598 | # Plot all data
599 | # plt.scatter(xData, yData, color=[194/255, 197/255, 204/255], s=0.7, label='ATL03 Data')
600 | lblStr = 'ATL03 Data'
601 | plt.plot(xData, yData, '.', color=[194/255, 197/255, 204/255], markersize=2, label=lblStr)
602 | plt.legend(loc = 'upper left')
603 |
604 | # endIf
605 |
606 | # Figure properties
607 | plt.axis('tight')
608 | plt.grid(True, which = 'major', axis = 'both')
609 | plt.xlabel(xLabel)
610 | plt.ylabel(yLabel)
611 | plt.title(title)
612 |
613 | # Get figure handles
614 | # figNum = plt.gcf()
615 | # ax = plt.gca()
616 | # pts = ax.collections[-1]
617 |
618 | # Route 'Select Points' button to matplotlib figure window
619 | # fig1.canvas.manager.toolmanager.add_tool('Select Points', getPoints, figNum = figNum, ax = ax, pts = pts, outPath = outPath, origTitle = origTitle, atl03Data = atl03Data)
620 | # fig1.canvas.manager.toolbar.add_tool('Select Points', 'navigation', 3)
621 |
622 | # Show plot
623 | fig1.show()
624 |
625 | # endDef
626 |
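A hypothetical call to `getPlot` with synthetic data, using the classification codes handled above (0 unclassified, 1 ground, 2 canopy, 3 top of canopy); the call itself is left commented since it opens an interactive figure:

```python
import numpy as np

# Synthetic along-track distance / height / classification arrays
n = 500
xData = np.linspace(0, 1000, n)                 # along-track (m)
yData = 100 + 5*np.sin(xData/100)               # height (m)
classData = np.random.choice([0, 1, 2, 3], n)   # ATL03-style classes

# getPlot(xData, yData, 'Along-Track (m)', 'Height (m)',
#         'Synthetic ATL03 Swath', './plots', 'Synthetic ATL03 Swath',
#         filterType='class', filterData=classData, filterNum=[1, 2, 3])
```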
627 | # Function to plot ATL08 data from getAtlMeasuredSwath_auto.py
628 | def getPlot_atl08(xData, yData, xLabel, yLabel, title, yName):
629 |
630 | # Get figure window
631 | figsExist = [x.num for x in plt._pylab_helpers.Gcf.get_all_fig_managers()]
632 |
633 | # Open figure window
634 | plt.ion()
635 | figNum = plt.gcf().number
636 | fig1 = plt.figure(figNum)
637 | if(len(figsExist)>0):
638 | ax = plt.gca()
639 | xlim = ax.get_xlim()
640 | ylim = ax.get_ylim()
641 | # endIf
642 |
643 | if('canopy' in yName.lower()):
644 | matchStr = 'ATL08 Max Canopy'
645 | myColor = [69/255, 129/255, 26/255]
646 | elif('bestfit' in yName.lower()):
647 | matchStr = 'ATL08 Terrain Best Fit'
648 | myColor = [210/255, 184/255, 38/255]
649 | elif('median' in yName.lower()):
650 | matchStr = 'ATL08 Terrain Median'
651 | myColor = [148/255, 0/255, 211/255]
652 | else: matchStr = 'ATL08 Data'; myColor = [0, 0, 0] # fallback for unrecognized yName
653 |
654 | # Plot all data
655 | plt.plot(xData, yData, 'o', markerfacecolor=myColor, markeredgecolor='k', label=matchStr)
656 |
657 | # Figure properties
658 | plt.legend(loc = 'upper left')
659 | plt.axis('tight')
660 | plt.grid(True, which = 'major', axis = 'both')
661 | plt.xlabel(xLabel)
662 | plt.ylabel(yLabel)
663 | plt.title(title)
664 | plt.draw()
665 | fig1.show()
666 | if(len(figsExist)>0):
667 | ax.set_xlim(xlim)
668 | ax.set_ylim(ylim)
669 | # endIf
670 | # endDef
671 |
672 |
673 | # Function to plot ATL03 reference DEM data from getAtlMeasuredSwath_auto.py
674 | def getPlotDEM(xData, yData):
675 |
676 | # Get figure window
677 | figsExist = [x.num for x in plt._pylab_helpers.Gcf.get_all_fig_managers()]
678 |
679 | # Open figure window
680 | plt.ion()
681 | figNum = plt.gcf().number
682 | fig1 = plt.figure(figNum)
683 | if(len(figsExist)>0):
684 | ax = plt.gca()
685 | xlim = ax.get_xlim()
686 | ylim = ax.get_ylim()
687 | # endIf
688 |
689 | # Get string and color
690 | matchStr = 'ATL03 Ref DEM'
691 | myColor = [148/255, 0/255, 211/255] # Purple
692 |
693 | # Plot all data
694 | plt.plot(xData, yData, 's', color=myColor, label=matchStr, markersize=2, zorder=0)
695 |
696 | # Figure properties
697 | plt.legend(loc = 'upper left')
698 | plt.axis('tight')
699 | plt.grid(True, which = 'major', axis = 'both')
700 | plt.draw()
701 | fig1.show()
702 | if(len(figsExist)>0):
703 | ax.set_xlim(xlim)
704 | ax.set_ylim(ylim)
705 | # endIf
706 | # endDef
707 |
708 |
709 | # Function to plot Truth data from getAtlTruthSwath_auto.py
710 | def getPlot_truth(xData, yData, xLabel, yLabel, title, yName):
711 |
712 | # Define plot colors
713 | myColor = [0.3, 0.3, 0.3]
714 |
715 | # Get figure window
716 | figsExist = [x.num for x in plt._pylab_helpers.Gcf.get_all_fig_managers()]
717 |
718 | # Open figure window
719 | plt.ion()
720 | figNum = plt.gcf().number
721 | fig1 = plt.figure(figNum)
722 | if(len(figsExist)>0):
723 | ax = plt.gca()
724 | xlim = ax.get_xlim()
725 | ylim = ax.get_ylim()
726 | # endIf
727 |
728 | # Plot all data
729 | # plt.scatter(xData, yData, color=myColor, label=yName, s=1.0, zorder=0)
730 | plt.plot(xData, yData, 's', color=myColor, label=yName, markersize=1, zorder=0)
731 |
732 | # Figure properties
733 | plt.legend(loc = 'upper left')
734 | plt.axis('tight')
735 | plt.grid(True, which = 'major', axis = 'both')
736 | plt.draw()
737 | fig1.show()
738 | if(len(figsExist)>0):
739 | ax.set_xlim(xlim)
740 | ax.set_ylim(ylim)
741 | # endIf
742 | # endDef
743 |
744 |
745 | # Function to plot Corrected Measured data from getMeasurementError_auto.py
746 | def getPlot_measCorr(xData, yData, xLabel, yLabel, title, outPath, origTitle,
747 | filterType = [], filterData = [], filterNum = []):
748 |
749 | # Get figure window
750 | figsExist = [x.num for x in plt._pylab_helpers.Gcf.get_all_fig_managers()]
751 |
752 | # Open figure window
753 | plt.ion()
754 | figNum = plt.gcf().number
755 | fig1 = plt.figure(figNum)
756 | if(len(figsExist)>0):
757 | ax = plt.gca()
758 | xlim = ax.get_xlim()
759 | ylim = ax.get_ylim()
760 | # endIf
761 |
762 | if(np.size(filterData) > 0):
763 |
764 | # Loop through filter numbers
765 | for i in range(0,np.size(filterNum)):
766 |
767 | # Filter data
768 | matchVal = filterNum[i]
769 | matchingInds = filterData == matchVal
770 | if(np.any(matchingInds)): # any points match this filter value
771 |
772 | xDataPlot = xData[matchingInds]
773 | yDataPlot = yData[matchingInds]
774 |
775 | if('class' in filterType.lower()):
776 | if(matchVal < 1):
777 | matchStr = 'Shifted ATL03 Unclassified'
778 | myColor = [194/255, 197/255, 204/255]
779 | elif(matchVal==1):
780 | matchStr = 'Shifted ATL03 Ground'
781 | myColor = [210/255, 184/255, 38/255]
782 | elif(matchVal==2):
783 | matchStr = 'Shifted ATL03 Canopy'
784 | myColor = [69/255, 129/255, 26/255]
785 | elif(matchVal==3):
786 | matchStr = 'Shifted ATL03 Top of Canopy'
787 | myColor = [133/255, 243/255, 52/255]
788 | # endIf
789 | else:
790 | if(matchVal==0):
791 | myColor = [194/255, 197/255, 204/255]
792 | elif(matchVal==1):
793 | myColor = [0, 0, 0]
794 | elif(matchVal==2):
795 | myColor = [69/255, 129/255, 26/255]
796 | elif(matchVal==3):
797 | myColor = [1, 0, 0]
798 | elif(matchVal==4):
799 | myColor = [0, 0, 1]
800 | # endIf
801 |
802 | matchStr = 'Shifted ATL03 Sig Conf = ' + str(matchVal)
803 |
804 | # endIf
805 |
806 | # Plot filtered data
807 | plt.plot(xDataPlot, yDataPlot, 's', color=myColor, label=matchStr, markersize=2)
808 |
809 | # endIf
810 | # endFor
811 |
812 | plt.legend(loc = 'upper left')
813 |
814 | else:
815 |
816 | # Plot all data
817 | lblStr = 'Shifted ATL03 Data'
818 | plt.plot(xData, yData, 's', color=[194/255, 197/255, 204/255], markersize=2, label=lblStr)
819 | plt.legend(loc = 'upper left')
820 |
821 | # endIf
822 |
823 | # Figure properties
824 | plt.axis('tight')
825 | plt.grid(True, which = 'major', axis = 'both')
826 | plt.draw()
827 | fig1.show()
828 | if(len(figsExist)>0):
829 | ax.set_xlim(xlim)
830 | ax.set_ylim(ylim)
831 | # endIf
832 |
833 | # endDef
834 |
835 | # Plot pickle file
836 | def getPklPlot(pklFile):
837 |
838 | with open(pklFile, 'rb') as pklIn: figx = pkl.load(pklIn) # close file handle promptly
839 | figx.show()
840 |
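`plotZY` above pickles its figure handle so `getPklPlot` can reopen it later. A minimal round-trip sketch under a headless Agg backend (the output path is illustrative; a pickled figure should be reloaded with a compatible Matplotlib version):

```python
import os
import tempfile
import pickle as pkl
import matplotlib
matplotlib.use('Agg')  # headless backend for this sketch
import matplotlib.pyplot as plt

# Build and pickle a figure, mirroring the plotZY save step
fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4], 'o-')
pklPath = os.path.join(tempfile.gettempdir(), 'example_fig.pkl')
with open(pklPath, 'wb') as fOut:
    pkl.dump(fig, fOut)
plt.close(fig)

# Reload it later, as getPklPlot does
with open(pklPath, 'rb') as fIn:
    figx = pkl.load(fIn)
```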
841 | def load_3d():
842 | # print('from mpl_toolkits.mplot3d import Axes3D')
843 | # from mpl_toolkits.mplot3d import Axes3D
844 | load_3D()
845 |
846 | def load_3D():
847 | # print('from mpl_toolkits.mplot3d import Axes3D')
848 | from mpl_toolkits.mplot3d import Axes3D
849 |
850 | def make_fig():
851 | import matplotlib.pyplot as plt
852 | fig = plt.figure()
853 | ax = fig.add_subplot(111)
854 | return fig, ax
855 |
856 | def plot(data, markerstyle='-', ms=3.0): #, y=None):
857 | """
858 | Just plots data for quick visual; doesn't
859 | interrupt ipython session like plt.figure
860 | Ex:
861 | import icesatPlot as ip
862 | ip.plot(arr,'.')
863 | """
864 | import matplotlib.pyplot as plt
865 | fig = plt.figure()
866 | ax = fig.add_subplot(111)
867 | # if (y == None).any():
868 | ax.plot(data, markerstyle, ms=ms)
869 | # else:
870 | # ax.plot(data, y, markerstyle, ms=ms)
871 | fig.show()
872 |
873 | def hist(data, bins):
874 | """
875 | Makes a simple histogram for quick visual;
876 | doesn't interrupt ipython session like plt.figure
877 | Ex:
878 | import icesatPlot as ip
879 | ip.hist(arr, 100)
880 | """
881 | import matplotlib.pyplot as plt
882 | fig = plt.figure()
883 | ax = fig.add_subplot(111)
884 | counts, binEdges = np.histogram(data, bins=bins) # avoid shadowing this function's name
885 | ax.plot(binEdges[:-1], counts)
886 | fig.show()
887 |
888 | # Add stats to plot
889 | def addStatsToPlot(indsToPlotTuple,xParam,yParam,yVar,yHt,statsDF):
890 |
891 | # Get figure window
892 | figsExist = [x.num for x in plt._pylab_helpers.Gcf.get_all_fig_managers()]
893 | figNum = plt.gcf().number
894 | fig = plt.figure(figNum)
895 | if(len(figsExist)>0):
896 | ax = plt.gca()
897 | xlim = ax.get_xlim()
898 | ylim = ax.get_ylim()
899 | # endIf
900 |
901 | if(len(indsToPlotTuple)==1):
902 |
903 | # Make plot interactive
904 | plt.ion()
905 |
906 | # Define face color scheme
907 | myBrown = [204/255, 102/255, 0/255]
908 | myMidGreen = [0/255, 200/255, 0/255]
909 | myWhite = [255/255, 255/255, 255/255]
910 |
911 | # Define edge color schemes
912 | myBlue = [0/255, 0/255, 255/255]
913 | myRed = [255/255, 0/255, 0/255]
914 | myGray = [160/255, 160/255, 160/255]
915 | myBlack = [0/255, 0/255, 0/255]
916 | myOrange = [255/255, 128/255, 0/255]
917 | myPurple = [102/255, 0/255, 204/255]
918 |
919 | # Set xDataGood and yDataGood values
920 | xDataGood = True
921 | yDataGood = True
922 | yBinGood = True
923 |
924 | # Get matching x parameter from dataframe
925 | if(xParam=='Segment ID'):
926 | statsDF_xName = 'seg_start_segment_id_interp'
927 | xData_start = (statsDF[statsDF_xName]).to_numpy()
928 | statsDF_xName = 'seg_end_segment_id_interp'
929 | xData_end = (statsDF[statsDF_xName]).to_numpy()
930 | elif(xParam=='deltaTime'):
931 | statsDF_xName = 'seg_start_delta_time_interp (sec)'
932 | xData_start = (statsDF[statsDF_xName]).to_numpy()
933 | statsDF_xName = 'seg_end_delta_time_interp (sec)'
934 | xData_end = (statsDF[statsDF_xName]).to_numpy()
935 | elif(xParam=='time'):
936 | statsDF_xName = 'seg_start_time_interp (sec)'
937 | xData_start = (statsDF[statsDF_xName]).to_numpy()
938 | statsDF_xName = 'seg_end_time_interp (sec)'
939 | xData_end = (statsDF[statsDF_xName]).to_numpy()
940 | elif(xParam=='lat'):
941 | statsDF_xName = 'seg_start_lat_interp (deg)'
942 | xData_start = (statsDF[statsDF_xName]).to_numpy()
943 | statsDF_xName = 'seg_end_lat_interp (deg)'
944 | xData_end = (statsDF[statsDF_xName]).to_numpy()
945 | elif(xParam=='lon'):
946 | statsDF_xName = 'seg_start_lon_interp (deg)'
947 | xData_start = (statsDF[statsDF_xName]).to_numpy()
948 | statsDF_xName = 'seg_end_lon_interp (deg)'
949 | xData_end = (statsDF[statsDF_xName]).to_numpy()
950 | elif(xParam=='easting'):
951 | statsDF_xName = 'seg_start_easting_interp (m)'
952 | xData_start = (statsDF[statsDF_xName]).to_numpy()
953 | statsDF_xName = 'seg_end_easting_interp (m)'
954 | xData_end = (statsDF[statsDF_xName]).to_numpy()
955 | elif(xParam=='northing'):
956 | statsDF_xName = 'seg_start_northing_interp (m)'
957 | xData_start = (statsDF[statsDF_xName]).to_numpy()
958 | statsDF_xName = 'seg_end_northing_interp (m)'
959 | xData_end = (statsDF[statsDF_xName]).to_numpy()
960 | elif(xParam=='crossTrack'):
961 | statsDF_xName = 'seg_start_crossTrack_interp (m)'
962 | xData_start = (statsDF[statsDF_xName]).to_numpy()
963 | statsDF_xName = 'seg_end_crossTrack_interp (m)'
964 | xData_end = (statsDF[statsDF_xName]).to_numpy()
965 | elif(xParam=='alongTrack'):
966 | statsDF_xName = 'seg_start_alongTrack_interp (m)'
967 | xData_start = (statsDF[statsDF_xName]).to_numpy()
968 | statsDF_xName = 'seg_end_alongTrack_interp (m)'
969 | xData_end = (statsDF[statsDF_xName]).to_numpy()
970 | else:
971 | xDataGood = False
972 | # endIf
973 |
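The elif ladder above maps each xParam choice to a start/end pair of interpolated columns in statsDF. The same lookup can be written as a dictionary — a sketch assuming the column names shown above; `get_x_columns` is a hypothetical helper, not part of PhoREAL:

```python
import numpy as np
import pandas as pd

# xParam -> (start column, end column); names taken from the code above
X_COLS = {
    'Segment ID': ('seg_start_segment_id_interp', 'seg_end_segment_id_interp'),
    'deltaTime': ('seg_start_delta_time_interp (sec)', 'seg_end_delta_time_interp (sec)'),
    'time': ('seg_start_time_interp (sec)', 'seg_end_time_interp (sec)'),
    'lat': ('seg_start_lat_interp (deg)', 'seg_end_lat_interp (deg)'),
    'lon': ('seg_start_lon_interp (deg)', 'seg_end_lon_interp (deg)'),
    'easting': ('seg_start_easting_interp (m)', 'seg_end_easting_interp (m)'),
    'northing': ('seg_start_northing_interp (m)', 'seg_end_northing_interp (m)'),
    'crossTrack': ('seg_start_crossTrack_interp (m)', 'seg_end_crossTrack_interp (m)'),
    'alongTrack': ('seg_start_alongTrack_interp (m)', 'seg_end_alongTrack_interp (m)'),
}

def get_x_columns(statsDF, xParam):
    """Return (xData_start, xData_end) arrays, or (None, None) if unknown."""
    cols = X_COLS.get(xParam)
    if cols is None:
        return None, None
    return statsDF[cols[0]].to_numpy(), statsDF[cols[1]].to_numpy()
```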
974 | # Get y axis parameter
975 | if('z' not in yVar):
976 | yDataGood = False
977 | # endIf
978 |
979 | if('hae' in yHt.lower()):
980 | htType = 'HAE'
981 | else:
982 | htType = 'MSL'
983 | # endIf
984 |
985 | # Get matching y parameter from dataframe
986 | if(yParam=='Ground Min'):
987 | statsDF_yName = 'atl03_ground_min (m ' + htType + ')'
988 | yData = (statsDF[statsDF_yName]).to_numpy()
989 | binFaceColor = myBrown
990 | binEdgeColor = myBlue
991 | elif(yParam=='Ground Max'):
992 | statsDF_yName = 'atl03_ground_max (m ' + htType + ')'
993 | yData = (statsDF[statsDF_yName]).to_numpy()
994 | binFaceColor = myBrown
995 | binEdgeColor = myRed
996 | elif(yParam=='Ground Median'):
997 | statsDF_yName = 'atl03_ground_median (m ' + htType + ')'
998 | yData = (statsDF[statsDF_yName]).to_numpy()
999 | binFaceColor = myBrown
1000 | binEdgeColor = myGray
1001 | elif(yParam=='Ground Mean'):
1002 | statsDF_yName = 'atl03_ground_mean (m ' + htType + ')'
1003 | yData = (statsDF[statsDF_yName]).to_numpy()
1004 | binFaceColor = myBrown
1005 | binEdgeColor = myBlack
1006 | elif(yParam=='Ground Mean + 3*Std'):
1007 | statsDF_yName1 = 'atl03_ground_mean (m ' + htType + ')'
1008 | statsDF_yName2 = 'atl03_ground_std (m ' + htType + ')'
1009 | yData = (statsDF[statsDF_yName1]).to_numpy() + 3*(statsDF[statsDF_yName2]).to_numpy()
1010 | binFaceColor = myBrown
1011 | binEdgeColor = myOrange
1012 | elif(yParam=='Ground Mean - 3*Std'):
1013 | statsDF_yName1 = 'atl03_ground_mean (m ' + htType + ')'
1014 | statsDF_yName2 = 'atl03_ground_std (m ' + htType + ')'
1015 | yData = (statsDF[statsDF_yName1]).to_numpy() - 3*(statsDF[statsDF_yName2]).to_numpy()
1016 | binFaceColor = myBrown
1017 | binEdgeColor = myPurple
1018 | elif(yParam=='All Canopy Min'):
1019 | statsDF_yName = 'atl03_all_canopy_min (m ' + htType + ')'
1020 | yData = (statsDF[statsDF_yName]).to_numpy()
1021 | binFaceColor = myMidGreen
1022 | binEdgeColor = myBlue
1023 | elif(yParam=='All Canopy Max'):
1024 | statsDF_yName = 'atl03_all_canopy_max (m ' + htType + ')'
1025 | yData = (statsDF[statsDF_yName]).to_numpy()
1026 | binFaceColor = myMidGreen
1027 | binEdgeColor = myRed
1028 | elif(yParam=='All Canopy Median'):
1029 | statsDF_yName = 'atl03_all_canopy_median (m ' + htType + ')'
1030 | yData = (statsDF[statsDF_yName]).to_numpy()
1031 | binFaceColor = myMidGreen
1032 | binEdgeColor = myGray
1033 | elif(yParam=='All Canopy Mean'):
1034 | statsDF_yName = 'atl03_all_canopy_mean (m ' + htType + ')'
1035 | yData = (statsDF[statsDF_yName]).to_numpy()
1036 | binFaceColor = myMidGreen
1037 | binEdgeColor = myBlack
1038 | elif(yParam=='All Canopy Mean + 3*Std'):
1039 | statsDF_yName1 = 'atl03_all_canopy_mean (m ' + htType + ')'
1040 | statsDF_yName2 = 'atl03_all_canopy_std (m ' + htType + ')'
1041 | yData = (statsDF[statsDF_yName1]).to_numpy() + 3*(statsDF[statsDF_yName2]).to_numpy()
1042 | binFaceColor = myMidGreen
1043 | binEdgeColor = myOrange
1044 | elif(yParam=='All Canopy Mean - 3*Std'):
1045 | statsDF_yName1 = 'atl03_all_canopy_mean (m ' + htType + ')'
1046 | statsDF_yName2 = 'atl03_all_canopy_std (m ' + htType + ')'
1047 | yData = (statsDF[statsDF_yName1]).to_numpy() - 3*(statsDF[statsDF_yName2]).to_numpy()
1048 | binFaceColor = myMidGreen
1049 | binEdgeColor = myPurple
1050 | elif(yParam=='All Height Min'):
1051 | statsDF_yName = 'atl03_all_height_min (m ' + htType + ')'
1052 | yData = (statsDF[statsDF_yName]).to_numpy()
1053 | binFaceColor = myWhite
1054 | binEdgeColor = myBlue
1055 | elif(yParam=='All Height Max'):
1056 | statsDF_yName = 'atl03_all_height_max (m ' + htType + ')'
1057 | yData = (statsDF[statsDF_yName]).to_numpy()
1058 | binFaceColor = myWhite
1059 | binEdgeColor = myRed
1060 | elif(yParam=='All Height Median'):
1061 | statsDF_yName = 'atl03_all_height_median (m ' + htType + ')'
1062 | yData = (statsDF[statsDF_yName]).to_numpy()
1063 | binFaceColor = myWhite
1064 | binEdgeColor = myGray
1065 | elif(yParam=='All Height Mean'):
1066 | statsDF_yName = 'atl03_all_height_mean (m ' + htType + ')'
1067 | yData = (statsDF[statsDF_yName]).to_numpy()
1068 | binFaceColor = myWhite
1069 | binEdgeColor = myBlack
1070 | elif(yParam=='All Height Mean + 3*Std'):
1071 | statsDF_yName1 = 'atl03_all_height_mean (m ' + htType + ')'
1072 | statsDF_yName2 = 'atl03_all_height_std (m ' + htType + ')'
1073 | yData = (statsDF[statsDF_yName1]).to_numpy() + 3*(statsDF[statsDF_yName2]).to_numpy()
1074 | binFaceColor = myWhite
1075 | binEdgeColor = myOrange
1076 | elif(yParam=='All Height Mean - 3*Std'):
1077 | statsDF_yName1 = 'atl03_all_height_mean (m ' + htType + ')'
1078 | statsDF_yName2 = 'atl03_all_height_std (m ' + htType + ')'
1079 | yData = (statsDF[statsDF_yName1]).to_numpy() - 3*(statsDF[statsDF_yName2]).to_numpy()
1080 | binFaceColor = myWhite
1081 | binEdgeColor = myPurple
1082 | else:
1083 | yBinGood = False
1084 | # endIf
1085 |
1086 | # Determine whether to plot data or flag error
1087 | if(xDataGood and yDataGood and yBinGood):
1088 |
1089 | # # Create line segments to plot for stats
1090 | # col1_inds = (np.arange(0,len(xData_start)*2,2)).astype('float')
1091 | # col2_inds = (np.arange(1,len(xData_start)*2,2)).astype('float')
1092 | # allDataCol0 = np.concatenate([col1_inds, col2_inds], axis=0)
1093 | # allDataCol1 = np.concatenate([xData_start, xData_end], axis=0)
1094 | # allDataCol2 = np.concatenate([yData, yData], axis=0)
1095 | # allData = np.column_stack([allDataCol0, allDataCol1, allDataCol2])
1096 | # allData_sorted = allData[np.argsort(allData[:, 0])]
1097 | # xDataPlot = allData_sorted[:,1]
1098 | # yDataPlot = allData_sorted[:,2]
1099 |
1100 | # Create midpoints to plot for stats
1101 | all_x_data = np.column_stack([xData_start, xData_end])
1102 | xDataPlot = np.mean(all_x_data, axis=1)
1103 | yDataPlot = yData
1104 |
1105 | # # Plot stats line data
1106 | # plt.plot(xDataPlot, yDataPlot,
1107 | # color=binEdgeColor,
1108 | # zorder = 1)
1109 |
1110 | # Plot stats point data
1111 | plt.scatter(xDataPlot, yDataPlot,
1112 | edgecolors=binEdgeColor, facecolors=binFaceColor,
1113 | label=yParam, zorder = 2)
1114 |
1115 | # Update plot
1116 | plt.legend(loc = 'upper left')
1117 | plt.draw()
1118 | fig.show()
1119 | if(len(figsExist)>0):
1120 | ax.set_xlim(xlim)
1121 | ax.set_ylim(ylim)
1122 | # endIf
1123 | elif(not xDataGood):
1124 | messagebox.showinfo('Error','Cannot plot stats for x-axis data.')
1125 | elif(not yDataGood):
1126 | messagebox.showinfo('Error','Can only plot stats for Height on y-axis.')
1127 | # endIf
1128 |
1129 | else:
1130 |
1131 | messagebox.showinfo('Error','Can only plot stats for 1 file at a time.')
1132 |
1133 | # endIf
1134 |
1135 | # endDef
1136 |
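The bin-midpoint computation in the plotting branch above (stacking each bin's start and end x-values and averaging row-wise) can be reproduced in isolation; the array names below mirror the code but the values are toy inputs:

```python
import numpy as np

# Each stats bin spans [start, end] along-track; plot its value at the midpoint
xData_start = np.array([0.0, 100.0, 200.0])
xData_end = np.array([100.0, 200.0, 300.0])

all_x_data = np.column_stack([xData_start, xData_end])  # shape (n_bins, 2)
xDataPlot = np.mean(all_x_data, axis=1)                 # row-wise midpoint
print(xDataPlot.tolist())   # [50.0, 150.0, 250.0]
```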
1137 | # Graphing
1138 |
1139 | # import matplotlib.pyplot as plt
1140 |
1141 | # plt.figure()
1142 | # ax1 = plt.subplot(211)
1143 | # plt.plot(truth_swath.alongtrack[truth_swath.classification > 2],truth_swath.z[truth_swath.classification > 2],'.',color=[0.6,0.6,0.6])
1144 | # plt.plot(truth_swath.alongtrack[truth_swath.classification == 2],truth_swath.z[truth_swath.classification == 2],'.',color=[0.3,0.3,0.3])
1145 |
1146 | # plt.plot(atl03.df.alongtrack[atl03.df.truth_label == 0],atl03.df.h_ph[atl03.df.truth_label == 0],'.',color=[0.0,0.95,0.95])
1147 | # plt.plot(atl03.df.alongtrack[atl03.df.classification == 3],atl03.df.h_ph[atl03.df.classification == 3],'.',color=[0,0.9,0])
1148 | # plt.plot(atl03.df.alongtrack[atl03.df.classification == 2],atl03.df.h_ph[atl03.df.classification == 2],'.',color=[0.05,0.5,0.12])
1149 | # plt.plot(atl03.df.alongtrack[atl03.df.classification == 1],atl03.df.h_ph[atl03.df.classification == 1],'.',color=[0.95,0.5,0.0])
1150 |
1151 | # plt.subplot(212, sharex=ax1, sharey=ax1)
1152 | # plt.plot(truth_swath.alongtrack[truth_swath.classification > 2],truth_swath.z[truth_swath.classification > 2],'.',color=[0.6,0.6,0.6])
1153 | # plt.plot(truth_swath.alongtrack[truth_swath.classification == 2],truth_swath.z[truth_swath.classification == 2],'.',color=[0.3,0.3,0.3])
1154 |
1155 | # plt.plot(atl03.df.alongtrack[atl03.df.truth_label == 0],atl03.df.h_ph[atl03.df.truth_label == 0],'.',color=[0.0,0.95,0.95])
1156 | # plt.plot(atl03.df.alongtrack[atl03.df.truth_label == 3],atl03.df.h_ph[atl03.df.truth_label == 3],'.',color=[0,0.9,0])
1157 | # plt.plot(atl03.df.alongtrack[atl03.df.truth_label == 2],atl03.df.h_ph[atl03.df.truth_label == 2],'.',color=[0.05,0.5,0.12])
1158 | # plt.plot(atl03.df.alongtrack[atl03.df.truth_label == 1],atl03.df.h_ph[atl03.df.truth_label == 1],'.',color=[0.95,0.5,0.0])
1159 |
1160 |
1161 | # plt.plot(test.alongtrack[truth_swath.classification > 2],truth_swath.norm_h[test.classification > 2],'.',color=[0.6,0.6,0.6])
1162 | # plt.plot(test.alongtrack[truth_swath.classification == 2],truth_swath.norm_h[test.classification == 2],'.',color=[0.3,0.3,0.3])
--------------------------------------------------------------------------------
/phoreal/reader.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | """
4 | Script that contains ATL03 and ATL08 H5 reader functions for PhoREAL
5 |
6 | Copyright 2019 Applied Research Laboratories, University of Texas at Austin
7 |
8 | This package is free software; the copyright holder gives unlimited
9 | permission to copy and/or distribute, with or without modification, as
10 | long as this notice is preserved.
11 |
12 | Authors:
13 | Eric Guenther
14 | Mike Alonzo
15 |
16 | Date: February 27, 2019
17 | """
18 |
19 | import pandas as pd
20 | import numpy as np
21 | import os
22 | import h5py
23 |
24 | from phoreal.utils import getH5Keys, ismember
25 | from phoreal.io import readAtl03H5, readAtlH5, readAtl03DataMapping, readAtl08DataMapping
26 | from phoreal.utils import getAtl08Mapping, wgs84_to_utm_find_and_transform,\
27 | wgs84_to_epsg_transform, getCoordRotFwd, getNameParts, get_h5_meta,\
28 | identify_hemi_zone
29 | from phoreal.io import GtToBeamNum, GtToBeamSW, readTruthRegionsTxtFile, writeLas
30 | # from icesatIO import readHeaderMatFile
31 | # from getAtlMeasuredSwath_auto import atl03Struct as Atl03StructLegacy
32 |
33 | class AtlRotationStruct:
34 |
35 | # Define class with designated fields
36 | def __init__(self, R_mat, xRotPt, yRotPt, desiredAngle, phi):
37 |
38 | self.R_mat = R_mat
39 | self.xRotPt = xRotPt
40 | self.yRotPt = yRotPt
41 | self.desiredAngle = desiredAngle
42 | self.phi = phi
43 |
44 |
45 | class AtlStruct:
46 |
47 | # Define class with designated fields
48 | def __init__(self, df, gtNum, beamNum, beamStrength, epsg, zone, hemi,
49 | atlFilePath, atlFileName,
50 | trackDirection, atlProduct, alth5Info, dataIsMapped,
51 | rotation_data, ancillary=None, orbit_info=None):
52 |
53 | self.df = df
54 | self.gtNum = gtNum
55 | self.beamNum = beamNum
56 | self.beamStrength = beamStrength
57 | self.epsg = epsg
58 | self.zone = zone
59 | self.hemi = hemi
60 | self.atlFilePath = atlFilePath
61 | self.atlFileName = atlFileName
62 | self.trackDirection = trackDirection
63 | self.atlProduct = atlProduct
64 | self.atlVersion = alth5Info.atlVersion
65 | self.year = alth5Info.year
66 | self.month = alth5Info.month
67 | self.day = alth5Info.day
68 | self.hour = alth5Info.hour
69 | self.minute = alth5Info.minute
70 | self.second = alth5Info.second
71 | self.trackNum = alth5Info.trackNum
72 | self.unknown = alth5Info.unknown
73 | self.releaseNum = alth5Info.releaseNum
74 | self.incrementNum = alth5Info.incrementNum
75 | self.dataIsMapped = dataIsMapped
76 | self.rotationData = rotation_data
77 | self.ancillary = ancillary
78 | self.orbit_info = orbit_info
79 |
80 | def trim_by_lat(self, min_lat, max_lat):
81 | if self.atlProduct == 'ATL03':
82 | lat_lo = np.min([min_lat, max_lat])
83 | lat_hi = np.max([min_lat, max_lat])
84 | self.df = self.df[self.df.lat_ph > lat_lo]
85 | self.df = self.df[self.df.lat_ph < lat_hi]
86 | self.df = self.df.reset_index()
87 | self.df, self.rotationData = get_atl_alongtrack(self.df, self)
88 | elif self.atlProduct == 'ATL08':
89 | lat_lo = np.min([min_lat, max_lat])
90 | lat_hi = np.max([min_lat, max_lat])
91 | self.df = self.df[self.df.latitude > lat_lo]
92 | self.df = self.df[self.df.latitude < lat_hi]
93 |
94 | def trim_by_lon(self, min_lon, max_lon):
95 | if self.atlProduct == 'ATL03':
96 | lon_lo = np.min([min_lon, max_lon])
97 | lon_hi = np.max([min_lon, max_lon])
98 | self.df = self.df[self.df.lon_ph > lon_lo]
99 | self.df = self.df[self.df.lon_ph < lon_hi]
100 | self.df = self.df.reset_index()
101 | self.df, self.rotationData = get_atl_alongtrack(self.df, self)
102 | elif self.atlProduct == 'ATL08':
103 | lon_lo = np.min([min_lon, max_lon])
104 | lon_hi = np.max([min_lon, max_lon])
105 | self.df = self.df[self.df.longitude > lon_lo]
106 | self.df = self.df[self.df.longitude < lon_hi]
107 |
108 | def to_csv(self, output_csv):
109 | self.df.to_csv(output_csv)
110 |
111 | def to_mat(self, output_mat):
112 | convert_df_to_mat(self.df,output_mat)
113 |
114 | def quick_plot(self):
115 | pass
116 |
117 | # Read ATL03 Heights, put in Pandas DF
118 | def read_atl03_heights_data(atl03filepath, gt):
119 | # Iterate through keys for "Heights"
120 | keys = getH5Keys(atl03filepath,gt + '/heights')
121 |
122 | # Read each key, put it in pandas df
123 | for idx, key in enumerate(keys):
124 | data = readAtl03H5(atl03filepath, '/heights/' + key, gt)
125 | if idx == 0:
126 | df = pd.DataFrame(data,columns=[key])
127 | else:
128 | df = pd.concat([df,pd.DataFrame(data,columns=[key])],axis=1)
129 | return df
130 |
131 | def read_atl03_geolocation(atl03filepath, gt):
132 | # Iterate through keys for "Geolocation"
133 | keys = getH5Keys(atl03filepath,gt + '/geolocation')
134 | key_info = get_H5_keys_info(atl03filepath, gt + '/geolocation')
135 |
136 | # Read each key, put it in pandas df
137 | for idx, key in enumerate(keys):
138 | data = readAtl03H5(atl03filepath, '/geolocation/' + key, gt)
139 | if key_info[idx][1] != 'Group':
140 | if idx == 0:
141 | df = pd.DataFrame(data,columns=[key.split('/')[-1]])
142 | else:
143 | if len(data.shape) == 2:
144 | cols = data.shape[1]
145 | for idx2 in range(0,cols):
146 | df = pd.concat([df,pd.DataFrame(data[:,idx2],columns=\
147 | [key.split('/')[-1] +\
148 | '_' + str(idx2)])],
149 | axis=1)
150 | else:
151 | df = pd.concat([df,pd.DataFrame(data,columns=\
152 | [key.split('/')[-1]])],
153 | axis=1)
154 | return df
155 |
156 |
157 | # Read ATL08 Land Segments, put in Pandas DF
158 | def read_atl08_land_segments(atl08filepath, gt):
159 | # Iterate through keys for "Land Segments"
160 | keys = getH5Keys(atl08filepath,gt + '/land_segments')
161 | key_info = get_H5_keys_info(atl08filepath,gt + '/land_segments')
162 | # Read each key, put it in pandas df
163 | for idx, key in enumerate(keys):
164 | data = readAtl03H5(atl08filepath, '/land_segments/' + key, gt)
165 | if key_info[idx][1] != 'Group':
166 | if idx == 0:
167 | df = pd.DataFrame(data,columns=[key.split('/')[-1]])
168 | else:
169 | if len(data.shape) == 2:
170 | cols = data.shape[1]
171 | for idx2 in range(0,cols):
172 | df = pd.concat([df,pd.DataFrame(data[:,idx2],columns=\
173 | [key.split('/')[-1] +\
174 | '_' + str(idx2)])],
175 | axis=1)
176 | else:
177 | df = pd.concat([df,pd.DataFrame(data,columns=\
178 | [key.split('/')[-1]])],
179 | axis=1)
180 | return df
181 |
182 | # Read ATL09 High Rate Profile, put in Pandas DF
183 | def read_atl09_hr_profile(atl09filepath, gt):
184 | # Iterate through keys for the high_rate profile
185 | subgroup = 'profile_' + gt[2] + '/high_rate/'
186 | keys = getH5Keys(atl09filepath, subgroup)
187 | key_info = get_H5_keys_info(atl09filepath, subgroup)
188 | # Read each key, put it in pandas df
189 | for idx, key in enumerate(keys):
190 | data = readAtlH5(atl09filepath, subgroup + '/' + key + '/')
191 | if key == 'ds_layers':
192 | ds_layers = np.array(data)
193 | elif key == 'ds_va_bin_h':
194 | ds_va_bin_h = np.array(data)
195 | elif key == 'cab_prof':
196 | cab_prof = np.array(data)
197 | elif key == 'density_pass1':
198 | density_pass1 = np.array(data)
199 | elif key == 'density_pass2':
200 | density_pass2 = np.array(data)
201 | else:
202 |
203 | if key_info[idx][1] != 'Group':
204 | if idx == 0:
205 | df = pd.DataFrame(data,columns=[key.split('/')[-1]])
206 | else:
207 | if len(data.shape) == 2:
208 | cols = data.shape[1]
209 | for idx2 in range(0,cols):
210 | df = pd.concat(
211 | [df,pd.DataFrame(data[:,idx2],columns=\
212 | [key.split('/')[-1] +'_' +
213 | str(idx2)])],axis=1)
214 | else:
215 | df = pd.concat(
216 | [df,pd.DataFrame(data,columns=\
217 | [key.split('/')[-1]])],axis=1)
218 |
219 | return df, ds_layers, ds_va_bin_h, cab_prof, density_pass1, density_pass2
220 |
221 | def read_atl09_ancillary_data(atl09filepath):
222 | # Iterate through keys for ancillary data
223 | subgroup = '/ancillary_data/'
224 | keys = getH5Keys(atl09filepath, subgroup)
225 | key_info = get_H5_keys_info(atl09filepath, subgroup)
226 |
227 | byte_encoded = ['control', 'data_start_utc', 'data_end_utc',
228 | 'granule_start_utc', 'granule_end_utc']
229 |
230 | # Read each key, put it in pandas df
231 | for idx, key in enumerate(keys):
232 | data = readAtlH5(atl09filepath, subgroup + '/' + key + '/')
233 |
234 | if np.isin(key, ['release', 'version']):
235 | data = int(data)
236 |
237 | if np.isin(key, byte_encoded):
238 | data = data[0]
239 | data = data.decode('utf-8')
240 |
241 | if key_info[idx][1] != 'Group':
242 | if len(key.split('/')) == 1:
243 | if idx == 0:
244 | df = pd.Series(data,
245 | index=[key.split('/')[-1]],
246 | dtype=object)
247 | else:
248 | df = pd.concat(
249 | [df, pd.Series(data,
250 | index=[key.split('/')[-1]],
251 | dtype=object)])
252 |
253 | return df
254 |
255 | def read_atl09_orbit_info(atl09filepath):
256 |
257 | subgroup = '/orbit_info/'
258 | keys = getH5Keys(atl09filepath, subgroup)
259 | # key_info = get_H5_keys_info(atl09filepath, subgroup)
260 |
261 | # Read each key, put it in pandas df
262 | for idx, key in enumerate(keys):
263 | data = readAtlH5(atl09filepath, subgroup + '/' + key + '/')
264 | if idx == 0:
265 | df = pd.Series(data, index=[key], dtype=object)
266 | else:
267 | df_key = pd.Series(data, index=[key], dtype=object)
268 | df = pd.concat([df, df_key])
269 |
270 | return df
271 |
272 | # Map classifications from ATL08, map back to ATL03 Photons
273 | def get_atl03_classification(atl03filepath, atl08filepath, gt):
274 | # Read ATL03 metrics for class mapping
275 | f = h5py.File(atl03filepath, 'r')
276 | atl03_ph_index_beg = np.array(f[gt + '/geolocation/ph_index_beg'])
277 | atl03_segment_id = np.array(f[gt + '/geolocation/segment_id'])
278 | atl03_heights = np.array(f[gt + '/heights/h_ph'])
279 |
280 | # Read ATL08 for class mapping
281 | f = h5py.File(atl08filepath, 'r')
282 | atl08_classed_pc_indx = np.array(f[gt + '/signal_photons/classed_pc_indx'])
283 | atl08_classed_pc_flag = np.array(f[gt + '/signal_photons/classed_pc_flag'])
284 | atl08_segment_id = np.array(f[gt + '/signal_photons/ph_segment_id'])
285 |
286 | # Map ATL08 classifications to ATL03 Photons
287 | allph_classed = getAtl08Mapping(atl03_ph_index_beg, atl03_segment_id,
288 | atl08_classed_pc_indx,
289 | atl08_classed_pc_flag,
290 | atl08_segment_id)
291 |
292 | if len(allph_classed) < len(atl03_heights):
293 | n_zeros = len(atl03_heights) - len(allph_classed)
294 | zeros = np.zeros(n_zeros)
295 | allph_classed = np.append(allph_classed, zeros)
296 |
297 | return allph_classed
298 |
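The ATL08-to-ATL03 mapping above hinges on one arithmetic step: ATL08 stores a 1-based photon index within each segment (`classed_pc_indx`), while ATL03's `ph_index_beg` gives the 1-based index of each segment's first photon in the full photon array, so the absolute 0-based photon index is `ph_index_beg + classed_pc_indx - 2`. A minimal sketch of that arithmetic with toy arrays (the names below are illustrative, not the PhoREAL `getAtl08Mapping` API):

```python
import numpy as np

# Toy ATL03 geolocation: 3 segments whose first photons sit at
# 1-based indices 1, 4 and 9 in the photon array
ph_index_beg = np.array([1, 4, 9])

# Toy ATL08 signal photons: segment position, 1-based index within
# that segment, and classification flag
seg_pos = np.array([0, 0, 1, 2])
classed_pc_indx = np.array([1, 3, 2, 1])
classed_pc_flag = np.array([2, 3, 1, 2])

# Absolute 0-based photon index, mirroring the "newMapping" arithmetic
abs_idx = ph_index_beg[seg_pos] + classed_pc_indx - 2

# Scatter classes into a photon-length array; -1 marks unclassified
n_photons = 10
allph_classed = np.full(n_photons, -1)
allph_classed[abs_idx] = classed_pc_flag
print(abs_idx.tolist())         # [0, 2, 4, 8]
print(allph_classed.tolist())   # [2, -1, 3, -1, 1, -1, -1, -1, 2, -1]
```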
299 | def merge_label_to_df(atl03filepath, atl08filepath, gt, df):
300 | allph_classed = get_atl03_classification(atl03filepath, atl08filepath, gt)
301 | # Add classifications to ATL03 DF
302 | df['classification'] = allph_classed
303 |
304 | # Replace nan with -1 (unclassified); df.replace returns a copy
305 | df = df.replace({'classification' : np.nan}, -1)
306 | return df
307 |
308 | def get_atl03_heights_offset(atl03filepath, atl08filepath, gt):
309 | # Read ATL03 metrics for class mapping
310 | f = h5py.File(atl03filepath, 'r')
311 | atl03_ph_index_beg = np.array(f[gt + '/geolocation/ph_index_beg'])
312 | atl03_segment_id = np.array(f[gt + '/geolocation/segment_id'])
313 | atl03_heights = np.array(f[gt + '/heights/h_ph'])
314 |
315 | # Read ATL08 for class mapping
316 | f = h5py.File(atl08filepath, 'r')
317 | atl08_classed_pc_indx = np.array(f[gt + '/signal_photons/classed_pc_indx'])
318 | atl08_heights = np.array(f[gt + '/signal_photons/ph_h'])
319 | atl08_segment_id = np.array(f[gt + '/signal_photons/ph_segment_id'])
320 |
321 | # Get ATL03 data
322 | indsNotZero = atl03_ph_index_beg != 0
323 | atl03_ph_index_beg = atl03_ph_index_beg[indsNotZero]
324 | atl03_segment_id = atl03_segment_id[indsNotZero]
325 |
326 | # Find ATL08 segments that have ATL03 segments
327 | atl03SegsIn08TF, atl03SegsIn08Inds = ismember(atl08_segment_id,atl03_segment_id)
328 |
329 | # Get ATL08 classed indices and values
330 | atl08classed_inds = atl08_classed_pc_indx[atl03SegsIn08TF]
331 | atl08classed_vals = atl08_heights[atl03SegsIn08TF]
332 |
333 | # Determine new mapping into ATL03 data
334 | atl03_ph_beg_inds = atl03SegsIn08Inds
335 | atl03_ph_beg_val = atl03_ph_index_beg[atl03_ph_beg_inds]
336 | newMapping = atl08classed_inds + atl03_ph_beg_val - 2
337 |
338 | # Get max size of output array
339 | sizeOutput = newMapping[-1]
340 |
341 | # Pre-populate all photon classed array with zeroes
342 | allph_heights = (np.zeros(sizeOutput + 1))
343 |
344 | # Populate all photon classed array from ATL08 classifications
345 | allph_heights[newMapping] = atl08classed_vals
346 | allph_heights[allph_heights == 3.4028234663852886e+38] = np.nan  # float32 max fill value
347 |
348 | if len(allph_heights) < len(atl03_heights):
349 | n_zeros = len(atl03_heights) - len(allph_heights)
350 | zeros = np.zeros(n_zeros)
351 | zeros = zeros * np.nan
352 | allph_heights = np.append(allph_heights, zeros)
353 |
354 | return allph_heights
355 |
356 | def get_atl03_rate(atl03filepath, gt):
357 | f = h5py.File(atl03filepath, 'r')
358 | bihr = np.asarray(f[gt + '/bckgrd_atlas/bckgrd_int_height_reduced'])
359 | bcr = np.asarray(f[gt + '/bckgrd_atlas/bckgrd_counts_reduced'])
360 | bapmc = np.asarray(f[gt + '/bckgrd_atlas/pce_mframe_cnt'])
361 | hpmc = np.asarray(f[gt + '/heights/pce_mframe_cnt'])
362 |
363 | # Calculate rate and assign to the bckgrd_atlas rate
364 | rate = bcr / bihr
365 |
366 | # Assign bckgrd_atlas attributes to photon level
367 | tf, inds = ismember(hpmc, bapmc)
368 |
369 | ph_bihr = bihr[inds]
370 | ph_bcr = bcr[inds]
371 | ph_rate = rate[inds]
372 |
373 | ph_bihr = np.concatenate([np.zeros(np.sum(~tf)), ph_bihr])
374 | ph_bcr = np.concatenate([np.zeros(np.sum(~tf)), ph_bcr])
375 | ph_rate = np.concatenate([np.zeros(np.sum(~tf)), ph_rate])
376 |
377 | return ph_bihr, ph_bcr, ph_rate
378 |
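The per-photon background assignment above works by matching each photon's major-frame counter (`pce_mframe_cnt`) back to the `bckgrd_atlas` table via PhoREAL's `ismember` helper. A rough stand-alone sketch of that join using `np.searchsorted` instead (names and values below are toy inputs, not the ATL03 layout):

```python
import numpy as np

# Toy major-frame table: counter values and a background rate per frame
frame_cnt = np.array([100, 101, 102])
frame_rate = np.array([0.5, 0.7, 0.9])   # e.g. counts / integration height

# Each photon carries the counter of the major frame it was recorded in
photon_frame = np.array([100, 100, 101, 102, 102, 102])

# Join photon -> frame; searchsorted works because counters are sorted
idx = np.searchsorted(frame_cnt, photon_frame)
ph_rate = frame_rate[idx]
print(ph_rate.tolist())   # [0.5, 0.5, 0.7, 0.9, 0.9, 0.9]
```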
379 | def merge_norm_h_to_df(atl03filepath, atl08filepath, gt, df):
380 | # Get allph_heights
381 | allph_heights = get_atl03_heights_offset(atl03filepath, atl08filepath, gt)
382 | # Add classifications to ATL03 DF
383 | df['norm_h'] = allph_heights
384 | return df
385 |
386 | def get_atl03_segment_id(atl03filepath, gt):
387 | f = h5py.File(atl03filepath, 'r')
388 | h_ph = np.asarray(f[gt + '/heights/h_ph'])
389 | segment_ph_count = np.array(f[gt + '/geolocation/segment_ph_cnt'])
390 | atl03_segment_id = np.array(f[gt + '/geolocation/segment_id'])
391 | atl03_ph_index_beg = np.array(f[gt + '/geolocation/ph_index_beg'])
392 |
393 | # Get segment ID to photon level
394 | h_seg = np.zeros(len(h_ph))
395 | for i in range(0,len(atl03_segment_id)):
396 | if atl03_ph_index_beg[i] > 0:
397 | h_seg[atl03_ph_index_beg[i]-1:atl03_ph_index_beg[i]-1 +\
398 | segment_ph_count[i]] = atl03_segment_id[i]
399 | h_seg = np.int32(h_seg)
400 | return h_seg
401 |
402 | def get_atl03_dist_ph_along(atl03filepath, gt):
403 | f = h5py.File(atl03filepath, 'r')
404 | h_ph = np.asarray(f[gt + '/heights/h_ph'])
405 | segment_ph_count = np.array(f[gt + '/geolocation/segment_ph_cnt'])
406 | atl03_segment_dist_x = np.array(f[gt + '/geolocation/segment_dist_x'])
407 | atl03_ph_index_beg = np.array(f[gt + '/geolocation/ph_index_beg'])
408 |
409 | # Broadcast segment along-track distance to photon level
410 | h_dist = np.zeros(len(h_ph))
411 | for i in range(0,len(atl03_segment_dist_x)):
412 | if atl03_ph_index_beg[i] > 0:
413 | h_dist[atl03_ph_index_beg[i]-1:atl03_ph_index_beg[i]-1 +\
414 | segment_ph_count[i]] = atl03_segment_dist_x[i]
415 |
416 | return h_dist
417 |
418 | def merge_seg_id_to_df(atl03filepath, gt, df):
419 | # Get Segment ID per photon
420 | h_seg = get_atl03_segment_id(atl03filepath, gt)
421 |
422 | # Add classifications to ATL03 DF
423 | df['seg_id'] = h_seg
424 | return df
425 |
426 | # Calculate alongtrack time
427 | def get_atl_time(df):
428 | delta_time = np.array(df['delta_time'])
429 | min_delta_time = np.min(delta_time[np.nonzero(delta_time)])
430 | time = delta_time - min_delta_time
431 |
432 | df['time'] = time
433 | return df
434 |
435 | # Calculate Easting/Northing
436 | def get_atl_coords(df, epsg = None):
437 | columns = list(df.columns)
438 |
439 | if 'lon_ph' in columns:
440 | lon = np.array(df['lon_ph'])
441 | lat = np.array(df['lat_ph'])
442 | elif 'longitude' in columns:
443 | lon = np.array(df['longitude'])
444 | lat = np.array(df['latitude'])
445 | elif 'reference_photon_lon' in columns:
446 | lon = np.array(df['reference_photon_lon'])
447 | lat = np.array(df['reference_photon_lat'])
448 |
449 | # Specify EPSG Code or automatically find zone
450 | if epsg:
451 | xcoord, ycoord = wgs84_to_epsg_transform(epsg, lon, lat)
452 | else:
453 | xcoord, ycoord, epsg = wgs84_to_utm_find_and_transform(lon, lat)
454 |
455 | if 'easting' not in columns:
456 | df['easting'] = xcoord
457 | df['northing'] = ycoord
458 | else:
459 | print('Warning: Overwriting Existing Coordinates')
460 | df = df.drop(columns = ['easting'])
461 | df = df.drop(columns = ['northing'])
462 |
463 | df['easting'] = xcoord
464 | df['northing'] = ycoord
465 |
466 | return df, epsg
467 |
468 | def get_atl_alongtrack(df, atl03struct = None):
469 | easting = np.array(df['easting'])
470 | northing = np.array(df['northing'])
471 |
472 | if atl03struct:
473 | R_mat = atl03struct.rotationData.R_mat
474 | xRotPt = atl03struct.rotationData.xRotPt
475 | yRotPt = atl03struct.rotationData.yRotPt
476 | desiredAngle = 90
477 | crossTrack, alongTrack, R_mat, xRotPt, yRotPt, phi = \
478 | getCoordRotFwd(easting, northing, R_mat, xRotPt, yRotPt, [])
479 | else:
480 | desiredAngle = 90
481 | crossTrack, alongTrack, R_mat, xRotPt, yRotPt, phi = \
482 | getCoordRotFwd(easting, northing, [], [], [], desiredAngle)
483 |
484 | if 'crosstrack' not in list(df.columns):
485 | df['crosstrack'] = crossTrack
486 | df['alongtrack'] = alongTrack
487 | else:
488 | print('Warning: Overwriting Existing Alongtrack/Crosstrack')
489 | df = df.drop(columns = ['crosstrack'])
490 | df = df.drop(columns = ['alongtrack'])
491 | df['crosstrack'] = crossTrack
492 | df['alongtrack'] = alongTrack
493 |
494 |
495 | rotation_data = AtlRotationStruct(R_mat, xRotPt, yRotPt, desiredAngle, phi)
496 |
497 | return df, rotation_data
498 |
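`getCoordRotFwd` lives in `phoreal.utils` and is not reproduced here; conceptually it rotates the projected coordinates about the track's start point so the ground track lines up with a single axis, which is what makes the `alongtrack`/`crosstrack` columns meaningful. A self-contained numpy sketch of that idea, assuming a straight track (illustrative only, not the PhoREAL implementation):

```python
import numpy as np

def rotate_to_alongtrack(easting, northing):
    # Rotate about the first point so the track lies on the +y axis:
    # alongtrack = distance along the flight line, crosstrack ~ 0 for
    # points exactly on the line
    x0, y0 = easting[0], northing[0]
    dx, dy = easting[-1] - x0, northing[-1] - y0
    phi = np.arctan2(dy, dx) - np.pi / 2          # angle needed to reach +y
    c, s = np.cos(-phi), np.sin(-phi)
    xr = c * (easting - x0) - s * (northing - y0)
    yr = s * (easting - x0) + c * (northing - y0)
    return xr, yr  # crosstrack, alongtrack

# Points on a 45-degree line: crosstrack ~ 0, alongtrack = distance travelled
e = np.array([0.0, 1.0, 2.0])
n = np.array([0.0, 1.0, 2.0])
ct, at = rotate_to_alongtrack(e, n)
print(np.round(ct, 6).tolist())   # [0.0, 0.0, 0.0]
print(np.round(at, 6).tolist())   # [0.0, 1.414214, 2.828427]
```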
499 | # def get_atl03_df(atl03filepath, atl08filepath, gt, epsg = None):
500 | # df = read_atl03_heights_data(atl03filepath, gt)
501 | # df = merge_label_to_df(atl03filepath, atl08filepath, gt, df)
502 | # df = merge_norm_h_to_df(atl03filepath, atl08filepath, gt, df)
503 | # df = merge_seg_id_to_df(atl03filepath, gt, df)
504 |
505 | # df = get_atl_time(df)
506 | # df, epsg = get_atl_coords(df, epsg = None)
507 | # df, rotationData = get_atl_alongtrack(df)
508 | # return df
509 |
510 | def get_direction(lat):
511 | if(np.abs(lat[-1]) > np.abs(lat[0])):
512 | track_direction = 'Ascending'
513 | else:
514 | track_direction = 'Descending'
515 | return track_direction
516 |
517 | def get_file_name(filepath):
518 | filepath = os.path.normpath(os.path.abspath(filepath))
519 | filename = os.path.splitext(os.path.basename(filepath))[0]
520 | return filename
521 |
522 | def get_kml_region(lat,lon, kml_bounds_txt):
523 | # Determine if ATL03 track goes over a lidar truth region
524 | kmlBoundsTextFile = kml_bounds_txt
525 | kmlRegionName = False
526 | headerFilePath = False
527 | truthFilePath = False
528 |
529 | try:
530 | if kmlBoundsTextFile and (os.path.exists(kmlBoundsTextFile)):
531 |
532 | # Message to user
533 | print(' Finding Truth Region...')
534 |
535 | # Read kmlBounds.txt file and get contents
536 | kmlInfo = readTruthRegionsTxtFile(kmlBoundsTextFile)
537 |
538 | # Loop through kmlBounds.txt and find matching TRUTH area
539 | maxCounter = len(kmlInfo.regionName) - 1
540 | counter = 0
541 | while(not kmlRegionName):
542 | latInFile = (lat >= kmlInfo.latMin[counter]) & \
543 | (lat <= kmlInfo.latMax[counter])
544 | lonInFile = (lon >= kmlInfo.lonMin[counter]) & \
545 | (lon <= kmlInfo.lonMax[counter])
546 | trackInRegion = any(latInFile & lonInFile)
547 | if(trackInRegion):
548 |
549 | # Get truth region info
550 | kmlRegionName = kmlInfo.regionName[counter]
551 | # headerFilePath = \
552 | # os.path.normpath(kmlInfo.headerFilePath[counter])
553 | # truthFilePath = \
554 | # os.path.normpath(kmlInfo.truthFilePath[counter])
555 |
556 | # Print truth region
557 | print(' Truth File Region: %s' % kmlRegionName)
558 |
559 | if(counter >= maxCounter):
560 | print(' No Truth File Region Found in kmlBounds.txt')
561 | break
562 | counter += 1
563 | else:
564 | kmlRegionName = None
565 | headerFilePath = None
566 | truthFilePath = None
567 | except:
568 | kmlRegionName = None
569 | headerFilePath = None
570 | truthFilePath = None
571 |
572 | # Could not read kmlBounds.txt file
573 |
574 | return kmlRegionName, headerFilePath, truthFilePath
575 |
576 | def write_atl03_las(atlstruct, outpath):
577 | xx = np.array(atlstruct.df.easting)
578 | yy = np.array(atlstruct.df.northing)
579 | zz = np.array(atlstruct.df.h_ph)
580 | cc = np.array(atlstruct.df.classification)
581 | ii = np.array(atlstruct.df.signal_conf_ph)
582 | sigconf = np.array(atlstruct.df.signal_conf_ph)
583 | hemi = atlstruct.hemi
584 | zone = atlstruct.zone
585 |
586 | print(' Writing ATL03 .las file...', end = " ")
587 | try:
588 | outname = atlstruct.atl03FileName + '_' + atlstruct.gtNum + '.las'
589 | except AttributeError:
590 | # occasionally atl03FileName is not in the atlstruct
591 | outname = atlstruct.atlFileName + '_' + atlstruct.gtNum + '.las'
592 |
593 | outfile = os.path.normpath(outpath + '/' + outname)
594 |
595 | if(not os.path.exists(os.path.normpath(outpath))):
596 | os.mkdir(os.path.normpath(outpath))
597 |
598 | # Get projection
599 | if(atlstruct.zone=='3413' or atlstruct.zone=='3976'):
600 | # 3413 = Arctic, 3976 = Antarctic
601 | lasProjection = atlstruct.hemi
602 | # Write .las file
603 | writeLas(xx,yy,zz,lasProjection,outfile,cc,ii,sigconf)
604 |
605 | else:
606 | # Write .las file for UTM projection case
607 | writeLas(xx,yy,zz,'utm',outfile,cc,ii,sigconf,hemi,zone)
608 |
609 | print('Complete')
610 |
611 | def get_H5_keys_info(atl08filepath,gt):
612 | keys = getH5Keys(atl08filepath, gt)
613 | h = h5py.File(atl08filepath, 'r')
614 | key_name = []
615 | key_type = []
616 | key_len = []
617 | for key in keys:
618 | try:
619 | data = h[gt + '/' + key]
620 | kname = str(key)
621 | ktype = str(data.dtype)
622 | klen = int(len(data))
623 | key_name.append(kname)
624 | key_type.append(ktype)
625 | key_len.append(klen)
626 | except:
627 | kname = str(key)
628 | ktype = 'Group'
629 | klen = 0
630 | key_name.append(kname)
631 | key_type.append(ktype)
632 | key_len.append(klen)
633 | key_info = [list(a) for a in zip(key_name, key_type, key_len)]
634 | return key_info
635 |
636 | def match_atl_to_atl03(df, atl03struct):
637 | # Calculate Time
638 | delta_time03 = np.array(atl03struct.df.delta_time)
639 | time = np.array(df.delta_time) -\
640 | np.min(delta_time03[np.nonzero(delta_time03)])
641 | df['time'] = time
642 | # Calculate Projected Coordinates
643 | df, epsg = get_atl_coords(df, atl03struct.epsg)
644 | # Calculate Along track
645 | df, rotation_data = get_atl_alongtrack(df, atl03struct)
646 | return df, rotation_data, epsg
647 |
648 | def get_atl03_struct(atl03filepath, gt, atl08filepath = None, epsg = None,
649 | kml_bounds_txt = None, header_file_path = None):
650 | df = read_atl03_heights_data(atl03filepath, gt)
651 | if atl08filepath:
652 | try:
653 | # df = merge_label_to_df(atl03filepath, atl08filepath, gt, df)
654 | df['classification'] = get_atl03_classification(atl03filepath,\
655 | atl08filepath, gt)
656 | # df = merge_norm_h_to_df(atl03filepath, atl08filepath, gt, df)
657 | df['norm_h'] = get_atl03_heights_offset(atl03filepath,\
658 | atl08filepath, gt)
659 | dataIsMapped = True
660 | except:
661 | dataIsMapped = False
662 | else:
663 | dataIsMapped = False
664 | df['seg_id'] = get_atl03_segment_id(atl03filepath, gt)
665 | df['ph_bihr'], df['ph_bcr'], df['ph_rate'] =\
666 | get_atl03_rate(atl03filepath, gt)
667 | df['dist_ph_along'] = get_atl03_dist_ph_along(atl03filepath, gt)
668 | #df['time'] = df['delta_time'] -\
669 | # np.min(np.array(df.delta_time)[np.nonzero(np.array(df.delta_time))])
670 | df = get_atl_time(df)
671 | df, epsg = get_atl_coords(df, epsg)
672 | df, rotation_data = get_atl_alongtrack(df)
673 | track_direction = get_direction(np.array(df.lat_ph))
674 | atl03filename = get_file_name(atl03filepath)
675 | atl03_info = getNameParts(atl03filename)
676 | hemi, zone = identify_hemi_zone(epsg)
677 |
678 | if dataIsMapped:
679 | # for some reason, nan values need to be reset to -1,
680 | # even though it's done in get_atl03_classification
681 | c = np.array(df.classification)
682 | nan_index = np.where(np.isnan(c))
683 | c[nan_index] = -1 # assign nan to -1
684 | c = c.astype(int)
685 | df.classification = c
686 | # df.replace({'classification' : np.nan}, -1)
687 |
688 | # Assign everything to the struct
689 | beamNum = GtToBeamNum(atl03filepath, gt)
690 | beamStrength = GtToBeamSW(atl03filepath, gt)
691 | atl03Struct = AtlStruct(df, gt, beamNum, beamStrength, epsg, zone, hemi,
692 | atl03filepath, atl03filename, track_direction,
693 | 'ATL03', atl03_info, dataIsMapped,rotation_data)
694 |
695 |
696 | return atl03Struct
697 |
698 | def get_atl08_struct(atl08filepath, gt, atl03struct = None, epsg = None,
699 | kml_bounds_txt = None):
700 | df = read_atl08_land_segments(atl08filepath, gt)
701 |
702 | # If ATL03 Struct is available
703 | if atl03struct:
704 | # Calculate Time
705 | df, rotation_data, epsg = match_atl_to_atl03(df, atl03struct)
706 |
707 | # If ATL03 Struct is not available
708 | else:
709 | # Calculate Time
710 | df = get_atl_time(df)
711 | # Calculate Projected Coordinates
712 | df, epsg = get_atl_coords(df, epsg)
713 | # Calculate Along Track
714 | df, rotation_data = get_atl_alongtrack(df)
715 | track_direction = get_direction(np.array(df.latitude))
716 | atl08filename = get_file_name(atl08filepath)
717 | atl08_info = getNameParts(atl08filename)
718 | hemi, zone = identify_hemi_zone(epsg)
719 | dataIsMapped = True
720 | beamNum = GtToBeamNum(atl08filepath, gt)
721 | beamStrength = GtToBeamSW(atl08filepath, gt)
722 |
723 | # Assign everything to the struct
724 | atl08Struct = AtlStruct(df, gt, beamNum, beamStrength, epsg, zone, hemi,
725 | atl08filepath, atl08filename, track_direction,
726 | 'ATL08', atl08_info, dataIsMapped,rotation_data)
727 | return atl08Struct
728 |
729 | def get_geolocation_mapping(height_len, ph_index_beg, segment_ph_cnt, target):
730 | data = np.zeros(height_len)
731 | for i_id in range(0,len(target)):
732 | data[ph_index_beg[i_id]-1:\
733 | ph_index_beg[i_id]-1 + segment_ph_cnt[i_id]] =\
734 | np.full((segment_ph_cnt[i_id]), target[i_id])
735 | return data
736 |
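`get_geolocation_mapping` broadcasts one value per geolocation segment down to every photon in that segment, using the 1-based `ph_index_beg` and `segment_ph_cnt` arrays. A toy run of the same slice arithmetic (hypothetical helper name, not part of the PhoREAL API):

```python
import numpy as np

def broadcast_segment_to_photons(n_photons, ph_index_beg, segment_ph_cnt, target):
    # Same arithmetic as get_geolocation_mapping: ph_index_beg is
    # 1-based, so subtract 1 to get the 0-based start of each segment
    data = np.zeros(n_photons)
    for i, val in enumerate(target):
        start = ph_index_beg[i] - 1
        data[start:start + segment_ph_cnt[i]] = val
    return data

# 2 segments: 3 photons then 2 photons, carrying segment ids 10 and 11
out = broadcast_segment_to_photons(
    5,
    ph_index_beg=np.array([1, 4]),
    segment_ph_cnt=np.array([3, 2]),
    target=np.array([10, 11]))
print(out.tolist())   # [10.0, 10.0, 10.0, 11.0, 11.0]
```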
737 | def append_atl03_geolocation(heights, geolocation, fields = ['segment_id']):
738 | height_len = len(heights)
739 | ph_index_beg = np.array(geolocation.ph_index_beg)
740 | segment_ph_cnt = np.array(geolocation.segment_ph_cnt)
741 | for field in fields:
742 | target = np.array(geolocation[field])
743 | data = get_geolocation_mapping(height_len, ph_index_beg,
744 | segment_ph_cnt, target)
745 | heights = pd.concat([heights,pd.DataFrame(data,
746 | columns=[field])],axis=1)
747 |
748 | return heights
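The two functions above broadcast segment-rate ATL03 geolocation fields (e.g. `segment_id`) down to photon rate using `ph_index_beg` (1-based index of each segment's first photon) and `segment_ph_cnt`. A minimal, self-contained sketch of that broadcast on synthetic arrays (values are illustrative, not from an actual ATL03 granule):

```python
import numpy as np

# Minimal sketch of the segment-to-photon broadcast, on synthetic arrays
def map_segment_to_photon(height_len, ph_index_beg, segment_ph_cnt, target):
    data = np.zeros(height_len)
    for i in range(len(target)):
        start = ph_index_beg[i] - 1                 # ph_index_beg is 1-based
        data[start:start + segment_ph_cnt[i]] = target[i]
    return data

# Two segments: 3 photons, then 2 photons (5 photons total)
ph_index_beg = np.array([1, 4])       # 1-based index of each segment's first photon
segment_ph_cnt = np.array([3, 2])     # photon count per segment
segment_id = np.array([100, 101])     # segment-rate field to broadcast

photon_ids = map_segment_to_photon(5, ph_index_beg, segment_ph_cnt, segment_id)
# photon_ids -> [100., 100., 100., 101., 101.]
```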
749 |
750 | # def convert_atl03_to_legacy(atl03):
751 | # intensity = np.zeros(len(atl03.df))
752 | # atl03_zMsl = np.zeros(len(atl03.df))
753 | # segmentID = np.zeros(len(atl03.df))
754 | # atl03h5Info = getNameParts(atl03.atlFileName)
755 | # atl03legacy = Atl03StructLegacy(atl03.df.lat_ph, atl03.df.lon_ph,
756 | # atl03.df.easting, atl03.df.northing,
757 | # atl03.df.crosstrack, atl03.df.alongtrack,
758 | # atl03.df.h_ph, atl03_zMsl, atl03.df.time,
759 | # atl03.df.delta_time, atl03.df.signal_conf_ph,
760 | # atl03.df.signal_conf_ph, atl03.df.signal_conf_ph,
761 | # atl03.df.signal_conf_ph, atl03.df.signal_conf_ph,
762 | # atl03.df.classification, intensity, intensity,
763 | # segmentID,
764 | # atl03.gtNum, atl03.beamNum, atl03.beamStrength,
765 | # atl03.zone, atl03.hemi, atl03.atlFilePath,
766 | # atl03.atlFileName, atl03.trackDirection,
767 | # atl03h5Info,
768 | # atl03.dataIsMapped)
769 | # rotationData = atl03.rotationData
770 | # # headerData = atl03.headerData
771 | # return atl03legacy, rotationData
772 |
773 | def write_pickle(data, filename):
774 |     import pickle
775 |     with open(filename, 'wb') as fp:
776 |         pickle.dump(data, fp)
777 |
778 |
779 | def read_pickle(filename):
780 |     import pickle
781 |     with open(filename, 'rb') as fp:
782 |         data = pickle.load(fp)
783 |     return data
784 |
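A round-trip of the pickle helpers above, sketched with context managers (which close the file even if `dump`/`load` raises) and a throwaway temp directory:

```python
import os
import pickle
import tempfile

# Round-trip sketch: write a small dict to disk, then read it back
payload = {'gt': 'gt1r', 'epsg': 'epsg:32614', 'beamNum': 1}

with tempfile.TemporaryDirectory() as tmpdir:
    path = os.path.join(tmpdir, 'struct.pkl')
    with open(path, 'wb') as fp:          # closes even on error
        pickle.dump(payload, fp)
    with open(path, 'rb') as fp:
        restored = pickle.load(fp)

# restored == payload
```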
785 |
786 | def convert_df_to_mat(df, outfilename):
787 |     from scipy import io
788 |     # Append the .mat extension if the caller left it off
789 |     if outfilename.split('.')[-1] != 'mat':
790 |         outfilename = outfilename + ".mat"
791 |     # Save each dataframe column as a field of a MATLAB struct
792 |     io.savemat(outfilename, {'struct': df.to_dict("list")})
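The extension check in `convert_df_to_mat`, isolated as a tiny helper for illustration (`ensure_mat_extension` is a hypothetical name, not part of PhoREAL):

```python
# Mirrors the check in convert_df_to_mat: append ".mat" only when the
# final extension is not already "mat"
def ensure_mat_extension(outfilename):
    if outfilename.split('.')[-1] != 'mat':
        outfilename = outfilename + '.mat'
    return outfilename

print(ensure_mat_extension('swath_output'))      # swath_output.mat
print(ensure_mat_extension('swath_output.mat'))  # swath_output.mat
```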
793 |
794 | def get_attribute_info(atlfilepath, gt):
795 | # add year/doy, sc_orient, beam_number/type to 08 dataframe
796 | year, doy = get_h5_meta(atlfilepath, meta='date', rtn_doy=True)
797 |
798 | with h5py.File(atlfilepath, 'r') as fp:
799 | try:
800 | fp_a = fp[gt].attrs
801 | description = (fp_a['Description']).decode()
802 | beam_type = (fp_a['atlas_beam_type']).decode()
803 | atlas_pce = (fp_a['atlas_pce']).decode()
804 | spot_number = (fp_a['atlas_spot_number']).decode()
805 | atmosphere_profile = (fp_a['atmosphere_profile']).decode()
806 | groundtrack_id = (fp_a['groundtrack_id']).decode().lower()
807 | sc_orient = (fp_a['sc_orientation']).decode().lower()
808 |         except (KeyError, AttributeError):
809 | description = ''
810 | beam_type = ''
811 | atlas_pce = ''
812 | spot_number = ''
813 | atmosphere_profile = ''
814 | groundtrack_id = ''
815 | sc_orient = ''
816 | info_dict = {
817 | "description" : description,
818 | "atlas_beam_type" : beam_type,
819 | "atlas_pce" : atlas_pce,
820 | "atlas_spot_number" : spot_number,
821 | 'atmosphere_profile' : atmosphere_profile,
822 | "groundtrack_id" : groundtrack_id,
823 | "sc_orientation" : sc_orient,
824 | "year" : year,
825 | "doy" : doy
826 |     }
828 |
829 | return info_dict
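The try/except in `get_attribute_info` falls back to empty strings when a granule lacks the expected beam attributes. The pattern can be sketched with a plain dict standing in for the h5py attribute set, so it runs without an ATL file (keys and values here are illustrative):

```python
# Plain-dict stand-in for fp[gt].attrs
attrs = {'atlas_beam_type': b'strong', 'sc_orientation': b'FORWARD'}

def read_attr(attrs, key):
    # Decode the byte-string attribute if present, else fall back to ''
    try:
        return attrs[key].decode()
    except KeyError:
        return ''

beam_type = read_attr(attrs, 'atlas_beam_type')          # 'strong'
sc_orient = read_attr(attrs, 'sc_orientation').lower()   # 'forward'
missing = read_attr(attrs, 'atlas_pce')                  # ''
```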
830 |
831 | # if __name__ == "__main__":
832 | # if os.name == 'nt':
833 | # basepath = 'Y:/USERS/eric/2_production/'
834 | # else:
835 | # basepath = '/LIDAR/server/USERS/eric/2_production/'
836 |
837 | # atl03file = 'ATL03_20181021130238_03500103_002_01.h5'
838 | # atl08file = 'ATL08_20181021130238_03500103_002_01.h5'
839 | # atl09file = 'ATL09_20181016132105_02740101_002_01.h5'
840 | # # Inputs
841 | # atl03filepath = basepath + atl03file
842 | # atl08filepath = basepath + atl08file
843 | # atl09filepath = basepath + atl09file
844 | # gt = 'gt1r'
845 |
846 |
847 | # print('ATL03 Heights')
848 | # atl03 = get_atl03_struct(atl03filepath, gt, atl08filepath)
849 |
850 | # print('ATL03 Geolocation')
851 | # geolocation = read_atl03_geolocation(atl03filepath, gt)
852 |
853 | # print('ATL08 Land Segments')
854 | # atl08 = get_atl08_struct(atl08filepath, gt, atl03)
855 |
856 | # print('ATL09 High Frequency Data')
857 | # atl09 = get_atl09_struct(atl09filepath, gt, atl03)
858 |
859 | # print('Append ATL03 Heights Dataframe with Geolocation')
860 | # heights = atl03.df
861 | # heights = append_atl03_geolocation(heights, geolocation,
862 | # fields = ['segment_id'])
863 |
864 | # print('Convert ATL03 Struct to Legacy ATL03 Struct')
865 | # atl03legacy, rotationData = convert_atl03_to_legacy(atl03)
866 |
--------------------------------------------------------------------------------
/phoreal/reference.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | """
3 | Created on Fri Aug 2 11:18:58 2019
4 |
5 | @author: malonzo
6 | """
7 |
8 | import numpy as np
9 | import pandas as pd
10 | import datetime
11 |
12 |
13 | from phoreal.utils import getCoordRotFwd, transform, getCoordRotRev, indexMatch
from phoreal.io import getTruthHeaders, readLasHeader, formatDEM
15 | import laspy
16 | import pyproj
17 | from phoreal.ace import ace
18 | from phoreal.getMeasurementError import getMeasurementError
19 | from phoreal.reader import get_atl_alongtrack
20 | from phoreal.CalVal import perfect_classifier
21 |
22 | def loadTruthFile(truthFilePath, epsg_atl, rotationData, truthFileType):
23 |
24 |     # Initialize output
25 |     atlTruthData = []
26 |
27 |     # Determine which file type to load
28 |     if(('las' in truthFileType.lower()) or ('laz' in truthFileType.lower())):
29 |         # Load .las/.laz file
30 |         atlTruthData = loadLasFile(truthFilePath, epsg_atl, rotationData)
31 |     elif('tif' in truthFileType.lower()):
32 |         # Load .tif file
33 |         atlTruthData = loadTifFile(truthFilePath, epsg_atl, rotationData)
34 |
35 |     return atlTruthData
36 |
37 | def las_to_df(lasFilePath):
38 |
39 | las_file = laspy.read(lasFilePath)
40 | las_df = pd.DataFrame()
41 | # Store output from .las file
42 | las_df['x'] = np.array(las_file.x)
43 | las_df['y'] = np.array(las_file.y)
44 | las_df['z'] = np.array(las_file.z)
45 | las_df['classification'] = np.array(las_file.classification)
46 | las_df['intensity'] = np.array(las_file.intensity)
47 | las_df['date'] = las_file.header.creation_date
48 |
49 | return las_df
50 |
51 | def loadLasFile(truthFilePath, epsg_atl, rotationData, decimate_n = 3, epsg_truth = 'EPSG:0'):
52 |
53 | # Read .las file
54 | lasTruthData = las_to_df(truthFilePath)
55 | lasTruthData = lasTruthData.iloc[::decimate_n, :]
56 |
57 | # Find EPSG Code from truth file
58 | # truthHeader = readLasHeader(truthFilePath)
59 | las_file = laspy.read(truthFilePath)
    if epsg_truth in (0, 'EPSG:0'):
61 | epsg_truth = 'EPSG:' + str(las_file.header.parse_crs().to_epsg(min_confidence=1))
62 |
63 | # epsg_truth = truthHeader['epsg'][0]
64 |
65 | # Find EPSG Code from input file
66 | # epsg_atl = identifyEPSG(atlMeasuredData.hemi,atlMeasuredData.zone)
67 |
68 | # Reproject if necessary
    if(epsg_truth in ('None', 'EPSG:None')):
70 |
71 | print(' *WARNING: Invalid reference EPSG code, skipping file.')
72 | # atlTruthData = False
73 |
74 | else:
75 |
        if(epsg_truth != epsg_atl):
            # If EPSG code does not match, reproject to input EPSG code
            print('   *Reference file EPSG code does not match ICESat-2, reprojecting reference file...')
            lasTruthData['easting'], lasTruthData['northing'] = transform(epsg_truth, epsg_atl, np.array(lasTruthData.x), np.array(lasTruthData.y))
        else:
            # EPSG codes already match; use raw x/y as easting/northing
            lasTruthData['easting'], lasTruthData['northing'] = np.array(lasTruthData.x), np.array(lasTruthData.y)

        # Rotate TRUTH data to CT/AT plane
        lasTruthData['crosstrack'], lasTruthData['alongtrack'], _, _, _, _ = \
            getCoordRotFwd(np.array(lasTruthData.easting), np.array(lasTruthData.northing), rotationData.R_mat, rotationData.xRotPt, rotationData.yRotPt, rotationData.desiredAngle)
87 |
88 | # Get reference lat/lon
89 | lasTruthData['lon'], lasTruthData['lat'] = transform(epsg_truth, 'epsg:4326',
90 | np.array(lasTruthData.x),
91 | np.array(lasTruthData.y))
92 |
93 | return lasTruthData
94 |
95 | ### Function to load .tif file
96 | def loadTifFile(truthFilePath, epsg_atl, rotationData):
97 |
98 | # Read Tif file
99 | xarr0, yarr0, zarr, intensity, classification, epsg = formatDEM(truthFilePath)
100 |
101 | # Convert ints to floats
102 | xarr0 = xarr0.astype(float)
103 | yarr0 = yarr0.astype(float)
104 | zarr = zarr.astype(float)
105 |
106 | # Find EPSG Code from tif
107 | epsg_truth = 'epsg:' + epsg
108 |
109 | if isinstance(epsg_atl, str) and 'epsg:' not in epsg_atl:
110 | epsg_atl = 'epsg:' + epsg_atl
111 |
112 | # Determine if EPSG Code is the same for the ATL03 Measured
113 | # epsg_atl = identifyEPSG(atlMeasuredData.hemi,atlMeasuredData.zone)
114 |
115 | # Store values
116 | xarr = xarr0
117 | yarr = yarr0
118 |
119 | # Reproject if necessary
120 | if(epsg_truth == 'None'):
121 |
122 | print(' *WARNING: Invalid reference EPSG code, skipping file.')
123 | # atlTruthData = False
124 | tif_df = pd.DataFrame()
125 |
126 | else:
127 |
128 | if(epsg_truth != epsg_atl):
129 |
130 | # If EPSG code does not match, use GDAL Warp to create new Tif
131 | print(' *Reference file EPSG code does not match ICESat-2, reprojecting reference file...')
132 | xarr, yarr = transform(epsg_truth, epsg_atl, xarr0, yarr0)
133 |
134 | # Rotate Data for along-track/cross-track
135 | x_newRot, y_newRot, _, _, _, _ = getCoordRotFwd(xarr, yarr,
136 | rotationData.R_mat,
137 | rotationData.xRotPt,
138 | rotationData.yRotPt,
139 | rotationData.desiredAngle)
140 |
141 |         # Get reference lon/lat (transform returns them in x, y order)
142 |         lon, lat = transform(epsg_truth, 'epsg:4326', xarr, yarr)
143 |
144 |         # Store Data as Object
145 |         tif_df = pd.DataFrame()
146 |         tif_df['x'] = xarr
147 |         tif_df['y'] = yarr
148 |         tif_df['z'] = zarr
149 |         tif_df['lon'] = lon
150 |         tif_df['lat'] = lat
151 |         tif_df['crosstrack'] = x_newRot
152 |         tif_df['alongtrack'] = y_newRot
153 |         tif_df['classification'] = 2 # Treat every DEM cell as ground (class 2)
154 |         tif_df['date'] = datetime.datetime.now() # TODO: find actual date of tif
155 |
156 | return tif_df
157 |
158 | ### Function to find and reproject min/max extents in header data
159 | def reprojectHeaderData(truthHeaderDF, epsg_atl):
160 |
161 | # Copy dataframe
162 | truthHeaderNewDF = truthHeaderDF.copy()
163 |
164 | # Input EPSG code
165 | # epsg_atl = identifyEPSG(atlMeasuredData.hemi,atlMeasuredData.zone)
166 |
167 | # Loop through all truth file EPSG codes
168 | for i in range(0,len(truthHeaderDF)):
169 | # writeLog(str(i) + ' out of ' + str(len(truthHeaderDF)), logFileID)
170 | # Reproject EPSG code to match input EPSG if unmatching
171 | epsg_truth = truthHeaderDF['epsg'][i]
172 | # writeLog(epsg_truth, logFileID)
173 | # writeLog(epsg_atl, logFileID)
174 | if(epsg_truth!=epsg_atl):
175 |
176 | # Header extents
177 | xmin = truthHeaderDF['xmin'][i]
178 | xmax = truthHeaderDF['xmax'][i]
179 | ymin = truthHeaderDF['ymin'][i]
180 | ymax = truthHeaderDF['ymax'][i]
181 | x = [xmin, xmax]
182 | y = [ymin, ymax]
183 |
184 | # Reproject extents to input EPSG code
185 |             try:
186 |
187 |                 xout, yout = transform(epsg_truth, epsg_atl, x, y)
188 |
189 |                 # Store new extents into output dataframe
190 |                 # (.loc avoids pandas chained-assignment warnings)
191 |                 truthHeaderNewDF.loc[i, 'xmin'] = xout[0]
192 |                 truthHeaderNewDF.loc[i, 'xmax'] = xout[1]
193 |                 truthHeaderNewDF.loc[i, 'ymin'] = yout[0]
194 |                 truthHeaderNewDF.loc[i, 'ymax'] = yout[1]
195 |                 truthHeaderNewDF.loc[i, 'epsg'] = epsg_atl
196 |
197 |             except Exception:
198 |
199 |                 # Mark extents invalid so this file is skipped downstream
200 |                 print('WARNING: Cannot reproject data, skipping file: %s' % truthHeaderNewDF['fileName'][i])
201 |                 truthHeaderNewDF.loc[i, 'xmin'] = 'None'
202 |                 truthHeaderNewDF.loc[i, 'xmax'] = 'None'
203 |                 truthHeaderNewDF.loc[i, 'ymin'] = 'None'
204 |                 truthHeaderNewDF.loc[i, 'ymax'] = 'None'
205 |
206 | # endTry
207 |
208 |
209 | # endIf
210 | # endFor
211 |
212 | return truthHeaderNewDF
213 |
214 | ### Function to find which truth tiles ICESat-2 crosses over
215 | def findMatchingTruthFiles(truthHeaderNewDF, atlMeasuredData, rotationData, buffer):
216 |
217 | # Get MEASURED rotated buffer bounds data
218 | xRotL = atlMeasuredData.crosstrack - buffer
219 | xRotR = atlMeasuredData.crosstrack + buffer
220 | yRot = atlMeasuredData.alongtrack
221 |
222 | # Get MEASURED buffer bounds data in easting/northing plane
223 | xL, yL, _, _, _ = getCoordRotRev(xRotL, yRot, rotationData.R_mat, rotationData.xRotPt, rotationData.yRotPt)
224 | xR, yR, _, _, _ = getCoordRotRev(xRotR, yRot, rotationData.R_mat, rotationData.xRotPt, rotationData.yRotPt)
225 |
226 | # Rotate truth header file min/max x,y points to CT/AT plane
227 | xMinyMinHeaderRotX, _, _, _, _, _ = getCoordRotFwd(((truthHeaderNewDF['xmin']).to_numpy()).astype('float'),
228 | ((truthHeaderNewDF['ymin']).to_numpy()).astype('float'),
229 | rotationData.R_mat,
230 | rotationData.xRotPt,
231 | rotationData.yRotPt,
232 | rotationData.desiredAngle)
233 |
234 | xMinyMaxHeaderRotX, _, _, _, _, _ = getCoordRotFwd(((truthHeaderNewDF['xmin']).to_numpy()).astype('float'),
235 | ((truthHeaderNewDF['ymax']).to_numpy()).astype('float'),
236 | rotationData.R_mat,
237 | rotationData.xRotPt,
238 | rotationData.yRotPt,
239 | rotationData.desiredAngle)
240 | xMaxyMinHeaderRotX, _, _, _, _, _ = getCoordRotFwd(((truthHeaderNewDF['xmax']).to_numpy()).astype('float'),
241 | ((truthHeaderNewDF['ymin']).to_numpy()).astype('float'),
242 | rotationData.R_mat,
243 | rotationData.xRotPt,
244 | rotationData.yRotPt,
245 | rotationData.desiredAngle)
246 | xMaxyMaxHeaderRotX, _, _, _, _, _ = getCoordRotFwd(((truthHeaderNewDF['xmax']).to_numpy()).astype('float'),
247 | ((truthHeaderNewDF['ymax']).to_numpy()).astype('float'),
248 | rotationData.R_mat,
249 | rotationData.xRotPt,
250 | rotationData.yRotPt,
251 | rotationData.desiredAngle)
252 |
253 | # Find min/max x/y header points inside min/max x buffer points
254 | xMinyMinXPtsInBuffer = (xMinyMinHeaderRotX >= xRotL.min()) & \
255 | (xMinyMinHeaderRotX <= xRotR.max())
256 | xMinyMaxXPtsInBuffer = (xMinyMaxHeaderRotX >= xRotL.min()) & \
257 | (xMinyMaxHeaderRotX <= xRotR.max())
258 | xMaxyMinXPtsInBuffer = (xMaxyMinHeaderRotX >= xRotL.min()) & \
259 | (xMaxyMinHeaderRotX <= xRotR.max())
260 | xMaxyMaxXPtsInBuffer = (xMaxyMaxHeaderRotX >= xRotL.min()) & \
261 | (xMaxyMaxHeaderRotX <= xRotR.max())
262 |
263 | # Get header points inside MEASURED buffer points
264 |     xHeaderPtsInBuffer = np.c_[xMinyMinXPtsInBuffer | xMinyMaxXPtsInBuffer | xMaxyMinXPtsInBuffer | xMaxyMaxXPtsInBuffer]
265 |
266 | # Find min/max x buffer points inside min/max x/y header points
267 | xyMinMaxHeaderRot = np.column_stack((xMinyMinHeaderRotX,xMinyMaxHeaderRotX,xMaxyMinHeaderRotX,xMaxyMaxHeaderRotX))
268 | xyMinHeaderRot = np.c_[np.amin(xyMinMaxHeaderRot,axis=1)]
269 | xyMaxHeaderRot = np.c_[np.amax(xyMinMaxHeaderRot,axis=1)]
270 | xMinBufferPtsInFile = (xRotL.min() >= xyMinHeaderRot) & (xRotL.min() <= xyMaxHeaderRot)
271 | xMaxBufferPtsInFile = (xRotR.max() >= xyMinHeaderRot) & (xRotR.max() <= xyMaxHeaderRot)
272 |
273 | # Get MEASURED buffer points inside header points
274 | xBufferPtsInHeader = xMinBufferPtsInFile | xMaxBufferPtsInFile
275 |
276 | # Get any points where buffer points are inside header and vice versa
277 | xPtsInFile = np.ravel(np.logical_or(xHeaderPtsInBuffer, xBufferPtsInHeader))
278 |
279 | # Get matching truth file names and x/y min/max points
280 | matchingFilesPre = ((truthHeaderNewDF['fileName'][xPtsInFile]).to_numpy()).astype('str')
281 | matchingHeaderXmin = ((truthHeaderNewDF['xmin'][xPtsInFile]).to_numpy()).astype('float')
282 | matchingHeaderXmax = ((truthHeaderNewDF['xmax'][xPtsInFile]).to_numpy()).astype('float')
283 | matchingHeaderYmin = ((truthHeaderNewDF['ymin'][xPtsInFile]).to_numpy()).astype('float')
284 | matchingHeaderYmax = ((truthHeaderNewDF['ymax'][xPtsInFile]).to_numpy()).astype('float')
285 |
286 | fileNumsAll = np.arange(0,len(truthHeaderNewDF))
287 | allFileInds = fileNumsAll[xPtsInFile]
288 |
289 | # Get all MEASURED x,y buffer points
290 | xAll = np.concatenate((xL, xR))
291 | yAll = np.concatenate((yL, yR))
292 |
293 | matchTF = []
294 | if(len(matchingFilesPre)>0):
295 | for i in range(0,len(matchingFilesPre)):
296 |
297 | # Determine TRUTH files where MEASURED data actually crosses over
298 | xPtsInFile = (xAll >= matchingHeaderXmin[i]) & (xAll <= matchingHeaderXmax[i])
299 | yPtsInFile = (yAll >= matchingHeaderYmin[i]) & (yAll <= matchingHeaderYmax[i])
300 | anyPtsInFile = any(xPtsInFile & yPtsInFile)
301 |
302 | # If a TRUTH file is a match, use it
303 | if(anyPtsInFile):
304 | matchTF.append(True)
305 | else:
306 | matchTF.append(False)
307 | # endIf
308 |
309 | # endFor
310 | # endIf
311 |
312 | if(len(matchTF)>0):
313 | matchTF = np.array(matchTF)
314 | matchingFiles = matchingFilesPre[matchTF]
315 | else:
316 | matchingFiles = []
317 | # endIf
318 |
319 | matchingFileInds = allFileInds[matchTF]
320 |
321 | return matchingFiles, matchingFileInds
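The final loop in `findMatchingTruthFiles` keeps a tile only when some buffered ICESat-2 easting/northing point falls inside that tile's header extents. A hedged, self-contained sketch of that extent test (tile names and coordinates are made up for illustration):

```python
import numpy as np

# Buffered ICESat-2 easting/northing points (synthetic)
xAll = np.array([500100.0, 500200.0, 500300.0])
yAll = np.array([4100050.0, 4100150.0, 4100250.0])
# Header extents per tile: (xmin, xmax, ymin, ymax)
tiles = {
    'tile_a': (500000.0, 500250.0, 4100000.0, 4100200.0),
    'tile_b': (600000.0, 600500.0, 4200000.0, 4200500.0),
}

matching = []
for name, (xmin, xmax, ymin, ymax) in tiles.items():
    xPtsInFile = (xAll >= xmin) & (xAll <= xmax)
    yPtsInFile = (yAll >= ymin) & (yAll <= ymax)
    if np.any(xPtsInFile & yPtsInFile):   # any buffered point inside the tile
        matching.append(name)

# matching -> ['tile_a']
```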
322 |
323 | def make_buffer(atl03, truth_df, buffer):
324 |
325 | # Index Matching
326 | indices = indexMatch(atl03.df.alongtrack, truth_df.alongtrack)
327 |
328 | # Filter indices
329 | indices[indices >= len(atl03.df.crosstrack)] = (len(atl03.df.crosstrack) - 1)
330 | x_check = atl03.df.crosstrack[indices]
331 | x_diff = np.array(truth_df.crosstrack) - np.array(x_check)
332 | filter_data = np.where((x_diff < buffer) & (x_diff > -buffer))[0]
333 |
334 | # TODO: iloc is slow, this could be sped up by separating into PD array
335 | return truth_df.iloc[filter_data]
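The cross-track filter in `make_buffer` can be sketched on synthetic arrays. Here `np.searchsorted` stands in for phoreal's `indexMatch` (assumed to pair each reference point with a nearby ICESat-2 photon by along-track distance), so this is an approximation of the real index matching:

```python
import numpy as np

atl_alongtrack = np.array([0.0, 10.0, 20.0, 30.0])
atl_crosstrack = np.array([0.0, 1.0, 2.0, 3.0])
truth_alongtrack = np.array([5.0, 25.0])
truth_crosstrack = np.array([0.5, 50.0])   # second point is far off-track

# Pair each reference point with a nearby photon (searchsorted as a stand-in)
indices = np.searchsorted(atl_alongtrack, truth_alongtrack)
indices[indices >= len(atl_crosstrack)] = len(atl_crosstrack) - 1

# Keep reference points within +/- buffer of the paired photon's cross-track
x_diff = truth_crosstrack - atl_crosstrack[indices]
buffer = 10.0
keep = np.where((x_diff < buffer) & (x_diff > -buffer))[0]
# keep -> [0]: only the first reference point is inside the buffer
```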
336 |
337 | # Unit test
338 | if __name__ == "__main__":
339 | pass
340 |
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | numpy
2 | h5py
3 | scipy
4 | pandas
5 | rasterio
6 | fiona ~= 1.8.13
7 | simplekml
8 | laspy
9 | matplotlib
10 | pyproj
11 |
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | import setuptools
2 |
3 | with open("README.md", "r", encoding="utf-8") as fh:
4 | long_description = fh.read()
5 |
6 | requirements = [
7 | 'laspy',
8 | 'fiona ~= 1.8.3',
9 | 'rasterio',
10 | 'scipy',
11 | 'h5py',
12 | 'pandas',
13 | 'numpy',
14 | 'simplekml',
15 | 'pyproj',
16 | 'matplotlib']
17 |
18 | setuptools.setup(
19 | name="phoreal",
20 | version = "3.31.1",
21 | author = "University of Texas CSR",
22 | author_email = "phoreal@csr.utexas.edu",
    license = 'UT AUSTIN RESEARCH LICENSE',
    description = "A Python package for reading and analyzing ICESat-2 ATL03/ATL08 data",
25 | long_description = long_description,
26 | long_description_content_type = "text/markdown",
27 | url = "https://github.com/icesat-2UT/PhoREAL",
28 | project_urls ={
29 | "Bug Tracker": "https://github.com/icesat-2UT/PhoREAL/issues",
30 | },
31 | classifiers=[
32 | "Programming Language :: Python :: 3",
        "Operating System :: OS Independent",
34 | ],
35 | packages=['phoreal'],
36 | python_requires=">=3.6",
37 | include_package_data=True,
38 | install_requires=requirements,
39 | )
40 |
--------------------------------------------------------------------------------
|