├── .gitignore ├── Data └── NESOSIM_IS2_100kmv56-15082018-30042019.nc ├── Images ├── ICESat-2_granules_polar.png ├── Surface_classification.png ├── icesat2_profiling.png └── icesat2_seaice.png ├── Notebooks ├── .gitignore ├── .ipynb_checkpoints │ ├── 1. ATL03-checkpoint.ipynb │ ├── 2. ATL07-checkpoint.ipynb │ ├── 3. ATL10-checkpoint.ipynb │ ├── convert_GPS_time-checkpoint.py │ ├── convert_julian-checkpoint.py │ ├── hacking-checkpoint.ipynb │ ├── readers-checkpoint.py │ └── utils-checkpoint.py ├── ATL03.ipynb ├── ATL07.ipynb ├── ATL10.ipynb ├── DataAccess.ipynb ├── GriddingDemo.ipynb ├── __pycache__ │ ├── convert_GPS_time.cpython-36.pyc │ ├── convert_julian.cpython-36.pyc │ ├── readers.cpython-36.pyc │ ├── readers.cpython-37.pyc │ ├── utils.cpython-36.pyc │ └── utils.cpython-37.pyc ├── readers.py └── utils.py ├── README.md └── icesat2_seaice.png /.gitignore: -------------------------------------------------------------------------------- 1 | .ipynb_checkpoints 2 | 3 | -------------------------------------------------------------------------------- /Data/NESOSIM_IS2_100kmv56-15082018-30042019.nc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Data/NESOSIM_IS2_100kmv56-15082018-30042019.nc -------------------------------------------------------------------------------- /Images/ICESat-2_granules_polar.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Images/ICESat-2_granules_polar.png -------------------------------------------------------------------------------- /Images/Surface_classification.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Images/Surface_classification.png -------------------------------------------------------------------------------- /Images/icesat2_profiling.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Images/icesat2_profiling.png -------------------------------------------------------------------------------- /Images/icesat2_seaice.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Images/icesat2_seaice.png -------------------------------------------------------------------------------- /Notebooks/.gitignore: -------------------------------------------------------------------------------- 1 | !.gitignore 2 | -------------------------------------------------------------------------------- /Notebooks/.ipynb_checkpoints/2. ATL07-checkpoint.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# Exploring sea ice heights with ICESat-2 (ATL07)\n", 8 | "\n", 9 | "Information obtained primarily from the ATL07/10 Algorithm Theoretical Basis Document (ATBD, Kwok et al., 2019) and the NSIDC product description page: https://nsidc.org/data/atl07. \n", 10 | "\n", 11 | "* Notebook author: Alek Petty, relying extensively on above product manuals. \n", 12 | "* Description: Notebook describing the ICESat-2 ATL07 product. 
\n", 13 | "* Input requirements: Demo ATL07 data file \n", 14 | "* Date: June 2019\n", 15 | "* More info: See the ATL07/ATL10 Algorithm Theoretical Basis Document (ATBD): https://icesat-2.gsfc.nasa.gov/sites/default/files/page_files/ICESat2_ATL07_ATL10_ATBD_r001.pdf and the known issues document: https://nsidc.org/sites/nsidc.org/files/technical-references/ATL0710-KnownIssues.pdf\n", 16 | "\n", 17 | "\n", 18 | "## Notebook objectives\n", 19 | "* General understanding of the data included in a typical ATL07 file.\n", 20 | "* Reading in, plotting and basic analysis of ATL07 data.\n", 21 | "* How is ATL07 data used to generate ATL10 sea ice freeboards and what to look out for when using either product.\n" 22 | ] 23 | }, 24 | { 25 | "cell_type": "markdown", 26 | "metadata": {}, 27 | "source": [ 28 | "## Notebook instructions\n", 29 | "1. Follow along with the notebook tutorial. \n", 30 | "2. Play around changing options and re-running the relevant notebook cells. \n", 31 | "\n", 32 | "Here I use the HDF5 ATL07 file (ATL07-01_20181115003141_07240101_001_01.h5) from: https://nsidc.org/data/atl07. If using this using the ICESat-2 Pangeo instance, you can download the file from Amazon S3 using the notebook cell provided below.\n" 33 | ] 34 | }, 35 | { 36 | "cell_type": "markdown", 37 | "metadata": {}, 38 | "source": [ 39 | "## ATL07 Background\n", 40 | "\n", 41 | "*NB: This is my laymans description of ATL07 compiled from the above resources, trying to condense this information to the key points of interest to potential sea ice users. let me know if you see any errors!*\n", 42 | "\n", 43 | "ATL07 is arguably the most important IS2 product for sea ice users. ATL07 provides along-track surface height and type (e.g. snow-covered ice, bare ice, open water) for the ice-covered seas of the northern and southern hemispheres. The primary input is the individual photon height estimates from the Level 2A Global Geolocated Photon Data (ATL03) product. 
Sea surface and sea ice height are estimated for segments along each of the six beams. \n", 44 | "\n", 45 | "The mean sea surface (MSS) derived from ICESat/CryoSat-2, the inverted barometric (IB) correction calculated from ATL09 surface pressure, and ocean tides are subtracted from the ATL03 surface heights (relative to WGS84). \n", 46 | "\n", 47 | "A two-step filtering process is then used to derive heights within a given segment (along-track series) of photons. First, a coarse surface filtering method is employed to remove obviously erroneous returns from background, subsurface, and cloud effects. Second, a fine surface filtering is applied which analyzes 150-photon segments to derive the mean height. A first-photon bias estimate, based on system engineering, is included in ATL07 with each height estimate. Subsurface scattering, or volume scattering, is a bias that comes from photons that experience multiple scattering within the snow or ice before returning to the satellite. This is not included in ATL07 but is something worth considering when using the data.\n", 48 | "\n", 49 | "Each 150-photon segment is then classified by surface type based on a decision-tree approach that determines the most likely surface type from three primary variables: the surface photon rate, the width of the photon distribution (or the fitted Gaussian) and the background rate. \n", 50 | "\n", 51 | "* The surface photon rate (photon returns per pulse) is a measure of the brightness, or apparent surface reflectance, of that height segment. In general, low surface rates indicate water or thin ice in open leads. \n", 52 | "* The width of the photon distribution provides a measure of the surface roughness and can be used to partition the height segments within four ranges that correspond to different surface types.\n", 53 | "* The background rate provides useful additional information when the solar elevation is high and sufficient photons are present. 
For example, a lack of relationship between sun angle and background photon rate can indicate shadows (cloud shadows or ridge shadows), specular returns, or possibly atmospheric effects. \n", 54 | "\n", 55 | "Much of this approach was based on data collected by NASA's MABEL (Multiple Altimeter Beam Experimental Lidar) photon-counting lidar prior to the ICESat-2 launch. Calibration/validation of the ATL07 data product is underway (Operation IceBridge spring 2019 data shortly to be made available).\n", 56 | "\n", 57 | "Below is an example of how one can read in and explore a given ATL07 file. Very open to suggestions on possible improvements!" 58 | ] 59 | }, 60 | { 61 | "cell_type": "code", 62 | "execution_count": 1, 63 | "metadata": {}, 64 | "outputs": [], 65 | "source": [ 66 | "# Magic function to enable interactive plotting in Jupyter notebook\n", 67 | "# Allows you to zoom/pan within plots after generating\n", 68 | "# Normally this would be %matplotlib notebook, but since we're using JupyterLab, we need a different widget\n", 69 | "#%matplotlib notebook\n", 70 | "%matplotlib inline" 71 | ] 72 | }, 73 | { 74 | "cell_type": "code", 75 | "execution_count": 2, 76 | "metadata": {}, 77 | "outputs": [], 78 | "source": [ 79 | "import warnings\n", 80 | "warnings.filterwarnings('ignore')\n", 81 | "# Import necessary modules\n", 82 | "# Use shorter names (np, pd, plt) instead of full (numpy, pandas, matplotlib.pyplot) for convenience\n", 83 | "import numpy as np\n", 84 | "import pandas as pd\n", 85 | "import matplotlib.pyplot as plt\n", 86 | "import cartopy.crs as ccrs\n", 88 | "import numpy.ma as ma\n", 89 | "import h5py\n", 90 | "import s3fs\n", 91 | "import readers as rd\n", 92 | "import utils as ut\n", 93 | "import xarray as xr\n", 94 | "from astropy.time import Time\n", 95 | "\n", 96 | "# Use seaborn for nicer looking inline plots if available \n", 97 | "#import seaborn as sns\n", 98 | "#sns.set(context='notebook', style='darkgrid')\n", 99 | 
"#st = axes_style(\"whitegrid\")\n" 100 | ] 101 | }, 102 | { 103 | "cell_type": "markdown", 104 | "metadata": {}, 105 | "source": [ 106 | "#### Beam selection \n", 107 | "There are 6 beams to choose from in the ICESat-2 products (3 pairs of a strong and weak beam). The energy ratio between the weak and strong beams are approximately 1:4 and are separated by 90 m in the across-track direction. The beam pairs are separated by ~3.3 km in the across-track direction, and the strong and weak beams are separated by ~2.5 km in the along-track direction." 108 | ] 109 | }, 110 | { 111 | "cell_type": "code", 112 | "execution_count": 3, 113 | "metadata": {}, 114 | "outputs": [], 115 | "source": [ 116 | "beamNum=1" 117 | ] 118 | }, 119 | { 120 | "cell_type": "code", 121 | "execution_count": 4, 122 | "metadata": {}, 123 | "outputs": [], 124 | "source": [ 125 | "# If file stored locally...\n", 126 | "#file_path = '../Data/'\n", 127 | "#ATL07_filename = 'ATL07-01_20181115003141_07240101_001_01.h5'\n", 128 | "#localPath = file_path + ATL07_filename" 129 | ] 130 | }, 131 | { 132 | "cell_type": "code", 133 | "execution_count": 11, 134 | "metadata": {}, 135 | "outputs": [], 136 | "source": [ 137 | "# If running on Pangeo instance and grabbing data from Amazon S3\n", 138 | "# Comment out the last command if you've already got the data in the Data dir\n", 139 | "bucket = 'pangeo-data-upload-oregon'\n", 140 | "fs = s3fs.S3FileSystem()\n", 141 | "dataDir = 'pangeo-data-upload-oregon/icesat2/'\n", 142 | "s3List = fs.ls(dataDir)\n", 143 | "#print(s3List)\n", 144 | "ATL07file='ATL07-01_20181115003141_07240101_001_01.h5'\n", 145 | "s3File='pangeo-data-upload-oregon/icesat2/'+ATL07file\n", 146 | "localFilePath='../Data/'+ATL07file\n", 147 | "#fs.get(s3File, localFilePath)" 148 | ] 149 | }, 150 | { 151 | "cell_type": "code", 152 | "execution_count": 8, 153 | "metadata": {}, 154 | "outputs": [], 155 | "source": [ 156 | "def getATL07data(fileT, numpy=False, beamNum=1, maxElev=1e6):\n", 157 | " 
\"\"\" Pandas/numpy ATL07 reader\n", 158 | " Written by Alek Petty, June 2018 (alek.a.petty@nasa.gov)\n", 159 | "\n", 160 | " I've picked out the variables from ATL07 I think are of most interest to sea ice users, but by no means is this an exhastive list. \n", 161 | " See the xarray or dictionary readers to load in the more complete ATL07 dataset\n", 162 | " or explore the hdf5 files themselves (I like using the app Panpoly for this) to see what else you might want\n", 163 | " \n", 164 | " Args:\n", 165 | " fileT (str): File path of the ATL07 dataset\n", 166 | " numpy (flag): Binary flag for outputting numpy arrays (True) or pandas dataframe (False)\n", 167 | " beamNum (int): ICESat-2 beam number (1 to 6)\n", 168 | " maxElev (float): maximum surface elevation to remove anomalies\n", 169 | "\n", 170 | " returns:\n", 171 | " either: select numpy arrays or a pandas dataframe\n", 172 | " \n", 173 | " Updates:\n", 174 | " V3 (June 2018) added observatory orientation flag, read in the beam number, not the string\n", 175 | " V2 (June 2018) used astropy to more simply generate a datetime instance form the gps time\n", 176 | "\n", 177 | " \"\"\"\n", 178 | " \n", 179 | " # Open the file\n", 180 | " try:\n", 181 | " ATL07 = h5py.File(fileT, 'r')\n", 182 | " except:\n", 183 | " return 'Not a valid file'\n", 184 | " \n", 185 | " #flag_values: 0, 1, 2; flag_meanings : backward forward transition\n", 186 | " orientation_flag=ATL07['orbit_info']['sc_orient'][:]\n", 187 | " \n", 188 | " if (orientation_flag==0):\n", 189 | " print('Backward orientation')\n", 190 | " beamStrs=['gt1l', 'gt1r', 'gt2l', 'gt2r', 'gt3l', 'gt3r']\n", 191 | " \n", 192 | " elif (orientation_flag==1):\n", 193 | " print('Forward orientation')\n", 194 | " beamStrs=['gt3r', 'gt3l', 'gt2r', 'gt2l', 'gt1r', 'gt1l']\n", 195 | " \n", 196 | " elif (orientation_flag==2):\n", 197 | " print('Transitioning, do not use for science!')\n", 198 | " \n", 199 | " beamStr=beamStrs[beamNum-1]\n", 200 | " print(beamStr)\n", 201 
| " \n", 202 | " lons=ATL07[beamStr+'/sea_ice_segments/longitude'][:]\n", 203 | " lats=ATL07[beamStr+'/sea_ice_segments/latitude'][:]\n", 204 | " \n", 205 | " # Along track distance \n", 206 | " # I removed the first point so it's distance relative to the start of the beam\n", 207 | " along_track_distance=ATL07[beamStr+'/sea_ice_segments/seg_dist_x'][:] - ATL07[beamStr+'/sea_ice_segments/seg_dist_x'][0]\n", 208 | " # Height segment ID (10 km segments)\n", 209 | " height_segment_id=ATL07[beamStr+'/sea_ice_segments/height_segment_id'][:] \n", 210 | " # Number of seconds since the GPS epoch on midnight Jan. 6, 1980 \n", 211 | " delta_time=ATL07[beamStr+'/sea_ice_segments/delta_time'][:] \n", 212 | " # Add this value to delta time parameters to compute full gps time\n", 213 | " atlas_epoch=ATL07['/ancillary_data/atlas_sdp_gps_epoch'][:] \n", 214 | "\n", 215 | " leapSecondsOffset=37\n", 216 | " gps_seconds = atlas_epoch[0] + delta_time - leapSecondsOffset\n", 217 | " # Use astropy to convert from gps time to datetime\n", 218 | " tgps = Time(gps_seconds, format='gps')\n", 219 | " tiso = Time(tgps, format='datetime')\n", 220 | " \n", 221 | " # Primary variables of interest\n", 222 | " \n", 223 | " # Beam segment height\n", 224 | " elev=ATL07[beamStr+'/sea_ice_segments/heights/height_segment_height'][:]\n", 225 | " # Flag for potential leads, 0=sea ice, 1 = sea surface\n", 226 | " ssh_flag=ATL07[beamStr+'/sea_ice_segments/heights/height_segment_ssh_flag'][:] \n", 227 | " \n", 228 | " #Quality metrics for each segment include confidence level in the surface height estimate, \n", 229 | " # which is based on the number of photons, the background noise rate, and the error measure provided by the surface-finding algorithm.\n", 230 | " # Height quality flag, 1 for good fit, 0 for bad\n", 231 | " quality=ATL07[beamStr+'/sea_ice_segments/heights/height_segment_quality'][:] \n", 232 | " \n", 233 | " elev_rms = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_rms'][:] #RMS 
difference between modeled and observed photon height distribution\n", 234 | " seg_length = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_length_seg'][:] # Along track length of segment\n", 235 | " height_confidence = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_confidence'][:] # Height segment confidence flag\n", 236 | " reflectance = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_asr_calc'][:] # Apparent surface reflectance\n", 237 | " ssh_flag = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_ssh_flag'][:] # Flag for potential leads, 0=sea ice, 1 = sea surface\n", 238 | " seg_type = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_type'][:] # 0 = Cloud covered\n", 239 | " gauss_width = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_w_gaussian'][:] # Width of Gaussian fit\n", 240 | "\n", 241 | " # Geophysical corrections\n", 242 | " # NOTE: All of these corrections except ocean tides, DAC, \n", 243 | " # and geoid undulations were applied to the ATL03 photon heights.\n", 244 | " \n", 245 | " # AVISO dynamic Atmospheric Correction (DAC) including inverted barometer (IB) effect (±5cm)\n", 246 | " dac = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_dac'][:] \n", 247 | " # Solid Earth Tides (±40 cm, max)\n", 248 | " earth = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_earth'][:]\n", 249 | " # Geoid (-105 to +90 m, max)\n", 250 | " geoid = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_geoid'][:] \n", 251 | " # Local displacement due to Ocean Loading (-6 to 0 cm)\n", 252 | " loadTide = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_load'][:] \n", 253 | " # Ocean Tides including diurnal and semi-diurnal (harmonic analysis), \n", 254 | " # and longer period tides (dynamic and self-consistent equilibrium) (±5 m)\n", 255 | " oceanTide = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_ocean'][:]\n", 256 | " # Deformation due to centrifugal effect from 
small variations in polar motion \n", 257 | "    # (Solid Earth Pole Tide) (±1.5 cm, the ocean pole tide ±2mm amplitude is considered negligible)\n", 258 | "    poleTide = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_pole'][:] \n", 259 | "    # Mean sea surface (±2 m)\n", 260 | "    # (Taken from ICESat and CryoSat-2, see Kwok and Morison [2015])\n", 261 | "    mss = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_mss'][:]\n", 262 | "    \n", 263 | "    # Photon rate of the given segment\n", 264 | "    photon_rate = ATL07[beamStr+'/sea_ice_segments/stats/photon_rate'][:]\n", 265 | "    \n", 266 | "    # Estimated background rate from sun angle, reflectance, surface slope\n", 267 | "    background_rate = ATL07[beamStr+'/sea_ice_segments/stats/backgr_calc'][:]\n", 268 | "    \n", 271 | "    ATL07.close()\n", 272 | "    \n", 273 | "    if numpy:\n", 274 | "        # list the variables you want to output here..\n", 275 | "        return along_track_distance, elev\n", 276 | "    \n", 277 | "    else:\n", 278 | "        dF = pd.DataFrame({'elev':elev, 'lons':lons, 'lats':lats, 'ssh_flag':ssh_flag,\n", 279 | "                           'quality_flag':quality,\n", 280 | "                           'delta_time':delta_time,\n", 281 | "                           'along_track_distance':along_track_distance,\n", 282 | "                           'height_segment_id':height_segment_id, \n", 283 | "                           'photon_rate':photon_rate,'background_rate':background_rate,\n", 284 | "                           'datetime':tiso, 'mss': mss, 'seg_length':seg_length})\n", 285 | "        \n", 286 | "        # Add the datetime string\n", 287 | "        #dFtimepd=pd.to_datetime(dFtime)\n", 288 | "        #dF['datetime'] = pd.Series(dFtimepd, index=dF.index)\n", 289 | "        \n", 290 | "        # Filter out high elevation values \n", 291 | "        dF = dF[(dF['elev']<maxElev)]\n", 292 | "        dF = dF.reset_index(drop=True)\n", 293 | "        return dF" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": 10, 299 | "metadata": {}, 300 | "outputs": [ 301 | { 302 | "data": { 440 | "text/plain": [ 441 | "       elev        lons       lats  ssh_flag  quality_flag    delta_time  \\\n", 442 | "0 -0.151697 -168.187226  73.235994         1             0  2.747825e+07   \n", 443 | "1 -0.153673 -168.187239  73.236023         1             0  2.747825e+07   \n", 444 | "2 -0.160038 -168.187248  73.236042         0             0  2.747825e+07   \n", 445 | "3 -0.167672 -168.187258  73.236063         0             0  2.747825e+07   \n", 446 | "4 -0.059221 -168.187298  73.236147         0             0  2.747825e+07   \n", 447 | "\n", 448 | "   along_track_distance  height_segment_id  photon_rate  background_rate  \\\n", 449 | "0           2568.670525                 10     0.780220              0.0   \n", 450 | "1           2571.852364                 11     1.516129              0.0   \n", 451 | "2           2574.083009                 12     2.854167              0.0   \n", 452 | "3           2576.349800                 13     5.071429              0.0   \n", 453 | "4           2585.810232                 14     2.607843              0.0   \n", 454 | "\n", 455 | "                    datetime       mss  seg_length  \n", 456 | "0 2018-11-15 00:50:49.971690 -0.149128  121.853638  \n", 457 | "1 2018-11-15 00:50:49.972142 -0.149039   59.295998  \n", 458 | "2 2018-11-15 00:50:49.972459 -0.148990   33.133297  \n", 459 | "3 2018-11-15 00:50:49.972780 -0.148929   19.062412  \n", 460 | "4 2018-11-15 00:50:49.974118 -0.148674   35.393513  " 461 | ] 462 | }, 463 | "execution_count": 10, 464 | "metadata": {}, 465 | "output_type": "execute_result" 466 | } 467 | ], 468 | "source": [ 469 | "dF07=getATL07data(localFilePath, beamNum=beamNum)\n", 470 | "dF07.head(5)\n", 471 | "\n", 472 | "# Or get data using xarray\n", 473 | "#dF07X= rd.getATL07xarray(localFilePath, beamStr)\n", 474 | "#dF07X\n", 475 | "\n", 476 | "# ...Or get data using numpy\n", 477 | "#along_track_distance, elev=getATL07data(localFilePath, numpy=True, beamNum=beamNum)" 478 | ] 479 | }, 480 | { 481 | "cell_type": "markdown", 482 | "metadata": {}, 483 | "source": [ 484 | "#### Map the data for visual inspection using Cartopy \n", 485 | "*NB (Basemap is often used for mapping but is not being officially supported by the community anymore)*\n" 486 | ] 487 | }, 488 | { 489 | "cell_type": "code", 490 | "execution_count": null, 491 | "metadata": {}, 492 | "outputs": [], 493 | "source": [ 494 | "# 
Select variable of interest from the dataframe columns\n", 495 | "var='mss'\n", 497 | "plt.figure(figsize=(7,7), dpi=90)\n", 498 | "# Make a new projection \"NorthPolarStereo\"\n", 499 | "ax = plt.axes(projection=ccrs.NorthPolarStereo(true_scale_latitude=70))\n", 500 | "plt.scatter(dF07['lons'], dF07['lats'],c=dF07[var], cmap='viridis', transform=ccrs.PlateCarree())\n", 501 | "#plt.pcolormesh(lons, lats, tile_to_plot,\n", 502 | "#               transform=ccrs.PlateCarree());\n", 503 | "\n", 504 | "ax.coastlines()\n", 505 | "#ax.drawmeridians()\n", 506 | "plt.colorbar(label=var, shrink=0.5, extend='both')\n", 507 | "\n", 508 | "# Limit the map to 60 degrees latitude and above.\n", 509 | "ax.set_extent([-180, 180, 90, 60], ccrs.PlateCarree())" 510 | ] 511 | }, 512 | { 513 | "cell_type": "code", 514 | "execution_count": null, 515 | "metadata": {}, 516 | "outputs": [], 517 | "source": [ 518 | "# Get NESOSIM data, which has variables including daily ice concentration from the CDR dataset...\n", 519 | "\n", 520 | "dateStr='20181115'\n", 521 | "NESOSIMfilePath='NESOSIM-OSISAFsig150_ERAI_sf_SICCDR_Rhovariable_IC3_DYN1_WP1_LL1_WPF5.8e-07_WPT5_LLF2.9e-07-100kmnrt3-15082018-31012019.nc'\n", 522 | "localFilePath='../Data/'\n", 523 | "fs = s3fs.S3FileSystem()\n", 524 | "dataDir = 'pangeo-data-upload-oregon/icesat2/'\n", 525 | "#fs.get(dataDir+NESOSIMfilePath, localFilePath+NESOSIMfilePath)\n", 526 | "dNday= ut.getNESOSIM(localFilePath+NESOSIMfilePath, dateStr)\n" 527 | ] 528 | }, 529 | { 530 | "cell_type": "code", 531 | "execution_count": null, 532 | "metadata": {}, 533 | "outputs": [], 534 | "source": [ 535 | "\n", 536 | "# Plot with ice concentration (or change to snowDepth) as a background\n", 537 | "\n", 538 | "# Select variable of interest from the dataframe columns\n", 539 | "var='photon_rate'\n", 541 | "plt.figure(figsize=(7,7), dpi=90)\n", 542 | "# Make a new projection \"NorthPolarStereo\"\n", 543 | "ax = 
plt.axes(projection=ccrs.NorthPolarStereo(true_scale_latitude=70))\n", 544 | "plt.pcolormesh(dNday['longitude'], dNday['latitude'],ma.masked_where(dNday['iceConc']<0.3, dNday['iceConc']) , cmap='Blues_r', transform=ccrs.PlateCarree())\n", 545 | "\n", 546 | "plt.scatter(dF07['lons'], dF07['lats'],c=dF07[var], cmap='viridis', transform=ccrs.PlateCarree())\n", 547 | "#plt.pcolormesh(lons, lats, tile_to_plot,\n", 548 | "#               transform=ccrs.PlateCarree());\n", 549 | "\n", 550 | "ax.coastlines()\n", 551 | "#ax.drawmeridians()\n", 552 | "plt.colorbar(label=var, shrink=0.5, extend='both')\n", 553 | "\n", 554 | "# Limit the map to 60 degrees latitude and above.\n", 555 | "ax.set_extent([-180, 180, 90, 60], ccrs.PlateCarree())\n", 556 | "\n", 557 | "\n" 558 | ] 559 | }, 560 | { 561 | "cell_type": "markdown", 562 | "metadata": {}, 563 | "source": [ 564 | "#### Plot the segment heights of this section" 565 | ] 566 | }, 567 | { 568 | "cell_type": "code", 569 | "execution_count": null, 570 | "metadata": { 571 | "scrolled": true 572 | }, 573 | "outputs": [], 574 | "source": [ 575 | "plt.figure(figsize=(12, 5))\n", 576 | "plt.plot((dF07.along_track_distance)/1000., dF07.elev, color='r', marker='.', linestyle='None', alpha=0.2)\n", 577 | "plt.xlabel('Along track distance (km)')\n", 578 | "plt.ylabel('Elevation (m)')\n", 579 | "plt.show()" 580 | ] 581 | }, 582 | { 583 | "cell_type": "markdown", 584 | "metadata": {}, 585 | "source": [ 586 | "#### Use the Pandas Groupby function to group the dataframe based on a given condition \n", 587 | "*surface type classification in this example..*" 588 | ] 589 | }, 590 | { 591 | "cell_type": "code", 592 | "execution_count": null, 593 | "metadata": {}, 594 | "outputs": [], 595 | "source": [ 596 | "dFstype=dF07.groupby('ssh_flag')\n", 597 | "dFstype['elev'].agg(['mean', 'std', 'median', 'mad'])\n", 598 | "# Ice surface photons\n", 599 | "dFstypeIce=dFstype.get_group(0)\n", 600 | "# Lead/Sea surface photons\n", 601 | 
"dFstypeLeads=dFstype.get_group(1)\n", 602 | "\n", 603 | "# Note that in the ATL03 example I don't bother doing this \n", 604 | "# and just keep all the data in the table and just use a condition to display data \n", 605 | "# like the example commented out in the next cell..\n", 606 | "\n" 607 | ] 608 | }, 609 | { 610 | "cell_type": "code", 611 | "execution_count": null, 612 | "metadata": {}, 613 | "outputs": [], 614 | "source": [ 615 | "# Plot the grouped/classified data\n", 616 | "plt.figure(figsize=(12, 5))\n", 617 | "plt.plot((dFstypeIce.along_track_distance)/1000., dFstypeIce.elev, color='m', marker='.', linestyle='None', label='Sea ice', alpha=0.2)\n", 618 | "plt.plot((dFstypeLeads.along_track_distance)/1000., dFstypeLeads.elev, color='k', marker='x', linestyle='None',label='Leads', alpha=1.)\n", 619 | "plt.legend(frameon=False)\n", 620 | "plt.xlabel('Along track distance (km)')\n", 621 | "plt.ylabel('Elevation (m)')\n", 622 | "plt.show()" 623 | ] 624 | }, 625 | { 626 | "cell_type": "code", 627 | "execution_count": null, 628 | "metadata": {}, 629 | "outputs": [], 630 | "source": [ 631 | "# An alternative (more intuitive?) 
approach...\n", 632 | "\n", 633 | "#plt.figure(figsize=(12, 5))\n", 634 | "#plt.plot((dF07[(dF07['ssh_flag']>0)]['along_track_distance']-dF07['along_track_distance'][0])/1000., dF07[(dF07['ssh_flag']>0)]['elev'], color='k', marker='x', linestyle='None',label='sea surface', alpha=1)\n", 635 | "#plt.plot((dF07[(dF07['ssh_flag']==0)]['along_track_distance']-dF07['along_track_distance'][0])/1000., dF07[(dF07['ssh_flag']==0)]['elev'], color='b', marker='.', linestyle='None',label='ice', alpha=0.3)\n", 636 | "#plt.legend(frameon=False)\n", 637 | "#plt.xlabel('Along track distance (km)')\n", 638 | "#plt.ylabel('Elevation (m)')\n", 639 | "#plt.show()" 640 | ] 641 | }, 642 | { 643 | "cell_type": "markdown", 644 | "metadata": {}, 645 | "source": [ 646 | "#### Let's simplify things by just looking at a 10 km along-track section" 647 | ] 648 | }, 649 | { 650 | "cell_type": "code", 651 | "execution_count": null, 652 | "metadata": {}, 653 | "outputs": [], 654 | "source": [ 655 | "# Section of ATL07 data\n", 656 | "sectionNum=200 # (NB 220 is a nice example, 200 seems to show surprisingly bad lead precisions..?)\n", 657 | "# Section length (for plotting purposes) in meters\n", 658 | "sectionSize=10000.\n", 659 | "\n", 660 | "# Find data that satisfies these conditions and then group the data like before\n", 661 | "idx=np.where((dF07['along_track_distance']>sectionNum*sectionSize)&(dF07['along_track_distance']<(sectionNum+1)*sectionSize))[0]\n", 662 | "df07S=dF07.iloc[idx]\n", 663 | "dFStype=df07S.groupby('ssh_flag')\n", 664 | "dFStype['elev'].agg(['mean', 'std', 'median', 'mad'])\n", 665 | "dFStypeIce=dFStype.get_group(0)\n", 666 | "dFStypeLeads=dFStype.get_group(1)\n", 667 | "\n", 668 | "plt.figure(figsize=(12, 5))\n", 669 | "plt.plot((dFStypeIce.along_track_distance)/1000., dFStypeIce.elev, color='m', marker='.', linestyle='None', label='Sea ice', alpha=0.2)\n", 670 | "plt.plot((dFStypeLeads.along_track_distance)/1000., dFStypeLeads.elev, color='k', marker='x', 
linestyle='None',label='Leads', alpha=1.)\n", 671 | "plt.legend(frameon=False)\n", 672 | "plt.xlabel('Along track distance (km)')\n", 673 | "plt.ylabel('Elevation (m)')\n", 674 | "plt.show()" 675 | ] 676 | }, 677 | { 678 | "cell_type": "markdown", 679 | "metadata": {}, 680 | "source": [ 681 | "#### Let's calculate the mean elevation of the sea surface (and sea ice) as a simple test" 682 | ] 683 | }, 684 | { 685 | "cell_type": "code", 686 | "execution_count": null, 687 | "metadata": {}, 688 | "outputs": [], 689 | "source": [ 690 | "# Sea ice elevation\n", 691 | "meanIceElev=dFstype['elev'].get_group(0).mean()\n", 692 | "# Sea surface elevation\n", 693 | "meanSSH=dFstype['elev'].get_group(1).mean()\n", 694 | "print('Sea ice elevation (m):', meanIceElev)\n", 695 | "print('SSH (m):', meanSSH)" 696 | ] 697 | }, 698 | { 699 | "cell_type": "markdown", 700 | "metadata": {}, 701 | "source": [ 702 | "#### OK well now it's clearly pretty simple to derive some freeboard!\n" 703 | ] 704 | }, 705 | { 706 | "cell_type": "code", 707 | "execution_count": null, 708 | "metadata": {}, 709 | "outputs": [], 710 | "source": [ 711 | "plt.figure(figsize=(12, 5))\n", 712 | "plt.plot((dFStypeIce.along_track_distance)/1000., dFStypeIce.elev-meanSSH, color='g', marker='.', linestyle='-', label='Freeboard', alpha=0.5)\n", 713 | "#plt.legend(frameon=False)\n", 714 | "plt.xlabel('Along track distance (km)')\n", 715 | "plt.ylabel('Freeboard (m)')\n", 716 | "plt.show()" 717 | ] 718 | }, 719 | { 720 | "cell_type": "markdown", 721 | "metadata": {}, 722 | "source": [ 723 | "#### Explore the photon rate\n", 724 | "*Note the higher photon rates where we have leads!*" 725 | ] 726 | }, 727 | { 728 | "cell_type": "code", 729 | "execution_count": null, 730 | "metadata": {}, 731 | "outputs": [], 732 | "source": [ 733 | "plt.figure(figsize=(12, 5))\n", 734 | "plt.scatter((dFStypeIce.along_track_distance)/1000., dFStypeIce.elev-meanSSH, c=dFStypeIce.photon_rate, label='photon_rate')\n", 735 | 
"#plt.legend(frameon=False)\n", 736 | "plt.colorbar(label='Photon rate')\n", 737 | "plt.xlabel('Along track distance (km)')\n", 738 | "plt.ylabel('Freeboard (m)')\n", 739 | "plt.show()" 740 | ] 741 | }, 742 | { 743 | "cell_type": "markdown", 744 | "metadata": {}, 745 | "source": [ 746 | "#### Explore the background rate\n", 747 | "*Note that this is the calculated background rate in ATL07 based on the sun angle, surface slope, unit reflectance. There is also an observed background rate at a given along-track posting (25 Hz or 200 Hz)*" 748 | ] 749 | }, 750 | { 751 | "cell_type": "code", 752 | "execution_count": null, 753 | "metadata": {}, 754 | "outputs": [], 755 | "source": [ 756 | "plt.figure(figsize=(12, 5))\n", 757 | "plt.plot((dFStypeIce.along_track_distance)/1000., dFStypeIce.background_rate, color='k', marker='.', linestyle='-', label='background_rate', alpha=0.5)\n", 758 | "#plt.legend(frameon=False)\n", 759 | "plt.xlabel('Along track distance (km)')\n", 760 | "plt.ylabel('Background rate (Hz)')\n", 761 | "plt.show()\n", 762 | "\n", 763 | "# Looks like it's nighttime in the Arctic!" 764 | ] 765 | }, 766 | { 767 | "cell_type": "code", 768 | "execution_count": null, 769 | "metadata": {}, 770 | "outputs": [], 771 | "source": [ 772 | "plt.figure(figsize=(12, 5))\n", 773 | "plt.plot((dFStypeIce.along_track_distance)/1000., dFStypeIce.seg_length, color='g', marker='.', linestyle='', label='segment length', alpha=0.5)\n", 774 | "#plt.legend(frameon=False)\n", 775 | "plt.xlabel('Along track distance (km)')\n", 776 | "plt.ylabel('Segment length (m)')\n", 777 | "plt.show()" 778 | ] 779 | }, 780 | { 781 | "cell_type": "markdown", 782 | "metadata": {}, 783 | "source": [ 784 | "### Extra ideas\n", 785 | "\n", 786 | "1. Try downloading some more ATL07 data from the NSIDC (following the hackweek tutorial) and see what it looks like when using it in this processing chain. \n", 787 | "2. Explore the photon classification scheme.\n", 788 | "3. 
Explore the photon rate and background rate. How do they vary with the open water/sea ice classification? Do some scatter plots of photon rate versus ice type. " 789 | ] 790 | }, 791 | { 792 | "cell_type": "markdown", 793 | "metadata": {}, 794 | "source": [ 795 | "### Onwards to the ATL10 Notebook...!" 796 | ] 797 | } 798 | ], 799 | "metadata": { 800 | "kernelspec": { 801 | "display_name": "Python 3", 802 | "language": "python", 803 | "name": "python3" 804 | }, 805 | "language_info": { 806 | "codemirror_mode": { 807 | "name": "ipython", 808 | "version": 3 809 | }, 810 | "file_extension": ".py", 811 | "mimetype": "text/x-python", 812 | "name": "python", 813 | "nbconvert_exporter": "python", 814 | "pygments_lexer": "ipython3", 815 | "version": "3.6.7" 816 | } 817 | }, 818 | "nbformat": 4, 819 | "nbformat_minor": 2 820 | } 821 | -------------------------------------------------------------------------------- /Notebooks/.ipynb_checkpoints/convert_GPS_time-checkpoint.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | u""" 3 | convert_GPS_time.py (10/2017) 4 | Return the calendar date and time for given GPS time. 5 | Based on Tiffany Summerscales's PHP conversion algorithm 6 | https://www.andrews.edu/~tzs/timeconv/timealgorithm.html 7 | 8 | INPUTS: 9 | GPS_Time: GPS time (standard = seconds since January 6, 1980 at 00:00) 10 | 11 | OUTPUTS: 12 | month: Number of the desired month (1 = January, ..., 12 = December). 13 | day: Number of day of the month. 14 | year: Number of the desired year. 15 | hour: hour of the day 16 | minute: minute of the hour 17 | second: second (and fractions of a second) of the minute.
18 | 19 | OPTIONS: 20 | OFFSET: number of seconds to offset each GPS time 21 | 22 | PYTHON DEPENDENCIES: 23 | numpy: Scientific Computing Tools For Python (http://www.numpy.org) 24 | 25 | PROGRAM DEPENDENCIES: 26 | convert_julian.py: convert Julian dates into calendar dates 27 | 28 | UPDATE HISTORY: 29 | Updated 10/2017: added leap second from midnight 2016-12-31 30 | Written 04/2016 31 | """ 32 | import numpy as np 33 | from convert_julian import convert_julian 34 | import pdb 35 | 36 | #-- PURPOSE: Define GPS leap seconds 37 | def get_leaps(): 38 | leaps = [46828800, 78364801, 109900802, 173059203, 252028804, 315187205, 39 | 346723206, 393984007, 425520008, 457056009, 504489610, 551750411, 40 | 599184012, 820108813, 914803214, 1025136015, 1119744016, 1167264017] 41 | return leaps 42 | 43 | #-- PURPOSE: Test to see if any GPS seconds are leap seconds 44 | def is_leap(GPS_Time): 45 | leaps = get_leaps() 46 | Flag = np.zeros_like(GPS_Time, dtype=np.bool) 47 | for leap in leaps: 48 | count = np.count_nonzero(np.floor(GPS_Time) == leap) 49 | if (count > 0): 50 | indices, = np.nonzero(np.floor(GPS_Time) == leap) 51 | Flag[indices] = True 52 | return Flag 53 | 54 | #-- PURPOSE: Count number of leap seconds that have passed for each GPS time 55 | def count_leaps(GPS_Time): 56 | leaps = get_leaps() 57 | #-- number of leap seconds prior to GPS_Time 58 | n_leaps = np.zeros_like(GPS_Time, dtype=np.uint) 59 | for i,leap in enumerate(leaps): 60 | count = np.count_nonzero(GPS_Time >= leap) 61 | if (count > 0): 62 | indices, = np.nonzero(GPS_Time >= leap) 63 | # print(indices) 64 | # pdb.set_trace() 65 | n_leaps[indices] += 1 66 | return n_leaps 67 | 68 | #-- PURPOSE: Convert UNIX Time to GPS Time 69 | def convert_UNIX_to_GPS(UNIX_Time): 70 | #-- calculate offsets for UNIX times that occur during leap seconds 71 | offset = np.zeros_like(UNIX_Time) 72 | count = np.count_nonzero((UNIX_Time % 1) != 0) 73 | if (count > 0): 74 | indices, = np.nonzero((UNIX_Time % 1) != 0) 75 | 
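The `count_leaps` routine above scans the whole leap-second table for every call; the same count ("number of leap epochs at or before each GPS time") can be read off in one vectorized call. A sketch, using the same table as `get_leaps()`:

```python
import numpy as np

# Leap-second epochs (GPS seconds) from get_leaps() above
leaps = np.array([46828800, 78364801, 109900802, 173059203, 252028804,
                  315187205, 346723206, 393984007, 425520008, 457056009,
                  504489610, 551750411, 599184012, 820108813, 914803214,
                  1025136015, 1119744016, 1167264017])

# Number of leap epochs <= each GPS time, for scalar or array input;
# equivalent to the loop in count_leaps but in a single searchsorted call
def count_leaps_fast(gps_time):
    return np.searchsorted(leaps, gps_time, side='right')
```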
UNIX_Time[indices] -= 0.5 76 | offset[indices] = 1.0 77 | #-- convert UNIX_Time to GPS without taking into account leap seconds 78 | #-- (UNIX epoch: Jan 1, 1970 00:00:00, GPS epoch: Jan 6, 1980 00:00:00) 79 | GPS_Time = UNIX_Time - 315964800 80 | leaps = get_leaps() 81 | #-- calculate number of leap seconds prior to GPS_Time 82 | n_leaps = np.zeros_like(GPS_Time, dtype=np.uint) 83 | for i,leap in enumerate(leaps): 84 | count = np.count_nonzero(GPS_Time >= (leap - i)) 85 | if (count > 0): 86 | indices, = np.nonzero(GPS_Time >= (leap - i)) 87 | n_leaps[indices] += 1 88 | #-- take into account leap seconds and offsets 89 | GPS_Time += n_leaps + offset 90 | return GPS_Time 91 | 92 | #-- PURPOSE: Convert GPS Time to UNIX Time 93 | def convert_GPS_to_UNIX(GPS_Time): 94 | #-- convert GPS_Time to UNIX without taking into account leap seconds 95 | #-- (UNIX epoch: Jan 1, 1970 00:00:00, GPS epoch: Jan 6, 1980 00:00:00) 96 | UNIX_Time = GPS_Time + 315964800 97 | #-- number of leap seconds prior to GPS_Time 98 | n_leaps = count_leaps(GPS_Time) 99 | UNIX_Time -= n_leaps 100 | #-- check if GPS Time is leap second 101 | Flag = is_leap(GPS_Time) 102 | if Flag.any(): 103 | #-- for leap seconds: add a half second offset 104 | indices, = np.nonzero(Flag) 105 | UNIX_Time[indices] += 0.5 106 | return UNIX_Time 107 | 108 | #-- PURPOSE: convert from GPS time to calendar dates 109 | def convert_GPS_time(GPS_Time, OFFSET=0.0): 110 | #-- convert from standard GPS time to UNIX time accounting for leap seconds 111 | #-- and adding the specified offset to GPS_Time 112 | UNIX_Time = convert_GPS_to_UNIX(np.array(GPS_Time) + OFFSET) 113 | #-- calculate Julian date from UNIX time and convert into calendar dates 114 | #-- UNIX time: seconds from 1970-01-01 00:00:00 UTC 115 | julian_date = (UNIX_Time/86400.0) + 2440587.500000 116 | cal_date = convert_julian(julian_date) 117 | #-- include UNIX times in output 118 | cal_date['UNIX'] = UNIX_Time 119 | #-- return the calendar dates and UNIX time 120 | 
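The Julian-date constant used in `convert_GPS_time` above comes from the fact that the UNIX epoch (1970-01-01 00:00 UTC) falls at JD 2440587.5. A small sanity-check sketch:

```python
from datetime import datetime, timezone

# The conversion used above: UNIX seconds -> Julian date
def unix_to_julian(unix_seconds):
    return unix_seconds / 86400.0 + 2440587.5

# Julian days begin at noon, so noon on 1970-01-01 is JD 2440588.0
noon = datetime(1970, 1, 1, 12, tzinfo=timezone.utc).timestamp()
```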
return cal_date 121 | -------------------------------------------------------------------------------- /Notebooks/.ipynb_checkpoints/convert_julian-checkpoint.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | u""" 3 | convert_julian.py 4 | Written by Tyler Sutterley (10/2017) 5 | 6 | Return the calendar date and time given Julian date. 7 | 8 | CALLING SEQUENCE: 9 | date_inp = convert_julian(julian) 10 | year = date_inp['year'] 11 | month = np.int(date_inp['month']) 12 | day = date_inp['day'] 13 | 14 | INPUTS: 15 | JD: Julian Day of the specified calendar date. 16 | 17 | OUTPUTS: 18 | month: Number of the desired month (1 = January, ..., 12 = December) 19 | day: Number of day of the month 20 | year: Number of the desired year 21 | hour: hour of the day 22 | minute: minute of the hour 23 | second: second (and fractions of a second) of the minute 24 | 25 | OPTIONS: 26 | ASTYPE: convert output to variable type (e.g. int). Default is float 27 | FORMAT: format of output coordinates 28 | 'dict': dictionary with variable keys as listed above 29 | 'tuple': tuple with variable order YEAR,MONTH,DAY,HOUR,MINUTE,SECOND 30 | 'zip': aggregated variable sets 31 | 32 | PYTHON DEPENDENCIES: 33 | numpy: Scientific Computing Tools For Python (http://www.numpy.org) 34 | 35 | NOTES: 36 | Translated from caldat in "Numerical Recipes in C", by William H. Press, 37 | Brian P. Flannery, Saul A. Teukolsky, and William T. Vetterling. 38 | Cambridge University Press, 1988 (second printing). 39 | Hatcher, D. A., "Simple Formulae for Julian Day Numbers and Calendar Dates", 40 | Quarterly Journal of the Royal Astronomical Society, 25(1), 1984. 41 | 42 | UPDATE HISTORY: 43 | Updated 10/2017: updated comments and formatting of algorithm 44 | Updated 06/2017: added option FORMAT to change the output variables format 45 | Updated 06/2016: added option to convert output to variable type (e.g. 
int) 46 | Updated 11/2015: extracting the values from singleton dimension cases 47 | Updated 03/2015: remove singleton dimensions if initially importing value 48 | Updated 03/2014: updated to be able to convert arrays 49 | Written 05/2013 50 | """ 51 | import numpy as np 52 | 53 | def convert_julian(JD, ASTYPE=None, FORMAT='dict'): 54 | #-- convert to array if only a single value was imported 55 | if (np.ndim(JD) == 0): 56 | JD = np.array([JD]) 57 | SINGLE_VALUE = True 58 | else: 59 | SINGLE_VALUE = False 60 | 61 | JDO = np.floor(JD + 0.5) 62 | C = np.zeros_like(JD) 63 | #-- calculate C for dates before and after the switch to Gregorian 64 | IGREG = 2299161.0 65 | ind1, = np.nonzero(JDO < IGREG) 66 | C[ind1] = JDO[ind1] + 1524.0 67 | ind2, = np.nonzero(JDO >= IGREG) 68 | B = np.floor((JDO[ind2] - 1867216.25)/36524.25) 69 | C[ind2] = JDO[ind2] + B - np.floor(B/4.0) + 1525.0 70 | #-- calculate coefficients for date conversion 71 | D = np.floor((C - 122.1)/365.25) 72 | E = np.floor((365.0 * D) + np.floor(D/4.0)) 73 | F = np.floor((C - E)/30.6001) 74 | #-- calculate day, month, year and hour 75 | DAY = np.floor(C - E + 0.5) - np.floor(30.6001*F) 76 | MONTH = F - 1.0 - 12.0*np.floor(F/14.0) 77 | YEAR = D - 4715.0 - np.floor((7.0+MONTH)/10.0) 78 | HOUR = np.floor(24.0*(JD + 0.5 - JDO)) 79 | #-- calculate minute and second 80 | G = (JD + 0.5 - JDO) - HOUR/24.0 81 | MINUTE = np.floor(G*1440.0) 82 | SECOND = (G - MINUTE/1440.0) * 86400.0 83 | 84 | #-- convert all variables to output type (from float) 85 | if ASTYPE is not None: 86 | YEAR = YEAR.astype(ASTYPE) 87 | MONTH = MONTH.astype(ASTYPE) 88 | DAY = DAY.astype(ASTYPE) 89 | HOUR = HOUR.astype(ASTYPE) 90 | MINUTE = MINUTE.astype(ASTYPE) 91 | SECOND = SECOND.astype(ASTYPE) 92 | 93 | #-- if only a single value was imported initially: remove singleton dims 94 | if SINGLE_VALUE: 95 | YEAR = YEAR.item(0) 96 | MONTH = MONTH.item(0) 97 | DAY = DAY.item(0) 98 | HOUR = HOUR.item(0) 99 | MINUTE = MINUTE.item(0) 100 | SECOND = 
SECOND.item(0) 101 | 102 | #-- return date variables in output format (default python dictionary) 103 | if (FORMAT == 'dict'): 104 | return dict(year=YEAR, month=MONTH, day=DAY, 105 | hour=HOUR, minute=MINUTE, second=SECOND) 106 | elif (FORMAT == 'tuple'): 107 | return (YEAR, MONTH, DAY, HOUR, MINUTE, SECOND) 108 | elif (FORMAT == 'zip'): 109 | return zip(YEAR, MONTH, DAY, HOUR, MINUTE, SECOND) 110 | -------------------------------------------------------------------------------- /Notebooks/.ipynb_checkpoints/readers-checkpoint.py: -------------------------------------------------------------------------------- 1 | import warnings 2 | warnings.filterwarnings('ignore') 3 | import numpy as np 4 | import pandas as pd 5 | import datetime as dt 6 | import matplotlib.pyplot as plt 7 | import cartopy.crs as ccrs 8 | import h5py 9 | import scipy 10 | from astropy.time import Time 11 | #from icepyx import icesat2data as ipd 12 | 13 | def getATL03(f,beam): 14 | # height of each received photon, relative to the WGS-84 ellipsoid (with some, not all corrections applied, see background info above) 15 | heights=f[beam]['heights']['h_ph'][:] 16 | # latitude (decimal degrees) of each received photon 17 | lats=f[beam]['heights']['lat_ph'][:] 18 | # longitude (decimal degrees) of each received photon 19 | lons=f[beam]['heights']['lon_ph'][:] 20 | # seconds from ATLAS Standard Data Product Epoch. 
use the epoch parameter to convert to gps time 21 | dt=f[beam]['heights']['delta_time'][:] 22 | # confidence level associated with each photon event 23 | # -2: TEP 24 | # -1: Events not associated with a specific surface type 25 | # 0: noise 26 | # 1: buffer but algorithm classifies as background 27 | # 2: low 28 | # 3: medium 29 | # 4: high 30 | # Surface types for signal classification confidence 31 | # 0=Land; 1=Ocean; 2=SeaIce; 3=LandIce; 4=InlandWater 32 | conf=f[beam]['heights']['signal_conf_ph'][:,2] #choose column 2 for confidence of sea ice photons 33 | # number of ATL03 20m segments 34 | n_seg, = f[beam]['geolocation']['segment_id'].shape 35 | # first photon in the segment (convert to 0-based indexing) 36 | Segment_Index_begin = f[beam]['geolocation']['ph_index_beg'][:] - 1 37 | # number of photon events in the segment 38 | Segment_PE_count = f[beam]['geolocation']['segment_ph_cnt'][:] 39 | # along-track distance for each ATL03 segment 40 | Segment_Distance = f[beam]['geolocation']['segment_dist_x'][:] 41 | # along-track distance (x) for photon events 42 | x_atc = np.copy(f[beam]['heights']['dist_ph_along'][:]) 43 | # cross-track distance (y) for photon events 44 | y_atc = np.copy(f[beam]['heights']['dist_ph_across'][:]) 45 | 46 | for j in range(n_seg): 47 | # index for 20m segment j 48 | idx = Segment_Index_begin[j] 49 | # number of photons in 20m segment 50 | cnt = Segment_PE_count[j] 51 | # add segment distance to along-track coordinates 52 | x_atc[idx:idx+cnt] += Segment_Distance[j] 53 | df03=pd.DataFrame({'lats':lats,'lons':lons,'x':x_atc,'y':y_atc,'heights':heights,'dt':dt,'conf':conf}) 54 | return df03 55 | 56 | def getATL07(f,beam): 57 | lats = f[beam+'/sea_ice_segments/latitude'][:] 58 | lons = f[beam+'/sea_ice_segments/longitude'][:] 59 | dt = f[beam+'/sea_ice_segments/delta_time'][:] 60 | seg_x = f[beam+'/sea_ice_segments/seg_dist_x'][:] 61 | heights = f[beam+'/sea_ice_segments/heights/height_segment_height'][:] 62 | conf = 
f[beam+'/sea_ice_segments/heights/height_segment_confidence'][:] 63 | stype = f[beam+'/sea_ice_segments/heights/height_segment_type'][:] 64 | ssh_flag = f[beam+'/sea_ice_segments/heights/height_segment_ssh_flag'][:] 65 | gauss = f[beam+'/sea_ice_segments/heights/height_segment_w_gaussian'][:] 66 | photon_rate = f[beam+'/sea_ice_segments/stats/photon_rate'][:] 67 | cloud = f[beam+'/sea_ice_segments/stats/cloud_flag_asr'][:] 68 | mss = f[beam+'/sea_ice_segments/geophysical/height_segment_mss'][:] 69 | ocean_tide = f[beam+'/sea_ice_segments/geophysical/height_segment_ocean'][:] 70 | lpe_tide = f[beam+'/sea_ice_segments/geophysical/height_segment_lpe'][:] 71 | ib = f[beam+'/sea_ice_segments/geophysical/height_segment_ib'][:] 72 | df07=pd.DataFrame({'lats':lats,'lons':lons,'heights':heights,'dt':dt,'conf':conf,'stype':stype,'ssh_flag':ssh_flag, 'gauss':gauss,'photon_rate':photon_rate,'cloud':cloud,'mss':mss,'ocean':ocean_tide,'lpe':lpe_tide,'ib':ib}) 73 | return df07 74 | 75 | 76 | 77 | 78 | 79 | #--------- READERS COPIED FROM 2019 HACKWEEK TUTORIALS (not including the dictionary/xarray readers) ----- 80 | 81 | def getATL03data(fileT, numpyout=False, beam='gt1l'): 82 | """ Pandas/numpy ATL03 reader 83 | Written by Alek Petty, June 2018 (alek.a.petty@nasa.gov) 84 | I've picked out the variables from ATL03 I think are of most interest to 85 | sea ice users, but by no means is this an exhaustive list.
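The per-segment loop in `getATL03` above (adding `segment_dist_x` to each photon's `dist_ph_along`) can be collapsed to a single `np.repeat`, assuming the photons are stored contiguously with no gaps between segments. A sketch on toy, made-up geolocation arrays:

```python
import numpy as np

# Toy ATL03-like geolocation arrays (values are made up)
segment_dist  = np.array([0.0, 20.0, 40.0])                  # segment_dist_x
segment_cnt   = np.array([2, 3, 1])                          # segment_ph_cnt
dist_ph_along = np.array([1.0, 5.0, 2.0, 9.0, 15.0, 3.0])    # per-photon distance

# Repeat each segment's start distance once per photon in that segment,
# then add the within-segment photon distances
x_atc = dist_ph_along + np.repeat(segment_dist, segment_cnt)
```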
86 | See the xarray or dictionary readers to load in the more complete ATL03 dataset 87 | or explore the hdf5 files themselves (I like using the app Panoply for this) to 88 | see what else you might want 89 | 90 | Args: 91 | fileT (str): File path of the ATL03 dataset 92 | numpyout (flag): Binary flag for outputting numpy arrays (True) or pandas dataframe (False) 93 | beam (str): ICESat-2 beam (the number is the pair; whether r or l is the strong beam depends on the spacecraft orientation) 94 | 95 | returns: 96 | either: select numpy arrays or a pandas dataframe 97 | """ 98 | 99 | # Open the file 100 | try: 101 | ATL03 = h5py.File(fileT, 'r') 102 | except: 103 | return 'Not a valid file' 104 | 105 | lons=ATL03[beam+'/heights/lon_ph'][:] 106 | lats=ATL03[beam+'/heights/lat_ph'][:] 107 | 108 | # Seconds since the ATLAS Standard Data Product (SDP) epoch, not the GPS epoch 109 | delta_time=ATL03[beam+'/heights/delta_time'][:] 110 | 111 | # #Add this value to delta time parameters to compute the full gps_seconds 112 | atlas_epoch=ATL03['/ancillary_data/atlas_sdp_gps_epoch'][:] 113 | 114 | # Conversion of delta_time to a calendar date 115 | # This function seems pretty convoluted but it works for now.. 116 | # I'm sure there is a simpler function we can use here instead.
117 | temp = ut.convert_GPS_time(atlas_epoch[0] + delta_time, OFFSET=0.0) 118 | 119 | # Express delta_time relative to start time of granule 120 | delta_time_granule=delta_time-delta_time[0] 121 | 122 | year = temp['year'][:].astype('int') 123 | month = temp['month'][:].astype('int') 124 | day = temp['day'][:].astype('int') 125 | hour = temp['hour'][:].astype('int') 126 | minute = temp['minute'][:].astype('int') 127 | second = temp['second'][:].astype('int') 128 | 129 | dFtime=pd.DataFrame({'year':year, 'month':month, 'day':day, 130 | 'hour':hour, 'minute':minute, 'second':second}) 131 | 132 | 133 | # Primary variables of interest 134 | 135 | # Photon height 136 | heights=ATL03[beam+'/heights/h_ph'][:] 137 | print(heights.shape) 138 | 139 | # Flag for signal confidence 140 | # column index: 0=Land; 1=Ocean; 2=SeaIce; 3=LandIce; 4=InlandWater 141 | # values: 142 | #-- -1: Events not associated with a specific surface type 143 | #-- 0: noise 144 | #-- 1: buffer but algorithm classifies as background 145 | #-- 2: low 146 | #-- 3: medium 147 | #-- 4: high 148 | signal_confidence=ATL03[beam+'/heights/signal_conf_ph'][:,2] 149 | 150 | # Add photon rate, background rate etc to the reader here if we want 151 | 152 | ATL03.close() 153 | 154 | 155 | 156 | dF = pd.DataFrame({'heights':heights, 'lons':lons, 'lats':lats, 157 | 'signal_confidence':signal_confidence, 158 | 'delta_time':delta_time_granule}) 159 | 160 | # Add the datetime string 161 | dFtimepd=pd.to_datetime(dFtime) 162 | dF['datetime'] = pd.Series(dFtimepd, index=dF.index) 163 | 164 | # Filter out high elevation values 165 | #dF = dF[(dF['signal_confidence']>2)] 166 | # Reset row indexing 167 | #dF=dF.reset_index(drop=True) 168 | return dF 169 | 170 | # Or return as numpy arrays 171 | # return along_track_distance, heights 172 | 173 | 174 | def getATL07data(fileT, numpy=False, beamNum=1, maxElev=1e6): 175 | """ Pandas/numpy ATL07 reader 176 | Written by Alek Petty, June 2018 (alek.a.petty@nasa.gov) 177 | I've 
picked out the variables from ATL07 I think are of most interest to sea ice users, 178 | but by no means is this an exhaustive list. 179 | See the xarray or dictionary readers to load in the more complete ATL07 dataset 180 | or explore the hdf5 files themselves (I like using the app Panoply for this) to see what else 181 | you might want 182 | 183 | Args: 184 | fileT (str): File path of the ATL07 dataset 185 | numpy (flag): Binary flag for outputting numpy arrays (True) or pandas dataframe (False) 186 | beamNum (int): ICESat-2 beam number (1 to 6) 187 | maxElev (float): maximum surface elevation to remove anomalies 188 | returns: 189 | either: select numpy arrays or a pandas dataframe 190 | 191 | Updates: 192 | V3 (June 2018) added observatory orientation flag, read in the beam number, not the string 193 | V2 (June 2018) used astropy to more simply generate a datetime instance from the gps time 194 | """ 195 | 196 | # Open the file 197 | try: 198 | ATL07 = h5py.File(fileT, 'r') 199 | except: 200 | return 'Not a valid file' 201 | 202 | #flag_values: 0, 1, 2; flag_meanings : backward forward transition 203 | orientation_flag=ATL07['orbit_info']['sc_orient'][:] 204 | 205 | if (orientation_flag==0): 206 | print('Backward orientation') 207 | beamStrs=['gt1l', 'gt1r', 'gt2l', 'gt2r', 'gt3l', 'gt3r'] 208 | 209 | elif (orientation_flag==1): 210 | print('Forward orientation') 211 | beamStrs=['gt3r', 'gt3l', 'gt2r', 'gt2l', 'gt1r', 'gt1l'] 212 | 213 | elif (orientation_flag==2): 214 | print('Transitioning, do not use for science!') 215 | 216 | beamStr=beamStrs[beamNum-1] 217 | print(beamStr) 218 | 219 | lons=ATL07[beamStr+'/sea_ice_segments/longitude'][:] 220 | lats=ATL07[beamStr+'/sea_ice_segments/latitude'][:] 221 | 222 | # Along track distance 223 | # I removed the first point so it's distance relative to the start of the beam 224 | along_track_distance=ATL07[beamStr+'/sea_ice_segments/seg_dist_x'][:] - ATL07[beamStr+'/sea_ice_segments/seg_dist_x'][0] 225 | # Height
segment ID (10 km segments) 226 | height_segment_id=ATL07[beamStr+'/sea_ice_segments/height_segment_id'][:] 227 | # Number of seconds since the GPS epoch on midnight Jan. 6, 1980 228 | delta_time=ATL07[beamStr+'/sea_ice_segments/delta_time'][:] 229 | # Add this value to delta time parameters to compute full gps time 230 | atlas_epoch=ATL07['/ancillary_data/atlas_sdp_gps_epoch'][:] 231 | 232 | leapSecondsOffset=37 233 | gps_seconds = atlas_epoch[0] + delta_time - leapSecondsOffset 234 | # Use astropy to convert from gps time to datetime 235 | tgps = Time(gps_seconds, format='gps') 236 | tiso = Time(tgps, format='datetime') 237 | 238 | # Primary variables of interest 239 | 240 | # Beam segment height 241 | elev=ATL07[beamStr+'/sea_ice_segments/heights/height_segment_height'][:] 242 | # Flag for potential leads, 0=sea ice, 1 = sea surface 243 | ssh_flag=ATL07[beamStr+'/sea_ice_segments/heights/height_segment_ssh_flag'][:] 244 | 245 | #Quality metrics for each segment include confidence level in the surface height estimate, 246 | # which is based on the number of photons, the background noise rate, and the error measure provided by the surface-finding algorithm. 
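The GPS-to-datetime conversion above can also be done by hand from the GPS epoch. A minimal sketch; note that the 37 s used above is the TAI-UTC offset, while the GPS-UTC offset has been 18 s since 2017-01-01, which is what this simplified version assumes (astropy's 'gps' format handles the full leap-second bookkeeping, so this is only a sanity check for post-2017 times):

```python
from datetime import datetime, timedelta, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)

# GPS seconds -> UTC datetime, assuming a constant GPS-UTC offset
# (valid only after the most recent leap second, 2017-01-01)
def gps_to_datetime(gps_seconds, gps_utc_offset=18):
    return GPS_EPOCH + timedelta(seconds=gps_seconds - gps_utc_offset)
```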
247 | # Height quality flag, 1 for good fit, 0 for bad 248 | quality=ATL07[beamStr+'/sea_ice_segments/heights/height_segment_quality'][:] 249 | 250 | elev_rms = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_rms'][:] #RMS difference between modeled and observed photon height distribution 251 | seg_length = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_length_seg'][:] # Along track length of segment 252 | height_confidence = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_confidence'][:] # Height segment confidence flag 253 | reflectance = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_asr_calc'][:] # Apparent surface reflectance 254 | ssh_flag = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_ssh_flag'][:] # Flag for potential leads, 0=sea ice, 1 = sea surface 255 | seg_type = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_type'][:] # 0 = Cloud covered 256 | gauss_width = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_w_gaussian'][:] # Width of Gaussian fit 257 | 258 | # Geophysical corrections 259 | # NOTE: All of these corrections except ocean tides, DAC, 260 | # and geoid undulations were applied to the ATL03 photon heights. 
261 | 262 | # AVISO dynamic Atmospheric Correction (DAC) including inverted barometer (IB) effect (±5cm) 263 | dac = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_dac'][:] 264 | # Solid Earth Tides (±40 cm, max) 265 | earth = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_earth'][:] 266 | # Geoid (-105 to +90 m, max) 267 | geoid = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_geoid'][:] 268 | # Local displacement due to Ocean Loading (-6 to 0 cm) 269 | loadTide = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_load'][:] 270 | # Ocean Tides including diurnal and semi-diurnal (harmonic analysis), 271 | # and longer period tides (dynamic and self-consistent equilibrium) (±5 m) 272 | oceanTide = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_ocean'][:] 273 | # Deformation due to centrifugal effect from small variations in polar motion 274 | # (Solid Earth Pole Tide) (±1.5 cm, the ocean pole tide ±2mm amplitude is considered negligible) 275 | poleTide = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_pole'][:] 276 | # Mean sea surface (±2 m) 277 | # Taken from ICESat and CryoSat-2, see Kwok and Morison [2015]) 278 | mss = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_mss'][:] 279 | 280 | # Photon rate of the given segment 281 | photon_rate = ATL07[beamStr+'/sea_ice_segments/stats/photon_rate'][:] 282 | 283 | # Estimated background rate from sun angle, reflectance, surface slope 284 | background_rate = ATL07[beamStr+'/sea_ice_segments/stats/backgr_calc'][:] 285 | 286 | ATL07.close() 287 | 288 | if numpy: 289 | # list the variables you want to output here.. 
290 | return along_track_distance, elev 291 | 292 | else: 293 | dF = pd.DataFrame({'elev':elev, 'lons':lons, 'lats':lats, 'ssh_flag':ssh_flag, 294 | 'quality_flag':quality, 295 | 'delta_time':delta_time, 296 | 'along_track_distance':along_track_distance, 297 | 'height_segment_id':height_segment_id, 298 | 'photon_rate':photon_rate,'background_rate':background_rate, 299 | 'datetime':tiso, 'mss': mss, 'seg_length':seg_length}) 300 | 301 | # Add the datetime string 302 | #dFtimepd=pd.to_datetime(dFtime) 303 | #dF['datetime'] = pd.Series(dFtimepd, index=dF.index) 304 | 305 | # Filter out high elevation values 306 | dF = dF[(dF['elev']<maxElev)] 375 | dF = dF[(dF['freeboard']>=minFreeboard)] 376 | dF = dF[(dF['freeboard']<=maxFreeboard)] 377 | 378 | # Also filter based on the confidence and/or quality flag? 379 | 380 | # Reset row indexing 381 | dF=dF.reset_index(drop=True) 382 | 383 | return dF -------------------------------------------------------------------------------- /Notebooks/.ipynb_checkpoints/utils-checkpoint.py: -------------------------------------------------------------------------------- 1 | #Import necessary modules 2 | #Use shorter names (np, pd, plt) instead of full (numpy, pandas, matplotlib.pyplot) for convenience 3 | import numpy as np 4 | import pandas as pd 5 | import matplotlib.pyplot as plt 6 | import cartopy.crs as ccrs 7 | import pandas as pd 8 | import h5py 9 | import xarray as xr 10 | import numpy as np 11 | import pdb 12 | import numpy.ma as ma 13 | 14 | def getSnowandConverttoThickness(dF, snowDepthVar='snowDepth', 15 | snowDensityVar='snowDensity', 16 | outVar='iceThickness'): 17 | """ Convert the ICESat-2 freeboards to ice thickness using the 18 | assigned snow depth and density columns 19 | """ 20 | 21 | # Convert freeboard to thickness 22 | # Need to copy arrays or it will overwrite the pandas column!
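A worked example of the hydrostatic-equilibrium relation used in `freeboard_to_thickness` below, with made-up freeboard and snow values:

```python
import numpy as np

# Density assumptions matching those in freeboard_to_thickness below
rho_w, rho_i, rho_s = 1024., 925., 300.    # water/ice/snow densities (kg/m^3)
freeboard  = np.array([0.4])               # total freeboard (m), illustrative
snow_depth = np.array([0.2])               # snow depth (m), illustrative

# Hydrostatic equilibrium: ice thickness from freeboard and snow loading
thickness = (rho_w/(rho_w - rho_i))*freeboard \
            - ((rho_w - rho_s)/(rho_w - rho_i))*snow_depth
# roughly 2.7 m of ice for 40 cm of freeboard under 20 cm of snow
```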
23 | freeboardT=np.copy(dF['freeboard'].values) 24 | snowDepthT=np.copy(dF[snowDepthVar].values) 25 | snowDensityT=np.copy(dF[snowDensityVar].values) 26 | ice_thickness = freeboard_to_thickness(freeboardT, snowDepthT, snowDensityT) 27 | #print(ice_thickness) 28 | dF[outVar] = pd.Series(np.array(ice_thickness), index=dF.index) 29 | 30 | return dF 31 | 32 | def freeboard_to_thickness(freeboardT, snow_depthT, snow_densityT): 33 | """ 34 | Hydrostatic equilibrium equation to calculate sea ice thickness 35 | from freeboard and snow depth/density data 36 | 37 | Args: 38 | freeboardT (var): ice freeboard 39 | snow_depthT (var): snow depth 40 | snow_densityT (var): final snow density 41 | 42 | Returns: 43 | ice_thicknessT (var): ice thickness derived using hydrostatic equilibrium 44 | 45 | """ 46 | 47 | # Define density values 48 | rho_w=1024. 49 | rho_i=925. 50 | #rho_s=300. 51 | 52 | # set snow to freeboard where it's bigger than freeboard. 53 | snow_depthT[snow_depthT>freeboardT]=freeboardT[snow_depthT>freeboardT] 54 | 55 | ice_thicknessT = (rho_w/(rho_w-rho_i))*freeboardT - ((rho_w-snow_densityT)/(rho_w-rho_i))*snow_depthT 56 | 57 | return ice_thicknessT 58 | 59 | def getWarrenData(dF, outSnowVar, outDensityVar='None'): 60 | """ 61 | Assign Warren1999 snow depth/density climatology to dataframe 62 | 63 | Added 64 | 65 | Args: 66 | dF (data frame): Pandas dataframe 67 | outSnowVar (string): name of Warren snow depth variable 68 | outDensityVar (string): name of Warren snow density variable 69 | 70 | 71 | Returns: 72 | dF (data frame): Pandas dataframe updated to include colocated Warren snow depth and density 73 | 74 | """ 75 | 76 | # Generate empty lists 77 | snowDepthW99s=ma.masked_all(np.size(dF['freeboard'].values)) 78 | if (outDensityVar!='None'): 79 | snowDensityW99s=ma.masked_all(np.size(dF['freeboard'].values)) 80 | 81 | # Loop over all freeboard values (rows) 82 | for x in range(np.size(dF['freeboard'].values)): 83 | #print(x, dF['lon'].iloc[x],
dF['lat'].iloc[x], dF['month'].iloc[x]-1) 84 | # Subtract 1 from month as the Warren index in the function starts at 0 85 | snowDepthDayW99T, snowDensityW99T=WarrenClimatology(dF['lon'].iloc[x], dF['lat'].iloc[x], dF['datetime'].iloc[x].month-1) 86 | 87 | 88 | # Append values to list 89 | snowDepthW99s[x]=snowDepthDayW99T 90 | if (outDensityVar!='None'): 91 | snowDensityW99s[x]=snowDensityW99T 92 | 93 | # Assign list to dataframe as a series 94 | dF[outSnowVar] = pd.Series(snowDepthW99s, index=dF.index) 95 | if (outDensityVar!='None'): 96 | dF[outDensityVar] = pd.Series(snowDensityW99s, index=dF.index) 97 | 98 | 99 | return dF 100 | 101 | def WarrenClimatology(lonT, latT, monthT): 102 | """ 103 | Get Warren1999 snow depth climatology 104 | 105 | Args: 106 | lonT (var): longitude 107 | latT (var): latitude 108 | monthT (var): month with the index starting at 0 109 | 110 | Returns: 111 | Hs (var): Snow depth (m) 112 | rho_s (var): Snow density (kg/m^3) 113 | 114 | """ 115 | 116 | H_0 = [28.01, 30.28, 33.89, 36.8, 36.93, 36.59, 11.02, 4.64, 15.81, 22.66, 25.57, 26.67] 117 | a = [.127, .1056, .5486, .4046, .0214, .7021, .3008, .31, .2119, .3594, .1496, -0.1876] 118 | b = [-1.1833, -0.5908, -0.1996, -0.4005, -1.1795, -1.4819, -1.2591, -0.635, -1.0292, -1.3483, -1.4643, -1.4229] 119 | c = [-0.1164, -0.0263, 0.0280, 0.0256, -0.1076, -0.1195, -0.0811, -0.0655, -0.0868, -0.1063, -0.1409, -0.1413] 120 | d = [-0.0051, -0.0049, 0.0216, 0.0024, -0.0244, -0.0009, -0.0043, 0.0059, -0.0177, 0.0051, -0.0079, -0.0316] 121 | e = [0.0243, 0.0044, -0.0176, -0.0641, -0.0142, -0.0603, -0.0959, -0.0005, -0.0723, -0.0577, -0.0258, -0.0029] 122 | 123 | # Convert lat and lon into degrees of arc, +x axis along 0 degrees longitude and +y axis along 90E longitude 124 | x = (90.0 - latT)*np.cos(lonT * np.pi/180.0) 125 | y = (90.0 - latT)*np.sin(lonT*np.pi/180.0) 126 | 127 | Hs = H_0[monthT] + a[monthT]*x + b[monthT]*y + c[monthT]*x*y + (d[monthT]*x*x) + (e[monthT]*y*y) 128 | 129 | 130 | # Now get SWE,
although this is not returned by the function 131 | 132 | H_0swe = [8.37, 9.43,10.74,11.67,11.8,12.48,4.01,1.08,3.84,6.24,7.54,8.0] 133 | aswe = [-0.027,0.0058,0.1618,0.0841,-0.0043,0.2084,0.097,0.0712,0.0393,0.1158,0.0567,-0.054] 134 | bswe = [-0.34,-0.1309,0.0276,-0.1328,-0.4284,-0.5739,-0.493,-0.145,-0.2107,-0.2803,-0.3201,-0.365] 135 | cswe = [-0.0319,0.0017,0.0213,0.0081,-0.038,-0.0468,-0.0333,-0.0155,-0.0182,-0.0215,-0.0284,-0.0362] 136 | dswe = [-0.0056,-0.0021,0.0076,-0.0003,-0.0071,-0.0023,-0.0026,0.0014,-0.0053,0.0015,-0.0032,-0.0112] 137 | eswe = [-0.0005,-0.0072,-0.0125,-0.0301,-0.0063,-0.0253,-0.0343,0,-0.019,-0.0176,-0.0129,-0.0035] 138 | 139 | 140 | swe = H_0swe[monthT] + aswe[monthT]*x + bswe[monthT]*y + cswe[monthT]*x*y + dswe[monthT]*x*x + eswe[monthT]*y*y 141 | 142 | # Density in kg/m^3 143 | rho_s = 1000.*(swe/Hs) 144 | #print(ma.mean(rho_s)) 145 | 146 | # Could mask out bad regions (i.e. land) here if desired. 147 | # Hsw[where(region_maskG<9.6)]=np.nan 148 | # Hsw[where(region_maskG==14)]=np.nan 149 | # Hsw[where(region_maskG>15.5)]=np.nan 150 | 151 | # Could mask out bad regions (i.e. land) here if desired. 152 | #rho_s[where(region_maskG<9.6)]=np.nan 153 | #rho_s[where(region_maskG==14)]=np.nan 154 | #rho_s[where(region_maskG>15.5)]=np.nan 155 | 156 | # Convert snow depth to meters 157 | Hs=Hs/100. 158 | 159 | return Hs, rho_s 160 | 161 | def get_psnlatslons(data_path, res=25): 162 | """ Get NSIDC polar stereographic grid data""" 163 | 164 | if (res==25): 165 | # 25 km grid 166 | mask_latf = open(data_path+'/psn25lats_v3.dat', 'rb') 167 | mask_lonf = open(data_path+'/psn25lons_v3.dat', 'rb') 168 | lats_mask = reshape(fromfile(file=mask_latf, dtype='1)|(iceConcNDay<0.01)|np.isnan(snowDensityNDay)) 325 | 326 | snowDepthNDay[mask]=np.nan 327 | snowDensityNDay[mask]=np.nan 328 | 329 | snowDepthNDay=snowDepthNDay 330 | snowDensityNDay=snowDensityNDay 331 | 332 | # I think it's better to declare array now so memory is allocated before the loop? 
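The nearest-neighbour lookup in the loop below reduces to a broadcasted distance and an `argmin`. A toy sketch (the grid and query point are made up); distances are taken in raw lat/lon degrees as in the original, which is adequate for picking the nearest cell of a dense grid, though not a true metric distance:

```python
import numpy as np

# Toy snow-depth grid standing in for the flattened NESOSIM arrays
lats_grid = np.array([70.0, 71.0, 72.0])
lons_grid = np.array([-150.0, -149.0, -148.0])
snow_grid = np.array([0.10, 0.15, 0.20])

# Query point (e.g. one ICESat-2 segment location)
lat_q, lon_q = 71.2, -149.1

# Degree-space distance to every grid cell, then take the closest
dist = np.sqrt((lats_grid - lat_q)**2 + (lons_grid - lon_q)**2)
index_min = np.argmin(dist)
snow_at_point = snow_grid[index_min]
```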
333 | snowDepthGISs=np.zeros((dF.shape[0])) 334 | snowDensityGISs=np.zeros((dF.shape[0])) 335 | 336 | # Should change this to an apply or lamda function 337 | for x in range(dF.shape[0]): 338 | 339 | # Use nearest neighbor to find snow depth at IS2 point 340 | #snowDepthGISs[x] = griddata((xptsDay, yptsDay), snowDepthDay, (dF['xpts'].iloc[x], dF['ypts'].iloc[x]), method='nearest') 341 | #snowDensityGISs[x] = griddata((xptsDay, yptsDay), densityDay, (dF['xpts'].iloc[x], dF['ypts'].iloc[x]), method='nearest') 342 | 343 | # Think this is the much faster way to find nearest neighbor! 344 | dist = np.sqrt((latsN-dF['lat'].iloc[x])**2+(lonsN-dF['lon'].iloc[x])**2) 345 | index_min = np.argmin(dist) 346 | snowDepthGISs[x]=snowDepthNDay[index_min] 347 | snowDensityGISs[x]=snowDensityNDay[index_min] 348 | 349 | 350 | dF[outSnowVar] = pd.Series(snowDepthGISs, index=dF.index) 351 | dF[outDensityVar] = pd.Series(snowDensityGISs, index=dF.index) 352 | 353 | return dF 354 | -------------------------------------------------------------------------------- /Notebooks/DataAccess.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 3, 6 | "metadata": {}, 7 | "outputs": [ 8 | { 9 | "ename": "ImportError", 10 | "evalue": "cannot import name 'icesat2data' from 'icepyx' (/Users/buckley/opt/anaconda3/lib/python3.8/site-packages/icepyx/__init__.py)", 11 | "output_type": "error", 12 | "traceback": [ 13 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", 14 | "\u001b[0;31mImportError\u001b[0m Traceback (most recent call last)", 15 | "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m\u001b[0m\n\u001b[0;32m----> 1\u001b[0;31m \u001b[0;32mfrom\u001b[0m \u001b[0micepyx\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0micesat2data\u001b[0m \u001b[0;32mas\u001b[0m 
\u001b[0mipd\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 2\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mos\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 3\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mshutil\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 4\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mgeopandas\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0mgpd\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 5\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mmatplotlib\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mpyplot\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0mplt\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 16 | "\u001b[0;31mImportError\u001b[0m: cannot import name 'icesat2data' from 'icepyx' (/Users/buckley/opt/anaconda3/lib/python3.8/site-packages/icepyx/__init__.py)" 17 | ] 18 | } 19 | ], 20 | "source": [ 21 | "from icepyx import icesat2data as ipd\n", 22 | "import os\n", 23 | "import shutil\n", 24 | "import geopandas as gpd\n", 25 | "import matplotlib.pyplot as plt\n", 26 | "%matplotlib inline" 27 | ] 28 | }, 29 | { 30 | "cell_type": "markdown", 31 | "metadata": {}, 32 | "source": [ 33 | "## Download data for sea ice tutorial:" 34 | ] 35 | }, 36 | { 37 | "cell_type": "markdown", 38 | "metadata": {}, 39 | "source": [ 40 | "#### set email and uid for earthdata:" 41 | ] 42 | }, 43 | { 44 | "cell_type": "code", 45 | "execution_count": 2, 46 | "metadata": {}, 47 | "outputs": [], 48 | "source": [ 49 | "earthdata_email=''\n", 50 | "earthdata_uid=''" 51 | ] 52 | }, 53 | { 54 | "cell_type": "markdown", 55 | "metadata": {}, 56 | "source": [ 57 | "#### set folder path for downloaded data:" 58 | ] 59 | }, 60 | { 61 | "cell_type": "code", 62 | "execution_count": 3, 63 | "metadata": {}, 64 | "outputs": [], 65 | "source": [ 66 | "data_home='' #where to download the data\n", 67 | "if not os.path.isdir(data_home):\n", 68 | " os.mkdir(data_home)" 69 | ] 
70 | }, 71 | { 72 | "cell_type": "markdown", 73 | "metadata": {}, 74 | "source": [ 75 | "#### download granules required for tutorial:" 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": 4, 81 | "metadata": {}, 82 | "outputs": [], 83 | "source": [ 84 | "# set DOWNLOAD to true if you're ready to download the data:\n", 85 | "DOWNLOAD=False\n", 86 | " \n", 87 | "requests=[\n", 88 | " { 'short_name' : 'ATL03',\n", 89 | " 'spatial_extent' :[-50.63, 81.47, -50.62, 81.48],\n", 90 | " 'date_range' : ['2019-02-06','2019-02-06']},\n", 91 | " { 'short_name' : 'ATL03',\n", 92 | " 'spatial_extent' :[-49,81.83,-47,81.84],\n", 93 | " 'date_range' : ['2019-04-30','2019-04-30']},\n", 94 | " { 'short_name' : 'ATL03',\n", 95 | " 'spatial_extent' :[-33.95, 86.79, -33.94, 86.80],\n", 96 | " 'date_range' : ['2019-03-24','2019-03-24']}, \n", 97 | " { 'short_name' : 'ATL07',\n", 98 | " 'spatial_extent' :[-38.8,64.63,-38.7,64.64],\n", 99 | " 'date_range' : ['2019-04-30','2019-04-30']},\n", 100 | " { 'short_name' : 'ATL07',\n", 101 | " 'spatial_extent' :[99.5, 76.4, 99.6, 76.5],\n", 102 | " 'date_range' : ['2019-03-24','2019-03-24']},\n", 103 | " { 'short_name' : 'ATL07',\n", 104 | " 'spatial_extent' :[-168.65,73.74,-168.64,73.75],\n", 105 | " 'date_range' : ['2018-11-15','2018-11-15']},\n", 106 | " { 'short_name' : 'ATL10',\n", 107 | " 'spatial_extent' :[-168.65,73.74,-168.64,73.75],\n", 108 | " 'date_range' : ['2018-11-15','2018-11-15']}]" 109 | ] 110 | }, 111 | { 112 | "cell_type": "code", 113 | "execution_count": 5, 114 | "metadata": {}, 115 | "outputs": [], 116 | "source": [ 117 | "for req in requests:\n", 118 | " #### Download this data: (uncomment and run)\n", 119 | " region_a = ipd.Icesat2Data(req['short_name'], req['spatial_extent'], req['date_range'])\n", 120 | " if DOWNLOAD:\n", 121 | " region_a.earthdata_login(earthdata_uid, earthdata_email)\n", 122 | " region_a.download_granules(data_home,subset=False)\n", 123 | "# print(region_a.dataset)\n", 124 | "# 
print(region_a.dates)\n", 125 | "# print(region_a.start_time)\n", 126 | "# print(region_a.end_time)\n", 127 | "# print(region_a.dataset_version)\n", 128 | "# print(region_a.spatial_extent)\n", 129 | "# region_a.visualize_spatial_extent()" 130 | ] 131 | }, 132 | { 133 | "cell_type": "code", 134 | "execution_count": null, 135 | "metadata": {}, 136 | "outputs": [], 137 | "source": [] 138 | } 139 | ], 140 | "metadata": { 141 | "kernelspec": { 142 | "display_name": "Python 3", 143 | "language": "python", 144 | "name": "python3" 145 | }, 146 | "language_info": { 147 | "codemirror_mode": { 148 | "name": "ipython", 149 | "version": 3 150 | }, 151 | "file_extension": ".py", 152 | "mimetype": "text/x-python", 153 | "name": "python", 154 | "nbconvert_exporter": "python", 155 | "pygments_lexer": "ipython3", 156 | "version": "3.8.3" 157 | } 158 | }, 159 | "nbformat": 4, 160 | "nbformat_minor": 4 161 | } 162 | -------------------------------------------------------------------------------- /Notebooks/__pycache__/convert_GPS_time.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Notebooks/__pycache__/convert_GPS_time.cpython-36.pyc -------------------------------------------------------------------------------- /Notebooks/__pycache__/convert_julian.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Notebooks/__pycache__/convert_julian.cpython-36.pyc -------------------------------------------------------------------------------- /Notebooks/__pycache__/readers.cpython-36.pyc: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Notebooks/__pycache__/readers.cpython-36.pyc -------------------------------------------------------------------------------- /Notebooks/__pycache__/readers.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Notebooks/__pycache__/readers.cpython-37.pyc -------------------------------------------------------------------------------- /Notebooks/__pycache__/utils.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Notebooks/__pycache__/utils.cpython-36.pyc -------------------------------------------------------------------------------- /Notebooks/__pycache__/utils.cpython-37.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/Notebooks/__pycache__/utils.cpython-37.pyc -------------------------------------------------------------------------------- /Notebooks/readers.py: -------------------------------------------------------------------------------- 1 | import warnings 2 | warnings.filterwarnings('ignore') 3 | import numpy as np 4 | import pandas as pd 5 | import datetime as dt 6 | import matplotlib.pyplot as plt 7 | import cartopy.crs as ccrs 8 | import h5py 9 | import scipy 10 | from astropy.time import Time 11 | #from icepyx import icesat2data as ipd 12 | 13 | def getATL03(f,beam): 14 | # height of each received photon, relative to the WGS-84 ellipsoid (with some, not all corrections applied, see background info above) 15 | heights=f[beam]['heights']['h_ph'][:] 16 | # latitude (decimal degrees) of each received 
photon 17 | lats=f[beam]['heights']['lat_ph'][:] 18 | # longitude (decimal degrees) of each received photon 19 | lons=f[beam]['heights']['lon_ph'][:] 20 | # seconds from ATLAS Standard Data Product Epoch. use the epoch parameter to convert to gps time 21 | dt=f[beam]['heights']['delta_time'][:] 22 | # confidence level associated with each photon event 23 | # -2: TEP 24 | # -1: Events not associated with a specific surface type 25 | # 0: noise 26 | # 1: buffer but algorithm classifies as background 27 | # 2: low 28 | # 3: medium 29 | # 4: high 30 | # Surface types for signal classification confidence 31 | # 0=Land; 1=Ocean; 2=SeaIce; 3=LandIce; 4=InlandWater 32 | conf=f[beam]['heights']['signal_conf_ph'][:,2] #choose column 2 for confidence of sea ice photons 33 | # number of ATL03 20m segments 34 | n_seg, = f[beam]['geolocation']['segment_id'].shape 35 | # first photon in the segment (convert to 0-based indexing) 36 | Segment_Index_begin = f[beam]['geolocation']['ph_index_beg'][:] - 1 37 | # number of photon events in the segment 38 | Segment_PE_count = f[beam]['geolocation']['segment_ph_cnt'][:] 39 | # along-track distance for each ATL03 segment 40 | Segment_Distance = f[beam]['geolocation']['segment_dist_x'][:] 41 | # along-track distance (x) for photon events 42 | x_atc = np.copy(f[beam]['heights']['dist_ph_along'][:]) 43 | # cross-track distance (y) for photon events 44 | y_atc = np.copy(f[beam]['heights']['dist_ph_across'][:]) 45 | 46 | for j in range(n_seg): 47 | # index for 20m segment j 48 | idx = Segment_Index_begin[j] 49 | # number of photons in 20m segment 50 | cnt = Segment_PE_count[j] 51 | # add segment distance to along-track coordinates 52 | x_atc[idx:idx+cnt] += Segment_Distance[j] 53 | df03=pd.DataFrame({'lats':lats,'lons':lons,'x':x_atc,'y':y_atc,'heights':heights,'dt':dt,'conf':conf}) 54 | return df03 55 | 56 | def getATL07(f,beam): 57 | lats = f[beam+'/sea_ice_segments/latitude'][:] 58 | lons = f[beam+'/sea_ice_segments/longitude'][:] 59 | dt = 
f[beam+'/sea_ice_segments/delta_time'][:] 60 | seg_x = f[beam+'/sea_ice_segments/seg_dist_x'][:] 61 | heights = f[beam+'/sea_ice_segments/heights/height_segment_height'][:] 62 | conf = f[beam+'/sea_ice_segments/heights/height_segment_confidence'][:] 63 | stype = f[beam+'/sea_ice_segments/heights/height_segment_type'][:] 64 | ssh_flag = f[beam+'/sea_ice_segments/heights/height_segment_ssh_flag'][:] 65 | gauss = f[beam+'/sea_ice_segments/heights/height_segment_w_gaussian'][:] 66 | photon_rate = f[beam+'/sea_ice_segments/stats/photon_rate'][:] 67 | cloud = f[beam+'/sea_ice_segments/stats/cloud_flag_asr'][:] 68 | mss = f[beam+'/sea_ice_segments/geophysical/height_segment_mss'][:] 69 | ocean_tide = f[beam+'/sea_ice_segments/geophysical/height_segment_ocean'][:] 70 | lpe_tide = f[beam+'/sea_ice_segments/geophysical/height_segment_lpe'][:] 71 | ib = f[beam+'/sea_ice_segments/geophysical/height_segment_ib'][:] 72 | df07=pd.DataFrame({'lats':lats,'lons':lons,'heights':heights,'dt':dt,'conf':conf,'stype':stype,'ssh_flag':ssh_flag, 'gauss':gauss,'photon_rate':photon_rate,'cloud':cloud,'mss':mss,'ocean':ocean_tide,'lpe':lpe_tide,'ib':ib}) 73 | return df07 74 | 75 | 76 | 77 | 78 | 79 | #--------- READERS COPIED FROM 2019 HACKWEEK TUTORIALS (not including the dictionary/xarray readers) ----- 80 | 81 | def getATL03data(fileT, numpyout=False, beam='gt1l'): 82 | """ Pandas/numpy ATL03 reader 83 | Written by Alek Petty, June 2018 (alek.a.petty@nasa.gov) 84 | I've picked out the variables from ATL03 I think are of most interest to 85 | sea ice users, but by no means is this an exhaustive list.
86 | See the xarray or dictionary readers to load in the more complete ATL03 dataset 87 | or explore the hdf5 files themselves (I like using the app Panoply for this) to 88 | see what else you might want 89 | 90 | Args: 91 | fileT (str): File path of the ATL03 dataset 92 | numpy (flag): Binary flag for outputting numpy arrays (True) or pandas dataframe (False) 93 | beam (str): ICESat-2 beam (the number is the pair, r=strong, l=weak) 94 | 95 | returns: 96 | either: select numpy arrays or a pandas dataframe 97 | """ 98 | 99 | # Open the file 100 | try: 101 | ATL03 = h5py.File(fileT, 'r') 102 | except: 103 | print('Not a valid file') 104 | 105 | lons=ATL03[beam+'/heights/lon_ph'][:] 106 | lats=ATL03[beam+'/heights/lat_ph'][:] 107 | 108 | # Number of seconds since the GPS epoch on midnight Jan. 6, 1980 109 | delta_time=ATL03[beam+'/heights/delta_time'][:] 110 | 111 | # #Add this value to delta time parameters to compute the full gps_seconds 112 | atlas_epoch=ATL03['/ancillary_data/atlas_sdp_gps_epoch'][:] 113 | 114 | # Conversion of delta_time to a calendar date 115 | # This function seems pretty convoluted but it works for now.. 116 | # I'm sure there is a simpler function we can use here instead.
117 | temp = ut.convert_GPS_time(atlas_epoch[0] + delta_time, OFFSET=0.0) 118 | 119 | # Express delta_time relative to start time of granule 120 | delta_time_granule=delta_time-delta_time[0] 121 | 122 | year = temp['year'][:].astype('int') 123 | month = temp['month'][:].astype('int') 124 | day = temp['day'][:].astype('int') 125 | hour = temp['hour'][:].astype('int') 126 | minute = temp['minute'][:].astype('int') 127 | second = temp['second'][:].astype('int') 128 | 129 | dFtime=pd.DataFrame({'year':year, 'month':month, 'day':day, 130 | 'hour':hour, 'minute':minute, 'second':second}) 131 | 132 | 133 | # Primary variables of interest 134 | 135 | # Photon height 136 | heights=ATL03[beam+'/heights/h_ph'][:] 137 | print(heights.shape) 138 | 139 | # Flag for signal confidence 140 | # column index: 0=Land; 1=Ocean; 2=SeaIce; 3=LandIce; 4=InlandWater 141 | # values: 142 | #-- -1: Events not associated with a specific surface type 143 | #-- 0: noise 144 | #-- 1: buffer but algorithm classifies as background 145 | #-- 2: low 146 | #-- 3: medium 147 | #-- 4: high 148 | signal_confidence=ATL03[beam+'/heights/signal_conf_ph'][:,2] 149 | 150 | # Add photon rate, background rate etc to the reader here if we want 151 | 152 | ATL03.close() 153 | 154 | 155 | 156 | dF = pd.DataFrame({'heights':heights, 'lons':lons, 'lats':lats, 157 | 'signal_confidence':signal_confidence, 158 | 'delta_time':delta_time_granule}) 159 | 160 | # Add the datetime string 161 | dFtimepd=pd.to_datetime(dFtime) 162 | dF['datetime'] = pd.Series(dFtimepd, index=dF.index) 163 | 164 | # Filter out high elevation values 165 | #dF = dF[(dF['signal_confidence']>2)] 166 | # Reset row indexing 167 | #dF=dF.reset_index(drop=True) 168 | return dF 169 | 170 | # Or return as numpy arrays 171 | # return along_track_distance, heights 172 | 173 | 174 | def getATL07data(fileT, numpy=False, beamNum=1, maxElev=1e6): 175 | """ Pandas/numpy ATL07 reader 176 | Written by Alek Petty, June 2018 (alek.a.petty@nasa.gov) 177 | I've 
picked out the variables from ATL07 I think are of most interest to sea ice users, 178 | but by no means is this an exhaustive list. 179 | See the xarray or dictionary readers to load in the more complete ATL07 dataset 180 | or explore the hdf5 files themselves (I like using the app Panoply for this) to see what else 181 | you might want 182 | 183 | Args: 184 | fileT (str): File path of the ATL07 dataset 185 | numpy (flag): Binary flag for outputting numpy arrays (True) or pandas dataframe (False) 186 | beamNum (int): ICESat-2 beam number (1 to 6) 187 | maxElev (float): maximum surface elevation to remove anomalies 188 | returns: 189 | either: select numpy arrays or a pandas dataframe 190 | 191 | Updates: 192 | V3 (June 2018) added observatory orientation flag, read in the beam number, not the string 193 | V2 (June 2018) used astropy to more simply generate a datetime instance from the gps time 194 | """ 195 | 196 | # Open the file 197 | try: 198 | ATL07 = h5py.File(fileT, 'r') 199 | except: 200 | return 'Not a valid file' 201 | 202 | #flag_values: 0, 1, 2; flag_meanings : backward forward transition 203 | orientation_flag=ATL07['orbit_info']['sc_orient'][:] 204 | 205 | if (orientation_flag==0): 206 | print('Backward orientation') 207 | beamStrs=['gt1l', 'gt1r', 'gt2l', 'gt2r', 'gt3l', 'gt3r'] 208 | 209 | elif (orientation_flag==1): 210 | print('Forward orientation') 211 | beamStrs=['gt3r', 'gt3l', 'gt2r', 'gt2l', 'gt1r', 'gt1l'] 212 | 213 | elif (orientation_flag==2): 214 | print('Transitioning, do not use for science!') 215 | 216 | beamStr=beamStrs[beamNum-1] 217 | print(beamStr) 218 | 219 | lons=ATL07[beamStr+'/sea_ice_segments/longitude'][:] 220 | lats=ATL07[beamStr+'/sea_ice_segments/latitude'][:] 221 | 222 | # Along track distance 223 | # I removed the first point so it's distance relative to the start of the beam 224 | along_track_distance=ATL07[beamStr+'/sea_ice_segments/seg_dist_x'][:] - ATL07[beamStr+'/sea_ice_segments/seg_dist_x'][0] 225 | # Height
segment ID (10 km segments) 226 | height_segment_id=ATL07[beamStr+'/sea_ice_segments/height_segment_id'][:] 227 | # Number of seconds since the GPS epoch on midnight Jan. 6, 1980 228 | delta_time=ATL07[beamStr+'/sea_ice_segments/delta_time'][:] 229 | # Add this value to delta time parameters to compute full gps time 230 | atlas_epoch=ATL07['/ancillary_data/atlas_sdp_gps_epoch'][:] 231 | 232 | leapSecondsOffset=37 233 | gps_seconds = atlas_epoch[0] + delta_time - leapSecondsOffset 234 | # Use astropy to convert from gps time to datetime 235 | tgps = Time(gps_seconds, format='gps') 236 | tiso = Time(tgps, format='datetime') 237 | 238 | # Primary variables of interest 239 | 240 | # Beam segment height 241 | elev=ATL07[beamStr+'/sea_ice_segments/heights/height_segment_height'][:] 242 | # Flag for potential leads, 0=sea ice, 1 = sea surface 243 | ssh_flag=ATL07[beamStr+'/sea_ice_segments/heights/height_segment_ssh_flag'][:] 244 | 245 | #Quality metrics for each segment include confidence level in the surface height estimate, 246 | # which is based on the number of photons, the background noise rate, and the error measure provided by the surface-finding algorithm. 
247 | # Height quality flag, 1 for good fit, 0 for bad 248 | quality=ATL07[beamStr+'/sea_ice_segments/heights/height_segment_quality'][:] 249 | 250 | elev_rms = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_rms'][:] #RMS difference between modeled and observed photon height distribution 251 | seg_length = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_length_seg'][:] # Along track length of segment 252 | height_confidence = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_confidence'][:] # Height segment confidence flag 253 | reflectance = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_asr_calc'][:] # Apparent surface reflectance 254 | ssh_flag = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_ssh_flag'][:] # Flag for potential leads, 0=sea ice, 1 = sea surface 255 | seg_type = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_type'][:] # 0 = Cloud covered 256 | gauss_width = ATL07[beamStr+'/sea_ice_segments/heights/height_segment_w_gaussian'][:] # Width of Gaussian fit 257 | 258 | # Geophysical corrections 259 | # NOTE: All of these corrections except ocean tides, DAC, 260 | # and geoid undulations were applied to the ATL03 photon heights. 
261 | 262 | # AVISO dynamic Atmospheric Correction (DAC) including inverted barometer (IB) effect (±5cm) 263 | dac = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_dac'][:] 264 | # Solid Earth Tides (±40 cm, max) 265 | earth = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_earth'][:] 266 | # Geoid (-105 to +90 m, max) 267 | geoid = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_geoid'][:] 268 | # Local displacement due to Ocean Loading (-6 to 0 cm) 269 | loadTide = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_load'][:] 270 | # Ocean Tides including diurnal and semi-diurnal (harmonic analysis), 271 | # and longer period tides (dynamic and self-consistent equilibrium) (±5 m) 272 | oceanTide = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_ocean'][:] 273 | # Deformation due to centrifugal effect from small variations in polar motion 274 | # (Solid Earth Pole Tide) (±1.5 cm, the ocean pole tide ±2mm amplitude is considered negligible) 275 | poleTide = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_pole'][:] 276 | # Mean sea surface (±2 m) 277 | # Taken from ICESat and CryoSat-2, see Kwok and Morison [2015]) 278 | mss = ATL07[beamStr+'/sea_ice_segments/geophysical/height_segment_mss'][:] 279 | 280 | # Photon rate of the given segment 281 | photon_rate = ATL07[beamStr+'/sea_ice_segments/stats/photon_rate'][:] 282 | 283 | # Estimated background rate from sun angle, reflectance, surface slope 284 | background_rate = ATL07[beamStr+'/sea_ice_segments/stats/backgr_calc'][:] 285 | 286 | ATL07.close() 287 | 288 | if numpy: 289 | # list the variables you want to output here.. 
290 | return along_track_distance, elev 291 | 292 | else: 293 | dF = pd.DataFrame({'elev':elev, 'lons':lons, 'lats':lats, 'ssh_flag':ssh_flag, 294 | 'quality_flag':quality, 295 | 'delta_time':delta_time, 296 | 'along_track_distance':along_track_distance, 297 | 'height_segment_id':height_segment_id, 298 | 'photon_rate':photon_rate,'background_rate':background_rate, 299 | 'datetime':tiso, 'mss': mss, 'seg_length':seg_length}) 300 | 301 | # Add the datetime string 302 | #dFtimepd=pd.to_datetime(dFtime) 303 | #dF['datetime'] = pd.Series(dFtimepd, index=dF.index) 304 | 305 | # Filter out high elevation values 306 | dF = dF[(dF['elev']=minFreeboard)] 376 | dF = dF[(dF['freeboard']<=maxFreeboard)] 377 | 378 | # Also filter based on the confidence and/or quality flag? 379 | 380 | # Reset row indexing 381 | dF=dF.reset_index(drop=True) 382 | 383 | return dF -------------------------------------------------------------------------------- /Notebooks/utils.py: -------------------------------------------------------------------------------- 1 | #Import necessary modules 2 | #Use shorter names (np, pd, plt) instead of full (numpy, pandas, matplotlib.pyplot) for convenience 3 | import numpy as np 4 | import pandas as pd 5 | import matplotlib.pyplot as plt 6 | import cartopy.crs as ccrs 7 | import pandas as pd 8 | import h5py 9 | import xarray as xr 10 | import numpy as np 11 | import pdb 12 | import numpy.ma as ma 13 | 14 | def getSnowandConverttoThickness(dF, snowDepthVar='snowDepth', 15 | snowDensityVar='snowDensity', 16 | outVar='iceThickness'): 17 | """ Grid using nearest neighbour the NESOSIM snow depths to the 18 | high-res ICESat-2 freeboard locations 19 | """ 20 | 21 | # Convert freeboard to thickness 22 | # Need to copy arrays or it will overwrite the pandas column!
23 | freeboardT=np.copy(dF['freeboard'].values) 24 | snowDepthT=np.copy(dF[snowDepthVar].values) 25 | snowDensityT=np.copy(dF[snowDensityVar].values) 26 | ice_thickness = freeboard_to_thickness(freeboardT, snowDepthT, snowDensityT) 27 | #print(ice_thickness) 28 | dF[outVar] = pd.Series(np.array(ice_thickness), index=dF.index) 29 | 30 | return dF 31 | 32 | def freeboard_to_thickness(freeboardT, snow_depthT, snow_densityT): 33 | """ 34 | Hydrostatic equilibrium equation to calculate sea ice thickness 35 | from freeboard and snow depth/density data 36 | 37 | Args: 38 | freeboardT (var): ice freeboard 39 | snow_depthT (var): snow depth 40 | snow_densityT (var): final snow density 41 | 42 | Returns: 43 | ice_thicknessT (var): ice thickness derived using hydrostatic equilibrium 44 | 45 | """ 46 | 47 | # Define density values 48 | rho_w=1024. 49 | rho_i=925. 50 | #rho_s=300. 51 | 52 | # Set snow depth to freeboard where it's bigger than freeboard. 53 | snow_depthT[snow_depthT>freeboardT]=freeboardT[snow_depthT>freeboardT] 54 | 55 | ice_thicknessT = (rho_w/(rho_w-rho_i))*freeboardT - ((rho_w-snow_densityT)/(rho_w-rho_i))*snow_depthT 56 | 57 | return ice_thicknessT 58 | 59 | def getWarrenData(dF, outSnowVar, outDensityVar='None'): 60 | """ 61 | Assign Warren1999 snow depth/density climatology to dataframe 62 | 63 | Added 64 | 65 | Args: 66 | dF (data frame): Pandas dataframe 67 | outSnowVar (string): name of Warren snow depth variable 68 | outDensityVar (string): name of Warren snow density variable 69 | 70 | 71 | Returns: 72 | dF (data frame): Pandas dataframe updated to include colocated Warren snow depth and density 73 | 74 | """ 75 | 76 | # Generate empty lists 77 | snowDepthW99s=ma.masked_all(np.size(dF['freeboard'].values)) 78 | if (outDensityVar!='None'): 79 | snowDensityW99s=ma.masked_all(np.size(dF['freeboard'].values)) 80 | 81 | # Loop over all freeboard values (rows) 82 | for x in range(np.size(dF['freeboard'].values)): 83 | #print(x, dF['lon'].iloc[x],
dF['lat'].iloc[x], dF['month'].iloc[x]-1) 84 | # Subtract 1 from month as the Warren index in the function starts at 0 85 | snowDepthDayW99T, snowDensityW99T=WarrenClimatology(dF['lon'].iloc[x], dF['lat'].iloc[x], dF['datetime'].iloc[x].month-1) 86 | 87 | 88 | # Append values to list 89 | snowDepthW99s[x]=snowDepthDayW99T 90 | if (outDensityVar!='None'): 91 | snowDensityW99s[x]=snowDensityW99T 92 | 93 | # Assign list to dataframe as a series 94 | dF[outSnowVar] = pd.Series(snowDepthW99s, index=dF.index) 95 | if (outDensityVar!='None'): 96 | dF[outDensityVar] = pd.Series(snowDensityW99s, index=dF.index) 97 | 98 | 99 | return dF 100 | 101 | def WarrenClimatology(lonT, latT, monthT): 102 | """ 103 | Get Warren1999 snow depth climatology 104 | 105 | Args: 106 | lonT (var): longitude 107 | latT (var): latitude 108 | monthT (var): month with the index starting at 0 109 | 110 | Returns: 111 | Hs (var): Snow depth (m) 112 | rho_s (var): Snow density (kg/m^3) 113 | 114 | """ 115 | 116 | H_0 = [28.01, 30.28, 33.89, 36.8, 36.93, 36.59, 11.02, 4.64, 15.81, 22.66, 25.57, 26.67] 117 | a = [.127, .1056, .5486, .4046, .0214, .7021, .3008, .31, .2119, .3594, .1496, -0.1876] 118 | b = [-1.1833, -0.5908, -0.1996, -0.4005, -1.1795, -1.4819, -1.2591, -0.635, -1.0292, -1.3483, -1.4643, -1.4229] 119 | c = [-0.1164, -0.0263, 0.0280, 0.0256, -0.1076, -0.1195, -0.0811, -0.0655, -0.0868, -0.1063, -0.1409, -0.1413] 120 | d = [-0.0051, -0.0049, 0.0216, 0.0024, -0.0244, -0.0009, -0.0043, 0.0059, -0.0177, 0.0051, -0.0079, -0.0316] 121 | e = [0.0243, 0.0044, -0.0176, -0.0641, -0.0142, -0.0603, -0.0959, -0.0005, -0.0723, -0.0577, -0.0258, -0.0029] 122 | 123 | # Convert lat and lon into degrees of arc, +x axis along 0 degrees longitude and +y axis along 90E longitude 124 | x = (90.0 - latT)*np.cos(lonT * np.pi/180.0) 125 | y = (90.0 - latT)*np.sin(lonT*np.pi/180.0) 126 | 127 | Hs = H_0[monthT] + a[monthT]*x + b[monthT]*y + c[monthT]*x*y + (d[monthT]*x*x) + (e[monthT]*y*y) 128 | 129 | 130 | # Now get SWE,
although this is not returned by the function 131 | 132 | H_0swe = [8.37, 9.43,10.74,11.67,11.8,12.48,4.01,1.08,3.84,6.24,7.54,8.0] 133 | aswe = [-0.027,0.0058,0.1618,0.0841,-0.0043,0.2084,0.097,0.0712,0.0393,0.1158,0.0567,-0.054] 134 | bswe = [-0.34,-0.1309,0.0276,-0.1328,-0.4284,-0.5739,-0.493,-0.145,-0.2107,-0.2803,-0.3201,-0.365] 135 | cswe = [-0.0319,0.0017,0.0213,0.0081,-0.038,-0.0468,-0.0333,-0.0155,-0.0182,-0.0215,-0.0284,-0.0362] 136 | dswe = [-0.0056,-0.0021,0.0076,-0.0003,-0.0071,-0.0023,-0.0026,0.0014,-0.0053,0.0015,-0.0032,-0.0112] 137 | eswe = [-0.0005,-0.0072,-0.0125,-0.0301,-0.0063,-0.0253,-0.0343,0,-0.019,-0.0176,-0.0129,-0.0035] 138 | 139 | 140 | swe = H_0swe[monthT] + aswe[monthT]*x + bswe[monthT]*y + cswe[monthT]*x*y + dswe[monthT]*x*x + eswe[monthT]*y*y 141 | 142 | # Density in kg/m^3 143 | rho_s = 1000.*(swe/Hs) 144 | #print(ma.mean(rho_s)) 145 | 146 | # Could mask out bad regions (i.e. land) here if desired. 147 | # Hsw[where(region_maskG<9.6)]=np.nan 148 | # Hsw[where(region_maskG==14)]=np.nan 149 | # Hsw[where(region_maskG>15.5)]=np.nan 150 | 151 | # Could mask out bad regions (i.e. land) here if desired. 152 | #rho_s[where(region_maskG<9.6)]=np.nan 153 | #rho_s[where(region_maskG==14)]=np.nan 154 | #rho_s[where(region_maskG>15.5)]=np.nan 155 | 156 | # Convert snow depth to meters 157 | Hs=Hs/100. 158 | 159 | return Hs, rho_s 160 | 161 | def get_psnlatslons(data_path, res=25): 162 | """ Get NSIDC polar stereographic grid data""" 163 | 164 | if (res==25): 165 | # 25 km grid 166 | mask_latf = open(data_path+'/psn25lats_v3.dat', 'rb') 167 | mask_lonf = open(data_path+'/psn25lons_v3.dat', 'rb') 168 | lats_mask = reshape(fromfile(file=mask_latf, dtype='1)|(iceConcNDay<0.01)|np.isnan(snowDensityNDay)) 325 | 326 | snowDepthNDay[mask]=np.nan 327 | snowDensityNDay[mask]=np.nan 328 | 329 | snowDepthNDay=snowDepthNDay 330 | snowDensityNDay=snowDensityNDay 331 | 332 | # I think it's better to declare array now so memory is allocated before the loop? 
333 | snowDepthGISs=np.zeros((dF.shape[0])) 334 | snowDensityGISs=np.zeros((dF.shape[0])) 335 | 336 | # Should change this to an apply or lamda function 337 | for x in range(dF.shape[0]): 338 | 339 | # Use nearest neighbor to find snow depth at IS2 point 340 | #snowDepthGISs[x] = griddata((xptsDay, yptsDay), snowDepthDay, (dF['xpts'].iloc[x], dF['ypts'].iloc[x]), method='nearest') 341 | #snowDensityGISs[x] = griddata((xptsDay, yptsDay), densityDay, (dF['xpts'].iloc[x], dF['ypts'].iloc[x]), method='nearest') 342 | 343 | # Think this is the much faster way to find nearest neighbor! 344 | dist = np.sqrt((latsN-dF['lat'].iloc[x])**2+(lonsN-dF['lon'].iloc[x])**2) 345 | index_min = np.argmin(dist) 346 | snowDepthGISs[x]=snowDepthNDay[index_min] 347 | snowDensityGISs[x]=snowDensityNDay[index_min] 348 | 349 | 350 | dF[outSnowVar] = pd.Series(snowDepthGISs, index=dF.index) 351 | dF[outDensityVar] = pd.Series(snowDensityGISs, index=dF.index) 352 | 353 | return dF 354 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # ICESAT-2HackWeek-sea-ice-tutorials 2 | 3 | Authors: Ellen Buckley and Alek Petty 4 | 5 | Date: June 2020 6 | 7 | ICESat-2 hackweek repository for the sea ice tutorials. Primarily hosts the Jupyter notebooks which contain the individual tutorials. 8 | 9 | 10 | ![icesat2_seaice](./Images/icesat2_seaice.png?raw=true "ICESat-2 profiling the sea ice surface, taken from the ICESat-2 website (Satellite image courtesy of Orbital 11 | Earth image illustrating AMSR-E sea ice courtesy of the NASA Scientific Visualization Studio)") 12 | 13 | 14 | 15 | ## Setup 16 | 17 | This GitHub repository primarily hosts the Jupyter Notebooks needed for the hackweek tutorials. The notebooks should work without any extra steps if you're working in the ICESat-2 Pangeo environment that has been created for this hackweek. 
Just clone this repository into your Pangeo user account (`git clone https://github.com/ICESAT-2HackWeek/sea-ice-tutorials` when logged in). 18 | 19 | The example data files are stored in the shared Pangeo folder `/home/jovyan/tutorial-data/sea-ice/`. 20 | 21 | 22 | ## Notebooks 23 | 24 | 0. DataAccess.ipynb 25 | * Run this notebook to download the ICESat-2 granules required for the tutorials. 26 | 27 | 1. ATL03.ipynb 28 | * General understanding of the data included in a typical ATL03 file. 29 | * Reading in, plotting and basic analysis of ATL03 data. 30 | * Understanding along-track distance and time variables. 31 | * Understanding of the differences between weak and strong beams. 32 | 33 | 2. ATL07.ipynb 34 | * General understanding of the data included in a typical ATL07 file. 35 | * Reading in, plotting and basic analysis of ATL07 data. 36 | * How variables change with different surface types. 37 | * How clouds affect surface returns. 38 | 39 | 3. ATL10.ipynb 40 | * General understanding of the difference between ATL07 and ATL10. 41 | * Reading in, plotting and basic analysis of ATL10 data. 42 | * How you too can calculate sea ice thickness from freeboard! 43 | 44 | 4. GriddingDemo.ipynb 45 | * Demo binning of ICESat-2 along-track data to the 25 km NSIDC grid. 46 | 47 | 48 | ## Background 49 | 50 | NASA's ICESat-2 launched successfully in September 2018 from Vandenberg Air Force Base, California. ICESat-2 carries a single instrument onboard – the Advanced Topographic Laser Altimeter System (ATLAS). ATLAS measures the travel times of laser pulses to derive the distance between the spacecraft and Earth's surface, and thus an estimate of the Earth's surface elevation with respect to a reference surface. The orbit pattern results in a higher density of shots within the polar regions.
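The freeboard-to-thickness calculation highlighted in the ATL10 notebook description rests on hydrostatic equilibrium (the approach described in Petty et al., 2020). A minimal sketch, using illustrative nominal densities rather than the exact values adopted in the tutorial:

```python
# Convert total (snow + ice) freeboard to sea ice thickness assuming
# hydrostatic equilibrium. The density values below are illustrative
# nominal choices, not necessarily those used in the ATL10 tutorial.
def freeboard_to_thickness(hf, hs, rho_i=915.0, rho_s=320.0, rho_w=1024.0):
    """hf: total freeboard (m), hs: snow depth (m),
    rho_i/rho_s/rho_w: ice/snow/water densities (kg/m^3).
    Returns sea ice thickness (m)."""
    return (rho_w * hf + (rho_s - rho_w) * hs) / (rho_w - rho_i)

# Example: 30 cm total freeboard with 15 cm of snow
print(round(freeboard_to_thickness(0.3, 0.15), 2))  # → 1.85 (m)
```

Note that snow enters with a negative weight: for the same total freeboard, a deeper snow pack implies thinner ice.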
One of ICESat-2's primary mission objectives is to *Estimate sea-ice thickness to examine ice/ocean/atmosphere exchanges of energy, mass and moisture.* 51 | 52 | ![icesat2_profiling](./Images/icesat2_profiling.png?raw=true "ICESat-2 profiling the sea ice surface, figure taken from the ATL07/10 ATBD document") 53 | 54 | ICESat-2 employs a photon-counting system to obtain high measurement sensitivity with lower resource (power) demands on the satellite platform compared to the analog waveform approach of the ICESat laser altimeter GLAS. The ATLAS instrument transmits laser pulses at 532 nm with a high pulse repetition frequency of 10 kHz. The ICESat-2 nominal orbit altitude of ~500 km results in laser footprints of ~17 m on the ground, which are separated by only ~0.7 m along-track (resulting in substantial overlap between the shots). This relatively small footprint size, combined with the high pulse repetition frequency and precise time-of-flight estimates, was chosen in part to enable measurements of sea surface height within leads with a precision of 3 cm or less. 55 | 56 | The laser is split into six beams (three pairs of strong and weak beams), which provide individual profiles of elevation. The multiple beams address the need for unambiguous separation of ice sheet slope from height changes. For sea ice, this provides multiple profiles of sea ice and sea surface heights, increasing overall profiling coverage and enabling assessments of beam reliability. 57 | 58 | The beam configuration and separation are shown above: the beams within each pair have different transmit energies ('weak' and 'strong', with an energy ratio between them of approximately 1:4) and are separated by 90 m in the across-track direction. The beam pairs are separated by ~3.3 km in the across-track direction, and the strong and weak beams are separated by ~2.5 km in the along-track direction.
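The shot spacing quoted above follows directly from the pulse repetition frequency and the orbital ground speed; a quick back-of-envelope check (the ~7 km/s ground-track speed is an assumed approximate value):

```python
# Back-of-envelope check of ATLAS time-of-flight ranging and shot spacing,
# using nominal (approximate) mission parameters.
c = 299_792_458.0      # speed of light (m/s)
altitude = 500e3       # nominal orbit altitude (m)
prf = 10e3             # pulse repetition frequency (Hz)
ground_speed = 7e3     # approximate ground-track speed (m/s)

two_way_time = 2 * altitude / c    # round-trip travel time of one pulse (s)
shot_spacing = ground_speed / prf  # distance between footprint centres (m)

print(f"{two_way_time*1e3:.2f} ms, {shot_spacing:.1f} m")  # prints: 3.34 ms, 0.7 m
```

With a ~3.3 ms round trip and a pulse every 0.1 ms, several tens of pulses are in flight at any moment, which is why precise pulse timing is central to the measurement.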
The observatory orientation is an important consideration, as this changes the labelling of the beams - i.e. 'gt1r' refers to the left-side strong beam when in the forward direction (beam 5) but the left-side weak beam when in the backward direction (beam 2). 59 | 60 | The ICESat-2 products of most interest to the sea ice community are: 61 | 62 | * ATL03: Along-track photon heights (1. ATL03.ipynb tutorial) 63 | * ATL07: Along-track segment surface heights (2. ATL07.ipynb tutorial) 64 | * ATL09: Cloud products (no direct tutorial provided, used mainly in ATL07 production for cloud filtering) 65 | * ATL10: Along-track segment (and 10 km swath) freeboards (3. ATL10.ipynb tutorial) 66 | * ATL20: Gridded monthly sea ice freeboard (expected release summer 2020). 67 | 68 | The notebooks provide a brief summary of these data products, but we encourage the user to read the ATBDs and the references provided above and at the start of the Jupyter Notebooks for more complete (and probably more accurate) descriptions. 69 | 70 | The ICESat-2 data products are provided in Hierarchical Data Format version 5 (HDF-5) and have recently been made publicly available through the National Snow and Ice Data Center (NSIDC - https://nsidc.org/data/icesat-2). See the hdf5 tutorial (https://github.com/ICESAT-2HackWeek/intro-hdf5) for more information on this data format. 71 | 72 | 73 | ## References 74 | 75 | Kwok, R., Markus, T., Kurtz, N. T., Petty, A. A., Neumann, T. A., Farrell, S. L., et al. (2019). Surface height and sea ice freeboard of the Arctic Ocean from ICESat-2: Characteristics and early results. Journal of Geophysical Research: Oceans, 124, doi: 10.1029/2019JC015486. 76 | 77 | Markus, T., Neumann, T., Martino, A., Abdalati, W., Brunt, K., Csatho, B., et al. (2017). The Ice, Cloud and land Elevation Satellite-2 (ICESat-2): Science requirements, concept, and implementation. Remote Sensing of Environment, 190, 260-273, doi: 10.1016/j.rse.2016.12.029.
78 | 79 | Neumann, T. A., Martino, A. J., Markus, T., Bae, S., Bock, M. R., Brenner, A. C., et al. (2019). The Ice, Cloud, and Land Elevation Satellite-2 mission: A global geolocated photon product derived from the Advanced Topographic Laser Altimeter System. Remote Sensing of Environment, 233, 111325, doi: 10.1016/j.rse.2019.111325. 80 | 81 | Petty, A. A., Kurtz, N. T., Kwok, R., Markus, T., Neumann, T. A. (2020). Winter Arctic sea ice thickness from ICESat-2 freeboards. Journal of Geophysical Research: Oceans, 125, e2019JC015764, doi: 10.1029/2019JC015764. -------------------------------------------------------------------------------- /icesat2_seaice.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ICESAT-2HackWeek/sea-ice-tutorials/8f744f714cafa0bc7fba0a208aa55c9baaf92ec0/icesat2_seaice.png --------------------------------------------------------------------------------