├── .gitignore
├── LICENSE
├── README.md
├── bin
│   ├── FastXML.py
│   ├── email_finished.py
│   ├── get_inventory_asf.py
│   ├── index.html
│   ├── indexTemplate.html
│   ├── isce2aws.py
│   ├── isce2cog.py
│   ├── isce2overviews.py
│   ├── plot_inventory_asf.py
│   ├── prep_topsApp.py
│   ├── prep_topsApp_aws.py
│   ├── proc_batch_master.py
│   ├── proc_batch_sequential.py
│   ├── proc_ifg_cfn.py
│   ├── proc_ifg_cfn_exp.py
│   ├── proc_ifg_cfn_spot.py
│   ├── process_interferogramEC2.py
│   ├── run_interferogram.sh
│   └── run_interferogram_docker.sh
├── docker
│   └── readme.md
├── docs
│   └── readme.md
├── s1batch
│   └── readme.md
└── s1single
    └── readme.md
-------------------------------------------------------------------------------- /.gitignore: --------------------------------------------------------------------------------
*.pem
*.yml
*.bak
.DS_Store

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# SageMath parsed files
*.sage.py

# dotenv
.env

# virtualenv
.venv
venv/
ENV/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
-------------------------------------------------------------------------------- /LICENSE: --------------------------------------------------------------------------------
MIT License

Copyright (c) 2017 Scott Henderson

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
-------------------------------------------------------------------------------- /README.md: --------------------------------------------------------------------------------
# **NOTE: moving development to new repository: https://github.com/scottyhq/dinosar**


# dinoSARaws
Interferometric Synthetic Aperture Radar (InSAR) processing with Amazon Web Services (AWS)


### What does dinoSARaws do?
dinoSARaws is software that enables on-demand processing of single interferograms and sets of interferograms for a given area of interest in the cloud. The software includes a collection of Python scripts and [Amazon Web Services](https://aws.amazon.com) recipes. Currently, dinoSARaws works with [Sentinel-1](http://www.esa.int/Our_Activities/Observing_the_Earth/Copernicus/Sentinel-1) data, which is provided by the European Space Agency (ESA).


### Rationale
Radar remote sensing is at a pivotal moment in which cloud computing is becoming a necessary and advantageous replacement for traditional desktop scientific computing. Starting with the Sentinel-1 constellation of satellites launched by ESA in 2014 and 2016, practitioners of radar interferometry for the first time have open access to an ever-expanding global dataset.

The Sentinel-1 archive, hosted at the [ASF DAAC](https://www.asf.alaska.edu), now contains on the order of 100 images for any given location on the globe, and the archive is growing at 1 Petabyte per year. In fact, NASA has [estimated that by 2025](https://earthdata.nasa.gov/about/eosdis-cloud-evolution) it will be storing upwards of 250 Petabytes (PB) of its data using commercial cloud services (e.g. Amazon Web Services).

The predicted tenfold increase in archive size is due in large part to the upcoming joint NASA/Indian Space Research Organisation (ISRO) Synthetic Aperture Radar [NISAR](https://nisar.jpl.nasa.gov) mission, which is currently scheduled to launch in 2021. NISAR will produce as much as 85 TB of imagery per day (45 PB/year). The availability of these data in cloud environments, co-located with a wide range of cloud computing resources, could revolutionize how scientists use these datasets and provide opportunities for important scientific advancements.

dinoSARaws is an attempt to lay important groundwork for the transition to cloud computing. It is designed to be a template for other researchers to migrate their own processing and custom analysis workflows. With this in mind, it is based on purely open tools.


### How to run dinoSARaws?
dinoSARaws assumes that you are familiar with a Linux terminal, have an AWS account, and have a user agreement permitting use of [ISCE Software](https://winsar.unavco.org/isce.html). There is more detailed documentation in the [docs](./docs) folder of this repository for setting everything up. What follows is a quickstart tutorial for running on Ubuntu Linux.
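Before the quickstart, it is worth confirming that AWS credentials are already configured, since the processing scripts shell out to the AWS CLI and use boto3. A minimal check (a sketch, assuming boto3 is installed in the active environment):
```python
import boto3

# Prints the AWS account ID in use; raises NoCredentialsError
# if ~/.aws/credentials (or the AWS_* environment variables) are not set
print(boto3.client('sts').get_caller_identity()['Account'])
```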

Install the Continuum Analytics Python distribution
```
wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash Miniconda3-latest-Linux-x86_64.sh
```

Download the dinoSARaws code (this repository)
```
git clone https://github.com/scottyhq/dinoSARaws.git
cd dinoSARaws
conda env create -f dinoSARaws.yml
source activate dinoSARaws
export PATH=`pwd`/bin:$PATH
```

Run dinoSARaws - this example processes a single interferogram covering Three Sisters Volcano in Oregon, USA
```
proc_ifg_cfn.py -i c5.4xlarge -p 115 -m 20170927 -s 20150914 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5
```


### Other SAR Cloud Computing efforts
This is a non-exhaustive list of other efforts that are leading the way:
* [hyp3](http://hyp3.asf.alaska.edu) - on-demand SAR processing prototype developed at ASF for user-specified regions of interest
* [GMTSAR on AWS](https://www.asf.alaska.edu/asf-tutorials/data-recipes/gmt5sar/gmt5sar-cloud/gmt5sar-os-x/) - a nice tutorial on processing a single Sentinel-1 interferogram on AWS with [GMTSAR](http://topex.ucsd.edu/gmtsar) software
* [GRFN](https://www.asf.alaska.edu/news-notes/2017-summer/getting-ready-for-nisar-grfn/) - prototype SAR processing on AWS using Sentinel-1 as a proxy for NISAR. Interferograms are currently generated automatically for [select locations](https://search.earthdata.nasa.gov/search?q=GRFN&ok=GRFN)
* [GeohazardsTEP](https://geohazards-tep.eo.esa.int/#!) - cloud computing initiative from the [Committee on Earth Observation Satellites (CEOS)](http://ceos.org)
* [LicSAR](http://comet.nerc.ac.uk/COMET-LiCS-portal/) - automated global Sentinel-1 processing with proprietary GAMMA software at COMET/University of Leeds - currently implemented for select orbital tracks and regions
-------------------------------------------------------------------------------- /bin/FastXML.py: --------------------------------------------------------------------------------
#!/usr/bin/env python3

#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# copyright: 2013 to the present, california institute of technology.
# all rights reserved. united states government sponsorship acknowledged.
# any commercial use must be negotiated with the office of technology transfer
# at the california institute of technology.
#
# this software may be subject to u.s. export control laws. by accepting this
# software, the user agrees to comply with all applicable u.s. export laws and
# regulations. user has the responsibility to obtain export licenses, or other
# export authority as may be required before exporting such information to
# foreign countries or providing access to foreign persons.
#
# installation and use of this software is restricted by a license agreement
# between the licensee and the california institute of technology. it is the
# user's responsibility to abide by the terms of the license agreement.
#
# Author: Piyush Agram
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
from collections import OrderedDict
import xml.etree.ElementTree as ET

class Component(OrderedDict):
    '''
    Class for storing component information.
    '''
    def __init__(self, name=None, data=None):

        if name in [None, '']:
            raise Exception('Component must have a name')

        self.name = name

        if data is None:
            self.data = OrderedDict()
        elif isinstance(data, OrderedDict):
            self.data = data
        elif isinstance(data, dict):
            self.data = OrderedDict()
            for key, val in data.items():
                self.data[key] = val
        else:
            raise Exception('Component data in __init__ should be a dict or ordereddict')


    def __getitem__(self, key):
        return self.data[key]

    def __setitem__(self, key, value):
        if not isinstance(key, str):
            raise Exception('Component key must be a string')

        self.data[key] = value

    def toXML(self):
        '''
        Creates an XML element from the component.
        '''
        root = ET.Element('component')
        root.attrib['name'] = self.name

        for key, val in self.data.items():
            if isinstance(val, Catalog):
                compSubEl = ET.SubElement(root, 'component')
                compSubEl.attrib['name'] = key
                ET.SubElement(compSubEl, 'catalog').text = str(val.xmlname)

            elif isinstance(val, Component):
                if key != val.name:
                    print("WARNING: dictionary name and Component name don't match")
                    print('Proceeding with Component name')
                root.append(val.toXML())

            elif (isinstance(val, dict) or isinstance(val, OrderedDict)):
                obj = Component(name=key, data=val)
                root.append(obj.toXML())

            elif (not isinstance(val, dict)) and (not isinstance(val, OrderedDict)):
                propSubEl = ET.SubElement(root, 'property')
                propSubEl.attrib['name'] = key
                ET.SubElement(propSubEl, 'value').text = str(val)

        return root

    def writeXML(self, filename, root='dummy', noroot=False):
        '''
        Write the component information to an XML file.
        '''
        if root in [None, '']:
            raise Exception('Root name cannot be blank')

        if noroot:
            fileRoot = self.toXML()
        else:
            ####Convert component to XML and nest it under the named file root
            fileRoot = ET.Element(root)
            fileRoot.append(self.toXML())

        indentXML(fileRoot)

        ####Write file
        etObj = ET.ElementTree(fileRoot)
        etObj.write(filename, encoding='unicode')

class Catalog(object):
    '''
    Class for storing catalog key.
    '''
    def __init__(self, name):
        self.xmlname = name

def indentXML(elem, depth=None, last=None):
    if depth is None:
        depth = [0]
    if last is None:
        last = False
    tab = u' '*4
    if(len(elem)):
        depth[0] += 1
        elem.text = u'\n' + (depth[0])*tab
        lenEl = len(elem)
        lastCp = False
        for i in range(lenEl):
            if(i == lenEl - 1):
                lastCp = True
            indentXML(elem[i], depth, lastCp)
        if(not last):
            elem.tail = u'\n' + (depth[0])*tab
        else:
            depth[0] -= 1
            elem.tail = u'\n' + (depth[0])*tab
    else:
        if(not last):
            elem.tail = u'\n' + (depth[0])*tab
        else:
            depth[0] -= 1
            elem.tail = u'\n' + (depth[0])*tab
-------------------------------------------------------------------------------- /bin/email_finished.py: --------------------------------------------------------------------------------
#!/usr/bin/env python3
'''
Send email with link to s3 bucket where finished interferogram is stored

NOTE: localhost doesn't have a mail server set up by default..
ConnectionRefusedError: [Errno 61] Connection refused
https://www.nixtutor.com/linux/send-mail-through-gmail-with-python/

For more complicated emails, including attachments, see:
https://stackoverflow.com/questions/3362600/how-to-send-email-attachments

'''
#import os
import smtplib
from email.message import EmailMessage

#server = smtplib.SMTP('localhost')
sender = 'scottyhq'
receivers = ['scottyh@uw.edu']

# Plain text message
body = '''
Hi Scott, your interferogram has finished. It is stored here:
s3://interferogram

You can download it with this command:
aws s3 cp s3://interferogram . --recursive

'''
msg = EmailMessage()
msg['Subject'] = 'Interferogram has finished!'
msg['From'] = sender
msg['To'] = receivers
msg.set_content(body)

with smtplib.SMTP('smtp.gmail.com:587') as server:
    server.starttls()
    server.login('scottyhq', 'CHANGE_ME')  # NOTE: don't commit real credentials
    server.send_message(msg)

-------------------------------------------------------------------------------- /bin/get_inventory_asf.py: --------------------------------------------------------------------------------
#!/usr/bin/env python3
'''
Query ASF catalog with SNWE bounds (manually entered, or using arbitrary polygon
bounding box)

Outputs 2 JSON metadata files for S1A and S1B from ASF Vertex
Outputs 1 merged GeoJSON inventory file

Author: Scott Henderson
Date: 10/2017
'''

import argparse
import requests
import json
import shapely.wkt
from shapely.geometry import box, mapping
import pandas as pd
import geopandas as gpd
import os


def cmdLineParse():
    '''
    Command line parser.
    '''
    parser = argparse.ArgumentParser(description='get_inventory_asf.py')
    parser.add_argument('-r', type=float, nargs=4, dest='roi', required=False,
                        metavar=('S','N','W','E'),
                        help='Region of interest bbox [S,N,W,E]')
    parser.add_argument('-i', type=str, dest='input', required=False,
                        help='Polygon vector file defining region of interest')
    parser.add_argument('-b', type=float, dest='buffer', required=False,
                        help='Add buffer [in degrees]')
    parser.add_argument('-f', action='store_true', default=False, dest='footprints', required=False,
                        help='Create subfolders with geojson footprints')
    parser.add_argument('-k', action='store_true', default=False, dest='kmls', required=False,
                        help='Download kmls from ASF API')
    parser.add_argument('-c', action='store_true', default=False, dest='csvs', required=False,
                        help='Download csvs from ASF API')

    return parser.parse_args()


def load_asf_json(jsonfile):
    ''' Convert JSON metadata from asf query to dataframe '''
    with open(jsonfile) as f:
        meta = json.load(f)[0]  #list of scene dictionaries

    df = pd.DataFrame(meta)
    polygons = df.stringFootprint.apply(shapely.wkt.loads)
    gf = gpd.GeoDataFrame(df,
                          crs={'init': 'epsg:4326'},
                          geometry=polygons)

    gf['timeStamp'] = pd.to_datetime(gf.sceneDate, format='%Y-%m-%d %H:%M:%S')
    gf['sceneDateString'] = gf.timeStamp.apply(lambda x: x.strftime('%Y-%m-%d'))
    gf['dateStamp'] = pd.to_datetime(gf.sceneDateString)
    gf['utc'] = gf.timeStamp.apply(lambda x: x.strftime('%H:%M:%S'))
    gf['orbitCode'] = gf.relativeOrbit.astype('category').cat.codes

    return gf

def summarize_orbits(gf):
    '''
    break inventory into separate dataframes by relative orbit
    in most cases, inventory includes 2 adjacent frames (same date)
    '''
    # NOTE: could probably avoid for loop w/ hierarchical index?
    #df115 = gf.groupby('relativeOrbit').get_group('115')
    #gb = gf.groupby(['relativeOrbit', 'sceneDateString'])
    #df = gb.get_group( ('93', '2017-05-11') ) # entire dataframes
    # just polygons:
    #gb.geometry.get_group( ('93', '2017-05-11') ) #.values
    # just granule names per date (nFrames):
    #gb.granuleName.get_group( ('93', '2017-05-11') ) #.tolist()
    # Alternatively
    #gf.loc[gb.groups[ ('93', '2017-05-11') ], ('granuleName','geometry')]
    for orb in gf.relativeOrbit.unique():
        df = gf.query('relativeOrbit == @orb')
        gb = df.groupby('sceneDateString')
        nFrames = gb.granuleName.count()
        #frameLists = gb.granuleName.apply(lambda x: list(x))
        df = df.loc[:, ['sceneDateString','dateStamp','platform']]  #consider list of granuleName
        #Only keep one frame per date, not necessarily all intersecting...
        DF = df.drop_duplicates('sceneDateString').reset_index(drop=True)
        DF.sort_values('sceneDateString', inplace=True)
        DF.reset_index(inplace=True, drop=True)
        timeDeltas = DF.dateStamp.diff()
        DF['dt'] = timeDeltas.dt.days
        #DF.dt.iloc[0] #Causes set w/ copy warning...
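        # NOTE: chained indexing like DF.dt.iloc[0] = 0 assigns through a temporary
        # object (hence the SettingWithCopyWarning); the single .loc call below
        # sets the first time delta in place instead.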
        DF.loc[0, 'dt'] = 0
        DF['dt'] = DF.dt.astype('i2')
        DF['nFrames'] = nFrames.values
        DF.drop('dateStamp', axis=1, inplace=True)
        #DF.set_index('date') # convert to datetime difference
        DF.to_csv('acquisitions_{}.csv'.format(orb))

def save_shapefiles(gf, master, slave):
    '''
    NOTE: alternative is to use shapely/geopandas to check overlapping area
    Save shapefiles of master & slave to confirm overlap in GE
    '''
    print('TODO')


def save_geojson_footprints(gf):
    '''
    Saves all frames from each date as separate geojson file for comparison on github
    '''
    attributes = ('granuleName','downloadUrl','geometry')  #NOTE: could add IPF version...
    gb = gf.groupby(['relativeOrbit', 'sceneDateString'])
    S = gf.groupby('relativeOrbit').sceneDateString.unique()  #series w/ list of unique dates
    for orbit, dateList in S.iteritems():
        os.makedirs(orbit)
        for date in dateList:
            dftmp = gf.loc[gb.groups[(orbit, date)], attributes].reset_index(drop=True)
            outname = os.path.join(orbit, date + '.geojson')
            dftmp.to_file(outname, driver='GeoJSON')

def summarize_inventory(gf):
    '''
    Basic statistics for each track
    '''
    dfS = pd.DataFrame(index=gf.relativeOrbit.unique())
    dfS['Start'] = gf.groupby('relativeOrbit').sceneDateString.min()
    dfS['Stop'] = gf.groupby('relativeOrbit').sceneDateString.max()
    dfS['Dates'] = gf.groupby('relativeOrbit').sceneDateString.nunique()
    dfS['Frames'] = gf.groupby('relativeOrbit').sceneDateString.count()
    dfS['Direction'] = gf.groupby('relativeOrbit').flightDirection.first()
    dfS['UTC'] = gf.groupby('relativeOrbit').utc.first()
    dfS.sort_index(inplace=True, ascending=False)
    dfS.index.name = 'Orbit'
    dfS.to_csv('inventory_summary.csv')
    print(dfS)
    size = dfS.Frames.sum()*5 / 1e3
    print('Approximate Archive size = {} TB'.format(size))


def merge_inventories(s1Afile, s1Bfile):
    '''
    Merge Sentinel 1A and Sentinel 1B into single dataframe
    '''
    gfA = load_asf_json(s1Afile)
    gfB = load_asf_json(s1Bfile)
    gf = pd.concat([gfA, gfB])
    gf.reset_index(inplace=True)
    return gf


def save_inventory(gf, outname='query.geojson', format='GeoJSON'):
    '''
    Save entire inventory as a GeoJSON file (render on github)
    '''
    # WARNING: overwrites existing file
    if os.path.isfile(outname):
        os.remove(outname)
    # NOTE: can't save pandas Timestamps!
    #ValueError: Invalid field type
    gf.drop(['timeStamp', 'dateStamp'], axis=1, inplace=True)
    gf.to_file(outname, driver=format)
    print('Saved inventory: ', outname)

def download_scene(downloadUrl):
    '''
    aria2c --http-auth-challenge=true --http-user=CHANGE_ME --http-passwd='CHANGE_ME' "https://api.daac.asf.alaska.edu/services/search/param?granule_list=S1A_EW_GRDM_1SDH_20151003T040339_20151003T040351_007983_00B2A6_7377&output=metalink"
    '''
    print('Requires ~/.netrc file')
    cmd = 'wget -nc -c {}'.format(downloadUrl)  # -nc won't overwrite; -c continues if unfinished
    print(cmd)
    os.system(cmd)
    #use requests.get(auth=())

def query_asf(snwe, sat='1A', format='json'):
    '''
    takes list of [south, north, west, east]
    '''
    print('Querying ASF Vertex...')
    miny, maxy, minx, maxx = snwe
    roi = shapely.geometry.box(minx, miny, maxx, maxy)
    polygonWKT = roi.to_wkt()

    baseurl = 'https://api.daac.asf.alaska.edu/services/search/param'
    #relativeOrbit=$ORBIT
    data = dict(intersectsWith=polygonWKT,
                platform='Sentinel-{}'.format(sat),
                processingLevel='SLC',
                beamMode='IW',
                output=format)

    r = requests.get(baseurl, params=data)
    with open('query_S{}.{}'.format(sat, format), 'w') as j:
        j.write(r.text)

    #Directly to dataframe
    #df = pd.DataFrame(r.json()[0])


def ogr2snwe(args):
    gf = gpd.read_file(args.input)
    gf.to_crs(epsg=4326, inplace=True)
    poly = gf.geometry.convex_hull
    if args.buffer:
        poly = poly.buffer(args.buffer)
    W, S, E, N = poly.bounds.values[0]
    args.roi = [S, N, W, E]


def snwe2file(args):
    '''
    Use shapely to convert to GeoJSON & WKT
    '''
    S, N, W, E = args.roi
    roi = box(W, S, E, N)
    with open('snwe.json', 'w') as j:
        json.dump(mapping(roi), j)
    with open('snwe.wkt', 'w') as w:
        w.write(roi.to_wkt())
    with open('snwe.txt', 'w') as t:
        snweList = '[{0:.3f}, {1:.3f}, {2:.3f}, {3:.3f}]'.format(S, N, W, E)
        t.write(snweList)
    print(snweList)


if __name__ == '__main__':
    args = cmdLineParse()
    if args.input:
        ogr2snwe(args)
    snwe2file(args)
    query_asf(args.roi, '1A')
    query_asf(args.roi, '1B')
    gf = merge_inventories('query_S1A.json', 'query_S1B.json')
    summarize_inventory(gf)
    summarize_orbits(gf)
    save_inventory(gf)
    if args.csvs:
        query_asf(args.roi, '1A', 'csv')
    if args.kmls:
        query_asf(args.roi, '1A', 'kml')
    if args.footprints:
        save_geojson_footprints(gf)  #NOTE: takes a while...

-------------------------------------------------------------------------------- /bin/index.html: --------------------------------------------------------------------------------
[HTML markup stripped in this dump. index.html appears to be a small static page titled and headed with the interferogram name ({intname}), followed by three sections of links: Browse, Download, and Metadata. The link targets themselves are not recoverable.]
-------------------------------------------------------------------------------- /bin/indexTemplate.html: --------------------------------------------------------------------------------
[HTML markup stripped in this dump. indexTemplate.html is the template that isce2aws.py fills in to produce index.html: the same {intname} title/heading followed by Browse, Download, and Metadata link sections.]
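For reference, isce2aws.py (next file) fills the `{intname}` placeholder in this template with Python's `str.format`; a minimal sketch of that pattern (the interferogram name here is just an example value taken from the README):
```python
# Render index.html from the template by substituting the interferogram name
with open('indexTemplate.html') as template:
    text = template.read()
with open('index.html', 'w') as index:
    index.write(text.format(intname='int-20170927-20150914'))
```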
-------------------------------------------------------------------------------- /bin/isce2aws.py: --------------------------------------------------------------------------------
#!/usr/bin/env python
'''
Convert ISCE to cloud-friendly outputs and push to s3

Usage: isce2aws.py int-20170828-20170816
'''
import isce2cog
import isce2overviews
import os
import sys

def cleanUp():
    '''
    remove tmp and unneeded files
    '''
    cmd = 'rm tmp* *rgb.tif'
    os.system(cmd)

def writeIndex(intname):
    '''
    super simple HTML index
    '''
    with open('indexTemplate.html') as template:
        text = template.read()
    formattedText = text.format(intname=intname)
    with open('index.html', 'w') as index:
        index.write(formattedText)

# Get interferogram name from command line argument
intname = sys.argv[1]

conversions = {'amplitude-cog.tif': ('filt_topophase.unw.geo.vrt', 1),
               'unwrapped-phase-cog.tif': ('filt_topophase.unw.geo.vrt', 2),
               #'coherence-cog.tif': ('phsig.cor.geo.vrt', 1),
               'coherence-cog.tif': ('topophase.cor.geo.vrt', 1),
               'incidence-cog.tif': ('los.rdr.geo.vrt', 1),
               'heading-cog.tif': ('los.rdr.geo.vrt', 2),
               'elevation-cog.tif': ('dem.crop.vrt', 1)}

# Create Cloud-optimized geotiffs of select output files
for outfile, (infile, band) in conversions.items():
    print(infile, band, outfile)
    isce2cog.extract_band(infile, band)
    isce2cog.make_overviews()
    isce2cog.make_cog(outfile)

# Create RGB thumbnails and tiles
infile = 'coherence-cog.tif'
cptFile = isce2overviews.make_coherence_cmap()
rgbFile = isce2overviews.make_rgb(infile, cptFile)
isce2overviews.make_thumbnails(rgbFile)
isce2overviews.make_overviews(rgbFile)

infile = 'amplitude-cog.tif'
cptFile = isce2overviews.make_amplitude_cmap()
rgbFile = isce2overviews.make_rgb(infile, cptFile)
isce2overviews.make_thumbnails(rgbFile)
isce2overviews.make_overviews(rgbFile)

infile = 'unwrapped-phase-cog.tif'
cptFile = isce2overviews.make_wrapped_phase_cmap()
rgbFile = isce2overviews.make_rgb(infile, cptFile)
isce2overviews.make_thumbnails(rgbFile)
isce2overviews.make_overviews(rgbFile)

# Clean up temporary files, etc
cleanUp()

# Write an html index file
writeIndex(intname)

# Push everything to s3
os.mkdir('output')
cmd = 'mv index.html *-cog-* output'
os.system(cmd)
cmd = f'aws s3 sync output s3://{intname}/output'
print(cmd)
os.system(cmd)

print('All done!')
-------------------------------------------------------------------------------- /bin/isce2cog.py: --------------------------------------------------------------------------------
#!/usr/bin/env python
'''
Convert geocoded ISCE output to Cloud Optimized Geotiffs

Usage: isce2cog.py path-to-merged-folder/

filt_topophase.unw.geo --> amplitude-cog.tif, unwrapped-phase-cog.tif
phsig.cor.geo.vrt --> coherence-cog.tif
los.rdr.geo.vrt --> incidence-cog.tif, heading-cog.tif
dem.crop.vrt --> elevation-cog.tif

http://www.cogeo.org

Scott Henderson
February 2018
'''

import os

def extract_band(infile, band):
    cmd = f'gdal_translate -of VRT -b {band} -a_nodata 0.0 {infile} tmp.vrt'
    print(cmd)
    os.system(cmd)

def make_overviews():
    cmd = 'gdaladdo tmp.vrt 2 4 8 16 32'
    print(cmd)
    os.system(cmd)

def make_cog(outfile):
    cmd = f'gdal_translate tmp.vrt {outfile} -co COMPRESS=DEFLATE \
           -co TILED=YES -co BLOCKXSIZE=512 -co BLOCKYSIZE=512 \
           -co COPY_SRC_OVERVIEWS=YES --config GDAL_TIFF_OVR_BLOCKSIZE 512'
    print(cmd)
    os.system(cmd)

def clean_up():
    os.system('rm tmp*')


if __name__ == '__main__':
    conversions = {'amplitude-cog.tif': ('filt_topophase.unw.geo.vrt', 1),
                   'unwrapped-phase-cog.tif': ('filt_topophase.unw.geo.vrt', 2),
                   #'coherence-cog.tif': ('phsig.cor.geo.vrt', 1),
                   'coherence-cog.tif': ('topophase.cor.geo.vrt', 1),
                   'incidence-cog.tif': ('los.rdr.geo.vrt', 1),
                   'heading-cog.tif': ('los.rdr.geo.vrt', 2),
                   'elevation-cog.tif': ('dem.crop.vrt', 1)}

    for outfile, (infile, band) in conversions.items():
        print(infile, band, outfile)
        extract_band(infile, band)
        make_overviews()
        make_cog(outfile)
    clean_up()

    print('Done!')
    print('Run these to move files to S3:')
    print('aws s3 mb s3://int-[date1]-[date2]')
    print('aws s3 sync . s3://int-[date1]-[date2] --exclude "*" --include "*cog.tif" ')
-------------------------------------------------------------------------------- /bin/isce2overviews.py: --------------------------------------------------------------------------------
#!/usr/bin/env python
'''
Convert geocoded ISCE output to Cloud Optimized Geotiff RGB and overviews,
making external calls to GDAL. Borrows heavily from isce2geotiff.py in ISCE
release 2.1.0. But run after isce2cog.py, because it operates on single-band geotiffs.

Usage: isce2overviews.py path-to-merged-folder/

NOTE: currently does not inspect data for colorbar limits... just uses
fixed ranges that work most of the time

Scott Henderson
February 2018
'''
import os
import matplotlib.pyplot as plt
import matplotlib.colors as colors
import matplotlib.cm as cmx
import numpy as np


def write_cmap(outname, vals, scalarMap):
    with open(outname, 'w') as fid:
        for val in vals:
            cval = scalarMap.to_rgba(val)
            fid.write('{0} {1} {2} {3} \n'.format(val,                #value
                                                  int(cval[0]*255),   #R
                                                  int(cval[1]*255),   #G
                                                  int(cval[2]*255)))  #B
        fid.write('nv 0 0 0 0 \n')  #nodata alpha

def make_amplitude_cmap(mapname='gray', vmin=1, vmax=1e5, ncolors=64):
    cmap = plt.get_cmap(mapname)
    #vmax = get_max()
    # NOTE for strong contrast amp return:
    #cNorm = colors.Normalize(vmin=1e3, vmax=1e4)
    cNorm = colors.LogNorm(vmin=vmin, vmax=vmax)
    scalarMap = cmx.ScalarMappable(norm=cNorm, cmap=cmap)
    vals = np.linspace(vmin, vmax, ncolors, endpoint=True)
    outname = 'amplitude-cog.cpt'
    write_cmap(outname, vals, scalarMap)

    return outname

def make_wrapped_phase_cmap(mapname='plasma', vmin=-50, vmax=50, ncolors=64, wrapRate=6.28):
    ''' each color cycle represents wavelength/2 LOS change '''
    cmap = plt.get_cmap(mapname)
    cNorm = colors.Normalize(vmin=0, vmax=1)  #re-wrapping normalization
    scalarMap = cmx.ScalarMappable(norm=cNorm, cmap=cmap)
    vals = np.linspace(vmin, vmax, ncolors, endpoint=True)
    vals_wrapped = np.remainder(vals, wrapRate) / wrapRate
    # NOTE: if already converted to cm:
    #vals_wrapped = np.remainder(vals - vals.min(), wavelength/2.0) / (wavelength/2.0)
    outname = 'unwrapped-phase-cog.cpt'
    with open(outname, 'w') as fid:
        for val, wval in zip(vals, vals_wrapped):
            cval = scalarMap.to_rgba(wval)
            fid.write('{0} {1} {2} {3} \n'.format(val,                #value
                                                  int(cval[0]*255),   #R
                                                  int(cval[1]*255),   #G
                                                  int(cval[2]*255)))  #B
        fid.write('nv 0 0 0 0 \n')  #nodata alpha

    return outname


def make_coherence_cmap(mapname='inferno', vmin=1e-5, vmax=1, ncolors=64):
    cmap = plt.get_cmap(mapname)
    cNorm = colors.Normalize(vmin=vmin, vmax=vmax)
    scalarMap = cmx.ScalarMappable(norm=cNorm, cmap=cmap)
    vals = np.linspace(vmin, vmax, ncolors, endpoint=True)
    outname = 'coherence-cog.cpt'
    write_cmap(outname, vals, scalarMap)

    return outname


def make_rgb(infile, cptfile):
    outfile = infile[:-4] + '-rgb.tif'
    cmd = f'gdaldem color-relief -alpha {infile} {cptfile} {outfile}'
    print(cmd)
    os.system(cmd)
    return outfile


def make_thumbnails(infile, small=5, large=10):
    '''
    Make a large and small png overview
    '''
    outfile = infile[:-4] + '-thumb-large.png'
    cmd = f'gdal_translate -of PNG -r cubic -outsize {large}% 0 {infile} {outfile}'
    print(cmd)
    os.system(cmd)
    outfile = infile[:-4] + '-thumb-small.png'
    cmd = f'gdal_translate -of PNG -r cubic -outsize {small}% 0 {infile} {outfile}'
    print(cmd)
    os.system(cmd)


def make_overviews(infile):
    ''' NOTE: automatically warps to EPSG:3857
    NOTE: could change title and add copyright!
    '''
    cmd = f'gdal2tiles.py -w leaflet -z 6-12 {infile}'
    print(cmd)
    os.system(cmd)

def clean_up(infile):
    ''' NOTE: remove large full-resolution RGBA files'''
    # NOTE: should probably upload .cpt files so that they are easy to re-generate
    cmd = 'rm *rgb.tif tmp*'
    print(cmd)
    os.system(cmd)

if __name__ == '__main__':
    #toConvert = {'amplitude-cog.tif':(1,10e3), #amp return
    #             'unwrapped-phase.tif':(-50,50), #radians (*5.546576/12.5663706 for meters)
    #             'coherence-cog.tif':(0,1),
    #             'elevation-cog.tif':(0,1000)}
    infile = 'coherence-cog.tif'
    cptFile = make_coherence_cmap()
    rgbFile = make_rgb(infile, cptFile)
    make_thumbnails(rgbFile)
    make_overviews(rgbFile)

    infile = 'amplitude-cog.tif'
    cptFile = make_amplitude_cmap()
    rgbFile = make_rgb(infile, cptFile)
    make_thumbnails(rgbFile)
    make_overviews(rgbFile)

    infile = 'unwrapped-phase-cog.tif'
    cptFile = make_wrapped_phase_cmap()
    rgbFile = make_rgb(infile, cptFile)
    make_thumbnails(rgbFile)
    make_overviews(rgbFile)

    print('Done!')
    print('Run these to move files to S3:')
    #Copy pngs NOTE: this recursively goes into overview folders!
    print('aws s3 sync . s3://int-[date1]-[date2] --exclude "*" --include "*png"')
    print('aws s3 sync . s3://int-[date1]-[date2] --exclude "*" --include "*cpt"')
    print('aws s3 sync . s3://int-[date1]-[date2] --exclude "*" --include "*html"')
-------------------------------------------------------------------------------- /bin/plot_inventory_asf.py: --------------------------------------------------------------------------------
#!/usr/bin/env python3
'''
Plot inventory polygon extents on a map & separate figure with timeline

Author: Scott Henderson
Date: 10/2017
ISCE: 2.1.0
'''
import argparse
import geopandas as gpd
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.patheffects as PathEffects
from matplotlib.dates import YearLocator, MonthLocator, DateFormatter

from pandas.plotting import table
import cartopy.crs as ccrs
import cartopy.feature as cfeature
from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER
from cartopy.io.img_tiles import GoogleTiles

from owslib.wmts import WebMapTileService


def cmdLineParse():
    '''
    Command line parser.
    '''
    parser = argparse.ArgumentParser(description='plot_inventory_asf.py')
    parser.add_argument('-i', type=str, dest='input', required=True,
                        help='Vector inventory (query.geojson)')
    parser.add_argument('-p', type=str, dest='polygon', required=False,
                        help='Polygon defining region of interest')

    return parser.parse_args()


def load_inventory(vectorFile):
    '''
    load merged inventory. easy!
    '''
    gf = gpd.read_file(vectorFile)
    gf['timeStamp'] = gpd.pd.to_datetime(gf.sceneDate, format='%Y-%m-%d %H:%M:%S')
    gf['sceneDateString'] = gf.timeStamp.apply(lambda x: x.strftime('%Y-%m-%d'))
    gf['dateStamp'] = gpd.pd.to_datetime(gf.sceneDateString)
    gf['utc'] = gf.timeStamp.apply(lambda x: x.strftime('%H:%M:%S'))
    gf['relativeOrbit'] = gf.relativeOrbit.astype('int')
    gf.sort_values('relativeOrbit', inplace=True)
    gf['orbitCode'] = gf.relativeOrbit.astype('category').cat.codes
    return gf


def ogr2snwe(vectorFile):
    gf = gpd.read_file(vectorFile)
    gf.to_crs(epsg=4326, inplace=True)
    poly = gf.geometry.convex_hull
    W, S, E, N = poly.bounds.values[0]
    return [S, N, W, E]


def plot_map(gf, snwe, vectorFile=None, zoom=6):
    '''
    Use a web map tile background (NASA GIBS; Stamen Terrain option left commented below)
    '''
    pad = 1  #degrees
    S, N, W, E = snwe
    plot_CRS = ccrs.PlateCarree()
    geodetic_CRS = ccrs.Geodetic()
    x0, y0 = plot_CRS.transform_point(W-pad, S-pad, geodetic_CRS)
    x1, y1 = plot_CRS.transform_point(E+pad, N+pad, geodetic_CRS)

    fig, ax = plt.subplots(figsize=(8,8), dpi=100,
                           subplot_kw=dict(projection=plot_CRS))

    ax.set_xlim((x0, x1))
    ax.set_ylim((y0, y1))
    #ax.stock_img()
    #url = 'http://tile.stamen.com/terrain/{z}/{x}/{y}.png'
    #url = 'https://server.arcgisonline.com/ArcGIS/rest/services/World_Shaded_Relief/MapServer/tile/{z}/{y}/{x}.jpg'
    #tiler = GoogleTiles(url=url)
    #tiler = GoogleTiles() #default
    # NOTE: going higher than zoom=8 is slow...
    # How to get appropriate zoom level for static map?
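    # One possible heuristic (an assumption, not something cartopy provides):
    # derive the tile zoom from the bbox width in degrees, e.g.
    #   zoom = max(1, min(10, int(np.log2(360.0 / max(x1 - x0, 1e-3))) + 1))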
    #ax.add_image(tiler, zoom)
    URL = 'http://gibs.earthdata.nasa.gov/wmts/epsg4326/best/wmts.cgi'
    wmts = WebMapTileService(URL)
    #layer = 'ASTER_GDEM_Greyscale_Shaded_Relief' #ASTER_GDEM_Color_Shaded_Relief # for small regions
    layer = 'BlueMarble_ShadedRelief_Bathymetry' #BlueMarble_ShadedRelief, BlueMarble_NextGeneration
    ax.add_wmts(wmts, layer)

    states_provinces = cfeature.NaturalEarthFeature(
        category='cultural',
        name='admin_1_states_provinces_lines',
        scale='110m',
        facecolor='none')
    ax.add_feature(states_provinces, edgecolor='k', linestyle=':')

    #ax.add_feature(cfeature.COASTLINE)
    ax.coastlines(resolution='10m', color='black', linewidth=2)
    ax.add_feature(cfeature.BORDERS)


    # Add region of interest polygon if specified
    if vectorFile:
        tmp = gpd.read_file(vectorFile)
        ax.add_geometries(tmp.geometry.values,
                          ccrs.PlateCarree(),
                          facecolor='none',
                          edgecolor='m',
                          lw=2,
                          linestyle='dashed')

    #gf = load_inventory(args.input)
    orbits = gf.relativeOrbit.unique()
    colors = plt.cm.jet(np.linspace(0, 1, orbits.size))

    #colors = plt.get_cmap('jet', orbits.size) #not iterable
    for orbit, color in zip(orbits, colors):
        df = gf.query('relativeOrbit == @orbit')
        poly = df.geometry.cascaded_union

        if df.flightDirection.iloc[0] == 'ASCENDING':
            linestyle = '--'
            #xpos, ypos = poly.bounds[0], poly.bounds[3] #upper left
            xpos, ypos = poly.centroid.x, poly.bounds[3]
        else:
            linestyle = '-'
            #xpos, ypos = poly.bounds[2], poly.bounds[1] #lower right
            xpos, ypos = poly.centroid.x, poly.bounds[1]


        ax.add_geometries([poly],
                          ccrs.PlateCarree(),
                          facecolor='none',
                          edgecolor=color,
                          lw=2, #no effect?
                          linestyle=linestyle)
        ax.text(xpos, ypos, orbit, color=color, fontsize=16, fontweight='bold', transform=geodetic_CRS)

    gl = ax.gridlines(plot_CRS, draw_labels=True,
                      linewidth=0.5, color='gray', alpha=0.5, linestyle='-')
    gl.xlabels_top = False
    gl.ylabels_left = False
    #gl.xlines = False

    gl.xformatter = LONGITUDE_FORMATTER
    gl.yformatter = LATITUDE_FORMATTER

    plt.title('Sentinel-1 Orbits')
    #plt.show()
    plt.savefig('map_coverage.pdf', bbox_inches='tight')


def plot_timeline_table(gf):
    '''
    Timeline with summary table
    '''
    dfA = gf.query('platform == "Sentinel-1A"')
    dfAa = dfA.query(' flightDirection == "ASCENDING" ')
    dfAd = dfA.query(' flightDirection == "DESCENDING" ')
    dfB = gf.query('platform == "Sentinel-1B"')
    dfBa = dfB.query(' flightDirection == "ASCENDING" ')
    dfBd = dfB.query(' flightDirection == "DESCENDING" ')

    # summary table
    dfS = gpd.pd.DataFrame(index=gf.relativeOrbit.unique())
    dfS['Start'] = gf.groupby('relativeOrbit').sceneDateString.min()
    dfS['Stop'] = gf.groupby('relativeOrbit').sceneDateString.max()
    dfS['Dates'] = gf.groupby('relativeOrbit').sceneDateString.nunique()
    dfS['Frames'] = gf.groupby('relativeOrbit').sceneDateString.count()
    dfS['Direction'] = gf.groupby('relativeOrbit').flightDirection.first()
    dfS['UTC'] = gf.groupby('relativeOrbit').utc.first()
    dfS.sort_index(inplace=True, ascending=False)
    dfS.index.name = 'Orbit'

    # Same colors as map
    orbits = gf.relativeOrbit.unique()
    colors = plt.cm.jet(np.linspace(0, 1, orbits.size))

    fig, ax = plt.subplots(figsize=(11, 8.5))
    #plt.scatter(dfA.timeStamp.values, dfA.orbitCode.values, c=dfA.orbitCode.values, cmap='jet', s=60, label='S1A')
    #plt.scatter(dfB.timeStamp.values, dfB.orbitCode.values, c=dfB.orbitCode.values, cmap='jet', s=60, marker='d',label='S1B')
    plt.scatter(dfAa.timeStamp.values, dfAa.orbitCode.values, c=colors[dfAa.orbitCode.values], cmap='jet', s=60, facecolor='none', label='S1A')
    plt.scatter(dfBa.timeStamp.values, dfBa.orbitCode.values, c=colors[dfBa.orbitCode.values], cmap='jet', s=60, facecolor='none', marker='d', label='S1B')
    plt.scatter(dfAd.timeStamp.values, dfAd.orbitCode.values, c=colors[dfAd.orbitCode.values], cmap='jet', s=60, label='S1A')
    plt.scatter(dfBd.timeStamp.values, dfBd.orbitCode.values, c=colors[dfBd.orbitCode.values], cmap='jet', s=60, marker='d', label='S1B')

    plt.yticks(gf.orbitCode.unique(), gf.relativeOrbit.unique())
    #plt.axvline('2016-04-22', color='gray', linestyle='dashed', label='Sentinel-1B launch')

    # Add summary table to the plot as a custom legend
    table(ax, dfS, loc='top', zorder=10, fontsize=12,
          cellLoc='center', rowLoc='center',
          bbox=[0.1, 0.7, 0.6, 0.3])  #[left, bottom, width, height]

    ax.xaxis.set_minor_locator(MonthLocator())
    ax.xaxis.set_major_locator(YearLocator())
    plt.legend(loc='upper right')
    plt.ylim(-1, orbits.size+3)
    plt.ylabel('Orbit Number')
    fig.autofmt_xdate()
    plt.title('Sentinel-1 timeline')
    plt.savefig('timeline_with_table.pdf', bbox_inches='tight')

def plot_timeline(gf):
    '''
    Timeline without summary table
    '''
    dfA = gf.query('platform == "Sentinel-1A"')
    dfAa = dfA.query(' flightDirection == "ASCENDING" ')
    dfAd = dfA.query(' flightDirection == "DESCENDING" ')
    dfB = gf.query('platform == "Sentinel-1B"')
    dfBa = dfB.query(' flightDirection == "ASCENDING" ')
    dfBd = dfB.query(' flightDirection == "DESCENDING" ')


    # Same colors as map
    orbits = gf.relativeOrbit.unique()
    colors = plt.cm.jet(np.linspace(0, 1, orbits.size))

    fig, ax = plt.subplots(figsize=(11, 8.5))
    plt.scatter(dfAa.timeStamp.values, dfAa.orbitCode.values, edgecolors=colors[dfAa.orbitCode.values], facecolors='None', cmap='jet', s=60, label='Asc S1A')
    plt.scatter(dfBa.timeStamp.values, dfBa.orbitCode.values, edgecolors=colors[dfBa.orbitCode.values], facecolors='None', cmap='jet', s=60, marker='d', label='Asc S1B')
    plt.scatter(dfAd.timeStamp.values, dfAd.orbitCode.values, c=colors[dfAd.orbitCode.values], cmap='jet', s=60, label='Dsc S1A')
    plt.scatter(dfBd.timeStamp.values, dfBd.orbitCode.values, c=colors[dfBd.orbitCode.values], cmap='jet', s=60, marker='d', label='Dsc S1B')

    plt.yticks(gf.orbitCode.unique(), gf.relativeOrbit.unique())

    ax.xaxis.set_minor_locator(MonthLocator())
    ax.xaxis.set_major_locator(YearLocator())
    plt.legend(loc='lower right')
    plt.ylim(-1, orbits.size)
    plt.ylabel('Orbit Number')
    fig.autofmt_xdate()
    plt.title('Sentinel-1 timeline')
    plt.savefig('timeline.pdf', bbox_inches='tight')


if __name__ == '__main__':
    args = cmdLineParse()
    gf = load_inventory(args.input)
    w, s, e, n = gf.geometry.cascaded_union.bounds
    snwe = [s, n, w, e]
    plot_map(gf, snwe, args.polygon)
    plot_timeline(gf)
-------------------------------------------------------------------------------- /bin/prep_topsApp.py: --------------------------------------------------------------------------------
#!/usr/bin/env python3
'''
Generate topsApp.xml and download scenes for a given date
with topsApp.py (ISCE 2.1.0)

Example (process just subswaths 1 and 2):
prep_topsApp.py -i query.geojson -p [path] -m 20160910 -s 20160724 -n 1 2

Author: Scott Henderson
Updated: 10/2017
'''
import argparse
import os
import glob
import geopandas as gpd
from lxml import html
import requests

import isce
from isceobj.XmlUtil import FastXML as xml


def cmdLineParse():
    '''
    Command line parser.
    '''
    parser = argparse.ArgumentParser(description='prepare ISCE 2.1 topsApp.py')
    parser.add_argument('-i', type=str, dest='inventory', required=True,
                        help='Inventory vector file (query.geojson)')
    parser.add_argument('-m', type=str, dest='master', required=True,
                        help='Master date')
    parser.add_argument('-s', type=str, dest='slave', required=True,
                        help='Slave date')
    parser.add_argument('-p', type=str, dest='path', required=True,
                        help='Path/Track/RelativeOrbit Number')
    parser.add_argument('-n', type=int, nargs='+', dest='swaths', required=False,
                        default=[1,2,3], choices=(1,2,3),
                        help='Subswath numbers to process')
    parser.add_argument('-o', type=str, dest='orbitdir', required=False,
                        default=os.environ['POEORB'],
                        help='Orbit directory')
    parser.add_argument('-a', type=str, dest='auxdir', required=False,
                        default=os.environ['AUXCAL'],
                        help='Auxiliary file directory')
    parser.add_argument('-d', type=str, dest='dem', required=False,
                        help='Path to DEM file')
    parser.add_argument('-r', type=float, nargs=4, dest='roi', required=False,
                        metavar=('S','N','W','E'),
                        help='Region of interest bbox [S,N,W,E]')
    parser.add_argument('-g', type=float, nargs=4, dest='gbox', required=False,
                        metavar=('S','N','W','E'),
                        help='Geocode bbox [S,N,W,E]')

    return parser.parse_args()



def download_scene(downloadUrl):
    '''
    aria2c --http-auth-challenge=true --http-user=CHANGE_ME --http-passwd='CHANGE_ME' "https://api.daac.asf.alaska.edu/services/search/param?granule_list=S1A_EW_GRDM_1SDH_20151003T040339_20151003T040351_007983_00B2A6_7377&output=metalink"
    '''
    print('pwd:', os.getcwd())
    print('Downloading frame from ASF...')
    print('Requires ~/.netrc file: ')
    print('See: https://winsar.unavco.org/software/release_note_isce-2.1.0.txt')
    # -nc won't overwrite, -c continues unfinished downloads, and -q ('quiet mode')
    # keeps the many incremental download % updates out of /var/log/cloud-init-output.log
    cmd = 'wget -q -nc -c {}'.format(downloadUrl)
    print(cmd)
    os.system(cmd)
    print('Done downloading')


def load_inventory(vectorFile):
    '''
    load merged inventory. easy!
    '''
    gf = gpd.read_file(vectorFile)
    gf['timeStamp'] = gpd.pd.to_datetime(gf.sceneDate, format='%Y-%m-%d %H:%M:%S')
    gf['sceneDateString'] = gf.timeStamp.apply(lambda x: x.strftime('%Y-%m-%d'))
    gf['dateStamp'] = gpd.pd.to_datetime(gf.sceneDateString)
    gf['utc'] = gf.timeStamp.apply(lambda x: x.strftime('%H:%M:%S'))
    gf['orbitCode'] = gf.relativeOrbit.astype('category').cat.codes
    return gf


def download_orbit(granuleName):
    '''
    Grab orbit files from ASF
    '''
    cwd = os.getcwd()
    try:
        os.chdir(os.environ['POEORB'])
        sat = granuleName[:3]
        date = granuleName[17:25]
        print('downloading orbit for {}, {}'.format(sat, date))

        url = 'https://s1qc.asf.alaska.edu/aux_poeorb'
        r = requests.get(url)
        webpage = html.fromstring(r.content)
        orbits = webpage.xpath('//a/@href')
        # get s1A or s1B
        df = gpd.pd.DataFrame(dict(orbit=orbits))
        dfSat = df[df.orbit.str.startswith(sat)]
        dayBefore = gpd.pd.to_datetime(date) - gpd.pd.to_timedelta(1, unit='d')
        dayBeforeStr = dayBefore.strftime('%Y%m%d')
        # get matching orbit file
        dfSat['startTime'] = dfSat.orbit.str[42:50]
        match = dfSat.loc[dfSat.startTime == dayBeforeStr, 'orbit'].values[0]
        cmd = 'wget -q -nc {}/{}'.format(url, match)  #-nc means no clobber
        print(cmd)
        os.system(cmd)
    except Exception as e:
        print('Trouble downloading POEORB... maybe scene is too recent?')
        print(e)
        pass
    os.chdir(cwd)  #NOTE: best to specify download dir instead of jumping cwd around...


def download_auxcal():
    '''
    Auxiliary data files are <20Mb, just download all of them!
    NOTE: probably can be simplified! see download_orbit
    '''
    cwd = os.getcwd()
    os.chdir(os.environ['AUXCAL'])
    print('Downloading S1 AUXILIARY DATA...')
    url = 'https://s1qc.asf.alaska.edu/aux_cal'
    cmd = 'wget -q -r -l2 -nc -nd -np -nH -A SAFE {}'.format(url)
    print(cmd)
    os.system(cmd)
    os.chdir(cwd)


def find_scenes(gf, dateStr, relativeOrbit, download=True):
    '''
    Get downloadUrls for a given date
    '''
    GF = gf.query('relativeOrbit == @relativeOrbit')
    GF = GF.loc[GF.dateStamp == dateStr]
    if download:
        for i, row in GF.iterrows():
            download_scene(row.downloadUrl)
        download_orbit(GF.granuleName.iloc[0])

    filenames = GF.fileName.tolist()
    print('SCENES: ', filenames)
    # create symlinks #probably need to do this for multiple
    #for f in filenames:
    #    os.symlink(f, os.path.basename(f))
    return filenames


def write_topsApp_xml(inps):
    ''' use built-in isce utility to write XML programmatically (based on unofficial isce guide Sep 2014) '''
    insar = xml.Component('topsinsar')
    common = {}
    common['orbit directory'] = inps.orbitdir
    common['auxiliary data directory'] = inps.auxdir
    #common['swath number'] = inps.subswath
    if inps.roi:
        common['region of interest'] = inps.roi
    master = {}
    master['safe'] = inps.master_scenes
    master['output directory'] = 'masterdir'
    master.update(common)
    slave = {}
    slave['safe'] = inps.slave_scenes
    slave['output directory'] = 'slavedir'
    slave.update(common)
    #####Set sub-component
    insar['master'] = master
    insar['slave'] = slave
    ####Set properties
    insar['sensor name'] = 'SENTINEL1'
    insar['do unwrap'] = True
    insar['unwrapper name'] = 'snaphu_mcf'
    insar['swaths'] = inps.swaths
    if inps.gbox:
        insar['geocode bounding box'] = inps.gbox
    #insar['geocode list'] = []
    if inps.dem:
        insar['demfilename'] = inps.dem
    #####Catalog example
    #insar['dem'] = xml.Catalog('dem.xml') #Components include a writeXML method
    insar.writeXML('topsApp.xml', root='topsApp')



if __name__ == '__main__':
    # Try to set orbit paths with environment variables
    print('Looking for environment variables POEORB, AUXCAL, DEMDB...')
    if not 'POEORB' in os.environ:
        os.environ['POEORB'] = './'
    if not 'AUXCAL' in os.environ:
        os.environ['AUXCAL'] = './'

    inps = cmdLineParse()
    gf = load_inventory(inps.inventory)
    intdir = 'int-{0}-{1}'.format(inps.master, inps.slave)
    if not os.path.isdir(intdir):
        os.mkdir(intdir)
    os.chdir(intdir)
    download_auxcal()
    try:
        inps.master_scenes = find_scenes(gf, inps.master, inps.path, download=True)
    except Exception as e:
        print('ERROR retrieving master scenes, double check dates:')
        #print(e)
        raise
    try:
        inps.slave_scenes = find_scenes(gf, inps.slave, inps.path, download=True)
    except Exception as e:
        print('ERROR retrieving slave scenes, double check dates:')
        raise
    write_topsApp_xml(inps)
    print('Ready to run topsApp.py in {}'.format(intdir))
-------------------------------------------------------------------------------- /bin/prep_topsApp_aws.py: --------------------------------------------------------------------------------
#!/usr/bin/env python3
'''
Generate topsApp.xml, put SLCs, Orbit files, and aux data in s3 bucket for
processing with topsApp.py (ISCE 2.1.0)

# NOTE: requires inventory file from get_inventory_asf.py
prep_topsApp_aws.py -i query.geojson -m 20141130 -s 20141106 -n 1 -r 46.45 46.55 -120.53 -120.43

Author: Scott Henderson (scottyh@uw.edu)
Updated: 02/2018
'''
import argparse
import os
import glob
import geopandas as gpd
from lxml import html
import requests
# Borrowed from Piyush Agram:
import FastXML as xml


def cmdLineParse():
    '''
    Command line parser.
    '''
    parser = argparse.ArgumentParser(description='prepare ISCE 2.1 topsApp.py')
    parser.add_argument('-i', type=str, dest='inventory', required=True,
                        help='Inventory vector file (query.geojson)')
    parser.add_argument('-m', type=str, dest='master', required=True,
                        help='Master date')
    parser.add_argument('-s', type=str, dest='slave', required=True,
                        help='Slave date')
    parser.add_argument('-p', type=str, dest='path', required=True,
                        help='Path/Track/RelativeOrbit Number')
    parser.add_argument('-n', type=int, nargs='+', dest='swaths', required=False,
                        default=[1,2,3], choices=(1,2,3),
                        help='Subswath numbers to process')
    parser.add_argument('-o', dest='poeorb', action='store_true', required=False,
                        help='Use precise orbits (True/False)')
    parser.add_argument('-d', type=str, dest='dem', required=False,
                        help='Path to DEM file')
    parser.add_argument('-r', type=float, nargs=4, dest='roi', required=False,
                        metavar=('S','N','W','E'),
                        help='Region of interest bbox [S,N,W,E]')
    parser.add_argument('-g', type=float, nargs=4, dest='gbox', required=False,
                        metavar=('S','N','W','E'),
                        help='Geocode bbox [S,N,W,E]')

    return parser.parse_args()


def load_inventory(vectorFile):
    '''
    load merged (S1A and S1B) inventory
    '''
    gf = gpd.read_file(vectorFile)
    gf['timeStamp'] = gpd.pd.to_datetime(gf.sceneDate, format='%Y-%m-%d %H:%M:%S')
    gf['sceneDateString'] = gf.timeStamp.apply(lambda x: x.strftime('%Y-%m-%d'))
    gf['dateStamp'] = gpd.pd.to_datetime(gf.sceneDateString)
    gf['utc'] = gf.timeStamp.apply(lambda x: x.strftime('%H:%M:%S'))
    gf['orbitCode'] = gf.relativeOrbit.astype('category').cat.codes
    return gf


def get_orbit_url(granuleName):
    '''
    Grab orbit files from ASF
    '''
    sat = granuleName[:3]
    date = granuleName[17:25]
    #print('downloading orbit for {}, {}'.format(sat,date))
    # incomplete inventory: 'https://s1qc.asf.alaska.edu/aux_poeorb/files.txt'
    url = 'https://s1qc.asf.alaska.edu/aux_poeorb'
    r = requests.get(url)
    webpage = html.fromstring(r.content)
    orbits = webpage.xpath('//a/@href')
    # get s1A or s1B
    df = gpd.pd.DataFrame(dict(orbit=orbits))
    dfSat = df[df.orbit.str.startswith(sat)].copy()
    dayBefore = gpd.pd.to_datetime(date) - gpd.pd.to_timedelta(1, unit='d')
    dayBeforeStr = dayBefore.strftime('%Y%m%d')
    # get matching orbit file
    dfSat.loc[:, 'startTime'] = dfSat.orbit.str[42:50]
    match = dfSat.loc[dfSat.startTime == dayBeforeStr, 'orbit'].values[0]
    orbitUrl = f'{url}/{match}'

    return orbitUrl


def get_slc_urls(gf, dateStr, relativeOrbit):
    '''
    Get downloadUrls for a given date
    '''
    try:
        GF = gf.query('relativeOrbit == @relativeOrbit')
        GF = GF.loc[GF.dateStamp == dateStr]
        filenames = GF.downloadUrl.tolist()
    except Exception as e:
        print(f'ERROR retrieving {dateStr} scenes, double check dates:')
        print(e)
        pass

    return filenames

def write_wget_download_file(fileList):
    '''
    instead of downloading locally, write a download file for orbits and SLCs
    '''
    with open('download-links.txt', 'w') as f:
        f.write("\n".join(fileList))



def write_topsApp_xml(inps):
    ''' use built-in isce utility to write XML programmatically (based on unofficial isce guide Sep 2014) '''
    insar = xml.Component('topsinsar')
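    # FastXML.Component behaves like a nested OrderedDict: each scalar assignment
    # below becomes a <property name="..."><value>...</value></property> entry,
    # and dict-valued assignments (master/slave) become nested <component> blocks
    # when insar.writeXML('topsApp.xml', root='topsApp') runs at the end.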
117 |     common = {}
118 |     if inps.poeorb:
119 |         common['orbit directory'] = './'
120 |         common['auxiliary data directory'] = './'
121 |     #common['swath number'] = inps.subswath
122 |     if inps.roi:
123 |         common['region of interest'] = inps.roi
124 |     master = {}
125 |     master['safe'] = inps.master_scenes
126 |     master['output directory'] = 'masterdir'
127 |     master.update(common)
128 |     slave = {}
129 |     slave['safe'] = inps.slave_scenes
130 |     slave['output directory'] = 'slavedir'
131 |     slave.update(common)
132 |     #####Set sub-components
133 |     insar['master'] = master
134 |     insar['slave'] = slave
135 |     ####Set properties
136 |     insar['sensor name'] = 'SENTINEL1'
137 |     insar['do unwrap'] = True
138 |     insar['unwrapper name'] = 'snaphu_mcf'
139 |     insar['swaths'] = inps.swaths
140 |     if inps.gbox:
141 |         insar['geocode bounding box'] = inps.gbox
142 |     #insar['geocode list'] = []
143 |     if inps.dem:
144 |         insar['demfilename'] = inps.dem
145 | 
146 |     insar.writeXML('topsApp.xml', root='topsApp')
147 | 
148 | 
149 | 
150 | if __name__ == '__main__':
151 |     inps = cmdLineParse()
152 |     gf = load_inventory(inps.inventory)
153 |     intdir = 'int-{0}-{1}'.format(inps.master, inps.slave)
154 |     if not os.path.isdir(intdir):
155 |         os.mkdir(intdir)
156 |     os.chdir(intdir)
157 | 
158 |     master_urls = get_slc_urls(gf, inps.master, inps.path)
159 |     slave_urls = get_slc_urls(gf, inps.slave, inps.path)
160 |     downloadList = master_urls + slave_urls
161 |     inps.master_scenes = [os.path.basename(x) for x in master_urls]
162 |     inps.slave_scenes = [os.path.basename(x) for x in slave_urls]
163 | 
164 |     if inps.poeorb:
165 |         try:
166 |             frame = os.path.basename(inps.master_scenes[0])
167 |             downloadList.append(get_orbit_url(frame))
168 |             frame = os.path.basename(inps.slave_scenes[0])
169 |             downloadList.append(get_orbit_url(frame))
170 |         except Exception as e:
171 |             print('Trouble downloading POEORB... maybe scene is too recent?')
172 |             print('Falling back to using header orbits')
173 |             print(e)
174 |             inps.poeorb = False
175 | 
176 | 
177 |     write_topsApp_xml(inps)
178 | 
179 |     write_wget_download_file(downloadList)
180 | 
181 |     cmd = f'aws s3 mb s3://{intdir}'
182 |     print(cmd)
183 |     os.system(cmd)
184 |     cmd = f'aws s3 sync . s3://{intdir}'  # we are already inside intdir after os.chdir
185 |     print(cmd)
186 |     os.system(cmd)
187 |     print(f'Moved files to s3://{intdir}')
188 | 
--------------------------------------------------------------------------------
/bin/proc_batch_master.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | """
4 | Process multiple interferograms via AWS Batch (Spot Market)
5 | 
6 | # EXAMPLE:
7 | # discover path name for area of interest (Three Sisters volcano)
8 | get_inventory_asf.py -r 44.0 44.5 -122.0 -121.5
9 | # process 3 interferograms (master=20170927 with 3 preceding dates)
10 | # NOTE: this will run topsApp.py for subswath 2, and region of interest defined by search
11 | proc_batch_master.py -p 115 -m 20170927 -s 3 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5
12 | 
13 | @modified: 12/08/2017
14 | @author: scott
15 | """
16 | 
17 | import argparse
18 | import os
19 | import geopandas as gpd
20 | import boto3
21 | 
22 | def cmdLineParse():
23 |     '''
24 |     Command line parser.
25 |     '''
26 |     parser = argparse.ArgumentParser(description='process multiple interferograms via AWS Batch')
27 |     parser.add_argument('-m', type=str, dest='master', required=True,
28 |                         help='Master date')
29 |     parser.add_argument('-s', type=int, dest='slave', required=True,
30 |                         help='Number of slaves')
31 |     parser.add_argument('-p', type=int, dest='path', required=True,
32 |                         help='Path/Track/RelativeOrbit Number')
33 |     parser.add_argument('-n', type=int, nargs='+', dest='swaths', required=False,
34 |                         default=[1,2,3], choices=(1,2,3),
35 |                         help='Subswath numbers to process')
36 |     parser.add_argument('-r', type=float, nargs=4, dest='roi', required=False,
37 |                         metavar=('S','N','W','E'),
38 |                         help='Region of interest bbox [S,N,W,E]')
39 |     parser.add_argument('-g', type=float, nargs=4, dest='gbox', required=False,
40 |                         metavar=('S','N','W','E'),
41 |                         help='Geocode bbox [S,N,W,E]')
42 | 
43 |     return parser.parse_args()
44 | 
45 | 
46 | def get_pairs(inps):
47 |     '''
48 |     Load geojson inventory, find preceding pairs to interfere with master date
49 |     '''
50 |     # NOTE: should test here for something like 50% overlap (to avoid situations where
51 |     # S1A & S1B don't actually overlap despite being in same ROI...)
52 | 
53 | 
54 | def create_s3buckets(inps, slaveList):
55 |     '''
56 |     Create S3 buckets for given master
57 |     '''
58 | 
59 |     s3 = boto3.resource('s3')
60 |     for slave in slaveList:
61 |         bucketName = 'int-{0}-{1}'.format(inps.master, slave)
62 |         s3.create_bucket(Bucket=bucketName)  # boto3 requires the Bucket keyword argument
63 |         bashFile = write_bash_script(inps, slave)
64 |         s3.meta.client.upload_file(bashFile, bucketName, bashFile)  # upload_file lives on the client, not the resource
65 | 
66 | 
67 | def write_bash_script(inps, slave):
68 |     params = dict(vars(inps), slave=slave)  # use the actual slave date, not the -s count
69 |     filename = 'proc-{master}-{slave}.sh'.format(**params)
70 |     with open(filename, 'w') as txt:
71 |         txt.write('''#!/bin/bash
72 | # Make directories for processing - already in AMI
73 | cd /mnt/data
74 | mkdir dems poeorb auxcal
75 | # Get the latest python scripts from github & add to path
76 | git clone https://github.com/scottyhq/dinoSAR.git
77 | export PATH=/mnt/data/dinoSAR/bin:$PATH
78 | # Prepare interferogram directory
79 | prep_topsApp.py -i query.geojson -p {path} -m {master} -s {slave} -n {swaths} -r {roi} -g {gbox}
80 | # Run code
81 | cd int-{master}-{slave}
82 | topsApp.py 2>&1 | tee topsApp.log
83 | cp *xml *log merged
84 | aws s3 sync merged/ s3://int-{master}-{slave}/
85 | # Send email
86 | aws sns publish --topic-arn "arn:aws:sns:us-west-2:295426338758:email-me" --message file://topsApp.log --subject "int-{master}-{slave} Finished"
87 | '''.format(**params))
88 | 
89 |     return filename
90 | 
91 | def launch_batch():
92 |     '''
93 |     submit AWS Batch Job
94 |     '''
95 |     cmd = 'aws batch submit-job --job-name example --job-queue HighPriority --job-definition sleep60'  # placeholder job definition
96 |     print(cmd)
97 |     #os.system(cmd)
98 | 
99 | 
100 | if __name__ == '__main__':
101 |     inps = cmdLineParse()
102 |     inps.roi = ' '.join([str(x) for x in inps.roi])
103 |     inps.gbox = ' '.join([str(x) for x in inps.gbox])
104 |     inps.swaths = ' '.join([str(x) for x in inps.swaths])
105 | 
106 |     slaveList = get_pairs(inps)  # NOTE: get_pairs() is still a stub and returns None
107 |     print('Submitting interferograms to AWS Batch')
108 |     #create_s3buckets(inps, slaveList)  # uncomment once get_pairs() is implemented
109 |     #launch_batch()
110 | 
--------------------------------------------------------------------------------
/bin/proc_batch_sequential.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | """
4 | Process all adjacent interferograms for a particular region of interest.
5 | 
6 | 
7 | EXAMPLE:
8 | For example, a query.geojson with the following scenes:
9 | ...
10 | 20170927
11 | 20170915
12 | 20170903
13 | 20170812
14 | ...
15 | 
16 | proc_batch_sequential.py -p 115 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5
17 | # NOTE: automatic pair generation is not implemented yet; the parser below still expects explicit -m/-s dates
18 | # this will run topsApp.py for subswath 2, and region of interest defined by search
19 | # for the following pairs:
20 | int-20170927-20170915
21 | int-20170915-20170903
22 | int-20170903-20170812
23 | 
24 | @author: scott
25 | """
26 | 
27 | import argparse
28 | import os
29 | 
30 | def cmdLineParse():
31 |     '''
32 |     Command line parser.
33 |     '''
34 |     parser = argparse.ArgumentParser(description='process sequential interferogram pairs with ISCE 2.1 topsApp.py')
35 |     parser.add_argument('-m', type=str, dest='master', required=True,
36 |                         help='Master date')
37 |     parser.add_argument('-s', type=str, dest='slave', required=True,
38 |                         help='Slave date')
39 |     parser.add_argument('-p', type=int, dest='path', required=True,
40 |                         help='Path/Track/RelativeOrbit Number')
41 |     parser.add_argument('-n', type=int, nargs='+', dest='swaths', required=False,
42 |                         default=[1,2,3], choices=(1,2,3),
43 |                         help='Subswath numbers to process')
44 |     parser.add_argument('-r', type=float, nargs=4, dest='roi', required=False,
45 |                         metavar=('S','N','W','E'),
46 |                         help='Region of interest bbox [S,N,W,E]')
47 |     parser.add_argument('-g', type=float, nargs=4, dest='gbox', required=False,
48 |                         metavar=('S','N','W','E'),
49 |                         help='Geocode bbox [S,N,W,E]')
50 |     parser.add_argument('-i', type=str, dest='instance', required=False, default='t2.micro',
51 |                         help='EC2 instance type')  # referenced by the CloudFormation template below
52 | 
53 |     return parser.parse_args()
54 | 
55 | def create_cloudformation_script(inps):
56 |     '''
57 |     Write YML file to process an interferogram based on user input
58 |     NOTE: could probably do this better w/ JSON tools...
59 |     '''
60 |     filename = 'proc-{master}-{slave}.yml'.format(**vars(inps))
61 |     with open(filename, 'w') as yml:
62 |         yml.write('''AWSTemplateFormatVersion: "2010-09-09"
63 | Description: "CloudFormation template to create interferogram: int-{master}-{slave}"
64 | Resources:
65 |   MyEC2Instance:
66 |     Type: "AWS::EC2::Instance"
67 |     Properties:
68 |       ImageId: "ami-8deb36f5"
69 |       InstanceType: "{instance}"
70 |       KeyName: "isce-key"
71 |       SecurityGroups: ["isce-sg",]
72 |       BlockDeviceMappings:
73 |         -
74 |           DeviceName: /dev/sda1
75 |           Ebs:
76 |             VolumeType: gp2
77 |             VolumeSize: 8
78 |             DeleteOnTermination: true
79 |         -
80 |           DeviceName: /dev/xvdf
81 |           Ebs:
82 |             VolumeType: gp2
83 |             VolumeSize: 100
84 |             DeleteOnTermination: true
85 |       UserData:
86 |         'Fn::Base64': !Sub |
87 |           #!/bin/bash
88 |           # add -xe to above line for debugging output to /var/log/cloud-init-output.log
89 |           # create mount point directory NOTE all commands run as root
90 |           #mkdir /mnt/data
91 |           # create ext4 filesystem on new volume
92 |           mkfs -t ext4 /dev/xvdf
93 |           # add an entry to fstab to mount volume during boot
94 |           echo "/dev/xvdf /mnt/data ext4 defaults,nofail 0 2" >> /etc/fstab
95 |           # mount the volume on current boot
96 |           mount -a
97 |           chown -R ubuntu /mnt/data
98 |           sudo -i -u ubuntu bash <<"EOF"
99 |           export PATH="/home/ubuntu/miniconda3/envs/isce-2.1.0/bin:/home/ubuntu/.local/bin:$PATH"
100 |           export GDAL_DATA=/home/ubuntu/miniconda3/envs/isce-2.1.0/share/gdal
101 |           source /home/ubuntu/ISCECONFIG
102 |           # Make directories for processing - already in AMI
103 |           cd /mnt/data
104 |           mkdir dems poeorb auxcal
105 |           # Get the latest python scripts from github & add to path
106 |           git clone https://github.com/scottyhq/dinoSAR.git
107 |           export PATH=/mnt/data/dinoSAR/bin:$PATH
108 |           #echo $PATH
109 |           # Download inventory file
110 |           get_inventory_asf.py -r {roi}
111 |           # Prepare interferogram directory
112 |           prep_topsApp.py -i query.geojson -p {path} -m {master} -s {slave} -n {swaths} -r {roi} -g {gbox}
113 |           # Run code
114 |           cd int-{master}-{slave}
115 |           topsApp.py 2>&1 | tee topsApp.log
116 |           # Create S3 bucket and save results
117 |           aws s3 mb s3://int-{master}-{slave}
118 |           cp *xml *log merged
119 |           aws s3 sync merged/ s3://int-{master}-{slave}/
120 |           # Close instance
121 |           #echo "Finished interferogram... shutting down"
122 |           #shutdown #doesn't close entire stack, just EC2
123 |           aws sns publish --topic-arn "arn:aws:sns:us-west-2:295426338758:email-me" --message file://topsApp.log --subject "int-{master}-{slave} Finished"
124 |           aws cloudformation delete-stack --stack-name proc-{master}-{slave}
125 |           EOF
126 | '''.format(**vars(inps)))
127 | 
128 |     return filename
129 | 
130 | 
131 | def launch_stack(template):
132 |     '''
133 |     launch AWS CloudFormation stack
134 |     '''
135 |     name = template[:-4]
136 |     cmd = 'aws cloudformation create-stack --stack-name {0} --template-body file://{1}'.format(name, template)
137 |     print(cmd)
138 |     #os.system(cmd)
139 | 
140 | 
141 | if __name__ == '__main__':
142 |     inps = cmdLineParse()
143 |     inps.roi = ' '.join([str(x) for x in inps.roi])
144 |     inps.gbox = ' '.join([str(x) for x in inps.gbox])
145 |     inps.swaths = ' '.join([str(x) for x in inps.swaths])
146 | 
147 |     template = create_cloudformation_script(inps)
148 |     print('Running Interferogram on EC2')
149 |     #launch_stack(template)
150 |     print('EC2 should close automatically when finished...')
151 | 
--------------------------------------------------------------------------------
/bin/proc_ifg_cfn.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 | """
4 | Process a single interferogram via a CloudFormation script
5 | 
6 | Note: intended for single interferogram processing. For batch processing
7 | figure out how to store common data and DEM on an EFS drive
8 | 
9 | # EXAMPLE:
10 | proc_ifg_cfn.py -i c5.4xlarge -p 115 -m 20170927 -s 20150914 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5
11 | 
12 | # suggested instance types: c4.4xlarge t2.2xlarge c5.4xlarge c5.9xlarge
13 | 
14 | # Archived Files:
15 | *xml *log
16 | #filt_topophase.unw* filt_topophase.flat* dem.crop* los.rdr.geo* phsig.cor.geo*
17 | # For now just stash entire merged directory
18 | 
19 | Created on Sun Nov 19 16:26:27 2017
20 | 
21 | @author: scott
22 | """
23 | 
24 | import argparse
25 | import os
26 | 
27 | def cmdLineParse():
28 |     '''
29 |     Command line parser.
30 |     '''
31 |     parser = argparse.ArgumentParser(description='process a single interferogram with ISCE 2.1 topsApp.py via CloudFormation')
32 |     parser.add_argument('-m', type=str, dest='master', required=True,
33 |                         help='Master date')
34 |     parser.add_argument('-s', type=str, dest='slave', required=True,
35 |                         help='Slave date')
36 |     parser.add_argument('-p', type=int, dest='path', required=True,
37 |                         help='Path/Track/RelativeOrbit Number')
38 |     parser.add_argument('-i', type=str, dest='instance', required=False, default='t2.micro',
39 |                         help='EC2 instance type (c4.4xlarge, t2.micro...)')
40 |     parser.add_argument('-n', type=int, nargs='+', dest='swaths', required=False,
41 |                         default=[1,2,3], choices=(1,2,3),
42 |                         help='Subswath numbers to process')
43 |     parser.add_argument('-r', type=float, nargs=4, dest='roi', required=False,
44 |                         metavar=('S','N','W','E'),
45 |                         help='Region of interest bbox [S,N,W,E]')
46 |     parser.add_argument('-g', type=float, nargs=4, dest='gbox', required=False,
47 |                         metavar=('S','N','W','E'),
48 |                         help='Geocode bbox [S,N,W,E]')
49 | 
50 |     return parser.parse_args()
51 | 
52 | 
53 | 
54 | def create_cloudformation_script(inps):
55 |     '''
56 |     Write YML file to process an interferogram based on user input
57 |     NOTE: could probably do this better w/ JSON tools...
58 |     '''
59 |     filename = 'proc-{master}-{slave}.yml'.format(**vars(inps))
60 |     with open(filename, 'w') as yml:
61 |         yml.write('''AWSTemplateFormatVersion: "2010-09-09"
62 | Description: "CloudFormation template to create interferogram: int-{master}-{slave}"
63 | Resources:
64 |   MyEC2Instance:
65 |     Type: "AWS::EC2::Instance"
66 |     Properties:
67 |       ImageId: "ami-8deb36f5"
68 |       InstanceType: "{instance}"
69 |       KeyName: "isce-key"
70 |       SecurityGroups: ["isce-sg",]
71 |       BlockDeviceMappings:
72 |         -
73 |           DeviceName: /dev/sda1
74 |           Ebs:
75 |             VolumeType: gp2
76 |             VolumeSize: 8
77 |             DeleteOnTermination: true
78 |         -
79 |           DeviceName: /dev/xvdf
80 |           Ebs:
81 |             VolumeType: gp2
82 |             VolumeSize: 100
83 |             DeleteOnTermination: true
84 |       UserData:
85 |         'Fn::Base64': !Sub |
86 |           #!/bin/bash
87 |           # add -xe to above line for debugging output to /var/log/cloud-init-output.log
88 |           # create mount point directory NOTE all commands run as root
89 |           #mkdir /mnt/data
90 |           # create ext4 filesystem on new volume
91 |           mkfs -t ext4 /dev/xvdf
92 |           # add an entry to fstab to mount volume during boot
93 |           echo "/dev/xvdf /mnt/data ext4 defaults,nofail 0 2" >> /etc/fstab
94 |           # mount the volume on current boot
95 |           mount -a
96 |           chown -R ubuntu /mnt/data
97 |           sudo -i -u ubuntu bash <<"EOF"
98 |           export PATH="/home/ubuntu/miniconda3/envs/isce-2.1.0/bin:/home/ubuntu/.local/bin:$PATH"
99 |           export GDAL_DATA=/home/ubuntu/miniconda3/envs/isce-2.1.0/share/gdal
100 |           source /home/ubuntu/ISCECONFIG
101 |           # Make directories for processing - already in AMI
102 |           cd /mnt/data
103 |           mkdir dems poeorb auxcal
104 |           # Get the latest python scripts from github & add to path
105 |           git clone https://github.com/scottyhq/dinoSAR.git
106 |           export PATH=/mnt/data/dinoSAR/bin:$PATH
107 |           #echo $PATH
108 |           # Download inventory file
109 |           get_inventory_asf.py -r {roi}
110 |           # Prepare interferogram directory
111 |           prep_topsApp.py -i query.geojson -p {path} -m {master} -s {slave} -n {swaths} -r {roi} -g {gbox}
112 |           # Run code
113 |           cd int-{master}-{slave}
114 |           topsApp.py 2>&1 | tee topsApp.log
115 |           # Create S3 bucket and save results
116 |           aws s3 mb s3://int-{master}-{slave}
117 |           cp *xml *log merged
118 |           aws s3 sync merged/ s3://int-{master}-{slave}/
119 |           # Close instance
120 |           #echo "Finished interferogram... shutting down"
shutting down" 121 | #shutdown #doesn't close entire stack, just EC2 122 | aws sns publish --topic-arn "arn:aws:sns:us-west-2:295426338758:email-me" --message file://topsApp.log --subject "int-{master}-{slave} Finished" 123 | aws cloudformation delete-stack --stack-name proc-{master}-{slave} 124 | EOF 125 | '''.format(**vars(inps))) 126 | 127 | return filename 128 | 129 | 130 | def launch_stack(template): 131 | ''' 132 | launch AWS CloudFormationStack 133 | ''' 134 | name = template[:-4] 135 | cmd = 'aws cloudformation create-stack --stack-name {0} --template-body file://{1}'.format(name,template) 136 | print(cmd) 137 | #os.system(cmd) 138 | 139 | 140 | if __name__ == '__main__': 141 | inps = cmdLineParse() 142 | inps.roi = ' '.join([str(x) for x in inps.roi]) 143 | inps.gbox = ' '.join([str(x) for x in inps.gbox]) 144 | inps.swaths = ' '.join([str(x) for x in inps.swaths]) 145 | 146 | template = create_cloudformation_script(inps) 147 | print('Running Interferogram on EC2') 148 | #launch_stack(template) 149 | print('EC2 should close automatically when finished...') 150 | -------------------------------------------------------------------------------- /bin/proc_ifg_cfn_exp.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # -*- coding: utf-8 -*- 3 | """ 4 | EXPERIMENTING WITH NEW CLOUD FORMATION SETTINGS 5 | - send email when process begins and completes 6 | - run interferogram using spot market 7 | 8 | 9 | Note: intended for single interferogram processing. For batch processing 10 | figure out how to store common data and DEM on an EFS drive 11 | 12 | # EXAMPLE: 13 | proc_ifg_cfn.py -i c4.4xlarge -p 115 -m 20170927 -s 20170915 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5 14 | 15 | c4.4xlarge 16 | 17 | # Archived Files: 18 | *xml *log 19 | #filt_topophase.unw* filt_topophase.flat* dem.crop* los.rdr.geo* phsig.cor.geo* 20 | # For now just stash entire meged directory 21 | 22 | Created on Sun Nov 19 16:26:27 2017 23 | 24 | @author: scott 25 | """ 26 | 27 | import argparse 28 | import os 29 | 30 | def cmdLineParse(): 31 | ''' 32 | Command line parser. 33 | ''' 34 | parser = argparse.ArgumentParser( description='prepare ISCE 2.1 topsApp.py') 35 | parser.add_argument('-m', type=str, dest='master', required=True, 36 | help='Master date') 37 | parser.add_argument('-s', type=str, dest='slave', required=True, 38 | help='Slave date') 39 | parser.add_argument('-p', type=int, dest='path', required=True, 40 | help='Path/Track/RelativeOrbit Number') 41 | parser.add_argument('-i', type=str, dest='instance', required=False, default='t2.micro', 42 | help='EC2 instance type (c4.4xlarge, t2.micro...)') 43 | parser.add_argument('-n', type=int, nargs='+', dest='swaths', required=False, 44 | default=[1,2,3], choices=(1,2,3), 45 | help='Subswath numbers to process') 46 | parser.add_argument('-r', type=float, nargs=4, dest='roi', required=False, 47 | metavar=('S','N','W','E'), 48 | help='Region of interest bbox [S,N,W,E]') 49 | parser.add_argument('-g', type=float, nargs=4, dest='gbox', required=False, 50 | metavar=('S','N','W','E'), 51 | help='Geocode bbox [S,N,W,E]') 52 | 53 | return parser.parse_args() 54 | 55 | 56 | 57 | def create_cloudformation_script(inps): 58 | ''' 59 | Write YML file to process and interferogram based on user input 60 | NOTE: could probably do this better w/ JSON tools... 
61 |     '''
62 |     filename = 'proc-{master}-{slave}.yml'.format(**vars(inps))
63 |     with open(filename, 'w') as yml:
64 |         yml.write('''AWSTemplateFormatVersion: "2010-09-09"
65 | Description: "CloudFormation template to create interferogram: int-{master}-{slave}"
66 | Resources:
67 |   MyEC2Instance:
68 |     Type: "AWS::EC2::Instance"
69 |     Properties:
70 |       ImageId: "ami-8deb36f5"
71 |       InstanceType: "{instance}"
72 |       KeyName: "isce-key"
73 |       SecurityGroups: ["isce-sg",]
74 |       BlockDeviceMappings:
75 |         -
76 |           DeviceName: /dev/sda1
77 |           Ebs:
78 |             VolumeType: gp2
79 |             VolumeSize: 8
80 |             DeleteOnTermination: true
81 |         -
82 |           DeviceName: /dev/xvdf
83 |           Ebs:
84 |             VolumeType: gp2
85 |             VolumeSize: 100
86 |             DeleteOnTermination: true
87 |       UserData:
88 |         'Fn::Base64': !Sub |
89 |           #!/bin/bash
90 |           # add -xe to above line for debugging output to /var/log/cloud-init-output.log
91 |           # create mount point directory NOTE all commands run as root
92 |           #mkdir /mnt/data
93 |           # create ext4 filesystem on new volume
94 |           mkfs -t ext4 /dev/xvdf
95 |           # add an entry to fstab to mount volume during boot
96 |           echo "/dev/xvdf /mnt/data ext4 defaults,nofail 0 2" >> /etc/fstab
97 |           # mount the volume on current boot
98 |           mount -a
99 |           chown -R ubuntu /mnt/data
100 |           sudo -i -u ubuntu bash <<"EOF"
101 |           export PATH="/home/ubuntu/miniconda3/envs/isce-2.1.0/bin:/home/ubuntu/.local/bin:$PATH"
102 |           export GDAL_DATA=/home/ubuntu/miniconda3/envs/isce-2.1.0/share/gdal
103 |           source /home/ubuntu/ISCECONFIG
104 |           # Make directories for processing - already in AMI
105 |           cd /mnt/data
106 |           mkdir dems poeorb auxcal
107 |           # Get the latest python scripts from github & add to path
108 |           git clone https://github.com/scottyhq/dinoSAR.git
109 |           export PATH=/mnt/data/dinoSAR/bin:$PATH
110 |           echo $PATH
111 |           # Download inventory file
112 |           get_inventory_asf.py -r {roi}
113 |           # Prepare interferogram directory
114 |           prep_topsApp.py -i query.geojson -p {path} -m {master} -s {slave} -n {swaths} -r {roi} -g {gbox}
115 |           # Run code
116 |           cd int-{master}-{slave}
117 |           topsApp.py 2>&1 | tee topsApp.log
118 |           # Create S3 bucket and save results
119 |           aws s3 mb s3://int-{master}-{slave}
120 |           cp *xml *log merged
121 |           aws s3 sync merged/ s3://int-{master}-{slave}/
122 |           # Close instance
123 |           EOF
124 |           echo "Finished interferogram... shutting down"
shutting down" 125 | #shutdown #doesn't close entire stack 126 | aws cloudformation delete-stack --stack-name proc-{master}-{slave} 127 | '''.format(**vars(inps))) 128 | 129 | return filename 130 | 131 | 132 | def launch_stack(template): 133 | ''' 134 | launch AWS CloudFormationStack 135 | ''' 136 | name = template[:-4] 137 | cmd = 'aws cloudformation create-stack --stack-name {0} --template-body file://{1}'.format(name,template) 138 | print(cmd) 139 | os.system(cmd) 140 | 141 | 142 | if __name__ == '__main__': 143 | inps = cmdLineParse() 144 | inps.roi = ' '.join([str(x) for x in inps.roi]) 145 | inps.gbox = ' '.join([str(x) for x in inps.gbox]) 146 | inps.swaths = ' '.join([str(x) for x in inps.swaths]) 147 | 148 | template = create_cloudformation_script(inps) 149 | print('Running Interferogram on EC2') 150 | launch_stack(template) 151 | print('EC2 should close automatically when finished...') 152 | -------------------------------------------------------------------------------- /bin/proc_ifg_cfn_spot.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # -*- coding: utf-8 -*- 3 | """ 4 | Process a single interferogram via a CloudFormation script 5 | 6 | Note: intended for single interferogram processing. For batch processing 7 | figure out how to store common data and DEM on an EFS drive 8 | 9 | # EXAMPLE: 10 | proc_ifg_cfn.py -i c4.4xlarge -p 115 -m 20170927 -s 20170915 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5 11 | 12 | c4.4xlarge 13 | 14 | # Archived Files: 15 | *xml *log 16 | #filt_topophase.unw* filt_topophase.flat* dem.crop* los.rdr.geo* phsig.cor.geo* 17 | # For now just stash entire meged directory 18 | 19 | Created on Sun Nov 19 16:26:27 2017 20 | 21 | @author: scott 22 | """ 23 | 24 | import argparse 25 | import os 26 | 27 | def cmdLineParse(): 28 | ''' 29 | Command line parser. 30 | ''' 31 | parser = argparse.ArgumentParser( description='prepare ISCE 2.1 topsApp.py') 32 | parser.add_argument('-m', type=str, dest='master', required=True, 33 | help='Master date') 34 | parser.add_argument('-s', type=str, dest='slave', required=True, 35 | help='Slave date') 36 | parser.add_argument('-p', type=int, dest='path', required=True, 37 | help='Path/Track/RelativeOrbit Number') 38 | parser.add_argument('-i', type=str, dest='instance', required=False, default='t2.micro', 39 | help='EC2 instance type (c4.4xlarge, t2.micro...)') 40 | parser.add_argument('-n', type=int, nargs='+', dest='swaths', required=False, 41 | default=[1,2,3], choices=(1,2,3), 42 | help='Subswath numbers to process') 43 | parser.add_argument('-r', type=float, nargs=4, dest='roi', required=False, 44 | metavar=('S','N','W','E'), 45 | help='Region of interest bbox [S,N,W,E]') 46 | parser.add_argument('-g', type=float, nargs=4, dest='gbox', required=False, 47 | metavar=('S','N','W','E'), 48 | help='Geocode bbox [S,N,W,E]') 49 | 50 | return parser.parse_args() 51 | 52 | 53 | 54 | def create_cloudformation_script(inps): 55 | ''' 56 | Write YML file to process and interferogram based on user input 57 | NOTE: could probably do this better w/ JSON tools... 
58 |     '''
59 |     filename = 'proc-{master}-{slave}.yml'.format(**vars(inps))
60 |     with open(filename, 'w') as yml:
61 |         yml.write('''AWSTemplateFormatVersion: "2010-09-09"
62 | Description: "CloudFormation template to create interferogram: int-{master}-{slave}"
63 | Resources:
64 |   MyEC2Instance:
65 |     Type: "AWS::EC2::Instance"
66 |     Properties:
67 |       ImageId: "ami-8deb36f5"
68 |       InstanceType: "{instance}"
69 |       KeyName: "isce-key"
70 |       SecurityGroups: ["isce-sg",]
71 |       BlockDeviceMappings:
72 |         -
73 |           DeviceName: /dev/sda1
74 |           Ebs:
75 |             VolumeType: gp2
76 |             VolumeSize: 8
77 |             DeleteOnTermination: true
78 |         -
79 |           DeviceName: /dev/xvdf
80 |           Ebs:
81 |             VolumeType: gp2
82 |             VolumeSize: 100
83 |             DeleteOnTermination: true
84 |       UserData:
85 |         'Fn::Base64': !Sub |
86 |           #!/bin/bash
87 |           # add -xe to above line for debugging output to /var/log/cloud-init-output.log
88 |           # create mount point directory NOTE all commands run as root
89 |           #mkdir /mnt/data
90 |           # create ext4 filesystem on new volume
91 |           mkfs -t ext4 /dev/xvdf
92 |           # add an entry to fstab to mount volume during boot
93 |           echo "/dev/xvdf /mnt/data ext4 defaults,nofail 0 2" >> /etc/fstab
94 |           # mount the volume on current boot
95 |           mount -a
96 |           chown -R ubuntu /mnt/data
97 |           sudo -i -u ubuntu bash <<"EOF"
98 |           export PATH="/home/ubuntu/miniconda3/envs/isce-2.1.0/bin:/home/ubuntu/.local/bin:$PATH"
99 |           export GDAL_DATA=/home/ubuntu/miniconda3/envs/isce-2.1.0/share/gdal
100 |           source /home/ubuntu/ISCECONFIG
101 |           # Make directories for processing - already in AMI
102 |           cd /mnt/data
103 |           mkdir dems poeorb auxcal
104 |           # Get the latest python scripts from github & add to path
105 |           git clone https://github.com/scottyhq/dinoSAR.git
106 |           export PATH=/mnt/data/dinoSAR/bin:$PATH
107 |           echo $PATH
108 |           # Download inventory file
109 |           get_inventory_asf.py -r {roi}
110 |           # Prepare interferogram directory
111 |           prep_topsApp.py -i query.geojson -p {path} -m {master} -s {slave} -n {swaths} -r {roi} -g {gbox}
112 |           # Run code
113 |           cd int-{master}-{slave}
114 |           topsApp.py 2>&1 | tee topsApp.log
115 |           # Create S3 bucket and save results
116 |           aws s3 mb s3://int-{master}-{slave}
117 |           cp *xml *log merged
118 |           aws s3 sync merged/ s3://int-{master}-{slave}/
119 |           # Close instance
120 |           EOF
121 |           echo "Finished interferogram... shutting down"
shutting down" 122 | #shutdown #doesn't close entire stack 123 | aws cloudformation delete-stack --stack-name proc-{master}-{slave} 124 | '''.format(**vars(inps))) 125 | 126 | return filename 127 | 128 | 129 | def launch_stack(template): 130 | ''' 131 | launch AWS CloudFormationStack 132 | ''' 133 | name = template[:-4] 134 | cmd = 'aws cloudformation create-stack --stack-name {0} --template-body file://{1}'.format(name,template) 135 | print(cmd) 136 | os.system(cmd) 137 | 138 | 139 | if __name__ == '__main__': 140 | inps = cmdLineParse() 141 | inps.roi = ' '.join([str(x) for x in inps.roi]) 142 | inps.gbox = ' '.join([str(x) for x in inps.gbox]) 143 | inps.swaths = ' '.join([str(x) for x in inps.swaths]) 144 | 145 | template = create_cloudformation_script(inps) 146 | print('Running Interferogram on EC2') 147 | launch_stack(template) 148 | print('EC2 should close automatically when finished...') 149 | -------------------------------------------------------------------------------- /bin/process_interferogramEC2.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # -*- coding: utf-8 -*- 3 | """ 4 | Will launch an EC2 instance, process an interferogram with specified parameters 5 | and store the 'merged' folder on S3. Look into doing this with 'boto' 6 | 7 | Approach: create a bash script and pass it to EC2 instanch on launch 8 | 9 | Note: intended for single interferogram processing. For batch processing 10 | figure out how to store common data and DEM on an EFS drive 11 | 12 | # EXAMPLE: 13 | process_interferogramEC2.py -p 115 -m 20170927 -s 20170915 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5 14 | 15 | 16 | # Archived Files: 17 | *xml *log 18 | #filt_topophase.unw* filt_topophase.flat* dem.crop* los.rdr.geo* phsig.cor.geo* 19 | # For now just stash entire meged directory 20 | 21 | Created on Sun Nov 19 16:26:27 2017 22 | 23 | @author: scott 24 | """ 25 | 26 | import argparse 27 | import os 28 | 29 | def cmdLineParse(): 30 | ''' 31 | Command line parser. 32 | ''' 33 | parser = argparse.ArgumentParser( description='prepare ISCE 2.1 topsApp.py') 34 | parser.add_argument('-m', type=str, dest='master', required=True, 35 | help='Master date') 36 | parser.add_argument('-s', type=str, dest='slave', required=True, 37 | help='Slave date') 38 | parser.add_argument('-p', type=int, dest='path', required=True, 39 | help='Path/Track/RelativeOrbit Number') 40 | parser.add_argument('-n', type=int, nargs='+', dest='swaths', required=False, 41 | default=[1,2,3], choices=(1,2,3), 42 | help='Subswath numbers to process') 43 | parser.add_argument('-r', type=float, nargs=4, dest='roi', required=False, 44 | metavar=('S','N','W','E'), 45 | help='Region of interest bbox [S,N,W,E]') 46 | parser.add_argument('-g', type=float, nargs=4, dest='gbox', required=False, 47 | metavar=('S','N','W','E'), 48 | help='Geocode bbox [S,N,W,E]') 49 | 50 | return parser.parse_args() 51 | 52 | 53 | 54 | def create_bash_script(inps): 55 | ''' 56 | Write bash file to process and interferogram based on user input 57 | ''' 58 | with open('run_interferogram.sh', 'w') as bash: 59 | bash.write('''#!/bin/bash 60 | echo "Running interferogram generation script..." 
61 | echo $PWD
62 | cd /home/ubuntu
63 | echo $PWD
64 | 
65 | # Initialize software warning - defaults to root directory
66 | #source ~/.bashrc
67 | #source ~/.aliases
68 | source /home/ubuntu/.aliases
69 | start_isce
70 | 
71 | # Get the latest python scripts from github & add to path
72 | git clone https://github.com/scottyhq/dinoSAR.git
73 | export PATH=/home/ubuntu/dinoSAR/bin:$PATH
74 | echo $PATH
75 | 
76 | # Download inventory file
77 | get_inventory_asf.py -r {roi}
78 | 
79 | # Prepare interferogram directory (-p is required by prep_topsApp.py)
80 | prep_topsApp.py -i query.geojson -p {path} -m {master} -s {slave} -n {swaths} -r {roi} -g {gbox}
81 | 
82 | # Run code
83 | cd int-{master}-{slave}
84 | topsApp.py 2>&1 | tee topsApp.log
85 | 
86 | # Create S3 bucket and save results
87 | aws s3 mb s3://int-{master}-{slave}
88 | cp *xml *log merged
89 | aws s3 sync merged/ s3://int-{master}-{slave}/
90 | 
91 | # Close instance
92 | echo "Finished interferogram... shutting down"
93 | #poweroff
94 | 
95 | '''.format(**vars(inps)))
96 | 
97 | 
98 | def launch_ec2(ami='ami-d015daa8', instance='t2.micro', key='isce-key',
99 |                security_group='sg-eee2ef93'):
100 |     '''
101 |     launch an ec2 with interferogram script
102 |     Reasonable instances: 'c4.4xlarge'
103 |     '''
104 |     cmd = ('aws ec2 run-instances --image-id {0} --count 1 --instance-type {1}'
105 |            ' --key-name {2} --security-group-ids {3}'
106 |            ' --user-data file://run_interferogram.sh').format(ami, instance, key, security_group)
107 |     print(cmd)
108 |     #os.system(cmd)
109 | 
110 | 
111 | if __name__ == '__main__':
112 |     inps = cmdLineParse()
113 |     # convert lists to strings
114 |     inps.roi = ' '.join([str(x) for x in inps.roi])
115 |     inps.gbox = ' '.join([str(x) for x in inps.gbox])
116 |     inps.swaths = ' '.join([str(x) for x in inps.swaths])
117 | 
118 |     create_bash_script(inps)
119 |     print('Running Interferogram on EC2')
120 |     launch_ec2()
121 |     print('EC2 should close automatically when finished...')
122 | 
123 | 
--------------------------------------------------------------------------------
/bin/run_interferogram.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | echo "Running interferogram generation script..."
3 | echo $PWD
4 | cd /home/ubuntu
5 | echo $PWD
6 | 
7 | # Initialize software warning - defaults to root directory
8 | #source ~/.bashrc
9 | #source ~/.aliases
10 | source /home/ubuntu/.aliases
11 | start_isce
12 | 
13 | # Get the latest python scripts from github & add to path
14 | git clone https://github.com/scottyhq/dinoSAR.git
15 | export PATH=/home/ubuntu/dinoSAR/bin:$PATH
16 | echo $PATH
17 | 
18 | # Download inventory file
19 | get_inventory_asf.py -r 44.0 44.5 -122.0 -121.5
20 | 
21 | # Prepare interferogram directory (-p is required by prep_topsApp.py)
22 | prep_topsApp.py -i query.geojson -p 115 -m 20170927 -s 20170903 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5
23 | 
24 | # Run code
25 | cd int-20170927-20170903
26 | topsApp.py 2>&1 | tee topsApp.log
27 | 
28 | # Create S3 bucket and save results
29 | aws s3 mb s3://int-20170927-20170903
30 | cp *xml *log merged
31 | aws s3 sync merged/ s3://int-20170927-20170903/
32 | 
33 | # Close instance
34 | echo "Finished interferogram... shutting down"
shutting down" 35 | #poweroff 36 | 37 | -------------------------------------------------------------------------------- /bin/run_interferogram_docker.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | date 3 | echo "Args: $@" 4 | env 5 | #https://aws.amazon.com/blogs/compute/creating-a-simple-fetch-and-run-aws-batch-job/ 6 | echo "Fetch and run interferogram processing" 7 | echo "jobId: $AWS_BATCH_JOB_ID" 8 | echo "jobQueue: $AWS_BATCH_JQ_NAME" 9 | echo "computeEnvironment: $AWS_BATCH_CE_NAME" 10 | echo "interferogram: $INTNAME" 11 | echo "processor: Scott Henderson (scottyh@uw.edu)" 12 | echo "software: ISCE 2.1.0" 13 | 14 | # Processing happens here 15 | mkdir $INTNAME 16 | cd /home/ubuntu/$INTNAME 17 | echo $PWD 18 | 19 | # Download processing file from s3 20 | aws s3 sync s3://$INTNAME . 21 | 22 | # Download S1 SLCs from asf 23 | wget --user=$NASAUSER --password=$NASAPASS --input-file=download-links.txt 24 | 25 | # Download aux-cal 20Mb, needed for antenna pattern on old IPF conversions :( 26 | wget -q -r -l2 -nc -nd -np -nH -A SAFE https://s1qc.asf.alaska.edu/aux_cal 27 | 28 | # Run ISCE Software 29 | topsApp.py 2>&1 | tee topsApp.log 30 | 31 | # Generate web-friendly output and push to s3 32 | isce2aws.py $INTNAME 33 | 34 | date 35 | echo "All done!" 36 | -------------------------------------------------------------------------------- /docker/readme.md: -------------------------------------------------------------------------------- 1 | # run ISCE software via Docker 2 | 3 | [Docker](https://www.docker.com/what-docker) is a platform to "containerize" software such that it runs on any computational infrastructure (for example, your own laptop, a windows desktop, or a linux high performance cloud cluster). dinoSARaws includes a Docker Image for ISCE version 2.1.0 installed on Ubuntu 16.04 such that it can be run on various machines. 4 | -------------------------------------------------------------------------------- /docs/readme.md: -------------------------------------------------------------------------------- 1 | # Documentation 2 | 3 | Some extended instructions and examples will be posted here 4 | -------------------------------------------------------------------------------- /s1batch/readme.md: -------------------------------------------------------------------------------- 1 | # Process an entire set of interferograms on AWS 2 | 3 | This folder contains code to processes several hundred interferograms efficiently in the cloud. 4 | 5 | ## Why run this? 6 | 7 | Individual interferograms theoretically measure relative ground displacement between two dates. However, single images are often dominated by atmospheric noise (phase delays caused by water vapor). Stratified changes in water vapor can sometimes successfully eliminated using regional weather models or empirical relations of phase difference to elevation. Turbulent atmospheric noise can be estimated with stacks of many interferograms and time series since these variations are expected to be uncorrelated in time. In short, processing a set of interferograms allows for more sophisticated analysis to isolate true ground deformation. 
8 | 
9 | ## Common-master processing
10 | 
11 | Pairs the most recent acquisition with several preceding acquisitions:
12 | 
13 | ```
14 | proc_batch_master.py -p 115 -m 20170927 -s 3 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5
15 | ```
16 | 
17 | * `-s 3` specifies processing of 20170927 with three preceding dates:
18 | 
19 | ```
20 | int-20170927-20170915
21 | int-20170927-20170903
22 | int-20170927-20170812
23 | ```
24 | 
25 | ## Entire archive processing
26 | 
27 | **Be warned**: depending on the size of your region of interest this can be quite expensive. Essentially, for a set of frames covering your area of interest, each sequential pairing of dates is sent to a different virtual machine to be processed in parallel. This is accomplished via [AWS Batch](https://aws.amazon.com/batch/).
28 | 
29 | ```
30 | proc_batch_sequential.py -p 115 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5
31 | ```
32 | 
--------------------------------------------------------------------------------
/s1single/readme.md:
--------------------------------------------------------------------------------
1 | # Process single interferogram on AWS
2 | 
3 | ## Query Sentinel-1 archive via ASF
4 | 
5 | ```
6 | get_inventory_asf.py -r 44.0 44.5 -122.0 -121.5 -f
7 | ```
8 | 
9 | This will download a GeoJSON file (query.geojson) that describes each frame of Sentinel-1 data overlapping a region of interest. Note that Sentinel-1 frames do not always cover the same ground footprint and can be shifted in azimuth (roughly North-South) along a given orbital track.
10 | 
11 | Options:
12 | * `-i` lets you input a polygon vector file instead of an `-r` SNWE box for the region of interest.
13 | * `-b` adds a buffer (in degrees) around the region of interest.
14 | * `-f` creates subfolders with geojson footprints for every S1 frame. This is convenient for checking frame overlap since you can view the frames on a map [on GitHub](https://github.com/scottyhq/pnwinsar/blob/master/oregon/frames/115/2014-11-06.geojson).
15 | 
16 | ## Process a selected pair on AWS
17 | 
18 | ```
19 | proc_ifg_cfn.py -i c5.4xlarge -p 115 -m 20170927 -s 20150908 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5
20 | ```
21 | * `-i` specifies the EC2 instance type
22 | * `-p` specifies orbital path (track) number 115
23 | * `-m` specifies the master (primary) frame acquisition date
24 | * `-s` specifies the slave (secondary) frame acquisition date
25 | * `-n` specifies subswaths to process (to process all three, input -n 1 2 3)
26 | * `-r` specifies the region of interest for processing
27 | * `-g` specifies the geocoding bounding box for ISCE outputs
28 | 
29 | This command launches computing resources on Amazon Web Services via [AWS CloudFormation](https://aws.amazon.com/cloudformation). Essentially, the script writes a YML file (proc-20170927-20150908.yml) that specifies which computational resources to launch: in particular, a `c5.4xlarge` [EC2 instance](https://aws.amazon.com/ec2/instance-types) with a custom Amazon Machine Image (Ubuntu 16.04 with ISCE pre-installed). The file also specifies security settings (RSA key) and additional attached storage for running ISCE (a 100 GB [EBS drive](https://aws.amazon.com/ebs/)). Finally, a 'UserData' bash script is included at the end of the file, which runs ISCE and shuts everything down when finished.
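
The stack deletes itself when processing completes, but the results persist in S3. A couple of standard AWS CLI commands (using the stack and bucket names from the example above) can be used to monitor progress and retrieve outputs:

```
# check on the stack while it runs
aws cloudformation describe-stacks --stack-name proc-20170927-20150908
# after completion, pull the archived results down from S3
aws s3 sync s3://int-20170927-20150908 ./int-20170927-20150908
```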
Below is a copy of the bash script:
30 | 
31 | ```
32 | #!/bin/bash
33 | # create ext4 filesystem on attached EBS volume
34 | mkfs -t ext4 /dev/xvdf
35 | 
36 | # add an entry to fstab to mount volume during boot
37 | echo "/dev/xvdf /mnt/data ext4 defaults,nofail 0 2" >> /etc/fstab
38 | 
39 | # mount the volume on current boot
40 | mount -a
41 | chown -R ubuntu /mnt/data
42 | 
43 | # run the following commands as 'ubuntu' instead of root user
44 | sudo -i -u ubuntu bash <<"EOF"
45 | export PATH="/home/ubuntu/miniconda3/envs/isce-2.1.0/bin:/home/ubuntu/.local/bin:$PATH"
46 | export GDAL_DATA=/home/ubuntu/miniconda3/envs/isce-2.1.0/share/gdal
47 | source /home/ubuntu/ISCECONFIG
48 | 
49 | # Make directories for processing
50 | cd /mnt/data
51 | mkdir dems poeorb auxcal
52 | 
53 | # Get the latest python scripts from github & add to path
54 | git clone https://github.com/scottyhq/dinoSARaws.git
55 | export PATH=/mnt/data/dinoSARaws/bin:$PATH
56 | 
57 | # Download inventory file
58 | get_inventory_asf.py -r 44.0 44.5 -122.0 -121.5
59 | 
60 | # Prepare interferogram directory
61 | prep_topsApp.py -i query.geojson -p 115 -m 20170927 -s 20150908 -n 2 -r 44.0 44.5 -122.0 -121.5 -g 44.0 44.5 -122.0 -121.5
62 | 
63 | # Run code
64 | cd int-20170927-20150908
65 | topsApp.py 2>&1 | tee topsApp.log
66 | 
67 | # Create S3 bucket and save results
68 | aws s3 mb s3://int-20170927-20150908
69 | cp *xml *log merged
70 | aws s3 sync merged/ s3://int-20170927-20150908/
71 | 
72 | # Close down everything
73 | aws cloudformation delete-stack --stack-name proc-20170927-20150908
74 | ```
75 | 
--------------------------------------------------------------------------------