├── .gitattributes
├── Dockerfile
├── Readme.md
├── cams_download
│   ├── README.md
│   ├── download_CAMS_daily.py
│   ├── ecmwfapi
│   │   ├── __init__.py
│   │   └── api.py
│   └── setup.py
├── convert_CAMS_DBL.py
├── convert_to_exo.py
├── copyright.txt
├── folders.txt
├── prepare_mnt
│   ├── 32SNE.txt
│   ├── CVersailles.txt
│   ├── MAJA_HDR_TEMPLATE.HDR
│   ├── Readme.md
│   ├── conversion_format_maja.py
│   ├── copyright.txt
│   ├── france.txt
│   ├── land_polygons_osm
│   │   ├── README
│   │   ├── simplified_land_polygons.cpg
│   │   ├── simplified_land_polygons.dbf
│   │   ├── simplified_land_polygons.prj
│   │   ├── simplified_land_polygons.shp
│   │   └── simplified_land_polygons.shx
│   ├── lib_mnt.py
│   ├── parametres.txt
│   ├── tuilage_mnt_eau.py
│   └── tuilage_mnt_eau_S2.py
├── start_maja.py
└── userconf
    ├── MAJAUserConfigSystem.xml
    ├── MAJAUserConfig_FORMOSAT_MUSCATE_PROTO.xml
    ├── MAJAUserConfig_LANDSAT8.xml
    ├── MAJAUserConfig_LANDSAT8_MUSCATE.xml
    ├── MAJAUserConfig_LANDSAT8_MUSCATE_PROTO.xml
    ├── MAJAUserConfig_LANDSAT_MUSCATE.xml
    ├── MAJAUserConfig_LANDSAT_MUSCATE_PROTO.xml
    ├── MAJAUserConfig_SENTINEL2.xml
    ├── MAJAUserConfig_SENTINEL2_GPP.xml
    ├── MAJAUserConfig_SENTINEL2_MUSCATE.xml
    ├── MAJAUserConfig_SENTINEL2_TM.xml
    ├── MAJAUserConfig_SPOT4_MUSCATE_PROTO.xml
    └── MAJAUserConfig_VENUS.xml

--------------------------------------------------------------------------------
/.gitattributes:
--------------------------------------------------------------------------------
1 | *.psd filter=lfs diff=lfs merge=lfs -text
2 | *.TIF filter=lfs diff=lfs merge=lfs -text
3 | *.DBL filter=lfs diff=lfs merge=lfs -text
--------------------------------------------------------------------------------
/Dockerfile:
--------------------------------------------------------------------------------
1 | FROM centos:7.2.1511
2 | MAINTAINER Daniel Kristof
3 |
4 | ARG http_proxy
5 | ARG https_proxy
6 | ARG ftp_proxy
7 |
8 | ENV http_proxy "$http_proxy"
9 | ENV https_proxy "$https_proxy"
10 | ENV ftp_proxy "$ftp_proxy"
11 |
12 | RUN yum --disableplugin=fastestmirror -y update && yum clean all
13 | RUN yum --disableplugin=fastestmirror -y install gd libxslt libxml2 git wget
14 |
15 | RUN mkdir /usr/lbzip2 && cd /usr/lbzip2
16 | RUN wget http://dl.fedoraproject.org/pub/epel/7/x86_64/l/lbzip2-2.5-1.el7.x86_64.rpm
17 | RUN rpm -Uvh lbzip2-2.5-1.el7.x86_64.rpm
18 |
19 | RUN mkdir /usr/local/maja && cd /usr/local/maja
20 |
21 | ADD maja-1.0.0-rhel.7.2.x86_64-release-gcc.tar /usr/local/maja/
22 | ADD maja-cots-1.0.0-rhel.7.2.x86_64-release-gcc.tar /usr/local/maja/
23 |
24 | RUN cd /usr/local/maja/maja-cots-1.0.0-rhel.7.2.x86_64-release-gcc && echo 'Y'|./install.sh
25 | RUN cd /usr/local/maja/maja-1.0.0-rhel.7.2.x86_64-release-gcc && echo 'Y'|./install.sh
26 |
27 | RUN cd /opt/maja
28 | RUN git clone https://github.com/olivierhagolle/Start_maja
29 | RUN cd Start_maja && rm folders.txt
30 | ADD folders.txt /opt/maja/Start_maja
--------------------------------------------------------------------------------
/Readme.md:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 | **-**
5 |
6 | **-**
7 |
8 | **-**
9 |
10 | **-**
11 |
12 | **============================================================**
13 |
14 | **This repository is deprecated. Start_Maja has moved to the CNES github repository**
15 |
16 | **============================================================**
17 |
18 | Unless you really want to use old versions,
19 | please use the new repository:
20 |
21 | **https://github.com/CNES/Start-MAJA**
22 |
23 | **============================================================**
24 |
25 | **-**
26 |
27 | **-**
28 |
29 | **-**
30 |
31 | **-**
32 |
33 | **-**
34 |
35 | **-**
36 |
37 |
38 |
39 |
40 |
41 | # Content
42 |
43 | 1. [Introduction](#intro)
44 | 2. [MAJA versions](#versions)
45 | 3. [MAJA output format](#format)
46 | 4. [Get and Install MAJA](#maja)
47 | 5. [Use start_maja](#Basic)
48 | 6. [Example workflow](#workflow)
49 | 7. [Docker](#docker)
50 |
51 |
52 |
53 |
54 | # Introduction
55 |
56 | **Start_maja has been updated to work with MAJA 3.1. It is not compatible with MAJA 1.0. If you wish to use MAJA 1.0, please use the corresponding version of Start_maja and [read the corresponding readme file](https://github.com/olivierhagolle/Start_maja/tree/v1.0).** To do that, please use `git checkout Start_maja_V1`.
57 |
58 | The following script will help you run the MAJA L2A processor on your computer, for Sentinel-2 data only so far. You can also run MAJA on the [CNES PEPS collaborative ground segment](https://theia.cnes.fr) using the [maja-peps script, also available on github](https://github.com/olivierhagolle/maja_peps). Using PEPS is much easier, but it is not meant for mass processing.
59 |
60 | MAJA stands for MACCS-ATCOR Joint Algorithm. This atmospheric correction and cloud screening software is based on [the MACCS processor](http://www.cesbio.ups-tlse.fr/multitemp/?p=6203), developed for CNES by the CS-SI company, from a method and a prototype developed at CESBIO [1](#ref1) [2](#ref2) [3](#ref3). In 2017, thanks to an agreement between CNES and DLR and to some funding from ESA, we started adding methods from DLR's atmospheric correction software ATCOR into MACCS. MACCS then became MAJA.
61 |
62 | - The first version resulting from this collaboration was MAJA V1-0. If you are using this version, you will also need [version v1_0 of start_maja](https://github.com/olivierhagolle/Start_maja/releases/tag/v1.0).
63 |
64 | - A second version of MAJA, V2-1, was used in Theia but was not distributed to users, because version 3 became available shortly afterwards.
65 |
66 | - This version of start_maja.py is made to run MAJA 3.1.
67 |
68 | MAJA has a unique feature among atmospheric correction processors: it uses multi-temporal criteria to improve cloud detection and aerosol retrieval. Because of this feature, it is important to use MAJA to process *time series* of images, not single images. Moreover, these images have to be processed in chronological order. To initialise the processing of a time series, a special mode is used, named "backward mode". To obtain a correct first product, we in fact process a small number of products in anti-chronological order (the default number of images processed in backward mode is 8; consider increasing it if your region is very cloudy). Then all the products are processed in "nominal" mode and in chronological order. When a product is fully or nearly fully cloudy, it is not issued, to save processing time and disk space.
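To make this processing order concrete, here is a minimal sketch of the scheduling logic (illustrative only, not start_maja's actual code; `l1c_dates` is a hypothetical list of acquisition dates):

```python
# Illustrative sketch: backward initialisation, then nominal chronological processing.
def processing_plan(l1c_dates, n_backward=8):
    dates = sorted(l1c_dates)                      # chronological order
    plan = [("backward", dates[:n_backward])]      # init of the first L2A product
    plan += [("nominal", [d]) for d in dates[1:]]  # then one date at a time, in order
    return plan

for mode, dates in processing_plan(["20170101", "20170111", "20170121"], n_backward=2):
    print(mode, dates)
```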
69 |
70 | For more information about MAJA's methods, but without details, please read: http://www.cesbio.ups-tlse.fr/multitemp/?p=6203
71 | To get all the details on the methods, MAJA's ATBD is available here: http://tully.ups-tlse.fr/olivier/maja_atbd/blob/master/atbd_maja.pdf, or see reference [4](#ref4) below.
72 |
73 |
74 | MAJA needs parameters, which ESA names GIPP. We have also set up [an internal repository](http://tully.ups-tlse.fr/olivier/gipp/tree/master) containing parameters for all the sensors currently processed by MAJA, including Sentinel-2, Venµs and LANDSAT 8. This repository is kept up to date with the operational processors. See also [the parameters section](#parameters) below.
75 |
76 |
77 | # Recent changes
78 |
79 | ## V3.1 (2018/07/09)
80 | Until MAJA V3.1 there were two output formats, one for the products generated at Theia, and one for the products generated by MAJA used with standard ESA L1C products. In the future, we will adopt the output format of Theia. However, for this version, we provide a choice of two outputs. To choose which output format is used by MAJA, you will need to choose between two binary versions:
81 | - the MAJA version with the "Sentinel2-TM" plugin will provide the Theia format as output. [This format is described here](http://www.cesbio.ups-tlse.fr/multitemp/?page_id=8352).
82 |
83 | - the other version will go on with the current format, [described here](http://www.cesbio.ups-tlse.fr/multitemp/?page_id=10464)
84 |
85 | MAJA 3.1 brings several improvements:
86 |
87 | - the main improvement is the use of Copernicus Atmosphere Monitoring Service (CAMS) aerosol products, which are used to constrain the aerosol type in the estimates. This brings a major improvement in places where the aerosols can differ a lot from the continental model used so far; it might slightly degrade the results where the continental model was the correct one. However, a bug was found in the time and location interpolation of CAMS data, and we recommend activating the CAMS option only once it is fixed, with MAJA 3.1.2.
88 |
89 | - since version V2-1, MAJA also includes a correction for thin cirrus clouds and a directional effect correction used to improve the estimate of AOT when using Sentinel-2 time series coming from adjacent orbits. More information is available here: http://www.cesbio.ups-tlse.fr/multitemp/?p=13291
90 |
91 | - depending on the executable downloaded, you can have access to the same output format as the one used by the MUSCATE processing center.
92 |
93 | - and finally, MAJA is now provided for the RedHat and Ubuntu Linux families.
94 |
95 | ### V1.0 (2018/07/09)
96 | We added a tag, v1.0, to get a version number similar to the one used for MAJA. The corresponding release [can be accessed here](https://github.com/olivierhagolle/Start_maja/releases/tag/v1.0)
97 |
98 | ### v0.9.1 (2018/03/29)
99 | Added MAJA error catching. As a result, the processing of a whole time series stops if MAJA fails for a given date.
100 |
101 | ### v0.9 (2017/10/02)
102 | - this version of start_maja works with both S2A and S2B
103 | - we have found errors, especially regarding water vapour, in the parameters we provided in the "GIPP_nominal" folder. These parameters have been removed and we strongly advise you to do the same.
104 | - we have updated the parameters and provided them for both S2A and S2B in the folder GIPP_S2AS2B
105 |
106 |
107 | # Data format
108 | - the MAJA version with the "Sentinel2-TM" plugin uses the Theia format as output. [This format is described here](http://www.cesbio.ups-tlse.fr/multitemp/?page_id=8352).
109 |
110 | - the other version still uses the native format, [described here](http://www.cesbio.ups-tlse.fr/multitemp/?page_id=10464). We might decide to stop supporting this format in the coming versions.
111 |
112 |
113 |
114 |
115 | # Get MAJA
116 | ## Get the MAJA software
117 |
118 | MAJA is provided as binary code and should at least work on RedHat (6 and 7), CentOS, or recent Ubuntu versions. Its licence prevents commercial use of the code. For a licence allowing commercial use, please contact CNES (Olivier Hagolle). MAJA's distribution site is https://logiciels.cnes.fr/en/content/maja. However, this venerable site is limited to 500 MB per software package, and the MAJA executable, shipped with parameters and libraries, is about 1.5 GB. As a result, we provisionally provide download links here.
119 |
120 | ** MAJA is distributed with a licence that [you have to accept here](https://logiciels.cnes.fr/en/content/maja). Downloading MAJA from the links below without accepting the licence, and providing the necessary information, is therefore illegal.**
121 |
122 | MAJA is provided in two versions, depending on the format you would like to use.
123 |
124 | [You may download MAJA 3.1 from here if you wish to use the MUSCATE format](https://mycore.core-cloud.net/index.php/s/K0wk1OA0SezjreO). The format is documented [here](http://www.cesbio.ups-tlse.fr/multitemp/?page_id=8352).
125 |
126 | [You may download MAJA 3.1 from here if you wish to use the native format, as for MAJA 1_0](https://mycore.core-cloud.net/index.php/s/XQKQFxAJjGUtLkK). Be aware, however, that we will probably not maintain that version in the coming years. The native format is documented [here](http://www.cesbio.ups-tlse.fr/multitemp/?page_id=10464)
127 |
128 |
129 |
130 | ## Install MAJA
131 | This is explained in the documentation provided with the MAJA software.
132 | Some users have had issues with missing libraries, depending on how the Linux system is configured. Running the following commands, with administration rights, might help.
133 | ```
134 | # sudo yum --disableplugin=fastestmirror -y update (if necessary)
135 | sudo yum --disableplugin=fastestmirror -y install gd libxslt libxml2
136 | ```
137 |
138 |
139 |
140 | # Basic supervisor for the MAJA processor
141 |
142 | The basic supervisor **start_maja** makes it possible to successively process all the files of a Sentinel-2 time series for a given tile, stored in a folder. The initialisation of the time series is performed with the "backward mode", and then all the dates are processed in "nominal" mode. The backward mode takes much more time than the nominal mode. On my computer, which is a fast one, the nominal mode takes 15 minutes and the backward mode takes almost one hour. No control is done on the outputs, and the tool does not check whether the time elapsed between two successive input products is too long and would require restarting the initialisation in backward mode.
143 |
144 |
145 | To use start_maja.py, you will need to configure the directories within the folders.txt file.
146 |
147 | ## Download Sentinel-2 data
148 | The use of peps_download.py to download Sentinel-2 L1C products is recommended:
149 | https://github.com/olivierhagolle/peps_download
150 |
151 |
152 | ## Parameters
153 | The tool needs a lot of configuration files, which are provided in two directories, "userconf" and "GIPP_S2AS2B". I tend to never change the "userconf", but GIPP_S2AS2B contains the parameters and look-up tables, which you might want to change. Most of the parameters lie within the L2COMM file. When I want to test different sets of parameters, I create a new GIPP folder, which I name GIPP_context, where *context* is passed as a parameter on the command line with the option -c.
154 |
155 | We provide two sets of parameters, one to work without CAMS data and one to work with CAMS data. The latter needs a lot of disk space (~1.5 GB), as the LUTs are provided not for one aerosol type only, but for 5 aerosol types and 6 water vapour contents. As Github limits the repository size to 1 GB, we are using a gitlab repository to distribute the parameters (GIPP):
156 | - Parameters without CAMS : http://tully.ups-tlse.fr/olivier/gipp/tree/master/GIPP_MAJA_3_1_S2AS2B_MUSCATE_TM
157 | - Parameters with CAMS: http://tully.ups-tlse.fr/olivier/gipp/tree/master/GIPP_MAJA_3_1_S2AS2B_CAMS (but we don't recommend using it)
158 | The look-up tables are too big to be put on our gitlab server; you will have to download them following the link in the GIPP readme file. (I know, it's a bit complicated)
159 |
160 | ## Folder structure
161 | To run MAJA, you need to store all the necessary data in an input folder. Here is an example of its content in nominal mode.
162 |
163 | ```
164 | S2A_MSIL1C_20180316T103021_N0206_R108_T32TMR_20180316T123927.SAFE
165 | S2A_TEST_GIP_CKEXTL_S_31TJF____10001_20150703_21000101.EEF
166 | S2A_TEST_GIP_CKQLTL_S_31TJF____10005_20150703_21000101.EEF
167 | S2A_TEST_GIP_L2ALBD_L_CONTINEN_10005_20150703_21000101.DBL.DIR
168 | S2A_TEST_GIP_L2ALBD_L_CONTINEN_10005_20150703_21000101.HDR
169 | S2A_TEST_GIP_L2COMM_L_ALLSITES_10008_20150703_21000101.EEF
170 | S2A_TEST_GIP_L2DIFT_L_CONTINEN_10005_20150703_21000101.DBL.DIR
171 | S2A_TEST_GIP_L2DIFT_L_CONTINEN_10005_20150703_21000101.HDR
172 | S2A_TEST_GIP_L2DIRT_L_CONTINEN_10005_20150703_21000101.DBL.DIR
173 | S2A_TEST_GIP_L2DIRT_L_CONTINEN_10005_20150703_21000101.HDR
174 | S2A_TEST_GIP_L2SMAC_L_ALLSITES_10005_20150703_21000101.EEF
175 | S2A_TEST_GIP_L2TOCR_L_CONTINEN_10005_20150703_21000101.DBL.DIR
176 | S2A_TEST_GIP_L2TOCR_L_CONTINEN_10005_20150703_21000101.HDR
177 | S2A_TEST_GIP_L2WATV_L_CONTINEN_10005_20150703_21000101.DBL.DIR
178 | S2A_TEST_GIP_L2WATV_L_CONTINEN_10005_20150703_21000101.HDR
179 | S2B_OPER_SSC_L2VALD_32TMR____20180308.DBL.DIR
180 | S2B_OPER_SSC_L2VALD_32TMR____20180308.HDR
181 | S2B_TEST_GIP_CKEXTL_S_31TJF____10001_20150703_21000101.EEF
182 | S2B_TEST_GIP_CKQLTL_S_31TJF____10005_20150703_21000101.EEF
183 | S2B_TEST_GIP_L2ALBD_L_CONTINEN_10003_20150703_21000101.DBL.DIR
184 | S2B_TEST_GIP_L2ALBD_L_CONTINEN_10003_20150703_21000101.HDR
185 | S2B_TEST_GIP_L2COMM_L_ALLSITES_10008_20150703_21000101.EEF
186 | S2B_TEST_GIP_L2DIFT_L_CONTINEN_10002_20150703_21000101.DBL.DIR
187 | S2B_TEST_GIP_L2DIFT_L_CONTINEN_10002_20150703_21000101.HDR
188 | S2B_TEST_GIP_L2DIRT_L_CONTINEN_10002_20150703_21000101.DBL.DIR
189 | S2B_TEST_GIP_L2DIRT_L_CONTINEN_10002_20150703_21000101.HDR
190 | S2B_TEST_GIP_L2SMAC_L_ALLSITES_10005_20150703_21000101.EEF
191 | S2B_TEST_GIP_L2TOCR_L_CONTINEN_10002_20150703_21000101.DBL.DIR
192 | S2B_TEST_GIP_L2TOCR_L_CONTINEN_10002_20150703_21000101.HDR
193 | S2B_TEST_GIP_L2WATV_L_CONTINEN_10005_20150703_21000101.DBL.DIR
194 | S2B_TEST_GIP_L2WATV_L_CONTINEN_10005_20150703_21000101.HDR
195 | S2__TEST_AUX_REFDE2_T32TMR_0001.DBL.DIR
196 | S2__TEST_AUX_REFDE2_T32TMR_0001.HDR
197 | S2__TEST_GIP_L2SITE_S_31TJF____10001_00000000_99999999.EEF
198 | ```
199 |
200 | The .SAFE file is the input product. The L2VALD files are the L2A product, i.e. the result of a previous execution of MAJA. The GIP files are parameter files for S2A and S2B, which you will find in this repository. The REFDE2 files are the DTM files. How to obtain them is explained below.
201 |
202 | A "userconf" folder is also necessary; it too is provided in this repository.
203 |
204 |
205 |
206 | ## DTM
207 | A DTM folder is needed to process data with MAJA. Of course, it depends on the tile you want to process. This DTM must be stored in the DTM folder, which is defined within the code. A tool exists to create this DTM; [it is available in the "prepare_mnt" folder](https://github.com/olivierhagolle/Start_maja/tree/master/prepare_mnt).
208 |
209 | An example of a DTM file is available here for tile 31TFJ in Provence, France, near Avignon. Both files should be placed in a folder named DTM/S2__TEST_AUX_REFDE2_T31TFJ_0001 in the start_maja directory.
210 |
211 | http://osr-cesbio.ups-tlse.fr/echangeswww/majadata//S2__TEST_AUX_REFDE2_T31TFJ_0001.DBL
212 |
213 | http://osr-cesbio.ups-tlse.fr/echangeswww/majadata//S2__TEST_AUX_REFDE2_T31TFJ_0001.HDR
214 |
215 | The DBL file is a tar file (I am innocent of this choice...) that can be opened with `tar xvf `. MAJA can use either the archived or the un-archived version. The tool above provides the un-archived version (DBL.DIR).
216 |
217 | ## CAMS
218 | If you intend to use data from the Copernicus Atmosphere Monitoring Service (CAMS), which we use to get information on the aerosol type, you will need to download the CAMS data. A download tool is provided [in the cams_download directory of this repository](https://github.com/olivierhagolle/Start_maja/tree/master/cams_download)
219 |
220 |
221 |
222 |
223 | # Example workflow
224 |
225 | Here is how to process a set of data over tile 31TFJ, near Avignon in Provence, France. To process any other tile, you will need to prepare the DTM and store the data in the DTM folder.
226 |
227 | ## Install
228 |
229 | - Install MAJA
230 |
231 | - Clone the current repository to get start_maja.py
232 | `git clone https://github.com/olivierhagolle/Start_maja`
233 |
234 |
235 |
236 |
237 | ## Retrieve Sentinel-2 L1C data
238 | - For instance, with peps_download.py (you need to have registered at https://peps.cnes.fr and to store the account and password in the peps.txt file).
239 |
240 | `python ./peps_download.py -c S2ST -l 'Avignon' -a peps.txt -d 2017-01-01 -f 2017-04-01 -w /path/to/L1C_DATA/Avignon`
241 |
242 | - I tend to store the data per site. A given site can contain several tiles. All the L1C tiles corresponding to a site are stored in a directory named /path/to/L1C_DATA/Site
243 |
244 | - Unzip the L1C files in /path/to/L1C_DATA/Avignon
245 |
246 | ## Create the DTM
247 | Follow the DTM generation instructions: http://tully.ups-tlse.fr/olivier/prepare_mnt
248 |
249 | ## Download CAMS data
250 | Follow the cams_download tool instructions: https://github.com/olivierhagolle/Start_maja/tree/master/cams_download
251 |
252 | ## Execute start_maja.py
253 |
254 | - To use the start_maja script, you need to configure the directories within the folders.txt file.
255 | Here is my own configuration, also provided in the folders.txt file in this repository.
256 | ```
257 | repCode=/mnt/data/home/hagolleo/PROG/S2/lance_maja
258 | repWork=/mnt/data/SENTINEL2/MAJA
259 | repL1 =/mnt/data/SENTINEL2/L1C_PDGS
260 | repL2 =/mnt/data/SENTINEL2/L2A_MAJA
261 | repMaja=/mnt/data/home/hagolleo/Install-MAJA/maja/core/1.0/bin/maja
262 | repCAMS =/mnt/data/SENTINEL2/CAMS
263 | ```
264 | - repCode is where start_maja.py is stored, together with the DTM, userconf and GIPP directories
265 | - repWork is a directory to store the temporary files
266 | - repL1 is where to find the L1C data (without the site name, which is added afterwards)
267 | - The SAFE products must therefore be stored at the following location: repL1/site
268 | - repL2 is for the L2A data (without the site name, which is added afterwards)
269 | - repMaja is where the MAJA binary code is
270 | - repCAMS is where the CAMS data are stored
271 |
272 |
273 |
274 | Here is an example of a command line:
275 | ```
276 | Usage   : python ./start_maja.py -f <folders file> -c <context> -t <tile> -s <site> -d <start date>
277 | Example : python ./start_maja.py -f folders.txt -c MAJA_3_0_S2AS2B_CAMS -t 31TFJ -s Avignon -d 20170101 -e 20180101
278 | ```
279 | Description of the command line options:
280 | * -f provides the folders filename
281 | * -c is the context; MAJA uses the GIPP files contained in the GIPP_context directory. The L2A products will be created in
282 | rep_L2/Site/Tile/Context (Several users told me it is weird to use the GIPP folder name after removing GIPP_, I should change that)
283 | * -t is the tile number
284 | * -s is the site name
285 | * -d (yyyymmdd) is the first date to process within the time series
286 | * -e (yyyymmdd) is the last date to process within the time series
287 | * -z directly uses zipped L1C files
288 |
289 | Caution: *when a product has more than 90% of clouds, the L2A is not issued*. However, a folder with _NOTVALD_ is created.
290 |
291 | ## Known errors
292 |
293 |
294 | Some Sentinel-2 L1C products lack the angle information required by MAJA. In this case, MAJA stops processing with an error message. This causes issues, particularly in backward mode. These products were acquired in February and March 2016 and have not been reprocessed by ESA (despite repeated requests from my side). You should remove them from the folder which contains the list of L1C products to process.
295 |
296 |
297 |
298 | # Docker
299 |
300 | Dániel Kristóf provided us with a Dockerfile (thank you Dániel) which, on any Linux system, retrieves the CentOS system, installs what is necessary and configures MAJA. I am really not a Docker expert, and when I tried, our lab system engineer immediately told me that there are some security issues with Docker and that I should not install it like that... So, I never tested it.
301 |
302 | But if we follow Dániel's guidelines:
303 |
304 | - First, download the test data set and store it in ~/MAJA/S2_NOMINAL
305 | - Then configure the folders.txt file according to your configuration
306 | - Then:
307 | ```
308 | sudo docker build -t maja .
309 |
310 | (or behind a proxy)
311 | sudo docker build -t maja --build-arg http_proxy=$http_proxy --build-arg https_proxy=$https_proxy --build-arg ftp_proxy=$ftp_proxy .
312 | ```
313 | And then, you may run MAJA with the test data sets with
314 | ```
315 |
316 | ```
317 |
318 |
319 | ## References
320 | 1: A multi-temporal method for cloud detection, applied to FORMOSAT-2, VENµS, LANDSAT and SENTINEL-2 images, O. Hagolle, M. Huc, D. Villa Pascual, G. Dedieu, Remote Sensing of Environment 114 (8), 1747-1755
321 |
322 | 2: Correction of aerosol effects on multi-temporal images acquired with constant viewing angles: Application to Formosat-2 images, O. Hagolle, G. Dedieu, B. Mougenot, V. Debaecker, B. Duchemin, A. Meygret, Remote Sensing of Environment 112 (4), 1689-1701
323 |
324 | 3: A Multi-Temporal and Multi-Spectral Method to Estimate Aerosol Optical Thickness over Land, for the Atmospheric Correction of FormoSat-2, LandSat, VENμS and Sentinel-2 Images, O. Hagolle, M. Huc, D. Villa Pascual, G. Dedieu, Remote Sensing 7 (3), 2668-2691
325 |
326 | 4: MAJA's ATBD, O. Hagolle, M. Huc, C. Desjardins, S. Auer, R. Richter, https://doi.org/10.5281/zenodo.1209633
327 |
328 |
329 |
330 |
331 |
332 |
--------------------------------------------------------------------------------
/cams_download/README.md:
--------------------------------------------------------------------------------
1 | # download_CAMS_daily
2 |
3 | This tool is designed to download daily CAMS near-real-time forecast products from ECMWF. Alternatively, the graphical interface provided by ECMWF is available here: http://apps.ecmwf.int/datasets/data/macc-nrealtime/levtype=sfc/
4 |
5 | The tool downloads several types of fields:
6 | - the Aerosol Optical Thickness (AOT) for 5 types of aerosols, stored in the AOT file
7 | - the mixing ratio of each aerosol type as a function of altitude (expressed as model levels), stored in the MR file
8 | - the Relative Humidity as a function of altitude (expressed as pressure levels), stored in the RH file.
9 |
10 | These files are currently available twice a day, and some of the parameters are only available as a 3-hour forecast, not an analysis. The files are downloaded in netCDF format and look like this:
11 | ```
12 | CAMS_AOT_yyyymmdd_UTC_hhmm.nc
13 | CAMS_MR_yyyymmdd_UTC_hhmm.nc
14 | CAMS_RH_yyyymmdd_UTC_hhmm.nc
15 | ```
16 |
17 | The files are then converted into one archive using the Earth Explorer format, which was selected as the standard format for MAJA external data. We have an HDR XML file and a bzipped DBL archive, which contains the three files above for a given time and date. (I regret this choice, which complicates the use of the data within MAJA compared to using the plain netCDF files, but it is the current situation.)
18 |
19 |
20 |
21 | # Configuration
22 |
23 | - Create an ECMWF account: https://apps.ecmwf.int/registration/
24 |
25 | - The folder ecmwfapi/ and the file setup.py must be in the working directory.
26 |
27 | - Get your API key at: https://api.ecmwf.int/v1/key/
28 |
29 | - Create the file '.ecmwfapirc' in your home directory and type:
30 | ```
31 | {
32 | "url" : "https://api.ecmwf.int/v1",
33 | "key" : "your_key",
34 | "email" : "your_email_address"
35 | }
36 | ```
37 |
38 | # Example
39 |
40 | `python download_CAMS_daily.py -d 20180101 -f 20180102 -w /path/to/my/CAMSfolderNetCDF -a /path/to/my/CAMSfolderNetDBL`
41 |
42 |
43 | # Parameters
44 |
45 | The user can choose the following parameters with the command line:
46 |
47 | - d, f: min/max range of dates to download
48 |
49 | - w: path to the folder where the netCDF files are stored (these can be considered temporary files)
50 |
51 | - a: path to the folder where the DBL/HDR files are stored
52 | - k: keep the netCDF files
53 |
54 |
55 | Other parameters can be changed within the code:
56 |
57 | - step: forecast step in hours (default 3, the only value available so far)
58 |
59 | - type of file to download: surface, pressure or model (default all)
60 |
61 | - name of output files
62 |
63 | - grid: spatial resolution of downloaded files
64 |
65 | - format: grib or netcdf
66 |
67 | # Info on downloaded files
68 |
69 | - Surface files: Aerosol Optical Thickness (AOT) at 550 nm for the five aerosol models (BlackCarbon, Dust, Sulfate, SeaSalt and OrganicMatter).
70 | ```
71 | Variable name   Description
72 | suaod550        Sulphate Aerosol Optical Depth at 550nm
73 | bcaod550        BlackCarbon Aerosol Optical Depth at 550nm
74 | ssaod550        SeaSalt Aerosol Optical Depth at 550nm
75 | duaod550        Dust Aerosol Optical Depth at 550nm
76 | omaod550        OrganicMatter Aerosol Optical Depth at 550nm
77 | ```
78 |
79 | - Model files: mass Mixing Ratios (MR, expressed in kg of aerosol per kg of dry air [kg/kg]) of the aerosol models at 60 different altitude levels (model levels).
80 | ```
81 | Variable name   Description
82 | aermr01         Sea Salt Aerosol (0.03 - 0.5 um) Mixing Ratio
83 | aermr02         Sea Salt Aerosol (0.5 - 5 um) Mixing Ratio
84 | aermr03         Sea Salt Aerosol (5 - 20 um) Mixing Ratio
85 | aermr04         Dust Aerosol (0.03 - 0.55 um) Mixing Ratio
86 | aermr05         Dust Aerosol (0.55 - 0.9 um) Mixing Ratio
87 | aermr06         Dust Aerosol (0.9 - 20 um) Mixing Ratio
88 | aermr07         Hydrophobic Organic Matter Aerosol Mixing Ratio
89 | aermr08         Hydrophilic Organic Matter Aerosol Mixing Ratio
90 | aermr09         Hydrophobic Black Carbon Aerosol Mixing Ratio
91 | aermr10         Hydrophilic Black Carbon Aerosol Mixing Ratio
92 | aermr11         Sulphate Aerosol Mixing Ratio
93 | ```
94 |
95 | - Pressure files: Relative Humidity (RH, expressed in %) at 22 different altitude levels (pressure levels).
96 | ```
97 | Variable name   Description
98 | r               Relative Humidity
99 | ```
100 |
101 | Information about these variables can also be found at http://atmosphere.copernicus.eu/ftp-access-global-data under the “Forecast surface parameters” and “Forecast model level parameters” sections, or at http://apps.ecmwf.int/codes/grib/param-db/
--------------------------------------------------------------------------------
/cams_download/download_CAMS_daily.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- coding: utf-8 -*-
3 | #
4 | # Downloads the CAMS data necessary for MAJA and converts them into MAJA's input format
5 | # Written by B. Rouquie, O. Hagolle, CESBIO
6 | #
7 | #
8 | #
9 | import os
10 | from ecmwfapi import ECMWFDataServer
11 | from convert_to_exo import process_one_file
12 | import datetime
13 | import timeit
14 | import optparse
15 | import sys
16 |
17 | server = ECMWFDataServer()
18 | ###########################################################################
19 | class OptionParser (optparse.OptionParser):
20 |
21 |     def check_required (self, opt):
22 |         option = self.get_option(opt)
23 |
24 |         # Assumes the option's 'default' is set to None!
25 |         if getattr(self.values, option.dest) is None:
26 |             self.error("%s option not supplied" % option)
27 |
28 | ###########################################################################
29 | def download_files(dt,file_type,time,step,path_out):
30 |
31 |
32 |     date_courante = str(dt.year)+'%02d'%(dt.month)+'%02d'%(dt.day)
33 |     print '\nCurrent_date =',date_courante
34 |
35 |     if file_type['surface'] == True:
36 |         #=================
37 |         # Surface
38 |         # Retrieves the AOT at 550 nm for BC, SS, SU, DU, OM
39 |         #=================
40 |         nom_AOT = path_out + "/CAMS_AOT_" + date_courante + 'UTC' + str(int(time)+int(step)).zfill(2) + '0000.nc'
41 |         print 'AOT output file name:',nom_AOT
42 |
43 |         server.retrieve({
44 |             'stream'  : "oper",
45 |             'class'   : "mc",
46 |             'dataset' : "cams_nrealtime",
47 |             'expver'  : '0001',
48 |             'step'    : step,
49 |             'levtype' : "SFC",
50 |             'date'    : date_courante,
51 |             'time'    : time,
52 |             'type'    : "fc",
53 |             'param'   : "208.210/209.210/210.210/211.210/212.210",
54 |             'area'    : "G",
55 |             'grid'    : "1.25/1.25",
56 |             'format'  : "netcdf",
57 |             'target'  : nom_AOT
58 |         })
59 |         # 208.210/209.210/210.210/211.210/212.210 : AOT at 550nm for BC, SS, OM, SU, DU
60 |
61 |     if file_type['pressure'] == True:
62 |         #=========================
63 |         # Pressure levels
64 |         #
65 |         # Retrieves the Relative Humidity (RH)
66 |         #=========================
67 |         nom_RH = path_out + "/CAMS_RH_" + date_courante + 'UTC' + str(int(time)+int(step)).zfill(2) + '0000.nc'
68 |         print 'RH output file name:',nom_RH
69 |
70 |         server.retrieve({
71 |             'stream'  : "oper",
72 |             'class'   : "mc",
73 |             'dataset' : "cams_nrealtime",
74 |             'expver'  : "0001",
75 |             'step'    : step,
76 |             'levtype' : "pl",
77 |             "levelist": "1/2/3/5/7/10/20/30/50/70/100/150/200/250/300/400/500/600/700/850/925/1000",
78 |             'date'    : date_courante,
79 |             'time'    : time,
80 |             'type'    : "fc",
81 |             'param'   : "157.128",
82 |             'area'    : "G",
83 |             'grid'    : "1.25/1.25",
84 |             'format'  : "netcdf",
85 |             'target'  : nom_RH
86 |         })
87 |
88 |     if file_type['model'] == True:
89 |         #=========================
90 |         # Model levels
91 |         #
92 |         # Retrieves the mixing ratios: 3 DUST bins, 3 SEASALT bins, hydrophilic and hydrophobic ORGANICMATTER, hydrophilic and hydrophobic BLACKCARBON, and SULFATE.
93 |         #=========================
94 |         nom_MR = path_out + "/CAMS_MR_" + date_courante + 'UTC' + str(int(time)+int(step)).zfill(2) + '0000.nc'
95 |         print 'Mixing ratios output file name:',nom_MR
96 |
97 |         server.retrieve({
98 |             'stream'  : "oper",
99 |             'class'   : "mc",
100 |             'dataset' : "cams_nrealtime",
101 |             'expver'  : "0001",
102 |             'step'    : step,
103 |             'levtype' : "ml",
104 |             "levelist": "1/to/60",
105 |             'date'    : date_courante,
106 |             'time'    : time,
107 |             'type'    : "fc",
108 |             'param'   : "1.210/2.210/3.210/4.210/5.210/6.210/7.210/8.210/9.210/10.210/11.210",
109 |             'area'    : "G",
110 |             'grid'    : "1.25/1.25",
111 |             'format'  : "netcdf",
112 |             'target'  : nom_MR
113 |         })
114 |     return nom_AOT, nom_RH, nom_MR
115 |
116 |
117 | #==============
118 | # MAIN
119 | #==============
120 |
121 | #===========================
122 | # ORDERS OF MAGNITUDE
123 | #===========================
124 | # 1 Surface file  = 0.8 MB
125 | # 1 Pressure file = 1.7 MB
126 | # 1 Model file    = 53 MB
127 | #
128 | # => 20 GB per year (40 GB per year with the two forecasts (midnight and noon))
129 | #
130 |
131 | #=============================
132 | # AVAILABLE AEROSOLS
133 | #=============================
134 | #BC = BlackCarbon
135 | #SS = SeaSalt
136 | #SU = Sulfate
137 | #DU = Dust
138 | #OM = OrganicMatter
139 |
140 |
141 |
142 | #==================
143 | #parse command line
144 | #==================
145 | if len(sys.argv) == 1:
146 |     prog = os.path.basename(sys.argv[0])
147 |     print ' '+sys.argv[0]+' [options]'
148 |     print "    Help: ", prog, " --help"
149 |     print "      or: ", prog, " -h"
150 |     print "example: python %s -d 20171101 -f 20171201 -a /mnt/data/DONNEES_AUX/CAMS_DBL/ -w /mnt/data/DONNEES_AUX/CAMS_TMP/ "%prog
151 |     sys.exit(-1)
152 | else:
153 |     usage = "usage: %prog [options] "
154 |     parser = OptionParser(usage=usage)
155 |     parser.add_option("-d", "--start_date", dest="start_date", action="store", type="string", \
156 |                       help="start date, fmt('20171101')",default=None)
157 |     parser.add_option("-f","--end_date", dest="end_date", action="store", type="string", \
158 |                       help="end date, fmt('20171201')",default=None)
159 |     parser.add_option("-w","--write_dir", dest="write_dir", action="store",type="string", \
160 |                       help="Path where the products should be downloaded")
161 |     parser.add_option("-a","--archive_dir", dest="archive_dir", action="store",type="string", \
162 |                       help="Path where the archive DBL files are stored")
163 |     parser.add_option("-k","--keep", dest="keep", action="store_true", \
164 |                       help="keep netcdf files",default=False)
165 |
166 |     #parser.add_option("-t", "--time",dest="time", action="store", type="choice", \
167 |     #                  choices=['00','12'],help="Time of forecast (Currently '00'or'12)",default='00')
168 |
169 |     (options, args) = parser.parse_args()
170 |     parser.check_required("-d")
171 |     parser.check_required("-f")
172 |     parser.check_required("-a")
173 |     parser.check_required("-w")
174 |
175 |
176 |
177 | #====================
178 | # PARAMETERS
179 | #====================
180 |
181 |
182 | # Create the date objects
183 | dt1 = datetime.datetime.strptime(options.start_date,'%Y%m%d')
184 | dt2 = datetime.datetime.strptime(options.end_date,'%Y%m%d')
185 |
186 | nb_days = (dt2-dt1).days + 1
187 | print '\nNumber of days =',nb_days
188 |
189 | # Time of the desired analysis
190 | # Two possibilities:
191 | #  - 00:00:00 UTC (midnight)
192 | #  - 12:00:00 UTC (noon)
193 |
194 | time = ["00","12"]
195 |
196 | # Desired forecast step
197 | # step = 3 means that we download the forecasts 3 hours after the analysis time.
198 | # Examples: time = 00 and step = 3 => 03:00:00 UTC
199 | #           time = 12 and step = 3 => 15:00:00 UTC
200 | step = "3"
201 |
202 | # Output path
203 | path_out = options.write_dir
204 |
205 | # Choice of the files to download
206 | # Surface  : AOT (aerosol optical thickness)
207 | # Pressure : RH (relative humidity)
208 | # Model    : MR (mixing ratios)
209 | file_type={'surface':True,'pressure':True,'model':True}
210 |
211 |
212 |
213 | # Loop over the days to download
214 | for i in range(nb_days):
215 |     dt = dt1 + datetime.timedelta(days=i)
216 |     print "=================================="
217 |     print "Downloading files for date %s"%dt
218 |     print "=================================="
219 |     for t in range(len(time)):
220 |         (nom_AOT,nom_RH,nom_MR)=download_files(dt,file_type,time[t],step,path_out)
221 |         #conversion to MAJA DBL/HDR format
222 |         process_one_file(nom_AOT, nom_MR, nom_RH, path_out, options.archive_dir)
223 |         if not(options.keep):
224 |             os.remove(nom_AOT)
225 |             os.remove(nom_MR)
226 |             os.remove(nom_RH)
227 |
228 |
--------------------------------------------------------------------------------
/cams_download/ecmwfapi/__init__.py:
--------------------------------------------------------------------------------
1 | """Python client for ECMWF web services API."""
2 | #
3 | # (C) Copyright 2012-2013 ECMWF.
4 | #
5 | # This software is licensed under the terms of the Apache Licence Version 2.0
6 | # which can be obtained at http://www.apache.org/licenses/LICENSE-2.0.
7 | # In applying this licence, ECMWF does not waive the privileges and immunities
8 | # granted to it by virtue of its status as an intergovernmental organisation nor
9 | # does it submit to any jurisdiction.
10 | #
11 |
12 | from ecmwfapi import api
13 |
14 |
15 | __all__ = [
16 |     "ECMWFDataServer",
17 |     "ECMWFService",
18 | ]
19 |
20 |
21 | __version__ = api.VERSION
22 |
23 | ECMWFDataServer = api.ECMWFDataServer
24 |
25 | ECMWFService = api.ECMWFService
--------------------------------------------------------------------------------
/cams_download/ecmwfapi/api.py:
--------------------------------------------------------------------------------
1 | #
2 | # (C) Copyright 2012-2013 ECMWF.
3 | #
4 | # This software is licensed under the terms of the Apache Licence Version 2.0
5 | # which can be obtained at http://www.apache.org/licenses/LICENSE-2.0.
6 | # In applying this licence, ECMWF does not waive the privileges and immunities
7 | # granted to it by virtue of its status as an intergovernmental organisation nor
8 | # does it submit to any jurisdiction.
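#
# Illustrative usage of this client (a sketch mirroring download_CAMS_daily.py
# in this repository; the request dictionary is abbreviated, not a full request):
#
#     from ecmwfapi import ECMWFDataServer
#     server = ECMWFDataServer()  # credentials read from ~/.ecmwfapirc or the ECMWF_API_* environment variables
#     server.retrieve({'dataset': "cams_nrealtime", ..., 'target': "out.nc"})
#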
9 | 10 | # make the python3-like print behave in python 2 11 | from __future__ import print_function 12 | 13 | import os 14 | import sys 15 | import time 16 | import traceback 17 | from contextlib import closing 18 | 19 | # python 2 and 3 compatible urllib and httplib imports 20 | try: 21 | from urllib.parse import urlparse 22 | from urllib.parse import urljoin 23 | from urllib.error import HTTPError, URLError 24 | from urllib.request import HTTPRedirectHandler, Request, build_opener, urlopen, addinfourl 25 | from http.client import BadStatusLine 26 | except ImportError: 27 | from urlparse import urlparse 28 | from urlparse import urljoin 29 | from urllib2 import HTTPError, URLError 30 | from urllib2 import HTTPRedirectHandler, Request, build_opener, urlopen, addinfourl 31 | from httplib import BadStatusLine 32 | 33 | try: 34 | import json 35 | except ImportError: 36 | import simplejson as json 37 | 38 | try: 39 | import ssl 40 | except ImportError: 41 | print("Python socket module was not compiled with SSL support. Aborting...") 42 | sys.exit(1) 43 | 44 | 45 | ############################################################################### 46 | VERSION = '1.5.0' 47 | 48 | ############################################################################### 49 | 50 | 51 | class APIKeyFetchError(Exception): 52 | pass 53 | 54 | 55 | def _get_apikey_from_environ(): 56 | try: 57 | key = os.environ["ECMWF_API_KEY"] 58 | url = os.environ["ECMWF_API_URL"] 59 | email = os.environ["ECMWF_API_EMAIL"] 60 | return key, url, email 61 | except KeyError: 62 | raise APIKeyFetchError("ERROR: Could not get the API key from the environment") 63 | 64 | 65 | def _get_apikey_from_rcfile(): 66 | rc = os.path.normpath(os.path.expanduser("~/.ecmwfapirc")) 67 | 68 | try: 69 | with open(rc) as f: 70 | config = json.load(f) 71 | except IOError as e: # Failed reading from file 72 | raise APIKeyFetchError(str(e)) 73 | except ValueError: # JSON decoding failed 74 | raise APIKeyFetchError("ERROR: Missing or malformed API key in '%s'" % rc) 75 | except Exception as e: # Unexpected error 76 | raise APIKeyFetchError(str(e)) 77 | 78 | try: 79 | key = config["key"] 80 | url = config["url"] 81 | email = config["email"] 82 | return key, url, email 83 | except: 84 | raise APIKeyFetchError("ERROR: Missing or malformed API key in '%s'" % rc) 85 | 86 | 87 | def get_apikey_values(): 88 | """Get the API key from the environment or the '.ecmwfapirc' file. 89 | 90 | The environment is looked at first. 91 | 92 | Returns: 93 | Tuple with the key, url, and email forming our API key. 94 | 95 | Raises: 96 | APIKeyFetchError: When unable to get the API key from either the 97 | environment or the ecmwfapirc file. 
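    Example (illustrative usage, assuming a valid API key is configured):

        key, url, email = get_apikey_values()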
98 | """ 99 | try: 100 | key_values = _get_apikey_from_environ() 101 | except APIKeyFetchError: 102 | try: 103 | key_values = _get_apikey_from_rcfile() 104 | except APIKeyFetchError: 105 | raise 106 | 107 | return key_values 108 | 109 | 110 | ############################################################################### 111 | 112 | 113 | class RetryError(Exception): 114 | 115 | def __init__(self, code, text): 116 | self.code = code 117 | self.text = text 118 | 119 | def __str__(self): 120 | return "%d %s" % (self.code, self.text) 121 | 122 | 123 | class APIException(Exception): 124 | 125 | def __init__(self, value): 126 | self.value = value 127 | 128 | def __str__(self): 129 | return repr(self.value) 130 | 131 | 132 | def robust(func): 133 | 134 | def wrapped(self, *args, **kwargs): 135 | max_tries = tries = 10 136 | delay = 60 # retry delay 137 | last_error = None 138 | while tries > 0: 139 | try: 140 | return func(self, *args, **kwargs) 141 | except HTTPError as e: 142 | if self.verbose: 143 | print("WARNING: HTTPError received %s" % (e)) 144 | if e.code < 500 or e.code in (501,): # 501: not implemented 145 | raise 146 | last_error = e 147 | except BadStatusLine as e: 148 | if self.verbose: 149 | print("WARNING: BadStatusLine received %s" % (e)) 150 | last_error = e 151 | except URLError as e: 152 | if self.verbose: 153 | print("WARNING: URLError received %s %s" % (e.errno, e)) 154 | last_error = e 155 | except APIException: 156 | raise 157 | except RetryError as e: 158 | if self.verbose: 159 | print("WARNING: HTTP received %s" % (e.code)) 160 | print(e.text) 161 | last_error = e 162 | except: 163 | if self.verbose: 164 | print("Unexpected error:", sys.exc_info()[0]) 165 | print(traceback.format_exc()) 166 | raise 167 | print("Error contacting the WebAPI, retrying in %d seconds ..." % delay) 168 | time.sleep(delay) 169 | tries -= 1 170 | # if all retries have been exhausted, raise the last exception caught 171 | print("Could not contact the WebAPI after %d tries, failing !" 
% max_tries) 172 | raise last_error 173 | 174 | return wrapped 175 | 176 | 177 | def get_api_url(url): 178 | parsed_uri = urlparse(url) 179 | return '{uri.scheme}://{uri.netloc}/{apiver}/'.format( 180 | uri=parsed_uri, apiver=parsed_uri.path.split('/')[1]) 181 | 182 | 183 | SAY = True 184 | 185 | 186 | class Ignore303(HTTPRedirectHandler): 187 | 188 | def redirect_request(self, req, fp, code, msg, headers, newurl): 189 | if code in [301, 302]: 190 | # We want the posts to work even if we are redirected 191 | if code == 301: 192 | global SAY 193 | if SAY: 194 | o = req.get_full_url() 195 | n = newurl 196 | print() 197 | print("*** ECMWF API has moved") 198 | print("*** OLD: %s" % get_api_url(o)) 199 | print("*** NEW: %s" % get_api_url(n)) 200 | print("*** Please update your ~/.ecmwfapirc file") 201 | print() 202 | SAY = False 203 | 204 | try: 205 | # Python < 3.4 206 | data = req.get_data() 207 | except AttributeError: 208 | # Python >= 3.4 209 | data = req.data 210 | 211 | try: 212 | # Python < 3.4 213 | origin_req_host = req.get_origin_req_host() 214 | except AttributeError: 215 | # Python >= 3.4 216 | origin_req_host = req.origin_req_host 217 | 218 | return Request(newurl, 219 | data=data, 220 | headers=req.headers, 221 | origin_req_host=origin_req_host, 222 | unverifiable=True) 223 | return None 224 | 225 | def http_error_303(self, req, fp, code, msg, headers): 226 | infourl = addinfourl(fp, headers, req.get_full_url()) 227 | infourl.status = code 228 | infourl.code = code 229 | return infourl 230 | 231 | 232 | class Connection(object): 233 | 234 | def __init__(self, url, email=None, key=None, verbose=False, quiet=False): 235 | self.url = url 236 | self.email = email 237 | self.key = key 238 | self.retry = 5 239 | self.location = None 240 | self.done = False 241 | self.value = True 242 | self.offset = 0 243 | self.verbose = verbose 244 | self.quiet = quiet 245 | self.status = None 246 | 247 | @robust 248 | def call(self, url, payload=None, method="GET"): 249 | 250 | # Ensure full url 251 | url = urljoin(self.url, url) 252 | 253 | if self.verbose: 254 | print(method, url) 255 | 256 | headers = {"Accept": "application/json", "From": self.email, "X-ECMWF-KEY": self.key} 257 | 258 | opener = build_opener(Ignore303) 259 | 260 | data = None 261 | if payload is not None: 262 | data = json.dumps(payload).encode('utf-8') 263 | headers["Content-Type"] = "application/json" 264 | 265 | url = "%s?offset=%d&limit=500" % (url, self.offset) 266 | req = Request(url=url, data=data, headers=headers) 267 | if method: 268 | req.get_method = lambda: method 269 | 270 | error = False 271 | try: 272 | try: 273 | res = opener.open(req) 274 | except HTTPError as e: 275 | # It seems that some version of urllib2 are buggy 276 | if e.code <= 299: 277 | res = e 278 | else: 279 | raise 280 | except HTTPError as e: 281 | if self.verbose: 282 | print(e) 283 | error = True 284 | res = e 285 | # 429: Too many requests 286 | # 502: Proxy Error 287 | # 503: Service Temporarily Unavailable 288 | if e.code == 429 or e.code >= 500: 289 | raise RetryError(e.code, e.read()) 290 | 291 | self.retry = int(res.headers.get("Retry-After", self.retry)) 292 | code = res.code 293 | if code in [201, 202]: 294 | self.location = urljoin(url, res.headers.get("Location", self.location)) 295 | 296 | if self.verbose: 297 | print("Code", code) 298 | print("Content-Type", res.headers.get("Content-Type")) 299 | print("Content-Length", res.headers.get("Content-Length")) 300 | print("Location", res.headers.get("Location")) 301 | 302 | body = 
res.read().decode("utf-8") 303 | res.close() 304 | 305 | if code in [204]: 306 | self.last = None 307 | return None 308 | else: 309 | try: 310 | self.last = json.loads(body) 311 | except Exception as e: 312 | self.last = {"error": "%s: %s" % (e, body)} 313 | error = True 314 | 315 | if self.verbose: 316 | print(json.dumps(self.last, indent=4)) 317 | 318 | self.status = self.last.get("status", self.status) 319 | 320 | if self.verbose: 321 | print("Status", self.status) 322 | 323 | if "messages" in self.last: 324 | for n in self.last["messages"]: 325 | if not self.quiet: 326 | print(n) 327 | self.offset += 1 328 | 329 | if code == 200 and self.status == "complete": 330 | self.value = self.last 331 | self.done = True 332 | if isinstance(self.value, dict) and "result" in self.value: 333 | self.value = self.value["result"] 334 | 335 | if code in [303]: 336 | self.value = self.last 337 | self.done = True 338 | 339 | if "error" in self.last: 340 | raise APIException("ecmwf.API error 1: %s" % (self.last["error"],)) 341 | 342 | if error: 343 | raise APIException("ecmwf.API error 2: %s" % (res, )) 344 | 345 | return self.last 346 | 347 | def submit(self, url, payload): 348 | self.call(url, payload, "POST") 349 | 350 | def POST(self, url, payload): 351 | return self.call(url, payload, "POST") 352 | 353 | def GET(self, url): 354 | return self.call(url, None, "GET") 355 | 356 | def wait(self): 357 | if self.verbose: 358 | print("Sleeping %s second(s)" % (self.retry)) 359 | time.sleep(self.retry) 360 | self.call(self.location, None, "GET") 361 | 362 | def ready(self): 363 | return self.done 364 | 365 | def result(self): 366 | return self.value 367 | 368 | def cleanup(self): 369 | try: 370 | if self.location: 371 | self.call(self.location, None, "DELETE") 372 | except: 373 | pass 374 | 375 | 376 | def no_log(msg): 377 | pass 378 | 379 | 380 | class APIRequest(object): 381 | 382 | def __init__(self, url, service, email=None, key=None, log=no_log, quiet=False, verbose=False, news=True): 383 | self.url = url 384 | self.service = service 385 | self.connection = Connection(url, email, key, quiet=quiet, verbose=verbose) 386 | self.log = log 387 | self.quiet = quiet 388 | self.verbose = verbose 389 | self.log("ECMWF API python library %s" % (VERSION,)) 390 | self.log("ECMWF API at %s" % (self.url,)) 391 | user = self.connection.call("%s/%s" % (self.url, "who-am-i")) 392 | self.log("Welcome %s" % (user["full_name"] or "user '%s'" % user["uid"],)) 393 | if news: 394 | try: 395 | news = self.connection.call("%s/%s/%s" % (self.url, self.service, "news")) 396 | for n in news["news"].split("\n"): 397 | self.log(n) 398 | except: 399 | pass 400 | 401 | def _bytename(self, size): 402 | prefix = {'': 'K', 'K': 'M', 'M': 'G', 'G': 'T', 'T': 'P', 'P': 'E'} 403 | l = '' 404 | size = size * 1.0 405 | while 1024 < size: 406 | l = prefix[l] 407 | size = size / 1024 408 | s = "" 409 | if size > 1: 410 | s = "s" 411 | return "%g %sbyte%s" % (size, l, s) 412 | 413 | @robust 414 | def _transfer(self, url, path, size): 415 | start = time.time() 416 | existing_size = 0 417 | req = Request(url) 418 | 419 | if os.path.exists(path): 420 | mode = "ab" 421 | existing_size = os.path.getsize(path) 422 | req.add_header("Range", "bytes=%s-" % existing_size) 423 | else: 424 | mode = "wb" 425 | 426 | self.log("Transfering %s into %s" % 427 | (self._bytename(size - existing_size), path)) 428 | self.log("From %s" % (url, )) 429 | 430 | bytes_transferred = 0 431 | with open(path, mode) as f: 432 | with closing(urlopen(req)) as http: 433 | while 
True: 434 | chunk = http.read(1048576) # 1MB chunks 435 | if not chunk: 436 | break 437 | f.write(chunk) 438 | bytes_transferred += len(chunk) 439 | 440 | end = time.time() 441 | 442 | if end > start: 443 | transfer_rate = bytes_transferred / (end - start) 444 | self.log("Transfer rate %s/s" % self._bytename(transfer_rate)) 445 | 446 | return existing_size + bytes_transferred 447 | 448 | def execute(self, request, target=None): 449 | 450 | status = None 451 | 452 | self.connection.submit("%s/%s/requests" % (self.url, self.service), request) 453 | self.log('Request submitted') 454 | self.log('Request id: ' + self.connection.last.get('name')) 455 | if self.connection.status != status: 456 | status = self.connection.status 457 | self.log("Request is %s" % (status, )) 458 | 459 | while not self.connection.ready(): 460 | if self.connection.status != status: 461 | status = self.connection.status 462 | self.log("Request is %s" % (status, )) 463 | self.connection.wait() 464 | 465 | if self.connection.status != status: 466 | status = self.connection.status 467 | self.log("Request is %s" % (status, )) 468 | 469 | result = self.connection.result() 470 | if target: 471 | if os.path.exists(target): 472 | # Empty the target file, if it already exists, otherwise the 473 | # transfer below might be fooled into thinking we're resuming 474 | # an interrupted download. 475 | open(target, "w").close() 476 | 477 | size = -1 478 | tries = 0 479 | while size != result["size"] and tries < 10: 480 | size = self._transfer(urljoin(self.url, result["href"]), target, result["size"]) 481 | if size != result["size"] and tries < 10: 482 | tries += 1 483 | self.log("Transfer interrupted, resuming in 60s...") 484 | time.sleep(60) 485 | else: 486 | break 487 | 488 | assert size == result["size"] 489 | 490 | self.connection.cleanup() 491 | 492 | return result 493 | 494 | 495 | ############################################################################### 496 | 497 | class ECMWFDataServer(object): 498 | 499 | def __init__(self, url=None, key=None, email=None, verbose=False, log=None): 500 | if url is None or key is None or email is None: 501 | key, url, email = get_apikey_values() 502 | 503 | self.url = url 504 | self.key = key 505 | self.email = email 506 | self.verbose = verbose 507 | self.log = log 508 | 509 | def trace(self, m): 510 | if self.log: 511 | self.log(m) 512 | else: 513 | t = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime()) 514 | print("%s %s" % (t, m,)) 515 | 516 | def retrieve(self, req): 517 | target = req.get("target") 518 | dataset = req.get("dataset") 519 | c = APIRequest(self.url, "datasets/%s" % (dataset,), self.email, self.key, self.trace, verbose=self.verbose) 520 | c.execute(req, target) 521 | 522 | ############################################################################### 523 | 524 | 525 | class ECMWFService(object): 526 | 527 | def __init__(self, service, url=None, key=None, email=None, verbose=False, log=None, quiet=False): 528 | if url is None or key is None or email is None: 529 | key, url, email = get_apikey_values() 530 | 531 | self.service = service 532 | self.url = url 533 | self.key = key 534 | self.email = email 535 | self.verbose = verbose 536 | self.quiet = quiet 537 | self.log = log 538 | 539 | def trace(self, m): 540 | if self.log: 541 | self.log(m) 542 | else: 543 | t = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime()) 544 | print("%s %s" % (t, m,)) 545 | 546 | def execute(self, req, target): 547 | c = APIRequest(self.url, "services/%s" % (self.service,), self.email, 
                       self.key, self.trace, verbose=self.verbose, quiet=self.quiet)
548 |         c.execute(req, target)
549 |         self.trace("Done.")
550 |
551 | ###############################################################################
--------------------------------------------------------------------------------
/cams_download/setup.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | #
3 | # (C) Copyright 2012-2013 ECMWF.
4 | #
5 | # This software is licensed under the terms of the Apache Licence Version 2.0
6 | # which can be obtained at http://www.apache.org/licenses/LICENSE-2.0.
7 | # In applying this licence, ECMWF does not waive the privileges and immunities
8 | # granted to it by virtue of its status as an intergovernmental organisation nor
9 | # does it submit to any jurisdiction.
10 | #
11 |
12 | from setuptools import setup, find_packages
13 |
14 | import ecmwfapi
15 |
16 |
17 | setup(
18 |     name="ecmwf-api-client",
19 |     version=ecmwfapi.__version__,
20 |     description=ecmwfapi.__doc__,
21 |     author="ECMWF",
22 |     author_email="software.support@ecmwf.int",
23 |     url="https://software.ecmwf.int/stash/projects/PRDEL/repos/ecmwf-api-client/browse",
24 |
25 |     # entry_points={
26 |     #     "console_scripts": [
27 |     #         "mars = XXX:main",
28 |     #     ],
29 |     # },
30 |
31 |     packages=find_packages(),
32 |     zip_safe=False,
33 | )
--------------------------------------------------------------------------------
/convert_CAMS_DBL.py:
--------------------------------------------------------------------------------
1 | #! /usr/bin/env python
2 | # -*- coding: iso-8859-1 -*-
3 | """
4 | Converts CAMS files downloaded in netCDF format into the Earth Explorer (DBL/HDR) format used by MAJA for its external data.
5 |
6 | MAJA was developed by CS-SI, under a CNES contract, using a multi-temporal method developed at CESBIO for the MACCS processor, and including methods developed by DLR for ATCOR.
7 |
8 | This tool, developed by O. Hagolle (CNES/CESBIO), is a very basic one. If anything does not go as anticipated, it will probably crash.
9 | """
10 |
11 | import glob
12 | import tempfile
13 | import optparse
14 | import os
15 | import os.path
16 | import shutil
17 | import sys
18 | # helpers defined in convert_to_exo.py (this import was missing; get_date and
19 | # back_to_filename_date are expected to be provided there as well)
20 | from convert_to_exo import process_one_file, searchOneFile, myGlob, get_date, back_to_filename_date
21 |
22 | import logging
23 | logger = logging.getLogger('Start-Maja')
24 |
25 | logger.setLevel(logging.DEBUG)
26 | if not logger.handlers:
27 |     ch = logging.StreamHandler()
28 |     ch.setLevel(logging.DEBUG)
29 |     formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
30 |     ch.setFormatter(formatter)
31 |     logger.addHandler(ch)
32 | START_MAJA_VERSION = 3.1
33 |
34 | # #########################################################################
35 | class OptionParser(optparse.OptionParser):
36 |
37 |     def check_required(self, opt):
38 |         option = self.get_option(opt)
39 |
40 |         # Assumes the option's 'default' is set to None!
41 |         if getattr(self.values, option.dest) is None:
42 |             self.error("%s option not supplied" % option)
43 |
44 |
45 | # #################################### Reading the "Key=Value" parameter file
46 | def read_folders(fic_txt):
47 |
48 |     repCode = repWork = repL1 = repL2 = repMaja = repCAMS = repCAMS_raw = None
49 |
50 |     with open(fic_txt, 'r') as f:
51 |         for ligne in f.readlines():
52 |             # test repCAMS_raw first: 'repCAMS' is a prefix of 'repCAMS_raw'
53 |             if ligne.find('repCAMS_raw') == 0:
54 |                 repCAMS_raw = (ligne.split('=')[1]).strip()
55 |             elif ligne.find('repCAMS') == 0:
56 |                 repCAMS = (ligne.split('=')[1]).strip()
57 |
58 |     missing = False
59 |     if repCAMS is None:
60 |         logger.error("repCAMS is missing from the configuration file")
61 |         missing = True
62 |     if repCAMS_raw is None:
63 |         logger.error("repCAMS_raw is missing from the configuration file")
64 |         missing = True
65 |
66 |     if missing:
67 |         raise Exception("Configuration file is not complete. See log file for more information.")
68 |
69 |     directory_missing = False
70 |
71 |     if repCAMS is not None and not os.path.isdir(repCAMS):
72 |         logger.error("repCAMS %s is missing", repCAMS)
73 |         directory_missing = True
74 |     if repCAMS_raw is not None and not os.path.isdir(repCAMS_raw):
75 |         logger.error("repCAMS_raw %s is missing", repCAMS_raw)
76 |         directory_missing = True
77 |
78 |     if directory_missing:
79 |         raise Exception("One or more directories are missing. See log file for more information.")
80 |
81 |     return repCAMS, repCAMS_raw
82 |
83 |
84 | def manage_rep_cams(repCams, repCamsRaw, working_dir):
85 |     exocam_creation(repCamsRaw, out_dir=repCams, working_dir=working_dir)
86 |     return repCams
87 |
88 | def exocam_creation(input_dir, out_dir=None, working_dir="/tmp"):
89 |
90 |     processed_dates = []
91 |
92 |     nb_files = len(myGlob(input_dir, "*.nc"))
93 |     compteur = 1
94 |
95 |     for file_cams in myGlob(input_dir, "*.nc"):
96 |         date_file = get_date(file_cams)
97 |         if date_file in processed_dates:
98 |             continue
99 |         #print("Processing {}/{} : {} ".format(compteur, nb_files/3,date_file), end='\r')
100 |         date_written_in_file = back_to_filename_date(date_file)
101 |         aot_file = searchOneFile(input_dir, "*AOT_{}*".format(date_written_in_file))
102 |         mr_file = searchOneFile(input_dir, "*MR_{}*".format(date_written_in_file))
103 |         rh_file = searchOneFile(input_dir, "*RH_{}*".format(date_written_in_file))
104 |         process_one_file(aot_file, mr_file, rh_file, out_dir, working_dir)
105 |
106 |         processed_dates.append(date_file)
107 |         compteur += 1
108 |
109 | if __name__ == '__main__':
110 |     # ========== command line
111 |     if len(sys.argv) == 1:
112 |         prog = os.path.basename(sys.argv[0])
113 |         print ' ' + sys.argv[0] + ' [options]'
114 |         print "    Help: ", prog, " --help"
115 |         print "      or: ", prog, " -h"
116 |
117 |         print "example: "
118 |         print "\t python %s -f folders.txt -c nominal -t 40KCB -s Reunion -d 20160401 " % sys.argv[0]
119 |         sys.exit(-1)
120 |     else:
121 |         usage = "usage: %prog [options] "
122 |         parser = OptionParser(usage=usage, version='%prog {}'.format(START_MAJA_VERSION))
123 |
124 |
125 |
126 |         parser.add_option("-f", "--folder", dest="folder_file", action="store", type="string",
127 |                           help="folder definition file", default=None)
128 |
129 |         (options, args) = parser.parse_args()
130 |
131 |
132 |
133 |     # =================directories
134 |     folder_file = options.folder_file
135 |     (repCams, repCamsRaw) = read_folders(folder_file)
136 |     repCams = manage_rep_cams(repCams, repCamsRaw, repCams)
--------------------------------------------------------------------------------
/convert_to_exo.py:
-------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*-6 2 | """ 3 | ################################################################################################### 4 | 5 | o o 6 | oo oo oo o oo ,-. 7 | o o o o o o o o o \_/ 8 | o o o o o o o o {|||D 9 | o o oooooo o oooooo / \ 10 | o o o o o o o o `-^ 11 | o o o o oooo o o 12 | 13 | ################################################################################################### 14 | 15 | It defines classes_and_methods 16 | 17 | ################################################################################################### 18 | 19 | :author: Alexia Mondot 20 | 21 | :copyright: 2018 CNES. All rights reserved. 22 | 23 | :license: license 24 | :created: 27 Mar 2018 25 | 26 | :contact: alexia.mondot@c-s.fr 27 | 28 | ################################################################################################### 29 | """ 30 | from __future__ import absolute_import 31 | from __future__ import print_function 32 | 33 | import argparse 34 | import datetime 35 | import glob 36 | import os 37 | import re 38 | import shutil 39 | import sys 40 | import tempfile 41 | 42 | import lxml.etree as ET 43 | 44 | import logging 45 | LOGGER = logging.getLogger('Start-Maja') 46 | LOGGER.setLevel(logging.DEBUG) 47 | ch = logging.StreamHandler() 48 | ch.setLevel(logging.DEBUG) 49 | formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s') 50 | ch.setFormatter(formatter) 51 | LOGGER.addHandler(ch) 52 | 53 | NUMBER_OF_SIGNIFICANT_BITS = {"ATB": "8", "CLD": "8", "MSK": "6", "QLT": "12", 54 | "FRE": "16", "SRE": "16"} 55 | 56 | 57 | def get_parameters(): 58 | 59 | argParser = argparse.ArgumentParser() 60 | required_arguments = argParser.add_argument_group('required arguments') 61 | required_arguments.add_argument('-i', '--input_dir', required=True, 62 | help='Path to input image') 63 | required_arguments.add_argument('-o', '--output_filename', required=True, 64 | help='Path to output HDR filename') 65 | required_arguments.add_argument('-s', "--sensor", choices=['s2', 'l8'], required=True) 66 | 67 | args = argParser.parse_args(sys.argv[1:]) 68 | 69 | input_dir = os.path.realpath(args.input_dir) 70 | output_filename = os.path.realpath(args.output_filename) 71 | sensor = args.sensor 72 | 73 | return input_dir, output_filename, sensor 74 | 75 | 76 | def getRoot(): 77 | xmlns = "http://eop-cfi.esa.int/CFI" 78 | xsi = "http://www.w3.org/2001/XMLSchema-instance" 79 | schemaLocation = "http://eop-cfi.esa.int/CFI ./EXO_CAMS_CamsData.xsd" 80 | typeXsi = "CAMS_Header_Type" 81 | root = ET.Element("Earth_Explorer_Header", attrib={"schema_version":"1.00", 82 | "{" + xsi + "}schemaLocation": schemaLocation, 83 | "{" + xsi + "}type": typeXsi}, 84 | nsmap={'xsi': xsi, None: xmlns}) 85 | return root 86 | 87 | 88 | def searchOneFile(directory, filePattern): 89 | """ 90 | Use glob to find a file matching the given filePattern 91 | :param directory: 92 | :param filePattern: 93 | :return: 94 | """ 95 | resList = myGlob(directory, filePattern) 96 | if resList: 97 | if len(resList) > 1: 98 | print("Warning, more than one value matching the pattern", filePattern, "in", directory) 99 | return resList[0] 100 | return None 101 | 102 | 103 | def myGlob(directory, filePattern): 104 | """ 105 | Creates an automatic path.join with given arguments 106 | :param directory: 107 | :param filePattern: 108 | :return: 109 | """ 110 | return glob.glob(os.path.join(directory, filePattern)) 111 | 112 | def nodes(root, 
mission, basename_out, dbl_file, date_now, acquisition_date, cams_files): 113 | 114 | a1 = ET.SubElement(root, "Fixed_Header") 115 | toto = ET.SubElement(a1, "File_Name") 116 | toto.text = basename_out 117 | toto = ET.SubElement(a1, "File_Description") 118 | toto.text = "CAMSData" 119 | toto = ET.SubElement(a1, "Notes") 120 | toto.text = "" 121 | toto = ET.SubElement(a1, "Mission") 122 | toto.text = mission 123 | toto = ET.SubElement(a1, "File_Class") 124 | toto.text = "TEST" 125 | toto = ET.SubElement(a1, "File_Type") 126 | toto.text = "EXO_CAMS" 127 | b1 = ET.SubElement(a1, "Validity_Period") 128 | toto = ET.SubElement(b1, "Validity_Start") 129 | toto.text = "UTC=2006-07-01T18:11:45" 130 | toto = ET.SubElement(b1, "Validity_Stop") 131 | toto.text = "UTC=9999-99-99T99:99:99" 132 | toto = ET.SubElement(a1, "File_Version") 133 | toto.text = "0003" 134 | b2 = ET.SubElement(a1, "Source") 135 | toto = ET.SubElement(b2, "System") 136 | toto.text = "MAJA" 137 | toto = ET.SubElement(b2, "Creator") 138 | toto.text = "OzoneDataHeaderGenerator" 139 | toto = ET.SubElement(b2, "Creator_Version") 140 | toto.text = "1.14" 141 | toto = ET.SubElement(b2, "Creation_Date") 142 | toto.text = "UTC=" + date_now.isoformat() 143 | 144 | a = ET.SubElement(root, "Variable_Header") 145 | b2 = ET.SubElement(a, "Main_Product_Header") 146 | ET.SubElement(b2, "List_of_Consumers", count="0") 147 | ET.SubElement(b2, "List_of_Extensions", count="0") 148 | b3 = ET.SubElement(a, "Specific_Product_Header") 149 | b4 = ET.SubElement(b3, "Instance_Id") 150 | 151 | b5 = ET.SubElement(b4, "Validity_Period") 152 | toto = ET.SubElement(b5, "Validity_Start") 153 | toto.text = "UTC=2006-07-01T18:11:45" 154 | toto = ET.SubElement(b5, "Validity_Stop") 155 | toto.text = "UTC=9999-99-99T99:99:99" 156 | b5 = ET.SubElement(b4, "Acquisition_Date_Time") 157 | b5.text = "UTC=" + acquisition_date.isoformat() 158 | 159 | b4 = ET.SubElement(b3, "Data_Block_Source_Location") 160 | b4.text = "http://" 161 | 162 | b4 = ET.SubElement(b3, "DBL_Organization") 163 | b5 = ET.SubElement(b4, "List_of_Packaged_DBL_Files", count = str(len(cams_files))) 164 | for index, cams_file in enumerate(cams_files): 165 | b6 = ET.SubElement(b5, "Packaged_DBL_File", sn=str(index+1)) 166 | b7 = ET.SubElement(b6, "Relative_File_Path") 167 | b7.text = cams_file 168 | b7 = ET.SubElement(b6, "File_Definition") 169 | b7.text = "cams" 170 | 171 | b4 = ET.SubElement(b3, "ModelLevels") 172 | b4.text = "0.2 0.3843 0.6365 0.9564 1.3448 1.8058 2.3478 2.985 3.7397 4.6462 5.7565 7.1322 8.8366 10.9483 13.5647 16.8064 20.8227 25.7989 31.9642 39.6029 49.0671 60.1802 73.0663 87.7274 104.229 122.614 142.902 165.089 189.147 215.025 242.652 272.059 303.217 336.044 370.407 406.133 443.009 480.791 519.209 557.973 596.777 635.306 673.24 710.263 746.063 780.346 812.83 843.263 871.42 897.112 920.189 940.551 958.148 972.987 985.14 994.747 1002.02 1007.26 1010.85 1013.25" 173 | 174 | 175 | def compress_directory_bzip2(destination_filename, source_directory): 176 | basePath = os.path.dirname(source_directory) 177 | baseModule = os.path.basename(source_directory) 178 | command = "tar -cjf {} -C{} {} --exclude '.svn'".format(destination_filename, basePath, baseModule) 179 | os.system(command) 180 | 181 | 182 | def date_time_for_naming(datetime_object): 183 | """ 184 | :return: the dateTtime of the ROKDate for naming convention 185 | :rtype: str 186 | """ 187 | return "{:04d}{:02d}{:02d}T{:02d}{:02d}{:02d}".format(datetime_object.year, 188 | datetime_object.month, 189 | datetime_object.day, 190 | 
datetime_object.hour,
191 |                                                           datetime_object.minute,
192 |                                                           datetime_object.second)
193 | 
194 | 
195 | def create_archive(aot_file, mr_file, rh_file, output_file_basename, ncdf_dir, archive_dir):
196 |     destination_filename = "{}.DBL".format(output_file_basename)
197 |     destination_filepath = os.path.join(archive_dir, destination_filename)
198 | 
199 |     temp_dir = os.path.join(archive_dir, output_file_basename + ".DBL.DIR")
200 |     os.makedirs(temp_dir)
201 |     shutil.copy(aot_file, os.path.join(temp_dir, os.path.basename(aot_file)))
202 |     shutil.copy(mr_file, os.path.join(temp_dir, os.path.basename(mr_file)))
203 |     shutil.copy(rh_file, os.path.join(temp_dir, os.path.basename(rh_file)))
204 | 
205 |     # tar only the .DBL.DIR itself, not the whole archive directory
206 |     compress_directory_bzip2(destination_filepath, temp_dir)
207 |     cams_file_to_return = [os.path.join(destination_filename + ".DIR", os.path.basename(aot_file)),
208 |                            os.path.join(destination_filename + ".DIR", os.path.basename(mr_file)),
209 |                            os.path.join(destination_filename + ".DIR", os.path.basename(rh_file))]
210 |     return destination_filepath, cams_file_to_return
211 | 
212 | 
213 | def get_date(cams_file):
214 |     try:
215 |         res = re.search(r".+_.+_(.+)\.nc", cams_file).group(1)
216 |     except AttributeError:
217 |         # re.search returns None when the pattern does not match
218 |         print("No date found in {}".format(cams_file))
219 |         return None
220 |     try:
221 |         return datetime.datetime.strptime(res, "%Y%m%dUTC%H%M%S")
222 |     except ValueError:
223 |         print("No date %Y%m%dUTC%H%M%S found in {}".format(res))
224 |         return None
225 | 
226 | 
227 | def check_and_return_date(aot_file, mr_file, rh_file):
228 | 
229 |     date_aot = re.search(r"AOT_(.+)\.nc", aot_file).group(1)
230 |     date_mr = re.search(r"MR_(.+)\.nc", mr_file).group(1)
231 |     date_rh = re.search(r"RH_(.+)\.nc", rh_file).group(1)
232 | 
233 |     # all three dates must be identical
234 |     if not (date_aot == date_mr == date_rh):
235 |         raise Exception("The 3 files must be from the same date ! \n{}\n{}\n{}".format(aot_file, mr_file, rh_file))
236 |     return datetime.datetime.strptime(date_aot, "%Y%m%dUTC%H%M%S")
237 | 
238 | 
239 | def back_to_filename_date(datetime_file):
240 | 
241 |     return datetime_file.strftime("%Y%m%dUTC%H%M%S")
242 | 
243 | 
244 | def process_one_file(aot_file, mr_file, rh_file, ncdf_dir, archive_dir):
245 | 
246 |     #working_dir = tempfile.mkdtemp(dir=working_dir)
247 | 
248 |     mission = "SENTINEL-2_"
249 |     date_file = check_and_return_date(aot_file, mr_file, rh_file)
250 |     date_now = datetime.datetime.now()
251 | 
252 |     output_file_basename = "S2__TEST_EXO_CAMS_{}_{}".format(date_time_for_naming(date_file),
253 |                                                             date_time_for_naming(date_now))
254 | 
255 |     #create archive
256 |     dbl_filename, cams = create_archive(aot_file, mr_file, rh_file, output_file_basename, ncdf_dir, archive_dir)
257 | 
258 |     print("Step 1/2", end='\r')
259 | 
260 |     #create hdr
261 |     output_filename = os.path.join(archive_dir, output_file_basename + ".HDR")
262 |     LOGGER.debug(output_filename)
263 |     basename_out = os.path.basename(os.path.splitext(output_filename)[0])
264 |     LOGGER.debug(basename_out)
265 |     root = getRoot()
266 | 
267 |     nodes(root, mission, basename_out, dbl_filename, date_now, date_file, cams)
268 | 
269 |     print("Step 2/2", end='\r')
270 | 
271 |     tree = ET.ElementTree(root)
272 |     f = open(output_filename, "w")
273 |     f.write(ET.tostring(tree, pretty_print=True, xml_declaration=True,
274 |                         encoding="UTF-8"))
275 |     f.close()
276 | 
277 | 
--------------------------------------------------------------------------------
/copyright.txt:
--------------------------------------------------------------------------------
1 | This is free software under the GPL v3 licence.
See 2 | http://www.gnu.org/licenses/gpl-3.0.html for details. 3 | -------------------------------------------------------------------------------- /folders.txt: -------------------------------------------------------------------------------- 1 | repCode=/mnt/data/home/hagolleo/PROG/S2/lance_maja 2 | repWork=/mnt/data/SENTINEL2/MAJA 3 | repL1 =/mnt/data/SENTINEL2/L1C_PDGS 4 | repL2 =/mnt/data/SENTINEL2/L2A_MAJA 5 | repMaja=/mnt/data/home/petruccib/Install-MAJA/maja/core/1.0/bin/maja 6 | -------------------------------------------------------------------------------- /prepare_mnt/32SNE.txt: -------------------------------------------------------------------------------- 1 | proj=UTM32N 2 | EPSG_out=32632 3 | chaine_proj=EPSG:32632 4 | tx_min=0 5 | ty_min=0 6 | tx_max=0 7 | ty_max=0 8 | pas_x=109800 9 | pas_y=109800 10 | orig_x=499980 11 | orig_y=4000020 12 | marge=0 13 | -------------------------------------------------------------------------------- /prepare_mnt/CVersailles.txt: -------------------------------------------------------------------------------- 1 | proj =L93 2 | EPSG_out =2154 3 | chaine_proj = EPSG:2154 4 | tx_min=0 5 | ty_min=0 6 | tx_max=0 7 | ty_max=0 8 | pas_x=85000 9 | pas_y=80000 10 | orig_x=585000 11 | orig_y=6890000 12 | marge=0 13 | -------------------------------------------------------------------------------- /prepare_mnt/MAJA_HDR_TEMPLATE.HDR: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | S2__TEST_AUX_REFDE2_tuile_0001 5 | ReferenceDemDataLevel2 6 | NA 7 | SENTINEL-2_ 8 | TEST 9 | AUX_REFDE2 10 | 11 | UTC=2017-01-17T18:16:40 12 | UTC=9999-99-99T99:99:99 13 | 14 | 0001 15 | 16 | MUSTRT 17 | MUSTRT_MNTN2 18 | 2.0 19 | UTC=2017-01-17T18:16:40 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | tuile 31 | 0001 32 | 33 | 34 | S2__TEST_MPL_SITDEF_S_tuile 35 | 36 | 37 | 38 | 39 | EPSG:epsg 40 | proj 41 | 42 | 43 | ulx 44 | uly 45 | 46 | 47 | resx 48 | resy 49 | 50 | 51 | nbLig 52 | nbCol 53 | 54 | 55 | meanAlt 56 | stdAlt 57 | 0 58 | 1 59 | No comment 60 | 61 | 62 | 63 | 64 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_MSK.TIF 65 | 66 | 67 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_ALC.TIF 68 | 69 | 70 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_ALT_R1.TIF 71 | 72 | 73 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_ALT_R2.TIF 74 | 75 | 76 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_SLC.TIF 77 | 78 | 79 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_ASC.TIF 80 | 81 | 82 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_SLP_R1.TIF 83 | 84 | 85 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_ASP_R1.TIF 86 | 87 | 88 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_SLP_R2.TIF 89 | 90 | 91 | S2__TEST_AUX_REFDE2_tuile_0001.DBL.DIR/S2__TEST_AUX_REFDE2_tuile_0001_ASP_R2.TIF 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | -------------------------------------------------------------------------------- /prepare_mnt/Readme.md: -------------------------------------------------------------------------------- 1 | # DTM and water mask tool as input for MUSCATE 2 | 3 | This tool uses SRTM files from CGIAR-JRC processing, to provide the DTM used as input to MACCS/MAJA prototype, which includes also slope and aspect, 4 | at full and coarse resolution. 
It also uses SRTM water bodies files to produce the water masks used within MACCS. 5 | A special version for MAJA is described at the end of the Readme file, together with a conversion tool to obtain the input format needed for *MAJA operational version* 6 | 7 | 8 | ## SRTM CGIAR files : 9 | The DTM tiles (by tiles of 5 degrees) can be fetched here http://srtm.csi.cgiar.org/SELECTION/inputCoord.asp 10 | More documentation on the product is avaiilable here : http://www.cgiar-csi.org/data/srtm-90m-digital-elevation-database-v4-1 11 | 12 | ## SWBD 13 | Documentation is available here https://dds.cr.usgs.gov/srtm/version2_1/SWBD/SWBD_Documentation/SWDB_Product_Specific_Guidance.pdf 14 | There was a ftp site, but it does not seem to be available. 15 | Data can be downloaded from https://earthexplorer.usgs.gov/ 16 | 17 | In "data sets", select 18 | - Digital Elevation 19 | - SRTM 20 | - SRTM Water Body Data 21 | 22 | 23 | ## User manual 24 | The tool requires a recent version of gdal (Minimum 1.11) 25 | 26 | 27 | The parameter file, on my computer, is as follows : 28 | ``` 29 | INDIR_MNT =/mnt/data/DONNEES_AUX/SRTM 30 | OUTDIR_MNT=/mnt/data/mnt 31 | INDIR_EAU=/mnt/data/DONNEES_AUX/masque_eau 32 | OUTDIR_EAU =/mnt/data/mnt 33 | ``` 34 | 35 | MNT means DTM and EAU means water 36 | 37 | 38 | It also needs a file site. An example is provided : CVersailles.txt, which was used for SPOT4 (Take5), and 32SNE.txt, for Sentinel-2 tile 32SNE 39 | 40 | 41 | - proj is the projection name, 42 | - EPSG_OUT, is the EPSG code of the projection, 43 | - chaine_proj is the string to use to define it in gdal commands 44 | - you may find the information in the xml file provided with a granule : 45 | ``` 46 | WGS84 / UTM zone 32N 47 | EPSG:32632 48 | ``` 49 | - The 4 values can stay equal to zero to produce only one tile. They can be integers if you want to generate a grid of tiles. For Sentinel-2 only produce one tile at a time. 50 | 51 | tx_min=0 52 | ty_min=0 53 | tx_max=0 54 | ty_max=0 55 | 56 | - pas_x and pas_y are the image size in m. Please keep the same values as below. 57 | - orig_x and orig_y are the coordinates of the upper left corner in m (gdalinfo can provide the information) 58 | - marge is the size of the overlap region between tiles 59 | For Sentinel-2, the margin is 0 as we produce DTM tile by tile due to the complex naming of tiles... 60 | 61 | Here is an example for tile 32TSNE (in Tunisia) 62 | 63 | ``` 64 | proj=UTM32N 65 | EPSG_out=32632 66 | chaine_proj=EPSG:32632 67 | tx_min=0 68 | ty_min=0 69 | tx_max=0 70 | ty_max=0 71 | pas_x=109800 72 | pas_y=109800 73 | orig_x=499980 74 | orig_y=4000020 75 | marge=0 76 | ``` 77 | 78 | # Sentinel-2 79 | 80 | A dedicated tool has been written for Sentinel-2 : 81 | 82 | ``` python tuilage_mnt_eau_S2.py -p parameters.txt -s 32SNE.txt -m SRTM``` 83 | 84 | `-c` is the coarse resolution used to speed some proceses in MAJA. It is 240m. 85 | 86 | 87 | This tool generates data with the format needed for the prototype version of MACCS 88 | 89 | 90 | A converter is available to obtain the data format to use as input of MAJA (operational version) 91 | 92 | ``` python conversion_format_maja.py-t 34LGJ -f mnt/34LGJ ``` 93 | 94 | 95 | 96 | 97 | -------------------------------------------------------------------------------- /prepare_mnt/conversion_format_maja.py: -------------------------------------------------------------------------------- 1 | #! 
--------------------------------------------------------------------------------
/prepare_mnt/conversion_format_maja.py:
--------------------------------------------------------------------------------
  1 | #! /usr/bin/env python
  2 | # -*- coding: utf-8 -*-
  3 | 
  4 | 
  5 | import os.path,glob,sys
  6 | import numpy as np
  7 | from lib_mnt import *
  8 | 
  9 | from osgeo import gdal,osr
 10 | 
 11 | 
 12 | import optparse
 13 | 
 14 | 
 15 | ###########################################################################
 16 | class OptionParser (optparse.OptionParser):
 17 | 
 18 |     def check_required (self, opt):
 19 |         option = self.get_option(opt)
 20 | 
 21 |         # Assumes the option's 'default' is set to None!
 22 |         if getattr(self.values, option.dest) is None:
 23 |             self.error("%s option not supplied" % option)
 24 | 
 25 | ###########################################################################
 26 | 
 27 | def gdalinfo(fic_mnt_in):
 28 |     ds=gdal.Open(fic_mnt_in)
 29 |     driver = gdal.GetDriverByName('ENVI')
 30 |     (ulx,resx,dum1,uly,dum2,resy)=ds.GetGeoTransform()
 31 | 
 32 |     nbCol=ds.RasterXSize
 33 |     nbLig=ds.RasterYSize
 34 | 
 35 |     proj=ds.GetProjectionRef().split('"')[1]
 36 | 
 37 | 
 38 |     inband = ds.GetRasterBand(1)
 39 | 
 40 |     dtm=inband.ReadAsArray(0, 0, nbCol, nbLig).astype(np.float)
 41 |     moyenne=np.mean(dtm)
 42 |     ecart=np.std(dtm)
 43 | 
 44 |     return(proj,ulx,uly,resx,resy,nbCol,nbLig,moyenne,ecart)
 45 | 
 46 | ###########################################################################
 47 | 
 48 | def writeHDR(hdr_out,tuile,proj,ulx,uly,resx,resy,nbCol,nbLig,moyenne,ecart) :
 49 |     hdr_template="MAJA_HDR_TEMPLATE.HDR"
 50 | 
 51 |     #compute epsg_code
 52 |     epsg_asc=proj.split('_')[-1]
 53 |     epsg_num=int(epsg_asc[0:-1])
 54 | 
 55 |     if epsg_asc.endswith('N'):
 56 |         epsg="326%02d"%epsg_num
 57 |     else:
 58 |         epsg="327%02d"%epsg_num
 59 |     print epsg
 60 | 
 61 |     proj="WGS 84 / UTM zone %s"%epsg_asc
 62 | 
 63 |     print proj,epsg
 64 | 
 65 |     with open(hdr_out,"w") as fout:
 66 |         with open(hdr_template) as fin:
 67 |             lignes=fin.readlines()
 68 |             for lig in lignes:
 69 |                 if lig.find("tuile")>0:
 70 |                     lig=lig.replace("tuile","T"+tuile)
 71 |                 elif lig.find("epsg")>0:
 72 |                     lig=lig.replace("epsg",epsg)
 73 |                 elif lig.find("proj")>0:
 74 |                     lig=lig.replace("proj",proj)
 75 |                 elif lig.find("ulx")>0:
 76 |                     lig=lig.replace("ulx",str(int(ulx)))
 77 |                 elif lig.find("uly")>0:
 78 |                     lig=lig.replace("uly",str(int(uly)))
 79 |                 elif lig.find("resx")>0:
 80 |                     lig=lig.replace("resx",str(int(resx)))
 81 |                 elif lig.find("resy")>0:
 82 |                     lig=lig.replace("resy",str(int(resy)))
 83 |                 elif lig.find("nbLig")>0:
 84 |                     lig=lig.replace("nbLig",str(nbLig))
 85 |                 elif lig.find("nbCol")>0:
 86 |                     lig=lig.replace("nbCol",str(nbCol))
 87 |                 elif lig.find("meanAlt")>0:
 88 |                     lig=lig.replace("meanAlt",str(moyenne))
 89 |                 elif lig.find("stdAlt")>0:
 90 |                     lig=lig.replace("stdAlt",str(ecart))
 91 |                 fout.write(lig)
 92 | 
 93 | 
 94 | ########## Main
 95 | 
 96 | if len(sys.argv)==1 :
 97 |     prog = os.path.basename(sys.argv[0])
 98 |     print ' '+sys.argv[0]+' [options]'
 99 |     print "     Help : ", prog, " --help"
100 |     print "       Or : ", prog, " -h"
101 |     print "example : python %s -t 34LGJ -f mnt/34LGJ"%sys.argv[0]
102 |     sys.exit(-1)
103 | else:
104 |     usage = "usage: %prog [options] "
105 |     parser = OptionParser(usage=usage)
106 |     parser.set_defaults(eau_seulement=False)
107 |     parser.set_defaults(sans_numero=False)
108 | 
109 |     parser.add_option("-t", "--tile", dest="tile", action="store", type="string", \
110 |                       help="tile name",default=None)
111 |     parser.add_option("-f", "--folder", dest="folder", action="store", type="string", \
112 |                       help="folder where the DTM will be found", default=None)
113 |     parser.add_option("-c", dest="coarse_res", action="store", type="int", \
114 |                       help="Coarse resolution", default=240)
115 | 
116 |     (options,
args) = parser.parse_args() 117 | parser.check_required("-t") 118 | parser.check_required("-f") 119 | 120 | #inputs : 121 | tuile=options.tile 122 | rep_mnt_in=options.folder 123 | coarse=options.coarse_res 124 | fic_mnt_in=glob.glob(rep_mnt_in+'/'+'*_10m.mnt')[0] 125 | 126 | 127 | # creation of output directory 128 | rep_mnt_out="S2__TEST_AUX_REFDE2_T%s_0001"%tuile 129 | if not os.path.exists(rep_mnt_out): 130 | os.mkdir(rep_mnt_out) 131 | 132 | hdr_out=rep_mnt_out+"/"+rep_mnt_out+".HDR" 133 | dbl_dir_out=rep_mnt_out+"/"+rep_mnt_out+".DBL.DIR" 134 | 135 | if not os.path.exists(dbl_dir_out): 136 | os.mkdir(dbl_dir_out) 137 | 138 | 139 | # read the parameters of the tile dimension, projection and extent 140 | (proj,ulx,uly,resx,resy,nbCol,nbLig,moyenne,ecart)=gdalinfo(fic_mnt_in) 141 | 142 | # write the HDR file 143 | writeHDR(hdr_out,tuile,proj,ulx,uly,resx,resy,nbCol,nbLig,moyenne,ecart) 144 | 145 | 146 | 147 | 148 | # now prepare binary file 149 | resolutions=[10,20,coarse] 150 | 151 | 152 | # Altitude 10m, 20m, 240m 153 | suff_proto="mnt" 154 | suff_MAJA=["ALT_R1", "ALT_R2", "ALC"] 155 | base_in=fic_mnt_in 156 | rac_out=dbl_dir_out+'/'+rep_mnt_out 157 | for i,res in enumerate(resolutions): 158 | nom_in=base_in.replace("_10m.mnt","_%sm.%s"%(res,suff_proto)) 159 | nom_out=rac_out+"_%s.TIF"%suff_MAJA[i] 160 | commande="gdal_translate -of GTIFF %s %s"%(nom_in,nom_out) 161 | print commande 162 | os.system(commande) 163 | 164 | 165 | # Slope 10m, 20m, 240m SLP_R1, SLP_R2, SLC 166 | suff_proto="slope" 167 | suff_MAJA=["SLP_R1", "SLP_R2", "SLC"] 168 | base_in=fic_mnt_in 169 | rac_out=dbl_dir_out+'/'+rep_mnt_out 170 | for i,res in enumerate(resolutions): 171 | nom_in=base_in.replace("_10m.mnt","_%sm.%s"%(res,suff_proto)) 172 | nom_out=rac_out+"_%s.TIF"%suff_MAJA[i] 173 | commande="gdal_translate -of GTIFF %s %s"%(nom_in,nom_out) 174 | print commande 175 | os.system(commande) 176 | 177 | 178 | # Aspect 10m, 20m ASP_R1, ASP_R2, ASC 179 | suff_proto="aspect" 180 | suff_MAJA=["ASP_R1", "ASP_R2", "ASC"] 181 | base_in=fic_mnt_in 182 | rac_out=dbl_dir_out+'/'+rep_mnt_out 183 | for i,res in enumerate(resolutions): 184 | nom_in=base_in.replace("_10m.mnt","_%sm.%s"%(res,suff_proto)) 185 | nom_out=rac_out+"_%s.TIF"%suff_MAJA[i] 186 | commande="gdal_translate -of GTIFF %s %s"%(nom_in,nom_out) 187 | print commande 188 | os.system(commande) 189 | 190 | # Water Mask 191 | suff_proto="eau" 192 | suff_MAJA="MSK" 193 | base_in=fic_mnt_in 194 | rac_out=dbl_dir_out+'/'+rep_mnt_out 195 | res=coarse 196 | nom_in=base_in.replace("_10m.mnt","_%sm.%s"%(res,suff_proto)) 197 | nom_out=rac_out+"_%s.TIF"%suff_MAJA 198 | commande="gdal_translate -of GTIFF %s %s"%(nom_in,nom_out) 199 | print commande 200 | os.system(commande) 201 | 202 | 203 | 204 | -------------------------------------------------------------------------------- /prepare_mnt/copyright.txt: -------------------------------------------------------------------------------- 1 | This is free software under the GPL v3 licence. See 2 | http://www.gnu.org/licenses/gpl-3.0.html for details. 
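An editor's note on conversion_format_maja.py above: the three gdal_translate loops and the water-mask block follow one pattern and could be folded into a single helper without changing behaviour. A sketch under that assumption; `translate_set` is illustrative and not part of the script, while `fic_mnt_in`, `rac_out` and `coarse` are the script's own variables:

```python
# Sketch: one helper replacing the four near-identical gdal_translate
# blocks of conversion_format_maja.py. Behaviour is unchanged.
import os

def translate_set(base_in, rac_out, resolutions, suff_proto, suffixes_maja):
    """Convert one family of prototype files (.mnt/.slope/.aspect/.eau)
    into the GeoTIFFs expected inside the MAJA .DBL.DIR."""
    for res, suff_maja in zip(resolutions, suffixes_maja):
        nom_in = base_in.replace("_10m.mnt", "_%sm.%s" % (res, suff_proto))
        nom_out = "%s_%s.TIF" % (rac_out, suff_maja)
        commande = "gdal_translate -of GTIFF %s %s" % (nom_in, nom_out)
        print(commande)
        os.system(commande)

# translate_set(fic_mnt_in, rac_out, [10, 20, coarse], "mnt", ["ALT_R1", "ALT_R2", "ALC"])
# translate_set(fic_mnt_in, rac_out, [10, 20, coarse], "slope", ["SLP_R1", "SLP_R2", "SLC"])
# translate_set(fic_mnt_in, rac_out, [10, 20, coarse], "aspect", ["ASP_R1", "ASP_R2", "ASC"])
# translate_set(fic_mnt_in, rac_out, [coarse], "eau", ["MSK"])
```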
3 | -------------------------------------------------------------------------------- /prepare_mnt/france.txt: -------------------------------------------------------------------------------- 1 | proj = L93 2 | EPSG_out = 2154 3 | chaine_proj = EPSG:2154 4 | tx_min = 1 5 | tx_max = 10 6 | ty_min = 1 7 | ty_max = 10 8 | pas = 100020 9 | marge=9990 10 | orig_x = 0 11 | orig_y = 100020*61+20000 12 | -------------------------------------------------------------------------------- /prepare_mnt/land_polygons_osm/README: -------------------------------------------------------------------------------- 1 | 2 | This data was downloaded from openstreetmapdata.com which offers 3 | extracts and processings of OpenStreetMap data. 4 | 5 | See http://openstreetmapdata.com/ for details. 6 | 7 | 8 | PACKAGE CONTENT 9 | =============== 10 | 11 | This package contains OpenStreetMap data of the 12 | coastline land polygons, simplified for rendering at low zooms 13 | 14 | Layers contained are: 15 | 16 | simplified_land_polygons.shp: 17 | 18 | 62930 Polygon features 19 | Mercator projection 20 | Extent: (-20037507, -20037507) - (20037508, 18461504) 21 | In geographic coordinates: (-180.000, -85.051) - (180.000, 83.666) 22 | 23 | Date of the data used is 29 Sep 2017 04:46 24 | 25 | You can find more information on this data set at 26 | 27 | http://openstreetmapdata.com/data/land-polygons 28 | 29 | 30 | LICENSE 31 | ======= 32 | 33 | This data is Copyright 2017 OpenStreetMap contributors. It is 34 | available under the Open Database License (ODbL). 35 | 36 | For more information see http://www.openstreetmap.org/copyright 37 | 38 | -------------------------------------------------------------------------------- /prepare_mnt/land_polygons_osm/simplified_land_polygons.cpg: -------------------------------------------------------------------------------- 1 | UTF-8 2 | -------------------------------------------------------------------------------- /prepare_mnt/land_polygons_osm/simplified_land_polygons.dbf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/olivierhagolle/Start_maja/a8a995514e965470ec19d3b729a47ce6fc5be90d/prepare_mnt/land_polygons_osm/simplified_land_polygons.dbf -------------------------------------------------------------------------------- /prepare_mnt/land_polygons_osm/simplified_land_polygons.prj: -------------------------------------------------------------------------------- 1 | PROJCS["WGS 84 / Pseudo-Mercator", 2 | GEOGCS["WGS 84", 3 | DATUM["WGS_1984", 4 | SPHEROID["WGS 84",6378137,298.257223563, 5 | AUTHORITY["EPSG","7030"]], 6 | AUTHORITY["EPSG","6326"]], 7 | PRIMEM["Greenwich",0, 8 | AUTHORITY["EPSG","8901"]], 9 | UNIT["degree",0.0174532925199433, 10 | AUTHORITY["EPSG","9122"]], 11 | AUTHORITY["EPSG","4326"]], 12 | PROJECTION["Mercator_1SP"], 13 | PARAMETER["central_meridian",0], 14 | PARAMETER["scale_factor",1], 15 | PARAMETER["false_easting",0], 16 | PARAMETER["false_northing",0], 17 | UNIT["metre",1, 18 | AUTHORITY["EPSG","9001"]], 19 | AXIS["X",EAST], 20 | AXIS["Y",NORTH], 21 | EXTENSION["PROJ4","+proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs"], 22 | AUTHORITY["EPSG","3857"]] 23 | -------------------------------------------------------------------------------- /prepare_mnt/land_polygons_osm/simplified_land_polygons.shp: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/olivierhagolle/Start_maja/a8a995514e965470ec19d3b729a47ce6fc5be90d/prepare_mnt/land_polygons_osm/simplified_land_polygons.shp -------------------------------------------------------------------------------- /prepare_mnt/land_polygons_osm/simplified_land_polygons.shx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/olivierhagolle/Start_maja/a8a995514e965470ec19d3b729a47ce6fc5be90d/prepare_mnt/land_polygons_osm/simplified_land_polygons.shx -------------------------------------------------------------------------------- /prepare_mnt/lib_mnt.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | import glob 5 | import numpy as np 6 | import os 7 | import os.path 8 | import shutil 9 | import tempfile 10 | from osgeo import gdal, ogr, osr 11 | 12 | import scipy.ndimage as nd 13 | 14 | 15 | # Returns true if coordinate is land 16 | def TestLand(lon, lat): 17 | latlon = osr.SpatialReference() 18 | latlon.ImportFromEPSG(4326) 19 | 20 | # create a point 21 | 22 | pt = ogr.Geometry(ogr.wkbPoint) 23 | pt.SetPoint_2D(0, lon, lat) 24 | 25 | # read shapefile 26 | shapefile = "land_polygons_osm/simplified_land_polygons.shp" 27 | driver = ogr.GetDriverByName("ESRI Shapefile") 28 | dataSource = driver.Open(shapefile, 0) 29 | layer = dataSource.GetLayer() 30 | targetProj = layer.GetSpatialRef() 31 | land = False 32 | 33 | # conversion to shapefile projection 34 | transform = osr.CoordinateTransformation(latlon, targetProj) 35 | pt.Transform(transform) 36 | 37 | # search point in layers 38 | for feature in layer: 39 | geom = feature.GetGeometryRef() 40 | if geom.Contains(pt): 41 | land = True 42 | break 43 | 44 | return land 45 | 46 | 47 | ##################################### Lecture de fichier de parametres "Mot_clé=Valeur" 48 | def lire_param_txt(fic_txt): 49 | with file(fic_txt, 'r') as f: 50 | for ligne in f.readlines(): 51 | if ligne.find('INDIR_MNT') == 0: 52 | INDIR_MNT = (ligne.split('=')[1]).strip() 53 | if ligne.find('OUTDIR_MNT') == 0: 54 | OUTDIR_MNT = (ligne.split('=')[1]).strip() 55 | if ligne.find('INDIR_EAU') == 0: 56 | INDIR_EAU = (ligne.split('=')[1]).strip() 57 | if ligne.find('OUTDIR_EAU') == 0: 58 | OUTDIR_EAU = (ligne.split('=')[1]).strip() 59 | return (INDIR_MNT, OUTDIR_MNT, INDIR_EAU, OUTDIR_EAU) 60 | 61 | 62 | ############################ 63 | class classe_site: 64 | def __init__(self, nom, proj, EPSG_out, chaine_proj, tx_min, tx_max, ty_min, ty_max, pas_x, pas_y, marge, orig_x, 65 | orig_y): 66 | self.nom = nom 67 | self.proj = proj 68 | self.EPSG_out = EPSG_out 69 | self.chaine_proj = chaine_proj 70 | self.tx_min = tx_min 71 | self.tx_max = tx_max 72 | self.ty_min = ty_min 73 | self.ty_max = ty_max 74 | self.pas_x = pas_x 75 | self.pas_y = pas_y 76 | self.marge = marge 77 | self.orig_x = orig_x 78 | self.orig_y = orig_y 79 | 80 | 81 | ############################ Lecture du fichier site 82 | def lire_fichier_site(fic_site): 83 | nom = os.path.basename(fic_site).split('.')[0] 84 | with file(fic_site, 'r') as f: 85 | for ligne in f.readlines(): 86 | if ligne.find('proj') == 0: 87 | proj = ligne.split('=')[1].strip() 88 | print proj 89 | if ligne.find('EPSG_out') == 0: 90 | EPSG_out = int(ligne.split('=')[1]) 91 | if ligne.find('chaine_proj') == 0: 92 | chaine_proj = ligne.split('=')[1].strip() 93 | if ligne.find('tx_min') == 0: 94 | tx_min = int(ligne.split('=')[1]) 95 | if 
ligne.find('tx_max') == 0: 96 | tx_max = int(ligne.split('=')[1]) 97 | if ligne.find('ty_min') == 0: 98 | ty_min = int(ligne.split('=')[1]) 99 | if ligne.find('ty_max') == 0: 100 | ty_max = int(ligne.split('=')[1]) 101 | if ligne.find('pas_x') == 0: 102 | pas_x = int(ligne.split('=')[1]) 103 | if ligne.find('pas_y') == 0: 104 | pas_y = int(ligne.split('=')[1]) 105 | if ligne.find('marge') == 0: 106 | marge = int(ligne.split('=')[1]) 107 | if ligne.find('orig_x') == 0: 108 | orig_x = int(ligne.split('=')[1]) 109 | if ligne.find('orig_y') == 0: 110 | orig_y = int(ligne.split('=')[1]) 111 | site = classe_site(nom, proj, EPSG_out, chaine_proj, tx_min, tx_max, ty_min, ty_max, pas_x, pas_y, marge, orig_x, 112 | orig_y) 113 | return (site) 114 | 115 | 116 | ###############################################lecture de l'entete envi 117 | def lire_entete_mnt(fic_hdr): 118 | """lecture du fichier hdr, en entree, chemin complet du fichier 119 | """ 120 | f = file(fic_hdr, 'r') 121 | for ligne in f.readlines(): 122 | if ligne.find('samples') >= 0: 123 | nb_col = int(ligne.split('=')[1]) 124 | if ligne.find('lines') >= 0: 125 | nb_lig = int(ligne.split('=')[1]) 126 | if ligne.find('byte order') >= 0: 127 | num_endian = int(ligne.split('=')[1]) 128 | if (num_endian == 0): 129 | endian = 'PC' 130 | else: 131 | endian = 'SUN' 132 | if ligne.find('data type') >= 0: 133 | type_envi = int(ligne.split('=')[1]) 134 | if (type_envi == 1): 135 | type_donnee = 'uint8' 136 | elif (type_envi == 2): 137 | type_donnee = 'int16' 138 | elif (type_envi == 4): 139 | type_donnee = 'float32' 140 | elif (type_envi == 5): 141 | type_donnee = 'double' 142 | elif (type_envi == 12): 143 | type_donnee = 'uint16' 144 | else: 145 | print 'type %d non pris en compte' % type_envi 146 | 147 | return (nb_lig, nb_col, type_donnee, endian) 148 | 149 | 150 | #################calcule le nom de la tuile 151 | def calcule_nom_tuile(tx, ty, site, nom_site): 152 | if tx >= 0: 153 | GD = "D" 154 | numx = tx 155 | else: 156 | GD = "G" 157 | numx = -tx 158 | 159 | if ty > 0: 160 | HB = "H" 161 | numy = ty 162 | else: 163 | HB = "B" 164 | numy = -ty 165 | 166 | nom_tuile = "%s%s%04d%s%04d" % (nom_site, GD, numx, HB, numy) 167 | return (nom_tuile) 168 | 169 | 170 | ############################################################# 171 | ###########################Classe MNT######################## 172 | #############################################################""" 173 | 174 | class classe_mnt: 175 | def __init__(self, rep, rac, ulx, uly, lrx, lry, res, chaine_proj): 176 | self.racine = rep + rac 177 | self.ulx = ulx 178 | self.uly = uly 179 | self.lrx = lrx 180 | self.lry = lry 181 | self.res = res 182 | self.chaine_proj = chaine_proj 183 | 184 | ############################################################# 185 | ###########################Pour Babel######################## 186 | #############################################################""" 187 | def ecrit_hd(self, nb_lig, nb_col): 188 | ficMNT = self.racine + '_' + str(self.res) + 'm' 189 | f = open(ficMNT + '.hd', 'w') 190 | f.write('CHANNELS\n') 191 | f.write('1\n') 192 | f.write('LINES\n') 193 | f.write(str(nb_lig) + '\n') 194 | f.write('COLUMNS\n') 195 | f.write(str(nb_col) + '\n') 196 | f.write('BITS PER PIXEL\n') 197 | f.write('16\n') 198 | f.close() 199 | 200 | def ecrit_hd_babel(self, nb_lig, nb_col): 201 | ficMNT = self.racine + '_' + str(self.res) + 'm' 202 | f = open(ficMNT + '.hd_babel', 'w') 203 | f.write('>>\tLON_REF\t' + str(self.ulx) + '\n') 204 | f.write('>>\tLAT_REF\t' + 
str(self.uly) + '\n') 205 | f.write('>>\tNB_LON\t' + str(nb_col) + '\n') 206 | f.write('>>\tNB_LAT\t' + str(nb_lig) + '\n') 207 | f.write('>>\tPAS_LON\t' + str(self.res) + '\n') 208 | f.write('>>\tPAS_LAT\t-' + str(self.res) + '\n') 209 | f.write('>>\tTYPE_CODE\t2\n') 210 | f.write('>>\tTYPE_CONV\t0\n') 211 | f.write('>>\tREF\tWGS84:G-D/GRS80:Z-M\n') 212 | f.close() 213 | 214 | ############################################################# 215 | ########################Interface GDAL####################### 216 | ############################################################# 217 | 218 | def decoupe_float(self, fic_in, fic_out): 219 | # calcul du mnt float 220 | chaine_etendue = str(self.ulx) + ' ' + str(self.lry) + ' ' + str(self.lrx) + ' ' + str(self.uly) 221 | commande = 'gdalwarp -overwrite -r cubic -ot Float32 -srcnodata -32768 -dstnodata 0 -of ENVI -tr %d %d -te %s -t_srs %s %s %s\n' % ( 222 | self.res, self.res, chaine_etendue, self.chaine_proj, fic_in, fic_out) 223 | print commande 224 | os.system(commande) 225 | 226 | def decoupe_int(self, fic_in, fic_out): 227 | # calcul du mnt int 228 | chaine_etendue = str(self.ulx) + ' ' + str(self.lry) + ' ' + str(self.lrx) + ' ' + str(self.uly) 229 | commande = 'gdalwarp -overwrite -r cubic -srcnodata -32768 -dstnodata 0 -of ENVI -tr %d %d -te %s -t_srs %s %s %s\n' % ( 230 | self.res, self.res, chaine_etendue, self.chaine_proj, fic_in, fic_out) 231 | print commande 232 | os.system(commande) 233 | 234 | ############################################################# 235 | ########################Decoupage MNT######################## 236 | ############################################################# 237 | 238 | def decoupe(self, mnt_in): 239 | print "###decoupage " + str(self.res) + 'm' 240 | rac_mnt = self.racine + '_' + str(self.res) + 'm' 241 | fic_hdr_mnt = rac_mnt + '.hdr' 242 | fic_mnt = rac_mnt + '.mnt' 243 | fic_hdr_mnt_float = rac_mnt + 'float.hdr' 244 | fic_mnt_float = rac_mnt + 'float.mnt' 245 | 246 | # calcul du mnt int 247 | self.decoupe_int(mnt_in, fic_mnt) 248 | 249 | # calcul du mnt float 250 | self.decoupe_float(mnt_in, fic_mnt_float) 251 | 252 | # ecriture des entetes babel 253 | (nblig, nbcol, type_donnee, endian) = lire_entete_mnt(fic_hdr_mnt) 254 | 255 | self.ecrit_hd(nblig, nbcol) 256 | self.ecrit_hd_babel(nblig, nbcol) 257 | 258 | shutil.copy(fic_mnt, rac_mnt + '.c1') 259 | 260 | ############################################################# 261 | ########################Reech gradient######################## 262 | ############################################################# 263 | 264 | def reech_gradient(self, fic_dz_dl_srtm, fic_dz_dc_srtm): 265 | rac_mnt = self.racine + '_' + str(self.res) + 'm' 266 | fic_dz_dl = rac_mnt + 'float.dz_dl' 267 | fic_dz_dc = rac_mnt + 'float.dz_dc' 268 | 269 | # rééch 270 | self.decoupe_float(fic_dz_dl_srtm, fic_dz_dl) 271 | self.decoupe_float(fic_dz_dc_srtm, fic_dz_dc) 272 | 273 | ########################################################### 274 | ######### calcul du gradient################################ 275 | ########################################################### 276 | 277 | def calcul_gradient(self): 278 | rac_mnt = self.racine + '_' + str(self.res) + 'm' 279 | print rac_mnt 280 | fic_mnt = rac_mnt + 'float.mnt' 281 | fic_hdr = rac_mnt + 'float.hdr' 282 | fic_dz_dl = rac_mnt + 'float.dz_dl' 283 | fic_dz_dc = rac_mnt + 'float.dz_dc' 284 | (nblig, nbcol, type_donnee, endian) = lire_entete_mnt(fic_hdr) 285 | 286 | srtm = (np.fromfile(fic_mnt, type_donnee)).reshape(nblig, nbcol) 287 | 
Noyau_horizontal = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]) 288 | Noyau_vertical = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]]) 289 | 290 | dz_dc = nd.convolve(srtm, Noyau_horizontal) / 8. / self.res 291 | dz_dl = nd.convolve(srtm, Noyau_vertical) / 8. / self.res 292 | 293 | dz_dl.tofile(fic_dz_dl) 294 | dz_dc.tofile(fic_dz_dc) 295 | return (fic_dz_dl, fic_dz_dc) 296 | 297 | ############################################################# 298 | ########################Calcul_pentes######################## 299 | ############################################################# 300 | 301 | def calcul_pente_aspect_fic(self): 302 | rac_mnt = self.racine + '_' + str(self.res) + 'm' 303 | print rac_mnt 304 | 305 | fic_hdr = rac_mnt + 'float.hdr' 306 | fic_dz_dl = rac_mnt + 'float.dz_dl' 307 | fic_dz_dc = rac_mnt + 'float.dz_dc' 308 | 309 | (nblig, nbcol, type_donnee, endian) = lire_entete_mnt(fic_hdr) 310 | print nblig * nbcol * 2 311 | # dz_dl=(np.fromfile(fic_dz_dl,type_donnee)).reshape(nblig,nbcol).astype('int16') 312 | # dz_dc=(np.fromfile(fic_dz_dc,type_donnee)).reshape(nblig,nbcol).astype('int16') 313 | 314 | dz_dl = (np.fromfile(fic_dz_dl, type_donnee)).reshape(nblig, nbcol) 315 | dz_dc = (np.fromfile(fic_dz_dc, type_donnee)).reshape(nblig, nbcol) 316 | 317 | norme = np.sqrt((dz_dc) * (dz_dc) + (dz_dl) * (dz_dl)) 318 | slope = np.arctan(norme) 319 | aspect = np.where(dz_dc > 0, np.arccos(dz_dl / norme), 2 * np.pi - np.arccos(dz_dl / norme)) 320 | aspect = np.where(slope == 0, 0, aspect) 321 | 322 | (slope * 100.).astype('int16').tofile(rac_mnt + '.slope') 323 | (aspect * 100.).astype('int16').tofile(rac_mnt + '.aspect') 324 | 325 | ############################################################# 326 | ########################Calcul_eau_mnt####################### 327 | ############################################################# 328 | 329 | def calcul_masque_mnt(self, rep, rac): 330 | rac_eau = self.racine + '_' + str(self.res) + 'm' 331 | fic_hdr_eau = rac_eau + '.hdr' 332 | fic_eau = rac_eau + '.eau' 333 | 334 | rac_mnt = rep + rac + '_' + str(self.res) + 'm' 335 | fic_hdr_mnt = rac_mnt + '.hdr' 336 | fic_hdr_mnt_float = rac_mnt + 'float.hdr' 337 | fic_mnt = rac_mnt + 'float.mnt' 338 | fic_dz_dl = rac_mnt + 'float.dz_dl' 339 | fic_dz_dc = rac_mnt + 'float.dz_dc' 340 | 341 | (nblig, nbcol, type_donnee, endian) = lire_entete_mnt(fic_hdr_mnt_float) 342 | mnt = (np.fromfile(fic_mnt, type_donnee)).reshape(nblig, nbcol) 343 | dz_dl = (np.fromfile(fic_dz_dl, type_donnee)).reshape(nblig, nbcol) 344 | dz_dc = (np.fromfile(fic_dz_dc, type_donnee)).reshape(nblig, nbcol) 345 | 346 | eau = np.where((mnt < 0) & (np.abs(dz_dl) <= 1e5) & (np.abs(dz_dc) <= 1e5), 1, 0) 347 | 348 | eau.astype('int16').tofile(fic_eau) 349 | print fic_eau 350 | shutil.copy(fic_hdr_mnt, fic_hdr_eau) 351 | 352 | ############################################################# 353 | ########################Decoupage EAU######################## 354 | ############################################################# 355 | 356 | def decoupe_eau(self, eau_in): 357 | rac_eau = self.racine + '_' + str(self.res) + 'm' 358 | fic_hdr_eau = rac_eau + '.hdr' 359 | fic_eau = rac_eau + '.eau' 360 | 361 | # calcul du mnt int 362 | chaine_etendue = str(self.ulx) + ' ' + str(self.lry) + ' ' + str(self.lrx) + ' ' + str(self.uly) 363 | commande = 'gdalwarp -overwrite -r near -of ENVI -tr %d %d -te %s -t_srs %s %s %s\n' % ( 364 | self.res, self.res, chaine_etendue, self.chaine_proj, eau_in, fic_eau) 365 | print commande 366 | os.system(commande) 367 | 
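An editor's note on the gradient code above: `calcul_gradient` applies Sobel kernels (normalised by 8 and by the pixel size) and the slope/aspect methods then take the arctangent of the gradient norm and an arccos-based aspect. A self-contained sketch of the same math on a toy DEM, assuming only numpy and scipy; the aspect convention is the one used by this module:

```python
# Sketch of the slope/aspect math used in lib_mnt, on a toy DEM:
# a plane rising by 10 m per line, sampled at 90 m pixels.
import numpy as np
import scipy.ndimage as nd

res = 90.0
dem = np.fromfunction(lambda l, c: 10.0 * l, (5, 5))

noyau_horizontal = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
noyau_vertical = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]])
dz_dc = nd.convolve(dem, noyau_horizontal) / 8. / res   # gradient along columns
dz_dl = nd.convolve(dem, noyau_vertical) / 8. / res     # gradient along lines

norme = np.sqrt(dz_dc ** 2 + dz_dl ** 2)
slope = np.arctan(norme)
with np.errstate(invalid='ignore'):
    aspect = np.where(dz_dc > 0, np.arccos(dz_dl / norme),
                      2 * np.pi - np.arccos(dz_dl / norme))
aspect = np.where(slope == 0, 0, aspect)

# interior pixel: slope ~ arctan(10/90); aspect follows the module's convention
print("slope = %.3f rad, aspect = %.3f rad" % (slope[2, 2], aspect[2, 2]))
```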
368 | 
369 | #################################################################################
370 | ########################### Fusion DEM and water masks  ########################
371 | #################################################################################
372 | def fusion_mnt(liste_fic_mnt, liste_fic_eau, liste_centre_eau, rep_mnt, rep_swbd, nom_site, calcul_eau_mnt, working_dir=None):
373 |     if working_dir is None:
374 |         working_dir = tempfile.mkdtemp(prefix="{}_".format(nom_site))
375 |     else:
376 |         working_dir = tempfile.mkdtemp(prefix="{}_".format(nom_site), dir=working_dir)
377 |     print "liste_fic_mnt", liste_fic_mnt
378 |     for fic in liste_fic_mnt:
379 |         print rep_mnt + '/' + fic
380 |         if not (os.path.exists(rep_mnt + '/' + fic)):
381 |             ficzip = fic.replace('tif', 'zip')
382 |             commande = "unzip -o %s/%s -d %s" % (rep_mnt, ficzip, working_dir)
383 |             os.system(commande)
384 |     if len(liste_fic_mnt) > 1:
385 |         nom_mnt = tempfile.mkstemp(prefix="mnt_{}".format(nom_site), suffix=".tif", dir=working_dir)[1]
386 |         commande = "gdal_merge.py -o " + nom_mnt
387 |         for fic_mnt in liste_fic_mnt:
388 |             commande = commande + " " + rep_mnt + '/' + fic_mnt + " "
389 |         if os.path.exists(nom_mnt):
390 |             os.remove(nom_mnt)
391 |         print commande
392 |         os.system(commande)
393 | 
394 |     elif len(liste_fic_mnt) == 1:
395 |         nom_mnt = os.path.join(working_dir, liste_fic_mnt[0])
396 |     else:
397 |         print "liste_fic_mnt is empty"
398 |         raise Exception("ErreurDeParametreSite")
399 | 
400 |     ########################on créé aussi le mnt avec no_data=0
401 |     nom_mnt_nodata0 = nom_mnt.replace(".tif", "nodata0.tif")
402 |     commande = 'gdalwarp -r cubic -srcnodata -32767 -dstnodata 0 %s %s\n' % (nom_mnt, nom_mnt_nodata0)
403 |     print commande
404 |     os.system(commande)
405 | 
406 |     if calcul_eau_mnt == 0:  # si on est en deça de 60°N
407 | 
408 |         # Création d'un fichier vide (valeurs à 0) avec la même emprise que le mnt fusionné
409 |         ####################################################################################
410 |         nom_raster_swbd = os.path.join(working_dir, os.path.basename(nom_mnt).split('.tif')[0] + "_tmp.tif")
411 |         if os.path.exists(nom_raster_swbd):
412 |             os.remove(nom_raster_swbd)
413 |         ds = gdal.Open(nom_mnt)
414 |         driver = gdal.GetDriverByName('GTiff')
415 |         ds_out = driver.CreateCopy(nom_raster_swbd, ds, 0)
416 |         inband = ds.GetRasterBand(1)
417 |         outband = ds_out.GetRasterBand(1)
418 |         for i in range(inband.YSize - 1, -1, -1):
419 |             scanline = inband.ReadAsArray(0, i, inband.XSize, 1, inband.XSize, 1)
420 |             scanline = scanline * 0
421 |             outband.WriteArray(scanline, 0, i)
422 |         ds_out = None
423 | 
424 |         # remplissage de ce fichier avec les fichiers SWBD
425 |         liste_tuiles_manquantes = ["e017n03", "e006n30", "e006n29", "e005n30", "e005n29", "e015n00", "e015s24",
426 |                                    "e022n28", "e023n28", "w074n01", "e034n02", "e035n02"]
427 |         for i, racine_nom_eau in enumerate(liste_fic_eau):
428 |             print racine_nom_eau
429 |             shp = glob.glob(rep_swbd + '/' + racine_nom_eau + "*.shp")
430 |             # if shp file does not exist
431 |             if len(shp) == 0:
432 |                 print 'missing SWBD water file : ', racine_nom_eau
433 | 
434 |                 # test if center is water or land
435 |                 land = TestLand(liste_centre_eau[i][0], liste_centre_eau[i][1])
436 |                 if land:
437 |                     valeur = 0
438 |                     print "it is a fully land tile"
439 |                 else:
440 |                     valeur = 1
441 |                     print "it is a fully water tile"
442 |                 fic_vecteur_eau = rep_swbd + '/' + racine_nom_eau + ".gml"
443 |                 creer_fichier_eau(fic_vecteur_eau, racine_nom_eau)
444 |             # if shp file exists
445 |             else:
446 |                 valeur = 1
447 | 
fic_vecteur_eau = shp[0] 448 | # il faut recuperer pour la couche le nom complet (y compris la lettre indiquant le continent) 449 | racine_nom_eau = os.path.basename(fic_vecteur_eau)[:-4] 450 | commande = "gdal_rasterize -burn %d -l %s %s %s" % ( 451 | valeur, racine_nom_eau, fic_vecteur_eau, nom_raster_swbd) 452 | print "#############Fichier eau :", fic_vecteur_eau 453 | print commande 454 | os.system(commande) 455 | else: 456 | nom_raster_swbd = "" 457 | return nom_mnt_nodata0, nom_raster_swbd 458 | 459 | 460 | # calcul de pentes et aspect 461 | ########################## 462 | def calcul_pente_aspect_mem(rac_mnt, dz_dc, dz_dl): 463 | norme = np.sqrt((dz_dc) * (dz_dc) + (dz_dl) * (dz_dl)) 464 | slope = np.arctan(norme) 465 | aspect = np.where(dz_dc > 0, np.arccos(dz_dl / norme), 2 * np.pi - np.arccos(dz_dl / norme)) 466 | aspect = np.where(slope == 0, 0, aspect) 467 | slope = np.where(np.isfinite(slope), slope, 0) 468 | aspect = np.where(np.isfinite(aspect), aspect, 0) 469 | 470 | (slope * 100.).astype('int16').tofile(rac_mnt + '.slope') 471 | (aspect * 100.).astype('int16').tofile(rac_mnt + '.aspect') 472 | 473 | 474 | ############################################## 475 | ############################################## 476 | def creer_fichier_eau(fic_eau, nom_eau): 477 | patron = """ 478 | 483 | 484 | 485 | LONMIN.LATMIN. 486 | LONMAX.LATMAX. 487 | 488 | 489 | 490 | 491 | 492 | LONMIN.,LATMIN.,0 LONMAX.,LATMIN.,0 LONMAX.,LATMAX.,0 LONMIN.,LATMAX.,0 LONMIN.,LATMIN.,0 493 | BH080 494 | 495 | 496 | 497 | 498 | """ 499 | ew = nom_eau[0] 500 | num_x = int(nom_eau[1:4]) 501 | ns = nom_eau[4] 502 | num_y = int(nom_eau[5:7]) 503 | if ew == "w": 504 | num_x = -num_x 505 | if ns == "s": 506 | num_y = -num_y 507 | patron = patron.replace('LONMIN', str(num_x)) 508 | patron = patron.replace('LATMIN', str(num_y)) 509 | patron = patron.replace('LONMAX', str(num_x + 1)) 510 | patron = patron.replace('LATMAX', str(num_y + 1)) 511 | patron = patron.replace('NOMEAU', nom_eau) 512 | 513 | print fic_eau 514 | # print patron 515 | f = file(fic_eau, "w") 516 | f.write(patron) 517 | f.close() 518 | return 519 | -------------------------------------------------------------------------------- /prepare_mnt/parametres.txt: -------------------------------------------------------------------------------- 1 | INDIR_MNT =/home/alexia/Maja/Maja_folders/srtm_base/ 2 | OUTDIR_MNT=/home/alexia/Maja/Maja_folders/PREPARE_MNT/mnt/ 3 | INDIR_EAU=/home/alexia/Maja/Maja_folders/WB/ 4 | OUTDIR_EAU =/home/alexia/Maja/Maja_folders/PREPARE_MNT/mnt/ 5 | -------------------------------------------------------------------------------- /prepare_mnt/tuilage_mnt_eau.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | Reprojette et decoupe un mnt SRTM sur les tuiles d'un site 6 | Les paramètres sont dans parametres.py, dont le nom du site qui sert à déterminer le fichier de paramètres du tuilage d'un site (ex pyrenees.py) 7 | 8 | """ 9 | import optparse 10 | 11 | 12 | ########################################################################### 13 | class OptionParser(optparse.OptionParser): 14 | 15 | def check_required(self, opt): 16 | option = self.get_option(opt) 17 | 18 | # Assumes the option's 'default' is set to None! 
19 | if getattr(self.values, option.dest) is None: 20 | self.error("%s option not supplied" % option) 21 | 22 | 23 | ############################# Main 24 | 25 | from math import ceil, floor 26 | from lib_mnt import * 27 | 28 | from osgeo import osr 29 | import sys 30 | 31 | os.environ['LC_NUMERIC'] = 'C' 32 | 33 | if len(sys.argv) == 1: 34 | prog = os.path.basename(sys.argv[0]) 35 | print ' ' + sys.argv[0] + ' [options]' 36 | print " Aide : ", prog, " --help" 37 | print " ou : ", prog, " -h" 38 | print "example1 : python %s -p parametres_po.txt -s ~/DONNEES/SPOT5TAKE5/prep_mnt/Algeria4.py -m PO -f 10 -c 100" % \ 39 | sys.argv[0] 40 | print "example2 : python tuilage_mnt_eau.py -p parametres_srtm.txt -s CVersailles.txt -m SRTM -f 20 -c 200" % \ 41 | sys.argv[0] 42 | sys.exit(-1) 43 | else: 44 | usage = "usage: %prog [options] " 45 | parser = OptionParser(usage=usage) 46 | parser.set_defaults(eau_seulement=False) 47 | parser.set_defaults(sans_numero=False) 48 | 49 | parser.add_option("-p", "--parametre", dest="fic_param", action="store", type="string", \ 50 | help="fichier de parametre", default=None) 51 | parser.add_option("-s", "--site", dest="fic_site", action="store", type="string", \ 52 | help="fichier de description du site", default=None) 53 | parser.add_option("-m", "--mnt", dest="mnt", action="store", type="choice", \ 54 | help="SRTM ou PO (Planet Observer)", choices=['SRTM', 'PO'], default=None) 55 | parser.add_option("-f", dest="FULL_RES", action="store", type="int", \ 56 | help="Full resolution", default=None) 57 | parser.add_option("-c", dest="COARSE_RES", action="store", type="int", \ 58 | help="Coarse resolution", default=None) 59 | parser.add_option("-e", dest="eau_seulement", action="store_true", \ 60 | help="Traitement des masques d'eau seulement") 61 | parser.add_option("-n", dest="sans_numero", action="store_true", \ 62 | help="Traitement sans numero de tuile") 63 | 64 | (options, args) = parser.parse_args() 65 | parser.check_required("-p") 66 | parser.check_required("-s") 67 | parser.check_required("-m") 68 | parser.check_required("-c") 69 | parser.check_required("-f") 70 | 71 | # lecture du fichier de paramètres et du fichier site 72 | (rep_mnt_in, rep_mnt, rep_swbd, rep_eau) = lire_param_txt(options.fic_param) 73 | site = lire_fichier_site(options.fic_site) 74 | SRTM_RES = 90 75 | 76 | # ==========création de la liste des fichiers planet_observer 77 | # conversion des coordonnées des coins en lat_lon 78 | latlon = osr.SpatialReference() 79 | latlon.SetWellKnownGeogCS("WGS84") 80 | proj_site = osr.SpatialReference() 81 | proj_site.ImportFromEPSG(site.EPSG_out) 82 | transform = osr.CoordinateTransformation(proj_site, latlon) 83 | 84 | # recherche des 4 coins du site 85 | ulx_site = site.orig_x + site.tx_min * site.pas_x # upper left 86 | uly_site = site.orig_y + site.ty_max * site.pas_y 87 | lrx_site = site.orig_x + (site.tx_max + 1) * site.pas_x + site.marge # lower left 88 | lry_site = site.orig_y + (site.ty_min - 1) * site.pas_y - site.marge 89 | 90 | ul_latlon = transform.TransformPoint(ulx_site, uly_site, 0) 91 | lr_latlon = transform.TransformPoint(lrx_site, lry_site, 0) 92 | 93 | liste_fic_mnt = [] 94 | 95 | ############# MNT SRTM du CGIAR 96 | if options.mnt == "SRTM": 97 | # liste des fichiers SRTM nécessaires 98 | if (ul_latlon[1]) > 60 or (lr_latlon[1] > 60): 99 | print "#################################################" 100 | print "latitude supérieure à 60 degrés, pas de donnees SRTM" 101 | print "#################################################" 102 | 
sys.exit(-3) 103 | 104 | ul_latlon_srtm = [int(ul_latlon[0] + 180) / 5 + 1, int(60 - ul_latlon[1]) / 5 + 1] 105 | lr_latlon_srtm = [int(lr_latlon[0] + 180) / 5 + 1, int(60 - lr_latlon[1]) / 5 + 1] 106 | 107 | for x in range(ul_latlon_srtm[0], lr_latlon_srtm[0] + 1): 108 | for y in range(ul_latlon_srtm[1], lr_latlon_srtm[1] + 1): 109 | liste_fic_mnt.append("srtm_%02d_%02d.tif" % (x, y)) 110 | 111 | print ul_latlon, lr_latlon 112 | print ul_latlon_srtm, lr_latlon_srtm 113 | print liste_fic_mnt 114 | 115 | ########## MNT Planet Observer 116 | elif options.mnt == "PO": 117 | 118 | ul_latlon_po = [int(floor(ul_latlon[0])), int(floor(ul_latlon[1]))] 119 | lr_latlon_po = [int(floor(lr_latlon[0])), int(floor(lr_latlon[1]))] 120 | 121 | for x in range(ul_latlon_po[0], lr_latlon_po[0] + 1): 122 | for y in range(lr_latlon_po[1], ul_latlon_po[1] + 1): 123 | if x >= 0: 124 | ew = "e" 125 | num_x = x 126 | else: 127 | ew = "w" 128 | num_x = -x 129 | if y >= 0: 130 | ns = "n" 131 | num_y = y 132 | else: 133 | ns = "s" 134 | num_y = -y 135 | liste_fic_mnt.append("%s%03d/%s%02d.dt1" % (ew, num_x, ns, num_y)) 136 | 137 | print ul_latlon, lr_latlon 138 | print ul_latlon_po, lr_latlon_po 139 | print liste_fic_mnt 140 | 141 | # liste des fichiers SWBD nécessaires 142 | ul_latlon_swbd = [int(floor(ul_latlon[0])), int(floor(ul_latlon[1]))] 143 | lr_latlon_swbd = [int(floor(lr_latlon[0])), int(floor(lr_latlon[1]))] 144 | print ul_latlon, lr_latlon 145 | print ul_latlon_swbd, lr_latlon_swbd 146 | 147 | calcul_masque_eau_mnt = 0 148 | if (ul_latlon[1]) > 60 or (lr_latlon[1] > 60): 149 | print "#################################################" 150 | print "latitude supérieure à 60 degrés, pas de donnees SRTM" 151 | print "le masque d'eau est généré à partir du MNT" 152 | print "#################################################" 153 | calcul_masque_eau_mnt = 1 154 | 155 | liste_fic_eau = [] 156 | liste_centre_eau = [] 157 | for x in range(ul_latlon_swbd[0], lr_latlon_swbd[0] + 1): 158 | for y in range(lr_latlon_swbd[1], ul_latlon_swbd[1] + 1): 159 | if x >= 0: 160 | ew = "e" 161 | num_x = x 162 | else: 163 | ew = "w" 164 | num_x = -x 165 | if y >= 0: 166 | ns = "n" 167 | num_y = y 168 | else: 169 | ns = "s" 170 | num_y = -y 171 | 172 | liste_fic_eau.append("%s%03d%s%02d" % (ew, num_x, ns, num_y)) 173 | liste_centre_eau.append([x + 0.5, y + 0.5]) 174 | 175 | print "longitudes", ul_latlon_swbd[0], lr_latlon_swbd[0] 176 | print "latitudes", lr_latlon_swbd[1], ul_latlon_swbd[1] 177 | print "center coordinates", liste_centre_eau 178 | print liste_fic_eau 179 | 180 | # Fusion des mnt_srtm en un seul 181 | (fic_mnt_in, fic_eau_in) = fusion_mnt(liste_fic_mnt, liste_fic_eau, liste_centre_eau, rep_mnt_in, rep_swbd, site.nom, 182 | calcul_masque_eau_mnt) 183 | print "############", fic_mnt_in 184 | 185 | ####################Boucle de création des fichiers MNT et eau pour chaque tuile 186 | 187 | for tx in range(site.tx_min, site.tx_max + 1): 188 | for ty in range(site.ty_min, site.ty_max + 1): 189 | ulx = site.orig_x + tx * site.pas_x # upper left 190 | uly = site.orig_y + ty * site.pas_y 191 | lrx = site.orig_x + (tx + 1) * site.pas_x + site.marge # lower left 192 | lry = site.orig_y + (ty - 1) * site.pas_y - site.marge 193 | 194 | lrx_90m = int(ceil((lrx - ulx) / float(SRTM_RES))) * SRTM_RES + ulx 195 | lry_90m = uly - int(ceil((uly - lry) / float(SRTM_RES))) * SRTM_RES 196 | 197 | lrx_coarse = int(ceil((lrx - ulx) / float(options.COARSE_RES))) * options.COARSE_RES + ulx 198 | lry_coarse = uly - int(ceil((uly - lry) / 
float(options.COARSE_RES))) * options.COARSE_RES 199 | 200 | if options.sans_numero & (site.tx_max == 0) & (site.ty_max == 0): 201 | nom_tuile = site.nom 202 | else: 203 | nom_tuile = calcule_nom_tuile(tx, ty, site, site.nom) 204 | print "nom de la tuile", nom_tuile, tx, ty 205 | ###pour le MNT 206 | rep_mnt_out = rep_mnt + nom_tuile + '/' 207 | 208 | if options.eau_seulement == False: 209 | if not (os.path.exists(rep_mnt_out)): 210 | os.mkdir(rep_mnt_out) 211 | 212 | # Resolution SRTM_RES 213 | print "############### c'est parti" 214 | mnt_90m = classe_mnt(rep_mnt_out, nom_tuile, ulx, uly, lrx_90m, lry_90m, SRTM_RES, site.chaine_proj) 215 | mnt_90m.decoupe(fic_mnt_in) 216 | 217 | # calcul du gradient à 90m 218 | (fic_dz_dl_srtm, fic_dz_dc_srtm) = mnt_90m.calcul_gradient() 219 | 220 | # Basse résolution 221 | 222 | mnt_coarse = classe_mnt(rep_mnt_out, nom_tuile, ulx, uly, lrx_coarse, lry_coarse, options.COARSE_RES, 223 | site.chaine_proj) 224 | mnt_coarse.decoupe(fic_mnt_in) 225 | mnt_coarse.reech_gradient(fic_dz_dl_srtm, fic_dz_dc_srtm) 226 | mnt_coarse.calcul_pente_aspect_fic() 227 | 228 | # Haute résolution 229 | mnt_full = classe_mnt(rep_mnt_out, nom_tuile, ulx, uly, lrx, lry, options.FULL_RES, site.chaine_proj) 230 | mnt_full.decoupe(fic_mnt_in) 231 | mnt_full.reech_gradient(fic_dz_dl_srtm, fic_dz_dc_srtm) 232 | mnt_full.calcul_pente_aspect_fic() 233 | 234 | ### Pour l'eau 235 | rep_eau_out = rep_eau + nom_tuile + '/' 236 | if not (os.path.exists(rep_eau_out)): 237 | os.mkdir(rep_eau_out) 238 | 239 | eau = classe_mnt(rep_eau_out, nom_tuile, ulx, uly, lrx_coarse, lry_coarse, options.COARSE_RES, site.chaine_proj) 240 | if calcul_masque_eau_mnt == 0: 241 | eau.decoupe_eau(fic_eau_in) 242 | else: 243 | eau.calcul_masque_mnt(rep_mnt_out, nom_tuile) 244 | -------------------------------------------------------------------------------- /prepare_mnt/tuilage_mnt_eau_S2.py: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | """ 5 | Reprojette et decoupe un mnt SRTM sur les tuiles d'un site 6 | Les paramètres sont dans parametres.py, dont le nom du site qui sert à déterminer le fichier de paramètres du tuilage d'un site (ex pyrenees.py) 7 | 8 | """ 9 | import optparse 10 | 11 | 12 | ########################################################################### 13 | class OptionParser(optparse.OptionParser): 14 | 15 | def check_required(self, opt): 16 | option = self.get_option(opt) 17 | 18 | # Assumes the option's 'default' is set to None! 
#################### Loop creating the DEM and water files for each tile

for tx in range(site.tx_min, site.tx_max + 1):
    for ty in range(site.ty_min, site.ty_max + 1):
        ulx = site.orig_x + tx * site.pas_x  # upper left corner
        uly = site.orig_y + ty * site.pas_y
        lrx = site.orig_x + (tx + 1) * site.pas_x + site.marge  # lower right corner (margin included)
        lry = site.orig_y + (ty - 1) * site.pas_y - site.marge

        lrx_90m = int(ceil((lrx - ulx) / float(SRTM_RES))) * SRTM_RES + ulx
        lry_90m = uly - int(ceil((uly - lry) / float(SRTM_RES))) * SRTM_RES

        lrx_coarse = int(ceil((lrx - ulx) / float(options.COARSE_RES))) * options.COARSE_RES + ulx
        lry_coarse = uly - int(ceil((uly - lry) / float(options.COARSE_RES))) * options.COARSE_RES

        # a single-tile site can be processed without tile numbering
        if options.sans_numero and (site.tx_max == 0) and (site.ty_max == 0):
            nom_tuile = site.nom
        else:
            nom_tuile = calcule_nom_tuile(tx, ty, site, site.nom)
        print "tile name", nom_tuile, tx, ty
        ### DEM files
        rep_mnt_out = rep_mnt + nom_tuile + '/'

        if options.eau_seulement == False:
            if not (os.path.exists(rep_mnt_out)):
                os.mkdir(rep_mnt_out)

            # SRTM_RES resolution
            print "############### here we go"
            mnt_90m = classe_mnt(rep_mnt_out, nom_tuile, ulx, uly, lrx_90m, lry_90m, SRTM_RES, site.chaine_proj)
            mnt_90m.decoupe(fic_mnt_in)

            # gradient computation at 90 m
            (fic_dz_dl_srtm, fic_dz_dc_srtm) = mnt_90m.calcul_gradient()

            # low resolution
            mnt_coarse = classe_mnt(rep_mnt_out, nom_tuile, ulx, uly, lrx_coarse, lry_coarse, options.COARSE_RES,
                                    site.chaine_proj)
            mnt_coarse.decoupe(fic_mnt_in)
            mnt_coarse.reech_gradient(fic_dz_dl_srtm, fic_dz_dc_srtm)
            mnt_coarse.calcul_pente_aspect_fic()

            # full resolution
            mnt_full = classe_mnt(rep_mnt_out, nom_tuile, ulx, uly, lrx, lry, options.FULL_RES, site.chaine_proj)
            mnt_full.decoupe(fic_mnt_in)
            mnt_full.reech_gradient(fic_dz_dl_srtm, fic_dz_dc_srtm)
            mnt_full.calcul_pente_aspect_fic()

        ### water mask
        rep_eau_out = rep_eau + nom_tuile + '/'
        if not (os.path.exists(rep_eau_out)):
            os.mkdir(rep_eau_out)

        eau = classe_mnt(rep_eau_out, nom_tuile, ulx, uly, lrx_coarse, lry_coarse, options.COARSE_RES, site.chaine_proj)
        if calcul_masque_eau_mnt == 0:
            eau.decoupe_eau(fic_eau_in)
        else:
            eau.calcul_masque_mnt(rep_mnt_out, nom_tuile)
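# Editor's note: on the lrx_90m / lrx_coarse arithmetic used in the loop above: the
# lower-right corner is pushed outward so that the tile width and height are exact
# multiples of the target resolution. A minimal sketch with a worked example:
def round_up_extent(ul, lr, res):
    # grows lr so that (lr - ul) is a whole number of 'res' steps
    # (for the y axis the same formula is applied to (uly - lry), with the sign flipped)
    return int(ceil((lr - ul) / float(res))) * res + ul
# example: a tile 109980 m wide at res = 240 m: ceil(109980 / 240.) = 459 steps,
# so the right edge moves to ulx + 110160 m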
--------------------------------------------------------------------------------
/prepare_mnt/tuilage_mnt_eau_S2.py:
--------------------------------------------------------------------------------
#! /usr/bin/env python
# -*- coding: utf-8 -*-

"""
Reprojects and cuts out an SRTM DEM over the tiles of a Sentinel-2 site.
Parameters are read from a parameter file and from a site description file
(e.g. 32SNE.txt); see the usage example below.
"""
import optparse


###########################################################################
class OptionParser(optparse.OptionParser):

    def check_required(self, opt):
        option = self.get_option(opt)

        # Assumes the option's 'default' is set to None!
        if getattr(self.values, option.dest) is None:
            self.error("%s option not supplied" % option)


############################# Main

from math import ceil, floor
from lib_mnt import *

from osgeo import osr
import sys

os.environ['LC_NUMERIC'] = 'C'

if len(sys.argv) == 1:
    prog = os.path.basename(sys.argv[0])
    print '      ' + sys.argv[0] + ' [options]'
    print "     Help : ", prog, " --help"
    print "     or   : ", prog, " -h"
    print "example : python %s -p parametres_srtm.txt -s 32SNE.txt -m SRTM -c 240" % sys.argv[0]
    sys.exit(-1)
else:
    usage = "usage: %prog [options] "
    parser = OptionParser(usage=usage)
    parser.set_defaults(eau_seulement=False)
    parser.set_defaults(sans_numero=False)

    parser.add_option("-p", "--parametre", dest="fic_param", action="store", type="string",
                      help="parameter file", default=None)
    parser.add_option("-s", "--site", dest="fic_site", action="store", type="string",
                      help="site description file", default=None)
    parser.add_option("-m", "--mnt", dest="mnt", action="store", type="choice",
                      help="SRTM or PO (Planet Observer)", choices=['SRTM', 'PO'], default=None)
    parser.add_option("-c", dest="COARSE_RES", action="store", type="int",
                      help="coarse resolution", default=240)
    parser.add_option("-e", dest="eau_seulement", action="store_true",
                      help="process the water masks only")
    parser.add_option("-n", dest="sans_numero", action="store_true",
                      help="process without tile numbering")

    (options, args) = parser.parse_args()
    parser.check_required("-p")
    parser.check_required("-s")
    parser.check_required("-m")

# read the parameter file and the site file
(rep_mnt_in, rep_mnt, rep_swbd, rep_eau) = lire_param_txt(options.fic_param)
site = lire_fichier_site(options.fic_site)
SRTM_RES = 90

# ========== build the list of DEM files to mosaic
# convert the corner coordinates to lat/lon
latlon = osr.SpatialReference()
latlon.SetWellKnownGeogCS("WGS84")
proj_site = osr.SpatialReference()
proj_site.ImportFromEPSG(site.EPSG_out)
transform = osr.CoordinateTransformation(proj_site, latlon)

# compute the 4 corners of the site
ulx_site = site.orig_x + site.tx_min * site.pas_x  # upper left corner
uly_site = site.orig_y + site.ty_max * site.pas_y
lrx_site = site.orig_x + (site.tx_max + 1) * site.pas_x + site.marge  # lower right corner (margin included)
lry_site = site.orig_y + (site.ty_min - 1) * site.pas_y - site.marge

ul_latlon = transform.TransformPoint(ulx_site, uly_site, 0)
lr_latlon = transform.TransformPoint(lrx_site, lry_site, 0)
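# Editor's note: the TransformPoint calls above rely on the (x, y) -> (lon, lat)
# axis order used by GDAL/OGR 2.x, which this script targets. If it is ever run
# with GDAL >= 3, PROJ honours the EPSG-defined axis order and the coordinates may
# come back swapped; the usual fix (left commented, as it does not exist in GDAL 2) is:
#
#     latlon.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)
#     proj_site.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)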
liste_fic_mnt = []

############# CGIAR SRTM DEM
if options.mnt == "SRTM":
    # list of SRTM files needed
    if (ul_latlon[1]) > 60 or (lr_latlon[1] > 60):
        print "#################################################"
        print "latitude above 60 degrees, no SRTM data available"
        print "#################################################"
        sys.exit(-3)

    ul_latlon_srtm = [int(ul_latlon[0] + 180) / 5 + 1, int(60 - ul_latlon[1]) / 5 + 1]
    lr_latlon_srtm = [int(lr_latlon[0] + 180) / 5 + 1, int(60 - lr_latlon[1]) / 5 + 1]

    for x in range(ul_latlon_srtm[0], lr_latlon_srtm[0] + 1):
        for y in range(ul_latlon_srtm[1], lr_latlon_srtm[1] + 1):
            liste_fic_mnt.append("srtm_%02d_%02d.tif" % (x, y))

    print ul_latlon, lr_latlon
    print ul_latlon_srtm, lr_latlon_srtm
    print liste_fic_mnt

########## Planet Observer DEM
elif options.mnt == "PO":

    ul_latlon_po = [int(floor(ul_latlon[0])), int(floor(ul_latlon[1]))]
    lr_latlon_po = [int(floor(lr_latlon[0])), int(floor(lr_latlon[1]))]

    for x in range(ul_latlon_po[0], lr_latlon_po[0] + 1):
        for y in range(lr_latlon_po[1], ul_latlon_po[1] + 1):
            if x >= 0:
                ew = "e"
                num_x = x
            else:
                ew = "w"
                num_x = -x
            if y >= 0:
                ns = "n"
                num_y = y
            else:
                ns = "s"
                num_y = -y
            liste_fic_mnt.append("%s%03d/%s%02d.dt1" % (ew, num_x, ns, num_y))

    print ul_latlon, lr_latlon
    print ul_latlon_po, lr_latlon_po
    print liste_fic_mnt

# list of SWBD (SRTM Water Body Data) files needed
ul_latlon_swbd = [int(floor(ul_latlon[0])), int(floor(ul_latlon[1]))]
lr_latlon_swbd = [int(floor(lr_latlon[0])), int(floor(lr_latlon[1]))]
print ul_latlon, lr_latlon
print ul_latlon_swbd, lr_latlon_swbd

calcul_masque_eau_mnt = 0
if (ul_latlon[1]) > 60 or (lr_latlon[1] > 60):
    print "#################################################"
    print "latitude above 60 degrees, no SRTM water data"
    print "the water mask will be generated from the DEM"
    print "#################################################"
    calcul_masque_eau_mnt = 1

liste_fic_eau = []
liste_centre_eau = []
for x in range(ul_latlon_swbd[0], lr_latlon_swbd[0] + 1):
    for y in range(lr_latlon_swbd[1], ul_latlon_swbd[1] + 1):
        if x >= 0:
            ew = "e"
            num_x = x
        else:
            ew = "w"
            num_x = -x
        if y >= 0:
            ns = "n"
            num_y = y
        else:
            ns = "s"
            num_y = -y

        liste_fic_eau.append("%s%03d%s%02d" % (ew, num_x, ns, num_y))
        liste_centre_eau.append([x + 0.5, y + 0.5])

print "longitudes", ul_latlon_swbd[0], lr_latlon_swbd[0]
print "latitudes", lr_latlon_swbd[1], ul_latlon_swbd[1]
print "center coordinates", liste_centre_eau
print liste_fic_eau

# merge the SRTM tiles into a single DEM
(fic_mnt_in, fic_eau_in) = fusion_mnt(liste_fic_mnt, liste_fic_eau, liste_centre_eau, rep_mnt_in, rep_swbd, site.nom,
                                      calcul_masque_eau_mnt)
print "############", fic_mnt_in

#################### Loop creating the DEM and water files for each tile

for tx in range(site.tx_min, site.tx_max + 1):
    for ty in range(site.ty_min, site.ty_max + 1):
        ulx = site.orig_x + tx * site.pas_x  # upper left corner
        uly = site.orig_y + ty * site.pas_y
        lrx = site.orig_x + (tx + 1) * site.pas_x + site.marge  # lower right corner (margin included)
        lry = site.orig_y + (ty - 1) * site.pas_y - site.marge

        lrx_90m = int(ceil((lrx - ulx) / float(SRTM_RES))) * SRTM_RES + ulx
        lry_90m = uly - int(ceil((uly - lry) / float(SRTM_RES))) * SRTM_RES

        lrx_coarse = int(ceil((lrx - ulx) / float(options.COARSE_RES))) * options.COARSE_RES + ulx
        lry_coarse = uly - int(ceil((uly - lry) / float(options.COARSE_RES))) * options.COARSE_RES

        # for Sentinel-2, the tile name is simply the site name (e.g. 32SNE)
        nom_tuile = site.nom

        print "tile name", nom_tuile, tx, ty
        ### DEM files
        rep_mnt_out = rep_mnt + nom_tuile + '/'

        if options.eau_seulement == False:
            if not (os.path.exists(rep_mnt_out)):
                os.makedirs(rep_mnt_out)

            # SRTM_RES resolution
            print "############### here we go"
            mnt_90m = classe_mnt(rep_mnt_out, nom_tuile, ulx, uly, lrx_90m, lry_90m, SRTM_RES, site.chaine_proj)
            mnt_90m.decoupe(fic_mnt_in)

            # gradient computation at 90 m
            (fic_dz_dl_srtm, fic_dz_dc_srtm) = mnt_90m.calcul_gradient()

            # low resolution
            mnt_coarse = classe_mnt(rep_mnt_out, nom_tuile, ulx, uly, lrx_coarse, lry_coarse, options.COARSE_RES,
                                    site.chaine_proj)
            mnt_coarse.decoupe(fic_mnt_in)
            mnt_coarse.reech_gradient(fic_dz_dl_srtm, fic_dz_dc_srtm)
            mnt_coarse.calcul_pente_aspect_fic()

            # full resolution, at 10 m and 20 m
            for full_res in [10, 20]:
                mnt_full = classe_mnt(rep_mnt_out, nom_tuile, ulx, uly, lrx, lry, full_res, site.chaine_proj)
                mnt_full.decoupe(fic_mnt_in)
                mnt_full.reech_gradient(fic_dz_dl_srtm, fic_dz_dc_srtm)
                mnt_full.calcul_pente_aspect_fic()

        ### water mask
        rep_eau_out = rep_eau + nom_tuile + '/'
        if not (os.path.exists(rep_eau_out)):
            os.mkdir(rep_eau_out)

        eau = classe_mnt(rep_eau_out, nom_tuile, ulx, uly, lrx_coarse, lry_coarse, options.COARSE_RES, site.chaine_proj)
        if calcul_masque_eau_mnt == 0:
            eau.decoupe_eau(fic_eau_in)
        else:
            eau.calcul_masque_mnt(rep_mnt_out, nom_tuile)
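Editor's note: a typical invocation of the script above, matching its usage string, is
`python tuilage_mnt_eau_S2.py -p parametres_srtm.txt -s 32SNE.txt -m SRTM -c 240`. For
each tile it writes the DEM, gradient, slope and aspect rasters under rep_mnt/<tile name>/
and the water mask under rep_eau/<tile name>/; converting these to MAJA's DTM input
format is presumably the role of conversion_format_maja.py, listed in the tree above.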
--------------------------------------------------------------------------------
/start_maja.py:
--------------------------------------------------------------------------------
#! /usr/bin/env python
# -*- coding: utf-8 -*-
"""
Processes a Sentinel-2 time series for a tile using the MAJA processor for atmospheric correction and cloud screening.

MAJA was developed by CS-SI, under a CNES contract, using a multi-temporal method developed at CESBIO for the MACCS processor, and including methods developed by DLR for ATCOR.

This tool, developed by O. Hagolle (CNES/CESBIO), is a very basic one to show how to use MAJA to process a time series.

==================== Copyright
Software (start_maja.py)

Copyright© 2018 Centre National d'Etudes Spatiales

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License version 3
as published by the Free Software Foundation.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public
License along with this program. If not, see
https://www.gnu.org/licenses/gpl-3.0.fr.html
"""

import glob
import tempfile
import optparse
import os
import os.path
import shutil
import sys
print sys.path
import zipfile
from convert_CAMS_DBL import exocam_creation
import logging

START_MAJA_VERSION = 3.1

# #########################################################################


class OptionParser(optparse.OptionParser):

    def check_required(self, opt):
        option = self.get_option(opt)

        # Assumes the option's 'default' is set to None!
        if getattr(self.values, option.dest) is None:
            self.error("%s option not supplied" % option)


# #################################### read the "Key=Value" folder configuration file
def read_folders(fic_txt):

    repCode = repWork = repL1 = repL2 = repMaja = repCAMS = repCAMS_raw = None

    with file(fic_txt, 'r') as f:
        for ligne in f.readlines():
            # note: test repCAMS_raw before repCAMS, as both keys start with "repCAMS"
            if ligne.find('repCAMS_raw') == 0:
                repCAMS_raw = (ligne.split('=')[1]).strip()
            elif ligne.find('repCAMS') == 0:
                repCAMS = (ligne.split('=')[1]).strip()
            elif ligne.find('repCode') == 0:
                repCode = (ligne.split('=')[1]).strip()
            elif ligne.find('repWork') == 0:
                repWork = (ligne.split('=')[1]).strip()
            elif ligne.find('repL1') == 0:
                repL1 = (ligne.split('=')[1]).strip()
            elif ligne.find('repL2') == 0:
                repL2 = (ligne.split('=')[1]).strip()
            elif ligne.find('repMaja') == 0:
                repMaja = (ligne.split('=')[1]).strip()

    missing = False

    if repCode is None:
        logger.error(
            "repCode is missing from configuration file. Needed : repCode, repWork, repL1, repL2, repMaja")
        missing = True
    if repWork is None:
        logger.error(
            "repWork is missing from configuration file. Needed : repCode, repWork, repL1, repL2, repMaja")
        missing = True
    if repL1 is None:
        logger.error(
            "repL1 is missing from configuration file. Needed : repCode, repWork, repL1, repL2, repMaja")
        missing = True
    if repL2 is None:
        logger.error(
            "repL2 is missing from configuration file. Needed : repCode, repWork, repL1, repL2, repMaja")
        missing = True
    if repMaja is None:
        logger.error(
            "repMaja is missing from configuration file. Needed : repCode, repWork, repL1, repL2, repMaja")
        missing = True
    if repCAMS is None:
        logger.info("repCAMS is not set in the configuration file (it is optional)")
        logger.info("Processing without CAMS")

    if missing:
        raise Exception("Configuration file is not complete. See log file for more information.")

    directory_missing = False

    if not os.path.isdir(repCode):
        logger.error("repCode %s is missing", repCode)
        directory_missing = True
    if not os.path.isdir(repWork):
        logger.error("repWork %s is missing", repWork)
        directory_missing = True
    if not os.path.isdir(repL1):
        logger.error("repL1 %s is missing", repL1)
        directory_missing = True
    if not os.path.isdir(repL2):
        logger.error("repL2 %s is missing", repL2)
        directory_missing = True
    if not os.path.isfile(repMaja):
        logger.error("repMaja %s is missing", repMaja)
        directory_missing = True
    if repCAMS is not None and not os.path.isdir(repCAMS):
        logger.error("repCAMS %s is missing", repCAMS)
    if repCAMS_raw is not None and not os.path.isdir(repCAMS_raw):
        logger.error("repCAMS_raw %s is missing", repCAMS_raw)

    if directory_missing:
        raise Exception("One or more directories are missing. See log file for more information.")

    return repCode, repWork, repL1, repL2, repMaja, repCAMS, repCAMS_raw
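# Editor's note: for reference, a minimal folders.txt understood by read_folders()
# could look like the following (all paths are hypothetical placeholders; repCAMS
# and repCAMS_raw are optional, and repMaja must point to the maja binary itself,
# since it is checked with os.path.isfile):
#
#     repCode=/home/user/Start_maja
#     repWork=/home/user/maja_work
#     repL1=/home/user/L1C
#     repL2=/home/user/L2A
#     repMaja=/opt/maja/3.1/bin/maja
#     repCAMS=/home/user/CAMS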
# =============== Module to copy and link files

# replace tile name in example files
def replace_tile_name(fic_in, fic_out, tile_in, tile_out):
    with file(fic_in) as f_in:
        with file(fic_out, "w") as f_out:
            lignes = f_in.readlines()
            for l in lignes:
                if l.find(tile_in) > 0:
                    l = l.replace(tile_in, tile_out)
                f_out.write(l)


def add_parameter_files(repGipp, repWorkIn, tile, repCams):

    for fic in glob.glob(repGipp + "/*"):

        base = os.path.basename(fic)
        if fic.find("36JTT") > 0:
            # the GIPP files are delivered for tile 36JTT: adapt them to the processed tile
            replace_tile_name(fic, repWorkIn + '/' + base.replace("36JTT", tile), "36JTT", tile)
        else:
            logger.debug("Linking %s to %s", fic, repWorkIn + '/' + base)
            os.symlink(fic, repWorkIn + '/' + base)

    # links for CAMS files
    if repCams is not None:
        for fic in glob.glob(os.path.join(repCams, "*")):
            base = os.path.basename(fic)
            os.symlink(fic, os.path.join(repWorkIn, base))


def add_DEM(repDEM, repWorkIn, tile):
    logger.debug("%s/*%s*/*", repDEM, tile)
    for fic in glob.glob(repDEM + "/S2_*%s*/*" % tile):
        base = os.path.basename(fic)
        os.symlink(fic, repWorkIn + base)


def add_config_files(repConf, repWorkConf):
    os.symlink(repConf, repWorkConf)


def manage_rep_cams(repCams, repCamsRaw, working_dir):
    if repCamsRaw is not None:
        # convert the raw netCDF CAMS files to the DBL/HDR (EXO CAMS) format expected by MAJA
        if repCams is not None:
            logger.warning("Both repCAMS and repCAMS_raw are set; using the files converted from repCAMS_raw")
        working_directory = tempfile.mkdtemp(suffix="ConvertToExo_temp", dir=working_dir)
        repCams_out = tempfile.mkdtemp(suffix="ConvertToExo_out", dir=working_dir)
        exocam_creation(repCamsRaw, out_dir=repCams_out, working_dir=working_directory)
        return repCams_out

    return repCams


def unzipAndMoveL1C(L1Czipped, workdir, tile):
    # unzip the L1C product into the working directory
    try:
        with zipfile.ZipFile(L1Czipped, 'r') as zip_ref:
            safeDir = zip_ref.namelist()[0]  # name of the .SAFE directory in the archive
            zip_ref.extractall(workdir)
    except IOError:
        print("L1C zip file %s is not readable" % L1Czipped)
        sys.exit(-1)

    return


def test_valid_L2A(L2A_DIR):
    # test the validity of a MUSCATE-format Level-2A product
    JPIfile = glob.glob("%s/DATA/*_JPI_ALL.xml" % L2A_DIR)[0]
    valid = True
    try:
        with open(JPIfile) as f:
            for ligne in f:
                if ligne.find("L2NOTV") >= 0:
                    valid = False
                    prod_name = os.path.basename(L2A_DIR)
                    dir_name = os.path.dirname(L2A_DIR)
                    # rename the invalid product with an L2NOTV_ prefix
                    if not (os.path.exists(dir_name + "/L2NOTV_" + prod_name)):
                        os.rename(L2A_DIR, dir_name + "/L2NOTV_" + prod_name)
                    else:
                        shutil.rmtree(dir_name + "/L2NOTV_" + prod_name)
                        os.rename(L2A_DIR, dir_name + "/L2NOTV_" + prod_name)
                    print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
                    print "L2A product %s is not valid (probably due to too many clouds or No_data values)" % dir_name
                    print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"

    except IOError:
        valid = False
        print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
        print "L2A product %s not found " % L2A_DIR
        print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"

    return valid
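# Editor's note: taken together, the helpers above populate MAJA's input directory.
# Just before MAJA is launched, repWork/<site>/<tile>/<context>/in/ typically holds
# (hypothetical listing; actual GIPP and DTM file names depend on your GIPP set and
# on the DTM produced with prepare_mnt):
#
#     S2A_MSIL1C_..._T31TFJ_....SAFE    L1C product (symlink, or unzipped with -z)
#     SENTINEL2A_..._T31TFJ_C_V*        previous L2A product (nominal mode only)
#     GIPP files                        with tile 36JTT replaced by the processed tile
#     DTM files                         linked from repCode/DTM/S2_*<tile>*/
#     CAMS files                        linked from repCams (optional)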
def start_maja(folder_file, context, site, tile, orbit, nb_backward, options, debug_mode):
    # ================= directories
    (repCode, repWork, repL1, repL2, maja, repCams, repCamsRaw) = read_folders(folder_file)

    repCams = manage_rep_cams(repCams, repCamsRaw, repWork)

    repConf = repCode + "/userconf"
    if not (os.path.exists(repConf)):
        logger.error("Config dir %s does not exist", repConf)
        sys.exit(-1)
    repDtm = repCode + "/DTM"
    if not (os.path.exists(repDtm)):
        logger.error("DTM dir %s does not exist", repDtm)
        sys.exit(-1)
    repGipp = repCode + "/GIPP_%s" % context
    if not (os.path.exists(repGipp)):
        logger.error("GIPP dir %s does not exist", repGipp)
        sys.exit(-1)

    repWork = "%s/%s/%s/%s/" % (repWork, site, tile, context)
    if not (os.path.exists(repWork)):
        try:
            os.makedirs(repWork)
        except OSError:
            logger.error("something went wrong when creating %s", repWork)
            sys.exit(-1)
    repL1 = "%s/%s/" % (repL1, site)
    repL2 = "%s/%s/%s/%s/" % (repL2, site, tile, context)

    # check the existence of the input folders
    for fic in repL1, repCode, repWork, maja:
        if not (os.path.exists(fic)):
            logger.error("ERROR : %s does not exist", fic)
            sys.exit(-1)

    if not os.path.exists(repL2):
        os.makedirs(repL2)

    if options.zip:
        if orbit is not None:
            listeProd = glob.glob(repL1 + "/S2?_MSIL1C*%s_T%s*.zip" % (orbit, tile))
        else:
            listeProd = glob.glob(repL1 + "/S2?_MSIL1C*_T%s*.zip" % (tile))
    elif orbit is not None:
        listeProd = glob.glob(repL1 + "/S2?_OPER_PRD_MSIL1C*%s_*.SAFE/GRANULE/*%s*" % (orbit, tile))
        listeProd = listeProd + \
            glob.glob(repL1 + "/S2?_MSIL1C*%s_*.SAFE/GRANULE/*%s*" % (orbit, tile))
    else:
        listeProd = glob.glob(repL1 + "/S2?_OPER_PRD_MSIL1C*.SAFE/GRANULE/*%s*" % (tile))
        listeProd = listeProd + glob.glob(repL1 + "/S2?_MSIL1C*.SAFE/GRANULE/*%s*" % (tile))

    logger.debug("Product list %s", listeProd)
    # list of images to process
    dateProd = []
    dateImg = []
    listeProdFiltree = []

    if len(listeProd) == 0:
        if options.zip:
            logger.error("No L1C product found in %s" %
                         (repL1 + "/S2?_MSIL1C*%s_T%s*.zip" % (orbit, tile)))
        elif orbit is not None:
            logger.error("No L1C product found in %s or %s",
                         repL1 + "/S2?_OPER_PRD_MSIL1C*%s_*.SAFE/GRANULE/*%s*" % (orbit, tile),
                         repL1 + "/S2?_MSIL1C*%s_*.SAFE/GRANULE/*%s*" % (orbit, tile))
        else:
            logger.error("No L1C product found in %s or %s",
                         repL1 + "/S2?_OPER_PRD_MSIL1C*.SAFE/GRANULE/*%s*" % (tile),
                         repL1 + "/S2?_MSIL1C*.SAFE/GRANULE/*%s*" % (tile))
        sys.exit(-3)

    for elem in listeProd:
        if options.zip:
            rac = elem.split("/")[-1]
        else:
            rac = elem.split("/")[-3]
            elem = '/'.join(elem.split("/")[0:-2])
        logger.debug("elem: %s", elem)
        rac = os.path.basename(elem)
        logger.debug("rac: %s", rac)

        if rac.startswith("S2A_OPER_PRD_MSIL1C") or rac.startswith("S2B_OPER_PRD_MSIL1C"):
            date_asc = rac.split('_')[7][1:9]
        else:
            date_asc = rac.split('_')[2][0:8]
        logger.debug("date_asc %s %s %s/%s", date_asc, date_asc >=
                     options.startDate, date_asc, options.startDate)
        if date_asc >= options.startDate and date_asc <= options.endDate:
            dateImg.append(date_asc)
            if rac.startswith("S2A_OPER_PRD_MSIL1C") or rac.startswith("S2B_OPER_PRD_MSIL1C"):
                dateProd.append(rac.split('_')[5])
            else:
                dateProd.append(rac.split('_')[6])
            listeProdFiltree.append(elem)
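    # Editor's note: to make the field indexing above concrete, here is how the two
    # Sentinel-2 L1C naming conventions are parsed (product names are illustrative):
    #
    #   new compact convention (PSD >= 14):
    #     S2A_MSIL1C_20170105T013442_N0204_R031_T31TFJ_20170105T021348.SAFE
    #     split('_')[2][0:8]  -> '20170105'              (acquisition date, date_asc)
    #     split('_')[6]       -> '20170105T021348.SAFE'  (production timestamp, dateProd)
    #
    #   old convention:
    #     S2A_OPER_PRD_MSIL1C_PDMC_20160313T200925_R031_V20160310T104012_20160310T104912.SAFE
    #     split('_')[7][1:9]  -> '20160310'              (acquisition date, after the 'V')
    #     split('_')[5]       -> '20160313T200925'       (production timestamp)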
    # remove multiple images with the same acquisition date and tile
    logger.debug("date img %s", dateImg)
    logger.debug("set %s", set(dateImg))

    dates_diff = list(set(dateImg))
    dates_diff.sort()

    prod_par_dateImg = {}
    nomL2_par_dateImg_Natif = {}
    nomL2_par_dateImg_MUSCATE = {}
    for d in dates_diff:
        nb = dateImg.count(d)

        dpmax = ""
        ind = -1
        # search for the most recent production date
        for i in range(0, nb):
            ind = dateImg.index(d, ind + 1)
            dp = dateProd[ind]
            if dp > dpmax:
                dpmax = dp

        # keep only the product with the most recent production date
        ind = dateProd.index(dpmax)
        logger.debug("date prod max %s index in list %s", dpmax, ind)
        prod_par_dateImg[d] = listeProdFiltree[ind]
        nomL2_par_dateImg_Natif[d] = "S2?_OPER_SSC_L2VALD_%s____%s.DBL.DIR" % (tile, d)
        nomL2_par_dateImg_MUSCATE[d] = "SENTINEL2?_%s-*_T%s_C_V*" % (d, tile)
        logger.debug("d %s, prod_par_dateImg[d] %s", d, prod_par_dateImg[d])

    # find the first image to process
    logger.debug("dates_diff %s", dates_diff)

    derniereDate = ""
    for d in dates_diff:
        logger.debug("d %s", d)
        logger.debug("%s/%s", repL2, nomL2_par_dateImg_Natif[d])
        logger.debug("%s/%s", repL2, nomL2_par_dateImg_MUSCATE[d])

        # test the existence of an L2 product following either MAJA naming convention
        nomL2init_Natif = glob.glob("%s/%s" % (repL2, nomL2_par_dateImg_Natif[d]))
        nomL2init_MUSCATE = glob.glob("%s/%s" % (repL2, nomL2_par_dateImg_MUSCATE[d]))
        if len(nomL2init_Natif) > 0:
            derniereDate = d
            L2type = "Natif"
        elif len(nomL2init_MUSCATE) > 0:
            L2type = "MUSCATE"
            derniereDate = d

    if derniereDate == "":
        logger.info("No existing L2 product, we start with backward mode")
    else:
        logger.info("Most recent processed date : %s", derniereDate)

    # decide whether MAJA's debug mode is used
    if debug_mode:
        debug_option = "--loglevel DEBUG"
    else:
        debug_option = ""

    # ############## for each product
    nb_dates = len(dates_diff)

    logger.debug("nb dates %s", nb_dates)

    if not (os.path.exists(repWork)):
        os.makedirs(repWork)
    if not (os.path.exists(repWork + "userconf")):
        add_config_files(repConf, repWork + "userconf")

    logger.debug("derniereDate %s", derniereDate)
    for i in range(nb_dates):
        d = dates_diff[i]
        # only process products newer than the last L2A date available in the output directory
        if d > derniereDate:
            logger.info("=> processing date %s" % d)
            if os.path.exists(repWork + "/in"):
                shutil.rmtree(repWork + "/in")
            os.makedirs(repWork + "/in")
            # backward mode, if it is the first date in the list
            if i == 0:
                nb_prod_backward = min(len(dates_diff), nb_backward)
                logger.info("dates to process in backward mode :")
                for date_backward in dates_diff[0:nb_prod_backward]:
                    logger.info("-- %s : %s" % (date_backward, prod_par_dateImg[date_backward]))
                    if options.zip:
                        unzipAndMoveL1C(prod_par_dateImg[date_backward], repWork + "/in/", tile)
                    else:
                        os.symlink(prod_par_dateImg[date_backward],
                                   repWork + "/in/" + os.path.basename(prod_par_dateImg[date_backward]))
                add_parameter_files(repGipp, repWork + "/in/", tile, repCams)
                add_DEM(repDtm, repWork + "/in/", tile)

                Maja_logfile = "%s/%s.log" % (repL2, os.path.basename(prod_par_dateImg[d]))
                logger.debug(os.listdir(os.path.join(repWork, "in")))
                # note: "> file 2>&1" rather than bash-only "&>", since os.system runs /bin/sh
                commande = "%s %s -i %s -o %s -m L2BACKWARD -ucs %s --TileId %s > %s 2>&1" % (
                    maja, debug_option, repWork + "/in", repL2, repWork + "/userconf", tile, Maja_logfile)
                logger.info("#################################")
                logger.info("#################################")
                logger.info("processing %s in backward mode" % prod_par_dateImg[d])
                logger.info("initialisation in backward mode takes longer")
                logger.info("MAJA logfile: %s", Maja_logfile)
                logger.info("#################################")
                os.system(commande)
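                # Editor's note: with the hypothetical folders.txt shown earlier and the
                # default site/tile (Arles, 31TFJ), the command built above would expand
                # to something like (paths are placeholders):
                #
                #   /opt/maja/3.1/bin/maja -i .../Arles/31TFJ/nominal/in \
                #       -o /home/user/L2A/Arles/31TFJ/nominal -m L2BACKWARD \
                #       -ucs .../Arles/31TFJ/nominal/userconf --TileId 31TFJ \
                #       > /home/user/L2A/Arles/31TFJ/nominal/<L1C name>.log 2>&1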
            # else, nominal mode
            else:
                nomL2 = ""
                # search for the previous L2 product
                logger.info("Using %s L2 type" % L2type)
                for PreviousDate in dates_diff[0:i]:
                    if L2type == "Natif":
                        nom_courant = "%s/%s" % (repL2, nomL2_par_dateImg_Natif[PreviousDate])
                    elif L2type == "MUSCATE":
                        nom_courant = "%s/%s" % (repL2, nomL2_par_dateImg_MUSCATE[PreviousDate])
                    try:
                        logger.debug(nom_courant)
                        nomL2 = glob.glob(nom_courant)[0]
                        logger.debug("Previous L2 names, per increasing date : %s", nomL2)
                    except IndexError:
                        logger.debug("no L2 product for : %s", nom_courant)
                logger.info("previous L2 : %s", nomL2)
                # copy (or symlink) the L1C
                if options.zip:
                    unzipAndMoveL1C(prod_par_dateImg[d], repWork + "/in/", tile)
                else:
                    os.symlink(prod_par_dateImg[d],
                               repWork + "/in/" + os.path.basename(prod_par_dateImg[d]))
                # link the previous L2A product according to its type
                if L2type == "Natif":
                    os.symlink(nomL2, repWork + "/in/" + os.path.basename(nomL2))
                    os.symlink(nomL2.replace("DBL.DIR", "HDR"),
                               repWork + "/in/" + os.path.basename(nomL2).replace("DBL.DIR", "HDR"))
                    os.symlink(nomL2.replace("DIR", ""), repWork + "/in/" +
                               os.path.basename(nomL2).replace("DIR", ""))
                elif L2type == "MUSCATE":
                    os.symlink(nomL2, repWork + "/in/" + os.path.basename(nomL2))

                Maja_logfile = "%s/%s.log" % (repL2, os.path.basename(prod_par_dateImg[d]))

                add_parameter_files(repGipp, repWork + "/in/", tile, repCams)
                add_DEM(repDtm, repWork + "/in/", tile)

                logger.debug(os.listdir(os.path.join(repWork, "in")))

                commande = "%s %s -i %s -o %s -m L2NOMINAL -ucs %s --TileId %s > %s 2>&1" % (
                    maja, debug_option, repWork + "/in", repL2, repWork + "/userconf", tile, Maja_logfile)
                logger.info("#################################")
                logger.info("#################################")
                logger.info("processing %s in nominal mode" % prod_par_dateImg[d])
                logger.info("MAJA logfile: %s", Maja_logfile)
                logger.info("#################################")
                os.system(commande)
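            # Editor's note: once MAJA has finished, the new L2A product appears in repL2
            # under one of the two naming conventions tested below; expanding the globs
            # for tile 31TFJ and d = '20170105' gives, for instance (illustrative names):
            #   Natif   : S2A_OPER_SSC_L2VALD_31TFJ____20170105.DBL.DIR (+ .HDR and .DBL)
            #   MUSCATE : SENTINEL2A_20170105-103212-460_L2A_T31TFJ_C_V1-0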
            # check which L2 naming convention the new product follows
            nomL2init_Natif = glob.glob("%s/%s" % (repL2, nomL2_par_dateImg_Natif[d]))
            nomL2init_MUSCATE = glob.glob("%s/%s" % (repL2, nomL2_par_dateImg_MUSCATE[d]))
            if len(nomL2init_Natif) > 0:
                L2type = "Natif"
            elif len(nomL2init_MUSCATE) > 0:
                L2type = "MUSCATE"
                # test if the L2A product is valid (the JPI file only exists for MUSCATE products)
                valid = test_valid_L2A(nomL2init_MUSCATE[0])

            # check for errors in the MAJA execution
            Error = False
            with open(Maja_logfile, "r") as logfile:
                for line in logfile:
                    if line.find("[E]") > 0:
                        print line
                        Error = True
            if Error:
                logger.info("#######################################")
                logger.info("Error detected, see: %s" % Maja_logfile)
                logger.info("#######################################")
                sys.exit(-1)


if __name__ == '__main__':
    # ========== command line
    if len(sys.argv) == 1:
        prog = os.path.basename(sys.argv[0])
        print '      ' + sys.argv[0] + ' [options]'
        print "     Help : ", prog, " --help"
        print "     or   : ", prog, " -h"

        print "example : "
        print "\t python %s -f folders.txt -c nominal -t 40KCB -s Reunion -d 20160401 " % sys.argv[0]
        sys.exit(-1)
    else:
        usage = "usage: %prog [options] "
        parser = OptionParser(usage=usage, version='%prog {}'.format(START_MAJA_VERSION))

        parser.add_option("-c", "--context", dest="context", action="store",
                          help="name of the test directory", type="string", default='nominal')

        parser.add_option("-t", "--tile", dest="tile", action="store",
                          help="tile number", type="string", default='31TFJ')

        parser.add_option("-s", "--site", dest="site", action="store",
                          help="site name", type="string", default='Arles')

        parser.add_option("-o", "--orbit", dest="orbit", action="store",
                          help="orbit number", type="string", default=None)

        parser.add_option("-f", "--folder", dest="folder_file", action="store", type="string",
                          help="folder definition file", default=None)

        parser.add_option("-d", "--startDate", dest="startDate", action="store",
                          help="start date for processing (optional)", type="string", default="20150623")

        parser.add_option("-e", "--endDate", dest="endDate", action="store",
                          help="end date for processing (optional)", type="string", default="30000101")

        parser.add_option("-z", "--zip", dest="zip", action="store_true",
                          help="input L1C are zip files", default=False)

        parser.add_option("-v", "--verbose", dest="verbose", action="store_true",
                          help="Will provide verbose start_maja logs", default=False)

        parser.add_option("--debug", dest="debug", action="store_true",
                          help="Use MAJA Debug mode to get verbose logs", default=False)

        (options, args) = parser.parse_args()

    # logfile configuration
    logger = logging.getLogger('Start-Maja')
    if options.verbose:
        logger.setLevel(logging.DEBUG)
    else:
        logger.setLevel(logging.INFO)
    if not logger.handlers:
        ch = logging.StreamHandler()
        # the handler must let DEBUG messages through when -v is used
        ch.setLevel(logging.DEBUG if options.verbose else logging.INFO)
        formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
        ch.setFormatter(formatter)
        logger.addHandler(ch)
    logger.debug("options.startDate %s", options.startDate)

    tile = options.tile
    site = options.site
    orbit = options.orbit
    context = options.context
    folder_file = options.folder_file
    debug_mode = options.debug

    nb_backward = 8  # number of images to process in backward mode

    start_maja(folder_file, context, site, tile, orbit, nb_backward, options, debug_mode)
--------------------------------------------------------------------------------
/userconf/MAJAUserConfigSystem.xml and /userconf/MAJAUserConfig_*.xml:
--------------------------------------------------------------------------------
[Not reproduced: the markup of these thirteen MAJA user-configuration XML files
(listed in the repository tree above) was stripped during extraction, leaving only
bare element values: processing chunk sizes (500 or 800), coarse resolutions
(96 to 240, 100 for VENUS), threshold values (3000, 4000, 8000) and a few boolean
flags, without the tags that give them meaning. Refer to the files themselves in
the repository.]
--------------------------------------------------------------------------------