├── .Rbuildignore ├── .gitignore ├── DESCRIPTION ├── LandsatLinkr ├── LandsatLinkr.Rproj ├── LandsatLinkr_function_map.txt ├── NAMESPACE ├── R ├── cal_mss_tc_aggregate_model.r ├── cal_oli_tc_aggregate_model.r ├── calibrate_and_composite.r ├── change_extension.r ├── delete_files.r ├── eudist.r ├── file_check.r ├── get_metadata.r ├── llr_time_machine.r ├── make_usearea_file.r ├── make_usearea_file_bsq.r ├── matrix_to_raster.r ├── mixel.r ├── mosaic_dems.r ├── mss_resample.r ├── msscal.r ├── msscal_single.r ├── msscost.r ├── msscvm.r ├── mssdn2refl.r ├── msssr2tc.r ├── mssunpackr.r ├── msswarp.r ├── msswarp_old.r ├── olical.r ├── olical_single.r ├── olisr2tc.r ├── oliunpackr.r ├── prepare_images.r ├── prepare_topo.r ├── run_landsatlinkr.r ├── set_projection.r ├── tmunpackr.r └── trim_na_rowcol.r ├── README.md ├── docs ├── lasrc_product_guide.pdf └── ledaps_product_guide.pdf └── man ├── cal_mss_tc_aggregate_model.Rd ├── cal_oli_tc_aggregate_model.Rd ├── calibrate_and_composite.Rd ├── change_extension.Rd ├── delete_files.Rd ├── eudist.Rd ├── file_check.Rd ├── get_metadata.Rd ├── llr_time_machine.Rd ├── make_usearea_file.Rd ├── make_usearea_file_bsq.Rd ├── matrix_to_raster.Rd ├── mixel.Rd ├── mosaic_dems.Rd ├── mss_resample.Rd ├── msscal.Rd ├── msscal_single.Rd ├── msscost.Rd ├── msscvm.Rd ├── mssdn2refl.Rd ├── msssr2tc.Rd ├── mssunpackr.Rd ├── msswarp.Rd ├── msswarp_old.Rd ├── olical.Rd ├── olical_single.Rd ├── olisr2tc.Rd ├── oliunpackr.Rd ├── prepare_images.Rd ├── prepare_topo.Rd ├── run_landsatlinkr.Rd ├── set_projection.Rd ├── tmunpackr.Rd └── trim_na_rowcol.Rd /.Rbuildignore: -------------------------------------------------------------------------------- 1 | ^.*\.Rproj$ 2 | ^\.Rproj\.user$ 3 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | .Rproj.user 2 | .Rhistory 3 | .RData 4 | -------------------------------------------------------------------------------- /DESCRIPTION: -------------------------------------------------------------------------------- 1 | Package: LandsatLinkr 2 | Title: Tools to spectrally link Landsat data 3 | Version: 0.5.0 4 | Date: 2018-04-26 5 | Authors@R: c(person("Justin", "Braaten", email = "jstnbraaten@gmail.com", role = c("aut", "cre")), 6 | person("Warren", "Cohen", email = "warren.cohen@oregonstate.edu", role = "aut"), 7 | person("Zhiqiang", "Yang", email = "zhiqiang.yang@oregonstate.edu", role = "aut")) 8 | Description: An automated system for processing large volumes of Landsat imagery 9 | and building long spectrally consistent chronologies across MSS, TM/ETM+, and 10 | OLI sensors 11 | Depends: 12 | raster, 13 | SDMTools, 14 | doParallel, 15 | foreach, 16 | plyr, 17 | gdalUtils, 18 | rgdal, 19 | igraph, 20 | MASS 21 | License: GPL-2 22 | URL: http://landsatlinkr.jdbcode.com/ 23 | LazyData: true 24 | RoxygenNote: 6.0.1 25 | -------------------------------------------------------------------------------- /LandsatLinkr: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jdbcode/LandsatLinkr/fbaa880ed1dccc5076187aacf9a08d3d0321d57a/LandsatLinkr -------------------------------------------------------------------------------- /LandsatLinkr.Rproj: -------------------------------------------------------------------------------- 1 | Version: 1.0 2 | 3 | RestoreWorkspace: Default 4 | SaveWorkspace: Default 5 | AlwaysSaveHistory: Default 6 | 7 | EnableCodeIndexing: Yes 8 | 
UseSpacesForTab: Yes 9 | NumSpacesForTab: 2 10 | Encoding: UTF-8 11 | 12 | RnwWeave: Sweave 13 | LaTeX: pdfLaTeX 14 | 15 | BuildType: Package 16 | PackageUseDevtools: Yes 17 | PackageInstallArgs: --no-multiarch --with-keep.source 18 | PackageCheckArgs: --no-manual 19 | PackageRoxygenize: rd,collate,namespace 20 | -------------------------------------------------------------------------------- /LandsatLinkr_function_map.txt: -------------------------------------------------------------------------------- 1 | 2 | 3 | run_landsatlinkr 4 | prepare_images 5 | mssunpackr 6 | file_check 7 | delete_files 8 | trim_na_rowcol 9 | msswarp 10 | delete_files 11 | mssdn2refl 12 | file_check 13 | get_metadata 14 | eudist 15 | set_projection 16 | msscost 17 | file_check 18 | get_metadata 19 | eudist 20 | set_projection 21 | prepare_topo 22 | get_metadata 23 | set_projection 24 | msscvm 25 | file_check 26 | get_metadata 27 | set_projection 28 | tmunpackr 29 | file_check 30 | trim_na_rowcol 31 | set_projection 32 | oliunpackr 33 | file_check 34 | trim_na_rowcol 35 | make_usearea_file_bsq 36 | set_projection 37 | make_usearea_file 38 | change_extension 39 | calibrate_and_composite 40 | mss_resample 41 | file_check 42 | msscal 43 | msscal_single 44 | cal_mss_tc_aggregate_model 45 | msssr2tc 46 | ???why no "file_check"??? 47 | matrix_to_raster 48 | set_projection 49 | olical 50 | olical_single 51 | cal_oli_tc_aggregate_model 52 | olisr2tc 53 | file_check 54 | matrix_to_raster 55 | set_projection 56 | mixel 57 | change_extension 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | 67 | -------------------------------------------------------------------------------- /NAMESPACE: -------------------------------------------------------------------------------- 1 | # Generated by roxygen2: do not edit by hand 2 | 3 | export(cal_mss_tc_aggregate_model) 4 | export(cal_oli_tc_aggregate_model) 5 | export(calibrate_and_composite) 6 | export(change_extension) 7 | export(delete_files) 8 | export(eudist) 9 | export(file_check) 10 | export(get_metadata) 11 | export(llr_time_machine) 12 | export(make_usearea_file) 13 | export(make_usearea_file_bsq) 14 | export(matrix_to_raster) 15 | export(mixel) 16 | export(mosaic_dems) 17 | export(mss_resample) 18 | export(msscal) 19 | export(msscal_single) 20 | export(msscost) 21 | export(msscvm) 22 | export(mssdn2refl) 23 | export(msssr2tc) 24 | export(mssunpackr) 25 | export(msswarp) 26 | export(olical) 27 | export(olical_single) 28 | export(olisr2tc) 29 | export(oliunpackr) 30 | export(prepare_images) 31 | export(prepare_topo) 32 | export(run_landsatlinkr) 33 | export(set_projection) 34 | export(tmunpackr) 35 | export(trim_na_rowcol) 36 | import(MASS) 37 | import(SDMTools) 38 | import(doParallel) 39 | import(foreach) 40 | import(gdalUtils) 41 | import(igraph) 42 | import(plyr) 43 | import(raster) 44 | import(rgdal) 45 | -------------------------------------------------------------------------------- /R/cal_mss_tc_aggregate_model.r: -------------------------------------------------------------------------------- 1 | #' Create an aggregate MSS TC model 2 | #' 3 | #' Create an aggregate MSS TC model with diagnostic figures 4 | #' @param dirname character. 
the directory to the calibration folder 5 | #' @export 6 | 7 | 8 | cal_mss_tc_aggregate_model = function(dir){ 9 | 10 | #make new directory 11 | outdir = file.path(dir,"aggregate_model") 12 | dir.create(outdir, recursive=T, showWarnings=F) 13 | 14 | plot_multi_cal = function(df,index, model, outfile){ 15 | 16 | if(index == "tcb"){limits = c(-500, 10000)} 17 | if(index == "tcg"){limits = c(-500, 5000)} 18 | if(index == "tcw"){limits = c(-6000, 1000)} 19 | if(index == "tca"){limits = c(-500, 5000)} 20 | if(model == "single"){ 21 | 22 | tmp = with(df,by(df, mss_img, function(x) rlm(refsamp ~ singlepred, data = x))) 23 | tmp = sapply(tmp, coef) 24 | 25 | title = paste("single model scatterplot and robust linear regression lines for",index) 26 | png(outfile, width = 1100, height=800) 27 | cols = seq(from=2,to=ncol(tmp)+1) 28 | plot(x=df$singlepred,y=df$refsamp, 29 | main=title, 30 | xlab=paste("mss-predicted",index), 31 | ylab=paste("tm",index), 32 | xlim=limits, 33 | ylim=limits) 34 | for(i in 1:ncol(tmp)){abline(a = tmp[1,i], b=tmp[2,i], col=cols[i])} 35 | legend(x=limits[1]+25,y=limits[2]-25, 36 | colnames(tmp), 37 | lty=rep(1,ncol(tmp)), 38 | col=cols) 39 | 40 | dev.off() 41 | } 42 | if(model == "aggregate"){ 43 | 44 | tmp = with(df,by(df, mss_img, function(x) rlm(refsamp ~ comppred, data = x))) 45 | tmp = sapply(tmp, coef) 46 | 47 | title = paste("aggregated model scatterplot and robust linear regression lines for",index) 48 | cols = seq(from=2,to=ncol(tmp)+1) 49 | 50 | png(outfile, width = 1100, height=800) 51 | plot(x=df$comppred,y=df$refsamp, 52 | main=title, 53 | xlab=paste("mss-predicted",index), 54 | ylab=paste("tm",index), 55 | xlim=limits, 56 | ylim=limits) 57 | 58 | m = rlm(refsamp ~ comppred, data=df) 59 | abline(coef = m$coefficients, lty=2, col="gray48", lwd=2.5) 60 | for(i in 1:ncol(tmp)){abline(a = tmp[1,i], b=tmp[2,i], col=cols[i])} 61 | legend(x=limits[1]+25,y=limits[2]-25, 62 | c("mean",colnames(tmp)), 63 | lty=c(2,rep(1,ncol(tmp))), 64 | col=c("gray48",cols)) 65 | 66 | dev.off() 67 | } 68 | } 69 | 70 | plot_multi_cal_dif = function(df, index, model, outfile){ 71 | if(index == "tcb"){limits = c(-500, 10000)} 72 | if(index == "tcg"){limits = c(-500, 5000)} 73 | if(index == "tcw"){limits = c(-6000, 1000)} 74 | if(index == "tca"){limits = c(-500, 5000)} 75 | meanrefsamp = mean(df$refsamp) 76 | if(model == "single"){ 77 | newdf = aggregate(singlepreddif ~ mss_img , data = df, mean) 78 | newdf$diffm = meanrefsamp + newdf$singlepreddif 79 | title = paste("single model mean prediction differences from actual values for",index) 80 | } 81 | if(model == "aggregate"){ 82 | newdf = aggregate(comppreddif ~ mss_img , data = df, mean) 83 | newdf$diffm = meanrefsamp + newdf$comppreddif 84 | title = paste("aggregate model mean prediction differences from actual values for",index) 85 | } 86 | 87 | png(outfile, width = 1100, height=800) 88 | d = density(df$refsamp) 89 | plot(d, 90 | xlim=limits, 91 | main=title, 92 | xlab=paste(index)) 93 | abline(v=meanrefsamp,lty=2, col=1,lwd=2.5) 94 | abline(v=newdf$diffm, col=c(2:(length(newdf$diffm)+1))) 95 | legend(x=limits[2]-3000,y=max(d$y), 96 | c("mean",as.character(newdf$mss_img)), 97 | lty=c(2,rep(1,length(newdf$diffm))), 98 | col=c(1,2:(length(newdf$diffm)+1))) 99 | 100 | dev.off() 101 | } 102 | 103 | aggregate_cal_diag = function(sample_files, coef_files, index, outdir){ 104 | 105 | if(class(sample_files) != "data.frame"){ 106 | tbl = do.call("rbind", lapply(sample_files, read.csv, header = TRUE)) 107 | } else{ 108 | tbl = sample_files 109 | } 
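    # pool the retained per-scene samples and fit a single robust regression (MASS::rlm)
    # of the TM reference values on the four MSS band samples; the predictions and the
    # per-scene residuals computed below drive the diagnostic plots and the aggregate
    # coefficient table written to the aggregate_model directory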
110 | 111 | model = rlm(refsamp ~ b1samp + b2samp + b3samp + b4samp, data=tbl) 112 | tbl$comppred = round(predict(model)) 113 | tbl$singlepreddif = tbl$refsamp - tbl$singlepred 114 | tbl$comppreddif = tbl$refsamp - tbl$comppred 115 | tblcoef = model$coefficients 116 | r = cor(tbl$refsamp,tbl$comppred) 117 | coef = data.frame(index=index,yint=tblcoef[1],b1c=tblcoef[2],b2c=tblcoef[3],b3c=tblcoef[4],b4c=tblcoef[5],r=r) 118 | coeftbl = do.call("rbind", lapply(coef_files, read.csv, header = TRUE)) 119 | outfile = file.path(outdir,paste(index,"_cal_combined_coef.csv",sep="")) 120 | write.csv(coeftbl, outfile, row.names=F) 121 | 122 | outfile = file.path(outdir,paste(index,"_cal_aggregate_sample.csv",sep="")) 123 | write.csv(tbl, outfile, row.names=F) 124 | outfile = file.path(outdir,paste(index,"_cal_aggregate_coef.csv",sep="")) 125 | write.csv(coef, outfile, row.names=F) 126 | 127 | outfile = file.path(outdir,paste(index,"_aggregate_mean_dif.png",sep="")) 128 | plot_multi_cal_dif(tbl,index,"aggregate",outfile) 129 | outfile = file.path(outdir,paste(index,"_single_mean_dif.png",sep="")) 130 | plot_multi_cal_dif(tbl,index,"single",outfile) 131 | 132 | outfile = file.path(outdir,paste(index,"_aggregate_regression.png",sep="")) 133 | plot_multi_cal(tbl,index,"aggregate",outfile) 134 | outfile = file.path(outdir,paste(index,"_single_regression.png",sep="")) 135 | plot_multi_cal(tbl,index,"single",outfile) 136 | 137 | return(tbl) 138 | } 139 | 140 | find_good_samples = function(coeffiles){ 141 | len = length(coeffiles) 142 | id = substr(basename(coeffiles),1,16) 143 | r = array(NA,len) 144 | df = data.frame(id,r) 145 | for(i in 1:len){ 146 | r = read.csv(coeffiles[i], header = TRUE)$r 147 | df$r[i] = r 148 | } 149 | n_goods = 0 150 | thresh = 0.8 151 | while(n_goods < 1){ 152 | goods = which(df$r > thresh) 153 | n_goods = length(goods) 154 | thresh = thresh-0.05 155 | } 156 | return(as.character(df$id[goods])) 157 | } 158 | 159 | extract_good_samples = function(files, goodids){ 160 | len = length(goodids) 161 | these = 0 162 | for(i in 1:len){ 163 | match = grep(goodids[i], files) 164 | if(length(match) == 1){these = c(these,files[match])} 165 | } 166 | return(these[2:length(these)]) 167 | } 168 | 169 | tcbsamps = list.files(dir,"tcb_cal_samp.csv",recursive=T,full.names=T) 170 | tcgsamps = list.files(dir,"tcg_cal_samp.csv",recursive=T,full.names=T) 171 | tcwsamps = list.files(dir,"tcw_cal_samp.csv",recursive=T,full.names=T) 172 | tcasamps = list.files(dir,"tca_cal_samp.csv",recursive=T,full.names=T) 173 | 174 | tcbcoef = list.files(dir,"tcb_cal_coef.csv",recursive=T,full.names=T) 175 | tcgcoef = list.files(dir,"tcg_cal_coef.csv",recursive=T,full.names=T) 176 | tcwcoef = list.files(dir,"tcw_cal_coef.csv",recursive=T,full.names=T) 177 | tcacoef = list.files(dir,"tca_cal_coef.csv",recursive=T,full.names=T) 178 | 179 | tcbgoods = find_good_samples(tcbcoef) 180 | tcggoods = find_good_samples(tcgcoef) 181 | tcwgoods = find_good_samples(tcwcoef) 182 | tcagoods = find_good_samples(tcacoef) 183 | 184 | tcbsamps = extract_good_samples(tcbsamps, tcbgoods) 185 | tcgsamps = extract_good_samples(tcgsamps, tcggoods) 186 | tcwsamps = extract_good_samples(tcwsamps, tcwgoods) 187 | tcasamps = extract_good_samples(tcasamps, tcagoods) 188 | 189 | tcbcoef = extract_good_samples(tcbcoef, tcbgoods) 190 | tcgcoef = extract_good_samples(tcgcoef, tcggoods) 191 | tcwcoef = extract_good_samples(tcwcoef, tcwgoods) 192 | tcacoef = extract_good_samples(tcacoef, tcagoods) 193 | 194 | btbl = aggregate_cal_diag(tcbsamps, tcbcoef, 
"tcb", outdir) 195 | gtbl = aggregate_cal_diag(tcgsamps, tcgcoef, "tcg", outdir) 196 | wtbl = aggregate_cal_diag(tcwsamps, tcwcoef, "tcw", outdir) 197 | atbl = aggregate_cal_diag(tcasamps, tcacoef, "tca", outdir) 198 | } 199 | 200 | 201 | -------------------------------------------------------------------------------- /R/cal_oli_tc_aggregate_model.r: -------------------------------------------------------------------------------- 1 | #' Create an aggregate oli TC model 2 | #' 3 | #' Create an aggregate oli TC model with diagnostic figures 4 | #' @param dirname character. the directory to the calibration folder 5 | #' @export 6 | 7 | 8 | cal_oli_tc_aggregate_model = function(dir, overwrite=F){ 9 | 10 | 11 | #make new directory 12 | outdir = file.path(dir,"aggregate_model") 13 | dir.create(outdir, recursive=T, showWarnings=F) 14 | 15 | #check to see if single cal has already been run 16 | files = list.files(outdir) 17 | thesefiles = c("tca_aggregate_mean_dif.png","tca_aggregate_regression.png","tca_cal_aggregate_coef.csv","tca_cal_aggregate_sample.csv","tca_cal_combined_coef.csv","tca_single_mean_dif.png","tca_single_regression.png", 18 | "tcb_aggregate_mean_dif.png","tcb_aggregate_regression.png","tcb_cal_aggregate_coef.csv","tcb_cal_aggregate_sample.csv","tcb_cal_combined_coef.csv","tcb_single_mean_dif.png","tcb_single_regression.png", 19 | "tcg_aggregate_mean_dif.png","tcg_aggregate_regression.png","tcg_cal_aggregate_coef.csv","tcg_cal_aggregate_sample.csv","tcg_cal_combined_coef.csv","tcg_single_mean_dif.png","tcg_single_regression.png", 20 | "tcw_aggregate_mean_dif.png","tcw_aggregate_regression.png","tcw_cal_aggregate_coef.csv","tcw_cal_aggregate_sample.csv","tcw_cal_combined_coef.csv","tcw_single_mean_dif.png","tcw_single_regression.png") 21 | 22 | results = rep(NA,length(thesefiles)) 23 | for(i in 1:length(results)){ 24 | test = grep(thesefiles[i], files) 25 | results[i] = length(test) > 0 26 | } 27 | if(all(results) == T & overwrite == F){return(0)} 28 | 29 | ###########define some functions############ 30 | plot_multi_cal = function(df,index, model, outfile){ 31 | 32 | if(index == "tcb"){limits = c(-500, 10000)} 33 | if(index == "tcg"){limits = c(-500, 5000)} 34 | if(index == "tcw"){limits = c(-6000, 1000)} 35 | if(index == "tca"){limits = c(-500, 5000)} 36 | if(model == "single"){ 37 | 38 | 39 | tmp = with(df,by(df, oli_img, function(x) rlm(refsamp ~ singlepred, data = x))) 40 | tmp = sapply(tmp, coef) 41 | 42 | title = paste("single model scatterplot and robust linear regression lines for",index) 43 | png(outfile, width = 1100, height=800) 44 | cols = seq(from=2,to=ncol(tmp)+1) 45 | plot(x=df$singlepred,y=df$refsamp, 46 | main=title, 47 | xlab=paste("oli-predicted",index), 48 | ylab=paste("tm",index), 49 | xlim=limits, 50 | ylim=limits) 51 | for(i in 1:ncol(tmp)){abline(a = tmp[1,i], b=tmp[2,i], col=cols[i])} 52 | legend(x=limits[1]+25,y=limits[2]-25, 53 | colnames(tmp), 54 | lty=rep(1,ncol(tmp)), 55 | col=cols) 56 | 57 | dev.off() 58 | } 59 | if(model == "aggregate"){ 60 | 61 | tmp = with(df,by(df, oli_img, function(x) rlm(refsamp ~ comppred, data = x))) 62 | tmp = sapply(tmp, coef) 63 | 64 | title = paste("aggregated model scatterplot and robust linear regression lines for",index) 65 | cols = seq(from=2,to=ncol(tmp)+1) 66 | 67 | png(outfile, width = 1100, height=800) 68 | plot(x=df$comppred,y=df$refsamp, 69 | main=title, 70 | xlab=paste("oli-predicted",index), 71 | ylab=paste("tm",index), 72 | xlim=limits, 73 | ylim=limits) 74 | 75 | m = rlm(refsamp ~ comppred, data=df) 76 | 
abline(coef = m$coefficients, lty=2, col="gray48", lwd=2.5) 77 | for(i in 1:ncol(tmp)){abline(a = tmp[1,i], b=tmp[2,i], col=cols[i])} 78 | legend(x=limits[1]+25,y=limits[2]-25, 79 | c("mean",colnames(tmp)), 80 | lty=c(2,rep(1,ncol(tmp))), 81 | col=c("gray48",cols)) 82 | 83 | dev.off() 84 | } 85 | } 86 | 87 | plot_multi_cal_dif = function(df, index, model, outfile){ 88 | if(index == "tcb"){limits = c(-500, 10000)} 89 | if(index == "tcg"){limits = c(-500, 5000)} 90 | if(index == "tcw"){limits = c(-6000, 1000)} 91 | if(index == "tca"){limits = c(-500, 5000)} 92 | meanrefsamp = mean(df$refsamp) 93 | if(model == "single"){ 94 | newdf = aggregate(singlepreddif ~ oli_img , data = df, mean) 95 | newdf$diffm = meanrefsamp + newdf$singlepreddif 96 | title = paste("single model mean prediction differences from actual values for",index) 97 | } 98 | if(model == "aggregate"){ 99 | newdf = aggregate(comppreddif ~ oli_img , data = df, mean) 100 | newdf$diffm = meanrefsamp + newdf$comppreddif 101 | title = paste("aggregate model mean prediction differences from actual values for",index) 102 | } 103 | 104 | png(outfile, width = 1100, height=800) 105 | d = density(df$refsamp) 106 | plot(d, 107 | xlim=limits, 108 | main=title, 109 | xlab=paste(index)) 110 | abline(v=meanrefsamp,lty=2, col=1, lwd=2.5) 111 | abline(v=newdf$diffm, col=c(2:(length(newdf$diffm)+1))) 112 | legend(x=limits[2]-3000,y=max(d$y), 113 | c("mean",as.character(newdf$oli_img)), 114 | lty=c(2,rep(1,length(newdf$diffm))), 115 | col=c(1,2:(length(newdf$diffm)+1))) 116 | 117 | dev.off() 118 | } 119 | 120 | aggregate_cal_diag = function(sample_files, coef_files, index, outdir){ 121 | 122 | if(class(sample_files) != "data.frame"){ 123 | tbl = do.call("rbind", lapply(sample_files, read.csv, header = TRUE)) 124 | } else {tbl = sample_files} 125 | #if(index == "tca"){model = rlm(refsamp ~ comppred, data=tbl)} else { 126 | model = rlm(refsamp ~ b2samp + b3samp + b4samp + b5samp + b6samp + b7samp, data=tbl) 127 | tbl$comppred = round(predict(model)) 128 | #} 129 | tbl$singlepreddif = tbl$refsamp - tbl$singlepred 130 | tbl$comppreddif = tbl$refsamp - tbl$comppred 131 | tblcoef = model$coefficients 132 | r = cor(tbl$refsamp,tbl$comppred) 133 | coef = data.frame(index=index,yint=tblcoef[1],b2c=tblcoef[2],b3c=tblcoef[3],b4c=tblcoef[4],b5c=tblcoef[5],b6c=tblcoef[6],b7c=tblcoef[7],r=r) 134 | 135 | coeftbl = do.call("rbind", lapply(coef_files, read.csv, header = TRUE)) 136 | outfile = file.path(outdir,paste(index,"_cal_combined_coef.csv",sep="")) 137 | write.csv(coeftbl, outfile, row.names=F) 138 | 139 | outfile = file.path(outdir,paste(index,"_cal_aggregate_sample.csv",sep="")) 140 | write.csv(tbl, outfile, row.names=F) 141 | outfile = file.path(outdir,paste(index,"_cal_aggregate_coef.csv",sep="")) 142 | write.csv(coef, outfile, row.names=F) 143 | 144 | outfile = file.path(outdir,paste(index,"_aggregate_mean_dif.png",sep="")) 145 | plot_multi_cal_dif(tbl,index,"aggregate",outfile) 146 | outfile = file.path(outdir,paste(index,"_single_mean_dif.png",sep="")) 147 | plot_multi_cal_dif(tbl,index,"single",outfile) 148 | 149 | outfile = file.path(outdir,paste(index,"_aggregate_regression.png",sep="")) 150 | plot_multi_cal(tbl,index,"aggregate",outfile) 151 | outfile = file.path(outdir,paste(index,"_single_regression.png",sep="")) 152 | plot_multi_cal(tbl,index,"single",outfile) 153 | 154 | return(tbl) 155 | } 156 | 157 | find_good_samples = function(coeffiles){ 158 | len = length(coeffiles) 159 | id = substr(basename(coeffiles),1,16) 160 | r = array(NA,len) 161 | 
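    # build a table of scene IDs and their single-scene calibration correlation (r),
    # then keep only scenes whose r clears a threshold; the while-loop below starts
    # at r > 0.8 and relaxes the cutoff by 0.05 until at least one scene qualifies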
df = data.frame(id,r) 162 | for(i in 1:len){ 163 | r = read.csv(coeffiles[i], header = TRUE)$r 164 | df$r[i] = r 165 | } 166 | n_goods = 0 167 | thresh = 0.8 168 | while(n_goods < 1){ 169 | goods = which(df$r > thresh) 170 | n_goods = length(goods) 171 | thresh = thresh-0.05 172 | } 173 | return(as.character(df$id[goods])) 174 | } 175 | 176 | extract_good_samples = function(files, goodids){ 177 | len = length(goodids) 178 | these = 0 179 | for(i in 1:len){ 180 | match = grep(goodids[i], files) 181 | if(length(match) == 1){these = c(these,files[match])} 182 | } 183 | return(these[2:length(these)]) 184 | } 185 | 186 | #######run the functions 187 | tcbsamps = list.files(dir,"tcb_cal_samp.csv",recursive=T,full.names=T) 188 | tcgsamps = list.files(dir,"tcg_cal_samp.csv",recursive=T,full.names=T) 189 | tcwsamps = list.files(dir,"tcw_cal_samp.csv",recursive=T,full.names=T) 190 | tcasamps = list.files(dir,"tca_cal_samp.csv",recursive=T,full.names=T) 191 | 192 | tcbcoef = list.files(dir,"tcb_cal_coef.csv",recursive=T,full.names=T) 193 | tcgcoef = list.files(dir,"tcg_cal_coef.csv",recursive=T,full.names=T) 194 | tcwcoef = list.files(dir,"tcw_cal_coef.csv",recursive=T,full.names=T) 195 | tcacoef = list.files(dir,"tca_cal_coef.csv",recursive=T,full.names=T) 196 | 197 | tcbgoods = find_good_samples(tcbcoef) 198 | tcggoods = find_good_samples(tcgcoef) 199 | tcwgoods = find_good_samples(tcwcoef) 200 | tcagoods = find_good_samples(tcacoef) 201 | 202 | tcbsamps = extract_good_samples(tcbsamps, tcbgoods) 203 | tcgsamps = extract_good_samples(tcgsamps, tcggoods) 204 | tcwsamps = extract_good_samples(tcwsamps, tcwgoods) 205 | tcasamps = extract_good_samples(tcasamps, tcagoods) 206 | 207 | tcbcoef = extract_good_samples(tcbcoef, tcbgoods) 208 | tcgcoef = extract_good_samples(tcgcoef, tcggoods) 209 | tcwcoef = extract_good_samples(tcwcoef, tcwgoods) 210 | tcacoef = extract_good_samples(tcacoef, tcagoods) 211 | 212 | btbl = aggregate_cal_diag(tcbsamps, tcbcoef, "tcb", outdir) 213 | gtbl = aggregate_cal_diag(tcgsamps, tcgcoef, "tcg", outdir) 214 | wtbl = aggregate_cal_diag(tcwsamps, tcwcoef, "tcw", outdir) 215 | atbl = aggregate_cal_diag(tcasamps, tcacoef, "tca", outdir) 216 | 217 | return(1) 218 | } 219 | 220 | 221 | -------------------------------------------------------------------------------- /R/calibrate_and_composite.r: -------------------------------------------------------------------------------- 1 | #' Calibrate MSS imagery to TM and make cloud-free composites 2 | #' 3 | #' Calibrate MSS imagery to TM and make cloud-free composites 4 | #' @param msswrs1dir character. mss wrs1 directory path 5 | #' @param msswrs2dir character. mss wrs2 directory path 6 | #' @param tmwrs2dir character. tm wrs2 directory path 7 | #' @param index character. spectral index to make composites for. options: "tca", "tcb", "tcg", "tcw" 8 | #' @param outdir character. path to output directory 9 | #' @param runname character. unique name for the composite set 10 | #' @param useareafile character. path to usearea file 11 | #' @param doyears ??? what years to composite 12 | #' @param order character. how to order the images options "sensor_and_doy", "doy", and "none" 13 | #' @param overlap character. how to deal with overlapping images. options: "mean" 14 | #' @param cores numeric. Number of cores to process with options: 1 or 2 15 | #' @param process numeric. 
integer or vector specifying which processes to run: 1=msscal, 2=mixel 16 | #' @import foreach 17 | #' @import doParallel 18 | #' @export 19 | 20 | 21 | calibrate_and_composite = function(msswrs1dir,msswrs2dir,tmwrs2dir,oliwrs2dir,index,outdir,runname,useareafile,doyears="all",order="none",overlap="mean", cores=2, process, overwrite=F ,startday, endday, yearadj){ 22 | 23 | #msscal 24 | if(1 %in% process ==T){ 25 | #resample MSS 26 | print("Resampling MSS reflectance and cloudmask images") 27 | msswrs1srfiles = list.files(msswrs1dir, "dos_sr.tif", recursive=T, full.names=T) 28 | msswrs1cloudfiles = list.files(msswrs1dir, "cloudmask.tif", recursive=T, full.names=T) 29 | msswrs2srfiles = list.files(msswrs2dir, "dos_sr.tif", recursive=T, full.names=T) 30 | msswrs2cloudfiles = list.files(msswrs2dir, "cloudmask.tif", recursive=T, full.names=T) 31 | files = c(msswrs1srfiles,msswrs1cloudfiles,msswrs2srfiles,msswrs2cloudfiles) 32 | #cores=2 33 | if(cores == 2){ 34 | print("...in parallel") 35 | cl = makeCluster(cores) 36 | registerDoParallel(cl) 37 | o = foreach(i=1:length(files), .combine="c",.packages="LandsatLinkr") %dopar% mss_resample(files[i], overwrite=F) #hardwired to not overwrite 38 | stopCluster(cl) 39 | } else {for(i in 1:length(files)){o = mss_resample(files[i], overwrite=F)}} #hardwired to not overwrite 40 | 41 | 42 | print("Running msscal") 43 | t=proc.time() 44 | msscal(msswrs1dir, msswrs2dir, tmwrs2dir, cores=cores) 45 | print(proc.time()-t) 46 | } 47 | 48 | #olical 49 | if(2 %in% process ==T){ 50 | print("Running olical") 51 | t=proc.time() 52 | olical(oliwrs2dir, tmwrs2dir, cores=cores, overwrite=overwrite) 53 | print(proc.time()-t) 54 | } 55 | 56 | #mixel 57 | if(3 %in% process ==T){ 58 | print("Running mixel") 59 | t=proc.time() 60 | if(index == "all"){ 61 | index = c("tca", "tcb", "tcg", "tcw") 62 | outdir = c(file.path(outdir,"tca"),file.path(outdir,"tcb"),file.path(outdir,"tcg"),file.path(outdir,"tcw")) 63 | for(i in 1:length(index)){mixel(msswrs1dir,msswrs2dir,tmwrs2dir,oliwrs2dir,index[i],outdir[i],runname,useareafile,doyears="all",order="none",overlap=overlap, startday=startday, endday=endday, yearadj=yearadj)} #overlap="mean" 64 | } else { 65 | outdir = file.path(outdir,index) 66 | mixel(msswrs1dir,msswrs2dir,tmwrs2dir,oliwrs2dir,index,outdir,runname,useareafile,doyears="all",order="none",overlap=overlap, startday=startday, endday=endday, yearadj=yearadj) #overlap="mean" 67 | } 68 | print(proc.time()-t) 69 | } 70 | } -------------------------------------------------------------------------------- /R/change_extension.r: -------------------------------------------------------------------------------- 1 | #' Changes a file's extension 2 | #' 3 | #' Changes a file's extension 4 | #' @param old character. old extension 5 | #' @param new character. new extension 6 | #' @param file character. full path of the file to change extension 7 | #' @export 8 | 9 | 10 | change_extension = function(old, new, file){ 11 | end = nchar(file)-nchar(old) 12 | newfile = paste(substr(file,1,end),new,sep="") 13 | return(newfile) 14 | } 15 | 16 | 17 | -------------------------------------------------------------------------------- /R/delete_files.r: -------------------------------------------------------------------------------- 1 | #' Delete files 2 | #' 3 | #' Delete all files associated with a particular image ID 4 | #' @param file character. any file associated with a particular image ID 5 | #' @param reason numeric or character. a reason for deleting the image. 
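#' @examples
#' \dontrun{
#' # hypothetical path, for illustration only -- this call removes the files matching
#' # the 16-character image ID (the original .tar.gz, if present, is moved to
#' # images_deleted/targz) and logs the reason (here 4 = "Poor geowarping")
#' delete_files("E:/mss/wrs1/049030/images/1976/LM10490301976238_archv.tif", 4)
#' }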
6 | #' @export 7 | 8 | 9 | delete_files = function(file,reason){ 10 | if(regexpr(".tar.gz", file)[1] == -1){ 11 | dir = substr(dirname(file),1,nchar(dirname(file))-12) 12 | } else {dir = substr(dirname(file),1,nchar(dirname(file))-6)} 13 | 14 | imgid = substr(basename(file),1,16) 15 | files = list.files(path=dir, pattern=imgid, full.names=T, recursive=T) 16 | if(reason == 1){reason = "L1G"} 17 | if(reason == 2){reason = "Could not find enough geowarp tie-point"} 18 | if(reason == 3){reason = "Poor TC regression"} 19 | if(reason == 4){reason = "Poor geowarping"} 20 | if(reason == 5){reason = "Empty band(s)"} 21 | if(reason == 6){reason = "MSS year >= 1995: missing band 4"} 22 | outfile = file.path(dir,"images_deleted",paste(imgid,"_delete_record.csv",sep="")) 23 | outdir = dirname(outfile) 24 | outdirtargz = file.path(outdir,"targz") 25 | dir.create(outdir, recursive=T, showWarnings=F) 26 | dir.create(outdirtargz, recursive=T, showWarnings=F) 27 | write(c(imgid,reason), file=outfile, ncolumns=2, sep=",") 28 | targz = grep(".tar.gz", files, value=T) 29 | if(length(targz) == 1){ 30 | newtargz = file.path(outdirtargz, basename(targz)) 31 | file.rename(targz,newtargz) 32 | } 33 | unlink(files, force=T) 34 | } 35 | -------------------------------------------------------------------------------- /R/eudist.r: -------------------------------------------------------------------------------- 1 | #' Earth-Sun distance by day of year 2 | #' 3 | #' Earth-Sun distance by day of year 4 | #' @param doy numeric. image day-of-year 5 | #' @export 6 | #' @references http://landsathandbook.gsfc.nasa.gov/excel_docs/d.xls 7 | 8 | eudist = function(doy){ 9 | 10 | dau = c( 11 | 0.98331, 12 | 0.98330, 13 | 0.98330, 14 | 0.98330, 15 | 0.98330, 16 | 0.98332, 17 | 0.98333, 18 | 0.98335, 19 | 0.98338, 20 | 0.98341, 21 | 0.98345, 22 | 0.98349, 23 | 0.98354, 24 | 0.98359, 25 | 0.98365, 26 | 0.98371, 27 | 0.98378, 28 | 0.98385, 29 | 0.98393, 30 | 0.98401, 31 | 0.98410, 32 | 0.98419, 33 | 0.98428, 34 | 0.98439, 35 | 0.98449, 36 | 0.98460, 37 | 0.98472, 38 | 0.98484, 39 | 0.98496, 40 | 0.98509, 41 | 0.98523, 42 | 0.98536, 43 | 0.98551, 44 | 0.98565, 45 | 0.98580, 46 | 0.98596, 47 | 0.98612, 48 | 0.98628, 49 | 0.98645, 50 | 0.98662, 51 | 0.98680, 52 | 0.98698, 53 | 0.98717, 54 | 0.98735, 55 | 0.98755, 56 | 0.98774, 57 | 0.98794, 58 | 0.98814, 59 | 0.98835, 60 | 0.98856, 61 | 0.98877, 62 | 0.98899, 63 | 0.98921, 64 | 0.98944, 65 | 0.98966, 66 | 0.98989, 67 | 0.99012, 68 | 0.99036, 69 | 0.99060, 70 | 0.99084, 71 | 0.99108, 72 | 0.99133, 73 | 0.99158, 74 | 0.99183, 75 | 0.99208, 76 | 0.99234, 77 | 0.99260, 78 | 0.99286, 79 | 0.99312, 80 | 0.99339, 81 | 0.99365, 82 | 0.99392, 83 | 0.99419, 84 | 0.99446, 85 | 0.99474, 86 | 0.99501, 87 | 0.99529, 88 | 0.99556, 89 | 0.99584, 90 | 0.99612, 91 | 0.99640, 92 | 0.99669, 93 | 0.99697, 94 | 0.99725, 95 | 0.99754, 96 | 0.99782, 97 | 0.99811, 98 | 0.99840, 99 | 0.99868, 100 | 0.99897, 101 | 0.99926, 102 | 0.99954, 103 | 0.99983, 104 | 1.00012, 105 | 1.00041, 106 | 1.00069, 107 | 1.00098, 108 | 1.00127, 109 | 1.00155, 110 | 1.00184, 111 | 1.00212, 112 | 1.00240, 113 | 1.00269, 114 | 1.00297, 115 | 1.00325, 116 | 1.00353, 117 | 1.00381, 118 | 1.00409, 119 | 1.00437, 120 | 1.00464, 121 | 1.00492, 122 | 1.00519, 123 | 1.00546, 124 | 1.00573, 125 | 1.00600, 126 | 1.00626, 127 | 1.00653, 128 | 1.00679, 129 | 1.00705, 130 | 1.00731, 131 | 1.00756, 132 | 1.00781, 133 | 1.00806, 134 | 1.00831, 135 | 1.00856, 136 | 1.00880, 137 | 1.00904, 138 | 1.00928, 139 | 1.00952, 140 | 1.00975, 141 | 1.00998, 
142 | 1.01020, 143 | 1.01043, 144 | 1.01065, 145 | 1.01087, 146 | 1.01108, 147 | 1.01129, 148 | 1.01150, 149 | 1.01170, 150 | 1.01191, 151 | 1.01210, 152 | 1.01230, 153 | 1.01249, 154 | 1.01267, 155 | 1.01286, 156 | 1.01304, 157 | 1.01321, 158 | 1.01338, 159 | 1.01355, 160 | 1.01371, 161 | 1.01387, 162 | 1.01403, 163 | 1.01418, 164 | 1.01433, 165 | 1.01447, 166 | 1.01461, 167 | 1.01475, 168 | 1.01488, 169 | 1.01500, 170 | 1.01513, 171 | 1.01524, 172 | 1.01536, 173 | 1.01547, 174 | 1.01557, 175 | 1.01567, 176 | 1.01577, 177 | 1.01586, 178 | 1.01595, 179 | 1.01603, 180 | 1.01610, 181 | 1.01618, 182 | 1.01625, 183 | 1.01631, 184 | 1.01637, 185 | 1.01642, 186 | 1.01647, 187 | 1.01652, 188 | 1.01656, 189 | 1.01659, 190 | 1.01662, 191 | 1.01665, 192 | 1.01667, 193 | 1.01668, 194 | 1.01670, 195 | 1.01670, 196 | 1.01670, 197 | 1.01670, 198 | 1.01669, 199 | 1.01668, 200 | 1.01666, 201 | 1.01664, 202 | 1.01661, 203 | 1.01658, 204 | 1.01655, 205 | 1.01650, 206 | 1.01646, 207 | 1.01641, 208 | 1.01635, 209 | 1.01629, 210 | 1.01623, 211 | 1.01616, 212 | 1.01609, 213 | 1.01601, 214 | 1.01592, 215 | 1.01584, 216 | 1.01575, 217 | 1.01565, 218 | 1.01555, 219 | 1.01544, 220 | 1.01533, 221 | 1.01522, 222 | 1.01510, 223 | 1.01497, 224 | 1.01485, 225 | 1.01471, 226 | 1.01458, 227 | 1.01444, 228 | 1.01429, 229 | 1.01414, 230 | 1.01399, 231 | 1.01383, 232 | 1.01367, 233 | 1.01351, 234 | 1.01334, 235 | 1.01317, 236 | 1.01299, 237 | 1.01281, 238 | 1.01263, 239 | 1.01244, 240 | 1.01225, 241 | 1.01205, 242 | 1.01186, 243 | 1.01165, 244 | 1.01145, 245 | 1.01124, 246 | 1.01103, 247 | 1.01081, 248 | 1.01060, 249 | 1.01037, 250 | 1.01015, 251 | 1.00992, 252 | 1.00969, 253 | 1.00946, 254 | 1.00922, 255 | 1.00898, 256 | 1.00874, 257 | 1.00850, 258 | 1.00825, 259 | 1.00800, 260 | 1.00775, 261 | 1.00750, 262 | 1.00724, 263 | 1.00698, 264 | 1.00672, 265 | 1.00646, 266 | 1.00620, 267 | 1.00593, 268 | 1.00566, 269 | 1.00539, 270 | 1.00512, 271 | 1.00485, 272 | 1.00457, 273 | 1.00430, 274 | 1.00402, 275 | 1.00374, 276 | 1.00346, 277 | 1.00318, 278 | 1.00290, 279 | 1.00262, 280 | 1.00234, 281 | 1.00205, 282 | 1.00177, 283 | 1.00148, 284 | 1.00119, 285 | 1.00091, 286 | 1.00062, 287 | 1.00033, 288 | 1.00005, 289 | 0.99976, 290 | 0.99947, 291 | 0.99918, 292 | 0.99890, 293 | 0.99861, 294 | 0.99832, 295 | 0.99804, 296 | 0.99775, 297 | 0.99747, 298 | 0.99718, 299 | 0.99690, 300 | 0.99662, 301 | 0.99634, 302 | 0.99605, 303 | 0.99577, 304 | 0.99550, 305 | 0.99522, 306 | 0.99494, 307 | 0.99467, 308 | 0.99440, 309 | 0.99412, 310 | 0.99385, 311 | 0.99359, 312 | 0.99332, 313 | 0.99306, 314 | 0.99279, 315 | 0.99253, 316 | 0.99228, 317 | 0.99202, 318 | 0.99177, 319 | 0.99152, 320 | 0.99127, 321 | 0.99102, 322 | 0.99078, 323 | 0.99054, 324 | 0.99030, 325 | 0.99007, 326 | 0.98983, 327 | 0.98961, 328 | 0.98938, 329 | 0.98916, 330 | 0.98894, 331 | 0.98872, 332 | 0.98851, 333 | 0.98830, 334 | 0.98809, 335 | 0.98789, 336 | 0.98769, 337 | 0.98750, 338 | 0.98731, 339 | 0.98712, 340 | 0.98694, 341 | 0.98676, 342 | 0.98658, 343 | 0.98641, 344 | 0.98624, 345 | 0.98608, 346 | 0.98592, 347 | 0.98577, 348 | 0.98562, 349 | 0.98547, 350 | 0.98533, 351 | 0.98519, 352 | 0.98506, 353 | 0.98493, 354 | 0.98481, 355 | 0.98469, 356 | 0.98457, 357 | 0.98446, 358 | 0.98436, 359 | 0.98426, 360 | 0.98416, 361 | 0.98407, 362 | 0.98399, 363 | 0.98391, 364 | 0.98383, 365 | 0.98376, 366 | 0.98370, 367 | 0.98363, 368 | 0.98358, 369 | 0.98353, 370 | 0.98348, 371 | 0.98344, 372 | 0.98340, 373 | 0.98337, 374 | 0.98335, 375 | 0.98333, 376 | 0.98331) 377 | return(dau[doy]) 378 | 
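  # dau above is Earth-Sun distance in astronomical units for each image
  # day-of-year, transcribed from the Landsat handbook table in @references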
} 379 | 380 | 381 | 382 | 383 | -------------------------------------------------------------------------------- /R/file_check.r: -------------------------------------------------------------------------------- 1 | #' Handles file existence checking and overwriting 2 | #' 3 | #' Handles file existence checking and overwriting 4 | #' @param file Filename. of file being worked on 5 | #' @param output Filename. what output file is it checking for? ("archv.tif", "reflectance.tif", etc) 6 | #' @param overwrite logical. if the output file exists should it be deleted 7 | #' @export 8 | 9 | file_check = function(file, output, overwrite){ 10 | bname = basename(file) 11 | dname = dirname(file) 12 | imgid = substr(bname, 1, 16) 13 | if(output == "archv.tif"){ 14 | ppprrrdir = substr(dname,1,(nchar(dname)-6)) 15 | search = paste(imgid,"_archv.tif",sep="") 16 | result = list.files(ppprrrdir, search, recursive=T, full.names=T) 17 | if(length(result) == 0){return(1)} else if(length(result) > 0 & overwrite == T){ 18 | unlink(result) 19 | return(2) 20 | } else {return(0)} 21 | } else if(output == "reflectance.tif"){ #if running mssdn2refl 22 | search = paste(imgid,"_reflectance.tif",sep="") 23 | result = list.files(dname, search, recursive=T, full.names=T) 24 | if(length(result) == 0){return(1)} else if(length(result) == 1 & overwrite == T){ 25 | unlink(result) 26 | return(2) 27 | } else {return(0)} 28 | } else if(output == "cloudmask.tif"){ #if running mssunpackr 29 | search = paste(imgid,"_cloudmask.tif",sep="") 30 | result = list.files(dname, search, recursive=T, full.names=T) 31 | if(length(result) == 0){return(1)} else if(length(result) == 1 & overwrite == T){ 32 | unlink(result) 33 | return(2) 34 | } else {return(0)} 35 | } else if(output == "dos_sr.tif"){ 36 | search = paste(imgid,"_dos_sr.tif",sep="") 37 | result = list.files(dname, search, recursive=T, full.names=T) 38 | if(length(result) == 0){return(1)} else if(length(result) == 1 & overwrite == T){ 39 | unlink(result) 40 | return(2) 41 | } else {return(0)} 42 | } else if(output == "dos_sr_30m.tif"){ 43 | search = paste(imgid,"_dos_sr_30m.tif",sep="") 44 | result = list.files(dname, search, recursive=T, full.names=T) 45 | if(length(result) == 0){return(1)} else if(length(result) == 1 & overwrite == T){ 46 | unlink(result) 47 | return(2) 48 | } else {return(0)} 49 | } else if(output == "cloudmask_30m.tif"){ 50 | search = paste(imgid,"_cloudmask_30m.tif",sep="") 51 | result = list.files(dname, search, recursive=T, full.names=T) 52 | if(length(result) == 0){return(1)} else if(length(result) == 1 & overwrite == T){ 53 | unlink(result) 54 | return(2) 55 | } else {return(0)} 56 | } else if(output == "tca_30m.tif"){ 57 | search = paste(imgid,"_tca_30m.tif",sep="") 58 | result = list.files(dname, search, recursive=T, full.names=T) 59 | if(length(result) == 0){return(1)} else if(length(result) == 1 & overwrite == T){ 60 | unlink(result) 61 | return(2) 62 | } else {return(0)} 63 | } else if(output == "tc_30m.tif"){ 64 | search = paste(imgid,"_tc_30m.tif",sep="") 65 | result = list.files(dname, search, recursive=T, full.names=T) 66 | if(length(result) == 0){return(1)} else if(length(result) == 1 & overwrite == T){ 67 | unlink(result) 68 | return(2) 69 | } else {return(0)} 70 | } else if(output == "ledaps.tif"){ 71 | ppprrrdir = substr(dname,1,(nchar(dname)-6)) 72 | search = paste(imgid,"_ledaps.tif",sep="") 73 | result = list.files(ppprrrdir, search, recursive=T, full.names=T) 74 | if(length(result) == 0){return(1)} else if(length(result) >= 1 & overwrite 
== T){ 75 | result = list.files(file.path(ppprrrdir,"images"), imgid, recursive=T, full.names=T) 76 | unlink(result) 77 | return(2) 78 | } else {return(0)} 79 | } else if(output == "l8sr.tif"){ 80 | ppprrrdir = substr(dname,1,(nchar(dname)-6)) 81 | search = paste(imgid,"_l8sr.tif",sep="") 82 | result = list.files(ppprrrdir, search, recursive=T, full.names=T) 83 | if(length(result) == 0){return(1)} else if(length(result) >= 1 & overwrite == T){ 84 | result = list.files(file.path(ppprrrdir,"images"), imgid, recursive=T, full.names=T) 85 | unlink(result) 86 | return(2) 87 | } else {return(0)} 88 | } else if(output == "l8sr_tc.tif"){ 89 | search = paste(imgid,"_tc",sep="") 90 | result = list.files(dname, search, recursive=T, full.names=T) 91 | if(length(result) == 0){return(1)} else if(length(result) >= 1 & overwrite == T){ 92 | unlink(result) 93 | return(2) 94 | } else {return(0)} 95 | } 96 | } 97 | 98 | 99 | -------------------------------------------------------------------------------- /R/get_metadata.r: -------------------------------------------------------------------------------- 1 | #' Retrieve Landsat image metadata 2 | #' 3 | #' Uses the image file name to find the corresponding *MTL.txt image metadata file provided with LPSG Landsat images and returns a dataframe with pertinent image information 4 | #' @param file LPGS-processed Landsat image filename (full system path to file) 5 | #' @return Dataframe with pertinent image information 6 | #' @export 7 | 8 | get_metadata = function(file){ 9 | mtlfile = file.path(dirname(file),paste(substr(basename(file),1,17),"MTL.txt", sep="")) 10 | tbl = unlist(read.delim(mtlfile, header=F, skipNul=T)) 11 | bname = basename(file) 12 | 13 | #get the path row "ppprrr" 14 | ppprrr = substr(bname, 4,9) 15 | 16 | #get the day-of-year 17 | doy = as.numeric(substr(bname, 14,16)) 18 | 19 | #get the year 20 | year = as.numeric(substr(bname, 10,13)) 21 | 22 | #get the yearday 23 | yearday = as.numeric(substr(bname, 10,16)) 24 | 25 | #get the image id 26 | imgid = substr(bname, 1, 21) 27 | 28 | #get the data type 29 | string = as.character(grep("DATA_TYPE = ", tbl, value=T)) 30 | pieces = unlist(strsplit(string, " ")) 31 | datatype = pieces[7] 32 | 33 | #get the sensor 34 | string = as.character(grep("SPACECRAFT_ID =", tbl, value=T)) 35 | pieces = unlist(strsplit(string, " ")) 36 | sensor = pieces[7] 37 | 38 | #get the sun elevation 39 | string = as.character(grep("SUN_ELEVATION = ", tbl, value=T)) 40 | pieces = unlist(strsplit(string, " ")) 41 | sunelev = as.numeric(pieces[7]) 42 | sunzen = 90 - sunelev 43 | 44 | #get the sun azimuth 45 | string = as.character(grep("SUN_AZIMUTH = ", tbl, value=T)) 46 | pieces = unlist(strsplit(string, " ")) 47 | sunaz = as.numeric(pieces[7]) 48 | 49 | # get min and max radiance; gain and bias 50 | if(sensor == "LANDSAT_1" | sensor == "LANDSAT_2" | sensor == "LANDSAT_3"){ 51 | bands = c(4,5,6,7)} else{bands = c(1,2,3,4)} 52 | 53 | maxrad = array(0,4) 54 | minrad = array(0,4) 55 | gain = array(0,4) 56 | bias = array(0,4) 57 | 58 | for(i in 1:4){ 59 | maxradsearch = paste("RADIANCE_MAXIMUM_BAND_", bands[i], " =", sep="") 60 | string = as.character(grep(maxradsearch, tbl, value=T)) 61 | pieces = unlist(strsplit(string, " ")) 62 | maxrad[i] = as.numeric(pieces[7]) 63 | 64 | minradsearch = paste("RADIANCE_MINIMUM_BAND_", bands[i], " =", sep="") 65 | string = as.character(grep(minradsearch, tbl, value=T)) 66 | pieces = unlist(strsplit(string, " ")) 67 | minrad[i] = as.numeric(pieces[7]) 68 | 69 | radmultsearch = 
paste("RADIANCE_MULT_BAND_", bands[i], " =", sep="") 70 | string = as.character(grep(radmultsearch, tbl, value=T)) 71 | pieces = unlist(strsplit(string, " ")) 72 | gain[i] = as.numeric(pieces[7]) 73 | 74 | radaddsearch = paste("RADIANCE_ADD_BAND_", bands[i], " =", sep="") 75 | string = as.character(grep(radaddsearch, tbl, value=T)) 76 | pieces = unlist(strsplit(string, " ")) 77 | bias[i] = as.numeric(pieces[7]) 78 | } 79 | 80 | #prepare variables for inclusion in output table 81 | b1minrad = minrad[1] 82 | b2minrad = minrad[2] 83 | b3minrad = minrad[3] 84 | b4minrad = minrad[4] 85 | b1maxrad = maxrad[1] 86 | b2maxrad = maxrad[2] 87 | b3maxrad = maxrad[3] 88 | b4maxrad = maxrad[4] 89 | 90 | b1gain = gain[1] 91 | b2gain = gain[2] 92 | b3gain = gain[3] 93 | b4gain = gain[4] 94 | b1bias = bias[1] 95 | b2bias = bias[2] 96 | b3bias = bias[3] 97 | b4bias = bias[4] 98 | 99 | #get the wrs type 100 | if(sensor == "LANDSAT_1"){wrstype = "wrs1"} 101 | if(sensor == "LANDSAT_2"){wrstype = "wrs1"} 102 | if(sensor == "LANDSAT_3"){wrstype = "wrs1"} 103 | if(sensor == "LANDSAT_4"){wrstype = "wrs2"} 104 | if(sensor == "LANDSAT_5"){wrstype = "wrs2"} 105 | 106 | #fill in the output table 107 | df = data.frame( 108 | ppprrr, 109 | doy, 110 | year, 111 | yearday, 112 | imgid, 113 | sensor, 114 | datatype, 115 | wrstype, 116 | sunelev, 117 | sunzen, 118 | sunaz, 119 | b1minrad, 120 | b2minrad, 121 | b3minrad, 122 | b4minrad, 123 | b1maxrad, 124 | b2maxrad, 125 | b3maxrad, 126 | b4maxrad, 127 | b1gain, 128 | b2gain, 129 | b3gain, 130 | b4gain, 131 | b1bias, 132 | b2bias, 133 | b3bias, 134 | b4bias 135 | ) 136 | 137 | return(df) 138 | } -------------------------------------------------------------------------------- /R/llr_time_machine.r: -------------------------------------------------------------------------------- 1 | #' Decompress, stack, and reproject LPSG MSS images 2 | #' 3 | #' Decompresses, stacks, and optionally reprojects LPGS MSS images recieved from USGS EROS as .tar.gz files 4 | #' @param imgdir direcory path. full path to directory containing LandsatLinkr annual composites 5 | #' @param outdir direcory path. full path to the directory where you want LLR-TimeMachine data to be written 6 | #' @param coordfile csv file path. 
full path to a comma delimited file containing the plot number, x, and y coordinates pixels you want to view in LLR-TimeMachine 7 | #' @import raster 8 | #' @export 9 | 10 | 11 | 12 | llr_time_machine = function(imgdir,outdir,coordfile){ 13 | 14 | ########################################################################################## 15 | writeplot = function(file,plot,gmBounds,date,tcb,tcg,tcw,tca, first=F,last=F, end=F, finalPlot=F){ 16 | chipstrip = paste('"imgs/plot_',plot,'_chipstrip.png",',sep="") 17 | start=paste('{"plotID": ',plot,',','"chipStrip":',chipstrip,'"LatLon":[',gmBounds[1,2],',',gmBounds[1,1],'],','"bounds":[',gmBounds[2,2],',',gmBounds[3,2],',',gmBounds[3,1],',',gmBounds[2,1],'],','"Values": [') 18 | int=', ' 19 | if(end == T & finalPlot == F){end=']},'} else if(end == T & finalPlot == T){end=']}'} 20 | imginfo = paste( 21 | paste('{"Year": ', date,',',sep=""), 22 | paste(' "TCB": ', tcb,',',sep=""), 23 | paste(' "TCG": ', tcg,',',sep=""), 24 | paste(' "TCW": ', tcw,',',sep=""), 25 | paste(' "TCA": ', tca,sep=""), 26 | '}', 27 | sep="") 28 | if(first == T){ 29 | write(start, file, append=TRUE) 30 | write(paste(imginfo,int,sep=""), file, append=TRUE) 31 | } 32 | if(first != T & last != T){write(paste(imginfo,int,sep=""), file, append=TRUE)} 33 | if(last == T){ 34 | write(imginfo, file, append=TRUE) 35 | write(end, file, append=TRUE) 36 | } 37 | } 38 | 39 | makeChipStrip = function(imgStack){ 40 | singleShift = xres(imgStack)*255 41 | for(i in 1:nlayers(imgStack)){ 42 | if(i == 1){ 43 | fullSeries = subset(imgStack,i) 44 | } else{ 45 | shifted = shift(subset(imgStack,i),x=0,y=singleShift*(i-1)*-1) 46 | fullSeries = merge(fullSeries,shifted) 47 | } 48 | } 49 | return(fullSeries) 50 | } 51 | 52 | stretchBGW = function(template, index){ 53 | img = as.matrix(template) 54 | n_stdev = 2 55 | bmin = 3098-(1247*n_stdev) 56 | bmax = 3098+(1247*n_stdev) 57 | gmin = 1549-(799*n_stdev) 58 | gmax = 1549+(799*n_stdev) 59 | wmin = -701-(772*n_stdev) 60 | wmax = -701+(772*n_stdev) 61 | 62 | if(index == "tcb"){ 63 | these = which(img < bmin) 64 | if(length(these != 0)){img[these] = bmin} 65 | these = which(img > bmax) 66 | if(length(these != 0)){img[these] = bmax} 67 | img = ((img-bmin)/(bmax-bmin))*255 68 | } 69 | if(index == "tcg"){ 70 | these = which(img < gmin) 71 | if(length(these != 0)){img[these] = gmin} 72 | these = which(img > gmax) 73 | if(length(these != 0)){img[these] = gmax} 74 | img = ((img-gmin)/(gmax-gmin))*255 75 | 76 | } 77 | if(index == "tcw"){ 78 | these = which(img < wmin) 79 | if(length(these != 0)){img[these] = wmin} 80 | these = which(img > wmax) 81 | if(length(these != 0)){img[these] = wmax} 82 | img = ((img-wmin)/(wmax-wmin))*255 83 | } 84 | img = setValues(template, round(img)) 85 | return(img) 86 | } 87 | 88 | 89 | chronoInfo = function(imgdir, outdir, plot, x, y, tcbb, tcgb, tcwb, tcab, years, jsfile, finalPlot=F){ 90 | coords = data.frame(x,y) 91 | reso = xres(tcbb) 92 | half = reso*127.5 93 | ul = matrix(c(x-half, y+half), nrow=1) 94 | lr = matrix(c(x+half, y-half), nrow=1) 95 | proj = projection(tcbb) 96 | spCoordProj = SpatialPoints(matrix(c(x, y), nrow=1),proj4string=CRS(proj)) 97 | SpCoordLonLat = spTransform(spCoordProj, CRS("+init=epsg:4269")) 98 | SpCoordLonLat = coordinates(SpCoordLonLat) 99 | spUlProj = SpatialPoints(ul,proj4string=CRS(proj)) 100 | SpUlLonLat = spTransform(spUlProj, CRS("+init=epsg:4269")) 101 | SpUlLonLat = coordinates(SpUlLonLat) 102 | spLrProj = SpatialPoints(lr,proj4string=CRS(proj)) 103 | SpLrLonLat = spTransform(spLrProj, 
CRS("+init=epsg:4269")) 104 | SpLrLonLat = coordinates(SpLrLonLat) 105 | gmBounds = matrix(c(SpCoordLonLat,SpUlLonLat,SpLrLonLat),ncol = 2,byrow = T) 106 | 107 | tcbv = extract(tcbb, coords) 108 | tcgv = extract(tcgb, coords) 109 | tcwv = extract(tcwb, coords) 110 | tcav = extract(tcab, coords) 111 | 112 | #for making year composites 113 | len = length(years) 114 | for(y in 1:len){ 115 | thisyear=as.numeric(years[y]) 116 | if(y == 1){writeplot(jsfile,plot,gmBounds,thisyear,tcbv[y],tcgv[y],tcwv[y],tcav[y], first=T, last=F, end=F)} 117 | if(y != 1 & y != len){writeplot(jsfile,plot,gmBounds,thisyear,tcbv[y],tcgv[y],tcwv[y],tcav[y], first=F, last=F, end=F)} 118 | if(y == len){writeplot(jsfile,plot,gmBounds,thisyear,tcbv[y],tcgv[y],tcwv[y],tcav[y], first=F, last=T, end=T, finalPlot=finalPlot)} 119 | } 120 | 121 | thisCell = cellFromXY(tcbb,coords) 122 | thisRowCol = rowColFromCell(tcbb,thisCell) 123 | e = extent(tcbb, thisRowCol[1,1]-127,thisRowCol[1,1]+127,thisRowCol[1,2]-127,thisRowCol[1,2]+127) 124 | 125 | tcbv = crop(tcbb, e) 126 | tcgv = crop(tcgb, e) 127 | tcwv = crop(tcwb, e) 128 | tcav = crop(tcab, e) 129 | 130 | tcbvs = stretchBGW(tcbv, "tcb") 131 | tcgvs = stretchBGW(tcgv, "tcg") 132 | tcwvs = stretchBGW(tcwv, "tcw") 133 | 134 | tcbseries = makeChipStrip(tcbvs) 135 | tcgseries = makeChipStrip(tcgvs) 136 | tcwseries = makeChipStrip(tcwvs) 137 | 138 | s = stack(tcbseries,tcgseries,tcwseries) 139 | 140 | outpng = file.path(outdir,"imgs",paste("plot_",plot,"_chipstrip.png",sep="")) 141 | png(outpng, width = 255, height=nrow(s)) 142 | plotRGB(s,r=1, g=2, b=3, ext=NULL) 143 | dev.off() 144 | } 145 | ################################################################################################################################################ 146 | 147 | 148 | #read the csv plot file 149 | plots = read.csv(coordfile) #plot#,x,y 150 | 151 | colnames(plots) = tolower(colnames(plots)) 152 | 153 | #create the javascript file path 154 | jsfile = file.path(outdir,"LLR-TimeMachine.js") 155 | 156 | #make sure output dirs exist/created 157 | dir.create(outdir, recursive=T, showWarnings = F) 158 | dir.create(file.path(outdir,"imgs"), recursive=T, showWarnings = F) 159 | 160 | 161 | tcbstack = list.files(imgdir, "tcb_composite_stack.bsq$", recursive = T, full.names = T) 162 | tcgstack = list.files(imgdir, "tcg_composite_stack.bsq$", recursive = T, full.names = T) 163 | tcwstack = list.files(imgdir, "tcw_composite_stack.bsq$", recursive = T, full.names = T) 164 | tcastack = list.files(imgdir, "tca_composite_stack.bsq$", recursive = T, full.names = T) 165 | 166 | n_tcbstack = length(tcbstack) 167 | n_tcgstack = length(tcgstack) 168 | n_tcwstack = length(tcwstack) 169 | n_tcastack = length(tcastack) 170 | if(sum(n_tcbstack,n_tcgstack,n_tcwstack,n_tcastack) != 4){ 171 | print("Error! 
You are missing some required files, or the provided directory is wrong.") 172 | if(n_tcbstack == 0){print(paste("There was no *tcb_composite_stack.bsq file found at this directory:",imgdir,"(recursive search)"))} 173 | if(n_tcgstack == 0){print(paste("There was no *tcg_composite_stack.bsq file found at this directory:",imgdir,"(recursive search)"))} 174 | if(n_tcwstack == 0){print(paste("There was no *tcw_composite_stack.bsq file found at this directory:",imgdir,"(recursive search)"))} 175 | if(n_tcastack == 0){print(paste("There was no *tca_composite_stack.bsq file found at this directory:",imgdir,"(recursive search)"))} 176 | return 177 | 178 | } 179 | 180 | 181 | files = list.files(file.path(imgdir,"tcb"), "composite.bsq") 182 | years = sort(unique(substr(files,1,4))) 183 | 184 | ext = extent(raster(tcbstack)) 185 | tcbb = extend(brick(tcbstack),130) 186 | tcgb = extend(brick(tcgstack),130) 187 | tcwb = extend(brick(tcwstack),130) 188 | tcab = extend(brick(tcastack),130) 189 | 190 | #for each plot create the data 191 | nplots = nrow(plots) 192 | for(i in 1:nplots){ 193 | print(paste(i,"/",nplots,sep="")) 194 | if(i == 1){ 195 | unlink(jsfile) 196 | write('var allData = [', jsfile, append=TRUE) 197 | } 198 | 199 | #check to make sure the point is not outside the image 200 | x = plots$x[i] 201 | y = plots$y[i] 202 | plot = plots$plotid[i] 203 | bad = 0 204 | if(x < ext[1]){ 205 | print(paste("x coordinate for plot:",plot,"is outside the image to the west")) 206 | bad = 1 207 | } 208 | if(x > ext[2]){ 209 | print(paste("x coordinate for plot:",plot,"is outside the image to the east")) 210 | bad = 1 211 | } 212 | if(y < ext[3]){ 213 | print(paste("y coordinate for plot:",plot,"is outside the image to the south")) 214 | bad = 1 215 | } 216 | if(y > ext[4]){ 217 | print(paste("y coordinate for plot:",plot,"is outside the image to the north")) 218 | bad = 1 219 | } 220 | if(bad == 1){ 221 | print(paste("plot:",plot,"is outside the image, skipping it...")) 222 | next 223 | } 224 | 225 | #if everything is okay, extract the data for the point 226 | if(i == nplots){ 227 | chronoInfo(imgdir, outdir, plot, x, y, tcbb, tcgb, tcwb, tcab, years, jsfile, finalPlot=T) 228 | write(']', jsfile, append=TRUE) 229 | } else{ 230 | chronoInfo(imgdir, outdir, plot, x, y, tcbb, tcgb, tcwb, tcab, years, jsfile) 231 | } 232 | } 233 | } 234 | 235 | 236 | 237 | -------------------------------------------------------------------------------- /R/make_usearea_file.r: -------------------------------------------------------------------------------- 1 | #' Make a usearea file 2 | #' 3 | #' Make a usearea file for image compositing 4 | #' @param dir character. full path name to scene directory example: "E:/mss/wrs2/038029" 5 | #' @param outfile charcter. 
full path of output file 6 | #' @param xmx numeric.max x coordinate 7 | #' @param xmn numeric.min x coordinate 8 | #' @param ymx numeric.max y coordinate 9 | #' @param ymn numeric.min y coordinate 10 | #' @import raster 11 | #' @export 12 | 13 | make_usearea_file = function(dir, outfile, xmx, xmn, ymx, ymn){ 14 | print("Making a use-area file for the specifed dimensions") 15 | 16 | projfiles = list.files(path = file.path(dir[1], "images"), pattern="proj.txt", full.names=T, recursive=T) 17 | if(length(projfiles) == 0){projfiles = list.files(path = file.path(dir[2], "images"), pattern="proj.txt", full.names=T, recursive=T)} 18 | if(length(projfiles) == 0){projfiles = list.files(path = file.path(dir[3], "images"), pattern="proj.txt", full.names=T, recursive=T)} 19 | if(length(projfiles) == 0){projfiles = list.files(path = file.path(dir[4], "images"), pattern="proj.txt", full.names=T, recursive=T)} 20 | 21 | 22 | crs = readLines(projfiles[1]) 23 | res = 30 24 | r = raster(xmn=xmn, xmx=xmx, ymn=ymn, ymx=ymx, crs=crs, res=res) 25 | r[] = 1 26 | r = as(r, "SpatialGridDataFrame") 27 | writeGDAL(r, outfile, drivername = "GTiff", options="INTERLEAVE=BAND", type = "Byte") #, mvFlag = 0 28 | 29 | #write out as a .bsq too so that it can be used in the LLR-LT program 30 | print("Making a copy of use-area file as .bsq for optional use in LandTrendr") 31 | bsqoutfile = change_extension("tif", "bsq", outfile) 32 | gdal_translate(src_dataset=outfile, dst_dataset=bsqoutfile, of="ENVI", a_nodata="none") 33 | 34 | } 35 | 36 | 37 | -------------------------------------------------------------------------------- /R/make_usearea_file_bsq.r: -------------------------------------------------------------------------------- 1 | #' Make a usearea file in ENVI (bsq) format 2 | #' 3 | #' Make a usearea file in ENVI (bsq) format 4 | #' @param infile character. full path name to a usearea mask file 5 | #' @param projref charcter. full path to an image file from the scene relevent to the usearea file - can be any .tif file in the "images" directory 6 | #' @import raster 7 | #' @import rgdal 8 | #' @import gdalUtils 9 | #' @export 10 | 11 | make_usearea_file_bsq = function(infile, projref){ 12 | print("Making a copy of use-area file as .bsq for optional use in LandTrendr") 13 | 14 | tempout = paste(infile,"_temp.tif",sep="") 15 | template = r = raster(infile) 16 | r = as.matrix(r) 17 | nas = which(is.na(r) == T) 18 | ones = which(r != 0) 19 | zero = which(r == 0) 20 | if(length(nas) > 0){r[nas] = 0} 21 | if(length(zero) > 0){r[zero] = 0} 22 | r[ones] = 1 23 | r = setValues(template,r) 24 | projection(r) = set_projection(projref) 25 | r = as(r, "SpatialGridDataFrame") 26 | writeGDAL(r, tempout, drivername = "GTiff", type = "Byte", options="INTERLEAVE=BAND") 27 | 28 | outfile = paste(substr(infile,1,(nchar(infile)-3)),"bsq",sep="") 29 | oldfiles = list.files(dirname(outfile),basename(outfile),full.names=T) 30 | oldhdr = sub("bsq","hdr",outfile) 31 | unlink(c(oldfiles,oldhdr)) 32 | 33 | s_srs = projection(raster(tempout)) 34 | t_srs = set_projection(projref) 35 | gdalwarp(srcfile=tempout, dstfile=outfile, s_srs=s_srs, t_srs=t_srs, order=1, tr=c(30,30), r="near", of="ENVI", dstnodata="none") 36 | unlink(tempout) 37 | } -------------------------------------------------------------------------------- /R/matrix_to_raster.r: -------------------------------------------------------------------------------- 1 | #' Converts a matrix to a raster file 2 | #' 3 | #' Converts a matrix to a raster file 4 | #' @param rfile character. 
full path name of a reference raster file 5 | #' @param rmatrix matrix. a numeric 2-d matrix to be converted to a raster 6 | #' @import raster 7 | #' @export 8 | 9 | matrix_to_raster = function(rfile, rmatrix){ 10 | r = raster(rfile) 11 | cres = 0.5*res(r)[1] 12 | xmin = xFromCol(r, col=1)-cres 13 | xmax = xFromCol(r, col=ncol(r))+cres 14 | ymin = yFromRow(r, row=nrow(r))-cres 15 | ymax = yFromRow(r, row=1)+cres 16 | r = raster(rmatrix,xmn=xmin,xmx=xmax,ymn=ymin,ymx=ymax,crs=projection(r)) 17 | return(r) 18 | } -------------------------------------------------------------------------------- /R/mixel.r: -------------------------------------------------------------------------------- 1 | #' Composite images 2 | #' 3 | #' Composite images 4 | #' @param msswrs1dir character. list of mss wrs1 directory paths 5 | #' @param msswrs2dir character. list of mss wrs2 directory paths 6 | #' @param tmwrs2dir character. list of tm wrs2 directory path 7 | #' @param index character. spectral index to make composites for. options: "tca", "tcb", "tcg", "tcw" 8 | #' @param outdir character. path to output directory 9 | #' @param runname character. unique name for the composite set 10 | #' @param useareafile character. path to usearea file 11 | #' @param doyears ??? what years to composite 12 | #' @param order character. how to order the images options "sensor_and_doy" and "doy" 13 | #' @param overlap character. how to deal with overlapping images. options: "mean" 14 | #' @import raster 15 | #' @import gdalUtils 16 | #' @import plyr 17 | #' @export 18 | 19 | 20 | mixel = function(msswrs1dir,msswrs2dir,tmwrs2dir,oliwrs2dir,index,outdir,runname,useareafile,doyears="all",order="none",overlap="mean",startday,endday,yearadj=0){ 21 | 22 | mixel_find = function(files, refimg){ 23 | info = matrix(ncol = 4, nrow=length(files)) 24 | print("Getting image extents") 25 | for(i in 1:length(files)){ 26 | print(i) 27 | img = raster(files[i]) 28 | ext = extent(img) 29 | info[i,1] = ext@xmin 30 | info[i,2] = ext@xmax 31 | info[i,3] = ext@ymin 32 | info[i,4] = ext@ymax 33 | } 34 | 35 | text = extent(raster(refimg)) 36 | these = which(info[,3] < text@ymax & info[,4] > text@ymin & info[,2] > text@xmin & info[,1] < text@xmax) 37 | goods = files[these] 38 | return(goods) 39 | } 40 | 41 | mixel_mask = function(imgfile, useareafile, index){ #search, 42 | print(paste("...cloud masking:",basename(imgfile))) 43 | if(index == "tca" | index == "tcb"){band=1} 44 | if(index == "tcg"){band=2} 45 | if(index == "tcw"){band=3} 46 | 47 | sensor = substr(basename(imgfile), 1,2) 48 | if(sensor == "LM"){maskbit = "_cloudmask_30m.tif"} else {maskbit = "_cloudmask.tif"} 49 | 50 | maskfile = file.path(dirname(imgfile), paste(substr(basename(imgfile),1,16),maskbit,sep="")) 51 | img = raster(imgfile, band=band) 52 | mask = raster(maskfile) 53 | NAvalue(mask) = 0 #make 0 in the mask 54 | refimg = raster(useareafile) 55 | 56 | imgex = alignExtent(img, refimg, snap="near") 57 | maskex = alignExtent(mask, refimg, snap="near") 58 | extent(img) = imgex 59 | extent(mask) = maskex 60 | 61 | overlap = intersect(img, mask) 62 | mask = crop(mask, overlap) 63 | img = crop(img, overlap) 64 | 65 | img = img*mask 66 | return(img) 67 | } 68 | 69 | change_envi_to_bsq = function(file){ 70 | envifilename = sub("bsq","envi",file) 71 | envixmlfile = paste(envifilename,".aux.xml",sep="") 72 | bsqxmlfile = sub("envi","bsq",envixmlfile) 73 | file.rename(envifilename,file) 74 | file.rename(envixmlfile,bsqxmlfile) 75 | } 76 | 77 | mixel_composite = function(outdir, imginfosub, 
runname, index, order, useareafile, overlap, yearadj){ 78 | 79 | #outdir= mssdir 80 | #imginfosub = mssdf 81 | 82 | uniyears = sort(unique(imginfosub$compyear)) 83 | 84 | #for all the unique year make a composite 85 | for(i in 1:length(uniyears)){ 86 | print(paste("working on year:", uniyears[i])) 87 | these = which(imginfosub$compyear == uniyears[i]) 88 | theseimgs = imginfosub$file[these] 89 | 90 | if(order == "none"){theseimgs = theseimgs} 91 | 92 | len = length(theseimgs) 93 | #mask all the images in a year 94 | for(m in 1:len){ 95 | mergeit = ifelse(m == 1, "r1", paste(mergeit,",r",m, sep="")) 96 | dothis = paste("r",m,"=mixel_mask(theseimgs[",m,"], useareafile, index)", sep="") 97 | eval(parse(text=dothis)) 98 | } 99 | 100 | #select a mosaic method 101 | if(overlap == "order"){mergeit = paste("newimg = merge(",mergeit,")", sep="")} else 102 | if(overlap == "mean"){mergeit = paste("newimg = mosaic(",mergeit,",fun=mean,na.rm=T)", sep="")} else 103 | if(overlap == "median"){mergeit = paste("newimg = mosaic(",mergeit,",fun=median,na.rm=T)", sep="")} else 104 | if(overlap == "max"){mergeit = paste("newimg = mosaic(",mergeit,",fun=max,na.rm=T)", sep="")} else 105 | if(overlap == "min"){mergeit = paste("newimg = mosaic(",mergeit,",fun=min,na.rm=T)", sep="")} 106 | 107 | #run the merge function 108 | print(paste("...merging files using ", overlap, ":",sep="")) 109 | for(h in 1:len){print(paste("......",basename(theseimgs[h]),sep=""))} 110 | if(len == 1){newimg = r1} else {eval(parse(text=mergeit))} #only run merge it if there are multiple files to merge 111 | 112 | #name the new file 113 | yearlabel = as.character(as.numeric(uniyears[i])+yearadj) 114 | newbase = paste(yearlabel,"_",runname,"_",index,"_composite.bsq", sep="") 115 | outimgfile = file.path(outdir,newbase) 116 | outtxtfile = sub("composite.bsq", "composite_img_list.csv", outimgfile) 117 | theseimgs = data.frame(theseimgs) 118 | colnames(theseimgs) = "File" 119 | write.csv(theseimgs, file=outtxtfile, row.names = F) 120 | 121 | #load in the usearea file and crop/extend the new image to it 122 | refimg = raster(useareafile) 123 | newimg = round(crop(newimg, refimg)) 124 | newimg = extend(newimg, refimg, value=NA) 125 | 126 | #set NA values to 0 for use 127 | refimg = refimg != 0 # set all values not equal to 0 to 1 and 0 to 0 - NA can still be in there, it will be set to 0 in the final img 128 | refimg = refimg * newimg #set all 0's in the usearea file to 0 in the img file - if NA are in the usearea file, they will be transfered to img and then set to 0 in the final img 129 | newimg[is.na(newimg)] = 0 #set all NA to 0 130 | 131 | #write out the new image 132 | projection(newimg) = set_projection(files[1]) 133 | writeRaster(newimg, outimgfile, format="ENVI", datatype = "INT2S",overwrite=T) 134 | change_envi_to_bsq(outimgfile) 135 | 136 | #clean the temp directory 137 | delete_temp_files() 138 | } 139 | } 140 | 141 | delete_temp_files = function(){ 142 | tempdir = dirname(rasterTmpFile()) 143 | tempfiles = list.files(tempdir,full.names=T) 144 | unlink(tempfiles) 145 | } 146 | 147 | find_files = function(dir, search){ 148 | if(length(which(is.na(dir) == T)) > 0){return(vector())} else{ 149 | imgdir = normalizePath(file.path(dir,"images"),winslash="/") 150 | files = vector() 151 | for(i in 1:length(imgdir)){ 152 | print(paste("finding files in: ",imgdir)) 153 | files = c(files,list.files(imgdir[i], search, recursive=T, full.names=T)) 154 | } 155 | if(length(files)==0){stop( 156 | paste("There were no tasselled cap files found in this 
directory: ",imgdir[i],". 157 | Make sure that you provided the correct scene head directory path and that all processing 158 | steps up to compositing have been completed. MSS directories should contain 159 | files with the extension 'tc_30m.tif' and 'tca_30m.tif', and TM/ETM+ and OLI 'tc.tif' and 'tca.tif'. 160 | A valid scene head directory path should look like this mock example: 'C:/mock/landsat/wrs2/045030'. 161 | The program will then append 'images' to the path and search recursively in that directory. If 162 | you wish to continue without this directory, re-run the compositing call and don't add this directory.",sep="") 163 | ) 164 | } else{ 165 | return(files) 166 | } 167 | } 168 | } 169 | 170 | combine_overlapping_senors = function(ref_files, dep_files){ 171 | thesetm = which(basename(ref_files) %in% basename(dep_files)) 172 | ref_files_sort = sort(ref_files[thesetm]) 173 | thesemss = which(basename(dep_files) %in% basename(ref_files)) 174 | dep_files_sort = sort(dep_files[thesemss]) 175 | len = length(dep_files_sort) 176 | if(len > 0){ 177 | for(i in 1:len){ 178 | print(paste("...",i,"/",len,sep="")) 179 | depimg= raster(dep_files_sort[i]) 180 | refimg= raster(ref_files_sort[i]) 181 | NAvalue(depimg) = 0 182 | NAvalue(refimg) = 0 183 | newimg = mosaic(depimg,refimg, fun="mean", na.rm=T) 184 | newimg[is.na(newimg)] = 0 185 | projection(newimg) = set_projection(files[1]) 186 | 187 | outimgfile = file.path(outdir,basename(dep_files_sort[i])) 188 | writeRaster(newimg, outimgfile, format="ENVI", datatype = "INT2S",overwrite=T) 189 | change_envi_to_bsq(outimgfile) 190 | 191 | delete_temp_files() 192 | } 193 | } 194 | } 195 | 196 | 197 | pixel_level_offset = function(ref_files, dep_files, outdir, sensor, projfile, runname){ 198 | 199 | #find overlapping years 200 | theseref = which(basename(ref_files) %in% basename(dep_files)) 201 | thesedep = which(basename(dep_files) %in% basename(ref_files)) 202 | lendep = length(thesedep) 203 | lenref = length(theseref) 204 | if(lenref == 0 | lendep == 0 | lenref != lendep){return()} # get out if there are no matching files - no overlap 205 | 206 | print(paste("calculating pixel-level offset for: ",sensor," composites",sep="")) 207 | 208 | #sort the files to make sure they are in the same order 209 | ref_files_sort = sort(ref_files[theseref]) 210 | dep_files_sort = sort(dep_files[thesedep]) 211 | 212 | #find the mean pixel-wise difference between dependent and reference images 213 | for(i in 1:lendep){ 214 | print(paste("...",basename(dep_files_sort[i]),sep="")) 215 | refimg = raster(ref_files_sort[i]) 216 | depimg = raster(dep_files_sort[i]) 217 | NAvalue(refimg) = 0 218 | NAvalue(depimg) = 0 219 | 220 | dif = refimg - depimg #get the difference 221 | denom = !is.na(dif) #get the cells that are not NA after difference 222 | dif[is.na(dif)] = 0 #set the difference NA values to 0 223 | 224 | #make the sum difference and denominator layers 225 | if(i == 1){ #if i is one then start the layers 226 | difsum = dif 227 | denomsum = denom 228 | } else{ #else sum the layers 229 | difsum = sum(difsum, dif, na.rm=T) 230 | denomsum = sum(denomsum, denom, na.rm=T) 231 | } 232 | } 233 | print("...calculating mean pixel-level offset") 234 | meandiforig = round(difsum/denomsum) #mean the mean for the time series 235 | meandiforig[is.na(meandiforig)] = 0 #make division by 0 set to 0 instead of NA - there will be no correction for these pixels 236 | 237 | #figure out parts of file names 238 | if(sensor == "mss"){offsetdir = 
file.path(outdir,"mss_offset");deprunname="lm"; refrunname="lt"; meandiffilebname = "mss_mean_dif.bsq"} 239 | if(sensor == "oli"){offsetdir = file.path(outdir,"oli_offset");deprunname="lc"; refrunname="le"; meandiffilebname = "oli_mean_dif.bsq"} 240 | 241 | #write out the mean pixel-wise difference file 242 | dir.create(offsetdir, recursive=T, showWarnings=F) 243 | projection(meandiforig) = set_projection(projfile) 244 | meandiffile = file.path(offsetdir,meandiffilebname) 245 | writeRaster(meandiforig, meandiffile, format="ENVI", datatype = "INT2S",overwrite=T) 246 | change_envi_to_bsq(meandiffile) 247 | 248 | #write out the frequency file 249 | projection(denomsum) = set_projection(projfile) 250 | denomsumfile = file.path(dirname(meandiffile),"overlap_frequency.bsq") 251 | writeRaster(denomsum, denomsumfile, format="ENVI", datatype="INT2S",overwrite=T) 252 | change_envi_to_bsq(denomsumfile) 253 | 254 | #move files around 255 | for(i in 1:lendep){ 256 | from_dep_files = list.files(dirname(dep_files_sort[1]),substr(basename(dep_files_sort[i]),1,4), full.names = T) 257 | from_ref_files = list.files(dirname(ref_files_sort[1]),substr(basename(ref_files_sort[i]),1,4), full.names = T) 258 | to_dep_files = file.path(offsetdir,sub(paste("_",runname,"_",sep=""),paste("_",deprunname,"_",sep=""),basename(from_dep_files))) 259 | to_ref_files = file.path(offsetdir,sub(paste("_",runname,"_",sep=""),paste("_",refrunname,"_",sep=""),basename(from_ref_files))) 260 | 261 | file.copy(from_dep_files, to_dep_files) 262 | file.copy(from_ref_files, to_ref_files) 263 | } 264 | 265 | #adjust the dep images 266 | print("...adjusting images by mean pixel-level offset:") 267 | for(i in 1:length(dep_files)){ 268 | print(paste("......",basename(dep_files[i]),sep="")) 269 | img = raster(dep_files[i]) 270 | NAvalue(img) = 0 271 | 272 | meandiforig = raster(meandiffile) #need to load this each time because delete_temp_files() gets called at the end of each loop - if this file is big it is held in the temp directory and will be deleted 273 | img = img + meandiforig 274 | img[is.na(img)] = 0 275 | 276 | #write out the new image 277 | projection(img) = set_projection(projfile) 278 | 279 | #delete the old .bsq files - keep the "img_list.csv" files 280 | allfiles = list.files(dirname(dep_files[i]),substr(basename(dep_files[i]),1,nchar(basename(dep_files[i]))-4),full.names=T) 281 | allfiles = grep("img_list.csv", allfiles, invert=T, value=T) #keep the "img_list.csv" files 282 | unlink(allfiles) 283 | writeRaster(img, dep_files[i], format="ENVI", datatype="INT2S",overwrite=T) 284 | change_envi_to_bsq(dep_files[i]) 285 | 286 | delete_temp_files() 287 | } 288 | } 289 | 290 | #check for leap year 291 | leapyear = function(year){ 292 | return(((year %% 4 == 0) & (year %% 100 != 0)) | (year %% 400 == 0)) 293 | } 294 | 295 | #create decimal year day 296 | decyearday = function(year,day){ 297 | num = ifelse(leapyear(year) == T, 366, 365) 298 | return(year+day/num) 299 | } 300 | 301 | #create decimal day 302 | decday = function(day){ 303 | num = ifelse(day == 366, 366, 365) 304 | return(day/num) 305 | } 306 | 307 | ######################################################################### 308 | ######################################################################### 309 | ######################################################################### 310 | print(paste("working on index:",index)) 311 | 312 | #create some search terms depending on index 313 | if(index == "tca"){msssearch="tca_30m.tif$"; tmsearch="tca.tif$"; 
olisearch="tca.tif$"} 314 | if(index == "tcb"){msssearch="tc_30m.tif$"; tmsearch="tc.tif$"; olisearch="tc.tif$"} 315 | if(index == "tcg"){msssearch="tc_30m.tif$"; tmsearch="tc.tif$"; olisearch="tc.tif$"} 316 | if(index == "tcw"){msssearch="tc_30m.tif$"; tmsearch="tc.tif$"; olisearch="tc.tif$"} 317 | 318 | #find the files 319 | msswrs1files = find_files(msswrs1dir, msssearch) 320 | msswrs2files = find_files(msswrs2dir, msssearch) 321 | tmwrs2files = find_files(tmwrs2dir, tmsearch) 322 | oliwrs2files = find_files(oliwrs2dir, olisearch) 323 | 324 | #put all the files together in a vector and check to make sure the files intersect the useareafile, if not they will be excluded 325 | files = c(msswrs1files,msswrs2files,tmwrs2files,oliwrs2files) 326 | 327 | #find files that intersect the usearea file 328 | files = mixel_find(files, useareafile) 329 | if(length(files)==0){stop("There were no files in the given directories that intersect the provided 'usearea file'. 330 | Make sure that you provided the correct file, checked that it actually overlaps the scenes you 331 | specified for compositing, and that the projection is the same as the images.")} 332 | 333 | 334 | #create a table with info on the files 335 | imginfo = data.frame(file = as.character(files)) 336 | imginfo$file = as.character(imginfo$file) 337 | bname = basename(imginfo$file) 338 | imginfo$year = substr(bname, 10, 13) 339 | imginfo$day = substr(bname, 14,16) 340 | imginfo$sensor = substr(bname, 1,2) 341 | imginfo$compyear = imginfo$decdate = NA 342 | for(i in 1:nrow(imginfo)){imginfo$decdate[i] = decyearday(as.numeric(imginfo$year[i]),as.numeric(imginfo$day[i]))} 343 | 344 | #figure out the dec year day range that is good 345 | #uniyears = as.numeric(sort(unique(imginfo$year))) 346 | uniyears = 1972:as.numeric(format(Sys.Date(),'%Y')) 347 | 348 | if(doyears != "all"){uniyears = uniyears[match(doyears,uniyears)]} 349 | 350 | decstart = decday(startday) 351 | decend = decday(endday) 352 | dif = ifelse(decstart > decend, (1-decstart)+decend, decend - decstart) 353 | start = uniyears+decstart 354 | end = start+dif 355 | yearsdf = data.frame(uniyears,start,end) 356 | 357 | #figure out which images are in the composite date range, get rif of ones that aren't 358 | for(i in 1:nrow(yearsdf)){ 359 | these = which(imginfo$decdate >= yearsdf$start[i] & imginfo$decdate <= yearsdf$end[i]) 360 | imginfo$compyear[these] = yearsdf$uniyears[i] 361 | } 362 | imginfo = na.omit(imginfo) 363 | if(nrow(imginfo)==0){stop("There were no files in the given directories that intersect the provided start and end year-of-day bounds. 
364 | Make sure that you provided the correct limits and check that there are actually image files that intersect 365 | the specified date range.")} 366 | 367 | 368 | #make composites 369 | mssdir = file.path(outdir,"mss") 370 | tmdir = file.path(outdir,"tm") 371 | olidir = file.path(outdir,"oli") 372 | 373 | #pull out files by sensor 374 | mssdf = imginfo[imginfo$sensor == "LM",] 375 | tmetmdf = imginfo[imginfo$sensor == "LT" | imginfo$sensor == "LE",] 376 | olidf = imginfo[imginfo$sensor == "LC",] 377 | 378 | #create annual composites for all sensors 379 | if(nrow(mssdf) != 0){ 380 | dir.create(mssdir, recursive=T, showWarnings=F) 381 | mixel_composite(mssdir, mssdf, runname=runname,index=index, order=order, useareafile=useareafile, overlap=overlap, yearadj=yearadj) 382 | } 383 | if(nrow(tmetmdf) != 0){ 384 | dir.create(tmdir, recursive=T, showWarnings=F) 385 | mixel_composite(tmdir, tmetmdf, runname=runname,index=index, order=order, useareafile=useareafile, overlap=overlap, yearadj=yearadj) 386 | } 387 | if(nrow(olidf) != 0){ 388 | dir.create(olidir, recursive=T, showWarnings=F) 389 | mixel_composite(olidir, olidf, runname=runname,index=index, order=order, useareafile=useareafile, overlap=overlap, yearadj=yearadj) 390 | } 391 | 392 | #deal with the overlapping composites 393 | msscompfiles = list.files(mssdir, ".bsq$", recursive=T, full.names=T) 394 | tmcompfiles = list.files(tmdir, ".bsq$", recursive=T, full.names=T) 395 | olicompfiles = list.files(olidir, ".bsq$", recursive=T, full.names=T) 396 | 397 | pixel_level_offset(ref_files=tmcompfiles, dep_files=msscompfiles, outdir=outdir, sensor="mss", projfile=files[1], runname=runname) 398 | pixel_level_offset(ref_files=tmcompfiles, dep_files=olicompfiles, outdir=outdir, sensor="oli", projfile=files[1], runname=runname) 399 | 400 | 401 | print("dealing with any temporally overlapping MSS/TM composites") 402 | combine_overlapping_senors(tmcompfiles, msscompfiles) 403 | 404 | print("dealing with any temporally overlapping ETM+/OLI composites") 405 | combine_overlapping_senors(tmcompfiles, olicompfiles) 406 | 407 | 408 | #rename files 409 | print("directory and file organization/cleaning") 410 | imglists = list.files(outdir, paste(runname,"_",index,"_composite_img_list.csv",sep=""), recursive=T, full.names=T) 411 | imglistyears = substr(basename(imglists),1,4) 412 | uniimglistyears = unique(imglistyears) 413 | for(i in 1:length(uniimglistyears)){ 414 | outname = file.path(outdir,paste(uniimglistyears[i],"_",runname,"_",index,"_composite_img_list.csv", sep="")) 415 | theseones = which(imglistyears %in% uniimglistyears[i]) 416 | for(l in 1:length(theseones)){ 417 | if(l == 1){ 418 | alllimglist = read.csv(imglists[theseones[l]], stringsAsFactors=F)$File 419 | } else{ 420 | alllimglist = c(alllimglist, read.csv(imglists[theseones[l]], stringsAsFactors=F)$File) 421 | } 422 | } 423 | alllimglist = data.frame(File = alllimglist) 424 | write.csv(alllimglist, outname, row.names = F) 425 | } 426 | 427 | #move files 428 | msstmolifiles = c(msscompfiles, tmcompfiles, olicompfiles) 429 | finalfiles = file.path(outdir, basename(msstmolifiles)) 430 | for(i in 1:length(finalfiles)){ 431 | check = file.exists(finalfiles[i]) 432 | if(check == F){ 433 | year = substr(basename(finalfiles[i]), 1, 4) 434 | files = list.files(dirname(msstmolifiles[i]), year, full.names=T) 435 | file.rename(files, file.path(outdir, basename(files))) 436 | } 437 | } 438 | 439 | #clean up 440 | unlink(c(mssdir,tmdir,olidir), recursive=T) 441 | 442 | #make the final stack 443 | 
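# Note: the block below assembles the final multi-band annual stack. gdalbuildvrt()
# builds a virtual raster from the sorted per-year "*composite.bsq" files with one band
# per file (separate=T), and gdal_translate() then materializes that VRT as a single
# ENVI .bsq stack, so band order follows the alphabetical (chronological) sort of the
# yearly composite file names.
# Illustration only (hypothetical path, commented out so nothing extra runs here):
# stackExample = raster::brick("C:/mock/composites/myrun_tcb_composite_stack.bsq")
# names(stackExample) = basename(sort(list.files("C:/mock/composites", "composite.bsq$")))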
print("making final annual composite stack") 444 | bname = paste(runname,"_",index,"_composite_stack.bsq", sep="") 445 | bands = sort(list.files(outdir, "composite.bsq$", full.names=T)) 446 | fullnametif = file.path(outdir,bname) 447 | fullnamevrt = change_extension("bsq", "vrt", fullnametif) 448 | gdalbuildvrt(gdalfile=bands, output.vrt = fullnamevrt, separate=T) #, tr=c(reso,reso) 449 | gdal_translate(src_dataset=fullnamevrt, dst_dataset=fullnametif, of = "ENVI") #, co="INTERLEAVE=BAND" 450 | unlink(fullnamevrt) 451 | } 452 | 453 | 454 | -------------------------------------------------------------------------------- /R/mosaic_dems.r: -------------------------------------------------------------------------------- 1 | #' Create a DEM mosaic from a direcory of DEM's 2 | #' 3 | #' Create a DEM mosaic from a direcory of DEM's 4 | #' @param dir character. The path to a directory containing DEM files to be mosaicked 5 | #' @import raster 6 | #' @export 7 | 8 | 9 | mosaic_dems = function(dir, proj){ 10 | 11 | align = function(img, refimg){ 12 | img = raster(img) 13 | imgex = alignExtent(img, refimg, snap="near") 14 | extent(img) = imgex 15 | return(img) 16 | } 17 | 18 | reso = 60 19 | demfiles = list.files(dir, full.names=T) 20 | 21 | for(i in 1:length(demfiles)){ 22 | demfile = demfiles[i] 23 | print(paste("reprojecting file:",demfile)) 24 | bname = basename(demfile) 25 | extension = substr(bname,(nchar(bname)-3),nchar(bname)) 26 | dstfile = sub(extension, "_reprojected.tif", demfile) 27 | 28 | gdalwarp(srcfile=demfile, dstfile=dstfile, 29 | t_srs=proj, of="GTiff", 30 | r="near", dstnodata=-32768, multi=T, 31 | tr=c(reso,reso), co="INTERLEAVE=BAND") 32 | } 33 | 34 | demfiles = list.files(dir, pattern="reprojected.tif$", full.names=T) 35 | len = length(demfiles) 36 | refimg = raster(demfiles[1]) 37 | 38 | for(i in 1:len){ 39 | if(i == 1){mergeit = "r1"} else {mergeit = paste(mergeit,",r",i, sep="")} 40 | #open the image as raster and aligns it 41 | dothis = paste("r",i,"=align(demfiles[",i,"], refimg)", sep="") 42 | eval(parse(text=dothis)) 43 | if(i == len){mergeit = paste("big = mosaic(",mergeit,",fun=mean,na.rm=T,tolerance=0.5)", sep="")} 44 | } 45 | 46 | #run the merge function 47 | print("calculating DEM mosiac") 48 | if(len == 1){big = r1} else {eval(parse(text=mergeit))} #only run merge it if there are multiple files to merge 49 | 50 | #write out the DEM mosiac 51 | print("writing DEM mosiac") 52 | outfile = file.path(dir,"dem_mosaic.tif") 53 | writeRaster(big, outfile, format="GTiff", datatype = "INT2S",overwrite=T, bandorder = "BSQ",options=c("COMPRESS=NONE")) 54 | 55 | 56 | } 57 | 58 | -------------------------------------------------------------------------------- /R/mss_resample.r: -------------------------------------------------------------------------------- 1 | #' Resamples MSS dos_sr and cloudmask images to 30m 2 | #' 3 | #' Resamples MSS dos_sr and cloudmask images to 30m 4 | #' @param file character. 
full path to either an MSS *reflectance.tif or *cloudmask.tif file 5 | #' @import gdalUtils 6 | #' @export 7 | 8 | mss_resample = function(file, overwrite=F){ 9 | type = c(length(grep("dos_sr", file)), length(grep("cloudmask", file))) 10 | 11 | if(type[1] == 1){ 12 | check = file_check(file,"dos_sr_30m.tif",overwrite) 13 | if(check == 0){return(0)} 14 | 15 | newfile = sub("dos_sr", "dos_sr_30m", file) 16 | gdalwarp(srcfile=file, dstfile=newfile,tr=c(30,30), 17 | srcnodata=-32768, dstnodata=-32768, multi=T, r="cubic") 18 | 19 | return(1) 20 | } 21 | 22 | 23 | if(type[2] == 1){ 24 | check = file_check(file,"cloudmask_30m.tif",overwrite) 25 | if(check == 0){return(0)} 26 | 27 | newfile = sub("cloudmask", "cloudmask_30m", file) 28 | gdalwarp(srcfile=file, dstfile=newfile,tr=c(30,30), 29 | srcnodata=255, dstnodata=255, multi=T) 30 | 31 | return(1) 32 | } 33 | } 34 | 35 | -------------------------------------------------------------------------------- /R/msscal.r: -------------------------------------------------------------------------------- 1 | #' Calibrate MSS images to TM images 2 | #' 3 | #' Calibrate MSS images to TM images using linear regression 4 | #' @param msswrs1dir character. MSS WRS-1 scene directory path 5 | #' @param msswrs2dir character. MSS WRS-2 scene directory path 6 | #' @param tmwrs2dir character. TM WRS-2 scene directory path 7 | #' @param cores numeric. Number of cores to process with options: 1 or 2 8 | #' @export 9 | 10 | 11 | msscal = function(msswrs1dir, msswrs2dir, tmwrs2dir, cores=2){ 12 | 13 | mssfiles = list.files(msswrs2dir, "dos_sr_30m.tif", recursive=T, full.names=T) 14 | tmfiles = list.files(tmwrs2dir, "tc.tif", recursive=T, full.names=T) 15 | 16 | mssimgid = substr(basename(mssfiles),3,16) 17 | tmimgid = substr(basename(tmfiles),3,16) 18 | 19 | thesemss = which(mssimgid %in% tmimgid) 20 | mssfiles = mssfiles[thesemss] 21 | 22 | thesetm = which(tmimgid %in% mssimgid) 23 | tmfiles = tmfiles[thesetm] 24 | 25 | #check for matching mss/tm files to calibrate on 26 | if(length(mssfiles) == length(tmfiles)){ 27 | mssimgid = substr(basename(mssfiles),3,16) 28 | tmimgid = substr(basename(tmfiles),3,16) 29 | mssf = mssfiles[order(mssimgid)] 30 | tmf = tmfiles[order(tmimgid)] 31 | } else {stop("There are no matching MSS and TM image dates to use for calibration")} 32 | 33 | #check to see if aggregated models have already been made 34 | tcacheck = list.files(msswrs2dir, "tca_cal_aggregate_coef.csv", recursive=T, full.names=T) 35 | tcbcheck = list.files(msswrs2dir, "tcb_cal_aggregate_coef.csv", recursive=T, full.names=T) 36 | tcgcheck = list.files(msswrs2dir, "tcg_cal_aggregate_coef.csv", recursive=T, full.names=T) 37 | tcwcheck = list.files(msswrs2dir, "tcw_cal_aggregate_coef.csv", recursive=T, full.names=T) 38 | checks = c(length(tcacheck),length(tcbcheck),length(tcgcheck),length(tcwcheck)) 39 | check = sum(checks) 40 | if(check > 0){ 41 | print("calibration models have already been create for:") 42 | if(checks[1]==1){print("...tca")} #don't actually need to model TCA because we make it from orig tc 43 | if(checks[2]==1){print("...tcb")} 44 | if(checks[3]==1){print("...tcg")} 45 | if(checks[4]==1){print("...tcw")} 46 | } 47 | 48 | if(check != 4){ 49 | print("...single image pair modeling") 50 | if(cores==2){ 51 | cl = makeCluster(cores) 52 | registerDoParallel(cl) 53 | cfun <- function(a, b) NULL 54 | o = foreach(i=1:length(mssf), .combine="cfun",.packages="LandsatLinkr") %dopar% msscal_single(mssf[i], tmf[i]) # 55 | stopCluster(cl) 56 | } else {for(i in 
1:length(mssf)){msscal_single(mssf[i], tmf[i])}} 57 | 58 | dir = file.path(msswrs2dir,"calibration") 59 | print("...aggregate image pair modeling") 60 | cal_mss_tc_aggregate_model(dir) 61 | } 62 | 63 | msswrs1imgdir = file.path(msswrs1dir,"images") 64 | msswrs2imgdir = file.path(msswrs2dir,"images") 65 | 66 | msswrs1files = list.files(msswrs1imgdir, "dos_sr_30m.tif", recursive=T, full.names=T) 67 | msswrs2files = list.files(msswrs2imgdir, "dos_sr_30m.tif", recursive=T, full.names=T) 68 | 69 | files = c(msswrs1files,msswrs2files) 70 | dir = file.path(msswrs2dir,"calibration","aggregate_model") 71 | bcoef = as.numeric(read.csv(file.path(dir,"tcb_cal_aggregate_coef.csv"))[1,2:6]) 72 | gcoef = as.numeric(read.csv(file.path(dir,"tcg_cal_aggregate_coef.csv"))[1,2:6]) 73 | wcoef = as.numeric(read.csv(file.path(dir,"tcw_cal_aggregate_coef.csv"))[1,2:6]) 74 | 75 | print("...applying model to all MSS images") 76 | cores = 1 77 | if(cores==2){ 78 | cl = makeCluster(cores) 79 | registerDoParallel(cl) 80 | cfun <- function(a, b) NULL 81 | o = foreach(i=1:length(files), .combine="cfun",.packages="LandsatLinkr") %dopar% msssr2tc(files[i],bcoef,gcoef,wcoef,"apply") # 82 | stopCluster(cl) 83 | } else {for(i in 1:length(files)){msssr2tc(files[i],bcoef,gcoef,wcoef,"apply")}} 84 | 85 | } 86 | 87 | 88 | 89 | 90 | -------------------------------------------------------------------------------- /R/msscal_single.r: -------------------------------------------------------------------------------- 1 | #' Calibrate MSS images to TM images 2 | #' 3 | #' Calibrate MSS images to TM images using linear regression 4 | #' @param msswrs2dir character. MSS WRS-2 scene directory path 5 | #' @param tmwrs2dir character. TM WRS-2 scene directory path 6 | #' @import raster 7 | #' @import MASS 8 | #' @export 9 | 10 | 11 | msscal_single = function(mss_file, tm_file){ 12 | 13 | get_intersection = function(files){ 14 | int = intersect(extent(raster(files[1])),extent(raster(files[2]))) 15 | if(length(files) >= 3){for(i in 3:length(files))int = intersect(extent(raster(files[i])), int)} 16 | return(int) 17 | } 18 | 19 | predict_mss_index = function(tbl, outsampfile){ 20 | #create a multivariable linear model 21 | model = rlm(refsamp ~ b1samp + b2samp + b3samp + b4samp, data=tbl) #tbl replaced final 1/22/2016 22 | 23 | tbl$singlepred = round(predict(model)) 24 | write.csv(tbl, outsampfile, row.names=F) 25 | 26 | #plot the regression 27 | r = cor(tbl$refsamp, tbl$singlepred) 28 | coef = rlm(tbl$refsamp ~ tbl$singlepred) 29 | 30 | pngout = sub("samp.csv", "plot.png",outsampfile) 31 | png(pngout,width=700, height=700) 32 | title = paste(tbl$index[1],"linear regression: slope =",paste(signif(coef$coefficients[2], digits=3),",",sep=""), 33 | "y Intercept =",paste(round(coef$coefficients[1], digits=3),",",sep=""), 34 | "r =",signif(r, digits=3)) 35 | plot(x=tbl$singlepred,y=tbl$refsamp, #tbl replaced final 1/22/2016 36 | main=title, 37 | xlab=paste(tbl$mss_img[1],tbl$index[1]), 38 | ylab=paste(tbl$ref_img[1],tbl$index[1])) 39 | abline(coef = coef$coefficients, col="red") 40 | dev.off() 41 | 42 | coef_tbl = data.frame(rbind(model$coefficients)) 43 | cnames = c("yint","b1c","b2c","b3c","b4c") 44 | colnames(coef_tbl) = cnames 45 | tbls = list(coef_tbl,tbl) 46 | return(tbls) 47 | } 48 | 49 | #define the filenames 50 | mss_sr_file = mss_file 51 | mss_mask_file = sub("dos_sr_30m.tif", "cloudmask_30m.tif", mss_sr_file) 52 | ref_tc_file = tm_file 53 | ref_tca_file = sub("tc", "tca", ref_tc_file) 54 | ref_mask_file = sub("tc", "cloudmask", ref_tc_file) 
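# Note: the sub() calls above rely on the LLR file-naming convention, where companion
# files differ from the image file only by suffix. For a hypothetical image ID the
# expansion looks like this (illustration only, IDs are made up):
#   "LM10450301973191_dos_sr_30m.tif" -> "LM10450301973191_cloudmask_30m.tif"
#   "LT50450301984183_tc.tif"         -> "LT50450301984183_tca.tif" and "LT50450301984183_cloudmask.tif"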
55 | 56 | #make new directory 57 | dname = dirname(mss_sr_file) 58 | mssimgid = substr(basename(mss_sr_file),1,16) 59 | outdir = file.path(substr(dname,1,nchar(dname)-12),"calibration", mssimgid) #-5 60 | dir.create(outdir, showWarnings = F, recursive=T) 61 | 62 | #load files as raster 63 | mss_sr_img = brick(mss_sr_file) 64 | mss_mask_img = raster(mss_mask_file) 65 | ref_tc_img = brick(ref_tc_file) 66 | ref_tca_img = raster(ref_tca_file) 67 | ref_mask_img = raster(ref_mask_file) 68 | 69 | #align the extents 70 | extent(mss_sr_img) = alignExtent(mss_sr_img, ref_tc_img, snap="near") 71 | extent(mss_mask_img) = alignExtent(mss_mask_img, ref_tc_img, snap="near") 72 | extent(ref_tc_img) = alignExtent(ref_tc_img, ref_tc_img, snap="near") 73 | extent(ref_tca_img) = alignExtent(ref_tca_img, ref_tc_img, snap="near") 74 | extent(ref_mask_img) = alignExtent(ref_mask_img, ref_tc_img, snap="near") 75 | 76 | #crop the images to their intersection 77 | int = get_intersection(c(mss_sr_file,mss_mask_file,ref_tc_file,ref_tca_file,ref_mask_file)) 78 | mss_sr_img = crop(mss_sr_img,int) 79 | mss_mask_img = crop(mss_mask_img,int) 80 | ref_tc_img = crop(ref_tc_img,int) 81 | ref_tca_img = crop(ref_tca_img,int) 82 | ref_mask_img = crop(ref_mask_img,int) 83 | 84 | #make a composite mask 85 | mss_mask_v = as.vector(mss_mask_img) 86 | ref_mask_v = as.vector(ref_mask_img) 87 | mask = mss_mask_v*ref_mask_v 88 | mss_mask_v = ref_mask_v = 0 # save memory 89 | 90 | goods = which(mask == 1) 91 | if(length(goods) < 20000){return()} 92 | 93 | #random sample 94 | samp = sample(1:length(goods), 20000) 95 | samp = goods[samp] 96 | sampxy = xyFromCell(mss_mask_img, samp) #added on 1/22/2016 97 | 98 | #save memory 99 | mask = 0 100 | 101 | msssamp = extract(mss_sr_img, sampxy) 102 | tcsamp = extract(ref_tc_img, sampxy) 103 | tcasamp = extract(ref_tca_img, sampxy) 104 | 105 | #make sure the values are good for running regression on (diversity) 106 | unib1samp = length(unique(msssamp[,1])) 107 | unib2samp = length(unique(msssamp[,2])) 108 | unib3samp = length(unique(msssamp[,3])) 109 | unib4samp = length(unique(msssamp[,4])) 110 | 111 | unitcbsamp = length(unique(tcsamp[,1])) 112 | unitcgsamp = length(unique(tcsamp[,2])) 113 | unitcwsamp = length(unique(tcsamp[,3])) 114 | unitcasamp = length(unique(tcasamp)) 115 | 116 | 117 | #if(unib1samp < 15 | unib2samp < 15 | unib3samp < 15 | unib4samp < 15 ){return()} 118 | if(unib1samp < 15 | unib2samp < 15 | unib3samp < 15 | unib4samp < 15 | 119 | unitcbsamp < 15 | unitcgsamp < 15 | unitcwsamp < 15 | unitcasamp < 15){return()} 120 | 121 | mssbname = basename(mss_sr_file) 122 | refbname = basename(ref_tc_file) 123 | refabname = basename(ref_tca_file) 124 | 125 | tcb_tbl = data.frame(mssbname,refbname,"tcb",sampxy,tcsamp[,1],msssamp) 126 | tcg_tbl = data.frame(mssbname,refbname,"tcg",sampxy,tcsamp[,2],msssamp) 127 | tcw_tbl = data.frame(mssbname,refbname,"tcw",sampxy,tcsamp[,3],msssamp) 128 | tca_tbl = data.frame(mssbname,refabname,"tca",sampxy,tcasamp,msssamp) 129 | 130 | tcb_tbl = tcb_tbl[complete.cases(tcb_tbl),] 131 | tcg_tbl = tcg_tbl[complete.cases(tcg_tbl),] 132 | tcw_tbl = tcw_tbl[complete.cases(tcw_tbl),] 133 | tca_tbl = tca_tbl[complete.cases(tca_tbl),] 134 | 135 | cnames = c("mss_img","ref_img","index","x","y","refsamp","b1samp","b2samp","b3samp","b4samp") 136 | colnames(tcb_tbl) = cnames 137 | colnames(tcg_tbl) = cnames 138 | colnames(tcw_tbl) = cnames 139 | colnames(tca_tbl) = cnames 140 | 141 | outsampfile = file.path(outdir,paste(mssimgid,"_tcb_cal_samp.csv",sep="")) 142 | 
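# Note: each predict_mss_index() call below fits a robust linear model (MASS::rlm) of
# the form refsamp ~ b1samp + b2samp + b3samp + b4samp, predicting the reference TM
# tasseled-cap value from the four MSS bands, writes the sample table and a regression
# scatter plot, and returns the coefficients plus the samples used for the fit.
# Minimal standalone sketch of the model form (synthetic data, illustration only):
# library(MASS)
# d = data.frame(refsamp=rnorm(100), b1samp=rnorm(100), b2samp=rnorm(100), b3samp=rnorm(100), b4samp=rnorm(100))
# m = rlm(refsamp ~ b1samp + b2samp + b3samp + b4samp, data=d)
# coef(m)  # intercept and band coefficients; renamed yint/b1c/b2c/b3c/b4c in the *_cal_coef.csv output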
model = predict_mss_index(tcb_tbl, outsampfile) 143 | bcoef = model[[1]] 144 | bsamp = model[[2]] 145 | br = cor(bsamp$refsamp, bsamp$singlepred) 146 | 147 | outsampfile = file.path(outdir,paste(mssimgid,"_tcg_cal_samp.csv",sep="")) 148 | model = predict_mss_index(tcg_tbl, outsampfile) 149 | gcoef = model[[1]] 150 | gsamp = model[[2]] 151 | gr = cor(gsamp$refsamp, gsamp$singlepred) 152 | 153 | outsampfile = file.path(outdir,paste(mssimgid,"_tcw_cal_samp.csv",sep="")) 154 | model = predict_mss_index(tcw_tbl, outsampfile) 155 | wcoef = model[[1]] 156 | wsamp = model[[2]] 157 | wr = cor(wsamp$refsamp, wsamp$singlepred) 158 | 159 | #TCA 160 | outsampfile = file.path(outdir,paste(mssimgid,"_tca_cal_samp.csv",sep="")) 161 | model = predict_mss_index(tca_tbl, outsampfile) 162 | acoef = model[[1]] 163 | asamp = model[[2]] 164 | ar = cor(asamp$refsamp, asamp$singlepred) 165 | 166 | tcbinfo = data.frame(mss_file=mssbname, ref_file=refbname, index="tcb", bcoef, r=br) 167 | tcginfo = data.frame(mss_file=mssbname, ref_file=refbname, index="tcg", gcoef, r=gr) 168 | tcwinfo = data.frame(mss_file=mssbname, ref_file=refbname, index="tcw", wcoef, r=wr) 169 | tcainfo = data.frame(mss_file=mssbname, ref_file=refabname, index="tca", acoef, r=ar) 170 | 171 | tcbcoefoutfile = file.path(outdir,paste(mssimgid,"_tcb_cal_coef.csv",sep="")) 172 | tcgcoefoutfile = file.path(outdir,paste(mssimgid,"_tcg_cal_coef.csv",sep="")) 173 | tcwcoefoutfile = file.path(outdir,paste(mssimgid,"_tcw_cal_coef.csv",sep="")) 174 | tcacoefoutfile = file.path(outdir,paste(mssimgid,"_tca_cal_coef.csv",sep="")) 175 | 176 | write.csv(tcbinfo, tcbcoefoutfile, row.names=F) 177 | write.csv(tcginfo, tcgcoefoutfile, row.names=F) 178 | write.csv(tcwinfo, tcwcoefoutfile, row.names=F) 179 | write.csv(tcainfo, tcacoefoutfile, row.names=F) 180 | } 181 | -------------------------------------------------------------------------------- /R/msscost.r: -------------------------------------------------------------------------------- 1 | #' Convert DN values to surface reflectance 2 | #' 3 | #' Convert DN values to surface reflectance using the COST model with dark object subtraction 4 | #' @param file The full path name of the *archv file 5 | #' @import raster 6 | #' @export 7 | 8 | 9 | msscost = function(file, overwrite=F){ 10 | 11 | #link to the equations to convert DN to TOA and TOA to SR 12 | #http://landsathandbook.gsfc.nasa.gov/data_prod/prog_sect11_3.html 13 | 14 | check = file_check(file,"dos_sr.tif",overwrite) 15 | if(check == 0){return(0)} 16 | 17 | refl = function(file, band, gain, bias, sunzenith, d, esun, dov){ 18 | orig = raster(file, band) 19 | img = as.matrix(orig) 20 | img = ((gain*img)+bias)-((gain*dov)+bias) 21 | img[img < 0] = 0 22 | img = (pi * img * (d^2))/(esun * cos(sunzenith)) 23 | img = round(img * 10000) 24 | img = setValues(orig,img) 25 | return(img) 26 | } 27 | 28 | 29 | #find the dark object values 30 | brightthresh = 120 31 | 32 | b1 = as.matrix(raster(file, 1)) 33 | goodpix = which(is.na(b1) == F & b1 <= brightthresh) 34 | samp = matrix(nrow=100000,ncol=30) 35 | for(k in 1:length(samp[1,])){samp[,k] = sample(length(goodpix), 100000)} 36 | 37 | #setup the png darkobject file 38 | pngout = paste(substr(file,1,nchar(file)-4),"_drkobjv.png", sep="") 39 | if(file.exists(pngout) == T){unlink(pngout)} 40 | png(pngout,width=700, height=700) 41 | par(mfrow=c(2,2)) 42 | 43 | 44 | #iterate through the bands finding the dark objects 45 | thresh = c(5,5,4,3) 46 | drkobjv = array(dim=4) 47 | for(b in 1:4){ 48 | dn = as.matrix(raster(file, 
band=b)) 49 | dnsamp = dn[goodpix] 50 | drkobj = array(dim=length(samp[1,])) 51 | for(g in 1:length(samp[1,])){ 52 | r = dnsamp[samp[,g]] 53 | count = table(r) #250 54 | count1 = c(count[2:length(count)],0) 55 | shift = (count1-count) 56 | valu = as.numeric(rownames(shift)) 57 | goods = which(shift >= thresh[b]) 58 | drkobj[g] = valu[goods[1]] 59 | } 60 | 61 | finaldrkobj = round(mean(drkobj)) 62 | hist(dn, breaks=256, ylim=c(0,20000), 63 | main=paste("Band",b,"dark object value =", finaldrkobj), col="black", xlab="DN") 64 | abline(v = finaldrkobj, col = "red") 65 | drkobjv[b] = finaldrkobj 66 | } 67 | dev.off() 68 | 69 | #read in the image metadata 70 | info = get_metadata(file) 71 | 72 | #define esun values for mss (chander et al 2009 summary of current radiometric calibration coefficients... RSE 113) 73 | if(info$sensor == "LANDSAT_1"){esun = c(1823,1559,1276,880.1)} 74 | if(info$sensor == "LANDSAT_2"){esun = c(1829,1539,1268,886.6)} 75 | if(info$sensor == "LANDSAT_3"){esun = c(1839,1555,1291,887.9)} 76 | if(info$sensor == "LANDSAT_4"){esun = c(1827,1569,1260,866.4)} 77 | if(info$sensor == "LANDSAT_5"){esun = c(1824,1570,1249,853.4)} 78 | 79 | #define the earth sun distance 80 | d = eudist(info$doy) 81 | sunzenith = info$sunzen*(pi/180) 82 | 83 | #apply the conversion to reflectance 84 | b1 = refl(file,1,info$b1gain, info$b1bias, sunzenith, d, esun[1], drkobjv[1]) 85 | b2 = refl(file,2,info$b2gain, info$b2bias, sunzenith, d, esun[2], drkobjv[2]) 86 | b3 = refl(file,3,info$b3gain, info$b3bias, sunzenith, d, esun[3], drkobjv[3]) 87 | b4 = refl(file,4,info$b4gain, info$b4bias, sunzenith, d, esun[4], drkobjv[4]) 88 | 89 | #stack the bands 90 | img = stack(b1,b2,b3,b4) 91 | 92 | #write out 93 | projection(img) = set_projection(file) 94 | img = as(img, "SpatialGridDataFrame") 95 | outfile = sub("archv", "dos_sr", file) 96 | writeGDAL(img, outfile, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 97 | 98 | return(1) 99 | } -------------------------------------------------------------------------------- /R/msscvm.r: -------------------------------------------------------------------------------- 1 | #' Create a cloud and cloud shadow mask for Landsat MSS imagery 2 | #' 3 | #' Creates a cloud and cloud shadow mask for a Landsat MSS reflectance image using spectral tests and a DEM-based topographic illumination correction. 4 | #' @param file character. MSS reflectance image filename (full system path to MSS file) 5 | #' @param demfile character. DEM filename (full system path to spatially coincident DEM file) 6 | #' @param topoprep logical. TRUE if slope and aspect are already created in the "topo" folder and FALSE if not 7 | #' @param test logical. 
If TRUE clouds, cloud shadows and clear pixels have unique values, if FALSE obscured are 0 and clear are 1 8 | #' @return A binary raster with the same dimensions as the MSS image where pixels with value 1 represent clear pixel and 0 as obsured by either cloud or cloud shadow 9 | #' @import raster 10 | #' @import rgdal 11 | #' @import SDMTools 12 | #' @import igraph 13 | #' @export 14 | 15 | 16 | msscvm = function(file, demfile, topoprep, test=F, overwrite=F){ 17 | 18 | check = file_check(file,"cloudmask.tif",overwrite) 19 | if(check == 0){return(0)} 20 | 21 | ref = raster(file) 22 | info = get_metadata(file) 23 | 24 | b1 = as.matrix(raster(file, 1)) 25 | b2 = as.matrix(raster(file, 2)) 26 | b4 = as.matrix(raster(file, 4)) 27 | 28 | #apply topographic correction to band 4 for identifying cloud shadows 29 | k=0.55 30 | sunzen = info$sunzen*(pi/180) 31 | 32 | #crop the hillshade layer and convert to illumination 33 | if(topoprep == T){ 34 | dname = dirname(file) 35 | scenedir = substr(dname,1,nchar(dname)-12) 36 | topodir = file.path(scenedir,"topo") 37 | slopefile = list.files(topodir,"slope.tif$",full.name=T) 38 | aspectfile = list.files(topodir,"aspect.tif$",full.name=T) 39 | slope = raster(slopefile) 40 | aspect = raster(aspectfile) 41 | 42 | slope_ex = alignExtent(slope, ref, snap="near") 43 | extent(slope) = slope_ex 44 | slope = crop(slope,ref) 45 | 46 | aspect_ex = alignExtent(aspect, ref, snap="near") 47 | extent(aspect) = slope_ex 48 | aspect = crop(aspect,ref) 49 | 50 | ill = as.matrix(hillShade(slope, aspect, angle=info$sunelev, direction=info$sunaz, normalize=F)) 51 | } 52 | 53 | if(topoprep == F){ 54 | dem = raster(demfile) 55 | dem_ex = alignExtent(dem, ref, snap="near") 56 | extent(dem) = dem_ex 57 | dem = crop(dem,ref) 58 | 59 | #dem = crop(raster(demfile), raster(file)) 60 | slope = terrain(dem, opt="slope") 61 | aspect = terrain(dem, opt="aspect") 62 | ill = as.matrix(hillShade(slope, aspect, angle=info$sunelev, direction=info$sunaz, normalize=F)) 63 | } 64 | 65 | 66 | #apply the correction 67 | c = (cos(sunzen)/ill)^k 68 | b4topoc = round(b4*c) 69 | 70 | #find clouds 71 | cloud = (b1 - b2)/(b1 + b2) 72 | clouds = which(cloud > 0.0 & b1 > 1750 | b1 > 3900) 73 | 74 | #find the shadows 75 | b4topoc[clouds] = NA 76 | 77 | b4nocldmean = mean(b4topoc, na.rm=T) 78 | shadowthresh1 = round(0.40*b4nocldmean + 247.97) #rad = 536.94 79 | nocldorshdw = which(b4topoc > shadowthresh1) #find non-shadow pixels first pass 80 | b4nocldorshdw = b4topoc[nocldorshdw] 81 | b4nocldorshdwmean = mean(b4nocldorshdw, na.rm=T) 82 | shadowthresh2 = round(0.47*b4nocldorshdwmean + 73.23) #refl = 73.23 rad = 158.56 83 | shadows = which(b4topoc <= shadowthresh2) 84 | 85 | #find the water 86 | slope = as.matrix(slope) 87 | ndvi = (b4-b2)/(b4+b2) 88 | waterindex = which(ndvi < 0.0850 & slope < (0.5*(pi/180))) #| ndvi < -0.05 & & slope < (1.5*(pi/180)) 89 | 90 | #create a set of blank rasters for filtering 91 | b1[] = 0 92 | water=shadow=cloud=b1 93 | b1=0 94 | 95 | water[waterindex] = 1 96 | clumps = .Call("ccl", water, PACKAGE = "SDMTools") 97 | clumps = setValues(ref, clumps) 98 | fre = freq(clumps) 99 | these = which(fre[,2] < 7) 100 | values = fre[these,1] 101 | m = match(as.matrix(clumps), values) 102 | these = which(is.na(m) == F) 103 | water[these] = 0 104 | 105 | water = setValues(ref,water) 106 | water = focal(water, w=matrix(1,5,5), fun=max, na.rm=T) 107 | waterindex = which(as.matrix(water) == 1) 108 | 109 | shadow[shadows] = 1 110 | shadow[waterindex] = 0 111 | cloud[clouds] = 1 112 | 113 | 
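# Note: the sieve below labels connected components in the candidate cloud layer
# (SDMTools "ccl"), drops clumps smaller than 10 pixels (presumed false positives),
# and then grows the surviving cloud mask with a 5x5 focal max to buffer cloud edges.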
#filter the aggregated cloud and shadow 114 | clumps = .Call("ccl", cloud, PACKAGE = "SDMTools") 115 | clumps = setValues(ref, clumps) 116 | fre = freq(clumps) 117 | these = which(fre[,2] < 10) #10 118 | values = fre[these,1] 119 | m = match(as.matrix(clumps), values) 120 | these = which(is.na(m) == F) 121 | cloud[these] = 0 122 | 123 | cloud = setValues(ref,cloud) 124 | cloud = focal(cloud, w=matrix(1,5,5), fun=max, na.rm=F, pad=T, padValue=0) 125 | 126 | reso = 60 127 | r = raster(ncol=31,nrow=31) 128 | ext = extent(0, 31, 0, 31) 129 | extent(r) = ext 130 | projection(r) = set_projection(file) 131 | r[] = NA 132 | r[16,16] = 1 133 | dist = gridDistance(r, 1) 134 | kernal = dist <= 16 135 | 136 | shadowproj = focal(cloud, w=as.matrix(kernal), fun=max, na.rm=F, pad=T, padValue=0) 137 | shiftstart = 1000/tan((pi/180)*info$sunelev) 138 | shiftend = 7000/tan((pi/180)*info$sunelev) 139 | 140 | shiftlen = seq(shiftstart,shiftend,900) 141 | 142 | shiftit = function(shadowproj,shiftlen,info,reso){ 143 | if(info$sunaz > 90 & info$sunaz < 180){ 144 | angle = info$sunaz-90 145 | yshift = round((sin((pi/180)*angle) * shiftlen)/reso)*reso 146 | xshift = round((cos((pi/180)*angle) * shiftlen * -1)/reso)*reso 147 | shadowproj = shift(shadowproj, x=xshift, y=yshift) 148 | } 149 | if(info$sunaz > 0 & info$sunaz < 90){ 150 | angle = 90-info$sunaz 151 | angle = 10 152 | yshift = round((sin((pi/180)*angle) * shiftlen *-1)/reso)*reso 153 | xshift = round((cos((pi/180)*angle) * shiftlen *-1)/reso)*reso 154 | shadowproj = shift(shadowproj, x=xshift, y=yshift) 155 | } 156 | return(shadowproj) 157 | } 158 | 159 | for(m in 1:length(shiftlen)){ 160 | if(m == 1){mergeit = "r1"} else {mergeit = paste(mergeit,",r",m, sep="")} 161 | dothis = paste("r",m,"=shiftit(shadowproj,shiftlen[m],info,reso)", sep="") 162 | eval(parse(text=dothis)) 163 | if(m == length(shiftlen)){mergeit = paste("shadowproj = mosaic(",mergeit,",fun=max)", sep="")} 164 | } 165 | 166 | #run the merge function 167 | if(length(shiftlen) == 1){shadowproj = r1} else {eval(parse(text=mergeit))} 168 | 169 | #ake sure that all values are finite, the mosaicing function with max can cause some problems where there are no pixels 170 | shadowproj[!is.finite(values(shadowproj))] = 0 171 | 172 | #extend the layer so it has a full union with the cloud layer 173 | shadowproj = extend(shadowproj, cloud, value=0) 174 | 175 | #crop the cloud projection layer by the cloud layer 176 | shadowproj = crop(shadowproj, cloud) 177 | 178 | #convert the shadow layer to a matrix 179 | shadow = setValues(ref,shadow) 180 | 181 | #get the intersection of shadow and cloud projection 182 | shadow = shadow*shadowproj 183 | 184 | #convert the shadow layer to a matrix for spatial sieve 185 | shadow = as.matrix(shadow) 186 | 187 | #filter the aggregated cloud and shadow 188 | clumps = .Call("ccl", shadow, PACKAGE = "SDMTools") 189 | clumps = setValues(ref, clumps) 190 | fre = freq(clumps) 191 | these = which(fre[,2] < 10) #10 192 | values = fre[these,1] 193 | m = match(as.matrix(clumps), values) 194 | these = which(is.na(m) == F) 195 | shadow[these] = 0 196 | shadow = setValues(ref, shadow) 197 | 198 | shadow = focal(shadow, w=matrix(1,5,5), fun=max, na.rm=F, pad=T, padValue=0) 199 | if(test == T){cloud = cloud*2 200 | cloudshadow = mosaic(cloud,shadow,fun=max, na.rm=T) 201 | } else {cloudshadow = sum(cloud, shadow, na.rm=T)} 202 | 203 | 204 | if(test == F){cloudshadow = setValues(ref,as.numeric(values(cloudshadow) == 0))} 205 | cloudshadow[is.na(ref)] = NA 206 | projection(cloudshadow) 
= set_projection(file) 207 | cloudshadow = as(cloudshadow, "SpatialGridDataFrame") 208 | if(test == F){outfile = sub("reflectance", "cloudmask", file)} else {outfile = sub("reflectance", "cloudmask_test", file)} 209 | writeGDAL(cloudshadow, outfile, drivername = "GTiff", type = "Byte", mvFlag=255, options="INTERLEAVE=BAND") 210 | 211 | } 212 | -------------------------------------------------------------------------------- /R/mssdn2refl.r: -------------------------------------------------------------------------------- 1 | #' Convert DN values to toa reflectance 2 | #' 3 | #' Convert DN values to toa reflectance 4 | #' @param file The full path name of the *archv file 5 | #' @import raster 6 | #' @import rgdal 7 | #' @export 8 | 9 | 10 | mssdn2refl = function(file, overwrite=F){ 11 | 12 | check = file_check(file,"reflectance.tif",overwrite) 13 | if(check == 0){return(0)} 14 | 15 | #link to the equations to convert DN to TOA and TOA to SR 16 | #http://landsathandbook.gsfc.nasa.gov/data_prod/prog_sect11_3.html 17 | 18 | #define the reflectance function 19 | refl = function(file, band, gain, bias, sunzen, d, esun){ 20 | orig = raster(file, band) 21 | img = as.matrix(orig) 22 | img = ((gain*img)+bias) 23 | img[img < 0] = 0 24 | img = (pi * img * (d^2))/(esun * cos(sunzenith)) 25 | img = round(img * 10000) 26 | img = setValues(orig,img) 27 | return(img) 28 | } 29 | 30 | #read in the image metadata 31 | info = get_metadata(file) 32 | 33 | #define esun values for mss (chander et al 2009 summary of current radiometric calibration coefficients... RSE 113) 34 | if(info$sensor == "LANDSAT_1"){esun = c(1823,1559,1276,880.1)} 35 | if(info$sensor == "LANDSAT_2"){esun = c(1829,1539,1268,886.6)} 36 | if(info$sensor == "LANDSAT_3"){esun = c(1839,1555,1291,887.9)} 37 | if(info$sensor == "LANDSAT_4"){esun = c(1827,1569,1260,866.4)} 38 | if(info$sensor == "LANDSAT_5"){esun = c(1824,1570,1249,853.4)} 39 | 40 | #define the earth sun distance 41 | doy = as.numeric(substr(basename(file),14,16)) 42 | d = eudist(doy) 43 | sunzenith = info$sunzen/57.2958 44 | 45 | #apply the conversion to reflectance 46 | b1 = refl(file,1,info$b1gain, info$b1bias, sunzen, d, esun[1]) 47 | b2 = refl(file,2,info$b2gain, info$b2bias, sunzen, d, esun[2]) 48 | b3 = refl(file,3,info$b3gain, info$b3bias, sunzen, d, esun[3]) 49 | b4 = refl(file,4,info$b4gain, info$b4bias, sunzen, d, esun[4]) 50 | 51 | #stack the bands 52 | img = stack(b1,b2,b3,b4) 53 | 54 | #write it out 55 | projection(img) = set_projection(file) 56 | img = as(img, "SpatialGridDataFrame") #convert the raster to SGHF so it can be written using GDAL (faster than writing it with the raster package) 57 | outfile = sub("archv", "reflectance", file) 58 | writeGDAL(img, outfile, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 59 | return(1) 60 | } -------------------------------------------------------------------------------- /R/msssr2tc.r: -------------------------------------------------------------------------------- 1 | #' Create MSS TC from surface reflectance and modeled coefficients 2 | #' 3 | #' Create MSS TC from surface reflectance and modeled coefficients 4 | #' @param mss_file character. full path name to MSS surface reflectance file 5 | #' @param bcoef numeric. numeric array containing the tcb coefficients 6 | #' @param gcoef numeric. numeric array containing the tcg coefficients 7 | #' @param wcoef numeric. numeric array containing the tcw coefficients 8 | #' @param mode. character. 
how to deal with the outputs options: "calibrate" or "apply" 9 | #' @import raster 10 | #' @export 11 | 12 | 13 | msssr2tc = function(mss_file,bcoef,gcoef,wcoef,mode){ 14 | 15 | if(mode == "calibrate"){ 16 | dir = substr(dirname(mss_file),1,nchar(dirname(mss_file))-12) 17 | tcfiledir = file.path(dir,"calibration","aggregate_model_tc_imgs") 18 | dir.create(tcfiledir, recursive=T, showWarnings=F) 19 | tcfile = file.path(tcfiledir,sub("dos_sr_30m.tif","tc_30m.tif", basename(mss_file))) 20 | tcafile = file.path(tcfiledir,sub("dos_sr_30m.tif","tca_30m.tif", basename(mss_file))) 21 | } 22 | if(mode == "apply"){ 23 | tcfile = sub("dos_sr_30m.tif","tc_30m.tif", mss_file) 24 | tcafile = sub("dos_sr_30m.tif","tca_30m.tif", mss_file) 25 | } 26 | 27 | #read in the dos sr image 28 | ref = raster(mss_file) 29 | b1=as.matrix(raster(mss_file,1)) 30 | b2=as.matrix(raster(mss_file,2)) 31 | b3=as.matrix(raster(mss_file,3)) 32 | b4=as.matrix(raster(mss_file,4)) 33 | 34 | #transform to tc 35 | tcb = round((((b1*bcoef[2])+(b2*bcoef[3])+(b3*bcoef[4])+(b4*bcoef[5])) + bcoef[1])) 36 | tcg = round((((b1*gcoef[2])+(b2*gcoef[3])+(b3*gcoef[4])+(b4*gcoef[5])) + gcoef[1])) 37 | tcw = round((((b1*wcoef[2])+(b2*wcoef[3])+(b3*wcoef[4])+(b4*wcoef[5])) + wcoef[1])) 38 | 39 | b1=b2=b3=b4=0 #save memory 40 | 41 | tca = atan(tcg/tcb) * (180/pi) * 100 #need to multiply by (180/pi) to get degrees because atan() returns radians, 100 is a scalar to preserve two decimal places 42 | 43 | tca = matrix_to_raster(mss_file, tca) 44 | tcb = matrix_to_raster(mss_file, tcb) 45 | tcg = matrix_to_raster(mss_file, tcg) 46 | tcw = matrix_to_raster(mss_file, tcw) 47 | 48 | projection(tca) = set_projection(mss_file) 49 | tca = as(tca, "SpatialGridDataFrame") 50 | writeGDAL(tca, tcafile, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 51 | 52 | tca=0 #recover some memory 53 | 54 | #stack tc and write out 55 | tc = stack(tcb,tcg,tcw) 56 | projection(tc) = set_projection(mss_file) 57 | tc = as(tc, "SpatialGridDataFrame") 58 | writeGDAL(tc, tcfile, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 59 | 60 | return(1) 61 | } 62 | -------------------------------------------------------------------------------- /R/mssunpackr.r: -------------------------------------------------------------------------------- 1 | #' Decompress, stack, and reproject LPSG MSS images 2 | #' 3 | #' Decompresses, stacks, and optionally reprojects LPGS MSS images recieved from USGS EROS as .tar.gz files 4 | #' @param file Filename of LPGS Landsat MSS image filename (full system path to file) 5 | #' @param proj PROJ.4 projection definition. By default no projection will take place. Optionally specify a CRS projection string or "albers" for the USGS version of Albers Equal Area Conic 6 | #' @param reso numeric. 
the target pixel size for the output image 7 | #' @import raster 8 | #' @import gdalUtils 9 | #' @export 10 | 11 | 12 | mssunpackr = function(file, proj, overwrite=F){ 13 | 14 | check = file_check(file,"archv.tif",overwrite) 15 | if(check == 0){return(0)} 16 | 17 | filebase = basename(file) 18 | year = as.numeric(substr(filebase, 10, 13)) 19 | if(year >= 1995){ 20 | delete_files(file, 6) 21 | return(0) 22 | } 23 | 24 | randomstring = paste(sample(c(0:9, letters, LETTERS), 6, replace=TRUE),collapse="") 25 | tempdir = file.path(dirname(file),randomstring) #temp 26 | untar(file, exdir=tempdir, tar="internal") #decompress the file 27 | mtlfile = list.files(tempdir, pattern = "MTL.txt", full.names = T, recursive = T) 28 | tbl = unlist(read.delim(mtlfile, header=F, skipNul=T)) 29 | dtype = as.character(grep("DATA_TYPE = ", tbl, value=T)) 30 | l1ttest = length(grep("L1T", dtype)) 31 | if(l1ttest == 1){ 32 | #find files and make new file names and directories 33 | allfiles = list.files(tempdir, full.names=T) #find all the decompressed files 34 | tiffiles = allfiles[grep("TIF",allfiles)] #subset the tif image files 35 | verfile = allfiles[grep("VER.txt",allfiles)] 36 | otherfiles = allfiles[grep("TIF",allfiles, invert=T)] #subset the other files 37 | filedir = dirname(file) #get the directory 38 | pieces = unlist(strsplit(filedir, "/")) #break up the directory and unlist so the pieces can be called by index 39 | len = length(pieces)-1 #get the ending index for "scene" 40 | newpieces = paste(pieces[1:len], collapse = "/") #subset the directory pieces so the last piece is the scene 41 | imgid = substr(filebase, 1, 16) 42 | name = paste(imgid, "_archv.tif", sep = "") #define the new file basename 43 | newdir = file.path(newpieces, "images", year, fsep = .Platform$file.sep) #define the new directory 44 | 45 | #make output filenames 46 | tempstack = file.path(tempdir,paste(imgid, "_tempstack.tif", sep = "")) 47 | projstack = sub("tempstack", "projstack", tempstack) 48 | finalstack = file.path(newdir, name) #define the new full filename of the output image 49 | outprojfile = sub("archv.tif", "proj.txt", finalstack) 50 | 51 | #deal with the ancillary file names 52 | junk = substr(filebase, 17,21) 53 | baseotherfiles = basename(otherfiles) 54 | for(h in 1:length(otherfiles)){baseotherfiles[h] = sub(junk, "", baseotherfiles[h])} 55 | newotherfiles = file.path(newdir, baseotherfiles, fsep = .Platform$file.sep) #define the new filenames for associated files 56 | 57 | ref = raster(tiffiles[1]) #load a file to get original projection 58 | s = stack(tiffiles[1],tiffiles[2],tiffiles[3],tiffiles[4]) 59 | img = as.array(s) 60 | b1bads = img[,,1]>1 #!=0 61 | b2bads = img[,,2]>1 #!=0 62 | b3bads = img[,,3]>1 #!=0 63 | b4bads = img[,,4]>1 #!=0 64 | bads = b1bads*b2bads*b3bads*b4bads 65 | 66 | if(length(which(bads==0))/ncell(ref) > 0.75){ 67 | delete_files(file, 5) 68 | unlink(c(tempdir), recursive=T, force=T) 69 | return(0) 70 | } 71 | 72 | img[,,1] = img[,,1]*bads 73 | img[,,2] = img[,,2]*bads 74 | img[,,3] = img[,,3]*bads 75 | img[,,4] = img[,,4]*bads 76 | 77 | dir.create(newdir, recursive=T, showWarnings=F) #make a new output directory 78 | cloudtest = img[,,1] > 130 79 | percent = round((length(which(cloudtest == T))/length(which((img[,,1]>0) == T)))*100, digits=4) 80 | tbl = unlist(read.delim(verfile, header=F, skipNul=T)) 81 | rmseline = as.character(grep("Scene RMSE: ", tbl, value=T)) 82 | rmse = as.numeric(unlist(strsplit(rmseline, " "))[3]) 83 | outfile = sub("archv.tif", "cloud_rmse.csv", finalstack) 84 | 
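# Note: the write() below records a one-line CSV holding the output file name, a rough
# cloud-cover percent (share of band-1 DNs above 130), and the scene RMSE parsed from
# the *VER.txt file; msswarp() later reads the third field and only warps L1T images
# whose scene RMSE exceeds 0.75.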
write(c(finalstack,percent,rmse),file=outfile, ncolumns=3, sep=",") 85 | 86 | s = setValues(s,img) 87 | origproj = projection(s) 88 | s = as(s, "SpatialGridDataFrame") 89 | writeGDAL(s, tempstack, drivername = "GTiff", options="INTERLEAVE=BAND", type = "Byte", mvFlag = 0) #, drivername = "GTiff" 90 | 91 | 92 | write(proj, outprojfile) 93 | gdalwarp(srcfile=tempstack, dstfile=projstack, 94 | s_srs=origproj, t_srs=proj, of="GTiff", 95 | r="near", srcnodata=0, dstnodata=0, multi=T, 96 | tr=c(60,60), co="INTERLEAVE=BAND") 97 | 98 | 99 | #trim the na rows and cols 100 | if(proj != "default"){infile = projstack} else {infile = tempstack} 101 | trim_na_rowcol(infile, finalstack, "null", "null") 102 | 103 | file.rename(otherfiles,newotherfiles) #move the associated files 104 | unlink(tempdir, recursive=T, force=T) #delete the temp directory 105 | return(1) 106 | } else { 107 | l1goutdir = dirname(sub("targz","l1g_images",file)) 108 | dir.create(l1goutdir, recursive=T, showWarnings=F) #make a new output directory 109 | file.rename(file, file.path(l1goutdir,basename(file))) 110 | unlink(c(tempdir), recursive=T, force=T) 111 | return(0) 112 | } 113 | } -------------------------------------------------------------------------------- /R/msswarp.r: -------------------------------------------------------------------------------- 1 | #' Spatially warp an MSS image 2 | #' 3 | #' Spatially warp an MSS image to match the spatial properties of a reference image 4 | #' @param reffile character. MSS image file that has low spatial RMSE and low cloud cover 5 | #' @param fixfile character. MSS image file to be spatially warped to match the reference file 6 | #' @param window numeric. image subset size used to define cross-correlation calculation. unit is pixels along one side of a square 7 | #' @param search numeric. neighborhood search window size in which to find tie-point offset. unit is pixels along one side of a square 8 | #' @param sample numeric. target number of tie-points 9 | #' @param refstart numeric. c(xcoord,ycoord). reference image coordinate for a pixel identified as common in both the reference and the to-be-warped image. used to calculate an initial offset between the two images. 10 | #' @param fixstart numeric. c(xcoord,ycoord). fix image coordinate for a pixel identified as common in both the reference and the to-be-warped image. used to calculate an initial offset between the two images. 11 | #' @import raster 12 | #' @import gdalUtils 13 | #' @export 14 | 15 | 16 | msswarp = function(reffile, fixfile, refstart=c(0,0), fixstart=c(0,0)){ 17 | 18 | mode = 'warp' 19 | method = 'order 2' 20 | 21 | #mode can be: "rmse" or "warp" 22 | #method can be: "tps" or "order 1" or "order 2" 23 | 24 | 25 | #set default parameters 26 | search=35 #27 27 | if(mode == "rmse"){search = 27} #it should be pretty close, no need to look over a larger region - keep at 27 since that is what the thresholds were selected at. 
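  #conceptual sketch of the tie-point matching used in this function: for a reference chip R and a
  #fix-image chip F, both standardized by the scaleit() helper defined just below, the similarity at
  #a single offset is the normalized cross-correlation
  #  ncc = sum(R * F) / sqrt(sum(R^2) * sum(F^2))
  #the focal() calls further down evaluate this numerator and denominator at every offset within the
  #search window; the resulting correlation surface is only accepted when it has a single peak at
  #least 3 standard deviations above the surface mean and a normalized distance to the search-window
  #edge of at least 0.12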
28 | 29 | #scale image values to center on mean and 1 unit variance (global mean and variance) 30 | scaleit = function(matrix){ 31 | stnrd = (matrix - (mean(matrix, na.rm = TRUE)))/(sd(matrix, na.rm = TRUE)) 32 | return(stnrd) 33 | } 34 | 35 | #make a kernal around a given point 36 | make_kernal = function(img, point1, windowsize){ 37 | radius = floor(windowsize/2) 38 | ccol = colFromX(img, point1[1]) 39 | crow = rowFromY(img, point1[2]) 40 | mincol = ccol-radius 41 | maxcol = ccol+radius 42 | minrow = crow-radius 43 | maxrow = crow+radius 44 | return(extent(c(mincol,maxcol,minrow,maxrow))) 45 | } 46 | 47 | calc_rmse = function(info,reso){ 48 | xresid = (info[,"refx"]-info[,"fixx"])^2 #get the residuals of each x 49 | yresid = (info[,"refy"]-info[,"fixy"])^2 #get the residuals of each y 50 | r = (sqrt(xresid+yresid))/reso #get the rmse of each xy point 51 | x_rmse = sqrt(mean(xresid))/reso 52 | y_rmse = sqrt(mean(yresid))/reso 53 | total_rmse = sqrt((x_rmse^2)+(y_rmse^2)) #total rmse including all points 54 | rmse_info = list(x_rmse=x_rmse, y_rmse=y_rmse, total_rmse=total_rmse, r=r) 55 | return(rmse_info) 56 | } 57 | 58 | make_gdaltrans_opts = function(info, wktfile, fixfile, tempname){ 59 | info[,"refx"] = info[,"refx"]+(reso/2) 60 | info[,"refy"] = info[,"refy"]-(reso/2) 61 | fixcol = paste(info[,"fixcol"]) #fix col for tie point 62 | fixrow = paste(info[,"fixrow"]) #fix row for tie point 63 | refx = paste(round(info[,"refx"])) #fix x tie point coord 64 | refy = paste(round(info[,"refy"])) #fix y tie point coord 65 | gcpstr = paste(" -gcp", fixcol, fixrow, refx, refy, collapse="") 66 | gdaltrans_cmd = paste("-of Gtiff -ot Byte -co INTERLEAVE=BAND -a_srs", wktfile, gcpstr, fixfile, tempname) 67 | return(gdaltrans_cmd) 68 | } 69 | 70 | run_the_file = function(fixfile){ 71 | info = get_metadata(fixfile) 72 | dt = as.character(info$datatype) 73 | rmsefile = sub("archv.tif","cloud_rmse.csv",fixfile) 74 | rmse = as.numeric(read.table(rmsefile,sep=",")[3]) 75 | runit = as.numeric(rmse > 0.75 & dt == "L1T") 76 | return(runit) 77 | } 78 | 79 | 80 | 81 | runit = run_the_file(fixfile) 82 | if(runit == 0){return(0)} 83 | 84 | print(paste('Working on image:', basename(fixfile))) 85 | 86 | #read in the fix image 87 | #fiximg = raster(fixfile, band=4) #load the fix image 88 | fiximg = brick(fixfile) #load the fix image 89 | origfiximg = subset(fiximg, subset=4) #save a copy of the original fix image so that the tie point coords can be assigned to the original row and cols 90 | 91 | #shift the fiximg if there is an initial offset provided 92 | shiftit = refstart - fixstart 93 | if(sum(shiftit) != 0){fiximg = shift(fiximg, x=shiftit[1], y=shiftit[2])} 94 | 95 | #load the ref image 96 | refimg = raster(reffile, 4) 97 | 98 | #make sure that the ref and fix img are croppd to eachother 99 | extent(fiximg) = alignExtent(fiximg, refimg, snap="near") 100 | ext = intersect(extent(refimg), extent(fiximg)) 101 | refimg = crop(refimg, ext) 102 | fiximg = crop(fiximg, ext) 103 | #refimg = intersect(refimg, fiximg) 104 | #fiximg = intersect(fiximg, refimg) 105 | 106 | #fiximgb1 = raster(fixfile, band=1) 107 | #fiximgb4 = raster(fixfile, band=4) 108 | #get bands 1 and 4 out for cloud and shadow id in the fix image, as well as for cross correlation with the reference image 109 | fiximgb1 = subset(fiximg, subset=1) 110 | fiximgb4 = subset(fiximg, subset=4) 111 | fiximg = fiximgb4 #need to copy because fiximg will be scaled, but we also need an unaltered copy to find shadows in 112 | refimgb4 = refimg # need to copy 
because refimg will be scaled, but we also need an unalered copy to find background pixel in 113 | 114 | #calculate similarity index input values from the fix image subset 115 | values(fiximg) = scaleit(values(fiximg)) 116 | values(fiximg)[is.na(values(fiximg))] = 0 117 | 118 | #calculate similarity index input values from the ref image subset 119 | values(refimg) = scaleit(values(refimg)) 120 | values(refimg)[is.na(values(refimg))] = 0 121 | refimgsqr = refimg^2 122 | 123 | #get the resolution 124 | reso = xres(refimg) 125 | 126 | #adjust the window and search size so that they are odd numbers 127 | if (search %% 2 == 0){search = search+1} 128 | 129 | #sample the reference image, laying down a regular grid of points to check 130 | s = sampleRegular(refimg, 15000, cells=T)[,1] 131 | xy = xyFromCell(refimg,s) #[,1] #get the xy coordinates for each good point 132 | 133 | #filter points in fiximg that fall on clouds 134 | thesecells = na.omit(cellFromXY(fiximgb1, xy)) #get fiximg cell index for sample 135 | b = which(fiximgb1[thesecells] < 120 & 136 | fiximgb4[thesecells] > 30 & 137 | refimgb4[thesecells] > 0) # fiximgb1[theseones] != NA) #exclude points that don't meet criteria 138 | 139 | #if the number of sample points is less than 10 delete the image return 140 | 141 | #TODO - get out if there are not enough points to work with - there are examples of for doing this below 142 | #if(length(b) < 10){ 143 | # delete_files(fixfile, 2) 144 | # return(0) 145 | #} 146 | 147 | #subset the sample 148 | n_samp = length(b) 149 | cat(paste("n points from original sample:",n_samp),"\n") 150 | n_subsamp = 1200 151 | if(n_samp < n_subsamp){n_subsamp = n_samp} 152 | subsamp = sample(b, n_subsamp) 153 | xy = xy[subsamp,] #subset the original sample 154 | s = s[subsamp] #subset the original sample 155 | rowcol = rowColFromCell(refimg, s) 156 | 157 | #make an array to hold all information collected 158 | info = cbind(c(0),xy, rowcol[,2], rowcol[,1], array(0, c(length(xy[,1]), 8))) 159 | cnames = c("point","refx","refy","refcol","refrow","fixx","fixy","fixcol","fixrow","nmax", "max","edgedist","decision") 160 | colnames(info) = cnames 161 | 162 | #iterate process of creating a similarity surface for each check point in the sample 163 | window_size = c(101,201,275) 164 | for(size in 1:3){ 165 | cat(paste("Working on window size set: ",size,"/3",sep=""),"\n") 166 | if(mode != "rmse"){ 167 | if(size == 1){pdf_file = sub("archv.tif", "ccc_surface_100w.pdf",fixfile)} 168 | if(size == 2){pdf_file = sub("archv.tif", "ccc_surface_200w.pdf",fixfile)} 169 | if(size == 3){pdf_file = sub("archv.tif", "ccc_surface_275w.pdf",fixfile)} 170 | unlink(pdf_file) #delete the pdf if it exists 171 | pdf(file=pdf_file, width=10, heigh=7) #size of the pdf page 172 | par(mfrow=c(2,3)) #number of trajectories to place on a page (columns, rows) 173 | } 174 | window = window_size[size] 175 | 176 | #adjust the window and search size so that they are odd numbers 177 | if (window %% 2 == 0){window = window+1} 178 | radius = floor(window/2) #radius of the window in pixels 179 | nrc = search+(radius*2) #the reference extent length to slide over 180 | 181 | 182 | for(point in 1:length(info[,1])){ 183 | #print(point) #print the point so we know where we're at 184 | if(info[point,"decision"] == 1){ 185 | # print("already good, skipping...") 186 | next 187 | } 188 | if(size == 1){info[point,"point"] = point} #info[point,1] = point #put the point number into the info table 189 | 190 | #make a subset of the reference image for the fiximg chip to 
slide over 191 | a = make_kernal(refimg, info[point,2:3], nrc) 192 | test = c(a@ymax,a@ymin,a@xmin,a@xmax) 193 | 194 | if(sum(is.na(test)) > 0){next} 195 | if(sum(test < 0) > 0){next} 196 | if(a@ymax > nrow(refimg) | a@ymin > nrow(refimg)){next} 197 | if(a@xmax > ncol(refimg) | a@xmin > ncol(refimg)){next} 198 | ext=extent(refimg,a@ymin,a@ymax,a@xmin,a@xmax) 199 | refsub = crop(refimg, ext) 200 | 201 | #make subset of fiximg (fiximg chip) 202 | a = make_kernal(fiximg, info[point,2:3], window) 203 | test = c(a@ymax,a@ymin,a@xmin,a@xmax) 204 | if(sum(is.na(test)) > 0){next} 205 | if(sum(test < 0) > 0){next} 206 | if(a@ymax > nrow(fiximg) | a@ymin > nrow(fiximg)){next} 207 | if(a@xmax > ncol(fiximg) | a@xmin > ncol(fiximg)){next} 208 | ext=extent(fiximg,a@ymin,a@ymax,a@xmin,a@xmax) 209 | fixsub = crop(fiximg, ext) 210 | 211 | #create numerator 212 | tofix = matrix(values(fixsub),ncol=window,byrow = T) 213 | 214 | if(length(tofix) %% 2 == 0) { 215 | #cat("Skipping","\n") 216 | next 217 | } 218 | 219 | num = focal(refsub, w=tofix ,fun=sum) 220 | 221 | #get refimg denom 222 | a = make_kernal(refimgsqr, info[point,2:3], nrc) 223 | ext=extent(refimgsqr,a@ymin,a@ymax,a@xmin,a@xmax) 224 | refsubsqr = crop(refimgsqr, ext) 225 | sumrefsubsqr = focal(refsubsqr, w=matrix(1,window, window)) #get the summed product of the refsubimg 226 | sumfixsubsqr = sum(values(fixsub)^2) #fiximg standard only gets calcuated once 227 | denom = sqrt(sumfixsubsqr*sumrefsubsqr) 228 | 229 | #badone=0 230 | if(cellStats(num, stat="sum") + cellStats(denom, stat="sum") == 0){next} 231 | 232 | ncc = num/denom 233 | buf = (nrow(ncc)-search)/2 234 | off1 = buf+1 235 | off2 = buf+search 236 | ext = extent(ncc,off1,off2,off1,off2) 237 | ncc = crop(ncc,ext) 238 | nccv = values(ncc) 239 | nccm = matrix(nccv, ncol=sqrt(length(nccv)), byrow=T) 240 | 241 | x = y = seq(1,ncol(nccm),1) 242 | 243 | good = which(values(ncc) == maxValue(ncc))[1] 244 | 245 | #### 246 | xoffsetcoord = xFromCell(ncc, good) 247 | yoffsetcoord = yFromCell(ncc, good) 248 | xoffset = xoffsetcoord - info[point,"refx"] 249 | yoffset = yoffsetcoord - info[point,"refy"] 250 | info[point,"fixx"] = xoffsetcoord-(xoffset*2) 251 | info[point,"fixy"] = yoffsetcoord-(yoffset*2) 252 | #### 253 | 254 | #get the row and column numbers for the fix image 255 | origfiximg_x = info[point,"fixx"]-shiftit[1] 256 | origfiximg_y = info[point,"fixy"]-shiftit[2] 257 | a = cellFromXY(origfiximg, c(origfiximg_x,origfiximg_y)) 258 | fiximgrc = rowColFromCell(origfiximg, a) 259 | info[point,"fixcol"] = fiximgrc[2] #info[point,8] = fiximgrc[2] 260 | info[point,"fixrow"] = fiximgrc[1] #info[point,9] = fiximgrc[1] 261 | 262 | 263 | ############################################################# 264 | #screen by outlier 265 | ccc = scaleit(nccm) 266 | maxmat = max(ccc, na.rm = T) 267 | rowcol = which(ccc == maxmat, arr.ind=T) 268 | r = ccc[rowcol[1],] 269 | c = ccc[,rowcol[2]] 270 | rmax = which(r == maxmat) #max(r) 271 | cmax = which(c == maxmat) #max(c) 272 | 273 | dist1 = abs(c((rmax - 1), (rmax - length(r)), (cmax - 1), (cmax - length(c))))/(floor(search/2)) 274 | dist2 = min(dist1) 275 | 276 | #place filtering values in info table 277 | info[point,"nmax"] = length(which(ccc == maxmat)) #2 #info[point,10] = length(which(ccc == maxmat)) #2 278 | info[point,"max"] = round(maxmat, digits=1) #3 #info[point,11] = round(maxmat, digits=1) #3 279 | info[point,"edgedist"] = round(dist2, digits=2) #6 #info[point,14] = round(dist2, digits=2) #6 280 | 281 | #decide what surfaces are good\bad 282 | bad = 
array(0,3) 283 | bad[1] = info[point,"nmax"] > 1 #number of max peaks eq 1 284 | bad[2] = info[point,"max"] < 3 #peak ge to 3 standard devs from mean 285 | bad[3] = info[point,"edgedist"] < 0.12 #peak distance from edge >= 0.12 286 | info[point,"decision"] = sum(bad) == 0 #7 287 | #if(badone == 1){info[point,"decision"] = 0} 288 | 289 | #filter plots that will crash the plotting because of weird data points (na, NaN, (-)Inf) 290 | bad1 = is.na(ccc) 291 | bad2 = is.infinite(ccc) 292 | bad3 = is.nan(ccc) 293 | badindex = which(bad1 == T | bad2 == T | bad3 == T) 294 | ccc[badindex] = 0 295 | if(length(which(ccc == 0)) == length(ccc)){next} 296 | 297 | #plot the cross correlation surface 298 | #title = paste(point,info$nmax[point],info$max[point],info$edgedist[point], info$decision[point], sep = ",") 299 | if(info[point,"decision"] == 1){status = "accept"} else {status = "reject"} 300 | title = paste("Point:", point, status) 301 | #print(title) 302 | if(mode != "rmse"){ 303 | ccc = ccc[nrow(ccc):1,] 304 | persp(x, y, ccc, theta = 30, phi = 30, expand = 0.5, col = 8, main=title) 305 | } 306 | } 307 | cat(paste("n goods =",length(which(info[,"decision"] == 1))),"\n") 308 | if(mode != "rmse"){ 309 | dev.off() #turn off the plotting device 310 | } 311 | } 312 | #print(paste("n goods =",length(which(info[,"decision"] == 1)))) 313 | 314 | #write all the point info to a file 315 | if(mode != "rmse"){ 316 | info_file = sub("archv.tif", "info_full.csv",fixfile) 317 | write.csv(info, file=info_file, row.names = F) 318 | } 319 | 320 | #subset the points that passed the surface tests 321 | these = which(info[,"decision"] == 1) 322 | 323 | # make file info for rmse - not used if not an rmse run 324 | rmse_outfile = file.path(dirname(fixfile),paste(substr(basename(fixfile),1,16),"_rmse.Rdata",sep="")) 325 | rmse_info = list(calc_rmse = F, x_rmse=NA, y_rmse=NA, total_rmse=NA, info=info) 326 | 327 | 328 | 329 | #if the number of sample points is less than 10 delete the image and return 330 | if(length(these) < 10){ 331 | delete_files(fixfile, 2) 332 | return(0) 333 | } 334 | 335 | info = info[these,] 336 | 337 | #filter points based on rmse contribution 338 | if(mode != "rmse"){ 339 | rmse = calc_rmse(info,reso) 340 | r = rmse$r 341 | sdr = sd(r) 342 | meanr = mean(r) 343 | limit = meanr+sdr*2 344 | goods = which(r <= limit) 345 | n_outliers = nrow(info)-length(goods) 346 | info = info[goods,] 347 | cat(paste("Getting rid of:",n_outliers,"outliers"),"\n") 348 | cat(paste("There are still:",nrow(info),"points"),"\n") 349 | #maxr = endit = 10 350 | #while(maxr >2 & endit != 0){ 351 | # rmse = calc_rmse(info,reso) 352 | # if (rmse$total_rmse != 0){contr = rmse$r/rmse$total_rmse} else contr = rmse$r #error contribution of each point 353 | # maxr = max(contr) #while loop controler 354 | # b = which(contr < 2) #subset finder - is point 2 times or greater in contribution 355 | # info = info[b,] #subset the info based on good rsme 356 | # endit = sum(contr[b]) #while loop controler 357 | #} 358 | } 359 | 360 | #if the number of sample points is less than 10 delete the image and return 361 | if(length(these) < 10){ 362 | delete_files(fixfile, 2) 363 | return(0) 364 | } 365 | 366 | #if this is an rmse run, then save the info and get out 367 | if(mode == "rmse"){ 368 | rmse = calc_rmse(info,reso) 369 | 370 | rmse_info$calc_rmse = T 371 | rmse_info$x_rmse=rmse$x_rmse 372 | rmse_info$y_rmse=rmse$y_rmse 373 | rmse_info$total_rmse=rmse$total_rmse 374 | rmse_info$info=info 375 | 376 | save(rmse_info, file=rmse_outfile) 377 | 
return(0) 378 | } 379 | 380 | #write out the filtered points that will be used in the transformation 381 | info_file = sub("archv.tif", "info_sub.csv",fixfile) 382 | write.csv(info, file=info_file, row.names = F) 383 | 384 | #make some output file names 385 | tempname = sub("archv.tif", "temp.tif", fixfile) #"K:/scenes/034032/images/1976/LM10360321976248_archv_l1g_warp.tif" 386 | #outfile = sub("archv.tif", "archv_l1g2l1t.tif", fixfile) 387 | wktfile = sub("archv.tif","wkt.txt", fixfile) 388 | #gcpfile = sub("archv.tif", "gcp.txt", fixfile) 389 | gdaltransoptsfile = sub("archv.tif", "gdal_trans_opts.txt", fixfile) 390 | 391 | #write out a projection file for gdal translate to use 392 | proj = system(paste("gdalsrsinfo -o wkt", fixfile), intern = TRUE) 393 | write(proj, wktfile) 394 | 395 | #create the warp cmd and save as a file 396 | gdaltrans_opts = make_gdaltrans_opts(info, wktfile, fixfile, tempname) 397 | write(gdaltrans_opts, file=gdaltransoptsfile) 398 | 399 | #run the warp command file 400 | cmd = paste("gdal_translate --optfile", gdaltransoptsfile) 401 | system(cmd) 402 | 403 | #gdal warp command 404 | gdalwarp_cmd = paste("gdalwarp -of Gtiff", paste("-", method, sep=""), "-ot Byte -srcnodata 0 -dstnodata 0 -co INTERLEAVE=BAND -overwrite -multi -tr", reso, reso, tempname, fixfile) #fixfile "-tps" "-order 2", "-order 3" 405 | system(gdalwarp_cmd) 406 | 407 | 408 | # ######################## warping method tests##################################################### 409 | # outfiletest = sub("archv.tif", "archv_l1g2l1t_test_tps.tif", fixfile) 410 | # gdalwarp_cmd = paste("gdalwarp -of Gtiff -tps -ot Byte -srcnodata 0 -dstnodata 0 -co INTERLEAVE=BAND -overwrite -multi -tr", reso, reso, tempname, outfiletest) #fixfile "-tps" "-order 2", "-order 3" 411 | # system(gdalwarp_cmd) 412 | # 413 | # outfiletest = sub("archv.tif", "archv_l1g2l1t_test_order1.tif", fixfile) 414 | # gdalwarp_cmd = paste("gdalwarp -of Gtiff -order 1 -ot Byte -srcnodata 0 -dstnodata 0 -co INTERLEAVE=BAND -overwrite -multi -tr", reso, reso, tempname, outfiletest) #fixfile "-tps" "-order 2", "-order 3" 415 | # system(gdalwarp_cmd) 416 | # 417 | # outfiletest = sub("archv.tif", "archv_l1g2l1t_test_order2.tif", fixfile) 418 | # gdalwarp_cmd = paste("gdalwarp -of Gtiff -order 2 -ot Byte -srcnodata 0 -dstnodata 0 -co INTERLEAVE=BAND -overwrite -multi -tr", reso, reso, tempname, outfiletest) #fixfile "-tps" "-order 2", "-order 3" 419 | # system(gdalwarp_cmd) 420 | # ################################################################################################## 421 | 422 | 423 | #delete the temp file 424 | unlink(list.files(dirname(fixfile), pattern = "temp", full.names = T)) 425 | return(1) 426 | } -------------------------------------------------------------------------------- /R/msswarp_old.r: -------------------------------------------------------------------------------- 1 | #' Spatially warp an MSS image 2 | #' 3 | #' Spatially warp an MSS image to match the spatial properties of a reference image 4 | #' @param reffile character. MSS image file that has low spatial RMSE and low cloud cover 5 | #' @param fixfile character. MSS image file to be spatially warped to match the reference file 6 | #' @param window numeric. image subset size used to define cross-correlation calculation. unit is pixels along one side of a square 7 | #' @param search numeric. neighborhood search window size in which to find tie-point offset. unit is pixels along one side of a square 8 | #' @param sample numeric. 
target number of tie-points 9 | #' @param refstart numeric. c(xcoord,ycoord). reference image coordinate for a pixel identified as common in both the reference and the to-be-warped image. used to calculate an initial offset between the two images. 10 | #' @param fixstart numeric. c(xcoord,ycoord). fix image coordinate for a pixel identified as common in both the reference and the to-be-warped image. used to calculate an initial offset between the two images. 11 | #' @import raster 12 | #' @import gdalUtils 13 | 14 | 15 | msswarp_old = function(reffile, fixfile, window=275, search=27, sample=1000, refstart=c(0,0), fixstart=c(0,0)){ 16 | 17 | 18 | #scale image values to center on mean and 1 unit variance (global mean and variance) 19 | scaleit = function(matrix){ 20 | stnrd = (matrix - (mean(matrix, na.rm = TRUE)))/(sd(matrix, na.rm = TRUE)) 21 | return(stnrd) 22 | } 23 | 24 | #make a kernal around a given point 25 | make_kernal = function(img, point1, windowsize){ 26 | radius = floor(windowsize/2) 27 | ccol = colFromX(img, point1[1]) 28 | crow = rowFromY(img, point1[2]) 29 | mincol = ccol-radius 30 | maxcol = ccol+radius 31 | minrow = crow+radius 32 | maxrow = crow-radius 33 | return(extent(c(mincol,maxcol,minrow,maxrow))) 34 | } 35 | 36 | #check to see if the image should be run 37 | info = get_metadata(fixfile) 38 | dt = as.character(info$datatype) 39 | rmsefile = sub("archv.tif","cloud_rmse.csv",fixfile) 40 | rmse = as.numeric(read.table(rmsefile,sep=",")[3]) 41 | runit = as.numeric(rmse > 0.5 & dt == "L1T") 42 | if(runit == 1){ 43 | #read in the fix image 44 | fiximg = raster(fixfile, band=3) #load the fix image 45 | origfiximg = fiximg #save a copy of the original fix image 46 | fiximgb1 = raster(fixfile, band=1) 47 | 48 | #shift the fiximg if there is an initial offset provided 49 | shiftit = refstart - fixstart 50 | if(sum(shiftit) != 0){fiximg = shift(fiximg, x=shiftit[1], y=shiftit[2])} 51 | 52 | #load the ref image 53 | refimg = raster(reffile, 3) 54 | 55 | #make sure that the ref and fix img are croppd to eachother 56 | extent(fiximg) = alignExtent(fiximg, refimg, snap="near") 57 | ext = extent(intersect(refimg, fiximg)) 58 | refimg = crop(refimg, ext) 59 | fiximg = crop(fiximg, ext) 60 | #refimg = intersect(refimg, fiximg) 61 | #fiximg = intersect(fiximg, refimg) 62 | 63 | #calculate similarity index input values from the fix image subset 64 | values(fiximg) = scaleit(values(fiximg)) 65 | values(fiximg)[is.na(values(fiximg))] = 0 66 | 67 | #calculate similarity index input values from the ref image subset 68 | values(refimg) = scaleit(values(refimg)) 69 | values(refimg)[is.na(values(refimg))] = 0 70 | refimgsqr = refimg^2 71 | 72 | #get the resolution 73 | reso = xres(refimg) 74 | 75 | #adjust the window and search size so that they are odd numbers 76 | if (window %% 2 == 0){window = window+1} 77 | if (search %% 2 == 0){search = search+1} 78 | radius = floor(window/2) #radius of the window in pixels 79 | nrc = search+(radius*2) #the reference extent length to slide over 80 | 81 | #sample the the reference image, laying down a regular grid of points to check 82 | s = sampleRegular(refimg, sample, cells=T) 83 | s = s[,1] 84 | xy = xyFromCell(refimg,s) #[,1] #get the xy coordinates for each good point 85 | 86 | #filter points in fiximg that fall on clouds 87 | theseones = cellFromXY(fiximgb1, xy) #get fiximg cell index for sample 88 | theseones = na.omit(theseones) 89 | a = fiximgb1[theseones] #extract values for fiximg cell sample 90 | b = which(fiximgb1[theseones] < 100) # 
fiximgb1[theseones] != NA) #exclude points that don't meet criteria 91 | 92 | #if the number of sample points is less than 10 delete the image return 93 | if(length(b) < 10){ 94 | delete_files(fixfile, 2) 95 | return(0) 96 | } 97 | 98 | #subset the sample 99 | xy = xy[b,] #subset the original sample 100 | s = s[b] #subset the original sample 101 | rowcol = rowColFromCell(refimg, s) 102 | 103 | #make an array to hold all information collected 104 | info = cbind(c(0),xy, rowcol[,2], rowcol[,1], array(0, c(length(xy[,1]), 8))) 105 | cnames = c("point","refx","refy","refcol","refrow","fixx","fixy","fixcol","fixrow","nmax", "max","edgedist","decision") 106 | colnames(info) = cnames 107 | 108 | #start a pdf file to hold image chips and similarity surfaces 109 | pdf_file = sub("archv.tif", "ccc_surface.pdf",fixfile) 110 | unlink(pdf_file) #delete the pdf if it exists 111 | pdf(file=pdf_file, width=10, heigh=7) #size of the pdf page 112 | par(mfrow=c(2,3)) #number of trajectories to place on a page (columns, rows) 113 | 114 | #iterate process of creating a similarity surface for each check point in the sample 115 | for(point in 1:length(info[,1])){ 116 | print(point) #print the point so we know where we're at 117 | info[point,"point"] = point #info[point,1] = point #put the point number into the info table 118 | 119 | #make a subset of the reference image for the fiximg chip to slide over 120 | a = make_kernal(refimg, info[point,2:3], nrc) 121 | test = c(a@ymax,a@ymin,a@xmin,a@xmax) 122 | 123 | if(sum(is.na(test)) > 0){next} 124 | if(sum(test < 0) > 0){next} 125 | if(a@ymax > nrow(refimg) | a@ymin > nrow(refimg)){next} 126 | if(a@xmax > ncol(refimg) | a@xmin > ncol(refimg)){next} 127 | ext=extent(refimg,a@ymax,a@ymin,a@xmin,a@xmax) 128 | refsub = crop(refimg, ext) 129 | 130 | #make subset of fiximg (fiximg chip) 131 | a = make_kernal(fiximg, info[point,2:3], window) 132 | test = c(a@ymax,a@ymin,a@xmin,a@xmax) 133 | if(sum(is.na(test)) > 0){next} 134 | if(sum(test < 0) > 0){next} 135 | if(a@ymax > nrow(fiximg) | a@ymin > nrow(fiximg)){next} 136 | if(a@xmax > ncol(fiximg) | a@xmin > ncol(fiximg)){next} 137 | ext=extent(fiximg,a@ymax,a@ymin,a@xmin,a@xmax) 138 | fixsub = crop(fiximg, ext) 139 | 140 | #create numerator 141 | tofix = matrix(values(fixsub),ncol=window,byrow = T) 142 | 143 | if (length(tofix) %% 2 == 0) { 144 | print("skipping") 145 | next 146 | } 147 | 148 | num = focal(refsub, w=tofix ,fun=sum) 149 | 150 | #get refimg denom 151 | a = make_kernal(refimgsqr, info[point,2:3], nrc) 152 | ext=extent(refimgsqr,a@ymax,a@ymin,a@xmin,a@xmax) 153 | refsubsqr = crop(refimgsqr, ext) 154 | sumrefsubsqr = focal(refsubsqr, w=matrix(1,window, window)) #get the summed product of the refsubimg 155 | sumfixsubsqr = sum(values(fixsub)^2) #fiximg standard only gets calcuated once 156 | denom = sqrt(sumfixsubsqr*sumrefsubsqr) 157 | 158 | badone=0 159 | if(cellStats(num, stat="sum") + cellStats(denom, stat="sum") == 0){next} 160 | 161 | ncc = num/denom 162 | buf = (nrow(ncc)-search)/2 163 | off1 = buf+1 164 | off2 = buf+search 165 | ext = extent(ncc,off1,off2,off1,off2) 166 | ncc = crop(ncc,ext) 167 | nccv = values(ncc) 168 | nccm = matrix(nccv, ncol=sqrt(length(nccv)), byrow=T) 169 | 170 | x = y = seq(1,ncol(nccm),1) 171 | 172 | good = which(values(ncc) == maxValue(ncc)) 173 | good = good[1] 174 | 175 | #### 176 | xoffsetcoord = xFromCell(ncc, good) 177 | yoffsetcoord = yFromCell(ncc, good) 178 | xoffset = xoffsetcoord - info[point,"refx"] 179 | yoffset = yoffsetcoord - info[point,"refy"] 180 | 
info[point,"fixx"] = xoffsetcoord-(xoffset*2) 181 | info[point,"fixy"] = yoffsetcoord-(yoffset*2) 182 | #### 183 | 184 | 185 | #get the row and column numbers for the fix image 186 | origfiximg_x = info[point,"fixx"]-shiftit[1] 187 | origfiximg_y = info[point,"fixy"]-shiftit[2] 188 | #a = cellFromXY(origfiximg, c(info[point,"refx"],info[point,"refy"])) #fiximg 189 | a = cellFromXY(origfiximg, c(origfiximg_x,origfiximg_y)) 190 | fiximgrc = rowColFromCell(origfiximg, a) 191 | info[point,"fixcol"] = fiximgrc[2] #info[point,8] = fiximgrc[2] 192 | info[point,"fixrow"] = fiximgrc[1] #info[point,9] = fiximgrc[1] 193 | 194 | 195 | ############################################################# 196 | #screen by outlier 197 | ccc = scaleit(nccm) 198 | maxmat = max(ccc, na.rm = T) 199 | rowcol = which(ccc == maxmat, arr.ind=T) 200 | r = ccc[rowcol[1],] 201 | c = ccc[,rowcol[2]] 202 | rmax = which(r == maxmat) #max(r) 203 | cmax = which(c == maxmat) #max(c) 204 | 205 | dist1 = abs(c((rmax - 1), (rmax - length(r)), (cmax - 1), (cmax - length(c))))/(floor(search/2)) 206 | dist2 = min(dist1) 207 | 208 | #place filtering values in info table 209 | info[point,"nmax"] = length(which(ccc == maxmat)) #2 #info[point,10] = length(which(ccc == maxmat)) #2 210 | info[point,"max"] = round(maxmat, digits=1) #3 #info[point,11] = round(maxmat, digits=1) #3 211 | info[point,"edgedist"] = round(dist2, digits=2) #6 #info[point,14] = round(dist2, digits=2) #6 212 | 213 | #decide what surfaces are good\bad 214 | bad = array(0,3) 215 | bad[1] = info[point,"nmax"] > 1 #number of max peaks eq 1 216 | bad[2] = info[point,"max"] < 3 #peak ge to 3 standard devs from mean 217 | bad[3] = info[point,"edgedist"] < 0.12 #peak distance from edge >= 0.12 218 | info[point,"decision"] = sum(bad) == 0 #7 219 | if(badone == 1){info[point,"decision"] = 0} 220 | 221 | #filter plots that will crash the plotting because of weird data points (na, NaN, (-)Inf) 222 | bad1 = is.na(ccc) 223 | bad2 = is.infinite(ccc) 224 | bad3 = is.nan(ccc) 225 | badindex = which(bad1 == T | bad2 == T | bad3 == T) 226 | ccc[badindex] = 0 227 | if(length(which(ccc == 0)) == length(ccc)){next} 228 | 229 | #plot the cross correlation surface 230 | #title = paste(point,info$nmax[point],info$max[point],info$edgedist[point], info$decision[point], sep = ",") 231 | if(info[point,"decision"] == 1){status = "accept"} else {status = "reject"} 232 | title = paste("Point:", point, status) 233 | persp(x, y, ccc, theta = 30, phi = 30, expand = 0.5, col = 8, main=title) 234 | } 235 | 236 | #write all the point info to a file 237 | info_file = sub("archv.tif", "info_full.csv",fixfile) 238 | write.csv(info, file=info_file, row.names = F) 239 | 240 | dev.off() #turn off the plotting device 241 | 242 | #subset the points that passed the surface tests 243 | these = which(info[,"decision"] == 1) 244 | 245 | #if the number of sample points is less than 10 delete the image and return 246 | if(length(these) < 10){ 247 | delete_files(fixfile, 2) 248 | return(0) 249 | } else { 250 | 251 | info = info[these,] 252 | 253 | #filter points based on rmse contribution 254 | maxr = endit = 10 255 | while(maxr >2 & endit != 0){ 256 | xresid = (info[,"refx"]-info[,"fixx"])^2 #get the residuals of each x 257 | yresid = (info[,"refy"]-info[,"fixy"])^2 #get the residuals of each y 258 | r = (sqrt(xresid+yresid))/reso #get the rmse of each xy point 259 | totx = (1/length(info[,"refx"]))*(sum(xresid)) #intermediate step 260 | toty = (1/length(info[,"refy"]))*(sum(yresid)) #intermediate step 261 | tot = 
sqrt(totx+toty)/reso #total rmse including all points 262 | if (tot != 0){contr = r/tot} else contr = r #error contribution of each point 263 | maxr = max(contr) #while loop controler 264 | b = which(contr < 2) #subset finder - is point 2 times or greater in contribution 265 | info = info[b,] #subset the info based on good rsme 266 | endit = sum(contr[b]) #while loop controler 267 | } 268 | 269 | #write out the filtered points that will be used in the transformation 270 | info_file = sub("archv.tif", "info_sub.csv",fixfile) 271 | write.csv(info, file=info_file, row.names = F) 272 | 273 | #adjust so that the coord is center of pixel 274 | info[,"fixx"] = info[,"fixx"]+(reso/2) 275 | info[,"fixy"] = info[,"fixy"]-(reso/2) 276 | 277 | #get the projection from the fix image 278 | wktfile = sub("archv.tif","wkt.txt", fixfile) 279 | projcmd = paste("gdalsrsinfo -o wkt", fixfile) 280 | proj = system(projcmd, intern = TRUE) 281 | write(proj, wktfile) 282 | 283 | #get the gcp string made 284 | fixcol = paste(info[,"fixcol"]) #fix col for tie point 285 | fixrow = paste(info[,"fixrow"]) #fix row for tie point 286 | refx = paste(info[,"refx"]) #fix x tie point coord 287 | refy = paste(info[,"refy"]) #fix y tie point coord 288 | gcpstr = paste(" -gcp", fixcol, fixrow, refx, refy, collapse="") 289 | gcpfile = sub("archv.tif", "gcp.txt", fixfile) 290 | write(paste("reference file =", reffile), file=gcpfile) 291 | write(gcpstr, file=gcpfile, append=T) 292 | 293 | #gdal translate command 294 | tempname = sub("archv", "temp", fixfile) #"K:/scenes/034032/images/1976/LM10360321976248_archv.tif" 295 | gdaltrans_cmd = paste("gdal_translate -of Gtiff -ot Byte -co INTERLEAVE=BAND -a_srs", wktfile, fixfile, tempname, gcpstr) 296 | system(gdaltrans_cmd) 297 | 298 | #gdal warp command 299 | gdalwarp_cmd = paste("gdalwarp -of Gtiff -order 2 -ot Byte -srcnodata 0 -dstnodata 0 -co INTERLEAVE=BAND -overwrite -multi -tr", reso, reso, tempname, fixfile) #fixfile "-tps" "-order 2", "-order 3" 300 | system(gdalwarp_cmd) 301 | 302 | #delete the temp file 303 | unlink(list.files(dirname(fixfile), pattern = "temp", full.names = T)) 304 | return(1) 305 | } 306 | } else {return(0)} 307 | } 308 | 309 | -------------------------------------------------------------------------------- /R/olical.r: -------------------------------------------------------------------------------- 1 | #' Calibrate OLI images to TM images 2 | #' 3 | #' Calibrate OLI images to TM images using linear regression 4 | #' @param oliwrs2dir character. oli WRS-2 scene directory path 5 | #' @param tmwrs2dir character. TM WRS-2 scene directory path 6 | #' @param cores numeric. 
Number of cores to process with options: 1 or 2 7 | #' @export 8 | 9 | 10 | olical = function(oliwrs2dir, tmwrs2dir, cores=2, overwrite=overwrite){ 11 | 12 | olifiles = list.files(oliwrs2dir, "l8sr.tif", recursive=T, full.names=T) 13 | tmfiles = list.files(tmwrs2dir, "tc.tif", recursive=T, full.names=T) 14 | 15 | #pull out oli and tm year 16 | olibase = basename(olifiles) 17 | oliyears = substr(olibase, 10, 13) 18 | tmbase = basename(tmfiles) 19 | tmyears = substr(tmbase, 10, 13) 20 | tmyearday = as.numeric(substr(tmbase, 10, 16)) 21 | 22 | #get overlapping oli/etm+ years 23 | oliuni = unique(oliyears) 24 | notintm = oliuni %in% tmyears 25 | if(sum(notintm) < 1){stop("There is not at least one year of overlapping images between OLI and ETM+ to calibrate on")} 26 | theseoli = which(oliyears %in% tmyears == T) 27 | olifilessub = olifiles[theseoli] 28 | olibase = olibase[theseoli] 29 | oliyears = oliyears[theseoli] 30 | oliyearday = as.numeric(substr(olibase, 10, 16)) 31 | 32 | match = data.frame(oli=olifilessub, etm=NA, stringsAsFactors=FALSE) 33 | for(i in 1:length(olifilessub)){ 34 | closest = order(abs(oliyearday[i]-tmyearday))[1] 35 | match$etm[i] = tmfiles[closest] 36 | } 37 | 38 | #do single pair modeling 39 | print("...single image pair modeling") 40 | if(cores==2){ 41 | cl = makeCluster(cores) 42 | registerDoParallel(cl) 43 | cfun <- function(a, b) NULL 44 | o = foreach(i=1:length(olifilessub), .combine="cfun",.packages="LandsatLinkr") %dopar% olical_single(match$oli[i], match$etm[i], overwrite=overwrite) # 45 | stopCluster(cl) 46 | } else {for(i in 1:length(olifilessub)){olical_single(match$oli[i], match$etm[i], overwrite=overwrite)}} 47 | 48 | #do aggregated modeling 49 | caldir = file.path(oliwrs2dir,"calibration") 50 | print("...aggregate image pair modeling") 51 | cal_oli_tc_aggregate_model(caldir,overwrite=overwrite) 52 | 53 | #predict tc and tca from aggregate model 54 | calagdir = file.path(caldir,"aggregate_model") 55 | bcoef = as.numeric(read.csv(file.path(calagdir,"tcb_cal_aggregate_coef.csv"))[1,3:8]) 56 | gcoef = as.numeric(read.csv(file.path(calagdir,"tcg_cal_aggregate_coef.csv"))[1,3:8]) 57 | wcoef = as.numeric(read.csv(file.path(calagdir,"tcw_cal_aggregate_coef.csv"))[1,3:8]) 58 | 59 | print("...applying model to all oli images") 60 | for(i in 1:length(olifiles)){olisr2tc(olifiles[i],bcoef,gcoef,wcoef,"apply",overwrite=overwrite)} 61 | } 62 | -------------------------------------------------------------------------------- /R/olical_single.r: -------------------------------------------------------------------------------- 1 | #' Calibrate oli images to TM images 2 | #' 3 | #' Calibrate oli images to TM images using linear regression 4 | #' @param oliwrs2dir character. oli WRS-2 scene directory path 5 | #' @param tmwrs2dir character. 
TM WRS-2 scene directory path 6 | #' @import raster 7 | #' @import MASS 8 | #' @export 9 | 10 | 11 | olical_single = function(oli_file, tm_file, overwrite=F){ 12 | 13 | get_intersection = function(files){ 14 | int = intersect(extent(raster(files[1])),extent(raster(files[2]))) 15 | if(length(files) >= 3){for(i in 3:length(files))int = intersect(extent(raster(files[i])), int)} 16 | return(int) 17 | } 18 | 19 | predict_oli_index = function(tbl, outsampfile){ 20 | 21 | #create a multivariable linear model 22 | model = rlm(refsamp ~ b2samp + b3samp + b4samp + b5samp + b6samp + b7samp, data=tbl) # 23 | 24 | tbl$singlepred = round(predict(model)) 25 | write.csv(tbl, outsampfile, row.names=F) 26 | 27 | #plot the regression 28 | r = cor(tbl$refsamp, tbl$singlepred) 29 | coef = rlm(tbl$refsamp ~ tbl$singlepred) 30 | 31 | pngout = sub("samp.csv", "plot.png",outsampfile) 32 | png(pngout,width=700, height=700) 33 | title = paste(tbl$index[1],"linear regression: slope =",paste(signif(coef$coefficients[2], digits=3),",",sep=""), 34 | "y Intercept =",paste(round(coef$coefficients[1], digits=3),",",sep=""), 35 | "r =",signif(r, digits=3)) 36 | plot(x=tbl$singlepred,y=tbl$refsamp, 37 | main=title, 38 | xlab=paste(tbl$oli_img[1],tbl$index[1]), 39 | ylab=paste(tbl$ref_img[1],tbl$index[1])) 40 | abline(coef = coef$coefficients, col="red") 41 | dev.off() 42 | 43 | #return the information 44 | coef_tbl = data.frame(rbind(model$coefficients)) 45 | cnames = c("yint","b2c","b3c","b4c","b5c","b6c","b7c") 46 | colnames(coef_tbl) = cnames 47 | tbls = list(coef_tbl,tbl) 48 | return(tbls) 49 | } 50 | 51 | #define the filenames 52 | oli_sr_file = oli_file 53 | oli_mask_file = sub("l8sr.tif", "cloudmask.tif", oli_sr_file) 54 | ref_tc_file = tm_file 55 | ref_tca_file = sub("tc", "tca", ref_tc_file) 56 | ref_mask_file = sub("tc", "cloudmask", ref_tc_file) 57 | 58 | #make new directory 59 | dname = dirname(oli_sr_file) 60 | oliimgid = substr(basename(oli_sr_file),1,16) 61 | outdir = file.path(substr(dname,1,nchar(dname)-12),"calibration", oliimgid) #-5 62 | dir.create(outdir, showWarnings = F, recursive=T) 63 | 64 | #check to see if single cal has already been run 65 | files = list.files(outdir) 66 | thesefiles = c("tca_cal_plot.png","tcb_cal_plot.png","tcg_cal_plot.png","tcw_cal_plot.png", 67 | "tca_cal_samp.csv","tcb_cal_samp.csv","tcg_cal_samp.csv","tcw_cal_samp.csv") 68 | results = rep(NA,length(thesefiles)) 69 | for(i in 1:length(results)){ 70 | test = grep(thesefiles[i], files) 71 | results[i] = length(test) > 0 72 | } 73 | if(all(results) == T & overwrite == F){return(0)} 74 | 75 | 76 | #load files as raster 77 | oli_sr_img = brick(oli_sr_file) 78 | oli_mask_img = raster(oli_mask_file) 79 | ref_tc_img = brick(ref_tc_file) 80 | ref_tca_img = raster(ref_tca_file) 81 | ref_mask_img = raster(ref_mask_file) 82 | 83 | #align the extents 84 | extent(oli_sr_img) = alignExtent(oli_sr_img, ref_tc_img, snap="near") 85 | extent(oli_mask_img) = alignExtent(oli_mask_img, ref_tc_img, snap="near") 86 | extent(ref_tc_img) = alignExtent(ref_tc_img, ref_tc_img, snap="near") 87 | extent(ref_tca_img) = alignExtent(ref_tca_img, ref_tc_img, snap="near") 88 | extent(ref_mask_img) = alignExtent(ref_mask_img, ref_tc_img, snap="near") 89 | 90 | #crop the images to their intersection 91 | int = get_intersection(c(oli_mask_file,ref_mask_file)) 92 | oli_b5_img = crop(subset(oli_sr_img,5),int) 93 | ref_tca_img = crop(ref_tca_img,int) 94 | oli_mask_img = crop(oli_mask_img,int) 95 | ref_mask_img = crop(ref_mask_img,int) 96 | 97 | #make a composite 
mask 98 | 99 | oli_mask_v = as.vector(oli_mask_img) 100 | ref_mask_v = as.vector(ref_mask_img) 101 | 102 | mask = oli_mask_v*ref_mask_v #make composite mask 103 | oli_mask_v = ref_mask_v = 0 # save memory 104 | 105 | #load oli and etm+ bands 106 | oli_b5_v = as.vector(oli_b5_img) 107 | ref_tca_v = as.vector(ref_tca_img) 108 | 109 | dif = oli_b5_v - ref_tca_v #find the difference 110 | oli_b5_v = ref_tca_v = 0 #save memory 111 | nas = which(mask == 0) #find the bads in the mask 112 | dif[nas] = NA #set the bads in the dif to NA so they are not included in the calc of mean and stdev 113 | stdv = sd(dif, na.rm=T) #get stdev of difference 114 | center = mean(dif, na.rm=T) #get the mean difference 115 | dif = dif < (center+stdv*2) & dif > (center-stdv*2) #find the pixels that are not that different 116 | 117 | 118 | goods = which(dif == 1) 119 | if(length(goods) < 20000){return(0)} 120 | 121 | #random sample 122 | samp = sample(1:length(goods), 20000) 123 | samp = goods[samp] 124 | sampxy = xyFromCell(oli_mask_img, samp) 125 | 126 | #save memory 127 | mask = 0 128 | 129 | #extract the sample pixels from the bands 130 | olisamp = extract(subset(oli_sr_img, 2:7), sampxy) 131 | tcsamp = extract(ref_tc_img, sampxy) 132 | tcasamp = extract(ref_tca_img, sampxy) 133 | 134 | #make sure the values are good for running regression on (diversity) 135 | unib2samp = length(unique(olisamp[,1])) 136 | unib3samp = length(unique(olisamp[,2])) 137 | unib4samp = length(unique(olisamp[,3])) 138 | unib5samp = length(unique(olisamp[,4])) 139 | unib6samp = length(unique(olisamp[,5])) 140 | unib7samp = length(unique(olisamp[,6])) 141 | 142 | unitcbsamp = length(unique(tcsamp[,1])) 143 | unitcgsamp = length(unique(tcsamp[,2])) 144 | unitcwsamp = length(unique(tcsamp[,3])) 145 | unitcasamp = length(unique(tcasamp)) 146 | 147 | 148 | if(unib2samp < 15 | unib3samp < 15 | unib4samp < 15 | unib5samp < 15 | unib6samp < 15 | 149 | unib7samp < 15 | unitcbsamp < 15 | unitcgsamp < 15 | unitcwsamp < 15 | unitcasamp < 15){return()} 150 | 151 | olibname = basename(oli_sr_file) 152 | refbname = basename(ref_tc_file) 153 | refabname = basename(ref_tca_file) 154 | 155 | tcb_tbl = data.frame(olibname,refbname,"tcb",sampxy,tcsamp[,1],olisamp) 156 | tcg_tbl = data.frame(olibname,refbname,"tcg",sampxy,tcsamp[,2],olisamp) 157 | tcw_tbl = data.frame(olibname,refbname,"tcw",sampxy,tcsamp[,3],olisamp) 158 | tca_tbl = data.frame(olibname,refabname,"tca",sampxy,tcasamp,olisamp) 159 | 160 | tcb_tbl = tcb_tbl[complete.cases(tcb_tbl),] 161 | tcg_tbl = tcg_tbl[complete.cases(tcg_tbl),] 162 | tcw_tbl = tcw_tbl[complete.cases(tcw_tbl),] 163 | tca_tbl = tca_tbl[complete.cases(tca_tbl),] 164 | 165 | cnames = c("oli_img","ref_img","index","x","y","refsamp","b2samp","b3samp","b4samp","b5samp","b6samp","b7samp") 166 | colnames(tcb_tbl) = cnames 167 | colnames(tcg_tbl) = cnames 168 | colnames(tcw_tbl) = cnames 169 | colnames(tca_tbl) = cnames 170 | 171 | #predict the indices 172 | #TCB 173 | outsampfile = file.path(outdir,paste(oliimgid,"_tcb_cal_samp.csv",sep="")) 174 | model = predict_oli_index(tcb_tbl, outsampfile) 175 | bcoef = model[[1]] 176 | bsamp = model[[2]] 177 | br = cor(bsamp$refsamp, bsamp$singlepred) 178 | 179 | #TCG 180 | outsampfile = file.path(outdir,paste(oliimgid,"_tcg_cal_samp.csv",sep="")) 181 | model = predict_oli_index(tcg_tbl, outsampfile) 182 | gcoef = model[[1]] 183 | gsamp = model[[2]] 184 | gr = cor(gsamp$refsamp, gsamp$singlepred) 185 | 186 | #TCW 187 | outsampfile = file.path(outdir,paste(oliimgid,"_tcw_cal_samp.csv",sep="")) 
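  #note: as in the TCB and TCG blocks above, predict_oli_index() (defined at the top of this
  #function) fits a robust linear model with MASS::rlm() of the form
  #  refsamp ~ b2samp + b3samp + b4samp + b5samp + b6samp + b7samp
  #writes the paired sample table and a regression scatterplot, and returns the model coefficients
  #and predictions that feed the *_cal_coef.csv files written at the end of this function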
188 | model = predict_oli_index(tcw_tbl, outsampfile) 189 | wcoef = model[[1]] 190 | wsamp = model[[2]] 191 | wr = cor(wsamp$refsamp, wsamp$singlepred) 192 | 193 | #TCA 194 | outsampfile = file.path(outdir,paste(oliimgid,"_tca_cal_samp.csv",sep="")) 195 | model = predict_oli_index(tca_tbl, outsampfile) 196 | acoef = model[[1]] 197 | asamp = model[[2]] 198 | ar = cor(asamp$refsamp, asamp$singlepred) 199 | 200 | #write out the coef files 201 | tcbinfo = data.frame(oli_file=olibname, ref_file=refbname, index="tcb", bcoef, r=br) 202 | tcginfo = data.frame(oli_file=olibname, ref_file=refbname, index="tcg", gcoef, r=gr) 203 | tcwinfo = data.frame(oli_file=olibname, ref_file=refbname, index="tcw", wcoef, r=wr) 204 | tcainfo = data.frame(oli_file=olibname, ref_file=refabname, index="tca", acoef, r=ar) 205 | 206 | tcbcoefoutfile = file.path(outdir,paste(oliimgid,"_tcb_cal_coef.csv",sep="")) 207 | tcgcoefoutfile = file.path(outdir,paste(oliimgid,"_tcg_cal_coef.csv",sep="")) 208 | tcwcoefoutfile = file.path(outdir,paste(oliimgid,"_tcw_cal_coef.csv",sep="")) 209 | tcacoefoutfile = file.path(outdir,paste(oliimgid,"_tca_cal_coef.csv",sep="")) 210 | 211 | write.csv(tcbinfo, tcbcoefoutfile, row.names=F) 212 | write.csv(tcginfo, tcgcoefoutfile, row.names=F) 213 | write.csv(tcwinfo, tcwcoefoutfile, row.names=F) 214 | write.csv(tcainfo, tcacoefoutfile, row.names=F) 215 | } 216 | -------------------------------------------------------------------------------- /R/olisr2tc.r: -------------------------------------------------------------------------------- 1 | #' Create oli TC from surface reflectance and modeled coefficients 2 | #' 3 | #' Create oli TC from surface reflectance and modeled coefficients 4 | #' @param oli_file character. full path name to oli surface reflectance file 5 | #' @param bcoef numeric. numeric array containing the tcb coefficients 6 | #' @param gcoef numeric. numeric array containing the tcg coefficients 7 | #' @param wcoef numeric. numeric array containing the tcw coefficients 8 | #' @param mode. character. 
how to deal with the outputs options: "calibrate" or "apply" 9 | #' @import raster 10 | #' @export 11 | 12 | 13 | olisr2tc = function(oli_file,bcoef,gcoef,wcoef,mode,overwrite=F){ 14 | 15 | if(mode == "calibrate"){ 16 | dir = substr(dirname(oli_file),1,nchar(dirname(oli_file))-12) 17 | tcfiledir = file.path(dir,"calibration","aggregate_model_tc_imgs") 18 | dir.create(tcfiledir, recursive=T, showWarnings=F) 19 | tcfile = file.path(tcfiledir,sub("l8sr.tif","tc.tif", basename(oli_file))) 20 | tcafile = file.path(tcfiledir,sub("l8sr.tif","tca.tif", basename(oli_file))) 21 | } 22 | if(mode == "apply"){ 23 | tcfile = sub("l8sr.tif","tc.tif", oli_file) 24 | tcafile = sub("l8sr.tif","tca.tif", oli_file) 25 | } 26 | 27 | check = file_check(oli_file,"l8sr_tc.tif",overwrite) 28 | if(check == 0){return(0)} 29 | 30 | #tasseled cap 31 | b2 = as.matrix(raster(oli_file, 2)) 32 | b3 = as.matrix(raster(oli_file, 3)) 33 | b4 = as.matrix(raster(oli_file, 4)) 34 | b5 = as.matrix(raster(oli_file, 5)) 35 | b6 = as.matrix(raster(oli_file, 6)) 36 | b7 = as.matrix(raster(oli_file, 7)) 37 | 38 | bright = (b2*bcoef[1])+(b3*bcoef[2])+(b4*bcoef[3])+(b5*bcoef[4])+(b6*bcoef[5])+(b7*bcoef[6]) 39 | green = (b2*gcoef[1])+(b3*gcoef[2])+(b4*gcoef[3])+(b5*gcoef[4])+(b6*gcoef[5])+(b7*gcoef[6]) 40 | wet = (b2*wcoef[1])+(b3*wcoef[2])+(b4*wcoef[3])+(b5*wcoef[4])+(b6*wcoef[5])+(b7*wcoef[6]) 41 | 42 | b2=b3=b4=b5=b6=b7=0 43 | 44 | 45 | tcb = matrix_to_raster(oli_file,bright) 46 | tcg = matrix_to_raster(oli_file,green) 47 | tcw = matrix_to_raster(oli_file,wet) 48 | wet=0 49 | tca = atan(green/bright) * (180/pi) * 100 50 | bright=green=0 51 | 52 | tca = matrix_to_raster(oli_file,tca) 53 | 54 | outbase = substr(basename(oli_file),1,16) 55 | outdir = dirname(oli_file) 56 | temptcb = sub("l8sr.tif","_temptcb.tif", oli_file) 57 | temptcg = sub("l8sr.tif","_temptcg.tif", oli_file) 58 | temptcw = sub("l8sr.tif","_temptcw.tif", oli_file) 59 | projection(tcb) = set_projection(tcfile) 60 | projection(tcg) = set_projection(tcfile) 61 | projection(tcw) = set_projection(tcfile) 62 | tc = as(tcb, "SpatialGridDataFrame") 63 | tcb=0 64 | writeGDAL(tc, temptcb, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 65 | tc = as(tcg, "SpatialGridDataFrame") 66 | tcg=0 67 | writeGDAL(tc, temptcg, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 68 | tc = as(tcw, "SpatialGridDataFrame") 69 | tcw=0 70 | writeGDAL(tc, temptcw, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 71 | 72 | bands = c(temptcb,temptcg,temptcw) 73 | temptcs = sub("l8sr.tif","_temptcstack.vrt", oli_file) 74 | gdalbuildvrt(gdalfile=bands, output.vrt = temptcs, separate=T) #, tr=c(reso,reso) 75 | gdal_translate(src_dataset=temptcs, dst_dataset=tcfile, of = "GTiff", co="INTERLEAVE=BAND") 76 | 77 | projection(tca) = set_projection(tcfile) 78 | tc = as(tca, "SpatialGridDataFrame") 79 | tca=0 80 | writeGDAL(tc, tcafile, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 81 | 82 | #delete temporary files 83 | unlink(c(temptcb,temptcg,temptcw,temptcs)) 84 | 85 | return(1) 86 | } 87 | -------------------------------------------------------------------------------- /R/oliunpackr.r: -------------------------------------------------------------------------------- 1 | #' Decompress, stack, and reproject OLI SR images 2 | #' 3 | #' Decompress, stack, and reproject OLI SR images 4 | #' @param file character. 
full path name of the surface reflectance file 5 | #' @param proj character. PROJ.4 projection definition. 6 | #' @param overwrite logical. True will overwrite the file if it already exists, False will skip processing if output file exists. 7 | #' @import raster 8 | #' @import gdalUtils 9 | #' @import rgdal 10 | #' @export 11 | 12 | 13 | oliunpackr = function(file, proj="default", overwrite=F){ 14 | 15 | #file = "D:/work/proj/llr_dev/collection1/oli/wrs2/044034/targz/LC080440342018020901T1-SC20180309132638.tar.gz" 16 | #proj= "+proj=aea +lat_1=29.5 +lat_2=45.5 +lat_0=23 +lon_0=-96 +x_0=0 +y_0=0 +ellps=GRS80 +datum=NAD83 +units=m +no_defs" 17 | 18 | 19 | # http://earthexplorer.usgs.gov/ Landsat CDR OLI images 20 | 21 | check = file_check(file,"l8sr.tif",overwrite) 22 | if(check == 0){return(0)} 23 | 24 | #make a random work folder 25 | randomstring = paste(sample(c(0:9, letters, LETTERS), 6, replace=TRUE),collapse="") 26 | tempdir = file.path(dirname(file),randomstring) #temp 27 | dir.create(tempdir, recursive=T, showWarnings=F) 28 | 29 | #decompress the image 30 | untar(file, exdir=tempdir, tar="internal") #decompress the file 31 | files = list.files(tempdir, full.names=T) 32 | 33 | #are there files in the decompressed archive 34 | if(length(files) == 0){ 35 | unlink(tempdir, recursive=T, force=T) 36 | stop(paste('There were no files found in the decompressed archive:', file)) 37 | } 38 | 39 | #are these sr files 40 | filesBname = basename(files) 41 | srFilesTest = grep("_sr_", filesBname, value=T) 42 | if(length(srFilesTest) == 0){ 43 | unlink(tempdir, recursive=T, force=T) 44 | stop(paste('There were no USGS ESPA Surface Reflectance files found in the decompressed archive:', file)) 45 | } 46 | 47 | #is this collection 1 or pre-collection 48 | name = filesBname[1] 49 | isCollection = substr(filesBname[1], 5,5) == '_' 50 | 51 | #get the sr bands 52 | bands = sort(grep("band", files, value=T)) 53 | 54 | if(isCollection){ 55 | #new version: LXSS_LLLL_PPPRRR_YYYYMMDD_yyyymmdd_CX_TX_prod_band.ext 56 | year = substr(name,18,21) 57 | pieces = unlist(strsplit(dirname(file), "/")) #break up the directory and unlist so the pieces can be called by index 58 | len = length(pieces)-1 #get the ending index for "scene" 59 | newpieces = paste(pieces[1:len], collapse = "/") #subset the directory pieces so the last piece is the scene 60 | outdir = file.path(newpieces, "images", year) 61 | dir.create(outdir, recursive=T, showWarnings=F) 62 | 63 | # create the mask 64 | pixelqafile = grep("pixel_qa.tif", files, value=T) 65 | pixelqar = getValues(raster(pixelqafile)) 66 | #mask = as.numeric(pixelqar == 322 | pixelqar == 386 | pixelqar == 324 | pixelqar == 388 | pixelqar == 836 | pixelqar == 900) 67 | shadow = bitwAnd(pixelqar, 8) == 0 # shadow 68 | snow = bitwAnd(pixelqar, 16) == 0 # snow 69 | clouds = bitwAnd(pixelqar, 32) == 0 # clouds 70 | mask = shadow*snow*clouds 71 | 72 | clouds = snow = shadow = pixelqar = 0 #clear memory 73 | 74 | # make the basename for final output files 75 | mtlfile = grep("MTL.txt", files, value=T) 76 | tbl = unlist(read.delim(mtlfile, header=F, skipNul=T)) 77 | string = as.character(grep("LANDSAT_SCENE_ID = ", tbl, value=T)) 78 | pieces = unlist(strsplit(string, " ")) 79 | sceneid = pieces[length(pieces)] 80 | outbase = substr(sceneid,1,16) 81 | 82 | } else{ 83 | #name = 'LC80380292015214LGN00.xml' 84 | outbase = substr(name,1,16) # need to get this from the filename, because pre-collection does not include an mtl file 85 | year = substr(name,10,13) 86 | 87 | pieces = 
unlist(strsplit(dirname(file), "/")) #break up the directory and unlist so the pieces can be called by index 88 | len = length(pieces)-1 #get the ending index for "scene" 89 | newpieces = paste(pieces[1:len], collapse = "/") #subset the directory pieces so the last piece is the scene 90 | outdir = file.path(newpieces, "images", year) 91 | dir.create(outdir, recursive=T, showWarnings=F) 92 | 93 | # create the mask 94 | fmask = grep("cfmask.tif", files, value=T) # <= 1 okay background 255 95 | mask = as.matrix(raster(fmask)) 96 | mask = mask <= 1 97 | mask = mask*1 #convert from logial to numeric 98 | } 99 | 100 | tempstack = file.path(tempdir,paste(outbase,"_tempstack.tif",sep="")) 101 | tempvrt = sub("tempstack.tif", "tempstack.vrt", tempstack) 102 | tempmask = sub("tempstack", "tempmask", tempstack) 103 | projstack = sub("tempstack", "projstack", tempstack) 104 | projmask = sub("tempstack", "projmask", tempstack) 105 | finalstack = file.path(outdir,paste(outbase,"_l8sr.tif", sep="")) 106 | finalmask = file.path(outdir,paste(outbase,"_cloudmask.tif", sep="")) 107 | outprojfile = file.path(outdir,paste(outbase,"_proj.txt", sep="")) 108 | 109 | ref = raster(bands[1]) #set a reference raster for setting values and getting projection 110 | origproj = projection(ref) 111 | 112 | #stack the image bands and write out 113 | gdalbuildvrt(gdalfile=bands, output.vrt = tempvrt, separate=T) #, tr=c(reso,reso) 114 | gdal_translate(src_dataset=tempvrt, dst_dataset=tempstack, of = "GTiff", co="INTERLEAVE=BAND") 115 | 116 | mask = setValues(ref,mask) 117 | mask = as(mask, "SpatialGridDataFrame") #convert the raster to SGHF so it can be written using GDAL (faster than writing it with the raster package) 118 | writeGDAL(mask, tempmask, drivername = "GTiff", type = "Byte", mvFlag = 255) 119 | mask = 0 # clear memory 120 | 121 | #reproject the image #need to add in writing proj file for default 122 | if(proj == "default"){proj = origproj} 123 | write(proj, outprojfile) 124 | gdalwarp(srcfile=tempstack, dstfile=projstack, 125 | s_srs=origproj, t_srs=proj, of="GTiff", 126 | r="bilinear", srcnodata=-9999, dstnodata=-32768, multi=T, #"near" 127 | tr=c(30,30), co="INTERLEAVE=BAND") 128 | 129 | #project the mask 130 | gdalwarp(srcfile=tempmask, dstfile=projmask, 131 | s_srs=origproj, t_srs=proj, of="GTiff", 132 | r="mode", srcnodata=255, dstnodata=255, multi=T, 133 | tr=c(30,30), co="INTERLEAVE=BAND") 134 | 135 | 136 | #trim the na rows and cols 137 | trim_na_rowcol(projstack, finalstack, projmask, finalmask) 138 | 139 | #delete temporary files 140 | unlink(tempdir, recursive=T, force=T) 141 | return(1) 142 | } 143 | 144 | -------------------------------------------------------------------------------- /R/prepare_images.r: -------------------------------------------------------------------------------- 1 | #' Prepare image MSS and TM images for calibration/compositing 2 | #' 3 | #' Prepare image MSS and TM images for calibration/compositing 4 | #' @param scenedir character. scene file path 5 | #' @param demfile character. full path to scene-corresponding DEM file 6 | #' @param process numeric. integer or vector specifying which processes to run 1=mssunpackr, 2=msswarp, 3=mssdn2rad, 4=mssatcor, 5=msscvm, 6=tmunpackr 7 | #' @param cores numeric. 
number of cores to use for parallel processing 8 | #' @import foreach 9 | #' @import doParallel 10 | #' @import raster 11 | #' @export 12 | 13 | 14 | prepare_images = function(scenedir, demfile=NULL, proj="default", process=seq(1:5), cores=1, overwrite=F){ 15 | 16 | targzdir = file.path(scenedir,"targz") 17 | imgdir = file.path(scenedir,"images") 18 | 19 | cfun = function(a, b) NULL 20 | 21 | # get available memory and set cores accordingly 22 | if(cores == 2){ 23 | cores = 1 # assume that we only have enough memory for one core 24 | mem = tryCatch(system('wmic OS get FreePhysicalMemory /Value', intern = TRUE, show.output.on.console=F), error=function(e)c(1,2)) 25 | mem = mem[mem != "\r"] # start parsing memory out 26 | if(length(mem) == 1){ 27 | mem = sub('\r','',mem) 28 | if(grep('=', mem) == 1){ 29 | mem = unlist(strsplit(mem, '=')) 30 | if(length(mem) == 2){ 31 | mem = suppressWarnings(as.numeric(mem[2])) 32 | if(!is.na(mem)){ 33 | if((mem/1000000) > 12){ 34 | cores=2 35 | }}}}}} 36 | 37 | 38 | #mssunpackr 39 | if(all(is.na(match(process,1))) == F){ 40 | print("Running mssunpackr") 41 | files = list.files(targzdir,"tar.gz",full.names=T) 42 | t=proc.time() 43 | if(cores == 2){ 44 | print("...in parallel") 45 | cl = makeCluster(cores) 46 | registerDoParallel(cl) 47 | o = foreach(i=1:length(files), .combine="cfun",.packages="LandsatLinkr") %dopar% mssunpackr(files[i], proj=proj, overwrite=overwrite) 48 | stopCluster(cl) 49 | } else {for(i in 1:length(files)){mssunpackr(files[i], proj=proj, overwrite=overwrite)}} 50 | print(proc.time()-t) 51 | } 52 | 53 | #geowarp 54 | if(all(is.na(match(process,2))) == F){ 55 | print("Running msswarp") 56 | files = list.files(imgdir, pattern="archv.tif", full.names=T, recursive=T) 57 | diagfiles = list.files(imgdir, pattern="cloud_rmse.csv", full.names=T, recursive=T) 58 | tbl = do.call(rbind, lapply(diagfiles, read.table, header = F,sep = ',')) 59 | reffile = as.character(tbl[order(round(tbl[,3],digits=1), tbl[,2]),][1,1]) 60 | t = proc.time() 61 | if(cores == 2){ 62 | print("...in parallel") 63 | cl=makeCluster(cores) 64 | registerDoParallel(cl) 65 | o = foreach(i=1:length(files), .combine="cfun",.packages="LandsatLinkr") %dopar% msswarp(reffile=reffile, fixfile=files[i]) 66 | stopCluster(cl) 67 | } else {for(i in 1:length(files)){msswarp(reffile=reffile, fixfile=files[i])}} 68 | print(proc.time()-t) 69 | } 70 | 71 | #convert to toa reflectance 72 | if(all(is.na(match(process,3))) == F){ 73 | print("Running mssdn2refl") 74 | files = list.files(imgdir, pattern="archv.tif", full.names=T, recursive=T) 75 | t = proc.time() 76 | if(cores == 2){ 77 | print("...in parallel") 78 | cl=makeCluster(cores) 79 | registerDoParallel(cl) 80 | o = foreach(i=1:length(files), .combine="cfun",.packages="LandsatLinkr") %dopar% mssdn2refl(files[i], overwrite) 81 | stopCluster(cl) 82 | } else {for(i in 1:length(files)){mssdn2refl(files[i], overwrite)}} 83 | print(proc.time()-t) 84 | } 85 | 86 | #convert to surface reflectance 87 | if(all(is.na(match(process,4))) == F){ 88 | print("Running msscost") 89 | files = list.files(imgdir, pattern="archv.tif", full.names=T, recursive=T) 90 | t = proc.time() 91 | if(cores == 2){ 92 | print("...in parallel") 93 | cl=makeCluster(cores) 94 | registerDoParallel(cl) 95 | o = foreach(i=1:length(files), .combine="cfun",.packages="LandsatLinkr") %dopar% msscost(files[i], overwrite) 96 | stopCluster(cl) 97 | } else {for(i in 1:length(files)){msscost(files[i], overwrite)}} 98 | print(proc.time()-t) 99 | } 100 | 101 | #cloudmask 102 | 
if(all(is.na(match(process,5))) == F){ 103 | print("Running msscvm") 104 | #prepare the topo layers required by msscvm 105 | newdem = prepare_topo(imgdir, demfile) 106 | print("...Making masks") 107 | files = list.files(imgdir, pattern="reflectance", full.names=T, recursive=T) 108 | t = proc.time() 109 | for(i in 1:length(files)){msscvm(files[i], newdem, topoprep=T, test=F, overwrite=overwrite)} 110 | print(proc.time()-t) 111 | } 112 | 113 | #unpack tm 114 | if(all(is.na(match(process,6))) == F){ 115 | print("Running tmunpackr") 116 | files = list.files(targzdir, pattern="tar.gz", full.names=T, recursive=T) 117 | t = proc.time() 118 | if(cores == 2){ 119 | cl=makeCluster(cores) 120 | registerDoParallel(cl) 121 | t = proc.time() 122 | o = foreach(i=1:length(files), .combine="cfun",.packages="LandsatLinkr") %dopar% tmunpackr(files[i], proj=proj, overwrite=overwrite) 123 | stopCluster(cl) 124 | } else {for(i in 1:length(files)){tmunpackr(files[i], proj=proj, overwrite=overwrite)}} 125 | print(proc.time()-t) 126 | } 127 | 128 | #unpack oli 129 | if(all(is.na(match(process,7))) == F){ 130 | print("Running oliunpackr") 131 | files = list.files(targzdir, pattern="tar.gz", full.names=T, recursive=T) 132 | t = proc.time() 133 | if(cores == 2){ 134 | cl=makeCluster(cores) 135 | registerDoParallel(cl) 136 | t = proc.time() 137 | o = foreach(i=1:length(files), .combine="cfun",.packages="LandsatLinkr") %dopar% oliunpackr(files[i], proj=proj, overwrite=overwrite) 138 | stopCluster(cl) 139 | } else {for(i in 1:length(files)){oliunpackr(files[i], proj=proj, overwrite=overwrite)}} 140 | print(proc.time()-t) 141 | } 142 | } 143 | 144 | 145 | 146 | 147 | 148 | -------------------------------------------------------------------------------- /R/prepare_topo.r: -------------------------------------------------------------------------------- 1 | #' Prepare topographic layers: elevation, slope, aspect for use in MSS cloud masking 2 | #' 3 | #' Prepare topographic layers: elevation, slope, aspect for use in MSS cloud masking 4 | #' @param imgdir character. full path to "images" directory for scene 5 | #' @param demfile character. full path to scene-corresponding DEM file 6 | #' @export 7 | 8 | 9 | prepare_topo = function(imgdir, demfile){ 10 | print("...Preparing DEM") 11 | files = list.files(imgdir, pattern="reflectance", full.names=T, recursive=T) #"radiance.tif" 12 | if(length(files) == 0){ 13 | print(paste("Error - could not find any reflectance (*reflectance.tif) files in this directory:",imgdir)) 14 | stop 15 | } 16 | examplefile = files[1] 17 | dname = dirname(examplefile) 18 | scenedir = substr(dname,1,nchar(dname)-12) 19 | topodir = file.path(scenedir,"topo") 20 | dir.create(topodir, showWarnings=F) 21 | info = get_metadata(examplefile) 22 | template = raster(examplefile) 23 | demname = paste(info$wrstype,"_",info$ppprrr,"_60m","_dem.tif",sep="") 24 | newdem = file.path(topodir,demname) 25 | tempdem = sub("dem.tif","temp_dem.tif",newdem) 26 | newslope = file.path(topodir,sub("dem","slope",demname)) 27 | newasp = file.path(topodir,sub("dem","aspect",demname)) 28 | newill = file.path(topodir,sub("dem","illumination",demname)) 29 | s_srs = projection(raster(demfile)) #template 30 | t_srs = set_projection(examplefile) 31 | 32 | demfiles = list.files(topodir,"dem",full.names=T) 33 | unlink(demfiles) 34 | gdalwarp(srcfile=demfile,dstfile=tempdem, 35 | s_srs=s_srs,t_srs=t_srs, tr=c(60,60), dstnodata=-32768, ot="Int16") #should nodata be set here??? 
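# note: the gdalwarp call above resamples the scene DEM to the 60 m MSS grid in the target projection;
# dstnodata=-32768 matches the Int16 output type, and gdalwarp should pick up an internal nodata flag
# from the source DEM automatically, so an explicit srcnodata is likely only needed if the DEM stores
# its fill value without flagging it. The block below collects the extents of all MSS reflectance
# images, pads their union by 1500 m (25 60-m pixels), and checks that the warped DEM covers it before cropping.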
36 | 37 | extholder = matrix(ncol = 4, nrow=length(files)) 38 | print("...Making sure DEM is large enough") 39 | print("......Getting MSS image extents") 40 | for(i in 1:length(files)){ 41 | img = raster(files[i]) 42 | ext = extent(img) 43 | extholder[i,1] = ext@xmin 44 | extholder[i,2] = ext@xmax 45 | extholder[i,3] = ext@ymin 46 | extholder[i,4] = ext@ymax 47 | } 48 | adj=1500 49 | xmin = min(extholder[,1]) - adj 50 | xmax = max(extholder[,2]) + adj 51 | ymin = min(extholder[,3]) - adj 52 | ymax = max(extholder[,4]) + adj 53 | 54 | dem = raster(tempdem) 55 | demext = extent(dem) 56 | 57 | xminokay = demext@xmin <= xmin 58 | xmaxokay = demext@xmax >= xmax 59 | yminokay = demext@ymin <= ymin 60 | ymaxokay = demext@ymax >= ymax 61 | 62 | print(paste(".........DEM x minimum is okay:",xminokay)) 63 | print(paste(".........DEM x maximum is okay:",xmaxokay)) 64 | print(paste(".........DEM y minimum is okay:",yminokay)) 65 | print(paste(".........DEM y maximum is okay:",ymaxokay)) 66 | 67 | if(sum(c(xminokay,xmaxokay,yminokay,ymaxokay)) != 4){ 68 | print("Error - Please make sure DEM has minimum dimensions:") 69 | print(paste("x minimum:", xmin)) 70 | print(paste("x maximum:", xmax)) 71 | print(paste("y minimum:", ymin)) 72 | print(paste("y maximum:", ymax)) 73 | print(paste("For projection:",t_srs)) 74 | return("Stopping LLR") 75 | } 76 | 77 | #crop the dem 78 | print("...Croppping the DEM to MSS image set union plus 25 pixel buffer") 79 | gdal_translate(src_dataset=tempdem, dst_dataset=newdem, projwin=c(xmin,ymax,xmax,ymin)) 80 | tempfiles = list.files(topodir, "temp", full.names=T) 81 | unlink(tempfiles) 82 | dem = raster(newdem) 83 | 84 | #making slope 85 | slopefiles = list.files(topodir,"slope",full.names=T) 86 | unlink(slopefiles) 87 | print("...Preparing Slope") 88 | img = terrain(dem, opt="slope") 89 | projection(img) = set_projection(examplefile) 90 | img = as(img, "SpatialGridDataFrame") 91 | writeGDAL(img, newslope, drivername = "GTiff", type = "Float32", options="INTERLEAVE=BAND") #, mvFlag = -32768 92 | 93 | #making slope aspect 94 | aspfiles = list.files(topodir,"aspect",full.names=T) 95 | unlink(aspfiles) 96 | print("...Preparing Aspect") 97 | img = terrain(dem, opt="aspect") 98 | projection(img) = set_projection(examplefile) 99 | img = as(img, "SpatialGridDataFrame") 100 | writeGDAL(img, newasp, drivername = "GTiff", type = "Float32", options="INTERLEAVE=BAND") #, mvFlag = -32768 101 | 102 | return(newdem) 103 | } 104 | -------------------------------------------------------------------------------- /R/set_projection.r: -------------------------------------------------------------------------------- 1 | #' Set the projection of raster 2 | #' 3 | #' Set the projection of a raster from the corresponding *proj.txt file. This solves a problem with incorrect projection parameters for albers when read by 'raster' 4 | #' @param file Filename of MSS image with DN values 5 | #' @export 6 | 7 | set_projection = function(file){ 8 | projfile = file.path(dirname(file),paste(substr(basename(file),1,16),"_proj.txt",sep="")) 9 | return(readLines(projfile)) 10 | } 11 | -------------------------------------------------------------------------------- /R/tmunpackr.r: -------------------------------------------------------------------------------- 1 | #' Decompress, stack, and reproject TM/ETM+ SR images 2 | #' 3 | #' Decompress, stack, and reproject TM/ETM+ SR images 4 | #' @param file character. full path name of the surface reflectance file 5 | #' @param proj character. 
PROJ.4 projection definition. 6 | #' @param overwrite logical. True will overwrite the file if it already exists, False will skip processing if output file exists. 7 | #' @import raster 8 | #' @import gdalUtils 9 | #' @import rgdal 10 | #' @export 11 | 12 | 13 | tmunpackr = function(file, proj="default", overwrite=F){ 14 | # http://earthexplorer.usgs.gov/ Landsat CDR TM and ETM+ images 15 | 16 | check = file_check(file,"ledaps.tif",overwrite) 17 | if(check == 0){return(0)} 18 | 19 | #set new directories 20 | randomstring = paste(sample(c(0:9, letters, LETTERS), 6, replace=TRUE),collapse="") 21 | tempdir = file.path(dirname(file),randomstring) #temp 22 | dir.create(tempdir, recursive=T, showWarnings=F) 23 | 24 | #decompress the image 25 | untar(file, exdir=tempdir, tar="internal") #decompress the file 26 | files = list.files(tempdir, full.names=T) 27 | 28 | #are there files in the decompressed archive 29 | if(length(files) == 0){ 30 | unlink(tempdir, recursive=T, force=T) 31 | stop(paste('There were no files found in the decompressed archive:', file)) 32 | } 33 | 34 | #are these sr files 35 | filesBname = basename(files) 36 | srFilesTest = grep("_sr_", filesBname, value=T) 37 | if(length(srFilesTest) == 0){ 38 | unlink(tempdir, recursive=T, force=T) 39 | stop(paste('There were no USGS ESPA Surface Reflectance files found in the decompressed archive:', file)) 40 | } 41 | 42 | #is this collection 1 or pre-collection 43 | name = filesBname[1] 44 | isCollection = substr(filesBname[1], 5,5) == '_' 45 | 46 | #get the sr bands 47 | bands = sort(grep("band", files, value=T)) 48 | 49 | 50 | if(isCollection){ 51 | #new version: LXSS_LLLL_PPPRRR_YYYYMMDD_yyyymmdd_CX_TX_prod_band.ext 52 | year = substr(name,18,21) 53 | pieces = unlist(strsplit(dirname(file), "/")) #break up the directory and unlist so the pieces can be called by index 54 | len = length(pieces)-1 #get the ending index for "scene" 55 | newpieces = paste(pieces[1:len], collapse = "/") #subset the directory pieces so the last piece is the scene 56 | outdir = file.path(newpieces, "images", year) 57 | dir.create(outdir, recursive=T, showWarnings=F) 58 | 59 | # create the mask 60 | pixelqafile = grep("pixel_qa.tif", files, value=T) 61 | pixelqar = getValues(raster(pixelqafile)) 62 | #mask = as.numeric(pixelqar == 322 | pixelqar == 386 | pixelqar == 324 | pixelqar == 388 | pixelqar == 836 | pixelqar == 900) 63 | shadow = bitwAnd(pixelqar, 8) == 0 # shadow 64 | snow = bitwAnd(pixelqar, 16) == 0 # snow 65 | clouds = bitwAnd(pixelqar, 32) == 0 # clouds 66 | mask = shadow*snow*clouds 67 | 68 | clouds = snow = shadow = pixelqar = 0 #clear memory 69 | 70 | # make the basename for final output files 71 | mtlfile = grep("MTL.txt", files, value=T) 72 | tbl = unlist(read.delim(mtlfile, header=F, skipNul=T)) 73 | string = as.character(grep("LANDSAT_SCENE_ID = ", tbl, value=T)) 74 | pieces = unlist(strsplit(string, " ")) 75 | sceneid = pieces[length(pieces)] 76 | outbase = substr(sceneid,1,16) 77 | 78 | } else{ 79 | #name = 'LC80380292015214LGN00.xml' 80 | outbase = substr(name,1,16) # need to get this from the filename, because pre-collection does not include an mtl file 81 | year = substr(name,10,13) 82 | 83 | pieces = unlist(strsplit(dirname(file), "/")) #break up the directory and unlist so the pieces can be called by index 84 | len = length(pieces)-1 #get the ending index for "scene" 85 | newpieces = paste(pieces[1:len], collapse = "/") #subset the directory pieces so the last piece is the scene 86 | outdir = file.path(newpieces, "images", year) 87 | 
dir.create(outdir, recursive=T, showWarnings=F) 88 | 89 | #decompress the image and get/set files names 90 | untar(file, exdir=tempdir, tar="internal") #decompress the file 91 | files = list.files(tempdir, full.names=T) 92 | bands = sort(grep("band", files, value=T)) 93 | shadow = grep("cloud_shadow_qa.tif", files, value=T) #0 okay, 255 bad 94 | cloud = grep("sr_cloud_qa.tif", files, value=T) #0 okay, 255 bad 95 | snow = grep("sr_snow_qa.tif", files, value=T) #0 okay, 255 bad 96 | fmask = grep("cfmask.tif", files, value=T) # <= 1 okay background 255 97 | 98 | #make a composite cloudmask 99 | s = as.matrix(raster(shadow)) 100 | c = as.matrix(raster(cloud)) 101 | sn = as.matrix(raster(snow)) 102 | f = as.matrix(raster(fmask)) 103 | 104 | check = s[1,1] # if is.na(check) == T new else old 105 | if(is.na(check) == T){s = is.na(s)} else {s = !is.na(s)} 106 | check = c[1,1] # if is.na(check) == T new else old 107 | if(is.na(check) == T){c = is.na(c)} else {c = !is.na(c)} 108 | check = sn[1,1] # if is.na(check) == T new else old 109 | if(is.na(check) == T){sn = is.na(sn)} else {sn = !is.na(sn)} 110 | f = f <= 1 111 | mask = s*c*f*sn 112 | s=c=sn=f=0 113 | } 114 | 115 | #mask = setValues(ref,mask) 116 | #plot(mask) 117 | #writeRaster(mask, "D:/work/proj/llr_dev/collection1/tm/wrs2/032033/test.tif") 118 | 119 | # create outfile paths 120 | tempstack = file.path(tempdir,paste(outbase,"_tempstack.tif",sep="")) 121 | tempvrt = sub("tempstack.tif", "tempstack.vrt", tempstack) 122 | tempmask = sub("tempstack", "tempmask", tempstack) 123 | projstack = sub("tempstack", "projstack", tempstack) 124 | projmask = sub("tempstack", "projmask", tempstack) 125 | finalstack = file.path(outdir,paste(outbase,"_ledaps.tif", sep="")) 126 | finalmask = file.path(outdir,paste(outbase,"_cloudmask.tif", sep="")) 127 | tcfile = file.path(outdir,paste(outbase,"_tc.tif", sep="")) 128 | tcafile = file.path(outdir,paste(outbase,"_tca.tif", sep="")) 129 | outprojfile = file.path(outdir,paste(outbase,"_proj.txt", sep="")) 130 | 131 | # set a reference raster for setting values and getting projection 132 | ref = raster(bands[1]) 133 | origproj = projection(ref) 134 | 135 | #stack the image bands and write out 136 | gdalbuildvrt(gdalfile=bands, output.vrt = tempvrt, separate=T) #, tr=c(reso,reso) 137 | gdal_translate(src_dataset=tempvrt, dst_dataset=tempstack, of = "GTiff", co="INTERLEAVE=BAND") 138 | 139 | # write the mask out 140 | mask = setValues(ref,mask) 141 | mask = as(mask, "SpatialGridDataFrame") #convert the raster to SGHF so it can be written using GDAL (faster than writing it with the raster package) 142 | writeGDAL(mask, tempmask, drivername = "GTiff", type = "Byte", mvFlag = 255, options="INTERLEAVE=BAND") 143 | 144 | mask=0 #clear the memory 145 | 146 | #reproject the image #need to add in writing proj file for default 147 | if(proj == "default"){proj = origproj} 148 | write(proj, outprojfile) 149 | gdalwarp(srcfile=tempstack, dstfile=projstack, 150 | s_srs=origproj, t_srs=proj, of="GTiff", 151 | r="bilinear", srcnodata=-9999, dstnodata=-32768, multi=T, #"near" 152 | tr=c(30,30), co="INTERLEAVE=BAND") 153 | 154 | #project the mask 155 | gdalwarp(srcfile=tempmask, dstfile=projmask, 156 | s_srs=origproj, t_srs=proj, of="GTiff", 157 | r="mode", srcnodata=255, dstnodata=255, multi=T, 158 | tr=c(30,30), co="INTERLEAVE=BAND") 159 | 160 | #trim the na rows and cols 161 | trim_na_rowcol(projstack, finalstack, projmask, finalmask) 162 | 163 | #tasseled cap 164 | b1 = as.matrix(raster(finalstack, 1)) 165 | b2 = 
as.matrix(raster(finalstack, 2)) 166 | b3 = as.matrix(raster(finalstack, 3)) 167 | b4 = as.matrix(raster(finalstack, 4)) 168 | b5 = as.matrix(raster(finalstack, 5)) 169 | b6 = as.matrix(raster(finalstack, 6)) 170 | 171 | bcoef = c(0.2043, 0.4158, 0.5524, 0.5741, 0.3124, 0.2303) 172 | gcoef = c(-0.1603, -0.2819, -0.4934, 0.7940, -0.0002, -0.1446) 173 | wcoef = c(0.0315, 0.2021, 0.3102, 0.1594,-0.6806, -0.6109) 174 | 175 | bright = (b1*bcoef[1])+(b2*bcoef[2])+(b3*bcoef[3])+(b4*bcoef[4])+(b5*bcoef[5])+(b6*bcoef[6]) 176 | green = (b1*gcoef[1])+(b2*gcoef[2])+(b3*gcoef[3])+(b4*gcoef[4])+(b5*gcoef[5])+(b6*gcoef[6]) 177 | wet = (b1*wcoef[1])+(b2*wcoef[2])+(b3*wcoef[3])+(b4*wcoef[4])+(b5*wcoef[5])+(b6*wcoef[6]) 178 | 179 | b1=b2=b3=b4=b5=b6=0 180 | 181 | tcb = matrix_to_raster(finalstack,bright) 182 | tcg = matrix_to_raster(finalstack,green) 183 | tcw = matrix_to_raster(finalstack,wet) 184 | wet=0 185 | tca = atan(green/bright) * (180/pi) * 100 186 | bright=green=0 187 | tca = matrix_to_raster(finalstack,tca) 188 | 189 | temptcb = file.path(tempdir,paste(outbase,"_temptcb.tif",sep="")) 190 | temptcg = file.path(tempdir,paste(outbase,"_temptcg.tif",sep="")) 191 | temptcw = file.path(tempdir,paste(outbase,"_temptcw.tif",sep="")) 192 | projection(tcb) = set_projection(tcfile) 193 | projection(tcg) = set_projection(tcfile) 194 | projection(tcw) = set_projection(tcfile) 195 | tc = as(tcb, "SpatialGridDataFrame") 196 | tcb=0 197 | writeGDAL(tc, temptcb, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 198 | tc = as(tcg, "SpatialGridDataFrame") 199 | tcg=0 200 | writeGDAL(tc, temptcg, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 201 | tc = as(tcw, "SpatialGridDataFrame") 202 | tcw=0 203 | writeGDAL(tc, temptcw, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 204 | 205 | bands = c(temptcb,temptcg,temptcw) 206 | temptcs = file.path(tempdir,paste(outbase,"_temptcstack.vrt",sep="")) 207 | gdalbuildvrt(gdalfile=bands, output.vrt = temptcs, separate=T) #, tr=c(reso,reso) 208 | gdal_translate(src_dataset=temptcs, dst_dataset=tcfile, of = "GTiff", co="INTERLEAVE=BAND") 209 | 210 | projection(tca) = set_projection(tcfile) 211 | tc = as(tca, "SpatialGridDataFrame") 212 | tca=0 213 | writeGDAL(tc, tcafile, drivername = "GTiff", type = "Int16", mvFlag = -32768, options="INTERLEAVE=BAND") 214 | 215 | #delete temporary files 216 | unlink(tempdir, recursive=T, force=T) 217 | } 218 | 219 | -------------------------------------------------------------------------------- /R/trim_na_rowcol.r: -------------------------------------------------------------------------------- 1 | #' Trim NA rows and columns from an image 2 | #' 3 | #' Trim NA rows and columns from an image 4 | #' @param imgfile character. input image filename 5 | #' @param outimg character. output image filename 6 | #' @param maskfile character. input mask filename (optional) 7 | #' @param outmask character. 
input mask filename (optional) 8 | #' @import raster 9 | #' @import gdalUtils 10 | #' @export 11 | 12 | trim_na_rowcol = function(imgfile, outimg, maskfile, outmask){ 13 | #trim the stack 14 | x = raster(imgfile, band=1) 15 | cres = 0.5*res(x) 16 | y = x 17 | x = matrix(as.array(x),nrow=nrow(x),ncol=ncol(x)) 18 | r.na = c.na <- c() 19 | for(i in 1:nrow(x)) r.na <- c(r.na, all(is.na(x[i,]))) 20 | for(i in 1:ncol(x)) c.na <- c(c.na, all(is.na(x[,i]))) 21 | r1 = 1 + which(diff(which(r.na))>1)[1] 22 | r2 = nrow(x) - which(diff(which(rev(r.na)))>1)[1] 23 | c1 = 1 + which(diff(which(c.na))>1)[1] 24 | c2 = ncol(x) - which(diff(which(rev(c.na)))>1)[1] 25 | 26 | #if there are no NA rows and cols, then set to default row and col start and end 27 | if(is.na(r1)){ 28 | if(r.na[1] == T){r1 = 1+length(which(r.na))} else {r1 = 1} 29 | } 30 | if(is.na(r2)){ 31 | if(rev(r.na)[1] == T){r2 = nrow(x) - length(which(r.na))} else {r2 = nrow(x)} 32 | } 33 | if(is.na(c1)){ 34 | if(c.na[1] == T){c1 = 1+length(which(c.na))} else {c1 = 1} 35 | } 36 | if(is.na(c2)){ 37 | if(rev(c.na)[1] == T){c2 = ncol(x) - length(which(c.na))} else {c2 = ncol(x)} 38 | } 39 | 40 | xs = xFromCol(y,col=c(c1,c2)) + c(-1,1)*cres[1] 41 | ys = yFromRow(y,row=c(r2,r1)) + c(-1,1)*cres[2] 42 | 43 | #write out the trimmed file 44 | gdal_translate(src_dataset=imgfile, dst_dataset=outimg, of="GTiff", co="INTERLEAVE=BAND", projwin=c(xs[1],ys[2],xs[2],ys[1])) 45 | if(file.exists(maskfile) == T){gdal_translate(src_dataset=maskfile, dst_dataset=outmask, of="GTiff", co="INTERLEAVE=BAND", projwin=c(xs[1],ys[2],xs[2],ys[1]))} 46 | } -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ⚠️ _**This repository is no longer in development and is not supported**_ ⚠️ 2 | 3 | _You'll likely find that unmaintained dependencies and versioning conflicts make 4 | this package unusable._ 5 | 6 | _Progress is (slowly) being made on [ee-LandsatLinkr](https://github.com/gee-community/ee-LandsatLinkr), 7 | which offers similar MSS-to-successor harmonization procedures using Google Earth Engine._ 8 | 9 | LandsatLinkr 10 | ============ 11 | 12 | An automated system for creating spectrally consistent and cloud-free Landsat image time series stacks from MSS, TM, ETM+, and OLI sensors 13 | 14 | For more information please visit http://jdbcode.github.io/LandsatLinkr/ 15 | 16 | Regarding citation, here is all the info you might need: 17 | 18 | * Author = Justin D Braaten and Warren B Cohen and Zhiqiang Yang 19 | * Title = LandsatLinkr 20 | * Year = 2017 21 | * DOI = 10.5281/zenodo.807733 22 | * URL = http://dx.doi.org/10.5281/zenodo.807733 23 | * Publisher = Zenodo 24 | 25 | ...and here is an example in APA style: 26 | 27 | Braaten, J. D., Cohen, W. B., Yang, Z. (2017). LandsatLinkr. Zenodo. 
http://dx.doi.org/10.5281/zenodo.807733 28 | -------------------------------------------------------------------------------- /docs/lasrc_product_guide.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jdbcode/LandsatLinkr/fbaa880ed1dccc5076187aacf9a08d3d0321d57a/docs/lasrc_product_guide.pdf -------------------------------------------------------------------------------- /docs/ledaps_product_guide.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jdbcode/LandsatLinkr/fbaa880ed1dccc5076187aacf9a08d3d0321d57a/docs/ledaps_product_guide.pdf -------------------------------------------------------------------------------- /man/cal_mss_tc_aggregate_model.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/cal_mss_tc_aggregate_model.r 3 | \name{cal_mss_tc_aggregate_model} 4 | \alias{cal_mss_tc_aggregate_model} 5 | \title{Create an aggregate MSS TC model} 6 | \usage{ 7 | cal_mss_tc_aggregate_model(dir) 8 | } 9 | \arguments{ 10 | \item{dirname}{character. the directory to the calibration folder} 11 | } 12 | \description{ 13 | Create an aggregate MSS TC model with diagnostic figures 14 | } 15 | -------------------------------------------------------------------------------- /man/cal_oli_tc_aggregate_model.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/cal_oli_tc_aggregate_model.r 3 | \name{cal_oli_tc_aggregate_model} 4 | \alias{cal_oli_tc_aggregate_model} 5 | \title{Create an aggregate oli TC model} 6 | \usage{ 7 | cal_oli_tc_aggregate_model(dir, overwrite = F) 8 | } 9 | \arguments{ 10 | \item{dirname}{character. the directory to the calibration folder} 11 | } 12 | \description{ 13 | Create an aggregate oli TC model with diagnostic figures 14 | } 15 | -------------------------------------------------------------------------------- /man/calibrate_and_composite.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/calibrate_and_composite.r 3 | \name{calibrate_and_composite} 4 | \alias{calibrate_and_composite} 5 | \title{Calibrate MSS imagery to TM and make cloud-free composites} 6 | \usage{ 7 | calibrate_and_composite(msswrs1dir, msswrs2dir, tmwrs2dir, oliwrs2dir, index, 8 | outdir, runname, useareafile, doyears = "all", order = "none", 9 | overlap = "mean", cores = 2, process, overwrite = F, startday, endday, 10 | yearadj) 11 | } 12 | \arguments{ 13 | \item{msswrs1dir}{character. mss wrs1 directory path} 14 | 15 | \item{msswrs2dir}{character. mss wrs2 directory path} 16 | 17 | \item{tmwrs2dir}{character. tm wrs2 directory path} 18 | 19 | \item{index}{character. spectral index to make composites for. options: "tca", "tcb", "tcg", "tcw"} 20 | 21 | \item{outdir}{character. path to output directory} 22 | 23 | \item{runname}{character. unique name for the composite set} 24 | 25 | \item{useareafile}{character. path to usearea file} 26 | 27 | \item{doyears}{??? what years to composite} 28 | 29 | \item{order}{character. how to order the images options "sensor_and_doy", "doy", and "none"} 30 | 31 | \item{overlap}{character. how to deal with overlapping images. options: "mean"} 32 | 33 | \item{cores}{numeric. 
Number of cores to process with options: 1 or 2} 34 | 35 | \item{process}{numeric. integer or vector specifying which processes to run: 1=msscal, 2=mixel} 36 | } 37 | \description{ 38 | Calibrate MSS imagery to TM and make cloud-free composites 39 | } 40 | -------------------------------------------------------------------------------- /man/change_extension.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/change_extension.r 3 | \name{change_extension} 4 | \alias{change_extension} 5 | \title{Changes a file's extension} 6 | \usage{ 7 | change_extension(old, new, file) 8 | } 9 | \arguments{ 10 | \item{old}{character. old extension} 11 | 12 | \item{new}{character. new extension} 13 | 14 | \item{file}{character. full path of the file to change extension} 15 | } 16 | \description{ 17 | Changes a file's extension 18 | } 19 | -------------------------------------------------------------------------------- /man/delete_files.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/delete_files.r 3 | \name{delete_files} 4 | \alias{delete_files} 5 | \title{Delete files} 6 | \usage{ 7 | delete_files(file, reason) 8 | } 9 | \arguments{ 10 | \item{file}{character. any file associated with a particular image ID} 11 | 12 | \item{reason}{numeric or character. a reason for deleting the image.} 13 | } 14 | \description{ 15 | Delete all files associated with a particular image ID 16 | } 17 | -------------------------------------------------------------------------------- /man/eudist.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/eudist.r 3 | \name{eudist} 4 | \alias{eudist} 5 | \title{Earth-Sun distance by day of year} 6 | \usage{ 7 | eudist(doy) 8 | } 9 | \arguments{ 10 | \item{doy}{numeric. image day-of-year} 11 | } 12 | \description{ 13 | Earth-Sun distance by day of year 14 | } 15 | \references{ 16 | http://landsathandbook.gsfc.nasa.gov/excel_docs/d.xls 17 | } 18 | -------------------------------------------------------------------------------- /man/file_check.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/file_check.r 3 | \name{file_check} 4 | \alias{file_check} 5 | \title{Handles file existence checking and overwriting} 6 | \usage{ 7 | file_check(file, output, overwrite) 8 | } 9 | \arguments{ 10 | \item{file}{Filename. of file being worked on} 11 | 12 | \item{output}{Filename. what output file is it checking for? ("archv.tif", "reflectance.tif", etc)} 13 | 14 | \item{overwrite}{logical. 
if the output file exists should it be deleted} 15 | } 16 | \description{ 17 | Handles file existence checking and overwriting 18 | } 19 | -------------------------------------------------------------------------------- /man/get_metadata.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/get_metadata.r 3 | \name{get_metadata} 4 | \alias{get_metadata} 5 | \title{Retrieve Landsat image metadata} 6 | \usage{ 7 | get_metadata(file) 8 | } 9 | \arguments{ 10 | \item{file}{LPGS-processed Landsat image filename (full system path to file)} 11 | } 12 | \value{ 13 | Dataframe with pertinent image information 14 | } 15 | \description{ 16 | Uses the image file name to find the corresponding *MTL.txt image metadata file provided with LPSG Landsat images and returns a dataframe with pertinent image information 17 | } 18 | -------------------------------------------------------------------------------- /man/llr_time_machine.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/llr_time_machine.r 3 | \name{llr_time_machine} 4 | \alias{llr_time_machine} 5 | \title{Decompress, stack, and reproject LPSG MSS images} 6 | \usage{ 7 | llr_time_machine(imgdir, outdir, coordfile) 8 | } 9 | \arguments{ 10 | \item{imgdir}{direcory path. full path to directory containing LandsatLinkr annual composites} 11 | 12 | \item{outdir}{direcory path. full path to the directory where you want LLR-TimeMachine data to be written} 13 | 14 | \item{coordfile}{csv file path. full path to a comma delimited file containing the plot number, x, and y coordinates pixels you want to view in LLR-TimeMachine} 15 | } 16 | \description{ 17 | Decompresses, stacks, and optionally reprojects LPGS MSS images recieved from USGS EROS as .tar.gz files 18 | } 19 | -------------------------------------------------------------------------------- /man/make_usearea_file.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/make_usearea_file.r 3 | \name{make_usearea_file} 4 | \alias{make_usearea_file} 5 | \title{Make a usearea file} 6 | \usage{ 7 | make_usearea_file(dir, outfile, xmx, xmn, ymx, ymn) 8 | } 9 | \arguments{ 10 | \item{dir}{character. full path name to scene directory example: "E:/mss/wrs2/038029"} 11 | 12 | \item{outfile}{charcter. full path of output file} 13 | 14 | \item{xmx}{numeric.max x coordinate} 15 | 16 | \item{xmn}{numeric.min x coordinate} 17 | 18 | \item{ymx}{numeric.max y coordinate} 19 | 20 | \item{ymn}{numeric.min y coordinate} 21 | } 22 | \description{ 23 | Make a usearea file for image compositing 24 | } 25 | -------------------------------------------------------------------------------- /man/make_usearea_file_bsq.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/make_usearea_file_bsq.r 3 | \name{make_usearea_file_bsq} 4 | \alias{make_usearea_file_bsq} 5 | \title{Make a usearea file in ENVI (bsq) format} 6 | \usage{ 7 | make_usearea_file_bsq(infile, projref) 8 | } 9 | \arguments{ 10 | \item{infile}{character. full path name to a usearea mask file} 11 | 12 | \item{projref}{charcter. 
full path to an image file from the scene relevent to the usearea file - can be any .tif file in the "images" directory} 13 | } 14 | \description{ 15 | Make a usearea file in ENVI (bsq) format 16 | } 17 | -------------------------------------------------------------------------------- /man/matrix_to_raster.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/matrix_to_raster.r 3 | \name{matrix_to_raster} 4 | \alias{matrix_to_raster} 5 | \title{Converts a matrix to a raster file} 6 | \usage{ 7 | matrix_to_raster(rfile, rmatrix) 8 | } 9 | \arguments{ 10 | \item{rfile}{character. full path name of a reference raster file} 11 | 12 | \item{rmatrix}{matrix. a numeric 2-d matrix to be converted to a raster} 13 | } 14 | \description{ 15 | Converts a matrix to a raster file 16 | } 17 | -------------------------------------------------------------------------------- /man/mixel.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/mixel.r 3 | \name{mixel} 4 | \alias{mixel} 5 | \title{Composite images} 6 | \usage{ 7 | mixel(msswrs1dir, msswrs2dir, tmwrs2dir, oliwrs2dir, index, outdir, runname, 8 | useareafile, doyears = "all", order = "none", overlap = "mean", 9 | startday, endday, yearadj = 0) 10 | } 11 | \arguments{ 12 | \item{msswrs1dir}{character. list of mss wrs1 directory paths} 13 | 14 | \item{msswrs2dir}{character. list of mss wrs2 directory paths} 15 | 16 | \item{tmwrs2dir}{character. list of tm wrs2 directory path} 17 | 18 | \item{index}{character. spectral index to make composites for. options: "tca", "tcb", "tcg", "tcw"} 19 | 20 | \item{outdir}{character. path to output directory} 21 | 22 | \item{runname}{character. unique name for the composite set} 23 | 24 | \item{useareafile}{character. path to usearea file} 25 | 26 | \item{doyears}{??? what years to composite} 27 | 28 | \item{order}{character. how to order the images options "sensor_and_doy" and "doy"} 29 | 30 | \item{overlap}{character. how to deal with overlapping images. options: "mean"} 31 | } 32 | \description{ 33 | Composite images 34 | } 35 | -------------------------------------------------------------------------------- /man/mosaic_dems.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/mosaic_dems.r 3 | \name{mosaic_dems} 4 | \alias{mosaic_dems} 5 | \title{Create a DEM mosaic from a direcory of DEM's} 6 | \usage{ 7 | mosaic_dems(dir, proj) 8 | } 9 | \arguments{ 10 | \item{dir}{character. The path to a directory containing DEM files to be mosaicked} 11 | } 12 | \description{ 13 | Create a DEM mosaic from a direcory of DEM's 14 | } 15 | -------------------------------------------------------------------------------- /man/mss_resample.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/mss_resample.r 3 | \name{mss_resample} 4 | \alias{mss_resample} 5 | \title{Resamples MSS dos_sr and cloudmask images to 30m} 6 | \usage{ 7 | mss_resample(file, overwrite = F) 8 | } 9 | \arguments{ 10 | \item{file}{character. 
full path to either an MSS *reflectance.tif or *cloudmask.tif file} 11 | } 12 | \description{ 13 | Resamples MSS dos_sr and cloudmask images to 30m 14 | } 15 | -------------------------------------------------------------------------------- /man/msscal.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/msscal.r 3 | \name{msscal} 4 | \alias{msscal} 5 | \title{Calibrate MSS images to TM images} 6 | \usage{ 7 | msscal(msswrs1dir, msswrs2dir, tmwrs2dir, cores = 2) 8 | } 9 | \arguments{ 10 | \item{msswrs1dir}{character. MSS WRS-1 scene directory path} 11 | 12 | \item{msswrs2dir}{character. MSS WRS-2 scene directory path} 13 | 14 | \item{tmwrs2dir}{character. TM WRS-2 scene directory path} 15 | 16 | \item{cores}{numeric. Number of cores to process with options: 1 or 2} 17 | } 18 | \description{ 19 | Calibrate MSS images to TM images using linear regression 20 | } 21 | -------------------------------------------------------------------------------- /man/msscal_single.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/msscal_single.r 3 | \name{msscal_single} 4 | \alias{msscal_single} 5 | \title{Calibrate MSS images to TM images} 6 | \usage{ 7 | msscal_single(mss_file, tm_file) 8 | } 9 | \arguments{ 10 | \item{mss_file}{character. MSS image file path} 11 | 12 | \item{tm_file}{character. TM image file path} 13 | } 14 | \description{ 15 | Calibrate MSS images to TM images using linear regression 16 | } 17 | -------------------------------------------------------------------------------- /man/msscost.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/msscost.r 3 | \name{msscost} 4 | \alias{msscost} 5 | \title{Convert DN values to surface reflectance} 6 | \usage{ 7 | msscost(file, overwrite = F) 8 | } 9 | \arguments{ 10 | \item{file}{The full path name of the *archv file} 11 | } 12 | \description{ 13 | Convert DN values to surface reflectance using the COST model with dark object subtraction 14 | } 15 | -------------------------------------------------------------------------------- /man/msscvm.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/msscvm.r 3 | \name{msscvm} 4 | \alias{msscvm} 5 | \title{Create a cloud and cloud shadow mask for Landsat MSS imagery} 6 | \usage{ 7 | msscvm(file, demfile, topoprep, test = F, overwrite = F) 8 | } 9 | \arguments{ 10 | \item{file}{character. MSS reflectance image filename (full system path to MSS file)} 11 | 12 | \item{demfile}{character. DEM filename (full system path to spatially coincident DEM file)} 13 | 14 | \item{topoprep}{logical. TRUE if slope and aspect are already created in the "topo" folder and FALSE if not} 15 | 16 | \item{test}{logical. If TRUE clouds, cloud shadows and clear pixels have unique values, if FALSE obscured are 0 and clear are 1} 17 | } 18 | \value{ 19 | A binary raster with the same dimensions as the MSS image where pixels with value 1 represent clear pixels and value 0 represents pixels obscured by either cloud or cloud shadow 20 | } 21 | \description{ 22 | Create a cloud and cloud shadow mask for Landsat MSS imagery. 
23 | } 24 | -------------------------------------------------------------------------------- /man/mssdn2refl.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/mssdn2refl.r 3 | \name{mssdn2refl} 4 | \alias{mssdn2refl} 5 | \title{Convert DN values to toa reflectance} 6 | \usage{ 7 | mssdn2refl(file, overwrite = F) 8 | } 9 | \arguments{ 10 | \item{file}{The full path name of the *archv file} 11 | } 12 | \description{ 13 | Convert DN values to toa reflectance 14 | } 15 | -------------------------------------------------------------------------------- /man/msssr2tc.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/msssr2tc.r 3 | \name{msssr2tc} 4 | \alias{msssr2tc} 5 | \title{Create MSS TC from surface reflectance and modeled coefficients} 6 | \usage{ 7 | msssr2tc(mss_file, bcoef, gcoef, wcoef, mode) 8 | } 9 | \arguments{ 10 | \item{mss_file}{character. full path name to MSS surface reflectance file} 11 | 12 | \item{bcoef}{numeric. numeric array containing the tcb coefficients} 13 | 14 | \item{gcoef}{numeric. numeric array containing the tcg coefficients} 15 | 16 | \item{wcoef}{numeric. numeric array containing the tcw coefficients} 17 | 18 | \item{mode.}{character. how to deal with the outputs options: "calibrate" or "apply"} 19 | } 20 | \description{ 21 | Create MSS TC from surface reflectance and modeled coefficients 22 | } 23 | -------------------------------------------------------------------------------- /man/mssunpackr.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/mssunpackr.r 3 | \name{mssunpackr} 4 | \alias{mssunpackr} 5 | \title{Decompress, stack, and reproject LPSG MSS images} 6 | \usage{ 7 | mssunpackr(file, proj, overwrite = F) 8 | } 9 | \arguments{ 10 | \item{file}{Filename of LPGS Landsat MSS image filename (full system path to file)} 11 | 12 | \item{proj}{PROJ.4 projection definition. By default no projection will take place. Optionally specify a CRS projection string or "albers" for the USGS version of Albers Equal Area Conic} 13 | 14 | \item{reso}{numeric. the target pixel size for the output image} 15 | } 16 | \description{ 17 | Decompresses, stacks, and optionally reprojects LPGS MSS images recieved from USGS EROS as .tar.gz files 18 | } 19 | -------------------------------------------------------------------------------- /man/msswarp.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/msswarp.r 3 | \name{msswarp} 4 | \alias{msswarp} 5 | \title{Spatially warp an MSS image} 6 | \usage{ 7 | msswarp(reffile, fixfile, refstart = c(0, 0), fixstart = c(0, 0)) 8 | } 9 | \arguments{ 10 | \item{reffile}{character. MSS image file that has low spatial RMSE and low cloud cover} 11 | 12 | \item{fixfile}{character. MSS image file to be spatially warped to match the reference file} 13 | 14 | \item{refstart}{numeric. c(xcoord,ycoord). reference image coordinate for a pixel identified as common in both the reference and the to-be-warped image. used to calculate an initial offset between the two images.} 15 | 16 | \item{fixstart}{numeric. c(xcoord,ycoord). 
fix image coordinate for a pixel identified as common in both the reference and the to-be-warped image. used to calculate an initial offset between the two images.} 17 | 18 | \item{window}{numeric. image subset size used to define cross-correlation calculation. unit is pixels along one side of a square} 19 | 20 | \item{search}{numeric. neighborhood search window size in which to find tie-point offset. unit is pixels along one side of a square} 21 | 22 | \item{sample}{numeric. target number of tie-points} 23 | } 24 | \description{ 25 | Spatially warp an MSS image to match the spatial properties of a reference image 26 | } 27 | -------------------------------------------------------------------------------- /man/msswarp_old.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/msswarp_old.r 3 | \name{msswarp_old} 4 | \alias{msswarp_old} 5 | \title{Spatially warp an MSS image} 6 | \usage{ 7 | msswarp_old(reffile, fixfile, window = 275, search = 27, sample = 1000, 8 | refstart = c(0, 0), fixstart = c(0, 0)) 9 | } 10 | \arguments{ 11 | \item{reffile}{character. MSS image file that has low spatial RMSE and low cloud cover} 12 | 13 | \item{fixfile}{character. MSS image file to be spatially warped to match the reference file} 14 | 15 | \item{window}{numeric. image subset size used to define cross-correlation calculation. unit is pixels along one side of a square} 16 | 17 | \item{search}{numeric. neighborhood search window size in which to find tie-point offset. unit is pixels along one side of a square} 18 | 19 | \item{sample}{numeric. target number of tie-points} 20 | 21 | \item{refstart}{numeric. c(xcoord,ycoord). reference image coordinate for a pixel identified as common in both the reference and the to-be-warped image. used to calculate an initial offset between the two images.} 22 | 23 | \item{fixstart}{numeric. c(xcoord,ycoord). fix image coordinate for a pixel identified as common in both the reference and the to-be-warped image. used to calculate an initial offset between the two images.} 24 | } 25 | \description{ 26 | Spatially warp an MSS image to match the spatial properties of a reference image 27 | } 28 | -------------------------------------------------------------------------------- /man/olical.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/olical.r 3 | \name{olical} 4 | \alias{olical} 5 | \title{Calibrate OLI images to TM images} 6 | \usage{ 7 | olical(oliwrs2dir, tmwrs2dir, cores = 2, overwrite = overwrite) 8 | } 9 | \arguments{ 10 | \item{oliwrs2dir}{character. oli WRS-2 scene directory path} 11 | 12 | \item{tmwrs2dir}{character. TM WRS-2 scene directory path} 13 | 14 | \item{cores}{numeric. Number of cores to process with options: 1 or 2} 15 | } 16 | \description{ 17 | Calibrate OLI images to TM images using linear regression 18 | } 19 | -------------------------------------------------------------------------------- /man/olical_single.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/olical_single.r 3 | \name{olical_single} 4 | \alias{olical_single} 5 | \title{Calibrate oli images to TM images} 6 | \usage{ 7 | olical_single(oli_file, tm_file, overwrite = F) 8 | } 9 | \arguments{ 10 | \item{oliwrs2dir}{character. 
oli WRS-2 scene directory path} 11 | 12 | \item{tmwrs2dir}{character. TM WRS-2 scene directory path} 13 | } 14 | \description{ 15 | Calibrate oli images to TM images using linear regression 16 | } 17 | -------------------------------------------------------------------------------- /man/olisr2tc.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/olisr2tc.r 3 | \name{olisr2tc} 4 | \alias{olisr2tc} 5 | \title{Create oli TC from surface reflectance and modeled coefficients} 6 | \usage{ 7 | olisr2tc(oli_file, bcoef, gcoef, wcoef, mode, overwrite = F) 8 | } 9 | \arguments{ 10 | \item{oli_file}{character. full path name to oli surface reflectance file} 11 | 12 | \item{bcoef}{numeric. numeric array containing the tcb coefficients} 13 | 14 | \item{gcoef}{numeric. numeric array containing the tcg coefficients} 15 | 16 | \item{wcoef}{numeric. numeric array containing the tcw coefficients} 17 | 18 | \item{mode.}{character. how to deal with the outputs options: "calibrate" or "apply"} 19 | } 20 | \description{ 21 | Create oli TC from surface reflectance and modeled coefficients 22 | } 23 | -------------------------------------------------------------------------------- /man/oliunpackr.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/oliunpackr.r 3 | \name{oliunpackr} 4 | \alias{oliunpackr} 5 | \title{Decompress, stack, and reproject OLI SR images} 6 | \usage{ 7 | oliunpackr(file, proj = "default", overwrite = F) 8 | } 9 | \arguments{ 10 | \item{file}{character. full path name of the surface reflectance file} 11 | 12 | \item{proj}{character. PROJ.4 projection definition.} 13 | 14 | \item{overwrite}{logical. True will overwrite the file if it already exists, False will skip processing if output file exists.} 15 | } 16 | \description{ 17 | Decompress, stack, and reproject OLI SR images 18 | } 19 | -------------------------------------------------------------------------------- /man/prepare_images.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/prepare_images.r 3 | \name{prepare_images} 4 | \alias{prepare_images} 5 | \title{Prepare image MSS and TM images for calibration/compositing} 6 | \usage{ 7 | prepare_images(scenedir, demfile = NULL, proj = "default", 8 | process = seq(1:5), cores = 1, overwrite = F) 9 | } 10 | \arguments{ 11 | \item{scenedir}{character. scene file path} 12 | 13 | \item{demfile}{character. full path to scene-corresponding DEM file} 14 | 15 | \item{process}{numeric. integer or vector specifying which processes to run 1=mssunpackr, 2=msswarp, 3=mssdn2rad, 4=mssatcor, 5=msscvm, 6=tmunpackr} 16 | 17 | \item{cores}{numeric. 
number of cores to use for parallel processing} 18 | } 19 | \description{ 20 | Prepare image MSS and TM images for calibration/compositing 21 | } 22 | -------------------------------------------------------------------------------- /man/prepare_topo.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/prepare_topo.r 3 | \name{prepare_topo} 4 | \alias{prepare_topo} 5 | \title{Prepare topographic layers: elevation, slope, aspect for use in MSS cloud masking} 6 | \usage{ 7 | prepare_topo(imgdir, demfile) 8 | } 9 | \arguments{ 10 | \item{imgdir}{character. full path to "images" directory for scene} 11 | 12 | \item{demfile}{character. full path to scene-corresponding DEM file} 13 | } 14 | \description{ 15 | Prepare topographic layers: elevation, slope, aspect for use in MSS cloud masking 16 | } 17 | -------------------------------------------------------------------------------- /man/run_landsatlinkr.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/run_landsatlinkr.r 3 | \name{run_landsatlinkr} 4 | \alias{run_landsatlinkr} 5 | \title{GUI for running LandsatLinkr} 6 | \usage{ 7 | run_landsatlinkr() 8 | } 9 | \description{ 10 | GUI for running LandsatLinkr 11 | } 12 | -------------------------------------------------------------------------------- /man/set_projection.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/set_projection.r 3 | \name{set_projection} 4 | \alias{set_projection} 5 | \title{Set the projection of raster} 6 | \usage{ 7 | set_projection(file) 8 | } 9 | \arguments{ 10 | \item{file}{Filename of MSS image with DN values} 11 | } 12 | \description{ 13 | Set the projection of a raster from the corresponding *proj.txt file. This solves a problem with incorrect projection parameters for albers when read by 'raster' 14 | } 15 | -------------------------------------------------------------------------------- /man/tmunpackr.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/tmunpackr.r 3 | \name{tmunpackr} 4 | \alias{tmunpackr} 5 | \title{Decompress, stack, and reproject TM/ETM+ SR images} 6 | \usage{ 7 | tmunpackr(file, proj = "default", overwrite = F) 8 | } 9 | \arguments{ 10 | \item{file}{character. full path name of the surface reflectance file} 11 | 12 | \item{proj}{character. PROJ.4 projection definition.} 13 | 14 | \item{overwrite}{logical. True will overwrite the file if it already exists, False will skip processing if output file exists.} 15 | } 16 | \description{ 17 | Decompress, stack, and reproject TM/ETM+ SR images 18 | } 19 | -------------------------------------------------------------------------------- /man/trim_na_rowcol.Rd: -------------------------------------------------------------------------------- 1 | % Generated by roxygen2: do not edit by hand 2 | % Please edit documentation in R/trim_na_rowcol.r 3 | \name{trim_na_rowcol} 4 | \alias{trim_na_rowcol} 5 | \title{Trim NA rows and columns from an image} 6 | \usage{ 7 | trim_na_rowcol(imgfile, outimg, maskfile, outmask) 8 | } 9 | \arguments{ 10 | \item{imgfile}{character. input image filename} 11 | 12 | \item{outimg}{character. 
output image filename} 13 | 14 | \item{maskfile}{character. input mask filename (optional)} 15 | 16 | \item{outmask}{character. output mask filename (optional)} 17 | } 18 | \description{ 19 | Trim NA rows and columns from an image 20 | } 21 | --------------------------------------------------------------------------------
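The man pages above document call signatures but include no end-to-end example; the sketch below strings the documented entry points together using only those signatures. It is illustrative rather than part of the package source: every file path is a hypothetical placeholder, and the argument values simply mirror the defaults shown in the documentation.

library(LandsatLinkr)

# hypothetical locations - substitute real scene, DEM, and output paths
mss_scenedir = "D:/llr/mss/wrs2/038029"           # MSS WRS-2 scene directory (hypothetical)
tm_scenedir  = "D:/llr/tm/wrs2/038029"            # TM/ETM+ WRS-2 scene directory (hypothetical)
demfile      = "D:/llr/topo/wrs2_038029_dem.tif"  # scene-corresponding DEM (hypothetical)

# unpack, warp, convert to reflectance, and cloud-mask the MSS archives (processes 1-5)
prepare_images(mss_scenedir, demfile = demfile, proj = "default", process = 1:5, cores = 1, overwrite = F)

# unpack the TM/ETM+ surface reflectance archives (process 6); no DEM is needed for this step
prepare_images(tm_scenedir, proj = "default", process = 6, cores = 1, overwrite = F)

# alternatively, run_landsatlinkr() launches the GUI for running LandsatLinkr interactively
run_landsatlinkr()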