├── .gitignore ├── .gitmodules ├── imgs ├── artifact_montage.png ├── hillshade_altitude_montage.jpg ├── hillshade_montage.jpg ├── mapbox-hillshade-breakdown.gif ├── mapbox-hillshade-breakdown.webm ├── mapbox-hillshade-breakdown_1.gif ├── monochrome_montage.jpg ├── polygonized.png ├── resampling_montage.jpg └── subset.tif_hillshade_monochrome_combined.gif ├── index.html ├── readme.md ├── tile-generator ├── main.py └── requirements.txt ├── tile-server ├── .gitignore ├── mapbox_studio.tm2source │ ├── .thumb.png │ └── data.yml ├── package.json ├── readme.md └── server.js ├── tile-uploader ├── package.json └── upload.js └── tile-viewer ├── .gitignore ├── README.md ├── app ├── account.js └── style.json ├── gulpfile.js └── style.yml /.gitignore: -------------------------------------------------------------------------------- 1 | data/ 2 | output/ 3 | __pycache__ 4 | node_modules 5 | -------------------------------------------------------------------------------- /.gitmodules: -------------------------------------------------------------------------------- 1 | [submodule "postgis-vt-util"] 2 | path = tile-server/postgis-vt-util 3 | url = https://github.com/mapbox/postgis-vt-util.git 4 | -------------------------------------------------------------------------------- /imgs/artifact_montage.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/artifact_montage.png -------------------------------------------------------------------------------- /imgs/hillshade_altitude_montage.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/hillshade_altitude_montage.jpg -------------------------------------------------------------------------------- /imgs/hillshade_montage.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/hillshade_montage.jpg -------------------------------------------------------------------------------- /imgs/mapbox-hillshade-breakdown.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/mapbox-hillshade-breakdown.gif -------------------------------------------------------------------------------- /imgs/mapbox-hillshade-breakdown.webm: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/mapbox-hillshade-breakdown.webm -------------------------------------------------------------------------------- /imgs/mapbox-hillshade-breakdown_1.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/mapbox-hillshade-breakdown_1.gif -------------------------------------------------------------------------------- /imgs/monochrome_montage.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/monochrome_montage.jpg -------------------------------------------------------------------------------- 
/imgs/polygonized.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/polygonized.png -------------------------------------------------------------------------------- /imgs/resampling_montage.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/resampling_montage.jpg -------------------------------------------------------------------------------- /imgs/subset.tif_hillshade_monochrome_combined.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/imgs/subset.tif_hillshade_monochrome_combined.gif -------------------------------------------------------------------------------- /index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | GEBCO Vector Ocean 6 | 7 | 8 | 9 | 10 | 11 | 12 | 43 | 44 | 45 | 46 |
47 | 52 |
53 | 54 | 113 | 114 | 115 | -------------------------------------------------------------------------------- /readme.md: -------------------------------------------------------------------------------- 1 | # Generating an Ocean Vector Terrain tile source 2 | 3 | Mapbox has an amazing [Mapbox Terrain](https://www.mapbox.com/developers/vector-tiles/mapbox-terrain/) vector tile source. Unfortunately, the tiles seem devoid of any ocean data, leaving 71% of the earth somewhat barren. This is an experiment in generating our own set of vector tiles from the [GEBCO](http://www.gebco.net) gridded bathymetric data sets. Much of this is based off of the [Global Vector Terrain slide deck](https://speakerdeck.com/mapbox/global-vector-terrain) presented by @ajashton of Mapbox at NACIS 2014 Practical Cartography Day. While the particular dataset we'll be using isn't especially massive, we'll do our best to design this process to be robust enough to handle much larger quantities of data. This largely translates into breaking the data up and processing it in a distributed fashion, as mentioned in the NACIS slides. 4 | 5 | One thing that was not initially clear was how to handle features that span multiple tiles. Should we clip features that span multiple tiles? If so, does the [vector tile spec](https://github.com/mapbox/vector-tile-spec) differentiate between features that are clipped vs features that naturally end at the boundary of the tile? Is only the boundary of the polygon included, or are extra lines at the tile boundary added to "close" the polygon within the tile? There are multiple issues regarding this topic ([4](https://github.com/mapbox/vector-tile-spec/issues/4), [8](https://github.com/mapbox/vector-tile-spec/issues/8), [26](https://github.com/mapbox/vector-tile-spec/issues/26)), but the best summary for how Mapbox handles this comes from @pramsey's [comment](https://github.com/mapbox/vector-tile-spec/issues/26#issuecomment-63902337): 6 | 7 | > It’s a “trick” a “hack” kind of. The tiles are clipped to be slightly *larger* [than] a perfect tile would be, and then when it is render time, the render canvas is set to the *exact* tile size. The result is that edge artifacts end up outside the render frame and the rendered tiles all magically line up. It makes me feel a little dirty (the correct clipping buffer is a bit of a magic number that depends on the rendering rules that are going to be applied later) but it does serve to reinforce that vector tiles in this spec are not meant to be used as a replacement for a real vector feature service (be it a WFS, ac ArcGIS service, or some other web service), but as an improved transport for rendering, first and foremost. 8 | 9 | Being that dialing-in the correct clipping buffer will depend on the rendering rules, I think it would be wise to split the generation of vector tiles into two main sections: 10 | 11 | 1. Vectorize input data and store in a database 12 | 1. Render vector tiles and store in cloud 13 | 14 | ## I. Vectorize & Store 15 | 16 | ### Requirements 17 | 18 | * ImageMagick 19 | * GDAL 20 | * libgeotiff 21 | * Postgresql 22 | * PostGIS 23 | 24 | ### General Steps 25 | 26 | 1. Get the data 27 | 1. Downsample data 28 | 1. Subset Data (for dev) 29 | 1. Hillshade data 30 | 1. Convert to Monochrome Levels 31 | 1. Polygonize / Load into DB 32 | 33 | ### 1. Sourcing & Preparing Data 34 | 35 | First and foremost, we need some data. As mentioned above, we'll be using a GEBCO dataset. 
There are multiple offerings, but at the time of writing the highest-resolution data offered by GEBCO is the [GEBCO 30 arc-second grid data](http://www.gebco.net/data_and_products/gridded_bathymetry_data/gebco_30_second_grid/). It is available for free download from their website. 36 | 37 | As you'd guess from the name, the GEBCO 30 arc-second dataset stores data in a 30 arc-second (1/120 degree or 1/2 arc-minute) grid. Depth is stored in meters. 38 | 39 | About arc-second data: 40 | 41 | > At the equator, an arc-second of longitude approximately equals an arc-second of latitude, which is 1/60th of a nautical mile (or 101.27 feet or 30.87 meters). Arc-seconds of latitude remain nearly constant, while arc-seconds of longitude decrease in a trigonometric cosine-based fashion as one moves toward the earth's poles. - [_Source: ArcUser Online - Measuring in Arc-Seconds_](http://www.esri.com/news/arcuser/0400/wdside.html) 42 | 43 | The data arrives in EPSG:4326 (WGS84). We'll need to get it into EPSG:3857 (for more information on the differences, see [here](http://gis.stackexchange.com/questions/48949/epsg-3857-or-4326-for-googlemaps-openstreetmap-and-leaflet)). Since the file isn't too large (~1.5GB), we can go ahead and convert it before we get started. While we're at it, we may as well convert the data from NetCDF to a GeoTIFF. This will allow us to interact with the data through ImageMagick. As mentioned in [Mapbox' Working with GeoTIFFs documentation](https://www.mapbox.com/tilemill/docs/guides/reprojecting-geotiff/), we'll clip the data to web mercator extents. Originally, I had planned on using the [Lanczos](http://gis.stackexchange.com/questions/10931/what-is-lanczos-resampling-useful-for-in-a-spatial-context#answer-14361) resampling method hoping for a higher-quality output (as recommended in Mapbox' GeoTIFFs documentation), but found that it exaggerated linear artifacts in the data: 44 | 45 | ![linear artifacts](/imgs/resampling_montage.jpg) 46 | 47 | In the end, `bilinear` and `cubic` seemed to produce the most appealing output. I opted to go with `cubic`: 48 | 49 | ```bash 50 | gdalwarp -s_srs epsg:4326 -t_srs epsg:3857 -of GTIFF -te -20037508.34 -20037508.34 20037508.34 20037508.34 -r cubic data/GEBCO_2014_1D.nc data/GEBCO_2014_1D_3857.tif 51 | ``` 52 | 53 | If the file were drastically larger, it might make sense to reproject after we subset our data. On a 2.4 GHz Intel Core i5 Retina MacBook Pro with 8 GB RAM, this operation took around 80 seconds. 54 | 55 | ### 2.
Downsample 56 | 57 | From GDAL info, we can see that our image is `29136` x `29136`: 58 | 59 | ``` 60 | $ gdalinfo data/GEBCO_2014_1D_3857.tif 61 | Driver: GTiff/GeoTIFF 62 | Files: data/GEBCO_2014_1D_3857.tif 63 | Size is 29136, 29136 64 | Coordinate System is: 65 | PROJCS["WGS 84 / Pseudo-Mercator", 66 | GEOGCS["WGS 84", 67 | DATUM["WGS_1984", 68 | SPHEROID["WGS 84",6378137,298.257223563, 69 | AUTHORITY["EPSG","7030"]], 70 | AUTHORITY["EPSG","6326"]], 71 | PRIMEM["Greenwich",0], 72 | UNIT["degree",0.0174532925199433], 73 | AUTHORITY["EPSG","4326"]], 74 | PROJECTION["Mercator_1SP"], 75 | PARAMETER["central_meridian",0], 76 | PARAMETER["scale_factor",1], 77 | PARAMETER["false_easting",0], 78 | PARAMETER["false_northing",0], 79 | UNIT["metre",1, 80 | AUTHORITY["EPSG","9001"]], 81 | EXTENSION["PROJ4","+proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs"], 82 | AUTHORITY["EPSG","3857"]] 83 | Origin = (-20037508.339999999850988,20037508.339999999850988) 84 | Pixel Size = (1375.446755903349867,-1375.446755903349867) 85 | Metadata: 86 | AREA_OR_POINT=Area 87 | Image Structure Metadata: 88 | INTERLEAVE=BAND 89 | Corner Coordinates: 90 | Upper Left (-20037508.340,20037508.340) (180d 0' 0.00"W, 85d 3' 4.06"N) 91 | Lower Left (-20037508.340,-20037508.340) (180d 0' 0.00"W, 85d 3' 4.06"S) 92 | Upper Right (20037508.340,20037508.340) (180d 0' 0.00"E, 85d 3' 4.06"N) 93 | Lower Right (20037508.340,-20037508.340) (180d 0' 0.00"E, 85d 3' 4.06"S) 94 | Center ( 0.0000000, 0.0000000) ( 0d 0' 0.01"E, 0d 0' 0.01"N) 95 | Band 1 Block=29136x1 Type=Int16, ColorInterp=Gray 96 | ``` 97 | 98 | Looking at the following table, we can see that our geotiff is just slightly smaller than the global width of zoom level 7: 99 | 100 | Zoom | Global Width (px) | # Tiles Wide 101 | ---: | :---: | :--- 102 | 0 | 256 | 1 103 | 1 | 512 | 2 104 | 2 | 1024 | 4 105 | 3 | 2048 | 8 106 | 4 | 4096 | 16 107 | 5 | 8192 | 32 108 | 6 | 16384 | 64 109 | 7 | 32768 | 128 110 | 8 | 65536 | 256 111 | 9 | 131072 | 512 112 | 10 | 262144 | 1024 113 | 11 | 524288 | 2048 114 | 12 | 1048576 | 4096 115 | 13 | 2097152 | 8192 116 | 14 | 4194304 | 16384 117 | 15 | 8388608 | 32768 118 | 16 | 16777216 | 65536 119 | 17 | 33554432 | 131072 120 | 18 | 67108864 | 262144 121 | 19 | 134217728 | 524288 122 | 123 | _TODO: Fill in the downsampling portion_ 124 | 125 | 126 | ### 3. Subset Data 127 | 128 | To break the dataset up into separate 1024px x 1024px slices to be processed by different workers, we'll need to subset the data. This is a pretty straightforward operation: 129 | 130 | ```bash 131 | gdal_translate -srcwin 7200 11800 1024 1024 -of GTIFF data/GEBCO_2014_1D_3857_cropped.tif output/subset.tif 132 | ``` 133 | 134 | For the purpose of dialing in other commands, we'll experiment with a slice of the Florida / Gulf of Mexico area. This area provides a variety of surface types to visualize. 135 | 136 | #### Optional: Remove terrain data 137 | 138 | Mapbox already offers a great set of terrain data. You may not be looking to replace that, only to add missing ocean data. To remove the terrain data from the GEBCO dataset, we could clip it to only contain ocean data. An option would be to use the [Generalized Coastline dataset](http://openstreetmapdata.com/data/generalized-coastlines) from OSM to achieve this. The OSM Generalized Coastline dataset only goes up to zoom level 8, which actually appears about right for the resolution of the GEBCO 30 arc-second grid. 
If we were using higher-resolution data, something like the [Water Polygon dataset](http://openstreetmapdata.com/data/water-polygons) would likely be a better fit. 139 | 140 | Crop shapefile (takes ~18 seconds): 141 | ```bash 142 | ogr2ogr -f "ESRI Shapefile" output/cropped_coastline.shp data/water-polygons-generalized-3857/water_polygons_z8.shp -clipsrc 7200 11800 8224 12824 143 | gdalwarp -cutline output/cropped_coastline.shp output/subset.tif output/subset_cropped.tif 144 | ``` 145 | 146 | For the sake of this experiment, we'll encode both the bathymetric and terrain data. 147 | 148 | 149 | ### 4. Hillshade 150 | 151 | To produce hillshade data, we'll use the [`gdaldem`](http://www.gdal.org/gdaldem.html) command. 152 | 153 | #### A word about scale... 154 | 155 | We'll have to adjust the vertical exaggeration as our data moves away from the equator. _TODO: More about this_ 156 | 157 | Convenience function to help experiment with finding the right hillshade level using GDAL: 158 | 159 | ``` bash 160 | rm output/subset.tif_hillshade_* 161 | 162 | # example usage: hillshade subset.tif 2 163 | hillshade () { 164 | echo "gdaldem hillshade -co compress=lzw -compute_edges -z $2 $1 $1_hillshade_$2.tif"; 165 | gdaldem hillshade -co compress=lzw -compute_edges -z $2 $1 $1_hillshade_$2.tif; 166 | } 167 | for i in $(seq 02 19); 168 | do hillshade output/subset.tif $(($i * 2)); 169 | done 170 | 171 | # Compose montage 172 | unset arguments; 173 | for f in output/subset.tif_hillshade_*; do 174 | var=${f#*_*_*} var=${var%%.*}; 175 | arguments+=(-label "Vertical Exaggeration: $var" "$f"); 176 | done 177 | montage "${arguments[@]}" -tile 2x -geometry 480 imgs/hillshade_montage.jpg 178 | 179 | # View output 180 | open imgs/hillshade_montage.jpg 181 | ``` 182 | 183 | Using `-co compress=lzw` and `-compute_edges` to [compress the TIFF and avoid a black pixel border around the edge of the image](https://www.mapbox.com/tilemill/docs/guides/terrain-data/#creating-hillshades), respectively. 184 | 185 | ![Hillshade Tests](imgs/hillshade_montage.jpg) 186 | 187 | At first blush, it appears that a vertical exaggeration of `30` best illustrates the elevation model (looking at the inland regions), but notice that much of the coastal region (such as the [Viosca Knoll area](http://soundwaves.usgs.gov/2011/03/DeepF1sm2LG.jpg)) lies within shadows. It is important to ensure that variation amongst features is not obscured. Bringing out subtle variances can be done with well-chosen thresholds during the conversion to monochrome levels. 188 | 189 | _Idea: Adjust altitude as zoom level surpasses natural resolution to minimize objects entirely within shade_ 190 | 191 | ### 5. Convert to Monochrome Levels 192 | 193 | You're going to want to produce ~4 monochromatic layers representing varying depths. To do so, we'll use ImageMagick's [threshold](http://www.imagemagick.org/script/command-line-options.php#threshold) utility. 194 | 195 | #### 5a.
Threshold 196 | 197 | ``` bash 198 | # Set chosen hillshade value 199 | hillshade_val=2; 200 | 201 | # Generate monochrome values 202 | monochrome () { 203 | echo "convert $1 -threshold $2% $1_monochrome_$2.tif"; 204 | convert $1 -threshold $2% $1_monochrome_$2.tif; 205 | } 206 | for i in $(seq 02 19); 207 | do monochrome output/subset.tif_hillshade_${hillshade_val}.tif $(($i * 5)); 208 | done 209 | 210 | # Compose montage 211 | unset arguments; 212 | for f in output/subset.tif_hillshade_${hillshade_val}.tif_monochrome_*; do 213 | var=${f#*_*_*_*_} var=${var%%.*}; 214 | arguments+=(-label "Threshold: $var" "$f"); 215 | done 216 | montage "${arguments[@]}" -tile 2x -geometry 480 imgs/monochrome_montage.jpg 217 | 218 | # View output 219 | open imgs/monochrome_montage.jpg 220 | ``` 221 | 222 | This command will likely emit some `Unknown field with tag ...` warnings at runtime. This is because ImageMagick is not geo-aware. As such, geo fields are not copied to the new images produced. We'll reapply this data later. 223 | 224 | ![Monochrome Tests](imgs/monochrome_montage.jpg) 225 | 226 | I started with an equally distributed range of thresholds (`20 40 60 80`), and fine-tuned from there based on aesthetics. A lot can be gained from spending some time experimenting with varying thresholds. Initially, almost all inland mountains were illustrated only by their peaks. Eventually, I found that a threshold of `75` illustrated plenty of the inland features. 227 | 228 | 229 | #### 5b. Merge 230 | 231 | To get a visualization of the output, merge the images into a single image: 232 | 233 | ``` bash 234 | hillshade_val=2; 235 | convert output/subset.tif_hillshade_${hillshade_val}.tif_monochrome_30.tif \ 236 | output/subset.tif_hillshade_${hillshade_val}.tif_monochrome_50.tif \ 237 | output/subset.tif_hillshade_${hillshade_val}.tif_monochrome_70.tif \ 238 | output/subset.tif_hillshade_${hillshade_val}.tif_monochrome_80.tif \ 239 | -evaluate-sequence mean \ 240 | output/subset.tif_hillshade_monochrome_combined.gif 241 | open output/subset.tif_hillshade_monochrome_combined.gif 242 | ``` 243 | 244 | ![merged monochrome](imgs/subset.tif_hillshade_monochrome_combined.gif) 245 | 246 | Oddly, this command failed when attempting to create a TIFF. Instead, I created a GIF output (I initially used JPEG, but opted for GIF due to its smaller file size and lack of artifacts). 247 | 248 | ![Artifacts](imgs/artifact_montage.png) 249 | 250 | #### 5c. Re-apply geodata 251 | 252 | Since ImageMagick stripped the imagery of its geospatial attributes, we'll need to reapply them somehow. Luckily, GDAL knows to look for a matching .tfw if it sees a TIFF that isn’t internally georeferenced. Since we haven't actually changed any spatial information regarding the imagery, we can take the spatial information from the original `subset.tif` and name it to match the new `subset.tif_hillshade_monochrome_combined.tfw`: 253 | 254 | ``` bash 255 | listgeo -tfw output/subset.tif 256 | mv output/subset.tfw output/subset.tif_hillshade_monochrome_combined.tfw 257 | gdal_translate output/subset.tif_hillshade_monochrome_combined.gif output/subset.tif_hillshade_monochrome_combined.tif -a_srs EPSG:3857 258 | ``` 259 | 260 | 261 | ### 6.
Polygonize / Load into DB 262 | 263 | Simplify to remove pixelization: 264 | 265 | ``` bash 266 | gdal_polygonize.py output/subset.tif_hillshade_monochrome_combined.tif -f GeoJSON output/subset.tif_hillshade_monochrome_combined.tif_polygons.geojson 267 | ``` 268 | 269 | ![Polygonized Data viewed in QGIS](imgs/polygonized.png) 270 | 271 | _Note: Should we then dissolve/aggregate polygons between subsets that share borders? [[1]](http://gis.stackexchange.com/questions/85028/dissolve-aggregate-polygons-with-ogr2ogr-or-gpc)_ 272 | 273 | #### Tuning DB 274 | 275 | We'll want to add some helpers to our DB and to tune our DB to ensure that it's convenient and performant to use. 276 | 277 | #####PostGIS Vector Tile Utils 278 | 279 | TODO: Document using [postgis-vt-util](https://github.com/mapbox/postgis-vt-util) 280 | 281 | ```bash 282 | psql -U -d -f postgis-vt-util.sql 283 | ``` 284 | 285 | ##### Indexing 286 | 287 | For more information, see the [GiST Indexes](http://postgis.net/docs/using_postgis_dbmanagement.html#gist_indexes) section of the PostGIS Manual. 288 | 289 | ```sql 290 | CREATE INDEX ON bathy USING gist (wkb_geometry) WHERE zoom = 1; 291 | CREATE INDEX ON bathy USING gist (wkb_geometry) WHERE zoom = 2; 292 | CREATE INDEX ON bathy USING gist (wkb_geometry) WHERE zoom = 3; 293 | CREATE INDEX ON bathy USING gist (wkb_geometry) WHERE zoom = 4; 294 | CREATE INDEX ON bathy USING gist (wkb_geometry) WHERE zoom = 5; 295 | CREATE INDEX ON bathy USING gist (wkb_geometry) WHERE zoom = 6; 296 | VACUUM ANALYZE; 297 | ``` 298 | 299 | 300 | ## II. Generate Tiles 301 | 302 | [PostGIS Manual](https://www.mapbox.com/guides/postgis-manual/) 303 | [PostGIS Vector Tile Utils](https://github.com/mapbox/postgis-vt-util) 304 | 305 | ### Hosting 306 | 307 | [Download and source Openstack RC](http://docs.openstack.org/cli-reference/content/cli_openrc.html) 308 | 309 | ``` 310 | swift upload ocean -H 'Content-Encoding: gzip' -H 'Content-Type: application/x-protobuf' -H 'X-Container-Meta-Access-Control-Allow-Origin: *' * 311 | swift post ocean -H 'X-Container-Meta-Access-Control-Allow-Origin: *' 312 | ``` 313 | 314 | ## Extras: 315 | _TODO: Explain more about gazeteer_ 316 | Load [labels](http://www.ngdc.noaa.gov/gazetteer/) 317 | 318 | ``` bash 319 | export PGCLIENTENCODING=LATIN1 320 | insert () { 321 | ogr2ogr -t_srs EPSG:3857 -f PostgreSQL PG:dbname=ocean-tiles data/features/features-$1.shp -nln $1-features -lco OVERWRITE=YES $2 $3; 322 | } 323 | 324 | insert point 325 | insert multipoint 326 | insert polygon 327 | insert multipolygon -nlt MultiPolygon 328 | insert linestring 329 | insert multilinestring -nlt MultiLineString 330 | ``` 331 | 332 | General project req's: 333 | 334 | ```bash 335 | sudo apt-get update 336 | sudo apt-get install \ 337 | htop byobu \ 338 | git \ 339 | gdal libgdal-dev geotiff-bin \ 340 | imagemagick \ 341 | postgresql postgis \ 342 | nginx \ 343 | nodejs npm \ 344 | python3 python3-pip 345 | 346 | ``` 347 | 348 | ## Notes 349 | 350 | ### Resources 351 | 352 | [Mapbox: Processing Landsat 8 Using Open-Source Tools](https://www.mapbox.com/blog/processing-landsat-8/) 353 | 354 | _Note: Gridded output of images were created with:_ 355 | 356 | ``` bash 357 | montage -label '%f' subset.tif_hillshade_* -tile 2x -geometry '480x320' montage.jpg 358 | ``` 359 | 360 | _Advanced labeling:_ 361 | 362 | ``` bash 363 | unset arguments 364 | for f in subset.tif_hillshade_.0001.tif_monochrome_*; do 365 | var=${f#*_*_*_*_} var=${var%%.*}; 366 | arguments+=(-label "Threshold: $var" "$f"); 367 | 
done 368 | montage "${arguments[@]}" -tile 2x -geometry 480 ../../imgs/monochrome_montage.jpg 369 | ``` 370 | 371 | How do I make my own glyph for custom fonts? See [node-fontnik](https://github.com/mapbox/node-fontnik/) 372 | -------------------------------------------------------------------------------- /tile-generator/main.py: -------------------------------------------------------------------------------- 1 | import math 2 | import os 3 | import subprocess 4 | import sys 5 | import tempfile 6 | 7 | import rasterio 8 | 9 | # TODO: 10 | # - Utilize Celery 11 | # - Start logging to file 12 | 13 | def run_cmd(cmd, verbosity): 14 | if verbosity > 1: 15 | print(cmd) 16 | subprocess.check_call(cmd, shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) 17 | 18 | 19 | def task(file_path, db_name, table_name, zoom, magnifier, 20 | col, row, src_width, src_height, num_rows, tile_buffer, 21 | clipfile_path, vert_exag, thresholds, 22 | contour_interval, contour_table, 23 | verbosity, pause, copy_output_dir=None): 24 | """ 25 | General steps: 26 | - Subset file 27 | - Contour data 28 | - Hillshade data 29 | - Monochrome data 30 | - Polygonize 31 | - Smooth 32 | """ 33 | # TODO: 34 | # - Downsample data for lower zoom-levels 35 | # - Vertical exageration correction on different zoom levels? 36 | with tempfile.TemporaryDirectory() as tmpdir: 37 | tile_size = 256 38 | magnified_tile_size = 256 * magnifier 39 | tile_buffer = tile_buffer * magnifier 40 | width = magnified_tile_size + (2 * tile_buffer) 41 | height = magnified_tile_size + (2 * tile_buffer) 42 | # TODO: If x or y is negative, handle getting selection from other 43 | # side of image and joining. 44 | subset_width = src_width / num_rows 45 | subset_height = src_height / num_rows 46 | x = col * subset_width - (subset_width / tile_size * tile_buffer) 47 | y = row * subset_height - (subset_height / tile_size * tile_buffer) 48 | 49 | # Subset data 50 | subset_path = os.path.join(tmpdir, "subset.tif".format(x='%05d' % x, y='%05d' % y)) 51 | cmd = "gdal_translate -srcwin {x} {y} {width} {height} -of GTIFF {input} {output}" 52 | cmd = cmd.format(x=x, y=y, width=src_width / num_rows, height=src_height / num_rows, input=file_path, output=subset_path) 53 | run_cmd(cmd, verbosity) 54 | 55 | # Resample data 56 | if magnified_tile_size < src_width: 57 | resampled_path = os.path.join(tmpdir, 'resampled.tif') 58 | cmd = "gdalwarp -ts {tile_size} {tile_size} {input} {output}" 59 | cmd = cmd.format(tile_size=magnified_tile_size, input=subset_path, output=resampled_path) 60 | run_cmd(cmd, verbosity) 61 | subset_path = resampled_path 62 | 63 | # Contour data 64 | contour_path = "{input}.contour.geojson".format(input=subset_path) 65 | cmd = "gdal_contour -a elev {input} -f GeoJSON {output} -i {interval}" 66 | cmd = cmd.format(input=subset_path, output=contour_path, interval=contour_interval) 67 | run_cmd(cmd, verbosity) 68 | cmd = "ogr2ogr -f PostgreSQL PG:dbname={db_name} {input} -append -nln {contour_table} -simplify 1000" 69 | cmd = cmd.format(input=contour_path, db_name=db_name, interval=contour_interval, contour_table=contour_table) 70 | run_cmd(cmd, verbosity) 71 | 72 | # Clip data 73 | # if clipfile_path: 74 | # clipfile_subset_path = "{tmpdir}/clipfile_subset.shp".format(tmpdir=tmpdir) 75 | # cmd = "ogr2ogr -f \"ESRI Shapefile\" {clipfile_subset_path} {clipfile_path} -clipsrc {x_min} {y_min} {x_max} {y_max}" 76 | # cmd = cmd.format( 77 | # clipfile_subset_path=clipfile_subset_path, clipfile_path=clipfile_path, 78 | # x_min=x, y_min=y, 
x_max=(x + width), y_max=(y + height) 79 | # ) 80 | # if verbosity > 1: 81 | # print(cmd) 82 | # subprocess.check_call(cmd, shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) 83 | # clipped_subset_path = "{tmpdir}/subset_clipped.tif".format(tmpdir=tmpdir) 84 | # cmd = "gdalwarp -cutline {clipfile_subset_path} {input} {output}" 85 | # cmd = cmd.format( 86 | # clipfile_subset_path=clipfile_subset_path, 87 | # input=subset_path, output=clipped_subset_path 88 | # ) 89 | # if verbosity > 1: 90 | # print(cmd) 91 | # subprocess.check_call(cmd, shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) 92 | # subset_path = clipped_subset_path 93 | 94 | # Hillshade data 95 | hillshade_path = "{}_hillshade_{}.tif".format(subset_path, vert_exag) 96 | cmd = "gdaldem hillshade -co compress=lzw -compute_edges -z {vert} {input} {output}" 97 | cmd = cmd.format(vert=vert_exag, input=subset_path, output=hillshade_path) 98 | run_cmd(cmd, verbosity) 99 | 100 | # Thresholds 101 | threshold_base = "{}_threshold".format(hillshade_path) 102 | threshold_path_tmplt = threshold_base + "_threshold_{threshold}.tif" 103 | for threshold in thresholds: 104 | threshold_path = threshold_path_tmplt.format(threshold=threshold) 105 | cmd = "convert {input} -threshold {threshold}% {output}" 106 | cmd = cmd.format(input=hillshade_path, output=threshold_path, threshold=threshold) 107 | if verbosity > 1: 108 | print(cmd) 109 | subprocess.check_call(cmd, shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) 110 | threshold_paths = " ".join([threshold_path_tmplt.format(threshold=threshold) for threshold in thresholds]) 111 | 112 | # Combine thresholds 113 | combined_path = "{}_combined.gif".format(hillshade_path) 114 | cmd = "convert {threshold_paths} -evaluate-sequence mean -transparent 'rgb(153,153,153)' {output}" 115 | cmd = cmd.format(threshold_paths=threshold_paths, output=combined_path) 116 | run_cmd(cmd, verbosity) 117 | 118 | # Convert 119 | combined_tif_path = "{}.tif".format('.'.join(combined_path.split('.')[:-1])) 120 | cmd = "convert {input} {output}".format(input=combined_path, output=combined_tif_path) 121 | run_cmd(cmd, verbosity) 122 | 123 | # Re-apply geodata 124 | if verbosity > 1: 125 | print(cmd) 126 | cmd = "listgeo -tfw {}".format(subset_path) 127 | subprocess.check_call(cmd, shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) 128 | cmd = "mv {input}.tfw {output}.tfw" 129 | cmd = cmd.format( 130 | input='.'.join(subset_path.split('.')[:-1]), 131 | output='.'.join(combined_tif_path.split('.')[:-1]) 132 | ) 133 | run_cmd(cmd, verbosity) 134 | 135 | # Polygonize 136 | output_name = 'out' 137 | shp_path = os.path.join(tmpdir, "{}.shp".format(output_name)) 138 | cmd = "gdal_polygonize.py {input} -f 'ESRI Shapefile' {output} foo value" 139 | cmd = cmd.format(input=combined_tif_path, output=shp_path) 140 | run_cmd(cmd, verbosity) 141 | 142 | # Add Zoom information 143 | cmd = "ogrinfo {shapefile} -sql \"ALTER TABLE {layer_name} ADD COLUMN zoom integer(50);\"" 144 | cmd = cmd.format(shapefile=shp_path, layer_name=output_name) 145 | run_cmd(cmd, verbosity) 146 | cmd = "ogrinfo {shapefile} -dialect SQLite -sql \"UPDATE {layer_name} SET zoom = {zoom};\"" 147 | cmd = cmd.format(shapefile=shp_path, layer_name=output_name, zoom=zoom) 148 | run_cmd(cmd, verbosity) 149 | 150 | # Copy output to location 151 | # if copy_output_dir: 152 | # cmd = 'cp {input} {copy_output_dir}/{x}_{y}.tif' 153 | # cmd = cmd.format(input=shp_path, copy_output_dir=copy_output_dir, x='%05d' % x, 
y='%05d' % y), 154 | # subprocess.check_call(cmd, shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) 155 | 156 | # Smooth polygons 157 | cmd = "ogr2ogr -f PostgreSQL PG:dbname={db_name} {input} -append -nln {table_name} -simplify 1000" 158 | cmd = cmd.format(input=shp_path, db_name=db_name, table_name=table_name) 159 | run_cmd(cmd, verbosity) 160 | 161 | if pause: 162 | input(tmpdir) 163 | 164 | return "{} - {}".format(col, row) 165 | 166 | 167 | def scheduler(clear_tables=False, celery=False, **kwargs): 168 | with rasterio.drivers(): 169 | with rasterio.open(kwargs['file_path']) as src: 170 | width = src.width 171 | height = src.height 172 | 173 | # Prepare threading tooling 174 | if not celery: 175 | from queue import Queue 176 | q = Queue() 177 | 178 | table_name = kwargs.pop('bathy_table') 179 | if clear_tables: 180 | cmd = 'psql {db_name} -c "DROP TABLE IF EXISTS \\"{table_name}\\""' 181 | cmd = cmd.format(db_name=kwargs['db_name'], table_name=table_name) 182 | subprocess.check_call(cmd, shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) 183 | cmd = 'psql {db_name} -c "DROP TABLE IF EXISTS \\"{table_name}\\""' 184 | cmd = cmd.format(db_name=kwargs['db_name'], table_name=kwargs['contour_table']) 185 | subprocess.check_call(cmd, shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) 186 | 187 | # Queue work 188 | for z in range(kwargs.pop('min_zoom'), kwargs.pop('max_zoom')+1): 189 | num_rows = int(math.pow(2, z)) 190 | 191 | for col in range(0, num_rows): 192 | for row in range(0, num_rows): 193 | task_kwargs = kwargs.copy() 194 | task_kwargs.update({ 195 | 'table_name': table_name, 196 | 'zoom': z, 197 | 'num_rows': num_rows, 198 | 'src_width': width, 199 | 'src_height': height, 200 | 'col': col, 201 | 'row': row, 202 | 'thresholds': (20, 50, 70, 80, 90), 203 | 'contour_interval': 1000, 204 | }) 205 | 206 | # Insert into Thread queue 207 | if not celery: 208 | q.put(task_kwargs) 209 | # Insert into Celery queue 210 | else: 211 | task(**task_kwargs) 212 | 213 | if not kwargs['verbosity']: 214 | tile_num = (row + (col*num_rows) + 1) 215 | total = int(math.pow(num_rows, 2)) 216 | percentage = '%.4f' % (tile_num*100.0 / total) 217 | status = 'processed' if not celery else 'scheduled' 218 | msg = "\r{}% {} ({}/{})".format(percentage, status, tile_num, total) 219 | print(msg, end="") 220 | 221 | # Process queue 222 | if not celery: 223 | from concurrent.futures import ThreadPoolExecutor as Pool 224 | import multiprocessing 225 | 226 | def counter(future): 227 | counter.processed += 1 228 | percentage = '%.4f' % (counter.processed*100.0 / counter.to_process) 229 | msg = "\r{}% processed ({}/{})".format(percentage, counter.processed, counter.to_process) 230 | print(msg, end="") 231 | counter.processed = 0 232 | counter.to_process = int(math.pow(num_rows, 2)) 233 | 234 | with Pool(max_workers=multiprocessing.cpu_count() * 2) as executor: 235 | while not q.empty(): 236 | kwargs = q.get() 237 | executor.submit(task, **kwargs).add_done_callback(counter) 238 | 239 | print("\nZoom level {} complete".format(z)) 240 | 241 | 242 | if __name__ == '__main__': 243 | 244 | import argparse 245 | 246 | parser = argparse.ArgumentParser( 247 | description="Concurrent raster processing demo", 248 | formatter_class=argparse.ArgumentDefaultsHelpFormatter 249 | ) 250 | parser.add_argument( 251 | 'file_path', 252 | metavar='INPUT', 253 | help="Input file name") 254 | parser.add_argument( 255 | '--vert-exag', '-vx', 256 | default=20, 257 | help="Vertical exaggeration") 258 | 
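# The tile buffer below pads each processed subset on every side (scaled by
# --magnifier) so that edge artifacts introduced while subsetting, hillshading,
# and polygonizing can fall outside the tile that is ultimately rendered; see
# the clipping-buffer discussion in the top-level readme.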
parser.add_argument( 259 | '--tile-buffer', 260 | default=8, 261 | type=int, 262 | help="Tile buffer (0 is no buffer)") 263 | parser.add_argument( 264 | '--clipfile', 265 | dest='clipfile_path', 266 | metavar='CLIPFILE_PATH', 267 | help="Clipping Shapefile") 268 | parser.add_argument( 269 | '--verbose', '-v', 270 | dest='verbosity', 271 | default=0, 272 | action='count') 273 | parser.add_argument( 274 | '--pause', '-p', 275 | action='store_true', 276 | default=False, 277 | help=('Wait for input after each tile is generated, before ' 278 | 'temporary directory is removed. For development/testing ' 279 | 'purposes.')) 280 | parser.add_argument( 281 | '--db-name', '-db', 282 | dest='db_name', 283 | metavar='DB_NAME', 284 | default='ocean-tiles', 285 | help="Output Database") 286 | parser.add_argument( 287 | '--contour-table', 288 | default='contour', 289 | help="Contour table name") 290 | parser.add_argument( 291 | '--bathy-table', 292 | default='bathy', 293 | help="Bathy table name") 294 | parser.add_argument( 295 | '--clear-tables', 296 | action='store_true', 297 | default=False, 298 | help="Clear destination tables before creating tiles") 299 | parser.add_argument( 300 | '--celery', 301 | action='store_true', 302 | default=False, 303 | help="Schedule tasks with Celery") 304 | parser.add_argument( 305 | '--copy-output-dir', '-out', 306 | default=None, 307 | help="Copy outputted files to dir") 308 | parser.add_argument( 309 | '--min-zoom', '-minz', 310 | type=int, 311 | default=0, 312 | help="Lowest zoom level") 313 | parser.add_argument( 314 | '--max-zoom', '-maxz', 315 | type=int, 316 | default=6, 317 | help="Highest zoom level") 318 | parser.add_argument( 319 | '--magnifier', '-q', 320 | type=int, 321 | default=4, 322 | help="Ratio between tile used for data processing and output tile (higher means more features in tile)") 323 | 324 | args = parser.parse_args() 325 | 326 | try: 327 | scheduler(**args.__dict__) 328 | except KeyboardInterrupt: 329 | print("Received exit signal, shutting down...") 330 | sys.exit() 331 | -------------------------------------------------------------------------------- /tile-generator/requirements.txt: -------------------------------------------------------------------------------- 1 | Fiona==1.5.1 2 | rasterio==0.24.0 3 | Wand==0.4.0 -------------------------------------------------------------------------------- /tile-server/.gitignore: -------------------------------------------------------------------------------- 1 | node_modules 2 | -------------------------------------------------------------------------------- /tile-server/mapbox_studio.tm2source/.thumb.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/alukach/vector-ocean/90a2d6754179ad69707753751da4d1304606dcb4/tile-server/mapbox_studio.tm2source/.thumb.png -------------------------------------------------------------------------------- /tile-server/mapbox_studio.tm2source/data.yml: -------------------------------------------------------------------------------- 1 | _prefs: 2 | disabled: 3 | - linestring-features 4 | - multipolygon-features 5 | - polygon-features 6 | - multipoint-features 7 | - point-feature 8 | - contours 9 | - water 10 | - multilinestring-features 11 | inspector: false 12 | mapid: '' 13 | rev: '' 14 | saveCenter: true 15 | attribution: '' 16 | center: 17 | - -2.1094 18 | - -45.8288 19 | - 1 20 | description: '' 21 | Layer: 22 | - id: water 23 | Datasource: 24 | dbname: ocean-tiles 25 | extent: 
-20037508.34,-20037508.34,20037508.34,20037508.34 26 | geometry_field: '' 27 | geometry_table: '' 28 | host: '' 29 | key_field: '' 30 | max_size: 512 31 | password: '' 32 | port: '' 33 | table: |- 34 | ( 35 | SELECT * FROM "water" 36 | WHERE /*Z(!scale_denominator!) >= 7 37 | AND*/ wkb_geometry && !bbox! 38 | ) as data 39 | type: postgis 40 | user: '' 41 | description: Water 42 | fields: 43 | fid: Number 44 | ogc_fid: Number 45 | properties: 46 | "buffer-size": 8 47 | srs: +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0.0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs +over 48 | - id: terrain 49 | Datasource: 50 | dbname: test 51 | extent: -20037508.34,-20037508.34,20037508.34,20037508.34 52 | geometry_field: '' 53 | geometry_table: '' 54 | host: '' 55 | key_field: '' 56 | max_size: 512 57 | password: '' 58 | port: '' 59 | table: |- 60 | ( 61 | SELECT * FROM "bathy" 62 | 63 | WHERE zoom=Z(!scale_denominator!) 64 | /*AND wkb_geometry && !bbox!*/ 65 | 66 | ) AS data 67 | type: postgis 68 | user: '' 69 | description: Ocean Data 70 | fields: 71 | ogc_fid: Number 72 | value: Number 73 | zoom: String 74 | properties: 75 | "buffer-size": 8 76 | srs: +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0.0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs +over 77 | - id: contours 78 | Datasource: 79 | dbname: ocean-tiles 80 | extent: -20037508.34,-20037508.34,20037508.34,20037508.34 81 | geometry_field: '' 82 | geometry_table: '' 83 | host: '' 84 | key_field: '' 85 | max_size: 512 86 | password: '' 87 | port: '' 88 | table: |- 89 | ( 90 | SELECT * FROM "contour" 91 | WHERE Z(!scale_denominator!) >= 5 92 | AND wkb_geometry && !bbox! 93 | ) AS data 94 | type: postgis 95 | user: '' 96 | description: Contours 97 | fields: 98 | id: Number 99 | elev: Number 100 | ogc_fid: Number 101 | properties: 102 | "buffer-size": 8 103 | srs: +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0.0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs +over 104 | - id: point-feature 105 | Datasource: 106 | dbname: ocean-tiles 107 | extent: -20037508.34,-20037508.34,20037508.34,20037508.34 108 | geometry_field: '' 109 | geometry_table: '' 110 | host: '' 111 | key_field: '' 112 | max_size: 512 113 | password: '' 114 | port: '' 115 | table: | 116 | ( 117 | SELECT * FROM "point_features" 118 | WHERE Z(!scale_denominator!) >= 2 119 | AND wkb_geometry && !bbox! 120 | ) as data 121 | type: postgis 122 | user: '' 123 | description: '' 124 | fields: 125 | comments: String 126 | discoverer: String 127 | discovery_: String 128 | history: String 129 | meeting: String 130 | name: String 131 | ogc_fid: Number 132 | proposal_y: String 133 | proposer: String 134 | type: String 135 | properties: 136 | "buffer-size": 8 137 | srs: +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0.0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs +over 138 | - id: multipoint-features 139 | Datasource: 140 | dbname: ocean-tiles 141 | extent: -20037508.34,-20037508.34,20037508.34,20037508.34 142 | geometry_field: '' 143 | geometry_table: '' 144 | host: '' 145 | key_field: '' 146 | max_size: 512 147 | password: '' 148 | port: '' 149 | table: |- 150 | ( 151 | SELECT * FROM "multipoint_features" 152 | WHERE Z(!scale_denominator!) >= 2 153 | AND wkb_geometry && !bbox! 
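-- Z() here is presumably the zoom helper installed from postgis-vt-util (see the
-- "Tuning DB" section of the top-level readme): it converts Mapnik's
-- !scale_denominator! token into a zoom level. The && !bbox! test limits the
-- query to the requested tile's bounding box so a GiST index on wkb_geometry
-- can be used.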
154 | ) as data 155 | type: postgis 156 | user: '' 157 | description: '' 158 | fields: 159 | comments: String 160 | discoverer: String 161 | discovery_: String 162 | history: String 163 | meeting: String 164 | name: String 165 | ogc_fid: Number 166 | proposal_y: String 167 | proposer: String 168 | type: String 169 | properties: 170 | "buffer-size": 8 171 | srs: +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0.0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs +over 172 | - id: polygon-features 173 | Datasource: 174 | dbname: ocean-tiles 175 | extent: -20037508.34,-20037508.34,20037508.34,20037508.34 176 | geometry_field: '' 177 | geometry_table: '' 178 | host: '' 179 | key_field: '' 180 | max_size: 512 181 | password: '' 182 | port: '' 183 | table: |- 184 | ( 185 | SELECT * FROM "polygon_features" 186 | WHERE Z(!scale_denominator!) >= 2 187 | AND wkb_geometry && !bbox! 188 | ) as data 189 | type: postgis 190 | user: '' 191 | description: '' 192 | fields: 193 | comments: String 194 | discoverer: String 195 | discovery_: String 196 | history: String 197 | meeting: String 198 | name: String 199 | ogc_fid: Number 200 | proposal_y: String 201 | proposer: String 202 | type: String 203 | properties: 204 | "buffer-size": 8 205 | srs: +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0.0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs +over 206 | - id: multipolygon-features 207 | Datasource: 208 | dbname: ocean-tiles 209 | extent: -20037508.34,-20037508.34,20037508.34,20037508.34 210 | geometry_field: '' 211 | geometry_table: '' 212 | host: '' 213 | key_field: '' 214 | max_size: 512 215 | password: '' 216 | port: '' 217 | table: |- 218 | ( 219 | SELECT * FROM "multipolygon_features" 220 | WHERE Z(!scale_denominator!) >= 2 221 | AND wkb_geometry && !bbox! 222 | ) as data 223 | type: postgis 224 | user: '' 225 | description: '' 226 | fields: 227 | comments: String 228 | discoverer: String 229 | discovery_: String 230 | history: String 231 | meeting: String 232 | name: String 233 | ogc_fid: Number 234 | proposal_y: String 235 | proposer: String 236 | type: String 237 | properties: 238 | "buffer-size": 8 239 | srs: +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0.0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs +over 240 | - id: linestring-features 241 | Datasource: 242 | dbname: ocean-tiles 243 | extent: -20037508.34,-20037508.34,20037508.34,20037508.34 244 | geometry_field: '' 245 | geometry_table: '' 246 | host: '' 247 | key_field: '' 248 | max_size: 512 249 | password: '' 250 | port: '' 251 | table: |- 252 | ( 253 | SELECT * FROM "linestring_features" 254 | WHERE Z(!scale_denominator!) >= 2 255 | AND wkb_geometry && !bbox! 
256 | ) as data 257 | type: postgis 258 | user: '' 259 | description: '' 260 | fields: 261 | comments: String 262 | discoverer: String 263 | discovery_: String 264 | history: String 265 | meeting: String 266 | name: String 267 | ogc_fid: Number 268 | proposal_y: String 269 | proposer: String 270 | type: String 271 | properties: 272 | "buffer-size": 8 273 | srs: +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0.0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs +over 274 | - id: multilinestring-features 275 | Datasource: 276 | dbname: ocean-tiles 277 | extent: -20037508.34,-20037508.34,20037508.34,20037508.34 278 | geometry_field: '' 279 | geometry_table: '' 280 | host: '' 281 | key_field: '' 282 | max_size: 512 283 | password: '' 284 | port: '' 285 | table: |- 286 | ( 287 | SELECT * FROM "multilinestring_features" 288 | WHERE Z(!scale_denominator!) >= 2 289 | AND wkb_geometry && !bbox! 290 | ) as data 291 | type: postgis 292 | user: '' 293 | description: '' 294 | fields: 295 | fid: Number 296 | ogc_fid: Number 297 | properties: 298 | "buffer-size": 8 299 | srs: +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0.0 +k=1.0 +units=m +nadgrids=@null +wktext +no_defs +over 300 | maxzoom: 10 301 | minzoom: 0 302 | name: Ocean Tiles 303 | -------------------------------------------------------------------------------- /tile-server/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "mapbox-gl-ocean-map-experiment", 3 | "version": "1.0.0", 4 | "description": "An experiment in serving PBF vector tiles", 5 | "main": "server.js", 6 | "dependencies": { 7 | "async": "^1.4.2", 8 | "express": "^4.11.0", 9 | "mbtiles": "^0.7.7", 10 | "minimist": "^1.2.0", 11 | "mkdirp": "^0.5.1", 12 | "tilelive": "^5.5.0", 13 | "tilelive-tmsource": "^0.4.0" 14 | }, 15 | "devDependencies": {}, 16 | "scripts": { 17 | "test": "echo \"Error: no test specified\" && exit 1", 18 | "start": "node server.js" 19 | }, 20 | "author": "alukach", 21 | "license": "ISC" 22 | } 23 | -------------------------------------------------------------------------------- /tile-server/readme.md: -------------------------------------------------------------------------------- 1 | # MapboxGL Ocean Map 2 | 3 | This is a collection of notes taken during the process of experimenting with MapboxGL, PBF Vector Tiles, and ENC data from NOAA. 4 | 5 | ## Links 6 | 7 | * Serving Tiles 8 | * [Stack Exchange: Hosting Mapbox Vector Tiles](http://gis.stackexchange.com/questions/125037/self-hosting-mapbox-vector-tiles) 9 | * MapboxGL General 10 | * [JustinMiller: Anatomy of a travel map](http://justinmiller.io/posts/2015/01/20/anatomy-of-a-travel-map/) 11 | * [Igor Tihonov: Taking mapbox-gl.js for a spin](http://igortihonov.com/2014/10/21/taking-mapbox-gl-for-a-spin/) 12 | 13 | ## Steps 14 | 15 | ### Data 16 | 17 | At it's core, it's about data. You'll need to be able to understand the data that you have and import it into your data store (ie PostGIS). 18 | 19 | Links 20 | * [Processing S57 soundings](http://blog.perrygeo.net/2005/12/03/hello-world/) 21 | * [GDAL S57 Driver](http://www.gdal.org/drv_s57.html) (section about soundings especially important) 22 | 23 | 24 | 25 | 1. Import the data: 26 | 27 | Below is an example of importing only the `SOUNDG` layer (soundings) of a single S57 file. 
28 | ``` bash 29 | export OGR_S57_OPTIONS="RETURN_PRIMITIVES=ON,RETURN_LINKAGES=ON,LNAM_REFS=ON,SPLIT_MULTIPOINT=ON,ADD_SOUNDG_DEPTH=ON" 30 | ogr2ogr \ 31 | -append \ 32 | -nlt POINT25D \ 33 | -f PostgreSQL PG:"dbname=s57 user=anthony" \ 34 | US5WA70M.000 SOUNDG 35 | ``` 36 | About the options: 37 | 38 | ``` bash 39 | -update # Open existing output datasource in update mode rather than trying to create a new one 40 | -append # Append to existing layer instead of creating new 41 | -nlt type: 42 | Define the geometry type for the created layer. One of NONE, GEOMETRY, POINT, LINESTRING, 43 | POLYGON, GEOMETRYCOLLECTION, MULTIPOINT, MULTIPOLYGON or MULTILINESTRING. Add '25D' to the name 44 | to get 2.5D versions. 45 | ``` 46 | Below is an example of iterating through a collection of files and importing just the `SOUNDG` layers. 47 | ``` bash 48 | export OGR_S57_OPTIONS="RETURN_PRIMITIVES=ON,RETURN_LINKAGES=ON,LNAM_REFS=ON,SPLIT_MULTIPOINT=ON,ADD_SOUNDG_DEPTH=ON" 49 | 50 | find . -name "US*.000" -exec ogr2ogr -append -nlt POINT25D -f PostgreSQL PG:"dbname=s57 user=anthony" {} SOUNDG \; 51 | ``` 52 | Below is an example of importing an entire S57 file. 53 | ``` bash 54 | export OGR_S57_OPTIONS="RETURN_PRIMITIVES=ON,RETURN_LINKAGES=ON,LNAM_REFS=ON,SPLIT_MULTIPOINT=ON,ADD_SOUNDG_DEPTH=ON" 55 | 56 | ogr2ogr -append -f PostgreSQL PG:"dbname=s57_2 user=anthony" US1HA02M.000 -skipfailures \; 57 | ``` 58 | Tying it all together, below is an example of searching recursively through a collection of S57 files and importing all layers into PostGIS. 59 | ``` bash 60 | export OGR_S57_OPTIONS="RETURN_PRIMITIVES=ON,RETURN_LINKAGES=ON,LNAM_REFS=ON,SPLIT_MULTIPOINT=ON,ADD_SOUNDG_DEPTH=ON" 61 | 62 | find . -name "US*.000" -exec ogr2ogr -append -f PostgreSQL PG:"dbname=enc user=anthony" {} -skipfailures \; 63 | ``` 64 | 65 | ### Visualization 66 | 67 | 1. Set up a basic page after reading [this tutorial](https://gist.github.com/mapmeld/8866414b7fc8940e8540). `index0.html` 68 | 1. How do you serve it? [Implementations](https://github.com/mapbox/vector-tile-spec/wiki/Implementations), [node-mapnik](https://github.com/mapnik/node-mapnik), [tutorial](http://www.sparkgeo.com/labs/big/), [mapnik vectortile api](https://github.com/mapnik/node-mapnik/blob/master/docs/VectorTile.md), [vector-tile-server](https://github.com/artemp/vector-tile-server) 69 | 70 | ### Mapbox Studio 71 | 72 | 1. Create an input layer: [adding-layers](https://www.mapbox.com/tilemill/docs/manual/adding-layers/), [quick-start](https://www.mapbox.com/mapbox-studio/source-quickstart/) 73 | 74 | ``` bash 75 | ( 76 | SELECT 77 | depth, 78 | scamin, 79 | scamax, 80 | sorind, 81 | sordat, 82 | wkb_geometry 83 | FROM soundg 84 | ) as data 85 | ``` 86 | 1. Annotate data (see [S-57 info](http://www.caris.com/S-57/frames/S57catalog.htm), [GDAL S-57](http://gdal.org/1.11/ogr/drv_s57.html)) 87 | 1. Build scale system. 
[source](http://msdn.microsoft.com/en-us/library/bb259689.aspx) 88 | 89 | ``` python 90 | scale = lambda x: 295829355.45 / (2**(x-1)) 91 | for i in xrange(1, 23): 92 | rule = "[zoom>={zoom}][scamin>{scale}]," 93 | print rule.format(zoom=i, scale=scale(i)) 94 | ``` 95 | 96 | 97 | 98 | ## Additional Links 99 | 100 | ### Makesurface Tools 101 | 102 | - [Mapbox: Atmospheric river](https://www.mapbox.com/blog/atmospheric-river/) 103 | - [Using Marine Gravity Anomaly Map with Mapbox GL 104 | ](http://mpetroff.net/2014/10/marine-gravity-mapbox-gl/) 105 | -------------------------------------------------------------------------------- /tile-server/server.js: -------------------------------------------------------------------------------- 1 | var express = require('express'); 2 | var http = require('http'); 3 | var tilelive = require('tilelive'); 4 | var fs = require('fs'); 5 | var mkdirp = require('mkdirp'); 6 | var path = require('path'); 7 | var argv = require('minimist')(process.argv.slice(2)); 8 | 9 | require('mbtiles').registerProtocols(tilelive); 10 | require('tilelive-tmsource')(tilelive); 11 | 12 | var app = express(); 13 | tilelive.load('tmsource://./mapbox_studio.tm2source', function(err, source) { 14 | 15 | if (err) { 16 | throw err; 17 | } 18 | app.set('port', 7777); 19 | 20 | app.use(function(req, res, next) { 21 | res.header("Access-Control-Allow-Origin", "*"); 22 | res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept"); 23 | next(); 24 | }); 25 | 26 | app.get(/^\/tiles\/(\d+)\/(\d+)\/(\d+).pbf$/, function(req, res){ 27 | 28 | var z = req.params[0]; 29 | var x = req.params[1]; 30 | var y = req.params[2]; 31 | var cache_path = ''; 32 | 33 | if (argv.cachedir) { 34 | cache_path = path.join(argv.cachedir, "{z}/{x}/{y}.pbf"); 35 | cache_path = cache_path.replace("{z}", z).replace("{x}", x).replace("{y}", y); 36 | } 37 | 38 | // Look for file in cache 39 | fs.stat(cache_path, function(err, stat) { 40 | // Serve from cache 41 | if (stat) { 42 | console.log('get tile %d, %d, %d from cache', z, x, y); 43 | res.header('Content-Type', 'application/x-protobuf'); 44 | res.header('Content-Encoding', 'gzip'); 45 | fs.createReadStream(cache_path).pipe(res); 46 | // Not in cache, get from DB 47 | } else { 48 | console.log('get tile %d, %d, %d from db', z, x, y); 49 | source.getTile(z, x, y, function(err, tile, headers) { 50 | if (err) { 51 | res.status(404); 52 | res.send(err.message); 53 | console.log(err.message); 54 | } else { 55 | res.set(headers); 56 | res.send(tile); 57 | 58 | // Cache to disk 59 | if (argv.cachedir) { 60 | mkdirp(path.dirname(cache_path), function (err) { 61 | if (err) return console.error(err); 62 | fs.writeFile(cache_path, tile, function(err) { 63 | if(err) return console.log(err); 64 | console.log('wrote %d, %d, %d to cache', z, x, y); 65 | }); 66 | }); 67 | } 68 | } 69 | }); 70 | } 71 | }); 72 | }); 73 | 74 | http.createServer(app).listen(app.get('port'), function() { 75 | console.log('Express server listening on port ' + app.get('port')); 76 | }); 77 | }); 78 | -------------------------------------------------------------------------------- /tile-uploader/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "tile-uploader", 3 | "version": "1.0.0", 4 | "description": "Access tileserver, uploading returned tiles to Openstack Swift", 5 | "main": "upload.js", 6 | "dependencies": { 7 | "async": "^1.4.2", 8 | "commander": "^2.8.1", 9 | "mbtiles": "^0.7.7", 10 | "minimist": 
"^1.2.0", 11 | "mkdirp": "^0.5.1", 12 | "tilelive": "^5.5.0", 13 | "tilelive-tmsource": "^0.4.0" 14 | }, 15 | "devDependencies": {}, 16 | "scripts": { 17 | "test": "echo \"Error: no test specified\" && exit 1" 18 | }, 19 | "repository": { 20 | "type": "git", 21 | "url": "http://github.com/alukach/vector-ocean" 22 | }, 23 | "author": "Anthony Lukach", 24 | "license": "MIT", 25 | "bugs": { 26 | "url": "https://github.com/alukach/vector-ocean/issues" 27 | }, 28 | "homepage": "https://github.com/alukach/vector-ocean" 29 | } 30 | -------------------------------------------------------------------------------- /tile-uploader/upload.js: -------------------------------------------------------------------------------- 1 | var async = require('async'); 2 | var fs = require('fs'); 3 | var util = require('util'); 4 | var os = require('os'); 5 | var path = require('path'); 6 | var exec = require('child_process').exec; 7 | 8 | var tilelive = require('tilelive'); 9 | var argv = require('minimist')(process.argv.slice(2)); 10 | var mkdirp = require('mkdirp'); 11 | var console = require('tracer').colorConsole(); 12 | 13 | require('mbtiles').registerProtocols(tilelive); 14 | require('tilelive-tmsource')(tilelive); 15 | 16 | var program = require('commander'); 17 | 18 | program 19 | .version(0.1) 20 | .option('-z, --minimum-zoom ', 'Minimum zoom level to process (defaults to 0).', parseInt) 21 | .option('-Z, --maximum-zoom ', 'Maximum zoom level to process (defaults to 6).', parseInt) 22 | .option('-b, --bucket ', 'Bucket in which to upload results (defaults to \'bathy\').') 23 | .option('-c, --concurrency ', 'Number of concurrent tile operations (defaults to 2).', parseInt) 24 | .option('--starting-tile ', 'Starting position of upload process (in the event that you want to start processing from a specific tile)') 25 | .option('--limit ', 'Limit to total number of tiles to be processed (defaults to 0)') 26 | .parse(process.argv); 27 | 28 | var minzoom = program.minimumZoom || 0; 29 | var maxzoom = program.maximumZoom || 6; 30 | var bucket = program.bucket || 'bathy'; 31 | var concurrency = program.concurrency || 2; 32 | var offset = program.startingTile || null; 33 | if (offset) { 34 | var xyz = offset.split('/'); 35 | if (xyz.length != 3) throw "Improperly formatted 'starting-tile' value. 
Must be inf the following format: z/x/y"; 36 | var startZ = Number(xyz[0]); 37 | var startX = Number(xyz[1]); 38 | var startY = Number(xyz[2]); 39 | } 40 | var limit = program.limit || null; 41 | 42 | tilelive.load('tmsource://../tile-server/mapbox_studio.tm2source', function(err, source) { 43 | // Queue Handler 44 | var q = async.queue(tileUploaderTask, 2); 45 | 46 | // Finishing Message 47 | q.drain = function() { 48 | var cmd = util.format('swift post %s -H \'X-Container-Meta-Access-Control-Allow-Origin: *\'', bucket); 49 | exec(cmd); 50 | console.info('all items have been processed'); 51 | return process.exit(); 52 | }; 53 | 54 | // Queue up jobs 55 | function callback (url) { 56 | console.info(util.format('Processed %s', url)); 57 | } 58 | 59 | var zlevels = range(startZ || minzoom, maxzoom + 1); 60 | var count = 0; 61 | 62 | zLoop: 63 | for (var i=0; i < zlevels.length; i++) { 64 | var z = zlevels[i]; 65 | var xlevels = range(startX || 0, Math.pow(2, z)); 66 | startX = null; 67 | 68 | for (var j=0; j < xlevels.length; j++) { 69 | var x = xlevels[j]; 70 | var ylevels = range(startY || 0, Math.pow(2, z)); 71 | startY = null; 72 | 73 | for (var k=0; k < ylevels.length; k++) { 74 | var y = ylevels[k]; 75 | var url = util.format('%s/%s/%s', z, x, y); 76 | console.info("Adding the following to the queue:", url); 77 | q.push({z:z, x:x, y:y}, callback); 78 | 79 | if (limit && (count += 1) >= limit) { 80 | console.warn("Hit tile limit of " + limit + ". Not scheduling any more tiles"); 81 | break zLoop; 82 | } 83 | } 84 | } 85 | } 86 | 87 | // Task 88 | function tileUploaderTask(task, callback) { 89 | var url = util.format('%s/%s/%s', task.z, task.x, task.y); 90 | console.info('Procesessing ' + url); 91 | 92 | source.getTile(task.z, task.x, task.y, handleTile); 93 | 94 | function handleTile(err, tile, headers) { 95 | if (err) { 96 | console.error(url, 'failed!', err); 97 | return callback(url); 98 | } 99 | 100 | // Save to tmpfile 101 | var fname = url + '.pbf'; 102 | var fdir = os.tmpdir(); 103 | var fpath = path.join(fdir, fname); 104 | 105 | mkdirp(path.dirname(fpath), writeFile); 106 | function writeFile(err) { 107 | if (err) throw (err); 108 | fs.writeFile(fpath, tile, uploadAndRmFile); 109 | } 110 | function uploadAndRmFile(err) { 111 | if (err) throw (err); 112 | cmd = 'swift upload ' + bucket + ' -H \'Content-Encoding: gzip\' -H \'Content-Type: application/x-protobuf\' -H \'X-Container-Meta-Access-Control-Allow-Origin: *\' ' + fname; 113 | exec(cmd, {cwd: fdir}, function(error, stdout, stderr) { 114 | if (err || stderr) throw (err || stderr); 115 | fs.unlink(fpath); 116 | }); 117 | } 118 | return callback(url); 119 | } 120 | } 121 | }); 122 | 123 | 124 | // Helpers 125 | 126 | /** 127 | * Generates an array containing arithmetic progressions 128 | * @param {number} start Beginning integer 129 | * @param {number} stop Ending integer, exclusive 130 | * @return {array} Array of numbers between start and stop, exclusive 131 | */ 132 | function range(start, stop) { 133 | return Array.apply(Array, Array(Number(stop) - Number(start))).map( 134 | function (_, i) { return i + Number(start); } 135 | ); 136 | } 137 | -------------------------------------------------------------------------------- /tile-viewer/.gitignore: -------------------------------------------------------------------------------- 1 | app/account.js 2 | node_modules 3 | app/style.json 4 | -------------------------------------------------------------------------------- /tile-viewer/README.md: 
-------------------------------------------------------------------------------- 1 | # [Mapbox GL](https://www.mapbox.com/mapbox-gl/) Code Flow Example 2 | 3 | This repository is a simple example of a development environment for Mapbox GL 4 | styles that is based on editing raw code with a text editor. This style 5 | of editing is similar to the [TileMill](https://www.mapbox.com/tilemill/) 6 | and [Mapbox Studio](https://www.mapbox.com/mapbox-studio/) desktop 7 | tools. 8 | 9 | This example uses [Gulp](http://gulpjs.com/) as the framework for its 10 | basic utilities: 11 | 12 | * A [Mapbox GL JS](https://www.mapbox.com/mapbox-gl-js/)-powered page for 13 | previewing the style 14 | * Support for [JSON](http://json.org/), [JSON5](http://json5.org/), 15 | [YAML](http://yaml.org/), and [TOML](https://github.com/toml-lang/toml) 16 | files as input. You can also specify an executable JavaScript file with a `.js` extension 17 | and a hashbang; the file will be run and its output piped into the style 18 | definition (see the sketch at the end of this readme). 19 | * Live-reloading when the style is changed 20 | 21 | ## Install 22 | 23 | First, clone this git repository: 24 | 25 | git clone https://github.com/mapbox/mapbox-gl-codeflow-example.git 26 | cd mapbox-gl-codeflow-example 27 | 28 | And then install the dependencies: 29 | 30 | npm install 31 | 32 | And then start it up: 33 | 34 | gulp --style=style.json 35 | 36 | You can also specify `style.json5`, `style.yml`, or `style.toml` as your 37 | input. Unlike JSON, these alternative formats support comments, so you 38 | can annotate your styles. Note that this is a one-way conversion, like the conversion 39 | from [CartoCSS](https://www.mapbox.com/tilemill/docs/manual/carto/) 40 | to [Mapnik XML](https://github.com/mapnik/mapnik/wiki/XMLConfigReference) 41 | was. 42 | 43 | ## Example 44 | 45 | This project is an _example_ of one such flow: you could do the same 46 | with a bare-bones Node.js script, with Python, or with any other environment. You could 47 | also skip some of the fancier features, like live reloading, or add others, 48 | like a more complex debugging view.
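For the executable `.js` input mentioned above, here is a minimal sketch (the file name `style.js` and the layer list are purely illustrative; the only real requirements are that the file is executable, begins with a hashbang, and writes a complete style definition to stdout):

    #!/usr/bin/env node
    // Hypothetical example of a generated style: build the style object in
    // code and print it as JSON. When run as `gulp --style=style.js`, gulp
    // executes this file and pipes its stdout into app/style.json.
    var style = {
      version: 8,
      name: 'Generated example',
      sources: {},
      layers: [
        { id: 'background', type: 'background', paint: { 'background-color': '#fff' } }
      ]
    };
    console.log(JSON.stringify(style, null, 2));

Remember to `chmod +x style.js` so that the build can execute it directly.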
49 | -------------------------------------------------------------------------------- /tile-viewer/app/account.js: -------------------------------------------------------------------------------- 1 | mapboxgl.accessToken = 'pk.eyJ1IjoiYWx1a2FjaCIsImEiOiJ3US1JLXJnIn0.xrpBHCwvzsX76YlO-08kjg'; 2 | -------------------------------------------------------------------------------- /tile-viewer/app/style.json: -------------------------------------------------------------------------------- 1 | { 2 | "version": 8, 3 | "name": "Ocean", 4 | "sources": { 5 | "mapbox": { 6 | "type": "vector", 7 | "url": "mapbox://mapbox.mapbox-streets-v6" 8 | }, 9 | "ocean-tiles": { 10 | "type": "vector", 11 | "tiles": [ 12 | "https://swift-yyc.cloud.cybera.ca:8080/v1/AUTH_0494f9ea20824c889ad817fc718b43a4/bathy/{z}/{x}/{y}.pbf" 13 | ], 14 | "minzoom": 0, 15 | "maxzoom": 4 16 | } 17 | }, 18 | "glyphs": "https://mapbox.s3.amazonaws.com/gl-glyphs-256/{fontstack}/{range}.pbf", 19 | "transition": { 20 | "duration": 600, 21 | "delay": 0 22 | }, 23 | "layers": [ 24 | { 25 | "id": "background", 26 | "type": "background", 27 | "paint": { 28 | "background-color": "#fff" 29 | }, 30 | "paint.october": { 31 | "background-color": "#000" 32 | } 33 | }, 34 | { 35 | "id": "water", 36 | "type": "fill", 37 | "source": "mapbox", 38 | "source-layer": "water", 39 | "paint": { 40 | "fill-color": "rgb(186,226,254)" 41 | }, 42 | "paint.october": { 43 | "fill-color": "rgb(50, 18, 18)" 44 | }, 45 | "paint.rainbow": { 46 | "fill-color": "rgb(108, 251, 255)" 47 | } 48 | }, 49 | { 50 | "id": "bathy-full-highlight", 51 | "type": "fill", 52 | "source": "ocean-tiles", 53 | "source-layer": "terrain", 54 | "filter": [ 55 | "all", 56 | [ 57 | "==", 58 | "value", 59 | 255 60 | ] 61 | ], 62 | "paint": { 63 | "fill-color": "rgb(255,255,255)", 64 | "fill-opacity": 0.5 65 | }, 66 | "paint.october": { 67 | "fill-color": "rgb(159, 66, 66)" 68 | }, 69 | "paint.rainbow": { 70 | "fill-color": "rgb(255, 238, 75)" 71 | } 72 | }, 73 | { 74 | "id": "bathy-light-highlight", 75 | "type": "fill", 76 | "source": "ocean-tiles", 77 | "source-layer": "terrain", 78 | "filter": [ 79 | "all", 80 | [ 81 | "==", 82 | "value", 83 | 204 84 | ] 85 | ], 86 | "paint": { 87 | "fill-color": "rgb(227,240,252)", 88 | "fill-opacity": 0.5 89 | }, 90 | "paint.october": { 91 | "fill-color": "rgb(128, 63, 63)" 92 | }, 93 | "paint.rainbow": { 94 | "fill-color": "rgb(147, 208, 100)" 95 | } 96 | }, 97 | { 98 | "id": "bathy-light-shadow", 99 | "type": "fill", 100 | "source": "ocean-tiles", 101 | "source-layer": "terrain", 102 | "filter": [ 103 | "all", 104 | [ 105 | "==", 106 | "value", 107 | 102 108 | ] 109 | ], 110 | "paint": { 111 | "fill-color": "rgb(159,201,231)", 112 | "fill-opacity": 0.5 113 | }, 114 | "paint.october": { 115 | "fill-color": "rgb(101, 27, 27)" 116 | }, 117 | "paint.rainbow": { 118 | "fill-color": "rgb(0, 125, 255)" 119 | } 120 | }, 121 | { 122 | "id": "bathy-mid-shadow", 123 | "type": "fill", 124 | "source": "ocean-tiles", 125 | "source-layer": "terrain", 126 | "filter": [ 127 | "all", 128 | [ 129 | "==", 130 | "value", 131 | 51 132 | ] 133 | ], 134 | "paint": { 135 | "fill-color": "rgb(138,187,226)", 136 | "fill-opacity": 0.5 137 | }, 138 | "paint.october": { 139 | "fill-color": "rgb(55, 12, 12)" 140 | }, 141 | "paint.rainbow": { 142 | "fill-color": "rgb(122, 75, 255)" 143 | } 144 | }, 145 | { 146 | "id": "bathy-full-shadow", 147 | "type": "fill", 148 | "source": "ocean-tiles", 149 | "source-layer": "terrain", 150 | "filter": [ 151 | "all", 152 | [ 153 | "==", 154 | 
"value", 155 | 0 156 | ] 157 | ], 158 | "paint": { 159 | "fill-color": "rgb(121,176,217)", 160 | "fill-opacity": 0.5 161 | }, 162 | "paint.october": { 163 | "fill-color": "rgb(0,0,0)" 164 | }, 165 | "paint.rainbow": { 166 | "fill-color": "rgb(242, 0, 255)" 167 | } 168 | }, 169 | { 170 | "id": "polygon-features", 171 | "type": "symbol", 172 | "source": "ocean-tiles", 173 | "source-layer": "polygon-features", 174 | "layout": { 175 | "symbol-placement": "line", 176 | "text-field": "{name} {type}", 177 | "text-font": [ 178 | "Arial Unicode MS Regular" 179 | ], 180 | "text-size": { 181 | "base": 1, 182 | "stops": [ 183 | [ 184 | 2, 185 | 8 186 | ], 187 | [ 188 | 3, 189 | 10 190 | ], 191 | [ 192 | 5, 193 | 16 194 | ] 195 | ] 196 | } 197 | }, 198 | "paint": { 199 | "text-color": "#666", 200 | "text-halo-color": "#fff", 201 | "text-halo-width": 1, 202 | "text-halo-blur": 1 203 | }, 204 | "paint.october": { 205 | "text-color": "#ddd", 206 | "text-halo-color": "#333", 207 | "text-halo-width": 1, 208 | "text-halo-blur": 1 209 | } 210 | }, 211 | { 212 | "id": "point-features", 213 | "type": "symbol", 214 | "source": "ocean-tiles", 215 | "source-layer": "point-features", 216 | "layout": { 217 | "text-field": "{name} {type}", 218 | "text-font": [ 219 | "Arial Unicode MS Regular" 220 | ], 221 | "text-size": { 222 | "base": 1, 223 | "stops": [ 224 | [ 225 | 2, 226 | 8 227 | ], 228 | [ 229 | 3, 230 | 10 231 | ], 232 | [ 233 | 5, 234 | 16 235 | ] 236 | ] 237 | } 238 | }, 239 | "paint": { 240 | "text-color": "#666", 241 | "text-halo-color": "#fff", 242 | "text-halo-width": 1, 243 | "text-halo-blur": 1 244 | }, 245 | "paint.october": { 246 | "text-color": "#ddd", 247 | "text-halo-color": "#333", 248 | "text-halo-width": 1, 249 | "text-halo-blur": 1 250 | } 251 | }, 252 | { 253 | "id": "multipolygon-features", 254 | "type": "symbol", 255 | "source": "ocean-tiles", 256 | "source-layer": "multipolygon-features", 257 | "layout": { 258 | "text-field": "{name} {type}", 259 | "text-font": [ 260 | "Arial Unicode MS Regular" 261 | ], 262 | "text-size": { 263 | "base": 1, 264 | "stops": [ 265 | [ 266 | 2, 267 | 8 268 | ], 269 | [ 270 | 3, 271 | 10 272 | ], 273 | [ 274 | 5, 275 | 16 276 | ] 277 | ] 278 | } 279 | }, 280 | "paint": { 281 | "text-color": "#666", 282 | "text-halo-color": "#fff", 283 | "text-halo-width": 1, 284 | "text-halo-blur": 1 285 | }, 286 | "paint.october": { 287 | "text-color": "#ddd", 288 | "text-halo-color": "#333", 289 | "text-halo-width": 1, 290 | "text-halo-blur": 1 291 | } 292 | }, 293 | { 294 | "id": "multipoint-features", 295 | "type": "symbol", 296 | "source": "ocean-tiles", 297 | "source-layer": "multipoint-features", 298 | "layout": { 299 | "text-field": "{name} {type}", 300 | "text-font": [ 301 | "Arial Unicode MS Regular" 302 | ], 303 | "text-size": { 304 | "base": 1, 305 | "stops": [ 306 | [ 307 | 2, 308 | 8 309 | ], 310 | [ 311 | 3, 312 | 10 313 | ], 314 | [ 315 | 5, 316 | 16 317 | ] 318 | ] 319 | } 320 | }, 321 | "paint": { 322 | "text-color": "#666", 323 | "text-halo-color": "#fff", 324 | "text-halo-width": 1, 325 | "text-halo-blur": 1 326 | }, 327 | "paint.october": { 328 | "text-color": "#ddd", 329 | "text-halo-color": "#333", 330 | "text-halo-width": 1, 331 | "text-halo-blur": 1 332 | } 333 | }, 334 | { 335 | "id": "multilinestring-features", 336 | "type": "symbol", 337 | "source": "ocean-tiles", 338 | "source-layer": "multilinestring-features", 339 | "layout": { 340 | "text-field": "{name} {type}", 341 | "text-font": [ 342 | "Arial Unicode MS Regular" 343 | ], 344 | "text-size": 
{ 345 | "base": 1, 346 | "stops": [ 347 | [ 348 | 2, 349 | 8 350 | ], 351 | [ 352 | 3, 353 | 10 354 | ], 355 | [ 356 | 5, 357 | 16 358 | ] 359 | ] 360 | } 361 | }, 362 | "paint": { 363 | "text-color": "#666", 364 | "text-halo-color": "#fff", 365 | "text-halo-width": 1, 366 | "text-halo-blur": 1 367 | }, 368 | "paint.october": { 369 | "text-color": "#ddd", 370 | "text-halo-color": "#333", 371 | "text-halo-width": 1, 372 | "text-halo-blur": 1 373 | } 374 | }, 375 | { 376 | "id": "linestring-features", 377 | "type": "symbol", 378 | "source": "ocean-tiles", 379 | "source-layer": "linestring-features", 380 | "layout": { 381 | "text-field": "{name} {type}", 382 | "text-font": [ 383 | "Arial Unicode MS Regular" 384 | ], 385 | "text-size": { 386 | "base": 1, 387 | "stops": [ 388 | [ 389 | 2, 390 | 8 391 | ], 392 | [ 393 | 3, 394 | 10 395 | ], 396 | [ 397 | 5, 398 | 16 399 | ] 400 | ] 401 | } 402 | }, 403 | "paint": { 404 | "text-color": "#666", 405 | "text-halo-color": "#fff", 406 | "text-halo-width": 1, 407 | "text-halo-blur": 1 408 | }, 409 | "paint.october": { 410 | "text-color": "#ddd", 411 | "text-halo-color": "#333", 412 | "text-halo-width": 1, 413 | "text-halo-blur": 1 414 | } 415 | } 416 | ] 417 | } 418 | -------------------------------------------------------------------------------- /tile-viewer/gulpfile.js: -------------------------------------------------------------------------------- 1 | var gulp = require('gulp'), 2 | // extra input formats we want to support, to let 3 | // people add comments and drop extra gunk from stylesheets. 4 | json5 = require('gulp-json5'), 5 | yaml = require('gulp-yaml'), 6 | toml = require('gulp-toml'), 7 | through2 = require('through2'), 8 | exec = require('gulp-exec'), 9 | rename = require('gulp-rename'), 10 | connect = require('gulp-connect'); 11 | 12 | var path = require('path'), 13 | minimist = require('minimist'); 14 | 15 | var options = minimist(process.argv.slice(2), { 16 | string: 'style', 17 | default: { style: 'style.json' } 18 | }); 19 | 20 | gulp.task('connect', function() { 21 | connect.server({ 22 | root: '../', 23 | livereload: true 24 | }); 25 | }); 26 | 27 | gulp.task('html', function() { 28 | gulp.src(['../*.html', options.style]) 29 | .pipe(connect.reload()); 30 | }); 31 | 32 | gulp.task('watch', function() { 33 | gulp.watch(['../*.html', options.style], 34 | ['compile', 'html']); 35 | }); 36 | 37 | gulp.task('compile', function() { 38 | switch (path.extname(options.style)) { 39 | case '.json5': 40 | return gulp.src(options.style) 41 | .pipe(json5()) 42 | .pipe(gulp.dest('app')); 43 | case '.yaml': 44 | case '.yml': 45 | return gulp.src(options.style) 46 | .pipe(yaml({ space: 4 })) 47 | .pipe(gulp.dest('app')); 48 | case '.js': 49 | return gulp.src(options.style) 50 | .pipe(exec('<%= file.path %>', { pipeStdout: true, continueOnError: true })) 51 | .pipe(through2.obj(function(file, enc, cb) { 52 | if (!file.exec.stdout && file.exec.stderr) { 53 | file.contents = new Buffer(JSON.stringify({ error: file.exec.stderr })); 54 | } 55 | cb(null, file); 56 | })) 57 | .pipe(rename({ basename: 'style', extname: '.json' })) 58 | .pipe(gulp.dest('app')); 59 | case '.toml': 60 | return gulp.src(options.style) 61 | .pipe(toml()) 62 | .pipe(gulp.dest('app')); 63 | case '.json': 64 | default: 65 | return gulp.src(options.style) 66 | .pipe(gulp.dest('app')); 67 | } 68 | }); 69 | 70 | gulp.task('default', ['connect', 'watch', 'compile']); 71 | -------------------------------------------------------------------------------- /tile-viewer/style.yml: 
-------------------------------------------------------------------------------- 1 | version: 8 2 | name: Ocean 3 | sources: 4 | mapbox: 5 | type: vector 6 | url: 'mapbox://mapbox.mapbox-streets-v6' 7 | ocean-tiles: 8 | type: vector 9 | tiles: 10 | - 'https://tiles.alukach.com/bathy/{z}/{x}/{y}.pbf' 11 | minzoom: 0 12 | maxzoom: 4 13 | glyphs: 'https://mapbox.s3.amazonaws.com/gl-glyphs-256/{fontstack}/{range}.pbf' 14 | transition: 15 | duration: 600 16 | delay: 0 17 | 18 | layers: 19 | - 20 | id: background 21 | type: background 22 | paint: 23 | background-color: '#fff' 24 | paint.october: 25 | background-color: '#000' 26 | - 27 | id: water 28 | type: fill 29 | source: mapbox 30 | source-layer: water 31 | paint: 32 | fill-color: 'rgb(186,226,254)' 33 | paint.october: 34 | fill-color: 'rgb(50, 18, 18)' 35 | paint.rainbow: 36 | fill-color: 'rgb(108, 251, 255)' 37 | - 38 | id: bathy-full-highlight 39 | type: fill 40 | source: ocean-tiles 41 | source-layer: terrain 42 | filter: 43 | - all 44 | - 45 | - '==' 46 | - value 47 | - 255 48 | paint: 49 | fill-color: 'rgb(255,255,255)' 50 | fill-opacity: 0.5 51 | paint.october: 52 | fill-color: 'rgb(159, 66, 66)' 53 | paint.rainbow: 54 | fill-color: 'rgb(255, 238, 75)' 55 | - 56 | id: bathy-light-highlight 57 | type: fill 58 | source: ocean-tiles 59 | source-layer: terrain 60 | filter: 61 | - all 62 | - 63 | - '==' 64 | - value 65 | - 204 66 | paint: 67 | fill-color: 'rgb(227,240,252)' 68 | fill-opacity: 0.5 69 | paint.october: 70 | fill-color: 'rgb(128, 63, 63)' 71 | paint.rainbow: 72 | fill-color: 'rgb(147, 208, 100)' 73 | - 74 | id: bathy-light-shadow 75 | type: fill 76 | source: ocean-tiles 77 | source-layer: terrain 78 | filter: 79 | - all 80 | - 81 | - '==' 82 | - value 83 | - 102 84 | paint: 85 | fill-color: 'rgb(159,201,231)' 86 | fill-opacity: 0.5 87 | paint.october: 88 | fill-color: 'rgb(101, 27, 27)' 89 | paint.rainbow: 90 | fill-color: 'rgb(0, 125, 255)' 91 | - 92 | id: bathy-mid-shadow 93 | type: fill 94 | source: ocean-tiles 95 | source-layer: terrain 96 | filter: 97 | - all 98 | - 99 | - '==' 100 | - value 101 | - 51 102 | paint: 103 | fill-color: 'rgb(138,187,226)' 104 | fill-opacity: 0.5 105 | paint.october: 106 | fill-color: 'rgb(55, 12, 12)' 107 | paint.rainbow: 108 | fill-color: 'rgb(122, 75, 255)' 109 | - 110 | id: bathy-full-shadow 111 | type: fill 112 | source: ocean-tiles 113 | source-layer: terrain 114 | filter: 115 | - all 116 | - 117 | - '==' 118 | - value 119 | - 0 120 | paint: 121 | fill-color: 'rgb(121,176,217)' 122 | fill-opacity: 0.5 123 | paint.october: 124 | fill-color: 'rgb(0,0,0)' 125 | paint.rainbow: 126 | fill-color: 'rgb(242, 0, 255)' 127 | - 128 | id: polygon-features 129 | type: symbol 130 | source: ocean-tiles 131 | source-layer: polygon-features 132 | layout: 133 | symbol-placement: line 134 | text-field: '{name} {type}' 135 | text-font: 136 | - 'Arial Unicode MS Regular' 137 | text-size: 138 | base: 1 139 | stops: 140 | - [2, 8] 141 | - [3, 10] 142 | - [5, 16] 143 | paint: 144 | text-color: '#666' 145 | text-halo-color: '#fff' 146 | text-halo-width: 1 147 | text-halo-blur: 1 148 | paint.october: 149 | text-color: '#ddd' 150 | text-halo-color: '#333' 151 | text-halo-width: 1 152 | text-halo-blur: 1 153 | - 154 | id: point-features 155 | type: symbol 156 | source: ocean-tiles 157 | source-layer: point-features 158 | layout: 159 | text-field: '{name} {type}' 160 | text-font: 161 | - 'Arial Unicode MS Regular' 162 | text-size: 163 | base: 1 164 | stops: 165 | - [2, 8] 166 | - [3, 10] 167 | - [5, 16] 168 | paint: 
169 | text-color: '#666' 170 | text-halo-color: '#fff' 171 | text-halo-width: 1 172 | text-halo-blur: 1 173 | paint.october: 174 | text-color: '#ddd' 175 | text-halo-color: '#333' 176 | text-halo-width: 1 177 | text-halo-blur: 1 178 | - 179 | id: multipolygon-features 180 | type: symbol 181 | source: ocean-tiles 182 | source-layer: multipolygon-features 183 | layout: 184 | text-field: '{name} {type}' 185 | text-font: 186 | - 'Arial Unicode MS Regular' 187 | text-size: 188 | base: 1 189 | stops: 190 | - [2, 8] 191 | - [3, 10] 192 | - [5, 16] 193 | paint: 194 | text-color: '#666' 195 | text-halo-color: '#fff' 196 | text-halo-width: 1 197 | text-halo-blur: 1 198 | paint.october: 199 | text-color: '#ddd' 200 | text-halo-color: '#333' 201 | text-halo-width: 1 202 | text-halo-blur: 1 203 | - 204 | id: multipoint-features 205 | type: symbol 206 | source: ocean-tiles 207 | source-layer: multipoint-features 208 | layout: 209 | text-field: '{name} {type}' 210 | text-font: 211 | - 'Arial Unicode MS Regular' 212 | text-size: 213 | base: 1 214 | stops: 215 | - [2, 8] 216 | - [3, 10] 217 | - [5, 16] 218 | paint: 219 | text-color: '#666' 220 | text-halo-color: '#fff' 221 | text-halo-width: 1 222 | text-halo-blur: 1 223 | paint.october: 224 | text-color: '#ddd' 225 | text-halo-color: '#333' 226 | text-halo-width: 1 227 | text-halo-blur: 1 228 | - 229 | id: multilinestring-features 230 | type: symbol 231 | source: ocean-tiles 232 | source-layer: multilinestring-features 233 | layout: 234 | text-field: '{name} {type}' 235 | text-font: 236 | - 'Arial Unicode MS Regular' 237 | text-size: 238 | base: 1 239 | stops: 240 | - [2, 8] 241 | - [3, 10] 242 | - [5, 16] 243 | paint: 244 | text-color: '#666' 245 | text-halo-color: '#fff' 246 | text-halo-width: 1 247 | text-halo-blur: 1 248 | paint.october: 249 | text-color: '#ddd' 250 | text-halo-color: '#333' 251 | text-halo-width: 1 252 | text-halo-blur: 1 253 | - 254 | id: linestring-features 255 | type: symbol 256 | source: ocean-tiles 257 | source-layer: linestring-features 258 | layout: 259 | text-field: '{name} {type}' 260 | text-font: 261 | - 'Arial Unicode MS Regular' 262 | text-size: 263 | base: 1 264 | stops: 265 | - [2, 8] 266 | - [3, 10] 267 | - [5, 16] 268 | paint: 269 | text-color: '#666' 270 | text-halo-color: '#fff' 271 | text-halo-width: 1 272 | text-halo-blur: 1 273 | paint.october: 274 | text-color: '#ddd' 275 | text-halo-color: '#333' 276 | text-halo-width: 1 277 | text-halo-blur: 1 278 | --------------------------------------------------------------------------------