├── LinuxInstall.md ├── tilers_tools ├── data_override.csv ├── tilers-tools.bat ├── tiler_misc.py ├── tiler_global_mercator.py ├── data_bsb.csv ├── converter_xyz.py ├── hdr_pcx_merge.py ├── converter_maemomapper.py ├── tiles_scale.py ├── reader_kml.py ├── reader_geo.py ├── map2gdal.py ├── tiles_convert.py ├── reader_bsb.py ├── tiler.py ├── tiler_plate_carree.py ├── tiles_opt.py ├── converter_sasplanet.py ├── viewer-google.html ├── data_ozi.csv ├── tiles_merge.py ├── reader_ozi.py ├── reader_backend.py ├── converter_backend.py ├── tiler_functions.py └── ozf_decoder.py ├── WinInstall.md ├── README.md └── QuickStart.md /LinuxInstall.md: -------------------------------------------------------------------------------- 1 | # Prerequisites for Linux 2 | 3 | The prerequisites: GDAL's Python bindings (python-gdal), Python imaging library (PIL), GDAL binaries (`gdal-bin` 2.2+) and, optionally, `pngnq` utility. 4 | 5 | It is quite likely that they are available from your distribution, otherwise it's worth checking with . 6 | 7 | Then make sure `tilers-tools` directory is included into the `PATH` environment variable. 8 | -------------------------------------------------------------------------------- /tilers_tools/data_override.csv: -------------------------------------------------------------------------------- 1 | #proj,"EPSG:3395","+proj=merc +lon_0=0 +k=1 +x_0=0 +y_0=0 +ellps=WGS84 +datum=WGS84 +units=m +no_defs" 2 | #proj,"EPSG:6859","+proj=merc +ellps=intl +towgs84=0,0,0,0,0,0,0 +lat_ts=0.0 +lon_0=0 +k=1 +x_0=0 +y_0=0 +units=m +no_defs" 3 | #proj,"EPSG:6859","+proj=merc +lon_0=0 +k=1 +x_0=0 +y_0=0 +ellps=intl +units=m +no_defs" 4 | proj,"EPSG:6859","+proj=merc +a=6378388 +b=6356911.946 +towgs84=0,0,0,0,0,0,0 +nadgrids=@null +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 +k=1.0 +units=m +no_defs" 5 | -------------------------------------------------------------------------------- /tilers_tools/tilers-tools.bat: -------------------------------------------------------------------------------- 1 | @echo off 2 | 3 | if "%1" == "setenv" goto setenv 4 | 5 | start %comspec% /k %0 setenv %1 6 | goto exit 7 | 8 | :setenv 9 | set PYTHON=D:\bin\Python26 10 | set GDAL=D:\bin\release-1600-gdal-1-8-0-mapserver-5-6-6 11 | 12 | set TILERS_TOOLS=%~dp0 13 | rem Remove trailing backslash 14 | set TILERS_TOOLS=%TILERS_TOOLS:~0,-1% 15 | 16 | pushd %GDAL% 17 | 18 | call SDKShell.bat setenv 19 | rem set PYTHONPATH=%CD%\bin\gdal\python;%PYTHONPATH% 20 | set PATH=%PYTHON%;%TILERS_TOOLS%;%PATH% 21 | 22 | popd 23 | 24 | :exit 25 | -------------------------------------------------------------------------------- /WinInstall.md: -------------------------------------------------------------------------------- 1 | # How to install tilers-tools on Windows 2 | 3 | **NB**: Running these scripts on Windows was not tested for quite a while. Probably you need some more recent versions of prerequisites listed below. 4 | 5 | The prerequisites are: `Python` environment (v 2.6+), GDAL binaries with Python support, Python imaging library (PIL), optionally, `pngnq` utility. 6 | 7 | 1. Install these packages:Python, PIL. Download them from here: , 8 | 2. Unpack GDAL from to a folder 9 | 3. `pngnq` utility is required for tiles-opt.py script (to convert tiles to a paletted form). It can be downloaded from or 10 | 4. Take a zip with `tilers-tools` from the Downloads page and put them into some other folder. 11 | 5. 
Edit `tilers-tools.bat` to modify the paths to the Python and GDAL packages, for example: 12 | 13 | set PYTHON=D:\bin\Python26 14 | set GDAL=D:\bin\release-1600-gdal-1-8-0-mapserver-5-6-6 15 | 16 | 6. Now you can launch `tilers-tools.bat` 17 | 18 | After `tilers-tools.bat` starts, a command-line window pops up; this is where you work. First, navigate to the folder with the source data: OZI .map, BSB .kap, JPEG, PNG, TIFF rasters, etc. For example: 19 | 20 | cd /d D:\test\bsb\tst-ozi 21 | 22 | Source files can be dragged into this window from Windows Explorer to compose a command line. 23 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | A collection of scripts for creating and handling tile sets from digital raster maps. The scripts are based on GDAL tools. 2 | 3 | **NB**: These scripts are not in active development anymore. 4 | 5 | This version of the scripts requires GDAL Python bindings version 2.xx 6 | 7 | ---- 8 | `tiler.py` \-- converts a [GDAL](http://www.gdal.org/)-compatible map file (dataset) into a set of zoom-leveled tile directories (a pyramid). A few output pyramid structures/projections (profiles) are supported: compatible with Google Maps (native and TMS-compatible), Google Earth and generic. The script is relatively fast, especially when processing a paletted source in the "draft mode" or rendering a few datasets simultaneously. It is also less picky about dataset formats and projections. In particular, it can cope with maps crossing the 180° meridian. 9 | 10 | `tiler.py` should read from standard GDAL datasets, but for some of them it uses `map2gdal.py` internally to implement more accurate metadata translation: BSB (`.kap`), Geo/Nos (`.geo`), Ozi (`.map`) and KML with ground overlays (raster images). GDAL itself has some support for these formats, but the script extracts somewhat more geographical data from these sources. For example, for BSB charts it supports more data and projections than GDAL does natively (GDAL currently supports only the WGS84 datum for BSB charts); it also makes use of BSB's DTM northing/easting data. 11 | 12 | `tiles_merge.py` \-- merges a few separate tile sets created by `tiler.py` into a single one, so as to cover a larger area and/or more zoom levels. 13 | 14 | `tiles_convert.py` \-- converts tile sets between a few tile set structures and tile image formats. 15 | 16 | Some auxiliary scripts: 17 | 18 | `ozf_decoder.py` \-- converts `ozf2` and `ozfx3` files into `tiff` format. If the locally installed GDAL tools do not have an ozf reader built in, this one can help. This script is also able to cope with some broken files.
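For example, decoding an Ozi raster before tiling it might look like this (a hypothetical invocation with placeholder file names; each script prints its actual options with `--help`):

    ozf_decoder.py my_map.ozf2

The decoded TIFF should be kept in the same folder as the corresponding Ozi `.map` file, as noted in QuickStart.md.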
19 | 20 | `hdr_pcx_merge.py` \-- assembles a set of HDR-PCX chart tiles into a single PNG file; 21 | -------------------------------------------------------------------------------- /tilers_tools/tiler_misc.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2011-2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 24 | ############################################################################### 25 | 26 | 27 | #~ from tiler_functions import * 28 | from tiler_backend import * 29 | 30 | ############################# 31 | 32 | class GenericMap(MercatorPyramid, ZYXtiling): 33 | 'full profile options are to be specified' 34 | ############################# 35 | profile = 'generic' 36 | defaul_ext = '.generic' 37 | 38 | def __init__(self, src=None, dest=None, options=None): 39 | 40 | options = LooseDict(options) 41 | 42 | self.srs = txt2proj4(options.proj4def or options.tiles_srs) 43 | assert self.srs, 'Target SRS is not specified' 44 | self.tilemap_crs = options.tiles_srs 45 | 46 | if options.zoom0_tiles: 47 | self.zoom0_tiles = map(int, options.zoom0_tiles.split(',')) 48 | if options.tile_size: 49 | tile_size = tuple(map(int, options.tile_size.split(','))) 50 | self.tile_dim = (tile_size[0], -tile_size[1]) 51 | 52 | super(GenericMap, self).__init__(src, dest, options) 53 | # 54 | profile_map.append(GenericMap) 55 | # 56 | 57 | ############################# 58 | 59 | class Wgs84(MercatorPyramid, ZYXtiling): 60 | 'WGS 84 / World Mercator, EPSG:3395 (compatible with Yandex maps)' 61 | ############################## 62 | profile = 'wgs84' 63 | defaul_ext = '.wgs84' 64 | 65 | #~ srs = '+proj=merc +lon_0=0 +k=1 +x_0=0 +y_0=0 +ellps=WGS84 +datum=WGS84 +units=m +no_defs' 66 | srs = 'EPSG:3395' 67 | tilemap_crs = 'EPSG:3395' 68 | # 69 | profile_map.append(Wgs84) 70 | # 71 | -------------------------------------------------------------------------------- /tilers_tools/tiler_global_mercator.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2011-2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any 
person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 24 | ############################################################################### 25 | 26 | from tiler_functions import * 27 | from tiler_backend import * 28 | 29 | ############################# 30 | 31 | class GMercator(MercatorPyramid): 32 | 'base class for Global Mercator' 33 | ############################# 34 | 35 | # OpenLayers-2.12/lib/OpenLayers/Projection.js 36 | # 37 | # "EPSG:900913": { 38 | # units: "m", 39 | # maxExtent: [-20037508.34, -20037508.34, 20037508.34, 20037508.34] 40 | # } 41 | 42 | zoom0_tiles = [1, 1] # tiles at zoom 0 43 | 44 | # Global Mercator (EPSG:3857, aka EPSG:900913) http://docs.openlayers.org/library/spherical_mercator.html 45 | #~ srs = '+proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 +k=1.0 +units=m +nadgrids=@null +no_defs' 46 | srs = 'EPSG:3857' 47 | 48 | tilemap_crs = 'EPSG:3857' 49 | 50 | ############################# 51 | 52 | class GMercatorZYX(GMercator, ZYXtiling): 53 | 'Global Mercator, top-to-bottom tile numbering ZYX directory structure' 54 | ############################# 55 | profile = 'zyx' 56 | defaul_ext = '.zyx' 57 | tms_profile = 'zyx-mercator' # non-standard profile 58 | 59 | def write_metadata(self, tile=None, children=[]): 60 | super(GMercatorZYX, self).write_metadata(tile, children) 61 | 62 | if tile is None: 63 | copy_viewer(self.dest) 64 | # 65 | profile_map.append(GMercatorZYX) 66 | # 67 | 68 | ############################# 69 | 70 | class GMercatorXYZ(GMercator, XYZtiling): 71 | 'Global Mercator, top-to-bottom tile numbering OSM directory structure' 72 | ############################# 73 | profile = 'xyz' 74 | defaul_ext = '.xyz' 75 | tms_profile = 'xyz-mercator' # non-standard profile 76 | # 77 | profile_map.append(GMercatorXYZ) 78 | # 79 | 80 | ############################# 81 | 82 | class GMercatorTMS(GMercator, TMStiling): 83 | 'Global Mercator, TMS tile numbering' 84 | ############################# 85 | profile = 'tms' 86 | defaul_ext = '.tms' 87 | tms_profile = 'global-mercator' 88 | # 89 | profile_map.append(GMercatorTMS) 90 | # 91 | -------------------------------------------------------------------------------- /tilers_tools/data_bsb.csv: -------------------------------------------------------------------------------- 1 | # datum_map 2 | datum, WGS84, "+datum=WGS84", 3 | datum, WGS-84, "+datum=WGS84", 4 | datum, NAD83, "+datum=NAD83", 5 | datum, WGS72, "+ellps=WGS72 +towgs84=0,0,4.5,0,0,0.554,0.219", 6 | datum, WGS1972, "+ellps=WGS72 
+towgs84=0,0,4.5,0,0,0.554,0.219", 7 | #http://earth-info.nga.mil/GandG/coordsys/onlinedatum/CountryEuropeTable.html 8 | datum, EUROPEAN 1950, "+towgs84=-87,-98,-121 +ellps=intl", 9 | datum, EUROPEAN DATUM 1950, "+towgs84=-87,-98,-121 +ellps=intl", 10 | datum, EUROPEAN 1950 (NORWAY FINLAND), "+towgs84=-85,-95,-120 +ellps=intl", 11 | datum, ED50, "+towgs84=-84.0000,-97.0000,-117.0000 +ellps=intl", 12 | datum, EUROPEAN, "+towgs84=-84.0000,-97.0000,-117.0000 +ellps=intl", 13 | datum, ROMA DATUM 1940, "+towgs84=-104.1,-49.1,-9.9,0.971,-2.917,0.714,-11.68 +ellps=intl", 14 | datum, ROMA 1940, "+towgs84=-104.1,-49.1,-9.9,0.971,-2.917,0.714,-11.68 +ellps=intl", 15 | datum, HERMANSKOGEL DATUM, "+datum=hermannskogel", 16 | datum, POTSDAM, "+datum=potsdam", 17 | # ??? 18 | datum, SYSTEM KÜSTE, "+datum=potsdam", 19 | datum, MERCHICH, "+ellps=clrk80 +towgs84=31,146,47,0,0,0,0", 20 | datum, OSGB36, "+towgs84=446.448,-125.157,542.060,0.1502,0.2470,0.8421,-20.4894 +ellps=airy", 21 | # http://sv.wikipedia.org/wiki/RT_90 22 | datum, RT90 (SWEDEN), "+towgs84=414.0978567149,41.3381489658,603.0627177516,-0.8550434314,2.1413465185,-7.0227209516,0 +ellps=bessel", 23 | # http://spatialreference.org/ref/epsg/4218/proj4/ 24 | datum, BOGOTÁ, "+ellps=intl +towgs84=307,304,-318,0,0,0,0" 25 | # http://spatialreference.org/ref/epsg/4225/proj4/ 26 | datum, CORREGO ALEGRE, "+ellps=intl +towgs84=-206,172,-6,0,0,0,0" 27 | datum, COA, "+ellps=intl +towgs84=-206,172,-6,0,0,0,0" 28 | datum, COABR, "+ellps=intl +towgs84=-206,172,-6,0,0,0,0" 29 | # http://spatialreference.org/ref/sr-org/7366/proj4/ -- not tested 30 | datum, SAD69, "+ellps=aust_SA +towgs84=-67.35,3.88,-38.22,0,0,0,0" 31 | # http://epsg.io/4674, also http://www.ibge.gov.br/english/geociencias/geodesia/pmrg/faq.shtm#15 -- not tested 32 | datum, SIRGAS2000, "+ellps=GRS80 +towgs84=0,0,0,0,0,0,0" 33 | 34 | 35 | # guess the datum by a comment/copyright string pattern 36 | datum_guess, "Croatia", "+ellps=bessel +towgs84=550.499,164.116,475.142,5.80967,2.07902,-11.62386,0.99999445824", 37 | # http://spatial-analyst.net/wiki/index.php?title=MGI_/_Balkans_coordinate_systems 38 | #"+datum=hermannskogel", http://earth-info.nga.mil/GandG/coordsys/onlinedatum/DatumTable.html 39 | #"+ellps=bessel +towgs84=682,-203,480", 40 | 41 | #header,Projection,PROJ4 definition, BSB KNP/ projection parameters; KNQ/ extra projection parameters for BSB v. 
3.xx 42 | proj, MERCATOR, "+proj=merc",PP:lat_ts, 43 | proj, TRANSVERSE MERCATOR, "+proj=tmerc",PP:lon_0,P1:lat_0,P2:k,P3:y_0,P4:x_0,"KNQ:P1:lon_0,P2:k,P3:lat_0", # KNQ P3 - guess 44 | proj, UNIVERSAL TRANSVERSE MERCATOR, "+proj=tmerc +k=0.9996",PP:lon_0, 45 | proj, GNOMONIC, "+proj=gnom",PP:lon_0,P1:lat_0, 46 | proj, POLYCONIC, "+proj=poly",PP:lon_0,"KNQ:P1:lon_0,P2:lat_0", # KNQ P2 - guess 47 | proj, SWEDISH GRID, "+proj=tmerc +lon_0=15.808277777778 +x_0=1500000 +y_0=0", 48 | proj, LAMBERT CONFORMAL CONIC, "+proj=lcc",PP:lon_0,"KNQ:P1:lon_0,P2:lat_1,P3:lat_2", 49 | proj, LAMBERT AZIMUTHAL SPHERICAL, "+proj=laea",PP:lon_0,P1:lat_0, 50 | proj, ALBERS SPHERICAL, "+proj=aea",PP:lon_0,P1:lat_0,P5:lat_1,P6:lat_2, 51 | proj, WAGNER IV, "+proj=wag4",PP:lon_0, 52 | -------------------------------------------------------------------------------- /tilers_tools/converter_xyz.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2010, 2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 
24 | ############################################################################### 25 | 26 | from converter_backend import * 27 | 28 | ############################# 29 | 30 | class TMStiles(TileMapDir): # see TileMap Diagram at http://wiki.osgeo.org/wiki/Tile_Map_Service_Specification 31 | 'TMS tiles' 32 | ############################# 33 | format, ext, input, output = 'tms', '.tms', True, True 34 | dir_pattern = '[0-9]*/*/*.*' 35 | 36 | def path2coord(self, tile_path): 37 | z, x, y = map(int, path2list(tile_path)[-4:-1]) 38 | return (z, x, 2**z-y-1) 39 | 40 | def coord2path(self, z, x, y): 41 | return '%d/%d/%d' % (z, x, 2**z-y-1) 42 | 43 | tileset_profiles.append(TMStiles) 44 | 45 | ############################# 46 | 47 | class XYZtiles(TileMapDir): # http://code.google.com/apis/maps/documentation/javascript/v2/overlays.html#Google_Maps_Coordinates 48 | 'Popular XYZ format (Google Maps, OSM, mappero-compatible)' 49 | ############################# 50 | format, ext, input, output = 'xyz', '.xyz', True, True 51 | dir_pattern = '[0-9]*/*/*.*' 52 | 53 | def path2coord(self, tile_path): 54 | return map(int, path2list(tile_path)[-4:-1]) 55 | 56 | def coord2path(self, z, x, y): 57 | return '%d/%d/%d' % (z, x, y) 58 | 59 | tileset_profiles.append(XYZtiles) 60 | 61 | ############################# 62 | 63 | class ZYXtiles(TileMapDir): 64 | 'ZYX aka Global Mapper (SASPlanet compatible)' 65 | ############################# 66 | format, ext, input, output = 'zyx', '.zyx', True, True 67 | dir_pattern = 'z[0-9]*/*/*.*' 68 | 69 | def path2coord(self, tile_path): 70 | z, y, x = path2list(tile_path)[-4:-1] 71 | return map(int, (z[1:], x, y)) 72 | 73 | def coord2path(self, z, x, y): 74 | return 'z%d/%d/%d' % (z, y, x) 75 | 76 | tileset_profiles.append(ZYXtiles) 77 | 78 | ############################# 79 | 80 | class MapNav(TileDir): # http://mapnav.spb.ru/site/e107_plugins/forum/forum_viewtopic.php?29047.post 81 | 'MapNav (Global Mapper - compatible)' 82 | ############################# 83 | format, ext, input, output = 'mapnav', '.mapnav', True, True 84 | dir_pattern = 'Z[0-9]*/*/*.pic' 85 | tile_class = FileTileNoExt 86 | 87 | def dest_ext(self, tile): 88 | return '.pic' 89 | 90 | def path2coord(self, tile_path): 91 | z, y, x = path2list(tile_path)[-4:-1] 92 | return map(int, (z[1:], x, y)) 93 | 94 | def coord2path(self, z, x, y): 95 | return 'Z%d/%d/%d' % (z, y, x) 96 | 97 | tileset_profiles.append(MapNav) 98 | -------------------------------------------------------------------------------- /tilers_tools/hdr_pcx_merge.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | # 2011-01-27 11:38:30 5 | 6 | ############################################################################### 7 | # Copyright (c) 2010, Vadim Shlyakhov 8 | # 9 | # Permission is hereby granted, free of charge, to any person obtaining a 10 | # copy of this software and associated documentation files (the "Software"), 11 | # to deal in the Software without restriction, including without limitation 12 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 13 | # and/or sell copies of the Software, and to permit persons to whom the 14 | # Software is furnished to do so, subject to the following conditions: 15 | # 16 | # The above copyright notice and this permission notice shall be included 17 | # in all copies or substantial portions of the Software. 
18 | # 19 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 20 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 22 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 24 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 25 | # DEALINGS IN THE SOFTWARE. 26 | #****************************************************************************** 27 | 28 | import sys 29 | import os 30 | import shutil 31 | import glob 32 | import logging 33 | import optparse 34 | import string 35 | from PIL import Image 36 | 37 | from tiler_functions import * 38 | 39 | pcx_tile_w=640 40 | pcx_tile_h=480 41 | 42 | class MergeSet: 43 | def __init__(self,src_lst,dest_dir): 44 | self.src_lst=src_lst 45 | self.dest_dir=dest_dir 46 | self.merge() 47 | 48 | def __call__(self,src_dir): 49 | pf('.',end='') 50 | uc=string.ascii_uppercase 51 | tiles=sorted(glob.glob(os.path.join(src_dir,"*.[A-Z][0-9][0-9]"))) 52 | last_name=os.path.split(tiles[-1])[1] 53 | (base_name,last_ext)=os.path.splitext(last_name) 54 | ld([base_name,last_ext]) 55 | y_max=int(last_ext[2:4]) 56 | x_max=uc.find(last_ext[1])+1 57 | #ld([base_name,y_max,x_max]) 58 | im = Image.new("RGBA", (x_max*pcx_tile_w, y_max*pcx_tile_h)) 59 | for y in range(1,y_max+1): 60 | for x in range(1,x_max+1): 61 | src=os.path.join(src_dir,'%s.%s%02d' % (base_name,uc[x-1],y)) 62 | loc=((x-1)*pcx_tile_w,(y-1)*pcx_tile_h) 63 | ld([src,loc]) 64 | if os.path.exists(src): 65 | im.paste(Image.open(src),loc) 66 | else: 67 | logging.warning("%s not found" % src) 68 | dest=os.path.join(self.dest_dir,base_name+'.png') 69 | # Get the alpha band http://nadiana.com/pil-tips-converting-png-gif 70 | alpha = im.split()[3] 71 | # Convert the image into P mode but only use 255 colors in the palette out of 256 72 | im = im.convert('RGB').convert('P', palette=Image.ADAPTIVE, colors=255) 73 | # Set all pixel values below 128 to 255, and the rest to 0 74 | mask = Image.eval(alpha, lambda a: 255 if a <=128 else 0) 75 | # Paste the color of index 255 and use alpha as a mask 76 | im.paste(255, mask) 77 | # The transparency index is 255 78 | im.save(dest, transparency=255, optimize=True) 79 | 80 | def merge(self): 81 | parallel_map(self,self.src_lst) 82 | # MergeSet end 83 | 84 | if __name__=='__main__': 85 | parser = optparse.OptionParser( 86 | usage="usage: %prog tiles_dir", 87 | version=version, 88 | ) 89 | parser.add_option("-v", "--verbose", action="store_true", dest="verbose") 90 | 91 | (options, args) = parser.parse_args() 92 | 93 | logging.basicConfig(level=logging.DEBUG if options.verbose else logging.INFO) 94 | 95 | start_dir=os.getcwd() 96 | if len(args)==0: 97 | raise Exception("No source directories specified") 98 | 99 | src_dirs=glob.glob(os.path.join(args[0],"[A-Z]??????[0-9]")) 100 | MergeSet(src_dirs,start_dir) 101 | 102 | -------------------------------------------------------------------------------- /tilers_tools/converter_maemomapper.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2010, 2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated 
documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 24 | ############################################################################### 25 | 26 | from converter_backend import * 27 | 28 | ############################# 29 | 30 | class MapperSQLite(TileSet): 31 | 'maemo-mapper SQLite cache' 32 | ############################# 33 | format, ext, input, output = 'mapper', '.db', True, True 34 | max_zoom = 20 35 | 36 | def __init__(self, root, options=None): 37 | super(MapperSQLite, self).__init__(root, options) 38 | 39 | import sqlite3 40 | 41 | self.db = sqlite3.connect(self.root) 42 | self.dbc = self.db.cursor() 43 | if self.options.isDest: 44 | try: 45 | self.dbc.execute ( 46 | 'CREATE TABLE maps (' 47 | 'zoom INTEGER, ' 48 | 'tilex INTEGER, ' 49 | 'tiley INTEGER, ' 50 | 'pixbuf BLOB, ' 51 | 'PRIMARY KEY (zoom, tilex, tiley));' 52 | ) 53 | except: 54 | pass 55 | 56 | def finalize_tileset(self): 57 | self.db.commit() 58 | self.db.close() 59 | 60 | def __iter__(self): 61 | self.dbc.execute('SELECT * FROM maps') 62 | for z, x, y, pixbuf in self.dbc: 63 | coord = self.max_zoom+1-z, x, y 64 | if self.in_range(coord): 65 | yield PixBufTile(coord, str(pixbuf), (z, x, y)) 66 | 67 | def store_tile(self, tile): 68 | z, x, y = tile.coord() 69 | # convert to maemo-mapper coords 70 | z = self.max_zoom+1-z 71 | log('%s -> SQLite %d, %d, %d' % (tile.path, z, x, y)) 72 | self.dbc.execute('INSERT OR REPLACE INTO maps (zoom, tilex, tiley, pixbuf) VALUES (?, ?, ?, ?);', 73 | (z, x, y, buffer(tile.data()))) 74 | 75 | tileset_profiles.append(MapperSQLite) 76 | 77 | # MapperSQLite 78 | 79 | ############################# 80 | 81 | class MapperGDBM(TileSet): # due to GDBM weirdness on ARM this only works if run on the tablet itself 82 | 'maemo-mapper GDBM cache (works only on Nokia tablet)' 83 | ############################# 84 | format, ext, input, output = 'gdbm', '.gdbm', True, True 85 | max_zoom = 20 86 | 87 | def __init__(self, root, options=None): 88 | 89 | super(MapperGDBM, self).__init__(root, options) 90 | #print self.root 91 | 92 | import platform 93 | assert platform.machine().startswith('arm'), 'This convertion works only on a Nokia tablet' 94 | 95 | import gdbm 96 | self.db = gdbm.open(self.root, 'cf' if write else 'r') 97 | 98 | self.key = struct.Struct('>III') 99 | 100 | def finalize_tileset(self): 101 | self.db.sync() 102 | self.db.close() 103 | 104 | def __iter__(self): 105 | key = self.db.firstkey() 106 | while key: 107 | z, x, y = self.key.unpack(key) 108 | coord = self.max_zoom+1-z, x, y 109 | if not self.in_range(coord): 110 | continue 111 | yield 
PixBufTile(coord, self.db[key], (z, x, y)) 112 | key = self.db.nextkey(key) 113 | 114 | def store_tile(self, tile): 115 | z, x, y = tile.coord() 116 | # convert to maemo-mapper coords 117 | z = self.max_zoom+1-z 118 | log('%s -> GDBM %d, %d, %d' % (tile.path, z, x, y)) 119 | key = self.key.pack(z, x, y) 120 | self.db[key] = tile.data() 121 | 122 | tileset_profiles.append(MapperGDBM) 123 | # MapperGDBM 124 | -------------------------------------------------------------------------------- /tilers_tools/tiles_scale.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | 3 | ############################################################################### 4 | # Copyright (c) 2010,2011 Vadim Shlyakhov 5 | # 6 | # Permission is hereby granted, free of charge, to any person obtaining a 7 | # copy of this software and associated documentation files (the "Software"), 8 | # to deal in the Software without restriction, including without limitation 9 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 10 | # and/or sell copies of the Software, and to permit persons to whom the 11 | # Software is furnished to do so, subject to the following conditions: 12 | # 13 | # The above copyright notice and this permission notice shall be included 14 | # in all copies or substantial portions of the Software. 15 | # 16 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 17 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 18 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 19 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 20 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 21 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 22 | # DEALINGS IN THE SOFTWARE. 
23 | #****************************************************************************** 24 | 25 | import sys 26 | import os 27 | import shutil 28 | import glob 29 | import logging 30 | import optparse 31 | from PIL import Image 32 | 33 | from tiler_functions import * 34 | 35 | class ZoomSet: 36 | def __init__(self,tiles_dir): 37 | pf('%s ' % tiles_dir,end='') 38 | 39 | start_dir=os.getcwd() 40 | os.chdir(tiles_dir) 41 | self.tiles_root=os.getcwd() 42 | os.chdir(start_dir) 43 | 44 | self.tilemap=read_tilemap(self.tiles_root) 45 | 46 | if self.tilemap['tiles']['inversion'][1]: # google 47 | self.tile_offsets=[ 48 | (0,0), (128,0), 49 | (0,128), (128,128), 50 | ] 51 | else: # TMS 52 | self.tile_offsets=[ 53 | (0,128), (128,128), 54 | (0,0), (128,0) 55 | ] 56 | 57 | def __call__(self,dest_tile): 58 | ext=self.tilemap['tiles']['ext'] 59 | im = Image.new("RGBA",(256,256),(0,0,0,0)) 60 | 61 | (z,y,x)=dest_tile 62 | tiles_in=[(y*2,x*2),(y*2,x*2+1), 63 | (y*2+1,x*2),(y*2+1,x*2+1)] 64 | for (src_yx,out_loc) in zip(tiles_in,self.tile_offsets): 65 | if src_yx in self.src_lst: 66 | sy,sx=src_yx 67 | src_path='z%i/%i/%i.%s' % (z+1,sy,sx,ext) 68 | im.paste(Image.open(src_path).resize((128,128),Image.ANTIALIAS),out_loc) 69 | 70 | dst_path='z%i/%i/%i.%s' % (z,y,x,ext) 71 | im.save(dst_path) 72 | pf('.',end='') 73 | 74 | def zoom_out(self,target_zoom): 75 | start_dir=os.getcwd() 76 | try: 77 | 78 | tilesets=self.tilemap['tilesets'] 79 | ld('tilesets', tilesets) 80 | 81 | top_zoom=min(tilesets.keys()) 82 | new_zooms=range(top_zoom-1,target_zoom-1,-1) 83 | if not new_zooms: 84 | return 85 | for zoom in new_zooms: # make new zoom tiles 86 | pf('%i' % zoom,end='') 87 | 88 | # add a new zoom level to tilemap 89 | z_dir = 'z%i' % zoom 90 | tilesets[zoom]={ 91 | 'href': z_dir, 92 | 'units_per_pixel': tilesets[zoom + 1]['units_per_pixel'] * 2} 93 | 94 | shutil.rmtree(z_dir, ignore_errors=True) 95 | os.chdir(os.path.join(self.tiles_root, 'z%i' % (zoom+1))) 96 | 97 | self.src_lst=set( 98 | [tuple(map(int,path2list(f)[:-1])) 99 | for f in glob.glob('*/*.%s' % self.tilemap['tiles']['ext'])]) 100 | 101 | os.chdir(self.tiles_root) 102 | 103 | if len(self.src_lst) == 0: 104 | raise Exception("No tiles in %s" % os.getcwd()) 105 | 106 | dest_lst=set([(zoom,src_y/2,src_x/2) for (src_y,src_x) in self.src_lst]) 107 | 108 | for i in set([y for z,y,x in dest_lst]): 109 | os.makedirs('z%i/%i' % (zoom,i)) 110 | 111 | parallel_map(self,dest_lst) 112 | 113 | write_tilemap('.',self.tilemap) 114 | 115 | finally: 116 | os.chdir(start_dir) 117 | pf('') 118 | 119 | # ZoomSet end 120 | 121 | if __name__=='__main__': 122 | parser = optparse.OptionParser( 123 | usage="usage: %prog tiles_dir ...", 124 | version=version, 125 | ) 126 | parser.add_option("-v", "--verbose", action="store_true", dest="verbose") 127 | parser.add_option("-z", "--zoom", dest="zoom", type='int', 128 | help='target zoom level)') 129 | parser.add_option("-q", "--quiet", action="store_true") 130 | parser.add_option("-d", "--debug", action="store_true") 131 | 132 | (options, args) = parser.parse_args() 133 | logging.basicConfig(level=logging.DEBUG if options.debug else 134 | (logging.ERROR if options.quiet else logging.INFO)) 135 | 136 | if options.zoom == None: 137 | parser.error('No target zoom specified') 138 | 139 | start_dir=os.getcwd() 140 | for tiles_dir in args if len(args)>0 else ['.']: 141 | ZoomSet(tiles_dir).zoom_out(options.zoom) 142 | 143 | -------------------------------------------------------------------------------- /QuickStart.md: 
-------------------------------------------------------------------------------- 1 | ## Overview 2 | 3 | `gdal_tiler.py` creates a folder with a tile pyramid from a map file. By default the tile pyramid is created in a Google-compatible format. The script also puts a few helper files into the pyramid folder, so it's possible to check the result of the conversion with a web browser. 4 | 5 | For Ozi maps a corresponding raster file (JPEG, PNG, TIFF) must be in the same folder. `.ozf2` and `.ozfx3` raster files can be converted to TIFF by `ozf_decoder.py`. 6 | 7 | `tiles_convert.py` converts a set of tiles between a number of formats. 8 | 9 | ## Quick Start 10 | 11 | In the simplest case, the data preparation for maemo-mapper looks like this: 12 | 13 | gdal_tiler.py my_map.kap 14 | or 15 | gdal_tiler.py my_map.map 16 | or 17 | gdal_tiler.py my_map.geo 18 | 19 | ## Notes 20 | 21 | * The scripts have a `--help` option. For example: 22 | 23 | /home/misc/bsb/test-charts$ gdal_tiler.py --help 24 | Usage: gdal_tiler.py ... source... 25 | 26 | Tile cutter for GDAL-compatible raster maps 27 | 28 | Options: 29 | --version show program's version number and exit 30 | -h, --help show this help message and exit 31 | -p PROFILE, --profile=PROFILE, --to=PROFILE 32 | output tiles profile (default: zyx) 33 | -f, --list-profiles list tile profiles 34 | -z ZOOM_LIST, --zoom=ZOOM_LIST 35 | list of zoom ranges to generate 36 | --srs=SOURCE_SRS override source's spatial reference system 37 | --tiles-srs=TILES_SRS 38 | target SRS for generic profile 39 | --tile-size=SIZE_X,SIZE_Y 40 | generic profile: tile size (default: 256,256) 41 | --zoom0-tiles=NTILES_X,NTILES_Y 42 | generic profile: number of tiles along the axis at the 43 | zoom 0 (default: 1,1) 44 | --overview-resampling=METHOD1 45 | overview tiles resampling method (default: nearest) 46 | --base-resampling=METHOD2 47 | base image resampling method (default: nearest) 48 | -r, --release set resampling options to (antialias,bilinear) 49 | --tps Force use of thin plate spline transformer based on 50 | available GCPs) 51 | -c, --cut cut the raster as per cutline provided either by 52 | source or by "--cutline" option 53 | --cutline=DATASOURCE cutline data: OGR datasource 54 | --cutline-match-name match OGR feature field "Name" against source name 55 | --cutline-blend=N CUTLINE_BLEND_DIST in pixels 56 | --src-nodata=N[,N]... 57 | Nodata values for input bands 58 | --dst-nodata=N Assign nodata value for output paletted band 59 | --tiles-prefix=URL prefix for tile URLs at googlemaps.hml 60 | --tile-format=FMT tile image format (default: png) 61 | --paletted convert tiles to paletted format (8 bit/pixel) 62 | -t DEST_DIR, --dest-dir=DEST_DIR 63 | destination directory (default: source) 64 | --noclobber skip processing if the target pyramid already exists 65 | -s, --strip-dest-ext do not add a default extension suffix from a 66 | destination directory 67 | -q, --quiet 68 | -d, --debug 69 | -l, --long-name give an output file a long name 70 | -n, --after-name give an output file name after a map name (from 71 | metadata) 72 | -m, --after-map give an output file name after name of a map file, 73 | otherwise after a name of an image file 74 | 75 | * Some map files have a description for a "useful" part of a raster (a border polygon). 
This region can be "cut out" by `gdal_tiler.py` during pyramid generation: 76 | 77 | gdal_tiler.py --cut my_map.kap 78 | 79 | * There are a few maps where a "useless" area is painted in some color (or you can paint it yourself). Such areas can be marked with the `--src-nodata` option: 80 | 81 | gdal_tiler.py --src-nodata=000,111,222 my_map.vrt 82 | 83 | * By default `gdal_tiler.py` renders draft-quality tiles; to create a "clean copy" you can use the `--release` option (which is a shortcut for `--overview-resampling=antialias --base-resampling=cubic`): 84 | 85 | gdal_tiler.py --release my_map.vrt 86 | 87 | * A range of zoom levels generated by `gdal_tiler.py` can be set with the `--zoom` option: 88 | 89 | gdal_tiler.py --zoom=9-15 my_map.vrt 90 | 91 | * `tiles_merge.py` can be used to 'sew' a number of maps into one: 92 | 93 | tiles_merge.py map1-folder map2-folder map3-folder result-folder 94 | 95 | > where each mapN-folder is a folder generated by `gdal_tiler.py` 96 | 97 | * `tiles_convert.py` allows selecting a subset of tiles to be copied to the destination tile set, for example: 98 | 99 | tiles_convert.py --region=polygon.shape --zoom=4,7-10 100 | 101 | > where `polygon.shape` is either an OGR (GDAL)-compatible dataset or a SASPlanet highlighting file 102 | 103 | * It's possible to make a very rough geo-referencing ('calibration') of scanned maps: you place an image into a Google Earth layer, make the layer semitransparent, adjust and align it against the satellite imagery, then export it into a KML file (not KMZ). Then you can process the image as described above. 104 | -------------------------------------------------------------------------------- /tilers_tools/reader_kml.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | # 2011-04-11 10:58:17 5 | 6 | ############################################################################### 7 | # Copyright (c) 2010, Vadim Shlyakhov 8 | # 9 | # Permission is hereby granted, free of charge, to any person obtaining a 10 | # copy of this software and associated documentation files (the "Software"), 11 | # to deal in the Software without restriction, including without limitation 12 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 13 | # and/or sell copies of the Software, and to permit persons to whom the 14 | # Software is furnished to do so, subject to the following conditions: 15 | # 16 | # The above copyright notice and this permission notice shall be included 17 | # in all copies or substantial portions of the Software. 18 | # 19 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 20 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 22 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 24 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 25 | # DEALINGS IN THE SOFTWARE. 
26 | ############################################################################### 27 | 28 | from __future__ import with_statement 29 | 30 | import os 31 | import logging 32 | import math 33 | 34 | from optparse import OptionParser 35 | 36 | from tiler_functions import * 37 | from reader_backend import * 38 | 39 | def kml_parm(hdr,name,lst=False): 40 | l=re.split('' % name,hdr) 41 | # return only even elements as they are inside 42 | return [i.strip() for i in l[1::2]] if lst else l[1].strip() 43 | 44 | class KmlMap(SrcMap): 45 | magic='' not in header: 57 | raise Exception(" Invalid file: %s" % self.file) 58 | return header 59 | 60 | def get_layers(self): 61 | for layer_data in kml_parm(self.header,'GroundOverlay', lst=True): # get list of content 62 | yield KmlLayer(self,layer_data) 63 | # KmlMap 64 | reader_class_map.append(KmlMap) 65 | 66 | class KmlLayer(SrcLayer): 67 | 68 | def get_refs(self): 69 | 'get a list of geo refs in tuples' 70 | 71 | layer=self.data 72 | 73 | if '' in layer: 74 | src_refs=[map(float,i.split(',')) for i in kml_parm(layer,'coordinates').split()] 75 | else: # assume LatLonBox 76 | assert '' in layer 77 | north,south,east,west=[float(kml_parm(layer,parm)) for parm in ('north','south','east','west')] 78 | src_refs=[(west,south),(east,south),(east,north),(west,north)] 79 | 80 | dst_refs = GdalTransformer(SRC_SRS=proj_cs2geog_cs(self.map.proj), DST_SRS=self.map.proj).transform(src_refs) 81 | if '' in layer: 82 | north,south,east,west=[float(dst_refs[i][j]) for i,j in ((2,1),(0,1),(1,0),(0,0))] 83 | angle=math.radians(float(kml_parm(layer,'rotation'))) 84 | dx=east-west 85 | dy=north-south 86 | xc=(west +east )/2 87 | yc=(south+north)/2 88 | x1=dy*math.sin(angle) 89 | x2=dx*math.cos(angle) 90 | y1=dy*math.cos(angle) 91 | y2=dx*math.sin(angle) 92 | x0=xc-(x1+x2)/2 93 | y0=yc-(y1+y2)/2 94 | dst_refs=[(x0+x1,y0),(x0+x1+x2,y0+y2),(x0+x2,y0+y1+y2),(x0,y0+y1)] 95 | ld(dst_refs) 96 | 97 | w, h=(self.raster_ds.RasterXSize,self.raster_ds.RasterYSize) 98 | ld('w, h',w, h) 99 | corners=[(0,h),(w,h),(w,0),(0,0)] 100 | ids=[str(i+1) for i in range(4)] 101 | 102 | refs=RefPoints(self, 103 | ids=[str(i+1) for i in range(4)], 104 | pixels=[(0,h),(w,h),(w,0),(0,0)], 105 | cartesian=dst_refs) 106 | return refs 107 | 108 | def get_plys(self): 109 | 'boundary polygon' 110 | 111 | mpointlst=shape2mpointlst(self.map.file,self.map.proj,self.name) 112 | if not mpointlst: 113 | return None 114 | 115 | plys=RefPoints(self,cartesian=mpointlst[0]) 116 | return plys 117 | 118 | def get_srs(self): 119 | return self.map.proj, None 120 | 121 | def get_raster(self): 122 | img_uri=strip_html(kml_parm(self.data,'href')) 123 | map_dir=os.path.split(self.map.file)[0] 124 | if not map_dir: 125 | map_dir=u'.' 
126 | 127 | imp_path_slashed=img_uri.replace('\\','/') # replace windows slashes 128 | imp_path_lst=imp_path_slashed.split('/') 129 | img_patt=imp_path_lst[-1].lower() 130 | match=[i for i in os.listdir(map_dir) if i.lower() == img_patt] 131 | try: 132 | return os.path.join(map_dir, match[0]) 133 | except IndexError: raise Exception("*** Image file not found: %s" % img_uri) 134 | 135 | def get_name(self): 136 | return kml_parm(self.data,'name') 137 | 138 | # KmlLayer 139 | 140 | if __name__=='__main__': 141 | print('\nPlease use convert2gdal.py\n') 142 | sys.exit(1) 143 | 144 | -------------------------------------------------------------------------------- /tilers_tools/reader_geo.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | # 2011-03-01 16:32:36 5 | 6 | ############################################################################### 7 | # Copyright (c) 2010, Vadim Shlyakhov 8 | # 9 | # Permission is hereby granted, free of charge, to any person obtaining a 10 | # copy of this software and associated documentation files (the "Software"), 11 | # to deal in the Software without restriction, including without limitation 12 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 13 | # and/or sell copies of the Software, and to permit persons to whom the 14 | # Software is furnished to do so, subject to the following conditions: 15 | # 16 | # The above copyright notice and this permission notice shall be included 17 | # in all copies or substantial portions of the Software. 18 | # 19 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 20 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 22 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 24 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 25 | # DEALINGS IN THE SOFTWARE. 
26 | ############################################################################### 27 | 28 | from __future__ import with_statement 29 | 30 | import os 31 | import logging 32 | import locale 33 | 34 | from optparse import OptionParser 35 | 36 | from tiler_functions import * 37 | from reader_backend import * 38 | 39 | class GeoNosMap(SrcMap): 40 | magic = '[MainChart]' 41 | data_file = 'data_bsb.csv' 42 | 43 | def get_header(self): 44 | 'read map header' 45 | with open(self.file, 'rU') as f: 46 | hdr=[[i.strip() for i in l.decode('iso8859-1','ignore').split('=')] for l in f] 47 | if not (hdr and hdr[0][0] == '[MainChart]'): 48 | raise Exception(" Invalid file: %s" % self.file) 49 | ld(hdr) 50 | return hdr 51 | 52 | def get_layers(self): 53 | return [GeoNosLayer(self,self.header)] 54 | #GeoNosMap 55 | reader_class_map.append(GeoNosMap) 56 | 57 | class GeoNosLayer(SrcLayer): 58 | 59 | def hdr_parms(self, patt): 60 | 'filter header for params starting with "patt"' 61 | plen=len(patt) 62 | return [('%s %s' % (i[0][plen:],i[1]) if len(i[0]) > plen else i[1]) 63 | for i in self.data if i[0].startswith(patt)] 64 | 65 | def hdr_parms2list(self, patt): 66 | return [s.split() for s in self.hdr_parms(patt)] 67 | 68 | def get_dtm(self): 69 | 'get DTM northing, easting' 70 | dtm_parm=self.map.options.dtm_shift 71 | if dtm_parm is None: 72 | try: 73 | dtm = [float(self.hdr_parms(i)[0]) for i in ('Longitude Offset','Latitude Offset')] 74 | ld('DTM',dtm) 75 | except (IndexError, ValueError): # DTM not found 76 | ld('DTM not found') 77 | return None 78 | else: 79 | denominator = 3600 # seconds if options.dtm_shift 80 | dtm = [float(s)/denominator for s in dtm_parm] 81 | return dtm if dtm != [0,0] else None 82 | 83 | def get_refs(self): 84 | 'get a list of geo refs in tuples' 85 | refs=LatLonRefPoints(self,[( 86 | i[0], # id 87 | (int(i[4]),int(i[3])), # pixel 88 | (float(i[1]),float(i[2])) # lat/long 89 | ) for i in self.hdr_parms2list('Point')]) 90 | return refs 91 | 92 | def get_plys(self): 93 | 'boundary polygon' 94 | plys=RefPoints(self,latlong=[ 95 | (float(i[2]),float(i[1])) # lat/long 96 | for i in self.hdr_parms2list('Vertex')]) 97 | return plys 98 | 99 | def get_proj_id(self): 100 | return self.hdr_parms('Projection')[0] 101 | 102 | def get_proj(self): 103 | proj_id=self.get_proj_id() 104 | try: 105 | proj_parm=self.map.srs_defs['proj'][proj_id.upper()] 106 | proj = [proj_parm[0]] 107 | except KeyError: 108 | raise Exception("*** Unsupported projection (%s)" % proj_id) 109 | return proj 110 | 111 | def get_datum_id(self): 112 | return self.hdr_parms('Datum')[0] 113 | 114 | def get_datum(self): 115 | datum_id=self.get_datum_id() 116 | try: 117 | datum=self.map.srs_defs['datum'][datum_id.upper()][0] 118 | except KeyError: 119 | dtm=self.get_dtm() # get northing, easting to WGS84 if any 120 | datum='+datum=WGS84' 121 | if dtm: 122 | logging.warning(' Unknown datum "%s", assumed as WGS 84 with DTM shifts' % datum_id) 123 | else: # assume DTM is 0,0 124 | logging.warning(' Unknown datum "%s", assumed as WGS 84' % datum_id) 125 | return datum.split(' ') 126 | 127 | def get_raster(self): 128 | name_patt=self.hdr_parms('Bitmap')[0].lower() 129 | map_dir,map_fname=os.path.split(self.map.file) 130 | dir_lst=os.listdir(map_dir if map_dir else u'.') 131 | match=[i for i in dir_lst if i.lower() == name_patt] 132 | try: 133 | fn=match[0] 134 | ld(map_dir, fn) 135 | img_file=os.path.join(map_dir, fn) 136 | except: 137 | raise Exception("*** Image file not found: %s" % img_path) 138 | return img_file 139 | 140 | 
def get_size(self): 141 | with open(self.img_file) as img: 142 | hdr=img.readline() 143 | assert hdr.startswith('NOS/') 144 | patt='RA=' 145 | sz=hdr[hdr.index(patt)+len(patt):].split(',')[2:4] 146 | return map(int,sz) 147 | 148 | def get_name(self): 149 | return self.hdr_parms('Name')[0] 150 | 151 | # GeoNosLayer 152 | 153 | if __name__=='__main__': 154 | 155 | print('\nPlease use convert2gdal.py\n') 156 | sys.exit(1) 157 | 158 | -------------------------------------------------------------------------------- /tilers_tools/map2gdal.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | # 2011-04-10 13:33:20 5 | 6 | ############################################################################### 7 | # Copyright (c) 2011, Vadim Shlyakhov 8 | # 9 | # Permission is hereby granted, free of charge, to any person obtaining a 10 | # copy of this software and associated documentation files (the "Software"), 11 | # to deal in the Software without restriction, including without limitation 12 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 13 | # and/or sell copies of the Software, and to permit persons to whom the 14 | # Software is furnished to do so, subject to the following conditions: 15 | # 16 | # The above copyright notice and this permission notice shall be included 17 | # in all copies or substantial portions of the Software. 18 | # 19 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 20 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 22 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 24 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 25 | # DEALINGS IN THE SOFTWARE. 
26 | #****************************************************************************** 27 | 28 | from __future__ import with_statement 29 | 30 | import os 31 | import logging 32 | 33 | from optparse import OptionParser 34 | 35 | from tiler_functions import * 36 | 37 | import reader_backend 38 | import reader_bsb 39 | import reader_geo 40 | import reader_ozi 41 | import reader_kml 42 | 43 | options = None 44 | 45 | def process_src(src, no_error=False, opt=None): 46 | """ 47 | if source is converted successfully returns 48 | (, True) 49 | otherwise returns 50 | (, False) 51 | """ 52 | global options 53 | 54 | src = src.decode(locale.getpreferredencoding(),'ignore') 55 | 56 | if not opt: 57 | opt = LooseDict(options) 58 | 59 | with open(src,'rU') as f: 60 | lines=[f.readline() for i in range(30)] 61 | 62 | err_msg = None 63 | for cls in reader_backend.reader_class_map: 64 | patt=cls.magic 65 | if any((l.startswith(patt) for l in lines)): 66 | try: 67 | res = [(layer.convert(), True) for layer in cls(src,options=opt).get_layers()] 68 | return res 69 | except RuntimeError as exc: 70 | err_msg = exc.message 71 | if not no_error: 72 | raise 73 | else: 74 | if no_error: 75 | return [(src, False)] 76 | if err_msg is None: 77 | err_msg = '*** %s' % exc.message 78 | if self.options.skip_invalid: 79 | logging.error(err_msg) 80 | return False 81 | else: 82 | raise RuntimeError(err_msg) 83 | 84 | 85 | parser = None 86 | #---------------------------- 87 | 88 | def parse_args(arg_lst): 89 | 90 | #---------------------------- 91 | parser = OptionParser( 92 | usage="usage: %prog ... map_file...", 93 | version=version, 94 | description="Extends GDAL's builtin support for a few mapping formats: BSB/KAP, GEO/NOS, Ozi map. " 95 | "The script translates a map file with into GDAL .vrt") 96 | parser.add_option("--srs", default=None, 97 | help="specify a full coordinate system for an output file (PROJ.4 definition)") 98 | parser.add_option("--datum", default=None, 99 | help="override a datum part only (PROJ.4 definition)") 100 | parser.add_option("--proj", default=None, 101 | help="override a projection part only (PROJ.4 definition)") 102 | parser.add_option("--force-dtm", action="store_true", 103 | help='force using BSB datum shift to WGS84 instead of native BSB datum') 104 | parser.add_option("--dtm",dest="dtm_shift",default=None,metavar="SHIFT_LONG,SHIFT_LAT", 105 | help='northing and easting to WGS84 datum in seconds of arc') 106 | parser.add_option('--tps', action="store_true", 107 | help='Force use of thin plate spline transformer based on available GCPs)') 108 | parser.add_option("--get-cutline", action="store_true", 109 | help='print a definition of a cutline polygon, then exit') 110 | parser.add_option("--cut-file", action="store_true", 111 | help='create a .GMT file with a cutline polygon') 112 | parser.add_option("-t", "--dest-dir", default=None, dest="dst_dir", 113 | help='destination directory (default: current)') 114 | parser.add_option("-n", "--after-name", action="store_true", 115 | help='give an output file name after a map name (from metadata)') 116 | parser.add_option("-m", "--after-map", action="store_true", 117 | help='give an output file name after name of a map file, otherwise after a name of an image file') 118 | parser.add_option("-l", "--long-name", action="store_true", 119 | help='give an output file a long name') 120 | parser.add_option("--skip-invalid", action="store_true", 121 | help='skip invalid/unrecognized source') 122 | parser.add_option("-d", "--debug", action="store_true", 
dest="debug") 123 | parser.add_option("-q", "--quiet", action="store_true", dest="quiet") 124 | # parser.add_option("--last-column-bug", action="store_true", 125 | # help='some BSB files are missing value for last column, here is a workaround') 126 | # parser.add_option("--broken-raster", action="store_true", 127 | # help='try to workaround some BSB broken rasters (requires "convert" from ImageMagick)') 128 | 129 | return parser.parse_args(arg_lst) 130 | 131 | if __name__=='__main__': 132 | (options, args) = parse_args(sys.argv[1:]) 133 | 134 | #~ if not args: 135 | #~ parser.error('No input file(s) specified') 136 | 137 | logging.basicConfig(level=logging.DEBUG if options.debug else 138 | (logging.ERROR if options.quiet else logging.INFO)) 139 | 140 | ld(os.name) 141 | ld(options) 142 | 143 | map(process_src,args) 144 | -------------------------------------------------------------------------------- /tilers_tools/tiles_convert.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2010, 2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 24 | ############################################################################### 25 | 26 | import sys 27 | import logging 28 | import optparse 29 | 30 | from tiler_functions import * 31 | 32 | from converter_backend import TileSet, TileConverter 33 | import converter_xyz 34 | import converter_maemomapper 35 | import converter_sasplanet 36 | try: 37 | import converter_mmaps 38 | except ImportError: 39 | pass 40 | 41 | #~ import rpdb2; rpdb2.start_embedded_debugger('nRAmgJHm') 42 | 43 | #---------------------------- 44 | 45 | def convert(src_lst, options): 46 | 47 | #---------------------------- 48 | 49 | in_class = TileSet.get_class(options.in_fmt, isDest=False) 50 | out_class = TileSet.get_class(options.out_fmt, isDest=True) 51 | 52 | for src in src_lst: 53 | src_tile_set = in_class(src, options) 54 | out_class(options=options, src=src_tile_set).convert() 55 | 56 | #---------------------------- 57 | 58 | def main(argv): 59 | 60 | #---------------------------- 61 | parser = optparse.OptionParser( 62 | usage='usage: %prog [...] 
...', 63 | version=version, 64 | description='copies map tiles from one structure to another') 65 | parser.add_option('--from', dest='in_fmt', default='zyx', 66 | help='input tiles profile (default: zyx)') 67 | try: 68 | converter_mmaps # test availability 69 | parser.add_option('--to', dest='out_fmt', default='mmaps', 70 | help='output tiles profile (default: mmaps)') 71 | except NameError: 72 | parser.add_option('--to', dest='out_fmt', default='xyz', 73 | help='output tiles profile (default: xyz)') 74 | parser.add_option('--list-profiles', '--lp', action='store_true', 75 | help='list available profiles') 76 | parser.add_option('-f', '--tile-format', dest='convert_tile', metavar='FORMAT', 77 | help='convert output tiles to format (default: no conversion)') 78 | parser.add_option('--list-formats', '--lf', action='store_true', 79 | help='list tile format converters') 80 | parser.add_option("-n", "--colors", dest="colors", default='256', 81 | help='Specifies the number of colors for pngnq profile (default: 256)') 82 | parser.add_option("-q", "--quality", dest="quality", type="int", default=75, 83 | help='JPEG/WEBP quality (default: 75)') 84 | parser.add_option('-a', '--append', action='store_true', dest='append', 85 | help='append tiles to an existing destination') 86 | parser.add_option('-r', '--remove-dest', action='store_true',dest='remove_dest', 87 | help='delete destination directory before merging') 88 | parser.add_option('-t', '--dest-dir', default='.', dest='dst_dir', 89 | help='destination directory (default: current)') 90 | parser.add_option('--name', default=None, 91 | help='layer name (default: derived from the source)') 92 | parser.add_option('--description', metavar='TXT', default='', 93 | help='layer decription (default: None)') 94 | parser.add_option('--overlay', action='store_true', 95 | help='non-base layer (default: False)') 96 | parser.add_option('--url', default=None, 97 | help='URL template (default: None)') 98 | parser.add_option('--link', action='store_true', dest='link', 99 | help='make links to source tiles instead of copying if possible') 100 | parser.add_option("--srs", default='EPSG:3857', dest="tiles_srs", 101 | help="code of a spatial reference system of a tile set (default is EPSG:3857, aka EPSG:900913)") 102 | parser.add_option("--proj4def", default=None, metavar="PROJ4_SRS", 103 | help="proj4 definition for the SRS") 104 | parser.add_option('-z', '--zoom', default=None,metavar='ZOOM_LIST', 105 | help='list of zoom ranges to process') 106 | parser.add_option('-g', '--region', default=None, metavar='DATASOURCE', 107 | help='region to process (OGR shape or Sasplanet .hlg)') 108 | parser.add_option('--region-zoom', metavar='N', type="int", default=None, 109 | help='apply region for zooms only higher than this one (default: None)') 110 | parser.add_option("--nothreads", action="store_true", 111 | help="do not use multiprocessing") 112 | 113 | parser.add_option('-d', '--debug', action='store_true', dest='debug') 114 | parser.add_option('--quiet', action='store_true', dest='quiet') 115 | 116 | #~ global options 117 | (options, args) = parser.parse_args(argv[1:]) 118 | 119 | logging.basicConfig(level=logging.DEBUG if options.debug else 120 | (logging.ERROR if options.quiet else logging.INFO)) 121 | log(options.__dict__) 122 | 123 | if options.list_profiles: 124 | TileSet.list_profiles() 125 | sys.exit(0) 126 | 127 | if options.list_formats: 128 | TileConverter.list_tile_converters() 129 | sys.exit(0) 130 | 131 | src_lst=args 132 | 133 | convert(src_lst, 
LooseDict(options)) 134 | 135 | # main() 136 | 137 | if __name__ == '__main__': 138 | 139 | main(sys.argv) 140 | -------------------------------------------------------------------------------- /tilers_tools/reader_bsb.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | # 2011-04-11 10:58:17 5 | 6 | ############################################################################### 7 | # Copyright (c) 2010, Vadim Shlyakhov 8 | # 9 | # Permission is hereby granted, free of charge, to any person obtaining a 10 | # copy of this software and associated documentation files (the "Software"), 11 | # to deal in the Software without restriction, including without limitation 12 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 13 | # and/or sell copies of the Software, and to permit persons to whom the 14 | # Software is furnished to do so, subject to the following conditions: 15 | # 16 | # The above copyright notice and this permission notice shall be included 17 | # in all copies or substantial portions of the Software. 18 | # 19 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 20 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 22 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 24 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 25 | # DEALINGS IN THE SOFTWARE. 26 | ############################################################################### 27 | 28 | from __future__ import with_statement 29 | 30 | import os 31 | import logging 32 | import locale 33 | 34 | from optparse import OptionParser 35 | 36 | from tiler_functions import * 37 | from reader_backend import * 38 | 39 | class BsbKapMap(SrcMap): 40 | magic = 'KNP/' 41 | data_file = 'data_bsb.csv' 42 | 43 | def get_header(self): 44 | 'read map header' 45 | header=[] 46 | with open(self.file,'rU') as f: 47 | for l in f: 48 | if '\x1A' in l: 49 | break 50 | l=l.decode('iso8859-1','ignore') 51 | if l.startswith((' ','\t')): 52 | header[-1] += ','+l.strip() 53 | else: 54 | header.append(l.strip()) 55 | ld('header', header) 56 | if not (header and any(((s.startswith('BSB/') or s.startswith('KNP/')) for s in header))): 57 | raise Exception(" Invalid file: %s" % self.file) 58 | return header 59 | 60 | def get_layers(self): 61 | return [BsbLayer(self,self.header)] 62 | # BsbKapMap 63 | reader_class_map.append(BsbKapMap) 64 | 65 | class BsbLayer(SrcLayer): 66 | 67 | def hdr_parms(self, patt): 68 | 'filter header for params starting with "patt/"' 69 | if patt != '!': 70 | patt += '/' 71 | return [i[len(patt):] for i in self.data if i.startswith(patt)] 72 | 73 | def hdr_parms2list(self, knd): 74 | return [i.split(',') for i in self.hdr_parms(knd)] 75 | 76 | def hdr_parm2dict(self, knd): 77 | out={} 78 | for i in self.hdr_parms2list(knd)[0]: 79 | if '=' in i: 80 | (key,val)=i.split('=') 81 | out[key]=val 82 | else: 83 | out[key] += ','+i 84 | return out 85 | 86 | def get_dtm(self): 87 | 'get DTM northing, easting' 88 | if self.map.options.dtm_shift is not None: 89 | dtm_parm=self.map.options.dtm_shift.split(',') 90 | else: 91 | try: 92 | dtm_parm=self.hdr_parms2list('DTM')[0] 93 | ld('DTM',dtm_parm) 94 | except IndexError: # DTM not found 95 | ld('DTM not found') 96 | 
dtm_parm=[0,0] 97 | dtm=[float(s)/3600 for s in reversed(dtm_parm)] 98 | return dtm if dtm != [0,0] else None 99 | 100 | def get_refs(self): 101 | 'get a list of geo refs in tuples' 102 | 103 | # https://code.google.com/p/tilers-tools/issues/detail?id=9 104 | # ---- remove duplicate refs 105 | # compensate for NOAA charts having 106 | # duplicate REF entries in 2013 catalog 107 | 108 | refLst = self.hdr_parms2list('REF') 109 | 110 | unique_refs = set() 111 | for ref in refLst: 112 | val = tuple(ref[1:len(ref)]) 113 | if val not in unique_refs: 114 | unique_refs.add(val) 115 | else: 116 | refLst.remove(ref) 117 | 118 | refs=LatLonRefPoints(self,[( 119 | i[0], # id 120 | (int(i[1]),int(i[2])), # pixel 121 | (float(i[4]),float(i[3])) # lat/long 122 | ) for i in refLst]) 123 | return refs 124 | 125 | def get_plys(self): 126 | 'boundary polygon' 127 | plys=RefPoints(self,latlong=[ 128 | (float(i[2]),float(i[1])) # lat/long 129 | for i in self.hdr_parms2list('PLY')]) 130 | return plys 131 | 132 | def assemble_parms(self,parm_map,parm_info): 133 | check_parm=lambda s: (s not in ['NOT_APPLICABLE','UNKNOWN']) and s.replace('0','').replace('.','') 134 | return ['+%s=%s' % (parm_map[i],parm_info[i]) for i in parm_map 135 | if i in parm_info and check_parm(parm_info[i])] 136 | 137 | def get_proj_id(self): 138 | return self.hdr_parm2dict('KNP')['PR'] 139 | 140 | def get_proj(self): 141 | knp_info=self.hdr_parm2dict('KNP') 142 | ld(knp_info) 143 | proj_id=self.get_proj_id() 144 | try: 145 | proj_parm=self.map.srs_defs['proj'][proj_id.upper()] 146 | proj = [proj_parm[0]] 147 | knp_parm = dict((i.split(':',1) for i in proj_parm[1:] if ':' in i)) 148 | ld('get_proj KNP', proj_id, proj, knp_parm) 149 | except KeyError: 150 | raise Exception(' Unsupported projection %s' % proj_id) 151 | # get projection and parameters 152 | try: # extra projection parameters for BSB 3.xx, put them before KNP parms 153 | knq_info=self.hdr_parm2dict('KNQ') 154 | knq_parm = dict((i.split(':',1) for i in knp_parm['KNQ'].split(','))) 155 | ld('get_proj KNQ', knq_info, knq_parm) 156 | proj.extend(self.assemble_parms(knq_parm,knq_info)) 157 | except IndexError: # No KNQ 158 | pass 159 | except KeyError: # No such proj in KNQ map 160 | pass 161 | proj.extend(self.assemble_parms(knp_parm,knp_info)) 162 | ld('get_proj', proj) 163 | return proj 164 | 165 | def get_datum_id(self): 166 | return self.hdr_parm2dict('KNP')['GD'] 167 | 168 | def get_datum(self): 169 | datum_id=self.get_datum_id() 170 | try: 171 | datum=self.map.srs_defs['datum'][datum_id.upper()][0] 172 | except KeyError: 173 | # try to guess the datum by comment and copyright string(s) 174 | crr=(' '.join(self.hdr_parms('!')+self.hdr_parms('CRR'))).upper() 175 | try: 176 | guess_dict = self.map.srs_defs['datum_guess'] 177 | datum=[guess_dict[crr_patt][0] for crr_patt in guess_dict if crr_patt.upper() in crr][0] 178 | logging.warning(' Unknown datum "%s", guessed as "%s"' % (datum_id,datum)) 179 | except IndexError: 180 | # datum still not found 181 | dtm=self.get_dtm() # get northing, easting to WGS 84 if any 182 | datum='+datum=WGS84' 183 | if dtm: 184 | logging.warning(' Unknown datum "%s", assumed as WGS 84 with DTM shifts' % datum_id) 185 | else: # assume DTM is 0,0 186 | logging.warning(' Unknown datum "%s", assumed as WGS 84' % datum_id) 187 | return datum.split(' ') 188 | 189 | def get_raster(self): 190 | return self.map.file 191 | 192 | def get_name(self): 193 | bsb_info=self.hdr_parm2dict('BSB') # general BSB parameters 194 | bsb_name=bsb_info['NA'] 195 | return 
bsb_name 196 | # BsbLayer 197 | 198 | if __name__=='__main__': 199 | print('\nPlease use convert2gdal.py\n') 200 | sys.exit(1) 201 | 202 | -------------------------------------------------------------------------------- /tilers_tools/tiler.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2011-2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 24 | ############################################################################### 25 | 26 | from optparse import OptionParser 27 | 28 | from tiler_functions import * 29 | from tiler_backend import Pyramid, resampling_lst, base_resampling_lst 30 | import tiler_global_mercator 31 | import tiler_plate_carree 32 | import tiler_misc 33 | 34 | import map2gdal 35 | 36 | #~ import rpdb2; rpdb2.start_embedded_debugger('nRAmgJHm') 37 | 38 | #---------------------------- 39 | 40 | def preprocess_src(src): 41 | 42 | #---------------------------- 43 | global options 44 | opt = LooseDict(options) 45 | res = map2gdal.process_src(src, no_error=True, opt=opt) 46 | ld('preprocess_src', res) 47 | return res 48 | 49 | #---------------------------- 50 | 51 | def process_src(src_def): 52 | 53 | #---------------------------- 54 | global options 55 | opt = LooseDict(options) 56 | opt.tile_format = opt.tile_format.lower() 57 | opt.tile_ext = '.' + opt.tile_format 58 | src, delete_src = src_def 59 | opt.delete_src = delete_src 60 | 61 | profile = Pyramid.profile_class(opt.profile) 62 | ext = profile.defaul_ext if opt.strip_dest_ext is None else '' 63 | dest = dest_path(src, opt.dest_dir, ext) 64 | 65 | prm = profile(src, dest, opt) 66 | prm.generate_tiles() 67 | 68 | #---------------------------- 69 | 70 | def parse_args(arg_lst): 71 | 72 | #---------------------------- 73 | parser = OptionParser( 74 | usage = "usage: %prog ... 
source...", 75 | version=version, 76 | description='Tile cutter for GDAL-compatible raster maps') 77 | parser.add_option('-p', '--profile', '--to', dest="profile", metavar='PROFILE', 78 | default='zyx', choices=Pyramid.profile_lst(), 79 | help='output tiles profile (default: zyx)') 80 | parser.add_option("-f", "--list-profiles", action="store_true", 81 | help='list tile profiles') 82 | parser.add_option("-z", "--zoom", default=None, metavar="ZOOM_LIST", 83 | help='list of zoom ranges to generate') 84 | parser.add_option("--srs", default=None, metavar="SOURCE_SRS", 85 | help="override source's spatial reference system") 86 | parser.add_option("--tiles-srs", default=None, metavar="TILES_SRS", 87 | help="target SRS for generic profile") 88 | parser.add_option("--tile-size", default='256,256', metavar="SIZE_X,SIZE_Y", 89 | help='generic profile: tile size (default: 256,256)') 90 | parser.add_option("--zoom0-tiles", default='1,1', metavar="NTILES_X,NTILES_Y", 91 | help='generic profile: number of tiles along the axis at the zoom 0 (default: 1,1)') 92 | parser.add_option('--overview-resampling', default='nearest', metavar="METHOD1", 93 | choices=resampling_lst(), 94 | help='overview tiles resampling method (default: nearest)') 95 | parser.add_option('--base-resampling', default='nearest', metavar="METHOD2", 96 | choices=base_resampling_lst(), 97 | help='base image resampling method (default: nearest)') 98 | parser.add_option('-r', '--release', action="store_true", 99 | help='set resampling options to (antialias,bilinear)') 100 | parser.add_option('--tps', action="store_true", 101 | help='Force use of thin plate spline transformer based on available GCPs)') 102 | parser.add_option("-c", "--cut", action="store_true", 103 | help='cut the raster as per cutline provided either by source or by "--cutline" option') 104 | parser.add_option("--cutline", default=None, metavar="DATASOURCE", 105 | help='cutline data: OGR datasource') 106 | parser.add_option("--cutline-match-name", action="store_true", 107 | help='match OGR feature field "Name" against source name') 108 | parser.add_option("--cutline-blend", dest="blend_dist", default=None, metavar="N", 109 | help='CUTLINE_BLEND_DIST in pixels') 110 | parser.add_option("--src-nodata", dest="src_nodata", metavar='N[,N]...', 111 | help='Nodata values for input bands') 112 | parser.add_option("--dst-nodata", dest="dst_nodata", metavar='N', 113 | help='Assign nodata value for output paletted band') 114 | parser.add_option("--tiles-prefix", default='', metavar="URL", 115 | help='prefix for tile URLs at googlemaps.hml') 116 | parser.add_option("--tile-format", default='png', metavar="FMT", 117 | help='tile image format (default: png)') 118 | parser.add_option("--paletted", action="store_true", 119 | help='convert tiles to paletted format (8 bit/pixel)') 120 | parser.add_option("-t", "--dest-dir", dest="dest_dir", default=None, 121 | help='destination directory (default: source)') 122 | parser.add_option("--noclobber", action="store_true", 123 | help='skip processing if the target pyramid already exists') 124 | parser.add_option("-s", "--strip-dest-ext", action="store_true", 125 | help='do not add a default extension suffix from a destination directory') 126 | # parser.add_option("--viewer-copy", action="store_true", 127 | # help='on POSIX systems copy html viewer instead of hardlinking to the original location') 128 | parser.add_option("-q", "--quiet", action="store_const", 129 | const=0, default=1, dest="verbose") 130 | parser.add_option("-d", "--debug", 
action="store_const", 131 | const=2, dest="verbose") 132 | parser.add_option("-l", "--long-name", action="store_true", 133 | help='give an output file a long name') 134 | parser.add_option("-n", "--after-name", action="store_true", 135 | help='give an output file name after a map name (from metadata)') 136 | parser.add_option("-m", "--after-map", action="store_true", 137 | help='give an output file name after name of a map file, otherwise after a name of an image file') 138 | parser.add_option("--skip-invalid", action="store_true", 139 | help='skip invalid/unrecognized source') 140 | 141 | (options, args) = parser.parse_args(arg_lst) 142 | 143 | return (options, args) 144 | 145 | #---------------------------- 146 | 147 | def main(argv): 148 | 149 | #---------------------------- 150 | 151 | global options 152 | (options, args) = parse_args(argv[1:]) 153 | 154 | logging.basicConfig(level=logging.DEBUG if options.verbose == 2 else 155 | (logging.ERROR if options.verbose == 0 else logging.INFO)) 156 | 157 | ld(os.name) 158 | ld(options) 159 | 160 | if options.list_profiles: 161 | Pyramid.profile_lst(tty=True) 162 | return 163 | 164 | if not args: 165 | logging.error('No input file(s) specified') 166 | sys.exit(1) 167 | 168 | if options.verbose == 2: 169 | set_nothreads() 170 | 171 | if options.release: 172 | options.overview_resampling, options.base_resampling = ('antialias', 'cubic') 173 | 174 | res = parallel_map(preprocess_src, args) 175 | parallel_map(process_src, flatten(res)) 176 | 177 | # main() 178 | 179 | if __name__ == '__main__': 180 | 181 | main(sys.argv) 182 | -------------------------------------------------------------------------------- /tilers_tools/tiler_plate_carree.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2011-2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 
24 | ############################################################################### 25 | 26 | from tiler_functions import * 27 | from tiler_backend import * 28 | 29 | 30 | ############################# 31 | 32 | class PlateCarree(MercatorPyramid): 33 | '''Plate Carrée, top-to-bottom tile numbering (a la Google Earth)''' 34 | ############################# 35 | zoom0_tiles = [2, 1] # tiles at zoom 0 36 | 37 | tilemap_crs = 'EPSG:4326' 38 | 39 | # http://earth.google.com/support/bin/static.py?page=guide.cs&guide=22373&topic=23750 40 | # "Google Earth uses Simple Cylindrical projection for its imagery base. This is a simple map 41 | # projection where the meridians and parallels are equidistant, straight lines, with the two sets 42 | # crossing at right angles. This projection is also known as Lat/Lon WGS84" 43 | 44 | # Equirectangular (EPSG:32662 aka plate carrée, aka Simple Cylindrical) 45 | # we use this because the SRS might be shifted later to work around 180 meridian 46 | srs = '+proj=eqc +datum=WGS84 +ellps=WGS84' 47 | 48 | # set units to degrees, this makes this SRS essentially equivalent to EPSG:4326 49 | srs += ' +to_meter=%f' % (GdalTransformer(DST_SRS=srs, SRC_SRS=proj_cs2geog_cs(srs)).transform_point((1, 0))[0]) 50 | 51 | #~ srs = 'EPSG:4326' 52 | 53 | def kml_child_links(self, children, parent=None, path_prefix=''): 54 | kml_links = [] 55 | # convert tiles to degree boxes 56 | longlat_boxes = self.map_tiles2longlat_bounds(children) 57 | 58 | for tile, longlat in zip(children, longlat_boxes): 59 | #ld(tile, longlat) 60 | w, n, e, s = ['%.11f'%v for v in flatten(longlat)] 61 | name = os.path.splitext(self.tile_path(tile))[0] 62 | # fill in kml link template 63 | kml_links.append( kml_link_templ % { 64 | 'name': name, 65 | 'href': path_prefix+'%s.kml' % name, 66 | 'west': w, 'north': n, 67 | 'east': e, 'south': s, 68 | 'min_lod': 128, 69 | 'max_lod': 2048 if parent else -1, 70 | }) 71 | return ''.join(kml_links) 72 | 73 | def write_kml(self, rel_path, name, links='', overlay=''): 74 | kml = kml_templ % { 75 | 'name': name, 76 | 'links': links, 77 | 'overlay': overlay, 78 | 'dbg_start': '' if self.options.verbose < 2 else ' \n', 80 | } 81 | open(os.path.join(self.dest, rel_path+'.kml'), 'w+').write(kml) 82 | 83 | def write_metadata(self, tile=None, children=[]): 84 | super(PlateCarree, self).write_metadata(tile, children) 85 | 86 | if tile is None: # create top level kml 87 | self.write_kml(os.path.basename(self.base), os.path.basename(self.base), self.kml_child_links(children)) 88 | return 89 | # fill in kml templates 90 | rel_path = self.tile_path(tile) 91 | name = os.path.splitext(rel_path)[0] 92 | kml_links = self.kml_child_links(children, tile, '../../') 93 | tile_box = self.map_tiles2longlat_bounds([tile])[0] 94 | w, n, e, s = ['%r' % v for v in flatten(tile_box)] 95 | kml_overlay = kml_overlay_templ % { 96 | 'name': name, 97 | 'href': os.path.basename(rel_path), 98 | 'min_lod': 128, 99 | 'max_lod': 2048 if kml_links else -1, 100 | 'order': tile[0], 101 | 'west': w, 'north': n, 102 | 'east': e, 'south': s, 103 | } 104 | self.write_kml(name, name, kml_links, kml_overlay) 105 | # PlateCarree 106 | 107 | ############################# 108 | 109 | class PlateCarreeZYX(PlateCarree, ZYXtiling): 110 | 'Plate Carrée, top-to-bottom tile numbering (a la Google Earth) ZYX directory structure' 111 | ############################# 112 | profile = 'geo' 113 | defaul_ext = '.zyx-geo' 114 | tms_profile = 'zyx-geodetic' # non-standard profile 115 | # 116 | profile_map.append(PlateCarreeZYX) 117 
| # 118 | 119 | ############################# 120 | 121 | class PlateCarreeXYZ(PlateCarree, XYZtiling): 122 | 'Plate Carrée, top-to-bottom tile numbering (a la Google Earth)' 123 | ############################# 124 | profile = 'xyz-geo' 125 | defaul_ext = '.xyz-geo' 126 | tms_profile = 'xyz-geodetic' # non-standard profile 127 | # 128 | profile_map.append(PlateCarreeXYZ) 129 | # 130 | 131 | ############################# 132 | 133 | class PlateCarreeTMS(PlateCarree, TMStiling): 134 | 'Plate Carrée, TMS tile numbering (bottom-to-top, global-geodetic - compatible tiles)' 135 | ############################# 136 | profile = 'tms-geo' 137 | defaul_ext = '.tms-geo' 138 | tms_profile = 'global-geodetic' 139 | # 140 | profile_map.append(PlateCarreeTMS) 141 | # 142 | 143 | kml_templ = ''' 144 | 145 | 146 | 147 | 148 | 149 | %(dbg_start)s 152 | %(dbg_end)s %(name)s%(overlay)s%(links)s 153 | 154 | 155 | ''' 156 | 157 | kml_overlay_templ = ''' 158 | 159 | 160 | %(min_lod)s 161 | %(max_lod)s 162 | 163 | 164 | %(west)s %(north)s 165 | %(east)s %(south)s 166 | 167 | 168 | 169 | %(name)s 170 | %(order)s 171 | %(href)s 172 | 173 | %(west)s %(north)s 174 | %(east)s %(south)s 175 | 176 | ''' 177 | 178 | kml_link_templ = ''' 179 | 180 | %(name)s 181 | 182 | 183 | %(min_lod)s 184 | %(max_lod)s 185 | 186 | 187 | %(west)s %(north)s 188 | %(east)s %(south)s 189 | 190 | 191 | onRegion 192 | %(href)s 193 | 194 | ''' 195 | -------------------------------------------------------------------------------- /tilers_tools/tiles_opt.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | 3 | ############################################################################### 4 | # Copyright (c) 2010, 2013 Vadim Shlyakhov 5 | # 6 | # Permission is hereby granted, free of charge, to any person obtaining a 7 | # copy of this software and associated documentation files (the "Software"), 8 | # to deal in the Software without restriction, including without limitation 9 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 10 | # and/or sell copies of the Software, and to permit persons to whom the 11 | # Software is furnished to do so, subject to the following conditions: 12 | # 13 | # The above copyright notice and this permission notice shall be included 14 | # in all copies or substantial portions of the Software. 15 | # 16 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 17 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 18 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 19 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 20 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 21 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 22 | # DEALINGS IN THE SOFTWARE. 
23 | ############################################################################### 24 | 25 | import sys 26 | import os 27 | import stat 28 | import shutil 29 | import logging 30 | import optparse 31 | from PIL import Image 32 | #~ from PIL import WebPImagePlugin 33 | 34 | from tiler_functions import * 35 | 36 | class KeyboardInterruptError(Exception): pass 37 | 38 | converters = [] 39 | 40 | ############################# 41 | 42 | class Converter (object): 43 | 44 | ############################# 45 | prog_name = None 46 | dst_ext = None 47 | src_formats = ('.png',) 48 | 49 | def __init__(self, src_dir, options): 50 | if self.prog_name: 51 | try: # check if converter programme is available 52 | prog_path = command(['which', self.prog_name]).strip() 53 | except: 54 | raise Exception('Can not find %s executable' % self.prog_name) 55 | 56 | self.options = options 57 | self.src_dir = src_dir 58 | self.dst_dir = src_dir + self.dst_ext 59 | pf('%s -> %s ' % (self.src_dir, self.dst_dir), end='') 60 | 61 | if options.remove_dest: 62 | shutil.rmtree(self.dst_dir, ignore_errors=True) 63 | elif os.path.exists(self.dst_dir): 64 | raise Exception('Destination already exists: %s' % self.dst_dir) 65 | 66 | def convert(self): 67 | # find all source files 68 | try: 69 | cwd = os.getcwd() 70 | os.chdir(self.src_dir) 71 | src_lst = flatten([os.path.join(path, name) for name in files] 72 | for path, dirs, files in os.walk('.')) 73 | finally: 74 | os.chdir(cwd) 75 | 76 | parallel_map(self, src_lst) 77 | 78 | tilemap = os.path.join(self.dst_dir, 'tilemap.json') 79 | if os.path.exists(tilemap): 80 | re_sub_file(tilemap, [ 81 | ('"mime":[^,]*"', '"mime": "%s"' % mime_from_ext(self.dst_ext)), 82 | ('"ext":[^,]*"', '"ext": "%s"' % self.dst_ext[1:]), 83 | ]) 84 | pf('') 85 | 86 | def __call__(self, f): 87 | 'process file' 88 | try: 89 | src = os.path.join(self.src_dir, f) 90 | dst = os.path.splitext(os.path.join(self.dst_dir, f))[0] + self.dst_ext 91 | 92 | dpath = os.path.split(dst)[0] 93 | if not os.path.exists(dpath): 94 | os.makedirs(dpath) 95 | 96 | src_ext = os.path.splitext(f)[1].lower() 97 | if src_ext in self.src_formats: 98 | self.convert_tile(src, dst, dpath) 99 | else: 100 | shutil.copy(src, dpath) 101 | 102 | self.counter() 103 | except KeyboardInterrupt: # http://jessenoller.com/2009/01/08/multiprocessingpool-and-keyboardinterrupt/ 104 | pf('got KeyboardInterrupt') 105 | raise KeyboardInterruptError() 106 | 107 | def convert_tile(self, src, dst, dpath): 108 | pass 109 | 110 | tick_rate = 10 111 | tick_count = 0 112 | 113 | def counter(self): 114 | self.tick_count += 1 115 | #~ pf(self.tick_count) 116 | if self.tick_count % self.tick_rate == 0: 117 | pf('.', end='') 118 | return True 119 | else: 120 | return False 121 | 122 | @staticmethod 123 | def get_class(profile, write=False): 124 | for cls in converters: 125 | if profile == cls.profile_name: 126 | return cls 127 | else: 128 | raise Exception('Invalid format: %s' % profile) 129 | 130 | @staticmethod 131 | def list_converters(): 132 | for cls in converters: 133 | print cls.profile_name 134 | 135 | ############################# 136 | 137 | class PngConverter (Converter): 138 | 139 | ############################# 140 | profile_name = 'pngnq' 141 | prog_name = 'pngnq' 142 | dst_ext = '.png' 143 | 144 | def convert_tile(self, src, dst, dpath): 145 | 'optimize png using pngnq utility' 146 | command(['pngnq', '-n', self.options.colors, '-e', self.dst_ext, '-d', dpath, src]) 147 | 148 | converters.append(PngConverter) 149 | 150 | 
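# A minimal usage sketch (hypothetical paths, not from the original source):
# requantize an existing PNG tile pyramid with the pngnq profile defined above,
# keeping 128 colours per tile. Per Converter.__init__, output is written to a
# sibling directory named after the source plus the converter extension
# (here "<tileset-directory>.png"):
#
#   tiles_opt.py -p pngnq -n 128 <tileset-directory>
#
# The pngnq binary is assumed to be on PATH; Converter.__init__ probes for it
# with `which` and raises an exception otherwise.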
############################# 151 | 152 | class WebpConverter (Converter): 153 | 'convert to webp' 154 | ############################# 155 | profile_name = 'webp' 156 | dst_ext = '.webp' 157 | src_formats = ('.png','.jpg','.jpeg','.gif') 158 | 159 | def convert_tile(self, src, dst, dpath): 160 | command(['cwebp', src, '-o', dst, '-q', str(self.options.quality)]) 161 | 162 | converters.append(WebpConverter) 163 | 164 | 165 | #~ ############################# 166 | #~ 167 | #~ class WebpPilConverter (Converter): 168 | #~ 'convert to webp' 169 | #~ ############################# 170 | #~ profile_name = 'webppil' 171 | #~ dst_ext = '.webp' 172 | #~ src_formats = ('.png','.jpg','.jpeg','.gif') 173 | #~ 174 | #~ def convert_tile(self, src, dst, dpath): 175 | #~ img = Image.open(src) 176 | #~ img.save(dst, optimize=True, quality=self.options.quality) 177 | #~ 178 | #~ converters.append(WebpPilConverter) 179 | 180 | 181 | ############################# 182 | 183 | class JpegConverter (Converter): 184 | 'convert to jpeg' 185 | ############################# 186 | profile_name = 'jpeg' 187 | dst_ext = '.jpg' 188 | 189 | def convert_tile(self, src, dst, dpath): 190 | img = Image.open(src) 191 | img.save(dst, optimize=True, quality=self.options.quality) 192 | 193 | converters.append(JpegConverter) 194 | 195 | ############################# 196 | 197 | def main(argv): 198 | 199 | ############################# 200 | 201 | parser = optparse.OptionParser( 202 | usage="usage: %prog [options] arg", 203 | version=version, 204 | ) 205 | parser.add_option('-p', '--profile', default='webp', 206 | help='output tiles profile (default: webp)') 207 | parser.add_option('-l', '--profiles', action='store_true', dest='list_profiles', 208 | help='list available profiles') 209 | parser.add_option("-n", "--colors", dest="colors", default='256', 210 | help='Specifies the number of colors for pngnq profile (default: 256)') 211 | parser.add_option("-q", "--quality", dest="quality", type="int", default=75, 212 | help='JPEG/WEBP quality (default: 75)') 213 | parser.add_option("-r", "--remove-dest", action="store_true", 214 | help='delete destination directory if any') 215 | parser.add_option("--quiet", action="store_true") 216 | parser.add_option("-d", "--debug", action="store_true") 217 | parser.add_option("--nothreads", action="store_true", 218 | help="do not use multiprocessing") 219 | 220 | (options, args) = parser.parse_args(argv[1:]) 221 | 222 | logging.basicConfig(level=logging.DEBUG if options.debug else 223 | (logging.ERROR if options.quiet else logging.INFO)) 224 | log(options.__dict__) 225 | 226 | if options.list_profiles: 227 | Converter.list_profiles() 228 | sys.exit(0) 229 | 230 | if options.nothreads or options.debug: 231 | set_nothreads() 232 | 233 | if not args: 234 | parser.error('No input directory(s) specified') 235 | 236 | for src_dir in args: 237 | Converter.get_class(options.profile)(src_dir, options).convert() 238 | 239 | 240 | if __name__=='__main__': 241 | 242 | main(sys.argv) 243 | -------------------------------------------------------------------------------- /tilers_tools/converter_sasplanet.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2010, 2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation 
files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 24 | ############################################################################### 25 | 26 | import binascii 27 | import time 28 | import os.path 29 | 30 | from converter_backend import * 31 | 32 | ############################# 33 | 34 | class SASPlanet(TileDir): # http://www.sasgis.org/forum/viewtopic.php?f=2&t=24 35 | 'SASPlanet cache' 36 | ############################# 37 | format, ext, input, output = 'sasplanet', '.sasplanet', True, True 38 | dir_pattern = 'z[0-9]*/*/x[0-9]*/*/y[0-9]*.*' 39 | 40 | def path2coord(self, tile_path): 41 | z, dx, x, dy, y = path2list(tile_path)[-6:-1] 42 | z, x, y = map(int, (z[1:], x[1:], y[1:])) 43 | return (z-1, x, y) 44 | 45 | def coord2path(self, z, x, y): 46 | return 'z%d/%d/x%d/%d/y%d' % (z+1, x//1024, x, y//1024, y) 47 | 48 | tileset_profiles.append(SASPlanet) 49 | 50 | ############################# 51 | 52 | class SASBerkeley(TileDir): 53 | 'SASPlanet Berkeley DB' 54 | ############################# 55 | format, ext, input, output = 'sdb', '.sdb', True, False 56 | dir_pattern = 'z[0-9]*/[0-9]*/[0-9]*/*.sdb' 57 | 58 | def __init__(self, root, options=None): 59 | super(SASBerkeley, self).__init__(root, options) 60 | 61 | from bsddb3 import db 62 | self.db = db 63 | 64 | # 0 4b 4s // magic 65 | # 4 4b I, // crc32 66 | # 8 4b I, // tile size 67 | # 12 8b d, // tile date 68 | # 20 2c string // tile version 69 | # 2c string // tile content-type 70 | # BLOB // tile data 71 | self.header = struct.Struct('<3sBIId') 72 | self.key = struct.Struct('>Q') # 64 bit, swap bytes 73 | 74 | def __iter__(self): 75 | log('__iter__', os.path.join(self.root, self.dir_pattern), glob.iglob(os.path.join(self.root, self.dir_pattern))) 76 | for db_file in glob.iglob(os.path.join(self.root, self.dir_pattern)): 77 | log('db_file', db_file) 78 | for coord, tile, path in self.iter_tiles(db_file): 79 | #~ log('db tile', coord, tile[:20], path) 80 | yield PixBufTile(coord, tile, path) 81 | 82 | def iter_tiles(self, db_path): 83 | zoom = self.get_zoom(db_path) # also checks data in range 84 | if not zoom: 85 | return 86 | d = self.db.DB() 87 | d.open(db_path, '', self.db.DB_BTREE, self.db.DB_RDONLY) 88 | c = d.cursor() 89 | item = c.first(dlen=0, doff=0) 90 | while item: 91 | key = item[0] 92 | coord = self.get_coord(zoom, key) 93 | if self.in_range(coord): 94 | data = c.current()[1] 95 | tile = self.get_image(data) 96 | if tile: 97 | log('tile', coord) 98 | yield coord, tile, [db_path, key] 99 | item = c.next(dlen=0, doff=0) 100 | d.close() 101 | 102 | def get_zoom(self, db_path): # 
u_TileFileNameBerkeleyDB 103 | z, x10, y10, xy8 = path2list(db_path)[-5:-1] 104 | zoom = int(z[1:]) - 1 105 | x_min, y_min = [int(d) << 8 for d in xy8.split('.')] 106 | x_max, y_max = [d | 0xFF for d in x_min, y_min] 107 | 108 | if not self.in_range((zoom, x_min, y_min), (zoom, x_max, y_max)): 109 | return None 110 | pass 111 | log('get_zoom', zoom, x_min, x_max, y_min, y_max,db_path) 112 | return zoom 113 | 114 | def get_coord(self, zoom, key): # u_BerkeleyDBKey.pas TBerkeleyDBKey.PointToKey 115 | if key == '\xff\xff\xff\xff\xff\xff\xff\xff': 116 | return None 117 | kxy = self.key.unpack(key)[0] # swaps bytes 118 | xy = [0, 0] 119 | for bit_n in range(64): # bits for x and y are interleaved in the key 120 | x0y1 = bit_n % 2 # x, y 121 | xy[x0y1] += (kxy >> bit_n & 1) << (bit_n - x0y1) / 2 122 | 123 | coord = [zoom] + xy 124 | #~ log('get_coord', coord, zoom, key, hex(kxy), hex(xy[0]), hex(xy[1])) 125 | return coord 126 | 127 | def get_image(self, data): # u_BerkeleyDBValue 128 | 129 | magic, magic_v, crc32, tile_size, tile_date = self.header.unpack_from(data) 130 | if magic != 'TLD' or magic_v != 3: 131 | log('get_image', 'wrong magic', magic, magic_v) 132 | return None 133 | 134 | strings = [] 135 | i = start = self.header.size 136 | while True: 137 | if data[i] == '\x00': 138 | strings.append(data[start: i: 2]) 139 | start = i + 2 140 | if len(strings) == 2: 141 | break 142 | i += 2 143 | 144 | tile_version, content_type = strings 145 | tile_data = data[start: start + tile_size] 146 | 147 | #~ log('get_image', self.header.size, magic, magic_v, tile_version, content_type, tile_size, tile_data[:20])#, data[4:60:2]) 148 | return tile_data 149 | 150 | tileset_profiles.append(SASBerkeley) 151 | 152 | # SASBerkeley 153 | 154 | ############################# 155 | 156 | class SASSQLite(TileDir): 157 | 'SASPlanet SQLite DB' 158 | ############################# 159 | format, ext, input, output = 'slite', '.sqlitedb', True, True 160 | dir_pattern = 'z[0-9]*/[0-9]*/[0-9]*/*.sqlitedb' 161 | 162 | # from u_TileStorageSQLiteHolder.pas 163 | create_sql = [ 164 | 'PRAGMA synchronous = OFF;', 165 | 'PRAGMA page_size = 16384;', 166 | 'CREATE TABLE IF NOT EXISTS t ('+ 167 | 'x INTEGER NOT NULL,'+ 168 | 'y INTEGER NOT NULL,'+ 169 | 'v INTEGER DEFAULT 0 NOT NULL,'+ # version 170 | 'c TEXT,'+ # content_type 171 | 's INTEGER DEFAULT 0 NOT NULL,'+ # size 172 | 'h INTEGER DEFAULT 0 NOT NULL,'+ # crc32 173 | 'd INTEGER NOT NULL,'+ # date as unix seconds DEFAULT (strftime(''%s'',''now''))) 174 | 'b BLOB,'+ # body 175 | 'constraint PK_TB primary key (x,y,v));', 176 | 'CREATE INDEX IF NOT EXISTS t_v_idx on t (v);' 177 | ] 178 | 179 | db_path = None 180 | db = None 181 | 182 | def __init__(self, *args, **kw_args): 183 | super(SASSQLite, self).__init__(*args, **kw_args) 184 | 185 | import sqlite3 186 | self.sqlite3 = sqlite3 187 | 188 | def __iter__(self): 189 | log('__iter__', os.path.join(self.root, self.dir_pattern), glob.iglob(os.path.join(self.root, self.dir_pattern))) 190 | for db_file in glob.iglob(os.path.join(self.root, self.dir_pattern)): 191 | log('db_file', db_file) 192 | for tile in self.iter_db_file(db_file): 193 | yield tile 194 | 195 | def iter_db_file(self, db_path): 196 | zoom = self.get_zoom(db_path) # also checks data in range 197 | if not zoom: 198 | return 199 | 200 | db = self.sqlite3.connect(db_path) 201 | dbc = db.cursor() 202 | 203 | dbc.execute('SELECT x, y, MAX(v), b FROM t GROUP BY x,y;') 204 | for x, y, version, data in dbc: 205 | if data: 206 | coord = [zoom, x, y] 207 | #~ log('db tile', 
coord, tile[:20], path) 208 | #~ log('tile', coord, data) 209 | yield PixBufTile(coord, data, key=(db_path, coord)) 210 | 211 | db.close() 212 | 213 | def get_zoom(self, db_path): # u_TileFileNameSQLite.pas 214 | z, x10, y10, xy8 = path2list(db_path)[-5:-1] 215 | zoom = int(z[1:]) - 1 216 | x_min, y_min = [int(d) << 8 for d in xy8.split('.')] 217 | x_max, y_max = [d | 0xFF for d in x_min, y_min] 218 | 219 | if not self.in_range((zoom, x_min, y_min), (zoom, x_max, y_max)): 220 | return None 221 | pass 222 | log('get_zoom', zoom, x_min, x_max, y_min, y_max,db_path) 223 | return zoom 224 | 225 | def store_tile(self, tile): 226 | z, x, y = tile.coord() 227 | data = buffer(tile.data()) 228 | log('%s -> %s:%d, %d, %d' % (tile.path, self.name, z, x, y)) 229 | 230 | try: 231 | data_type = tile.get_mime() 232 | except KeyError: 233 | return 234 | 235 | self.set_db(tile) 236 | 237 | timestamp = int(time.time()) 238 | self.dbc.execute( 239 | 'INSERT OR REPLACE INTO t ' 240 | '(x, y, s, h, d, b) ' 241 | 'VALUES (?, ?, ?, ?, ?, ?);', 242 | (x, y, len(data), binascii.crc32(data) % (1 << 32), timestamp, data) 243 | ) 244 | 245 | def set_db(self, tile): 246 | z, x, y = tile.coord() 247 | db_dir = os.path.join(self.root, 'z' + str(z + 1), str(x >> 10), str(y >> 10)) 248 | db_name = '%d.%d%s' % (x >> 8, y >> 8, self.ext) 249 | 250 | db_path = os.path.join(db_dir, db_name) 251 | log(db_path) 252 | if self.db_path == db_path: 253 | return 254 | 255 | if self.db: 256 | self.db.commit() 257 | self.db.close() 258 | 259 | if not os.path.exists(db_dir): 260 | os.makedirs(db_dir) 261 | 262 | self.db_path = db_path 263 | self.db = self.sqlite3.connect(db_path) 264 | self.dbc = self.db.cursor() 265 | 266 | for stmt in self.create_sql: 267 | self.dbc.execute(stmt) 268 | self.db.commit() 269 | 270 | def finalize_tileset(self): 271 | if self.db: 272 | self.db.commit() 273 | self.db.close() 274 | self.db = None 275 | 276 | tileset_profiles.append(SASSQLite) 277 | 278 | # SASSQLite 279 | -------------------------------------------------------------------------------- /tilers_tools/viewer-google.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 26 | 27 | 28 | 29 | tiles-gmaps 30 | 31 | 32 | 41 | 42 | 43 | 44 | 45 | 46 | 267 | 268 | 269 | 270 |
271 | 272 | 273 | -------------------------------------------------------------------------------- /tilers_tools/data_ozi.csv: -------------------------------------------------------------------------------- 1 | #header,Name,"PROJ4 definition","Ozi ID",Ellipsoid,"Delta X","Delta Y","Delta Z",comment 2 | datum,"Auto Shift .....",,,"WGS 84",0,0,0, 3 | datum,Adindan,,1,"Clarke 1880",-162,-12,206, 4 | datum,Afgooye,,2,"Krassovsky 1940",-43,-163,45, 5 | datum,"Ain el Abd 1970",,3,"International 1924",-150,-251,-2, 6 | datum,"Anna 1 Astro 1965",,4,"Australian National",-491,-22,435, 7 | datum,"Arc 1950",,5,"Clarke 1880",-143,-90,-294, 8 | datum,"Arc 1960",,6,"Clarke 1880",-160,-8,-300, 9 | datum,"Ascension Island 1958",,7,"International 1924",-207,107,52, 10 | datum,"Astro B4 Sorol Atoll",,9,"International 1924",114,-116,-333, 11 | datum,"Astro Beacon 1945",,,"International 1924",145,75,-272, 12 | datum,"Astro DOS 71I4",,10,"International 1924",-320,550,-494, 13 | datum,"Astronomic Stn 1952",,11,"International 1924",124,-234,-25, 14 | datum,"Australian Geodetic 1966",,12,"Australian National",-133,-48,148, 15 | datum,"Australian Geodetic 1984",,13,"Australian National",-134,-48,149, 16 | datum,"Australian Geocentric 1994 (GDA94)",,201,"GRS 80",0,0,0, 17 | datum,Austrian,,202,"Bessel 1841",594,84,471, 18 | datum,"Bellevue (IGN)",,14,"International 1924",-127,-769,472, 19 | datum,"Bermuda 1957",,15,"Clarke 1866",-73,213,296, 20 | datum,"Bogota Observatory",,16,"International 1924",307,304,-318, 21 | datum,"Campo Inchauspe",,17,"International 1924",-148,136,90, 22 | datum,"Canton Astro 1966",,18,"International 1924",298,-304,-375, 23 | datum,Cape,,19,"Clarke 1880",-136,-108,-292, 24 | datum,"Cape Canaveral",,20,"Clarke 1866",-2,150,181, 25 | datum,Carthage,+datum=carthage,21,"Clarke 1880",-263,6,431, 26 | datum,CH-1903,,203,"Bessel 1841",674,15,405, 27 | datum,"Chatham 1971",,22,"International 1924",175,-38,113, 28 | datum,"Chua Astro",,23,"International 1924",-134,229,-29, 29 | datum,"Corrego Alegre",,24,"International 1924",-206,172,-6, 30 | datum,"Djakarta (Batavia)",,,"Bessel 1841",-377,681,-50, 31 | datum,"DOS 1968",,26,"International 1924",230,-199,-752, 32 | datum,"Easter Island 1967",,27,"International 1924",211,147,111, 33 | datum,Egypt,,204,"International 1924",-130,-117,-151, 34 | datum,"European 1950",,28,"International 1924",-87,-98,-121, 35 | datum,"European 1950 (Mean France)",,205,"Hayford 1909",-87,-96,-120, 36 | datum,"European 1950 (Spain and Portugal)",,206,"International 1924",-84,-107,-120, 37 | datum,"European 1979",,29,"International 1924",-86,-98,-119, 38 | datum,"Finland Hayford",,207,"International 1924",-78,-231,-97, 39 | datum,"Gandajika Base",,30,"International 1924",-133,-321,50, 40 | datum,"Geodetic Datum 1949",,31,"International 1924",84,-22,209, 41 | datum,"Guam 1963",,34,"Clarke 1866",-100,-248,259, 42 | datum,"GUX 1 Astro",,35,"International 1924",252,-209,-751, 43 | datum,Hartebeeshoek94,,104,"WGS 84",0,0,0, 44 | datum,Hermannskogel,+datum=hermannskogel,208,"Bessel 1841",653,-212,449, 45 | datum,"Hjorsey 1955",,37,"International 1924",-73,46,-86, 46 | datum,"Hong Kong 1963",,38,"International 1924",-156,-271,-189, 47 | datum,Hu-Tzu-Shan,,39,"International 1924",-634,-549,-201, 48 | datum,"Indian Bangladesh",,41,"Everest (India 1830)",289,734,257, 49 | datum,"Indian Thailand",,40,"Everest (India 1830)",214,836,303, 50 | datum,Israeli,,209,"Clarke 1880 Palestine",-235,-85,264, 51 | datum,"Ireland 1965",+datum=ire65,42,"Modified Airy",506,-122,611, 52 | datum,"ISTS 
073 Astro 1969",,43,"International 1924",208,-435,-229, 53 | datum,"Johnston Island",,44,"International 1924",191,-77,-204, 54 | datum,Kandawala,,45,"Everest (India 1830)",-97,787,86, 55 | datum,"Kerguelen Island",,46,"International 1924",145,-187,103, 56 | datum,"Kertau 1948",,47,"Everest (1948)",-11,851,5, 57 | datum,"L.C. 5 Astro",,48,"Clarke 1866",42,124,147, 58 | datum,"Liberia 1964",,49,"Clarke 1880",-90,40,88, 59 | datum,"Luzon Mindanao",,51,"Clarke 1866",-133,-79,-72, 60 | datum,"Luzon Philippines",,50,"Clarke 1866",-133,-77,-51, 61 | datum,"Mahe 1971",,52,"Clarke 1880",41,-220,-134, 62 | datum,"Marco Astro",,53,"International 1924",-289,-124,60, 63 | datum,Massawa,,54,"Bessel 1841",639,405,60, 64 | datum,Merchich,,55,"Clarke 1880",31,146,47, 65 | datum,"Midway Astro 1961",,56,"International 1924",912,-58,1227, 66 | datum,Minna,,57,"Clarke 1880",-92,-93,122, 67 | datum,"NAD27 Alaska",,63,"Clarke 1866",-5,135,172, 68 | datum,"NAD27 Bahamas",,64,"Clarke 1866",-4,154,178, 69 | datum,"NAD27 Canada",,66,"Clarke 1866",-10,158,187, 70 | datum,"NAD27 Canal Zone",,67,"Clarke 1866",0,125,201, 71 | datum,"NAD27 Caribbean",,68,"Clarke 1866",-7,152,178, 72 | datum,"NAD27 Central",,69,"Clarke 1866",0,125,194, 73 | datum,"NAD27 CONUS",,62,"Clarke 1866",-8,160,176, 74 | datum,"NAD27 Cuba",,70,"Clarke 1866",-9,152,178, 75 | datum,"NAD27 Greenland",,71,"Clarke 1866",11,114,195, 76 | datum,"NAD27 Mexico",,72,"Clarke 1866",-12,130,190, 77 | datum,"NAD27 San Salvador",,,"Clarke 1866",1,140,165, 78 | datum,NAD83,+datum=NAD83,74,"GRS 80",0,0,0, 79 | datum,"Nahrwn Masirah Ilnd",,,"Clarke 1880",-247,-148,369, 80 | datum,"Nahrwn Saudi Arbia",,,"Clarke 1880",-231,-196,482, 81 | datum,"Nahrwn United Arab",,,"Clarke 1880",-249,-156,381, 82 | datum,"Naparima BWI",,,"International 1924",-2,374,172, 83 | datum,NGO1948,,210,"Bessel 1841 (Norway)",315,-217,528, 84 | datum,"NTF France",,107,"Clarke 1880 IGN",-168,-60,320, 85 | datum,Norsk,,211,"Bessel 1841 (Norway)",278,93,474, 86 | datum,NZGD1949,+datum=nzgd49,31,"International 1924",84,-22,209, 87 | datum,NZGD2000,,104,"WGS 84",0,0,0, 88 | datum,"Observatorio 1966",,75,"International 1924",-425,-169,81, 89 | datum,"Old Egyptian",,76,"Helmert 1906",-130,110,-13, 90 | datum,"Old Hawaiian",,77,"Clarke 1866",61,-285,-181, 91 | datum,Oman,,78,"Clarke 1880",-346,-1,224, 92 | datum,"Ord Srvy Grt Britn",+datum=OSGB36,79,"Airy 1830",375,-111,431, 93 | datum,"Pico De Las Nieves",,80,"International 1924",-307,-92,127, 94 | datum,"Pitcairn Astro 1967",,81,"International 1924",185,165,42, 95 | datum,"Potsdam Rauenberg DHDN",+datum=potsdam,1000,"Bessel 1841",606,23,413, 96 | datum,"Prov So Amrican 1956",,36,"International 1924",-288,175,-376, 97 | datum,"Prov So Chilean 1963",,82,"International 1924",16,196,93, 98 | datum,"Puerto Rico",,83,"Clarke 1866",11,72,-101, 99 | #datum,"Pulkovo 1942 (1)",,1001,"Krassovsky 1940",28,-130,-95, 100 | datum,"Pulkovo 1942 (1)",,1001,"Krassovsky 1940",25,-141,-79, 101 | #datum,"Pulkovo 1942 (2)",,,"Krassovsky 1940",28,-130,-95, 102 | datum,"Pulkovo 1942 (1)","+towgs84=23.92,-141.27,-80.9,0,-0.35,-0.82,-0.12 +ellps=krass",,"Krassovsky 1940",24.82,-131.21,-82.66, 103 | datum,"Pulkovo 1942 (2)","+towgs84=23.92,-141.27,-80.9,0,-0.35,-0.82,-0.12 +ellps=krass",,"Krassovsky 1940",24.82,-131.21,-82.66, 104 | datum,"Pulkovo 1942","+towgs84=23.92,-141.27,-80.9,0,-0.35,-0.82,-0.12 +ellps=krass",,"Krassovsky 1940",24.82,-131.21,-82.66, 105 | datum,"Qatar National",,84,"International 1924",-128,-283,22, 106 | datum,Qornoq,,85,"International 
1924",164,138,-189, 107 | datum,Reunion,,86,"International 1924",94,-948,-1262, 108 | datum,Rijksdriehoeksmeting,,212,"Bessel 1841",593,26,478, 109 | datum,"Rome 1940",,87,"International 1924",-225,-65,9, 110 | datum,"RT 90","+towgs84=414.0978567149,41.3381489658,603.0627177516,-0.8550434314,2.1413465185,-7.0227209516,0 +ellps=bessel",112,"Bessel 1841",498,-36,568, 111 | datum,S42,,213,"Krassovsky 1940",28,-121,-77, 112 | datum,"Santo (DOS)",,88,"International 1924",170,42,84, 113 | datum,"Sao Braz",,89,"International 1924",-203,141,53, 114 | datum,"Sapper Hill 1943",,90,"International 1924",-355,16,74, 115 | datum,Schwarzeck,,91,"Bessel 1841 (Namibia)",616,97,-251, 116 | datum,"South American 1969",,92,"South American 1969",-57,1,-41, 117 | datum,"South Asia",,93,"Modified Fischer 1960",7,-10,-26, 118 | datum,"Southeast Base",,94,"International 1924",-499,-249,314, 119 | datum,"Southwest Base",,95,"International 1924",-104,167,-38, 120 | datum,"Timbalai 1948",,96,"Everest (India 1830)",-689,691,-46, 121 | datum,Tokyo,,97,"Bessel 1841",-128,481,664, 122 | datum,"Tristan Astro 1968",,98,"International 1924",-632,438,-609, 123 | datum,"Viti Levu 1916",,99,"Clarke 1880",51,391,-36, 124 | datum,"Wake-Eniwetok 1960",,100,"Hough 1960",101,52,-39, 125 | datum,"WGS 72",,103,"WGS 72",0,0,5, 126 | datum,"WGS 84",+datum=WGS84,104,"WGS 84",0,0,0, 127 | datum,Yacare,,105,"International 1924",-155,171,37, 128 | datum,Zanderij,,106,"International 1924",-265,120,-358, 129 | 130 | #header,Ellipsoid,"PROJ4 ellps","Radius (a)","1/f (rf)",comment,,, 131 | ellps,"Airy 1830",+ellps=airy,6377563.396,299.3249646,,,, 132 | ellps,"Australian National",+ellps=aust_SA,6378160,298.25,,,, 133 | ellps,"Bessel 1841",+ellps=bessel,6377397.155,299.1528128,,,, 134 | ellps,"Bessel 1841 (Namibia)",+ellps=bess_nam,6377483.865,299.1528128,,,, 135 | ellps,"Bessel 1841 (Norway)",,6377492.0176,299.1528,,,, 136 | ellps,"Clarke 1866",+ellps=clrk66,6378206.4,294.9786982,,,, 137 | ellps,"Clarke 1880",+ellps=clrk80,6378249.145,293.465,,,, 138 | ellps,"Clarke 1880 IGN",,6378249.2,293.466021,,,, 139 | ellps,"Clarke 1880 Palestine",,6378300.789,293.466,,,, 140 | ellps,"Everest (1948)",+ellps=evrst48,6377304.063,300.8017,,,, 141 | ellps,"Everest (India 1830)",+ellps=evrst30,6377276.345,300.8017,,,, 142 | ellps,"GRS 80",+ellps=GRS80,6378137,298.2572221,,,, 143 | ellps,"Hayford 1909",,6378388,296.959263,,,, 144 | ellps,"Helmert 1906",+ellps=helmert,6378200,298.3,,,, 145 | ellps,"Hough 1960",+ellps=hough,6378270,297,,,, 146 | ellps,"International 1924",+ellps=intl,6378388,297,,,, 147 | ellps,"Krassovsky 1940",+ellps=krass,6378245,298.3,,,, 148 | ellps,"Modified Airy",+ellps=mod_airy,6377340.189,299.3249646,,,, 149 | ellps,"Modified Fischer 1960",+ellps=fschr60m,6378155,298.3,,,, 150 | ellps,"South American 1969",+ellps=aust_SA,6378160,298.25,,,, 151 | ellps,"WGS 72",+ellps=WGS72,6378135,298.26,,,, 152 | ellps,"WGS 84",+ellps=WGS84,6378137,298.25722356,,,, 153 | 154 | #header,Projection,"PROJ4 definition",comment,,,,, 155 | proj,Latitude/Longitude,+proj=latlong,,,,,, 156 | proj,Mercator,+proj=merc,,,,,, 157 | proj,"Transverse Mercator",+proj=tmerc,,,,,, 158 | proj,"Lambert Conformal Conic",+proj=lcc,,,,,, 159 | proj,"(A)Lambert Azimuthual Equal Area",+proj=laea,"# not tested",,,,, 160 | proj,"(EQC) Equidistant Conic",+proj=eqdc,,,,,, 161 | proj,Sinusoidal,+proj=sinu,"# not tested",,,,, 162 | proj,"Polyconic (American)",+proj=poly,,,,,, 163 | proj,"Albers Equal Area",+proj=aea,"# not tested",,,,, 164 | proj,"Van Der Grinten",+proj=vandg,"# not 
tested",,,,, 165 | proj,"Vertical Near-Sided Perspective",+proj=nsper,"# not tested",,,,, 166 | proj,"(WIV) Wagner IV",+proj=wag4,"# not tested",,,,, 167 | proj,Bonne,+proj=bonne,"# not tested",,,,, 168 | proj,"(MT0) Montana State Plane Zone 2500",+init=esri:102300,"# not tested",,,,, 169 | proj,"(VICGRID) Victoria Australia",+init=epsg:3110,"# not tested",,,,, 170 | proj,"(VG94) VICGRID94 Victoria Australia",+init=epsg:3111,"# not tested",,,,, 171 | proj,"(UTM) Universal Transverse Mercator","+proj=tmerc +k=0.9996",,,,,, 172 | proj,"(BNG) British National Grid",+init=epsg:27700,,,,,, 173 | proj,"(IG) Irish Grid",+init=epsg:29902,"# not tested",,,,, 174 | #proj,"(NZG) New Zealand Grid",+init=epsg:27200,"# not tested",,,,, 175 | # http://osgeo-org.1803224.n2.nabble.com/New-Zealand-grid-tp2062137p2062142.html",,,,,,,, 176 | proj,"(NZG) New Zealand Grid","+proj=nzmg +lat_0=-41.0000000000 +lon_0=173.0000000000 +x_0=2510000.0000000000 +y_0=6023150.0000000000 +a=6378388 +rf=297 +nadgrids=nzgd2kgrid0005.gsb +wktext +no_defs","# not tested",,,,, 177 | proj,"(NZTM2) New Zealand TM 2000",+init=epsg:2193,"# not tested",,,,, 178 | # http://en.wikipedia.org/wiki/Swedish_grid",,,,,,,, 179 | proj,"(SG) Swedish Grid","+proj=tmerc +lon_0=15.808277777778 +x_0=1500000 +y_0=0",,,,,, 180 | proj,"(SUI) Swiss Grid",+init=epsg:21781,"# not tested",,,,, 181 | proj,"(I) France Zone I",+init=epsg:27571,"# not tested",,,,, 182 | proj,"(II) France Zone II",+init=epsg:27572,"# not tested",,,,, 183 | proj,"(III) France Zone III",+init=epsg:27573,"# not tested",,,,, 184 | proj,"(IV) France Zone IV",+init=epsg:27574,"# not tested",,,,, 185 | proj,"(ITA1) Italy Grid Zone 1","+init=epsg:26591 +towgs84=-85.88,-28.85,+49.67,-1.003,-2.383,-1.808,-27.82","# http://mpa.itc.it/markus/shortcourse/notes2.html # not tested",,,,, 186 | proj,"(ITA2) Italy Grid Zone 2","+init=epsg:26592 +towgs84=-85.88,-28.85,+49.67,-1.003,-2.383,-1.808,-27.82","# not tested",,,,, 187 | proj,"(VICMAP-TM) Victoria Aust.(pseudo AMG)","+proj=tmerc +lat_0=145 +x_0=500000 +y_0=10000000","# http://www.gpsoz.com.au/VicRoadsInfo.htm # not tested",,,,, 188 | -------------------------------------------------------------------------------- /tilers_tools/tiles_merge.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2010-2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 24 | #****************************************************************************** 25 | 26 | import sys 27 | import os 28 | import os.path 29 | import glob 30 | import shutil 31 | import logging 32 | import optparse 33 | from PIL import Image 34 | import pickle 35 | 36 | from tiler_functions import * 37 | 38 | class KeyboardInterruptError(Exception): 39 | pass 40 | 41 | def f_approx_eq(a, b, eps): 42 | return (abs(a - b) / (abs(a) + abs(b))/2) < eps 43 | 44 | def transparency(img): 45 | 'estimate transparency of an image' 46 | (r, g, b, a) = img.split() 47 | (a_min, a_max) = a.getextrema() # get min/max values for alpha channel 48 | return 1 if a_min == 255 else 0 if a_max == 0 else -1 49 | 50 | 51 | class MergeSet: 52 | def __init__(self, src_dir, dst_dir): 53 | 54 | if options.strip_src_ext: 55 | src_dir = os.path.splitext(src)[0] 56 | if options.add_src_ext is not None: 57 | src_dir += options.add_src_ext 58 | pf(src_dir+' ', end='') 59 | 60 | self.src_dir = src_dir 61 | self.dst_dir = dst_dir 62 | 63 | copy_viewer(self.dst_dir) 64 | # copy tilemap 65 | src_f = os.path.join(src_dir, 'tilemap.json') 66 | dst_f = os.path.join(dst_dir, 'tilemap.json') 67 | if os.path.exists(src_f) and not os.path.exists(dst_f): 68 | shutil.copy(src_f, dst_f) 69 | 70 | # read metadata 71 | self.src = read_tilemap(src_dir) 72 | self.dst = read_tilemap(dst_dir) 73 | self.tile_size = self.src['tiles']['size'] 74 | 75 | # get a list of source tiles 76 | try: 77 | cwd = os.getcwd() 78 | os.chdir(src_dir) 79 | self.sources = dict.fromkeys( 80 | glob.iglob('z[0-9]*/*/*.%s' % self.src['tiles']['ext']), 81 | None 82 | ) 83 | finally: 84 | os.chdir(cwd) 85 | #ld(self.sources) 86 | 87 | # load cached tile transparency data if any 88 | self.sources.update(read_transparency(src_dir)) 89 | #ld(repr(self.src_transp)) 90 | 91 | # define crop map for underlay function 92 | szx, szy = self.tile_size 93 | if self.src['tiles']['inversion'][1]: # google 94 | self.underlay_offsets = [ 95 | # left top 96 | ( 0, 0), (szx, 0), 97 | ( 0, szy), (szx, szy), 98 | ] 99 | else: # TMS 100 | self.underlay_offsets = [ 101 | # left top 102 | ( 0, szy), (szx, szy), 103 | ( 0, 0), (szx, 0), 104 | ] 105 | 106 | def merge_metadata(self): 107 | 'adjust destination metadata' 108 | 109 | src = self.src 110 | dst = self.dst 111 | 112 | dst["properties"]["title"] = os.path.split(dst_dir)[1] 113 | dst["properties"]["description"] = 'merged tileset' 114 | 115 | ld([round(i/1000) for i in src["bbox"]], [round(i/1000) for i in dst["bbox"]]) 116 | for i, min_max in zip(range(4), (min, min, max, max)): 117 | dst["bbox"][i] = min_max(src["bbox"][i], dst["bbox"][i]) 118 | 119 | dst["tilesets"].update(src["tilesets"]) 120 | 121 | write_tilemap(self.dst_dir, dst) 122 | 123 | def underlay(self, tile, upper_path, upper_raster, upper_origin=(0, 0), level=1): 124 | if level > options.underlay: 125 | return 126 | 127 | (s, ext) = os.path.splitext(tile) 128 | (s, x) = os.path.split(s) 129 | (z, y) = os.path.split(s) 130 | (z, y, x) = map(int, (z[1:], y, x)) 131 | dz, dx, dy = z+1, x*2, y*2 132 | dst_tiles = [(dx, dy), (dx+1, dy), 133 | (dx, dy+1), (dx+1, dy+1)] 134 | 135 | for dst_xy, crop_offset in zip(dst_tiles, self.underlay_offsets): 136 | dst_tile = 'z%i/%i/%i%s' % (dz, 
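The transparency() helper defined above classifies a tile by its alpha extrema: 1 for fully opaque, 0 for fully transparent, -1 for partial coverage, and merge_tile() further down keys its behaviour off that value (link/copy, delete, or alpha-composite). A small usage sketch, with a made-up tile path and transparency() taken from just above:

    from PIL import Image

    img = Image.open('z7/42/57.png').convert('RGBA')   # hypothetical source tile
    state = transparency(img)
    if state == 1:
        pass   # fully opaque: just link or copy over the destination tile
    elif state == 0:
        pass   # fully transparent: the merger deletes such source tiles outright
    else:
        pass   # partially transparent: composite onto the existing destination tile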
dst_xy[1], dst_xy[0], ext) 137 | dst_path = os.path.join(self.dst_dir, dst_tile) 138 | if dst_tile in self.sources: 139 | continue # higher level zoom source exists 140 | 141 | l2 = 2**level 142 | crop_origin = [p1 + p2/l2 for p1, p2 in zip(upper_origin, crop_offset)] 143 | 144 | if os.path.exists(dst_path): 145 | dst_raster = Image.open(dst_path).convert("RGBA") 146 | if transparency(dst_raster) == 1: # lower tile is fully opaque 147 | continue 148 | if not upper_raster: # check if opening was deferred 149 | upper_raster = Image.open(upper_path).convert("RGBA") 150 | 151 | szx, szy = self.tile_size 152 | crop_area = ( 153 | crop_origin[0], 154 | crop_origin[1], 155 | crop_origin[0] + szx/l2, 156 | crop_origin[1] + szy/l2 157 | ) 158 | ld('crop_area', level, crop_area) 159 | 160 | out_raster = upper_raster.crop(crop_area).resize(self.tile_size, Image.BICUBIC) 161 | out_raster = Image.composite(dst_raster, out_raster, dst_raster) 162 | del dst_raster 163 | out_raster.save(dst_path) 164 | 165 | pf('#', end='') 166 | 167 | self.underlay(dst_tile, upper_path, upper_raster, crop_origin, level+1) 168 | 169 | def __call__(self, tile): 170 | '''called by map() to merge a source tile into the destination tile set''' 171 | return self.merge_tile(tile) 172 | 173 | def merge_tile(self, tile): 174 | try: 175 | #~ ld(self.src_dir, self.dst_dir, tile) 176 | src_file = os.path.join(self.src_dir, tile) 177 | if not os.path.exists(src_file): 178 | return None, None 179 | 180 | src_raster = None 181 | transp = self.sources[tile] 182 | if transp is None: # transparency value not cached yet 183 | #~ pf('!', end='') 184 | src_raster = Image.open(src_file).convert("RGBA") 185 | transp = transparency(src_raster) 186 | if transp == 0 : # fully transparent 187 | #~ pf('-', end='') 188 | os.remove(src_file) 189 | return None, None 190 | 191 | dst_file = os.path.join(self.dst_dir, tile) 192 | dpath = os.path.dirname(dst_file) 193 | if not os.path.exists(dpath): 194 | try: # thread race safety 195 | os.makedirs(dpath) 196 | except os.error: 197 | pass 198 | if transp == 1 or not os.path.exists(dst_file): 199 | # fully opaque or no destination tile exists yet 200 | #~ pf('>', end='') 201 | link_or_copy(src_file, dst_file) 202 | else: # partially transparent, combine with destination (exists! see previous check) 203 | pf('+', end='') 204 | if not src_raster: 205 | src_raster = Image.open(src_file).convert("RGBA") 206 | try: 207 | dst_raster = Image.composite(src_raster, Image.open(dst_file).convert("RGBA"), src_raster) 208 | except IOError, exception: 209 | error('merge_tile', exception.message, dst_file) 210 | 211 | dst_raster.save(dst_file) 212 | 213 | if options.underlay and transp != 0: 214 | self.underlay(tile, src_file, src_raster) 215 | 216 | except KeyboardInterrupt: # http://jessenoller.com/2009/01/08/multiprocessingpool-and-keyboardinterrupt/ 217 | print 'got KeyboardInterrupt' 218 | raise KeyboardInterruptError() 219 | return (tile, transp) # send back transparency values for caching 220 | 221 | def merge_dirs(self): 222 | 223 | transparency = parallel_map(self, self.sources.keys()) 224 | self.sources = None 225 | self.sources = dict(transparency) 226 | if None in self.sources: 227 | del self.sources[None] 228 | 229 | self.merge_metadata() 230 | 231 | # save transparency data 232 | write_transparency(self.src_dir, self.sources) 233 | pf('') 234 | 235 | # MergeSet end 236 | 237 | if __name__ == '__main__': 238 | parser = optparse.OptionParser( 239 | usage="usage: %prog [--cut] [--dest-dir=DST_DIR] ... 
", 240 | version=version, 241 | description="") 242 | parser.add_option("-r", "--remove-dest", action="store_true", 243 | help='delete destination directory before merging') 244 | parser.add_option("-l", "--src-list", default=None, 245 | help='read a list of source directories from a file; if no destination is provided then name destination after the list file without a suffix') 246 | parser.add_option("-s", "--strip-src-ext", action="store_true", 247 | help='strip extension suffix from a source parameter') 248 | parser.add_option("-x", "--add-src-ext", default=None, 249 | help='add extension suffix to a source parameter') 250 | parser.add_option('-u', "--underlay", type='int', default=0, 251 | help="underlay partially filled tiles with a zoomed-in raster from a higher level") 252 | parser.add_option("-q", "--quiet", action="store_true") 253 | parser.add_option("-d", "--debug", action="store_true") 254 | parser.add_option("--nothreads", action="store_true", 255 | help="do not use multiprocessing") 256 | 257 | (options, args) = parser.parse_args() 258 | logging.basicConfig(level=logging.DEBUG if options.debug else 259 | (logging.ERROR if options.quiet else logging.INFO)) 260 | 261 | ld(options) 262 | 263 | args = [i.decode(locale.getpreferredencoding(), 'ignore') for i in args] 264 | if options.src_list: 265 | with open(options.src_list, 'r') as f: 266 | src_dirs = [i.rstrip('\n\r').decode(locale.getpreferredencoding(), 'ignore') for i in f] 267 | try: 268 | dst_dir = args[-1] 269 | except: 270 | dst_dir = os.path.splitext(options.src_list)[0].decode(locale.getpreferredencoding(), 'ignore') 271 | else: 272 | try: 273 | src_dirs = args[0:-1] 274 | dst_dir = args[-1] 275 | except: 276 | raise Exception("No source(s) or/and destination specified") 277 | 278 | if options.nothreads or options.debug: 279 | set_nothreads() 280 | 281 | if options.remove_dest: 282 | shutil.rmtree(dst_dir, ignore_errors=True) 283 | 284 | if not os.path.exists(dst_dir): 285 | try: 286 | os.makedirs(dst_dir) 287 | except os.error: pass 288 | 289 | for src in src_dirs: 290 | if src.startswith("#") or src.strip() == '': # ignore sources with names starting with "#" 291 | continue 292 | MergeSet(src, dst_dir).merge_dirs() 293 | 294 | -------------------------------------------------------------------------------- /tilers_tools/reader_ozi.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | # 2011-06-16 18:16:32 5 | 6 | ############################################################################### 7 | # Copyright (c) 2010, Vadim Shlyakhov 8 | # 9 | # Permission is hereby granted, free of charge, to any person obtaining a 10 | # copy of this software and associated documentation files (the "Software"), 11 | # to deal in the Software without restriction, including without limitation 12 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 13 | # and/or sell copies of the Software, and to permit persons to whom the 14 | # Software is furnished to do so, subject to the following conditions: 15 | # 16 | # The above copyright notice and this permission notice shall be included 17 | # in all copies or substantial portions of the Software. 18 | # 19 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 20 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL 22 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 24 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 25 | # DEALINGS IN THE SOFTWARE. 26 | ############################################################################### 27 | 28 | from __future__ import with_statement 29 | 30 | import os 31 | import logging 32 | import locale 33 | 34 | from optparse import OptionParser 35 | 36 | from tiler_functions import * 37 | from reader_backend import * 38 | 39 | ############################################################################### 40 | 41 | # Helper functions for class OziCartesianRefPoints 42 | 43 | ############################################################################### 44 | 45 | def bng_ofs(square_id,scale,relative_sq=None): 46 | 'converts British/Irish Grid square letter to offset pair in squares: V -> (0,0)' 47 | sq_idx='ABCDEFGHJKLMNOPQRSTUVWXYZ'.find(square_id) # 'I' skipped 48 | assert sq_idx >= 0 49 | ofs = [(sq_idx % 5)*scale, (4 - (sq_idx // 5))*scale] 50 | if relative_sq: 51 | rel_ofs=bng_ofs(relative_sq,scale) 52 | ofs[0]-=rel_ofs[0] 53 | ofs[1]-=rel_ofs[1] 54 | return ofs 55 | 56 | def bng2coord(grid_coord,zone,hemisphere): 57 | '(BNG) British National Grid' 58 | assert len(zone) == 2 59 | return reduce(lambda x,y: (x[0]+y[0],x[1]+y[1]),[ 60 | grid_coord, 61 | bng_ofs(zone[0],5*100000,'S'), 62 | bng_ofs(zone[1],100000) 63 | ]) 64 | 65 | def ig2coord(grid_coord,zone,hemisphere): 66 | '(IG) Irish Grid' 67 | assert len(zone) == 1 68 | return reduce(lambda x,y: (x[0]+y[0],x[1]+y[1]),[ 69 | grid_coord, 70 | bng_ofs(zone,100000) 71 | ]) 72 | 73 | def utm2coord(grid_coord,zone,hemisphere): 74 | '(UTM) Universal Transverse Mercator' 75 | return (grid_coord[0] - 500000, 76 | grid_coord[1] - (0 if hemisphere.upper() == 'N' else 10000000)) 77 | 78 | grid_map={ 79 | '(BNG) British National Grid': bng2coord, 80 | '(IG) Irish Grid': ig2coord, 81 | '(UTM) Universal Transverse Mercator': utm2coord, 82 | } 83 | 84 | ############################################################################### 85 | 86 | class OziCartesianRefPoints(RefPoints): 87 | 88 | ############################################################################### 89 | def __init__(self,owner,ref_lst): 90 | super(OziCartesianRefPoints,self).__init__( 91 | owner, 92 | **dict(zip( 93 | ['ids','pixels','cartesian','zone','hemisphere'], 94 | self.transpose(ref_lst)[:5])) 95 | ) 96 | 97 | def grid2coord(self): 98 | try: 99 | conv2cartesian=grid_map[self.owner.get_proj_id()] 100 | except KeyError: 101 | return self.cartesian 102 | res=[] 103 | for grid_data in zip(self.cartesian,self.zone,self.hemisphere): 104 | res.append(conv2cartesian(*grid_data)) 105 | return res 106 | 107 | ############################################################################### 108 | 109 | class OziMap(SrcMap): 110 | 111 | ############################################################################### 112 | magic = 'OziExplorer Map Data File' 113 | data_file = 'data_ozi.csv' 114 | 115 | proj_parms=( 116 | '+lat_0=', # 1. Latitude Origin 117 | '+lon_0=', # 2. Longitude Origin 118 | '+k=', # 3. K Factor 119 | '+x_0=', # 4. False Easting 120 | '+y_0=', # 5. False Northing 121 | '+lat_1=', # 6. Latitude 1 122 | '+lat_2=', # 7. Latitude 2 123 | '+h=', # 8. Height - used in the Vertical Near-Sided Perspective Projection 124 | # 9. Sat - not used 125 | #10. 
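The British/Irish grid helpers above (bng_ofs/bng2coord) resolve the two grid letters into false-origin offsets: the first letter selects a 500 km square relative to square S, the second a 100 km square within it, so square TQ, for instance, starts at easting 500000, northing 100000. A quick standalone check; the coordinates are illustrative (roughly central London):

    # TQ 30 80, i.e. 30000 m east and 80000 m north within square TQ
    easting, northing = bng2coord((30000, 80000), 'TQ', '')
    assert (easting, northing) == (530000, 180000)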
Path - not used 126 | ) 127 | 128 | def load_data(self): 129 | 'load datum definitions, ellipses, projections from a file' 130 | self.datum_dict={} 131 | self.ellps_dict={} 132 | self.proj_dict={} 133 | csv_map={ 134 | 'datum': (self.datum_dict,self.ini_lst), 135 | 'ellps': (self.ellps_dict,self.ini_lst), 136 | 'proj': (self.proj_dict,self.ini_lst), 137 | } 138 | self.load_csv(self.data_file,csv_map) 139 | 140 | def get_header(self): 141 | 'read map header' 142 | with open(self.file, 'rU') as f: 143 | lines=f.readlines() # non-Unicode! 144 | if not (lines and lines[0].startswith('OziExplorer Map Data File')): 145 | raise Exception(" Invalid file: %s" % self.map_file) 146 | hdr=[[l.strip()] for l in lines[:3]] + [[i.strip() for i in l.split(',')] for l in lines[3:]] 147 | ld(hdr) 148 | return hdr 149 | 150 | def get_layers(self): 151 | return [OziLayer(self,self.header)] 152 | # OziMap 153 | reader_class_map.append(OziMap) 154 | 155 | class OziLayer(SrcLayer): 156 | 157 | def hdr_parms(self, patt): 158 | 'filter header for params starting with "patt"' 159 | return [i for i in self.data if i[0].startswith(patt)] 160 | 161 | def get_refs(self): 162 | 'get a list of geo refs in tuples' 163 | points=[i for i in self.hdr_parms('Point') if len(i) > 5 and i[4] == 'in' and i[2] != ''] # Get a list of geo refs 164 | if points[0][14] != '': # refs are cartesian 165 | refs=OziCartesianRefPoints(self,[( 166 | i[0], # id 167 | (int(i[2]),int(i[3])), # pixel 168 | (float(i[14]),float(i[15])), # cartesian coords 169 | i[13],i[16], # zone, hemisphere 170 | ) for i in points], 171 | ) 172 | else: 173 | refs=LatLonRefPoints(self,[( 174 | i[0], # id 175 | (int(i[2]),int(i[3])), # pixel 176 | (dms2dec(*i[9:12]), dms2dec(*i[6:9])), # lat/long 177 | ) for i in points]) 178 | return refs 179 | 180 | def get_plys(self): 181 | 'boundary polygon' 182 | ply_pix=[(int(i[2]),int(i[3])) for i in self.hdr_parms('MMPXY')] # Moving Map border pixels 183 | ply_ll=[(float(i[2]),float(i[3])) for i in self.hdr_parms('MMPLL')] # Moving Map border lat,lon 184 | ids=[i[0] for i in self.hdr_parms('MMPXY')] # Moving Map border pixels 185 | if (ply_pix and ply_ll): 186 | plys=RefPoints(self,ids=ids,pixels=ply_pix,latlong=ply_ll) 187 | else: 188 | plys=None 189 | return plys 190 | 191 | def get_dtm(self): 192 | 'get DTM northing, easting' 193 | dtm=[float(s)/3600 for s in self.data[4][2:4]] 194 | return dtm if dtm != [0,0] else None 195 | 196 | def get_proj_id(self): 197 | return self.hdr_parms('Map Projection')[0][1] 198 | 199 | def get_proj(self): 200 | proj_id=self.get_proj_id() 201 | parm_lst=self.hdr_parms('Projection Setup')[0] 202 | try: 203 | proj_parm=self.map.srs_defs['proj'][proj_id.upper()] 204 | proj = [proj_parm[0]] 205 | except KeyError: 206 | raise Exception("*** Unsupported projection (%s)" % proj_id) 207 | if not ('+init=' in proj[0] or '+datum=' in proj[0] or '+towgs84=' in proj[0]): # ooverwise assume it already has a full data defined 208 | # get projection parameters 209 | if self.get_proj_id() == '(UTM) Universal Transverse Mercator': 210 | assert '+proj=tmerc' in proj[0] 211 | if self.refs.cartesian: 212 | zone=int(self.refs.zone[0]) 213 | else: 214 | zone=(self.refs.latlong[0][0]+180) // 6 + 1 215 | proj.append('+lon_0=%i' % ((zone - 1) * 6 + 3 - 180)) 216 | else: 217 | proj.extend([ i[0]+i[1] for i in zip(self.map.proj_parms,parm_lst[1:]) 218 | if i[1].translate(None,'0.')]) 219 | return proj 220 | 221 | def get_datum_id(self): 222 | return self.data[4][0] 223 | 224 | def get_datum(self): 225 | 
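For '(UTM) Universal Transverse Mercator' the data_ozi.csv entry supplies only '+proj=tmerc +k=0.9996', so get_proj() above derives the missing central meridian from the UTM zone (or, for lat/long refs, computes the zone from the first point's longitude): each zone spans 6 degrees and zone 1 is centred on 177W, hence lon_0 = (zone - 1)*6 + 3 - 180. A standalone check of that arithmetic; the helper name is ours, the formula is the one used above:

    def utm_central_meridian(zone):
        return (zone - 1) * 6 + 3 - 180

    assert utm_central_meridian(1) == -177
    assert utm_central_meridian(31) == 3      # zone 31 covers 0..6 degrees east
    assert utm_central_meridian(60) == 177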
datum_id=self.get_datum_id() 226 | try: 227 | datum_def=self.map.srs_defs['datum'][datum_id.upper()] 228 | datum=[datum_def[0]] # PROJ4 datum defined ? 229 | if not datum: # PROJ4 datum not defined 230 | datum=['+towgs84=%s,%s,%s' % tuple(datum_def[3:6])] 231 | ellps_id=datum_def[2] 232 | ellps_def=self.map.srs_defs['ellps'][ellps_id.upper()] 233 | ellps = ellps_def[0] 234 | if ellps: 235 | datum.append(ellps) 236 | else: 237 | datum.append('+a=%s',ellps_def[1]) 238 | datum.append('+rf=%s',ellps_def[2]) 239 | except KeyError: 240 | raise Exception("*** Unsupported datum (%s)" % datum_id) 241 | return datum 242 | 243 | try_encodings=(locale.getpreferredencoding(),'utf_8','cp1251','cp1252') 244 | 245 | def get_raster(self): 246 | img_path=self.data[2][0] 247 | img_path_slashed=img_path.replace('\\','/') # get rid of windows separators 248 | img_path_lst=os.path.split(img_path_slashed) 249 | img_fname=img_path_lst[-1] 250 | 251 | map_dir,map_fname=os.path.split(self.map.file) 252 | dir_lst=os.listdir(map_dir if map_dir else u'.') 253 | 254 | # try a few encodings 255 | for enc in self.try_encodings: 256 | name_patt=img_fname.decode(enc,'ignore').lower() 257 | match=[i for i in dir_lst if i.lower() == name_patt] 258 | if match: 259 | fn=match[0] 260 | ld(map_dir, fn) 261 | img_file=os.path.join(map_dir, fn) 262 | break 263 | else: 264 | raise Exception("*** Image file not found: %s" % img_path) 265 | return img_file 266 | 267 | def get_name(self): 268 | ozi_name=self.data[1][0] 269 | # guess .map file encoding 270 | for enc in self.try_encodings: 271 | try: 272 | #~ if enc == 'cp1251': 273 | #~ wrong_chars = [ 274 | #~ c for c in ozi_name 275 | #~ if '\x41' <= c <= '\x5A' or 276 | #~ '\x61' <= c <= '\x7A' 277 | #~ ] 278 | #~ if any(wrong_chars): 279 | #~ ld('wrong_chars', enc, wrong_chars, ozi_name) 280 | #~ continue # cp1251 name shouldn't have any ascii 281 | ozi_name = ozi_name.decode(enc) 282 | break 283 | except UnicodeDecodeError: 284 | pass 285 | ld('ozi_name', enc, ozi_name) 286 | return ozi_name 287 | 288 | # OziLayer 289 | 290 | ############################################################################### 291 | 292 | if __name__=='__main__': 293 | 294 | print('\nPlease use convert2gdal.py\n') 295 | sys.exit(1) 296 | -------------------------------------------------------------------------------- /tilers_tools/reader_backend.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2011, Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 24 | ############################################################################### 25 | 26 | from __future__ import with_statement 27 | 28 | import os 29 | import logging 30 | import locale 31 | 32 | try: 33 | from osgeo import gdal 34 | from osgeo import osr 35 | from osgeo.gdalconst import * 36 | gdal.TermProgress = gdal.TermProgress_nocb 37 | except ImportError: 38 | import osr 39 | import gdal 40 | from gdalconst import * 41 | 42 | from tiler_functions import * 43 | 44 | reader_class_map = [] 45 | 46 | def dms2dec(degs='0',mins='0',ne='E',sec='0'): 47 | return (float(degs)+float(mins)/60+float(sec)/3600)*(-1 if ne in ('W','S') else 1 ) 48 | 49 | def dst_path(src,dst_dir,ext='',template='%s'): 50 | src_dir,src_file=os.path.split(src) 51 | base,sext=os.path.splitext(src_file) 52 | dest=(template % base)+ext 53 | if not dst_dir: 54 | dst_dir=src_dir 55 | if dst_dir: 56 | dest='%s/%s' % (dst_dir,dest) 57 | ld('base',base,'dest',dest,'src',src) 58 | return dest 59 | 60 | class Opt(object): 61 | def __init__(self,**dictionary): 62 | self.dict=dictionary 63 | def __getattr__(self, name): 64 | return self.dict.setdefault(name,None) 65 | 66 | ############################################################################### 67 | 68 | class RefPoints(object): 69 | 'source geo-reference points and polygons' 70 | 71 | ############################################################################### 72 | @staticmethod 73 | def transpose(ref_lst): # helper function for children classes 74 | return [list(i) for i in zip(*ref_lst)] 75 | 76 | def __init__(self,owner, 77 | ids=None,pixels=None,latlong=None,cartesian=None,zone=None,hemisphere=None): 78 | self.owner=owner 79 | self.ids=ids 80 | self.pixels=pixels 81 | self.latlong=latlong 82 | self.cartesian=cartesian 83 | self.zone=zone 84 | self.hemisphere=hemisphere 85 | 86 | #~ ld('RefPoints',self.__dict__) 87 | 88 | nrefs=len(filter(None,(self.pixels,self.latlong,self.cartesian))[0]) 89 | if not self.ids: 90 | self.ids=map(str,range(1,nrefs+1)) 91 | 92 | if nrefs == 2: 93 | logging.warning(' Only 2 reference points: assuming the chart is north alligned') 94 | self.ids += ['Extra03','Extra04'] 95 | for i in filter(None,(self.pixels,self.latlong,self.cartesian,self.zone,self.hemisphere)): 96 | try: # list of coordinates? 
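Geodetic reference points in Ozi .map files arrive as degrees plus decimal minutes with a hemisphere letter, which is what dms2dec() above converts to signed decimal degrees (the trailing seconds field defaults to zero, and 'W'/'S' flip the sign); reader_ozi.py feeds it the Point fields directly. The values below are invented for illustration:

    lat = dms2dec('43', '38.22', 'N')    # -> 43.637
    lon = dms2dec('7', '9.75', 'E')      # ->  7.1625
    assert abs(lat - 43.637) < 1e-9 and abs(lon - 7.1625) < 1e-9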
-- swap x and y between them 97 | i.append((i[0][0],i[1][1])) 98 | i.append((i[1][0],i[0][1])) 99 | except IndexError: # just copy them 100 | i.append(i[0]) 101 | i.append(i[1]) 102 | ld('RefPoints extra',self.__dict__) 103 | 104 | self.ids=[s.encode('utf-8') for s in self.ids] 105 | 106 | def srs(self): 107 | return self.owner.srs 108 | 109 | def __iter__(self): 110 | for i in zip(self.ids,self.pix_coords(),self.proj_coords()): 111 | yield i 112 | 113 | def pix_coords(self,dataset=None): 114 | if self.pixels: 115 | return self.pixels 116 | p_dst=self.proj_coords() 117 | #~ ld(p_dst) 118 | pix_tr = GdalTransformer( 119 | dataset, 120 | METHOD='GCP_POLYNOMIAL' if not self.owner.map.options.tps else 'GCP_TPS' 121 | ) 122 | p_pix=pix_tr.transform(p_dst,inv=True) 123 | #~ ld(p_pix) 124 | return [(p[0],p[1]) for p in p_pix] 125 | 126 | def grid2coord(self): # to re-implemented by children if applicable 127 | return self.cartesian 128 | 129 | def proj_coords(self): 130 | if self.cartesian: 131 | return self.grid2coord() 132 | dtm=self.owner.dtm 133 | if not dtm: 134 | dtm=[0,0] 135 | latlong=[(lon+dtm[0],lat+dtm[1]) for lon,lat in self.latlong] 136 | srs_tr = GdalTransformer(SRC_SRS=proj_cs2geog_cs(self.owner.srs), DST_SRS=self.owner.srs) 137 | coords=srs_tr.transform(latlong) 138 | return coords 139 | 140 | def over_180(self): 141 | if not self.cartesian: # refs are lat/long 142 | leftmost=min(zip(self.pixels,self.latlong),key=lambda r: r[0][0]) 143 | rightmost=max(zip(self.pixels,self.latlong),key=lambda r: r[0][0]) 144 | ld('leftmost',leftmost,'rightmost',rightmost) 145 | if leftmost[1][0] > rightmost[1][0]: 146 | return leftmost[1][0] 147 | return None 148 | 149 | ############################################################################### 150 | 151 | class LatLonRefPoints(RefPoints): 152 | 'geo-reference points with geodetic coordinates initialised with a sigle list' 153 | 154 | ############################################################################### 155 | def __init__(self,owner,ref_lst): 156 | super(LatLonRefPoints,self).__init__( 157 | owner, 158 | **dict(zip( 159 | ['ids','pixels','latlong'], 160 | self.transpose(ref_lst)[:3])) 161 | ) 162 | 163 | ############################################################################### 164 | 165 | class SrcMap(object): 166 | 167 | ############################################################################### 168 | 169 | srs_defs = None 170 | data_file = None 171 | 172 | def __init__(self,src_file,options=None): 173 | self.options=options 174 | gdal.ErrorReset() 175 | gdal.UseExceptions() 176 | 177 | # load datum definitions, ellipses, projections 178 | if self.data_file: 179 | self.srs_defs = load_geo_defs(self.data_file) 180 | 181 | self.file=src_file 182 | self.header=self.get_header() # Read map header 183 | 184 | # def get_layers(self): 185 | # pass 186 | 187 | ############################################################################### 188 | 189 | class SrcLayer(object): 190 | 191 | ############################################################################### 192 | 193 | def __init__(self,src_map,data): 194 | self.map=src_map 195 | self.data=data 196 | self.name=self.get_name() 197 | 198 | self.img_file=self.get_raster() 199 | fname = self.img_file.encode(locale.getpreferredencoding()) 200 | 201 | self.raster_ds = gdal.Open(fname, GA_ReadOnly) 202 | 203 | self.dtm=None 204 | self.refs=self.get_refs() # fetch reference points 205 | self.srs,self.dtm=self.get_srs() # estimate SRS 206 | 207 | def __del__(self): 208 | 
ld('SrcLayer __del__') 209 | self.raster_ds = None 210 | 211 | def get_srs(self): # redefined in reader_kml.py 212 | 'returns srs for the map, and DTM shifts if any' 213 | 214 | logging.info(' %s : %s (%s)' % (self.map.file,self.name,self.img_file)) 215 | 216 | options=self.map.options 217 | if options.srs: 218 | logging.info(' %s' % (options.srs,)) 219 | return(options.srs,None) 220 | 221 | logging.info(' %s, %s' % (self.get_datum_id(),self.get_proj_id())) 222 | dtm=None 223 | proj4=[] 224 | 225 | # compute chart's projection 226 | if options.proj: 227 | proj4.append(options.proj) 228 | else: 229 | proj4=self.get_proj() 230 | 231 | # setup a central meridian artificialy to allow charts crossing meridian 180 232 | leftmost=self.refs.over_180() 233 | if leftmost and '+lon_0=' not in proj4[0]: 234 | proj4.append(' +lon_0=%i' % int(leftmost)) 235 | 236 | # compute chart's datum 237 | if options.datum: 238 | proj4.append(options.datum) 239 | elif options.force_dtm or options.dtm_shift: 240 | dtm=self.get_dtm() # get northing, easting to WGS84 if any 241 | proj4.append('+datum=WGS84') 242 | elif not '+proj=' in proj4[0]: 243 | pass # assume datum is defined already 244 | else: 245 | datum=self.get_datum() 246 | proj4.extend(datum) 247 | proj4.extend(['+nodefs']) # '+wktext', 248 | ld('proj4',proj4) 249 | return ' '.join(proj4).encode('utf-8'),dtm 250 | 251 | def convert(self): 252 | options=self.map.options 253 | 254 | if options.after_name: 255 | name_patt=self.name 256 | elif options.after_map: 257 | name_patt=self.map.file 258 | else: 259 | name_patt=self.img_file 260 | base = dst_path(name_patt, options.dst_dir) 261 | if options.long_name: 262 | base += ' - ' + "".join([c for c in self.name 263 | if c .isalpha() or c.isdigit() or c in '-_.() ']) 264 | dst_dir=os.path.split(base)[0] 265 | out_format='VRT' 266 | ext='.'+out_format.lower() 267 | 268 | try: 269 | start_dir=os.getcwd() 270 | if dst_dir: 271 | os.chdir(dst_dir) 272 | 273 | dst_file = os.path.abspath(os.path.basename(base+ext)) # output file 274 | dst_drv = gdal.GetDriverByName(out_format) 275 | dst_ds = dst_drv.CreateCopy( 276 | dst_file.encode(locale.getpreferredencoding()), 277 | self.raster_ds, 278 | 0 279 | ) 280 | dst_ds.SetProjection(self.srs) 281 | 282 | #double x = 0.0, double y = 0.0, double z = 0.0, double pixel = 0.0, 283 | #double line = 0.0, char info = "", char id = "" 284 | gcps=[gdal.GCP(c[0],c[1],0,p[0],p[1],'',i) for i,p,c in self.refs] 285 | dst_ds.SetGCPs(gcps,self.refs.srs()) 286 | dst_geotr=gdal.GCPsToGeoTransform(gcps) # if len(gcps) < 5 else (0.0, 1.0, 0.0, 0.0, 0.0, 1.0) 287 | dst_ds.SetGeoTransform(dst_geotr) 288 | poly,gmt_data=self.cut_poly(dst_ds) 289 | if poly: 290 | dst_ds.SetMetadataItem('CUTLINE',poly) 291 | if self.name: 292 | dst_ds.SetMetadataItem('DESCRIPTION',self.name.encode('utf-8')) 293 | 294 | dst_ds = None # close dataset 295 | # re_sub_file(dst_file, [ 296 | # ('^.*.*\n',''), 297 | # ('^.*.*\n','') 298 | # ]) 299 | finally: 300 | self.raster_ds = None 301 | os.chdir(start_dir) 302 | 303 | if options.get_cutline: # print cutline then return 304 | print poly 305 | return 306 | if gmt_data and options.cut_file: # create shapefile with a cut polygon 307 | with open(base+'.gmt','w+') as f: 308 | f.write(gmt_data) 309 | 310 | return dst_file 311 | 312 | gmt_templ = '\n'.join(( 313 | '# @VGMT1.0 @GPOLYGON', 314 | '# @Jp"%s"', 315 | '# FEATURE_DATA', 316 | '>', 317 | '# @P', 318 | '%s' 319 | )) 320 | 321 | def cut_poly(self,dst_ds): 322 | plys=self.get_plys() 323 | if not plys: 324 | return 
'','' 325 | 326 | pix_lst=plys.pix_coords(dst_ds) 327 | 328 | # check if the raster really needs cutting 329 | width=dst_ds.RasterXSize 330 | height=dst_ds.RasterYSize 331 | inside=[i for i in pix_lst # check if the polygon is inside the image border 332 | if (i[0] > 0 or i[0] < width) or (i[1] > 0 or i[1] < height)] 333 | if not inside: 334 | return '','' 335 | 336 | # Create cutline 337 | poly_shape=self.gmt_templ % (self.refs.srs(),'\n'.join( 338 | ['%r %r' % (i[0],i[1]) for i in plys.proj_coords()])) 339 | poly_wkt='MULTIPOLYGON(((%s)))' % ','.join(['%r %r' % tuple(i) for i in pix_lst]) # Create cutline 340 | return poly_wkt,poly_shape 341 | # SrcLayer 342 | 343 | ############################################################################### 344 | -------------------------------------------------------------------------------- /tilers_tools/converter_backend.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2010, 2013 Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 
24 | ############################################################################### 25 | 26 | import os 27 | import os.path 28 | import glob 29 | import shutil 30 | import json 31 | import tempfile 32 | import StringIO 33 | import struct 34 | from multiprocessing import Pool 35 | import itertools 36 | 37 | from PIL import Image 38 | #~ from PIL import WebPImagePlugin 39 | 40 | from tiler_functions import * 41 | from tiler import Pyramid 42 | 43 | ############################# 44 | 45 | class Tile(object): 46 | 47 | ############################# 48 | def __init__(self, coord): 49 | self._coord = tuple(coord) 50 | self.path = None 51 | self.temp = False 52 | 53 | def coord(self): 54 | return self._coord 55 | 56 | def get_mime(self): 57 | return mime_from_ext(self.get_ext()) 58 | 59 | def close_file(self): 60 | if self.temp and self.path and os.path.exists(self.path): 61 | ld('tile', self.coord()) 62 | os.remove(self.path) 63 | self.path = None 64 | 65 | ############################# 66 | 67 | class FileTile(Tile): 68 | 69 | ############################# 70 | def __init__(self, coord, path, temp=False): 71 | super(FileTile, self).__init__(coord) 72 | self.path = path 73 | self.temp = temp 74 | 75 | def data(self): 76 | with open(self.path, 'rb') as f: 77 | return f.read() 78 | 79 | def get_ext(self): 80 | return os.path.splitext(self.path)[1] 81 | 82 | def copy2file(self, dst, link=False): 83 | if link and os.name == 'posix': 84 | dst_dir = os.path.split(dst)[0] 85 | src = os.path.relpath(self.path, dst_dir) 86 | os.symlink(src, dst) 87 | else: 88 | shutil.copy(self.path, dst) 89 | 90 | def get_file(self): 91 | return self.path 92 | 93 | ############################# 94 | 95 | class FileTileNoExt(FileTile): 96 | 97 | ############################# 98 | def get_ext(self): 99 | return ext_from_file(self.path) 100 | 101 | ############################# 102 | 103 | class PixBufTile(Tile): 104 | 105 | ############################# 106 | def __init__(self, coord, pixbuf, key=None, dataType=None): 107 | super(PixBufTile, self).__init__(coord) 108 | self.pixbuf = pixbuf 109 | self.data_type = dataType 110 | 111 | def data(self): 112 | return self.pixbuf 113 | 114 | def get_ext(self): 115 | if self.data_type: 116 | ext = ext_from_mime(self.data_type) 117 | else: 118 | ext = ext_from_buffer(self.pixbuf) 119 | return ext 120 | 121 | def copy2file(self, dest_path, link=False): 122 | with open(dest_path, 'wb') as f: 123 | f.write(self.pixbuf) 124 | 125 | def get_file(self): 126 | self.temp = True 127 | f_handle, self.path = tempfile.mkstemp(prefix='s%d-%d-%d_' % self.coord(), suffix=self.get_ext()) 128 | 129 | os.write(f_handle, self.pixbuf) 130 | os.close(f_handle) 131 | return self.path 132 | 133 | #---------------------------- 134 | 135 | tile_converters = [] 136 | 137 | ############################# 138 | 139 | class TileConverter(object): 140 | 141 | ############################# 142 | profile_name = 'copy' 143 | dst_ext = None 144 | src_formats = () # by default do not convert tiles 145 | 146 | def __init__(self, options): 147 | self.options = options 148 | ld('TileConverter', self.profile_name) 149 | 150 | 151 | def __call__(self, tile): 152 | 'convert tile' 153 | try: 154 | if tile.get_ext() in self.src_formats: 155 | return self.convert_tile(tile) 156 | else: 157 | return tile # w/o conversion 158 | except (EnvironmentError, KeyError): 159 | return None 160 | 161 | @staticmethod 162 | def get_class(profile, isDest=False): 163 | for cls in tile_converters: 164 | if profile == cls.profile_name: 165 
| return cls 166 | else: 167 | raise Exception('Invalid format: %s' % profile) 168 | 169 | @staticmethod 170 | def list_tile_converters(): 171 | for cls in tile_converters: 172 | print cls.profile_name 173 | 174 | tile_converters.append(TileConverter) 175 | 176 | ############################# 177 | 178 | class ShellConverter (TileConverter): 179 | 180 | ############################# 181 | prog_name = None 182 | 183 | def __init__(self, options): 184 | super(ShellConverter, self).__init__(options) 185 | self.dst_dir = tempfile.gettempdir() 186 | 187 | if self.prog_name: 188 | try: # check if converter programme is available 189 | prog_path = command(['which', self.prog_name]).strip() 190 | except: 191 | raise Exception('Can not find %s executable' % self.prog_name) 192 | 193 | def convert_tile(self, src_tile): 194 | 195 | src_path = src_tile.get_file() 196 | #~ ld('convert_tile', src_path) 197 | base_name = os.path.splitext(os.path.split(src_path)[1])[0] 198 | dst_dir = tempfile.gettempdir() 199 | coord = src_tile.coord() 200 | suffix = ('-%d-%d-%d' % coord) + self.dst_ext 201 | dst_path = os.path.join(dst_dir, base_name + suffix) 202 | 203 | self.call_converter(src_path, dst_path, suffix) 204 | 205 | src_tile.close_file() 206 | dst_tile = FileTile(coord, dst_path, temp=True) 207 | return dst_tile 208 | 209 | ############################# 210 | 211 | class PngConverter (ShellConverter): 212 | 'optimize png using pngnq utility' 213 | ############################# 214 | profile_name = 'pngnq' 215 | prog_name = 'pngnq' 216 | dst_ext = '-nq8.png' 217 | src_formats = ('.png',) 218 | 219 | def call_converter(self, src, dst, suffix): 220 | 221 | command(['pngnq', '-f', '-n', self.options.colors, '-e', suffix, '-d', self.dst_dir, src]) 222 | 223 | tile_converters.append(PngConverter) 224 | 225 | ############################# 226 | 227 | class WebpConverter (ShellConverter): 228 | 'convert to webp' 229 | ############################# 230 | profile_name = 'webp' 231 | dst_ext = '.webp' 232 | src_formats = ('.png','.jpg','.jpeg','.gif') 233 | 234 | def call_converter(self, src, dst, suffix): 235 | 236 | command(['cwebp', '-alpha_cleanup', '-q', str(self.options.quality), '-o', dst, src]) 237 | 238 | tile_converters.append(WebpConverter) 239 | 240 | ############################# 241 | 242 | class WebpNoAlphaConverter (ShellConverter): 243 | 'convert to webp; discard alpha channel' 244 | ############################# 245 | profile_name = 'webp-noalpha' 246 | dst_ext = '.webp' 247 | src_formats = ('.png','.jpg','.jpeg','.gif') 248 | 249 | def call_converter(self, src, dst, suffix): 250 | 251 | command(['cwebp', '-preset', 'drawing', '-noalpha', '-q', str(self.options.quality), '-o', dst, src]) 252 | 253 | tile_converters.append(WebpNoAlphaConverter) 254 | 255 | 256 | #~ ############################# 257 | #~ 258 | #~ class WebpPilConverter (TileConverter): 259 | #~ 'convert to webp' 260 | #~ ############################# 261 | #~ profile_name = 'webppil' 262 | #~ dst_ext = '.webp' 263 | #~ src_formats = ('.png','.jpg','.jpeg','.gif') 264 | #~ 265 | #~ def convert_tile(self, src, dst, dpath): 266 | #~ img = Image.open(src) 267 | #~ img.save(dst, optimize=True, quality=self.options.quality) 268 | #~ 269 | #~ tile_converters.append(WebpPilConverter) 270 | 271 | 272 | ############################# 273 | 274 | class JpegConverter (TileConverter): 275 | 'convert to jpeg' 276 | ############################# 277 | profile_name = 'jpeg' 278 | dst_ext = '.jpg' 279 | src_formats = ('.png', '.gif') 280 | 281 | def 
convert_tile(self, tile): 282 | src = StringIO.StringIO(tile.data()) 283 | img = Image.open(src) 284 | dst = StringIO.StringIO() 285 | img.save(dst, 'jpeg', optimize=True, quality=self.options.quality) 286 | 287 | dtile = PixBufTile(tile.coord(), dst.getvalue()) 288 | src.close() 289 | dst.close() 290 | 291 | return dtile 292 | 293 | tile_converters.append(JpegConverter) 294 | 295 | #---------------------------- 296 | 297 | tileset_profiles = [] 298 | 299 | tile_converter = None 300 | 301 | def global_converter(tile): 302 | #~ log('tile', tile.coord()) 303 | tile = tile_converter(tile) 304 | return tile 305 | 306 | ############################# 307 | 308 | class TileSet(object): 309 | 310 | ############################# 311 | 312 | #~ tile_converter = None 313 | pool = None 314 | 315 | def __init__(self, root=None, options=None, src=None): 316 | options = LooseDict(options) 317 | options.isDest = src is not None 318 | 319 | self.root = root 320 | self.options = options 321 | self.src = src 322 | 323 | self.srs = self.options.proj4def or self.options.tiles_srs 324 | self.tilemap_crs = self.options.tiles_srs or self.tilemap_crs 325 | self.options.tiles_srs = self.srs 326 | 327 | self.zoom_levels = {} 328 | self.pyramid = Pyramid.profile_class('generic')(options=options) 329 | 330 | if not self.options.isDest: 331 | assert os.path.exists(root), 'No file or directory found: %s' % root 332 | self.ext = os.path.splitext(root)[1] 333 | if self.options.zoom: 334 | self.pyramid.set_zoom_range(self.options.zoom) 335 | if self.options.region: 336 | self.pyramid.load_region(self.options.region) 337 | else: 338 | basename = os.path.splitext(os.path.basename(self.root or src.root))[0] 339 | df_name = os.path.splitext(basename)[0] 340 | #~ if self.options.region: 341 | #~ df_name += '-' + os.path.splitext(self.options.region)[0] 342 | self.name = self.options.name or df_name 343 | 344 | if not self.root: 345 | suffix = self.ext if self.ext != src.ext else self.ext + '0' 346 | self.root = os.path.join(options.dst_dir, self.name + suffix) 347 | 348 | if os.path.exists(self.root): 349 | if self.options.remove_dest: 350 | if os.path.isdir(self.root): 351 | shutil.rmtree(self.root, ignore_errors=True) 352 | else: 353 | os.remove(self.root) 354 | else: 355 | assert self.options.append, 'Destination already exists: %s' % root 356 | 357 | if self.options.convert_tile: 358 | global tile_converter 359 | tile_converter = TileConverter.get_class(self.options.convert_tile)(options) 360 | if not (self.options.nothreads or self.options.debug): 361 | self.pool = Pool() 362 | 363 | @staticmethod 364 | def get_class(profile, isDest=False): 365 | for cls in tileset_profiles: 366 | if profile == cls.format and ((not isDest and cls.input) or (isDest and cls.output)): 367 | return cls 368 | else: 369 | raise Exception('Invalid format: %s' % profile) 370 | 371 | @staticmethod 372 | def list_profiles(): 373 | for cl in tileset_profiles: 374 | print '%10s\t%s%s\t%s' % ( 375 | cl.format, 376 | 'r' if cl.input else ' ', 377 | 'w' if cl.output else ' ', 378 | cl.__doc__ 379 | ) 380 | 381 | def in_range(self, ul_coords, lr_coords=None): 382 | if not ul_coords: 383 | return False 384 | if not self.pyramid: 385 | return True 386 | zoom = ul_coords[0] 387 | region_zoom = self.options.region_zoom 388 | if region_zoom is not None and zoom < region_zoom: 389 | return True 390 | return self.pyramid.in_range(ul_coords, lr_coords) 391 | 392 | def __del__(self): 393 | log('self.count', self.count) 394 | 395 | def __iter__(self): # to be 
defined by a child 396 | raise Exception('Not implemented!') 397 | 398 | def convert(self): 399 | pf('%s -> %s ' % (self.src.root, self.root), end='') 400 | 401 | if self.pool: 402 | src = self.pool.imap_unordered(global_converter, self.src, chunksize=10) 403 | elif self.options.convert_tile: 404 | src = itertools.imap(global_converter, self.src) 405 | else: 406 | src = self.src 407 | 408 | for tile in src: 409 | if tile is not None: 410 | self.process_tile(tile) 411 | 412 | if self.pool: 413 | self.pool.close() 414 | self.pool.join() 415 | 416 | if self.count > 0: 417 | self.finalize_pyramid() 418 | self.finalize_tileset() 419 | else: 420 | pf('No tiles converted', end='') 421 | pf('') 422 | 423 | def process_tile(self, tile): 424 | #~ log('process_tile', tile) 425 | self.store_tile(tile) 426 | self.counter() 427 | 428 | # collect min max values for tiles processed 429 | zxy = list(tile.coord()) 430 | z = zxy[0] 431 | 432 | min_max = self.zoom_levels.get(z, []) # min, max 433 | zzz, xxx, yyy = zip(*(min_max+[zxy])) 434 | self.zoom_levels[z] = [[z, min(xxx), min(yyy)], [z, max(xxx), max(yyy)]] 435 | tile.close_file() 436 | 437 | def finalize_pyramid(self): 438 | log('self.zoom_levels', self.zoom_levels) 439 | 440 | # compute "effective" covered area 441 | prev_sq = 0 442 | for z in reversed(sorted(self.zoom_levels)): 443 | ul_zxy, lr_zxy = self.zoom_levels[z] 444 | ul_c = self.pyramid.tile_corners(ul_zxy)[0] 445 | lr_c = self.pyramid.tile_corners(lr_zxy)[1] 446 | sq = (lr_c[0]-ul_c[0])*(ul_c[1]-lr_c[1]) 447 | area_diff = round(prev_sq/sq, 5) 448 | log('ul_c, lr_c', z, ul_c, lr_c, sq, area_diff) 449 | if area_diff == 0.25: 450 | break # this must be an exact zoom of a previous level 451 | area_coords = [ul_c, lr_c] 452 | prev_sq = sq 453 | 454 | self.pyramid.set_region(area_coords) 455 | self.pyramid.set_zoom_range(','.join(map(str, self.zoom_levels.keys()))) 456 | 457 | self.pyramid.name = self.name 458 | 459 | def finalize_tileset(self): 460 | pass 461 | 462 | count = 0 463 | tick_rate = 10 464 | def counter(self): 465 | self.count += 1 466 | if self.count % self.tick_rate == 0: 467 | pf('.', end='') 468 | return True 469 | else: 470 | return False 471 | 472 | # TileSet 473 | 474 | ############################# 475 | 476 | class TileDir(TileSet): 477 | 478 | ############################# 479 | tile_class = FileTile 480 | 481 | def __init__(self, *args, **kw_args): 482 | super(TileDir, self).__init__(*args, **kw_args) 483 | 484 | if self.options.isDest: 485 | try: 486 | os.makedirs(self.root) 487 | except os.error: pass 488 | 489 | def __iter__(self): 490 | for f in glob.iglob(os.path.join(self.root, self.dir_pattern)): 491 | coord = self.path2coord(f) 492 | if self.in_range(coord): 493 | yield self.tile_class(coord, f) 494 | 495 | def path2coord(self, tile_path): 496 | raise Exception('Unimplemented!') 497 | 498 | def coord2path(self, z, x, y): 499 | raise Exception('Unimplemented!') 500 | 501 | def dest_ext(self, tile): 502 | return tile.get_ext() 503 | 504 | def store_tile(self, tile): 505 | try: 506 | tile_ext = self.dest_ext(tile) 507 | self.tile_ext = tile_ext 508 | except KeyError: 509 | tile_ext = '.xxx' # invalid file type 510 | dest_path = os.path.join(self.root, self.coord2path(*tile.coord())) + tile_ext 511 | log('%s -> %s' % (tile.path, dest_path)) 512 | try: 513 | os.makedirs(os.path.split(dest_path)[0]) 514 | except os.error: pass 515 | tile.copy2file(dest_path, self.options.link) 516 | # TileDir 517 | 518 | ############################# 519 | 520 | class 
TileMapDir(TileDir): 521 | 522 | ############################# 523 | 524 | def finalize_tileset(self): 525 | self.pyramid.tile_ext = self.tile_ext 526 | self.pyramid.dest = self.root 527 | self.pyramid.write_metadata() 528 | 529 | # TileMapDir 530 | -------------------------------------------------------------------------------- /tilers_tools/tiler_functions.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | ############################################################################### 5 | # Copyright (c) 2011, Vadim Shlyakhov 6 | # 7 | # Permission is hereby granted, free of charge, to any person obtaining a 8 | # copy of this software and associated documentation files (the "Software"), 9 | # to deal in the Software without restriction, including without limitation 10 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 11 | # and/or sell copies of the Software, and to permit persons to whom the 12 | # Software is furnished to do so, subject to the following conditions: 13 | # 14 | # The above copyright notice and this permission notice shall be included 15 | # in all copies or substantial portions of the Software. 16 | # 17 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 18 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 19 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 20 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 21 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 22 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 23 | # DEALINGS IN THE SOFTWARE. 24 | ############################################################################### 25 | 26 | from __future__ import with_statement 27 | from __future__ import print_function 28 | 29 | version = '%prog version 3.1.0' 30 | 31 | import sys 32 | import os 33 | import os.path 34 | import logging 35 | from subprocess import * 36 | import itertools 37 | import re 38 | import shutil 39 | import locale 40 | import csv 41 | import htmlentitydefs 42 | import json 43 | 44 | try: 45 | from osgeo import gdal 46 | from osgeo import osr 47 | from osgeo import ogr 48 | from osgeo.gdalconst import * 49 | # gdal.TermProgress = gdal.TermProgress_nocb 50 | except ImportError: 51 | import gdal 52 | import osr 53 | import ogr 54 | from gdalconst import * 55 | 56 | try: 57 | import multiprocessing # available in python 2.6 and above 58 | 59 | class KeyboardInterruptError(Exception): 60 | pass 61 | except: 62 | multiprocessing = None 63 | 64 | def data_dir(): 65 | return sys.path[0] 66 | 67 | def log(*parms): 68 | logging.debug(' '.join(itertools.imap(repr, parms))) 69 | 70 | ld = log 71 | 72 | def error(*parms): 73 | logging.error(' '.join(itertools.imap(repr, parms))) 74 | 75 | def ld_nothing(*parms): 76 | return 77 | 78 | def pf(*parms, **kparms): 79 | end = kparms['end'] if 'end' in kparms else '\n' 80 | parms = [i.encode(locale.getpreferredencoding()) if isinstance(i, unicode) else str(i) for i in parms] 81 | sys.stdout.write(' '.join(parms) + end) 82 | sys.stdout.flush() 83 | 84 | def pf_nothing(*parms, **kparms): 85 | return 86 | 87 | def set_nothreads(): 88 | ld('set_nothreads') 89 | global multiprocessing 90 | multiprocessing = None 91 | 92 | def parallel_map(func, iterable): 93 | ld('parallel_map', multiprocessing) 94 | #~ return map(func, iterable) 95 | 96 | if 
multiprocessing is None or len(iterable) < 2: 97 | return map(func, iterable) 98 | else: 99 | # map in parallel 100 | mp_pool = multiprocessing.Pool() # multiprocessing pool 101 | res = mp_pool.map(func, iterable) 102 | # wait for threads to finish 103 | mp_pool.close() 104 | mp_pool.join() 105 | return res 106 | 107 | def flatten(two_level_list): 108 | return list(itertools.chain(*two_level_list)) 109 | 110 | htmlentitydefs.name2codepoint['apos'] = ord(u"'") 111 | 112 | def strip_html(text): 113 | 'Removes HTML markup from a text string. http://effbot.org/zone/re-sub.htm#strip-html' 114 | 115 | def replace(match): # pattern replacement function 116 | text = match.group(0) 117 | if text == '
': 118 | return '\n' 119 | if text[0] == '<': 120 | return '' # ignore tags 121 | if text[0] == '&': 122 | if text[1] == '#': 123 | try: 124 | if text[2] == 'x': 125 | return unichr(int(text[3:-1], 16)) 126 | else: 127 | return unichr(int(text[2:-1])) 128 | except ValueError: 129 | pass 130 | else: 131 | return unichr(htmlentitydefs.name2codepoint[text[1:-1]]) 132 | return text # leave as is 133 | # fixup end 134 | 135 | return re.sub('(?s)<[^>]*>|&#?\w+;', replace, text) 136 | 137 | def if_set(x, default=None): 138 | return x if x is not None else default 139 | 140 | def path2list(path): 141 | head, ext = os.path.splitext(path) 142 | split = [ext] 143 | while head: 144 | head, p = os.path.split(head) 145 | if p == '': # head must be '/' 146 | p = head 147 | head = None 148 | split.append(p) 149 | split.reverse() 150 | return split 151 | 152 | try: 153 | import win32pipe 154 | except: 155 | win32pipe = None 156 | 157 | def command(params, child_in=None): 158 | cmd_str = ' '.join(('"%s"' % i if ' ' in i else i for i in params)) 159 | ld('>', cmd_str, child_in) 160 | if win32pipe: 161 | (stdin, stdout, stderr) = win32pipe.popen3(cmd_str, 't') 162 | if child_in: 163 | stdin.write(child_in) 164 | stdin.close() 165 | child_out = stdout.read() 166 | child_err = stderr.read() 167 | if child_err: 168 | logging.warning(child_err) 169 | else: 170 | process = Popen(params, stdin=PIPE, stdout=PIPE, stderr=PIPE, universal_newlines=True) 171 | (child_out, child_err) = process.communicate(child_in) 172 | if process.returncode != 0: 173 | error("*** External program error: %s\n%s" % (cmd_str, child_err)) 174 | raise EnvironmentError(process.returncode, child_err) 175 | ld('<', child_out, child_err) 176 | return child_out 177 | 178 | def dest_path(src, dest_dir, ext='', template='%s'): 179 | src_dir, src_file = os.path.split(src) 180 | base, sext = os.path.splitext(src_file) 181 | dest = (template % base)+ext 182 | if not dest_dir: 183 | dest_dir = src_dir 184 | if dest_dir: 185 | dest = '%s/%s' % (dest_dir, dest) 186 | ld(base, dest) 187 | return dest 188 | 189 | def re_sub_file(fname, subs_list): 190 | 'stream edit file using reg exp substitution list' 191 | new = fname+'.new' 192 | with open(new, 'w') as out: 193 | for l in open(fname, 'rU'): 194 | for (pattern, repl) in subs_list: 195 | l = re.sub(pattern, repl, string=l) 196 | out.write(l) 197 | shutil.move(new, fname) 198 | 199 | class LooseDict(object): 200 | def __init__(self, init=None, **kw): 201 | if init is None: 202 | init = dict() 203 | elif isinstance(init, dict): 204 | pass 205 | else: #optparse.Values 206 | init = init.__dict__ 207 | self.update(init) 208 | self.update(kw) 209 | 210 | def __getattr__(self, name): 211 | self.__dict__.get(name) 212 | 213 | def __setattr__(self, name, value): 214 | self.__dict__[name] = value 215 | 216 | def update(self, other_dict): 217 | self.__dict__.update(other_dict) 218 | 219 | ############################# 220 | # 221 | # GDAL utility functions 222 | # 223 | ############################# 224 | 225 | def load_geo_defs(csv_file): 226 | 'load datum definitions, ellipses, projections from a file' 227 | defs = { 228 | 'proj':{}, 229 | 'datum':{}, 230 | 'ellps':{}, 231 | } 232 | try: 233 | csv.register_dialect('strip', skipinitialspace=True) 234 | with open(os.path.join(data_dir(),csv_file),'rb') as data_f: 235 | data_csv=csv.reader(data_f,'strip') 236 | for row in data_csv: 237 | row=[s.decode('utf-8') for s in row] 238 | try: 239 | rec_type = row[0] 240 | rec_id = row[1] 241 | rec_data = row[2:] 242 | if 
not rec_type in defs: 243 | defs[rec_type] = {} 244 | defs[rec_type][rec_id.upper()] = rec_data 245 | except IndexError: 246 | pass 247 | except KeyError: 248 | pass 249 | except IOError: 250 | pass 251 | return defs 252 | 253 | geo_defs_override_file = 'data_override.csv' 254 | geo_defs_override = load_geo_defs(geo_defs_override_file) 255 | 256 | def txt2srs(proj): 257 | srs = osr.SpatialReference() 258 | proj_ovr = geo_defs_override['proj'].get(proj) 259 | if proj_ovr: 260 | proj = str(proj_ovr[0]) 261 | if proj.startswith(("GEOGCS", "GEOCCS", "PROJCS", "LOCAL_CS")): 262 | srs.ImportFromWkt(proj) 263 | if proj.startswith('EPSG'): 264 | epsg = int(proj.split(':')[1]) 265 | srs.ImportFromEPSG(epsg) 266 | if proj.startswith('+'): 267 | srs.ImportFromProj4(proj) 268 | return srs 269 | 270 | def txt2wkt(proj): 271 | srs = txt2srs(proj) 272 | return srs.ExportToWkt() 273 | 274 | def txt2proj4(proj): 275 | srs = txt2srs(proj) 276 | return srs.ExportToProj4() 277 | 278 | def proj_cs2geog_cs(proj): 279 | srs = txt2srs(proj) 280 | srs_geo = osr.SpatialReference() 281 | srs_geo.CopyGeogCSFrom(srs) 282 | return srs_geo.ExportToProj4() 283 | 284 | class GdalTransformer: 285 | def __init__(self, src_ds=None, dst_ds=None, **options): 286 | for key in ('SRC_SRS', 'DST_SRS'): 287 | try: 288 | options[key] = txt2wkt(options[key]) # convert to wkt 289 | except: pass 290 | opt_lst = ['%s=%s' % (key, options[key]) for key in options] 291 | self.transformer = gdal.Transformer(src_ds, dst_ds, opt_lst) 292 | 293 | def transform(self, points, inv=False): 294 | if not points: 295 | return [] 296 | transformed, ok = self.transformer.TransformPoints(inv, points) 297 | assert ok 298 | return [i[:2] for i in transformed] 299 | 300 | def transform_point(self, point, inv=False): 301 | return self.transform([point], inv=inv)[0] 302 | # GdalTransformer 303 | 304 | def sasplanet_hlg2ogr(fname): 305 | with open(fname) as f: 306 | lines = f.readlines(4096) 307 | if not lines[0].startswith('[HIGHLIGHTING]'): 308 | return None 309 | coords = [[], []] 310 | for l in lines[2:]: 311 | val = float(l.split('=')[1].replace(',','.')) 312 | coords[1 if 'Lat' in l else 0].append(val) 313 | points = zip(*coords) 314 | ld('sasplanet_hlg2ogr', 'points', points) 315 | 316 | ring = ogr.Geometry(ogr.wkbLinearRing) 317 | for p in points: 318 | ring.AddPoint(*p) 319 | polygon = ogr.Geometry(ogr.wkbPolygon) 320 | polygon.AddGeometry(ring) 321 | 322 | ds = ogr.GetDriverByName('Memory').CreateDataSource( 'wrk' ) 323 | assert ds is not None, 'Unable to create datasource' 324 | 325 | #~ src_srs = txt2srs('EPSG:4326') 326 | src_srs = txt2srs('+proj=latlong +a=6378137 +b=6378137 +datum=WGS84 +nadgrids=@null +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 +k=1.0 +units=m +no_defs') 327 | 328 | layer = ds.CreateLayer('sasplanet_hlg', srs=src_srs) 329 | 330 | feature = ogr.Feature(layer.GetLayerDefn()) 331 | feature.SetGeometry(polygon) 332 | layer.CreateFeature(feature) 333 | 334 | del feature 335 | del polygon 336 | del ring 337 | 338 | return ds 339 | 340 | def shape2mpointlst(datasource, dst_srs, feature_name=None): 341 | ds = ogr.Open(datasource.encode(locale.getpreferredencoding())) 342 | if not ds: 343 | gdal.ErrorReset() 344 | ds = sasplanet_hlg2ogr(datasource) 345 | if not ds: 346 | ld('shape2mpointlst: Invalid datasource %s' % datasource) 347 | return [] 348 | 349 | drv_name = ds.GetDriver().GetName() 350 | is_kml = 'KML' in drv_name 351 | ld('shape2mpointlst drv', drv_name, is_kml) 352 | 353 | n_layers = ds.GetLayerCount() 354 | for j in 
range(n_layers): 355 | layer = ds.GetLayer(j) 356 | n_features = layer.GetFeatureCount() 357 | ld('shape2mpointlst layer', j, n_layers, n_features, feature_name, layer) 358 | 359 | for i in range(n_features): 360 | feature = layer.GetNextFeature() 361 | 362 | fc = feature.GetFieldCount() 363 | for l in range(fc): 364 | fdef = feature.GetFieldDefnRef(l) 365 | ld(l, fdef.GetNameRef(), feature.GetFieldAsString(l)) 366 | 367 | if is_kml: 368 | i_icon = feature.GetFieldIndex('icon') 369 | #~ ld('shape2mpointlst i_icon', i_icon) 370 | if i_icon != -1: 371 | icon = feature.GetFieldAsString(i_icon) 372 | #~ ld('shape2mpointlst icon', icon) 373 | if icon != '': 374 | continue 375 | if feature_name is None: 376 | name = None 377 | else: 378 | i_name = feature.GetFieldIndex('Name') 379 | if i_name == -1: 380 | continue 381 | name = feature.GetFieldAsString(i_name).decode('utf-8') 382 | 383 | ld('shape2mpointlst name', name == feature_name, name, feature_name) 384 | if name == feature_name: 385 | #~ feature.DumpReadable() 386 | 387 | geom = feature.GetGeometryRef() 388 | geom_name = geom.GetGeometryName() 389 | geom_lst = { 390 | 'MULTIPOLYGON':(geom.GetGeometryRef(i) for i in range(geom.GetGeometryCount())), 391 | 'POLYGON': (geom, ), 392 | }[geom_name] 393 | 394 | layer_srs = layer.GetSpatialRef() 395 | if layer_srs: 396 | layer_proj = layer_srs.ExportToProj4() 397 | else: 398 | layer_proj = dst_srs 399 | srs_tr = GdalTransformer(SRC_SRS=layer_proj, DST_SRS=dst_srs) 400 | if layer_proj == dst_srs: 401 | srs_tr.transform = lambda x:x 402 | 403 | multipoint_lst = [] 404 | for geometry in geom_lst: 405 | assert geometry.GetGeometryName() == 'POLYGON' 406 | for ln in (geometry.GetGeometryRef(j) for j in range(geometry.GetGeometryCount())): 407 | assert ln.GetGeometryName() == 'LINEARRING' 408 | src_points = [ln.GetPoint(n) for n in range(ln.GetPointCount())] 409 | dst_points = srs_tr.transform(src_points) 410 | #~ ld(src_points) 411 | multipoint_lst.append(dst_points) 412 | ld('mpointlst', layer_proj, dst_srs, multipoint_lst) 413 | 414 | feature.Destroy() 415 | return multipoint_lst 416 | return [] 417 | 418 | def shape2cutline(cutline_ds, raster_ds, feature_name=None): 419 | mpoly = [] 420 | raster_proj = txt2proj4(raster_ds.GetProjection()) 421 | if not raster_proj: 422 | raster_proj = txt2proj4(raster_ds.GetGCPProjection()) 423 | ld(raster_proj, raster_ds.GetProjection(), raster_ds) 424 | 425 | pix_tr = GdalTransformer(raster_ds) 426 | for points in shape2mpointlst(cutline_ds, raster_proj, feature_name): 427 | p_pix = pix_tr.transform(points, inv=True) 428 | mpoly.append(','.join(['%r %r' % (p[0], p[1]) for p in p_pix])) 429 | cutline = 'MULTIPOLYGON(%s)' % ','.join(['((%s))' % poly for poly in mpoly]) if mpoly else None 430 | ld('cutline', cutline) 431 | return cutline 432 | 433 | def elem0(doc, id): 434 | return doc.getElementsByTagName(id)[0] 435 | 436 | def read_tilemap(src_dir): 437 | #src_dir = src_dir.decode('utf-8', 'ignore') 438 | src = os.path.join(src_dir, 'tilemap.json') 439 | 440 | try: 441 | with open(src, 'r') as f: 442 | tilemap = json.load(f) 443 | 444 | # convert tilesets keys to int 445 | tilesets = tilemap['tilesets'] 446 | tilemap['tilesets'] = dict([ (int(key), val) for key, val in tilesets.items()]) 447 | except ValueError: # No JSON object could be decoded 448 | raise Exception('Invalid tilemap file: %s' % src) 449 | return tilemap 450 | 451 | def write_tilemap(dst_dir, tilemap): 452 | f = os.path.join(dst_dir, 'tilemap.json') 453 | if os.path.exists(f): 454 | os.remove(f) 455 
| with open(f, 'w') as f: 456 | json.dump(tilemap, f, indent=2) 457 | 458 | def link_or_copy(src, dst): 459 | try: 460 | if os.path.exists(dst): 461 | os.remove(dst) 462 | os.link(src, dst) 463 | except (OSError, AttributeError): # non POSIX or cross-device link? 464 | try: 465 | shutil.copy(src, dst) 466 | except shutil.Error, shutil_exception: 467 | raise shutil_exception 468 | 469 | def copy_viewer(dest): 470 | for f in ['viewer-google.html', 'viewer-openlayers.html']: 471 | src = os.path.join(data_dir(), f) 472 | dst = os.path.join(dest, f) 473 | link_or_copy(src, dst) # hard links as FF dereferences softlinks 474 | 475 | def read_transparency(src_dir): 476 | try: 477 | with open(os.path.join(src_dir, 'transparency.json'), 'r') as f: 478 | transparency = json.load(f) 479 | except: 480 | ld("transparency cache load failure") 481 | transparency = {} 482 | return transparency 483 | 484 | def write_transparency(dst_dir, transparency): 485 | try: 486 | with open(os.path.join(dst_dir, 'transparency.json'), 'w') as f: 487 | json.dump(transparency, f, indent=0) 488 | except: 489 | logging.warning("transparency cache save failure") 490 | 491 | type_map = ( 492 | ('image/png', '.png', '\x89PNG\x0D\x0A\x1A\x0A'), 493 | ('image/jpeg', '.jpg', '\xFF\xD8\xFF\xE0'), 494 | ('image/jpeg', '.jpeg', '\xFF\xD8\xFF\xE0'), 495 | ('image/gif', '.gif', 'GIF89a'), 496 | ('image/gif', '.gif', 'GIF87a'), 497 | ('image/webp', '.webp', 'RIFF'), 498 | ) 499 | 500 | def type_ext_from_buffer(data): 501 | for mime_type, ext, magic in type_map: 502 | if data[:len(magic)] == magic : # data could be a buffer 503 | if magic == 'RIFF' and data[8:12] != 'WEBP': 504 | continue 505 | return mime_type, ext 506 | error('Cannot determine image type in a buffer:', data[:20]) 507 | raise KeyError('Cannot determine image type in a buffer') 508 | 509 | def ext_from_buffer(buf): 510 | return type_ext_from_buffer(buf)[1] 511 | 512 | def mime_from_ext(ext_to_find): 513 | for mime_type, ext, magic in type_map: 514 | if ext_to_find == ext: 515 | return mime_type 516 | else: 517 | error('Cannot determine image MIME type') 518 | raise KeyError('Cannot determine image MIME type') 519 | 520 | def ext_from_mime(mime_to_find): 521 | for mime_type, ext, magic in type_map: 522 | if mime_to_find == mime_type: 523 | return ext 524 | else: 525 | error('Cannot determine image MIME type') 526 | raise KeyError('Cannot determine image MIME type') 527 | 528 | def ext_from_file(path): 529 | with file(path, "r") as f: 530 | buf = f.read(512) 531 | return ext_from_buffer(buf) 532 | -------------------------------------------------------------------------------- /tilers_tools/ozf_decoder.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*- 3 | 4 | # 2011-01-27 11:32:42 5 | 6 | ############################################################################### 7 | # Copyright (c) 2011, Vadim Shlyakhov 8 | # 9 | # Permission is hereby granted, free of charge, to any person obtaining a 10 | # copy of this software and associated documentation files (the "Software"), 11 | # to deal in the Software without restriction, including without limitation 12 | # the rights to use, copy, modify, merge, publish, distribute, sublicense, 13 | # and/or sell copies of the Software, and to permit persons to whom the 14 | # Software is furnished to do so, subject to the following conditions: 15 | # 16 | # The above copyright notice and this permission notice shall be included 17 | # in all copies
or substantial portions of the Software. 18 | # 19 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 20 | # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 21 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL 22 | # THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 23 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING 24 | # FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER 25 | # DEALINGS IN THE SOFTWARE. 26 | ############################################################################### 27 | 28 | from __future__ import with_statement 29 | from __future__ import print_function 30 | 31 | import sys 32 | import os 33 | import os.path 34 | import math 35 | import shutil 36 | import logging 37 | from optparse import OptionParser 38 | #from PIL import Image 39 | import zlib 40 | import mmap 41 | import operator 42 | import struct 43 | import glob 44 | 45 | from tiler_functions import * 46 | 47 | class OzfImg(object): 48 | 49 | # see http://www.globalmapperforum.com/forums/suggestion-box/3182-map-support-oziexplorer-ozfx3.html 50 | # or its copy at http://svn.osgeo.org/gdal/sandbox/klokan/ozf/ozf-binary-format-description.txt 51 | 52 | #ozf_header_1: 53 | # short magic; // set it to 0x7780 for ozfx3 and 0x7778 for ozf2 54 | # long locked; // if set to 1, then ozi refuses to export the image (doesn't seem to work for ozfx3 files though); just set to 0 55 | # short tile_width; // set always to 64 56 | # short version; // set always to 1 57 | # long old_header_size;// set always to 0x436; this has something to do with files having magic 0x7779 58 | # // (haven't seen any of those, they are probably rare, but ozi has code to open them) 59 | hdr1_fmt='
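# --- Illustrative aside, not part of the original ozf_decoder.py listing ---
# A minimal sketch of how a header laid out as described in the comments above
# (short magic, long locked, short tile_width, short version, long
# old_header_size) could be unpacked with the standard struct module. The
# format string, struct name and function below are assumptions made for
# illustration only; the file's actual hdr1_fmt value is truncated in this
# listing and is not reproduced here.

import struct

OZF_HDR1 = struct.Struct('<hlhhl')  # assumed little-endian layout, 14 bytes, no padding

def read_ozf_header1(f):
    # read the fixed-size header from an open binary file object and return
    # the fields keyed by the names used in the field description above
    magic, locked, tile_width, version, old_header_size = OZF_HDR1.unpack(
        f.read(OZF_HDR1.size))
    return dict(magic=magic, locked=locked, tile_width=tile_width,
                version=version, old_header_size=old_header_size)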