├── LICENSE
├── README.md
├── geo-png-db-processing
│   ├── README.md
│   ├── examples
│   │   └── rasterize-tiles
│   │       ├── processVector.js
│   │       └── rasterizerRunner.js
│   ├── package.json
│   ├── src
│   │   ├── Utils.js
│   │   ├── encodings
│   │   │   ├── GeoTIFF.js
│   │   │   ├── PngDbEncodings.js
│   │   │   ├── PyramidTiles.js
│   │   │   ├── TileEncodings.js
│   │   │   └── _test
│   │   │       └── TileEncodings.js
│   │   ├── filer
│   │   │   ├── AFiler.js
│   │   │   ├── NodeFiler.js
│   │   │   └── S3Filer.js
│   │   ├── gis
│   │   │   ├── GeoJson.js
│   │   │   ├── PbfTiles.js
│   │   │   ├── ShapeFileLoader.js
│   │   │   └── ShpToPbfTiles.js
│   │   ├── helpers
│   │   │   └── QueueProcessor.js
│   │   ├── metadata
│   │   │   ├── CompileData.js
│   │   │   ├── CompileDataForTile.js
│   │   │   └── KeySumStore.js
│   │   ├── proto
│   │   │   ├── BlockRecord.js
│   │   │   ├── BlockRecord.proto
│   │   │   ├── LodesRecord.js
│   │   │   ├── LodesRecord.proto
│   │   │   └── Notes.txt
│   │   └── rasterize
│   │       ├── RasterTileSet.js
│   │       ├── RasterizeInPoly.js
│   │       ├── RasterizePoints.js
│   │       ├── Rasterizer.js
│   │       ├── StatusTiles.js
│   │       ├── SteppedColors.js
│   │       ├── TileRenderer.js
│   │       ├── _test
│   │       │   └── RasterizeTileSet.js
│   │       └── algorithms
│   │           └── ScanLinePolygonFill.js
│   └── yarn.lock
├── img
│   ├── adaptive-quads.svg
│   ├── per-pixel-sampling-strategy.svg
│   ├── png-db-example.png
│   ├── pseudo-code.svg
│   ├── pyramid-comparison.svg
│   ├── pyramid.svg
│   ├── quad-tree-start-stops.svg
│   ├── query-sums.svg
│   ├── record-lookups.svg
│   ├── tile-index-nesting.svg
│   └── tile-structure.svg
└── specifications
    ├── basic
    │   ├── 1.0
    │   │   └── README.md
    │   └── draft
    │       └── 1.1
    │           └── README.md
    └── record-lookup
        └── 0.1
            └── README.md
/LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2021 Sasaki Associates 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # geo-png-db 2 | GeoPngDB is a tiled geospatial data format capable of representing global-scale data sets at high resolution in a format natively supported by web browsers. 3 | 4 | ## Concept 5 | Data tiles are images that contain data encoded in their pixels. Tiled using the standard web tile scheme, they can provide instant access to spatial data at a practically unlimited scale. As interactive data visualization and global geospatial analysis become more mainstream, there is an increasing need to quickly manipulate and represent massive datasets within the browser.
The scale of these datasets often makes vector formats inefficient and unable to deliver a smooth user experience. 6 | 7 | Web-based "slippy" maps already take advantage of tiled imagery, and we routinely load and navigate datasets composed of trillions of data points in our browsers. We believe data tiles can provide a solution for browsing and using data effortlessly at any scale. The "GeoPngDB" format builds on existing solutions to provide a browser-friendly way of encoding raw data for consumption by web-based tools. 8 | 9 | ## Schemas 10 | 11 | There are currently two schemas for GeoPngDB: 12 | 13 | ### Basic 14 | The basic schema represents spatial data directly in each pixel and provides metadata used to interpret the data when reading the value. 15 | 16 | #### Current Version 1.0 17 | The current version of the basic GeoPngDB specification is 1.0. See the [GeoPngDB Basic Specification 1.0](specifications/basic/1.0/README.md) folder for detailed specs. 18 | 19 | A working draft with additional proposed features can be found in the [draft folder](specifications/basic/draft/1.1/README.md). 20 | 21 | ### Record Lookup 22 | The record lookup schema uses spatial reference images along with a [png-db](https://github.com/sasakiassociates/png-db) record database to represent large geospatial datasets efficiently. Unlike the basic schema, which can scale indefinitely without workarounds, the record lookup schema requires some additional considerations to achieve massive scale. 23 | 24 | #### Current Version 0.1 (beta) 25 | The current version of the record-lookup GeoPngDB specification is 0.1 (beta). See the [GeoPngDB Record Lookup Specification 0.1](specifications/record-lookup/0.1/README.md) folder for detailed specs. 26 | 27 | ## Motivation 28 | Raster formats are currently an underappreciated resource for sharing raw data for web-based tools. Thankfully, solutions like Cloud Optimized GeoTIFFs (COGs) are poised to bring large-scale raster datasets to the mainstream for consumption in web-connected tools. However, COGs and other current solutions require server-side processing as well as more intensive client-side processing, and they don't take advantage of many aspects of web-based techniques and cloud-based architectures. 29 | 30 | ### Advantages of GeoPngDB 31 | 32 | #### Inexpensive Hosting 33 | For many public datasets, hosting costs are a major barrier. GeoPngDB tiles can be hosted statically just like any other web map tile set. This means massive open data sets can be shared without major cost implications. 34 | 35 | #### Browser-native image decoding 36 | Unlike the TIFF format, browsers understand PNGs natively. Visualizations can render tiles immediately using GPU shaders without any script-based decoding. Alternatively, conventional canvas-based processing can be used to decode values for use in JavaScript (see the sketch below). 37 | 38 | #### PNG compression 39 | PNG compression is lossless (which is essential for accurately representing data), but it is also very efficient at compressing geospatial data. 40 | 41 | #### True Real-Time Rendering 42 | Removing server requirements and keeping raw data in an image format on the client side supports smooth updates when rendered on the GPU with shaders. 43 | 44 | #### Dense Queries 45 | Keeping the information as raw data allows massive datasets to be combined on the fly in a web browser. This enables novel combinations of data that are not possible when the data is flattened into a styled graphic image or raw aerial imagery.
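To illustrate the canvas-based decoding path mentioned under "Browser-native image decoding" above, here is a minimal sketch of reading a single pixel from a basic-schema tile in the browser. It assumes the pixel encoding used by the processing code in this repository (a 24-bit integer packed into the RGB channels, adjusted by the field's `precision` and `range` metadata, with zero alpha meaning no data); the tile URL and field metadata are hypothetical examples.

```js
// Hypothetical field metadata, published alongside the tile set.
const field = {precision: 1000, range: {min: 0}};

// Decode one RGBA pixel into a data value (null = no data).
function decodePixel(r, g, b, a) {
    if (a === 0) return null;
    let val = (r << 16) | (g << 8) | b;       // 24-bit integer packed into RGB
    if (field.precision) val /= field.precision;
    if (field.range) val += field.range.min;  // values are stored as offsets from range.min
    return val;
}

const img = new Image();
img.crossOrigin = 'anonymous';                // required for getImageData on cross-origin tiles
img.onload = () => {
    const canvas = document.createElement('canvas');
    canvas.width = img.width;
    canvas.height = img.height;
    const ctx = canvas.getContext('2d');
    ctx.drawImage(img, 0, 0);
    const {data} = ctx.getImageData(0, 0, img.width, img.height);
    const idx = (128 * img.width + 128) * 4;  // byte offset of the pixel at x=128, y=128
    console.log(decodePixel(data[idx], data[idx + 1], data[idx + 2], data[idx + 3]));
};
img.src = 'https://example.com/tiles/12/1205/1539.png'; // hypothetical tile URL
```

The same decoding can run entirely in a fragment shader for rendering, which is what enables the real-time updates described above.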
46 | 47 | #### Unlimited Scale 48 | There are no limits to the size of dataset that can be hosted. Consideration should be given to practical storage constraints and to the zoom-level caps of slippy-map tools, but the specification itself imposes no limits. 49 | 50 | #### Perfect Alignment 51 | Standardizing on WGS 84 ([EPSG:4326](https://epsg.io/4326)) ensures all pixels are uniquely addressed and align perfectly across datasets. WGS 84 does present some challenges, such as the physical size represented by each pixel varying with latitude, but we feel this is a worthwhile compromise that presents few practical limitations for web-based use. 52 | 53 | #### Reliable Aggregation / Nesting 54 | Unlike vector datasets that require aggressive simplification and loss of fidelity when zooming out, raster "pyramids" allow a simple, reliable method for accurately representing the aggregate data "beneath" each pixel. Typical aggregations include MIN, MAX, MEAN, and SUM, which can be computed directly from the tile one zoom level below, as illustrated in the diagram. Other statistical aggregations are possible, but they require more complex processing where the original data must be available when producing each level. 55 | 56 | ![Illustration showing how values for 4 pixels from one zoom level are summed into a single pixel at the level above.](./img/pyramid.svg) 57 | 58 | Seamless, reliable aggregation makes GeoPngDB a solution for representing data originating in both vector and raster datasets. Raster aggregation from vector geometries maintains the highest spatial fidelity possible, as every on-screen pixel accurately represents all values from higher zoom levels. By comparison, vector-based aggregation uses spatially inconsistent nesting, which results in unexpected jumps in either analysis or representation. For example, consider the spatial inconsistencies when the NYC population gets distributed across all of NY state in a dot-density map. 59 | 60 | #### Powerful Parallelization 61 | Because each tile is represented independently as a 256x256 image, very lightweight processing can be used when generating tiles. This allows tiles to be processed in a massively parallel fashion using cloud-based architectures. Truly massive datasets can be processed in manageable chunks on affordable microservices (such as AWS Lambda or Google Cloud Functions). 62 | 63 | ## Background 64 | Data tiles are not new, and this work draws on a number of sources, most notably the work done by Mapzen on their [Terrarium tile format](https://github.com/tilezen/joerd/blob/master/docs/formats.md#terrarium). 65 | 66 | The schema for encoding data in each pixel is based on [png-db](https://github.com/sasakiassociates/png-db). 67 | 68 | ## Implementations 69 | 70 | * [geo-png-db-processing](./geo-png-db-processing/README.md): utilities for processing PBF, GeoJson and Shapefiles into GeoPngDB tiles. 71 | * [QGIS Plugin](https://github.com/sasakiassociates/qgis-geo-png-db) for exporting the GeoPngDB format from QGIS. 72 | * The [Zaru](https://github.com/sasakiassociates/zaru) front-end visualization tool for consuming GeoPngDB tiles. 73 | 74 | ## Similar Solutions 75 | 76 | ### Terrarium Tiles 77 | Mapzen's Terrarium tiles use the exact same WGS 84 tile scheme to represent data. Just like GeoPngDB, the Terrarium format encodes values in the RGB channels of PNG images, but the fixed integer encoding used to represent global elevations is not easily adapted to other use cases. GeoPngDB expands on the Terrarium approach to broaden its capabilities and expand the potential uses.
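To make the contrast concrete, here is a sketch comparing Terrarium's fixed decoding formula (as documented in the Tilezen formats page linked above) with the metadata-driven decoding used by the GeoPngDB basic schema. Neither function is lifted from a specification, and `field` is a hypothetical metadata object.

```js
// Terrarium: a fixed encoding for one quantity (elevation in meters).
function terrariumElevation(r, g, b) {
    return (r * 256 + g + b / 256) - 32768;
}

// GeoPngDB basic schema: the same RGB channels, but interpreted through
// per-dataset metadata, so any numeric field can be encoded.
function geoPngDbValue(field, r, g, b, a) {
    if (a === 0) return null;                    // transparent pixel = no data
    let val = (r << 16) | (g << 8) | b;
    if (field.precision) val /= field.precision; // e.g. 1000 preserves 3 decimal places
    if (field.range) val += field.range.min;     // values stored as offsets from the minimum
    return val;
}
```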
78 | 79 | ### Cloud Optimized GeoTIFFs (COGs) 80 | [COGs](https://www.cogeo.org/) provide an efficient solution for serving large GeoTIFFs via web APIs. 81 | * Like GeoPngDB tiles, COGs can be hosted cheaply on static storage (such as AWS S3); however, to be used in web applications, a backend web service is required to serve the data and respond to queries. 82 | * It is not possible to achieve real-time feedback with COGs, as all queries must be handled by a backend API. 83 | * COGs are hosted as single TIFF files that can be impressively large, but they are unlikely to be suitable for global high-resolution datasets. Practical limitations prevent COGs from scaling to the multi-petabyte data density that can feasibly be achieved with static tiles. 84 | 85 | ### LERC Data Tiles 86 | [ESRI LERC](https://github.com/Esri/lerc) is "an open-source image or raster format which supports rapid encoding and decoding for any pixel type (not just RGB or Byte)". LERC is not natively understood by common browsers and requires translation; however, decoding libraries are available for a number of languages. For data tiles in a web tool, the compression advantages offered by LERC are likely offset by the CPU cost of decoding the files. GeoPngDB avoids this overhead by using a native format that browsers already understand. 87 | 88 | ### Open Data Cube 89 | [Open Data Cube](https://github.com/opendatacube) "presents a common analytical framework composed of a series of data structures and tools which facilitate the organization and analysis of large gridded data collections". Like COGs, it requires backend services, and it allows sophisticated query and analysis capabilities. GeoPngDB could provide faster, cheaper access to high-resolution outputs from aerial image analysis. 90 | 91 | ### PBF Vector Tiles 92 | Vector tiles in the PBF format typically use the same WGS 84 tile scheme. Like GeoPngDB, they are well suited to take advantage of GPU rendering on the client side and can produce beautiful graphics in real time. Vector data is often more efficient than raster data, particularly when a vector shape spans many pixels, and the PBF format offers good compression along with fast load times. Vector tiles are an excellent solution for cartography, where the readability of the map requires less detail to be rendered at lower zoom levels. 93 | 94 | Vector tiles can (and do) serve millions of vector shapes (such as Census blocks or road center-lines). However, they can only be served and rendered effectively when zoomed in close enough to limit the number of shapes that must be drawn. For data representation, we often want to represent data at the level at which it exists: for example, block-level U.S. Census data (around 7 million objects). There are no feasible vector-based solutions for representing millions of vector shapes in a web browser in real time. Many solutions exist to simplify vector datasets to alleviate this issue (such as [shape simplification algorithms](https://en.wikipedia.org/wiki/Ramer%E2%80%93Douglas%E2%80%93Peucker_algorithm) for different zoom levels), but all come with compromises in visual fidelity and limits on practical zoom levels. 95 | 96 | Both the data volume and the rendering overhead can make high-fidelity vector representation of large data sets infeasible at the spatial bounds of the full data set.
By comparison, raster solutions like GeoPngDB always load the appropriate number of pixels to represent the map on-screen and can seamlessly represent data in a pixel-perfect fashion at any scale. In addition, the simplicity and standardization of raster cells makes them well suited to real-time combination of multiple datasets, regardless of whether the source data is raster or vector. 97 | 98 | Most vector tiles are well suited to being rendered into GeoPngDB tiles, as they use the same tiling scheme. A hybrid approach may be used to allow accurate global-scale representation of the vector tiles (using GeoPngDB) while also taking advantage of vector-enabled interactivity and rendering effects at greater zoom levels. This experience can be seamless and invisible to the end user. 99 | 100 | ### Grid Viz 101 | [Grid Viz](https://eurostat.github.io/gridviz/) uses CSV files for data and a three.js point cloud for visualization. This is essentially a vector-based representation of raster grids, allowing fast representation of large datasets. All rows must be loaded as CSV data, which imposes practical record limits for big data, but three.js point clouds allow animation and other techniques not easily achieved through direct raster representation. The approach is quite different, but it is included here because the capabilities may appear superficially similar. 102 | 103 | ### imMens 104 | [imMens: Real-time Visual Querying of Big Data](https://github.com/uwdata/imMens) ([PDF paper](https://sfu-db.github.io/cmpt884-fall16/Papers/immens.pdf)) is a very similar solution that encodes data as image tiles and queries and displays them on the GPU. imMens offers sophisticated solutions for aggregation and binning, and support for five-dimensional data cubes. The open-source library is not being actively developed, but certain aspects of its approach could be useful in expanding the capabilities of "record-lookup" datasets. 105 | -------------------------------------------------------------------------------- /geo-png-db-processing/README.md: -------------------------------------------------------------------------------- 1 | # geo-png-db-processing 2 | Utilities for processing PBF, GeoJson and Shapefiles into GeoPngDB tiles.
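A minimal usage sketch, adapted from `examples/rasterize-tiles/processVector.js` below; the vector-tile directory and the tile address array are hypothetical examples.

```js
// Rasterize a single vector tile into GeoPngDB output (sketch).
const RasterTileSet = require('./src/rasterize/RasterTileSet');
const NodeFiler = require('./src/filer/NodeFiler');

const rasterTileSet = new RasterTileSet(new NodeFiler(), '../vector_tiles/data/', true);
rasterTileSet.rasterizeTile([1205, 1539, 12], false, () => {
    // per-step progress callback (unused in this sketch)
}, () => {
    console.log('tile rasterized');
});
```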
3 | 4 | 5 | 6 | -------------------------------------------------------------------------------- /geo-png-db-processing/examples/rasterize-tiles/processVector.js: -------------------------------------------------------------------------------- 1 | const RasterTileSet = require('../../src/rasterize/RasterTileSet'); 2 | const NodeFiler = require('../../src/filer/NodeFiler'); 3 | const filer = new NodeFiler(); 4 | 5 | process.on('message', (data) => { 6 | const rasterTileSet = new RasterTileSet(filer, data.pbfDir, true); 7 | rasterTileSet.rasterizeTile(data.tile, false, () => { 8 | }, () => { 9 | process.send({success: true}); 10 | }); 11 | }); 12 | -------------------------------------------------------------------------------- /geo-png-db-processing/examples/rasterize-tiles/rasterizerRunner.js: -------------------------------------------------------------------------------- 1 | const fs = require('fs'); 2 | const async = require('async'); 3 | const {fork} = require('child_process'); 4 | 5 | //similar idea here: http://ngageoint.github.io/geopackage-js/jsdoc/tiles_features_index.js.html 6 | 7 | const pbfDir = '../vector_tiles/data/'; 8 | 9 | let progress = {done: 0, num: 0}; 10 | fs.readdir(pbfDir, function (err, items) { 11 | const calls = []; 12 | 13 | for (let i = 0; i < items.length; i++) { 14 | let file = items[i]; 15 | const bits = file.split('.'); 16 | if (bits[bits.length - 1] === 'geojson') { 17 | continue; 18 | } 19 | const tileId = bits[0]; 20 | const tile = tileId.split('_').map((v) => parseInt(v)); 21 | 22 | progress.num++; 23 | calls.push((done) => { 24 | const forked = fork('processVector.js'); 25 | forked.on('message', (msg) => { 26 | if (msg.success) { 27 | progress.done++; 28 | console.log('--------------------------------------------------------- ' + progress.done + '/' + progress.num); 29 | done(); 30 | } 31 | forked.kill(); 32 | }); 33 | forked.send({pbfDir: pbfDir, tile: tile}); 34 | }); 35 | } 36 | 37 | let timerId = calls.length + ' calls'; 38 | console.time(timerId); 39 | async.parallelLimit(calls, 4, function () { 40 | console.log('ALL DONE'); 41 | console.timeEnd(timerId); 42 | }); 43 | 44 | }); 45 | 46 | 47 | 48 | 49 | 50 | 51 | -------------------------------------------------------------------------------- /geo-png-db-processing/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "geo-png-db-processing", 3 | "version": "1.0.0", 4 | "description": "Utilities for processing PBF, GeoJson and Shapefiles into GeoPngDB tiles.", 5 | "main": "index.js", 6 | "repository": "https://github.com/sasakiassociates/geo-png-db", 7 | "author": "kgoulding@sasaki.com", 8 | "license": "MIT", 9 | "dependencies": { 10 | "@mapbox/tilebelt": "^1.0.2", 11 | "@turf/points-within-polygon": "^6.3.0", 12 | "async": "^3.2.0", 13 | "geobuf": "^3.0.2", 14 | "jimp": "^0.16.1", 15 | "mkdirp": "^1.0.4", 16 | "pbf": "^3.2.1", 17 | "proj4": "^2.7.2", 18 | "unzipper": "^0.10.11" 19 | }, 20 | "optionalDependencies": { 21 | "aws-sdk": "^2.874.0", 22 | "shapefile": "^0.6.6" 23 | } 24 | } 25 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/Utils.js: -------------------------------------------------------------------------------- 1 | const fs = require('fs'); 2 | const path = require('path'); 3 | const async = require('async'); 4 | const mkdirp = require('mkdirp'); 5 | 6 | /** 7 | * Creates a new instance of Utils. 8 | * @class 9 | * @returns An instance of Utils. 
10 | * @example 11 | * var instance = new Utils(); 12 | */ 13 | class Utils { 14 | 15 | static ensureDir(dir, callback) { 16 | mkdirp(dir, function (dirErr) { 17 | if (dirErr) { 18 | console.log(dirErr); 19 | } else { 20 | callback(dir); 21 | } 22 | }); 23 | }; 24 | 25 | static getFilesDeep(dir, callback) { 26 | const walk = function (dir, done) { 27 | let results = []; 28 | fs.readdir(dir, function (err, list) { 29 | if (err) return done(err); 30 | let pending = list.length; 31 | if (!pending) return done(null, results); 32 | list.forEach(function (file) { 33 | file = path.resolve(dir, file); 34 | fs.stat(file, function (err, stat) { 35 | if (stat && stat.isDirectory()) { 36 | walk(file, function (err, res) { 37 | results = results.concat(res); 38 | if (!--pending) done(null, results); 39 | }); 40 | } else { 41 | results.push(file); 42 | if (!--pending) done(null, results); 43 | } 44 | }); 45 | }); 46 | }); 47 | }; 48 | walk(dir, callback); 49 | }; 50 | 51 | static makeGeoJson() { 52 | return { 53 | "crs": { 54 | "type": "name", 55 | "properties": { 56 | "name": "urn:ogc:def:crs:EPSG::4326" 57 | } 58 | }, 59 | "type": "FeatureCollection", 60 | "features": [] 61 | }; 62 | }; 63 | 64 | static getFiles(dir, deep, callback) { 65 | if (deep) { 66 | Utils.getFilesDeep(dir, callback); 67 | } else { 68 | fs.readdir(dir, function (err, filenames) { 69 | callback(err, filenames.map((filename) => { 70 | return path.join(dir, filename) 71 | })); 72 | }); 73 | } 74 | }; 75 | 76 | static walkFiles(dir, deep, eachFile, parallelLimit) { 77 | return new Promise(function (resolve, reject) { 78 | let cnt = 0; 79 | Utils.getFiles(dir, deep, function (err, filenames) { 80 | if (err) { 81 | reject(err); 82 | return; 83 | } 84 | const fileFn = function (filename, done) { 85 | eachFile({ 86 | fullName: filename, 87 | name: path.basename(filename), 88 | extension: path.extname(filename) 89 | }, done, cnt++, filenames.length); 90 | }; 91 | const doneFn = function (err) { 92 | if (err) { 93 | reject(err); 94 | } else { 95 | resolve(filenames.length); 96 | } 97 | }; 98 | if (parallelLimit > 0) { 99 | const calls = []; 100 | filenames.forEach(function (filename, i) { 101 | calls.push(function (done) { 102 | fileFn(filename, done); 103 | }); 104 | }); 105 | async.parallelLimit(calls, parallelLimit, doneFn); 106 | } else { 107 | async.each(filenames, fileFn, doneFn); 108 | } 109 | }); 110 | }); 111 | }; 112 | 113 | static long2tile(lon, zoom) { 114 | return (lon + 180) / 360 * Math.pow(2, zoom); 115 | } 116 | 117 | static lat2tile(lat, zoom) { 118 | return (1 - Math.log(Math.tan(lat * Math.PI / 180) + 1 / Math.cos(lat * Math.PI / 180)) / Math.PI) / 2 * Math.pow(2, zoom); 119 | } 120 | 121 | static tile2long(x, z) { 122 | return (x / Math.pow(2, z) * 360 - 180); 123 | } 124 | 125 | static tile2lat(y, z) { 126 | const n = Math.PI - 2 * Math.PI * y / Math.pow(2, z); 127 | return (180 / Math.PI * Math.atan(0.5 * (Math.exp(n) - Math.exp(-n)))); 128 | } 129 | 130 | static toLatLon(tile, x, y) { 131 | const z = tile[2]; 132 | return { 133 | x: Utils.tile2long(tile[0] + x, z), 134 | y: Utils.tile2lat(tile[1] + y, z) 135 | } 136 | } 137 | 138 | static latLonToTile(lat, lon, zoom) { 139 | const tileX = Utils.long2tile(lon, zoom); 140 | const tileY = Utils.lat2tile(lat, zoom); 141 | return [Math.floor(tileX), Math.floor(tileY), zoom] 142 | } 143 | 144 | static containsPoint(parent, point) { 145 | return point.x <= parent.x.max && 146 | point.x >= parent.x.min && 147 | point.y >= parent.y.min && 148 | point.y <= parent.y.max; 149 
| }; 150 | 151 | static hexToRgb(hex) { 152 | const result = /^#?([a-f\d]{2})([a-f\d]{2})([a-f\d]{2})$/i.exec(hex); 153 | return result ? { 154 | r: parseInt(result[1], 16), 155 | g: parseInt(result[2], 16), 156 | b: parseInt(result[3], 16) 157 | } : null; 158 | } 159 | 160 | static padLeft(str, num) { 161 | return str.padStart(num, '0'); 162 | }; 163 | 164 | static toGID(numericStr) {//converts to the G.... ID format used by IPUMS for data tables from numeric id 165 | // 250277613003016 (numericStr) 166 | //G25002707613003016 167 | 168 | // 25| 027| 761300|3016 169 | //G25|0027|0761300|3016 170 | 171 | const state = numericStr.substr(0, 2); 172 | const county = numericStr.substr(2, 3); 173 | const tract = numericStr.substr(5, 6); 174 | const block = numericStr.substr(11, 4); 175 | 176 | return `G${state}0${county}0${tract}${block}`; 177 | } 178 | 179 | static fromGID(gID) {//converts from G... id to the numeric ID used by Census GIS shapefiles 180 | //000|0000|0001111|1111 181 | //012|3456|7890123|4567 182 | //1,2| 4,3| 8,6 |14,4 183 | //G25|0027|0761300|3016 184 | // 25| 027| 761300|3016 185 | return gID.substr(1, 2) + gID.substr(4, 3) + gID.substr(8, 6) + gID.substr(14, 4) 186 | } 187 | 188 | } 189 | 190 | module.exports = Utils; -------------------------------------------------------------------------------- /geo-png-db-processing/src/encodings/GeoTIFF.js: -------------------------------------------------------------------------------- 1 | //https://raw.githubusercontent.com/m-schuetz/GeoTIFF/master/dist/GeoTIFF.js 2 | class EnumItem{ 3 | constructor(object){ 4 | for(let key of Object.keys(object)){ 5 | this[key] = object[key]; 6 | } 7 | } 8 | 9 | inspect(){ 10 | return `Enum(${this.name}: ${this.value})`; 11 | } 12 | } 13 | 14 | class Enum{ 15 | 16 | constructor(object){ 17 | this.object = object; 18 | 19 | for(let key of Object.keys(object)){ 20 | let value = object[key]; 21 | 22 | if(typeof value === "object"){ 23 | value.name = key; 24 | }else{ 25 | value = {name: key, value: value}; 26 | } 27 | 28 | this[key] = new EnumItem(value); 29 | } 30 | } 31 | 32 | fromValue(value){ 33 | for(let key of Object.keys(this.object)){ 34 | if(this[key].value === value){ 35 | return this[key]; 36 | } 37 | } 38 | 39 | throw new Error(`No enum for value: ${value}`); 40 | } 41 | 42 | } 43 | 44 | const Endianness = new Enum({ 45 | LITTLE: "II", 46 | BIG: "MM", 47 | }); 48 | 49 | const Type = new Enum({ 50 | BYTE: {value: 1, bytes: 1}, 51 | ASCII: {value: 2, bytes: 1}, 52 | SHORT: {value: 3, bytes: 2}, 53 | LONG: {value: 4, bytes: 4}, 54 | RATIONAL: {value: 5, bytes: 8}, 55 | SBYTE: {value: 6, bytes: 1}, 56 | UNDEFINED: {value: 7, bytes: 1}, 57 | SSHORT: {value: 8, bytes: 2}, 58 | SLONG: {value: 9, bytes: 4}, 59 | SRATIONAL: {value: 10, bytes: 8}, 60 | FLOAT: {value: 11, bytes: 4}, 61 | DOUBLE: {value: 12, bytes: 8}, 62 | }); 63 | 64 | const Tag = new Enum({ 65 | IMAGE_WIDTH: 256, 66 | IMAGE_HEIGHT: 257, 67 | BITS_PER_SAMPLE: 258, 68 | COMPRESSION: 259, 69 | PHOTOMETRIC_INTERPRETATION: 262, 70 | STRIP_OFFSETS: 273, 71 | ORIENTATION: 274, 72 | SAMPLES_PER_PIXEL: 277, 73 | ROWS_PER_STRIP: 278, 74 | STRIP_BYTE_COUNTS: 279, 75 | X_RESOLUTION: 282, 76 | Y_RESOLUTION: 283, 77 | PLANAR_CONFIGURATION: 284, 78 | RESOLUTION_UNIT: 296, 79 | SOFTWARE: 305, 80 | COLOR_MAP: 320, 81 | SAMPLE_FORMAT: 339, 82 | MODEL_PIXEL_SCALE: 33550, // [GeoTIFF] TYPE: double N: 3 83 | MODEL_TIEPOINT: 33922, // [GeoTIFF] TYPE: double N: 6 * NUM_TIEPOINTS 84 | GEO_KEY_DIRECTORY: 34735, // [GeoTIFF] TYPE: short N: >= 4 85 | 
GEO_DOUBLE_PARAMS: 34736, // [GeoTIFF] TYPE: short N: variable 86 | GEO_ASCII_PARAMS: 34737, // [GeoTIFF] TYPE: ascii N: variable 87 | }); 88 | 89 | const typeMapping = new Map([ 90 | [Type.BYTE, Uint8Array], 91 | [Type.ASCII, Uint8Array], 92 | [Type.SHORT, Uint16Array], 93 | [Type.LONG, Uint32Array], 94 | [Type.RATIONAL, Uint32Array], 95 | [Type.SBYTE, Int8Array], 96 | [Type.UNDEFINED, Uint8Array], 97 | [Type.SSHORT, Int16Array], 98 | [Type.SLONG, Int32Array], 99 | [Type.SRATIONAL, Int32Array], 100 | [Type.FLOAT, Float32Array], 101 | [Type.DOUBLE, Float64Array], 102 | ]); 103 | 104 | class IFDEntry{ 105 | 106 | constructor(tag, type, count, offset, value){ 107 | this.tag = tag; 108 | this.type = type; 109 | this.count = count; 110 | this.offset = offset; 111 | this.value = value; 112 | } 113 | 114 | } 115 | 116 | class Image{ 117 | 118 | constructor(){ 119 | this.width = 0; 120 | this.height = 0; 121 | this.buffer = null; 122 | this.metadata = []; 123 | } 124 | 125 | } 126 | 127 | class Reader{ 128 | 129 | constructor(){ 130 | 131 | } 132 | 133 | static read(data){ 134 | 135 | let endiannessTag = String.fromCharCode(...Array.from(data.slice(0, 2))); 136 | let endianness = Endianness.fromValue(endiannessTag); 137 | 138 | let tiffCheckTag = data.readUInt8(2); 139 | 140 | if(tiffCheckTag !== 42){ 141 | throw new Error("not a valid tiff file"); 142 | } 143 | 144 | let offsetToFirstIFD = data.readUInt32LE(4); 145 | 146 | console.log("offsetToFirstIFD", offsetToFirstIFD); 147 | 148 | let ifds = []; 149 | let IFDsRead = false; 150 | let currentIFDOffset = offsetToFirstIFD; 151 | let i = 0; 152 | while(IFDsRead || i < 100){ 153 | 154 | console.log("currentIFDOffset", currentIFDOffset); 155 | let numEntries = data.readUInt16LE(currentIFDOffset); 156 | let nextIFDOffset = data.readUInt32LE(currentIFDOffset + 2 + numEntries * 12); 157 | 158 | console.log("next offset: ", currentIFDOffset + 2 + numEntries * 12); 159 | 160 | let entryBuffer = data.slice(currentIFDOffset + 2, currentIFDOffset + 2 + 12 * numEntries); 161 | 162 | for(let i = 0; i < numEntries; i++){ 163 | let tag = Tag.fromValue(entryBuffer.readUInt16LE(i * 12)); 164 | let type = Type.fromValue(entryBuffer.readUInt16LE(i * 12 + 2)); 165 | let count = entryBuffer.readUInt32LE(i * 12 + 4); 166 | let offsetOrValue = entryBuffer.readUInt32LE(i * 12 + 8); 167 | let valueBytes = type.bytes * count; 168 | 169 | let value; 170 | if(valueBytes <= 4){ 171 | value = offsetOrValue; 172 | }else{ 173 | let valueBuffer = new Uint8Array(valueBytes); 174 | valueBuffer.set(data.slice(offsetOrValue, offsetOrValue + valueBytes)); 175 | 176 | let ArrayType = typeMapping.get(type); 177 | 178 | value = new ArrayType(valueBuffer.buffer); 179 | 180 | if(type === Type.ASCII){ 181 | value = String.fromCharCode(...value); 182 | } 183 | } 184 | 185 | let ifd = new IFDEntry(tag, type, count, offsetOrValue, value); 186 | 187 | ifds.push(ifd); 188 | } 189 | 190 | console.log("nextIFDOffset", nextIFDOffset); 191 | 192 | if(nextIFDOffset === 0){ 193 | break; 194 | } 195 | 196 | currentIFDOffset = nextIFDOffset; 197 | i++; 198 | } 199 | 200 | let ifdForTag = (tag) => { 201 | for(let entry of ifds){ 202 | if(entry.tag === tag){ 203 | return entry; 204 | } 205 | } 206 | 207 | return null; 208 | }; 209 | 210 | let width = ifdForTag(Tag.IMAGE_WIDTH, ifds).value; 211 | let height = ifdForTag(Tag.IMAGE_HEIGHT, ifds).value; 212 | let compression = ifdForTag(Tag.COMPRESSION, ifds).value; 213 | let rowsPerStrip = ifdForTag(Tag.ROWS_PER_STRIP, ifds).value; 214 | let ifdStripOffsets 
= ifdForTag(Tag.STRIP_OFFSETS, ifds); 215 | let ifdStripByteCounts = ifdForTag(Tag.STRIP_BYTE_COUNTS, ifds); 216 | 217 | let numStrips = Math.ceil(height / rowsPerStrip); 218 | 219 | let stripByteCounts = []; 220 | for(let i = 0; i < ifdStripByteCounts.count; i++){ 221 | let type = ifdStripByteCounts.type; 222 | let offset = ifdStripByteCounts.offset + i * type.bytes; 223 | 224 | let value; 225 | if(type === Type.SHORT){ 226 | value = data.readUInt16LE(offset); 227 | }else if(type === Type.LONG){ 228 | value = data.readUInt32LE(offset); 229 | } 230 | 231 | stripByteCounts.push(value); 232 | } 233 | 234 | let stripOffsets = []; 235 | for(let i = 0; i < ifdStripOffsets.count; i++){ 236 | let type = ifdStripOffsets.type; 237 | let offset = ifdStripOffsets.offset + i * type.bytes; 238 | 239 | let value; 240 | if(type === Type.SHORT){ 241 | value = data.readUInt16LE(offset); 242 | }else if(type === Type.LONG){ 243 | value = data.readUInt32LE(offset); 244 | } 245 | 246 | stripOffsets.push(value); 247 | } 248 | 249 | let imageBuffer = new Uint8Array(width * height * 3); 250 | 251 | let linesProcessed = 0; 252 | for(let i = 0; i < numStrips; i++){ 253 | let stripOffset = stripOffsets[i]; 254 | let stripBytes = stripByteCounts[i]; 255 | let stripData = data.slice(stripOffset, stripOffset + stripBytes); 256 | let lineBytes = width * 3; 257 | for(let y = 0; y < rowsPerStrip; y++){ 258 | let line = stripData.slice(y * lineBytes, y * lineBytes + lineBytes); 259 | imageBuffer.set(line, linesProcessed * lineBytes); 260 | 261 | if(line.length === lineBytes){ 262 | linesProcessed++; 263 | }else{ 264 | break; 265 | } 266 | } 267 | } 268 | 269 | console.log(`width: ${width}`); 270 | console.log(`height: ${height}`); 271 | console.log(`numStrips: ${numStrips}`); 272 | console.log("stripByteCounts", stripByteCounts.join(", ")); 273 | console.log("stripOffsets", stripOffsets.join(", ")); 274 | 275 | let image = new Image(); 276 | image.width = width; 277 | image.height = height; 278 | image.buffer = imageBuffer; 279 | image.metadata = ifds; 280 | 281 | return image; 282 | } 283 | 284 | } 285 | 286 | 287 | class Exporter{ 288 | 289 | constructor(){ 290 | 291 | } 292 | 293 | static toTiffBuffer(image, params){ 294 | 295 | let offsetToFirstIFD = 8; 296 | 297 | let headerBuffer = new Uint8Array([0x49, 0x49, 42, 0, offsetToFirstIFD, 0, 0, 0]); 298 | 299 | let [width, height] = [image.width, image.height]; 300 | 301 | let ifds = [ 302 | new IFDEntry(Tag.IMAGE_WIDTH, Type.SHORT, 1, null, width), 303 | new IFDEntry(Tag.IMAGE_HEIGHT, Type.SHORT, 1, null, height), 304 | new IFDEntry(Tag.BITS_PER_SAMPLE, Type.SHORT, 4, null, new Uint16Array([8, 8, 8, 8])), 305 | new IFDEntry(Tag.COMPRESSION, Type.SHORT, 1, null, 1), 306 | new IFDEntry(Tag.PHOTOMETRIC_INTERPRETATION, Type.SHORT, 1, null, 2), 307 | new IFDEntry(Tag.ORIENTATION, Type.SHORT, 1, null, 1), 308 | new IFDEntry(Tag.SAMPLES_PER_PIXEL, Type.SHORT, 1, null, 4), 309 | new IFDEntry(Tag.ROWS_PER_STRIP, Type.LONG, 1, null, height), 310 | new IFDEntry(Tag.STRIP_BYTE_COUNTS, Type.LONG, 1, null, width * height * 3), 311 | new IFDEntry(Tag.PLANAR_CONFIGURATION, Type.SHORT, 1, null, 1), 312 | new IFDEntry(Tag.RESOLUTION_UNIT, Type.SHORT, 1, null, 1), 313 | new IFDEntry(Tag.SOFTWARE, Type.ASCII, 6, null, "......"), 314 | new IFDEntry(Tag.STRIP_OFFSETS, Type.LONG, 1, null, null), 315 | new IFDEntry(Tag.X_RESOLUTION, Type.RATIONAL, 1, null, new Uint32Array([1, 1])), 316 | new IFDEntry(Tag.Y_RESOLUTION, Type.RATIONAL, 1, null, new Uint32Array([1, 1])), 317 | ]; 318 | 319 | 
if(params.ifdEntries){ 320 | ifds.push(...params.ifdEntries); 321 | } 322 | 323 | let valueOffset = offsetToFirstIFD + 2 + ifds.length * 12 + 4; 324 | 325 | // create 12 byte buffer for each ifd and variable length buffers for ifd values 326 | let ifdEntryBuffers = new Map(); 327 | let ifdValueBuffers = new Map(); 328 | for(let ifd of ifds){ 329 | let entryBuffer = new ArrayBuffer(12); 330 | let entryView = new DataView(entryBuffer); 331 | 332 | let valueBytes = ifd.type.bytes * ifd.count; 333 | 334 | entryView.setUint16(0, ifd.tag.value, true); 335 | entryView.setUint16(2, ifd.type.value, true); 336 | entryView.setUint32(4, ifd.count, true); 337 | 338 | if(ifd.count === 1 && ifd.type.bytes <= 4){ 339 | entryView.setUint32(8, ifd.value, true); 340 | }else{ 341 | entryView.setUint32(8, valueOffset, true); 342 | 343 | let valueBuffer = new Uint8Array(ifd.count * ifd.type.bytes); 344 | if(ifd.type === Type.ASCII){ 345 | valueBuffer.set(new Uint8Array(ifd.value.split("").map(c => c.charCodeAt(0)))); 346 | }else{ 347 | valueBuffer.set(new Uint8Array(ifd.value.buffer)); 348 | } 349 | ifdValueBuffers.set(ifd.tag, valueBuffer); 350 | 351 | valueOffset = valueOffset + valueBuffer.byteLength; 352 | } 353 | 354 | ifdEntryBuffers.set(ifd.tag, entryBuffer); 355 | } 356 | 357 | let imageBufferOffset = valueOffset; 358 | 359 | new DataView(ifdEntryBuffers.get(Tag.STRIP_OFFSETS)).setUint32(8, imageBufferOffset, true); 360 | 361 | let concatBuffers = (buffers) => { 362 | 363 | let totalLength = buffers.reduce( (sum, buffer) => (sum + buffer.byteLength), 0); 364 | let merged = new Uint8Array(totalLength); 365 | 366 | let offset = 0; 367 | for(let buffer of buffers){ 368 | merged.set(new Uint8Array(buffer), offset); 369 | offset += buffer.byteLength; 370 | } 371 | 372 | return merged; 373 | }; 374 | 375 | let ifdBuffer = concatBuffers([ 376 | new Uint16Array([ifds.length]), 377 | ...ifdEntryBuffers.values(), 378 | new Uint32Array([0])]); 379 | let ifdValueBuffer = concatBuffers([...ifdValueBuffers.values()]); 380 | 381 | let tiffBuffer = concatBuffers([ 382 | headerBuffer, 383 | ifdBuffer, 384 | ifdValueBuffer, 385 | image.buffer 386 | ]); 387 | 388 | return {width: width, height: height, buffer: tiffBuffer}; 389 | } 390 | 391 | } 392 | 393 | module.exports.Tag = Tag; 394 | module.exports.Type = Type; 395 | module.exports.IFDEntry = IFDEntry; 396 | module.exports.Image = Image; 397 | module.exports.Reader = Reader; 398 | module.exports.Exporter = Exporter; -------------------------------------------------------------------------------- /geo-png-db-processing/src/encodings/PngDbEncodings.js: -------------------------------------------------------------------------------- 1 | const Jimp = require('jimp'); 2 | 3 | const MAX_VALUE = 255 * 256 * 256 - 1; 4 | 5 | class PngDbEncodings { 6 | static valueToEncoded(field, value) { 7 | if (field.range) { 8 | value = value - field.range.min;//store the offset from the min value for smaller integers and also to allow signed values with the same methodology 9 | } 10 | if (field.precision) { 11 | value = Math.round(value * field.precision); 12 | } else { 13 | value = Math.round(value); 14 | } 15 | if (value > PngDbEncodings.MAX_VALUE) { 16 | console.warn(`Maximum value exceeded ${value} (TRUNCATED)`); 17 | value = PngDbEncodings.MAX_VALUE; 18 | } 19 | 20 | if (value < 0) { 21 | return Jimp.rgbaToInt(0, 0, 0, 0); 22 | } 23 | if (value > 255) { 24 | let r = 0; 25 | const b = value % 256; 26 | let g = Math.floor(value / 256); 27 | 28 | if (g > 255) { 29 | r = Math.floor(g / 
256); 30 | g = g % 256; 31 | } 32 | if (r > 255) { 33 | console.warn('MAX VALUE VIOLATION: ' + value + ' : ' + PngDbEncodings.MAX_VALUE); 34 | r = 255; 35 | } 36 | return Jimp.rgbaToInt(r, g, b, 255); 37 | } else { 38 | return Jimp.rgbaToInt(0, 0, value, 255); 39 | } 40 | }; 41 | 42 | static setPixel(field, image, x, y, value) { 43 | const encodedValue = PngDbEncodings.valueToEncoded(field, value); 44 | image.setPixelColor(encodedValue, x, y); 45 | return encodedValue; 46 | }; 47 | 48 | static valFromPixel(field, r, g, b, a) { 49 | if (a === 0) return null; 50 | 51 | let val = r << 16 | g << 8 | b; 52 | 53 | if (field.uniqueValues) { 54 | val = field.uniqueValues[val]; 55 | } else { 56 | if (field.precision) { 57 | val /= field.precision; 58 | } 59 | 60 | if (field.range) { 61 | val += field.range.min;// we store the offset from the min value for smaller integers and also to allow signed values with the same methodology 62 | } 63 | } 64 | return val; 65 | }; 66 | 67 | static componentToHex(c) { 68 | const hex = c.toString(16); 69 | return hex.length === 1 ? "0" + hex : hex; 70 | } 71 | 72 | static rgbToHex(r, g, b) { 73 | return ("#" + PngDbEncodings.componentToHex(r) + PngDbEncodings.componentToHex(g) + PngDbEncodings.componentToHex(b)).toUpperCase(); 74 | }; 75 | 76 | } 77 | 78 | PngDbEncodings.MAX_VALUE = MAX_VALUE;//expose the 24-bit maximum on the class so the overflow checks in valueToEncoded can reference it 79 | 80 | module.exports = PngDbEncodings; 81 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/encodings/PyramidTiles.js: -------------------------------------------------------------------------------- 1 | const fs = require('fs'); 2 | const async = require('async'); 3 | const Jimp = require('jimp'); 4 | const PngDbEncodings = require('./PngDbEncodings'); 5 | const Utils = require("../Utils"); 6 | const TileEncodings = require("./TileEncodings"); 7 | 8 | /** 9 | * Builds lower-zoom "pyramid" tiles by aggregating higher-zoom GeoPngDB tiles (SUM for numeric fields, most-frequent color for categorical fields). 10 | * @class 11 | * @returns An instance of PyramidTiles. 12 | * @example 13 | * var instance = new PyramidTiles(); 14 | */ 15 | class PyramidTiles { 16 | 17 | constructor(filer, zoom = 13, steps = 4, outDir = './data/', field) { 18 | this.filer = filer; 19 | this.tileSize = 256; 20 | this.zoom = zoom; 21 | this.steps = steps; 22 | this.outDir = outDir; 23 | this.field = field || {precision: 1000, name: 'TODO'}; 24 | this.field.arrayCount = this.field.keys ?
this.field.keys.length : 1; 25 | this.field.basePrecision = this.field.precision;//used for variablePrecision 26 | 27 | this.overwrite = true; 28 | this.useWalkFilesMethod = false; 29 | } 30 | 31 | getValueSums(tile) { 32 | const {steps, field} = this; 33 | 34 | const sums = {}; 35 | for (let i = 0; i < steps; i++) { 36 | sums[i] = {}; 37 | } 38 | tile.scan(0, 0, tile.bitmap.width, tile.bitmap.height, (x, y, idx) => { 39 | // x, y is the position of this pixel on the image (starting at 0) 40 | // idx is the position start position of this rgba tuple in the bitmap Buffer 41 | // this is the image 42 | 43 | // noinspection PointlessArithmeticExpressionJS 44 | const red = tile.bitmap.data[idx + 0]; 45 | const green = tile.bitmap.data[idx + 1]; 46 | const blue = tile.bitmap.data[idx + 2]; 47 | const alpha = tile.bitmap.data[idx + 3]; 48 | 49 | const val = PngDbEncodings.valFromPixel({precision:field.basePrecision}, red, green, blue, alpha); 50 | if (val > 0) { 51 | let px4 = x; 52 | let py4 = y; 53 | for (let i = 0; i < steps; i++) { 54 | px4 = Math.floor(px4 / 2); 55 | py4 = Math.floor(py4 / 2); 56 | let pos = px4 + '_' + py4; 57 | if (!sums[i][pos]) sums[i][pos] = 0; 58 | sums[i][pos] += val; 59 | } 60 | } 61 | }); 62 | return sums; 63 | } 64 | 65 | getValueByMax(tile) { 66 | const {steps, zoom, outDir, field, overwrite} = this; 67 | 68 | const max = {}; 69 | for (let i = 0; i < steps; i++) { 70 | max[i] = {}; 71 | } 72 | 73 | tile.scan(0, 0, tile.bitmap.width, tile.bitmap.height, (x, y, idx) => { 74 | // x, y is the position of this pixel on the image (starting at 0) 75 | // idx is the position start position of this rgba tuple in the bitmap Buffer 76 | // this is the image 77 | 78 | // noinspection PointlessArithmeticExpressionJS 79 | const red = tile.bitmap.data[idx + 0]; 80 | const green = tile.bitmap.data[idx + 1]; 81 | const blue = tile.bitmap.data[idx + 2]; 82 | const alpha = tile.bitmap.data[idx + 3]; 83 | 84 | const rgbStr = `${red}_${green}_${blue}`; 85 | if (alpha > 0) { 86 | let px4 = x; 87 | let py4 = y; 88 | for (let i = 0; i < steps; i++) {//steps = 1 per zoom level change 89 | px4 = Math.floor(px4 / 2); 90 | py4 = Math.floor(py4 / 2); 91 | let pos = px4 + '_' + py4; 92 | if (!max[i][pos]) { 93 | max[i][pos] = {counts: {}, max: 0, hex: 0}; 94 | } 95 | if (!max[i][pos].counts[rgbStr]) max[i][pos].counts[rgbStr] = 0; 96 | max[i][pos].counts[rgbStr]++; 97 | if (max[i][pos].counts[rgbStr] > max[i][pos].max) { 98 | max[i][pos].max = max[i][pos].counts[rgbStr]; 99 | max[i][pos].hex = Jimp.rgbaToInt(red, green, blue, alpha); 100 | } 101 | } 102 | } 103 | }); 104 | return max; 105 | } 106 | 107 | makePyramidTiles(dirName, file, tz, tx, ty, callback) { 108 | const {steps, zoom, outDir, field, overwrite} = this; 109 | this.filer.readImage(file, (err, tile) => { 110 | if (err) throw err; 111 | 112 | let values; 113 | if (this.field.categorical) { 114 | values = this.getValueByMax(tile); 115 | } else { 116 | values = this.getValueSums(tile); 117 | } 118 | 119 | let pWidth4 = tile.bitmap.width; 120 | let pHeight4 = tile.bitmap.height; 121 | 122 | let tileNum = 1; 123 | let precision = field.basePrecision; 124 | const calls = []; 125 | 126 | const addJob = (dir, outFile, width, height, values, precision) => { 127 | calls.push((done) => { 128 | new Jimp(width, height, (err, image) => { 129 | for (let y = 0; y < height; y++) { 130 | for (let x = 0; x < width; x++) { 131 | let val = values[x + '_' + y]; 132 | if (field.categorical) { 133 | if (val && val.hex) { 134 | 
image.setPixelColor(val.hex, x, y); 135 | } 136 | } else { 137 | if (val > 0) { 138 | PngDbEncodings.setPixel({precision:precision}, image, x, y, val); 139 | } 140 | } 141 | } 142 | } 143 | 144 | this.filer.ensureDir(dir, () => { 145 | // console.log(`Writing : ${dir}/${pos.x}_${pos.y}.png`); 146 | // done(); 147 | this.filer.saveImage(outFile, image, (err) => { 148 | done(); 149 | }); 150 | }); 151 | }); 152 | }); 153 | }; 154 | for (let i = 0; i < steps; i++) { 155 | tileNum *= 2; 156 | precision /= field.variablePrecision; 157 | pWidth4 = pWidth4 / 2; 158 | pHeight4 = pHeight4 / 2; 159 | 160 | const z = zoom - (i + 1); 161 | const ox = Math.floor(tx / tileNum); 162 | const oy = Math.floor(ty / tileNum); 163 | 164 | let dir = dirName ? `${outDir}${dirName}/` : outDir; 165 | dir = `${dir}${z}/${ox}/${oy}`; 166 | const pos = { 167 | x: tx - ox * tileNum, 168 | y: ty - oy * tileNum, 169 | }; 170 | let outFile = `${dir}/${pos.x}_${pos.y}.png`; 171 | 172 | if (overwrite || !fs.existsSync(outFile)) {//NOTE: only supported for NodeFiler - S3 should always use overwrite 173 | // console.log(pWidth4, tileNum, precision); 174 | addJob(dir, outFile, pWidth4, pHeight4, values[i], precision); 175 | } 176 | } 177 | 178 | if (calls.length === 0) { 179 | if (callback) callback(); 180 | } else { 181 | async.parallelLimit(calls, 8, function () { 182 | // console.log(`DONE ${tx} ${ty}`); 183 | if (callback) callback(); 184 | }); 185 | } 186 | 187 | 188 | }); 189 | 190 | }; 191 | 192 | processImageFiles(dir, filter, fn, callback) { 193 | //NOTE this is a bit of a mess dealing with Node and S3 filers based on how each reads recursively 194 | //for now useWalkFilesMethod should be used if specifying NodeFiler, but we should find a more elegant solution 195 | if (!this.useWalkFilesMethod) { 196 | this.filer.readdir(`${dir}/${this.zoom}`, (err, files) => { 197 | const num = files.length; 198 | let cnt = 0; 199 | const calls = []; 200 | files.forEach((file, i) => { 201 | calls.push((done) => { 202 | const filePath = `${dir}/${this.zoom}/${file}`; 203 | const tile_addr_bits = filePath.split('/'); 204 | let fileNameArr = tile_addr_bits.pop().split('.'); 205 | const tile_addr_y = fileNameArr[0]; 206 | const ext = fileNameArr[fileNameArr.length - 1]; 207 | if (ext !== 'png') { 208 | setTimeout(done, 10); 209 | return; 210 | } 211 | const tile_addr_x = tile_addr_bits.pop(); 212 | const zoomDir = tile_addr_bits.pop(); 213 | if (zoomDir !== '' + this.zoom) { 214 | // console.log('Skipping: ' + file.fullName); 215 | setTimeout(done, 10); 216 | return; 217 | } 218 | if (filter && !filter(+tile_addr_x, +tile_addr_y)) { 219 | console.log('Skipping: ' + file); 220 | setTimeout(done, 10); 221 | return; 222 | } 223 | 224 | const dirName = this.field.name;// tile_addr_bits.pop(); 225 | fn(dirName, filePath, +tile_addr_x, +tile_addr_y, ()=> { 226 | cnt++; 227 | console.log(`${cnt}/${num} ${Math.round(1000 * cnt / num) / 10}%`); 228 | done(); 229 | }); 230 | }); 231 | }); 232 | 233 | async.parallelLimit(calls, 8, function () { 234 | // console.log(`DONE ${tx} ${ty}`); 235 | if (callback) callback(); 236 | }); 237 | }); 238 | } else { 239 | Utils.walkFiles(`${dir}/${this.zoom}`, true, (file, done, cnt, num) => { 240 | if (file.extension !== '.png') { 241 | setTimeout(done, 10); 242 | return; 243 | } 244 | 245 | const filePath = file.fullName; 246 | const tile_addr_bits = filePath.split('\\'); 247 | let fileNameArr = tile_addr_bits.pop().split('.'); 248 | const ext = fileNameArr[fileNameArr.length - 1]; 249 | const tile_addr_x = 
tile_addr_bits.pop(); 250 | const tile_addr_y = fileNameArr[0]; 251 | const zoomDir = tile_addr_bits.pop(); 252 | if (zoomDir !== '' + this.zoom) { 253 | // console.log('Skipping: ' + file.fullName); 254 | setTimeout(done, 10); 255 | return; 256 | } 257 | if (filter && !filter(+tile_addr_x, +tile_addr_y)) { 258 | console.log('Skipping: ' + file); 259 | setTimeout(done, 10); 260 | return; 261 | } 262 | const dirName = this.field.name; 263 | fn(dirName, filePath, +tile_addr_x, +tile_addr_y, ()=> { 264 | console.log(`${cnt}/${num} ${Math.round(1000 * cnt / num) / 10}%`); 265 | done(); 266 | }); 267 | }, 8).then(()=> { 268 | if (callback) callback(); 269 | }); 270 | } 271 | } 272 | 273 | processDir(dir, callback, filter) { 274 | // console.log('Process tile ' + file.fullName); 275 | this.processImageFiles(dir, filter, (dirName, filePath, tile_addr_x, tile_addr_y, done) => { 276 | this.makePyramidTiles(dirName, filePath, this.zoom, +tile_addr_x, +tile_addr_y, () => { 277 | done(); 278 | }); 279 | }, ()=> { 280 | if (callback) callback(); 281 | }); 282 | } 283 | 284 | findPyramidTiles(tilesToGenerate, dirName, file, tz, tx, ty, callback) { 285 | const {zoom, tileSize, steps, outDir} = this; 286 | 287 | let tileNumCounter = 1; 288 | for (let i = 0; i < steps; i++) { 289 | tileNumCounter *= 2; 290 | const tileNum = tileNumCounter; 291 | const miniTileSize = tileSize / tileNum; 292 | const z = zoom - (i + 1); 293 | const ox = Math.floor(tx / tileNum); 294 | const oy = Math.floor(ty / tileNum); 295 | 296 | const pDir = dirName ? `${outDir}${dirName}/` : outDir; 297 | 298 | const dir = `${pDir}${z}/${ox}/${oy}`; 299 | 300 | if (!tilesToGenerate[dir]) { 301 | tilesToGenerate[dir] = { 302 | fileName: `${dir}.png`, 303 | z: z, 304 | x: ox, 305 | y: oy, 306 | num: tileNum, 307 | size: miniTileSize, 308 | tiles: [] 309 | }; 310 | for (let x = 0; x < tileNum; x++) { 311 | for (let y = 0; y < tileNum; y++) { 312 | tilesToGenerate[dir].tiles.push({x: x, y: y, file: `${dir}/${x}_${y}.png`}); 313 | } 314 | } 315 | } 316 | } 317 | if (callback) callback(); 318 | 319 | }; 320 | 321 | makeTile(tileData, allCalls) { 322 | const {zoom, tileSize, filer} = this; 323 | 324 | let missingImageCounter = 0; 325 | allCalls.push((done) => { 326 | const tileOffsets = TileEncodings.calcOffsetTiles(this.field.arrayCount || 1); 327 | new Jimp(tileSize * tileOffsets.width, tileSize * tileOffsets.height, (err, image) => { 328 | const calls = []; 329 | 330 | tileData.tiles.forEach((tile, i) => { 331 | calls.push((done) => { 332 | // console.log(tile.x + ',' + tile.y); 333 | filer.readImage(tile.file, function (err, mini) { 334 | if (err) { 335 | if (err.code === 'ENOENT' || err.code === 'NoSuchKey') { 336 | missingImageCounter++; 337 | } else { 338 | console.log(err); 339 | } 340 | } else { 341 | tileOffsets.offsets.forEach((xy, i) => { 342 | const miniSlice = { 343 | width: mini.bitmap.width / tileOffsets.width, 344 | height: mini.bitmap.height / tileOffsets.height, 345 | }; 346 | const srcX = xy[0] * miniSlice.width; 347 | const srcY = xy[1] * miniSlice.height; 348 | const srcW = miniSlice.width; 349 | const srcH = miniSlice.height; 350 | let x = miniSlice.width * tile.x + xy[0] * miniSlice.width * tileData.num; 351 | let y = miniSlice.height * tile.y + xy[1] * miniSlice.height * tileData.num; 352 | 353 | // console.log(`tile:${tile.x},${tile.y}; offset:${xy.join(',')}; blit@:${x},${y}`); 354 | 355 | image.blit(mini, x, y, srcX, srcY, srcW, srcH); 356 | }); 357 | } 358 | done(); 359 | }); 360 | }); 361 | }); 362 | 363 | 
async.parallelLimit(calls, 8, function () { 364 | filer.saveImage(`${tileData.fileName}`, image, function () { 365 | console.log(`Wrote ${tileData.fileName}, missing ${missingImageCounter}`); 366 | done(); 367 | }); 368 | }); 369 | }); 370 | 371 | }); 372 | 373 | }; 374 | 375 | stitchMiniTiles(dir, filter) { 376 | const {zoom, outDir} = this; 377 | const tilesToGenerate = {}; 378 | 379 | this.processImageFiles(dir, filter, (dirName, filePath, tile_addr_x, tile_addr_y, done) => { 380 | this.findPyramidTiles(tilesToGenerate, dirName, filePath, zoom, +tile_addr_x, +tile_addr_y, () => { 381 | done(); 382 | }); 383 | }, ()=> { 384 | const allCalls = []; 385 | Object.keys(tilesToGenerate).forEach((k) => { 386 | this.makeTile(tilesToGenerate[k], allCalls); 387 | }); 388 | console.log('Total calls: ' + allCalls.length); 389 | async.parallelLimit(allCalls, 8, function () { 390 | console.log('DONE'); 391 | }); 392 | }); 393 | } 394 | } 395 | 396 | module.exports = PyramidTiles; 397 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/encodings/TileEncodings.js: -------------------------------------------------------------------------------- 1 | const Jimp = require('jimp'); 2 | const PngDbEncodings = require('./PngDbEncodings'); 3 | const Utils = require("../Utils"); 4 | 5 | /** 6 | * Creates a new instance of TileEncodings. 7 | * @class 8 | * @returns An instance of TileEncodings. 9 | * @example 10 | * var instance = new TileEncodings(); 11 | */ 12 | class TileEncodings { 13 | 14 | constructor(filer, pixelLookup, valuesById, pixelCounts, field) { 15 | this.filer = filer; 16 | this.pixelLookup = pixelLookup; 17 | this.valuesById = valuesById; 18 | this.pixelCounts = pixelCounts; 19 | this.field = field//TODO use variable precision 20 | }; 21 | 22 | static calcOffsetTiles(num) { 23 | let w, h; 24 | if (num === 1) { 25 | w = 1; 26 | h = 1; 27 | } else if (num <= 4) { 28 | w = 2; 29 | h = 2; 30 | } else if (num <= 16) { 31 | w = 4; 32 | h = 4; 33 | } else if (num <= 64) { 34 | w = 8; 35 | h = 8; 36 | } else { 37 | throw 'No support for arrays longer than 64 elements'; 38 | } 39 | const offsets = []; 40 | for (let y = 0; y < h; y++) {//scan across then down (same as pixel order) 41 | for (let x = 0; x < w; x++) { 42 | if (offsets.length < num) { 43 | offsets.push([x, y]); 44 | } 45 | } 46 | } 47 | return { 48 | width: w, 49 | height: h, 50 | offsets: offsets 51 | }; 52 | 53 | }; 54 | 55 | processTile(file, outDir, z, x, y, callback, errback) { 56 | this.filer.readImage(file, (err, tile) => { 57 | if (err) { 58 | if (errback) { 59 | errback(err); 60 | return; 61 | } else { 62 | throw err; 63 | } 64 | } 65 | 66 | const tileOffsets = TileEncodings.calcOffsetTiles(this.field.arrayCount || 1); 67 | 68 | new Jimp(tile.bitmap.width * tileOffsets.width, tile.bitmap.height * tileOffsets.height, (err, image) => { 69 | tile.scan(0, 0, tile.bitmap.width, tile.bitmap.height, (x, y, idx) => { 70 | // noinspection PointlessArithmeticExpressionJS 71 | const red = tile.bitmap.data[idx + 0]; 72 | const green = tile.bitmap.data[idx + 1]; 73 | const blue = tile.bitmap.data[idx + 2]; 74 | // var alpha = tile.bitmap.data[ idx + 3 ]; 75 | 76 | const color = PngDbEncodings.rgbToHex(red, green, blue).toLowerCase(); 77 | const gisJoin = this.pixelLookup[color]; 78 | if (this.valuesById[gisJoin]) { 79 | let pixelCount = this.pixelCounts[color]; 80 | if (pixelCount.count > 0) { 81 | const vals = this.valuesById[gisJoin]; 82 | tileOffsets.offsets.forEach((xy, i) => { 83 | if 
(vals[i] > 0) { 84 | const valPerPx = vals[i] / pixelCount.count; 85 | let pX = xy[0] * tile.bitmap.width + x; 86 | let pY = xy[1] * tile.bitmap.height + y; 87 | PngDbEncodings.setPixel(this.field, image, pX, pY, valPerPx); 88 | } 89 | }); 90 | } 91 | } 92 | }); 93 | 94 | const dir = `${outDir}${this.field.name}/${z}/${x}`; 95 | this.filer.ensureDir(dir, () => { 96 | let outFile = `${dir}/${y}.png`; 97 | this.filer.saveImage(outFile, image, (err) => { 98 | if (callback) callback(outFile); 99 | }); 100 | }); 101 | }); 102 | 103 | }); 104 | }; 105 | } 106 | 107 | module.exports = TileEncodings; 108 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/encodings/_test/TileEncodings.js: -------------------------------------------------------------------------------- 1 | const assert = require('assert'); 2 | const TileEncodings = require('../TileEncodings'); 3 | 4 | const rasterTileSet = new TileEncodings(); 5 | 6 | const assertEqual = (a, b) => {//replace with https://www.chaijs.com/api/bdd/ ?? 7 | try { 8 | assert.deepStrictEqual(a, b); 9 | console.log('Passed'); 10 | } catch (e) { 11 | console.log('Assert ERROR-----------------------------'); 12 | console.log(JSON.stringify(a)); 13 | console.log('-----------------------------------------'); 14 | console.log(JSON.stringify(b)); 15 | console.log('-----------------------------------------'); 16 | } 17 | }; 18 | 19 | assertEqual(TileEncodings.calcOffsetTiles(1), { 20 | width: 1, 21 | height: 1, 22 | offsets: [ 23 | [0, 0] 24 | ] 25 | }); 26 | 27 | assertEqual(TileEncodings.calcOffsetTiles(3), { 28 | width: 2, 29 | height: 2, 30 | offsets: [[0, 0], [1, 0], [0, 1]] 31 | }); 32 | // 33 | assertEqual(TileEncodings.calcOffsetTiles(7), { 34 | width: 4, 35 | height: 4, 36 | offsets: [ 37 | [0, 0], 38 | [1, 0], 39 | [2, 0], 40 | [3, 0], 41 | [0, 1], 42 | [1, 1], 43 | [2, 1], 44 | ] 45 | }); 46 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/filer/AFiler.js: -------------------------------------------------------------------------------- 1 | class AFiler { 2 | ensureDir(dir, callback) { 3 | console.error('ensureDir not implemented'); 4 | } 5 | 6 | fileExists(file, callback) { 7 | console.error('fileExists not implemented'); 8 | } 9 | 10 | writeFile(path, data, o, c) { 11 | console.error('writeFile not implemented'); 12 | } 13 | 14 | readFile(path, o, c) { 15 | console.error('readFile not implemented'); 16 | } 17 | 18 | copyFile(fromPath, toPath, o, c) { 19 | console.error('copyFile not implemented'); 20 | } 21 | 22 | readdir(path, o, c) { 23 | console.error('readdir not implemented'); 24 | } 25 | 26 | saveImage(path, image, callback) { 27 | console.error('saveImage not implemented'); 28 | } 29 | 30 | readImage(path, callback) { 31 | console.error('readImage not implemented'); 32 | } 33 | } 34 | 35 | module.exports = AFiler; 36 | 37 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/filer/NodeFiler.js: -------------------------------------------------------------------------------- 1 | const fs = require('fs'); 2 | const mkdirp = require('mkdirp'); 3 | const Jimp = require('jimp'); 4 | 5 | const AFiler = require("./AFiler"); 6 | 7 | class NodeFiler extends AFiler { 8 | ensureDir(dir, callback) { 9 | mkdirp(dir, function (dirErr) { 10 | if (dirErr) { 11 | console.log(dirErr); 12 | } else { 13 | callback(dir); 14 | } 15 | }); 16 | }; 17 | 18 | fileExists(file, callback) { 19 | 
fs.access(file, fs.constants.F_OK, (err) => { 20 | if (err) {//doesn't exist?? should check err string 21 | callback(false); 22 | } else { 23 | callback(true); 24 | } 25 | }); 26 | } 27 | 28 | readFile(path, options, callback) { 29 | fs.readFile(path, options, callback) 30 | } 31 | 32 | readdir(path, options, callback) { 33 | fs.readdir(path, options, callback) 34 | } 35 | 36 | writeFile(path, data, options, callback) { 37 | fs.writeFile(path, data, options, callback); 38 | } 39 | 40 | saveImage(path, image, callback) { 41 | image.write(path, callback); 42 | } 43 | 44 | copyFile(fromPath, toPath, options, callback) { 45 | fs.copyFile(fromPath, toPath, options, callback); 46 | } 47 | 48 | readImage(path, callback) { 49 | Jimp.read(path, (err, tile) => { 50 | callback(err, tile); 51 | }); 52 | } 53 | } 54 | 55 | module.exports = NodeFiler; 56 | 57 | 58 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/filer/S3Filer.js: -------------------------------------------------------------------------------- 1 | let AWS = require('aws-sdk'); 2 | let s3 = new AWS.S3(); 3 | const AFiler = require("./AFiler"); 4 | const Jimp = require('jimp'); 5 | 6 | class S3Filer extends AFiler { 7 | constructor(bucket) { 8 | super(); 9 | this.bucket = bucket; 10 | } 11 | 12 | ensureDir(dir, callback) { 13 | callback();//Note: S3 doesn't need to create a dir 14 | } 15 | 16 | fileExists(file, callback) { 17 | const params = {Bucket: this.bucket, Key: file}; 18 | s3.headObject(params).on('success', function (response) { 19 | callback(true); 20 | }).on('error', function (error) { 21 | //error return a object with status code 404 22 | callback(false); 23 | }).send(); 24 | } 25 | 26 | fileLastModified(file, callback) { 27 | const params = {Bucket: this.bucket, Key: file}; 28 | s3.headObject(params).on('success', function (response) { 29 | callback(null, response); 30 | }).on('error', function (error) { 31 | //error return a object with status code 404 32 | callback(error); 33 | }).send(); 34 | } 35 | 36 | interpretParams(options, callback) { 37 | if (!callback) { 38 | callback = options; 39 | options = {}; 40 | } 41 | return {options, callback}; 42 | } 43 | 44 | writeFile(path, data, o, c) { 45 | const {options, callback} = this.interpretParams(o, c); 46 | const contentType = options.contentType || "application/json"; 47 | const s3Params = { 48 | Bucket: this.bucket, 49 | Key: path, 50 | Body: data, 51 | ContentType: contentType 52 | }; 53 | try { 54 | s3.putObject(s3Params, function (err, data) { 55 | if (err) { 56 | console.log('ERR: S3 putobject...'); 57 | console.log(JSON.stringify(err), err.stack); 58 | callback(err); 59 | } else { 60 | // console.log('SUCCESS S3 putobject...' 
+ s3Params.Bucket + '/' + s3Params.Key); 61 | callback(null); 62 | } 63 | }); 64 | } catch (e) { 65 | callback(e); 66 | } 67 | } 68 | 69 | copyFile(fromPath, toPath, o, c) { 70 | const {options, callback} = this.interpretParams(o, c); 71 | 72 | const params = { 73 | Bucket: this.bucket, 74 | CopySource: fromPath, 75 | Key: toPath 76 | }; 77 | try { 78 | s3.copyObject(params, 79 | function (err, data) { 80 | if (err) { 81 | // console.log(err, err.stack); //--this may be common for ENOENTs 82 | callback(err, null); 83 | } else { 84 | callback(null, data); 85 | 86 | } 87 | }); 88 | } catch (e) { 89 | callback(e, null); 90 | } 91 | } 92 | 93 | readFile(path, o, c) { 94 | const {options, callback} = this.interpretParams(o, c); 95 | const params = { 96 | Bucket: this.bucket, 97 | Key: path 98 | }; 99 | try { 100 | s3.getObject(params, 101 | function (err, data) { 102 | if (err) { 103 | // console.log(err, err.stack); //--this may be common for ENOENTs 104 | callback(err, null); 105 | } else { 106 | callback(null, data.Body); 107 | 108 | } 109 | }); 110 | } catch (e) { 111 | callback(e, null); 112 | } 113 | } 114 | 115 | loadFilesFromS3(items, path, marker, callback) { 116 | const params = { 117 | Bucket: this.bucket, 118 | Prefix: path 119 | }; 120 | 121 | // are we paging from a specific point? 122 | if (marker) { 123 | params.StartAfter = marker; 124 | } 125 | 126 | const pathBits = path.split('/'); 127 | 128 | s3.listObjectsV2(params, (err, data) => { 129 | if (err) { 130 | console.error('Problem listing objects for ' + path + ': ' + err); 131 | if (err.code === 'AccessDenied') { 132 | console.log('Note that "ListBucket" permission is required'); 133 | } 134 | callback(err); 135 | return; 136 | } 137 | const pageItems = data.Contents.map((file) => { 138 | const keyBits = file.Key.split('/'); 139 | //include the full path within the dir 'path' that is passed in (use ${path}${file} to get full path) 140 | //this replicates Node behavior (except that it's a 'deep' search) 141 | keyBits.splice(0, pathBits.length); 142 | return keyBits.join('/'); 143 | }); 144 | [].push.apply(items, pageItems); 145 | console.log('Found objects: ' + items.length); 146 | 147 | // are we paging? 
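// listObjectsV2 returns at most 1000 keys per request; when IsTruncated is set we
// recurse with StartAfter pointing at the last key returned, so the next page resumes there.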
148 | if (data.IsTruncated) { 149 | const length = data.Contents.length; 150 | const marker = data.Contents[length - 1].Key; 151 | this.loadFilesFromS3(items, path, marker, callback); 152 | } else { 153 | callback(null, items); 154 | } 155 | 156 | }); 157 | } 158 | 159 | readdir(path, o, c) { 160 | const {options, callback} = this.interpretParams(o, c); 161 | const params = { 162 | Bucket: this.bucket, 163 | Prefix: path 164 | }; 165 | if (options.limit) { 166 | params.MaxKeys = options.limit; 167 | } 168 | const items = []; 169 | this.loadFilesFromS3(items, path, null, (err, items) => { 170 | callback(err, items); 171 | }); 172 | } 173 | 174 | saveImage(path, image, callback) { 175 | image.getBufferAsync(Jimp.MIME_PNG).then((buffer) => { 176 | this.writeFile(path, buffer, {contentType: 'image/png'}, callback); 177 | }); 178 | } 179 | 180 | readImage(path, callback) { 181 | this.readFile(path, (err, buffer) => { 182 | if (err) { 183 | callback(err); 184 | return; 185 | } 186 | Jimp.read(buffer) 187 | .then(image => { 188 | callback(null, image); 189 | }) 190 | .catch(err => { 191 | callback(err, null); 192 | }); 193 | }); 194 | 195 | } 196 | 197 | } 198 | 199 | module.exports = S3Filer; 200 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/gis/GeoJson.js: -------------------------------------------------------------------------------- 1 | const fs = require("fs"); 2 | 3 | /** 4 | @class GeoJson 5 | */ 6 | const GeoJson = function () { 7 | const _self = this; 8 | 9 | //region private fields and methods 10 | let _filePath = ''; 11 | let _featureList = []; 12 | const _uniqueLatLons = []; 13 | let _crs = null; 14 | 15 | const _init = function () { 16 | }; 17 | 18 | const _addEmptyFeatures = function (geoIds) { 19 | geoIds.forEach(function (geoId, i) { 20 | _featureList.push({ 21 | "type": "Feature", 22 | "properties": { 23 | "geoId": geoId, 24 | "label": geoId 25 | } 26 | }); 27 | }); 28 | }; 29 | 30 | const _createEmptyGeoJson = function (geoIds, saveAs, callback) { 31 | _addEmptyFeatures(geoIds); 32 | _save(saveAs, callback); 33 | }; 34 | 35 | const _setExtras = function (geojson) { 36 | if (_crs) { 37 | geojson.crs = { 38 | "type": "name", 39 | "properties": { 40 | "name": _crs 41 | } 42 | }; 43 | } 44 | }; 45 | 46 | const _saveSample = function (saveAs, sampleSize, callback) { 47 | if (_featureList.length < sampleSize) { 48 | throw new Error('Cannot sample ' + sampleSize + ' features from ' + _filePath); 49 | } 50 | 51 | const geojson = { 52 | "type": "FeatureCollection", 53 | "features": [] 54 | }; 55 | 56 | _setExtras(geojson); 57 | 58 | const sampleBucket = _featureList.slice(0);//clone 59 | 60 | while (geojson.features.length < sampleSize) { 61 | const randIndex = Math.floor(Math.random() * sampleBucket.length);//Math.floor (not round) keeps the index in bounds; round could yield sampleBucket.length and splice out nothing 62 | geojson.features.push(sampleBucket.splice(randIndex, 1)[0]); 63 | } 64 | 65 | fs.writeFile(saveAs, JSON.stringify(geojson, null, 2), function (err) { 66 | if (err) { 67 | throw err; 68 | } else { 69 | console.log('Saved ' + saveAs); 70 | if (callback) callback(); 71 | } 72 | }); 73 | }; 74 | 75 | const _savePages = function (saveAs, pageSize, callback) { 76 | for (let i = 0; i < _featureList.length; i += pageSize) { 77 | (function (i) { 78 | const geojson = { 79 | "type": "FeatureCollection", 80 | "features": [] 81 | }; 82 | _setExtras(geojson); 83 | 84 | for (let j = i; j < i + pageSize; j++) { 85 | if (j < _featureList.length) { 86 | geojson.features.push(_featureList[j]); 87 | } 88 | } 89 | const outFile
= saveAs.replace('##', `${(i + 1)}-${(i + pageSize)}`); 90 | fs.writeFile(outFile, JSON.stringify(geojson, null, 2), function (err) { 91 | if (err) { 92 | throw err; 93 | } else { 94 | console.log(`Saved page as ${outFile}`); 95 | if (callback) callback(); 96 | } 97 | }); 98 | })(i); 99 | } 100 | }; 101 | 102 | const _save = function (saveAs, callback) { 103 | const geojson = { 104 | "type": "FeatureCollection", 105 | "features": _featureList 106 | }; 107 | _setExtras(geojson); 108 | 109 | fs.writeFile(saveAs, JSON.stringify(geojson, null, 2), function (err) { 110 | if (err) { 111 | throw err; 112 | } else { 113 | console.log('Saved ' + saveAs); 114 | if (callback) callback(); 115 | } 116 | }); 117 | }; 118 | 119 | const _removeFeatures = function (featuresToRemove) { 120 | _featureList = _featureList.filter(feature => featuresToRemove.indexOf(feature) < 0); 121 | }; 122 | //endregion 123 | 124 | //region public API 125 | this.firstFeature = function () { 126 | return _featureList[0]; 127 | }; 128 | 129 | this.addFeature = function (feature) { 130 | _featureList.push(feature); 131 | return _self; 132 | }; 133 | 134 | this.addLineFeature = function (lat1, lon1, lat2, lon2, properties) { 135 | _featureList.push( 136 | { 137 | "type": "Feature", 138 | "properties": properties, 139 | "geometry": { 140 | "type": "LineString", 141 | "coordinates": [ 142 | [ 143 | lon1, 144 | lat1 145 | ], 146 | [ 147 | lon2, 148 | lat2 149 | ] 150 | ] 151 | } 152 | } 153 | ); 154 | return _self; 155 | }; 156 | 157 | this.addTypedFeature = function (type, coordinates, properties, autoIdPrefix) { 158 | if (autoIdPrefix && !properties.geoId) { 159 | const latLonId = coordinates.join('_');//key on the coordinate pair (addPointFeature passes [lon, lat]) 160 | if (_uniqueLatLons.indexOf(latLonId) < 0) { 161 | _uniqueLatLons.push(latLonId); 162 | } 163 | properties.geoId = autoIdPrefix + _uniqueLatLons.indexOf(latLonId); 164 | } 165 | _featureList.push( 166 | { 167 | "type": "Feature", 168 | "properties": properties, 169 | "geometry": { 170 | "type": type, 171 | "coordinates": coordinates 172 | } 173 | } 174 | ); 175 | return _self; 176 | }; 177 | 178 | this.addPointFeature = function (lat, lon, properties, autoIdPrefix) { 179 | _self.addTypedFeature('Point', [lon, lat], properties, autoIdPrefix); 180 | return _self; 181 | }; 182 | 183 | this.addEmptyFeatures = function (geoIds) { 184 | _addEmptyFeatures(geoIds); 185 | }; 186 | 187 | this.load = function (filePath) { 188 | _filePath = filePath; 189 | const geoJson = JSON.parse(fs.readFileSync(filePath, 'utf-8')); 190 | _featureList = geoJson.features; 191 | return _self; 192 | }; 193 | 194 | this.processGiantFile = function (filePath, eachFn, callback) { 195 | const readable = fs.createReadStream(filePath, { 196 | encoding: 'utf8', 197 | fd: null, 198 | }); 199 | let current = ''; 200 | let featureStr = ''; 201 | 202 | readable.on('readable', function () { 203 | let chunk; 204 | while (null !== (chunk = readable.read(1))) { 205 | current += chunk; 206 | if (chunk === '}') { 207 | featureStr += current; 208 | if (current === '}') {//double }} signifies end of feature...
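// current is reset to '' after every '}', so current === '}' here means the previous
// character read was also '}': a '}}' pair that closes one feature object in the stream.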
209 | if (featureStr.indexOf(',') === 0) { 210 | featureStr = featureStr.substr(1); 211 | const feature = JSON.parse(featureStr); 212 | eachFn(feature); 213 | } 214 | featureStr = ''; 215 | } 216 | current = ''; 217 | } 218 | } 219 | }).on('end', function () { 220 | callback(); 221 | }) 222 | }; 223 | 224 | this.featureList = function () { 225 | return _featureList; 226 | }; 227 | 228 | this.eachFeature = function (callback) { 229 | const featuresToRemove = []; 230 | _featureList.forEach(function (feature, i) { 231 | if (callback(feature, i, _featureList.length) === false) { 232 | featuresToRemove.push(feature); 233 | } 234 | }); 235 | _removeFeatures(featuresToRemove); 236 | return _self; 237 | }; 238 | 239 | this.save = function (saveAs, callback) { 240 | _filePath = saveAs; 241 | _save(saveAs, callback); 242 | return _self; 243 | }; 244 | 245 | this.createEmptyGeoJson = function (geoIds, saveAs) { 246 | _createEmptyGeoJson(geoIds, saveAs); 247 | return _self; 248 | }; 249 | 250 | this.addFeatures = function (features) { 251 | [].push.apply(_featureList, features); 252 | return _self; 253 | }; 254 | 255 | this.saveSample = function (saveAs, numEntries) { 256 | _saveSample(saveAs, numEntries); 257 | }; 258 | 259 | this.savePages = function (saveAs, numEntries) { 260 | _savePages(saveAs, numEntries); 261 | }; 262 | 263 | this.setCRS = function (crs) { 264 | _crs = crs; 265 | if (_crs && _crs.indexOf('urn:') !== 0) { 266 | _crs = "urn:ogc:def:crs:EPSG::" + _crs; 267 | } 268 | }; 269 | 270 | this.mergeOnId = function (idField, averageFields) { 271 | const featuresToRemove = []; 272 | 273 | const featuresById = {}; 274 | 275 | _featureList.forEach(function (feature, i) { 276 | const id = feature.properties[idField]; 277 | if (!featuresById[id]) featuresById[id] = []; 278 | featuresById[id].push(feature); 279 | }); 280 | 281 | Object.keys(featuresById).forEach((k) => { 282 | const arr = featuresById[k]; 283 | //if there is more than one matching record, add it to a total so we can get an average across all matches and then remove all but the first 284 | if (arr.length > 1) { 285 | const totals = {}; 286 | averageFields.forEach(function (field, i) { 287 | totals[field] = 0; 288 | }); 289 | for (let i = 0; i < arr.length; i++) { 290 | const feature = arr[i]; 291 | averageFields.forEach(function (field, i) { 292 | totals[field] += feature.properties[field]; 293 | }); 294 | if (i > 0) { 295 | featuresToRemove.push(feature); 296 | } 297 | } 298 | averageFields.forEach(function (field, i) { 299 | arr[0].properties[field] = totals[field] / arr.length; 300 | }); 301 | } 302 | }); 303 | 304 | _removeFeatures(featuresToRemove); 305 | return _self; 306 | }; 307 | //endregion 308 | 309 | _init(); 310 | }; 311 | 312 | module.exports.GeoJson = GeoJson; 313 | 314 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/gis/PbfTiles.js: -------------------------------------------------------------------------------- 1 | //TODO split out this module (that needs to run on Lambda from all the ShapeFileLoader dependencies -- those can be in a class that extends this one) 2 | const tilebelt = require("@mapbox/tilebelt"); 3 | const Utils = require('../Utils'); 4 | 5 | /** 6 | * Creates a new instance of PbfTiles. 7 | * @class 8 | * @returns An instance of PbfTiles. 
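 * Shared tile/bounds helpers; bounds objects have the shape {x: {min, max}, y: {min, max}} in lon/lat.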
9 | * @example 10 | * var instance = new PbfTiles(); 11 | */ 12 | class PbfTiles { 13 | 14 | constructor(outDir) { 15 | //Note some functionality similar to https://github.com/mapbox/geojson-vt 16 | this.outDir = outDir; 17 | this.memoize = {}; 18 | }; 19 | 20 | bboxToBounds(bbox) { 21 | return { 22 | x: {min: bbox[0], max: bbox[2]}, 23 | y: {min: bbox[1], max: bbox[3]} 24 | }; 25 | }; 26 | 27 | intersectsBounds(bounds1, bounds2) { 28 | return !( 29 | bounds1.x.min > bounds2.x.max || 30 | bounds1.x.max < bounds2.x.min || 31 | bounds1.y.min > bounds2.y.max || 32 | bounds1.y.max < bounds2.y.min 33 | ); 34 | }; 35 | 36 | fullyContains(parent, child) { 37 | return child.x.max <= parent.x.max && 38 | child.x.min >= parent.x.min && 39 | child.y.min >= parent.y.min && 40 | child.y.max <= parent.y.max; 41 | }; 42 | 43 | getOuterTile(startTile, endZoom) { 44 | if (startTile[2] === endZoom) return startTile; 45 | let parent = tilebelt.getParent(startTile); 46 | while (parent[2] !== endZoom) { 47 | parent = tilebelt.getParent(parent); 48 | } 49 | return parent; 50 | } 51 | 52 | getTopLeftPixel(tile) { 53 | return {x: tile[0] * 256, y: tile[1] * 256}; 54 | } 55 | 56 | getZ11(tileZ13) { 57 | const tileZ12 = tilebelt.getParent(tileZ13); 58 | return tilebelt.getParent(tileZ12); 59 | } 60 | 61 | getMinMaxInnerTiles(startTile, endZoom) { 62 | let children = this.getInnerTiles(startTile, endZoom); 63 | 64 | const max = [0, 0, endZoom]; 65 | const min = [Number.MAX_VALUE, Number.MAX_VALUE, endZoom]; 66 | children.forEach((c, i) => { 67 | max[0] = Math.max(max[0], c[0]); 68 | max[1] = Math.max(max[1], c[1]); 69 | 70 | min[0] = Math.min(min[0], c[0]); 71 | min[1] = Math.min(min[1], c[1]); 72 | }); 73 | return {min, max}; 74 | } 75 | 76 | getInnerTiles(startTile, endZoom) { 77 | const inners = []; 78 | const nextLevel = (tile, level) => { 79 | // console.log(level, tile); 80 | if (level === endZoom) { 81 | inners.push(tile); 82 | } else { 83 | const children = tilebelt.getChildren(tile); 84 | children.forEach((child, i) => { 85 | nextLevel(child, level + 1); 86 | }); 87 | } 88 | }; 89 | 90 | nextLevel(startTile, startTile[2]); 91 | return inners; 92 | }; 93 | 94 | toLatLon(projection, pair) { 95 | return pair; 96 | } 97 | 98 | convertCoords(projection, coords) { 99 | const coordsLL = []; 100 | const shapeBounds = { 101 | x: {min: Number.MAX_VALUE, max: -Number.MAX_VALUE}, 102 | y: {min: Number.MAX_VALUE, max: -Number.MAX_VALUE} 103 | }; 104 | coords.forEach((pair, i) => { 105 | try { 106 | const latLon = this.toLatLon(projection, pair); 107 | shapeBounds.x.min = Math.min(latLon[0], shapeBounds.x.min); 108 | shapeBounds.x.max = Math.max(latLon[0], shapeBounds.x.max); 109 | shapeBounds.y.min = Math.min(latLon[1], shapeBounds.y.min); 110 | shapeBounds.y.max = Math.max(latLon[1], shapeBounds.y.max); 111 | coordsLL.push(latLon); 112 | } catch (e) { 113 | console.log(e); 114 | // console.log(JSON.stringify(geometry)); 115 | console.log(typeof coords[0]); 116 | // console.log(pair); 117 | // console.log(feature); 118 | } 119 | }); 120 | return {coords: coordsLL, shapeBounds: shapeBounds} 121 | } 122 | 123 | unionBounds(a, b) { 124 | if (a == null) return b; 125 | if (b == null) return a; 126 | return { 127 | x: {min: Math.min(a.x.min, b.x.min), max: Math.max(a.x.max, b.x.max)}, 128 | y: {min: Math.min(a.y.min, b.y.min), max: Math.max(a.y.max, b.y.max)} 129 | } 130 | } 131 | 132 | isPairArray(arr) { 133 | if (arr.length === 0) return false; 134 | const first = arr[0]; 135 | return first.length === 2 && typeof first[0] 
=== 'number'; 136 | } 137 | 138 | convertGeometry(projection, geometry) { 139 | if (geometry._processed) { 140 | return geometry._processed.bounds;//prevent double-processing of geometry 141 | } 142 | let boundsUnion = null; 143 | const processArr = (arr) => { 144 | arr.forEach((subArr, i) => { 145 | if (this.isPairArray(subArr)) { 146 | const {coords, shapeBounds} = this.convertCoords(projection, subArr); 147 | arr[i] = coords; 148 | boundsUnion = this.unionBounds(boundsUnion, shapeBounds); 149 | } else { 150 | processArr(subArr); 151 | } 152 | }); 153 | }; 154 | processArr(geometry.coordinates); 155 | geometry._processed = {bounds: boundsUnion}; 156 | return boundsUnion; 157 | } 158 | 159 | tileToBounds(tile) { 160 | return this.bboxToBounds(tilebelt.tileToBBOX(tile)); 161 | } 162 | 163 | getTileGroups(zoomTiles, tileZoom, geoJson) { 164 | const tileGroups = []; 165 | zoomTiles.forEach((tile, i) => { 166 | let tileBounds = this.bboxToBounds(tilebelt.tileToBBOX(tile)); 167 | tileGroups.push({ 168 | tile: tile, 169 | bounds: tileBounds, 170 | boundsTest: (shapeBounds) => { 171 | return this.intersectsBounds(shapeBounds, tileBounds); 172 | }, 173 | geoJson: Utils.makeGeoJson() 174 | }); 175 | }); 176 | geoJson.features.forEach((feature, i) => { 177 | const shapeBounds = this.convertGeometry(false, feature.geometry); 178 | tileGroups.forEach((tileGroup, i) => { 179 | if (tileGroup.boundsTest(shapeBounds)) { 180 | tileGroup.geoJson.features.push(feature); 181 | } 182 | }); 183 | }); 184 | return tileGroups; 185 | } 186 | 187 | getZoomTiles(parentTile, tileZoom, geoJson) { 188 | const zoomTiles = this.getInnerTiles(parentTile, tileZoom); 189 | return this.getTileGroups(zoomTiles, tileZoom, geoJson); 190 | } 191 | 192 | findGeometriesNotFullyContained(tile, geoJson) { 193 | if (!geoJson.features) return []; 194 | let tileBounds = this.bboxToBounds(tilebelt.tileToBBOX(tile)); 195 | return geoJson.features.filter((feature) => { 196 | const bounds = this.convertGeometry(false, feature.geometry); 197 | return !this.fullyContains(tileBounds, bounds); 198 | }); 199 | } 200 | 201 | static getTileMetersPerPixel(zoom, latitude) {//https://github.com/mapbox/mapbox-unity-sdk/blob/ddd6cd4bece9b2a25a8e3337597cad17a65de5fd/sdkproject/Assets/Mapbox/Unity/Utilities/Conversions.cs#L223 202 | return 40075000 * Math.cos((Math.PI / 180) * latitude) / Math.pow(2, zoom + 8); 203 | } 204 | 205 | } 206 | 207 | module.exports = PbfTiles; 208 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/gis/ShapeFileLoader.js: -------------------------------------------------------------------------------- 1 | var fs = require("fs"); 2 | var path = require('path'); 3 | var unzipper = require("unzipper"); 4 | var shapefile = require("shapefile"); 5 | var Utils = require("../Utils"); 6 | 7 | /** 8 | * Creates a new instance of ShapeFileLoader. 9 | * @class 10 | * @returns An instance of ShapeFileLoader. 
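 * Streams a zipped shapefile (.shp/.dbf/.prj entries) and assembles its records into GeoJSON features.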
11 | * @example 12 | * var instance = new ShapeFileLoader(); 13 | */ 14 | class ShapeFileLoader { 15 | 16 | constructor() { 17 | }; 18 | 19 | addFeature(geoJson, properties, geometry) { 20 | geoJson.features.push({ 21 | "type": "Feature", 22 | "properties": properties, 23 | "geometry": geometry 24 | }); 25 | }; 26 | 27 | loadZip(zip, onLoad, onBoundingBox) { 28 | const shpFileData = {shp: [], dbf: [], prj: ''}; 29 | const process = (shpFileData) => { 30 | if (!shpFileData.dbf || shpFileData.dbf.length === 0) return; 31 | if (!shpFileData.shp || shpFileData.shp.length === 0) return; 32 | if (!shpFileData.prj || shpFileData.prj.length === 0) return; 33 | this.projection = shpFileData.prj; 34 | 35 | onLoad(shpFileData); 36 | }; 37 | 38 | fs.createReadStream(zip) 39 | .pipe(unzipper.Parse()) 40 | .on('entry', function (entry) { 41 | var fileName = entry.path; 42 | const ext = path.extname(fileName).substr(1); 43 | console.log('ENTRY: ' + ext); 44 | var type = entry.type; // 'Directory' or 'File' 45 | var size = entry.size; 46 | if (ext === "shp") { 47 | console.time('load shp ' + fileName); 48 | shapefile.openShp(entry).then((source) => { 49 | this.bbox = source.bbox; 50 | if (onBoundingBox) onBoundingBox(source.bbox); 51 | source.read() 52 | .then(function log(result) { 53 | if (result.done) { 54 | console.timeEnd('load shp ' + fileName); 55 | process(shpFileData); 56 | return; 57 | } 58 | shpFileData.shp.push(result.value); 59 | return source.read().then(log); 60 | }) 61 | }).catch((error) => { 62 | console.error('PROBLEM WITH ZIP: ' + fileName) 63 | console.error(error.stack) 64 | }); 65 | 66 | } else if (ext === "dbf") { 67 | console.time('load dbf ' + fileName); 68 | shapefile.openDbf(entry).then((source) => { 69 | source.read() 70 | .then(function log(result) { 71 | if (result.done) { 72 | console.timeEnd('load dbf ' + fileName); 73 | process(shpFileData); 74 | return; 75 | } else { 76 | shpFileData.dbf.push(result.value); 77 | } 78 | return source.read().then(log); 79 | }) 80 | }).catch((error) => { 81 | console.error('PROBLEM WITH ZIP: ' + fileName); 82 | console.error(error.stack); 83 | }); 84 | } else if (ext === "prj") { 85 | let content = ''; 86 | entry.on('data', function (buf) { 87 | content += buf.toString(); 88 | }); 89 | entry.on('end', function () { 90 | shpFileData.prj = content; 91 | process(shpFileData); 92 | // console.log(content); 93 | }); 94 | entry.read(); 95 | } else { 96 | entry.autodrain(); 97 | } 98 | // if (shpFile.dbf && shpFile.shp) { 99 | // processShapefile(shpFile); 100 | // } 101 | }); 102 | } 103 | 104 | loadZipToGeoJson(zip, callback) { 105 | this.loadZip(zip, (shpFileData) => { 106 | const geoJson = Utils.makeGeoJson(); 107 | shpFileData.shp.forEach((shp, i) => { 108 | // const converted = converter(shp); 109 | const record = shpFileData.dbf[i]; 110 | this.addFeature(geoJson, record, shp); 111 | if (i % 500 === 0) console.log(`Processed shape: ${i} of ${shpFileData.shp.length} --> ${Math.round(100 * i / shpFileData.shp.length)}%`); 112 | }); 113 | 114 | console.log(`DONE processing shapefile (${shpFileData.shp.length} shapes)`); 115 | callback(geoJson); 116 | }); 117 | } 118 | 119 | load(zip, converter, onBoundingBox, batches, callback) { 120 | const process = (shpFileData) => { 121 | shpFileData.shp.forEach((shp, i) => { 122 | const converted = converter(shp); 123 | batches.forEach((batch, j) => { 124 | if (!batch.boundsTest(converted)) return; 125 | const record = shpFileData.dbf[i]; 126 | if (!batch.geoJson) { 127 | batch.geoJson = 
Utils.makeGeoJson(); 128 | } 129 | this.addFeature(batch.geoJson, record, shp); 130 | }); 131 | if (i % 500 === 0) console.log(`Processed shape: ${i} of ${shpFileData.shp.length} --> ${Math.round(100 * i / shpFileData.shp.length)}%`); 132 | }); 133 | 134 | console.log(`DONE processing shapefile (${shpFileData.shp.length} shapes)`); 135 | callback(); 136 | }; 137 | 138 | this.loadZip(zip, process, onBoundingBox); 139 | } 140 | } 141 | 142 | module.exports = ShapeFileLoader; 143 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/gis/ShpToPbfTiles.js: -------------------------------------------------------------------------------- 1 | var proj4 = require("proj4"); 2 | var ShapeFileLoader = require("./ShapeFileLoader"); 3 | var fs = require('fs'); 4 | const async = require('async'); 5 | const tilebelt = require("@mapbox/tilebelt"); 6 | const geobuf = require('geobuf'); 7 | var Pbf = require('pbf'); 8 | var PbfTiles = require('./PbfTiles'); 9 | 10 | /** 11 | * Creates a new instance of ShpToPbfTiles. 12 | * @class 13 | * @returns An instance of ShpToPbfTiles. 14 | * @example 15 | * var instance = new ShpToPbfTiles(outDir); 16 | */ 17 | class ShpToPbfTiles extends PbfTiles { 18 | 19 | constructor(outDir) { 20 | super(outDir); 21 | }; 22 | 23 | toLatLon(projection, pair) { 24 | if (!projection) return pair; 25 | const id = pair.join('_'); 26 | if (!this.memoize[id]) { 27 | this.memoize[id] = proj4(projection).inverse(pair); 28 | } 29 | return this.memoize[id]; 30 | } 31 | 32 | processGeoJson(zip, tileZoom, callback) { 33 | this.shapeFileLoader = new ShapeFileLoader(); 34 | const tileGroups = []; 35 | 36 | this.shapeFileLoader.load(zip, (shape) => { 37 | return this.convertGeometry(this.shapeFileLoader.projection, shape); 38 | }, (bbox) => { 39 | const bigTile = tilebelt.bboxToTile(bbox); 40 | const bigTileBounds = this.bboxToBounds(bbox); 41 | let zoomTiles; 42 | if (bigTile[2] === tileZoom) { 43 | zoomTiles = [bigTile]; 44 | } else if (bigTile[2] > tileZoom) { 45 | zoomTiles = [this.getOuterTile(bigTile, tileZoom)]; 46 | } else { 47 | zoomTiles = this.getInnerTiles(bigTile, tileZoom); 48 | zoomTiles = zoomTiles.filter((tile) => { 49 | const bounds = this.bboxToBounds(tilebelt.tileToBBOX(tile)); 50 | return this.intersectsBounds(bounds, bigTileBounds); 51 | }); 52 | } 53 | 54 | console.log(`# zoom tiles : ${zoomTiles.length}`); 55 | zoomTiles.forEach((tile, i) => { 56 | let tileBounds = this.bboxToBounds(tilebelt.tileToBBOX(tile)); 57 | tileGroups.push({ 58 | tile: tile, 59 | bounds: tileBounds, 60 | boundsTest: (shapeBounds) => { 61 | return this.intersectsBounds(shapeBounds, tileBounds); 62 | } 63 | }); 64 | }); 65 | }, tileGroups, () => { 66 | const calls = []; 67 | tileGroups.forEach((tileGroup, i) => { 68 | calls.push((done) => { 69 | if (!tileGroup.geoJson) { 70 | done(); 71 | return; 72 | } 73 | if (tileGroup.geoJson.features.length > 0) { 74 | var buffer = geobuf.encode(tileGroup.geoJson, new Pbf()); 75 | const filePath = `${this.outDir}/${tileGroup.tile.join('_')}.proto`; 76 | // console.log(buffer.length); 77 | fs.writeFile(filePath, buffer, () => { 78 | console.log(`SAVED ${filePath}`); 79 | done(); 80 | }); 81 | } else { 82 | done();//a group with an empty geoJson must still signal completion or parallelLimit never fires 83 | } 84 | }); 85 | }); 86 | async.parallelLimit(calls, 4, function () { 87 | console.log('ALL SAVED'); 88 | callback(); 89 | }); 90 | }); 91 | 92 | } 93 | } 94 | 95 | module.exports = ShpToPbfTiles; 96 | --------------------------------------------------------------------------------
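Usage sketch (illustrative only): the zip path, output directory, and zoom level below are assumptions for demonstration, not files in this repo. ShpToPbfTiles reads a zipped shapefile, buckets its features into web-mercator tiles at the requested zoom, and writes one geobuf file per non-empty tile group:

const ShpToPbfTiles = require('./src/gis/ShpToPbfTiles');
const tiler = new ShpToPbfTiles('./out/pbf-tiles');//tiles are written as <x>_<y>_<z>.proto
tiler.processGeoJson('./data/census-blocks.zip', 11, () => {
    console.log('all tile groups written');
});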
/geo-png-db-processing/src/helpers/QueueProcessor.js: -------------------------------------------------------------------------------- 1 | // const process = require('process');//not technically needed (always available) 2 | // const request = require('request'); 3 | // const fs = require('fs'); 4 | // const os = require('os'); 5 | let AWS = require('aws-sdk'); 6 | AWS.config.update({region: 'us-east-1'}); 7 | const async = require('async'); 8 | 9 | /** 10 | * Creates a new instance of QueueProcessor. 11 | * @class 12 | * @returns An instance of QueueProcessor. 13 | * @example 14 | * var instance = new QueueProcessor(queueName, batchSize, baseInfo); 15 | */ 16 | class QueueProcessor { 17 | 18 | constructor(queueName, batchSize, baseInfo) { 19 | this.queueName = queueName; 20 | this.batchSize = batchSize; 21 | this.baseInfo = baseInfo; 22 | this.limitBatches = false; //false or number 23 | }; 24 | 25 | 26 | processTilesToQueue(queueInfo, callback) { 27 | const sqs = new AWS.SQS({apiVersion: '2012-11-05'}); 28 | 29 | const params = { 30 | DelaySeconds: 1, 31 | MessageAttributes: {}, 32 | MessageBody: JSON.stringify(queueInfo), 33 | QueueUrl: "https://sqs.us-east-1.amazonaws.com/780311654181/" + this.queueName //from https://console.aws.amazon.com/sqs/home?region=us-east-1 34 | }; 35 | 36 | sqs.sendMessage(params, function (err, data) { 37 | if (err) { 38 | console.log("Error", err); 39 | } else { 40 | console.log("Success", data.MessageId); 41 | } 42 | callback(err, data); 43 | }); 44 | }; 45 | 46 | validateAllTilesInABatch(tiles11, processBatches) { 47 | const allTiles = {}; 48 | tiles11.forEach((tile11, i) => { 49 | allTiles[tile11.join('_')] = {match: false}; 50 | }); 51 | 52 | processBatches.forEach((batch11, i) => { 53 | batch11.forEach((tile11, i) => { 54 | allTiles[tile11.join('_')].match = true; 55 | }); 56 | }); 57 | 58 | let matchCount = 0; 59 | Object.keys(allTiles).forEach((k) => { 60 | var matchTile = allTiles[k]; 61 | if (matchTile.match) { 62 | matchCount++; 63 | } else { 64 | console.log('BATCH SKIPPED ME: ' + k); 65 | } 66 | }); 67 | console.log(`matched ${matchCount} of ${tiles11.length}`); 68 | } 69 | 70 | processZ11ToQueue(tiles11, parallelLimit, callback) { 71 | let progress = {done: 0, num: 0, called: 0}; 72 | 73 | const calls = []; 74 | 75 | const tileBatches = []; 76 | let currentBatch = []; 77 | tileBatches.push(currentBatch); 78 | tiles11.forEach((tile11, i) => { 79 | if (currentBatch.length === this.batchSize) {//open a new batch before adding, so an exact multiple of batchSize never leaves an empty trailing batch 80 | currentBatch = []; 81 | tileBatches.push(currentBatch); 82 | } 83 | currentBatch.push(tile11); 84 | }); 85 | 86 | let processBatches; 87 | if (this.limitBatches) { 88 | processBatches = tileBatches.slice(0, this.limitBatches); 89 | } else { 90 | processBatches = tileBatches; 91 | } 92 | 93 | // this.validateAllTilesInABatch(tiles11, processBatches); 94 | 95 | processBatches.forEach((batch11, i) => { 96 | progress.num++; 97 | calls.push((done) => { 98 | progress.called++; 99 | let queueInfo = JSON.parse(JSON.stringify(this.baseInfo)); 100 | queueInfo.tiles = batch11; 101 | this.processTilesToQueue(queueInfo, (err, result) => { 102 | progress.done++; 103 | if (err) { 104 | console.error(err); 105 | } else { 106 | if (progress.done % 10 === 0) { 107 | console.log(`Pushed to queue ${progress.done} / ${progress.called}`); 108 | } 109 | } 110 | done(); 111 | }); 112 | 113 | }); 114 | 115 | }); 116 | let timerId = calls.length + ' calls'; 117 | console.time(timerId); 118 | //NOTE: for Lambda parallelLimit equates to concurrency of lambdas running in parallel 119 | async.parallelLimit(calls,
parallelLimit, () => { 120 | console.log('ALL DONE'); 121 | if (callback) callback(); 122 | }); 123 | 124 | } 125 | } 126 | 127 | module.exports = QueueProcessor; 128 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/metadata/CompileData.js: -------------------------------------------------------------------------------- 1 | const async = require('async'); 2 | 3 | /** 4 | * Creates a new instance of CompileData. 5 | * @class 6 | * @returns An instance of CompileData. 7 | * @example 8 | * var compileData = new CompileData(filer, tileDir, spannerDir); 9 | * compileData.deriveTotalPixels(); 10 | */ 11 | class CompileData { 12 | 13 | constructor(filer, tileDir, spannerDir, outDir) { 14 | this.filer = filer; 15 | this.tileDir = tileDir; 16 | this.spannerDir = spannerDir; 17 | this.outDir = outDir; 18 | }; 19 | 20 | processSpannerFiles(callback) { 21 | const {filer, spannerDir} = this; 22 | const compiledBlocks = {}; 23 | filer.readdir(spannerDir, function (err, items) { 24 | const calls = []; 25 | 26 | //TODO grab spanner ids and set count to 0 27 | for (let i = 0; i < items.length; i++) { 28 | let file = items[i]; 29 | const bits = file.split('.'); 30 | if (bits[bits.length - 1] === 'json') { 31 | calls.push((done) => { 32 | filer.readFile(`${spannerDir}${file}`, (err, data) => { 33 | if (err) throw err; 34 | let spannerIds = JSON.parse(data); 35 | spannerIds.forEach((id, i) => { 36 | compiledBlocks[id] = 0;//Note this may be set multiple times for same ID 37 | }); 38 | done(); 39 | }); 40 | }); 41 | } 42 | } 43 | 44 | console.log('Compiling data for ' + calls.length); 45 | let timerId = calls.length + ' calls'; 46 | console.time(timerId); 47 | async.parallelLimit(calls, 4, function () { 48 | console.log('ALL DONE'); 49 | console.timeEnd(timerId); 50 | callback(compiledBlocks); 51 | }); 52 | 53 | }); 54 | }; 55 | 56 | updateJsonFiles(compiledBlocks, callback) { 57 | const {filer, tileDir} = this; 58 | filer.readdir(tileDir, function (err, items) { 59 | const calls = []; 60 | 61 | for (let i = 0; i < items.length; i++) { 62 | let file = items[i]; 63 | const bits = file.split('.'); 64 | if (bits[bits.length - 1] === 'json') { 65 | calls.push((done) => { 66 | filer.readFile(`${tileDir}${file}`, (err, data) => { 67 | if (err) throw err; 68 | let colorLookup = JSON.parse(data); 69 | let hasUpdates = false; 70 | Object.keys(colorLookup).forEach((k) => { 71 | if (compiledBlocks[k] > 0) { 72 | colorLookup[k].totalPixels = compiledBlocks[k]; 73 | hasUpdates = true; 74 | } 75 | }); 76 | if (hasUpdates) { 77 | const filePath = `${tileDir}${file}`; 78 | filer.writeFile(filePath, JSON.stringify(colorLookup, null, 2), function (err) { 79 | if (err) { 80 | throw err; 81 | } else { 82 | // console.log('Saved ' + filePath); 83 | } 84 | done(); 85 | }); 86 | } else { 87 | done(); 88 | } 89 | }); 90 | }); 91 | } 92 | } 93 | 94 | console.log('updateJsonFiles for ' + calls.length); 95 | let timerId = calls.length + ' updateJsonFiles calls'; 96 | console.time(timerId); 97 | async.parallelLimit(calls, 4, function () { 98 | console.log('updateJsonFiles ALL DONE'); 99 | console.timeEnd(timerId); 100 | if (callback) callback(); 101 | }); 102 | 103 | }); 104 | }; 105 | 106 | sumPixelsByBlock(compiledBlocks, callback) { 107 | const {filer, tileDir} = this; 108 | filer.readdir(tileDir, function (err, items) { 109 | const calls = []; 110 | 111 | let matchCount = 0; 112 | 113 | for (let i = 0; i < items.length; i++) { 114 | let file = items[i]; 115 | const bits = 
file.split('.'); 116 | if (bits[bits.length - 1] === 'json') { 117 | calls.push((done) => { 118 | filer.readFile(`${tileDir}${file}`, (err, data) => { 119 | if (err) throw err; 120 | let colorLookup = JSON.parse(data); 121 | Object.keys(colorLookup).forEach((k) => { 122 | if (compiledBlocks.hasOwnProperty(k)) { 123 | matchCount++; 124 | compiledBlocks[k] += colorLookup[k].pixelCount; 125 | } 126 | }); 127 | done(); 128 | }); 129 | }); 130 | } 131 | } 132 | 133 | console.log('Compiling data for ' + calls.length); 134 | let timerId = calls.length + ' calls'; 135 | console.time(timerId); 136 | async.parallelLimit(calls, 4, function () { 137 | console.log('ALL DONE'); 138 | console.log('matchCount', matchCount); 139 | console.timeEnd(timerId); 140 | callback(); 141 | }); 142 | 143 | }); 144 | }; 145 | 146 | deriveTotalPixels(callback) { 147 | this.processSpannerFiles((compiledBlocks) => { 148 | this.sumPixelsByBlock(compiledBlocks, () => { 149 | this.updateJsonFiles(compiledBlocks, callback); 150 | }); 151 | }); 152 | } 153 | 154 | } 155 | 156 | module.exports = CompileData; -------------------------------------------------------------------------------- /geo-png-db-processing/src/metadata/CompileDataForTile.js: -------------------------------------------------------------------------------- 1 | const async = require('async'); 2 | const PbfTiles = require('../gis/PbfTiles'); 3 | const Utils = require('../Utils'); 4 | const KeySumStore = require('./KeySumStore'); 5 | const pbfTiles = new PbfTiles(); 6 | 7 | /** 8 | * Creates a new instance of CompileDataForTile. 9 | * @class 10 | * @returns An instance of CompileDataForTile. 11 | * @example 12 | * var compileData = new CompileDataForTile(filer, tileDir, spannerDir); 13 | * compileData.deriveTotalPixels(); 14 | */ 15 | class CompileDataForTile { 16 | 17 | constructor(filer, tileDir, spannerDir, tile, table) { 18 | this.filer = filer; 19 | this.tileDir = tileDir; 20 | this.spannerDir = spannerDir; 21 | this.tile = tile; 22 | this.keySumStore = new KeySumStore(table); 23 | }; 24 | 25 | updateCount(blockId, value, tile, callback) { 26 | this.keySumStore.add(blockId, value, tile[0] + '_' + tile[1], callback); 27 | } 28 | 29 | processZ11Tile(tile, callback) {//meta/json/census-blocks/spans/11/100_451_11.json 30 | const {filer, tileDir, spannerDir} = this; 31 | //NOTE: tileDir is only partially processed, some tiles may not be present 32 | const file = tile.join('_') + '.json'; 33 | filer.readFile(`${tileDir}${file}`, (err, data) => { 34 | if (err) { 35 | callback(); 36 | return; 37 | } 38 | let colorLookup = JSON.parse(data); 39 | filer.readFile(`${spannerDir}${file}`, (err, data) => { 40 | if (err) { 41 | callback(); 42 | return; 43 | } 44 | const calls = []; 45 | let spannerIds = JSON.parse(data); 46 | spannerIds.forEach((id, i) => { 47 | if (colorLookup[id]) { 48 | calls.push((done) => { 49 | this.updateCount(id, colorLookup[id].pixelCounts, tile, done); 50 | }); 51 | } 52 | }); 53 | async.parallelLimit(calls, 4, () => { 54 | // console.log('Z11 DONE ' + tile.join('_')); 55 | callback('Updates: ' + calls.length); 56 | }); 57 | }); 58 | }); 59 | } 60 | 61 | deriveTotalPixels(callback) { 62 | const zoomTiles = pbfTiles.getInnerTiles(this.tile, 11); 63 | const calls = []; 64 | let tileId = this.tile.join('_'); 65 | 66 | const infos = []; 67 | infos.push(tileId); 68 | zoomTiles.forEach((tile, i) => { 69 | calls.push((done) => { 70 | this.processZ11Tile(tile, (info) => { 71 | infos.push(info); 72 | done(); 73 | }); 74 | }) 75 | }); 76 | 77 | 
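// fan out over the z11 children of this tile; each call pushes pixel counts for spanner blocks into DynamoDB via KeySumStore, with at most 4 updates in flight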
async.parallelLimit(calls, 4, () => { 78 | callback(infos.join('|')); 79 | }); 80 | } 81 | 82 | } 83 | 84 | module.exports = CompileDataForTile; -------------------------------------------------------------------------------- /geo-png-db-processing/src/metadata/KeySumStore.js: -------------------------------------------------------------------------------- 1 | const AWS = require('aws-sdk'); 2 | // Set the region 3 | AWS.config.update({region: 'us-east-1'}); 4 | 5 | const apiVersion = '2012-08-10'; 6 | const ddb = new AWS.DynamoDB({apiVersion: apiVersion}); 7 | const documentClient = new AWS.DynamoDB.DocumentClient({}); 8 | 9 | 10 | /** 11 | * Creates a new instance of KeySumStore. 12 | * @class 13 | * @returns An instance of KeySumStore. 14 | * @example 15 | * const keySumStore = new KeySumStore(table); 16 | * keySumStore.iterateAll(...); 17 | */ 18 | class KeySumStore { 19 | constructor(table) { 20 | this.table = table; 21 | }; 22 | 23 | queryTile(tileXY) { 24 | 25 | const params = { 26 | TableName: this.table, 27 | FilterExpression: "contains(#Tiles, :Tile)", 28 | ExpressionAttributeNames: {"#Tiles": "TILES"}, 29 | ExpressionAttributeValues: {":Tile": tileXY} 30 | }; 31 | //TODO 32 | } 33 | 34 | add(blockId, value, tileXY, callback) { 35 | const params = { 36 | TableName: this.table, 37 | Key: { 38 | 'BLOCK_ID': blockId, 39 | }, 40 | UpdateExpression: 'ADD #TotalPixels :Pixels SET #Tiles = list_append(if_not_exists(#Tiles, :empty_list), :Tile)', 41 | ExpressionAttributeNames: { 42 | '#TotalPixels': 'TOTAL_PIXELS', 43 | '#Tiles': 'TILES', 44 | }, 45 | ExpressionAttributeValues: { 46 | ':Pixels': value, 47 | ':Tile': [tileXY], 48 | ":empty_list": [] 49 | }, 50 | }; 51 | 52 | this.updatePersistently(10, 1000, params, callback); 53 | } 54 | 55 | updatePersistently(numTries, timeout, params, callback) { 56 | if (numTries === 0) { 57 | console.log(`FATAL Error ${params.Key.BLOCK_ID} : Could not update (given up)`); 58 | callback(); 59 | return; 60 | } 61 | documentClient.update(params, (err, data) => { 62 | if (err) { 63 | if (err.code === 'ThrottlingException') { 64 | console.log(`ThrottlingException... ${params.Key.BLOCK_ID} trying again (${numTries} left) in 1000ms`); 65 | setTimeout(() => { 66 | this.updatePersistently(numTries - 1, timeout, params, callback); 67 | }, timeout); 68 | return; 69 | } 70 | console.log("Error", err); 71 | } else { 72 | // console.log("Success", data); 73 | } 74 | if (callback) callback(); 75 | }); 76 | } 77 | 78 | iterateAll(callback) { 79 | const scanDb = (items, exclusiveStartKey) => { 80 | const params = { 81 | TableName: this.table, 82 | }; 83 | if (exclusiveStartKey) { 84 | params.ExclusiveStartKey = exclusiveStartKey; 85 | } 86 | 87 | documentClient.scan(params, function (err, data) { 88 | if (err) { 89 | console.error("Unable to query. 
Error:", JSON.stringify(err, null, 2)); 90 | } else { 91 | console.log("Scan succeeded: " + data.Count + ' / ' + data.ScannedCount); 92 | data.Items.forEach(function (item) { 93 | items.push(item); 94 | 95 | }); 96 | if (data.LastEvaluatedKey) { 97 | scanDb(items, data.LastEvaluatedKey); 98 | } else { 99 | if (callback) callback(items); 100 | } 101 | } 102 | }); 103 | }; 104 | scanDb([], null); 105 | } 106 | } 107 | 108 | module.exports = KeySumStore; -------------------------------------------------------------------------------- /geo-png-db-processing/src/proto/BlockRecord.js: -------------------------------------------------------------------------------- 1 | 'use strict'; // code generated by pbf v3.2.0 2 | 3 | // BlockRecords ======================================== 4 | 5 | var BlockRecords = exports.BlockRecords = {}; 6 | 7 | BlockRecords.read = function (pbf, end) { 8 | return pbf.readFields(BlockRecords._readField, {blocks: []}, end); 9 | }; 10 | BlockRecords._readField = function (tag, obj, pbf) { 11 | if (tag === 1) obj.blocks.push(BlockRecords.BlockRecord.read(pbf, pbf.readVarint() + pbf.pos)); 12 | }; 13 | BlockRecords.write = function (obj, pbf) { 14 | if (obj.blocks) for (var i = 0; i < obj.blocks.length; i++) pbf.writeMessage(1, BlockRecords.BlockRecord.write, obj.blocks[i]); 15 | }; 16 | 17 | // BlockRecords.BlockRecord ======================================== 18 | 19 | BlockRecords.BlockRecord = {}; 20 | 21 | BlockRecords.BlockRecord.read = function (pbf, end) { 22 | return pbf.readFields(BlockRecords.BlockRecord._readField, {geoId: "", population: 0, housingUnits: null, race: null, age: null}, end); 23 | }; 24 | BlockRecords.BlockRecord._readField = function (tag, obj, pbf) { 25 | if (tag === 1) obj.geoId = pbf.readString(); 26 | else if (tag === 2) obj.population = pbf.readVarint(true); 27 | else if (tag === 3) obj.housingUnits = BlockRecords.BlockRecord.HousingUnits.read(pbf, pbf.readVarint() + pbf.pos); 28 | else if (tag === 4) obj.race = BlockRecords.BlockRecord.Race.read(pbf, pbf.readVarint() + pbf.pos); 29 | else if (tag === 5) obj.age = BlockRecords.BlockRecord.Age.read(pbf, pbf.readVarint() + pbf.pos); 30 | }; 31 | BlockRecords.BlockRecord.write = function (obj, pbf) { 32 | if (obj.geoId) pbf.writeStringField(1, obj.geoId); 33 | if (obj.population) pbf.writeVarintField(2, obj.population); 34 | if (obj.housingUnits) pbf.writeMessage(3, BlockRecords.BlockRecord.HousingUnits.write, obj.housingUnits); 35 | if (obj.race) pbf.writeMessage(4, BlockRecords.BlockRecord.Race.write, obj.race); 36 | if (obj.age) pbf.writeMessage(5, BlockRecords.BlockRecord.Age.write, obj.age); 37 | }; 38 | 39 | // BlockRecords.BlockRecord.HousingUnits ======================================== 40 | 41 | BlockRecords.BlockRecord.HousingUnits = {}; 42 | 43 | BlockRecords.BlockRecord.HousingUnits.read = function (pbf, end) { 44 | return pbf.readFields(BlockRecords.BlockRecord.HousingUnits._readField, {total: 0, occupied: 0, vacant: 0}, end); 45 | }; 46 | BlockRecords.BlockRecord.HousingUnits._readField = function (tag, obj, pbf) { 47 | if (tag === 1) obj.total = pbf.readVarint(true); 48 | else if (tag === 2) obj.occupied = pbf.readVarint(true); 49 | else if (tag === 3) obj.vacant = pbf.readVarint(true); 50 | }; 51 | BlockRecords.BlockRecord.HousingUnits.write = function (obj, pbf) { 52 | if (obj.total) pbf.writeVarintField(1, obj.total); 53 | if (obj.occupied) pbf.writeVarintField(2, obj.occupied); 54 | if (obj.vacant) pbf.writeVarintField(3, obj.vacant); 55 | }; 56 | 57 | // 
BlockRecords.BlockRecord.Race ======================================== 58 | 59 | BlockRecords.BlockRecord.Race = {}; 60 | 61 | BlockRecords.BlockRecord.Race.read = function (pbf, end) { 62 | return pbf.readFields(BlockRecords.BlockRecord.Race._readField, {total: 0, white: 0, black: 0, asian: 0, native: 0, other: 0, hispanic: 0}, end); 63 | }; 64 | BlockRecords.BlockRecord.Race._readField = function (tag, obj, pbf) { 65 | if (tag === 1) obj.total = pbf.readVarint(true); 66 | else if (tag === 2) obj.white = pbf.readVarint(true); 67 | else if (tag === 3) obj.black = pbf.readVarint(true); 68 | else if (tag === 4) obj.asian = pbf.readVarint(true); 69 | else if (tag === 5) obj.native = pbf.readVarint(true); 70 | else if (tag === 6) obj.other = pbf.readVarint(true); 71 | else if (tag === 7) obj.hispanic = pbf.readVarint(true); 72 | }; 73 | BlockRecords.BlockRecord.Race.write = function (obj, pbf) { 74 | if (obj.total) pbf.writeVarintField(1, obj.total); 75 | if (obj.white) pbf.writeVarintField(2, obj.white); 76 | if (obj.black) pbf.writeVarintField(3, obj.black); 77 | if (obj.asian) pbf.writeVarintField(4, obj.asian); 78 | if (obj.native) pbf.writeVarintField(5, obj.native); 79 | if (obj.other) pbf.writeVarintField(6, obj.other); 80 | if (obj.hispanic) pbf.writeVarintField(7, obj.hispanic); 81 | }; 82 | 83 | // BlockRecords.BlockRecord.Age ======================================== 84 | 85 | BlockRecords.BlockRecord.Age = {}; 86 | 87 | BlockRecords.BlockRecord.Age.read = function (pbf, end) { 88 | return pbf.readFields(BlockRecords.BlockRecord.Age._readField, {u10: 0, u18: 0, u30: 0, u65: 0, over65: 0}, end); 89 | }; 90 | BlockRecords.BlockRecord.Age._readField = function (tag, obj, pbf) { 91 | if (tag === 1) obj.u10 = pbf.readVarint(true); 92 | else if (tag === 2) obj.u18 = pbf.readVarint(true); 93 | else if (tag === 3) obj.u30 = pbf.readVarint(true); 94 | else if (tag === 4) obj.u65 = pbf.readVarint(true); 95 | else if (tag === 5) obj.over65 = pbf.readVarint(true); 96 | }; 97 | BlockRecords.BlockRecord.Age.write = function (obj, pbf) { 98 | if (obj.u10) pbf.writeVarintField(1, obj.u10); 99 | if (obj.u18) pbf.writeVarintField(2, obj.u18); 100 | if (obj.u30) pbf.writeVarintField(3, obj.u30); 101 | if (obj.u65) pbf.writeVarintField(4, obj.u65); 102 | if (obj.over65) pbf.writeVarintField(5, obj.over65); 103 | }; 104 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/proto/BlockRecord.proto: -------------------------------------------------------------------------------- 1 | message BlockRecords { 2 | message BlockRecord { 3 | required string geoId = 1; 4 | required int32 population = 2; 5 | optional HousingUnits housingUnits = 3; 6 | optional Race race = 4; 7 | optional Age age = 5; 8 | 9 | message HousingUnits { 10 | optional int32 total = 1; 11 | optional int32 occupied = 2; 12 | optional int32 vacant = 3; 13 | } 14 | message Race { 15 | optional int32 total = 1; 16 | optional int32 white = 2; 17 | optional int32 black = 3; 18 | optional int32 asian = 4; 19 | optional int32 native = 5; 20 | optional int32 other = 6; 21 | optional int32 hispanic = 7; 22 | } 23 | message Age { 24 | optional int32 u10 = 1; 25 | optional int32 u18 = 2; 26 | optional int32 u30 = 3; 27 | optional int32 u65 = 4; 28 | optional int32 over65 = 5; 29 | } 30 | } 31 | repeated BlockRecord blocks = 1; 32 | } -------------------------------------------------------------------------------- /geo-png-db-processing/src/proto/LodesRecord.js: 
-------------------------------------------------------------------------------- 1 | 'use strict'; // code generated by pbf v3.2.0 2 | 3 | // LodesRecords ======================================== 4 | 5 | var LodesRecords = exports.LodesRecords = {}; 6 | 7 | LodesRecords.read = function (pbf, end) { 8 | return pbf.readFields(LodesRecords._readField, {blocks: []}, end); 9 | }; 10 | LodesRecords._readField = function (tag, obj, pbf) { 11 | if (tag === 1) obj.blocks.push(LodesRecords.LodesRecord.read(pbf, pbf.readVarint() + pbf.pos)); 12 | }; 13 | LodesRecords.write = function (obj, pbf) { 14 | if (obj.blocks) for (var i = 0; i < obj.blocks.length; i++) pbf.writeMessage(1, LodesRecords.LodesRecord.write, obj.blocks[i]); 15 | }; 16 | 17 | // LodesRecords.LodesRecord ======================================== 18 | 19 | LodesRecords.LodesRecord = {}; 20 | 21 | LodesRecords.LodesRecord.read = function (pbf, end) { 22 | return pbf.readFields(LodesRecords.LodesRecord._readField, {geoId: "", jobs: 0, age: null, income: null, naics: null, race: null, ethnicity: null, education: null, sex: null, firmAge: null, firmSize: null}, end); 23 | }; 24 | LodesRecords.LodesRecord._readField = function (tag, obj, pbf) { 25 | if (tag === 1) obj.geoId = pbf.readString(); 26 | else if (tag === 2) obj.jobs = pbf.readVarint(true); 27 | else if (tag === 3) obj.age = LodesRecords.LodesRecord.Age.read(pbf, pbf.readVarint() + pbf.pos); 28 | else if (tag === 4) obj.income = LodesRecords.LodesRecord.Income.read(pbf, pbf.readVarint() + pbf.pos); 29 | else if (tag === 5) obj.naics = LodesRecords.LodesRecord.Naics.read(pbf, pbf.readVarint() + pbf.pos); 30 | else if (tag === 6) obj.race = LodesRecords.LodesRecord.Race.read(pbf, pbf.readVarint() + pbf.pos); 31 | else if (tag === 7) obj.ethnicity = LodesRecords.LodesRecord.Ethnicity.read(pbf, pbf.readVarint() + pbf.pos); 32 | else if (tag === 8) obj.education = LodesRecords.LodesRecord.Education.read(pbf, pbf.readVarint() + pbf.pos); 33 | else if (tag === 9) obj.sex = LodesRecords.LodesRecord.Sex.read(pbf, pbf.readVarint() + pbf.pos); 34 | else if (tag === 10) obj.firmAge = LodesRecords.LodesRecord.FirmAge.read(pbf, pbf.readVarint() + pbf.pos); 35 | else if (tag === 11) obj.firmSize = LodesRecords.LodesRecord.FirmSize.read(pbf, pbf.readVarint() + pbf.pos); 36 | }; 37 | LodesRecords.LodesRecord.write = function (obj, pbf) { 38 | if (obj.geoId) pbf.writeStringField(1, obj.geoId); 39 | if (obj.jobs) pbf.writeVarintField(2, obj.jobs); 40 | if (obj.age) pbf.writeMessage(3, LodesRecords.LodesRecord.Age.write, obj.age); 41 | if (obj.income) pbf.writeMessage(4, LodesRecords.LodesRecord.Income.write, obj.income); 42 | if (obj.naics) pbf.writeMessage(5, LodesRecords.LodesRecord.Naics.write, obj.naics); 43 | if (obj.race) pbf.writeMessage(6, LodesRecords.LodesRecord.Race.write, obj.race); 44 | if (obj.ethnicity) pbf.writeMessage(7, LodesRecords.LodesRecord.Ethnicity.write, obj.ethnicity); 45 | if (obj.education) pbf.writeMessage(8, LodesRecords.LodesRecord.Education.write, obj.education); 46 | if (obj.sex) pbf.writeMessage(9, LodesRecords.LodesRecord.Sex.write, obj.sex); 47 | if (obj.firmAge) pbf.writeMessage(10, LodesRecords.LodesRecord.FirmAge.write, obj.firmAge); 48 | if (obj.firmSize) pbf.writeMessage(11, LodesRecords.LodesRecord.FirmSize.write, obj.firmSize); 49 | }; 50 | 51 | // LodesRecords.LodesRecord.Age ======================================== 52 | 53 | LodesRecords.LodesRecord.Age = {}; 54 | 55 | LodesRecords.LodesRecord.Age.read = function (pbf, end) { 56 | return 
pbf.readFields(LodesRecords.LodesRecord.Age._readField, {u30: 0, u55: 0, o55: 0}, end); 57 | }; 58 | LodesRecords.LodesRecord.Age._readField = function (tag, obj, pbf) { 59 | if (tag === 1) obj.u30 = pbf.readVarint(true); 60 | else if (tag === 2) obj.u55 = pbf.readVarint(true); 61 | else if (tag === 3) obj.o55 = pbf.readVarint(true); 62 | }; 63 | LodesRecords.LodesRecord.Age.write = function (obj, pbf) { 64 | if (obj.u30) pbf.writeVarintField(1, obj.u30); 65 | if (obj.u55) pbf.writeVarintField(2, obj.u55); 66 | if (obj.o55) pbf.writeVarintField(3, obj.o55); 67 | }; 68 | 69 | // LodesRecords.LodesRecord.Income ======================================== 70 | 71 | LodesRecords.LodesRecord.Income = {}; 72 | 73 | LodesRecords.LodesRecord.Income.read = function (pbf, end) { 74 | return pbf.readFields(LodesRecords.LodesRecord.Income._readField, {u1250: 0, u3333: 0, o3333: 0}, end); 75 | }; 76 | LodesRecords.LodesRecord.Income._readField = function (tag, obj, pbf) { 77 | if (tag === 1) obj.u1250 = pbf.readVarint(true); 78 | else if (tag === 2) obj.u3333 = pbf.readVarint(true); 79 | else if (tag === 3) obj.o3333 = pbf.readVarint(true); 80 | }; 81 | LodesRecords.LodesRecord.Income.write = function (obj, pbf) { 82 | if (obj.u1250) pbf.writeVarintField(1, obj.u1250); 83 | if (obj.u3333) pbf.writeVarintField(2, obj.u3333); 84 | if (obj.o3333) pbf.writeVarintField(3, obj.o3333); 85 | }; 86 | 87 | // LodesRecords.LodesRecord.Naics ======================================== 88 | 89 | LodesRecords.LodesRecord.Naics = {}; 90 | 91 | LodesRecords.LodesRecord.Naics.read = function (pbf, end) { 92 | return pbf.readFields(LodesRecords.LodesRecord.Naics._readField, {ag: 0, min: 0, util: 0, cons: 0, mfg: 0, whole: 0, retail: 0, trans: 0, info: 0, fin: 0, re: 0, tech: 0, mgmt: 0, waste: 0, ed: 0, health: 0, rec: 0, food: 0, other: 0, pa: 0}, end); 93 | }; 94 | LodesRecords.LodesRecord.Naics._readField = function (tag, obj, pbf) { 95 | if (tag === 1) obj.ag = pbf.readVarint(true); 96 | else if (tag === 2) obj.min = pbf.readVarint(true); 97 | else if (tag === 3) obj.util = pbf.readVarint(true); 98 | else if (tag === 4) obj.cons = pbf.readVarint(true); 99 | else if (tag === 5) obj.mfg = pbf.readVarint(true); 100 | else if (tag === 6) obj.whole = pbf.readVarint(true); 101 | else if (tag === 7) obj.retail = pbf.readVarint(true); 102 | else if (tag === 8) obj.trans = pbf.readVarint(true); 103 | else if (tag === 9) obj.info = pbf.readVarint(true); 104 | else if (tag === 10) obj.fin = pbf.readVarint(true); 105 | else if (tag === 11) obj.re = pbf.readVarint(true); 106 | else if (tag === 12) obj.tech = pbf.readVarint(true); 107 | else if (tag === 13) obj.mgmt = pbf.readVarint(true); 108 | else if (tag === 14) obj.waste = pbf.readVarint(true); 109 | else if (tag === 15) obj.ed = pbf.readVarint(true); 110 | else if (tag === 16) obj.health = pbf.readVarint(true); 111 | else if (tag === 17) obj.rec = pbf.readVarint(true); 112 | else if (tag === 18) obj.food = pbf.readVarint(true); 113 | else if (tag === 19) obj.other = pbf.readVarint(true); 114 | else if (tag === 20) obj.pa = pbf.readVarint(true); 115 | }; 116 | LodesRecords.LodesRecord.Naics.write = function (obj, pbf) { 117 | if (obj.ag) pbf.writeVarintField(1, obj.ag); 118 | if (obj.min) pbf.writeVarintField(2, obj.min); 119 | if (obj.util) pbf.writeVarintField(3, obj.util); 120 | if (obj.cons) pbf.writeVarintField(4, obj.cons); 121 | if (obj.mfg) pbf.writeVarintField(5, obj.mfg); 122 | if (obj.whole) pbf.writeVarintField(6, obj.whole); 123 | if (obj.retail) 
pbf.writeVarintField(7, obj.retail); 124 | if (obj.trans) pbf.writeVarintField(8, obj.trans); 125 | if (obj.info) pbf.writeVarintField(9, obj.info); 126 | if (obj.fin) pbf.writeVarintField(10, obj.fin); 127 | if (obj.re) pbf.writeVarintField(11, obj.re); 128 | if (obj.tech) pbf.writeVarintField(12, obj.tech); 129 | if (obj.mgmt) pbf.writeVarintField(13, obj.mgmt); 130 | if (obj.waste) pbf.writeVarintField(14, obj.waste); 131 | if (obj.ed) pbf.writeVarintField(15, obj.ed); 132 | if (obj.health) pbf.writeVarintField(16, obj.health); 133 | if (obj.rec) pbf.writeVarintField(17, obj.rec); 134 | if (obj.food) pbf.writeVarintField(18, obj.food); 135 | if (obj.other) pbf.writeVarintField(19, obj.other); 136 | if (obj.pa) pbf.writeVarintField(20, obj.pa); 137 | }; 138 | 139 | // LodesRecords.LodesRecord.Race ======================================== 140 | 141 | LodesRecords.LodesRecord.Race = {}; 142 | 143 | LodesRecords.LodesRecord.Race.read = function (pbf, end) { 144 | return pbf.readFields(LodesRecords.LodesRecord.Race._readField, {white: 0, black: 0, native: 0, asian: 0, pacific: 0, twoOrMore: 0}, end); 145 | }; 146 | LodesRecords.LodesRecord.Race._readField = function (tag, obj, pbf) { 147 | if (tag === 1) obj.white = pbf.readVarint(true); 148 | else if (tag === 2) obj.black = pbf.readVarint(true); 149 | else if (tag === 3) obj.native = pbf.readVarint(true); 150 | else if (tag === 4) obj.asian = pbf.readVarint(true); 151 | else if (tag === 5) obj.pacific = pbf.readVarint(true); 152 | else if (tag === 6) obj.twoOrMore = pbf.readVarint(true); 153 | }; 154 | LodesRecords.LodesRecord.Race.write = function (obj, pbf) { 155 | if (obj.white) pbf.writeVarintField(1, obj.white); 156 | if (obj.black) pbf.writeVarintField(2, obj.black); 157 | if (obj.native) pbf.writeVarintField(3, obj.native); 158 | if (obj.asian) pbf.writeVarintField(4, obj.asian); 159 | if (obj.pacific) pbf.writeVarintField(5, obj.pacific); 160 | if (obj.twoOrMore) pbf.writeVarintField(6, obj.twoOrMore); 161 | }; 162 | 163 | // LodesRecords.LodesRecord.Ethnicity ======================================== 164 | 165 | LodesRecords.LodesRecord.Ethnicity = {}; 166 | 167 | LodesRecords.LodesRecord.Ethnicity.read = function (pbf, end) { 168 | return pbf.readFields(LodesRecords.LodesRecord.Ethnicity._readField, {nonHispanic: 0, hispanic: 0}, end); 169 | }; 170 | LodesRecords.LodesRecord.Ethnicity._readField = function (tag, obj, pbf) { 171 | if (tag === 1) obj.nonHispanic = pbf.readVarint(true); 172 | else if (tag === 2) obj.hispanic = pbf.readVarint(true); 173 | }; 174 | LodesRecords.LodesRecord.Ethnicity.write = function (obj, pbf) { 175 | if (obj.nonHispanic) pbf.writeVarintField(1, obj.nonHispanic); 176 | if (obj.hispanic) pbf.writeVarintField(2, obj.hispanic); 177 | }; 178 | 179 | // LodesRecords.LodesRecord.Education ======================================== 180 | 181 | LodesRecords.LodesRecord.Education = {}; 182 | 183 | LodesRecords.LodesRecord.Education.read = function (pbf, end) { 184 | return pbf.readFields(LodesRecords.LodesRecord.Education._readField, {ltHs: 0, hs: 0, coll: 0, ba: 0}, end); 185 | }; 186 | LodesRecords.LodesRecord.Education._readField = function (tag, obj, pbf) { 187 | if (tag === 1) obj.ltHs = pbf.readVarint(true); 188 | else if (tag === 2) obj.hs = pbf.readVarint(true); 189 | else if (tag === 3) obj.coll = pbf.readVarint(true); 190 | else if (tag === 4) obj.ba = pbf.readVarint(true); 191 | }; 192 | LodesRecords.LodesRecord.Education.write = function (obj, pbf) { 193 | if (obj.ltHs) pbf.writeVarintField(1, 
obj.ltHs); 194 | if (obj.hs) pbf.writeVarintField(2, obj.hs); 195 | if (obj.coll) pbf.writeVarintField(3, obj.coll); 196 | if (obj.ba) pbf.writeVarintField(4, obj.ba); 197 | }; 198 | 199 | // LodesRecords.LodesRecord.Sex ======================================== 200 | 201 | LodesRecords.LodesRecord.Sex = {}; 202 | 203 | LodesRecords.LodesRecord.Sex.read = function (pbf, end) { 204 | return pbf.readFields(LodesRecords.LodesRecord.Sex._readField, {m: 0, f: 0}, end); 205 | }; 206 | LodesRecords.LodesRecord.Sex._readField = function (tag, obj, pbf) { 207 | if (tag === 1) obj.m = pbf.readVarint(true); 208 | else if (tag === 2) obj.f = pbf.readVarint(true); 209 | }; 210 | LodesRecords.LodesRecord.Sex.write = function (obj, pbf) { 211 | if (obj.m) pbf.writeVarintField(1, obj.m); 212 | if (obj.f) pbf.writeVarintField(2, obj.f); 213 | }; 214 | 215 | // LodesRecords.LodesRecord.FirmAge ======================================== 216 | 217 | LodesRecords.LodesRecord.FirmAge = {}; 218 | 219 | LodesRecords.LodesRecord.FirmAge.read = function (pbf, end) { 220 | return pbf.readFields(LodesRecords.LodesRecord.FirmAge._readField, {u2: 0, u4: 0, u6: 0, u11: 0, g11: 0}, end); 221 | }; 222 | LodesRecords.LodesRecord.FirmAge._readField = function (tag, obj, pbf) { 223 | if (tag === 1) obj.u2 = pbf.readVarint(true); 224 | else if (tag === 2) obj.u4 = pbf.readVarint(true); 225 | else if (tag === 3) obj.u6 = pbf.readVarint(true); 226 | else if (tag === 4) obj.u11 = pbf.readVarint(true); 227 | else if (tag === 5) obj.g11 = pbf.readVarint(true); 228 | }; 229 | LodesRecords.LodesRecord.FirmAge.write = function (obj, pbf) { 230 | if (obj.u2) pbf.writeVarintField(1, obj.u2); 231 | if (obj.u4) pbf.writeVarintField(2, obj.u4); 232 | if (obj.u6) pbf.writeVarintField(3, obj.u6); 233 | if (obj.u11) pbf.writeVarintField(4, obj.u11); 234 | if (obj.g11) pbf.writeVarintField(5, obj.g11); 235 | }; 236 | 237 | // LodesRecords.LodesRecord.FirmSize ======================================== 238 | 239 | LodesRecords.LodesRecord.FirmSize = {}; 240 | 241 | LodesRecords.LodesRecord.FirmSize.read = function (pbf, end) { 242 | return pbf.readFields(LodesRecords.LodesRecord.FirmSize._readField, {u20: 0, u50: 0, u250: 0, u500: 0, g500: 0}, end); 243 | }; 244 | LodesRecords.LodesRecord.FirmSize._readField = function (tag, obj, pbf) { 245 | if (tag === 1) obj.u20 = pbf.readVarint(true); 246 | else if (tag === 2) obj.u50 = pbf.readVarint(true); 247 | else if (tag === 3) obj.u250 = pbf.readVarint(true); 248 | else if (tag === 4) obj.u500 = pbf.readVarint(true); 249 | else if (tag === 5) obj.g500 = pbf.readVarint(true); 250 | }; 251 | LodesRecords.LodesRecord.FirmSize.write = function (obj, pbf) { 252 | if (obj.u20) pbf.writeVarintField(1, obj.u20); 253 | if (obj.u50) pbf.writeVarintField(2, obj.u50); 254 | if (obj.u250) pbf.writeVarintField(3, obj.u250); 255 | if (obj.u500) pbf.writeVarintField(4, obj.u500); 256 | if (obj.g500) pbf.writeVarintField(5, obj.g500); 257 | }; 258 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/proto/LodesRecord.proto: -------------------------------------------------------------------------------- 1 | message LodesRecords { 2 | message LodesRecord { 3 | required string geoId = 1; 4 | required int32 jobs = 2; 5 | optional Age age = 3; 6 | optional Income income = 4; 7 | optional Naics naics = 5; 8 | optional Race race = 6; 9 | optional Ethnicity ethnicity = 7; 10 | optional Education education = 8; 11 | optional Sex sex = 9; 12 | optional FirmAge firmAge = 10; 
13 | optional FirmSize firmSize = 11; 14 | 15 | message Age { 16 | optional int32 u30 = 1; 17 | optional int32 u55 = 2; 18 | optional int32 o55 = 3; 19 | } 20 | 21 | message Income { 22 | optional int32 u1250 = 1; 23 | optional int32 u3333 = 2; 24 | optional int32 o3333 = 3; 25 | } 26 | 27 | message Naics { 28 | optional int32 ag = 1; 29 | optional int32 min = 2; 30 | optional int32 util = 3; 31 | optional int32 cons = 4; 32 | optional int32 mfg = 5; 33 | optional int32 whole = 6; 34 | optional int32 retail = 7; 35 | optional int32 trans = 8; 36 | optional int32 info = 9; 37 | optional int32 fin = 10; 38 | optional int32 re = 11; 39 | optional int32 tech = 12; 40 | optional int32 mgmt = 13; 41 | optional int32 waste = 14; 42 | optional int32 ed = 15; 43 | optional int32 health = 16; 44 | optional int32 rec = 17; 45 | optional int32 food = 18; 46 | optional int32 other = 19; 47 | optional int32 pa = 20; 48 | } 49 | 50 | message Race { 51 | optional int32 white = 1; 52 | optional int32 black = 2; 53 | optional int32 native = 3; 54 | optional int32 asian = 4; 55 | optional int32 pacific = 5; 56 | optional int32 twoOrMore = 6; 57 | } 58 | 59 | message Ethnicity { 60 | optional int32 nonHispanic = 1; 61 | optional int32 hispanic = 2; 62 | } 63 | 64 | message Education { 65 | optional int32 ltHs = 1; 66 | optional int32 hs = 2; 67 | optional int32 coll = 3; 68 | optional int32 ba = 4; 69 | } 70 | 71 | message Sex { 72 | optional int32 m = 1; 73 | optional int32 f = 2; 74 | } 75 | 76 | message FirmAge { 77 | optional int32 u2 = 1; 78 | optional int32 u4 = 2; 79 | optional int32 u6 = 3; 80 | optional int32 u11 = 4; 81 | optional int32 g11 = 5; 82 | } 83 | 84 | message FirmSize { 85 | optional int32 u20 = 1; 86 | optional int32 u50 = 2; 87 | optional int32 u250 = 3; 88 | optional int32 u500 = 4; 89 | optional int32 g500 = 5; 90 | } 91 | } 92 | repeated LodesRecord blocks = 1; 93 | } -------------------------------------------------------------------------------- /geo-png-db-processing/src/proto/Notes.txt: -------------------------------------------------------------------------------- 1 | Run in terminal: 2 | pbf BlockRecord.proto > BlockRecord.js 3 | 4 | pbf LodesRecord.proto > LodesRecord.js -------------------------------------------------------------------------------- /geo-png-db-processing/src/rasterize/RasterTileSet.js: -------------------------------------------------------------------------------- 1 | const async = require('async'); 2 | const geobuf = require('geobuf'); 3 | const Pbf = require('pbf'); 4 | const PbfTiles = require('../gis/PbfTiles'); 5 | const Rasterizer = require('./Rasterizer'); 6 | const SteppedColors = require('./SteppedColors'); 7 | const RasterizeInPoly = require('./RasterizeInPoly'); 8 | 9 | /** 10 | * Creates a new instance of RasterTileSet. 11 | * @class 12 | * @returns An instance of RasterTileSet. 13 | * @example 14 | * var instance = new RasterTileSet(); 15 | */ 16 | class RasterTileSet { 17 | 18 | constructor(filer, pbfDir, usePointInPolyRasterizer, outDir = './data/') { 19 | this.filer = filer; 20 | this.pbfDir = pbfDir; 21 | this.outDir = outDir; 22 | this.pbfTiles = new PbfTiles(); 23 | 24 | this.unfinishedWork = []; 25 | this.steppedColors = new SteppedColors(); 26 | 27 | this.usePointInPolyRasterizer = usePointInPolyRasterizer; 28 | this.maxTime = 20;//seconds (Note Max API Gateway timeout is 30s) we could still exceed this if z13 tiles took e.g. 
10s,9s,12s 29 | } 30 | 31 | 32 | 33 | getArea(feature) { 34 | return feature.properties.ALAND10 + feature.properties.AWATER10; 35 | }; 36 | 37 | getId(feature) { 38 | return feature.properties.GEOID10; 39 | }; 40 | 41 | static metaDataFile(tile) { 42 | return `${tile.join('_')}.json`; 43 | } 44 | 45 | duration() { 46 | let elapsedTime = process.hrtime(this.startTime); 47 | const elapsedMs = elapsedTime[1] / 1000000; // divide by a million to get nano to milli 48 | return elapsedTime[0] + elapsedMs / 1000; 49 | }; 50 | 51 | generatePalette(tile, callback) { 52 | const colorLookup = {}; 53 | const colorLookup2 = {}; 54 | 55 | const addColor = (feature) => { 56 | const id = this.getId(feature); 57 | if (!colorLookup[id]) { 58 | colorLookup[id] = this.steppedColors.nextColor(); 59 | colorLookup2[colorLookup[id]] = id; 60 | } 61 | }; 62 | 63 | let protoFile = this.pbfDir + tile.join('_') + '.proto'; 64 | this.filer.readFile(protoFile, (err, data) => { 65 | if (err && err.code === 'ENOENT') { 66 | console.warn('MISSING proto file: ' + protoFile); 67 | callback({error: 'MISSING proto file: ' + protoFile}); 68 | return; 69 | } 70 | // var ab = nb.buffer; 71 | const geoJson = geobuf.decode(new Pbf(data)); 72 | if (!geoJson.features) { 73 | console.warn('INVALID proto file: ' + protoFile); 74 | callback({error: 'INVALID proto file: ' + protoFile}); 75 | return; 76 | } 77 | 78 | geoJson.features.forEach((feature, i) => { 79 | addColor(feature); 80 | }); 81 | if (callback) { 82 | callback(colorLookup); 83 | } 84 | }); 85 | } 86 | 87 | rasterizeTileZ13(tileZ13, callback) { 88 | const tile = this.pbfTiles.getZ11(tileZ13); 89 | 90 | let protoFile = this.pbfDir + tile.join('_') + '.proto'; 91 | const paletteFile = `${this.outDir}11/${tile.join('_')}_palette.json`; 92 | 93 | const startTimeStamp = Date.now(); 94 | const timerLabel = tileZ13.join('_'); 95 | console.time(timerLabel); 96 | const processFile = (colorLookup) => { 97 | this.filer.readFile(protoFile, (err, data) => { 98 | if (err && err.code === 'ENOENT') { 99 | console.warn('MISSING proto file: ' + protoFile); 100 | callback({error: 'MISSING proto file: ' + protoFile}); 101 | return; 102 | } 103 | // var ab = nb.buffer; 104 | const geoJson = geobuf.decode(new Pbf(data)); 105 | if (!geoJson.features) { 106 | console.warn('INVALID proto file: ' + protoFile); 107 | callback({error: 'INVALID proto file: ' + protoFile}); 108 | return; 109 | } 110 | let tileGroups; 111 | tileGroups = this.pbfTiles.getTileGroups([tileZ13], 13, geoJson); 112 | 113 | const calls = []; 114 | 115 | const getColor = (feature) => { 116 | const id = this.getId(feature); 117 | return colorLookup[id]; 118 | }; 119 | 120 | tileGroups.forEach((tileGroup, i) => { 121 | calls.push((done) => { 122 | if (this.usePointInPolyRasterizer) { 123 | tileGroup.rasterizer = new RasterizeInPoly(true, this.filer); 124 | } else { 125 | tileGroup.rasterizer = new Rasterizer(false, this.filer); 126 | } 127 | 128 | tileGroup.rasterizer.rasterizeFeatures(tileGroup.geoJson.features, tileGroup.tile, this.getArea, getColor, () => { 129 | let pngDir = `${this.outDir}${tileGroup.tile[2]}/${tileGroup.tile[0]}`; 130 | this.filer.ensureDir(pngDir, () => { 131 | let pngPath = `${pngDir}/${tileGroup.tile[1]}.png`; 132 | tileGroup.rasterizer.saveMetaData(pngPath + '.json', (meta) => { 133 | tileGroup.rasterizer.save(pngPath, () => { 134 | done(); 135 | }); 136 | }); 137 | 138 | }); 139 | }); 140 | 141 | }); 142 | }); 143 | async.parallelLimit(calls, 1, () => { 144 | console.timeEnd(timerLabel); 145 | if 
(callback) callback({success: true, timeElapsed: Date.now() - startTimeStamp}); 146 | }); 147 | }); 148 | }; 149 | 150 | this.filer.readFile(paletteFile, (err, data) => { 151 | const colorLookup = JSON.parse(data); 152 | processFile(colorLookup); 153 | }); 154 | } 155 | 156 | rasterizeTile(tile, selectZ13Tiles, onTile, callback) { 157 | this.startTime = process.hrtime(); 158 | let protoFile = this.pbfDir + tile.join('_') + '.proto'; 159 | const metaDataFile = `${this.outDir}11/${tile.join('_')}.json`; 160 | 161 | const startTimeStamp = Date.now(); 162 | console.time(protoFile); 163 | const processFile = () => { 164 | this.filer.readFile(protoFile, (err, data) => { 165 | if (err && err.code === 'ENOENT') { 166 | console.warn('MISSING proto file: ' + protoFile); 167 | callback({error: 'MISSING proto file: ' + protoFile}); 168 | return; 169 | } 170 | // var ab = nb.buffer; 171 | const geoJson = geobuf.decode(new Pbf(data)); 172 | if (!geoJson.features) { 173 | console.warn('INVALID proto file: ' + protoFile); 174 | callback({error: 'INVALID proto file: ' + protoFile}); 175 | return; 176 | } 177 | let tileGroups; 178 | if (selectZ13Tiles) { 179 | tileGroups = this.pbfTiles.getTileGroups(selectZ13Tiles, 13, geoJson); 180 | } else { 181 | tileGroups = this.pbfTiles.getZoomTiles(tile, 13, geoJson); 182 | } 183 | const calls = []; 184 | 185 | const colorLookup = {}; 186 | const colorLookup2 = {}; 187 | const metaData = {}; 188 | 189 | const getMeta = (id) => { 190 | if (!metaData[id]) { 191 | metaData[id] = {color: '', pixelCount: 0}; 192 | } 193 | return metaData[id]; 194 | }; 195 | 196 | const addMeta = (meta) => { 197 | Object.keys(meta.pixelCounts).forEach((k) => { 198 | let hexStr = k; 199 | if (!this.usePointInPolyRasterizer) { 200 | hexStr = '#' + parseInt(k).toString(16).substr(0, 6); 201 | } 202 | const id = colorLookup2[hexStr]; 203 | if (id) { 204 | getMeta(id).pixelCount += meta.pixelCounts[k]; 205 | } 206 | }); 207 | }; 208 | 209 | const getColor = (feature) => { 210 | const id = this.getId(feature); 211 | if (!colorLookup[id]) { 212 | colorLookup[id] = this.steppedColors.nextColor(); 213 | colorLookup2[colorLookup[id]] = id; 214 | getMeta(id).color = colorLookup[id]; 215 | } 216 | return colorLookup[id]; 217 | }; 218 | 219 | tileGroups.forEach((tileGroup, i) => { 220 | calls.push((done) => { 221 | if (this.duration() > this.maxTime) { 222 | this.unfinishedWork.push(tileGroup.tile); 223 | done(); 224 | return; 225 | } 226 | 227 | if (this.usePointInPolyRasterizer) { 228 | tileGroup.rasterizer = new RasterizeInPoly(true, this.filer); 229 | } else { 230 | tileGroup.rasterizer = new Rasterizer(false, this.filer); 231 | } 232 | 233 | tileGroup.rasterizer.rasterizeFeatures(tileGroup.geoJson.features, tileGroup.tile, this.getArea, getColor, () => { 234 | let pngDir = `${this.outDir}${tileGroup.tile[2]}/${tileGroup.tile[0]}`; 235 | this.filer.ensureDir(pngDir, () => { 236 | let pngPath = `${pngDir}/${tileGroup.tile[1]}.png`; 237 | tileGroup.rasterizer.saveMetaData(pngPath + '.json', (meta) => { 238 | addMeta(meta); 239 | tileGroup.rasterizer.save(pngPath, () => { 240 | if (onTile) onTile(pngPath); 241 | done(); 242 | }); 243 | }); 244 | 245 | }); 246 | }); 247 | 248 | }); 249 | }); 250 | async.parallelLimit(calls, 1, () => { 251 | // console.log('ALL SAVED'); 252 | // console.timeEnd(protoFile); 253 | this.filer.writeFile(metaDataFile, JSON.stringify(metaData, null, 2), function (err) { 254 | if (err) { 255 | throw err; 256 | } else { 257 | console.log('Saved ' + metaDataFile); 258 | } 259 | 
if (callback) callback({success: true, timeElapsed: Date.now() - startTimeStamp}); 260 | }); 261 | }); 262 | }); 263 | }; 264 | 265 | processFile(); 266 | }; 267 | 268 | } 269 | 270 | module.exports = RasterTileSet; 271 | -------------------------------------------------------------------------------- /geo-png-db-processing/src/rasterize/RasterizeInPoly.js: -------------------------------------------------------------------------------- 1 | const fs = require('fs'); 2 | const os = require('os'); 3 | const Jimp = require('jimp'); 4 | const PbfTiles = require('../gis/PbfTiles'); 5 | const TileRenderer = require('./TileRenderer'); 6 | const Utils = require("../Utils"); 7 | const pointsWithinPolygon = require('@turf/points-within-polygon').default; 8 | 9 | /** 10 | * Creates a new instance of RasterizeInPoly. 11 | * @class 12 | * @returns An instance of RasterizeInPoly. 13 | * @example 14 | * var instance = new RasterizeInPoly(); 15 | */ 16 | class RasterizeInPoly extends TileRenderer { 17 | 18 | constructor(saveWithWorldFile, filer, verbose) { 19 | super(saveWithWorldFile, filer); 20 | this.pbfTiles = new PbfTiles(); 21 | this.verbose = verbose; 22 | } 23 | 24 | rasterizeFeatures(features, tile, getArea, getColor, callback) { 25 | this.tile = tile; 26 | //this may be a slower, but more solid way of rendering than the canvas-based tool 27 | //we first grab the bounds for every polygon, then use point in polygon to find which poly to render at that pixel 28 | 29 | let tileBounds = this.pbfTiles.tileToBounds(tile); 30 | const inTileBounds = (bounds) => { 31 | return this.pbfTiles.intersectsBounds(tileBounds, bounds) 32 | }; 33 | 34 | //a possible optimization would be to use a quad-tree structure to more quickly identify which bounds intercept which pixels 35 | const boundingBoxes = []; 36 | features.forEach((feature, i) => { 37 | const bounds = this.pbfTiles.convertGeometry(false, feature.geometry); 38 | if (inTileBounds(bounds)) { 39 | boundingBoxes.push({bounds: bounds, feature: feature}); 40 | } 41 | }); 42 | 43 | const matchBounds = (point) => { 44 | return boundingBoxes.filter((box, i) => { 45 | return Utils.containsPoint(box.bounds, point); 46 | }); 47 | }; 48 | 49 | 50 | const getFeature = (x, y) => { 51 | //grab lat lon in center of pixel 52 | const latLon = Utils.toLatLon(tile, (x + 0.5) / 256, (y + 0.5) / 256); 53 | const point = { 54 | type: "Feature", 55 | properties: {}, 56 | geometry: { 57 | type: "Point", 58 | coordinates: [latLon.x, latLon.y] 59 | } 60 | }; 61 | 62 | const shapes = matchBounds(latLon); 63 | const matchingShape = shapes.find((shape, i) => { 64 | const ans = pointsWithinPolygon(point, shape.feature); 65 | pipTests++; 66 | return ans.features.length > 0; 67 | }); 68 | return matchingShape; 69 | }; 70 | 71 | let pipTests = 0; 72 | 73 | new Jimp(256, 256, (err, image) => { 74 | this.image = image; 75 | // console.time('rasterizeFeatures'); 76 | image.scan(0, 0, 256, 256, (x, y, idx) => { 77 | const matchingShape = getFeature(x, y); 78 | if (matchingShape) { 79 | this.setColor(image.bitmap, idx, getColor(matchingShape.feature)); 80 | } 81 | if (this.verbose) { 82 | if ((idx / 4) % 20000 === 0) { 83 | console.log('progress ' + (idx / 4) + ' / ' + (256 * 256)); 84 | } 85 | } 86 | }); 87 | callback(); 88 | // console.timeEnd('rasterizeFeatures'); 89 | // console.log(`pip tests: ${pipTests} ${pipTests / (256 * 256)}`); 90 | }); 91 | } 92 | 93 | } 94 | 95 | module.exports = RasterizeInPoly; 96 | 
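// Example usage (a sketch, not part of the module): rasterize one z13 tile's
// features with the point-in-polygon renderer. The constructor args, the tile
// address and the empty features array are illustrative assumptions; NodeFiler
// is assumed to implement saveImage/writeFile per the AFiler interface, and the
// output directory is assumed to exist.
if (require.main === module) {
    const NodeFiler = require('../filer/NodeFiler');
    const filer = new NodeFiler();
    const rasterizer = new RasterizeInPoly(true, filer, true);
    const tile = [2412, 3078, 13]; // [x, y, zoom] — an arbitrary example tile
    const features = [];           // load the GeoJSON features for this tile here
    rasterizer.rasterizeFeatures(
        features,
        tile,
        (f) => f.properties.ALAND10 + f.properties.AWATER10, // getArea, matching RasterTileSet
        (f) => '#ff0000',                                    // getColor (a single color for the demo)
        () => rasterizer.save(`./data/13/${tile[0]}/${tile[1]}.png`, () => console.log('done'))
    );
}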
-------------------------------------------------------------------------------- /geo-png-db-processing/src/rasterize/RasterizePoints.js: -------------------------------------------------------------------------------- 1 | const fs = require('fs'); 2 | const os = require('os'); 3 | const Jimp = require('jimp'); 4 | const PbfTiles = require('../gis/PbfTiles'); 5 | const PngDbEncodings = require('../encodings/PngDbEncodings'); 6 | const TileEncodings = require('../encodings/TileEncodings'); 7 | const Utils = require("../Utils"); 8 | // const pointsWithinPolygon = require('@turf/points-within-polygon').default; 9 | 10 | /** 11 | * Creates a new instance of RasterizePoints. 12 | * @class 13 | * @returns An instance of RasterizePoints. 14 | * @example 15 | * var instance = new RasterizePoints(); 16 | */ 17 | class RasterizePoints { 18 | 19 | constructor(saveWithWorldFile, filer) { 20 | this.pbfTiles = new PbfTiles(); 21 | this.filer = filer; 22 | this.pixelCounts = {}; 23 | this.saveWithWorldFile = saveWithWorldFile; 24 | } 25 | 26 | saveMetaData(filePath, callback) { 27 | const pixels = this.pixelCounts; 28 | this.filer.writeFile(filePath, JSON.stringify(pixels, null, 2), function (err) { 29 | if (err) { 30 | throw err; 31 | } else { 32 | // console.log('Saved ' + filePath); 33 | } 34 | callback({pixelCounts: pixels}) 35 | }); 36 | } 37 | 38 | static getTiles(features) { 39 | console.log('Calculating Tiles from Features'); 40 | const tiles = {}; 41 | features.forEach((feature, i) => { 42 | const point = feature.geometry.coordinates; 43 | const tile = Utils.latLonToTile(point[1], point[0], 13); 44 | const tileId = tile.join('_'); 45 | if (!tiles[tileId]) { 46 | tiles[tileId] = {tile: tile, features: []}; 47 | } 48 | if (tile[0] < 2000) { 49 | console.log(JSON.stringify(feature)); 50 | return; 51 | } 52 | tiles[tileId].features.push(feature); 53 | if (i % 1000 === 0) { 54 | console.log(Math.round(1000 * i / features.length) / 10 + '%'); 55 | } 56 | }); 57 | return Object.keys(tiles).map((k) => tiles[k]); 58 | } 59 | 60 | rasterizeFeatures(features, tile, field, getValue, callback) { 61 | this.tile = tile; 62 | const tileZoom = tile[2]; 63 | 64 | const setColor = (bitmap, idx, colorHex) => { 65 | if (!this.colorCache) { 66 | this.colorCache = {}; 67 | } 68 | if (!this.colorCache[colorHex]) { 69 | this.colorCache[colorHex] = Utils.hexToRgb(colorHex); 70 | } 71 | const color = this.colorCache[colorHex]; 72 | bitmap.data[idx + 0] = color.r; 73 | bitmap.data[idx + 1] = color.g; 74 | bitmap.data[idx + 2] = color.b; 75 | bitmap.data[idx + 3] = 255; 76 | }; 77 | 78 | const getPos = (p) => { 79 | const tileX = Utils.long2tile(p[0], tileZoom); 80 | const tileY = Utils.lat2tile(p[1], tileZoom); 81 | return [ 82 | Math.round(256 * (tileX - tile[0])), 83 | Math.round(256 * (tileY - tile[1])), 84 | ]; 85 | }; 86 | 87 | const featuresByXY = {}; 88 | features.forEach((feature, i) => { 89 | const p = getPos(feature.geometry.coordinates); 90 | let id = p.join('_'); 91 | if (!featuresByXY[id]) { 92 | featuresByXY[id] = []; 93 | } 94 | featuresByXY[id].push(feature); 95 | }); 96 | 97 | const width = 256; 98 | const height = 256; 99 | const tileOffsets = TileEncodings.calcOffsetTiles(field.arrayCount || 1); 100 | new Jimp(width * tileOffsets.width, height * tileOffsets.height, (err, image) => { 101 | this.image = image; 102 | console.time('rasterizeFeatures'); 103 | image.scan(0, 0, width, height, function (x, y, idx) { 104 | const id = x + '_' + y; 105 | const features = featuresByXY[id]; 106 | if (features && 
features.length > 0) {
                    let valuesArr = getValue(features);
                    tileOffsets.offsets.forEach((xy, i) => {
                        if (valuesArr[i] > 0) {
                            const valPerPx = valuesArr[i];
                            let pX = xy[0] * width + x;
                            let pY = xy[1] * height + y;
                            PngDbEncodings.setPixel(field, image, pX, pY, valPerPx);
                        }
                    });
                }
            });
            callback();
            console.timeEnd('rasterizeFeatures');
        });
    }

    save(path, callback) {
        this.filer.saveImage(path, this.image, () => {
            console.log('Wrote ' + path);
            if (this.saveWithWorldFile) {
                this.saveWorldFile(path, callback);
            } else {
                if (callback) callback();
            }
        });
    }

    saveWorldFile(path, callback) {
        const {tile} = this;
        const latLonTL = Utils.toLatLon(tile, 0, 0);
        const latLonBR = Utils.toLatLon(tile, 1, 1);
        const xSize = (latLonBR.x - latLonTL.x) / 256;
        const ySize = (latLonBR.y - latLonTL.y) / 256;
        const params = [xSize, 0, 0, ySize, latLonTL.x + xSize / 2, latLonTL.y + ySize / 2];
        const pgwFile = path.replace('.png', '.pgw');
        this.filer.writeFile(pgwFile, params.join(os.EOL), function (err) {
            if (err) {
                throw err;
            } else {
                console.log('Saved ' + pgwFile);
                if (callback) callback();
            }
        });
    }
}

module.exports = RasterizePoints;
--------------------------------------------------------------------------------
/geo-png-db-processing/src/rasterize/Rasterizer.js:
--------------------------------------------------------------------------------
const fs = require('fs');
const Utils = require("../Utils");

/**
 * Creates a new instance of Rasterizer.
 * @class
 * @returns An instance of Rasterizer.
 * @example
 * var instance = new Rasterizer();
 */
class Rasterizer {//Note: takes a graphical (canvas-style) approach to drawing polygons, as an alternative to the point-in-polygon approach in RasterizeInPoly

    constructor() {
        //Note: callers may pass (saveWithWorldFile, filer) for parity with RasterizeInPoly; this class ignores them and writes via fs directly
        const PImage = require('pureimage');//Note: inlined here because we're not using this class on Lambda (and don't want to load unneeded code)
        this.img = PImage.make(256, 256);
        this.ctx = this.img.getContext('2d');
        this.ctx.imageSmoothingEnabled = false;
    }

    fillPolygon(points, colorHex) {
        const {ctx} = this;
        ctx.fillStyle = colorHex;
        ctx.beginPath();
        points.forEach((point, i) => {
            if (i === 0) {
                ctx.moveTo(point[0], point[1]);
            } else {
                ctx.lineTo(point[0], point[1]);
            }
        });
        ctx.lineWidth = 1;
        ctx.strokeStyle = colorHex;
        //stroke the outline in the fill color to cover edge pixels
        //TODO: confirm whether a no-AA variant exists for stroke — anti-aliased (blended)
        //edge colors would corrupt the exact-color ID lookups used downstream
        ctx.stroke();
        ctx.closePath();
        ctx.fill_noaa();
    }

    pixelCounts() {
        const bitmap = this.img;
        const pixelCounts = {};
        for (let i = 0; i < bitmap.width; i++) {
            for (let j = 0; j < bitmap.height; j++) {
                const rgba = bitmap.getPixelRGBA(i, j);
                if (!pixelCounts[rgba]) {
                    pixelCounts[rgba] = 0;
                }
                pixelCounts[rgba]++;
            }
        }
        return pixelCounts;
    }

    saveMetaData(filePath, callback) {
        const pixels = this.pixelCounts();
        fs.writeFile(filePath, JSON.stringify(pixels, null, 2), function (err) {
            if (err) {
                throw err;
            }
            callback({pixelCounts: pixels});
        });
    }

    renderGeometry(tile, geometry, colorHex) {
        const tileZoom = tile[2];
        if (geometry.type === 'Polygon') {
            geometry.coordinates.forEach((poly, i) => {
                this.drawPoly(poly, tileZoom, tile, colorHex);
            });
        } else if (geometry.type === 'MultiPolygon') {
            geometry.coordinates.forEach((polys, i) => {
                polys.forEach((poly, i) => {
                    this.drawPoly(poly, tileZoom, tile, colorHex);
                });
            });
        } else {
            console.log('Unsupported Type: ' + geometry.type);
        }
    }

    drawPoly(poly, tileZoom, tile, colorHex) {
        const points = poly.map((p) => {
            const tileX = Utils.long2tile(p[0], tileZoom);
            const tileY = Utils.lat2tile(p[1], tileZoom);
            return [
                Math.round(256 * (tileX - tile[0])),
                Math.round(256 * (tileY - tile[1])),
            ];
        });
        this.fillPolygon(points, colorHex);
    }

    rasterizeFeatures(features, tile, getArea, getColor, callback) {
        //Sort largest to smallest so we don't overwrite smaller polys.
        features.sort((a, b) => {
            return getArea(b) - getArea(a);
        });
        features.forEach((feature, i) => {
            this.renderGeometry(tile, feature.geometry, getColor(feature));
        });
        callback();
    }

    save(path, callback) {
        const PImage = require('pureimage');
        PImage.encodePNGToStream(this.img, fs.createWriteStream(path)).then(() => {
            console.log(`saved ${path}`);
            callback(true);
        }).catch((e) => {
            console.warn(`could not save ${path}`);
            callback(false);
        });
    }
}

module.exports = Rasterizer;
--------------------------------------------------------------------------------
/geo-png-db-processing/src/rasterize/StatusTiles.js:
--------------------------------------------------------------------------------
/**
 * Creates a new instance of StatusTiles.
 * @class
 * @returns An instance of StatusTiles.
 * @example
 * var instance = new StatusTiles();
 */
const TileRenderer = require('./TileRenderer');

class StatusTiles extends TileRenderer {

    constructor(saveWithWorldFile, filer, tile) {
        super(saveWithWorldFile, filer);
        this.tile = tile;
    }
}

module.exports = StatusTiles;
--------------------------------------------------------------------------------
/geo-png-db-processing/src/rasterize/SteppedColors.js:
--------------------------------------------------------------------------------
/**
 * Creates a new instance of SteppedColors.
 * @class
 * @returns An instance of SteppedColors.
 * @example
 * var instance = new SteppedColors();
 */
class SteppedColors {//Note: generates a sequence of visually distinct hex colors, used as per-feature IDs when rasterizing

    constructor() {
        this.steppedColorOffset = 0;
        this.steppedColors = [];
        this.usedColors = {};
    }

    makeSteppedColors() {
        //we use steps of 0x11 to get colors that are easier to differentiate visually
        //but we can run out at around 4K colors. The steppedColorOffset will give us a new batch (so we get around 64K colors)
        const steppedColors = [];
        for (let r = 0; r <= 0xEE0000; r += 0x110000) {
            for (let g = 0; g <= 0xFF00; g += 0x1100) {
                for (let b = 0; b <= 0xFF; b += 0x11) {
                    let val = (r | g | b) - this.steppedColorOffset;
                    if (val <= 0) continue;//avoid pure black (or negative values)
                    steppedColors.push(val.toString(16).padStart(6, "0"));
                }
            }
        }
        return steppedColors;
    }

    fillUnusedColorSpace(maxNum) {
        const remColors = [];
        for (let c = 0x000001; c <= 0xFFFFFF; c += 1) {
            const hex = c.toString(16).padStart(6, "0");
            if (!this.usedColors['#' + hex]) {
                remColors.push(hex);
            }
            if (remColors.length >= maxNum) break;
        }
        return remColors;
    }

    nextColor() {
        if (this.steppedColors.length === 0) {
            this.steppedColorOffset++;
            if (this.steppedColorOffset >= 0x11) {
                console.warn('COLOR THRESHOLD REACHED... backfilling 100k colors');
                this.steppedColors = this.fillUnusedColorSpace(100000);
            } else {
                this.steppedColors = this.makeSteppedColors();
            }
        }
        let color = '#' + this.steppedColors.pop();
        this.usedColors[color] = true;
        return color;
    }

}

module.exports = SteppedColors;
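// Example usage (a sketch; run directly with `node SteppedColors.js`). The geoIds
// are arbitrary illustrations — each call returns a unique, stable hex color.
if (require.main === module) {
    const colors = new SteppedColors();
    const byId = {};
    ['G3012287338', 'G3012287339', 'G3012287340'].forEach((geoId) => {
        byId[geoId] = colors.nextColor(); // exact values depend on pop order
    });
    console.log(byId);
}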
--------------------------------------------------------------------------------
/geo-png-db-processing/src/rasterize/TileRenderer.js:
--------------------------------------------------------------------------------
const Utils = require("../Utils");
const os = require('os');

/**
 * Creates a new instance of TileRenderer.
 * @class
 * @returns An instance of TileRenderer.
 * @example
 * var instance = new TileRenderer();
 */
class TileRenderer {

    constructor(saveWithWorldFile, filer) {
        this.filer = filer;
        this.pixelCounts = {};
        this.saveWithWorldFile = saveWithWorldFile;
    }

    setColor(bitmap, idx, colorHex) {
        if (!this.colorCache) {
            this.colorCache = {};
        }
        if (!this.colorCache[colorHex]) {
            this.colorCache[colorHex] = Utils.hexToRgb(colorHex);
        }
        const color = this.colorCache[colorHex];
        bitmap.data[idx + 0] = color.r;
        bitmap.data[idx + 1] = color.g;
        bitmap.data[idx + 2] = color.b;
        bitmap.data[idx + 3] = 255;
        if (!this.pixelCounts[colorHex]) this.pixelCounts[colorHex] = 0;
        this.pixelCounts[colorHex]++;
    }

    saveMetaData(filePath, callback) {
        const pixels = this.pixelCounts;
        this.filer.writeFile(filePath, JSON.stringify(pixels, null, 2), function (err) {
            if (err) {
                throw err;
            }
            callback({pixelCounts: pixels});
        });
    }

    save(path, callback) {
        this.filer.saveImage(path, this.image, () => {
            if (this.saveWithWorldFile) {
                this.saveWorldFile(path, callback);
            } else {
                if (callback) callback();
            }
        });
    }

    saveWorldFile(path, callback) {
        const {tile} = this;
        const latLonTL = Utils.toLatLon(tile, 0, 0);
        const latLonBR = Utils.toLatLon(tile, 1, 1);
        const xSize = (latLonBR.x - latLonTL.x) / 256;
        const ySize = (latLonBR.y - latLonTL.y) / 256;
        const params = [xSize, 0, 0, ySize, latLonTL.x + xSize / 2, latLonTL.y + ySize / 2];
        const pgwFile = path.replace('.png', '.pgw');
        this.filer.writeFile(pgwFile, params.join(os.EOL), function (err) {
            if (err) {
                throw err;
            } else {
                if (callback) callback();
            }
        });
    }
}

module.exports = TileRenderer;
--------------------------------------------------------------------------------
/geo-png-db-processing/src/rasterize/_test/RasterizeTileSet.js:
--------------------------------------------------------------------------------
//Exercises the color generator used by RasterTileSet: every color must be unique
//and a valid 6-digit hex string. Note: nextColor lives on SteppedColors (which
//RasterTileSet holds as this.steppedColors), not on RasterTileSet itself.
const SteppedColors = require('../SteppedColors');

const steppedColors = new SteppedColors();

const colors = [];

for (let i = 0; i < 64000; i++) {
    const color = steppedColors.nextColor();
    if (colors.indexOf(color) >= 0) {
        throw 'DUPLICATE ' + color;
    }
    if (!/^#[0-9A-F]{6}$/i.test(color)) {
        throw 'INVALID ' + color;
    }
    colors.push(color);
}
console.log(colors);
--------------------------------------------------------------------------------
/geo-png-db-processing/src/rasterize/algorithms/ScanLinePolygonFill.js:
--------------------------------------------------------------------------------
/**
 * Creates a new instance of ScanLinePolygonFill.
 * @class
 * @returns An instance of ScanLinePolygonFill.
 * @example
 * var instance = new ScanLinePolygonFill(256, 256);
 */
class ScanLinePolygonFill {

    constructor(width, height) {
        this.clip_rect = {x: 0, y: 0, w: width, h: height};
    }

    //references:
    //https://github.com/kpurrmann/cg/blob/master/cog1/raster.js
    //http://alienryderflex.com/polygon_fill/
    //https://bitbucket.org/pygame/pygame/src/faa5879a7e6bfe10e4e5c79d04a3d2fb65d74a62/src/draw.c?at=default#draw.c-1483

    fillPolygon(points, pixelSetter) {
        this.pixelSetter = pixelSetter;
        const vx = points.map((p) => Math.round(p.x));
        const vy = points.map((p) => Math.round(p.y));

        this.scanFillPoly(vx, vy, points.length);
    }

    scanFillPoly(vx, vy, n) {
        let i;
        let y;
        let miny, maxy;
        let x1, y1;
        let x2, y2;
        let ind1, ind2;
        let ints;
        let polyints = [];

        /* Determine Y maxima */
        miny = vy[0];
        maxy = vy[0];
        for (i = 1; (i < n); i++) {
            miny = Math.min(miny, vy[i]);
            maxy = Math.max(maxy, vy[i]);
        }

        /* Draw, scanning y */
        for (y = miny; (y <= maxy); y++) {
            ints = 0;
            polyints.length = 0;
            for (i = 0; (i < n); i++) {
                if (!i) {
                    ind1 = n - 1;
                    ind2 = 0;
                } else {
                    ind1 = i - 1;
                    ind2 = i;
                }
                y1 = vy[ind1];
                y2 = vy[ind2];
                if (y1 < y2) {
                    x1 = vx[ind1];
                    x2 = vx[ind2];
                } else if (y1 > y2) {
                    y2 = vy[ind1];
                    y1 = vy[ind2];
                    x2 = vx[ind1];
                    x1 = vx[ind2];
                } else {
                    continue;
                }
                if ((y >= y1) && (y < y2)) {
                    polyints[ints++] = Math.round((y - y1) * (x2 - x1) / (y2 - y1)) + x1;
                } else if ((y === maxy) && (y > y1) && (y <= y2)) {
                    polyints[ints++] = Math.round((y - y1) * (x2 - x1) / (y2 - y1)) + x1;
                }
            }

            polyints.sort((a, b) => a - b);//numeric sort; the default lexicographic sort would mis-pair span endpoints

            for (i = 0; (i < ints); i += 2) {
                this.drawhorzlineclip(polyints[i], y, polyints[i + 1]);
            }
        }

    }

    drawhorzlineclip(x1, y1, x2) {
        if (y1 < this.clip_rect.y || y1 >= this.clip_rect.y + this.clip_rect.h) {
            return;
        }

        if (x2 < x1) {
            let temp = x1;
            x1 = x2;
            x2 = temp;
        }

        x1 = Math.max(x1, this.clip_rect.x);
        x2 = Math.min(x2, this.clip_rect.x + this.clip_rect.w - 1);

        if (x2 < this.clip_rect.x || x1 >= this.clip_rect.x + this.clip_rect.w)
            return;

        if (x1 === x2) {
            this.set_at(x1, y1);
        } else {
            this.drawhorzline(x1, y1, x2);
        }
    }

    drawhorzline(x1, y1, x2) {
        for (let x = x1; x <= x2; x++) {
            this.set_at(x, y1);
        }
    }

    set_at(x, y) {
        this.pixelSetter(x, y);
    }

}

module.exports = ScanLinePolygonFill;
--------------------------------------------------------------------------------
/img/png-db-example.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sasakiassociates/geo-png-db/11c5cf9a7cc5ee52a0e05ca00668c9a47c3431ed/img/png-db-example.png
--------------------------------------------------------------------------------
/img/pseudo-code.svg:
--------------------------------------------------------------------------------
[SVG figure: pseudo-code for adaptive sampling. Text content: "if num_records > MAX_RECORDS { generate sampled data tile for this tile address; split in quads; repeat for each new cell } else { generate complete data tile for this address }". Annotations: "sample tile is used for this zoom level only"; "1 sample tile exists at each zoom level until MAX_RECORDS not exceeded" (1/x); "all inner tiles can use this data tile" (1/1).]
--------------------------------------------------------------------------------
/specifications/basic/1.0/README.md:
--------------------------------------------------------------------------------
# GeoPngDB Basic version 1.0

## Format

### Tiling Scheme

GeoPngDB uses the standard Google x/y/z tile schema with tiles starting in the NW corner of the WGS 84 map projection. Each tile is 256x256 pixels.

Tile schemas represent the entire globe, slicing a square map into 4 smaller tiles and adding detail at each new zoom level. See this reference: https://www.maptiler.com/google-maps-coordinates-tile-bounds-projection/

### Data Encoding

GeoPngDB borrows the numeric encodings from [png-db](https://github.com/sasakiassociates/png-db). See the png-db documentation for more details.

### Column-Oriented
GeoPngDB tiles can be thought of as a column-oriented data approach because each directory of tiles represents a single column or field within the dataset. For example, our source data might be census blocks, and the data would be stored in multiple image folders: one for each census field (such as "total population" or "total number of households").

Some fields, such as population by race, can take advantage of arrays (see [below](#support-for-arrays)) to hold more values, but this can still be thought of as a column-oriented source where the column has an extra dimension.

### JSON Data

The dataset is described by a JSON file. This file must be named "geoPngDb.json" and placed at the root alongside the zoom-numbered image folders.

A basic example of a GeoPngDB spec is as follows:

```
{
    "type": "GeoPngDB",
    "version": "1.0",
    "metadata": {
        "description": "U.S. population data by race.",
        "metric": "# people",
        "source": "U.S. Census",
        "sourceLink": "https://www.census.gov/topics/population/data.html",
        "sourceField": "USPopulationByRace"
    },
    "tileBounds": {
        "zoom": 12,
        "xMin": 870,
        "xMax": 1360,
        "yMin": 442,
        "yMax": 1280
    },
    "fields": [
        {
            "id": "Black",
            "type": "DECIMAL",
            "precision": 10000,
            "variablePrecision": 1.5,
            "range": {
                "min": 0,
                "max": 12000
            }
        }
    ]
}
```

#### Metadata

Metadata is suggested by this schema but not required. The following fields are strongly encouraged:

```
description
metric
source
```

`sourceLink`: used to provide a deep link to a data source (e.g. a direct download link or a page with a clear download link). It should *not* be used to provide generic links such as `www.census.gov`.

`sourceField`: since each tile set represents a single "column" from a data source (see [above](#column-oriented)), you can use this property to specify which source data field was used. This does not need to match anything in the "fields" definitions and is purely for reference.

Any number of additional fields and data structures can be added to the metadata section for custom use.

#### Data Types
The following [png-db](https://github.com/sasakiassociates/png-db) data types are supported:
```
INTEGER
DECIMAL
TEXT
```

#### Precision
Png-db uses the RGB color space to represent numbers. This gives a broad range of integers (16,777,215). For decimals, we simply map this integer space onto a decimal number space using a range of possible values (min/max) and a given precision value.
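As a rough illustration of the mapping (a sketch only — png-db defines the authoritative encoding, and the helper names here are illustrative; note that the chosen range and precision must together fit within the 24-bit space):

```
// One plausible reading of the min/max + precision scheme: a DECIMAL value is
// scaled by `precision` into the 24-bit integer space spanned by one RGB pixel.
const MAX_RGB_INT = 0xFFFFFF; // 16,777,215

function encodeDecimal(value, field) {
    // e.g. field = {precision: 100, range: {min: 0, max: 12000}}
    const scaled = Math.round((value - field.range.min) * field.precision);
    if (scaled < 0 || scaled > MAX_RGB_INT) throw new Error('range * precision exceeds 24 bits');
    return {r: (scaled >> 16) & 0xFF, g: (scaled >> 8) & 0xFF, b: scaled & 0xFF};
}

function decodeDecimal(r, g, b, field) {
    const scaled = (r << 16) | (g << 8) | b;
    return field.range.min + scaled / field.precision;
}
```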
Where decimal precision can be lowered, this can significantly reduce PNG file sizes.

#### Variable Precision
When dealing with numerous zoom levels, the appropriate precision at the maximum zoom may differ from the appropriate precision at the minimum zoom. For example, in a dataset showing property value, any given pixel when zoomed in may contain no more than a few $. However, at the minimum zoom (e.g. global scale), one pixel might contain millions or billions of $.

A difference of a few hundred $ won't change the map much at the global scale, but may make a dramatic difference at the zoomed-in scale. To account for this, variable precision changes the precision value used per zoom level.

#### Support for Arrays

Tiles are typically represented as single 256x256 tiles, but sometimes it can be more efficient to load multiple data fields into a single larger image if there is a very strong relationship between them and they are unlikely to be used separately. The larger tiles (e.g. 512x512) *do not* take up more space on the map, but client-side tools can query different fields within the data by looking at different parts of the image, as shown in the sketch after this example.

An offset array with `[x,y]` values is used to specify where each field is represented in the composite image.

```
{
    ...
    "fields": [
        {
            //NOTE: offset not required, assumed to be [0,0] if not specified
            "id": "Black",
            "type": "DECIMAL",
            "precision": 10000,
            "variablePrecision": 1.5,
            "range": {
                "min": 0,
                "max": 12000
            }
        },
        {
            "id": "Asian",
            "offset": [1,0],//offset specified as [x,y]
            "type": "DECIMAL",
            "precision": 10000,
            "variablePrecision": 1.5,
            "range": {
                "min": 0,
                "max": 12000
            }
        }
    ]
}
```
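On the client, reading a value for an offset field from the composite image might look like the following sketch (the canvas plumbing and the `decodeDecimal` helper from the Precision section are assumed, not part of the spec):

```
// Sample a field's value for tile-local pixel (px, py) from a composite tile.
// Offsets are in 256px tile units, per the `offset` property above.
function readFieldValue(imageData, field, px, py) {
    const offset = field.offset || [0, 0];
    const x = offset[0] * 256 + px;
    const y = offset[1] * 256 + py;
    const i = (y * imageData.width + x) * 4; // RGBA stride in canvas ImageData
    return decodeDecimal(imageData.data[i], imageData.data[i + 1], imageData.data[i + 2], field);
}
```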
--------------------------------------------------------------------------------
/specifications/basic/draft/1.1/README.md:
--------------------------------------------------------------------------------
# GeoPngDB version 1.1

## Status
This is a working draft for new features that are not yet fully worked out or supported by tools or libraries. If these solutions are proven to work in real use-cases, they will be folded into the main spec via a version bump. Not all features in this working draft will necessarily be included in the next spec version.

All spec changes at a 1.x level are expected to be non-breaking, backwards-compatible changes.

## Format

### Proposed features

#### Per-Tile TEXT lookups
Text fields map integers encoded in the tiles to specific values in the JSON file. This works well for most text classifications, which have no more than a few thousand unique values. However, certain "key" values such as census block IDs may need to be associated with pixels. In this case, the number of unique census blocks is around 7 million, which is well within the integer space (of 16 million), but a lookup file mapping all census block values to the integer space can be prohibitively large.

A simple solution to this problem is to store the unique values in a JSON file at the same address as the PNG tile file. That way the values need only be uniquely defined within that single tile.

```
{
    ...
    "fields": [
        {
            "id": "GeoID",
            "type": "TEXT",
            "perTileLookup": "geoId"
        }
    ]
}
```

Tile JSON file: note that the keys are specified as integer strings and need not be sequential or start at zero. This lets us ensure consistency in number/key lookups at different pyramid zoom levels. Values are stored as arrays to let us account for multiple records being combined into single pixels as we zoom out.

e.g. `10/543/345.png` would have an accompanying `10/543/345.json` file:
```
{
    "geoId": {
        "128": ["G3012287338"],
        "129": ["G3012287339"],
        "130": ["G3012287340"],
        "268": ["G3012257167"],
        "269": ["G3012257171", "G3012257172"]
    }
}
```

#### Detail Overrides

In some cases, datasets may contain different levels of detail (represented by a different maximum zoom limit) for different areas. For example, the polar regions are typically data-sparse but occupy a large percentage of the WGS 84 map. It may be wasteful to represent these areas at the same zoom level as the equator.

Metadata about these zones lets the client-side consumer of these datasets intelligently handle the data at different zoom levels.

These zones are represented as overrides to the primary tileBounds. Detail overrides must be specified at a higher zoom level than the primary bounds.

```
{
    ...
    "tileBounds": {
        "zoom": 12,
        "xMin": 870,
        "xMax": 1360,
        "yMin": 442,
        "yMax": 1280
    },
    ...
    "detailOverrides": [
        {
            "tileBounds": {
                "zoom": 13,
                "xMin": 1760,
                "xMax": 2590,
                "yMin": 888,
                "yMax": 2380
            }
        },
        {
            "tileBounds": {
                "zoom": 14,
                "xMin": 3640,
                "xMax": 4200,
                "yMin": 2280,
                "yMax": 2800
            }
        }
    ]
}
```

### Challenges

#### Per-Tile TEXT lookups
Text lookups work well at higher zoom levels where fewer entities are represented. This is also where they may be most useful - for example, when a user interacts with part of the image. However, there is no elegant and lightweight way to aggregate this information as we zoom out to global scale. We may need to set limits on the zoom level at which the data becomes available (e.g. only when zoomed in) -- or cull key values once they become smaller than a certain size limit.

So far, this technique of per-tile text lookup has been used in the processing stages to assign numeric data to pixels rather than bringing it to the front end. In that case it is used only at the maximum zoom level used to process the data. However, there may be value in bringing this capability to the front end, and this spec would benefit from more applied testing.

#### Detail Overrides

This seems pretty simple at a spec level, but may be challenging to implement in front-end tools. I haven't seen any examples of slippy map tiles loading and displaying multiple zoom levels side-by-side. In Zaru, we have implemented "zoom beyond" support for grabbing portions of higher-level tiles and displaying those via GPU rendering. This could be adapted to work with variable zoom levels, but would require careful treatment at the margins.
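As an illustration, a client might resolve the effective maximum zoom for a tile roughly as follows (a sketch only; the helper name is assumed, and `tileBounds.zoom` is read here as the maximum zoom available for that region):

```
// Returns the deepest zoom at which data exists for tile [x, y, z],
// checking the primary bounds and any detail overrides.
function maxZoomForTile(spec, x, y, z) {
    const covers = (tb) => {
        // project the query tile to the bounds' zoom level before comparing
        const scale = Math.pow(2, tb.zoom - z);
        const tx = Math.floor(x * scale), ty = Math.floor(y * scale);
        return tx >= tb.xMin && tx <= tb.xMax && ty >= tb.yMin && ty <= tb.yMax;
    };
    let maxZoom = covers(spec.tileBounds) ? spec.tileBounds.zoom : -1;
    (spec.detailOverrides || []).forEach((o) => {
        if (covers(o.tileBounds) && o.tileBounds.zoom > maxZoom) maxZoom = o.tileBounds.zoom;
    });
    return maxZoom; // -1 means the tile is outside the dataset
}
```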
92 | -------------------------------------------------------------------------------- /specifications/record-lookup/0.1/README.md: -------------------------------------------------------------------------------- 1 | # GeoPngDB Record Lookup version 0.1 (beta) 2 | 3 | The record lookup schema uses spatial reference images along with a [png-db](https://github.com/sasakiassociates/png-db) record database to represent large geospatial datasets efficiently. Unlike the basic version which can scale indefinitely without workarounds, the record lookup schema requires some additional considerations to achieve massive scale. 4 | 5 | ## Format 6 | 7 | ### Tiling Scheme 8 | 9 | GeoPngDB uses the standard Google x/y/z tile schema with tiles starting in the NW corner of the WGS 84 map projection. Each tile is 256x256 pixels. 10 | 11 | Tile schemas represent the entire globe, slicing a square map into 4 smaller tiles and adding detail at each new zoom 12 | level. See this reference: https://www.maptiler.com/google-maps-coordinates-tile-bounds-projection/ 13 | 14 | ### Record Data 15 | A png-db database is an image database containing *non-spatial* records where each pixel represents a record. Records are uniquely identified by their position in the image with an index scanning left to right, then top to bottom. Any number of fields can be stored in this format with each column producing a different image. TEXT, INTEGER and DECIMAL data types are supported and described by metadata associated with the data image. 16 | 17 | ![Screenshot of png-db folder showing metadata files and data images.](../../../img/png-db-example.png) 18 | 19 | This screenshot shows how each column is stored as an image file. In this example of 98778 records, the images are 315x315 squares with file sizes ranging from 46 KB to 235 KB. Much larger datasets are supported: many GPUs now support 4096x4096 images which would allow 16 million records. 20 | 21 | ### Record Lookups 22 | When representing point records on a raster grid, multiple points may fall within a single grid cell. Each pixel of the spatial image contains a "lookup" reference to the records that fall within that cell. Records are referenced via a sequential index so that they may be directly accessed from a png-db pixel. 23 | 24 | To account for the array of records that fall within each cell, two spatial images are used: one containing "start" values for the array, and the other containing "stop" values. 25 | 26 | ![Illustration showing how non-spatial png-db record indices are mapped onto spatial tiles.](../../../img/record-lookups.svg) 27 | 28 | ### Aggregation 29 | At lower zoom levels, records are aggregated, so that each pixel accurately represents all pixels "below" that pixel in the pyramid. 30 | 31 | ![Illustration showing how values for 4 pixels from one zoom level are summed into a single pixel at the level above for two different methods.](../../../img/pyramid-comparison.svg) 32 | 33 | The basic tile schema (shown on the left side of the comparison) aggregates numeric values at each level of the pyramid. The record lookup aggregates individual records in much the same way. 34 | 35 | ### Quad-tree indexing 36 | Each spatial cell needs to specify a range of indices in the data tile. Ideally we want to be able to use the same data tile for all spatial tiles so that it only needs to be loaded once. By using a quad-tree style ordering, we can ensure that ranges will retain the same order at different tile levels. 
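One way to produce such an ordering (a sketch — the illustrations below imply quad nesting, not one specific function) is a Z-order/Morton index, which interleaves the bits of a cell's x/y position so that the four children of any quad stay contiguous:

```
// Morton/Z-order index for a pixel within a 256x256 tile: children of a quad
// remain contiguous, so aggregated parents cover contiguous record ranges.
function quadOrderIndex(x, y) {
    let index = 0;
    for (let bit = 0; bit < 8; bit++) { // 8 bits for coordinates 0..255
        index |= ((x >> bit) & 1) << (2 * bit);
        index |= ((y >> bit) & 1) << (2 * bit + 1);
    }
    return index; // 0..65535
}
```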
Note that the values in the illustration below reflect the order of the spatial cells used to generate indices (not the actual indices).

![Illustration showing how tile indices are organized.](../../../img/tile-index-nesting.svg)

Each spatial cell contains a range of indices representing the records contained in that cell. For example, cell 0 (top left) contains records with indices 0 through 4. This is represented by the "start" and "stop" values on the spatial tiles. Note how the "start" value in the top-left cell of each 4-cell "quad" remains unchanged as we aggregate the cells to the next zoom level. The same is true for the "stop" value in the bottom-right cell of the "quad". This ensures that all records contained within a 4-cell quad are also contained by the pyramid parent of that quad.

![Illustration showing how quad tree tiles indices are nested as related to lookup indices.](../../../img/quad-tree-start-stops.svg)

### Querying
In order to use record-level data spatially, we need to be able to build queries based on values found in the png-db images. For example, if we want to count the number of brown records in each cell, we iterate through each value in the source image based on the start and stop values for each cell. While this may seem complex, the process can be done using images entirely on a GPU shader, without ever needing to access values on the CPU, allowing real-time querying of millions of records. Rules can be written on the GPU to query multiple parameters across multiple png-db fields.

![Illustration showing how queries can be created by matching and summing.](../../../img/query-sums.svg)

### Spatial Tile Structure
Spatial tiles are structured to include the start and stop reference tiles in a single image. Combining images allows better compression, requires fewer HTTP requests, and takes up fewer image "slots" on the GPU.

An optional "edge" zone allows us to include a single row of pixels from neighboring tiles. This small accommodation lets us render "plus" shapes to represent points. Though nowhere near as flexible as vector-based point rendering, we've found that the plus shapes read much more clearly than single pixels.

The 4th square is currently unused, but may allow future expansion.

![Illustration showing how 256x256 tiles are combined into 512x512 image.](../../../img/tile-structure.svg)

# Conceptual

Unlike the record lookup solution, which has been tested in proofs-of-concept and real-world use cases, the following solution is purely conceptual.

## Accommodating (actual) big data

While spatial tiles can easily accommodate billions of grid cells (the Mapzen Terrarium example clocks in at 70 trillion), record-based solutions have limits imposed by the number of records that can be stored in the png-db. This is currently around 16 million records if your GPU supports 4096x4096 images, but more likely around 4 million for broader support (2048x2048 images).

One solution would be to render an individual png-db record set for every spatial tile. This is feasible, but it would require significant data duplication as the user zooms in and out, and it still imposes limits when zoomed out to the full bounds of the dataset.

Given these constraints, the most workable solution would appear to be a sample-based one. Not every spatial tile contains the same number of records - in fact, it is common for geospatial data points to be clustered in certain areas and sparse in others. The image below illustrates an adaptive sampling methodology where either full record data or sampled records exist at every zoom level.

![Illustration showing how quad tiles may be subdivided.](../../../img/adaptive-quads.svg)

When complete data is available, the full png-db image set can be used at that zoom level - and at all zoom levels beyond it. In the case of sampled data, either a more detailed sample or complete data will be available at the next zoom level. The following pseudo-code shows the logic for producing sample data tiles based on a given maximum number of records that we want to use for png-db images (MAX_RECORDS). "num_records" is the number of records contained within the bounds of the tile.

![Pseudo-code showing logic for sampling vs complete data.](../../../img/pseudo-code.svg)
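Expressed as a JavaScript sketch (the helper names are assumed, mirroring the pseudo-code figure above):

```
// Recursively emit data tiles: complete tiles where the record count fits,
// sampled tiles (plus a quad split) where it doesn't.
function buildDataTiles(tile, records) {
    if (records.length > MAX_RECORDS) {
        generateSampledDataTile(tile, sample(records, MAX_RECORDS));
        for (const child of splitInQuads(tile)) { // 4 children at zoom + 1
            buildDataTiles(child, recordsWithin(child, records));
        }
    } else {
        generateCompleteDataTile(tile, records); // usable at all deeper zooms
    }
}
```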
The same uneven distribution that occurs across tiles also occurs within tiles. Certain pixels may have a much greater record density than others. To help ensure that sparse pixels aren't filtered in a way that changes the overall impression of the image, we can adopt a per-pixel sampling strategy.

![Mock histogram showing how dense pixels can be sampled more aggressively.](../../../img/per-pixel-sampling-strategy.svg)

Given that millions of records will persist for any given tile, the sample data should be rich enough to allow querying and to accurately represent the dataset at all zoom levels.

The one troublesome case for the sampling approach is where we have a mix of sparse data and very dense data in the same area. For example, in a bird observation dataset where Canada Geese and California Condors are both represented, there may be only a handful of condor records. A filter for condors applied to the sampled data might miss key records when looking at the global scale. These records will show up once zoomed in to a level where complete data is available, but may be missed by the sampling. A workaround in this case would be to create a separate record set for the rare bird records so that they are not lost in the sampling. Multiple sources can be combined on the front end such that the combination would be opaque to end users.

--------------------------------------------------------------------------------