├── COPYING
├── README.md
├── master
│   ├── README.md
│   ├── acme-challenge.conf
│   ├── apache-ssl.conf
│   ├── apache.conf
│   ├── build-web.sh
│   ├── create-host-keys.sh
│   ├── crontab-robot
│   ├── init-robot.sh
│   ├── init.sh
│   ├── release-coastline.sh
│   ├── restart-apache2
│   ├── run-update-anomalies.sh
│   ├── run-update-low-planet.sh
│   ├── run-update-osmdata.sh
│   ├── run-update-planet.sh
│   ├── run-update.sh
│   ├── servers2web.sh
│   └── users.yml.tmpl
├── scripts
│   ├── anomalies
│   │   ├── collect-stats.sh
│   │   ├── stats-to-json.rb
│   │   └── update.sh
│   ├── coastline
│   │   ├── README.tmpl
│   │   ├── compare-coastline-polygons.sh
│   │   ├── create-grid.sql
│   │   ├── split-3857-post.sql
│   │   ├── split-3857.sql
│   │   ├── split-4326.sql
│   │   ├── split-on-grid.sql
│   │   ├── split.sh
│   │   └── update.sh
│   ├── icesheet
│   │   ├── README.tmpl
│   │   ├── osmium-export-config.json
│   │   ├── update-icesheet-zip.sh
│   │   └── update.sh
│   ├── low-planet
│   │   └── update.sh
│   └── planet
│       └── update.sh
├── servers
│   ├── update-anomalies.yml
│   ├── update-low-planet.yml
│   ├── update-osmdata.yml
│   └── update-planet.yml
└── web
    ├── .gitignore
    ├── _includes
    │   ├── data
    │   │   ├── coastlines.html
    │   │   ├── generalized-coastlines.html
    │   │   ├── icesheet-outlines.html
    │   │   ├── icesheet-polygons.html
    │   │   ├── land-polygons.html
    │   │   ├── water-polygons.html
    │   │   ├── water-reduced-polygons.html
    │   │   └── water-reduced-raster.html
    │   ├── download
    │   ├── license-cc-by-sa.html
    │   ├── license-odbl.html
    │   └── notes-broken-coastline.html
    ├── _layouts
    │   ├── default.html
    │   └── nowrap.html
    ├── contact.html
    ├── css
    │   ├── common.css
    │   └── ol.css
    ├── data
    │   ├── coast.html
    │   ├── coastlines.html
    │   ├── icesheet-outlines.html
    │   ├── icesheet-polygons.html
    │   ├── icesheet.html
    │   ├── index.html
    │   ├── land-polygons.html
    │   ├── water-polygons.html
    │   └── water.html
    ├── img
    │   ├── antarctic-ice.png
    │   ├── data-coastlines-small.png
    │   ├── data-coastlines.png
    │   ├── data-generalized-coastlines-small.png
    │   ├── data-generalized-coastlines.png
    │   ├── data-icesheet-outlines-small.png
    │   ├── data-icesheet-outlines.png
    │   ├── data-icesheet-polygons-small.png
    │   ├── data-icesheet-polygons.png
    │   ├── data-land-polygons-small.png
    │   ├── data-land-polygons.png
    │   ├── data-water-polygons-small.png
    │   ├── data-water-polygons.png
    │   ├── data-water-reduced-polygons-small.png
    │   ├── data-water-reduced-polygons.png
    │   ├── data-water-reduced-raster-small.png
    │   ├── data-water-reduced-raster.png
    │   ├── dc.png
    │   ├── fossgis.png
    │   ├── generalization_example.png
    │   ├── generalization_z0.png
    │   ├── generalization_z3.png
    │   ├── generalization_z4.png
    │   ├── generalization_z5.png
    │   ├── generalization_z6.png
    │   ├── icesheet_carto1.png
    │   ├── icesheet_carto2.png
    │   └── water_carto.png
    ├── impressum.html
    ├── index.html
    ├── info
    │   ├── downloading.html
    │   ├── formats.html
    │   ├── index.html
    │   ├── license.html
    │   └── projections.html
    ├── internal
    │   ├── coastline
    │   │   ├── download.html
    │   │   ├── index.html
    │   │   └── map.html
    │   └── index.html
    ├── js
    │   ├── get-file-change-date.js
    │   ├── internal-download.js
    │   ├── map.js
    │   ├── ol.js
    │   └── util.js
    ├── processing
    │   ├── coastline.html
    │   ├── generalization.html
    │   ├── icesheet.html
    │   ├── index.html
    │   ├── software.html
    │   └── water.html
    └── robots.txt
/COPYING:
--------------------------------------------------------------------------------
1 | GNU GENERAL PUBLIC LICENSE
2 | Version 3, 29 June 2007
3 |
4 | Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
5 | Everyone is permitted to copy and distribute verbatim copies
6 | of this license document, but changing it is not allowed.
7 |
8 | Preamble
9 |
10 | The GNU General Public License is a free, copyleft license for
11 | software and other kinds of works.
12 |
13 | The licenses for most software and other practical works are designed
14 | to take away your freedom to share and change the works. By contrast,
15 | the GNU General Public License is intended to guarantee your freedom to
16 | share and change all versions of a program--to make sure it remains free
17 | software for all its users. We, the Free Software Foundation, use the
18 | GNU General Public License for most of our software; it applies also to
19 | any other work released this way by its authors. You can apply it to
20 | your programs, too.
21 |
22 | When we speak of free software, we are referring to freedom, not
23 | price. Our General Public Licenses are designed to make sure that you
24 | have the freedom to distribute copies of free software (and charge for
25 | them if you wish), that you receive source code or can get it if you
26 | want it, that you can change the software or use pieces of it in new
27 | free programs, and that you know you can do these things.
28 |
29 | To protect your rights, we need to prevent others from denying you
30 | these rights or asking you to surrender the rights. Therefore, you have
31 | certain responsibilities if you distribute copies of the software, or if
32 | you modify it: responsibilities to respect the freedom of others.
33 |
34 | For example, if you distribute copies of such a program, whether
35 | gratis or for a fee, you must pass on to the recipients the same
36 | freedoms that you received. You must make sure that they, too, receive
37 | or can get the source code. And you must show them these terms so they
38 | know their rights.
39 |
40 | Developers that use the GNU GPL protect your rights with two steps:
41 | (1) assert copyright on the software, and (2) offer you this License
42 | giving you legal permission to copy, distribute and/or modify it.
43 |
44 | For the developers' and authors' protection, the GPL clearly explains
45 | that there is no warranty for this free software. For both users' and
46 | authors' sake, the GPL requires that modified versions be marked as
47 | changed, so that their problems will not be attributed erroneously to
48 | authors of previous versions.
49 |
50 | Some devices are designed to deny users access to install or run
51 | modified versions of the software inside them, although the manufacturer
52 | can do so. This is fundamentally incompatible with the aim of
53 | protecting users' freedom to change the software. The systematic
54 | pattern of such abuse occurs in the area of products for individuals to
55 | use, which is precisely where it is most unacceptable. Therefore, we
56 | have designed this version of the GPL to prohibit the practice for those
57 | products. If such problems arise substantially in other domains, we
58 | stand ready to extend this provision to those domains in future versions
59 | of the GPL, as needed to protect the freedom of users.
60 |
61 | Finally, every program is threatened constantly by software patents.
62 | States should not allow patents to restrict development and use of
63 | software on general-purpose computers, but in those that do, we wish to
64 | avoid the special danger that patents applied to a free program could
65 | make it effectively proprietary. To prevent this, the GPL assures that
66 | patents cannot be used to render the program non-free.
67 |
68 | The precise terms and conditions for copying, distribution and
69 | modification follow.
70 |
71 | TERMS AND CONDITIONS
72 |
73 | 0. Definitions.
74 |
75 | "This License" refers to version 3 of the GNU General Public License.
76 |
77 | "Copyright" also means copyright-like laws that apply to other kinds of
78 | works, such as semiconductor masks.
79 |
80 | "The Program" refers to any copyrightable work licensed under this
81 | License. Each licensee is addressed as "you". "Licensees" and
82 | "recipients" may be individuals or organizations.
83 |
84 | To "modify" a work means to copy from or adapt all or part of the work
85 | in a fashion requiring copyright permission, other than the making of an
86 | exact copy. The resulting work is called a "modified version" of the
87 | earlier work or a work "based on" the earlier work.
88 |
89 | A "covered work" means either the unmodified Program or a work based
90 | on the Program.
91 |
92 | To "propagate" a work means to do anything with it that, without
93 | permission, would make you directly or secondarily liable for
94 | infringement under applicable copyright law, except executing it on a
95 | computer or modifying a private copy. Propagation includes copying,
96 | distribution (with or without modification), making available to the
97 | public, and in some countries other activities as well.
98 |
99 | To "convey" a work means any kind of propagation that enables other
100 | parties to make or receive copies. Mere interaction with a user through
101 | a computer network, with no transfer of a copy, is not conveying.
102 |
103 | An interactive user interface displays "Appropriate Legal Notices"
104 | to the extent that it includes a convenient and prominently visible
105 | feature that (1) displays an appropriate copyright notice, and (2)
106 | tells the user that there is no warranty for the work (except to the
107 | extent that warranties are provided), that licensees may convey the
108 | work under this License, and how to view a copy of this License. If
109 | the interface presents a list of user commands or options, such as a
110 | menu, a prominent item in the list meets this criterion.
111 |
112 | 1. Source Code.
113 |
114 | The "source code" for a work means the preferred form of the work
115 | for making modifications to it. "Object code" means any non-source
116 | form of a work.
117 |
118 | A "Standard Interface" means an interface that either is an official
119 | standard defined by a recognized standards body, or, in the case of
120 | interfaces specified for a particular programming language, one that
121 | is widely used among developers working in that language.
122 |
123 | The "System Libraries" of an executable work include anything, other
124 | than the work as a whole, that (a) is included in the normal form of
125 | packaging a Major Component, but which is not part of that Major
126 | Component, and (b) serves only to enable use of the work with that
127 | Major Component, or to implement a Standard Interface for which an
128 | implementation is available to the public in source code form. A
129 | "Major Component", in this context, means a major essential component
130 | (kernel, window system, and so on) of the specific operating system
131 | (if any) on which the executable work runs, or a compiler used to
132 | produce the work, or an object code interpreter used to run it.
133 |
134 | The "Corresponding Source" for a work in object code form means all
135 | the source code needed to generate, install, and (for an executable
136 | work) run the object code and to modify the work, including scripts to
137 | control those activities. However, it does not include the work's
138 | System Libraries, or general-purpose tools or generally available free
139 | programs which are used unmodified in performing those activities but
140 | which are not part of the work. For example, Corresponding Source
141 | includes interface definition files associated with source files for
142 | the work, and the source code for shared libraries and dynamically
143 | linked subprograms that the work is specifically designed to require,
144 | such as by intimate data communication or control flow between those
145 | subprograms and other parts of the work.
146 |
147 | The Corresponding Source need not include anything that users
148 | can regenerate automatically from other parts of the Corresponding
149 | Source.
150 |
151 | The Corresponding Source for a work in source code form is that
152 | same work.
153 |
154 | 2. Basic Permissions.
155 |
156 | All rights granted under this License are granted for the term of
157 | copyright on the Program, and are irrevocable provided the stated
158 | conditions are met. This License explicitly affirms your unlimited
159 | permission to run the unmodified Program. The output from running a
160 | covered work is covered by this License only if the output, given its
161 | content, constitutes a covered work. This License acknowledges your
162 | rights of fair use or other equivalent, as provided by copyright law.
163 |
164 | You may make, run and propagate covered works that you do not
165 | convey, without conditions so long as your license otherwise remains
166 | in force. You may convey covered works to others for the sole purpose
167 | of having them make modifications exclusively for you, or provide you
168 | with facilities for running those works, provided that you comply with
169 | the terms of this License in conveying all material for which you do
170 | not control copyright. Those thus making or running the covered works
171 | for you must do so exclusively on your behalf, under your direction
172 | and control, on terms that prohibit them from making any copies of
173 | your copyrighted material outside their relationship with you.
174 |
175 | Conveying under any other circumstances is permitted solely under
176 | the conditions stated below. Sublicensing is not allowed; section 10
177 | makes it unnecessary.
178 |
179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
180 |
181 | No covered work shall be deemed part of an effective technological
182 | measure under any applicable law fulfilling obligations under article
183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or
184 | similar laws prohibiting or restricting circumvention of such
185 | measures.
186 |
187 | When you convey a covered work, you waive any legal power to forbid
188 | circumvention of technological measures to the extent such circumvention
189 | is effected by exercising rights under this License with respect to
190 | the covered work, and you disclaim any intention to limit operation or
191 | modification of the work as a means of enforcing, against the work's
192 | users, your or third parties' legal rights to forbid circumvention of
193 | technological measures.
194 |
195 | 4. Conveying Verbatim Copies.
196 |
197 | You may convey verbatim copies of the Program's source code as you
198 | receive it, in any medium, provided that you conspicuously and
199 | appropriately publish on each copy an appropriate copyright notice;
200 | keep intact all notices stating that this License and any
201 | non-permissive terms added in accord with section 7 apply to the code;
202 | keep intact all notices of the absence of any warranty; and give all
203 | recipients a copy of this License along with the Program.
204 |
205 | You may charge any price or no price for each copy that you convey,
206 | and you may offer support or warranty protection for a fee.
207 |
208 | 5. Conveying Modified Source Versions.
209 |
210 | You may convey a work based on the Program, or the modifications to
211 | produce it from the Program, in the form of source code under the
212 | terms of section 4, provided that you also meet all of these conditions:
213 |
214 | a) The work must carry prominent notices stating that you modified
215 | it, and giving a relevant date.
216 |
217 | b) The work must carry prominent notices stating that it is
218 | released under this License and any conditions added under section
219 | 7. This requirement modifies the requirement in section 4 to
220 | "keep intact all notices".
221 |
222 | c) You must license the entire work, as a whole, under this
223 | License to anyone who comes into possession of a copy. This
224 | License will therefore apply, along with any applicable section 7
225 | additional terms, to the whole of the work, and all its parts,
226 | regardless of how they are packaged. This License gives no
227 | permission to license the work in any other way, but it does not
228 | invalidate such permission if you have separately received it.
229 |
230 | d) If the work has interactive user interfaces, each must display
231 | Appropriate Legal Notices; however, if the Program has interactive
232 | interfaces that do not display Appropriate Legal Notices, your
233 | work need not make them do so.
234 |
235 | A compilation of a covered work with other separate and independent
236 | works, which are not by their nature extensions of the covered work,
237 | and which are not combined with it such as to form a larger program,
238 | in or on a volume of a storage or distribution medium, is called an
239 | "aggregate" if the compilation and its resulting copyright are not
240 | used to limit the access or legal rights of the compilation's users
241 | beyond what the individual works permit. Inclusion of a covered work
242 | in an aggregate does not cause this License to apply to the other
243 | parts of the aggregate.
244 |
245 | 6. Conveying Non-Source Forms.
246 |
247 | You may convey a covered work in object code form under the terms
248 | of sections 4 and 5, provided that you also convey the
249 | machine-readable Corresponding Source under the terms of this License,
250 | in one of these ways:
251 |
252 | a) Convey the object code in, or embodied in, a physical product
253 | (including a physical distribution medium), accompanied by the
254 | Corresponding Source fixed on a durable physical medium
255 | customarily used for software interchange.
256 |
257 | b) Convey the object code in, or embodied in, a physical product
258 | (including a physical distribution medium), accompanied by a
259 | written offer, valid for at least three years and valid for as
260 | long as you offer spare parts or customer support for that product
261 | model, to give anyone who possesses the object code either (1) a
262 | copy of the Corresponding Source for all the software in the
263 | product that is covered by this License, on a durable physical
264 | medium customarily used for software interchange, for a price no
265 | more than your reasonable cost of physically performing this
266 | conveying of source, or (2) access to copy the
267 | Corresponding Source from a network server at no charge.
268 |
269 | c) Convey individual copies of the object code with a copy of the
270 | written offer to provide the Corresponding Source. This
271 | alternative is allowed only occasionally and noncommercially, and
272 | only if you received the object code with such an offer, in accord
273 | with subsection 6b.
274 |
275 | d) Convey the object code by offering access from a designated
276 | place (gratis or for a charge), and offer equivalent access to the
277 | Corresponding Source in the same way through the same place at no
278 | further charge. You need not require recipients to copy the
279 | Corresponding Source along with the object code. If the place to
280 | copy the object code is a network server, the Corresponding Source
281 | may be on a different server (operated by you or a third party)
282 | that supports equivalent copying facilities, provided you maintain
283 | clear directions next to the object code saying where to find the
284 | Corresponding Source. Regardless of what server hosts the
285 | Corresponding Source, you remain obligated to ensure that it is
286 | available for as long as needed to satisfy these requirements.
287 |
288 | e) Convey the object code using peer-to-peer transmission, provided
289 | you inform other peers where the object code and Corresponding
290 | Source of the work are being offered to the general public at no
291 | charge under subsection 6d.
292 |
293 | A separable portion of the object code, whose source code is excluded
294 | from the Corresponding Source as a System Library, need not be
295 | included in conveying the object code work.
296 |
297 | A "User Product" is either (1) a "consumer product", which means any
298 | tangible personal property which is normally used for personal, family,
299 | or household purposes, or (2) anything designed or sold for incorporation
300 | into a dwelling. In determining whether a product is a consumer product,
301 | doubtful cases shall be resolved in favor of coverage. For a particular
302 | product received by a particular user, "normally used" refers to a
303 | typical or common use of that class of product, regardless of the status
304 | of the particular user or of the way in which the particular user
305 | actually uses, or expects or is expected to use, the product. A product
306 | is a consumer product regardless of whether the product has substantial
307 | commercial, industrial or non-consumer uses, unless such uses represent
308 | the only significant mode of use of the product.
309 |
310 | "Installation Information" for a User Product means any methods,
311 | procedures, authorization keys, or other information required to install
312 | and execute modified versions of a covered work in that User Product from
313 | a modified version of its Corresponding Source. The information must
314 | suffice to ensure that the continued functioning of the modified object
315 | code is in no case prevented or interfered with solely because
316 | modification has been made.
317 |
318 | If you convey an object code work under this section in, or with, or
319 | specifically for use in, a User Product, and the conveying occurs as
320 | part of a transaction in which the right of possession and use of the
321 | User Product is transferred to the recipient in perpetuity or for a
322 | fixed term (regardless of how the transaction is characterized), the
323 | Corresponding Source conveyed under this section must be accompanied
324 | by the Installation Information. But this requirement does not apply
325 | if neither you nor any third party retains the ability to install
326 | modified object code on the User Product (for example, the work has
327 | been installed in ROM).
328 |
329 | The requirement to provide Installation Information does not include a
330 | requirement to continue to provide support service, warranty, or updates
331 | for a work that has been modified or installed by the recipient, or for
332 | the User Product in which it has been modified or installed. Access to a
333 | network may be denied when the modification itself materially and
334 | adversely affects the operation of the network or violates the rules and
335 | protocols for communication across the network.
336 |
337 | Corresponding Source conveyed, and Installation Information provided,
338 | in accord with this section must be in a format that is publicly
339 | documented (and with an implementation available to the public in
340 | source code form), and must require no special password or key for
341 | unpacking, reading or copying.
342 |
343 | 7. Additional Terms.
344 |
345 | "Additional permissions" are terms that supplement the terms of this
346 | License by making exceptions from one or more of its conditions.
347 | Additional permissions that are applicable to the entire Program shall
348 | be treated as though they were included in this License, to the extent
349 | that they are valid under applicable law. If additional permissions
350 | apply only to part of the Program, that part may be used separately
351 | under those permissions, but the entire Program remains governed by
352 | this License without regard to the additional permissions.
353 |
354 | When you convey a copy of a covered work, you may at your option
355 | remove any additional permissions from that copy, or from any part of
356 | it. (Additional permissions may be written to require their own
357 | removal in certain cases when you modify the work.) You may place
358 | additional permissions on material, added by you to a covered work,
359 | for which you have or can give appropriate copyright permission.
360 |
361 | Notwithstanding any other provision of this License, for material you
362 | add to a covered work, you may (if authorized by the copyright holders of
363 | that material) supplement the terms of this License with terms:
364 |
365 | a) Disclaiming warranty or limiting liability differently from the
366 | terms of sections 15 and 16 of this License; or
367 |
368 | b) Requiring preservation of specified reasonable legal notices or
369 | author attributions in that material or in the Appropriate Legal
370 | Notices displayed by works containing it; or
371 |
372 | c) Prohibiting misrepresentation of the origin of that material, or
373 | requiring that modified versions of such material be marked in
374 | reasonable ways as different from the original version; or
375 |
376 | d) Limiting the use for publicity purposes of names of licensors or
377 | authors of the material; or
378 |
379 | e) Declining to grant rights under trademark law for use of some
380 | trade names, trademarks, or service marks; or
381 |
382 | f) Requiring indemnification of licensors and authors of that
383 | material by anyone who conveys the material (or modified versions of
384 | it) with contractual assumptions of liability to the recipient, for
385 | any liability that these contractual assumptions directly impose on
386 | those licensors and authors.
387 |
388 | All other non-permissive additional terms are considered "further
389 | restrictions" within the meaning of section 10. If the Program as you
390 | received it, or any part of it, contains a notice stating that it is
391 | governed by this License along with a term that is a further
392 | restriction, you may remove that term. If a license document contains
393 | a further restriction but permits relicensing or conveying under this
394 | License, you may add to a covered work material governed by the terms
395 | of that license document, provided that the further restriction does
396 | not survive such relicensing or conveying.
397 |
398 | If you add terms to a covered work in accord with this section, you
399 | must place, in the relevant source files, a statement of the
400 | additional terms that apply to those files, or a notice indicating
401 | where to find the applicable terms.
402 |
403 | Additional terms, permissive or non-permissive, may be stated in the
404 | form of a separately written license, or stated as exceptions;
405 | the above requirements apply either way.
406 |
407 | 8. Termination.
408 |
409 | You may not propagate or modify a covered work except as expressly
410 | provided under this License. Any attempt otherwise to propagate or
411 | modify it is void, and will automatically terminate your rights under
412 | this License (including any patent licenses granted under the third
413 | paragraph of section 11).
414 |
415 | However, if you cease all violation of this License, then your
416 | license from a particular copyright holder is reinstated (a)
417 | provisionally, unless and until the copyright holder explicitly and
418 | finally terminates your license, and (b) permanently, if the copyright
419 | holder fails to notify you of the violation by some reasonable means
420 | prior to 60 days after the cessation.
421 |
422 | Moreover, your license from a particular copyright holder is
423 | reinstated permanently if the copyright holder notifies you of the
424 | violation by some reasonable means, this is the first time you have
425 | received notice of violation of this License (for any work) from that
426 | copyright holder, and you cure the violation prior to 30 days after
427 | your receipt of the notice.
428 |
429 | Termination of your rights under this section does not terminate the
430 | licenses of parties who have received copies or rights from you under
431 | this License. If your rights have been terminated and not permanently
432 | reinstated, you do not qualify to receive new licenses for the same
433 | material under section 10.
434 |
435 | 9. Acceptance Not Required for Having Copies.
436 |
437 | You are not required to accept this License in order to receive or
438 | run a copy of the Program. Ancillary propagation of a covered work
439 | occurring solely as a consequence of using peer-to-peer transmission
440 | to receive a copy likewise does not require acceptance. However,
441 | nothing other than this License grants you permission to propagate or
442 | modify any covered work. These actions infringe copyright if you do
443 | not accept this License. Therefore, by modifying or propagating a
444 | covered work, you indicate your acceptance of this License to do so.
445 |
446 | 10. Automatic Licensing of Downstream Recipients.
447 |
448 | Each time you convey a covered work, the recipient automatically
449 | receives a license from the original licensors, to run, modify and
450 | propagate that work, subject to this License. You are not responsible
451 | for enforcing compliance by third parties with this License.
452 |
453 | An "entity transaction" is a transaction transferring control of an
454 | organization, or substantially all assets of one, or subdividing an
455 | organization, or merging organizations. If propagation of a covered
456 | work results from an entity transaction, each party to that
457 | transaction who receives a copy of the work also receives whatever
458 | licenses to the work the party's predecessor in interest had or could
459 | give under the previous paragraph, plus a right to possession of the
460 | Corresponding Source of the work from the predecessor in interest, if
461 | the predecessor has it or can get it with reasonable efforts.
462 |
463 | You may not impose any further restrictions on the exercise of the
464 | rights granted or affirmed under this License. For example, you may
465 | not impose a license fee, royalty, or other charge for exercise of
466 | rights granted under this License, and you may not initiate litigation
467 | (including a cross-claim or counterclaim in a lawsuit) alleging that
468 | any patent claim is infringed by making, using, selling, offering for
469 | sale, or importing the Program or any portion of it.
470 |
471 | 11. Patents.
472 |
473 | A "contributor" is a copyright holder who authorizes use under this
474 | License of the Program or a work on which the Program is based. The
475 | work thus licensed is called the contributor's "contributor version".
476 |
477 | A contributor's "essential patent claims" are all patent claims
478 | owned or controlled by the contributor, whether already acquired or
479 | hereafter acquired, that would be infringed by some manner, permitted
480 | by this License, of making, using, or selling its contributor version,
481 | but do not include claims that would be infringed only as a
482 | consequence of further modification of the contributor version. For
483 | purposes of this definition, "control" includes the right to grant
484 | patent sublicenses in a manner consistent with the requirements of
485 | this License.
486 |
487 | Each contributor grants you a non-exclusive, worldwide, royalty-free
488 | patent license under the contributor's essential patent claims, to
489 | make, use, sell, offer for sale, import and otherwise run, modify and
490 | propagate the contents of its contributor version.
491 |
492 | In the following three paragraphs, a "patent license" is any express
493 | agreement or commitment, however denominated, not to enforce a patent
494 | (such as an express permission to practice a patent or covenant not to
495 | sue for patent infringement). To "grant" such a patent license to a
496 | party means to make such an agreement or commitment not to enforce a
497 | patent against the party.
498 |
499 | If you convey a covered work, knowingly relying on a patent license,
500 | and the Corresponding Source of the work is not available for anyone
501 | to copy, free of charge and under the terms of this License, through a
502 | publicly available network server or other readily accessible means,
503 | then you must either (1) cause the Corresponding Source to be so
504 | available, or (2) arrange to deprive yourself of the benefit of the
505 | patent license for this particular work, or (3) arrange, in a manner
506 | consistent with the requirements of this License, to extend the patent
507 | license to downstream recipients. "Knowingly relying" means you have
508 | actual knowledge that, but for the patent license, your conveying the
509 | covered work in a country, or your recipient's use of the covered work
510 | in a country, would infringe one or more identifiable patents in that
511 | country that you have reason to believe are valid.
512 |
513 | If, pursuant to or in connection with a single transaction or
514 | arrangement, you convey, or propagate by procuring conveyance of, a
515 | covered work, and grant a patent license to some of the parties
516 | receiving the covered work authorizing them to use, propagate, modify
517 | or convey a specific copy of the covered work, then the patent license
518 | you grant is automatically extended to all recipients of the covered
519 | work and works based on it.
520 |
521 | A patent license is "discriminatory" if it does not include within
522 | the scope of its coverage, prohibits the exercise of, or is
523 | conditioned on the non-exercise of one or more of the rights that are
524 | specifically granted under this License. You may not convey a covered
525 | work if you are a party to an arrangement with a third party that is
526 | in the business of distributing software, under which you make payment
527 | to the third party based on the extent of your activity of conveying
528 | the work, and under which the third party grants, to any of the
529 | parties who would receive the covered work from you, a discriminatory
530 | patent license (a) in connection with copies of the covered work
531 | conveyed by you (or copies made from those copies), or (b) primarily
532 | for and in connection with specific products or compilations that
533 | contain the covered work, unless you entered into that arrangement,
534 | or that patent license was granted, prior to 28 March 2007.
535 |
536 | Nothing in this License shall be construed as excluding or limiting
537 | any implied license or other defenses to infringement that may
538 | otherwise be available to you under applicable patent law.
539 |
540 | 12. No Surrender of Others' Freedom.
541 |
542 | If conditions are imposed on you (whether by court order, agreement or
543 | otherwise) that contradict the conditions of this License, they do not
544 | excuse you from the conditions of this License. If you cannot convey a
545 | covered work so as to satisfy simultaneously your obligations under this
546 | License and any other pertinent obligations, then as a consequence you may
547 | not convey it at all. For example, if you agree to terms that obligate you
548 | to collect a royalty for further conveying from those to whom you convey
549 | the Program, the only way you could satisfy both those terms and this
550 | License would be to refrain entirely from conveying the Program.
551 |
552 | 13. Use with the GNU Affero General Public License.
553 |
554 | Notwithstanding any other provision of this License, you have
555 | permission to link or combine any covered work with a work licensed
556 | under version 3 of the GNU Affero General Public License into a single
557 | combined work, and to convey the resulting work. The terms of this
558 | License will continue to apply to the part which is the covered work,
559 | but the special requirements of the GNU Affero General Public License,
560 | section 13, concerning interaction through a network will apply to the
561 | combination as such.
562 |
563 | 14. Revised Versions of this License.
564 |
565 | The Free Software Foundation may publish revised and/or new versions of
566 | the GNU General Public License from time to time. Such new versions will
567 | be similar in spirit to the present version, but may differ in detail to
568 | address new problems or concerns.
569 |
570 | Each version is given a distinguishing version number. If the
571 | Program specifies that a certain numbered version of the GNU General
572 | Public License "or any later version" applies to it, you have the
573 | option of following the terms and conditions either of that numbered
574 | version or of any later version published by the Free Software
575 | Foundation. If the Program does not specify a version number of the
576 | GNU General Public License, you may choose any version ever published
577 | by the Free Software Foundation.
578 |
579 | If the Program specifies that a proxy can decide which future
580 | versions of the GNU General Public License can be used, that proxy's
581 | public statement of acceptance of a version permanently authorizes you
582 | to choose that version for the Program.
583 |
584 | Later license versions may give you additional or different
585 | permissions. However, no additional obligations are imposed on any
586 | author or copyright holder as a result of your choosing to follow a
587 | later version.
588 |
589 | 15. Disclaimer of Warranty.
590 |
591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599 |
600 | 16. Limitation of Liability.
601 |
602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610 | SUCH DAMAGES.
611 |
612 | 17. Interpretation of Sections 15 and 16.
613 |
614 | If the disclaimer of warranty and limitation of liability provided
615 | above cannot be given local legal effect according to their terms,
616 | reviewing courts shall apply local law that most closely approximates
617 | an absolute waiver of all civil liability in connection with the
618 | Program, unless a warranty or assumption of liability accompanies a
619 | copy of the Program in return for a fee.
620 |
621 | END OF TERMS AND CONDITIONS
622 |
623 | How to Apply These Terms to Your New Programs
624 |
625 | If you develop a new program, and you want it to be of the greatest
626 | possible use to the public, the best way to achieve this is to make it
627 | free software which everyone can redistribute and change under these terms.
628 |
629 | To do so, attach the following notices to the program. It is safest
630 | to attach them to the start of each source file to most effectively
631 | state the exclusion of warranty; and each file should have at least
632 | the "copyright" line and a pointer to where the full notice is found.
633 |
634 | <one line to give the program's name and a brief idea of what it does.>
635 | Copyright (C) <year>  <name of author>
636 |
637 | This program is free software: you can redistribute it and/or modify
638 | it under the terms of the GNU General Public License as published by
639 | the Free Software Foundation, either version 3 of the License, or
640 | (at your option) any later version.
641 |
642 | This program is distributed in the hope that it will be useful,
643 | but WITHOUT ANY WARRANTY; without even the implied warranty of
644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
645 | GNU General Public License for more details.
646 |
647 | You should have received a copy of the GNU General Public License
648 | along with this program. If not, see <https://www.gnu.org/licenses/>.
649 |
650 | Also add information on how to contact you by electronic and paper mail.
651 |
652 | If the program does terminal interaction, make it output a short
653 | notice like this when it starts in an interactive mode:
654 |
655 | <program>  Copyright (C) <year>  <name of author>
656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657 | This is free software, and you are welcome to redistribute it
658 | under certain conditions; type `show c' for details.
659 |
660 | The hypothetical commands `show w' and `show c' should show the appropriate
661 | parts of the General Public License. Of course, your program's commands
662 | might be different; for a GUI interface, you would use an "about box".
663 |
664 | You should also get your employer (if you work as a programmer) or school,
665 | if any, to sign a "copyright disclaimer" for the program, if necessary.
666 | For more information on this, and how to apply and follow the GNU GPL, see
667 | <https://www.gnu.org/licenses/>.
668 |
669 | The GNU General Public License does not permit incorporating your program
670 | into proprietary programs. If your program is a subroutine library, you
671 | may consider it more useful to permit linking proprietary applications with
672 | the library. If this is what you want to do, use the GNU Lesser General
673 | Public License instead of this License. But first, please read
674 | <https://www.gnu.org/licenses/why-not-lgpl.html>.
675 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 | # OSMData
3 |
4 | The [OpenStreetMap project](https://www.openstreetmap.org/) collects an amazing
5 | amount of geodata and makes it available to the world for free. But the raw
6 | OpenStreetMap data is hard to use. This repository contains scripts to set up
7 | a server that processes some OSM data and brings it into a format that is
8 | easier to use.
9 |
10 | Currently these scripts can be used to derive the following datasets from
11 | OSM data:
12 |
13 | * Coastline data (in the form of linestrings, or as land or water polygons
14 |   for the world's land masses or oceans).
15 | * Antarctic icesheet data.
16 |
17 |
18 | The icesheet scripts are based on https://github.com/imagico/icesheet_proc.
19 |
20 | ## Public server
21 |
22 | A server running this code and offering the results for download to the general
23 | public is at https://osmdata.openstreetmap.de/ .
24 |
25 | ## Overview
26 |
27 | The scripts are intended to work with the [Hetzner
28 | Cloud](https://www.hetzner.com/cloud), but it should be possible to port
29 | them to other cloud providers.
30 |
31 | One server runs all the time as the web server and "master of ceremonies". It
32 | regularly starts other servers to update a planet file and do the data
33 | processing. The results are then copied back to the master server and are
34 | available for download from there.
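
Each update run is driven by one of the `run-update-*.sh` scripts in the
`master` directory. Below is a simplified sketch of one such run; the worker
name, the result path and the update script path are illustrative only, see
`master/run-update-anomalies.sh` for a real example:

```
#!/bin/bash
# Outline of one update cycle -- the real logic lives in master/run-update-*.sh.
set -euo pipefail

SERVER=update-worker    # placeholder name; the real scripts use e.g. update-anomalies

# Start a worker server with the shared "planet" volume attached.
# (The real scripts also pass cloud-init user data that sets up the robot user.)
hcloud server create --name $SERVER --location nbg1 --type cx41 \
    --image debian-10 --ssh-key admin --volume planet

IP=$(hcloud server ip $SERVER)

# Run the processing on the worker and copy the results back to the master.
ssh "robot@${IP}" ./update.sh
scp "robot@${IP}:/tmp/results/*" /data/new/    # /tmp/results is a placeholder path

# Tear the worker down again; only the master keeps running.
hcloud volume detach planet || true
hcloud server delete $SERVER
```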
35 |
36 | ## Setting up a master server
37 |
38 | See [the master README](master/README.md) on how to set up a master server.
39 |
40 | ## License
41 |
42 | Unless otherwise mentioned, everything in this repository is available under
43 | the GNU GENERAL PUBLIC LICENSE Version 3. See the file COPYING for the
44 | complete text of the license.
45 |
46 |
--------------------------------------------------------------------------------
/master/README.md:
--------------------------------------------------------------------------------
1 |
2 | # Setup of Master Server
3 |
4 | The files in this directory are used to set up the master server. You will
5 | need a [Hetzner Cloud](https://www.hetzner.com/cloud) account and the
6 | [hcloud command line tool](https://github.com/hetznercloud/cli).
7 |
8 | Here are the steps needed to install the master server:
9 |
10 | * Go to the [Hetzner Cloud console](https://console.hetzner.cloud/) and log
11 | in.
12 | * Add a new project.
13 | * Add one or more of your ssh public keys. Name one `admin`.
14 | * Add a new API token to the project. Put the API token somewhere safe.
15 | * Create an hcloud context on your own machine: `hcloud context create osmdata`.
16 | It will ask you for the token you have just created.
17 | * Create a new cloud server. (You can use the `--ssh-key` option multiple
18 | times if you have several keys to add.) On your own machine run:
19 |
20 | ```
21 | hcloud server create \
22 | --name osmdata \
23 | --location nbg1 \
24 | --type cx11 \
25 | --image debian-12 \
26 | --ssh-key admin
27 | ```
28 |
29 | This uses the cheapest cloud server they have, which costs 4.51 EUR per month.
30 |
31 | * Still on your own machine, create a new volume:
32 |
33 | ```
34 | hcloud volume create \
35 | --name planet \
36 | --size 120 \
37 | --server osmdata \
38 | --format ext4 \
39 | --automount
40 | ```
41 |
42 | * You should now be able to log into the master server as root (`hcloud server
43 | ssh osmdata`) and see a volume mounted somewhere under `/mnt`.
44 | * Copy the script `init.sh` to the new server and run it as `root` user:
45 |
46 | ```
47 | IP=`hcloud server describe -o 'format={{.PublicNet.IPv4.IP}}' osmdata`
48 | echo $IP
49 | scp osmdata/master/init.sh root@$IP:/tmp/
50 | ssh -t root@$IP /tmp/init.sh
51 | ```
52 |
53 | The script will ask for the Hetzner Cloud token at some point, which you have
54 | to enter. The `-t` option on the `ssh` command is important; otherwise it
55 | can't ask for the token.
56 |
57 | * If this script runs through without errors, you are done with the setup of
58 |   the master server and you can now log in as the `robot` user:
59 |
60 | ```
61 | hcloud server ssh -u robot osmdata
62 | ```
63 |
64 | # Operation
65 |
66 | On the master server, you have a script `/usr/local/bin/run-update.sh` which
67 | can be run as the `robot` user to do an update run. The first time this is run, it
68 | will download a complete planet and update it using the hourly replication
69 | files. Further runs will update from the planet of the last run. This will also
70 | run the data processing and put the results into `/data/new/`.
71 |
72 | After testing this you might want to create a cronjob for it.
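
For example, `master/crontab-robot` (installed as `/etc/cron.d/robot` by
`init.sh`) already contains a commented-out entry for this; enable it by
removing the leading `#`:

```
17 4 * * * robot /home/robot/osmdata/master/run-update.sh
```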
73 |
74 | See the comments at the beginning of the `run-update.sh` script for some
75 | options.
76 |
77 |
78 | # Notes
79 |
80 | * The init script installs the `certbot` software for setting up Let's Encrypt
81 |   certificates, but doesn't actually use it. You have to do the TLS setup
82 |   manually if you want it. To do so, run
83 |   `certbot certonly --webroot -w /var/lib/letsencrypt/webroot/ -d osmdata.openstreetmap.de`.
84 |   After that you can activate the SSL web site with `a2ensite 000-default-ssl.conf` (see the combined example after these notes).
85 | * While testing you might want to run the update script in
86 | [tmux](https://github.com/tmux/tmux), because it will run for a few hours.
87 | `tmux` is already installed on the system.
88 |
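Putting the TLS steps from the first note together (run as `root`; the Apache
restart matches what `master/restart-apache2` does):

```
certbot certonly --webroot -w /var/lib/letsencrypt/webroot/ -d osmdata.openstreetmap.de
a2ensite 000-default-ssl.conf
systemctl restart apache2
```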
89 |
--------------------------------------------------------------------------------
/master/acme-challenge.conf:
--------------------------------------------------------------------------------
1 | Alias "/.well-known/acme-challenge/" "/var/lib/letsencrypt/webroot/.well-known/acme-challenge/"
2 |
3 | AllowOverride None
4 | Options None
5 | Require all granted
6 |
7 |
--------------------------------------------------------------------------------
/master/apache-ssl.conf:
--------------------------------------------------------------------------------
1 |
2 |
3 | ServerName osmdata.openstreetmap.de
4 | ServerAdmin webmaster@localhost
5 | DocumentRoot /srv/www/osmdata
6 |
7 | Header always set X-Frame-Options "DENY"
8 | Header always set X-XSS-Protection "1; mode=block"
9 | Header always set X-Content-Type-Options "nosniff"
10 |
11 | Alias /download/ /data/good/
12 | Alias /new/ /data/new/
13 | Alias /d/ /data/web/
14 |
15 |
16 | Options FollowSymlinks
17 | Require all granted
18 | AddType text/html .html
19 |
20 | Header always set Access-Control-Allow-Origin "*"
21 |
22 |
23 |
24 |
25 | Require all granted
26 | AddType text/html .html
27 |
28 |
29 |
30 | Require all granted
31 | AddType text/html .html
32 |
33 |
34 |
35 | Options FollowSymlinks
36 | Require all granted
37 | AddType text/html .html
38 | Header always set Access-Control-Allow-Origin "*"
39 |
40 |
41 | ErrorLog ${APACHE_LOG_DIR}/secure-error.log
42 | CustomLog ${APACHE_LOG_DIR}/secure-access.log combined
43 |
44 | SSLEngine on
45 |
46 | SSLProtocol all -SSLv2 -SSLv3
47 | SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
48 | SSLHonorCipherOrder on
49 | SSLCompression off
50 |
51 | #SSLUseStapling on
52 | #SSLStaplingResponderTimeout 5
53 | #SSLStaplingReturnResponderErrors off
54 | #SSLStaplingCache shmcb:/var/run/ocsp(128000)
55 |
56 | SSLOptions +StrictRequire
57 |
58 | SSLCertificateFile /etc/letsencrypt/live/osmdata.openstreetmap.de/fullchain.pem
59 | SSLCertificateKeyFile /etc/letsencrypt/live/osmdata.openstreetmap.de/privkey.pem
60 |
61 |
62 |
--------------------------------------------------------------------------------
/master/apache.conf:
--------------------------------------------------------------------------------
1 |
2 |
3 | ServerName osmdata.openstreetmap.de
4 | DocumentRoot /srv/www/osmdata
5 |
6 | Header always set X-Frame-Options "DENY"
7 | Header always set X-XSS-Protection "1; mode=block"
8 | Header always set X-Content-Type-Options "nosniff"
9 |
10 | # RedirectMatch permanent ^(?!/.well-known/acme-challenge/)(.*)$ https://osmdata.openstreetmap.de$1
11 |
12 | Alias /download/ /data/good/
13 | Alias /new/ /data/new/
14 | Alias /d/ /data/web/
15 |
16 |
17 | Options FollowSymlinks
18 | Require all granted
19 | AddType text/html .html
20 |
21 | Header always set Access-Control-Allow-Origin "*"
22 |
23 |
24 |
25 |
26 | Require all granted
27 | AddType text/html .html
28 |
29 | Header always set Access-Control-Allow-Origin "*"
30 |
31 |
32 |
33 |
34 | Require all granted
35 | AddType text/html .html
36 |
37 | Header always set Access-Control-Allow-Origin "*"
38 |
39 |
40 |
41 |
42 | Options FollowSymlinks
43 | Require all granted
44 | AddType text/html .html
45 | Header always set Access-Control-Allow-Origin "*"
46 |
47 |
48 | ErrorLog ${APACHE_LOG_DIR}/error.log
49 | CustomLog ${APACHE_LOG_DIR}/access.log combined
50 |
51 |
52 |
--------------------------------------------------------------------------------
/master/build-web.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 |
3 | set -euo pipefail
4 |
5 | if [ "$USER" != "robot" ]; then
6 | echo "Must be run as user robot"
7 | exit 1
8 | fi
9 |
10 | jekyll build --source ~/osmdata/web --destination /srv/www/osmdata
11 |
12 |
--------------------------------------------------------------------------------
/master/create-host-keys.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # create-host-keys.sh
4 | #
5 |
6 | set -euo pipefail
7 | set -x
8 |
9 | DIR=~/ssh
10 |
11 | mkdir -p $DIR
12 | rm -f $DIR/*
13 |
14 | echo "#cloud-config\n\nssh_keys:" >$DIR/keys.yml
15 |
16 | for type in rsa dsa ecdsa; do
17 | keyfile="$DIR/ssh_host_${type}_key"
18 | ssh-keygen -t $type -N '' -C cloud -f $keyfile >/dev/null
19 |
20 | (
21 | echo " ${type}_private: |"
22 | sed -e 's/^/ /' $keyfile
23 | echo
24 | echo -n " ${type}_public: "
25 | cat $keyfile.pub
26 | echo
27 | ) >>$DIR/keys.yml
28 |
29 | echo -n "IP " >>$DIR/known_hosts
30 | cat $keyfile.pub >>$DIR/known_hosts
31 | done
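# The resulting keys.yml is a cloud-config fragment of roughly this shape
# (abbreviated, one private/public pair per key type):
#
#   #cloud-config
#
#   ssh_keys:
#     rsa_private: |
#       -----BEGIN OPENSSH PRIVATE KEY-----
#       ...
#     rsa_public: ssh-rsa AAAA... cloud
#
# known_hosts contains the public keys prefixed with the literal word "IP",
# which the run-update-*.sh scripts later replace with the worker's address.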
32 |
33 |
--------------------------------------------------------------------------------
/master/crontab-robot:
--------------------------------------------------------------------------------
1 | # /etc/cron.d/robot
2 |
3 | PATH=/usr/local/bin:/usr/bin:/bin
4 | USER=robot
5 |
6 | */10 * * * * robot /home/robot/osmdata/master/servers2web.sh
7 |
8 | #17 4 * * * robot /home/robot/osmdata/master/run-update.sh
9 |
10 |
--------------------------------------------------------------------------------
/master/init-robot.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 |
3 | set -euo pipefail
4 | set -x
5 |
6 | cd ~
7 |
8 | MASTER=~/osmdata/master
9 |
10 | # -- Compile tools --
11 |
12 | git clone https://github.com/imagico/gdal-tools
13 | cd gdal-tools
14 | make gdal_maskcompare_wm
15 | cd ~
16 |
17 | git clone --branch 1.32.10 https://github.com/mapbox/tippecanoe
18 | cd tippecanoe
19 | make
20 | cd ~
21 |
22 | git clone https://github.com/osmcode/osm-data-anomaly-detection
23 | cd osm-data-anomaly-detection
24 | mkdir build
25 | cd build
26 | cmake ..
27 | make
28 | cd ~
29 |
30 | # -- Log --
31 |
32 | mkdir -p ~/log
33 |
34 | # -- hcloud setup --
35 |
36 | hcloud context create osmdata
37 |
38 | hcloud volume detach planet || true
39 |
40 | $MASTER/create-host-keys.sh
41 |
42 | # -- Web setup --
43 |
44 | mkdir /data/web/coastline
45 | for i in diff good new; do
46 | ln -s /data/compare/mask-$i.tiff /data/web/coastline/mask-$i.tiff
47 | ln -s /data/compare/mask-$i-cog.tiff /data/web/coastline/mask-$i-cog.tiff
48 | done
49 | ln -s /data/compare/mask-diff.geojson /data/web/coastline/mask-diff.geojson
50 |
51 | $MASTER/build-web.sh
52 |
53 | # -- SSH setup --
54 |
55 | ssh-keygen -t rsa -C robot -N '' -f ~/.ssh/id_rsa
56 |
57 | cp $MASTER/users.yml.tmpl ~/users.yml
58 |
59 | cat ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys \
60 | | sed -e 's/^/ - /' >>~/users.yml
61 |
62 |
--------------------------------------------------------------------------------
/master/init.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # Initialize the master server.
4 | #
5 | # Must be run once as user "root" on the master server when it is first
6 | # created.
7 | #
8 |
9 | set -euo pipefail
10 | set -x
11 |
12 | REPOSITORY=/home/robot/osmdata
13 | BIN=/usr/local/bin
14 |
15 | # -- Install Debian packages --
16 |
17 | apt-get update -y
18 |
19 | apt-get dist-upgrade -u -y
20 |
21 | apt-get install -y \
22 | apache2 \
23 | bc \
24 | certbot \
25 | cimg-dev \
26 | cmake \
27 | g++ \
28 | gdal-bin \
29 | git \
30 | jekyll \
31 | jq \
32 | libgdal-dev \
33 | libosmium2-dev \
34 | libproj-dev \
35 | make \
36 | osmium-tool \
37 | python3-gdal \
38 | python3-pyosmium \
39 | rsync \
40 | ruby-json \
41 | ruby-sqlite3 \
42 | spatialite-bin \
43 | sqlite3 \
44 | tmux \
45 | unzip \
46 | zip \
47 | zsh
48 |
49 | apt-get clean
50 |
51 |
52 | # -- Install hcloud cli command --
53 |
54 | wget --no-verbose -O /tmp/hcloud.tar.gz https://github.com/hetznercloud/cli/releases/download/v1.31.1/hcloud-linux-amd64.tar.gz
55 | tar xCf /tmp /tmp/hcloud.tar.gz
56 | cp /tmp/hcloud $BIN
57 |
58 |
59 | # -- Create robot user --
60 |
61 | adduser --gecos "Robot User" --disabled-password robot
62 | mkdir /home/robot/.ssh
63 | cp /root/.ssh/authorized_keys /home/robot/.ssh
64 | chown -R robot:robot /home/robot/.ssh
65 | chmod 700 /home/robot/.ssh
66 | chmod 600 /home/robot/.ssh/authorized_keys
67 |
68 |
69 | # -- Prepare planet volume --
70 |
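# The "planet" volume was automounted somewhere under /mnt when it was created
# (see master/README.md). Create the data/planet directory on it, hand it over
# to the robot user and unmount it again, so it can later be attached to the
# worker servers.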
71 | MNT=$(find /mnt -mindepth 1 -maxdepth 1 -type d)
72 | mkdir -p "$MNT/data/planet"
73 | chown -R robot:robot "$MNT/data"
74 | umount "$MNT"
75 |
76 |
77 | # -- Directory setup --
78 |
79 | mkdir -p /srv/www/osmdata
80 | chown robot:robot /srv/www/osmdata
81 |
82 | for dir in good new compare err osmi web anomalies; do
83 | mkdir -p /data/$dir
84 | chown robot:robot /data/$dir
85 | done
86 |
87 |
88 | # -- Get git repository --
89 |
90 | (cd /home/robot; su -c "git clone https://github.com/fossgis/osmdata $REPOSITORY" robot)
91 |
92 |
93 | # -- Run robot user setup --
94 |
95 | su -c /home/robot/osmdata/master/init-robot.sh robot
96 |
97 |
98 | # -- Install binaries ---
99 |
100 | cp /home/robot/gdal-tools/gdal_maskcompare_wm $BIN
101 |
102 | for script in build-web.sh release-coastline.sh run-update.sh servers2web.sh; do
103 | ln -s /home/robot/osmdata/master/$script $BIN/$script
104 | done
105 |
106 |
107 | # -- Install crontabs --
108 |
109 | cp /home/robot/osmdata/master/crontab-robot /etc/cron.d/robot
110 |
111 |
112 | # -- Letsencrypt setup --
113 |
114 | mkdir -p /var/lib/letsencrypt/webroot/
115 | mkdir -p /etc/letsencrypt/renewal-hooks/post/
116 |
117 | cp $REPOSITORY/master/restart-apache2 /etc/letsencrypt/renewal-hooks/post/
118 | chmod a+x /etc/letsencrypt/renewal-hooks/post/restart-apache2
119 |
120 |
121 | # -- Apache setup --
122 |
123 | cp $REPOSITORY/master/apache.conf /etc/apache2/sites-available/000-default.conf
124 | cp $REPOSITORY/master/apache-ssl.conf /etc/apache2/sites-available/000-default-ssl.conf
125 |
126 | cp $REPOSITORY/master/acme-challenge.conf /etc/apache2/conf-available/
127 | ln -s ../conf-available/acme-challenge.conf /etc/apache2/conf-enabled/acme-challenge.conf
128 |
129 | a2dismod status
130 | a2enmod headers
131 |
132 | systemctl restart apache2.service
133 |
134 | echo "init.sh done."
135 |
136 |
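137 | # Typical bootstrap (illustrative): the repository is only cloned by this
138 | # script itself, so copy the script to the freshly created master server
139 | # first and run it there as root, e.g.
140 | #
141 | #   scp master/init.sh root@MASTER:
142 | #   ssh root@MASTER ./init.sh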
--------------------------------------------------------------------------------
/master/release-coastline.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # release-coastline.sh
4 | #
5 |
6 | set -euo pipefail
7 |
8 | LOCK_FILE=~/log/running
9 |
10 | if [ -f $LOCK_FILE ]; then
11 | echo "Update process is running. Can not release coastline."
12 | exit 1
13 | fi
14 |
15 | date >>~/log/release-coastline.log
16 |
17 | cd /data/compare
18 |
19 | NEWEST=$(ls mask-20* | tail -1)
20 |
21 | rm -f mask-good.tiff
22 | ln -s "$NEWEST" mask-good.tiff
23 |
24 | mv /data/new/* /data/good/
25 | cp /data/good/* /data/new/
26 |
27 |
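28 | # Typical use (manual, illustrative): run as robot on the master when the
29 | # automatic difference check in scripts/coastline/compare-coastline-polygons.sh
30 | # has rejected an update that has been verified by hand:
31 | #
32 | #   release-coastline.sh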
--------------------------------------------------------------------------------
/master/restart-apache2:
--------------------------------------------------------------------------------
1 | #!/bin/sh
2 |
3 | systemctl restart apache2
4 |
5 |
--------------------------------------------------------------------------------
/master/run-update-anomalies.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # run-update-anomalies.sh
4 | #
5 |
6 | set -euo pipefail
7 |
8 | if [ "$USER" != "robot" ]; then
9 | echo "Must be run as user robot"
10 | exit 1
11 | fi
12 |
13 | echo "Running jobs: anomalies"
14 |
15 | SERVER=update-anomalies
16 |
17 | # cx41: 4 CPUs, 16 GB RAM, 160 GB disk
18 | STYPE=cx41
19 |
20 | hcloud server create \
21 | --name $SERVER \
22 | --location nbg1 \
23 | --type $STYPE \
24 | --image debian-10 \
25 | --ssh-key admin \
26 | --user-data-from-file ~/osmdata/servers/$SERVER.yml \
27 | --user-data-from-file ~/users.yml \
28 | --user-data-from-file ~/ssh/keys.yml \
29 | --volume planet
30 |
31 | IP=$(hcloud server ip $SERVER)
32 |
33 | echo "$IP"
34 |
35 | sed -e "s/^IP /${IP} /" ~/ssh/known_hosts >~/.ssh/known_hosts
36 |
37 | echo "Waiting for system to become ready..."
38 | sleep 60
39 | ssh -o ConnectTimeout=600 "robot@${IP}" cloud-init status --wait
40 | echo "System initialized."
41 |
42 | update_anomalies() {
43 | ssh "robot@${IP}" mkdir anomalies
44 | scp ~/osmdata/scripts/anomalies/* "robot@${IP}:anomalies/"
45 | scp ~/osm-data-anomaly-detection/build/src/odad-* "robot@${IP}:anomalies/"
46 |
47 | echo "Running anomalies job..."
48 | ssh "robot@${IP}" anomalies/update.sh
49 | }
50 |
51 | update_anomalies
52 |
53 | RESULT=/data/anomalies
54 | rm -fr $RESULT/new
55 | mkdir -p $RESULT/new
56 | scp "robot@$IP:/tmp/anomalies/*" $RESULT/new
57 | ssh "robot@$IP" sudo umount /mnt
58 | sync
59 |
60 | hcloud volume detach planet || true
61 |
62 | hcloud server delete $SERVER
63 |
64 | rm -fr $RESULT/old
65 |
66 | if [ -d $RESULT/cur ]; then
67 | mv $RESULT/cur $RESULT/old
68 | fi
69 |
70 | mv $RESULT/new $RESULT/cur
71 | sync
72 |
73 | rm -fr $RESULT/old
74 |
75 | if [ -f $RESULT/stats.db ]; then
76 | cp $RESULT/stats.db $RESULT/cur
77 | fi
78 |
79 | ~/osmdata/scripts/anomalies/collect-stats.sh $RESULT/cur
80 |
81 | mv $RESULT/cur/stats.db $RESULT/
82 |
83 | ~/osmdata/scripts/anomalies/stats-to-json.rb $RESULT/stats.db >$RESULT/stats.json.new
84 | sync
85 | mv $RESULT/stats.json.new $RESULT/stats.json
86 |
87 | echo "run-update-anomalies done."
88 |
89 |
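90 | # Layout under /data/anomalies after a successful run:
91 | #
92 | #   cur/        output of the latest odad run (including its per-run stats-*.db)
93 | #   stats.db    statistics accumulated over all runs (see collect-stats.sh)
94 | #   stats.json  JSON export of stats.db (see stats-to-json.rb)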
--------------------------------------------------------------------------------
/master/run-update-low-planet.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # run-update-low-planet.sh
4 | #
5 |
6 | set -euo pipefail
7 |
8 | if [ "$USER" != "robot" ]; then
9 | echo "Must be run as user robot"
10 | exit 1
11 | fi
12 |
13 | SERVER=update-low-planet
14 |
15 | # ccx41: 16 CPUs, 64 GB RAM, 360 GB disk
16 | STYPE=ccx41
17 |
18 | hcloud server create \
19 | --name $SERVER \
20 | --location nbg1 \
21 | --type $STYPE \
22 | --image debian-12 \
23 | --ssh-key admin \
24 | --user-data-from-file ~/osmdata/servers/$SERVER.yml \
25 | --user-data-from-file ~/users.yml \
26 | --user-data-from-file ~/ssh/keys.yml \
27 | --volume planet
28 |
29 | IP=$(hcloud server ip $SERVER)
30 |
31 | echo "$IP"
32 |
33 | sed -e "s/^IP /${IP} /" ~/ssh/known_hosts >~/.ssh/known_hosts
34 |
35 | echo "Waiting for system to become ready..."
36 | sleep 60
37 | ssh -o ConnectTimeout=600 "robot@${IP}" cloud-init status --wait
38 | echo "System initialized."
39 |
40 | ssh "robot@${IP}" mkdir low-planet
41 | scp ~/osmdata/scripts/low-planet/* "robot@${IP}:low-planet/"
42 |
43 | ssh "robot@${IP}" low-planet/update.sh
44 | ssh "robot@${IP}" sudo umount /mnt
45 |
46 | hcloud volume detach planet || true
47 |
48 | hcloud server delete $SERVER
49 |
50 | echo "run-update-planet done."
51 |
52 |
--------------------------------------------------------------------------------
/master/run-update-osmdata.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # run-update-osmdata.sh [JOBS...]
4 | #
5 | # run-update-osmdata.sh -- Run all jobs
6 | # run-update-osmdata.sh coastline -- Run only coastline job
7 | # run-update-osmdata.sh coastline icesheet -- Run coastline and icesheet jobs
8 | #
9 |
10 | set -euo pipefail
11 |
12 | if [ "$USER" != "robot" ]; then
13 | echo "Must be run as user robot"
14 | exit 1
15 | fi
16 |
17 | declare -A jobs
18 |
19 | if [[ $# -eq 0 ]]; then
20 | jobs['coastline']=1
21 | jobs['icesheet']=1
22 | else
23 | for job in "$@"; do
24 | jobs[$job]=1
25 | done
26 | fi
27 |
28 | echo "Running jobs: ${!jobs[*]}"
29 |
30 | SERVER=update-osmdata
31 |
32 | # cx42: 8 CPUs, 16 GB RAM, 160 GB disk
33 | STYPE=cx42
34 |
35 | VOLID=$(hcloud volume describe -o json planet | jq .id)
36 |
37 | printf "#cloud-config\nmounts:\n - [ '/dev/disk/by-id/scsi-0HC_Volume_${VOLID}', '/mnt' ]\n" | \
38 | hcloud server create \
39 | --name $SERVER \
40 | --location nbg1 \
41 | --type $STYPE \
42 | --image debian-12 \
43 | --ssh-key admin \
44 | --user-data-from-file ~/osmdata/servers/$SERVER.yml \
45 | --user-data-from-file ~/users.yml \
46 | --user-data-from-file ~/ssh/keys.yml \
47 | --user-data-from-file - \
48 | --volume planet
49 |
50 | IP=$(hcloud server ip $SERVER)
51 |
52 | echo "$IP"
53 |
54 | sed -e "s/^IP /${IP} /" ~/ssh/known_hosts >~/.ssh/known_hosts
55 |
56 | echo "Waiting for system to become ready..."
57 | sleep 60
58 | ssh -o ConnectTimeout=600 "robot@$IP" cloud-init status --wait
59 | echo "System initialized."
60 |
61 | update_job() {
62 | local job=$1
63 |
64 | # shellcheck disable=SC2029
65 | ssh "robot@$IP" mkdir "$job"
66 | scp ~/osmdata/scripts/"$job"/* "robot@$IP:$job/"
67 |
68 | echo "Running $job job..."
69 | # shellcheck disable=SC2029
70 | ssh "robot@$IP" "$job/update.sh"
71 |
72 | echo "Copying results of $job job to master..."
73 | scp "robot@$IP:data/$job/results/*.zip" /data/new/
74 | sync
75 | }
76 |
77 | if [[ -v jobs[coastline] ]]; then
78 | update_job coastline
79 | scp -C "robot@$IP:data/coastline/osmi-coastlines.db" /data/osmi/
80 | scp "robot@$IP:data/coastline/osmi/*.json.gz" /data/err/
81 | sync
82 | mv /data/osmi/osmi-coastlines.db /data/web/coastline/
83 | fi
84 |
85 | if [[ -v jobs[icesheet] ]]; then
86 | update_job icesheet
87 | fi
88 |
89 | scp "robot@$IP:/mnt/data/planet/last-update" /data/new/
90 | ssh "robot@$IP" sudo umount /mnt
91 |
92 | hcloud volume detach planet || true
93 |
94 | hcloud server delete $SERVER
95 |
96 | echo "run-update-osmdata done."
97 |
98 |
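99 | # For reference, the printf at the top pipes an extra cloud-config snippet
100 | # into "hcloud server create" (via --user-data-from-file -). For a volume id
101 | # of e.g. 12345678 it expands to:
102 | #
103 | #   #cloud-config
104 | #   mounts:
105 | #    - [ '/dev/disk/by-id/scsi-0HC_Volume_12345678', '/mnt' ]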
--------------------------------------------------------------------------------
/master/run-update-planet.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # run-update-planet.sh
4 | #
5 |
6 | set -euo pipefail
7 |
8 | if [ "$USER" != "robot" ]; then
9 | echo "Must be run as user robot"
10 | exit 1
11 | fi
12 |
13 | SERVER=update-planet
14 |
15 | # cx42: 8 CPUs, 16 GB RAM, 160 GB disk
16 | STYPE=cx42
17 |
18 | VOLID=$(hcloud volume describe -o json planet | jq .id)
19 |
20 | printf "#cloud-config\nmounts:\n - [ 'ID=scsi-0HC_Volume_${VOLID}', '/mnt' ]\n" | \
21 | hcloud server create \
22 | --name $SERVER \
23 | --location nbg1 \
24 | --type $STYPE \
25 | --image debian-12 \
26 | --ssh-key admin \
27 | --user-data-from-file ~/osmdata/servers/$SERVER.yml \
28 | --user-data-from-file ~/users.yml \
29 | --user-data-from-file ~/ssh/keys.yml \
30 | --user-data-from-file - \
31 | --volume planet
32 |
33 | IP=$(hcloud server ip $SERVER)
34 |
35 | echo "$IP"
36 |
37 | sed -e "s/^IP /${IP} /" ~/ssh/known_hosts >~/.ssh/known_hosts
38 |
39 | echo "Waiting for system to become ready..."
40 | sleep 60
41 | ssh -o ConnectTimeout=600 "robot@${IP}" cloud-init status --wait
42 | echo "System initialized."
43 |
44 | ssh "robot@${IP}" mkdir planet
45 | scp ~/osmdata/scripts/planet/* "robot@${IP}:planet/"
46 |
47 | ssh "robot@${IP}" planet/update.sh
48 | ssh "robot@${IP}" sudo umount /mnt
49 |
50 | hcloud volume detach planet || true
51 |
52 | hcloud server delete $SERVER
53 |
54 | echo "run-update-planet done."
55 |
56 |
--------------------------------------------------------------------------------
/master/run-update.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # run-update.sh [-p] [JOBS...]
4 | #
5 | # Run data update job. This will update the planet file and then do various
6 | # data exports. Use "-p" to not do the planet update.
7 | #
8 | # If JOBS isn't used, all jobs will be run, otherwise only the specified
9 | # jobs will be run.
10 | #
11 | # run-update.sh -- Update planet, run all jobs
12 | # run-update.sh -p -- Do not update planet, run all jobs
13 | # run-update.sh coastline -- Update planet, run only coastline job
14 | # run-update.sh coastline icesheet -- Update planet, run coastline and icesheet jobs
15 | #
16 |
17 | set -euo pipefail
18 |
19 | if [ "$USER" != "robot" ]; then
20 | echo "Must be run as user robot"
21 | exit 1
22 | fi
23 |
24 | iso_date='+%Y-%m-%dT%H:%M:%S'
25 | STARTTIME=$(date $iso_date)
26 | LOGFILE=~/log/run-$STARTTIME.log
27 | LOCK_FILE=~/log/running
28 |
29 | exec >"$LOGFILE" 2>&1
30 |
31 | echo "$STARTTIME" >$LOCK_FILE
32 |
33 | date
34 |
35 | if [[ $# -ge 1 && $1 == "-p" ]]; then
36 | shift
37 | else
38 | echo "Running planet update..."
39 |     ~/osmdata/master/run-update-planet.sh
[...]
--------------------------------------------------------------------------------
/master/servers2web.sh:
--------------------------------------------------------------------------------
[... /srv/www/osmdata/internal/servers ...]
9 |
10 | date '+%Y-%m-%dT%H:%M:%S'
11 |
12 | hcloud server list
13 |
14 |
--------------------------------------------------------------------------------
/master/users.yml.tmpl:
--------------------------------------------------------------------------------
1 | #cloud-config
2 |
3 | disable_root: false
4 |
5 | users:
6 | - name: robot
7 | gecos: Robot User
8 | shell: /bin/bash
9 | sudo: |
10 | ALL=(ALL) NOPASSWD:ALL
11 | ssh-authorized-keys:
12 |
--------------------------------------------------------------------------------
/scripts/anomalies/collect-stats.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # collect-stats.sh DIR
4 | #
5 |
6 | set -euo pipefail
7 |
8 | if [ -z "$1" ]; then
9 | echo "Usage: collect-stats.sh DIR"
10 | exit 2
11 | fi
12 |
13 | DIR=$1
14 |
15 | echo 'CREATE TABLE IF NOT EXISTS stats (date TEXT, key TEXT, value INT64 DEFAULT 0);' \
16 | | sqlite3 -bail -batch "$DIR/stats.db"
17 |
18 | echo 'CREATE TABLE IF NOT EXISTS new_stats (date TEXT, key TEXT, value INT64 DEFAULT 0);' \
19 | | sqlite3 -bail -batch "$DIR/stats.db"
20 |
21 | for db in $DIR/stats-*.db; do
22 | echo "$db:"
23 | echo "INSERT INTO new_stats SELECT * FROM db.stats;" \
24 | | sqlite3 -bail -batch -echo -cmd "ATTACH DATABASE '$db' AS db;" "$DIR/stats.db"
25 | done
26 |
27 | echo "UPDATE new_stats SET date = (SELECT max(date) FROM new_stats);" \
28 | | sqlite3 -bail -batch -echo "$DIR/stats.db"
29 | echo "INSERT INTO stats SELECT * FROM new_stats;" \
30 | | sqlite3 -bail -batch -echo "$DIR/stats.db"
31 | echo "DROP TABLE new_stats;" \
32 | | sqlite3 -bail -batch -echo "$DIR/stats.db"
33 |
34 |
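35 | # Example (as called from master/run-update-anomalies.sh): merge the per-run
36 | # stats-*.db files in /data/anomalies/cur into /data/anomalies/cur/stats.db:
37 | #
38 | #   collect-stats.sh /data/anomalies/cur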
--------------------------------------------------------------------------------
/scripts/anomalies/stats-to-json.rb:
--------------------------------------------------------------------------------
1 | #!/usr/bin/ruby
2 | #
3 | # stats-to-json.rb DATABASE
4 | #
5 |
6 | require 'json'
7 | require 'sqlite3'
8 |
9 | filename = ARGV[0]
10 |
11 | db = SQLite3::Database.new(filename, { :readonly => true })
12 | db.results_as_hash = true
13 |
14 | datahash = {}
15 |
16 | db.execute('SELECT * FROM stats ORDER BY date, key') do |row|
17 | date = row['date'].sub(/T.*$/, '')
18 | if !datahash[date]
19 | datahash[date] = {};
20 | end
21 | datahash[date][row['key']] = row['value']
22 | end
23 |
24 | data = []
25 |
26 | datahash.keys.sort.each do |date|
27 | data << [date, datahash[date]]
28 | end
29 |
30 | puts JSON.pretty_generate(data)
31 |
32 |
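33 | # The output is a JSON array of [date, {key => value}] pairs, one entry per
34 | # day, for example (key names depend on the odad tools that wrote the stats):
35 | #
36 | #   [
37 | #     ["2024-01-01", {"some_anomaly": 12, "other_anomaly": 3}],
38 | #     ["2024-01-08", {"some_anomaly": 11, "other_anomaly": 5}]
39 | #   ]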
--------------------------------------------------------------------------------
/scripts/anomalies/update.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #------------------------------------------------------------------------------
3 | #
4 | # anomalies/update.sh
5 | #
6 | #------------------------------------------------------------------------------
7 |
8 | set -euo pipefail
9 | set -x
10 |
11 | iso_date='+%Y-%m-%dT%H:%M:%S'
12 |
13 | DATADIR=/tmp/anomalies
14 | PLANETDIR=/mnt/data/planet
15 | ANOMALDIR=/mnt/data/anomalies
16 |
17 | PLANET=$PLANETDIR/planet.osm.pbf
18 | LOW_PLANET=$PLANETDIR/low-planet.osm.pbf
19 |
20 | mkdir -p $DATADIR
21 |
22 | date $iso_date
23 |
24 | export OSMIUM_POOL_THREADS=3
25 | for prog in ~/anomalies/odad-find-*; do
26 | $prog $LOW_PLANET $DATADIR
27 | done
28 |
29 | date $iso_date
30 |
31 | df -h
32 |
33 |
--------------------------------------------------------------------------------
/scripts/coastline/README.tmpl:
--------------------------------------------------------------------------------
1 |
2 | This data was downloaded from osmdata.openstreetmap.de which offers
3 | extracts and other processed versions of OpenStreetMap data.
4 |
5 | See https://osmdata.openstreetmap.de/ for details.
6 |
7 |
8 | PACKAGE CONTENT
9 | ===============
10 |
11 | This package contains OpenStreetMap data of the
12 | @CONTENT@.
13 |
14 | Layers contained are:@LAYERS@
15 |
16 | Date of the data used is @DATE@
17 |
18 | You can find more information on this data set at
19 |
20 | @URL@
21 |
22 |
23 | LICENSE
24 | =======
25 |
26 | This data is Copyright @YEAR@ OpenStreetMap contributors. It is
27 | available under the Open Database License (ODbL).
28 |
29 | For more information see https://www.openstreetmap.org/copyright
30 |
31 |
--------------------------------------------------------------------------------
/scripts/coastline/compare-coastline-polygons.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #------------------------------------------------------------------------------
3 | #
4 | # compare-coastline-polygons.sh DIR SOURCE
5 | #
6 | # to reset remove symlink $DIR/mask-good.tiff
7 | #
8 | #------------------------------------------------------------------------------
9 |
10 | set -euo pipefail
11 | set -x
12 |
13 | DIFF_MAXIMUM=0.0000015
14 |
15 | DIR="$1"
16 | WEBDIR="/data/web/coastline"
17 | SOURCE="$2"
18 | STARTTIME_COMPACT=$(date '+%Y%m%dT%H%M%S')
19 |
20 | test \! -z "$DIR"
21 | test \! -z "$SOURCE"
22 |
23 | GOOD=$DIR/mask-good.tiff
24 | GOODCOG=$DIR/mask-good-cog.tiff
25 | NEW=$DIR/mask-$STARTTIME_COMPACT.tiff
26 | NEWCOG=$DIR/mask-$STARTTIME_COMPACT-cog.tiff
27 |
28 | rm -fr "$DIR/land-polygons-split-3857"
29 |
30 | unzip "$SOURCE" -d "$DIR"
31 |
32 | # limit growth of differences file
33 | if [ -f "$DIR/differences" ]; then
34 | tail -100 "$DIR/differences" >"$DIR/differences.new"
35 | mv "$DIR/differences.new" "$DIR/differences"
36 | fi
37 |
38 | gdal_rasterize -q --config GDAL_CACHEMAX 1024 "$DIR/land-polygons-split-3857" -l land_polygons \
39 | -te -20037508.342789244 -20037508.342789244 20037508.342789244 20037508.342789244 \
40 | -init 0 -burn 255 -ts 8192 8192 -ot Byte -co COMPRESS=DEFLATE \
41 | "$NEW"
42 |
43 | gdal_translate -of cog -co COMPRESS=LZW "$NEW" "$NEWCOG"
44 |
45 | rm -f "$DIR/mask-new.tiff"
46 | ln -s "$NEW" "$DIR/mask-new.tiff"
47 |
48 | rm -f "$DIR/mask-new-cog.tiff"
49 | ln -s "$NEWCOG" "$DIR/mask-new-cog.tiff"
50 |
51 | rm -fr "$DIR/land-polygons-split-3857"
52 |
53 | #------------------------------------------------------------------------------
54 |
55 | # generate a "diff" image for human consumption
56 | rm -f "$DIR/mask-diff.tiff"
57 |
58 | if [ -e "$GOOD" ]; then
59 | gdal_calc.py -A "$GOOD" \
60 | -B "$NEW" \
61 | --quiet \
62 | --NoDataValue=0 --type=Byte --co=COMPRESS=DEFLATE \
63 | --outfile="$DIR/mask-diff.tiff" --calc="(A!=B)*255"
64 |
65 | rm -f "$DIR/mask-diff-cog.tiff"
66 | gdal_translate -of cog -co COMPRESS=LZW "$DIR/mask-diff.tiff" "$DIR/mask-diff-cog.tiff"
67 |
68 | rm -f "$DIR/mask-diff.geojson"
69 | gdal_polygonize.py -q "$DIR/mask-diff.tiff" "$DIR/mask-diff.geojson"
70 | fi
71 |
72 | #------------------------------------------------------------------------------
73 |
74 | for img in good new diff; do
75 | mkdir -p "$WEBDIR/$img"
76 | if [ -e "$DIR/mask-$img.tiff" ]; then
77 | gdal2tiles.py --webviewer none -z 0-6 "$DIR/mask-$img.tiff" "$WEBDIR/$img"
78 | fi
79 | done
80 |
81 | #------------------------------------------------------------------------------
82 |
83 | if [ ! -r "$NEW" ]; then
84 | echo "$STARTTIME_COMPACT: 0:0.0:0:0.0:0:0.0:0:0.0:0.0:0.0 ERROR" >>"$DIR/differences"
85 | echo "stopping coastline processing due to raster mask generation error."
86 | exit 1
87 | fi
88 |
89 | #------------------------------------------------------------------------------
90 |
91 | if [ ! -h "$GOOD" ]; then
92 | ln -s "$NEW" "$GOOD"
93 | echo "$STARTTIME_COMPACT: 0:0.0:0:0.0:0:0.0:0:0.0:0.0:0.0 OK" >>"$DIR/differences"
94 | exit 0
95 | fi
96 |
97 | #------------------------------------------------------------------------------
98 |
99 | DIFFERENCES=$(gdal_maskcompare_wm "$GOOD" "$NEW" 20000 | grep 'short version:')
100 | DIFF_RATING=$(echo "$DIFFERENCES" | cut -d ':' -f 10)
101 |
102 | # check if something went wrong with maskcompare and assume error then
103 | if [ -z "$DIFF_RATING" ]; then
104 | echo "$STARTTIME_COMPACT: 0:0.0:0:0.0:0:0.0:0:0.0:0.0:0.0 ERROR" >>"$DIR/differences"
105 | echo "stopping coastline processing due to maskcompare error ($DIFFERENCES)."
106 | exit 1
107 | fi
108 |
109 | #------------------------------------------------------------------------------
110 |
111 | if awk -v r="$DIFF_RATING" -v m="$DIFF_MAXIMUM" 'BEGIN { exit !(r > m) }'; then
112 | echo "$DIFFERENCES ERROR" | sed "s/short version/$STARTTIME_COMPACT/" >>"$DIR/differences"
113 | echo "stopping coastline processing due to difference test failing ($DIFF_RATING > $DIFF_MAXIMUM)."
114 | exit 1
115 | fi
116 |
117 | #------------------------------------------------------------------------------
118 |
119 | echo "$DIFFERENCES OK" | sed "s/short version/$STARTTIME_COMPACT/" >>"$DIR/differences"
120 |
121 | rm -f "$GOOD"
122 | ln -s "mask-$STARTTIME_COMPACT.tiff" "$GOOD"
123 |
124 | rm -f "$GOODCOG"
125 | ln -s "mask-$STARTTIME_COMPACT-cog.tiff" "$GOODCOG"
126 |
127 | #------------------------------------------------------------------------------
128 |
129 | ogr2ogr -f "FlatGeobuf" /data/new/coastlines.fgb /vsizip//data/new/coastlines-split-3857.zip/coastlines-split-3857/lines.shp
130 |
131 | #------------------------------------------------------------------------------
132 |
133 | # Remove old mask files. We do this here at the end, so we are sure not to
134 | # delete any mask files still referenced by mask-good.tiff.
135 | find "$DIR" -mtime +28 -type f -name 'mask-*.tiff' -delete
136 |
137 |
138 | #------------------------------------------------------------------------------
139 |
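140 | # Example invocation (illustrative): compare the freshly built land polygons
141 | # against the last known-good raster mask kept in /data/compare:
142 | #
143 | #   compare-coastline-polygons.sh /data/compare /data/new/land-polygons-split-3857.zip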
--------------------------------------------------------------------------------
/scripts/coastline/create-grid.sql:
--------------------------------------------------------------------------------
1 | -- ------------------------------------------
2 | --
3 | -- create-grid.sql
4 | --
5 | -- variables:
6 | -- prefix: name prefix for grid table
7 | -- srid: 4326, 3857 or other
8 | -- split: number of splits
9 | -- overlap: overlap (in map units)
10 | -- xmin, ymin, xmax, ymax: bounds of the grid
11 | --
12 | -- ------------------------------------------
13 |
14 | \t
15 |
16 | \set ON_ERROR_STOP 'on'
17 |
18 | \timing on
19 |
20 | -- ------------------------------------------
21 |
22 | SELECT 'creating ' || :srid || ' grid...';
23 |
24 | SELECT now() AS start_time \gset
25 |
26 | SELECT :'prefix' || '_' || :srid AS grid_table \gset
27 | SELECT :'prefix' || '_' || :srid || '_x_y_idx' AS grid_table_x_y_idx \gset
28 | SELECT :'prefix' || '_' || :srid || '_geom_idx' AS grid_table_geom_idx \gset
29 |
30 | DROP TABLE IF EXISTS :grid_table;
31 |
32 | CREATE TABLE :grid_table (
33 | id SERIAL PRIMARY KEY,
34 | x INTEGER,
35 | y INTEGER,
36 | geom GEOMETRY(POLYGON, :srid)
37 | );
38 |
39 | -- cartesian version
40 | INSERT INTO :grid_table (x, y, geom)
41 | SELECT x, y, ST_Intersection(
42 | ST_MakeEnvelope(:xmin, :ymin, :xmax, :ymax, :srid),
43 | ST_MakeEnvelope(
44 | x * (:xmax - :xmin)/:split - :overlap,
45 | y * (:ymax - :ymin)/:split - :overlap,
46 | (x + 1) * (:xmax - :xmin)/:split + :overlap,
47 | (y + 1) * (:ymax - :ymin)/:split + :overlap,
48 | :srid))
49 | FROM generate_series(-:split/2, :split/2 - 1) AS x,
50 | generate_series(-:split/2, :split/2 - 1) AS y WHERE :srid <> 4326;
51 |
52 | -- geographic version
53 | INSERT INTO :grid_table (x, y, geom)
54 | SELECT x, y, ST_Intersection(
55 | ST_MakeEnvelope(:xmin, :ymin, :xmax, :ymax, :srid),
56 | ST_MakeEnvelope(
57 | x - (:overlap / cos(radians(y + 0.5))),
58 | y - :overlap,
59 | x + 1 + (:overlap / cos(radians(y + 0.5))),
60 | y + 1 + :overlap,
61 | :srid))
62 | FROM generate_series(-:split/2, :split/2 - 1) AS x,
63 | generate_series(-:split/4, :split/4 - 1) AS y WHERE :srid = 4326;
64 |
65 | CREATE INDEX :grid_table_x_y_idx ON :grid_table (x, y);
66 |
67 | CREATE INDEX :grid_table_geom_idx ON :grid_table USING GIST (geom);
68 |
69 | SELECT 'created ' || :srid || ' grid', date_trunc('second', now() - :'start_time');
70 |
71 | -- ------------------------------------------
72 |
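73 | -- Example invocation (the 3857 case from coastline/split.sh):
74 | --
75 | --   psql --set=prefix=grid --set=srid=3857 --set=split=128 --set=overlap=50.0 \
76 | --        --set=xmin=-20037508.34 --set=xmax=20037508.34 \
77 | --        --set=ymin=-20037508.34 --set=ymax=20037508.34 \
78 | --        -f create-grid.sql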
--------------------------------------------------------------------------------
/scripts/coastline/split-3857-post.sql:
--------------------------------------------------------------------------------
1 | -- ------------------------------------------
2 | --
3 | -- split-3857-post.sql
4 | --
5 | -- ------------------------------------------
6 |
7 | \t
8 |
9 | \set ON_ERROR_STOP 'on'
10 |
11 | \timing on
12 |
13 | SELECT now() AS start_time \gset
14 |
15 | -- ------------------------------------------
16 |
17 | SELECT now() AS last_time \gset
18 |
19 | DROP TABLE IF EXISTS simplified_water_polygons;
20 |
21 | CREATE TABLE simplified_water_polygons (
22 | id SERIAL PRIMARY KEY,
23 | x INTEGER,
24 | y INTEGER,
25 | geom GEOMETRY(POLYGON, 3857)
26 | );
27 |
28 | ALTER TABLE simplified_water_polygons ALTER COLUMN geom SET STORAGE EXTERNAL;
29 |
30 | INSERT INTO simplified_water_polygons (x, y, geom)
31 | SELECT g.x, g.y, ST_MakeValid((ST_Dump(ST_Difference(g.geom, p.geom))).geom)
32 | FROM grid_3857 g, land_polygons_grid_3857_union p
33 | WHERE g.x = p.x AND g.y = p.y;
34 |
35 | -- Delete some tiny slivers along the antimeridian created as a side-effect of our code
36 | DELETE FROM simplified_water_polygons
37 | WHERE ST_Contains(ST_MakeEnvelope(-20037508.342789244, -20037508.342789244, -20037499.0, 14230070.0, 3857), geom)
38 | OR ST_Contains(ST_MakeEnvelope( 20037499.0, -20037508.342789244, 20037508.342789244, 14230080.0, 3857), geom);
39 |
40 | INSERT INTO simplified_water_polygons (x, y, geom)
41 | SELECT x, y, geom
42 | FROM grid_3857
43 | WHERE ARRAY[x, y] NOT IN (SELECT DISTINCT ARRAY[x, y] FROM land_polygons_grid_3857_union);
44 |
45 | -- ALTER TABLE simplified_water_polygons DROP COLUMN x, DROP COLUMN y;
46 |
47 | CREATE INDEX simplified_water_polygons_geom_idx ON simplified_water_polygons USING GIST (geom);
48 |
49 | SELECT 'create simplified water polygons', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
50 |
51 | -- ------------------------------------------
52 |
--------------------------------------------------------------------------------
/scripts/coastline/split-3857.sql:
--------------------------------------------------------------------------------
1 | -- ------------------------------------------
2 | --
3 | -- split-3857.sql
4 | --
5 | -- ------------------------------------------
6 |
7 | \t
8 |
9 | \set ON_ERROR_STOP 'on'
10 |
11 | \timing on
12 |
13 | SELECT now() AS start_time \gset
14 |
15 | -- ------------------------------------------
16 |
17 | SELECT now() AS last_time \gset
18 |
19 | DROP TABLE IF EXISTS land_polygons_grid_3857;
20 |
21 | CREATE TABLE land_polygons_grid_3857 (
22 | id SERIAL PRIMARY KEY,
23 | x INTEGER,
24 | y INTEGER,
25 | geom GEOMETRY(POLYGON, 3857)
26 | );
27 |
28 | ALTER TABLE land_polygons_grid_3857 ALTER COLUMN geom SET STORAGE EXTERNAL;
29 |
30 | INSERT INTO land_polygons_grid_3857 (x, y, geom)
31 | SELECT x, y, ST_MakeValid((ST_Dump(geom)).geom)
32 | FROM land_polygons_grid_3857_union;
33 |
34 | CREATE INDEX land_polygons_grid_3857_geom_idx ON land_polygons_grid_3857 USING GIST (geom);
35 |
36 | SELECT 'final land polygons', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
37 |
38 | -- ------------------------------------------
39 |
40 | SELECT now() AS last_time \gset
41 |
42 | DROP TABLE IF EXISTS water_polygons_grid_3857;
43 |
44 | CREATE TABLE water_polygons_grid_3857 (
45 | id SERIAL PRIMARY KEY,
46 | x INTEGER,
47 | y INTEGER,
48 | geom GEOMETRY(POLYGON, 3857)
49 | );
50 |
51 | ALTER TABLE water_polygons_grid_3857 ALTER COLUMN geom SET STORAGE EXTERNAL;
52 |
53 | INSERT INTO water_polygons_grid_3857 (x, y, geom)
54 | SELECT g.x, g.y, ST_MakeValid((ST_Dump(ST_Difference(g.geom, p.geom))).geom)
55 | FROM grid_3857 g, land_polygons_grid_3857_union p
56 | WHERE g.x = p.x AND g.y = p.y;
57 |
58 | -- Delete some tiny slivers along the antimeridian created as a side-effect of our code
59 | DELETE FROM water_polygons_grid_3857
60 | WHERE ST_Contains(ST_MakeEnvelope(-20037508.342789244, -20037508.342789244, -20037499.0, 14230070.0, 3857), geom)
61 | OR ST_Contains(ST_MakeEnvelope( 20037499.0, -20037508.342789244, 20037508.342789244, 14230080.0, 3857), geom);
62 |
63 | INSERT INTO water_polygons_grid_3857 (x, y, geom)
64 | SELECT x, y, geom
65 | FROM grid_3857
66 | WHERE ARRAY[x, y] NOT IN (SELECT DISTINCT ARRAY[x, y] FROM land_polygons_grid_3857);
67 |
68 | CREATE INDEX water_polygons_grid_3857_geom_idx ON water_polygons_grid_3857 USING GIST (geom);
69 |
70 | SELECT 'create water polygons', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
71 |
72 | -- ------------------------------------------
73 |
74 | SELECT now() AS last_time \gset
75 |
76 | DROP TABLE IF EXISTS simplified_land_polygons;
77 |
78 | CREATE TABLE simplified_land_polygons (
79 | id SERIAL PRIMARY KEY,
80 | geom GEOMETRY(POLYGON, 3857)
81 | );
82 |
83 | ALTER TABLE simplified_land_polygons ALTER COLUMN geom SET STORAGE EXTERNAL;
84 |
85 | INSERT INTO simplified_land_polygons (id, geom)
86 | SELECT id, ST_SimplifyPreserveTopology(geom, 300)
87 | FROM land_polygons_3857 WHERE ST_Area(geom) > 300000;
88 |
89 | SELECT 'simplified land polygons', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
90 |
91 | -- ------------------------------------------
92 |
93 | DROP TABLE land_polygons_grid_3857_union;
94 |
95 | -- ------------------------------------------
96 |
--------------------------------------------------------------------------------
/scripts/coastline/split-4326.sql:
--------------------------------------------------------------------------------
1 | -- ------------------------------------------
2 | --
3 | -- split-4326.sql
4 | --
5 | -- ------------------------------------------
6 |
7 | \t
8 |
9 | \set ON_ERROR_STOP 'on'
10 |
11 | \timing on
12 |
13 | SELECT now() AS start_time \gset
14 |
15 | -- ------------------------------------------
16 |
17 | SELECT now() AS last_time \gset
18 |
19 | DROP TABLE IF EXISTS land_polygons_grid_4326;
20 |
21 | CREATE TABLE land_polygons_grid_4326 (
22 | id SERIAL PRIMARY KEY,
23 | x INTEGER,
24 | y INTEGER,
25 | geom GEOMETRY(POLYGON, 4326)
26 | );
27 |
28 | ALTER TABLE land_polygons_grid_4326 ALTER COLUMN geom SET STORAGE EXTERNAL;
29 |
30 | INSERT INTO land_polygons_grid_4326 (x, y, geom)
31 | SELECT x, y, ST_MakeValid((ST_Dump(geom)).geom)
32 | FROM land_polygons_grid_4326_union;
33 |
34 | CREATE INDEX land_polygons_grid_4326_geom_idx ON land_polygons_grid_4326 USING GIST (geom);
35 |
36 | SELECT 'final land polygons', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
37 |
38 | -- ------------------------------------------
39 |
40 | SELECT now() AS last_time \gset
41 |
42 | DROP TABLE IF EXISTS water_polygons_grid_4326;
43 |
44 | CREATE TABLE water_polygons_grid_4326 (
45 | id SERIAL PRIMARY KEY,
46 | x INTEGER,
47 | y INTEGER,
48 | geom GEOMETRY(POLYGON, 4326)
49 | );
50 |
51 | ALTER TABLE water_polygons_grid_4326 ALTER COLUMN geom SET STORAGE EXTERNAL;
52 |
53 | INSERT INTO water_polygons_grid_4326 (x, y, geom)
54 | SELECT g.x, g.y, ST_MakeValid((ST_Dump(ST_Difference(g.geom, p.geom))).geom)
55 | FROM grid_4326 g, land_polygons_grid_4326_union p
56 | WHERE g.x = p.x AND g.y = p.y;
57 |
58 | INSERT INTO water_polygons_grid_4326 (x, y, geom)
59 | SELECT x, y, geom
60 | FROM grid_4326
61 | WHERE ARRAY[x, y] NOT IN (SELECT DISTINCT ARRAY[x, y] FROM land_polygons_grid_4326);
62 |
63 | CREATE INDEX water_polygons_grid_4326_geom_idx ON water_polygons_grid_4326 USING GIST (geom);
64 |
65 | SELECT 'create water polygons', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
66 |
67 | -- ------------------------------------------
68 |
69 | DROP TABLE land_polygons_grid_4326_union;
70 |
71 | -- ------------------------------------------
72 |
--------------------------------------------------------------------------------
/scripts/coastline/split-on-grid.sql:
--------------------------------------------------------------------------------
1 | -- ------------------------------------------
2 | --
3 | -- split-on-grid.sql
4 | --
5 | -- variables:
6 | -- srid: 4326 or 3857
7 | -- input_table: table with input data to be split
8 | -- output_table: name of output table
9 | --
10 | -- ------------------------------------------
11 |
12 | \t
13 |
14 | \set ON_ERROR_STOP 'on'
15 |
16 | \timing on
17 |
18 | SELECT now() AS start_time \gset
19 |
20 | SELECT 'grid_' || :srid AS grid_table \gset
21 |
22 | -- ------------------------------------------
23 |
24 | SELECT :srid, 'splitting grid...';
25 |
26 | SELECT now() AS last_time \gset
27 |
28 | DROP TABLE IF EXISTS polygons_sub;
29 |
30 | CREATE TABLE polygons_sub (
31 | id SERIAL PRIMARY KEY,
32 | geom GEOMETRY(MULTIPOLYGON, :srid)
33 | );
34 |
35 | ALTER TABLE polygons_sub ALTER COLUMN geom SET STORAGE EXTERNAL;
36 |
37 | INSERT INTO polygons_sub (geom)
38 | SELECT ST_Multi(ST_Subdivide(geom, 1000))
39 | FROM :input_table;
40 |
41 | CREATE INDEX polygons_sub_geom_idx ON polygons_sub USING GIST (geom);
42 |
43 | SELECT 'subdivide polygons', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
44 |
45 | -- ------------------------------------------
46 |
47 | SELECT now() AS last_time \gset
48 |
49 | DROP TABLE IF EXISTS polygons_grid_tmp;
50 |
51 | CREATE TABLE polygons_grid_tmp (
52 | id SERIAL PRIMARY KEY,
53 | x INTEGER,
54 | y INTEGER,
55 | geom GEOMETRY(MULTIPOLYGON, :srid)
56 | );
57 |
58 | ALTER TABLE polygons_grid_tmp ALTER COLUMN geom SET STORAGE EXTERNAL;
59 |
60 | INSERT INTO polygons_grid_tmp (x, y, geom)
61 | SELECT g.x, g.y, ST_CollectionExtract(ST_Multi(ST_Intersection(p.geom, g.geom)), 3)
62 | FROM polygons_sub p, :grid_table g
63 | WHERE p.geom && g.geom;
64 |
65 | -- Remove the empty multipolygons created by the ST_CollectionExtract() above
66 | DELETE FROM polygons_grid_tmp WHERE ST_NumGeometries(geom) = 0;
67 |
68 | SELECT 'intersect polygons with grid', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
69 |
70 | -- ------------------------------------------
71 |
72 | SELECT now() AS last_time \gset
73 |
74 | DROP TABLE IF EXISTS polygons_grid_union;
75 |
76 | CREATE TABLE polygons_grid_union (
77 | id SERIAL PRIMARY KEY,
78 | x INTEGER,
79 | y INTEGER,
80 | geom GEOMETRY(MULTIPOLYGON, :srid)
81 | );
82 |
83 | ALTER TABLE polygons_grid_union ALTER COLUMN geom SET STORAGE EXTERNAL;
84 |
85 | INSERT INTO polygons_grid_union (x, y, geom)
86 | SELECT x, y, ST_Multi(ST_Union(geom))
87 | FROM polygons_grid_tmp
88 | GROUP BY x, y;
89 |
90 | CREATE INDEX polygons_grid_union_geom_idx ON polygons_grid_union USING GIST (geom);
91 |
92 | SELECT 'union polygons', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
93 |
94 | -- ------------------------------------------
95 |
96 | SELECT now() AS last_time \gset
97 |
98 | DROP TABLE polygons_grid_tmp;
99 | DROP TABLE polygons_sub;
100 |
101 | DROP TABLE IF EXISTS :output_table;
102 | ALTER TABLE polygons_grid_union RENAME TO :output_table;
103 |
104 | SELECT 'cleanup', date_trunc('second', now() - :'last_time'), date_trunc('second', now() - :'start_time');
105 |
106 | -- ------------------------------------------
107 |
108 |
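109 | -- Example invocation (the 4326 case from coastline/split.sh):
110 | --
111 | --   psql -f split-on-grid.sql --set=srid=4326 \
112 | --        --set=input_table=land_polygons_4326 \
113 | --        --set=output_table=land_polygons_grid_4326_union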
--------------------------------------------------------------------------------
/scripts/coastline/split.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # split.sh SRID
4 | #
5 |
6 | set -euo pipefail
7 | set -x
8 |
9 | DATADIR=/home/robot/data/coastline
10 |
11 | srid=$1
12 |
13 | #if [ -d $DATADIR/land-polygons-complete-${srid} ]; then
14 | # rm -fr $DATADIR/land-polygons-complete-${srid}
15 | #fi
16 | #
17 | #time unzip $DATADIR/land-polygons-complete-${srid}.zip -d $DATADIR
18 |
19 | psql -c "CREATE EXTENSION IF NOT EXISTS postgis;"
20 |
21 | time ogr2ogr -f "PostgreSQL" PG:"dbname=${PGDATABASE} user=${PGUSER}" \
22 | -overwrite \
23 | -lco GEOMETRY_NAME=geom \
24 | -lco FID=id \
25 | -nln "land_polygons_$srid" \
26 | "$DATADIR/coastlines-complete-$srid.db" \
27 | land_polygons
28 |
29 | if [ "$srid" = "3857" ] ; then
30 | xmin=-20037508.34
31 | ymin=-20037508.34
32 | xmax=20037508.34
33 | ymax=20037508.34
34 | overlap=50.0
35 | split=128
36 | else
37 | xmin=-180
38 | ymin=-90
39 | xmax=180
40 | ymax=90
41 | overlap=0.0005
42 | split=360
43 | fi
44 |
45 | time psql --set=prefix=grid \
46 | --set="srid=$srid" --set="split=$split" --set="overlap=$overlap" \
47 | --set="xmin=$xmin" --set="xmax=$xmax" \
48 | --set="ymin=$ymin" --set="ymax=$ymax" \
49 | -f "$BIN/create-grid.sql"
50 |
51 | time psql -f "$BIN/split-on-grid.sql" --set="srid=$srid" \
52 | --set="input_table=land_polygons_$srid" \
53 | --set="output_table=land_polygons_grid_${srid}_union"
54 |
55 | time psql -f "$BIN/split-$srid.sql"
56 |
57 | if [ "$srid" = "3857" ]; then
58 | time psql -f "$BIN/split-on-grid.sql" --set=srid=3857 --set=input_table=simplified_land_polygons --set=output_table=land_polygons_grid_3857_union
59 | time psql -f "$BIN/split-3857-post.sql"
60 | fi
61 |
62 | create_shape() {
63 | local dir=$DATADIR/results/$1
64 | local shape_layer=$2
65 | local in=$3
66 | local layer=$4
67 |
68 | mkdir -p "$dir"
69 | time ogr2ogr -f "ESRI Shapefile" "$dir" -nln "$shape_layer" -overwrite "$in" "$layer"
70 |
71 | echo "UTF-8" >"$dir/$shape_layer.cpg"
72 | }
73 |
74 | create_shape_from_pg() {
75 | create_shape "$1" "$2" PG:"dbname=${PGDATABASE} user=${PGUSER}" "$3"
76 | }
77 |
78 | create_shape "land-polygons-complete-$srid" land_polygons "$DATADIR/coastlines-complete-$srid.db" land_polygons
79 | create_shape "coastlines-split-$srid" lines "$DATADIR/coastlines-split-$srid.db" lines
80 |
81 | for t in land water; do
82 | # psql -c "ALTER TABLE ${t}_polygons_grid_${srid} DROP COLUMN x, DROP COLUMN y;"
83 | create_shape_from_pg "${t}-polygons-split-$srid" "${t}_polygons" "${t}_polygons_grid_$srid"
84 | done
85 |
86 | if [ "$srid" = "3857" ]; then
87 | create_shape_from_pg "simplified-land-polygons-complete-$srid" simplified_land_polygons simplified_land_polygons
88 | create_shape_from_pg "simplified-water-polygons-split-$srid" simplified_water_polygons simplified_water_polygons
89 | fi
90 |
91 |
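92 | # Note: split.sh is not run on its own. coastline/update.sh wraps it in
93 | # pg_virtualenv (see pg_run_split there), which creates a throwaway PostgreSQL
94 | # cluster and sets PGDATABASE/PGUSER for the psql and ogr2ogr calls above,
95 | # roughly:
96 | #
97 | #   pg_virtualenv -o fsync=off ./split.sh 3857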
--------------------------------------------------------------------------------
/scripts/coastline/update.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #------------------------------------------------------------------------------
3 | #
4 | # coastline/update.sh
5 | #
6 | #------------------------------------------------------------------------------
7 |
8 | set -euo pipefail
9 | set -x
10 |
11 | DATADIR=/home/robot/data/coastline
12 | PLANETDIR=/mnt/data/planet
13 |
14 | iso_date='+%Y-%m-%dT%H:%M:%S'
15 |
16 | export BIN
17 | BIN=$(cd "$(dirname "$0")" ; pwd -P)
18 |
19 | PLANET=${PLANETDIR}/planet.osm.pbf
20 | COASTLINES=${PLANETDIR}/coastlines.osm.pbf
21 | DBFILE=$DATADIR/coastlines-debug.db
22 |
23 | #------------------------------------------------------------------------------
24 |
25 | echo "Started update-coastline"
26 | date $iso_date
27 |
28 | mkdir -p $DATADIR
29 | rm -fr $DATADIR/*
30 |
31 | #------------------------------------------------------------------------------
32 | #
33 | # Extract coastline data
34 | #
35 | #------------------------------------------------------------------------------
36 |
37 | #OUTPUT_RINGS="--output-rings"
38 | OUTPUT_RINGS=""
39 |
40 | rm -f $DATADIR/segments.dat $DBFILE.new
41 |
42 | set +e
43 | osmcoastline --verbose --overwrite --no-index \
44 | $OUTPUT_RINGS \
45 | -o $DBFILE.new \
46 | --write-segments=$DATADIR/segments.dat \
47 | --max-points=0 --bbox-overlap=0 \
48 | $COASTLINES
49 |
50 | EXIT_CODE=$?
51 | set -e
52 | echo "osmcoastline exit code: $EXIT_CODE"
53 |
54 | echo $EXIT_CODE >$DATADIR/osmcoastline_exit_code
55 |
56 | if (( EXIT_CODE > 2 )); then
57 | exit 1
58 | fi
59 |
60 | mv $DBFILE.new $DBFILE
61 |
62 | date $iso_date
63 |
64 |
65 | #------------------------------------------------------------------------------
66 | #
67 | # Update files needed for error checking
68 | #
69 | #------------------------------------------------------------------------------
70 |
71 | OSMIDIR=$DATADIR/osmi
72 |
73 | rm -fr $OSMIDIR
74 | mkdir -p $OSMIDIR
75 |
76 | ogr2ogr -f "ESRI Shapefile" $OSMIDIR/error_points.shp $DBFILE error_points
77 | ogr2ogr -f "ESRI Shapefile" $OSMIDIR/error_lines.shp $DBFILE error_lines
78 |
79 | rm -f $DATADIR/coastline-ways.db
80 |
81 | osmcoastline_ways $COASTLINES $DATADIR/coastline-ways.db
82 |
83 | ogr2ogr -f "ESRI Shapefile" -select name $OSMIDIR/ways.shp $DATADIR/coastline-ways.db ways
84 |
85 | cp $DBFILE $DATADIR/osmi-coastlines.db
86 | echo "DROP TABLE land_polygons; VACUUM;" | spatialite $DATADIR/osmi-coastlines.db
87 | ogr2ogr -update -f SQLite $DATADIR/osmi-coastlines.db $DATADIR/coastline-ways.db ways
88 |
89 | POINT_LAYERS="single_point_in_ring not_a_ring end_point fixed_end_point double_node tagged_node"
90 | LINE_LAYERS="direction not_a_ring not_closed overlap added_line questionable invalid"
91 |
92 | for layer in $POINT_LAYERS; do
93 | ogr2ogr -f "GeoJSON" "/vsigzip//$OSMIDIR/coastline_error_points_$layer.json.gz" \
94 | -select osm_id -where "error='$layer'" "$OSMIDIR/error_points.shp"
95 | done
96 |
97 | for layer in $LINE_LAYERS; do
98 | ogr2ogr -f "GeoJSON" "/vsigzip//$OSMIDIR/coastline_error_lines_$layer.json.gz" \
99 | -select osm_id -where "error='$layer'" "$OSMIDIR/error_lines.shp"
100 | done
101 |
102 | time ogr2ogr -f "GeoJSON" "/vsigzip//$OSMIDIR/coastline_ways.json.gz" "$OSMIDIR/ways.shp"
103 |
104 | date $iso_date
105 |
106 |
107 | #------------------------------------------------------------------------------
108 | #
109 | # Create 3857 version of output
110 | #
111 | #------------------------------------------------------------------------------
112 |
113 | run_osmcoastline_lines() {
114 | local srid=$1
115 | local file=coastlines-split-$srid
116 |
117 | rm -f "$DATADIR/$file.db.new"
118 |
119 | set +e
120 | osmcoastline --verbose --overwrite --no-index \
121 | --output-lines --output-polygons=none \
122 | -o "$DATADIR/$file.db.new" \
123 | "--srs=$srid" --max-points=1000 --bbox-overlap=0 \
124 | "$COASTLINES"
125 |
126 | local EXIT_CODE=$?
127 | set -e
128 | echo "osmcoastline exit code: $EXIT_CODE"
129 |
130 | if (( EXIT_CODE > 2 )); then
131 | exit $EXIT_CODE
132 | fi
133 |
134 | mv "$DATADIR/$file.db.new" "$DATADIR/$file.db"
135 | }
136 |
137 | run_osmcoastline_polygons() {
138 | local srid=$1
139 | local file=coastlines-complete-$srid
140 |
141 | rm -f "$DATADIR/$file.db.new"
142 |
143 | set +e
144 | osmcoastline --verbose --overwrite --no-index \
145 | -o "$DATADIR/$file.db.new" \
146 | "--srs=$srid" --max-points=0 --bbox-overlap=0 \
147 | "$COASTLINES"
148 |
149 | local EXIT_CODE=$?
150 | set -e
151 | echo "osmcoastline exit code: $EXIT_CODE"
152 |
153 | if (( EXIT_CODE > 2 )); then
154 | exit $EXIT_CODE
155 | fi
156 |
157 | mv "$DATADIR/$file.db.new" "$DATADIR/$file.db"
158 | }
159 |
160 | run_osmcoastline_lines 4326
161 | run_osmcoastline_polygons 4326
162 |
163 | run_osmcoastline_lines 3857
164 | run_osmcoastline_polygons 3857
165 |
166 | ### This takes longer than recreating 3857 coastlines from source
167 | #rm -f $DATADIR/coastlines-complete-3857.db
168 | #time ogr2ogr --config OGR_ENABLE_PARTIAL_REPROJECTION YES \
169 | # --config OGR_SQLITE_SYNCHRONOUS OFF \
170 | # -dsco SPATIALITE=yes \
171 | # -dsco INIT_WITH_EPSG=no \
172 | # -f "SQLite" \
173 | # -gt 65535 \
174 | # -s_srs "EPSG:4326" \
175 | # -t_srs "EPSG:3857" \
176 | # -skipfailures \
177 | # -clipdst -20037508.342789244 -20037508.342789244 20037508.342789244 20037508.342789244 \
178 | # $DATADIR/coastlines-complete-3857.db \
179 | # $DBFILE land_polygons lines
180 |
181 | date $iso_date
182 |
183 |
184 | #------------------------------------------------------------------------------
185 | #
186 | # Generate split and simplified versions
187 | #
188 | #------------------------------------------------------------------------------
189 |
190 | pg_run_split() {
191 | local srid=$1
192 |
193 | pg_virtualenv -o shared_buffers=2GB \
194 | -o work_mem=512MB \
195 | -o maintenance_work_mem=100MB \
196 | -o checkpoint_timeout=15min \
197 | -o checkpoint_completion_target=0.9 \
198 | -o max_wal_size=2GB \
199 | -o min_wal_size=80MB \
200 | -o fsync=off \
201 | -o synchronous_commit=off \
202 | "$BIN/split.sh" \
203 | "$srid"
204 | }
205 |
206 | pg_run_split 4326
207 |
208 | date $iso_date
209 |
210 | pg_run_split 3857
211 |
212 | date $iso_date
213 |
214 |
215 | #------------------------------------------------------------------------------
216 | #
217 | # Finalize zip files with shapes
218 | #
219 | #------------------------------------------------------------------------------
220 |
221 | # Parse extent from line like this:
222 | # Extent: (-180.000000, -78.732901) - (180.000000, 83.666473)
223 | # into this:
224 | # -180.000000 -78.732901 180.000000 83.666473
225 | parse_extent() {
226 | sed -e 's/^.*(\([0-9.-]\+\), \([0-9.-]\+\)) - (\([0-9.-]\+\), \([0-9.-]\+\))/\1 \2 \3 \4/'
227 | }
228 |
229 | mkshape() {
230 | local proj=$1
231 | local name=$2
232 | local shapedir=$DATADIR/results/$name
233 | local layer=$3
234 |
235 | echo "mkshape $proj $shapedir $layer"
236 |
237 | local INFO EXTENT GMTYPE FCOUNT
238 |
239 | INFO=$(ogrinfo -so "$shapedir/$layer.shp" "$layer")
240 |
241 | EXTENT=$(grep <<< "$INFO" '^Extent: ')
242 | GMTYPE=$(grep <<< "$INFO" '^Geometry: ' | cut -d ':' -f 2- | tr -d ' ')
243 | FCOUNT=$(grep <<< "$INFO" '^Feature Count: ' | cut -d ':' -f 2- | tr -d ' ')
244 |
245 | local XMIN YMIN XMAX YMAX
246 | read -r XMIN YMIN XMAX YMAX <<<"$(parse_extent <<<"$EXTENT")"
247 |
248 | if [ "$proj" = "3857" ]; then
249 |
250 | # this tests if the data extends beyond the 180 degree meridian
251 | # and adds '+over' to the projection definition in that case
252 |         if awk -v x="$XMIN" 'BEGIN { exit !(x < -20037509) }'; then
253 | sed -i -e 's/+no_defs"/+no_defs +over"/' "$shapedir/$layer.prj"
254 | fi
255 |
256 | local LON_MIN LON_MAX LAT_MIN LAT_MAX bbox
257 |
258 | read LON_MIN LAT_MIN <<<$(gdaltransform -s_srs 'EPSG:3857' -t_srs 'EPSG:4326' -output_xy <<< "$XMIN $YMIN")
259 | read LON_MAX LAT_MAX <<<$(gdaltransform -s_srs 'EPSG:3857' -t_srs 'EPSG:4326' -output_xy <<< "$XMAX $YMAX")
260 |
261 | XMIN=$(echo "($XMIN+0.5)/1" | bc)
262 | XMAX=$(echo "($XMAX+0.5)/1" | bc)
263 | YMIN=$(echo "($YMIN+0.5)/1" | bc)
264 | YMAX=$(echo "($YMAX+0.5)/1" | bc)
265 |
266 | bbox=$(printf '(%.3f, %.3f) - (%.3f, %.3f)' "$LON_MIN" "$LAT_MIN" "$LON_MAX" "$LAT_MAX")
267 | local LAYERS="\n\n$layer.shp:\n\n $FCOUNT $GMTYPE features\n Mercator projection (EPSG: 3857)\n Extent: ($XMIN, $YMIN) - ($XMAX, $YMAX)\n In geographic coordinates: $bbox"
268 | else
269 | local bbox
270 | bbox=$(printf '(%.3f, %.3f) - (%.3f, %.3f)' "$XMIN" "$YMIN" "$XMAX" "$YMAX")
271 | local LAYERS="\n\n$layer.shp:\n\n $FCOUNT $GMTYPE features\n WGS84 geographic coordinates (EPSG: 4326)\n Extent: $bbox"
272 | fi
273 |
274 | local YEAR DATE CONTENT URL
275 | YEAR=$(date '+%Y')
276 | DATE=$(osmium fileinfo -g header.option.osmosis_replication_timestamp $PLANET)
277 |
278 | local url_prefix='https://osmdata.openstreetmap.de/data'
279 |
280 | if [ "$layer" = 'land_polygons' ]; then
281 | if [[ $name = *split* ]]; then
282 | CONTENT='land polygons, split into a grid with slight overlap'
283 | else
284 | CONTENT='land polygons'
285 | fi
286 | URL='land-polygons'
287 | elif [ "$layer" = 'simplified_land_polygons' ]; then
288 | CONTENT='coastline land polygons, simplified for rendering at low zooms'
289 | URL='land-polygons'
290 | elif [ "$layer" = 'simplified_water_polygons' ]; then
291 | CONTENT='coastline water polygons, simplified for rendering at low zooms and split into a grid'
292 | URL='water-polygons'
293 | elif [ "$layer" = 'water_polygons' ]; then
294 | CONTENT='coastline water polygons, split into a grid with slight overlap'
295 | URL='water-polygons'
296 | else
297 | CONTENT='coastlines'
298 | URL='coastlines'
299 | fi
300 |
301 | sed -e "s?@YEAR@?${YEAR}?g;s?@URL@?${url_prefix}/${URL}.html?g;s?@DATE@?${DATE}?g;s?@CONTENT@?${CONTENT}?g" "$BIN/README.tmpl" \
302 | | sed "/@LAYERS@/N;s?@LAYERS@?$LAYERS?" >"$shapedir/README.txt"
303 |
304 | rm -f "$shapedir.zip.new"
305 | (cd $DATADIR/results; zip --quiet "$name.zip.new" "$name"/*)
306 | mv "$shapedir.zip.new" "$shapedir.zip"
307 | }
308 |
309 | #------------------------------------------------------------------------------
310 |
311 | mkshape 4326 coastlines-split-4326 lines
312 | mkshape 3857 coastlines-split-3857 lines
313 |
314 | mkshape 4326 land-polygons-complete-4326 land_polygons
315 | mkshape 3857 land-polygons-complete-3857 land_polygons
316 |
317 | mkshape 4326 land-polygons-split-4326 land_polygons
318 | mkshape 3857 land-polygons-split-3857 land_polygons
319 |
320 | mkshape 4326 water-polygons-split-4326 water_polygons
321 | mkshape 3857 water-polygons-split-3857 water_polygons
322 |
323 | mkshape 3857 simplified-land-polygons-complete-3857 simplified_land_polygons
324 | mkshape 3857 simplified-water-polygons-split-3857 simplified_water_polygons
325 |
326 | date $iso_date
327 |
328 | #------------------------------------------------------------------------------
329 |
330 | df -h
331 |
332 | echo "Done."
333 | date $iso_date
334 |
335 |
--------------------------------------------------------------------------------
/scripts/icesheet/README.tmpl:
--------------------------------------------------------------------------------
1 |
2 | This data was downloaded from osmdata.openstreetmap.de which offers
3 | extracts and other processed versions of OpenStreetMap data.
4 |
5 | See https://osmdata.openstreetmap.de/ for details.
6 |
7 |
8 | PACKAGE CONTENT
9 | ===============
10 |
11 | This package contains OpenStreetMap data of the
12 | @CONTENT@.
13 |
14 | Layers contained are:@LAYERS@
15 |
16 | Date of the data used is @DATE@
17 |
18 | You can find more information on this data set at
19 |
20 | @URL@
21 |
22 |
23 | LICENSE
24 | =======
25 |
26 | This data is Copyright @YEAR@ OpenStreetMap contributors. It is
27 | available under the Open Database License (ODbL).
28 |
29 | For more information see https://www.openstreetmap.org/copyright
30 |
31 |
--------------------------------------------------------------------------------
/scripts/icesheet/osmium-export-config.json:
--------------------------------------------------------------------------------
1 | {
2 | "attributes": {
3 | "type": false,
4 | "id": "id",
5 | "version": false,
6 | "changeset": false,
7 | "timestamp": false,
8 | "uid": false,
9 | "user": false,
10 | "way_nodes": false
11 | },
12 | "linear_tags": true,
13 | "area_tags": true,
14 | "exclude_tags": [],
15 | "include_tags": ["natural", "supraglacial"]
16 | }
17 |
--------------------------------------------------------------------------------
/scripts/icesheet/update-icesheet-zip.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # update-icesheet-zip.sh
4 | #
5 |
6 | set -euo pipefail
7 | set -x
8 |
9 | DATADIR=/home/robot/data/icesheet
10 |
11 | iso_date='+%Y-%m-%dT%H:%M:%S'
12 |
13 | echo "Started update-icesheet-zip.sh"
14 | date $iso_date
15 |
16 | cd $DATADIR
17 |
18 | RESULTS=$DATADIR/results
19 | mkdir -p $RESULTS
20 |
21 | url_prefix='https://osmdata.openstreetmap.de/data'
22 |
23 | for SHAPEDIR in antarctica-icesheet-* ; do
24 | test -d "$SHAPEDIR" || continue
25 |
26 | LAYERS=
27 |
28 | for SHP in $(find "$SHAPEDIR" -name '*.shp') ; do
29 | LN=$(basename "$SHP" .shp)
30 | echo "UTF-8" >"$SHAPEDIR/$LN.cpg"
31 |
32 | INFO=$(ogrinfo -so "$SHP" "$LN")
33 |
34 | EXT_INFO=$(echo "$INFO" | grep "^Extent: " | cut -d ":" -f 2-)
35 | XMIN=$(echo "$EXT_INFO" | cut -d "(" -f 2 | cut -d "," -f 1)
36 | YMIN=$(echo "$EXT_INFO" | cut -d "," -f 2 | cut -d ")" -f 1)
37 | XMAX=$(echo "$EXT_INFO" | cut -d "(" -f 3 | cut -d "," -f 1)
38 | YMAX=$(echo "$EXT_INFO" | cut -d "," -f 3 | cut -d ")" -f 1)
39 |
40 | P1=$(echo "$XMIN $YMIN" | gdaltransform -s_srs "EPSG:3857" -t_srs "EPSG:4326")
41 | P2=$(echo "$XMAX $YMAX" | gdaltransform -s_srs "EPSG:3857" -t_srs "EPSG:4326")
42 | LON_MIN=$(echo "$P1" | cut -d " " -f 1 | LC_ALL=C xargs printf "%.3f")
43 | LON_MAX=$(echo "$P2" | cut -d " " -f 1 | LC_ALL=C xargs printf "%.3f")
44 | LAT_MIN=$(echo "$P1" | cut -d " " -f 2 | LC_ALL=C xargs printf "%.3f")
45 | LAT_MAX=$(echo "$P2" | cut -d " " -f 2 | LC_ALL=C xargs printf "%.3f")
46 |
47 | XMIN=$(echo "($XMIN+0.5)/1" | bc)
48 | XMAX=$(echo "($XMAX+0.5)/1" | bc)
49 | YMIN=$(echo "($YMIN+0.5)/1" | bc)
50 | YMAX=$(echo "($YMAX+0.5)/1" | bc)
51 |
52 | FTYPE=$(echo "$INFO" | grep "^Geometry: " | cut -d ":" -f 2- | sed "s? ??g")
53 | FCOUNT=$(echo "$INFO" | grep "^Feature Count: " | cut -d ":" -f 2- | sed "s? ??g")
54 |
55 | LAYERS="$LAYERS\n\n$LN.shp:\n\n $FCOUNT $FTYPE features\n Mercator projection (EPSG: 3857)\n Extent: ($XMIN, $YMIN) - ($XMAX, $YMAX)\n In geographic coordinates: ($LON_MIN, $LAT_MIN) - ($LON_MAX, $LAT_MAX)"
56 |
57 | done
58 |
59 | YEAR=$(date '+%Y')
60 | DATE=$(date -r /mnt/data/planet/last-update +'%d %b %Y %H:%M')
61 |
62 | if echo "$SHAPEDIR" | grep "outline" > /dev/null ; then
63 | CONTENT="Antarctic icesheet outlines"
64 | URL=icesheet-outlines
65 | else
66 | CONTENT="Antarctic icesheet polygons"
67 | URL=icesheet-polygons
68 | fi
69 |
70 | sed -e "s?@YEAR@?${YEAR}?g;s?@URL@?${url_prefix}/${URL}.html?g;s?@DATE@?${DATE}?g;s?@CONTENT@?${CONTENT}?g" "$BIN/README.tmpl" | sed "/@LAYERS@/N;s?@LAYERS@?$LAYERS?" >"$SHAPEDIR/README"
71 | rm -f "$SHAPEDIR.zip.new"
72 | zip "$SHAPEDIR.zip.new" "$SHAPEDIR"/*
73 | mv "$SHAPEDIR.zip.new" "$SHAPEDIR.zip"
74 | mv "$SHAPEDIR.zip" "$RESULTS/$SHAPEDIR.zip"
75 | done
76 |
77 | echo "Done."
78 | date $iso_date
79 |
80 |
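81 | # The cut pipeline above parses ogrinfo output of the form (values illustrative):
82 | #
83 | #   Extent: (-20037508.342789, -8625154.471849) - (20037508.342789, -8300407.835164)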
--------------------------------------------------------------------------------
/scripts/icesheet/update.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #
3 | # icesheet/update.sh
4 | #
5 |
6 | set -euo pipefail
7 | set -x
8 |
9 | DATADIR=/home/robot/data/icesheet
10 | PLANETDIR=/mnt/data/planet
11 |
12 | iso_date='+%Y-%m-%dT%H:%M:%S'
13 |
14 | export BIN
15 | BIN=$(cd "$(dirname "$0")" ; pwd -P)
16 |
17 | mkdir -p $DATADIR
18 |
19 | export PLANET=${PLANETDIR}/planet.osm.pbf
20 | export ANT=${PLANETDIR}/antarctica.osm.pbf
21 | export ANT_COASTLINES=${PLANETDIR}/antarctica-coastlines.osm.pbf
22 |
23 | export ANT_NATURAL=${DATADIR}/antarctica-natural
24 |
25 | DB=$DATADIR/icesheet.db
26 |
27 | export STATS=$DATADIR/stats
28 |
29 | echo "Started icesheet/update.sh"
30 | date $iso_date
31 |
32 | rm -f $DATADIR/antarctica_icesheet.db
33 | rm -rf $DATADIR/antarctica-icesheet-*
34 | rm -f $DATADIR/coastlines-split-3857.db
35 |
36 | echo "Creating coastline..."
37 |
38 | osmcoastline --verbose --overwrite \
39 | --output-polygons=both --output-lines \
40 | -o $DB.new \
41 | --srs=3857 --max-points=500 \
42 | $ANT_COASTLINES \
43 | && true
44 |
45 | mv $DB.new $DB
46 |
47 | date $iso_date
48 |
49 | echo "Extracting polygons tagged with 'natural'..."
50 |
51 | osmium tags-filter --verbose --overwrite --output=$ANT_NATURAL.osm.pbf \
52 | --remove-tags $ANT \
53 | a/natural!=bay,cliff,sinkhole,cave_entrance,crevasse,dune,desert,valley,volcano,coastline;
54 |
55 | osmium export --verbose "--config=$BIN/osmium-export-config.json" \
56 | --geometry-types=polygon --overwrite --output=$ANT_NATURAL.geojson \
57 | $ANT_NATURAL.osm.pbf
58 |
59 | sed -e 's/"natural":/"type":/g' $ANT_NATURAL.geojson >${ANT_NATURAL}-type.geojson
60 |
61 | date $iso_date
62 |
63 | echo "Adding non-icesheet data to db..."
64 | ogr2ogr --config OGR_SQLITE_SYNCHRONOUS OFF -f "SQLite" -gt 65535 \
65 | -s_srs "EPSG:4326" -t_srs "EPSG:3857" \
66 | -skipfailures -explodecollections \
67 | -spat -180 -85.05113 180 -60 -update -append \
68 | -nln noice -nlt POLYGON \
69 | $DB ${ANT_NATURAL}-type.geojson
70 |
71 |
72 | date $iso_date
73 |
74 | pragmas="PRAGMA journal_mode = OFF; PRAGMA synchronous = OFF; PRAGMA temp_store = MEMORY; PRAGMA cache_size = 1000000;"
75 |
76 | SPLIT_SIZE=200000
77 | EDGE_TYPE_ATTRIBUTE=ice_edge
78 | EDGE_TYPE_ICE_OCEAN=ice_ocean
79 | EDGE_TYPE_ICE_LAND=ice_land
80 | EDGE_TYPE_ICE_ICE=ice_ice
81 | COASTLINE_LAYER=lines
82 |
83 | echo "${pragmas}
84 |
85 | CREATE TABLE ice ( OGC_FID INTEGER PRIMARY KEY AUTOINCREMENT );
86 | SELECT AddGeometryColumn('ice', 'GEOMETRY', 3857, 'MULTIPOLYGON', 'XY');
87 | SELECT CreateSpatialIndex('ice', 'GEOMETRY');
88 |
89 | INSERT INTO ice (OGC_FID, GEOMETRY)
90 | SELECT land_polygons.OGC_FID, CastToMultiPolygon(land_polygons.GEOMETRY)
91 | FROM land_polygons;
92 |
93 | REPLACE INTO ice (OGC_FID, GEOMETRY)
94 | SELECT ice.OGC_FID, CastToMultiPolygon(ST_Difference(ice.GEOMETRY, ST_Union(noice.GEOMETRY)))
95 | FROM ice JOIN noice
96 | ON (ST_Intersects(ice.GEOMETRY, noice.GEOMETRY) AND noice.OGC_FID IN
97 | (SELECT ROWID FROM SpatialIndex WHERE f_table_name = 'noice' AND search_frame = ice.GEOMETRY))
98 | GROUP BY ice.OGC_FID;
99 |
100 | .elemgeo ice GEOMETRY ice_split id_new id_old;
101 | DELETE FROM ice;
102 |
103 | INSERT INTO ice (GEOMETRY)
104 | SELECT CastToMultiPolygon(ice_split.GEOMETRY)
105 | FROM ice_split
106 | WHERE ST_Area(GEOMETRY) > 0.1;
107 |
108 | SELECT DiscardGeometryColumn('ice_split', 'GEOMETRY');
109 | DROP TABLE ice_split;
110 | VACUUM;
111 |
112 | CREATE TABLE noice_outline ( OGC_FID INTEGER PRIMARY KEY AUTOINCREMENT, oid INTEGER, iteration INTEGER );
113 | SELECT AddGeometryColumn('noice_outline', 'GEOMETRY', 3857, 'MULTILINESTRING', 'XY');
114 | SELECT CreateSpatialIndex('noice_outline', 'GEOMETRY');
115 |
116 | INSERT INTO noice_outline (OGC_FID, oid, iteration, GEOMETRY)
117 | SELECT noice.OGC_FID, noice.OGC_FID, 0, CastToMultiLineString(ST_Boundary(noice.GEOMETRY))
118 | FROM noice;
119 |
120 | .elemgeo noice_outline GEOMETRY noice_outline_split id_new id_old;
121 | DELETE FROM noice_outline;
122 |
123 | INSERT INTO noice_outline (oid, iteration, GEOMETRY)
124 | SELECT noice_outline_split.oid, 0, CastToMultiLineString(noice_outline_split.GEOMETRY)
125 | FROM noice_outline_split
126 | WHERE ST_Length(noice_outline_split.GEOMETRY) <= $SPLIT_SIZE;
127 |
128 | INSERT INTO noice_outline (oid, iteration, GEOMETRY)
129 | SELECT noice_outline_split.oid, 1, CastToMultiLineString(ST_Line_Substring(noice_outline_split.GEOMETRY, 0.0, 0.5))
130 | FROM noice_outline_split
131 | WHERE ST_Length(noice_outline_split.GEOMETRY) > $SPLIT_SIZE;
132 |
133 | INSERT INTO noice_outline (oid, iteration, GEOMETRY)
134 | SELECT noice_outline_split.oid, 1, CastToMultiLineString(ST_Line_Substring(noice_outline_split.GEOMETRY, 0.5, 1.0))
135 | FROM noice_outline_split
136 | WHERE ST_Length(noice_outline_split.GEOMETRY) > $SPLIT_SIZE;
137 |
138 | SELECT DiscardGeometryColumn('noice_outline_split', 'GEOMETRY');
139 | DROP TABLE noice_outline_split;" | spatialite -batch -bail -echo "$DB"
140 |
141 | date $iso_date
142 | echo "Iterating outline splitting..."
143 |
144 | CNT=1
145 | XCNT=1
146 | while [ $XCNT -gt 0 ] ; do
147 | XCNT=$(echo "${pragmas}
148 | INSERT INTO noice_outline (oid, iteration, GEOMETRY)
149 | SELECT noice_outline.oid, ($CNT + 1), CastToMultiLineString(ST_Line_Substring(noice_outline.GEOMETRY, 0.0, 0.5))
150 | FROM noice_outline
151 | WHERE ST_Length(noice_outline.GEOMETRY) > $SPLIT_SIZE AND noice_outline.iteration = $CNT;
152 |
153 | INSERT INTO noice_outline (oid, iteration, GEOMETRY)
154 | SELECT noice_outline.oid, ($CNT + 1), CastToMultiLineString(ST_Line_Substring(noice_outline.GEOMETRY, 0.5, 1.0))
155 | FROM noice_outline
156 | WHERE ST_Length(noice_outline.GEOMETRY) > $SPLIT_SIZE AND noice_outline.iteration = $CNT;
157 |
158 | SELECT COUNT(*)
159 | FROM noice_outline
160 | WHERE ST_Length(noice_outline.GEOMETRY) > $SPLIT_SIZE AND noice_outline.iteration = ($CNT + 1);" | spatialite -batch -bail -echo "$DB" | tail -n 1)
161 |
162 | echo "--- iteration $CNT ($XCNT) ---"
163 | CNT=$((CNT + 1))
164 | done
165 |
166 | rm -f "$DATADIR/cnt.txt"
167 |
168 | date $iso_date
169 | echo "Running spatialite processing (second part)..."
170 |
171 | echo "${pragmas}
172 | DELETE FROM noice_outline WHERE ST_Length(noice_outline.GEOMETRY) > $SPLIT_SIZE;
173 | VACUUM;
174 |
175 | CREATE TABLE ice_outline ( OGC_FID INTEGER PRIMARY KEY AUTOINCREMENT, $EDGE_TYPE_ATTRIBUTE TEXT );
176 | SELECT AddGeometryColumn('ice_outline', 'GEOMETRY', 3857, 'MULTILINESTRING', 'XY');
177 | SELECT CreateSpatialIndex('ice_outline', 'GEOMETRY');
178 |
179 | CREATE TABLE ice_outline2 ( OGC_FID INTEGER PRIMARY KEY AUTOINCREMENT, $EDGE_TYPE_ATTRIBUTE TEXT );
180 | SELECT AddGeometryColumn('ice_outline2', 'GEOMETRY', 3857, 'MULTILINESTRING', 'XY');
181 | SELECT CreateSpatialIndex('ice_outline2', 'GEOMETRY');
182 |
183 | UPDATE water_polygons SET GEOMETRY = ST_Buffer(GEOMETRY,0.01);
184 |
185 | REPLACE INTO noice_outline (OGC_FID, oid, GEOMETRY)
186 | SELECT noice_outline.OGC_FID, noice_outline.oid, CastToMultiLineString(ST_Difference(noice_outline.GEOMETRY, ST_Union(water_polygons.GEOMETRY)))
187 | FROM noice_outline JOIN water_polygons
188 | ON (ST_Intersects(noice_outline.GEOMETRY, water_polygons.GEOMETRY) AND water_polygons.OGC_FID IN
189 | (SELECT ROWID FROM SpatialIndex WHERE f_table_name = 'water_polygons' AND search_frame = noice_outline.GEOMETRY))
190 | GROUP BY noice_outline.OGC_FID;
191 |
192 | SELECT 'step 1', datetime('now');
193 |
194 | REPLACE INTO noice_outline (OGC_FID, oid, GEOMETRY)
195 | SELECT noice_outline.OGC_FID, noice_outline.oid, CastToMultiLineString(ST_Difference(noice_outline.GEOMETRY, ST_Union(noice.GEOMETRY)))
196 | FROM noice_outline JOIN noice
197 | ON (ST_Intersects(noice_outline.GEOMETRY, noice.GEOMETRY) AND noice.OGC_FID IN
198 | (SELECT ROWID FROM SpatialIndex WHERE f_table_name = 'noice' AND search_frame = noice_outline.GEOMETRY) AND noice.OGC_FID <> noice_outline.oid)
199 | GROUP BY noice_outline.OGC_FID;
200 |
201 | DELETE FROM noice_outline WHERE ST_Length(GEOMETRY) < 0.01 OR GEOMETRY IS NULL;
202 |
203 | DELETE FROM ice_outline;
204 |
205 | INSERT INTO ice_outline (OGC_FID, $EDGE_TYPE_ATTRIBUTE, GEOMETRY)
206 | SELECT $COASTLINE_LAYER.OGC_FID, '$EDGE_TYPE_ICE_OCEAN', CastToMultiLineString($COASTLINE_LAYER.GEOMETRY)
207 | FROM $COASTLINE_LAYER;
208 |
209 | SELECT 'step 2', datetime('now');
210 |
211 | REPLACE INTO ice_outline (OGC_FID, $EDGE_TYPE_ATTRIBUTE, GEOMETRY)
212 | SELECT ice_outline.OGC_FID, '$EDGE_TYPE_ICE_OCEAN', CastToMultiLineString(ST_Difference(ice_outline.GEOMETRY, ST_Union(ST_Buffer(noice.GEOMETRY, 0.01))))
213 | FROM ice_outline JOIN noice
214 | ON (ST_Intersects(ice_outline.GEOMETRY, noice.GEOMETRY) AND noice.OGC_FID IN
215 | (SELECT ROWID FROM SpatialIndex WHERE f_table_name = 'noice' AND search_frame = ST_Buffer(ice_outline.GEOMETRY, 0.01)))
216 | GROUP BY ice_outline.OGC_FID;
217 |
218 | INSERT INTO ice_outline ($EDGE_TYPE_ATTRIBUTE, GEOMETRY)
219 | SELECT '$EDGE_TYPE_ICE_LAND', noice_outline.GEOMETRY
220 | FROM noice_outline;
221 |
222 | INSERT INTO ice_outline2 (OGC_FID, $EDGE_TYPE_ATTRIBUTE, GEOMETRY)
223 | SELECT OGC_FID, $EDGE_TYPE_ATTRIBUTE, GEOMETRY
224 | FROM ice_outline;
225 |
226 | SELECT 'step 3', datetime('now');
227 |
228 | REPLACE INTO ice_outline2 (OGC_FID, $EDGE_TYPE_ATTRIBUTE, GEOMETRY)
229 | SELECT ice_outline2.OGC_FID, '$EDGE_TYPE_ICE_LAND', CastToMultiLineString(ST_Difference(ice_outline2.GEOMETRY, ST_Union(ST_Buffer(noice.GEOMETRY, 0.01))))
230 | FROM ice_outline2 JOIN noice
231 | ON (ST_Intersects(ice_outline2.GEOMETRY, noice.GEOMETRY) AND noice.OGC_FID IN
232 | (SELECT ROWID FROM SpatialIndex WHERE f_table_name = 'noice' AND search_frame = ST_Buffer(ice_outline2.GEOMETRY, 0.01)) AND noice.type = 'glacier')
233 | GROUP BY ice_outline2.OGC_FID;
234 |
235 | SELECT 'step 4', datetime('now');
236 |
237 | INSERT INTO ice_outline2 ($EDGE_TYPE_ATTRIBUTE, GEOMETRY)
238 | SELECT '$EDGE_TYPE_ICE_ICE', CastToMultiLineString(ST_Intersection(ice_outline.GEOMETRY, ST_Union(ST_Buffer(noice.GEOMETRY, 0.01))))
239 | FROM ice_outline JOIN noice
240 | ON (ST_Intersects(ice_outline.GEOMETRY, noice.GEOMETRY) AND noice.OGC_FID IN
241 | (SELECT ROWID FROM SpatialIndex WHERE f_table_name = 'noice' AND search_frame = ST_Buffer(ice_outline.GEOMETRY, 0.01)) AND noice.type = 'glacier')
242 | GROUP BY ice_outline.OGC_FID;
243 |
244 | DELETE FROM ice_outline;
245 | SELECT DisableSpatialIndex('ice_outline', 'GEOMETRY');
246 | DROP TABLE idx_ice_outline_GEOMETRY;
247 |
248 | SELECT DiscardGeometryColumn('ice_outline', 'GEOMETRY');
249 | SELECT RecoverGeometryColumn('ice_outline', 'GEOMETRY', 3857, 'LINESTRING', 'XY');
250 | SELECT CreateSpatialIndex('ice_outline', 'GEOMETRY');
251 |
252 | .elemgeo ice_outline2 GEOMETRY ice_outline2_flat id_new id_old;
253 |
254 | INSERT INTO ice_outline ($EDGE_TYPE_ATTRIBUTE, GEOMETRY)
255 | SELECT ice_outline2_flat.$EDGE_TYPE_ATTRIBUTE, ice_outline2_flat.GEOMETRY
256 | FROM ice_outline2_flat
257 | WHERE ST_Length(GEOMETRY) > 0.01;
258 |
259 | DELETE FROM ice_outline2;
260 | SELECT DisableSpatialIndex('ice_outline2', 'GEOMETRY');
261 | DROP TABLE idx_ice_outline2_GEOMETRY;
262 | SELECT DiscardGeometryColumn('ice_outline2', 'GEOMETRY');
263 | DROP TABLE ice_outline2;
264 |
265 | SELECT DiscardGeometryColumn('ice_outline2_flat', 'GEOMETRY');
266 | DROP TABLE ice_outline2_flat;
267 | VACUUM;" | spatialite -batch -bail -echo "$DB"
268 |
269 | rm -rf "$DATADIR/antarctica-icesheet-outlines-3857"
270 | mkdir "$DATADIR/antarctica-icesheet-outlines-3857"
271 | rm -rf "$DATADIR/antarctica-icesheet-polygons-3857"
272 | mkdir "$DATADIR/antarctica-icesheet-polygons-3857"
273 |
274 | date $iso_date
275 | echo "Converting results to shapefiles..."
276 |
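# -spat limits processing to a bounding box covering the Antarctic part of the
# Mercator plane; "-clipsrc spat_extent" additionally clips the output
# geometries to that same box instead of only filtering by intersection.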
277 | ogr2ogr -skipfailures -explodecollections \
278 | -spat -20037508.342789244 -20037508.342789244 20037508.342789244 -8300000 \
279 | -clipsrc spat_extent \
280 | $DATADIR/antarctica-icesheet-polygons-3857/icesheet_polygons.shp \
281 | -nln icesheet_polygons -nlt POLYGON \
282 | $DB ice
283 |
284 | ogr2ogr -skipfailures -explodecollections \
285 | -spat -20037508.342789244 -20037508.342789244 20037508.342789244 -8300000 \
286 | -clipsrc spat_extent \
287 | $DATADIR/antarctica-icesheet-outlines-3857/icesheet_outlines.shp \
288 | -nln icesheet_outlines -nlt LINESTRING \
289 | $DB ice_outline
290 |
291 | date $iso_date
292 |
293 | echo "Calling update-icesheet-zip.sh..."
294 |
295 | "$BIN/update-icesheet-zip.sh"
296 |
297 | df -h
298 |
299 | date $iso_date
300 |
301 | echo "Done."
302 |
303 |
--------------------------------------------------------------------------------
/scripts/low-planet/update.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #------------------------------------------------------------------------------
3 | #
4 | # low-planet/update.sh
5 | #
6 | #------------------------------------------------------------------------------
7 |
8 | set -euo pipefail
9 | set -x
10 |
11 | iso_date='+%Y-%m-%dT%H:%M:%S'
12 |
13 | DATADIR=/mnt/data/planet
14 |
15 | PLANET=$DATADIR/planet.osm.pbf
16 | LOW_PLANET=$DATADIR/low-planet.osm.pbf
17 |
18 | NEW_LOW_PLANET=$DATADIR/new-low-planet.osm.pbf
19 |
20 | mkdir -p $DATADIR
21 |
22 | date $iso_date
23 |
24 | echo "Removing leftover partial low planet (if any)..."
25 | rm -f $NEW_LOW_PLANET
26 |
27 | echo "Removing existing low planet (if any)..."
28 | rm -f $LOW_PLANET
29 |
30 | echo "Creating low planet from planet..."
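# "add-locations-to-ways" stores the node coordinates directly on the ways so
# downstream tools can work without a separate node location index;
# --keep-untagged-nodes keeps untagged nodes that would otherwise be dropped.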
31 | osmium add-locations-to-ways \
32 | --verbose \
33 | --keep-untagged-nodes \
34 | --index-type=dense_mmap_array \
35 | --fsync \
36 | --output=$NEW_LOW_PLANET $PLANET \
37 | && mv $NEW_LOW_PLANET $LOW_PLANET
38 |
39 | date $iso_date
40 |
41 | df -h
42 |
43 |
--------------------------------------------------------------------------------
/scripts/planet/update.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | #------------------------------------------------------------------------------
3 | #
4 | # planet/update.sh
5 | #
6 | #------------------------------------------------------------------------------
7 |
8 | set -euo pipefail
9 | set -x
10 |
11 | iso_date='+%Y-%m-%dT%H:%M:%S'
12 |
13 | DATADIR=/mnt/data/planet
14 |
15 | PLANET=$DATADIR/planet.osm.pbf
16 | NEW_PLANET=$DATADIR/new-planet.osm.pbf
17 | OLD_PLANET=$DATADIR/old-planet.osm.pbf
18 |
19 | COASTLINES=$DATADIR/coastlines.osm.pbf
20 | NEW_COASTLINES=$DATADIR/new-coastlines.osm.pbf
21 |
22 | ANT=$DATADIR/antarctica.osm.pbf
23 | NEW_ANT=$DATADIR/new-antarctica.osm.pbf
24 |
25 | ANT_COASTLINES=$DATADIR/antarctica-coastlines.osm.pbf
26 | NEW_ANT_COASTLINES=$DATADIR/new-antarctica-coastlines.osm.pbf
27 |
28 | mkdir -p $DATADIR
29 |
30 | rm -f $OLD_PLANET
31 |
32 | date $iso_date
33 |
34 | echo "Downloading planet file (if there isn't one)..."
35 | test -f $PLANET || wget --no-verbose -O $PLANET https://planet.openstreetmap.org/pbf/planet-latest.osm.pbf
36 |
37 | date $iso_date
38 |
39 | echo "Updating planet file..."
40 | rm -f $NEW_PLANET
41 | export OSMIUM_POOL_THREADS=3
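# pyosmium-up-to-date applies replication diffs to the local planet file;
# --size 5000 caps the change data fetched in a single run at roughly 5000 MB.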
42 | /usr/lib/python3-pyosmium/pyosmium-up-to-date -v --size 5000 --format pbf,pbf_compression=lz4 -o $NEW_PLANET $PLANET
43 | mv $PLANET $OLD_PLANET
44 | mv $NEW_PLANET $PLANET
45 |
46 | osmium fileinfo -g header.option.osmosis_replication_timestamp $PLANET >$DATADIR/last-update
47 |
48 | date $iso_date
49 |
50 | echo "Filtering coastlines..."
51 | rm -f $NEW_COASTLINES
52 | osmcoastline_filter --verbose --output=$NEW_COASTLINES $PLANET
53 | mv $NEW_COASTLINES $COASTLINES
54 |
55 | date $iso_date
56 |
57 | echo "Extracting Antarctica..."
58 | rm -f $NEW_ANT
59 | osmium extract --verbose --strategy=simple --bbox=-180,-90,180,-60 --fsync --overwrite --output-format=pbf,pbf_compression=lz4 --output=$NEW_ANT $PLANET
60 | mv $NEW_ANT $ANT
61 |
62 | date $iso_date
63 |
64 | echo "Filtering Antarctica coastlines..."
65 | rm -f $NEW_ANT_COASTLINES
66 | osmcoastline_filter --verbose --output=$NEW_ANT_COASTLINES $ANT
67 | mv $NEW_ANT_COASTLINES $ANT_COASTLINES
68 |
69 | date $iso_date
70 |
71 | df -h
72 |
73 |
--------------------------------------------------------------------------------
/servers/update-anomalies.yml:
--------------------------------------------------------------------------------
1 | #cloud-config
2 |
3 | package_update: true
4 |
5 | package_upgrade: true
6 |
7 | packages:
8 | - bc
9 | - git
10 | - jq
11 | - osmium-tool
12 | - python-gdal
13 | - python3-pyosmium
14 | - rsync
15 | - spatialite-bin
16 | - sqlite3
17 | - unzip
18 | - zip
19 |
20 | mounts:
21 | - [ sdb, /mnt ]
22 |
23 |
--------------------------------------------------------------------------------
/servers/update-low-planet.yml:
--------------------------------------------------------------------------------
1 | #cloud-config
2 |
3 | package_update: true
4 |
5 | package_upgrade: true
6 |
7 | packages:
8 | - git
9 | - osmium-tool
10 |
11 | mounts:
12 | - [ sdb, /mnt ]
13 |
14 |
--------------------------------------------------------------------------------
/servers/update-osmdata.yml:
--------------------------------------------------------------------------------
1 | #cloud-config
2 |
3 | package_update: true
4 |
5 | package_upgrade: true
6 |
7 | packages:
8 | - bc
9 | - gdal-bin
10 | - git
11 | - jq
12 | - osmcoastline
13 | - osmium-tool
14 | - postgis
15 | - postgresql-15
16 | - postgresql-15-postgis-3
17 | - postgresql-15-postgis-3-scripts
18 | - python3-gdal
19 | - python3-pyosmium
20 | - rsync
21 | - spatialite-bin
22 | - sqlite3
23 | - unzip
24 | - zip
25 |
26 | runcmd:
27 | - [ systemctl, stop, postgresql ]
28 | - [ systemctl, disable, postgresql ]
29 | - [ ldconfig ]
30 | - [ sed, -i, -e, '/deb .*backports/s/# //', '/etc/apt/sources.list' ]
31 | - [ apt-get, update, --quiet, --yes ]
32 |
33 |
--------------------------------------------------------------------------------
/servers/update-planet.yml:
--------------------------------------------------------------------------------
1 | #cloud-config
2 |
3 | package_update: true
4 |
5 | package_upgrade: true
6 |
7 | packages:
8 | - bc
9 | - gdal-bin
10 | - git
11 | - jq
12 | - osmcoastline
13 | - osmium-tool
14 | - python3-pyosmium
15 | - rsync
16 | - spatialite-bin
17 | - sqlite3
18 | - unzip
19 | - zip
20 |
21 |
--------------------------------------------------------------------------------
/web/.gitignore:
--------------------------------------------------------------------------------
1 | _site
2 |
--------------------------------------------------------------------------------
/web/_includes/data/coastlines.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
These processed datasets are made available under the
5 | Creative Commons Attribution ShareAlike 4.0
6 | license. They are based on data that is copyright OpenStreetMap contributors and available under the
7 | ODbL. More...
The coastline in OpenStreetMap is often broken. The update process will try
5 | to repair it, but this does not always work. If the OSM data can't be repaired
6 | automatically, the data here will not be updated.
In OpenStreetMap the primary division of the earth's surface into land and
9 | water is mapped through ways tagged with natural=coastline. For
10 | rendering a map, however, you usually need land or water polygons, representing
11 | continents and islands in the former case or the oceans in the latter.
Linestrings for coastlines. Long linestrings are split into smaller chunks
13 | (with no more than 100 points) that are easier and faster to work with.
14 |
15 |
Description
16 |
17 |
The data has been derived from OpenStreetMap ways tagged with
18 | natural=coastline. Some errors in the OSM data are repaired in the
19 | process. Learn more about the coastline processing...
Linestrings for the outlines of the ice covered parts of the Antarctic that
13 | are not explicitly mapped as glaciers in OpenStreetMap. Linestrings are split
14 | into smaller chunks (no more than 200 km length in projected coordinates) that
15 | are easier and faster to work with.
The icesheet outlines in particular are needed if you render glaciers with
28 | explicitly drawn outlines; otherwise you only need the Antarctic icesheet polygons.
30 |
31 |
Attributes
32 |
33 |
The files contain an ice_edge attribute which can have three
34 | different values with the following meaning:
35 |
36 |
37 |
ice_ocean indicates outline segments separating ice from ocean
38 |
ice_land indicates outline segments separating ice from ice free land
39 |
ice_ice indicates outline segments separating implicit ice areas from glaciers explicitly mapped in OpenStreetMap
40 |
41 |
42 |
Using this data
43 |
44 |
To use this data in an OpenStreetMap-based map, download the shapefile below,
45 | include it as a separate layer in your map style and render features with
46 | ice_edge=ice_ocean and ice_edge=ice_land with the same
47 | styling as used for glacier outlines.
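A quick way to check the attribute on the downloaded shapefile is an OGR SQL
query (the path here is only an example; the layer name follows the file name):

    ogrinfo -sql "SELECT COUNT(*) FROM icesheet_outlines WHERE ice_edge = 'ice_land'" \
        antarctica-icesheet-outlines-3857/icesheet_outlines.shp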
As in the case of the Land Polygons, these
26 | are split into smaller polygons for easier use.
27 |
28 |
Using this data
29 |
30 |
To use this data in an OpenStreetMap-based map, download the shapefile below,
31 | include it as a separate layer in your map style and render it with the same
32 | fill color/pattern as used for glacier polygons. If your map style renders
33 | glaciers with an outline you also need the Antarctic icesheet outlines.
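A minimal sanity check of the download with the GDAL/OGR tools could look like
this (file and layer names as produced by the update scripts; adjust the path):

    ogrinfo -so antarctica-icesheet-polygons-3857/icesheet_polygons.shp icesheet_polygons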
In the Antarctic, south of -60 degrees latitude, glaciation is not explicitly
10 | mapped in OpenStreetMap to avoid huge polygons that are difficult to
11 | handle. Instead it is assumed that all land areas that are not explicitly
12 | mapped otherwise are covered by ice. This needs to be taken into consideration
13 | when interpreting the OpenStreetMap data, for example when rendering a map. The data
14 | here was created to make this easier.
15 | Learn more about the Antarctic Icesheet processing...
Polygons for all land areas in the world, i.e. continents and islands. Optionally,
13 | large polygons are split into smaller overlapping chunks that are easier and faster
14 | to work with.
15 |
16 |
Description
17 |
18 |
The data has been derived from OpenStreetMap ways tagged with
19 | natural=coastline. Ways are assembled into polygons and then
20 | split. Some errors in the OSM data are repaired in the process.
21 | Learn more about the coastline processing...
22 |
23 |
Where polygons have been split they overlap slightly to help avoid
24 | rendering artefacts at the seams.
25 |
26 |
Variants
27 |
28 |
29 |
Large polygons are available as complete polygons or split into
30 | smaller polygons.
31 |
The data is available in WGS84
32 | and Mercator
33 | projection.
Polygons for oceans and seas. These polygons are split into smaller
13 | overlapping chunks that are easier and faster to work with.
14 |
15 |
Description
16 |
17 |
The data has been derived from OpenStreetMap ways tagged with
18 | natural=coastline. Ways are assembled into polygons and then
19 | split. Some errors in the OSM data are repaired in the process.
20 | Learn more about the coastline processing...
21 |
22 |
This dataset only contains bodies of water bordered by ways tagged
23 | natural=coastline; it does not contain lakes, reservoirs, etc.
24 | tagged with natural=water.
25 |
26 |
Where polygons have been split they overlap slightly to help avoid
27 | rendering artefacts at the seams.
28 |
29 |
Variants
30 |
31 |
32 |
The data is available in WGS84
33 | and Mercator
34 | projection.
Apart from the coastline OpenStreetMap contains a
9 | lot of other data on surface waterbodies like lakes, rivers and reservoirs. In
10 | maps based on OpenStreetMap data these are typically rendered from a rendering
11 | database that also contains most of the other data used in the map.
12 |
13 |
At coarse scales (low zoom levels), however, this data is too detailed
14 | and voluminous to be rendered efficiently. We offer a compact version of this
15 | data, reduced significantly in volume and complexity, which still produces
16 | results very close to those of the full-detail data when rendered in plain
17 | color without outlines.
18 | Learn more about the waterbody processing...
The OpenStreetMap project
12 | collects an amazing amount of geodata and makes it available to the world for
13 | free. But the raw OpenStreetMap data is hard to use. On this web site you'll
14 | find some of that data pre-processed and formatted for easier use.
15 |
16 |
Pre-processing includes removing or fixing wrong data and assembling
17 | different parts of the data into a usable whole. The data is formatted into Shapefiles for easy use in the usual
19 | GIS applications.
20 |
21 |
The following data sets are available from this site:
22 |
23 |
Coastline data processings
24 |
25 | {% include data/land-polygons.html %}
26 | {% include data/water-polygons.html %}
27 | {% include data/coastlines.html %}
28 |
29 |
Antarctic icesheet
30 |
31 | {% include data/icesheet-polygons.html %}
32 | {% include data/icesheet-outlines.html %}
33 |
--------------------------------------------------------------------------------
/web/info/downloading.html:
--------------------------------------------------------------------------------
1 | ---
2 | layout: default
3 | title: Information about downloading
4 | ---
5 |
6 |
Downloading
7 |
8 |
All data is available for free download. Use the [Download] buttons on
9 | the data pages. Files can be quite large (up to several
10 | hundred megabytes), so make sure to only download what you need.
11 |
12 |
All downloads are compressed in zip archives.
13 |
14 |
15 |
Recurring downloads
16 |
17 |
Updated data is usually available once a day. Some data is updated less often
18 | if it doesn't change much. You can use automation to re-download new versions
19 | of the data, but please use the If-Modified-Since header. This will
20 | make sure that data is only downloaded if it actually changed.
21 |
22 |
Use the curl program for downloading; it supports
23 | the -z, --time-cond option. You can set it up easily so that
24 | it will only re-download a file if it has changed.
25 |
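A minimal example of such a conditional download (the file name and URL are
only an example; pick the archive you actually need):

    curl --remote-time --output land-polygons-split-3857.zip \
         --time-cond land-polygons-split-3857.zip \
         https://osmdata.openstreetmap.de/download/land-polygons-split-3857.zip

--remote-time keeps the server's modification time on the local file, so the
next run with --time-cond only downloads the archive if it has actually changed.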
26 |
If you abuse this service and download the data more often than necessary,
27 | we might suspend the service without notice!
28 |
29 |
--------------------------------------------------------------------------------
/web/info/formats.html:
--------------------------------------------------------------------------------
1 | ---
2 | layout: default
3 | title: Information about data formats
4 | ---
5 |
6 |
Data formats
7 |
8 |
Most data available on this site currently uses the popular Shapefile
9 | format. Other formats might be added in the future.
This is a popular format for GIS data. Each layer consists of a set of
14 | files. For easy download all the files are packed into a zip archive. The
15 | archive contains the following files:
.shp - the feature geometries
.shx - the index into the geometry file
.dbf - the feature attributes
.prj - the spatial reference (projection) definition
This site provides geodata derived from OpenStreetMap data, processed for
9 | easier use for rendering maps and other purposes.
10 |
11 |
There are several download options on this site. See the information about
12 | available projections and formats. All data available here is updated regularly
14 | based on the latest changes in OpenStreetMap using processing techniques developed by us and using Open Source software.
17 |
18 |
Data
19 |
20 |
All of the data on this site is derived from the original OpenStreetMap data and is available
22 | under open licenses. See the license page for
23 | more details.
24 |
25 |
Because the data is directly derived from OpenStreetMap data, errors in OSM
26 | data will appear in this data, too. Don't ask us to fix those errors; you can
27 | do that in OpenStreetMap yourself.
28 |
29 |
You can get the original OpenStreetMap data from the official download site at
30 | planet.openstreetmap.org.
Data for download on this site is derived from OpenStreetMap data and
9 | Copyright 2019 OpenStreetMap contributors.
10 |
11 |
As required by this license all files containing data directly based on
12 | the original OpenStreetMap data are available under the same license as the
13 | original OpenStreetMap data, the Open Database License
15 | (ODbL).
16 |
17 |
If you are using this data we'd be happy if you mention this
18 | site or link to it. But you are not required to do so.
19 |
20 |
For more information about this license as applied to OSM see the OSM site.
22 |
23 | {% comment %}
24 |
Generalized data
25 |
26 |
Data sets containing generalized geometries that are produced works
27 | in terms of the ODbL are made available under Creative Commons
29 | Attribution ShareAlike 4.0 license. This means you may freely use them in
30 | maps that are available under the same or a compatible license. You are
31 | required to attribute the OpenStreetMap contributors as required by the OSM attribution
33 | requirements but as in case of the ODbL licensed data we do not require you
34 | to attribute osmdata.openstreetmap.de although we'd appreciate that.
OpenStreetMap uses the WGS84 spatial reference system, also used by the Global
14 | Positioning System (GPS). It uses geographic coordinates between -180° and 180°
15 | longitude and -90° and 90° latitude. So this is the "native" OSM format.
16 |
17 |
This is the right choice for you if you need geographical
18 | coordinates or want to transform the coordinates into some other spatial
19 | reference system or projection.
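For example, reprojecting the WGS84 land polygons into another spatial
reference system is a one-liner with ogr2ogr (file names are placeholders):

    ogr2ogr -s_srs EPSG:4326 -t_srs EPSG:3035 land_polygons_3035.shp land_polygons.shp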
Most tiled web maps (such as the standard OSM maps and Google Maps) use this
24 | Mercator projection.
25 |
26 |
The map area of such maps is a square with x and y coordinates both between
27 | -20,037,508.34 and 20,037,508.34 meters. As a result, data north of about 85.1°
28 | and south of about -85.1° latitude cannot be shown and has been cut off.
29 |
30 |
The correct EPSG code for this data is 3857 and this is what the data files
31 | show. Before this code was allocated, other codes such as 900913 were used. If
32 | your software does not understand the 3857 code you might need to upgrade. See
33 | this page
34 | for all the details.
35 |
36 |
This is the right choice for you if you are creating tiled web
37 | maps.
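The boundary value is simply π times the WGS84 equatorial radius of 6,378,137
meters; a quick check with bc:

    echo '4*a(1)*6378137' | bc -l

prints approximately 20037508.3428.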
This is internal data. Do not use unless you know what
11 | you are doing!
12 |
13 |
This page can help with figuring out if there is a problem with processing
14 | the coastline data. If you don't know what this is about, you can safely ignore
15 | it.
These pages can help with figuring out if there is a problem with processing
8 | the data. This is only for admins and others who want to help out with keeping
9 | this service running smoothly.
The coastline is handled somewhat differently than most other features in
12 | OSM. To mark it, ways with the tag natural=coastline are used. Those
13 | ways need to connect end-to-end to form an unbroken line around every island
14 | and every continent. And the land always has to be on the left side, the water
15 | on the right side of those ways. For details see the
16 | OSM wiki.
18 |
19 |
20 |
Processing with OSMCoastline
21 |
22 |
The world is big and all the coastlines together are very very long. So
23 | there is a lot of data. And chances are that at any moment someone somewhere
24 | is updating the coastline. And that is not always easy to do, and errors happen.
25 | But we need one unbroken line to form proper (multi)polygons of land masses
26 | (or the sea). Specifically for this task we have developed the OSMCoastline program. It assembles
28 | all the data, applies some fixes if necessary and creates the data you see on
29 | this site. For more of the inner workings refer to the OSMCoastline
30 | documentation or have a look at these blog
32 | posts. OSMCoastline has many parameters and options and you can run it
33 | yourself to create specialized output if what you see on this site is not
34 | enough for your use case.
35 |
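To give an idea of what such a run looks like, the commands below mirror the
options used by the osmdata update scripts (input and output names are
placeholders):

    osmcoastline_filter --verbose --output=coastlines.osm.pbf planet.osm.pbf
    osmcoastline --verbose --output-polygons=both --output-lines \
        --srs=3857 --max-points=500 -o coastline.db coastlines.osm.pbf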
36 |
37 |
Finding and fixing errors
38 |
39 |
As the OSMCoastline program runs to create the coastline, it will report any
40 | errors in the coastline it finds. This information is then sent on to the
41 | Coastline view of the
43 | Geofabrik OSM Inspector. You can help fix the OSM data by checking
44 | that site and correcting any errors you see. Thanks to all the OSM users who
45 | do this!
46 |
47 |
If you can't find an error shown by the OSM Inspector in your
48 | OSM editor, chances are somebody else was faster and has already fixed it!
Most OpenStreetMap-based maps are rendered from exactly the same data on all
11 | scales. This means that at low zoom levels there is usually way too much
12 | geometric detail which produces a lot of noise and blurriness in the results by
13 | rendering details that cannot be properly represented at the map scale and
14 | resolution used.
15 |
16 |
The process of adapting the detailed map data as it has been surveyed to the
17 | target map scale and resolution is called cartographic generalization.
18 | This is neglected in most digital zoomable maps. Key to good quality
19 | generalization and ultimately clear and well readable maps is the elimination
20 | of unimportant details that cannot be properly shown and at the same time
21 | selection and emphasis of key elements. This is not to be confused with mere
22 | geometric simplification which is just one element of generalization and which
23 | is widely used in digital map production.
24 |
25 |
Generalized coastlines
26 |
27 |
We provide a generalized version of the
28 | OpenStreetMap coastlines for use in web
29 | maps in Mercator projection. The coastline is usually the most important
30 | element of maps at the lowest zoom levels so generalization is most important
31 | here.
32 |
33 |
The technique used is a raster-based process implemented in the coastline_gen tool. It
35 | ensures that straits and land bridges are properly represented at the low
36 | resolutions. Different levels of generalization are produced for the different
37 | zoom levels up to z=8.
38 |
39 |
40 |
41 |
42 |
43 |
44 |
45 |
46 |
After the original coastline data processing, two further steps are applied:
47 |
48 |
49 |
Any lakes tagged natural=coastline will be removed.
50 |
Some marine coastal water areas, primarily at the mouths of rivers, that
51 | are not included in the coastline in OpenStreetMap are added based on the
52 | maritime=yes tag.
53 |
54 |
55 |
Rendering example
56 |
57 |
Here is an example of these coastlines in use - together with generalized
58 | waterbodies and relief shading - from maps.imagico.de.
Land in the Antarctic is more than 90 percent covered by ice. It was
11 | therefore decided some time
13 | ago in OpenStreetMap that the ice cover in general is not to be explicitly
14 | mapped and instead it is assumed that all land (as mapped by the coastline) is ice-covered south of 60
16 | degrees southern latitude unless it is explicitly mapped otherwise.
17 |
18 |
This convention simplifies mapping a lot but it also produces some
19 | difficulties for data users since they need to take this into account if they
20 | want to correctly interpret the OpenStreetMap data.
21 |
22 |
If you render a map displaying glaciers and icesheets based on OpenStreetMap
23 | data you can either modify the rendering style locally in the Antarctic to take
24 | care of the different tagging rules or you use the Antarctic icesheet data provided here in
26 | addition to the normal glacier polygons from OpenStreetMap.
27 |
28 |
The icesheet data here essentially contains a set of polygons and
29 | corresponding outlines of the ice-covered areas not explicitly mapped.
30 |
31 |
This is generated by processing the land polygon data and the landcover
32 | mapping of the area from OpenStreetMap with
33 | spatialite.
34 | The processing scripts are available on github as Open
36 | Source.
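At its core the polygon generation is a spatial difference: start from the land
polygons and subtract everything that is explicitly mapped as something other
than ice. Much simplified compared to the real script (table names and the $DB
database file as used there), the central step looks like this:

    echo "REPLACE INTO ice (OGC_FID, GEOMETRY)
          SELECT ice.OGC_FID,
                 CastToMultiPolygon(ST_Difference(ice.GEOMETRY, ST_Union(noice.GEOMETRY)))
          FROM ice JOIN noice ON ST_Intersects(ice.GEOMETRY, noice.GEOMETRY)
          GROUP BY ice.OGC_FID;" | spatialite -batch -bail "$DB"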
Here is a rendering example from the OpenStreetMap standard style: on the left
44 | without the icesheet polygons, on the right with them - rendered using the same
45 | styling as for normal glaciers.
OpenStreetMap data in its raw form is often hard to use. The goal of this
9 | site is to give you access to pre-processed OSM data in a convenient form. But
10 | we cannot create the data in every form that might be needed, so we give you
11 | the background information and the software to do it yourself.
Inland waterbodies in OpenStreetMap are mapped with a variety of tags
9 | producing a fairly large volume of data - more than 5 GB in usual GIS
10 | formats for the polygon data alone. Since there is no objective measure of
11 | importance for these different features there is no easy way to filter from
12 | this large volume of data only those features that are relevant for rendering
13 | maps at coarse scales.
14 |
15 |
Therefore most OpenStreetMap based maps render water at the low zoom levels
16 | either from other data sources or using questionable filtering techniques that
17 | often lead to low quality results.
18 |
19 |
The reduced waterbody data we offer here is generated from the full set of
20 | waterbody data in OpenStreetMap and reduces this in volume without subjective
21 | decisions that affect the rendering results in undesirable ways. This is done
22 | by rasterizing the data at high resolution and generating a supersampled low
23 | resolution raster mask from it. This raster mask is made available as raster
24 | images and converted back to a very compact polygon representation which is
25 | made available as well.
26 |
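This is not the actual production pipeline, but the rasterize / supersample /
vectorize idea can be sketched with standard GDAL tools (resolutions and file
names are made up):

    # burn the waterbody polygons into a high-resolution binary mask
    gdal_rasterize -l water -burn 255 -ot Byte -tr 250 250 water.shp water_hires.tif
    # average down to the target resolution (supersampling); a real setup would
    # threshold the averaged values before vectorizing
    gdalwarp -r average -tr 2000 2000 water_hires.tif water_mask.tif
    # convert the mask back into a compact polygon representation
    gdal_polygonize.py water_mask.tif -f "ESRI Shapefile" water_reduced.shp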
27 |
This approach comes with a number of limitations that need to be kept in
28 | mind when using this data:
29 |
30 |
31 |
The files are only suited for rendering in the resolution they are
32 | produced for. Rendering either in a higher or lower resolution will
33 | produce suboptimal results.
34 |
The polygon version does not preserve the topology of the
35 | original data in the way you might expect. The rivers, for example, do not form a
36 | continuous geometry.
37 |
The styling you can use with these files is somewhat limited. You cannot
38 | render the polygons with an outline for example. Using fill patterns or edge
39 | color gradients should however work.