├── LICENSE
├── README.md
├── config.py
├── data.py
├── dpp.yml
├── examples
│   ├── run_dpp_download_data.ipynb
│   ├── run_dpp_download_data.py
│   ├── run_dpp_from_archive.ipynb
│   ├── run_dpp_from_archive.py
│   ├── run_dpp_from_directory.ipynb
│   └── run_dpp_from_directory.py
├── model.py
├── models
│   ├── detection
│   │   └── 20201002
│   │       ├── dict_hyperopt_t733.pckl
│   │       ├── model_hyperopt_t733.h5
│   │       └── trials_hyperopt_ntrials_1000.pckl
│   └── picking
│       ├── 20201002_1
│       │   ├── P
│       │   │   ├── dict_hyperopt_t027.pckl
│       │   │   ├── model_hyperopt_t027.h5
│       │   │   └── trials_hyperopt_ntrials_050.pckl
│       │   └── S
│       │       ├── dict_hyperopt_t009.pckl
│       │       ├── model_hyperopt_t009.h5
│       │       └── trials_hyperopt_ntrials_050.pckl
│       └── 20201002_2
│           ├── P
│           │   ├── dict_hyperopt_t004.pckl
│           │   ├── model_hyperopt_t004.h5
│           │   └── trials_hyperopt_ntrials_050.pckl
│           └── S
│               ├── dict_hyperopt_t023.pckl
│               ├── model_hyperopt_t023.h5
│               └── trials_hyperopt_ntrials_050.pckl
├── requirements.txt
├── sample_data
│   ├── CX_20140301
│   │   ├── CX.PB01..HH.mseed
│   │   └── CX.PB02..HH.mseed
│   ├── CX_20140401
│   │   ├── CX.PB01..HH.mseed
│   │   └── CX.PB02..HH.mseed
│   └── archive
│       └── 2014
│           └── CX
│               ├── PB01
│               │   ├── HHE.D
│               │   │   └── CX.PB01..HHE.D.2014.121
│               │   ├── HHN.D
│               │   │   └── CX.PB01..HHN.D.2014.121
│               │   └── HHZ.D
│               │       └── CX.PB01..HHZ.D.2014.121
│               └── PB02
│                   ├── HHE.D
│                   │   └── CX.PB02..HHE.D.2014.121
│                   ├── HHN.D
│                   │   └── CX.PB02..HHN.D.2014.121
│                   └── HHZ.D
│                       └── CX.PB02..HHZ.D.2014.121
└── util.py
/LICENSE:
--------------------------------------------------------------------------------
1 | GNU GENERAL PUBLIC LICENSE
2 | Version 3, 29 June 2007
3 |
4 | Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
5 | Everyone is permitted to copy and distribute verbatim copies
6 | of this license document, but changing it is not allowed.
7 |
8 | Preamble
9 |
10 | The GNU General Public License is a free, copyleft license for
11 | software and other kinds of works.
12 |
13 | The licenses for most software and other practical works are designed
14 | to take away your freedom to share and change the works. By contrast,
15 | the GNU General Public License is intended to guarantee your freedom to
16 | share and change all versions of a program--to make sure it remains free
17 | software for all its users. We, the Free Software Foundation, use the
18 | GNU General Public License for most of our software; it applies also to
19 | any other work released this way by its authors. You can apply it to
20 | your programs, too.
21 |
22 | When we speak of free software, we are referring to freedom, not
23 | price. Our General Public Licenses are designed to make sure that you
24 | have the freedom to distribute copies of free software (and charge for
25 | them if you wish), that you receive source code or can get it if you
26 | want it, that you can change the software or use pieces of it in new
27 | free programs, and that you know you can do these things.
28 |
29 | To protect your rights, we need to prevent others from denying you
30 | these rights or asking you to surrender the rights. Therefore, you have
31 | certain responsibilities if you distribute copies of the software, or if
32 | you modify it: responsibilities to respect the freedom of others.
33 |
34 | For example, if you distribute copies of such a program, whether
35 | gratis or for a fee, you must pass on to the recipients the same
36 | freedoms that you received. You must make sure that they, too, receive
37 | or can get the source code. And you must show them these terms so they
38 | know their rights.
39 |
40 | Developers that use the GNU GPL protect your rights with two steps:
41 | (1) assert copyright on the software, and (2) offer you this License
42 | giving you legal permission to copy, distribute and/or modify it.
43 |
44 | For the developers' and authors' protection, the GPL clearly explains
45 | that there is no warranty for this free software. For both users' and
46 | authors' sake, the GPL requires that modified versions be marked as
47 | changed, so that their problems will not be attributed erroneously to
48 | authors of previous versions.
49 |
50 | Some devices are designed to deny users access to install or run
51 | modified versions of the software inside them, although the manufacturer
52 | can do so. This is fundamentally incompatible with the aim of
53 | protecting users' freedom to change the software. The systematic
54 | pattern of such abuse occurs in the area of products for individuals to
55 | use, which is precisely where it is most unacceptable. Therefore, we
56 | have designed this version of the GPL to prohibit the practice for those
57 | products. If such problems arise substantially in other domains, we
58 | stand ready to extend this provision to those domains in future versions
59 | of the GPL, as needed to protect the freedom of users.
60 |
61 | Finally, every program is threatened constantly by software patents.
62 | States should not allow patents to restrict development and use of
63 | software on general-purpose computers, but in those that do, we wish to
64 | avoid the special danger that patents applied to a free program could
65 | make it effectively proprietary. To prevent this, the GPL assures that
66 | patents cannot be used to render the program non-free.
67 |
68 | The precise terms and conditions for copying, distribution and
69 | modification follow.
70 |
71 | TERMS AND CONDITIONS
72 |
73 | 0. Definitions.
74 |
75 | "This License" refers to version 3 of the GNU General Public License.
76 |
77 | "Copyright" also means copyright-like laws that apply to other kinds of
78 | works, such as semiconductor masks.
79 |
80 | "The Program" refers to any copyrightable work licensed under this
81 | License. Each licensee is addressed as "you". "Licensees" and
82 | "recipients" may be individuals or organizations.
83 |
84 | To "modify" a work means to copy from or adapt all or part of the work
85 | in a fashion requiring copyright permission, other than the making of an
86 | exact copy. The resulting work is called a "modified version" of the
87 | earlier work or a work "based on" the earlier work.
88 |
89 | A "covered work" means either the unmodified Program or a work based
90 | on the Program.
91 |
92 | To "propagate" a work means to do anything with it that, without
93 | permission, would make you directly or secondarily liable for
94 | infringement under applicable copyright law, except executing it on a
95 | computer or modifying a private copy. Propagation includes copying,
96 | distribution (with or without modification), making available to the
97 | public, and in some countries other activities as well.
98 |
99 | To "convey" a work means any kind of propagation that enables other
100 | parties to make or receive copies. Mere interaction with a user through
101 | a computer network, with no transfer of a copy, is not conveying.
102 |
103 | An interactive user interface displays "Appropriate Legal Notices"
104 | to the extent that it includes a convenient and prominently visible
105 | feature that (1) displays an appropriate copyright notice, and (2)
106 | tells the user that there is no warranty for the work (except to the
107 | extent that warranties are provided), that licensees may convey the
108 | work under this License, and how to view a copy of this License. If
109 | the interface presents a list of user commands or options, such as a
110 | menu, a prominent item in the list meets this criterion.
111 |
112 | 1. Source Code.
113 |
114 | The "source code" for a work means the preferred form of the work
115 | for making modifications to it. "Object code" means any non-source
116 | form of a work.
117 |
118 | A "Standard Interface" means an interface that either is an official
119 | standard defined by a recognized standards body, or, in the case of
120 | interfaces specified for a particular programming language, one that
121 | is widely used among developers working in that language.
122 |
123 | The "System Libraries" of an executable work include anything, other
124 | than the work as a whole, that (a) is included in the normal form of
125 | packaging a Major Component, but which is not part of that Major
126 | Component, and (b) serves only to enable use of the work with that
127 | Major Component, or to implement a Standard Interface for which an
128 | implementation is available to the public in source code form. A
129 | "Major Component", in this context, means a major essential component
130 | (kernel, window system, and so on) of the specific operating system
131 | (if any) on which the executable work runs, or a compiler used to
132 | produce the work, or an object code interpreter used to run it.
133 |
134 | The "Corresponding Source" for a work in object code form means all
135 | the source code needed to generate, install, and (for an executable
136 | work) run the object code and to modify the work, including scripts to
137 | control those activities. However, it does not include the work's
138 | System Libraries, or general-purpose tools or generally available free
139 | programs which are used unmodified in performing those activities but
140 | which are not part of the work. For example, Corresponding Source
141 | includes interface definition files associated with source files for
142 | the work, and the source code for shared libraries and dynamically
143 | linked subprograms that the work is specifically designed to require,
144 | such as by intimate data communication or control flow between those
145 | subprograms and other parts of the work.
146 |
147 | The Corresponding Source need not include anything that users
148 | can regenerate automatically from other parts of the Corresponding
149 | Source.
150 |
151 | The Corresponding Source for a work in source code form is that
152 | same work.
153 |
154 | 2. Basic Permissions.
155 |
156 | All rights granted under this License are granted for the term of
157 | copyright on the Program, and are irrevocable provided the stated
158 | conditions are met. This License explicitly affirms your unlimited
159 | permission to run the unmodified Program. The output from running a
160 | covered work is covered by this License only if the output, given its
161 | content, constitutes a covered work. This License acknowledges your
162 | rights of fair use or other equivalent, as provided by copyright law.
163 |
164 | You may make, run and propagate covered works that you do not
165 | convey, without conditions so long as your license otherwise remains
166 | in force. You may convey covered works to others for the sole purpose
167 | of having them make modifications exclusively for you, or provide you
168 | with facilities for running those works, provided that you comply with
169 | the terms of this License in conveying all material for which you do
170 | not control copyright. Those thus making or running the covered works
171 | for you must do so exclusively on your behalf, under your direction
172 | and control, on terms that prohibit them from making any copies of
173 | your copyrighted material outside their relationship with you.
174 |
175 | Conveying under any other circumstances is permitted solely under
176 | the conditions stated below. Sublicensing is not allowed; section 10
177 | makes it unnecessary.
178 |
179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
180 |
181 | No covered work shall be deemed part of an effective technological
182 | measure under any applicable law fulfilling obligations under article
183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or
184 | similar laws prohibiting or restricting circumvention of such
185 | measures.
186 |
187 | When you convey a covered work, you waive any legal power to forbid
188 | circumvention of technological measures to the extent such circumvention
189 | is effected by exercising rights under this License with respect to
190 | the covered work, and you disclaim any intention to limit operation or
191 | modification of the work as a means of enforcing, against the work's
192 | users, your or third parties' legal rights to forbid circumvention of
193 | technological measures.
194 |
195 | 4. Conveying Verbatim Copies.
196 |
197 | You may convey verbatim copies of the Program's source code as you
198 | receive it, in any medium, provided that you conspicuously and
199 | appropriately publish on each copy an appropriate copyright notice;
200 | keep intact all notices stating that this License and any
201 | non-permissive terms added in accord with section 7 apply to the code;
202 | keep intact all notices of the absence of any warranty; and give all
203 | recipients a copy of this License along with the Program.
204 |
205 | You may charge any price or no price for each copy that you convey,
206 | and you may offer support or warranty protection for a fee.
207 |
208 | 5. Conveying Modified Source Versions.
209 |
210 | You may convey a work based on the Program, or the modifications to
211 | produce it from the Program, in the form of source code under the
212 | terms of section 4, provided that you also meet all of these conditions:
213 |
214 | a) The work must carry prominent notices stating that you modified
215 | it, and giving a relevant date.
216 |
217 | b) The work must carry prominent notices stating that it is
218 | released under this License and any conditions added under section
219 | 7. This requirement modifies the requirement in section 4 to
220 | "keep intact all notices".
221 |
222 | c) You must license the entire work, as a whole, under this
223 | License to anyone who comes into possession of a copy. This
224 | License will therefore apply, along with any applicable section 7
225 | additional terms, to the whole of the work, and all its parts,
226 | regardless of how they are packaged. This License gives no
227 | permission to license the work in any other way, but it does not
228 | invalidate such permission if you have separately received it.
229 |
230 | d) If the work has interactive user interfaces, each must display
231 | Appropriate Legal Notices; however, if the Program has interactive
232 | interfaces that do not display Appropriate Legal Notices, your
233 | work need not make them do so.
234 |
235 | A compilation of a covered work with other separate and independent
236 | works, which are not by their nature extensions of the covered work,
237 | and which are not combined with it such as to form a larger program,
238 | in or on a volume of a storage or distribution medium, is called an
239 | "aggregate" if the compilation and its resulting copyright are not
240 | used to limit the access or legal rights of the compilation's users
241 | beyond what the individual works permit. Inclusion of a covered work
242 | in an aggregate does not cause this License to apply to the other
243 | parts of the aggregate.
244 |
245 | 6. Conveying Non-Source Forms.
246 |
247 | You may convey a covered work in object code form under the terms
248 | of sections 4 and 5, provided that you also convey the
249 | machine-readable Corresponding Source under the terms of this License,
250 | in one of these ways:
251 |
252 | a) Convey the object code in, or embodied in, a physical product
253 | (including a physical distribution medium), accompanied by the
254 | Corresponding Source fixed on a durable physical medium
255 | customarily used for software interchange.
256 |
257 | b) Convey the object code in, or embodied in, a physical product
258 | (including a physical distribution medium), accompanied by a
259 | written offer, valid for at least three years and valid for as
260 | long as you offer spare parts or customer support for that product
261 | model, to give anyone who possesses the object code either (1) a
262 | copy of the Corresponding Source for all the software in the
263 | product that is covered by this License, on a durable physical
264 | medium customarily used for software interchange, for a price no
265 | more than your reasonable cost of physically performing this
266 | conveying of source, or (2) access to copy the
267 | Corresponding Source from a network server at no charge.
268 |
269 | c) Convey individual copies of the object code with a copy of the
270 | written offer to provide the Corresponding Source. This
271 | alternative is allowed only occasionally and noncommercially, and
272 | only if you received the object code with such an offer, in accord
273 | with subsection 6b.
274 |
275 | d) Convey the object code by offering access from a designated
276 | place (gratis or for a charge), and offer equivalent access to the
277 | Corresponding Source in the same way through the same place at no
278 | further charge. You need not require recipients to copy the
279 | Corresponding Source along with the object code. If the place to
280 | copy the object code is a network server, the Corresponding Source
281 | may be on a different server (operated by you or a third party)
282 | that supports equivalent copying facilities, provided you maintain
283 | clear directions next to the object code saying where to find the
284 | Corresponding Source. Regardless of what server hosts the
285 | Corresponding Source, you remain obligated to ensure that it is
286 | available for as long as needed to satisfy these requirements.
287 |
288 | e) Convey the object code using peer-to-peer transmission, provided
289 | you inform other peers where the object code and Corresponding
290 | Source of the work are being offered to the general public at no
291 | charge under subsection 6d.
292 |
293 | A separable portion of the object code, whose source code is excluded
294 | from the Corresponding Source as a System Library, need not be
295 | included in conveying the object code work.
296 |
297 | A "User Product" is either (1) a "consumer product", which means any
298 | tangible personal property which is normally used for personal, family,
299 | or household purposes, or (2) anything designed or sold for incorporation
300 | into a dwelling. In determining whether a product is a consumer product,
301 | doubtful cases shall be resolved in favor of coverage. For a particular
302 | product received by a particular user, "normally used" refers to a
303 | typical or common use of that class of product, regardless of the status
304 | of the particular user or of the way in which the particular user
305 | actually uses, or expects or is expected to use, the product. A product
306 | is a consumer product regardless of whether the product has substantial
307 | commercial, industrial or non-consumer uses, unless such uses represent
308 | the only significant mode of use of the product.
309 |
310 | "Installation Information" for a User Product means any methods,
311 | procedures, authorization keys, or other information required to install
312 | and execute modified versions of a covered work in that User Product from
313 | a modified version of its Corresponding Source. The information must
314 | suffice to ensure that the continued functioning of the modified object
315 | code is in no case prevented or interfered with solely because
316 | modification has been made.
317 |
318 | If you convey an object code work under this section in, or with, or
319 | specifically for use in, a User Product, and the conveying occurs as
320 | part of a transaction in which the right of possession and use of the
321 | User Product is transferred to the recipient in perpetuity or for a
322 | fixed term (regardless of how the transaction is characterized), the
323 | Corresponding Source conveyed under this section must be accompanied
324 | by the Installation Information. But this requirement does not apply
325 | if neither you nor any third party retains the ability to install
326 | modified object code on the User Product (for example, the work has
327 | been installed in ROM).
328 |
329 | The requirement to provide Installation Information does not include a
330 | requirement to continue to provide support service, warranty, or updates
331 | for a work that has been modified or installed by the recipient, or for
332 | the User Product in which it has been modified or installed. Access to a
333 | network may be denied when the modification itself materially and
334 | adversely affects the operation of the network or violates the rules and
335 | protocols for communication across the network.
336 |
337 | Corresponding Source conveyed, and Installation Information provided,
338 | in accord with this section must be in a format that is publicly
339 | documented (and with an implementation available to the public in
340 | source code form), and must require no special password or key for
341 | unpacking, reading or copying.
342 |
343 | 7. Additional Terms.
344 |
345 | "Additional permissions" are terms that supplement the terms of this
346 | License by making exceptions from one or more of its conditions.
347 | Additional permissions that are applicable to the entire Program shall
348 | be treated as though they were included in this License, to the extent
349 | that they are valid under applicable law. If additional permissions
350 | apply only to part of the Program, that part may be used separately
351 | under those permissions, but the entire Program remains governed by
352 | this License without regard to the additional permissions.
353 |
354 | When you convey a copy of a covered work, you may at your option
355 | remove any additional permissions from that copy, or from any part of
356 | it. (Additional permissions may be written to require their own
357 | removal in certain cases when you modify the work.) You may place
358 | additional permissions on material, added by you to a covered work,
359 | for which you have or can give appropriate copyright permission.
360 |
361 | Notwithstanding any other provision of this License, for material you
362 | add to a covered work, you may (if authorized by the copyright holders of
363 | that material) supplement the terms of this License with terms:
364 |
365 | a) Disclaiming warranty or limiting liability differently from the
366 | terms of sections 15 and 16 of this License; or
367 |
368 | b) Requiring preservation of specified reasonable legal notices or
369 | author attributions in that material or in the Appropriate Legal
370 | Notices displayed by works containing it; or
371 |
372 | c) Prohibiting misrepresentation of the origin of that material, or
373 | requiring that modified versions of such material be marked in
374 | reasonable ways as different from the original version; or
375 |
376 | d) Limiting the use for publicity purposes of names of licensors or
377 | authors of the material; or
378 |
379 | e) Declining to grant rights under trademark law for use of some
380 | trade names, trademarks, or service marks; or
381 |
382 | f) Requiring indemnification of licensors and authors of that
383 | material by anyone who conveys the material (or modified versions of
384 | it) with contractual assumptions of liability to the recipient, for
385 | any liability that these contractual assumptions directly impose on
386 | those licensors and authors.
387 |
388 | All other non-permissive additional terms are considered "further
389 | restrictions" within the meaning of section 10. If the Program as you
390 | received it, or any part of it, contains a notice stating that it is
391 | governed by this License along with a term that is a further
392 | restriction, you may remove that term. If a license document contains
393 | a further restriction but permits relicensing or conveying under this
394 | License, you may add to a covered work material governed by the terms
395 | of that license document, provided that the further restriction does
396 | not survive such relicensing or conveying.
397 |
398 | If you add terms to a covered work in accord with this section, you
399 | must place, in the relevant source files, a statement of the
400 | additional terms that apply to those files, or a notice indicating
401 | where to find the applicable terms.
402 |
403 | Additional terms, permissive or non-permissive, may be stated in the
404 | form of a separately written license, or stated as exceptions;
405 | the above requirements apply either way.
406 |
407 | 8. Termination.
408 |
409 | You may not propagate or modify a covered work except as expressly
410 | provided under this License. Any attempt otherwise to propagate or
411 | modify it is void, and will automatically terminate your rights under
412 | this License (including any patent licenses granted under the third
413 | paragraph of section 11).
414 |
415 | However, if you cease all violation of this License, then your
416 | license from a particular copyright holder is reinstated (a)
417 | provisionally, unless and until the copyright holder explicitly and
418 | finally terminates your license, and (b) permanently, if the copyright
419 | holder fails to notify you of the violation by some reasonable means
420 | prior to 60 days after the cessation.
421 |
422 | Moreover, your license from a particular copyright holder is
423 | reinstated permanently if the copyright holder notifies you of the
424 | violation by some reasonable means, this is the first time you have
425 | received notice of violation of this License (for any work) from that
426 | copyright holder, and you cure the violation prior to 30 days after
427 | your receipt of the notice.
428 |
429 | Termination of your rights under this section does not terminate the
430 | licenses of parties who have received copies or rights from you under
431 | this License. If your rights have been terminated and not permanently
432 | reinstated, you do not qualify to receive new licenses for the same
433 | material under section 10.
434 |
435 | 9. Acceptance Not Required for Having Copies.
436 |
437 | You are not required to accept this License in order to receive or
438 | run a copy of the Program. Ancillary propagation of a covered work
439 | occurring solely as a consequence of using peer-to-peer transmission
440 | to receive a copy likewise does not require acceptance. However,
441 | nothing other than this License grants you permission to propagate or
442 | modify any covered work. These actions infringe copyright if you do
443 | not accept this License. Therefore, by modifying or propagating a
444 | covered work, you indicate your acceptance of this License to do so.
445 |
446 | 10. Automatic Licensing of Downstream Recipients.
447 |
448 | Each time you convey a covered work, the recipient automatically
449 | receives a license from the original licensors, to run, modify and
450 | propagate that work, subject to this License. You are not responsible
451 | for enforcing compliance by third parties with this License.
452 |
453 | An "entity transaction" is a transaction transferring control of an
454 | organization, or substantially all assets of one, or subdividing an
455 | organization, or merging organizations. If propagation of a covered
456 | work results from an entity transaction, each party to that
457 | transaction who receives a copy of the work also receives whatever
458 | licenses to the work the party's predecessor in interest had or could
459 | give under the previous paragraph, plus a right to possession of the
460 | Corresponding Source of the work from the predecessor in interest, if
461 | the predecessor has it or can get it with reasonable efforts.
462 |
463 | You may not impose any further restrictions on the exercise of the
464 | rights granted or affirmed under this License. For example, you may
465 | not impose a license fee, royalty, or other charge for exercise of
466 | rights granted under this License, and you may not initiate litigation
467 | (including a cross-claim or counterclaim in a lawsuit) alleging that
468 | any patent claim is infringed by making, using, selling, offering for
469 | sale, or importing the Program or any portion of it.
470 |
471 | 11. Patents.
472 |
473 | A "contributor" is a copyright holder who authorizes use under this
474 | License of the Program or a work on which the Program is based. The
475 | work thus licensed is called the contributor's "contributor version".
476 |
477 | A contributor's "essential patent claims" are all patent claims
478 | owned or controlled by the contributor, whether already acquired or
479 | hereafter acquired, that would be infringed by some manner, permitted
480 | by this License, of making, using, or selling its contributor version,
481 | but do not include claims that would be infringed only as a
482 | consequence of further modification of the contributor version. For
483 | purposes of this definition, "control" includes the right to grant
484 | patent sublicenses in a manner consistent with the requirements of
485 | this License.
486 |
487 | Each contributor grants you a non-exclusive, worldwide, royalty-free
488 | patent license under the contributor's essential patent claims, to
489 | make, use, sell, offer for sale, import and otherwise run, modify and
490 | propagate the contents of its contributor version.
491 |
492 | In the following three paragraphs, a "patent license" is any express
493 | agreement or commitment, however denominated, not to enforce a patent
494 | (such as an express permission to practice a patent or covenant not to
495 | sue for patent infringement). To "grant" such a patent license to a
496 | party means to make such an agreement or commitment not to enforce a
497 | patent against the party.
498 |
499 | If you convey a covered work, knowingly relying on a patent license,
500 | and the Corresponding Source of the work is not available for anyone
501 | to copy, free of charge and under the terms of this License, through a
502 | publicly available network server or other readily accessible means,
503 | then you must either (1) cause the Corresponding Source to be so
504 | available, or (2) arrange to deprive yourself of the benefit of the
505 | patent license for this particular work, or (3) arrange, in a manner
506 | consistent with the requirements of this License, to extend the patent
507 | license to downstream recipients. "Knowingly relying" means you have
508 | actual knowledge that, but for the patent license, your conveying the
509 | covered work in a country, or your recipient's use of the covered work
510 | in a country, would infringe one or more identifiable patents in that
511 | country that you have reason to believe are valid.
512 |
513 | If, pursuant to or in connection with a single transaction or
514 | arrangement, you convey, or propagate by procuring conveyance of, a
515 | covered work, and grant a patent license to some of the parties
516 | receiving the covered work authorizing them to use, propagate, modify
517 | or convey a specific copy of the covered work, then the patent license
518 | you grant is automatically extended to all recipients of the covered
519 | work and works based on it.
520 |
521 | A patent license is "discriminatory" if it does not include within
522 | the scope of its coverage, prohibits the exercise of, or is
523 | conditioned on the non-exercise of one or more of the rights that are
524 | specifically granted under this License. You may not convey a covered
525 | work if you are a party to an arrangement with a third party that is
526 | in the business of distributing software, under which you make payment
527 | to the third party based on the extent of your activity of conveying
528 | the work, and under which the third party grants, to any of the
529 | parties who would receive the covered work from you, a discriminatory
530 | patent license (a) in connection with copies of the covered work
531 | conveyed by you (or copies made from those copies), or (b) primarily
532 | for and in connection with specific products or compilations that
533 | contain the covered work, unless you entered into that arrangement,
534 | or that patent license was granted, prior to 28 March 2007.
535 |
536 | Nothing in this License shall be construed as excluding or limiting
537 | any implied license or other defenses to infringement that may
538 | otherwise be available to you under applicable patent law.
539 |
540 | 12. No Surrender of Others' Freedom.
541 |
542 | If conditions are imposed on you (whether by court order, agreement or
543 | otherwise) that contradict the conditions of this License, they do not
544 | excuse you from the conditions of this License. If you cannot convey a
545 | covered work so as to satisfy simultaneously your obligations under this
546 | License and any other pertinent obligations, then as a consequence you may
547 | not convey it at all. For example, if you agree to terms that obligate you
548 | to collect a royalty for further conveying from those to whom you convey
549 | the Program, the only way you could satisfy both those terms and this
550 | License would be to refrain entirely from conveying the Program.
551 |
552 | 13. Use with the GNU Affero General Public License.
553 |
554 | Notwithstanding any other provision of this License, you have
555 | permission to link or combine any covered work with a work licensed
556 | under version 3 of the GNU Affero General Public License into a single
557 | combined work, and to convey the resulting work. The terms of this
558 | License will continue to apply to the part which is the covered work,
559 | but the special requirements of the GNU Affero General Public License,
560 | section 13, concerning interaction through a network will apply to the
561 | combination as such.
562 |
563 | 14. Revised Versions of this License.
564 |
565 | The Free Software Foundation may publish revised and/or new versions of
566 | the GNU General Public License from time to time. Such new versions will
567 | be similar in spirit to the present version, but may differ in detail to
568 | address new problems or concerns.
569 |
570 | Each version is given a distinguishing version number. If the
571 | Program specifies that a certain numbered version of the GNU General
572 | Public License "or any later version" applies to it, you have the
573 | option of following the terms and conditions either of that numbered
574 | version or of any later version published by the Free Software
575 | Foundation. If the Program does not specify a version number of the
576 | GNU General Public License, you may choose any version ever published
577 | by the Free Software Foundation.
578 |
579 | If the Program specifies that a proxy can decide which future
580 | versions of the GNU General Public License can be used, that proxy's
581 | public statement of acceptance of a version permanently authorizes you
582 | to choose that version for the Program.
583 |
584 | Later license versions may give you additional or different
585 | permissions. However, no additional obligations are imposed on any
586 | author or copyright holder as a result of your choosing to follow a
587 | later version.
588 |
589 | 15. Disclaimer of Warranty.
590 |
591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599 |
600 | 16. Limitation of Liability.
601 |
602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610 | SUCH DAMAGES.
611 |
612 | 17. Interpretation of Sections 15 and 16.
613 |
614 | If the disclaimer of warranty and limitation of liability provided
615 | above cannot be given local legal effect according to their terms,
616 | reviewing courts shall apply local law that most closely approximates
617 | an absolute waiver of all civil liability in connection with the
618 | Program, unless a warranty or assumption of liability accompanies a
619 | copy of the Program in return for a fee.
620 |
621 | END OF TERMS AND CONDITIONS
622 |
623 | How to Apply These Terms to Your New Programs
624 |
625 | If you develop a new program, and you want it to be of the greatest
626 | possible use to the public, the best way to achieve this is to make it
627 | free software which everyone can redistribute and change under these terms.
628 |
629 | To do so, attach the following notices to the program. It is safest
630 | to attach them to the start of each source file to most effectively
631 | state the exclusion of warranty; and each file should have at least
632 | the "copyright" line and a pointer to where the full notice is found.
633 |
634 |
635 | Copyright (C) <year>  <name of author>
636 |
637 | This program is free software: you can redistribute it and/or modify
638 | it under the terms of the GNU General Public License as published by
639 | the Free Software Foundation, either version 3 of the License, or
640 | (at your option) any later version.
641 |
642 | This program is distributed in the hope that it will be useful,
643 | but WITHOUT ANY WARRANTY; without even the implied warranty of
644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
645 | GNU General Public License for more details.
646 |
647 | You should have received a copy of the GNU General Public License
648 | along with this program. If not, see <https://www.gnu.org/licenses/>.
649 |
650 | Also add information on how to contact you by electronic and paper mail.
651 |
652 | If the program does terminal interaction, make it output a short
653 | notice like this when it starts in an interactive mode:
654 |
655 | <program>  Copyright (C) <year>  <name of author>
656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657 | This is free software, and you are welcome to redistribute it
658 | under certain conditions; type `show c' for details.
659 |
660 | The hypothetical commands `show w' and `show c' should show the appropriate
661 | parts of the General Public License. Of course, your program's commands
662 | might be different; for a GUI interface, you would use an "about box".
663 |
664 | You should also get your employer (if you work as a programmer) or school,
665 | if any, to sign a "copyright disclaimer" for the program, if necessary.
666 | For more information on this, and how to apply and follow the GNU GPL, see
667 | <https://www.gnu.org/licenses/>.
668 |
669 | The GNU General Public License does not permit incorporating your program
670 | into proprietary programs. If your program is a subroutine library, you
671 | may consider it more useful to permit linking proprietary applications with
672 | the library. If this is what you want to do, use the GNU Lesser General
673 | Public License instead of this License. But first, please read
674 | <https://www.gnu.org/licenses/why-not-lgpl.html>.
675 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # DeepPhasePick
2 |
3 | DeepPhasePick (DPP) is a method for automatically detecting and picking seismic phases from local earthquakes based on highly optimized deep neural networks.
4 | The method works as a pipeline: in a first stage, phase detection is performed by a Convolutional Neural Network (CNN) on three-component seismograms.
5 | Then P- and S-phase picking is conducted by two Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNN), on the vertical and the two horizontal components, respectively.
6 | The CNN and LSTM networks have been trained using >30,000 seismic records extracted from manually-picked event waveforms originating from northern Chile.
7 | DPP additionally computes uncertainties of the predicted phase time onsets, based on the Monte Carlo Dropout (MCD) technique as an approximation of Bayesian inference.
8 | Predicted phase time onsets and associated uncertainties generated by DPP can be used to feed a phase associator algorithm as part of an automatic earthquake location procedure.
9 |
10 | ## 1. Install
11 |
12 | An easy and straightforward way to install DPP is first to directly clone the public repository:
13 |
14 | ~~~bash
15 | git clone https://github.com/hsotoparada/DeepPhasePick
16 | cd DeepPhasePick
17 | ~~~
18 |
19 | Then, the DPP requirements can be installed manually into a dedicated conda environment, or all at once by running:
20 |
21 | ~~~bash
22 | conda env create -f dpp.yml
23 | conda activate dpp
24 | ~~~
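
Alternatively, if you prefer pip, the pinned dependencies listed in `requirements.txt` can be installed into an environment you have already created and activated. This is only a sketch of the manual route; check that the pinned versions are compatible with your Python setup:

~~~bash
# install the pinned DPP dependencies with pip into the active environment
pip install -r requirements.txt
~~~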
25 |
26 | ## 2. DPP Workflow
27 |
28 | ### 1. Configuration
29 |
30 | Before running DPP, the method needs to be configured by creating an instance of the class `Config()`, for example using:
31 |
32 | import config, data, model, util
33 | dpp_config = config.Config()
34 |
35 | Then, parameters controlling different stages in the method can be configured as described below.
36 |
37 | **1.1 Parameters determining the selected waveform data on which DeepPhasePick is
38 | applied are defined using `dpp_config.set_data()`.**
39 |
40 | For example, to select the waveforms from stations `PB01` and `PB02` (network `CX`), and channel `HH` which are stored in
41 | the archive directory `archive`, and save the results in directory `out`, run:
42 |
43 | dpp_config.set_data(stas=['PB01', 'PB02'], net='CX', ch='HH', archive='archive', opath='out')
44 |
45 | **1.2 Parameters controlling how seismic waveforms are processed before the phase detection stage are defined using `dpp_config.set_data_params()`.**
46 |
47 | For example, the following will apply a highpass filter (> .5 Hz) and resample the data to 100 Hz (if it is not already sampled at that sampling rate):
48 |
49 | dpp_config.set_data_params(samp_freq=100., st_filter='highpass', filter_opts={'freq': .5})
50 |
51 | Note that, since the models in DPP were trained on non-filtered data, filtering the waveforms may cause numerous false positive predictions.
52 |
53 | **1.3 DPP will be applied on the selected seismic data (defined through `set_data()`) in the time windows defined using `dpp_config.set_time()`.**
54 |
55 | For example, to create 30-min (1800-seconds) time windows in the period between
56 | `2015-04-03T00:00:00` and `2015-04-03T02:00:00` (2 hours), use:
57 |
58 | dpp_config.set_time(dt_iter=1800., tstart="2015-04-03T00:00:00", tend="2015-04-03T02:00:00")
59 |
60 | Note that all windows have the same duration except for the last one, which is filled with the remaining data up to `tend` whenever
61 | `tstart(last window) + dt_iter > tend`. In the example above, the two hours are split into four full 30-min windows; with `tend="2015-04-03T01:45:00"` the last window would instead cover only the final 15 minutes.
62 |
63 | **1.4 Parameters determining how predicted discrete probability time series are computed when running phase detection on seismic waveforms are defined using `dpp_config.set_trigger()`.**
64 |
65 | For example, the following will compute the discrete probability time series every 20 samples, using a probability threshold of 0.95 for P- and S-phases:
66 |
67 | dpp_config.set_trigger(n_shift=20, pthres_p=[0.95, 0.001], pthres_s=[0.95, 0.001])
68 |
69 | **1.5 Parameters controlling the optional conditions applied for refining preliminary picks obtained from phase detection are defined using `dpp_config.set_picking()`.**
70 |
71 | For example, the following will remove preliminary picks that are presumed false positives, by applying all four optional conditions described in
72 | Text S1 of the Supplementary Material of Soto and Schurr (2021).
73 | This is the default and recommended option, especially when dealing with very noisy or filtered seismic waveforms, which may increase the number of presumed false positives.
74 |
75 | Refined pick onsets and their time uncertainties will then be computed by applying 20 iterations of Monte Carlo Dropout:
76 |
77 | dpp_config.set_picking(run_mcd=True, mcd_iter=20)
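
The four conditions and their tuning parameters can also be set explicitly. As a sketch, the following call matches the default values defined in `config.py` (see `_set_default_picking()`), with the Monte Carlo Dropout iterations raised to 20:

    dpp_config.set_picking(op_conds=['1', '2', '3', '4'], tp_th_add=1.5, dt_sp_near=2., dt_ps_max=35., dt_sdup_max=2., run_mcd=True, mcd_iter=20)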
78 |
79 | More details on the arguments accepted by each of these configuration functions can be seen from the corresponding function documentation.
80 |
81 | Note that, instead of configuring DPP by using the functions described above, each set of parameters can be passed as a dictionary to `config.Config()`.
82 | See the class `Config()` documentation to use this approach.
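
As a sketch of that alternative, the dictionary keys mirror the arguments of the corresponding `set_*()` methods (see `config.py`), so the configuration above could also be written as:

    dpp_config = config.Config(
        dct_data={'stas': ['PB01', 'PB02'], 'net': 'CX', 'ch': 'HH', 'archive': 'archive', 'opath': 'out'},
        dct_time={'dt_iter': 1800., 'tstart': "2015-04-03T00:00:00", 'tend': "2015-04-03T02:00:00"},
        dct_trigger={'n_shift': 20, 'pthres_p': [0.95, 0.001], 'pthres_s': [0.95, 0.001]},
        dct_picking={'run_mcd': True, 'mcd_iter': 20},
    )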
83 |
84 | ### 2. Seismic Data
85 |
86 | The DPP method is applied to three-component MiniSEED seismic waveforms.
87 |
88 | To read the seismic waveforms into DPP, an instance of the class `Data()` needs to be created, for example using:
89 |
90 | dpp_data = data.Data()
91 |
92 | Then, the data can be read into DPP for example from a local archive directory using:
93 |
94 | dpp_data.read_from_archive(dpp_config)
95 |
96 | The local archive needs to have the following commonly used structure: `archive/YY/NET/STA/CH`
97 |
98 | Here `YY` is the year, `NET` the network code, `STA` the station code, and `CH` the channel code (e.g., HHZ.D) of the seismic streams.
99 | An example of archived data is included in `sample_data/archive`.
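
For instance, reading the included sample archive for the two stations above could look like this (a sketch, reusing the configuration functions from section 1):

    dpp_config.set_data(stas=['PB01', 'PB02'], net='CX', ch='HH', archive='sample_data/archive', opath='out')
    dpp_data = data.Data()
    dpp_data.read_from_archive(dpp_config)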
100 |
101 | Alternatively, waveforms can be read from a local directory with no specific structure, for example using:
102 |
103 | dpp_data.read_from_directory(dpp_config)
104 |
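
For example, pointing the `archive` argument at one of the unstructured sample directories might look like the following sketch (it assumes, as the `set_data()` documentation indicates, that the same `archive` argument is used for structured and unstructured sources):

    dpp_config.set_data(stas=['PB01', 'PB02'], net='CX', ch='HH', archive='sample_data/CX_20140301', opath='out')
    dpp_data.read_from_directory(dpp_config)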
105 |
106 | ### 3. Phase Detection and Picking
107 |
108 | In order to run the phase detection and picking stages, an instance of the class `Model()` needs to be created, for example using:
109 |
110 | dpp_model = model.Model()
111 |
112 | When calling `Model()`, particular model versions can be specified by the string parameters `version_det`, `version_pick_P`, `version_pick_S`.
113 |
114 | Available model versions (more might be added in the future):
115 |
116 | * `version_det = "20201002"`:
117 |
best optimized phase detection model described in Soto and Schurr (2021).
118 | This is the default value for `version_det`.
119 |
120 | * `version_pick_P = version_pick_S = "20201002_1"`:
121 |
best optimized P- and S-phase picking models described in Soto and Schurr (2021).
122 | This is the default value for `version_pick_P` and `version_pick_S`.
123 |
124 | * `version_pick_P = version_pick_S = "20201002_2"`:
125 |
best optimized picking models, which were trained using 2x (for the P phase) and 3x (for the S phase) the number of shifted seismic records used in version `20201002_1`,
126 | hence enhancing the phase-picking performance.
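
For example, loading the default detection model together with the `20201002_2` picking models could look like this (a sketch, assuming the version strings are passed as the keyword arguments named above):

    dpp_model = model.Model(version_det="20201002", version_pick_P="20201002_2", version_pick_S="20201002_2")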
127 |
128 | Once the models are read into DPP, model information can be retrieved for example by using:
129 |
130 | print(dpp_model.model_detection['best_model'].summary())
131 | print(dpp_model.model_picking_P['best_model'].summary())
132 | print(dpp_model.model_picking_S['best_model'].summary())
133 |
134 | **3.1 To run the phase detection on the selected seismic waveforms use:**
135 |
136 | dpp_model.run_detection(dpp_config, dpp_data)
137 |
138 | This will compute discrete class probability time series from predictions, which are used to obtain preliminary phase picks.
139 |
140 | The optional parameter `save_dets = True` (default is `False`) will save a dictionary containing the class probabilities and preliminary picks to `opath/*/pick_stats` if needed for further use.
141 | Here `opath` is the output directory defined in the DPP configuration (see function `set_data()`).
142 |
143 | The optional parameter `save_data = True` (default is `False`) will save a dictionary containing the seismic waveform data used for phase detection to the same directory.
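
For example, to run the detection and keep both dictionaries for later inspection (a sketch):

    dpp_model.run_detection(dpp_config, dpp_data, save_dets=True, save_data=True)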
144 |
145 | **3.2 Next the phase picking can be run to refine the preliminary picks, using:**
146 |
147 | dpp_model.run_picking(dpp_config, dpp_data)
148 |
149 | The optional parameter `save_plots = True` (default is `True`) will save figures of individual predicted phase onsets to `opath/*/pick_plots` if `run_mcd=True`.
150 | These figures are similar to the subplots in Figure 3 of Soto and Schurr (2021).
151 |
152 | The optional parameter `save_picks = True` (default is `False`) will save a dictionary containing relevant information of preliminary and refined phase picks to `opath/*/pick_stats`.
153 |
154 | The optional parameter `save_stats = True` (default is `True`) will save statistics of predicted phase onsets to the output file `opath/*/pick_stats/pick_stats`.
155 |
If `run_mcd=False`, the output file will contain the following 4 columns:
156 |
157 | `station, phase (P or S), pick number, detection probability, tons (preliminary; UTC)`
158 |
159 | If `run_mcd=True`, the output file will contain the previous columns plus the following additional columns with the results from the MCD iterations:
160 |
161 | `tons (refined; UTC), tons (preliminary; within picking window) [s], tons (refined; within picking window) [s],
162 | tons_err (before onset) [s], tons_err (after onset) [s], pick class, pb, pb_std`
163 |
164 | Here `tons` is the predicted phase time onset with uncertainty `tons_err` and class `pick class`.
165 | These fields, as well as `pb` and `pb_std`, are described in Figure 3 of Soto and Schurr (2021).
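
For example, to run the picking stage and save the figures, the pick dictionaries and the statistics file described above (a sketch):

    dpp_model.run_picking(dpp_config, dpp_data, save_plots=True, save_picks=True, save_stats=True)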
166 |
167 |
168 | ### 4. Plotting predicted P and S phases
169 |
170 | Figures including continuous waveforms together with predicted P and S phases can be created using:
171 |
172 | util.plot_predicted_phases(dpp_config, dpp_data, dpp_model)
173 |
174 | Three additional optional parameters in this function allow modifying the figure layout (see the function documentation).
175 | The parameter `plot_comps` defines which seismogram components are plotted.
176 | The parameter `plot_probs` defines which class probability time series are plotted.
177 | Finally, the parameter `shift_probs` controls whether the plotted probability time series are shifted in time,
178 | according to the optimized hyperparameter values defining the picking window for each class (see Figure S1 in Soto and Schurr, 2021).
179 |
180 | For example, the following will plot the predicted picks on the vertical ('Z') and north ('N') seismogram components,
181 | and the probability time series for P- and S-phase classes shifted in time as described above:
182 |
183 | util.plot_predicted_phases(dpp_config, dpp_data, dpp_model, plot_comps=['Z','N'], plot_probs=['P','S'], shift_probs=True)
184 |
185 |
186 | ## Reference:
187 |
188 | - Soto, H., and Schurr, B. DeepPhasePick: A method for detecting and picking seismic phases from local earthquakes based on highly
189 | optimized convolutional and recurrent deep neural networks. Geophysical Journal International (2021). https://doi.org/10.1093/gji/ggab266
190 |
191 |
192 | ## Thanks:
193 |
194 | The development of DeepPhasePick method has received financial support from
195 |
196 | - The HAzard and Risk Team (HART) initiative of the GFZ German Research Centre for Geosciences in collaboration with the Institute of GeoSciences, Energy, Water
197 | and Environment of the Polytechnic University Tirana, Albania and the KIT Karlsruhe Institute of Technology.
198 |
199 |
--------------------------------------------------------------------------------
/config.py:
--------------------------------------------------------------------------------
1 | # coding: utf-8
2 |
3 | """
4 | This module contains a class and methods that help to configure the behavior of DeepPhasePick method.
5 |
6 | Author: Hugo Soto Parada (October, 2020)
7 | Contact: soto@gfz-potsdam.de, hugosotoparada@gmail.com
8 |
9 | """
10 |
11 | import obspy.core as oc
12 | from datetime import datetime
13 | import re, sys, os, shutil, gc
14 |
15 |
16 | class Config():
17 | """
18 | Class that initiates user configuration for selecting seismic data and defining how this data is processed in DeepPhasePick.
19 |
20 | Parameters
21 | ----------
22 | dct_data: dict, optional
23 | dictionary with parameters defining archived waveform data on which DeepPhasePick is applied.
24 | See parameters details in method set_data().
25 | dct_data_params: dict, optional
26 | dictionary with parameters defining how seismic waveforms are processed before phase detection.
27 | See parameters details in method set_data_params().
28 | dct_time: dict, optional
29 | dictionary with parameters defining time windows over which DeepPhasePick is applied.
30 | See parameters details in method set_time().
31 | dct_trigger: dict, optional
32 | dictionary with parameters defining how predicted discrete probability time series are computed when running phase detection on seismic waveforms.
33 | See parameters details in method set_trigger().
34 | dct_picking: dict, optional
35 | dictionary with parameters applied in optional conditions for improving preliminary picks obtained from phase detection.
36 | See parameters details in method set_picking().
37 | """
38 |
39 | def __init__(self, dct_data=None, dct_data_params=None, dct_time=None, dct_trigger=None, dct_picking=None):
40 |
41 | self.data = self._set_default_data(dct_data)
42 | self.data_params = self._set_default_data_params(dct_data_params)
43 | self.time = self._set_default_time(dct_time)
44 | self.trigger = self._set_default_trigger(dct_trigger)
45 | self.picking = self._set_default_picking(dct_picking)
46 |
47 |
48 | def _set_default_data(self, dct_data):
49 | """
50 | Set default parameters defining archived waveform data on which DeepPhasePick is applied.
51 |
52 | Returns
53 | -------
54 | dct: dict
55 | dictionary with defined parameters. See parameters details in method set_data().
56 | """
57 |
58 | dct = {
59 | 'stas': [],
60 | 'ch': 'HH',
61 | 'net': '',
62 | 'archive': 'archive',
63 | 'opath': 'out',
64 | }
65 |
66 | if dct_data is not None:
67 | for k in dct:
68 | if k in dct_data:
69 | dct[k] = dct_data[k]
70 |
71 | return dct
72 |
73 |
74 | def _set_default_data_params(self, dct_data_params):
75 | """
76 | Set default parameters defining how seismic waveforms are processed before phase detection.
77 |
78 | Returns
79 | -------
80 | dct: dict
81 | dictionary with defined parameters. See parameters details in method set_data_params().
82 | """
83 |
84 | dct = {
85 | 'samp_freq': 100.,
86 | 'st_detrend': True,
87 | 'st_resample': True,
88 | 'st_filter': None,
89 | 'filter_opts': {},
90 | }
91 |
92 | if dct_data_params is not None:
93 | for k in dct:
94 | if k in dct_data_params:
95 | dct[k] = dct_data_params[k]
96 |
97 | return dct
98 |
99 |
100 | def _set_default_time(self, dct_time):
101 | """
102 | Set parameters defining time windows over which DeepPhasePick is applied.
103 |
104 | Returns
105 | -------
106 | dct: dict
107 | dictionary with defined parameters. See parameters details in method set_time().
108 | """
109 |
110 | dct = {
111 | 'dt_iter': 3600.,
112 | 'tstart': oc.UTCDateTime(0),
113 | 'tend': oc.UTCDateTime(3600),
114 | }
115 |
116 | if dct_time is not None:
117 | for k in dct:
118 | if k in dct_time:
119 | if k in ['tstart', 'tend']:
120 | dct[k] = oc.UTCDateTime(dct_time[k])
121 | else:
122 | dct[k] = dct_time[k]
123 |
124 | return dct
125 |
126 |
127 | def _set_default_trigger(self, dct_trigger):
128 | """
129 | Set default parameters defining how predicted discrete probability time series are computed when running phase detection on seismic waveforms.
130 |
131 | Returns
132 | -------
133 | dct: dict
134 | dictionary with defined parameters. See parameters details in method set_trigger().
135 | """
136 |
137 | dct = {
138 | 'n_shift': 10, 'pthres_p': [0.9, .001], 'pthres_s': [0.9, .001], 'max_trig_len': [9e99, 9e99],
139 | }
140 |
141 | if dct_trigger is not None:
142 | for k in dct:
143 | if k in dct_trigger:
144 | dct[k] = dct_trigger[k]
145 |
146 | return dct
147 |
148 |
149 | def _set_default_picking(self, dct_picking):
150 | """
151 | Set default parameters applied in optional conditions for improving preliminary picks obtained from phase detection.
152 |
153 | Returns
154 | -------
155 | dct: dict
156 | dictionary with defined parameters. See parameters details in method set_picking().
157 | """
158 |
159 | dct = {
160 | 'op_conds': ['1', '2', '3', '4'],
161 | 'tp_th_add': 1.5,
162 | 'dt_sp_near': 2.,
163 | 'dt_ps_max': 35.,
164 | 'dt_sdup_max': 2.,
165 | #
166 | 'run_mcd': True,
167 | 'mcd_iter': 10,
168 | }
169 |
170 | if dct_picking is not None:
171 | for k in dct:
172 | if k in dct_picking:
173 | dct[k] = dct_picking[k]
174 |
175 | return dct
176 |
177 |
178 | def set_data(self, stas, ch, net, archive, opath='out'):
179 | """
180 | Set parameters defining archived waveform data on which DeepPhasePick is applied.
181 |
182 | Parameters
183 | ----------
184 | stas: list of str
185 | stations from which waveform data are used.
186 | ch: str
187 | channel code of selected waveforms.
188 | net: str
189 | network code of selected stations.
190 | archive: str
191 | path to the structured or unstructured archive where waveforms are read from.
192 | opath: str, optional
193 | output path where results are stored.
194 | """
195 | self.data = {
196 | 'stas': stas,
197 | 'ch': ch,
198 | 'net': net,
199 | 'archive': archive,
200 | 'opath': opath,
201 | }
202 |
203 |
204 | def set_data_params(self, samp_freq=100., st_detrend=True, st_resample=True, st_filter=None, filter_opts={}):
205 | """
206 | Set parameters defining how seismic waveforms are processed before phase detection.
207 |
208 | Parameters
209 | ----------
210 | samp_freq: float, optional
211 | sampling rate [Hz] at which the seismic waveforms will be resampled.
212 | st_detrend: bool, optional
213 | If True, detrend (linear) waveforms on which phase detection is performed.
214 | st_resample: bool, optional
215 | If True, resample waveforms on which phase detection is performed at samp_freq.
216 | st_filter: str, optional
217 | type of filter applied to waveforms on which phase detection is performed. If None, no filter is applied.
218 | See obspy.core.stream.Stream.filter.
219 | filter_opts: dict, optional
220 | Necessary keyword arguments for the respective filter that will be passed on (e.g. freqmin=1.0, freqmax=20.0 for st_filter="bandpass").
221 | See obspy.core.stream.Stream.filter.
222 |
223 | """
224 |
225 | self.data_params = {
226 | 'samp_freq': samp_freq,
227 | 'st_detrend': st_detrend,
228 | 'st_resample': st_resample,
229 | 'st_filter': st_filter,
230 | 'filter_opts': filter_opts,
231 | }
232 |
233 |
234 | def set_time(self, dt_iter, tstart, tend):
235 | """
236 | Set parameters defining time windows over which DeepPhasePick is applied.
237 |
238 | Parameters
239 | ----------
240 | dt_iter: float
241 | time step (in seconds) between consecutive time windows.
242 | tstart: str
243 | start time to define time windows, in format "YYYY-MM-DDTHH:MM:SS".
244 | tend: str
245 | end time to define time windows, in format "YYYY-MM-DDTHH:MM:SS".
246 |
247 | """
248 | self.time = {
249 | 'dt_iter': dt_iter,
250 | 'tstart': oc.UTCDateTime(tstart),
251 | 'tend': oc.UTCDateTime(tend),
252 | }
253 |
254 |
255 | def set_trigger(self, n_shift=10, pthres_p=[0.9,.001], pthres_s=[0.9,.001], max_trig_len=[9e99, 9e99]):
256 | """
257 | Set parameters defining how predicted discrete probability time series are computed when running phase detection on seismic waveforms.
258 |
259 | Parameters
260 | ----------
261 | n_shift: int, optional
262 | step size (in samples) defining discrete probability time series.
263 | pthres_p: list of float, optional
264 | probability thresholds defining P-phase trigger on (pthres_p[0]) and off (pthres_p[1]) times.
265 | See thres1 and thres2 parameters in function obspy.signal.trigger.trigger_onset.
266 | pthres_s: list of float, optional
267 | probability thresholds defining S-phase trigger on (pthres_s[0]) and off (pthres_s[1]) times.
268 | See thres1 and thres2 parameters in function obspy.signal.trigger.trigger_onset.
269 | max_trig_len: list of int, optional
270 | maximum lengths (in samples) of triggered P (max_trig_len[0]) and S (max_trig_len[1]) phase.
271 | See max_len parameter in function obspy.signal.trigger.trigger_onset.
272 | """
273 |
274 | self.trigger = {
275 | 'n_shift': n_shift,
276 | 'pthres_p': pthres_p,
277 | 'pthres_s': pthres_s,
278 | 'max_trig_len': max_trig_len,
279 | }
280 |
281 |
282 | def set_picking(self, op_conds=['1','2','3','4'], tp_th_add=1.5, dt_sp_near=2., dt_ps_max=35., dt_sdup_max=2., run_mcd=True, mcd_iter=10):
283 | """
284 | Set parameters applied in optional conditions for refining preliminary picks obtained from phase detection.
285 |
286 | Parameters
287 | ----------
288 | op_conds: list of str, optional
289 | optional conditions that are applied to preliminary picks, in order to keep/remove presumed true/false preliminary onsets.
290 | These conditions are explained in Supplementary Information of the original publication (https://doi.org/10.31223/X5BC8B).
291 | For example ['1', '2'] indicates that only conditions (1) and (2) are applied.
292 | '1': resolves between P and S phases predicted close in time, with overlapping probability time series.
293 | '2': resolves between P and S phases predicted close in time, with no overlapping probability distributions.
294 | '3': discards S picks for which there are no earlier predicted P or P-S picks.
295 | '4': resolves between possible duplicated S phases.
296 | tp_th_add: float, optional
297 | time (in seconds) added to define search time intervals in condition (1).
298 | dt_sp_near: float, optional
299 | time threshold (in seconds) used in condition (2).
300 | dt_ps_max: float, optional
301 | time (in seconds) used to define search time intervals in condition (3).
302 | dt_sdup_max: float, optional
303 | time threshold (in seconds) used in condition (4).
304 | run_mcd: bool, optional
305 | If True, run phase picking in order to refine preliminary picks from phase detection.
306 | mcd_iter: int, optional
307 | number of Monte Carlo Dropout iterations used in phase picking.
308 | """
309 |
310 | self.picking = {
311 | 'op_conds': op_conds,
312 | 'tp_th_add': tp_th_add,
313 | 'dt_sp_near': dt_sp_near,
314 | 'dt_ps_max': dt_ps_max,
315 | 'dt_sdup_max': dt_sdup_max,
316 | 'run_mcd': run_mcd,
317 | 'mcd_iter': mcd_iter,
318 | }
319 |
--------------------------------------------------------------------------------
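A minimal usage sketch of the configuration class above (not part of config.py; it assumes the repository root is the working directory, as in the examples further below). Each set_* call rebuilds its whole parameter dictionary, so any argument that is not passed falls back to the default documented in the method signature:

# sketch: configure trigger and picking parameters, keeping defaults elsewhere
import config

cfg = config.Config()                                # all parameter groups start at their defaults
cfg.set_trigger(pthres_p=[0.95, 0.001])              # n_shift, pthres_s, max_trig_len keep their defaults
cfg.set_picking(op_conds=['1', '2'], run_mcd=False)  # apply only conditions (1) and (2), no Monte Carlo Dropout refinement

print(cfg.trigger['pthres_p'])   # [0.95, 0.001]
print(cfg.trigger['pthres_s'])   # [0.9, 0.001]
print(cfg.picking['mcd_iter'])   # 10 (default, since it was not passed)
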
/data.py:
--------------------------------------------------------------------------------
1 | # coding: utf-8
2 |
3 | """
4 | This module contains a class and methods related to the seismic data used by the DeepPhasePick method.
5 |
6 | Author: Hugo Soto Parada (October, 2020)
7 | Contact: soto@gfz-potsdam.de, hugosotoparada@gmail.com
8 |
9 | """
10 |
11 | import numpy as np
12 | import obspy.core as oc
13 | from obspy.signal.trigger import trigger_onset
14 | from obspy.io.mseed.core import InternalMSEEDError
15 | from glob import glob
16 | import re, sys, os, shutil, gc
17 |
18 |
19 | class Data():
20 | """
21 | Class defining seismic data-related methods.
22 | """
23 |
24 | def read_from_archive(self, config):
25 | """
26 | Reads seismic data on which DeepPhasePick is applied.
27 | Data must be stored in an archive directory structured as: archive/YY/NET/STA/CH.
28 | Here YY is year, NET is the network code, STA is the station code and CH is the channel code (e.g., HH) of the seismic streams.
29 |
30 | Parameters
31 | ----------
32 | config: instance of config.Config
33 | Contains user configuration defining which seismic waveform data is selected and how this data is processed in DeepPhasePick.
34 | """
35 | #
36 | # set time windows for iteration over continuous waveforms
37 | #
38 | tstart, tend, dt_iter = [config.time['tstart'], config.time['tend'], config.time['dt_iter']]
39 | t_iters = []
40 | tstart_iter = tstart
41 | tend_iter = tstart_iter + dt_iter
42 | if tend_iter > tend:
43 | tend_iter = tend
44 | #
45 | print("#")
46 | while tstart_iter < tend:
47 | t_iters.append([tstart_iter, tend_iter])
48 | tstart_iter += dt_iter
49 | tend_iter += dt_iter
50 | if tend_iter > tend:
51 | tend_iter = tend
52 | #
53 | print(f"time windows ({len(t_iters)}) for iteration over continuous waveforms:")
54 | for t_iter in t_iters:
55 | print(t_iter)
56 | print("")
57 | #
58 | # iterate over time windows
59 | #
60 | self.data = {}
61 | stas = config.data['stas']
62 | doy_tmp = '999'
63 | for i, t_iter in enumerate(t_iters):
64 | #
65 | tstart_iter, tend_iter = t_iter
66 | #
67 | twin_str = f"{tstart_iter.year}{tstart_iter.month:02}{tstart_iter.day:02}"
68 | twin_str += f"T{tstart_iter.hour:02}{tstart_iter.minute:02}{tstart_iter.second:02}"
69 | twin_str += f"_{tend_iter.year}{tend_iter.month:02}{tend_iter.day:02}"
70 | twin_str += f"T{(tend_iter).hour:02}{(tend_iter).minute:02}{(tend_iter).second:02}"
71 | opath = f"{config.data['opath']}/{twin_str}"
72 | yy = tstart_iter.year
73 | doy = '%03d' % (tstart_iter.julday)
74 | #
75 | if doy != doy_tmp:
76 | sts = [oc.Stream(), oc.Stream(), oc.Stream(), oc.Stream(), oc.Stream()]
77 | st = oc.Stream()
78 | #
79 | print("")
80 | print("retrieving seismic waveforms for stations:")
81 | print(stas)
82 | #
83 | st_arg = 0
84 | stas_remove = []
85 | for j, sta in enumerate(stas):
86 | #
87 | # read seismic waveforms from archive
88 | #
89 | net = config.data['net']
90 | ch = config.data['ch']
91 | path = config.data['archive']
92 | flist = glob(path+'/'+str(yy)+'/'+net+'/'+sta+'/'+ch+'?.D/*.'+doy)
93 | #
94 | if len(flist) > 0:
95 | outstr = f"seismic data found for: net = {net}, sta = {sta}, st_count = {len(sts[st_arg])}, st_arg = {st_arg}"
96 | print(outstr)
97 | outstr = str(flist)
98 | print(outstr)
99 | else:
100 | outstr = f"seismic data not found for: net = {net}, sta = {sta}"
101 | print(outstr)
102 | #
103 | if len(sts[st_arg]) >= 50:
104 | st_arg += 1
105 | #
106 | for f in flist:
107 | try:
108 | print(f)
109 | tr = oc.read(f)
110 | sts[st_arg] += tr
111 | #
112 | except InternalMSEEDError:
113 | stas_remove.append(sta)
114 | outstr = f"skipping {f} --> InternalMSEEDError exception"
115 | print(outstr)
116 | continue
117 | #
118 | for stt in sts:
119 | st += stt
120 | del sts
121 | #
122 | stas_remove = set(stas_remove)
123 | for s in stas_remove:
124 | for tr in st.select(station=s):
125 | st.remove(tr)
126 | print(st.__str__(extended=True))
127 | #
128 | # process (detrend, filter, resample, ...) raw stream data
129 | #
130 | print("#")
131 | print("processing raw stream data...")
132 | #
133 | if config.data_params['st_detrend']:
134 | #
135 | print('detrend...')
136 | try:
137 | st.detrend(type='linear')
138 | except NotImplementedError:
139 | #
140 | # Catch exception NotImplementedError: Trace with masked values found. This is not supported for this operation.
141 | # Try the split() method on Stream to produce a Stream with unmasked Traces.
142 | #
143 | st = st.split()
144 | st.detrend(type='linear')
145 | except ValueError:
146 | #
147 | # Catch exception ValueError: array must not contain infs or NaNs.
148 | # Due to the presence of e.g. NaNs in the data of at least one trace.
149 | #
150 | stas_remove = []
151 | for tr in st:
152 | nnan = np.count_nonzero(np.isnan(tr.data))
153 | ninf = np.count_nonzero(np.isinf(tr.data))
154 | if nnan > 0:
155 | print(f"{tr} --> removed (due to presence of nans)")
156 | stas_remove.append(tr.stats.station)
157 | continue
158 | if ninf > 0:
159 | print(f"{tr} --> removed (due to presence of infs)")
160 | stas_remove.append(tr.stats.station)
161 | continue
162 | #
163 | stas_remove = set(stas_remove)
164 | for s in stas_remove:
165 | for tr in st.select(station=s):
166 | st.remove(tr)
167 | st.detrend(type='linear')
168 | #
169 | if config.data_params['st_filter'] is not None:
170 | #
171 | print('filter...')
172 | st.filter(type=config.data_params['st_filter'], **config.data_params['filter_opts'])
173 | #
174 | if config.data_params['st_resample']:
175 | #
176 | print('resampling...')
177 | for tr in st:
178 | #
179 | if tr.stats.sampling_rate == config.data_params['samp_freq']:
180 | outstr = f"{tr} --> skipped, already sampled at {tr.stats.sampling_rate} Hz"
181 | print(outstr)
182 | pass
183 | #
184 | if tr.stats.sampling_rate < config.data_params['samp_freq']:
185 | outstr = f"{tr} --> resampling from {tr.stats.sampling_rate} to {config.data_params['samp_freq']} Hz"
186 | print(outstr)
187 | tr.resample(config.data_params['samp_freq'])
188 | #
189 | if tr.stats.sampling_rate > config.data_params['samp_freq']:
190 | #
191 | if int(tr.stats.sampling_rate % config.data_params['samp_freq']) == 0:
192 | decim_factor = int(tr.stats.sampling_rate / config.data_params['samp_freq'])
193 | outstr = f"{tr} --> decimating from {tr.stats.sampling_rate} to {config.data_params['samp_freq']} Hz"
194 | print(outstr)
195 | tr.decimate(decim_factor)
196 | else:
197 | outstr = f"{tr} --> resampling from {tr.stats.sampling_rate} to {config.data_params['samp_freq']} Hz !!"
198 | print(outstr)
199 | tr.resample(config.data_params['samp_freq'])
200 | #
201 | print('merging...')
202 | try:
203 | st.merge()
204 | except Exception:
205 | #
206 | outstr = f"Catch exception: can't merge traces with same ids but differing data types!"
207 | print(outstr)
208 | for tr in st:
209 | tr.data = tr.data.astype(np.int32)
210 | st.merge()
211 | #
212 | print('slicing...')
213 | stt = st.slice(tstart_iter, tend_iter)
214 | #
215 | # sort traces of the same station by channel, so for each station traces appear in the order (HHE, HHN, HHZ)
216 | stt.sort(['channel'])
217 | #
218 | self.data[i+1] = {
219 | 'st': {},
220 | 'twin': [tstart_iter, tend_iter],
221 | 'opath': opath,
222 | }
223 | for t, tr in enumerate(stt):
224 | sta = tr.stats.station
225 | #
226 | st_check = stt.select(station=sta)
227 | if len(st_check) < 3:
228 | print(f"skipping trace of stream with less than 3 components: {tr}")
229 | continue
230 | #
231 | if sta not in self.data[i+1]['st'].keys():
232 | self.data[i+1]['st'][sta] = oc.Stream()
233 | self.data[i+1]['st'][sta] += tr
234 | else:
235 | self.data[i+1]['st'][sta] += tr
236 | #
237 | doy_tmp = doy
238 |
239 |
240 | def read_from_directory(self, config):
241 | """
242 | Reads seismic data on which DeepPhasePick is applied.
243 | All waveforms must be stored in an unstructured archive directory, e.g.: archive/
244 |
245 | Parameters
246 | ----------
247 | config: instance of config.Config
248 | Contains user configuration defining which seismic waveform data is selected and how this data is processed in DeepPhasePick.
249 | """
250 | #
251 | # read seismic waveforms from directory
252 | #
253 | path = config.data['archive']
254 | tstart, tend, dt_iter = [config.time['tstart'], config.time['tend'], config.time['dt_iter']]
255 | flist_all = sorted(glob(path+'/*'))[:]
256 | #
257 | flabels = []
258 | sts = [oc.Stream(), oc.Stream(), oc.Stream(), oc.Stream(), oc.Stream()]
259 | st = oc.Stream()
260 | st_arg = 0
261 | for i, f in enumerate(flist_all[:]):
262 | #
263 | if len(sts[st_arg]) >= 50:
264 | st_arg += 1
265 | #
266 | print(f"reading seismic waveform: {f}")
267 | tr = oc.read(f)
268 | tr_net, tr_sta, tr_ch = tr[0].stats.network, tr[0].stats.station, tr[0].stats.channel[:-1]
269 | if (tr_net == config.data['net']) and (tr_sta in config.data['stas']) and (tr_ch == config.data['ch']):
270 | try:
271 | sts[st_arg] += tr
272 | except InternalMSEEDError:
273 | # stas_remove.append(sta)
274 | outstr = f"skipping {f} --> InternalMSEEDError exception"
275 | print(outstr)
276 | continue
277 | #
278 | for stt in sts:
279 | st += stt
280 | del sts
281 | #
282 | print(st.__str__(extended=True))
283 | #
284 | # process (detrend, filter, resample, ...) raw stream data
285 | #
286 | print("#")
287 | print("processing raw stream data...")
288 | #
289 | if config.data_params['st_detrend']:
290 | #
291 | print('detrend...')
292 | try:
293 | st.detrend(type='linear')
294 | except NotImplementedError:
295 | #
296 | # Catch exception NotImplementedError: Trace with masked values found. This is not supported for this operation.
297 | # Try split() method on Stream to produce a Stream with unmasked Traces.
298 | #
299 | st = st.split()
300 | st.detrend(type='linear')
301 | except ValueError:
302 | #
303 | # Catch exception ValueError: array must not contain infs or NaNs.
304 | # Due to the presence of e.g. NaNs in the data of at least one trace.
305 | #
306 | stas_remove = []
307 | for tr in st:
308 | nnan = np.count_nonzero(np.isnan(tr.data))
309 | ninf = np.count_nonzero(np.isinf(tr.data))
310 | if nnan > 0:
311 | print(f"{tr} --> removed (due to presence of nans)")
312 | stas_remove.append(tr.stats.station)
313 | continue
314 | if ninf > 0:
315 | print(f"{tr} --> removed (due to presence of infs)")
316 | stas_remove.append(tr.stats.station)
317 | continue
318 | #
319 | stas_remove = set(stas_remove)
320 | for s in stas_remove:
321 | for tr in st.select(station=s):
322 | st.remove(tr)
323 | st.detrend(type='linear')
324 | #
325 | if config.data_params['st_filter'] is not None:
326 | #
327 | print('filter...')
328 | st.filter(type=config.data_params['st_filter'], **config.data_params['filter_opts'])
329 | #
330 | if config.data_params['st_resample']:
331 | #
332 | print('resampling...')
333 | for tr in st:
334 | #
335 | if tr.stats.sampling_rate == config.data_params['samp_freq']:
336 | outstr = f"{tr} --> skipped, already sampled at {tr.stats.sampling_rate} Hz"
337 | print(outstr)
338 | pass
339 | #
340 | if tr.stats.sampling_rate < config.data_params['samp_freq']:
341 | outstr = f"{tr} --> resampling from {tr.stats.sampling_rate} to {config.data_params['samp_freq']} Hz"
342 | print(outstr)
343 | tr.resample(config.data_params['samp_freq'])
344 | #
345 | if tr.stats.sampling_rate > config.data_params['samp_freq']:
346 | #
347 | if int(tr.stats.sampling_rate % config.data_params['samp_freq']) == 0:
348 | decim_factor = int(tr.stats.sampling_rate / config.data_params['samp_freq'])
349 | outstr = f"{tr} --> decimating from {tr.stats.sampling_rate} to {config.data_params['samp_freq']} Hz"
350 | print(outstr)
351 | tr.decimate(decim_factor)
352 | else:
353 | outstr = f"{tr} --> resampling from {tr.stats.sampling_rate} to {config.data_params['samp_freq']} Hz !!"
354 | print(outstr)
355 | tr.resample(config.data_params['samp_freq'])
356 | #
357 | print('merging...')
358 | try:
359 | st.merge()
360 | except Exception:
361 | #
362 | outstr = f"Catch exception: can't merge traces with same ids but differing data types!"
363 | print(outstr)
364 | for tr in st:
365 | tr.data = tr.data.astype(np.int32)
366 | st.merge()
367 | #
368 | # set time windows for iteration over continuous waveforms
369 | #
370 | t_iters = []
371 | tstart_iter = tstart
372 | tend_iter = tstart_iter + dt_iter
373 | if tend_iter > tend:
374 | tend_iter = tend
375 | #
376 | print("#")
377 | while tstart_iter < tend:
378 | t_iters.append([tstart_iter, tend_iter])
379 | tstart_iter += dt_iter
380 | tend_iter += dt_iter
381 | if tend_iter > tend:
382 | tend_iter = tend
383 | #
384 | print(f"time windows ({len(t_iters)}) for iteration over continuous waveforms:")
385 | for t_iter in t_iters:
386 | print(t_iter)
387 | print("")
388 | #
389 | # iterate over time windows
390 | #
391 | self.data = {}
392 | for i, t_iter in enumerate(t_iters):
393 | #
394 | tstart_iter, tend_iter = t_iter
395 | #
396 | twin_str = f"{tstart_iter.year}{tstart_iter.month:02}{tstart_iter.day:02}"
397 | twin_str += f"T{tstart_iter.hour:02}{tstart_iter.minute:02}{tstart_iter.second:02}"
398 | twin_str += f"_{tend_iter.year}{tend_iter.month:02}{tend_iter.day:02}"
399 | twin_str += f"T{(tend_iter).hour:02}{(tend_iter).minute:02}{(tend_iter).second:02}"
400 | #
401 | opath = f"{config.data['opath']}/{twin_str}"
402 | yy = tstart_iter.year
403 | doy = '%03d' % (tstart_iter.julday)
404 | #
405 | print('slicing...')
406 | stt = st.slice(tstart_iter, tend_iter)
407 | #
408 | # sort traces of the same station by channel, so for each station traces appear in the order (HHE, HHN, HHZ)
409 | stt.sort(['channel'])
410 | #
411 | self.data[i+1] = {
412 | 'st': {},
413 | 'twin': [tstart_iter, tend_iter],
414 | 'opath': opath,
415 | }
416 | for t, tr in enumerate(stt):
417 | sta = tr.stats.station
418 | #
419 | st_check = stt.select(station=sta)
420 | if len(st_check) < 3:
421 | print(f"skipping trace of stream with less than 3 components: {tr}")
422 | continue
423 | #
424 | if sta not in self.data[i+1]['st'].keys():
425 | self.data[i+1]['st'][sta] = oc.Stream()
426 | self.data[i+1]['st'][sta] += tr
427 | else:
428 | self.data[i+1]['st'][sta] += tr
429 |
--------------------------------------------------------------------------------
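Both read_from_archive() and read_from_directory() store their result in Data.data, a dictionary keyed by time-window index (starting at 1); each entry holds the sliced per-station streams ('st'), the window limits ('twin') and the output path ('opath'). For instance, a 12-hour tstart-tend span with dt_iter=3600 yields 12 one-hour windows, as in the run_dpp_from_archive example below. A minimal sketch (reusing the sample data and configuration of the examples) that walks this structure:

# sketch: read the unstructured sample data and inspect the resulting dictionary
import config, data

cfg = config.Config()
cfg.set_data(stas=['PB01', 'PB02'], net='CX', ch='HH',
             archive='sample_data/CX_20140401', opath='out_CX_20140401')
cfg.set_time(dt_iter=3600., tstart="2014-04-01T02:00:00", tend="2014-04-01T03:00:00")

dpp_data = data.Data()
dpp_data.read_from_directory(cfg)

for k, win in dpp_data.data.items():      # k = 1, 2, ... per time window
    tstart_iter, tend_iter = win['twin']
    print(f"window {k}: {tstart_iter} - {tend_iter} -> {win['opath']}")
    for sta, st in win['st'].items():     # one 3-component stream per station
        print(f"  {sta}: {len(st)} traces")
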
/dpp.yml:
--------------------------------------------------------------------------------
1 | name: dpp
2 | channels:
3 | - conda-forge
4 | - defaults
5 | dependencies:
6 | - python=3.6
7 | - obspy==1.2.2
8 | - numpy
9 | - matplotlib
10 | - tqdm
11 | - pip
12 | - pip:
13 | - tensorflow==2.2.0
14 |
--------------------------------------------------------------------------------
/examples/run_dpp_download_data.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "### This example applies DeepPhasePick on seismic data downloaded using the FDSN web service client for ObsPy."
8 | ]
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": 1,
13 | "metadata": {},
14 | "outputs": [],
15 | "source": [
16 | "import os\n",
17 | "os.chdir('../')\n",
18 | "import config, data, model, util \n",
19 | "from obspy.clients.fdsn import Client \n",
20 | "import obspy.core as oc "
21 | ]
22 | },
23 | {
24 | "cell_type": "markdown",
25 | "metadata": {},
26 | "source": [
27 | "## 1. Configure DPP"
28 | ]
29 | },
30 | {
31 | "cell_type": "code",
32 | "execution_count": 2,
33 | "metadata": {},
34 | "outputs": [
35 | {
36 | "name": "stdout",
37 | "output_type": "stream",
38 | "text": [
39 | "__pycache__/ removed\n",
40 | "~/.nv/ not found, continuing...\n"
41 | ]
42 | }
43 | ],
44 | "source": [
45 | "# config\n",
46 | "util.init_session()\n",
47 | "dpp_config = config.Config()\n",
48 | "dpp_config.set_trigger(pthres_p=[0.9, 0.001], pthres_s=[0.9, 0.001])\n",
49 | "dpp_config.set_picking(mcd_iter=10, run_mcd=True)\n",
50 | "# dpp_config.set_picking(run_mcd=False)\n",
51 | "#\n",
52 | "dpp_config.set_data(\n",
53 | " stas=['PB01', 'PB02'],\n",
54 | " net='CX',\n",
55 | " ch='HH',\n",
56 | " archive='sample_data/CX_20140401',\n",
57 | " opath='out_CX_20140401'\n",
58 | ")\n",
59 | "dpp_config.set_time(\n",
60 | " dt_iter=3600.,\n",
61 | " tstart = \"2014-04-01T02:00:00\",\n",
62 | " tend = \"2014-04-01T03:00:00\", \n",
63 | ")"
64 | ]
65 | },
66 | {
67 | "cell_type": "markdown",
68 | "metadata": {},
69 | "source": [
70 | "## 2. Download seismic data and read it into DPP"
71 | ]
72 | },
73 | {
74 | "cell_type": "code",
75 | "execution_count": 3,
76 | "metadata": {},
77 | "outputs": [
78 | {
79 | "name": "stdout",
80 | "output_type": "stream",
81 | "text": [
82 | "writing stream sample_data/CX_20140401/CX.PB01..HH.mseed...\n",
83 | "writing stream sample_data/CX_20140401/CX.PB02..HH.mseed...\n"
84 | ]
85 | }
86 | ],
87 | "source": [
88 | "# download and archive seismic waveforms\n",
89 | "client = Client(\"GFZ\")\n",
90 | "os.makedirs(f\"{dpp_config.data['archive']}\", exist_ok=True)\n",
91 | "tstart = oc.UTCDateTime(dpp_config.time['tstart'])\n",
92 | "tend = oc.UTCDateTime(dpp_config.time['tend'])\n",
93 | "#\n",
94 | "for sta in dpp_config.data['stas']:\n",
95 | " st = client.get_waveforms(network=dpp_config.data['net'], station=sta, location=\"*\", channel=f\"{dpp_config.data['ch']}?\", starttime=tstart, endtime=tend)\n",
96 | " # print(st)\n",
97 | " st_name = f\"{dpp_config.data['archive']}/{st[0].stats.network}.{st[0].stats.station}..{st[0].stats.channel[:-1]}.mseed\"\n",
98 | " print(f\"writing stream {st_name}...\")\n",
99 | " st.write(st_name, format=\"MSEED\")"
100 | ]
101 | },
102 | {
103 | "cell_type": "code",
104 | "execution_count": 4,
105 | "metadata": {
106 | "scrolled": true
107 | },
108 | "outputs": [
109 | {
110 | "name": "stdout",
111 | "output_type": "stream",
112 | "text": [
113 | "reading seismic waveform: sample_data/CX_20140401/CX.PB01..HH.mseed\n",
114 | "reading seismic waveform: sample_data/CX_20140401/CX.PB02..HH.mseed\n",
115 | "6 Trace(s) in Stream:\n",
116 | "CX.PB01..HHZ | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
117 | "CX.PB01..HHE | 2014-04-01T01:59:59.998394Z - 2014-04-01T02:59:59.998394Z | 100.0 Hz, 360001 samples\n",
118 | "CX.PB01..HHN | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
119 | "CX.PB02..HHZ | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
120 | "CX.PB02..HHE | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
121 | "CX.PB02..HHN | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
122 | "#\n",
123 | "processing raw stream data...\n",
124 | "detrend...\n",
125 | "resampling...\n",
126 | "CX.PB01..HHZ | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples --> skipped, already sampled at 100.0 Hz\n",
127 | "CX.PB01..HHE | 2014-04-01T01:59:59.998394Z - 2014-04-01T02:59:59.998394Z | 100.0 Hz, 360001 samples --> skipped, already sampled at 100.0 Hz\n",
128 | "CX.PB01..HHN | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples --> skipped, already sampled at 100.0 Hz\n",
129 | "CX.PB02..HHZ | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples --> skipped, already sampled at 100.0 Hz\n",
130 | "CX.PB02..HHE | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples --> skipped, already sampled at 100.0 Hz\n",
131 | "CX.PB02..HHN | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples --> skipped, already sampled at 100.0 Hz\n",
132 | "merging...\n",
133 | "#\n",
134 | "time windows (1) for iteration over continuous waveforms:\n",
135 | "[UTCDateTime(2014, 4, 1, 2, 0), UTCDateTime(2014, 4, 1, 3, 0)]\n",
136 | "\n",
137 | "slicing...\n"
138 | ]
139 | }
140 | ],
141 | "source": [
142 | "# data\n",
143 | "dpp_data = data.Data()\n",
144 | "dpp_data.read_from_directory(dpp_config)\n",
145 | "#\n",
146 | "# for k in dpp_data.data:\n",
147 | "# print(k, dpp_data.data[k])"
148 | ]
149 | },
150 | {
151 | "cell_type": "markdown",
152 | "metadata": {},
153 | "source": [
154 | "## 3. Run phase detection and picking"
155 | ]
156 | },
157 | {
158 | "cell_type": "code",
159 | "execution_count": 5,
160 | "metadata": {
161 | "scrolled": true
162 | },
163 | "outputs": [],
164 | "source": [
165 | "# model\n",
166 | "# dpp_model = model.Model(verbose=False)\n",
167 | "dpp_model = model.Model(verbose=False, version_pick_P=\"20201002_2\", version_pick_S=\"20201002_2\")\n",
168 | "#\n",
169 | "# print(dpp_model.model_detection['best_model'].summary())\n",
170 | "# print(dpp_model.model_picking_P['best_model'].summary())\n",
171 | "# print(dpp_model.model_picking_S['best_model'].summary())"
172 | ]
173 | },
174 | {
175 | "cell_type": "code",
176 | "execution_count": 6,
177 | "metadata": {
178 | "scrolled": false
179 | },
180 | "outputs": [
181 | {
182 | "name": "stdout",
183 | "output_type": "stream",
184 | "text": [
185 | "#\n",
186 | "Calculating predictions for stream: CX.PB01..HH?...\n",
187 | "strimming stream: 1, 1\n",
188 | "720/720 [==============================] - 28s 39ms/step\n",
189 | "3 Trace(s) in Stream:\n",
190 | "CX.PB01..HHE | 2014-04-01T01:59:59.998394Z - 2014-04-01T02:59:59.998394Z | 100.0 Hz, 360001 samples\n",
191 | "CX.PB01..HHN | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
192 | "CX.PB01..HHZ | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
193 | "p_picks = 13, s_picks = 8\n",
194 | "#\n",
195 | "Calculating predictions for stream: CX.PB02..HH?...\n",
196 | "720/720 [==============================] - 34s 48ms/step\n",
197 | "3 Trace(s) in Stream:\n",
198 | "CX.PB02..HHE | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
199 | "CX.PB02..HHN | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
200 | "CX.PB02..HHZ | 2014-04-01T01:59:59.998393Z - 2014-04-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
201 | "p_picks = 10, s_picks = 4\n"
202 | ]
203 | }
204 | ],
205 | "source": [
206 | "# run phase detection\n",
207 | "dpp_model.run_detection(dpp_config, dpp_data, save_dets=False, save_data=False)"
208 | ]
209 | },
210 | {
211 | "cell_type": "code",
212 | "execution_count": 7,
213 | "metadata": {
214 | "scrolled": true
215 | },
216 | "outputs": [
217 | {
218 | "name": "stdout",
219 | "output_type": "stream",
220 | "text": [
221 | "#\n",
222 | "1, 2014-04-01T02:00:00.000000Z, 2014-04-01T03:00:00.000000Z, PB01\n",
223 | "triggered picks (P, S): 13, 8\n",
224 | "selected picks (P, S): 10, 6\n",
225 | "#\n",
226 | "P pick: 1/10\n"
227 | ]
228 | },
229 | {
230 | "name": "stderr",
231 | "output_type": "stream",
232 | "text": [
233 | "100%|██████████| 10/10 [00:03<00:00, 2.97it/s]\n"
234 | ]
235 | },
236 | {
237 | "name": "stdout",
238 | "output_type": "stream",
239 | "text": [
240 | "3.36 3.47 3.41 3.77 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
241 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_01.png\n",
242 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_01.png\n",
243 | "tpred = 3.470\n",
244 | "terr(1 x pb_std) = (-0.060, +0.300)\n",
245 | "pick_class = 2\n",
246 | "pb, pb_std = (0.508, 0.233)\n",
247 | "#\n",
248 | "P pick: 2/10\n"
249 | ]
250 | },
251 | {
252 | "name": "stderr",
253 | "output_type": "stream",
254 | "text": [
255 | "100%|██████████| 10/10 [00:01<00:00, 5.11it/s]\n"
256 | ]
257 | },
258 | {
259 | "name": "stdout",
260 | "output_type": "stream",
261 | "text": [
262 | "3.36 3.62 3.47 3.71 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
263 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_02.png\n",
264 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_02.png\n",
265 | "tpred = 3.620\n",
266 | "terr(1 x pb_std) = (-0.150, +0.090)\n",
267 | "pick_class = 2\n",
268 | "pb, pb_std = (0.502, 0.199)\n",
269 | "#\n",
270 | "P pick: 3/10\n"
271 | ]
272 | },
273 | {
274 | "name": "stderr",
275 | "output_type": "stream",
276 | "text": [
277 | "100%|██████████| 10/10 [00:01<00:00, 5.44it/s]\n"
278 | ]
279 | },
280 | {
281 | "name": "stdout",
282 | "output_type": "stream",
283 | "text": [
284 | "3.36 3.31 3.3 3.32 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
285 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_03.png\n",
286 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_03.png\n",
287 | "tpred = 3.310\n",
288 | "terr(1 x pb_std) = (-0.010, +0.010)\n",
289 | "pick_class = 0\n",
290 | "pb, pb_std = (0.546, 0.182)\n",
291 | "#\n",
292 | "P pick: 4/10\n"
293 | ]
294 | },
295 | {
296 | "name": "stderr",
297 | "output_type": "stream",
298 | "text": [
299 | "100%|██████████| 10/10 [00:01<00:00, 6.46it/s]\n"
300 | ]
301 | },
302 | {
303 | "name": "stdout",
304 | "output_type": "stream",
305 | "text": [
306 | "3.36 3.33 3.31 3.35 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
307 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_04.png\n",
308 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_04.png\n",
309 | "tpred = 3.330\n",
310 | "terr(1 x pb_std) = (-0.020, +0.020)\n",
311 | "pick_class = 0\n",
312 | "pb, pb_std = (0.512, 0.132)\n",
313 | "#\n",
314 | "P pick: 5/10\n"
315 | ]
316 | },
317 | {
318 | "name": "stderr",
319 | "output_type": "stream",
320 | "text": [
321 | "100%|██████████| 10/10 [00:02<00:00, 4.94it/s]\n"
322 | ]
323 | },
324 | {
325 | "name": "stdout",
326 | "output_type": "stream",
327 | "text": [
328 | "3.36 3.25 3.09 3.52 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
329 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_05.png\n",
330 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_05.png\n",
331 | "tpred = 3.250\n",
332 | "terr(1 x pb_std) = (-0.160, +0.270)\n",
333 | "pick_class = 3\n",
334 | "pb, pb_std = (0.507, 0.318)\n",
335 | "#\n",
336 | "P pick: 6/10\n"
337 | ]
338 | },
339 | {
340 | "name": "stderr",
341 | "output_type": "stream",
342 | "text": [
343 | "100%|██████████| 10/10 [00:02<00:00, 4.56it/s]\n"
344 | ]
345 | },
346 | {
347 | "name": "stdout",
348 | "output_type": "stream",
349 | "text": [
350 | "3.36 3.36 3.35 3.37 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
351 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_06.png\n",
352 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_06.png\n",
353 | "tpred = 3.360\n",
354 | "terr(1 x pb_std) = (-0.010, +0.010)\n",
355 | "pick_class = 0\n",
356 | "pb, pb_std = (0.537, 0.097)\n",
357 | "#\n",
358 | "P pick: 7/10\n"
359 | ]
360 | },
361 | {
362 | "name": "stderr",
363 | "output_type": "stream",
364 | "text": [
365 | "100%|██████████| 10/10 [00:02<00:00, 4.14it/s]\n"
366 | ]
367 | },
368 | {
369 | "name": "stdout",
370 | "output_type": "stream",
371 | "text": [
372 | "3.36 3.5 3.49 3.5 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
373 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_07.png\n",
374 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_07.png\n",
375 | "tpred = 3.500\n",
376 | "terr(1 x pb_std) = (-0.010, +0.000)\n",
377 | "pick_class = 0\n",
378 | "pb, pb_std = (0.560, 0.124)\n",
379 | "#\n",
380 | "P pick: 8/10\n"
381 | ]
382 | },
383 | {
384 | "name": "stderr",
385 | "output_type": "stream",
386 | "text": [
387 | "100%|██████████| 10/10 [00:01<00:00, 5.13it/s]\n"
388 | ]
389 | },
390 | {
391 | "name": "stdout",
392 | "output_type": "stream",
393 | "text": [
394 | "3.36 3.26 3.23 3.27 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
395 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_08.png\n",
396 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_08.png\n",
397 | "tpred = 3.260\n",
398 | "terr(1 x pb_std) = (-0.030, +0.010)\n",
399 | "pick_class = 0\n",
400 | "pb, pb_std = (0.512, 0.164)\n",
401 | "#\n",
402 | "P pick: 9/10\n"
403 | ]
404 | },
405 | {
406 | "name": "stderr",
407 | "output_type": "stream",
408 | "text": [
409 | "100%|██████████| 10/10 [00:01<00:00, 5.06it/s]\n"
410 | ]
411 | },
412 | {
413 | "name": "stdout",
414 | "output_type": "stream",
415 | "text": [
416 | "3.36 3.54 3.5 3.59 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
417 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_09.png\n",
418 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_09.png\n",
419 | "tpred = 3.540\n",
420 | "terr(1 x pb_std) = (-0.040, +0.050)\n",
421 | "pick_class = 0\n",
422 | "pb, pb_std = (0.515, 0.170)\n",
423 | "#\n",
424 | "P pick: 10/10\n"
425 | ]
426 | },
427 | {
428 | "name": "stderr",
429 | "output_type": "stream",
430 | "text": [
431 | "100%|██████████| 10/10 [00:02<00:00, 4.34it/s]\n"
432 | ]
433 | },
434 | {
435 | "name": "stdout",
436 | "output_type": "stream",
437 | "text": [
438 | "3.36 3.07 3.03 3.21 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
439 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_10.png\n",
440 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_P_mc_10.png\n",
441 | "tpred = 3.070\n",
442 | "terr(1 x pb_std) = (-0.040, +0.140)\n",
443 | "pick_class = 1\n",
444 | "pb, pb_std = (0.530, 0.293)\n",
445 | "#\n",
446 | "S pick: 1/6\n"
447 | ]
448 | },
449 | {
450 | "name": "stderr",
451 | "output_type": "stream",
452 | "text": [
453 | "100%|██████████| 10/10 [00:03<00:00, 3.00it/s]\n"
454 | ]
455 | },
456 | {
457 | "name": "stdout",
458 | "output_type": "stream",
459 | "text": [
460 | "2.4 2.67 2.65 2.7 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
461 | "(480,) (480,)\n",
462 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_01.png\n",
463 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_01.png\n",
464 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_mc_01.png\n",
465 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_mc_01.png\n",
466 | "tpred = 2.670\n",
467 | "terr(1 x pb_std) = (-0.020, +0.030)\n",
468 | "pick_class = 0\n",
469 | "pb, pb_std = (0.522, 0.072)\n",
470 | "#\n",
471 | "S pick: 2/6\n"
472 | ]
473 | },
474 | {
475 | "name": "stderr",
476 | "output_type": "stream",
477 | "text": [
478 | "100%|██████████| 10/10 [00:01<00:00, 6.96it/s]\n"
479 | ]
480 | },
481 | {
482 | "name": "stdout",
483 | "output_type": "stream",
484 | "text": [
485 | "2.4 2.42 2.4 2.44 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
486 | "(480,) (480,)\n",
487 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_02.png\n",
488 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_02.png\n",
489 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_mc_02.png\n",
490 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_mc_02.png\n",
491 | "tpred = 2.420\n",
492 | "terr(1 x pb_std) = (-0.020, +0.020)\n",
493 | "pick_class = 0\n",
494 | "pb, pb_std = (0.509, 0.078)\n",
495 | "#\n",
496 | "S pick: 3/6\n"
497 | ]
498 | },
499 | {
500 | "name": "stderr",
501 | "output_type": "stream",
502 | "text": [
503 | "100%|██████████| 10/10 [00:01<00:00, 7.14it/s]\n"
504 | ]
505 | },
506 | {
507 | "name": "stdout",
508 | "output_type": "stream",
509 | "text": [
510 | "2.4 2.36 2.27 2.55 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
511 | "(480,) (480,)\n",
512 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_03.png\n",
513 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_03.png\n",
514 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_mc_03.png\n",
515 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_mc_03.png\n",
516 | "tpred = 2.360\n",
517 | "terr(1 x pb_std) = (-0.090, +0.190)\n",
518 | "pick_class = 2\n",
519 | "pb, pb_std = (0.503, 0.132)\n",
520 | "#\n",
521 | "S pick: 4/6\n"
522 | ]
523 | },
524 | {
525 | "name": "stderr",
526 | "output_type": "stream",
527 | "text": [
528 | "100%|██████████| 10/10 [00:01<00:00, 7.46it/s]\n"
529 | ]
530 | },
531 | {
532 | "name": "stdout",
533 | "output_type": "stream",
534 | "text": [
535 | "2.4 2.44 2.27 2.51 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
536 | "(480,) (480,)\n",
537 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_04.png\n",
538 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_04.png\n",
539 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_mc_04.png\n",
540 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_mc_04.png\n",
541 | "tpred = 2.440\n",
542 | "terr(1 x pb_std) = (-0.170, +0.070)\n",
543 | "pick_class = 2\n",
544 | "pb, pb_std = (0.512, 0.171)\n",
545 | "#\n",
546 | "S pick: 5/6\n"
547 | ]
548 | },
549 | {
550 | "name": "stderr",
551 | "output_type": "stream",
552 | "text": [
553 | "100%|██████████| 10/10 [00:01<00:00, 8.04it/s]\n"
554 | ]
555 | },
556 | {
557 | "name": "stdout",
558 | "output_type": "stream",
559 | "text": [
560 | "2.4 2.4 2.29 2.5 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
561 | "(480,) (480,)\n",
562 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_05.png\n",
563 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_05.png\n",
564 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_mc_05.png\n",
565 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_mc_05.png\n",
566 | "tpred = 2.400\n",
567 | "terr(1 x pb_std) = (-0.110, +0.100)\n",
568 | "pick_class = 2\n",
569 | "pb, pb_std = (0.505, 0.101)\n",
570 | "#\n",
571 | "S pick: 6/6\n"
572 | ]
573 | },
574 | {
575 | "name": "stderr",
576 | "output_type": "stream",
577 | "text": [
578 | "100%|██████████| 10/10 [00:01<00:00, 7.39it/s]\n"
579 | ]
580 | },
581 | {
582 | "name": "stdout",
583 | "output_type": "stream",
584 | "text": [
585 | "2.4 2.48 2.39 2.65 out_CX_20140401/20140401T020000_20140401T030000 PB01\n",
586 | "(480,) (480,)\n",
587 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_06.png\n",
588 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_06.png\n",
589 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_E_mc_06.png\n",
590 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB01_S_N_mc_06.png\n",
591 | "tpred = 2.480\n",
592 | "terr(1 x pb_std) = (-0.090, +0.170)\n",
593 | "pick_class = 2\n",
594 | "pb, pb_std = (0.506, 0.248)\n",
595 | "#\n",
596 | "1, 2014-04-01T02:00:00.000000Z, 2014-04-01T03:00:00.000000Z, PB02\n",
597 | "triggered picks (P, S): 10, 4\n",
598 | "selected picks (P, S): 10, 3\n",
599 | "#\n",
600 | "P pick: 1/10\n"
601 | ]
602 | },
603 | {
604 | "name": "stderr",
605 | "output_type": "stream",
606 | "text": [
607 | "100%|██████████| 10/10 [00:01<00:00, 5.90it/s]\n"
608 | ]
609 | },
610 | {
611 | "name": "stdout",
612 | "output_type": "stream",
613 | "text": [
614 | "3.36 3.21 3.2 3.22 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
615 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_01.png\n",
616 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_01.png\n",
617 | "tpred = 3.210\n",
618 | "terr(1 x pb_std) = (-0.010, +0.010)\n",
619 | "pick_class = 0\n",
620 | "pb, pb_std = (0.580, 0.125)\n",
621 | "#\n",
622 | "P pick: 2/10\n"
623 | ]
624 | },
625 | {
626 | "name": "stderr",
627 | "output_type": "stream",
628 | "text": [
629 | "100%|██████████| 10/10 [00:01<00:00, 6.49it/s]\n"
630 | ]
631 | },
632 | {
633 | "name": "stdout",
634 | "output_type": "stream",
635 | "text": [
636 | "3.36 3.37 3.36 3.38 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
637 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_02.png\n",
638 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_02.png\n",
639 | "tpred = 3.370\n",
640 | "terr(1 x pb_std) = (-0.010, +0.010)\n",
641 | "pick_class = 0\n",
642 | "pb, pb_std = (0.574, 0.156)\n",
643 | "#\n",
644 | "P pick: 3/10\n"
645 | ]
646 | },
647 | {
648 | "name": "stderr",
649 | "output_type": "stream",
650 | "text": [
651 | "100%|██████████| 10/10 [00:02<00:00, 3.62it/s]\n"
652 | ]
653 | },
654 | {
655 | "name": "stdout",
656 | "output_type": "stream",
657 | "text": [
658 | "3.36 3.29 3.24 3.37 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
659 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_03.png\n",
660 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_03.png\n",
661 | "tpred = 3.290\n",
662 | "terr(1 x pb_std) = (-0.050, +0.080)\n",
663 | "pick_class = 1\n",
664 | "pb, pb_std = (0.503, 0.209)\n",
665 | "#\n",
666 | "P pick: 4/10\n"
667 | ]
668 | },
669 | {
670 | "name": "stderr",
671 | "output_type": "stream",
672 | "text": [
673 | "100%|██████████| 10/10 [00:01<00:00, 5.29it/s]\n"
674 | ]
675 | },
676 | {
677 | "name": "stdout",
678 | "output_type": "stream",
679 | "text": [
680 | "3.36 3.13 3.13 3.14 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
681 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_04.png\n",
682 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_04.png\n",
683 | "tpred = 3.130\n",
684 | "terr(1 x pb_std) = (-0.000, +0.010)\n",
685 | "pick_class = 0\n",
686 | "pb, pb_std = (0.601, 0.132)\n",
687 | "#\n",
688 | "P pick: 5/10\n"
689 | ]
690 | },
691 | {
692 | "name": "stderr",
693 | "output_type": "stream",
694 | "text": [
695 | "100%|██████████| 10/10 [00:01<00:00, 5.20it/s]\n"
696 | ]
697 | },
698 | {
699 | "name": "stdout",
700 | "output_type": "stream",
701 | "text": [
702 | "3.36 3.4 3.34 3.42 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
703 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_05.png\n",
704 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_05.png\n",
705 | "tpred = 3.400\n",
706 | "terr(1 x pb_std) = (-0.060, +0.020)\n",
707 | "pick_class = 0\n",
708 | "pb, pb_std = (0.542, 0.147)\n",
709 | "#\n",
710 | "P pick: 6/10\n"
711 | ]
712 | },
713 | {
714 | "name": "stderr",
715 | "output_type": "stream",
716 | "text": [
717 | "100%|██████████| 10/10 [00:01<00:00, 5.85it/s]\n"
718 | ]
719 | },
720 | {
721 | "name": "stdout",
722 | "output_type": "stream",
723 | "text": [
724 | "3.36 3.25 3.16 3.4 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
725 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_06.png\n",
726 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_06.png\n",
727 | "tpred = 3.250\n",
728 | "terr(1 x pb_std) = (-0.090, +0.150)\n",
729 | "pick_class = 2\n",
730 | "pb, pb_std = (0.504, 0.311)\n",
731 | "#\n",
732 | "P pick: 7/10\n"
733 | ]
734 | },
735 | {
736 | "name": "stderr",
737 | "output_type": "stream",
738 | "text": [
739 | "100%|██████████| 10/10 [00:01<00:00, 6.52it/s]\n"
740 | ]
741 | },
742 | {
743 | "name": "stdout",
744 | "output_type": "stream",
745 | "text": [
746 | "3.36 3.28 3.27 3.3 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
747 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_07.png\n",
748 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_07.png\n",
749 | "tpred = 3.280\n",
750 | "terr(1 x pb_std) = (-0.010, +0.020)\n",
751 | "pick_class = 0\n",
752 | "pb, pb_std = (0.564, 0.156)\n",
753 | "#\n",
754 | "P pick: 8/10\n"
755 | ]
756 | },
757 | {
758 | "name": "stderr",
759 | "output_type": "stream",
760 | "text": [
761 | "100%|██████████| 10/10 [00:01<00:00, 6.17it/s]\n"
762 | ]
763 | },
764 | {
765 | "name": "stdout",
766 | "output_type": "stream",
767 | "text": [
768 | "3.36 3.53 3.52 3.54 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
769 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_08.png\n",
770 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_08.png\n",
771 | "tpred = 3.530\n",
772 | "terr(1 x pb_std) = (-0.010, +0.010)\n",
773 | "pick_class = 0\n",
774 | "pb, pb_std = (0.521, 0.131)\n",
775 | "#\n",
776 | "P pick: 9/10\n"
777 | ]
778 | },
779 | {
780 | "name": "stderr",
781 | "output_type": "stream",
782 | "text": [
783 | "100%|██████████| 10/10 [00:01<00:00, 6.31it/s]\n"
784 | ]
785 | },
786 | {
787 | "name": "stdout",
788 | "output_type": "stream",
789 | "text": [
790 | "3.36 3.37 3.35 3.38 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
791 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_09.png\n",
792 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_09.png\n",
793 | "tpred = 3.370\n",
794 | "terr(1 x pb_std) = (-0.020, +0.010)\n",
795 | "pick_class = 0\n",
796 | "pb, pb_std = (0.518, 0.194)\n",
797 | "#\n",
798 | "P pick: 10/10\n"
799 | ]
800 | },
801 | {
802 | "name": "stderr",
803 | "output_type": "stream",
804 | "text": [
805 | "100%|██████████| 10/10 [00:02<00:00, 4.44it/s]\n"
806 | ]
807 | },
808 | {
809 | "name": "stdout",
810 | "output_type": "stream",
811 | "text": [
812 | "3.36 3.66 3.62 3.73 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
813 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_10.png\n",
814 | "plotting predicted phase P: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_P_mc_10.png\n",
815 | "tpred = 3.660\n",
816 | "terr(1 x pb_std) = (-0.040, +0.070)\n",
817 | "pick_class = 1\n",
818 | "pb, pb_std = (0.517, 0.158)\n",
819 | "#\n",
820 | "S pick: 1/3\n"
821 | ]
822 | },
823 | {
824 | "name": "stderr",
825 | "output_type": "stream",
826 | "text": [
827 | "100%|██████████| 10/10 [00:01<00:00, 6.24it/s]\n"
828 | ]
829 | },
830 | {
831 | "name": "stdout",
832 | "output_type": "stream",
833 | "text": [
834 | "2.4 2.19 2.18 2.2 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
835 | "(480,) (480,)\n",
836 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_E_01.png\n",
837 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_N_01.png\n",
838 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_E_mc_01.png\n",
839 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_N_mc_01.png\n",
840 | "tpred = 2.190\n",
841 | "terr(1 x pb_std) = (-0.010, +0.010)\n",
842 | "pick_class = 0\n",
843 | "pb, pb_std = (0.504, 0.071)\n",
844 | "#\n",
845 | "S pick: 2/3\n"
846 | ]
847 | },
848 | {
849 | "name": "stderr",
850 | "output_type": "stream",
851 | "text": [
852 | "100%|██████████| 10/10 [00:01<00:00, 7.11it/s]\n"
853 | ]
854 | },
855 | {
856 | "name": "stdout",
857 | "output_type": "stream",
858 | "text": [
859 | "2.4 2.21 2.18 2.26 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
860 | "(480,) (480,)\n",
861 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_E_02.png\n",
862 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_N_02.png\n",
863 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_E_mc_02.png\n",
864 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_N_mc_02.png\n",
865 | "tpred = 2.210\n",
866 | "terr(1 x pb_std) = (-0.030, +0.050)\n",
867 | "pick_class = 0\n",
868 | "pb, pb_std = (0.511, 0.144)\n",
869 | "#\n",
870 | "S pick: 3/3\n"
871 | ]
872 | },
873 | {
874 | "name": "stderr",
875 | "output_type": "stream",
876 | "text": [
877 | "100%|██████████| 10/10 [00:01<00:00, 8.20it/s]\n"
878 | ]
879 | },
880 | {
881 | "name": "stdout",
882 | "output_type": "stream",
883 | "text": [
884 | "2.4 2.47 2.45 2.49 out_CX_20140401/20140401T020000_20140401T030000 PB02\n",
885 | "(480,) (480,)\n",
886 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_E_03.png\n",
887 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_N_03.png\n",
888 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_E_mc_03.png\n",
889 | "plotting predicted phase S: out_CX_20140401/20140401T020000_20140401T030000/pick_plots/PB02_S_N_mc_03.png\n",
890 | "tpred = 2.470\n",
891 | "terr(1 x pb_std) = (-0.020, +0.020)\n",
892 | "pick_class = 0\n",
893 | "pb, pb_std = (0.505, 0.135)\n"
894 | ]
895 | }
896 | ],
897 | "source": [
898 | "# run phase picking\n",
899 | "dpp_model.run_picking(dpp_config, dpp_data, save_plots=True, save_stats=True, save_picks=False)"
900 | ]
901 | },
902 | {
903 | "cell_type": "markdown",
904 | "metadata": {},
905 | "source": [
906 | "## 4. Plot predicted phases"
907 | ]
908 | },
909 | {
910 | "cell_type": "code",
911 | "execution_count": 8,
912 | "metadata": {},
913 | "outputs": [
914 | {
915 | "name": "stdout",
916 | "output_type": "stream",
917 | "text": [
918 | "creating plots...\n",
919 | "1 PB01 Z 2014-04-01T01:59:59.998393Z 2014-04-01T02:59:59.998393Z\n",
920 | "1 PB01 E 2014-04-01T01:59:59.998394Z 2014-04-01T02:59:59.998394Z\n",
921 | "1 PB02 Z 2014-04-01T01:59:59.998393Z 2014-04-01T02:59:59.998393Z\n",
922 | "1 PB02 E 2014-04-01T01:59:59.998393Z 2014-04-01T02:59:59.998393Z\n"
923 | ]
924 | }
925 | ],
926 | "source": [
927 | "# plots\n",
928 | "util.plot_predicted_phases(dpp_config, dpp_data, dpp_model)\n",
929 | "# util.plot_predicted_phases(dpp_config, dpp_data, dpp_model, plot_probs=['P','S'], shift_probs=True)"
930 | ]
931 | }
932 | ],
933 | "metadata": {
934 | "jupytext": {
935 | "text_representation": {
936 | "extension": ".py",
937 | "format_name": "light",
938 | "format_version": "1.4",
939 | "jupytext_version": "1.2.4"
940 | }
941 | },
942 | "kernelspec": {
943 | "display_name": "Python 3",
944 | "language": "python",
945 | "name": "python3"
946 | },
947 | "language_info": {
948 | "codemirror_mode": {
949 | "name": "ipython",
950 | "version": 3
951 | },
952 | "file_extension": ".py",
953 | "mimetype": "text/x-python",
954 | "name": "python",
955 | "nbconvert_exporter": "python",
956 | "pygments_lexer": "ipython3",
957 | "version": "3.6.13"
958 | }
959 | },
960 | "nbformat": 4,
961 | "nbformat_minor": 2
962 | }
963 |
--------------------------------------------------------------------------------
/examples/run_dpp_download_data.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | #
3 | # This script applies DeepPhasePick on seismic data downloaded using the FDSN web service client for ObsPy.
4 | #
5 | # Author: Hugo Soto Parada (June, 2021)
6 | # Contact: soto@gfz-potsdam.de, hugosotoparada@gmail.com
7 | #
8 | ########################################################################################################################################
9 |
10 | import os
11 | import config, data, model, util
12 | from obspy.clients.fdsn import Client
13 | import obspy.core as oc
14 |
15 | # 1. Configure DPP
16 | #
17 | # config
18 | util.init_session()
19 | dpp_config = config.Config()
20 | dpp_config.set_trigger(pthres_p=[0.9, 0.001], pthres_s=[0.9, 0.001])
21 | dpp_config.set_picking(mcd_iter=10, run_mcd=True)
22 | # dpp_config.set_picking(run_mcd=False)
23 | #
24 | dpp_config.set_data(
25 | stas=['PB01', 'PB02'],
26 | net='CX',
27 | ch='HH',
28 | archive='sample_data/CX_20140401',
29 | opath='out_CX_20140401'
30 | )
31 | dpp_config.set_time(
32 | dt_iter=3600.,
33 | tstart="2014-04-01T02:00:00",
34 | tend="2014-04-01T03:00:00",
35 | )
36 |
37 | # 2. Download seismic data and read it into DPP
38 | #
39 | # download and archive seismic waveforms
40 | client = Client("GFZ")
41 | os.makedirs(f"{dpp_config.data['archive']}", exist_ok=True)
42 | tstart = oc.UTCDateTime(dpp_config.time['tstart'])
43 | tend = oc.UTCDateTime(dpp_config.time['tend'])
44 | #
45 | for sta in dpp_config.data['stas']:
46 | st = client.get_waveforms(network=dpp_config.data['net'], station=sta, location="*", channel=f"{dpp_config.data['ch']}?", starttime=tstart, endtime=tend)
47 | # print(st)
48 | st_name = f"{dpp_config.data['archive']}/{st[0].stats.network}.{st[0].stats.station}..{st[0].stats.channel[:-1]}.mseed"
49 | print(f"writing stream {st_name}...")
50 | st.write(st_name, format="MSEED")
51 | #
52 | # data
53 | dpp_data = data.Data()
54 | dpp_data.read_from_directory(dpp_config)
55 | #
56 | # for k in dpp_data.data:
57 | # print(k, dpp_data.data[k])
58 |
59 | # 3. Run phase detection and picking
60 | #
61 | # model
62 | # dpp_model = model.Model(verbose=False)
63 | dpp_model = model.Model(verbose=False, version_pick_P="20201002_2", version_pick_S="20201002_2")
64 | #
65 | # print(dpp_model.model_detection['best_model'].summary())
66 | # print(dpp_model.model_picking_P['best_model'].summary())
67 | # print(dpp_model.model_picking_S['best_model'].summary())
68 | #
69 | # run phase detection
70 | dpp_model.run_detection(dpp_config, dpp_data, save_dets=False, save_data=False)
71 | #
72 | # run phase picking
73 | dpp_model.run_picking(dpp_config, dpp_data, save_plots=True, save_stats=True, save_picks=False)
74 |
75 | # 4. Plot predicted phases
76 | #
77 | # plots
78 | util.plot_predicted_phases(dpp_config, dpp_data, dpp_model)
79 | # util.plot_predicted_phases(dpp_config, dpp_data, dpp_model, plot_probs=['P','S'], shift_probs=True)
80 |
81 |
--------------------------------------------------------------------------------
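The download loop above writes one miniSEED file per station using the NET.STA..CH.mseed naming that also appears under sample_data/. A minimal sanity-check sketch (not part of the package; the directory literal simply matches the archive set in set_data() above) for inspecting those files with ObsPy before running DPP:

import glob
import obspy

# list and inspect each miniSEED file written by the download loop above
for fname in sorted(glob.glob("sample_data/CX_20140401/*.mseed")):
    st = obspy.read(fname)
    for tr in st:
        print(fname, tr.id, tr.stats.starttime, tr.stats.endtime, tr.stats.sampling_rate, "Hz")

--------------------------------------------------------------------------------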
/examples/run_dpp_from_archive.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "### This example applies DeepPhasePick on seismic data stored in a archive directory structured as: archive/YY/NET/STA/CH.
Here YY is year, NET is the network code, STA is the station code and CH is the channel code (e.g., HH) of the seismic streams."
8 | ]
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": 1,
13 | "metadata": {},
14 | "outputs": [],
15 | "source": [
16 | "import os\n",
17 | "os.chdir('../')\n",
18 | "import config, data, model, util "
19 | ]
20 | },
21 | {
22 | "cell_type": "markdown",
23 | "metadata": {},
24 | "source": [
25 | "## 1. Configure DPP"
26 | ]
27 | },
28 | {
29 | "cell_type": "code",
30 | "execution_count": 2,
31 | "metadata": {},
32 | "outputs": [
33 | {
34 | "name": "stdout",
35 | "output_type": "stream",
36 | "text": [
37 | "__pycache__/ removed\n",
38 | "~/.nv/ not found, continuing...\n"
39 | ]
40 | }
41 | ],
42 | "source": [
43 | "# config\n",
44 | "util.init_session()\n",
45 | "dpp_config = config.Config()\n",
46 | "dpp_config.set_trigger(pthres_p=[0.9, 0.001], pthres_s=[0.9, 0.001])\n",
47 | "# dpp_config.set_picking(mcd_iter=10, run_mcd=True)\n",
48 | "dpp_config.set_picking(run_mcd=False)\n",
49 | "#\n",
50 | "dpp_config.set_data(\n",
51 | " stas=['PB01', 'PB02'],\n",
52 | " net='CX',\n",
53 | " ch='HH',\n",
54 | " archive='sample_data/archive',\n",
55 | " opath='out_archive'\n",
56 | ")\n",
57 | "dpp_config.set_time(\n",
58 | " dt_iter=3600.,\n",
59 | " #\n",
60 | " tstart = \"2014-05-01T00:00:00\",\n",
61 | " tend = \"2014-05-01T12:00:00\", \n",
62 | ")"
63 | ]
64 | },
65 | {
66 | "cell_type": "markdown",
67 | "metadata": {},
68 | "source": [
69 | "## 2. Read seismic data into DPP"
70 | ]
71 | },
72 | {
73 | "cell_type": "code",
74 | "execution_count": 3,
75 | "metadata": {
76 | "scrolled": true
77 | },
78 | "outputs": [
79 | {
80 | "name": "stdout",
81 | "output_type": "stream",
82 | "text": [
83 | "#\n",
84 | "time windows (12) for iteration over continuous waveforms:\n",
85 | "[UTCDateTime(2014, 5, 1, 0, 0), UTCDateTime(2014, 5, 1, 1, 0)]\n",
86 | "[UTCDateTime(2014, 5, 1, 1, 0), UTCDateTime(2014, 5, 1, 2, 0)]\n",
87 | "[UTCDateTime(2014, 5, 1, 2, 0), UTCDateTime(2014, 5, 1, 3, 0)]\n",
88 | "[UTCDateTime(2014, 5, 1, 3, 0), UTCDateTime(2014, 5, 1, 4, 0)]\n",
89 | "[UTCDateTime(2014, 5, 1, 4, 0), UTCDateTime(2014, 5, 1, 5, 0)]\n",
90 | "[UTCDateTime(2014, 5, 1, 5, 0), UTCDateTime(2014, 5, 1, 6, 0)]\n",
91 | "[UTCDateTime(2014, 5, 1, 6, 0), UTCDateTime(2014, 5, 1, 7, 0)]\n",
92 | "[UTCDateTime(2014, 5, 1, 7, 0), UTCDateTime(2014, 5, 1, 8, 0)]\n",
93 | "[UTCDateTime(2014, 5, 1, 8, 0), UTCDateTime(2014, 5, 1, 9, 0)]\n",
94 | "[UTCDateTime(2014, 5, 1, 9, 0), UTCDateTime(2014, 5, 1, 10, 0)]\n",
95 | "[UTCDateTime(2014, 5, 1, 10, 0), UTCDateTime(2014, 5, 1, 11, 0)]\n",
96 | "[UTCDateTime(2014, 5, 1, 11, 0), UTCDateTime(2014, 5, 1, 12, 0)]\n",
97 | "\n",
98 | "\n",
99 | "retrieving seismic waveforms for stations:\n",
100 | "['PB01', 'PB02']\n",
101 | "seismic data found for: net = CX, sta = PB01, st_count = 0, st_arg = 0\n",
102 | "['sample_data/archive/2014/CX/PB01/HHE.D/CX.PB01..HHE.D.2014.121', 'sample_data/archive/2014/CX/PB01/HHZ.D/CX.PB01..HHZ.D.2014.121', 'sample_data/archive/2014/CX/PB01/HHN.D/CX.PB01..HHN.D.2014.121']\n",
103 | "sample_data/archive/2014/CX/PB01/HHE.D/CX.PB01..HHE.D.2014.121\n",
104 | "sample_data/archive/2014/CX/PB01/HHZ.D/CX.PB01..HHZ.D.2014.121\n",
105 | "sample_data/archive/2014/CX/PB01/HHN.D/CX.PB01..HHN.D.2014.121\n",
106 | "seismic data found for: net = CX, sta = PB02, st_count = 3, st_arg = 0\n",
107 | "['sample_data/archive/2014/CX/PB02/HHE.D/CX.PB02..HHE.D.2014.121', 'sample_data/archive/2014/CX/PB02/HHZ.D/CX.PB02..HHZ.D.2014.121', 'sample_data/archive/2014/CX/PB02/HHN.D/CX.PB02..HHN.D.2014.121']\n",
108 | "sample_data/archive/2014/CX/PB02/HHE.D/CX.PB02..HHE.D.2014.121\n",
109 | "sample_data/archive/2014/CX/PB02/HHZ.D/CX.PB02..HHZ.D.2014.121\n",
110 | "sample_data/archive/2014/CX/PB02/HHN.D/CX.PB02..HHN.D.2014.121\n",
111 | "6 Trace(s) in Stream:\n",
112 | "CX.PB01..HHE | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples\n",
113 | "CX.PB01..HHZ | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples\n",
114 | "CX.PB01..HHN | 2014-04-30T23:59:59.998391Z - 2014-05-01T11:59:59.998391Z | 100.0 Hz, 4320001 samples\n",
115 | "CX.PB02..HHE | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples\n",
116 | "CX.PB02..HHZ | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples\n",
117 | "CX.PB02..HHN | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples\n",
118 | "#\n",
119 | "processing raw stream data...\n",
120 | "detrend...\n",
121 | "resampling...\n",
122 | "CX.PB01..HHE | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples --> skipped, already sampled at 100.0 Hz\n",
123 | "CX.PB01..HHZ | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples --> skipped, already sampled at 100.0 Hz\n",
124 | "CX.PB01..HHN | 2014-04-30T23:59:59.998391Z - 2014-05-01T11:59:59.998391Z | 100.0 Hz, 4320001 samples --> skipped, already sampled at 100.0 Hz\n",
125 | "CX.PB02..HHE | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples --> skipped, already sampled at 100.0 Hz\n",
126 | "CX.PB02..HHZ | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples --> skipped, already sampled at 100.0 Hz\n",
127 | "CX.PB02..HHN | 2014-04-30T23:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 4320001 samples --> skipped, already sampled at 100.0 Hz\n",
128 | "merging...\n",
129 | "slicing...\n",
130 | "slicing...\n",
131 | "slicing...\n",
132 | "slicing...\n",
133 | "slicing...\n",
134 | "slicing...\n",
135 | "slicing...\n",
136 | "slicing...\n",
137 | "slicing...\n",
138 | "slicing...\n",
139 | "slicing...\n",
140 | "slicing...\n",
141 | "1 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 0, 0), UTCDateTime(2014, 5, 1, 1, 0)], 'opath': 'out_archive/20140501T000000_20140501T010000'}\n",
142 | "2 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 1, 0), UTCDateTime(2014, 5, 1, 2, 0)], 'opath': 'out_archive/20140501T010000_20140501T020000'}\n",
143 | "3 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 2, 0), UTCDateTime(2014, 5, 1, 3, 0)], 'opath': 'out_archive/20140501T020000_20140501T030000'}\n",
144 | "4 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 3, 0), UTCDateTime(2014, 5, 1, 4, 0)], 'opath': 'out_archive/20140501T030000_20140501T040000'}\n",
145 | "5 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 4, 0), UTCDateTime(2014, 5, 1, 5, 0)], 'opath': 'out_archive/20140501T040000_20140501T050000'}\n",
146 | "6 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 5, 0), UTCDateTime(2014, 5, 1, 6, 0)], 'opath': 'out_archive/20140501T050000_20140501T060000'}\n",
147 | "7 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 6, 0), UTCDateTime(2014, 5, 1, 7, 0)], 'opath': 'out_archive/20140501T060000_20140501T070000'}\n",
148 | "8 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 7, 0), UTCDateTime(2014, 5, 1, 8, 0)], 'opath': 'out_archive/20140501T070000_20140501T080000'}\n",
149 | "9 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 8, 0), UTCDateTime(2014, 5, 1, 9, 0)], 'opath': 'out_archive/20140501T080000_20140501T090000'}\n",
150 | "10 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 9, 0), UTCDateTime(2014, 5, 1, 10, 0)], 'opath': 'out_archive/20140501T090000_20140501T100000'}\n",
151 | "11 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 10, 0), UTCDateTime(2014, 5, 1, 11, 0)], 'opath': 'out_archive/20140501T100000_20140501T110000'}\n",
152 | "12 {'st': {'PB01': , 'PB02': }, 'twin': [UTCDateTime(2014, 5, 1, 11, 0), UTCDateTime(2014, 5, 1, 12, 0)], 'opath': 'out_archive/20140501T110000_20140501T120000'}\n"
153 | ]
154 | }
155 | ],
156 | "source": [
157 | "# data\n",
158 | "dpp_data = data.Data()\n",
159 | "dpp_data.read_from_archive(dpp_config)\n",
160 | "#\n",
161 | "for k in dpp_data.data:\n",
162 | " print(k, dpp_data.data[k])"
163 | ]
164 | },
165 | {
166 | "cell_type": "markdown",
167 | "metadata": {},
168 | "source": [
169 | "## 3. Run phase detection and picking"
170 | ]
171 | },
172 | {
173 | "cell_type": "code",
174 | "execution_count": 4,
175 | "metadata": {
176 | "scrolled": true
177 | },
178 | "outputs": [],
179 | "source": [
180 | "# model\n",
181 | "dpp_model = model.Model(verbose=False)\n",
182 | "# dpp_model = model.Model(verbose=False, version_pick_P=\"20201002_2\", version_pick_S=\"20201002_2\")\n",
183 | "#\n",
184 | "# print(dpp_model.model_detection['best_model'].summary())\n",
185 | "# print(dpp_model.model_picking_P['best_model'].summary())\n",
186 | "# print(dpp_model.model_picking_S['best_model'].summary())"
187 | ]
188 | },
189 | {
190 | "cell_type": "code",
191 | "execution_count": 5,
192 | "metadata": {
193 | "scrolled": true
194 | },
195 | "outputs": [
196 | {
197 | "name": "stdout",
198 | "output_type": "stream",
199 | "text": [
200 | "#\n",
201 | "Calculating predictions for stream: CX.PB01..HH?...\n",
202 | "strimming stream: 2, 2\n",
203 | "720/720 [==============================] - 28s 39ms/step\n",
204 | "3 Trace(s) in Stream:\n",
205 | "CX.PB01..HHE | 2014-04-30T23:59:59.998393Z - 2014-05-01T00:59:59.998393Z | 100.0 Hz, 360001 samples\n",
206 | "CX.PB01..HHN | 2014-04-30T23:59:59.998391Z - 2014-05-01T00:59:59.998391Z | 100.0 Hz, 360001 samples\n",
207 | "CX.PB01..HHZ | 2014-04-30T23:59:59.998393Z - 2014-05-01T00:59:59.998393Z | 100.0 Hz, 360001 samples\n",
208 | "p_picks = 23, s_picks = 12\n",
209 | "#\n",
210 | "Calculating predictions for stream: CX.PB02..HH?...\n",
211 | "720/720 [==============================] - 35s 48ms/step\n",
212 | "3 Trace(s) in Stream:\n",
213 | "CX.PB02..HHE | 2014-04-30T23:59:59.998393Z - 2014-05-01T00:59:59.998393Z | 100.0 Hz, 360001 samples\n",
214 | "CX.PB02..HHN | 2014-04-30T23:59:59.998393Z - 2014-05-01T00:59:59.998393Z | 100.0 Hz, 360001 samples\n",
215 | "CX.PB02..HHZ | 2014-04-30T23:59:59.998393Z - 2014-05-01T00:59:59.998393Z | 100.0 Hz, 360001 samples\n",
216 | "p_picks = 12, s_picks = 11\n",
217 | "#\n",
218 | "Calculating predictions for stream: CX.PB01..HH?...\n",
219 | "strimming stream: 2, 2\n",
220 | "720/720 [==============================] - 37s 51ms/step\n",
221 | "3 Trace(s) in Stream:\n",
222 | "CX.PB01..HHE | 2014-05-01T00:59:59.998393Z - 2014-05-01T01:59:59.998393Z | 100.0 Hz, 360001 samples\n",
223 | "CX.PB01..HHN | 2014-05-01T00:59:59.998391Z - 2014-05-01T01:59:59.998391Z | 100.0 Hz, 360001 samples\n",
224 | "CX.PB01..HHZ | 2014-05-01T00:59:59.998393Z - 2014-05-01T01:59:59.998393Z | 100.0 Hz, 360001 samples\n",
225 | "p_picks = 25, s_picks = 19\n",
226 | "#\n",
227 | "Calculating predictions for stream: CX.PB02..HH?...\n",
228 | "720/720 [==============================] - 40s 56ms/step\n",
229 | "3 Trace(s) in Stream:\n",
230 | "CX.PB02..HHE | 2014-05-01T00:59:59.998393Z - 2014-05-01T01:59:59.998393Z | 100.0 Hz, 360001 samples\n",
231 | "CX.PB02..HHN | 2014-05-01T00:59:59.998393Z - 2014-05-01T01:59:59.998393Z | 100.0 Hz, 360001 samples\n",
232 | "CX.PB02..HHZ | 2014-05-01T00:59:59.998393Z - 2014-05-01T01:59:59.998393Z | 100.0 Hz, 360001 samples\n",
233 | "p_picks = 21, s_picks = 21\n",
234 | "#\n",
235 | "Calculating predictions for stream: CX.PB01..HH?...\n",
236 | "strimming stream: 2, 2\n",
237 | "720/720 [==============================] - 39s 54ms/step\n",
238 | "3 Trace(s) in Stream:\n",
239 | "CX.PB01..HHE | 2014-05-01T01:59:59.998393Z - 2014-05-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
240 | "CX.PB01..HHN | 2014-05-01T01:59:59.998391Z - 2014-05-01T02:59:59.998391Z | 100.0 Hz, 360001 samples\n",
241 | "CX.PB01..HHZ | 2014-05-01T01:59:59.998393Z - 2014-05-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
242 | "p_picks = 18, s_picks = 6\n",
243 | "#\n",
244 | "Calculating predictions for stream: CX.PB02..HH?...\n",
245 | "720/720 [==============================] - 38s 53ms/step\n",
246 | "3 Trace(s) in Stream:\n",
247 | "CX.PB02..HHE | 2014-05-01T01:59:59.998393Z - 2014-05-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
248 | "CX.PB02..HHN | 2014-05-01T01:59:59.998393Z - 2014-05-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
249 | "CX.PB02..HHZ | 2014-05-01T01:59:59.998393Z - 2014-05-01T02:59:59.998393Z | 100.0 Hz, 360001 samples\n",
250 | "p_picks = 13, s_picks = 6\n",
251 | "#\n",
252 | "Calculating predictions for stream: CX.PB01..HH?...\n",
253 | "strimming stream: 2, 2\n",
254 | "720/720 [==============================] - 41s 57ms/step\n",
255 | "3 Trace(s) in Stream:\n",
256 | "CX.PB01..HHE | 2014-05-01T02:59:59.998393Z - 2014-05-01T03:59:59.998393Z | 100.0 Hz, 360001 samples\n",
257 | "CX.PB01..HHN | 2014-05-01T02:59:59.998391Z - 2014-05-01T03:59:59.998391Z | 100.0 Hz, 360001 samples\n",
258 | "CX.PB01..HHZ | 2014-05-01T02:59:59.998393Z - 2014-05-01T03:59:59.998393Z | 100.0 Hz, 360001 samples\n",
259 | "p_picks = 14, s_picks = 4\n",
260 | "#\n",
261 | "Calculating predictions for stream: CX.PB02..HH?...\n",
262 | "720/720 [==============================] - 39s 54ms/step\n",
263 | "3 Trace(s) in Stream:\n",
264 | "CX.PB02..HHE | 2014-05-01T02:59:59.998393Z - 2014-05-01T03:59:59.998393Z | 100.0 Hz, 360001 samples\n",
265 | "CX.PB02..HHN | 2014-05-01T02:59:59.998393Z - 2014-05-01T03:59:59.998393Z | 100.0 Hz, 360001 samples\n",
266 | "CX.PB02..HHZ | 2014-05-01T02:59:59.998393Z - 2014-05-01T03:59:59.998393Z | 100.0 Hz, 360001 samples\n",
267 | "p_picks = 7, s_picks = 3\n",
268 | "#\n",
269 | "Calculating predictions for stream: CX.PB01..HH?...\n",
270 | "strimming stream: 2, 2\n",
271 | "720/720 [==============================] - 39s 55ms/step\n",
272 | "3 Trace(s) in Stream:\n",
273 | "CX.PB01..HHE | 2014-05-01T03:59:59.998393Z - 2014-05-01T04:59:59.998393Z | 100.0 Hz, 360001 samples\n",
274 | "CX.PB01..HHN | 2014-05-01T03:59:59.998391Z - 2014-05-01T04:59:59.998391Z | 100.0 Hz, 360001 samples\n",
275 | "CX.PB01..HHZ | 2014-05-01T03:59:59.998393Z - 2014-05-01T04:59:59.998393Z | 100.0 Hz, 360001 samples\n",
276 | "p_picks = 21, s_picks = 14\n",
277 | "#\n",
278 | "Calculating predictions for stream: CX.PB02..HH?...\n",
279 | "720/720 [==============================] - 32s 44ms/step\n",
280 | "3 Trace(s) in Stream:\n",
281 | "CX.PB02..HHE | 2014-05-01T03:59:59.998393Z - 2014-05-01T04:59:59.998393Z | 100.0 Hz, 360001 samples\n",
282 | "CX.PB02..HHN | 2014-05-01T03:59:59.998393Z - 2014-05-01T04:59:59.998393Z | 100.0 Hz, 360001 samples\n",
283 | "CX.PB02..HHZ | 2014-05-01T03:59:59.998393Z - 2014-05-01T04:59:59.998393Z | 100.0 Hz, 360001 samples\n",
284 | "p_picks = 19, s_picks = 10\n",
285 | "#\n",
286 | "Calculating predictions for stream: CX.PB01..HH?...\n",
287 | "strimming stream: 2, 2\n",
288 | "720/720 [==============================] - 40s 56ms/step\n",
289 | "3 Trace(s) in Stream:\n",
290 | "CX.PB01..HHE | 2014-05-01T04:59:59.998393Z - 2014-05-01T05:59:59.998393Z | 100.0 Hz, 360001 samples\n",
291 | "CX.PB01..HHN | 2014-05-01T04:59:59.998391Z - 2014-05-01T05:59:59.998391Z | 100.0 Hz, 360001 samples\n",
292 | "CX.PB01..HHZ | 2014-05-01T04:59:59.998393Z - 2014-05-01T05:59:59.998393Z | 100.0 Hz, 360001 samples\n",
293 | "p_picks = 19, s_picks = 10\n",
294 | "#\n",
295 | "Calculating predictions for stream: CX.PB02..HH?...\n",
296 | "720/720 [==============================] - 38s 52ms/step\n",
297 | "3 Trace(s) in Stream:\n",
298 | "CX.PB02..HHE | 2014-05-01T04:59:59.998393Z - 2014-05-01T05:59:59.998393Z | 100.0 Hz, 360001 samples\n",
299 | "CX.PB02..HHN | 2014-05-01T04:59:59.998393Z - 2014-05-01T05:59:59.998393Z | 100.0 Hz, 360001 samples\n",
300 | "CX.PB02..HHZ | 2014-05-01T04:59:59.998393Z - 2014-05-01T05:59:59.998393Z | 100.0 Hz, 360001 samples\n",
301 | "p_picks = 8, s_picks = 7\n",
302 | "#\n",
303 | "Calculating predictions for stream: CX.PB01..HH?...\n",
304 | "strimming stream: 2, 2\n",
305 | "720/720 [==============================] - 34s 48ms/step\n",
306 | "3 Trace(s) in Stream:\n",
307 | "CX.PB01..HHE | 2014-05-01T05:59:59.998393Z - 2014-05-01T06:59:59.998393Z | 100.0 Hz, 360001 samples\n",
308 | "CX.PB01..HHN | 2014-05-01T05:59:59.998391Z - 2014-05-01T06:59:59.998391Z | 100.0 Hz, 360001 samples\n",
309 | "CX.PB01..HHZ | 2014-05-01T05:59:59.998393Z - 2014-05-01T06:59:59.998393Z | 100.0 Hz, 360001 samples\n",
310 | "p_picks = 21, s_picks = 10\n",
311 | "#\n",
312 | "Calculating predictions for stream: CX.PB02..HH?...\n",
313 | "720/720 [==============================] - 36s 50ms/step\n",
314 | "3 Trace(s) in Stream:\n",
315 | "CX.PB02..HHE | 2014-05-01T05:59:59.998393Z - 2014-05-01T06:59:59.998393Z | 100.0 Hz, 360001 samples\n",
316 | "CX.PB02..HHN | 2014-05-01T05:59:59.998393Z - 2014-05-01T06:59:59.998393Z | 100.0 Hz, 360001 samples\n",
317 | "CX.PB02..HHZ | 2014-05-01T05:59:59.998393Z - 2014-05-01T06:59:59.998393Z | 100.0 Hz, 360001 samples\n",
318 | "p_picks = 8, s_picks = 5\n",
319 | "#\n",
320 | "Calculating predictions for stream: CX.PB01..HH?...\n",
321 | "strimming stream: 2, 2\n",
322 | "720/720 [==============================] - 37s 51ms/step\n",
323 | "3 Trace(s) in Stream:\n",
324 | "CX.PB01..HHE | 2014-05-01T06:59:59.998393Z - 2014-05-01T07:59:59.998393Z | 100.0 Hz, 360001 samples\n",
325 | "CX.PB01..HHN | 2014-05-01T06:59:59.998391Z - 2014-05-01T07:59:59.998391Z | 100.0 Hz, 360001 samples\n",
326 | "CX.PB01..HHZ | 2014-05-01T06:59:59.998393Z - 2014-05-01T07:59:59.998393Z | 100.0 Hz, 360001 samples\n",
327 | "p_picks = 18, s_picks = 13\n",
328 | "#\n",
329 | "Calculating predictions for stream: CX.PB02..HH?...\n",
330 | "720/720 [==============================] - 38s 53ms/step\n",
331 | "3 Trace(s) in Stream:\n",
332 | "CX.PB02..HHE | 2014-05-01T06:59:59.998393Z - 2014-05-01T07:59:59.998393Z | 100.0 Hz, 360001 samples\n",
333 | "CX.PB02..HHN | 2014-05-01T06:59:59.998393Z - 2014-05-01T07:59:59.998393Z | 100.0 Hz, 360001 samples\n",
334 | "CX.PB02..HHZ | 2014-05-01T06:59:59.998393Z - 2014-05-01T07:59:59.998393Z | 100.0 Hz, 360001 samples\n",
335 | "p_picks = 14, s_picks = 8\n",
336 | "#\n",
337 | "Calculating predictions for stream: CX.PB01..HH?...\n",
338 | "strimming stream: 2, 2\n",
339 | "720/720 [==============================] - 43s 60ms/step\n",
340 | "3 Trace(s) in Stream:\n",
341 | "CX.PB01..HHE | 2014-05-01T07:59:59.998393Z - 2014-05-01T08:59:59.998393Z | 100.0 Hz, 360001 samples\n",
342 | "CX.PB01..HHN | 2014-05-01T07:59:59.998391Z - 2014-05-01T08:59:59.998391Z | 100.0 Hz, 360001 samples\n",
343 | "CX.PB01..HHZ | 2014-05-01T07:59:59.998393Z - 2014-05-01T08:59:59.998393Z | 100.0 Hz, 360001 samples\n",
344 | "p_picks = 9, s_picks = 6\n",
345 | "#\n",
346 | "Calculating predictions for stream: CX.PB02..HH?...\n",
347 | "720/720 [==============================] - 36s 49ms/step\n",
348 | "3 Trace(s) in Stream:\n",
349 | "CX.PB02..HHE | 2014-05-01T07:59:59.998393Z - 2014-05-01T08:59:59.998393Z | 100.0 Hz, 360001 samples\n",
350 | "CX.PB02..HHN | 2014-05-01T07:59:59.998393Z - 2014-05-01T08:59:59.998393Z | 100.0 Hz, 360001 samples\n",
351 | "CX.PB02..HHZ | 2014-05-01T07:59:59.998393Z - 2014-05-01T08:59:59.998393Z | 100.0 Hz, 360001 samples\n",
352 | "p_picks = 8, s_picks = 4\n",
353 | "#\n",
354 | "Calculating predictions for stream: CX.PB01..HH?...\n",
355 | "strimming stream: 2, 2\n"
356 | ]
357 | },
358 | {
359 | "name": "stdout",
360 | "output_type": "stream",
361 | "text": [
362 | "720/720 [==============================] - 34s 48ms/step\n",
363 | "3 Trace(s) in Stream:\n",
364 | "CX.PB01..HHE | 2014-05-01T08:59:59.998393Z - 2014-05-01T09:59:59.998393Z | 100.0 Hz, 360001 samples\n",
365 | "CX.PB01..HHN | 2014-05-01T08:59:59.998391Z - 2014-05-01T09:59:59.998391Z | 100.0 Hz, 360001 samples\n",
366 | "CX.PB01..HHZ | 2014-05-01T08:59:59.998393Z - 2014-05-01T09:59:59.998393Z | 100.0 Hz, 360001 samples\n",
367 | "p_picks = 20, s_picks = 10\n",
368 | "#\n",
369 | "Calculating predictions for stream: CX.PB02..HH?...\n",
370 | "720/720 [==============================] - 30s 42ms/step\n",
371 | "3 Trace(s) in Stream:\n",
372 | "CX.PB02..HHE | 2014-05-01T08:59:59.998393Z - 2014-05-01T09:59:59.998393Z | 100.0 Hz, 360001 samples\n",
373 | "CX.PB02..HHN | 2014-05-01T08:59:59.998393Z - 2014-05-01T09:59:59.998393Z | 100.0 Hz, 360001 samples\n",
374 | "CX.PB02..HHZ | 2014-05-01T08:59:59.998393Z - 2014-05-01T09:59:59.998393Z | 100.0 Hz, 360001 samples\n",
375 | "p_picks = 14, s_picks = 9\n",
376 | "#\n",
377 | "Calculating predictions for stream: CX.PB01..HH?...\n",
378 | "strimming stream: 2, 2\n",
379 | "720/720 [==============================] - 32s 44ms/step\n",
380 | "3 Trace(s) in Stream:\n",
381 | "CX.PB01..HHE | 2014-05-01T09:59:59.998393Z - 2014-05-01T10:59:59.998393Z | 100.0 Hz, 360001 samples\n",
382 | "CX.PB01..HHN | 2014-05-01T09:59:59.998391Z - 2014-05-01T10:59:59.998391Z | 100.0 Hz, 360001 samples\n",
383 | "CX.PB01..HHZ | 2014-05-01T09:59:59.998393Z - 2014-05-01T10:59:59.998393Z | 100.0 Hz, 360001 samples\n",
384 | "p_picks = 11, s_picks = 5\n",
385 | "#\n",
386 | "Calculating predictions for stream: CX.PB02..HH?...\n",
387 | "720/720 [==============================] - 29s 40ms/step\n",
388 | "3 Trace(s) in Stream:\n",
389 | "CX.PB02..HHE | 2014-05-01T09:59:59.998393Z - 2014-05-01T10:59:59.998393Z | 100.0 Hz, 360001 samples\n",
390 | "CX.PB02..HHN | 2014-05-01T09:59:59.998393Z - 2014-05-01T10:59:59.998393Z | 100.0 Hz, 360001 samples\n",
391 | "CX.PB02..HHZ | 2014-05-01T09:59:59.998393Z - 2014-05-01T10:59:59.998393Z | 100.0 Hz, 360001 samples\n",
392 | "p_picks = 7, s_picks = 5\n",
393 | "#\n",
394 | "Calculating predictions for stream: CX.PB01..HH?...\n",
395 | "strimming stream: 2, 2\n",
396 | "720/720 [==============================] - 34s 47ms/step\n",
397 | "3 Trace(s) in Stream:\n",
398 | "CX.PB01..HHE | 2014-05-01T10:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 360001 samples\n",
399 | "CX.PB01..HHN | 2014-05-01T10:59:59.998391Z - 2014-05-01T11:59:59.998391Z | 100.0 Hz, 360001 samples\n",
400 | "CX.PB01..HHZ | 2014-05-01T10:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 360001 samples\n",
401 | "p_picks = 18, s_picks = 11\n",
402 | "#\n",
403 | "Calculating predictions for stream: CX.PB02..HH?...\n",
404 | "720/720 [==============================] - 32s 44ms/step\n",
405 | "3 Trace(s) in Stream:\n",
406 | "CX.PB02..HHE | 2014-05-01T10:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 360001 samples\n",
407 | "CX.PB02..HHN | 2014-05-01T10:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 360001 samples\n",
408 | "CX.PB02..HHZ | 2014-05-01T10:59:59.998393Z - 2014-05-01T11:59:59.998393Z | 100.0 Hz, 360001 samples\n",
409 | "p_picks = 17, s_picks = 11\n"
410 | ]
411 | }
412 | ],
413 | "source": [
414 | "# run phase detection\n",
415 | "dpp_model.run_detection(dpp_config, dpp_data, save_dets=False, save_data=False)"
416 | ]
417 | },
418 | {
419 | "cell_type": "code",
420 | "execution_count": 6,
421 | "metadata": {
422 | "scrolled": true
423 | },
424 | "outputs": [
425 | {
426 | "name": "stdout",
427 | "output_type": "stream",
428 | "text": [
429 | "#\n",
430 | "1, 2014-05-01T00:00:00.000000Z, 2014-05-01T01:00:00.000000Z, PB01\n",
431 | "triggered picks (P, S): 23, 12\n",
432 | "selected picks (P, S): 21, 9\n",
433 | "#\n",
434 | "1, 2014-05-01T00:00:00.000000Z, 2014-05-01T01:00:00.000000Z, PB02\n",
435 | "triggered picks (P, S): 12, 11\n",
436 | "selected picks (P, S): 8, 5\n",
437 | "#\n",
438 | "2, 2014-05-01T01:00:00.000000Z, 2014-05-01T02:00:00.000000Z, PB01\n",
439 | "triggered picks (P, S): 25, 19\n",
440 | "selected picks (P, S): 21, 12\n",
441 | "#\n",
442 | "2, 2014-05-01T01:00:00.000000Z, 2014-05-01T02:00:00.000000Z, PB02\n",
443 | "triggered picks (P, S): 21, 21\n",
444 | "selected picks (P, S): 18, 7\n",
445 | "#\n",
446 | "3, 2014-05-01T02:00:00.000000Z, 2014-05-01T03:00:00.000000Z, PB01\n",
447 | "triggered picks (P, S): 18, 6\n",
448 | "selected picks (P, S): 16, 4\n",
449 | "#\n",
450 | "3, 2014-05-01T02:00:00.000000Z, 2014-05-01T03:00:00.000000Z, PB02\n",
451 | "triggered picks (P, S): 13, 6\n",
452 | "selected picks (P, S): 10, 2\n",
453 | "#\n",
454 | "4, 2014-05-01T03:00:00.000000Z, 2014-05-01T04:00:00.000000Z, PB01\n",
455 | "triggered picks (P, S): 14, 4\n",
456 | "selected picks (P, S): 13, 3\n",
457 | "#\n",
458 | "4, 2014-05-01T03:00:00.000000Z, 2014-05-01T04:00:00.000000Z, PB02\n",
459 | "triggered picks (P, S): 7, 3\n",
460 | "selected picks (P, S): 7, 1\n",
461 | "#\n",
462 | "5, 2014-05-01T04:00:00.000000Z, 2014-05-01T05:00:00.000000Z, PB01\n",
463 | "triggered picks (P, S): 21, 14\n",
464 | "selected picks (P, S): 20, 9\n",
465 | "#\n",
466 | "5, 2014-05-01T04:00:00.000000Z, 2014-05-01T05:00:00.000000Z, PB02\n",
467 | "triggered picks (P, S): 19, 10\n",
468 | "selected picks (P, S): 17, 5\n",
469 | "#\n",
470 | "6, 2014-05-01T05:00:00.000000Z, 2014-05-01T06:00:00.000000Z, PB01\n",
471 | "triggered picks (P, S): 19, 10\n",
472 | "selected picks (P, S): 18, 4\n",
473 | "#\n",
474 | "6, 2014-05-01T05:00:00.000000Z, 2014-05-01T06:00:00.000000Z, PB02\n",
475 | "triggered picks (P, S): 8, 7\n",
476 | "selected picks (P, S): 7, 3\n",
477 | "#\n",
478 | "7, 2014-05-01T06:00:00.000000Z, 2014-05-01T07:00:00.000000Z, PB01\n",
479 | "triggered picks (P, S): 21, 10\n",
480 | "selected picks (P, S): 18, 5\n",
481 | "#\n",
482 | "7, 2014-05-01T06:00:00.000000Z, 2014-05-01T07:00:00.000000Z, PB02\n",
483 | "triggered picks (P, S): 8, 5\n",
484 | "selected picks (P, S): 6, 3\n",
485 | "#\n",
486 | "8, 2014-05-01T07:00:00.000000Z, 2014-05-01T08:00:00.000000Z, PB01\n",
487 | "triggered picks (P, S): 18, 13\n",
488 | "selected picks (P, S): 16, 7\n",
489 | "#\n",
490 | "8, 2014-05-01T07:00:00.000000Z, 2014-05-01T08:00:00.000000Z, PB02\n",
491 | "triggered picks (P, S): 14, 8\n",
492 | "selected picks (P, S): 10, 3\n",
493 | "#\n",
494 | "9, 2014-05-01T08:00:00.000000Z, 2014-05-01T09:00:00.000000Z, PB01\n",
495 | "triggered picks (P, S): 9, 6\n",
496 | "selected picks (P, S): 9, 1\n",
497 | "#\n",
498 | "9, 2014-05-01T08:00:00.000000Z, 2014-05-01T09:00:00.000000Z, PB02\n",
499 | "triggered picks (P, S): 8, 4\n",
500 | "selected picks (P, S): 8, 1\n",
501 | "#\n",
502 | "10, 2014-05-01T09:00:00.000000Z, 2014-05-01T10:00:00.000000Z, PB01\n",
503 | "triggered picks (P, S): 20, 10\n",
504 | "selected picks (P, S): 19, 5\n",
505 | "#\n",
506 | "10, 2014-05-01T09:00:00.000000Z, 2014-05-01T10:00:00.000000Z, PB02\n",
507 | "triggered picks (P, S): 14, 9\n",
508 | "selected picks (P, S): 12, 4\n",
509 | "#\n",
510 | "11, 2014-05-01T10:00:00.000000Z, 2014-05-01T11:00:00.000000Z, PB01\n",
511 | "triggered picks (P, S): 11, 5\n",
512 | "selected picks (P, S): 10, 2\n",
513 | "#\n",
514 | "11, 2014-05-01T10:00:00.000000Z, 2014-05-01T11:00:00.000000Z, PB02\n",
515 | "triggered picks (P, S): 7, 5\n",
516 | "selected picks (P, S): 5, 1\n",
517 | "#\n",
518 | "12, 2014-05-01T11:00:00.000000Z, 2014-05-01T12:00:00.000000Z, PB01\n",
519 | "triggered picks (P, S): 18, 11\n",
520 | "selected picks (P, S): 14, 6\n",
521 | "#\n",
522 | "12, 2014-05-01T11:00:00.000000Z, 2014-05-01T12:00:00.000000Z, PB02\n",
523 | "triggered picks (P, S): 17, 11\n",
524 | "selected picks (P, S): 12, 5\n"
525 | ]
526 | }
527 | ],
528 | "source": [
529 | "# run phase picking\n",
530 | "dpp_model.run_picking(dpp_config, dpp_data, save_plots=False, save_stats=True, save_picks=False)"
531 | ]
532 | },
533 | {
534 | "cell_type": "code",
535 | "execution_count": 7,
536 | "metadata": {
537 | "scrolled": true
538 | },
539 | "outputs": [
540 | {
541 | "name": "stdout",
542 | "output_type": "stream",
543 | "text": [
544 | "creating plots...\n",
545 | "1 PB01 Z 2014-04-30T23:59:59.998393Z 2014-05-01T00:59:59.998393Z\n",
546 | "1 PB01 E 2014-04-30T23:59:59.998393Z 2014-05-01T00:59:59.998393Z\n",
547 | "1 PB02 Z 2014-04-30T23:59:59.998393Z 2014-05-01T00:59:59.998393Z\n",
548 | "1 PB02 E 2014-04-30T23:59:59.998393Z 2014-05-01T00:59:59.998393Z\n",
549 | "2 PB01 Z 2014-05-01T00:59:59.998393Z 2014-05-01T01:59:59.998393Z\n",
550 | "2 PB01 E 2014-05-01T00:59:59.998393Z 2014-05-01T01:59:59.998393Z\n",
551 | "2 PB02 Z 2014-05-01T00:59:59.998393Z 2014-05-01T01:59:59.998393Z\n",
552 | "2 PB02 E 2014-05-01T00:59:59.998393Z 2014-05-01T01:59:59.998393Z\n",
553 | "3 PB01 Z 2014-05-01T01:59:59.998393Z 2014-05-01T02:59:59.998393Z\n",
554 | "3 PB01 E 2014-05-01T01:59:59.998393Z 2014-05-01T02:59:59.998393Z\n",
555 | "3 PB02 Z 2014-05-01T01:59:59.998393Z 2014-05-01T02:59:59.998393Z\n",
556 | "3 PB02 E 2014-05-01T01:59:59.998393Z 2014-05-01T02:59:59.998393Z\n",
557 | "4 PB01 Z 2014-05-01T02:59:59.998393Z 2014-05-01T03:59:59.998393Z\n",
558 | "4 PB01 E 2014-05-01T02:59:59.998393Z 2014-05-01T03:59:59.998393Z\n",
559 | "4 PB02 Z 2014-05-01T02:59:59.998393Z 2014-05-01T03:59:59.998393Z\n",
560 | "4 PB02 E 2014-05-01T02:59:59.998393Z 2014-05-01T03:59:59.998393Z\n",
561 | "5 PB01 Z 2014-05-01T03:59:59.998393Z 2014-05-01T04:59:59.998393Z\n",
562 | "5 PB01 E 2014-05-01T03:59:59.998393Z 2014-05-01T04:59:59.998393Z\n",
563 | "5 PB02 Z 2014-05-01T03:59:59.998393Z 2014-05-01T04:59:59.998393Z\n",
564 | "5 PB02 E 2014-05-01T03:59:59.998393Z 2014-05-01T04:59:59.998393Z\n",
565 | "6 PB01 Z 2014-05-01T04:59:59.998393Z 2014-05-01T05:59:59.998393Z\n",
566 | "6 PB01 E 2014-05-01T04:59:59.998393Z 2014-05-01T05:59:59.998393Z\n",
567 | "6 PB02 Z 2014-05-01T04:59:59.998393Z 2014-05-01T05:59:59.998393Z\n",
568 | "6 PB02 E 2014-05-01T04:59:59.998393Z 2014-05-01T05:59:59.998393Z\n",
569 | "7 PB01 Z 2014-05-01T05:59:59.998393Z 2014-05-01T06:59:59.998393Z\n",
570 | "7 PB01 E 2014-05-01T05:59:59.998393Z 2014-05-01T06:59:59.998393Z\n",
571 | "7 PB02 Z 2014-05-01T05:59:59.998393Z 2014-05-01T06:59:59.998393Z\n",
572 | "7 PB02 E 2014-05-01T05:59:59.998393Z 2014-05-01T06:59:59.998393Z\n",
573 | "8 PB01 Z 2014-05-01T06:59:59.998393Z 2014-05-01T07:59:59.998393Z\n",
574 | "8 PB01 E 2014-05-01T06:59:59.998393Z 2014-05-01T07:59:59.998393Z\n",
575 | "8 PB02 Z 2014-05-01T06:59:59.998393Z 2014-05-01T07:59:59.998393Z\n",
576 | "8 PB02 E 2014-05-01T06:59:59.998393Z 2014-05-01T07:59:59.998393Z\n",
577 | "9 PB01 Z 2014-05-01T07:59:59.998393Z 2014-05-01T08:59:59.998393Z\n",
578 | "9 PB01 E 2014-05-01T07:59:59.998393Z 2014-05-01T08:59:59.998393Z\n",
579 | "9 PB02 Z 2014-05-01T07:59:59.998393Z 2014-05-01T08:59:59.998393Z\n",
580 | "9 PB02 E 2014-05-01T07:59:59.998393Z 2014-05-01T08:59:59.998393Z\n",
581 | "10 PB01 Z 2014-05-01T08:59:59.998393Z 2014-05-01T09:59:59.998393Z\n",
582 | "10 PB01 E 2014-05-01T08:59:59.998393Z 2014-05-01T09:59:59.998393Z\n",
583 | "10 PB02 Z 2014-05-01T08:59:59.998393Z 2014-05-01T09:59:59.998393Z\n",
584 | "10 PB02 E 2014-05-01T08:59:59.998393Z 2014-05-01T09:59:59.998393Z\n",
585 | "11 PB01 Z 2014-05-01T09:59:59.998393Z 2014-05-01T10:59:59.998393Z\n",
586 | "11 PB01 E 2014-05-01T09:59:59.998393Z 2014-05-01T10:59:59.998393Z\n",
587 | "11 PB02 Z 2014-05-01T09:59:59.998393Z 2014-05-01T10:59:59.998393Z\n",
588 | "11 PB02 E 2014-05-01T09:59:59.998393Z 2014-05-01T10:59:59.998393Z\n",
589 | "12 PB01 Z 2014-05-01T10:59:59.998393Z 2014-05-01T11:59:59.998393Z\n",
590 | "12 PB01 E 2014-05-01T10:59:59.998393Z 2014-05-01T11:59:59.998393Z\n",
591 | "12 PB02 Z 2014-05-01T10:59:59.998393Z 2014-05-01T11:59:59.998393Z\n",
592 | "12 PB02 E 2014-05-01T10:59:59.998393Z 2014-05-01T11:59:59.998393Z\n"
593 | ]
594 | }
595 | ],
596 | "source": [
597 | "# plots\n",
598 | "# util.plot_predicted_phases(dpp_config, dpp_data, dpp_model)\n",
599 | "util.plot_predicted_phases(dpp_config, dpp_data, dpp_model, plot_probs=['P','S'], shift_probs=True)"
600 | ]
601 | }
602 | ],
603 | "metadata": {
604 | "jupytext": {
605 | "text_representation": {
606 | "extension": ".py",
607 | "format_name": "light",
608 | "format_version": "1.4",
609 | "jupytext_version": "1.2.4"
610 | }
611 | },
612 | "kernelspec": {
613 | "display_name": "Python 3",
614 | "language": "python",
615 | "name": "python3"
616 | },
617 | "language_info": {
618 | "codemirror_mode": {
619 | "name": "ipython",
620 | "version": 3
621 | },
622 | "file_extension": ".py",
623 | "mimetype": "text/x-python",
624 | "name": "python",
625 | "nbconvert_exporter": "python",
626 | "pygments_lexer": "ipython3",
627 | "version": "3.6.13"
628 | }
629 | },
630 | "nbformat": 4,
631 | "nbformat_minor": 2
632 | }
633 |
--------------------------------------------------------------------------------
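The notebook output above lists 12 hourly time windows derived from set_time(dt_iter=3600., tstart="2014-05-01T00:00:00", tend="2014-05-01T12:00:00"). A minimal sketch of that iteration logic (an assumption for illustration, not DeepPhasePick's own implementation), reproducing the printed windows:

import obspy.core as oc

tstart = oc.UTCDateTime("2014-05-01T00:00:00")
tend = oc.UTCDateTime("2014-05-01T12:00:00")
dt_iter = 3600.  # window length in seconds

# build consecutive [start, end] windows covering tstart..tend
twins, t = [], tstart
while t < tend:
    twins.append([t, t + dt_iter])
    t += dt_iter

print(f"time windows ({len(twins)}) for iteration over continuous waveforms:")
for twin in twins:
    print(twin)  # e.g. [UTCDateTime(2014, 5, 1, 0, 0), UTCDateTime(2014, 5, 1, 1, 0)]

--------------------------------------------------------------------------------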
/examples/run_dpp_from_archive.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | #
3 | # This script applies DeepPhasePick on seismic data stored in an archive directory structured as: archive/YY/NET/STA/CH
4 | # Here YY is year, NET is the network code, STA is the station code and CH is the channel code (e.g., HH) of the seismic streams.
5 | #
6 | # Author: Hugo Soto Parada (June, 2021)
7 | # Contact: soto@gfz-potsdam.de, hugosotoparada@gmail.com
8 | #
9 | ########################################################################################################################################
10 |
11 | import os
12 | import config, data, model, util
13 |
14 | # 1. Configure DPP
15 | #
16 | # config
17 | util.init_session()
18 | dpp_config = config.Config()
19 | dpp_config.set_trigger(pthres_p=[0.9, 0.001], pthres_s=[0.9, 0.001])
20 | # dpp_config.set_picking(mcd_iter=10, run_mcd=True)
21 | dpp_config.set_picking(run_mcd=False)
22 | #
23 | dpp_config.set_data(
24 | stas=['PB01', 'PB02'],
25 | net='CX',
26 | ch='HH',
27 | archive='sample_data/archive',
28 | opath='out_archive'
29 | )
30 | dpp_config.set_time(
31 | dt_iter=3600.,
32 | tstart="2014-05-01T00:00:00",
33 | tend="2014-05-01T12:00:00",
34 | )
35 |
36 | # 2. Read seismic data into DPP
37 | #
38 | # data
39 | dpp_data = data.Data()
40 | dpp_data.read_from_archive(dpp_config)
41 | #
42 | for k in dpp_data.data:
43 | print(k, dpp_data.data[k])
44 |
45 | # 3. Run phase detection and picking
46 | #
47 | # model
48 | dpp_model = model.Model(verbose=False)
49 | # dpp_model = model.Model(verbose=False, version_pick_P="20201002_2", version_pick_S="20201002_2")
50 | #
51 | print(dpp_model.model_detection['best_model'].summary())
52 | print(dpp_model.model_picking_P['best_model'].summary())
53 | print(dpp_model.model_picking_S['best_model'].summary())
54 | #
55 | # run phase detection
56 | dpp_model.run_detection(dpp_config, dpp_data, save_dets=False, save_data=False)
57 | #
58 | # run phase picking
59 | dpp_model.run_picking(dpp_config, dpp_data, save_plots=False, save_stats=True, save_picks=False)
60 |
61 | # 4. Plot predicted phases
62 | #
63 | # plots
64 | # util.plot_predicted_phases(dpp_config, dpp_data, dpp_model)
65 | util.plot_predicted_phases(dpp_config, dpp_data, dpp_model, plot_probs=['P','S'], shift_probs=True)
66 |
67 |
--------------------------------------------------------------------------------
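read_from_archive() expects the archive/YEAR/NET/STA/CHA.D layout described in the header above. A minimal sketch (an assumption inferred from the sample_data paths shipped with this repository, not code from data.py) showing how one day-file path in that layout is composed:

import obspy.core as oc

archive, net, sta, cha = "sample_data/archive", "CX", "PB01", "HHZ"
day = oc.UTCDateTime("2014-05-01")  # day-of-year 121 in 2014

# day files are named NET.STA..CHA.D.YEAR.JULDAY and stored under YEAR/NET/STA/CHA.D/
path = (
    f"{archive}/{day.year}/{net}/{sta}/{cha}.D/"
    f"{net}.{sta}..{cha}.D.{day.year}.{day.julday:03d}"
)
print(path)  # sample_data/archive/2014/CX/PB01/HHZ.D/CX.PB01..HHZ.D.2014.121

--------------------------------------------------------------------------------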
/examples/run_dpp_from_directory.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | #
3 | # This script applies DeepPhasePick on seismic data stored in an unstructured archive directory.
4 | #
5 | # Author: Hugo Soto Parada (June, 2021)
6 | # Contact: soto@gfz-potsdam.de, hugosotoparada@gmail.com
7 | #
8 | ########################################################################################################################################
9 |
10 | import os
11 | import config, data, model, util
12 |
13 | # 1. Configure DPP
14 | #
15 | # config
16 | util.init_session()
17 | dpp_config = config.Config()
18 | dpp_config.set_trigger(pthres_p=[0.9, 0.001], pthres_s=[0.9, 0.001])
19 | dpp_config.set_picking(mcd_iter=10, run_mcd=True)
20 | # dpp_config.set_picking(run_mcd=False)
21 | #
22 | dpp_config.set_data(
23 | stas=['PB01', 'PB02'],
24 | net='CX',
25 | ch='HH',
26 | archive='sample_data/CX_20140301',
27 | opath='out_CX_20140301'
28 | )
29 | dpp_config.set_time(
30 | dt_iter=3600.,
31 | tstart="2014-03-01T02:00:00",
32 | tend="2014-03-01T03:00:00",
33 | )
34 |
35 | # 2. Read seismic data into DPP
36 | #
37 | # data
38 | dpp_data = data.Data()
39 | dpp_data.read_from_directory(dpp_config)
40 | #
41 | # for k in dpp_data.data:
42 | # print(k, dpp_data.data[k])
43 |
44 | # 3. Run phase detection and picking
45 | #
46 | # model
47 | # dpp_model = model.Model(verbose=False)
48 | dpp_model = model.Model(verbose=False, version_pick_P="20201002_2", version_pick_S="20201002_2")
49 | #
50 | print(dpp_model.model_detection['best_model'].summary())
51 | print(dpp_model.model_picking_P['best_model'].summary())
52 | print(dpp_model.model_picking_S['best_model'].summary())
53 | #
54 | # run phase detection
55 | dpp_model.run_detection(dpp_config, dpp_data, save_dets=False, save_data=False)
56 | #
57 | # run phase picking
58 | dpp_model.run_picking(dpp_config, dpp_data, save_plots=True, save_stats=True, save_picks=False)
59 |
60 | # 4. Plot predicted phases
61 | #
62 | # plots
63 | util.plot_predicted_phases(dpp_config, dpp_data, dpp_model, plot_comps=['Z','N'])
64 | # util.plot_predicted_phases(dpp_config, dpp_data, dpp_model, plot_probs=['P','S'], shift_probs=True)
65 |
66 |
--------------------------------------------------------------------------------
/models/detection/20201002/dict_hyperopt_t733.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/detection/20201002/dict_hyperopt_t733.pckl
--------------------------------------------------------------------------------
/models/detection/20201002/model_hyperopt_t733.h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/detection/20201002/model_hyperopt_t733.h5
--------------------------------------------------------------------------------
/models/detection/20201002/trials_hyperopt_ntrials_1000.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/detection/20201002/trials_hyperopt_ntrials_1000.pckl
--------------------------------------------------------------------------------
/models/picking/20201002_1/P/dict_hyperopt_t027.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_1/P/dict_hyperopt_t027.pckl
--------------------------------------------------------------------------------
/models/picking/20201002_1/P/model_hyperopt_t027.h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_1/P/model_hyperopt_t027.h5
--------------------------------------------------------------------------------
/models/picking/20201002_1/P/trials_hyperopt_ntrials_050.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_1/P/trials_hyperopt_ntrials_050.pckl
--------------------------------------------------------------------------------
/models/picking/20201002_1/S/dict_hyperopt_t009.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_1/S/dict_hyperopt_t009.pckl
--------------------------------------------------------------------------------
/models/picking/20201002_1/S/model_hyperopt_t009.h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_1/S/model_hyperopt_t009.h5
--------------------------------------------------------------------------------
/models/picking/20201002_1/S/trials_hyperopt_ntrials_050.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_1/S/trials_hyperopt_ntrials_050.pckl
--------------------------------------------------------------------------------
/models/picking/20201002_2/P/dict_hyperopt_t004.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_2/P/dict_hyperopt_t004.pckl
--------------------------------------------------------------------------------
/models/picking/20201002_2/P/model_hyperopt_t004.h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_2/P/model_hyperopt_t004.h5
--------------------------------------------------------------------------------
/models/picking/20201002_2/P/trials_hyperopt_ntrials_050.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_2/P/trials_hyperopt_ntrials_050.pckl
--------------------------------------------------------------------------------
/models/picking/20201002_2/S/dict_hyperopt_t023.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_2/S/dict_hyperopt_t023.pckl
--------------------------------------------------------------------------------
/models/picking/20201002_2/S/model_hyperopt_t023.h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_2/S/model_hyperopt_t023.h5
--------------------------------------------------------------------------------
/models/picking/20201002_2/S/trials_hyperopt_ntrials_050.pckl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/models/picking/20201002_2/S/trials_hyperopt_ntrials_050.pckl
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | python>=3.6
2 | tensorflow==2.2.0
3 | obspy==1.2.2
4 | numpy
5 | matplotlib
6 | tqdm
7 |
--------------------------------------------------------------------------------
/sample_data/CX_20140301/CX.PB01..HH.mseed:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/CX_20140301/CX.PB01..HH.mseed
--------------------------------------------------------------------------------
/sample_data/CX_20140301/CX.PB02..HH.mseed:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/CX_20140301/CX.PB02..HH.mseed
--------------------------------------------------------------------------------
/sample_data/CX_20140401/CX.PB01..HH.mseed:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/CX_20140401/CX.PB01..HH.mseed
--------------------------------------------------------------------------------
/sample_data/CX_20140401/CX.PB02..HH.mseed:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/CX_20140401/CX.PB02..HH.mseed
--------------------------------------------------------------------------------
/sample_data/archive/2014/CX/PB01/HHE.D/CX.PB01..HHE.D.2014.121:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/archive/2014/CX/PB01/HHE.D/CX.PB01..HHE.D.2014.121
--------------------------------------------------------------------------------
/sample_data/archive/2014/CX/PB01/HHN.D/CX.PB01..HHN.D.2014.121:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/archive/2014/CX/PB01/HHN.D/CX.PB01..HHN.D.2014.121
--------------------------------------------------------------------------------
/sample_data/archive/2014/CX/PB01/HHZ.D/CX.PB01..HHZ.D.2014.121:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/archive/2014/CX/PB01/HHZ.D/CX.PB01..HHZ.D.2014.121
--------------------------------------------------------------------------------
/sample_data/archive/2014/CX/PB02/HHE.D/CX.PB02..HHE.D.2014.121:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/archive/2014/CX/PB02/HHE.D/CX.PB02..HHE.D.2014.121
--------------------------------------------------------------------------------
/sample_data/archive/2014/CX/PB02/HHN.D/CX.PB02..HHN.D.2014.121:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/archive/2014/CX/PB02/HHN.D/CX.PB02..HHN.D.2014.121
--------------------------------------------------------------------------------
/sample_data/archive/2014/CX/PB02/HHZ.D/CX.PB02..HHZ.D.2014.121:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hsotoparada/DeepPhasePick/95a30f8acf6700f9a2a5738c248149903a152d1c/sample_data/archive/2014/CX/PB02/HHZ.D/CX.PB02..HHZ.D.2014.121
--------------------------------------------------------------------------------
/util.py:
--------------------------------------------------------------------------------
1 | # coding: utf-8
2 |
3 | """
4 | This module contains additional utility functions used by the DeepPhasePick method.
5 |
6 | Author: Hugo Soto Parada (October, 2020)
7 | Contact: soto@gfz-potsdam.de, hugosotoparada@gmail.com
8 |
9 | """
10 |
11 | import numpy as np
12 | import matplotlib as mpl
13 | import matplotlib.pyplot as plt
14 | import matplotlib.ticker as ticker
15 | import tensorflow as tf
16 | import re, sys, os, shutil, gc
17 | import pickle
18 |
19 |
20 | def export_dict2pckl(dct, opath):
21 | """
22 | Exports dictionary as pickle file.
23 |
24 | Parameters
25 | ----------
26 | dct: dict
27 | Input dictionary.
28 | opath: str
29 | Output path to export pickle file.
30 | """
31 | with open(opath, 'wb') as pout:
32 | pickle.dump(dct, pout)
33 |
34 |
35 | def import_pckl2dict(ipath):
36 | """
37 | Imports a pickle file into a dictionary and returns this dictionary.
38 |
39 | Parameters
40 | ----------
41 | ipath: str
42 | Path to pickle file.
43 | """
44 | with open(ipath, 'rb') as pin:
45 | dct = pickle.load(pin)
46 | return dct
47 |
48 |
49 | def init_session():
50 | """
51 | Sets up tensorflow v2.x / keras session.
52 | """
53 | #
54 | physical_devices = tf.config.list_physical_devices('GPU')
55 | if physical_devices: tf.config.experimental.set_memory_growth(physical_devices[0], enable=True)  # skip when no GPU is available
56 | #
57 | # remove previously generated files or directories
58 | dirs_remove = ['__pycache__/', '~/.nv/']
59 | for dir_remove in dirs_remove:
60 | try:
61 | shutil.rmtree(dir_remove)
62 | print(f"{dir_remove} removed")
63 | except FileNotFoundError:
64 | print(f"{dir_remove} not found, continuing...")
65 | pass
66 |
67 |
68 | def get_arg_best_trial(trials):
69 | """
70 | Returns index of best trial (trial at which the loss is minimum).
71 |
72 | Parameters
73 | ----------
74 | trials: list
75 | List of hyperopt trials results in hyperparameter optimization.
76 |
77 | Returns
78 | -------
79 | arg_min_loss: int
80 | Index corresponding to best trial (trial at which the loss is minimum) in trials object.
81 | """
82 | losses = [float(trial['result']['loss']) for trial in trials]
83 | arg_min_loss = np.argmin(losses)
84 | return arg_min_loss
85 |
86 |
87 | def plot_predicted_phase_P(config, dct_mcd, data, sta, opath, plot_num):
88 | """
89 | Creates plots for predicted P-phase time onsets.
90 | Two types of plots are created, showing:
91 | i) the refined phase pick in the picking window, and ii) a zoom centered on the refined pick, together with Monte Carlo Dropout (MCD) results.
92 |
93 | Parameters
94 | ----------
95 | config: instance of config.Config
96 | Contains user configuration of seismic waveform data and how this data is processed in DeepPhasePick.
97 | dct_mcd: dict
98 | Dictionary containing MCD statistics of the predicted phase pick.
99 | data: ndarray
100 | 3D array containing seismic stream amplitudes on which MCD is applied.
101 | sta: str
102 | Station code of seismic stream.
103 | opath: str
104 | Output path for saving figure of predicted phase onsets.
105 | plot_num: int
106 | Index of processed phase onset, used for figure names of predicted phase onsets.
107 | """
108 | #
109 | mpl.rcParams['xtick.major.size'] = 8
110 | mpl.rcParams['xtick.major.width'] = 1.5
111 | mpl.rcParams['xtick.minor.size'] = 4
112 | mpl.rcParams['xtick.minor.width'] = 1.5
113 | mpl.rcParams['ytick.major.size'] = 8
114 | mpl.rcParams['ytick.major.width'] = 1.5
115 | mpl.rcParams['ytick.minor.size'] = 4
116 | mpl.rcParams['ytick.minor.width'] = 1.5
117 | mpl.rcParams['xtick.labelsize'] = 14
118 | mpl.rcParams['ytick.labelsize'] = 14
119 | mpl.rcParams['axes.titlesize'] = 14
120 | mpl.rcParams['axes.labelsize'] = 14
121 | #
122 | opath_fig = f"{opath}/pick_plots"
123 | os.makedirs(opath_fig, exist_ok=True)
124 | #
125 | tpick_det = dct_mcd['pick']['tpick_det']
126 | tpick_pred = dct_mcd['pick']['tpick']
127 | tpick_pred_th1 = dct_mcd['pick']['tpick_th1']
128 | tpick_pred_th2 = dct_mcd['pick']['tpick_th2']
129 | terr_pre = dct_mcd['pick']['terr_pre']
130 | terr_pos = dct_mcd['pick']['terr_pos']
131 | pick_class = dct_mcd['pick']['pick_class']
132 | mc_pred = dct_mcd['mcd']['mc_pred']
133 | mc_pred_mean = dct_mcd['mcd']['mc_pred_mean']
134 | mc_pred_mean_arg_pick = dct_mcd['mcd']['mc_pred_mean_arg_pick']
135 | mc_pred_std_pick = dct_mcd['mcd']['mc_pred_std_pick']
136 | prob_th1 = dct_mcd['mcd']['prob_th1']
137 | prob_th2 = dct_mcd['mcd']['prob_th2']
138 | #
139 | # plot - phase window input for RNN
140 | #
141 | fig = plt.figure(figsize=(7*1, 3*1))
142 | plt.subplots_adjust(wspace=0, hspace=0, bottom=0, left=0)
143 | ax = []
144 | ax.append(fig.add_subplot(1, 1, 1))
145 | #
146 | # plot trace
147 | tr_win_y = data[0,:,0]
148 | tr_win_x = np.arange(tr_win_y.shape[0]) / config.data_params['samp_freq']
149 | #
150 | ax[-1].plot(tr_win_x, tr_win_y, c='gray', lw=1.)
151 | ax[-1].vlines(x=tpick_pred, ymin=-1.1, ymax=1., color='r', lw=1.5, ls='-', clip_on=False)
152 | ax[-1].vlines(x=tpick_det, ymin=-1., ymax=1.1, color='r', lw=1.5, ls='--', clip_on=False)
153 | # tr_label_1 = f"comp Z"
154 | # ax[-1].text(0.02, .95, tr_label_1, size=12., ha='left', va='center', transform=ax[-1].transAxes)
155 | #
156 | xmin = 0.
157 | xmax = tr_win_x.max()
158 | ax[-1].set_xlim([xmin, xmax])
159 | ax[-1].xaxis.set_ticks(np.arange(xmin, xmax + .1, .5))
160 | ax[-1].xaxis.set_minor_locator(ticker.MultipleLocator(.1))
161 | ax[-1].set_ylim([-1., 1.])
162 | ax[-1].set_xlabel(f"Time [s]")
163 | #
164 | plt.tight_layout()
165 | print(f"plotting predicted phase P: {opath_fig}/{sta}_P_{plot_num+1:02}.png")
166 | ofig = f"{opath_fig}/{sta}_P_Z_{plot_num+1:02}"
167 | plt.savefig(f"{ofig}.png", bbox_inches='tight', dpi=90)
168 | # plt.savefig(f"{ofig}.eps", format='eps', bbox_inches='tight', dpi=150)
169 | plt.close()
170 | #
171 | # plot - phase window input for RNN (zoom around predicted time pick and MCD results)
172 | #
173 | fig = plt.figure(figsize=(7*1, 3*1))
174 | plt.subplots_adjust(wspace=0, hspace=0, bottom=0, left=0)
175 | ax = []
176 | ax.append(fig.add_subplot(1, 1, 1))
177 | #
178 | # plot trace
179 | ax[-1].plot(tr_win_x, tr_win_y, c='gray', lw=2., zorder=1)
180 | #
181 | # plot output binary probs
182 | ax_tmp = ax[-1].twinx()
183 | for l in range(len(mc_pred)):
184 | ax_tmp.plot(tr_win_x, mc_pred[l,:,0], c='magenta', lw=.2, ls='--', zorder=1)
185 | ax_tmp.plot(tr_win_x, mc_pred_mean[:,0], c='magenta', lw=1., zorder=1)
186 | ax_tmp.set_ylim([0., 1.])
187 | ax_tmp.set_ylabel("Probability")
188 | ax_tmp.yaxis.set_ticks(np.arange(0.,1.1,.1)[:])
189 | ax_tmp.yaxis.set_minor_locator(ticker.MultipleLocator(.05))
190 | ax_tmp.axhline(mc_pred_mean[mc_pred_mean_arg_pick,0], c='magenta', lw=1., ls='--', zorder=2)
191 | ax_tmp.axhline(prob_th1, c='magenta', lw=1., ls='--', zorder=2)
192 | ax_tmp.axhline(prob_th2, c='magenta', lw=1., ls='--', zorder=2)
193 | #
194 | ax[-1].vlines(x=tpick_pred, ymin=-1.1, ymax=1., color='r', lw=1.5, ls='-', clip_on=False, zorder=3)
195 | ax[-1].vlines(x=tpick_det, ymin=-1., ymax=1.1, color='r', lw=1.5, ls='--', clip_on=False, zorder=3)
196 | ax[-1].vlines(x=tpick_pred_th1, ymin=-1., ymax=1., color='r', lw=1.5, ls=':', clip_on=False, zorder=3)
197 | ax[-1].vlines(x=tpick_pred_th2, ymin=-1., ymax=1., color='r', lw=1.5, ls=':', clip_on=False, zorder=3)
198 | # ax[-1].vlines(x=tpick_pred-tpick_pred_std, ymin=-1., ymax=1.05, color='r', lw=1.5, ls='--', clip_on=False)
199 | # ax[-1].vlines(x=tpick_pred+tpick_pred_std, ymin=-1., ymax=1.05, color='r', lw=1.5, ls='--', clip_on=False)
200 | # arg_pred = mc_pred_mean_arg_pick
201 | tr_label_1 = f"tpred = {tpick_pred:.3f}"
202 | tr_label_2 = f"terr(1 x pb_std) = (-{terr_pre:.3f}, +{terr_pos:.3f})"
203 | tr_label_3 = f"pick_class = {pick_class}"
204 | tr_label_4 = f"pb, pb_std = ({mc_pred_mean[mc_pred_mean_arg_pick,0]:.3f}, {mc_pred_std_pick:.3f})"
205 | # ax[-1].text(0.01, .975, tr_label_1, size=12., ha='left', va='center', transform=ax[-1].transAxes)
206 | # ax[-1].text(0.01, .935, tr_label_2, size=12., ha='left', va='center', transform=ax[-1].transAxes)
207 | # ax[-1].text(0.01, .895, tr_label_3, size=12., ha='left', va='center', transform=ax[-1].transAxes)
208 | # ax[-1].text(0.01, .855, tr_label_4, size=12., ha='left', va='center', transform=ax[-1].transAxes)
209 | #
210 | xmin = tpick_pred - .5
211 | xmax = tpick_pred + .5
212 | ax[-1].set_xlim([xmin, xmax])
213 | tick_major = np.arange(xmin, xmax + .1, .1)
214 | tick_minor = np.arange(xmin, xmax + .01, .02)
215 | ax[-1].xaxis.set_major_locator(ticker.FixedLocator(tick_major))
216 | ax[-1].xaxis.set_minor_locator(ticker.FixedLocator(tick_minor))
217 | ax[-1].set_ylim([-1., 1.])
218 | ax[-1].set_xlabel("Time [s]")
219 | #
220 | plt.tight_layout()
221 | print(f"plotting predicted phase P: {opath_fig}/{sta}_P_mc_{plot_num+1:02}.png")
222 | print(tr_label_1)
223 | print(tr_label_2)
224 | print(tr_label_3)
225 | print(tr_label_4)
226 | ofig = f"{opath_fig}/{sta}_P_Z_mcd_{plot_num+1:02}"
227 | plt.savefig(f"{ofig}.png", bbox_inches='tight', dpi=90)
228 | # plt.savefig(f"{ofig}.eps", format='eps', bbox_inches='tight', dpi=150)
229 | plt.close()
230 |
231 |
232 | def plot_predicted_phase_S(config, dct_mcd, data, sta, opath, plot_num):
233 | """
234 | Creates plots for predicted S-phase time onsets.
235 | Two types of plots are created, showing:
236 | i) the refined phase pick in the picking window, and ii) a zoom centered on the refined pick, together with Monte Carlo Dropout (MCD) results.
237 |
238 | Parameters
239 | ----------
240 | config: instance of config.Config
241 | Contains user configuration of seismic waveform data and how this data is processed in DeepPhasePick.
242 | dct_mcd: dict
243 | Dictionary containing MCD statistics of the predicted phase pick.
244 | data: ndarray
245 | 3D array containing seismic stream amplitudes on which MCD is applied.
246 | sta: str
247 | Station code of seismic stream.
248 | opath: str
249 | Output path for saving figure of predicted phase onsets.
250 | plot_num: int
251 | Index of processed phase onset, used for figure names of predicted phase onsets.
252 | """
253 | #
254 | mpl.rcParams['xtick.major.size'] = 8
255 | mpl.rcParams['xtick.major.width'] = 1.5
256 | mpl.rcParams['xtick.minor.size'] = 4
257 | mpl.rcParams['xtick.minor.width'] = 1.5
258 | mpl.rcParams['ytick.major.size'] = 8
259 | mpl.rcParams['ytick.major.width'] = 1.5
260 | mpl.rcParams['ytick.minor.size'] = 4
261 | mpl.rcParams['ytick.minor.width'] = 1.5
262 | mpl.rcParams['xtick.labelsize'] = 14
263 | mpl.rcParams['ytick.labelsize'] = 14
264 | mpl.rcParams['axes.titlesize'] = 14
265 | mpl.rcParams['axes.labelsize'] = 14
266 | #
267 | opath_fig = f"{opath}/pick_plots"
268 | os.makedirs(opath_fig, exist_ok=True)
269 | #
270 | tpick_det = dct_mcd['pick']['tpick_det']
271 | tpick_pred = dct_mcd['pick']['tpick']
272 | tpick_pred_th1 = dct_mcd['pick']['tpick_th1']
273 | tpick_pred_th2 = dct_mcd['pick']['tpick_th2']
274 | terr_pre = dct_mcd['pick']['terr_pre']
275 | terr_pos = dct_mcd['pick']['terr_pos']
276 | pick_class = dct_mcd['pick']['pick_class']
277 | mc_pred = dct_mcd['mcd']['mc_pred']
278 | mc_pred_mean = dct_mcd['mcd']['mc_pred_mean']
279 | mc_pred_mean_arg_pick = dct_mcd['mcd']['mc_pred_mean_arg_pick']
280 | mc_pred_std_pick = dct_mcd['mcd']['mc_pred_std_pick']
281 | prob_th1 = dct_mcd['mcd']['prob_th1']
282 | prob_th2 = dct_mcd['mcd']['prob_th2']
283 | #
284 | # plot - phase window input for RNN (comp E)
285 | #
286 | fig = plt.figure(figsize=(7*1, 3*1))
287 | plt.subplots_adjust(wspace=0, hspace=0, bottom=0, left=0)
288 | ax = []
289 | ax.append(fig.add_subplot(1, 1, 1))
290 | #
291 | # plot trace
292 | tr_win_y = data[0,:,0]
293 | tr_win_x = np.arange(tr_win_y.shape[0]) / config.data_params['samp_freq']
294 | #
295 | ax[-1].plot(tr_win_x, tr_win_y, c='gray', lw=1.)
296 | ax[-1].vlines(x=tpick_pred, ymin=-1.1, ymax=1., color='b', lw=1.5, ls='-', clip_on=False)
297 | ax[-1].vlines(x=tpick_det, ymin=-1., ymax=1.1, color='b', lw=1.5, ls='--', clip_on=False)
298 | # tr_label_1 = f"comp E"
299 | # ax[-1].text(0.02, .95, tr_label_1, size=12., ha='left', va='center', transform=ax[-1].transAxes)
300 | #
301 | xmin = 0.
302 | xmax = tr_win_x.max()
303 | ax[-1].set_xlim([xmin, xmax])
304 | ax[-1].xaxis.set_ticks(np.arange(xmin, xmax + .1, .5))
305 | ax[-1].xaxis.set_minor_locator(ticker.MultipleLocator(.1))
306 | ax[-1].set_ylim([-1., 1.])
307 | ax[-1].set_xlabel("Time [s]")
308 | #
309 | plt.tight_layout()
310 | print(f"plotting predicted phase S: {opath_fig}/{sta}_S_E_{plot_num+1:02}.png")
311 | ofig = f"{opath_fig}/{sta}_S_E_{plot_num+1:02}"
312 | plt.savefig(f"{ofig}.png", bbox_inches='tight', dpi=90)
313 | # plt.savefig(f"{ofig}.eps", format='eps', bbox_inches='tight', dpi=150)
314 | plt.close()
315 | #
316 | # plot - phase window input for RNN (comp N)
317 | #
318 | fig = plt.figure(figsize=(7*1, 3*1))
319 | plt.subplots_adjust(wspace=0, hspace=0, bottom=0, left=0)
320 | ax = []
321 | ax.append(fig.add_subplot(1, 1, 1))
322 | #
323 | # plot trace
324 | tr_win_y = data[0,:,1]
325 | tr_win_x = np.arange(tr_win_y.shape[0]) / config.data_params['samp_freq']
326 | #
327 | ax[-1].plot(tr_win_x, tr_win_y, c='gray', lw=1.)
328 | ax[-1].vlines(x=tpick_pred, ymin=-1.1, ymax=1., color='b', lw=1.5, ls='-', clip_on=False)
329 | ax[-1].vlines(x=tpick_det, ymin=-1., ymax=1.1, color='b', lw=1.5, ls='--', clip_on=False)
330 | # tr_label_1 = f"comp N"
331 | # ax[-1].text(0.02, .95, tr_label_1, size=12., ha='left', va='center', transform=ax[-1].transAxes)
332 | #
333 | xmin = 0.
334 | xmax = tr_win_x.max()
335 | ax[-1].set_xlim([xmin, xmax])
336 | ax[-1].xaxis.set_ticks(np.arange(xmin, xmax + .1, .5))
337 | ax[-1].xaxis.set_minor_locator(ticker.MultipleLocator(.1))
338 | ax[-1].set_ylim([-1., 1.])
339 | ax[-1].set_xlabel("Time [s]")
340 | #
341 | plt.tight_layout()
342 | print(f"plotting predicted phase S: {opath_fig}/{sta}_S_N_{plot_num+1:02}.png")
343 | ofig = f"{opath_fig}/{sta}_S_N_{plot_num+1:02}"
344 | plt.savefig(f"{ofig}.png", bbox_inches='tight', dpi=90)
345 | # plt.savefig(f"{ofig}.eps", format='eps', bbox_inches='tight', dpi=150)
346 | plt.close()
347 | #
348 | # plot - phase window input for RNN (zoom around predicted time pick and MCD results, comp E)
349 | #
350 | fig = plt.figure(figsize=(7*1, 3*1))
351 | plt.subplots_adjust(wspace=0, hspace=0, bottom=0, left=0)
352 | ax = []
353 | ax.append(fig.add_subplot(1, 1, 1))
354 | #
355 | # plot trace + label
356 | tr_win_y = data[0,:,0]
357 | tr_win_x = np.arange(tr_win_y.shape[0]) / config.data_params['samp_freq']
358 | ax[-1].plot(tr_win_x, tr_win_y, c='gray', lw=2.)
359 | #
360 | # plot output binary probs
361 | ax_tmp = ax[-1].twinx()
362 | for l in range(len(mc_pred)):
363 | ax_tmp.plot(tr_win_x, mc_pred[l,:,0], c='magenta', lw=.2, ls='--')
364 | ax_tmp.plot(tr_win_x, mc_pred_mean[:,0], c='magenta', lw=1.)
365 | ax_tmp.set_ylim([0., 1.])
366 | ax_tmp.set_ylabel("Probability")
367 | ax_tmp.yaxis.set_ticks(np.arange(0.,1.1,.1)[:])
368 | ax_tmp.yaxis.set_minor_locator(ticker.MultipleLocator(.05))
369 | ax_tmp.axhline(mc_pred_mean[mc_pred_mean_arg_pick,0], c='magenta', lw=1., ls='--')
370 | ax_tmp.axhline(prob_th1, c='magenta', lw=1., ls='--')
371 | ax_tmp.axhline(prob_th2, c='magenta', lw=1., ls='--')
372 | #
373 | ax[-1].vlines(x=tpick_pred, ymin=-1.1, ymax=1., color='b', lw=1.5, ls='-', clip_on=False)
374 | ax[-1].vlines(x=tpick_det, ymin=-1., ymax=1.1, color='b', lw=1.5, ls='--', clip_on=False)
375 | ax[-1].vlines(x=tpick_pred_th1, ymin=-1., ymax=1., color='b', lw=1.5, ls=':', clip_on=False)
376 | ax[-1].vlines(x=tpick_pred_th2, ymin=-1., ymax=1., color='b', lw=1.5, ls=':', clip_on=False)
377 | # ax[-1].vlines(x=tpick_pred-tpick_pred_std, ymin=-1., ymax=1.05, color='r', lw=1.5, ls='--', clip_on=False)
378 | # ax[-1].vlines(x=tpick_pred+tpick_pred_std, ymin=-1., ymax=1.05, color='r', lw=1.5, ls='--', clip_on=False)
379 | # arg_pred = mc_pred_mean_arg_pick
380 | tr_label_1 = f"tpred = {tpick_pred:.3f}"
381 | tr_label_2 = f"terr(1 x pb_std) = (-{terr_pre:.3f}, +{terr_pos:.3f})"
382 | tr_label_3 = f"pick_class = {pick_class}"
383 | tr_label_4 = f"pb, pb_std = ({mc_pred_mean[mc_pred_mean_arg_pick,0]:.3f}, {mc_pred_std_pick:.3f})"
384 | # ax[-1].text(0.01, .975, tr_label_1, size=12., ha='left', va='center', transform=ax[-1].transAxes)
385 | # ax[-1].text(0.01, .935, tr_label_2, size=12., ha='left', va='center', transform=ax[-1].transAxes)
386 | # ax[-1].text(0.01, .895, tr_label_3, size=12., ha='left', va='center', transform=ax[-1].transAxes)
387 | # ax[-1].text(0.01, .855, tr_label_4, size=12., ha='left', va='center', transform=ax[-1].transAxes)
388 | #
389 | xmin = tpick_pred - .5
390 | xmax = tpick_pred + .5
391 | ax[-1].set_xlim([xmin, xmax])
392 | tick_major = np.arange(xmin, xmax + .1, .1)
393 | tick_minor = np.arange(xmin, xmax + .01, .02)
394 | ax[-1].xaxis.set_major_locator(ticker.FixedLocator(tick_major))
395 | ax[-1].xaxis.set_minor_locator(ticker.FixedLocator(tick_minor))
396 | ax[-1].set_ylim([-1., 1.])
397 | ax[-1].set_xlabel("Time [s]")
398 | #
399 | plt.tight_layout()
400 | print(f"plotting predicted phase S: {opath_fig}/{sta}_S_E_mc_{plot_num+1:02}.png")
401 | ofig = f"{opath_fig}/{sta}_S_E_mcd_{plot_num+1:02}"
402 | plt.savefig(f"{ofig}.png", bbox_inches='tight', dpi=90)
403 | # plt.savefig(f"{ofig}.eps", format='eps', bbox_inches='tight', dpi=150)
404 | plt.close()
405 | #
406 | # plot - phase window input for RNN (zoom around predicted time pick and MCD results, comp N)
407 | #
408 | fig = plt.figure(figsize=(7*1, 3*1))
409 | plt.subplots_adjust(wspace=0, hspace=0, bottom=0, left=0)
410 | ax = []
411 | ax.append(fig.add_subplot(1, 1, 1))
412 | #
413 | # plot trace + label
414 | tr_win_y = data[0,:,1]
415 | tr_win_x = np.arange(tr_win_y.shape[0]) / config.data_params['samp_freq']
416 | ax[-1].plot(tr_win_x, tr_win_y, c='gray', lw=2.)
417 | #
418 | # plot output binary probs
419 | ax_tmp = ax[-1].twinx()
420 | for l in range(len(mc_pred)):
421 | ax_tmp.plot(tr_win_x, mc_pred[l,:,0], c='magenta', lw=.2, ls='--')
422 | ax_tmp.plot(tr_win_x, mc_pred_mean[:,0], c='magenta', lw=1.)
423 | ax_tmp.set_ylim([0., 1.])
424 | ax_tmp.set_ylabel("Probability")
425 | ax_tmp.yaxis.set_ticks(np.arange(0.,1.1,.1)[:])
426 | ax_tmp.yaxis.set_minor_locator(ticker.MultipleLocator(.05))
427 | ax_tmp.axhline(mc_pred_mean[mc_pred_mean_arg_pick,0], c='magenta', lw=1., ls='--')
428 | ax_tmp.axhline(prob_th1, c='magenta', lw=1., ls='--')
429 | ax_tmp.axhline(prob_th2, c='magenta', lw=1., ls='--')
430 | #
431 | ax[-1].vlines(x=tpick_pred, ymin=-1.1, ymax=1., color='b', lw=1.5, ls='-', clip_on=False)
432 | ax[-1].vlines(x=tpick_det, ymin=-1., ymax=1.1, color='b', lw=1.5, ls='--', clip_on=False)
433 | ax[-1].vlines(x=tpick_pred_th1, ymin=-1., ymax=1., color='b', lw=1.5, ls=':', clip_on=False)
434 | ax[-1].vlines(x=tpick_pred_th2, ymin=-1., ymax=1., color='b', lw=1.5, ls=':', clip_on=False)
435 | # ax[-1].vlines(x=tpick_pred-tpick_pred_std, ymin=-1., ymax=1.05, color='r', lw=1.5, ls='--', clip_on=False)
436 | # ax[-1].vlines(x=tpick_pred+tpick_pred_std, ymin=-1., ymax=1.05, color='r', lw=1.5, ls='--', clip_on=False)
437 | # ax[-1].text(0.02, .975, tr_label_1, size=10., ha='left', va='center', transform=ax[-1].transAxes)
438 | # ax[-1].text(0.02, .935, tr_label_2, size=10., ha='left', va='center', transform=ax[-1].transAxes)
439 | # ax[-1].text(0.02, .895, tr_label_3, size=10., ha='left', va='center', transform=ax[-1].transAxes)
440 | # ax[-1].text(0.02, .855, tr_label_4, size=10., ha='left', va='center', transform=ax[-1].transAxes)
441 | #
442 | xmin = tpick_pred - .5
443 | xmax = tpick_pred + .5
444 | ax[-1].set_xlim([xmin, xmax])
445 | tick_major = np.arange(xmin, xmax + .1, .1)
446 | tick_minor = np.arange(xmin, xmax + .01, .02)
447 | ax[-1].xaxis.set_major_locator(ticker.FixedLocator(tick_major))
448 | ax[-1].xaxis.set_minor_locator(ticker.FixedLocator(tick_minor))
449 | ax[-1].set_ylim([-1., 1.])
450 | ax[-1].set_xlabel("Time [s]")
451 | #
452 | plt.tight_layout()
453 | print(f"plotting predicted phase S: {opath_fig}/{sta}_S_N_mc_{plot_num+1:02}.png")
454 | print(tr_label_1)
455 | print(tr_label_2)
456 | print(tr_label_3)
457 | print(tr_label_4)
458 | ofig = f"{opath_fig}/{sta}_S_N_mcd_{plot_num+1:02}"
459 | plt.savefig(f"{ofig}.png", bbox_inches='tight', dpi=90)
460 | # plt.savefig(f"{ofig}.eps", format='eps', bbox_inches='tight', dpi=150)
461 | plt.close()
462 |
463 |
464 | def plot_predicted_phases(config, data, model, plot_comps=['Z','E'], plot_probs=[], shift_probs=True):
465 | """
466 | Plots predicted P- and S-phase picks on seismic waveforms and, optionally, the predicted discrete class probability time series.
467 |
468 | Parameters
469 | ----------
470 | config: instance of config.Config
471 | Contains user configuration of seismic waveform data and how this data is processed in DeepPhasePick.
472 | data: instance of data.Data
473 | Contains selected seismic waveform data on which phase detection is applied.
474 | model: instance of model.Model
475 | Contains best models and relevant results obtained from hyperparameter optimization for phase detection and picking.
476 | plot_comps: list of str, optional
477 | Seismic components to be plotted. It can be any of vertical ('Z'), east ('E'), and north ('N').
478 | By default vertical and east components are plotted.
479 | plot_probs: list of str, optional
480 | Discrete class probability time series to be plotted. It can be any of 'P', 'S' and 'N' (Noise) classes.
481 | By default no probability time series are plotted.
482 | shift_probs: bool, optional
483 | If True (default), plotted probability time series are shifted in time according to the optimized hyperparameters defining the picking window for each class.
484 | See Figure S1 in Soto and Schurr (2020).
485 | """
486 | #
487 | # plot format parameters
488 | mpl.rcParams['xtick.major.size'] = 10
489 | mpl.rcParams['xtick.major.width'] = 2
490 | mpl.rcParams['xtick.minor.size'] = 5
491 | mpl.rcParams['xtick.minor.width'] = 2
492 | mpl.rcParams['ytick.major.size'] = 10
493 | mpl.rcParams['ytick.major.width'] = 2
494 | mpl.rcParams['ytick.minor.size'] = 4
495 | mpl.rcParams['ytick.minor.width'] = 2
496 | mpl.rcParams['xtick.labelsize'] = 16
497 | mpl.rcParams['ytick.labelsize'] = 16
498 | mpl.rcParams['axes.titlesize'] = 16
499 | mpl.rcParams['axes.labelsize'] = 16
500 | #
501 | best_params = model.model_detection['best_params']
502 | add_rows = 0
503 | if len(plot_probs) > 0:
504 | add_rows += 1
505 | #
506 | print("creating plots...")
507 | for i in data.data:
508 | #
509 | for sta in data.data[i]['st']:
510 | #
511 | fig = plt.figure(figsize=(12., 4*(len(plot_comps)+add_rows)))
512 | plt.subplots_adjust(wspace=0, hspace=0, bottom=0, left=0)
513 | #
514 | for n, ch in enumerate(plot_comps):
515 | #
516 | ax = []
517 | #
518 | # subplot - waveform trace (input for CNN)
519 | #
520 | tr = data.data[i]['st'][sta].select(channel='*'+ch)[0]
521 | dt = tr.stats.delta
522 | tr_y = tr.data
523 | y_max = np.abs(tr.data).max()
524 | tr_y /= y_max
525 | tr_x = np.arange(tr.data.size) * dt
526 | #
527 | # plot trace
528 | ax.append(fig.add_subplot(len(plot_comps)+add_rows, 1, n+1))
529 | ax[-1].plot(tr_x, tr_y, c='gray', lw=.2)
530 | # ax[-1].plot(tr_x, tr_y, c='k', lw=.2)
531 | #
532 | # retrieve predicted P, S class probability time series
533 | #
534 | samp_dt = 1 / config.data_params['samp_freq']
535 | if shift_probs:
536 | tp_shift = (best_params['frac_dsamp_p1']-.5) * best_params['win_size'] * samp_dt
537 | ts_shift = (best_params['frac_dsamp_s1']-.5) * best_params['win_size'] * samp_dt
538 | tn_shift = (best_params['frac_dsamp_n1']-.5) * best_params['win_size'] * samp_dt
539 | else:
540 | tp_shift = 0
541 | ts_shift = 0
542 | tn_shift = 0
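# Note: tp_shift/ts_shift/tn_shift are added to the detection time axis further below,
# so a frac_dsamp value above 0.5 moves the corresponding probability curve to later times.
# Worked example with hypothetical values: win_size = 480 samples, samp_freq = 100 Hz and
# frac_dsamp_p1 = 0.75 give tp_shift = (0.75 - 0.5) * 480 * 0.01 s = 1.2 s.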
543 | #
544 | # plot trace label
545 | # tr_label = f"{tr.stats.network}.{tr.stats.station}.{tr.stats.channel}"
546 | tr_label = f"{tr.stats.channel}"
547 | box_label = dict(boxstyle='square', facecolor='white', alpha=.9)
548 | ax[-1].text(0.02, .95, tr_label, size=14., ha='left', va='center', transform=ax[-1].transAxes, bbox=box_label)
549 | #
550 | tstart_plot = tr.stats.starttime
551 | tend_plot = tr.stats.endtime
552 | print(i, sta, ch, tstart_plot, tend_plot)
553 | #
554 | # lines at predicted picks
555 | #
556 | if sta in model.picks[i]:
557 | #
558 | for ii, k in enumerate(model.picks[i][sta]['P']['true_arg']):
559 | #
560 | # P pick corrected after phase picking
561 | #
562 | tstart_win = model.picks[i][sta]['P']['twd'][k]['tstart_win']
563 | tend_win = model.picks[i][sta]['P']['twd'][k]['tend_win']
564 | if config.picking['run_mcd']:
565 | tpick_pred = model.picks[i][sta]['P']['twd'][k]['pick_ml']['tpick']
566 | # tpick_th1 = model.picks[i][sta]['P']['twd'][k]['pick_ml']['tpick_th1']
567 | # tpick_th2 = model.picks[i][sta]['P']['twd'][k]['pick_ml']['tpick_th2']
568 | # pick_class = model.picks[i][sta]['P']['twd'][k]['pick_ml']['pick_class']
569 | else:
570 | # tpick_pred = model.picks[i][sta]['P']['twd'][k]['pick_ml']['tpick_det']
571 | tpick_pred = model.picks[i][sta]['P']['twd'][k]['pick_ml_det']
572 | tp_plot = tstart_win - tstart_plot + tpick_pred
573 | if ii == 0:
574 | ax[-1].axvline(tp_plot, c='r', lw=1.5, ls='-', label='P pick')
575 | else:
576 | ax[-1].axvline(tp_plot, c='r', lw=1.5, ls='-')
577 | #
578 | for jj, l in enumerate(model.picks[i][sta]['S']['true_arg']):
579 | #
580 | # S pick corrected after phase picking
581 | #
582 | tstart_win = model.picks[i][sta]['S']['twd'][l]['tstart_win']
583 | if config.picking['run_mcd']:
584 | tpick_pred = model.picks[i][sta]['S']['twd'][l]['pick_ml']['tpick']
585 | # tpick_th1 = model.picks[i][sta]['S']['twd'][l]['pick_ml']['tpick_th1']
586 | # tpick_th2 = model.picks[i][sta]['S']['twd'][l]['pick_ml']['tpick_th2']
587 | # pick_class = model.picks[i][sta]['S']['twd'][l]['pick_ml']['pick_class']
588 | else:
589 | # tpick_pred = model.picks[i][sta]['S']['twd'][l]['pick_ml']['tpick_det']
590 | tpick_pred = model.picks[i][sta]['S']['twd'][l]['pick_ml_det']
591 | ts_plot = tstart_win - tstart_plot + tpick_pred
592 | if jj == 0:
593 | ax[-1].axvline(ts_plot, c='b', lw=1.5, ls='-', label='S pick')
594 | else:
595 | ax[-1].axvline(ts_plot, c='b', lw=1.5, ls='-')
596 | #
597 | ylim = [-1., 1.]
598 | ax[-1].set_ylim(ylim)
599 | ax[-1].set_xlim([0, tend_plot - tstart_plot])
600 | if n == len(plot_comps)-1:
601 | plt.legend(loc='lower left', fontsize=14.)
602 | if add_rows == 0:
603 | ax[-1].set_xlabel("Time [s]")
604 | #
605 | # plot predicted P, S, Noise class probability functions
606 | #
607 | if len(plot_probs) > 0:
608 | ax.append(fig.add_subplot(len(plot_comps)+add_rows, 1, len(plot_comps)+1))
609 | ax[-1].set_xlim([0, tend_plot - tstart_plot])
610 | ax[-1].set_xlabel("Time [s]")
611 | ax[-1].set_ylim([-.05, 1.05])
612 | ax[-1].set_ylabel("Probability")
613 | if 'P' in plot_probs:
614 | x_prob_p = model.detections[i][sta]['tt']+tp_shift
615 | y_prob_p = model.detections[i][sta]['ts'][:,0]
616 | ax[-1].plot(x_prob_p, y_prob_p, c='red', lw=0.75, label='P')
617 | if 'S' in plot_probs:
618 | x_prob_s = model.detections[i][sta]['tt']+ts_shift
619 | y_prob_s = model.detections[i][sta]['ts'][:,1]
620 | ax[-1].plot(x_prob_s, y_prob_s, c='blue', lw=0.75, label='S')
621 | if 'N' in plot_probs:
622 | x_prob_n = model.detections[i][sta]['tt']+tn_shift
623 | y_prob_n = model.detections[i][sta]['ts'][:,2]
624 | ax[-1].plot(x_prob_n, y_prob_n, c='k', lw=0.75, label='N')
625 | if len(plot_probs) > 0:
626 | plt.legend(loc='lower left', fontsize=14.)
627 | #
628 | plt.tight_layout()
629 | #
630 | opath = model.detections[i][sta]['opath']
631 | tstr_start = tr.stats.starttime.strftime("%Y%m%dT%H%M%S")
632 | tstr_end = tr.stats.endtime.strftime("%Y%m%dT%H%M%S")
633 | opath = f"{opath}/wf_plots"
634 | os.makedirs(opath, exist_ok=True)
635 | #
636 | ofig = f"{opath}/{config.data['net']}_{sta}_{tstr_start}_{tstr_end}"
637 | plt.savefig(f"{ofig}.png", bbox_inches='tight', dpi=90)
638 | # plt.savefig(f"{ofig}.eps", format='eps', bbox_inches='tight', dpi=150)
639 | plt.close()
640 |
--------------------------------------------------------------------------------