├── LICENSE
├── README.md
├── data2
│   ├── 2018_events_qc_mag1.9.txt
│   ├── 2018_events_tw_mag1.9.json
│   └── ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_Scoda_envelope_dist<1km_pw2.h5
├── figs
│   ├── corr.pdf
│   ├── corrs_vs_dist.pdf
│   ├── eventmap.pdf
│   ├── eventmap_interpretation.pdf
│   ├── focal.pdf
│   ├── hist.pdf
│   ├── inclination.pdf
│   ├── inclination.svg
│   ├── maxima_vs_dist.pdf
│   ├── method.pdf
│   ├── method.svg
│   ├── methods.pdf
│   ├── methods.svg
│   ├── topomap.pdf
│   ├── vpvs.pdf
│   └── vs_time.pdf
├── load_data.py
├── plot_maps.py
├── traveltime.py
├── util
│   ├── __init__.py
│   ├── events.py
│   ├── imaging.py
│   ├── misc.py
│   ├── signal.py
│   ├── source.py
│   └── xcorr2.py
├── vpvs.py
├── vs.py
└── xcorr.py
/LICENSE:
--------------------------------------------------------------------------------
1 | The MIT License (MIT)
2 |
3 | Copyright (c) 2020 Tom Eulenfeld
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy of
6 | this software and associated documentation files (the "Software"), to deal in
7 | the Software without restriction, including without limitation the rights to
8 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
9 | the Software, and to permit persons to whom the Software is furnished to do so,
10 | subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
17 | FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
18 | COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
19 | IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
20 | CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 |
3 | ## Inter-source interferometry with cross-correlation of coda waves
4 |
5 | This repository contains the source code for the reproduction of results of the following publication:
6 |
7 | Tom Eulenfeld (2020), Toward source region tomography with inter-source interferometry: Shear wave velocity from 2018 West Bohemia swarm earthquakes, *Journal of Geophysical
8 | Research: Solid Earth, 125*, e2020JB019931, doi:[10.1029/2020JB019931](https://doi.org/10.1029/2020JB019931). [[pdf](https://arxiv.org/pdf/2003.11938)]
9 |
10 | #### Preparation
11 |
12 | 1. Download or clone this repository.
13 | 2. The data is hosted on Zenodo with doi:[10.5281/zenodo.3741465](https://www.doi.org/10.5281/zenodo.3741465). Copy the data into a new folder `data`.
14 | 3. Create an empty folder `tmp`.
15 | 4. Install the relevant Python packages (see the example below): `python>=3.7 obspy>=1.2 matplotlib=3.1 numpy scipy=1.3 statsmodels tqdm shapely cartopy pandas utm obspyh5`.
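   | 
   |    A possible environment setup, e.g. with conda (a sketch only; channels and exact pins may vary, `utm` and `obspyh5` can be installed from PyPI):
   | 
   |    ```
   |    conda create -n iscc -c conda-forge "python>=3.7" "obspy>=1.2" matplotlib=3.1 numpy scipy=1.3 statsmodels tqdm shapely cartopy pandas
   |    conda activate iscc
   |    pip install utm obspyh5
   |    ```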
16 |
17 | #### Usage
18 |
19 | The scripts `xcorr.py`, `traveltime.py` and `vs.py` should be run in that order, as shown below. The scripts `plot_maps.py`, `vpvs.py` and `load_data.py` can be run independently. The scope of each script should be clear after reading the publication and the optional docstring at the top of each script.
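   | 
   | For example:
   | 
   | ```
   | python xcorr.py       # cross-correlate event coda, writes the correlation file into tmp/
   | python traveltime.py  # extract inter-event travel times from the correlation functions
   | python vs.py          # estimate shear wave velocities
   | ```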
20 |
21 | Three files are located in the `data2` folder. The QC file `2018_events_qc_mag1.9.txt` was created by the author upon visual inspection of the envelope plots created with the `load_data.py` script. The time window file `2018_events_tw_mag1.9.json` can be recreated from the QC file with the `load_data.py` script; running `load_data.py` is optional. The cross-correlation file `ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_Scoda_envelope_dist<1km_pw2.h5` contains the intermediate result of inter-event cross-correlation functions and can be read directly with ObsPy via the obspyh5 plugin, as in the example below. The same file is also created by the `xcorr.py` script inside the `tmp` directory.
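   | 
   | A minimal sketch for inspecting the cross-correlation file (assuming ObsPy and the obspyh5 plugin are installed):
   | 
   | ```python
   | from obspy import read
   | 
   | ccs = read('data2/ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_Scoda_envelope_dist<1km_pw2.h5', 'H5')
   | print(ccs)  # one correlation trace per event pair, metadata in tr.stats
   | ```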
22 |
23 | The `figs` folder contains all figures from the publication. Most will be recreated when running the scripts.
--------------------------------------------------------------------------------
/data2/2018_events_qc_mag1.9.txt:
--------------------------------------------------------------------------------
1 | Q1 Q2 Q3
2 | Xs P onset of next earthquake x seconds after P onset (NKC)
3 | NB P onset of next earthquake interferes
4 | 20183359 M2.2 Q1
5 | 20183774 M1.9 Q3
6 | 20184625 M2.9 Q1
7 | 20184759 M2.6 Q1
8 | 20184769 M2.1 Q3 9s
9 | 20184825 M2.7 Q1
10 | 20184923 M1.9 Q2
11 | 20185009 M2.0 Q1
12 | 20185065 M2.1 Q3 9s
13 | 20185360 M2.4 x
14 | 20185366 M2.5 x
15 | 20185386 M2.2 Q3 6s
16 | 20185390 M1.9 x
17 | 20185392 M2.2 Q3
18 | 20185424 M1.9 x
19 | 20185468 M2.3 x
20 | 20185466 M2.1 x
21 | 201876019 M2.0 x
22 | 20185560 M1.9 x
23 | 20185588 M2.0 Q2 28s
24 | 20185592 M1.9 Q3 7.5s
25 | 20185828 M1.9 x
26 | 20185854 M2.0 Q2 18s
27 | 20185856 M1.9 x
28 | 20185862 M2.4 Q2
29 | 20185882 M2.3 Q2 NB
30 | 20185888 M2.4 Q1
31 | 20186066 M2.1 Q1 NOWEBNET
32 | 20186286 M2.2 Q2 NB
33 | 20186400 M2.1 x
34 | 20186402 M2.3 x
35 | 20186404 M2.0 Q3 9s
36 | 20186460 M2.5 Q1
37 | 20186518 M2.3 x DB
38 | 20186520 M2.3 Q2 DB
39 | 20186590 M1.9 Q2
40 | 20186614 M2.0 Q1 24s
41 | 20186648 M2.4 Q3 NB
42 | 20186650 M2.7 Q3 NB
43 | 20186652 M2.2 x
44 | 20186746 M2.4 Q1
45 | 20186784 M2.8 Q1
46 | 20186880 M2.1 x
47 | 20187016 M1.9 Q2 14s
48 | 20187102 M2.5 Q2 10s
49 | 20187132 M2.1 Q2 8.5s
50 | 20187160 M2.0 x
51 | 201814890 M2.0 Q2
52 | 20187220 M2.1 x
53 | 20187222 M2.6 Q2 NB
54 | 20187224 M2.0 Q2 12s
55 | 20187260 M2.7 Q1
56 | 20187456 M2.0 Q1 20s
57 | 20188016 M2.5 Q1 20s
58 | 20188042 M2.1 x
59 | 20188362 M2.2 x
60 | 20188364 M2.2 Q2 NB
61 | 20188366 M2.6 Q1 NB
62 | 20188370 M1.9 x
63 | 20188372 M1.9 xx
64 | 20188390 M1.9 Q3 8s
65 | 20188412 M2.4 Q1 NB
66 | 20188466 M2.3 x DB
67 | 20188468 M2.3 Q1 DB 14s
68 | 20188470 M1.9 Q3 10s NOISE
69 | 20188486 M1.9 Q3 NB
70 | 20188488 M2.2 x
71 | 20188828 M1.9 Q2 18s
72 | 20188910 M2.0 Q1
73 | 20189036 M2.5 Q1
74 | 20189038 M2.4 Q1 NB
75 | 20189584 M2.5 Q3 NB
76 | 20189586 M2.4 Q2 NB
77 | 20189588 M2.2 Q2 25s NOISE
78 | 20189610 M2.2 Q2 NB
79 | 20189618 M2.3 Q2
80 | 20189682 M2.5 Q1
81 | 20189744 M2.2 Q2
82 | 20189762 M2.4 x NB
83 | 20189764 M2.3 Q2 13s
84 | 20189766 M2.4 Q2 NOISE
85 | 20189804 M2.0 Q2 NB
86 | 20189806 M2.1 x
87 | 20189876 M2.1 Q3 8s
88 | 20189946 M1.9 Q1
89 | 201810184 M2.0 Q2 8s
90 | 201810202 M2.0 Q1
91 | 201810210 M2.3 Q3 NB
92 | 201810742 M2.3 x
93 | 201810878 M2.2 Q2 13s
94 | 201810888 M2.3 x
95 | 201810890 M2.0 x
96 | 201810930 M2.2 Q3 6s
97 | 201810952 M2.0 x
98 | 201810954 M1.9 x
99 | 201810956 M2.0 Q2 NB
100 | 201810972 M2.0 Q2 24s
101 | 201811100 M2.3 Q1 22s
102 | 201847953 M2.3 Q1
103 | 201811185 M2.0 x
104 | 201811393 M2.2 x
105 | 201811499 M1.9 Q2 9s
106 | 201811845 M2.2 Q2 18s
107 | 201811991 M2.1 x
108 | 201811999 M1.9 Q3 NB NOISE
109 | 201812045 M2.0 Q2 12s
110 | 201812057 M2.3 Q1
111 | 201812093 M2.2 Q1
112 | 201812117 M2.4 Q2 10s
113 | 201812137 M2.8 Q3 NB
114 | 201812139 M2.3 x
115 | 201812149 M2.4 Q3 8s
116 | 201812151 M2.6 Q2 10s
117 | 201812237 M2.4 Q1 33s
118 | 201812259 M1.9 Q1 20s
119 | 201812295 M2.0 x
120 | 201812343 M1.9 x
121 | 201812345 M2.1 x
122 | 201812347 M2.5 Q2
123 | 201812377 M2.1 Q2 12s
124 | 201877266 M1.9 x
125 | 201812497 M2.4 Q1 30s
126 | 201812605 M1.9 Q1
127 | 201813039 M2.3 Q1 NB
128 | 201843425 M1.9 x QU
129 | 201813041 M2.3 x QU
130 | 201843426 M1.9 x QU
131 | 201843424 M2.0 x NB QU
132 | 201813043 M2.0 Q3 NOISE
133 | 201813506 M2.3 Q1 NB
134 | 201813510 M2.0 Q2
135 | 201813518 M2.8 Q1
136 | 201813536 M2.9 Q1 13s
137 | 201813550 M1.9 Q2
138 | 201813653 M2.0 xx
139 | 201813743 M2.0 x
140 | 201813765 M2.1 xx
141 | 201869526 M2.1 xx
142 | 201813821 M1.9 Q1 30s
143 | 201813883 M2.5 Q1
144 | 201813987 M1.9 x DOUBLET
145 | 201876997 M1.9 Q2 30s DOUBLET
146 | 201814353 M1.9 x
147 | 201814705 M2.3 Q1
148 | 201814950 M2.1 Q2 9.5s
149 | 201815062 M1.9 Q3
150 | 201815110 M2.3 Q2
151 | 201815228 M2.5 Q2 15s
152 | 201815244 M2.2 x
153 | 201815314 M2.8 Q1 NB
154 | 201815318 M1.9 Q3
155 | 201815372 M2.2 Q1 30s
156 | 201815422 M1.9 Q2 NB
157 | 201815424 M2.1 Q2 15s
158 | 201815580 M1.9 Q2
159 | 201815690 M2.2 x DOUBLET
160 | 201820962 M2.6 Q2 DOUBLET
161 | 201815926 M2.4 x
162 | 201815928 M2.1 x
163 | 201816030 M2.0 Q3 NOISE
164 | 201816074 M2.5 Q1
165 | 201816094 M2.0 Q2 NB
166 | 201816096 M2.6 Q1 NOISE
167 | 201816470 M2.8 x
168 | 201820961 M3.3 Q1
169 | 201816622 M1.9 Q2 29s NOWEBNET
170 | 201816624 M2.5 Q1
171 | 201816642 M2.0 Q1 32s
172 | 201816718 M2.9 Q1
173 | 201816998 M1.9 Q2 10s
174 | 201817042 M2.3 Q1
175 | 201817046 M2.0 Q1
176 | 201817376 M2.3 x DOUBLET
177 | 201817378 M2.0 Q2 30s DOUBLET
178 | 201817540 M1.9 x
179 | 201818254 M2.4 Q1
180 | 201818332 M2.1 Q2 DOUBLET
181 | 201818346 M1.9 Q1
182 | 201873221 M2.0 Q2 20s DOUBLET
183 | 201818372 M3.0 Q1 NB
184 | 201818374 M2.2 x NOISE
185 | 201831216 M2.2 Q2
186 | 201818492 M2.8 Q1
187 | 201818988 M2.0 Q2
188 | 201818994 M2.0 x
189 | 201819010 M2.4 Q1 NB
190 | 201819012 M2.6 Q3 5s
191 | 201819712 M1.9 Q1 30s
192 | 201819814 M2.3 x WRONGSPICK_NKCLBC_CORRECTED_IN_PHA_FILE
193 | 201819816 M1.9 x
194 | 201819900 M1.9 Q2 24s
195 | 201819908 M2.0 x
196 | 201821074 M1.9 Q2
197 | 201821112 M2.7 Q2
198 | 201821136 M2.0 Q2 30s
199 | 201821178 M2.2 Q1
200 | 201821188 M1.9 Q2 14s
201 | 201821210 M1.9 Q2 15s
202 | 201821232 M1.9 Q2
203 | 201821266 M2.0 Q1 NB
204 | 201821286 M3.2 Q2
205 | 201821374 M1.9 Q3 NB
206 | 201821376 M2.4 Q2
207 | 201821394 M2.4 x NB
208 | 201821396 M2.4 Q2 17s
209 | 201821408 M2.1 Q1
210 | 201821444 M2.7 Q1 NB
211 | 201821458 M2.5 Q1 35s
212 | 201821510 M1.9 x
213 | 201821520 M2.1 Q1 29s
214 | 201821574 M2.0 Q1 NB
215 | 201821576 M2.0 Q2 NB
216 | 201821594 M2.4 Q1
217 | 201821600 M2.6 Q1 30s
218 | 201821618 M2.0 Q3 NB
219 | 201821626 M2.3 Q1 NB
220 | 201821672 M2.1 x
221 | 201821686 M2.3 Q2 NB
222 | 201821688 M2.1 x
223 | 201821690 M2.3 Q3 NOISE
224 | 201821716 M1.9 Q3 NB
225 | 201821718 M2.0 Q2 15s
226 | 201821738 M2.6 Q1
227 | 201822004 M2.3 Q1 21s
228 | 201822056 M1.9 x
229 | 201822126 M2.0 x DOUBLET
230 | 201822128 M1.9 Q2 DOUBLET
231 | 201822160 M2.1 Q2 16s
232 | 201822288 M2.4 Q2
233 | 201822428 M1.9 Q2
234 | 201822608 M2.7 Q1
235 | 201822768 M2.3 Q3 NB
236 | 201822792 M2.0 Q3 18s DOUBLET
237 | 201822870 M2.4 Q2 NB
238 | 201823270 M2.8 Q3 NB
239 | 201823272 M2.3 x
240 | 201823308 M2.2 x
241 | 201832863 M2.1 Q3 NB DOUBLET
242 | 201823310 M2.1 x
243 | 201823312 M2.2 Q2 DOUBLET NOISE
244 | 201823346 M1.9 Q1
245 | 201823554 M2.2 Q2 NB
246 | 201823556 M2.8 Q1 NB
247 | 201823988 M2.1 Q1
248 | 201824030 M2.1 Q1
249 | 201824244 M2.0 Q1
250 | 201824450 M2.3 Q2 NB
251 | 201825170 M2.2 Q2 NB
252 | 201825814 M2.0 Q1
253 | 201827252 M2.2 Q1
254 | 201837010 M2.2 Q1
255 | 201827856 M1.9 Q1
256 | 201829066 M1.9 Q1 NOWEBNET
257 | 201829236 M2.0 Q1
258 | 201830842 M1.9 Q1
259 | 201832408 M2.2 x
260 | 201832416 M2.0 Q3 NB NOISE
261 | 201832418 M2.1 x
262 | 201832446 M2.4 Q1
263 | 201832650 M2.4 Q3 NB
264 | 201832652 M2.1 Q2 28s
265 | 201833280 M2.0 Q2 NB DOUBLET
266 | 201833648 M2.1 Q2
267 | 201834060 M2.0 x
268 | 201877386 M2.1 Q1
269 | 201834508 M3.0 Q1
270 | 201834530 M2.1 Q1
271 | 201834588 M2.1 Q1 POC?
272 | 201837218 M3.3 Q1
273 | 201837228 M2.4 Q1
274 | 201837268 M2.1 Q2 NB
275 | 201837270 M1.9 Q3 5s
276 | 201837294 M2.4 Q2 20s
277 | 201837304 M2.4 Q3 NB
278 | 201837308 M3.0 Q2 19s
279 | 201834666 M2.8 Q1
280 | 201834686 M2.1 Q2 13s
281 | 201834832 M2.1 x
282 | 201835038 M2.1 x
283 | 201835040 M3.5 Q1
284 | 201835050 M2.7 Q1
285 | 201835220 M1.9 Q1
286 | 201835368 M2.3 Q2 18.5s
287 | 201835436 M2.2 Q1
288 | 201835748 M2.5 Q1 NB
289 | 201835750 M2.3 Q2
290 | 201836160 M2.5 Q1
291 | 201836704 M2.1 Q1
292 | 201837650 M2.3 Q1 33s
293 | 201837774 M2.5 Q3 NB
294 | 201843141 M1.9 x
295 | 201837410 M2.0 Q2 33s
296 | 201838048 M2.7 Q1 NB
297 | 201838074 M3.2 Q1
298 | 201839137 M2.0 Q1
299 | 201839222 M1.9 x
300 | 201839490 M2.2 Q2 13s
301 | 201840856 M2.0 Q1
302 | 201841304 M1.9 x
303 | 201841526 M2.0 Q1
304 | 201842648 M1.9 Q2 20s
305 | 201842988 M2.0 Q1 35s
306 | 201843006 M2.2 Q1
307 | 201844139 M2.1 x
308 | 201844351 M2.0 Q2 17s
309 | 201845605 M3.0 Q1
310 | 201845629 M2.2 Q1
311 | 201845973 M2.5 Q1
312 | 201846013 M2.1 Q1 36s
313 | 201846343 M2.0 Q1
314 | 201846625 M2.1 Q1 17s
315 | 201846631 M2.1 Q1
316 | 201849960 M2.0 x
317 | 201847095 M2.6 Q2 DOUBLET
318 | 201847435 M2.7 Q1 R2
319 | 201847691 M2.4 Q1 R2 33s
320 | 201851459 M2.3 Q1 R2
321 | 201851461 M3.1 Q1 R2
322 | 201851822 M2.1 Q1
323 | 201852116 M2.3 Q1 R2
324 | 201853920 M1.9 Q2 25s
325 | 201855605 M2.2 Q2 NB
326 | 201855607 M2.5 Q1
327 | 201855675 M2.1 Q1 29s
328 | 201855859 M2.0 Q1
329 | 201855863 M3.0 Q1
330 | 201855981 M1.9 Q1 26s
331 | 201856083 M2.6 x DOUBLET
332 | 201858578 M2.6 Q1 NB DOUBLET
333 | 201856085 M2.2 x DOUBLET
334 | 201858699 M2.4 Q3 DOUBLET NB
335 | 201856087 M3.8 Q1 NB
336 | 201856089 M1.9 x
337 | 201856095 M2.5 Q3 NB
338 | 201856097 M2.1 x
339 | 201856153 M2.6 Q1 22s
340 | 201856219 M2.1 Q1 36s
341 | 201856229 M3.0 Q1
342 | 201856935 M2.4 Q1
343 | 201857247 M2.2 Q1
344 | 201857277 M1.9 Q1
345 | 201857869 M2.4 Q2 NOWEBNET
346 | 201858187 M1.9 Q2
347 | 201859040 M2.3 Q1 NOWEBNET
348 | 201859387 M2.9 Q1 36s
349 | 201860602 M2.1 Q1 29s
350 | 201861630 M2.3 Q1
351 | 201864722 M2.7 Q1
352 | 201862742 M1.9 Q1
353 | 201862744 M3.0 Q1
354 | 201862914 M2.4 Q1
355 | 201863130 M2.5 Q1
356 | 201864482 M2.2 Q1
357 | 201864605 M2.0 Q2 NB
358 | 201865574 M2.0 Q1
359 | 201865756 M2.4 Q1 23s
360 | 201866311 M2.8 Q1
361 | 201866337 M2.0 Q2 NB
362 | 201866341 M2.1 Q2 NOISE
363 | 201866367 M1.9 Q2 18s
364 | 201866387 M2.5 Q1 NB
365 | 201866391 M2.3 x
366 | 201866393 M2.3 Q2 17s
367 | 201866407 M2.2 Q2 17s
368 | 201868287 M1.9 x
369 | 201866415 M2.1 Q2 14s
370 | 201869746 M2.5 Q1
371 | 201869844 M2.3 Q1
372 | 201870687 M2.1 Q1
373 | 201870893 M2.1 Q1
374 | 201871472 M2.2 Q1
375 | 201872635 M1.9 Q1
376 | 201874434 M2.1 Q1
377 | 201874528 M2.0 Q1
378 | 201875354 M2.7 Q1
379 | 201875356 M1.9 Q3 NOISE
--------------------------------------------------------------------------------
/data2/ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_Scoda_envelope_dist<1km_pw2.h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/data2/ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_Scoda_envelope_dist<1km_pw2.h5
--------------------------------------------------------------------------------
/figs/corr.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/corr.pdf
--------------------------------------------------------------------------------
/figs/corrs_vs_dist.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/corrs_vs_dist.pdf
--------------------------------------------------------------------------------
/figs/eventmap.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/eventmap.pdf
--------------------------------------------------------------------------------
/figs/eventmap_interpretation.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/eventmap_interpretation.pdf
--------------------------------------------------------------------------------
/figs/focal.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/focal.pdf
--------------------------------------------------------------------------------
/figs/hist.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/hist.pdf
--------------------------------------------------------------------------------
/figs/inclination.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/inclination.pdf
--------------------------------------------------------------------------------
/figs/maxima_vs_dist.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/maxima_vs_dist.pdf
--------------------------------------------------------------------------------
/figs/method.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/method.pdf
--------------------------------------------------------------------------------
/figs/methods.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/methods.pdf
--------------------------------------------------------------------------------
/figs/topomap.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/topomap.pdf
--------------------------------------------------------------------------------
/figs/vpvs.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/vpvs.pdf
--------------------------------------------------------------------------------
/figs/vs_time.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/figs/vs_time.pdf
--------------------------------------------------------------------------------
/load_data.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | """
4 | Recreate time window file if necessary, create plots of envelopes of all events
5 | """
6 |
7 | import collections
8 | import json
9 | import pickle
10 | import os.path
11 |
12 | import matplotlib.pyplot as plt
13 | from matplotlib.patches import Rectangle
14 | import numpy as np
15 | from obspy import read
16 | import obspy
17 | from tqdm import tqdm
18 |
19 | from util.events import events2lists, event2list
20 | from util.signal import envelope, smooth, get_local_minimum
21 | from util.misc import collapse_json
22 |
23 |
24 | # ('KVCN', 'P')
25 | # ('NB', 'P')
26 | # ('WERN', 'P')
27 | # ('ROHR', 'P')
28 | # ('KAC', 'P')
29 | # ('GUNZ', 'P')
30 | # ('MULD', 'P')
31 | # ('TANN', 'P')
32 | # ('LAEN', 'P')
33 | # ('WERD', 'P')
34 | # ('TRIB', 'P')
35 | # ('VIEL', 'P')
36 | # ('KVCN', 'S')
37 | # ('NB', 'S')
38 | # ('WERN', 'S')
39 | # ('ROHR', 'S')
40 | # ('KAC', 'S')
41 |
42 | # STATIONS = "NKC NKCN LBC KOPD VAC KVC KVCN STC POC SKC KRC NB WERN ROHR KAC MAC KOC GUNZ MULD TANN HUC LAEN ZHC TRC WERD TRIB LAC VIEL".split()
43 | # STATIONS = "NKC LBC VAC KVC STC POC SKC KRC WERN ROHR MAC KOC GUNZ MULD TANN HUC LAEN ZHC TRC WERD TRIB LAC VIEL".split()
44 | STATIONS = "NKC LBC VAC KVC STC POC SKC KRC WERN ROHR MAC KOC GUNZ MULD TANN HUC LAEN ZHC TRC WERD TRIB LAC VIEL MANZ MSBB MROB MKON MGBB PLN TANN ROHR WERN".split()
45 | MAN_PICKS = dict(P={'NKC': 1.6, 'LBC': 1.8, 'WERN': 2, 'VAC': 2, 'ROHR': 2.2, 'KVC': 2, 'GUNZ': 3, 'MULD': 3.2, 'TANN': 3.2, 'LAEN': 3.7, 'ZHC': 4.2, 'WERD': 4.2, 'TRIB': 4.2, 'VIEL': 4.5, 'MANZ': 5, 'MSBB': 5, 'MROB': 5, 'MKON': 5, 'MGBB': 5, 'PLN': 5},
46 | S={'NKC': 3, 'LBC': 3, 'WERN': 3.5,'ROHR': 4, 'KVC': 3.8, 'GUNZ': 5.5, 'MULD': 6, 'TANN': 6, 'LAEN': 6.5, 'ZHC': 7, 'WERD': 7.5, 'TRIB': 7.6, 'VIEL': 7.6, 'MANZ': 8, 'MSBB': 8, 'MROB': 8, 'MKON': 8, 'MGBB': 8, 'PLN': 8})
47 |
48 | YEAR = '2018'
49 | MAG = 1.8
50 | EVENT_FILE_ALL = 'data/catalog_2018swarm.pha'
51 | PKL_EVENT_FILE_ALL = f'tmp/cat_{YEAR}.pkl'
52 |
53 | DATA_FILES1 = 'data/waveforms/{id}_{sta}_?H?.mseed' # WEBNET
54 | #DATA_FILES2 = 'data_2018_other/{id}_*/*.mseed' # BGR, LMU
55 |
56 | QCFILE = 'data2/2018_events_qc_mag1.9.txt'
57 | TWFILE = 'data2/2018_events_tw_mag1.9.json'
58 | TXTFILE = 'tmp/cat_2018_mag>1.8.txt'
59 | LATLON0 = (50.25, 12.45)
60 |
61 |
62 | def get_events():
63 | from obspy import read_events
64 | if not os.path.exists(PKL_EVENT_FILE_ALL):
65 | events = read_events(EVENT_FILE_ALL)
66 | with open(PKL_EVENT_FILE_ALL, 'wb') as f:
67 | pickle.dump(events, f, pickle.HIGHEST_PROTOCOL)
68 | with open(PKL_EVENT_FILE_ALL, 'rb') as f:
69 | return pickle.load(f)
70 |
71 |
72 | def create_event_qc_list(events):
73 | events = sorted(events, key=lambda ev: ev.origins[0].time)
74 | ids, _, _, _, _, mags, _ = zip(*events2lists(events))
75 | out = ' \n'.join(i + ' M' + str(m) for i, m in zip(ids, mags))
76 | with open('tmp/2018_evt_list.txt', 'w') as f:
77 | f.write(out)
78 |
79 |
80 | def get_picks(events, manipulate=True):
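   |     """Return absolute picks, relative picks, mean relative picks and a
   |     mapping of each event to temporally close events (-20 s to +50 s).
   | 
   |     Missing station/phase picks are filled with the mean relative pick
   |     time over all events or with the manual estimates in MAN_PICKS."""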
81 | def _pkey(seedid):
82 | sta, cha = seedid.split('.')[1::2]
83 | return (sta, 'P' if cha[-1] == 'Z' else 'S' if cha[-1] == 'N' else None)
84 | events = sorted(events, key=lambda ev: ev.origins[0].time)
85 | ids, otimes, _, _, _, _, picks = zip(*events2lists(events))
86 | relpicks = collections.defaultdict(list)
87 | for otime, event_picks in zip(otimes, picks):
88 | for seedid, phase, ptime in event_picks:
89 | assert _pkey(seedid)[1] == phase
90 | relpicks[_pkey(seedid)].append(ptime - otime)
91 | relpicks_mean = {k: (np.mean(v), 'mean') for k, v in relpicks.items()}
92 | for phase in 'PS':
93 | for sta, v in MAN_PICKS[phase].items():
94 | if (sta, phase) not in relpicks:
95 | relpicks_mean[(sta, phase)] = (v, 'manual')
96 | allpicks = {}
97 | relpicks = {}
98 | for id_, otime, event_picks in zip(ids, otimes, picks):
99 | abs_event_picks = {_pkey(seedid): (ptime, 'pick') for seedid, phase, ptime in event_picks}
100 | rel_event_picks = {_pkey(seedid): (ptime - otime, 'pick') for seedid, phase, ptime in event_picks}
101 | for phase in 'PS':
102 | for sta in STATIONS:
103 | if (sta, phase) not in abs_event_picks:
104 | relpick = relpicks_mean[(sta, phase)]
105 | abs_event_picks[(sta, phase)] = (otime + relpick[0], relpick[1])
106 | rel_event_picks[(sta, phase)] = relpick
107 | allpicks[id_] = abs_event_picks
108 | relpicks[id_] = rel_event_picks
109 | ev2ev = {id1: tuple(id2 for id2, time2 in zip(ids, otimes) if id1 != id2 and -20 < time2 - time1 < 50) for id1, time1 in zip(ids, otimes)}
110 | return allpicks, relpicks, relpicks_mean, ev2ev
111 |
112 |
113 | def get_tw(tr, otime, spickrel):
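   |     """Return the S coda time window of an envelope trace.
   | 
   |     The window starts 1 s after the S pick and ends where the envelope
   |     falls below a noise estimate (1.2 times the 5th percentile of the
   |     pre-event amplitudes) or at a pronounced local minimum. Times are
   |     returned relative to the origin time, together with the noise level."""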
114 | # i1 = np.argmax(tr.slice(otime, otime + 10).data)
115 | t1 = otime + spickrel + 1
116 | noise = 1.2 * np.percentile(tr.slice(None, otime + 1).data, 5)
117 | try:
118 | i2 = np.where(tr.slice(t1, None).data < noise)[0][0]
119 |     except IndexError:  # envelope never falls below the noise level
120 | t2 = tr.stats.endtime
121 | else:
122 | t2 = t1 + i2 * tr.stats.delta
123 | t3 = get_local_minimum(tr.slice(t1, t2), ratio=10, seconds_before_max=2.5)
124 | if t3 is not None:
125 | t3 = t3 - otime
126 | return t1 - otime, t2 - otime, t3, noise
127 |
128 |
129 | def iter_data(events, alldata=False):
130 | for event in tqdm(events):
131 | id_, *_ = event2list(event)
132 | try:
133 | stream = read(DATA_FILES1.format(id=id_, sta='*'))
134 | except Exception as ex:
135 | print(ex, id_)
136 | continue
137 | if alldata:
138 | stream2 = read(DATA_FILES2.format(id=id_))
139 | stream += stream2
140 | yield stream, event
141 |
142 |
143 | def get_envelope(tr):
144 | tr = tr.copy()
145 | tr.data = envelope(tr.data)
146 | tr.data = smooth(tr.data, int(round(0.2 * tr.stats.sampling_rate)))
147 | return tr
148 |
149 |
150 | def single_plot(stream, event, tw=None):
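   |     """Plot waveforms, envelopes, picks and coda time windows of one
   |     event at all stations and save the figure as PNG in tmp/."""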
151 | id_, otime, lon, lat, dep, mag, _ = event2list(event)
152 | stream.filter('highpass', freq=5).trim(otime-20, otime + 60)
153 | stream.traces = sorted(stream.traces, key=lambda tr: (STATIONS.index(tr.stats.station), tr.stats.channel))
154 | fig = plt.figure(figsize=(15, 10))
155 | ax = fig.add_subplot(111)
156 | if tw is None:
157 | ax.annotate(f'{id_} M{mag:.1f}', (0.05, 0.95), xycoords='axes fraction', va='top')
158 | else:
159 | ax.annotate(f'{id_} M{mag:.1f} {tw[id_][1]}', (0.05, 0.95), xycoords='axes fraction', va='top')
160 |
161 | ax.axvline(0, color='C0')
162 | ax.axvline(10, color='0.8')
163 | ax.axvline(40, color='0.8')
164 | for i, tr in enumerate(stream):
165 | ax.annotate(tr.id, (49, i), ha='right')
166 | try:
167 | scale = 1 / np.max(tr.slice(otime + 2, otime + 15).data)
168 | except ValueError as ex:
169 | print(str(ex))
170 | continue
171 | # plot data and envelope
172 | trenv = get_envelope(tr)
173 | tr.trim(otime-10, otime + 50)
174 | trenv.trim(otime-10, otime + 50)
175 | ax.plot(tr.times(reftime=otime), i + scale * tr.data, color='0.6', lw=1)
176 | ax.plot(trenv.times(reftime=otime), i + scale * trenv.data, color='C0', lw=1)
177 | # plot picks
178 | sta = tr.stats.station
179 | ppick = RELPICKS[id_][(sta, 'P')][0]
180 | spick = RELPICKS[id_][(sta, 'S')][0]
181 | ax.vlines([ppick, spick], i-0.25, i + 0.25, color='C1', zorder=10)
182 | p_ = [RELPICKS[id2][sta, phase][0] for id2 in EV2EV[id_] for phase in 'PS']
183 | p_ = [p for p in p_ if -10 < p < 50]
184 | ax.vlines(p_, i-0.25, i + 0.25, color='C2', zorder=10)
185 | if tw is None:
186 | # calc and plot time windows
187 | t1, t2, t3, _ = get_tw(trenv, otime, spick)
188 | if t3 is not None:
189 | rect = Rectangle((t2, i-0.4), t3-t2, 0.8, alpha=0.1, facecolor='C1')
190 | ax.add_patch(rect)
191 | rect = Rectangle((t1, i-0.4), t2-t1, 0.8, alpha=0.2, facecolor='C0')
192 | ax.add_patch(rect)
193 | else:
194 | try:
195 | t1, t2 = tw[id_][2][stacomp(tr.id)][:2]
196 | except KeyError:
197 | pass
198 | else:
199 | rect = Rectangle((t1, i-0.4), t2-t1, 0.8, alpha=0.3, facecolor='C0')
200 | ax.add_patch(rect)
201 | # noise = tw[id_][2][stacomp(tr.id)][3]
202 | # ax.axhline(i + scale * noise, color='0.6')
203 | ax.set_xlabel('time (s)')
204 | ax.set_ylim(-2, len(stream) + 2)
205 | ax.set_xticks([-10, -5, 0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50])
206 | ax.set_xticks(range(-10, 51), minor=True)
207 | ax.grid(which='major', axis='x')
208 | ax.grid(which='minor', axis='x', linestyle='--')
209 | ax.spines['top'].set_visible(False)
210 | ax.spines['left'].set_visible(False)
211 | ax.spines['right'].set_visible(False)
212 | ax.tick_params(labelleft=False, left=False)
213 | if tw is None:
214 | plt.savefig(f'tmp/{otime!s:.21}_{id_}_M{mag:.1f}.png', bbox_inches='tight')
215 | else:
216 | plt.savefig(f'tmp/{otime!s:.21}_{id_}_M{mag:.1f}_{tw[id_][1]}.png', bbox_inches='tight')
217 | plt.close()
218 |
219 |
220 | def stacomp(seedid):
221 | sta, cha = seedid.split('.')[1::2]
222 | return sta + '.' + cha[-1]
223 |
224 |
225 | def tw_from_qc_file(events=None):
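   |     """Load the time window file, rebuilding it from the QC file if needed.
   | 
   |     The coda window of each trace starts 1 s after the S pick and ends at
   |     the first pick of an interfering event, where the envelope reaches the
   |     noise level, at a rising envelope, or at the visually determined time
   |     from the QC file, whichever comes first. The quality flags Q1/Q2/Q3/x
   |     are mapped to 3/2/1/0."""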
226 | if os.path.exists(TWFILE):
227 | with open(TWFILE) as f:
228 | return json.load(f)
229 | print('Need to rebuild time window file ...')
230 | assert events
231 | with open(QCFILE) as f:
232 | text = f.read()
233 | qc = {line.split()[0]: line.split(maxsplit=2)[2] for line in text.splitlines() if line.startswith('2018')}
234 | tw = {}
235 | for stream, event in iter_data(events):
236 | id_, otime, lon, lat, dep, mag, _ = event2list(event)
237 | trtw = {}
238 | stream.filter('highpass', freq=5).trim(otime-20, otime + 60)
239 | for tr in stream:
240 | sta = tr.stats.station
241 | spick = RELPICKS[id_][(sta, 'S')][0]
242 | ppick = RELPICKS[id_][(sta, 'P')][0]
243 | t1 = spick + 1
244 | t2 = 50; r = 'max_tw'
245 | p_ = [RELPICKS[id2][sta, phase][0] for id2 in EV2EV[id_] for phase in 'PS']
246 | p_ = [p for p in p_ if spick + 1 < p < 50]
247 | if len(p_) > 0:
248 | t3 = sorted(p_)[0]
249 | if t3 < t2:
250 | t2 = t3
251 | r = 'next_pick'
252 | trenv = get_envelope(tr).trim(otime - 15, otime + 55)
253 | try:
254 | _, t3, t4, noise = get_tw(trenv, otime, spick)
255 | assert round(t1 - _, 5) == 0
256 | except IndexError as ex:
257 |             print('error in calculating noise', id_, tr.id, str(ex)); noise = 0.0  # fallback so the window entry below can still be written
258 | else:
259 | if t4 is None and t3 < t2:
260 | t2 = t3
261 | r = 'noise_level_reached'
262 | elif t4 is not None and t4 < t2:
263 | if r == 'max_tw' or t4+2.4 < t2:
264 | t2 = t4
265 | r = 'raising_envelope'
266 | fields = qc[id_].split()
267 | if 's' in qc[id_]:
268 | t3 = ppick + [float(f[:-1]) for f in fields if f.endswith('s')][0]
269 | if t3 < t2:
270 | t2 = t3
271 | r = 'visually_inspected'
272 | trtw[stacomp(tr.id)] = (round(t1, 2), round(t2, 2), r, round(noise, 2))
273 | qdict = {'xx': 0, 'x': 0, 'Q1': 3, 'Q2': 2, 'Q3': 1}
274 | q = qdict[[f for f in fields if f[0] in 'Qx'][0]]
275 | tw[id_] = [q, qc[id_], trtw]
276 | text = collapse_json(json.dumps(tw, indent=2), 6)
277 | with open(TWFILE, 'w') as f:
278 | f.write(text)
279 | return tw
280 |
281 |
282 | def ev2id(event):
283 | return str(event.resource_id).split('/')[-1]
284 |
285 |
286 | def select_events(events, tw, qclevel):
287 | events = [ev for ev in events if tw.get(ev2id(ev), (0,))[0] >= qclevel]
288 | events = [ev for ev in events if 'NOWEBNET' not in tw[ev2id(ev)][1]]
289 | return obspy.Catalog(events)
290 |
291 |
292 | if __name__ == '__main__':
293 | print('load events')
294 | allevents = get_events()
295 | print('finished loading')
296 | allevents = obspy.Catalog(sorted(allevents, key=lambda ev: ev.origins[0].time))
297 | PICKS, RELPICKS, _, EV2EV = get_picks(allevents)
298 | events = allevents.filter(f'magnitude > {MAG}')
299 | # create_event_qc_list(events)
300 | tw = tw_from_qc_file(events)
301 | for stream, event in iter_data(events, alldata=False):
302 | single_plot(stream, event, tw=tw)
303 |
--------------------------------------------------------------------------------
/plot_maps.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | import matplotlib.pyplot as plt
4 | import numpy as np
5 |
6 | from matplotlib.dates import DateFormatter
7 | from matplotlib.colors import ListedColormap, BoundaryNorm
8 | from matplotlib.colorbar import ColorbarBase
9 | import obspy
10 | from obspy import UTCDateTime as UTC
11 | from obspy.imaging.beachball import beach
12 |
13 | from load_data import get_events
14 | from util.events import events2lists, load_webnet
15 | from util.imaging import add_ticklabels, convert_coords2km
16 |
17 |
18 | def get_bounds():
19 | bounds=['2018-05-10', '2018-05-11 03:10:00', '2018-05-12 06:00', '2018-05-21', '2018-05-25', '2018-06-19']
20 | return [UTC(b) for b in bounds]
21 |
22 |
23 | def get_colors():
24 |     # cmap1 = plt.get_cmap('Accent')
25 |     # N = 8
26 |     # colors = [cmap1((i+0.5) / N) for i in range(N)]
27 | colors = ['#7fc97f', '#beaed4', '#fdc086', '#ff9896', '#386cb0']
28 | return colors
29 |
30 |
31 | def get_cmap(extend=False):
32 | bounds = [b.matplotlib_date for b in get_bounds()]
33 | if extend:
34 | bounds[-1]=UTC('2018-05-26').matplotlib_date
35 | colors = get_colors()[:len(bounds)-1]
36 | cmap = ListedColormap(colors, name='AccentL')
37 | norm = BoundaryNorm(bounds, ncolors=len(colors))
38 | return cmap, norm
39 |
40 |
41 | def plot_events_stations_map_depth(events, inv=None, figsize=(8,8), out=None, show=True,
42 | dpi=300, convert_coords=False, plotall=True, colorbar=True, label=True, ms=9):
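   |     """Plot an epicenter map together with two depth sections of the
   |     events, colored by origin time according to the swarm phases."""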
43 | id_, time, lon, lat, dep, mag, *_ = zip(*events2lists(events.filter('magnitude > 1.8')))
44 | _, _, lon2, lat2, dep2, _, *_ = zip(*events2lists(events))
45 | print(time[0], time[-1])
46 |
47 | fig = plt.figure(figsize=figsize)
48 | ax3 = fig.add_axes((0.1, 0.5, 0.4, 0.4))
49 | ax4 = fig.add_axes((0.52, 0.5, 0.35, 0.4), sharey=ax3)
50 | ax5 = fig.add_axes((0.1, 0.5-0.37, 0.4, 0.35), sharex=ax3)
51 | def _on_lims_changed(ax, boo=[True]):
52 | if boo[0]:
53 | boo[0] = False
54 | if ax == ax5:
55 | ax4.set_xlim(ax5.get_ylim()[::-1])
56 | if ax == ax4:
57 | ax5.set_ylim(ax4.get_xlim()[::-1])
58 | boo[0] = True
59 |
60 | mpl = [t.matplotlib_date for t in time]
61 | ax5.invert_yaxis()
62 | ax5.callbacks.connect('ylim_changed', _on_lims_changed)
63 | ax4.callbacks.connect('xlim_changed', _on_lims_changed)
64 | ax4.yaxis.tick_right()
65 | ax4.yaxis.set_label_position("right")
66 | ax3.xaxis.tick_top()
67 | ax3.xaxis.set_label_position("top")
68 | ax4.xaxis.tick_top()
69 | ax4.xaxis.set_label_position("top")
70 |
71 | if convert_coords:
72 | latlon0 = None if convert_coords==True else convert_coords
73 | x, y = zip(*convert_coords2km(list(zip(lat, lon)), latlon0=latlon0))
74 | x2, y2 = zip(*convert_coords2km(list(zip(lat2, lon2)), latlon0=latlon0))
75 | else:
76 | x, y = lon, lat
77 | x2, y2 = lon2, lat2
78 | mag = np.array(mag)
79 |
80 | cmap, norm = get_cmap()
81 | if plotall:
82 | ax3.scatter(x2, y2, 4, color='0.6')
83 | ax4.scatter(dep2, y2, 4, color='0.6')
84 | ax5.scatter(x2, dep2, 4, color='0.6')
85 | ax3.scatter(x, y, ms, mpl, cmap=cmap, norm=norm)
86 | ax4.scatter(dep, y, ms, mpl, cmap=cmap, norm=norm)
87 | ax5.scatter(x, dep, ms, mpl, cmap=cmap, norm=norm)
88 | if colorbar:
89 | ax7 = fig.add_axes([0.56, 0.42, 0.34, 0.02])
90 | cmap, norm = get_cmap(extend=True)
91 | cbar = ColorbarBase(ax7, cmap=cmap, norm=norm, orientation='horizontal', format=DateFormatter('%Y-%m-%d'), extend='max', spacing='proportional')#, extendfrac=0.2)
92 | cbar.ax.set_xticklabels(cbar.ax.get_xticklabels()[:-1] + ['until 2018-06-19'], rotation=60, ha='right', rotation_mode='anchor')
93 | xticks = cbar.ax.get_xticks()
94 | xticks = np.mean([xticks[1:], xticks[:-1]], axis=0)
95 |         for xpos, lab in zip(xticks, 'abcde'):
96 |             # this line only works for matplotlib 3.1 at the moment
97 |             cbar.ax.annotate(lab, (xpos, 0.1), ha='center', fontstyle='italic')
98 |
99 | ax4.set_xlabel('depth (km)')
100 | ax5.set_ylabel('depth (km)')
101 | if convert_coords:
102 | if label:
103 | ax3.set_xlabel('easting (km)')
104 | ax3.set_ylabel('northing (km)')
105 | ax5.set_xlabel('easting (km)')
106 | ax4.set_ylabel('northing (km)')
107 | else:
108 | if label:
109 | ax3.set_xlabel('longitude')
110 | ax3.set_ylabel('latitude')
111 | ax5.set_xlabel('longitude')
112 | ax4.set_ylabel('latitude')
113 | if out:
114 | plt.savefig(out, dpi=dpi)
115 | if show:
116 | plt.show()
117 | return fig
118 |
119 |
120 | def plot_events2018(events, bb=None):
121 | bbfname = 'data/focal_mechanism_2018swarm.txt'
122 | fig = plot_events_stations_map_depth(events, convert_coords=LATLON0, show=False)
123 | bbs = np.genfromtxt(bbfname, names=True)
124 | xys = [(0, 3), (0, 4), (1, 1), (1, 0), (0, 2), (0, 1), (0, 0),
125 | (1, 2), (0, 5), (0, 6), (1, 3), (1, 5), (1, 4)]
126 | for i, bb in enumerate(bbs):
127 | _, lat, lon, dep, mag, *sdr = bb
128 | xy = convert_coords2km([(lat, lon)], latlon0=LATLON0)[0]
129 | ax = fig.axes[0]
130 | xy2= (xys[i][0]*3.8-1.9, -(xys[i][1] + xys[i][0]*0.5)/6*4 + 2.2)
131 | ax.plot(*list(zip(xy, xy2)), lw=0.5, color='k', zorder=10)
132 | b = beach(sdr, xy=xy2, width=0.5, linewidth=0.5, facecolor='C0')
133 | ax.add_collection(b)
134 | fig.axes[0].set_xlim(-2.5, 2.5)
135 | fig.axes[0].set_ylim(-2.3, 2.7)
136 | fig.axes[1].set_xlim(5.95, 10.95)
137 | fig.savefig('figs/eventmap.pdf', bbox_inches='tight', pad_inches=0.1)
138 |
139 |
140 | def plot_events2018_interpretation(events):
141 | fig = plot_events_stations_map_depth(events, show=False, plotall=False, colorbar=False, label=False, ms=4)
142 | # eqs = load_webnet(stations=False)
143 | # color = 'gray'
144 | # color = '#9bd7ff'
145 | # fig.axes[0].scatter(eqs.lon.values, eqs.lat.values, 4, eqs.time.values, marker='o', zorder=-1)
146 | # fig.axes[1].scatter(eqs.dep.values, eqs.lat.values, 4, eqs.time.values, marker='o', zorder=-1)
147 | # fig.axes[2].scatter(eqs.lon.values, eqs.dep.values, 4, eqs.time.values, marker='o', zorder=-1)
148 | # fig.axes[0].set_xlim(-2.5, 2.5)
149 | # fig.axes[0].set_ylim(-2.3, 2.7)
150 | # lon, lat, dep = 12.42396, 50.22187, 8.5
151 | # lon, lat, dep = 12.45993, 50.27813, 8.5
152 | # lon, lat, dep = 12.45993, 50.22187, 11.5
153 | lon, lat, dep = 12.45993, 50.22187, 8.5
154 | kw = dict(zorder=-1, vmin=3, vmax=4, cmap='Greys', edgecolors='face', linewidths=0.01)
155 | # kw = dict(zorder=-1, vmin=5, vmax=6.5, cmap='Greys')
156 | # kw = dict(zorder=-1, vmin=1.55, vmax=1.75, cmap='Greys')
157 | # load_mousavi()
158 | try:
159 | plot_mousavi(2, dep, 0, 1, 4, ax=fig.axes[0], **kw)
160 | plot_mousavi(0, lon, 2, 1, 4, ax=fig.axes[1], **kw)
161 | im = plot_mousavi(1, lat, 0, 2, 4, ax=fig.axes[2], **kw)
162 | except OSError:
163 | import traceback
164 | print(traceback.format_exc())
165 | print('Mousavi model is not provided in this repository')
166 | im = None
167 | kw = dict(color='red', lw=2, zorder=0)
168 | kw = dict(color='C1', alpha=0.8, lw=2, zorder=0)
169 | ax0, ax1, ax2, *_ = fig.axes
170 | ax0.axhline(lat, **kw)
171 | ax0.axvline(lon, **kw)
172 | ax1.axvline(dep, **kw)
173 | ax2.axhline(dep, **kw)
174 | ax0.set_xlim(12.388, 12.53187)
175 | ax0.set_ylim(50.10935, 50.3344)
176 | ax1.set_xlim(0, 12)
177 | ax1.set_xticks(ax2.get_yticks())
178 | cax2 = fig.add_axes([0.56, 0.42, 0.34, 0.02])
179 | cmap, norm = get_cmap()
180 | cbar = ColorbarBase(cax2, cmap=cmap, norm=norm, orientation='horizontal',
181 | format=DateFormatter('%Y-%m-%d'))
182 | cax2.set_xticklabels([])
183 | cax2.set_xlabel('intra-cluster\nS wave velocity (km/s)', labelpad=12)
184 | for i, label in enumerate('abcde'):
185 | cax2.annotate(label, ((i+0.5) / 5, 0.1), xycoords='axes fraction',
186 | ha='center', fontstyle='italic')
187 | for i, label in enumerate([4.16, 3.49, 3.66, 3.85, 3.72]):
188 | cax2.annotate(label, ((i+0.5) / 5, -1.2), xycoords='axes fraction',
189 | ha='center')
190 | lonticks = [12.40, 12.45, 12.50]
191 | latticks = [50.15, 50.20, 50.25, 50.30]
192 | add_ticklabels(ax0, lonticks, latticks, fmt='%.2f°')
193 | if im is not None:
194 | cax = fig.add_axes([0.56, 0.27, 0.34, 0.02])
195 | fig.colorbar(im, cax=cax, orientation='horizontal')
196 | for i, label in enumerate([4.16, 3.49, 3.66, 3.85, 3.72]):
197 | cax.annotate('', (label - 3, 1.1), (label - 3, 2.5), xycoords='axes fraction',
198 | textcoords='axes fraction',
199 | arrowprops=dict(width=1, headwidth=4, headlength=6, color=cmap(i)))#, xycoords='axes fraction',
200 | cax.set_xlabel('S wave velocity (km/s)\nMousavi et al. (2015)')
201 | fig.savefig('figs/eventmap_interpretation.pdf', bbox_inches='tight', pad_inches=0.1)
202 | plt.show()
203 |
204 |
205 | def plot_mousavis():
206 | deps = [2.5, 5.5, 8.5, 11.5]
207 | for d in deps:
208 | fig = plt.figure()
209 | ax = fig.add_subplot(111)
210 | # kw = dict(zorder=-1, vmin=3, vmax=4, cmap='Greys', edgecolors='face', linewidths=0.01)
211 | # kw = dict(zorder=-1, vmin=5, vmax=6.5, cmap='Greys')
212 | kw = dict(zorder=-1, vmin=1.55, vmax=1.75, cmap='Greys')
213 | im = plot_mousavi(2, d, 0, 1, 5, ax=ax, **kw)
214 | ax.set_xlim(12.388, 12.53187)
215 | ax.set_ylim(50.10935, 50.3344)
216 | lonticks = [12.40, 12.45, 12.50]
217 | latticks = [50.15, 50.20, 50.25, 50.30]
218 | add_ticklabels(ax, lonticks, latticks, fmt='%.2f°')
219 | fig.colorbar(im, ax=ax)
220 | fig.savefig('tmp/mousavi_vpvs_depth_%04.1fkm.png' % d, bbox_inches='tight', pad_inches=0.1)
221 |
222 |
223 | def events2txt(events, fname):
224 | id_, time, lon, lat, dep, mag, *_ = zip(*events2lists(events))
225 | np.savetxt(fname, np.transpose([lat, lon, mag]), fmt=['%.6f', '%.6f', '%.1f'], header='lat lon mag')
226 |
227 |
228 | def plot_topomap(events=None):
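   |     """Plot a topographic overview map with the used WEBNET stations, the
   |     2018 seismicity, the MLF fault trace and an inset locating the study
   |     area within Central Europe."""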
229 | from cartopy.feature import NaturalEarthFeature as NEF
230 | from matplotlib.colors import LinearSegmentedColormap
231 | import shapely.geometry as sgeom
232 | from util.imaging import EA_EURO, GEO, add_scale, de_border, add_ticklabels, plot_elevation
233 |
234 | mlf = [(12.410, 50.261), (12.485, 50.183), (12.523, 50.127),
235 | (12.517, 50.125), (12.538, 50.131), (12.534, 50.130),
236 | (12.543, 50.109), (12.547, 50.083), (12.547, 50.074),
237 | (12.545, 50.066), (12.546, 50.057), (12.576, 50.034),
238 | (12.594, 50.016), (12.632, 50.004), (12.665, 49.980)]
239 | fig = plt.figure()
240 | # ax = fig.add_axes([0, 0, 1, 1], projection=EA_EURO)
241 | ax = fig.add_subplot(111, projection=EA_EURO)
242 | extent = [12.05, 12.85, 50, 50.45]
243 | ax.set_extent(extent, crs=GEO)
244 |
245 |     # Create an inset GeoAxes showing the location of the study area.
246 | box = ax.get_position().bounds
247 | subax = fig.add_axes([box[0]+box[2]-0.23, box[1]+box[3]-0.3, 0.28, 0.28], projection=EA_EURO)
248 | subax.set_extent([8, 16, 47, 55], GEO)
249 | subax.add_feature(NEF('physical', 'land', '10m'), facecolor='0.7', alpha=0.5, rasterized=True) #facecolor='sandybrown'
250 | subax.add_feature(NEF('physical', 'coastline', '10m'), facecolor='none', edgecolor='k', linewidth=0.5, rasterized=True)
251 | subax.add_feature(NEF('cultural', 'admin_0_boundary_lines_land', '10m'), facecolor='none', edgecolor='k', linewidth=0.5, rasterized=True)
252 | subax.add_geometries([sgeom.box(extent[0], extent[2], extent[1], extent[3])], GEO,
253 | facecolor='none', edgecolor='k', linewidth=1, alpha=0.5)
254 | lonticks = [12.2, 12.4, 12.6, 12.8]
255 | latticks = [50, 50.1, 50.2, 50.3, 50.4]
256 | add_ticklabels(ax, lonticks, latticks)
257 | ax.tick_params(axis='both', which='major', labelsize=8)
258 | plot_elevation(ax, shading=False, cmap=LinearSegmentedColormap.from_list('littlegray', ['white', '0.5']),
259 | azimuth=315, altitude=60, rasterized=True)
260 | add_scale(ax, 10, (12.15, 50.02))
261 | de_border(ax, edgecolor='0.5', rasterized=True)
262 | eqs, sta = load_webnet(stations=True)
263 | if eqs is not None:
264 | ax.scatter(eqs.lon.values, eqs.lat.values, 4, '#9bd7ff', alpha=0.4, marker='.', transform=GEO, rasterized=True)
265 | if events is not None:
266 | _, _, lon, lat, _, mag, *_ = zip(*events2lists(events))
267 | ax.scatter(lon, lat, 4, 'C0', marker='o', transform=GEO)
268 | ax.plot(*list(zip(*mlf)), color='0.5', transform=GEO)
269 | ax.annotate('MLF', (12.505, 50.145), None, GEO._as_mpl_transform(ax), size='x-small', zorder=10, rotation=290)
270 | used_stations = 'NKC LBC VAC KVC STC POC SKC KRC ZHC'.split()
271 | sta = sta[sta.station.isin(used_stations)]
272 | ax.scatter(sta.lon.values, sta.lat.values, 100, marker='^', color='none', edgecolors='k', transform=GEO, zorder=10)
273 | for idx, s in sta.iterrows():
274 | xy = (2, 2) if s.station not in ('KOPD', 'KAC') else (-10, 5)
275 | ax.annotate(s.station, (s.lon, s.lat), xy, GEO._as_mpl_transform(ax), 'offset points', size='x-small', zorder=10)
276 | x0, y0 = EA_EURO.transform_point(LATLON0[1], LATLON0[0], GEO)
277 | ax.add_geometries([sgeom.box(x0-2500, y0-2300, x0+2500, y0+2700)], EA_EURO,
278 | facecolor='none', edgecolor='C1', linewidth=2, alpha=0.8, zorder=11)
279 | fig.savefig('figs/topomap.pdf', bbox_inches='tight', pad_inches=0.1, dpi=300)
280 | plt.show()
281 | return sta
282 |
283 |
284 | def load_mousavi_numpy():
285 | fname = 'data/mousavi2015.txt'
286 | mod = np.loadtxt(fname)
287 | return mod
288 |
289 |
290 | def load_mousavi(*args):
291 | mod = load_mousavi_numpy()
292 | if len(args) == 0:
293 | print('Arguments: ind1, indv, ind2, ind3, ind4')
294 | print('Choose 4 indices from 0-5 (lon, lat, dep, vp, vs, vp/vs)')
295 | print('ind1: select value of this index (0-3)')
296 | print('ind2: x axis (0-3)')
297 | print('ind3: y axis (0-3)')
298 | print('ind4: values (3-5)')
299 | print('indv: Choose val from')
300 | lons = np.array(sorted(set(mod[:, 0])))
301 | lats = np.array(sorted(set(mod[:, 1])))
302 | deps = np.array(sorted(set(mod[:, 2])))
303 | print(lons)
304 | print(lats)
305 | print(deps)
306 | print('Choose extract_ind 3-5 (vp, vs, vp/vs)')
307 | return lons, lats, deps, mod
308 | ind1, indv, ind2, ind3, ind4 = args
309 | import pandas
310 | df = pandas.DataFrame(mod)
311 | df2 = df[df.iloc[:, ind1] == indv].pivot(ind3, ind2, ind4)
312 | return df2.columns.to_numpy(), df2.index.to_numpy(), df2.to_numpy()
313 |
314 |
315 | def get_corners(x, y, z):
316 | x2 = 0.5 * (x[1:] + x[:-1])
317 | y2 = 0.5 * (y[1:] + y[:-1])
318 | return x2, y2, z[1:-1, 1:-1]
319 |
320 |
321 | def plot_mousavi(*args, ax=None, **kw):
322 | if ax is None:
323 | fig = plt.figure()
324 | ax = fig.add_subplot(111)
325 |     ind1, indv, ind2, ind3, ind4 = args
326 | x, y, z = load_mousavi(*args)
327 | x, y, z = get_corners(x, y, z)
328 | x, y = np.meshgrid(x, y)
329 | print('value range', np.min(z), np.max(z))
330 | im = ax.pcolormesh(x, y, z, **kw)
331 | if ind3 == 2:
332 | ax.set_ylim(12, 0)
333 | if ind3 == 1:
334 | ax.set_ylim(49.8, 50.8)
335 | if ind2 == 1:
336 | ax.set_xlim(49.8, 50.8)
337 | if ind2 == 0:
338 | ax.set_xlim(12.1, 12.65)
339 | return im
340 |
341 |
342 | MAG = 1.5
343 |
344 | TXTFILE = 'tmp/cat_2018_mag>1.8.txt'
345 | LATLON0 = (50.25, 12.45)
346 |
347 | if __name__ == '__main__':
348 | # plot_mousavis()
349 | print('load events')
350 | allevents = get_events()
351 | print('finished loading')
352 | allevents = obspy.Catalog(sorted(allevents, key=lambda ev: ev.origins[0].time))
353 | events = allevents.filter(f'magnitude > {MAG}')
354 | events2txt(events, TXTFILE)
355 | plot_events2018(allevents)
356 | sta = plot_topomap(events=allevents)
357 | plot_events2018_interpretation(allevents)
358 | plt.show()
359 |
--------------------------------------------------------------------------------
/traveltime.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 | """
3 | Extract travel times and related measurements ("stuff") from the stacked cross-correlation functions and create various plots
4 | """
5 |
6 |
7 | from copy import copy
8 | import pickle
9 |
10 | import matplotlib.pyplot as plt
11 | import numpy as np
12 | from obspy import read
13 | from obspy.imaging.beachball import beach
14 | import scipy
15 |
16 | from load_data import LATLON0
17 | from util.imaging import convert_coords2km
18 | from util.signal import trim2
19 | from util.source import plot_farfield, nice_points, plot_pnt
20 | from util.xcorr2 import plot_corr_vs_dist_hist, plot_corr_vs_dist, velocity_line
21 | from xcorr import add_distbin
22 |
23 |
24 | def filters(ind, *args):
25 | return [arg[ind] for arg in args]
26 |
27 |
28 | def repack(obj):  # transpose the per-pair tuples from get_stuff into columns
29 |     cols = [np.array(col) if len(col) not in (2, 4) else col for col in zip(*obj)]
30 |     cols[5] = [np.array(c) for c in zip(*cols[5])]  # noise proxies
31 |     cols[9] = [np.array(c) for c in zip(*cols[9])]  # event coordinates
32 |     cols[-2] = [np.array(c) for c in zip(*cols[-2])]  # event origin times
33 |     return cols
34 |
35 |
36 | def get_stuff(stream, stream_phase=None, v=3.6, mindist=0, only_maxima=False):
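   |     """Extract lag times and amplitudes of the correlation maxima.
   | 
   |     For each stacked cross-correlation, pick the global maximum and search
   |     the opposite lag-time branch for a second maximum near the expected
   |     differential travel time 2*dist/v; return these together with noise
   |     proxies, azimuth, inclination, event coordinates and the apparent
   |     velocity, keyed by event pair."""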
37 | if stream_phase is None:
38 | stream_phase = [None] * len(stream)
39 | stuff = {}
40 | for tr, trphase in zip(stream, stream_phase):#[:200]:
41 |         assert trphase is None or tr.stats.stack.group == trphase.stats.stack.group
42 | dist = tr.stats.dist
43 | if dist < mindist:
44 | continue
45 | d = tr.data
46 | absd = np.abs(d)
47 | ad = d if only_maxima else absd
48 | dt = tr.stats.delta
49 | starttime = tr.stats.starttime
50 | midsecs = (tr.stats.endtime - starttime) / 2
51 |
52 | ind_max1 = np.argmax(ad)
53 | max1 = d[ind_max1]
54 | phase1 = 0 if trphase is None else trphase.data[ind_max1]
55 | tmax1 = dt * ind_max1 - midsecs
56 | ind_maxima = scipy.signal.argrelmax(d)[0]
57 | ind_maxima = ind_maxima[ind_maxima != ind_max1]
58 | ind_maxima = sorted(ind_maxima, key=lambda i: absd[i], reverse=True)
59 | # inverse of signal to noise ratio
60 | snr1 = 0
61 | snr2 = absd[ind_maxima[NOISE_IND]] / abs(max1)
62 | # snr3 = ad[ind_maxima[19]] / abs(max1)
63 | snr3 = 0
64 | snr4 = 0
65 | # snr5 = 0
66 | max2 = 0
67 | tmax2 = 0
68 | # for i in ind_maxima:
69 | # dift = abs(i - ind_max1) * dt
70 | # if dift < 0.1:
71 | # # side lobe
72 | # snr4 = max(snr4, ad[i] / abs(max1))
73 | # continue
74 | # if abs(dift - 2 * dist / 1000 / v) < 0.05 and max2 == 0 and snr1 == 0 and np.sign(tmax1) != np.sign(dt * i - midsecs):
75 | # # second maxima
76 | # max2 = d[i]
77 | # tmax2 = dt * i - midsecs
78 | # else:
79 | # snr1 = max(snr1, ad[i] / abs(max1))
80 | if True:
81 | if tmax1 > 0:
82 | ad[len(ad)//2:] = 0
83 | else:
84 | ad[:len(ad)//2] = 0
85 | i = np.argmax(ad)
86 | dift = abs(i - ind_max1) * dt
87 | # if abs(dift - 2 * dist / 1000 / v) < 0.05:
88 | if abs(dift - 2 * dist / 1000 / v) < 0.1:
89 | max2 = d[i]
90 | tmax2 = dt * i - midsecs
91 |
92 |
93 |
94 | v2 = dist / 1000 / abs(tmax1)
95 | # distance, seconds 1, seconds2 (or None), maximum value 1 (+ or -), max value 2 (+ or -),
96 | # noise levels:
97 | # 1: largest peak, not side lobe and not max2 / maximum
98 | # 2: 10th local maximum / maximum
99 | # 3: 20th local maximum / maximum
100 | # 4: side lobe of maximum / maximum
101 | # event id, azimuth, inclination
102 | # coordinates event1 and event2, apparent velocity
103 | bla = (dist, tmax1, tmax2, max1, max2, (snr1, snr2, snr3, snr4), tr.stats.stack.group,
104 | tr.stats.azi, tr.stats.inc,
105 | (_coords(tr.stats, '1'), _coords(tr.stats, '2')), v2, (tr.stats.event_time1, tr.stats.event_time2),
106 | phase1)
107 | stuff[tr.stats.stack.group] = bla
108 | return stuff
109 |
110 |
111 | def plot_stuff_map_depth(stuff, dist=None, tmax1=None, max1=None, max2=None, snr2=None, coords1=None, coords2=None, v=None,
112 | figsize=(8,8), out=None, show=True,
113 | dpi=300, convert_coords=True, plot_lines=False):
114 | if dist is None:
115 | dist, tmax1, tmax2, max1, max2, (snr1, snr2, snr3, snr4), evids, azi, inc, (coords1, coords2), v, *_ = repack(stuff.values())
116 | cond = np.logical_and.reduce((snr2 <= 0.25, dist > 200, inc<50, inc>40, azi>340))
117 | dist, tmax1, max1, max2, snr2, coords1, coords2, v = filters(cond, dist, tmax1, max1, max2, snr2, coords1, coords2, v)
118 | print(len(dist))
119 | if convert_coords:
120 | coords1 = convert_coords2km(coords1, latlon0=LATLON0)
121 | coords2 = convert_coords2km(coords2, latlon0=LATLON0)
122 | x1, y1, dep1 = zip(*coords1)
123 | x2, y2, dep2 = zip(*coords2)
124 | x, y, dep = zip(*np.mean([coords1, coords2], axis=0))
125 |
126 | # import utm
127 | # x, y = zip(*[utm.from_latlon(lat_, lon_)[:2]
128 | # for lat_, lon_ in zip(lat, lon)])
129 | # x1, y1 = zip(*[utm.from_latlon(lat_, lon_)[:2]
130 | # for lat_, lon_ in zip(lat1, lon1)])
131 | # x2, y2 = zip(*[utm.from_latlon(lat_, lon_)[:2]
132 | # for lat_, lon_ in zip(lat2, lon2)])
133 | #
134 | # x1 = (np.array(x1) - np.mean(x)) / 1000
135 | # y1 = (np.array(y1) - np.mean(y)) / 1000
136 | # x2 = (np.array(x2) - np.mean(x)) / 1000
137 | # y2 = (np.array(y2) - np.mean(y)) / 1000
138 | # x = (np.array(x) - np.mean(x)) / 1000
139 | # y = (np.array(y) - np.mean(y)) / 1000
140 | # else:
141 | # x, y = lon, lat
142 |
143 |
144 | fig = plt.figure(figsize=figsize)
145 | ax3 = fig.add_axes((0.1, 0.5, 0.4, 0.4))
146 | ax4 = fig.add_axes((0.52, 0.5, 0.35, 0.4), sharey=ax3)
147 | ax5 = fig.add_axes((0.1, 0.5-0.37, 0.4, 0.35), sharex=ax3)
148 | if plot_lines:
149 | ax6 = fig.add_axes((0.52, 0.5-0.37, 0.35, 0.35))
150 | else:
151 | ax6 = fig.add_axes((0.6, 0.5-0.37, 0.02, 0.35))
152 | def _on_lims_changed(ax, boo=[True]):
153 | if boo[0]:
154 | boo[0] = False
155 | if ax == ax5:
156 | ax4.set_xlim(ax5.get_ylim()[::-1])
157 | if ax == ax4:
158 | ax5.set_ylim(ax4.get_xlim()[::-1])
159 | boo[0] = True
160 |
161 | ax5.invert_yaxis()
162 | ax5.callbacks.connect('ylim_changed', _on_lims_changed)
163 | ax4.callbacks.connect('xlim_changed', _on_lims_changed)
164 | ax4.yaxis.tick_right()
165 | ax4.yaxis.set_label_position("right")
166 | ax3.xaxis.tick_top()
167 | ax3.xaxis.set_label_position("top")
168 | ax4.xaxis.tick_top()
169 | ax4.xaxis.set_label_position("top")
170 |
171 |
172 | if plot_lines:
173 | cmap = plt.get_cmap()
174 | colors = cmap((np.array(v) - 3.2) / 1)
175 | colors = cmap(np.sign(max1))
176 | # for xx1, yy1, dp1, xx2, yy2, dp2, cc in zip(x1, y1, dep1, x2, y2, dep2, colors):
177 | # ax3.plot((xx1, xx2), (yy1, yy2), color=cc)
178 | # ax4.plot((dp1, dp2), (yy1, yy2), color=cc)
179 | # ax5.plot((xx1, xx2), (dp1, dp2), color=cc)
180 | import matplotlib as mpl
181 | def _rs(a, b, c, d):
182 | return list(zip(list(zip(a, b)), list(zip(c, d))))
183 | ln_coll1 = mpl.collections.LineCollection(_rs(x1, y1, x2, y2), colors=colors)
184 | ln_coll2 = mpl.collections.LineCollection(_rs(dep1, y1, dep2, y2), colors=colors)
185 | ln_coll3 = mpl.collections.LineCollection(_rs(x1, dep1, x2, dep2), colors=colors)
186 | ax3.add_collection(ln_coll1)
187 | ax4.add_collection(ln_coll2)
188 | ax5.add_collection(ln_coll3)
189 | ax3.set_xlim(np.min(x), np.max(x))
190 | ax3.set_ylim(np.min(y), np.max(y))
191 | ax4.set_xlim(np.min(dep), np.max(dep))
192 | im = ax6.scatter(tmax1, dist, 10, np.sign(max1))
193 | plt.colorbar(im, ax=ax6)
194 | else:
195 | vmax = None
196 |         # color points by the polarity of the largest correlation maximum
197 | v = np.sign(max1)
198 | ax3.scatter(x, y, 10, v, vmax=vmax)
199 | ax4.scatter(dep, y, 10, v, vmax=vmax)
200 | sc = ax5.scatter(x, dep, 10, v, vmax=vmax)
201 | plt.colorbar(sc, cax=ax6)
202 | # cbar.ax.invert_yaxis()
203 | ax3.set_xlim(-2.5, 2.5)
204 | ax3.set_ylim(-2.3, 2.7)
205 | ax4.set_xlim(5.95, 10.95)
206 | ax4.set_xlabel('depth (km)')
207 | ax5.set_ylabel('depth (km)')
208 | if convert_coords:
209 | ax3.set_xlabel('EW (km)')
210 | ax3.set_ylabel('NS (km)')
211 | ax5.set_xlabel('EW (km)')
212 | ax4.set_ylabel('NS (km)')
213 | else:
214 | ax3.set_xlabel('Longitude')
215 | ax3.set_ylabel('Latitude')
216 | ax5.set_xlabel('Longitude')
217 | ax4.set_ylabel('Latitude')
218 | if out:
219 | plt.savefig(out, dpi=dpi)
220 | if show:
221 | plt.show()
222 |
223 |
224 | def plot_corr_vs_dist2times():
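   |     """Plot distance-binned correlation stacks of the coda window and the
   |     direct S wave window side by side."""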
225 | plt.rc('font', size=12)
226 | stream1 = read(NAME, 'H5')
227 | stream2 = read(NAMED, 'H5')
228 | # from xcorr import add_distbin
229 | # add_distbin(ccs_stack, 1, 21)
230 | stream1.stack('{distbin}').sort(['distbin']).normalize()
231 | stream2.stack('{distbin}').sort(['distbin']).normalize()
232 | for tr in stream1:
233 | tr.stats.dist = tr.stats.distbin
234 | for tr in stream2:
235 | tr.stats.dist = tr.stats.distbin
236 |
237 | fig = plt.figure(figsize=(12, 6))
238 | ax1 = fig.add_subplot(121)
239 | ax2 = fig.add_subplot(122, sharex=ax1, sharey=ax1)
240 |
241 | for j, stream in enumerate([stream1, stream2]):
242 | max_ = max(stream.max())
243 | max_dist = max(tr.stats.dist for tr in stream)
244 | ax = [ax1, ax2][j]
245 | for i, tr in enumerate(stream):
246 | starttime = tr.stats.starttime
247 | mid = starttime + (tr.stats.endtime - starttime) / 2
248 | t = tr.times(reftime=mid)
249 | scaled_data = tr.stats.dist + tr.data * max_dist / max_ / 50
250 | ax.plot(t, scaled_data, 'k', lw=1)
251 | velocity_line(3.6, ax, t, [max_dist], lw=2)
252 | ax.set_xlabel('lag time (s)')
253 | ax.set_xlim(-0.6, 0.6)
254 | ax.set_ylim(0, None)
255 | ax.legend(loc='lower right')
256 | label = ['a) coda window with normalization', 'b) direct S wave window'][j]
257 | ax.annotate(label, (0.03, 0.985), xycoords='axes fraction', va='top', size='medium')
258 | ax1.set_ylabel('event distance (m)')
259 | fig.savefig(f'{OUT2}corrs_vs_dist.pdf', bbox_inches='tight', pad_inches=0.1)
260 | plt.rcdefaults()
261 |
262 |
263 | def plot_corr(stream, annotate=True, expr='{station}.{channel} {evid1}-{evid2}',
264 | expr2='{evid1}-{evid2} {dist:.1f}m',
265 | figsize=None):
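   |     """Plot all correlation functions of one event pair together with
   |     their stack; vertical lines mark the expected direct inter-event
   |     travel time assuming a velocity of 4 km/s."""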
266 | fig = plt.figure(figsize=figsize)
267 | ax = fig.add_subplot(111)
268 | max_ = max(stream.max())
269 | stream = stream.copy()
270 | trim2(stream, -0.5, 0.5, 'mid')
271 | stream.traces.append(copy(stream).stack()[0])
272 | for i, tr in enumerate(stream):
273 | starttime = tr.stats.starttime
274 | mid = starttime + (tr.stats.endtime - starttime) / 2
275 | t = tr.times(reftime=mid)
276 | if i == len(stream) - 1:
277 | max_ = np.max(np.abs(tr.data))
278 | ax.plot(t, i + tr.data/max_*1.5, 'k' if i < len(stream)-1 else 'C1', lw=1 if i < len(stream)-1 else 2)
279 | if annotate:
280 | try:
281 | label = expr.format(**tr.stats)
282 | ax.annotate(label, (t[-1], i), (-5, 0),
283 | 'data', 'offset points',
284 | ha='right')
285 | except KeyError:
286 | pass
287 | dist = stream[0].stats.dist
288 | tt = dist / 4000
289 | ax.axvline(0, color='0.7')
290 | ax.axvline(-tt, color='C1', alpha=0.5)
291 | ax.axvline(tt, color='C1', alpha=0.5)
292 | if expr2 is not None:
293 | ax.annotate(expr2.format(**stream[0].stats), (t[0]+0.05, len(stream)), (5, 0),
294 | 'data', 'offset points', size='x-large')
295 |
296 |
297 | def make_crazy_plot(stuff, figsize=None):
298 |     dist, tmax1, tmax2, max1, max2, (snr1, snr2, snr3, snr4), evids, azi, inc, *_ = repack(stuff.values())
299 |
300 |     cond = np.logical_and(snr2 <= NSR, dist > 200)
301 |     tmax1, max1, max2, azi, inc, snr2 = filters(cond, tmax1, max1, max2, azi, inc, snr2)
302 |     fig = plt.figure(figsize=figsize)
303 |     ax1 = fig.add_subplot(511)
304 |     ax2 = fig.add_subplot(512, sharex=ax1, sharey=ax1)
305 |     ax3 = fig.add_subplot(513, sharex=ax1, sharey=ax1)
306 |     ax4 = fig.add_subplot(514, sharex=ax1, sharey=ax1)
307 |     ax5 = fig.add_subplot(515, sharex=ax1, sharey=ax1)
308 |     # ax3 = fig.add_subplot(133)
309 |     # ax1.plot(azis, vals, 'x')
310 |     # ax2.plot(incs, vals, 'x')
311 |     # vmax = max(abs(min(vals)), max(vals))
312 |     # vmax = 5
313 |     # s = 10*(snrs)**2
314 |     # ax1.scatter(azis[snrs>=SNR], np.mod(180-incs[snrs>=SNR], 180), s, c=-vals[snrs>SNR], vmin=-vmax, vmax=vmax, cmap='coolwarm')
315 |     # NOTE: original lines 315-318 were lost in extraction; the scatter
316 |     # defining ``sc`` for the colorbar below is reconstructed here and
317 |     # assumed to show the relation of the two maxima (cf. annotation below)
318 |     sc = ax1.scatter(azi[max2 != 0], inc[max2 != 0], c=np.abs(max1 / max2)[max2 != 0], marker='.', cmap='viridis_r')
319 |     cbar = plt.colorbar(sc, ax=ax1)
320 |     ankw = dict(xy=(0.05, 0.98), xycoords='axes fraction')
321 |     ax1.annotate('relation of maxima', **ankw)
322 |     sc = ax2.scatter(azi, inc, c=snr2, marker='.', cmap='viridis_r')
323 |     # sc = ax2.scatter(bazs, np.mod(180-incs, 180), c=snrs, marker='.', vmax=vmax, cmap='viridis_r')
324 |     cbar = plt.colorbar(sc, ax=ax2)
325 |     ax2.annotate('noise level of larger maxima', **ankw)
326 |     sc = ax3.scatter(azi[max2 != 0], inc[max2 != 0], c=np.abs(snr2*max1/max2)[max2 != 0], marker='.', cmap='viridis_r')
327 |     # sc = ax3.scatter(bazs[snrs>=SNR], np.mod(180-incs[snrs>=SNR], 180), c=snrs2[snrs>=SNR], marker='.', vmax=vmax, cmap='viridis_r')
328 |     cbar = plt.colorbar(sc, ax=ax3)
329 |     ax3.annotate('noise level of smaller maxima', **ankw)
330 |     sc = ax4.scatter(azi, inc, c=np.log10(np.abs(max1)), marker='.', cmap='viridis_r')
331 |     # sc = ax4.scatter(bazs[snrs>=SNR], np.mod(180-incs[snrs>=SNR], 180), marker='.', vmax=vmax, c=snrs1[snrs>=SNR], cmap='viridis_r')
332 |     cbar = plt.colorbar(sc, ax=ax4)
333 |     ax4.annotate('log10(abs(max1))', **ankw)
334 |
335 |     sc = ax5.scatter(azi, inc, c=np.sign(max1), marker='.', cmap='coolwarm')
336 |     # sc = ax4.scatter(bazs[snrs>=SNR], np.mod(180-incs[snrs>=SNR], 180), marker='.', vmax=vmax, c=snrs1[snrs>=SNR], cmap='viridis_r')
337 |     cbar = plt.colorbar(sc, ax=ax5)
338 |     ax5.annotate('sign(max1)', **ankw)
339 |     ax1.set_ylabel('inc (deg)')
340 |     ax2.set_ylabel('inc (deg)')
341 |     ax3.set_ylabel('inc (deg)')
342 |     ax4.set_ylabel('inc (deg)')
343 |     ax5.set_ylabel('inc (deg)')
344 |     # ax1.set_xlabel('azi (deg)')
345 |     # ax2.set_xlabel('azi (deg)')
346 |     # ax3.set_xlabel('azi (deg)')
347 |     # ax4.set_xlabel('azi (deg)')
348 |     ax5.set_xlabel('azi (deg)')
349 |     return True
350 |
351 |
352 | def _coords(stats, no):
353 |     return np.array([stats.get('elat' + no), stats.get('elon' + no), stats.get('edep' + no)])
354 |
355 |
356 | def plot_maxima_vs_dist(stuff, v=None, figsize=None):
357 |     plt.rc('font', size=12)
358 |     dist, tmax1, tmax2, max1, max2, (snr1, snr2, snr3, snr4), *_ = repack(stuff.values())
359 |     print('above noise:', np.count_nonzero(snr2<=NSR))
360 |     print('below noise:', np.count_nonzero(snr2>NSR))
361 |     fig = plt.figure(figsize=figsize)
362 |     ax = fig.add_subplot(111)
363 |     s = 9
364 |     # plt.scatter(*filters(np.logical_and(max1<0, snr2>0.25), tmax1, dist), s=s, edgecolors='gray', facecolors='none')
365 |     ax.scatter(*filters(np.logical_and(max1>0, snr2<=NSR), tmax1, dist), s=s, c='C0', label='positive polarity', zorder=5)  # #5f7ee8
366 |     ax.scatter(*filters(np.logical_and(max1<0, snr2<=NSR), tmax1, dist), s=s, c='#df634e', label='negative polarity', zorder=4)
367 |     ax.scatter(*filters(snr2>NSR, tmax1, dist), s=s, c='0.7', label='low SNR', zorder=3)
368 |     d, t1, t2, s2 = filters(np.logical_and(0<=snr2*max1/max2, snr2*max1/max2<=NSR), dist, tmax1, tmax2, snr2)
369 |     # print(len(d))
370 |     # plt.plot([t1, t2], [d, d], color='0.7', zorder=-10)
371 |     # plt.plot(0.5 * (t1 + t2), d, '|', color='0.7', zorder=-10)
372 |     # d, t1, t2, s2 = filters(np.logical_and(-0.25<=snr2*max1/max2, snr2*max1/max2<=0), dist, tmax1, tmax2, snr2)
373 |     # print(len(d))
374 |     # plt.plot([t1, t2], [d, d], color='C9', zorder=-10)
375 |     # plt.plot(0.5 * (t1 + t2), d, '|', color='C9', zorder=-10)
376 |     if v is not None:
377 |         velocity_line(v, ax, lw=2, zorder=6, color='C5')
378 |     ax.axhline(200, 0.2, 0.8, color='C5', ls='--')
379 |     ax.set_xlabel('lag time (s)')
380 |     ax.set_ylabel('event distance (m)')
381 |     ax.legend()
382 |     plt.rcdefaults()
383 |
384 |
385 | def print_evpairs(stuff, v=None):
386 |     dist, tmax1, tmax2, max1, max2, (snr1, snr2, snr3, snr4), evids, *_ = repack(stuff.values())
387 |     print('two peaks, same sign')
388 |     cond = np.logical_and(0<=snr2*max1/max2, snr2*max1/max2<=NSR)
389 |     print(list(filters(cond, evids)[0]))
390 |     print('two peaks, different sign')
391 |     cond = np.logical_and(-NSR<=snr2*max1/max2, snr2*max1/max2<=0)
392 |     print(list(filters(cond, evids)[0]))
393 |     print('peaks not fitting to velocity')
394 |     cond = np.logical_and(snr2 <= NSR, np.abs(dist / 1000 / v - np.abs(tmax1)) > 0.1)
395 |     print(list(filters(cond, evids)[0]))
396 |     cond2 = snr2 <= NSR
397 |     evids1 = list(filters(cond, evids)[0])
398 |     evids2 = list(filters(cond2, evids)[0])
399 |     count(evids1, evids2)
400 |
401 |
402 | def plot_maxima_vs_dist_event(stack, evid, v=None, figsize=None):
403 |     points = []
404 |     points2 = []
405 |     lines = []
406 |     for tr in stack:  # [:200]
407 |         starttime = tr.stats.starttime
408 |         mid = starttime + (tr.stats.endtime - starttime) / 2
409 |         dt = tr.stats.delta
410 |         d1 = tr.slice(None, mid).data[::-1]
411 |         d2 = tr.slice(mid, None).data
412 |         arg1 = np.argmax(d1)
413 |         arg2 = np.argmax(d2)
414 |         m1 = d1[arg1]
415 |         m2 = d2[arg2]
416 |         mr1 = np.median(d1[scipy.signal.argrelmax(d1)])
417 |         mr2 = np.median(d2[scipy.signal.argrelmax(d2)])
418 |         m3 = max(m1, m2)
419 |         c = 'r' if tr.stats.stack.group.startswith(evid) else 'b'
420 |         if m1 / m3 > 0.8 and m2/m3 > 0.8:
421 |             points.append((arg1 * dt, m1, m1 / mr1, tr.stats.dist, c))
422 |             points.append((-arg2 * dt, m2, m2 / mr2, tr.stats.dist, c))
423 |         elif m1 / m3 > 0.8:
424 |             points.append((arg1 * dt, m1, m1 / mr1, tr.stats.dist, c))
425 |         elif m2 / m3 > 0.8:
426 |             points.append((-arg2 * dt, m2, m2 / mr2, tr.stats.dist, c))
427 |     plt.figure(figsize=figsize)
428 |     t, m_, rel, dist, color = zip(*points)
429 |     size = [10 if r > 10 else 3 for r in rel]
430 |     plt.scatter(t, dist, s=size, c=color)
431 |     if v is not None:
432 |         velocity_line(v, plt.gca(), lw=2)
433 |     print(len(points), len(points2), len(lines))
434 |     return points  # was ``return stuff``, which is undefined in this function
435 |
436 |
437 | def analyze_stuff(stuff):
438 |     # outdated
439 |     from collections import Counter
440 |     strange_events = []
441 |     normal_events = []
442 |     for k, v in stuff.items():
443 |         dist, t1, m1, mr1, t2, m2, mr2 = v
444 |         t = t1 if m1 > m2 else t2
445 |         if not -0.1 < dist / 4000 - t < 0.1:
446 |             strange_events.extend(k.split('-'))
447 |         else:
448 |             normal_events.extend(k.split('-'))
449 |
450 |     return Counter(strange_events), Counter(normal_events)
451 |
452 |
453 | def count(evids1, evids2):
454 |     from collections import Counter
455 |     bla1 = Counter([x for evid in evids1 for x in evid.split('-')])
456 |     print(bla1)
457 |     bla2 = Counter([x for evid in evids2 for x in evid.split('-')])
458 |     x = {key: round(val / bla2[key], 3) for key, val in bla1.items()}
459 |     print(dict(sorted(x.items(), key=lambda k: k[1], reverse=True)))
460 |
461 | def plot_hists(stuff, figsize=None):
462 |     dist, tmax1, tmax2, max1, max2, (snr1, snr2, snr3, snr4), evids, azi, inc, *_, phase1 = repack(stuff.values())
463 |     cond = np.logical_and(snr2 <= NSR, dist > 200)
464 |     dist, tmax1, tmax2, max1, max2, snr1, snr2, snr3, snr4, azi, inc, phase1 = filters(cond, dist, tmax1, tmax2, max1, max2, snr1, snr2, snr3, snr4, azi, inc, phase1)
465 |     N = 5
466 |     M = 5
467 |     ms = 3
468 |     plt.figure(figsize=figsize)
469 |     plt.subplot(M, N, 1)
470 |     plt.hist(max1, 1001)
471 |     plt.xlabel('max1')
472 |     plt.subplot(M, N, 2)
473 |     plt.scatter(dist, max1, s=ms)
474 |     plt.subplot(M, N, 3)
475 |     plt.hist(max2[max2!=0], 1001)
476 |     plt.xlabel('max2')
477 |     plt.subplot(M, N, 4)
478 |     plt.scatter(dist[max2!=0], max2[max2!=0], s=ms)
479 |     plt.subplot(M, N, 5)
480 |     plt.hist(snr1, 101)
481 |     plt.xlabel('snr1')
482 |     plt.subplot(M, N, 6)
483 |     plt.scatter(dist, snr1, s=ms)
484 |     plt.subplot(M, N, 7)
485 |     plt.hist(snr2, 101)
486 |     plt.xlabel('snr2')
487 |     plt.subplot(M, N, 8)
488 |     plt.scatter(dist, snr2, s=ms)
489 |     plt.xlabel('versus dist')
490 |     plt.subplot(M, N, 9)
491 |     plt.hist(snr4, 101)
492 |     plt.xlabel('snr4')
493 |     plt.subplot(M, N, 10)
494 |     plt.scatter(dist, snr4, s=ms)
495 |
496 |     plt.subplot(M, N, 11)
497 |     plt.scatter(max1, snr4, s=ms)
498 |     plt.xlabel('max1')
499 |     plt.ylabel('snr4')
500 |     plt.subplot(M, N, 12)
501 |     plt.scatter(max1, max2, s=ms)
502 |     plt.xlabel('max1')
503 |     plt.ylabel('max2')
504 |
505 |     plt.subplot(M, N, 13)
506 |     plt.scatter(max1, snr1, s=ms)
507 |     plt.xlabel('max1')
508 |     plt.ylabel('snr1')
509 |     plt.subplot(M, N, 14)
510 |     plt.scatter(max1, snr2, s=ms)
511 |     plt.xlabel('max1')
512 |     plt.ylabel('snr2')
513 |
514 |     plt.subplot(M, N, 15)
515 |     plt.scatter(inc, max1, s=ms)
516 |     plt.xlabel('inc')
517 |     plt.ylabel('max1')
518 |
519 |     plt.subplot(M, N, 16)
520 |     plt.scatter(inc, snr1, s=ms)
521 |     plt.xlabel('inc')
522 |     plt.ylabel('snr1')
523 |
524 |     plt.subplot(M, N, 17)
525 |     plt.scatter(dist, max1, s=ms)
526 |     plt.xlabel('dist')
527 |     plt.ylabel('max1')
528 |
529 |     plt.subplot(M, N, 18)
530 |     plt.scatter(dist, snr1, s=ms)
531 |     plt.xlabel('dist')
532 |     plt.ylabel('snr1')
533 |
534 |     val = (np.abs(max1) - np.abs(max2)) / np.abs(max1) * np.sign(tmax1)
535 |     plt.subplot(M, N, 19)
536 |     histlist = [inc[val>0], inc[val<=0]]
537 |     plt.hist(histlist, 21, stacked=True, label=['right', 'left'])
538 |     plt.xlabel('inc')
539 |     plt.legend()
540 |
541 |
542 |     plt.subplot(M, N, 20)
543 |     histlist = [azi[max1>0], azi[max1<=0]]
544 |     plt.hist(histlist, 21, stacked=True, label=['+', '-'])
545 |     plt.xlabel('azi')
546 |     plt.ylabel('sign (+-)')
547 |     plt.legend()
548 |
549 |     plt.subplot(M, N, 21)
550 |     plt.scatter(snr2, phase1, s=ms)
551 |     plt.xlabel('snr2')
552 |     plt.ylabel('phase1')
553 |     plt.subplot(M, N, 22)
554 |     plt.scatter(max1, phase1, s=ms)
555 |     plt.xlabel('max1')
556 |     plt.ylabel('phase1')
557 |     plt.subplot(M, N, 23)
558 |     plt.hist(phase1, 1001)
559 |     plt.xlabel('phase1')
560 |
561 |     plt.rc('font', size=12)
562 |     fig = plt.figure()
563 |     ax = fig.add_subplot(111)
564 |     bins = np.linspace(0, 90, 19)
565 |     histlist = [inc[val>0], inc[val<=0]]
566 |     kw = dict(edgecolor='C0', linewidth=0.01)
567 |
568 |     _, _, ps = ax.hist(histlist, bins, stacked=True, label=['right', 'left'], **kw)
569 |     for p in ps[0]:
570 |         p.set_zorder(11)
571 |     for p in ps[1]:
572 |         p.set_zorder(10)
573 |         p.set_edgecolor('C1')
574 |
575 |     ax.set_xlim(0, 90)
576 |     ax.set_xlabel('inclination (°)')
577 |     ax.set_ylabel('count of event pairs')
578 |     ax.legend(title='side of maximum', frameon=False)
579 |     ax.spines['right'].set_visible(False)
580 |     ax.spines['top'].set_visible(False)
581 |     fig.savefig(f'{OUT2}hist.pdf', bbox_inches='tight', pad_inches=0.1)
582 |     plt.rcdefaults()
583 |
584 | def polar_plots(stuff):
585 |
586 |     # hist2 = physt.special.polar_histogram(inc[max1>0], np.deg2rad(azi[max1>0]), transformed=True, radial_bins=10, phi_bins=10)
587 |     # ax = hist2.plot.polar_map(show_values=True)
588 |     # print(len(azi[max1<=0]))
589 |     # hist2 = physt.special.polar_histogram(inc[max1<=0], np.deg2rad(azi[max1<=0]), transformed=True, radial_bins=10, phi_bins=10)
590 |     # ax = hist2.plot.polar_map(show_values=True)
591 |
592 |     # Create a polar histogram with different binning
593 |
594 |     dist, tmax1, tmax2, max1, max2, (snr1, snr2, snr3, snr4), evids, azi, inc, (coords1, coords2), v, (times1, times2), phase1 = repack(stuff.values())
595 |     coords1 = convert_coords2km(coords1, LATLON0)
596 |     coords2 = convert_coords2km(coords2, LATLON0)
597 |     x1, y1, z1 = [np.array(bla) for bla in zip(*coords1)]
598 |     x2, y2, z2 = [np.array(bla) for bla in zip(*coords2)]  # bug fix: was zip(*coords1)
599 |     x = 0.5 * (x1 + x2)
600 |     y = 0.5 * (y1 + y2)
601 |     z = 0.5 * (z1 + z2)
602 |
603 |     cond = np.logical_and(snr2 <= NSR, dist > 200)
604 |     dist, tmax1, tmax2, max1, max2, snr1, snr2, snr3, snr4, azi, inc, x, y, z = filters(cond, dist, tmax1, tmax2, max1, max2, snr1, snr2, snr3, snr4, azi, inc, x, y, z)
605 |
606 |
607 |     nr = 9
608 |     ntheta = 18
609 |     r_edges = np.linspace(0, 90, nr + 1)
610 |     theta_edges = np.linspace(0, 2*np.pi, ntheta + 1)
611 |     Theta, R = np.meshgrid(theta_edges, r_edges)
612 |
613 |     H1, _, _ = np.histogram2d(inc[max1>0], np.deg2rad(azi[max1>0]), [r_edges, theta_edges])
614 |     H2, _, _ = np.histogram2d(inc[max1<=0], np.deg2rad(azi[max1<=0]), [r_edges, theta_edges])
615 |     H1[0, :] = np.sum(H1[0, :])
616 |     H2[0, :] = np.sum(H2[0, :])
617 |     HH = (H1 - H2) / (H1 + H2)
618 |     HH[H1 + H2 < 6] = np.nan
619 |
620 |     bbfname = 'data/focal_mechanism_2018swarm.txt'
621 |     bbs = np.genfromtxt(bbfname, names=True)
622 |     sdrs = [sdr for _, _, _, _, _, *sdr in bbs]
623 |     sdr_mean = np.median(sdrs, axis=0)
624 |
625 |     def plot_bb(ax):
626 |         axb = fig.add_axes(ax.get_position().bounds, aspect='equal')
627 |         axb.axison = False
628 |         b = beach(sdr_mean, width=20, nofill=True, linewidth=1)
629 |         axb.add_collection(b)
630 |         axb.set_xlim(-10, 10)
631 |         axb.set_ylim(-10, 10)
632 |         return axb
633 |
634 |     fig = plt.figure(figsize=(10, 5))
635 |     kw = dict(edgecolors='face', linewidths=0.01)
636 |
637 |     ax2 = fig.add_subplot(132, polar=True)
638 |     cb_kw = dict(shrink=0.25, panchor=(0.95, 0.95), aspect=10, use_gridspec=False)
639 |     ax2.set_theta_zero_location('N')
640 |     ax2.set_theta_direction(-1)
641 |     vmax = np.nanmax(np.abs(HH))
642 |     cmap = 'coolwarm_r'
643 |     im = ax2.pcolormesh(Theta, R, HH, cmap=cmap, vmax=vmax, vmin=-vmax, **kw)
644 |     thetaline = np.deg2rad([320, 340, 360, 360, 340, 320, 320])
645 |     rline = [20, 20, 20, 50, 50, 50, 20]
646 |     ax2.plot(thetaline, rline, color='0.4')
647 |     ax2.set_yticks([15, 35, 55, 75])
648 |     ax2.set_yticklabels(['20°', '40°', '60°', '80°'])
649 |     plt.colorbar(im, ax=ax2, **cb_kw)
650 |     # ax2b = fig.add_axes(ax2.get_position().bounds)
651 |     # plot_farfield(sdr_mean, typ=None, ax=ax2b, plot_pt=True)
652 |     plot_bb(ax2)
653 |     plot_pnt(sdr_mean, ax2)
654 |
655 |     ax3 = fig.add_subplot(133, polar=True, sharex=ax2, sharey=ax2)
656 |     ax3.set_theta_zero_location('N')
657 |     ax3.set_theta_direction(-1)
658 |     im = ax3.pcolormesh(Theta, R, H1 + H2, cmap='magma_r', zorder=-1, **kw)
659 |     ax3.plot(thetaline, rline, color='0.4')
660 |     ax3.set_yticks([15, 35, 55, 75])
661 |     ax3.set_yticklabels(['20°', '40°', '60°', '80°'])
662 |     plt.colorbar(im, ax=ax3, **cb_kw)
663 |     # plot_farfield(sdr_mean, typ=None, ax=ax3, plot_pt=True)
664 |     plot_bb(ax3)
665 |     plot_pnt(sdr_mean, ax3)
666 |
667 |     b1 = ax2.get_position().bounds
668 |     b2 = ax3.get_position().bounds
669 |     # ax1 = fig.add_subplot(131, aspect='equal')
670 |     ax1 = fig.add_axes((2*b1[0]-b2[0],) + b1[1:], aspect='equal')
671 |     ax1b, ax1c = plot_farfield(sdr_mean, typ='S', points=nice_points(), ax=ax1, scale=8, plot_axis='PNT')
672 |     for sdr in sdrs:
673 |         b = beach(sdr, width=20, edgecolor='0.7', nofill=True, linewidth=0.5)
674 |         ax1c.add_collection(b)
675 |         # plot_pnt(sdr, ax1b, 'PNT', color='0.7', label='', zorder=-1)
676 |     plot_bb(ax1c)
677 |
678 |     annokw = dict(xy=(0.5, -0.2), xycoords='axes fraction', ha='center')
679 |     annokw2 = dict(xy=(1.02, -0.25), xycoords='axes fraction', ha='right')
680 |     annokw3 = dict(xy=(0, 1), xycoords='axes fraction')
681 |     annokw4 = dict(xy=(-0.3, 1), xycoords='axes fraction')
682 |     ax1.annotate('S wave radiation pattern\nfocal mechanisms', **annokw)
683 |     ax2.annotate('polarity\nof maxima', **annokw2)
684 |     ax3.annotate('number\nof pairs', **annokw2)
685 |     ax1.annotate('a)', **annokw3)
686 |     ax2.annotate('b)', **annokw4)
687 |     ax3.annotate('c)', **annokw4)
688 |
689 |     fig.savefig(f'{OUT2}focal.pdf', bbox_inches='tight', pad_inches=0.1)
690 |
691 |
692 |
693 |     H1, _, _ = np.histogram2d(inc[tmax1>0], np.deg2rad(azi[tmax1>0]), [r_edges, theta_edges])
694 |     H2, _, _ = np.histogram2d(inc[tmax1<=0], np.deg2rad(azi[tmax1<=0]), [r_edges, theta_edges])
695 |     H1[0, :] = np.sum(H1[0, :])
696 |     H2[0, :] = np.sum(H2[0, :])
697 |     HH = (H1 - H2) / (H1 + H2)
698 |     ind = H1 + H2 < 10
699 |     # HH[ind] = np.nan
700 |
701 |     from scipy.stats import binned_statistic_2d
702 |
703 |     plt.figure()
704 |     ax = plt.subplot(331, polar=True)
705 |     ax.set_theta_zero_location('N')
706 |     ax.set_theta_direction(-1)
707 |     vmax = np.nanmax(np.abs(HH))
708 |     im = plt.pcolormesh(Theta, R, HH, cmap='coolwarm', vmax=vmax, vmin=-vmax)
709 |     plt.colorbar(im, ax=ax)
710 |     ax.set_xlabel('direction')
711 |
712 |     ax = plt.subplot(332, polar=True, sharex=ax, sharey=ax)
713 |     ax.set_theta_zero_location('N')
714 |     ax.set_theta_direction(-1)
715 |     im = plt.pcolormesh(Theta, R, H1 + H2, cmap='magma_r')
716 |     plt.colorbar(im, ax=ax)
717 |     ax.set_xlabel('number')
718 |
719 |     ax = plt.subplot(333, polar=True, sharex=ax, sharey=ax)
720 |     ax.set_theta_zero_location('N')
721 |     ax.set_theta_direction(-1)
722 |     stats, *_ = binned_statistic_2d(inc, np.deg2rad(azi), snr2, statistic='median', bins=[r_edges, theta_edges], expand_binnumbers=False)
723 |     stats[ind] = np.nan
724 |     im = plt.pcolormesh(Theta, R, stats, cmap='magma_r')
725 |     plt.colorbar(im, ax=ax)
726 |     ax.set_xlabel('noise')
727 |
728 |     ax = plt.subplot(334, polar=True, sharex=ax, sharey=ax)
729 |     ax.set_theta_zero_location('N')
730 |     ax.set_theta_direction(-1)
731 |     stats, *_ = binned_statistic_2d(inc, np.deg2rad(azi), x, statistic='median', bins=[r_edges, theta_edges], expand_binnumbers=False)
732 |     stats[ind] = np.nan
733 |     im = plt.pcolormesh(Theta, R, stats, cmap='magma_r')
734 |     plt.colorbar(im, ax=ax)
735 |     ax.set_xlabel('x')
736 |
737 |     ax = plt.subplot(335, polar=True, sharex=ax, sharey=ax)
738 |     ax.set_theta_zero_location('N')
739 |     ax.set_theta_direction(-1)
740 |     stats, *_ = binned_statistic_2d(inc, np.deg2rad(azi), y, statistic='median', bins=[r_edges, theta_edges], expand_binnumbers=False)
741 |     stats[ind] = np.nan
742 |     im = plt.pcolormesh(Theta, R, stats, cmap='magma_r')
743 |     plt.colorbar(im, ax=ax)
744 |     ax.set_xlabel('y')
745 |
746 |     ax = plt.subplot(336, polar=True, sharex=ax, sharey=ax)
747 |     ax.set_theta_zero_location('N')
748 |     ax.set_theta_direction(-1)
749 |     stats, *_ = binned_statistic_2d(inc, np.deg2rad(azi), z, statistic='median', bins=[r_edges, theta_edges], expand_binnumbers=False)
750 |     stats[ind] = np.nan
751 |     im = plt.pcolormesh(Theta, R, stats, cmap='magma_r')
752 |     plt.colorbar(im, ax=ax)
753 |     ax.set_xlabel('z')
754 |
755 |     ax = plt.subplot(337, polar=True, sharex=ax, sharey=ax)
756 |     ax.set_theta_zero_location('N')
757 |     ax.set_theta_direction(-1)
758 |     stats, *_ = binned_statistic_2d(inc, np.deg2rad(azi), dist/1000/np.abs(tmax1), statistic='median', bins=[r_edges, theta_edges], expand_binnumbers=False)
759 |     stats[ind] = np.nan
760 |     im = plt.pcolormesh(Theta, R, stats, cmap='magma_r', vmax=5)
761 |     plt.colorbar(im, ax=ax)
762 |     ax.set_xlabel('velocity')
763 |     plt.savefig(f'{OUT}focal_plots.pdf', bbox_inches='tight', pad_inches=0.1)
764 |
765 |
766 | #NAME = 'tmp/ccs_stack_2018_mag>1.8_q2_246events_hp10Hz_1bit_dist<1km_l.h5'
767 | #NAME = 'tmp/ccs_stack_2018_mag>1.8_q3_hp10Hz_1bit_dist<1km_l.h5'
768 | #NAME = 'tmp/ccs_stack_2018_mag>1.8_q3_hp20Hz_1bit_dist<1km_l.h5'
769 | #NAME = 'tmp/ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_1bit_dist<1km_pw2.h5'
770 | #NAME = 'tmp/ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_Pcoda_envelope_dist<1km_pw2.h5'
771 | #NAME = 'tmp/ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_P_None_dist<1km_pw2.h5'
772 | #NAME = 'tmp/ccs_2018_mag>1.8_q3_10Hz-40Hz_P_None_dist<1km_pw2.h5'
773 | NAME = 'tmp/ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_Scoda_envelope_dist<1km_pw2.h5'
774 | NAMED = 'tmp/ccs_stack_2018_mag>1.8_q3_10Hz-40Hz_S_None_dist<1km_pw2.h5'
775 | name = NAME.split('ccs_stack_2018_')[1].split('.h5')[0]
776 | OUT = 'tmp/'
777 | OUT2 = 'figs/'
778 | PKL_FILE1 = 'tmp/stuff.pkl'
779 | PKL_FILE2 = 'tmp/stuff_onlymax.pkl'
780 | V = 3.6
781 | #V = 8
782 | DIST = 1
783 |
784 | NOISE_IND = 7
785 | NSR = 0.1
786 |
787 | if __name__ == '__main__':
788 |
789 |     plot_corr_vs_dist2times()
790 |     ccs_stack = read(NAME, 'H5')
791 |     print(f'len stream {len(ccs_stack)}')
792 |
793 |     add_distbin(ccs_stack, 1, 21)
794 |
795 |     ccs_stack_dist = copy(ccs_stack).stack('{distbin}').sort(['distbin'])
796 |     for tr in ccs_stack_dist:
797 |         tr.stats.dist = tr.stats.distbin
798 |
799 |     # ccs_stack_dist_norm = ccs_stack.copy().normalize().stack('{distbin}').sort(['distbin'])
800 |     # for tr in ccs_stack_dist_norm:
801 |     #     tr.stats.dist = tr.stats.distbin
802 |     #
803 |     # plot_corr_vs_dist_hist(ccs_stack_dist_norm, figsize=(10, 8), vmax=0.3, v=V, xlim=0.8*DIST)
804 |     # plt.ylabel('event distance (m)')
805 |     # plt.savefig(f'{OUT}corr_vs_dist_hist_norm_{name}.png', dpi=300, bbox_inches='tight', pad_inches=0.1)
806 |     # plot_corr_vs_dist(ccs_stack_dist.copy().normalize(), figsize=(10, 8), annotate=False, v=V, xlim=0.8*DIST)
807 |     # plt.ylabel('event distance (m)')
808 |     # plt.savefig(f'{OUT}corr_vs_dist_wigg_{name}.png', bbox_inches='tight', pad_inches=0.1)
809 |     # plot_corr_vs_dist(ccs_stack_dist_norm.copy().normalize(), figsize=(10, 8), annotate=False, v=V, xlim=0.8*DIST)
810 |     # plt.ylabel('event distance (m)')
811 |     # plt.savefig(f'{OUT}corr_vs_dist_wigg_norm_{name}.png', dpi=300)
812 |
813 |     # stuff = get_stuff(trim2(ccs_stack.copy(), -0.5, 0.5, 'mid'), trim2(ccs_stack.copy(), -0.5, 0.5, 'mid'))
814 |     # with open(PKL_FILE1, 'wb') as f:
815 |     #     pickle.dump(stuff, f, protocol=pickle.HIGHEST_PROTOCOL)
816 |     # stuff2 = get_stuff(trim2(ccs_stack.copy(), -0.5, 0.5, 'mid'), trim2(ccs_stack.copy(), -0.5, 0.5, 'mid'), only_maxima=True)
817 |     # with open(PKL_FILE2, 'wb') as f:
818 |     #     pickle.dump(stuff2, f, protocol=pickle.HIGHEST_PROTOCOL)
819 |
820 |     with open(PKL_FILE1, 'rb') as f:
821 |         stuff = pickle.load(f)
822 |     with open(PKL_FILE2, 'rb') as f:
823 |         stuff2 = pickle.load(f)
824 |     print(f'len stuff {len(stuff)}')
825 |
826 |     plot_maxima_vs_dist(stuff, figsize=(10, 8), v=V)
827 |     plt.savefig(f'{OUT}maxima_vs_dist2_{name}.png', dpi=300)
828 |     plt.savefig(f'{OUT2}maxima_vs_dist.pdf')
829 |
830 |     plot_maxima_vs_dist(stuff2, figsize=(10, 8), v=V)
831 |     plt.savefig(f'{OUT}maxima_vs_dist2_{name}_only_max.png', dpi=300)
832 |     #
833 |     polar_plots(stuff)
834 |     plot_hists(stuff, figsize=(19, 10.5))
835 |     plt.show()
836 |
837 |
838 |     # make_crazy_plot(stuff, figsize=(16, 8))
839 |     # plt.savefig(f'{OUT}crazy_plot_{name}.png', dpi=300)
840 |
841 |     # for key, stream in ccs_stack._groupby('{distbin}').items():
842 |     #     if make_crazy_plot(stream, figsize=(16, 8)):
843 |     #         plt.savefig(f'{OUT}crazy_plot2_{key}_{name}.png', dpi=300)
844 |
845 |     # plot_stuff_map_depth(stuff, plot_lines=True)
846 |
847 |     # c1, c2 = analyze_stuff(stuff)
848 |     # for evid in c1.keys():
849 |     #     ccs_stack_some = [tr for tr in ccs_stack if evid in tr.stats.stack.group]
850 |     #     plot_maxima_vs_dist_event(ccs_stack_some, evid, v=4000)
851 |     #     plt.savefig(f'{OUT}/zzz/{name}_maxima_vs_dist_{evid}.png', dpi=300)
852 |     #     plt.close()
853 |     #
854 |     # dists = [(int(float(tr.stats.stack.group)), tr.stats.stack.count) for tr in ccs_stack_dist]
855 |     # print(dists)
856 |     # for d in sorted(dists):
857 |     #     traces = [tr for tr in ccs_stack if tr.stats.distbin == d[0]]
858 |     #     traces = sorted(traces, key=lambda tr: tr.stats.dist)
859 |     #     ccssub = ccs_stack.__class__(traces)
860 |     #     plot_corr(ccssub, figsize=(20, 20))
861 |     #     plt.savefig(f'{OUT}/yyy/{name}_{d[0]:03d}.png')
862 |     #     plt.close()
--------------------------------------------------------------------------------
/util/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/trichter/inter_source_interferometry/7755d3ab15a94b6873c44da6fd1ee16bddfec8f9/util/__init__.py
--------------------------------------------------------------------------------
/util/events.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | def get_picks(picks, arrivals):
4 |     """Picks for specific station from arrivals"""
5 |     try:
6 |         ps = []
7 |         for arrival in arrivals:
8 |             phase = arrival.phase
9 |             p = arrival.pick_id.get_referred_object()
10 |             if p is None:
11 |                 print('DID NOT FIND PICK ID')
12 |                 raise ValueError
13 |             seedid = p.waveform_id.get_seed_string()
14 |             ps.append((seedid, phase, p.time))
15 |         return ps
16 |     except ValueError:
17 |         pass
18 |     ps = []
19 |     for p in picks:
20 |         seedid = p.waveform_id.get_seed_string()
21 |         phase = p.phase_hint
22 |         ps.append((seedid, phase, p.time))
23 |     return ps
24 |
25 |
26 | def event2dict(ev):
27 |     id_ = str(ev.resource_id).split('/')[-1]
28 |     ori = ev.preferred_origin() or ev.origins[0]
29 |     mag = ev.preferred_magnitude() or ev.magnitudes[0]
30 |     try:
31 |         mag = mag.mag
32 |     except AttributeError:
33 |         mag = None
34 |     try:
35 |         picks = get_picks(ev.picks, ori.arrivals)
36 |     except Exception as ex:
37 |         print(ex)
38 |         picks = None
39 |
40 |     return dict(
41 |         id=id_, time=ori.time,
42 |         lat=ori.latitude, lon=ori.longitude, dep=ori.depth / 1000,
43 |         mag=mag, picks=picks)
45 |
46 |
47 | def event2list(ev):
48 |     """id, time, lon, lat, dep_km, mag, picks"""
49 |     d = event2dict(ev)
50 |     return (d['id'], d['time'], d['lon'], d['lat'], d['dep'], d['mag'],
51 |             d['picks'])
52 |
53 |
54 | def events2dicts(events):
55 |     return [event2dict(ev) for ev in events]
56 |
57 |
58 | def events2lists(events):
59 |     """id, time, lon, lat, dep_km, mag, picks"""
60 |     return [event2list(ev) for ev in events]
61 |
62 |
63 | def load_webnet(year='*', plot=False, stations=False):
64 |     import glob
65 |     import pandas as pd
66 |
67 |     names = ['time', 'lon', 'lat', 'dep', 'mag']
68 |     kwargs = dict(sep=';', skipinitialspace=True, skiprows=3, parse_dates=[0],
69 |                   names=names)
70 |     path = f'data/webnet/{year}.txt'  # the year parameter was unused before
71 |     frames = [pd.read_csv(fname, **kwargs) for fname in glob.glob(path)]
72 |     if len(frames) == 0:
73 |         print('You can obtain the WEBNET catalog at ig.cas.cz. Put the txt files into a new webnet directory inside data.')
74 |         eqs = None
75 |     else:
76 |         eqs = pd.concat(frames, ignore_index=True)
77 |     if stations:
78 |         path = 'data/station_coordinates.txt'
79 |         sta = pd.read_csv(path, sep=r'\s+', usecols=(0, 1, 2))
80 |     if plot:
81 |         import matplotlib.pyplot as plt
82 |         fig = plt.figure()
83 |         ax1 = fig.add_subplot(211)
84 |         ax2 = fig.add_subplot(212)
85 |         ax1.scatter(eqs.lon.values, eqs.lat.values, 1, eqs.time.values)
86 |         ax2.scatter(eqs.time.values, eqs.mag.values, 1, eqs.time.values)
87 |         if stations:
88 |             ax1.scatter(sta.lon.values, sta.lat.values, 100, marker='v',
89 |                         color='none', edgecolors='k')
90 |             for idx, s in sta.iterrows():
91 |                 ax1.annotate(s.station, (s.lon, s.lat), (5, 5),
92 |                              textcoords='offset points')
93 |         plt.show()
94 |     if stations:
95 |         return eqs, sta
96 |     return eqs
97 |
--------------------------------------------------------------------------------
/util/imaging.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | import cartopy.crs as ccrs
4 | import cartopy.io.shapereader as shpreader
5 | import numpy as np
6 |
7 |
8 | # https://epsg.io/3035
9 | EA_EURO = ccrs.LambertAzimuthalEqualArea(
10 |     central_longitude=10, central_latitude=52,
11 |     false_easting=4321000, false_northing=3210000,
12 |     globe=ccrs.Globe(ellipse='GRS80'))
13 |
14 | GEO = ccrs.Geodetic()
15 |
16 | def add_scale(ax, length, loc, crs=GEO, lw=1,
17 |               cap=2, label=True, size=None, vpad=2):
18 |     bx, by = ax.projection.transform_point(loc[0], loc[1], crs)
19 |     bx1, bx2 = bx - 500 * length, bx + 500 * length
20 |     ax.plot((bx1, bx2), (by, by), color='k', linewidth=lw)
21 |     if cap:
22 |         kw = {'xycoords': 'data', 'textcoords': 'offset points', 'arrowprops':
23 |               {'arrowstyle': '-', 'connectionstyle': 'arc3',
24 |                'shrinkA': 0, 'shrinkB': 0, 'linewidth': lw}}
25 |         ax.annotate('', (bx1, by), (0, cap), **kw)
26 |         ax.annotate('', (bx1, by), (0, -cap), **kw)
27 |         ax.annotate('', (bx2, by), (0, cap), **kw)
28 |         ax.annotate('', (bx2, by), (0, -cap), **kw)
29 |     if label:
30 |         ax.annotate(str(length) + ' km', (bx, by), (0, vpad), size=size,
31 |                     textcoords='offset points', ha='center', va='bottom')
32 |
33 |
34 | def de_border(ax, edgecolor='k', facecolor='none', **kw):
35 |     fname = '/home/eule/data/geo/de/border/Germany_AL2-AL2.shp'
36 |     germany = shpreader.Reader(fname)
37 |     ax.add_geometries(germany.geometries(), GEO, facecolor=facecolor,
38 |                       edgecolor=edgecolor, **kw)
39 |
40 |
41 | def add_ticklabels(ax, lonticks, latticks, fmt='%.1f°'):
42 |     if hasattr(ax, 'projection'):
43 |         xticks, _, _ = zip(*ax.projection.transform_points(
44 |             GEO, np.array(lonticks), np.ones(len(lonticks)) * latticks[0]))
45 |         _, yticks, _ = zip(*ax.projection.transform_points(
46 |             GEO, np.ones(len(latticks)) * lonticks[0], np.array(latticks)))
47 |     else:
48 |         xticks = lonticks
49 |         yticks = latticks
50 |     ax.set_xticks(xticks)
51 |     ax.set_xticklabels(fmt % abs(l) + ('E' if l > 0 else 'W') for l in lonticks)
52 |     ax.set_yticks(yticks)
53 |     ax.set_yticklabels(fmt % abs(l) + ('N' if l > 0 else 'S') for l in latticks)
54 |
55 |
56 | def get_elevation_srtm(lon1, lat1, dx=1, dy=1, max_distance=10,
57 |                        shading=True, azimuth=315, altitude=45):
58 |     # https://github.com/SciTools/cartopy/issues/789
59 |     from http.cookiejar import CookieJar
60 |     import urllib.request
61 |     password_manager = urllib.request.HTTPPasswordMgrWithDefaultRealm()
62 |     password_manager.add_password(None, "https://urs.earthdata.nasa.gov", 'your_user_name', 'your_password')
63 |     cookie_jar = CookieJar()
64 |     opener = urllib.request.build_opener(
65 |         urllib.request.HTTPBasicAuthHandler(password_manager),
66 |         urllib.request.HTTPCookieProcessor(cookie_jar))
67 |     urllib.request.install_opener(opener)
68 |     # end patch
69 |     from cartopy.io.srtm import SRTM1Source, add_shading
70 |     elev, crs, extent = SRTM1Source().combined(lon1, lat1, dx, dy)
71 |     shades = None
72 |     if shading:
73 |         shades = add_shading(elev, azimuth, altitude)
74 |     return elev, shades, crs, extent
75 |
76 |
77 | def plot_elevation(ax, cmap='Greys', rasterized=False, **kw):
78 |     x1, x2, y1, y2 = ax.get_extent()
79 |     x, y = np.array([x1, x1, x2, x2]), np.array([y1, y2, y1, y2])
80 |     lons, lats, _ = zip(*GEO.transform_points(ax.projection, x, y))
81 |     lon, lat = int(np.min(lons)), int(np.min(lats))
82 |     dx = int(np.ceil(np.max(lons) - lon))
83 |     dy = int(np.ceil(np.max(lats) - lat))
84 |     try:
85 |         elev, shades, crs, extent = get_elevation_srtm(lon, lat, dx, dy, **kw)
86 |     except Exception:
87 |         import traceback
88 |         print(traceback.format_exc())
89 |         print('To download SRTM elevation you need to set up an account and add your credentials inside this module')
90 |     else:
91 |         if shades is not None:
92 |             elev = shades
93 |         ax.imshow(elev, extent=extent, transform=crs, cmap=cmap, origin='lower', rasterized=rasterized)
94 |
95 |
96 | def convert_coords2km(coords, latlon0=None):
97 |     import utm
98 |     x, y = zip(*[utm.from_latlon(lat1, lon1)[:2]
99 |                  for lat1, lon1, *_ in coords])
100 |     if latlon0 is None:
101 |         x0 = np.mean(x)
102 |         y0 = np.mean(y)
103 |     else:
104 |         x0, y0 = utm.from_latlon(*latlon0)[:2]
105 |     x = (np.array(x) - x0) / 1000
106 |     y = (np.array(y) - y0) / 1000
107 |     if len(coords[0]) == 3:
108 |         return list(zip(x, y, [c[2] for c in coords]))
109 |     else:
110 |         return list(zip(x, y))
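
# Usage sketch (approximate coordinates in the West Bohemia swarm region,
# chosen for illustration): convert (lat, lon, dep) tuples into km offsets
# relative to a reference point.
#
# >>> coords = [(50.25, 12.45, 8.0), (50.26, 12.44, 9.5)]
# >>> convert_coords2km(coords, latlon0=(50.25, 12.45))
# first tuple is (0.0, 0.0, 8.0); the second approximately (-0.71, 1.11, 9.5)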
--------------------------------------------------------------------------------
/util/misc.py:
--------------------------------------------------------------------------------
1 | # https://stackoverflow.com/a/35513958
2 | def collapse_json(text, indent=4):
3 |     """Compacts a string of json data by collapsing whitespace after the
4 |     specified indent level
5 |
6 |     NOTE: will not produce correct results when indent level is not a multiple
7 |     of the json indent level
8 |     """
9 |     initial = " " * indent
10 |     out = []  # final json output
11 |     sublevel = []  # accumulation list for sublevel entries
12 |     pending = None  # holder for consecutive entries at exact indent level
13 |     for line in text.splitlines():
14 |         if line.startswith(initial):
15 |             if line[indent] == " ":
16 |                 # found a line indented further than the indent level, so add
17 |                 # it to the sublevel list
18 |                 if pending:
19 |                     # the first item in the sublevel will be the pending item
20 |                     # that was the previous line in the json
21 |                     sublevel.append(pending)
22 |                     pending = None
23 |                 item = line.strip()
24 |                 sublevel.append(item)
25 |                 if item.endswith(","):
26 |                     sublevel.append(" ")
27 |             elif sublevel:
28 |                 # found a line at the exact indent level *and* we have sublevel
29 |                 # items. This means the sublevel items have come to an end
30 |                 sublevel.append(line.strip())
31 |                 out.append("".join(sublevel))
32 |                 sublevel = []
33 |             else:
34 |                 # found a line at the exact indent level but no items indented
35 |                 # further, so possibly start a new sub-level
36 |                 if pending:
37 |                     # if there is already a pending item, it means that
38 |                     # consecutive entries in the json had the exact same
39 |                     # indentation and that last pending item was not the start
40 |                     # of a new sublevel.
41 |                     out.append(pending)
42 |                 pending = line.rstrip()
43 |         else:
44 |             if pending:
45 |                 # it's possible that an item will be pending but not added to
46 |                 # the output yet, so make sure it's not forgotten.
47 |                 out.append(pending)
48 |                 pending = None
49 |             if sublevel:
50 |                 out.append("".join(sublevel))
51 |             out.append(line)
52 |     return "\n".join(out)
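
# Usage sketch (my own example data, not part of the repository): collapse the
# innermost indentation level of a JSON dump so that short lists stay on one
# line while the outer structure keeps its indentation.
if __name__ == '__main__':
    import json
    example = {'picks': [1.2, 3.4, 5.6], 'station': {'name': 'NKC'}}
    text = json.dumps(example, indent=4)
    # the list under 'picks' is printed as [1.2, 3.4, 5.6] on a single line
    print(collapse_json(text, indent=4))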
--------------------------------------------------------------------------------
/util/signal.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | import numpy as np
4 | import scipy.signal
5 |
6 |
7 | def _seconds2utc(self, seconds, reftime=None):  # same as in rf package
8 |     """Return UTCDateTime given as seconds relative to reftime"""
9 |     from collections.abc import Iterable  # collections.Iterable is removed in Python 3.10
10 |     from obspy import UTCDateTime as UTC
11 |     if isinstance(seconds, Iterable):
12 |         return [_seconds2utc(self, s, reftime=reftime) for s in seconds]
13 |     if isinstance(seconds, UTC) or reftime is None or seconds is None:
14 |         return seconds
15 |     if not isinstance(reftime, UTC):
16 |         reftime = self.stats[reftime]
17 |     return reftime + seconds
18 |
19 |
20 | def trim2(self, starttime=None, endtime=None, reftime=None, check_npts=True, **kwargs):
21 |     # same as in rf package + mid possibility
22 |     """
23 |     Alternative trim method accepting relative times.
24 |
25 |     See :meth:`~obspy.core.stream.Stream.trim`.
26 |
27 |     :param starttime,endtime: accept UTCDateTime or seconds relative to
28 |         reftime
29 |     :param reftime: reference time, can be an UTCDateTime object or a
30 |         string. The string will be looked up in the stats dictionary
31 |         (e.g. 'starttime', 'endtime', 'onset').
32 |     """
33 |     for tr in self.traces:
34 |         st = tr.stats
35 |         ref = (st.starttime + 0.5 * (st.endtime - st.starttime)
36 |                if reftime in ('mid', 'middle') else reftime)
37 |         t1 = _seconds2utc(tr, starttime, reftime=ref)
38 |         t2 = _seconds2utc(tr, endtime, reftime=ref)
39 |         tr.trim(t1, t2, **kwargs)
40 |     if check_npts:
41 |         npts = int(round(np.median([len(tr) for tr in self.traces])))
42 |         self.traces = [tr for tr in self.traces if len(tr) >= npts]
43 |         for tr in self.traces:
44 |             tr.data = tr.data[:npts]
45 |     return self
46 |
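# Usage sketch (obspy's bundled example stream; the +/-0.5 s window mirrors
# the trimming applied to the cross-correlation functions in xcorr.py):
#
# >>> from obspy import read
# >>> st = read()                      # example stream shipped with obspy
# >>> st = trim2(st, -0.5, 0.5, reftime='mid')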
47 |
48 | def smooth(x, window_len, window='flat', method='zeros'):
49 |     """Smooth the data using a window with requested size.
50 |
51 |     This method is based on the convolution of a scaled window with the signal.
52 |     The signal is prepared by introducing reflected copies of the signal
53 |     (with the window size) in both ends so that transient parts are minimized
54 |     in the beginning and end part of the output signal.
55 |
56 |     input:
57 |     :param x: the input signal
58 |     :param window_len: the dimension of the smoothing window; should be an
59 |         odd integer
60 |     :param window: the type of window from 'flat', 'hanning', 'hamming',
61 |         'bartlett', 'blackman'
62 |         flat window will produce a moving average smoothing.
63 |     :param method: handling of border effects 'zeros', 'reflect', None
64 |         'zeros': zero padding on both ends (len(smooth(x)) = len(x))
65 |         'reflect': pad reflected signal on both ends (same)
66 |         None: no handling of border effects
67 |         (len(smooth(x)) = len(x) - window_len + 1)
68 |
69 |     See also:
70 |     www.scipy.org/Cookbook/SignalSmooth
71 |     """
72 |
73 |     if x.ndim != 1:
74 |         raise ValueError("smooth only accepts 1 dimension arrays.")
75 |     if x.size < window_len:
76 |         raise ValueError("Input vector needs to be bigger than window size.")
77 |     if window_len < 2:
78 |         return x
79 |     if window not in ['flat', 'hanning', 'hamming', 'bartlett', 'blackman']:
80 |         raise ValueError("Window is one of 'flat', 'hanning', 'hamming', "
81 |                          "'bartlett', 'blackman'")
82 |     if method == 'zeros':
83 |         s = np.r_[np.zeros((window_len - 1) // 2), x,
84 |                   np.zeros(window_len // 2)]
85 |     elif method == 'reflect':
86 |         s = np.r_[x[(window_len - 1) // 2:0:-1], x,
87 |                   x[-1:-(window_len + 1) // 2:-1]]
88 |     else:
89 |         s = x
90 |     if window == 'flat':
91 |         w = np.ones(window_len, 'd')
92 |     else:
93 |         w = getattr(np, window)(window_len)
94 |     return np.convolve(w / w.sum(), s, mode='valid')
95 |
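# Usage sketch (assumed values): a 5-point moving average; with method='zeros'
# the smoothed array keeps the length of the input.
#
# >>> y = smooth(np.arange(10.0), 5, window='flat', method='zeros')
# >>> len(y)
# 10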
96 |
97 | def envelope(data):
98 | from scipy.signal import hilbert
99 | from scipy.fftpack import next_fast_len
100 | nfft = next_fast_len(len(data))
101 | anal_sig = hilbert(data, N=nfft)[:len(data)]
102 | return np.abs(anal_sig)
103 |
104 |
105 | def get_local_minimum(tr, smooth=None, ratio=5, smooth_window='flat', seconds_before_max=None):
106 |     """
107 |     tr: Trace
108 |     smooth: length of smoothing window in seconds (None for no smoothing)
109 |     ratio: ratio of local minima to maxima
110 |     smooth_window: window type passed on to the smooth function
111 |
112 |     """
113 |     data = tr.data
114 |     if smooth:
115 |         window_len = int(round(smooth * tr.stats.sampling_rate))
116 |         try:
117 |             data = globals()['smooth'](tr.data, window_len=window_len,  # the parameter smooth shadows the smooth function
118 |                                        method='zeros', window=smooth_window)  # assumed fix: 'zeros' keeps length and time alignment ('clip' is not a supported method)
119 |         except ValueError:
120 |             pass
121 |     mins = scipy.signal.argrelmin(data)[0]
122 |     maxs = scipy.signal.argrelmax(data)[0]
123 |     if len(mins) == 0 or len(maxs) == 0:
124 |         return
125 |     mins2 = [mins[0]]
126 |     for mi in mins[1:]:
127 |         if data[mi] < data[mins2[-1]]:
128 |             mins2.append(mi)
129 |     mins = np.array(mins2)
130 |     for ma in maxs:
131 |         try:
132 |             mi = np.nonzero(mins < ma)[0][-1]
133 |             mi = mins[mi]
134 |         except IndexError:
135 |             mi = 0
136 |         if data[ma] / data[mi] > ratio:
137 |             if seconds_before_max is not None:
138 |                 mi = max(mi, ma - seconds_before_max / tr.stats.delta)
139 |             return tr.stats.starttime + mi * tr.stats.delta
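
# Minimal sketch (synthetic envelope, my own construction): locate the local
# minimum between a decaying coda and a later event, the kind of time used to
# end a coda window. The decay constant and the Gaussian bump are assumptions.
if __name__ == '__main__':
    from obspy import Trace
    t = np.linspace(0, 20, 2001)
    env = np.exp(-0.3 * t) + 0.5 * np.exp(-0.5 * (t - 15) ** 2) + 0.001
    tr_demo = Trace(data=env, header={'sampling_rate': 100})
    # prints a UTCDateTime roughly 11-12 s after the trace start time
    print(get_local_minimum(tr_demo, ratio=5))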
--------------------------------------------------------------------------------
/util/source.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | import matplotlib.pyplot as plt
4 | import numpy as np
5 | from numpy import cos, sin, pi
6 | from obspy.imaging.scripts.mopad import strikediprake_2_moments, NED2USE, MomentTensor
7 | from obspy.core.event.source import farfield as obsfarfield
8 | from obspy.imaging.beachball import beach
9 |
10 |
11 | def _full_mt(mt):
12 |     return np.array([[mt[0], mt[3], mt[4]],
13 |                      [mt[3], mt[1], mt[5]],
14 |                      [mt[4], mt[5], mt[2]]])
15 |
16 |
17 | def _cart2sph_pnt(x, y, z):
18 |     ind = z < 0
19 |     z[ind] *= -1
20 |     x[ind] *= -1
21 |     y[ind] *= -1
22 |     hxy = np.hypot(x, y)
23 |     r = np.hypot(hxy, z)
24 |     el = pi/2 - np.arctan2(z, hxy)  # inclination measured between z and r
25 |     az = (pi / 2 - np.arctan2(x, y)) % (2 * pi)
26 |     assert np.all(el <= pi/2)
27 |     assert np.all(np.abs(r-1) < 0.001)
28 |     return r, el, az
29 |
30 |
31 | def pnt_axis(mt, out='xyz'):
32 |     # PNT axes are in the coordinate system in which mt is defined
33 |     # for out='rad', system must be NED (north, east, down)
34 |     # returns P, N and T axis (ascending eigenvalues; cf. usage in plot_pnt)
35 |     evals, evecs = np.linalg.eigh(_full_mt(mt))
36 |     assert evals[0] <= evals[1] <= evals[2]
37 |     if np.linalg.det(evecs) < 0:
38 |         evecs *= -1
39 |     if out == 'rad':
40 |         evecs = np.transpose(_cart2sph_pnt(*evecs)[1:])
41 |     return evecs
42 |
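# Usage sketch (strike/dip/rake values taken from test() at the bottom of this
# module): principal axes of a double-couple source as (inclination, azimuth)
# pairs in radians; row order P, N, T.
#
# >>> mt = strikediprake_2_moments(175, 75, -30)   # NED convention
# >>> pnt_axis(mt, out='rad')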
43 |
44 | def _polar_unitvecs(theta, phi):
45 |     unitvec = [[sin(theta) * cos(phi), sin(theta) * sin(phi), +cos(theta)],
46 |                [cos(theta) * cos(phi), cos(theta) * sin(phi), -sin(theta)],
47 |                [-sin(phi), cos(phi), 0 * phi]]
48 |     return np.array(unitvec)
49 |
50 |
51 | def farfield(mt, points, out='xyz', typ='P'):
52 |     # mt is strike dip rake or MT in NED convention (North East Down)
53 |     if len(mt) == 3:
54 |         mt = strikediprake_2_moments(*mt)
55 |     ff = obsfarfield(mt, np.array(points), typ)
56 |     if out == 'rad':
57 |         # assume points are given in polar projection
58 |         # project xyz to r, theta, phi unit vectors
59 |         ffn = np.einsum('ij,kij->kj', ff, _polar_unitvecs(*points))
60 |         return ffn
61 |     return ff  # bug fix: the default out='xyz' previously returned None
62 |
63 | def plot_pnt(mt, ax=None, what='PT', color='k', label='PNT', zorder=None):
64 |     # mt is strike dip rake or MT in NED convention (North East Down)
65 |     if len(mt) == 3:
66 |         mt = strikediprake_2_moments(*mt)
67 |     pnt = pnt_axis(mt, out='rad')
68 |     xy1 = pnt[0, 1], np.rad2deg(pnt[0, 0])
69 |     xy2 = pnt[2, 1], np.rad2deg(pnt[2, 0])
70 |     xy3 = pnt[1, 1], np.rad2deg(pnt[1, 0])
71 |     lkw = dict(fontsize='large', xytext=(3, 1), textcoords='offset points')
72 |     if ax is None:
73 |         ax = plt.subplot(111)
74 |     if 'P' in what:
75 |         ax.scatter(*xy1, color=color, marker='.', zorder=zorder)
76 |         if 'P' in label:
77 |             ax.annotate('P', xy1, **lkw)
78 |     if 'T' in what:
79 |         ax.scatter(*xy2, color=color, marker='.', zorder=zorder)
80 |         if 'T' in label:
81 |             ax.annotate('T', xy2, **lkw)
82 |     if 'N' in what:
83 |         ax.scatter(*xy3, color=color, marker='.', zorder=zorder)
84 |         if 'N' in label:
85 |             ax.annotate('N', xy3, **lkw)
86 |
87 |
88 | def plot_farfield(mt, typ='P', points=None, thetalim=(0, 90), ax=None,
89 |                   scale=None, plot_axis=None):
90 |     # mt is strike dip rake or MT in NED convention (North East Down)
91 |     if len(mt) == 3:
92 |         mt = strikediprake_2_moments(*mt)
93 |     mt_use = NED2USE(mt)
94 |     if typ in ('P', 'S'):
95 |         if points is None:
96 |             theta = np.linspace(0.1, pi/2, 6)
97 |             phi = np.linspace(0, 2*pi, 11, endpoint=False)[:, np.newaxis]
98 |             phi2, theta2 = np.meshgrid(phi, theta)
99 |             phi2 = np.ravel(phi2)
100 |             theta2 = np.ravel(theta2)
101 |         else:
102 |             phi2, theta2 = np.array(points)
103 |         ff = farfield(mt, [theta2, phi2], out='rad', typ=typ)
104 |     if ax is None:
105 |         fig = plt.figure()
106 |         ax2 = fig.add_subplot(111)
107 |     else:
108 |         fig = ax.figure
109 |         ax2 = ax
110 |     ax2.set_aspect(1)
111 |     ax2.axison = False
112 |     b = beach(mt_use, width=20, edgecolor='k', nofill=True, linewidth=1)
113 |     ax2.add_collection(b)
114 |     ax2.set_xlim(-10, 10)
115 |     ax2.set_ylim(-10, 10)
116 |
117 |     # ax1 = fig.add_subplot(111, polar=True)
118 |     ax1 = fig.add_axes(ax2.get_position().bounds, polar=True)
119 |     ax1.grid(False)
120 |     ax1.set_theta_zero_location('N')
121 |     ax1.set_theta_direction(-1)
122 |     if typ == 'P':
123 |         assert np.sum(np.abs(ff[1:, :])) < 1e-4
124 |         ff = ff[0, :]
125 |         ind1 = ff > 0
126 |         ind2 = ff < 0
127 |         ax1.scatter(phi2[ind1], np.rad2deg(theta2[ind1]), (20 * ff[ind1]) ** 2, marker='+')
128 |         ax1.scatter(phi2[ind2], np.rad2deg(theta2[ind2]), (20 * ff[ind2]) ** 2, marker='o')
129 |     elif typ == 'S':
130 |         assert np.sum(np.abs(ff[0, :])) < 1e-4
131 |         dr, dphi = ff[1, :], ff[2, :]
132 |         # convert to xy
133 |         dx = dr * cos(phi2) - dphi * sin(phi2)
134 |         dy = dr * sin(phi2) + dphi * cos(phi2)
135 |         # dx, dy -> rotate by 90°: -dy, dx -> phi in the opposite sense: dy, dx
136 |         # the additional minus sign is a bit strange
137 |         ax1.quiver(phi2, np.rad2deg(theta2), -dy, -dx, pivot='mid', scale=scale)
138 |
139 |     ax1.set_ylim(thetalim)
140 |     ax1.set_xticks([])
141 |     ax1.set_yticks([])
142 |     ax1.axison = False
143 |     if plot_axis:
144 |         plot_pnt(mt, ax1, plot_axis)
145 |     return ax1, ax2
146 |
147 |
148 | # def onclick(event):
149 | #     xy = np.rad2deg(event.xdata) % 360, event.ydata
150 | #     print('({:6.2f}, {:5.2f}),'.format(*xy))
151 | # fig.canvas.mpl_connect('button_press_event', onclick)
152 |
153 |
154 | def nice_points():
155 |     points = [
156 |         (281.86, 81.98),  # left-right, nodal plane
157 |         (294.29, 61.45),
158 |         (311.35, 43.50),
159 |         ( 0.00, 28.93),
160 |         ( 36.37, 33.55),
161 |         ( 62.79, 50.09),
162 |         # ( 78.09, 71.16),
163 |         ( 76, 68),
164 |         # ( 87.88, 86.99),
165 |         (84, 82),
166 |
167 |         (352.32, 85.74),  # top-bottom, nodal plane
168 |         (347.23, 68.49),
169 |         (341.10, 49.62),
170 |         # (316.53, 22.32),
171 |         # (230.05, 17.21),
172 |         # (191.14, 44.76),
173 |         # (181.52, 73.36),
174 |         # (177.57, 86.81),
175 |
176 |         ( 0.00, 46.95),  # left-right, -1
177 |         ( 27.21, 50.05),
178 |         ( 54.57, 63.35),
179 |         ( 68.91, 80.39),
180 |         (326.02, 58.38),
181 |         (305.98, 71.62),
182 |         (295.68, 87.00),
183 |
184 |         (323.48, 82.66),  # left-right, -2
185 |         (336.13, 75.83),
186 |         (  6.23, 67.31),
187 |         ( 27.19, 71.40),
188 |         ( 47.72, 84.26),
189 |
190 |         (  9.33, 84.10),  # left-right, -3
191 |
192 |         (264.14, 74.41),  # left-right, 1
193 |         (268.57, 50.67),
194 |         (277.72, 30.47),
195 |         (290.81, 15.63),
196 |         ( 58.34,  6.87),
197 |         ( 96.08, 30.37),
198 |         (103.75, 60.67),
199 |         # (108.72, 85.88),
200 |         (110, 81),
201 |
202 |         (244.08, 73.10),  # left-right, 2
203 |         (245.39, 49.82),
204 |         (233, 36),
205 |         # (231.62, 31.06),
206 |         (211.40, 24.31),
207 |         (161.01, 22.46),
208 |         (131.78, 39.18),
209 |         (126.90, 66.99),
210 |         (129.14, 82.25),
211 |
212 |         (226.38, 77.37),  # left-right, 3
213 |         (214.62, 56.58),
214 |         (190.12, 44.33),
215 |         (162.79, 47.72),
216 |         (147.40, 65.09),
217 |         (145.15, 85.22),
218 |
219 |         (206.63, 77.14),  # left-right, 4
220 |         (183.77, 66.67),
221 |         (162.60, 73.29),
222 |
223 |         (178.66, 83.59),  # very bottom
224 |     ]
225 |
226 |     return np.transpose(np.deg2rad(points))
227 |
228 |
229 | def test():
230 |     sdr = [175, 75, -30]
231 |     plot_farfield(sdr, points=nice_points(), plot_axis='PT')  # plot_farfield has no plot_pt argument
232 |     plot_farfield(sdr, typ='S', points=nice_points(), plot_axis='PT')
233 |
234 |     mt = strikediprake_2_moments(*sdr)  # NED
235 |     obj = MomentTensor(mt)
236 |     obj._M_to_principal_axis_system()
237 |     print(obj.get_fps())
238 |     print(obj.get_p_axis(style='f'))
239 |     print(obj.get_t_axis(style='f'))
240 |     print(obj.get_null_axis(style='f'))
241 |     print(pnt_axis(mt, out='xyz'))
242 |     print(pnt_axis(mt, out='rad'))
243 |     plt.show()
244 |
245 |
246 | #test()
--------------------------------------------------------------------------------
/util/xcorr2.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | import matplotlib.pyplot as plt
4 | import numpy as np
5 | from obspy import Stream, Trace
6 | from obspy.geodetics import gps2dist_azimuth
7 | from obspy.signal.cross_correlation import correlate
8 | import utm
9 |
10 |
11 | def correlate_phase(data1, data2, shift, demean=True, normalize=True):
12 |     from scipy.signal import hilbert
13 |     from scipy.fftpack import next_fast_len
14 |     assert len(data1) == len(data2)
15 |     nfft = next_fast_len(len(data1))
16 |     if demean:
17 |         data1 = data1 - np.mean(data1)
18 |         data2 = data2 - np.mean(data2)
19 |     sig1 = hilbert(data1, N=nfft)[:len(data1)]
20 |     sig2 = hilbert(data2, N=nfft)[:len(data2)]
21 |     phi1 = np.angle(sig1)
22 |     phi2 = np.angle(sig2)
23 |     def phase_stack(phi1, phi2, shift):
24 |         s1 = max(0, shift)
25 |         s2 = min(0, shift)
26 |         N = len(phi1)
27 |         assert len(phi2) == N
28 |         return np.sum(np.abs(np.cos((phi1[s1:N+s2] - phi2[-s2:N-s1]) / 2)) -
29 |                       np.abs(np.sin((phi1[s1:N+s2] - phi2[-s2:N-s1]) / 2)))
30 |
31 |     cc = [phase_stack(phi1, phi2, s) for s in range(-shift, shift + 1)]
32 |     cc = np.array(cc)
33 |     if normalize:
34 |         cc = cc / len(data1)
35 |     return cc
36 |
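# Minimal sketch (synthetic data, my own construction): this stack of
# instantaneous phase differences resembles the phase cross-correlation of
# Schimmel (1999); for a signal correlated with itself it equals 1 at zero lag.
#
# >>> rng = np.random.default_rng(42)
# >>> sig = rng.standard_normal(1000)
# >>> cc = correlate_phase(sig, sig, shift=50)   # shift is given in samples here
# >>> np.argmax(cc) - 50, round(cc.max(), 6)
# (0, 1.0)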
37 |
38 | def coord2m(lat, lon, dep):
39 |     x, y = utm.from_latlon(lat, lon)[:2]
40 |     return x, y, dep
41 |
42 |
43 | def calc_angle(a, b, c):
44 |     a = np.array(coord2m(*a))
45 |     b = np.array(coord2m(*b))
46 |     c = np.array(coord2m(*c))
47 |     ba = a - b
48 |     bc = c - b
49 |     cosine_angle = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc))
50 |     angle = np.arccos(cosine_angle)
51 |     return np.degrees(angle)
52 |
53 |
54 | def correlate_traces(tr1, tr2, shift, abuse_seedid=False,
55 |                      use_headers=set(), use_seedid_headers=True,
56 |                      calc_header=None, phase=False,
57 |                      **kwargs):
58 |     """
59 |     Return trace of cross-correlation of two input traces
60 |
61 |     :param tr1,tr2: two |Trace| objects
62 |     :param shift: maximal shift in correlation in seconds
63 |     """
64 |     if use_seedid_headers:
65 |         seedid_headers = {'network', 'station', 'location', 'channel'}
66 |         use_headers = set(use_headers) | seedid_headers
67 |     sr = tr1.stats.sampling_rate
68 |     assert sr == tr2.stats.sampling_rate
69 |     header = {k: v for k, v in tr1.stats.items() if tr2.stats.get(k) == v and k != 'npts'}
70 |     for k in use_headers:
71 |         if k in tr1.stats:
72 |             header[k + '1'] = tr1.stats[k]
73 |         if k in tr2.stats:
74 |             header[k + '2'] = tr2.stats[k]
75 |     if 'channel' not in header:
76 |         c1 = tr1.stats.channel
77 |         c2 = tr2.stats.channel
78 |         if c1 != '' and c2 != '' and c1[:-1] == c2[:-1]:
79 |             # e.g. EHZ + EHN -> 'EHZN' (a dead assignment of c1[:-1] + '?' was removed here)
80 |             header['channel'] = c1[:-1] + c1[-1] + c2[-1]
81 |     st1 = tr1.stats.starttime
82 |     st2 = tr2.stats.starttime
83 |     len1 = tr1.stats.endtime - st1
84 |     len2 = tr2.stats.endtime - st2
85 |     if abs((st1 + len1 / 2) - (st2 + len2 / 2)) < 0.1:  # abs() added, midpoints should agree in both directions
86 |         header['starttime'] = st1 + len1 / 2
87 |     corrf = correlate_phase if phase else correlate
88 |     xdata = corrf(tr1.data, tr2.data, int(round(shift * sr)), **kwargs)
89 |     tr = Trace(data=xdata, header=header)
90 |     if abuse_seedid:
91 |         n1, s1, l1, c1 = tr1.id.split('.')
92 |         n2, s2, l2, c2 = tr2.id.split('.')
93 |         tr.id = '.'.join((s1, c1, s2, c2))  # join takes a single iterable
94 |     if calc_header == 'event' and 'elon' in tr1.stats:
95 |         s1 = tr1.stats
96 |         s2 = tr2.stats
97 |         args = (s1.elat, s1.elon, s2.elat, s2.elon)
98 |         dist, azi, baz = gps2dist_azimuth(*args)
99 |         dpdif = (s2.edep - s1.edep) * 1000
100 |         tr.stats.dist = (dist ** 2 + dpdif**2) ** 0.5  # dist in meter
101 |         tr.stats.azi = azi
102 |         tr.stats.baz = baz
103 |         tr.stats.inc = np.rad2deg(np.arctan2(dist, dpdif))
104 |         # only valid if event 1 is above event 2
105 |         assert tr.stats.inc <= 90.
106 |         if 'slat' in s1:
107 |             a = s1.elat, s1.elon, s1.edep * 1000
108 |             b = s2.elat, s2.elon, s2.edep * 1000
109 |             c = s1.slat, s1.slon, 0
110 |             tr.stats.angle12s = calc_angle(a, b, c)
111 |             tr.stats.angle21s = calc_angle(b, a, c)
112 |     elif calc_header == 'station' and 'slon' in tr1.stats:
113 |         args = (tr1.stats.slat, tr1.stats.slon, tr2.stats.slat, tr2.stats.slon)
114 |         dist, azi, baz = gps2dist_azimuth(*args)
115 |         tr.stats.dist = dist / 1000  # dist in km
116 |         tr.stats.azi = azi
117 |         tr.stats.baz = baz
118 |     return tr
119 |
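# Usage sketch (obspy's bundled example stream; parameters assumed): correlate
# two traces of the same event with a maximal lag of 1 s.
#
# >>> from obspy import read
# >>> st = read()                      # example stream shipped with obspy
# >>> cc = correlate_traces(st[0], st[1], shift=1.0)
# >>> cc.stats.channel                 # merged channel code, e.g. 'EHZN'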
120 |
121 | def correlate_streams(stream1, stream2, shift, **kwargs):
122 |     traces = [correlate_traces(tr1, tr2, shift, **kwargs)
123 |               for tr1, tr2 in zip(stream1, stream2)]
124 |     return Stream(traces)
125 |
126 |
127 | def keypress(event, fig, ax, alines):
128 |     if event.inaxes != ax:
129 |         return
130 |     print('You pressed ' + event.key)
131 |     if event.key == 'k':
132 |         print(__doc__)
133 |     elif event.key in 'gb':
134 |         factor = 1.2 if event.key == 'g' else 1/1.2
135 |         for lines, dist in alines:
136 |             lines.set_ydata((lines.get_ydata() - dist) * factor + dist)
137 |         fig.canvas.draw()
138 |
139 |
140 | def velocity_line(v, ax, times=None, dists=None, lw=1, zorder=None, color='C1', alpha=0.8):
141 |     if times is None:
142 |         times = ax.get_xlim()
143 |     if dists is None:
144 |         dists = ax.get_ylim()
145 |     times = np.array([max(min(times), -max(dists) / v / 1000), 0, min(max(times), max(dists) / v / 1000)])
146 |     ax.plot(times, np.abs(v * 1000 * times), color=color, lw=lw, alpha=alpha, label='%s km/s' % v, zorder=zorder)
147 |
148 |
149 | def plot_corr_vs_dist(stream, figsize=None, v=3.5, annotate=True, expr='{evid1}-{evid2}', xlim=None):
150 |     fig = plt.figure(figsize=figsize)
151 |     ax = fig.add_subplot(111)
152 |     max_ = max(stream.max())
153 |     max_dist = max(tr.stats.dist for tr in stream)
154 |     all_lines = []
155 |     for i, tr in enumerate(stream):
156 |         starttime = tr.stats.starttime
157 |         mid = starttime + (tr.stats.endtime - starttime) / 2
158 |         t = tr.times(reftime=mid)
159 |         scaled_data = tr.stats.dist + tr.data * max_dist / max_ / 50
160 |         (lines,) = ax.plot(t, scaled_data, 'k', lw=1)
161 |         all_lines.append((lines, tr.stats.dist))
162 |         if annotate:
163 |             label = expr.format(**tr.stats)
164 |             ax.annotate(label, (t[-1], tr.stats.dist), (-5, 0),
165 |                         'data', 'offset points', ha='right')
166 |     velocity_line(v, ax, t, [max_dist], lw=2)
167 |     ax.legend()
168 |     ax.set_ylabel('distance (m)')
169 |     ax.set_xlabel('lag time (s)')
170 |     k = lambda x: keypress(x, fig, ax, all_lines)
171 |     fig.canvas.mpl_disconnect(fig.canvas.manager.key_press_handler_id)
172 |     fig.canvas.mpl_connect('key_press_event', k)
173 |     if xlim:
174 |         plt.xlim(-xlim, xlim)
175 |     plt.ylim(0, None)
176 |
177 |
178 | def _align_values_for_pcolormesh(x):
179 |     x = list(x)
180 |     dx = np.median(np.diff(x))
181 |     x.append(x[-1] + dx)
182 |     x = [xx - 0.5 * dx for xx in x]
183 |     return x
184 |
185 |
186 | def plot_corr_vs_dist_hist(stack, vmax=None, cmap='RdBu_r', figsize=None, v=3.5, xlim=None):
187 |     lag_times = stack[0].times()
188 |     lag_times -= lag_times[-1] / 2
189 |     lag_times = _align_values_for_pcolormesh(lag_times)
190 |     data = np.array([tr.data for tr in stack])
191 |     dists = [tr.stats.distbin for tr in stack]
192 |     if vmax is None:
193 |         vmax = 0.8 * np.max(np.abs(data))
194 |     fig = plt.figure(figsize=figsize)
195 |     ax = fig.add_axes([0.15, 0.1, 0.75, 0.75])
196 |     velocity_line(v, ax, lag_times, [max(dists)])
197 |     ax.legend()
198 |     cax = fig.add_axes([0.91, 0.375, 0.008, 0.25])
199 |     mesh = ax.pcolormesh(lag_times, dists, data, cmap=cmap,
200 |                          vmin=-vmax, vmax=vmax)
201 |     fig.colorbar(mesh, cax)
202 |     ax.set_ylabel('distance (m)')
203 |     ax.set_xlabel('time (s)')
204 |     plt.sca(ax)
205 |     if xlim:
206 |         plt.xlim(-xlim, xlim)
207 |     plt.ylim(0, None)
208 |
209 |
210 | # https://stackoverflow.com/a/31364297
211 | def set_axes_equal(ax):
212 |     '''Make axes of 3D plot have equal scale so that spheres appear as spheres,
213 |     cubes as cubes, etc.. This is one possible solution to Matplotlib's
214 |     ax.set_aspect('equal') and ax.axis('equal') not working for 3D.
215 |
216 |     Input
217 |       ax: a matplotlib axis, e.g., as output from plt.gca().
218 |     '''
219 |
220 |     x_limits = ax.get_xlim3d()
221 |     y_limits = ax.get_ylim3d()
222 |     z_limits = ax.get_zlim3d()
223 |
224 |     x_range = abs(x_limits[1] - x_limits[0])
225 |     x_middle = np.mean(x_limits)
226 |     y_range = abs(y_limits[1] - y_limits[0])
227 |     y_middle = np.mean(y_limits)
228 |     z_range = abs(z_limits[1] - z_limits[0])
229 |     z_middle = np.mean(z_limits)
230 |
231 |     # The plot bounding box is a sphere in the sense of the infinity
232 |     # norm, hence I call half the max range the plot radius.
233 |     plot_radius = 0.5*max([x_range, y_range, z_range])
234 |
235 |     ax.set_xlim3d([x_middle - plot_radius, x_middle + plot_radius])
236 |     ax.set_ylim3d([y_middle - plot_radius, y_middle + plot_radius])
237 |     ax.set_zlim3d([z_middle - plot_radius, z_middle + plot_radius])
--------------------------------------------------------------------------------
/vpvs.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | import numpy as np
4 | from itertools import combinations
5 | import matplotlib.pyplot as plt
6 | import scipy.optimize  # plain ``import scipy`` does not load the optimize submodule
7 |
8 | from load_data import get_events
9 | from plot_maps import get_bounds, get_colors
10 | from util.events import events2lists
11 |
12 |
13 | def odr_brute(x, y, norm=None, **kw):
14 |     x = np.array(x)
15 |     y = np.array(y)
16 |     def error(g):
17 |         dist = np.abs(g[0] * x - y) / (g[0] ** 2 + 1) ** 0.5
18 |         if norm == 'mad':
19 |             return np.median(np.abs(dist))
20 |         elif norm == 'lms':
21 |             return (np.median(dist ** 2)) ** 0.5
22 |         return np.linalg.norm(dist, norm) / len(x)
23 |     return scipy.optimize.brute(error, **kw)
24 |
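# Minimal sketch (synthetic data, my own construction): odr_brute minimizes the
# mean orthogonal distance |g*x - y| / sqrt(g**2 + 1) of the points to a line
# y = g*x through the origin, over a brute-force grid of slopes g.
#
# >>> rng = np.random.default_rng(0)
# >>> x_demo = rng.uniform(-1, 1, 200)
# >>> y_demo = 1.7 * x_demo + 0.01 * rng.standard_normal(200)
# >>> odr_brute(x_demo, y_demo, ranges=[(1.5, 1.9)], Ns=81, norm=1)  # -> approx. array([1.7])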
25 |
26 | def standard_error_of_estimator(y1, y2, x):
27 |     return (np.sum((y1-y2)**2) / np.sum((x-np.mean(x))**2) / len(y1)) ** 0.5
28 |
29 |
30 | def PSpicks(picks_, ot):
31 |     picks = picks_
32 |     stations = [p[0].split('.')[1] for p in picks]
33 |     stations = sorted({sta for sta in stations if stations.count(sta) == 2})
34 |     picks2 = {'P': [], 'S': []}
35 |     for seed, phase, time in sorted(picks):
36 |         sta = seed.split('.')[1]
37 |         if sta in stations:
38 |             picks2[phase[0]].append(time - ot)
39 |     if not len(stations) == len(picks2['P']) == len(picks2['S']):
40 |         print(len(stations), len(picks2['P']), len(picks2['S']))
41 |         raise RuntimeError()
42 |     return stations, np.array(picks2['P']), np.array(picks2['S'])
43 |
44 |
45 | def cond1(x, y):
46 | return np.abs(y - x * 1.7) < 0.1
47 |
48 | def cond2(x, y):
49 | dist = (x ** 2 + (y/vpvs0) ** 2) ** 0.5
50 | return dist < 0.35
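   |
   | # Added note: cond1 keeps differential pick pairs within 0.1 s of the
   | # line y = 1.7 x (close to a typical vp/vs ratio), cond2 keeps pairs
   | # within a distance of 0.35 s from the origin after rescaling y by the
   | # reference ratio vpvs0 set in the loop below.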
51 |
52 | def double_diff(picks, apply_cond1=True, apply_cond2=True):
53 |     """
54 |     Double-difference vp/vs estimation after Lin and Shearer (2007)
55 |
56 |     picks: list of (stations, P picks, S picks) tuples, times in seconds
57 |     """
58 | x = []
59 | y = []
60 | for picks1, picks2 in list(combinations(picks, 2)):
61 | sta1, p1, s1 = picks1
62 | sta2, p2, s2 = picks2
63 | sta = set(sta1) & set(sta2)
64 | index1 = [i for i, s in enumerate(sta1) if s in sta]
65 | index2 = [i for i, s in enumerate(sta2) if s in sta]
66 | p1 = p1[index1]
67 | p2 = p2[index2]
68 | s1 = s1[index1]
69 | s2 = s2[index2]
70 | dp = p2 - p1
71 | ds = s2 - s1
72 | if apply_cond1:
73 | x1 = dp - np.median(dp)
74 | y1 = ds - np.median(ds)
75 | good = cond1(x1, y1)
76 | dp = dp[good]
77 | ds = ds[good]
78 | if len(dp) == 0:
79 | continue
80 | if apply_cond2:
81 | x1 = dp - np.median(dp)
82 | y1 = ds - np.median(ds)
83 | good = cond2(x1, y1)
84 | dp = dp[good]
85 | ds = ds[good]
86 | if len(dp) == 0:
87 | continue
88 | x1 = dp - np.mean(dp)
89 | y1 = ds - np.mean(ds)
90 | x.extend(list(x1))
91 | y.extend(list(y1))
92 | x = np.array(x)
93 | y = np.array(y)
94 | return x, y
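   |
   | # Added note: for each event pair the differential P and S travel times
   | # over common stations are demeaned, so the slope of y (dS) versus x (dP)
   | # estimates vp/vs. Hypothetical usage:
   | #     x, y = double_diff(picks)
   | #     vpvs, grid, err = odr_fit(x, y)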
95 |
96 | def odr_fit(x, y, scale=1.7, lim=(1.5, 1.9), Ns=101, norm=1):
97 | b3, _, vpvsgrid, err = odr_brute(x, y/scale,
98 | ranges=[[lim[0]/scale, lim[1] / scale]],
99 | Ns=Ns, norm=norm, full_output=True)
100 | b3 = b3[0]*scale
101 | vpvsgrid = vpvsgrid * scale
102 | return b3, vpvsgrid, err
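   |
   | # Added note: the scale argument divides y before the brute-force search
   | # and multiplies the result back, so the grid is searched around slope ~1
   | # and the fitted ratio and grid are returned in original units.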
103 |
104 | plt.rc('font', size=20)
105 | plt.rc('mathtext', fontset='cm')
106 |
107 | events = get_events().filter('magnitude > 1.0')
108 | events.events = sorted(events, key=lambda e: e.origins[0].time)
109 |
110 | bounds = get_bounds()
111 | fig, axs = plt.subplots(2, 5, figsize=(20, 8.5), sharex='row', sharey='row')
112 | colors = get_colors()
113 |
114 | for i, (t1, t2, vpvs0) in enumerate(
115 | zip(bounds[:-1], bounds[1:], [1.68, 1.66, 1.69, 1.68, 1.69])):
116 | ax = axs[0, i]
117 | events2 = events.filter(f'time > {t1}', f'time < {t2}')
118 | events2 = events2lists(events2)
119 | picks = {}
120 | ids = list(zip(*events2))[0]
121 | for id_, ot, lat, lon, dep, mag, picks_ in events2:
122 | picks[id_] = PSpicks(picks_, ot)
123 |
124 | picks = list(picks.values())
125 | x, y = double_diff(picks)
126 | # xrem, yrem = double_diff(picks, apply_cond1=True, apply_cond2=False)
127 | # ax.scatter(xrem, yrem, 16, marker='.', c='0.7', rasterized=True)
128 |
129 | m1 = np.nanmedian(np.array(y) / np.array(x))
130 | os = 0 #0.2 * i
131 | b2, vpvsgrid, err = odr_fit(x, y, scale=1, lim=[1.48, 1.92], Ns=111, norm=1)
132 | b3, vpvsgrid, err = odr_fit(x, y, scale=vpvs0, lim=[1.48, 1.92], Ns=221, norm=1)
133 | b4 = odr_brute(x, y, ranges=[(1.5, 1.9)], Ns=101, norm=2)[0]
134 | b5 = odr_brute(x, y, ranges=[(1.5, 1.9)], Ns=101, norm='lms')[0]
135 | print('error', standard_error_of_estimator(y, b3*x, x))
136 | print(t1, t2, '{:.2f} {:.2f} {:.2f} {:.2f}'.format(m1, b3, b2, b4))
137 | x1 = np.min(x)
138 | x2 = np.max(x)
139 | c = colors[i]
140 | ax.scatter(x, y, 16, marker='.', c=c, rasterized=True)
141 |
142 | ax.plot((x1, x2), (b3 * x1+os, b3 * x2+os), '-k')
143 | ax.plot((x1, x2), (1.5 * x1+os, 1.5 * x2+os), 'k', ls=(0, (1, 2)))
144 | ax.plot((x1, x2), (1.9 * x1+os, 1.9 * x2+os), 'k', ls=(0, (1, 2)))
145 |     ax.annotate('N=%s\n$\\mathdefault{v_{\\rm{P}}/v_{\\rm{S}}{=}%.2f}$' % (len(events2), b3), (0.95, 0.05), xycoords='axes fraction', ha='right', va='bottom')
146 | ax.annotate('abcde'[i] + ')', (0, 1), (8, -6), 'axes fraction', 'offset points', va='top')
147 | ax = axs[1, i]
148 | ax.axvline(b3, color='k', zorder=-1)
149 | ax.plot(vpvsgrid, 1000 * err, color=c, lw=4)
150 | ax.annotate('fghij'[i] + ')', (0, 1), (8, -6), 'axes fraction', 'offset points', va='top')
151 |
152 | for i in range(1, 5):
153 | plt.setp(axs[0, i].get_yticklabels(), visible=False)
154 |
155 | axs[0, 0].set_ylabel('differential S wave\ntravel time $\\hat{\\delta}t_{\\rm S}^{ij}$ (s)')
156 | axs[0, 2].set_xlabel(r'differential P wave travel time $\hat{\delta}t_{\rm P}^{ij}$ (s)')
157 | axs[0, 0].set_xticks([-0.2, 0, 0.2])
158 | axs[1, 0].set_xticks([1.5, 1.6, 1.7, 1.8, 1.9])
159 | axs[1, 0].set_xlim(1.48, 1.92)
160 | axs[1, 0].set_yticks([7, 8, 9, 10, 11, 12])
161 |
162 | axs[1, 0].set_ylabel('mean absolute error (ms)')
163 | axs[1, 2].set_xlabel(r'velocity ratio $\mathdefault{v_{\rm P}/v_{\rm S}}$')
164 |
165 | fig.tight_layout(w_pad=-2)
166 | fig.savefig('figs/vpvs.pdf', dpi=300, bbox_inches='tight', pad_inches=0.1)
167 | plt.show()
--------------------------------------------------------------------------------
/vs.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | import pickle
4 |
5 | import matplotlib.pyplot as plt
6 | from matplotlib.colors import LinearSegmentedColormap
7 | from matplotlib.colorbar import ColorbarBase
8 | from matplotlib.dates import DateFormatter
9 | import numpy as np
10 | from statsmodels.robust.robust_linear_model import RLM
11 | import statsmodels.api as sm
12 |
13 | from plot_maps import get_bounds, get_cmap, get_colors
14 | from traveltime import repack, filters, LATLON0
15 | from util.imaging import convert_coords2km
16 |
17 |
18 | def fit_velocity(dist, tmax1, plot=True):
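   |     # Added note: each event pair yields an apparent velocity s/t (km
   |     # distance over absolute lag time); an intercept-only robust fit
   |     # (RLM with Hampel weights against a constant regressor) estimates
   |     # v, and the MAD of s/t around v serves as the scale/uncertainty.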
19 | # rlm = RLM(dist/1000 , np.abs(tmax1))
20 | # rlm = RLM(np.abs(tmax1), dist/1000, M=sm.robust.norms.Hampel(1, 2, 4))
21 | s = dist / 1000
22 | t = np.abs(tmax1)
23 | o = np.ones(len(t))
24 | # rlm = RLM(d/t, np.ones(len(d)), M=sm.robust.norms.Hampel(1, 2, 4))
25 | rlm = RLM(s/t, o, M=sm.robust.norms.Hampel(1, 2, 4))
26 | res = rlm.fit(maxiter=100)
27 | v = res.params[0]
28 | w = res.weights
29 | # scale = res.scale
30 | # v = np.median(s/t)
31 | from statsmodels.robust.scale import mad
32 | scale = mad(s/t, center=v, c=1)
33 | tmax = np.max(s) / v
34 | if plot:
35 | fig = plt.figure()
36 | ax = fig.add_subplot(121)
37 | ax.scatter(tmax1, dist)
38 | ax.plot((-tmax, 0, tmax), (np.max(dist), 0, np.max(dist)))
39 | ax2 = fig.add_subplot(122)
40 | ax2.hist(dist/1000/np.abs(tmax1), bins=np.linspace(3, 5, 21))
41 | return v, w, scale
42 |
43 |
44 | def stats_vs_time(dist, tmax1, tmax2, evids, times1, times2, bounds):
45 | ress = []
46 | r = []
47 | for t1, t2 in zip(bounds[:-1], bounds[1:]):
48 | print(t1, t2)
49 | cond = np.logical_and.reduce((times1 <= t2, times1 > t1, times2 <= t2, times2 > t1, tmax1 != 0))
50 | dist_, tmax1_, tmax2_, t1_ = filters(cond, dist, tmax1, tmax2, times1)
51 |         ind = tmax2_ != 0  # unused
52 |         tmax_ = tmax1_
53 | v, weights, scale = fit_velocity(dist_, tmax_, plot=False)
54 | r.append((t1, v, weights, scale, dist_, tmax_))
55 |         ress.append('{!s:10} {:.3f} {} {:.3f}'.format(t1, v, len(dist_), scale))
56 |
57 | fig = plt.figure(figsize=(10, 8.5))
58 | ax1 = fig.add_subplot(331)
59 | ax2 = fig.add_subplot(332, sharex=ax1, sharey=ax1)
60 | ax3 = fig.add_subplot(333, sharex=ax1, sharey=ax1)
61 | ax4 = fig.add_subplot(334, sharex=ax1, sharey=ax1)
62 | ax5 = fig.add_subplot(335, sharex=ax1, sharey=ax1)
63 | ax6 = fig.add_subplot(336)
64 | axes = [ax1, ax2, ax3, ax4, ax5]
65 | colors = get_colors()
66 | for i, (t1, v, weights, scale, dist, tmax1) in enumerate(r):
67 | tmax = np.max(dist) / 1000 / v
68 | c = colors[i]
69 | ax = axes[i]
70 | cmap = LinearSegmentedColormap.from_list('w_%d' % i, ['white', c])
71 | ax.scatter(tmax1, dist, c=weights, cmap=cmap, edgecolors='k')
72 | ax.plot((-tmax, 0, tmax), (np.max(dist), 0, np.max(dist)), color=c)
73 | ax.annotate('N=%d' % len(dist), (0.03, 0.03), xycoords='axes fraction')
74 | ax.annotate(r'$v_{{\rm{{S}}}}{{=}}({:.2f}{{\pm}}{:.2f})$km/s'.format(v, scale), (0.97, 0.03), xycoords='axes fraction', ha='right')
75 |
76 |     bins = np.linspace(2, 6, 41)
77 | centers = 0.5 * (bins[:-1] + bins[1:])
78 | heights = np.diff(bins)
79 | for i, (t1, v, weights, scale, dist, tmax1) in enumerate(r):
80 | c = colors[i]
81 | vmedian = np.median(dist/1000/np.abs(tmax1))
82 | data = np.histogram(dist/1000/np.abs(tmax1), bins=bins)[0]
83 | data = data / np.max(data)
84 | ax6.barh(centers, data, height=heights, left=i-0.5*data, color=c, alpha=0.5)
85 | ax6.errorbar(i, v, scale, fmt='o', color=c)
86 | ind = np.digitize(vmedian, bins) - 1
87 | ax6.plot([i, i-0.5*data[ind]], [vmedian, vmedian], ':', color=c)
88 | ax6.plot([i, i+0.5*data[ind]], [vmedian, vmedian], ':', color=c)
89 |
90 | for ax, label in zip(axes + [ax6], 'abcdef'):
91 | ax.annotate(label + ')', (0, 1), (8, -6), 'axes fraction', 'offset points', va='top')
92 | ax1.set_ylabel('event distance (m)')
93 | ax4.set_ylabel('event distance (m)')
94 | ax4.set_xlabel('lag time (s)')
95 | ax5.set_xlabel('lag time (s)')
96 | ax6.set_ylabel('apparent S wave velocity (km/s)')
97 | ax6.set_xlim(-0.5, 4.5)
98 | ax6.set_ylim(2.5, 5)
99 | ax6.set_xticks([])
100 | fig.tight_layout()
101 | bounds = ax6.get_position().bounds
102 | ax7 = fig.add_axes([bounds[0], bounds[1]-0.02, bounds[2], 0.01])
103 | cmap, norm = get_cmap()
104 | cbar = ColorbarBase(ax7, cmap=cmap, norm=norm, orientation='horizontal', format=DateFormatter('%Y-%m-%d'))
105 | cbar.ax.set_xticklabels(cbar.ax.get_xticklabels(), rotation=60, ha='right', rotation_mode='anchor')
106 | for i, label in enumerate('abcde'):
107 | ax6.annotate(label, ((i+0.5) / 5, 0.04), xycoords='axes fraction', ha='center', fontstyle='italic')
108 | fig.savefig('figs/vs_time.pdf', bbox_inches='tight', pad_inches=0.1)
109 | return ress
110 |
111 |
112 | PKL_FILE = 'tmp/stuff_onlymax.pkl'
113 | NSR = 0.1
114 |
115 | with open(PKL_FILE, 'rb') as f:
116 | stuff = pickle.load(f)
117 |
118 | dist, tmax1, tmax2, max1, max2, (snr1, snr2, snr3, snr4), evids, azi, inc, (coords1, coords2), v, (times1, times2), phase1 = repack(stuff.values())
119 | coords1 = convert_coords2km(coords1, LATLON0)
120 | coords2 = convert_coords2km(coords2, LATLON0)
121 | x1, y1, z1 = [np.array(bla) for bla in zip(*coords1)]
122 | x2, y2, z2 = [np.array(bla) for bla in zip(*coords2)]
123 | cond1 = np.logical_or.reduce((np.logical_and(azi > 0, azi < 300), inc < 20, inc > 50))  # 3-way OR; np.logical_or itself only combines two arrays
124 |
125 | cond = np.logical_and.reduce((snr2 <= NSR, dist > 200, cond1, phase1 > 0))
126 | dist, tmax1, tmax2, max1, max2, snr1, snr2, snr3, snr4, evids, coords1, coords2, v, times1, times2, phase1 = filters(cond, dist, tmax1, tmax2, max1, max2, snr1, snr2, snr3, snr4, evids, np.array(coords1), np.array(coords2), v, times1, times2, phase1)
127 | resdepth = stats_vs_time(dist, tmax1, tmax2, evids, times1, times2, get_bounds())
128 | print('\n'.join(resdepth))
129 | plt.show()
--------------------------------------------------------------------------------
/xcorr.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Tom Eulenfeld, MIT license
2 |
3 | import collections
4 | from copy import copy
5 | import itertools
6 |
7 | import matplotlib.pyplot as plt
8 | import numpy as np
9 | import obspy
10 | from obspy import read, read_events
11 | from obspy.geodetics import gps2dist_azimuth
12 | import obspyh5
13 | from tqdm import tqdm
14 |
15 | from load_data import iter_data
16 | from util.events import event2list
17 | from util.signal import envelope, trim2
18 | from util.xcorr2 import correlate_traces
19 |
20 |
21 | def plot_stream(stream):
22 | fig = plt.figure()
23 | ax = fig.add_subplot(111)
24 | max_ = max(stream.max())
25 | for i, tr in enumerate(stream):
26 | times = tr.times(reftime=tr.stats.event_time)
27 | ax.plot(times, i + tr.data/max_*2)
28 |
29 |
30 | def print_(stream, s=False):
31 | print(stream)
32 | m = [(tr.stats.dist, tr.stats.angle12s, tr.stats.angle21s) for tr in stream]
33 | if s:
34 | m = sorted(m)
35 | for ml in m:
36 | print(ml)
37 |
38 |
39 | def _get_dist(lat1, lon1, dep1, lat2, lon2, dep2):
40 | dist, azi, baz = gps2dist_azimuth(lat1, lon1, lat2, lon2)
41 | dpdif = (dep1 - dep2) * 1000
42 | return (dist ** 2 + dpdif**2) ** 0.5 # dist in meter
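   |
   | # Added example (hypothetical numbers): hypocenters ~1000 m apart
   | # horizontally with depths 8.0 km and 8.5 km give roughly
   | # sqrt(1000**2 + 500**2) ~ 1118 m (depths in km, result in m).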
43 |
44 |
45 | def correlate_stream(stream, tw='Scoda', timenorm=None, max_dist=None, tw_len=None, max_lag=2, filter=None,
46 | max_dist_diff=None, **kw):
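   |     # Added note: filtering uses the module-level FILTER dict (the
   |     # 'filter' keyword argument is currently unused); freqmax > 100 Hz
   |     # is interpreted as 'no upper corner' and only a highpass is applied.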
47 | if FILTER:
48 | if FILTER['freqmax'] > 100:
49 | stream.filter('highpass', freq=FILTER['freqmin'])
50 | else:
51 | stream.filter(**FILTER)
52 | for tr in stream:
53 | if timenorm == '1bit':
54 | tr.data = np.sign(tr.data)
55 | elif timenorm == 'envelope':
56 | tr.data = tr.data / envelope(tr.data)
57 | streams = collections.defaultdict(obspy.Stream)
58 | for tr in stream:
59 | streams[tr.stats.evid].append(tr)
60 | i = 0
61 | j = 0
62 | traces = []
63 | for stream1, stream2 in tqdm(list(itertools.combinations(streams.values(), 2))):
64 | if EVID_PAIRS is not None and '-'.join([stream1[0].stats.evid, stream2[0].stats.evid]) not in EVID_PAIRS:
65 | continue
66 | # event 1 above event 2
67 | if stream1[0].stats.edep > stream2[0].stats.edep:
68 | stream1, stream2 = stream2, stream1
69 | for tr1, tr2 in itertools.product(stream1, stream2):
70 | if tr1.id == tr2.id:
71 | # if tr1.stats.station == tr2.stats.station:
72 | if max_dist is not None:
73 | s1 = tr1.stats
74 | s2 = tr2.stats
75 | dist = _get_dist(s1.elat, s1.elon, s1.edep, s2.elat, s2.elon, s2.edep)
76 | if dist > max_dist * 1000:
77 | j += 1
78 | # print('distance', s1.evid, s2.evid, dist2)
79 | continue
80 | if max_dist_diff is not None:
81 | s1 = tr1.stats
82 | s2 = tr2.stats
83 | dist = _get_dist(s1.elat, s1.elon, s1.edep, s2.elat, s2.elon, s2.edep) # between events
84 |                     dist1 = _get_dist(s1.elat, s1.elon, s1.edep, s1.slat, s1.slon, 0)  # between event 1 and station 1
85 |                     dist2 = _get_dist(s2.elat, s2.elon, s2.edep, s1.slat, s1.slon, 0)  # between event 2 and station 1
86 |                     dpdif = abs(s1.edep - s2.edep)
87 | if dpdif == 0:
88 | continue
89 | dist01 = dist / dpdif * s1.edep # between event 1 and surface
90 | dist02 = dist / dpdif * s2.edep # between event 2 and surface
91 | # print(abs(dist01 - dist1), abs(dist02 - dist2))
92 | if abs(dist01 - dist1) > max_dist_diff or abs(dist02 - dist2) > max_dist_diff:
93 | j += 1
94 | continue
95 | if tw == 'Scoda':
96 | t1 = max(tr1.stats.twScoda[0], tr2.stats.twScoda[0])
97 | t2 = min(tr1.stats.twScoda[1], tr2.stats.twScoda[1])
98 | if tw_len is not None and t2 - t1 < tw_len:
99 | j += 1
100 | # print('tw', tr1.id, s1.evid, s2.evid)
101 | continue
102 | et1 = tr1.stats.event_time
103 | tr1 = tr1.copy().trim(et1 + t1, et1 + t2, pad=True, fill_value=0.0)
104 | et2 = tr2.stats.event_time
105 | tr2 = tr2.copy().trim(et2 + t1, et2 + t2, pad=True, fill_value=0.0)
106 | # if (tr1.stats.endtime - tr1.stats.starttime < 0.9 * tw_len or
107 | # tr2.stats.endtime - tr2.stats.starttime < 0.9 * tw_len):
108 | # j += 1
109 | # # print('tw 2', s1.evid, s2.evid)
110 | # continue
111 | else:
112 | tw1 = tr1.stats.get('tw' + tw)
113 | tw2 = tr2.stats.get('tw' + tw)
114 | t1 = min(tw1[0], tw2[0])
115 | t2 = max(tw1[1], tw2[1])
116 | if t2 - t1 < 0.5:
117 | j += 1
118 | continue
119 | et1 = tr1.stats.event_time
120 | tr1 = tr1.copy().trim(et1 + t1, et1 + t2, pad=True, fill_value=0.0)
121 | et2 = tr2.stats.event_time
122 | tr2 = tr2.copy().trim(et2 + t1, et2 + t2, pad=True, fill_value=0.0)
123 | if np.all(tr1.data == 0) or np.all(tr2.data == 0):
124 | j += 1
125 | continue
126 | if len(tr1) > len(tr2):
127 | tr1.data = tr1.data[:len(tr2)]
128 | elif len(tr2) > len(tr1):
129 | tr2.data = tr2.data[:len(tr1)]
130 | hack = False
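   |                     # Added note: unused experimental path -- when
   |                     # enabled, correlates successive short sub-windows
   |                     # of the coda instead of the full common window.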
131 | if hack:
132 | tanf = t1
133 | while tanf < t2:
134 | trx = correlate_traces(tr1.slice(et1+tanf, et1+tanf + 5), tr2.slice(et2+tanf, et2+tanf +5), max_lag, calc_header='event', use_headers=('evid', 'elat', 'elon', 'edep', 'mag', 'event_time'), **kw)
135 | trx.stats.tw = (tanf, tanf + 10)
136 | trx.stats.tw_len = 10
137 | traces.append(trx)
138 | tanf += 10
139 | else:
140 | tr = correlate_traces(tr1, tr2, max_lag, calc_header='event', use_headers=('evid', 'elat', 'elon', 'edep', 'mag', 'event_time'), **kw)
141 | tr.stats.tw = (t1, t2)
142 | tr.stats.tw_len = t2 - t1
143 | if max_dist_diff is not None:
144 | tr.stats.dist1 = dist1
145 | tr.stats.dist2 = dist2
146 | tr.stats.dist01 = dist01
147 | tr.stats.dist02 = dist02
148 | tr.stats.distx = max(abs(dist1 - dist01), abs(dist2 - dist02))
149 | traces.append(tr)
150 | i += 1
151 | if len(traces) > 0:
152 | yield obspy.Stream(traces)
153 | traces = []
154 |     if i + j > 0:
155 |         print(f'correlations: {i} successful, {j} discarded, {100*i/(i+j):.2f}%')
156 |     else:
157 |         print('oops')
158 |
159 |
160 | def correlate_stream2(*args, **kwargs):
161 | ccs = obspy.Stream()
162 | for trcs in correlate_stream(*args, **kwargs):
163 | ccs.traces.extend(trcs)
164 | return ccs
165 |
166 |
167 | def load_data_add_meta(data, events, coords, alldata=False):
168 | stream = obspy.Stream()
169 | lens = []
170 | for s, e in tqdm(iter_data(events, alldata=alldata), total=len(events)):
171 | id_, otime, lon, lat, dep, mag, picks = event2list(e)
172 | for tr in s:
173 | tr.stats.event_time = otime
174 | tr.stats.evid = id_
175 | tr.stats.elon = lon
176 | tr.stats.elat = lat
177 | tr.stats.edep = dep
178 | tr.stats.mag = mag
179 | tr.stats.selev = 0
180 | try:
181 | tr.stats.slat, tr.stats.slon = coords[tr.stats.station]
182 | except Exception as ex:
183 | print(ex, tr.id)
184 | continue
185 | l1 = len(stream)
186 | stream += s
187 | l2 = len(stream)
188 | assert l2 - l1 == len(s)
189 | lens.append(len(s))
190 | if len(s) < 8*3:
191 | from IPython import embed
192 | embed()
193 | from collections import Counter
194 | print('Counter(lens)', Counter(lens))
195 | print('len(stream)', len(stream))
196 | return stream
197 |
198 |
199 | def picks2stream(stream, events):
200 | from load_data import get_picks
201 | picks, relpicks, _, _ = get_picks(events)
202 | for tr in stream:
203 | sta = tr.stats.station
204 | id_ = tr.stats.evid
205 | tr.stats.spick = relpicks[id_][(sta, 'S')][0]
206 | tr.stats.ppick = relpicks[id_][(sta, 'P')][0]
207 |
208 |
209 | def tw2stream(stream):
210 | from load_data import tw_from_qc_file, stacomp
211 | tw = tw_from_qc_file()
212 | for tr in stream:
213 | id_ = tr.stats.evid
214 | pkey = stacomp(tr.id)
215 | tr.stats.tw = tw[id_][2][pkey][:2]
216 | tr.stats.quality = tw[id_][0]
217 | tr.stats.quality_str = tw[id_][1]
218 | tr.stats.tw_reason = tw[id_][2][pkey][2]
219 | tr.stats.noise = tw[id_][2][pkey][3]
220 |
221 | tr.stats.twP = (tr.stats.ppick, min(tr.stats.spick, tr.stats.ppick + 0.5))
222 | # tr.stats.twPcoda = (min(tr.stats.spick, tr.stats.ppick + 0.5), tr.stats.spick)
223 | tr.stats.twS = (tr.stats.spick, tr.stats.spick + 2)
224 | tr.stats.twScoda = tr.stats.tw
225 | tr.stats.twP = (tr.stats.ppick, tr.stats.spick - 0.02)
226 |
227 |
228 | def filter_events(events, quality):
229 | from load_data import tw_from_qc_file, select_events
230 | tw = tw_from_qc_file()
231 | return select_events(events, tw, quality)
232 |
233 |
234 | def create_stream(events, coords, write=True, name_data_out=None, alldata=False):
235 | stream = load_data_add_meta(DATAFILES, events, coords, alldata=alldata)
236 | for tr in stream:
237 | if tr.stats.sampling_rate != 250:
238 | tr.interpolate(250)
239 | trim2(stream, *TRIM1, 'event_time', check_npts=False)
240 | picks2stream(stream, events)
241 | tw2stream(stream)
242 | if write:
243 | stream.write(name_data_out + '.h5', 'H5')
244 | return stream
245 |
246 |
247 | def load_stream():
248 | name_data_in = OUT + f'{YEAR}_mag>{MAG}_q{QU}_*'
249 | return read(name_data_in + '.h5')
250 |
251 |
252 | def add_distbin(stream, max_dist, n=51):
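   |     # Added note: with max_dist=1 and n=51 each inter-event distance
   |     # (in m) falls into one of fifty 20 m wide bins; the bin center
   |     # (10, 30, ..., 990 m) is stored in tr.stats.distbin.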
253 | dists = np.linspace(0, 1000 * max_dist, n)
254 | dists_mean = (dists[:-1] + dists[1:]) / 2
255 | for tr in stream:
256 | ind = np.digitize(tr.stats.dist, dists)
257 | tr.stats.distbin = dists_mean[ind-1]
258 |
259 |
260 | def _select_max(stream):
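   |     # Added note: per event pair, keep only traces whose absolute maximum
   |     # is strong (>70% of the strongest trace) and lies farthest from zero
   |     # lag (_bla measures the offset of the maximum from the trace midpoint
   |     # in samples), then stack the surviving traces.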
261 | traces = []
262 | def _bla(tr):
263 | return abs(np.argmax(np.abs(tr.data)) - len(tr.data) / 2 + 0.5)
264 |
265 | for k, st in tqdm(stream._groupby('{evid1}-{evid2}').items()):
266 | st = st.copy()
267 | trcs = st
268 | v = 0.7 * np.max(np.abs(trcs.max()))
269 | trcs = [tr for tr in trcs if np.max(np.abs(tr.data)) > v]
270 | bb = 0.9 * max(_bla(tr) for tr in trcs)
271 | trcs = [tr for tr in trcs if _bla(tr) >= bb]
272 | st.traces = trcs
273 | st.stack('{evid1}-{evid2}')
274 | assert len(st) == 1
275 | traces.append(st[0])
276 | stream.traces = traces
277 | return stream
278 |
279 |
280 | def run_xcorr(stream=None, tw='Scoda', timenorm=None, write=False, write_all=False, **kw):
281 | if stream is None:
282 | stream = load_stream()
283 | name = f"{YEAR}_mag>{MAG}_q{QU}_{FILTER['freqmin']:02.0f}Hz-{FILTER['freqmax']:02.0f}Hz_{tw}_{timenorm}_dist<{DIST}km_{STACK[0]}{STACK[1]}"
284 | ccs = correlate_stream2(stream, tw=tw, timenorm=timenorm, max_dist=DIST, tw_len=TW_LEN, **kw)
285 | add_distbin(ccs, DIST)
286 | if STACK != 'max':
287 | if 'coda' in tw:
288 | ccs_stack1 = tw_stack(ccs.copy(), '{evid1}-{evid2}', stack_type=STACK)
289 | else:
290 | ccs_stack1 = copy(ccs).stack('{evid1}-{evid2}', stack_type=STACK)
291 | else:
292 | ccs_stack1 = _select_max(copy(ccs))
293 | if write:
294 | obspyh5.set_index('waveforms/{stack.group}')
295 | ccs_stack1.write(f'{OUT}ccs_stack_{name}.h5', 'H5')
296 | if write_all:
297 | obspyh5.set_index('waveforms/{evid1}-{evid2}/{network}.{station}.{location}.{channel}')
298 | ccs.write(f'{OUT}ccs_{name}.h5', 'H5', ignore=('processing', ))
299 | return ccs, ccs_stack1
300 |
301 |
302 | def plot_corr(stream, annotate=True, expr='{station}.{channel} {evid1}-{evid2}',
303 | expr2='{evid1}-{evid2} dist: {dist:.0f}m azi:{azi:.0f}° inc:{inc:.0f}°',
304 | figsize=None, v=3.6, ax=None,
305 | size1='small', size2='medium'):
306 | if ax is None:
307 | fig = plt.figure(figsize=figsize)
308 | ax = fig.add_subplot(111)
309 | max_ = np.max(np.abs(stream.max()))
310 | stream = stream.copy()
311 | trim2(stream, -0.5, 0.5, 'mid')
312 | # stream.traces.append(copy(stream).stack(stack_type=STACK)[0])
313 | stream.traces.append(tw_stack(stream.copy(), stack_type=STACK)[0])
314 | from matplotlib.patches import Rectangle
315 | rect = Rectangle((0.36, -1), 0.15, len(stream)+1, fc='white', alpha=0.5, ec='none', zorder=10)
316 | ax.add_patch(rect)
317 | for i, tr in enumerate(stream):
318 | starttime = tr.stats.starttime
319 | mid = starttime + (tr.stats.endtime - starttime) / 2
320 | t = tr.times(reftime=mid)
321 | is_stack = i == len(stream) - 1
322 | plot_kw = dict(color='k', lw=1)
323 | if is_stack:
324 | max_ = np.max(np.abs(tr.data))
325 | plot_kw = dict(color='C0', alpha=0.8, lw=2)
326 | ax.plot(t, i + tr.data/max_*1.5, **plot_kw)
327 | if annotate:
328 | try:
329 | if is_stack:
330 | label = 'stack'
331 | else:
332 | label = expr.format(**tr.stats)
333 | except KeyError:
334 | pass
335 | else:
336 | ax.annotate(label, (t[-1], i), (-5, 0),
337 | 'data', 'offset points',
338 | ha='right', size=size1, zorder=12)
339 | dist = stream[0].stats.dist
340 | tt = dist / v / 1000
341 | ax.axvline(0, color='0.3', alpha=0.5)
342 | ax.axvline(-tt, color='C1', alpha=0.8, label='v=%.1fkm/s' % v)
343 | ax.axvline(tt, color='C1', alpha=0.8)
344 | ax.legend(loc='lower left', fontsize='medium')
345 | ax.set_xlabel('lag time (s)')
346 | ax.set_xlim(-0.5, 0.5)
347 | ax.set_ylim(-1, None)
348 | ax.set_yticks([])
349 | if expr2 is not None:
350 | ax.annotate(expr2.format(**stream[0].stats), (t[0], len(stream)), (5, 10),
351 | 'data', 'offset points', size=size2)
352 |
353 | def plot_xcorrs(stream, tw='Scoda', timenorm=None, **kw):
354 | v = 3.6 if 'S' in tw else 6
355 | name = f"{FILTER['freqmin']:02.0f}Hz-{FILTER['freqmax']:02.0f}Hz_{tw}_{timenorm}_{STACK[0]}{STACK[1]}"
356 | for ccssub in correlate_stream(stream, tw=tw, timenorm=timenorm, max_dist=DIST, tw_len=TW_LEN, **kw):
357 | ccssub.sort(['angle12s'])
358 | corrid = ccssub[0].stats.evid1 + '-' + ccssub[0].stats.evid2
359 | # plot_corr(ccssub, expr='{station}.{channel} <12s={angle12s:3.0f} tw:{tw_len:.0f}s', figsize=(20, 20), v=v)
360 | plot_corr(ccssub, expr='{station}.{channel}', figsize=(10, 10), v=v)
361 | plt.savefig(f'{OUT}/{name}_{corrid}.png')
362 | plt.close()
363 |
364 |
365 | def plot_xcorrs_pub(stream, tw='Scoda', timenorm=None, **kw):
366 | plt.rc('font', size=12)
367 | streams = list(correlate_stream(stream, tw=tw, timenorm=timenorm, max_dist=DIST, tw_len=TW_LEN, **kw))
368 | assert len(streams) == 2
369 | fig = plt.figure(figsize=(12, 6))
370 | ax1 = fig.add_subplot(121)
371 | ax2 = fig.add_subplot(122, sharex=ax1)
372 | plot_corr(streams[1].sort(), expr='{station}.{channel}', ax=ax1, v=3.6, expr2='a) event pair: {evid1}-{evid2} distance: {dist:.0f}m')
373 | plot_corr(streams[0].sort(), expr='{station}.{channel}', ax=ax2, v=3.6, expr2='b) event pair: {evid1}-{evid2} distance: {dist:.0f}m')
374 | plt.tight_layout(pad=2)
375 | fig.savefig(f'{OUT2}/corr.pdf')
376 | plt.rcdefaults()
377 | # plt.close()
378 |
379 |
380 | def tw_stack(stream, group_by='all', **kw):
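   |     # Added note: stack weighted by coda window length -- each trace is
   |     # scaled by its tw length times len(s), cancelling the 1/N averaging
   |     # of Stream.stack, and the result is divided by the summed window
   |     # lengths, i.e. a window-length-weighted mean.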
381 | tws = {}
382 | for k, s in stream._groupby(group_by).items():
383 | tws[k] = sum([tr.stats.tw[1] - tr.stats.tw[0] for tr in s])
384 | assert tws[k] > 0
385 | for tr in s:
386 | tr.data = tr.data * (tr.stats.tw[1] - tr.stats.tw[0]) * len(s)
387 | stream.stack(group_by, **kw)
388 | for tr in stream:
389 | tr.data = tr.data / tws[tr.stats.stack.group]
390 | return stream
391 |
392 |
393 | MAG = 1.8
394 | FILTER = dict(type='bandpass', freqmin=10, freqmax=40)
395 | TRIM1 = (-20, 80)
396 | DIST = 1
397 | TW_LEN = 10
398 |
399 | DATAFILES = 'data/waveforms/{id}_{sta}_?H?.mseed'
400 | # DATA = '/home/eule/data/datasets/webnet/'  # author's local data location
401 | # DATAFILES = DATA + 'data*/{id}_{sta}_?H?.mseed'
402 |
403 | YEAR = 2018
404 | #STACK = 'linear'
405 | #STACK = 'max'
406 | STACK = ('pw', 2)
407 | STA = '*'
408 | OUT = 'tmp/'
409 | OUT2 = 'figs/'
410 |
411 | QU = 3
412 | #SELECT_IDS = ['201855607', '201875354', '201846343', '201874528', '201812057', '201813536', '201816074']
413 | SELECT_IDS = None
414 | # negative pair
415 | EVID_PAIRS = ['201812237-201813506', '201813506-201866387', '201815372-201817046',
416 | '201815372-201822608', '201816096-201821444', '201816624-201822004',
417 | '201816642-201866387', '201819712-201845973', '201821178-201823346',
418 | '201821594-201823346', '201824030-201827856', '201827856-201845629',
419 | '201830842-201837228', '20183359-20186614', '201834508-201842988',
420 | '201834666-201845973', '201835050-201845605', '201835220-201838048',
421 | '201835220-201838074', '201835220-201840856', '201836160-201838074',
422 | '201843006-201845973', '201845973-201846013', '201845973-201851822',
423 | '201846343-201866387', '201847953-201812057', '201855859-201863130',
424 | '201856935-201865574', '201858578-201856087', '201858578-201865574',
425 | '201859387-201862742', '201861630-201862742', '20189682-201846343']
426 | # positive pair
427 | EVID_PAIRS = ['201811100-201832446', '201812057-201816074', '201812057-201839137',
428 | '201812497-201813518', '201812497-201816642', '201813506-201832446',
429 | '201813518-201839137', '201814705-201815372', '201815314-201817046',
430 | '201816074-201816096', '201816074-201816642', '201816074-201821600',
431 | '201816624-201834508', '201816642-201839137', '201816718-201866387',
432 | '201817042-201819712', '201818492-201822608', '201821178-201821626',
433 | '201821266-201821408', '201821266-201823556', '201821458-201823556',
434 | '201821600-201821626', '201821600-201822004', '201822004-201834530',
435 | '201822608-201834508', '201823988-201827856', '201823988-201837010',
436 | '201823988-201845605', '201827856-201830842', '201827856-201837218',
437 | '201830842-201834588', '201830842-201835050', '201830842-201835220',
438 | '20183359-20185009', '201834508-201835040', '201834508-201835050',
439 | '201834508-201838074', '201834508-201841526', '201834530-201837650',
440 | '201834666-201835748', '201834666-201846013', '201835040-201838048',
441 | '201835040-201838074', '201835050-201835436', '201835220-201845605',
442 | '201835436-201845605', '201835748-201836160', '201835748-201843006',
443 | '201837010-201830842', '201837010-201838074', '201837228-201835050',
444 | '201846343-201872635', '201847435-201847691', '201847435-201851459',
445 | '20184759-201870893', '201847691-201851459', '201847953-201816718',
446 | '201851459-201851461', '201855607-201859387', '201855607-201864722',
447 | '201855675-201856087', '201857277-201859387', '201857277-201864722',
448 | '201862742-201863130', '20186460-201870893', '201864722-201875354',
449 | '20186614-20186746', '20186784-201818372', '20186784-20189682',
450 | '20187260-20189036', '20189682-201870893', '20189946-201812093',
451 | '20189946-201812237', '20189946-201813506']
452 | # peaks not fitting to velocity
453 | EVID_PAIRS = ['201810202-201812057', '201810202-201812237', '201810202-201812497', '201810202-201813518', '201810202-201813883', '201810202-201816074', '201810202-201821444', '201811100-201812259', '201811100-201812497', '201811100-201813518', '201811100-201813821', '201811100-201821600', '201811100-201839137', '201812057-201818254', '201812057-201821178', '201812057-201821626', '201812057-201822004', '201812057-201837650', '201812057-201866311', '201812093-201813536', '201812093-201813821', '201812093-201814705', '201812093-201815372', '201812093-201821574', '201812093-201834530', '201812237-201812605', '201812237-201813536', '201812237-201836704', '201812237-201871472', '201812259-201813883', '201812259-201818492', '201812259-201820961', '201812259-201822004', '201812497-201812605', '201812497-201821738', '201812497-201822004', '201812605-201813039', '201812605-201815372', '201812605-201816624', '201812605-201872635', '201812605-201874528', '201813039-201813506', '201813039-201818254', '201813039-201820961', '201813039-201821408', '201813039-201821458', '201813039-201821600', '201813039-201821738', '201813039-201822004', '201813039-201822608', '201813039-201823988', '201813039-201829236', '201813039-201832446', '201813039-201834530', '201813039-201837650', '201813039-201871472', '201813506-201813536', '201813506-201813821', '201813506-201815314', '201813506-201815372', '201813506-201821574', '201813506-201834530', '201813506-201872635', '201813518-201813821', '201813518-201823346', '201813518-201871472', '201813536-201821408', '201813536-201821594', '201813536-201827252', '201813536-201832446', '201813536-201871472', '201813536-201877386', '201813821-201816718', '201813821-201821444', '201813821-201832446', '201813821-201866311', '201813883-201818346', '201813883-201834588', '201813883-201871472', '201814705-201821408', '201814705-201821458', '201814705-201821520', '201814705-201824030', '201814705-201827856', '201814705-201835050', '201814705-201840856', '201814705-201866387', '201815314-201819712', '201815314-201821444', '201815314-201821600', '201815314-201834508', '201815314-201840856', '201815314-201872635', '201815372-201823988', '201815372-201832446', '201815372-201840856', '201815372-201871472', '201816074-201818346', '201816074-201834530', '201816074-201871472', '201816096-201816718', '201816096-201820961', '201816096-201821178', '201816096-201821520', '201816096-201827252', '201816096-201837650', '201816624-201816718', '201816624-201821178', '201816624-201821444', '201816624-201827856', '201816624-201835040', '201816624-201835050', '201816624-201837218', '201816624-201845605', '201816642-201821178', '201816642-201822608', '201816642-201827252', '201816718-201817042', '201816718-201818254', '201816718-201872635', '201817042-201821444', '201817042-201866387', '201817046-201818254', '201817046-201830842', '201817046-201837010', '201817046-201840856', '201817046-201841526', '201817046-201877386', '201818254-201823988', '201818254-201835040', '201818254-201837228', '201818254-201846631', '201818346-201821444', '201818346-201821600', '201818492-201832446', '201818492-201835436', '201818492-201866387', '201818492-201871472', '201819010-201824030', '201819010-201845629', '201819712-201827252', '201819712-201841526', '201820961-201818254', '201820961-201819712', '201820961-201827252', '201820961-201827856', '201820961-201835040', '201820961-201837650', '201820961-201846343', '201820961-201877386', '201821178-201824030', '201821178-201825814', '201821178-201845629', 
'201821266-201821626', '201821266-201877386', '201821408-201822004', '201821444-201821594', '201821444-201822608', '201821444-201829236', '201821444-201836704', '201821444-201877386', '201821458-201821626', '201821520-201837650', '201821574-201825814', '201821594-201835050', '201821594-201839137', '201821594-201846625', '201821594-201846631', '201821600-201824030', '201821600-201837650', '201821626-201821738', '201821626-201824030', '201821626-201837228', '201821738-201839137', '201822004-201838048', '201822004-201845605', '201822608-201824244', '201822608-201835220', '201822608-201846631', '201823556-201837650', '201823988-201834588', '201824030-201837228', '201827252-201835436', '201827252-201836160', '201827252-201840856', '201827252-201845605', '201827856-201840856', '201829236-201834530', '201829236-201834666', '201829236-201841526', '201829236-201843006', '201830842-201840856', '201834508-201837650', '201834508-201843006', '201834508-201845973', '201834508-201846625', '201834530-201835748', '201834530-201837228', '201834530-201838048', '201834530-201845629', '201834588-201836160', '201834588-201836704', '201834588-201846631', '201835040-201845629', '201835220-201846625', '201835748-201842988', '201836704-201837650', '201836704-201846013', '201837010-201836160', '201837228-201840856', '201837228-201845629', '201837650-201846631', '201838074-201840856', '201841526-201845605', '201841526-201845973', '201842988-201843006', '201843006-201872635', '201845973-201871472', '20184625-20188366', '20184625-20188468', '201846343-201871472', '20184759-20186460', '201847953-201813821', '201847953-201821600', '201847953-201866311', '20184825-201846343', '20184825-20187260', '20184825-20188366', '20184825-20188910', '201855675-201861630', '201855675-201862744', '201855675-201865756', '201855859-201856087', '201856087-201859387', '201856087-201862744', '201856153-201856935', '201856219-201857277', '201856219-201864482', '201856229-201869746', '201858578-201864482', '201858578-201864722', '201859387-201864482', '201864482-201865574', '201864482-201875354', '20186460-201812237', '20186460-201812605', '20186460-201813506', '20186460-201813821', '20186460-201846343', '201864722-201862744', '201864722-201862914', '201864722-201865574', '201864722-201865756', '201865756-201875354', '20186746-201871472', '20186746-201874528', '20186746-20188468', '20187260-201812259', '20187260-201813821', '201872635-201874434', '20187456-201812605', '20187456-201816718', '20187456-201821444', '20187456-201829236', '20187456-201870893', '20187456-201871472', '20187456-20188016', '20187456-20189036', '201877386-201838048', '20188016-201871472', '20188016-20188468', '20188412-20189036', '20188468-201874434', '20189036-201812237', '20189036-201812605', '20189036-201813506', '20189036-201829236', '20189036-201872635', '20189038-201846343', '20189682-201872635', '20189946-201813536', '20189946-201813821']
454 | # some arbitrary event pairs
455 | EVID_PAIRS = ['201816096-201818492', '201823346-201837650', '20185009-20188468', '201815372-201819010', '201818254-201837218', '201856087-201865574', '201835040-201838074']
456 | # for pub plot
457 | EVID_PAIRS = ['20185009-20188468', '201816096-201818492']
458 |
459 | if __name__ == '__main__':
460 | events = read_events('data/catalog_2018swarm.pha').filter(f'magnitude > {MAG}')
461 | import pandas as pd
462 |     sta = pd.read_csv('data/station_coordinates.txt', sep=r'\s+', usecols=(0, 1, 2))
463 | coords = {s.station: (s.lat, s.lon) for idx, s in sta.iterrows()}
464 | events = filter_events(events, QU)
465 | # events = [ev for ev in events if str(ev.resource_id).split('/')[-1] in SELECT_IDS]
466 | # select_ids = {id_ for evpair in EVID_PAIRS for id_ in evpair.split('-')}
467 | # events = [ev for ev in events if str(ev.resource_id).split('/')[-1] in select_ids]
468 | # name_data_out = OUT + f'{YEAR}_mag>{MAG}_q{QU}_{len(events)}events'
469 | stream = create_stream(events, coords, write=False, name_data_out=None, alldata=False)
470 | print(len(stream), len(stream)/246/3)
471 | # if SELECT_IDS:
472 | # stream.traces = [tr for tr in stream if tr.meta.evid in SELECT_IDS]
473 | plot_xcorrs_pub(stream, tw='Scoda', timenorm='envelope')
474 | EVID_PAIRS = None
475 | ccs, ccs_stack = run_xcorr(stream, tw='Scoda', timenorm='envelope', write=True, write_all=False)
476 | ccs2, ccs_stack2 = run_xcorr(stream, tw='S', timenorm='envelope', write=True, write_all=False)
477 | # plot_xcorrs(stream, tw='Scoda', timenorm='envelope')
478 |
479 | # from IPython import start_ipython
480 | # start_ipython(user_ns=dict(ccs=ccs, ccs2=ccs2))
481 |
--------------------------------------------------------------------------------